# Compare the Cloud > News and opinion from the cloud computing marketplace

### AI, Quantum and IP Security Shaping Innovation

As technology continues to evolve, the boundaries of possibility are constantly being pushed. Each year brings new advancements that reshape industries, redefine business operations, and introduce fresh challenges. 2025 will be no different. But while we stand on the brink of significant changes, one thing remains constant. Data is the foundation of innovation, underpinning everything from AI through to quantum computing and beyond.

Valerio Romano, Accenture’s Senior Managing Director and Cloud First Lead in EMEA, underlined this at a recent event, stating that data is critical to powering emerging technologies. But he also said that without a deep understanding of data, it’s impossible to scale innovations. Unifying data via the cloud is the way to drive full value from it. This has propelled cloud to the forefront of business decision-making, moving beyond simple technology transformation to playing a pivotal role in the data strategies that enable progress.

Data is the enabler of innovation

Bringing data together from disparate environments allows organisations to harness its power, utilising the full data set effectively. Only then can organisations focus on advanced use cases. It’s these emerging applications of data that push the needle for businesses. So, what are the technologies and trends that will shape the tech landscape in 2025 and beyond?

AI will be a key theme as businesses continue to experiment with new forms and applications of the technology:

AI Agents will redefine business decision-making. Currently, AI still falls short of replicating human-level decision-making, but next year that is set to change with Agentic AI. Agentic AI is set to drive a wave of innovation, transforming real-time problem-solving and decision-making. Expect these AI agents to optimise tasks with ant-like efficiency, navigating challenges quickly and adapting in real time. This will see businesses building event-driven architectures that allow AI to react instantly to real-life events, revolutionising industries like telecoms and logistics. Agentic AI’s ability to run complex simulations will enable organisations to plan, test, and optimise faster than ever, offering real-time actionable insights. For instance, we will see telecom networks become smarter, where AI can anticipate disruptions like storms, enabling proactive steps to be taken to minimise potential service disruption while also enhancing the overall customer experience in unexpected ways.

But with new use cases come new challenges:

‘Always on’ AI will reignite data management challenges. In 2025, we’ll start to see AI infused into nearly everything, with it switched on by default. Your phone will automatically analyse your emails to suggest follow-ups or prioritise tasks based on your daily patterns - with models running locally to preserve your privacy. Your car will even predict optimal routes based on real-time traffic and weather data.

With AI everywhere, the amount of new data generated will skyrocket. This will reignite data management challenges as organisations strive to gain insight from a growing volume and variety of AI-generated data. Poor data management could lead to disillusionment as companies find themselves overwhelmed by the constant stream of information, unable to leverage it effectively.
As AI becomes standard in daily operations, the challenge will be ensuring its insights are relevant and actionable, meeting minimum security and resilience expectations, and not just adding to the noise. To unlock AI’s full potential, businesses will need robust data management and multi-cloud strategies to access, store, and analyse data - whether on-prem, in the cloud, or at the edge - to extract the best value.

Meanwhile, emerging technologies like quantum will start to take centre stage:

Quantum computing will ignite the next ‘tech race’. Quantum computing is set to overshadow AI as the next major technological revolution. Rapid development is underway, with organisations investing heavily in next-generation data centres equipped to provide the ultra-cold temperatures, specialised infrastructure and massive power required to support quantum systems. The potential value of quantum breakthroughs is immeasurable, from accelerating drug discovery and genetic reprogramming in healthcare to pushing energy closer to fusion, potentially rendering traditional power sources obsolete. Beyond global leaders such as Google and Microsoft, quantum technology will also be offered as a cloud service, enabling businesses to access its power without building their own infrastructure.

As quantum emerges as a game-changer, this shift will trigger a new race as companies rush to harness quantum’s capabilities, using it to enhance AI capabilities and gain a competitive edge.

And as companies create their own IP around AI, protecting their assets will become a priority:

Intellectual property and data protection concerns will drive increased on-premise and sovereign cloud investments. On-premise investments will surge as companies seek to protect their data and intellectual property. Companies are investing heavily in data and AI, developing their own IP that they want to keep secure. By keeping their data on-premise, businesses have more security and control, ensuring that proprietary code and project information remains within their own protected environments.

In addition, interest in sovereign cloud solutions is also growing. Hyperscalers like AWS and Microsoft have recognised that the repatriation of data will affect their business models, prompting them to offer more localised, regulatory-compliant cloud options.

Preparing for the future today

For organisations, the challenge will be to stay ahead of these trends, with the ability to adapt quickly to opportunities as they emerge. With data and cloud intrinsically linked to driving innovation, modern data architectures will be more important than ever, providing the foundations for agentic AI, quantum computing, sovereign cloud and the next tranche of innovations – whatever they may be.

The future is coming at us faster than ever, and organisations with the capability to embrace change whilst reducing the challenges presented by new technologies will have the advantage over those that fail to adapt.

### Is sustainability ‘enough’ from a Cloud perspective?

In technology, migrating to the cloud is now widely recognised to bring a host of benefits. Beyond agility and cost savings, sustainability is an increasingly important, yet often underestimated, advantage. Moving to the cloud is generally viewed as a more sustainable alternative to the energy-intensive practice of maintaining your own large server rooms, which can be considerable sources of emissions.
But is sustainability truly enough in the face of the escalating climate crisis and the increasingly vocal demands for environmental action?

First, let's examine the term "sustainability" itself. It has become something of an overused buzzword, often applied to any initiative that contributes to saving the planet. However, at its core, sustainability implies maintaining our current way of life. It suggests preserving the world as it is, preventing the climate crisis from worsening. This definition, while important, doesn't always align with the urgent and transformative action that many scientists, activists, and, increasingly, consumers believe is now essential.

Therefore, we need to shift our focus towards the concept of regeneration. This represents a far more proactive and ambitious approach, one that cloud providers and users alike should be considering when assessing just how "sustainable" the move to the cloud actually is. It's about going beyond minimising harm and actively contributing to environmental restoration.

Pushing Past Sustainability

The thought of overhauling established sustainability initiatives can be unsettling for businesses. However, it doesn't necessarily require a radical transformation overnight. Instead, it's about building upon existing strategies, using them as a foundation to take the next, crucial step forward. This requires a careful assessment of current practices and a willingness to embrace new technologies and methodologies. Pay close attention to evolving market trends and shifting consumer sentiment.

Consumers are demonstrably influenced by a company's environmental credentials. Research consistently indicates that a significant percentage of consumers actively consider a brand's sustainability efforts when making purchasing decisions, and many have stopped buying from companies perceived as lacking in environmental responsibility. The Competition and Markets Authority (CMA) in the UK continues to scrutinise "greenwashing," making transparency and verifiable claims paramount. This trend is only intensifying as we approach 2025, with consumers becoming increasingly savvy at spotting unsubstantiated claims.

While sustainability has served a valuable purpose in raising awareness and driving initial improvements, the idea of simply sustaining the world's current state no longer aligns with the ambitions of consumers, businesses, and governments. The growing awareness of the urgency of the climate crisis has made it clear that we need to strive for more than just preventing further damage. Furthermore, stricter environmental regulations, such as the evolving carbon pricing mechanisms across Europe and the UK, are placing increasing pressure on businesses to move beyond simple sustainability and demonstrate real, measurable impact on emissions reduction and environmental restoration. Carbon taxes and emissions trading schemes are becoming more widespread and stringent, making it financially imperative for businesses to reduce their carbon footprint.

Defining Regeneration

Let's clarify what we mean by "regeneration" in this specific context of cloud computing and environmental impact. Rather than simply sustaining, it's about actively developing and improving something to make it more active, successful, or vital, particularly after a period of decline or damage. It's about actively healing the planet and restoring ecosystems. Consumers are placing increasing pressure on companies to move beyond mere sustainability pledges.
Regeneration evokes action, a sense of renewal, and tangible improvement. It's a more assertive and impactful approach, aimed at actively restoring the environment to a healthier state. Expect to see increased consumer activism and heightened scrutiny of corporate environmental pledges in 2025, demanding demonstrable results, not just carefully worded rhetoric. Social media campaigns will continue to amplify consumer voices and hold businesses accountable for their environmental actions.

A regenerative business is one that strives to achieve more with fewer resources across all stages of its operations, driven by a commitment to move beyond offsetting towards achieving complete net-zero emissions and creating a positive environmental impact. This includes implementing circular economy principles, prioritising resource efficiency across the entire value chain, and actively investing in projects that restore ecosystems and sequester carbon. This requires a holistic approach, considering the environmental impact of everything from raw material sourcing to product disposal.

The transition from sustainability to regeneration doesn't require a monumental leap; it begins with a fundamental shift in mindset. It requires companies to move beyond simply reporting on emissions and invest in solutions that actively draw down carbon and restore ecosystems. It's about embedding environmental responsibility into the core of the business strategy.

How the Cloud Fits In

Technology, of course, plays a crucial role in enabling organisations to make this transition from sustainability to regeneration. While the "regenerative" business model is still emerging, many businesses have already invested in "green" solutions available for cloud platforms and other services. In 2025, expect to see cloud providers offering more sophisticated and granular tools for measuring and managing the environmental impact of cloud usage, allowing businesses to make more informed decisions about resource allocation, workload placement, and energy consumption. These tools will provide detailed insights into the carbon footprint of specific applications and services, enabling businesses to optimise their cloud usage for maximum environmental benefit.

Emerging technologies and continuous innovation are catalysts for these regenerative models, encouraging businesses to surpass traditional sustainability efforts. This not only contributes to global environmental goals but also unlocks significant competitive advantages for organisations willing to embrace the opportunity. Expect to see more businesses leveraging artificial intelligence (AI) and machine learning (ML) to optimise energy consumption within their cloud infrastructure, dynamically adjusting resource allocation based on real-time demand and environmental conditions.

Regenerative agriculture provides a compelling example of how technology can be used to achieve both environmental and economic benefits. Data and technology are used to help companies protect and enhance the environment while optimising their business operations. This includes practices such as enhanced soil carbon sequestration, optimised water usage, and other methods that improve soil health, increase biodiversity, upcycle by-products, and reduce the use of synthetic chemicals. In 2025, satellite-based monitoring, drone technology, and AI-powered analytics will play an even larger role in enabling precision agriculture and verifying the effectiveness of regenerative practices.
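The kind of granular footprint calculation such tools perform can be sketched in a few lines. The sketch below is illustrative only: the instance power draw, data-centre PUE and grid carbon intensity figures are assumptions chosen for demonstration, not published provider numbers.

```python
# Illustrative estimate of the operational carbon footprint of a cloud workload.
# All input figures are assumptions for demonstration; real tools use provider
# telemetry and regional grid data.

def workload_emissions_kg(avg_power_watts: float,
                          hours: float,
                          pue: float,
                          grid_intensity_kg_per_kwh: float) -> float:
    """Return an estimate of kg CO2e for a workload.

    energy (kWh) = average power draw (kW) x hours x PUE
    emissions    = energy x grid carbon intensity (kg CO2e per kWh)
    """
    energy_kwh = (avg_power_watts / 1000.0) * hours * pue
    return energy_kwh * grid_intensity_kg_per_kwh


if __name__ == "__main__":
    # Hypothetical example: a VM averaging 150 W for a 30-day month, in a
    # data centre with a PUE of 1.2, on a grid emitting 0.2 kg CO2e/kWh.
    footprint = workload_emissions_kg(150, 24 * 30, 1.2, 0.2)
    print(f"Estimated footprint: {footprint:.1f} kg CO2e per month")
```

Even a rough model like this makes the levers visible: moving a workload to a lower-carbon region or a more efficient data centre changes the last two inputs, while right-sizing the workload changes the first two.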
Using the Cloud for Regeneration Practices

We're rapidly moving beyond the point where the cloud can be considered merely a sustainable solution. While it undoubtedly helps reduce an organisation's on-premises carbon footprint, it essentially shifts the footprint to servers in a different location. While cloud infrastructure is often more sustainable due to reduced maintenance and cooling requirements per company, it still generates carbon emissions. The key focus in 2025 will be on utilising cloud resources in a way that actively contributes to carbon reduction and environmental restoration, not just minimising harm.

Therefore, organisations need to be proactive and fully embrace the concept of regeneration as the foundation for the future of the cloud. This involves reversing existing environmental damage rather than simply settling for the status quo. This means actively choosing cloud providers who are demonstrably investing in renewable energy sources, carbon removal technologies, and circular economy initiatives. It also means optimising workloads to minimise energy consumption and actively seeking out opportunities to use cloud resources to support environmental restoration projects. This doesn't necessitate a massive upheaval; it's about making smart, incremental changes in processes and technologies that collectively shift the conversation from sustainability to regeneration.

A prime example is the use of cloud technology in agricultural tech companies to reverse environmental damage by improving soil health. This illustrates how innovation in the cloud, leveraging the power of Machine Learning, Artificial Intelligence, predictive insights, and advanced data analytics, can make the cloud a powerful force for good in addressing the climate crisis. We'll see wider adoption of technologies like digital twins for modelling ecosystems and optimising resource management in 2025, allowing for more precise and effective environmental interventions.

In addition, the rise of edge computing will contribute to more efficient data processing and reduced data transfer distances, further minimising the carbon footprint of cloud-based applications. Expect increased adoption of serverless computing, function-as-a-service (FaaS), and other resource-efficient architectures that only consume resources when actively processing data. The future of the cloud is not just about being sustainable; it's about being regenerative and actively contributing to a healthier planet.

### How GenAI can tackle challenges in Software Engineering

When it comes to software management, developers face a variety of challenges in maintaining software quality. They can be restricted by legacy systems, hindered by complex code management and debugging processes, limited in ways to optimise performance and struggle to ensure code security. How businesses manage and optimise their code has a direct and wide-ranging impact on their software performance. By optimising code efficiency, companies can tackle the root causes of these challenges; efficient code means greater energy efficiency and more cost savings, and it allows businesses to effectively scale their applications. Here are four challenges in maintaining software quality and how optimising code using AI can tackle them.

Outdated code: time for an upgrade

Developers often face a battle to ensure their software is up to date with the latest versions of the programming languages it is written in. Why does this matter?
Outdated code can lead to far more problems than it may first appear: it makes software applications more inefficient and unsustainable, more costly to run, more vulnerable to cyber-attacks, and it can create compatibility issues with modern systems. AI-powered code efficiency platforms can speed up the process of modernising code to the latest version on offer and ensure that it adheres to current standards. Crucially, this allows engineers to optimise their productivity by minimising the time they spend on manually upgrading each application – a process which can eat into days of developers’ time and increase costs. With modern codebases, companies are well placed to increase the lifespan of their legacy systems, enhance interoperability with different systems and new technologies, and strengthen their defences against cyber-attacks.

Code complexity and knowledge loss

As companies grow and evolve, so too will their codebases. These growing codebases can make it increasingly challenging for developers to find specific functionalities or to understand parts of the code, particularly when new team members are onboarded or the code has been inadequately documented. Some of the latest AI platforms utilising LLMs have a chat interface which can understand natural language queries and offer advanced code search functionalities. Using the chat, developers can ask complex questions about the codebase, such as investigating how different code elements interact with each other, and can also search the web for the most up-to-date open-source libraries. Crucially, the platform acts as a live documentation tool, storing updates and changes to code and massively minimising the learning curve for developers in understanding complex codebases. Using the tool, companies can foster better collaboration between developers and minimise the damage of ‘knowledge loss’ if employees leave, meaning that when new developers enter the team, they can be onboarded more efficiently.

Optimising code inefficiencies

Inefficient code can be a significant drain on resources – whether directly noticeable or not. A code section may contain suboptimal algorithms or have unoptimised data structures, meaning that the software is overusing resources and contributing to huge compute costs and scalability challenges. If a developer were to try and manually identify these inefficiencies, it would take huge amounts of time. With an AI platform to hand, however, developers can automate the code analysis process and then optimise code performance. The technology can automatically identify inefficiencies in code bases in line with developer priorities, such as optimising run time or memory usage. It then provides a range of alternative code snippets that are able to perform the same tasks more efficiently, offering optimal improvements to algorithms and data structures. If software is performing tasks more efficiently, it is also using less energy to do the same tasks, and so it operates more sustainably. By having more efficient code, organisations can manage their resources better and scale applications more effectively, creating a strong foundation for continued growth.

Ongoing security vulnerabilities

The world of cyber threats never sleeps and continually evolves, meaning that code has to be constantly evaluated for security vulnerabilities (including those that may have been overlooked when it was first developed).
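As a rough illustration of the automated checking described above, the sketch below compares a project's pinned dependencies against a small advisory list. The package names and advisory data are hypothetical, and real platforms draw on continuously updated vulnerability databases and far richer analysis; this only shows the basic shape of the check.

```python
# Minimal sketch of automated dependency checking. The advisory list below is
# hypothetical; production tools consume curated vulnerability feeds.

# Hypothetical advisories: package name -> versions known to be vulnerable.
KNOWN_VULNERABLE = {
    "examplelib": {"1.0.0", "1.0.1"},
    "fastparse": {"2.3.0"},
}


def parse_requirements(lines):
    """Yield (package, version) pairs from simple 'name==version' pins."""
    for line in lines:
        line = line.strip()
        if not line or line.startswith("#") or "==" not in line:
            continue
        name, version = line.split("==", 1)
        yield name.lower(), version


def scan(requirements_text: str):
    """Return human-readable findings for pins that match an advisory."""
    findings = []
    for name, version in parse_requirements(requirements_text.splitlines()):
        if version in KNOWN_VULNERABLE.get(name, set()):
            findings.append(f"{name}=={version} matches a known advisory")
    return findings


if __name__ == "__main__":
    sample = "examplelib==1.0.1\nrequests==2.32.0\n"
    for finding in scan(sample):
        print(finding)
```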
These dangers can be particularly apparent in open-source code, where vulnerabilities are more likely to be publicly shared. Such an unforgiving landscape requires a tool that safeguards security around the clock. AI can scan hundreds of repositories to highlight security deficiencies and then suggest code changes to bolster code bases. Not only does this ensure the software is running efficiently, but it also secures it against any potential attacks, whenever they may come. By taking a proactive approach to security, applications are set up to be robust against emerging threats, mitigating the time, money and resources spent reactively dealing with security breaches. Crucially, this helps build customer trust and protects sensitive data.

Ensuring quality code with AI

As codebases grow in an evolving technological landscape, developers face a range of challenges in maintaining software quality and scaling applications. Outdated systems, a lack of code knowledge and documentation, hidden code inefficiencies, and cybersecurity vulnerabilities all impede code and software performance. The modern business environment is not built for manual code optimisation processes. In fact, with technology to assist developers, businesses will be protected from security breaches and technical disruption, and will find it easier to scale. Optimising code is where AI can work to its optimum; it provides objective codebase analysis and then offers suggestions for improving the inefficiencies identified. Developers can then assess these suggestions and integrate improvements. Through this approach, businesses can drive software performance and efficiency, improving sustainability, reducing costs and safeguarding security. Quality software has never been as necessary, and AI can help developers overcome their greatest software management challenges.

### Ensuring Seamless Data Shopping in 2025

More and more companies today are migrating their data to the cloud. With lower operating costs, infrastructure flexibility, new data governance possibilities and fewer data silos, the cloud data experience allows businesses to easily access their data whenever they need to. Or at least, that's how it's meant to work in theory. In reality, many businesses are still unable to access specific data when they need it most, despite having spent years collecting it. Too often, businesses lack the internal data habits needed to create a well-rounded and more accessible data infrastructure. As a result, businesses looking to fast-track their migration to the cloud are still facing the threat of slow data scaling due to a lack of data access control policies.

How can businesses optimise their data strategy to ensure they provide employees with the access they need to analyse, understand and trust data, and most importantly, make reliable business decisions from that data?

Data Shopping: A Business’s 'Grocery Store'

The success of any data strategy largely relies on the ability to understand, trust and utilise the data collected. Unfortunately, this tends to be where many projects fail, as businesses lack a solid long-term plan to increase data literacy. Recent research indicates that 70% of businesses believe having access to intelligence about the data being used to make decisions is crucial to success. Yet over 65% of organisations still face challenges with identifying and managing their data sources.
To battle these challenges, one approach to data use and management that has been gaining popularity across the industry is the concept of a data marketplace. This data marketplace serves as a digital shop window for all of a company’s data, which employees can use to search, find and extract specific data records. The appeal of the data marketplace is that it offers a data shopping experience that simulates classic online shopping. A user can filter their search for a desired dataset using different categories and specifications in the same way they would search for a product on Amazon, and then ‘check out’ the data they need. This data shopping experience is designed to provide an easy-to-use approach, opening up employee access to all the data they need and placing it at their fingertips. However, creating a data marketplace is only useful if an organisation can encourage staff to use it. If it does not offer easy access and a seamless experience, employees will not adopt this solution. So, how can organisations design an effective data shopping experience?

Data Accountability: Keeping the Shelves Stocked

It is easy to assume that all data is good data, but just because data is discoverable does not guarantee it to be relevant or of good quality. Data quality can have a significant impact on the successful integration of data shopping across the business: if employees can only find ‘bad quality’ data which is unlikely to be relevant or useful, they are far less likely to use the provided solution. The general rule of thumb is, no matter how great a shop looks, if it’s full of items you don’t want or need then the shopping experience will be poor. To use the Amazon analogy again, if you’re looking for a product and see a bunch of poor reviews, you’re not likely to buy it. Data is no different, and poor quality data is unlikely to get ‘checked out’. Business leaders should take accountability for the data being put into the marketplace and ensure that the right data is available in the right form at the right time. But as well as keeping on top of what goes into the data marketplace, businesses must be cautious that data — especially personal data — is handled with discretion and is subject to further duty of care.

Identifying and Protecting Sensitive Data

One of the key challenges in data management is the ability to regulate the access to and use of sensitive data. Privacy regulations call for extra caution when handling sensitive data such as personally identifiable information or protected health information. It is right that businesses implement protections for sensitive data to control who can access it and for what purpose. Failing to protect sensitive information can lead to data breaches and regulatory fines for the company. However, exerting too much control and reducing access to all data will limit the value that data can generate; employees could be left with no access to data when it matters most. As such, businesses migrating to the cloud need to find the right balance between setting up strong access control policies and not slowing the ability to scale data access across the organisation.

The Three-Step Approach

Businesses looking to ensure compliant access to data and analytics in the cloud must overcome these common challenges associated with data governance and data quality. Here is a three-step approach to guide businesses through this process.

First, divide and conquer.
The business must distribute responsibility for systematising data to specific departments. Doing so when moving data to the cloud prepares a business for compliant use. Most businesses have been collecting data for years, meaning that most data ecosystems have evolved through a wide variety of generations. Therefore, it is essential to have a complete understanding and overview of the company’s data and its sources, both past and present.

Next, prepare for the big move.

A key strategy here should be universal data authorisation. Using the right tools and solutions to regulate data use, build context and identify sensitive data is essential. Businesses should look to simplify, standardise and automate access controls and policies throughout this process. Tools can make it simple and effective for businesses to move their data. For instance, they can be used throughout the migration process to enable the creation of new policies and push dynamic use at scale.

Finally, apply discretion.

A data catalogue can help structure and keep track of the data management process in the cloud end to end. The metadata, data classifications and technical metadata would essentially feed the access management engine and data pipelines, enabling policy-driven access to sensitive data. When the data catalogue is populated with shared records it should be ready for access and universally available. This data catalogue can be used as the basis for the data marketplace. It is also important for businesses to prepare policy-driven strategies for handling sensitive data.

The Check Out

If these steps are done correctly, businesses can safely migrate their data to the cloud and offer employees a streamlined data shopping experience. Employees will then be able to access and discover available data, see data content transparently (including metrics), and compare data without the risk of breaching data protection rules. Importantly, employees will be able to trust their data, leading to better-informed business decisions that truly move the needle on business outcomes and deliver strong ROI on their data shopping initiatives. This combination can drive a business’s data culture and provide a structured approach for businesses to benefit from data intelligence across their operations with only a few clicks.

### We’re in a Decentralised AI Revolution

Open-source AI promises to address a lot of the ethical concerns around AI as well as helping to drive much greater levels of innovation than closed-source models from big tech companies. Here's how open-source could eventually outpace big tech, ushering in a new era of decentralised artificial intelligence. Chris Price reports.

It’s clear that the divide between open-source and closed-source AI looks likely to shape tech industry dynamics over the coming months and years. Driven in part by Chinese company Deepseek’s much-publicised release recently, open-source AI is starting to attract attention, particularly from enterprises attracted by its lower costs and greater flexibility. However, closed-source models still remain by far the most popular AI solutions available to businesses right now. But what exactly are the differences, and why should developers and enterprises consider going open, rather than closed-source, in the long term? Firstly, closed-source AI companies such as OpenAI - producer of industry leader ChatGPT - operate largely behind closed doors.
Their algorithms, training data and model parameters are kept secret, accessible only to the company that developed them. And while this approach has clearly proved extremely lucrative so far (OpenAI has an estimated valuation of $157 billion, according to research company CB Insights), it has also attracted much criticism within the tech community. For example, in a closed model, power and control over AI systems is concentrated largely in the hands of a few large corporations, such as OpenAI, Anthropic and Google, potentially stifling competition from smaller rivals. Furthermore, access to these models is often restricted and expensive, which can prove a barrier for smaller organisations. Another problem is their lack of transparency. The opaque nature of closed-source AI models can make it difficult to understand how they arrived at their decisions, which in turn raises suspicions and concerns about their bias and accountability.

Embracing collaboration and innovation

Conversely, open-source AI embraces transparency and collaboration. The code, data and models are freely available for anyone to access, modify and distribute, thereby fostering a much more community-driven approach to development. This can help to create more of a level playing field, allowing smaller – often more innovative – companies to compete in the AI revolution in a way they previously wouldn’t have been able to do. Indeed, it’s an approach that has so far been favoured by Meta for its Llama LLMs (Large Language Models). As Meta CEO Mark Zuckerberg stated in a blog post to launch its Llama 3.1 LLM last year: “To ensure that we have access to the best technology and aren’t locked into a closed ecosystem over the long term, Llama needs to develop into a full ecosystem of tools, efficiency improvements, silicon optimisations, and other integrations,” he wrote. “If we were the only company using Llama, this ecosystem wouldn’t develop and we’d fare no better than the closed variants of Unix.”

Stable Diffusion is another example of a community-driven, open-source AI platform. A powerful text-to-image AI tool, its core code was made publicly available to developers and enterprises, allowing anyone to download, use, and modify it. This immediately lowered the barrier to entry for individuals and smaller teams who previously lacked the resources to develop such sophisticated AI. It also helped to spark a massive wave of community-driven development, including integration with image editing programs and game development tools.

Helping to build trust

Nor is innovation the only advantage of an open-source AI model. Decentralisation is another key benefit. By distributing access to AI technology, it can help to prevent the concentration of power in the hands of a few corporations. For futurist Eric Bravick, CEO and Founder of The Lifted Initiative, decentralised open-source models are vital to establish transparency and build trust. “In a decentralised system, the origin of data is as critical as the data itself—establishing ownership that consumers can trust and control,” he says. Furthermore, decentralised AI is essential to help prevent misuse of the technology. “History teaches us that the concentration of power is antithetical to human liberty,” he adds.
“I believe decentralised AI offers the best defence against the potential abuses of this technology.”

Founded by Web2 and Web3 experts with experience at Google, Apple, Microsoft, Facebook and many others, The Lifted Initiative is building a decentralised physical infrastructure for its customers through its Manifest Network. This comprises a suite of tools for enterprises, including a distributed network of GPUs (Graphics Processing Units) for AI processing as well as secure decentralised storage and scalable, on-demand compute power.

Look to the future

Although closed-source AI currently dominates the market, the long-term potential for open-source AI looks extremely encouraging. The collaborative nature of open-source development allows for faster iteration and innovation, while the decentralised nature of the technology helps to foster greater transparency. Rather than hiding behind closed doors, open-source platforms are open to scrutiny, enabling researchers and developers to examine the code and data used to train models and identify any potential biases. Furthermore, open-source AI encourages the democratisation of technology, preventing control from being held in the hands of a few large, increasingly powerful big tech companies. Of course, challenges still remain for open-source platforms, not least around funding as well as maintaining both quality control and high levels of security. However, as open-source continues to build momentum, it may well redefine how AI innovation happens – setting a new pace that big tech’s slower, proprietary models will struggle to match.

### Does every Cloud have a silver lining?

Cloud computing has infiltrated practically every aspect of our lives, both at home and at work. From streaming our favourite box sets to accessing critical business applications, the cloud is ubiquitous. But is it something we should wholeheartedly embrace, or are there legitimate reasons to proceed with caution? Let's revisit the pros and cons, grounding ourselves in the basics first.

Essentially, cloud computing allows you to store your files, software, and even entire applications remotely, rather than relying on a physical hard drive in your computer or an on-site server room. You're almost certainly interacting with the cloud on a daily basis, even if you don't consciously realise it. Services like Gmail, YouTube, Spotify, and even the iPlayer are all powered by the cloud. Businesses, too, can leverage the cloud, establishing their own private clouds, offering services that are accessible only to authorised personnel within the organisation, or creating public-facing services. Sounds attractive, doesn’t it? Let's delve into the advantages, considering the landscape of 2025.

Pros of Cloud Computing in 2025

Enhanced Remote Access

This remains a cornerstone benefit. Employees can securely access data and applications from anywhere with an internet connection. Imagine a construction site manager accessing building plans on a tablet or a sales representative updating a CRM system from a client's office. This promotes flexibility and enables remote working, a trend that’s only gained momentum in recent years.

Cross-Device Compatibility

Information is accessible from your laptop at home, your phone on the commute, or a tablet at a client's site – provided, of course, you have a decent internet connection. This seamless accessibility is a huge boon for productivity.
Improved Collaboration

Teams can collaborate on documents and projects in real time, regardless of their location. Think of architects working simultaneously on a building design using a shared cloud-based platform. Collaboration platforms have become increasingly sophisticated, offering features like version control, commenting, and integrated video conferencing.

Rapid Deployment

Setting up cloud services is still remarkably quick and easy. Compare creating a new online account with setting up your own email server. It's a world apart. This rapid deployment allows businesses to respond quickly to changing market needs.

Cost-Effectiveness

Cloud computing can be more cost-effective, particularly for smaller businesses. It eliminates the need to invest in expensive hardware and software licences. The "pay-as-you-go" model allows you to scale resources up or down as needed, only paying for what you actually use. However, it's crucial to carefully monitor usage to avoid unexpected bills.

Scalability

The cloud offers virtually limitless storage capacity. You can easily increase or decrease your storage based on your needs by adjusting your subscription. This scalability is particularly useful for businesses that experience seasonal fluctuations in demand.

Automatic Updates

Cloud service providers handle software updates automatically, relieving the burden on internal IT teams and ensuring that users always have access to the latest features and security patches. This significantly reduces the risk of vulnerabilities caused by outdated software.

Cons of Cloud Computing in 2025

Now, let's explore the potential downsides, which are equally important to consider:

Data Control and Security Concerns

You're essentially entrusting your data to a third-party provider. While security measures have improved dramatically, data breaches and vulnerabilities remain a real concern. Robust security protocols, compliance certifications (like ISO 27001), and transparent data handling policies are absolutely crucial. You need to be confident that your provider is taking data protection seriously.

Vendor Lock-In

Dependency on a single cloud provider can limit your freedom. Migrating your data and applications to a new provider can be complex, time-consuming, and expensive. While tools and services to facilitate migration are improving, it's still something to bear in mind. It pays to have an exit strategy.

Business Continuity Risks

Your business continuity and disaster recovery plans are heavily reliant on your provider's capabilities. It's essential to have robust Service Level Agreements (SLAs) that guarantee uptime and data recovery in case of emergencies. Regular data backups to geographically diverse server locations are vital. Ask your provider what their contingency plans are in the event of a major outage.

Provider Stability

What happens if your cloud provider goes bust, or is taken over by another company? Due diligence is essential when choosing a provider. Assess their financial stability and have a contingency plan in place. Don't put all your eggs in one basket.

Data Security and Compliance

Ensuring your cloud provider can guarantee the security of your data and comply with relevant regulations (like GDPR and the UK Data Protection Act) is paramount. Data encryption, access controls, and regular security audits are essential. You need to be sure that your provider understands and adheres to the relevant legal frameworks.

Downtime Risks

Cloud servers can still experience outages.
Redundancy, failover mechanisms, and geographically distributed data centres are crucial to minimise downtime. Carefully review your provider's uptime guarantees and disaster recovery plans. What compensation is offered if they fail to meet their service level agreement?

Internet Dependency

Cloud computing is only as reliable as your internet connection. Ensure you have a robust and reliable internet connection with sufficient bandwidth. Consider having backup internet connections for critical operations. A power cut combined with a broadband outage can bring your business to a standstill.

Evolving Security Threats

Cloud environments are constantly targeted by increasingly sophisticated cyber threats. Robust security measures, including intrusion detection systems, firewalls, and regular security audits, are essential to protect against evolving threats. Security is an ongoing process, not a one-off task.

The Verdict in 2025

Cloud computing has come of age. Many of the initial anxieties have been alleviated through technological advancements, enhanced security measures, and more robust service-level agreements. Businesses are now more sophisticated in their approach to cloud adoption, meticulously evaluating providers and implementing appropriate security and governance policies. Cloud computing is undeniably a critical component of the future for businesses of all sizes. The key is to approach it strategically, carefully weighing the pros and cons, selecting the right provider for your specific needs, and implementing robust security measures to mitigate the risks. As someone might say, the cloud is here to stay! The real question is no longer if you should embrace the cloud, but how to do so safely, responsibly, and effectively.

So, what is cloud computing? Should I be embracing it or sheltering from it? Whenever I’m faced with a question like this, I always find it comforting to make a list of the pros and cons, but before I do this, let’s start with some basics. Cloud computing, in its simplest terms, enables you to store files and software remotely rather than on a hard drive or server in the office. You may not know it, but you are probably using the cloud every day in your life. Services such as Gmail, Hotmail, Skype, YouTube, Vimeo and SoundCloud all operate in the Cloud. So if all these services are using the Cloud, it should be safe, shouldn’t it? OK, it’s nearly time for that list. It’s now possible for businesses to have their own private cloud, which incorporates specific services and is only accessible to selected people. Sounds good, doesn’t it?

Let’s look at the Pros of Cloud Computing:

Employees can access the data and files they need, even when they are working remotely or outside of office hours. Assuming they can get onto the internet, employees can access information from home, in the car, from customers’ offices, and from their smartphones.

Employees can work collaboratively on files and documents even when they are not together. Documents can be viewed and edited at the same time from different locations.

Setting up cloud computing can be very quick and easy. Think about how easy it is to set up a Gmail or Hotmail account and be up and running, in comparison to installing software, which can be time consuming.

Cloud computing can be cheaper – you don’t have to buy and install software because it’s already installed online remotely. You don’t need loads of disk space.
With cloud computing, you subscribe to the software rather than own it, which means it works a bit like pay-as-you-go. You only pay for what you use, and you can scale this up and down depending on your requirements.

Cloud computing can offer unlimited data storage because it is online. It is not restricted by server and hard drive limits, and there are no issues with server upgrades, etc. If you need more data you just up your subscription fee.

Sounds like a no-brainer so far, doesn’t it? With all of the above benefits, why wouldn’t I embrace the Cloud? Let’s have a look at some of the Cons of Cloud Computing. After all, every silver lining has a Cloud, if you pardon the pun!

With the Cloud you do not physically possess storage of your own data, leaving the control and responsibility of your data storage with your Cloud provider. So it could be seen that this is a leap of faith.

You could become completely dependent upon your cloud computing provider, taking away your freedom to some extent.

Your business continuity and disaster recovery are in the hands of your provider. Do you trust them enough?

What happens with data migration issues should you want to change provider? What happens if your cloud provider goes out of business?

Can your Cloud provider guarantee the security of your data?

Cloud servers can go down just like normal servers, so how do I access my data if this happens?

Cloud computing is only as robust as your internet connection. If you are experiencing internet issues you won’t be able to access your data.

Hmmm, not so sure now. However, it’s still early days for Cloud Computing, and as time progresses, some of these issues will be ironed out. The comedian Peter Kay once famously said about Garlic Bread… it’s the future! The same can be said about Cloud Computing. It’s here to stay; it is the future, and whatever size your business is, it’s time to start thinking about whether Cloud Computing is going to be the most cost-effective and flexible solution for your future data needs.

### Enterprise Hosting, VPS or VMware: Which is Best?

Cloud computing continues its relentless march forward, solidifying its position as a cornerstone of modern IT infrastructure. As this dynamic sector matures, organisations are increasingly faced with a bewildering array of hosting solutions, each promising optimal performance, scalability, and cost-effectiveness. Understanding the subtle, yet critical, nuances between different hosting models is paramount for making informed decisions that align with business objectives.

Many providers offer enterprise hosting services that, on the surface, appear remarkably similar. However, delving beneath the marketing gloss reveals fundamental differences in the underlying technology. Virtual Private Servers (VPS) and Virtual Machines (VMs) both leverage virtualisation to create isolated environments for applications and data, but they achieve this virtualisation in fundamentally contrasting ways.

For businesses operating with tight IT budgets, VPS solutions often present themselves as a tempting, more affordable alternative to the seemingly pricier VM options, particularly those powered by industry leaders like VMware. However, a purely cost-driven decision can lead to unforeseen complications down the line. It's absolutely crucial to carefully weigh the implications of choosing a VPS over a VM, considering factors beyond just the initial price tag.
Shared Resources, Potential Constraints

A VPS, in its simplest form, involves partitioning a single, physical server into multiple smaller, self-contained virtual servers. Each VPS acts as an independent entity, allowing users to install their own operating system, applications, and configurations. However, crucially, all these VPS instances share the same underlying operating system and, consequently, the same physical resources – CPU, RAM, and storage – of the host server. This shared resource model, while contributing to lower costs, can introduce several potential drawbacks that businesses need to be acutely aware of:

Security Risks

One of the primary concerns with VPS solutions stems from the shared system files and kernel. While virtualisation technologies strive to isolate each VPS, achieving complete isolation is inherently more challenging than with VMs. This shared infrastructure can potentially create security vulnerabilities. A compromised VPS could, in theory, provide an attacker with a pathway to access or disrupt other VPS instances on the same server. Although safeguards exist, the risk remains elevated compared to the dedicated isolation offered by VMs. Regular security audits and patching are critical to mitigate these risks.

Performance Issues

Overcrowding on a physical server is a common pitfall. When numerous VPS partitions are crammed onto a single physical server, it can place significant strain on the available resources, leading to noticeable performance degradation for all users. If one VPS experiences a sudden surge in traffic – perhaps due to a marketing campaign or a denial-of-service attack – it can disproportionately consume the server's CPU and RAM resources, negatively impacting the performance of all the other VPS instances sharing that server. This "noisy neighbour" effect can lead to unpredictable and unreliable performance, particularly during peak hours.

Limited Scalability

VPS solutions typically have a limited capacity to handle unexpected traffic spikes. Because they are constrained by the resources allocated to the shared server, scaling up to accommodate increased demand can be slow and, in some cases, impossible. While providers often offer the ability to upgrade your VPS plan, this may involve a disruptive migration to a different server or limitations on the available resources. This lack of scalability can be a significant disadvantage for businesses that experience fluctuating traffic patterns or anticipate rapid growth. The rise of containerisation (like Docker) attempts to address some of these scalability concerns within the VPS model, but fundamental limitations still exist.

Management Overhead

Whilst some providers offer managed VPS services, many require users to handle the majority of the server administration tasks themselves. This can include installing security updates, configuring firewalls, and troubleshooting performance issues. This management overhead can be a burden for businesses that lack in-house IT expertise.

Dedicated Resources, Enhanced Performance

VMware Virtual Machines, on the other hand, offer a fundamentally different and arguably more robust approach to virtualisation. With VMware, each VM runs its own, independent operating system – often referred to as a "guest OS" – and is allocated dedicated resources from the physical server's underlying hardware.
This dedicated resource allocation and full operating system isolation provides several distinct advantages:

Enhanced Security

The complete isolation of each VM significantly enhances security and drastically reduces the risk of cross-contamination. Because each VM runs its own operating system and kernel, a security breach in one VM is far less likely to compromise other VMs on the same server. This enhanced isolation is a critical factor for businesses handling sensitive data or operating in regulated industries.

Consistent Performance

The dedicated CPU and RAM resources allocated to each VM ensure that one user's traffic or application workload does not negatively affect the performance of other VMs on the same server. This guaranteed resource allocation eliminates the "noisy neighbour" effect and provides a more predictable and reliable performance profile. Businesses can confidently run resource-intensive applications on VMs without worrying about performance bottlenecks caused by other users.

Improved Uptime

VMware technologies like vMotion, High Availability (HA), and the Distributed Resource Scheduler (DRS) are designed to minimise downtime and ensure business continuity. vMotion allows for the live migration of VMs between physical servers without interrupting service, enabling hardware maintenance to be performed without any impact on users. HA automatically restarts VMs on different servers in the event of a hardware failure, minimising disruption. DRS dynamically allocates resources to VMs based on their current needs, optimising overall performance and preventing resource contention. These features significantly enhance the resilience and availability of VMware-based solutions.

Scalability and Flexibility

Virtual machines offer superior scalability and flexibility. Resources can be dynamically adjusted based on changing requirements, allowing businesses to quickly scale up or down to meet fluctuating demand. This scalability is crucial for businesses that experience seasonal traffic patterns or anticipate rapid growth. VMware also supports a wide range of operating systems and applications, providing businesses with greater flexibility in choosing the right technology for their needs. VMware cloud services allow for seamless integration with public cloud providers, enabling hybrid cloud deployments.

The Verdict

In 2025, VMware Virtual Machines represent a more robust and reliable hosting solution, offering improved security, redundancy, consistent performance, and superior scalability compared to VPS solutions. However, the enhanced capabilities of VMware typically come at a higher cost. Therefore, if your business operates on a severely constrained budget and requires only basic virtualisation functionality, a VPS might be a viable option, provided you fully understand and carefully mitigate the associated performance and security considerations. Consider using a reputable provider that specialises in VPS security hardening.

When evaluating hosting solutions, take a holistic approach. Carefully consider your business's specific needs, budget constraints, tolerance for risk, and long-term growth plans. Don't solely focus on the upfront cost. A more expensive, but ultimately more reliable and scalable, solution like VMware might prove to be the more cost-effective choice in the long run by minimising downtime, improving security, and enabling business growth. Perform due diligence on potential providers, carefully examining their SLAs, security protocols, and disaster recovery plans.
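As part of that due diligence, one practical check for the "noisy neighbour" effect described earlier is CPU "steal" time on an existing Linux VPS: the share of time the hypervisor gives your virtual CPU to another tenant. Below is a minimal sketch, assuming a Linux guest where /proc/stat is readable; persistent steal of more than a few percent during busy periods is a reasonable signal that the host is oversubscribed.

```python
# Minimal sketch: estimate CPU steal time on a Linux guest by sampling
# /proc/stat twice. High, persistent steal suggests contention on the host.

import time


def read_cpu_times():
    """Return the aggregate CPU counters from the first line of /proc/stat."""
    with open("/proc/stat") as f:
        fields = f.readline().split()
    # Counters: user nice system idle iowait irq softirq steal guest guest_nice
    return [int(v) for v in fields[1:]]


def steal_percentage(interval_seconds: float = 1.0) -> float:
    """Sample twice and return the percentage of CPU time marked as 'steal'."""
    before = read_cpu_times()
    time.sleep(interval_seconds)
    after = read_cpu_times()
    deltas = [a - b for a, b in zip(after, before)]
    total = sum(deltas)
    steal = deltas[7] if len(deltas) > 7 else 0  # 8th counter is 'steal'
    return 100.0 * steal / total if total else 0.0


if __name__ == "__main__":
    print(f"CPU steal over the last second: {steal_percentage():.1f}%")
```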
The right hosting solution is a strategic investment that can significantly impact your business's success.

### Security: Seeing Through the Cloud

When organisations consider moving to the cloud, a common initial reaction is, "We're not ready for that yet," often followed by concerns about security. However, it's crucial to understand that "the cloud" doesn't automatically mean giving up control or moving everything off-premises. While some cloud solutions are purely hosted environments with limited security control, it's wrong to assume that all clouds are intangible services where data is simply surrendered. Hosted clouds aren't inherently insecure, but it’s important to differentiate between hosted and privately owned cloud approaches. As with all business decisions, an organisation's risk profile, technical resources, and culture will influence the best strategy (hosted vs. private). Only when the practical details of cloud migration are examined does the real security impact come into focus.

Let's consider the security aspects of a fully hosted solution. We'll address risk using these common attack vectors: physical, network, system (OS), and out-of-band management.

Physical Security

Most reputable cloud providers now house their infrastructure in purpose-built data centres. These facilities employ robust, multi-layered security measures that would be financially out of reach for most organisations. For cloud providers, the physical security of their infrastructure is crucial for business continuity. This should reassure those hesitant to relinquish physical control of data storage and processing systems. While poorly managed data centres still exist, thorough due diligence can expose any major shortcomings. Data residency is another key consideration, particularly concerning data privacy laws like GDPR. Providers should be transparent about where your data is stored. Despite the abstractions in cloud service delivery, there's no valid reason for a provider to be evasive about data location.

Network Security

Given the diverse risk profiles across industries, a good cloud solution should allow for the deployment of traditional network and application-level security measures. Virtual machines can be placed behind firewalls, intrusion detection systems (IDS), and management systems. They can also be easily deployed across segmented DMZs, development, and private networks. In 2025, advanced cloud solutions enhance network security by simplifying resource monitoring and management, minimising entry points, and enabling rapid incident response. The ability to quickly isolate compromised systems for analysis while redeploying a trusted build is a significant advantage of on-demand cloud services. Network segmentation, micro-segmentation, and increasingly sophisticated intrusion detection and prevention systems (IDPS) are standard offerings. Zero-trust network access (ZTNA) is also gaining traction as a way to further limit access and improve security.

System Security

Implemented correctly, cloud migration has no negative impact on system security. By reducing the burden of managing physical and network resources, administrators have more time to focus on OS and application-level security. There's a misconception that moving to the cloud automatically grants the provider unrestricted access to all data and applications. In terms of system access, the provider typically has enough access to reach the OS login screen.
If an attacker compromises the provider's management system, their access to guest systems would be similarly restricted. The exception is providers whose virtualisation technology requires guest-based software clients. This setup violates established trust models and should be carefully evaluated to determine the added risk. Regardless of the virtualisation technology, those with root-level access to host systems can access stored data. This is an unavoidable reality of current computer technology and must be addressed directly when dealing with sensitive data. Modern operating systems and third-party tools offer straightforward data encryption, which works seamlessly in a cloud environment with minimal performance impact. Cloud providers generally don't penalise customers for encrypting their data. While relying on the cloud provider for data encryption might seem appealing, it goes against best practices. If a provider's management system is compromised, their key management system could also be compromised, exposing encrypted data for all clients. In situations requiring encryption, a distributed key management model (where clients manage their own keys) is the only reliable solution. The Rise of DevSecOps and Automated Security In 2025, DevSecOps practices are becoming increasingly important in cloud security. Integrating security into the development pipeline from the start helps to identify and address vulnerabilities early on. Automated security tools and policies are also essential for maintaining a strong security posture in the cloud. These tools can automatically scan for vulnerabilities, enforce security policies, and respond to security incidents. Staying Ahead of Emerging Threats Cloud security threats are constantly evolving, so it's important to stay informed about the latest risks and vulnerabilities. Cloud providers and security vendors regularly release updates and patches to address new threats. It's also important to educate employees about cloud security best practices and to implement strong access controls and authentication measures. Conclusion The cloud model, whether hosted or on-premises, offers significant security advantages without introducing new vulnerabilities. While some have exaggerated the "dangers" of cloud computing, this is largely unfounded paranoia. Systems in the cloud must be secured like any other system, but they also benefit from streamlined management, monitoring, and resource utilisation. By addressing the key security considerations outlined above, organisations can confidently embrace the cloud and reap its many benefits. ### How to take advantage of the Cloud In today's interconnected business landscape, organisations across all sectors face the critical challenge of secure document exchange. Whether it's legal firms handling sensitive client data, financial institutions managing confidential banking information, or broadcasting companies managing pre-release confidential content, the need for a robust, user-friendly, and legally compliant solution is paramount. While traditional methods like email and FTP have long been staples, they are increasingly inadequate in the face of evolving security threats and stringent data protection regulations, particularly as we move further into the 2020s. Let's consider the challenges faced by German law firms, as highlighted by Björn Matthiessen, CEO of Secure MSP. 
These firms operate under strict privacy regulations, mandating data storage on German servers and rigorous encryption of all sensitive traffic. This situation isn't unique to Germany; similar data sovereignty concerns are becoming increasingly prevalent worldwide, including in the UK. Post-Brexit, UK organisations are increasingly aware of the need to ensure their data remains under the jurisdiction of UK laws, or at least within a jurisdiction offering equivalent levels of protection. The need to exchange large volumes of files, contracts, and reports with clients and partners is a constant requirement for nearly all businesses. However, relying on outdated workarounds introduces significant risks and inefficiencies that can hamper productivity and expose organisations to potentially crippling data breaches. So, what are the most common pitfalls of older systems still in use today? The Downfalls of Traditional Workaround Email While ubiquitous and seemingly convenient, email suffers from inherent security vulnerabilities that make it unsuitable for exchanging sensitive documents. File size limitations often necessitate splitting large documents into multiple ZIP files, creating a cumbersome and frustrating user experience. Unencrypted emails are simply unacceptable for transmitting sensitive data, and even with encryption, key management can be a logistical nightmare, particularly when dealing with external parties. Phishing attacks targeting email remain a constant threat. FTP (File Transfer Protocol) Larger organisations sometimes resort to FTP servers for handling large file transfers. However, these systems are often complex to manage, require specialised technical expertise, and typically lack the advanced security features required to meet modern compliance standards like GDPR and the UK's Data Protection Act 2018. Furthermore, the user experience is typically far from intuitive, leading to frustration, reduced productivity, and an increased risk of human error. Many FTP solutions lack adequate audit trails. Shared Network Drives Whilst seemingly convenient for internal file sharing, these systems are often implemented without adequate security controls, proper versioning, or robust access management, creating significant vulnerabilities and hindering effective collaboration with external parties. Embracing the Cloud Cloud solutions offer a compelling and increasingly essential alternative to these outdated methods. The cloud provides virtually unlimited storage capacity, easy accessibility from anywhere with an internet connection, and enhanced collaboration capabilities that can significantly improve productivity. However, simply migrating to a generic cloud storage service is not enough. To truly ensure security, legal compliance, and optimal usability, a comprehensive solution must address the following critical requirements: End-to-End Encryption All data leaving the company network must be encrypted, both in transit and at rest. This includes not only the files themselves but also the associated metadata (e.g., file names, timestamps, access logs). Crucially, the encryption keys should be managed centrally within the organisation's control, ensuring that only authorised personnel can access the data. The solution should support robust encryption algorithms and key management practices. Data Sovereignty and Location Control Customers must have the ability to determine the precise physical location of their data storage. 
This is especially important for organisations operating in heavily regulated industries or those subject to strict data residency requirements. The ability to choose a data centre within a specific geographic region (e.g., the UK) ensures compliance with local laws and regulations and provides greater control over data access and security. User-Friendliness and Seamless Integration The solution must be exceptionally easy to use for both end-users and administrators. A clunky, complicated, or unintuitive system will inevitably lead to user resistance, the adoption of insecure workarounds, and a gradual undermining of the entire security posture. Seamless integration with existing workflows, document management systems, and applications is crucial for a smooth transition and optimal user adoption. Granular Access Controls The system should provide granular control over who can access which files and folders, with the ability to define specific permissions based on roles, departments, or individual users. Multi-factor authentication (MFA) should be mandatory. Lessons from the German Market The experience of German companies, as highlighted by Secure MSP, provides valuable lessons for the UK market. German organisations have long been subject to stringent data protection regulations (driven by GDPR and German Federal Data Protection Act), forcing them to adopt robust security measures for document exchange. By carefully examining the solutions, technologies, and strategies successfully employed in Germany, UK organisations can proactively address emerging challenges, anticipate future regulatory changes, and avoid costly mistakes. One key takeaway is the paramount importance of choosing a cloud provider that fully understands, respects, and demonstrably complies with data sovereignty requirements and the intricacies of international data transfer regulations. As data protection laws continue to evolve and become increasingly complex, it's essential to partner with a provider that can offer flexible deployment options, including the ability to securely store and manage data within the UK or other specified regions, as needed. Key Considerations for Future-Proofing Zero-Trust Architecture Implement a zero-trust security model throughout the organisation, where no user or device is automatically trusted, regardless of their location or network affiliation. This approach requires strict identity verification, continuous monitoring of all activity, and the enforcement of least-privilege access controls at all times. Data Loss Prevention (DLP) Integrate robust Data Loss Prevention (DLP) solutions to proactively prevent sensitive data from leaving the organisation's control, whether intentionally or accidentally. DLP systems can automatically detect, classify, and block unauthorised data transfers, ensuring strict compliance with established data protection policies and security protocols. Collaboration and Workflow Automation Seek out solutions that streamline collaboration on documents and automate document workflows, whilst maintaining the highest levels of security. This can significantly improve efficiency, reduce errors associated with manual processes, and enhance overall productivity. AI-Powered Security Increasingly, leverage the power of artificial intelligence (AI) and machine learning (ML) to enhance security monitoring capabilities and significantly improve threat detection effectiveness. 
AI-powered systems can intelligently identify anomalous behaviour, learn from patterns, and proactively respond to potential security incidents in real time. Regular Security Audits and Penetration Testing Mandate and conduct regular, independent security audits and penetration testing to proactively identify vulnerabilities in your document exchange systems and ensure the ongoing effectiveness of your implemented security controls. Don't let outdated and insecure document exchange methods put your organisation at risk of data breaches, regulatory fines, and reputational damage. Embrace the cloud with a secure, compliant, and user-friendly solution that empowers your team to collaborate efficiently, securely, and with confidence. ### Data storage in and outside the UK Data security remains a paramount concern for all companies, especially with the increasing reliance on cloud infrastructure to manage and process ever-growing volumes of data. As we approach the mid-point of the 2020s, the cloud has become an indispensable tool, but it also introduces complexities that demand careful attention. When considering cloud storage, it’s crucial to understand precisely where your data is stored, who has access to it (both internally and externally), and, most importantly, how it's protected against unauthorised access, data breaches, and potential loss. This is especially important in 2025, given the continued evolution of data protection laws, the refinement of existing regulations like GDPR, and the emergence of new threats in the cybersecurity landscape. The days of simply 'hoping for the best' are long gone; a proactive, informed approach to data security is non-negotiable. The Importance of Data Location UK-based companies should carefully consider the wide-ranging implications of storing data outside the UK. While cost-effective cloud storage options undoubtedly abound, promising tempting savings and scalable resources, they may come with hidden risks that can significantly outweigh any perceived financial advantages. Data location isn't merely about physical servers; it's about legal jurisdiction, data sovereignty, and the enforcement capabilities of regulatory bodies. Choosing the wrong location could expose your organisation to compliance breaches, legal liabilities, and reputational damage that could prove catastrophic. A Cornerstone of Data Protection in 2025 A key principle of GDPR (General Data Protection Regulation) is that personal data transferred outside the European Economic Area (EEA) must have an equivalent level of protection as it would receive within the EEA. This fundamental principle remains firmly in place in 2025, despite the UK's departure from the European Union. Although the UK has its own equivalent data protection regime based on the Data Protection Act 2018 (DPA 2018), which largely mirrors GDPR, cross-border data transfers still require careful consideration. This means you need to be absolutely certain that any country or territory where your data is stored, or processed, adequately protects the rights and freedoms of data subjects. Factors to consider include the recipient country's data protection laws, the availability of effective legal redress mechanisms, and the potential for government access to data. 
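To make this tangible, here is a minimal sketch, assuming an AWS environment with the boto3 SDK and the Python cryptography package; the bucket and file names are purely illustrative. It pins storage to the AWS London region (eu-west-2) and encrypts the file on the client side, so the keys never leave the organisation's control:

```python
# Illustrative sketch only: pin cloud storage to a UK region and encrypt client-side.
# Assumes AWS with boto3 and the 'cryptography' package; bucket and file names are hypothetical.
import boto3
from cryptography.fernet import Fernet

key = Fernet.generate_key()          # generated and held by the organisation, never the provider
fernet = Fernet(key)

with open("client-contract.pdf", "rb") as f:
    ciphertext = fernet.encrypt(f.read())

s3 = boto3.client("s3", region_name="eu-west-2")   # eu-west-2 is the AWS London region
s3.create_bucket(
    Bucket="example-uk-legal-documents",
    CreateBucketConfiguration={"LocationConstraint": "eu-west-2"},
)
s3.put_object(
    Bucket="example-uk-legal-documents",
    Key="client-contract.pdf.enc",
    Body=ciphertext,
)
```

In practice the generated key would itself be kept in a key vault or HSM under the organisation's control, echoing the point made earlier about keeping key management out of the provider's hands.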
Learning From Past Mistakes Outsourcing data storage to less reputable companies, particularly those operating in jurisdictions with weaker data protection laws and less robust security standards, can lead to disastrous consequences, ranging from data loss and theft to crippling regulatory fines and irreparable damage to your brand reputation. One cautionary example highlighted a company that outsourced its data storage to a firm in India, which then further outsourced it to a company in Nigeria. The client was subsequently fined a significant sum by their governing body after a substantial portion of their data could not be located, let alone recovered. As a result, they were forced to move their entire IT operations to a UK-based IT infrastructure at considerable expense and disruption. This serves as a stark reminder that the cheapest option is rarely the best when it comes to data security. The true cost of a data breach far outweighs any initial savings. Ensuring Adequate Protection with Contracts So, how can contracts effectively ensure an adequate level of protection for your data when using cloud storage, especially when data is transferred internationally? It's not enough to simply sign a standard agreement; you need a multi-layered approach that incorporates specific clauses and safeguards. Standard Contractual Clauses (SCCs) One widely accepted way to ensure adequate protection when transferring personal data outside the EEA (or the UK) is to utilise Standard Contractual Clauses (SCCs) approved by the European Commission (and the UK's ICO). These clauses provide a pre-approved, legally binding framework for data transfers, ensuring that the data is protected to GDPR standards, regardless of where it's physically located. However, it's crucial to remember that SCCs are not a silver bullet. They must be implemented correctly and supplemented with appropriate technical and organisational measures to address any specific risks associated with the transfer. Contracts Based on Risk Assessment Alternatively, businesses can create their own bespoke contracts, tailored to their specific needs and risk profile. However, this approach requires a thorough and comprehensive risk assessment to identify potential vulnerabilities and implement appropriate safeguards to bring the level of protection up to an adequate standard. This includes assessing the recipient country's legal and regulatory environment, the security practices of the cloud provider, and the potential for government interference. These risk assessments must be revisited and updated on a frequent basis. Important Considerations for Data Security in 2025: Due Diligence is Paramount: Always perform thorough due diligence on any firm you're considering for data storage, regardless of their location, size, or reputation. Check their security certifications, audit reports, and data protection policies. Don't rely solely on marketing claims; verify their credentials independently. Data Residency: A Smart Choice? Seriously consider keeping your data storage within the UK to ensure clear compliance with UK data protection laws and GDPR. This simplifies compliance efforts, reduces the risk of legal challenges, and provides greater control over your data. Evolving Landscape Requires Continuous Monitoring: Cloud technology continues to evolve at a rapid pace, and the regulatory landscape is constantly changing.
Staying informed about the latest data handling practices, emerging security threats, and evolving legal requirements is crucial for maintaining a robust data security posture. Employee Training: Ensure all employees are adequately trained on data protection principles and security best practices. Human error is a leading cause of data breaches, so investing in employee education is essential. Prioritising Data Security in a Cloud-First World In 2025, data security and storage remain critical considerations for businesses of all sizes. By understanding the implications of data location, utilising robust contracts, conducting thorough due diligence, and staying informed about evolving regulations like GDPR, you can protect your valuable data, avoid potential pitfalls, and maintain the trust of your customers. When in doubt, prioritising data storage within your country of residence, under the protection of GDPR and UK law, offers a significant advantage in terms of compliance, security, and control. Don't compromise on data security; it's an investment in the long-term success and sustainability of your business. ### Data Loss in the Cloud By Murray Moceri, Marketing Director, CloudAlly We live in a world increasingly powered by the cloud. We entrust our most valuable data to platforms like Google Workspace, Microsoft 365, Salesforce, and countless others, drawn to their promises of scalability, accessibility, and cost-effectiveness. We rely on their robust infrastructure and sophisticated security measures, believing our information is safe and readily available. But is that trust entirely well-placed? Are their safeguards truly enough to protect your business, your livelihood, from the devastating impact of data loss? The answer, unfortunately, is often a resounding no. While these platforms undeniably boast impressive disaster recovery capabilities, it's crucial to understand the subtle but critical distinction: those capabilities are primarily designed to protect their systems, their core infrastructure, and their overall service availability. They're focused on keeping the lights on for millions of users, not necessarily ensuring the granular recoverability of the unique, specific data that resides within your organisation. Think of it like this: they're safeguarding the building itself – the data centre, the servers, the network – but not necessarily the contents of your specific office within that building. They're ensuring the foundation remains strong, but not guaranteeing the integrity of every document, every contact, every financial record stored within their systems. In reality, your data's immediate safety net on these services is often limited to a trash or recycle bin with a relatively short lifespan – typically 30 days, and sometimes even less. Beyond that fleeting window of opportunity, your critical information becomes vulnerable to permanent deletion, corruption, or inaccessibility. You're essentially relying on the platform provider to act as your sole safety net, and their priorities may not always align perfectly with your specific, immediate recovery needs. They're prioritising the overall health of their ecosystem, which can sometimes mean making difficult choices about data retention and recovery. Data Loss Reality - A Wake-Up Call IT professionals instinctively understand the fundamental necessity of robust backup strategies for traditional, on-premise systems. It's a cornerstone of responsible data management. 
So why is there often a disconnect, a sense of complacency, when it comes to cloud solutions? Is it the alluring promise of "infinite" storage? Is it the perceived reliability of these massive tech giants? Or is it simply a lack of awareness regarding the inherent limitations of cloud provider data recovery policies? The uncomfortable truth is that once your data is deleted, altered, corrupted, or otherwise rendered inaccessible – whether by accidental user error, malicious intent from an insider or external attacker, a poorly designed integration with a third-party app, or even a rare but possible service provider issue – recovery can be incredibly difficult, time-consuming, and in some cases, simply impossible. You're essentially betting the future of your business on the hope that nothing will go wrong, and that the platform provider will be able to swoop in and save the day if something does. That's a risky gamble. Understanding the Threats To truly grasp the importance of cloud data backups, it's essential to understand the common culprits behind data loss incidents. These aren't just theoretical risks; they're real-world threats that businesses face every single day: Human Error: We are, after all, only human. Mistakes happen. Accidental deletion of critical files, incorrect modifications that overwrite valuable data, unintended changes to configurations that break integrations – these simple, everyday slips account for a shockingly large portion of data loss incidents. According to Kroll Ontrack, a leading computer forensics and E-Discovery firm, human error is responsible for a staggering 40% of all data loss incidents. That's a sobering statistic. Malicious Activity: Disgruntled employees, former contractors, or external cybercriminals can intentionally damage, corrupt, or delete critical data. These malicious acts can range from simple vandalism to sophisticated sabotage, and they can have devastating consequences for a business. The risk is particularly acute when dealing with sensitive financial data, trade secrets, or customer information. Third-Party Apps and Integrations: The vast ecosystem of third-party applications and integrations that enhance the functionality of cloud platforms can also introduce vulnerabilities. Poorly designed integrations, buggy apps, or simply incompatible software can lead to data corruption, data loss, or unexpected service disruptions. It's crucial to carefully vet and monitor any third-party tools that have access to your cloud data. Service Provider Issues: While exceedingly rare, data loss can occur due to incidents affecting the service provider itself. These can include hacking attempts that compromise data integrity, internal errors or outages within the provider's infrastructure, or even account access revocation due to policy violations. While these scenarios are unlikely, they highlight the importance of having an independent backup strategy that doesn't rely solely on the provider's internal systems. Why Daily Backups are No Longer Optional, But Essential The decision to implement a robust cloud data backup strategy is no longer a matter of "if," but rather "when." Consider these compelling reasons why daily backups are absolutely essential for the long-term survival and success of your organisation: Business Continuity: Ensure you can rapidly and reliably recover the data that is absolutely critical to your daily operations.
Minimise downtime in the event of data loss, and prevent significant business disruption that can damage your reputation and erode customer trust. Compliance: Meet stringent data retention requirements mandated by industry regulations such as HIPAA (healthcare), GDPR (Europe), SOX (finance), and others. Failure to comply with these regulations can result in significant financial penalties, legal action, and reputational damage. eDiscovery: Streamline the often complex and costly process of providing documentation for legal proceedings, audits, and regulatory investigations. Having easily accessible and readily searchable backups can save you time, money, and legal headaches. eForensics: Facilitate thorough and efficient investigations into computer security incidents, data breaches, and other potentially damaging events. Backups provide a historical record of your data that can be invaluable in identifying the root cause of a problem and preventing future incidents. Audit Readiness: Maintain easily accessible and auditable records for both internal and external audits. Demonstrating a strong data backup and recovery strategy is a key component of maintaining compliance and demonstrating responsible data management practices. Understanding Data Recovery Limitations with Major Providers Let's take a more detailed look at the native data recovery options offered by some of the leading cloud service providers. Understanding these limitations is crucial to making informed decisions about your own backup strategy: Google Workspace (formerly G Suite): Data retention policies can vary significantly depending on the specific service and your subscription level. For example, deleted emails are typically permanently removed from the trash folder after 30 days. Deleted documents can become irretrievable, particularly if an administrator deletes a user's account. While Google does offer some data recovery tools, they are often limited in scope and may not be sufficient for all scenarios. Microsoft 365 (formerly Office 365): Similar to Google Workspace, Microsoft 365 has its own data retention policies that can impact your ability to recover lost or deleted data. While the Recycle Bin offers a temporary reprieve for deleted files, permanent deletion is often irreversible. Furthermore, recovering data from SharePoint or OneDrive can be a complex and time-consuming process. Salesforce: While Salesforce offers a Recycle Bin for deleted records (recoverable for a limited time, typically 15 days), permanently deleting a custom object or making significant changes to your data model can be irreversible. The Salesforce data recovery service, while available, can be incredibly expensive (upwards of $10,000 USD) and can take a significant amount of time (up to 15 business days) to complete. For many small and medium-sized businesses, that cost and delay are simply unacceptable. Taking Control of Your Data Destiny The key to protecting your business from the potentially devastating consequences of cloud data loss lies in adopting a proactive approach: implement regular, automated backups of your online data, and store those backups in a secure, independent location. Don't rely solely on the native data recovery tools offered by your cloud providers. Even the service providers themselves tacitly acknowledge the limitations of their own recovery systems by recommending that users implement independent backup solutions. Salesforce, for example, explicitly advises users to "Use a partner backup solution." 
This isn't just a sales pitch; it's a recognition of the inherent risks involved in relying solely on their internal data protection measures. By creating and maintaining your own independent backup and recovery system, you gain complete control over your data and its destiny. You're no longer reliant on the unpredictable policies and limitations of your cloud provider. You're empowered to recover your data quickly and efficiently in the event of any type of data loss incident. Don't wait until it's too late – explore your cloud data backup options today and take control of your data security. The future of your business may depend on it. ### AWS Braket Unlocks Quantum Computing The present and future of computing are intriguing. Although our current classical computers are ever more sophisticated, new and fundamentally different methods of calculation are emerging. No discussion of that shift would be complete without mentioning how AWS Braket unlocks quantum computing. The recent advancements made by AWS and NVIDIA are also noteworthy because we are seeing not just improvements to today's computers but the arrival of a new generation of them. Understanding the Potential of Quantum Computing In its simplest form, quantum computing is the application of quantum mechanics to computation. Quantum bits, or qubits, differ from ordinary bits, which are always either off or on. Qubits can be in more than one state simultaneously thanks to a phenomenon known as quantum superposition. Imagine a coin that is not just heads or tails but both simultaneously; this is the strange new world of quantum mechanics. This ability to work with many states at once makes quantum computers far more powerful than classical computers for certain tasks. They are not just new and better computers; they are new computers that can solve specific problems beyond the reach of classical systems. Introduction to AWS Braket: A Cloud-Based Platform for Quantum Computing The launch of AWS Braket in 2019 was a prudent response to this new technology. Instead of asking organisations to invest in quantum hardware, AWS provided a cloud platform where individuals and organisations can run quantum algorithms across different hardware types, including superconducting circuits, trapped ions, and neutral atoms. The platform's pay-as-you-go model has made quantum computing available to the public for the first time. This democratisation of quantum computing mirrors the transformation in other technical fields, where virtual events have made complex technologies accessible to global audiences. It is no longer necessary to spend a lot of money up front and then hope the investment will be used well. Organisations can begin with quantum, experiment with it, and grow their quantum strategy based on actual applications instead of assumed potential. For those looking to understand how generative AI enhances the potential of quantum computing, this article on Generative AI unlocking quantum computing data potential provides valuable insights into how AI and quantum computing are converging to drive the next wave of innovation. This discussion on Disruptive Live also explores how AWS Braket makes quantum computing accessible and what it means for businesses looking to integrate quantum strategies. The Current State of Quantum Systems We are still facing many problems with quantum computers. Today's systems produce an error roughly once in every thousand operations, far short of the one-in-a-billion accuracy required for most business applications.
This is not just a minor problem but a critical issue that determines how we work with quantum computers today. That is why the AWS approach via Braket is so essential. By providing access to several quantum hardware ecosystems, each with its own benefits and downsides, AWS Braket unlocks quantum computing in a way that allows researchers and organisations to explore various strategies and identify the most appropriate solution for their problem. NVIDIA: A New Player in the Quantum Computing Scene Another critical step is NVIDIA's entry into quantum computing through its CUDA-Q platform. Drawing on its experience in GPU acceleration, NVIDIA has provided a means of running realistic quantum system simulations 75 times faster than conventional methods. This speed-up is very valuable in developing and optimising quantum algorithms for actual quantum devices. The integration of CUDA-Q with AWS Braket is a powerful synergy. Scientists can now use NVIDIA's high-performance simulation tools to write algorithms and then translate them to run on various quantum hardware via Braket. Designing the Quantum Data Centres of the Future The quantum data centres of the future will not be separate buildings from conventional ones. They are better described as environments where quantum processors are integrated with classical computers, with each system handling the tasks it is best suited to. This integration is already possible through services like Braket Direct, which enables expert users to engage directly with the quantum hardware providers. Error correction, algorithm development, and hardware optimisation all require the cooperation of quantum and classical systems. Networking, latency, and system integration issues have also inspired the development of new hardware and software solutions. How to Get Ready for the Quantum Future For organisations looking to implement quantum computing, there is a clear path through AWS' Quantum Embark program. This step-by-step approach helps organisations to graduate from quantum intrigue to quantum maturity through three key stages: determining the quantum applications most relevant to their industry; gaining practical experience with quantum hardware; and undertaking in-depth study of quantum research and algorithm development. Its modular structure allows organisations to develop their quantum strategy quickly and gain experience with quantum systems. The Future Although the quantum advantage is still distant in the commercial world, the foundation is being created today through services like AWS Braket and NVIDIA's CUDA-Q. Integrating quantum and classical computing is not only a technical idea; it is becoming an operational fact. For companies that are still unsure when they should start implementing a quantum strategy, the answer is simple: now. Not because quantum computers are ready for production use, but because preparing for them demands a new level of comprehension of what computation is and will become. As quantum technology advances, AWS Braket is making previously unattainable computing capabilities a reality, transforming groundbreaking research into practical, real-world solutions. Resources and the Path Forward Access to quantum computing tools has never been easier. Through AWS Braket, organisations can begin to explore quantum algorithms on both simulators and real quantum hardware.
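As an illustration of how low the barrier to entry now is, the short sketch below, assuming the open-source Braket Python SDK (amazon-braket-sdk) is installed, builds a simple two-qubit entangling circuit and runs it on the free local simulator; pointing the same code at a managed simulator or a real QPU is largely a matter of swapping the device:

```python
# Minimal sketch using the Braket Python SDK: a Bell-state circuit on the local simulator.
# Running on managed simulators or real QPUs instead requires an AwsDevice and an AWS account.
from braket.circuits import Circuit
from braket.devices import LocalSimulator

bell = Circuit().h(0).cnot(0, 1)      # superposition on qubit 0, then entangle it with qubit 1
device = LocalSimulator()

result = device.run(bell, shots=1000).result()
print(result.measurement_counts)      # expect roughly half '00' and half '11'
```

Results from the local simulator behave like results from managed devices, which is what makes the "experiment first, scale later" path practical.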
Leading organisations already leverage advanced virtual production techniques to demonstrate these quantum computing concepts and make them more tangible for stakeholders. For simulating quantum algorithms, NVIDIA's CUDA-Q platform supports efficient, fast development and testing. To understand the impact of quantum computing on the industry, this expert session on Disruptive Live features discussions on quantum integration, cloud-based solutions, and real-world applications. Whether you are a researcher attempting to find new and innovative ways to design quantum algorithms or a business leader seeking insights into the future of computation, the barriers to entry have never been lower. Quantum computing is not just coming; it is arriving in different forms across different technological areas. As we continue on this journey, one thing becomes increasingly clear: the future of computing is not quantum or classical computing, but the integration of the two. Platforms like AWS Braket and other collaborations between industry partners are creating that future today. ### Business Needs AI Now More Than Ever No, AI in the workplace wasn't just a 'flash in the pan'. Two years ago, it was the shiny new gadget that companies flaunted at board meetings but rarely embedded into core operations. Today, it's the backbone of workflow automation, decision-making, and even creativity. The shift has been nothing short of revolutionary—like moving from horse-drawn carriages to high-speed trains overnight. The Big Leap From Hype to Utility Back in 2023, AI in the workplace was largely experimental. Enterprises dabbled in chatbots for customer service, sprinkled some natural language processing (NLP) magic on their data analytics, and, for the particularly adventurous, flirted with reinforcement learning models for logistics optimisation. But for all the hype, adoption remained cautious—CEOs saw AI as a high-risk, high-reward proposition, the corporate equivalent of tightrope walking. Fast forward to today, and the story has changed. AI is no longer a novelty—it's an indispensable workhorse. Large Language Models (LLMs) like ChatGPT and Google Gemini now handle everything from content generation to coding, while predictive AI reshapes hiring processes, supply chain management, and even real-time fraud detection. The workplace hasn't just accepted AI; it's running on it. The Tech Behind the Transformation Several critical advancements propelled AI from 'fun-to-have' to 'must-have' in just two years. Fine-Tuned Large Language Models (LLMs) AI models in 2023 were impressive but often riddled with inaccuracies (remember the infamous 'hallucinations'?). Fast forward to today, and continual reinforcement learning has delivered vastly improved reliability in AI-driven content creation and coding assistance. Generative AI at Scale Two years ago, generating high-quality text, images, or code required considerable computational power and expertise. Today, it's embedded into Microsoft Office, Adobe Photoshop, and even customer relationship management (CRM) tools, making AI a seamless co-worker rather than a separate entity. AI-Powered Decision-Making Gone are the days when AI simply suggested options; now, it actively informs strategic choices. Financial institutions rely on AI to assess risks in real time, while HR departments use machine learning algorithms to analyse employee engagement and predict turnover before it happens.
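As a toy illustration of that predictive pattern, the sketch below, assuming pandas and scikit-learn and using entirely made-up engagement data, trains a simple classifier to estimate the probability that an employee leaves; a real system would use far richer features, much more data, and careful governance:

```python
# Toy sketch: predicting employee turnover from engagement signals.
# Data and column names are entirely hypothetical; assumes pandas and scikit-learn.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

df = pd.DataFrame({
    "engagement_score": [0.82, 0.41, 0.67, 0.29, 0.91, 0.35],
    "tenure_years":     [4, 1, 3, 1, 6, 2],
    "overtime_hours":   [2, 15, 5, 20, 1, 12],
    "left_company":     [0, 1, 0, 1, 0, 1],
})

X, y = df.drop(columns="left_company"), df["left_company"]
X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.33, stratify=y, random_state=0
)

model = RandomForestClassifier(n_estimators=100, random_state=0).fit(X_train, y_train)
print(model.predict_proba(X_test)[:, 1])   # estimated probability that each employee leaves
```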
Real-World Impact - How AI Went from Sidekick to CEO Whisperer Consider this: in 2023, AI was largely relegated to answering customer queries with slightly robotic politeness. Fast-forward to 2025, and AI has become an integral part of business decision-making. Take Amazon's logistics division, for example. Previously, AI played a supporting role in route optimisation. Today, it autonomously adjusts supply chains based on global demand fluctuations, weather patterns, and real-time stock levels. The result is reduced waste, lower costs, and a supply chain that operates with the precision of a Swiss watch. Or look at the hiring landscape. Two years ago, AI-assisted recruitment mostly meant automated CV screening. Now, it predicts employee success, assesses cultural fit, and even provides real-time coaching to hiring managers - effectively making it both talent scout and corporate therapist. The Future Smarter, More Ethical, More Integrated And here lies perhaps the biggest question: where does AI go from here? If the last two years have taught us anything, it's that AI adoption is exponential. But with great power comes great regulation—governments are now scrambling to introduce AI governance frameworks to ensure ethical use, data privacy, and bias mitigation. The workplace of 2026 will likely see AI not just as a tool but as a strategic partner. Expect AI-driven 'digital executives' to assist with decision-making at the highest levels, and intelligent automation to take over entire business functions. Yet, the human element remains irreplaceable—AI will continue to augment, not replace, critical thinking, creativity, and leadership. Summary Two years ago, AI in the workplace was a curiosity. Today, it's an inevitability. If the rapid progress continues, the next workplace transformation might not take two years—it might take two months. Companies that fail to integrate AI effectively risk becoming relics of the past, while those that embrace it will lead the charge into an era where AI is not just an assistant but an essential team member. ### Three Cloud Challenges Leaders Can Learn for AI The launch of Amazon Web Services (AWS) in 2006 accelerated not only public cloud adoption but also the uptake of new service models, from Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and Software as a Service (SaaS) to, more recently, Function as a Service (FaaS) and machine learning operations offerings. The low-cost, highly flexible model offered by the public cloud was too tempting to resist, and very quickly organisations of all sizes were moving a variety of workloads to the cloud. However, "cloud regret" soon set in, with a variety of challenges arising that hadn't been foreseen when opting to migrate, causing headaches across organisations. As the famous saying goes: "Those who cannot learn from history are doomed to repeat it" - George Santayana. Nearly two decades later, artificial intelligence (AI) poses similar challenges for organisations rushing to capture efficiency gains without considering the long-term implications: vendor lock-in; cost management, as AI service costs, like cloud costs, can be difficult to understand, control and account for; data sovereignty; and a lack of safety nets, especially given the fast-moving pace of recent AI developments.
With AI experiencing unprecedented growth, and subsequent adoption by organisations, it's no wonder that it is on track to become a $1.3 trillion market (equivalent to £1 trillion) in less than a decade. With implementations set to continue increasing, learning from the top three challenges of public cloud adoption could prevent later AI regret. Challenge 1: Putting all of Your AI Eggs in One Basket Many technology leaders will remember the pain of being tied to expensive contracts during the initial rush to benefit from the cloud, with an increasing number of businesses reliant on a single cloud provider, leaving them vulnerable to update issues, outages and reliability problems. According to Gartner, 80% of cloud-migrated organisations faced vendor lock-in issues. Solving problems and delivering value will require creating services that leverage different AI technologies; constraining your solutions to a single provider will impact your competitive advantage. Cloud lock-in made it difficult for organisations to switch to another cloud provider, or revert to on-premises solutions. In the same way, what is considered best in class for AI today may not be tomorrow. Having the flexibility to shift between AI vendors is crucial as the market is moving at a rapid pace. Organisations that were hoping to leverage the earliest AI technology might find themselves tied to models that are now obsolete, leaving them in the dust of companies with long-term AI implementation strategies. It isn't just technology that changes; your needs do too, and being married to just one vendor is likely to hold you back and, like the cloud, see you paying more. Challenge 2: Where's Your Data Gone? When businesses first moved to the public cloud, there were few regulations governing data storage and privacy. However, regulation always eventually catches up with innovation, and in data storage a number of frameworks emerged, including the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA); as a result, companies had to react quickly to stricter compliance requirements. Similarly, as AI adoption continues to increase, regulations will likely follow suit, particularly around data processing. Much like the public cloud transition, businesses looking to successfully implement AI will need to account for region-specific compliance standards and take the necessary steps to protect their data. Challenge 3: Running Before You Walk The hype around new technologies can make organisations eager to adopt them as fast as possible, and AI is no different. The excitement surrounding AI, much like the early rush to the public cloud, could push organisations to adopt it hastily, often without proper groundwork, and AI adoption risks following a similar trajectory. Indeed, 25% of tech leaders have reported investing in AI too quickly, leading to teething problems, some of which have gone viral in recent months. Organisations looking to invest in the benefits of AI should integrate AI-based applications piece by piece, from well-established providers. While AI promises efficiency and transformative potential, it's crucial to start small, applying AI to less critical functions before scaling up. This iterative approach allows for troubleshooting and learning without putting core business operations at risk. Learning from the Cloud As organisations navigate the rapidly advancing AI landscape, the lessons learned from the early days of public cloud adoption are invaluable.
The challenges of vendor lock-in, data sovereignty, and rushed implementation are just as relevant today as they were almost two decades ago. To avoid repeating past mistakes, technology leaders must approach AI implementation from a strategic perspective: starting small, ensuring flexibility in vendor relationships, and staying ahead of evolving regulations. By doing so, businesses can leverage AI's transformative potential while safeguarding against costly missteps and ensuring sustainable, long-term success. ### Impact of IoT on Future Web App Development The growing use of IoT in web app development stems from its ability to gather and analyse real-time data from connected devices, which opens the door to more innovative solutions. IoT is changing how developers approach functionality, user experience, and system integration. Web app development companies in the USA are shaping the development process through this innovation and opening new possibilities for growth. This article explores the Internet of Things and its broader usage across web app development. What is the Internet of Things (IoT)? The Internet of Things is the network of linked devices fitted with sensors, software, and other technologies that allow them to gather and share data online. These devices, which range from wearable health monitors to smart home systems and industrial machinery, create an ecosystem of constant data interchange. Leading app development services providers in the USA enable this smooth connectivity and are transforming the manufacturing, logistics, healthcare, and agriculture industries by promoting automation, real-time monitoring, and intelligent decision-making. For instance, imagine an industrial facility where machinery automatically flags its maintenance requirements, or a smart home where your thermostat interacts with your lighting system. Through the Internet of Things (IoT), you can expect a world in which all systems and devices cooperate to improve productivity and user experience. 7 Reasons IoT Can Shape Web App Development Let's check out seven main reasons IoT can improve web development - Live Insights Web app developers can interpret the vast volumes of real-time data generated by IoT-enabled devices and turn them into actionable insights. This capability changes how companies function by enabling them to react quickly to shifting circumstances. For example, a retail web app may track inventory levels in real time to automate restocking (a minimal sketch of this pattern appears at the end of this article). In contrast, a logistics web app may leverage Internet of Things data from delivery vehicles to optimise routes and minimise delays. Live insights streamline processes and improve customer satisfaction by providing immediate, data-driven intelligence to decision-makers. Advanced Backend Capabilities The massive data streams that IoT generates necessitate reliable backend systems. Web apps must interface with databases, middleware, and cloud services to store, analyse, and retrieve IoT data. To satisfy the ever-changing needs of IoT ecosystems, scalable architectures such as microservices and serverless computing are becoming increasingly common. These backend technologies are the foundation of IoT-powered applications, allowing them to handle intricate operations and guarantee dependable performance even when faced with demanding workloads. Enhanced System Security Connected devices are frequent targets for cyberattacks, so IoT ecosystems demand stronger security.
Web apps that interface with Internet of Things devices must guard against data breaches and preserve user trust. This calls for sophisticated encryption techniques, secure authentication procedures, and real-time monitoring capabilities. Security approaches like zero-trust frameworks and blockchain technologies are also becoming more popular. By prioritising security, developers can protect sensitive data and guarantee the dependability of web applications driven by the Internet of Things. Seamless Chatbot Integration IoT-powered web apps use chatbots to improve user engagement. These intelligent assistants can help users with device problems, offering real-time support and walking them through an app's features. For example, with a smart home app connected to IoT devices, users can use simple text or voice commands to regulate the thermostat, change the lights, or keep an eye on security. Chatbots improve user engagement by providing a more individualised and user-friendly experience and lowering the learning curve for intricate IoT systems. Next-Gen Technology Integrations Thanks to the Internet of Things, web apps can now integrate cutting-edge technology like augmented reality (AR), machine learning (ML), and artificial intelligence (AI). These technologies increase the intelligence and interactivity of IoT-driven web apps. For example, IoT-based education software can employ AR to bring lessons to life, ML to forecast learning paths, and AI to assist with grading. In addition to improving functionality, these integrations reshape user expectations and establish new standards. Faster Performance Because the Internet of Things emphasises real-time communication, web apps must be lightning fast. Technologies such as content delivery networks (CDNs) and edge computing reduce latency and increase responsiveness by processing data closer to its point of origin and lessening reliance on central servers. This level of optimisation enables IoT-powered web apps to manage large data volumes quickly without experiencing performance issues. Dynamic and Adaptive UI/UX Because of the wide range of IoT use cases, web apps must have adaptable and flexible user interfaces to meet the needs of different user types. More and more developers are designing systems that adapt dynamically to different kinds of devices, user preferences, and real-time data inputs. An edtech app, for example, might offer different interfaces for teachers, students, and parents to guarantee that every user sees pertinent information and features. This flexibility keeps IoT-powered apps usable and accessible in all settings, improving user satisfaction. Conclusion IoT integration is fundamentally changing how web apps are designed and implemented. IoT is driving a new era of web app innovation by facilitating real-time insights, strong backend capabilities, improved security, and smooth connections with next-generation technologies. App development companies in the USA are taking full advantage of these developments, creating innovative and responsive solutions. IoT's impact on web app development will only increase as the technology matures. By deploying IoT-powered web apps, development companies can provide users with powerful capabilities and seamless experiences. Adopting IoT is essential for leading the way into a more intelligent and connected future, opening up immense growth potential.
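To ground the "live insights" pattern mentioned above, here is a minimal sketch, assuming the Python websockets package and a hypothetical telemetry endpoint and message format; a production service would add authentication, reconnection logic, and a time-series store:

```python
# Minimal sketch of the "live insights" pattern: a backend consuming IoT telemetry
# over a WebSocket feed. The endpoint URL and message shape are hypothetical.
import asyncio
import json

import websockets


async def stream_inventory(uri: str = "wss://iot.example.com/telemetry") -> None:
    async with websockets.connect(uri) as ws:
        async for message in ws:
            reading = json.loads(message)
            # e.g. {"store": "LDN-04", "sku": "A1", "stock": 3}
            if reading.get("stock", 0) < 5:
                print(f"Low stock at {reading['store']}: reorder {reading['sku']}")


if __name__ == "__main__":
    asyncio.run(stream_inventory())
```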
### Generative AI Unlocking Quantum Computing Data Potential In the modern world of technology, two transformative forces are converging: generative AI and quantum computing. While each of these domains will revolutionise industries on its own, their combined capabilities promise an order-of-magnitude advance in data simulation and predictive modelling. This blog explores how generative AI, particularly with the expertise of a Generative AI Development Company, is unlocking new frontiers in quantum computing, addressing challenges, and creating pathways for revolutionary applications. What is Quantum Computing? Quantum computing is a new way of computing that uses the principles of quantum physics as the basis for calculation. Unlike classical computers, quantum computers work with quantum bits, or qubits, rather than the standard binary bits (0 and 1). Qubits can be in the states 0 and 1 at the same time, a property called superposition, and can also be linked in pairs so that one qubit's state is tied to the other's, a property called entanglement, which yields far greater computational power. Key Features of Quantum Computing: - Superposition: The ability to hold and work on several states at the same time. - Entanglement: Coherent states of qubits that enhance computational speed. - Quantum Tunnelling: Helps identify solutions to difficult optimisation problems in far less time. Although the field of quantum computing is relatively young, its ability to address tasks that are unfeasible for classical computing has attracted interest around the world. AI's Role in Quantum Simulations Quantum systems have high levels of complexity and dynamics that are not easily understood or described analytically. It is at this juncture that generative artificial intelligence comes into the picture. With the help of complex algorithms, generative AI can mimic quantum systems, forecast their behaviour, and suggest better designs for quantum circuits. How Generative AI Enhances Quantum Simulations: - Modelling Quantum Behaviour: Generative models, including GANs and VAEs, can simulate quantum states and their interactions with high precision. - Data Augmentation: Generative AI can produce synthetic quantum data to help developers train quantum algorithms more effectively. - Circuit Optimisation: With the help of machine learning, quantum circuit settings can be tuned to make their operation as effective as possible and reduce the probability of errors. - Predictive Insights: AI models can anticipate the results of quantum computations, avoiding repeated runs that consume significant time and effort. This integration demands expertise often provided by a Generative AI Development Company that understands the nuances of both quantum computing and artificial intelligence. Applications of Generative AI in Quantum Computing The cooperation between generative AI and quantum computing has created significant opportunities to transform markets. Below are some key areas where their integration is proving transformative: 1. Cryptography Quantum computers are a concern for traditional cryptographic encryption because they can break widely used ciphers. However, generative AI might help in creating quantum-safe cryptography methods for maintaining cyber security in the quantum age. 2.
Material Science Understanding materials, and even creating novel ones, requires modelling the interactions of atoms, a task that classical computers struggle with but quantum machines are well suited to. Generative AI can speed up these simulations so that scientists can find promising new materials for semiconductors, renewable energy, and much more. 3. Drug Discovery The pharmaceutical industry struggles to model molecular structures and interactions. Generative deep learning coupled with quantum computing can mimic these interactions at a greatly accelerated pace, leading to faster development of essential drugs. 4. Financial Modelling Complex financial systems involve large, mixed data collections and complicated interdependencies. Quantum computing combined with generative AI can capture these dynamics and the associated risks, enabling more efficient decision-making systems. 5. Climate Modelling Generative AI can apply quantum-driven simulations to analyse climate systems and their changes, supporting the world in designing strategic approaches to climate change. Challenges and Limitations While the opportunities are tremendous, there are some important challenges that have to be solved before generative AI and quantum computing can be fully harmonised. 1. Technical Scalability Quantum computers are still at an experimental stage, with small numbers of qubits and relatively high error rates. The biggest issue, of course, is scaling these systems to the point where they can address real-world problems. 2. Data Complexity Quantum data is generally very large and difficult for AI systems to comprehend, apply, or analyse. Training generative AI on quantum data requires a very high level of computing capacity. 3. Resource Requirements Quantum computing and generative artificial intelligence are both resource-intensive. Creating infrastructure that facilitates their integration calls for large investments in hardware and expertise. 4. Ethical Considerations The capabilities of quantum-enhanced intelligent applications raise ethical questions, especially around surveillance, privacy, and weapons. Addressing these issues remains essential to responsible innovation. Future Outlook The integration of generative AI and quantum computing could transform more than one sector. Here is a glimpse of what the future may hold: 1. Cybersecurity Sophisticated quantum cryptography techniques created with generative AI could change how information is protected from future sophisticated threats. 2. Energy Optimisation Quantum-augmented AI could model energy systems with incredible accuracy and help revolutionise renewable energy and grid integration. 3. Healthcare Revolution Ranging from individualised medications to enhanced diagnostic equipment, the combination of quantum computing and generative artificial intelligence might improve how medical facilities operate. 4. Scientific Research Generative AI Development Services are already helping researchers model quantum phenomena more efficiently. It is anticipated that as the technology progresses, there will be additional exciting findings in fundamental physics and beyond. 5.
5. Industrial Innovation - Manufacturing, logistics and many other sectors could benefit from the more efficient planning and forecasting that quantum-augmented AI makes possible.

The Role of Generative AI Development Companies

Specialised Generative AI Development Companies bridge the gap between AI and quantum computing. These companies provide customised Generative AI Development Services, ensuring businesses can use the potential of these technologies effectively. They are invaluable at every stage, from creating AI models that mimic quantum systems through to implementing those solutions in production configurations.

FAQ

1. How does generative AI support quantum computing? Generative AI models emulate quantum processes, improve circuits, and forecast results, contributing significantly to advances in quantum computation.

2. What industries benefit most from AI and quantum computing integration? Industries including banking, material science, cybersecurity, and healthcare see the most transformative results.

3. What are the challenges in merging generative AI with quantum computing? The four most significant limitations are scalability, resource needs, the properties and characteristics of quantum data, and ethical issues.

4. How can businesses utilise generative AI in quantum computing? By partnering with a Generative AI Development Company to access customised Generative AI Development Services for their specific needs.

Conclusion

The integration of generative AI and quantum computing goes beyond a purely technological story; it is an architectural shift in how we approach computation, simulation and innovation. There is still much work to be done, but together these technologies are capable of things we never thought possible across industries. With the guidance of a Generative AI Development Company and access to innovative Generative AI Development Services, businesses and researchers can stay ahead in this transformative era.

### Driving the Future of Connectivity with Data Centres

Astonishingly, there are 5.3 billion people using the internet in 2024, equating to 66% of the world's total population. As internet usage surges, the demand for high-speed data transfer increases. Data centre interconnection (DCI) enables seamless, secure communication between data centres, addressing challenges like scalability, latency, and data redundancy, which are vital for managing interconnected data flows.

For businesses today, data centre interconnect (DCI) is the glue that keeps their digital assets connected, secure, and easily accessible. Imagine this: you're a large enterprise or a cloud provider with multiple data centres across regions, each housing massive amounts of data. Without DCI, data would be siloed, disconnected, and difficult to manage, leading to inefficiencies, high operational costs, and potential security risks.

At its core, DCI is all about linking data centres, whether across the street or across the country, ensuring they work as one seamless unit. This setup solves a few crucial problems:

Data Silos and Latency: data sitting in separate locations without interconnectivity leads to inefficient processes and delays. With DCI, data can flow freely between data centres, improving response times, reducing latency, and enabling real-time processing - essential for industries like finance, e-commerce, and media streaming.
Compared with hyperscale data centres, enterprises that run their own on-premises data centre can cut latencies by 12 to 30 times.

Disaster Recovery and Redundancy: Natural disasters, cyber-attacks, or technical failures can put data centres out of commission. Over the past ten years, FEMA has allocated more than USD 165 million in grant funds to strengthen state and local jurisdictions' cyber readiness because of the close relationship between cyberattacks and natural disasters. DCI helps ensure that if one data centre fails, another can take over, providing a layer of redundancy vital for business continuity.

Scalability and Flexibility: Growing businesses need flexible solutions to add storage and computing power on demand. In 2020, cloud spending increased by 36%, or USD 29 billion. The number of business drivers for public cloud computing rose sharply to 23% in 2023. DCI makes scaling across different facilities possible without complex overhauls or migration issues.

Moreover, the trends shaping the Data Centre Interconnect (DCI) industry include AI-driven traffic management for optimised data flow, edge computing integration to handle IoT demand, and multi-cloud solutions for a seamless hybrid environment. 34% of enterprises utilise one cloud and 27% use two, making up 61% of businesses in 2022. Innovations like quantum-resistant encryption enhance security, while advancements in software-defined networking (SDN) simplify scalability and improve operational efficiency, driving market growth.

Why are Companies Pouring Investments into DCI?

As businesses race to adopt cloud solutions, streamline operations, and secure data, the need for effective DCI solutions is skyrocketing. Companies are investing in DCI for several reasons:

Lower Costs and Enhanced Efficiency: Instead of duplicating storage resources, DCI allows businesses to pool resources across multiple data centres, optimising capacity and cutting down unnecessary expenses.

Global Expansion and Performance: For enterprises aiming to expand globally, DCI ensures that data can be accessed with consistent performance, no matter where users or clients are located. This improves user experiences, minimises latency, and supports seamless global operations.

Security Compliance and Data Sovereignty: With regulations becoming increasingly strict, companies need to manage where their data is stored and how it's transferred. DCI enables secure data flows that comply with regional data privacy regulations, supporting everything from GDPR in Europe to regional data requirements in Asia.

How DCI Powers Key Sectors

Data Centre Interconnect isn't just a tech solution; it's a growth enabler for industries across the board. Here's how it's playing a pivotal role in various sectors:

Finance - According to PwC's 2023 Cloud Business Survey, 95% of banking and capital markets participants are either already entirely on the cloud or expect to be within two years. DCI is crucial for financial institutions, enabling fast data processing, secure trading, and efficient data backup while meeting regulatory standards.

Healthcare - DCI streamlines access to patient records and imaging data across facilities, enabling quicker diagnoses, treatments, and global collaboration for compliant research. In 2020, the healthcare sector was estimated to be spending about USD 2.7 trillion annually on data centres and other IT infrastructure.

Media and Entertainment - For video streaming companies, latency is detrimental.
DCI provides low-latency, high-speed connectivity for real-time streaming during peak viewership without interruptions. In 2023, 35% of data centre demand is anticipated to come from the media and entertainment sectors.

Telecom and Cloud Providers - Telecoms and cloud giants depend heavily on DCI to link vast amounts of data spread across data centres globally. 3% of global energy use is attributed to cloud data centres. This interconnectivity is essential for 5G rollouts, content delivery networks, and supporting ever-increasing internet traffic.

Industrial Landscape: Key Players and Competitive Dynamics

The DCI market has an impressive roster of players, including Cisco, Huawei, and Nokia, who offer solutions ranging from optical transport to software-defined networking (SDN) systems. These companies focus on integrating high-capacity, low-latency links with enhanced security features to stay ahead. Cloud providers like Amazon Web Services (AWS) and Microsoft Azure are also highly active in the DCI space, investing heavily to ensure inter-region data centre connectivity. For instance, in October 2022, Oracle launched a new Interconnect for Microsoft Azure in Johannesburg, South Africa. This provides direct access between Oracle Cloud and Azure, allowing customers across Africa to use the Oracle Database Service and integrate workloads with Oracle Cloud Infrastructure (OCI).

Furthermore, according to forecasts, the Data Center Interconnect Market was valued at USD 10.56 billion in 2024 and is expected to surpass USD 54.15 billion by the end of 2037. This represents a compound annual growth rate (CAGR) of over 13.4% during the forecast period from 2025 to 2037.
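Those forecast figures are easy to sanity-check: a couple of lines of Python reproduce the growth rate implied by the 2024 and 2037 values quoted above.

```python
start, end = 10.56, 54.15        # market size in USD billions (2024 and 2037)
years = 2037 - 2024              # 13-year horizon

# Compound annual growth rate implied by the two forecast values
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # roughly 13.4%, matching the quoted figure
```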
Competition is driving rapid innovation, especially in areas like SDN and multi-cloud connectivity. Vendors are developing solutions that make it easier for businesses to connect data centres from different cloud providers, creating a flexible, hybrid cloud environment. This flexibility is a game-changer for businesses looking to balance performance, cost, and regulatory needs.

As we have seen, the data centre interconnect (DCI) industry is crucial for seamless data flow, redundancy, and scalability. As companies adopt AI and edge computing, DCI will become essential for digital transformation, ensuring data is both accessible and secure in our connected future.

### Hybrid Cloud - Mix and Management

Most organisations, whether they realise it or not, are already using hybrid cloud services. At the very least, their email is likely a cloud service, but doubtless they will be using others for various purposes, perhaps alongside remaining on-premises infrastructure or data centre co-located equipment. Unless the organisation is wholly served by Microsoft or Google, internal infrastructure or a fully comprehensive private cloud, they are to some extent already hybrid. In many cases this will not be a general strategic decision, but rather the result of ad hoc changes over the years for specific sets of services.

This reflects how technology and our use of it has evolved in recent times, driven by cost, efficiency, resiliency and performance. New services being created and legacy services being migrated are why cloud investment continues to grow. This is unlikely to be a trend that will be reversed, despite fluctuations in the kinds of cloud services in demand and increasing concern among companies and governments about big tech dominance in many areas.

The risk of big tech public cloud

The benefits of cloud services are well known, but the more interesting issue is the type and mix of cloud services that any one organisation should be employing in their hybrid infrastructure. Whilst it might seem to make operational sense to limit these to as few separate services as possible, such an approach carries risks which need to be carefully considered.

For example, many organisations will default to big tech public cloud because they already use it to some extent (likely seeded with years of email use) and can centralise administration in one portal, more easily train and recruit experienced staff and align terms and billing, amongst other aspects. There will be a nebulous trust, supported by partners and peers within the same technical ecosystem and a general knowledge of the ultimate vendor, that makes board decisions easier (everyone knows Microsoft, Google, Amazon etc). On the surface, centralising as much as possible with such a vendor seems to make sense.

Diversifying your cloud services

However, the old 'putting your eggs in one basket' aphorism stands firm. If that one vendor experiences problems, the impact on your business will be far greater than if services were distributed across multiple independent providers. Such incidents are widely reported as they can often be global. The largest targets will attract the attention of bad actors, meaning vulnerabilities will be rapidly exploited, and a vigilant security posture may not be enough when caught in the crossfire of escalating geopolitical tensions.

From a strategic perspective, a broader mix of cloud services should be considered. Ultimately, this will depend on the size of the organisation and the kind of services that are required, but it is highly likely that smaller vendors will offer better-fit services than the big tech generalists. Innovative specialists can replace or run alongside more generic infrastructure and provide enhanced performance, more responsive support and a better return on investment.

Choosing the right fit for business requirements

Virtual desktop infrastructure (VDI) and Infrastructure as a Service (IaaS) are cases in point. These are generic services offered by big tech public cloud providers. For organisations that require low compute and storage, such an offering may make operational and budgetary sense, even with the risk of total downtime. But the cost of these services can ramp up very quickly, especially for large and more complex compute requirements. It can be difficult to justify multiples of cost when performance and resiliency don't improve; a problem that has seen a number of high-profile exits from the public cloud in recent years.

Private cloud or dedicated specialist vendors are better bets for specific requirements and make a return on investment much more likely when the service is made to measure for a particular organisation or industry. Such vendors will be much more engaged in the relationship and invested in long-term success, including incorporating requests and feedback into service development.

The need for cloud vendor flexibility

Service offerings and business requirements are likely to change at an ever faster rate in the coming years as economies fluctuate, politics shift and new tech emerges. Flexibility is key, ensuring that any services employed allow for adjustment to new challenges and opportunities, including avoidance of long-term vendor lock-in.
The ability of an organisation to adjust its hybrid cloud mix in terms of scale and service type - without crippling disruption - is going to be increasingly important. Planning infrastructure beyond five years at the most will be problematic. Look to cloud vendors to address these issues. Balance services across public, private and - where it makes sense - on-premises. Spend adequate time on due diligence, assessing risk factors and potential consequences, and maintain an evolving strategy. Being able to draw on the appropriate skills and resources to manage a hybrid infrastructure is a critical investment.

### Willow's Breakthroughs in Quantum Stability

The Start of a New Era

Have you heard the latest news? Google has mesmerised the world with its most recent announcement: Willow, its most sophisticated and powerful quantum computing chip to date. Why is this important, you ask? Well, this new development isn't just for scientific research in labs; it has the potential to change fields from pharmaceutical research to financial modelling and many more! In this article, I will be discussing what quantum advantage is, why Willow is important, its real-world applications (so where and who will benefit from this remarkable new chip), Google's journey with quantum, and what is in the pipeline for Willow. If you're looking to read about Willow in more detail, you've come to the right place!

What is The Quantum Advantage?

Okay, so before we nestle down into all the nitty-gritty details of Willow and Google, it's best to ease into the topic by discussing the basics. What is quantum advantage? It's important to know that traditional computers process information in a fundamentally limited way, no matter how powerful they are! They work with bits, which must be either 1 or 0 - think of it like having millions of tiny switches that must be either on or off. This format limits what a classical computer can do. When it comes to complex problems, such as examining a huge number of possibilities at the same time, they will struggle compared to quantum computers.

So how do quantum computers work, and how are they different from classical computers, you may ask? Quantum computers don't work only with 1s and 0s like classical computers; they use something called "qubits," which can be in multiple states at the same time (a concept called superposition). This unique feature lets them handle some problems much faster than traditional computers ever could. The main benefit of a quantum computer is its power to work on multiple possibilities at the same time - as mentioned above, something a classical computer cannot do. What is fascinating is that as you add more qubits, the ability to explore possibilities in parallel grows dramatically, making quantum computers far more powerful than traditional computers for certain tasks.

Willow's Features

Okay, now we have that explained and out of the way, let's move on to the main topic of conversation today: Willow.

A 100 Microsecond Milestone

Okay, so why is Willow making headlines, and what is so good about it? Well, as we discussed earlier, quantum computing is all about working through multiple possibilities at once, and doing so quickly. Willow takes quantum computing to a whole new level with its ability to stay in its quantum state much longer than its predecessors. It has been recorded staying in this quantum state for 100 microseconds, five times longer than its predecessor, Sycamore.

Microseconds sound tiny, right? Well, though that is true on our natural human timescale, in the quantum world this is a huge leap forward. These extended coherence times allow for more complex calculations and bring us closer to practical quantum computing applications. This is huge news: with longer coherence times, quantum computers are able to run more complex calculations before their quantum state breaks down, making it possible to tackle problems that were too difficult to solve before.
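As a rough illustration of why both qubit count and coherence time matter, the short sketch below prints the 2^n growth in the number of basis states a register can hold, plus the predecessor coherence time implied by the five-times figure quoted above. The 2^n scaling is standard theory; the Sycamore number here is only an inference from that ratio, not an independently sourced measurement.

```python
# Each extra qubit doubles the number of basis states a register can hold
# in superposition, so capacity grows as 2**n.
for n in (10, 50, 100):
    print(f"{n} qubits -> {2**n:.3e} basis states")

# Coherence: Willow is reported at 100 microseconds, five times its
# predecessor, which implies roughly 20 microseconds for Sycamore.
willow_us = 100
sycamore_us = willow_us / 5
print(f"Implied predecessor coherence: {sycamore_us:.0f} microseconds")
```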
Correcting The Errors

One constant challenge is managing errors, because quantum states are fragile and keeping them stable enough has proven difficult in the past. Willow is able to overcome these problems using 'logical qubits', allowing it to stay stable enough to correct these errors. This breakthrough represents the first time this level of error correction has been achieved in a practical quantum system. The only way we will be able to create reliable quantum computers is by fixing these errors more effectively, which will prevent crashes and stop mistakes from creeping into results.
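To give a feel for the error-correction idea, here is a toy classical sketch: a three-bit repetition code with majority-vote decoding. It is not Willow's actual scheme, and the 5% error rate is just an assumed, illustrative value, but it shows how spreading one logical bit across several noisy physical bits suppresses the error rate.

```python
import random

random.seed(1)
PHYSICAL_ERROR = 0.05   # chance each physical bit flips (illustrative value)
TRIALS = 100_000

def noisy(bit: int) -> int:
    """Flip the bit with probability PHYSICAL_ERROR."""
    return bit ^ (random.random() < PHYSICAL_ERROR)

raw_errors = sum(noisy(0) != 0 for _ in range(TRIALS))

# Encode the logical bit into three copies and decode by majority vote;
# the logical bit is only lost if two or more copies flip.
logical_errors = sum(
    sum(noisy(0) for _ in range(3)) >= 2 for _ in range(TRIALS)
)

print(f"raw error rate:     {raw_errors / TRIALS:.4f}")
print(f"logical error rate: {logical_errors / TRIALS:.4f}")
```

Real quantum error correction, such as the surface codes used to build logical qubits, is far more involved, but the principle is the same: encode one logical qubit across many physical qubits so that individual faults can be detected and corrected before they corrupt the calculation.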
Advanced Design

Willow's new design sets it apart from previous quantum processors, featuring an innovative chip architecture. The new structure allows qubits to interact more efficiently, helping it run more advanced quantum calculations. The system's adjustable qubits and connectors also work faster and with fewer errors, pushing the limits of what quantum computers can do! Willow features a chip with an advanced design built specifically for quantum computing applications. With these improvements, the qubits are more stable and less affected by outside factors, creating a more reliable system.

The Real-World Impact

Transforming Medical Research

Medical research stands to benefit from Willow. It can speed up drug discovery by simulating how molecules interact with remarkable accuracy, shortening the time it takes to discover new medicines and cutting development cycles from years to months! Researchers could better understand protein folding and how drugs work, leading to better treatments for difficult diseases. By accurately simulating quantum systems, Willow can help scientists better understand how diseases work at the molecular level. As a result, it could lead to new ways of treating diseases like cancer or neurodegenerative conditions, opening the door to new medical breakthroughs.

New Financial Systems

Another industry that stands to benefit from Willow's capabilities is the financial sector. Complex risk assessments that would originally have taken days to complete could, with Willow's help, be completed in a matter of minutes. Willow could even make managing investments (also known as portfolio optimisation) more sophisticated. By taking a wide variety of factors and market scenarios into consideration, Willow could contribute to a more stable financial system with better investment strategies. Quantum computing also has the potential to completely change how we protect data and secure financial transactions; however, for this to happen, further improvements are needed in error correction and qubit stability.

Helping Climate Science

Understanding and predicting climate change requires processing enormous amounts of data and recreating intricate environments. This is where Willow can come into play. Willow's quantum abilities could make these models more accurate, helping scientists better understand and predict climate patterns, which could prove hugely valuable in finding effective solutions to combat climate change. Willow's ability to simulate chemical reactions accurately could also help in creating efficient technologies for capturing carbon and in discovering new materials for renewable energy, making efforts to fight climate change more effective.

Google's Quantum Journey

Google's journey with quantum computing is hugely impressive, and the announcement of Willow marks another milestone ticked off. Starting with Foxtail in 2017, followed by Bristlecone in 2018 and Sycamore in 2019, each chip has brought significant improvements. Google has continuously shown its dedication and long-term commitment to advancing quantum technology by investing in a new facility in Santa Barbara. This facility is focussed on making quantum chips, using specialised equipment and featuring state-of-the-art clean rooms.

The Path Forward

Despite the huge success of Willow, we are still in the early stages of quantum computing development. The technology continues to evolve rapidly, and new applications are being discovered regularly. As quantum computers become more powerful and reliable, we may unlock possibilities that we haven't yet imagined. The establishment of Google's new fabrication facility suggests that we can expect even more breakthroughs in the coming years. This dedicated space for quantum chip development could accelerate the pace of innovation, potentially leading to even more powerful quantum processors in the near future.

Looking to The Future

The quantum computing revolution isn't a distant dream; it's unfolding right now. Willow is more than just technology; it marks a major change in how we approach solving tough problems. As we continue to push the boundaries of what's possible with quantum computing, we're not just advancing technology - we're unlocking new possibilities for human knowledge and what we can achieve. The next few years will be crucial in figuring out how quickly quantum computing can move from the research lab to practical applications. With continued investment and innovation, we might see quantum computers solving real-world problems sooner than we think.

### Cloud Computing Demands Robust Security Solutions

Modern organisations are increasingly reliant on cloud computing to drive creativity and productivity. With that influence comes a significant degree of responsibility, particularly when it comes to protecting personal data and preventing cybercrime. Cloud security solutions are the protectors of this digital world, providing strong safeguards for data confidentiality and integrity. These solutions are essential for guaranteeing that businesses can confidently take advantage of cloud computing without sacrificing security. We will go deeper into the ever-changing world of cloud security to uncover why these solutions are so critical to preserving our digital futures.

Understanding the Threat Landscape

To stay ahead in the digital age, modern businesses are constantly changing in a world full of technological marvels. However, there are challenges ahead as companies adopt cloud computing. The volume of data and information saved in the cloud is both an opportunity and a difficulty.
The cloud delivers unparalleled accessibility and flexibility, but it also demands ongoing monitoring for new cyberthreats.

Typical Cyberthreats Modern Businesses Face

Every cyberthreat that modern businesses encounter is more advanced than the last, exploiting vulnerabilities to inflict damage. Phishing attacks, in which skilled scammers trick victims into divulging personal information like usernames and passwords, are one of these risks. Ransomware, which prevents users from accessing their systems or data unless a ransom is paid, is another significant issue. DDoS attacks can take networks or services down by overloading them with traffic. Data breaches involving unauthorised access to personal data can result in serious harm to a company's finances and reputation. Each of these threats compromises the integrity, confidentiality and availability of enterprise data. Understanding these risks is crucial to fortifying a business's defences as it increasingly conducts business online.

Importance of Cloud Security in Data Protection

Cloud computing has revolutionised how businesses access and manage their data. This change, however, makes cloud security a crucial component of data security plans. Effective cloud security protects data from loss, theft and unauthorised access, and offers a framework for preserving data availability, confidentiality and integrity in the face of evolving cyberthreats. Businesses can confidently navigate the digital ocean when they have the right cloud security measures in place, comfortable in the knowledge that their most valuable resources are protected from unforeseen circumstances. Customers and clients are also more trusting when there is strong cloud security, because they know that their financial and personal information is secure.

Key Components of Cloud Security Solutions

To secure their cloud infrastructures, businesses need to employ a multi-layered strategy. Implementing a range of tactics strengthens a company against a variety of dangers. The most important elements of that defence are highlighted below.

Data Encryption Techniques

Data encryption is crucial to cloud security because it transforms plain data into a coded format that cannot be read without the proper key. Even if data is intercepted, this ensures it cannot be accessed by unauthorised users. Data is routinely protected both in transit and at rest using proven algorithms such as the Advanced Encryption Standard (AES) and Rivest-Shamir-Adleman (RSA). Robust encryption protocols keep sensitive information from exposure and improve the overall security posture. For businesses, encryption serves as a formidable barrier that safeguards the data that is the lifeblood of their operations.
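As a minimal sketch of encryption at rest, assuming the widely used Python cryptography package is installed (its Fernet recipe applies AES with an integrity check), the example below encrypts a record before storage and decrypts it only for a holder of the key. In a real deployment the key would live in a key management service, not in a variable.

```python
from cryptography.fernet import Fernet

# Generate a symmetric key; in production this would live in a key
# management service, never alongside the data it protects.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer_id=4821;card_last4=1234"

# Encrypt before writing to cloud storage ...
token = cipher.encrypt(record)
print("stored ciphertext:", token[:40], b"...")

# ... and decrypt only when an authorised service holds the key.
assert cipher.decrypt(token) == record
```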
IAM

Identity and Access Management is the watchful guardian of the cloud environment, controlling who can access which resources. With IAM, businesses can manage and regulate user permissions, guaranteeing that only individuals with the proper authorisation can access confidential data. Initiatives like single sign-on and two-factor authentication improve security by enforcing stringent validation procedures. By putting strong IAM solutions in place, businesses reduce the risk of unwanted access and keep a strong defence against possible attacks. IAM is essential for building a stronghold of security that satisfies both enterprise needs and individual privacy.

Network Security Measures

Cloud network security is the unseen nerve centre protecting system connections and communications. It includes a variety of defences such as VPNs, intrusion detection systems (IDS) and firewalls. By monitoring and reacting to anomalous activity, these tools serve as early warning systems, informing security teams of possible breaches before they become more serious. Integrating thorough network security procedures into daily operations gives companies a strong defence against the complex nature of cyberattacks and helps ensure the dependability and security of their services. Each of these elements can significantly strengthen contemporary businesses against the constantly changing threat landscape while balancing operational excellence, safety and assurance.

Cloud Security Implementation in Contemporary Businesses

Successfully implementing cloud security solutions in contemporary businesses takes a strategic approach. Fortunately, firms can protect their assets from the rising wave of cyberthreats by adopting best practices and the appropriate technology.

Selecting the Right Security Solutions for Your Business

Choosing the right cloud security solutions is crucial for any business. It entails assessing the possible risks to your data and understanding your particular business requirements. These are some important things to remember:

Understand Your Cloud Environment: Every cloud environment, whether private, public or hybrid, presents a unique set of security issues. Being aware of your surroundings helps you select the appropriate security tools and procedures.

Evaluate Scalability: Select solutions that can scale with your company. The amount of data and the complexity of threats will grow along with it.

User Access Management: Use strong identity and access management systems to guarantee that sensitive information is accessible only to authorised individuals.

Integrating Cloud Security with Existing Infrastructure

Integration is frequently one of the most difficult parts of putting cloud security into practice. When executed well, though, it can improve the overall security posture of your company. The following advice will help:

Conduct a Thorough Assessment: Identify any vulnerabilities in your current security setup that cloud deployments might expose.

Use APIs for Smooth Integration: Many cloud security solutions provide APIs that make it simple to integrate with existing systems, ensuring real-time communication and threat management.

Employee Training: Train employees on a regular basis to identify and address cloud-specific threats. Human error remains the leading cause of data breaches.

Best Practices for Ensuring Continuous Security

Maintaining cloud security takes constant attention to detail and the flexibility to keep up with new threats. Important procedures include frequent patching and updates to guard against the most recent vulnerabilities, ongoing alert monitoring to spot and handle suspicious activity immediately, and encrypting sensitive data while it is in transit and at rest for an extra layer of security. A well-documented incident response plan that sets out exactly how your team should react ensures your systems and data are protected effectively in the event of a security breach.
Future Trends in Cloud Security

The nature of cloud security is ever-changing, with new threats appearing as quickly as new defences. Monitoring upcoming developments and trends is essential to staying ahead.

Emerging Technologies in Cybersecurity

The cyber security landscape is always changing, with new technologies offering cloud security both opportunities and challenges. Blockchain, a decentralised ledger, is being investigated as a way to improve identity management and data integrity by providing a tamper-resistant approach to information management.

Artificial Intelligence's Potential to Improve Security

AI is an increasingly important ally in cloud security initiatives. Its ability to analyse enormous volumes of data dynamically and in real time means threats can be identified and eliminated with unprecedented speed:

Predictive Threat Intelligence: AI can identify trends to anticipate possible dangers before they materialise, enabling preventative measures to be implemented.

Automated Response Systems: AI-driven automation can react quickly to suspected security breaches, reducing damage and downtime.

Preparing for Evolving Threats

The complexity and scope of cyber threats are constantly changing, so being prepared is essential to protecting your business:

Regular Security Audits: Conduct audits regularly to keep your security measures effective against new threats.

Threat Intelligence Sharing: Take part in threat intelligence sharing communities to keep up with the most recent cyberthreats.

Advanced Threat Analysis: Use advanced threat analytics to continuously strengthen your defences and adjust to emerging threat vectors.

Modern businesses must be vigilant, flexible and willing to invest in state-of-the-art technologies as they continue their journey towards secure cloud environments. In the ever-changing world of cloud security, you can make sure your company is thoroughly protected by selecting the best security solutions, integrating them smoothly into your current infrastructure and staying up to date with emerging trends. Modern businesses looking to strengthen cybersecurity defences and safeguard their data must embrace cloud security solutions. As we've seen, choosing the appropriate security tactics improves overall business resilience as well as protecting sensitive data. Make threat detection and ongoing monitoring your top priorities. Invest in strong access controls and encryption. Encourage staff members to adopt a security-conscious mindset. By adhering to these guidelines, businesses can navigate the constantly changing world of cyberthreats with assurance, staying safe, creative and ahead of possible dangers.

### Unlocking the future of manufacturing: AIOps network optimisation

Global manufacturing activity has failed to show signs of growth for much of 2023. S&P Global's gauge of worldwide manufacturing activity came in at 49.1 in September, below the 50 mark that separates expansion from contraction. China's Caixin/S&P Global manufacturing PMI also fell to 49.5 in October, down from 50.6 in September. As the sector looks to recover, rising bandwidth requirements, restricted access to applications and increased latency all pose hurdles to ongoing efficiency in manufacturing. Facilities have complex infrastructures, connecting multitudes of platforms and devices. Legacy hardware and software implemented by different integrators create a challenging ecosystem.
The integration of Internet of Things (IoT) devices and other smart technologies has made manufacturing networks more complex and, consequently, more vulnerable to disruptions. In recent research commissioned by OpenGear, "Measuring the True Cost of Network Outages", nearly one in three (31%) senior IT decision-makers globally said network outages had cost their business over $1.2 million over the past 12 months, and around one in six (17%) said outages had cost them $6 million or more.

However, the impact on manufacturers from network downtime extends beyond direct financial costs. Efficiency in manufacturing is not just about maintaining a steady production rate; it's also about the ability to quickly adapt and respond to changing market demands. Network issues can hinder this adaptability, causing delays in communication, data transfer, and decision-making processes. In an industry where time is money, these delays can be particularly detrimental.

In summary, the challenges of network downtime and efficiency in manufacturing are multifaceted, impacting not only the immediate production processes but also the broader operational dynamics of manufacturing businesses. Addressing these challenges is crucial for maintaining competitiveness in a rapidly evolving industrial landscape. To do this effectively, manufacturing organisations need a platform that ensures always-on remote access. Continuous connectivity is critical for supply chain visibility, IoT data collection and asset management.

Synergising AI and Smart OOB for manufacturing excellence

The combination of artificial intelligence (AI) and Smart Out-of-Band (OOB) management has real near-term potential to revolutionise the manufacturing industry. By synergising these technologies, manufacturers can significantly enhance their operational efficiency and reduce network downtime, both of which are critical factors for maintaining a competitive edge.

The past year has witnessed a remarkable surge in AI growth, and this momentum is expected to accelerate, bringing a significant impact on IT operational efficiencies. As a result, the new category of AIOps has recently emerged. So, what is AIOps? Gartner defines AIOps as the marriage of Big Data with machine learning to create predictive outcomes that drive faster root-cause analysis and accelerate mean time to repair (MTTR). Add Smart OOB to AIOps and you have the most powerful combination of tools available for delivering true network resilience.

AI plays a pivotal role in predictive maintenance, a technique that uses device visibility and data analytics to foresee equipment failures before they occur. This proactive approach can significantly reduce unplanned downtime in manufacturing, with Deloitte suggesting it can cut equipment downtime by up to 50%. But using AI for predictive maintenance is just the start. Extrapolate the full use of AIOps and it can provide better real-time operational analysis, which in turn can drive better network capacity planning, proactively improve network optimisation and ultimately raise the quality of experience for all network users. Network 'uptime' is no longer the only measure; the next evolution will focus on providing a seamless and consistent 'user experience' - to any device, to any application and from any location.
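As a minimal illustration of the predictive-maintenance idea, the sketch below applies a simple rolling z-score to a single simulated sensor stream and flags readings that drift well outside recent behaviour. Real AIOps platforms use far richer models and telemetry; the window size and threshold here are assumed, illustrative values.

```python
import statistics

def flag_anomalies(readings, window=20, threshold=3.0):
    """Flag sensor readings that drift well outside recent behaviour."""
    alerts = []
    for i in range(window, len(readings)):
        recent = readings[i - window:i]
        mean, stdev = statistics.mean(recent), statistics.stdev(recent)
        if stdev and abs(readings[i] - mean) > threshold * stdev:
            alerts.append((i, readings[i]))   # candidate early-warning signal
    return alerts

# Simulated vibration data: stable behaviour, then a developing fault
normal = [1.0 + 0.02 * ((i * 7) % 5) for i in range(60)]
faulty = [1.6, 1.8, 2.1]
print(flag_anomalies(normal + faulty))
```

The same pattern, learning what 'normal' looks like and alerting on deviations, underpins both predictive maintenance on machines and AIOps-style anomaly detection on network telemetry.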
One issue that does need to be addressed is that many people have only limited trust in AI and its potential impact on their lives, driven in part by a lack of understanding. Like all transformational technologies, AI can be used for good or ill. In the context of the networking environment, AIOps has huge potential for delivering good by providing intelligent, actionable insights that drive a higher level of automation, collaboration, and user experience.

The skills dimension

At the same time, AIOps provides a way of both addressing the current skills shortage in manufacturing and complementing the day-to-day activity carried out by the existing workforce. Far from signalling job elimination, AI streamlines server monitoring and resource allocation, allowing human operators to focus on the strategic facets of the manufacturing process. This transformative power opens avenues for professionals across the sector to evolve and adapt their skill sets. While technology advances, the human-centric approach to innovation remains crucial. As the next generation of tech-savvy employees enters the workforce, businesses must ensure their technology deployments keep pace with the latest developments, especially amid a growing skills shortage.

The rise of Smart Out-of-Band in tandem with AI

Smart Out-of-Band (OOB) management complements AIOps by providing a reliable, dedicated 'always on' backup communication channel for network devices. In the event of a primary network failure, Smart OOB allows for remote troubleshooting and recovery, ensuring that manufacturing processes remain uninterrupted. This capability is crucial in a landscape where even a few minutes of downtime can result in substantial financial losses.

Furthermore, the combination of AI and Smart OOB enables more efficient resource allocation and better decision-making. AI algorithms can analyse vast amounts of data from network devices, providing insights that can be used to optimise production processes and resource utilisation. Meanwhile, Smart OOB ensures that these AI-enabled systems remain connected and functional at all times, even during network disruptions.

In essence, the synergy between AIOps and Smart OOB in manufacturing leads to a more resilient, efficient, automated and adaptable production environment. This integration not only addresses the immediate challenges of network downtime and efficiency, but also paves the way for future innovations in the manufacturing sector and beyond.

Empowering network teams for the future

AI's integration into manufacturing operations is transforming roles and skills, paving the way for a future where technology and human expertise coalesce, offering unprecedented possibilities. Embracing this evolution presents an opportunity to redefine network management in the digital age. With AI's increasingly transformative role in managing and monitoring networks, the broader industry stands at the cusp of a revolution, intertwining innovation and resilience with human expertise to unlock vast possibilities.

### Why is the hybrid cloud the future of computing?

Have you ever wondered how companies can benefit from the advantages of public and private cloud at the same time? Would you like to know why more and more organisations are opting for hybrid solutions? In this article, we take a closer look at the hybrid cloud and its growing importance in the IT world.

What is the hybrid cloud?

A hybrid cloud is a combination of a private cloud (usually located on a company's premises) and a public cloud (offered by third-party providers).
Such a solution allows flexible management of IT resources and makes it possible to move data and applications between different environments as required. According to research, more than 80% of companies are already using some form of hybrid cloud!

Flexibility and scalability

One of the main advantages of the hybrid cloud is its flexibility. You can quickly increase or decrease the resources you use depending on your current business needs. For example, during periods of increased traffic to your website, you can use the additional computing power of the public cloud. Conversely, you can store sensitive data in a more secure private cloud environment.

Cost optimisation

The hybrid cloud allows you to significantly optimise your IT costs. You don't need to invest in expensive on-site infrastructure to handle peak loads. Instead, you can use public cloud resources only when they are actually needed. This makes the hybrid cloud particularly attractive for small and medium-sized enterprises that want to reduce their IT expenditure.

Security and Compliance

One of the key aspects of the hybrid cloud is enhanced security and compliance. Many companies have to comply with stringent data protection requirements, especially in industries such as finance or healthcare. The hybrid cloud allows sensitive data to be stored in a private part of the infrastructure, providing full control over it. At the same time, less critical data and applications can be moved to the public cloud, optimising costs. According to research, up to 90% of companies using the hybrid cloud believe it has improved their ability to meet regulatory requirements.

Innovation and faster deployment of new solutions

The hybrid cloud creates an excellent environment for innovation and rapid deployment of new solutions. Thanks to the flexibility of this model, you can easily test new ideas and projects in a public cloud environment and then move them to the private cloud when they are ready for production. This significantly speeds up the product and service development cycle, giving your business a competitive advantage.

Better workload management

The hybrid cloud enables you to manage the load on your IT infrastructure effectively. During periods of increased demand for computing power, such as seasonal e-commerce sales, you can easily move some workloads to the public cloud. When demand drops, resources can be automatically released, allowing you to optimise costs. The hybrid cloud offers companies flexibility, scalability, and cost efficiency while providing control over data and compliance.

Scalability During Peak Demand

For businesses, particularly those in e-commerce or seasonal industries, the hybrid cloud shines during periods of increased activity, such as Black Friday, holiday sales, or promotional campaigns. Instead of over-provisioning on private infrastructure, which can be costly and underutilised during off-peak times, workloads can be seamlessly transferred to the public cloud. This ensures uninterrupted performance, improved customer experience, and the ability to handle surges in traffic without investing in additional on-premises resources.

Automatic Resource Optimisation

One of the key advantages of the hybrid cloud is its ability to dynamically allocate resources based on real-time demand. When computing power requirements decrease, as often happens after seasonal peaks or during quieter periods, the system automatically releases unused public cloud resources. This prevents unnecessary expenditure on idle resources, ensuring cost efficiency without sacrificing operational capability.
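A heavily simplified sketch of that burst-and-release logic is shown below. The capacities, thresholds and scaling decisions are all assumptions for illustration; a real deployment would lean on the autoscaling service of whichever public cloud is in use rather than hand-rolled code.

```python
PRIVATE_CAPACITY = 100          # requests/sec the private cloud can absorb
SCALE_UP_AT = 0.8               # burst to public cloud above 80% utilisation
SCALE_DOWN_AT = 0.5             # release public instances below 50%

def plan_capacity(load_rps: float, public_instances: int) -> int:
    """Return the desired number of public cloud instances for this load."""
    utilisation = load_rps / PRIVATE_CAPACITY
    if utilisation > SCALE_UP_AT:
        public_instances += 1            # burst: add a public instance
    elif utilisation < SCALE_DOWN_AT and public_instances > 0:
        public_instances -= 1            # quiet period: release one
    return public_instances

instances = 0
for load in [40, 70, 95, 120, 110, 60, 30]:   # simulated demand over time
    instances = plan_capacity(load, instances)
    print(f"load={load:>3} rps -> public instances: {instances}")
```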
Enhanced Flexibility and Control

The hybrid model also offers organisations greater control over sensitive data and critical workloads. While day-to-day or less critical operations can be offloaded to the public cloud, sensitive information or core processes can remain securely housed in the private cloud. This ensures compliance with data protection regulations and mitigates the risks associated with data breaches.

Practical Use Cases

E-commerce: During flash sales or seasonal events, websites need to manage high traffic volumes. The hybrid cloud accommodates these surges without service degradation.

Media and Entertainment: Streaming platforms can handle spikes in viewer numbers during major events, such as sports finals or new releases.

Education: Institutions offering online learning can scale resources during enrolment periods or exam seasons.

Optimising IT Costs

The ability to use the public cloud for temporary workload increases means businesses only pay for the extra resources they need when they need them. This pay-as-you-go model ensures that infrastructure investments align closely with business demands, making IT budgets more predictable and manageable.

Conclusion

By leveraging the hybrid cloud, businesses gain a strategic advantage in workload management, achieving a balance between cost-efficiency, scalability, and security. This flexibility ensures they can stay agile in a fast-changing digital landscape, meeting customer expectations without overburdening their infrastructure or resources.

### 6 Basic Things to Know About IP Addresses

IP addresses are like real addresses but for the internet. Well, they are more like phone numbers, but you get the idea. Anytime you connect to the internet, your internet service provider assigns you an IP address. Similarly, every other entity on the internet has an IP address, including website servers, other clients, and even IoT devices. When you navigate to a website in a browser, your computer is navigating to the IP address of the server the website is hosted on. Similarly, the server uses your address to send you the website data so it can load. In this article, we will cover six basic things you should know about IP addresses.

Public and Private IP Addresses

You encounter two sorts of addresses in your day-to-day life. The first is the public IP address, and the second is the private IP address. So, what's the difference? Well, everybody who connects to the Internet uses a router/modem. This router is provided by the ISP, or you can buy it yourself. The router is known as the internet access point. The access point has its own IP address (assigned by your ISP), which is known as the public IP address. All the online tools for checking your IP show your public IP.

A private IP address belongs to each device that connects to the access point. So, if you have a computer and a smartphone connected to your router, they will have different addresses that distinguish them. Private IP addresses are typically hidden from the Internet. When a server sends you information to load a website, it uses the public IP address of your access point. Your access point (router) receives the information and forwards it to the correct private address that created the original request to the server. This segregation of public and private addresses is necessary for security.
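A small, illustrative Python sketch makes the distinction concrete: the UDP trick below discovers only the address your machine uses on the local network, while the external echo service reports the address the wider internet sees (api.ipify.org is just one of several public services you could substitute).

```python
import socket
import urllib.request

# Private address: open a UDP socket towards a public IP and read back
# which local address the operating system chose (no traffic is sent).
with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as s:
    s.connect(("8.8.8.8", 80))
    private_ip = s.getsockname()[0]

# Public address: ask an external echo service what address it sees,
# which is the router/access point's address, not the device's.
public_ip = urllib.request.urlopen("https://api.ipify.org").read().decode()

print("private (LAN) address:", private_ip)
print("public (internet) address:", public_ip)
```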
IP addresses can be used for personal identification, and as such, they can be used for doxing. The way public and private addresses are assigned and hidden helps alleviate that issue; we will learn about that in more detail in a later section.

IPv6 and IPv4

IP addresses come in two versions: version 4 and version 6, known as IPv4 and IPv6, respectively. IPv4 addresses consist of 4 octets. They typically look like this: 255.168.45.6. Each octet can range from 0 to 255, for a total of 256 values per octet. This means the maximum possible address space in IPv4 is 4,294,967,296 addresses. However, this address space has run out; there are simply too many devices connecting to the Internet.

To deal with this, IPv6 was introduced. An IPv6 address uses 16 octets (128 bits) instead of 4, and it is written in hexadecimal notation rather than just decimal numbers. Here's what a typical IPv6 address looks like: FE38:DCE3:124C:C1A2:BA03:6745:EF1C:683D. Owing to the much larger address size, the IPv6 address space is enormous - roughly 340 trillion trillion trillion addresses. With IPv6, the address space problem has been resolved. IPv4 addresses are still far more common today, but as companies roll out new internet infrastructure, IPv6 is gaining more and more traction.
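Those address-space figures are easy to verify with Python's standard ipaddress module; the example addresses below are simply the illustrative ones quoted above.

```python
import ipaddress

# Total number of possible addresses in each version
print(f"IPv4: 2**32  = {2**32:,}")      # 4,294,967,296
print(f"IPv6: 2**128 = {2**128:.3e}")   # ~3.4e38, i.e. 340 trillion trillion trillion

# The standard library parses and classifies both versions
v4 = ipaddress.ip_address("255.168.45.6")
v6 = ipaddress.ip_address("FE38:DCE3:124C:C1A2:BA03:6745:EF1C:683D")
print(v4.version, v6.version)           # 4 6

# Private (RFC 1918) ranges are recognised too
print(ipaddress.ip_address("192.168.1.10").is_private)   # True
```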
Static IP Address

So, remember public IP addresses? The ones assigned by your ISP. They are changeable, and they change quite a lot, but most people don't realise this because nothing changes on the client side. Servers and website hosts, however, don't work this way; their addresses remain the same. An address that does not change is called a static IP address. Servers require these, as that is what enables client computers to find them. Without them, servers would be difficult to find because no client would know where they are.

To understand this concept, we must discuss routing tables. The Internet is an amalgam of routers, switches, servers, and edge clients. Routers and some servers maintain routing tables. These tables contain information on the IP addresses of every router and server directly connected to a particular router. When a client requests a specific IP address, the routers ask each other about it. Once the router with the IP address is reached, it sends the information back, and the client can find the server. Since servers have static IPs, their information is stored in a routing table and stays there indefinitely. If their IPs changed, routing tables would need to be updated constantly, and that is neither easy nor quick, so until the update was done, the server would be unreachable.

Dynamic IP Address

Dynamic addresses are the opposite of static addresses. They are usually assigned to edge clients, including your access point, i.e., your public IP address. This also acts as a security feature that makes it harder for people to be doxxed. Every few days, or every time you power cycle your router, your ISP assigns you a new public address. Remember that private addresses, whether or not they change, stay hidden from the internet.

How to Find Your IP Address

Let's learn how you can find your IP address, both private and public. There are several ways to find your public IP address; the easiest method is given below. Open a browser. In the search bar, type "What is my IP address." Open one of the search results from the top 10 entries. Your public address should be listed in a conspicuous place. This method works regardless of the operating system and device; it can be used on phones, laptops, desktops, and tablets.

More specific methods for doing an IP address lookup are listed below.

For Windows: Open the command prompt. Type "ipconfig /all" and hit Enter. A slew of information will be given. Under the Ethernet or Wireless adapter heading, find the IPv4/IPv6 address. That is the address assigned to your device on the local network - usually your private IP, while your public IP is the one shown by the online tools mentioned above.

For Linux: Open the terminal. Type "ifconfig" (or "ip addr" on newer distributions) and press Enter. Look for "inet" under the network interface (usually eth0 or wlan0).

For Mac: Open the Terminal (Finder > Applications > Utilities > Terminal). Type "ifconfig" and press Enter. Look for "inet" under the network interface (usually en0 or en1).

For Android: Go to Settings. Tap on Network & Internet or Connections. Tap on Wi-Fi (ensure Wi-Fi is on and connected). Tap on the connected network's name. Your address will be displayed.

For iOS: Go to Settings. Tap on Wi-Fi. Tap on the i icon next to the connected network. Your address will be displayed under the IP Address section.

Some of these instructions may involve more steps depending on the version of your operating system. So, if they don't work, just use an online lookup tool to find your public IP.

How to Hide Your IP Address

IP addresses can be used to identify people to a surprising degree. With an IP location finder, one can easily work out the approximate location of the user. Websites therefore use information associated with an IP address for marketing purposes. Many people believe that is an invasion of privacy, so they wish to hide it. There are several ways to do that, so let's take a look.

With a VPN: A VPN essentially provides your client with a borrowed IP address from a server in a different location than yours. Anyone who tries to find your address just finds the borrowed IP, and your real one remains secret.

Proxy Server: A proxy server is similar to a VPN, but the difference is that it doesn't tunnel your information through a secure channel. It hides your IP the same way, though: your real address gets hidden behind the proxy server's.

Public WiFi: Since all access points have a unique public IP, you can use public WiFi away from your home to hide your personal access point's public IP. This is a very contrived solution and the least convenient.

There is also the Tor browser, but it is mainly used by network experts. For most people, a VPN is the best solution, as it is the most secure and hassle-free option.

Conclusion

So, there you have six basic things you should know about IP addresses. While this information does not change much for the average user (except for the hiding part), it is still good to know more about the technology that one uses daily. Hopefully, this has sparked some interest in the world of networking.

### Leveraging AI for Business Transformation and Market Adaptability

Artificial Intelligence is changing the game for businesses everywhere. It's not just about making things run smoother; it's about creating better customer experiences and adapting faster to market shifts. In today's world, AI is key to staying ahead and thriving. With predictive analytics shaping strategies and automation taking over routine tasks, teams have more time to focus on growth and fresh ideas. From spotting trends to meeting customer demands, companies are using AI to gain an edge. By boosting both customer engagement and internal efficiency, AI helps drive success now and in the future. It's not just about keeping up - it's about setting the pace and leading the way in your industry.
Understanding AI and Its Impact on Businesses Artificial Intelligence (AI) is not only transforming business operations but is also revolutionising industries with its adaptability. It's essential to grasp how AI functions and its tangible effects on various sectors. Defining Artificial Intelligence Think of AI as technology that enables machines to simulate human intelligence. AI systems can perform tasks like recognising speech, making decisions, and translating languages. AI encompasses subfields such as machine learning, which teaches computers to learn from data, and natural language processing, which helps machines understand human language. These capabilities allow businesses to automate processes, making operations more efficient. AI can be the game-changer in optimising workflows, reducing human error, and providing timely insights. AI in the Modern Business Landscape In today's business environment, AI serves as a catalyst for innovation. From retail to healthcare, companies deploy AI to enhance customer experiences and streamline operations. AI tools can analyse consumer data to offer personalised recommendations, increasing customer satisfaction. In finance, AI helps in fraud detection by analysing patterns that could indicate fraudulent activities. Manufacturing utilises AI for predictive maintenance, reducing downtime and improving productivity. The integration of AI into business models continuously evolves, presenting new opportunities for growth and efficiency. Examples of AI Transforming Industries AI has a profound influence on various industries. In retail, AI-driven chatbots provide immediate customer assistance, influencing purchasing decisions. Meanwhile, in healthcare, AI algorithms analyse medical images faster than human radiologists, aiding in quicker diagnosis. In the agricultural sector, AI-powered drones help monitor crop health, optimising farming practices and increasing yields. These examples highlight AI's potential to create value, solve complex problems, and drive success in different fields. Each industry experiences unique transformations, showcasing AI's versatility and potential to innovate. Strategies for AI Integration and Market Responsiveness Successfully integrating AI in your business can transform operations and enhance responsiveness to market demands. Key strategies involve assessing your organisation's readiness, fostering a culture centred around data, and enabling agile business processes. Assessing Organisational Readiness for AI Before adopting AI, assess your organisation’s readiness. Check if your tech infrastructure can support AI and identify any skill gaps that may hinder deployment. Also, evaluate the organisational culture for openness to new technologies, as resistance can derail efforts. Create a roadmap that prioritises AI projects aligned with strategic goals for smooth implementation. Building a Data-Driven Culture Cultivating a data-driven culture is essential for AI success. Start by emphasising the value of data in decision-making processes. Encourage staff at all levels to rely on data insights rather than intuition alone. Invest in data management tools that improve data accessibility and quality. Providing ongoing training can enhance data literacy across your organisation. Highlight achievements that result from data-driven strategies to build enthusiasm and demonstrate tangible benefits. Promote collaboration between departments to share data insights. 
This cross-functional approach can unlock innovative solutions and foster a deeper understanding of market trends. Creating Agile Business Processes Agility in business processes allows you to respond swiftly to market shifts. Streamline operations by identifying bottlenecks and inefficiencies that hinder adaptability. Implement AI tools that automate routine tasks, freeing up resources for strategic activities. Prioritise a flexible mindset within your teams. Encourage experimentation with quick iterations and feedback loops to refine AI applications. This progressive approach enables you to troubleshoot promptly and scale successful initiatives. Regularly update your strategies to reflect the dynamic market landscape. Continual assessment ensures your processes remain relevant and competitive. Empower your workforce with the tools and autonomy needed to adapt swiftly, ensuring that AI works effectively within your agile framework. Overcoming Challenges with AI Adoption Adopting AI in business presents unique challenges, including ethical concerns, maintaining quality and accountability, and managing change. Understanding these key areas can help businesses effectively leverage AI for growth and adaptability. Addressing Ethical and Privacy Concerns Ethical and privacy concerns are at the forefront of AI adoption. Ensuring data privacy involves securing user data and obtaining proper consent. Companies must implement transparent policies to build trust with stakeholders. It's crucial to create AI systems that are unbiased and fair. This includes regularly updating AI models and maintaining diverse data sets to prevent discrimination. Engaging with ethical boards or committees can guide decision-making and reinforce public confidence. Ensuring Quality and Accountability in AI Systems Quality and accountability in AI systems are vital for maintaining reliability. Businesses must establish clear standards and criteria to evaluate AI performance. Continual monitoring and validation are necessary for improving AI systems and avoiding errors. For further insights on maximising AI's impact, refer to Protex AI’s guide on driving ROI with AI, which outlines strategies to boost efficiency and profitability. A structured process is beneficial. Defining roles and responsibilities helps maintain accountability within teams. Creating a culture of transparency encourages team members to voice concerns and take corrective measures when necessary. Managing Change in the AI Era Managing change effectively is essential for a successful AI transition. You need to foster a culture that embraces innovation and adapts quickly to new technologies. This could involve providing comprehensive training to employees to boost their confidence in using AI systems. Open communication is key. Address employee concerns early, demonstrating how AI can complement rather than replace their roles. Encouraging feedback channels ensures that employees feel heard, easing the transition. Measuring Success and Continual Learning To make AI work for your business, tracking progress and staying up-to-date are essential. You need defined metrics, an open-minded culture, and a strategy for adapting to new technologies. These elements ensure AI is not just implemented but thrives, generating tangible results. Establishing AI KPIs and Metrics Knowing what you want from AI is crucial. Key Performance Indicators (KPIs) and metrics act as a map, showing you if your AI is delivering or needs adjustment. 
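As a rough, hypothetical illustration of how metrics along these lines might be tracked, the short Python sketch below computes a precision figure, an average time saving and a simple satisfaction score from toy data; the field names, sample values and thresholds are assumptions for demonstration, not a prescribed implementation.

```python
# Illustrative only: toy records standing in for logged AI task outcomes.
ai_tasks = [
    {"predicted_positive": True,  "actually_positive": True,  "seconds_saved": 40, "user_rating": 5},
    {"predicted_positive": True,  "actually_positive": False, "seconds_saved": 15, "user_rating": 3},
    {"predicted_positive": False, "actually_positive": False, "seconds_saved": 25, "user_rating": 4},
    {"predicted_positive": True,  "actually_positive": True,  "seconds_saved": 60, "user_rating": 5},
]

def precision(tasks):
    """Share of the AI's positive calls that were actually correct."""
    flagged = [t for t in tasks if t["predicted_positive"]]
    if not flagged:
        return 0.0
    return sum(t["actually_positive"] for t in flagged) / len(flagged)

def avg_time_saved(tasks):
    """Average processing time saved per task, in seconds."""
    return sum(t["seconds_saved"] for t in tasks) / len(tasks)

def avg_satisfaction(tasks):
    """Mean user rating on a 1-5 scale, as a rough engagement proxy."""
    return sum(t["user_rating"] for t in tasks) / len(tasks)

print(f"Precision:        {precision(ai_tasks):.0%}")
print(f"Avg time saved:   {avg_time_saved(ai_tasks):.0f}s per task")
print(f"Avg satisfaction: {avg_satisfaction(ai_tasks):.1f}/5")
```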
Focus on precision, time efficiency, and user engagement. For instance: Precision: Measure how accurately AI completes tasks. Time Efficiency: Track savings in processing times. User Engagement: Monitor user interaction and satisfaction. By setting these measurable goals, you'll gain insight into AI's impact on your bottom line, promoting more informed decision-making. Fostering an Environment of Continuous Improvement AI adoption requires a culture of learning and adaptability. Encourage feedback and collaboration within your team. This open culture fuels innovation and streamlines the AI learning curve. Valuable questions to consider include: Are employees trained to use AI effectively? Are there regular sessions to update skills and strategies? Incorporate workshops and continuous training programs. When everyone participates in improvement, your AI solutions become more robust, aligning tightly with business objectives. Keeping Pace with Evolving AI Technologies Staying current with AI technology positions your business strategically in the market. Keep an eye on industry trends, participating in seminars and networking events. Ask these questions: What are the latest AI advancements? How can emerging technologies integrate with your current AI systems? Utilise research and development partnerships to harness new AI tools effectively. Staying informed fosters resilience and adaptability, enabling your business to seize new opportunities quickly and efficiently. Conclusion Bringing AI into your business isn’t just about staying in the game—it’s about taking the lead. Whether it’s making processes smoother or helping you respond faster to market changes, AI can truly drive growth and spark innovation. Of course, there are challenges like ethics, data quality, and managing change, but with a thoughtful strategy and a focus on a data-driven culture, these hurdles can be overcome. By keeping an eye on AI’s impact and adapting as you go, your business will not only stay competitive but be ready to seize new opportunities as they come. ### How AI is Transforming Customer Communication Management Business communication has evolved over the years. Today, it's not just about sending generic emails or newsletters. Customers expect brands to build and cultivate relationships in a way that feels personal, relevant, and timely. In view of this, businesses have adapted their customer communications by incorporating Customer Communication Management (CCM) solutions that make them stand out in the varied market spaces. Beyond that, Artificial Intelligence (AI) has come in to push customer communication to a new level by integrating with CCM solutions to make communication smarter, faster, and more customer-centric than ever before.  In this guide, we'll look at how AI is revolutionising customer communication. Hyper-Personalisation Like Never Before Personalisation is no longer just about addressing a customer by their name. Today, you can tailor every message or offer to an individual's preferences and needs. This is possible through analysing huge amounts of customer data to understand customer behaviours and previous purchases. Every time they interact with the business, this data builds up and contributes to the communication.  This means that when your business communicates to a customer, it no longer feels robotic. It also doesn't just tell a customer that you know them but rather addresses key issues that they may be having. 
It then helps build a deeper connection between the business and the customer. Predictive Communication that Anticipates Customer Needs AI has improved customer communication as it can now predict their needs before they even reach out to the business. This is possible as there’s a lot of customer data it can analyse – past behaviour, purchasing patterns, and real-time interactions. That said, it isn't just about personalisation, as the business can also use the data it collects from other customers to know what a client might need. For example, a customer may receive an offer about an item that's often bought together with or after a specific product. It can also be a troubleshooting guide for common issues associated with a product. This approach helps boost customer satisfaction and enhance their loyalty to a brand.  Multi-Channel Consistency and Automation Customers today interact with brands across multiple channels – from email and social media to live chat and mobile apps. One of the main goals of CCM solutions is to ensure that communication is consistent regardless of the channel used. AI then helps take this further through automation. The system can understand the brand voice and previous interactions with the business and use this in automated messages. Since all data is synchronised, the next channel the customer uses to reach out to the business won't matter. This consistency helps reduce customer frustrations and makes them feel like they're having a one-on-one conversation with your brand. Faster, Smarter Response Times with AI-Driven Chatbots Most people today prefer to do things online as it's faster and more convenient. And in that case, they don't like to wait in line. AI-driven chatbots are helping solve this issue as they have information about business products and processes, as well as frequently asked questions. These aren't the simple automated replies we are mostly used to, as AI chatbots can understand natural language and learn from all past interactions – both human to human and human to AI. This means that when a customer only has simple queries, they won't need to connect to a human representative as the AI chatbot can handle both routine and complex queries. If a customer isn't satisfied, the chatbot can then connect them to a human representative. The reduced tasks mean that your agents can focus on more complex issues or customers who really need their assistance. Enhanced Decision-Making with AI Analytics When CCM solutions bring all customer data together, AI analytics can then come in to make sense of it. It can analyse data about customer complaints, responses in chats, and social media sentiment to understand patterns. It can then combine this with trends and predictions on customer needs. This type of analysis makes it easy for the management to make decisions that go beyond traditional metrics. They can tailor-make strategies about a product or the entire company based on the insights to improve decisions and customer satisfaction. The company will also have a better idea of what customers expect and use this information to notify them of upcoming improvements.  Real-Time Sentiment Analysis for Better Customer Insights It's crucial for any brand to understand how their customers feel about various products and services. AI offers sentiment analysis, which analyses data and gauges customer emotions in real time. This can be data collected through emails, social media posts, or chat conversations. 
It's then pooled together by the CCM and analysed by AI to help identify where a business can improve and address customer needs. Sentiment analysis can either be about the business in its entirety or a specific product. For example, if users post negative reviews about a new product, the customer service team can quickly jump in and engage customers before the relationship is broken. This shows that the business cares about the user and helps ensure all customers stay satisfied. ### Investment Opportunities for Startups and Technologies in AI  Artificial intelligence has developed from a niche technology into a global, automated powerhouse. Its ability to accelerate processes has positively impacted multiple industrial sectors. With the increasing implementation of AI and its proven advantages, it has emerged as a lucrative area of investment for corporates, angel investors, and venture capitalists. Identifying highly promising technologies and startups demands a strong understanding of trends, innovations, and the scope of future growth. Tracking High-potential start-ups AI startups often produce ingenious developments that challenge prevalent methods and create profitable market opportunities. For example, a startup with cutting-edge algorithms that improve precision or reduce computational costs can influence a wide range of industries. Looking for founders with strong AI backgrounds and proven success in research or industry applications is a sound basis for funding decisions. Moreover, investing in startups solving urgent problems or meeting growing market needs—like those in healthcare or finance—is another rewarding option. Similarly, focusing on companies with distinctive technologies, patents, or exclusive partnerships gives a competitive edge. Partnerships, customer growth, or endorsements from industry experts are also key indicators of upcoming accomplishments for these companies. Investment strategies and considerations Investing in a wide range of AI startups helps to manage risk and capture opportunities across various sectors. Venture capitalists should inspect each startup's technology thoroughly. They are also advised to check the market projections, financial stability, and competitive landscape of the respective companies. Moreover, expert consultations and technical assessments often help to gather valuable perspectives. Staying updated on upcoming trends, progressions in tech, and regulatory changes is also necessary. Engaging with startup founders and understanding their vision and strategies gives deeper insight into their likelihood of success. In addition, considering exit strategies such as growth prospects, acquisition opportunities, and IPO plans helps to anticipate the possible ROI. Major trends and prominent investment areas of AI technologies Technologies that generate content such as text, music, and images have become the centre of investment attention in the artificial intelligence market. Generative AI applications like MuseNet, Copy.ai, ChatGPT-4, Canva, and Jasper are attracting major investment due to their broad applicability. In healthcare, artificial intelligence has created numerous new opportunities in diagnostics, treatment recommendations, and drug discovery. Investing in startups producing AI-powered diagnostic tools, digital health platforms, or precision medicine solutions is another trend gaining popularity.
The autonomous vehicle sector has markedly grown, with funding flowing into self-driving tech, advanced driver assistance systems, and vehicle-to-everything communication. AI has also improved fraud detection, risk management, and customer experiences in finance. This has made fintech startups with AI-driven trading algorithms or robo-advisors attractive. Retail and e-commerce have benefited from personalised shopping experiences, more efficient supply chains, and advanced customer service. Cybersecurity startups focusing on threat detection and automated responses also offer significant investment opportunities. Leading companies to study in the sector before investing Goldman Sachs Economic Research estimates that global investment in AI technologies is expected to reach $200 billion by 2025. Tech giants accounted for two-thirds of the $27 billion raised by emerging AI companies in 2023. The five tech companies that have invested the most in AI startups in 2023 are Amazon, Google, Microsoft, NVIDIA, and Salesforce. Their commitments have been made either directly or through their respective venture investing arms (e.g., Google Ventures, Microsoft's M12, and Salesforce Ventures). Investing in AI-focused ETFs or mutual funds offers an easy way to gain exposure to a range of companies. Professional managers handle the research and selection, allowing investors to own a diversified portfolio of stocks through a single investment. One such large-cap fund includes 186 U.S. and global stocks aimed at disrupting various industries: with $3.4 billion in assets, XT focuses on AI's role in automation, analysis, and innovation across the tech, healthcare, industrial, and financial sectors. Tracking the BlueStar Quantum Computing and Machine Learning Index, QTUM covers 71 global stocks engaged in cutting-edge AI and machine learning technologies. ROBO Global Robotics & Automation Index ETF invests in companies specialising in robotics, automation, and AI, spanning growth and blend stocks of various market capitalisations. OpenAI obtained the highest investment among AI startup companies According to CB Insights and Statista, among the 100 most prominent start-ups, OpenAI secured the highest position with $14 billion in capital, raised through partnerships with Microsoft and other leading investors. Its valuation is estimated at around $80 billion. Anthropic is in second position with an investment of $4.2 billion, and Databricks ranks third with $4 billion. Decision-makers in companies like Amazon and Google have begun promoting AI less aggressively to their customers, to set better expectations of what can be accomplished while staying within budget. The crux The artificial intelligence sector offers numerous investment opportunities, driven by technological progress and demand across various industries. However, identifying highly competent startups requires a thorough understanding of emerging trends, technological differentiation, and market requirements. It is thus essential to evaluate factors such as innovation, team expertise, and regulatory changes. By following this guidance, investors can identify promising ventures and make the most of their funds.
Whatever your organisation calls its IT department, there's a well-worn trope that they're a different breed – so focused on their core tasks and interests that they don't always pull in the same direction as the rest of the workforce or even listen to what their taskmasters tell them. As someone who has led and formed part of tech teams my entire career, I agree we can be headstrong and sometimes anti-authority. But in my experience, this also often makes us game-changers – the ones who transform commercial performance and disrupt outdated business processes. As GenAI and Large Language Models (LLMs) usher in a new era of AI-enabled working, businesses can't afford to underestimate our importance. Research shows that companies are feeling the pressure to use these new, ever-evolving technologies but are panicking about how. A recent survey of 200 technology leaders we carried out with CTO Craft revealed that 47% felt negative about GenAI, and over half (55%) ranked their company's current level of AI expertise as 1-5 out of 10. Another survey shows the impact of AI on employees, with 77% of those surveyed saying AI tools had added to their workload. So, who do you turn to when you need to understand this emerging tech you're spending money on? It's your tech squad – and knowing how to collaborate with and manage them is vital. Businesses across all sectors are increasingly expected to be technology companies. What this means is that data analytics, GenAI, blockchain, the cloud, digital user experience and platformisation are now critical to boardroom decision-making. Having started as a technologist before moving into management, I've always been passionate about cultivating a culture where tech teams are empowered to do their best work and achieve collaborative outcomes. Companies need to adopt a different playbook if they are to empower tech teams to help drive a technology-first approach to business. With that in mind, here are four lessons I've learned from leading tech teams: AI is really quite stupid. Despite the show-stopping announcements around multimodal AI models which can see, hear and interact with people, GenAI, and LLMs in particular, are still no match for general-purpose human intelligence. Recent research tested seven well-known foundational models, including GPT-4, Claude, Mistral and Llama, with 30 questions designed to test their capabilities across areas such as linguistic understanding, relational thinking, spatial and mathematical reasoning and knowledge of basic scientific concepts. The highest score a foundational model achieved was 38%, and the lowest score was 16%, suggesting a considerable gap between foundational models' current capabilities and human cognitive abilities. The researchers who conducted the study warned that businesses embracing LLMs should be wary of trusting them with tasks involving high-stakes decision-making, nuanced reasoning, or subtle linguistic tools without a human in the loop. Even as AI becomes increasingly sophisticated and widespread, human skills - aka your tech team - will become more critical, not less, in order to oversee, train, and augment machine intelligence. Translating tech is a core skill. With executive teams struggling to keep up with the pace of technological change, people who really understand the tech and the job it needs to do, who can translate its benefits and limitations and manage expectations internally, will play a key role in organisations going forward.
These could be technologists with excellent communications skills or communications people who deeply understand tech. Either way, they must be objective, tech-savvy translators who can quantify the benefits and limitations, manage expectations in both camps and provide a blueprint for the entire organisation. Tech teams are already adept at this translating tech role: it's one that will only become more business-critical with the rise of AI-native business models. Tech teams don't follow orders - and that's good! One of the frustrations non-tech leaders have with IT teams is that they seem to have their own agenda and don't always follow orders. In my experience, this is a good thing. Tech teams work on detailed and complicated jobs, and because of this, they assess innovations against their own criteria of what will or won't work. This independent analysis is born of years of immersion in complex technology and is crucial for tech projects to be successful. Think of them as canaries in the coal mine – the first to spot potential disasters. With the Horizon Post Office scandal, for example, an empowered tech team could potentially have identified much more quickly than senior executives that the technology – not the sub-postmasters – was at fault. More generally, there needs to be a willingness to listen to tech teams in the first instance rather than consultants and vendors selling solutions, which are then handed to IT teams to implement. (This sometimes results in us having to graft the solutions onto legacy systems or replace them wholesale.) Tech teams excel at human-centred systems thinking. Tech teams are great at looking at complex problems holistically and mapping out processes and approaches that work well for humans as well as for computers and tech. With technology accelerating at breakneck speed, we know how to continuously adapt these processes to fit the emerging tech landscape. Tech ways of working such as agile, sprints, and retrospectives aim to blend people, technology, connectivity, time and place to deliver fast learning and decision-making in a way that positively impacts performance and outcomes. The emphasis we place on ensuring systems and processes are people-centred, such as creating a safe psychological space for feedback, is migrating out of the tech department and into wider organisational ways of doing things. Take the US, where the role of agile administrators is becoming increasingly important. Organisations such as the American Society of Administrative Professionals now offer courses on how to train administrators to embrace agile ways of working, including scrums and daily stand-ups. This human-centred, continuous-learning culture is essential for businesses as they seek to enhance operational efficiency, foster innovation, and improve employee and customer satisfaction within a context of ever-increasing tech developments. A collaborative, valued tech team is the solution. Good tech teams are often the guardians of a company's critical faculties when it comes to new and emerging technologies. It's no longer excusable to sideline tech teams because we don't respect the rule book or you don't understand what we're saying. Instead, reframe our impulse to question everything and understand everything as an essential skill in today's complex, ever-changing tech environment. Be grateful for your geeks – you're going to need them. ### A Business Continuity Cheat Sheet Right, let's be honest.
When you hear "business continuity," you probably picture a meteor strike, a rogue AI takeover, or maybe a particularly nasty flood. While those are all possibilities (and let's not rule out the zombie apocalypse just yet), the reality of business disruption is often far less dramatic, but no less devastating. Think rogue employees, a sneaky ransomware attack, or even just a good old-fashioned power cut. These everyday hiccups can cripple a business faster than a horde of slow-moving undead. This is not about stocking up on tinned beans and bottled water (although, a bit of emergency preparation never hurts). This is about ensuring your business can keep calm and carry on, no matter what curveballs life throws its way. And surprisingly, it is less about technology and more about good old-fashioned planning and preparation. Think about weaving resilience into the very fabric of the organisation, creating a culture that not only survives disruption, but thrives in its wake. Tech has got your back (but it's not the whole story) Technology plays a role, sure. Cloud backups, standby datacentres, redundant systems and robust cybersecurity are all vital pieces of the puzzle. But they are just that – pieces. They are tools in a larger toolkit, and without a skilled craftsman wielding them with precision and purpose, they are not going to build you a resilient business. The real magic happens when you combine these technological safeguards with a solid plan, a well-trained team, and a culture of proactivity and adaptability. “A fancy sports car is great, but without a skilled driver and a clear route, it's not going to get you very far.” Similarly, the most advanced technology in the world won't save you if you don't know how to use it effectively. Triggers: More than just the End of the World (and more common than you think) Forget Hollywood disaster scenarios for a moment. Business continuity is not just about surviving the apocalypse. It's about navigating the everyday bumps in the road, the little niggles that can quickly escalate into a full-blown crisis if left unchecked. Here are a few potential triggers that might not involve meteors or zombies, but are far more likely to seriously disrupt an organisation: Cybersecurity breaches: A ransomware attack can lock down your systems faster than you can say "password reset," holding your data hostage and crippling your operations. Phishing scams, malware, and denial-of-service attacks are all too common, and their impact can be devastating. Internal malicious actions: Disgruntled employees, intentional sabotage, or even accidental data breaches can cause significant damage, from data theft and financial loss to reputational damage and legal consequences. Supply chain disruptions: Remember the great toilet paper shortage? A disruption in your supply chain, whether due to natural disasters, political instability, or simply a key supplier going out of business, can bring your operations to a grinding halt. IT failures: Server crashes, software glitches, network outages, and even simple human error can all cause significant downtime, impacting productivity, customer service, and ultimately, your bottom line. Natural disasters (the less dramatic kind): A heavy snowfall, a burst water pipe, a local fire, or even a particularly aggressive flock of pigeons nesting in your air conditioning unit can disrupt your business and cause unexpected downtime. 
Reputational damage: A negative news story, a social media firestorm, or even a poorly handled customer complaint can quickly escalate and damage your reputation, impacting customer trust and ultimately, your profitability. The holy trinity: Planning, Execution, and Testing (rinse and repeat) The key to effective business continuity is a three-pronged approach, a continuous cycle of improvement and adaptation: Planning: This involves identifying potential risks, assessing their likelihood and potential impact, and developing detailed strategies to mitigate them. Think of it as your business's very own escape plan, a roadmap to navigate the treacherous terrain of disruption. This isn't a one-time exercise; it requires ongoing review and refinement as your business evolves and the threat landscape changes. Execution: This is where the rubber meets the road. When a disruption occurs, you need to be able to put your plan into action quickly and effectively. This requires clear communication, well-defined roles and responsibilities, and a streamlined process for activating your contingency plans. Testing: You wouldn't wait until a fire to test your fire alarm, would you? Regularly testing your business continuity plan is essential to ensure it's up to scratch, identify any gaps or weaknesses, and ensure your team is familiar with the procedures. This might involve tabletop exercises, simulations, or even full-scale drills. People power: Training, Empowerment, and Practice (your secret weapon) Your team is your greatest asset in a crisis. Investing in their training, empowering them to take initiative, and providing opportunities for practice is crucial. Regular drills and simulations can help them familiarise themselves with the plan, build confidence in their ability to respond effectively, and foster a sense of shared responsibility for business continuity. Don't go at it alone: Seek expert advice (a little help goes a long way) Let's face it. We see it every day. Most businesses don't have the in-house expertise to develop a comprehensive business continuity plan. That's where a trusted partner comes in. At Ultima, we unashamedly lean on over 35 years of experience to help you identify your specific risks, advise and develop a tailored plan that aligns with your business objectives and train your team to execute it flawlessly. We can also provide ongoing support and guidance to ensure your plan remains up-to-date and effective. In other words, we pride ourselves on being there to support, no matter what. Risk assessment: Know your enemy (forewarned is forearmed) A crucial part of business continuity planning is committing to and conducting a thorough risk assessment. This involves identifying potential threats, analysing their likelihood and potential impact, and quantifying the risk they pose to your business. This isn't just about ticking boxes; it's about gaining a deep understanding of your vulnerabilities and developing targeted strategies to mitigate them. Think of it as a pre-mortem for your business – a chance to identify potential weaknesses and address them before they become critical. Beyond the basics: Building a culture of resilience (the ultimate goal) Business continuity isn't just about tick boxes and dusty documents. It's about building a culture of resilience – a mindset that embraces change, encourages proactive problem-solving, and empowers your team to adapt and thrive in the face of adversity. 
This involves fostering open communication, encouraging innovation, celebrating successes (even small ones), and learning from failures. It's about creating an environment where agility and adaptability are not just buzzwords, but core values that permeate every level of your organisation. About Michael Minarik, Head of Business Consulting, Ultima Michael Minarik is a business consultant at Ultima, a leading AI-powered technology service provider. With over 20 years of experience in the IT industry, he has worked with global companies such as HPE, Allen & Overy, Orbit and Ultima. His expertise lies in business transformation, strategy and operations, with a focus on digital transformation, cybersecurity and service-driven architecture. Michael excels at leading strategy discussions with board-level decision-makers and translating vision into practical work programmes that yield sustainable results. He has managed and delivered complex projects involving both technology implementation and operational change. His key strength is a comprehensive understanding of IT, including end-user systems, data centres and cloud computing. This enables him to effectively align technical capabilities with business requirements, understand their interdependencies and identify necessary operational changes. He has successfully managed large-scale projects across private and public sectors, including data centre consolidations, cloud migrations and repatriations, as well as overhauls of business end-user estates. Michael's core competency, however, is in strategy and programme definition. He excels at understanding client requirements and mapping out detailed next steps for execution. This includes broader pieces of work associated with M&A. Working with cross-functional teams, Michael has supported accelerated business growth and change. He has demonstrated success in optimising IT budgets to deliver greater business benefits and return on investment. Michael's personal quote is: "It is not that I'm so smart. But I stay with the questions much longer." - Albert Einstein Hear more from Michael Minarik, Head of Business Consulting, at the UiQ LIVE Resilience Redefined: How to turn IT Challenges into Strategic Advantage on 29th November, 10 - 11.00am. Register now. ### Challenges of Cloud & Ultima's Solution to Transform Business With the way that AWS and Microsoft dominate technology conversations, particularly now around AI and Gen-AI, most business and IT decision makers would be forgiven for thinking that everyone is in the cloud. Whilst two-thirds of organisations have some sort of footprint in a public cloud solution, 55% of all businesses are still reliant on traditional on-premise systems[1], so it's really not quite as dominant as you might have been led to believe. You may in fact have noticed a change in how senior decision-making boards are talking about their cloud strategy. We've seen a shift from the early days of cloud strategies being a black-and-white 'all or nothing' approach, evolving into 'cloud first', and now something more resembling 'cloud where appropriate'. Organisations are no longer expecting to be in the cloud by default. In fact, the buzzword of a few years back, multi-cloud, is also proving to be somewhat less enthusiastically adopted than the market thought leaders may have predicted. According to Flexera's report[2], only 14% of organisations run in multiple hyperscale cloud platforms, with the dominant approach proving to be a hybrid one.
When you take these two points together (the lower-than-expected adoption of cloud and the dominance of hybrid cloud over multi- or even single-cloud), the logic and reason start to become somewhat clearer. Because the truth is that adopting cloud is not easy. It is fraught with challenges and disruption, and the two most commonly identified culprits are not going to blow you away; indeed, they are the same two problem children that plague technology generally. Problem Child One: Costs. Let's start by discussing the costs of the cloud. There are a lot of overwhelming stats out there illustrating this point - from Gartner's view that if you run a highly optimised cloud environment, you're still wasting 15% of your cost - through to 70% of companies not being sure exactly what they're spending on the cloud[3]. It is, of course, a recurring issue that the cloud has never fully achieved true cost transparency, and those of us who were in the industry as AWS and Azure rose up will remember not having any native tooling at all with which to review costs. Problem Child Two: Skills The skills gap, however, is a much trickier issue to resolve. The simple truth is that finding the right people with the right skills and experience to effectively, securely and capably manage cloud deployments and environments is really hard. Only 36% of organisations consider that they have the right amount of expertise[4] and, what's worse, almost as many consider their shortage to be significant. The sad thing is that this is not a new phenomenon, and it has only been getting worse. Harvard Business Review pointed out almost the exact same issue in 2021[5]. Worse, a lack of appropriate skills has now been highlighted as the leading cause of cloud wastage[6]. These two challenges have hindered the cloud from truly dominating technology in businesses for a while. Realistically it's going to get worse, as the drivers for cloud adoption are changing from enabling infrastructure to be scalable, flexible and more resilient to enabling customers to leverage their data for competitive advantage through AI, and more specifically Gen-AI. Whilst not trying to add to the fatigue in that particular discussion, it's really important to note that Gen-AI is bringing a new wave of adoption to Azure and AWS that's focused on your data platforms whilst leaving your infrastructure largely alone. This will bring challenges, because your Azure foundation and landing zone doesn't care if your workload is a custom-built application, a virtual desktop experience, or a hefty OpenAI deployment, but the skills you need to look after each of those workloads, and the landing zone itself, are very different. The truth is that these two issues are not going to go away any time soon. The onus is on expert technology partners and service-led providers to step up and help businesses make the most of their IT investments. So how can Ultima help? Ultima has always been known for a strong pedigree in infrastructure solutions. We invested early in developing cloud automation technologies under our Intelligent Automation team, and in 2021 we invested in additional cloud-native skills and a global 24x7 helpdesk, and acquired Just After Midnight. This acquisition added some of the best DevOps engineers and cloud architects in containers, cloud-native and web technologies to further strengthen our cloud capabilities and customer proposition.
Our Group Cloud Team now offers a full cloud services and solutions portfolio and access to the right expert skills. Front of mind is the need to support our customers with their two 'problem children': cost and the skills gap. To that end, our new combined cloud services focus on modularity to provide scalable services that support specific outcomes, any combination of which will solve your 'gaps' without encroaching on what you do, creating a tailored experience. In short, when you lack the skills, or the time, to activate plans, implement improvements, or cut waste, you now have access to an even broader range of experienced talent at Ultima, from cloud engineers, FinOps practitioners and break-fix support to Solution Architects, DevOps experts and managed services. Now you can rest assured that Ultima can meet you where you are, and help get you where you want to go. Dominic Melly, Group Head of Cloud, Ultima Dominic Melly is the Group Head of Cloud at Ultima, a leading IT services provider that delivers expert-led solutions for cloud, automation, security, and workspace. He has over a decade of experience in the IT industry, working with clients across various sectors and regions. He is passionate about helping clients and partners leverage the power of cloud to transform their businesses and achieve their objectives. As the Group Head of Cloud, Dominic Melly oversees the cloud strategy, delivery, and operations for Ultima's customers across the private and public sectors. He leads a team of cloud architects, engineers, and consultants who design, implement, and manage cloud solutions for clients and partners. He also works closely with cloud vendors, such as Microsoft Azure, AWS, and Google Cloud, to ensure Ultima stays ahead of the curve and delivers the best value and service to its customers. Dominic Melly believes that cloud is the key to digital transformation, as it enables businesses to be more agile, innovative, and efficient. He sees cloud as a catalyst for change, not only in terms of technology, but also in terms of culture, processes, and business transformation mindset. Dominic Melly is also a fan of clear communication and simplicity. He strives to demystify cloud and make it accessible and understandable for everyone, regardless of their technical background or level of expertise. He avoids jargon and complexity, and focuses on delivering practical and pragmatic solutions that meet the needs and expectations of his clients and partners. Join Dominic Melly at the Cloud Success webinar on 24th October, 10 – 11.00am. Register now or contact dominic.melly@ultima.com for more information. References: [1] https://www.cloudzero.com/blog/cloud-computing-statistics/ [2] https://info.flexera.com/CM-REPORT-State-of-the-Cloud-2024-Thanks [3] https://www.cloudzero.com/blog/cloud-computing-statistics/ [4] https://www.hashicorp.com/state-of-the-cloud#skills [5] https://www.splunk.com/en_us/pdfs/resources/analyst-report/hbr-the-state-of-cloud-driven-transformation-2021.pdf ### Data privacy concerns linger around LLM training We have all witnessed the accelerated capabilities of Large Language Models (LLMs) in recent years, with the scope of what's possible widening at every exciting turn. On a technical level, they are becoming more and more sophisticated and are gradually finding their way into public consciousness.
While most may not be able to explain exactly what an LLM is, they will often have come into contact with one; in August 2024, ChatGPT notched up more than 200 million weekly users. But do the wider public actually understand how their data might be used to train these LLMs? The vast majority of the data used are texts scraped from publicly available internet resources (e.g. the latest Common Crawl dataset, which contains data from more than three billion pages). This can include personal data, which can cause problems if inaccurate. Clearly, data protection should be the priority here, but implementing these controls has proven very challenging. In worst-case scenarios, LLMs may reveal sensitive information included in the datasets used for training, leading to worrying data breaches. Now, social media giants are scraping their own sites for training purposes; controversially, Meta has resumed this activity after previously bowing to public concerns on privacy. Similarly, LinkedIn began the same process unannounced earlier this month, before also suspending it. Whether these cases thrust the data privacy challenges of LLMs more into the public domain remains to be seen. But there's no doubt users should be educated and protected by robust regulation; balancing innovation with privacy is the billion-dollar challenge. Meta's next steps With Meta forging ahead with the training of its LLMs via user data, despite the public outcry, it appears to be an irreversible decision. Therefore, the focus must now switch to which actions the social media giant must undertake as the process continues. Unsurprisingly, transparent communication would be a good place to start, something users have arguably found lacking up to this point. After all, many users only found out about the training via a news story or word of mouth initially, not through direct communication from Meta. In recent weeks, Meta has been notifying users of the impending process, although the choice of opting out is not explicitly stated. Even then, the process is not a simple one; users must navigate multiple clicks and scrolls, while Meta also claims it is at its discretion whether it honours this wish anyway. While Meta would argue that the policy is now GDPR compliant, the treatment of its users has been poor overall. Straightforward opt-out mechanisms and greater transparency on how the data will be used would begin to repair those bridges a little. Regaining actual trust will be more challenging, as Meta will have to demonstrate it is adhering to regulation as it evolves. For many users, its current actions remain a little too murky. Protecting your digital footprint Even casual users of the internet will have left a considerable digital footprint without realising it. Conventional wisdom has often indicated that consumers are indifferent to protecting their privacy online, but this view is becoming outdated, especially with the rise of language models scraping data across the web. In 2023, the IAPP unveiled its first Privacy and Consumer Trust Report, which surveyed 4,750 individuals across 19 countries. It found 68% of consumers globally are either somewhat or very concerned about their privacy online, indicating that the value of online data may be becoming apparent. It would not be surprising to see this figure rise year on year, especially as awareness grows.
This is a positive development; personal data is a precious commodity and it should not be obtained without necessary protections and regulation, many of which have proven difficult to enforce. Therefore, it's essential for consumers to take steps, however limited, to protect their personal information if they do not wish it to be used. A robust understanding of a platform's data privacy policies is ultimately a good place to start. Balancing innovation and necessary protections As ever, regulators are walking a tightrope when it comes to legislation that offers necessary protections but does not stifle innovation more broadly. That is very much easier said than done. The EU AI Act is an example of a policy which took time to come to fruition and which, due to the fast pace of artificial intelligence, some have argued is already behind the times. This must be kept in mind alongside encouraging risk-taking in the space. It is no different when applied to data privacy; regulators understandably do not want to starve LLM developers of the tools they need, but must also be mindful of the rights of consumers. We can expect to see a similar tightrope walk as the UK Government's Data Protection and Digital Information Bill goes through the committee stage of approval. Time will tell whether the correct balance has been struck. What comes next? When it comes to the future of data privacy, it's very hard to predict. The rapid pace of AI means our understanding can change quickly, leaving previous explanations in the dust. And as AI becomes more advanced, the appetite for data will grow, despite some experts arguing that its capabilities will become more limited as it increasingly relies on similar AI-generated data. This may prompt developers like Meta to be more creative in the data they capture. But with consumer awareness growing, it feels like we are on something of a collision course between those obtaining data and those who own it. Regulators have a colossal part to play in clearing a sensible path for both parties, the sooner the better. ### Securing Benefits Administration to Protect Your Business Data Managing sensitive company information is a growing challenge. Multiple departments share the responsibility to protect your data, but most of it falls on the human resources teams who collect and use employee data for their jobs – particularly for benefits administration processes. Employee data is extremely sensitive and valuable, leaving your company with an inherent responsibility to take appropriate steps to shore up cybersecurity protocols. While many businesses know this, putting cybersecurity into practice can be easier said than done. In addition, not all businesses understand all of their points of vulnerability in their administration systems – or how to protect them effectively. If you want a secure company, you can't overlook your benefits administration processes. The Value of Employee Benefits Data Employees share a lot of sensitive data with their employers. This is especially true during open enrolment periods. While this is necessary for human resources to do their jobs, this information is extremely valuable to cybercriminals for several reasons: Personal Information of Employees The personal information of employees, also known as personally identifiable information (PII), is a desirable prize for cybercriminals. They can use the information to steal people's identities, gain access to their finances, or ransom the information for money.
PII may include names, addresses, phone numbers, bank account information, credit card information, and more. When it's in a cybercriminal's hands, they can wreak havoc on your employees' finances and wellbeing. Sensitive Financial Information The benefits enrolment process often includes collecting and storing sensitive financial data that's used for direct deposits. In some cases, credit card details may be used for premium payments, adding to the financial risks. If cybercriminals get access to an unsecured system, this financial data gives them everything they need to exploit your employees. Cybercriminals can use stolen bank account information to redirect payroll deposits or benefit payments to their own accounts. They also have details about your employees' finances, allowing them to file fraudulent tax returns, manipulate tax withholdings, and more. If they have credit card information, they can use it to open lines of credit or make purchases in your employees' names, damaging their credit and leaving them on the hook for the money. Access to Business Credentials The cybercrime risk around benefits administration isn't just about your employees. Criminals can gain a lot of valuable information about your employees, but there are also risks to your company. Cybercriminals can leverage the stolen information to launch phishing campaigns, impersonate employees, or disrupt business processes related to benefits enrolment. If a cybercriminal gains access to the systems that manage benefits information, they may have an opportunity to manipulate benefits selection, process fraudulent claims, or gather credentials to gain access to other parts of your system. As a result, you may face significant financial, operational, and reputational damage. Common Risks Associated with Benefits Administration Benefits administration processes can leave your system and employee data vulnerable to cybercriminals. Here are the common risks associated with benefits administration: Inadequate Security Protocols of Third-Party Vendors Often, benefits administration involves relationships with third-party vendors like insurance providers, payroll service providers, and benefits brokers. No matter how many cybersecurity solutions you have in place, you're still vulnerable if your third-party vendors aren't on the same page with protecting data. A cybercriminal can gain access to your system through your third-party vendors' vulnerabilities. If a vendor experiences a breach, it can ripple through all the vendor partners because these systems are interconnected, leaving a path of sensitive data that the criminal simply needs to follow. Poor Access Controls Employees can be an asset to cybersecurity, but they're also a potential point of vulnerability. It's common for companies to create self-service employee portals to streamline the benefits enrolment process or allow employees to access plan information. While this is helpful for employees, the credentials they need can be a point of entry for cybercriminals. It's common for people to rely on weak passwords that they can remember easily, which happen to be easy for cybercriminals to guess as well. If they use weak passwords and skip vital cybersecurity measures like multi-factor authentication, it's much easier for cybercriminals to exploit these vulnerabilities and access sensitive data. Legacy Software The benefits administration system may include legacy software and hardware, which is another common vulnerability that leaves your system exposed to cybercrime.
In many cases, businesses lack the resources or experience to identify outdated systems, leaving them at risk of a breach. Worse yet, if you skip regular patches and updates – many of which are important for cybersecurity – you're making it easier for cybercriminals to attack. If you delay updating some of your more critical systems to avoid downtime, you're putting your business in a dangerous situation that cybercriminals can exploit. Cybercriminals are sophisticated and know where to look for vulnerabilities, and outdated, unpatched systems give them an easy way to compromise an entire system. Employee and Administrator Errors Human resources teams can be the weakest link in the security chain. Phishing attacks, reusing passwords across systems, and choosing weak credentials can pose a real threat to your business. Employee training needs to be part of your cybersecurity protocol. It doesn't matter how sophisticated your cybersecurity efforts are if your employees are leaving you vulnerable to breaches caused by lack of awareness or human error. How to Keep Your Benefits Administration Processes More Secure No matter the size of your organisation, cybersecurity best practices can be used to reduce your risk and limit your vulnerabilities. Here are some steps you should take to protect your business: Conduct a Risk Assessment Risk assessments can help your business identify any potential threats or vulnerabilities within your benefits administration processes. Your risk assessment should be broad, including both internal risks and external threats from third-party vendor security practices. A thorough assessment is a crucial part of understanding your current risk profile and recognising areas that need to be prioritised in your cybersecurity plan. Encrypt All of Your Data Data encryption is a vital tool to protect sensitive digital information like employee benefits data. You should encrypt data at rest, or when it's stored, and in transit, or when it's being shared between systems. Covering both bases makes unauthorised access more difficult if a breach occurs. If a cybercriminal does manage to steal data, an effective data encryption strategy makes stolen data virtually useless to the criminal. This can go a long way toward limiting the damage they can do. Create an Incident Response Plan Data breaches happen to even the most stringent of companies. It's crucial for companies to have a well-defined incident response plan in place if a breach occurs. Your plan should outline the steps for investigating the incident, containing the breach, notifying all parties affected, and putting new strategies in place to prevent another breach in the future. Regularly Train Employees As a possible weak link in your business, employee awareness is a critical aspect of cybersecurity. Conduct regular cybersecurity training to teach your employees and human resources team how to identify phishing attempts and recognise suspicious activity. This will help them avoid breaches on their end and share some of the responsibility of protecting your company data. Work with an Experienced Solutions Provider Partnering with a reputable solutions provider gives you the expertise to implement effective and secure benefits technology solutions. Look for partners that specialise in benefits administration security. They can help you assess your vulnerabilities, create strong security measures, and assist with incident response if a breach occurs. Make Data Security a Priority Cybersecurity can't be an afterthought.
It should be one of the top priorities for your business and human resources team when it comes to benefits administration. Cybersecurity strategies and employee training can significantly reduce your cybersecurity risk and keep your sensitive business data secure. ### Which Cloud Type Suits You – Public, Private, Hybrid? Valuable lessons have been learnt about cloud deployments over the last decade. Most significantly, while both public and private options have their strengths, they have important differences that must be taken into consideration when assessing providers. To make the most beneficial investment, a decision maker must understand the pros and cons of each approach and determine what's right for their own specific infrastructure, applications, customers – and budget. However, with so many solutions on offer, it's a confusing marketplace to evaluate. Having an appreciation of why public and private clouds differ, and the key points to take into account when choosing between them, will help senior executives make an informed decision. Firstly, not all clouds are equal Public clouds are run by third-party vendors who maintain the supporting hardware, including servers, network, and storage. They deliver virtual infrastructures to multiple customers and each manages their own discrete virtual environment, sharing the underlying physical resources with other customers. But not all clouds are equal. Within this multi-tenant category, there are varying levels of service, performance, security, and pricing methods to choose from. By contrast, private clouds supplied by external providers do not share hardware or resources with other organisations. Each customer is a sole tenant, working independently. Separation from other systems makes a private cloud inherently more secure than a shared option. Robust authentication and security measures can also be put in place by the customer, minimising the risk of unauthorised access or data breaches. Therefore, it is particularly well-suited to handling sensitive and confidential information, such as personal data, medical records, intellectual property, and other critical data. Pros and cons of a private cloud On the positive side, private clouds can be engineered to dovetail precisely with customer needs, ensuring the most appropriate technology stack is deployed. Moreover, hardware and software can be configured for optimal performance and tweaked easily to meet changing demands. Greater control over technology decisions and capacity helps to ensure costs remain stable, in contrast to the fluctuating usage-based pricing typical of public clouds. This means accurate and reliable calculations can be extrapolated from a fixed infrastructure investment, giving more confidence in projected costings. Predictability is particularly useful for organisations needing a high level of financial stability when budgeting and planning for future business infrastructure requirements. For some, this is an advantage over a usage-based model, where costs can go up at short notice, leaving customers with escalating outgoings and tied in to a particular supplier. Flexibility is another benefit of private clouds. For example, supplementary managed services such as maintenance, monitoring, and troubleshooting can be brought in to reduce the administrative burden on internal staff.
Organisations can also continue to run vital legacy applications that wouldn’t easily migrate to public clouds, or set-up high-performance computing for specialised work such as technical modelling and simulations. For increasingly regulated industries, private clouds can help provide stringent adherence to compliance requirements. Having autonomy within a dedicated environment also facilitates the enforcement of appropriate security measures, access control, data protection, privacy, backup, and recovery. As such, private clouds can provide maximum control and tailored security, but on the downside, they can be expensive to establish and very difficult to scale. Why choose a public cloud Public clouds, on the other hand, usually offer cost-effective and responsive levels of scalability. Organisations can increase or decrease their demand for resources easily whenever the need for additional computing power or storage arises. Ideal for accommodating unforeseen circumstances, expansion, or consolidation. And, a good fit for web applications, APIs, content delivery networks, collaboration tools, big data and analytics. Also, user-friendly interfaces make it simple to spin up temporary environments for development, testing and prototyping. Service level agreements give added reassurance of reliability and performance, often guaranteeing 99.9% uptime for all instances. Cost can be a significant factor, making a public cloud an attractive option. Flexible subscription models with hour or monthly billing removes the financial burden of upfront investment. This stacks up well against hyperscalers, delivering up to 30% reduction in costs based on standard workload benchmarks, although actual savings depend on specific use cases. Public clouds by their nature are generic offerings, but this doesn’t mean they won’t provide adequate compliance and security measures. The caveat is that customers must evaluate whether they are sufficient to meet their own governance commitments and external regulations. It’s a case of picking the most suitable solution from the myriad available. Or, it could make sense to take the hybrid route. Going the hybrid route A hybrid model potentially offers the best of both worlds, providing the scalability and computing power of a public cloud along with the security and control of a private infrastructure. Having access to a modern public cloud also allows organisations to evaluate and take advantage of emerging technologies such as AI and machine learning, as well as services like big data analytics. With minimal upfront investment and allocation of internal resources, this helps organisations to stay agile while not compromising the security of sensitive data which can remain in the safe confines of a private cloud. For many, cloud computing has brought substantial efficiency and performance gains over the last decade. But it hasn’t been all plain sailing as organisations have had to wise up to the pitfalls of choosing the wrong model or provider. Now, with the benefit of hindsight, decision makers are in a strong position to pick the right option and the right supplier, if they do their homework. This should ensure the majority have enough knowledge to cut through the jargon and hype to work out what’s best for their organisation’s unique requirements. ### The Role of Artificial Intelligence in Subscription Management AI has revolutionised the landscape of sales and reinvented different business activities. 
Today, artificial intelligence in subscription services is changing the way consumers buy products and services. Different types of AI capability are having a significant impact on subscription management systems, helping companies move from one-off purchases to recurring revenue models. AI has already made its mark on e-commerce; the subscription industry is next. AI solutions for subscription services have become so widespread that most software companies now want to integrate them into their applications. Subscriptions are an important revenue stream across sectors, from streaming services to SaaS, and AI is set to bring advanced capabilities to many more industries besides. Let's explore the role of AI in subscription management and how it is lifting the subscription economy. Understanding Artificial Intelligence in Subscription Management Subscription management is the process of handling subscriptions from sign-up through invoicing to renewal. Done manually, it is time-consuming and prone to errors; integrating AI into subscription management makes it far easier, as automating the process improves efficiency, accuracy, and user experience. Before looking at AI's role, why is it needed for subscription management at all? Subscription management spans several activities across the customer lifecycle, including acquisition, retention, and servicing of subscribers, along with invoicing, metering, billing, provisioning, and customer support. Traditional systems relied heavily on manual intervention, which led to errors, inefficiency, and delays; AI-driven systems have streamlined these workflows like never before. Role of Artificial Intelligence in Subscription Management Subscription Invoicing AI-driven subscription management provides an effective way to create and deliver bills, resulting in seamless, error-free billing. AI algorithms can generate customised, personalised invoices based on each user's usage habits, billing preferences, and other information. For recurring billing, for example, AI creates invoices automatically, tailored to the customer's subscription plan, payment method, and usage. Retention of Customers and Churn Prediction Churn is one of the most sensitive metrics in subscription management: a small movement either way can have a major impact on revenue and overall business performance. AI can detect likely churn early, minimising customer attrition, by analysing customer data, usage patterns, behaviour, and interactions. With a custom AI solution, a business can adopt a proactive retention strategy, keeping clients active and offering targeted discounts and incentives. Customised Recommendations: AI-powered recommendation engines improve client interactions and increase upselling prospects. AI in subscriptions enables personalised recommendations by analysing each client's behaviour, data, and preferences.
With this, providers can recommend new upgrades and services to subscribers; the recommendation algorithms used by OTT platforms, for example, help to increase subscription renewals. Increased User Interaction: AI gives businesses a far better view of how users engage with the subscription platform. Advanced algorithms help to understand each user's behaviour and preferences and to implement strategies that increase satisfaction and interaction, adjusting the platform's updates, features, and promotions to suit those patterns. Fraud Detection and Prevention: AI algorithms provide strong protection against fraud. By examining transactional data and user behaviour, AI in subscriptions can identify and stop fraudulent activity, preventing revenue loss by quickly recognising fraudulent subscriptions. This is equally useful for e-commerce platforms, where AI can flag suspicious activity such as an unexpected burst of sign-ups or subscriptions from a single IP address. Dunning: AI can also drive more effective collection strategies within your subscription management platform. Collection rates improve when retries are informed by customer behaviour, payment patterns, and historical data, which reduces manual intervention and enables more efficient resource allocation. Benefits of AI Development for Subscription Management AI is an effective tool for improving both the business and the consumer experience, and it is particularly well suited to recurring revenue models because it can make predictions from the data available. For a business, a subscription is a way to build a long-term customer relationship: customers receive regular updates and can adopt them quickly, and subscription companies that use AI enhance customer experience and accelerate revenue growth. The main benefits include: Flexibility: AI in subscriptions offers the flexibility to create personalised content recommendations. Algorithms analyse subscriber data, such as browsing history and shopping behaviour, which can deepen engagement and loyalty. Improved Customer Service: Many businesses are integrating AI chatbots, which give subscribers quick, personalised responses. AI-powered chatbots can answer customer questions and resolve issues, which also reduces the workload on customer service teams. Analytical Prediction: AI identifies trends and patterns that would be impossible to detect manually, helping ensure the service continues to meet subscribers' requirements. Customer Retention: One of the most important benefits of AI in subscription management is identifying a subscriber's risk of churning and taking measures to prevent it. Churn Management: AI has made a major contribution to revenue recovery in particular. If a customer's payment fails, for example, AI can use payment history to pick the best time to retry the charge rather than simply retrying at renewal.
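To make the churn-prediction idea above more concrete, the sketch below trains a simple classifier on synthetic usage signals and flags high-risk subscribers. It is illustrative only: the features, threshold, and data are invented, and scikit-learn is assumed purely as an example library.

```python
# Rough sketch of churn prediction: train a classifier on usage signals
# and flag at-risk subscribers. The data and feature names are made up.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
# Hypothetical features: logins in the last 30 days, support tickets, tenure in months
X = rng.normal(size=(1000, 3))
# Synthetic label: low engagement loosely correlates with churn
y = (X[:, 0] + rng.normal(scale=0.5, size=1000) < -0.5).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

churn_risk = model.predict_proba(X_test)[:, 1]  # probability of churning
at_risk = churn_risk > 0.7                      # candidates for retention offers
print(f"{at_risk.sum()} subscribers flagged for a retention campaign")
```

In practice the same pattern would run on real usage, billing, and support data, and the flagged accounts would feed the proactive retention offers described above.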
Conclusion Of the many tools available in the marketplace, artificial intelligence is one of the most revolutionary, and it has already made a significant impact on the subscription management industry. As technology evolves, so does the role of AI, giving businesses the potential for sustainable growth. Businesses can also integrate custom AI solutions for subscription management with the help of specialist AI developers, whose algorithms do much of the work of making these systems successful. AI-powered subscription billing automates billing procedures, reduces churn, refines pricing strategy, and identifies fraudulent activity, helping businesses maximise their income while increasing client retention and improving effectiveness. Adopting AI-powered subscription management promotes sustainable business growth, and there is plenty of future scope for AI to be integrated into other industries, with AI development companies bringing AI-powered solutions to businesses that want to stay ahead of the competition. ### Understanding the cloud adoption curve and what the future holds Globally, strong cloud adoption trends are well established, with the industry gradually maturing into increasingly defined sectors and specialisms. Take the public cloud, for example, where research from Gartner predicts that the market will grow by over 20% this year to nearly $700 billion – more than twice the rate for the IT industry as a whole. As Gartner goes on to explain, "cloud has become essentially indispensable" and is set to become a trillion-dollar industry by 2027. But what is it about cloud computing that has created such a compelling proposition across today's digitally-centric organisations? Whether they choose an 'all in' approach with one partner or a hybrid strategy where some functions are migrated to the cloud and others remain on-premises, it's this inherent flexibility that gives businesses more scope to manage IT provision and expenditure much more effectively than was possible in the pre-cloud era. Despite these genuinely transformative advantages, Gartner argues that, currently, most organisations use the cloud as a technology disruptor or capability enabler. Typically, adoption is happening in the context of wider digital transformation strategies and to drive benefits that only the cloud can deliver. In the next 3-4 years, however, this process will accelerate and "by 2028, most organisations will fully transform into cloud-based entities capable of sensing and responding to business and market conditions." The M&E cloud microcosm Take the media and entertainment (M&E) industry, for example, which has become one of the most enthusiastic adopters of cloud technologies and can be seen as a bellwether for other markets. According to industry research carried out by Forrester last year, over half (52%) of M&E organisations reported having more than 60% of their workloads in the cloud. As is the case for industry sectors across the board, this heavy investment in cloud infrastructure and services is also driving increased performance and reliability expectations. Three-quarters of respondents to the same Forrester research said their organisation's users expect zero downtime, rebuffering or playback errors.
At the same time, data volumes are exploding, and the ability to scale storage infrastructure has become critical for organisations that are in the business of creating high-resolution content. From capture and post-production to viewer streaming, the M&E sector has demanding storage requirements where cloud services play a fundamental role. Given these various challenges, some M&E organisations are also taking the opportunity to reevaluate their cloud supplier relationships. In particular, rising costs in the form of TCO, data retention and egress charges contribute to targeted repatriation strategies where certain workloads are moved away from hyperscalers to other providers or even back to on-premises execution venues as part of a hybrid approach. Wherever organisations are on their cloud journey, the M&E industry is one to watch. Organisations including Netflix, Disney+ and Amazon are pushing the boundaries of digital content streaming, with cloud technologies enabling real-time scalability for millions of concurrent users. The industry is also a good example of how cloud-based collaboration can be made to work for geographically distributed teams focused on large and complex projects. This trend is accelerating the move to fully virtual production environments and demonstrates that online collaboration can be much more ambitious and effective than the narrow use of video-based meetings seen in many organisations. Looking ahead So, what do these various issues say about the future of cloud adoption? There are a number of trends to follow. The first is that despite the massive size of the existing cloud market, there remains substantial scope for future growth. According to research published by IDC, spending on compute and storage infrastructure products for cloud deployments last year, including dedicated and shared IT environments, overtook that of non-cloud spending with a 46% market share. Add to this the growing popularity of SaaS technologies among SMEs, and it’s likely that cloud adoption will continue to surge across the board. Many organisations are also now opting for multi-cloud strategies to avoid vendor lock-in and reduce risks. By spreading workloads across multiple cloud platforms, it becomes possible to optimise flexibility and resilience to guard against service outages or cost increases. This offers further evidence that cloud users are becoming more sophisticated in their choice and implementation of cloud services, as well as their willingness to change. It almost goes without saying that AI will have an increasingly important influence on the development of cloud technologies. AI-optimised cloud services are already integrating capabilities such as predictive analytics and machine learning into their solutions portfolios while the technology is also being used to manage cloud infrastructure more effectively. That’s just the tip of the cloud and AI iceberg, and we can expect to see service providers continue to roll out a wide range of new and enhanced products, particularly those that focus on areas such as cybersecurity and automation. Suffice it to say that less than 20 years after the launch of the first commercial cloud, it’s a story that will continue to evolve, driven by a highly innovative culture and customers who increasingly know what they do and don’t want. ### AI Build or Buy and the Death of Billable Hours "The billable hour has been a universal system applied by law firms. 
Using the traditional industry standard of six-minute intervals, lawyers would monitor their time spent working on client activities and then record it; first manually, and then later onto a technical application that would allow the firm to report and invoice accurately. While retainers and fixed fees are beginning to take over as the industry standard, billable hours are still commonplace. And even if they aren’t used to charge clients, they can be used as an internal model to measure workloads, employee efficiency and how busy a firm is. Now, the explosion of AI is set to change these approaches altogether. AI will automate many legal tasks traditionally performed manually, transforming how firms and staff spend their time. We see law firms gaining time and cost efficiencies at an operational level. But how exactly will it impact the billable hour and firms? AI – the killer of the billable hour AI will have an impact on the billable hour, with the dying model already beginning to fade into obscurity. But its rollout will impact different firms, different departments and differently sized companies at different times. Thomson Reuters experts wrote a piece predicting how Generative AI will impact the legal sector. Looking ahead to the next three to five years, the article highlights how, as expected, AI will mean legal work gets done more efficiently. Yet this in turn will increase the pressure on law firms from external corporate legal departments “to deliver work faster”. Of particular note, when it comes to the billable hour, it notes that this increase in efficiency will mean the model “will no longer be the most cost-effective way to capture value”. Instead, it will trigger legal entities “to reimagine their commercial models to better share in time and cost efficiencies, and capture the value that’s added through advanced technology”. But what does a reimagined commercial model for the legal technology sector look like? And what firms will lead AI adoption? The rise of ‘lawtech’ companies The death of the billable hour, coupled with AI’s evolution, paves the way for law firms to deliver and sell their legal services using best-in-class technology platforms. What this means is that companies will instead charge the customer for the use of the technology in hand with their legal services, rather than just the services alone. In this respect, while legal tech already helps firms and lawyers streamline their workflows internally, law firms will begin to emerge from this digital change as technology or ‘lawtech’ companies, using AI-powered platforms to provide and deliver services for clients. Looking at how AI is set to change and shape the legal sector, the successful law firms leading the adoption of AI will not be experimenting or in playrooms with GenAI. Instead, they will know and have a good understanding of the business use cases that are most likely to have a positive impact on the firm financially. This understanding – of progressive and forward-thinking firms – will propel measurable and successful change, not only with time and cost efficiencies but also by driving revenue and gaining competitive advantage. To build or to buy? It’s one thing to say that one will implement AI and best-in-class technology – it’s quite another to actually do it. When approaching such a task, the main question to ask is whether to build or to buy. The real question here is whether a firm wants to gain a competitive advantage. 
For those firms looking for speedy adoption or who have limited technical expertise, using commercial off-the-shelf (COTS) platforms is an attractive option. However, as other companies will be using the same COTS platforms, these platforms cannot provide a competitive advantage. The only competitive advantage firms have here is how quickly they can adopt, right? What's more, using out-of-the-box AI tools will mean that competitors benefit from the machine learning improvements that a firm's data brings to the COTS AI tool. So, whether you like it or not, everyone benefits from your data. If firms can build best-in-class technology themselves to deliver their services – technology that works for their processes – then they stand a chance of gaining a competitive advantage through bespoke, technically excellent software applications. What's important is that firms buy commodities ('SaaS/COTS, licence') but build the custom, in-house, bespoke elements that give them a competitive advantage. However, what if firms don't have the expertise to do this? Software outsourcing companies offer cost-effective and efficient ways to build a bespoke platform. A firm can access a flexible pool of talent, handpicked to suit the business and software needs of its business and employees, and this talent tap can be turned on and off as required. Building to the death of the billable hour The legal sector is set to be one of the early widespread adopters of AI. Tasks such as managing and reviewing documents and updating contracts are perfectly suited to AI and automation tools, and their rapid evolution is sure to take over more activities in the coming years. Crucially, this will signal the eventual death of the billable hour. As firms transition into selling their legal services through best-in-class technology platforms, the most successful AI adopters will be the firms that can not only outline specific business use cases and targets, but build their own AI applications. The decades-strong approach of the billable hour is already seeing out its dying days. AI is sure to provide the final blow. ### Optimising Cloud Cost Management to Maximise ROI A business's cloud infrastructure needs will evolve with its growth. The State of Cloud Costs (2024) report shares that 58% of companies say their cloud costs are too high, and these costs will continue to increase without strategic cost management and optimisation. Effective cloud cost management is about optimising cloud resource utilisation, gaining transparency in usage patterns, and minimising waste. In their efforts to reduce cloud costs, 59% of startups and SMEs are making architecture changes while 43% are reducing wastage. Cloud cost optimisation is now essential, but it's a complex process involving multiple variables: companies with multi-cloud environments must contend with dynamic pricing and evolving needs to work out which strategies they require and how to implement them.
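Transparency usually starts with simply seeing where the money goes. As a purely illustrative sketch – it assumes an AWS account, the boto3 SDK, and placeholder dates – pulling last month's spend per service from the Cost Explorer API might look like this:

```python
# Illustrative only: list last month's AWS spend per service via Cost Explorer.
# Assumes configured AWS credentials; the dates below are placeholders.
import boto3

ce = boto3.client("ce")

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2024-11-01", "End": "2024-12-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: ${amount:,.2f}")
```

Other platforms expose similar reporting APIs, and a breakdown like this is the raw material for the cost categories discussed next.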
Understanding Types of Cloud Costs Cloud costs are the expenses associated with using cloud computing services and tools. These costs vary based on the services a business utilises, the pricing models involved, and the organisation's needs. Types of Cloud Costs Compute Costs: This refers to the cost of using virtual machines (VMs), containers, hardware resources, and serverless functions, which are generally grouped into instances. Customers are charged based on the instance type and how long it runs. Compute costs have the highest share of all cloud cost types, and rightsizing instances and auto-scaling resources to match requirements in real time can optimise them. Storage Costs: Covering hot (frequently accessed) and cold (occasionally accessed) tiers, storage costs are incurred whenever data is kept in cloud storage. Hot storage costs more, while cold storage is less expensive, reflecting how often the data is accessed. But as data accumulates over time, costs can increase substantially, which means effective data management is essential to keep them under control. Networking Costs: These include bandwidth charges – the cost of moving data out of the cloud (egress) and into the cloud (ingress). Customers are also charged for data transfer between zones, regions, and external services. Hidden Cloud Costs Data Transfer Costs: It's a simple task to cut, copy, and paste data, but data transfer is chargeable and the costs can mount up depending on the service provider and region. For organisations with a multi-cloud strategy and globally distributed services, frequent transfers can compound over time and lead to unexpectedly high bills. Underutilised Resources: Underutilisation of cloud instances, including virtual machines and storage volumes, amounts to wasted spend. These are resources you keep paying for regardless of how little value they deliver: servers left running 24/7 when they are only needed for a few hours, or storage volumes holding data that is rarely accessed. Deallocate resources after a project completes and check regularly for anything running idle. Mismanaged Licenses: Even though your cloud services provider will offer several operational licenses to run your systems, you may not need all of them. Mismanagement occurs when you purchase more licenses than required or forget to cancel existing licenses that are no longer needed. Identifying, understanding, and optimising the internal processes used to check and manage both visible and hidden cloud costs is all the more important; doing so will help you save substantially while learning how to mitigate these costs as you move forward. Key Strategies for Optimising Cloud Costs Cloud cost optimisation isn't a one-time event. It's a dynamic, ongoing process that must change with your needs while following the same basic structure. Right-Sizing Resources Right-sizing, as the name suggests, is about analysing cloud resource utilisation against your organisation's performance metrics to determine whether you are using all your resources efficiently and to decide on corrective actions.
This includes aligning resource usage with demand and modifying the infrastructure as needed; depending on requirements, you may upgrade, downgrade, or downsize cloud resource utilisation. Right-sizing is crucial for cloud cost optimisation because it ensures you only pay for the computing power you are actively using. Choosing the right instance size also aligns resource allocation with actual demand, removing the bottlenecks caused by underpowered resources. AWS offers a range of tools, such as AWS Cost Explorer, AWS CloudWatch, and AWS Compute Optimizer, to help organisations identify gaps and optimise resource utilisation. Using these, you can choose the right size based on usage patterns, variable versus predictable load, and temporary workloads, as well as turning off idle instances and selecting the right instance family. Discovery, which stores 15 petabytes of broadcast content at any given time, saved 61% in Total Cost of Ownership (TCO) after an analysis with AWS Cloud Economics that led it to downsize server racks and use reserved instances. Reserved Instances and Spot Instances Reserved instances involve committing to a specific amount of computing power for a fixed term. Committing to a long-term contract brings significant savings on monthly or hourly usage prices, with the biggest discounts generally requiring all costs to be paid upfront; you can also choose partial or zero upfront payment in exchange for a higher hourly rate. Spot instances involve bidding on a cloud provider's excess or unused capacity. They are generally used for non-critical workloads and often come with a heavy discount, up to 90%, but there's a catch: the provider can terminate spot instances at short notice. For AWS, it's only two minutes, which is just enough time to save application state and update log files. How to Use Spot and Reserved Instances Together for the Best Cost Optimisation The best approach is to use reserved and spot instances together in a way that saves the maximum money without compromising the work. Identify Critical vs Non-Critical Workloads: Use reserved instances for workloads that require constant uptime and uninterrupted server access; choose spot instances for workloads with flexible schedules that can be restarted if interrupted. Use Auto Scaling Policies: Auto-scaling automatically adds spot instances when they are available at a lower price, without affecting the core workloads on reserved instances. Automating Cost Management When manual cloud cost management gets challenging, try automating cost monitoring, resource allocation, and optimisation. Whether built-in or provided through third-party integrations, these tools can track usage patterns and detect inefficiencies, then automatically adjust settings or send notifications. Popular tools include: AWS Cost Explorer: Lets organisations visualise trends and forecast cloud costs based on existing usage patterns. Azure Cost Management: Offers detailed cost analysis and recommendations for optimising budgets. Google Cloud cost management tools: Provide budget setting and cost alerts and help scale cloud resources in line with demand. Combining automation with scheduled scaling gives you even tighter control of cloud resources.
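As a small example of what that automation can look like in practice – a sketch only, assuming an AWS environment, the boto3 SDK, and an arbitrary 5% CPU threshold – a script that flags idle EC2 instances for review might be as simple as this:

```python
# Hedged sketch: flag running EC2 instances whose CPU has stayed below 5%
# over the past day as candidates for stopping, scheduling, or downsizing.
from datetime import datetime, timedelta, timezone
import boto3

ec2 = boto3.client("ec2")
cloudwatch = boto3.client("cloudwatch")
now = datetime.now(timezone.utc)

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

for reservation in reservations:
    for instance in reservation["Instances"]:
        instance_id = instance["InstanceId"]
        stats = cloudwatch.get_metric_statistics(
            Namespace="AWS/EC2",
            MetricName="CPUUtilization",
            Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
            StartTime=now - timedelta(days=1),
            EndTime=now,
            Period=3600,
            Statistics=["Average"],
        )
        datapoints = stats["Datapoints"]
        if datapoints and max(dp["Average"] for dp in datapoints) < 5.0:
            print(f"{instance_id} looks idle - review before stopping it")
```

The same idea extends to stopping instances automatically on a schedule or raising alerts, which is where scheduled scaling comes in.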
Scaling cloud resources for a growing organisation is a necessary activity, but it requires careful planning to prevent cost overruns. With scheduled scaling, you can set rules to automatically increase or decrease cloud resource utilisation, which not only reduces wasted resources but also ensures you deliver services with maximum efficiency. All of this works in real time based on predefined triggers: for instance, if memory usage exceeds 80%, the cloud platform automatically provisions additional resources and then reduces them when demand decreases. Infrastructure Monitoring and Optimisation Infrastructure monitoring supports the management and optimisation of cloud computing costs by analysing resources and services and checking whether they align with performance metrics. This information is used to identify inefficiencies and underutilised resources so that cost-saving measures can be implemented. Detailed Visibility in Real Time: Continuous infrastructure monitoring provides up-to-date insights into the usage of cloud resources, letting organisations spot consumption trends and patterns and ensuring that what they pay reflects the resources actually used. Proactive Performance Optimisation: Monitoring CPU utilisation, memory usage, disk I/O, and network bandwidth makes it easier to identify performance issues and bottlenecks. Knowing these allows resource requirements to be adjusted and instance usage to be optimised, leading to cost savings. Infrastructure Monitoring Tools for Resource and Cost Optimisation Wherever resources are located, infrastructure monitoring has crucial implications for I&O leaders, helping to collate resource availability and ensure effective utilisation. Middleware: Middleware offers a full-stack cloud observability platform that gives you a real-time view of cloud performance, tracking network throughput, memory usage, CPU utilisation, and disk I/O. This lets you follow the health of servers, containers, and the applications running in the cloud in real time, identify performance bottlenecks, and optimise resources to ensure optimal utilisation and prevent downtime. Middleware also lets you customise dashboards to extract information on Kubernetes performance, and its automated alert system sends notifications about potential issues so you can make quick changes to maintain peak system efficiency and performance. How Middleware Helped Revenium Optimise Observability Costs Cloud cost management is a challenging process for businesses, given the multitude of variables involved. One of these variables is observability cost, which can account for up to 30% of total infrastructure monitoring spend; this is the amount spent on the tools and services used to monitor (observe) the performance and availability of the cloud infrastructure. Middleware is enabling Revenium to reduce its observability costs by over 10X. Taking advantage of Middleware's features and tools, Revenium connected its business data with Middleware's telemetry data. Middleware helps Revenium overcome observability challenges including ECS monitoring, tracking critical business events, and gaining insight into customer interactions, transaction volumes, and error rates. Access to this information allowed the client to make changes and reduce its observability costs by 10X.
Grafana: An open-source platform popular for its visualisation and integrations, Grafana brings together data from multiple sources to help you decipher cloud infrastructure metrics. As you visualise data from customisable dashboards that update in real time, it’s easier to track KPIs, detect trends, and set up anomaly detection alerts. Prometheus: Prometheus automatically collects metrics about different tools and services in real-time and stores the data in a time-series format. This makes data analysis at a later date easier and lets you identify trends for resource utilisation and issues and trigger alerts according to predefined rules. Prometheus and Grafana can be paired for advanced data visualisation analytics for a detailed analysis of the cloud infrastructure. Datadog: Datadog is popular for real-time visibility into the cloud infrastructure, applications, and cloud services. With support for multi-cloud environments, Datadog also offers a unified dashboard to group information from various cloud providers and analyze their usage, performance, and costs. You can also set real-time alerts to detect anomalies, schedule scaling, and set notification alerts to monitor usage patterns. Implementing a Cloud Cost Management Framework Experts recommend taking the “Crawl, Walk, and Run” approach prompting organisations to take quick action but gradually expand the size of their cloud strategy as the situation requires. Cloud cost management framework is implemented in two parts; Develop a Cost Management Strategy Assessment: Consider the existing cloud storage utilisation of every department, application, and resource to understand the existing cloud spend. Don’t miss EC2 instances, spot instances, Storage costs, reserved instances, data ingress, egress costs, and others. This will help: Review existing cloud infrastructure and workloads. Identify areas of inefficiency like knowing underutilised and idle resources and unexpected cost spikes. Calculate the Total Cost of Ownership (TCO). Goal Setting: Set realistic, measurable, and specific goals regarding cost reduction and visibility. Following this approach, the goals you set will be easier to track and measure. Combine these with cost management-related performance goals, like improving resource utilisation, avoiding cost overruns, and achieving better cost visibility. Tools Selection: You can track and assess set goals easily only with the right tools. Choose a tool that can help you monitor, optimise, and automate your cloud infrastructure. Consider tools with the following features: Cost management Automation Real-time monitoring Scheduled scaling Expense visibility Security Compliance Unified view Budgeting Forecasting Incorporating Infrastructure Monitoring Infrastructure monitoring includes tracking performance and health of IT infrastructure components to ensure their continuous operation and reliability to maintain overall efficiency. Track metrics like memory and CPU usage along with network traffic and disk performance to analyse system health and make changes with the best-fit tools to obtain crucial information in real-time. Implementing infrastructure monitoring ensures improved system uptime and proactive issue resolution. Furthermore, effective analysis leads to optimise resource utilisation and improve capacity planning. With Middleware infrastructure monitoring, you can monitor website, web application, and serverless functions performance and availability. 
Infrastructure monitoring tool integrates with GitLab CI/CD, AWS Lambda or S3 Triggers, ensuring you can monitor all cloud-enabled applications. Regular Review and Adjustment Cloud cost management is an ongoing process. Don’t expect it to deliver results with one-time execution. Regular review of usage, costs, and performance metrics can ensure your cost management strategy is working as intended. As a part of the review process, stay up-to-date with the latest technological advancements, pricing models, and dynamics of business requirements. Conclusion Cloud cost optimisation is an evolving process and businesses are now focusing on using automation and AI-driven analytics to further enhance cost efficiency. Going forward, AI-powered assistants will help keep an eye on the cloud management systems 24/7 and provide real-time information for cost-analysis and predict future expenses. While using advanced technologies will be crucial, ensure you are also updating the existing systems for smooth integration and run a cost-benefit analysis before implementation to maximise benefits. ### Welcome to More Productive, AI-powered Working Lives According to content services expert Dr. John Bates, AI (Artificial Intelligence) is ushering in a new era of highly responsive business systems Before long, AI bots will be sifting through a variety of content for you, some familiar, some of which is “dark” some of which is structured and some of which is unstructured. By extracting metadata and overlaying structure, these intelligent agents will connect various elements, uncovering insights and making them searchable. They'll integrate all your organisational metadata, creating a holistic, continuously evolving narrative that can serve as the foundation for more context-aware process automation. This suggests that anyone dealing with content should be ready for AI capable of processing all kinds of documents—contracts, orders, policies, even handwritten notes—and helping you navigate them. Imagine being able to simply ask a document that has appeared in your electronic inbox questions like: "What are you?", "What's your subject?", "What is this contract?", "Who are the contracting parties?", "What is your expiry date?", and "What are the penalty clauses for breach?" And the document itself understands and responds, almost as if it were a conscious entity. These same tools can then intelligently trigger subsequent actions, such as paying a supplier or even initiating a new project from scratch within your broader enterprise systems like ERP, CRM, contract management, and more. There is immense potential for AI-enabled tools to "read" and interpret incoming documents. By allowing interaction through natural language instead of complex queries like SQL, this new generation of what I call "sentient" documents will overcome language barriers. Even if a document is in a language you don't understand, you'll be able to engage with it naturally in your native language—for instance, asking a question in English about a Japanese invoice and receiving an immediate, accurate response in either language, or even a third, as needed. With these advancements in how we handle business information, the emergence of the sentient document marks a revolution that could be the most significant leap in document automation since the invention of the printing press. 
The reason: sentient documents have the potential to fundamentally change how enterprises understand and harness the vast repository of work patterns and corporate knowledge embedded within them. The rise of document sentience couldn't be more timely. By 2025, global data creation is projected to grow to more than 180 zettabytes, or 180 trillion gigabytes. The challenge isn't just the sheer volume of data but our inability to effectively leverage it. Estimates suggest that about 54% of information stored on business systems is "dark," meaning it's inaccessible or underutilised. Sentient document technology will address this by ingesting and illuminating dark data, extracting descriptive metadata whenever content is created or ingested. This will form a centralised "metadata kernel" that powers AI bots and enhances analytics capabilities. How do we get there? Clearly, sentient documents will offer a crucial solution for managing the overwhelming flood of data inundating businesses today. Some of my company's customers are already leveraging automated systems that read incoming emails, categorise and interpret embedded documents, and take intelligent actions based on them. For example, SEW-Eurodrive, a global leader in industrial automation, utilises a form of document sentience to expedite the processing of over 5,000 orders daily. It also uses nearly 100 embedded “Content Apps” to automate order workflows, from creating CAD designs to scheduling the manufacturing and delivery of its products. This approach is thus automating critical business processes and making document discovery as intuitive and responsive as the concept of “sentience” implies. So, how do we achieve this? Document sentience evolves from automating content processes, such as automatically responding to invoices in finance or applying business rules for employee onboarding in HR. The key breakthrough lies in leveraging a broad spectrum of AI capabilities, not just conversational AI. Sentience requires the integration of natural language understanding, generative AI, large language models, and machine learning for pattern recognition. This combination is what enables documents to truly "understand" and respond intelligently, transforming how we interact with business information. The real magic lies in the synergy of these intelligent technologies, which together create a powerful framework for a next-generation content platform. Simultaneously, the rise of sentient documents will introduce an intuitive and familiar conversational interface, similar to what we experience with virtual assistants like Siri or Alexa. However, this interface will be specifically designed for intelligent document and content interaction, rather than everyday consumer tasks. I am confident that document sentience will empower knowledge workers globally to make genuinely data-driven decisions. This advancement will save time, reduce costs, and enhance the efficiency of every knowledge worker. ### Cloud Security Challenges in the Modern Era Organisations already have to store files and data in the cloud as a regular work process, and studies have shown that the security side of it is at high risk. Businesses are now balancing the cost of cloud management and security, meaning organisations are at risk of needing help managing their data correctly within these systems. In the following article, I will discuss the challenges the cloud environment faces and the risks organisations face with security. 
Security procedures in the cloud One continuing challenge in cloud security is the mistaken assumption that security procedures designed for on-premise data centres can simply be applied to the cloud. Cloud systems are far more complex and need additional security measures beyond those used on-prem. It is worth noting that monitoring internal traffic is just as important as monitoring external traffic: once a hacker breaks in, they are able to move around within the system, creating further vulnerability in cloud security. In more traditional data centres, security focuses on physical controls such as locked rooms and restricted entry points. But because cloud resources can be spread across multiple locations and accessed from anywhere, cloud security has to be flexible and adaptable. Organisations therefore need to make sure their security processes can handle the constantly changing cloud environment, where workloads can quickly start or stop and data moves easily between services. Studies have revealed that 95% of internet traffic is now encrypted, producing another challenge in the cloud security landscape. Though encryption is essential for protecting sensitive data, it creates a blind spot for security teams. Hackers take advantage of this to mask their attacks, meaning that organisations must find ways to inspect encrypted traffic effectively without compromising performance or incurring excessive costs. Don't have cloud "tool sprawl"! Cost management is another challenge in cloud security, and there are a few strategies organisations should consider to overcome it. One is to review their security tools across different cloud platforms. Doing so highlights areas where the same tool can be used across different cloud environments, creating a simpler operating environment and providing a clearer picture of what is happening across the cloud estate, which improves security overall. Following on from this, accumulating excessive numbers of different tools and methods for collecting data can cause complications and higher costs. By constantly reviewing the tools in use, companies can avoid "tool sprawl" and maintain consistency across the whole cloud system. Tool sprawl is where too many tools are used, which increases licensing costs and requires employees to have a wide variety of skills to manage them. So, by managing the tools they use, organisations can manage their spending. Be picky with your data There is nothing wrong with being selective about the data you send, and companies need to be more selective and careful about the data their tools collect. By ignoring low-risk traffic and focusing on real threats, they can reduce the amount of data that is processed, which enables them to save on cost. With processes in place to filter out unnecessary data, organisations can be confident that their security tools focus on the most important and critical data. Working processes improve as a result, such as the ability to detect and respond to genuine threats. The rise of generative AI brings new security concerns to the cloud environment. It's always been said that you should be careful about who you share your data with; the same applies when sharing data with AI tools. Organisations need to monitor the network traffic between the company and AI services to understand precisely how the data is being used and to highlight any potential risks.
Regularly reviewing the data collected will help with this. The 3 keys As cloud tech becomes more popular, there are three key points that should be remembered to ensure a safe working environment. Security features are something that needs to be understood clearly. Organisations need to know their responsibility with security in the cloud. Though cloud providers offer strong security features, they do not handle every part and need to be understood. It isn't uncommon to find that the customer has the main responsibility for security. The second key point is that organisations need to be able to balance the speed of innovation with thorough security planning. Do not rush to the finish line and deploy cloud processes without proper evaluation of the long-term implications. When launching cloud applications, security, as always, should be a top priority, and organisations should never add security features later on. During the early stages of design and development, security should be one of the features that must be considered first. The final point is one that can save organisations money. For each new project, unless it is necessary, do not buy new tools; instead, focussing on current tools allows companies to save money and resources. Through reviewing, businesses often find they are able to make better use of current tools of a variety of cloud platforms, and employees are then able to develop their skills in this. Cost Management & Regulatory Compliance Continuing on with cost management, a lot of companies need to find a way to manage their cost management in cloud security. Businesses need to make sure that as they look to more cost-effective spending in the cloud, that security is not at risk. This issue can be solved by using automation and AI-processed security solutions, which will help balance cost and cloud security successfully. From this, businesses are able to delegate resources effectively and confidentially, knowing that their cloud security is strong. Another challenge that many are facing is linked with the regulatory requirements. Data protection regulations are constantly being reviewed and amended with stricter rules, organisations must be aware that their cloud security practices must follow the guidelines and any industry standards. Ways which they are able to ensure they do the following: practising proper data governance practices, maintaining audit trails, and ensuring that data is gathered properly and correctly. Adding a Human Touch to Cloud Security Technology itself is only part of the puzzle that organisations need to focus on; another aspect is employees within the company itself. Employee training and awareness cannot be overlooked and must be incorporated to ensure that they are aware of their role in keeping the cloud system secure. Creating a culture within the organisation where employees are able to see their responsibility in protecting data will keep the security of cloud platforms strong. Taking steps such as regular training, running practice exercises like fake phishing attacks, and having clear security rules will help maintain this culture. In conclusion, businesses need to update and manage their security practices to overcome the challenges mentioned in this article. By focusing on understanding all network activity, secure data management, and choosing the right security tools, organisations will be able to manage costs and improve cloud security. 
Cloud security is constantly developing, which means that businesses need to keep reviewing their practices to ensure they evolve in step without putting security at risk. How are you managing your cloud security? ### Why I welcome AI software development Today, I will be taking you on a journey, learning about communication from ancient times to today's software development and machine scientists. The purpose of this blog is to understand how the knowledge of human history has been encoded and passed through generations. Grab a cup of tea! My journey with you begins in ancient times, when scribes were the ultimate keepers of knowledge. The reason: the scribes were the writers of the day. Holding positions of importance, they created the stories and the journeys that tell us about the politics, religion, and heroes of those days. These texts, handed down between generations, were crafting the code of yesteryear, defining civilisations and creating in our minds a reconstruction of millennia that have passed (I am personally a fan of the Ancient Greek schools of thought). A great example of religious scribes comes from the Jewish practice of Tiqqun Soferim. These scribes were tasked with maintaining the Hebrew Bible, making corrections and minor alterations, and preserving the integrity of the sacred texts. Viewed in a modern context, were they doing version control, or taking a commit from GitLab? Ensuring the code base remained stable? Correcting the minor mistakes to keep the text running smoothly? My journey with you now moves us into the mediaeval period. Near me is Romsey Abbey, where you can imagine the smells and sounds of monks praying in years gone by. The scribes of the monastic period were the Wikipedia of the day. Working a minimum of six hours a day by candlelight, they were re-coding, refining and updating religious texts. These subtle but important additions to our wisdom affected future generations, and as gatekeepers the scribes could shape the form and function of their belief system. The belief system introduced by the scribe affected how other monasteries received and interpreted the work; a sometimes subtle change could lead to further alterations and permanently change the context of the original work. I imagine, at times during the darker hours of winter, whether there is a comparable modern feel between the illuminated monitors and laptops lighting the rooms of our homes and offices, and the scribes working by candlelight in times gone by. The Silicon quills of today Swapping out devotions for coffee (in my case, tea), our scribes of today use keyboards and scrapers. They do not work by candlelight but in open-plan offices, usually with headphones in. The work environment aside, the modern-day scribes, our software engineers, wield power that is hard to comprehend. The architects of the modern world working on artificial systems are the scribes of today, shaping our modern life and worldview in ways we cannot fully grasp. The algorithms they create do everything from deciding what products we buy and what news we see through to trying to change the very thing that makes us human: our beliefs. Think about this for a minute. From the cats in our homes to the machines we label as 'grumpy', we have a bias (anthropomorphism) that attributes human-like qualities to non-humans. This has the effect of shaping our belief that the systems targeting us for various purposes think and act like us.
It doesn't. But what the machine is doing is looking at (in this case) codified social interactions, ascribing a persona to us, and predicting our core intentions (with great accuracy). Belief systems, religious or cultural, have a right to exist independently, and this goes to the main crux of this blog: the freedom to choose not to be served content or LLM output that pushes a single worldview. Perhaps the singularity was never about AGI and machines becoming smarter than us. Could it be the shaping of worldviews into a single bias? The tech oligarchy will say there are guardrails against these events, but recent events would suggest otherwise. From data harvesting to misaligned LLM outputs, the race to "move fast" means that we are in serious danger of things being forever broken. Having worldviews forced upon us in this manner makes us slaves to the machine's hard-coded biases and interpretations. I remember watching a Sir Jony Ive speech about how the iPad is a fluid device that moves in form and function with our preferences. Can we say the same about our early AI creations? Do they move in form and function with us? Silicon Valley's Talent Hoarding Hoarding artificial intelligence and computer engineering talent from across the globe has been a staple of the largest technology companies. These Silicon Valley powerhouses have been stockpiling talent for years, and the retention of this talent has created a non-competitive global field and a global divide in accessibility and innovation. As with cloud computing, the roll-up of AI services will be centred on a few powerhouses able to afford the huge training costs. This concentration of talent has created a singularity of thought and beliefs. When the same pool of elite developers, coming from similar societal and economic backgrounds, is responsible for creating the digital infrastructure, the guardrails, and the AI training we all use, a narrow perspective is imposed on us all. This is akin to a single monastic order dominating the religious and cultural discourse of an entire era, shaping the thoughts and beliefs of millions. Smaller companies, public services, and nonprofits are often left scrambling for technological crumbs, unable to compete with the salaries and perks offered by the tech giants. The AI Digital Gutenberg Press Moment The Gutenberg printing press caused huge upheaval: the scribes who controlled the distribution of information were "disrupted", from witty pamphlets sold around Tudor England to the Low Countries joining the Reformation. The democratisation of information was a turning point in human evolution, moving the centre of information to mass production. Today's Gutenberg printing press is the LLM. The democratisation of information, and of access to that information, is an amazing feat, and the dawn of the LLM is heralding a new era of technology democratisation. We are seeing the beginnings of coding with an LLM. Imagine needing an app for your small business idea but not knowing how to code. Soon, you will be able to chat with an LLM to get the perfect app for your small business. This democratisation of coding has the potential to unleash a wave of innovation, allowing more voices to participate in the creation of digital tools and platforms. The Great Computing Redistribution We are seeing trends in job losses from the larger technology companies as more work is divested to AI.
This great redistribution of talent could be the spark of a new movement based on humanity's values. In the United Kingdom, public services like the NHS, which have been struggling with technical debt from aged systems, would be able to see digital efficiency increase. Should a tax incentive be put in place for AI engineers returning from big tech? For small businesses, the democratisation of coding and software development would bring a world where my local fish and chip shop has an app as slick as Deliveroo, or where interacting with my local council's website doesn't feel like time-travelling back to 1995 (Geocities anyone?). The repatriation and rebalancing of technology talent would help create a fairer digital landscape. Coupled with the democratisation of software development, we could usher in a new paradigm of education, healthcare and digital inclusivity. As experienced developers oversee AI-generated code, we have a chance to create a more equitable digital world, free from the biases that have plagued both ancient manuscripts and modern algorithms. This could lead to a digital landscape that reflects the diversity of the global population and takes into account a worldview that is respectful of local beliefs and customs.
A Better Digital Future for Our Children
I believe we are standing at a pivotal moment in human history. The power to reform the digital world is no longer confined to Silicon Valley. It's spreading, democratising, and evolving.
A Human and AI Friendship
As we move forward, it's essential to recognise that the relationship between humans and AI is not one of them and us. The use of AI systems does not mean the fall of human creativity or ingenuity. Instead, it opens the door for a powerful collaboration where we act as co-pilots, guiding AI to new heights. I, for one, welcome this new era. Thanks for reading.
### A Practical Guide to the EU AI Act
Disclaimer: This article is opinion-based; please seek legal advice from reputable companies if needed. Like many others, I have watched and experienced the growth and development of AI. I have always been interested in its legal side and in the regulations that ensure organisations use it correctly and efficiently. After researching the topic, I will break down the key aspects of the EU AI Act and highlight how to use AI responsibly and innovatively.
Why Should You Care?
With AI rapidly developing, organisations must care about how they use it. From ethical responsibilities to professional standards and organisational strategy, we will explore the reasons why the AI Act should be a priority for everyone.
Prioritising Human Responsibility
Let's start with human responsibility: ensuring that the technologies organisations use do not harm individuals or violate basic human rights. No one wants to build systems that compromise public safety, health, or privacy.
Professional Standards to Enhance Innovation
Legal compliance isn't just about dodging fines; it's an opportunity for companies to improve how they develop their AI practices. AI users can benefit from better collaboration and fresh innovation by using the law as a way to connect technical and business teams.
Organisational Clarity for a Sustainable Future
For any organisation, the EU AI Act can act as a guideline for avoiding risks and fines whilst highlighting their dedication to using AI responsibly. It is vital to stay on top of the continuous development of AI laws to ensure a sustainable future.
How the EU AI Act is Structured
The EU AI Act classifies AI systems according to their potential impact on health, safety, and fundamental rights. Let's take a further look at how this is structured and why. At the highest level, systems deemed to pose an unacceptable risk are banned outright because of the potential threat to safety. High-risk AI systems, particularly those functioning in sensitive sectors such as healthcare, finance, or law enforcement, must follow strict rules and checks to ensure they are used safely. Limited-risk systems, by comparison, have to follow certain procedures to ensure they do not cause significant harm; transportation logistics, manufacturing and customer service are considered to fall in this limited-risk bracket. Finally, we move on to minimal-risk systems: these are common consumer applications deemed to have low potential for harm and, as a result, have minimal procedures to follow. This structure has been set up to allow businesses to continue to innovate while still protecting others.
How to Classify AI Risk Levels
Figuring out the risk of an AI system is vital to ensure that it follows the rules that regulators set. There are four factors that anyone using AI software must evaluate. First, they must assess the system's potential impact on health outcomes, especially in sensitive areas like medical diagnostics, where patient care could be affected by AI. Another concern is how the system itself manages sensitive data; systems handling personal information, such as chatbots, must prioritise privacy and security. The third factor is whether the AI risks violating human rights such as equality or access to public services. The last factor is how much control users have over AI decisions, making sure that AI enhances decision-making rather than replaces it.
The Best Steps to Comply with the EU AI Act
You are probably wondering how you can make sure that you are following the regulations of the EU AI Act correctly. First, it is essential to keep track of every AI project completed on the system; this can be done through documentation. For systems that are considered high-risk, human monitoring is highly important to ensure that important AI decisions are logged and that there's always a way for humans to intervene if necessary. It's important to prioritise transparency and fairness in AI. Tools like Shapley values or LIME can help explain how AI models make decisions. Plus, including fairness metrics during the training process can prevent bias from creeping in. Conducting regular compliance audits is another vital step, moving towards external evaluations by third parties to ensure impartiality and full compliance. Following this, compliance must be seen as an ongoing process, with continuous monitoring and regular updates to documentation as new features are added or regulations develop. This proactive approach ensures long-term success in navigating AI regulations.
The Evolving Landscape
With the popularity of AI technologies like generative models, the EU AI Act establishes clear regulations and responsibilities that must be followed. It can be argued that this brings additional challenges. One challenge is testing the system to identify vulnerabilities and weaknesses, such as data poisoning or model evasion.
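To make that kind of testing concrete, here is a minimal, hypothetical robustness probe: it trains a toy classifier and measures how often small random perturbations change its predictions. The dataset, model and noise levels are illustrative assumptions, and this is a sketch of the idea rather than a substitute for a full adversarial-testing or data-poisoning assessment.

```python
# A minimal robustness probe: train a simple classifier, then measure how often
# small random perturbations of each input flip the model's prediction.
# High flip rates at tiny noise levels can hint at model-evasion weaknesses.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

model = RandomForestClassifier(random_state=0).fit(X_train, y_train)
baseline = model.predict(X_test)

rng = np.random.default_rng(0)
for scale in (0.01, 0.05, 0.1):                       # increasing perturbation sizes
    noisy = X_test + rng.normal(0, scale, X_test.shape)
    flip_rate = np.mean(model.predict(noisy) != baseline)
    print(f"noise scale {scale}: {flip_rate:.1%} of predictions changed")
```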
Tests like these are essential for ensuring that the system remains robust, even in unexpected conditions. It is also important for AI users to consider the overall effect AI has on society, also known as systemic risk. Users need to think about how their models might impact people and communities. If a system poses a significant risk to society, it will attract a stricter response from regulators. With AI continuously evolving, organisations need to make sure that they continuously engage with legal updates and incorporate best practices into their everyday operations. We must remember that the EU AI Act is just the beginning of global AI regulation; new opportunities will arise to develop better and more ethical systems that benefit both businesses and society.
Conclusion
To conclude, following the regulations set by the EU AI Act requires ethical awareness, technical precision, and proactive risk management. By adhering to the rules, organisations can avoid falling short of legal requirements, embrace innovation, and ensure that AI systems are built in a way that supports confidence among users and the public. By following these guidelines, organisations can position themselves at the front line of AI development, preparing for a future where AI plays a crucial role in business, society, and beyond.
### Building a Smart City
If you ask me how I picture the future, I imagine a city where everything works together as one. I wouldn't be stuck in traffic on my commute to work, and energy would be used efficiently and sustainably. Before long, this is something I won't have to stop and imagine; I will be able to live it, all thanks to AI (artificial intelligence). AI allows life to become more efficient and sustainable by collecting a huge amount of urban data, helping city planners not only manage day-to-day activities but also plan for the challenges of tomorrow. Think of it as a digital architect working tirelessly to design a city that breathes, moves, and responds in real time.
A Compassionate City
Take Singapore, for example: AI doesn't just optimise power; it is also used to look after the elderly population. Through machine learning, Singapore has created a monitoring system that can detect unusual patterns in senior citizens' daily routines, alerting caregivers to potential issues. I find this a brilliant example of how AI can make cities not only smarter but also more compassionate.
Building a City That Thinks
Okay, now back to my comment at the start of the article, where I expressed the dream of not being stuck in bumper-to-bumper traffic on my commute to work. A smart city can solve this issue simply by being interconnected with other systems. From energy-efficient buildings to intelligent transport networks, everything works in sync. A good analogy is an orchestra where every instrument plays its part, conducted by the AI. How is this all possible, you may ask? Well, it's all done by the Internet of Things (IoT), with sensors and devices constantly communicating, ensuring the city is always in tune with its residents' needs.
A Shift Toward Sustainability
Being "smart" is only the beginning for these AI cities; sustainability marks a crucial step in smart city development. I am definitely concerned about the future of the planet, and I am doing my utmost to be as green as possible.
With the help of AI, cities are integrating green technologies, adopting circular economy principles (the three R's: reduce, reuse and recycle), and engaging their communities in creating urban environments that are resilient and eco-friendly. Take a moment to think about it: the city you live in could one day reduce its carbon footprint significantly whilst still growing and thriving, all thanks to decisions made by AI from the data it gathers.
GIS-based Urban Management
Now let's move on to Geographic Information Systems (GIS), which form the backbone of smart city management. These systems help city planners decide where everything should go and allocate resources appropriately. GIS technologies also facilitate everything from efficient route planning for public transport to optimal placement of public amenities.
Industry 4.0 and Smart Cities
The fourth industrial revolution, or Industry 4.0, is closely linked to smart city development. Technologies such as automation, 3D printing, and cyber-physical systems are reshaping urban manufacturing and services. This is a future where industry and urban infrastructure together create new economic opportunities and enhance urban productivity.
A City for Everyone
When I say for everyone, I mean for EVERYONE. Smart transportation systems are being designed with a focus on accessibility for people with disabilities. This includes the development of voice-activated navigation systems, tactile pavements with embedded sensors, and adaptive traffic signals. These innovations ensure that everyone, regardless of physical ability, can navigate the city with ease and independence.
Waste Management Solutions
Effective waste management is crucial for smart cities. Smart bins that monitor fill levels, automated waste sorting systems, and route optimisation for collection vehicles are among the innovations getting a technological upgrade. These technologies not only improve efficiency but also contribute to cleaner, more hygienic urban environments.
Emergency Management Systems
Waste management is only the start; emergency management is also improving. Imagine a city that uses real-time data and predictive analytics to detect a crisis, enabling authorities to respond faster, evacuate more efficiently, and save lives by coordinating across all emergency services. The integration of social media and mobile technologies further enhances communication during crises.
Sustainable Financing & Risk Management
Developing sustainable financing models is essential for the long-term viability of smart city projects. This involves a mix of public-private partnerships, green bonds, and innovative revenue-generating schemes. Effective financial planning ensures that smart city initiatives deliver value for money and tangible benefits to citizens. Everything isn't always as simple as it seems, though: smart cities face unique risks, including cybersecurity threats and technological dependencies. Robust risk management strategies are necessary to identify, assess, and mitigate these risks. This involves regular security audits, redundancy in critical systems, and comprehensive disaster recovery plans.
Ethical Considerations & Data Security
As smart cities collect and process vast amounts of personal data, strong ethical frameworks and data protection measures are essential.
Implementing strict data anonymisation protocols, obtaining informed consent for data collection, and ensuring transparency in data usage are ways in which smart cities will be able to keep stored information safe. To maintain public trust, smart cities will follow regulations such as the General Data Protection Regulation (GDPR). Ensuring the security of data in smart city ecosystems is a complex challenge. It requires multi-layered security protocols, including encryption, blockchain technology for data integrity, and advanced intrusion detection systems. Regular security assessments and updates are essential to stay ahead of evolving cyber threats.
Cloud Computing in Smart Cities
With all the data these cities collect, they will need somewhere to store it all, and that is where cloud computing comes into play. Cloud platforms serve as the backbone of smart city infrastructure, offering scalable storage and processing capabilities. They enable centralised data management, advanced analytics, and seamless integration of diverse urban services. An example of this in use is Barcelona's Sentilo platform, a cloud-based IoT system that integrates data from various sensors across the city, enabling efficient management of urban services.
Edge Computing for Real-Time Processing
Edge computing brings processing power closer to data sources, enabling real-time decision-making, reduced bandwidth usage, and enhanced data privacy by processing sensitive information locally. This can be seen in Amsterdam, where edge computing devices on streetlights process video data locally for traffic management, sending only relevant information to central systems.
Conclusion
From my research into this dream world of smart cities, they represent a broad approach to urban development, integrating technology, sustainability, and citizen-centric design. By strategically deploying cloud computing, AI, and edge technologies, cities can create responsive, efficient, and inclusive urban environments. I believe that as these urban centres continue to evolve, addressing challenges in data security, ethical governance, and inclusive development will be key to realising their full potential and creating more liveable urban spaces for all residents.
### Mastering Hypervisors for Enhanced Business Efficiency
The cloud computing landscape is a complex ecosystem characterised by constant innovation and the pursuit of optimal efficiency. Hypervisors have emerged as critical components of this ecosystem, and their role in cloud computing is more important than ever. This increased importance is due to the growing complexity of the cloud and recent structural changes in the industry, including Broadcom's high-profile acquisition of VMware, a leader in virtualisation, and the emergence of new vendors offering innovative virtualisation solutions. Understanding hypervisors, their capabilities, and their benefits can unlock new opportunities for your business's growth and success. Let's explore what hypervisors are, why they matter, and how they're transforming modern industries, from financial services to healthcare.
The evolution of server management
In the early days of computing, physical servers were monolithic entities, each dedicated to running a specific operating system and application. This rigid architecture led to significant inefficiencies. Servers often sat idle, underutilised, or were over-provisioned to handle peak loads, resulting in wasted resources and increased costs.
Managing these physical servers was a complex and time-consuming task, requiring dedicated hardware, power, cooling, and physical space. The advent of hypervisors has transformed this landscape. A hypervisor is essentially a software layer that creates and manages virtual machines (VMs) on a single physical server. These VMs are isolated environments, each with its own operating system and applications, as if they were running on dedicated hardware. This virtualisation technology, enabled by hypervisors, has transformed server management by allowing multiple workloads to coexist on a single physical server, significantly improving processes like resource utilisation, management, and disaster recovery.
The power of virtualisation
Virtualisation creates a flexible and efficient environment where resources can be dynamically allocated and managed, offering a number of benefits for your business, such as:
Improved resource efficiency
By consolidating multiple workloads onto a single physical server, hypervisors maximise hardware utilisation and reduce costs associated with underutilised resources. This means your business can make the most of its existing infrastructure while minimising unnecessary expenses.
Enhanced scalability
Hypervisors provide the ability to dynamically allocate resources, ensuring optimal performance and cost-efficiency. In other words, you can easily scale your IT infrastructure up or down to meet fluctuating demands. For example, a retail company can use hypervisors to dynamically scale its e-commerce platform to handle peak demand during holiday seasons, ensuring a seamless customer experience without over-provisioning resources.
Simplified management
Hypervisors offer a centralised platform for managing multiple virtual machines, reducing administrative overhead and accelerating service delivery. As a result, IT operations are streamlined, and your teams can focus on other business initiatives.
Disaster recovery and business continuity
Hypervisors help with building resiliency against disruptions by enabling the creation of virtual backups and disaster recovery plans. Consequently, critical data can be protected, and operations can be quickly restored in the event of hardware failure or other unforeseen incidents. For instance, if you're a manufacturing company, you can use hypervisors to create virtual backups of your production systems, ensuring that you can quickly restore operations in the event of a hardware failure.
Accelerated time-to-market
Finally, hypervisors allow for rapid deployment of virtual machines, enabling organisations to quickly provision new applications and services. This reduces time-to-market and helps businesses gain a competitive edge. For example, a software development company can use hypervisors to quickly test and deploy new software updates, accelerating product releases and increasing customer satisfaction.
Real-world impact
Hypervisors have proven to be invaluable in many industries. As mentioned above, they are an essential tool for any modern e-commerce business, enabling the platform to dynamically scale to meet peaks in demand during busy periods such as the festive season, ensuring a seamless customer experience. The same can be said for any organisation likely to experience sudden surges in online traffic. Take Hyve Managed Hosting's partnership with the National Television Awards here in the UK as an example.
During the awards' online voting period, we were able to increase server resources by 24 times without adding any physical hardware, ensuring a smooth voting experience even during peak traffic. This dynamic scalability, enabled by hypervisors, allowed us to meet fluctuating demands while optimising costs. Elsewhere, in healthcare, hypervisors are used to create virtual desktop infrastructures (VDIs), providing remote access to patient records and applications for doctors and staff. In financial services, hypervisors help institutions consolidate their IT infrastructure, reducing costs and improving disaster recovery capabilities.
Staying ahead with hypervisors
As cloud computing technologies continue to evolve and the industry faces frequent shifts due to market changes, it is vital to keep abreast of the latest trends and developments. This knowledge not only helps your organisation navigate the industry's volatility but also enables you to make informed decisions about your IT infrastructure and gain a competitive edge in the increasingly digital market. Hypervisors are essential tools that enable you to optimise your IT infrastructure, improve efficiency and increase scalability. By understanding the capabilities, benefits and recent advancements of this technology, you can identify new opportunities for growth and innovation in your organisation's digital transformation efforts.
### Cloud Computing's Role in Digital Transformation
Definition of Digital Transformation
Digital transformation refers to the process of leveraging digital technologies to fundamentally change how businesses operate and deliver value to their customers. It involves integrating digital tools and platforms into all areas of an organisation, fundamentally altering its operations, culture, and customer interactions. This transformation aims to improve efficiency, enhance customer experiences, and drive innovation, positioning businesses for long-term success in an increasingly digital world.
Overview of Cloud Computing
Cloud computing is the delivery of computing services - including servers, storage, databases, networking, software, and analytics - over the internet (the cloud). Rather than owning and maintaining physical servers and other hardware, businesses can access these resources on a pay-as-you-go basis from cloud service providers. This model allows for greater flexibility, scalability, and efficiency compared to traditional on-premises infrastructure.
Connection Between Cloud Computing and Digital Transformation
Cloud computing acts as a catalyst for digital transformation by providing the technological foundation needed for modernising IT infrastructure. Its inherent scalability, flexibility, and cost-effectiveness enable businesses to adopt new technologies and processes more rapidly. By leveraging cloud services, organisations can accelerate their digital initiatives, streamline operations, and enhance their ability to innovate and respond to market changes.
Key Benefits of Cloud Computing in Digital Transformation
Scalability and Flexibility
One of the most significant advantages of cloud computing is its scalability. Businesses can easily adjust their cloud resources based on current needs, scaling up during peak periods and scaling down during quieter times. This flexibility extends to the deployment of new applications and services, allowing organisations to quickly introduce and integrate innovations without the constraints of physical infrastructure.
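To make the idea of scaling up and down with demand concrete, here is a minimal, provider-agnostic sketch of autoscaling decision logic. The thresholds, bounds and instance counts are hypothetical placeholders; in practice you would lean on the autoscaling features of your chosen cloud platform rather than code like this.

```python
# Illustrative autoscaling decision logic (provider-agnostic): choose a new
# instance count based on average CPU utilisation, within fixed bounds.
MIN_INSTANCES, MAX_INSTANCES = 2, 20
SCALE_UP_AT, SCALE_DOWN_AT = 75.0, 25.0     # percent CPU, illustrative thresholds

def desired_capacity(current: int, avg_cpu: float) -> int:
    """Return the instance count to run next, given current load."""
    if avg_cpu > SCALE_UP_AT:
        return min(current * 2, MAX_INSTANCES)    # double under heavy load
    if avg_cpu < SCALE_DOWN_AT:
        return max(current // 2, MIN_INSTANCES)   # halve when quiet
    return current                                # otherwise leave as-is

if __name__ == "__main__":
    for cpu in (85.0, 50.0, 10.0):
        print(f"avg CPU {cpu}% with 4 instances -> run {desired_capacity(4, cpu)}")
```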
Cost Efficiency
Cloud computing offers substantial cost savings compared to traditional IT models. By eliminating the need for expensive hardware and maintenance, businesses can reduce capital expenditures. The pay-as-you-go pricing model further contributes to cost efficiency, enabling organisations to pay only for the resources they use. This approach helps in managing operational expenses more effectively and optimising IT budgets.
Enhanced Collaboration
Cloud computing facilitates real-time collaboration across teams and geographical locations. Cloud-based tools and platforms allow employees to work together seamlessly, share documents, and communicate effectively from anywhere. This enhanced collaboration can lead to increased productivity, faster decision-making, and a more cohesive work environment.
Improved Data Management and Analytics
Cloud solutions offer advanced data management and analytics capabilities. Businesses can store vast amounts of data securely and access it from any location. Cloud-based analytics tools provide powerful insights and reporting, enabling organisations to make data-driven decisions more efficiently. This improved data management and analytical capability is crucial for driving informed business strategies and gaining a competitive edge.
Cloud Security and Compliance in Digital Transformation
Importance of Cloud Security
As businesses transition to the cloud, securing cloud environments becomes critical. Cloud security involves protecting data, applications, and systems from threats and vulnerabilities. Ensuring robust security measures, such as encryption, access controls, and regular security assessments, is essential to safeguarding digital assets and maintaining business continuity during digital transformation.
Data Privacy and Protection
During digital transformation, safeguarding sensitive data is of utmost importance. Cloud providers often offer advanced data protection features, but businesses must also implement their own strategies to ensure data privacy. This includes understanding data compliance regulations, setting up proper data governance practices, and monitoring for potential security breaches to protect customer and business information.
Challenges and Considerations in Adopting Cloud for Digital Transformation
Migration Challenges
Migrating to the cloud can present several challenges, including data transfer issues, compatibility concerns, and potential downtime. Businesses need to plan and execute their migration strategies carefully, ensuring minimal disruption to operations. Working with experienced cloud migration experts can help address these challenges and ensure a smooth transition.
Cost Management
While cloud computing offers cost advantages, managing cloud expenses can be complex. Businesses should implement strategies for monitoring and optimising cloud costs, such as setting budgets, using cost management tools, and regularly reviewing usage patterns. Effective cost management helps avoid unexpected expenses and ensures that cloud investments deliver value.
Skills and Expertise
Successful cloud adoption requires the right talent and expertise. Organisations need skilled professionals who understand cloud technologies and can manage the transition effectively. Investing in training and development, or partnering with cloud experts, can help build the necessary skills and ensure that the cloud implementation aligns with business objectives.
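Returning to the cost-management point above, a practical first step is an automated budget check. The sketch below is a minimal, hypothetical illustration: it compares per-service spend from a usage export against agreed budgets and flags overruns. The figures and the dictionary-based "export" are placeholders rather than any provider's billing API.

```python
# Minimal, hypothetical budget check: compare per-service monthly spend from a
# usage export against agreed budgets and report any overruns.
monthly_spend = {"compute": 12400.0, "storage": 3100.0, "networking": 950.0}
budgets       = {"compute": 10000.0, "storage": 4000.0, "networking": 1000.0}

def over_budget(spend: dict[str, float], budget: dict[str, float]) -> dict[str, float]:
    """Return the services whose spend exceeds budget, with the overrun amount."""
    return {svc: cost - budget[svc]
            for svc, cost in spend.items()
            if svc in budget and cost > budget[svc]}

if __name__ == "__main__":
    for service, overrun in over_budget(monthly_spend, budgets).items():
        print(f"{service} is over budget by ${overrun:,.0f} this month")
```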
Recap of Key Points
In summary, cloud computing plays a pivotal role in driving digital transformation by offering scalability, flexibility, cost efficiency, and enhanced collaboration. It supports improved data management and analytics while addressing the need for robust security and compliance. However, businesses must navigate challenges such as migration, cost management, and acquiring the right skills. Embracing cloud computing can significantly accelerate digital transformation efforts, positioning organisations for future growth and success in a rapidly evolving digital landscape.
### The hidden costs of technical debt inaction
With technology moving at a rapid pace, you would be forgiven for thinking that many organisations are adapting their IT processes at a similar rate. However, many large businesses still operate on what would be deemed legacy systems; this covers any outdated computing system, hardware or software that is still in use. This is more widespread than you'd think. If you work within an enterprise organisation, it's highly likely you will have legacy applications somewhere running essential business processes. In fact, a 2024 study by the Financial Conduct Authority revealed that 92% of financial services firms surveyed still rely on legacy technology. Many businesses are tethered to legacy applications that only run on obsolete versions of Windows. If you're running day-to-day business-critical operations on these applications, the thought of upgrading or modernising them can be daunting. In many instances, organisations simply sidestep the problem or allow it to keep them awake at night. Yet there is another way, one that empowers businesses to maintain their legacy applications and transport apps to the cloud without the apps even knowing it's happened. It's all about making old applications stable in modern operating system environments, akin to changing the rug under your feet without it even registering. But by overlooking legacy systems entirely, organisations risk racking up 'hidden' costs. These include growing technical debt, fines for non-compliance, and increased security risks. As hidden costs build, there is a consistent outlay of funds and the potential for lasting damage to brand reputation and shareholder value. These hidden costs illustrate why maintaining and optimising your legacy system is vital for your business-critical apps. Organisations can keep the value of their legacy; they just need to give it a lifeline.
Accruing technical debt
The more debt you accrue, the harder it is to pay off. It's the same with technical debt: the more legacy systems are left to their own devices, the more outdated they become and the more costly they are to maintain. As these systems are often more inefficient and less reliable, they require frequent repairs. McKinsey labels technical debt as the "tax" an organisation pays to rectify existing issues, and its research says this tax amounts to a massive 40 percent of IT balance sheets. For example, some organisations have a large number of Windows 2008 or 2012 servers running critical business applications, including multi-tier applications running a web server, app server and database server. These Windows 2008 and 2012 servers are out of support, raising operational resilience concerns, heightening security risks and significantly increasing maintenance costs.
In the US, a 2020 report revealed that technical debt had "ballooned to over $2 trillion every year", with inefficiencies also contributing to 23-42 percent of development time being wasted "because of that tech debt". Crucially, these costs divert resources from more strategic investments.
Non-compliance and reputational damage
You might think you run a tight ship when it comes to compliance and regulatory processes. But if legacy systems are not maintained, then you may unwittingly be failing to comply with current data protection and privacy regulations. GDPR is a prime example of how legislation can significantly alter a firm's technical requirements, and obsolete software might not support necessary encryption or data handling practices. By risking non-compliance, you could face fines and legal issues that impact your business operations and balance sheet. But it's not just a monetary cost on the line. Many firms are tied to sustainability goals and ESG programmes. As the climate crisis deepens, greener IT policies have become increasingly vital. Allowing your technical debt to grow and relying on older hardware can be less energy-efficient and increase your carbon footprint. When software is more inefficient, performance can also be impacted. If systems are running slowly, customer satisfaction, especially given increased expectations in today's digital world, can be dented and business lost. Technical debt can also limit the company's ability to offer modern, customer-facing services and apps, contributing to possible revenue loss. All of these consequences can damage an organisation's reputation and directly impact business success and shareholder confidence.
Security vulnerabilities
It was only in July that a global outage hit some of Microsoft's services and products. Initially reported as a cyberattack, it was actually caused by a flawed update from the cybersecurity firm CrowdStrike, showing the extra diligence required when upgrading systems. Yet with cybercrime dramatically increasing year-on-year and ransomware attacks a constant threat, obsolete systems present major security vulnerabilities. As they lack the latest security patches and updates, they are at risk from cyberattacks and ransomware. And as these outdated servers might not be able to integrate modern security tools, technical debt also increases the risk of breaches. Cloud infrastructure is also seen as a key security requirement in today's business environment: software providers can make any security updates and patches remotely whenever they are needed. But technical debt can mean your applications are not cloud-compatible, resulting in a reliance on on-premise infrastructure that is harder and more costly to maintain and less scalable to suit business needs.
Don't forget about your legacy
The fact something is a legacy system means it works. Legacy systems are not the problem; it's how they are maintained and optimised. It's possible to migrate old applications and systems, on-premise and in the cloud, without needing to alter the actual system. But the maintenance of legacy infrastructure is crucial to avoiding technical debt. If technical debt builds, maintenance costs rise substantially and inefficient systems hamper productivity. Neglecting this software also means you could be falling foul of compliance requirements, failing to achieve ESG goals and failing to deliver against customer expectations, impacting your company's bottom line and reputation.
And in a world of surging cybercrime activity, obsolete systems can leave you vulnerable to far greater threats. So, if you procrastinate on your technical debt, you could end up footing a bill with much higher hidden costs than you think.
### Why are IT decision-makers under pressure to save?
What if I told you one in five IT decision-makers at SMEs across the UK feel rushed while implementing public cloud migration projects? Findings uncovered in our newly released independent research offer some suggestions for why this is happening. They also highlight some best practice principles that could reduce the pressure while helping achieve positive organisational change. Public cloud adoption strategies have matured in recent years, and many organisations are taking a longer-term view that aligns cloud strategy and core business objectives. For example, almost a third of respondents measure the success of their cloud transformation projects based on productivity increases, enhanced security, and better user experience. However, our data suggests that a quarter of IT decision-makers still migrate to public cloud environments primarily to reduce operational costs. Almost 30% want to minimise capital investment, and 32% cite improved flexibility (often a euphemism for cost savings).
Map out your journey
Unfortunately, if the case for public cloud adoption is based principally on cost savings, those delivering the project are more likely to face pressure from the outset. That's not to say public cloud adoption can't or won't deliver cost savings; it's just that there's a lot of work to do in terms of planning and implementation to get to that point. Application modernisation, for example, is central to realising savings but can be difficult to achieve and take far longer than expected. In addition, applications and data scheduled for migration might be more challenging to find than anticipated, causing project delays. Developing a well-thought-out plan at the outset, agreeing on outcomes, and setting realistic milestones will help considerably, as will clear communications to key stakeholders and the wider organisation during the project. We often talk about cloud migration as a journey. Extending that metaphor, try to keep those travelling with you comfortable, informed and (most importantly) on board. This will help manage short and long-term expectations, including a realistic point at which the project could deliver savings and efficiencies. With 20% of respondents saying that board-level resistance and cultural challenges were barriers to cloud adoption, it's certainly advice that's worth considering.
Cloud and business strategy in alignment
To be successful, an organisation's cloud adoption strategy should be an extension of its business strategy. So, it's reassuring that our research shows some alignment between the top objective for the next 12 months (improving security) and the top driver for cloud adoption (security). Our data also shows that some respondents are migrating to public cloud environments to access innovative technology and adopt new applications that require a hosted environment. These are mature and rational drivers with realistic outcomes, and they will play a major part in improving business efficiency, customer experience, and staff productivity. It's likely that public cloud migration projects such as these will have well-thought-out long-term goals.
As a result, IT decision-makers will probably experience less pressure to deliver immediate short-term outcomes since everyone will understand the timeframe at the outset, and the ultimate goal should be achievable.
Countering a shortage of cloud talent
According to our research, almost eight out of 10 UK SMEs have experienced unexpected costs or cloud-related budget overruns. This is bound to increase stress and ramp up the pressure to rush implementations. Our data suggests it's more common in the public than the private sector, and most common of all in government cloud projects. Indeed, over 90% of government sector-related SMEs experienced unexpected costs or budget overruns, as did almost 85% of SMEs working in the education sector. Organisations implementing their own public cloud migrations can be more prone to unexpected costs, especially if they opt for a basic 'lift-and-shift' approach. One way to counter this is to consider a cloud-native strategy from the outset. Exploring operating model changes, regularly reviewing architecture and automating critical processes could also help keep costs under control. This can be a big task for some teams, especially if, like almost 20% of respondents, they're concerned that a lack of skills could be a barrier to cloud adoption. IT decision-makers are doing their best to address this issue, but our research found that three-quarters have faced challenges recruiting cloud talent. Nearly 60% have implemented initiatives to upskill the existing workforce, while half have outsourced to third parties such as managed service providers. Although this route might look more expensive on paper, it will ultimately save money, help align business and cloud strategy and maintain a strong security posture.
It's not too late to change course
As a closing thought, it's worth considering that public cloud adoption is not a one-way journey. Some businesses find applications can't or don't scale in their new environment. This leads them to wonder if they might be better suited to a different ecosystem, perhaps returning them to on-premises infrastructure and/or private or managed data centres. Perhaps this explains the rise in hybrid cloud adoption and why, according to our research, hybrid cloud environments are now twice as popular as public cloud ones. While it's preferable to write and gain sign-off on a public cloud adoption strategy at the start of the project, it's never too late to draft one, and doing so might just help reduce stress levels, realign goals, support momentum, and keep everyone on board. The full version of Six Degrees' UK SME Cloud Intelligence Report 2024 can be downloaded here.
### Ensuring AI Success in Telecommunications
Like many sectors, the telecommunications industry faces a tough economic landscape. Back in 2022, Bain Capital predicted that telcos would struggle with increased personnel costs and escalating operating costs due to inflation. We're seeing that occur now, meaning telcos must proactively seek opportunities to optimise revenue streams and streamline operations. Data plays a pivotal role in powering cost control strategies, with automation and digital transformation initiatives helping not only to drive long-term cost reductions but also to increase efficiency.
A recent Boston Consulting Group report found that leading telecoms companies are "radically optimising costs through next-generation network architecture and core-to-cloud transformation" as well as "exploring ways to deploy generative AI that will transform each step in the industry's value chain." For telcos to not only survive, but thrive, in an increasingly competitive market, they'll need to make strategic use of the cloud and AI to enhance efficiency and scalability in a cost-effective way.
Reduce cloud costs by running large, regular data workloads on private cloud
Public cloud is flexible, agile and accessible, making it a great option for innovative businesses with varying project workloads. However, most telcos experience "sticker shock" when public cloud operations go into full production and look to FinOps to counter this. But careful workload management, combined with data observability, can help with cost reduction by identifying more efficient alternatives, such as moving more mature workloads to private cloud. But telcos need to have visibility of their data and workloads first.
Hybrid cloud data architecture for managing workloads
Due to regulatory and security policies, some workloads or sets of data cannot be stored on public cloud infrastructure. At the same time, workloads can vary greatly in terms of resource requirements when they're initially deployed, making it cheaper to run them in the public cloud at first. Telcos need to have the flexibility to move between public and private clouds to ensure they are compliant with evolving regulations and company policy. To be successful, they must have the ability to move workloads at will, enabling them to ensure the best option is always available.
Deploying generative AI to identify and resolve network anomalies
Telco networks are very complex: often multi-vendor, integrating telecom and IT standards with wireless, wired and legacy models. Such complex structures make finding network defects a time-consuming and challenging task. Institutional knowledge of the network compounds this, as it often sits with just a small group of individuals. Generative AI can help to discover network issues as they arise and translate them into natural language. It can also suggest causes and resolutions based on historical performance, vendor communities and other documentation.
Using data and generative AI to build customer profiles for personalised support
Traditionally, customer profiles were based purely on data like billing information, payment history and customer service interactions. However, networks can draw upon layers of valuable passive experience data from customers, building nuance into each customer interaction. Telcos can utilise generative AI to gather new sources of data, for example by instantly translating voice interactions and suggesting recommendations, as well as providing sentiment analysis and CSR guidance in real time. However, these innovations all rely on data. Telcos have an abundance of it at their disposal, and being able to harness data's power will be key to cutting costs. Whether it's enabling AI by giving it access to a complete set of quality data, or making data-driven decisions over workloads, telcos need to deploy a modern data architecture that will allow them to drive insight from their data, no matter where it resides.
Data as a competitive edge
Generative AI offers huge potential, as does workload analytics. But the impact of these tools depends on data.
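As a small, hypothetical illustration of the real-time sentiment analysis mentioned above, the sketch below scores a couple of customer utterances with the Hugging Face transformers sentiment pipeline (which downloads a default model on first run) and flags strongly negative ones for CSR attention. The utterances and the 0.8 threshold are assumptions for illustration only.

```python
# Minimal sketch: score customer utterances and flag negative ones for a CSR.
from transformers import pipeline

sentiment = pipeline("sentiment-analysis")   # default model, downloaded on first run

utterances = [
    "My broadband has dropped three times today and I'm losing work.",
    "Thanks, the engineer visit yesterday fixed everything.",
]

for text in utterances:
    result = sentiment(text)[0]              # {'label': 'POSITIVE'/'NEGATIVE', 'score': ...}
    if result["label"] == "NEGATIVE" and result["score"] > 0.8:
        print(f"Escalate to CSR: {text!r} (confidence {result['score']:.2f})")
    else:
        print(f"No action needed: {text!r}")
```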
With the industry looking to ride out this period of economic uncertainty, only the telcos that can capitalise on the potential of data to drive innovation will truly flourish in the long term.
### The growing threat of ransomware in healthcare
In the dynamic landscape of healthcare and life sciences, data security and the provision of care must be balanced. Recent attacks on healthcare settings highlight this conflict. The Synnovis ransomware attack in June 2024 revealed how even critical infrastructures remain vulnerable and how complex and intertwined supply chains are, even in public services. This creates cascading impacts when failures or attacks occur. As the pathology laboratory that provides blood testing services to many in London and the wider UK, Synnovis is a vital partner. The ransomware attack on the laboratory plunged blood sampling tests from 10,000 per day in London hospitals to just 400 per day, and forced the affected hospitals to postpone a total of around 800 operations and around 700 outpatient appointments. Another attack saw 12.9 million Australians (almost half of the Australian population) caught up in a hack on the electronic prescriptions provider MediSecure. Victims may never be told their personal information has been compromised, with the Australian prime minister saying he wasn't aware if he was one of the victims. It was revealed that 6.5TB of data had been compromised after a ransomware attack on a database server, which was discovered by the company in April. MediSecure went into administration after the hack, forcing healthcare providers to scramble to find an alternative and distracting them from delivering patient care. Closer to home, the threat landscape has resulted in politicians in the UK wanting to ensure companies are more transparent. Initial ideas are being discussed on whether all victims of ransomware attacks should be required to report incidents to the government. Under these proposals, affected businesses would also have to obtain a licence before making extortion payments. A complete ban on ransom payments for organisations involved in critical national infrastructure is also being proposed by some. The ban is intended to remove the incentive for hackers to disrupt these critical services by preventing them from monetising attacks. It would, however, do little to reduce wiper attacks, as the nation-state actors behind them are focused on destabilisation and destruction rather than financial gain.
How healthcare organisations can respond
The risk of successful cyberattacks on the well-being and lives of citizens will continue to drive politicians to enact new rules and regulations with the aim of strengthening levels of resilience. So there is likely to be more to come. Healthcare institutions should start now on their journey to build resilience to destructive cyberattacks, both to ensure the continuity of care to their patients and to ease the future burden of regulatory compliance once legislation is enacted. The following steps are essential in this journey:
Understanding data precisely: Companies need to know exactly what data they have, what value it has in the delivery of health products and services and what regulatory obligations surround that data. Only then can they report to the authorities which data was corrupted or exfiltrated in any attack. Companies must index and classify their data, including classifying it against their relevant records strategy.
This should include not just structured data sources like databases, but also all of the unstructured data scattered across an organisation.
Regulating access: Once the data has been correctly classified, organisations can automatically enforce rules and rights that regulate access to it.
Surviving attacks: For an organisation to continue to deliver its services to patients and its supply chain, it must maintain the ability to rapidly recover its critical services into a trusted state. A ransomware or wiper attack may mean nothing will work: door access control, email, voice-over-IP telephone systems, authentication servers controlling access to medical devices. The systems needed to investigate the incident and determine root cause may have been taken out, so the IT operations team may need to help the security operations team recover the response capability before the investigation can start. Isolating infected hosts and networks to contain the incident can cause issues with some security tooling, further complicating response and delaying recovery. Recovery without investigation and mitigation leaves the gaps in security controls, the vulnerabilities, the persistence mechanisms and even the phishing emails that kicked off the whole incident languishing in recovered email inboxes. We need to speed up investigation and enable rapid response to minimise the impact on patient care. Being prepared for an attack by establishing a clean-room capability beyond the reach of the adversary, one that can be stood up in minutes and contains trusted tooling and other incident-handling essentials such as workflows and contact lists, is critical to speeding up response, and therefore the recovery of critical services. The ability of the clean room to support investigation, even when systems have been isolated, speeds these efforts further. Incident preparedness and response form the basis of most recent regulations, including NIS 2, DORA and GDPR.
Choosing an efficient backup and recovery solution for critical patient data
Healthcare institutions grapple with many challenges in managing and protecting electronic health records (EHR) and electronic medical records (EMR). These challenges stem from the sheer volume and complexity of healthcare data, encompassing patient records, imaging files, lab results, and administrative documents, which require meticulous management and safeguarding. Healthcare providers must navigate stringent regulatory requirements, necessitating robust data protection measures to ensure compliance and mitigate the risk of legal penalties. Time is of the essence in patient care and treatment, so any downtime can have serious health implications. Therefore, healthcare institutions need a modern backup and recovery solution that offers high performance, allowing for rapid backup and recovery processes to minimise disruption to patient care and operational workflows. Rapid access to accurate patient information is essential for delivering timely and effective care, making the speed and efficiency of backup and recovery processes critical. Protecting data integrity is also non-negotiable in healthcare. Immutability features that prevent unauthorised alteration or deletion of backup data ensure the integrity and reliability of system backups.
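As a small illustration of the integrity side of this, the sketch below records a SHA-256 digest when a backup is written and verifies it before a restore. It is a hypothetical, minimal example: the file paths are placeholders, and true immutability would still rely on platform features such as WORM or object-lock storage rather than on a check like this alone.

```python
# Minimal integrity check for backups: record a SHA-256 digest at backup time,
# then verify it before restoring. Illustrative only; not an immutability feature.
import hashlib, json
from pathlib import Path

MANIFEST = Path("backup_manifest.json")   # hypothetical digest manifest

def sha256(path: Path) -> str:
    h = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(1024 * 1024), b""):
            h.update(chunk)
    return h.hexdigest()

def record_backup(path: Path) -> None:
    manifest = json.loads(MANIFEST.read_text()) if MANIFEST.exists() else {}
    manifest[str(path)] = sha256(path)
    MANIFEST.write_text(json.dumps(manifest, indent=2))

def verify_before_restore(path: Path) -> bool:
    manifest = json.loads(MANIFEST.read_text())
    return manifest.get(str(path)) == sha256(path)

if __name__ == "__main__":
    backup = Path("ehr_export_2024-06-01.tar.gz")   # hypothetical backup file
    if backup.exists():
        record_backup(backup)
        print("Safe to restore:", verify_before_restore(backup))
```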
By considering the above factors and partnering with a trusted data management provider, healthcare and life sciences organisations can enhance their cyber resilience, improve response and recovery capabilities, and safeguard the continuity of patient care and medical research initiatives in light of the growing threat of ransomware.
### Data Tips: Protecting Your Organisation From Insider Theft
Data security is no longer a choice, it is a necessity. The threats we face from within our organisations are growing faster than ever. In fact, insider incidents have surged by a staggering 47%, with the cost of each breach now hitting companies hard at around $15 million on average. These are not just numbers; they reflect a real and pressing threat that businesses everywhere are dealing with. To stay ahead, protecting your data from insider threats is essential. Therefore, organisations must implement effective strategies to mitigate this risk. Here are the key steps to protect your organisation from insider data theft:
Identify and Classify Sensitive Data
The first step is thoroughly identifying and classifying the data that is most critical to your organisation. Understanding what data you have, where it is stored, and who has access to it is the foundation of any data security strategy. This process should cover intellectual property, financial records, personal data, and other sensitive information. Data classification helps you prioritise protection efforts and ensures that you are focusing your resources on the most valuable assets.
Understand the Types of Insider Threats
To effectively combat insider data theft, it is important to first understand the different types of insider threats. Insiders fall into three main categories:
Malicious Insiders: These are individuals who intentionally steal, leak, or misuse data for personal gain, to harm the organisation, or to help a competitor.
Negligent Insiders: These individuals don't intend to cause any harm; however, their lack of awareness or carelessness leads to data breaches. For example, an employee who falls for a phishing scam or accidentally shares confidential information with the wrong person.
Compromised Insiders: These are individuals whose credentials have been stolen by an external attacker, who then uses them to access sensitive information.
Understanding these categories will help in designing targeted strategies to mitigate the risks associated with each type.
Implement Strong Access Controls
Not everyone in your organisation needs access to all data. Implementing the principle of least privilege ensures that individuals only have access to the data that is necessary for their role. This minimises the damage that can be done if an insider plans to steal data. Regularly review and update access controls to ensure they align with current roles and responsibilities. Use multi-factor authentication (MFA) to add an extra layer of security, making it more difficult for compromised credentials to be used maliciously.
Conduct Regular Employee Training
Awareness is a powerful tool in preventing insider threats. Regular training sessions should be conducted to educate employees about the dangers of insider threats, the importance of data security, and how to recognise possible threats. Training should cover topics like phishing, social engineering, and the proper handling of sensitive information. Employees should also be made aware of the legal and professional consequences of data theft.
When employees understand the seriousness of these actions, they are more likely to act responsibly.
Monitor User Activity
Monitoring user activity is crucial for detecting insider threats. By tracking user behaviour, organisations can identify unusual activities that may indicate a breach. For example, an employee accessing a large volume of sensitive files outside of normal business hours could be a red flag. Advanced monitoring tools can help detect anomalies in real time and provide alerts to the security team. It is important to strike a balance between monitoring and privacy: employees should be informed that their activities are being monitored as part of the organisation's security efforts.
Implement Data Loss Prevention (DLP) Solutions
Data Loss Prevention (DLP) solutions are designed to prevent sensitive information from leaving the organisation. These tools can monitor and control data transfers, whether through email, cloud storage, or removable media. DLP solutions can also enforce encryption and prevent the unauthorised moving, copying, or sharing of sensitive data. By setting up rules and policies that align with your organisation's security needs, DLP solutions can help stop insider data theft before it happens.
Conduct Regular Security Audits and Compliance Checks
Regular security audits are necessary to ensure that your data protection measures are effective and compliant with industry regulations. These audits should evaluate access controls, data handling practices, and user activity logs. Compliance checks should also be part of this process, especially if your organisation is subject to regulations such as GDPR, HIPAA, or SOX. Automated tools can help streamline audits and provide comprehensive reports, making it easier to identify and address vulnerabilities.
Establish Data Anonymisation Practices
A highly effective but often overlooked strategy for protecting against insider data theft is data anonymisation. It involves transforming personal and sensitive data into a form that cannot be traced back to an individual, making it less useful to insiders with malicious intent. This technique can be particularly effective in environments where employees need to access large datasets but do not need to see the identifying details.
Prepare for the Worst with an Incident Response Plan
Despite your best efforts, it is important to be prepared for the possibility of insider data theft. An incident response plan should be in place to quickly and effectively address any security breach. This plan should include steps for identifying a breach, containing the damage, and communicating with affected parties. Regular drills and simulations can help ensure that your team is prepared to respond swiftly and effectively in the event of a breach. The faster you can respond to a security incident, the less damage it will cause.
Utilise Advanced Threat Detection and Response Tools
To stay ahead of insider threats, organisations need advanced tools that can detect and respond to threats in real time. These tools can analyse huge amounts of data to detect patterns and anomalies that traditional security measures might miss. By integrating threat detection with automated response capabilities, you can swiftly neutralise threats and reduce the impact of data breaches.
Conclusion
It is important to remember that even the most sophisticated security systems are only as strong as the people who use them.
Conclusion
It is important to remember that even the most sophisticated security systems are only as strong as the people who use them. Equip your employees with the knowledge and tools they need to protect your organisation from within. In the end, a strong organisation is about building a resilient and informed workforce that values and protects the data at the core of your business. By taking the necessary steps, you are not just preventing insider threats; you are protecting the future of your organisation.

### Common e-commerce vulnerabilities and how to combat them

The e-commerce landscape is a dynamic and rapidly evolving sector, propelling businesses into unprecedented realms of growth and customer engagement. However, with these opportunities come significant security challenges. E-commerce platforms are prime targets for cybercriminals, given the vast amounts of sensitive data they handle. One report by the Internet Crime Complaint Center found that around $12.5 billion was lost to cybercrime in 2023. Understanding the common vulnerabilities and implementing robust strategies to mitigate them is crucial for safeguarding your business and ensuring customer trust. Here, I will explore five common e-commerce vulnerabilities and how to combat them effectively.

SQL injection attacks
Structured Query Language (SQL) injection attacks occur when cyber criminals exploit vulnerabilities in a website's code to execute arbitrary SQL queries. This can lead to unauthorised access to the database, compromising sensitive customer information. In one recent incident, the data of over two million people was compromised due to an SQL injection attack. There are several ways this can be prevented. One of these methods, input validation, ensures that data meets certain requirements before it is processed. These requirements could include a limit on the data type, ensuring inputs are within a certain numerical range, or adherence to a specific format. Parameterised queries are specifically designed to prevent SQL injection by separating SQL data from code (see the sketch below). While input validation applies to all forms of user input, parameterised queries apply only to SQL statements. Regular security audits will help to identify and correct any vulnerabilities in the code and database configurations. However, it's essential that these audits form part of regular security maintenance rather than a one-off security measure.

Cross-site scripting
Cross-site scripting (XSS) attacks involve injecting malicious scripts into web pages viewed by other users. These scripts can then redirect users to phishing sites or steal session cookies. One study recently found that an XSS vulnerability within a Google subdomain allowed hackers to perform various malicious attacks, including phishing and distributing malware. Output encoding is a technique that transforms any data sent from a server to the user's browser into a safe format before it is displayed on the user's webpage. This ensures that any data provided by users is displayed only as text, rather than executable code. A Content Security Policy (CSP) also provides additional security by giving developers the ability to control which resources can be executed on web pages. This blocks malicious users from executing their own content on the page.
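Before moving on, here is a minimal sketch of the parameterised-query defence described under SQL injection above, using Python's built-in sqlite3 module. The table and column names are invented for illustration, and the same placeholder pattern applies to whichever database driver your platform actually uses.

```python
import sqlite3

def find_customer(conn: sqlite3.Connection, email: str):
    # The "?" placeholder keeps the user-supplied value out of the SQL text,
    # so it is always treated as data and never as part of the query.
    cur = conn.execute("SELECT id, name FROM customers WHERE email = ?", (email,))
    return cur.fetchone()

# The vulnerable alternative splices raw input straight into the statement,
# which is exactly what SQL injection exploits:
#   conn.execute(f"SELECT id, name FROM customers WHERE email = '{email}'")
```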
Cross-site request forgery
Cross-Site Request Forgery (CSRF) attacks trick users into performing actions they did not intend, such as changing account details or making unauthorised transactions, by exploiting their authenticated session with a trusted site. Anti-CSRF tokens guard against this: the server generates a token, and that token is validated before a user's request is processed. When a user initiates a session on a site, such as by logging in, a unique token is issued. This token is then stored on the server for the duration of the session. Whenever the user attempts an action on the site, like submitting a form or changing their details, the token is submitted with the request. The server then checks whether the token is valid; if the token is missing or invalid, the server rejects the request. However, it's essential to ensure that tokens expire after a certain period or when the user logs out. It's also important to use secure channels to transmit these tokens so that attackers cannot intercept them. (A rough sketch of this flow appears at the end of this article.)

Weak authentication and session management
Poor authentication mechanisms, such as weak passwords or session management flaws, can lead to unauthorised access and data breaches. Additional security factors, like multi-factor authentication, create an extra layer of security that significantly reduces the likelihood of unauthorised access. One study found that an eight-character password can be cracked in as little as five minutes, so it's important to implement and enforce strong password policies and regular password changes. Passwords should be complex, with a variety of characters, and at least 12 characters long for additional security. It's important to prioritise best practices for session management too, including setting session timeouts and regenerating session IDs on login.

Unpatched software and vulnerabilities
One study found that thousands of WordPress plugins have vulnerabilities. Plugins can be exploited by attackers to gain access to e-commerce systems and data. One such plugin vulnerability allowed hackers to redirect visitors to malicious sites, such as phishing sites or those that download malware. Run regular vulnerability scans to identify and address security weaknesses. To minimise manual checking, automated tools can be used to identify any malware or vulnerabilities. Planning and keeping on top of regular software updates can also prove challenging, so utilising automated patch management solutions is essential to ensure timely updates. Effective plugin management also requires diligent checking of apps and plugins used on the website to ensure that they are from verified sources.
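To round off this section, here is a rough sketch of the anti-CSRF token flow described earlier: issue a per-session token, store it server-side, validate it on each state-changing request and expire it on logout. It is deliberately simplified, with an in-memory dictionary standing in for real session storage, and is not tied to any particular web framework.

```python
import secrets

# Illustration only: a real application would keep this in its session store.
_csrf_tokens: dict[str, str] = {}

def issue_csrf_token(session_id: str) -> str:
    token = secrets.token_urlsafe(32)
    _csrf_tokens[session_id] = token   # kept server-side for the session
    return token                       # embedded in forms sent to the browser

def verify_csrf_token(session_id: str, submitted: str) -> bool:
    expected = _csrf_tokens.get(session_id)
    # Constant-time comparison; reject if the token is missing or wrong.
    return expected is not None and secrets.compare_digest(expected, submitted)

def end_session(session_id: str) -> None:
    _csrf_tokens.pop(session_id, None)  # the token expires when the user logs out
```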
### Importance of Runtime Security for Cloud Native Environments

Runtime security plays a critical role in protecting cloud-based workloads from various threats, including data breaches, unauthorised access, and malicious attacks. Cloud environments often involve multiple services, platforms, and providers, resulting in a complex and dynamic infrastructure. This complexity can make it challenging to manage and monitor security effectively. This challenge is just going to grow. According to Gartner, "90% of global organisations will be running containerised applications in production by 2026 – up from 40% in 2021." That's hardly surprising: cloud native application development means we can introduce new code more quickly. However, such dynamism in these environments comes with increased vulnerability to runtime attacks. Preventing risky code from running involves more than just using assurance gateways in the repository and CI/CD pipelines. Static analysis isn't enough.

For example, containers should be immutable, but if they aren't, as is often the case, an attacker can execute a fileless attack, loading malware directly into memory and executing it there, evading common defences and static scanning. The problem is that security professionals often struggle to implement security guardrails and can fail to spot the difference between authorised and illegitimate application behaviour in real time. Furthermore, the challenge of preventing attacks in cloud native environments is exacerbated by their time-sensitive nature and the ease of lateral movement in comparison to on-premises environments. So, when a breach occurs, it's harder to stop. Let's look a little more closely at the issues involved.

Traditional security challenges
Cloud native environments are built on dynamism, introducing unmatched versatility and efficiency, but also changing how they must be secured. Perhaps the most significant challenge in runtime security is poor visibility of the dynamic architecture. Traditional security procedures are static by design and so find it difficult to keep up with the fleeting nature of cloud native implementations. Static analysis tools and pre-deployment security scans are helpful but fall short of fully addressing the risks associated with runtime threats. Similarly, Endpoint Detection and Response (EDR) solutions are better adapted to protecting endpoint devices, usually based on Windows, and aren't as well suited to the attacker's modus operandi in cloud-based applications. While EDR agents are crucial for traditional endpoint security, they struggle in an environment with a constantly shifting perimeter because of scalability, management complexity, and visibility challenges. Instead, runtime security embraces a variety of processes and technologies which are specifically designed to identify, stop and ameliorate threats during application execution. These defences work in real time, overseeing application behaviour and the underlying infrastructure to detect and mitigate malicious activity quickly.

Key runtime security capabilities
When assessing cloud native security solutions, it is vital to identify the key capabilities that will deliver robust security and seamless operations across multiple cloud platforms. First: the ability to secure workloads of all types across public, private and hybrid clouds, ensuring a thorough security posture throughout your cloud transformation journey, and protecting against various attacks on different architectures. Runtime security solutions should also use eBPF (extended Berkeley Packet Filter), which provides increased efficiency and visibility for cloud native ecosystems. eBPF's kernel-level integration delivers high efficiency at low overhead, making it the perfect fit for high-performance environments. It also provides a profound view of system calls, network packets and other kernel-level activity, allowing detailed monitoring and identification of malicious operations. That means eBPF can enforce security policies in real time, offering instant responses to threats while minimising liabilities. In addition, runtime security solutions must use a multi-layered approach to identify potentially dangerous workload behaviour, whether provoked by bad actors or by risky applications which expose the environment to threat actors.

Let's look at the reverse-shell attack as a use case for runtime security: it dodges traditional firewall and network security settings, enabling the attacker to remotely execute commands and carry out malicious operations on the victim's system. By identifying indicators of attack (IoA) at runtime, this pattern of attack can be detected and stopped in real time.
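As a toy illustration of the reverse-shell use case above, the host-level Python sketch below (using the psutil library) flags interactive shells that hold established outbound network connections, a classic reverse-shell indicator. It is only a heuristic for illustration; production runtime security tools work at the kernel/eBPF level and correlate far richer signals.

```python
import psutil

# Common interactive shells; a shell process with an established outbound
# connection is a classic (if simplistic) reverse-shell indicator.
SHELL_NAMES = {"sh", "bash", "dash", "zsh"}

def suspicious_shells():
    findings = []
    for proc in psutil.process_iter(attrs=["pid", "name"]):
        try:
            if proc.info["name"] in SHELL_NAMES:
                for conn in proc.connections(kind="inet"):
                    if conn.status == psutil.CONN_ESTABLISHED and conn.raddr:
                        findings.append((proc.info["pid"], proc.info["name"], conn.raddr))
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # processes can exit or be protected mid-scan
    return findings

for pid, name, raddr in suspicious_shells():
    print(f"ALERT: shell pid={pid} ({name}) has an established connection to {raddr}")
```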
There are essentially four layers of runtime protection to consider:

Prevent: Prevent attacks by reducing the attack surface. Hardening the environment, such as limiting the ability to run images as root or preventing images with high-severity CVEs from running in production, can be very helpful in closing off open doors and windows for an attacker. The most effective hardening measure is preventing an attacker's ability to add executables to a running container. Containers should be immutable, and if kept as such, you're making the attacker's life very difficult. Of course, it's crucial to implement automatic blocking of all known indicators of compromise, such as malware, rootkits and crypto miner identifiers.

Detect: Detect threats in real time by observing suspect patterns, uncovering behavioural anomalies and using real-world threat intelligence.

Stop: Stop attacks across workloads by preventing complex zero-day attacks, stopping malware in real time and blocking exploitation of vulnerabilities.

Respond: Investigate and respond faster by gathering forensic data, reporting attack impact and mitigating attacks across all stages.

Look for a solution that provides genuine cloud native workload protection, risk minimisation, and attack prevention, which extends beyond simple risk management and offers security teams comprehensive visibility and the tools they need to stop complex threats from being carried out in runtime.

The imperative of runtime
If companies want to adopt the benefits of cloud computing without compromising security, it is vital they wrap their heads around the significance of runtime security in cloud native scenarios. By implementing runtime security processes and embracing a proactive approach to threat detection and mitigation, companies can protect their cloud native applications in an ever-changing threat environment.

### Three Ways Automation Boosts Cloud ROI

As businesses increasingly rely on cloud services, the worldwide public cloud computing market continues to grow, with an estimated value of 675 billion U.S. dollars expected by the end of 2024. Cloud services promise scalability and agility, but this comes with challenges like tracking spending, optimising resources, and ensuring performance. Issues like cloud sprawl, overspend and slow adoption are leading many IT leaders to turn to automation and advanced software tools as solutions. These tools help organisations streamline processes, reduce costs, and enhance overall operational efficiency. In a recent survey, 36% of software developers relied on automation tools to understand code errors and receive suggestions for fixes. With more businesses likely to turn to automation to reduce cloud spend, here are three ways automation tools can help organisations maximise their ROI on cloud investments.

Optimising resource utilisation
Cloud environments are incredibly dynamic, with workloads scaling up and down based on demand. However, without proper oversight, resources can be over-provisioned or underutilised, leading to unnecessary or spiralling costs.
Think about how frustrating it is to overpay for resources you don't use or to scramble for more when you need them – automation tools can offer a solution for this. By continuously monitoring cloud usage and adjusting resources in real-time, these tools can help to ensure you only pay for what you actually need. This predictive capability helps in making informed decisions about resource allocation and scaling, and ultimately boosts ROI down the line. Automation tools can also help generate detailed data analysis, providing insights into how often different data types are accessed and who by, as well as peak usage times, and geographic data distribution. Additionally, these tools can identify patterns and trends that might not be immediately obvious, enabling proactive management and allocation of resources. This data-driven approach not only improves resource management but also informs strategic decisions, allowing businesses to effectively align their cloud resources with operational needs. By using automation for resource scaling and data analysis, organisations can streamline operations, eliminate waste, and maximise their ROI. Enhancing performance and reducing downtime Nothing kills productivity, user experience, and revenue like downtime, so it’s crucial to ensure high availability and minimise disruptions. Automated performance monitoring and management can spot anomalies before they become problems. These tools can track performance metrics, such as response times, latency, and error rates, and set up alerts to proactively keep everything running smoothly. Automation goes beyond just managing resources – it streamlines and automates repetitive tasks across various departments, from finance to human resources and IT. This reduces processing time, minimises human error, and fosters improved collaboration. This integration not only simplifies operations but also reduces overhead costs associated with physical infrastructure and manual oversight. Additionally, automation can help in maintaining compliance with industry regulations by ensuring consistent application of policies and procedures, thus reducing the risk of non-compliance penalties and enhancing security – therefore boosting ROI overall. Streamlining cost management and reporting Understanding cloud costs across departments, projects, and services can be daunting, and manual reporting processes are time-consuming and error-prone. This is where automation tools come in. They can revolutionise cost management in cloud environments by providing real-time tracking and analysis of expenses, offering detailed insights and automated reports on spending patterns which can support budgeting and forecasting. They can even pinpoint unused or underutilised resources and suggest cost-saving measures like right-sizing instances or switching to more cost-effective pricing plans. By providing actionable insights and automating cost management tasks, these tools help businesses make data-backed financial decisions to boost ROI. Beyond real-time tracking, advanced automation tools can further streamline cost management through accurate cost allocation and chargeback mechanisms, ensuring cost transparency and accountability across departments. Additionally, these tools enforce cost control policies, benchmark against industry standards, and ensure compliance and security. User training and adoption strategies also play a crucial role in maximising the effectiveness of these tools, ultimately improving overall ROI and operational efficiency. 
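As a rough illustration of the kind of automated reporting described above, the sketch below aggregates spend by team tag and flags under-utilised resources from a usage export. The records, tags and thresholds are invented for the example; a real implementation would read them from your cloud provider's billing and monitoring data.

```python
from collections import defaultdict

# Toy usage export: (team tag, resource, monthly cost in GBP, avg utilisation %).
USAGE = [
    ("payments",  "vm-prod-01",   1200.0, 71),
    ("payments",  "vm-prod-02",    950.0,  9),
    ("marketing", "db-analytics",  640.0, 55),
]

def cost_by_team(usage):
    """Roll spend up to the team tag for chargeback-style reporting."""
    totals = defaultdict(float)
    for team, _resource, cost, _util in usage:
        totals[team] += cost
    return dict(totals)

def right_sizing_candidates(usage, threshold=20):
    """Resources running below the utilisation threshold are candidates for downsizing."""
    return [(resource, cost, util) for _team, resource, cost, util in usage if util < threshold]

print("Spend by team:", cost_by_team(USAGE))
print("Right-sizing candidates:", right_sizing_candidates(USAGE))
```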
Automating repetitive reporting tasks streamlines cost management, optimising resource utilisation and enhancing cloud performance. These tools not only free up your team from manual chores but also ensure that every financial decision is based on empirical data. This approach keeps your spending in check and empowers you to make informed choices that drive up your ROI. By integrating automation into the cloud environment and leveraging purpose-built software tools, organisations can effectively work through the complexities of cloud management and harness the benefits of their cloud environments. With these automation tools in your corner, managing cloud expenditure becomes easier and more efficient, allowing you to redirect resources where they matter most – fuelling growth, innovation, and competitive advantage across your organisation. Remember, it's not just about cutting costs; it's about making informed decisions that drive growth and innovation to future-proof your business.

### Understanding the Dynamic World of Computer-Aided Dispatch

The history of computer-aided dispatch (CAD) can be traced back to the 1960s, when there was strong demand for more effective ways to handle emergency calls. Since then, the volume of 911 calls has surged across the globe; on average, around 600,000 911 calls are made per day in the United States. Some of the benefits of CAD systems are as follows:
Back-end call taking and incident entry application
Incident tracking and status updates
Geographic Information Systems integration
Reporting and data analysis

Today, CAD services have been widely adopted globally, and CAD has become a vital tool that has transformed the way emergency services operate. Let us look at why so many industries are adopting CAD services. The world of CAD has come a long way from its early days, and today's systems are packed with high-tech features that sound like they're straight out of a sci-fi movie.

Artificial Intelligence and Predictive Analytics
AI is being increasingly incorporated into CAD systems to enhance decision-making processes. By analysing historical data and real-time information, AI can predict potential incidents and suggest optimal resource allocation, thereby improving response times and outcomes. The NextGen 911 initiative, championed by the US National Emergency Number Association, called in 2023 for modernising the nation's computer-aided dispatch infrastructure.

GIS Integration
Geographic Information Systems are now integral to CAD platforms, providing dispatchers with a real-time, visual representation of incidents. This spatial awareness is crucial in managing large-scale emergencies, coordinating multi-agency responses, and optimising route planning for first responders. GIS technology is being used by more than 80% of the world's leading corporations to obtain location intelligence.

Cloud-Based Solutions
The shift towards cloud-based CAD systems is another significant trend. Cloud deployment offers several advantages, including scalability, cost-effectiveness, and the ability to access the system from anywhere. According to Research Nester analysts, cloud-based computer-aided dispatch adoption is currently estimated to be between 20% and 30%. This flexibility is particularly valuable for agencies looking to streamline operations and reduce IT infrastructure costs.
The industry is buzzing with new innovations:

In May 2024, Mark43, a leading provider of cloud-native public safety software, announced that it had partnered with the New Mexico Department of Public Safety (DPS) to offer mobile and single-platform computer-aided dispatch (CAD). The mobile dispatch app, analytics, and several interfaces that come with Mark43 CAD help the New Mexico Department of Public Safety fulfil its goal of providing full law enforcement services to promote a safer state.

In March 2024, RapidSOS and the Safety, Infrastructure & Geospatial division of Hexagon announced the launch of a Digital Alerts solution that helps firefighters respond to emergencies and fires in commercial buildings more quickly and accurately. Texas' El Paso County 911 District is the first to implement the new system, providing its community with a more effective and better-informed emergency response.

In August 2021, the FDNY unveiled FireCAD, a new computer-aided dispatch (CAD) system created in partnership with Accenture and the Fire Department of the City of New York (FDNY) – the most significant operational overhaul of the FDNY's dispatch operations in 45 years. It replaces the now-retired STARFIRE system.

In July 2024, Tyler Technologies, Inc. announced that it had secured a contract for its Enterprise Public Safety suite with the Evanston Police Department in Evanston, Illinois. With the help of Amazon Web Services (AWS), the product suite will be hosted in the cloud, providing the agency with improved security and functionality.

Case Studies: CAD in Action
Real-world applications of CAD technology showcase its transformative impact on emergency response.

Urban Public Safety
In large cities, CAD systems are pivotal in managing high volumes of emergency calls, coordinating multi-agency responses, and ensuring that resources are deployed efficiently. The integration of real-time traffic data and GIS allows for precise incident mapping and faster dispatch. For instance, between 2017 and 2020, the University of Michigan Division of Public Safety and Security (DPSS) handled close to 500,000 service requests. Such large numbers of emergency calls highlight how crucial CAD systems are to preserving public safety and enhancing response times.

Healthcare Sector
CAD systems are increasingly used in hospital emergency departments to manage patient intake, track bed availability, and coordinate with ambulance services. This integration improves patient outcomes by streamlining emergency care workflows.

The computer-aided dispatch industry is pretty dynamic, with major players like Motorola Solutions and Hexagon leading the pack. It's not just about big names, though; startups are also shaking things up with innovative solutions. The market is growing fast, driven by advancements in AI, cloud computing, and real-time data integration, making emergency response more efficient than ever. Forecasts indicate that the computer-aided dispatch market was valued at over USD 3.4 billion in 2023 and is expected to grow to over USD 20.45 billion by 2036, at a compound annual growth rate of over 14.8% between 2024 and 2036.

Wrapping It Up
Computer-aided dispatch systems might not be the flashiest part of emergency response, but they're one of the most important. As technology continues to evolve, CAD systems are getting smarter, faster, and more efficient, helping emergency responders save lives and keep our communities safe.
As the industry moves forward, the focus will be on integrating cutting-edge technologies, ensuring interoperability, and making these systems accessible to all agencies, regardless of size or budget. For more insight into the computer-aided dispatch industry, a detailed report focusing on segmental and regional analysis is available.

### How Port Scanning Strengthens Cyber Defences

In today's rapidly evolving digital landscape, the importance of robust cyber security measures cannot be overstated. In recent years, cyber attacks have become more sophisticated and more frequent, posing a significant threat to individuals, organisations, and even nations. In order to safeguard against such threats, proactive security measures must be adopted. Port scanning is one of the techniques that play a crucial role in strengthening cyber defences. With port scanning, organisations can build a robust cyber security posture that not only protects against existing threats but also enables them to meet the challenges of the future with confidence. In this article, we will examine how port scanning can enhance IT security and why organisations should prioritise its implementation.

What Is Port Scanning?
Port scanning is a technique used to assess the security posture of computer systems and networks. It involves systematically scanning a target system to identify open ports and the services running on those ports. Ports act as communication endpoints, enabling computers to send and receive data over the network. By scanning for open ports, organisations gain valuable insights into potential vulnerabilities that can be exploited by malicious actors. Multiple online tools are available to check open ports; any of them can be used.

How Port Scanning Strengthens Cyber Defences
The following are some of the ways in which port scanning strengthens cyber defences:

Identifying Vulnerabilities
Scanning ports is an essential part of identifying potential entry points for cybercriminals. Each open port represents a potential vulnerability, since it signifies that a service or application is exposed to the network. Identifying these open ports allows IT security teams to assess the risks involved and take proactive measures to secure them. When a vulnerability in an application is not patched or a service is misconfigured, port scanning is a valuable tool that can alert you to it immediately.

Enhancing Firewall Configuration
A firewall provides the first line of defence against unauthorised access to a network. However, misconfigured firewalls may inadvertently expose open ports, leaving systems vulnerable to attack. By performing regular port scans, organisations can verify the effectiveness of their firewall configurations. Unintentionally exposed or improperly configured ports can be detected in a timely manner, so that IT teams are able to make the necessary adjustments to strengthen their defence mechanisms.

Detecting Unauthorised Services
As time passes, organisations may unknowingly accumulate unauthorised services and applications on their network. These unapproved services sometimes introduce security risks, potentially allowing attackers to gain access to sensitive information. Port scanning can identify such unauthorised services by revealing unknown or hidden ports. Organisations can reduce their attack surface and minimise the risk of security breaches by detecting and eliminating these rogue services.
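To make the technique concrete, here is a minimal TCP connect-scan sketch in Python using only the standard socket module. It is illustrative rather than a replacement for mature scanners, and it should only ever be pointed at systems you own or are explicitly authorised to test.

```python
import socket

def scan_ports(host: str, ports, timeout: float = 0.5):
    """Try a TCP connection to each port; ports that accept the connection are open."""
    open_ports = []
    for port in ports:
        with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as sock:
            sock.settimeout(timeout)
            if sock.connect_ex((host, port)) == 0:  # 0 means the connection succeeded
                open_ports.append(port)
    return open_ports

# Example: check a handful of well-known ports on the local machine.
print(scan_ports("127.0.0.1", [22, 80, 443, 3306, 8080]))
```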
Assessing Perimeter Security By scanning the ports of an organisation's perimeter, one can gain valuable insights into the security of the organisation. The process provides an opportunity to identify any weak points that might be exploited by external threats. Performing regular scans from outside the network allows organisations to simulate potential attack scenarios and detect any exposed ports or services which could be exploited by hackers. This proactive approach not only highlights vulnerabilities but also aids in understanding the evolving threat landscape. With this knowledge, IT teams are equipped with the ability to reinforce perimeter security measures, develop proactive defensive strategies, and ensure continuous monitoring to stay ahead of potential risks. Strengthening Incident Response In the event of a cyber incident, organisations are required to respond swiftly and effectively. By identifying compromised systems or systems that have been exploited, port scanning plays a vital role in incident response. Organisations can minimise the impact of an attack by scanning for unusual or unexpected open ports. By doing so, they can isolate the affected areas, mitigate the impact of the attack, and facilitate a faster recovery. Additionally, regular port scanning as part of an incident response plan helps organisations to continually assess their defences, enabling them to identify potential vulnerabilities before they can be exploited and ensure a more resilient security posture. Conclusion Due to the ever-evolving nature of cyber threats, organisations are required to adopt proactive measures to strengthen their cybersecurity defences. The use of port scanning is emerging as an important technique in IT security, as it can be used to identify vulnerabilities, assess perimeter security, and improve incident response capacity.  In order to reduce the likelihood of successful cyber attacks, organisations should conduct port scans on a regular basis in order to identify and address potential weaknesses. Port scanning can be an essential component of an organisation's cyber defence strategy in the digital age by integrating it into its security strategy. Maintaining a proactive mindset and working continuously to prevent cyber threats is essential to staying ahead of them. A significant step toward fortifying the security of an organisation's IT infrastructure can be taken by leveraging the power of port scanning, which protects its valuable assets from cybercrime's ever-present threat. ### MSPs think public cloud needs improvement Public cloud is one of the technology industry’s outstanding recent success stories. In a little over 20 years, it has gone from an emerging tech trend to a global phenomenon, with the likes of AWS, Azure and Google Cloud helping to build a market expected to reach three-quarters of a trillion dollars in value this year. Given another boost by the huge level of investment going into generative artificial intelligence (GenAI) applications, Gartner predicts the market will continue growing by more than 20% next year. As explained in their recent market analysis: “We expect public cloud end-user spending to eclipse the one trillion-dollar mark before the end of this decade.” At the heart of this infrastructure ecosystem sit Managed Service Providers (MSPs), whose partnerships with the public cloud providers are key growth drivers. 
According to research published by the Department for Science, Innovation and Technology (DSIT) earlier this year, 56% of MSPs partner with Microsoft (Azure and O365), followed by 43% with AWS. This contributes significantly to a vibrant MSP sector that generated over £52 billion in combined annual revenue in 2022. Public cloud concerns Clearly, public cloud offers a wide range of compelling advantages compared to other infrastructure options. This is reflected in recent research carried out among the MSP community, which revealed that scalability (61%), flexible capacity (51%), high levels of resilience (43%) and superior IT architecture (42%) are seen as its most important benefits for users of hyperscale services. For non-hyperscale public cloud users, there are other plus points, with customer support (35%) and access to a flexible on-demand business model (32%) seen as the biggest advantages. Whichever way you look at it, public cloud is meeting a very clear set of needs – under 4% of survey respondents across the board said there were no benefits of working with hyperscale providers. Despite this very healthy position and market outlook, however, public cloud does bring some significant challenges that are increasingly reflected in the experiences of MSPs. In particular, many are reporting efficiency and cost control issues that have a knock-on effect on their own service provision. Top of the list of concerns are public cloud spending levels, with 54% of MSPs saying that they have gone over budget on hyperscale cloud services in the last 12 months, with 34% saying cost levels were currently their top challenge when working with hyperscalers. These issues are seen as more important than concerns about limited customisation and control capabilities (31%) or lack of customer service (29%). MSPs also have other concerns borne out of their experience of working with hyperscale public cloud providers. For example, nearly 30% cite vendor lock-in and complex pricing structures as a key issue, while 54% say their hyperscale partners don’t provide a dedicated account manager (or they don’t know if they do). Put this all together, and just under a third of MSPs participating in the survey said these issues have had a negative impact on their ability to service their customers. Looking more broadly at other issues influencing the public cloud market, cloud repatriation – where organisations move workloads away from public clouds to their own data centres – is being considered by more than 70% of organisations, according to IDC. A key takeaway from this insight is that while public cloud adoption remains on a firm upward trajectory, there are a range of important issues that are having a tangible impact on MSPs. Unless hyperscalers use their leadership position to make changes, these challenges will remain part of the public cloud ecosystem in the years ahead. A better balance Despite these frustrations and challenges, there are steps MSP can take to mitigate the various shortcomings of the existing public cloud model. At the heart of the matter is how MSPs procure these services in the first place because there are infrastructure and service providers out there who can deliver a more equitable balance across price, performance, service and flexibility. While it might be tempting to focus on what the dominant public cloud providers have to offer, the market is now mature enough to feature niche players geared towards what MSPs specifically need. 
Whether they are focused on all-in public cloud adoption, hybrid and multi-cloud strategies, or need to help customers with cloud repatriation, an intelligent buying strategy can deliver the balance they need. In an ideal world, an MSP should be able to set out their specifications, such as compute power, storage and network capabilities, and have the public cloud provider respond with a solution that meets those requirements. Instead of modifying their approach to fit the available architectures, MSPs should be offered services that can adapt to their needs. Armed with this level of customisation and control, it becomes much more practical for MSPs to build the agile solutions their customers need in a way that only the cloud can deliver.

### A Futuristic Guess on the Future of AI from 2025-2050

Today, let's take a simplified look at the future of artificial intelligence and what it may hold for us. As with anything futuristic, this blog post is purely speculative. Remember, technology disruptions hit us when we least expect them! I have asked my team to produce easy-to-understand graphics for each "time block" in this post. Sit back, grab tea or coffee, and travel with me to 2050!

As we stand on the cusp of a new era of technological advancement, the future of Artificial Intelligence (AI) looms large. From healthcare to governance, AI's potential applications are vast and transformative. This blog post explores a speculative timeline of AI developments from 2025 to 2050, examining possible impacts, challenges, and ethical considerations as AI becomes increasingly integrated into society.

2025: The Healthcare Revolution
By 2025, healthcare ecosystems will begin trusting the output of AI systems, moving beyond traditional machine-learning techniques into a new era of discovery. Advancements in vision-based AI tools will leverage systems that are adept at imaging and recognition to diagnose medical conditions at a much earlier stage. The combination of extensive medical records used as training data, cutting-edge AI, and human-curated research will usher in a new healthcare system. Voice generation will allow instantaneous remote and GP-based diagnoses, easing the burdens on the NHS and other systems worldwide.

Healthcare gets personal
From the discovery of the fingerprint to the advent of DNA, we have learned that we are unique in many ways. Treatments will be targeted based on a patient's genetic profile and many individual factors, such as lifestyle and previous medical history. AI and quantum computing combined will give us unparalleled access to computing power, giving patients hyper-personalised treatment plans.

In drug discovery, quantum-based AI will process billions of data points across the copious notes and details of molecular combinations, driving new research and medicines at lightning speed. AI will sift through millions of molecular combinations, speeding up the identification of new medicines. This could lead to breakthroughs in treating incurable diseases and reduce the time and cost of bringing new drugs to market. This computing power would bring new discoveries for the incurable diseases that burden the world today.

As recent events have shown, data privacy and data security will be at the forefront of people's minds when training systems on their medical data. It is up to the relevant health authorities to show that this training is needed and that all data will be anonymised before going to any provider.
Re-skilling
If done correctly, the adoption of AI-based healthcare services will modernise systems and outcomes for millions of patients. The technological leap, though, should come with caution. Without the right up-skilling and training, any large-scale project will fail. Alongside any leap, a well-considered training plan should be included, with legacy skill sets being captured and carried forward into the new AI-driven systems.

2028: The car is here! Autonomous driving
By 2028, self-driving cars are expected to be free-flowing across our roads (check out the recent Tesla update); self-driving algorithms are improving, with sensors and computer vision models advancing exponentially. This means that most vehicles will begin to migrate to being self-driven; the car could be leased out when not being used, and a host of new business models will be born. Goods and services will be delivered by lorries running constantly, reducing accidents caused by fatigue or distraction, reducing insurance premiums and creating a better road system for all.

Then there is the never-ending issue of traffic jams and congestion: would AI solve this using smart vehicle tracking and rerouting around hotspots in real time? Chaotic, uncontrolled systems such as traffic, including issues introduced by signals and breakdowns, may become a thing of the past, with AI-powered systems managing traffic flows in an intelligent, dynamic way.

The rise of self-driving vehicles will also impact city planning. The issue with all cities is parking space. Self-driving vehicles would be able to drop passengers off and park outside congestion zones, similar to the park-and-ride schemes of today; this development would be wonderful for pedestrians, reduce emissions, and give back areas for other uses. Any major technology shift upends current industries. For instance, how would transportation employment evolve if drivers were not required? This, along with ethical concerns such as the collection of passenger movement data and the threat of security breaches, will be debated intensely.

2030: The AI teacher will see you now! AI Transforms Education
By 2030, AI is expected to revolutionise education, creating hyper-personalised learning plans for each student. The nature of education will experience a shift unlike any we have seen before. AI tutors will provide one-on-one attention to every student simultaneously, offering instant feedback, encouragement, and additional challenges. This could help close achievement gaps and ensure all students receive the support they need to reach their full potential.

The journey we see now in virtual reality, such as Meta, and in augmented reality with Apple Vision, is a glimpse into the future of learning. A pupil learning about the Renaissance will be able to visit the Medici workshops in Florence to study the masters creating works such as David or the Portrait of Pope Leo X. This form of interactive learning allows for a much greater understanding beyond names, dates and summaries, and will allow our children to be immersed in the subject in question.

The teacher working late marking papers will be a relic of the past, with AI automating these tasks and providing on-demand feedback on student performance data. The freeing of resources within the education sector will give teachers the opportunity to nurture each child's emotional and physical well-being. This will be needed to offset any screen-time concerns and provide the human interaction that is key to cognitive development.
Open sourcing of all AI models for education should be made law, to ensure that privilege isn't gained through the deployment of advanced models within the system, causing inequality.

2033: AI-Driven Scientific Breakthroughs
By 2033, AI running on quantum-based systems is expected to drive possibilities and outcomes that we have never imagined, leading to breakthroughs in physics, chemistry, biology and every field of human endeavour. At this point, human history will advance in ways not even conceivable today. From implementations of quantum-level materials such as graphene to the upending of Newtonian physics, an age of human enlightenment would be upon us. Dark matter, energy generation, nuclear fission, black holes, wormholes: the possibilities, from superconductors that work at room temperature to solar cells that retain more energy, are boundless.

AI will become an indispensable tool for everyone, and traditional silos of knowledge, such as medicine versus astrophysics, will merge into one discipline. Knowledge will be codified, but with human intuition driving AI into new research and discoveries. I hope that today's AI race will be tomorrow's collaboration. The power of AI will need to be shared, with open access and global laws to ensure that humankind benefits from the discoveries made.

2035: Minority Report Time? AI Assists in Governance
By 2035, governments may begin bringing AI systems into judicial and other legal and policy-making areas. The AI judge will mediate county court disputes and allocate budgets from central to local government. Would we vote for the party that wants to be elected on a data-driven efficiency platform? Open access to the vast troves of government and industry data (this is where China has a huge advantage today) would allow us to model the impacts of policy decisions and run 'what if' analyses before implementing a policy, including accounting for unintended consequences. And an AI would not fall asleep in the House of Lords!

Readiness for disasters and pandemics, informed by AI simulations, would ensure that whatever the occurrence, we have a plan to work with. That plan could involve AI allocating human resources such as police, fire and medical staff to zones. During crises such as natural disasters or pandemics, AI systems could assist in coordinating emergency responses, predicting the spread of diseases, and allocating resources where they're needed.

However, using AI in governance will raise significant concerns amongst those affected by a decision, whether those concerns are ethical or about political bias. The key to gaining trust is to be open: AI systems should be transparent and explain any judgement or reasoning, with a clear path showing an external user how each decision was reached. The AI black boxes of today should be relegated to the past.

2040: Enter the Humanoid – the Human-AI Collaboration Peak
By 2040, will AI become a flow within us, like the bird I can see out of my office window riding the thermal air currents? I believe we will become one with AI; advances in voice-driven technology will give us instant access to everything from personal management to healthcare diagnosis and advice. The construction site of today will be no more tomorrow. Robots will perform all dangerous tasks with little to no human supervision, and deliveries will be drone- or robot-based. Communication will be voice-driven, instantaneous, and holographic in nature. Like the smartphone of today, AI augmentation will help us navigate, connect, and be human.
Lawyers, accountants and other professionals will work with augmented intelligence, with access to vast troves of data to drive outcomes in all professional fields. This human/AI/robotic shift will bring benefits but also significant challenges. We must be prepared for the psychological and other drawbacks this coordination will bring to our humanity.

2045: Humanity and the AI Consciousness Debate
By 2045, will machines be seen as sentient? Humans have always tried to make other entities, such as dogs, "human" by giving them human names and bringing them into our homes. Will we do the same for AI and robot entities? Will the robot be part of the family? Will the children have overwhelming affection for the robot nanny? Will human attachments to robots cause personal loss? Will we look to transfer consciousness to make "digital twins"? Will there be a robot bill of rights as we begin to see signs of emotional intelligence and self-awareness? Philosophers, neuroscientists, and AI researchers should come together now to map out the answers to these questions.

Thank you for reading this blog. If you have any questions or feedback, get in touch!

### How Small Businesses Can Start Using AI Today

"Let me start this blog by saying that the threats of your business extinction due to not deploying Artificial Intelligence (AI) are greatly exaggerated."

That first paragraph should, I hope, set the tone of this blog, which is intended as a brief guide through the choices available for adding AI to your business. First, AI is not new – have a look at the below:

The UK government definition of SMEs encompasses micro (fewer than 10 employees and an annual turnover under €2 million), small (fewer than 50 employees and an annual turnover under €10 million) and medium-sized (fewer than 250 employees and an annual turnover under €50 million) businesses. (UK.Gov)

This means that in the UK, many small businesses are struggling to see through the constant big-tech propaganda and gain a rounded understanding of what AI can do for them.

Testing the waters: starting with ChatGPT / Claude / Google AI / Microsoft Copilot
I will not compare the leading chatbots, but I will take you on a very short journey through how they can help you. The first task is to define what technology stack you are currently using. For example, you could be running Google Workspace for Business with Gmail and Google Drive, with the office stack being Google Docs, Sheets and Slides, et al. Alternatively, you could be running a Microsoft stack comprising Microsoft Office 365, including online email and the usual office suite of products. This matters because you are looking for a simple integration into the tools you are already using. In these cases, Microsoft Copilot would be best for the Microsoft stack and Google AI for Google, for ease of use and rapid deployment.

The caveat is that this applies to full integration; my advice is always to try first. If you are looking for certain tools – content creation, research or a marketing plan, for example – always carefully measure the impact in time saved and costs before taking the plunge.

I like Dr Who, but I don't want a Dalek. I prefer Darth Vader myself, but that doesn't mean he is going to turn up in my company and enslave all the employees! Like cloud computing before it, the AI hype is incredible. Is it a generational event? Yes, but the robots are not coming for you or the company you reside in.
The way to think about AI is to view it as a computer translator that speaks machine to other machines, a Rosetta Stone for the modern age. "In time, like the Alexa command that plays that soundtrack you demand, AI will be the bridge between that which is human: your voice."

At the time of writing this blog, OpenAI (ChatGPT) has released voice mode. The voice journey has begun! What does this mean? The technology will be simplified to the point that voice commands with screen outputs will be the way we control our devices. My advice is to plan for this eventuality within the next 18 months and look to understand how (if at all) this could affect your business. Think about voice ordering at a fast food kiosk: you speak first, and the response is always voice and pictures. (See previous blog here.)

AI use cases
The easiest win with any AI is the generation of documents, plans and images, and its use as a hella smart assistant. This has the benefit of saving time on repetitive tasks.
Planning and using AI allows you to understand customers better.
Use AI as a sounding board. Ask it to grab the latest thinking (not all chat-type programs can access the Internet), enabling you to make business decisions based on the latest thinking in your business sector.
Automate processes using AI, such as calendar booking and calling out to confirm details or appointments. (This requires an API, which we cover next.)

Public LLM / Private LLM / APIs / LLM wrapper / Vector store
Confused? LLM stands for 'large language model', which is what most AI tools are based on. The training for these tools has allowed them to predict 'what comes next' based on algorithms and reasoning. In simple terms, it's a prediction machine.

Public LLM
This is a typical use-case LLM such as ChatGPT or Claude. In these scenarios, you subscribe for a monthly fee (around £16-£20) and use the interface or chat window provided. This can take the form of a native app or be accessed through a web browser. As this rolls out in your company, the subscription costs over time may require you to rethink your spending. Public models are subject to constraints such as limits on the number of messages, even with "pro" or paid subscriptions.

Private LLM
Thanks to the open source (free software) movement, a number of LLMs – or, for simplicity, chat systems – are available free of charge for you to download and use on your own computing hardware. The advantage of deploying in this fashion is that nothing goes outside the server. This ensures data privacy (within the LLM; physical and system security are still essential), zero licence costs, and no restrictions on usage. Working with an IT partner and scoping correctly means that a number of secure integrations can be built from the network and hardware you control. A good resource to see the possibilities is LangChain, which you can find here: https://github.com/langchain-ai/langchain.

Quick tips to begin your private LLM project
Create the use cases. These could be a customer support chatbot, data analysis, a service or customer update service, a document generator, a project manager, or access to your CRM (customer management) tools to analyse data with graphs on a deeper level. For IT hardware, go with a well-supported brand with a great partner ecosystem.
The specifications below are for Llama by Meta:
A great GPU (such as NVIDIA A100 or RTX 4090)
At least 64GB of RAM
Fast storage solutions (SSD/NVMe drives)
Reliable power supply and cooling systems

The initial investment in hardware may be expensive, but over the long term, if privacy is key, this is an excellent path forward. Many companies do not retain the IT skills needed to set up an LLM environment. The link below is for TD SYNNEX, a Lenovo hardware distributor; they can independently find you a suitable local partner to work with. https://trustedadvisor.tdsynnex.co.uk/vendors/lenovo/contact/

The API conundrum
What the hell is an API, you may ask? It is an application programming interface, one would say. If that is not clearer, don't worry – join the majority of business owners. An API is a way for two systems to "chat" in machine language to pull information. Let me give you an example. You go to a holiday booking website and decide: I want to go to Florida, I want the best 4-5* hotels, and it has to be near Universal Studios. The travel website then "calls" the various hotel platforms, airlines and mapping services. The responses come back from the external platforms via their APIs, completing the page you are viewing with all that glorious information, including user reviews. Then, when your card details are entered, a final API call is sent to the merchant to process the payment.

The API services from public LLMs are great for chaining together various actions and services. They are like a business canvas that slots together to form an outcome for you. A note of caution: be careful and make sure you add cost limits to ensure you do not get a nasty surprise bill. If you are struggling and need help, then feel free to reach out via this website.

LLM Wrapper
You will come across this term when doing AI research on any platform. An AI wrapper is a product built around an underlying LLM API and tuned within that product. An example is online AI writers that are tuned to produce a particular outcome. The majority of AI-driven products online are created in this form; the issue, though, is that when the underlying models change, you will find many wrappers still based on, for instance, ChatGPT 3.5. When buying an AI service online, check the model used and always ask what they do regarding upgrades. Be careful: there are a lot of "zombie" services based on an old model, sitting there collecting subscriptions while offering considerably less than a public LLM subscription. Email their support and ask these questions.

Vector store
A vector store is a database that sits between public LLMs, private LLMs and whichever deployment choice you made. A vector store allows you to 'cache' or store previous searches and answers, and to add documents and learnings to create your own database. Vector stores integrate with the majority of AI tools. An excellent use case is the enrichment of queries within private stores and the security this provides.
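To show the caching idea behind a vector store in the simplest possible terms, here is a toy Python sketch. The three-number "embeddings" are made up purely for illustration; a real deployment would obtain vectors from an embedding model and persist them in a proper vector database.

```python
import math

def cosine(a, b):
    """Similarity between two vectors: 1.0 means they point in the same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm if norm else 0.0

class TinyVectorStore:
    """Keeps (embedding, text) pairs and returns the closest stored answer."""

    def __init__(self):
        self.items = []  # list of (vector, text) pairs

    def add(self, vector, text):
        self.items.append((vector, text))

    def query(self, vector, min_score=0.9):
        scored = [(cosine(vector, v), t) for v, t in self.items]
        if not scored:
            return None
        score, text = max(scored)
        return text if score >= min_score else None

store = TinyVectorStore()
store.add([0.9, 0.1, 0.0], "Our refund policy allows returns within 30 days.")
print(store.query([0.88, 0.12, 0.02]))  # close enough: reuses the stored answer
```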
Conclusion
AI doesn't have to be complicated or expensive. Even the smallest businesses can benefit from this technology by starting small and focusing on practical applications. Plan for the future, but look at the present: where can you save on repetitive tasks, where can AI add value, what can you automate and evaluate, and where can you cut costs? Above all, do not be afraid. Technology should serve you, not the other way around. Harness technology to become one with your business, making it personal so it sounds like your brand and voice.

And if you get stuck, never be afraid to call on the vendors and their partners to assist. Above all, experiment, iterate and iterate again, and make it right for your needs. Be open and ready for unexpected use cases. Thank you for reading this blog. We do not sell AI services, but if you need help, reach out using the various forms on this website and we can point you in the right direction.

### Are Cloud and AI driving the need for Universal Basic Income?

I keep hearing the words Universal Basic Income (UBI) across many social and blog networks. This has led me to think: why not try to explain the concept to those of us who wish to learn more? Here goes: grab your cup of tea or coffee and read on.

Close your eyes for a second; now let me take you on a journey we have already seen beginning.

Transportation in London is now using artificial intelligence avatars and text-to-speech services to read out arrival and departure times.
Yum! Brands – including KFC, Pizza Hut and Taco Bell – is rolling out voice AI ordering at its drive-throughs. This means that voice services are capturing the orders and managing the front-of-house ordering process.
Elon Musk has said that the Tesla robot, Optimus, will target general availability by 2026. These humanoid robots will be able to complete many tasks and take manual labour away from homes and businesses. (Sign me up.)
Robot chefs: from burger flipping to adding that slice of cheese, robot chefs are already here and will only get better as they progress to more complicated restaurant tasks.
The rise and rise of voice! My children say, "Alexa, what is the capital of Peru?" Smart devices are unlocking knowledge on demand and bridging cultural divides using instant translation services – no more struggling to speak another language.
Workforce deskilling: as discussed in a previous blog, we are going to see AI take more of a lead in many knowledge-worker tasks.

What does all this mean? It means that, unlike any other period in time, including the Italian Renaissance, the Industrial Revolution and the electrification of factories, we are about to see humankind's greatest achievement (or peril). Disruption will be on an unimaginable scale. No one will be safe from the automation of systems, labour, and work as we know it. This will be no Gutenberg press providing information to the masses; this will be the essence of humanity captured and industrialised. Cloud systems, bolstered by advanced connectivity and predictive quantum clusters, learning from every snippet of interaction, every right and wrong answer. Like a vine in my garden that I thought I could control, it is now climbing up the wall, spreading, getting larger and more woven into the fabric of life.

Was that a bit dramatic? Or a justified description? I'm afraid we won't know for now; time will tell. How do we protect society against the rise of robotics and AI? Universal Basic Income?

Universal basic income, or UBI, is a scheme that gives everyone a regular payment unconditionally. These payments should not be token-based or subject to rationing, as the UK imposed during and after the Second World War. Is this a workable utopia? In the following paragraphs I am going to give examples and parallels to help explain the concept further.
The Global Pandemic & the UK Government Furlough Scheme

This was a scary time for everyone. I remember the need to shut down core parts of the company and the worries about payroll, coupled with daily news about big brands failing. At the start, I remember hearing about a German government scheme that would cover the salaries of workers. Then rumours surfaced that the UK was going to implement something called "furlough". A furlough is a temporary leave of absence from work imposed by an employer due to the special needs of the company, which may stem from economic conditions at that specific employer or in the economy as a whole. During a furlough, employees are not paid but retain their employment rights and benefits. That description was my first impression of 'furlough', but clarity was finally given: employers would, in fact, be paid under this scheme. Whilst not the panacea of UBI, it was a big step towards it during a time of crisis.

The similarities with UBI are that both provide financial support during times of crisis, and both aim to maintain economic stability and individual well-being. However, key differences exist: UBI is universal and permanent, while the furlough scheme was temporary and linked to employment. UBI is unconditional, whereas the furlough scheme was tied to specific circumstances. The furlough scheme is a fantastic example of what a government could provide to its people under a UBI model.

Great, how do we pay for it?

Unlike the furlough scheme, UBI would be permanent. I will look at a few ways it could be funded, and then at examples of UBI schemes that are under trial, have failed, or have been implemented:

- Increased income taxes, new wealth taxes, value-added taxes (VAT), or carbon taxes.
- Building out sovereign wealth funds, digital services taxes, or financial transaction taxes.
- A robot tax: like an employee today, the use of robots could be subject to full taxation and national insurance.
- AI metering: could we use metering, such as the 95th percentile method, to measure AI outputs?
- A displacement tax: delivery robots, for example, could be subject to a one-off job displacement tax, requiring investment up-front before deployment.

UBI "in the wild": worldwide examples

Below you will find a number of case studies that I looked at to help the reader understand the UBI successes and failures.

Finland's UBI Experiment

Back in 2017, Finland decided to try something pretty wild. They picked 2,000 unemployed people and said, "Hey, here's €560 a month. No catches. Keep it even if you get a job." It was a two-year experiment to see what would happen with a Universal Basic Income (UBI). With AI and robots potentially taking over jobs left and right, Finland wanted to see if giving people a safety net would help them take risks, like accepting low-paid work or starting a business. So, what happened? Well, it didn't exactly send everyone rushing into new jobs. But something interesting did occur. People felt… better. They were less stressed, more satisfied with life, and felt more in control of their situation. The experiment was brief and left us with more questions than answers, but it does provide us with a look at one possible future. A future in which we have found a way to work with AI instead of competing against it. So, what do you think? Is UBI the solution to our AI anxieties? The debate's far from over, but one thing's for sure – as our tech gets smarter, we're going to need some pretty creative solutions to keep up.
Alaska's "Free Money" Experiment

So, you think Finland's experiment was wild? Alaska's been handing out cash to its residents since 1982. Yep, you heard that right – for over 40 years! Alaska's sitting on a whole lot of oil. Instead of letting all that money line the pockets of big corporations or disappear into government coffers, they decided to share the wealth in a scheme called the Permanent Fund Dividend (PFD). Every year, Alaskans get a check just for, well, being Alaskan; it's like a birthday present from Mother Nature herself. In 2023, each resident pocketed a cool $1,312. Now, before you start packing your bags for the Last Frontier, let's break this down a bit: the amount changes every year. Sometimes it's higher, sometimes lower. It all depends on how well the oil market's doing. It's not meant to cover all your living expenses. It's more of a nice bonus to help with bills, save for college, or maybe splurge on that new iPhone. But here's where it gets really interesting: this "Alaska model" has actually done some pretty cool things. It's helped reduce poverty, narrowed the gap between rich and poor, and generally made Alaskans a bit more financially secure. It's not perfect, of course. Critics worry it might make people lazy or that the money could be better spent on public services. But overall, people are pretty happy with the yearly payday.

Ontario's UBI Rollercoaster

Finland and Alaska aren't the only places where the experiment has been put into play. In 2017, Ontario, Canada, decided to take a page out of Finland's book and try their own UBI experiment. But it didn't go according to plan… 4,000 people in places like Hamilton and Thunder Bay would get a guaranteed income. Single folks would get up to $16,989 a year, while couples could score up to $24,027. The plan was to run for three years and see how it affected things like health, education, and employment. Just as the experiment was getting off the ground, Ontario had an election, and the incoming government shut the pilot down, citing high costs. Not only did this fuel outrage, but it meant that we'll never really know if it would've worked. The experiment was cut short before we could get any solid data. Even in its short life, the programme seemed to be doing some good. People reported feeling more financially stable and just… happier. Some people were using the money to go back to school or start small businesses. In the end, Ontario's experiment leaves us with a big "what if". As we keep wrestling with questions about AI, job security, and the future of work, stories like this remind us that finding solutions isn't always straightforward. But hey, at least we're trying, right?

UK Examples of Universal Basic Income (UBI) Programs

England's £1,600 UBI Trial

In 2023, England's first universal basic income trial was launched, giving £1,600 per month to 30 people in central Jarrow, North-East England, and East Finchley, North London. This is a two-year pilot; it has stated aims, but predicting outcomes wouldn't add anything to this blog. We will update you when more is published.

Wales Income Pilot for Welsh Care Leavers

In 2022, the Welsh Government launched a basic income pilot targeting young people leaving the care system. It provides 500 care leavers (available to those leaving care who turn 18 between 1 July 2022 and 30 June 2023) with a cash payment of £1,600 per month (£1,280 after tax) for two years to support their transition to adulthood.

Conclusion

Thanks for reading to the end of my blog.
As we automate more functions, both manual and knowledge-based, we need to uphold the very fabric of society. Whether it is UBI or perhaps Universal Basic Services (UBS), which, instead of providing cash transfers, focuses on guaranteeing access to essential services such as healthcare, education, housing and transportation, we will need something to help sustain society as things change.

### The Pros and Cons of Consumption vs Subscription Models

As a newly appointed CEO, one of my first tasks was to deep-dive into the impact of artificial intelligence (AI) on the business I manage and run. The Disruptive brands that I am custodian of have been at the forefront of technology adoption, with a bleeding-edge/leading-edge mindset throughout the organisation. For reference, we use the wonderful model from Geoffrey Moore's classic Crossing the Chasm to benchmark technology adoption. Internally, we used a very simple model to classify our business operations, which has now developed into our AI strategy; it was based on three words: Human (for human tasks), Hybrid (for human-assisted tasks) and Automated (for AI tasks not needing human intervention). I find it simple and easy to explain, and it has helped my team and me clarify where we are heading. Moving forward to current times, I have seen across the technology industry landscape that reductions in the workforce are a trend that will continue as technology companies adopt more AI tools.

SaaS background

The launch of Software as a Service models was a game changer for smaller, nimble businesses. The ability to use, in various circumstances, enterprise-level software became a game-changing advancement. The usual pricing schema was per user, with a ramp to more advanced features; the predictable revenue was shored up by annual payment discounts and, in nefarious cases, by auto-subscription and dark patterns (more on this in another blog). The various SaaS tools, encompassing everything from marketing through to accounting, have revolutionised the IT industry, with many SaaS providers opting to use cloud providers and migrating away from on-premise hardware (although we are seeing the pendulum swing the other way with AI projects). The SaaS models in many cases brought a plethora of on-demand tools and services, a complete shift away from traditional operational IT plans. As with many industries, though, SaaS is now feeling the pinch, squeezed by the very pricing models that were the bedrock of its growth.

The Subscription Squeeze

Platform-driven services such as SaaS are always dependent on both sides of a model working. An example (albeit a cheesy one) I use internally: "If Airbnb had no homes to let, it would fail the people going onto the platform to book a stay; if Uber had no drivers, the passengers would not use them. Alternatively, if either had no users, the platforms would die on both fronts." Therefore, "born on the cloud" SaaS providers need platform economics that cover rising cloud consumption costs while also funding the development needed to retain monthly users, who may prove transient if they adopted purely on cost. This raises the risk. The good news is that cloud providers dropping egress charges helps with new platform migrations, whether on-premise or cloud. End-user numbers are shrinking; companies are therefore looking to reduce per-user costs and negotiating renewal contracts hard, of this I have no doubt. Is it time for something new?
My vote would be yes: the many SaaS companies who extolled the values and virtues of SaaS over on-premise hardware now need to look at new dynamics, because their users certainly are.

Embracing the Consumption Model

Loss aversion is a strong motivator of change, whether in a business cost or personal expenditure. We all despise paying for something we did not use. The plethora of unused SaaS licences will come under the scrutiny of the FinOps movement, and over time they will either be modernised or cancelled.

Learning from software history

In previous downturns or disruptions, there have always been historical precedents; it's a case of hunting those nuggets of wisdom and bringing them back like a Tardis to execute now. Sorry, reader, I had to tease that out. For users that are now subject to loss, why not bring back the age-old model of licence metering, where consumption is based on the licences actually used rather than a fixed number? Another great term for this was the floating licence. Move to a model that offers these licence types, or alternatively look at going all in on consumption. Consumption models, currently practised by companies such as AWS, IBM Cloud, Azure, Snowflake and many others, base the cost on usage, i.e. you have used 100GB of storage and that is what you pay for; if usage reduces, you pay a reduced fee. This model is fairer given the AI workloads now coming into systems to be analysed, and for the SaaS providers it means the customer does not get slammed at renewal, as both sides have migrated to a model that is fairer for provider and customer alike.

Prepare for a big "Downskilling Surprise"

The march of AI is relentless; it reminds me of the verticalisation concept, and it impacts every business sector. An example of this is Salesforce, which sold sales tools across multiple industries because it was not verticalised. Across every sector, AI is being implemented to drive efficiency and to help us leverage tools ranging from code generation through to website building and failure prediction, even tackling age-old complex systems such as traffic management. As AI tools undergo the usual technology adoption process and grow smarter and more ubiquitous, we will experience a paradigm shift. This uninvited shift will be the de-skilling of the workforce. As AI gets smarter, many jobs will no longer need the same level of skill, and we will see AI coexist with people in these roles. This democratisation of technology skills and white-collar tasks will eventually see disruption. With this disruption, as always, will come opportunities (I will be discussing this in another blog soon).

Conclusion

The impact of AI on tech business models extends far beyond the shift from subscription to consumption-based pricing. It's reshaping the very nature of tech work, influencing decisions about infrastructure (on-premise vs. cloud), and altering the skill sets required in the industry. Adopting consumption-based models, I predict, will be the next shift we see in technology spending. Like the shift to SaaS models, will we see a huge shift to fairer licence models? Let me leave you with a quote from Daniel Kahneman. "Loss aversion refers to the relative strength of two motives: we are driven more strongly to avoid losses than to achieve gains."

### The Uncanny Valley of AI Writing

As I often do, I like to check the submissions to our press release and blog request inbox.
After a quick drink of my early morning tea (a British essential), I noticed something strange. Each blog has the following style of title: "Something about nothing: a guide to nothing". Notice the colon in the title? I then proceeded to read down, and bang, I would say 60% of the submitted guest blogs were AI-written; what makes this worse is that the public relations agencies submitting them are charging customers for each blog written. Think about that: they care that little about the brand they are paid to represent. Were these blogs interesting? Not really; they were complex, offered not a single broken-down explanation, were full of technical mumbo jumbo and, at worst, were lazy.

Why does this matter? My personal view is that each blog submitted is posted to inform and entertain our audience; that is a primary motive for not sun-setting the blog and allowing it to remain a free-flowing exchange of thought leadership ideas.

I asked ChatGPT to give me a description of what it considers to be the writing uncanny valley:

The "uncanny valley" in AI writing refers to a phenomenon where AI-generated text is almost, but not quite, indistinguishable from human writing. This near-perfect imitation can evoke a sense of unease or discomfort in readers. Several factors contribute to this uncanny effect:

- Subtle errors: AI writing might contain small but noticeable mistakes in grammar, context, or style that a human writer would typically avoid.
- Lack of genuine emotion: AI-generated text can feel mechanical or insincere, as it lacks the true emotional depth and understanding that humans bring to their writing.
- Predictability: AI writing can sometimes follow overly predictable patterns or clichés, making it feel less authentic.
- Inconsistencies in tone and style: AI might struggle to maintain a consistent tone or style throughout a piece, leading to abrupt changes that feel unnatural.
- Contextual misunderstandings: AI can misinterpret context or cultural nuances, resulting in awkward or inappropriate phrases that a human would recognise as incorrect.

These elements combined can create a sense of dissonance for the reader, who might recognise the text as almost human-like but still detect something off-putting or unnatural about it.

I did have to correct the US English, but I left the post intact. I am a heavy Grammarly user, which sometimes causes a written article to be flagged as AI. Back to my morning tea: I began to realise that AI has some great uses but also some significant drawbacks. The main drawback is simply trust!

The mainstay of human connection: trust

When we adopt the uncanny valley mindset, we erode the element of trust: trust that the author wrote the blog with meaning, trust that the information is fact-checked, trust that you are not breaching copyright or plagiarising a body of work.

Checks and balances for AI content

1. Always have a human edit, and never believe sources or citations until you have confirmed them.
2. Cite the AI insert above to show transparency to the reader.
3. If you are a public relations agency, respect the client's brand, stop taking shortcuts, and be transparent about using AI.
4. Content spinning and paraphrasing tools have been around forever; just don't use them; the content just feels "uncanny".
5. The accuracy of online detection tools is somewhat flawed. Instead, get AI to work for you by prompting you to write better.
6. Be human.
7. Word count: don't write to fill a word count; perhaps just let it flow and then come back, without the need to use filler content.
8. Storytelling: immerse the reader in the story you create; give a piece of yourself away.
As we are now in the maturity stage of chatbots, especially with the excellent Claude 3.5, my advice is to be transparent, open, and cite AI content. If it reads well and adds value, then go for it; if it is jargon-led mumbo jumbo, rewrite it. As we look forward to chatbots' continued maturing, the cost of using AI to produce blogging and brand content will fall. With that fall, though, the shadow of the uncanny valley spreads further. I can spot a blog written with AI within 10 seconds of reading it, and I am sure the majority of people reading this post could also. To the person who just walked into the boss's office in a slightly elevated manner saying, "Look what I produced in 1 minute", think about why it took a minute, and then ask yourself: does this really create an advantage for you and your customers? I asked ChatGPT to give me a quote to end the blog. "Time invested in learning and adapting is always worthwhile, especially in a world where change is the only constant." It's attributed to no one.

### Crypto History is Blocking Secure Blockchain Data Storage

In 1991, two scientists first introduced the concept of blockchain technology, seeking a way to time-stamp digital documents so that they couldn't be tampered with or misdated. In short, a blockchain is a distributed ledger suited to the storing of files and data. It combines a number of different technologies, including enhanced encryption and AI, and as a result offers many security and privacy advantages over traditional cloud storage solutions. Many people will associate blockchain with cryptocurrency, because perhaps its most famous use in recent years has been to store and manage digital assets such as cryptocurrency. Due to the media hype surrounding cryptocurrency, there is an unfortunate and overwhelming public belief that cryptocurrency and blockchain are intrinsically linked. This has meant that blockchain is often associated with the more negative aspects of cryptocurrency, including its often-destructive environmental impact and the many scams and bad actors that plague the industry. This is undoubtedly a massive waste. In layman's terms, a blockchain is a platform used to record transactions across many computers. Once a 'block' is added to the 'chain' it can't be removed or changed; only further blocks can be added. This process takes place in a secure, chronological and immutable way. This concept can be applied to many things outside of cryptocurrency. It can be applied to files, for example, ensuring that each time a file is saved, that version of the file can't be changed or removed. Time and time again it has been proven that current mainstream data storage solutions are vulnerable to attack, with ransomware and internal data leaks causing financial and personal harm to both businesses and individuals. As such, we desperately need to free the blockchain from crypto's clutches and start accessing its added security, privacy, and accountability to solve this worldwide problem.

Blockchain for data storage

Applying blockchain technology and its principles to data storage is simple. Instead of cryptocurrency of varying and fluctuating value, your sensitive and private data is securely stored in your own isolated and unchangeable blockchain. It can't be changed or deleted once stored and, crucially, your data can't lose its value and cost you your life savings.
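To illustrate the "once stored, it can't be quietly changed" property described above, here is a minimal, self-contained Python sketch of a hash chain. It shows the principle only and is not any vendor's implementation: a real blockchain adds distribution, consensus and encryption on top, and the function names here are invented for the example.

```python
# A minimal illustration of the append-only property: each block stores the
# hash of the previous block, so altering any earlier record breaks every
# hash that follows. Illustrative only -- a real blockchain adds distribution,
# consensus and encryption on top of this idea.
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Hash the block's contents (excluding its own hash field)."""
    payload = {k: v for k, v in block.items() if k != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    """Append a new block that points at the hash of the previous one."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "timestamp": time.time(), "data": data, "prev_hash": prev_hash}
    block["hash"] = block_hash(block)
    chain.append(block)

def verify(chain: list) -> bool:
    """Recompute every hash and link; any tampering makes this return False."""
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True

ledger = []
add_block(ledger, "invoice-001.pdf, version 1")
add_block(ledger, "invoice-001.pdf, version 2")
print(verify(ledger))           # True
ledger[0]["data"] = "tampered"  # try to rewrite history...
print(verify(ledger))           # False -- the change is immediately detectable
```

New versions of a file simply become new blocks, which is exactly the "each saved version is kept and cannot be removed" behaviour described above.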
Utilising blockchain for storage is a tried and tested way to stop cybercriminals from stealing your data and holding you to ransom. With the addition of fully homomorphic encryption, files can be searched, analysed or previewed without exposing them to attackers. Nigeria’s National Blockchain Policy One of the largest global moves to utilise blockchain is in Nigeria where the government recently announced its National Blockchain Policy. The policy’s vision is: “To create a Blockchain-powered economy that supports secure transactions, data sharing, and value exchange between people, businesses, and Government, thereby enhancing innovation, trust, growth, and prosperity for all.” Among the key benefits outlined in their policy of adopting blockchain nationally are improved transparency and accountability, increased efficiency, enhanced security, and financial inclusion. These benefits are all visible in OmniIndex’s recent partnership with educational management experts Future-X. Together they are working with the Federal Ministry of Education in Nigeria to transform Nigeria’s educational ecosystem with blockchain data storage. As well as creating local jobs and new data centres, the adoption of blockchain in the school system will protect crucial student data from attack and misuse while also greatly enhancing the ability to introduce improvements through secure analytics of the encrypted data without decrypting it. The Big Question The question has to be asked as to why more nations and companies aren’t adopting blockchain data storage and are relying on legacy storage solutions that have proven to be weak and vulnerable. While there is an obvious monopoly over data storage from the leading cloud providers, surely their influence alone isn’t enough to prevent others from stepping up and adequately protecting their data where they can? Especially when failing to act now puts the data of these nations and companies at further risk of attack while others are acting to protect it. ### Three Questions to Realign Your IT Investment Strategy Speed is usually the silver bullet to remaining competitive in a complex cloud environment. What holds back that critical race is the fact that tech investments can be siloed or driven by the latest hype or most convincing business case at that moment in time. There can sometimes be little regard for how a solution or tool will perform or produce savings in the years to come, or simply that IT budgets get stretched thin – both cases leading to the same outcome: a tech investment strategy taking a backseat. It might be that some efficiencies or performance improvements are made, but without proper alignment to business objectives, gains will be sporadic and insignificant. The consequence? IT businesses become virtual homes to fragmented artefacts. What organisations should be aiming to achieve when investing in technology solutions is substantial digital transformation, a certain level of agility to adopt the right tech and gain a competitive advantage, and reduced pressure on IT teams. This can be possible – by bringing strategy back to the fore, aligning with top business goals, and realising the role that technology must play in achieving them. To get started, IT decision-makers should step back and seek to answer the following three questions. Is my existing infrastructure providing or restraining the gains the organisation needs? 
As organisations accumulate IT solutions and platforms, many will also rack up a variety of tools that aren't core to the product but get added through iterations and updates. IT professionals will find this compounds an already complex IT infrastructure and that it becomes more difficult to realise the extent of the tools and services they can deploy. As a result, products can get doubled up and organisations spend more than they should as additional software is procured to provide functionality they unknowingly already have. This topic comes up frequently with customers. A well-known media company had an active tender out for nearly a year as it was looking to establish a sustainability dashboard for its IT operations. They were unaware that they could execute that sustainability work with their existing VMware dashboard, within their existing fee. There was no need to lay out additional costs, and they could make the solution go the extra mile once they understood how it all fit together. You may be surprised at how many tools you already own that work effectively. Instead of complicating your infrastructure, take a more holistic view of your toolkit and regularly look at your existing investments to get full value and a return on them.

Is a bespoke product necessary for achieving innovation?

Technologists often believe that building a bespoke tech stack is the best avenue to meet their customers' specific needs. But despite good intentions, the tech investments made to achieve a bespoke product are piecemeal. These become focused on specific tools and products that are combined to make something completely different, which usually do not work well together and require more work than intended just to keep them ticking along. Again, step back and reevaluate. There are solutions and platforms out there that remove this headache and help you to get the basics right. Solutions that can be bought off the shelf, deployed quickly, and are scalable, so that you can create a centralised development environment that responds to customer needs and delivers on new requirements. This is how you can find pockets of innovation in your business and provide more value.

Are my current tech investments keeping the lights on or unlocking innovation?

Innovation can be associated with the urge to adopt the latest and greatest technology on the market, whether that be public cloud networks or generative AI tools. This approach can pay off, but the same level of agility and potentially greater innovation – not to mention a better focus on cost, data sovereignty and security – can be achieved on-premises or in a private cloud environment. There is a right solution out there for your organisation. Instead of building your own cloud platforms, you should be focusing on layering innovation above that and looking for a solution that has built-in resilience for non-stop operations, threat prevention, and fast recovery from cyber threats. That's where the real differentiation is. It can be easy to be swept up with what competitors and the broader market are offering, but think back to the three questions I've outlined here to help you better understand your IT infrastructure and what investments you need to make to keep on top of your overarching business objectives. Nurturing an innovation mindset will help you revitalise your tech investment strategy.
Instead of thinking that you need to create a tailored IT solution or procure every shiny new toy on the market, the onus should be placed on creating an environment that provides flexibility and agility through the use of reliable solutions, allowing your business to deliver innovation that makes a difference.

### Defending Health and Social Care from Cyber Attacks

The National Cyber Security Centre (NCSC) recently calculated that Health and Social Care is the 5th biggest sector attacked by cyber-criminals in the UK. Recent instances, such as the cyber attack on the University of Manchester, which led to the data of over one million NHS patients being compromised, further prove the case for enhanced security measures. Such attacks on businesses cause enormous disruption, but in the case of care businesses the consequences can be life threatening. The inability of a carer to access a service user's data can lead to missed medication or missed care provision – it can rapidly escalate to a safeguarding situation. Particularly in the case of direct care information, where data privacy and security are so crucial, cyber attacks will likely always remain a risk. It is essential that healthcare and health tech software providers continuously monitor, update, and improve their technology to ensure that a breach does not occur.

What measures must be taken?

There are several measures home care agencies can take to help avoid becoming victims of cyber attacks. Picking a suitable software solution is a key element in mitigating any potential cyber attacks, particularly if they're hosting your data, but remember that they're a black box: you don't know exactly what's going on past your interactions with the application. There are security standards available which can give you reassurance that a provider is operating in a secure manner. The NHS Data Security and Protection Toolkit (NHS DSPT) is a great place to start; it's a self-assessment programme largely based on the ISO27001 standards and has special affordances for healthcare. Digital Social Care recommends NHS DSPT compliance to all CQC-registered care providers, and it's a requirement if you deliver services under an NHS contract. They have great advice and guidance for meeting the standards and becoming conformant. However, many may dismiss these as only being applicable to technology companies. The truth is that cyber security incidents can occur at any step in the process, whether it's a virus spread via email, sensitive information sent to the wrong individual, or someone managing to get physical access to your computer. As general advice, good cyber security practices stem from a defensive way of thinking, posing questions such as: "can this email be trusted?"; "could my password be easily guessed if someone knows me?"; "who else could possibly use my computer?".

Protecting client data

When it comes to ensuring that customers' data is secure, there should be numerous measures in place. I could speak at length on the many different systems to have in place, but I'll try not to get carried away. Naturally, any software provider must encourage good security practices through the platform itself, such as data encryption both in transit and at rest, multi-factor authentication, and a comprehensive role-based access control system to provide additional restrictions on the viewing and modification of data by authorised users.
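As a small illustration of the role-based access control point above, the sketch below shows the basic idea in Python: each role is granted an explicit set of permissions, and every sensitive action is checked against them. The roles, permissions and function names are invented for the example; a real care platform would enforce the same rule at the API and database layers as well.

```python
# A bare-bones sketch of role-based access control: roles map to explicit
# permission sets, and every read or write of care records is checked against
# the caller's role. Illustrative only; names and roles are invented.
PERMISSIONS = {
    "care_worker":  {"read_care_plan", "record_visit"},
    "care_manager": {"read_care_plan", "record_visit", "edit_care_plan"},
    "office_admin": {"read_contact_details"},
}

def is_allowed(role: str, action: str) -> bool:
    """Return True only if the role has been explicitly granted the action."""
    return action in PERMISSIONS.get(role, set())

def view_care_plan(user_role: str, client_id: str) -> str:
    if not is_allowed(user_role, "read_care_plan"):
        raise PermissionError(f"{user_role} may not read care plans")
    return f"care plan for client {client_id}"  # fetched from encrypted storage in practice

print(view_care_plan("care_worker", "client-42"))      # allowed
try:
    print(view_care_plan("office_admin", "client-42"))  # not granted this permission
except PermissionError as err:
    print("blocked:", err)
```

The design choice worth noting is the default-deny stance: anything not explicitly granted is refused, which mirrors the "restrict viewing and modification to authorised users" principle above.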
The provider should take on the responsibility of securing the platform for customers, so they don't have to worry about managing their own servers, or engaging with a third-party IT company to do it for them. Any trustworthy software provider will take care of firewalls, intrusion detection, and encryption, as well as protections and mitigations against many other common attack vectors. All databases should have point-in-time recovery enabled, which differs slightly from conventional nightly backups: it allows the provider to restore to any point in time within the backup retention period. Backups must all be replicated to multiple locations and secured by different credentials. Services should also have a failover mechanism, where if there's an issue with the underlying server, the provider can switch to a stand-by instance that has a full copy of the data. Employing proactive vulnerability testing, as well as periodic penetration testing, can be helpful in scanning for emerging threats. And lastly, have a plan. All software providers should implement a comprehensive disaster recovery plan which covers backups and restoration, and is regularly tested, so that in the event of an issue you can be confident in what actions to take in order to mitigate the fallout. These measures can minimise the impact of a data breach and ensure a swift recovery without compromising client data.

### CIOs and CISOs Battle Cyber Threats, Climate, Compliance

CIOs and CISOs face unrelenting pressure from three massive forces. First, the risk to data is constantly growing. Some of that is due to an increase in hacking, the result of AI and SaaS models making it easier for even complete beginners to carry out attacks, and of political agendas leading to a rise in state-sponsored hacking. Second, classic IT failures within data centres are more likely because of the risk of climate change, making extreme weather such as floods, heat waves and drought more prevalent across Europe and posing a threat to data centres across the Continent. And third, the EU has recognised these risks and wants to use regulatory means to force companies to implement a minimum level of cyber resilience. That exposes CIOs and CISOs to another challenge, because they only have a certain amount of time left to meet the requirements of DORA and NIS-2 or face the risk of fines. As the novel trilogy "The Three-Body Problem" by author Liu Cixin describes, three forces have catastrophic effects on those affected when they catch organisations unprepared. So what can organisations do to address the pressure from these forces?

A data-centric focus on cyber resilience

Given that the risk to data is constantly growing, organisations should adopt a data-centric approach to cyber resilience, ensuring that data from an organisation's diverse compute and storage environments is brought together, providing the governance, detective, response and recovery capabilities needed to achieve a high level of resiliency. This is logically sensible. After all, it is data that drives the business, data that adversaries want to steal, encrypt or wipe, and data that has compliance obligations. Set alongside this, the technology infrastructure is becoming a commodity, with orchestration, cloud and virtualisation now readily accessible to help organisations manage and protect that data.
Any approach to bring this data together and provide those governance, detective, response and recovery capabilities should do so in a manner that supports the wider security and IT ecosystem through integration and orchestration. Being resilient means being able to withstand any and all possible threats: fire, flood, hurricane, misconfiguration, ransomware, wiper attack and many, many other potential eventualities. The ability to resume normal service with minimal impact and cost is critical.

Addressing IT failures

In the event of a massive incident, all employees, partners and customers are isolated and no one knows what anyone else is doing. Even access control systems can be brought down, meaning employees can't open doors to get into buildings or to leave rooms. It is imperative that an organisation understands that these impacts are real; the disruptions caused by many successful attacks prove it. They also need to ensure that they establish an isolated clean room that is capable of rapidly restoring the organisation's ability to investigate, contain, eradicate and recover from the incident, including all of the security, collaboration and communication tooling needed.

Personal liability

With its two sets of rules – DORA, which focuses on the financial industry, and the NIS-2 Directive, which covers an ever-expanding definition of critical national infrastructure – the EU wants to start right here and strengthen cyber resilience. To this end, the rules also specifically hold company management accountable. Anyone who violates the requirements can be held personally liable for a lack of governance of their cyber risk. Sanctions can include fines and/or management restrictions. The fines are tough, as they are based on the mechanisms of the GDPR. If companies fail to meet their DORA obligations, they face fines of up to EUR 10 million or 5% of the previous year's global turnover. The penalties under NIS-2 are even tougher and now target management more closely. The fines can range from EUR 100,000 to EUR 20 million for legal entities. The fines for violations have increased significantly since the IT Security Act 2.0 of 2021. It is also to be expected that the authorities will pursue violations with similar rigour as they do with the GDPR. NIS-2 dramatically expands the number of industrial sectors that must comply with the standard compared to its predecessor from 2016. It is important to know that wherever NIS-2 regulates areas that were left out of DORA, NIS-2 applies; it therefore fills in the gaps left by DORA, and the two are connected. While DORA is a regulation, and organisations can determine what is expected of them by reading the documentation, NIS-2 is a directive and should be seen as a minimum baseline: each of the 27 member states has the freedom to extend the scope of what is deemed critical national infrastructure and to mandate more stringent requirements than the directive itself. With this in mind, organisations should start their journey to cyber resiliency now to build a foundation that any country-specific legislation will require. For CIOs and CISOs, it's all about being prepared for these periods of chaos and failure. Because a cyber incident, whether via a successful attack or heavy rain, will definitely happen. It is crucial that all security and infrastructure teams have the right infrastructure, the right processes and the right muscle memory.
This is the way to create and strengthen resilience and successfully address the "Three Body Problem".

### Discover the Power of On-premise Cloud Innovation

For most organisations, the shift from on-premise to the public cloud appears inevitable, as many businesses are continually won over by the perceived benefits of the flexibility and cost models available. However, they fail to identify whether this really is the most operationally cost-effective, efficient and secure approach to building a technology infrastructure. The truth of the matter is that numerous companies have fallen foul of the hidden costs of support and security, as well as the unknown additional costs attached to storage models. They have failed to realise the promised operational flexibility. When these hidden costs are inevitably compounded by rising security threats, significant latency issues, and apprehensions regarding vendor lock-in, a prevailing realisation emerges – the public cloud strategy is not only more costly but also poses higher risks. But what is the alternative? While on-premise undeniably addresses both security and latency concerns, the prospect of substantial capital investments in new hardware and the requirement to rebuild in-house IT teams present formidable challenges for many companies. So, the question arises: how can organisations keep the benefits of flexible, usage-based pricing while reducing costs and improving security? Mark Grindey, CEO, Zeus Cloud, explains why it's time to go "back to the future" and build an in-house, on-premise private cloud.

Debunking the Cloud Myth

All organisations, across both the private and public sectors, have embraced the notion that a shared IT infrastructure provides superior value for money compared with a dedicated, on-premise set-up. Consequently, the UK cloud computing market has surged to an eye-watering valuation of £7.5 billion, prompting an investigation by the CMA into its domination by just three vendors. However, this review, designed to tackle worries raised by Ofcom about exit fees, the lack of flexibility and the structure of financial agreements, while valid, diverts attention from the overarching issue at hand: the cost of public cloud services tends to be double that of an on-premise setup. Every organisation using one of the big three hyperscalers is essentially incurring double the expense for their critical IT systems, including storage and application hosting. To compound matters, they are investing in a service that is drastically less secure and often less well-supported than an on-premise alternative. Security is an ever-present issue for businesses that are now dependent on the public cloud. And the inherent dominance of the big three hyperscalers renders them prime targets for hackers. Distributed Denial of Service (DDoS) attacks on these organisations are occurring almost continuously, creating huge security vulnerabilities. Beyond the immediate consequence of impeding access to vital services and causing operational disruptions, DDoS attacks also unveil vulnerabilities in the security infrastructure, potentially granting unauthorised access to critical data. With that in mind, why do the majority of organisations still opt to pay through the nose for a service that is less secure and less flexible than on-premise alternatives?

Exposing Hidden Expenses

Admittedly, at first glance, the cloud model is appealing, especially the shift from capital expenditure (capex) to operational expenditure (opex).
Furthermore, the notion that costs are known, with a set monthly subscription, is compelling, especially when combined with the option to scale up and down in line with demand. Yet hidden costs are around every corner and, unfortunately, catch the majority of organisations by surprise. While the finances and investment involved in hyperscalers may look straightforward at first, buried within the fine print is the fact that every additional bit of service and support incurs an additional cost. The extra – and much-needed – security, for instance, comes at an extra expense. Storage cost models are also disturbingly opaque: the promised price per terabyte looks great until a company discovers it is being charged not just to store data but also to delete it. Moreover, the seemingly complimentary act of uploading is countered by a charge for every object downloaded. Consequently, the monthly bill skyrockets for many and can often be two or even three times the expected amount. Factor in the limitations on bandwidth and the additional charges for CPU or RAM, along with further costs if the business is using VMware, and organisations can find themselves paying out yet again. All things considered, it's not surprising that the cost of the public cloud has far exceeded any CTO's original expectations.

The 'New' Model on the Block

The solution, for businesses looking to reclaim control and reintegrate their equipment, is to go on-premise. This strategic shift allows organisations to preserve the advantages of cloud technology, such as remote support and adaptable finance and usage models tailored to meet operational demands, all while ensuring a more affordable security framework. Increasing numbers of Service Integration and Management (SIAM) companies recognise the fundamental issues associated with public cloud services and are offering this 'back to the future' on-premise model with the necessary flexibility to address the pain points attached to using a traditional hyperscaler. On-premise servers can be effortlessly activated, as required, with costs tied to actual usage. Support is included as part of the service and, by moving back on-premise, the security risks are all but alleviated. For any business apprehensive over the prospect of rebuilding their server rooms or employing dedicated tech experts, rest assured. The latest generation of servers can be run at higher temperatures, eliminating the need to recreate the air-conditioned server rooms of the past. These new servers can seamlessly integrate within existing network rooms or offices. Alternatively, if limited space is an issue, the entire system can be securely co-located within a dedicated and locked rack. Tech support – an integral part of the service – is administered by leveraging the remote, open-source technology used to provide cloud services cost-effectively, ensuring the on-premise systems are working efficiently.

Future Proof Your Business

Bringing this vital infrastructure back into the business isn't just a cost-effective measure: it also fortifies security. Unlike the expansive, public access model mandated by the large hyperscalers, an on-premise set-up adopts an opposing approach: everything is locked down first, with access granted only as necessary through highly secure tunnels, safeguarding the business. Additionally, because the entire private cloud set-up is company-owned, any essential security changes can be made instantaneously.
This eliminates the interconnected risk associated with the public cloud which has led to devastating, extended attacks across key public services in recent years. The prospect of regaining this level of control is encouraging an increasing number of organisations across both the public and private sectors to actively bring data and systems back in-house. These organisations have – and rightly so - serious concerns regarding data security, expressing dissatisfaction with the escalating latency issues connected with the numerous layers of security imposed by hyperscalers – issues that disappear when systems are on-premise. In addition, there is a growing consensus that a dependence on the public cloud introduces operational risk, as any interruption to the internet connection leaves an entire organisation incapacitated. A transformative shift is underway. While the public cloud undoubtedly serves a purpose, particularly for hosting a website or public-facing apps, there is a growing recognition that IT deployment can be both more cost-effective and more secure with an on-premise set-up. The time has come to regain control, go back to the future and implement an on-premise private cloud. ### The AI Show - Episode 8 - Theo Saville In episode 8 of the AI Show, our host Emily Barrett, AI Lead & HPC Systems Architect for Lenovo, engages in an insightful interview with Theo Saville, Founder of CloudNC. They delve into the intricate world of software development at CloudNC, shedding light on the innovative approaches and challenges faced by the company. During their conversation, Emily and Theo explore the pivotal role of artificial intelligence in transforming manufacturing strategies. Theo shares his expertise on how AI technologies are being harnessed to optimise production processes, enhance efficiency, and reduce costs. Their discussion offers a glimpse into the future of manufacturing and the potential that AI holds in revolutionising the industry. The episode provides a deep dive into the intersection of AI and manufacturing, offering viewers valuable insights from industry leaders. Emily's expertise and Theo's pioneering work at CloudNC make for a compelling dialogue that is both informative and thought-provoking, highlighting the transformative impact of AI on the manufacturing landscape. ### The Data Conundrum: How sustainable is its future? In this article, Dan Smale, Senior Service Owner of Fasthosts ProActive, discusses the various ways to overcome the ‘data centre crunch’ and how Fasthosts ProActive is meeting demand while remaining as sustainable as possible. Earlier this month, the FT reported that nearly 100tn gigabytes of data was created and consumed last year. It’s expected that this figure will nearly double again by 2025. There are a few concerns associated with this. With AI, blockchain and the IoT becoming bigger drivers in technology development, the need for data centres will continue to rapidly grow. More data centres mean we need more space and better electricity generation. But working around these concerns isn’t as simple as we might think. Developers may find themselves needing to overcome hurdles, including local planning restrictions and upgrades to national transmission systems. In some places, such as Singapore and the Netherlands, there are also strict restrictions to limit and control the construction of new hyperscalers. 
This resistance comes as strong sector growth raises questions about the impact of data centres, including their high energy, heat and water usage, as well as their potential impact on agriculture due to the size of larger data centres. These concerns mean that we can't just build more data centres to store all of the information that's being processed. However, the rapidly growing reliance on data will require extensive new builds, and if restrictions prevent this, then companies will run the risk of losing users, as they will slowly become unable to offer the data storage customers need.

The need for sustainability

Data centres typically use a lot of energy to maintain their servers and, as a result, account for approximately 1% of global energy consumption. This may not sound like a lot right now, but as mentioned, it's only set to increase. For instance, AI typically uses more energy than other forms of computing, with some reports suggesting that a single model can use up more electricity than a hundred homes use in an entire year. The AI sector is also growing fast, making it hard to know exactly how much electricity it uses and what its carbon footprint is. A greener option is to rely on cloud-based storage solutions, as this offers a simple and cost-effective alternative – they still rely on data centres to power them, but cloud servers hosted in a specialist's data centre can reach utilisation of 40-70%, as opposed to just 18% with on-site servers. This means there's a better rate of efficiency, as cloud data centres can consolidate large amounts of equipment and handle traffic spikes much better. Not only is this beneficial for the environment, but opting for cloud storage can also result in lower running costs for businesses. Traditional on-site storage solutions are also complex and can be hard to manage – resulting in further costs due to ongoing training and maintenance. In order for this to work, however, businesses need to make sure they are opting for a provider that powers their data centres sustainably.

What planning goes into building a green data centre?

While data centre energy efficiency has improved, the growing demand for data has significantly impacted power usage. For data centre companies themselves, some are working around this by ensuring they rely on renewable energy. Emissions can vary depending on what type of power plants provide the electricity to the data centre. A centre that relies on coal or natural gas-fired power plants will be responsible for higher emissions than one that draws power from renewable sources, such as solar or wind farms. Bigger conglomerates, such as Google, are also investing in renewables to counteract any impact. Many components can also be made more energy-efficient and environmentally friendly, from how the data centre is constructed to the equipment it uses. Centres can also opt for server virtualisation, using virtualisation technology in cloud servers to maximise resources. This reduces the total number of servers and storage devices needed, as it can split a physical server into multiple virtual machines, each with their own dedicated resources and customers – allowing one server to host multiple customers. Another way to build a green data centre, especially for cloud hosting providers, is to decrease energy consumption by investing in hardware and software that eliminate inefficiencies. Strategies such as frequency scaling can also improve the energy efficiency of data centre equipment.
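As a rough, back-of-the-envelope illustration of the consolidation point above, the short Python sketch below compares one lightly used server per workload with a smaller number of better-utilised virtualised hosts. Every figure in it (utilisation levels, watts per server, PUE) is invented for illustration only; the aim is simply to show why consolidation reduces the energy a facility has to supply.

```python
# Back-of-the-envelope illustration of server consolidation: packing lightly
# used workloads onto fewer, better-utilised hosts cuts the number of servers
# drawing power. All figures are invented for illustration.
import math

workloads = 40            # separate applications, one per physical server today
avg_utilisation = 0.18    # ~18% average utilisation on dedicated servers
target_utilisation = 0.60 # utilisation a virtualised host can sustain
watts_per_server = 400    # illustrative draw per server
pue = 1.25                # power usage effectiveness: total facility power / IT power

# How many virtualised hosts are needed to carry the same aggregate load?
hosts_needed = math.ceil(workloads * avg_utilisation / target_utilisation)

def facility_kw(servers: int) -> float:
    """IT load multiplied by PUE gives the total power the facility must supply."""
    return servers * watts_per_server * pue / 1000

print(f"hosts after consolidation: {hosts_needed}")                               # 12
print(f"before: {facility_kw(workloads):.1f} kW, after: {facility_kw(hosts_needed):.1f} kW")  # 20.0 kW vs 6.0 kW
```

The same arithmetic also shows why the PUE figure matters: every watt saved at the server is multiplied by the facility overhead it no longer has to cool and condition.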
Naturally, as churning data can be quite impactful, data centre companies now need to prove they are as sustainable as they can be if they want to expand and build in certain jurisdictions. Turning to green data centres is a way companies can be more sustainable, especially as our reliance on data and the cloud continues to grow. In 2022, Fasthosts unveiled its multimillion-pound investment in a brand-new, green data centre in Worcester. The data centre incorporates 100% renewable energy, solar panels, electric car charge points, green roofs and best-in-class power usage effectiveness of 1.25 or less.

### Adopting open architecture for robust data strategy

As the world's economy grapples with continuous challenges and uncertainty remains the prevailing theme, data emerges as the modern-day fuel, powering enterprises across various sectors. Data and analytics have a pivotal role in helping companies not only manage and alleviate risks but also discover opportunities for adjustment and expansion. Transitioning to data-centric business models is at the heart of the digital transformation wave that continues to impact every industry through 2023 and beyond. Therefore, the right data storage solution is critical to meeting a business's requirements. At its core, a data storage system's role is to securely store valuable data that can be easily accessed and retrieved on demand. However, as data volume continues to expand and stack up, the need for businesses to increase their storage capacity has grown. The situation gets complicated when data warehouse providers preserve data in their unique, proprietary format. This locks the data into a specific platform, making it a costly challenge to extract when needed by a business. Organisations can quickly fall into a downward spiral while adding more data to their systems, given the myriad data storage options and system configurations – an approach far from efficient. Instead, businesses must adopt open-source standards, technology, and formats to allow fast, affordable analytics and the flexibility to capitalise on emerging technologies without straining resources.

On the path to open architecture

In the past, businesses leaned heavily on traditional databases or storage facilities to satisfy their Business Intelligence (BI) needs. However, these systems had their fair share of challenges, such as tech interoperability, scalability and costly investments in onsite hardware to maintain structured data in owner-specific formats. They also required a centralised IT and data department to conduct analytical tasks. Since then, data architecture has experienced dramatic shifts: vast datasets can now be analysed in seconds, and data has quickly moved to cloud storage. This became more appealing for businesses that would otherwise have struggled with physical storage. The transition to cloud warehouses helped elevate scalability and efficiency. Despite these advancements, certain practices persisted. Data still needed to be imported, uploaded, and replicated into one unique proprietary system linked to a standalone query engine. The deployment of several databases or data warehouses mandated the accumulation of numerous data replicas. Furthermore, companies continued to bear costs for the transportation of data into and out of the proprietary system. It has all changed with the emergence of open data architecture.
With data operating as a standalone layer, there is a clear delineation between data and compute: open-source file formats and table formats store the data, and independent, flexible compute engines process it. As a result, different engines can access identical data within a loosely connected infrastructure. In these arrangements, data is retained as its own distinct tier, in open formats, within the organisation's cloud account, allowing customers to access it through a variety of services. This evolution mirrors the transition from monolithic architectures to microservices in applications. This is especially true in data analytics, where companies migrate from closed proprietary data warehouses and nonstop ETL (Extract, Transform, Load) processes to open data architectures like data lakes and lakehouses. In addition, separating computing from storage allowed for more efficient operations. Firstly, it lowered the cost of raw storage, practically eliminating it from IT budgets and leading to significant financial savings. Secondly, the compartmentalisation of computing costs allowed customers to pay only for the resources used during data processing, leading to a reduction in overall expenditure. Lastly, both storage and compute became independently scalable, paving the way for on-demand, adaptable resource allocation that injected flexibility into architecture designs.

Choosing the right data solution

Whilst cloud data warehouses promised organisations scalability and cost-efficiency in data storage, businesses were left tied into a single vendor's ecosystem, restricting them from expanding to other invaluable technology solutions. Instead, open data lakes and lakehouses offer a significant advantage for businesses looking to take full control of their data and analytics: a flexible approach that allows various premium services and engines to be used on a company's data paves the way for utilising diverse technologies such as data-processing tools. With companies having different use cases and requirements, the ability to deploy the most appropriate tool leads to enhanced productivity, specifically for data teams, and a reduction in cloud costs. It's vital to remember that no single vendor can provide the full range of processing capabilities a company may require. Switching platforms becomes significantly more difficult when grappling with a data warehouse containing anywhere from 100,000 up to a million tables, alongside hundreds of complex ingestion pipelines. With a data lakehouse, however, it becomes possible to query the existing data with a new system without migrating it. Vendor lock-in leads to financial exploitation by vendors, so organisations must avoid it at all costs. Even more important, however, is the ability to bring in new technologies, regardless of whether the current vendor remains the preferred one. How organisations gather, store, manage, and utilise data will be crucial in shaping the future of nearly every market segment in the upcoming years. Given its vast benefits, organisations must embrace open data architectures if they want to move towards a more flexible, scalable, and insightful future. Only open-source tools can offer the scalability, efficiency and cost-effectiveness that will allow them to stand out from their competitors.

### Transitioning from legacy tech to the cloud

Not too long ago, on-site physical IT infrastructure and traditional software were the only options available to tech-forward companies.
As such, many businesses have relied on these legacy systems for decades. Today, businesses are increasingly embracing the multiple benefits of moving away from these older systems and towards digital-first solutions like cloud computing, which offers both flexibility and scalability for dynamic organisations, as well as notable cost-savings and improved security and compliance compared to their legacy counterparts. Migrating operations to the cloud can pose a significant undertaking, from the technology and processes to the skills needed to make it a success in the long term. Technical debt, accrued as companies must continue investing in legacy technology to keep up in an increasingly digital age, can compound this issue further. Below, we explore the best practices for companies looking to make this transition. Conduct a SWOT analysis A robust pre-migration strategy is key to avoiding being blindsided part way through a long and costly migration. Many companies want to improve their digital capabilities through a transition to the cloud but enter the process without conducting a full SWOT analysis, leaving them vulnerable to unexpected issues. A SWOT analysis refers to the Strengths, Weaknesses, Opportunities and Threats that in this case, cloud migration presents. Though this is a complex technical undertaking, this analysis should strive to take a holistic view, evaluating everything from technical, financial, training and operational aspects. This will allow you to build a fuller picture, and incorporate these elements into a full migration plan, while also assisting you with spotting potential roadblocks as well as potential opportunities for further growth or optimisations. Assess existing infrastructure Businesses that rely on their legacy technology often find themselves dealing with a patchwork of various systems. In these cases, cloud migration is a more detailed operation than simply “lifting and shifting” your existing systems into the cloud. It is important to evaluate each system or application and its readiness for migration to the cloud, as well as any existing integrations these systems rely on for business-critical operations. Some may need special consideration when it comes to migration, and being aware of these early on will help you plan ahead and identify where new migrations may be needed within the new system. A full audit of this kind will enable you to gain oversight of your existing architecture, along with its dependencies or unique requirements. Create a migration plan Many cloud migration projects fail to deliver on time or on budget due to insufficient preparation at the planning stage, so it is vital that you create a detailed plan that sets out realistic objectives, timelines and milestones for each stage of your migration journey. Ensure you have a full inventory of all your existing legacy architecture and any associated integrations, as this will help you to facilitate a smooth transition. From here, prioritise each part of the transition based on factors such as operational importance, cloud-readiness and estimated business impacts and downtime. This final point is particularly important, as migrations, especially those involving long-standing legacy systems, will naturally come with a certain amount of downtime as systems are moved into the cloud. It is crucial that this downtime is built into your plan, along with strategies on how this disruption to normal operations can be minimised or mitigated. 
Define roles One of the biggest consequences of reliance on legacy systems is that the skills and knowledge associated with these systems can end up siloed within pockets of an organisation. As part of the migration process, it is important to identify these existing skillsets, and how they can be best adapted to fit with modern cloud-based systems. This not only helps to boost the buy-in across an organisation if employees can transfer their existing skills, but it also helps bolster your in-house capabilities with cloud systems.Sometimes, the skills needed to maintain and optimise your new cloud systems will not be available in-house. It’s therefore important to have an open dialogue with a competent cloud partner, to ensure both your vendor’s and your own responsibilities are clear during and post-migration. Defining these roles and responsibilities around new cloud systems will help to ensure a smooth process, avoiding confusion and promoting accountability. Consider change management The transition from traditional legacy systems to an agile cloud system comes with a host of benefits, but many employees can also view it as a threat. Achieving real transformation through a legacy tech-to-cloud transition must involve recognising and mitigating employee-based risks, by minimising resistance to new technology and maximising adoption rates. One of the most effective ways to achieve this is through a change management programme. Different organisations will require differing levels of support when it comes to adopting changes and realising the full potential of new technologies. A robust change management strategy will include assessments of how change-ready your organisation is, working with employees to build both confidence and trust in new systems, increasing adoption rates and building technical capabilities along the way. The migration process Before the migration itself begins, it is crucial that you back up your data. From here, you can begin with smaller, non-business-critical parts of your existing systems, using these as a test run for the wider migration, and allowing you to identify and address any potential issues before moving on to business-critical systems and applications. It is important to note that not everything must be migrated at once. Legacy systems can be complex, and breaking the migration down into more manageable chunks helps to set realistic timelines, allowing time for any potential troubleshooting. The process of transferring data must be managed carefully to ensure the integrity of your data is preserved. Once this is done, the next step is to migrate applications. Again, this must be done with the utmost care, ensuring they are properly integrated and work optimally within the new cloud environment. Continued improvement Cloud systems, once up and running, require much less maintenance and management than legacy technology, though it is still necessary and recommended to keep up to date with new features and updates from your cloud provider. Even in situations where it has been possible to lift and shift systems to reduce risk and effort upfront, it is still possible to improve and amend these systems once in place. Similarly, once the initial migration is over, this does not mean you shouldn’t continue to ask for feedback from your teams and users of the new systems. Doing so can provide invaluable insight into how the cloud is working for your business, and where tweaks and optimisations might be made. 
This approach of continuous improvement is key to ensuring you are leveraging the cloud environment to its full potential. Final thoughts Migrating from legacy systems to the cloud can pose significant challenges for businesses of all sizes, but through careful planning and a robust migration strategy, organisations can spot potential roadblocks early on, and plot a course that enables a smooth and timely transition, allowing them to reap the business benefits of cloud computing. ### Is Poor Collaboration Hurting Your Team's Productivity? Regardless of how many days you believe hybrid workers should spend in or out of the office and its positive or negative impact on productivity, hybrid work is now the new normal for organisations across the globe. While increased flexibility has brought many positives for employees, it's essential that employees and customers can collaborate effectively to ensure improved productivity. I believe there are a few key areas you need to get right to make collaboration effective and keep that ever-important productivity high. Quality Network With a hybrid workforce, almost every meeting or call features video, screen and app sharing and collaborative working. To do this in real-time and with good-quality audio and video, businesses need the right infrastructure to use these collaboration tools effectively. Ensuring your network (wired, wireless, SDWAN and even employees’ home network) is fit for purpose is critical or employees will have a poor experience which hinders the effectiveness of hybrid work. For example, some networks cannot handle the increase in capacity and minimal latency that video-enabled meetings and collaborative working requires. As collaborative tools become feature rich and the content becomes media heavy, network connectivity needs to adapt. With the acceleration of cloud adoption, combined with the birth of AI-powered services such as Microsoft 365 Copilot, connectivity needs to be able to be application-aware and optimise network traffic in real time. Company-wide Security Security isn’t about a product or a new tool. It’s about ensuring your whole organisation adopts a zero-trust approach to security rather than simply protecting the legacy network boundary that existed before. This approach ensures that employees have suitable security enabled on their devices, such as two-factor authentication. Employees also need regular training on the latest cybersecurity threats and challenges that stimulate and foster employee engagement. Productivity and Collaboration Tools Productivity and collaboration tools have evolved hugely over the past three years. We’ve seen a coming together between the network, the physical spaces (offices) and the virtual/remote spaces we work in. Many organisations are transforming their traditional offices into intelligent spaces that are more appealing to work from, are more sustainable and promote a collaborative and inclusive environment that is aware and can adapt to how and where people work. These new “employee hubs” use the network to monitor air quality, people flow, and room occupancy whilst delivering secure, end-to-end cloud-managed connectivity supporting the needs of every employee. CNA Hardy, the global speciality insurance company, is typical of many businesses. They had legacy collaboration and meeting room technology that couldn’t provide the high standard of collaboration required for hybrid working. 
They needed help connecting remote participants and integrating with existing tools, which caused issues around sharing content and reduced collaboration and productivity. To create a modern, flexible workplace that supports the needs of its people and customers, CNA Hardy deployed meeting room technology powered by Microsoft Teams. This enabled seamless communication and collaboration between teams in different locations. They also refreshed their physical office, incorporating adaptive signage to show things like CO2 consumption per day based on real data and reminders to book a desk for busy days. CNA Hardy now has state-of-the-art tech and a more attractive workplace, which has increased employee satisfaction and benefits staff recruitment and retention. It has also allowed for continued flexible working, which has meant the business has saved money by reducing its London office space. The Three P's - People, Process, Productivity Technology alone doesn't make "hybrid work" work! You can have the best network, great collaboration tools and amazing new energy-efficient office spaces equipped with the latest video technology, but without due process for people, they amount to nothing. You need to ensure your people know how to use the right tools and how to get the best from them. This is not simply about training; it's about embedding a mindset of learning too. The latest technology puts collaboration at its heart. For example, digital whiteboards and canvases (such as Microsoft Loop or Notion) enable mass collaborative working, while multi-screen and multi-camera video conferencing uses AI to transcribe meeting notes, capture actions, translate text into different languages and allow people to adjust zoom levels. Contrast and screen-reading tools all help to foster inclusivity and make hybrid meetings successful. The right network, state-of-the-art collaboration technology, company-wide adoption, and great security make for successful collaboration. This results in an organisation that can deliver its core business successfully, whilst creating an empowered and happier workforce that is more productive. ### AI Decodes Human Behaviour Like Never Before In this exciting episode of "The AI Show," we explore the intersection of artificial intelligence and psychology. Discover how AI is revolutionising our understanding of human behaviour, offering unprecedented insights into the mind. Our guest, John Scott, shares his expertise on how AI technologies at Culture AI decode complex human actions, emotions, and patterns. Join us for a deep dive into the future of AI and psychology. John Scott discusses the latest advancements in AI algorithms and how they are being utilised to analyse and interpret psychological data. He explains how these technologies are not only enhancing our comprehension of individual behaviours but also providing a broader understanding of societal trends and dynamics. This episode reveals the potential of AI to transform fields such as mental health, marketing, and social research. Additionally, we examine the ethical considerations and challenges that arise from the use of AI in psychological analysis. John provides insights into the importance of maintaining privacy, consent, and the need for responsible AI development. This thought-provoking conversation is essential for anyone interested in the cutting-edge applications of AI in understanding human behaviour and the implications it holds for the future.
### The AI Show - Episode 7 - Martin Taylor In this episode of the AI Show, host Emily Barrett engages in an insightful discussion with Martin Taylor about the transformative impact of artificial intelligence in contact centres. Together, they explore the evolution from traditional call-centric operations to the adoption of diverse communication channels, highlighting the significant changes that have revolutionised customer service. Martin Taylor delves into the intricacies of how AI leverages natural language processing (NLP) to enhance customer interactions, providing a more seamless and personalised experience. They discuss real-world applications, showcasing how AI-driven solutions are reshaping the way contact centres operate, improving efficiency, and customer satisfaction. This episode is a must-watch for anyone interested in the intersection of AI and customer service. Whether you're a professional in the industry or simply curious about the future of customer interactions, Emily and Martin's conversation offers valuable insights into the ongoing advancements and future possibilities of AI in contact centres. Tune in to gain a deeper understanding of how AI is redefining customer engagement and the innovative tools driving this change. ### Building a Cyber Empire with Strategies for Success Join us for an enlightening episode of our show, where host Philip Blake sits down with Amelia Hewitt to explore the multifaceted journey of launching both a consultancy and a nonprofit organisation. Amelia shares her personal challenges and triumphs, providing a candid look into the world of entrepreneurship. Throughout this engaging conversation, they delve into the complexities of balancing work-life dynamics, shedding light on the strategies that have helped Amelia maintain equilibrium in her professional and personal life. Additionally, the episode uncovers innovative approaches within the industry, offering practical advice and fresh perspectives that are invaluable for aspiring entrepreneurs. Whether you're an experienced business owner or just starting out, this episode is brimming with insights and advice that can help guide your entrepreneurial journey. Don’t miss this opportunity to learn from Amelia Hewitt's unique experiences and expertise. Tune in now for an episode packed with inspiration and knowledge! ### Mastering Cyber Defense with Teamwork and Threat Response Hosted by Philip Blake, this episode features special guest Jon France, CISSP, ChCSP from ISC2. Jon will share his path to becoming a Chief Information Security Officer, offering valuable insights and experiences from his journey. He will also discuss the vital importance of teamwork in responding to cyber attacks, emphasising how collaboration can enhance security measures and mitigate risks. Additionally, Jon will address strategies for tackling the growing cyber skills gap, providing practical advice for both aspiring cybersecurity professionals and organisations seeking to strengthen their defenses. Don't miss this engaging and informative discussion! ### Cloud Secrets and Staying Ahead in Tech with James Moore Join us for an exhilarating episode as Jez sits down with James Moore, a distinguished authority in cloud technologies, for an in-depth conversation. 
In this engaging discussion, they delve into a wide array of topics, offering unique insights into the future of cloud computing, the transformative impact of emerging technologies, and strategic advice on how businesses can maintain a competitive edge in an ever-evolving digital landscape. Throughout the episode, James Moore shares his expert perspectives on the latest trends and innovations shaping the cloud computing industry. Whether you are a technology enthusiast eager to stay abreast of the latest developments or a professional seeking to enhance your knowledge and skills, this episode is brimming with valuable information and actionable insights. Don't miss out on this opportunity to gain a deeper understanding of the dynamic world of cloud computing and learn how to navigate the complexities of this rapidly changing field. Tune in now and equip yourself with the knowledge you need to stay ahead in today's tech-driven world. ### Three ways to protect your business from cyberattacks Keeping on top of cyberattacks in the current digital age requires businesses to evolve regularly and keep their security practices up to date. Cyberattacks were listed as one of the biggest threats to the UK in the 2023 risk register, with 97 people falling victim to cyberattacks every hour. While security experts stress the importance of security solutions such as strong passwords and up-to-date software, it's always important to go a step further and take a more holistic approach to cybersecurity. Implementing holistic cybersecurity practices means businesses and their customers do not have to worry about security, which in turn builds trust and creates greater efficiencies across the board. Fortifying cybersecurity measures requires a real team effort from all levels of the business; employees must feel empowered to apply cybersecurity initiatives to their everyday work. As businesses look for ways to revamp their cybersecurity efforts, here are three things organisations can do to ensure they are working in a fully secure environment. Address the evolving security needs of SMBs Small business owners often find themselves particularly susceptible to security and privacy threats due to their limited resources, which often translates into inadequate cybersecurity measures compared with their much larger counterparts. Even if resources are limited, business owners can safeguard against security pitfalls by implementing well-defined security procedures in collaboration with their employees - making it everybody's responsibility. By adopting a shared responsibility model, smaller businesses can better scrutinise cybersecurity objectives and practices, helping them assess and combat evolving cyber threats. Active awareness and accountability begin with clearly defined roles and responsibilities. Further to this, SMBs should choose a hosting platform that provides round-the-clock cybersecurity support to quickly bring security threats under control. Without this support, SMBs will struggle to identify threats and will be slow to act in thwarting them. Develop a transparency model Every business should make transparency and communication a core value, as this is vital in earning customers' trust should any kind of incident or breach occur. For genuine trust to exist, customers must be able to understand company security processes and have insight into how the business handles their data and personal information.
By providing this kind of information, businesses can greatly empower customers to apply their own cybersecurity methods to their daily lives, saving them and businesses from potential future breaches. It's imperative to include security and trust as driving principles from the get-go. This approach encompasses all customers, investors, regulators and employees. Policies and initiatives, like security awareness training and ongoing compliance with industry regulations, can go a long way in strengthening security measures beyond data protection and incident response. Stay vigilant With the rise of remote and hybrid working practices, it's vital that businesses stay on top of their security and privacy. Outdated technology remains a huge issue for businesses, as undertaking large website enhancement procedures opens the door to human error and cyberattacks. Businesses need to recognise the importance of implementing technology that is capable of automatically detecting updates and executing secure backups. This will eliminate the need for manual maintenance and will allow teams to focus on other critical areas of the business. Furthermore, organisations can deploy AI-powered solutions to handle laborious tasks such as analysing data for anomalies, swiftly detecting malware and flagging abnormalities for cybersecurity teams to address. AI-powered technology helps to streamline and automate time-consuming processes, which can further enhance the team's overall efficiency. Over the past year, we have seen cybercriminals become more advanced in their tactics, and attacks have grown in sophistication, putting businesses under pressure to act quickly and effectively. Going into 2024, all organisations must place cybersecurity measures at the heart of their business, as this will be key to supporting business growth and success for the future. ### Data Sovereignty in AI: Five Key Considerations Data is the most valuable economic asset for modern companies and their greatest resource for growth and change. Consequently, safeguarding such data and adhering to the laws and standards of the nations where business operations are conducted is essential. Notably, several national authorities prohibit the storage of confidential personal or corporate information on servers beyond their legal and regulatory jurisdiction, including:
• European Union (EU) - General Data Protection Regulation (GDPR)
• Canada - Canadian Consumer Privacy Protection Act (CCPPA)
• Australia - Privacy Act and Australian Privacy Principles (APP)
• China - Personal Information Protection Law (PIPL)
• Russia - Federal Law on Personal Data
• India - Personal Data Protection Bill
This is the challenge of data sovereignty. The continued rise and prevalence of Artificial Intelligence (AI) adds to this challenge. AI is perhaps the world's biggest consumer of data outside of search engines and intelligence agencies. And unlike search engines, AI will be deployed inside a business and consume data previously considered off-limits by such technologies. Generative AI (GenAI) consumes data indiscriminately, and that data is often stored and processed at the AI companies' discretion, not their users'. AI services implemented by many business applications will need to limit the use of this data outside the data sovereignty boundaries, as required by the specific country's regulations. AI's exponential and ongoing growth collides with established and emerging privacy and data sovereignty regulations globally, such as GDPR and CCPPA.
Businesses must adhere to these policies and balance them with the ongoing business priority to compete using the latest data technologies available. Few businesses (if any) would voluntarily confine their data to a single regulatory domain just to make the most of available tools. In reality, most multinational businesses have data residency strategies to store and use data in multiple countries and regions to serve customers and employees better. However, physically locating information in different regions and locations means it might become subject to other data protection laws. Fortunately, competitive and regulatory balance is achievable for organisations with the right blend of mindset, policy, and tools. Here are five things you must consider to protect data sovereignty as your organisation integrates AI: Education and Understanding Ensure data sovereignty issues are front and centre throughout the company and understood and prioritised by everyone. Everyone who creates systems that use or modify data must understand the fundamentals of data sovereignty, which means understanding the business risk of not following policy. It should be an absolute priority to educate employees about data sovereignty and why it matters. Know Your Data Data grows as businesses expand, and as it grows, it fragments and ends up in silos. According to Salesforce, the average enterprise now has 1,061 different applications, although only one-third of them are connected. Salesforce also found that it typically takes 35 applications to support a single customer interaction. You must know your data. Once employees understand the importance of data sovereignty, the organisation can create and maintain an inventory of the company's data. In addition to knowing what is in the data, organisations must know which vendors act on their data. Internal Communication It's essential to be fluent in regional data residency laws and make every effort to comply. The company's governance team must understand what is in the data, its structure, and which vendors it uses to process it. To protect the company from data sovereignty challenges, you must have systems to manage anonymisation and pseudonymisation when partnering with other companies that process data. Close Collaboration with Vendors Enforce vendor compliance. The governance team must work with external vendors to ensure they have specific provisions that comply and align with the company's policies. As data sovereignty regulations expand, vendors must deliver data services in compliance with the regulatory frameworks of the regions where the business operates. Data Unification and AI To ensure data sovereignty compliance, you must have clean, connected, trusted data. Today's tools and modern technologies leverage AI/ML capabilities to speed up and enhance data unification, ensuring your data is internally consistent. Data unification tools can work more expansively and faster than a manual process. Today's modern master data management (MDM) tools also leverage AI to detect data leakage - when data leaves, or is at risk of leaving, the borders that policy restricts it to. Modern MDM can also trace the full lineage of data: the sources it came from, who contributed changes to it (a particular system or user), who consumed it, and when these actions were taken. Provenance is an essential capability for managing data products.
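To illustrate the leakage-detection and lineage idea described above, here is a small, self-contained Python sketch; the region policies, record fields and categories are invented for illustration, and this is a toy view of the concept rather than a description of any particular MDM product.

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical residency policy: which regions each data category may be stored in.
RESIDENCY_POLICY = {
    "eu_customer_pii": {"eu-west", "eu-central"},
    "us_billing": {"us-east", "us-west"},
}

@dataclass
class LineageEntry:
    record_id: str
    actor: str
    action: str
    target_region: str
    allowed: bool
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

lineage_log: list[LineageEntry] = []

def check_transfer(record_id: str, category: str, target_region: str, actor: str) -> bool:
    """Return True if the transfer is allowed; always record a lineage entry."""
    allowed = target_region in RESIDENCY_POLICY.get(category, set())
    lineage_log.append(
        LineageEntry(record_id, actor, "transfer_requested", target_region, allowed)
    )
    return allowed

# Example: an EU customer record requested by a US-hosted analytics job.
if not check_transfer("cust-0042", "eu_customer_pii", "us-east", actor="analytics-job-17"):
    print("Blocked: record cust-0042 must stay within", RESIDENCY_POLICY["eu_customer_pii"])
print(lineage_log[-1])
```

The point of the sketch is simply that every access decision leaves a provenance record and every cross-border move is checked against policy before it happens.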
AI-powered MDM tools can use pattern recognition to spot when a business is leaking personally identifiable information (PII) or other information that needs to be kept within borders. Healthcare Data Healthcare systems, such as the NHS, naturally face some of the most restrictive privacy and sovereignty policies, given the nature of the data they work with. Many of today's healthcare management systems show that it is possible to have flexible and robust data management systems that remain compliant, provided the systems are designed with data management policies in mind from the outset. For example, in a healthcare system, each customer (patient) can get a client ID kept locally; any system that needs to attach data to a patient uses only that ID. When data is processed, for billing or analytics or other reasons, the PII stays put; only the necessary information is transmitted. Prioritise Speed, Agility and Geography With the right tools and technology partner, a company does not have to slow down to ensure compliance with its own or governmental policies. Instead, the correct tools, approach, and mindset let companies act quickly and seamlessly on emerging issues, such as flagging pop-up data programmes that are against policy before they become entrenched - and while the team is still active on the project and can modify it for compliance. Data sovereignty regulations vary by country, and they can be incredibly detailed and complex - spanning data privacy, data localisation, data residency, and more. Working with a partner that can manage data in compliance with local privacy regulations around the globe - from Asia-Pacific to North America to Europe - is essential for every global organisation today. ### Streamlining Cloud Management with Terraform Functions The multi-cloud deployment trend is making infrastructure management more complex. It requires investment in new tools and training to establish efficient and secure infrastructure. Security, in particular, is crucial to take into account when embracing this trend, as vulnerabilities in the cloud can be difficult to detect and counter. One of the popular tools for cloud infrastructure management is Terraform. This open-source infrastructure-as-code (IaC) tool developed by HashiCorp enables the secure and efficient building, modification, and versioning of cloud and on-prem resources. It is a powerful and flexible tool that comes with functions that automate vital tasks in infrastructure management and provisioning, and these functions are especially useful in streamlining cloud infrastructure management. Terraform's Key Functions Terraform functions are built-in features that aid data manipulation and management in configurations to ensure smooth data transformations and infrastructure provisioning. There are many of these functions, but they can be grouped into categories - namely the string, numeric, collection, encoding, filesystem, date and time, hash and crypto, IP network, and type conversion functions, as explained in this guide. Each of these categories has a multitude of supported functions, each used for specific purposes. For the purposes of this article, we will not exhaustively describe the role of each function in streamlining infrastructure management. The goal here is to explain how Terraform functions make IaC provisioning and management more efficient in key use cases: string concatenation, string splitting, tag combination, data type conversion, and the implementation of conditional logic.
String Concatenation Concatenation refers to the linking of two strings to generate resource names. This usually entails appending environment names to base names to come up with unique resource names. Concatenating strings is useful for meaningful, dynamic resource naming. It results in resource names such as "app-server-prod", for example, wherein "app-server" is the base name and "prod" is the environment name. Through this dynamic way of naming resources, it becomes easy to generate unique names that make sense, as the names provide hints as to what the resource is. Another important use of string concatenation is the parameterisation of Terraform configurations. This lessens configuration hard-coding and supports the writing of modular infrastructure code. For example, in configuring the instance name, the strings for the project name and region can be concatenated, resulting in a configuration that can be used with different parameters. Additionally, the concatenation of strings is useful in ensuring consistency in naming conventions. Consistent naming is especially crucial when managing large-scale deployments. It facilitates the faster monitoring and auditing of resources. String Splitting The opposite of string concatenation, string splitting means dividing a delimited string, such as a comma-separated list of values, into individual items. This function is important in parsing and handling complex data. It breaks down complex strings into basic components for efficient data processing and manipulation. It can turn complex data into lists and maps that can then be subjected to further operations. Similar to concatenation, string splitting also plays an important role in dynamic configurations and the modularity and reusability of configurations. It provides a way to pull data from different data groups or complex strings to produce unique names or point to specific instances that are supposed to be provisioned. Tag Combining Tags are vital in the management and organisation of resources in cloud infrastructure. They simplify resource tracking and reporting across different projects and environments. The ability to combine or merge tags helps streamline infrastructure management, because it enables consistency. It ensures that all resources follow a standardised set of tags, making them easier to track and manage. Combined with tag-based filtering, this makes it possible to list all the instances, resources, and other components that bear common or specific tags. When used effectively, tags provide an efficient way to categorise and filter resources. As such, tag merging helps simplify the management and governance of resources. This is helpful in the implementation of policies and compliance with requirements. Also, it supports easier reporting and auditing, flexible and modular configuration, and the automated application of tags across different resources. Data Type Conversion There are different types of data in cloud infrastructure management, from resource definitions to configuration values and variables. The conversion of data types streamlines infrastructure management by enabling interoperability with different systems and tools. It ensures that data is correctly interpreted across different systems. Also, data conversion supports dynamic and flexible configurations. This makes it possible to utilise variables and outputs, for example, in different contexts by having them converted to the suitable data type for a specific operation.
For example, ports presented as strings can be converted to number data when configuring a security group rule. Moreover, the conversion of data types is valuable for data integrity and accuracy. It is not uncommon for configurations to contain incompatible data types, so it is important to convert data to the right type to make sure it is interpreted correctly and does not create inconsistencies and errors. Implementing Conditional Logic Conditional logic is about using programming constructs to automate decision-making based on specific criteria or conditions. It can take the form of simple "if-else" statements, count parameters, or conditional expressions in resource attributes. It is important in streamlining infrastructure management for a number of reasons. First, conditional logic enables the creation of configurations that are adaptable to a variety of scenarios. It supports dynamic resource provisioning, such as choosing an instance type according to the environment to ensure correct resource allocation. Next, conditional logic enables the creation of generalised templates that can adapt to different inputs. This leads to configuration flexibility and reusability, supporting the easier management and maintenance of infrastructure across different environments and projects. It also means fewer configuration files, as conditional logic makes it possible to manage variations of related configurations through a single configuration file. Additionally, conditional logic supports automation and CI/CD integration. Its ability to create configurations that adapt dynamically to different environments or conditions, including different stages in the deployment pipeline, ensures the right resources and configurations are applied at every stage. Moreover, conditional logic plays a role in the consistent application of security policies and compliance with regulatory requirements. For example, server-side encryption can be applied conditionally, with the condition derived from internal security policy, best practices, and regulatory requirements. The purposeful implementation of conditional logic provides organisations with a way to systematically secure their infrastructure and keep up with regulations. Using Terraform for Cloud Infrastructure Management Effective infrastructure management is a must amid the growing complexities of cloud and hybrid environments. Terraform is one of the best ways to achieve effective cloud infrastructure management through infrastructure-as-code. IaC management teams would be remiss not to take advantage of the different functions that come with Terraform. Incorporating these functions into infrastructure management workflows results in a streamlined process with fewer errors and robust security. ### Defending Health and Social Care from Cyber Attacks The National Cyber Security Centre (NCSC) recently calculated that Health and Social Care is the fifth most attacked sector by cyber-criminals in the UK. Recent instances, such as the cyber attack on the University of Manchester which led to over one million NHS patients' data being compromised, further prove the case for enhanced security measures. Such attacks on businesses cause enormous disruption, but in the case of care businesses, the consequences can be life threatening. The inability of a carer to access a service user's data can lead to missed medication or missed care provision - it can rapidly escalate to a safeguarding situation.
Particularly in the case of direct care information, where data privacy and security are so crucial, cyber attacks will likely always remain a risk. It is essential that healthcare and health tech software providers continuously monitor, update, and improve their technology to ensure that a breach does not occur. What measures must be taken? There are several measures home care agencies can take to help combat being victims of cyber attacks. Picking a suitable software solution is a key element in mitigating any potential cyber attacks, particularly if they’re hosting your data, but remember that they’re a black box. You don’t know exactly what’s going on past your interactions with the application. There are security standards available which can give you reassurance that a provider is operating in a secure manner. The NHS Data Security and Protection Toolkit (NHS DSPT) is a great place to start; it’s a self-assessment programme largely based on the ISO27001 standards and has special affordances for healthcare. Digital Social Care recommends being NHS DSPT compliant to all CQC registered care providers, and it’s a requirement if you deliver services under an NHS contract. They have great advice and guidance for meeting the standards and becoming conformant. However, many may dismiss these as only being applicable to technology companies. The truth is that cyber security incidents can occur at any step in the process, whether it’s a virus spread via email, sending sensitive information to the wrong individual, or someone managing to get physical access to your computer. As general advice, good cyber security practices stem from a defensive way of thinking, posing questions such as: “can this email be trusted?”; “could my password be easily guessed if someone knows me?”; “who else could possibly use my computer?”. Protecting client data When it comes to ensuring that customers’ data is secure, there should be numerous measures in place. I could speak at length on the many different systems to have in place, but I’ll try not to get carried away. Naturally, any software provider must encourage good security practices through the platform itself, such as data encryption both in transit and at rest, multi-factor authentication, and a comprehensive role-based access control system to provide additional restrictions to the viewing and modification of data by authorised users. The provider should take on the responsibility of securing the platform for customers, so they don’t have to worry about managing their own servers or engaging with a third-party IT company to do it for them. Any trustworthy software provider will take care of firewalls, intrusion detection, and encryption, as well as protections and mitigations against many other common attack vectors. All databases should have point-in-time-recovery enabled, which differs slightly to conventional nightly backups. It allows the provider to restore to any point in time within the backup retention period. Backups must all be replicated to multiple locations, and secured by different credentials. Services should also have a failover mechanism, where if there’s an issue with the underlying server, the provider can switch to a stand-by instance that has a full copy of the data. Employing proactive vulnerability testing, as well as periodic penetration testing, can be helpful in scanning for emerging threats. And lastly, have a plan. 
All software providers should implement a comprehensive disaster recovery plan which covers backups and restoration and is regularly tested, so that in the event of an issue, you can be confident about what actions to take in order to mitigate the fallout. These measures can minimise the impact of a data breach and ensure a swift recovery without compromising client data. ### How is AI changing patch management? Cyberattacks are increasing in frequency and severity. This means maintaining the security of IT systems is more critical now than ever - and doing so effectively requires innovative solutions. As a result, many companies are turning to AI - which is set to revolutionise patch management. Effective patch management is a time and resource-intensive task, and as a result, many businesses struggle to keep pace with the sheer quantity of patches released by software suppliers. It's a case of balancing the need to patch quickly to stay secure against the need to maintain system stability, which can be especially daunting for businesses with a large and complex IT infrastructure. How can AI streamline patch management? AI brings with it the capacity to process vast amounts of data, analyse patterns within this data and support informed decision-making. AI can streamline processes and reduce the time and resources required to keep up with patch management and ensure security. This enables businesses to allocate their IT resources more strategically so that they're centred around activities that require human input. AI in patch management offers a number of benefits, including:
• Continuous monitoring - 24/7 monitoring of your systems to identify any new vulnerabilities or patches, reducing the window of exposure to potential threats.
• Predictive analysis - the ability to predict the impact of a patch on a system's performance before it's implemented, reducing the risk of downtime or unintended consequences.
• Customised patch prioritisation - assessing an organisation's unique IT system and applying this understanding to prioritise patches based on their potential impact, criticality and relevance.
• Automated vulnerability detection - applying historical data to predict the system vulnerabilities most likely to be exploited, to inform patch prioritisation.
• Automated testing - testing the impact of patches before they're applied to ensure they don't conflict with existing software and create unexpected issues elsewhere in the system.
• Adaptive learning - learning from past patch management processes to improve the efficiency and effectiveness of an organisation's patch management strategies.
Potential challenges of AI in patch management However, using AI in patch management doesn't come without its challenges. The primary one, as is the case wherever AI is implemented, is that the technology comes with a steep learning curve. There's a need for new skills to understand how, where and when to apply AI for patch management. This brings the need for investment in outsourced expertise or training to upskill existing team members. There are also ethical considerations to be made, especially if the technology is being used to make autonomous decisions about patch prioritisation and implementation. Additionally, as anyone who's used AI in the past will know, it's not perfect. Its predictions aren't always accurate, and it may not always correctly predict the impact of patches, which can lead to false positives or negatives within the system.
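As a concrete illustration of the patch prioritisation idea described above, here is a minimal Python sketch; the scoring weights, field names and sample patches are invented for illustration and are not drawn from any specific product.

```python
from dataclasses import dataclass

@dataclass
class Patch:
    name: str
    cvss_score: float         # severity of the vulnerability it fixes (0-10)
    asset_criticality: float  # how important the affected system is (0-1)
    exploit_observed: bool    # whether exploitation has been seen in the wild

def priority(patch: Patch) -> float:
    """Toy heuristic: weight severity by asset importance and boost
    patches for vulnerabilities that are already being exploited."""
    score = patch.cvss_score * (0.5 + 0.5 * patch.asset_criticality)
    if patch.exploit_observed:
        score *= 1.5
    return round(score, 2)

backlog = [
    Patch("web-server-tls-update", cvss_score=7.5, asset_criticality=0.9, exploit_observed=True),
    Patch("internal-wiki-minor-fix", cvss_score=4.0, asset_criticality=0.2, exploit_observed=False),
    Patch("db-engine-privilege-fix", cvss_score=8.8, asset_criticality=0.8, exploit_observed=False),
]

# Rank the backlog so the most urgent patches surface first.
for patch in sorted(backlog, key=priority, reverse=True):
    print(f"{priority(patch):>6}  {patch.name}")
```

A real AI-assisted tool would learn weights like these from historical outcomes rather than hard-coding them, but the output is the same in spirit: a ranked backlog that tells a stretched IT team where to start.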
The future of AI in patch management As AI continues to evolve, its impact on patch management will become increasingly prevalent. However, when considering how to use AI in this way, it’s important for organisations to consider the IT infrastructure’s complexity, AI training requirements and the balance between automation and human oversight. Effective patch management isn’t easy – and organisations need to find a balance between trying to keep up without automation and applying AI too quickly and getting it wrong. There’s a need to have someone with dedicated expertise – whether internal or external – who can help oversee and manage AI’s implementation to maximise its potential for patch management. AI has the potential to completely transform patch management from a resource-intensive task to an automated, data-driven process. When applied carefully, AI can ultimately help organisations to enhance their security and optimise their IT resources. ### AI Readiness - Harnessing the Power of Data and AI Join us for a compelling session on "AI Readiness: Harnessing the Power of Data and AI," featuring industry experts from Ataccama and Microsoft. This session will explore how organizations can leverage artificial intelligence to achieve their goals and create value. The panel will discuss the pivotal role of high quality data in unlocking the full potential of AI for business transformation, and why developing an AI governance strategy to establish guardrails for AI usage is fundamental. ### CIF Presents TWF – Dion Hinchcliffe In this Episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), we'll be talking about the new AI era, how it impacts the future of work, and how CIO's should be dealing with this new challenge, with a well known, well respected, and influential analyst. Our host and CIF CEO David Terrar will be interviewing Dion Hinchcliffe of Futurum. We'll be discussing how you navigate these rapid shifts, about the impact of AI-driven development platforms, and about what's coming next. ### Futureproofing Identity Security with AI and ML The field of Identity Security is undergoing a technological evolution, moving from repetitive admin work to automated, more secure solutions. Gone are the days when IT departments were required to manually look after who could access their organisation’s infrastructure and data, and often using outdated systems. Identity Security has reached such a level of maturity that it is now pushing the boundaries of innovation. Encompassing data science, artificial intelligence (AI) and machine learning (ML) it is shaping how cybersecurity controls are implemented across entire organisations, as well as delivering an enhanced user experience. From legacy solutions to the cloud With traditional solutions, setting up a new employee network profile necessitated a series of tedious manual approvals, each of which needed to be multiplied by the number of systems and applications the individual needed to access. Any delays experienced when granting these access rights equate to lost productivity. It’s not only the onboarding process that is long-winded, re-certifying access to fulfil regular compliance requirements compounds the problem, representing another management overhead. To overcome these challenges, many enterprises have begun to adopt cloud-based user provisioning solutions, which can offer greater flexibility and scalability. Now, the industry is about to embark on a new era of Identity Security, being driven by five key forces. 
The Five Forces Now Shaping Identity Security 1: Utilisation of AI One of AI's advantages is that the technology is excellent at undertaking automated and repetitive tasks and processes. One of these processes is the annual entitlement recertification needed to meet regulatory requirements, which can benefit tremendously from automation introduced by AI and/or ML. Another area where AI excels is monitoring. Implementing AI-powered behavioural profiling reduces the need for human oversight and minimises approval times. Analysing behavioural data also enables organisations to enforce least privilege access, which improves user experience, decreases operational costs and enhances risk management. User and Entity Behaviour Analytics (UEBA) pairs big data analytics with artificial intelligence to analyse user behavioural data in order to spot patterns and anomalies, gathering valuable insights to empower better decision-making. 2: Increased capability of behavioural models Setting parameters for normal user behaviour is of utmost importance for the functionality and efficiency of an Identity Security programme. Once these parameters have been established, granting or revoking application or system access can be fully automated. Accelerated by computing power, these decisions take only milliseconds and draw on information streamed from data architectures, data lakes, and cloud computing resources. High-capacity behavioural models can also protect against the impact of weak or recurring passwords, which are common when users are required to set passwords for multiple systems. Instead of relying on access authentication through conventional Multi-Factor Authentication (MFA) models, authentication becomes a continuous process. 3: The rise of data science skills With an increasing overlap between the fields of Identity Security and data science, data analytics skills are growing in value. This could be challenging for individuals who honed their skills on traditional Identity Security methods, with little experience in analysing behavioural patterns. These team members are increasingly reliant on their CISOs to re-skill them or at least guide them to areas where they can still contribute. The rise of data science also presents the opportunity for an organisation to attract top talent. By offering roles that implement and support data-driven Identity Security programmes, businesses can create centres of excellence where highly skilled people want to work, as well as ensure that the innovative techniques are deployed optimally. 4: CISOs are now the driving force for Identity Security Identity Security should be a key pillar in every organisation's cybersecurity strategy, so it makes sense that more and more CISOs are taking closer ownership of these programmes. Their approach to programme management should be Plan, Build, Run, with separate teams responsible for planning (providing the blueprint for Identity Security operations and identifying which changes the organisation can benefit from), building (designing the workflows and analytics) and, finally, running the operations on a day-to-day basis. 5: Increased oversight from the board Presenting the business case for any project is a challenge, but in the case of Identity Security, there are clear benefits to communicate to the board.
Firstly, it lowers operational costs as automation and data science replace manual processes, although some savings might be offset by the cost of hiring highly specialised talent. Secondly, it boosts productivity by reducing decision-making time, typically by 10 to 25 per cent. It also ensures employee provisioning and de-provisioning take place within hours instead of days. Conclusion Transforming an organisation’s Identity Security programme will not only drive security best practices, it will do so at reduced cost, as the increased use of automation and data science techniques will replace expensive and outdated labour-intensive processes. Such initiatives don’t merely benefit the enterprise, they also provide an opportunity for Identity Security professionals to extend their skill sets, and for forward-thinking companies to attract top talent. ### The Critical Convergence of OT and IT Cybersecurity In today’s increasingly digitised world, the threat of cyberattacks involving operational technology (OT) systems is increasing, becoming more apparent and frequent. Whilst cyberattacks of any nature can be detrimental, attacks associated with OT systems can be even more catastrophic, as these breaches can often threaten the integrity of information, as well as the functionality and safety of a building. The influx of Internet of Things (IoT) devices and connected systems, as well as the escalating complexity of securing OT systems, are some of the main challenges faced by businesses when it comes to OT cyberattacks. As such, it is imperative for businesses to prioritise the security of these critical systems. Rise in OT attacks The rise of cyberattacks, especially in the realm of OT and information technology (IT) systems, has become a critical concern in recent years. As technology advances, so do the tactics of malicious actors, and with critical infrastructure becoming increasingly digitised, threat actors have been exploiting vulnerabilities within many businesses’ infrastructure. The seriousness of this nature of attack was highlighted in September 2023, when two casinos in Las Vegas were victims to an IT cybersecurity breach. This breach caused slot machines to go down and rendered hotel cards useless, among other disruptions. Although this example demonstrates the severe consequences of any cybersecurity attack, it is critical for businesses to recognise that any attack associated with OT can be catastrophic and can mean loss of business and data for victims. Unlike IT cybersecurity attacks – which primarily endanger data – OT cybersecurity breaches threaten the integrity of information, along with the functionality and safety of buildings. For instance, in commercial office spaces, OT cybersecurity attacks could compromise building automation systems, leading to disruptions in heating, ventilation, air conditioning (HVAC) systems, lighting, or even access controls. Malicious actors could manipulate smart building technologies, causing safety concerns and operational inefficiencies, especially if authorised personnel cannot gain access to the building. In these cases, operations cease, and entire systems can come to a standstill. Given the growing complexity of securing OT systems – due to the surge in industrial IoT devices – it is crucially important for businesses to prioritise the security of these systems. 
A layered security approach When addressing OT cybersecurity attacks in buildings, it is imperative for businesses to recognise that buildings are not just physical structures; they also serve as vital hubs of modern life, encompassing offices, entertainment facilities, factories, hospitals and more, making them attractive targets for cyber threats. Buildings now host various smart systems, including access control, HVAC systems, and lighting. While this digital transformation enhances efficiency, it also introduces vulnerabilities. As the lines blur between the physical and digital worlds, attackers exploit new avenues to infiltrate these spaces. In attempts to make buildings more resistant to cyber threats, building operators should follow a step-by-step security plan. First, they need to check all systems in the building and see how they work together. After understanding these systems, it becomes possible to identify weak points and make a plan, which should include utilising appropriate cybersecurity applications that help to protect critical assets within the building. Operators can then begin to install cybersecurity monitoring and create an incident readiness plan to be prepared for future incidents. This combination of monitoring and readiness provides real-time threat detection and response capabilities, bolstering a building's defence against cyberattacks. Nonetheless, while these measures are just a few steps building owners and operators can undertake to reduce vulnerabilities and the potential impact of attacks, each plan will vary based on the building's security requirements and budget. A secure future Addressing the convergence of OT and IT cybersecurity in the built environment demands a thorough strategy. It is essential to recognise that buildings are not just physical spaces, but intricate digital ecosystems. Looking ahead, OT security calls for a comprehensive approach that integrates IT practices and building management. This alignment is crucial to fortify the security of the spaces we depend on, emphasising the need for a holistic perspective in safeguarding our built environments. ### Exploring the human side of DevOps metrics Labels are an important part of the DevOps armoury. They help guide developers, streamline processes, and ensure seamless collaboration. They simplify large quantities of information, enabling DevOps teams to use performance metrics as part of the software development pipeline. Metric assessments like DevOps Research and Assessment (DORA) have been designed to provide DevOps teams with the essential data required to have visibility and control over the development pipeline. But labels don't always paint the complete picture. Yes, they can capture information such as quantitative metrics, but they're not as adept at gathering information on DevOps culture, such as collaboration and communication. Nor are labels the best method to ensure that a team is aligned with broader organisational goals. The reality is simple: not all teams are built the same, which means they shouldn't be measured in the same way. Take the issue of team capacity, which is influenced by factors critical to understanding the capabilities and constraints for effective software development. It goes without saying that traditional DevOps metrics offer valuable insights into deployment frequency and lead time. But there's a problem.
Sometimes, these metrics simply fall short and fail to deliver— especially when it comes to capturing the reality of how IT teams work on the ground. After all, modern IT environments are more complicated now than ever before, especially since more and more organisations are balancing a mixture of on-prem, cloud, and hybrid environments. While the use of AIOps has helped to address this increasing complexity, DevOps teams are often pulled in many directions. Another issue is that DevOps metrics don’t always account for the size, scale or scope of individual organisations and their DevOps teams. This means that ‘small but mighty’ teams—those that punch above their weight and are very efficient given their scale—can often be evaluated poorly if only focusing on performance metrics. Aligning goals Another key component that traditional performance metrics sometimes fail to capture is the alignment of their work with the overall objectives and vision of the organisation. Metrics such as ‘change failure rate’ and ‘time to restore’ services are important but may not provide the level of nuanced understanding needed in terms of common business outcomes. For instance, many DevOps team members spend considerable blocks of time identifying and fixing minor issues across applications and services. While AIOps and observability can help IT teams by automating the detection and resolution of service and application problems, DevOps teams are still largely on the ground troubleshooting these situations. And while this may be achieved quickly, it still eats into a team’s ability to build and deploy the innovative solutions that customers need. One way around this is to ensure that organisations complement quantitative metrics with qualitative assessments and other feedback. That way, they can foster a more ‘holistic approach’ that considers both operational efficiency and strategic coherence within the DevOps landscape. The case for measuring customer satisfaction There’s another thing worth considering as well. Alongside traditional DevOps performance metrics, organisations should also consider customer satisfaction as a measure of performance. While customer satisfaction is multifaceted and complex, influenced by factors beyond the scope of traditional DevOps performance metrics, it is a good way to gauge the user experience. For this reason, it is arguably one of the most important metrics available. Perhaps that explains why organisations looking for a more rounded understanding of their DevOps teams’ impact are casting their net further. Instead of solely relying on traditional metrics, they’re supplementing them with customer-centric feedback mechanisms, user surveys and qualitative assessments to ensure that the development processes align with—and enhance—customer satisfaction. Supporting DevOps teams Whatever your perspective, organisations need to adopt a balanced and inclusive approach to evaluating performance. While it would be great to be able to use a single set of performance metrics for all functions and teams, a one-size-fits-all approach simply does not work. Thankfully, there are a number of ways organisations can begin to improve the lives of DevOps teams while also achieving high performance in the standards of traditional metrics. For instance, integrating observability solutions can allow teams to scale their work while also being able to more easily identify and resolve problems as they arise. 
Not only is this a more efficient and productive approach, but it can also help free up time for smaller teams. Elsewhere, observability can also be used to help measure and provide insights into end-user experience. The benefit of this approach is that it helps to ensure that the needs of customers, DevOps teams and the organisations they represent are aligned. Recognising the human element in the work done by DevOps teams, and supporting team members through professional development opportunities, can significantly contribute to a positive culture at both individual and organisational levels. This has a broader impact across the entire DevOps community. ### AI for Workplace Efficiency and Security Enhancement Artificial intelligence (AI) is rapidly transforming workplaces around the world. From routine administrative tasks to complex decision-making, AI has already automated and enhanced many business operations for those who have been savvy enough to implement it into their structure. For many, however, introducing AI to their tech stack requires careful planning to maximise benefits and minimise risks. AI is now used by organisations to make smarter and more data-driven decisions, and the benefits of AI in the workplace are clear. One powerful solution, announced in 2023, is Copilot for Microsoft 365, which merges the power of Generative AI (Gen AI) into your Microsoft 365 apps to create an AI assistant for your everyday work. Together with the data in your M365 tenant, it offers an intelligent, secure solution aimed at improving efficiency and productivity. It’s clear that AI has a host of benefits to offer the modern workplace. However, the question remains: how can businesses draw out the full potential of their AI investment? AI - Beyond simply an assistant We all wish we had a personal assistant. Traditionally, this was something that came with seniority and success, but AI now makes that more achievable for most of us than it ever was before. Microsoft's Copilot stands out as a comprehensive embodiment of Gen AI seamlessly integrated into the workplace, acting as your constant and personal support. Unlike a standalone product, Copilot functions as an embedded assistant—a contemporary take on the familiar Clippy of the late-90s Office suite, albeit with power its 90s counterpart could never have imagined. There are many Copilots across the range of Microsoft technologies, addressing diverse aspects of how we work and leveraging the data that we and our colleagues generate. This nuanced approach, tailoring AI to specific requirements, amplifies the value of its integration into our daily lives and work. However, the successful optimisation of AI in the workplace hinges on various elements, with one significant factor being employee engagement with the technology: if you are paying for an AI assistant on a per-user-per-month basis, each user needs to regularly make valuable use of that assistance, otherwise there is no business case for investment. We all need to overturn career-long muscle memory of how we work to exploit the capabilities of Copilot, but if we can truly adapt, Copilot can unleash our creativity, unlock our productivity and up-skill us to perform tasks that we were previously unqualified for. At a senior level, AI digests business plans and strategies, providing valuable insights and suggesting objectives to enhance business decisions and foster growth. 
At a more grass-roots level, AI can help every one of us to get our daily tasks done more quickly, and to a higher standard. Transforming IT from within Integrating AI into the workplace goes beyond improving operational efficiency; it significantly enhances the employee experience by automating time-consuming IT maintenance tasks, such as CRM and ERP administration, that often deplete user time and engagement. By delegating these responsibilities to AI, employees can redirect their focus towards innovation and more creative and valuable endeavours. The advantages extend beyond mere employee satisfaction: AI implementation delivers real savings of time in critical business processes. This includes helping to respond to client emails efficiently and delivering post-meeting analysis and summaries, creating tangible benefits that strengthen the business case for investment. What makes this implementation even more appealing is that it doesn't demand intricate expert knowledge. Organisations can leverage natural language AI and Copilot to construct customised automation flows that seamlessly integrate into enterprise-wide processes. This aligns precisely with specific business needs without any need for code or analysis beyond that which the end-user can provide. Eagerness with caution AI in general and AI assistants like Copilot in particular have the potential to change the way that we work and to deliver significant value to our organisations. At the same time, it is prudent to be mindful of what data our users have access to, even if it has not previously reared its head as a concern. We also need to carefully consider the implications for data security and data privacy laws when embracing the power of AI. The good news is that Copilot for Microsoft 365 has some excellent built-in guardrails: ask the question “What is the CEO’s salary?” and its answer will be “I won’t tell you that!” not “I can’t find that information.” Most of us look after data and document landscapes that have grown organically over time. They all likely contain something that has been accidentally over-shared, or sensitive documents that users have not previously realised they have access to. It’s a good idea to perform an audit of your documents and data before organisation-wide implementation, but don’t let this stop you from experimenting with the value of Copilot or similar tools: organisations that are successfully leveraging AI and AI assistants start by exploring their value with a discrete set of users while simultaneously auditing their document permissions. Often these users sign an additional NDA committing them to report any sensitive information that they find, and not to share it. The important thing is not to quell eagerness with caution, but to encourage the eagerness of individuals and the organisation whilst better preparing for and understanding any risk involved. AI: Your Cutting-Edge Cyber Sentry AI helps workplaces to improve their security systems against advanced threats. It offers a new way to enhance organisational security. By using natural language inputs, AI creates tailored responses based on insights from internal and external data. This allows organisations to deal quickly and accurately with current incidents, detect new threats, and prevent future risks. 
For example, developers working with Microsoft Defender for Endpoint to detect and respond to security threats can benefit from Copilot’s ability to create code to automate and streamline security workflows. Similarly, users of Azure Security Centre can deploy Copilot to generate code snippets which expedite the automated deployment of security controls. The efficiency gains are substantial – Copilot enables organisations to defend against threats at machine speed, achieving in minutes what may previously have taken hours to accomplish. In a constantly evolving security landscape, AI-driven solutions such as Security Copilot are now indispensable assets, reinforcing the resilience of workplaces against ever-evolving risks. And – unpalatable though it may be to consider – threat actors are also increasingly leveraging the power of AI to assist them in their activities. It is incumbent on those of us who wish to keep our organisations, people and data secure from such forces to allow our defences and security solutions to evolve at the same rate. Security Copilot answers this need. AI: beyond a simple tool AI is reshaping workplaces globally, providing a range of benefits from streamlined operations to enhanced decision-making. Beyond being a simple tool, AI emerges as a personalised assistant for every employee, seamlessly integrated into daily workflows. However, this enthusiasm warrants caution. Organisations must assess readiness, define clear objectives, and optimise cloud technologies before embarking on ambitious AI transformation. The realisation of AI's potential hinges on thoughtful planning, enthusiastic yet careful and continuous adaptation, and a commitment to leveraging its capabilities for sustainable growth and innovation in the long term. ### Reducing LLM hallucinations with trusted data Large language models (LLMs) have a range of uses, but LLM or AI hallucinations pose serious risks to organisations and can lead to mistrust amongst consumers. LLMs are models trained to predict the next word based on patterns in their training data; they are not “intelligent” or thinking beings as sometimes portrayed in popular media. Models can generate outputs that seem factual but are wholly fabricated. Not only that, the data that is used to train these models may also be of low quality, as LLMs cannot conduct any fact-checking on their own. Consequently, with faulty inputs, end users cannot confirm whether the model provides trusted information. So, how can organisations address AI hallucinations? With today’s technology, it is impossible to entirely eliminate hallucinations, although organisations can significantly reduce them by improving the quality of the data used to train the models. One strategy that helps mitigate the problem is applying a robust data unification framework. This approach improves data quality and trustworthiness, creating a solid foundation for responsible AI. Implementing this strategy should follow a series of phases and steps. Tailor LLMs for business-specific knowledge As organisations use LLMs to enhance their operations and support customer service, it is critical for them to reduce hallucinations. Businesses should review the data used to train and ground the model and introduce sources that can produce personalised, business-specific responses. 
This can be done through Retrieval Augmented Generation (RAG), which involves retrieving information from a database of company knowledge and feeding it to the LLM so it can give informed responses. However, RAG alone will not be effective in suppressing AI hallucinations. To ensure that the RAG implementation is successful, the data used in the system should be of high quality, accurate, complete, and relevant. As such, businesses must invest in a robust data management and unification system, which includes quality controls and the ability to make real-time updates. Further, organisations can use graph augmentation. This involves utilising a structured knowledge graph of enterprise-wide entities and relationships within a business, which enables company-specific terminology and facts to be included in the outputs. Similar to RAG, the effectiveness of graph augmentation is dependent upon the quality of the data used, but it adds another layer of control and assurance to the quality of AI-powered responses. Leveraging trusted and unified data for LLMs Modern cloud-native data unification and management approaches are key when addressing the challenges of training LLMs. Some of these approaches include master data management (MDM), entity resolution and data products; collectively, these are the essential parts of creating business-critical core data. By unifying data from a number of sources and feeding it directly to LLMs in a constant, real-time and accurate way, these systems become key pillars in ensuring that the models produce reliable outputs. One of the essential aspects of creating a robust data unification and management solution is leveraging canonical data models. As these models allow businesses to unify data from several sources across different entities seamlessly, they create consistency and accuracy. It is also important for these models to scale, given the vast amount of data that is generated and consumed, so the LLM can constantly access the most up-to-date information. As well as scalability, API-driven (application programming interface) performance is essential for real-time data availability and the automation of these models. With security-compliant APIs creating a seamless integration between the data unification platform and the LLMs, data can be accessed and processed rapidly. This also curbs the chance of the model generating inconsistent outputs and so increases trust amongst users. Creating successful LLMs now and into the future As AI continues to become deeply integrated into our lives, there is a great need for these models to be based on trusted, reliable, and transparent data. The recent passage of the EU AI Act seeks to hold companies to higher standards, and organisations will need a foundation of trusted, high-quality data to ensure compliance with the new law. Businesses must also continue to comply with rigorous data privacy laws. In Europe, they could face a fine of up to €10 million, or 2 per cent of the organisation’s global annual revenue from the preceding financial year, whichever is higher, according to GDPR.EU. While data privacy compliance is crucial, the erosion of customer trust from misleading or biased data sets being used in these LLMs will have a long-term impact on the reputation of the organisation and its bottom line. There is no shying away from the significant role AI and data play in business operations, but it is of utmost importance that LLMs are built on a robust data foundation. 
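To make the RAG pattern described above more concrete, here is a minimal sketch in Python. It is an illustration only: the embedding, similarity scoring and generate() call are hypothetical stand-ins, and a production system would retrieve from the governed, unified data platform discussed here rather than an in-memory list.

```python
# Minimal illustrative sketch of Retrieval Augmented Generation (RAG).
# The embedding, vector store and generate() pieces are hypothetical
# stand-ins, not any specific vendor's API.
from dataclasses import dataclass


@dataclass
class Document:
    text: str
    vector: list[float]


def embed(text: str) -> list[float]:
    # Placeholder embedding; a real system would call an embedding model here.
    return [float(ord(c)) for c in text[:8]]


def similarity(a: list[float], b: list[float]) -> float:
    # Crude distance-based score; production code would use cosine similarity
    # over dense embeddings from a proper model.
    return -sum(abs(x - y) for x, y in zip(a, b))


def retrieve(query: str, store: list[Document], k: int = 2) -> list[str]:
    # Rank the company documents against the query and keep the top k.
    query_vector = embed(query)
    ranked = sorted(store, key=lambda d: similarity(query_vector, d.vector), reverse=True)
    return [d.text for d in ranked[:k]]


def generate(prompt: str) -> str:
    # Stand-in for the LLM call; swap in the model or provider of your choice.
    return "[LLM response grounded in the supplied context]"


def answer(query: str, store: list[Document]) -> str:
    # Ground the model: it may only answer from the retrieved company context.
    context = "\n".join(retrieve(query, store))
    prompt = (
        "Answer using ONLY the company context below. "
        "If the context does not contain the answer, say you do not know.\n"
        f"Context:\n{context}\n\nQuestion: {query}"
    )
    return generate(prompt)


if __name__ == "__main__":
    store = [Document(t, embed(t)) for t in (
        "Our refund policy allows returns within 30 days.",
        "Support is available 24/7 via the customer portal.",
    )]
    print(answer("What is the refund policy?", store))
```

The point of the pattern is that the model is constrained to answer from trusted, business-specific material; the quality of that material, as the article stresses, is what ultimately determines the quality of the output.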
Through strong data unification and management strategies, businesses are much better placed to create trusted and up-to-date LLMs. Organisations must lay the groundwork now in order to reduce hallucinations and successfully reap the benefits of AI to improve operations and customer service practices. ### Are WAAPs needed to protect cloud applications and APIs? Attacks in the cloud are on the rise, with The State of Cloud Native Security 2024 report revealing that 64% of organisations reported an increase in data breaches over the past year. It found that the top concerns when it came to cloud security were AI threats, both in terms of code poisoning and assaults, and the risks associated with unmanaged, unsecured or third-party Application Programming Interfaces (APIs). APIs are an essential component in transferring data between cloud services and web applications and link these together, making them a prime target for attackers. Where web application security is concerned with protecting the front end, API security is focused on the mechanisms that retrieve the data servicing the application, which means it can be attacked to gain access to sensitive data. But determining how you should protect your APIs is still an issue many are grappling with. In fact, the survey found 88% struggle to identify the security tools they need. Will my WAF work? Part of this confusion lies in the way in which security solutions have evolved. Given that APIs were essentially seen as an extension of the application, the consensus was that they could be protected using Web Application Firewalls (WAFs). However, unlike web applications, which are vulnerable to things like SQL injection or cross-site scripting (XSS) attacks, API attacks are not syntactic in nature. Indeed, APIs can still be attacked even when traffic conforms to their set specifications, so signature-based detection methods are not effective in detecting abuse. Attackers have also since learnt to neatly sidestep such protection measures by rotating through tactics, techniques and procedures (TTPs). For example, they might switch rapidly from one IP address to another. Other times they resort to overloading the WAF with thousands of IP addresses, effectively bricking it, which is why WAFs are so poor at blocking automated, i.e. bot-driven, attacks. The defence capabilities of API Gateways were likewise oversold. These are predominantly aimed at developers looking to push out and manage their APIs, so they cover the design, build, test and release phases with basic security protection such as SSL built in. They centralise operations and can rate limit or throttle traffic, but have limitations in terms of detecting attacks that are not volumetric in nature. Therefore, while WAFs can do the initial filtering and API Gateways can detect surges in traffic, they are not designed to detect anomalous behaviour or lateral movement, which is where API protection comes in. It can take an inventory, define governance and block and tackle as needed. Cloud WAAP It's for these reasons that we saw the emergence of what Gartner calls Web Application and API Protection, or WAAP, which essentially combines the abilities of the next-generation WAF with DDoS protection, API security and bot mitigation. WAAP amalgamates these to provide a more comprehensive form of protection, using solutions that directly complement one another, and Cloud WAAP is now gaining ground, with more CSPs looking to offer it. 
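As a rough illustration of the behavioural detection described above, as opposed to signature matching, the sketch below flags API clients whose request volume or endpoint spread jumps well beyond their own historical baseline. The log format, thresholds and field names are assumptions made purely for illustration; real WAAP and API protection products apply far richer models.

```python
# Toy behavioural anomaly check over API access logs (illustration only).
# It flags clients whose request volume or endpoint spread rises sharply
# above their historical baseline - the kind of abuse signature rules miss.
from collections import defaultdict


def summarise(entries: list[dict]) -> dict:
    """entries: log records such as {"client": "app-1", "endpoint": "/orders"}."""
    per_client = defaultdict(lambda: {"requests": 0, "endpoints": set()})
    for entry in entries:
        stats = per_client[entry["client"]]
        stats["requests"] += 1
        stats["endpoints"].add(entry["endpoint"])
    return per_client


def find_anomalies(window: list[dict], baseline: dict,
                   volume_factor: float = 5.0, spread_factor: float = 3.0) -> list[str]:
    flagged = []
    for client, stats in summarise(window).items():
        base = baseline.get(client)
        if base is None:
            flagged.append(f"{client}: previously unseen client")
            continue
        if stats["requests"] > volume_factor * max(base["requests"], 1):
            flagged.append(f"{client}: request volume spike")
        if len(stats["endpoints"]) > spread_factor * max(len(base["endpoints"]), 1):
            flagged.append(f"{client}: touching unusually many endpoints")
    return flagged


if __name__ == "__main__":
    # Compare a recent window against an equally sized historical window.
    history = [{"client": "app-1", "endpoint": "/orders"}] * 50
    window = [{"client": "app-1", "endpoint": f"/admin/{i}"} for i in range(400)]
    print(find_anomalies(window, summarise(history)))
```

Flagged clients would then be routed to investigation, rate limiting or bot mitigation, which is precisely the layer that WAFs and API gateways alone do not provide.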
In its latest report, the Market Guide for Cloud Web Application and API Protection (Nov 2023), Gartner makes the point that while the market is steadily growing, it still faces some challenges in the form of alert fatigue from WAFs and sophisticated bot attacks. Key to solving the former are AI and machine learning that can correlate and qualify events for investigation; for the latter, bot mitigation requires more advanced threat hunting and the use of deceptive techniques designed to exhaust the attacker's resources. Interestingly, the report suggests that while traditional (i.e. on-premises) WAAP is being superseded by its equivalent in the cloud, the API security sector has come around the outside to become something of a contender by offering discovery, threat detection and response capabilities. It even notes that some organisations may prefer a standalone solution over a WAAP service to focus security efforts and reduce overheads, effectively creating their own version. API security in action For example, a large enterprise that rapidly expanded and acquired a number of its competitors sought to address its API attack surface. The company's applications were distributed across multiple cloud providers, so they needed to ascertain how many APIs they had, where they were located, whether they were conforming to security best practices and which APIs were exposing sensitive data. This included looking at whether those APIs even needed to expose that data, as many APIs are configured to do so unnecessarily. In addition to addressing API posture management, the business also needed to be able to detect advanced business logic abuse or data exfiltration attacks over those APIs. Deploying an API security solution across multiple cloud environments as well as their traditional data centres allowed the security team to look holistically at the entire API attack surface, supplementing the information they already had from an API gateway perspective as well as from a WAF perspective. Given that standalone API solutions are also available with bot mitigation, thereby dealing with the problem of automated and DDoS attacks which usually follow the discovery of a compromisable API, it will be interesting to see how the market plays out. Gartner is predicting we will inevitably see some convergence in the market as a result. In fact, we've already seen API vendors being snapped up this year, such as in the case of F5 acquiring Wib and Akamai announcing its intention to acquire Noname, revealing that WAAP providers are keen to incorporate the advanced capabilities of standalone API offerings. For now, however, those looking to address their APIs with a cloud offering need to look at the nuances of their current set-up to determine if they need to plump for a WAAP service or if they can use an API and bot management solution. It will very much depend on the solutions the business already has, the attack surface it is looking to manage and what it is trying to achieve. What's important is that it doesn't continue to try to force a square peg into a round hole and does attend to the very specific challenges of API security. ### Connectivity, Innovation and Trust: The new automotive future What's defining the exciting new era of automobility? Software, personalisation, and automation. It's a thrilling time to be buying a new car in the UK. 
One million electric vehicles (EVs) have been brought onto UK roads since 2002, but the shift to electrification is taking time, with public scepticism and a lack of trust in new technologies among the challenges that automakers face. In fact, electrification only scratches the surface of the automotive opportunity: exciting software-powered service and feature innovation, and the creation of new in-vehicle experiences, are set to challenge buyers' reservations. What will connected automotives look like in the year ahead? Here's my view: Q: Are consumers able to place their trust in new automotive technologies? A: Yes - if automakers get it right! As every part of our lives becomes more software-defined, our cars are no exception. The past decade of automotive developments has hinged on vehicle, driver, and sensor data, to increasingly sophisticated ends. Now, with our cars creating and sharing more data points than ever before, scrutiny increases. Fears of data breaches, reduced privacy, and the integrity of new connected capabilities like autonomous features have led to diminished trust in connected vehicles – just look at the recent recall of 2 million Tesla vehicles in the US, in which the driver assistance system was found to be defective. However, even in this scenario, the ability to fix issues via an "over the air" software update still illustrates the major convenience benefits of software-defined vehicles. Less stress and inconvenience for the owner. Building trust must be a focus for automakers this year, working with tech partners to ensure trust, safety and security are proven and paramount, as well as keeping regulators and the public informed to restore confidence in vehicle advancements. Q: What are the secrets behind these exciting new capabilities? A: One of the most exciting developments is in cloud-based digital twin development platforms. We kicked off this year with Stellantis announcing its world-first virtual development platform built on QNX technology, which enables Stellantis' developers around the world to work in a digital twin environment. It speeds development 100-fold, cutting the time and cost of bringing innovation to market significantly. Advanced, cloud-based digital twins will help automakers not only to identify integration issues sooner in the development process but also to enable more developers to participate earlier in the development cycle, long before hardware is available. This helps manage project risk and foster collaboration, so development teams can engage the best talent from anywhere in the world. The importance of creating a true-to-life virtual development environment for embedded software cannot be overstated. Soon, virtual cockpit high-performance compute (HPC) simulation will allow automotive companies to virtualise graphics, audio, and other inputs in the cloud, enabling earlier prototyping, scaling of developer teams, and a reduction in time-to-market. With this addition, in 2024, we'll see the world's first whole-vehicle simulation become a reality. It will allow the automaker to accelerate the delivery of software and fully harness the cost, collaboration and time efficiencies of developing in a cloud environment. The true-to-life virtual development environment for embedded software will reduce complexity, accelerate innovation, and cut costs on in-car software development throughout the entire product lifecycle. 
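To give a flavour of what developing against a virtual vehicle rather than physical hardware means in practice, here is a deliberately simplified sketch. The VirtualVehicle model, the braking feature and all of the numbers are invented for illustration; real digital twin platforms such as the one described above model the vehicle's hardware and software stack in far greater fidelity.

```python
# Toy illustration of testing embedded-style logic against a virtual vehicle
# model instead of physical hardware. All names and values are hypothetical.

class VirtualVehicle:
    """A drastically simplified digital twin tracking speed and one sensor."""
    def __init__(self) -> None:
        self.speed_kph = 0.0
        self.obstacle_distance_m = 100.0

    def tick(self, throttle: float, dt: float = 0.1) -> None:
        # Extremely crude vehicle dynamics, purely for demonstration.
        self.speed_kph = max(0.0, self.speed_kph + throttle * 5.0 * dt)
        self.obstacle_distance_m -= self.speed_kph / 3.6 * dt


def emergency_brake_assist(vehicle: VirtualVehicle) -> float:
    # Feature under test: brake hard once an obstacle gets close.
    return -3.0 if vehicle.obstacle_distance_m < 40.0 else 1.0


def run_virtual_test(steps: int = 500) -> bool:
    vehicle = VirtualVehicle()
    for _ in range(steps):
        vehicle.tick(emergency_brake_assist(vehicle))
        if vehicle.obstacle_distance_m <= 0:
            return False  # virtual collision: the feature failed in simulation
    return True


if __name__ == "__main__":
    print("Braking feature passed the virtual test run:", run_virtual_test())
```

The value of the approach is that this kind of test can run thousands of times in the cloud, by any developer, long before a physical prototype exists.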
Q: What does the evolution of automotive software look like in the year ahead? A: Safe, secure, reliable systems will make upgrades seamless. Can you imagine technology that can judge how worn your tyres are, personalise your in-car entertainment, message you if your teenagers are driving without seatbelts, score your driving for better insurance premiums, analyse incidents, recharge or refuel your car without a bank card or phone, or ensure you're staying alert while driving? Even if you can't, automotive developers can. And, using the latest operating systems, they are building exciting new capabilities for OEMs to update cars with innovative experiences and revenue-generating services. Taking advantage of today's scalable, high-performance real-time operating systems (OS) enables embedded software engineers to design safe, secure, and reliable systems in a way that fully harnesses the benefits of CPU advancements way into the future. When in-car technology is based on these innovative systems, automakers can seamlessly upgrade the electrification, e-commerce, safety and security, vehicle lifecycle and operations, and the in-cabin experience in the electric, connected vehicles of the future – or even the one you already own! Q: On the ever-evolving playing field of auto innovation, what should we be looking out for? A: Audio is one to watch! Great audio shapes experiences – think about the best noise-cancelling headphones or announcements on trains that you can actually hear! A sure-fire sign of quality, innovative in-car audio is a core feature of the software-defined vehicle, not least because of how important it is to the user. When software is decoupled from hardware, audio designers and engineers are rewarded with complete creative freedom to deliver new and exciting in-vehicle sound experiences. This includes everything from basic audio functionality, such as hands-free telephony, sound alerts and active noise control, to innovative new infotainment and safety features. Looking ahead, software will not only provide more control over the quality and functionality of audio services, but will also unlock new revenue streams from the activation of optional features. Upgrades to premium, branded quality, like on-demand or subscription services, bring new, value-added features into the vehicle. This innovation in audio will provide new business opportunities and exciting collaborations that will truly appeal to and excite the end user. Automakers who truly differentiate themselves through sound could well steal a march on competitors. ### How To Use Artificial Intelligence For Digital Writing? Do you want to know how to use AI for digital writing? If so, an AI story generator can help you steer your content, and your business, in the right direction. Applied well, AI supports your objectives; applied carelessly, it simply adds complexity. Ways To Use AI For Digital Writing There are several ways you can make use of AI in digital writing. 
1. Beat Writer's Block When you struggle to come up with appropriate content ideas, AI can help you develop topics, angles, and titles for your writing projects. It generates creative prompts as well as outlines to keep your content on track. 2. Research & Topic Exploration Research and topic exploration form an essential part of digital writing. AI can analyse vast amounts of data and identify topics and potential areas of interest, making it a great way to find unique angles for your content and ensure it covers the relevant information. 3. Content Formatting & Tone Matching An AI story generator can help with content formatting and tone matching, keeping your writing consistent with the style and voice your audience expects. 4. Content Repurposing & Summarisation AI can help you repurpose existing content and summarise it quickly. It can also help verify the accuracy of information within your content and identify potential factual inconsistencies, which keeps your writing credible and trustworthy. 5. Grammar & Mechanics Check Grammatical errors can cause serious problems for your credibility. AI tools can catch and reduce these errors, making the writing process easier and the end result clearer and more attractive to readers. Final Take Away Used deliberately, AI will make your job easier in the long run and make the process of writing both feasible and appealing to readers. We welcome your views and feedback. 
AI for digital writing can make things more lucid for you when you want to reach your requirements with complete ease. Here, you need to identify the perfect solution that can boost your brand value to a greater extent. ### Unlock Financial Efficiency with E-Procurement Procurement teams are constantly seeking new ways to optimise their operations and drive financial performance. This can prove difficult, as the process is often plagued by inefficiencies, but the simple solution to these challenges is to manage spending through e-procurement tools. By leveraging digital platforms, procurement professionals can improve cost-saving capabilities and enhance process efficiency. Here’s why financial efficiency in procurement is important, the biggest challenges of manual cost optimisation and how to address them with software. Why is financial efficiency important? With inflation rates currently sitting at three per cent, cost optimisation has become the top priority for procurement teams in 2024, with a survey revealing that 65 per cent of UK procurement departments consider cost control to be the most important focus. Economic uncertainty is the biggest influence in achieving financial efficiency, as ongoing global economic concerns and fluctuations are putting businesses under increasing pressure to maximise cost savings and optimise resources to build resiliency and remain competitive. Inflated prices are also impacting the costs of materials, labour and operational expenses, heightening procurement teams' urgency to find more innovative means of cutting expenses and improving financial performance. The procurement sector is becoming increasingly digitised thanks to the development of Artificial Intelligence and automated systems, and this acceleration has empowered teams to leverage advanced e-procurement tools and technologies to streamline processes, reduce inefficiencies and minimise costs.Additionally, supply chain disruption caused by the pandemic highlighted the importance of building agile and resilient supply chains, which is why procurement teams are more invested in financial efficiency to mitigate risks and ensure supply chain continuity in the face of disruption. Teams are also under pressure from key stakeholders like investors, customers and regulatory bodies to display best practices when it comes to financial performance and sustainability efforts to determine suitability for partnerships. The pressure of high internal expectations compels them to work harder to demonstrate fiscal responsibility and efficiency to meet ambitious targets. Problems caused by manual procurement cost reduction Manual financial optimisation has become an outdated and flawed process that can significantly impact an organisation’s efficiency. One of the most significant challenges of manual reduction strategies is their susceptibility to human error, which can result in inaccurate data entry, miscalculations and overlooked cost-reduction opportunities. These errors can also extend beyond compromising the reliability of procurement data by leading to poor decision-making. These processes are also inherently time-consuming, requiring a substantial administrative effort that diverts resources and attention away from more pressing strategic activities. Research found that procurement and supply chain professionals spend 31 per cent of their time dealing with inefficient manual processes. Lacking efficiency can slow down the procurement function and cause delays that affect overall business operations. 
Tracking and analysing spending patterns manually can also become burdensome, as it can be difficult to identify and act on trends or discrepancies quickly. Manual procurement also often lacks transparency, which can lead to inconsistencies in vendor selection and contract management, increasing the risk of non-compliance with internal and external policies and regulations. The absence of a centralised system equally complicates the consolidation of procurement data across multiple departments, which can hinder effective communication and collaboration. Fragmentation like this can result in missed opportunities for cheaper purchasing and for negotiating better terms with suppliers. Plus, time-consuming cost reduction efforts are typically more reactive than proactive, meaning issues are addressed only after they arise rather than teams taking steps to prevent them from the beginning. These weaknesses collectively highlight the need for automated procurement solutions that enhance accuracy, efficiency and strategic decision-making to drive more sustainable cost optimisation, which is why procurement departments are aiming for a 70 per cent digitisation rate by 2027. How can e-procurement address these challenges? E-procurement tools can resolve the key challenges posed by manual procurement processes. One of their most significant cost-saving functionalities is the automation of data entry and calculations, which can drastically reduce the likelihood of human errors and ensure decisions are based on accurate and reliable data. E-procurement platforms also offer a multi-stage response feature that gives users the flexibility to ask for new offers on goods and services at different stages of supplier negotiation. This provides more opportunities to negotiate a better price or terms for things like delivery dates and lead times, identifying cost-saving opportunities at every stage of the procurement process. Automated e-auctions also reduce the time users spend going back and forth negotiating with suppliers, leaving more time for teams to focus on more pressing responsibilities while securing the most competitive price in the shortest amount of time and streamlining supplier relationship management. Procurement software provides a much more efficient and accurate tracking system to help users gain better visibility over their spending and see all savings achieved from multiple contracts and partnerships. General reporting dashboards can also provide users with an overview of their request spending and savings analytics, allowing them to monitor data and identify where and how savings are being maximised. The software's proactive approach to procurement, with features like customisable workflows and automated reminders, ensures timely and accurate processing of requests for quotations and other procurement activities. 
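As a simple illustration of the kind of automation described above, the short sketch below runs a mock multi-round e-auction and reports the saving achieved against a baseline contract price. The supplier names, rounds and figures are invented for the example; real e-procurement platforms layer approval workflows, compliance checks and audit trails on top of this basic mechanic.

```python
# Toy multi-round e-auction: collect supplier bids, pick the best offer and
# report the saving against a baseline contract price. All figures invented.

baseline_price = 100_000  # cost of the goods under the current contract

bidding_rounds = [
    {"Supplier A": 98_500, "Supplier B": 97_000, "Supplier C": 99_200},
    {"Supplier A": 95_800, "Supplier B": 96_400},
    {"Supplier B": 94_900, "Supplier C": 95_500},
]

best_supplier, best_price = None, float("inf")
for round_number, bids in enumerate(bidding_rounds, start=1):
    supplier, price = min(bids.items(), key=lambda item: item[1])
    if price < best_price:
        best_supplier, best_price = supplier, price
    print(f"Round {round_number}: best bid {price:,} from {supplier}")

saving = baseline_price - best_price
print(f"Awarded to {best_supplier} at {best_price:,} "
      f"({saving:,} saved, {saving / baseline_price:.1%} below baseline)")
```

A dashboard view of exactly this kind of calculation, aggregated across every contract and category, is what gives procurement teams the spending visibility described above.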
Throughout the discussion, Mark and Jez examine how the transition from traditional data centres to cloud environments demands new strategies and technologies. They discuss the importance of adopting advanced encryption methods to protect sensitive information and the role of comprehensive traffic monitoring in detecting and mitigating potential threats. Mark shares insights into best practices for cloud security, including the implementation of robust encryption protocols and continuous monitoring solutions. Tune in to this engaging episode of the Cloud Chronicles to learn from industry experts and stay ahead of the curve in the ever-evolving field of cloud security. Discover how to protect your organisation's data with the latest encryption technologies and traffic monitoring techniques, and gain practical advice on transitioning from traditional data centre security to a more secure cloud environment. ### Mastering the Cloud with Strategies for Multi-Cloud and AI Integrity In the inaugural episode of the Cloud Chronicles vodcast, host Jez Back sits down with Lee Thatcher, the former Head of Cloud and Innovation at CloudCoco, for an in-depth conversation about the evolving landscape of cloud computing. They explore fundamental concepts such as cloud agnosticism, multi-cloud, and hybrid cloud, providing clear definitions and real-world examples. The discussion also addresses the significant challenges organisations face in avoiding vendor lock-in and the complexities involved in managing multiple cloud services. Through their expert insights, Jez and Lee offer valuable guidance on navigating these challenges and maximising the benefits of a diversified cloud strategy. Whether you're an industry veteran or new to cloud computing, this episode is packed with informative content and practical advice that will enhance your understanding of cloud technologies. Tune in to gain a deeper insight into the future of cloud computing and learn how to effectively implement these strategies in your organisation. ### Securing Kubernetes in Multi-Cloud If you want to secure Kubernetes, you have to be strong in the multi-cloud and in cyber resilience. A number of companies now rely on Kubernetes, and container orchestration has become mainstream. How can this particular workload be best secured? You have to do justice to its special character and master the multi-cloud. The advantages of Kubernetes for developers are enormous; they can quickly roll out lean and dynamic web-enabled applications. Open-source container orchestration tools such as Kubernetes enable them to split monolithic applications into smaller, distinct microservices and run this software reliably across computing platforms. New workloads can be integrated into the IT environment at pace, even when they are made up of legacy, private and public cloud resources in complex multi-cloud scenarios. If you want to secure Kubernetes workloads in this dynamic structure, you have to master the multi-cloud perfectly. No more silos By definition, data in a multi-cloud is distributed across a variety of storage locations, and Kubernetes is by definition designed for these scenarios. IT teams must invest a lot of time and resources to properly manage this colourful puzzle of proprietary, isolated data islands. This fragmented picture causes a number of glaring problems, which grow with the amount of data and the number of legal regulations such as GDPR or NIS2. 
In this way, it quickly becomes impossible to know whether data is redundant, whether critical personal data is stored in risky locations, or whether it has been included in the backup plan. A company can try to get these data islands under control with processes and individual product solutions, but it must then deal with excessive infrastructure and operational costs, a lack of integration between products, and complex architectures. It is questionable whether all data in such a fragmented environment is protected from ransomware and whether important tasks such as rapid recovery can be carried out in the time and to the quality required to maintain business operations. However, these problems can be solved by bringing the distributed workloads together on a hyper-converged platform that is easily scalable and strictly follows the zero-trust security model. Data is held on immutable storage and encrypted by default, both at rest and in transit. Access is only possible after authenticating with multi-factor authentication and being authorised through role-based access controls. IT managers also access all workloads via a single, uniform console. The Kubernetes elements are maintained in the same way. Roles and rules control which resources the IT managers actually see in their console. Important configuration changes or deletion processes are additionally protected by a quorum process: if, for example, an IT manager wants to change critical settings, other responsible people are automatically notified in the background and asked whether they approve the change. This is intended to protect highly sensitive areas from manipulation. Protect integrity, and understand the value of your data. Respond to successful attacks Even the best defence strategy can fail. If a ransomware or wiper attack is successful, the lights in the company go out, literally. In such an emergency, nothing works any more: no phone, no email, no door access, let alone the website. The CIO's and CISO's IT teams will not even be able to respond to the attack because all security tools are offline and the evidence in logs and on systems is encrypted. No one will be able to call their team together because VoIP doesn't work. However, using a data management platform, the infrastructure and security teams can jointly establish an isolated cleanroom containing an emergency set of tools as well as system and production data, including the Kubernetes backups. In this cleanroom, the IT teams can stand up an emergency operation of the entire IT estate. This contains all the important tools the security teams need to begin the essential incident response process. This process is important for generating correct and meaningful reports on NIS2, DORA and GDPR violations. From the cleanroom, the production environment, including Kubernetes, can be gradually restored with hardened, clean systems in close coordination with the infrastructure teams. Doing justice to the character of Kubernetes Companies can secure their container environment with a Data Cloud Platform, as the platform offers the same level of reliability and flexibility for Kubernetes-based workloads as for all other business-critical workloads in the multi-cloud. One thing is clear: the number of workloads will continue to increase, and the application landscape in companies will become even more diverse. It is therefore crucial that the backup and recovery of data for all workloads can be carried out in the same way, via the same console, with the same rules. 
This allows vital processes for the company to run as efficiently as possible when an attack is underway. But that's not enough. It's just as important to provide the infrastructure and security teams with a secure space where they can respond to the attack together and restore data and systems to production in a hardened form. No matter what the workload is. ### How database replication keeps enterprises agile There is a pervasive challenge confronting businesses across all industries: dependence on poor-quality data. Besides holding organisations back from embracing emerging technologies, the cost of bad data also impacts the bottom line, with research showing that enterprises building AI programmes on poor-quality data are losing, on average, six per cent of their global annual revenues, equivalent to 406 million USD. This poor data quality is the result of ineffective data processes, such as siloed data. A staggering 68 per cent of organisations still struggle with the basic task of accessing and cleansing data into a format that is usable for analysis and more advanced use cases such as building LLMs and AI. The challenge of data inaccessibility is particularly prevalent in more established enterprises relying on traditional on-premises servers. If these companies are to undergo successful digital transformation and keep pace with innovation, they must make their databases work for them, rather than against them. The untapped potential of legacy databases Organisations pre-dating the cloud-native era hold vast troves of valuable data in legacy databases, including important customer records, financial and inventory data and even regulatory information. However, without the technological capabilities to move, operationalise and make use of the data in real-time, the insights lie dormant – causing organisations to miss out on opportunities for operational improvements, product innovation or customer-centric growth. Within these organisations, data engineers are tasked with working tirelessly to keep data flowing between databases, sources and analytics engines, but with the pipelines transporting the data breaking multiple times a month, data talent is fighting a losing battle. When the flow of data is interrupted, employees – not just data workers but marketing teams, sales and even the C-suite – are forced to make business decisions based on outdated or inaccurate insights. The opportunity cost of this kind of disruption is only eclipsed by the financial cost – for example, Thomson Reuters loses $1m when its pipeline goes down. Legacy infrastructures also lack the modern safeguards required to comply with data privacy, security and governance requirements, exposing organisations to the risk of data breaches, fines from the ICO and reputational damage. Enter automation Improving data access is in everyone’s interest in the organisation, none more so than CDOs, who tend to have a relatively short tenure of around 30 months and will therefore look to make an impact upon joining their organisations. Whatever digital transformation efforts they set their sights on, making data accessible, reliable and secure will be central to their activity. A recent survey by MIT Technology Review conducted for Fivetran further underscores this, showing that 82 per cent of C-suite executives now prioritise data integration and data movement solutions for long-term foundational AI success. This is where automation comes in. 
As enterprises move to the cloud to simplify their architectures, automated data movement can help them replicate business-critical databases, so that they can take advantage of more advanced operational and analytical cloud use cases. Historically, replicating these databases had a major impact on business systems – slowing everything down to a crawl. Today, automated, high-volume data pipelines can not only give decision-makers real-time access to this data in the systems they need it; the same technology can also automate the maintenance and repair of data pipelines – freeing up data engineers to focus on value-added tasks instead. These capabilities make database replication low-impact and low-latency – leading to more reliable data and faster business decision-making. The role of data governance Unfettered access to data is key to business applications running smoothly – but that doesn’t mean all staff need access to all the data. In fact, ‘data democratisation’ is about giving access to the right people, to the right data, at the right time. It is important that there are strong data governance policies and processes following the path of data as it moves – for example, that personally identifiable information (PII) is masked and that only those with the right permissions can see it. Building and managing these features manually is a gargantuan and potentially error-ridden undertaking.As data protection regulations vary around the world, enterprises trading internationally will need to adopt data movement tools that grant them the ability to set up automatic guardrails for data movement and scale them as regulatory landscapes evolve. Keeping a competitive edge The future waits for no one. Companies are already maximising the opportunities offered to them by cloud computing, Machine Learning and AI. In fact, according to Fivetran’s latest research, nine in ten organisations are using AI/ML methodologies to build models for autonomous decision-making, and a quarter of organisations report that they have already reached an advanced stage of AI adoption where they utilise AI to its full advantage with little to no human intervention. To compete, more established businesses need to kick their data processes into higher gear and focus on modernising legacy infrastructures, so that vital data is more readily accessible and reliable enough to underpin key business decisions. Leveraging automation in database replication today will help businesses unlock greater profitability and scalability, without security considerations keeping CDOs up at night. ### Ethical, Compliant, Innovative AI Deployment Strategies in Cloud Recently, government entities have been discussing AI regulation and how organisations can remain compliant while leveraging AI for business needs. An example of this is the recent EU AI Act—the first piece of artificial intelligence legislation. The deployment of AI offers organisations an array of benefits, from enhanced operational efficiency to improved decision-making. However, it also brings a host of challenges, particularly concerning adherence to a complex web of regulations. In the sphere of cloud computing, AI can serve as both an enhancer and a safeguard, augmenting its capabilities while strengthening its defences. AI enriches cloud services with predictive analytics, reinforcing security and optimising resource allocation, ensuring a symbiotic relationship that propels innovation. 
Simultaneously, they navigate complexities, ensuring businesses harness their combined power to thrive in the digital age. As AI adoption continues, it becomes imperative for organisational leaders to navigate the intricate balance between innovation and compliance. From understanding the nuances of global regulations to implementing ethical AI practices, successful deployment hinges on a strategic and comprehensive approach. Understanding the intersection between innovation and compliance Governments around the world are grappling with the regulation of AI, each taking a unique approach tailored to its specific concerns and priorities. For instance, the recently adopted EU AI Act emphasises a risk-aware approach, particularly for high-risk AI applications, and places a premium on transparency and accountability. In contrast, the UK government has opted for a more pro-innovation stance, providing organisations with greater flexibility in interpreting and implementing AI principles. However, regardless of the regulatory framework in a particular jurisdiction, organisations must adopt a flexible compliance strategy that can adapt to the nuances of different regulations. This requires a deep understanding of the global regulatory landscape and the ability to navigate the complexities of compliance across various jurisdictions. Implementing ethical AI practices to align with business needs Beyond compliance with regulations, organisations must also prioritise the ethical deployment of AI. Ethical considerations are paramount, particularly given the potential for AI systems to perpetuate or exacerbate existing biases and inequalities. Implementing ethical AI practices requires organisations to establish robust guidelines and frameworks that go beyond legal compliance. This entails addressing issues such as data privacy, transparency, and accountability throughout the AI lifecycle. From data collection and model training to deployment and monitoring, ethical considerations must be woven into every stage of the AI development process. Additionally, organisations must invest in ongoing training and education to ensure that employees understand the ethical implications of AI and are equipped to make informed decisions. The role low-code can play in AI governance In the complex landscape of AI governance, low-code platforms emerge as a valuable tool for organisations seeking to streamline compliance and mitigate risks. Low-code platforms offer a simplified approach to software development, allowing organisations to build and deploy AI solutions with greater speed and agility. One of the key advantages of low-code platforms is their ability to enforce governance and compliance standards throughout the development process. By providing built-in controls and governance features, low-code platforms enable organisations to ensure that AI solutions adhere to regulatory requirements and ethical standards from the outset. Furthermore, low-code platforms facilitate collaboration between stakeholders involved in AI development, including data scientists, developers, and business users. This cross-functional collaboration is crucial for ensuring that AI solutions are aligned with organisational goals and ethical principles. The impact of embracing ethical AI and regulations As organisations navigate the complex landscape of AI deployment, they must embrace a holistic approach that integrates ethical considerations with regulatory compliance. 
This requires a commitment from organisational leaders to prioritise ethics and compliance in every aspect of AI development and deployment. Moreover, organisations must recognise that ethical AI deployment is not just a regulatory requirement but also a business imperative. Organisations can build trust with customers, employees, and other stakeholders by prioritising ethical AI practices, enhancing their reputation and mitigating potential risks. Successful AI deployment requires organisations to take a strategic and holistic approach to navigating the intersection of innovation and compliance. By understanding global regulations, implementing ethical AI practices, leveraging low-code platforms, and embracing a culture of ethics and compliance, organisations can unlock the full potential of AI while minimising risks and maximising benefits.

### CIF Presents TWF – Tara Halliday

In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), we will be talking about people rather than tech. We'll be talking about being successful and being a high achiever, about good and bad challenges, and about psychology and neuroscience. Our host and CIF CEO David Terrar will be interviewing Imposter Syndrome specialist Tara Halliday, talking about her company Complete Success. She'll explain how she helps leaders and entrepreneurs, and how she wants to eliminate imposter syndrome for good! Please subscribe to what will be a fascinating event.

### Start Bridging the Cloud Skill Gap Today

In today's rapidly evolving digital landscape, cloud computing has emerged as the cornerstone of innovation and efficiency for businesses worldwide. However, despite its transformative potential, many organisations find themselves facing a significant hurdle: the cloud skill gap. As the demand for cloud expertise continues to rise, the shortage of qualified professionals poses a formidable challenge to organisational cloud ambitions. But there are strategies to bridge this gap and embark on the cloud journey without delay.

The impact of the digital skill gap

The lack of digital experts in this space represents a critical barrier for businesses looking to leverage cloud technology effectively. With the increasing adoption of cloud solutions, the demand for skilled professionals capable of architecting, deploying, and managing cloud infrastructure is at an all-time high. However, according to The IT Skills Gap Report by Forbes, there is a noticeable shortage of talent, with 93% of organisations reporting a lack of cloud skills. This shortage not only hampers organisations' ability to implement cloud solutions but also impedes innovation, slowing down digital transformation initiatives and limiting competitiveness in the market. Moreover, the lack of cloud skills can lead to inefficiencies, security vulnerabilities, and increased operational costs, further exacerbating the challenges faced by businesses in today's hyper-connected world.

Partnering for success: Leveraging external expertise in cloud computing

In the face of the cloud skill gap, organisations must adopt a proactive approach to building and augmenting their cloud capabilities. One effective strategy is to leverage external partners and consultancies with the skills and resources needed in cloud computing. Those wanting to take advantage of external experts should start by identifying whether they need a temporary augmentation model – recruiting temporary staff to fill the gaps – or a partner who can help them upskill as well.
Then, you should set out to find reputable partners and consultants with expertise in cloud computing. It’s important to look for firms with a proven track record of successful cloud implementations and a deep understanding of your organisation's industry and specific requirements. Once you’ve found a trusted partner, they can help you achieve: Strategic roadmap: Paving the way for successful cloud transformation A clear cloud roadmap is crucial, therefore, your chosen expert should work closely with your team to chart a course that aligns technology with business objectives. They should consider factors such as workload migration, application modernisation, and data management, for example. The goal is to create a roadmap that balances short-term wins with long-term vision, ensuring sustained value delivery. Foundational value: From planning to prototyping When partnering with an external expert, organisations can tap into a wealth of knowledge and experience that accelerates their cloud journey from planning to prototyping. An agile approach combines expert design, engineering, and delivery skills to rapidly identify and configure cloud platforms for your most advantageous cloud initiatives. By bridging any skill gaps within your team, consultants can foster a culture of experimentation and establish an efficient delivery operating model. Value accumulator: Scaling success and building internal cloud expertise As initial wins accumulate, the focus can shift to rapidly scaling your cloud implementations while continuously delivering enduring business value. Experts can also help build your internal team's cloud skillset, augmenting them as needed to maintain momentum and quality throughout the process. By combining external expertise with coaching, mentoring and a robust hiring plan, organisations can ensure a seamless transition towards self-sufficiency and long-term success in cloud adoption. Full adoption: Optimisation and efficiency Partnerships shouldn’t simply end once initial success is achieved. Consultants should guide you through completing complex cloud programs, tackling the "long tail legacy" that often plagues organisations with minimal benefits and high ongoing costs. Together, you can remove these constraints and establish a customised cloud operating model seamlessly integrated across your organisation. This ensures continuous monitoring and optimisation of your cloud capabilities, maximising value and efficiency in the long run. Cloud iteration: The journey continues The cloud journey is an ongoing process of innovation, and experienced cloud practitioners should support your business every step of the way. From product design to operations, they should empower your team to develop a continuous cloud improvement model, fostering greater cloud maturity and unlocking broader value drivers such as advanced digitisation, operational efficiency, and competitive advantage. With guidance, your business can embrace the iterative nature of cloud adoption, driving continuous innovation and growth tailored to your specific needs and timeframe. Unlocking cloud potential By leveraging external expertise, organisations can bridge the cloud skill gap and accelerate their journey to the cloud without delay. Through collaborative efforts and a commitment to continuous learning, businesses can harness the full potential of cloud computing to drive innovation, efficiency, and competitive advantage. 
With the right strategy and commitment to skill development, there's no limit to what businesses can achieve in the cloud. ### How Data Fabric Helps Address Multi-Cloud Sprawl The abundance of data facilitates good decision-making, but too much of it can be problematic. Indeed, collecting too much business data can be wasteful, as it’s unlikely to ever be used to derive any useful insights, but it’s also likely to be stored across multiple repositories, creating hassles around data management and data security. Database managers at your business might not even be aware that some of these resources exist. This phenomenon, referred to as “data sprawl,” is particularly pronounced among organisations that manage multi-cloud environments. This setup comes with the inherent challenge of having to account for information scattered across various cloud platforms. There is a tendency for organisations that use multiple cloud platforms to have visibility issues, inconsistent implementation of security policies, data siloing, and difficulties when it comes to regulatory compliance. In response to the data sprawl problem, data fabric can provide a good model for a solution. Data fabric is an approach in IT architecture wherein organisations manage their data in different environments to achieve unified access, governance, and integration. It supports organisations as they deal with the challenges of data fragmentation and overload in complex IT environments. Specifically, data fabric provides the following mechanisms and functions to combat data sprawl across multiple cloud environments. Enhanced Security The idea of data fabric focuses on using a set of technologies and tools to establish a data management architecture that efficiently addresses governance, integration, and access across multiple sources. It provides a systematic way to manage data to overcome challenges, especially those brought about by the reliance on multiple platforms. Marketed as data security fabric (DSF) by security firms, solutions that leverage the concept of data fabric are characterised by their emphasis on providing adequate and appropriate security for multi-cloud environments, scalable data security, and compliance at scale. They offer security features that target the changing security needs of data storage and processing across different platforms. Adequate and appropriate data security in the context of different data storage topologies requires more complex data encryption practices to address more sophisticated attacks while ensuring efficiency. It needs to deal with cloud provider-specific security features, which can be chaotic to manage through conventional means. Organisations also need automation and continuous monitoring to keep up with the deluge of data. Additionally, it is important to centralise the management of user identities and access requests to comprehensively and efficiently manage data hosted and processed on different platforms. On the other hand, scalability in securing data is a must for multi-cloud environments. As organisations rapidly expand and onboard more cloud platforms, they inevitably broaden their attack surfaces. Hence, there is a need to efficiently integrate new systems, security tools, and policies. Moreover, it is important to keep up with data security regulations, especially when expanding into regions that impose different sets of laws or rules on data collection, retention, and protection. 
Compliance may sound easy, but it can get complicated when working with varying regulatory requirements in different locations. Data fabric provides the framework to guide organisations in dealing with the threats that emerge as a result of using multiple clouds as well as hybrid environments that include on-premise data storage. It presents the tools and security best practices necessary to anticipate, respond to, and mitigate attacks. Integration and Unification Data fabric is notable for being an integration powerhouse. Designed to seamlessly bring together multiple environments, it serves as a robust bridge that connects disparate data sources while allowing data to move with ease between different systems and platforms.  Data fabric streamlines technological intricacies by simplifying the details required for transferring, altering, and merging data throughout the enterprise. It dismantles the obstacles of data movement. In the process, it allows organisations to manage data more efficiently to support the assimilation of disparate data, break down data silos, and enable comprehensive analysis.  All relevant data are put to good use, while those that are deemed unnecessary are archived or destroyed to free up space. Additionally, data fabric creates a virtual data layer for an organisation’s IT infrastructure. It provides a unified way to view and manage all data. It does not matter where the data resides or which cloud platform hosts it. Everything is viewable and manageable through a virtual data layer that simplifies data discovery and ensures secure access for authorised users. Moreover, this combination of integration and unification facilitates standardised management for all of an organisation’s data. It supports the consistent implementation of data governance policies, ensuring adherence to data quality standards, access controls, and integrity measures. There is a sense of predictability in how data is treated, which makes it easier for teams to find, manage, and use data. Automation and Machine Learning Data fabric may also leverage automation and artificial intelligence to counter data sprawl. Various repetitive tasks in multi-cloud data management can benefit from automation. Also, machine learning can significantly improve data handling tasks that would take a long time when done manually. Cloud database security managers can automate the process of data discovery and cataloguing. These time-consuming and typically error-prone tasks can result in various mistakes that contribute to data sprawl. Also, automation plays a crucial role in data movement and transformation. There are many instances when unnecessarily redundant copies of files are kept because of unplanned or disorganised file movements.  Data transformations that are not thoughtfully planned out like in the case of business data calculation and reporting can also mess up data management. Organisations can use data fabric to automate data lifecycle management and ensure that data is optimally utilised and carefully disposed of. Data fabric solutions can take advantage of machine learning for various purposes. For one, AI helps intelligently discover and classify data, especially for organisations that gather, generate, and process vast amounts of data. Artificial intelligence is also useful in anomaly detection and data lineage tracking.  
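As a simple illustration of the kind of housekeeping that can be automated before any machine learning is applied, the sketch below flags redundant copies of files by content hash; the directory name is a hypothetical stand-in for objects spread across buckets or clouds.

```python
# Illustrative sketch: flag duplicate objects by content hash so that redundant
# copies can be reviewed, archived or deleted. Paths are hypothetical examples.
import hashlib
from collections import defaultdict
from pathlib import Path

def sha256_of(path, chunk_size=1 << 20):
    digest = hashlib.sha256()
    with open(path, "rb") as handle:
        for chunk in iter(lambda: handle.read(chunk_size), b""):
            digest.update(chunk)
    return digest.hexdigest()

def find_duplicates(paths):
    """Group files that share identical content, wherever they live."""
    by_hash = defaultdict(list)
    for path in paths:
        by_hash[sha256_of(path)].append(path)
    return {h: group for h, group in by_hash.items() if len(group) > 1}

if __name__ == "__main__":
    candidates = [p for p in Path("exported_reports").rglob("*") if p.is_file()]
    for digest, copies in find_duplicates(candidates).items():
        print(digest[:12], "->", [str(c) for c in copies])
```

Hash-based checks like this are the rule-based baseline; the machine learning capabilities described next extend the same idea to fuzzier judgements about what is redundant, stale or safe to retire.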
Additionally, machine learning is important in rapid automated data curation and optimisation, particularly when it comes to data deduplication, archiving, and making decisions on which data to delete. AI also facilitates predictive analytics for proactive data management, allowing admins to automatically get rid of unwanted and unneeded data that take up limited storage space. Tidying the Sprawl Given the increasingly rapid generation of data, sprawl is a reality that modern organisations face. A lot of this data is likely to be useful, but unnecessary data proliferation also happens frequently – especially when using disjointed and disorganised data storage and processing platforms. Data fabric provides a highly viable model for how to address data overload and manageability in the age of multi-cloud and hybrid environments. ### CIF Presents TWF – Erika Jacobi In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), we are talking about Organisational Design, and adaptive organisational design. Our host and CIF CEO David Terrar will be interviewing Dr. Erika Jacobi the Founder, President, and Managing Director of LC GLOBAL Consulting, beaming in to us from New York. She’s both an entrepreneur and a recognised expert in adaptive organisation design. She'll explain the experiences which led her to the topic, what it is, why it's important, and how AI changes things.  ### CIF Presents TWF – Nigel Kilpatrick In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), today's show will mention technology, but we'll be focusing on a different kind of leadership and different approach to engagement of your teams. Our host and CIF CEO David Terrar will be interviewing Nigel Kilpatrick live in our mixed reality studio. His company's website opens with the bold headline "Compassion is our best Human Resource, so let's use it!". Please subscribe to the event to find out more about compassionate leadership, and the difference between empathy and compassion - ### CIF Presents TWF – Dean & Sarah-Jane Gratton In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), we are talking about the technology topic that will, possibly, make the most significant and fundamental changes to both our business and personal lives since the dawn of the Internet. Our host and CIF CEO David Terrar will be interviewing Dean Anthony Gratton and Sarah-Jane Gratton, well known tech influencers who have just published their book Playing God with Artificial Intelligence. I'm guessing it will be both controversial, and illuminating.  ### The Quantum Leap in Internet Technology Envision a world where the internet, as we currently understand it, becomes a relic of the past, much like dial-up connections did years ago. Our current internet, while revolutionary in its own right, is not without its limitations. It struggles with data security, speed, and the ability to handle large amounts of data. This is where quantum internet technology comes in. This seemingly futuristic scenario might not be so far-fetched, thanks to China's groundbreaking advancements in quantum internet technology. In 2024, China is spearheading a technological revolution that could fundamentally alter how we connect and communicate, a prospect sure to ignite the curiosity and awe of technology enthusiasts and researchers alike. Explaining the Quantum Internet To understand the quantum internet, let's start with a simple analogy. 
Picture yourself at a busy party, trying to share a secret with a friend across the room. With the traditional internet, you might write the secret on a piece of paper and pass it through the crowd, risking interception by someone else. Quantum internet is like whispering into a magic box only your friend can open. If anyone else tries to tamper with it, the message vanishes instantly. This magical security is made possible through the principles of quantum mechanics, particularly quantum entanglement and superposition. These principles sound complex but can be broken down into simpler terms.

Quantum Entanglement

Entanglement is a phenomenon where two particles become linked, such that the state of one particle instantaneously affects the state of the other, no matter how far apart they are. Imagine having two dice that are entangled. When you roll one die and get a six, the other die will instantly show a six, too, even if it's on the other side of the world. (In reality, entanglement cannot carry a message on its own, but it does give both ends perfectly correlated information to build secure communication on.) This instant correlation is the cornerstone of quantum communication, underpinning links across vast distances whose security no classical network can match.

Quantum Superposition

Superposition refers to the ability of a quantum particle to exist in multiple states at once. Think of it as a coin spinning in the air – it's both heads and tails until it lands. In terms of data, this means a quantum bit (qubit) can represent both 0 and 1 simultaneously, unlike classical bits that are either 0 or 1. This ability to hold multiple states at once significantly increases the amount of information that can be processed and transmitted, leading to much faster and more efficient data handling.

China's Quantum Achievements

In 2024, China has made several significant strides in quantum internet technology. Here are some of the key milestones:

1. Quantum Key Distribution (QKD)

Quantum Key Distribution uses quantum mechanics principles to transmit encryption keys securely. Unlike classical encryption methods, which can be intercepted and deciphered by hackers, QKD ensures that any eavesdropping attempt will be immediately detected. This is because measuring a quantum system inherently changes its state, alerting the communicating parties to the intrusion.

2. Quantum Satellite – Micius

China launched the world's first quantum communication satellite, Micius. This satellite can send and receive quantum-encrypted messages, enabling secure communication over long distances without the risk of interception. The successful deployment and operation of Micius mark a significant leap forward in global quantum communication.

3. 2,000 km Quantum Link

China established a 2,000-kilometre quantum communication link between Beijing and Shanghai. This is the longest quantum communication line in the world, showcasing China's ability to scale up quantum technology for practical use. This link allows for secure data transmission between two major cities, paving the way for more extensive quantum networks.

Why China Leads in Quantum Internet

China's leadership in quantum internet technology is not a coincidence. It results from strategic governmental support, significant investments in research and development, and robust collaborations between various sectors. The Chinese government has identified quantum technology as a critical area for national development. Recognising its potential to revolutionise global communication and security, the government has implemented a comprehensive national strategy for quantum science.
This includes heavy investment in research and development, infrastructure, and talent cultivation.

Research and Development Investments

China's investment in quantum technology research and development has skyrocketed recently. Billions of dollars are channelled into quantum research, outpacing other major players in the field. This massive influx of funding has accelerated the pace of breakthroughs, propelling China to the forefront of the quantum race.

Collaborations and Synergy

The collaboration between the Chinese government, top universities, and tech companies has created a powerful synergy. Universities provide the brainpower, tech companies offer practical applications, and the government supplies the funding and policy support. This tripartite alliance has been instrumental in driving China's quantum internet advancements. For instance, the launch of the Micius satellite was a direct result of collaboration between the Chinese Academy of Sciences, the University of Science and Technology of China, and the state-owned China Aerospace Science and Technology Corporation. Furthermore, the government has implemented educational programmes to nurture a new generation of quantum scientists, aiming to secure the country's long-term lead.

Global Implications of Quantum Internet

China's advancements in quantum internet technology are significant for the country itself and hold profound implications for the rest of the globe. Let's explore some of these implications:

Cybersecurity

Our current internet constantly struggles with data security. Hacks, leaks, and breaches are everyday news. But a quantum internet could change the game. Quantum technology uses the principles of quantum mechanics to encrypt data, making it virtually unhackable. This new level of security could revolutionise everything from online banking to national defence, providing a sense of reassurance in an increasingly digital world.

Speed of Communication

Today's internet is fast, but quantum internet promises to be a game-changer. Quantum particles can exist in multiple states simultaneously, and they can be linked together regardless of the distance between them – the phenomenon of quantum entanglement. Together, these properties could make communication feel instantaneous even across vast distances. Imagine downloading an entire movie in the blink of an eye, or scientists across the globe collaborating in real time on complex problems.

Industry Transformation

The potential for transformation across various industries is immense. In finance, the quantum internet could make transactions more secure and efficient, potentially reshaping global markets. It could enable the rapid and safe transmission of large medical data files in healthcare, enhancing telemedicine and remote patient monitoring. In defence, where secure communication is vital, the quantum internet could offer unprecedented security, giving nations a strategic edge. The possibilities are limitless, sparking excitement about the future of technology. Of course, these are just the tip of the iceberg. The potential applications of quantum internet are vast, and we are only beginning to imagine them. As this technology continues to evolve, it's likely to open doors we never even knew existed.

Ethical and Societal Considerations

With great power comes great responsibility. As we step into this new era of quantum internet, it's crucial to consider this technology's ethical and societal implications. How do we ensure it's used for the greater good?
How do we prevent misuse? These are questions we'll need to grapple with as we move forward. The potential is vast, and the implications are profound.

Speed and Efficiency of Quantum Internet

Quantum internet promises unprecedented speed and efficiency. Let's delve deeper into these aspects:

Unprecedented Speed

Quantum networking promises data transmission dramatically faster and more responsive than the current internet, with entanglement keeping both ends of a link perfectly in step. This means downloading an entire HD movie in seconds or conducting video conferences with zero lag. Imagine the impact on real-time applications like online gaming, video streaming, and remote work. The ability to transfer data at such speeds could revolutionise how we interact with the digital world.

Enhanced Data Throughput

Quantum superposition allows a qubit to hold multiple states simultaneously, enhancing data throughput and reducing latency. In practical terms, a quantum internet connection could handle far more data, far faster, than traditional internet connections. This could lead to smoother streaming services, faster download speeds, and a more responsive online experience.

Quantum Encryption: The Future of Secure Communication

One of the most exciting aspects of quantum internet is its potential for unparalleled security. Let's explore how quantum encryption works and why it's so revolutionary:

Virtually Unhackable Encryption

Quantum encryption offers security that is fundamentally different from traditional methods. Any attempt to intercept the communication is immediately detectable, rendering traditional hacking methods obsolete. This is crucial for sectors like banking, defence, and personal data security, where breaches can have severe consequences. Quantum encryption ensures that data remains confidential and tamper-proof, providing peace of mind in an increasingly digital world.

Securing Sensitive Communications

For government agencies and military organisations, communicating securely without the risk of interception is paramount. Quantum encryption can provide unprecedented security, ensuring that sensitive information remains protected. This could be a game-changer in national defence and international diplomacy, where secure communication is vital.

Interoperability with Current Networks

One of the significant challenges and potential benefits of quantum internet is its interoperability with existing networks. Integrating quantum communication with current internet infrastructure could enhance security and efficiency without requiring a complete overhaul. Let's explore how this might work:

Hybrid Systems

Hybrid systems are being developed to allow seamless transition and communication between quantum and classical networks. This means that quantum internet could be gradually integrated into our existing infrastructure, enhancing security and efficiency without disrupting current services. Hybrid systems could bridge the old and the new, allowing for a smoother transition to quantum internet.

Enhancing Existing Services

By incorporating quantum communication into current networks, we can enhance the security and efficiency of existing services. For example, online banking systems could use quantum encryption to protect transactions, while healthcare providers could securely transmit patient data. The potential applications are vast, and the benefits could be realised without completely overhauling existing infrastructure.

Future Prospects of Quantum Internet

The future with quantum internet promises incredible advancements across various sectors.
Let's take a closer look at some of the most exciting prospects:

Healthcare: Remote Surgeries with Zero Lag

Imagine a world where doctors can perform complex surgeries remotely, with zero lag in data transmission. Quantum internet could make this a reality, enabling real-time telemedicine and remote patient monitoring. This could revolutionise healthcare, making it possible for patients in remote or underserved areas to receive the same level of care as those in major cities.

Scientific Research: Accelerated Simulations and Data Processing

The quantum internet could accelerate scientific research by enabling faster simulations and data processing. For example, climate scientists could run more complex models to better predict weather patterns and understand climate change. In pharmaceuticals, researchers could simulate drug interactions at a molecular level in real time, speeding up the discovery of new medications and treatments.

Finance: Secure, Fraud-Resistant Transactions

In the financial sector, the quantum internet could revolutionise transactions. With its inherent security, quantum transactions could virtually eliminate fraud. The speed of quantum communication could make real-time global transactions a reality, reshaping financial markets and banking systems. Imagine transferring large sums of money internationally in seconds, with complete security and no risk of interception.

Internet of Things (IoT): Smarter, Safer Environments

Quantum internet could bring the Internet of Things (IoT) to life in ways we can barely imagine. Your self-driving car could communicate instantaneously with other vehicles, traffic lights, and road sensors, making your commute smoother and safer. Home automation systems could become more efficient, with smart devices communicating seamlessly. The potential for innovative, safer, and more efficient environments is vast.

Military and Defence: Enhanced Secure Communications

The quantum internet could provide unprecedented secure communication for military and defence applications. Military operations could be coordinated with absolute confidentiality, reducing the risk of intercepted messages and data breaches. This could give nations a strategic edge in global security and defence.

The UK's Position in Quantum Technology

The UK is also making significant strides in quantum technology, though it currently lags behind China in some aspects. Let's explore where the UK stands in this global race. The UK government has launched the National Quantum Technologies Programme, which aims to propel the UK to the forefront of quantum research and innovation. This programme includes substantial investments in research and development, focusing on four key areas: quantum computing, quantum sensing, quantum communications, and quantum imaging.

Collaboration and Innovation

The UK has fostered collaboration between academic institutions, such as the University of Oxford and Imperial College London, and tech companies, including BT and Toshiba. These partnerships are driving forward innovations in quantum computing and communication. For instance, researchers at the University of Oxford are developing scalable quantum computers, while BT and Toshiba are testing quantum key distribution systems for secure communications.

Quantum Communication Infrastructure

Although the UK's quantum internet infrastructure is less advanced than China's, significant progress is being made.
The UK has established several quantum communication testbeds, such as the Cambridge Quantum Network, to explore its practical applications. These testbeds serve as a platform for testing and developing new quantum technologies. Educational Initiatives The UK government recognises the importance of nurturing a new generation of quantum scientists and engineers. Educational initiatives have been implemented to introduce quantum science into the curriculum of schools and universities. These initiatives aim to develop the skills and knowledge required to maintain and advance the UK's position in the quantum technology race. Challenges and Opportunities While the UK has made commendable progress, there are still challenges to overcome. The high cost of research and development, the need for more robust infrastructure, and the competition from other countries are significant hurdles. However, these challenges also present opportunities for innovation and growth. By continuing to invest in quantum research and fostering international collaboration, the UK can strengthen its position in the global quantum technology landscape. In conclusion, the quantum internet is poised to revolutionise how we connect and communicate. China's advancements in 2024 are leading the way, showcasing the incredible potential of this technology. From virtually unhackable encryption to instantaneous data transmission, the benefits of quantum internet are vast and transformative. While the UK is making significant strides in quantum technology, it must continue to invest in research and development and foster collaboration to keep pace with global advancements. The race for quantum supremacy is not just about technology; it's about shaping the future of communication, security, and industry. As we stand on the brink of this new era, it's crucial to consider the ethical implications and ensure that quantum technology is used for the greater good. The potential is immense, and the future is closer than we think. ### Translating ancient code for the generative AI era Having easy access to data is critically important in today’s era of generative AI. For many organisations, this is not yet the case, as their legacy code hampers them from developing the advanced AI systems that they need to propel their business forward. Legacy, or out-of-date, code can have significant performance issues and scalability limitations, not to mention huge maintenance costs which can eat away at IT budgets. Yet, there is often a reluctance to migrate away from this legacy code due to the inherent risk and time implications in the process, as well as the overall upheaval to team members who are set in their ways. So, how can organisations turn this often-dreaded task of migration into a desirable opportunity for improvement and growth? Increasingly the answer lies in generative AI. Translations can be tricky Organisations are increasingly recognising the need for migrating away from legacy code. Migration is the process of moving data from one location or application to another, typically to replace or augment legacy systems with new applications that will share the same dataset. In today’s context, migrations are often driven by a business taking the decision to move from traditional on-premises infrastructure to the cloud. This digital transformation is designed to make an organisation more agile and better equipped to deal with the ever-changing technological landscape. 
Despite a significant appetite for change, challenges can slow down the migration process. For instance, many organisations worry about the security implications associated with migrations. Think of when you have moved some of your favourite photos or music from one external hard drive to another, but on a vast scale. There are risks around data loss and duplication that can arise in the transportation process, which could leave data vulnerable, inaccurate or lost altogether. There are also concerns about the permissions and policies associated with data, particularly unstructured data, that could leave it at risk. Additionally, and arguably most difficult to overcome, is the vast investment in time, skills and, of course, money that is associated with migrations. Migrations require a lot of time-consuming and arduous tasks, diving into the minute detail of things like code, databases, security, governance and more. To do this properly takes a lot of time and requires an organisation to have very specific and sought-after skills on hand. To compound this, legacy data is often unstructured and written in a variety of coding languages, requiring unique skills to translate legacy code into a standardised format and modern code language that runs efficiently on a cloud data platform. Equally, today's analysts or data scientists may not have the skills to understand legacy code, and this lack of consistency can leave teams reluctant to move away from a language they know well.

Easing the burden with generative AI

Generative AI provides a path to clear the roadblocks associated with migration and make the process more manageable, primarily through its ability to augment, accelerate and streamline the many processes involved. In the early stages of the migration, such as code preparation, generative AI can accelerate the analyses that identify usage patterns. These reports can support teams in narrowing down which data is still being used within the organisation, and thus what should be translated versus what can be archived. Additionally, by analysing vast amounts of error logs, generative AI can accelerate data quality assessment between the legacy and new platforms, ensuring that governance is upheld throughout the process. Beyond simply identifying and validating data, it can also recommend optimal configurations and strategies for teams to use as they migrate, giving them key insights as they work through the process. One key way in which generative AI can achieve all this is by translating code into natural language that everyone can understand, taking an enormous amount of complexity out of the whole migration effort. Additionally, this step helps teams re-evaluate the business rules and logic that have been written as code, to make sure it is clean, curated and still fits with the overall business direction. Through this accelerated, augmented analysis, generative AI takes significant pressure off teams who otherwise would have to sift through all this data. Previously, this led to many teams resorting to ‘lift and shift’ efforts, where they would simply pick up the code and data and move them to analogues on a public cloud provider. This corner-cutting can lead to a hugely inflated cloud budget, which can cause more hesitation about moving to the cloud, or in some cases even prompt a retreat from the cloud altogether. Due to the importance of the process and its impact on time and cost, it is crucial for teams to get this right the first time.
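As a rough illustration of the code-translation step described above, the sketch below asks a language model to document a fragment of legacy SQL in plain English before it is migrated. The snippet, the prompt wording and the call_llm() stub are all hypothetical placeholders, not any specific product's API.

```python
# Illustrative sketch: have a language model explain legacy code before migration.
# call_llm() is a placeholder for whichever approved model endpoint is in use.
LEGACY_SNIPPET = """
SELECT c.cust_no, SUM(o.amt)
FROM ords o, custs c
WHERE o.cno = c.cust_no AND o.stat = 'A'
GROUP BY c.cust_no;
"""

PROMPT_TEMPLATE = (
    "Explain in plain English what the following legacy SQL does, "
    "note any business rules it encodes, and flag anything that looks "
    "obsolete or risky to migrate:\n\n{code}"
)

def call_llm(prompt: str) -> str:
    """Placeholder: connect this to the organisation's approved LLM endpoint."""
    raise NotImplementedError

def document_legacy_code(snippet: str) -> str:
    return call_llm(PROMPT_TEMPLATE.format(code=snippet))
```

The point of a workflow like this is that the model produces the first-pass explanation and flags the suspect logic, while engineers review and refine it rather than writing everything from scratch.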
Generative AI helps to accelerate the process, freeing up human oversight to be more analytical instead of being bogged down by the more time-consuming and manual elements involved.

Bolstering the business case

Modernisation will always be a significant undertaking and one that requires a lot of hard work, diligence, and accuracy. However, by accelerating a lot of the heavy lifting involved in migration, and drastically improving efficiency, teams can instead focus on ensuring the process’s effectiveness. Additionally, the translation to natural language goes a long way towards mitigating the organisational risks associated with the unknown rules and logic embedded within legacy systems. This approach can convince those who are reluctant to migrate that there is a sufficiently strong business case for leveraging these techniques in the execution, despite possible challenges with model accuracy. Those who modernise will put themselves in the best position to thrive in today’s era of generative AI. All AI systems, particularly the large language models that have exploded in popularity recently, are fully reliant on the data they use. If the data is obsolete, duplicated or inaccurate, the model will be rendered useless. As part of the modernisation process, organisations must embrace a mindset that is set up to thrive long-term. What is built today will soon be ‘legacy’, given how quickly the industry is advancing. Having a robust architecture in place that can adapt and evolve as the technology does is therefore crucial, and modernisation is the foundation upon which it must be built.

### Tackling AI’s Data Sovereignty Dilemma

With the advent of ever stricter data privacy and protection regulations – as well as stringent client requirements with regard to where their data resides – organisations have become finely attuned to the importance of storing data within specific geographic regions, lest they make a misstep that will put them out of compliance. But this is where AI can potentially throw a spanner in the works. The data might be stored in a specific geographic region – but when an AI service is run against that data, where does that processing take place? As new regulatory frameworks like the proposed EU AI Act come into play, it will become increasingly important for organisations to ferret out these types of hidden risks if they want to safely leverage AI – and asking a few key questions can help them navigate a safe path forward.

Storage over here, processing over there…

Sometimes – either for cost or operational efficiency reasons – vendors will have compute clusters in various locations around the globe that are in charge of running specific services on the data, whether that’s OCR, search indexing, or AI. This approach can quickly lead to some potential data sovereignty pitfalls. Say that data is stored by the vendor in the EU, but their AI processing is handled by a cluster of servers in the United States. Whenever someone wants to run AI on a document that is stored in the EU, the vendor will send it over to the United States, process it through the AI engine located there, and then send it back to the end user who’s doing the query. This is all invisible to the customer, who is likely unaware that the data has moved and been processed in the United States. The vendor, meanwhile, can place a hand on their heart and swear that the data is stored in the EU – even though it temporarily takes a brief journey across the pond any time AI is run across it.
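One practical safeguard is to make the residency rule explicit in the integration itself, so a document is never dispatched to an out-of-region processing endpoint in the first place. The sketch below is purely illustrative; the regions, endpoint URLs and policy are hypothetical, not any vendor's real configuration.

```python
# Illustrative sketch: only dispatch documents to an AI endpoint in the same
# region where the document is stored. Endpoints and regions are hypothetical.
AI_ENDPOINTS = {
    "eu-west": "https://ai.example-vendor.eu/v1/analyse",
    "us-east": "https://ai.example-vendor.us/v1/analyse",
}

def select_endpoint(document_region: str, allowed_regions: set) -> str:
    if document_region not in allowed_regions:
        raise ValueError(f"Residency policy does not permit region '{document_region}'")
    endpoint = AI_ENDPOINTS.get(document_region)
    if endpoint is None:
        raise RuntimeError(
            f"No in-region AI endpoint for '{document_region}'; "
            "processing elsewhere would move the data out of region"
        )
    return endpoint

print(select_endpoint("eu-west", allowed_regions={"eu-west"}))
```

Few customers can see this far into a vendor's plumbing, which is exactly why the questions that follow matter.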
The lesson here? Customers need to make sure they’re asking their vendors the right questions. Customers should always ask their vendor not only where data is stored, but where processing of that data takes place. Vendors who have placed not just storage servers but AI clusters across the globe – rather than in one specific geographic area – are best situated to ensure that data is both stored and processed in the same region, ensuring it doesn’t leave places it shouldn’t. How many clouds, exactly? As part of their due diligence, customers should also determine just how many cloud providers are being utilised to provide the overall AI service. Is it just one cloud provider? Or is it multiple cloud providers? For example, maybe a vendor stores data on their private cloud, then hands it off to Large Cloud Vendor A, and then over to Large Cloud Vendor B for AI processing, before returning it to their private servers. That’s three different clouds that vendors are moving data across. All three clouds may well be in the proper geographic region, but it becomes harder to pin down the location as more clouds get involved. Additionally, increasing the number of cloud vendors increases the risk profile. The more companies in the supply chain, the greater the risk of something going wrong. All it takes is one weak link in the chain for a seemingly secure service to become compromised. Customers should specifically ask any AI vendor they’re utilising how many vendors they are relying on to deliver the service. Knowing the full extent of the supply chain helps shine a light on the potential risk of that chain. No prying eyes on sensitive files Given the confidential information that so many of today’s organisations traffic in, one of the most essential things to clarify with their vendor is whether their data is being used to train the underlying large language model (LLM) that powers the AI service. The answer should be an unqualified “no.” When the LLM is presented with a document and a question to ask of that document, the LLM should provide an answer and then forget both the document and the question. It should not be storing any of that information unless you have explicitly allowed it to. Additionally, many of the large AI service providers have an “abuse monitoring” program that allows them to monitor the questions being asked of its AI engine, to ensure it isn’t being used in a harmful manner. It’s important for customers to ask whether their vendor is subject to that monitoring policy, or if they are exempt. At the end of the day, customers have a right to know who might be monitoring their traffic and the questions they ask of their AI, as that data trail provides a window into the sensitive content they’ve been entrusted with. Steer clear of potential pitfalls In the rush to embrace AI and make it part of their operations, there are critical areas that organisations can’t afford to overlook if they hope to steer clear of hidden risks. Fortunately, by homing in on the right questions during the due diligence phase, they can minimise potential risk around everything from data sovereignty to how many different clouds the data moves across, to who’s able to monitor the questions being asked of the AI. This careful approach empowers organisations to leverage AI confidently, ensuring better business outcomes while steering clear of any potential pitfalls. 
### How midmarket manufacturers should approach cloud-based ERP The benefits of cloud computing to a business – easy accessibility, more robust data security, scalability, rapid deployment, and more – have been clear to many people for at least a decade now. However, some industries are far less mature when it comes to the cloud than others – manufacturing is one of those. Because many midsized manufacturers started out as factories or workshops, they operate traditionally and have been slower to embrace newer technologies. There's a thought process that goes, 'Why on earth would I change when my on-premise solution works perfectly for my needs?' And some might be even more reluctant to embrace technology and not use an ERP solution at all. That’s understandable in some ways, but it’s also misguided. The world is changing, and even midsized manufacturers that operate traditionally can benefit from cloud ERP. For those thinking of taking a leap of faith, this is how they should go about it. ERP and midsized manufacturers There is a perception that Enterprise Resource Planning (ERP) is a technology most suited to larger organisations. That’s a myth. There is no reason whatsoever why businesses of any size cannot use and benefit from ERP. That’s especially true of manufacturing companies. Who wouldn't want greater transparency and insight, improved productivity, better customer service, reduced costs, and more collaboration? But thousands and thousands of smaller manufacturers rely on manual processes or muddle through with spreadsheets. They likely recognise the need to centralise and streamline business processes but don't have the resources or expertise to go about it. This is where ERP can make a real difference, and ERP in the cloud is even more so. Embracing the cloud The right cloud ERP solution for a midsized manufacturer should facilitate production planning strategies. These strategies are a series of concepts focused on how the supply chain works—do you procure or make-to-stock, do you make-to-order, or even engineer-to-order? Good cloud ERP can combine those different procurement methods, drive the customers, and help them make decisions about their stock management against those strategies. Stock management is also incredibly important for midmarket manufacturers. When they get this wrong, they can experience shortages that can prevent them from delivering on time or have far too much stock that takes up valuable warehouse space. Furthermore, a surplus of stock also affects cash flow, which is often a challenge for midmarket businesses. Another problem that cloud ERP helps to address is knowledge-sharing and collaboration. It's common in smaller manufacturers for individuals to carry critical information on their laptops or, even worse, in their heads. Suppose they go away for a day or two. In that case, the business struggles to operate as no one is certain how the goods should be dispatched, what a particular customer's preferences might be and details around any bespoke pricing. In this case, the cloud ERP solution acts as an operational blueprint, detailing precisely everything anyone would need to know to make the business function effectively. Getting started Regarding best practices when implementing cloud ERP, there are several elements to consider. The first thing they should ask the vendor is about the cloud quality because there are as many different clouds as there are vendors. 
AWS is the market leader, offering all the benefits of AWS innovation, infrastructure knowledge, scalability, and best practices that can be passed on to customers. There are other options too – Azure, GCP, and Oracle all provide high levels of quality and are worthy of consideration. Potential users should always work through a list of questions with the provider. What’s the quality of the cloud? How will migration work, and how best to handle the legacy systems? How do you transition from your old system to the new system? What data do you want to retrieve, and for what purpose? The latter should always be the data that is absolutely core to the business – employees, means of production, IP (how products are made, who your suppliers and customers are) and all financial data. They should also look beyond the infrastructure provider and examine cloud operations policies: backup policy, DR/BCP policies, security measures, SLAs, and more. Manufacturers should then use the standard product as much as possible and not look at customising it. With a cloud subscription, you benefit from the value that comes with the vendor making iterative and incremental improvements and adding new features and functionality. If you have a customised version, that’s different. It won’t evolve in the same way, could run into future compatibility issues, and will slow you down in adopting new features. Finally, change management is one of the most crucial elements to consider. Rolling out any new system is hard. People need to get used to it, and the only way it can work is if the management team shows it believes in the system, leading the way as the first users. If not, the rollout is far more likely to fail. Cloud ERP can be transformative for a midsized manufacturer, future-proofing the organisation not just now but for years ahead. Not taking advantage of cloud ERP could be the biggest misstep the company makes.

### Transforming The Recruitment Process with AI

In recruitment, the integration of artificial intelligence and automation has become a game-changer, showcased by a small number of recruitment businesses including IT recruitment agency TechNET IT – which began leveraging AI in this capacity shortly after the announcement of the COVID-19 lockdown in 2020. Their early adoption of technology during a challenging period allowed the agency to stay ahead of industry trends and continue thriving while others were slowing down and making business cuts. Although it was a risky move, many businesses are now reaping the benefits. But what impacts can AI and automation have on the recruitment process itself – both positive and negative – and how can businesses benefit?

Saving time

For any recruitment agency, exploring AI can be a solution for optimal time management. Tasks like data input, confirmation emails, resume screening, and candidate outreach – typically the most time-consuming work – can now be streamlined through AI; TechNET has shared that they currently save up to 21 hours per week per consultant. This automation liberates valuable human resources, presenting an opportunity to redirect focus toward strategic aspects of recruitment. Embracing such efficiency not only helps meet weekly goals sooner but also allows teams to prioritise more meaningful, people-centric tasks, fostering a well-rounded and effective approach to recruitment. The saved time has prompted businesses to shift to a shorter workweek, keeping productivity intact and allowing their teams to reclaim valuable time as well.
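For a sense of what the automated screening step can look like in its simplest form, here is a hypothetical sketch that scores CVs against the skills in a job specification; the skills list and CV text are invented, and real screening tools are considerably more sophisticated (and need careful bias checks).

```python
# Illustrative sketch of automated CV screening: score each CV by how many of
# the required skills it mentions. Skills and CV text are hypothetical examples.
JOB_SKILLS = {"python", "aws", "terraform", "ci/cd", "kubernetes"}

def score_cv(cv_text, required=JOB_SKILLS):
    """Fraction of required skills mentioned anywhere in the CV text."""
    text = cv_text.lower()
    hits = {skill for skill in required if skill in text}
    return len(hits) / len(required)

cvs = {
    "candidate_a": "Platform engineer: Python, Terraform, AWS, CI/CD pipelines.",
    "candidate_b": "Front-end developer with React and TypeScript experience.",
}

for name in sorted(cvs, key=lambda n: score_cv(cvs[n]), reverse=True):
    print(name, round(score_cv(cvs[name]), 2))
```

Screening of this kind is where the reclaimed hours come from, but it is also where human review matters most, since a crude keyword match will miss good candidates who describe their experience differently.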
In recruitment, consultants are expected to generate enough revenue to cover their overhead costs, including the implementation of technology. However, with today’s advanced recruitment technology, consultants are equipped to thrive and surpass their targets, making it easier to cover their costs and achieve higher revenue.

Attracting and retaining talent

Your tech setup goes beyond being just a tool; it acts as a magnetic force attracting top talent. Streamlined processes not only make life easier for experienced recruiters but also ensure a seamless onboarding experience for new team members. Naturally, job seekers are inclined to partner with those equipped with the best technology – a valuable asset alongside perks like flexible working and more. Integrating advanced AI technology into your systems attracts consultants, offering a preview of tools that enhance efficiency and strengthen connections with candidates and clients.

Increasing engagement

The numbers reveal a significant impact – TechNET records an 89% higher placement rate for candidates engaged through automated processes. This increase in engagement is thanks to the effective use of technology to maintain regular contact, sending well-timed follow-ups that convey genuine interest. This positive engagement experience not only results in successful placements but also strengthens the relationship between recruiter and candidate.

Elevating candidate experience

Technology shouldn’t just automate; it should intelligently customise communication using algorithms and generative AI. This personalised approach enhances interactions with both candidates and clients, allowing consultants to save time on routine tasks and invest more quality time in understanding candidates’ needs and aspirations. The aim is a delicate balance: AI streamlines processes without removing the human touch that candidates and clients value. While AI and automation offer numerous benefits to the recruitment process, there are also some potential drawbacks to consider.

Costly investment

Integrating automation and AI into business operations is a substantial investment, and can often be out of the question for smaller businesses. While the benefits can be significant with proper implementation, it becomes an expensive venture if not utilised judiciously. To manage costs effectively, start by figuring out what your business really needs instead of splurging on tech you might not use. The initial investment and ongoing maintenance costs may outweigh the benefits for some organisations. Tech-savvy enterprises work closely with technology partners to develop bespoke tools, ensuring that the investment in technology is purposeful and aligns seamlessly with the unique requirements of their business.

Over-reliance on technology

While technology brings efficiency, relying too much on AI and automation, especially without proper training, can have its own set of challenges. Recruiters might miss subtle details, and the personal touch in interactions may fade. Additionally, an excessive dependence on technology could lead to complacency, potentially resulting in mistakes as recruiters become accustomed to automated processes. Ensuring a proper understanding and optimisation of AI tools is crucial to maintaining an effective recruitment process. Striking the right balance between technology and human judgment ensures a positive and impactful recruitment experience.
Making the right decisions

Considering these potential benefits and challenges, it’s important for businesses to strike a balance between leveraging AI for efficiency and preserving the human touch in recruitment processes. As businesses contemplate the integration of AI and automation into their recruitment processes, the key takeaway is clear: AI, when strategically implemented, amplifies human capabilities rather than replacing them. The delicate synergy between technology and the human touch is crucial for guiding successful recruitment in the era of AI.

### 5 Best Tips for Transitioning to the Cloud

Organisations of all sizes and across nearly every sector are recognising the immense benefits of cloud-based systems in day-to-day operations. Whether reducing the need for investment in costly IT infrastructure, streamlining data management and insight, or enhancing the scalability and dynamic adjustment of resources, upgrading operations to the cloud has become a key component of effective business transformation. However, with so many benefits on offer, many organisations jump into a cloud transition without a proper assessment of the challenges involved. With a successful cloud transition requiring adequate planning and implementation strategies, we explore the best practices for success.

Ensuring buy-in

Without getting your key stakeholders on board from the outset and maintaining their engagement, defining and delivering your goals in transitioning to the cloud is going to be an uphill struggle. It goes without saying that senior leadership buy-in is crucial, as without engagement from leaders you are unlikely to generate much enthusiasm from the rest of the business, but it’s a mistake to assume these are the only people you need to get on board with the project. Engagement across the entire organisation is necessary for a cloud transition to be successful long-term, and while this obviously includes those in technical roles more intimately involved with delivering the transition, it also means communicating your aims to the wider business and, crucially, the end-users. Ultimately, those using new cloud systems in their day-to-day role will make or break the success of your transition, so it’s vital to get them on board early, consider their points of view and keep them engaged with the project if it is to be successful in the long run.

Getting clear on goals

Every business and organisation is different, working with unique challenges, objectives and circumstances. This means taking the time to identify your “why” when it comes to transitioning to the cloud. As with any large project, it is necessary to pinpoint the potential benefits to know what you want to achieve, and how you will define and measure your success. This might be a rather straightforward reason, such as simplifying cost by opting for a pay-as-you-go model and thereby avoiding large investments in IT infrastructure, or it could involve more complex reasoning, such as achieving more flexible deployment of resources. Whatever it may be, identifying these drivers early on will help you to build a vision for the future and a roadmap for the transition. By spotting potential obstacles and challenges from the outset, they can be proactively mitigated rather than blindsiding you halfway through the project.

Developing a change management strategy

A transition to the cloud, while a complex technical undertaking, can only be successful in the long run if your people are brought on board throughout the journey.
Gaining initial buy-in is the first step, but without a longer-term change management plan, you risk losing engagement and interest. Many organisational change projects, such as a transition to the cloud, fail because there is no strategy in place for supporting and assisting the employees and teams at the very heart of the change. Leaving employees to cope with proposed changes alone, or addressing concerns and feedback too late, or not at all, can endanger the longevity and success of an entire cloud transition project. A change management strategy helps combat these issues. From defining roles and ownership of cloud environments to proactively engaging employees in the transitional process, establishing a clear change management plan and open lines of communication with employees will mitigate the risks associated with organisational change. Correctly timing training Appropriate training for cloud systems is important across all areas of the organisation, not just your technical teams. Failing to build cloud literacy and skills is a missed opportunity: investing in them ensures employees can engage more effectively with cloud systems, delivering a better return on investment in the project. With the fight for talent in a competitive job market representing a severe challenge for many businesses, taking the time to identify existing staff skill sets and then considering how they can best be adapted to fit with cloud-based systems offers the dual benefit of developing better in-house capabilities, and increasing the adoption rates of new systems. However, it is also important to bear in mind that while effective training is crucial, the timing of training can have a huge impact on its efficacy. Leave it too late and you risk major problems around adoption, while training too early means employees are likely to forget what was taught by the time the solution is implemented. It’s therefore advisable to plan training alongside your implementation timeline to ensure it is timed properly. Safeguarding security and compliance Cloud environments can be complex, and the possible security risks associated with a transition to the cloud need to be identified early in the process and mitigated. Without a proper analysis of these security and compliance risks, a transition can create serious data privacy and security problems further down the line. Data is the most valuable asset a company has, so it is crucial organisations of all sizes are aware of and compliant with best practices to keep that asset safe. Trustworthy providers take these industry standards and regulations seriously, and it is paramount to gain a thorough understanding of your chosen provider’s security practices and data encryption protocols to safeguard sensitive information. Final thoughts With many organisations already poised to embrace the benefits of cloud systems, getting a head-start on proper planning and strategy is key to avoiding roadblocks in the middle of a project, and mitigating some of the common challenges of cloud transitioning. Following best practices, from both a technological and people perspective, ensures not only a short-term return on investment but also long-term success. ### Distinguishing Real AI in Cybersecurity The term ‘artificial intelligence’ broadly refers to an IT system’s simulation of human intelligence processes, such as the ability to adapt, solve problems or plan.
These abilities enable AI to mimic human-like cognitive functions, revolutionising industries by providing advanced decision-making abilities and automation across diverse applications. With the advent of OpenAI's human-like conversational AI model, ChatGPT, AI has become widespread in everyday life. However, the huge interest in, and rapid uptake of ChatGPT and other Large Language Models (LLMs) has resulted in organisations exploiting the term "artificial intelligence," seeking to capitalise on its appeal. The term AI is often used loosely and can refer to a variety of different technologies. With so many companies boasting that they have AI capabilities, it’s essential to be able to distinguish ‘real’ AI solutions from those that simply claim they are based on the technology. Distinguishing real AI The most common misconception about AI is that it is synonymous with automation. The reality is that automated systems must be manually configured to execute monotonous and repetitive tasks. AI systems, on the other hand, adapt independently once they have data to process. While AI does leverage aspects of automation, it goes beyond simply executing tasks. Here are the key differences between real AI and technologies purporting to be based on it: Training: AI systems use machine learning (ML) to generate algorithms that continuously learn from data they are fed, and use statistical algorithms to identify patterns. On the other hand, there are also intelligent systems that don't use AI. These systems are simpler and just follow sets of predefined rules and instructions. Imagine it like a flowchart: if this happens, do this; if that happens, do that. They don't learn or adapt like AI systems do; they simply follow whatever rules they're given. Continuous learning: AI is designed to continuously learn and improve over time. As new data is made available, an AI system can retrain itself to enhance its capabilities and accuracy – just as we’ve seen with every new iteration of ChatGPT. However, solutions that rely on automation are limited in scope, and can only perform specific tasks within the constraints of pre-programmed rules. Decision-making: AI is designed for non-repetitive tasks, so it can analyse situations and make decisions without human intervention - whereas automated systems are incapable of making autonomous decisions. Benefits of real AI for cybersecurity AI has great potential when used in cybersecurity. It can quickly analyse vast amounts of data and detect patterns indicative of cyber threats, enhancing threat detection and response capabilities. By leveraging AI algorithms, cybersecurity systems can adapt and evolve to counter new and sophisticated cyber attacks more effectively. Automation makes it possible to combat automated bot attacks and alleviate alert fatigue, enabling analysts to apply their knowledge and skills more efficiently. Real AI offers benefits such as: Improved performance over time: Solutions using ML improve performance over time due to the ability to learn from experiences and network patterns to refine their effectiveness. This brings new levels of adaptability to security defences and steps up accuracy in detecting anomalies in standard network activity. Improved threat detection: Thanks to its ability to learn and adapt to changes in malicious cyber actor behaviour, AI improves threat detection by identifying patterns that human analysts cannot – or at least identifying patterns much faster. 
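To make the rule-versus-learning distinction above concrete, here is a minimal illustrative sketch; the traffic figures, threshold, and model settings are invented for illustration and are not drawn from any product mentioned in this article. A hand-written rule only ever applies the threshold it was given, while a machine learning model such as scikit-learn's IsolationForest infers what "normal" looks like from the data it is fed.

```python
# Minimal sketch contrasting a rule-based check with a learning-based one.
# The data and thresholds are illustrative only, not from any real product.
from sklearn.ensemble import IsolationForest

# Hourly outbound traffic per host in MB (illustrative sample data).
traffic_mb = [[12], [15], [11], [14], [13], [16], [12], [15], [240], [14]]

# Rule-based system: a fixed, hand-written threshold. It never changes
# unless a human edits the rule.
RULE_THRESHOLD_MB = 100
rule_flags = [volume[0] > RULE_THRESHOLD_MB for volume in traffic_mb]

# Learning-based system: the model infers what "normal" looks like from
# the data itself and flags statistical outliers (-1 means anomaly).
model = IsolationForest(contamination=0.1, random_state=42).fit(traffic_mb)
ml_flags = [prediction == -1 for prediction in model.predict(traffic_mb)]

print("Rule-based flags:", rule_flags)
print("Model-based flags:", ml_flags)
```

The point is not the specific model: the second approach adapts automatically when the definition of "normal" shifts, which is the continuous learning behaviour described above.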
It adds value when detecting unknown threats and is a powerful ally when dealing with customised APT (advanced persistent threat) attacks. Helping address talent shortages: Through analysing large amounts of data, AI can identify patterns, anomalies, and potential threats much faster than human analysts. These capabilities don’t mean human expertise isn’t relevant; rather, they allow us to stay ahead of the curve by discovering evolving threats and detecting attacks in near real-time. In this respect, AI enables us to do more in less time; an enormous benefit to cybersecurity teams struggling with talent shortages. Better endpoint protection: AI-based endpoint detection and response tools such as WatchGuard's EPDR and EDR establish a behavioural baseline for endpoints. WatchGuard’s Zero Trust Application Service, included in both solutions, only allows applications classified as ‘trusted’ to run on each endpoint. Malicious or unknown applications and processes are classified within a maximum of four hours and blocked by AI in 99.98% of cases - and WatchGuard’s technical experts handle the remaining 0.02%. An AI-powered XDR solution such as WatchGuard's ThreatSync, which uses these security products as a foundation, can continuously learn, adapt, and improve its threat detection and response capabilities. By using AI and ML technologies to alert us to potential threats in real-time and across multiple domains, it reduces mean time to detection (MTTD), adds greater visibility and enables multi-product response. Simply put, these measures help build more robust security. So to distinguish ‘real’ AI in cybersecurity, we can employ a simple test - assessing whether the system demonstrates genuine learning capabilities, such as the ability to analyse and adapt to new threats autonomously, rather than merely executing predefined rules or algorithms. Real AI systems should also show demonstrable efficiency in processing large volumes of data, identifying complex patterns, and generating insights beyond what traditional approaches can achieve. What is clear is that AI is a definite boon in the ever-evolving battle against cyberattacks – just ensure the tools and solutions you are buying into are really AI-driven. ### Transforming customer engagement with omnichannel solutions In today's business landscape, the importance of delivering exceptional customer experiences cannot be overstated. As technology advances, customers expect more from their service providers when it comes to communication channels, preferring personally tailored interactions and care over largely automated systems such as interactive voice response systems. It is frustrating to be fed back answers from a script that are not relevant to the enquiry, especially in time-pressured situations. At the same time, organisations are increasingly looking to move away from their more costly communication channels, such as call centres, opting for more advanced automated solutions. Chatbots have evolved, particularly with many now enhanced with generative AI, but these solutions still don’t suit every customer or situation. Some customers still prefer, and at times require, a personal conversation. An effective way to handle this is to retain chatbots and other automated solutions for simple customer queries across channels, but to implement an intelligent escalation system, underpinned by an omnichannel solution.
This doesn’t eradicate the telephone option; it just allocates resources more effectively and in a way that does not negatively impact the general customer experience.  Companies are constantly seeking creative ways to engage with their customers seamlessly across various communication channels. Omnichannel solutions offer a transformative approach that not only caters to diverse customer preferences but also leverages artificial intelligence to enhance the overall customer journey as well as internal processes. Omnichannel functionality At their core, omnichannel solutions differ from single-channel or multi-channel solutions, because they offer a coordinated approach. This means that all channels work seamlessly to provide a consistent experience for customers, considering historical interactions and immediate data capture, as opposed to multi-channel solutions, which operate independently from one another, and single-channel solutions that only provide one point of contact for customers.  Omnichannel solutions are the future of customer service, as they empower customers, allowing them to choose their preferred communication channels and provide a consistent experience for customers across the channels they use. Whether it's telephone, SMS, WhatsApp, or live chat, the omnichannel approach ensures that all inquiries are managed in a unified manner.  AI integration in omnichannel solutions AI plays a pivotal role in optimising the functionality of omnichannel solutions. The integration of omnichannel solutions with platforms like Microsoft 365 and Teams goes beyond external communication. APIs extend the reach of these solutions, enabling communication and collaboration with various external systems, and further enhancing overall efficiency and productivity. The ability to analyse context enables intelligent routing of calls, ensuring inquiries are directed to the right person with the necessary expertise. Agents can triage and manage cases using the same platform, creating consistency in their workflow. They can be assigned customers based on specific channels, skills, and capacity, further streamlining internal processes. The intelligent routing, facilitated by AI, ensures that each incoming enquiry is directed to the most suitable agent. It fosters seamless collaboration internally, empowering agents to communicate and seek advice from colleagues effortlessly. Additionally, AI-driven automation extends to script generation or dynamic information provision for agents based on the nature of the conversation. This deeper context and increased personalisation contributes to more efficient and tailored customer interaction. Importantly, the deeper context enabled by the AI-powered solution prevents customers from needing to repeatedly explain their issues by summarising key points, reducing frustration, and enhancing overall efficiency.  From an agent's perspective, omnichannel solutions provide a uniform interface, regardless of the communication channel. Agents can therefore focus on providing quality service rather than navigating disparate interfaces and information.  Data utilisation and analysis The data generated by omnichannel solutions then becomes a goldmine for organisations seeking to understand customer behaviour and preferences. By tracking customer interactions across all channels, businesses can get a more complete picture of who they are and their unique requirements. This information then helps organisations to fine-tune marketing strategies and improve customer service. 
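As a rough, hypothetical sketch of the skills- and capacity-based routing described above (the agent names, skills, and capacities are invented, and real omnichannel platforms draw on much richer context such as interaction history and sentiment), the core allocation logic can be as simple as filtering on channel and skill, then picking the least-loaded agent:

```python
# Simplified, hypothetical sketch of skills- and capacity-based routing.
# Agent names, skills and capacities are invented; real omnichannel
# platforms use far richer context (history, sentiment, priority).
from dataclasses import dataclass

@dataclass
class Agent:
    name: str
    skills: set
    channels: set
    capacity: int    # maximum concurrent conversations
    active: int = 0  # current workload

def route_enquiry(enquiry: dict, agents: list):
    """Pick the least-loaded agent who covers both the channel and the topic."""
    candidates = [
        a for a in agents
        if enquiry["channel"] in a.channels
        and enquiry["topic"] in a.skills
        and a.active < a.capacity
    ]
    if not candidates:
        return None  # e.g. queue the enquiry or escalate to a supervisor
    best = min(candidates, key=lambda a: a.active / a.capacity)
    best.active += 1
    return best

agents = [
    Agent("Asha", {"billing", "renewals"}, {"whatsapp", "live_chat"}, capacity=4),
    Agent("Ben", {"technical"}, {"telephone", "live_chat"}, capacity=3),
]
print(route_enquiry({"channel": "live_chat", "topic": "billing"}, agents).name)  # Asha
```

In a production platform this decision would also draw on the historical interactions and immediate data capture discussed earlier, but the principle of matching channel, skill and capacity is the same.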
AI-driven data analysis helps identify trends and topics, facilitating informed decision-making in the future. Predictive analytics, powered by historical data, aids in demand forecasting, allowing organisations to allocate resources efficiently and optimise their services. Insights derived from data analysis provide valuable information for refining strategies and addressing emerging customer needs in real-time. AI-powered omnichannel solutions can harness natural language processing and sentiment analysis, adding further layers of sophistication to customer interactions. Sentiment analysis can identify when a customer is distressed by evaluating aspects like tone of voice and the occurrence of certain words. If the sentiment becomes extremely negative, automated triggers can be set up, such as alerting supervisors that a difficult conversation is taking place and that assistance or escalation is necessary. The technology itself can then add the supervisor to the call, making it a three-way conversation where the customer is getting the help they need. Despite this increase in data sharing, there are also sophisticated governance and security measures in place. For example, recording can automatically stop when the customer is about to share confidential information or pay for something. This is furthered by multi-language capabilities for inbound and outbound chat messages, catering to diverse customer needs, and reflecting a commitment to inclusivity. Live language translation is especially important in areas like healthcare and housing, where being understood is crucial. The seamless escalation from chat to voice or video calls ensures that customer interactions remain smooth and frustration-free, irrespective of the chosen communication channel. Ultimately, omnichannel solutions deliver incredible agility when it comes to customer communications and data capture, but organisations must navigate certain challenges during implementation. Customers may be resistant to new technologies or have security concerns, so it’s important to ensure that they understand that their preferred channels are open to them and that their data will be entirely secure. In this regard, successful omnichannel customer service requires a strategic approach to integration, and a customer-centric mindset to overcome challenges and deliver a seamless experience across all touchpoints. ### Cloud and AI Can Beat the Tough Times Cloud Industry Forum's latest primary research is, as usual, a must-read take on the "State of the Cloud" landscape. Titled "Tough times, but innovation springs internal", here are some highlights and why you should follow the link to download your copy: Tough times is the global backdrop 82% of our respondents said the current global economic climate had at least some impact on their IT spend. Of those, 30% said that the impact had been extreme while 52% said at least some projects had been put on hold as a result. Add to that, 50% said cloud migration had been more complex than expected, and 41% said costs were out of control or too high. That highlights the need for better planning and a focus on cloud economics, and explains the rise of the FinOps topic. But our data shows innovation around AI is a positive, confirmed by the fact that the so-called "magnificent seven" of Apple, Nvidia, Tesla, Microsoft, Alphabet, Amazon, and Meta, all closely associated with AI, collectively increased in market value by 111% in 2023.
Most organisations use more than one cloud For the first time in over 10 years of research, everyone we surveyed is using cloud services and technology. 48% say they have a cloud-first strategy, and 48% are hybrid. We live in a hybrid cloud/multicloud world. Our data confirms Artificial Intelligence has the fastest new tech adoption ever Almost all have AI in their plans, but less than 18 months since it exploded on the scene, two-thirds are already using it. To emphasise the point, 96% report it will benefit, or has already benefited, their organisation. We are in a new era, whether we call it the third wave of the Internet, the fifth industrial revolution, or the age of AI, and the innovation couldn't have matured as quickly, or spread so widely, without cloud as the key enabler. There's positive news for the planet, but not enough 44% of our survey respondents say they are fully committed to their ESG and sustainability initiatives. That's good news for climate change and the environment, but what about the rest? Unfortunately, our data highlights that cost is still king in the crucible of the selection process. There are some sustainability champions, but as a sector we need to do better, particularly as new AI functions increase the need for computing, storage, and network capacity. Follow our download link to see the graphs, trends, analysis, and the rest of the story that our data supports, so that you can adjust your operational plans and sales approaches accordingly. The bottom line is that decision-makers and employees are overwhelmingly positive about the cloud's potential: its flexibility, its agility, its role as an enabler supporting this age of AI, and the positive impact they will have together. ### How Businesses Should Tackle Big Data Challenges In today's data-driven landscape, Big Data plays a pivotal role, encompassing vast amounts of information from various sources within an organisation. Its magnitude and diversity make it a valuable asset with the potential to revolutionise businesses across sectors. In the UK alone, Big Data is estimated to be worth over £16.8 billion, with a significant number of organisations, around 432,000, embracing its technology. However, despite its potential, many Big Data projects fail to deliver on expectations - but why? The Workforce Isn’t Well Equipped With Big Data Knowledge One major challenge lies in the lack of expertise among employees. While technical issues are apparent, the shortage of individuals with proficient Big Data skills poses a significant obstacle. According to the Department of Science, Innovation, and Technology, there are approximately 215,000 roles in businesses demanding advanced data skills beyond basic IT proficiency, yet there's a scarcity of qualified applicants. To address this, organisations can focus on upskilling existing staff, offering opportunities for professional development in data science, and leveraging user-friendly analytics tools like Amazon SageMaker. It Can Be Difficult to Interpret Big Data Another hurdle is the difficulty in interpreting Big Data effectively. While it holds immense potential, extracting valuable insights requires understanding and interpreting the data accurately. This is where Artificial Intelligence (AI) comes into play. AI analytics can sift through vast and complex datasets efficiently, identifying patterns and connections that might elude human analysis.
Combining AI with human expertise allows for thorough review and refinement of insights, ensuring their accuracy and relevance. There Are Privacy and Cybersecurity Risks Moreover, cybersecurity and privacy risks are significant concerns associated with Big Data. With the increasing frequency of cyberattacks, organisations holding sensitive data are vulnerable to breaches that can result in substantial losses and damage to reputations. Compliance with regulations like the GDPR further complicates data management efforts. To mitigate these risks, companies must implement robust security measures, regularly update protocols, and utilise AI tools for real-time monitoring and threat detection. Additionally, AI and robotic process automation can streamline compliance efforts by identifying areas for improvement and ensuring adherence to data regulations. Large Data Sets Can Contain Errors Data quality is crucial for businesses, encompassing accuracy, relevance, and completeness. However, larger datasets often contain inaccuracies, errors, and duplicates, leading to mistakes and inefficiencies. This can result in significant financial losses for firms, with UK businesses estimated to lose £244 billion annually due to poor data quality. To tackle this issue, automated cleansing tools can be employed to identify and rectify duplicates, anomalies, and missing data. Establishing clear data quality standards and regularly assessing them is also vital. Difficulties in Integrating Big Data Additionally, the diversity of Big Data presents integration challenges, making it difficult to unify multiple file formats from various sources. Traditional tools struggle with this, often leading to data segregation into silos. Cloud storage and management tools offer a solution by efficiently sharing information between databases and consolidating them without the need for costly transfer procedures. Data virtualisation tools, such as CData Software, further enhance Big Data visibility by allowing access to information from various sources without physically moving it. Big Data is Difficult to Store in One Place As data continues to expand into terabytes and exabytes, effective data storage management becomes crucial. Without the right architecture and infrastructure, businesses may miss out on deriving value from their data assets. 85% of UK companies manage digital data, according to the UK Business Data Survey 2022, suggesting there is a substantial demand for effective storage solutions. Transitioning to cloud storage allows for scalable data management and cost reduction. Techniques like compression, deduplication, and automated data lifecycle management can also minimise storage requirements. What Are The Ethics of Big Data? Moreover, Big Data collection raises ethical concerns, as it increases the risk of including personally identifiable details and biases in AI systems. Establishing a data ethics committee and implementing regular review processes for data collection and usage policies are essential steps to address these concerns. Scrubbing data of identifying factors and avoiding storing irrelevant details can help eliminate bias-prone information and reduce ethical concerns. Eliminating Big Data Challenges is Worth It Despite the challenges, strategically approaching Big Data can yield significant benefits. The Centre for Economics and Business Research estimates that Big Data analytics benefits the UK economy by £40 billion annually.
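Returning to the data-quality point above, here is a small, hypothetical example of the kind of automated cleansing described there, using pandas; the column names and rules are invented and would need tailoring to a real dataset. Duplicates are dropped, records missing a key field are removed, and malformed numeric values are coerced and imputed.

```python
# Minimal, illustrative example of automated data cleansing with pandas.
# Column names and rules are hypothetical, not from any cited survey or tool.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [101, 101, 102, 103, None],
    "email": ["a@x.com", "a@x.com", "b@x.com", None, "d@x.com"],
    "annual_spend": ["1200", "1200", "950", "abc", "700"],
})

clean = (
    raw
    .drop_duplicates()                # remove exact duplicate rows
    .dropna(subset=["customer_id"])   # drop records missing a key field
    .assign(annual_spend=lambda df: pd.to_numeric(df["annual_spend"], errors="coerce"))
)
# Impute the remaining malformed spend values with the column median.
clean["annual_spend"] = clean["annual_spend"].fillna(clean["annual_spend"].median())

print(clean)
```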
Enterprises that tackle common obstacles effectively can harness the promise of Big Data and drive innovation and growth. ### UK IP Benefits and How to Get One There are many reasons why you may get a UK public IP address. Whether you're a UK citizen working remotely from a different part of the world or a foreign expat looking for a digital link to the UK, getting a UK IP address will be beneficial for you! This digital key is of great use to people who live outside the United Kingdom because with it they can bypass geo-restrictions and access their favourite British sites for free. To get ahead of this, you need to install a VPN to access geo-restricted UK websites. In this article, we not only discuss the top VPNs but also provide clear directions on how to get a UK IP. The process of bypassing geographical restrictions and enjoying the best of British websites and streaming services will be explored as you move forward. What is an IP Address? An IP address is a unique address allocated to every device on a network. It serves two purposes: identifying network devices and specifying their location, much as a physical address does for a house or an office. This unique identity allows computers to send and receive information from other devices on a network. Currently, there are two common IP address standards: IPv4 and IPv6 – the abbreviations for Internet Protocol Version 4 and Internet Protocol Version 6 respectively. IPv4 addresses are 32 bits long, but the Internet's fast expansion has led to the rise of IPv6, which uses 128 bits. IPv6, designed in 1995 and standardised in 1998, started gaining traction around the mid-2000s and is still evolving. A VPN is necessary to access British internet content from anywhere worldwide. A VPN encrypts the data your device sends through the Internet, raising the security level a notch by blocking unauthorised connections and unwanted online traffic. Importance of Getting a UK IP Address Getting a UK IP address is very important as it provides users with many advantages that go beyond geographical boundaries. Here are the main reasons why it is useful to have a UK IP address: Secure Online Banking: Having a UK IP allows you to retain access to the online banking services of UK banks like Barclays, NatWest, Lloyds, and HSBC, and helps avoid security issues. Privacy and Security: A British IP address provides greater online privacy and security as it appears as if the user is browsing from the UK. This also blocks security intrusions and protects personal information. Access to Exclusive Material: A UK-based IP address allows access to exclusively UK content, for example websites, news portals, TV series, and movies on platforms like UK Linux, ITV Hub, BBC iPlayer, and All4. Unblocking Sports Coverage: A UK IP address helps to overcome geo-restrictions that some sites like Sky Sports or BT Sport impose, allowing you to stream British sports events. How to Get a UK IP: A Step-by-Step Guide The best way to access UK-only internet content is to get a physical IP address in the UK or to use a VPN. Follow the given step-by-step directions to get a UK IP with a VPN easily. Step 1. Sign up with a VPN provider who has servers in the UK, e.g. CyberGhost or any of the other providers listed below. Step 2. Download and install your VPN provider's client application on your device. Step 3. To access the VPN service, open the client app and sign in with your credentials. Step 4.
Pick a server in the United Kingdom from among the ones available in the client app. Step 5. Once connected to the UK server, go to https://iplocation.io/ to make sure that your IP address now shows the UK. Having set up a VPN connection, you will now be free to access sites and services that were previously inaccessible because they are restricted to the UK. The following section lists the top recommended VPN services for getting a UK IP address. Best VPNs for a UK IP Address In this section, we discuss five VPNs that emphasise privacy and security while still offering good speeds. Explore the benefits and features of these easy-to-use VPNs so you can choose one for a stress-free UK IP address. NordVPN NordVPN is a VPN giant with over 400 VPN servers in the UK across its extensive network. It has been operating for more than a decade, specialising in unblocking geo-restricted content and providing blazing-fast speeds and user privacy via solid encryption. NordVPN allows you to browse freely and access UK content. Pros: Operates 400+ servers in the UK Unblocks popular UK streaming services Impressive speed performance Prioritizes user security and privacy Comprehensive support and live customer service Cons: Desktop apps may have a learning curve Slightly expensive Pricing: https://nordvpn.com/pricing/bundle-site/ AtlasVPN AtlasVPN, having started only in 2019, boasts at least 30 server locations, including the UK. Its major features include the popular WireGuard protocol, and it is great for streaming. It supports unlimited concurrent connections. AtlasVPN is a solid VPN with simplified features and decent support. Pros: Fast WireGuard protocol Works with major UK streaming services Unlimited simultaneous connections Servers in 30+ countries Solid support is available 24/7 Cons: Fewer servers compared to others Limited advanced features Pricing: https://atlasvpn.com/pricing ExpressVPN ExpressVPN is a resilient all-rounder with over 3,000 high-speed servers worldwide, including a sizeable UK presence. It is trusted for its reliability, high speeds, and excellent customer care. It is brilliant at accessing geo-locked streaming services. Enjoy browsing the web with ExpressVPN's vast server range. Pros: Fast and reliable servers in the UK Excellent for streaming UK content abroad High-end security features 24/7 live chat support Robust privacy protection Cons: Limited advanced options Relatively higher cost Pricing: https://www.expressvpn.com/order CyberGhost CyberGhost offers almost 700 servers in the UK and over 8,000 servers worldwide, the largest server selection available. It is adept at unblocking a wide range of content and is widely known for its fast speeds, user-friendly applications, and strong privacy protection. Benefit from online freedom and security with CyberGhost's worldwide server network. Pros: Largest server selection in the UK Top-notch speeds for streaming User-friendly apps Strong privacy and online security Extensive server network Cons: Blocked in China and UAE Limited control over advanced features Pricing: https://www.cyberghostvpn.com/en_US/buy/cyberghost-vpn-3 Surfshark Surfshark is the cheapest VPN provider, with more than 3,000 servers around the world, including the UK.
It has a reputation for enabling unlimited device connections at once, as well as high bandwidth and unbreakable encryption, making it perfect for streaming and online security. Surfshark offers a VPN experience that combines effectiveness with affordability. Pros: Strong unblocking capabilities Fast connection speeds No logs kept Affordable pricing Allows unlimited simultaneous connections Cons: Occasional connection slowness Limited advanced options Pricing: https://surfshark.com/pricing Conclusion Acquiring a UK IP address is a great digital key that brings many benefits, such as access to exclusive content, safer online banking, unblocked sports broadcasts and more. This article discusses the importance of a UK IP address and provides clear instructions on how to get one using a VPN, focusing on the ease of overcoming geographical restrictions. The article further outlines some of the top-class VPNs such as NordVPN, AtlasVPN, ExpressVPN, CyberGhost, and Surfshark, which all have remarkable user security, speed, and dependable access to UK content. Choose a VPN that satisfies your demands for a better online experience without geographical boundaries. ### CIF Presents TWF – 2024 state of the cloud report In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), we are focusing on the primary research from Cloud Industry Forum published 13th May. This is our annual "state of the cloud" report. This year it's called "Tough times, but innovation springs internal" with the subtitle of "How cloud is the key technology enabler helping businesses innovate in a turbulent period". Instead of the usual interview, our host and CIF CEO David Terrar will summarise the report and its themes, which highlight cloud's ubiquity, the amazing rate of adoption of GenAI, and the importance of cloud supporting this new era. ### CIF Presents TWF – Jay Patel In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), we are getting the Cloud customer perspective. We're talking to someone with in-depth experience of enterprise-level public cloud adoption and management, from his time at some major well-known brands. Our host and CIF CEO David Terrar will be interviewing Jay Patel, Head of Public Clouds at Citi, and before that Senior Engineering Manager at Maersk. We'll be understanding his strategy with cloud, how he selects suppliers, how important people and process are relative to technology, and getting his 2.0 approach to digital transformation. Please subscribe to the event for a fascinating in-depth session. ### Navigating the Landscape of AI Adoption in Business In today's rapidly evolving technological landscape, the integration of Artificial Intelligence (AI) into businesses has become both a necessity and a challenge. Tech giants have invested substantial resources in AI development, resulting in impressive technological advancements. However, despite these efforts, consumer adoption of AI beyond a few standout applications like ChatGPT remains limited. The reasons for this disparity are multifaceted. Usability issues, trust concerns regarding data privacy, and a general lack of awareness about AI capabilities all contribute to the slow pace of adoption. Moreover, the applications themselves often lack integration into existing systems, creating complexity and posing a barrier to entry for users, further hindering widespread adoption. The pressing question now is whether this significant investment in AI will yield the expected consumer uptake.
Over the coming year, businesses must closely monitor employee engagement to gauge the extent to which AI tools are used in their working lives. Additionally, attention should be paid to whether investment trends continue to favour infrastructure development and support new tools for integration into the workplace. Investment and Infrastructure: Investment trends in AI offer valuable insights into the direction of AI adoption in businesses. Currently, there is a noticeable emphasis on infrastructure development, with substantial investments directed towards AI research and technology infrastructure. However, it is imperative for businesses to assess whether these investment strategies align with employee needs and preferences. While infrastructure development is crucial for advancing AI capabilities, investment in new applications is equally essential for driving widespread adoption. Striking the right balance between infrastructure development and specialised workplace tools is key to fostering AI adoption across various industries. Adoption Across Different Industries:  The adoption of AI varies significantly across different industries, with some sectors embracing it more readily than others. Slow adopters include industries such as healthcare, construction, and businesses relying on legacy systems, where concerns about compatibility and integration often slow down the adoption process. Conversely, industries like marketing and advertising, which thrive on creativity and innovation, are more inclined to embrace AI solutions. However, even in these creative fields, there are concerns about preserving human intuition and creativity in the face of increasing automation. Understanding these industry-specific dynamics is crucial for businesses seeking to implement AI solutions effectively.  By recognising the unique challenges and opportunities within each industry, businesses can tailor their AI adoption strategies to align with authentic growth and innovation. Bridging Humanity and Technology: In most industries, bridging humanity and technology is paramount. While AI tools offer valuable insights and automation capabilities, they should complement rather than replace human intuition and creativity. Success lies in striking a delicate balance between leveraging AI's capabilities and preserving human intuition. This integration of AI and human expertise allows businesses to harness the power of technology while retaining the unique perspectives and insights that only humans can provide. By fostering collaboration between humans and AI, businesses can optimise productivity, drive innovation, and achieve better outcomes across various industry sectors. Understanding Your Employees Monitoring employee engagement provides crucial insights for businesses seeking to understand the extent of AI tool usage. By tracking interactions with AI technologies, businesses can tailor their technology tool stack to support employees effectively. Recognising that this is a revolutionary process is crucial, as emerging technology is continually evolving to optimise and streamline business functions. Likewise, it’s not just about understanding employee behaviour; it's also about evaluating the complexity level of integration into existing workflows. Businesses require AI platforms that seamlessly enhance efficiency without causing disruption. Expanding AI Adoption Beyond ChatGPT: The mass adoption of consumer-friendly AI tools like ChatGPT is one of the biggest technology shifts of the century. 
Within two months, ChatGPT gained over 100 million users, making it the fastest-growing consumer app in history. Researchers are finding that AI can outperform humans in a variety of different tasks, such as coming up with new business ideas. But while ChatGPT has emerged as a frontrunner in AI adoption, businesses must recognise that it is just one piece of the larger AI landscape.  Numerous AI tools and platforms offer diverse functionalities tailored to different business needs. For instance, plug-ins like Plus Docs can be very helpful to streamline workflow processes by converting text-based documents into visually appealing presentations. While ChatGPT may assist in brainstorming the general outline of a presentation, automated tools can take it a step further by automatically generating slides according to user instructions. This combination of AI-generated drafts and human input allows for the creation of polished presentations that resonate with both coworkers and customers. Unlike standalone AI platforms, there is a rise of plug-ins that prioritise user experience, ensuring quick adoption and maximum impact. By exploring a diverse range of AI solutions, businesses can unlock new opportunities for innovation and growth across various industries. Opportunities and Challenges The opportunities presented by AI adoption in various business functions are vast, ranging from sales assistance and content creation to project management and customer service. AI-driven solutions have the potential to revolutionise these areas, offering increased efficiency, personalisation, and scalability. However, along with these opportunities come challenges, including ethical considerations, data privacy concerns, and algorithmic bias. It's imperative for businesses to navigate these challenges responsibly, prioritising transparency, accountability, and ethical AI practices. Final Thoughts In conclusion, navigating the landscape of AI adoption in business requires vigilance, adaptability, and a commitment to responsible innovation. By regularly reviewing their technology tool stack, aligning investment strategies with employee needs, and leveraging new AI technologies like plug-ins, businesses can drive widespread adoption and unlock the full potential of AI technology. As we embrace the opportunities and confront the challenges of AI adoption, we pave the way for a future where businesses are empowered to make conscious decisions about AI utilisation to achieve increased success. ### Three Ways to Strengthen API Security APIs (Application Programming Interfaces) are a critical driver of the digital economy. Observers predicted that Nvidia’s recently launched Omniverse Cloud APIs will propel new digital twin software tools that will supercharge design, simulation and operation processes. However, with digital advancements such as this, there are always significant security considerations. APIs, while indispensable, have become one of cybercriminals’ favourite vectors for account takeover attacks. Credential stuffing, business logic abuse, and DDoS attacks are just some of the malicious automated bot attacks deployed to take over accounts and perpetrate identity theft and fraud. The ease with which these attacks can be mounted, thanks to widely available tools and scripts, underscores the inadequacy of traditional defence mechanisms in addressing the modern threat landscape. In fact, our recent research study revealed that 84% of respondents admitted to not having advanced API security in place. 
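As a simple illustration of the account-takeover pattern described above, and not a substitute for a dedicated bot-management or API security product, a basic first line of defence is to count failed logins per source IP over a sliding window; the window length and threshold below are purely hypothetical.

```python
# Illustrative sketch: flag possible credential stuffing by counting failed
# logins per client IP inside a sliding window. Thresholds are hypothetical.
from collections import defaultdict, deque

WINDOW_SECONDS = 60
MAX_FAILURES = 20

recent_failures = defaultdict(deque)  # ip -> timestamps of recent failed logins

def record_failed_login(ip: str, timestamp: float) -> bool:
    """Return True if this IP now looks like an automated attack source."""
    window = recent_failures[ip]
    window.append(timestamp)
    # Discard failures that have aged out of the window.
    while window and timestamp - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_FAILURES

# Example: 25 failed logins from one IP within a few seconds trips the check.
print(any(record_failed_login("203.0.113.7", t * 0.1) for t in range(25)))  # True
```

Fixed rules like this are easy for attackers to probe and stay just beneath, which is exactly why the article goes on to discuss more adaptive, AI-based approaches.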
Meanwhile, only 14% of companies surveyed viewed using AI technologies in API security as a priority. Furthermore, sectors with stringent regulatory requirements, such as finance and insurance, reported a lack of sufficient resources to effectively detect API threats. So, what do organisations need to do to improve the security of their APIs? Prioritising API Security as a Strategic Objective Elevating the importance of API security within an organisation is imperative. We know that the majority of companies (95%) have experienced API security problems in the last 12 months, so recognising the need for a strategy - because of how ubiquitous this risk is - is a crucial first step towards this goal. ‘Insufficient budget’ and a ‘lack of expertise’ are the most common reasons for a lack of action on developing a comprehensive strategy. This is surprising given that the reputational and operational cost of a breach far outweighs the price of deploying a consolidated web application and API security solution. Therefore, organisations need to realign their strategic objectives by adopting comprehensive security strategies that go beyond conventional measures and protect their digital assets effectively. This means preparing not just for current threats but also anticipating future risks, and having adaptable services that allow organisations to react as those risks emerge. Consolidated Security Solutions: Streamlining Protection The first step in reinforcing defences is to, where it makes sense, integrate web application and API security solutions from a single provider. This consolidated approach ensures a seamless security process across all digital touchpoints, reducing the complexity and potential gaps that could be exploited by attackers. Using a single provider brings multiple benefits and efficiencies. These include a reduction in complexity and workload associated with managing multiple security systems as well as enhanced threat visibility and decision-making. In addition, a unified solution facilitates integration with existing IT infrastructure, which can lead to a more streamlined and operationally efficient way to manage and mitigate risk. It’s crucial for companies to evaluate potential providers carefully, ensuring that they offer comprehensive and adaptable solutions that meet their security requirements without creating additional silos. The Role of AI in API Security Enhancement Incorporating AI-based tools into a business's security arsenal could be a step forward in tackling the complexity of the API threat landscape. Our report found that 58% of security professionals anticipate that generative AI will have a ‘large or very large’ impact on API security in the next 2-3 years. This expectation increases to 75% among financial institutions and insurers. Investigating AI-based tools for API security offers a promising pathway to strengthening defences. AI and machine learning algorithms can sift through extensive data sets to identify complex patterns and anomalies that may indicate cyber threats. This improves the accuracy and timeliness of threat detection through AI’s ability to uncover previously unknown attacks. What’s more, the enhanced ability to forecast potential security issues can empower organisations to take more preventative measures against risks and better anticipate future challenges. That said, there is unfortunately little enthusiasm for this. Only 14% of the individuals surveyed regarded the use of AI technologies in API security as a top priority.
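To make the kind of pattern- and anomaly-based detection described above more concrete, here is a rough sketch using scikit-learn's LocalOutlierFactor over invented per-client API traffic features; a real deployment would use far richer signals (endpoints hit, payload shapes, token reuse, geo-velocity) and considerably more data.

```python
# Rough sketch of ML-based anomaly detection over per-client API traffic.
# Feature values are invented for illustration only.
from sklearn.neighbors import LocalOutlierFactor

# Each row: [requests per minute, error rate, distinct endpoints touched]
clients = [
    [12, 0.02, 3], [15, 0.01, 4], [10, 0.03, 3], [14, 0.02, 5],
    [11, 0.02, 4], [13, 0.01, 3], [16, 0.02, 4],
    [480, 0.40, 42],   # scripted client hammering many endpoints
]

lof = LocalOutlierFactor(n_neighbors=4)
labels = lof.fit_predict(clients)   # -1 marks an outlier

for row, label in zip(clients, labels):
    if label == -1:
        print("Investigate client with profile:", row)
```

The advantage over the fixed-threshold check sketched earlier is that the notion of "normal" is learned from the traffic itself rather than hard-coded, so it can surface previously unseen attack patterns.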
While the potential of AI to enhance API security is significant, concerns about the accuracy, complexity and management of AI systems persist. Organisations need to stay agile, continuously updating their AI security tools to effectively combat evolving cyber threats. Elevating API Security APIs are fundamental to the digital economy, making their security paramount for businesses. Fortunately, the means to enhance API security are available and accessible. By prioritising API security, leveraging consolidated security solutions and exploring AI’s potential, companies can protect their digital assets more effectively. As we move forward, it’s essential to focus not only on driving innovation but also on ensuring the security of the digital infrastructure that supports it. ### A Comprehensive Guide To The Cloud Native Database [2024] Databases are crucial for storing and managing important information. Moreover, they are the backbone of many operations, such as financial transactions. However, the way we store data is evolving. Cloud computing offers a new approach, and cloud-native databases are changing how businesses manage information in the digital age. This guide thoroughly explores these powerful tools. Let’s unpack their benefits, potential drawbacks, and implementation strategies together. Understanding Cloud-Native Databases A cloud native database (DB) is a service designed specifically for deployment and management within cloud environments. There are several key characteristics of a cloud native database: Cloud-Based Delivery: Because these databases are built, deployed, and delivered entirely through cloud platforms, organisations benefit from simpler management and operations. Scalability and Reliability: These databases are designed to scale up or down easily to meet changing data storage demands. They also offer high levels of reliability, ensuring consistent uptime and data availability. Cloud-Native Technologies: Cloud-native databases often leverage technologies like Kubernetes, a container orchestration platform. Kubernetes allows for flexible and scalable deployment of the database. Cloud-native databases offer features like on-demand scaling and built-in redundancy. These features simplify management and allow businesses to focus on strategic initiatives. Now, let's explore the key differences between traditional and cloud-native databases to understand the advantages of the latter. Traditional vs Cloud Databases The distinction between cloud and traditional databases has become a key consideration for businesses. But what exactly sets these two storage solutions apart? The key difference lies in where the databases are housed. Traditional databases reside on physical servers within an organisation's data centres. While powerful, they present challenges. For example, scaling up storage or processing power can be cumbersome. Plus, security breaches pose a greater threat due to the on-premise nature of the data. Additionally, managing these physical servers also diverts valuable IT resources from core business activities. Cloud databases, on the other hand, offer a more dynamic approach. Imagine renting a secure, scalable storage locker in the cloud. Providers like Amazon Web Services or Microsoft Azure handle the infrastructure, freeing organisations from hardware headaches. Here's where cloud databases outshine their traditional counterparts: Scalability on Demand: Cloud databases adapt to fluctuating data volumes. Need more storage for that surge in customer activity?
No problem – scale up with ease. Resilient by Design: Cloud providers boast robust infrastructure. They ensure high data availability with redundant systems. If a hiccup occurs, data can be swiftly replicated and restored, minimising downtime. Cost-Effective Choice: The "pay-as-you-go" cloud computing model translates to significant cost savings. Organisations pay only for what they use, avoiding the upkeep of unused hardware. Cloud databases offer a clear advantage over traditional options. However, before a complete switch, it is important to consider any potential limitations of cloud-native databases: Complexity: Cloud-native environments, especially those involving containers and Kubernetes, introduce complexities. Managing databases within them adds another layer of difficulty. The initial setup might seem straightforward, but the real challenge lies in ongoing management. Automating application provisioning requires a skilled team with expertise in Kubernetes and related tools. Finding personnel with this specialised knowledge is a challenge. Businesses may find themselves needing to invest in costly training. Also, companies might require partnerships with IT recruitment agencies to source this talent. Security: Cloud-native database security adheres to a shared responsibility model. Here, cloud providers offer a secure infrastructure. Meanwhile, organisations remain accountable for safeguarding their data. They need to monitor for unauthorised access, cyber threats, and data breaches continuously. This vigilance requires specialised knowledge of cloud security best practices. Compliance: A cloud-native database is a powerful tool. But making sure data follows regulations and localisation rules is complex. Strict rules regarding data storage and access vary by industry and geographic location. Businesses must navigate legal complexities to avoid penalties or disruptions. Cloud storage offers flexible, on-demand data storage with lower costs compared to traditional on-site servers. However, managing cloud databases can be trickier and requires more security awareness. That said, let’s explore strategies for using cloud databases next. Implementation Strategies Migrating to a cloud-native database offers exciting possibilities. However, careful planning and execution are crucial for a smooth transition. This section outlines key strategies to ensure a successful implementation: 1. Pre-Migration Planning Needs Assessment: Before starting the migration journey, thoroughly assess your current data storage requirements. First, analyse your data volume and growth projections. You can then identify any applicable compliance regulations. Cloud Provider Selection: Evaluate cloud providers based on factors like pricing models, security measures, and integration capabilities with your existing infrastructure. Data Migration Strategy: Develop a plan for transitioning your existing data to the cloud-native database. Consider phased migrations, using tools for bulk data transfer, or leveraging continuous data replication mechanisms to minimise downtime. Explore options for data cleansing and transformation tools if needed. 2. Implementation and Deployment Provisioning and Configuration: Work with your chosen cloud provider to set up the cloud-native database instance and configure security settings. Data Migration Execution: Transfer your existing data to the cloud-native database. Depending on your chosen approach, you might need bulk loading tools or continuous replication mechanisms.
Integration with Applications: Modify your applications to connect with the new cloud-native database. These changes could involve updating connection strings, API calls, and any relevant data access logic within your codebase. 3. Post-Migration Optimisation Performance Monitoring: Once migrated, closely monitor the performance of your cloud-native database. Tools offered by your cloud provider can help track key metrics like query response times and resource utilisation. Security Best Practices: Implement security measures to protect your data in the cloud. For example, reinforce access controls, use strong encryption (at rest and in transit), and regularly monitor for potential security threats. Ongoing Maintenance and Support: Stick to a consistent update schedule for the cloud-native database software. This way, you can benefit from bug fixes and performance enhancements offered by the vendor. Use the support resources for any technical assistance needed. 4. Additional Considerations Team Training: Moving to a cloud-native database may require your IT team to get new skills and knowledge. Invest in training your team with the chosen platform and best practices for cloud-based data management. Plan for Disaster Recovery: Develop a full plan to keep the business running in case of disruptions. Cloud providers offer features that keep your data safe and minimise downtime. These features include automated backups, global data redundancy, and swift backup system transitions during outages.  Cost Optimisation: Cloud-native databases allow you to manage expenses by scaling resources up or down based on your real-time requirements. Monitor resource usage and explore cost-saving strategies offered by your provider. Migrating to a cloud-native database requires a roadmap. This section charted a course for a smooth transition. However, the real question is: what problems do cloud-native databases solve best? Let’s go over some instances where cloud-native databases shine. List of Use Cases Cloud-native databases aren't just theoretical concepts. Here are some scenarios that reflect the power of cloud-native database technologies: Modern App Development: Cloud-native database is a perfect fit for modern applications built with microservices architecture. This approach breaks down large apps into smaller, independent pieces. Cloud technologies can easily manage the data needs of each service within these applications. Real-time Analytics and Big Data: Today's businesses increasingly rely on real-time data insights. Cloud-native databases can handle the high volume of data generated by real-time applications. Their ability to scale on-demand ensures smooth operation even during spikes in data flow. Globally Distributed Applications: Many businesses operate across multiple geographical locations. Due to their distributed nature, cloud-native databases can be easily deployed across different regions. This flexibility ensures compliance with data residency rules and low latency for users globally. Serverless Applications: Serverless computing is gaining traction due to its cost-effectiveness and scalability. Cloud-native database integrates with serverless architectures, letting developers focus on application logic, not infrastructure. IoT and Connected Devices: The Internet of Things (IoT) generates massive sensor data. With its ability to handle high concurrency, Cloud DB is ideal for storing and processing data. 
Machine Learning (ML) and Artificial Intelligence (AI): ML and AI require vast datasets for training and analysis. Cloud native databases provide a scalable platform to manage these massive data volumes. Cloud native database is more than just a fancy term. It is a useful tool that helps businesses of all kinds. This type of database can handle big data, work with modern apps, and adjust as your needs change. Conclusion Cloud native database offers a compelling solution for businesses seeking agility and scalability in data storage. By leveraging the expertise of cloud providers, organisations can free up IT resources and focus on core business activities. With a careful approach, cloud database technologies can be a powerful tool for driving digital transformation. ### AI is the future foundation of business’ ESG frameworks ESG has emerged as a key focus for businesses worldwide. Presently, all major EU corporations are in the process of applying the Corporate Sustainability Reporting Directive (CSRD) to their 2025 reports. While the CSRD primarily applies to EU-based companies with an annual turnover exceeding €150 million, it serves as a precursor to broader sustainability regulations. For example, 16 pieces of legislation affecting the entire value chain of the retail industry, from product conception to marketing, are due this year. This transition is already influencing various industries. In addition to regulatory requirements, investors are increasingly using ESG ratings to guide their investment decisions, and this behaviour will only increase as ratings are standardised. Furthermore, customers are growing more conscious of ‘greenwashing’ practices, which will potentially alter their purchasing and investment decisions based on how companies are rated. To prepare for these transformative shifts, businesses must elevate their ESG protocols. This isn’t only important for meeting regulatory requirements but is also fundamental for a business’s comprehensive auditing and strategic plans. Enhanced ESG planning enables finance departments to integrate ESG considerations into financial frameworks, gain deeper insights into operational efficiency opportunities and costs, and inform strategic decision-making processes. Improved margins and cash flows, streamlined reporting, and greater visibility into a business’s performance are the result. Establishing the link between a company’s financial plans and its ESG framework has been a challenge for FP&A. To address this, businesses need to support a solid strategy with modern and innovative tools. Traditional data analysis tools and spreadsheets are simply not sufficient. McKinsey’s research underscores how cloud-powered technologies like artificial intelligence, machine learning, and the Internet of Things can accelerate almost half of businesses’ net zero initiatives (47%). This highlights the critical role of technology in helping organisations meet sustainability and regulatory requirements while competing effectively. AI for data processing Data surrounding ESG frameworks is complex, as it is derived from multiple sources and comprised of both structured and unstructured data created across an entire business value chain. For example, the environmental component of ESG must consider each step in a supply chain, from shipping and packaging to cloud usage and employee commuting. To tackle the challenges associated with deriving value from ESG data, AI techniques such as Generative AI can be applied. 
AI is well suited to automating previously manual efforts in the gathering, merging, and analysis of data. It can combine ESG data with other related data sources, summarise vast amounts of text, and generate actionable insights in a fraction of the time required by traditional methods. However, AI-derived outcomes can only be as good as the data to which they are applied. To maximise AI’s benefits, companies must ensure access to clean, accurate data from both internal and external sources. Accelerating access to bad data through natural language is obviously detrimental to any AI initiative, and thus it is key for organisations to have a solid data strategy in place before any AI techniques are leveraged. When used correctly, however, machine learning, generative AI, and even video/image processing can provide teams with a broader and more comprehensive understanding of their data. Organisations are already using classical AI techniques to process ESG data across the business, and while Generative AI use cases are emerging, initial results are promising as natural language prompts allow analysts to interrogate data and generate plain-English analyses, automate ESG report creation processes, and reduce errors. This level of data integration and analysis is key for creating an overall ESG plan. Businesses must be able to track, monitor, and adjust according to varying sustainability goals. The best way to use AI is to collect, standardise, and aggregate data for each metric involved in ESG reporting. Once this is in place, businesses can more easily understand their individual ESG targets. An intelligent planning tool can facilitate these tasks, especially if it provides capabilities beyond simple “AI-washing” and allows businesses to truly work with AI in a seamless and integrated fashion. But can we trust AI yet? Despite its transformative potential, trust in AI remains a concern. While companies are under pressure to improve productivity and efficiency with techniques such as AI, some mistrust around the technology exists. This is evident in the industry, as in March, the European Parliament approved the Artificial Intelligence Act, which takes a risk-based approach to ensure companies release products that comply with the law before they are made available to the public. ESG regulations demand that companies have audit points that are fully explainable and transparent, and creating matching processes should be a top priority for finance teams deploying AI. High profile examples of generative AI ‘hallucinating’ stem largely from its probabilistic behaviour along with the quality of the inputs used to pre-train, ground, prompt, or fine-tune the models. ChatGPT, for example, is trained on data from the publicly available internet, which can contain inaccuracies and biases. To improve outcomes, organisations can look to use specialised large language models, where models have been tuned or trained with proprietary or domain-specific data. This can reduce hallucinations, again assuming that the data is of sufficient quality and accuracy, as specific context is readily available to the models. Data teams can also employ techniques such as RAG, ReAct, model fine-tuning, or prompt engineering to improve the quality of their responses. And, as models can suffer from ‘AI drift’, whereby the model becomes less accurate over time, it’s important that humans are able to easily verify outputs via grounded references and other background sources provided in the model responses. 
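To make the grounding techniques mentioned above a little more concrete, here is a minimal, illustrative sketch of retrieval-grounded prompting (the pattern usually labelled RAG) in Python. Everything in it is hypothetical: the `Passage` records, the toy keyword-overlap `retrieve` function (a production system would typically use embeddings and a vector store) and the `llm_complete` placeholder stand in for whichever retrieval layer and model provider an organisation actually uses. The point is simply that answers are generated only from retrieved, cited source passages, which is what makes human verification of the output practical.

```python
# Illustrative retrieval-grounded prompting sketch; all names and data are hypothetical.
from dataclasses import dataclass


@dataclass
class Passage:
    source: str  # e.g. a report, spreadsheet, or supplier disclosure
    text: str


def retrieve(question: str, corpus: list[Passage], top_k: int = 3) -> list[Passage]:
    """Toy keyword-overlap ranking; a real system would use embeddings."""
    terms = set(question.lower().split())
    ranked = sorted(corpus, key=lambda p: -len(terms & set(p.text.lower().split())))
    return ranked[:top_k]


def build_prompt(question: str, passages: list[Passage]) -> str:
    """Ground the model in retrieved passages and ask for citations,
    so a reviewer can trace every claim back to its source."""
    context = "\n".join(f"[{i}] ({p.source}) {p.text}" for i, p in enumerate(passages, 1))
    return (
        "Answer using ONLY the numbered passages below and cite them like [1].\n"
        f"{context}\n\nQuestion: {question}\nAnswer:"
    )


def llm_complete(prompt: str) -> str:
    """Placeholder for a call to whichever LLM the organisation has approved."""
    return "Logistics emissions fell 8% year on year [1]."


if __name__ == "__main__":
    corpus = [
        Passage("scope3_logistics.csv", "Logistics emissions fell 8% versus the prior year."),
        Passage("commuting_survey.pdf", "Average employee commute distance rose by 2 km."),
    ]
    question = "How did logistics emissions change year on year?"
    print(llm_complete(build_prompt(question, retrieve(question, corpus))))
```

The same structure also gives 'AI drift' a natural checkpoint: because every answer carries references, periodic spot-checks against the cited sources can flag when outputs stop matching the underlying data.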
Companies can mitigate these flaws and optimise their AI use Businesses have ample opportunities to safeguard against the potential pitfalls of AI, as carefully compiling high-quality datasets will set a strong foundation for any AI initiative. Customisations can be introduced to your AI integration points to improve overall accuracy, and solutions can be deployed with appropriate management and monitoring capabilities to ensure continuous, accurate outcomes. Furthermore, clarity and consistency of standards within emerging ESG regulatory frameworks will be important to ensure deeper standardisation across datasets, reports, and metrics, and make it easier to compare information from numerous sources. Timelines for achieving these objectives, however, will depend on several factors, including the pace of technological advancements, industry adoption rates, and the formulation of regulatory guidelines. The integration of AI into ESG functions holds the promise of resource savings for businesses. However, before embarking on this journey to streamline ESG reporting through technology, a thorough assessment of available data and an evaluation of how technology can foster new best practices are essential steps. ### Are Deep Fakes the Ultimate Threat to Privacy and Security in AI? Episode 6 of the AI Show, hosted by Emily Barrett, discusses the intersection of technology and privacy with Nigel Cannings, CEO and co-founder of Intelligent Voice, whose unique blend of legal expertise and passion for tech brings a fresh perspective to the table. Nigel emphasises the importance of secure data processing in sensitive sectors. Intelligent Voice focuses on privacy, offering on-premise data encryption to protect information from external risks. The episode also touches on the challenges of deepfake technology and the balance between leveraging AI benefits and understanding its implications. The conversation concludes with reflections on the current state of AI advancements. ### Cloud is a business turbo-charger We live in momentous times for IT. Developments such as the modern democratisation of AI, the emergence of quantum computing, the rise of the Internet of Things, the fast-growing metaverse and new cybersecurity threats obsess us. Not only that, these shifts force us to prepare for the potentially seismic effects on how we conduct ourselves as organisations. And, underpinning all of these is cloud computing, the great uber-platform, enabler, and catalyst for rapid innovation. Such a glut of technological changes can potentially overwhelm CIOs, never mind non-technical executives left floundering in an alphabet soup of TLAs (three-letter acronyms) and techno-jargon. In addition, if we’re not careful, that can prompt an analysis paralysis dynamic. So, sometimes it pays to zoom out and take a broader look at how IT choices relate to the business outcomes we’re all trying to achieve. With all the above in mind, Unit4 recently took the pulse of IT decision-makers in a series of snap surveys that covered nearly 300 responses. The resulting findings provided some sharp insights into what cloud technology can help to deliver in the real world of organisations. Cloud: Why all the fuss and what’s the business angle? Sometimes, amid all the jargon, the business reasons for Cloud can get lost in the mix and even the smartest IT leaders can fail to explain the big picture.
That is: Cloud technology is far more cost-effective, fast-moving, and relatable to the needs of the organisation than the old on-premises, datacentre-bound, legacy world. We began our survey by asking our panel about their primary business driver for moving to the cloud, and the biggest response by far, with 45% of votes, was agility and the chance to be innovative. Whether it’s special offers such as dynamic pricing promotions, an experimental new product line, branching out to fresh sales or marketing channels, or something else entirely, Cloud affords the chance to experiment without high costs or ancillary risks. A related reason given by the panel was customer responsiveness (22%) and, in the age where customer experience and CRM are so front of mind, that shouldn’t be a surprise. Cloud provides the opportunity to create one-to-one customer experiences and customise on a mass scale...and all at speed and at a reasonable price. But there are defensive reasons for strategic cloud investments too, and cybersecurity is a notable advantage. This is a modern change: once, Cloud was seen as a risk factor because IT departments could no longer hold all the cards when it came to maintaining cyber defences. But today, Cloud’s inherent security models and the abilities of cloud data centre operators to run highly efficient Security Operations are viewed as a big competitive differentiation factor over the on-premises world of perimeter defences. Finally, 12% of respondents said Cloud-based solutions would let them get ahead of competitors. However, with Cloud becoming a default for so many, that window of opportunity is shrinking fast. Don’t leave me this way We returned to the notion of competitive advantage in our second question, asking whether respondents felt they risked being left behind if they didn’t invest in Cloud. More than half of respondents (56%) did feel that fear, saying they worried about losing out on innovation. And, while about a quarter (26%) felt confident in being ahead of the curve, the remaining 18% only felt they were moving at the same pace as competitors. Modern business history tells us starkly that the race for innovation is usually won by the swift, so the message is clear: early adopters trump fast followers… and you certainly don’t want to be a laggard. Cloud: What’s not to like? For our third and final question, we asked respondents to name the leading business performance benefit derived from Cloud technologies. Here, the answers were a potent mixture of the pragmatic and the innovative. One prosaic advantage of Cloud is reduced upfront costs, with servers, storage, software licences and even data centres no longer needed to the same extent, or even at all. Also, there are large cuts to be made in administrative tasks, floor space and power consumption. In all, over half the audience (53%) said reduced infrastructure costs added up to the number-one business performance advantage of Cloud. Another advantage, as we have already stated, is all about bolstering whatever the business wants to do (and of course providing the platform to help make those smart decisions). Quick access to innovation was cited by 26% of our panel and our other results also had an innovative slant. More than one in 10 (11%) nominated more collaborative teams and, of course, Cloud showed its end-to-end joined-up mettle in the pandemic when we were forced into remote virtual teamworking. A side-effect: faster decision-making was voted for by 9% of respondents.
So, there we have it, and perhaps the best question that all CIOs should ask of their business-minded executives is this: what possible reason can you give for not wanting to move to Cloud? ### GenAI needs to be used responsibly - here’s how AI continues to dominate headlines, with businesses increasingly recognising the value it brings to the workplace. And yet, more than one in four organisations have now banned the use of GenAI due to apprehensions around privacy and data security risks. As GenAI continues to make headway, its pivotal role in shaping the future of work is undeniable. Implementing a blanket ban is not a viable business strategy, and will not serve as the easy fix you’re looking for. Business owners need to cultivate an environment in which employees are encouraged and supported to leverage AI tools responsibly. To foster responsible adoption and use of generative AI within companies, a multi-faceted approach is essential. Whitelist approved tools that meet organisational demands Every business has unique needs and requirements, so begin by clearly identifying and approving generative AI tools that meet your organisation’s standards for compliance, cost-effectiveness and practicality. This selective endorsement process ensures that only vetted applications are used. Your team is likely already using some (think ChatGPT, GitHub Copilot, Grammarly, etc.) due to their widespread availability, so it’s wise to streamline applications to ensure coherence and compliance. Develop an AI policy If your organisation lacks a specific AI usage policy, make one. This policy should be drafted after a thorough assessment of your organisation’s current AI usage, and clearly articulate the types of data that are permitted for AI interactions, as well as those that are off-limits. Through an established policy, you’re able to set out rules and responsibilities for employees and prevent unauthorised AI use, mitigating the risk of sensitive data exposure. Without this, you risk subjecting yourself to an array of headaches down the line, like violating NDAs by inputting private information into a third-party source. Staying informed about upcoming laws and regulations like the EU’s AI Act is also essential to guarantee industry best practices. Naturally, this also helps to cultivate a more cohesive organisational structure, allowing employees to utilise GenAI tools with confidence. Do ensure that you continue to update the policy over time and that your employees are kept informed along the way. After all, they are the main users of this and, without their compliance, your organisation could find itself at risk. Educate and train employees With an estimated 30% of hours spent on work activities today likely to be automated in the future, it’s essential that employees are upskilled to be able to work with AI quickly and confidently. Currently, only 7% of respondents surveyed have received any AI skills training in the last 12 months. Empower your workforce by providing education and training on the opportunities and limitations of generative AI. Offer a comprehensive overview of what GenAI-powered tools are, their capabilities and safe and effective usage practices - it’s easy to forget that this is an alien concept for most! Make sure that you discuss its potential applications within the organisation, whether this be for automating processes, predictive maintenance, data analysis or something else entirely.
It’s also worth addressing ethical and legal issues surrounding GenAI, such as bias in algorithms and hallucinations, and to open a forum for Q&As. Once the theory is put in place, you can organise hands-on training, offering tutorials and workshops so employees can interact with AI software and put what they’ve learned into practice. Select privacy-conscious services When choosing the right GenAI software for your business, opt for services that do not use your data to train their models publicly, or that offer mechanisms to keep your data secure within a proprietary framework. You can do this by carefully reviewing privacy policies before utilising the software to understand their unique data practices, including what they collect, how they use it, and with whom it’s shared. Recent customer feedback is a good starting point for some desk research. If there isn’t transparency and clarity in their policies - avoid! Fun fact: reputable providers like Microsoft, Alphabet, and OpenAI are known for their robust security measures and options for data privacy and commercial data protection. Promote an AI-forward culture Encourage a culture that embraces the transformative potential of generative AI through strategic initiatives, cultural shifts and practical implementation. As you might expect, leadership buy-in goes a long way. With senior management loudly championing AI initiatives, the tone is set for embracing technological advancements, with more junior staff members feeling empowered to take advantage of these tools themselves. In a company promoting an AI-forward culture, most (if not all) actions and ideas are supported by Generative AI. This does not mean that the GenAI results are always incorporated. On the contrary, they can be actively rejected. They should, however, become a core part of business as usual and be utilised by default, not as an option. On a more practical note, all employees should be equipped with the necessary tools and resources to facilitate AI development. There’s no point in promoting an AI-forward culture if your infrastructure is unable to cope with what that entails! Consider investing in data infrastructure, cloud computing resources, and collaboration tools tailored to AI projects. By adopting these strategies, your organisation will not only ensure the responsible integration of generative AI in the workplace but also position itself to fully leverage the potential of emerging technologies. This proactive approach will enable your company to optimise operational efficiency, spur innovation, and sustain a competitive advantage, all while safeguarding the integrity of your digital infrastructure and data. ### Navigating Cloud Migration in Banking Motivated by many compelling factors, an increasing number of banks are moving their operations towards Cloud-based platforms. A major catalyst is the substantial cost-saving potential the Cloud presents. Cloud computing empowers financial organisations to slash both their operational expenses and capital expenditures by eliminating the necessity for extensive physical infrastructure and the accompanying maintenance expenses. The Cloud offers unparalleled levels of customisation and scalability, enabling businesses to quickly adapt their resources to meet evolving demands. This flexibility is further bolstered by the availability of XaaS (Everything-as-a-Service) solutions, which make deployment processes and operations smoother and easier.
Cloud providers usually offer robust redundancy and uptime service-level agreements, ensuring service reliability. Additionally, security and compliance concerns are comprehensively addressed within Cloud environments, with providers offering advanced security measures and compliance assurances. These advantages position Cloud computing as a highly efficient option for banks and other businesses, one that significantly mitigates the complexities and challenges associated with traditional infrastructure management. Cloud transitions in the banking sector Banks today face a pressing imperative to adapt to the escalating demands of digitisation, efficiency, and innovation to maintain their competitiveness. However, juggling cumbersome legacy tech stacks whilst navigating a complex regulatory landscape, data security and protection concerns, and an array of other issues is (understandably) likely to evoke a sense of apprehension about the unknown. Technology leaders in banking are confronted with a multitude of priorities: enhancing products and propositions, ensuring compliance with regulations, modernising legacy systems, and optimising speed and agility. They also need to enrich the customer experience, boost operational efficiency, reduce costs, fortify security and operational resilience, and explore novel business and operating models. Broader business imperatives also come into play. Environmental, Social, and Governance (ESG) considerations continue to be a priority for the finance sector. They significantly influence technological decisions - underscoring a collective responsibility towards sustainability. The extent to which cloud migration can support sustainable IT is a topic of discussion. According to Microsoft, transitioning to the Cloud could make an organisation up to 98% more carbon efficient. This stems from consolidating computing resources within data centres equipped with clever cooling mechanisms and energy-management protocols. Engineering in quality for successful migrations It is widely reported that restrictions on transferring legacy systems, organisational culture, and infrastructure requirements are three of the biggest hurdles for banks transitioning to the Cloud. These concerns carry substantial weight, as any missteps in these areas can trigger significant repercussions. This underscores the paramount importance of comprehensive testing throughout Cloud migration to preserve data integrity, optimise the performance of applications, and avoid unexpected costs or performance glitches in the new cloud environment. Whilst testing plays an undeniably crucial role in cloud migration, embracing a Quality Engineering approach goes above and beyond conventional testing methods and offers additional risk mitigation benefits. Implementing Quality Engineering within the banking industry is crucial, particularly in addressing perceived limitations in cloud transformation. The Quality Engineering approach integrates stringent practices throughout the delivery lifecycle to ensure the successful completion of projects within budget and schedule. By proactively identifying and resolving issues early on, Quality Engineering mitigates the risks associated with cloud migration, ensuring that the new cloud infrastructure not only meets functional requirements but also adheres to performance, security, and scalability standards.
Furthermore, the lessons learnt can be transferable, leading to continuous improvement in future technology projects and bringing long-term operational benefits. This alignment of business objectives with technical outcomes empowers banks to navigate the complexities of cloud migration more efficiently, enhancing the reliability, resilience, and overall success of cloud initiatives.

Tips for Successful Cloud Migration:

Tailor your strategy. Develop a comprehensive strategy specifically addressing the unique challenges and risks faced by your business during cloud migration. A tailored approach is essential for mitigating potential obstacles.

Embrace Quality Engineering. Adopt a Quality Engineering approach that starts early and spans the entire software-delivery lifecycle to minimise risk.

Leverage automation. Utilise automation tools and performance testing to further mitigate migration risks and streamline the testing process. Automation enhances efficiency and accuracy while reducing the likelihood of errors.

Prioritise transparency. Implement transparent reporting mechanisms to provide stakeholders with real-time visibility into testing progress. Informed decision-making is crucial for navigating cloud migration successfully.

Partner with experience. When embarking on Cloud migration, consider the resources and capabilities that will be crucial for success. If you need that additional knowledge and capacity, choose a provider with a proven track record in seamlessly migrating applications and services across IaaS, PaaS, and SaaS to ensure quality outcomes.

For further information on Roq’s Quality Engineering services for firms in sectors including retail, banking, healthcare, automotive, public sector and more, visit www.roq.co.uk. ### How AI is transforming the battle against fincrime In the ever-evolving world of financial services, combatting financial crime remains a top priority for governments, regulatory bodies, and financial institutions alike. Traditional methods of fraud detection and prevention are proving inadequate in the face of increasingly sophisticated criminal tactics. Enter Artificial Intelligence (AI) – a potential game-changer in the fight against financial crime. The challenge Financial crime encompasses a wide range of illicit activities, including money laundering, fraud, terrorist financing, and cybercrime. These activities not only pose significant risks to financial institutions but also undermine trust in the integrity of the global financial system. Despite concerted efforts to combat financial crime, criminals continue to exploit vulnerabilities, costing economies billions of pounds annually. Traditional approaches to detecting financial crime tend to rely heavily on rule-based systems and manual processes, making them reactive and prone to errors. Moreover, the sheer volume and complexity of financial transactions make it increasingly challenging for human analysts to detect and react to suspicious activities in a timely manner. Enter AI Artificial Intelligence offers a paradigm shift in the fight against financial crime, leveraging advanced algorithms, machine learning, and big data analytics to detect, prevent, and mitigate risks more effectively. Unlike rule-based systems, AI-driven solutions can analyse vast amounts of data in real-time, identifying patterns and anomalies that may indicate fraudulent behaviour. Machine learning algorithms can adapt and evolve based on historical data, continuously improving their accuracy and effectiveness over time.
By analysing transactional data, user behaviours, risk profiles, and other relevant information, AI algorithms can uncover hidden patterns and correlations that may go undetected by traditional methods. Enhanced detection capabilities One of the most significant advantages is AI’s ability to enhance detection capabilities. AI-driven systems can analyse data from multiple sources simultaneously, enabling organisations to identify complex fraud schemes and illicit activities more efficiently. Moreover, AI algorithms can detect subtle deviations from normal behaviour patterns, flagging suspicious transactions or adverse changes to customer risk profiles for further investigation. By automating the detection process, organisations can reduce false positives and focus their resources on high-risk activities, thereby improving operational efficiency and reducing costs. Predictive analytics for risk assessment Predictive analytics, powered by AI, plays a crucial role in preemptively identifying and mitigating financial risks. By analysing historical data and emerging trends, predictive models can forecast potential areas of vulnerability and proactively implement preventive measures. These models assess various risk factors, including customer behaviour, transactional data, and market trends, to anticipate and prevent fraudulent activities before they occur. By identifying emerging threats and adapting their strategies accordingly, organisations can stay one step ahead of criminals and minimise their exposure to financial crime. Real-time monitoring and prevention The speed at which fraudulent activities can occur necessitates real-time monitoring and prevention mechanisms. AI-powered solutions enable organisations to monitor customer risk profiles and activities in real-time, flagging suspicious behaviour instantly for further investigation. By integrating AI algorithms with transaction monitoring systems, financial institutions can identify potential fraud in milliseconds, minimising the impact and preventing financial losses. Moreover, AI-driven fraud prevention systems can employ advanced authentication methods, such as biometric recognition and behavioural analysis, to verify the identity of users and detect unauthorised access attempts. Enhanced compliance Money laundering and terrorist financing pose significant threats to the integrity of the global financial system, requiring robust measures to detect and prevent illicit activities. AI technologies play a crucial role in identifying suspicious transactions and entities, enabling organisations to comply with anti-money laundering (AML) and counter-terrorism financing regulations more effectively. By analysing transactional data and identifying patterns indicative of money laundering or terrorist financing activities, AI algorithms can help financial institutions flag suspicious transactions for further investigation. Moreover, AI-driven solutions can enhance transaction monitoring capabilities, enabling organisations to identify complex money laundering schemes and illicit networks more efficiently. The future As the threat landscape continues to evolve, AI has become indispensable in the fight against financial crime. By harnessing the power of advanced algorithms, machine learning, and big data analytics, organisations can detect, prevent, and mitigate risks more effectively than ever before. However, while AI offers tremendous potential in combating financial crime, it is not a silver bullet solution. 
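To make the anomaly-detection idea in this article more tangible, the sketch below trains an unsupervised outlier model on historical transaction features and flags unusual new transactions for human review. It is a toy illustration, not a production control: the feature names and data are invented, and it assumes the scikit-learn and NumPy libraries are available. Real systems would combine far richer behavioural features with rule-based checks, model governance and analyst review queues.

```python
# Illustrative only: unsupervised outlier flagging on invented transaction features.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Hypothetical features per transaction: [amount_gbp, hour_of_day, km_from_home_branch]
historical = np.column_stack([
    rng.lognormal(3.5, 0.6, 5000),   # typical spend amounts
    rng.integers(8, 22, 5000),       # mostly daytime activity
    rng.exponential(5.0, 5000),      # usually close to the customer's home branch
])

# Train on historical behaviour; assume roughly 1% of it is anomalous
model = IsolationForest(contamination=0.01, random_state=0).fit(historical)

new_transactions = np.array([
    [45.0, 13, 2.0],         # ordinary-looking purchase
    [9800.0, 3, 4200.0],     # large amount, 3am, far from the usual location
])

# predict() returns 1 for inliers and -1 for outliers
for tx, label in zip(new_transactions, model.predict(new_transactions)):
    status = "flag for review" if label == -1 else "ok"
    print(tx, status)
```

In practice, flagged cases of this kind would feed an investigation queue rather than trigger automated blocks, keeping a human in the loop.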
Effective implementation requires collaboration between industry stakeholders, regulatory bodies, and technology providers to ensure that AI-driven solutions are deployed ethically and responsibly. In the years to come, AI will undoubtedly play an increasingly central role in safeguarding the integrity of the global financial system, enabling organisations to stay one step ahead of criminals and ultimately better protect their customers. ### Navigating Data Governance AI transformation has become the strategic priority du jour for organisations looking to innovate and stay relevant in the digital age. The ability to rapidly analyse vast amounts of data and provide valuable insights makes it a crucial tool for decision-makers. Hundreds of billions of dollars are being invested globally in AI. Everyone in tech is investing, from the webscalers to the high street, along with governments, financial services companies, healthcare providers, manufacturers, aviation, automotive and agriculture - the list goes on. Not surprisingly, industry analyst Gartner identified democratised generative AI, AI trust, risk and security management, and AI-augmented development as the top three strategic technology trends for 2024. Consumers have been fast to adopt the technology. ChatGPT reached 1 million users five days after its launch in late 2022. In some quarters AI has been characterised as a threat to jobs, and in others a threat to humanity. Several countries block access to ChatGPT, including obvious candidates like North Korea and China, but also less obvious candidates like Italy, which introduced a temporary ban last year due to privacy concerns and continues to be wary. No matter: at the last count, more than 180 million people have created accounts and 100 million access it weekly. AI-driven solutions, algorithms, and technologies have the potential to help businesses improve efficiencies, boost customer satisfaction and deliver competitive advantage. But AI transformation is not without its challenges. The power of AI lies in data, yet according to McKinsey, 72% of organisations say managing data is a key challenge preventing them from scaling AI. Where that management takes place will be crucial to solving the challenge. That ‘place’ will undoubtedly be the cloud where, as of 2022, 60% of the world’s corporate data was being stored. By 2027, Gartner forecasts over 70% of enterprises will use industry cloud platforms to drive business initiatives, up from less than 15% in 2023. In the very near future, the vast majority of corporate data will be stored and accessed on public and private cloud platforms, or more likely will be spread globally across a complex hybrid landscape. That landscape of cloud platforms will be both virtual and physical in the sense that they’ll operate out of a network of interconnected data centres dotted around the world. The physical location of data centres brings with it additional question marks over data sovereignty and security. That’s a huge part of the reason NexGen Cloud is investing $1 billion to build the European AI Supercloud. Data sovereignty is the concept that data is subject to the laws of the country in which it is located. Anyone familiar with data privacy and GDPR will know about data sovereignty since data collected on individuals in the EU must be either stored in the EU, so it is subject to European privacy laws, or within a jurisdiction that has similar levels of protection.
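As a toy illustration of the residency principle just described, the snippet below checks whether a proposed storage or transfer location is acceptable for EU personal data. The region names and the 'adequate' list are placeholders invented for the example, not legal guidance; real decisions depend on current adequacy rulings, contractual safeguards and the organisation's own counsel.

```python
# Illustrative data-residency check; region lists are hypothetical placeholders.
EU_EEA_REGIONS = {"eu-west-1", "eu-central-1", "eu-north-1"}
# Jurisdictions treated here as offering "similar levels of protection" (placeholder set).
ADEQUATE_REGIONS = {"uk-south", "ch-zurich"}


def transfer_allowed(data_subject_region: str, target_region: str) -> bool:
    """Allow EU personal data only into EU/EEA or placeholder 'adequate' regions."""
    if data_subject_region not in EU_EEA_REGIONS:
        return True  # non-EU data is out of scope for this toy check
    return target_region in EU_EEA_REGIONS | ADEQUATE_REGIONS


if __name__ == "__main__":
    print(transfer_allowed("eu-west-1", "eu-central-1"))  # True: stays within the EU/EEA
    print(transfer_allowed("eu-west-1", "us-east-1"))     # False: would need extra safeguards
```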
Collecting, storing and acting on data is one thing; keeping it safe and/or sharing it (especially across borders) is a whole new regulatory ball game. And the regulations on data privacy are far from toothless. Last year, the Irish regulator fined social media giants TikTok and Meta €345m and €1.2bn respectively. Outside Europe, meanwhile, ride-hailing firm Didi Global was fined $1.19 billion by the Cyberspace Administration of China. High-profile cases like these are the tip of the iceberg. In addition to direct financial repercussions associated with breaching regulations come indirect problems with what could arguably be much longer-term impacts, such as a loss of trust among customers and partners. As noted previously, there is a certain level of distrust regarding AI – the figures vary, but one survey in the US found 79% of respondents reported trusting businesses “not much” or “not at all” to adopt AI responsibly, while the Office for National Statistics in the UK reported that 36% of adults did not think AI could have a positive impact on their lives. It would not take too many data breaches to push those percentages higher. The critical interplay between ensuring data sovereignty and robust cloud security measures is at the heart of ensuring organisations capitalise on the advantages promised by adopting and transforming business processes with AI. Organisations face a wide variety of security threats associated with cloud sovereignty including but not limited to data jurisdiction and legal compliance, data access control issues, cross-border data transfers, vendor compliance and transparency, supply chain security risks, data encryption and decryption challenges, inadequate incident response plans, national security concerns and broader political and geopolitical risks. Data sovereignty and cloud security are deeply entwined. Organisations must navigate data governance and ensure compliance with regional regulations in order to harness the power of AI safely and ethically. They must also prioritise robust cloud security and data sovereignty when choosing partners in their AI journey. The promises of AI appear limitless, but mitigating security challenges necessitates a holistic strategy that integrates technical safeguards, adherence to legal requirements, and effective risk management practices. ### 10 Best Marketing Tools to Leverage Business Growth The use of marketing tools is imperative in this digital age to effectively promote and grow the business. They make tasks clearer, educate the audience, help optimise strategies, encourage interaction, and maximise return on investment. This list of the top 10 marketing tools provides many useful functions for marketing and the entire business environment. These tools will effectively target your prospects, promote your value proposition, and optimise your business strategies to maximise ROI and growth. Using these platforms, you can strengthen your online presence, attract your target audience, and drive sustainable success in this competitive market. Let's first understand what marketing tools are and why they are important. What Are Marketing Tools? They are the strategies, techniques, and resources businesses use to carry out their marketing activities. These tools help you determine your areas of strength and weakness and select the tactics that will work best. Marketing tools can be grouped into various categories, depending on their nature and function.
First, traditional tools are advertising, sales promotion, personal sales, and public relations. Second, non-traditional tools include advanced information technologies, psychological methods, and economic-mathematical models.

Some of the key benefits of marketing tools are:
1. Improved marketing efficiency
2. Data-driven decision-making
3. Scalable business growth

10 Best Marketing Tools That Can Escalate Your Business Growth

SEMrush
SEMrush is the software for all your SEO and SEM needs. From backlink analysis and organic traffic insights to content auditing and competitor research, this tool has got you covered. The key features and benefits include a library of high-quality images and an easy-to-follow editor with built-in AI features for content creation. Services can be obtained for $129.95 per month. Thus, SEMrush can help you rank higher with search engines, do thorough keyword research, and research what your competitors are doing so that you can plan accordingly.

Google Analytics
Google Analytics is a free web analytics service that provides vital tools for measuring website performance and visitor engagement. You can review metrics like users, bounce rate, sessions, and goal completions to understand how efficiently a website works. Some features of Google Analytics are website traffic reports, e-commerce tracking, landing page analysis, audience insights, and flow visualisation. Businesses, marketers, SEO professionals, and website owners are the most vital users of Google Analytics. Companies can leverage the power of this platform to optimise their online presence by learning how users interact with their websites. Firms can use this data to improve the end-user experience and achieve marketing goals.

Viralpep
Viralpep is an all-around social media management tool. It simplifies scheduling and posting content, whether on Instagram, Facebook, Twitter, or any other platform, helping you reach more users. In addition, this tool has a rich collection of stunning images, an AI-powered editor, and three pricing plans that start at $29 per month. With Viralpep, you can write and post content, create visually appealing social media campaigns, and automate repetitive tasks so that you can focus on connecting with your followers.

HubSpot
HubSpot is a package deal for inbound marketing and sales platforms. It consists of tools for email marketing, content management, lead scoring, and customer relationship management, and it is designed to facilitate digital marketing activities with existing and potential customers. HubSpot offers a number of tools to run marketing campaigns, manage content, and increase customer engagement. Its pricing plans start at $800 per month. Marketers who want to attract and engage their audience, sales teams who value their leads, and entrepreneurs who strive to satisfy customers opt for HubSpot.

Semrush Local
Semrush Local is a popular tool to optimise local search and maximise brand visibility domestically. The top features here are local SEO audits, online reputation management, and competitor analysis. This tool gives you a clear picture of how your local search campaigns are performing while also providing actionable tips to bolster your online presence. Its pricing starts at $129.95 per month. Businesses looking for a way to increase local search visibility and connect better with their target audience benefit the most from Semrush Local.
GetResponse
GetResponse is an easy-to-use marketing tool that has features like email marketing, landing pages, webinars, and automation. You can create and send emails, set up landing pages, and host webinars using this tool. The subscription plan begins at $13.30 per month. GetResponse is the best tool for anyone wanting to upgrade their e-marketing and automation processes. Marketers, sales teams, and website owners can use this tool to capture or manage their email lists.

ProofHub
ProofHub is a proven project management and team collaboration software that gathers critical marketing tools under one roof. You can easily plan, organise, collaborate, and deliver marketing projects on time with ProofHub. Simple social media planning, all-in-one scheduling, monitoring, analytics, and two flat pricing plans are the major features. You can trust ProofHub to take control of your marketing projects, collaborate without complications, and monitor how everything is performing.

BuzzSumo
BuzzSumo is one of the most popular social media analytics platforms for content marketing. You can use it to make content based on current trends. You can also identify influencers and track the performance of your social media efforts. The key benefit of BuzzSumo is the easy-to-use AI image generator. Its basic plan is available for $199 per month. It provides valuable insights to identify viral topics, assists in curating shareable content for social media strategy, and monitors social media performance.

Canva
Canva is a popular graphic design tool to create visual content. It has various templates and design elements suiting your needs to create social media images, presentations, and various other marketing materials. With its top-notch graphic design, social media post creation, and presentation design tools, you can create attention-grabbing content that targets your audience better. Canva's subscription plans start at $12 per month. It can be used by businesses and individuals who want to improve their visual content and audience engagement.

Mailchimp
Mailchimp is a US-based marketing automation platform as well as an email newsletter service provider. It has a wide range of plans with email deliveries and analytics. It offers simple yet effective campaign development, customer targeting, and detailed reports for analysis and growth. It also has pre-designed email templates that are mobile-friendly and customisable. It can integrate with other apps and offers tutorials to users on how to use them effectively. Mailchimp is perfect for businesses, marketers, and website owners who want to optimise their online presence, understand user intent, and make data-driven decisions to boost the user experience and achieve marketing goals.

Conclusion

The top 10 marketing tools listed above provide features to help businesses thrive. You have a chance to make your marketing campaign simple, communicate effectively with your target audience, and, in turn, create sustainable growth opportunities for your organisation. Undoubtedly, you can improve brand awareness, customer satisfaction, and the survival of your business via these excellent marketing platforms. ### Three key approaches to safeguarding modern application security More than a decade ago, Marc Andreessen famously declared that "software is eating the world."
The Silicon Valley venture capitalist’s comments came as he looked back at the creative disruption caused by the 1990s dot-com bubble and the “dozen or so new Internet companies like Facebook and Twitter sparking controversy in Silicon Valley.” But he also observed in 2011 that an increasing number of major businesses and industries were being run on software and delivered as online services — in effect, “overturning established industry structures.” “Over the next ten years, I expect many more industries to be disrupted by software, with new world-beating Silicon Valley companies doing the disruption in more cases than not,” he wrote. Fast forward to today, and Andreessen’s comments still ring true. But in an interesting update, McKinsey has suggested that the slogan — ”software is eating the world” — should be reworded to “software is the world.” The linguistic tweak leaned on findings from McKinsey’s research, which showed that almost seven in ten top economic performers used their own software to differentiate themselves from their competitors. Of course, the software being discussed is far from the monolithic, all-in-one solutions hosted on-premises. Instead, today’s forward-thinking organisations prefer flexible, user-friendly applications that are scalable and can be rolled out at speed. It’s an approach followed almost universally today since it removes many obstacles to digital transformation that might otherwise prevent employees from being more innovative, efficient, and productive. But it has its problems. The challenges of securing modern applications Today, applications tend to be based on multiple microservices loosely coupled together to create a more modern architectural and organisational approach to software engineering. Typically, these are decentralised across multiple platforms. It’s a tactic favoured because it enables businesses and organisations to deliver large, complex applications quickly. The snag is that this way of working can make it difficult for users to visualise and understand the entire application. Worse, it can lead to increased security exposure. Unless each service is properly isolated and protected, cyber criminals may be able to gain access to the entire application via a single insecure microservice. And if these security incidents can’t be seen, it would be almost impossible to identify the threat — let alone respond to or prevent it. Then there are the challenges brought by open-source. In nearly every modern application, open-source code can be a valuable resource for developers. However, researchers at application security company Synopsys found at least one vulnerability in 84% of commercial code bases. And one vulnerability is all that an attacker needs to do untold damage. The good news is that while modern applications' security challenges are real, they are not insurmountable. Instead of returning to monolithic legacy software, organisations can embrace modern application development while ensuring security by taking three steps: 1) Increase visibility into complex IT environments through observability It doesn’t matter whether it’s a fault in a car engine, a laptop, or a toaster — you cannot fix a fault until you can identify the problem. That’s why observability is such an important tool. Observability solutions provide real-time visibility across an entire IT estate, which is essential for the secure development of modern applications.
Having a clear view of an entire infrastructure enables IT teams to quickly identify and resolve security issues before they develop into significant problems. 2) Build in security with a ‘shift left’ approach to testing There’s a general rule of thumb that you can’t really test anything until it’s been built or created. Want to see if a kite will fly? Build it, wait for the wind to blow, and then give it a whirl. If it fails to take off — or comes crashing to the ground — it’s probably worth returning to the drawing board. It’s an approach to testing that is almost universal — and that includes software. But what if that testing could occur earlier to identify issues even before they arise? What if you could identify security issues during — not after — the build phase? In effect, that’s the principle behind ‘shift left.’ Traditionally, security checks occur in the ‘testing phase’ after the software has been written and pulled together. However, what if an issue has already been programmed into a device early in its development? If that’s the case, the DevOps team will have to work retroactively and unpick work already done. This can slow down the process and inhibit a thorough application review. Implementing the ‘shift left’ approach improves the development process by embedding security measures sooner. This enables DevOps teams to identify vulnerabilities during development — rather than after the project is completed. In this way, DevOps teams can streamline delivery by keeping security at the forefront of their development process, enabling them to deliver safer and more reliable products. 3) Be transparent by utilising a software bill of materials Now, more than ever, the technology industry needs to be transparent — especially since cyberattacks are becoming more sophisticated and are increasingly instigated by rogue nations. That’s why a Software Bill of Materials (SBOM) is so important. It’s a structured list of all the components and dependencies that comprise a piece of software. It serves as a comprehensive inventory of the various software elements used in an application, including open-source libraries, third-party components, frameworks, modules, and other software assets. Each component in the SBOM provides a clear view into the software supply chain, helping developers identify and address vulnerabilities. For example, if a new vulnerability were to be discovered in an open-source library, an SBOM helps to pinpoint the affected applications and prompt the appropriate teams to take action. Knowing what materials are going into development also helps predict the final product’s functionality, as developers can identify points of concern from the start. The trend for modern software applications is something that should be embraced since it brings a host of benefits. And it goes without saying that it’s a vast improvement on the on-prem, monolithic, all-in-one solutions of the past. However, function-rich, easy-to-use systems should not be sought at the expense of ensuring security is built into applications. Any vulnerabilities introduced at any stage increase the risk of a security breach. However, by using observability tools, by ‘shifting left’ and testing much earlier in the software development process, and by adhering to best practices such as SBOMs, organisations can strike a balance: embracing new technology — and profiting from its benefits — while also ensuring that all-important element of security.
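To ground the SBOM idea from step three, here is a small, hypothetical sketch that cross-references SBOM component entries against a list of known-vulnerable versions and reports which applications are affected. The data structures and version numbers are invented for illustration; in practice teams would typically work with standard SBOM formats such as CycloneDX or SPDX and a real vulnerability feed.

```python
# Illustrative SBOM cross-check; component data and advisories are made up.
from dataclasses import dataclass


@dataclass(frozen=True)
class Component:
    name: str
    version: str


# Hypothetical SBOMs: application -> declared components
sboms = {
    "payments-api": [Component("openssl", "3.0.1"), Component("log4j-core", "2.14.1")],
    "customer-portal": [Component("openssl", "3.0.12"), Component("express", "4.19.2")],
}

# Hypothetical advisory feed: component versions with known vulnerabilities
advisories = {Component("log4j-core", "2.14.1"), Component("openssl", "3.0.1")}


def affected_apps(sboms: dict, advisories: set) -> dict:
    """Return {app: [vulnerable components]} so the right teams can be prompted to act."""
    hits = {}
    for app, components in sboms.items():
        found = [c for c in components if c in advisories]
        if found:
            hits[app] = found
    return hits


if __name__ == "__main__":
    for app, comps in affected_apps(sboms, advisories).items():
        details = ", ".join(f"{c.name} {c.version}" for c in comps)
        print(f"{app}: update needed for {details}")
```

This is exactly the lookup an SBOM makes cheap: when a new advisory lands, the affected applications can be listed in seconds rather than reconstructed by hand.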
### AI Show – Episode 5 – Matt Rebeiro Navigating the Diverse Applications of AI in Marketing The 5th episode of The AI Show is a must-watch for anyone interested in the innovative world of AI in marketing. Host Emily Barrett sits down with Matt Rebeiro from Iris to delve into the multifaceted applications of artificial intelligence in this ever-evolving field. Their insightful conversation explores how AI is revolutionising marketing operations, from streamlining processes to powering cutting-edge AI-driven products that deliver personalised, data-driven experiences to consumers. However, they emphasise the importance of maintaining transparency and adhering to regulatory compliance to uphold consumer trust. Emily and Matt tackle the challenge of striking the right balance between leveraging AI's efficiency and preserving the human touch of creativity that sets captivating marketing campaigns apart. They also discuss the sensitive matter of handling proprietary data ethically and responsibly while maximising its potential for valuable insights. With his expertise, Matt sheds light on the diverse and often hidden uses of AI in marketing, from analysing complex healthcare documents to generating fresh, innovative creative concepts. They explore how AI uncovers brand perception, a game-changer for crafting resonant marketing strategies. In a thought-provoking segment, they delve into the creation of synthetic personas – virtual representations of target audiences – to gather invaluable feedback and refine marketing approaches. Emily and Matt underscore the need for a balanced approach, seamlessly blending AI-driven insights with traditional methods for a holistic marketing strategy. ### CIF Presents TWF – Fanny Bouton For this episode of our weekly show, TWF! (Tech Wave Forum), we are talking to one of our Forum members about an emerging technology that has enormous potential, but needs some explanation to make sense of it. Our host and CIF CEO David Terrar will be interviewing Fanny Bouton, OVHcloud's Quantum Lead. She'll explain what Quantum Computing is, how it can apply to almost any sector, and how it can calculate answers beyond the capability of classical high-performance computers. Please subscribe to the event for a fascinating and exciting session. ### CIF Presents TWF – Nick Powell In this episode of the 2024 season of our weekly show, TWF! (Tech Wave Forum), instead of talking about technology, we'll be exploring the people factors, performance coaching, and leadership needed to get it right. Our host and CIF CEO David Terrar will be interviewing Nick Powell, who will explain biohacking and his book Limitless, and we'll discuss how his experience as a UK MMA champion and Judo black belt influences his approach and thinking. Please subscribe to the event for a fascinating session. ### The importance of channel partners in driving forward Zero Trust Once dismissed as a ‘buzzword’, Zero Trust has seen a positive shift in perception over the last few years, and as such, 87% of global security leaders now claim that Zero Trust is openly discussed at board level. If we look back at 2022, almost half of security leaders (44%) described the model as requiring too much oversight, and yet only one-third (33%) shared the same opinion when surveyed last year.
IT and security leaders across EMEA are therefore clearly considering implementing Zero Trust architecture now more than ever, but the real issue appears to be adoption, as very few organisations claim to have achieved it. As hybrid cloud environments continue to grow in complexity, there is a clear need for more structured guidance on implementing Zero Trust. While enthusiasm is growing for the power of the framework in building out a layered approach to cyber defence, confusion around the ‘how’ of Zero Trust remains. This creates a perfect opportunity for channel partners and vendors to step in as trusted advisors, supporting end-users in building out their Zero Trust architecture. Fostering a Zero Trust mindset The first key consideration for channel partners is helping customers understand that Zero Trust is not a product or solution. It is a mindset of “never trust, always verify” that must be built into a business’s architecture across every part of its network. Zero Trust network architecture demands that every user, application, and packet is analysed and authenticated both at perimeter access points and at various sub-perimeters within the network. Implementing this level of verification and layered security requires a curated combination of solutions and strategy, but it is also a fundamental culture shift. By helping foster a Zero Trust mindset, channel partners can unlock success for their customers by building awareness around the risk of a network with implicit trust and offering guidance on how to better implement tools and processes to verify any potential threats. It can also help channel partners to differentiate themselves ahead of the competition - the IT channel can become truly valuable in providing the expertise and technical knowledge that end-users need to navigate the complex and necessary journey towards a Zero Trust mindset. With 97% of global IT leaders now accepting Zero Trust as a journey rather than a tick-box exercise, the time has never been better for channel partners to become Zero Trust allies. Deep observability is key The threat landscape is expanding, and ransomware continues to be one of the most dangerous risks to businesses across the globe. Just looking at the UK, the country is at high risk of a catastrophic ransomware attack and lawmakers are openly criticising the Government's strategy for cybersecurity. Therefore, even if they are not experts in this field, given the evolving landscape, it is important for channel partners to understand the role that deep observability plays in enabling Zero Trust and a more secure posture. A robust Zero Trust Network Architecture hinges on deep observability. Without complete visibility into users, assets, devices, and hybrid cloud infrastructure, monitoring and enforcing granular access controls becomes incredibly difficult – you simply cannot verify what you cannot see. Organisations seeking to implement Zero Trust must, therefore, gain a clear and comprehensive view of all network activity. In the current hybrid cloud landscape, traditional monitoring tools are not sufficient to provide a real view into all network activity. Real-time network-level intelligence is therefore key to empowering metric, event, log and trace-based monitoring, and observability tools with all the necessary data to properly mitigate threats, without complexity.
This architecture proactively prevents bad actors from infiltrating networks and exploiting sensitive data, and it is up to the channel to educate their end-users on the benefits of adopting this comprehensive approach to visibility as a starting block for their Zero Trust journeys. More than one benefit to Zero Trust While security is clearly a massive concern for many companies, it will not be their sole focus. It is therefore important that channel partners demonstrate exactly how Zero Trust may help their customers in more ways than one. With a lack of resources and the digital skills gap still very much pressing issues, this architecture can assist in increasing productivity within a workforce, as well as allowing the wider business to better cope with the current climate and become a more agile organisation. With potentially sensitive data travelling between corporate networks and personal devices connected to home broadband routers, a framework that increases security – reducing the threat of breaches from internal and external sources – without impacting productivity is a no-brainer. As long as channel partners demonstrate the various benefits of Zero Trust, it will be difficult for their clients to ignore its potential. Zero Trust for the future Whether or not UK organisations perceive Zero Trust as immediately attainable, today’s landscape means it is rapidly becoming a business priority. But the thought of Zero Trust as a single purchasable solution has not entirely subsided. Unlike quick-fix solutions, Zero Trust demands a comprehensive and continuous approach to better cybersecurity and business agility. Channel partners play a pivotal role in guiding their customers through this journey and it is therefore critical that they seize the opportunity to enable Zero Trust success. ### What observability can teach us about corporate culture Observability is big business. As an IT tool, it seamlessly integrates and encompasses visibility across datasets, networks and user experiences, pinpointing the root causes of outages or performance issues while delivering tailored insights. With a market projected to be worth over $2 billion by 2026, observability can deliver tangible benefits in terms of its impact on both digital ecosystems and the bottom line. But while these tools tend to be the sole preserve of IT departments, the foundational elements upon which they are based do have a place beyond the digital ecosystem. In fact, I would argue that transparency, collaboration and continuous improvement — the three pillars that support this methodology — can also be used across any organisation to deliver broader cultural change. Let me explain. Though company culture may often seem intangible, the repercussions of a weak or non-existent culture are far from inconsequential. Studies have shown that if the culture of an organisation deteriorates, employees may start to look for new opportunities elsewhere. In other words, company culture matters. This is why it’s so important to be authentic and specific when setting the values at the heart of that culture. For instance, in 2021 we introduced Secure by Design as a guiding principle for how we approach security and cyber resiliency at SolarWinds. But this approach isn’t just restricted to security. We now use it as our approach to everything — from infrastructure and software development to people. For us, it’s important to establish that our culture doesn’t just focus on what we do — which is providing leading IT solutions.
Instead, we wanted to focus on being the trusted global IT leader we aspire to become. So how do the three guiding principles of observability make lasting changes to corporate culture? Transparency Observability is, essentially, all about transparency. By providing a single pane of glass through which you can see your entire IT environment, observability helps organisations address and prevent issues before they arise. The same can be done with a company’s culture by promoting open communication while giving people a greater insight into the work and priorities of other departments. It also helps if organisations can be purposeful and direct about the company’s values and goals. What’s more, it costs nothing to be transparent. And yet the ROI can be massive. A company culture underpinned by transparency is proven to deliver high-value benefits including establishing trust and boosting employee engagement. On the other hand, without transparency, even the most thoughtfully planned initiatives can fall at the first hurdle. Corporate transparency is a tonic for enhancing employee morale and retention, while at the same time helping to reduce stress levels. And since all of this helps to improve productivity, it can have a direct impact on your business’s bottom line. Collaboration Observability breaks down silos and makes it easier to collaborate seamlessly across different clouds, databases, and dashboards. By taking a similar approach, business leaders can do the same for their teams by fostering greater collaboration across the entire organisation. With a culture underpinned by collaboration, employees won’t just learn how to get along – they’ll understand why each cog in your machine functions the way it does. They’ll also understand how their work impacts colleagues, the end product, and the business as a whole. Collaboration doesn’t need to apply only to interpersonal or interdepartmental work. For instance, learning to collaborate with machine learning tools is one way to keep employees engaged and productive. Far from replacing your employees, solutions such as AIOps can support employees by delivering insights, automating mundane tasks, and reducing the risk of human error – all while freeing them up for the creative, innovative and forward-looking parts of their jobs. Continuous improvement It’s worth keeping in mind that being transparent doesn’t just apply to the things you’re doing well. Being afraid of change has little place in today’s business landscape, where agility and flexibility are increasingly table stakes. If your organisation is competitive and has longevity, it stands to reason that the challenges it faces tomorrow may not be the ones it met yesterday. As the company grows, evolves, and shifts, make sure that the culture does as well. Otherwise, you may look around one day and realise that the business and its culture have fallen completely out of sync with one another. And nobody – least of all your teams – wants to start from square one. Unlike technological observability solutions, no tool can monitor for cultural issues at the root cause and deliver tailored insights for remediation. So how do you find out what improvements need to be made? Just ask. Speak to people. Canvass their opinion. With a diverse, distributed, and distanced workforce, the best practice for building company culture is to include your people in the process. But it’s not just a case of giving people a voice. You need to listen to what they have to say.
And, where appropriate, you need to act. Let’s be honest, the most valuable capital of any organisation is its people. Making sure this ecosystem is healthy and fully functioning is critical to the success of any organisation. By adopting the principles of transparency, collaboration, and continuous improvement, you can instil a methodology that helps drive positive change. Not only will you gain more than just a clear view across your organisation – you’ll gain a clearer understanding of the road to success ahead. ### Three tips for managing complex Cloud architectures Moving to the Cloud is a strategic choice many organisations make to reduce data costs, accelerate their time to market, mitigate risk, foster innovation and achieve scalability. Whilst its benefits are manifold, some businesses are grappling with the increased complexity associated with migrating to the Cloud, a result of the rapid uptake in Cloud adoption. NetApp reports that 98% of businesses have been impacted by the increasing complexity of data across the Cloud or on-premise solutions. Put simply, Cloud architecture refers to the design and organisation of the components and subcomponents needed for cloud computing, including front-end platforms, back-end platforms, cloud-based delivery, and a network. It serves as a blueprint for managing and deploying cloud services and resources efficiently, ensuring scalability, reliability, and security. Define your data strategy and ensure personnel competencies Working together as a team to develop an overall data and Cloud strategy is the most important element in achieving business objectives. Doing so ensures that a predetermined, specified level of competency is set as a prerequisite when employing staff members who will assist with or manage complex Cloud-based tasks and architectures. Overall data management within complex Cloud architectures is of increased importance. By ensuring there is a strong yet adaptable strategy in place, data managers can work together easily to solve any number of issues in short periods of time - especially when these issues become critical to system access. Furthermore, having a clearly defined data strategy helps an organisation's IT team understand the relevance of the data they’re processing. Data can become a mountain of information in a short period, but where it becomes gold is in its analysis, so a disorganised system means an organisation misses out on important insights. A lack of understanding of the impact and usefulness of data signals a malnourished and underdeveloped strategy, making it more difficult for team members to develop consistent approaches and reduce external threats. Without a strong strategy, there are also issues in matching data to modern data stacks. Using legacy data presets for data sequences requires a high level of oversight, conversion and often human intervention to ensure there is no data loss in the migration process. This takes time and expert knowledge – something many organisations struggle to find – and can be costly. Cloud-based services require constant and significant oversight, management, and organisation of cybersecurity to run and operate safely and smoothly. While most of this is done via a third-party host, an enterprise must consider what part its IT team will play in this process, if any. Dedicated members of the cybersecurity and IT team can be employed to undertake primary Cloud management and act as a troubleshooter for remote employees.
Build toward best practice and automated governance Rapid Cloud migration can cause Cloud complexity, and as a direct result, further issues fester as the business begins to expand. By understanding and developing failure-management solutions early, IT teams can safeguard against issues and account for any additional challenges down the line. By doing so, organisations can ensure their costs are kept to a minimum when failure inevitably occurs. Migrating data exclusively to Cloud solutions opens an organisation up to more sophisticated, frequent, and costly cyberattacks. Developing a best practice model for employees operating Cloud architectures reduces the number of risks incurred, generates process consistency, and helps to trace systemic or human errors. Many organisations might feel it is important to introduce a zero-trust model when undertaking Cloud management. When systems don’t have adequate resilience and/or disaster recovery/business continuity solutions in place, a level of chaos can ensue. Google, one of the largest third-party Cloud providers, proved integral to many institutions during the COVID-19 pandemic, and to this day abrupt disruptions in system operations can leave many organisations unable to operate. When Gmail, Google Drive, Docs, Meet, and Voice had a global outage for 6 hours in August 2020, organisations were left unable to operate for the remainder of the day. Outages and failures are rare, but focusing on backup plans, best practices and automated governance can ensure a business provides consistency for employees and customers alike. Continuous learning is critical to building, improving, and automating the future Data is adaptive and can be in frequent states of change – whether that is by initiating new data sets, adding data categories, or accounting for automation – so the team managing complex Cloud architectures should consider the importance of simple and future-proof strategies. Cloud architectures become complex partly because of the length of time they are employed. As an organisation grows and develops across industries and managed services, business operations change, as does the size of its user base, so new services or higher storage plans must be employed for organisations to keep pace with this increase in demand. Legacy data sets and architectures must be archived correctly or adapted to fit into the predeveloped data strategy going forward. Data across the architecture can be scattered and of varying degrees of quality and security, meaning it can be costly to maintain. With the help of AI, automation can play a large role in addressing system complexities. By examining vulnerabilities across data, services, workloads, and platforms, businesses can use AI to employ better management and process tasks quickly and efficiently. IT managers can also reduce the scope of manual effort needed for data input and repetitive tasks. Scaling, storage, sizing, and configuration are tasks that can all be automated for IT managers, meaning risk is mitigated significantly. By doing so, IT teams can streamline workflows, and more accurately spot, diagnose, and treat any issues within critical Cloud architectures in line with pre-set governance frameworks. When manually undertaking Cloud architecture-based tasks, an IT manager will encounter errors.
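As a simple illustration of the scaling automation mentioned above, here is a minimal Python sketch of a threshold-based decision applied within governed bounds. The metric, thresholds, and instance limits are invented for the example rather than taken from any specific platform.

```python
# Illustrative only: a threshold-based scaling decision of the kind an
# automation or governance framework might enforce instead of manual changes.

def desired_instance_count(current: int, avg_cpu_percent: float,
                           min_instances: int = 2, max_instances: int = 20) -> int:
    """Scale out above 75% average CPU, scale in below 30%, within governed bounds."""
    if avg_cpu_percent > 75:
        target = current + 1
    elif avg_cpu_percent < 30:
        target = current - 1
    else:
        target = current
    # Governance guard-rails: never fall below the resilience minimum or exceed the cost ceiling.
    return max(min_instances, min(max_instances, target))

print(desired_instance_count(current=4, avg_cpu_percent=82))  # -> 5
print(desired_instance_count(current=2, avg_cpu_percent=12))  # -> 2 (floor applied)
```

Encoding the rule once, with explicit upper and lower bounds, is what removes the repetitive manual intervention described in the next paragraph.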
Due to their repetitive nature, monotonous tasks like managing performance, VM clusters, load balancing, deployment, and monitoring virtual networks can all lead to increased human error, so automation-led strategies reduce overall human error and further relieve pressure on personnel workloads. Unlocking potential Responsible for hosting data, a third-party Cloud provider maintains the physical servers and ensures minimal data loss. With data management as its sole purpose, using Cloud services allows enterprises to outsource the complexities of on-site server management to a dedicated and properly equipped team. Cloud usage among all companies is now high; 92% of medium-sized businesses, in one form or another, use Cloud architectures and 85% of larger corporations do as well - a number that directly correlates to the rise in remote working during and after the COVID-19 pandemic. Businesses are faced with the challenge of legacy servers, databases and other platforms that continue to exist when workloads are moved to the Cloud. Whilst they are used less frequently, these endpoints must still be managed and decommissioned, which means that companies’ architectures will inevitably become more complex. It’s important to consider how to leverage technology to reduce the impact of this complexity to truly unlock the Cloud’s potential. ### Demystifying AI Image Copyright Stable Diffusion and Legal Confusion: Demystifying AI Image Copyright Think back. It’s 2011. The seemingly never-ending Harry Potter franchise has just ended, Adele’s ‘Rolling in the Deep’ is EVERYWHERE, and planking in weird places is still the most popular social media trend. Are you overloaded with nostalgia yet? Well, there’s one big thing from 2011 that we haven’t mentioned yet, but that definitely dominated the minds of many: the iconic Monkey Selfie. Now, you might be thinking: “I do remember that! But what does a decade-old monkey selfie have to do with image copyright?” Well. What you might not know is that, in 2015, PETA opted to sue wildlife photographer David John Slater on behalf of Naruto (the monkey) on the grounds of copyright infringement. What followed was a legal storm that rivalled even the most dramatic episodes of 'Law & Order.' Ultimately, Slater went on to win the case on the basis that animals can’t legally own copyright, and the image was kept in the public domain. However, just like Naruto's selfie, the world of AI-generated images has thrown us into a whirlwind of legal confusion. After all, if a monkey can’t own copyright, can AI? And given the blurring of boundaries, where do businesses and marketers stand when it comes to using AI-generated images? Let’s find out. Firstly, What’s the Buzz With AI-Generated Images? Given their rising popularity, it’s likely you’ve heard of (if not used!) at least one AI image-generation tool in recent months. And yes, that includes jumping on the social media trend involving Lensa.ai! At present, two of the most widely used tools are Stable Diffusion and DALL-E, both of which have the ability to create highly detailed and photo-realistic images from simple text prompts. And, with these tools being so readily available, it hasn’t taken businesses long to consider how using AI-generated images could streamline their activities… They’re Cost-Effective: Because AI image generation tools can create graphics with minimal human input, the cost of these images is next to nothing.
These tools eliminate the need for professional equipment and human talent, ultimately allowing companies to allocate their resources more strategically. They’re Quick to Create: With just a simple text prompt, AI image generation tools can create graphics at an astronomical rate. This rapid creation can be invaluable to businesses that need to produce a high volume of images in a very short time. It can also be a huge benefit when businesses want different variations of the same image. They’re Customisable, Scalable, and Consistent: In the same way images from graphic designers are customisable, AI-generated images are, too. Visuals can be tailored to brand guidelines with ease. Plus, as a brand grows and evolves, these images can be scaled to meet ever-changing needs with a level of consistency that’s challenging for graphic artists to achieve. They Push the Boundaries of Design: AI can produce innovative graphics that push the boundaries of traditional design. The algorithms they rely on can generate artistic concepts that may not have otherwise been explored. This can spark fresh creativity and expand a brand’s visual horizons. The Benefits Sound Great! What’s the Problem? Well, to explain this, we’re going to have to get a bit technical, so bear with us while we dust off our law books… The current copyright law in the UK is the UK Copyright, Designs and Patents Act 1988. This law covers all intellectual property rights, and it gives the creators of literary, dramatic, musical, or artistic work (plus others) the right to control how their materials can be used. And this is where things start to get a bit sticky for AI-generated images. Copyright is automatically applied when an individual or company creates a piece of work. However, in order to be protected, the work needs to be original and exhibit a degree of skill, labour, or judgement. Technically, these criteria aren’t met when AI creates new works. Furthermore, copyright can’t be claimed for any work that copies or uses aspects of an existing work. Even if you’re granted permission from the original owner to use certain elements of their creation, the elements of their work you use to create a new work are still owned by them. Because of these caveats, without the consent of the original owner, businesses and people alike are committing an offence if they: Copy the work of another. Rent, lend, or issue copies of the copyrighted work to the public. Perform, broadcast, or show the copyrighted work in public. Adapt the copyrighted work. Now, AI image generators have been trained to create images using a database of pre-existing graphics, photographs, paintings, illustrations, and text prompts. They learn how to identify which types of images match certain phrases in order to create results in line with the user query. However, and this is a biggie, these training databases often include copyrighted or licensed imagery. And that’s the problem. Now, you might be thinking… “But, Heinz Did It! Why Can’t We?” Well. Before we explain, let's give an overview of Heinz’s campaign to our less-informed friends. Heinz gave AI image generators one prompt: Ketchup. The result? Hundreds of images of tomato ketchup that looked exactly like - you guessed it - Heinz ketchup bottles. The campaign was a huge success, going viral on social media and online, and proved that even AI chooses Heinz. Clever, right? Sure. Really clever.
But, there’s a key thing that you need to consider before starting to come up with your own viral AI campaign: Heinz asked AI to create renders of a product it designed and owns. It’s this differentiation that makes it highly unlikely the condiment giant will land in legal hot water. And just to prove the point: Getty Images - one of the world’s largest image libraries - is currently suing Stability AI, creators of the AI image generation platform Stable Diffusion, for infringing on their intellectual property rights. Getty Images has been open about its support of AI and regularly allows AI companies to purchase image licences in order to use their assets in AI training databases. But, Stability AI has applied for no such licence. Therefore, Getty Images is claiming the AI company has infringed on intellectual property rights including the copyright and licensing of imagery owned and created directly by Getty Images. Although the case is very much ongoing, it’s further highlighted the need for rules surrounding copyright and AI to be put in place for the protection of all involved. So, What’s Best to Do? The truth is, the best way to avoid the potential copyright infringement lawsuits that come with AI-generated images is to avoid using them for commercial purposes, if not for everything. Alternatively, considering many image libraries, such as Getty Images, offer licence packages AI companies can purchase, it may be best to reach out to the owner of your AI image generation tool of choice. Simply ask them if the training images provided to their AI tool were free from copyright, and they should be able to give you a helpful answer. Aside from this, unless you have your own product (such as Heinz) - where infringements will be less likely - as with all AI, it’s best to approach with caution rather than diving in head first. On the off chance you aren’t in the business of having your own products, this article from Soap Media offers some great ways you might be able to scratch your commercial AI image itch, without getting in trouble. So, are you feeling demystified now? ### CIF Presents TWF – Duane Jackson In this episode of our weekly show, TWF! (Tech Wave Forum), we are talking about cloud, business software, and the startup scene with a serial entrepreneur who has founded, grown, and then sold two successful software as a service businesses. Our host and CIF CEO David Terrar will be interviewing Duane Jackson who has an inspiring back story, will give us great advice, insights, and experience that he'll relate about the cloud space from before we called it cloud, to now, and what's next. ### CIF Presents TWF – Emily Barrett In this episode of our weekly show, TWF! (Tech Wave Forum), we are talking about all things AI. Our host and CIF CEO David Terrar will be interviewing the AI lead for one of our members, Lenovo. Emily Barrett has just started hosting her own AI show with Disruptive Live, which we heartily recommend. We'll talk about her background with high performance computing, her approach as a solution architect, and her practical take on the AI explosion and how to make the most of it for your company. ### AI Show – Episode 4 – Richard Osborne On the latest captivating instalment of the AI Show, host Emily Barrett engages in a thought-provoking dialogue on ethical AI with Richard Osborne, the esteemed Chief Technology Officer at Purple Transform. 
Their insightful conversation delves into Purple Transform's pioneering application of machine vision technology in CCTV and Internet of Things (IoT) analytics, revolutionising tasks such as railway track monitoring and event detection across diverse environments. With a keen focus on the responsible integration of AI, Richard Osborne emphasises the paramount importance of human involvement in decision-making processes, even as AI continues to advance and automate certain tasks with remarkable efficiency. Addressing the pressing concerns surrounding privacy, they shed light on Purple Transform's unwavering commitment to data anonymization and stringent adherence to robust regulations like the General Data Protection Regulation (GDPR). This riveting episode illuminates the transformative potential of AI to enhance safety, optimise operations, and drive efficiency across a multitude of industries. However, it also underscores the critical importance of maintaining ethical standards and upholding human-centric decision-making as the cornerstone of technological progress. Join Emily Barrett and Richard Osborne on this insightful exploration of the intricate interplay between cutting-edge AI solutions and the ethical considerations that must guide their responsible implementation. Witness how Purple Transform is paving the way for a future where innovation and ethical conduct go hand in hand, ensuring that AI remains a force for good, enhancing our lives while preserving the essential values that define our humanity. ### Building a people-centric strategy to unlock AI’s potential Today, there is a real atmosphere of excitement for the rise of generative artificial intelligence (AI) tools, such as chatbots and image generators. Leaders across industries have a significant appetite for the technology, with a staggering 97% viewing its potential as transformative for their company and industry. Yet, this buzz does not quite align with current usage rates in many organisations. For example, less than half (44%) of organisations are using AI in their HR operations, despite the opportunities the technology offers for streamlining processes, enhancing decision-making, and improving employee experiences. Leaders clearly view AI as a strategic priority, but are less clear on the roadmap for its deployment. Considering the needs of the workforce, focusing on upskilling and promoting a culture of collaboration will be at the heart of developing a long-term strategy for AI implementation. Align ambitions with workforce expectations Many employees have reservations about AI. Some individuals may be worried about their role being automated, with research suggesting that AI could remove the need for some entry level jobs in the next 10 years. According to this study, many duties often carried out by junior employees, such as writing and administrative tasks, could be done by AI in the next decade. Faced with this worrying prospect, some people may not feel adequately equipped for the AI revolution, especially considering that only one in 10 global workers have in-demand AI-related skills like encryption, cyber security, coding, and app development skills. On the other hand, executives often see AI as a game-changer, with the potential to improve efficiencies and thus optimise resource allocation across departments. For instance, by automating administrative tasks, AI can free up skilled employees to focus on creative or technical pursuits. 
For example, almost two-thirds (63%) of those involved in coaching practices state that the collection of employee experience feedback is an effective application of AI in coaching. Therefore, thanks to AI, employees would be able to dedicate more time to analysing the feedback and implementing solutions, which would in turn improve the experience of professional development for employees more quickly. It is up to leaders to bridge their knowledge of AI’s potential with any concerns that their employees may hold. Apprehension will make widespread deployment challenging, so addressing this early and providing details on the organisation’s AI intentions will lead to more effective AI usage in the long run. This could include sharing an AI roadmap and allowing employees to ask questions and receive feedback. As part of this, leaders could detail the types of tasks that are likely to be automated, and explain any new responsibilities that the workforce may acquire as a result. By keeping communication channels open, leaders can encourage an enthusiasm around AI that will make it much easier to unlock the technology’s full potential. Build an AI-ready workforce Upskilling should be an essential part of the AI preparation process. Currently, two-thirds of British businesses are experiencing a digital skills gap in their workforce. If not addressed, this could become more pronounced as AI is introduced. Any effective learning and development programme should begin with an assessment of current skills gaps. Organisations should consider their near and long-term goals, and outline any additional capabilities that may be required to achieve these ambitions. Where possible, these skills should be taught internally, rather than hiring externally. By doing so, organisations not only reduce hiring costs, but improve the productivity of existing employees. Soft skills also play a pivotal role in the modern workplace, serving as the glue that binds teams together and facilitates effective communication and collaboration. While technical expertise is undoubtedly valuable, it is soft skills such as emotional intelligence, adaptability, and empathy that can often determine an individual's success in their role. These skills enable employees to provide effective feedback as they learn complex new technical skills, manage conflicts constructively, and foster positive relationships with colleagues, clients, and stakeholders. And most importantly, in an era marked by rapid technological advancements and evolving job roles, soft skills provide the flexibility and resilience necessary to thrive in uncertain environments. Leaders should consider a range of digital professional development tools to upskill their workforce, including coaching. When engaging with a coach, people can access a personalised approach to development that accounts for their unique needs and experiences. When paired with broad e-learning programmes, coaching encourages individuals to hone those new skills at their own pace with the guidance and support of an expert coach. This human partnership both accelerates skill-building and increases employee resilience as they become more adept with unfamiliar tools and technologies. As such, individuals can work on their personal competencies at the same time as they acquire new knowledge, which can help them learn more quickly in the future. Foster a culture of AI innovation Implementing a revolutionary technology like AI cannot be a short-term process. 
Leaders may understand the benefits that AI promises, but fully unlocking these capabilities is all about engaging with the workforce. Introducing a new technology to existing processes can have a significant impact on individuals and teams, who must be supported accordingly to ensure a successful transition. Organisations can encourage collaboration and knowledge sharing around AI by building on personalised learning and development programmes for employees at different levels. This is best achieved through cross-functional team development or additional training opportunities. When combined with regular communication on the organisation’s AI roadmap, employees are much more likely to engage with AI, thanks to a deeper understanding of its purpose and potential. Going beyond the hype, and executing a people-driven strategy, will be the winning formula for driving AI success. ### Beyond Borders: Cloud Solutions for Universal Interoperability In the journey towards transforming ways of working, businesses are looking to enhance operational efficiency, foster collaboration, and drive innovation. Interoperability stands out as the key to achieving these goals – enabling seamless interaction between diverse systems, devices, and applications. But what allows cloud computing to be such a great catalyst for creating a connected, coherent, and communicative digital ecosystem? Leveraging cloud services to achieve interoperability Interoperability remains a challenge when organisations are undergoing their digital innovation journey. It’s not just about ensuring that various technological platforms connect, but that they do so in a manner that is efficient, reliable, and scalable. Cloud computing emerges as the solution here, offering a shared, centralised platform that dismantles silos and encourages a harmonious digital environment. This shared platform facilitates the interaction of disparate systems, allowing them to communicate seamlessly across technological and geographical boundaries. The cloud provides a strategic alignment with global objectives. Through the cloud, businesses can achieve a level of connectivity that fosters greater collaboration and efficiency. What’s more, cloud computing is already being widely adopted by organisations, cementing itself as the backbone of modern business operations. In fact, according to a Flexera survey, as many as 89% of companies use a multi-cloud approach, while 80% take a hybrid approach, utilising public and private clouds. This makes it a natural partner to facilitate successful interoperability. By providing a centralised and scalable hub, cloud platforms act as the unifying force that brings together various systems, applications, and data sources. Beyond connectivity, this shared platform enables robust data integration across a myriad of applications and services. This integration is crucial for synchronising operations and facilitating real-time data exchange, ensuring that all aspects of the digital ecosystem have access to the latest information. Through the utilisation of APIs and middleware, cloud platforms create a digital dialogue between previously isolated applications, allowing them to share data and functionalities seamlessly. This capability is indispensable for organisations that depend on accurate, up-to-date data to drive decision making, optimise customer experiences, and streamline workflows.
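As a simplified sketch of that API-driven “dialogue”, the Python snippet below pulls a record from one hypothetical service and pushes a translated version to another, which is essentially what an integration or middleware layer automates at scale. The endpoints, field names, and the order-to-fulfilment scenario are placeholders invented for illustration, not any specific vendor’s API.

```python
import requests  # widely used HTTP client; the endpoints below are placeholders

def sync_order_to_fulfilment(order_id: str) -> None:
    """Fetch an order from one system and forward it to another via their APIs."""
    # 1. Read from the hypothetical source application.
    order = requests.get(
        f"https://orders.example.com/api/orders/{order_id}", timeout=10
    ).json()

    # 2. Translate between the two systems' formats (the 'middleware' step).
    payload = {
        "reference": order["id"],
        "items": order["lines"],
        "ship_to": order["delivery_address"],
    }

    # 3. Write to the hypothetical target application.
    resp = requests.post(
        "https://fulfilment.example.com/api/shipments", json=payload, timeout=10
    )
    resp.raise_for_status()

sync_order_to_fulfilment("ORD-1042")
```

The translation step in the middle is the part that cloud integration platforms take on, so that each application can keep its own data model while still sharing information in near real time.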
Real world applications and impacts The advantages of cloud-enabled interoperability are compelling, but it’s tangible, real-world applications that truly highlight its transformative power. Across various sectors, organisations leverage intelligent content solutions in the cloud to not only achieve enhanced interoperability but also reap significant benefits in efficiency, collaboration, and innovation. Healthcare organisations worldwide, such as UNC Health, are increasingly turning to intelligent content solutions in the cloud to address the complex challenge of interoperability – dramatically improving patient care as a result. By harnessing the power of the cloud, these entities can facilitate access to patient records in real time. This allows healthcare providers to make more informed decisions, reduce the risk of errors, and ensure high quality care that spans across different service providers. In the financial services sector, cloud computing is revolutionising operations by enhancing interoperability. For a large corporation, it can be used to integrate its various service offerings seamlessly. This integration facilitates real-time data sharing, enabling a unified customer experience across different service lines. Customers can then easily manage their finances through a single interface, benefiting from personalised services such as tailored investment advice. Furthermore, the cloud’s scalability allows organisations to rapidly expand into new markets, whilst maintaining compliance and ensuring data security. Additionally, the retail sector benefits immensely from cloud-based interoperability, which allows for the synchronisation of online and physical operations. This seamless integration provides customers with a unified shopping experience, whether they’re browsing online or walking through the aisles of a shop. Retailers can manage inventory more effectively, personalise customer interactions, and optimise supply chains – all of which contribute to increased customer satisfaction and loyalty. Looking ahead: the future of cloud interoperability Looking to the future, the potential for further innovation through intelligent content solutions in the cloud seems limitless, and its role in interoperability will only grow in importance. With ongoing advancements in technologies like AI, blockchain, and edge computing, the complexity of digital ecosystems will only increase. The cloud offers a scalable and flexible platform that can cope with these complexities, while also delivering more robust data security measures, enhanced interoperability standards, and greater efficiencies in information delivery. All of this will ultimately lead to improved user experience and a more sustainable technical ecosystem. Moreover, the push towards more open standards and protocols in cloud computing will further enhance interoperability, making it easier for organisations to integrate diverse technologies and platforms. This openness is crucial for fostering innovation and ensuring that businesses can leverage the best tools and technologies, regardless of their origin. A future without borders As much as the continuing technical evolution of the cloud is important, people will also play a big role in the future of interoperability. It’s more than just technology – it’s the culture and people leading organisations. Companies need to prioritise collaboration and building relationships among customers, vendors, and professionals alike.
This builds a strong foundation for the technology to be used to its maximum potential and deliver the best results. The journey towards digital innovation is still paved with challenges, but also immense opportunities. To capitalise on these opportunities requires the right technology, the right culture, and the right people. With these elements in place, the cloud will lay the groundwork for a more efficient, collaborative, and innovative digital ecosystem – enabling seamless connectivity, data integration, and real-time sharing. As organisations continue to push the boundaries of what’s possible, cloud-enabled interoperability will remain a critical enabler of their success, allowing them to break limitations and embrace a future without borders. ### The Future of Marketing: Automation vs Innovation Does AI Understand Your Brand Voice? AI is dropping jaws and blowing minds every single day. Evangelists will tell you there’s not much it can’t do. The advent of ChatGPT (as well as Gemini) and their integration into our daily lives and the online tools we use has caused a monumental shift in how we view technology and the possibilities it offers. However, the praise goes hand-in-hand with the caveats - and marketing is the perfect sandbox in which to test AI’s limitations. AI Doesn’t Understand Feelings One major shortcoming of AI is its lack of emotional intelligence. A brand is more than the sum of its products and services - it’s a living, breathing, continuously evolving entity with its own personality and voice. This is because a brand is a group of people sharing a common goal under one banner. AI sees the banner, but cannot grasp the human force driving it. Marketers understand that a brand’s voice has a level of human insight that AI intrinsically lacks. As human beings, we don’t simply relay information to one another - there’s an emotional association underpinning all aspects of our communication based on culture, upbringing, and a whole host of other factors that AI could never take into account. For example, the phrase “I’m fine” could have a million different meanings, depending on the subtext. Can AI pick up on those nuances? Or does “I’m fine” simply mean “everything is good?” It can generate the word, but it still can’t understand what it really means. Now try to explain the meaning of Just Do It. Generation Is Not Innovation The fact that AI takes existing data and renders it as relevant information to you is one of the greatest technological advances of our time. Looking for the best way to express a thought? AI will scour our collective digital landscape in order to bring you the best possible results. This can be hugely advantageous to a brand’s copywriting and communication capabilities in terms of timing, but it’s not a positive move towards creating innovative, emotive, original content. The problem is this: It’s already been said and done before. Simply taking AI’s services at face value means you’re rehashing what already exists. You’re using another brand’s voice and simply quoting the best in the business is a surefire way to keep them at the top of the food chain. What’s more, there’s something about a person behind a keyboard writing from a wealth of experience that AI simply can’t replicate. 
If you think it’s written by AI, it probably is. And, while some sectors will benefit from having their more mundane tasks and messaging automated for the best possible results, repeating someone else’s content is never going to result in impactful, insightful communication that makes a reader sit up, take notice, and go out and purchase your product. How Brands Can Utilise AI AI is here to stay and it offers a vast amount of benefits in terms of productivity and speed. Every day, there are new revelations about the capabilities of AI, and what this means in terms of the way we approach our operations. We’re all testing new ways to use AI to our advantage, and that includes (obviously) the big players. Big Companies Aren’t Sold Yet For the moment, big brands are not placing their advertising strategies in the hands of AI, and with good reason - AI is far from ready. Emotional intelligence and human innovation are still the biggest assets to a brand’s voice and that’s unlikely to change anytime soon. Somewhere between pure logic and random chaos lies the searing human truth that really sells a brand to its consumers - and that can’t be achieved artificially. AI might have experience scouring the Internet, but it has no experience as a human being. Bottom line: A soda commercial written by a machine that’s never had the opportunity to sip an ice-cold drink on a hot summer’s day just won’t be quite the same. Using AI as an Innovation Tool At its best, AI is currently on par with a great personal assistant. Need some ideas to get you going? Have at it. Just don’t entrust your entire strategy to an assistant - you’re still the expert and they have a lot to learn. Besides, anyone who thinks that marketing can be fully automated probably has a very low opinion of the purpose of marketing, or simply doesn’t understand its real power. And we all know what happens to a business that doesn’t truly value marketing as a means to achieve ROI. A tool is only as good as the person using it - we’re still in the process of figuring out the kinks and optimising the benefits. So, when it comes to AI, it’s best to remind ourselves that, while technology can be helpful, the heart of any brand lies in the very real human hearts behind it. ### AI Act - New Rules, Same Task The first law for AI was approved this month and gives manufacturers of AI applications between six months and three years to adapt to the new rules. Anyone who wants to utilise AI, especially in sensitive areas, will have to strictly control the AI data and its quality and create transparency - classic core disciplines from data management. The EU has done pioneering work and put a legal framework around what is currently the most dynamic and important branch of the data industry with the AI Act, just as it did with GDPR in April 2016, and with the Digital Operational Resilience Act (DORA) in January 2025. And many of the new tasks from the AI Act will be familiar to data protection officers and every compliance officer involved in GDPR and DORA. The law sets a definition for AI and defines four risk levels: minimal, limited, high and unacceptable. AI applications that companies want to use in aspects of healthcare, education and critical infrastructure fall into the highest permitted risk category of “high risk”. Those that fall into the “unacceptable” category will be banned, for example if considered a clear threat to the safety, livelihoods and rights of people. AI systems must, by definition, be trustworthy, secure, transparent, accurate and accountable.
Operators must carry out risk assessments, use high-quality data and document their technical and ethical decisions. They must also record how their systems are performing and inform users about the nature and purpose of their systems. In addition, AI systems should be supervised by humans to minimise risk, and to enable interventions. They must be highly robust and achieve a high level of cybersecurity. Companies now need clear guidance. They want to use the great potential of this technology – the first are already doing so – and at the same time must be prepared for the future in order to implement the upcoming details of the regulation. There are five clear recommendations on how companies can approach this without causing legal issues, without standing in the way of innovation, and while being positioned in such a way that they can fully implement the AI Act without turning the business upside down: Allow AI to act with trust: If you want to achieve this, you have to completely understand the AI data content. The only way to get there is to closely control the data and data flows into and out of AI. This close control is similar to the requirements of the GDPR for personal data. Companies should always consider this compliance when they use and develop AI themselves. If you want to use AI in a GDPR and AI Act-compliant manner, you should seek the advice of a data protection expert before introducing it. Understand the data: Much of the law focuses on reporting on the content used to train the AI, the datasets that gave it the knowledge to perform. Companies and their employees need to know exactly what data they are feeding the AI and what value this data has for the company. Some AI providers consciously transfer this decision to the data owners because they know the data best. AI must be trained responsibly, and with the right controls in place for data access by approved individuals. The question of copyright: Previous models of AI have used available internet and book crawls for training. These crawls contained protected content - one of the areas the AI Act seeks to clean up. If companies have already used controlled records without accurately labelling them, they will have to start over. Understanding the contents of the data: This is an essential task. In order for data owners to make correct decisions, the value and content of the data must be clear. In everyday life, this task is gigantic and most companies have accumulated mountains of information that they know nothing about. AI and machine learning can help massively in this area and alleviate one of the most complex problems by automatically indexing and classifying companies' data according to their own Relevant Record strategy. Predefined filters immediately fish compliance-relevant data, such as personal data like credit card details, and business-specific data like mortgage records, architectural blueprints, seismic surveys, etc., from the data pond and mark them. Security principles can also be added during the AI's analysis to identify insecure data, threat patterns, and more. Once unleashed on the data, this AI develops a company-specific language, a company dialect. And the longer it works and the more company data it examines, the more accurate its results become.
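As a rough illustration of what such predefined filters might look like, the Python sketch below tags records containing patterns that resemble payment card numbers, email addresses, or UK postcodes. The patterns are deliberately simplistic and invented for this example; a production classification engine would use far richer rules, validation, and the ML-driven context described above.

```python
import re

# Deliberately simple illustrative filters; real classifiers use far richer
# patterns, validation (e.g. Luhn checks on card numbers) and ML-based context.
FILTERS = {
    "payment_card": re.compile(r"\b(?:\d[ -]?){13,16}\b"),
    "email_address": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.]+\b"),
    "uk_postcode": re.compile(r"\b[A-Z]{1,2}\d[A-Z\d]?\s*\d[A-Z]{2}\b", re.I),
}

def classify(text: str) -> list[str]:
    """Return the compliance-relevant categories detected in a piece of text."""
    return [name for name, pattern in FILTERS.items() if pattern.search(text)]

sample = "Invoice for jane.doe@example.com, card 4111 1111 1111 1111, deliver to SW1A 1AA"
print(classify(sample))  # ['payment_card', 'email_address', 'uk_postcode']
```

Once records carry tags like these, downstream policies can be applied automatically, which is exactly the point the next paragraph makes about enforcing rules without the data owner having to intervene.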
The charm of this AI-driven classification is particularly evident when new specifications have to be adhered to. Whatever new requirements the AI Act brings up in the longer term, ML and AI-driven classification will be able to search for these additional attributes and provide the company with some future security. Control data flows: If the data is indexed and classified with the correct characteristics, the underlying data management platform can automatically enforce rules without the data owner having to intervene. This vastly reduces the chance of human error and the associated risks. A company could enforce that certain data, such as intellectual property or financial data, may never be passed on to other storage locations or external AI modules. Modern data management platforms control access to this data by automatically encrypting it and requiring users to authenticate themselves using access controls and multi-factor authentication. They can also direct the most valuable data to an air-gapped vault to ensure continuity of brand and business. The AI Act is also getting teeth The AI Act has another similarity to GDPR and DORA. Once enacted, sanctions for non-compliance will be enforced. Anyone who violates the AI Act must expect penalties of up to €35 million or 7 per cent of global turnover. The supervisory authorities have imposed fines totalling 4.5 billion euros between the GDPR coming into force and February 2024. When DORA arrives in January 2025, it brings with it potential penalties of 1% of average daily worldwide turnover for the previous year, applied daily for up to six months until compliance is achieved, and the potential for criminal sanctions, alongside existing regulatory penalties. The AI Act is likely to be ratified this summer and will come into force 20 days after publication in the EU's Official Journal. Most of its provisions apply after 24 months. The rules for banned AI systems apply after six months, the rules for General Purpose AI after twelve months and the rules for high-risk AI systems after 36 months. The EU has taken a big step with the AI law and underlines the seriousness with which it wants to balance the great potential of AI and the potential risk it brings with it. Responsible AI, coupled with full AI governance, is now the only true and right way to bring AI into your business. The major practical applications of AI are now subject to detailed compliance requirements. Anyone who sets out early will have to examine their approach and the data used in detail to ensure that they do not violate the regulations. Companies need to act now and get their AI house in order. ### Time to Ditch Traditional Tools for Cloud Security Reliance on cloud technologies has significantly expanded the attack surface, exposing organisations to increasing cyber risks. Recent research from Illumio found that nearly half of all data breaches now originate in the cloud, highlighting the critical gap in the security measures organisations often rely on when it comes to securing cloud-based resources. While storing more sensitive data and running more crucial applications in the cloud naturally means an increased risk of breaches, when prepared for accordingly, it doesn’t necessarily mean more critical business assets lost. With the cloud being an inescapable necessity, this much is clear: cloud risk must be equated to business risk. And it must be accounted for accordingly.
Because not only do cloud breaches bring with them harsh financial losses for organisations, as cloud-based attacks increase in frequency and severity, they also represent severe reputational damage and trust erosion. The need for a robust, dynamic security strategy is evident now more than ever. And Zero Trust Segmentation (ZTS), rooted in the Zero Trust principle of "never trust, always verify," is emerging as a key solution. It offers a granular, adaptable approach to security, aligning with the cloud's diverse and distributed nature, offering a promising path forward for organisations looking to secure and get the most ROI out of endangered cloud operations. What are the leading concerns in cloud security today? To get a sense of the current state of play in cloud security, Illumio commissioned Vanson Bourne to survey 1,600 IT security decision-makers from organisations around the world. Our research revealed a concerning trend: 46 per cent of breaches reported in the US originated in the cloud, with 61 per cent of affected organisations losing over $500,000 annually to cloud breaches. The frequency of breaches and the size of that price tag underscores the financial and operational risks inherent in inadequate cloud security. However, as steep as the cost of a cloud breach can be, there are many other far-reaching impacts. The top consequences of cloud breaches, as reported by affected companies, include reputational damage, sensitive data loss, and a decrease in productivity. And organisations recognise that these issues can persist long after the breach itself is resolved and operations restored. Many organisations have come to realise that reputational damage can be more harmful (and longer lasting) than the immediate financial impact of a breach. This sentiment particularly holds true among UK companies, where only 29 per cent listed the loss of revenue generation as a primary issue, against a global average of 35 per cent. Instead, nearly half listed reputational damage as their main cloud breach concern. While IT and security leaders are increasingly worried about the risk of a cloud breach, a growing number are also realising that their mainstay traditional security tools are no longer up to the task. Why are traditional security tools falling short? Many companies complete their initial cloud migration by wholesale copying their digital assets from their on-premises hardware, including the same security processes and tools. While faster and easier to budget, this approach is a security breach waiting to happen. The dynamic and interconnected nature of the cloud is a very different proposition to a more static and controllable in-house server, and it’s evident that traditional tools and processes can’t keep up. Despite 55 per cent of IT decision-makers claiming a thorough understanding of their organisation's cloud security risks, 61 per cent acknowledge that their current security measures are inadequate at addressing these risks, leaving their business severely exposed. Respondents almost universally told us they needed better visibility into connectivity between resources and faster reaction times to cloud breaches. At present, less than half had full visibility into the connectivity of their cloud services, making the interplay between cloud and on-premises environments a chief concern. Notably, less than a quarter of respondents consider themselves highly confident in their ability to stop breaches from spreading across hybrid and multi-cloud environments. 
This poses a big problem, as weak access controls can enable attackers to rapidly move through an environment and access critical data and systems – turning a minor breach into a business catastrophe if left uncontained and unchecked. These issues make it clear organisations must prioritise moving beyond traditional detection-focused security methods and adopt more proactive, real-time solutions to safeguard cloud environments effectively. The necessity of Zero Trust Segmentation in cloud security Zero Trust Segmentation (ZTS) is one of the critical solutions in addressing the gaps posed by traditional cloud security solutions. Almost all of the respondents in our research believe ZTS can significantly enhance cloud security. ZTS is a core technological component of the Zero Trust framework, which advocates for a “never trust, always verify” approach to cybersecurity. ZTS offers a consistent method for segmentation or containment across various environments, including across cloud, endpoints, and data centres. Communications and access controls can be as granular as the organisation requires, even down to the cloud resource level – while remaining easy to implement and enforce automatically. Effective segmentation, when implemented properly, has the effect of locking down IT infrastructure against attackers. Faced with strict access controls, intruders and malware are trapped at the initial breach point, greatly reducing the ‘blast radius’ of the attack and enabling affected organisations to continue business-critical operations even while under an active attack. How ZTS is delivering enhanced cloud resilience Segmentation has become an essential strategy for a secure cloud, with 93 per cent of IT security leaders believing it to be critical in securing cloud-based projects. In fact, 100 per cent of the businesses in our study that have implemented ZTS or another form of microsegmentation have seen improvements to their security capabilities. Implementing ZTS not only enhances an organisation's security posture but also contributes greatly to building digital resilience and ensuring business continuity. With more businesses realising that reputation and confidence are the real stakes in a security incident, ZTS offers the highest chance of maintaining customer trust and ensuring organisational integrity even in the inevitable event of a breach. The evolving landscape of cloud security demands a strategic shift towards more resilient and dynamic solutions. And organisations adopting ZTS are proven to significantly mitigate the many risks they face in the cloud, safeguard sensitive data, and maintain operational integrity. Embracing proactive measures like these is essential for any company relying on the cloud today. But more than a security strategy, it's a commitment to maintaining a secure, resilient, and trusted digital presence. ### AI Show – Episode 3 – Guy Murphy In this third episode of The AI Show, host Emily Barrett sits down with Guy Murphy, the co-founder of OSSA, to discuss the revolutionary impact AI is having on advertising for small businesses. They dive into the common challenges small firms face with traditional marketing methods - tight budgets, lack of expertise, and an inability to effectively compete with larger competitors. But OSSA's innovative AI platform is a game-changer, simplifying campaign creation and levelling the playing field.
Guy enthusiastically explains how OSSA's AI technology democratises marketing by enabling even the smallest businesses to craft targeted, effective advertising strategies with ease. No more being overshadowed by big brands! The thought-provoking conversation also touches on AI's potential impact on advertising jobs in the future, as well as the critical importance of data security as more processes become digitised. ### 6 Ways Businesses Can Boost Their Cloud Security Resilience The rise in cloud-based cyberattacks continues to climb as hackers pursue vulnerable businesses transitioning away from on-premises infrastructure. Although decision-makers may have believed the cloud was a more secure space to house valuable and sensitive digital assets, the statistical data appears to prove otherwise. Phishing Schemes: 58 per cent of incidents involved the cloud. Ransomware or Malware: 19 per cent of attacks involve the cloud. Infrastructure Attacks: 30 per cent of incidents were cloud-based. Supply Chain Compromises: 17 per cent of organisations reported a cloud incident. Data theft incidents in the cloud were almost equal to the number that occurred in on-premises networks in 2023. These statistics represent real-life catastrophes that harmed businesses of all types and sizes. They also highlight the fact that hackers will follow companies anywhere in hopes of exploiting vulnerabilities for ill-gotten gain. The question industry leaders need to answer is: How can my organisation improve its cloud security resiliency? Ways to Improve Cybersecurity Resilience Cybersecurity resiliency involves a two-fold approach to protecting vital data in the cloud. On the one hand, possessing capabilities that can deter, detect, and expel threat actors is mission-critical. But as the statistics demonstrate, hackers continue to devise new and more nefarious techniques to penetrate business networks. It may be a hard pill to swallow, but no operation is genuinely immune from a data breach or malware attack. That’s why security resiliency must also include a way to right the ship after getting digitally blindsided. These are ways your organisation can harden its attack surface and become more self-reliant. 1: Use a Private Cloud The vast majority of companies use public clouds because they are viewed as cost-effective. These shared spaces house data from hundreds, if not thousands, of entities. Although they are considered adequately safe, shifting to a private cloud offers complete control over data storage and infrastructure. Rather than relying on the cloud provider to deal with security measures, you can onboard a third-party cybersecurity expert to ensure the organisation’s personal protection. Does a private cloud cost more? The short answer is: Yes. Is it worth the investment? That’s something for you to decide in terms of budgetary flexibility, ability to tolerate risk, and a cybersecurity consultant’s recommendation. 2: Embrace Zero Trust Policies One of the perks of leveraging the cloud involves seamless collaboration. Users from anywhere can log onto the platform and work together to drive company goals. But this facet also cracks the window for insider threats and human error. It’s essential to integrate zero trust measures into every legitimate user profile, bar none. Anytime an employee or independent contractor requires access to items outside their base needs, they must get approval, often from a high-level role like a CISO or vCISO. 
Limiting access protects trade secrets, personal identity information, financial records, and the system at large.

3: Multi-Factor Authentication (MFA)

This simple security measure has been roundly effective at stopping hackers from breaching cloud-based networks. One of the cybercriminals’ more popular ways to steal valuable data involves learning an employee’s username and password. Too many organisations use corporate email as the username, leaving hackers to guess sometimes weak passwords. Fortunately, MFA changes the dynamic. Any time a legitimate user tries to access the network, a code is sent to a secondary device. Hackers would need to possess the cellphone or device to learn the code. Brilliant in its simplicity, MFA continues to frustrate digital thieves.

4: Ongoing Threat Monitoring

Transitioning a business into the cloud does not alleviate the need for constant cybersecurity monitoring. Threat actors often live in other time zones and in countries that won’t extradite them for online crimes. It’s up to every business to ensure that attempted data breaches at 2 a.m. are met with resistance. Some companies leverage threat identification methods such as AI and machine learning to set off alerts. Then, cybersecurity experts quickly investigate and decide on an appropriate defence. Others take proactive measures such as threat-hunting to ferret out bad actors and potential malware before they can gain a foothold. When an enterprise has security experts monitoring its cloud-based infrastructure and assets 24 hours a day, 7 days a week, anomalies can be identified and dealt with swiftly. The alternative may be waking up to a ransomware demand.

5: Incident Response Policies

Business resilience requires a pathway back from a devastating cybersecurity breach. Leadership teams typically work with a third-party firm that handles areas such as cloud computing and managed IT, and that has cybersecurity experts on staff. Following a thorough risk assessment, the firm provides in-depth analyses of the company’s strengths and vulnerabilities. The report also makes recommendations on ways to close cybersecurity gaps and lays the foundation for an incident response plan. Department heads and key stakeholders are brought together, and each takes on a role; responsibilities may include daily backups stored in multiple locations, as well as offline copies. In the event of an emergency, everyone then has a part to play in restoring operational integrity as quickly as possible.

6: Cybersecurity Awareness Training

The importance of a security-trained staff cannot be overstated. When decision-makers enlist the support of cybersecurity firms to educate front-line workers about the tactics used by hackers, that heightened awareness changes the company’s culture. Instead of employees being the weak link cybercriminals seek to exploit, they are now part of an overarching security posture. When hackers grow weary of testing their phishing schemes and social engineering tricks on educated staff members, they look elsewhere for low-hanging fruit. The ability to deter, detect, defend, and implement a disaster restoration program is an absolute must in the digital age. The fact that companies are adopting the cloud to lower costs and improve efficiency doesn’t change the global chess match being played out between hackers and cybersecurity professionals. Keep in mind that honest business professionals and their livelihoods remain at risk.
By hardening your defences and planning for the worst, you ensure your preparation will pay dividends.

### Good, Bad and the Ugly of Cybersecurity GenAI

As the cyber threat landscape continues to evolve at an unprecedented pace, organisations are finding themselves in a ‘two-speed’ security environment. In one lane, threat actors are leveraging emerging technologies such as generative AI to their advantage, increasing the frequency and sophistication of their attacks. In the other, organisations themselves are racing to harness the same technology in order to bolster their resilience. It’s often unwise to fight fire with fire, but in this instance, it’s an absolute necessity. This ‘arms race’ comes at a time when cyber threats are already on the rise. Infoblox’s 2023 Global State of Cybersecurity report revealed that more than 60% of organisations worldwide suffered at least one data breach in the past year, with the average cumulative loss for each organisation standing at around $2 million. Phishing was the top attack vector, accounting for more than 80% of breaches. Phishing, which uses social engineering tactics, fraudulent emails, and ‘lookalike’ domains to mimic legitimate websites, has become a major cause for concern. In fact, in a recent paper, Forrester announced that gaining application, user and device context was one of the biggest challenges now facing security teams. By weaponising generative AI tools and the large language models (LLMs) that underpin them, threat actors can scale their attacks with greater speed and complexity than ever before. Without the means to understand who or what is connecting to a network – and what data is being accessed – organisations are left blind. In this article, we will delve into the threats and opportunities presented by generative AI in cybersecurity, drawing particular attention to the role of protective DNS and how AI can enhance DNS security. Every device, every app, every link relies on DNS. This universality makes DNS an ‘easy target’, but also a powerful line of defence.

The threat of AI-powered phishing

Generative AI, particularly models like GPT-4, represents a transformative leap in AI capabilities. GPT-4 can generate human-like text based on the information it's trained on, enabling businesses to automate content creation, enhance customer service through advanced chatbots, and streamline decision-making processes by providing data-driven insights. By harnessing the power of GPT-4, businesses can achieve greater efficiency, foster innovation, and tailor their offerings more precisely to customer needs, all while reducing operational costs. However, LLMs like GPT-4 can also be exploited by threat actors for malicious purposes. For instance, models can craft sophisticated social engineering narratives consisting of convincing emails, messages, or fake websites that can play the “long game” and manipulate victims over time into taking actions they wouldn’t ordinarily take, such as sharing passwords or transferring funds. This is a step up from a basic phishing email which, while potentially convincing, is easily avoided with the proper filters and staff training in place. Imagine a scenario where a threat actor meticulously gathers information for a spear-phishing attack. They craft a convincing lookalike domain and, using generative AI, forge a set of emails so authentic they appear to be from a trusted vendor or someone within the organisation.
The target, unsuspecting, engages with these emails and clicks through to a fake landing page or lookalike domain. This could be disastrous, but if protective DNS measures are implemented by the target's company, that lookalike domain can be intercepted and neutralised before anybody engages with it. This goes hand in hand with the rising trend of "lookalike" domains, designed to deceive even the most discerning eye. These domains often mimic internal websites, business partners, or legitimate software-as-a-service (SaaS) applications, making them potent tools for spear phishing and smishing attacks. Generative AI can be used to create thousands of sophisticated lookalike variants, using various methods such as SMS messages, phone calls, direct messages on social media, emails, and QR codes to deploy them. In a recent report, Infoblox identified and curated more than 300,000 lookalike domains between January 2022 and March 2023 to highlight the scale of the problem and the sophistication of the methods being used. It revealed that more than 10,000 organisations had been targeted, including banks, government services, delivery companies, software providers, and more. Another study revealed the growing use of generative AI, including ChatGPT, by threat actors to create increasingly authentic and convincing phishing attacks. The democratised nature of generative AI also means that the barrier to entry for cybercrime has been lowered. Not only are existing threat actors and groups enhancing their capabilities but more threats are being added to the mix. What happens when unique malware and toolkits can be mass-produced by average hackers? Or when attackers are able to exploit vulnerabilities in mere minutes rather than days? The industry must brace itself for these eventualities. The role of protective DNS A DNS query is the process by which a device translates a human-readable domain name, like “example[.]com”, into an IP address that it can use to locate the website on the internet. In the context of phishing, attackers often create fake or "lookalike" domains to deceive users into providing sensitive information. By monitoring DNS queries, security systems can detect and flag these malicious domains, preventing them from resolving and thereby stopping users from accessing these deceptive sites. When a user unknowingly clicks on a phishing link, the DNS query can act as a safety net, blocking threats earlier by not establishing a connection to the malicious site and safeguarding the user's data - this is protective DNS in action. When DNS is compromised, the consequences can be catastrophic. However, the same ubiquity that makes DNS a target for attackers also offers a unique vantage point for defence. By monitoring DNS data, it's possible to detect and thwart malicious phishing attempts at their inception, regardless of their delivery method. Bolstering DNS detection and response with AI Harnessing the power of AI can further enhance DNS detection and response. The same technology that underpins generative AI can be used to amplify monitoring capabilities, identifying patterns and anomalies that might indicate malicious activity. Domains registered well in advance of an attack can then be flagged and monitored. Machine learning algorithms, combined with human expertise, can sift through vast amounts of data to pinpoint these potential threats and proactively shut them down. 
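To make the protective DNS idea concrete, here is a minimal sketch of how a resolver-side filter might screen queries against a blocklist and flag lookalike domains with a simple string-similarity check. It is purely illustrative and not any vendor's implementation: the domain lists, the threshold, and the use of Python's difflib are assumptions made for the example.

```python
# Illustrative protective DNS screening: blocklist lookup plus lookalike detection.
# The blocklist, protected domains and similarity threshold are example values,
# not a real threat feed or a recommended policy.
from difflib import SequenceMatcher

BLOCKLIST = {"malicious-example.com"}           # known-bad domains from a threat feed
PROTECTED = {"example.com", "examplebank.com"}  # legitimate domains to defend

def is_lookalike(domain: str, threshold: float = 0.85) -> bool:
    """Flag domains that closely resemble a protected domain without matching it."""
    return any(
        domain != legit and SequenceMatcher(None, domain, legit).ratio() >= threshold
        for legit in PROTECTED
    )

def screen_dns_query(domain: str) -> str:
    """Decide whether a DNS query should resolve, be blocked, or be flagged for review."""
    domain = domain.lower().rstrip(".")
    if domain in BLOCKLIST:
        return "block"    # known malicious: never resolve
    if is_lookalike(domain):
        return "flag"     # suspected lookalike: block or send to analysts
    return "resolve"

if __name__ == "__main__":
    for query in ["example.com", "examp1e.com", "malicious-example.com"]:
        print(query, "->", screen_dns_query(query))
```

Real protective DNS services combine curated threat intelligence, domain-registration signals, and machine learning rather than a single similarity test, but the control point is the same: the query is evaluated before any connection to the suspect domain is ever established.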
The recently identified Decoy Dog malware is a significant threat to enterprise networks, but its detection underscores the critical role of protective DNS cybersecurity. Originating as an advanced version of the "Pupy RAT," Decoy Dog utilises DNS for its command-and-control operations. This reliance on DNS highlights the system's dual nature: while it can be exploited for malicious activities, it also serves as a potent line of defence. By monitoring and analysing DNS queries, potential threats like Decoy Dog can be identified and neutralised early. As organisations strive to operate as efficiently as possible, the lines between networking and cybersecurity continue to blur, and the need to unite networking and security under one umbrella has become clear. By contextualising real-time user and end-point activity, using functions such as AI-powered DNS querying, it's possible to eliminate network and security bottlenecks, streamlining security operations so that they’re fit for a world that never stops. While the rise of generative AI presents undeniable challenges, it also offers opportunities. By embracing the power of DNS and integrating AI-driven insights, organisations can collectively forge a path to a more secure digital future.

### Maximising the business value of data

In today's volatile economic and geopolitical climate, companies must harness the power of data to safeguard their future operations. However, business leaders and IT departments need to work in close partnership to build robust data management infrastructures. By combining forces, business leaders and IT professionals are well-positioned to create a buffer against marketplace fluctuations while securing a strategic edge. Effective data use enables organisations to anticipate market trends, make informed decisions, and drive innovation with insight – all crucial for achieving commercial success. As such, business and IT leaders must prioritise the following five key areas to ensure data-driven initiatives yield optimal results.

Align business objectives with data management strategy

Effective data management is vital to an organisation's strategic framework, especially in our knowledge economy. Progressive companies intertwine their data practices with business objectives, creating a symbiosis that fosters growth. A data-focused strategy emerges through close collaboration between business executives and IT experts, propelling the achievement of KPIs, market expansion, and customer experience improvement. Fundamental to this approach is a unified vision across company ranks that recognises data's role in steering success. This aligned approach creates business-centric data collection, analysis, and application processes, which in turn supports leadership teams with true data intelligence.

Building a robust data management infrastructure

Every modern enterprise needs a robust data management framework. A scalable, resilient, and flexible data architecture enables prompt responses to economic shifts and gives organisations a steady foundation from which to adjust their data handling capabilities easily. Central to this is cloud computing technology, which offers scalability, access, and redundancy, and is vital for rapidly growing businesses with increasingly remote teams and the need to ensure continuous operations. The goal is a resilient and responsive data management system poised to help the organisation seize market opportunities with precision and speed.
Cloud data management offers additional benefits, including superior analytics, security measures and governance, enhanced data quality, quick discovery, and effective metadata consolidation. It also plays a key role in facilitating the idea of a unified data platform – one which brings together master data across different systems into a highly reliable, real-time single source of truth.

Sustaining data quality through solid governance structures

At the heart of delivering dependable, high-calibre data lies effective data governance, ensuring uniformity, integrity, and regulatory compliance. In contrast, poorly managed and disorganised data can lead to database errors, ambiguities, and security vulnerabilities. Thus, companies must craft clear policies, uphold stringent data quality protocols, and appoint stewards to maintain data accuracy throughout its lifecycle. A sound governance strategy transforms data into a robust asset for informed decision-making, shielding an organisation from legal and regulatory risks. Governance tools set the bar for quality standards and aid in organising data and managing metadata, simplifying compliance evidence, and reinforcing end-user trust in data-driven approaches. Before implementing a new data governance strategy, business-wide consensus and the right tools must be obtained. It's crucial to align stakeholders from various departments around a common vision for management, oversight, and data use. This agreement streamlines operations, fostering a collaborative environment. In this context, selecting a user-friendly, cooperative, and versatile data governance tool becomes vital - one that can accommodate the evolving demands of a dynamic business landscape.

Utilising AI and business intelligence tools

Artificial intelligence (AI) and business intelligence (BI) aren’t fads – they offer considerable capability to transform data management and analytics. By leveraging cutting-edge tools, like advanced algorithms and machine learning, businesses can unlock predictive insights into market shifts and consumer patterns, establishing a strategic advantage. AI scrutinises extensive datasets for hidden patterns and opportunities, empowering organisations to fine-tune innovation and decision-making processes. BI tools, meanwhile, provide a detailed examination of company operations, revealing opportunities for cost-efficiency and improved service delivery, for example. However, AI’s efficiency hinges on the quality of the underlying data. Prior to integrating AI, it's crucial to lay a solid data foundation by ensuring the information is structured, accurate, and functional. This foundation enhances AI’s capabilities, streamlining rather than complicating operational functions. It also ensures that any AI use cases, like quality assurance and customer experience, are effective and insightful. Developing this robust data infrastructure involves four critical steps:

• Enforcing meticulous data cleansing procedures
• Investing in a comprehensive master data management (MDM) system
• Establishing strict data governance rules
• Prioritising cybersecurity and data privacy to mitigate potential threats.

Applied with dexterity, AI and BI serve as dynamic tools for hands-on strategy development and decision-making, offering a data-informed outlook on future developments.

Measuring data management ROI

In a cost-conscious world, companies must validate the financial viability of their investments in data management.
Establishing quantifiable performance metrics and tracking them against the investment allows organisations to gauge the success and ROI of their data management initiatives. These metrics could include:

• Revenue increases
• Cost reduction
• Process error reduction
• Operational improvements
• Number of compliance issues resolved
• Data quality improvements

Thriving with better data management

Implementing thorough data management strategies, underpinned by the right technologies, goes beyond preserving a competitive edge – it’s about securing survival and prospering in an ever-changing global landscape. Companies that acknowledge the crucial role of a data-driven approach in fostering resilience are poised to emerge as future trailblazers.

### The cloud: a viable option for data storage

Cloud-first strategies have become commonplace across many industries. In fact, according to findings from Statista, more than 60% of all corporate data is currently stored in the cloud—up from 30% in 2015—and 94% of all enterprises are connected to online services that are hosted on the cloud. Furthermore, Loftware’s annual “Top 5 Trends in Labeling and Packaging Artwork” survey found that nearly three-quarters (71%) of companies believe the cloud or a hybrid solution will be their preferred deployment method for labelling — a vital component of any well-managed supply chain — within the next three years. Despite this, some businesses cite security risks as the primary deterrent to adoption. What these businesses might not know is that the cloud leverages encryption, physical and virtual security systems, authentication, and continuous monitoring to protect data and processes against security threats. By shifting to the cloud and utilising the right cloud provider, businesses will find themselves significantly more confident about their data security.

Why SaaS solutions offer next-level security and dependability

Unlike on-premises software solutions, SaaS applications use the cloud to store data and offer easier access to your information. The key to the successful deployment of these applications is finding the right cloud partner who values privacy. These partners are more likely to employ top-end security protocols and be responsive and decisive in the unlikely event of data breaches. Employing disaster recovery protocols and offering high levels of availability will ensure that manufacturers experience as little disruption as possible. Without these features, companies could experience lengthy operational delays due to investigating a breach, costly fines resulting from non-compliance with regulatory standards, or a loss of customer trust which in turn impacts sales. Therefore, selecting the right partner is paramount. Strong cloud partners will often host their solution in tandem with other large and mature hosting organisations, such as AWS or Azure. By doing so, cloud service providers gain access to their extensive industry knowledge and scalability, resulting in an improved and more advanced cloud offering. Ultimately, this collaboration leads to a superior cloud experience for customers.

How providers can demonstrate a commitment to data privacy and security

Maintaining a healthy relationship is paramount to a successful partnership. To demonstrate their dependability, data processors must make clear their commitment to providing top-of-the-line security measures. Sometimes, this commitment takes the shape of a legal agreement such as a Data Processing Agreement (DPA).
In short, it is an arrangement between a business and its cloud service provider that establishes the scope and purpose of data processing. This includes what data is processed and, more importantly, how it will be protected. Whilst the DPA is a requirement of the EU’s General Data Protection Regulation (GDPR), it can also prove useful outside the EU because it outlines the responsibilities of each party as well as the legal protections to which cloud services customers are entitled. Organisations in the EU and around the globe may be required to have Standard Contractual Clauses (SCC) together with a DPA. This is an added benefit to cloud service customers, because providers who have a DPA and SCC tend to have more mature processes, since they have been required to develop them for GDPR requests. Another route for cloud providers to showcase their commitment to security is through employing a Chief Information Security Officer (CISO) and Data Protection Officer (DPO). By appointing executives to oversee security and data privacy, providers demonstrate they are ready to take on security responsibilities. Whilst employing a DPO is a legal requirement and having a CISO is optional, both roles indicate a commitment to privacy protection and data rights.

Peace of mind around secure data

Over time, the industry has evolved to better understand what practices work best in data protection. An example of this is the practice of encrypting data in motion and data at rest. Encryption prevents eavesdropping and, together with storing data in separate databases, adds an extra layer of protection against unauthorised access. Providers will outline specific security measures to prevent outside players from accessing, changing, or stealing at-rest data. Providers sometimes opt to use third-party audits as well, in order to certify the soundness of their security practices. These certifications include the likes of Service Organisation Control 2 (SOC2), ISO27001 and National Institute of Standards and Technology (NIST) frameworks. They serve as a signal to customers that these are trusted providers. Certified cloud solutions typically offer a significantly more secure environment, as their security practices, including preventative and responsive protocols, have been tested and certified as reliable. This offers businesses greater equanimity, assuring them that they are uploading their data to a reliable cloud system.

Regulations are adapting to help providers and data holders

Over time, regulators have looked to incorporate new rules and regulations to improve oversight of data protection and prevent privacy breaches. These regulations typically deal directly with privacy and offer guidance for best practices in data protection. Cloud providers understand that this is an ever-evolving regulatory landscape and that they may need to quickly adapt their day-to-day practices to new regulations. Regulations such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) deal directly with privacy and offer specific guidelines for data processors. They were enacted to help prevent and manage large data privacy breaches, and they outline specific requirements for data processors, providing minimum requirements to ensure GDPR compliance and a transparent privacy policy outlining the rights of you and your customers.
Finding a trustworthy cloud provider who is GDPR compliant and stays on top of applicable laws and regulations is paramount to maintaining top-of-the-line data security.

The future lies in the cloud

As supply chains grow increasingly complex and adopt intelligent technologies, mission-critical labelling operations will require more data than ever before. Businesses will therefore need to find the right partner to store their data. While some see potential security risks in adopting the cloud, many of these concerns can be alleviated by employing the right cloud partner. Security is, of course, paramount and, with more providers showcasing their commitment to data security, many businesses will find the cloud is a reliable option for their requirements. By leveraging data encryption, improved security systems, reliable disaster protocols, and a commitment to regulatory compliance, cloud providers continue to demonstrate just how reliable and secure cloud-based labelling strategies can be.

### Emerging trends in Cloud, DevOps and Governance

The cloud landscape has an immense impact on how we think about IT and DevOps. With revenues in the cloud infrastructure market increasing at a rate of 35% to 40% annually, there’s a pressing need for leaders to think strategically about where to deploy resources. This shift is technological, but it’s also financial, operational, and strategic. Here are the key trends reshaping the cloud and DevOps landscape.

With increased adoption, FinOps maturity needs to increase

Since cloud computing is becoming more integral to business operations, IT leaders should concentrate on the maturity of FinOps. Currently, according to the FinOps Foundation, a staggering 62% of organisations are at a ‘crawl’ stage in their FinOps journey, reflecting a nascent understanding and implementation of financial management in cloud environments. This is slowly changing. It’s in every team’s best interest to evaluate releases based on cost deltas, signalling a deeper integration of financial acumen, cloud knowledge, and new tooling into the development process. Monthly bill forensics, treating FinOps as a compliance exercise, and the outdated notion that cost management is solely the business side’s concern are being replaced by this more inclusive approach.

Product-centric platform engineering is emerging to help improve Developer Experience

The Developer Experience is being improved by product-centric platform engineering. By 2026, a staggering 80% of software organisations will have established dedicated platform teams. The core of product-centric platform engineering lies in creating internal platforms built by product experts; they embody the best operational practices, ensuring that every aspect of development is streamlined and efficient. A well-designed platform removes the burden of understanding every nuance of the cloud from app developers, freeing them up to do what they do best: write features. This is a departure from the traditional, more fragmented methods. Outdated practices such as raising endless tickets, dealing with opaque errors, and the repetitive, manual workload are becoming obsolete. The shift also drifts away from reliance on boilerplate code and the often-limited institutional knowledge behind it.

Governance will digitise to keep pace

The digitalisation of governance is happening in response to a growing concern in the software development field — approximately 70% of developers say they cannot release software as frequently as they desire.
The crux of this issue lies in traditional governance models, which are increasingly unable to keep up with the rapid pace of technological advancements. Leveraging digital metadata to attest to quality, security, and provenance, as well as implementing pattern libraries and ‘paved roads’ — pre-defined best practices and guidelines that streamline development processes — enhances governance by providing a framework that supports faster and more secure releases, without compromising on the quality of compliance. The goal is to reduce manual, tedious tasks while keeping human oversight and decision-making in the loop. It’s about using technology to enhance, not replace, human judgement and expertise.

Hybrid cloud is here to stay

By 2025, $94 billion will be invested in on-premises IT, while cloud revenues will be an astounding $372 billion. This means that the hybrid cloud must be a foundational element of modern IT strategies going forward. There is a recognition of the limitations of a single cloud or greenfield approach. What’s ‘in’ is multifaceted: mainframe migrations, which were once sidelined, are now getting the attention they deserve. The ‘Supercloud’ emphasises portability and the flexibility of ‘bring your own cloud’ models, complemented by hyper-converged infrastructure — offering operational expense pricing and a cloud-like experience, even in on-premises environments. Simultaneously, there’s a shift from basic ‘low-hanging fruit’ solutions and single-cloud dependency to more strategic, nuanced approaches, driven by dissatisfaction with public cloud programs that have not met expectations quickly enough. This trend also emphasises containers and higher-level abstractions for enhanced portability across on-premises and multi-cloud environments.

Cloud security is evolving to provide a holistic app & infra threat model

In recent years, we’ve observed a startling 742% year-on-year growth in software supply chain attacks, and by 2026, it’s predicted that organisations prioritising a continuous exposure management program will be three times less likely to suffer a security breach. The focus should now be on a comprehensive examination of software supply chain risks, deploying next-generation Cloud-Native Application Protection Platforms (CNAPP) and adopting ‘as-code’ methodologies for implementing controls. This approach leverages cloud policy and Open Policy Agent (OPA) to enforce security measures, ensuring a more responsive security posture. The industry must take a strategic, informed approach to managing these trends. Whether it’s adapting to the nuanced financial management of cloud resources, embracing the potential of product-centric platform engineering, or navigating cloud security, each demands a deep understanding and a proactive stance. It’s essential that business leaders and decision-makers take what resonates and consider how each approach would impact their customers, always keeping their organisation’s best interests at the forefront.

### AI Show – Episode 2 – Paul Pallin

In this episode of the AI Show, host Emily Barrett talks with Paul Pallin, Co-Founder and Chief Scientist of Miros, about revolutionising e-commerce search with AI. Paul explains the inefficiency of current search methods, particularly for visually appealing products, and how Miros aims to improve the shopping experience.
He highlights the limitations of keyword searches and introduces Miros's solution, which involves semantic vector search and an iterative image-based feedback loop to understand and fulfill consumer desires without relying on verbal descriptions. This technology not only promises to save time for shoppers but also has potential applications beyond e-commerce, such as in interior design and visual communication. ### The hybrid future of cloud in surveillance networks Cloud’s continued expansion into every vertical is inevitable. Already a predominant and essential part of many businesses, it will expand rapidly, predicted to grow to $2.5 trillion by 2031. Anyone with an eye on the cloud market will be unsurprised by such figures. But for those interested in introducing such technology to their business – whatever that business may be – it is important to act with a certain amount of caution. On-demand, distributed computing, for all its benefits, is not a drop-in solution for every business need.  The security and surveillance space greatly benefits from the power offered by the cloud. Cloud tools make hardware like networked cameras more flexible, scalable, and efficient. Yet integrators must consider whether cloud is the correct choice at every step. In many individual cases, a transition to cloud may be entirely appropriate, making for a more cost-effective and optimal solution; in others, the cloud may prove far more expensive and complex than the function demands.  Given the variance between solutions and situations in regards to surveillance systems, there is no clear route to follow when transitioning to the cloud. In truth, installers and integrators are best served by employing a hybrid approach which finds a balance between what cloud services can offer and the benefits of keeping certain technology in-house. Physical difficulties  Whatever the industry, the decision to lean on the cloud is never a black-or-white choice. But it is particularly grey in the physical security space. Trust, regulation, and heavy bandwidth and latency demands contend with the potential benefits of connected platforms and the unique specifics of every company’s security needs. Even where cloud technology can be justified and offer a significant upgrade to existing functionality, it will not always fit neatly into the environs offered to it. Locations with slow or unstable internet connections, for example, cannot transfer video data reliably – and a loss of connection could mean the loss of vital evidence or the inability to remotely control a system when it is most required. Other locations may wish to upgrade a system and incorporate existing legacy hardware which does not offer the level of connectivity that cloud requires, or they might operate a data storage strategy which is incompatible with cloud technology. There are cloud functions which could still suit such sites, but it is important to be cognisant of the potential pitfalls and opt for local solutions where appropriate. Barriers to cloud use IT professionals may find it tempting to take the easy route; indeed security, as with every other digital field, benefits from a number of as-a-service cloud models which can be moulded to assist with everything from infrastructure and platforms to software and, in the case of Video Surveillance-as-a-Service (VSaaS), even core camera functionality. 
Cloud technology places the burden of processing, storage and administration outside of one’s organisation, which can minimise many of the challenges that security systems raise and lower the administrative burden. But that extraction creates challenges of its own. Storing data in the cloud may prove to go against confidentiality principles or data regulations in certain industries, for example. Through increased bandwidth demands and additional charges, it can also vastly increase cost and complexity compared with hosting that storage in-house. The link to a data centre creates a certain amount of lag, which hampers the reaction time of advanced analytics. These analytics not only improve the security level of today’s surveillance systems, but also allow them to perform operational tasks beyond security, making analytics a core part of modern network camera installations. The flexibility of a hybrid model shines in all of these cases. Managing one’s own hardware directly helps ensure security installations adhere to all relevant regulations. Administrators can easily expand, manage, and back up local storage, without a need to pay ongoing archiving or upload fees. The increasing power of edge processing, which places functions like object analytics directly within the camera, eliminates the need to send data back and forth to the cloud – or even over a local network – to actuate analytics. These are core functions, and sending them to the cloud does not, today, make a huge amount of sense.

Making the right choice

In the end, cost will be a deciding factor. For businesses with fewer devices, the simplicity and support of VSaaS plans could prove cheaper than administering local hardware. However, cloud’s scaling costs – as well as the potential for expensive and time-consuming refactoring should a cloud service be withdrawn – suggest that the larger the scale of a cloud transition, the higher its overheads. Narrowing one’s focus to those elements where cloud makes the biggest impact is the best way to keep its costs under control. Realistically, though, there is no future for advanced surveillance networks which does not include the cloud in some fashion. There is also no future which eliminates on-site processing. The balance will change over time. As bandwidth increases and costs trend downwards, the opportunities cloud services offer will become greater and more accessible, all while endpoint hardware grows more efficient and powerful. For security and surveillance, cloud’s true value comes from its addition to a hybrid model which brings together the best of all worlds. Select those elements of the cloud which are most relevant and which create the most value, while also incorporating the powerful abilities found in edge processing and the control of closely administering one’s own IT environment. Mould the system around the site – not the other way around. Find out more about Axis Solutions.

### CIF Presents TWF – Joshua Stanton

In this episode of TWF, we are talking about the metaverse. Our host and CIF CEO David Terrar will set the scene with his perspective on why this immersive technology is important, and then he'll interview Dr Joshua Stanton of Disruptive Live, explaining their open-standards, portable, browser-based approach with new platform Spaces Metaverse. Josh will explain the thinking and show practical use cases.

### CIF Presents TWF – Garfield Smith

In this episode, we are talking to a law firm which specialises in technology and data.
Our host and CIF CEO David Terrar will be interviewing Garfield Smith. He's the founder and managing director of the firm, and we'll explore legal issues for tech start-ups, larger corporates, all aspects of dealing with data, and how those legal issues are changed by the AI wave that we are all experiencing.  ### Cloud security is a shared responsibility The good news is that increasing numbers of customers are moving to the cloud to take advantage of all the benefits it provides, including around security. The not-so-good news is that against a backdrop of highly complex software supply chains, enterprises who don’t understand that security is a shared responsibility might risk leaving aspects of their cloud services vulnerable to unauthorised access. It's helpful here to understand the different pieces of the software supply chain to pin down who’s responsible for which elements. A strong, secure foundation Microsoft Azure or AWS provides the cloud infrastructure that a specific application vendor leverages and builds on top of to deliver a cloud application to the market. Customers will want to ensure that any application provider they’re engaging with has a strong partnership with a well-established cloud service provider, since the latter provides the foundational layer of any cloud offering. Things to look for in a cloud service provider include global datacenters (to comply with any geo-specific data protection requirements) and compliance with the latest security and quality ISO standards, as well as the requisite level of performance, reliability, and scalability. The application itself The application provider, meanwhile, needs to ensure that the application itself is designed and architected in a way that ensures the highest levels of security and governance. This includes incorporating frameworks like zero trust architecture, which – by default – assumes that no user, device, or system should be implicitly trusted, but instead should be continuously verified and validated. It is also up to the application provider to regularly patch and update the application on an ongoing basis, making sure customers always have the latest version of the service. Customers are the final piece of the puzzle The cloud service provider and the application provider create a secure foundation – but customers add one more piece to the puzzle. And even the most securely architected application can be vulnerable to breaches if it isn’t properly configured and administered. Start with user accounts. The application provider isn’t in a position to know who within the organisation should have access to the application and what level of access they should have. For example: who should be a power user or an administrator, with broad access? Who should be a read-only user, with limited access? It’s a customer's responsibility to determine who needs to have what level of access. There’s also responsibility around managing those user accounts over time. For example, if someone leaves the company, that account needs to be deactivated. This requires ensuring that user information is synchronized with other directory management tools within the organisation, such as Active Directory. Oftentimes, companies will assume that these different applications and services “automatically” work together. The reality is that things can change after the initial configuration so regular monitoring is required to ensure all systems are working with the most accurate and up-to-date information. 
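As a rough illustration of that monitoring step, the sketch below reconciles a cloud application's user list with the corporate directory and reports accounts that should be deactivated. The fetch_directory_users and fetch_app_users functions are hypothetical stand-ins for whatever directory (for example, Active Directory) and application admin APIs an organisation actually uses, and the sample data is invented for the example.

```python
# Illustrative sketch: reconcile application accounts with the corporate directory.
# fetch_directory_users() and fetch_app_users() are hypothetical placeholders for
# real directory and application admin APIs; the data below is sample data only.
from dataclasses import dataclass

@dataclass
class Account:
    email: str
    active: bool
    role: str = "read-only"

def fetch_directory_users() -> dict[str, Account]:
    """Placeholder: in practice, query the corporate directory (e.g. Active Directory)."""
    return {
        "alice@example.com": Account("alice@example.com", active=True, role="admin"),
        "bob@example.com": Account("bob@example.com", active=False),  # has left the company
    }

def fetch_app_users() -> dict[str, Account]:
    """Placeholder: in practice, call the cloud application's user-management API."""
    return {
        "alice@example.com": Account("alice@example.com", active=True, role="admin"),
        "bob@example.com": Account("bob@example.com", active=True),      # still enabled in the app
        "carol@example.com": Account("carol@example.com", active=True),  # unknown to the directory
    }

def accounts_to_deactivate() -> list[str]:
    """Return app accounts whose owner has left or been disabled in the directory."""
    directory = fetch_directory_users()
    stale = []
    for email in fetch_app_users():
        employee = directory.get(email)
        if employee is None or not employee.active:
            stale.append(email)
    return stale

if __name__ == "__main__":
    print("Accounts to deactivate:", accounts_to_deactivate())
```

Run on a schedule and paired with an actual deactivation call, a check like this catches the leavers and orphaned accounts that a one-off configuration exercise tends to miss.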
Careful with that configuration Aside from user accounts and access levels, there are other areas that customers need to pay careful attention to, particularly when it comes to configuring the application. Most cloud services come with a set of security features that need to be toggled on and adjusted to properly protect the data within the application. Imagine a scenario, however, where a cloud application has been purchased by a non-technical member of the organisation, i.e. a business user. For that individual – who’s probably just looking to get up and running with the application as quickly as possible so that they can roll it out to a few other members of their team or department – the tendency is to accept whatever the default security settings are and leave them as is rather than tinkering around with them. As a result, the organisation might be totally unaware that certain ports were left open by default, or that an important data backup setting wasn’t turned on. Involving the IT team early on in the process of introducing a new cloud application into an organisation is a way to prevent these accidental missteps and ensure the application is properly configured according to internal policies. The application vendor should also provide plenty of guidance, documentation, and technical support as needed to customers who have configuration questions, as a way of heading off any problems before they arise. It's a team effort None of these responsibilities should be seen as insurmountable, nor should they overshadow the value of moving to the cloud. On the contrary, it’s a way for organisations to ensure that they attain full business value from their cloud investments by eliminating security vulnerabilities. By demystifying the responsibilities between cloud service provider, application provider, and customer – and then making sure that these are well documented and well understood by internal stakeholders – the phrase “Isn’t security just the vendor’s responsibility?” will be consigned to the dustbin of history, replaced by an understanding that it is a shared responsibility. ### Navigating the Cloud Conundrum – Public, Private or Hybrid? While the concept of ‘computing as a service’ dates back to the 1960s, the explosion of cloud computing and hyperscale providers as we know it today is a more recent phenomenon. In recent years, particularly post-2020, a 'cloud gold rush' has swept across the business landscape. Driven by the pandemic, companies large and small have scrambled to transition their infrastructure applications to the cloud. However, in their haste to benefit from its headline advantages, many organisations overlooked the full ramifications of their decisions. Now, questions are arising as to why cloud economics don't always align with expectations, with many CIOs and IT directors finding that the promised savings are elusive, with hidden costs in bandwidth and overlooked payment plans complicating the picture. In many organisations, the cloud model now only truly makes sense if the strategy also leverages automation tools to deliver on its full potential. As a result, many IT professionals are now considering a fundamental question: is the cloud really all it was hyped up to be? From Public to Private So, where does that leave organisations who perhaps may be facing a cloud conundrum? 
For many, the answer lies in a strategic shift to the private cloud – a distinct approach that contrasts sharply with its public counterpart in a number of important ways:

Customisation: Tailored to specific needs, the private cloud allows for high levels of customisation and spreads costs over a longer duration. In this context, businesses can dictate the scale of their infrastructure, performance characteristics and compliance standards. This bespoke nature makes it particularly appealing for sectors with stringent regulatory demands or unique operational requirements.

Simplicity: The private cloud caters to volatile computing environments that might not be ideal for the public cloud. For companies with legacy systems or specialised software, for example, transitioning to a private cloud can provide a more controlled and gradual migration path, reducing disruption to ongoing operations.

Security: The private cloud also offers more control over data, a capability that is crucial for sensitive workloads. While public providers such as Azure offer high-quality security, private clouds offer more exclusive control. In sectors including finance and healthcare, where data privacy is paramount, the private cloud provides the necessary assurances, while the ability to monitor and manage data and access closely adds a layer of security that is often required in these industries.

Isolation: In addition, the isolated nature of private cloud environments significantly reduces the risk of resource contention, ensuring stable and predictable performance. This isolation is particularly beneficial for organisations running resource-intensive applications, as it guarantees the availability of computational power or storage resources as needed. Moreover, private clouds can offer superior network performance within an organisation, as the data does not have to leave the private network, which can be crucial for bandwidth-intensive applications.

Autonomy: Private clouds also enable organisations to maintain a higher degree of operational autonomy and control over their IT infrastructure. This control extends not just to the technical aspects but also to the governance of data and applications, allowing for a more tailored IT strategy that aligns closely with the organisation's long-term goals and regulatory requirements.

Transformation: In the current environment where digital transformation is a major area of focus, the adaptability of private cloud infrastructure supports long-term scalability in a controlled environment, which is particularly beneficial for organisations on a transformation journey.

Best of Both Worlds: The Hybrid Cloud

In practical terms, however, every organisation will have a set of unique requirements, and there's no universal solution that the cloud can deliver. In these circumstances, a hybrid cloud approach, blending the strengths of both private and public clouds, is arguably the most pragmatic strategy. It's especially relevant for organisations running legacy software that might operate more effectively on older servers, while the hybrid model also enables a unified management approach across different infrastructures, benefiting from the cost efficiencies of each. By adopting a hybrid cloud strategy, businesses can allocate resources more effectively, using the public cloud for scalable, less sensitive tasks while reserving the private cloud for critical, high-security workloads. This flexibility allows for better disaster recovery strategies and operational resilience.
Hybrid cloud environments also facilitate innovation, enabling organisations to test new applications in a public cloud setting while maintaining core systems securely on-premises or in a private cloud. It’s important to note that collaboration with experts is crucial to understanding the capabilities and maturity of different applications. Not every application will thrive in the cloud, and some may be better suited to more traditional solutions. Consulting with cloud architects and IT strategists can help businesses tailor their hybrid cloud environments, ensuring optimal performance and cost-effectiveness. As technology evolves, this balanced approach allows organisations to remain agile and adaptable, ready to harness new opportunities while maintaining a secure and stable IT foundation.

### Navigating the Hype and Reality of Cloud Service Providers

In recent years, the excitement surrounding artificial intelligence (AI) has reached unprecedented levels. From revolutionising industries to transforming daily interactions, AI's rapid progress and expanding applications have captured the imagination of businesses and individuals alike. Yet, amidst this fervour, it's crucial to examine the infrastructure that underpins AI advancements and the role of cloud service providers in enabling this transformative technology. While AI has made significant strides at the application layer, this does not diminish the critical role that cloud service providers play in facilitating AI development and deployment. Let's delve into the key principles underlying the architecture for AI and how cloud providers contribute to realising these objectives.

Scalability and Flexibility: AI applications demand significant computational resources, especially when training large models or processing massive datasets. Scalability and flexibility are paramount in accommodating these varying workloads and demands. Zadara’s cloud service partners excel in providing scalable infrastructure solutions, enabling enterprise organisations to dynamically scale compute resources up or down based on workload requirements. Whether it's provisioning additional zStorage data services or leveraging serverless zCompute resources, Zadara’s Edge Cloud offers the flexibility needed to support AI initiatives at scale.

High-Performance Computing (HPC): AI workloads often benefit from high-performance computing infrastructure, particularly GPUs (Graphics Processing Units), which excel at the parallel processing tasks common in AI algorithms. Zadara’s cloud service partners offer access to cutting-edge HPC resources, allowing organisations to harness the computational power needed for intensive AI workloads. With GPU-accelerated instances and specialised hardware configurations, Zadara Edge Cloud empowers researchers and data scientists to train complex AI models efficiently and accelerate innovation in the field.

Data Storage and Management: The success of AI applications hinges on the availability and efficient management of large volumes of data. Zadara’s cloud service partners offer robust zStorage and management solutions tailored to the unique requirements of AI workloads. From scalable object storage to distributed file systems and data lakes, Zadara Edge Cloud provides the foundation for storing and processing vast datasets.

Networking Infrastructure: High-speed networking infrastructure is essential for facilitating seamless communication and data transfer between different components of AI systems.
Zadara’s cloud service partners offer highly available and low-latency networking solutions, ensuring optimal performance for distributed AI workloads. Whether it's interconnecting virtual machines within a cloud region or establishing global network connectivity across multiple regions, Zadara Edge Cloud enables organisations to build robust and resilient AI architectures capable of meeting the demands of near real-time data processing.

Edge Computing: The proliferation of edge devices and the need for near real-time response times have fuelled the adoption of Zadara Edge Cloud for AI applications. Zadara’s cloud service partners offer edge computing solutions that extend their infrastructure to the network edge, enabling organisations to deploy AI models closer to the point of data generation. By leveraging Zadara Edge Cloud capabilities, organisations can minimise latency, reduce bandwidth costs, and enhance data sovereignty and privacy compliance for AI workloads. Zadara’s cloud service partners provide the flexibility to seamlessly integrate Zadara Edge Cloud environments, enabling a distributed approach to AI deployment that optimises performance and efficiency.

Security and Privacy: AI systems often deal with sensitive data, highlighting the importance of robust security and privacy measures. Zadara’s cloud service partners prioritise security and data sovereignty, offering a comprehensive suite of tools and services to protect AI workloads and data assets. From encryption and access controls to threat detection and identity management, Zadara Edge Cloud enables organisations to enforce stringent security policies and safeguard against cyber threats and data breaches.

Zadara’s cloud service partners play a pivotal role in realising the potential of AI by providing the scalable, high-performance, and secure infrastructure needed to support AI development and deployment. As organisations embrace AI to drive innovation and competitive advantage, Zadara Edge Cloud serves as the foundation for building and scaling AI-driven solutions that transform industries and shape the future of technology. While the excitement around AI continues to soar, it's essential to recognise the indispensable contributions of Zadara’s cloud service partners in turning AI aspirations into reality.

### What’s next for the cloud?

Reflecting on last year, there were some landmark events in the cloud industry that paved the way for 2024 – from advances in AI and tech giants grappling with unforeseen financial challenges and cuts, to criticism of the UK government’s G-Cloud initiative, to Ofcom prompting the CMA to investigate anti-competitive practices in the nation’s cloud market. As we head further into the new year, these cloud stories serve as a backdrop for anticipating the trajectory of the cloud industry. In 2024, companies are expected to leverage lessons learned from last year's challenges and become more inclined to explore the various cloud models available to determine which one might best fit their specific business needs.

Hidden costs are no longer hidden

With financial planning underway, IT decision-makers are re-evaluating their cloud spending to consider where unnecessary costs lie. It’s no secret that the cost of the cloud is on the rise, with 82% of organisations listing cloud spending as their top challenge. It’s fair to say that the public cloud’s original promise of being the most cost-effective solution no longer holds true.
Hidden costs of public cloud services include mounting charges from leaving servers running (which also means paying for unused resources) and hidden data access and exit fees. Naturally, IT decision-makers are increasingly asking for transparent, fixed monthly rates to plan their IT budgets around, because this is what other cloud models, such as the private cloud, can offer. For this reason, 2024 will likely witness an increase in the adoption of alternative cloud models, such as private cloud, or a move towards hybrid infrastructures. Decision-makers now understand that, despite higher implementation costs, these options can avoid hidden charges and offer better long-term investments while better serving their company’s individual needs.

Increased control over data in the cloud

In addition to hidden costs, businesses are facing a lack of control over their cloud infrastructure, with 40% of organisations in 2023 reporting a loss of control over IT environments. This stems from data security concerns, challenges in handling sensitive data, risk management complexities, issues with data portability, transparency gaps, and the risk of vendor lock-in. Again, the cloud was intended to simplify complex data management. However, the reality is that cloud resources need near-constant upkeep and monitoring to ensure the cloud is delivering real value. Heading further into 2024, businesses will likely be looking for cloud setups that offer them greater control over their infrastructure. This again supports the rising adoption of alternative cloud models — private cloud, multi-cloud, and hybrid cloud. What’s more, businesses will seek cloud solutions that can be tailored to the amount of control their industry demands; for example, financial institutions dealing with highly confidential client information might opt for a hybrid cloud model to retain sensitive data on-premises or in dedicated environments while leveraging cloud benefits for other workloads.

Cloud de-concentration will be a top priority

One cloud concern that has taken off recently (but is older than you might think) is cloud concentration. Cloud concentration refers to a situation where an organisation depends on a single cloud provider, typically one of the hyperscalers. This dependency can pose significant risks to the infrastructure, increasing its exposure to security vulnerabilities and the harmful effects of downtime. Following the industry’s conversations, it seems that cloud concentration is set to dominate the concerns of IT decision-makers in 2024, marking a shift away from the default ‘all-in’ on the public cloud. To avoid potential outages, insecure APIs, and limited visibility, organisations are reconsidering their cloud strategies in favour of a hybrid or multi-cloud approach, with a reported 80% of companies already using multiple cloud providers to spread risk. This shift reflects a broader trend in the industry, where companies are recognising the importance of having well-considered continuity plans and exploring alternative cloud strategies to ensure resilience and security.

A closing of the skills gap

Despite a surge in cloud-related job postings, 80% of organisations do not have a dedicated cloud security team or lead due to a scarcity of skilled IT professionals. This skills gap is resulting in a multitude of issues like misconfigurations, security vulnerabilities, and operational inefficiencies.
A recent survey from SoftwareOne reveals that the skills gap has pushed developer teams to their limits, causing project delays, staff burnout and a decline in overall productivity, forcing some organisations to restrict their cloud usage. Contributing to this problem are tech giants who often take the top talent and leave little for the smaller players. Consequently, enterprises need to adopt new solutions in 2024 to address this skills shortage, either by upskilling existing internal teams or by seeking assistance from managed service providers to bridge workforce skills gaps.

Big changes around the corner

It's clear that the cloud industry is set to embark on a shift towards more secure, transparent and cost-effective cloud solutions. As 2023 has shown us, this change has been prompted by the demand for control, fair costs, avoidance of cloud concentration, and a concerted effort to bridge the ever-widening skills gap through innovative solutions. In other words, 2024 is promising to kick-start a dynamic and resilient era of cloud computing.

### Lenovo and CUDO Compute: How Will AI and Cloud Infrastructure Impact the Environment?

In this video, Pete Hill and Matt Hawkins discuss the intersection of AI, cloud infrastructure, and environmental sustainability. They examine the growing carbon footprint of the cloud industry, driven by AI's demand for computing power. The duo explores sustainable energy sources like hydro, solar, and wind for data centres and the potential of edge computing networks to reduce emissions. They highlight the importance of regulations in promoting eco-friendly practices and introduce CUDO, a platform facilitating the global sale of sustainable cloud services. CUDO enables companies with renewable energy-powered data centres to offer their excess capacity, promoting environmental responsibility and cost-effective computing resources.

### Next-Generation Ransomware Protection for the Cloud

Have you heard the story of the LockBit gang? If not, consider yourself lucky. The LockBit gang is known for its widespread ransomware attacks targeting organisations around the world and, in early 2024, became a focal point for UK law enforcement and the FBI due to its prolific and disruptive activities. By pooling resources, sharing intelligence, and coordinating their legal and technical strategies, international teams were able to identify, pursue, and apprehend key members of the gang, disrupting one of the most active ransomware schemes of all time, and sending a strong message to similar criminal entities about the global community's commitment to combating cyber threats.

The Extent of the Ransomware Problem

The financial impact of ransomware attacks is staggering. Notable cases included attacks on Western Digital, the City of Dallas, Prospect Medical Holdings, MGM Resorts, Boeing, Henry Schein, and Ardent Health Services. These ranged from data theft and business disruption to a massive $100 million loss for MGM Resorts due to operational disruptions. According to the World Economic Forum, 2023 saw a concerning surge of ransomware activity, up 50% year-on-year in the first half of the year. March 2023 was the record-breaker, with a staggering 459 cases reported, a 91% increase from the previous month and a 62% rise compared to March 2022.
The most active ransomware groups included Clop, which exploited a vulnerability in Fortra's GoAnywhere MFT, and LockBit. The healthcare sector, in particular, felt the dire consequences of ransomware attacks, with at least 141 hospitals directly affected. The financial toll on healthcare was profound, with the average cost of a healthcare data breach reaching an all-time high of $11 million.

What can we do about this? Well, unfortunately, managing these attacks is very difficult. Cybercriminals often operate from countries with either weak cybercrime laws or a lack of enforcement infrastructure, making international cooperation difficult. The anonymity of cryptocurrencies, which are commonly used for ransom payments, further complicates efforts to trace and apprehend these attackers. And there's also the decentralised nature of ransomware operations, as we explored in our discussion of RaaS, which allows attackers to launch sophisticated campaigns with minimal risk of detection.

Never Pay the Ransom

This is not just cautionary advice. It's a principle backed by cybersecurity experts and law enforcement agencies worldwide. Complying with ransom demands presents several risks and consequences, both immediate and long-term. For one, there's no guarantee of data recovery. Studies and real-world experiences indicate that a significant number of victims who pay do not get all their data back. Paying also emboldens cybercriminals, signalling that the victim is willing to comply. This not only puts the victim at risk of future attacks but also encourages the perpetrators to target others. If you're the victim of ransomware and considering paying, think about where your money is going. Ransom payments directly finance criminal operations, enabling them to refine their attack methodologies, invest in more sophisticated technology, and expand their targets. Each successful transaction is a testament to the viability of ransomware as a profitable venture for cybercriminals. The economic model of ransomware is built on the willingness of victims to pay, ensuring its continuous growth and sophistication. Yes, a ransomware attack can leave you feeling like you're backed into a corner, with no choice but to cough up the cash, but the negative implications of paying far outweigh the possibility of a positive result.

Immutable Cloud Storage Is Key for Ransomware Recovery

Immutable storage is pivotal in defence against ransomware attacks. It's like a safety deposit box where you store your information and no one – not you, not your employees, and not attackers – can alter or delete it. The significance of immutable storage in ransomware defence cannot be overstated. In the event of an attack, having backups that are untouchable means you can restore your data to a pre-attack state without paying the ransom. This not only minimises downtime but also significantly reduces the financial and operational impact of the attack. Cloud services play a vital role in providing immutable storage solutions. Cloud storage providers such as CTERA offer services designed with immutability in mind, leveraging write once, read many (WORM) policies to protect data. Cloud-based immutable storage, then, is an essential component of a comprehensive data protection strategy, offering scalability, reliability, and accessibility, alongside robust defence against ransomware threats. It's a way to be proactive against attacks while ensuring any cyber threats that sneak through the battle lines don't disable your business operations. A minimal sketch of what this kind of write-once protection can look like in practice follows below.
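To make the WORM idea concrete, here is a minimal sketch of enforcing object-level immutability on a backup bucket. It assumes AWS S3 Object Lock and the boto3 SDK purely as an example: the bucket name, retention period and backup file are hypothetical, the bucket must have been created with Object Lock enabled, and other storage providers expose equivalent immutability controls.

```python
import boto3

s3 = boto3.client("s3")

BUCKET = "example-backup-bucket"  # hypothetical; created with Object Lock enabled

# Apply a default compliance-mode retention rule: for 30 days, no one
# (including account administrators) can overwrite or delete objects.
s3.put_object_lock_configuration(
    Bucket=BUCKET,
    ObjectLockConfiguration={
        "ObjectLockEnabled": "Enabled",
        "Rule": {"DefaultRetention": {"Mode": "COMPLIANCE", "Days": 30}},
    },
)

# Upload a backup; the default retention applies automatically, giving a
# restore point that ransomware cannot encrypt, overwrite or erase.
with open("backup-2024-03-01.tar.gz", "rb") as backup:
    s3.put_object(
        Bucket=BUCKET,
        Key="backups/backup-2024-03-01.tar.gz",
        Body=backup,
    )
```

With a compliance-mode default retention in place, even an attacker who obtains valid credentials cannot alter or delete the protected restore points before the retention window expires.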
Immutable storage doesn't just protect individual organisations. It's also key to the broader fight against ransomware.

Advanced Detection and Response Techniques

Artificial Intelligence (AI) and Machine Learning (ML) are changing the way we identify and neutralise ransomware activities. These technologies excel at sifting through massive volumes of data at high speeds, enabling us to identify suspicious patterns and anomalies that could indicate a ransomware attack in progress. One of the key strengths of AI and ML in this context is their ability to offer real-time detection and behavioural analysis. Unlike traditional, signature-based defences, which rely on known threat databases and thus can only defend against previously identified ransomware, AI and ML systems learn from the data they process. This continuous learning allows them to identify new and evolving ransomware strains based on behaviour and other indicators, rather than waiting for a specific signature to appear. In cloud environments, where data flows dynamically and the scale and complexity of operations can render traditional defences less effective, AI-driven systems allow us to implement continuous monitoring. And if they find a problem, they can automatically trigger alerts and initiate actions to mitigate the attack. The behavioural analysis offered by AI and ML also provides insights into the tactics, techniques, and procedures used by attackers, which can improve overall security and allow us to conduct detailed forensic investigations post-attack. With this knowledge, we'll know how the attack happened, which vulnerabilities were exploited, and how we can prevent similar incidents in the future.

A Call to Action

The persistent nature of ransomware means we need to be continuously vigilant and adaptable. This threat is not static; it morphs, adapts, and finds new ways to exploit vulnerabilities. Businesses, regardless of size or sector, have to keep evolving in the battle against ransomware, perpetually learning and improving their security practices. To stand resilient in the face of ransomware, a multi-layered approach to defence is not just advisable; it's imperative. But the journey towards creating it is a marathon, requiring endurance. What can you do along the way as you build your cyber defences? First, foster a culture of cybersecurity awareness in your business. Then implement a comprehensive defence strategy. And next, don't pay the ransom. Just don't. The path forward is challenging, but by taking collective action and making a commitment to cybersecurity, we can pave the way for a safer digital future.

### The art and science of GenAI in software development

The past year has shown just how quickly technologies can become the go-to innovation for companies globally. However, there isn't a switch that enterprises can flick to embed generative AI within their offerings and processes. It takes planning, time, and resources. Once an enterprise has decided to implement AI, that's the time for software developers to shine. From identifying business use cases to determining the quality of existing data, there's an art and a science to implementing and maintaining generative AI.

The starting point

Decision-makers need to be clear on the reasons for implementing generative AI. Is it to act as a stimulus for marketing activity? Is it to develop a new business product or offering? Or is it to improve business processes across the organisation? You need to ask the what, the why and the how.
Without clear direction, it's easier to stray off course. When it comes to using generative AI, there's an art to what you choose to use and how you use it. Generative AI is a creative catalyst for ideas – it's then on the human to decide what to do with them. For it to truly work as a tool, employees will need to know what it is for, how it can help them in their role, and also understand its current inaccuracies. That way, you can bring together the creativity and reality of generative AI.

Fine-tuning and testing

Once you have a purpose for the technology, be it for your processes, product or offering, and have started to implement it into your business, it then needs to be fine-tuned and tested. How do you develop models and optimise performance? How do you evaluate this performance? Well, for both aspects, you need to take a deep dive into science. And this is where software developers come into their own. In order to 'know' your AI, you need to access and understand your data. With a central repository system like a data lake, software developers can gather data from a range of sources, such as business applications, apps and social media, into one place. They can use this data to fine-tune your model, develop its capability and customise it based on user feedback. These developers can then use analytical tools to assess the model's performance, compare datasets, and help guide decision-making to improve your offering. This is where science informs the art of building a product or service. Depending on the costs and ambitions of the project, you might not want to start from scratch. Generative AI libraries, for instance, offer a range of pre-trained open-source LLMs to help developers build new AI-powered capabilities into their products.

Continuous improvement

The implementation, fine-tuning and testing of generative AI is only the beginning of the process. You then have to maintain it. One of the flaws of generative AI is that, for the moment, it relies on past datasets. Current models have a data gap of a year or two from the present day. Therefore, it is imperative that models are continually retrained with up-to-date data for the best results – the model is only as good as the data it is trained on. Again, a system like a data lake can allow you to import data in real time. Moreover, your business requirements will change, and so your models need to be retrained alongside these shifts. How do you embed the technology into your operations and then keep it relevant by enhancing and evolving it? Generative AI needs to drive your business processes towards the goal you set out from the starting point. Finally, seeing this improvement as part of an evolution is vital. A digital transformation may view reaching the end goal as job done – a digital evolution returns to the starting point and sees what can be implemented next to build continuous business improvement.

Mastering the art and science of generative AI

As with any technology, there's both an art and a science to using generative AI in software development. There is an art and science to implementing the technology, and knowing where and how to use it. Then, there is an art and science in fine-tuning and testing your model, knowing your data and how to use it. Both of these aspects contribute to the continuous improvement of your offering. Generative AI is still in its infancy, growing and learning quickly, and there is much to be explored and discovered.
The businesses that can start to master the art and science of generative AI in software development now will be best placed to evolve with it.

### From the lab to production: Driving GenAI business success

Generative AI (GenAI) was the technology story of 2023. Spurred on by the breakaway success of ChatGPT, the likes of Amazon, Microsoft and Google have accelerated their own efforts, creating a tidal wave of innovation that promised to reshape the way businesses and users harness the power of technology to drive productivity. It's already made significant strides in various sectors such as pharma and law. But what we've seen so far is just the beginning. The true power of GenAI will only become clear once organisations take it out of the experimental stage and begin to use it more widely in production. However, in order to ride the wave rather than get caught up in it, organisations must overcome some key challenges around cost and trust. Doing so will require a robust data roadmap that leverages the cloud.

Cost and trust are the biggest barriers

When it comes to GenAI, the old computing maxim of "garbage in, garbage out" applies—you can't expect to generate useful results if the model is trained on untrustworthy data. The challenge is that data governance and security are still at a nascent stage in many organisations, with crucial information often locked away in silos—making it effectively unusable without costly integration. In practice, this means that AI training data may be poor quality and lack crucial business context, which can lead to hallucinations (fictional information that seems realistic) or factual responses that lack the necessary context. Either way, it adds no value for the business. Another critical pain point is the high cost of in-house GenAI projects. While outsourcing comes with its own security, compliance and other potential risks, doing everything internally can be eye-wateringly expensive. A single cutting-edge GPU, designed specifically for running large language models (LLMs), costs around $30,000. And an organisation wanting to train a model with, say, 175 billion parameters might need 2,000 GPUs. At $30,000 apiece, that's a hefty bill of around $60 million for the hardware alone.

Taking GenAI from the lab to production

This is why cloud infrastructure is becoming increasingly popular as a foundation for AI. Cloud providers have the GPU resources to empower customers to scale their GenAI projects and only pay for what they use. This enables organisations to experiment with GenAI and turn off the model once they've finished tinkering, rather than having to provision GPUs in on-premise environments. That saves on CapEx and provides the flexibility organisations need to take operations back in-house in the future if required. Once they've decided to implement cloud, how can organisations get GenAI projects out of the lab and deliver value in production environments? The BRIESO model – Build, Refine, Identify, Experiment, Scale and Optimise – is instructive here:

Build: First, create a modern data architecture and universal enterprise data mesh. Whether on-premises or in the cloud, this will enable the organisation to gain visibility and control of its data. It will also help by establishing a unified ontology for mapping, securing and achieving compliance across all data silos. Look for tools which not only meet current demand but have the scalability to accommodate future growth. Open source solutions often offer the greatest flexibility.
Refine: Next, it's time to refine and optimise data according to existing business requirements. It's important at this stage to anticipate future requirements as accurately as possible. This will reduce the chances of migrating too much unnecessary data, which will add no value but may increase the cost of the project significantly.

Identify: Spot opportunities to utilise cloud for specific workloads. A workload analysis will be useful here in helping to determine where most value could be derived. It's about connecting data across locations – whether on-premises or in multiple clouds – to optimise the project. Now is also a good time to consider potential use cases for development.

Experiment: Try pre-built, third-party GenAI frameworks to find the one that best aligns with business requirements. There are plenty to choose from, including AWS's Bedrock (Hugging Face), Azure's OpenAI (ChatGPT) and Google's AI Platform (Vertex). It's important not to rush into a decision too early. The model must integrate closely with existing enterprise data for the project to stand any chance of success.

Scale and Optimise: Once a suitable platform is chosen, consider picking one or two use cases to scale into a production model. Continuously optimise the process, but keep an eye on GPU-related costs in case they start to spiral. As the organisation's GenAI capabilities start to grow, look for ways to optimise their use. A flexible AI platform is crucial to long-term success.

The future is here

IT and business leaders are understandably excited about the transformative potential of GenAI applications. From enhanced customer service to seamless supply chain management and supercharged DevOps – it's no surprise that 98% of global executives agree AI foundation models will play an important role in their strategy over the coming 3-5 years. But before anyone gets too carried away, there's still plenty of work to be done. A modern data architecture must be the starting point for any successful AI project. Then it's time to refine, identify, experiment, scale and optimise. The future awaits.

### AI Show - Episode 1 - Clare Walsh

We invite you to join us on an extraordinary journey into the world of artificial intelligence with The AI Show. In this groundbreaking series, we delve into the heart of cutting-edge innovations, ethical quandaries, and paradigm-shifting applications that are redefining our world. In each episode, our host Emily Barrett, an AI Lead and HPC Systems Architect for Lenovo, engages with the brilliant minds shaping the AI landscape. From visionary researchers and industry titans to ethical philosophers and policymakers, our guests shed light on the intricate interplay between humanity and technology. Brace yourself for captivating discussions that unravel the complexities of AI's impact on our lives, unveiling its potential to revolutionise industries, transform education, and reshape societal norms. Whether you're an AI enthusiast, a curious learner, or a forward-thinker seeking to stay ahead of the curve, The AI Show is your gateway to the extraordinary world of artificial intelligence. Join us on this exhilarating odyssey as we navigate the profound implications of AI and its indelible impact on our collective tomorrow.

### Generative AI and the copyright conundrum

In the last days of 2023, The New York Times sued OpenAI and Microsoft for copyright infringement.
A move such as this had been mooted for a long time – large language models (LLMs) are trained on data gleaned from published work online, after all – but a high-profile lawsuit finally being tabled is still a watershed moment. Generative AI's ambiguous relationship with copyright law has troubled developers, publishers and lawmakers alike for many years. But it's one of many headaches caused by the rapid ascent of AI and by the question of how to strike the right balance between quelling bad actors and encouraging sensible growth. Take the EU AI Act; a month ago, lawmakers finally agreed on legislation to demand transparency from providers and prohibit certain uses of the technology, albeit after struggling to keep up with the pace of progress since it was first proposed in 2021. Despite reaching a long-awaited agreement, enforcement likely won't start for years. The issue of copyright is more delicate to settle, particularly with models now becoming multimodal. We're very much exploring uncharted territory right now. But what could be the solution to appease publishers and attribute creators appropriately? And, regardless of the outcome of The New York Times suit, how is this dealt with by developers in the short term?

Special treatment

After The New York Times filed its lawsuit, OpenAI expressed its "surprise and disappointment" at the move. In a statement, the company confirmed that "regurgitation" – generating unaltered pieces of memorised content – "…is a rare bug that we are working to drive to zero." Evidently, this progress was not gathering pace quickly enough for many creators. It probably hasn't helped that OpenAI has pleaded for special treatment when it comes to paying licence fees, submitting evidence to the House of Lords Communications and Digital Committee that the development of artificial intelligence is doomed without the use of copyrighted materials. Aside from the fact that an exemption would violate the Berne Convention, this move was met with a mixture of scorn and outrage by many. Gary Marcus, in his Marcus on AI newsletter, put it best: "OpenAI's lobbying campaign, simply put, is based on a false dichotomy (give everything to us free or we will die) - and also a threat: either we get to use all the existing IP we want for free, or you won't get to use generative AI anymore." It appears that the only solution to appease creators and publishers is for foundational AI developers to pay up. It's much like playing copyrighted music in a public place: business owners must pay for a PRS licence to ensure the people involved are reimbursed. Why should it be any different for AI companies?

What next?

If AI developers are ultimately forced to pay up through landmark court rulings, the costs to build and run LLMs will grow to even higher, eye-watering levels, perhaps restricting their use to only the wealthiest private companies. Regardless of whatever monetary reimbursement a court may decide, the issue of attribution still looms large. Can it actually be solved with current technology? This is particularly timely as LLMs have become multimodal, with images especially tricky to pinpoint. This is why, for now, OpenAI appears to be implementing new guardrails to steer users away from inadvertently creating copyrighted material. Up to now, the success of guardrails has been mixed at best; not overly surprising when they must strike a perfect balance between restrictiveness and permissiveness. Whether the new guardrails will be anything more than a sticking plaster remains to be seen.
In the meantime, other developers are experimenting with tools to arm creators. An example is Nightshade, a project from the University of Chicago. Playfully compared to "putting hot sauce in your lunch so it doesn't get stolen from the workplace fridge", the tool tricks the learning model into interpreting a different image to what a human viewer would see, skewing the reproduction of copyrighted material. Furthermore, if and when a technical solution can be found, how will licence fees and royalties be tracked, quantified and paid? Leaving AI developers to police themselves in yet another area is not advisable, and surely not best for all those involved; does the answer lie in a brand new regulatory body to oversee the legal process? As generative AI becomes more and more part of our daily lives, this could be something to explore.

The only option

It seems that, no matter how often foundational AI developers try to kick the can down the road, paying to use copyrighted material will eventually be their only option. As is often the case, prosecutions will be the catalyst for change. But we should already be looking beyond that, finding solutions for how creators will be successfully attributed. This will no doubt define the year to come and beyond. While the pace of progress is exciting, foundational AI developers must be shown no special treatment in their work and their empty threats dismissed at every turn.

### Cloud ERP shouldn't be a challenge or a chore

More integrated applications and a streamlined approach mean that making the cloud switch for enterprise applications can deliver rapid ROI.

It's a fact of life that change can be tough, but we all know that the alternative to change could be much, much worse. Today, most CIOs understand they need to move away from on-premises computing to the Cloud in order to take advantage of the value, agility, and accessibility the online world offers. The chance to try things out without high risks, to internationalise, mobilise, flex licences, switch services on and off and gain an on-ramp to the latest innovations is too good to resist. But still, fears over "big bang" migrations and transformation complexity deter business leaders from making the move as quickly and broadly as they would like. The good news, though, is that transitioning to the Cloud, even for core enterprise applications, can now be smooth and predictable.

No more big bang

Let's be clear from the off here: the days of lift-and-shift, big-bang upgrades are gone. Quite understandably, companies don't like the risk, the need for wholesale user and process adaptation, or the long wait for value demanded by this arcane model. Switching simple productivity apps to the Cloud is relatively straightforward, but who wants to wait a year to derive value from a wholesale embrace of cloud for the systems that business operations depend on? Previously, moving to a cloud ERP meant a twin-track waiting game that saw the on-prem system running alongside the cloud-based equivalent. This meant the legacy system was often running for 18 to 24 months (or more) whilst the new ERP big bang was deployed. As a result, customers were paying legacy licence fees as well as the new subscription without deriving any benefit from the latter for at least 18 months.
Now, with technology advancements around low-code development models and open APIs, it is possible to use a more agile model to transition to the Cloud, which enables customers to see more immediate value from a Cloud subscription in a shorter timeframe. From our experience, we have been able to implement Cloud-based applications in short, agile cycles, which has enabled customers to turn off their legacy systems in less than a year – in some cases realising value within three to four months. What has changed? There has been no alchemy. Instead, simply by having many more functions built into industry-specific and role-specific ERP systems, we have been able to remove the need for customisations and allow organisations to adopt standard processes. ERP was once bespoke, but now configuration happens at the API level, with extension kits allowing tweaks without layering on complexity. We have also seen a fundamental change in how we go to market. Today, there is no conflict between the different services offered to customers in the journey to the Cloud. Traditionally, vendors have offered professional services, customer success teams and ongoing support, but these teams often had overlapping service offerings, creating conflicts of interest. By applying rigour to the people and process sides of ERP cloud migration, this no longer occurs.

Land and expand

Instead of big bang, modern ERP is all about a phased 'land and expand' model. Value is extracted quickly and the organic use of more services over time unlocks yet more value on an ongoing basis. We need to take away the guesswork from migrations that have traditionally been risk-heavy projects. We can spell out exactly what we need from people who lead migrations and explain the level of commitment required, how long processes will take and what all this will cost, because we now have a template that is reliable and a plan that is well structured. Let's face it: software companies have been deemed tricky to talk to in the past. Just finding the right person to answer a question was often viewed as difficult. By reducing the number of people with direct customer contact to one or two, we have made that pain go away. Pain starts to dissipate if you kill complexity. We know our customers hate complexity and, in these challenging times, the last thing they need is to make massive changes to how they operate. By automating at every turn and simplifying operational processes, we have eliminated that. ERP cloud migrations do not have to be hard. We can quickly onboard people and drive process change, enabled by pre-packaged software models that deliver real and immediate value. To win business and keep customers onside we need to deliver real value and simplify throughout, with a standard level of excellent service. It starts with asking customers what success looks like for them and working back from there. We know how customers use our software in the Cloud, so there is no excuse for not coaching them to extract optimal value. Aside from meeting market or vertical regulatory needs, we do not need different processes or approaches for different sectors or parts of the world. We need to dispense with all that and focus on delivering value…fast.

### Top 7 Cloud FinOps Strategies for Optimising Cloud Costs

According to a survey by Everest Group, 67% of organisations have more than 60% of their workload on the cloud. This global increase in cloud adoption has given rise to the tricky challenge of managing rising cloud costs.
We call it tricky because of the complex pricing structures arising from the numerous cloud providers. In addition, factors such as lack of cloud spend visibility, cloud cost wastage, suboptimal resource utilisation, and a lack of skilled personnel also pose a challenge to cloud cost optimisation. These challenges are what have led to the advent of FinOps in cloud cost optimisation. Cloud FinOps is a methodology that combines financial accountability with cloud operations and over recent years has emerged as a potent resource to tackle all the challenges mentioned above. Cloud FinOps models help businesses align their financial operations with their business objectives to get visibility into cloud usage and performance. It enables businesses to work with a more collaborative decision-making setup around IT asset usage. This helps finance and IT teams maintain financial accountability for the cloud service, get real-time insights into cloud performance, and build transparency. In this article, we discuss the top 7 Cloud FinOps strategies that can accelerate your cloud cost optimisation efforts.

1. Cloud spend monitoring and tracking

The first step towards building an effective cloud cost optimisation strategy is monitoring and tracking your cloud spending. Cloud cost monitoring is an umbrella term and can refer to tracking any cost incurred by teams, services, instance types, and regions across cloud environments. Monitoring costs against the consumption of resources enables you to keep a check on cost anomalies and avoid any unnecessary expenses. It also helps in identifying the origin of such expenses, as well as other levers, such as unused resources, that can improve cost savings. The underlying benefit of cost monitoring is establishing visibility and accountability among teams, and correlating cloud costs with business impact, which is key to any business.

2. Cost optimisation through automation

Automation holds the key to quality cloud cost optimisation. Automation can do a deep scan of your cloud environment to identify and eliminate cost inefficiencies and surface cost-saving opportunities. It can ease resource usage monitoring, respond to deviations, and take corrective measures when usage exceeds predefined thresholds. Other areas where it is useful are: a) identifying unused or underutilised resources for recycling; b) automating resource provisioning to scale up or down based on demand; and c) dynamic resource tagging and cost attribution. Overall, automation can bolster cloud transparency by mitigating risks related to human error and significantly impact overall cloud cost optimisation.

3. Invest in Cloud governance

Cloud governance is yet another important aspect of your cloud journey. The FinOps Foundation, the leading governing body of Cloud FinOps, states: "Cloud Governance is a set of processes, tooling or other guardrail solution that aims to control the activity as described by the Cloud Policy to promote the desired behaviour and outcomes". Cloud governance entails setting up rules and policies, defining roles and responsibilities, and adhering to cloud compliance requirements. Effective cloud governance can ensure cloud security, efficient cloud deployment, risk management, legal compliance, data security, and overall a high level of cloud cost optimisation.
Other noteworthy benefits are: a) improved visibility and business continuity; b) regulated data access, mitigating security risks; c) optimised cloud resource management; and d) fewer manual tasks through automation.

4. Reserved Instances and Savings Plans

Reserved instances and savings plans are discount pricing models that can deliver significant cloud cost savings in comparison to on-demand pricing. Both allow enterprises to commit to a specific amount of cloud usage in exchange for a significant amount of savings. Each of these plans has unique advantages and should be evaluated according to business needs. By strategically leveraging these options, businesses can lock in lower rates for a specified term, ensuring predictable costs for steady workloads. This approach is particularly beneficial for applications with consistent and predictable resource needs, resulting in substantial long-term savings.

5. Leverage Cloud tagging and labels

Tags and labels are metadata attributes that cloud service providers widely use to attach additional information to cloud resources. They act primarily as key-value pairs, helping companies categorise and administer resources within a cloud environment. By assigning a tag to each cloud resource, it becomes very easy to identify entities such as workload, business unit, or application, thus streamlining your cloud costs. This makes it more convenient to allocate costs, budget, forecast, and establish a culture of accountability. Furthermore, tagging can aid automation and cost reporting, and adds a critical layer to cloud security by identifying sensitive data and workloads subject to compliance regulations.

6. Right-sizing and Auto-scaling

Balance between workload demand and resource allocation is key to success in cloud cost optimisation. While cloud service providers offer hundreds of instance types, you must choose the most appropriate instance type and size for your workloads. This will also ensure that you do not overprovision, which adds to cloud cost waste. Similarly, auto-scaling will ensure that your cloud resources are allocated dynamically according to real-time demand and pre-determined policies. Auto-scaling will scale up your resources during peak hours and scale down during quieter hours, which can be a critical factor in your cost optimisation. Broadly speaking, the benefits of right-sizing and auto-scaling are: a) cost savings; b) dynamic resource provisioning; c) seamless traffic handling; and d) optimised performance.

7. Build a culture of accountability and cost awareness

Finally, a successful cloud cost optimisation journey pivots on bringing together cross-functional teams, each committed to the cause of FinOps. Cloud centres of excellence must be set up to establish FinOps best practices and mutually agree on the current state of cloud health and key performance indicators (KPIs). Raising cost awareness across these cross-functional teams and lines of business is crucial to drive this cultural shift. Assembling a cost-conscious team of dedicated professionals, regularly reviewing the cloud strategy, fostering collaboration, adopting the latest technology, and promoting continuous improvement will help in the long-term success of cloud cost optimisation. A minimal sketch of the kind of automated housekeeping described in strategy 2 is shown below.
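To illustrate the kind of automated housekeeping behind strategy 2, here is a minimal sketch that scans for unattached storage volumes and untagged instances. It assumes an AWS environment and the boto3 SDK purely as an example; the 'CostCentre' tag key is hypothetical, and equivalent APIs exist for other cloud providers.

```python
import boto3

ec2 = boto3.client("ec2")

# Find unattached EBS volumes: storage that is still billed but unused.
unused_volumes = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]
for vol in unused_volumes:
    print(f"Unused volume {vol['VolumeId']}: {vol['Size']} GiB - candidate for removal")

# Flag running instances that carry no cost-allocation tag, so spend can be
# attributed to a team or business unit.
reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]
for reservation in reservations:
    for instance in reservation["Instances"]:
        tag_keys = {tag["Key"] for tag in instance.get("Tags", [])}
        if "CostCentre" not in tag_keys:  # hypothetical cost-allocation tag key
            print(f"Instance {instance['InstanceId']} is missing a CostCentre tag")
```

Run on a schedule, a script like this feeds the spend monitoring described in strategy 1 and reinforces the tagging discipline described in strategy 5.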
Conclusion

The strategies discussed above can bolster your Cloud FinOps initiatives and facilitate a culture that encourages accountability, adaptability, and responsibility. By building a holistic framework, organisations can optimise their cloud costs without compromising performance or scalability. Finally, by embracing these strategies, companies can realise the full potential of their cloud investments.

### Eco-friendly Data Centres Demand Hybrid Cloud Sustainability

With COP28's talking points echoing globally, sustainability commitments and efforts to reduce our environmental footprint are top of mind for businesses, and many companies are choosing to put the latest technology front and centre in their sustainability plans. Hitachi Vantara research from 2023, which surveyed 1,000 businesses globally, endorsed this, finding that the top four steps companies are taking to improve sustainability all involve technology: decarbonising their data centres; harnessing the latest technology solutions to reduce their carbon footprint; shifting to alternative energy sources; and ensuring energy-efficient buildings, plants and equipment. According to the same research, nearly four out of five IT leaders have developed plans for achieving net zero carbon emissions, with 60% saying the creation of eco-friendly data centres is a top priority. Also earlier this year, the International Energy Agency (IEA) found that data centres account for 1-1.5% of global electricity use. Energy is required for data storage, data workloads and compute instances, and data centre IP traffic. Energy is also needed to power cooling and ventilation systems – accounting for around 30% of energy consumption – and to run the building itself. In short, minimise the impact of running the IT infrastructure, and you will greatly contribute to having a more sustainable business.

The importance of the shift to sustainability

But that's not a straightforward task. Even some of the more mature companies on this journey have lessons to learn, and with the story still being written in terms of the growing computing demands of artificial intelligence, there are many unknowns. To better understand why this is so important and where companies are on the sustainability roadmap, the Hitachi Vantara research identified a hierarchy of "eco-data leaders" based on the progress their companies have made building and implementing an eco-friendly data centre. It found that this group is more likely to associate sustainability with value creation, rather than viewing it purely as a regulatory compliance exercise. Building an eco-friendly data centre presents greater opportunities to grow their business, attract investment, meet customer expectations, and optimise costs. On the latter point, large companies (those with a revenue of over $10 billion) can reap the biggest cost savings from becoming eco-data leaders, with their annual data centre operating costs coming in at $9.8 million. By comparison, other companies of similar size spent about $20.2 million – more than twice as much. The benefits of an efficient data centre are therefore clear.

A strategy supported by the cloud

Companies are using a wide range of strategies to make this sustainable change, but the most common is to migrate data to the cloud – a tactic used by 45% of those surveyed in the Hitachi Vantara research. Although many companies are gaining a host of benefits from this approach, enterprises reliant on hybrid cloud solutions must ensure that this technology properly contributes to a sustainable future.
Cloud deployment without a robust strategy could mean carbon emissions become more difficult to track, and that the business has less control over the steps taken to create a sustainable data centre. In order to successfully overhaul data centre infrastructure and data management practices with the support of the cloud, strategies such as eliminating hot spots and excess energy usage, enhancing cooling systems and properly removing electrical waste should be explored. It's steps like these, taken to intelligently optimise workloads in a hybrid cloud setup, that will help reduce energy consumption overall. This transformation will likely be a key focus for organisations in the year ahead, moving from a "nice to have" to an absolute necessity. A sustainable hybrid cloud strategy won't just align with business goals; ultimately, it will drastically lower energy costs and streamline data management operations to improve efficiency, protect resources, and substantially curb environmental impact.

Inspiring innovation and action

It's clear that the stakes are high, and with many benefits available to unlock, it will be the fastest movers that reap the biggest rewards. Looking again at the eco-data leaders, they are the ones most likely to share the responsibility for sustainability across the entire leadership level. That's something that all businesses can learn from: data centre sustainability should be a team effort. The entire C-suite should contribute to the strategy, as well as outside experts who can help map out options using modern hybrid technologies. All this should be a process driven by a thirst for innovation – not just to tick the compliance box. So, although cloud migration may feature in your plans to build a sustainable data centre, there should also be thought given to how emissions will be tracked and how improvements can be implemented on-premise. For example, can the business use smarter approaches to make cooling systems more efficient and reduce hot spots, or can it increase the use of high-performance network-attached storage (NAS), which allows data storage and retrieval from a central location? There is no "one and done" approach. Achieving the optimal balance of workload placement, which will change as business requirements change, and maintaining a flexible operational approach across multiple environments is critical. Technology choices are key when they provide the agility to enable innovation whilst also meeting immediate needs for flexible consumption that supports sustainability goals. Top management already recognises the need for change, but they are in various states of making that change happen. Looking to 2024, organisations should remember that this transformation not only presents a golden opportunity to create a better and more sustainable business, but also to get ready for a future that will continually be transformed by technology.

### The Path to Cloud Adoption Success

As digital transformation continues to be a priority for businesses, forecasts suggest that European spending on public cloud services will rise to a total of $142 billion this year. With this number predicted to reach $291 billion by 2027, the industry is showing no signs of stopping. To handle such dramatic growth, businesses must prepare themselves with the knowledge and tools needed to guarantee a successful cloud adoption journey, and thus achieve continued business productivity and unlock the potential for innovation.
The benefits of cloud migration, and why businesses struggle

Embracing cloud technology unquestionably offers numerous advantages to businesses, notably enhancing scalability, efficiency, agility and innovation, whilst also decreasing costs and risks. While some businesses are constrained by their dependence on on-site data centres, making the shift to the public cloud paves the way for a previously unattainable speed of business innovation. This, in turn, helps companies meet their goals by creating new value propositions and revenue models, opening new markets or segments, and launching new products or services. Such a level of innovation and agility is becoming paramount in a perpetually competitive business landscape and is fundamental for superior customer experience, acquisition and retention. Despite the overarching benefits, it is unfortunately not always easy for businesses to migrate to the cloud. Companies often jump headfirst into cloud migration, eager to achieve the benefits, without taking the time to consider the details needed to properly see the project through. The result? Substantial technical debt and an inadequate cloud setup, which will undoubtedly cause technical challenges and higher costs down the line. Businesses must prioritise carefully and be thoughtful about their cloud migration journey to be successful. This includes an understanding that cloud adoption should not be a simple 'lift and shift,' but rather be viewed as an opportunity to move and improve their infrastructure and ensure continued business innovation and cloud productivity in the future.

Practical ways to start a cloud-based project

There are four key steps to bear in mind during a migration project: assess, mobilise, migrate and modernise, and optimise.

Assess: Assessment is required to gain a thorough understanding of your current on-premise environment. This will result in a list of the various workloads and interdependencies between them, the complexity and impact of the migration, and an understanding of who owns each workload, all of which will help prioritise the order in which they are migrated.

Mobilise: Businesses then need to build a cloud migration plan that will prioritise the workloads to migrate and how they will be migrated according to the '5 Rs': rehost, refactor, re-platform, rebuild and replace.

Migrate and modernise: Companies need to follow through on the migration plan, starting slowly and going step by step. By taking time to learn from each migration, teams can continue projects with increased expertise each time. An awareness of computing costs is crucial during this phase, as it's likely that they'll rise due to workloads running in parallel and teams learning to operate in the new environment.

Optimise: Finally, start optimising the migrated and modernised workloads in relation to cost, security, performance, reliability and maintainability. This is essential to guarantee a return on investment immediately and on a long-term basis, whilst also increasing developer productivity and supporting business operations as they migrate from on-site premises to the cloud.

The importance of a strong, reliable team

Many companies understandably focus on the technical aspects of cloud migration, but a critical part of any cloud journey is to make sure the team is along for the ride – they need to have the time to learn, succeed, fail and try again. Typically, if there are changes to the product structure, there will be changes to the organisational structure as well.
Any successful cloud migration starts with a strong driving committee and a core cloud centre of excellence. This group is made up of first adopters, who can learn, make decisions and implement changes quickly. Their initial work will focus on leading the assessment, mobilising and migrating pilot workloads. Gradually, the members of the cloud centre of excellence will become internal change influencers and enablers for the broader team. This is key to scaling the working team and completing the migration to the cloud within relatively short time frames. Another key success factor is assessing the current team's capabilities and creating training programmes accordingly. Following through on these plans in parallel with a phased migration approach will allow team members and leaders to learn about the various cloud services and best practices and implement them during the migration and modernisation phase.

Choosing the right partner

Cloud migrations are complex endeavours, and choosing the right partner for the journey is critical to the success of this type of project. It's therefore vital that you choose a partner whose cloud architects can support the entire migration process, from assessment to optimisation, and from guidance and on-the-job migration and modernisation training to, finally, helping reduce costs and improve operations in the optimise phase. By doing this, your team will be able to operate and make changes to your cloud environment independently. This will help speed up innovation and reduce cost and complexity, without the usual dependency on third-party providers or a need to go through complex SOW negotiations every time you want to wow your customers.

### What You Need to Know about NIS2

The Network and Information Security (NIS) Directive was designed to strengthen cybersecurity in organisations across European Union (EU) states. That directive is now being updated. The NIS2 legislation aims to increase cyber resilience and improve the EU's level of preparedness in managing cyberattacks. Organisations within scope must act promptly to comply with NIS2, as well as to bolster their defences against cyber risks. Indeed, as the threat landscape continues to evolve and cyberattacks become increasingly sophisticated, compliance with the updated NIS2 legislation is critical. The UK government's 2023 cybersecurity breaches survey, which asked UK organisations about cyberattacks they had experienced in the preceding 12 months, estimates that across all UK businesses, approximately 2.39 million cybercrimes occurred during that period. This staggering volume of cyber incidents shows — if proof was needed — just how essential effective security measures are today. A collective approach, guided and enforced by stricter regulations and penalties, is vital to mitigating some of the most pressing cybersecurity challenges today.

What is NIS2?

NIS2 is an update to the original NIS directive, which was introduced in 2016. The new incarnation sets out to modernise the legal framework in line with increased digitisation and the evolving cyber threat landscape. It also extends the scope of the cybersecurity legislation to include more organisations and entities. NIS2 came into force in January 2023 and must be incorporated into the national laws of EU member states by 17 October 2024. UK organisations within scope must comply with the directive if they operate in the EU.

What does NIS2 mean for companies?

NIS2 aims to harmonise and strengthen cybersecurity across Europe.
It bolsters and streamlines reporting requirements through a risk management approach. It also introduces the need to assess cybersecurity risks in a company's supply chains and supplier relationships. NIS2 also emphasises improving cybersecurity awareness by exchanging relevant information among impacted organisations and any necessary third parties in their supply chains. This information includes threats, vulnerabilities, tactics, techniques and procedures, and indicators of compromise. Notification obligations have also been extended.

Which companies fall under the scope of NIS2?

The NIS2 legislation applies to 'essential entities' in energy, transport, finance, health, water, public administration, space and digital infrastructure. It also applies to 'important entities' in postal services, waste management, chemicals, food, research, manufacturing and digital providers. All entities must meet the same requirements, but different supervisory measures and penalties apply. However, NIS2 is an opportunity for all organisations to reappraise their cybersecurity approaches and move from reactive to proactive risk management. It also provides a framework against which companies can effectively assess their operations and capabilities. The directive extends consideration beyond organisations themselves to include supply chains and partners, making now the ideal time for all organisations to expand cybersecurity thinking to whole ecosystems. Cyber threats will continue to intensify and become more sophisticated, so organisations should take every cue to expand their awareness and increase their measures. Laws and regulations in this area will continue to become more stringent and far-reaching, so businesses must equip themselves now to avert a potential compliance debt that would be difficult to overcome in the long term. Through NIS2 compliance, organisations can adopt an optimal security posture to protect their data, identities and infrastructures.

What must companies do to be NIS2 compliant?

All organisations must have cybersecurity strategies and plans that they review regularly and maintain. An effective approach for achieving this is to use the NIST Cybersecurity Framework (CSF) as a baseline. This framework was developed by the US National Institute of Standards and Technology and defines five pillars: identify, protect, detect, respond and recover. To implement defence in depth as recommended by NIST, organisations need strong identity and access management (IAM) and data access governance (DAG). IAM solutions help govern identities and limit access to data, systems and applications according to the principle of least privilege to prevent inappropriate access. Strict password policies and rigorous control of privileged accounts are also vital components of IAM. DAG solutions, meanwhile, help organisations identify what sensitive data they have, where it is stored and who has access to it, so they can put effective measures in place to protect it. DAG extends to continuous auditing to spot any suspicious activity around this data, such as changes to identities, system integrity or security status. Any of these could indicate an attack in progress. Prepared companies will have customisable, automated response options in place so they can immediately block active threats to minimise their impact and enable rapid recovery. A minimal sketch of the kind of least-privilege review that underpins IAM is shown below.
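As a concrete illustration of the least-privilege reviews that IAM depends on, here is a minimal sketch that flags user accounts holding blanket administrative rights. It assumes AWS IAM and the boto3 SDK purely as an example; the policy name checked is specific to AWS, and equivalent audits apply to any identity platform.

```python
import boto3

iam = boto3.client("iam")

# Review direct grants of blanket administrative rights against the
# principle of least privilege.
over_privileged = []
for page in iam.get_paginator("list_users").paginate():
    for user in page["Users"]:
        attached = iam.list_attached_user_policies(UserName=user["UserName"])
        for policy in attached["AttachedPolicies"]:
            if policy["PolicyName"] == "AdministratorAccess":
                over_privileged.append(user["UserName"])

for name in over_privileged:
    print(f"Review account '{name}': AdministratorAccess is attached directly")
```

In practice a check like this would run continuously, with its findings feeding the automated response options described above.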
Conclusion

NIS2 updates cybersecurity requirements for organisations operating in the EU. Companies must take advantage of all solutions at their disposal to identify and mitigate cybersecurity risks. Regulatory compliance is non-negotiable for covered entities, of course, but all organisations can treat the revised directive as an opportunity to improve their cyber hygiene, understand the risks associated with their extended supply chain and ecosystem, and protect their cyber infrastructure against constantly evolving threats.

### CIF Presents TWF - Silvia Cambie

In this episode of our weekly show, TWF! (Tech Wave Forum), we are talking to an AI evangelist and Partner at Wipro Consulting within Talent & Change. Our host and CIF CEO David Terrar will be interviewing Silvia Cambie. She's a published author, has a journalism background, and is an ex-IBMer. We'll explore the importance of linking AI to your digital and data strategy, and we'll zoom in on some exciting use cases.

### CIF Presents TWF - Tony Lock

In this episode of our weekly show, TWF! (Tech Wave Forum), we're talking to a well-known analyst with a deep knowledge of the tech landscape and its history. Our host and CIF CEO David Terrar will be interviewing Tony Lock of Freeform Dynamics. We'll talk about some of the history, good and bad, in the development of the cloud technology landscape, and see what lessons we learned that might apply to the explosion (and hype) around AI that we're all experiencing since ChatGPT came on the scene 15 months ago. Like David, our guest has direct experience of a lot of technology changes over decades.

### CIF Presents TWF – Harry Berg

In this episode of our weekly show, TWF! (Tech Wave Forum), we'll be talking about both the tech startup scene and the explosion in AI technology we are all experiencing, with somebody who connects the two. Our host and CIF CEO David Terrar will be interviewing Harry Berg, who has already exited two startups, and started his third a few months ago. His new venture Parapet turns global Ivy League graduates into technical founders. We'll get his view on where AI is headed next, and find out more about his community of graduate entrepreneurs.

### CIF Presents TWF – Sue Black

In this episode of our weekly show, TWF! (Tech Wave Forum), we'll be talking to one of the world's top 50 women in tech, according to Forbes. Our host and CIF CEO David Terrar will be interviewing Professor Sue Black OBE about some of her history, how she saved Bletchley Park, how she got into software engineering, her role as a technology evangelist, her research on bias in AI, and she'll have plenty to say about equality, diversity, and inclusion in our space.

### The Pros and Cons of Using Digital Assistants

The ongoing debate about artificial intelligence taking jobs is not going away anytime soon. The introduction of large language models (LLMs) such as ChatGPT has demonstrated capabilities akin to those of developers, copywriters, and many professions in between. In fact, Goldman Sachs estimates that generative AI could expose the equivalent of 300 million full-time jobs to intelligent automation, further accelerating fears of major disruption and resistance to change among workers. At the same time, recent research revealed that many employees are already embracing intelligent automation, with 60% of IT leaders crediting AI with an increase in higher-value work, 62% reporting happier staff, and 59% saying employees are more innovative. So, could the fears of displacement by LLMs be unfounded, and should we be welcoming our new digital coworkers with open arms?
Let's look at the pros and cons of digital assistants -- defined as autonomous, AI-driven software built to augment humans and robotic process automation (RPA) systems for specific tasks and purposes -- and whether they are better co-workers than humans.

Pro: They are easier to fine-tune. It can take months to train a new staff member to get them up to speed with how the company operates and familiarise them with internal policies and processes. Not only is it burdensome for the people doing the training, but it takes staff away from their regular duties, often causing friction and delays with the workload. A digital worker, on the other hand, can be built for the specific context of the job, such as invoice processing or reviewing healthcare referrals, so it can hit the ground running. It's the perfect newcomer because it can quickly assimilate vast amounts of information without needing any breaks to increase focus or relieve stress. Better still, your new co-worker will continue to teach themselves on the job, thanks to machine learning capabilities.

Con: You can't have the new AI on the block make coffee or run errands, and they won't improve the diversity of the team. The benefits of a diverse team are widely known, such as better problem solving, creativity, and innovation through a variety of backgrounds and perspectives. In fact, according to research by McKinsey, diverse companies are 21% more likely to have above-average profitability.

Pro: They are versatile. Specific tasks and responsibilities of digital employees can vary based on the objectives set by the manager. Typical digital employees can support customer inquiries, utilise corporate knowledge bases and training materials, and navigate enterprise information systems such as help desk, CRM, and ERP databases that are vital for their duties. They can also undertake intellectual tasks such as data analysis and problem-solving and adhere to company policies and procedures, including privacy and confidentiality. For humans, the great benefit is handing over boring or repetitive tasks such as data entry or chasing elusive information. A recent survey showed that a quarter of employees waste eight hours a week -- that's a whole day -- searching for information to serve customers, with business-critical data often locked within digitised documents including PDFs, Excel sheets, emails, images, text messages and chatbot conversations.

Con: While digital assistants can adapt with machine learning, humans are far better equipped to navigate difficult circumstances using soft skills such as empathy, compassion and other forms of emotional intelligence. An automated agent simply doesn't have the same level of creativity and critical thinking skills as the human brain -- yet.

Pro: They're faster and rarely make mistakes. With reduced manual data entry, the likelihood of errors and data discrepancies is minimised. For example, invoice processing times can be reduced by 90% using digital assistants. Purpose-built AI models can also help enforce compliance with company policies, vendor agreements and regulatory requirements, meaning fewer breaches and increased security. Your friendly intelligent agent is also available 24/7 to support any flexible working arrangements for a better work-life balance. They're also happy to take on extra workloads, which can help you become more productive and save costs for the company.
Also, there's less chance of personality clashes, which recent research suggests cause 50% of workplace conflict. Many AI models can generate ideas and recommend changes in processes for greater efficiency, but you won't need to worry about them being offended if you turn down their proposals.

Con: Interpersonal relationships are seen as a huge part of any successful working environment. Having that sense of camaraderie, empathy and friendship at work goes a long way towards enhancing mental health. While AI can't currently simulate complex human personalities, it can free up more time for relationship-building among humans by reducing their workloads.

Finding the Balance: Augmenting Humans with Digital Assistants

There are many advantages to bringing digital assistants into the workplace, especially with businesses under pressure to accomplish more at faster speeds while using fewer skilled resources. However, there is no replacing humans for emotional support, motivation and tasks that require empathy and complex communication. Critical thinking and adaptability can only go so far when instilled through programming and AI. I believe we need to utilise the skills and unique strengths of both humans and digital assistants to work together in harmony for a balanced and more productive, efficient work environment.

### Three Reasons Cyberattackers Target the Cloud

Malware, ransomware attacks and sophisticated phishing campaigns are all serious problems across cloud, on-prem or hybrid environments. Our reliance on the cloud in particular has created bigger, easier-to-penetrate, and more rewarding targets for cyberattacks. Defending against security threats in the cloud is complex, and traditional security technologies and approaches simply aren't enough against the continuous waves of attack. The way threats manifest in the cloud is different for three main reasons:

1) Complex configuration. Attackers know that cloud infrastructure is hosted in specific silos, owned and managed by the big cloud providers. They also know that these accounts are regularly misconfigured, which can lead directly to breaches, as was the case for Capital One and Uber (a simple check for one such misconfiguration is sketched after these three points). One of the most severe cloud misconfigurations occurred in 2022, where 1.5 billion files pertaining to airport workers were exposed, potentially putting lives at risk.

2) Lack of designed-in security. The development of a cloud deployment is often undertaken at breakneck speed, and without the same level of involvement of the IT team who would have been more invested in the development of an on-prem security solution. Instead, the responsibility for a cloud environment often sits initially with DevOps teams, whose core concern is making fast, incremental steps toward launching. If security is not built into these incremental steps, it leaves the end product vulnerable.

3) High volume, high speed. The sheer volume at which these attacks can take place sets the cloud apart. When organisations place their trust in some of the biggest cloud providers in the world, like Amazon or Azure, they do so in good faith that their data is safe. However, in practice, the very thing that gives the cloud its obvious business benefit also leaves it vulnerable. Cloud computing's big benefits are in its speed and agility, but those are just the things that hackers can turn to their advantage by applying the power of the cloud against an increasingly accessible attack surface.
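As a concrete illustration of the misconfiguration problem described in point 1, here is a minimal sketch that flags S3 buckets whose ACLs grant access to public groups. It assumes boto3 is installed and credentials able to list buckets and read bucket ACLs are available, and it covers only one class of misconfiguration among many.

```python
# Minimal sketch: flag S3 buckets whose ACLs grant access to public groups.
# Assumes boto3 and AWS credentials that can list buckets and read bucket ACLs;
# this checks only one misconfiguration class among many.
import boto3

PUBLIC_GRANTEES = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def publicly_accessible_buckets():
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        acl = s3.get_bucket_acl(Bucket=bucket["Name"])
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") in PUBLIC_GRANTEES:
                flagged.append((bucket["Name"], grant["Permission"]))
    return flagged

if __name__ == "__main__":
    for name, permission in publicly_accessible_buckets():
        print(f"Bucket '{name}' grants {permission} to a public group - review it")
```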
Traditional controls such as firewalls and IDS are not as effective in a cloud environment, due to the increasingly dynamic nature of workloads, the complex management requirements and the static nature of their rulesets. This means traditional security stacks fail to keep pace with the highly dynamic world of cyberthreats. Usually, the first pass in securing the cloud environment is centred on access control and secure configurations. This isn't enough, and the breach landscape reflects this. The Thales 2023 Cloud Security report found that more than a third (39%) of businesses have experienced a data breach in their cloud environment over the past year.

As a result of this failure of protection and appropriate configuration, things fall through the cracks. This is understandable when the best practices prescribed by the large cloud providers often offer different and conflicting advice, leading to the widespread security failings we currently see in the cloud. With many organisations likely to be using multiple cloud providers, this failure to unify best practices across cloud platforms leaves gaps and creates confusion. In this confusion, people will seek workarounds, which in turn lead to mistakes being made: Thales also found that human error is the leading cause of cloud breaches, identified by 55% of respondents.

The dramatic shift by cybercriminals towards automated attacks will continue, increasing the level and potency of attacks in cloud environments. The sad reality is that this ever-compounding increase in attacks makes the technology sets commonly used for cloud defences insufficient. Even if detection and response times are improving, attackers are moving laterally in the network and establishing a foothold within an ever-shorter time window.

A new way of safeguarding the future, powered by intelligence

The way to combat such a daunting threat landscape is with intelligence-powered defensive capabilities. Threat intelligence provides the knowledge and context to identify threats as they appear in the network, and technology solutions now offer the ability to apply that intelligence in real time to prevent attacks. Enterprise cloud adoption is moving at such a rapid speed that security is sometimes an afterthought for the enterprises deploying it. And, unfortunately, the attackers targeting your cloud infrastructure also move with increased speed and frequency. The only way to stay up to date with constantly evolving and shifting attack vectors is to work with teams who can ensure that you are empowered with consolidated real-time threat intelligence derived from multiple reliable sources.

Aggregated, contextualised threat intelligence from multiple feeds gives you access to a massively increased data set of known attacks, giving security teams a better chance of determining and defending against a security incident. This context, if properly deployed, buys a security team time by acting as a buffer to malicious activity, and allows security teams to regain some control of their environments without impacting the flexibility and performance of their cloud infrastructure.
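A minimal sketch of what that aggregation step could look like, assuming two hypothetical indicator feeds and a list of observed source IPs; a real deployment would pull live feeds and act on matches automatically rather than printing them.

```python
# Minimal sketch: aggregate IP indicators from multiple (hypothetical) feeds,
# keep the context from every feed that reported them, and check observed
# traffic against the merged set. Feed names, tags and IPs are illustrative.
from collections import defaultdict

FEEDS = {
    "feed-a": [{"ip": "203.0.113.10", "tag": "botnet-c2"}],
    "feed-b": [{"ip": "203.0.113.10", "tag": "credential-stuffing"},
               {"ip": "198.51.100.7", "tag": "scanner"}],
}

def aggregate(feeds):
    """Merge indicators so each IP carries context from every feed that saw it."""
    merged = defaultdict(set)
    for source, indicators in feeds.items():
        for indicator in indicators:
            merged[indicator["ip"]].add((source, indicator["tag"]))
    return merged

def check_traffic(observed_ips, intel):
    """Return only the observed IPs that match known-bad indicators."""
    return {ip: intel[ip] for ip in observed_ips if ip in intel}

if __name__ == "__main__":
    hits = check_traffic(["203.0.113.10", "192.0.2.50"], aggregate(FEEDS))
    for ip, context in hits.items():
        print(f"{ip} matched threat intelligence: {sorted(context)}")
```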
Cloud computing represents the new norm for solution deployment and will continue to expand at an unprecedented rate. The speed and flexibility that it brings provide enormous benefits for enterprise IT infrastructure. However, unless we accept that existing security stack and threat detection technologies are ill-matched to the growing attack surface and threat vectors in the cloud, we may find that the technology provides us with more problems than solutions. Only by leveraging the vast threat intelligence being gathered by security researchers and applying it in real time can we hope to stem the tide of crippling cyberattacks.

### Managing Private Content Exposure Risk in 2024

There is no doubt that managing data privacy and compliance risks becomes increasingly difficult year on year. Cybercriminals continue to evolve their strategies and approaches, making it more difficult to identify, stop, and mitigate the damages of malicious attacks. This has made managing the privacy and compliance of sensitive content communications a difficult undertaking for many businesses and has led to several serious data breaches this year. But what are the key issues to look out for in 2024?

The risk of AI LLMs

Despite bans and restrictions, the number of employees and third parties using generative artificial intelligence (GenAI) large language models (LLMs) continues to rise as the competitive advantage they offer becomes too significant to ignore. Unfortunately, this will lead to the threat surface expanding in 2024 and potentially to sensitive content being exposed. Even with advances in security controls, data breaches stemming from GenAI LLM misuse will only rise in 2024. High-profile examples threatening customer trust and drawing regulatory scrutiny are likely. This will force data security to be a central part of GenAI LLM strategies moving forward. Organisations slow to adapt will face damage to brand reputation, lost revenue opportunities, potential regulatory fines, and litigation costs.

A shift of approach for MFTs

Managed file transfer (MFT) tools are used in many businesses for the digital transfer of data in an automated, reliable, and supposedly secure manner. However, many are based on decades-old technology with inherent security deficiencies. This has led to a spiralling escalation of cyberattacks on the software supply chain over the past few years by rogue nation-states and cybercriminals. Two major MFT tools experienced zero-day exploits in 2023. In both instances, multiple zero-day vulnerabilities were targeted: a remote code execution (RCE) in the case of Fortra GoAnywhere that impacted over 130 organisations, and a SQL injection in the case of MOVEit that affected over 2,000 organisations and 62 million individuals. If the MFT attacks of 2023 are any indication, rogue nation-states and cybercriminals will continue to exploit zero-day vulnerabilities in legacy MFT solutions in 2024.

Email will remain a major attack vector

Email remains the number one attack vector and shows no sign of losing its place. Malware attacks instigated through email shot up 29% in the past year, while phishing attacks also grew 29% and business email compromise (BEC) spiked 66%. In fact, more than eight in ten data breaches now target humans as their first line of access using social engineering strategies. As with legacy MFT solutions, many legacy email systems lack modern security capabilities. Until organisations embrace an email protection gateway where email is sent, received, and stored using zero-trust policy management with single-tenant hosting, email security will unfortunately remain a serious risk factor.
Morphing regulatory standards

Regulatory bodies will continue evolving data privacy regulations in 2024 and continue to ratchet up their fines. Recent major fines, like those against Marriott and British Airways under GDPR, were in large part due to lapses in data security. This precedent indicates regulators will come down hard on any company that negligently exposes personal data. In 2024, businesses will, more than ever, need to track and control content access and generate audit log reports to demonstrate compliance. This pressure is not going to go away. In fact, Gartner predicts that personal data for three-quarters of the world's population will be covered by data privacy regulations by the end of 2024, and that the average annual budget for privacy in a company will rise to over $2.5 million.

Rising importance of data sovereignty

The need for increased data localisation is a growing trend that will make data sovereignty a challenge for organisations in 2024. Many emerging privacy laws require organisations to control the country where data resides, which can prove to be a significant challenge. At the same time, data democratisation, the practice of making data accessible and consumable for everyone in an enterprise regardless of technical skill, is a trend that will also impact data sovereignty. Data sovereignty empowers organisations to maintain compliance with local and international data regulations, which minimises legal risks, establishes a reputation for responsible data handling, and helps companies avoid hefty fines. By prioritising data sovereignty, organisations can build trust with customers and stakeholders, enhance brand reputation, and avoid costly legal issues.

The increased use of DRM to protect sensitive content

Challenges surrounding the handling of large files containing sensitive content will become increasingly pressing for organisations in 2024. Digital rights management (DRM) adoption will accelerate as organisations aim to protect sensitive content with robust solutions to ensure they can comply with expanding regulations. In 2024, data classification and DRM policy management will drive organisations large and small to institute data protection using least-privilege access and watermarks for low-risk data, view-only DRM for moderate-risk data, and safe video-streamed editing that blocks downloads and copy-and-paste for high-risk data.

Time to hit the reset button

In 2024, businesses will be under heightened strain to protect confidential data amidst escalating cyber threats and to ensure adherence to burgeoning international regulatory standards. It is time for organisations to look at alternatives. By adopting zero-trust architectures, detailed content-based security models, strong access management, integrated DRM, DLP, and other leading-edge security measures, organisations can mitigate risks and uphold compliance. It is time for organisations to hit the reset button on their sensitive content communication strategies and work to ensure they have the right technologies in place to protect their communications.

### Generative AI: Three CXO challenges and opportunities

Infrastructure, data governance and culture change provide CXOs with a leadership step change, writes Rowen Grierson, Senior Director and General Manager at Nutanix. It is clear that generative AI will be a major step change in enterprise technology. For technology CXOs, step changes pose as many business challenges as they do opportunities.
If organisations are to benefit from artificial intelligence (AI), then they need a technology infrastructure fit for purpose, a new culture, governance - especially around data - and a new relationship with technology. The hype around AI has led to technology leaders, once again, being in the spotlight. Technology CXOs are expected to navigate the adoption of AI across the organisation; a Vanson Bourne study finds that 90% of organisations have made AI a priority, and technology analysts Gartner stated at their recent Symposium event that 51% of CEOs expect the CIO to lead their AI strategy. Technology CXOs are trusted to ensure AI is a business success.

Creating that success means CXOs have to analyse and implement AI where it will make a difference to the typical operations of the business, but also where AI could completely change the business. AI will, therefore, do two things to organisations: it will accelerate the automation of everyday tasks, cutting the number of manual and repetitive - and typically not value-adding - tasks that skilled team members do. But AI is more than an automation tool, as has already been seen with its ability to diagnose illnesses from large data sets, predict rogue waves that can sink ships or communicate on a human level; AI will create new business models.

CXO Focus 1: AI needs infrastructure

These opportunities will only be achieved in organisations that have the technology infrastructure in place. Our Vanson Bourne study found that many organisations are yet to determine which technology environment is best suited to run the different parts of an AI process and workload. Part of that challenge is that organisations are yet to decide which AI applications are most suited to their business and vertical market. This is understandable, as the pace of development is rapid, and GPT5 is already on the horizon. Just as with mobile and cloud computing, AI will trigger a wave of technology infrastructure modernisation. For organisations to extract business value from AI, they will require an interconnected data environment. So it is no surprise to learn that more than half of CXOs say they need to improve their data transfer abilities between multi-cloud, data centre and edge computing environments.

The same study finds that most CXOs cannot pinpoint the infrastructure modernisation plan needed to support AI workloads. At present, 63% of organisations are deploying AI on virtual machines, and a similar number (62%) have deployed AI in a container environment. Investment in and modernisation of technology infrastructure is going to become a continual, long-term programme for technology leaders in order to meet the expectations of AI users. That is already being borne out; our study found that 85% of organisations plan to increase their infrastructure modernisation over the next one to three years in order to support AI initiatives. A similar number (84%) plan to increase the headcount of data engineering and data science teams, with AI prompt engineers joining their ranks. This gold rush towards AI-optimised organisations will not remove the need for technology CXOs to be budget-conscious. We find that 90% of CXOs, unsurprisingly, expect their IT costs, and cloud costs in particular, to increase as a direct result of AI implementations.

CXO Focus 2: Governance

Tasked with leading the AI opportunity for organisations, technology CXOs are also worried about the implications of AI.
The majority (90%) are concerned about data security, governance and data quality. If organisations are not already confident about their data governance, then they may fall behind in the AI race. The Digital Leadership Report by Nash Squared, one of the most influential temperature checks of global CXOs, finds that only one in four digital leaders is very effective at using data insight; it adds that many organisations are still having problems defining a basic data strategy. Technology CXOs tell us that over the next two years, data modelling and data security governance challenges will be high on their agenda. Over half (51%) say that data protection and disaster recovery will join their AI governance plans. This tallies with the findings of the Digital Leadership Report, which states that 36% of CXOs are concerned about data privacy being compromised by generative AI implementations. A quarter of CXOs are also worried about hallucinations in the data, which could impact customers and, therefore, the brand value of the business.

Given these challenges, it is not surprising to see technology CXOs opting for pre-built large language models (LLMs). These existing models from trusted providers, including AWS, will enable CXOs to increase the speed to market for AI solutions and may provide ways to increase the utilisation of existing resources. Given that many CXOs tell surveys they have concerns about their own data, buying LLMs provides room and time to build without delaying the use of AI.

CXO Focus 3: Culture change

“Technology is the easy bit” is a common refrain among CXOs; the real challenge for business technology leaders and their organisations is the culture change, and the same will be true of generative AI. The conversational and wide-ranging abilities of generative AI make it both an easy tool for end-users to engage with and an increased security risk. Employees can upload intellectual property or customer data with ease, instantly exposing the business to regulatory and market risk. That same conversational ability will herald a new relationship between business end-users and technology. At the Symposium, Gartner technology analysts advised CXOs to consider AI not as a technology to be implemented in the traditional sense but as a co-worker that, just as with any colleague, has to be interviewed and coached on the culture of their team and organisation.

AI will be a step change in enterprise IT. It will, therefore, also be a step change in technology leadership. To achieve AI success, organisations will need the right infrastructure, governance and culture in place. These three require an understanding that AI is a business issue and not one solely for IT to resolve. Therefore, enterprise AI adoption will be a marathon, not a sprint.

### CIF Presents TWF – Hutton Henry

In this episode of our weekly show, TWF! (Tech Wave Forum) we're talking to someone who works in the field of mergers and acquisitions in the technology sector. Our host and CIF CEO David Terrar will be interviewing Hutton Henry, who will talk about technology due diligence, his back story, and how he's built up his expertise and approach. We'll ask how he assesses a business, what he thinks about the impact of AI, and we'll definitely learn something from his particular viewpoint on what makes a successful tech company.
### Hybrid IT Infrastructure: Integrating On-Premises and Cloud Solutions

Blending traditional on-site technology with cloud services creates a dynamic hybrid IT setup that is both agile and robust. Hybrid IT weaves together on-site and cloud solutions, giving companies a tailored setup that is secure and scales with ease. Modern businesses are adopting a hybrid approach, leveraging a combination of on-site and cloud computing to achieve a balance of cost, flexibility, and security. Hybrid IT integrates on-site data centres, private clouds, and public cloud services, striking a harmonious blend between control and flexibility. A well-balanced IT infrastructure aligns with the dynamic needs of contemporary businesses. This shift is driving the increasing popularity of services offered by cloud consulting firms, meeting the evolving demands of organisations.

Understanding Hybrid IT Infrastructure

Understanding hybrid IT means seeing how it fuses local data-centre control with cloud-based agility, all while keeping costs and security in check. Hybrid IT is not just a technological choice; it's a strategic business decision that influences how organisations operate and compete. With flexibility at the helm, companies can tailor their IT systems to their specific requirements. Scalability is another crucial aspect; cloud resources can be rapidly scaled up or down in response to changing demands. Cost-effectiveness comes from the ability to leverage cloud services for variable workloads while maintaining core systems in-house. A hybrid model also strengthens security by keeping the most sensitive data on-site while routine workloads run in the cloud.

Key Considerations for Integration

Integrating on-premises and cloud solutions in a hybrid IT environment requires careful planning and consideration. Assessing the specific business needs and IT requirements is the first step in this integration. Deciding which workloads should stay put and which should move to the cloud means examining your applications and data with a critical eye. Security and compliance are paramount in any IT infrastructure, and more so in a hybrid environment. Companies have to follow data protection rules carefully so that their IT systems stay within industry and legal standards, and must tighten both their cloud and on-premises security measures to meet stringent data protection requirements.

Strategies for Successful Integration

A successful move to a hybrid IT environment often follows a phased approach. It starts with identifying the right cloud services, whether Infrastructure as a Service (IaaS), Platform as a Service (PaaS), or Software as a Service (SaaS), to complement existing on-premises infrastructure. When picking cloud services, companies need to think hard about what they actually need and what they can manage. Interoperability through standardised protocols enables effective system communication and integration, and APIs can integrate systems by letting them share data smoothly. Managing and optimising a hybrid IT infrastructure is an ongoing process: automation and orchestration tools help cut through the complexity, and monitoring your systems lets you spot what is performing well and pinpoint areas ripe for improvement, keeping everything running smoothly without busting the budget.
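A minimal sketch of that monitoring-and-optimisation loop, assuming utilisation figures have already been collected from both on-premises and cloud resources; the records, names and thresholds below are illustrative rather than taken from any particular monitoring product.

```python
# Minimal sketch: review utilisation figures gathered from on-premises and
# cloud resources and flag right-sizing candidates. Records and thresholds
# are illustrative; a real setup would pull them from monitoring tooling.

RESOURCES = [
    {"name": "onprem-db-01", "location": "on-prem", "cpu_avg": 78, "cpu_peak": 95},
    {"name": "cloud-web-03", "location": "cloud", "cpu_avg": 9, "cpu_peak": 21},
    {"name": "cloud-batch-07", "location": "cloud", "cpu_avg": 64, "cpu_peak": 88},
]

def right_sizing_candidates(resources, low=15, high=85):
    """Flag resources that look idle (scale down) or saturated (scale up or rebalance)."""
    findings = []
    for resource in resources:
        if resource["cpu_avg"] < low:
            findings.append((resource["name"], resource["location"],
                             "under-utilised - consider scaling down"))
        elif resource["cpu_peak"] > high:
            findings.append((resource["name"], resource["location"],
                             "near capacity - consider scaling up or rebalancing"))
    return findings

if __name__ == "__main__":
    for name, location, advice in right_sizing_candidates(RESOURCES):
        print(f"{name} ({location}): {advice}")
```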
Juggling workloads across on-site hardware and the cloud is essential for maximising efficiency without overspending.

Future-Proofing with Hybrid IT Infrastructure

In lieu of specific case studies, it's important to discuss the concept of future-proofing in the context of hybrid IT infrastructure. Businesses must strategically adapt their IT to support changing needs.

Adapting to Technological Advancements: Hybrid IT infrastructure allows businesses to seamlessly adopt new technologies. As new cloud services emerge, companies can weave them into their current setup without overhauling the whole system. Staying ahead in a rapid technology race demands quick adaptation without turning your entire IT estate upside down.

Catering to Evolving Business Models: Modern business models are increasingly data-driven and reliant on real-time analytics. A hybrid IT setup, with its combination of on-premises control and cloud-based agility, is well suited to support these evolving requirements. Businesses can swiftly adjust their resources, ramping them up or dialling them back in response to the ebb and flow of customer needs and the ever-changing market landscape.

Enhancing Disaster Recovery and Business Continuity: A key aspect of future-proofing is ensuring business resilience. Hybrid IT infrastructures offer more robust disaster recovery options. Mixing cloud recovery tools with traditional backups lets companies build a solid, dependable disaster plan.

Sustainability and Environmental Considerations: As businesses become increasingly conscious of their environmental impact, hybrid IT offers a path to more sustainable operations. By leveraging shared cloud resources for suitable workloads, firms can notably reduce their carbon footprint without sacrificing essential on-site operations.

Conclusion

Hybrid IT infrastructure allows businesses to balance security with flexibility. Transitioning legacy systems to the cloud can be tricky, but hybrid IT is a smart move: it scales with the business, saves money, strengthens security and keeps you agile for whatever the future brings. Flexibility is key as IT infrastructure evolves, and embracing hybrid IT equips businesses with the agility to adapt and innovate amid ever-shifting technological landscapes, positioning companies to tackle today's hurdles and respond swiftly to whatever new technology waves or market shifts come next. Hybrid IT is not just a solution for today; it's a strategic investment in the future.

### Why APIs are the new frontier in cloud security

Organisations are seeing a proliferation of APIs (Application Programming Interfaces) across their cloud environments (single, hybrid and multi-cloud) as they expand their use of microservices and create new cloud-native applications. They're the tool of choice for developers because each API includes all the commands, payload, and data needed to facilitate machine-to-machine interactions, and they rarely malfunction. But this sprawling API footprint represents a significant security challenge. The prevalence of APIs across diverse locations makes it difficult to inventory, manage and protect them, and insecure APIs are vulnerable to cyber attack. APIs are a top target for attackers because they have access to sensitive data residing on applications and backend systems.
Calls made to APIs can be subverted to enable attackers to access this data, so the stakes are extremely high. In fact, over 95% of organisations experience API-related security incidents, and the vulnerability of a network, its APIs and associated endpoints is directly proportional to the number of deployed APIs, which makes it vital to secure them.

Why other tools are ineffective

Typically, organisations concentrate on implementing technical measures to detect and prevent API exploitation. This often involves deploying infrastructure like content delivery networks (CDNs) or Web Application Firewalls (WAFs) to mitigate attacks. However, these solutions may struggle because they are not specifically designed to withstand the types of attacks typically directed at APIs. Most automated attacks result in hundreds of thousands, if not millions, of IP addresses being blocked, which has the potential to cripple or even crash a WAF. When attackers see their attack being blocked, they re-tool, often shifting to a different batch of IP addresses, which then creates another influx of additional IPs to block. Secondly, the number of WAF rules needed to block an attack based on IP or other expressions will slow down the WAF. It becomes difficult to manage and, if there are false positives, screening through these to identify the IP addresses that belong to legitimate users is nearly impossible.

An API-specific approach therefore needs to be taken, one which recognises the unique challenges involved. The OWASP Top 10 API Security Risks, recently updated, serves as a crucial guide to understanding how these attacks manifest themselves. Broken Object Level Authorisation (BOLA) remains a prevalent attack, emphasising the importance of adhering to API authentication best practices to safeguard data. Authentication coding errors may grant attackers unauthorised access or elevated privileges, underscoring the need for diligent development practices.

Automated versus long tail attacks

Automated attacks unique to APIs, like API4, which focuses on unrestricted resource consumption, aim to disrupt services through denial-of-service (DoS) assaults or to escalate resource usage, leading to significant operating costs. Similarly, API6 sees unrestricted access to sensitive business flows, such as order mechanisms, abused, allowing the attacker to place excessive orders for goods or services. Such attacks demonstrate the vital link between API security and bot mitigation, which is why any approach needs to be able not just to rate limit but to actively monitor and address automated attacks.

API attacks can vary considerably, however. We are increasingly seeing automated attacks that seek to slip quietly under the radar and use the API to mine for data, scrape content, or harvest credentials, for instance, or set up fake accounts to carry out fraud. Such attacks are not volumetric, so they will not trigger a WAF, and the only way of spotting them is through behavioural analysis.

So, what does an effective approach to API security look like? Firstly, it's essential to establish an inventory of all deployed APIs across the network and in the cloud. The discovery process needs to be carried out internally and externally to provide a hacker's-eye view so that all public-facing APIs are captured.
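A minimal sketch of what a first pass at such a discovery-driven inventory could look like, built from gateway access logs; the log entries are illustrative, and a real exercise would also probe externally and reconcile the results against the API specifications teams believe they have published.

```python
# Minimal sketch: build a simple API inventory from access-log entries, so
# endpoints nobody remembers deploying still show up. Log lines are illustrative.
from collections import Counter

LOG_ENTRIES = [
    {"method": "GET", "path": "/api/v1/orders/123"},
    {"method": "GET", "path": "/api/v1/orders/456"},
    {"method": "POST", "path": "/api/v1/orders"},
    {"method": "GET", "path": "/api/v2/accounts/9/export"},
]

def normalise(path):
    """Collapse numeric identifiers so /orders/123 and /orders/456 map to one endpoint."""
    return "/".join("{id}" if part.isdigit() else part for part in path.split("/"))

def build_inventory(entries):
    """Count observed calls per (method, normalised path) pair."""
    inventory = Counter()
    for entry in entries:
        inventory[(entry["method"], normalise(entry["path"]))] += 1
    return inventory

if __name__ == "__main__":
    for (method, path), count in sorted(build_inventory(LOG_ENTRIES).items()):
        print(f"{method:4} {path} seen {count} time(s)")
```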
It's a process that needs to run continuously so that APIs continue to be logged throughout their lifetime as they are updated, annexed to third parties, or replaced, to prevent any shadow (undetected) or zombie (redundant) APIs from being overlooked. Those APIs then need to be monitored for suspicious activity, but simply monitoring for traffic surges is insufficient.

Protecting APIs requires two forms of threat detection and response. There's the immediate need to curtail an aggressive attack, and this will typically utilise a host of defence techniques, from attack blocking to rate limiting, geo-fencing, and even deception, using a fake response to cause the attacker to pivot or exhaust their efforts. However, other more subtle attacks that aim to accumulate data are less obvious and can only be detected using bot mitigation techniques. To monitor these, it is necessary to compare the types of requests being made against a catalogue of malicious behaviours, known bad infrastructure and attack toolkits to create a behavioural fingerprint for that attack that can then be tracked in real time. It therefore pays to verify the threat intelligence an API vendor has at their disposal.

Security starts in development

The final piece of API lifecycle management is compliance. To remain secure, the API needs to comply with the build specification or protocol used, such as SOAP, REST or GraphQL, and there may well be industry-specific regulations that need to be complied with, such as the Payment Card Industry Data Security Standard (PCI DSS), data protection regulations such as GDPR, or newer regulations such as the Digital Operational Resilience Act (DORA). The most effective way of doing this is to put governance processes in place during development. Introducing shift-left testing, whereby testing for security and compliance happens earlier in the development process, ensures coding issues are addressed pre-production, making it much more cost-effective to remedy problems and less likely that issues will arise once the API goes live (a simple example of such a check is sketched at the end of this article).

Focusing on API security as a standalone issue, rather than as a subset of web application security, must be the way forward if organisations are to avoid falling victim to these attacks. Real-time detection and native prevention in the form of API-specific solutions that can map and track malicious activity are key to defending against fraud, business logic attacks, exploits and unintended data leakage. Attackers have already switched their attention to the API attack surface and are continually evolving novel ways of compromising this technology, so defenders, too, now need to address the unique challenges of managing APIs in the cloud.
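As a simple illustration of the kind of shift-left check described above, the sketch below lints an OpenAPI description and fails the build when an operation declares no security requirement. It assumes PyYAML is available and uses a hypothetical openapi.yaml file name; real pipelines would typically pair a check like this with dedicated API security testing tools.

```python
# Minimal sketch of a shift-left check: fail the build if any operation in an
# OpenAPI description lacks a security requirement. Assumes PyYAML is installed;
# the spec file name is a hypothetical placeholder.
import sys
import yaml

HTTP_METHODS = {"get", "post", "put", "patch", "delete", "head", "options"}

def unsecured_operations(spec):
    """Return operations with no per-operation or global security requirement."""
    global_security = spec.get("security")
    findings = []
    for path, item in spec.get("paths", {}).items():
        for method, operation in item.items():
            if method in HTTP_METHODS and not operation.get("security", global_security):
                findings.append(f"{method.upper()} {path}")
    return findings

if __name__ == "__main__":
    with open("openapi.yaml") as handle:   # hypothetical spec location
        spec = yaml.safe_load(handle)
    problems = unsecured_operations(spec)
    for operation in problems:
        print(f"No security requirement declared for {operation}")
    sys.exit(1 if problems else 0)
```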
### Tackling AI challenges in ethics, data, and collaboration

We can safely say that artificial intelligence (AI) was certainly one of the buzzwords of 2023. Initially developed for sophisticated enterprise applications that enabled businesses to analyse data and increase revenue, AI is now being democratised. This trend comes with its own set of challenges and emphasises the need for strong AI systems and effective data management. Generative AI in particular, a technology with enormous potential, brings new risks. Business leaders, lawmakers and politicians came together at the first AI Safety Summit in November 2023 to understand the broader implications of this technology, acknowledge the aspirations of pioneering AI safety, and take proactive measures to address future concerns.

Navigating Future AI Concerns: A Tripartite Approach

In this section, we will explore three fundamental areas essential for addressing future concerns related to AI. These areas encompass the ethical and responsible use of AI, tackling issues of data quality and integrity with data products, and promoting collaboration in AI development. By examining these aspects, we aim to provide insights and strategies that can help steer AI technologies towards a more secure and promising future.

Ethical and responsible use of AI

AI grapples with the Collingridge dilemma - the challenge of introducing novel innovations without fully anticipating their long-term societal impacts. Similar situations have arisen with previous technological revolutions such as computers, mobile phones, and the internet. However, AI presents a unique complexity as it becomes deeply ingrained in our daily lives and cannot easily be ‘switched off.’ To mitigate these societal consequences, we must distinguish between ethical AI and responsible AI. Ethical AI prompts us to ask, "Are we pursuing the right actions?" while responsible AI focuses on how to execute them correctly. The challenge lies in the fact that ethics and responsibility often diverge. The technical community is severely under-represented at the governmental level, resulting in a limited understanding of how organisations can implement AI safely. Without transparency and a comprehensive understanding, it becomes impossible to give an accurate assessment of AI's impact on society at large. To proactively lay the groundwork for trust and responsibility, technology leaders must incorporate ethical and responsible considerations into the AI development process. This practice is essential, even when the future applications of AI remain uncertain, to ensure that AI contributes more to the moral betterment of society than harm.

Data products: empowering AI success

Poor-quality training data can limit the successful adoption of AI, with biases occurring when the AI is fed dirty or low-quality data, which then affects all of the AI's outputs. For organisations, this could lead to inaccurate predictions and decisions, or even damage to the company's reputation when algorithms biased on gender, race, or other characteristics lead to harmful consequences for those affected. However, AI can also provide a solution to this problem. AI-powered data products (clean, curated, continuously updated data sets, aligned to key business entities) improve low-quality data by recognising and correcting errors. Data products fill gaps in data, remove duplicates, and ensure data correctness and consistency, maintaining accurate and reliable data. They can also integrate data from various sources to streamline manual or traditional data cleaning processes. To harness the full potential of AI technology, business leaders need to prioritise strategies for creating clean training data. AI algorithms learn from data; they identify patterns, make decisions, and generate predictions based on the information they're fed. Clean, curated data truly serves as the foundation for any successful AI application. Additionally, with humans acting as supervisors to the AI that powers data products, data quality will be strengthened, resulting in sharper and more precise systems.
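A minimal sketch of the kind of clean-up a data product might automate, assuming pandas and an illustrative set of customer records; a real pipeline would add source integration, richer validation rules and human review of anything it quarantines.

```python
# Minimal sketch: deduplication, simple gap-filling and a basic consistency check,
# the kind of clean-up a data product automates. Records and columns are illustrative.
import pandas as pd

records = pd.DataFrame([
    {"customer_id": 1, "country": "UK", "annual_spend": 1200.0},
    {"customer_id": 1, "country": "UK", "annual_spend": 1200.0},   # duplicate
    {"customer_id": 2, "country": None, "annual_spend": 540.0},    # gap
    {"customer_id": 3, "country": "DE", "annual_spend": -80.0},    # inconsistent
])

def build_data_product(df):
    """Return (clean records, quarantined records) after basic cleaning rules."""
    df = df.drop_duplicates()
    df = df.assign(country=df["country"].fillna("UNKNOWN"))
    # Quarantine rows that fail consistency rules rather than silently "fixing" them.
    valid = df["annual_spend"] >= 0
    return df[valid], df[~valid]

if __name__ == "__main__":
    clean, quarantined = build_data_product(records)
    print(clean)
    print(f"{len(quarantined)} record(s) quarantined for review")
```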
A critical factor in the success of AI projects is the role of a data product owner or manager. This individual is tasked with overseeing the development and ensuring the successful implementation of data products as part of a broader data strategy. The data product owner holds a unique position – they must comprehend the intricate details of the data, understand the business's needs, and have a vision for how the two can best intertwine. Clean training data is the backbone of AI, and a dedicated manager ensures not only high-quality data but also alignment with strategic business objectives. Consequently, the data product owner becomes an essential bridge between the technological possibilities of AI and the practicalities of business strategy, ultimately driving the successful application of AI within the organisation.

Collaboration in AI development

Effective collaboration in AI development serves as a fundamental pillar for its successful, ethical, and responsible deployment. This collaborative approach offers a multifaceted perspective on the creation and implementation of AI technologies, guaranteeing that these tools incorporate an awareness of a broad spectrum of social, cultural, and ethical considerations.

The significance of collaboration between businesses and governments in AI governance cannot be overstated. This synergy is essential for several key reasons. First and foremost, governments play a pivotal role in establishing regulatory frameworks and policies that govern AI. Through collaboration, businesses and governments can work together to strike a balance between encouraging innovation and safeguarding the rights and safety of individuals. This collaborative approach culminates in the formulation of responsible AI guidelines that benefit society at large.

Moreover, AI systems heavily rely on vast datasets, often comprising sensitive or personal information. Collaboration ensures that businesses adhere to data privacy laws and cybersecurity standards set by governments, aligning their practices with legal requirements. This alignment, in turn, safeguards the privacy rights of individuals and protects against data breaches and misuse.

Ethical considerations hold paramount importance in the realm of AI, encompassing principles like fairness, transparency, and accountability. Collaborative efforts facilitate the early identification and resolution of ethical concerns during the development phase, reducing the likelihood of biased or discriminatory AI outcomes. Additionally, governments can contribute resources, funding, and expertise to support AI research and development. Collaboration allows businesses to tap into these valuable resources, accelerating innovation in the field of AI and ensuring that the benefits of AI technologies extend to society as a whole.

AI operates on a global scale, underscoring the need for cross-border cooperation. Governments and businesses collaborating can harmonise AI standards, ensuring that AI technologies are not only interoperable but also adhere to consistent ethical and technical guidelines on a global scale.

By harnessing the diverse perspectives, skills, and experiences of individuals from various disciplines, including data science, ethics, law, and sociology, we can develop AI systems that are not just technologically robust but also ethically sound.
These collaborative endeavours foster transparency, accountability, and fairness in AI systems, mitigating biases and promoting equitable outcomes.

Shaping the future of AI

As we move into a future increasingly dominated by AI, the paths we take today will shape the societal impact of this powerful technology. Rather than viewing the challenges associated with AI as roadblocks, we should see them as opportunities to foster ethical innovation, ensure data integrity, and promote collaboration. By doing so, we can harness the immense potential of this technology while effectively mitigating its risks. The collective responsibility for achieving this balance rests with technology leaders, businesses, governments, and society at large. Together, we must guide AI development in a direction that aligns with ethical norms, upholds data quality and reliability, and encourages broad collaboration. In this way, AI will continue to serve as a driving force for positive transformation and societal progress.

### The evolution of the CISO

What began as a technical innovation on the hacker side ends up as a decisive impetus for CISOs to further develop their role in the organisation. Ransomware is forcing CISOs to position themselves more operationally and to dovetail closely with the CIO and their IT infrastructure teams.

You only need enough budget and enough security specialists to protect the company: this mantra has been a strategic cornerstone for many CISOs for decades, who like Dickens’ Oliver Twist have come to budget meetings with their bowl in hand asking “Please Sir, can I have some more?”. With the right tools, the right team and the right processes, a CISO can manage cyber risk to a tolerable level; in fact, that's the very reason for our existence. Yet every new tool adds user friction and reduces agility. At a recent DefCon presentation, Deloitte said the average enterprise now has over 130 different cyber security tools deployed: just as IT is looking to digital transformation to focus on value, consolidation and the move to the cloud, cyber security teams are rapidly turning into infrastructure management teams rather than managers of cyber risk.

Before the advent of ransomware, the vast majority of cyber risks were related to the exfiltration of the organisation's data. In data exfiltration attacks the organisation faces reputational damage, regulatory fines and potential litigation, but a copy of the data still exists inside the organisation, and products and services continue to be delivered. Contrast this with a ransomware attack, where the organisation is unable to fulfil its very reason for existence. Ransomware and wiper attacks have very different impacts from those a CISO is used to, and unlike data exfiltration attacks, where the damage is already done at the point of the attack, in ransomware the bleeding continues until the incident is dealt with.

The old-school CISO who is focused myopically on prevention and detection assures the board that, with an appropriate increase in headcount and budget, the business will be protected from attack, and even claims that the latest miracle security tool they are deploying will detect all zero-day attacks. For these CISOs, who are watching organisations with much larger security teams and exponentially higher security budgets suffer significant impacts from ransomware, these are worrying times.
It is now difficult to go back to the board of directors and admit that, despite all the investment in security, it is likely the organisation will fall victim to such an attack. After all, they have to admit that despite all the effort, there is a residual risk that could jeopardise the company's bottom line. In the world of ransomware, they have to adapt their strategy, because adding a 131st tool to the security operations arsenal will only fractionally move the residual risk needle while adding more noise and more friction and reducing agility even further. And there will still be some element of residual risk.

Investment in cyber resiliency - managing the impact axis of the risk equation by building effective and efficient response and recovery - means the CISO can have a more realistic conversation with the organisation's leadership and make investments that are far more likely to move the residual risk needle than further incremental spending on managing likelihood. It is crucial to focus on the consequences of a successful intrusion and how these can be contained, investigated and eradicated as quickly and efficiently as possible. However, expanding this response capability means interacting much more closely with the business and other teams in the company.

Establish an open culture

The problem cannot be solved within your own security team. In the event of a ransomware attack, there is a risk of total IT failure. Preparing for this emergency means answering all the important questions about availability, recovery and incident analysis with other teams and the entire executive board, and putting the measures on paper. It is important to regularly run through this emergency in drills and training sessions and to transfer what has been learned to your own security world. CISOs must have a precise understanding of the consequences for data and the business and be able to explain them to the board. Incidentally, the CIO and the infrastructure team are already quite experienced in this regard, as floods, fires, power failures and the like also threaten data centres in their world. You can learn from this wealth of experience and work out together how services and infrastructure depend on each other and what their failure due to ransomware would mean.

When taking this joint approach, the CISO and CIO should also precisely determine the value of their data and assets, and agree on a process through which this happens automatically from now on. After all, how can a joint risk analysis work if there is no common understanding of what you actually want to protect? This coordination requires a new culture of close collaboration across team boundaries. However, this is essential, as the company will only survive a ransomware attack if the security and infrastructure teams work closely together.

Preparing for the zero hour

The consequences of an acute ransomware attack are very different from those of malware, spam or data exfiltration. The CISO might not even be able to start their response to an attack: security tooling may have been impacted, or evidence encrypted or wiped. Communications with senior management, law enforcement, insurers, the press and regulators may be impacted by the lack of email or Voice-over-IP systems. Incident responders might not even be able to get into the building if physical access control systems are down. Looking to pay the ransom? How do you match a decryption key to a MAC address if your CMDB is down?
How can you communicate with your legal team to ensure the ransomware gang isn't a sanctioned entity? A lot of the tabletop scenarios organisations run through haven't been created by someone who has lived through real ransomware incidents and often lack answers to questions like these. The CISO and their teams need to prepare for this zero hour and build contingency plans and environments in advance.

Together with the infrastructure teams, isolated cleanrooms can be established in which an emergency set of tools, plus system and production data, is stored in order to enable emergency operation of the entire IT estate. This also contains the essential tools for the security teams so that they can start the incident response process. From there, the production environment is then restored step by step and in close coordination with the infrastructure teams. To do this, however, the CISO and CIO must once again have a common understanding of the order in which systems and data are restored (see the sketch at the end of this article). They will achieve this level of maturity in their interaction if they and their teams regularly discuss these issues. Control frameworks such as the NIST CSF provide good practice for structuring and automating these tasks. They also describe other important security practices such as least privilege, separation of duties and authentication, which both teams should implement together in order to increase the overall security level of IT.

In the end, the CISO will learn from the CIO that a cyber incident is not a failure of the security strategy. Incidents happen, just like the fire in the data centre or the ransomware infection. The key is to have the tools and processes in place to mitigate the consequences of the cyber incident. If CISOs can manage this evolution away from pure prevention towards operational defence, it will have an impact on their investment behaviour. Individual security tools will probably feature less on their shopping list than joint projects with the infrastructure teams for clean-rooming, classification and incident response. It is certain that the defence structure will fail at some point; it is uncertain how well the CISO and CIO will manage that incident together. But both can start working on this today.
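As a closing illustration of the restore-order question above, here is a minimal sketch that derives a recovery sequence from a service dependency map agreed between the CISO and CIO. The services and dependencies are illustrative; the point is simply that foundational systems come back before the systems that rely on them.

```python
# Minimal sketch: derive a restore order from agreed service dependencies so
# foundational systems come back first. The dependency map is illustrative.
from graphlib import TopologicalSorter  # Python 3.9+

# Each service maps to the services it depends on.
DEPENDENCIES = {
    "identity-provider": set(),
    "database-cluster": {"identity-provider"},
    "erp": {"database-cluster", "identity-provider"},
    "customer-portal": {"erp"},
}

if __name__ == "__main__":
    restore_order = list(TopologicalSorter(DEPENDENCIES).static_order())
    for step, service in enumerate(restore_order, start=1):
        print(f"{step}. restore {service}")
```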
### Building Trust: Uniting Developers & AppSec Teams

The relationship between developers and security teams has typically been fraught with challenges. Each team is working to related, yet not always aligned, objectives. This can give rise to tensions and have a negative impact both on the organisational culture and on the output of each team. Whilst developers are focussed on delivering products and features and releasing these to customers as quickly as possible, security teams have the ‘safety-first’ role, which can put barriers and delays in their way. Building a more collaborative culture in which trust and co-operation between the teams is central is in everyone's interest. It means that developers are focussed not only on meeting release dates, but also on building in security from the outset. It also means that security teams can play a more central role in ensuring the integrity of applications to avoid problems further down the production line. And whilst the issue of trust might sound nebulous, it can, ultimately, have a direct bearing on revenue.

When the world runs on code, the delivery of safe, secure applications has never been more important; strategies that foster a close, productive relationship between the two teams, with open dialogue to strengthen bonds, should be prioritised and embedded within the culture of every software-first organisation.

The divide between AppSec and Developers

Amongst the factors that have contributed most recently to the disconnect between the two teams is the rapid pace of digital transformation across nearly every industry sector. Applications need to be rolled out at an ever-greater speed to support these huge operational shifts. At the same time, technological advancements in IoT, AI, and 5G have put further pressure on development teams to speed up software delivery, creating further tension between the objectives of each team: on the one hand the need for fast development and, on the other, for robust application security. Bridging the gap between these teams is challenging as, for developers, the primary focus is on delivering feature-rich, user-friendly products quickly, while security teams concentrate on ensuring safety and minimising risks. However, there's a clear business imperative to align objectives and build a close working relationship between developers and security teams so that AppSec is ingrained as a strategic element of the development process.

The current state of affairs

To help us better understand some of the issues that might be holding teams back from a close, constructive working relationship, we conducted research with CISOs, AppSec managers and developers across a range of organisations. Firstly, there seems to be a lack of consensus on who should spearhead AppSec policies. According to our research, 56% of respondents view policy creation as their responsibility, 41% assign it to developers, and 38% assign it to AppSec teams. This uncertainty over who owns AppSec will inevitably lead to gaps in accountability and ownership, which could prevent policies being implemented and increase an organisation's exposure to threats.

AppSec is a specialist and constantly changing area of cybersecurity, which means that education and training are vital for developers. They need to be equipped not only with tools, but also with the latest know-how in AppSec to ensure that security is at the heart of the development process. Here again, however, opinions were divided on the ownership of training developers in AppSec best practices. Around half of the respondents from our research believe this should be the remit of AppSec teams, while the rest favour developers undertaking self-directed learning, such as through interactive courses. Given the complexity of cyber threats, effective protection of applications and the broader organisation hinges on shared knowledge between developers and AppSec teams.

How to align AppSec and Developers

Establishing clear Key Performance Indicators (KPIs) is key to aligning the two teams. These KPIs, which could range from acceptable vulnerability counts in initial scans to timelines for mitigation, provide a tangible framework for tracking security progress. More importantly, they must be regularly reviewed to gauge improvement and ensure they align with the broader business objectives. Developers need to see the direct correlation between these security measures and the overall success of the organisation, fostering a deeper commitment to adhering to security protocols.
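A minimal sketch of how such a KPI could be enforced in a pipeline, assuming a scanner that writes a JSON summary of findings by severity; the report format, file name and thresholds are illustrative and would be agreed between the AppSec and development teams rather than taken from any specific tool.

```python
# Minimal sketch of a KPI gate run after a security scan: compare the scan
# summary against agreed thresholds and fail the build if they are breached.
# Report format, file name and thresholds are illustrative.
import json
import sys

KPI_THRESHOLDS = {"critical": 0, "high": 3}   # agreed between AppSec and development

def breaches(report, thresholds):
    """Return severities whose finding counts exceed the agreed limits."""
    counts = report.get("severity_counts", {})
    return {
        severity: (counts.get(severity, 0), limit)
        for severity, limit in thresholds.items()
        if counts.get(severity, 0) > limit
    }

if __name__ == "__main__":
    with open("scan-report.json") as handle:   # hypothetical scanner output
        report = json.load(handle)
    failed = breaches(report, KPI_THRESHOLDS)
    for severity, (found, limit) in failed.items():
        print(f"{severity}: {found} findings exceeds the agreed limit of {limit}")
    sys.exit(1 if failed else 0)
```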
Another critical component is the provision of AppSec training tailored to developers. Training that is interactive and meshes well with their existing development environments can significantly boost their understanding and application of secure coding practices. This proactive learning approach not only enriches the developers' skillset but also embeds a security mindset from the very beginning of the coding process. Factors to consider for smooth operations Addressing the issue of alert fatigue is crucial for security teams. The past decade's proliferation of cybersecurity tools has bombarded developers and AppSec teams with excessive alerts, often hindering their efficiency. Streamlining these alerts to focus on their relevance and accuracy is vital. Reducing false positives and zeroing in on the most critical vulnerabilities can significantly enhance the security process. Integrating AppSec testing directly into development environments can be transformative, making the process more efficient and developer-friendly and bolstering adherence to AppSec standards and practices. Moving towards a united front Around 60% of vulnerabilities are identified during the coding, building, or testing stages of software development, which underlines the importance of processes that encompass each phase of the Software Development Life Cycle (SDLC) to pinpoint and address security vulnerabilities. This approach goes beyond 'shifting left' to address early vulnerabilities; it’s about integrating security measures throughout the development process, a strategy referred to as 'shifting everywhere.' Employing a cloud-based platform can significantly streamline this integration. Such platforms enable development teams to incorporate security scans directly into their Continuous Integration/Continuous Deployment (CI/CD) workflows. This method not only facilitates closer collaboration between development and security teams but also enhances the focus on comprehensive product security. Processes that improve the way security teams and developers collaborate are not just a security requirement but a fundamental aspect of successful business operations in software-driven organisations. By adopting strategies that integrate AppSec seamlessly into development workflows and fostering an environment of mutual understanding and shared responsibility, organisations can ensure the security, efficiency, and innovation of their digital products and the applications that run their businesses. 
### Building cyber resilience across the supply chain As the world becomes increasingly interconnected and digitised, globalised economies rely on robust and resilient supply chains. However, this digitisation exposes businesses to risk as they navigate complex and sophisticated supply networks, leaving the door open for supply chains to become a prime target for cyber criminals seeking to exploit the industry’s vulnerabilities and capitalise on this digital transformation. Cyber threats on the rise As the number of cyber attacks continues to increase each year, it’s paramount to deploy and enforce cybersecurity measures in global supply chains. Hackers’ methods constantly evolve, ranging from data breaches and stolen security certificates to malware and ransomware attacks, targeting manufacturers, suppliers, and third-party providers. Just look at the average cost per data breach globally, which amounted to a staggering $4.45 million last year.
Adding to the urgency of this issue, Resilinc – the global leader in supply chain mapping, risk monitoring, and resiliency analytics – reported a 36% surge in cyber attacks in 2023 compared to 2022. Over 12 months, Resilinc’s 24/7 risk monitoring platform, EventWatchAI, identified 703 potentially disruptive cyber attacks worldwide across all tracked industries. Of the alerts, more than 57% triggered a WarRoom – meaning that there was a confirmed impact on the supply chain. With so much at stake, future-proofing supply chains through strategic and proactive cybersecurity measures is critical. But what steps can organisations take to minimise cyber threats and enhance the resilience of their supply chains? Here we will delve into four key elements that any cybersecurity strategy should include. Building transparency: multi-tier mapping Every company should begin its journey toward increased cybersecurity by improving the transparency and visibility of its supply chain. A necessary first step in achieving this is to map out the entire supply network. Importantly, given that as much as 85% of disruptions arise from tier 2+ suppliers, it is crucial to go beyond direct high-volume suppliers and map indirect sub-tier vendors as well. Mapping offers the visibility and information needed to make data-driven decisions about who to work with and what changes to implement if cybersecurity issues emerge. Screening disruptions: AI-powered monitoring The next integral step is gaining real-time insight into potential disruptions that could impact supply chains. Be it natural disasters, geopolitical issues, or cyber attacks, having continuous 24/7 access to information on global events means staying one step ahead of the disruption. AI-powered monitoring tools equipped with predictive analytics capabilities enable a level of automation where decisions are made in a split second, even before disruption unfolds. Conducting cyber assessments for continuous improvement Another essential best practice is to conduct thorough and ongoing cyber assessments of systems. These can unveil security gaps that need attention and lay the groundwork for enhanced security measures. By continuously evaluating and refining processes, organisations can ensure that their systems and their suppliers' systems are up-to-date and resistant to breach attempts. Cybersecurity assessments must be collected from both direct and indirect tiers of an organisation’s supply chain, with updates occurring at least every six months. Creating a crisis-ready contingency plan Finally, an effective risk mitigation strategy should include a contingency plan. Companies need to proactively determine steps to take in the aftermath of a potential cyber breach, focusing on people and processes. They must ensure their employees know how to swiftly respond and have the necessary processes in place to effectively mitigate the negative impact of a cyber attack. What does the future hold? As supply chain management shifts from a reactive to a proactive approach, cybersecurity is a key risk area where this change is crucial. Artificial intelligence will play a vital role in driving it further. AI will enable the development of powerful solutions used to optimize supply chains for a variety of risk mitigation strategies, including cybersecurity. It will also facilitate the creation of digital supply chain twins that can accurately simulate and remediate supply chain disruptive risks, including cybersecurity. 
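Returning to the multi-tier mapping and simulation ideas above, the sketch below walks a hypothetical tiered supplier graph to find which products are exposed when a sub-tier supplier is disrupted; the supplier names, tiers and structure are invented purely for illustration, not drawn from any real supply chain.

```python
from collections import deque

# Hypothetical mapping: each product or supplier lists the suppliers it depends on.
# Tier 1 suppliers feed products directly; tier 2+ suppliers feed tier 1, and so on.
dependencies = {
    "product_A": ["tier1_assembler"],
    "product_B": ["tier1_assembler", "tier1_packager"],
    "tier1_assembler": ["tier2_chipmaker", "tier2_foundry"],
    "tier1_packager": ["tier2_foundry"],
    "tier2_chipmaker": ["tier3_materials"],
    "tier2_foundry": [],
    "tier3_materials": [],
}

def exposed_nodes(disrupted_supplier):
    """Return every node that depends, directly or indirectly, on the disrupted supplier."""
    # Invert the dependency map so we can walk upwards from the disruption.
    dependents = {}
    for node, deps in dependencies.items():
        for dep in deps:
            dependents.setdefault(dep, set()).add(node)
    exposed, queue = set(), deque([disrupted_supplier])
    while queue:
        current = queue.popleft()
        for parent in dependents.get(current, ()):
            if parent not in exposed:
                exposed.add(parent)
                queue.append(parent)
    return exposed

# A disruption at a tier 3 materials supplier ripples up to two finished products.
print(exposed_nodes("tier3_materials"))
# e.g. {'tier2_chipmaker', 'tier1_assembler', 'product_A', 'product_B'} (set order may vary)
```

The same traversal, run against a real mapped network, is what makes sub-tier visibility actionable: it turns an alert about one supplier into a list of affected parts, sites and products.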
Generative AI trained on massive amounts of data from the Internet and augmented with search capabilities will provide in-depth insights into suppliers' past problems, allowing smarter decisions about which suppliers to do business with. While rapid technological developments create new opportunities for optimisation, efficiency, and better decision-making, they also bring new risks. As today's complex supply chains are increasingly vulnerable to cyber threats, businesses must implement and prioritise robust cybersecurity measures to safeguard their supply chains. This requires a multi-level strategy that includes full supply chain visibility, risk monitoring, strong supplier relationships, and data analytics. ### Enter the Application Generation: redefining digital experiences in 2024 During the pandemic, with the various restrictions in place, consumers had no option but to rely on digital services. Most people couldn’t go to work, meet friends, or go to the shops. And as a result, we saw a huge surge in the use of applications and digital services. But now, with the pandemic thankfully behind us, people once again have options. They have the choice between digital services and traditional, offline channels. And significantly, across almost every aspect of their lives, consumers are favouring the ease and convenience of applications and digital services. Whether it’s fitness classes, collaboration and productivity tools, messaging platforms or access to public services, people across the world are still relying on applications to live, work and play. What has changed over the last two years, however, is people’s expectations of how these applications and digital services should perform. The sense of gratitude that consumers felt towards brands for helping them navigate through the pandemic has disappeared. They’re demanding the very best, most seamless and secure digital experiences and they have zero tolerance for brands whose applications fall below this mark. In particular, the emergence of a new generation of younger consumers is re-defining what digital experience needs to be in 2023 and beyond. The Application Generation has arrived, and they are out to take down any brand whose application fails to meet their expectations. The Application Generation - Discerning, demanding and empowered The latest research from Cisco, The App Attention Index 2023: Beware the Application Generation, reveals how attitudes and behaviours towards digital services have evolved differently amongst people aged 18-34. This younger cohort of consumers relied on applications to get them through the pandemic - for their education, to start out in their careers, and to stay close to friends - and now they're using them with great skill to live in a hybrid world. They’re heavier users of digital services, using an average of 41 different applications each month, compared to 30 amongst people aged 35 and above. One of the things that makes this ‘Application Generation’ so different is how discerning they are about the quality of applications, and the way in which they consider the relevance and value - or otherwise - of every digital service. They’re much more mindful about the digital services they use and think hard before downloading new applications, wanting to avoid a sense of ‘application clutter’. 
The Application Generation have experienced first-hand the types of digital experiences that the most innovative brands in the world are now offering, and they point to big improvements in app design and responsiveness over recent years. They’ve felt how world class digital experiences can enrich their lives. The issue for application owners, however, is that these young consumers now demand this level of digital experience every time they use an application or digital service. And they’re getting increasingly frustrated and angry when applications don’t match these expectations. They feel that brands are showing them a complete lack of respect. Indeed, 65% of the Application Generation admit that they are less forgiving of poor digital services than they were two years ago and, crucially, they are actively looking to make brands pay. They feel empowered to switch to alternative digital services and, where they are locked into relationships with service providers (such as banks and utility companies), they are ditching poorly performing applications and reverting back to offline channels. In the last 12 months, on average, younger consumers have deleted seven applications as a result of bad digital experiences, and 70% report that they’re now more likely to warn other people against using digital services that don’t perform. Application owners must focus on digital experience to keep The Application Generation on side Organisations in all industries are now operating in a market where a single slip up in application availability, performance or security can antagonise an entire generation of consumers. But alarmingly, 94% of the Application Generation state that they have encountered at least one poorly performing application in the past year. Application owners can’t afford to be delivering anything other than the very best, seamless, and secure digital experiences. But as anybody who works in IT will know, this is becoming an ever more difficult task. Widespread deployment of cloud native technologies has left IT teams managing a highly complex and dispersed application landscape, and many don’t have full and unified visibility for applications running across hybrid environments. This means it is almost impossible for them to identify and resolve issues before they impact end user experience. Application observability is the answer to this challenge, providing IT teams with full and unified visibility across both cloud native and on-premises environments. It allows technologists to rapidly detect issues and understand root causes. And by correlating application performance and security data with key business metrics, IT teams can identify and prioritise those issues which could do the most damage to digital experience. Organisations need to understand that the Application Generation is like nothing we’ve seen before. Billions of consumers around the world are simply no longer willing to tolerate bad digital experiences and they’re making it their mission to punish brands whose digital services continue to let them down. The standard for digital experience has been raised and application owners have been warned about the consequences of falling short. ### CIF Presents TWF - Dr. Ems Lord In this Episode of our weekly show, TWF! (Tech Wave Forum) we're coming at technology from the perspective of mathematics. Our host and CIF CEO David Terrar will be interviewing Dr. 
Ems Lord, Research Fellow at the University of Cambridge, and Director of NRICH, the award-winning mathematics outreach programme. As well as explaining what NRICH does, we'll be talking about the importance of number sense, problem solving, thinking skills, AI (of course), and women in tech and STEM subjects. 
### Cloud spending defies tough times in technology A funny thing happened on the way to the recession at the start of this year. Companies spent more on cloud computing. Infosys surveyed 2,500 executives in April 2023, when economic forecasters and business commentators repeatedly asserted that a global recession was imminent. Economic signals and tech company layoffs gave good reason to anticipate this. And yet two-thirds of companies increased cloud spending in that time. What’s more, four out of five said they would further increase cloud spending in the year ahead. What did these companies know that economists and tech bosses didn’t? First, cloud works well for big business, and business leaders want more. Nearly three-quarters of respondents (73%) said their cloud migrations have been very effective or extremely effective. Second, cloud is working well in new ways. Cloud allows companies to dynamically add new capabilities, which stokes further new demand. It is still used to replace outdated systems, but companies said they were just as likely to use cloud to pursue new revenue streams, access new technology or integrate subsidiaries and new acquisitions, Cloud Radar 2023 found. This held true across industries and across regions. Of the seven industries surveyed, healthcare, banking and high tech firms were slightly more likely to have increased spending last year. In terms of spending next year, the four-out-of-five proportion of companies increasing spending held for all industries except telecom, where it stood at three out of four. Overall, 57% of survey respondents who answered both questions increased spending in the past year and intend to do so next year. Only telecom and automotive/manufacturing firms landed slightly below that proportion, with 54% of firms in those two industries increasing spend over both terms. European companies were more resilient with their cloud spending than the rest of the world, particularly in Germany and the Nordic region, including Norway, Sweden, Finland, Denmark and Iceland. Some 64% of European companies increased cloud spending in the last year and intend to do so in the year ahead, compared with 49% of companies based in the rest of the regions surveyed. British companies were the least likely European group to spend more on cloud, with 54% of those surveyed intending to spend more in both periods. British businesses in the survey averaged $28 million a year in cloud spending, lower than the survey average of $33 million and trailing leaders in the US ($43 million a year) and France ($41 million a year). Complexities could develop into a downside Cloud works well, companies want more, and cloud enables growth. But there is possibly an emerging dual downside. Executives surveyed for Cloud Radar expressed relatively lower confidence in their ability to monitor, predict, and optimise future cloud costs. This is rooted in the dynamic nature of modern cloud. Companies have a more nuanced view of the performance and capabilities different cloud providers can deliver. For example, certain cloud providers have a reputation for handling large quantities of data more efficiently, and others excel at interoperating with standard business applications.
That influences cloud choice decisions and leads most companies to a complex hybrid multi-cloud architecture. When Infosys conducted its first in-depth study of cloud in 2021, multi-billion-dollar enterprises typically used two or three cloud providers. This year, 65% of companies use three or four cloud providers. That’s up from 37% in 2021. What if a company wants to keep it simple? Use a single cloud provider for all its needs? That’s grown more difficult. Only 7% of respondents claimed to rely on a single cloud. In the UK, 75% of companies responding use three or four providers, and only 3% claim to use a single cloud. This vendor complexity makes for a more complicated cloud bill. And while companies express low confidence in predicting spending, they also struggle to use the cloud they’ve already committed to. Major cloud providers also reflect this gap between cloud commitment and migration in their financial reporting, in the form of revenue backlogs or unearned revenue. When we account for the 12 cloud providers and cloud service providers in our survey, we identified $321.4 billion in unused cloud commitments. While this does not indicate a near-term problem, companies that fail to meet their cloud contracts stand to face higher costs as cloud providers renegotiate contracts. Cloud is delivering for growth and innovation. The challenge facing business and tech leaders now is optimising usage of the cloud they have committed to and establishing controls to predict and manage spending. 
### Why is Cloud security failing to keep pace? The cloud is now a significant attack surface for most organisations, with 80% of exposures rated medium, high or critical being associated with assets housed in the cloud, according to the Unit 42 Attack Surface Threat Report. It’s a shocking reflection on the state of cloud security today, which has had considerable time to mature; in comparison, only 19% of on-premise assets were deemed vulnerable to security exposures. The question is why assets in the cloud are so vulnerable and what can be done to improve protection. Firstly, patching remains problematic. The report found that, of the 30 common vulnerabilities and exposures (CVEs) tracked, three were exploited within hours and 19 within the next three months. This provides a wide window of opportunity for attackers to leverage and exploit a security vulnerability that has been fully disclosed publicly. But it’s not simply a case of tardy patching here. The Unit 42 Cloud Threat Report, Volume 7 found 63% of source code repositories have high or critical vulnerabilities, 51% of which were at least two years old, revealing the persistence of issues that aren’t being identified soon enough. Amongst Internet-facing services in the public cloud, 11% of exposed hosts had high or critical vulnerabilities, 71% of which were at least two years old. Any single one of these could pose a threat to the supporting infrastructure, applications or data. Config confusion Other practices aimed at making it easier and more efficient to use cloud services were also criticised for increasing risk. The report refers to ready-to-use templates and default configurations supplied by Cloud Service Providers (CSPs), which are contributing to the problem of misconfiguration, long identified as the primary security issue facing the cloud by the Cloud Security Alliance (CSA).
Instances of common misconfiguration include overly permissive identity and access management settings (the report found 76% don’t enforce multi-factor authentication (MFA) for console users and 58% don’t do so for admin users), the inadvertent exposure of cloud storage buckets, overly permissive network access controls, poorly configured logging and monitoring, and a failure to keep on top of DNS subdomains and records. Moreover, the Attack Surface Threat Report found that the opportunity for misconfiguration is also significantly higher in the cloud due to the continual swapping in and out of services. It estimates that, on average, 20% of externally accessible cloud services were changed every month across the 250 organisations surveyed, making it highly problematic to screen for misconfigurations. Nearly half (45%) of exposures were then found to have originated in these new services. This rapid rollout is not being matched by the retiring of infrastructure, however, with the same report revealing that 95% of end-of-life systems were still in situ in the cloud. In addition, many organisations make it far too easy for attackers to gain access by storing credentials in plaintext. The report found 83% of organisations have credentials hard-coded into their source control management systems and 85% in virtual machine user data. Those details can then be used to access privileged user accounts and move laterally across the organisation, exploring assets before exfiltrating data or escalating the attack. In some cases, attackers won’t need to look far, with sensitive data found housed in 66% of cloud storage buckets. Don’t lose the essentials In the event of an attack, it’s the logging data that will help pinpoint exactly what happened and inform remedial action; but while logging is usually offered by CSPs, it is often disabled due to concerns over storage costs. The number of businesses opting to switch off logging is worryingly high – 75% for AWS trail logs, 74% for MS Azure key vault audit logging, and 81% for Google Cloud Platform storage bucket logging – and it’s a problem that is likely to get worse as teams seek to cut costs. So, what can organisations do to mitigate these risks? Firstly, an organisation could consider conducting a gap analysis assessment or cloud configuration review. This can identify where services have been configured incorrectly or in a way that elevates risk, as well as comparing current settings against common best practice and compliance standards. Configuration documentation can then be created which the security team can use as a blueprint when deploying new builds, managing services, or testing changes to existing live configurations, with those changes then also documented and reported on. As touched upon, default configurations are not always the best option, so these must be reviewed to establish whether they are in the best security interests of the business. However, it’s a false economy to disable logging, which is essential for remediation and analysis in the event of a security incident. Logs should be collated and centralised for analysis by a SOC team, a process usually carried out with a Security Information and Event Management (SIEM) solution. When it comes to access management, it’s vital to ensure MFA is enabled and that the policy of least privilege is applied, i.e. that users only have access to cloud management functions that are required to support their role.
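As a simplified sketch of the kind of automated configuration review described above, the Python below checks a small resource inventory against a few baseline rules (admin users without MFA, public buckets holding sensitive data, disabled logging). The inventory structure and the checks are illustrative assumptions, not any provider's real API or export format.

```python
# Hypothetical inventory of cloud resources, e.g. exported by an internal tool.
inventory = {
    "users": [
        {"name": "alice", "admin": True, "mfa_enabled": True},
        {"name": "bob", "admin": True, "mfa_enabled": False},
    ],
    "storage_buckets": [
        {"name": "public-assets", "public": True, "contains_sensitive_data": False},
        {"name": "customer-exports", "public": True, "contains_sensitive_data": True},
    ],
    "logging": {"trail_logs": False, "bucket_access_logs": True},
}

def review(inv):
    """Compare the inventory against a few baseline rules and return the findings."""
    findings = []
    for user in inv["users"]:
        if user["admin"] and not user["mfa_enabled"]:
            findings.append(f"Admin user '{user['name']}' has no MFA enforced")
    for bucket in inv["storage_buckets"]:
        if bucket["public"] and bucket["contains_sensitive_data"]:
            findings.append(f"Bucket '{bucket['name']}' is public and holds sensitive data")
    for log_type, enabled in inv["logging"].items():
        if not enabled:
            findings.append(f"Logging disabled: {log_type}")
    return findings

for finding in review(inventory):
    print(finding)
```

Running checks like these on a schedule, and whenever new services are rolled out, addresses the point above that roughly a fifth of externally accessible services change every month.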
Conducting a specific cloud penetration test can also be useful to determine whether there is a route to gain higher-privileged functions from a typically assigned user account. Scanning open-source code as part of application development is also problematic. Using tools to conduct vulnerability scanning of code can help here, but the issue can run layers deep: if the vulnerability is in an imported code package from a third-party repository that has been compromised, detection can be difficult. The persistence of old vulnerabilities is testament to this fact. Thankfully, the industry itself and various governments are now taking steps to improve the security of open-source software in light of devastating attacks such as Log4j, which illustrated the susceptibility of the software supply chain. The Open Source Security Foundation (OpenSSF) has launched the Alpha-Omega project to find and fix vulnerabilities in open source code, with Alpha focused on critical projects and Omega using automated tooling to analyse, score and provide remediation advice to open source maintainer communities. Legislatively, in the UK, a consultation to improve cyber resilience took place in 2022, with the response published later the same year, and this is likely to see changes brought in during 2024. In the EU, the Cyber Resilience Act also aims to address issues with the supply chain. These actions also testify to the importance of the cloud, which has become the foundation of our digital ecosystem. Its flexibility is vital in allowing organisations to capitalise on opportunities, but as the Unit 42 reports reveal, it’s also highly susceptible to attack, and the applications it relies upon do not provide as firm a basis as they should. It’s therefore down to the individual organisation to ensure it takes the necessary steps to configure and secure its systems, rather than relying solely on the shared responsibility model with the CSP. 
### The skills every cloud practitioner needs in 2024 As more and more data are collected, data storage, protection and management are more important than ever – and cloud computing seems to be the ultimate solution. Moving to the cloud has become essential for businesses to be able to store and access a large quantity of data more quickly, easily and securely – and ultimately increase productivity and growth. However, although Pluralsight’s 2023 State of Cloud report reveals that almost 70% of organisations in EMEA have more than half of their infrastructure in the cloud, only 23% are driving customer value with their cloud strategies. It is clear that adopting cloud computing is not enough to achieve the strategic advantages companies are looking for. Among businesses not reaping the long-term results of cloud adoption, many are struggling with a lack of cloud skills amongst the workforce. And not many are tackling this head on – in fact, only 8% of businesses in EMEA have established a dedicated cloud skills development programme. While implementing the right technology is a good first step, it must be paired with investment in upskilling to close the cloud skills gap and truly accelerate business outcomes.
Here are five skills gaps that any technologist working with the cloud must close for their company’s cloud strategy to be successful: Navigating the threat landscape with cloud security skills Our research shows that the number one challenge faced by cloud leaders and technologists is making sure the technology is used securely, with 22% of EMEA businesses reporting security concerns and constraints. While cloud security is an absolute priority to avoid experiencing a cyber-attack, which risks slowing down growth, cloud security skills still feature amongst the most in-demand skills. Being an expert in cloud technology shouldn’t stop anyone from learning continuously and building new skills. In today's ever-changing threat landscape, it can be hard to keep pace with the latest forms of attack. As such, even expert cloud engineers must familiarise themselves with the foundational principles of cybersecurity. Leaders across all industries are growing more alert when it comes to data breaches and the overall security of their technical systems, and building the right cybersecurity skills has become key. Additionally, the two most prominent factors in cloud data breaches are misconfigured access restrictions and inadequately secured cloud systems – roadblocks that can be alleviated with heavier cybersecurity training for every single employee. A solid cloud strategy requires data analytics, storage, and management skills Just like cybersecurity, proficiency in data skills is absolutely essential to implementing better cloud strategies. Businesses are facing more complex attacks as new technologies such as AI emerge, and technologists are increasingly aware of data breaches as a result. One of the best solutions is the upkeep of data processes and practices. A higher level of analytical experience leads to more efficiency in finding patterns in data, making strategic decisions, and generating value through optimised system performance. On top of these skills, knowledge and experience in data storage and management allow technologists to build reliable and expandable cloud infrastructures. Overall, helping all workers further develop their data skills is necessary to maintain cloud security as well as increase durability in cloud solutions. The importance of proficiency in software engineering and development The third cloud-focused skill gap that organisations need to close is software engineering and development knowledge. Software development skills are key to building structured and systematic approaches to cloud strategy. All employees, not only cloud practitioners, need a strong foundation in software development to understand how each cloud solution is built and used to its full potential. Those skilled in software engineering and development have the tools they need to support adaptable cloud solutions, enabling organisations to unlock long-term benefits rather than only focusing on short-term adoption goals. Don’t forget about DevOps and programming skills While knowing the basics of software engineering and development benefits cloud engineers as they work to build solutions, going one step further and acquiring DevOps and programming skills is key to encouraging collaboration, automation, and ultimately building a more efficient cloud system. Additionally, these tools assist engineers in simplifying their building methods, fostering a system better fit for changes in the market.
Advanced familiarity with programming languages even helps with customisation in systems and allows for a smoother transition when integrating new systems – all factors that can boost competitive advantages, accelerate processes and drive growth. Last but not least is system administration The fifth and final skill gap that businesses need to close is centred around system administration and dexterity in cloud infrastructure. Possessing the skills to manage servers, networks, and storage resources is fundamental to maintaining high-performing and reliable cloud strategies. Cloud practitioners should be able to work proactively to both identify and resolve any issues – only then will they ensure an optimised cloud environment and increase efficiency. In addition to increased speed and reliability, expertise in SysAdmin can support cost-effective cloud solutions by lowering expenses through a highly adaptive and accessible approach to building the infrastructure. Upskilling is key to building a solid long-term strategy While businesses are increasingly adopting cloud technology to drive efficiency, increase customer value and boost innovation capabilities, the cloud skills gap remains an issue. Only 13% of organisations have established a dedicated cloud team, as business leaders tend to prioritise rapid adoption of the technology over truly mastering it to bring value. Most organisations don’t know how to translate embracing cloud technology into a solid plan that leads to the desired long-term outcomes. One word is key in every cloud strategy: upskilling. Only with the right skills will companies using the cloud be able to unlock the full benefits of this powerful technology. While business leaders might assume those working with cloud technology already know how to use it efficiently, this is not necessarily the case. Continuous learning remains key to closing all employees’ cloud-focused skills gaps, keeping pace with emerging technologies and other advancements, driving success, and moving securely and efficiently towards a cloud-centric future. 
### The Paradox of “Open-Source AI” Interest in “open-source AI” has grown into a frenzy. Meta made headlines in July by releasing the source code and weights for its flagship large language model (LLM) Llama 2 under a licence permitting commercial use, and has enjoyed a much-needed image boost since becoming a vocal proponent of “open-source AI”. Meanwhile, the collective efforts of the AI developer community have led to the rise of Hugging Face, a platform for openly sharing AI models, which has seen various companies like Stability AI and Mistral AI release open alternatives to OpenAI’s conspicuously “unopen” AI models. The popularity of “open-source” AI models has even led to suggestions that AI is having its “Linux moment”—that pivot point at which open-source software became popular. In drawing a parallel between Linux’s critical moment in the 1990s and the state of AI today, however, we need to remind ourselves that open-source licences were developed for software, not AI. Since open-source licences were created for software, they do not apply directly to AI models or systems. The term “open-source AI” is thus inherently paradoxical. On the one hand, it may seem that if an AI model is licensed under an open-source licence, the model must be open-source, but on the other hand, it cannot be open-source because it is not software.
For that reason, the phrase “open-source AI” will always be accompanied by quotation marks in this article. AI ≠ Software Open-source licences were born of a legal watershed moment in the 1960s. Back then, the products created by a fledgling software industry had to compete with the software that came bundled with hardware products. The judgement in United States v. IBM (1969) declared bundled software to be anti-competitive, triggering increased investment in independent software. When copyright law was extended to computer programs in 1980, it provided the legal foundation for the sale of independent software under proprietary licences. Richard Stallman founded the Free Software Movement (FSM) in the early 1980s in response to these changes. Stallman’s GNU Manifesto declared that free software should have four “essential freedoms”: to run, study, and distribute code, and to re-distribute modified code. Copyleft licences, such as the GNU Public Licence (GPL), were introduced to protect these freedoms. Directly and indirectly indebted to FSM, Linus Torvalds began developing the Linux kernel in 1991, sparking the so-called “Linux moment” and heralding an era of open-source licences which offered similar freedoms. When applying open-source licences to AI systems, it is all too easy to disregard the fundamental techno-legal differences between software and AI. Traditional software comprises computer code that may be in object or source form, the latter of which is human-readable and is the primary asset made publicly available when software is open-sourced. Publicly releasing code in source form under a licence that permits anybody to use, modify and redistribute the code has enabled a collaborative software development model that is unrivalled in its efficacy, efficiency, and inclusivity. However, while AI systems certainly include code, they also include a number of additional elements which fundamentally differentiate them from other types of software, meaning that transplanting the benefits of open-source software to AI systems is not a routine operation. Challenges of open-sourcing AI An AI system is best conceptualised as a collection of components or “artefacts”, which include not only software artefacts encompassing code for training and executing an AI model, but also trainable parameters of the model (such as weights) that have no human-readable source form. Depending on how the AI system is defined, other artefacts can include the datasets used to train the model and software drivers which enable use of the specialised hardware the models are designed to run on. Merely open-sourcing the software artefacts of an AI system does not automatically grant users the essential freedoms associated with open-source licences. To achieve a similar effect, at least some of the other artefacts must also be made openly available. The question is: which other artefacts, and to what extent? Along this line of thinking, academics and legal professionals have suggested that any sensible definition of an “open-source AI system” must entail transparency, reusability, and extensibility, or alternatively transparency, reproducibility, and enablement of the AI system. Legal and practical restrictions can apply to artefacts of AI systems other than code. For example, datasets used to train an AI model may include material that is subject to copyright or other exclusive rights, meaning that a user may not be legally free to use the data, or even to use a model trained using the data. 
Such restrictions potentially deny some users the freedom to run the AI system, which is incompatible with the four essential freedoms and the overall ethos of open source. As another example, training large AI models such as LLMs is impracticable without powerful GPUs or other specialist hardware, whose drivers and firmware are typically not open source. The majority of large “open” AI models today have been trained using Nvidia hardware that operates on proprietary CUDA software. The resulting AI systems arguably do not give users the full freedom to study and modify them. In addition to the above, open-source licences are conditional copyright licences that cannot be applied directly to artefacts that are not subject to copyright. In fact, it is not clear as yet whether any licensable intellectual property right is applicable to trained model parameters. So, for models that differ from others only by their weights, there remains a question of what legal basis there is for imposing conditions on their use. Can the paradox be resolved? Proponents of Ptolemy’s geocentric model of the universe ended up going around in circles seeking to revise prior knowledge in response to newly discovered inconsistencies. Copernicus, instead, was willing to redefine the whole problem in light of fresh evidence. A similar willingness to rethink presumed premises is needed in the definition of ‘open-source AI’. In this regard, the Open Source Initiative (OSI) has made early steps by releasing a first draft definition of open-source AI systems, stating that to be open-source, an AI system needs to make its components available under licences that individually enable the user to study, use, modify and share the AI system. However, there is still some way to go before we have a broadly accepted definition. In the meantime, the “open-source AI” brand will continue to be used inconsistently and enthusiastically by companies of all shapes and sizes. Such is the extent of its marketing and lobbying value. 
### Strengthening Cloud Security As the digital landscape advances, cloud computing has become integral to modern business operations. According to an International Data Corporation (IDC) report, global cloud infrastructure spending is projected to reach $118 billion by 2025. However, the expanding adoption of cloud services also brings heightened risks, including data breaches and cyber-attacks. Thales' Global Cloud Security Study for 2022 revealed that in the past 12 months, 45% of businesses experienced a cloud data breach or failed to conduct audits, marking a 5% increase from the previous year. Katie Barnett, Toro’s Director of Cyber Security, shares her top tips on how to ensure your company's cloud cybersecurity. Develop a Cloud Security Strategy: Create a comprehensive cloud security strategy aligned with your business goals, risks, and resources. This strategy should be a living document and serve as a guiding framework when researching or onboarding any new cloud providers. Assess Your Cloud Provider: Before selecting a new cloud provider, meticulously evaluate their security policies, practices, and certifications, utilising your established cloud security strategy as a blueprint. Ensure compliance with industry standards and regulations such as ISO 27001, PCI DSS, HIPAA, and GDPR (General Data Protection Regulation).
Verify alignment of the provider's platform and technologies with your goals, look for a service development roadmap for long-term compatibility, and scrutinise data policies and security controls for risk-based alignment with your security policies. Don’t forget about fourth-party providers – the downstream suppliers who may play a crucial role in the delivery of your cloud service – assess their reliability, and consider a separate assessment if needed. During due diligence, discuss data backup, recovery, encryption methods, and incident response procedures, establishing a clear service level agreement for mutual understanding. Secure a Data Backup Plan: As highlighted in the first point, understanding the policy on this is critical. Having a robust data backup plan is integral to any cloud security strategy. In the event of data loss, you need to ensure that a solid backup plan is in place to facilitate data recovery and uninterrupted operations. Many organisations do not back up their cloud-hosted data, believing this is covered by their cloud provider. But resiliency in the cloud is not the same thing as having a separate copy of your data, should someone permanently delete it from its cloud location, for example. Therefore, during your due diligence, engage in discussions about data backup, recovery, encryption, and the provider's incident response procedures. Clarify these aspects in a well-defined service level agreement to avoid misunderstandings in the future. Data backup integrity is of paramount importance. Having a solid plan in place for storing data in the cloud, locally, and offline is critical for swift recovery in case of unexpected issues. Work out the specifics of this plan before entering into any partnership. Monitor and Maintain Your Cloud Environment: It is important that, as an organisation, you stay vigilant against potential threats by consistently monitoring and updating your cloud environment. Utilise tools that detect suspicious activities, unauthorised access, or data leakage. It’s also important to regularly patch and update cloud applications, software, and operating systems to address vulnerabilities. Train Staff to Understand Attacks: Provide critical training to staff on identifying and responding to various cyberattacks, including phishing emails and malware. Employees need to be educated about existing and emerging cloud security threats and effective ways to mitigate them. Your people can be your greatest strength but also your greatest weakness when it comes to security, and training will help give them the tools and knowledge to protect both themselves and the business. Encrypt and Secure Your Data: Recognise data as your business's most valuable asset. Check encryption standards for authentication and make sure data is encrypted in transit and at rest. Think about what sensitive data you are storing in the cloud and whether it is in compliance with relevant regulations. Check who has access to it, using the principle of least privilege to manage data permissions and processes for access control. Use Multi-Factor Authentication (MFA): Implement MFA, requiring two or more verification methods for authentication. This adds an extra layer of security beyond traditional username and password authentication. MFA is difficult for cybercriminals to crack without access to personal information, enhancing overall account security.
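For illustration, the sketch below generates the kind of time-based one-time password an authenticator app produces as a second factor, following the widely used RFC 6238 scheme and only the Python standard library. It is a minimal sketch, not a production MFA implementation, and the example secret is obviously not one to use in practice.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Return the current time-based one-time password for a base32-encoded secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval          # number of 30-second steps since epoch
    msg = struct.pack(">Q", counter)                # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                      # dynamic truncation per RFC 4226
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Example secret for demonstration only; real secrets are provisioned per user.
print(totp("JBSWY3DPEHPK3PXP"))
```

Because the code depends on a shared secret and the current time window, a stolen password alone is not enough to authenticate, which is the property that makes MFA so effective.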
Perform Pen Testing to Find Gaps: Regularly conduct penetration testing to identify vulnerabilities in both cloud and on-premises systems. Pen tests help improve overall system security by pinpointing potential weaknesses that attackers could exploit. Integrating hacking simulations ensures ongoing resilience against evolving threats. A proactive and multi-faceted approach is crucial for ensuring cybersecurity in the cloud. By integrating these strategies into your security posture, you will have the necessary assurance that your cloud operations are secure and resilient. Basic security measures, such as data backup, encryption, and employee education, coupled with comprehensive cloud security policies, form the foundation of a robust defence against the evolving landscape of cyber threats. 
### CIF Presents TWF - Andrew Grill In this episode of our weekly show, TWF! (Tech Wave Forum) we want to talk about what's next for GenAI, and the tech trends you should be investing time and resources in for 2024. Our host and CIF CEO David Terrar will be interviewing Andrew Grill, who is a former IBM Managing Partner, an in-demand keynote speaker, a TEDx presenter, and a technology advisor to major corporations. Andrew is the Actionable Futurist®, so as well as getting his take on how GenAI, mixed reality, quantum computing, 5G, and edge computing will develop, he'll give you some homework – five actions to try straight away to help change your thinking. 
### How to Turn GenAI from a Toy to a Real Enterprise Tool Just a year on from its debut, ChatGPT is a phenomenally disruptive technology. CIOs know it can’t be ignored, but it comes with risks. What is the best way forward here? The underlying technology that powers ChatGPT is called a Large Language Model (or LLM). LLMs like ChatGPT are based on a “novel transformer architecture” that allows a system to generate responses rather than predict them as prior NLP (natural language processing) approaches did. Accordingly, LLMs are very large deep-learning models trained on vast amounts of data to be able to generate new content. LLMs can also achieve remarkable results, and have even been able to pass the Turing Test, meaning these systems can pass as offering human-level responses. However, one of the issues plaguing GenAI is hallucinations, where generative outputs can’t be fully trusted. Other challenges include the high costs associated with training and scaling the technology, the difficulties associated with audits, and a corresponding lack of transparency on how the system arrived at its (seemingly plausible) answers. In contexts where compliance or safety is important, we can’t accept the answers we might get from LLMs at face value. This poses a challenge for enterprises aspiring to leverage LLMs more extensively. Many LLM issues can be avoided by combining them with a knowledge graph, vector search, and retrieval-augmented generation (RAG). Unlocking useful insights Let’s look at each of these components individually. A knowledge graph is an information-rich structure that provides a view of entities and how they interrelate. It enables us to represent these entities and connections as a network of curated, deterministic, and verifiable facts, forming a structured graph of collective knowledge. And increasingly, in the context of LLMs, a useful application involves integrating a knowledge graph with an LLM and establishing a fact-checking layer to improve accuracy.
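A minimal sketch of the pattern being described: a toy knowledge graph of curated facts, a simple cosine-similarity "vector search" that maps the terms in a model's answer onto graph entities, and a check of the claimed relationship against those facts before the response is released. The facts, embeddings and the claimed statement are all invented stand-ins; a real system would use a graph database, a proper embedding model and an actual LLM call.

```python
from math import sqrt

# Toy knowledge graph: curated, deterministic facts as (subject, relation, object) triples.
facts = {("aspirin", "treats", "headache"), ("aspirin", "interacts_with", "warfarin")}

# Toy embeddings standing in for vectors produced by an embedding model.
embeddings = {"aspirin": [1.0, 0.1], "warfarin": [0.2, 1.0], "headache": [0.9, 0.3]}

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def nearest_entity(vector):
    """Vector search: map an arbitrary embedding onto the closest graph entity."""
    return max(embeddings, key=lambda name: cosine(vector, embeddings[name]))

def fact_check(subject_vec, relation, object_vec):
    """Ground both ends of a claimed relationship in the graph, then verify the triple."""
    subject = nearest_entity(subject_vec)
    obj = nearest_entity(object_vec)
    return subject, obj, (subject, relation, obj) in facts

# Pretend the LLM claimed "aspirin treats warfarin"; the vectors stand in for its terms.
subject, obj, ok = fact_check([0.95, 0.15], "treats", [0.25, 0.95])
print(f"Claim '{subject} treats {obj}' verified against the graph: {ok}")  # False: caught
```

The point of the layer is visible in the last line: the claim grounds cleanly onto graph entities, but the relationship is not among the curated facts, so the response can be flagged or corrected before it reaches the user.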
The beauty of a knowledge graph is that it can provide context that goes beyond the scope of an LLM. In particular, it’s easy and cost-effective to add data to a knowledge graph in real time, whereas retraining an LLM that is just weeks or months old with new facts is prohibitively expensive. The glue which developers are using to join LLMs with knowledge graphs is embedded vector search. While AI practitioners tend to use vectors to perform neighbourhood searches in a high-dimensional space, those same vectors can be used as keys into the curated, deterministic knowledge graph. This means you can relate terms from the LLM to the knowledge graph and then use graph algorithms and queries to find accurate data to enrich the response. Capping all this is the increasing use of a third LLM-enhancing technology called retrieval-augmented generation (RAG), as exemplified by language model integration frameworks like LangChain. The RAG pattern formalises the interactions with the LLM, using the knowledge graph to fact-check responses from the LLM before they reach the user. Using RAG with a knowledge graph can increase the richness of the LLM's fact-checking. For example, Basecamp Research, a UK-based biological data company, is revolutionising protein discovery from biodiverse natural environments. It uses AI to search for commercially useful molecules and biomes and has built the world’s most extensive natural biodiversity knowledge graph. The company understands the importance of linking LLMs with graph technology, and is now upgrading to a fully LLM-augmented knowledge graph to overcome the limitations of GenAI. Basecamp Research isn’t the only trailblazer regarding the LLM-graph twinning concept. A multinational energy company uses a knowledge graph with ChatGPT in the cloud to create an enterprise knowledge hub too. This year, we will see a lot more of this kind of combination as enterprises look to create intuitive and informative LLM-based decision support systems. 
### CIF Presents TWF – 2023 in Review & 7 Tech Trends for 2024 In this episode of our weekly show, TWF! - Tech Wave Forum, host and CIF CEO David Terrar will explain why 2023 has been a pivotal year and the start of a new tech wave. He reviews this year's season of shows, and then gives us his 7 tech trends to watch for 2024. Join us for next Wednesday's Live show, but then TWF! is taking a holiday break. 
### CIF Presents TWF – Evan Lever In this episode of our weekly interview show, TWF! - Tech Wave Forum, host and CIF CEO David Terrar will interview our member Escrow London's founder and CEO Evan Lever. They'll be talking about how and why he started the business, the important topic of software escrow, as well as data, cybersecurity, malware, legal stuff, the cloud software landscape, and going global.
### Realise the Value of the Connected Workforce As companies embark on their digitalisation journey, they want to optimise processes, minimise downtime and more effectively leverage equipment and teams to remain competitive as costs continue to rise. But many are doing so with a reduced workforce, and they are battling to retain existing expertise while also attracting new workers with the digital skillsets they now need. One aspect of industry digitalisation they should consider is the connected workforce. The connected workforce is one that is digitally linked into the company’s IT and operational technology (OT) ecosystems, allowing workers to seamlessly interact with machines and colleagues and to access the information they need when they need it. By enabling a connected workforce, companies can enhance safety, boost productivity, augment and accelerate training, and empower workers, equipping them with the tools they need to successfully manage their tasks, aiding talent retention and attracting new recruits. Shaping the Connected Workforce Geo-positioning of workers and assets, for example, will allow teams to locate mislaid equipment in the maze of tunnels across a mine to reduce downtime, or to enhance safety and locate colleagues quickly during an emergency at a vast chemical processing plant or port terminal. Predictive maintenance can be implemented, and maintenance workers' tasks streamlined, with access to OT data at their fingertips. For example, using readings from equipment health data, anything that falls outside of an acceptable range can prompt alerts to the appropriate maintenance workers’ devices (a minimal sketch of this alert flow appears at the end of this article). By scanning a QR code on the equipment, workers can access all relevant OT information and, using VR data overlaid on the equipment and activated through their device, can be guided to areas that need attention. Then, if they need further assistance, they can access step-by-step guides, process data, or interact with experts wherever they are, sharing video and OT data to resolve issues promptly to reduce downtime and the impact on productivity. These capabilities will benefit experienced workers and also serve to assist the training of new team members. Planned downtime can be scheduled more appropriately by leveraging equipment health data as well as data from vendors and logistics companies on the availability and shipping of new equipment parts. Training can also be augmented using VR to simulate unplanned events and keep teams safe while ensuring they are better prepared. How companies implement their connected workforce will depend on the type of industry they are in as well as their challenges and business goals, but it will require them to be equipped with the most appropriate devices for the work they do. These could be in the form of ruggedised smartphones and tablets, designed to work in extreme temperatures and harsh environments. Wearable technology such as connected glasses, cameras and remote speaker microphones with push-to-talk capabilities supports hands-free communication. PPE such as helmets with integrated headsets can also allow workers to communicate safely in a hands-free way, for example at a nacelle on an offshore wind farm or in a noisy factory or mine. Requirements for the connected workforce As companies begin their digital transformation and deploy the connected workforce, they must implement technology that allows them to seamlessly evolve, benefitting from: Pervasive wireless connectivity such as that delivered by 4.9G/LTE and 5G private wireless networks.
This is needed to reliably and cost-effectively support the connection of huge numbers of assets and people wherever they are, through workers’ devices, Industrial IoT (IIoT) sensors, legacy and new equipment, and more. Seamless integration with existing networks, including Wi-Fi to support IT and other non-mission-critical services, as well as existing push-to-talk land mobile radio-based networks such as TETRA and P25. An array of compelling Industry 4.0 and digitalisation applications that allow them to achieve their goals, ranging from workers' messaging and communication solutions to worker assistance and beyond. A single connected worker platform that delivers these capabilities, with the ability to scale and adapt as needs evolve. All of this must be supported by industrial on-prem edge computing. This move away from the public cloud to on-premises edge processing is vital in allowing companies to retain OT data sovereignty and security. It also provides a better user experience by removing the delay that occurs when data is processed externally, meaning companies can support the real-time requirements of mission-critical use cases such as autonomous robots and predictive maintenance. The public cloud can then be leveraged for IT services and other non-critical use cases. The connected worker in real-world deployments Companies are already seeing benefits from the connected workforce. Nippon Steel, for example, is using a private LTE network across its manufacturing facility to link sensors, smartphones and cameras for remote monitoring and support. This allows it to improve workforce safety and minimise the impact downtime has on productivity. By implementing a 5G private wireless network, Lufthansa Technik can deliver the high-quality video streams and crystal-clear images required for virtual customer equipment inspections. This wasn’t possible using Wi-Fi, as the metal equipment across the hangar impeded the signals and disrupted the service. Bosch is another company using 5G private wireless, at its Stuttgart-Feuerbach factory in Germany, to enable industrial automation, including an autonomous transport system, to deliver materials exactly where and when they are needed. The case for the connected worker Results from an ABI Research-led study unveiled this year show that companies are at different stages of their digitalisation journeys. Electronics and appliance manufacturers, for example, are leading the way in Industry 4.0/digitalisation technology investments and use case deployments, while manufacturers of heavy equipment can gain by closely aligning their IT and OT environments. Within their digital transformation, the case for the connected workforce is clear. Giving workers access to pervasive, high-quality voice, video and OT data where and when they need it, and in a format that makes sense to them, will dramatically enhance their capabilities. As companies think about the connected workforce, they should consider implementing new technologies to enable their overall digital transformation. It is critical that their teams are integrated into the IT and OT environment, and this must be supported by robust, pervasive networks such as 4G/LTE and 5G wireless, an on-premises cloud using industrial edge computing, ruggedised devices and an ecosystem of compelling applications. Only then can they benefit from a truly connected workforce that is better protected and more productive.
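As promised earlier in this article, here is a minimal sketch of the predictive maintenance alert flow: an OT reading that falls outside its acceptable range is routed to the device of the worker on call for that asset. The sensor names, thresholds, on-call mapping and `notify` stub are all illustrative assumptions rather than any particular platform's API.

```python
# Hypothetical acceptable operating ranges per equipment sensor.
acceptable_ranges = {
    "pump_7.bearing_temp_c": (10.0, 80.0),
    "pump_7.vibration_mm_s": (0.0, 7.1),
}

# Hypothetical on-call assignment of maintenance workers per asset.
on_call = {"pump_7": "maintenance.team.a@example.com"}

def notify(recipient, message):
    # Stand-in for a push notification to the worker's ruggedised device.
    print(f"ALERT to {recipient}: {message}")

def check_reading(sensor, value):
    """Raise an alert when an OT reading falls outside its acceptable range."""
    low, high = acceptable_ranges[sensor]
    if not low <= value <= high:
        asset = sensor.split(".")[0]
        notify(on_call[asset], f"{sensor} reading {value} outside range {low}-{high}")

check_reading("pump_7.bearing_temp_c", 92.5)   # out of range: triggers an alert
check_reading("pump_7.vibration_mm_s", 3.2)    # within range: no alert
```

In a real deployment, the same check would run at the on-premises edge so that alerts are not delayed by a round trip to the public cloud.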
### Unravelling the pitfalls of cloud repatriation
Today, technology is evolving at a dizzying pace. Everywhere we turn, we're presented with new innovations - ChatGPT leading the AI revolution, hyper-connectivity spearheading the continued adoption of IoT technologies, to name a few.
It's an exciting time for tech, but it presents an ongoing challenge for businesses to keep pace with these advancements to boost productivity, increase profits and generate a return on investment. The advent of cloud computing promised a revolution in the way businesses manage their IT infrastructure, offering scalability, flexibility and cost-efficiency. As such, the adoption of cloud-powered technologies has skyrocketed in recent years, with recent research revealing that up to 94% of UK businesses have adopted the cloud in some capacity.
Yet, despite the countless advantages cloud-powered computing can offer, such as more effective data storage and streamlined workplace collaboration, it's certainly not without its challenges. Network latency issues, hefty cost considerations and vendor lock-ins have led a growing number of businesses to consider reversing their cloud migration journey, or moving away from the cloud altogether.
Reducing costs or reducing cloud flexibility?
One of the biggest motivations to withdraw from the cloud is to reduce costs. It's no secret that the global economic climate is turbulent, with inflation and spiralling costs calling budgets into question. Many businesses looked to capitalise on the hype surrounding the cloud, praised for its ease of use and low upfront costs. A lot of businesses bit off more than they could chew and adopted cloud services and products they didn't need, without fully understanding or appreciating the cloud's major perk: the ability to scale easily to meet business demand as and when necessary. With rising business costs, they are not seeing enough benefit to justify this soaring spend, which is making them want to apply the brakes and leading many business leaders to consider repatriating their data.
'Cloud repatriation', the process where companies migrate their data, applications and workloads from cloud-powered infrastructure back to on-premises infrastructure, began as a whisper a few years ago, but it is now gaining major traction as IT budgets come under greater scrutiny. In fact, a survey from 451 Research found that 48% of respondents have repatriated at least some of their workloads from the cloud.
Whilst business leaders may initially be enticed by the return to simplicity and reduction in costs often associated with repatriating, this is a short-sighted approach, and the pitfalls can have serious consequences in the long term. One of the biggest benefits of the cloud is the ability to scale resources up and down when required, giving businesses unmatched flexibility. On-premises infrastructure, by contrast, often requires hefty upfront investment and ongoing maintenance costs, so ensuring it is as cost-efficient as the cloud can be incredibly challenging.
Addressing cloud inefficiencies
In today's fast-paced business environment, the ability to quickly adapt to changing market conditions to meet demand is paramount. This was as clear as day during the height of the pandemic, when countless businesses adopted cloud-powered technologies in numerous shapes and forms.
Research from Centrify supports this, with over half of UK business leaders stating that the transition to the cloud saved their business from collapse during the pandemic. However, as fast as cloud adoption was during that time, numerous businesses didn't follow a clear, well-thought-out strategy, and as a result their use of cloud-powered technology is highly inefficient, leading to 'cloud wastage' and bloated costs.
Before deciding to repatriate entirely from the cloud, businesses should take stock of their cloud services and applications to see if there are any inefficiencies that can be remedied. Whilst the cloud can offer unparalleled flexibility and scalability, there's a likelihood that some businesses are not making the most of the cost optimisation strategies provided by cloud platforms. If they implement an auto-scaling policy, for instance, this could help align their costs with the actual demand they are experiencing.
In addition, the cloud offers a wealth of specialised services, tools and applications, not all of which will benefit a business's efficiency. By conducting a thorough review of the cloud-powered architecture they have implemented, businesses may discover opportunities to refactor or re-architect existing cloud infrastructure to leverage these capabilities more effectively, leading to improved performance and reduced overhead costs.
Complete cloud repatriation should be a last resort
If there's no going back and cloud repatriation is determined to be the strategy for your business's IT infrastructure moving forward, it is important to understand it is far more complex than simply pulling the plug.
It is important to remember that repatriation does not have to be a blanket strategy where all operations are reverted to on-premises. A hybrid approach to cloud utilisation can often work well, as it is possible to repatriate certain components of your IT infrastructure whilst still utilising the cloud for other core operations. It doesn't have to be all or nothing when it comes to the cloud.
Ultimately, whatever decision is made about the cloud, the key to any successful transition is to make it as seamless as possible: if cloud repatriation is conducted poorly, companies will likely experience downtime and service disruptions, negatively impacting their ability to conduct business. Cloud adoption is only going to continue to rise, and businesses that are digitally armed will be better poised to serve their customers and keep up with their evolving demands.
Cloud repatriation is often the result of panic, and for businesses looking to assess their cloud-based operations, it's important that they conduct a full audit with a specialist cloud provider to fully understand their situation and identify any areas experiencing cloud wastage. As such, complete cloud repatriation should be considered a last resort. Whilst maintaining a clear view of the associated costs and benefits, businesses that embrace the full potential of the cloud will be poised to thrive in the digital age.
### Why AI is more than a passing fad
Last year, Collins Dictionary named "AI" its Word of the Year. Generative AI is a new but fast-developing technology, and there are high hopes for its potential.
In the latest budget, Jeremy Hunt committed £500m of investment for AI innovation centres in the UK, probably because the UK's AI market is currently valued at over £18.4 billion and is estimated to grow significantly over the next few years, adding £797 billion to the UK economy by 2035.
If you weren't convinced already that AI is here to stay, we're seeing real demand from businesses for AI to be included in their strategies. AI technology is already being embedded into all facets of our lives, from our homes and workplaces through to some of the UK's largest sectors: retail and finance. While these two sectors may be on different ends of the scale when it comes to regulation and compliance, both are leveraging AI to enhance efficiency, streamline operations and deliver personalised services to their respective clientele.
Reviving the retail industry
It's true, the UK retail industry has been facing some challenges, especially on the high street. The ONS reported recently that retail sales fell again in October, to their lowest level since February 2021, when Covid restrictions remained in place. But fear not, because AI is here to save the day and rejuvenate the retail industry!
The retail sector is actually one of the leading adopters of AI technology. Retailers can use sophisticated recommendation models and other advanced AI-driven personalisation techniques to anticipate customer needs and give them an empathetic and enjoyable shopping experience, ensuring they get what they need as quickly as possible. Imagine a future where AI can blur the line between chat and shopping, with large language models responding to shoppers' questions and making product recommendations in a way that feels natural and appropriate.
And that's not all. Predictive analytics is also playing a vital role in the transformation of the industry. Retailers have access to a myriad of customer data, and they will use this to make more sophisticated predictions about the behaviour of individual customers and the market. This will allow them to spot dangers and opportunities, and act on these before it's too late. The data owned by retailers is a treasure trove that can be monetised in various ways beyond just retail businesses.
Exciting practical applications of AI are already making their way into stores. Computer vision will be a big help in preventing shoplifting. AI can observe and predict in-store movements of people and produce, optimising placement, picking and personnel.
Looking ahead, the number of advanced and experimental AI options is going to increase. Retailers need to be aware of these opportunities and start thinking about prototypes and proofs of concept now. Technology constantly evolves, so getting in on the ground floor can make a huge difference.
Transforming financial services
Now let's talk about the financial services industry, referred to as the "engine room" driving UK growth. With 2.5 million people employed across the UK - over 1.1 million in financial services (FS) and more than 1.3 million in related professional services - the industry produced £278bn of economic output, 12% of the entire UK's economic output, and £100bn in tax revenue.
And just like the retail industry, financial services is also in the process of being transformed by AI. Through the use of AI, financial service providers will be able to build more accurate customer profiles, allowing them to offer a more personalised range of products.
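To make the idea of profile-driven personalisation a little more concrete, here is a deliberately simplified sketch: it scores a small catalogue of products against a customer profile built from past behaviour and returns the closest matches. The feature names, products and weights are all invented for illustration; production systems would use far richer data and properly trained models.

```python
from math import sqrt

# Hypothetical feature vectors: each product and each customer profile is
# described by the same behavioural features (risk appetite, digital
# engagement, savings focus).
PRODUCTS = {
    "cashback_credit_card": {"risk": 0.2, "digital": 0.9, "savings_focus": 0.1},
    "fixed_rate_isa":       {"risk": 0.1, "digital": 0.4, "savings_focus": 0.9},
    "equity_fund":          {"risk": 0.8, "digital": 0.6, "savings_focus": 0.5},
}

def cosine(a: dict[str, float], b: dict[str, float]) -> float:
    """Cosine similarity between two sparse feature vectors."""
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0.0) * b.get(k, 0.0) for k in keys)
    norm_a = sqrt(sum(v * v for v in a.values()))
    norm_b = sqrt(sum(v * v for v in b.values()))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0

def recommend(profile: dict[str, float], top_n: int = 2) -> list[tuple[str, float]]:
    """Rank products by how closely they match the customer profile."""
    scored = [(name, cosine(profile, features)) for name, features in PRODUCTS.items()]
    return sorted(scored, key=lambda pair: pair[1], reverse=True)[:top_n]

if __name__ == "__main__":
    # A customer whose history suggests low risk tolerance and a savings habit.
    customer = {"risk": 0.15, "digital": 0.5, "savings_focus": 0.8}
    for product, score in recommend(customer):
        print(f"{product}: similarity {score:.2f}")
```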
Beyond this personalisation, repetitive data-driven tasks can be automated, freeing up human workers for more high-level work. AI that can read and understand human-generated content will allow for the incorporation of a greater quantity of data and a higher quality of insights than ever. Financial documents can benefit from this, as it allows users to go beyond sentiment analysis and yield insights through the analysis and extraction of valuable data. Finally, as algorithms become more powerful, they will allow more accurate modelling, helping in areas such as fraud detection and portfolio management.
Of course, the monetary stakes are higher in finance, so the risks are also greater. AI is only as good as the data it's trained on, so there's a real danger of predictions being skewed by bad information or biased datasets. But with 28 leading countries signing the Bletchley Park Declaration, there is now a combined global effort to make sure AI is trained ethically and responsibly.
AI's just getting started
AI won't only transform the retail and finance industries but will have a much broader impact on the UK economy. We're just scratching the surface of AI's full potential and its ability to improve various sectors.
These industries will thrive in the AI revolution because many workers have already embraced AI as a major efficiency tool. Unlike previous AI fads that were often imposed from above on sceptical workers, the current AI revolution is being driven from the bottom up. Individuals are realising the value and benefits of AI and are proactively incorporating it into their work routines. It's a collaborative effort that empowers workers to take advantage of the technology.
Businesses must be alert to this new technology as it emerges, and even before it does, to stay ahead of the market and keep their competitive edge, whilst also doing their utmost to understand its ramifications and be at the forefront of good practice. Whatever happens, AI is bound to play a huge role in the workplace of the future. It's just a question of how.
No matter what your views on AI, one thing is certain - it's not going anywhere. AI technology is already deeply embedded in our homes and workplaces, and it will only penetrate further as time goes on. Remember, the power to harness AI lies in your hands. With the right knowledge and understanding, you have the tools to embrace AI and thrive in the AI revolution. So, let's get started on this exciting journey together!
### Designing digital services for the public sector
For many public sector organisations, finding new ways to improve the services they offer is an ongoing challenge. Fundamentally, it's about designing a service that can bring greater efficiencies for the staff managing the service and improve life for the end users engaging with it. The design process itself plays a central role in delivering better outcomes for the public sector.
For many organisations looking to significantly develop their services, there is no one-size-fits-all approach. But there are a few fundamental things to consider when laying the foundations for change. So what are the key design considerations that can help improve public services at scale?
The challenge of transforming services at scale
It's often the larger, inherited, underperforming or disjointed services that hold organisations back. These services will usually have been built up over a number of years, and any issues or problems can be deeply ingrained.
Of course, there are likely to be good reasons why things have evolved in the way they have. Across many areas of the public sector, including central and local government, things like changing policy, rules and regulations all have an impact on how a service is shaped over time.
For organisations looking to transform these large-scale services, an in-depth discovery exercise is needed to kick off the design process and understand the existing landscape, so that all of the basic elements of the service are identified. The first step is to compartmentalise the existing service, breaking it down into its simplest form. This can be challenging, but it's the best way to gain a better understanding of how the service works now.
Once you have each element of the end-to-end service laid out, you can start to build an understanding of the user journey. From the customer experience interacting with the service at the front end to the employees behind the scenes, it's important to look at the journey and identify the pain points along the way. Understanding the user journey is a vital part of the design process, as it enables you to identify how best to achieve the desired goals and outcomes for the service.
Continuity is key, there's no room for interruption
Continuity is important across the public sector, and there are vital services that can't just be switched off while we work on fixing them. Services are live and serving citizens at all hours of the day. The Home Office, for example, works with multiple agencies and public bodies, including HM Passport Office, and is constantly reviewing and approving applications.
Delivering digital transformation for these types of services must be done in a way that avoids disruption. Transitioning a live service over to a new design architecture requires careful planning. It's important that it happens smoothly and that users are able to understand the value of the new design, whether that be enhanced features or a more stable and robust platform.
With a service that's already live, this often means a redesign to make it operate more efficiently, rather than building something completely new. This has the advantage of making the process more fluid, as familiarity with the system remains and staff don't need to learn an entirely new process.
Cultural collaboration is paramount
In our experience of delivering these projects at scale, collaboration and communication are vital to the overall success of the transformation. As well as tasking multidisciplinary teams to lead the delivery, it's about uniting the operational, customer and marketing teams to ensure communication is joined up and key elements of the delivery are agreed and understood throughout. Embedding the project team within the organisation, to fully understand the policy behind the service and ways of working, will help with the overall seamlessness of the delivery.
Understanding the organisation's culture and getting buy-in early on is important, particularly when you need to change mindsets around how work gets done. If staff are used to working in a particular way, their needs have to be understood and designed for, so that in turn change will improve their own role and deliver a better service going forward. Successful change is often more about people than technology.
Keeping communication open and encouraging feedback at all stages means everyone can keep learning throughout the process. Updates and iterative improvements can be made and shared as and when they are needed.
People who are involved will feel valued. Across any organisation, people hold different views about how good design and delivery happens. If the goal is to design a cohesive, joined-up service, then collaboration sits at the centre. At scale, it takes the work of many different people to redesign a service. By approaching the process in this way, your service will deliver greater impact for both your teams and the users who rely on it.
### Evolving legacy tech - the major movement of 2024?
Legacy technology is one of the most pressing issues for many IT teams. Technology teams can find themselves devoting large portions of their budget to legacy maintenance - not to mention finding chunks of their time swallowed up by issues caused by using legacy technology.
Over the last few years, the rise in software-as-a-service (SaaS) and cloud-based options has helped to facilitate an upgrade from legacy technology for many businesses. However, there are still too many organisations grappling with the issues presented by outdated technology that is not fit for purpose. It's previously been estimated that legacy tech is holding back as many as nine out of 10 businesses, and it's proving to be a major barrier to innovation.
Is 2024 the year for legacy evolution?
Fortunately for many IT teams, much legacy tech is rapidly reaching the point of no return - it simply must be upgraded to enable the business to continue to operate. Organisations may have stretched their evolution plans out over a few extra years to cope with other business interruptions (like the pandemic), but now they are officially out of time.
We see this very clearly in the industry where we operate, which is marketing technology and loyalty programmes. Many systems in this area were launched in the early 2010s, so almost 15 years later they are in desperate need of an upgrade. The legacy systems are doing a disservice to the brands they serve - they're slower, clunkier and less performant than they should be. And despite a wealth of talent in the sector and no shortage of newer and cleverer alternative platforms, many brands have stuck with their legacy tech for too long.
There are lots of reasons for this, but perhaps the two key ones are budget and disruption. Legacy upgrades require budget - even SaaS options need funding - and they can be disruptive to the business, both while being evolved and then also in terms of training and upgrades. These two problems have not been solved overnight - so why then are we predicting a year of legacy evolution in 2024?
The clock is ticking
As mentioned earlier, many customer loyalty schemes are reaching 15 or so years of age, and businesses are officially out of time. Having stretched resources (including loyalty programmes) to run a few extra years due to the pandemic, 2024 is likely to be the time when the realisation that time has run out hits. As well as being clunky to use and outdated, legacy programmes run a very real risk of being discontinued by their providers, and those developed in the early 2010s are at the highest risk.
The customer demands it
Legacy tech evolution is needed to allow businesses to keep pace with modern customer demands. Over the next 12 months, legacy tech will be increasingly phased out as businesses put greater focus on fit-for-purpose systems that deliver better results.
With the rising cost of doing business hindering profitability and the cost of living hike impacting customer spend, it will be imperative for businesses to invest in order to survive, and evolving legacy tech is a fundamental part of this.
Loyalty is more important than ever
In the loyalty space specifically, we have observed a very clear trend in recent years - businesses are putting more and more importance on their loyalty programmes. Customer retention is high on the agenda and will remain there for the foreseeable future. For brands across all sectors - both B2B and B2C - the cost of living and cost of doing business crises have pushed loyalty further up the list. It's easier to keep a customer than win a new one, so all eyes are on new ways to do this. Marketing teams' loyalty budgets are growing, and we anticipate a focus on ensuring the building blocks are in place to develop and deliver excellent customer loyalty programmes. That means putting the right technology in place.
Where does this leave IT teams?
This is the year to bid for budget. The stars are aligning and businesses are ready - it's time to push the button and make your case for evolution. IT teams can deliver much more without legacy tech hanging like a weight around their necks; evolving it is a strategic step forward for the IT department as well as the business as a whole.
The playing field is awash with seriously good cloud solutions that can quickly step into the gap created by the retirement of legacy systems. There is an element of research and diligence to complete to find the most appropriate solution for the business, and then it will be handed over to IT to manage the planned, careful rollout of the new system. Whatever your business, it is definitely time to have conversations about legacy evolution - or risk being left behind in 2024.
### Is Ireland the next big gaming hub?
Ireland has emerged as a strong contender in the global gaming industry, drawing the attention of game developers and enthusiasts alike. Supportive government policies and a rapidly expanding talent pool have helped create a vibrant tech scene, positioning the Emerald Isle as a rival to established gaming hubs such as Japan, South Korea and the USA.
The evolving gaming industry
As an industry, gaming has witnessed remarkable growth in recent years and has become a leading form of entertainment worldwide. While console gaming on platforms like PlayStation and Xbox was already hugely successful before the pandemic, and has continued to grow in the period since, many other platforms have started to gain traction in the market.
Mobile gaming has quickly become one of the most popular forms of gaming. Driven by the proliferation of mobile phones, a comparatively low price point and a wide variety of games, the mobile gaming industry unsurprisingly caters to a more diverse customer base and has led to strong growth across the sector.
Elsewhere, e-sports are turning gaming into a competitive arena, and professional tournaments are drawing millions of viewers worldwide, creating dedicated fan bases and substantial prize pools (peaking at $40 million). E-sports have become grand spectacles, with players competing in games such as League of Legends, Dota 2, Counter-Strike: Global Offensive and more. Such events are becoming an industry in their own right, attracting sponsors, advertisers and a range of increasingly sophisticated investors.
Ireland as a natural hub
From the rise of mobile gaming to the emergence of e-sports - and with new technologies such as virtual reality and cloud gaming coming into the mix - the industry has recorded extraordinary growth, with Global Data predicting a $470 billion valuation in 2030 (up from $197 billion in 2021). So where can such a dynamic industry find a home for the next stage of its stellar growth? One country that is becoming an increasingly popular choice is Ireland, due to several attractive attributes.
Pro-business environment
Ireland's pro-business environment is a major draw for gaming companies with international operations and is bolstered by the release of the new digital gaming credit. The digital gaming credit gives eligible companies the opportunity to receive a refundable tax credit equal to 32% of their expenditure on designing, producing and testing a digital game, with a maximum limit of €25 million per project. This means that recipients of the credit could potentially receive up to €8 million, which could prove to be the deciding location factor for companies aiming to scale.
Strong legal foundations
Ireland has a common-law system, which is beneficial for the gaming industry for several reasons. First, it has a specialised intellectual property (IP) Commercial Court that is well known for its effectiveness in dealing with IP infringement claims, supporting the gaming sector by protecting and enforcing IP rights. Second, Ireland's common law regime offers stability and predictability, enabling businesses to plan and manage risks effectively - and since Brexit, Ireland is the sole common law regime within the European Union (EU). Third, the flexibility and adaptability of common law allow the legal framework to respond quickly to dynamic business practices and changing market conditions.
Bridge between Europe and the US
Ireland is strategically located between Europe and North America and is the only native English-speaking country in the EU - an ideal gateway for gaming businesses looking to operate in both regions. With direct flights to all major countries across Europe and North America, Ireland's strong connectivity has already supported hundreds of companies in exploring new markets and improving their global reach.
Ireland's skilled workforce
All this is boosted by Ireland's highly skilled workforce, particularly in the creative, technological and commercial fields necessary for the gaming industry, such as software development, computer science, animation and video production. The country is home to renowned universities and technical institutes such as Trinity College Dublin and Dun Laoghaire Institute of Art Design & Technology, as well as complementary research institutions like Tyndall National Institute. This ecosystem is dedicated to building a highly skilled workforce in Ireland, with the ultimate aim of providing valuable support for gaming and technology companies.
Ireland is one of the best places in Europe, and the world, to produce animation. The Irish passion for storytelling and the arts has created a culture perfectly suited for world-class animation studios to succeed.
The skill base in these fields is demonstrated by the number of award-winning animation studios based in Ireland - such as Cartoon Saloon, a five-time Academy Award®, Golden Globe®, BAFTA and Emmy-nominated animation studio, as well as Boulder Media, famed for animated shows such as Foster's Home for Imaginary Friends and The Amazing World of Gumball.
Looking ahead
Ireland's gaming space has made significant strides in recent years. This is demonstrated by the several industry giants that have established bases of operation in Ireland - such as EA (best known for its sports games), World of Warcraft's Activision Blizzard, and Riot, developer of the global phenomenon League of Legends. These household names have helped to establish a strong cluster and skilled talent pool, leading many award-winning indie games developers and publishers to also choose Ireland as a hub. This includes the likes of Team 17, the developer responsible for Worms, The Escapists, Overcooked and over 90 other titles.
The ecosystem in Ireland continues to grow and is also supporting a wide range of Irish companies. There are currently over 60 gaming start-ups in Ireland - including Vela Games, WarDucks, Pewter Games and more - all of which have greatly benefited from the emergence of a skilled workforce and are eager to scale their operations. As the gaming space continues to expand, the range of supportive government policies, strong legal framework and skilled talent pool will only continue to drive the country's appeal as an attractive location.
### The Open Source Boom and What This Means for the Cloud
Over recent years, open source software has boomed, with 47% of European organisations seeing an increase in value from open source technologies over the last year. Indeed, 70% of enterprises now favour cloud providers with open source technology, and it's clear that open source works well with cloud tech, given that 90% of all cloud infrastructure is now run on Linux.
Despite open source's interoperability, it may not immediately seem to be a natural fit, given that many organisations use different cloud providers for different workloads, more support is usually required for its maintenance, and it often appears far more complex than it actually is. However, we cannot ignore the boom in popularity of open source, particularly in the hybrid multi-cloud space. So let's take a deeper dive into whether open source is a good facilitator for this.
Benefits of open source
Organisations have always tended to sway towards proprietary technologies for a variety of reasons, mainly because the big vendors provide users with the reassurance of a well-known name and a safe pair of hands. However, customers are increasingly finding that when navigating their projects, the supposed support becomes less effective as more features are added and packages become unwieldy and sluggish.
Also, when considering cloud services, one cannot ignore the fees, such as licence and hosting fees, which can soon add up. Similarly, with long-term contracts, spurred on by tempting discounts, organisations often become tied to vendors, vastly limiting their freedom. For these reasons, many communities, and particularly the cloud community, are opting for open source. This preference is highlighted in the Red Hat developer survey, which shows that over three-quarters (77%) of organisations are planning to increase their open source adoption in the next 12 months.
There are multiple reasons for this preference for open source, including the absence of licensing fees and of rising costs per CPU or cluster; in essence, it is free. Furthermore, cloud environments like OpenStack are built for standard infrastructure, whereas other cloud services may require unique technology. Workloads often do run best on specialised infrastructure; however, it's always good to know one's options.
Open source also benefits from thriving user communities sharing best practice and tools, spurring security and transparency. Organisations don't need to take their vendors' word for it that the software is secure; they can view and audit it themselves. In addition, when faced with a challenge, organisations can rely on the open source community, which includes thousands of engineers and other stakeholders, to share solutions, code and approaches to tackling issues, providing rapid responses to vulnerabilities.
One could argue, however, that with proprietary solutions the contract will usually include support, albeit at an extra cost. Open source, by contrast, comes as it is, so support can be more time-consuming to find, although it can be more cost-effective in the long run.
Open source also frees organisations from the perils of vendor lock-in, allowing software to run from any location. Looking to run some Kubernetes? No problem - you can run it from your data centre, from your own platform or even from your kitchen. This is an ideal fit for hybrid multi-cloud environments, where application mobility and freedom are a huge benefit.
Challenges and responsibilities
Despite the clear benefits, open source is not always the answer and can present some challenges when implemented. From the outset, a major challenge is the difficulty of transferring an existing, closed system to an open source one, because creating in-house open source environments requires a large amount of time, human resource and investment.
Responsibility is another key aspect of open source to consider. Open source communities encourage and thrive on collaboration, but providers often exist to make a profit, which can result in a one-way benefit. It's critical for vendors to actively support the open source communities they are utilising. This could involve promotional events and contributing or sharing code. A mutually beneficial relationship is essential to the spirit of open source, rather than one where businesses profit off a community project built on a genuine desire for innovation and goodwill.
Both open and closed source infrastructure solutions produce challenges and pitfalls in hybrid multi-cloud setups. For instance, it is vital not to put different application components in different clouds, as this hugely reduces the functionality of applications.
Despite the challenges one must consider with open source, it is still a favourable solution for hybrid multi-cloud environments. Its offering in terms of flexibility, collaboration and community support arguably beats that of closed source. However, as with most solutions, careful consideration is needed. Perhaps most importantly, organisations must make sure to support the open source community and its principles of empowerment, accountability and trust, to ensure that open source continues to deliver both today and in the future.
### Revolutionising Land Transport: Key Tactics for Enhancing Efficiency in Logistics
The logistics industry has faced unprecedented challenges and obstacles in the past few years, beginning with the pandemic. We're still feeling the ripple effect on supply chains, even years later, along with the ongoing repercussions of elevated prices, continued global developments and a possible looming recession.
In response, logistics providers have become more innovative to adapt and keep goods moving. One of the key aspects is land transport, which fills the gaps left by air and ocean freight and boosts the efficiency of the entire supply chain. Land transport is not without its own challenges, however, including labour shortages, high fuel costs and limited capacity. But there are strategies to employ to streamline land transport and boost efficiency.
The Current State of Land Transport
When the world shut down during the pandemic, consumers shifted from in-person shopping to online shopping. The sudden rise in purchases put high demands on warehouse inventories, both domestic and foreign, and strained transportation networks. Logistics providers had to pivot, offering courier services, less-than-truckload (LTL) services and more, all to meet the dramatic shift in demand. Web-based logistics tools and other innovative technological advancements have allowed logistics companies to not only survive but thrive in this volatile market.
Even when the world opened back up, the e-commerce trend didn't slow down. It's still moving forward, creating more and more pressure for transportation providers to continuously innovate and improve the speed and quality of their services, while still delivering value to the customer. There are several strategies you can employ to keep your land transport moving efficiently, including:
Route Optimisation
Effective route planning and optimisation is important for minimising transportation costs, optimising fuel usage and maximising delivery speed. Using tailored software solutions, you can determine the most efficient routes based on information like traffic conditions, distance and delivery time windows. This reduces idle time and the unnecessary detours or traffic backups that can threaten rapid delivery.
Real-Time Tracking and Visibility
Real-time visibility is a necessity in modern logistics. You can track and trace the movement of goods from suppliers, manufacturers, warehouses and hubs to the end customer in real time, using GPS tracking and radio frequency identification (RFID) to plan, schedule and monitor the logistics process at every point. With real-time visibility and tracking, you can get the status of a shipment, regulatory information, raw materials and more, helping you avoid delayed shipments and lost profits.
Shipment Consolidation
Partial shipments, or shipments that don't fill capacity, are a common issue in supply chains. Consolidated shipping, also referred to as less-than-container load (LCL) or less-than-truckload (LTL) shipping, combines multiple shipments from various shippers into one full container or shipment. Consolidated shipments offer bulk rates that benefit both the business and the end customer. Businesses with a high frequency of smaller shipments can make the most of an entire truckload, even with goods in multiple locations, reducing costs without causing delays. Partner with an experienced logistics provider to help with shipment consolidation and ensure you're making the best decision for your freight load and business needs.
Last-Mile Delivery Optimisation
The last mile is one of the most expensive points in the delivery process. The challenge is to reduce those last-mile delivery costs while consistently showing up on time to keep customers happy. To make matters worse, shipments are getting smaller, and more and more e-commerce companies are under pressure to offer next-day or same-day delivery to compete with rivals, leading to more individual deliveries.
Route planning and optimisation can assist with efficient last-mile delivery according to time, distance and cost, considering possible delays from road restrictions. Route optimisation also takes into account delivery windows, load capacity, current traffic and other factors that can impact the last mile.
Data Analytics and AI
Data analytics and predictive modelling provide insights to optimise road freight operations based on a wealth of real-time and historical data. Models consider patterns and forecast demand to develop an optimised solution that avoids potential disruptions or minimises their impact. AI and IoT can also be used to create more efficient processes, reducing errors that can lead to delays and lost revenue. With automation, you can optimise your supply chain based on evolving market conditions and respond to changes quickly.
Efficient Warehouse Operations
Efficient warehouse operations are an important part of road freight efficiency. Strategies like optimised storage layouts and adopting warehouse management systems can streamline fulfilment processes, reduce handling time and minimise inventory holding costs. Automated material handling systems and barcode scanning systems also help with organisation and operational efficiency. Fulfilment becomes easier, preventing unnecessary delays in the process that disrupt your shipping.
Intermodal Logistics
Intermodal logistics is a popular way to transport goods, especially when there are continued bottlenecks and shortages in the global supply chain. This approach combines different transportation modes to ensure on-time delivery - including land transport like trucking and rail - without exorbitant costs. With a seamless integration of road, rail, air and sea transport, intermodal logistics offers the flexibility to adapt to new challenges and disruptions while minimising environmental footprints through optimised use of resources.
Proactive Maintenance and Repairs
With increased supply chain demands and pandemic-related disruptions, the truck shortage is persisting into the end of 2023. Maintaining a functional fleet is crucial for efficient land transport, especially with trucks and space at a premium. Proactive maintenance and repairs can reduce downtime, prevent unexpected breakdowns and extend the lifespan of vehicles. Instead of dealing with a major repair in the middle of a busy time - or worse, multiple issues with multiple trucks - you can plan your inspections, repairs and maintenance work to rotate your trucks and improve your fleet reliability.
Support Efficient Land Transport
Maximising your road freight efficiency is critical for logistics companies, especially as the industry experiences ongoing disruptions. Optimising routes, investing in technology and taking proactive approaches to your fleet can help you streamline your land transport and stay ahead of your competitors.
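To make the route optimisation idea described above a little more concrete, here is a deliberately simplified sketch of a nearest-neighbour heuristic: given a depot and a set of delivery stops, it repeatedly drives to the closest unvisited stop. The stop names and coordinates are invented; real routing engines also weigh live traffic, time windows and load constraints, but the sketch shows the basic shape of the problem.

```python
from math import dist  # Euclidean distance, available since Python 3.8

# Hypothetical delivery stops as (x, y) grid coordinates; a real system would
# use road distances or travel times from a mapping service.
STOPS = {
    "depot":      (0.0, 0.0),
    "customer_a": (2.0, 3.0),
    "customer_b": (5.0, 1.0),
    "customer_c": (1.0, 7.0),
    "customer_d": (6.0, 6.0),
}

def nearest_neighbour_route(start: str, stops: dict[str, tuple[float, float]]) -> tuple[list[str], float]:
    """Greedy heuristic: always drive to the closest unvisited stop next."""
    route = [start]
    remaining = set(stops) - {start}
    total = 0.0
    while remaining:
        current = stops[route[-1]]
        nxt = min(remaining, key=lambda name: dist(current, stops[name]))
        total += dist(current, stops[nxt])
        route.append(nxt)
        remaining.remove(nxt)
    return route, total

if __name__ == "__main__":
    route, distance = nearest_neighbour_route("depot", STOPS)
    print(" -> ".join(route), f"(total distance: {distance:.1f})")
```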
### Battle-Ready Cybersecurity: Top 4 Tactics to Empower Teams Against Cyber Attacks
Royal Mail, the UK Electoral Commission, genetics firm 23andMe, Microsoft: just some of the brands and organisations to suffer major cyber attacks in the past year. Global research that we've just carried out at Fastly reveals that the increasingly complex threat landscape has been a big concern for cybersecurity professionals (35%) over the last 12 months, and even more (37%) feel it will continue to drive threats in the coming year.
Security teams are understandably focused on the particular threats they are most at risk from. Our research shows that the most common attacks this year were ransomware - experienced by 29% of businesses - DDoS (28%) and attacks related to open source software (25%), followed by social engineering (22%) and API/web application-related attacks (20%). As varied as these threats are, they really only scratch the surface of a broad and ever-changing array of methods criminals can tailor to their targets.
Certain sectors are more susceptible to particular types of cyber attacks; for example, media and entertainment companies are more likely to be targeted with social engineering attacks. Another worrying trend is the overall increase in alarm about bad actors using social engineering attacks, often as a route to ransomware, to gain access to organisations' data and finances. Twenty-nine per cent of organisations flagged this as a priority threat, compared with 23% in 2022.
Boosting defences by increasing cybersecurity headcount is one obvious way to deal with increasingly sophisticated cybersecurity threats; our survey revealed that 48% of businesses increased their spend on new talent over the last year. However, only 36% feel these hires possess the necessary skills to protect the business. Meanwhile, nearly half of the cybersecurity professionals surveyed are worried about the ability of their existing talent pool to deal with threats arising from emerging technologies.
In summary, cybersecurity teams are under more pressure than ever before to combat a growing range of increasingly sophisticated security threats, while feeling less resourced than ever to be able to do so. Here are four ways for companies to square the circle and ensure cybersecurity teams are adequately equipped for future cyberattacks:
Prioritise cross-organisational security
When facing a wide range of threats, it's important to equip as much of the business as possible to meet these challenges. This requires a twofold process of hiring the right experts and ensuring internal security processes are up to standard.
Recruitment can be complicated in cybersecurity - a third of team leaders felt that security issues in the last 12 months were directly attributable to the talent shortage. What's more, the skills cybersecurity teams require have to flex to match the developing threat landscape. Because of this, it's best to focus hiring practices around talent that understands emerging technologies. Cybersecurity experts agree: over a third (36%) of the businesses we surveyed say that high-quality recruitment tops their investment wishlists in the coming year.
In terms of shoring up internal security processes, businesses need to be rigorous with training and establish clear security protocols for employees to follow in the event of a cyber attack.
Given the increase in social engineering attacks, training plans should be reinforced across the entire organisation to make sure that all employees recognise the signs of an ongoing threat and can take appropriate action.
Move cybersecurity accessibility up the agenda
As we have seen, an organisation's security posture is not just down to the experts: security strategies must involve all employees in protecting the business. In view of this, accessibility plays a key role in making the overall business more secure. If regular workers don't understand their role in preventing cyber attacks, or are unable to use the cybersecurity tools at their disposal, they can be much more easily targeted and leave the business more vulnerable.
Struggles with the transition to hybrid work have highlighted the urgency of accessibility, as 78% of security professionals believe hybrid workers are more difficult to secure. For this reason, it's encouraging to see that 35% of security professionals aim to make cybersecurity more accessible in order to meet usability requirements and boost their cybersecurity posture.
Embrace a Secure by Design approach
It's tempting for cybersecurity teams to throw money at the problem of shoring up defences against growing cybersecurity threats - in fact, 76% of cybersecurity professionals plan to do just that by increasing cybersecurity spending in the next 12 months. Secure by Design is an alternative to spiralling cybersecurity expenditure. This mindset involves designing security into the core of any project right from the outset. Addressing and preparing for security hazards when designing a product or system shifts the need for human action further away from the stack, which means that security does not have to rely on human perfection to succeed.
There are two ways to embrace a Secure by Design approach: the first is through solutions that eliminate hazards, and the second is via solutions that reduce hazards. Eliminating hazards is all about ensuring solutions don't rely on human behaviour. For example, isolating access to financial data in payment apps by using a more modular application architecture removes hazards for security teams rather than making them act on them.
Reducing hazards by design is about identifying problems and choosing safe, reliable technologies and methods to build each component of the cybersecurity architecture. This could include avoiding code written in languages that lack memory safety, like C or C++. Instead, reducing hazards by design would involve isolating unsafe code or refactoring systems into a memory-safe language. This approach ensures the stack is built with security in mind, reducing the unnecessary costs of a reactive strategy.
Leverage generative AI to develop training programmes
Generative AI is viewed as a bit of a mixed blessing by survey respondents, with over a third (37%) looking to define a new security approach as it relates to generative AI. More positively, generative AI can be leveraged as a tool to counteract the strain on security teams. Over two-fifths (43%) of security professionals recognise this and expect a productivity boost as the technology is more widely adopted.
One area where cybersecurity professionals expect generative AI to have a big impact is in training and development, with generative AI's content development potential coming to the rescue of security professionals tasked with writing training programmes.
Our survey revealed that the implementation of generative AI-created training programmes will become more prevalent within the industry. In fact, 36% of security professionals predict that generative AI will allow them to train their colleagues more effectively in cybersecurity basics. This development will offer significant support when it comes to fostering security-first mindsets throughout organisations. Small wonder that 42% of security professionals believe that, deployed correctly, AI is likely to be an effective tool when it comes to protecting their businesses.
Failing to prepare is preparing to fail
Security teams don't need to spend wildly to combat the growing danger from cyber attacks. By prioritising cross-organisational security and accessibility, embracing Secure by Design and using generative AI for training, companies can build security resilience throughout the entire organisation. This approach equips security teams to prove their value beyond the point of an actual attack. That said, the numbers show that bad actors are only getting better, so the time for action is now.
### What's next for digital transformations? Digital Transformation Trends 2024
Digital transformation is becoming increasingly critical for almost every company looking to enhance its productivity. It is not only essential for start-ups to consider what's next for digital transformation; scale-ups and long-established companies must consider its impact too. New competitors are harnessing the benefits of digital transformation, leading long-established companies to explore what its future will mean for them. That is why it is essential to stay ahead of the upcoming trends in digital transformation and understand the predictions for the upcoming year…
Artificial Intelligence and Machine Learning
Artificial Intelligence (AI) has been the biggest talking point in digital transformation conversations, and it will only continue to grow. As more businesses integrate AI and machine learning into everyday practices, we will see more automation, predictive analytics and personalised user experiences in the upcoming year. Generative AI tools such as ChatGPT and Google Bard will continue to improve as more users from various industries effectively train them. AI art - such as the tools Bing has already made available - will continue to have an effect on creative industries.
Cloud Computing Adoption
In recent years, many companies have increasingly adopted cloud computing services such as Amazon Web Services, Microsoft Azure and Google Cloud for scalable and flexible infrastructure. This is likely to continue in 2024, especially with the continued exploration of quantum computing capabilities delivered through cloud services, with the aim of providing access to quantum computing resources for certain applications. We may also see more personalised and industry-specific solutions tailored to the unique needs and compliance requirements of different sectors, fostering more widespread adoption across various industries.
More companies will also adopt hybrid cloud environments, combining on-premises legacy infrastructure with public and private cloud services. Multi-cloud strategies are likely to become more prevalent as businesses leverage services from multiple cloud providers to avoid vendor lock-in and optimise performance.
5G Developments
The widespread rollout of 5G will drive further developments in enhanced connectivity, and further possibilities for the Internet of Things (IoT) will be explored in 2024. The IoT refers to a network of interconnected physical devices that communicate and exchange data with each other through the Internet; these devices can perform various tasks without direct human intervention. While the development of IoT still presents challenges related to security, privacy and interoperability that must be resolved for widespread and secure adoption, it is likely these problems will be tackled over the upcoming year.
The further development of 5G is also likely to have an effect on edge computing. Localised computational power and data storage will be elevated by 5G, as it provides the necessary infrastructure for high-speed, low-latency and reliable communication, which is crucial for applications demanding real-time processing and rapid data exchange.
Focus on cybersecurity
As digital systems such as AI and cloud computing become more common, and more sophisticated, the call for more robust cybersecurity measures is likely to increase. Regulators will likely focus on protecting user data, privacy and digital assets. Worldwide, we are likely to see governments bring in laws to regulate the use of digital systems, especially AI. Companies such as CrowdStrike and Darktrace already offer sophisticated threat detection systems which use AI and machine learning algorithms to identify and respond to evolving cyber threats. These systems are likely to advance and see more investment over the next year in order to face developing cyber threats.
Decentralised Finance
Blockchain technology will develop and find broader uses beyond its original foundations in cryptocurrencies. One of these further developments will be decentralised finance (DeFi). While it's still at a relatively early stage, we are likely to see more prominence for DeFi. Decentralised lending, borrowing and trading are already becoming more commonplace, and this is likely to see more use in the upcoming year. DeFi company Aave is already innovating with "Flash Loans", which allow users to borrow without collateral as long as the borrowed amount is returned within the same transaction. DeFi is another sector of digital transformation which will see stricter regulatory scrutiny and compliance, but once the legal frameworks are in place, it could represent the future of finance.
Real-world asset tokenisation
Another upcoming digital transformation trend is real-world asset tokenisation, which involves digitally representing ownership of, or rights to, actual assets such as art and property using blockchain technology. It has the ability to be a major disruptor in real estate and art dealing, as tokenising allows for fractional ownership, enabling individuals to own a fraction of an asset.
Real-world asset tokenisation also supports a globalised economy, as investors can own and trade tokenised assets from anywhere in the world. It also leverages the blockchain for more efficient, streamlined processes without intermediaries, and security is enhanced because the transaction history of tokenised assets is securely recorded and verifiable. Real-world asset tokenisation is a major contribution to the larger landscape of digital transformation in the financial and asset management sectors.
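As a rough, purely illustrative sketch of the fractional ownership idea described above, the snippet below models an asset split into a fixed number of tokens with a simple in-memory ledger. It deliberately ignores everything a real blockchain-based system would add (consensus, smart contracts, custody and regulatory controls), and the asset names and token counts are invented.

```python
class TokenisedAsset:
    """Toy ledger for a real-world asset split into a fixed number of tokens."""

    def __init__(self, name: str, total_tokens: int, issuer: str):
        self.name = name
        self.total_tokens = total_tokens
        # All tokens start with the issuer; holdings maps owner -> token count.
        self.holdings = {issuer: total_tokens}

    def transfer(self, sender: str, recipient: str, tokens: int) -> None:
        """Move tokens between owners, refusing overdrafts."""
        if self.holdings.get(sender, 0) < tokens:
            raise ValueError(f"{sender} holds fewer than {tokens} tokens")
        self.holdings[sender] -= tokens
        self.holdings[recipient] = self.holdings.get(recipient, 0) + tokens

    def ownership_share(self, owner: str) -> float:
        """Fraction of the underlying asset attributable to an owner."""
        return self.holdings.get(owner, 0) / self.total_tokens

if __name__ == "__main__":
    # A property notionally split into 1,000 tokens; two investors buy fractions.
    flat = TokenisedAsset("dublin_flat_12b", total_tokens=1_000, issuer="issuer")
    flat.transfer("issuer", "investor_a", 250)
    flat.transfer("issuer", "investor_b", 100)
    print(f"investor_a owns {flat.ownership_share('investor_a'):.1%} of {flat.name}")
```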
Overall, it's essential to understand and keep track of digital transformation trends to both scale and future-proof your business. Many of these trends have the ability to majorly disrupt their industries, and it is important to know how to leverage these new technologies to improve your business rather than be left behind in this era of change.
### The Impact of AI on Social Media
Social media content isn't solely steered by experienced Social Media Managers anymore. Thanks to the rise of AI, creators everywhere can take advantage of its ability to streamline and automate processes. However, we would be lying if we said AI is without its challenges, concerns and limitations.
Artificial Intelligence has become a game-changer, shaking up how users engage and how content is crafted. Whether creators are using AI tools like Content Swarm, an app that supercharges social engagement and sales, or just jazzing up content for a smoother user experience, AI's influence is spreading globally. But how can we use it for good?
AI-Driven Tools
According to Ross Webb, Senior Product Leader at Content Swarm, AI-driven tools play a transformative role in enhancing successful LinkedIn strategies. He shared, "LinkedIn has doubled down on its own commitment to AI. What I have seen is that AI is a great tool to help users speed up what works on LinkedIn." Webb highlights the efficiency of AI in commenting, stating that it "speeds this up" significantly. AI's ability to summarise articles and suggest comments swiftly becomes a valuable asset, allowing users to post more often and connect with a broader audience for improved sales outcomes.
Adapting to Algorithm Changes
In addressing the challenge of frequent algorithm changes, Ross emphasises the importance of consistency. He states, "Because we have access to the LinkedIn APIs, we ensure that we are always consistent with the requirements of the platform." Webb highlights the significance of playing within the rules of the LinkedIn platform to prevent account bans, stressing the critical nature of maintaining engagement amid dynamic platform changes.
The Future of AI in Social Media
Speaking on the future of AI in social media, Webb cautions against over-reliance on AI for content creation. He states, "The challenge with AI is a lot of people simply type in a few words into ChatGPT and expect a great result." Webb stresses that a human touch is crucial for standing out in a sea of automated content. Alexe Jugaru, Business Development Executive at Content Swarm, echoes this sentiment, stating, "AI is great to increase productivity, but it's not great to increase empathy." Alexe uses AI in his daily role to help build his personal brand and streamline his activity to fit alongside his sales role.
Emerging Trends in AI for Social Media Strategy
Anticipating broader trends in AI adoption, Jugaru highlights key buzzwords such as automation, productivity tools, generative AI and integrations with real-life experiences. He envisions companies tailoring processes based on customer profiles, emphasising the need for a human touch to enhance customer experiences.
What to avoid
Particularly on LinkedIn, there's a noticeable surge in content generated by AI from individuals, resulting in a large amount of 'noise' that falls short of delivering substantial value. Many users have taken notice and shared their sentiments about encountering AI-generated content.
In such cases, people tend to disengage when they can spot a post crafted using AI, sensing a lack of authenticity and that personal touch, ultimately eroding trust over time. Written content is slowly becoming untrustworthy, whereas video content is thriving in comparison. However, AI shines when it comes to companies aiming to scale up their content production, such as boosting output from 5 to 25 posts per month. While it requires a well-thought-out strategy to avoid generating empty 'noise,' AI proves to be a valuable asset for marketers, solopreneurs, and professionals, generating a continuous flow of innovative ideas with just a fraction of the time investment required from a human. It is particularly impactful for start-ups and SMEs that are rapidly growing and need support online. Emerging AI trends AI is always changing, serving as a hub for innovation where every day introduces something new. Tools like ChatGPT are at the forefront, leading the way and nudging businesses to cast away their reservations and fully embrace AI. There's a sense of urgency not to be left behind in this fast-paced tech evolution. Operating within the AI domain ourselves, we have a front-row seat to witness the unfolding trends and the shifts in user engagement and content creation. Looking ahead, the fusion of AI with up-and-coming tech like Virtual Reality is proof of the evolving story. Companies, realising the potential for ground-breaking changes, are more and more inclined to tweak their processes to match the unique tastes of each customer. The blend of VR and personalisation isn't just a game-changer for user experiences; it's a redefinition of how businesses connect with their audience. Finally, Alexe Jugaru shares a relatable wish that hits home for all content creators—hoping for a “tool that can quickly clear your social media of bot followers.” As AI continues shaping the future, the desire for tools dealing with practical issues like bot followers reveals the human side of tech progress. It's a reminder that even in the AI realm, being relatable and prioritising user experience remains key. ### Taking it personally - the future of eCommerce The past four years have seen some of the most dynamic changes in eCommerce since the birth of teleshopping 45 years ago. eCommerce is now mainstream, but it’s no longer limited to selling online. This is forcing ecommerce platforms to evolve. eCommerce isn’t confined to online channels. Customers today might begin their buying journey on a website, do product research in a physical store, and complete their purchase via a smart speaker. M-commerce is accelerating the adoption of AR, because mobiles have in-built cameras and microphones. AR is a great fit for ecommerce. Being able to see products in a live environment, trying out different colours, sizes, and other variants, is a great way to enhance the try-before-you-buy process in a digital setting. These fluid customer journeys are driving merchants’ adoption of headless technology, which decouples back-end databases from public interfaces. Headless technology provides merchants with the flexibility to engage with customers on their preferred channel, or device, while providing consistent content and a seamless customer experience across all of those channels.
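A rough sketch of the headless principle helps here: a single back-end source of product content is consumed by whichever front end the customer happens to be on. The service, channels and product data below are illustrative assumptions rather than any specific platform's API:

```python
# One channel-agnostic back end, several presentation layers consuming the same data.

PRODUCTS = {
    "sku-101": {"name": "Linen armchair", "price_gbp": 499, "colours": ["oat", "sage"]},
}

def get_product(sku: str) -> dict:
    """Back end: returns the same product data regardless of channel."""
    return PRODUCTS[sku]

def render_web(product: dict) -> str:
    """Front end 1: a web page snippet."""
    return f"<h1>{product['name']}</h1><p>£{product['price_gbp']}</p>"

def render_voice(product: dict) -> str:
    """Front end 2: a smart-speaker response."""
    return f"The {product['name']} costs {product['price_gbp']} pounds. Shall I add it to your basket?"

product = get_product("sku-101")
print(render_web(product))
print(render_voice(product))
```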
Making the journey comfortable Our digital agency partners are seeing increasing traffic coming from physical stores, which indicates that customers are mixing the in-store experience with the digital experience, to carry out price comparisons for example, or check product availability. A great example of this is the HSL Chairs website, developed by digital agency, McCann Manchester, which undertook extensive research into user experience (UX) and accessibility, allowing the team to successfully carry the in-store feeling of personal service over to the digital experience. The integration of product information management (PIM) software with Umbraco allows content editors to continually update the site with new product information and enables a faster workflow. Blending physical and digital commerce Another good example of blending in-store experiences with ecommerce is the virtual assistant that digital agency, Enjoy Digital, built for paint company, Valspar. Enjoy Digital, built the virtual assistant using Umbraco’s Heartcore headless CMS and custom-built microservices integrated with Elasticsearch. This allows customers to upload a picture of their home and visualise interior design projects in-store, or on-screen at home, selecting from 2,000 paint colours. The Umbraco Commerce integration also allows customers to order their paint through B&Q. Personalisation depends on machine learning In the digital realm, machine learning is the secret sauce that makes personalisation possible, because it enables better product recommendations. In this way, customers gain the in-store experience of talking to an expert, who listens to them and understands their needs and uses this information to recommend the right products and accessories. Recommendation engines process vast amounts of data, and apply item-to-item collaborative filtering, so that each customer is offered a personalised selection of products that are most likely to meet their requirements, based on their previous purchasing and browsing history. Especially in the fashion industry, return-rates are extremely high, because customers buy multiple sizes to try out and return the items that don’t fit. This is hugely wasteful, massively increases the carbon footprint for each purchase, and is not sustainable for businesses or for the planet. The ecommerce industry needs to find better ways to handle this issue. Personalisation is one solution. Respecting ePrivacy The flipside of making web experiences more personalised is that recommendation engines rely upon data collected by cookies spread across multiple websites. The major e-commerce challenge is how businesses can map the customer journey to enable product recommendations and personalised customer experiences, while still complying with the EU ePrivacy Directive, otherwise known as the ‘Cookie Law’, and without adding friction to the buying process by requiring user logins, or other pain points. The opportunities for ecommerce businesses include creating user-friendly cookie consent tools, which provide a much smoother user experience, while still complying with the law. The sites that provide the best experience are likely to gain repeat custom and a competitive edge. Composable Commerce To cater to evolving buyer behaviour, organisations are adopting composable commerce platforms that allow them to integrate software components that meet their customers’ current requirements, with the flexibility to swap them out for new components in the future. 
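As a minimal sketch of that composable principle, the example below puts site search behind a small interface so the component can be swapped without touching the rest of the storefront. The providers and catalogue are stand-ins invented for illustration, not real vendor integrations:

```python
from typing import Protocol

class SearchProvider(Protocol):
    def search(self, query: str) -> list[str]: ...

class BasicSearch:
    def __init__(self, catalogue: list[str]):
        self.catalogue = catalogue
    def search(self, query: str) -> list[str]:
        return [item for item in self.catalogue if query.lower() in item.lower()]

class RankedSearch:
    """A drop-in replacement that ranks closer matches first."""
    def __init__(self, catalogue: list[str]):
        self.catalogue = catalogue
    def search(self, query: str) -> list[str]:
        matches = [item for item in self.catalogue if query.lower() in item.lower()]
        return sorted(matches, key=lambda item: item.lower().find(query.lower()))

def build_storefront(search: SearchProvider):
    # The storefront depends only on the interface, so the component behind it can change.
    return lambda q: search.search(q)

catalogue = ["Oak shutter", "Walnut shutter", "Shutter fixing kit"]
storefront = build_storefront(BasicSearch(catalogue))
print(storefront("shutter"))
storefront = build_storefront(RankedSearch(catalogue))   # swapped with no other changes
print(storefront("shutter"))
```

Because the storefront only depends on the interface, swapping one search component for another requires no re-platforming, which is the flexibility the composable approach is aiming for.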
Taking this composable commerce approach allows organisations to more quickly and economically incorporate fresh DXP elements, such as PIM, DAM, automation, AI and personalisation software, without having to go to the time and expense of re-platforming. Digital agency, true, selected a composable architecture when developing an ecommerce site for international window shutters manufacturer, TCMM Shutters, because it provided the flexibility to integrate Workbooks, CrazyEgg, Struct, Umbraco Commerce, Taxjar, Algolia, uMarketing Suite, Azure: where every element does exactly what it is required to do. Conclusion: Consumer behaviour drives technology development. The rapid advances in ecommerce over the past four years have only happened because more people were making online purchases. With more than a quarter of sales now taking place online, organisations have to meet customers where they are, whether that’s online, in-store, on a headset, or on a smart speaker. API-based integration, composable architectures and headless technologies provide organisations with the flexibility to respond to these rapid changes, so that individual customer needs are met and business opportunities are not lost. ### Is Ireland the next big gaming hub? Ireland has emerged as a strong contender in the global gaming industry, drawing the attention of both game developers and enthusiasts alike. Supportive government policies and a rapidly expanding talent pool have helped create a vibrant tech scene – positioning the Emerald Isle as a rival to established gaming hubs such as Japan, South Korea, and the USA. The evolving gaming industry As an industry, gaming has witnessed remarkable growth in recent years, and has become a leading form of entertainment worldwide. While console gaming like Playstation and Xbox was already hugely successful before the pandemic, and continued to grow in the period since, many other platforms have started to gain traction in the market. Mobile gaming has quickly become one of the most popular forms of gaming. Driven by the proliferation of mobile phones, comparatively low price point and wide variety of games, the mobile gaming industry unsurprisingly caters to a more diverse customer base and has led to strong growth across the sector. Elsewhere, e-sports are turning gaming into a competitive arena and professional tournaments are drawing millions of viewers worldwide, creating dedicated fan bases and substantial prize pools (peaking at $40 million). E-sports have become grand spectacles, with players competing in games such as League of Legends, Dota 2, Counter-Strike: Global Offensive and more. Such events are becoming an industry in their own right, attracting sponsors, advertisers and a range of increasingly sophisticated investors. Ireland as a natural hub From the rise of mobile gaming to the emergence of e-sports – and with new technologies such as virtual reality and cloud gaming coming into the mix – the industry has recorded extraordinary growth, with Global Data predicting a $470 billion valuation in 2030 (up from $197 billion in 2021). So where can such a dynamic industry find a home for the next stage of its stellar growth? One country that is becoming an increasingly popular choice is Ireland, due to several attractive attributes. Pro-business environment Ireland’s pro-business environment is a major draw for gaming companies with international operations and is bolstered by the release of the new digital gaming credit. 
The digital gaming credit gives eligible companies the opportunity to receive a refundable tax credit equal to 32% of their expenses on designing, producing and testing a digital game, with a maximum limit of €25 million per project. This means that recipients of the credit could potentially receive up to €8 million, which could prove to be the final deciding location factor for companies aiming to scale. Strong legal foundations Ireland has a common-law system which is beneficial for the gaming industry for several reasons. First, it has a specialised intellectual property (IP) Commercial Court that is well-known for its effectiveness in dealing with intellectual property infringement claims, supporting the gaming sector by protecting and enforcing IP rights. Second, Ireland’s common law regime presents stability and predictability, enabling businesses to plan and manage risks effectively – and since Brexit, Ireland is the sole common law regime within the European Union (EU). Third, the flexibility and adaptability of common law allows the legal framework to quickly respond to dynamic business practices and changing market dynamics. Bridge between Europe and the US Ireland is strategically located between Europe and North America and is the only native English-speaking country in the EU – an ideal gateway for gaming businesses that are looking to operate in both regions. With direct flights to all major countries across Europe and North America, Ireland’s strong connectivity has already supported hundreds of companies in exploring new markets and improving their global reach. Ireland’s skilled workforce All this is boosted by Ireland’s highly skilled workforce, particularly in the creative, technological and commercial fields necessary for the gaming industry, such as software development, computer science, animation and video production. The country is home to renowned universities and technical institutes such as Trinity College Dublin and Dun Laoghaire Institute of Art Design & Technology, as well as complementary research institutions like Tyndall National Institute. This ecosystem is dedicated to building a highly skilled workforce in Ireland, with the ultimate aim of providing valuable support for gaming and technology companies. Ireland is one of the best places in Europe and the world to produce animation. The Irish passion for storytelling and the arts has created a culture perfectly suited for world class animation studios to succeed. The skilled workforce in these fields has been demonstrated by the number of award-winning animation studios based in Ireland – such as Cartoon Saloon, a five-time Academy Award®, Golden Globe®, BAFTA and Emmy-nominated animation studio, as well as Boulder Media, famed for its animated shows such as Foster's Home for Imaginary Friends and The Amazing World of Gumball. Looking ahead Ireland’s gaming space has made significant strides in recent years. This is demonstrated by the several industry giants that have established bases of operations in Ireland – such as EA (known best for their sports games), World of Warcraft’s Activision Blizzard, and Riot, developer of the global phenomenon, League of Legends. These household names have helped to establish a strong cluster and skilled talent pool, leading to many award-winning indie games developers/publishers also choosing Ireland as a hub. This includes the likes of Team 17, the developer responsible for Worms, The Escapists, Overcooked and over 90 other titles.
The ecosystem in Ireland continues to grow and is also supporting a wide range of Irish companies. There are currently over 60 gaming start-ups in Ireland – including Vela games, War ducks, Pewter Games and more. All of which have greatly benefited from the emergence of a skilled workforce and are eager to scale operations. As the gaming space continues to expand, the range of supportive government policies, strong legal framework and skilled talent pool will only continue to drive the country’s appeal as an attractive location. ### Embracing DevSecOps: Agile, Resilient Software Development in Flux When it comes to DevSecOps, one of the challenges you face when you get into DevOps is that the speed of change is so high that businesses can’t keep applying security changes as frequently as they would like. If you go back 10 or 15 years, infrastructure would typically be built and would sit there for 5 years and that would be considered a done job. But when we talk about DevOps, we are lucky if that infrastructure lasts more than a week at most. It is changing all the time. Therefore, when talking about the integration of DevOps and DevSecOps, we need to ask how those security principles can be applied more often. For example, you might maintain an intrusion detection system that you update on a regular basis based on your logs. In this, it is vital that businesses understand who is making changes to their systems. And one of the great things about DevOps is it guarantees immutability so you know exactly what the state of the infrastructure is. This means that even if your infrastructure is compromised, when you re-release it, you can rebuild it from scratch which guarantees any changes are reverted. So, you’re giving yourself a fresh slate every time and ensuring that with every release you are on the latest update. Combining this with modern log management and observability means you have much more visibility and proactiveness compared to 15 years ago when someone would have to trawl the logs manually. It gives you far better visibility when you compare it to what we used to do 15 years ago, which was put the measures out there and hope no one did anything. The rate of change in the security landscape is so fast-paced, and there are always new threats to keep on top of. The idea of being able to build that process regularly into the framework means you can keep on top of it. If your system can’t scale, you can’t grow and if it is vulnerable the damage is immeasurable. The mantra ‘shift security left’ is often associated with DevSecOps. Can you elaborate on what this means and why it’s crucial? It may sound obvious but when releasing a product or a service, businesses often don’t know what they don’t know. From a business perspective, the priority is just that the product is valuable in some way. This sometimes means that a product is released that has some areas that have the bare minimum; if no one is going to use the platform yet there is no point in locking it down. Over time, businesses can then change and develop. Architectural changes go through enhanced security and deployment far quicker than they used to. Almost on a weekly basis, businesses can change their architecture, something that is very hard to do when you have a traditional mindset. The traditional mindset process follows a pattern of thinking that “right, this week the architecture is this, let’s go through the review boards, let’s get people to sign it off, now this week the architecture is this”.
Those processes don’t work on week-long time scales; they work on month or quarterly time scales. Having that embedded mindset and having someone who is the security-focused member of the team means you are then able to release things that are more secure than they would have been. It also has the knock-on effect of a continuous improvement attitude. Therefore, when the next sprint comes along, you make it better again. There are definitely a lot of benefits to shifting it to the left because you address those problems far sooner.   What are some of the primary challenges organisations face when trying to adopt a DevSecOps culture?  The biggest challenge to DevOps adoption no matter what capacity you do it in is whether or not you can buy into the process and accept that things change. Unfortunately, in many organisations, there is a difference between what they want to do and what they are set up to do. In an ideal world, everybody would want to be able to deploy security patches as quickly as possible, but the problem often is that you have two worlds and cultures often colliding.   On the one hand, you have those who come from a change background where everything is done really slowly and with a strong emphasis on audit which doesn’t play in nicely with releasing updates 20 times a day. On the other hand, you’ve got those with backgrounds in engineering for example who are more open-minded to this level of change. The problem is that for those in change management, this slow and audit-focused process is their job and by automating that aspect, you leave many wondering what their purpose is.   It is also hard to take teams of developers who are accustomed to writing code in a certain way that maybe doesn’t have security at the forefront of their work. And that is a massive cultural shift; trying to get your team developers to become aware of the potential security implications of the code they are writing day in and day out. You’ve got tools that can catch those in your DevOps pipelines.   From a cultural perspective, having awareness of security throughout the entire engineering team is a challenge. When consulting, its common to hear that “we know what we know and we are good at what we do but we are really struggling with the people we’ve got as they can’t switch this mindset.” They are looking to bring in new people with fresh ideas.  It’s true that there has been a mass adoption of DevOps but only about 10% actually implement it correctly. So, you have this mass adoption but no one understands why they’re not getting the value because they haven’t solved the cultural side.  For businesses looking to transition from a traditional DevOps model to DevSecOps, what are your top recommendations for a smooth and effective shift?  Part of any change is about leadership and who is leading this shift. I’ve seen organisations with 20,000 software engineers under a CTO and they have an operations team under a CIO with another 10,000 people and the DevOps is being done to them, both areas independently. I don’t think this is a good approach because you end up getting good software released but the operational support is poor. The reality is that it needs to be merged and it needs to have a concise unit that is doing both sides of it, or else businesses are just creating a DevOps wall between the two.   
The problem is that if there are people who are already in an organisation, particularly if they’ve been there for four or five years, they aren’t going to make recommendations that are conducive to the best output for the business. They are going to make recommendations that are conducive to them and that’s one of the biggest challenges that you can see already. A lot of big organisations aren’t able to make this shift because there is such an aversion to change and there is a difference in priorities between the board level and the tech teams. When teams talk about security, in comparison to other things that will be more revenue-generating, it just falls by the wayside. But what often isn’t considered is that security tends to not be a problem until it’s a big problem. Having that mindset and keeping that in mind will help with an effective shift. It starts with priorities and businesses need to establish what priorities are at the top. If the mindset and culture can be embedded across your security, IT, operations and financial areas, businesses will have a far better position of balance between what the business needs and what it requires. It’s about finding a balance and people just need to work with each other rather than in silos. To really be successful with this transition, businesses need to have someone who is at the board level who is happy to lead and own this change. Often the best way of doing it is more like a snowball; start with a team and get the processes up to scratch and in doing so, accidentally prove that it’s better. This can then be treated like a virus that keeps spreading throughout an organisation. Flexibility of mindset is important, but the thing that underpins that is consistency. With consistency, you can make logical adaptations and monitor your progress. I think that’s one thing a lot of people don’t do. It drives scientific behaviour around what is going to change. ### AI Limits in Customer Service: The Need for Human Agents The recent boom in generative artificial intelligence (AI) has been impossible to ignore. In fact, a survey conducted by Writer found that 59 per cent of businesses have either purchased or are planning to purchase a minimum of one generative AI tool this year, while almost one in five companies use five or more generative AI tools. From financial services and telecommunications to healthcare and manufacturing, the deployment of the technology has become increasingly prevalent throughout a variety of industries, reshaping how businesses interact with their customers and beyond. This includes the customer service industry, which is currently assessing the benefits and subsequent impacts of AI within the industry prior to making strategic investments in advanced digital solutions. The benefits generative AI brings In fact, generative AI has taken the customer service sector by storm in recent months, with ChatGPT being the most well-known solution. The tool, which is already the most used generative AI application with 47 per cent of companies utilising it, brings numerous benefits to customer service organisations, highlighted by its ability to create human-like responses to questions. For those businesses with particularly formidable customer experiences, generative AI provides them with an opportunity to distinguish themselves from their competitors by decreasing wait times and ensuring consistency, as well as by providing data analysis in addition to the opportunity to create an uber-personalised experience for customers.
Ultimately, the capabilities, benefits and qualitative aspects of such technology, as well as its security implications, are yet to be fully realised. As companies continue integrating generative AI technologies within their business, they’ll start to reap the benefits it should help deliver, both from an internal and external perspective. Amongst these benefits is the technology’s ability to improve customer experience (CX). Generative AI is capable of providing personalised interactions and recommendations, as well as content tailored specifically for each individual in order to improve customer satisfaction. What’s more, the technology also enhances creativity. By inspiring and boosting human creativity, it assists individuals when it comes to generating new ideas, designs, and campaigns. Adding to this, generative AI can sift through vast volumes of data and analyse it, processing and translating these raw data sources into more digestible insights. From this, contact centre agents can contextualise these insights to make informed decisions, saving customer service firms a great deal of time and boosting efficiency and productivity. Human expertise will always be required Nevertheless, while generative AI undoubtedly has an extensive list of capabilities, it cannot eclipse the role that contact centre agents play in the customer service industry on a day-to-day basis. The technology simply doesn’t have some of the necessary attributes that contact centre agents bring to the table, such as empathy and understanding. When customers are facing an issue, they will want to communicate with someone who recognises and is conscious of their situation. Contact centre agents are specifically trained to be sympathetic and well-mannered when dealing with customers. Irrespective of how unpleasant the circumstances may be, a customer service representative will endeavour to respond to queries in good faith and continue to assist until the issue is rectified – which cannot be guaranteed for generative AI in the same scenario. Elsewhere, tools like ChatGPT lack the personal touch of contact centre agents. As social creatures, people look forward to speaking to another person to discover answers and solutions to a problem. Customer service representatives have the experience of speaking with customers who have different personalities and come from different, distinct backgrounds. This enables contact centre agents to provide customised solutions for every request and complaint they have to deal with, interacting with customers in a manner that suits them. In contrast to this, due to the size and scale of generative AI applications, they cannot be tailored to suit the requirements of every single customer. Additionally, customer service representatives are able to provide strategic customer engagement. While there is no doubt that AI-based chatbots are quicker than contact centre agents at providing a response, these applications do not have the understanding to comprehend the relative severity of what they’re dealing with. Although individuals may take small steps to accomplish something substantial, they do not adopt the same approach towards all customers when it comes to resolving problems – unlike trained generative AI tools. Customer service representatives have the ability to create a holistic CX journey that ensures effective solutions with advantageous results and peace of mind for customers.
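One way to blend the two in practice is a severity-based hand-off, where an AI assistant drafts a reply but anything that looks sensitive or severe is routed to a human agent. The keywords, threshold and helper functions in this sketch are illustrative assumptions, not any vendor's actual logic:

```python
ESCALATION_KEYWORDS = {"complaint", "refund", "bereavement", "fraud", "legal", "cancel"}

def assess_severity(message: str) -> int:
    """Very rough proxy for severity: count escalation keywords in the message."""
    words = message.lower().split()
    return sum(1 for word in words if word.strip(".,!?") in ESCALATION_KEYWORDS)

def draft_reply(message: str) -> str:
    return f"Thanks for getting in touch about: '{message}'. Here is what I found..."

def route(message: str, severity_threshold: int = 1) -> dict:
    severity = assess_severity(message)
    if severity >= severity_threshold:
        # Hand the conversation to a person, with the bot's draft attached as context.
        return {"handled_by": "human_agent", "context": draft_reply(message)}
    return {"handled_by": "bot", "reply": draft_reply(message)}

print(route("What time do you open on Sunday?"))
print(route("I want a refund and I'm considering legal action"))
```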
Ultimately, businesses will make strategic decisions to either fully automate CX or to utilise human in the loop solutions which will ensure that customers do still have the ability to engage with an agent, albeit in a much more informed, next best action, interaction which drives better outcomes.  Why generative AI alone is not the solution As is often the case with new technologies, it is important to find the right balance between the tech and human interaction – this is no different for generative AI. The technology should be used to enhance human skillsets rather than replace them entirely. In terms of the customer service industry, by embracing generative AI applications like ChatGPT, contact centre agents will be able to improve their own performances and succeed in several business areas. Relying on generative AI tools alone to take sole responsibility for customer support will provide an experience for individuals that lacks empathy, understanding and human interaction. While generative AI unquestionably has numerous benefits that can help to reconfigure what the CX industry looks like, a human and digital experience will always be the favoured option – with the combination of the two ensuring efficiency and productivity, as well as helping to build rapport and increase loyalty amongst customers. ### Behind the times: how the cloud can help you escape legacy systems The past decade has seen digital transformation – the integration of data and technologies across every aspect of a business – move from an aspirational buzzword to a fundamental part of driving innovation and fostering growth. Businesses are constantly seeking new ways to enhance their operations, boost efficiency, and secure a competitive edge. Nevertheless, many remain tethered to legacy systems – outdated infrastructure that hinders their ability to fully embrace the advantages offered by digital transformation. When it comes to storage, where data is kept, and from which modern technologies need to draw, that can undermine huge parts of your business. The lingering grip of legacy storage systems Legacy storage systems, often characterised by monolithic software and hardware configurations, were designed for an era of technology that has long passed. Inflexible, inefficient, and difficult to upgrade or modify, they pose a significant challenge to businesses seeking to adapt to the rapidly changing demands of the digital age. For one, limited scalability is a constant frustration. Data’s value is rising year-on-year – but in the face of burgeoning data needs, legacy systems struggle to expand efficiently. This translates into performance lags, system outages, and missed business opportunities – stagnating innovation, limiting progress and putting you well behind your competitors. This would be manageable if such systems could be upgraded to accommodate modern needs – but that’s not often the case. Imagine you still own a car you bought two decades ago. It’s on its third clutch, doesn’t pass ULEZ regulations, and each MOT costs more and more to keep it going. No matter the modifications or upgrades, the vehicle still won’t fit the bill – there’s only so far it can take you. Legacy systems are in a similar position today, especially with the rapid evolution of artificial intelligence. It’s harder and harder to keep them going, especially as more efficient options are being taken up by your competition. They might run, but they’ll never match up to the capabilities of the next generation. 
This inability to modernise also impacts security – which, in today’s fast-moving threat landscape, is more important than ever. Even if a business can withstand the other costs associated with legacy systems, suffering a significant cyberattack has the potential to be crippling. As legacy systems become outdated, the companies that developed them will gradually pull support, before eventually moving to ‘end-of-life’ status, with no further updates. That leaves the system that you rely on to keep your data safely stored increasingly exposed. Classic frustrations So, in the face of all these difficulties – why do so many businesses hang on to legacy technology? The first answer is obvious: cost. If you have previously invested heavily in a certain technology, you want to maximise value from that investment. This leads many businesses holding on to technology for longer than is sensible. The cost of removing and replacing such a system, and the related disruption to everyday business is another common barrier to innovation. Similarly, many organisations simply don’t have time. When handling vast amounts of data, constituting billions of documents and terabytes upon terabytes, we’re not talking about a filing cabinet that you can shuffle down the hall. Transitioning from one system to another demands a substantial investment of time – not to mention expertise. The agility of a company is often handcuffed by legacy systems. Their rigid structures make it challenging to adapt swiftly to evolving market conditions or seize new opportunities as they arise. Lack of flexibility means missing out on critical competitive advantages or failing to meet customer demands promptly. To the cloud Migrating to the cloud stands to remedy many of the pain points associated with legacy systems. In stark contrast to the limitations of outdated systems, transitioning to the cloud translates into notable time and cost savings for companies. As you’d expect, one of the most striking advantages is the immediate impact on the financial front. The shift to the cloud eradicates the hefty upfront capital investments and ongoing operational costs associated with maintaining on-premises infrastructure. Once the cloud infrastructure is operational, expenses related to hardware, database maintenance, security, and software upgrades become part of operational expenses, offering predictability and manageable budgeting on a monthly or annual basis. Furthermore, the scalability inherent in the cloud allows businesses to quickly and easily reduce or expand their storage, and processing capabilities, or introduce new software or solutions, as needed. This allows a business to avoid getting stuck in a contract that isn’t worth the fee, or over-purchasing on solutions that aren’t needed – as well as drastically reducing the cost of implementing new systems. Beyond cost, cloud-enabled systems also accelerate business operations for employees right across the organisation, with cumbersome legacy systems no longer slowing people and processes down. Modern systems will invariably move faster than their legacy counterparts, with many able to automate the menial tasks that chip away at an employee’s day to day productivity. Additionally, these systems facilitate flexible working arrangements as they’re accessible online from any secure network to those with permission, promoting a decentralised yet secure work environment. Cloud migration streamlines employee collaboration, fostering a seamless working environment. 
These benefits all point towards one truth: you can’t hold on to your legacy systems forever. The cloud offers transformative benefits – and if you’re not taking advantage of them, your competitors are. Now is the time to shed the past and embrace the cloud's limitless potential – not just an upgrade, but a strategic leap toward innovation and competitiveness. ### AI: from nice to have, to necessity AI technology is dominating conversations across the globe. In the last few months alone, we have seen industry-leading technology events like Gitex declare 2023 the ‘Year of AI’; the Collins dictionary named ‘AI’ as its ‘word of the year’; and we have celebrated ChatGPT turning one. If 2023 was - as Gitex’s theme suggested - ‘the year to imagine AI in everything’ - it certainly has been the year for AI investment. Early on this year, the UK government announced a £100 million fund to help the UK build and adopt AI, and according to the US International Trade Administration, the UK AI market is worth more than £17 billion and is expected to rise to £800 billion by 2035. It is undeniable that the proliferation of artificial intelligence has exploded in enabled systems and software industries worldwide – and for businesses operating fleets of vehicles and equipment, this is no different. Across sectors including construction, logistics and deliveries, the benefits of AI are broad and multifaceted: AI promises to revolutionise operational efficiency, safety protocols and the overall sustainability of fleets across the globe. But how can businesses in these industries gain their customer’s trust in a technology that has had its fair share of mixed media commentary? Realising the benefits of AI in fleet management Imagine owning a crystal ball which offered granular insights into the journey you were about to make - from route planning and timing, right down to the last drop of fuel you would use. We are close to this reality, and managers of fleets operating multiple vehicles - such as delivery trucks, or construction equipment - are not far off realising just how beneficial AI can be. By harnessing the power of AI-driven predictive analytics and machine learning algorithms, there are three major areas that could benefit the most: Route optimisation: businesses are able to streamline their route planning processes, minimise delivery delays and optimise overall fleet performance. For example, AI-enabled predictive models can effectively anticipate traffic patterns, weather conditions, and other variables to provide real-time route recommendations, enabling companies to ensure timely deliveries, and ultimately, enhance customer and user satisfaction. Fleet safety and security: AI presents unparalleled opportunities for enhancing fleet safety and security through the implementation of advanced driver monitoring systems and predictive maintenance protocols. By leveraging AI-powered sensor tech and data analytics, fleet managers can gain comprehensive insights into driver behaviour, vehicle performance, and maintenance requirements. This enables the fleet managers to proactively identify potential risks, minimise accidents, and optimise vehicle uptime. An example of this would be driver-facing AI cameras which detect high-risk or unusual behaviours, alerting both the driver and the fleet manager to potential driver fatigue or distractions which could result in an accident. 
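As a hedged illustration of the kind of rule that might sit behind such an alert, the sketch below checks telemetry samples against simple thresholds and flags anything that warrants attention from both driver and fleet manager. The field names and thresholds are assumptions made for this example only:

```python
def check_sample(sample: dict) -> list[str]:
    """Return human-readable alerts for a single telemetry sample."""
    alerts = []
    if sample.get("eyes_closed_seconds", 0) > 2:
        alerts.append("possible fatigue: prolonged eye closure")
    if sample.get("gaze_off_road_seconds", 0) > 3:
        alerts.append("possible distraction: gaze off road")
    if sample.get("harsh_braking_events", 0) >= 2:
        alerts.append("repeated harsh braking in this window")
    return alerts

sample = {"eyes_closed_seconds": 2.8, "gaze_off_road_seconds": 1.2, "harsh_braking_events": 2}
for alert in check_sample(sample):
    print(f"ALERT (driver + fleet manager): {alert}")
```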
On top of this, AI-driven predictive maintenance systems can detect potential equipment failures and schedule timely maintenance, reducing vehicle downtime and ensuring the safety of drivers, assets, and cargo. Promote sustainability: High on the C-level agenda is sustainability, and there's a huge opportunity for AI-powered energy management and sustainability metrics in fleet management that promote eco-friendly operational practices. Organisations can identify energy-efficient routes, implement fuel-saving strategies, and adopt environmentally conscious supply chain practices. This will help to reduce carbon emissions and contribute to a more sustainable and environmentally responsible fleet management ecosystem. Overcoming challenges in AI adoption As with all new technologies and software, the integration of AI into fleet management systems follows a multi-stage process, and initial adoption can present certain challenges. While the benefits of AI are extensive, it is important to acknowledge potential drawbacks. Ensuring the accuracy and reliability of AI-generated insights is critical. In the context of fleet management, where decisions impact operational efficiency and driver safety, the precision of AI-driven analytics is paramount. Fleet managers need to ensure the partners they are working with are dedicated to developing AI solutions that undergo rigorous testing and validation to ensure accurate insights are provided for their users. Maintaining the privacy and integrity of sensitive information within the fleet management ecosystem should also be a top priority. Some software partners will have implemented robust security protocols and data encryption measures to safeguard the information of users and their fleets. So, what’s next? The benefits of AI within fleet management are far-reaching and have the potential to completely redefine the operational landscape. By embracing artificial intelligence, industries such as logistics, construction and haulage, amongst others, can unlock new avenues for operational optimisation, safety enhancement, and sustainability. This can help form the backbone of further innovation in fleet management. Businesses that are either actively working on the implementation of AI or anticipating a rollout of AI-driven functionalities in 2024 need to focus on delivering data protection and privacy as the foundation for building trustworthy and secure AI-driven fleet management systems. This commitment to exploring innovation whilst keeping employees’ and customers’ security and safety as a top priority will foster confidence and peace of mind for all. ### Optimising supply chains with real-time visibility As the globe has become increasingly interconnected, the impact of disruption is felt on a larger scale. And nowhere has this been felt more than in our global supply chains, with the global pandemic highlighting the lack of resilience in the industry. Disruption is, of course, inevitable. This has been abundantly clear over the last few years. However, many businesses are totally dependent on a predictable supply chain to produce goods or deliver an end product. A supply chain failure can lead to contractual penalties, lost reputation and potentially business failure, so having an effective mitigation strategy is key.
While disruptive events differ, it is clear that real-time visibility across the entire supply chain is non-negotiable for managing disruption, as it supports a stronger decision-making platform and reduces the damage to partners, suppliers and end customers. Increased visibility improves the decision-making platform. If businesses want to keep shelves stocked, then they need to have a deep understanding of how to track and control inventory, which requires real-time visibility in every corner of the supply chain. With real-time alerts and updates, businesses can make faster decisions when responding to potential disruption, for instance by changing the route or method of transportation. Moreover, not only does this increased visibility give businesses the ability to improve the speed at which they can make these decisions, but accurate real-time data also allows them to make more calculated judgements about the alternative shipping and transportation paths that are available, as they will have a clearer picture of the situation and, in turn, what is needed to address it. This does not mean achieving real-time supply chain visibility is easy. In an increasingly globalised world, our supply chains are more interconnected than ever, and this means that the tracking and managing of inventory needs to be done across multiple locations, which could include retail locations, warehouses, and manufacturers. And because these locations and operations are typically owned by different companies, this can make creating a single source of truth quite difficult to achieve. Collecting data from each point in the chain and from each data owner, each with their own data formats and rules, and introducing third-party data like vessel tracking, weather forecasts, tide times, local events, public holidays and local traffic conditions, is no mean feat. Establishing real-time visibility Many businesses have turned to new technologies, like advanced analytics, to achieve real-time visibility and create advanced forecasts for supply chains ahead of time. These more sophisticated algorithms work on both new and old data, which increases accuracy. However, they require a strong and stable data connection between on-site facilities and the cloud, as well as low-latency services that offer constant access to important data. These technologies help paint a clearer picture of supply chain operations. Where data is shared to align everyone involved across the supply chain, it translates directly into better operations and greater satisfaction for both internal stakeholders and customers. The cyber risk of hyper-connectedness However, the increasing interconnection between modern-day supply chains does not come without its risks. While it increases transparency, it also inevitably expands the cyberattack surface. The increased use of IoT sensors to modernise and interlink supply chains means that there are more endpoints and, in turn, entry points for cyberattacks. This puts not only the device owner at risk but also every company in the supply chain since their networks are interconnected. The disruption caused by cyberattacks, as well as the risk of breached data, put organisations at risk of financial consequences like lost revenue or penalties, as well as more material losses like hits to business reputation and productivity. Robust security and risk policy processes are therefore necessary at every stage to ensure each company maintains the same standards and doesn’t introduce a “weak link”.
The UK Government has previously laid out recommendations for improving supply chain security. Put simply, effective risk management and more direct control over relationships with suppliers are key. This could mean requiring Cyber Essentials Plus, or contracts that include a “right to audit.” If your organisation doesn’t have a senior security leader, you’ll have to take additional measures to educate all key decision-makers so they are aware of, and act on, the potential risks. For example, articulating the full requirements and obligations between partners to minimise supply chain exposure. This is where security audits can be beneficial, as a full audit can make sure everyone knows the risks well enough to identify them quickly and solve the issue with time to spare. Minimising the risk Disruption is inevitable, but to address it properly, you need to know about it and take action quickly. The end goal of all supply chains is the timely delivery of goods. But where companies don’t have the visibility to track and react to potential disruption as it arises, the whole network is at risk of delays, as goods may have to be diverted through less time-efficient routes, impacting customer satisfaction and the supplier-partner relationship. While sharing real-time data across the supply chain offers resiliency, businesses need to be fully aware of the risks that come with data sharing for supply chain optimisation. While critical to the long-term success and resiliency of supply chains, cybersecurity processes and policies across the board need to ensure that the opportunity of the increased hyper-connectedness isn’t outweighed by the risk. ### Solving the need for a customer-first retail strategy In 2022, 22,109 businesses failed, with insolvencies 57% higher than in 2021. Between recent cost increases and new customer demands, the UK retail landscape is under immense pressure. According to ONS statistics, 15% of these insolvencies were from the wholesale and retail trade category. Gavin Masters, Principal Digital Strategy & Transformation Consultant, Digital Commerce, at Columbus UK draws on insights from the Internet Retailing RetailX UKTop500 Report to highlight how retailers can exploit the cloud to create high-quality digital and hybrid shopping experiences to overcome market challenges. To succeed at a time when competition is tougher than ever, retailers need to go the extra mile – prioritising fast deliveries, seamless online purchase journeys, and providing a 360-degree customer view will be crucial. But all of this requires a flexible business model that utilises new technologies, data-led customer insights and digital consultants, to offer digital and hybrid shopping experiences. A unified commerce strategy is the ingredient for success. The backdrop of inflation and cost-of-living increases makes it important for retailers to serve shoppers in any and every way in which they want to buy. Customers may be returning to in-person shopping but recent ONS statistics show that they are still buying online more frequently than they did during the pandemic (seasonally adjusted internet sales accounted for 26.6% of all official retail sales in May 2022, compared with 19.7% in February 2020) – and retailers must adapt to this demand. A flexible cross-channel approach can help retail businesses to engage consumers whenever and wherever they choose to shop – and this is where unified commerce comes in.
Retailers are finding considerable success by moving beyond the concept of omnichannel to a more integrated approach. Here, unified commerce can offer customers a seamless and relevant purchase journey, driven by consolidated real-time customer data that’s actionable across channels. The result is that every part of the retail process, whether customer-facing or back-office related, has the customer at heart. Add value to the customer journey and the road to profitability will shorten. As cost-of-living rises increase financial pressures, customers are prioritising value for money over brand loyalty. The RetailX report highlights that the most successful retail businesses offer a customer value chain that makes it easy for shoppers to buy a product, have it delivered, and then return it, if necessary. The ability to offer additional value to products such as free shipping, easy returns, and extensive customer support enables retail organisations to provide a more enhanced customer experience and go beyond transactional customer relationships. This is where an omnichannel experience that works for customers, and the ability to adapt to their evolving shopping habits, is critical. First, removing friction from the checkout has long been a goal for retailers aiming to cut down on the number of people abandoning purchases. Retailers do this by offering guest and third-party checkout options. Requiring shoppers to register to check out does have its own benefits, as retailers can start to build long-term relationships when they acquire more customer information – but there is a balance to strike. According to the RetailX report, more than a quarter of retailers and brands (28%) now enable shoppers to use PayPal to check out. This ensures a quick checkout for customers and provides necessary customer data for retailers to personalise future service offerings. The next step involves the removal of sales barriers with a combination of delivery options that are either always free or free above a minimum order value – but don’t forget about those easy and convenient returns. As most shoppers don’t return their online orders, retailers can boost sales by making it easier to send something back. The digital customer wants tailored experiences – AI can provide just that. As customers become more thoughtful about their buying decisions, the right online tools will be crucial to excellent customer experiences. Tailored search results ensure customers feel that a retailer can meet their need to find relevant items quickly. Nine out of ten retailers and brands surveyed in the RetailX report now enable shoppers to filter searches by product type, but there is still more to be done here to improve search filters. AI technologies will take on-site search capability one step further and enable retailers to create effortless customer-centric product discovery experiences that constantly refresh with trends and shopper behaviour. The more information retailers and brands gather, the more personalised the customer experience will be, leading to higher conversions. Shoppers are also more likely to buy when they see what other customers thought of an item, which is why a growing number of retailers and brands are now sharing product reviews. According to the RetailX report, this is something more retailers and brands are doing this year (58%) than last year (55%).
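A small sketch can illustrate the kind of tailored product discovery described above: results are filtered by product type, with items the shopper has already browsed nudged to the top. The catalogue fields and scoring rule are invented for illustration, not a particular retailer's search stack:

```python
CATALOGUE = [
    {"sku": "d1", "type": "dress", "name": "Midi dress"},
    {"sku": "d2", "type": "dress", "name": "Wrap dress"},
    {"sku": "s1", "type": "shoe",  "name": "Leather boot"},
]

def tailored_search(product_type: str, browsed_skus: set[str]) -> list[dict]:
    matches = [item for item in CATALOGUE if item["type"] == product_type]
    # Previously browsed items rank higher, so each shopper sees the most relevant items first.
    return sorted(matches, key=lambda item: item["sku"] in browsed_skus, reverse=True)

print([item["name"] for item in tailored_search("dress", browsed_skus={"d2"})])
# ['Wrap dress', 'Midi dress']
```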
As shoppers return in-store, retail businesses are reinforcing omnichannel systems that fully bring digital into their stores – and the store experience onto their websites. Cover all your product bases with timely insights from a digital consultant. Researchers for the RetailX report found that the quality of online product information remains unchanged from last year (median score of 3.0 out of 4) – that’s plenty of missed opportunities to adapt to consumer shopping habits. Product information, from technical information to details about sizes and materials, can be the make or break of a sale, so retailers have got to get it right. If this information is readily available, retailers can make more from every customer contact. This is where the value of working with a digital business consultancy cannot be overstated to successfully achieve transformation in the retail industry. The right consultancy business will provide that second pair of eyes over critical insights across the entire business, helping to bring together the digital and technology skills that deliver value in the cloud. The skills gap can be difficult to manage but retailers need to strike a balance. For the retail sector, the UK’s largest private sector employer, there’s a deadlock. Digital skills are now needed for 79% of retail jobs, but 62% of leaders say they can’t find people with the right experience (https://internetretailing.net/reports/annual-reports/rxuk-top500-report-2023/). Today’s retailers often struggle trying to link multiple channels without the support of in-house technology experts with the right IT skillset. By bringing in real-time communication tools, and the expertise to optimise their use, teams can improve collaboration with cloud systems, regardless of their location – and this is where consultants demonstrate their value. A digital consultancy business can help retailers align industry best practices with company goals to drive transformation initiatives, helping retailers discover how digital technology can unlock opportunities to create new business value. Digital consultancy businesses will help find ways to release value early and incrementally and ensure successful value management. Plus, at a time of economic uncertainty, consultancy support will be a flexible resource when retailers need to maintain agility in their cost base and align value creation with cost. To build resilience and thrive in difficult conditions, embrace innovation. Between the cost-of-living index and changing consumer behaviours, 2023 is creating a turbulent market for UK retailers to navigate. But it’s not all doom and gloom. Retailers that seek to provide customers with simple online buying journeys, easy returns policies, and unified commerce experiences can assert their dominance in a crowded market and thrive during difficult conditions. Only by working with digital consultants and harnessing powerful AI tools to overhaul their digital offerings can e-commerce businesses truly level up their retail strategy. ### CIF Presents TWF – Pollyanna Jones In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will use the news and opinion section to connect the word of the year with recent ChatGPT announcements by OpenAI, and changes for creators at YouTube to combat the AI-generated fake news and impersonation that does harm. Our interview guest is Pollyanna Jones, Health and Life Science Partner at Monstarlab, previously NHS England Commercial Lead.
We are going to be talking about a Tech For Good, NHS England success story called FutureNHS, and launching the Agile Elephant report on their progress. ### Why edge computing is sharper on a hybrid multi-cloud architecture The Internet of Things (IoT) is only going to get bigger. As yet, nobody quite knows when and where the world of smart cities and intelligent refrigerators will reach its critical mass. That’s probably because we’re still on the ascendant part of the exponential growth curve that describes the trajectory for development in this space. But given the momentum in IoT-rich computing topographies, now is an opportune time for us to consider how we will run, support and underpin the edge computing that happens on these devices; but why is this so? Let’s start with a simple clarification that many still get befuddled with. We talk about the IoT being made up of the devices that exist in remote (and some not-so-remote) locations around our planet. Sensors, gyroscopes, cameras, gauges and all manner of connected electronic devices are what creates the IoT. The computing processes that happen on IoT devices are what give us edge computing. Although many people still use both terms interchangeably, this is not always helpful. A swinging pendulum With the move to put more computing out in edge environments, there is a concerted migration away from enterprises using one centralised cloud hyperscaler for all their computing needs. Principal consultant at research and advisory firm ISG, Anay Nawathe, called this drift a ‘swinging pendulum’ and it is a reality that sees enterprises operate on a more variegated scale, in more locations and - crucially - in a position where they need to use an expanded variety of cloud services from more than one Cloud Services Provider (CSP). Again we need to ask a very straightforward question here - and it’s ‘why?’ Organisations naturally need to use services from a bigger range of datacentres based upon where their edge estate is actually located. Factors like latency, service differentiation and local data compliance regulations all come into play, as does plain and simple convenience. But running multiple cloud instances to serve edge requirements means multiple infrastructures, multiple configurations, multiple workloads and - typically - multiple systems management headaches. As we have been direct from the start of this discussion, let’s continue in that vein i.e. the most prudent, productive and efficient way to achieve mixed-cloud management for edge scenarios is to adopt a hyperconverged infrastructure (HCI) platform. Because HCI brings together a ‘family’ of technologies - including server computing, storage and networking - into one single software-defined platform, its suitability for what are now increasingly complex and resource-intensive edge environments can not be overstated. Elements of edge Before we come back to underpinning infrastructure as we always must, let’s think about the way edge computing is diversifying and differentiating its workloads. We know that some edge deployments will be essentially autonomous with very limited connection to the datacentre and some will connect more frequently to the mothership and to other devices. Although a degree of edge computing will be quite rudimentary and basic, an increasing proportion will now handle business-critical workloads at the edge. 
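A simple sketch illustrates the difference between an autonomous edge node and one that phones home: readings are buffered locally so processing continues offline, and are only flushed to the datacentre when a link is available. The node, reading format and flush logic here are illustrative assumptions rather than any specific product's behaviour:

```python
class EdgeNode:
    def __init__(self, site: str):
        self.site = site
        self.buffer = []          # local store-and-forward queue

    def record(self, reading: dict) -> None:
        self.buffer.append(reading)   # processing continues even when offline

    def flush(self, connected: bool) -> int:
        """Send buffered readings to the central platform when a link is available."""
        if not connected or not self.buffer:
            return 0
        sent = len(self.buffer)
        # send_to_datacentre(self.site, self.buffer)  # stand-in for the real uplink
        self.buffer.clear()
        return sent

node = EdgeNode("store-042")
node.record({"camera": "entrance", "footfall": 17})
node.record({"camera": "tills", "footfall": 9})
print(node.flush(connected=False))  # 0 - keeps working autonomously
print(node.flush(connected=True))   # 2 - syncs with the mothership
```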
Extending this thought down to the data layer itself, we know that some instances of edge computing will need to process and handle sensitive or regulated data. With the IT management burden of governance, compliance and security now extending outside the walls of the company headquarters, edge computing demands a more holistic and hybrid approach to digital security. These factors are a key validation point for what we can call a universal cloud operating model, an approach designed to help manage applications and data across environments, including multiple public cloud instances, across on-premises servers and throughout hosted datacentre and edge endpoints. Single control plane Because edge computing across the IoT now exists as a first-class citizen in an enterprise organisation's total application and data services fabric, we need to promote its position so that it ranks alongside all other workflows and data streams. If, for example, a supermarket, fashion store or garage operates across multiple locations (and most do), it might be running edge computing workloads to shoulder security cameras, a footfall measurement system or Point of Sale devices. Having a single control plane capable of managing all those different deployment points becomes essential. If those same organisations take a more piecemeal approach to edge management, characterised by unsystematic partial measures taken over a period of time, the resulting IT system typically features islands of infrastructure. Across these islands, we find servers, storage repositories, elements of networking connectivity and functions such as backup all being provided by different service providers. Ultimately, any organisation working at this level will need to consolidate infrastructure and application management onto a single platform. By eliminating the presence of multiple management consoles through an HCI-based approach, these businesses can achieve simplicity, flexibility and rapid scalability. The view from Nutanix If I may present some views that stem from our own platform developments in this space, we have been working hard to progress the relevance and need for hyperconverged infrastructure for a decade and a half. This year has also seen us partner with Cisco to integrate with its SaaS-managed compute and networking infrastructure (Unified Computing System) and its Cisco Intersight cloud operations platform. With Cisco rack and blade servers now running Nutanix Cloud Platform, customers will benefit from a fully integrated and validated solution that is sold, built, managed and supported holistically for a seamless end-to-end experience. This key partnership sits directly in line with use cases that cover the most hybrid of hybrid cloud estates, spanning on-premises, public cloud and edge computing, and it does so by providing infrastructure engine power at an unprecedented level. With the prospect of Artificial Intelligence (AI) now further expanding and augmenting the deployment surface for edge computing, as we said at the start, this space is only set to grow. We know that a hyperconverged infrastructure (HCI) approach has the breadth to provide a universal cloud operating model across every conceivable computing resource, which means we will be able to push the edge out as far as we can dream. ### Situating SIEM: On-prem, hybrid or cloud? Knowing where to situate your Security Information and Event Management (SIEM) solution is no easy task.
The nuances of the business and the data it holds, overheads and storage, support and scalability, are all contributing factors, which means it is no longer a cut and dried decision of simply migrating to the cloud. There are pros and cons to using an on-premises, SaaS or hybrid approach, and the business will need to weigh these up to make the right choice. A SIEM is a vital part of the security arsenal. It serves as a central hub for security event data within the business and collects, correlates, analyses and visualises data from various sources, empowering teams to detect, investigate and resolve security incidents quickly. Next generation SIEMs also incorporate threat detection, enriching log data with information from threat intelligence sources, while the converged SIEM supplements log management, threat detection and response, monitoring, alert generation and reporting with Security Orchestration, Automation and Response (SOAR), User and Entity Behaviour Analytics (UEBA), and endpoint detection. When buying a SIEM, the dilemma remains whether to deploy on prem, to take a SaaS offering or even a combination of the two. On prem vs cloud Traditionally, the SIEM was physically housed on prem due to the sensitivity of the data it accessed, and many organisations in highly regulated industries continue to do so because of compliance regulations or a reluctance to share personally identifiable information via the cloud. For the majority of businesses, data privacy regulations have made the migration to the cloud more complex. On prem can confer advantages in that the solution is owned and maintained internally, which means that approximately nine percent of the total cost of ownership is spent on licence fees. This gives the business complete control over all aspects, from architectural design to the installation and configuration of the hardware, and where the data is stored. But, as a result, a far higher proportion of spend is dedicated to infrastructure and operations and the associated costs. In contrast, the total cost of ownership of a SaaS service sees approximately 68 percent of spend devoted to licence fees. However, licensing on a per-user basis rather than on data volume gives a SaaS offering far more advantages. There is no capex associated with acquiring hardware, backups, cooling or power, while the service can be expanded to accommodate more devices, receive more data or perform more analytics, all without the need to increase budget. Unlike in the on prem model, management of a cloud-based SIEM is not the sole responsibility of the business. While an inhouse team will still need to perform log collection, implement customer-specific configurations, deliver data to the solution and monitor and respond to alerts, the overall maintenance and ongoing management of the SIEM falls on the provider. From an operational perspective, the provider must ensure platform availability, optimal performance, and that the system is regularly updated. A SaaS deployment will also see the business begin to reap the benefits straight away, without the need to procure, install and configure the solution, and continuous product improvements ensure new threats are identified. Straddling the divide Irrespective of whether the SIEM is on prem or in the cloud, a good SIEM provider will also monitor for and produce content on emerging threats, making these available in the form of playbooks for detection and response via SOAR.
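As an illustration of what this kind of detection content can look like under the hood, here is a minimal, hypothetical Python sketch of a correlation rule: flag a burst of failed logins followed by a success for the same user and host. The events, threshold and time window are made-up stand-ins; a production SIEM applies rules like this (and far richer ones) across millions of events from many sources.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical, simplified log events; a real SIEM ingests these from many sources at scale.
events = [
    {"ts": datetime(2023, 11, 1, 9, 0, 5),  "host": "web01", "user": "alice", "action": "login_failed"},
    {"ts": datetime(2023, 11, 1, 9, 0, 9),  "host": "web01", "user": "alice", "action": "login_failed"},
    {"ts": datetime(2023, 11, 1, 9, 0, 14), "host": "web01", "user": "alice", "action": "login_failed"},
    {"ts": datetime(2023, 11, 1, 9, 0, 20), "host": "web01", "user": "alice", "action": "login_success"},
]

WINDOW = timedelta(minutes=5)   # how far back to look for failures
THRESHOLD = 3                   # failures before a subsequent success becomes suspicious

def correlate(events):
    """Flag a possible brute-force pattern: THRESHOLD or more failures then a success, per user and host."""
    failures = defaultdict(list)
    alerts = []
    for event in sorted(events, key=lambda e: e["ts"]):
        key = (event["host"], event["user"])
        if event["action"] == "login_failed":
            failures[key].append(event["ts"])
        elif event["action"] == "login_success":
            recent = [t for t in failures[key] if event["ts"] - t <= WINDOW]
            if len(recent) >= THRESHOLD:
                alerts.append({"rule": "possible_brute_force", "host": event["host"],
                               "user": event["user"], "ts": event["ts"].isoformat()})
    return alerts

print(correlate(events))  # one alert for alice on web01
```

In practice, detection logic of this kind arrives as vendor-maintained rules and playbooks rather than hand-written scripts.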
Over SaaS these can be made instantly available and it is also easier to fine tune content updates for rules, analytics models, dashboards and reports by using detection logic situated in the cloud. Of course, some businesses opt for a mix of the two, allowing them to combine event data from systems and entities held on prem with those in the cloud. It is an approach that works well for those who use multiple data centres, for example, but hybrid is also a mainstay for many medium to large businesses that were not ‘cloud born’. As these organisations have workflows being run in both environments, it makes sense for them to process logs locally before they are sent to the cloud, and in some cases customers want to maintain a copy of those logs on prem. As mentioned previously, this also applies to sensitive data without the right data processing agreements in place, preventing the business from simply moving that data to the cloud, which means that as the workflows are local, so the data must be held locally too. The type of SIEM service the business selects will therefore depend on numerous variables from governance and compliance through to how much resource it can dedicate to the management of the platform and whether it will need to scale this over the coming years. It is not simply a matter of ‘cloud is best’ but of weighing up the relative merits of each approach within the context of the business. What we do know is that the business is now more cost conscious than ever, making licensing terms paramount. Data driven licensing can see costs rocket, for instance. Human resource is also limited, with a significant cumulative skills shortage in the cybersecurity sector, which means businesses will want to minimise dependency. And there is real demand for convergence, as security teams look to consolidate the time and resource dedicated to maintaining disparate systems. This is likely to mean that next generation SIEMs that play more than one tune by incorporating SOAR and UEBA for threat detection and response, for example, will be favoured, because of their ability to be deployed in any of the three forms and to conserve costs. ### Why IT Resellers Must Revisit the Full-Funnel Marketing Approach The past two years have signalled a major shift in the marketing tactics of channel resellers. According to recent research carried out by Active Profile, the nature of this shift hasn’t necessarily proved beneficial for the marketing strategies of even the most successful and established IT resellers. The shift we’re talking about involves a stark move away from the strategic marketing playbook. Research indicates that we’re witnessing an outright rejection of the B2B funnel approach of awareness, consideration, and decision. This is, of course, the three-step process that has been helping businesses to attract, engage and convert leads into customers since the world of B2B marketing began. It’s been tried, tested, and successfully executed for brands all over the world - big and small, from every sector imaginable. So, what’s with this mass exodus of resellers who no longer wish to rely on the dependable B2B funnel? Why are they bypassing such a crucial formula for success? And what tactics have they switched to instead? Quick-fix temptation According to Active Profile’s latest Channel Marketing Trends Report, many resellers are now turning their attentions towards shorter-term view tactics; smaller bursts of activity, as opposed to a considered strategic approach. 
We get it – they’re cheap and cheerful, and this stripped-back method of marketing is still geared towards driving engagement and leads – but it rarely brings about long-term success. The same report found that only 34% of resellers are utilising the full-funnel process, with just 31% currently gating valuable content, and only 20% executing targeted, sector-specific campaigns. “Quick fix” tactics might work to generate leads initially but can’t be relied upon long-term. When companies are in this mode, they find themselves caught up in a “sell, sell, sell!” mindset - the blinkers are on, and the focus is all about short-term gain - when, instead, resellers should be prioritising customer needs and taking a deeper dive into their challenges. In fact, there’s a solid argument to be had that these short-term marketing tactics are actually deeply harming growth potential. Scattergun-style marketing collateral, with no considered message or strategic purpose, simply brings nothing new and zero value to the table for IT resellers or the vendors they’re selling on behalf of. And can they really afford to be sitting on their hands when it comes to their marketing plan? The importance of being pragmatic With an eyewatering 42,920 IT reseller businesses registered in the UK alone, it’s never been more crucial to adopt a pragmatic approach to marketing execution. One that simply works and isn’t overcomplicated. And when you consider the basic psychology of consumers, it’s no secret that they tend to buy from brands that they can relate to. So why do so many business leaders and marketers in the channel seem to have forgotten that B2B buyers are consumers too? Perhaps they need gently reminding that the same psychology behind buying extends to this demographic of customers, who still have very specific challenges, needs, and goals. When deciding how to tackle these challenges and remedy these pain points, often the simplest solutions tend to be the most effective ones. The full-funnel marketing approach provides a practical and proven process for remedying the pain points of channel resellers, who are likely struggling to stand out from the crowd in an oversaturated industry space. Through a full-funnel approach, you can expect to benefit from enhanced customer engagement, improved sales conversions, measurable impact, enhanced brand awareness, and greater alignment between sales and marketing. Good things take time Ultimately, resellers that simply package up vendor solutions without adding any value run the risk of being left behind. Resellers need to be creating content that informs and resonates at every stage of the (often neglected) B2B consumer journey. These companies also need to be meticulously considering the strategy behind building their overall brand awareness, and exploring how they can better stand out from the crowd to resonate more strongly with their target audience. And let’s face it: all of these crucial considerations cannot be improved upon with a singular, short-term burst of marketing activity. But if resellers take their brand and content strategy on the full-funnel B2B journey, they will find all of these challenges can and will be met with just a little planning and precision. So, if you’re a reseller thinking of jacking in your long-term marketing strategy for a bit of short-term success – stop what you’re doing. Head straight back to the top of that funnel, and work your way down.
The result at the end will be worth it. Real, sustainable, scalable reseller marketing success. ### ChatGPT one year on: six paradigms that AI and LLMs have introduced for brands It’s a year since the launch of ChatGPT sparked frenzied conversations about AI and the seismic impact it will have on business and society. And the recent chaos at OpenAI, the $86 billion nonprofit behind the tech, only magnifies the disruption seen more widely across the field of generative AI. With McKinsey hailing 2023 as the “breakout year” for AI, the launch of OpenAI's revolutionary tool has triggered multi-billion dollar AI investments from Google, Microsoft, Amazon and Meta. And, at a time when the global economy is on the ropes, venture capitalists have still managed to splash out $15 billion in backing for generative AI-based businesses. While AI has grabbed the headlines, the media spotlight has also honed in on Large Language Models (LLMs), the technology that powers services such as ChatGPT, Bing, Bard and Claude. While LLMs have been around for years,  their mainstream debut unlocks intriguing potential for brands around productivity, speed to market and the ability to monetise their domain expertise and intellectual property.  For many companies, the prospect of hurtling headlong into AI/LLM investment might seem daunting – particularly after all the froth associated with previous tech innovations like VR, blockchain, NFTs and the metaverse. But unlike these hyped technologies, AI/LLMs have real-life applications and companies from all sectors are busy experimenting with them.   Here are six paradigms that AI and LLMs have introduced this year which underline why brands should embrace this technology to proactively pursue first-mover advantage: LLMs can supercharge organisational intelligence Until now, legacy brands have struggled to monetise and scale domain expertise and IP cost-effectively. But with LLMs, brands can feed their collective knowledge into AI models and turn it into tools with a conversational interface, which gives employees instant access to a company's entire knowledge archive at scale and with human-like nuance.  Morgan Stanley moved quickly here, unveiling its AI @ Morgan Stanley Assistant this autumn. The platform gives financial advisors at the bank instant access to information from 100,000 internal research reports, used to feed on-demand answers to queries around markets, recommendations and processes. AI co-pilots free workers to be more creative Imagine having a highly capable AI-enabled tool that helps with the dull, predictable elements of your job, freeing you to be more creative. A new generation of AI copilot tools will move beyond conversational interfaces, to radically rethink how we work. These copilot tools will weave intelligence into tasks to help boost productivity and performance, replacing the enterprise tools we are familiar with. Early examples include GitHub Copilot, which provides developers with troubleshooting in real-time, including fixing bugs and demystifying complex aspects of code. As a result, users can "spend less time searching and more time learning."  The rise of AI native businesses  Beyond copilots, brands can also develop autonomous agents - basically software that can make contextually smart decisions without the need for constant prompts.  Autonomous agents can chain multiple tasks together to achieve specific goals. 
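As a purely illustrative sketch of what that chaining means, the Python below breaks a goal into steps and feeds each step's output into the next without a human prompting every stage. The goal, the plan and the "tools" are hypothetical stand-ins; real agent frameworks delegate the planning and each step to an LLM and to live integrations.

```python
# Hypothetical stand-in "tools"; in a real agent these would be LLM calls and live integrations.
def research_market(goal: str) -> str:
    return f"summary of market data relevant to: {goal}"

def draft_report(summary: str) -> str:
    return f"draft report based on ({summary})"

def schedule_review(report: str) -> str:
    return f"review meeting booked to discuss ({report})"

TOOLS = {"research": research_market, "draft": draft_report, "schedule": schedule_review}

def run_agent(goal: str, plan: list[str]) -> str:
    """Chain the steps of a plan: each tool consumes the previous step's result."""
    result = goal
    for step in plan:
        result = TOOLS[step](result)
        print(f"[agent] completed '{step}'")
    return result

# In a real agent the plan itself would be generated by the model, not hard-coded.
print(run_agent("launch a new savings product", ["research", "draft", "schedule"]))
```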
Autonomous agents can be used to dramatically improve performance, reduce processes from days to minutes, and free up time that can be spent more creatively. Early examples of autonomous agents include Auto-GPT and BabyAGI. Looking ahead, autonomous agents could be deployed to redesign businesses’ entire operational models, moving away from rigid, linear models that were designed around the limits of human productivity and availability constraints. New AI native operational models could allow brands to create products tailored to customers' individual circumstances and brought to market with previously unimaginable scale and speed. Customisation at scale AI allows brands to create artificial personas for testing bespoke products and services, a development that has the potential to revolutionise sectors such as banking, insurance and retail. Time-consuming and costly focus group testing will be unnecessary, which means that new products and services can be launched much faster. As a result, we will shift from a "one size fits all" era of mass commoditisation to one where brands will use AI/LLMs to cater at scale to each customer's particular needs and wants. Technology has become highly open-ended Until a year ago, brands were used to a relatively consistent rate of incremental change around prevailing social web and mobile technologies. ChatGPT's launch has completely redrawn the horizon – kickstarting a massive AI innovations race with a language all of its own. Microsoft’s Bing, Google’s Bard, Anthropic’s Claude and ChatGPT Enterprise are some of the critical touch points in nearly 12 months of extraordinary transformation. Despite the unprecedented pace of change, brands need to get going with AI. Pioneers like Microsoft and OpenAI have invested heavily in ensuring businesses can easily use AI services. For example, ChatGPT Enterprise addresses concerns over data privacy that have previously prevented companies from adopting the consumer version of ChatGPT. Looking ahead to next year, sector-specific LLMs will create a new paradigm for software architecture, allowing the creation of more affordable, effective and sophisticated tools for business. Increased conflict between commercialising AI benefits and mitigating AI risks If the recent tussle over the firing and re-hiring of OpenAI CEO Sam Altman has taught us anything, it’s that there will always be an inherent tension between what AI promises – its inherent value – and the risks of a technology potentially “too disruptive for its own good”. Over the next year we can expect more OpenAI-style fall-outs involving native AI companies, regulators, governments and other stakeholders as society grapples with the implications of what is essentially a live experiment. Companies engaging at any level with generative AI should expect and even encourage debate around how this dichotomy can be handled. None of these are reasons to stay static, however. The window for first-mover advantage is tiny. Soon, legions of different companies will become fluent in the new lexicon of LLMs and AI, using emerging platforms to create efficiencies, transform team performance, boost revenue and grow market share. Brands that fail to seize the initiative will invariably fall behind. At the very least, the willingness to experiment is a powerful first step.
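One practical place to start experimenting is the first paradigm above, turning internal knowledge into a conversational tool. The sketch below (standard library only, with made-up documents and a deliberately crude relevance score) shows just the retrieval step: find the most relevant internal snippets for a question and hand them to an LLM as context. A real system would use embeddings, a vector store and an actual model call.

```python
# Illustrative internal knowledge snippets; a real archive would hold thousands of documents.
INTERNAL_DOCS = {
    "travel_policy.md": "Employees may book economy flights for trips under six hours.",
    "q3_report.md": "Q3 revenue grew 12% driven by the EMEA cloud migration practice.",
    "security_faq.md": "Customer data must never be pasted into public AI tools.",
}

def score(question: str, text: str) -> int:
    """Crude relevance score: count shared words (a real system would use embeddings)."""
    return len(set(question.lower().split()) & set(text.lower().split()))

def retrieve(question: str, k: int = 2) -> list[str]:
    """Return the k most relevant snippets for the question."""
    ranked = sorted(INTERNAL_DOCS.items(), key=lambda kv: score(question, kv[1]), reverse=True)
    return [text for _, text in ranked[:k]]

def build_prompt(question: str) -> str:
    """Assemble the context-plus-question prompt that would be sent to an LLM."""
    context = "\n".join(retrieve(question))
    return f"Answer using only this internal context:\n{context}\n\nQuestion: {question}"

print(build_prompt("What did revenue do in Q3?"))
```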
### Remote connectivity: the linchpin in a sustainable future In today’s world and business environment, remote connectivity has become a game-changing solution to modern-day challenges, such as remote working. This goes beyond enabling people to digitally connect and collaborate as if they were together in person. It is also about enabling staff training and the support and maintenance of devices from afar. This is ushering in a new age where staff don’t have to physically go into their place of work to be upskilled, or to resolve any technical challenges they or others may be experiencing. Retaining the workforce In the era of globalisation, organisations must adapt their working practices to effectively retain their employees, and this means offering hybrid work across the board. By providing the flexibility to work from anywhere, businesses can hold on to valuable IT talent who might otherwise be drawn to opportunities elsewhere. Remote connectivity plays a pivotal role in making this possible for tech teams that form the backbone of digital infrastructure across businesses. For example, remote connectivity tools can empower staff to rectify all manner of IT issues remotely, at speed, and reliably. The ability to do their jobs effectively, regardless of their physical location, not only keeps IT staff happy but also reinforces productivity across the entire organisation as downtime is minimised. This generates job satisfaction across the broader employee base, increasing the likelihood that they will choose to stay with their current employer when enticing offers appear elsewhere. In addition, remote connectivity enables the onboarding and upskilling of colleagues regardless of where they are. For example, new colleagues can have access to training equipment without needing to be physically on site, whether by connecting directly to equipment such as simulators and virtual machines or simply by screen sharing with an experienced colleague. No longer do people need to travel to where equipment or people are to get up to speed for their role. Building a sustainable future with connectivity Beyond employee satisfaction and business stability resulting from a more effective staff base, a significant benefit of deploying an effective remote connectivity solution is that it encourages substantially more sustainable work practices. It does this by allowing colleagues to share fast responses with one another, independent of location, including through troubleshooting and maintaining remote devices as though the handler were on site. This reduces the need for travel and the consequent environmental impacts. In fact, a recent study has calculated that one single TeamViewer connection avoids, on average, 13 kilograms of CO₂e, which is the equivalent of 5.5 litres of gasoline, or a 70km car ride. This reduction in carbon emissions not only helps combat climate change but also conserves valuable resources, such as gasoline. How Specsavers has deployed remote connectivity across its business Alongside managing environmental impacts, businesses are also facing the ongoing cost-of-living crisis. One such company is UK eyewear giant Specsavers, which has faced one of its most challenging periods to date – ensuring customers receive the same level of care whilst maintaining affordable prices for its services.
To remain competitive, Specsavers deployed TeamViewer’s remote connectivity solutions at 2,300 of their stores worldwide to ensure IT issues don’t disrupt the employee and customer experience. With TeamViewer’s reliable connections and its focus on security, Specsavers now has access to the best IT support systems across the globe, increased staff efficiencies, and greater scalable success. All thanks to remote connectivity. Futureproofing with remote connectivity The potential for remote connectivity to shape the future of work cannot be overstated. By embracing solutions that offer this, businesses can foster a positive culture built on flexibility and productivity. And pass the benefits on to their customers. What’s more, by implementing remote connectivity solutions, businesses can not only enhance their competitive edge but also positively contribute to the health of our planet. Embracing remote connectivity is, therefore, not just a smart business move, but also a step towards building a greener world. ### Want to improve your cybersecurity posture? The UK’s Cyber Security Breaches Survey released earlier this year estimated that approximately 2.9 million cases of cyber crime affected UK businesses over a 12-month period between 2022 and 2023. With cyber attacks and data leaks happening almost daily, businesses are looking at how to improve their cybersecurity and resilience across every part of the organisation. Although engaging with cybersecurity vendors and deploying their solutions is a very important part of a business’ strategy, it’s not the only action that needs to be taken. In order to be truly resilient, businesses should look at their entire infrastructure and assess the risks it could be posing. One such element is the software the business is using. Using software that has not been specifically built for a business’ environment, or that does not align with its processes, may cause challenges both for cybersecurity specifically and for wider business objectives. This is where custom software development comes in – a powerful tool that not only allows businesses to create bespoke solutions to support their business goals, but also aids broader resilience and cybersecurity across their environments. The challenges with off-the-shelf software While off-the-shelf software has its benefits – it can be deployed instantly, it is usually cost-effective, and it tends to include a wide range of features – it cannot align with every business priority. Common challenges that businesses face when using off-the-shelf software include: Integration. As it has been built to broadly fit the needs of businesses, it can be difficult to integrate off-the-shelf software with other existing systems, and extra time and resource is often required to consolidate them. Inflexibility and scalability. As a “one-size-fits-all” model, off-the-shelf software may be more difficult to customise for a business’ specific workflow or operational process and therefore is likely to stifle scalability as well. Data partitioning. In off-the-shelf software solutions, businesses have no influence over how data is partitioned according to its sensitivity. Solutions need to be universal, so often all data sits in one database or domain and cannot be separated and delivered by a service on demand. Data leakage. Usage monitoring systems are also a frequently overlooked aspect when choosing a solution off-the-shelf. Monitoring allows early detection of irregularities and can prevent or limit data leakage. Limited support.
While off-the-shelf software tends to receive regular updates, troubleshooting, and bug fixes, it is unlikely that a business will receive dedicated support from the vendor that goes beyond that. Due to these challenges, off-the-shelf software does not, overall, allow businesses to achieve the best cybersecurity posture possible. What’s more, with every industry having to adhere to stringent security requirements, each business needs its own customised strategy to ensure it is resilient but also compliant. So, how can customising software help? Security baked-in When it comes to cybersecurity posture, every business will be different – there’ll be different measurements, different goals, and therefore different priorities. These also may change over time as the business grows. As such, cybersecurity must always be considered at the start of any new process, and that includes software development. Building in security from the start is critical, but it can sometimes be missed if software is being developed on a mass scale. The benefits of custom software development tailored to a specific business are that not only can developers bake in security from the start, but they can also ensure the relevant security policies are in place and adhered to, as well as provide ongoing consultancy and support for the business using their software. Tailoring software to your needs Software can be customised in several ways, from a few tweaks to existing platforms, to a fully developed piece of software designed for a specific business’ needs. Therefore, it’s important for a business to assess its priorities and challenges before embarking on a software development venture. Some considerations include: Gathering input from stakeholders to understand how everyone across the business uses the different software that is in place, how it benefits them as well as any current challenges that need to be addressed. Conducting an audit of business processes and what existing systems and software are in place to currently support them. Defining requirements to outline what the software needs to do from a feature perspective as well as what else it can support across wider business processes. This is where cybersecurity should be included in the conversation. Engaging with third-party experts who can consult on the software development process or even design it themselves with the business priorities in mind. This helps businesses to understand the technical feasibility of their requirements as well as the best approach for implementing custom software solutions. Confirming the budget and resource available for the project and aligning with any third-party providers. Custom software that cannot be finished due to a lack of money, time, or resource, will likely present more challenges. Reap the rewards Further to the benefits discussed above, aligning cybersecurity and software development through a customised project specific to a business will make both business processes and cybersecurity initiatives easier. Trying to do one without the other creates double the work in the end and is unlikely to result in the improvements or benefits businesses are looking for. Any business currently assessing its cybersecurity posture or current software setup, or even better – both, should look into customising its software to suit its needs. Who knows what the rewards will be.
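To make the "security baked in from the start" point above a little more concrete, here is one small, generic example of the kind of habit custom development can enforce from day one: parameterised database queries, so that user input is always treated as data rather than as part of the SQL statement. The table and input are illustrative only.

```python
import sqlite3

# Illustrative in-memory database; any SQL database and driver follows the same pattern.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (id INTEGER, email TEXT)")
conn.execute("INSERT INTO users VALUES (1, 'alice@example.com')")

user_input = "alice@example.com' OR '1'='1"  # a classic injection attempt

# Unsafe pattern (never do this): the input becomes part of the SQL statement itself.
# query = f"SELECT id FROM users WHERE email = '{user_input}'"

# Safe pattern: the driver treats the input purely as data, so the injection matches nothing.
rows = conn.execute("SELECT id FROM users WHERE email = ?", (user_input,)).fetchall()
print(rows)  # []
```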
### The three stages of AI in innovation We’ve all heard a lot about AI recently, fuelled by the arrival in our daily lives of OpenAI’s ChatGPT. Generative AI (GenAI) was unheard of 12 months ago (aside from people who worked in the AI sector) and is now one of the most hyped technologies in years, with further GenAI tools such as Google Bard, Microsoft Bing and ChatSonic emerging. I won’t get drawn into the “shiny object” hype or the safety considerations around this important technology, particularly given the recent AI Safety Summit held at Bletchley Park which was dominated by GenAI talk. Instead, I’d like to offer a different perspective – how AI is used for innovation. Innovation is integral to meeting some of the challenges the world is facing right now, from finding solutions to help address the climate crisis to how best to supply clean water in developing countries. As a practitioner who has been working with machine learning and, more recently, AI and Large Language Models (LLMs) such as ChatGPT in the field of innovation, I know how much value it can add in this area. I’d categorise the usage of AI for innovation into three main stages. Stage one: connecting the dots We founded Wazoku on the belief that humans can solve any problem when supported by the right technology. We swiftly understood that basic AI and machine learning could help reduce complexity and duplication in the basic process of generating and gathering ideas. It could also accelerate collaboration and transparency by helping people who had similar ideas find each other in large, complex organisations and work together on ideas and innovation. The machine made the connections between the ideas that the humans created. This ability to make connections also means that any ideas that don’t work right now or for their original purpose aren’t lost or forgotten. Instead, they live in the system memory and when, in the future, a company is looking for an idea in that area, the AI can connect that historic idea with the current need for a solution. Recently, one of our engineering customers solved an urgent issue with an idea that had first been captured more than four years previously. Stage two: understanding the patterns Open innovation is an approach to problem solving that involves organisations posing questions (Challenges) to thousands of experts (Solvers) and capturing their input. It is an extremely powerful proposition and can help solve a staggeringly diverse array of problems and challenges, across many industries and sectors. These range from helping to create cleaner drinking water in developing countries to protecting astronauts on space walks, and many more in between. However, knowing the nature of the thousands of Challenges they have solved as well as the specific skills and expertise these Solvers have is extremely difficult. Simple tagging and keywords could never uncover the rich details required to truly understand questions such as ‘are there trends or patterns in Challenges and solutions?’ or ‘what skills, knowledge, and experience have Solvers demonstrated that are not claimed on a CV?’ This can work both ways. People could list a skill on a CV that they actually don’t have, but also, the rigid structure of a CV can lead to the vital information required on a Solver’s overall skillset being left off. AI has helped achieve this understanding.
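To give a feel for the kind of matching that sits behind both of these stages, here is a deliberately simple, illustrative sketch: score pairs of ideas by word overlap and suggest a connection when they look alike. The ideas, authors and threshold are invented, and a production platform would use trained language models and embeddings rather than raw word overlap.

```python
from itertools import combinations

# Invented example ideas captured from three colleagues in a large organisation.
ideas = {
    "maria": "reduce packaging waste by switching to recycled cardboard inserts",
    "dev": "switch our product packaging to recycled cardboard to cut waste",
    "lukas": "offer employees a cycle-to-work scheme to lower commuting emissions",
}

def similarity(a: str, b: str) -> float:
    """Jaccard similarity over word sets - a crude stand-in for embedding-based matching."""
    words_a, words_b = set(a.lower().split()), set(b.lower().split())
    return len(words_a & words_b) / len(words_a | words_b)

for (author1, idea1), (author2, idea2) in combinations(ideas.items(), 2):
    score = similarity(idea1, idea2)
    if score > 0.3:  # arbitrary illustrative threshold
        print(f"connect {author1} and {author2} (similarity {score:.2f})")
```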
Through analysis of data covering more than 20 years of solutions to Challenges, it is now possible to see precisely what skills are contained in a community, how trends in problems have changed, and the evolution of AI Challenges over time. It would be impossible to understand the rich complexity of a crowd without AI reading, understanding, and categorising information. This process of clustering would have taken thousands of hours of work and still wouldn’t have been anywhere near as comprehensive or effective. Stage three: drawing the picture The emergence of GenAI has heralded a further boost to innovation. It can assist with approaches to writing or refining problem statements or potential solutions and can also reframe problem statements to direct the question to different crowds. This becomes incredibly useful for making a technical topic more accessible to non-technical Solvers, inviting a broader range of perspectives when seeking ideas. This will play a critical role in encouraging more diverse groups to participate in the innovation process, and will make it faster and simpler to launch innovation Challenges. Crucially, the addition of more diverse thinking should also generate more impactful ideas. The scale of some of the innovation Challenges the world is currently facing means they will not be addressed without the support of technology. It’s my belief that AI can be a powerful aid to human ingenuity and creativity and can play a vital role in addressing those challenges and helping to make the world a better place for us all. ### CIF Presents TWF – Frank Jennings In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will explain that, in a time of chip shortages, major technology companies are looking to develop custom silicon optimised for generative AI and LLM models. Good for business, good for the planet, but hard to do. The interview guest is our good friend Frank Jennings, The Cloud Lawyer, partner at Teacher Stern LLP. We'll be quizzing Frank on the legal and ethical implications of AI and the training data it uses. Frank will give us the lowdown on how to avoid the landmines in the AI landscape. ### CIF Presents TWF - Cécile Rénier In this tenth episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will discuss artificial general intelligence (AGI), that's artificial intelligence that surpasses human intelligence in almost all areas, triggered by Softbank's CEO suggesting, just last week, that it will be here within 10 years. David believes it will be gradual, inevitable, and sooner than you think. The interview guest is Cécile Rénier, Wolters Kluwer's VP Customer Service. She will discuss their approach to customer service, customer engagement, and how they use it as a competitive advantage. She will explain the change in mindset, culture, and leadership required for real transformation. ### CIF Presents TWF - Miguel Clarke In this ninth episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will discuss his take on Meta's latest announcements of their Quest 3 headset, Ray-Ban Smart glasses, and the collection of AIs that you can access from their various message and chat services. The interview guest is Miguel Clarke, Cybersecurity Evangelist from Armor Defence. He has an ex-FBI agent's perspective on the security and threat landscape that includes applying military thinking to counter the bad actors.
### Discover The Final Way to Cut Cloud Costs Remember when they told you that you needed this shiny new thing called ‘cloud?’ That it was silly to pay for more computing resources than you needed most of the time, just because it was required to cope with traffic and demand peaks (like Christmas) a couple of times a year? In fairness, this was true—cloud is perfect for scaling up and scaling down and it was (and is) a better way to even out your computing resourcing. But, they didn’t tell you that this was only one sort of scaling—scaling the application up and down to serve varying demands. If you want more people connecting to your service, doing more transactions and creating more data, you need a bigger database at the back end to do this. ‘Scale’ doesn't necessarily mean petabytes of data; very few CIOs have applications requiring that much data at run time. What we’re talking about is the number of concurrent active users rather than storage. If hundreds of thousands of people connect at the same time, you need a database capable of managing that—it’s a processing issue, not a storage one. Ten years ago—and often still now, to be honest—we worked around this by buying lots of little application servers to run those connections. But, to make it happen they all connected back to one big Oracle or DB2 style box somewhere in a data centre. Back then, that Oracle database was bigger than it needed to be because of Black Friday, or Singles Day in China. And it still is. In other words, the problem we thought we'd cracked didn’t go away. Monolithic databases are still being bodged to make cloud business work. Why do I say ‘bodged’? Because these engines were just not built for the cloud and the unique way it works with data. You have kept paying for proprietary software, running on specialist hardware. Scale your database capability up and down as you scale your applications This set-up is limiting the positive impact of cloud on your overall budget. It’s a cost you’ve kept having to carry, and it also means you're not exploiting all the capability and innovation the cloud offers. Cloud has been fantastic for one part of the scaling challenge, application scaling—but that’s really only possible by using as big a single database as you can power up. The uncomfortable truth about using cloud to scale a business process is that you’ve always had to use traditional, monolithic SQL databases to make it fulfil that promise of smoothing out your IT peaks and troughs. This was pricey enough before; now it’s getting out of hand. How can you solve this challenge and make business savings? One option is to adopt a cloud native database which allows you to scale your database capability up and down in the same way you scale your applications. Instead of a single, giant machine and expensive proprietary licences, you could have three smaller ones, perhaps in different geographies. Then, if one suffers an outage, you’d fail over to the other two, keeping up and running without any business (or customer) impact. It’s also cheaper. Running a big transactional app in the cloud might require renting a 32, 48, or even 64 core server… but, two 8 core virtual machines are cheaper than renting a 16 core machine, etc. In fact, even if you don't need to scale up immediately, it quickly becomes cheaper to have a number of small machines cooperating to carry your workload, rather than one big machine. Why isn’t this the norm, though? Because we left the cloud revolution unfinished. 
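We will come back to why in a moment. First, to make the "several small nodes instead of one big box" idea tangible, here is a hedged sketch of what connecting to a distributed, PostgreSQL-compatible cluster can look like from the application side: the standard driver accepts a list of hosts, so if one node is down it simply tries the next. The host names, credentials, table and query are illustrative placeholders, and the port shown is the YSQL default used by YugabyteDB-style deployments (plain PostgreSQL uses 5432).

```python
import psycopg2  # standard PostgreSQL driver; distributed SQL databases such as YugabyteDB speak the same wire protocol

# Illustrative multi-node connection: the driver tries each host in turn, so losing one node
# does not take the application down. Hostnames, credentials and the query are placeholders.
conn = psycopg2.connect(
    host="node1.example.internal,node2.example.internal,node3.example.internal",
    port=5433,                           # YSQL default in YugabyteDB-style clusters; vanilla PostgreSQL uses 5432
    dbname="orders",
    user="app",
    password="secret",
    target_session_attrs="read-write",   # only settle on a node that can accept writes
)

with conn, conn.cursor() as cur:
    cur.execute("SELECT count(*) FROM orders WHERE status = %s", ("open",))
    print(cur.fetchone()[0])
```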
We have done so much – moving capital expenditure into operating expense, lifting and shifting workloads (then finally redesigning and optimising them). Today, the application layer of most organisations is much healthier than it was, as we rapidly moved from virtual machines to Docker containers, and now in many instances Kubernetes pods. All this has made building applications and distributing them resiliently around the world more straightforward and cost effective. However, database evolution has lagged behind and is the final step we need to take to complete the cloud revolution. It’s a step that enables not vertical but horizontal scaling. Transactional consistency and the ability to horizontally ‘scale out’ A good example of this is a major US retailer we worked with. Though a very successful bricks ‘n mortar brand name, the company was moving aggressively into the e-retail space even before Covid accelerated the importance of a strong online store. It was doing all the right things with microservices and modern application development techniques and technologies, but the database side posed a problem. Even the biggest database they could build, on the biggest virtual machine they could rent in the cloud, could not handle their load. To work around this, its engineers had to do messy fixes like sharding, which started to get complicated and raised the spectre of transactional inconsistency. This meant that one shopper might ask for something that the system had just placed in another customer’s shopping basket elsewhere. It got very messy. Their solution was to adopt a cloud native database that gave them both transactional consistency and the ability to horizontally scale out. And it worked! The company now benefits from sub-10 millisecond performance, boosted operational simplicity and efficiency, and crucially, can now easily meet massive eCommerce peaks. To complete the cloud revolution and achieve the horizontal scaling that you want, it’s worth considering distributing your business data and sharing the load among lots of machines rather than pouring all your money and hopes into one database Leviathan. The answer is a scalable, resilient open-source-based PostgreSQL database (like YugabyteDB). This kind of technology is available now and easy to install. It allows you to finally achieve the time and cost savings promised by the cloud all those years ago. ### CIF Presents TWF – Michael Anderson In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will use the news section to explain Cloud Industry Forum’s position on the recently announced Competition and Markets Authority’s investigation into the practices of the dominant cloud players, AWS, Microsoft, and Google. The interview guest is Gregory Lebourg, from CIF member OVHcloud. We'll be having an in-depth discussion on the vital topic of Sustainability, and OVHcloud's comprehensive approach to it. ### Application observability is the foundation for sustainable innovation It’s become almost clichéd within IT to talk about the ever-increasing speed of innovation. For several years, the overriding focus for organisations has been on delivering digital transformation as fast as possible to respond to changing customer needs and enable hybrid work. Applications have been at the heart of that transformation.
Most recently, the move to modern application stacks and no code and low code platforms has enabled organisations in all sectors to increase release velocity and deliver new applications and digital services at speeds which were previously unimaginable. However, accelerated innovation has come at a cost. IT teams find themselves consumed by soaring levels of complexity and overwhelming volumes of data, which are being spawned every second by cloud native technologies. Technologists are finding it almost impossible to manage application availability, performance and security within highly volatile and dynamic cloud native environments. And this is significantly increasing the chances of organisations suffering from disruption and downtime to their applications. Worryingly, the unrelenting pressure being felt in IT departments is also taking a more personal toll on technologists. In the latest research from Cisco AppDynamics, The Age of Application Observability, 79% of global technologists claim that they are experiencing more stress and tension within their IT department because of the switch to hybrid environments, and this in turn is leading to growing numbers of technologists leaving their jobs. Without a doubt, there is now a widespread recognition that organisations need to find a more sustainable approach to innovation which prioritises not only application performance and security but also the wellbeing and productivity of technologists. Accelerated innovation has led to crippling levels of IT complexity The new research finds that 49% of new innovation initiatives are now being delivered with cloud native technologies, and this figure is expected to climb to 58% by 2028. The reasons for this are quite clear - technologists believe that increased adoption of cloud native technologies will enable them to deliver applications at least four times faster than with traditional, on-premises technology. Given the extent to which innovation has already sped up since the outset of the pandemic, it’s quite staggering to think that technologists still see this much room for further acceleration. The problem is, however, that most IT teams are already buckling under the massive volumes of metrics, events, logs and traces (MELT) data which are coming at them from microservices and containers. They can’t cut through this data noise to identify and resolve application performance issues. And where organisations are operating hybrid environments, with application components running across cloud native and on-premises technologies, IT teams can’t get a clear line of sight up and down the application path. In fact, 78% of technologists claim that the increased volume of data from multi-cloud and hybrid environments is making manual monitoring impossible. They’re still relying on multiple monitoring tools and so don’t have unified visibility across their hybrid applications to quickly pinpoint and troubleshoot issues. The result is growing concern amongst technologists that their current approach to innovation just isn’t sustainable. They haven’t got the tools, processes, structures, and skills required to deliver accelerated digital transformation while maintaining world-class application performance. The shift to application observability is now vital to achieve innovation goals More than anything else, technologists now point to application observability across multi-cloud and hybrid environments as important for their organisation to deliver accelerated and sustainable innovation. 
In fact, 85% of technologists point to observability as a strategic priority for their organisation. Application observability provides IT teams with unified visibility across both cloud native and on-premises environments, ingesting and combining vast volumes of telemetry data from cloud native environments and data from agent-based entities within on-premises applications. It enables technologists to access real-time insights into application availability and performance across their hybrid environments. Application observability allows IT teams to take a more proactive approach to managing their hybrid IT environment, integrating security into the application lifecycle from day one to develop more robust products. Technologists can access deeper insight into their applications, enabling them to detect issues, understand root causes and remediate far more quickly. Interestingly, 81% of technologists report that they’re coming under increasing pressure to validate the impact of their cloud investments. Organisations are putting more scrutiny on their digital transformation budgets amid ongoing economic challenges. The worry is that if IT leaders can’t demonstrate the value that they are delivering then investments will decline, and the current pace of innovation will inevitably slow down. However, application observability allows technologists to correlate IT data with real-time business transactions, so IT teams can measure, optimise and report on the impact of their innovation projects. This is vital to keep investment flowing. It’s important to remember that the implementation of application observability solutions needs to be complemented by cultural and structural change. IT leaders need to break down silos between people, processes and data within the IT department, with technologists from all disciplines coming together to achieve shared goals around innovation and application performance. Ultimately, application observability can enable IT teams to get away from constant firefighting and get back on the front foot - indeed, 88% of technologists feel that observability will allow them to be more strategic and spend more time on innovation. For organisations, this statistic alone should be enough to convince them of the need for application observability. It’s about arming their best technologists with the insights they need to innovate at speed, while ensuring applications are performing at an optimal level at all times. IT teams can regain control and deliver digital transformation on a sustainable basis for years to come. ### “Computer – Enhance!” Using AI to maximise developer capabilities. Developer teams are facing ever-increasing pressure to innovate and perform as businesses evolve into software-driven organisations. In what can be a cutthroat race to beat competitors, even the slightest advantage can be transformational. Yet given the lack of best practices, and often understanding, around generative AI, CIOs face a crucial dilemma. Lock developers away from AI tools entirely, and competitors willing to exploit the technology will leap ahead. Yet overly permissive use can create major issues, from unwittingly sharing sensitive data and IP, to accidentally committing copyright theft. Unwise use of AI can even cause reputational damage. Ultimately the best position lies somewhere in the middle, and since developers are already making use of AI tools, finding a responsible approach that creates maximum value is essential. 
Due diligence The first step is understanding the landscape. A sign of AI’s rapid adoption is the growing awareness of “shadow AI” that exists outside of IT teams’ visibility, for instance because developers are using ChatGPT or other tools of their own volition. This use might be harmless, or it might be opening the organisation up to huge risks. The only way to be sure is to discover exactly what tools are being used, and how. Nothing is more dangerous than an unknown unknown. The IT team can then begin building its AI policy, using its investigation into shadow AI to guide its thinking. By knowing what developers and others want to accomplish with AI, the organisation can ensure its own approach supports these goals in a safe, controlled manner. This won’t be an overnight process – as well as existing AI use, teams will need to understand the different tools available; their unique strengths, weaknesses and vulnerabilities; and the developer team and business’s overall goals. But it will be the key to answering those questions. Owning the means of production The next step to consider is mitigating the risks associated with generative AI. For most, the largest is the way in which these tools learn and share data. Many individuals still don’t understand that all the information they share in their AI queries is essentially public. Anything entered, from an innocuous question to the layout of a specific data set, will be shared with the AI’s central database – potentially informing answers to other users, and permanently existing outside of the organisation’s control. This question of ownership will be familiar to enterprises that have already battled to retain control over their data when adopting cloud services. The most desirable solution might be a non-public AI instance: where the organisation retains control of the AI’s database, and so all of the data in queries and responses. But this isn’t a simple fix: owning and managing such an instance will require resources enterprises might not have access to. And if an organisation owns the database, it also needs to ensure that there is enough data for the AI to “learn” from. After all, the more developers need to second-guess an AI and help it towards learning, the less value it has. The Case for an Open Source Foundation A way to mitigate legal questions in software, especially in the open source world where ownership questions are legion, is to go through a foundation. This model has worked well with open source software, clarifying ownership and making companies feel safe in using that software. And while some companies are currently training proprietary LLMs, others are making their models and data sets open source. There are still many unanswered questions: who has the right to use a model, to train it, or to fine-tune it? Can that be done with any data? What about a 'contaminating' licence like the AGPL and the like? And what about ethical questions, such as those addressed by UNESCO's recommendation on the ethics of AI (https://www.unesco.org/en/artificial-intelligence/recommendation-ethics)? Something modelled after the Eclipse or Apache Foundations would bring clarity and possibly encourage the adoption of open source LLMs. Fixed focus The crucial question for most organisations will be: where and how can developers use AI in a way that helps their work without increasing risk? This is where a more targeted approach can pay dividends.
For instance, forbidding developers from using AI to develop production code that directly uses company data isn’t an overly authoritarian measure – and most developers would resent the idea that they can’t create their own final code. However, suggesting and troubleshooting test code that doesn’t use corporate data; answering common queries; or suggesting ways to follow best practice are all ways AI can support and enhance developers without creating risk. Ideally, this approach should be integrated with the tools developers already use. For example, if existing development tools or databases include AI-supported options they will be more accessible, and become part of developers’ workflows much more easily. The critical component AI hasn’t changed an essential truth. Any business’s most important asset is still people – whether customers, partners or employees. Despite fears that AI will replace developers, it lacks the creativity, reliability and ultimately the intelligence to do so. Instead, if used the right way it can supercharge performance without increasing risk – giving development teams a platform to apply their skills and creativity where they are most valuable. ### Navigating a multi-cloud ecosystem The cloud operating model has evolved from an emerging trend to a fundamental backbone in modern business infrastructures – with Gartner forecasting that worldwide public cloud end-user spending will reach nearly $600 billion by the end of 2023. As organisations continue to embrace cloud technology – including public clouds, private clouds, and cloud principles in dedicated IT environments – they are met with boundless opportunities to drive agility, efficiency and innovation at a pace previously unimaginable. This has led to the rapid increase of cloud-based software-as-a-service (SaaS), platform-as-a-service (PaaS) and infrastructure-as-a-service (IaaS) solutions, granting businesses with fast and easy access to computing resources without the need to invest in, or maintain, expensive hardware or infrastructure. Yet, alongside the quick expansion of cloud services, we are facing a complex challenge that can impede the agility promised by this technology. Namely, the formation of silos within a rapidly growing multi-cloud environment.  With more cloud providers than ever using a wide variety of different operating frameworks, businesses must find a way to connect the disparate parts of their cloud environments – or lose the very efficiencies the cloud was built to provide. The rise of the multi-cloud era Cloud computing is far from being a new technology, yet the way in which companies use the cloud has significantly developed over the years. What initially started as the use of ‘virtual’ private networks, quickly changed into businesses adopting a hybrid model bringing together both private and public clouds to tailor implementations and achieve more comprehensive cloud service integration. Now, as cloud technology continues to evolve and new offerings are brought to market, companies are coming to understand the benefits of splitting their cloud usage between more than one provider. This is where the multi-cloud strategy comes in, as it offers businesses the ability to leverage the best elements of different cloud infrastructures while neutralising any negatives, optimise costs across their entire cloud environment, and avoid vendor lock-in.  
The potential of these advantages has led to the rapid adoption of multiple cloud services, with research indicating that 85% of organisations have deployed applications on two or more IaaS providers. This is a significant jump from 2020, when 70% of companies were still tied to one cloud service provider.

The challenges of a multi-cloud environment

As is often the case, the opportunities presented by multi-cloud usage come alongside their own unique set of challenges. By introducing new services, businesses are adding to the complexity of their IT infrastructure and increasing the workload of their IT teams. As each application operates in isolation, managing the integration, data flow and overall efficiencies of each cloud service can be time consuming, which may prove particularly troublesome for teams already struggling with the ongoing IT skills gap.

Security, compliance, and performance can also become harder to manage across multiple cloud services. While the ability to move workloads to public providers and between IaaS platforms has its benefits, securing data and monitoring vulnerabilities across multiple cloud platforms will require more time and effort. The same can be said for monitoring and enhancing performance, as well as ensuring compliance with any regional legislation in the case of using services from different locations. To get the most out of a multi-cloud environment, businesses must develop a firm understanding of exactly where their data is going to be stored and anything that might impact its usage, the security requirements across each platform, how easy it will be to move from one solution to another, the process of deleting data and, ultimately, how they can prove its ROI.

Managing multi-cloud complexities

While the complexities of a multi-cloud environment may be varied, they are not insurmountable. Numerous cloud management tools have emerged that can consolidate multiple cloud applications and allow these to be managed centrally, helping to maintain consistency, compliance, and security across multiple platforms. By using such a tool, businesses can alleviate some of the additional pressures imposed by working across multiple cloud platforms.

Importantly though, when choosing to go down the multi-cloud route, you also need to decide what your objectives are for doing so and be able to quantifiably measure the ability of each cloud provider to deliver your objectives. It's no good selecting a cloud provider because they are the cheapest – only to discover that it does not deliver the performance, resilience, or reliability you need to run your business. The cloud (and the applications running in it) is just a vessel for your business needs, so you need to develop a testing process that can be easily repeated with each new platform, which allows you to measure what you have already, and then compare that to what the different cloud environments can provide. Comparing "apples with apples" will then guarantee that you pick the right environment for each task.

It is also important to continue this testing process. Just like your own datacentre, cloud environments change over time, and you need to make sure the decisions you made previously are still the right ones now. Using an MSSP to run these tests periodically ensures that you are aware of issues that may be affecting your business, and allows you to either alter the configuration of your cloud environment, or re-evaluate your provider selection. A simple, repeatable check of this kind is sketched below.
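To make that "apples with apples" comparison concrete, here is a minimal sketch of a repeatable check that runs the same timed request against each candidate platform and reports comparable statistics. The endpoints are placeholders, and a real test plan would exercise representative workloads rather than a single health check.

```python
# Minimal sketch: run the same timed check against each candidate platform
# and compare like-for-like. The endpoints below are placeholders, not real services.
import statistics
import time
import urllib.request

CANDIDATES = {
    "provider-a": "https://provider-a.example.com/healthcheck",
    "provider-b": "https://provider-b.example.com/healthcheck",
    "on-prem-baseline": "https://baseline.internal.example.com/healthcheck",
}

def timed_request(url: str) -> float:
    """Return the round-trip time in seconds for a single request."""
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=10) as response:
        response.read()
    return time.perf_counter() - start

def benchmark(url: str, runs: int = 20) -> dict:
    """Repeat the same check several times so results stay comparable."""
    samples = [timed_request(url) for _ in range(runs)]
    return {
        "median_s": statistics.median(samples),
        "p95_s": sorted(samples)[int(0.95 * (len(samples) - 1))],
    }

if __name__ == "__main__":
    for name, url in CANDIDATES.items():
        results = benchmark(url)
        print(f"{name}: median {results['median_s']:.3f}s, p95 {results['p95_s']:.3f}s")
```

Because the same script runs unchanged against every platform, and can be re-run on a schedule, the results remain comparable as each environment changes over time.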
A unified path forward

To fully harness the cloud's potential and maintain a competitive edge, businesses must focus on integrating their different platforms and applications. There are numerous cloud management tools available, providing a solution to dealing with the complexities of a multi-cloud environment; however, managing this in-house will still require significant resources and expertise. An alternative option would be to outsource cloud management to a managed service provider. This can help those without the ability to manage multiple clouds in-house to develop a cohesive and agile strategy, enabling a smoother flow of data and processes across various cloud platforms, while allowing them to focus on their core business responsibilities.

As the cloud landscape continues to grow and diversify, the need to unify disparate cloud environments increases. Whether this is conducted in-house or via a third party, the focus is no longer on simply adopting the cloud; it is now on how to consolidate platforms and optimise them for maximum benefit.

### 4 Benefits of Leveraging AI-Powered SaaS Marketing Tools

Artificial intelligence (AI) has become a buzzword in the SaaS industry, and rightly so. AI tools can help SaaS businesses resolve their key challenges via automation. No wonder most leading industry players and marketers are investing in AI-powered SaaS marketing tools. In fact, the artificial intelligence SaaS market size, which was $73.8 billion in 2020, will cross $1,547.57 billion by 2030, with a CAGR of 37.66% from 2022 to 2030. What's more? 88% of marketers want their firms to increase the use of automation and AI to meet customer expectations and gain a competitive edge. If you are still wondering whether to invest in artificial intelligence for your SaaS, this post is for you. We'll discuss the top four benefits of leveraging AI-powered SaaS marketing tools.

1. Unveil and Capitalise on the Latest Industry Trends

Uncovering the latest industry trends fetches crucial insights into ever-evolving customer expectations and demands. This helps marketers align their SaaS services with customer preferences, thus meeting their needs. The result? Enhanced customer satisfaction directly impacts sales and revenue. It supports a firm's revenue marketing efforts by accelerating repeat business, upselling opportunities, and a high customer acquisition rate through referrals and positive word-of-mouth.

Manual analysis of market trends can be daunting and inconsistent because of human errors. However, with AI-powered SaaS marketing tools embedded with advanced predictive modelling and machine learning algorithms, all this becomes a breeze. They can help you:
- Collect, compile, and analyse vast datasets on a centralised platform to identify emerging market trends.
- Gain in-depth insights into customer choices and behaviour in the past and present.
- Identify gaps in the market and develop an actionable strategy to attract the target audience.
- Gauge the potential of your existing SaaS marketing strategy and predict outcomes like consumer acquisition, engagement, satisfaction, and more in real time.

2. Personalise Your Marketing Campaigns

Personalising marketing campaigns can help you create tailored content and deliver it to the right audience. This can help boost customer engagement, conversions, and ROI. Here's a quick example of how AI-powered SaaS tools can help you create personalised messages.
Say, your SaaS firm has been using Sitecore (a content management system), which has the capability to collect data on user interactions. However, Sitecore's content architecture tends to hinder UX. Now, you decide to migrate to WordPress for several reasons, such as user interface issues, limited customisation options, and more. The Sitecore to WordPress migration can be tricky, especially because crucial data is at stake. Here, using AI solutions can help maximise data's utility.

For instance, an AI-powered SaaS marketing software with predictive and prescriptive analytics capabilities can help you collect and analyse audience behaviour patterns and choices from Sitecore's data. This allows you to gain valuable insights into target audience needs and create highly personalised content via actionable recommendations (offered by prescriptive analytics) for your WordPress website. The tool can recommend specific actions like adding blog posts on trending topics, exclusive offers, etc. This can boost your customer engagement significantly, thus strengthening relationships. In fact, Gallup research reveals that customer engagement can boost your revenue by 23%.

3. Deliver an Exemplary Omnichannel Experience

Customer journeys are becoming more complex day by day. Thanks to the internet, customers have access to a wealth of information. In fact, they research your website, social media, and other platforms to check out your existing customers' testimonials, experiences, and more before making buying decisions. This means you must provide a seamless omnichannel experience across all platforms, or else you can lose valuable opportunities.

Take, for instance, HubSpot. This B2B marketing and sales software provider ensures that each touchpoint depicts a uniform message without compromising the brand's tone, delivering the same message across its social channels and chat provision alike. With an AI-powered SaaS marketing tool integrated with data analytics, you can build an omnichannel experience like HubSpot's. The tool can help you collect, compile, and analyse customer interactions across multiple digital channels. With these insights, you can gauge customer likes and dislikes and create a uniform experience across all platforms.

4. Track and Measure KPIs that Matter the Most

Tracking key performance indicators (KPIs) can help you understand the effectiveness of your marketing efforts. However, tracking random KPIs won't be helpful. It's imperative to understand and monitor the metrics that matter the most. For instance, the chief marketing officer (CMO) should monitor KPIs like monthly recurring revenue (MRR), churn, etc. This can help them understand your marketing campaign performance and revenue growth. On the other hand, the chief financial officer (CFO) should keep a tab on metrics like customer acquisition cost (CAC), customer lifetime value (CLTV), etc.

AI-powered SaaS marketing tools with customised dashboards can help your team understand and track the KPIs that matter the most. The AI-powered marketing dashboard offers a customised view of KPIs according to the user's job role, responsibilities, and goals. This customised, data-driven approach can ensure informed decision-making, thus enhancing revenue growth.

Summing Up

AI-powered SaaS marketing tools, especially the ones with data analytics integration, can breathe new life into your marketing efforts.
From helping you understand the latest trends and target audience expectations to fulfilling them and achieving high ROI, these tools can prove a game-changer, as shared in this post. So, deploy AI-powered SaaS marketing tools and take your business to new heights.

### CIF Presents TWF – Paul Bevan

In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will use the news section to discuss two recent events he attended. The first was PICTFOR's talk Tackling Online Toxicity and Misogyny, prompted by the Online Safety Act. The second event was Envision London 2050, where he'll discuss futurist Amelia Kallman's great keynote and wonder how smart London will be in 27 years. Our interview guest is Bloor Research analyst Paul Bevan, covering the current cloud landscape, the impact of GenAI, and asking what comes next.

### CIF Presents TWF – Gregory Lebourg

In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will use the news section to explain Cloud Industry Forum's position on the recently announced Competition and Markets Authority investigation into the practices of the dominant cloud players, AWS, Microsoft, and Google. The interview guest is Gregory Lebourg, from CIF member OVHcloud. We'll be having an in-depth discussion on the vital topic of sustainability, and OVHcloud's comprehensive approach to it.

### Supplementing CNAPP: Why APIs and cloud apps need additional API security

Cloud security is a complex issue, particularly when considering APIs that are now the glue connecting mobile and web applications. APIs are highly visible and well-defined doorways into the data and business processes of organisations and are the top attack surface exploited by cyber criminals. Even the most compliant and secure APIs can be exploited by attackers in the form of business logic abuse and automated threats, resulting in data loss, fraud, and business disruption. CNAPP (Cloud-native Application Protection Platform), a new category of cloud security defined by Gartner, will provide some coverage for application security, such as scanning application code for OWASP vulnerabilities and enabling DevOps teams to remediate security issues. However, API security can deliver a different set of capabilities, providing a complementary approach.

Blindspots

When looking at cloud application security, it's important to consider that an application will require different components such as virtual workload, data-store, network, and identity services to support it. These components and the effort to secure applications deployed in the cloud have created a patchwork of security solutions resulting in a disjointed and complex deployment that lacks a centralised view. Consequently, customers are presented with isolated silos of information, leading to blind spots between different security products that could potentially enable attackers to exploit applications or infrastructure.

The goal of CNAPP is to combine the functionality of existing products such as Cloud Security Posture Management (CSPM), Cloud Workload Protection Platforms (CWPP), Cloud Infrastructure Entitlement Management (CIEM), and Cloud Service Network Security (CSNS) into one solution. CNAPP not only simplifies the management of disparate security products and minimises security blind spots for security and DevOps teams, but it also helps with cloud misconfiguration.
Constant changes often lead to human error that can result in misconfigurations of security groups, ACLs, and network and security policies, and these mistakes can lead to exploitation. As CNAPP identifies deviations from the desired security posture and vulnerable components, unused workloads, and areas of application misuse and attack, it can help prevent misconfiguration and can increase developer and DevOps team productivity. Potential threats in the CI/CD pipeline phases can also be detected, reducing the number of bug fixes and merge/pull requests. Plus, it can be used to conduct vulnerability testing, performing periodic security scans across cloud components such as containers, serverless environments, and VMs to pre-empt any security issues.

Yet, while CNAPP can provide a unified view between different security products that could potentially prevent attackers from exploiting applications or infrastructure, protecting APIs requires a different approach. APIs can be exploited in numerous ways, making attacks difficult to prevent or detect, so applying API protection that complements CNAPP can help to better secure API-based and cloud-native applications.

CNAPP shortcomings

A common problem, for example, is that business-critical applications can become blocked due to inaccurate threat detection. Using real-time threat detection and mitigation can prevent this and ensure those applications that might be accessible to malicious entities are protected. API-dedicated security can also be used to discover APIs that have been deployed by internal groups without notifying the security team of their existence. These 'Shadow APIs' may have vulnerabilities that have not been corrected and can become an easy target for cybercriminals. This discovery capability complements CNAPP, helping protect unknown APIs and cloud components while CNAPP focuses on known applications and cloud infrastructure.

API applications deployed in the cloud often adhere to agile development methodologies, which see changes constantly introduced to API specifications that, if not monitored, can lead to compliance or data exposure issues. This requires constant inventory and compliance checks to identify and mitigate vulnerabilities, such as those covered in the OWASP API Security Top 10, before they can lead to exploitation. However, CNAPP solutions are not designed to capture deviations from the API specification. Finally, API endpoints tend to have direct access to sensitive data stored in the backend of applications. Detecting any data leakage can be difficult, particularly if the request is made legitimately, but API security can surface sensitive data access, helping to prevent data loss.

Complementary technologies

CNAPP solutions serve a critical purpose in simplifying and managing critical cloud native security and cloud security posture issues. They help security and DevOps teams continuously monitor their cloud infrastructure, ensuring that critical security issues are remedied before attackers exploit them. But cloud native API security solutions can provide a valuable contribution that should not be overlooked, as CNAPPs do not fully extend up the stack to securing mission-critical API applications. For API application protection, organisations need to focus on implementing security solutions that understand how APIs are built, deployed, and exploited and how this can unnecessarily expose applications. A simple illustration of the kind of specification-aware check CNAPP is not designed to perform is sketched below.
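The sketch below shows, in very reduced form, what such a specification-aware check might look like: observed traffic is compared against the declared API specification, and anything undocumented is flagged as a potential shadow endpoint or deviation. The spec and traffic here are hypothetical; a real deployment would derive both from an OpenAPI document and live gateway logs.

```python
# Minimal sketch: flag observed API calls that deviate from the declared
# specification. The spec and traffic below are hypothetical examples.
DECLARED_SPEC = {
    "/api/orders": {"GET", "POST"},
    "/api/orders/{id}": {"GET"},
    "/api/customers/{id}": {"GET"},
}

OBSERVED_TRAFFIC = [
    ("GET", "/api/orders"),
    ("DELETE", "/api/orders/{id}"),   # method not in the spec
    ("GET", "/api/internal/debug"),   # endpoint not in the spec at all
]

def find_deviations(spec, traffic):
    """Return observed calls that are not covered by the declared spec."""
    deviations = []
    for method, path in traffic:
        allowed = spec.get(path)
        if allowed is None:
            deviations.append((method, path, "undocumented endpoint"))
        elif method not in allowed:
            deviations.append((method, path, "undocumented method"))
    return deviations

if __name__ == "__main__":
    for method, path, reason in find_deviations(DECLARED_SPEC, OBSERVED_TRAFFIC):
        print(f"Deviation: {method} {path} ({reason})")
```

Dedicated API security tooling does this continuously and at far greater scale; the point of the sketch is simply that the comparison is made against the API contract itself, which is not what CNAPP products are built to inspect.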
Protecting APIs in this way often requires a focused approach that understands the end-to-end API lifecycle and how to properly secure each stage, from discovery to detection and defence, helping to ensure continuous API protection for the organisation's mission-critical applications.

### 5 ways AI is driving the future of airport operations

AI is charting a new course for airport operations globally, enabling airports to optimise processes that drive efficiencies and improve the customer experience. This is a radical shift for a sector which, until now, has widely relied on manual reporting processes, like Microsoft Excel and PowerPoint, and where, as a result, many operations decisions could only be taken retrospectively, at the end of the week or month depending on the reporting cadence. But by leveraging AI in their operations management platforms, some airports are now seizing the ability to react to situations in real-time for faster and more effective responses, as well as glean predictions that will support longer-term improvements. So, which applications have the potential to make the most significant impact on their operations and, in turn, the passenger experience?

Predictive maintenance

Airports process a huge number of passengers and airlines every day. For instance, more than 7 million passengers passed through Heathrow in September 2023 alone. This constant flow of passengers throughout airports means that even a slight disruption to airline and passenger-facing processes can create significant issues as the situation builds. It is therefore important to anticipate potential issues, wherever possible, and pre-emptively remediate them to minimise any potential disruption. Take a baggage handling system, for example. If a conveyor belt or carousel were to break, it could negatively impact the passenger experience as delays in receiving luggage build up. This is one area where AI-powered predictive maintenance can leverage IoT sensor data to monitor performance and anticipate potential issues to streamline maintenance, which in turn minimises unnecessary expenses and reduces disruption for the airport and its passengers.

Flight management

Flight management systems are complex, but critical to airport operations. These dynamic solutions ensure the efficient processing of aircraft at the airport – which is no small job when you think that in August there were almost 27,000 total plane movements at Philadelphia International Airport (PHL) alone. There are also a number of external factors that can lead to flight disruption, from technical faults to extreme weather. To improve their ability to optimally manage inbound and outbound flights, even in the face of unexpected events, many airports are leveraging AI to automate processes and for deeper situational awareness. Similarly, automating gate management is enabling many operations teams to stop manually manipulating data or updating spreadsheets, with gate usage dynamically planned based on real-time flight data.

Passenger flow management

To ensure a positive airport experience, passenger flow needs to be as seamless as possible, even during peak times. However, this has historically been hard to manage dynamically. For example, it's still not uncommon for security wait times to be monitored with a stopwatch and a clipboard.
Yet, the ability to anonymously track passengers from kerb to gate to get an end-to-end understanding of the passenger journey, as well as where potential crunch points are arising in real-time, is now becoming a reality for the very first time, enabled with AI-powered computer vision technology. The insights enhance airport operations leaders’ ability to make improvements, both to the immediate situation, for example by opening more security lanes or check-in desks as queues build, and long-term improvements, such as needing to increase the frequency of cleaning of bathrooms based on how often they are used. Airports can then better allocate resources and, in turn, streamline service delivery and improve the passenger experience, from check-in and security, through concessions areas, right up until they board their aircraft. Passenger verification Until now, airports have relied on manual boarding pass checks at gates, which can be a slow process and add friction to passenger journeys. This is now being improved with the use of facial biometrics at boarding, which utilises AI to authenticate passenger identities at a more accurate rate than manual checks. The AI scans the passenger’s face and compares it to their previously scanned passport photos, so they don’t need to continue displaying passports and boarding passes throughout their airport journey. Biometric passenger verification may eliminate the need for passengers to show their passports entirely. Indeed, while most airports require facial biometrics and boarding pass scans before passing through security, some cutting-edge airports have begun to ask passengers to pre-register their biometric information prior to arriving at the airport. From there, their identity can be automatically verified at different points throughout the airport journey. The future of airport operations AI presents a clear and substantial opportunity to improve airport operations and support the processing of millions of passengers every month. Real-time insights and predictive capabilities put airports on the front foot to ensure they can deliver a seamless experience to both airlines and passengers. ### The cost of cybercrime to our emotional wellbeing Joan Jett & the Blackhearts had the luxury of belting out those lyrics before the modern internet age, back when counterculture was subversive, cool, and edgy. Today, counterculture is mainstream, and our online reputations are everything. Our digital personas are deliberately curated, highly visible, and tightly managed as we wed ourselves ever closer to the devices in our pockets. So, when accounts get taken over because of credential stuffing and bad actors take advantage, the results can be devastating on a very personal level. Panic, embarrassment, and shame. These are real feelings resulting from things that happen in our digital world. This is especially true in the case of social media account takeover, which the Identity Theft Resource Center (ITRC) has dubbed an “Account Takeover Epidemic.” According to the ITRC, who in 2021 had just short of 15,000 identity crime victims contact them for support services (a record in and of itself), there was a 1044% increase in social media account takeovers from 2020 to 2021. 
As a follow-up, the ITRC conducted a survey of social media account takeover victims and found that 66% reported experiencing strong emotional reactions to losing control of their social media account: 92% felt violated, 83% were worried and anxious, 78% felt angry, 77% felt vulnerable, and 7% felt suicidal. These are all important statistics to consider within the cybersecurity space. And while it may be easy for some to view social media identity theft as a mere inconvenience, these figures illustrate how closely tied one's online reputation is to one's emotional wellbeing.

A couple of friends of mine, Trevor and Stacey, both had their social media accounts hacked by presumably the same credential stuffing attack in July 2022. Neither had set up their 2-factor authentication. Both friends are successful professionals who were active on social media, and one happened to be a moderate crypto enthusiast. The bad actors posted on their Instagram stories a not-so-subtle message about getting involved in a bitcoin mining scheme. It was a screenshot of an iPhone lock screen, which included a picture from their profile (in Trevor's case, a picture of him and his wife) and displayed a bogus text message from Bank of America (BofA), followed by a screenshot from his supposed bank account. While it doesn't take a cybersecurity expert to recognise this was a scam, it could nonetheless prove to be an effective phishing tactic since it is coming from the trusted source's actual account within a social ecosystem not known for abuse.

Curious about the sophistication of these attackers—and because I'll never pass up an opportunity to speak directly to our black-hatted counterparts—I responded to the story to see how effective their messaging was. I know, I know. I'm such a good friend, right?

It was an awful ordeal for both individuals. Trevor was able to use Instagram's facial recognition verification process, which scans your face and compares it against its endless library of tagged photos. He was able to regain access within 27 hours and set up his 2-factor authentication. Stacey, on the other hand, left social media altogether. The ordeal was just too much of an embarrassment and created so much anxiety for her that she just up and left. Decided the whole persona in a digital realm thing was not for her. This is not unusual. More than ever, consumers will stop using a website if their account is hacked.

Panic, embarrassment, and shame. Not the sort of feelings we want customers' end users to have when they rely on our products. And while this example may be specific to social media, the sentiment is something we can all share. Whether it's social media, fintech, ecommerce, or any other organisation with an exploitable user base, credential stuffing is a cat-and-mouse game that is here to stay—and with eyebrow-raising impact. According to Javelin Strategy and Research in their 2021 Identity Fraud Study, account takeover (ATO) fraud resulted in over $6B in total losses in 2020. Companies create new defenses, hackers develop tools to bypass these safeguards, and the cycle continues. So how can businesses fight back? In a recent Aite Group report, risk executives from financial institutions, fintech lenders, and ecommerce companies were interviewed to learn how they are protecting themselves from the escalating volume of ATO attacks.
Key takeaways:
• Most consumers use the same handful of usernames and passwords across websites, creating a vulnerability exploited by organised crime rings.
• The available attack surface continues to expand, making detection and mitigation more complex.
• Organisations need a solution that leverages real-time data analytics to keep pace with automated attacks and block malicious activity before it affects the business.
• Firms with robust defenses will see attack volumes decrease as criminals focus their attacks on easier targets.

Looking beyond the obvious bottom-line impacts of ATO attacks, it's important to remember that these crimes have a real human impact. Stopping fraud isn't only about saving money. It is just as critical for preventing the kind of human trauma that is surreptitiously corroding the fundamental fibers of a more ideal digital future. As in the physical world, what we want is safety, security, and trust.

### The Great Data Clean Up: SAP Master Data Management

The benefit from any IT system is dependent on the quality of the data that underpins it – and that primarily means the master data. Most organisations rely on a core set of business objects, such as customer and employee information, product lists, geographic locations, and purchase histories, to perform business activities. In SAP, maintaining master data quality has always been a challenge. In many cases, data is entered and updated manually, which is time-consuming and error-prone. Often it resides in multiple silos, which can lead to duplication and inconsistency. The majority of data management takes place manually and periodically, with data being cleansed and updated only before major system upgrades or migrations. But with each generation, SAP adds more and more functionality and pulls in more and more data. Businesses are making greater use of SAP as a data hub, too. All of which means that the already-difficult task of managing master data (MDM) manually is becoming impossible.

Data quality therefore needs to become an everyday activity – an integral part of 'business as usual', argues Dan Barton, COO and co-founder of Bluestonex. Here he outlines how organisations can deploy advanced policy-based automation and role-aware workflows to implement continuous processes that solve the dirty data dilemma and keep master data management up to date.

Drifting Data

Master data may be vital to a business - foundational, even - but it has a tendency to go awry. Addresses and prices change, products may become discontinued, and of course purchase histories will evolve. These changes are typically event-driven and are often very repetitive. At the same time, our world is speeding up, becoming more digitised and information-dense. This puts ever more pressure on core systems to keep up with the pace and depth of change. SAP customers have felt this pressure more than many, due to the complexity of SAP environments and master data models. S/4HANA can be extremely resource-intensive for users migrating to modern applications, further reducing MDM's relevance. With the release of BTP, SAP's Business Technology Platform, the company responded to this by developing tools to help analyse and optimise how systems perform. BTP has put process automation and optimisation firmly on the agenda; however, it is a toolkit, not a complete solution, and can be too complex and expensive for some SAP users, especially those with small or medium-sized systems.
A major challenge facing SAP users is the quality, consistency, and integrity of their data, which is hindering their business progress - and master data management does not help with this.

Automation and Process Mapping

Increasing data volumes and growing demands on MDM mean businesses need more efficient, streamlined processes. For data sharing and maintaining consistency, it is necessary to centralise data in a single repository, simplify data models, and integrate MDM with other SAP and external systems to streamline the process. Instead of accepting that core data objects inevitably drift over time, streamlining and automating processes can help businesses learn why they drift and ultimately fix that. In addition, the ownership of data quality and MDM needs to be put in the hands of the business user – not IT. Of course, IT teams do not normally want to get involved in business processes, and conversely, it is inadvisable for users to be manually editing master data. This is where process automation comes in, allowing users to see only what is relevant to their role and to the task at hand. Additionally, advanced policy-based automation makes it possible to broaden participation in MDM safely - and users may not even be aware that they are engaged in MDM activities.

Achieving seamless MDM

It is important to note, firstly, that MDM does not have to be a 'big bang' implementation; many organisations may – and can – choose a more digestible, play-by-play approach. Whichever methodology a business embraces, though, the key steps to success are the same. It is imperative to start with understanding 'business as usual' and derive from that the workflows and rules as they apply to master data. Then look for master data owners within the business – as mentioned, IT should be the enabler, not the owner of MDM. Business users and prospects need to work together to identify, standardise and automate role-based workflows that integrate a continuous MDM process into day-to-day operations. The automated workflow is how we move MDM out of IT and onto the responsible business people. In particular, input and feedback from the users will be essential as an organisation goes through the process of identifying the areas and/or workflows to automate.

Finally, it is important to consider IT's new role. IT still has both visibility and control, but as an MDM orchestrator: it owns the rules, not the data, and it enables the processes rather than executing them. Governance and data quality will derive from automated MDM as a side benefit. Properly programmed rules engines and workflows ensure, by their very nature, compliance with governance, good quality, clear responsibility, and full tracking and auditing information.

Conclusion

Whether a business is an SAP user or not - but especially if it is - master data is crucial. In almost every organisation, a consistent, accurate, and properly-governed foundation of customer, supplier, product, and other data is essential. Increasingly, it is also a legal and regulatory requirement. Master data must therefore be managed - not just cleaned or re-baselined on a periodic or ad-hoc basis, as it was in the past. Rather, organisations must tackle the root issues that bring in inconsistencies and errors. That means getting rid of data silos and utilising process automation and the power of AI to seamlessly build routine, manual MDM tasks, coupled with robust configurable business rules, into workflows. A minimal illustration of the kind of configurable rule such a workflow might apply is sketched below.
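As a very reduced illustration of what policy-based validation inside such a workflow might look like, the sketch below applies a set of configurable rules to a single master data record and reports any failures so they can be routed back to the data owner. The field names, rule set and routing are hypothetical rather than part of any specific SAP or MDM product.

```python
# Minimal sketch: configurable validation rules applied to a master data record
# before it is accepted. Field names and rules are illustrative only, not a
# real SAP or MDM product API.
import re

RULES = {
    "customer_id": lambda v: bool(re.fullmatch(r"C\d{6}", v or "")),
    "country":     lambda v: v in {"GB", "DE", "FR", "US"},
    "postcode":    lambda v: bool(v and v.strip()),
    "email":       lambda v: bool(re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", v or "")),
}

def validate(record: dict) -> list:
    """Return the names of the fields that fail their configured rule."""
    return [field for field, rule in RULES.items() if not rule(record.get(field))]

if __name__ == "__main__":
    record = {"customer_id": "C123456", "country": "GB", "postcode": "", "email": "ops@example.com"}
    problems = validate(record)
    if problems:
        # In a real workflow this would become a task for the record's business owner.
        print("Record needs attention:", ", ".join(problems))
    else:
        print("Record passes all configured rules")
```

In a live deployment the rules would be maintained as configuration by the business rather than hard-coded, and the failure message would become a workflow task assigned to the responsible data owner rather than a console print.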
That way, master data should always be clean, clear and on track, without overburdening business users. Fortunately, with advanced policy-based automation and role-aware workflows, MDM platforms can widen participation in the process of continuously keeping master data updated. With the right approach, IT can pass on responsibility for the master data to the business, while retaining control of the guardrails. ### Compare the Cloud - Interview - Josh Hough Meet Josh Hough, the Managing Director of CareLineLive, in this inspiring video where he shares the journey of creating a platform that's transforming home care management. Discover the story behind CareLineLive, a comprehensive solution designed to make home care delivery more efficient, transparent, and compassionate. Josh talks about the motivation behind his mission, the challenges faced, and how CareLineLive is enhancing the quality of care for those in need. Whether you're involved in healthcare, interested in technology's impact on well-being, or looking for inspiration from innovative entrepreneurs, this video is a must-watch. Dive in to see how CareLineLive is making a difference in the lives of caregivers and clients alike. ### Compare the Cloud - Interview - Peter Mildon Dive deep into the world of smart cities and machine learning with Peter Mildon, the esteemed co-founder and Chief Operating Officer of Viva City, a pioneering firm at the forefront of urban technological innovation. In this comprehensive and enlightening video, Peter shares his vast knowledge and experience, shedding light on how Viva City is leading the charge in transforming urban spaces into intelligent, efficient, and sustainable habitats. ### Why Throwing Money at Ransomware Won't Make It Disappear Organisations need to think differently about cybersecurity if they want to mitigate risk and recover quickly from an attack. There are about 1.3 million ransomware attacks a day worldwide, according to recent research, and organisations average over 270 days to identify and recover from a ransomware incident. That’s the scale of the cybersecurity problem and operational impact that organisations now face. It’s not just about being hit. That’s bad enough, but it’s also about how long the organisation takes to identify the problem, discover the root cause, recover fully and ensure that a  breach won’t happen again. Unfortunately, it’s a common story. You only have to look at the big-name organisations that spend millions on cyber security – major banks, Government departments, retailers, law firms, universities, major airlines and so on – that have been attacked over the past few years to realise that this is not just about throwing money at the problem. It doesn’t matter how high you build the wall; the cybercriminals get bigger and better ladders. Or they create a Trojan Horse in the form of phishing and walk right through your technical controls. Recent research from Stanford University claims that around 88% of all data breaches are caused by an employee clicking on a link in an email or downloading an attachment. Organisations invest heavily in cybersecurity awareness training, while at the same time migrating to cloud workflows that rely heavily on users clicking links embedded in emails!  In fact, some of the worst security operations centres (SOCs) are those with the most people and the most products, that haven’t been properly operationalised to reduce the likelihood, and especially the impact, of an attack. 
It begs the question: is there something fundamentally wrong with the way in which organisations buy security products and structure their operations? As a World Economic Forum (WEF) and Accenture report, Global Cyber Security Outlook 2023, suggests, the threats are getting worse, with 86% of business leaders and 93% of cyber leaders saying that global geopolitical instability is likely to lead to "a catastrophic cyber event in the next two years."

The traditional, transactional model of buying additional products year-on-year rather than focusing on operations that truly move the cyber risk needle leads to more alert fatigue, more infrastructure to manage, more user friction, less agility and more attack surface. To combat this we need to move from a cybersecurity approach to a cyber resilience approach. That means rethinking expectations on whether or not the organisation will suffer an attack. Organisations need to accept it is a probability, not a possibility, and with this will come a different set of priorities. The focus is then on response and recovery to minimise impact. How do organisations get to a Recovery Time Objective (RTO) of zero, down from the 270-day-plus average? How can organisations stop and then root out the cause of the breach?

Backups are absolutely key when the systems you need to investigate for root cause are encrypted or wiped. If digital forensics capabilities begin from the start, at incident response, we can essentially create a 'clean room' in which we take a more surgical approach to recovery – in effect, identifying, isolating and investigating those compromised systems in a safe environment, giving SOC analysts the superpower of time travel across the entire incident timeline. Modern data management platforms support near-instant instantiation of these point-in-time snapshots and orchestration via APIs, which allow Security Orchestration and Automated Response platforms to manage complex response and recovery security operations workflows (a simple sketch of selecting a clean restore point from a snapshot catalogue appears below). Some data management solutions have even baked security operations capabilities – classifying data, hunting for indicators of compromise and identifying vulnerabilities – into the data management platform itself.

But this also raises a business continuity question: how will any organisation know what will be online and offline after an attack? It's no good just backing up files. Organisations have to think about how to get communications, security systems and identity and access management systems recovered first. The temptation of course is just to press 'backup' and restore the whole system, but there is a danger here that this will just lead to another attack. Unless the incident is fully investigated, the organisation will not know whether or not the backup is compromised. In former roles, I have personally been involved in breach response in multiple organisations that have made the business decision to recover without fully understanding the nature of the incident and how it happened, closing the vulnerabilities, or removing the artefacts. In every incident where this was the approach, systems were quickly attacked again, further delaying the recovery time of critical services. The reality is that, unlike traditional business continuity incidents, the Recovery Time Objective in a cyber attack isn't met at the first recovery.
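To make the 'clean room' idea slightly more concrete, here is a minimal sketch of the selection step: given a catalogue of point-in-time snapshots and an estimated compromise time, it lists the restore points worth mounting in an isolated environment for investigation. The snapshot catalogue and timestamps are invented for illustration, and the isolation itself would be handled by whatever data management or orchestration platform is in use.

```python
# Minimal sketch: given a catalogue of point-in-time snapshots and an estimated
# compromise time, pick candidate restore points for an isolated "clean room".
# The snapshot data and the isolation step are hypothetical, not a vendor API.
from datetime import datetime

SNAPSHOTS = [
    {"id": "snap-001", "taken_at": datetime(2023, 9, 1, 2, 0)},
    {"id": "snap-002", "taken_at": datetime(2023, 9, 2, 2, 0)},
    {"id": "snap-003", "taken_at": datetime(2023, 9, 3, 2, 0)},
]

ESTIMATED_COMPROMISE = datetime(2023, 9, 2, 14, 30)

def clean_restore_candidates(snapshots, compromise_time):
    """Snapshots taken before the estimated compromise, newest first."""
    candidates = [s for s in snapshots if s["taken_at"] < compromise_time]
    return sorted(candidates, key=lambda s: s["taken_at"], reverse=True)

if __name__ == "__main__":
    for snap in clean_restore_candidates(SNAPSHOTS, ESTIMATED_COMPROMISE):
        # In practice an orchestration platform would mount this snapshot in an
        # isolated network segment for forensic inspection before any restore.
        print(f"Candidate restore point: {snap['id']} ({snap['taken_at']:%Y-%m-%d %H:%M})")
```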
Organisations have to start thinking about their backup strategies, how the backup is needed for the security operations response process following a ransomware attack, and how they can use the cloud to isolate systems, create backup clean rooms and use workflow data automation to enable faster recovery. As more organisations cater for remote working and therefore data flowing outside of the perimeter, this becomes an increasing necessity.

### How Does Investing in Information Technology Benefit the Manufacturing Industry?

As the manufacturing landscape evolves, so do its needs. Today, manufacturers require more than the conventional in a data-driven environment and are embracing information technology for its benefits. IT is transforming factories into hubs of innovation, where machines are collaborators in efficiency and excellence. Imagine machines that think, communicate, and even make decisions. This is the magic of Information Technology (IT) in the manufacturing sector, where innovation sparks a revolution that transforms factories into hubs of efficiency, productivity, and innovation.

In this world, factories have become smart, almost sentient. Machines become interconnected entities that converse through data. They collect real-time information, analyse it with the wisdom of algorithms, and make decisions that optimise operations—all without a human whisper. The benefits are immense. Efficiency soars, downtime vanishes, and quality control becomes an art of precision. There is a new era of adaptability in manufacturing. Production lines become flexible, bending to the winds of market demand or supply chain disruptions. They optimise themselves, responding to customer needs and changing landscapes with finesse. This article looks at how managed service providers benefit the manufacturing industry.

Operational Efficiency

IT systems have transformed many industries, and manufacturing stands at the forefront of this technological revolution. Picture a factory floor where machines operate in perfect sync, downtime is reduced, and efficiency is high. This is the magic of IT in action. Manufacturers use the power of informed decision-making through real-time data collection and analysis. They improve operations, discovering inefficiencies and bottlenecks and reshaping them into streamlined processes. The Internet of Things (IoT) helps communication flow seamlessly through a networked system. When an issue arises—a glitch, a hiccup—it can be nipped in the bud before it becomes a full-blown nightmare. And then there's the predictive analytics that whispers of impending troubles, allowing manufacturers to start proactive maintenance. Instead of scrambling to repair a broken system, they embrace prevention, saving time, money, and resources.

Machine learning algorithms help decipher patterns and adjust the production choreography. These algorithms fine-tune machinery, enhancing efficiency without human intervention. Information technology knits teams together, breaking down walls with cloud-based communication platforms. Departments collaborate seamlessly, sharing real-time updates and making decisions at the speed of thought. Therefore, the best managed IT services for manufacturing enable optimisation and profitability.

Inventory Management

Advanced IT systems use the power of predictive analytics, machine learning, and automated data processing to unlock demand forecasting. This grants the manufacturer a crystal ball crafted from historical sales trends, market whispers, and seasonal melodies. A minimal sketch of such a forecast appears below.
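As a toy illustration of the idea, the sketch below forecasts next month's production volume from recent sales history with simple seasonal and promotional uplifts. All figures and factors are invented; a real system would use far richer models and data.

```python
# Minimal sketch: forecast next month's demand from historical sales with a
# simple seasonal uplift. The figures and factors are illustrative only.
MONTHLY_SALES = [12000, 11800, 12500, 13100, 12900, 13400]   # units sold, last six months
SEASONAL_UPLIFT = 1.10        # e.g. expected holiday-season effect
PROMOTION_UPLIFT = 1.05       # e.g. planned marketing campaign

def forecast_units(history, window=3):
    """Average the most recent months, then apply the known uplifts."""
    baseline = sum(history[-window:]) / window
    return baseline * SEASONAL_UPLIFT * PROMOTION_UPLIFT

if __name__ == "__main__":
    units = forecast_units(MONTHLY_SALES)
    print(f"Planned production for next month: {round(units)} units")
```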
With algorithms and data, these systems forecast future demand with uncanny accuracy. The result is a production schedule tailored to meet expectations and perfectly balance supply and demand. IT can help stop overproduction, which wastes precious resources and revenue. These IT conjurers ensure that production aligns with reality, avoiding excesses that drain capital and enthusiasm. And through real-time inventory tracking, manufacturers keep a vigilant eye on every step of the journey, from production to distribution to retail shelves. The systems result in a just-in-time supply that balances demand, eliminating the excess. For example, a smartphone manufacturer can determine future demand. The system calculates the required units with predictive analytics, factoring in sales history, market trends, and promotions. Real-time inventory tracking ensures that the shelves remain stocked, obliterating excess and wastage.

Supply Chain Management

With data and connectivity, IT solutions transform how companies manage their operations from start to finish. When data connects suppliers, partners, and manufacturers in seamless transparency, real-time data sharing becomes the heartbeat of the supply chain. It ensures that information moves effortlessly from sourcing raw materials to delivering finished products. With this accurate and up-to-date data, manufacturers become informed decision-makers, crafting a narrative of efficiency and cost reduction. IT systems also help with responsiveness. Fluctuations in demand are met with agility, and suppliers adjust their inventory levels and deliveries in perfect sync. No more delays or missteps. The supply chain adapts in real time, ensuring the production process remains harmonious. Within manufacturing, enterprise resource planning (ERP) systems help with integration. Departments speak a common language, their voices harmonising to create a collaboration. Information flows effortlessly from procurement to logistics, from production to distribution. Everyone is on the same page, armed with real-time data on inventory, forecasts, and schedules.

Regulatory Compliance

In manufacturing, regulations are essential to safety and sustainability. IT systems can ensure manufacturers stand strong against compliance challenges and emerge victorious in responsible production. Imagine every production process meticulously monitored and seamlessly integrated with regulatory requirements. With software solutions that track the production rhythm, companies can meet the guidelines set by health and safety agencies and environmental protectors. But it's not just about avoiding penalties. These IT champions empower manufacturers to be proactive, nipping potential issues in the bud before they can wreak havoc. They maintain a watchful eye on quality control, manage inventory, and unravel the production processes.

Quality Control

Imagine machines possessing an uncanny sense of perception that helps with quality control. IT tools like sensors and data analytics enable real-time monitoring and precision. On a factory floor humming with activity, sensors monitor every nuance—temperature, humidity, pressure, and even the vibrations in the air. Besides collecting data, they craft a story of production. If something strays from the norm, they sound the alarm—a deviation, an anomaly, or potential trouble. Data analytics decipher the warnings based on the patterns in the data, such as revealing an outlier in the temperature or a hiccup in the vibrations. A minimal sketch of this kind of outlier check appears below.
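The sketch below shows one of the simplest forms this can take: flagging readings whose z-score strays well outside the recent norm. The readings and threshold are illustrative only; production systems would use properly tuned statistical or machine learning models over streaming data.

```python
# Minimal sketch: flag sensor readings that stray far from the recent norm.
# Readings and the threshold are illustrative, not real production data.
import statistics

TEMPERATURE_C = [68.2, 68.5, 68.1, 68.4, 68.3, 74.9, 68.2, 68.6]
Z_THRESHOLD = 2.5

def anomalies(readings, threshold=Z_THRESHOLD):
    """Return (index, value) pairs whose z-score exceeds the threshold."""
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    if stdev == 0:
        return []
    return [(i, v) for i, v in enumerate(readings) if abs(v - mean) / stdev > threshold]

if __name__ == "__main__":
    for index, value in anomalies(TEMPERATURE_C):
        print(f"Reading {index} looks anomalous: {value}°C")
```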
These anomalies might hold the key to something bigger that could impact the quality of the final product. This age of IT has benefited manufacturing in many ways. Investing in information technology is a strategic move that not only trims expenses but also propels operations towards efficiency and innovation. It creates a future where downtime becomes a rarity. Machines anticipate their hiccups and problems, allowing manufacturers to intervene before a single cog falters.

### “Must-have” Cloud Tech To Help Gig Workers Excel

The gig economy, characterised by temporary and part-time roles filled by independent contractors and freelancers rather than traditional full-time employees, has become a prominent fixture in the UK labour market. Its influence spans various sectors, including legal services, retail and education. Surprisingly, recent findings from a CIPD report reveal that the gig economy consists of less than half a million workers in the UK, with only a fifth of them relying on it as their primary source of income. Despite the relatively smaller size of this workforce, as living expenses continue to rise, gig workers find themselves facing growing financial challenges. A recent study conducted by the University of Bristol revealed a startling fact: a whopping 52% of gig economy workers in the UK earn less than the minimum wage, despite their contributions to various industries.

In this shifting terrain, gig workers find themselves grappling with tools that no longer adequately serve their evolving needs. Traditional tools, like spreadsheets and manual task management systems, fall short in helping them efficiently manage their gigs, schedules, and finances. To thrive in this demanding landscape, gig workers must leverage the capabilities of cloud-based technology, seamlessly incorporating it into their arsenal to enhance productivity and overall effectiveness. But what does the future hold in the realm of cloud technology, and how can it significantly enhance the efficiency and success of gig economy workers?

Charting a Course in the Gig Economy

The gig economy offers flexibility and independence, but it also comes with its fair share of challenges. Gig and self-employed workers often face instability in income, limited access to traditional employee benefits, and the responsibility of managing their business independently. Financial management is one of the biggest challenges for gig workers. With irregular income streams, tracking earnings, expenses and taxes can be daunting. However, cloud-based accounting software has become a lifeline for gig workers. These tools allow them to automate financial tasks, easily generate invoices, and keep tabs on their financial health. This not only saves time but also ensures accurate financial records.

Cloud-based communication systems are a game-changer for the gig economy, providing seamless connectivity and collaboration tools that empower freelancers and remote workers. These platforms enable instant communication, file sharing, and project management from anywhere, fostering productivity and flexibility in the gig workforce. With the ability to stay connected, gig workers can serve clients and collaborate with teams across the globe, irrespective of their physical location, enhancing their overall efficiency and marketability.

Cloud-Based 'Must-Haves' for Gig Workers

As freelancers venture into the gig economy, they often find themselves juggling multiple roles, from CEO to marketing manager to accountant.
To succeed, they need a reliable toolkit of cloud-based essentials:

Cloud-Based Productivity Suites - Platforms like Microsoft 365 and Google Workspace offer a suite of essential tools, including email, document editing, and cloud storage. Gig workers can manage their professional communication, collaborate on documents, and store important files securely in the cloud.

Website Builders - Establishing an online presence is crucial for gig workers. Website builders like WordPress and Wix provide user-friendly templates and hosting solutions, enabling them to create and maintain professional websites without technical expertise.

Customer Relationship Management (CRM) Systems - CRM systems like HubSpot or Salesforce help gig workers manage client relationships, track leads, and automate marketing efforts. These cloud-based tools enable gig workers to provide personalised services and nurture customer loyalty.

Cloud-Based Phone Systems - A cloud-based phone system offers gig workers the advantage of maintaining a professional communication setup without requiring dedicated hardware or complicated installations. For gig workers frequently on the move, establishing a local or national presence becomes crucial. Systems like CircleLoop allow individuals to easily select a landline number and manage it through their mobile device, ensuring seamless connectivity.

Advantages of a Dependable and Professional Technology Stack

Freelancers, by nature, thrive on their ability to provide specialised skills and services. A reliable and professional tech stack enhances their competitive edge and offers several benefits:

Credibility - A professional online presence, complete with a polished website and branded email addresses, lends credibility to freelancers. Clients are more likely to trust and hire freelancers who appear established and trustworthy.

Efficiency - Freelancers can streamline their workflows and meet deadlines more effectively with cloud-based project management tools. These tools help freelancers prioritise tasks, track project progress, and communicate with clients efficiently.

Scalability - Cloud-based solutions, including communication systems, can easily scale to accommodate freelancers' growing client bases. Whether it's expanding storage space, adding team members, or automating processes, freelancers can adapt their tech stack to match their business growth.

Mobility - Freelancers often enjoy the freedom to work from anywhere. Cloud-based communication systems enable them to access their work tools and data from any device with an internet connection, facilitating a truly mobile work lifestyle.

Shaping the Gig Economy’s Tomorrow

The gig economy is evolving rapidly, with phone systems emerging as a hidden gem for freelancers. These tools, often underestimated, bolster professionalism and appeal to potential clients. Integrating phone systems provides gig workers with a valuable edge, conveying reliability and professionalism. This often-overlooked aspect is crucial for success in the gig economy. In addition to phone systems, embracing cloud-based technology offers gig workers numerous advantages. They can scale swiftly, expand their client base, and enjoy the freedom of working from anywhere. Moreover, cloud technology streamlines their work, automating tasks and boosting efficiency. This efficiency allows gig workers to take on more projects and increase their income.
As cloud-based technology continues its evolution, it is poised to assume an even more central role in shaping the future of work for these dynamic and resilient individuals.

### CIF Presents TWF - David Tebbutt

In this fourth episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar gives his viewpoint on Elon Musk's rebranding of Twitter as X.com. He is also joined in the studio by special guest David Tebbutt, who he describes as the best technology writer and editor that he knows. David relaunched Personal Computer World back in 1979 and so will share some of his unique perspective on the tech scene. David has just launched The Messagery - a company that helps organisations with the clarity of their strategic messaging. That's something that a lot of tech companies in our sector need to get better at.

### 10 cloud opportunities most organisations are still overlooking

Businesses are constantly seeking ways to remain competitive in the fast-paced digital landscape of today. In this pursuit, many have embraced cloud computing as a means to enhance their operations and scalability. However, amid the ongoing cloud revolution, there are still numerous untapped opportunities that many organisations have overlooked, or aren't using to their full potential. These opportunities extend far beyond conventional cloud applications and can significantly impact business growth and resilience. In this article, Lee Thatcher, head of cloud and innovation at Cloud CoCo, explores 10 innovative cloud solutions that businesses – from SME to enterprise level – can leverage to gain a competitive edge, improve efficiency, and position themselves for long-term success in today's digital business landscape.

Cloud-based AI and machine learning

Artificial intelligence (AI) and machine learning (ML) have the potential to transform business operations. Cloud-based AI and ML services – such as those offered by Amazon Web Services (AWS), Google Cloud, and Microsoft Azure – allow organisations to harness the power of AI without the need for significant upfront investments. However, it's important to note that due to differences in company size and available resources, enterprises are more inclined to adopt AI compared to SMEs.
Services like Cisco Umbrella and Palo Alto Networks' Prisma Cloud provide SMEs with advanced threat protection, identity management, and security analytics in the cloud at an affordable price, meaning security doesn’t have to be an expensive and unattainable afterthought. Serverless computing Serverless computing is an innovative cloud opportunity that allows organisations to focus on their code rather than managing servers. Platforms like AWS Lambda and Azure Functions offer serverless computing capabilities, enabling businesses to run applications without the need to provision or manage servers. This not only reduces operational overhead but also allows for more flexible and cost-effective scaling. Edge computing Edge computing is gaining prominence as a cloud opportunity that brings computing closer to the data source, reducing latency and improving real-time processing. SME to enterprise-level businesses can benefit from edge computing by deploying edge devices and leveraging cloud platforms like AWS IoT Greengrass or Azure IoT Edge. This is particularly valuable for applications like autonomous vehicles, industrial automation, and remote monitoring. Hybrid and multi-cloud strategies Many organisations have adopted a hybrid or multi-cloud strategy without fully realising its potential. These strategies offer flexibility and resilience by spreading workloads across multiple cloud providers or combining on-premises infrastructure with the cloud. But also, public cloud environments, which are often part of these strategies, may not inherently offer the same level of security as private infrastructure. Consequently, organisations must remain vigilant and proactive in safeguarding their data and assets. Implementing robust security measures and monitoring protocols becomes imperative to mitigate potential risks and vulnerabilities while still reaping the benefits of optimised cost, performance, and disaster recovery through strategic hybrid and multi-cloud solutions. Disaster recovery and backups Disaster recovery and cloud backups are often underestimated but are crucial for business resilience. In today's digital era, data is a priceless asset, and ensuring its safety and availability is paramount. Cloud-based disaster recovery solutions offered by providers like Amazon Web Services (AWS), Google Cloud, and Microsoft Azure allow organisations to create robust backup and recovery strategies without the need for extensive on-premises infrastructure. These solutions enable businesses to replicate their data and systems to secure cloud environments, providing a fail-safe mechanism in case of unexpected disruptions. Whether it's a natural disaster, a cyberattack, or simply human error, having cloud-based backups ensures that critical data and applications can be swiftly restored, minimising downtime and potential data loss. Cloud-based collaboration and communication tools Effective communication and collaboration are vital for SMEs' success. Cloud-based collaboration tools like Microsoft Teams, Google Workspace, and Slack offer SMEs a seamless way to connect their teams, irrespective of location. These tools include features such as video conferencing, file sharing, and project management, enhancing productivity and collaboration. Data analytics and business intelligence Data is a valuable asset for all businesses, and cloud-based data analytics and business intelligence tools can help unlock its potential. 
Services like AWS QuickSight, Google Data Studio, and Microsoft Power BI allow organisations to visualise and analyse their data, enabling them to make informed decisions, identify trends, and drive business growth. Cloud-based financial management Managing finances efficiently is crucial when budgets are tight. Cloud-based financial management software like QuickBooks Online and Xero offer SMEs a centralised platform for accounting, invoicing, payroll, and financial reporting. These tools not only streamline financial processes but also provide real-time insights into the financial health of the business. Unlock untapped potential In the fast-evolving digital landscape, businesses cannot afford to overlook the myriad cloud opportunities available to them. From harnessing the power of AI and IoT to strengthening cybersecurity and embracing serverless and edge computing, organisations have a wealth of cloud solutions at their disposal. Additionally, hybrid and multi-cloud strategies, collaboration tools, data analytics, and cloud-based financial management can further propel their growth and competitiveness. To remain agile and competitive, organisations of all sizes should continually explore these innovative cloud solutions and adapt them to their specific needs. By doing so, they can position themselves for long-term success in the digital age, ensuring they stay ahead of the curve and meet the evolving demands of their customers and markets. ### The Connectivity Backbone: Telecoms in the AI era Since the pandemic, there has been a shift in the telecommunications industry. Connectivity in the home came under scrutiny as more people began to work from home, and a boom in unhappy customers brought the fragility of networks into the light. This was the start of digital transformation accelerating across industries, as businesses began to refocus on improving the customer experience. After years of telcos attempting to diversify into the world of media and entertainment, their minds have been refocused. Telcos are now looking to fulfil the promise of their original manifesto: providing robust connectivity to its customers. To do this, telcos are now investing more money and time into providing better services. Much of the investment is being directed towards artificial intelligence (AI). Like many other industries, telcos have woken up to the power of a technology that can not only help improve services but also support more human-centric roles, making day-to-day tasks easier too. With 6G coming down the line and consumer demand for seamless services greater than ever, providing high-performing connectivity is key. If AI is to be successful at helping achieve this mission, telcos need to ensure they have access to large volumes of high-quality data. "Why is my broadband so slow?" Telecoms networks are highly complex. They consist of multiple generations of technologies, and many different interconnected systems – which makes predicting and preventing network failures and downtime extremely challenging. Networks also have physical elements to contend with – such as the weather and the volume of traffic on the network. AI can show enormous value in tackling issues in both of these areas. In fact, an Accenture analysis estimates that AI has the potential to reduce network downtime by up to 50% for telecom companies. This is because AI systems can do things that humans simply can't, and in the blink of an eye. Take outages and network faults.
An AI can identify patterns of weather that can be layered with machine learning (ML) algorithms trained on past incidents. By analysing previous instances of adverse weather and the likelihood of them occurring again, these two technologies can recommend preventative measures engineers can take to avoid an outage. An example might be predicting the severity of high wind, with AI able to make a call on whether telco towers need more robust defences to avoid being taken offline. AI and ML take a lot of the guesswork and grunt work out of engineers' hands, enabling them to address problems before they become a major issue. When a major issue does occur, AI-powered decisions also reduce the mean time to repair drastically. AI can also help telcos predict surges in traffic and advise customers proactively. Systems can be trained to autonomously manage and optimise network workloads, enabling telcos to make informed decisions about what technologies should be used at times of high demand. There is a range of technologies available to manage demand – from 2G to 5G in wireless networks, and copper to fibre in wired networks. Most telcos will have these capabilities in use – all of which are useful for different solutions and enable telcos to be flexible. But only if they're smart about how to use them to provide network stability. For example, the pandemic put a huge strain on networks. Parents were working at home, whilst their children were streaming TV shows and playing games online. This stretched fibre networks and slowed speeds dramatically. But with AI powering decisions and identifying bottlenecks, telcos were able to come up with solutions to solve this problem and advise customers on how to get the best service. These remediations were often counter-intuitive, such as asking parents to stream TV on wireless networks instead of fibre, but they made a huge difference in keeping the network stable and customers happy. AI is only as strong as its data As AI continues to mature, other use cases will emerge. But organisations need to understand that AI is only as good as the data it learns from before letting it loose on the network. Models trained on only a subset of an organisation's data may miss crucial insights, or provide "hallucinated" responses. With the global AI market for telcos projected to grow from $1.2 billion in 2021 to almost $40 billion by 2030, AI solutions are clearly the future of the industry. So, it's essential that the technology is unbiased, fair, secure, and well-rounded, which relies on clean and accurate data. That means organisations must build AI use cases from robust groundwork, giving the AI access to a complete set of data. This will require a modern data architecture built around a unified data platform that enables AI to draw insights from data across the enterprise – from cloud environments to on-premise data centres. Strict governance must also be enforced always and everywhere, ensuring that compliance is met. Honing connectivity with AI Now that telcos have refocused on their core offerings, AI will play a crucial role as they look to deliver a better service. With 6G networks on the horizon, the landscape will only become more complex. It's vital that telcos prepare now and unify their data so services are unhindered. Without quality data powering AI models, there will be a risk of the AI failing and misunderstanding context – or even providing inaccurate recommendations that will tarnish brand reputation.
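To make the weather-and-incident approach described earlier a little more concrete, here is a deliberately simplified sketch. It is illustrative only - the features, figures and thresholds are invented for the example and are not drawn from any operator's real model - but it shows the basic shape of training on past incidents and scoring a new forecast:

```python
# Toy illustration of outage prediction from weather data (synthetic figures only).
from sklearn.ensemble import RandomForestClassifier
import numpy as np

# Historical observations per site: [wind_speed_kmh, rainfall_mm, temperature_c]
past_weather = np.array([
    [95, 30, 8],    # storm
    [20, 5, 15],    # calm day
    [110, 45, 6],   # severe storm
    [35, 10, 12],   # breezy
    [80, 25, 9],    # high wind
    [15, 0, 20],    # clear
])
# From past incident records: 1 = an outage followed, 0 = no outage
outage_occurred = np.array([1, 0, 1, 0, 1, 0])

model = RandomForestClassifier(n_estimators=100, random_state=0)
model.fit(past_weather, outage_occurred)

# Score tomorrow's forecast for a tower and suggest a preventative step
forecast = np.array([[100, 35, 7]])
risk = model.predict_proba(forecast)[0][1]
action = "schedule engineers / harden tower defences" if risk > 0.7 else "no action needed"
print(f"Predicted outage risk: {risk:.0%} -> {action}")
```

In practice a telco would feed in far richer inputs - live telemetry, traffic volumes, asset age - but the principle of learning from past incidents and acting before the weather hits is the same.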
To fully unleash AI's potential, the industry must prioritise curating diverse, unbiased data, coupled with thoughtful data practices to protect that data. AI has the potential to revolutionise the industry, but only if it's built on solid foundations. ### CIF Presents TWF - Steph MacLeod In this eighth episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will discuss how we humans have lived through waves of technology, from the agricultural revolution to the printing press, to the four industrial revolutions, and on to today's sophisticated information landscape, and how it's affecting democracy. He'll touch on issues around generative AI, climate change, and a post-truth world, culminating in a book recommendation that he believes is the most important book on the AI topic that will be published this year. Our interview guest is the newest member of the Cloud Industry Forum board. Steph MacLeod, co-owner and director of technology PR firm Kaizo, will explain why she's joined the CIF family, and how she can help refine and expand what we do for our members. ### CIF Presents TWF - John Hayden In this seventh episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will be commenting on the things that weren't covered by the news stories about Google's 25th birthday. Our guest is John Hayden of Ctrl O, talking about their product Linkspace. It's a secure, trusted business process management and productivity tool aimed at replacing spreadsheets used as databases, or as workaround applications. John will talk through how and why customers like the MoD and MoJ are using it. ### CIF Presents TWF - George Athannassov In this sixth episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will be commenting on the four Chinese tech companies that launched generative AI, Large Language Model chatbots last week - Baidu, SenseTime, Baichuan Intelligent Technology, and Zhipu AI. The buzz around the AI topic has been with us for 10 months, and it won't be dying down any time soon. Our interview is with George Athannassov, CTO of TAVA Discovery. The product is aimed at knowledge workers, giving them a one-stop cloud-based productivity tool to find, organise, store and share information, using AI and algorithms to summarise and show you what's most relevant. ### CIF Presents TWF - Jez Back In this fifth episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar comments on the PSNI data leak, the Oxford Internet Institute research on Facebook use and wellbeing, and controversy around Zoom's recent change of terms and how they are using your data (for machine learning and AI). David is joined in the studio by special guest Jez Back, the well-known expert on cloud economics and cloud cost control. They'll be explaining FinOps, the current state of the FinOps landscape, and how you should be approaching and managing your company's journey into the Cloud. ### CIF Presents TWF - Ian Jeffs In the third episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will complete his three-part review of the key tech stories of 2023 so far, including what he believes is a game changer. David will then be joined in the studio by special guest Ian Jeffs of Lenovo, the newly appointed Chair of the Cloud Industry Forum.
Ian will give us a little of his technology life story, talk about why he wanted to become Chairman of CIF, and explain what he'd like to contribute, his approach to making the Forum better, and the kind of things we should be focusing on. He'll also give his perspective on how the cloud topic has evolved over the last 15 years. ### CIF Presents TWF - Mark Osborn In the second episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar gives you the latest and greatest news on cloud computing. David is also joined in the studio by special guest Mark Osborn of IBM. Mark will tell us all about data, AI and his new team. We'll also discuss this year's ChatGPT/Generative AI explosion and we'll recognise that we are at a pivotal moment in tech. We'll talk about how insanely powerful generative AI tools are, and Mark will touch on use cases his team are working on. ### Is CNAPP the biggest security category in history? Cloud native security requires one integrated approach, and this means protecting the entire lifecycle - from code to cloud. This lifecycle consists of two main stages: building the applications, and then running them. This means securing the dev environments (dev security) that are used to build applications, as well as the workloads and infrastructure that run them in production (cloud security). To do this effectively, we must track exactly what's going on across both dev and cloud. For example, how many containers are running at any given point in time, and where are they located? What plugins are supporting the company's Jenkins build process? Is the company's cloud account configured properly? How exposed is the business to potential Log4j vulnerabilities? These questions and many more can all be answered with proper end-to-end visibility. However, effective security isn't just about what businesses can see in dev and cloud. It's also about having the ability to act quickly when detecting issues and suspicious or downright malicious incidents. Simply put, businesses need to be able to detect and stop - in real-time - security incidents in their cloud applications. The growing importance of cloud native application protection platforms (CNAPPs) CNAPP is projected to be one of the biggest security categories ever – a $25 to $30B market. This is because enterprises are continuing to move applications to the cloud while adopting cloud native practices, requiring new security measures. Traditional security tools were not designed for cloud native architectures and can only supply limited visibility and control. CNAPP is the opportunity for enterprises to connect the dots across the cloud application lifecycle and create more efficient and effective security. CNAPP offers a way to reduce complexity while improving security and the developer experience. Gartner recently defined CNAPPs as a unified and tightly integrated set of security and compliance capabilities designed to secure and protect cloud-native applications across development and production. It further explained that CNAPPs consolidate many previously siloed capabilities, including container security, cloud security posture management (CSPM), Kubernetes security posture management (KSPM), infrastructure-as-code scanning, and workload protection into a single platform. The benefits don't end there either.
Horizontal integration: Bringing dev and cloud together in one place It’s not good enough for modern businesses to only concern themselves with what’s happening in their cloud environment.  In recent years, organisations have increasingly adopted tools and solutions that are designed to offer more cloud visibility. As part of this, the market is embracing CSPM, which is great. However, this doesn’t solve the issue of software supply chain security. What about code and open source components? Organisations need the ability to scan their code and software supply chain all the way from one end to the other.  It’s critical to see the implications of dev decisions on the cloud. Connected to this, when a problem is identified there, businesses need the ability to connect it in real-time to a dev that can resolve the issue quickly. This is known as security from dev to cloud and back. Vertical integration: Stopping attacks in real-time As mentioned earlier, visibility across dev and cloud is good but it’s only the first step in the process. As cloud attacks continue to grow in terms of both volume and sophistication, security practitioners are starting to ask more of the right questions, such as ‘am I protected from bad things happening to my cloud applications’, and ‘can I detect and stop an attack in real-time if I need to.’ Cloud security must be unified, and organisations need to be able to see and stop attacks in real-time if they are to prevent serious damage being inflicted.  Cloud Security is becoming increasingly consolidated Given present market conditions and the growing realisation that cloud native security requires a much more integrated approach, it is widely expected that security tools used to protect cloud applications will quickly start to become more consolidated. In fact, Gartner expects customers to consolidate the number of tools they use to secure cloud applications from 10 to 3 in the space of just a few short years.  For this reason, customers are increasingly looking for fully end-to-end CNAPPs that can address all aspects of cloud native application security, across all stages of the application delivery lifecycle, in one platform. Here’s a simple example of how Aqua’s end-to-end CNAPP operates: If a vulnerability is detected in runtime, the CNAPP will display the exact line of code where it originated, pinpoint the developer who owns it, and make a suggested pull request to fix the problem. No time is wasted waiting for a snapshot to identify it tomorrow, instead the CNAPP identifies it immediately, connects the dots and suggests a remediation. It is amazing what you can do with a state-of-the-art sensor supporting agentless scanning. After all, remediation is better with the full application context. Not all CNAPPs are created equal While other vendors may claim to offer end-to-end CNAPP solutions, very few currently exist on the market. For example, traditional EDR vendors lack the software supply chain context, understanding of cloud infrastructure settings, and granularity for real-time response to attacks. Furthermore, while other cloud native security vendors may be able to see everything happening in a customer’s cloud, they can’t connect it to the code in their repo, making them powerless to address issues quickly. Others still are able to provide broad visibility of issues, but don’t provide the ability to respond and stop attacks. 
As more and more organisations migrate their environments to the cloud, the need for cloud native application security has never been greater. One of the most effective ways to address this is through the implementation of cloud native application protection platforms (CNAPPs), which are purpose built to secure and protect cloud-native applications across both development and production. However, as the need for a more integrated CNAPP approach grows, customers must take the time to identify solutions that offer them the best possible protection in a single, comprehensive package. Otherwise, they may well find themselves back in the market very soon. ### Is Cybersecurity Finally Becoming a Business Enabler? In the current dynamic business world, the importance of cybersecurity has transformed from a safety and compliance requirement to an indispensable contributor to business health and success. Today, numerous organisations have channelled significant resources into strengthening their cybersecurity framework to safeguard their digital assets and counteract the challenges posed by progressively sophisticated cyber attacks. This shift has driven a change in the perception of cybersecurity - transitioning from a solely precautionary measure to a vital instrument for business revenue protection and innovation. It is not just about stopping intrusions but about establishing a strong foundation where businesses can leverage the power of a secure digital environment. A Look at Today's Cybersecurity Landscape In the fast-changing digital landscape, cybersecurity presents both unique and demanding challenges. With businesses increasingly dependent on technology for their daily operations, the urgency for strong cybersecurity safeguards has never been greater. The present-day cybersecurity scenario is marked by a tangled network of shifting threats. Cybercriminals are becoming more specialised, utilising advanced tactics such as ransomware, phishing, and data theft to take advantage of weaknesses in corporate systems. These attacks are not limited to large enterprises - small and medium-sized businesses are also under relentless assault, making cybersecurity a concern for everyone. Simultaneously, the emergence of artificial intelligence and machine learning has introduced new facets to cybersecurity. These technologies are being employed to predict and pinpoint potential attack vectors, improving the capability for businesses to react quickly and efficiently. The Traditional Perception of Cybersecurity In the past, cybersecurity was often pushed to the background or seen as a necessary financial burden. It was perceived as a technical matter relevant only to IT departments and was frequently ignored until a security incident occurred. This attitude was predominantly due to a limited understanding of the critical role cybersecurity plays in guarding valuable digital assets and data. In this conventional perspective, cybersecurity was mainly concerned with implementing firewalls, antivirus, and other defensive measures to ward off malicious attackers. The approach was more reactive than proactive, centring on addressing attacks after they transpired rather than averting them from the outset. Cybersecurity was also frequently regarded as a financial drain rather than a strategic expenditure.
Companies would assign few resources to their cybersecurity initiatives, treating cybersecurity as a cost to be limited rather than a vital safeguard for their operations and revenue protection. This mindset also carried over to regulatory adherence. Numerous businesses treated compliance with cybersecurity regulations as a mere formality rather than an essential element of their comprehensive security plan. The Shift in Perspective Recently, however, there has been a noticeable evolution of this attitude. Cybersecurity is no longer regarded as a strictly technical matter but as a strategic business facilitator that can assist organisations in optimising their prospects and minimising hazards. Solutions like Privileged Access Management (PAM), identity and access management (IAM), and encryption are now staples in building a strong cybersecurity foundation. Organisations are now channelling resources into a holistic cybersecurity strategy that encompasses employing advanced technologies like machine learning and artificial intelligence, as well as formulating policies and procedures to guarantee compliance. Cybersecurity is no longer viewed as a financial burden but as a crucial asset that can aid organisations in safeguarding their digital assets and information. Companies also now acknowledge the importance of investing in cybersecurity training for their workforce. Organisations can enhance their security stance and diminish the likelihood of data breaches by equipping their employees with the necessary knowledge and skills to recognise potential threats and respond accordingly. In summary, cybersecurity is no longer regarded as a secondary concern. Instead, it is now perceived as a strategic expenditure enabling businesses to optimise their prospects and minimise dangers. By dedicating resources to comprehensive cybersecurity strategies, businesses can shield their digital assets and information, lessen the probability of data breaches, and ensure adherence to cybersecurity regulations. Steps Businesses Can Take to Turn Cybersecurity into an Enabler Cybersecurity is no longer solely about safeguarding - it's also a facilitator that can promote business development, productivity, and innovation. Here are some actions businesses can implement to utilise cybersecurity as a strategic resource: Harmonise Cybersecurity Strategies with Business Goals: Incorporating cybersecurity into your comprehensive business plan is the initial step. This entails coordinating your cybersecurity goals with your business objectives. For example, if your aim is to broaden your digital presence, your cybersecurity plan should concentrate on safeguarding your digital properties and guaranteeing the secure exchange of data across platforms. Utilise Security in Marketing as a Distinctive Selling Proposition: In the present digital era, customers are increasingly anxious about the security of their data. Businesses that employ robust cybersecurity measures - for example, further securing Remote Desktop Protocol (RDP) and Active Directory - can use this as a distinctive selling proposition in their marketing plans. Emphasising your dedication to data security can aid in establishing trust with your customers and distinguishing your brand in the marketplace. Concentrate on Regulatory Adherence: Compliance with cybersecurity regulations is not merely about evading fines - it's an affirmation of your dedication to data security. Businesses can exhibit their commitment to upholding high security standards by concentrating on regulatory adherence.
This can improve their reputation and cultivate trust among stakeholders. Incorporate Security Metrics into Business KPIs: Businesses should contemplate incorporating security metrics into their key performance indicators (KPIs). This will enable them to routinely assess their cybersecurity performance and make the required modifications to their plan. Monitoring metrics such as the quantity of identified threats, response durations, and the efficacy of security measures can offer valuable insights into your cybersecurity posture. By taking these steps, businesses can shift from viewing cybersecurity as a cost center that puts up roadblocks to seeing it as a strategic investment and business enabler. This approach can facilitate business growth, enhance brand reputation, and create a more secure digital environment. Let Your Cybersecurity Initiatives Help Your Business Grow Security is an essential part of any digital transformation initiative. By taking the right steps to enhance your cybersecurity posture, you can create a secure digital environment that encourages innovation and growth. ### Facing your fears and overcoming tech anxiety Business leaders are increasingly under pressure. They’re feeling the ever-growing need to adapt to changing market conditions, to adjust to changing regulations, and to understand and utilise powerful new technologies, all while facing continuing pressure to perform in the market. This has an effect on their decision-making, particularly when it comes to technology. It can cast doubt on whether investments are being made in the right places, what technologies should be part of the stack, and how to keep pace with competitors. With these doubts and pressures, it becomes difficult to account for the next quarter, let alone the next year. And with the lack of foresight comes the uncertainty, the fear of the unknown. What some have called ‘tech anxiety’ - the feelings senior leaders experience as the exponential pace of technology continuously challenges the way businesses operate. But when looking to conquer these fears and anxieties, understanding the process of digital transformation is the key to unlocking huge potential. Bringing clarity through understanding At its root, tech anxiety is the fear that resources are being misspent, that the business is travelling in the wrong direction, rapidly accumulating technical debt, and there isn’t a turn-off anywhere in the near future. Digital transformation looks different for each organisation. Recently, Kin + Carta surveyed over 800 business leaders in the UK and US on this topic for its “2024 Leadership Priorities in Tech” report. Three key themes emerged that are set to define digital transformation in 2024: Transformation of processes, practices, and culture The survey emphasised the need to leverage digital technologies for business transformation, with respondents covering key operational factors such as processes, culture and organisational structures. This entails embracing digital operations and automation and integrating technology across the organisation to drive continuous progress. It found that, for example, 45% of leaders relate digital transformation to cloud modernisation, while 43% of businesses say digital transformation centres on automating internal processes. Evaluating technology impact and adoption strategies Second was the need to assess the effects of emerging technologies, evaluate potential risks and opportunities, and align them with organisational objectives. 
This involves making informed decisions, setting adoption strategies and selecting appropriate technologies. It's a matter of balancing current needs with future innovation. AI and machine learning are good examples of emerging technologies that have the potential to cause major impacts and generate anxiety. Around a third (35%) of leaders who said AI and machine learning was a source of anxiety attributed this fear to the speed at which the technology is evolving. Deriving benefits and competitive advantage As expected given the current economic environment, participant organisations focused on the value they'd derive from digital transformation and new technologies. They highlighted the potential for improved business processes, enhanced customer experiences, and increased productivity and growth, leading to a competitive edge in the market. The study identified that nearly half (49%) of businesses use digital transformation to describe efforts to improve customer experience - focusing on tailored marketing, personalised experiences and customer data to create satisfaction, loyalty and business growth. The importance of investment Digital transformation requires significant resources and investment to be successful. 81% of business leaders surveyed believe investment in digital transformation to be either critical or necessary for business success. However, concerns about return on investment from digital initiatives remain common. The research further revealed that economic uncertainty has impacted the majority of organisations' short-term (84%) and long-term (81%) transformation roadmaps. Yet despite this uncertainty, leaders are showing resilience. Three-quarters (75%) believe investment is necessary within the next 12 months and the majority (58%) plan to spend more on digital transformation initiatives this year compared to the previous year. Overcoming your tech anxieties Despite their anxieties, leaders can still overcome obstacles and make changes that deliver real value for their organisations, colleagues, and customers. First, think bigger. Challenge your teams and yourself to drive differentiation. Connect end-user value propositions that drive customer lifetime value and lower acquisition cost (via increased NPS) with data-driven intelligent experiences. Next, focus on parallel priorities. Put high-priority investment areas first. AI and machine learning, cyber security and customer experience are all both high-anxiety areas and top investment priorities. Multiple simultaneous initiatives can ensure you succeed when unexpected challenges arise and that learnings are shared across programmes. Finally, plan for future needs. Leaders often avoid challenges that fall into the high-anxiety and low-investment space, such as sustainability strategy, customer retention and marketing performance, out of fear of the unknown. But often these become hot-button issues inside the C-suite, quickly rising up priority lists and catching leaders unprepared. Scoping targeted initiatives for these areas now will build perspectives and skills in anticipation of future investment and prioritisation.
The same pressures are also on the operational teams managing the facility and maintenance operations day-to-day. To help manage increasing operational costs and external economic pressures, many companies are gradually migrating their applications and workloads from on-premise to cloud, relocating servers from their office facilities to dedicated racks in co-location data centres or fully into the cloud using services such as Microsoft Azure or Amazon Web Services. On-Prem or Cloud – what's involved? On-premises and cloud-based CMMS solutions are generally the same applications or technologies but delivered in different ways. On-prem CMMS is the more familiar and traditional model, where a company purchases the hardware and software upfront and then installs and configures it on-site. Dedicated IT teams are then responsible for managing the whole solution throughout its lifecycle - installing patches, upgrades and integrations. Cloud-based CMMS solutions use a Software as a Service (SaaS) model as a fully managed resource, which makes them popular with smaller companies that don't have large budgets or the necessary internal expertise to set up and manage the technology in-house. But that doesn't mean cloud-based solutions are only suitable for small companies; many larger companies choose cloud software because it's easy to scale and an OPEX-only expense. Large organisations often have the resources in-house to manage on-prem software but still choose a cloud solution, preferring the scalability and OPEX-only expense of a cloud service. Which is right for you? To decide which approach is right for your company, it's helpful to compare the benefits of on-prem CMMS versus cloud CMMS software. But before you make any side-by-side comparison, you'll need to decide which features you'll need. Certain features can only be delivered using cloud-based technologies, so creating a checklist could prove key to your decision-making process. For example, we have seen with Microsoft Office 365 that sometimes it is better to deliver new and enhanced services only from the cloud - such as Permit To Work, Risk Assessments and BI Connectors. Ultimately you have to calculate the total cost of ownership of each solution, so here are 10 things to consider in your calculations: Security - Wrapping a secure infrastructure around your CMMS/CAFM is necessary to mitigate risks and ensure data privacy and alignment with GDPR. Looking closely at firewalls and anti-virus and keeping these elements patched and current will be a significant resource overhead for an on-premises system. In contrast, all of this can be built into a cloud SaaS model. The Data Centre Environment - Dedicating rack space, power, environmental aspects and resources to put a resilient and robust server infrastructure in place will mean looking at where your data centres should be located. You also need to consider the cost of internetworking between those sites and the resources needed to manage and monitor those environments. None of this applies to a cloud-based system, meaning you can scale your solution quickly and affordably should your business needs change. Operational Management - Operational management often impacts overheads the most. Your on-premises TCO should allow for the manual workload involved in patching, scaling, upgrading, monitoring, security auditing, backup and general housekeeping.
Although cloud-based CMMS/CAFM software will liberate you from many time-consuming operational tasks, you’ll still need to account for some lifecycle administration - for example, user account and permission management.  Software - In addition to your CMMS/CAFM application, you’ll also need the software to access the functionality of your solution. This may require separate licenses for each user, which incurs additional costs, plus you’ll need staff with technical expertise to deploy it.  Setup - Setting up on-premises servers is no easy feat; it requires significant people power to procure, configure and deploy servers to a production environment. In addition, the process can take days - possibly weeks if you include time waiting for components to arrive - adding significant overheads to operational team members. With a SaaS-based CMMS/CAFM, you can be up and running in a matter of minutes. Not only does this drastically reduce labour costs, but it also increases your business agility, helping to improve your bottom line through lower project costs and faster times to market.  Payment Models - Based on the CAPEX model, an on-premises server and related CMMS purchase can range from around a few thousand to many thousands of pounds/dollars, depending on the level of functionality and security you require. By contrast, the OPEX approach of a cloud-based service provides a more manageable way to finance your CMMS/CAFM needs. However, be aware that these payment models also require different approaches to calculating TCO in terms of the cost of capital versus ongoing commitments.   Application Integration - Connecting in-house systems with any CMMS/CAFM may prove time-consuming and challenging. A good vendor will provide simple APIs and clear documentation on integrating its product with common applications. A SaaS-based system offers out-of-box API interoperability with a wide range of services. However, you should still consider the integration or migration aspects of your environment with both your operations and IT teams. Compliance - Compliance is essential to meet key legislative requirements, especially in highly regulated sectors such as healthcare, food and drink manufacturing, pharmaceutical, utilities and local government. Ensuring on-prem solutions comply with privacy/security standards such as GDPR and ISO27001 can be a complex, lengthy undertaking requiring regulatory knowledge. However, most of this work is automatically covered when you use a cloud-based service, helping to reduce your TCO.  Service Levels - Most companies will look for high levels of availability to optimise production up-time from their architecture. In addition to the hardware costs of building a resilient and highly available on-premises solution, you’ll need to invest in people and tools to maintain service levels.  Technical Knowledge & Training - Managing an on-premises CMMS/CAFM requires considerable technical expertise, therefore it’s vital to include the cost of upskilling existing staff or potential recruitment in your TCO evaluation. A cloud-based solution requires far less technical know-how, as most of the work is outsourced to your vendor partner and as part of a SaaS service, you should be able to access priority support. The list could (and does) go on to include other costs such as CMMS/ CAFM licenses, client and operating system licences to integrate with your solution and any ongoing support licences.  
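To make that total-cost comparison more tangible, here is a minimal sketch of how the one-off and recurring figures from the considerations above might be tallied. The categories and numbers are illustrative placeholders rather than benchmarks - substitute your own quotes and staffing estimates:

```python
# Minimal sketch of a side-by-side TCO tally for an on-prem vs cloud CMMS/CAFM.
# All figures are illustrative placeholders, not real pricing.

def total_cost_of_ownership(one_off, annual_costs, years):
    """Combine one-off set-up costs with recurring annual costs over a given term."""
    return one_off + sum(annual_costs.values()) * years

on_prem_annual = {
    "software_support": 4_000,
    "security_patching_admin": 6_000,   # staff time for patching, backups, audits
    "data_centre_power_space": 3_000,
    "training_and_upskilling": 1_500,
}
cloud_annual = {
    "saas_subscription": 9_000,
    "user_admin": 1_000,                # residual lifecycle admin, e.g. permissions
}

on_prem_tco = total_cost_of_ownership(one_off=25_000, annual_costs=on_prem_annual, years=5)
cloud_tco = total_cost_of_ownership(one_off=2_000, annual_costs=cloud_annual, years=5)

print(f"5-year on-prem TCO: £{on_prem_tco:,}")
print(f"5-year cloud TCO:   £{cloud_tco:,}")
```

Run over the contract term you expect (five years in this sketch), this gives a single figure per option that you can compare side by side.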
Remember that with either option there will be one-off set-up costs - whether that’s physical hardware or software licences. Then you’ll need to factor in set-up work such as design and planning of the installation, configuration, testing, handover and staff training.  To complete your total cost of ownership calculation, you’ll need to combine the one-off and annual costs, plus administration and management costs into a single figure. 90% of companies are using the cloud As more companies carry out a TCO assessment, we see both small and large enterprises recognise the value of cloud-based asset maintenance solutions. Over 90% of companies are already using the cloud in some shape or form, and cloud computing is now the second most talked-about technology area in business. This has led to the emergence of a new type of CMMS solution based on the on-demand delivery model of the cloud. The ongoing cost of employing IT capabilities makes managed service options more attractive, as considerable skills and resources are required to implement and maintain on-prem solutions.  Many companies can also see significant benefits in cloud-based services, which are quick to set up and make it easy to manage projects or sandbox environments. Most importantly, companies using cloud software find their solutions return much higher value than those using on-premises solutions. ### Tips to Spot Patterns & Reduce Invalid Survey Responses When you're trying to extract meaningful information from survey responses, it can be challenging to know whether the email address provided by a respondent is valid. An incorrect email address can result in missing data, inaccurate information, or account security issues. To avoid these issues, you must pay special attention to the patterns present in email addresses. By understanding the structure of an email address, you can quickly spot any irregularities and take steps to ensure a valid response. What Are Fake Emails and Why Do They Matter? Fake emails, also known as disposable or temporary emails, are email addresses used temporarily for a specific purpose, such as accessing a discount code or signing up for a one-time service. They often look legitimate but are designed to be used only once before being discarded. Many people use fake emails to protect their real email addresses and avoid unwanted spam or marketing emails. However, fake emails can be a big problem when it comes to survey responses.  Since the email address is not real, it will be impossible to contact that person if they need clarifying information or a reminder to finish the survey. Also, fake emails can lead to inconsistencies in your data as respondents answer questions with different addresses each time. Using fake emails in surveys can lead to inaccurate data, as those survey responses may not represent genuine customer sentiment. Surveying the same fake email multiple times can also skew the results, making it difficult to draw meaningful insights from the data. This can hurt not only the business making decisions based on the survey results but also those customers who took the time to provide valuable feedback only to have their responses ignored due to the impact of fake emails. So, why do fake emails matter when managing survey responses? Simply put, they can negatively impact the accuracy and usefulness of the survey data. Inaccurate data can lead to poor decision-making and ultimately harm customer satisfaction and business growth. 
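As a starting point for spotting those irregularities, a basic structural check combined with a lookup against known disposable-email domains will catch many invalid addresses before they pollute your data. The sketch below is illustrative only - the regular expression is deliberately simple and the domain list is a tiny sample you would replace with a maintained source:

```python
# Illustrative sketch: flag survey responses whose email address is malformed or
# uses a known disposable-email domain.
import re

DISPOSABLE_DOMAINS = {"mailinator.com", "guerrillamail.com", "10minutemail.com"}
EMAIL_PATTERN = re.compile(r"^[A-Za-z0-9._%+-]+@[A-Za-z0-9.-]+\.[A-Za-z]{2,}$")

def looks_valid(email: str) -> bool:
    """Basic structural check plus a disposable-domain lookup."""
    if not EMAIL_PATTERN.match(email):
        return False                      # malformed address
    domain = email.rsplit("@", 1)[1].lower()
    return domain not in DISPOSABLE_DOMAINS

responses = ["jane.doe@example.co.uk", "throwaway@mailinator.com", "not-an-email"]
for address in responses:
    print(address, "->", "keep" if looks_valid(address) else "review/discard")
```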
Managing fake emails is essential to ensuring the accuracy of survey results. How to Prevent Fraudulent Activity with Your Surveys When managing survey responses, looking for patterns that may indicate a fake email address is essential. Here are some tips to help you spot fake emails: Use a CAPTCHA A CAPTCHA is an acronym for "Completely Automated Public Turing test to tell Computers and Humans Apart." It is a security test strategy to differentiate between human beings and automated programs or bots. CAPTCHAs present complex puzzles or problems that challenge bots while ensuring that human respondents can solve them easily. Using a CAPTCHA allows businesses to save time and resources in dealing with fraudulent activity. The CAPTCHA provides an excellent way to prevent bots from submitting online surveys multiple times, ensuring accurate survey results. Using a CAPTCHA also helps to ensure quality feedback from genuine and committed customers, reducing the number of low-quality responses. The most common types of CAPTCHAs are image-based, audio-based, and text-based. Image-based CAPTCHAs introduce a challenge presented in graphic formats, with humans needing to identify letters or numbers. Audio-based CAPTCHAs present a series of random sounds, and humans must identify spoken words from the mess of noise. Text-based CAPTCHAs present a combination of letters and numbers, which humans can decode. Use Random Questions Random questions are a powerful tool for survey research, as they safeguard against fraudulent responses. By adding random questions, respondents cannot anticipate what questions will come next or how to answer them. This way, fraudulent respondents can't provide pre-rehearsed answers, making it harder for them to manipulate survey data. To maximise the effectiveness of randomised questions, you should choose the right type of questions to add. Open-ended, situational, or vague questions are commonly used as random questions. Open-ended questions invite respondents to provide unique, unrehearsed answers, making it difficult for fraudulent actors to manipulate them. Vague questions like "How many cities have you visited?" or "What kind of vehicle do you drive?" do not provide specific answers and help further safeguard against fraudulent responses. Situational questions like, "What would you do if...?" help respondents consider a scenario and respond, making it difficult for fraudulent actors to provide pre-rehearsed answers. The timing of the random questions is critical to ensuring their effectiveness. The best approach is to spread random questions throughout the survey, which throws off fraudulent actors looking for a set pattern to manipulate survey data. Password Protect Your Survey One of the most effective ways to prevent fraudulent activity is by implementing a password on your survey. This ensures that only those given the correct password can access the survey. A password can be a combination of letters, numbers, and symbols that can be given to participants or companies wishing to participate in the survey. It is important to use a strong password that is difficult to guess. Using a reputable survey platform such as SurveyMonkey or Qualtrics can provide an added layer of security. These platforms have built-in security features that can help prevent fraudulent activity. Choosing a platform that offers encryption, SSL, and other security measures is vital to help protect your data. Limiting access to the survey is another way to prevent fraudulent activity.
By restricting access, you can ensure that only the intended audience can access the survey. This can be done by making the survey available only to selected participants or by setting up a login system that requires participants to verify their identity. Use a Consent Form Using a consent form can help prevent fraudulent activity by ensuring that participants know the purpose of the survey and the consequences of providing false information. For instance, if you're conducting a survey about work anniversary gifts, you can specify that in the consent form to ensure relevant responses. By signing a consent form, participants acknowledge that they understand the importance of honesty in survey responses and agree to provide accurate information. A consent form can be used as a legal document to protect your business from fraudulent claims. For example, suppose a participant provides false information and later tries to file a claim against your company. In that case, the consent form can act as evidence to support the truthfulness of your survey results. When creating your consent form, you should include a few key components. Firstly, you should have a section that details the purpose of the survey and what information you will be collecting from participants. This can increase transparency and encourage participants to provide accurate information. You should also include a section outlining the consequences of providing false information. This can be as simple as explaining that the survey results will be used to make critical business decisions. Include a signature section where participants can sign to indicate that they acknowledge the purpose of the survey, the importance of providing accurate information, and the consequences of giving false information. Prevent Multiple Responses From the Same IP Every device connected to the internet uses an IP address, which gives an approximate indication of its location. Survey platforms use this information to limit the number of responses from a specific IP address. Knowing this can help you understand how multiple responses from the same IP address can negatively impact the authenticity of your survey results. Most survey platforms have IP address validation features that prevent multiple responses from a single IP address. This feature can be easily enabled while creating your survey, and it will help avoid fraudulent activities. It will keep track of the IP addresses that have already responded, and it stops subsequent attempts from the same IP. Another way to prevent multiple responses from the same IP is to use cookies. Cookies are small pieces of data stored on the respondent's device that enable the survey software to recognise when someone has already responded to the survey. Some survey platforms allow you to configure cookies once the survey is created to prevent any fraudulent activities. Limiting survey access to specific email addresses or domains can also help prevent multiple responses from the same IP. This approach is suitable for surveys with a small sample size and a specific target audience. It lets you restrict access to the survey to individuals or companies that meet certain criteria. Monitoring your survey responses is essential, especially if you're gathering data on topics like bulk gifts for employees. It's good practice to track the IP addresses responding to your survey and analyse their pattern to ensure genuine feedback.
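As a simple illustration of that per-IP check, the sketch below shows the underlying idea: remember which IP addresses have already submitted and reject repeats. Real survey platforms handle this (and the cookie equivalent) for you; the example exists purely to show the logic:

```python
# Illustrative sketch of IP deduplication: reject a second submission from an IP
# address that has already responded.
seen_ips: set[str] = set()

def accept_response(ip_address: str) -> bool:
    """Return True the first time an IP submits, False for repeat submissions."""
    if ip_address in seen_ips:
        return False
    seen_ips.add(ip_address)
    return True

for ip in ["203.0.113.10", "198.51.100.7", "203.0.113.10"]:
    print(ip, "->", "accepted" if accept_response(ip) else "rejected (duplicate IP)")
```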
Any deviation from the established pattern increases the likelihood of fraudulent activity. Monitoring surveys also provides actionable insights into your target audience that can help shape your marketing message. Ensure The Data You Extract From Your Surveys Is Always Accurate By utilising a consent form, IP address validation features, cookies, and limiting survey access to specific email addresses or domains, you can ensure the authenticity of your survey results. Monitoring your survey responses is also essential as it will provide actionable insights into your target audience that can help shape your marketing message. With these tips in mind, you'll be able to create surveys with accurate data every time. ### How critical organisations can spot unseen threats much sooner The British public and press were shocked by a breach disclosure from the Electoral Commission. The security incident, which is thought to have impacted as many as 40 million UK voters, was described by the Commission as a "complex cyberattack", in which malicious actors leveraged a "sophisticated infiltration method, intended to evade our checks". And evade they did. When the news broke on Tuesday, it came two years after the hackers first accessed the Commission's systems, and only 10 months after security teams had identified the breach. This meant that hackers had access to the organisation's electoral registers, email and control systems for well over a year before anyone noticed. The image of the unseen threat lurking in your midst is one that keeps a majority of CISOs up at night, and over half of security leaders report that "unknown blind spots" are their biggest concern. Despite this, blind spots are all too common in complex organisations and continue to pose a risk to their most sensitive data and workflows. Critics have been quick to highlight that, given the Electoral Commission's central role in Britain's democratic processes, the near 15-month delay in incident response is nothing short of a nightmare scenario. As public sector organisations invest more into digitalisation to meet modern challenges, it remains paramount that the public can have full faith in public institutions to transform as securely as possible. Unfortunately, incidents like this unseen breach can sow real distrust, and public sector organisations should endeavour to do more. Protecting against a network of motivations With ransomware attacks often filling the headlines, it's important to remember that not all bad actors are after money. From kinetic hacks to attacks targeting intellectual property (IP), there are countless reasons why a bad actor would benefit from an extended time within your systems. The electoral registers stolen in last week's breach present their own threat. While each individual entry is of low value, the aggregated database could prove very useful for a nation state. Intelligence agencies have four techniques that they deploy to persuade a person to do something they wouldn't normally consider: Money, Ideology, Coercion and Ego - MICE for short. Gaining access to the home addresses and family members of a would-be target offers bad actors powerful intelligence for persuasive campaigns. In general, nation-critical organisations have an almost "white whale" status within the cybercriminal community: not only do hostile actors benefit directly from a breach, they also succeed in undermining the security posture of the United Kingdom as a whole.
Organisations with this greater risk potential should therefore have the processes and tools in place to identify any suspicious activity. The longer a bad actor can hide in any organisation's networks, the more damage they can do, but maintaining visibility over complex networks – especially those with legacy technologies – is an ongoing challenge. A future-facing observability strategy Hybrid cloud networks are a common part of most IT strategies, but as organisations migrate more and more workloads to the cloud, the security stack is struggling to keep up. Many monitoring tools are designed for on-premises environments, making them insufficient in addressing unseen weaknesses and security gaps in hybrid landscapes. In turn, cloud-centric tools have little visibility into on-premises traffic. The result is an ongoing visibility challenge that bad actors are keen to exploit. In today's climate, organisations must shift towards a more proactive security mindset and, by embracing deep observability, reduce blind spots before they're exploited. This will provide real-time, network-level intelligence to track normal and suspicious activity. Achieving visibility into networks should be a priority for security teams if they are expected to manage cyber risks in a complex environment, and this is where deep observability comes into play. It offers security teams insight into malicious incoming traffic, even when encrypted, and monitors East-West traffic for suspicious activity. The ability to identify behavioural anomalies in an organisation's data is vital to spotting potential breaches – ensuring threat actors can't go months or years inside an IT environment without anyone noticing. As cloud adoption continues to grow, security leaders need to empower themselves with the right insights and oversight of their own networks to effectively address the complexities that their hybrid state brings to conventional security methods. Learning from a breach Nation-critical organisations should build out their security posture with a well-resourced and capable intelligence service in mind. These hostile attackers, with access to some of the best exploit writers and operatives on the planet, need to be met with a network security model that goes beyond traditional logs or Endpoint Detection and Response (EDR). Techniques to evade these methods are well known: threat researchers have shown that EDR is easily disabled, especially on Windows, and the hackers' attack playbook consists of turning down logging and erasing Windows event logs as soon as they enter a system. Deep observability offers the next level of visibility to ensure that breaches can be detected and mitigated as early as possible. The detection of intruders is a critical component of any cybersecurity strategy. However, tracking intruder activity after detection is equally important. With deep observability, when security leaders detect an ongoing attack in their network, they can then track intruders to inform a long-term incident response and recovery strategy. When organisations are lucrative targets for criminals, tracking intruder activity provides valuable insight into attacker tactics, techniques, and procedures (TTPs) to improve Network Detection and Response (NDR) capabilities. To use an example, if an organisation identifies a specific type of malware used in an attack, it can deploy tools to block that malware in future.
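Spotting behavioural anomalies does not have to start with anything exotic. The toy sketch below flags a traffic interval that deviates sharply from a recent baseline - a crude stand-in for the kind of East-West monitoring described above, with invented figures and a simplistic threshold that real NDR tooling would replace with far richer models:

```python
# Illustrative sketch: flag any traffic interval whose volume deviates sharply
# from the recent baseline (a simple z-score check on synthetic figures).
from statistics import mean, stdev

baseline_mb = [120, 118, 125, 130, 122, 119, 127, 124]   # recent per-interval volumes
threshold = 3.0                                           # z-score cut-off

def is_anomalous(observed_mb: float) -> bool:
    mu, sigma = mean(baseline_mb), stdev(baseline_mb)
    return abs(observed_mb - mu) / sigma > threshold

for sample in [126, 131, 480]:   # the last value mimics a bulk exfiltration burst
    print(f"{sample} MB ->", "ALERT" if is_anomalous(sample) else "normal")
```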
In the time between detection and public disclosure, the Electoral Commission has reportedly implemented multiple measures to improve its security posture. These include improved threat monitoring capabilities, updated firewall policies, and more robust network login requirements. These are, of course, important upgrades, and we can only hope that other critical organisations are taking note of the blind spots that these attackers were able to exploit. But we can all learn from this breach. Without the right level of visibility, bad actors can hide in complex hybrid environments undetected for a dangerously long time. The ability to collect and analyse vast amounts of data from network traffic, endpoints, applications, and other sources across a network infrastructure is the key to building a real defensive strategy and responding to threats in real time. Bad actors are everywhere. In embracing deep observability, both public and private companies will ensure that when their time comes, they can react accordingly to minimise damage. ### Unlocking the Full Potential of a Cloud ERP solution It's no secret that organisations today are on the quest for the perfect Enterprise Resource Planning (ERP) solution that aligns with their specific needs and business objectives. There are four key categories that stand out as the most sought-after features in an ERP system: security, cloud-based architecture, scalability, and real-time functionality. But how do they all connect to ensure your business has the best solution available? Robert MacDonald, Innovation & Technology Manager at Absoft, discusses how adopting a cloud mindset can help companies navigate the change effectively, and how SAP’s S/4HANA Cloud Edition stands out as a strong cloud ERP solution. Identifying The Ideal ERP Solution One of the most crucial aspects of any ERP system is its security, due to organisations recognising the importance of protecting its sensitive data and business processes from cyber threats like ransomware. And, whilst in recent times, cloud-based ERP solutions have gained significant popularity, the appeal lies in the ability to store data in hyperscale data centres such as Amazon Web Services, Microsoft Azure, or Google Cloud, which provides an extra layer of security. The subscription-based model for SAP further streamlines accessibility through a user-friendly portal, eliminating the need for heavy installations and customisations. As a result, regular updates and bug fixes are delivered in the background, ensuring a smooth and secure user experience. In addition to security, organisations seek ERP solutions that are flexible and scalable. The dynamic nature of modern businesses demands the ability to adjust user numbers, modules, and applications as required, without incurring unnecessary costs associated with licences or bespoke customisations. A cloud-based ERP with out-of-the-box applications reduces reliance on third-party relationships and streamlines the overall ERP solution, resulting in greater agility and adaptability. Moreover, in today's fast-paced business landscape, real-time functionality is a game-changer when it comes to effective decision-making. To stay up-to-date and make informed decisions in a timely manner, organisations now require live analytics and embedded reporting. Using real-time data and smart analytics enables businesses to make data-driven decisions that drive growth and success and respond swiftly to changes in market conditions. 
SAP S/4HANA Cloud: Adopting a Cloud Mindset In the pursuit of an ERP solution that ticks all the boxes, SAP’s S/4HANA Cloud has emerged as a leading option. This cloud-based ERP solution addresses the aforementioned requirements effectively, offering a secure and scalable platform with regular updates and bug fixes to ensure optimum performance. In addition, its focus on fit-to-standard practices enables faster implementations and reduces the effort spent on additional customisations, making it an attractive choice for organisations of all sizes. However, implementing a cloud ERP solution successfully requires more than just choosing the right software. It demands a shift in mindset. Organisations need to adopt a cloud mindset which embraces the possibilities and benefits that come with this technology. Collaborative efforts between the organisation and its technology partner are crucial, particularly in navigating the complex change management processes involved in ERP implementation. Embracing a fit-to-standard approach ensures that the organisation’s processes align with the best practices of the chosen cloud software, resulting in smoother transitions and streamlined operations. Beyond the initial implementation phase, digital transformation becomes the key to unlocking the full potential of a cloud ERP solution. By leveraging various apps and digital processes, businesses can enhance efficiency and optimise their operations. Intelligently implementing extensions and integrations is vital to keep the core ERP system clean and easily upgradeable, helping to reduce complications in the future. Innovation adoption becomes an ongoing process in the cloud ERP model. Regular updates to the software offer access to new functionalities that organisations should embrace to fully benefit from the cloud solution. Embracing a cloud mindset and adopting agile methodologies keeps the ERP implementation flexible and successful, maximising the software’s capabilities as well as the business’s growth potential. The Journey to Success The significance of choosing the right ERP solution cannot be overstated in today's competitive business landscape. By prioritising security, scalability, real-time functionality, and embracing cloud-based architecture, organisations can position themselves for success. SAP's S/4HANA Cloud stands out as a strong example of a cloud ERP solution that effectively addresses these requirements. However, the journey to ERP implementation success requires a holistic approach, one that includes a cloud mindset, collaboration with technology partners, and a commitment to continuous innovation and digital transformation. By understanding and embracing these principles, organisations can harness the full potential of a cloud ERP solution like S/4HANA Cloud and future-proof their business operations in an increasingly digital world. ### Why a multi-cloud strategy is the secret sauce to creating unrivalled fan experiences at sporting events There’s a saying that ‘sport is nothing without the fans’ and over the last decade, fans’ expectations have evolved hugely. In live sport, it’s no longer enough to be present within the venue; the experience must also extend through mobile devices and an ever-increasing array of media platforms.
At the 2022 Football World Cup, for instance, fans consumed more than 800TB of data (the equivalent of 400,000 hours of movies) within stadiums as they pushed local mobile networks to the limits, watching live replays, sharing game footage and posting to social media en masse. This year, as the Rugby World Cup approaches, the organisers will face similar challenges to meet the needs of, and deliver exceptional experiences to, more than 405 million fans worldwide. Not all of those fans will be able to attend games in person, and there will be limits on who can access virtual streams reliably. Therefore, delivering consistent, rich media and content experiences, no matter where fans are located, will be critical in delivering a World Cup to remember. This surging demand will stress-test the tournament’s ability to deliver connectivity at scale, far beyond what a traditional IT environment could support. So, what benefit could a multi-cloud strategy, which enables any app to be managed across any cloud to any device, bring to large-scale sporting events such as the Rugby World Cup 2023? Here are my top three use cases: 1. Delivering unparalleled insights, on demand: The magic of sport is captured in the emotion and affinity that fans feel towards their favourite players and teams. Their desire to feel connected to game preparation, tactics and strategy – as well as the tribalism, that extraordinary emotional reaction that comes with wanting to be part of a united group – is almost unmatched. Rugby fans may have previously settled for reading the match-day programme or tuning into the ‘Ref Radio’, a communications channel that fans use to hear referee commentary. But expectations have changed. Fans now want complete insight into the match-day lifecycle, from training and injury updates through to build-up and post-game analysis. They also want to review Television Match Official (TMO) decisions on their mobile device as they see them on the big screen. They want to interrogate the replay, review different angles, and settle debates with other fans. It’s here that a multi-cloud strategy can support this level of real-time connectivity, engagement, and sheer scale. Working seamlessly together, 5G, telco and edge clouds provide the necessary compute power, as well as the low-latency connectivity, to scale multimedia content from within the stadium to end-user devices around the world. Stadiums will find they can deliver the right applications at the right moment, through a local connectivity point with telco edge and 5G, for the best possible content experience. 2. Building trust with fair play and accurate officiating: The future of the fan experience at large-scale tournaments, like the Rugby World Cup, will be largely supported by multi-cloud – as it underpins several different sporting technologies. For instance, Rugby’s TMO connects to multiple stadium cameras to deliver clarity and certainty on whether the ball was grounded for a try, whether there was a foul or infringement in the build-up, and whether points need to be awarded or removed. It works in a similar way to VAR in football or Hawk-Eye in tennis and cricket. In doing so, it removes the ‘debate’ from the sport and ensures that victory is earned, not stolen. This enhances the fan experience as on-field decisions can be trusted, and losses can be begrudgingly accepted. However, this trust can easily deteriorate if the technology fails under stress due to poor testing.
If the TMO replay is limited to just one stadium screen, or the feed becomes pixelated, slow, or simply stops entirely, then the integrity it brings to the game breaks down. Through private 5G infrastructure and compute support at the edge, the Rugby World Cup stadia can benefit from a containerised environment, so all stadium screens, camera equipment, local applications and storage software connect to the same local network services – delivering fast, low-latency, rich multimedia content at scale. This means the TMO is never delayed, fans trust its output and the tournament maintains an exceptional experience. 3. Seamless public infrastructure support: When fans enter a stadium, their mobile device will often automatically connect to the public Wi-Fi network. This is often followed by a swift disconnection as the Wi-Fi proves painfully slow and untrustworthy. The use of private 5G networks transforms the connected fan experience and changes how large-scale tournaments like the Rugby World Cup interact with public services. For instance, if every fan connects to the same private 5G network as they enter the stadium, each seat becomes a de facto IoT device. The connections provide a virtual picture of how many fans are in the stadium, where they are sitting and when they take their seats, supporting capacity planning, forecasting, and floor-space calculations. From a more practical point of view, this means the Rugby World Cup can also track how many fans have left a game and at what point, and when they are about to enter nearby roads or use public transport. Moreover, this innovation really comes to life when it is connected with public services, transport and security operations. The ubiquitous connectivity provided between private and public 5G networks is all about data balance and federation. Suddenly, these services can get real-time insight into fan activity – supporting policing, logistics and emergency services as fans leave the ground. At the same time, it also means that should there be an emergency in the stand, there is no network congestion, so support can arrive faster, with greater context linking the digital and physical characteristics of large-scale sporting events. Foundational multi-cloud technology in sporting events isn’t anything new. It has evolved naturally with the infrastructure requirements. But consumer requirements are also evolving and will demand 5G connectivity to better deliver applications and rich companion experiences. As well as providing an unparalleled fan experience, multi-cloud systems support fair play and accurate officiating – using modern technology to bring the sport back to what it once was. Whilst it can take some of the ‘combativeness’ out of certain sports, with umpiring decisions made on facts, technology also acts as an enabler of sportsmanship and mutual respect both on and off the pitch. Events such as the Rugby World Cup are only set to grow in popularity, especially following global disruptions such as the pandemic. Smart multi-cloud deployments are becoming ever more critical in connecting public services. Moreover, to ensure the fan experience is safe and turnout is sustainable, it is rapidly becoming the role of telco service providers to provision the 5G cloud services upon which fans and organisers rely, and to guarantee their continuity.
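As a rough illustration of the 'every seat becomes a de facto IoT device' point above, the sketch below estimates how many devices are attached to a stadium's private 5G network over time. It is a simplified, hypothetical example: the attach and detach events, device identifiers and timings are invented, and a real deployment would stream this data from the network core at far greater volume.

```python
from datetime import datetime, timedelta

# Hypothetical connection events from a private 5G network: (device_id, event, timestamp).
events = [
    ("dev-001", "attach", datetime(2023, 9, 8, 19, 30)),
    ("dev-002", "attach", datetime(2023, 9, 8, 19, 42)),
    ("dev-003", "attach", datetime(2023, 9, 8, 19, 55)),
    ("dev-001", "detach", datetime(2023, 9, 8, 21, 50)),
    ("dev-002", "detach", datetime(2023, 9, 8, 22, 5)),
]

def occupancy_over_time(events, step=timedelta(minutes=30)):
    """Estimate how many devices are attached to the stadium network at each interval."""
    events = sorted(events, key=lambda e: e[2])
    start, end = events[0][2], events[-1][2]
    attached = set()
    i, t, samples = 0, start, []
    while t <= end:
        # Apply every attach/detach event that has happened up to this point in time.
        while i < len(events) and events[i][2] <= t:
            device, kind, _ = events[i]
            if kind == "attach":
                attached.add(device)
            else:
                attached.discard(device)
            i += 1
        samples.append((t, len(attached)))
        t += step
    return samples

if __name__ == "__main__":
    for t, count in occupancy_over_time(events):
        print(f"{t:%H:%M}  estimated devices on network: {count}")
```

The same running count could feed capacity planning, or flag the moment large numbers of fans begin leaving the ground so that transport and emergency services can prepare.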
These examples show where multi-cloud technology bridges seamlessly between public safety and security needs, whilst delivering transparent excitement and passion for sports fans.  ### The need for speed: Rapid prototyping for SaaS success Anyone who has spent a decade or more in the software industry will have witnessed firsthand how technological change shows no signs of halting its exponential acceleration. And that frenetic pace has perhaps been felt most in the software-as-a-service sector. Modern SaaS companies operate in a landscape where customer expectations are constantly shifting. To keep up, rapid prototyping has become critical. Early in my career, I learned that speed and agility are imperative to iterating quickly based on user feedback. Traditional long-release cycles are incompatible with today’s reality. The benefits are clear and compelling. Rapid prototyping enables testing new ideas without major upfront investment. You can validate what resonates with minimal sunk costs. This “fail-fast” approach allows for effective learning and smarter product decisions over time. No-code prototyping tools have played a big part in the process revolution. With simple drag-and-drop interfaces, even non-technical users can build functional prototypes without writing code. This empowers product designers, entrepreneurs, and others to quickly test ideas and create tangible prototypes. They can iterate rapidly without overburdening engineering teams. SaaS companies have used no-code tools to shorten the feedback loop from the initial concept to the final product. Prototyping is no longer a lengthy, expensive endeavour. By rapidly iterating prototypes, teams can refine them until the desired functionality is achieved. Then, the final prototype can be handed off for full development, giving engineers a clear direction. This approach saves engineering time during the exploration phase and allows them to focus on building products that deliver impact. Traditional product development is costly and risky, but rapid prototyping reduces both. No-code tools allow product managers to easily build MVPs without expensive engineering resources. Rather than hiring engineers to translate every idea into an MVP, they can focus on complex features that differentiate your product. Beyond cost savings, rapid prototyping boosts revenue growth. The sooner you launch a functional product, even in limited form, the faster you generate revenue. Imagine waiting eight more months to launch the full product. That’s eight months of missed revenue opportunities. Rapid prototyping allows progressive companies to get to market faster and unlock revenue growth sooner. Thanks to modern tools, creating a usable prototype in just a day or two is now feasible. You can immediately get qualitative customer input to ensure you’re focused on the highest-value features that offer the most benefit. For larger user bases, usage analytics reveal what prototypes users engage with most. A sub-day turnaround would be ideal but is unrealistic given logistics – humans need more time for meetings than machines! However, the key is to constantly tighten the feedback loop between releases to match the pace of changing expectations. A company that in the past may have invested six months internally developing a major analytics module before launch would find, at the time of launch, that priorities had changed as the market evolved. Much of that effort would have been wasted on features users no longer needed. 
With rapid prototyping, they could have validated demand earlier and adjusted the course as required. In the past, many software professionals believed “big bang” releases after months of preparation were best. Today, it is accepted that SaaS customers want a steady stream of incremental enhancements, not sporadic major versions. Their needs rapidly change, so products must evolve in tandem. This is where rapid prototyping really shines. Instead of long dev and testing cycles, new code deploys frequently to deliver value faster. Minimum viable products provide learning to guide development. You tackle technical debt along the way rather than allowing it to accumulate. Rapid prototyping complements DevOps practices like continuous delivery. Based on feedback, small enhancements can be pushed multiple times daily to refine the user experience. Real-time customer insights direct this process. Admittedly, potential downsides exist. Prototypes relied on by customers may get cut if not useful at scale. Frequent changes can also create bugs and complexity over time. However, smart monitoring and engineering can mitigate these risks. The benefits far outweigh the costs. In a competitive market, speed and agility rule. The companies leading our industry fully embrace rapid experimentation and customer-focused development. Modern SaaS allows for faster prototyping and release than ever before. Successful software entrepreneurs are excited by the velocity of change, not intimidated. My advice is to learn quickly and don’t fear failure. Deliver value iteratively in small chunks. The future belongs to companies moving at the speed of customer expectations. Rather than being overwhelmed, rapid prototyping allows you to harness that pace of change. Companies need easy access to visualisations and dashboards to make sense of overwhelming amounts of data. Structuring all the data fully upfront is a common but wasteful mistake. You’ll never have all the data you ideally want. Rapid prototyping lets you start by quickly analysing some initial customer metrics. Get their qualitative feedback on whether it provides value. Then, iteratively grow your capabilities based on what they find most useful. Soon, they may even identify new areas you hadn’t considered, helping you get ahead of competitors. The pace of change won’t slow anytime soon. SaaS companies must embrace rapid prototyping and tight feedback loops with customers to keep up. Speed and agility now win in software development. Delivering innovation that creates value is what matters. Lagging competitors will be left behind. ### Embracing repatriation for cloud optimisation: Reclaiming control “You’re crazy if you don’t start in the cloud; you’re crazy if you stay on it.”– Sarah Wang and Martin Casado, general partners at Andreessen Horowitz (a venture capital firm in Silicon Valley, California). It is a raging debate. The business world is divided on this. Cloud computing has, no doubt, created powerful value propositions for global enterprises. Cloud-hosted digital environments offer unmatched scalability, flexibility, and cost efficiencies to future-facing organisations. However, despite its numerous benefits, certain difficult-to-ignore challenges have also emerged along the way. This has prompted many companies to explore cloud repatriation strategies to optimise their business operations. But what is cloud repatriation? And, what makes it worth our while? 
Simply put, repatriation is the process of moving data and applications from a public cloud back to an organisation’s on-premises data centre, private cloud, or to its hosting service provider. But it is not a simple process. Given that the ultimate objective is to identify and implement the most optimised architecture that effectively supports our business needs and objectives, it may simply work better for some businesses, in some instances.
Why we need to reclaim cloud mastery: Repatriation as an optimisation strategy
For many enterprises, the bevy of challenges associated with leveraging public cloud may outpace its perceived benefits. Enterprises are clearly concerned about multiple factors associated with cloud usage, including:
• Budgetary outlays: Cloud services can be expensive if not managed efficiently. As per the Flexera 2023 State of the Cloud Report, a staggering 82% of enterprises identify the management of cloud expenses as their primary challenge. Managing cloud costs effectively becomes complicated due to factors such as storage costs, the cost of underutilised resources caused by infrastructure sprawl, regulatory compliance, and data transfer expenses.
• Cloud security vulnerabilities: 79% of businesses harbour apprehensions about cloud security. Moving data or applications back to on-premises infrastructure, on the other hand, empowers businesses with increased control over their security infrastructure. This control extends to areas such as network configurations, access controls, encryption methods, and physical security measures.
• Limited know-how: Navigating the cloud on your own can be quite a challenge for enterprises, akin to finding your way in a new city without a map or local guide. No wonder 78% of companies admit to grappling with insufficient cloud resources and expertise.
• Vendor lock-in: Lock-in adds another layer of complexity by making businesses overly dependent on a single cloud provider for their infrastructure, services, or applications. They find it difficult to move their data and applications to a cloud of their choice. In such cases, they may decide to move their data and applications back to avoid vendor lock-in.
• Poor data sovereignty: In the modern corporate landscape, safeguarding data, adhering to the regulations of the country in which the data is located, and mitigating leakage risks are critical. A remote cloud environment may undermine data sovereignty and may not comply with local data protection regulations. Enterprises may also lack control over data storage and processing across different jurisdictions.
• Latency and performance: Near-edge or on-premises edge locations are emerging as ideal destinations for repatriated workloads. These locations provide benefits such as minimised latency, on-site data processing capabilities for real-time applications, and support for Internet of Things (IoT) use cases.
In this scenario, cloud repatriation is quickly emerging as a viable business growth strategy.
Optimising your cloud presence with repatriation
Cloud optimisation aims to maximise efficiency and cost-effectiveness in using cloud computing resources.
Therefore, repatriation can manifest in different forms, such as multi-tenanted private cloud, hosted private cloud, and alternative deployment models. According to a recent IDC study, customers are finding it extremely compelling to run existing as well as modern born-in-the-cloud workloads in a private cloud environment versus running them on public cloud. Responding to this, system vendors are now providing unified management platforms that offer observability, management, and provisioning capabilities. These solutions allow businesses to access the same user experience as public clouds within their dedicated infrastructure. According to the study, by 2024, the proportion of mission-critical applications running on dedicated traditional data centres will decline from 30% currently to 28%, while modernised versions of these applications running on private cloud will increase to 26%. Modern enterprises may want to keep certain workloads on-premises while migrating others to the cloud without compromising their data. This allows them to harness the advantages of both environments. So, they need to assess their requirements to determine the best approach to cloud optimisation. But repatriation is a complex process. As organisations traverse this journey to optimise their cloud presence, it inevitably leads to rearchitecting the network infrastructure and revisiting the existing security solution architecture. Hence, the importance of a competent partner who can help an organisation navigate the maze at this stage cannot be overstated. ### CIF Presents TWF - Céline Schillinger Cloud Industry Forum presents TWF! with guest Céline Schillinger In this first episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar gives you the latest and greatest news on cloud computing. David is also joined in the studio by special guest Céline Schillinger, acclaimed author of “Dare To Unlead”. Céline will tell us the story behind the book, explain its philosophy and tell us why leadership matters. We'll also cover the power of the network, explain how corporate culture stifles creativity, and talk about a different style of leadership to cope with today's world of complexity. And there will be a draw for attendees to win copies of the book. ### Generative AI: The Urgency to Accelerate Digital Transformation There is virtually nowhere on the Internet you can go today without encountering some form of generative AI. There is virtually nowhere in the workspace you can go either, without generative AI dominating conversations from the board to the break room. A March 2023 KPMG survey of U.S. executives found that almost two-thirds (65%) believe “generative AI will have a high or extremely high impact on their organisation in the next three to five years.” The catch? “Fewer than half of respondents say they have the right technology, talent, and governance to implement generative AI successfully.” This finding may surprise those who encounter chatbots and read AI-generated content daily. However, it isn’t news to those of us who’ve been tracking the corporate digital transformation journey, and mapping it against the six technical capabilities required to successfully navigate that journey (IT infrastructure, data, apps and app delivery, observability and automation, Site Reliability Engineering practices and security).
In our recent Digital Enterprise Maturity Index Report, we did a deep dive into the current state of digital enterprise maturity based on those six technical capabilities and found that only 4% of organisations are operating at the highest level of maturity. That is, they are close to fully operating as a digital business. Most organisations (65%) are dabbling in digital business. They show signs of maturity, harvesting the rewards of the hard work of modernising IT and its technology domains. Notably, this is nearly the same percentage KPMG found lamenting a lack of the right technology, talent, and governance to implement generative AI successfully. That’s not a coincidence. The sheer breadth and depth of technical capabilities required to establish and operate an AI-enabled business can be overwhelming. From new technical domains in the enterprise architecture—SRE operations, observability and automation, app delivery, and security—to modernising existing domains such as data and infrastructure, a significant amount of work is needed, especially for enterprises that predate the Internet. That’s why it’s no surprise that when we look at the maturity of organisations through the lens of industries, we find that no financial services organisation is operating at the highest level of maturity. Cloud providers, telecom, and technology firms dominate that category. With the exception of telecom, cloud providers and technology firms are relatively adolescent industries with far less technical and architectural debt than their more traditional counterparts. That makes it a lot easier for them to push forward faster. The lack of financial services companies in the highest category of digital maturity—the doers—might seem odd given the rapid rise of digital banking. Upon deeper analysis, one can assume that FinServ is moving at a slower rate by design. Adding a new interface (apps and digital services) is much like putting a shiny façade on an old building—it gives the appearance of modernisation, but behind the scenes is a great deal of traditional technology and practices. That’s not a condemnation. After all, there are costly risks to missteps, and they have a considerable burden with existing infrastructure and app portfolios requiring modernisation. This also explains the lack of healthcare companies in the top category. If a weighty portfolio and strict governance burden the financial services industry, imagine the burden on healthcare organisations. They are one of the most highly regulated and tightly governed industries, and rightly so, since missteps impact human lives. Digital transformation is not a race; thus, it is encouraging to see some industries progressing with a measured, strategic approach. The tortoise did beat the hare, after all. That’s not to say there aren’t examples of healthcare and financial services organisations that are progressing faster than others. The data set we analysed was drawn from responses to our State of App Strategy survey, based on the completeness of answers. Companies we’ve spoken to in both industries have demonstrated that there are exceptions to every statistic. Healthcare The healthcare sector, for example, needs more critical care nurses to cover the number of available hospital beds. This shortage of resources is an existential threat, but one entirely outside the industry’s control, because there simply are not enough nurses to go around.
To meet this challenge, some healthcare providers—especially those managing a large number of hospitals—are increasingly looking at how AI can increase the number of beds each nurse can cover without compromising the quality of care. Some providers are looking to Large Language Model AI to make nurses more effective by reducing the time needed to maintain patient health records. Others intend to use AI visual modeling to monitor hundreds of high-definition video streams for possible changes in patient conditions. Financial services Financial services companies can sometimes lag in their adoption of certain technologies because they’re generally highly regulated. But bankers and brokers know that the greater the risk, the greater the reward. Using applied technology to obtain a first-movers advantage over competitors is central to the financial services industry embracing blockchain despite significant difficulties with reliable deployment, considerable risk, and regulatory scrutiny. The industry has forecast certain advantages from the technology and is bankrolling some of the most sophisticated studies in areas such as Quantum Key Distribution, which can significantly increase the efficacy of security around transaction processing for monetary transfers or trading. Manufacturing The 2022 SOAS found that IT/OT was the most exciting technology trend, and it remained in the top five in 2023, demonstrating that manufacturers are at the forefront of modernisation. Manufacturers have always been on the vanguard of adopting any technology that leads to greater efficiencies and digitisation is no exception. The annual Oil and Gas Automation Conference for the past three years has shown that natural resources sector is far ahead of most enterprises in their use of automation, collection of telemetry, and the adoption of Zero Trust approaches to securing remote assets.   Get on board or get left behind The benefits of the digital transformation journey are undeniable. Organisations in all stages of digital transformation see benefits, and those exhibiting greater maturity are more likely to cite business benefits such as competitive advantage, new opportunities, and business operational efficiency. Generative AI is just one of many disruptive technologies ahead, and if organisations can’t harness these game-changing technologies, they will undoubtedly be left behind. Digital maturity indicates whether or not an organisation is well-positioned to take advantage of these technologies. ### Compare the Cloud interview – Matthew Bardell – nVentic Step into the realm of inventory management with Matthew Bardell from nVentic! In this comprehensive discussion, Matthew offers a deep dive into what makes nVentic stand out and the nuanced strategies they employ for effective stock inventory control. It's a journey of discovery that unveils the secrets of efficient inventory management. Be sure not to miss these valuable insights that can transform the way you view stock control! nVentic is the leading inventory optimization specialist, with a passion for applying advanced mathematical techniques to business challenges in a pragmatic, results-driven way. nVentic works intensively with manufacturing clients interested in improving inventory performance. nVentic delivers genuine expertise, backed up by proprietary analytical tools and is committed to innovation, creating a legacy of capability to hand on to the next generation.  
### Embracing cloud-based platforms: Unlocking business advantages in today's evolving work landscape Over the past few years, businesses have encountered many unprecedented challenges, prompting them to undergo rapid adaptation and technological evolution. Among the most defining aspects of this modern work landscape is the emergence of hybrid and remote work models. The importance of embracing technology that enables businesses and their teams to operate seamlessly, regardless of time or location, simply cannot be overlooked. Therefore, in today's dynamic and ever-changing working environment, organisations are now truly beginning to recognise the potential of cloud-based platforms. No longer just an optional consideration, these platforms have become a strategic imperative for companies of all shapes and sizes striving to excel in the constantly evolving work landscape. Let’s explore three potential advantages for businesses that choose to embrace cloud-based platforms in today's modern working world. Enhanced collaboration As mentioned, the shift towards hybrid work models, combining remote and in-office work, has become the new norm across the globe. Millions of us no longer commute into an office five days a week; instead, we wake up and simply commute down the hall to our ‘home office’. This new paradigm demands seamless collaboration and real-time, secure access to documents and workflows. Cloud-based platforms are designed precisely to fulfil this requirement, offering a flexible and accessible ecosystem for teamwork. These platforms allow teams to work together efficiently, regardless of geographic boundaries and physical location. Through these platforms, all kinds of teams, from HR to marketing and sales, can collaborate on projects, share ideas, and access critical documents from the comfort of their homes or offices. Another benefit cloud-based platforms bring is that they allow businesses to scale resources up or down based on project requirements, ultimately providing cost-effective solutions for organisations of all sizes, across a whole host of industries. Furthermore, with document management systems (DMS) integrated into cloud-based platforms, businesses can adapt to ever-changing needs with relative ease. The agility offered by these systems ensures that companies can evolve and grow without compromising productivity or collaboration. A real win-win. Remote work enablement for the 'future' workforce In response to shifting workforce demographics, businesses are now recognising the importance of catering to the expectations and preferences of the ‘future workforce’. The rise of Millennials and Gen-Z’ers has brought about strong demand for flexible working environments, with remote work now being a key consideration for job seekers. In fact, a recent survey from Deloitte showed that more than three-quarters of UK Gen Zs (77%) and millennials (71%) would consider looking for a new job if their employer asked them to go into their workplace full-time. Cloud-based platforms have emerged as indispensable tools in enabling remote and hybrid work. By offering employees the necessary tools and infrastructure to work effectively from any location, these platforms contribute to a healthier work-life balance. Businesses that adopt these platforms not only meet the expectations of the modern workforce but also position themselves as forward-thinking and adaptable employers. Appealing to and retaining this skilled and digitally native generation
boosts competitiveness in the market. By embracing cloud-based solutions, businesses demonstrate their commitment to staying current with evolving work trends and accommodating the ever-changing needs of their employees. This adaptability not only fosters a more engaged and satisfied workforce but also positions companies as innovative and future-ready players in an increasingly competitive job market. Environmental sustainability Many modern organisations are now prioritising reducing their carbon footprint and embracing paperless operations as essential sustainability goals. Cloud platforms offer an effective solution to this endeavour by digitising processes and workflows, allowing businesses to significantly reduce paper consumption. Embracing a paperless approach not only conserves valuable resources and, in the long term, saves businesses money; it also helps preserve forests, contributing to the broader mission of safeguarding the planet's vital, delicate ecosystems. The remote and hybrid working that cloud platforms enable also positively impacts the environmental sustainability efforts of every business. Gone are the days of commuting five days a week. Instead, most employees now only deal with traffic congestion, fuel consumption and the greenhouse gas emissions that come with it one to three times a week. Fewer cars on the road, less fuel burned and more workers with the flexibility to connect and collaborate remotely (at least a few days a week) through cloud-based technologies is a big part of the way forward. By embracing cloud-based technologies and implementing eco-friendly practices, businesses can effectively contribute to a greener future while enjoying the benefits of improved productivity and reduced environmental impact. As sustainability becomes an increasingly significant factor in the business landscape, particularly as more government-backed regulations and frameworks come into effect, cloud-based solutions present a viable path towards achieving both operational excellence and environmental responsibility. In today's rapidly evolving work landscape, embracing cloud-based platforms has become essential for businesses of all sizes and across all verticals to stay competitive and relevant. As businesses continue to navigate hybrid work challenges, harnessing the power of cloud-based platforms will undoubtedly lead to enhanced productivity, greater talent attraction, and a positive impact on the environment. In an ever-changing world, embracing the cloud is a strategic imperative that businesses must invest in to ensure their continued success. ### The Three Guiding Principles for Optimising Cloud Costs One of the often-cited barriers to Cloud migration is cost. Business leaders fear that it will be an expensive exercise in a number of ways, from the various direct costs associated with engaging a service provider, to the indirect costs resulting from enforced system downtime and speed of implementation. Even directing resources towards staff training to use new systems may be difficult to justify when many businesses are having to make tough decisions around reducing employee numbers and are implementing other cost-saving initiatives. While successfully implemented Cloud-based technology can offer organisations the power to work more efficiently and drive greater profitability, the pressure to ensure the greatest return on investment (ROI) has never been greater. Leaders cannot afford to make mistakes.
True optimisation of Cloud costs is a complex topic and the methods to approach it will vary from one organisation to another, depending on the unique characteristics of each. However, as with many other areas of business, good governance, clear processes, and access to data that drives understanding, ownership and accountability through the entire business can help organisations realise significant savings, and these should be the guiding principles in decision-making here.
• Governance is important to ensure that all Cloud expenditure is known and attributable.
• Process is important to ensure the lifecycle of Cloud resources is managed and procurement and decommissioning of Cloud services is relatively simple.
• Good data and visibility are key to allow the translation of Cloud costs into business value and to ensure any assessment of total cost of ownership of Cloud platforms and software is evaluated correctly against existing equivalents.
Elements such as tagging, structured naming conventions, and utilisation of best-practice frameworks can contribute to ensuring that all of the above can operate effectively together.
Getting the best ROI
From a technical standpoint, the best way to optimise Cloud costs is to design and utilise Cloud-native services wherever possible, ideally utilising services that can be converted or transitioned between Cloud providers to minimise the risk of vendor lock-in. For example, when deploying a distributed application, consideration of container deployment and the use of Kubernetes will allow the application to be ported between the large Cloud providers and even to local data centres with relative ease. Alternatively, utilising Azure App Services or AWS Elastic Beanstalk can lock an organisation into an ecosystem that may require significant overhead to extricate itself from in future. Those using line-of-business apps from third-party vendors that are currently installed on on-premises servers could, rather than migrating those virtual machines to Cloud providers, investigate whether the vendor offers a hosted or managed solution for the application that can be used as a drop-in replacement. If not, it may be worth considering a different application vendor. Ultimately, the key to extracting the best value from any Cloud service lies in understanding the organisation's specific workloads and requirements. Armed with comprehensive data and insights, organisations can choose the most suitable Cloud options and negotiate better deals with large Cloud providers.
Avoid common pitfalls
One of the biggest mistakes that IT leaders can make in their journey to the Cloud is assuming that adopting Cloud technologies will automatically result in cost savings. This misunderstanding often arises when comparing Cloud workloads against traditional on-premises infrastructure costs without considering the total cost of ownership. Sometimes organisations only get their first real view of what they are spending to maintain a system once it has moved to the Cloud and they can take advantage of the rich, granular data from the Cloud platform. Additionally, the management of Cloud costs is either pushed towards operational teams with little to no financial experience or siloed within finance teams who have no experience of IT operations. This can lead to a "Cloud-first" strategy driven by industry trends rather than genuine data-backed insights, resulting in unsuitable workloads being migrated to the Cloud.
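To make the earlier points about governance, tagging and cost attribution more tangible, here is a minimal sketch in Python. It is purely illustrative: the billing rows, tag keys and amounts are hypothetical, and in practice this data would come from a cloud provider's cost and usage exports or a FinOps tool rather than a hard-coded list.

```python
from collections import defaultdict

# Hypothetical billing export rows: (resource_id, monthly_cost, tags).
billing_rows = [
    ("vm-frontend-01", 420.50, {"team": "web", "env": "prod"}),
    ("vm-batch-07",    310.00, {"team": "data", "env": "prod"}),
    ("db-analytics",   980.25, {"team": "data", "env": "prod"}),
    ("vm-test-02",      55.10, {"env": "dev"}),   # missing team tag
    ("bucket-archive", 120.00, {}),                # untagged
]

def attribute_costs(rows, tag_key="team"):
    """Aggregate spend by a tag so each cost can be attributed to an owner."""
    totals = defaultdict(float)
    untagged = []
    for resource_id, cost, tags in rows:
        owner = tags.get(tag_key)
        if owner is None:
            untagged.append((resource_id, cost))
        else:
            totals[owner] += cost
    return dict(totals), untagged

if __name__ == "__main__":
    totals, untagged = attribute_costs(billing_rows)
    for owner, cost in sorted(totals.items()):
        print(f"{owner:<6} £{cost:,.2f}")
    for resource_id, cost in untagged:
        print(f"WARNING: {resource_id} (£{cost:,.2f}) has no owner tag and cannot be attributed")
```

Even a simple report like this makes unattributable spend visible, which is usually the first step towards the ownership and accountability described above.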
Adoption of Cloud applications or deployment of Cloud services through a service provider may result in additional costs that were not considered during the initial contract engagement. These could include professional services costs to configure workloads beyond the scope of the original engagement or unexpected egress charges for extracting data from a platform. The best way for businesses to ensure they detect any hidden fees or costs that may arise from migrating to Cloud services is first to fully understand their own needs and internal capabilities. It is important to take time to carefully review any contract to ensure it covers all of the business's needs. If it seems likely that a lot of help will be required in configuring a SaaS platform, ensure that the scope for this is included in the initial contract terms. Many SaaS providers may throw in ‘free’ one-off professional services work when negotiating a medium- to long-term contract that provides them with significant, stable recurring revenue, but it pays to be fully aware of the circumstances that can drive additional charges. When buying a managed service where considerable additional help for ‘net new’ deployments may be required, the contract should state whether this includes call-off hours for new resource deployment. It may also be sensible to request definitions of work labelled as ‘BAU tasks’ and to ensure there is transparency and agreement around what is and isn’t covered. Once the Cloud-based solution is fully implemented and being used by teams who understand its capabilities as well as how to use it most effectively, leaders will start to see more efficient processes. There can then be improved collaboration and co-operation across their organisation and access to accurate, current data for insights that can help drive greater productivity and profitability and provide a strong foundation for sustainable business growth. ### The power of innovative tracking technology in streamlining trade payments Businesses in today's fast-paced global economy are continuously looking for creative solutions to simplify trade payments, improve transparency, and ensure transaction security. The introduction of cutting-edge monitoring technology, powered by IoT, has resulted in a fundamental change in how firms manage their supply chains and accept payments. Let us investigate the role of innovative monitoring devices in revolutionising the global financial supply chain, as well as the many advantages they provide. Meet the tracking technology Businesses encounter several obstacles in guaranteeing flawless payment procedures in the ever-changing context of international commerce. Traditional methods sometimes entail time-consuming manual operations and lack the transparency and security required for efficient transactions. However, the introduction of cutting-edge monitoring technology has created new opportunities for firms to simplify trade payments and optimise supply chain processes. Advanced tracking technology, which includes elements such as location monitoring, temperature sensors, humidity, and motion detection, is at the core of this shift. Devices can be paper-thin, such as our Smart Shipping Label, which at just 2mm thick contains LTE-M connectivity and a printed battery powering it for up to six months. Such devices allow for real-time data on the shipping status of a range of package form factors, increasing visibility and control across the whole logistics process.
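As a rough illustration of how readings from such a device might be evaluated against agreed shipping conditions, here is a minimal sketch in Python. The field names, thresholds and coordinates are all hypothetical and the proximity test is deliberately crude; it simply shows the kind of rule evaluation that the payment automation discussed in the next section would build on.

```python
from dataclasses import dataclass

# Hypothetical telemetry reading from a tracking device; field names are illustrative.
@dataclass
class Reading:
    latitude: float
    longitude: float
    temperature_c: float
    humidity_pct: float
    motion_detected: bool

# Agreed shipping conditions for a consignment (all values hypothetical).
CONDITIONS = {
    "max_temperature_c": 8.0,            # e.g. chilled goods
    "max_humidity_pct": 65.0,
    "fragile": True,                     # flag motion events for fragile goods
    "destination": (51.5072, -0.1276),   # approximate delivery coordinates
    "arrival_radius_deg": 0.05,          # crude proximity check for the sketch
}

def violations(reading: Reading, conditions: dict) -> list[str]:
    """Return a list of any shipping conditions the reading breaches."""
    problems = []
    if reading.temperature_c > conditions["max_temperature_c"]:
        problems.append(f"temperature {reading.temperature_c}°C above limit")
    if reading.humidity_pct > conditions["max_humidity_pct"]:
        problems.append(f"humidity {reading.humidity_pct}% above limit")
    if conditions.get("fragile") and reading.motion_detected:
        problems.append("motion detected on a fragile consignment")
    return problems

def arrived(reading: Reading, conditions: dict) -> bool:
    """Very rough proximity test: is the device near the agreed destination?"""
    dest_lat, dest_lon = conditions["destination"]
    radius = conditions["arrival_radius_deg"]
    return (abs(reading.latitude - dest_lat) <= radius
            and abs(reading.longitude - dest_lon) <= radius)

if __name__ == "__main__":
    latest = Reading(51.51, -0.13, 6.5, 58.0, motion_detected=False)
    issues = violations(latest, CONDITIONS)
    if arrived(latest, CONDITIONS) and not issues:
        print("Shipping conditions met: consignment eligible for automatic payment release")
    else:
        print("Not yet eligible:", issues or "still in transit")
```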
Track and trace benefits for payments One significant benefit of this monitoring technology is that it increases the transparency of trade activities. Businesses have always struggled to gain real-time information regarding the status and quality of their shipments. With such tracking devices, organisations may now obtain up-to-date information, allowing for better decision-making and increased cooperation between suppliers and customers. This openness creates trust and allows firms to forge better relationships with their partners.  Security is a major issue in trade payments because firms want to reduce the risk of fraud and make payments only when shipping criteria are satisfied. Traditional techniques, such as letters of credit, sometimes fall short of delivering the required degree of security. With track and trace smart devices, payment conditions based on predetermined characteristics such as geolocation, temperature, and motion detection can be put in place. When these requirements are met, payments are delivered automatically, minimising the risk of fraud, and strengthening confidence among trade partners.  Automation and real-time data  This capacity to automate payment procedures is one of the most important benefits of cutting-edge monitoring technology. Historically, companies depended on humans to complete documentation manually to verify shipping fulfilment and start payments. This is time-consuming, error-prone, and inefficient. Businesses may now use sophisticated tracking systems to automate payment processes based on predetermined parameters, removing the need for manual involvement and lowering the risk of human mistake. This saves time, effort, and resources, lowering costs and increasing profitability.  Furthermore, the technology now provides access to data, almost 24/7, meaning companies on both sides of the trade can track the location and condition of goods at any moment in time. This allows for spotting deviations from specified criteria and allows them to be handled in real time, enabling proactive decision-making. For instance, if the temperature of a cargo exceeds the prescribed limit, organisations may take fast action to maintain product quality and integrity. Real-time data enable organisations to optimise supply chain operations, reduce delivery times, and boost customer satisfaction.  In supplier partnerships, trust and openness are critical. Businesses depend on their suppliers to provide products and services in excellent shape and on schedule. Businesses may create confidence with their suppliers by using new monitoring technologies in trade payments. The payment procedure is straightforward and automated, ensuring that suppliers are paid immediately when shipping criteria are satisfied. This promotes deeper connections since suppliers can concentrate on satisfying the demands of their customers because they are confident in regular payments. Improved supplier connections result in more efficient supply chains and mutually beneficial collaborations.  Unlocking the potential  Innovative tracking technology has the potential to go beyond trade payments. These devices may be used in a variety of industries and applications. Real-time monitoring, for example, ensures that temperature-sensitive pharmaceuticals are delivered and kept under ideal circumstances in the pharmaceutical business, where product integrity and safety are crucial. 
Furthermore, temperature monitoring across the supply chain may help perishable commodities, such as fresh vegetables, retain quality and avoid waste. Tracking devices' flexibility and adaptability open up a world of possibilities, providing organisations with important data and safeguarding the integrity of their operations. Furthermore, the advantages of cutting-edge monitoring technology extend well beyond trade payments. Real-time monitoring allows firms in areas such as logistics and transportation to optimise fleet operations and reduce delays. Efficient route planning, pre-emptive maintenance, and real-time asset monitoring save money and increase customer satisfaction. Moreover, tracking devices assist organisations in meeting and exceeding demanding compliance and regulatory requirements in sectors such as the pharmaceutical and food industries, assuring product quality and safety. Business success through IoT tracking Businesses may improve the efficiency of their supply chain operations by using sophisticated tracking systems. These devices' real-time data transmission, automated procedures, and enhanced visibility enable simplified operations, lower costs, and better resource allocation. With actionable insights into the supply chain process, businesses may discover bottlenecks, optimise processes, and make educated choices that create operational excellence. Furthermore, by eliminating manual processes and lowering the chance of mistakes, organisations can refocus their efforts on growth plans, innovation, and customer happiness. Finally, cutting-edge monitoring technology is transforming trade payments and supply chain processes, providing firms with more transparency, security, and efficiency, and promoting trust and cooperation among trade partners. Tracking devices, with their many uses and advantages, enable organisations to optimise supply chains, satisfy regulatory obligations, and provide enhanced consumer experiences. Businesses that adopt this technology will gain a competitive advantage as the global financial supply chain evolves, paving the way for a more transparent, secure, and efficient future. ### Cracking the AI code: The recipe for successful human-AI collaboration You’ve no doubt seen the relatable AI memes travelling across the universe of cyberspace right now. “To replace creatives with AI, clients will have to accurately describe what they want. We’re safe, people.” The terms “creatives” and “clients” have been swapped out for programmers, product managers and anybody else whose job is under threat. But the message is the same: garbage in, garbage out. We cannot use AI to its full potential if we’re not giving it a clear brief. Baking the cake – understanding what we really want In 2023, we’re generating more data than we know what to do with. Estimates vary, but figures for generated, captured, copied or consumed data come in as high as 328.77 million terabytes per day [LINK: https://explodingtopics.com/blog/data-generated-per-day]. It’s a classic case of quantity over quality. In fact, some researchers are even going as far as to describe large language models as “ticking timebombs” due to the copies of data they hold. With this in mind, we have to acknowledge that not all data is valuable, and as such, won’t help us make decisions. We cannot jump on the bandwagon of AI without understanding what we really want. Let’s bake a cake.
To bake a cake worthy of a Hollywood handshake, we need:
The cake: the outcome or answer we want
The ingredients: the data itself
The recipe: the program that helps everything come together.
What we have here, beyond a very worthy Victoria Sponge, is a procedural program. We have raw data – eggs and flour – and we know what we need to do to produce our desired outcome. In isolation, AI doesn’t have the recipe to produce outcomes. Rather, it relies on the outcome you request of it (the cake) and comes up with its own calculations to bake something edible. As we have seen in the past, this comes with mixed results. The benefits of using structured data At Tiger, we rely chiefly on structured data to help our clients make strategic decisions. For example, we look at deterministic data (which is more accurate as it has been provided by customers) such as call length, number of user licences or customer waiting times. In telecoms, this is far more useful than unstructured data, which is prone to error. Let’s say we’re asking an AI to show us this month’s call data. This could include unstructured data such as speech, which the AI would turn into an SQL query to retrieve information from the database. All it takes is one misinterpretation of a regional accent and we’ve essentially retrieved gobbledegook. At best, this is inconvenient. At worst, it could cause us to make misguided decisions with catastrophic results, particularly if we’re working in fields like healthcare or insurance.
Deterministic outcomes
Of course, we can apply human intelligence to unstructured data – for instance, gauging employee satisfaction from their tone of voice and general demeanour in video calls. But this is time-consuming and cannot be done at scale. So we use structured data to help our clients make key business decisions. Notable use cases include:
Adoption of tools: quantifying user licences shows senior teams where they could cut costs
Call waiting times: identifying peak times helps our clients to allocate staff resource
Preferred channels: counting how many customers interact via telephone versus live chat shows managers where to invest
Missed calls: a missed call is a missed sale or opportunity.
This gives our clients deterministic outcomes based on the data available. As humans, we offer the roadmap to get from A to B – how can this data guide us in the overall direction of the business? To get back to basics, this is why we need human intervention before we can leverage AI. It’s all very well having the data, but we need to understand what we want from it so that we can use it to its full potential. Do we want to save costs, improve customer service, retain staff or all three? The value of AI As time and technology move on, we should expect to see more use cases for AI in telecoms. For example, while Tiger doesn’t deal directly with network outages, we’re already seeing AI being used to track network congestion and help companies stay operational. It will also be invaluable when it comes to processing data at speed, in particular, unstructured data. We can use human data mining techniques to make decisions on structured data, but the merits of voice or facial recognition will come into their own in years to come. In practical applications, this could help our clients with fraud detection, from video to voice calls (barring some regional accent hiccups – but we can work it out). The challenges However, it’s important not to rely on AI alone or let it be the only guiding principle for business decisions.
There are still ethical considerations, such as how natural language interfaces are processing private conversations. Again, this is pertinent in fields such as insurance, finance or healthcare. We should expect to see more regulation around AI and data privacy in years to come – and we should have answers if our customers start asking questions. Already, two in five consumers are worried about AI [LINK: https://yougov.co.uk/topics/technology/articles-reports/2021/11/18/global-more-people-worried-not-about-artificial-in].

This is why we cannot underestimate human intervention. Letting humans take the reins will assure stakeholders that:

We gain true deterministic outcomes based on knowing what we want from data.
We apply an empathetic approach to unstructured data like video calls to read emotions.
We respect privacy concerns and keep customers informed about how data is being used.

There is no doubt that AI will help us to make better decisions in the future. But just as an AI might not consider gluten-free flour or other data nuances, we cannot overlook human intelligence.

### DORA and its impact on data sovereignty

According to the European Commission, no less, data is immensely valuable to all organisations, a significant resource for the digital economy and the ‘cornerstone of our EU industrial competitiveness’. Hardly surprising when you consider the data economy is projected to deliver more than €829b and nearly 11m jobs to the region by 2025. Capitalising on and nurturing numbers of that scale is precisely what is behind the evolving EU strategies and regulations coming into play. The latest of these is the Digital Operational Resilience Act (DORA), while updates to the Cybersecurity Act and the Data Act are likely to follow soon (relatively) afterwards. The key difference with DORA is that it extends its scope to encompass your financial business as well as all supply chain businesses and services integrated with your company. DORA aligns with the EU Cybersecurity framework (EUCS) and could become mandatory for sectors classified as highly critical under the EU Networks and Information Systems Directive (NIS2) from 2024 onwards.

Regional ‘protectionism’

To give some context to the extent to which Europe is looking to take back control of its own data, the EU has invested in research and innovation, backed by regulations, policies and standards, to the tune of €1.8 trillion. DORA is particularly crucial legislation because it addresses the notion of ownership and control head-on, initially for financial organisations, but expanding to a broader scope. Fundamental to it is the requirement that businesses ensure alignment with the latest regulations, as local auditors will be introduced to ensure compliance – something subsequent legislation will reinforce; the Cybersecurity Act (EUCS) will eventually protect EU data, out of reach of foreign jurisdictions, for instance. These, and other global data privacy regulations such as EUCS, the AI Act and the Data Act, are creating an environment of regional ‘protectionism’ and concerns regarding data ownership and privacy. According to this paper, 145 countries globally have data privacy laws, up from 132 in 2018. These laws vary by country and region, requiring local experts and multiple clouds, meaning businesses are feeling the pinch in resourcing and skills.
Recent research we conducted with IDC shows that more than 70% of businesses believe financial and environmental regulations will become more of a threat, while another source suggests 88% of boards regard cybersecurity as a business risk. Moreover, companies are grappling with macro issues such as global economic pressures, like inflation and ongoing geopolitical uncertainties. All of this is compounded by the UN triple crisis of climate change, pollution, and biodiversity loss. The upshot is that digital operational resilience and a business’s ability to control and manage its sovereign data under any circumstances have been catapulted to the top of the boardroom agenda.

Driving the need for data sovereignty

Yet the challenges of managing and storing sensitive and critical data are growing. The volume of highly sensitive data now hosted in the cloud is on an upward trajectory. 64% of EMEA organisations have actually increased their volume of sensitive data, and 63% have already stored confidential and secret data in the public cloud, according to the IDC report previously cited. At the same time, 95% of businesses cite the need to manage unstructured data as a problem for their business, and 42% of business leaders are very or extremely concerned about critical data managed by US cloud providers – Statista found that 66% of the European cloud market is controlled by US-based providers, who are subject to external jurisdictional controls like the US Cloud Act. Managing this exposure of highly sensitive classified data is driving the need for data sovereignty – where this intelligence is bound by the privacy laws and governance structures within a nation, industry sector or organisation. Maintaining stability within a sovereign scope requires businesses to utilise a cloud endpoint that offers the same sovereign protections as the original location, yet many multinational cloud companies cannot guarantee this.

A ‘cloud smart’ strategy

This is why businesses need to adopt a Cloud Smart strategy: one that ensures flexibility, allowing business-critical systems to be seamlessly moved from one cloud provider to another to ensure continuity. The recent political agreement on the Data Act (as of 27th June 2023) seeks to remove legal, financial (egress fees) and technical barriers to enable easier switching between cloud service providers. Taking this approach means comprehensively addressing all aspects of a business, including the sovereign supply chain (in the case of DORA), and will require audits to check all components meet the same standards of operational resilience. It is unsuitable to have a strategy that involves copying data out of a sovereign zone or that could lead to extended outages due to the absence of a secondary site or instance. The EUCS’s recent updates to the draft proposal now include a High+ category whereby no entity outside the EU would have effective control over cloud data. Additionally, relying on a single cloud vendor is not recommended for achieving true resilience. Instead, a resilient service should leverage multi-cloud and hybrid solutions to efficiently shift workloads and data as needed to avoid downtime and outages.

Foundations of a future Europe

Ultimately, the reason why sovereignty is so important is that it enables organisations to be innovative with their data and deliver new digital services.
The upcoming legislation may be cloaked in the objective of protection but, long term, it is being brought in to meet and exceed the numbers the European Commission projects around data – you don’t invest €1.8 trillion if you don’t expect it to pay back big. This legislation forms the building blocks for the foundations of a future Sovereign Europe: one where we’re not only in charge of our own data, but our own destiny as a result.

### 3 Ways The Latest Cloud Solutions Can Help Improve Sales Performance

Cloud computing is similar to web-based email clients, enabling users to access the system's capabilities and files without installing or maintaining extensive local copies of the software. Facebook, Instagram, email, and tax preparation software are popular cloud-based applications. Similarly, sales teams can use software that helps individuals and organizations alike when they require access to enormous volumes of data via a safe and reliable web network. By leveraging the power of the cloud, the sales team can unlock a new dimension of efficiency, scalability, and innovation. The sales team can use quote-to-cash (Q2C) software throughout the sales journey, enabling guided selling, configuration and pricing, quotation creation, order capture, e-commerce, and contract management. Cloud-based Q2C can also help with asset-based ordering, product renewals, cancellations, billing, and payments. In this article, we will explore how the latest cloud solutions are not merely a technological trend but a vital weapon in the arsenal of a modern sales team.

Driving collaboration

The sales team thrives on collaboration, especially when working remotely or located across different cities or countries. Here, cloud computing enables collaboration amongst sales team members, helping them view and share information easily and securely across a cloud-based platform. Cloud-based services offer collaborative social spaces that follow security and compliance norms and connect employees across your organization, increasing interest and engagement. They offer a centralized location where documents, chat, video conferencing, and other collaboration tools are available from any internet-enabled device. Here are a few of the ways the latest cloud solutions help the sales team:

Shared documents and files: The cloud enables sales team members to share access to documents and files in real time. They can simultaneously edit and comment, ensuring everyone is on the same page and revisions are immediately visible to everyone involved.
Unified communication systems: Cloud platforms like Slack or Microsoft Teams that integrate with cloud solutions provide a centralized hub for communication. Everything from text and voice to video conferencing is handled to ensure all communication is in one place, making it easier to track and manage.
Mobile accessibility: With cloud-based platforms, sales reps can access critical information and tools from any internet-connected device. Such flexibility helps maintain momentum in sales processes.
Enhanced customer interaction: Real-time collaboration enables quicker response times to inquiries or concerns, creating a more positive customer experience with the sales team at the center.

Scalability to meet growing demand

Cloud solutions provide the scalability necessary to cater to the dynamic nature of sales. The flexibility to scale resources according to demand helps sales teams stay agile and competitive, as the brief sketch below illustrates.
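For illustration only, here is a hedged sketch of what demand-driven scaling can look like in practice. It assumes an existing EC2 Auto Scaling group called "web-fleet" and suitable AWS credentials; the names, sizes and dates are invented and not a recommended production configuration.

```python
# Hypothetical sketch: schedule extra capacity ahead of a known sales peak,
# then scale back down afterwards so the business only pays for the extra
# capacity while it is needed. Assumes an existing Auto Scaling group.
from datetime import datetime, timezone

import boto3

autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

# Scale up before the campaign starts...
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-fleet",
    ScheduledActionName="seasonal-peak-scale-up",
    StartTime=datetime(2023, 11, 24, 6, 0, tzinfo=timezone.utc),
    MinSize=4,
    MaxSize=40,
    DesiredCapacity=20,
)

# ...and back down once the peak has passed.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName="web-fleet",
    ScheduledActionName="seasonal-peak-scale-down",
    StartTime=datetime(2023, 11, 28, 6, 0, tzinfo=timezone.utc),
    MinSize=2,
    MaxSize=10,
    DesiredCapacity=2,
)
```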
Whether ramping up during peak sales seasons or scaling down during slower periods, cloud solutions’ flexibility ensures that the sales team always has the necessary tools and infrastructure. This adaptability enhances efficiency and reduces costs, since businesses often pay only for the resources they use. Below are some of the ways in which cloud solutions aid sales teams:

Simplify adapting to market fluctuations: Sales demands can fluctuate seasonally or due to market trends. But with cloud infrastructure, you can scale software resources up or down without major capital investments in physical hardware. This offers agility in response to market needs.
Cost-effective growth: Cloud scalability allows sales teams to pay only for the resources they use. This can be particularly beneficial for startups or SMBs whose sales teams face budget constraints that may otherwise hinder growth.
Improved peak-time performance: Sales teams can effectively leverage software for sales campaigns, product launches, or seasonal spikes. Here, businesses can allocate additional resources instantly to cope with the increased load.

For example, an e-commerce company running a Black Friday sale can anticipate a surge in website traffic. They can dynamically scale their server capacity to handle this spike using cloud solutions like AWS. It ensures that the website remains responsive and functional even with a tenfold increase in users, leading to higher conversion rates and successful sales.

Assuring security and compliance

Your sales team can access sensitive customer information like personal details, financial data, and proprietary business information. Protecting such information requires leveraging cloud solutions that offer advanced security measures, such as encryption, multi-factor authentication, and intrusion detection systems. Data security is an assurance that enables sales representatives to confidently engage with clients without worrying about potential data breaches. This trust enhances customer relationships and can lead to more successful sales interactions. Moreover, cloud solutions help unlock the following for sales teams:

Compliance with regulations: Different industries and regions have specific regulations governing the handling and management of customer information (e.g., GDPR in Europe and HIPAA in healthcare). Cloud software providers often have dedicated teams to ensure compliance with these regulations, providing regular updates and audits. Adopting such cloud infrastructure can help sales teams focus on their core responsibilities without becoming experts in legal compliance. This ensures sales interactions align with the latest regulations, reducing the risk of legal issues that could damage the company's reputation or result in fines.
Customizable security protocols: Sales teams can also benefit from cloud solutions that offer customizable security settings, enabling businesses to tailor the protocols to their needs and industry requirements. This includes setting specific access controls and monitoring tools, wherein sales managers can set up specific permissions for sales reps. This ensures sales personnel have access to the information they need while restricting access to more sensitive areas. Ultimately, it minimizes the risk of internal security breaches and maintains the integrity of customer data.

Wrapping up

Cloud technologies are becoming more of a need than a luxury for sales teams to survive and beat the competition.
Plenty of SaaS application development companies now offer cloud solutions that benefit everyone, from small businesses seeking a competitive edge to multinational corporations aiming for global dominance. If you are a business that has yet to venture into cloud systems, now is the time to foster a culture of cloud transparency and empowerment that can boost a sales team.

### Building Trust in the Cloud: How to Secure Your Digital Assets

Do you trust every organisation that holds personal information about you? As a consumer, you probably willingly click “accept” on the terms and conditions of an online service that gathers and stores personal information. As a cybersecurity professional, you likely take a moment to evaluate value versus risk when handing over sensitive information. Perhaps you even review the data privacy policy of the app before sharing anything with its developers. As many as 85% of consumers are interested in reading the data privacy policy of a business before making a purchase. More than half say they feel they must check the trustworthiness of a company before making online purchases or using digital services.

Cloudy with a Chance of Breaches

Defining and publishing data privacy policies is great (and usually mandatory by law), but failing to enforce them and suffering a data breach can do irreparable damage to the reputation of a business. 40% of consumers report that they have chosen a competitor after learning that a potential vendor failed to protect its customers’ data. This rate increases among frequent online shoppers, B2B buyers, and Gen Zers. With the cloudification of application infrastructure and the switch from firewalled on-prem environments to complex and distributed virtualised deployments, software engineers need to take a different approach to securing digital assets throughout the SDLC. The shared responsibility model of cloud computing services may alleviate the overhead of securing servers, networks, and the application itself. Still, it deprives us of the traditional level of control we once had over our application’s storage and runtime environments. Since so many sensitive data assets are handled in a cloud environment, in one way or another, by developers, integrating the principles of digital trust into the software development process is vital for business continuity, resilience and compliance with regulatory requirements. But where do you even start? How can you build trust in a virtualised, mercurial, complex, and shared infrastructure?

Defining Digital Trust from a Developer Perspective

According to Deloitte, digital trust is “the confidence among customers, employees, partners, and other stakeholders in an organisation’s ability to create and maintain the integrity of all digital assets (including data/information, architectures, applications, and infrastructure).” ISACA, a global professional association, defines digital trust as “the confidence in the integrity of relationships, interactions and transactions among providers and consumers within an associated digital ecosystem.” However you choose to define it, digital trust aims to ensure transparency, accessibility, security, reliability, privacy, control, ethics, and responsibility. It is an intangible currency that flows within and between business entities. For programmers, developing and maintaining the digital trust of application users and other stakeholders entails protecting digital assets involved in the software development lifecycle.
Generally speaking, digital trust in software development is based on three key principles: authentication, integrity, and encryption. Identity authentication is the provisioning, maintenance, and management of accounts and resources for individual users, machines, workloads, containers, and services. Maintaining the integrity of digital assets requires monitoring, observability, and well-defined asset ownership. Encryption of all data in transit between endpoints ensures unauthorised parties cannot intercept and decode it. These may sound familiar, as they are also some of the foundations of DevSecOps and the secure software development lifecycle – SSDLC.

Overcoming Cloud Trust Issues with DevSecOps

Another pillar of DevSecOps is automation, embracing tools such as SAST, DAST, and AI-enhanced monitoring and observability. Eliminating unauthorised access to data, data leakage, and cloud misconfigurations throughout the SDLC while maintaining development velocity is no easy feat. But it will ultimately cost you and the business much less than the potential repercussions of a successful breach. Along with the shift-left approach to software security ownership, there was another, quieter shift left – that of digital trust. Digital trust is key to business growth in a connected world and must be adequately developed and maintained as an organisational effort and shift in culture. For DevOps professionals, this means adopting innovative security tools and DevSecOps best practices while empowering coders to seamlessly and effortlessly integrate these technologies into their workflows with one ultimate goal in mind: nurturing digital trust with all the stakeholders in your SDLC.

### Tackling digital immunity: adopting a holistic strategy

Digital technology adoption has become a key differentiator for European enterprises, giving them a competitive edge. With more critical applications moving to the cloud, businesses face increased challenges in protecting sensitive information and resources far beyond the traditional corporate walls. To protect application and network environments, organisations should embrace a much more comprehensive approach. By adopting the latest technologies and frameworks, with the necessary skills at hand, businesses can be sure that their sensitive information and assets are safeguarded, allowing them to succeed in a digital landscape.

The challenges of digitalisation

Businesses today recognise the importance of ‘going digital’ to remain competitive in their respective markets. Digitisation helps organisations streamline their operations and create new revenue streams. However, as companies continue to digitise, they face an increasing risk from cyber threats, including those that exploit cloud-based applications and Internet of Things (IoT) devices. It is vital that organisations develop a robust digital immune system to manage, monitor and protect their applications and services from any anomalies that may arise due to software bugs or security issues. A strong digital immune system includes implementing measures that make applications more resilient, allowing them to recover quickly from failures. In this way, organisations can reduce downtime by up to 80% in the next two years, according to Gartner. Cybersecurity professionals play a crucial role in keeping the company's digital immune system resilient. Implementing the measures to do so can often come at a significant cost, which may leave a sizeable dent in IT budgets.
It is essential to look for cost-effective solutions that can help keep the digital immune system strong without compromising on quality or security. With the corporate network well recognised as the front line of an organisation's digital health, organisations may consider outsourcing their network security operations to managed security service providers that can optimise their network security processes while reducing the total cost of ownership. In addition, it is worth considering new ways to initiate protection before internet traffic even hits the corporate perimeter. For example, DDoS protection services that are integrated with a Tier 1 Internet Service Provider backbone offer the benefit of added threat intelligence capabilities through constant monitoring of the wider internet for emerging attack trends.

Putting security first

The risks of digital transformation are never too far from the headlines. For example, the recent ESXiArgs ransomware attack targeted cloud vulnerabilities, causing significant disruptions to nearly 5,000 companies across Europe and the US. With IT environments becoming increasingly complex, security needs to be integrated out to the edge of the network – and, where possible, go beyond – closing any gaps in protection organisations may be blind to. By doing so, companies can minimise the risks of cyber threats and protect their digital assets. Research shows that many organisations are not taking this lesson seriously. A global study by IDC commissioned by GTT found over 95% of enterprises have either deployed software-defined wide area networks (SD-WANs) or plan to do so within the next two years. However, the study also revealed that more than two in five enterprises (42%) either don't have security integrated with SD-WAN or have no specific SD-WAN security at all. These enterprises may unknowingly have vulnerabilities in their corporate networks that malicious actors can easily exploit. To mitigate these risks, companies need to integrate their networks with their security stack and defence-in-depth strategy to protect users at the point of connection. This approach ensures security measures are implemented throughout the network and helps prevent cyber threats from gaining a foothold. The IDC global study also found over 80% of respondents worldwide have either made Secure Access Service Edge (SASE) a priority (39%) or recognised its benefits and are already incorporating it into company initiatives (42%). A SASE framework integrates Security Service Edge (SSE) with SD-WAN to deliver global, reliable, end-to-end protection in a single cloud-based service model. Digital transformation is happening at such a fast pace that network security leaders may find it challenging to keep up, especially given shrinking budgets and a limited talent pool. Yet they still must adopt a proactive approach to security and invest in the right tools and technologies to protect their businesses from cyber threats. This approach may include outsourcing security operations to managed service providers who specialise in network security and hiring skilled cybersecurity professionals to build a robust security posture. Ultimately, organisations must regularly assess their security posture and be prepared to adapt quickly to stay ahead of changes in the threat landscape.

Improving an organisation’s digital resilience

Businesses are going through a period of change. Work may now be done from any location and is no longer restricted to a certain space.
This means that every point of access to the company's network poses a potential security risk. Businesses can increase their digital resilience by partnering with the best managed service provider. These providers can develop customised solutions at reasonable prices thanks to their ties with leading security technology partners. They also provide the knowledge of certified professional services experts who can design secure solutions that quickly adapt to shifting business objectives, innovations, and operations. For businesses looking to properly integrate their networks with their security architecture and ensure the security of their digital assets in today's constantly changing threat landscape, these partnerships can be crucial.

### How to protect your business from procurement fraud

In PwC’s Global Economic Crime and Fraud Survey 2022, over half of organisations said they had experienced fraud – the highest level in twenty years of research. Today’s criminals are constantly innovating new ways to target businesses and break into online platforms. Sometimes, bad actors are operating within organisations too. Procurement fraud, also known as purchasing fraud, is increasingly being targeted by criminals, with greater complexity and information-sharing, as well as poor data quality, leading to greater fraud risk. So, what exactly is procurement fraud, how can it be spotted and what can organisations proactively do to protect themselves from falling victim?

What is procurement fraud?

Procurement fraud is the unlawful manipulation of business trade agreements within the procure-to-pay process to acquire goods, services or contracts unfairly. An organisation may be defrauded at any stage of the procurement process, from bidding to after the contract has been awarded. There are five common methods of purchasing fraud:

Invoice scams
An illegitimate company sends fake but credible invoices, claiming payment for certain goods/services is required. Sent with threatening messaging such as legal action, or strict payment deadlines, they incite quick, knee-jerk reactions leaving little room for review before paying.

False accounting fraud
Often when financial reporting is vague and hard to trace, fraudsters are attempting to cut a slice of profits, pay an accomplice, evade certain taxes or inflate the share price of an organisation.

Kickbacks
When an employee develops or has a close relationship with a prospective supplier, there is a risk of them colluding to produce inflated invoices to be paid to this supplier, with the intention of taking a cut of the profits.

Bid rigging
Businesses that are typically market competitors might work together to manipulate their target into paying more than they should by limiting or eliminating competition and then splitting the profits between them.

Fake vendors
This is usually a long-term operation where a fake business is paid repeatedly for goods or services that are never actually provided because the company does not exist.

What is the procurement fraud triangle?

As detailed above, procurement fraud often involves employees, highlighting the importance of training staff to spot the signs and vulnerabilities and understand the seriousness of being involved. According to the 10-80-10 rule, ten percent of people would never commit fraud, ten percent are actively looking for the opportunity to do so, and the remaining 80 percent may or may not commit fraud.
There is no type of person more likely to commit fraud than another – it is circumstantial. Criminologist Dr Donald Cressey identified the three factors likely to cause an individual to engage in fraud. Known as the Fraud Triangle, the three pillars are pressure, opportunity and rationalisation.

First is pressure. When someone comes under significant pressure, often financial, it increases the risk of them committing procurement fraud. Sometimes pressure is applied by someone known to the person – perhaps through extortion or manipulation. Others may commit fraud when they feel pressure to hit certain targets or goals based on past poor performance or undue pressure from senior colleagues.

Second is opportunity. With 10 percent of people actively looking for the opportunity to commit fraud, any internal security weak spots or cracks in management can be exploited.

Finally, there is rationalisation. Someone wanting to commit fraud might have the pressure and opportunity but can’t rationalise a reason to do it – unless they are disgruntled, undervalued, underpaid or mistreated. Thoughts around the company ‘deserving it’ or being able to ‘afford it’ are examples of rationalisation.

What are the consequences of procurement fraud?

Procurement fraud can have lasting and serious consequences, not only damaging a business’s reputation but also costing it financially. Legal and financial consequences leave their mark too. A recent report estimates that 40 percent of all businesses lose anywhere between €150,000 and €400,000 a year to purchasing scams. Under the Bribery Act (2010), businesses or individuals found guilty of procurement fraud are liable to be fined or face imprisonment according to the severity of the offence. Furthermore, the relationship between businesses and vendors can become complicated, distrustful and acrimonious if fraud is suspected with no one wanting to take the blame.

How can organisations detect procurement fraud?

Generally, larger companies with longer supply chains, involving many stakeholders and bigger budgets, are more likely to experience procurement fraud. However, any company can fall victim. Detecting procurement fraud requires constant vigilance and monitoring, alongside having in-depth knowledge of who each employee is and vetting those involved in company finances. There are also red flags to look out for within the procurement function:

- Inflated vendor prices
- An unusually small pool of vendors
- Patterns in bid winners or losers
- Repeat selection of certain vendors
- An unlikely vendor being awarded the contract
- Mismatch between the contracts and the goods or services being delivered
- One individual managing the entire procurement process from start to finish

Several of these red flags lend themselves to simple automated checks, as the sketch below shows. If red flags are detected, an internal and external investigation must be carried out, reviewing financials and suspicious vendors/employees.

How organisations can protect themselves against procurement fraud

The best protection against procurement fraud is proactively developing defences, maintaining regular monitoring and immediately responding to any red flags. Being vigilant is a deterrent and limits opportunity for those seeking it, which saves businesses time and money. Practical ways to protect against procurement fraud include enhanced staff training and awareness, enabling every employee to understand the risks and red flags, and spot and report the signs early.
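As a purely illustrative sketch (not a production fraud-detection system), two of the red flags above – repeat selection of a single vendor and invoices far above the norm – can be checked automatically against purchase-order data. The records and thresholds here are hypothetical.

```python
# Illustrative only: flag repeat vendor selection and unusually large invoices.
# Purchase-order records and thresholds are hypothetical.
from collections import Counter
from statistics import median

purchase_orders = [
    {"vendor": "Acme Supplies", "amount": 9_800, "approver": "j.smith"},
    {"vendor": "Acme Supplies", "amount": 10_200, "approver": "j.smith"},
    {"vendor": "Northline Ltd", "amount": 9_500, "approver": "a.jones"},
    {"vendor": "Acme Supplies", "amount": 86_000, "approver": "j.smith"},
]

# Red flag: one vendor winning a disproportionate share of orders.
vendor_counts = Counter(po["vendor"] for po in purchase_orders)
for vendor, count in vendor_counts.items():
    if count / len(purchase_orders) > 0.5:
        print(f"Flag: {vendor} received {count} of {len(purchase_orders)} orders")

# Red flag: invoices far above the median order value.
typical = median(po["amount"] for po in purchase_orders)
for po in purchase_orders:
    if po["amount"] > 3 * typical:
        print(f"Flag: {po['vendor']} invoice of {po['amount']:,} vs median {typical:,.0f}")
```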
Responsibility for specific vendor accounts should be rotated regularly, reducing the risk of an employee and a vendor colluding in fraudulent activity. Regular reviews and assessments should also take place, making sure that nothing slips under the radar. In-depth research should also be carried out before any new contract is signed with an external supplier or service. Digitalising procurement, ensuring that processes are streamlined and protected with secure, cloud-based software, can boost efficiency and improve monitoring. Using RFx software to streamline procurement processes makes it harder for scammers to spot opportunities to strike because of the auditability and traceability of all processes – the biggest allies in the war against fraud. According to the CIPFA Counter Fraud Centre strategic model, protection against procurement fraud requires acknowledging who is responsible for monitoring it, identifying all the present risks, developing a strategy to mitigate these risks, providing the resources to do this, and then taking any necessary action. In this model, deterrence and prevention must be carried out alongside investigation and detection. But should fraud slip through this defence, sanctions and redress are necessary remedial steps, including recovery of funds and assets.

### The motivations behind establishing a UK-US ‘Data Bridge’

We’re all familiar with how much the modern world is powered by data, and specifically the transfer of that data across borders. 93% of the UK’s services exports were data-enabled in 2021, with the value of services exported to the US alone worth over £79 billion – equivalent to 30% of the total. There is a viewpoint, however, that arrangements could be made even simpler to unlock further trade and economic benefits. This is the intention behind the recent agreement in principle between the UK and US Governments to establish a new legal framework which would facilitate personal data transfers between the two countries. This would extend the existing EU-US Data Privacy Framework, making it “easier for around 55,000 UK businesses to transfer data freely to certified US organisations without cumbersome red tape,” as a Government press release claimed.

Upholding standards across territories

Both the EU and UK already have regulations in place to accommodate the transfer of personal data outside the European Economic Area (EEA). This new ‘data bridge’ aims to maintain the same level of protection for personal data exported to other jurisdictions as is provided under the GDPR. There are currently several mechanisms that businesses use to ensure that EU or UK data protection standards are upheld when exporting personal data. One of these is contractual clauses, the cost and inflexibility of which often make them a significant barrier to overcome, especially for small and medium-sized businesses. Any UK business that currently needs to send personal data to a service provider or company in the United States needs to have these in place to ensure data protection and privacy standards are maintained. Another option is adequacy decisions. Policymakers make these designations to allow the free flow of personal data through specific jurisdictions. Once it is finalised, the UK-US data bridge would represent a UK-issued adequacy decision, while the European Commission has signalled its intention to adopt an adequacy decision with the EU-US Data Privacy Framework.
This follows the previous proposed adequacy decision, the EU-US Privacy Shield, being rendered invalid by the Schrems II ruling.

The Purpose of a Data Bridge

The primary reason for arrangements like a data bridge is to streamline and unify the agreements that would otherwise be needed and, in essence, make innovation, collaboration and data sharing easier for businesses. In this way, the agreement aims to reduce burdens on business – and overall, get the two countries on the same page when it comes to the regulation in place. It is a fine balance to get right, as any UK-US agreement also needs to ensure it doesn’t threaten the existing adequacy status that the UK has with the EU since the country left the Union. Companies in all regions are closely watching what comes through – especially as notable privacy campaigner Max Schrems has already been vocal about challenging version 2.0 of the EU-US Privacy Shield. “A UK extension to the Data Privacy Framework is the most streamlined approach to take, it is likely to be the smoothest approach for reaching political agreement. It is also the least likely to cause issues for the UK’s own EU adequacy status, as the UK approach will presumably align with the EU’s,” commented data protection law expert Rosie Nance of Pinsent Masons. Regardless of the specific agreements that are secured between the EU, UK, and the US, businesses should not stop conducting data transfer impact assessments. The Schrems II ruling emphasised the importance of the robust due diligence businesses must undertake before transferring personal data anywhere outside of the EEA – not just to the US.

### RPA In Healthcare - Drive Savings By Automating Healthcare Processes

As we step into a new era of technology, Robotic Process Automation (RPA) is swiftly surfacing as a game-changer. In simple terms, RPA simulates human interactions with digital systems, automating repetitive tasks and operation sequences. With endless possibilities across various sectors, healthcare is a particular area in which the applications of RPA make sense. With pressures mounting in this field due to increasing costs and demands for patient-centered care, exploring automation solutions like RPA has become more crucial than ever before. Let’s look a little deeper into the relationship between the two, and explore the benefits offered, with a particular focus on cost savings.

Understanding Robotic Process Automation in Healthcare

In the world of healthcare, RPA showcases immense power and potential. It operates by mimicking human actions, such as entering data into a system or coordinating between different digital avenues. This removes the need for the constant manual input and oversight that often leads to errors and delays. From handling patient scheduling to processing insurance claims with optimum efficiency, RPA becomes the helping hand facilities need, ensuring smooth operations while freeing up staff for tasks requiring high-level cognitive functions. This allows institutions to focus on patient care without having to worry about administrative burdens, because automation is at the helm.

Potential Areas of Savings Through RPA Implementation

Robotic Process Automation unlocks pathways to significant cost savings in several areas of healthcare. Let's explore some:

Patient Administration: Efficiency in handling patient registration, appointment scheduling and billing can lead to substantial reductions in operational costs.
Inventory Management: Automated inventory tracking makes clinical decisions easier, reducing wastage and thus leading to financial savings.
Insurance Claims Processing: RPA facilitates swift and accurate processing of insurance claims, curtailing overheads while enhancing customer satisfaction levels.

For specialized medical sectors like mental health services where program management software is key, platforms such as footholdtechnology.com elevate functionality further when integrated with RPA. This amalgamation allows for efficient case management while saving on labor-intensive administrative tasks. In essence, the application of RPA streamlines processes across healthcare systems, boosting productivity and trimming unnecessary expenses which were previously unavoidable.

Challenges and Potential Solutions While Implementing RPA in the Health Sector

Like any technological adoption, implementing RPA into the healthcare ecosystem isn't devoid of hurdles. Here are some challenges faced:

Data Security Concerns: With sensitive information involved, potential security threats can seem daunting. Employing robust encryption techniques and regular audits can bolster data privacy.
Software Compatibility Issues: For a seamless workflow, integrating different software platforms with RPA tools is essential, but not always simple. Building close relationships with technology vendors ensures smooth integration.
Employee Resistance to Change: A shift this significant may face resistance from seasoned staff members uncomfortable with change. Creating training programs and being clear about the benefits of automation can facilitate an easier transition.

Despite these challenges, with thoughtful planning and targeted strategies it's possible to reap the countless rewards offered by RPA in healthcare.

Future Prospects for RPA in Healthcare - The Road Ahead

The future of healthcare lies intertwined with advancements like RPA. As we continue to navigate the wave of digital transformation, acceptance and use of automation tools will morph into an industry norm rather than a novelty. Expectations are high, with industry insiders predicting ubiquitous adoption across healthcare workflows, from administration to diagnostics and patient care. Furthermore, blending AI capabilities with RPA could usher in game-changing breakthroughs such as predictive analytics for preemptive treatments or real-time monitoring systems. Undeniably, the horizon looks promising for RPA, not only in healthcare, but more generally across almost every industry imaginable.

### Multicloud 101: Distinguishing between myths and reality

Migrating to the cloud has now become the norm, with the latest statistics showing that 98% of companies are using the cloud in some capacity. Almost as many organisations (89%) have gone one step further and adopted a multicloud strategy to manage their workloads. Multicloud is increasingly transforming the way companies operate and, by extension, the way businesses function writ large. Using multiple cloud providers to store and manage data comes with a host of benefits. Companies have the flexibility to choose the best features from each provider, allowing them to optimise their investment and save costs. They can also avoid vendor lock-in and have the capacity to spread data across multiple applications.
Implementing a multicloud strategy also means organisations are less likely to be impacted by a service outage – if one of the applications fails, organisations’ data can still be accessed via the other platforms, which minimises the costs associated with retrieving data and reduces the risk of having to stop all activity while dealing with the problem. As such, moving to a multicloud strategy is vital for companies to improve performance and become more competitive. Although multicloud is a very powerful approach, there are still a lot of challenges that businesses must be aware of in order to make sure they are ready to embrace it successfully. It’s important to remind business leaders of the myths and facts around multicloud to ensure they implement a solid strategy that isn’t simply based on common misconceptions.

Fact: Businesses can maximise productivity and minimise costs by moving to multicloud

Today’s difficult economic times mean that organisations must adapt in order to remain competitive. With rising costs, high inflation and hiring freezes, business leaders are doing their best to boost productivity, minimise costs and maximise ROI. This is where multicloud environments can help, as organisations can rely on several specialist providers and combine multiple cloud applications, each with a different focus and area of expertise, to best meet their objectives. Having the option to choose from several service providers helps maximise productivity and is vital to help businesses navigate the changing economic landscape. Adopting a multicloud strategy also gives companies the possibility to extend their networks and develop more connections between their services. Overall, this helps enhance the user and employee experience. In VMware’s Multi-Cloud Maturity Index, 99% of organisations reveal they have benefited significantly from adopting a multicloud strategy, with 45% saying it enhances employee flexibility, 41% reporting an increased ability to innovate, and 40% mentioning how it helps to get apps into production faster. As a result, companies are more competitive and can position themselves as true leaders in their industry. Now that companies have the option to use several cloud providers, they can spend time comparing the features and pricing models offered by different providers to choose the most cost-efficient solutions that best respond to their needs.

Myth: It’s safer to use multicloud than one single cloud provider

At first glance, multicloud strategies may seem less risky than using a single cloud provider because businesses can pick the most secure provider for their needs. But it turns out that adding another cloud platform to an existing business environment brings many potential security risks that companies must be aware of. Storing and managing data on multiple servers means businesses are extending the threat perimeter and giving cyber-attackers more entry points to their network. Cyber-attackers are well aware of the new opportunities multicloud gives them, so when moving to multicloud companies must consider new security tools to increase protection and help with access, security key and workload management. Research from Gartner shows that 99% of cloud security failures will be the customer’s fault by 2025. So, before being able to fully embrace multicloud environments, businesses must first focus on raising vulnerability awareness and developing a sense of collective responsibility amongst their workforces.
The question is not whether adopting a multicloud strategy is a safe practice, but whether organisations know how to use such innovative tech securely. It’s key for businesses to implement a security strategy with clear guidelines on how best to protect sensitive data and mitigate security risks.

Fact: Developing workers’ cloud skills must be at the heart of a multicloud strategy

Clearly, it’s harder for companies to monitor and secure all cloud platforms at the same time, which creates more vulnerabilities for cyber-attackers to exploit. Due to the rising security threat associated with multicloud environments, businesses must multiply the maintenance effort needed to keep the network secure. But the ongoing tech skills shortage is a big challenge for companies. Business leaders are struggling to recruit workers with the cloud skills they need to properly navigate these environments. While cloud has now become a key tool to help organisations transform digitally, tech workers are falling behind when it comes to learning new cloud technologies. To address this, companies must prioritise investing in upskilling employees so that they have sufficient skills to maximise the potential of the technology. There is currently a significant gap between leaders’ expectations in terms of cloud strategy and what technologists are actually able to execute. Pluralsight’s 2023 State of Cloud Report reveals that while 62% of UK business leaders expect their cloud budget to increase over the next 12 months, only 8% have a dedicated cloud skills development programme. Prioritising cloud skills will be key to multicloud success. Offering training to build in-demand skills in-house is as important as investing in new technology. Technology doesn’t grow a business, people do, so putting employee growth at the heart of the business will drive innovation, increase competitiveness and generate growth.

The takeaway

Moving to multicloud comes with a host of benefits, as well as challenges. When navigating multicloud environments, companies expose themselves to many security risks. In order to fully benefit from the technology and increase productivity while keeping costs down, businesses must have a clear strategy in place. They must compare all cloud applications to ensure they get the best features from every provider – this is vital to designing a cloud strategy that best responds to their needs. Investing in talent is also key to implementing a successful cloud strategy. Having a highly skilled tech workforce is still the most effective way for organisations to optimise their investments, securely embrace multicloud and prepare for the future.

### To cloud or not to cloud? Why businesses need to look beyond FinOps for the answer

Over the past decade we’ve witnessed exponential cloud growth as organisations flocked to the cloud, but it appears the industry is witnessing a slowdown. While hyperscalers Amazon, Google, and Microsoft all posted double-digit growth in their cloud divisions for Q4 2022, this rate was lower than in previous years. With its ability to rapidly scale up or down, businesses were quick to embrace the cloud at its peak, with the most forward-thinking adopting a cloud-first approach. For many organisations the migration to public cloud was driven not only by the promise of greater scalability but by expected cost savings, with IT departments expecting to pay for what they needed when they needed it.
The problem is that a lot of IT departments over-provisioned capacity and/or have not been consuming cloud in the right way since.

Time to rethink cloud strategy?

The result is that cloud costs have skyrocketed – far exceeding IT departments’ expected budgets in many cases and resulting in businesses experiencing what VMware calls ‘Cloud Bill Shock’. The much-anticipated economies of scale from the hyperscalers have yet to materialise, and AWS, Google and Azure are no longer able to compete on price with on-premises solutions. So, against the backdrop of rising inflation and economic uncertainty, it’s natural that businesses are re-examining their cloud investments and, in many cases, taking the decision to reverse their cloud migrations. And this isn’t only driven by cost, but good governance. So, is the age of the cloud-first strategy over? Are we about to witness a large-scale repatriation of workloads back on-premises? While this may be a little extreme, businesses today are certainly beginning to question whether they need to move everything, or indeed anything, to the public cloud. Many organisations that initially rushed to migrate lock, stock and barrel are now back-pedalling, or at the very least exploring a hybrid model of public cloud, private cloud and on-premises. And while some analysts predicted that rising interest rates would result in a slowdown in cloud spending, could it be that the current economic climate is now the catalyst for a much-needed rethink and possible overhaul?

Rightsizing to avoid counterproductive cloud costs

There’s no question that cloud has radically transformed IT and businesses over the last decade, and the adage ‘don’t throw the baby out with the bathwater’ should be applied here. Reverse cloud migration is not as simple as putting everything back on-premises, and the business case doesn’t necessarily support this either. The agility of public cloud has always been its greatest value proposition – something we witnessed during the COVID-19 pandemic. However, businesses must recognise there are far better ways to manage cloud costs which also take into consideration how IT as a whole can be best optimised within the organisation. The truth is that businesses have been overspending on cloud largely due to inefficiencies in usage, rather than because they are overusing resources. With various sources suggesting that around a third of cloud spend is wasted, it’s fair to say that utilisation has been mismanaged. This is costing businesses millions of pounds each year, resulting in them losing much of the value that cloud solutions promised in the first place. With Gartner predicting that more than half of IT spending will be in the cloud by 2025, there’s an even greater need for cost transparency and optimisation today.

Translating technology investments to organisational value

Cutting back on cloud spending does not necessarily mean cutting back on cloud usage – it’s all about fostering good cloud health practices and/or introducing a cloud centre of excellence within the business to help manage cloud provisioning, not just within the IT department but across the various teams and departments within the organisation that procure and consume cloud services.
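To make the utilisation point concrete, here is a minimal sketch of the kind of check a cloud centre of excellence might run to find rightsizing candidates. The instance records, CPU figures and threshold are hypothetical; real FinOps tooling would pull this data from the provider's billing and monitoring APIs.

```python
# Hypothetical utilisation check: flag instances that are barely used but
# still billed every month, i.e. candidates for rightsizing or shutdown.
monthly_usage = [
    {"instance": "app-server-1", "avg_cpu_pct": 62, "monthly_cost_gbp": 310},
    {"instance": "app-server-2", "avg_cpu_pct": 4, "monthly_cost_gbp": 310},
    {"instance": "batch-runner", "avg_cpu_pct": 9, "monthly_cost_gbp": 540},
    {"instance": "reporting-db", "avg_cpu_pct": 47, "monthly_cost_gbp": 720},
]

IDLE_THRESHOLD_PCT = 10  # below this average CPU, treat the instance as idle

candidates = [u for u in monthly_usage if u["avg_cpu_pct"] < IDLE_THRESHOLD_PCT]
potential_saving = sum(u["monthly_cost_gbp"] for u in candidates)

for u in candidates:
    print(f"{u['instance']}: {u['avg_cpu_pct']}% avg CPU, £{u['monthly_cost_gbp']}/month")
print(f"Potential monthly saving if rightsized or switched off: £{potential_saving}")
```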
The mechanisms, processes, and policies of FinOps – the financial operating model for public cloud consumption – can help solve cloud cost utilisation issues, putting the responsibility in the hands of the practitioners and helping create responsible cloud users, but it can only go so far in translating technology investments to organisational value. To truly understand and manage the business’s technology footprint and enable CIOs and CFOs to account for, analyse, control and optimise IT costs, as well as communicate their value internally, they need a ‘single (data) source of truth’. Often described interchangeably, IT Financial Management (ITFM) and Technology Business Management (TBM) are complementary disciplines to FinOps that help improve an organisation’s outcomes, particularly where IT spend is large and complex, by mapping technology assets and resources to business impact. The C-suite not only needs a top-down overview of compute, infrastructure and storage costs, regardless of whether they are hosted in the public cloud, private cloud or on-premises, but also of other business-critical IT expenditure, such as labour costs, hardware, and facilities and power, to deliver the right transparency and chargeback to the business.

The marriage of FinOps and ITFM/TBM

When it comes to cloud, the industry may be witnessing a plateau, but the reality is that cloud isn’t going away any time soon. Indeed, spending on public cloud services is expected to hit $148bn (about £122bn) in Europe in 2023 and a staggering $258bn by 2026, according to IDC. Where the really interesting changes are occurring is in the evolving relationship between FinOps and ITFM/TBM to better facilitate cloud planning and forecasting, TCO and chargeback. When costs are consistent across both disciplines, the ITFM/TBM team can combine public cloud spend data from FinOps with key business objectives to deliver a comprehensive, accurate chargeback. Conversely, FinOps can pull data from ITFM/TBM (including labour costs, hardware, and facilities and power) to deliver a fully burdened economic cost. Integrating these processes enables business leaders to compare like for like, i.e. the TCO in public cloud can be reflected in the same way as on-premises services, helping them make a truly informed decision about whether to cloud or not to cloud. As public cloud spend increases and becomes a larger percentage of IT’s overall budget, this is critical to strategic decision-making about where workloads should run in the future. Thomas Kopton, Lead Solution Engineer – Cloud Management at VMware, joined Oliver Lorig, International Pre-Sales Solution Architect at Serviceware, to talk about how to avoid cloud bill shock with FinOps and value-orientated Cloud Cost Management. Watch the webinar on-demand here.

### Avoiding Unnecessary Corporate Conflict in Climate Change Strategies

Due to a desire to combat catastrophic climate change, businesses across the globe are trying to innovate and identify new policies, standards, laws and regulations to allow a concerted effort to be made within their business practices. But despite this, a lack of agreement on a unified strategy to combat it continues to limit progress.
Minimising or removing employee commutes is one of the most effective ways for businesses to reduce scope 3 carbon emissions, but this can only be achieved if companies provide a truly effective and engaging digital workplace experience for every single employee, insists Dave Page, Founder & Chief Strategy Officer, Actual Experience.

Conflicting Approaches

The isolated and individualistic nature of businesses globally is posing a significant hurdle to achieving meaningful progress in supporting climate change goals. One clear example of this can be seen in corporations pushing employees to come back into the office at least three days each week, a policy that is totally at odds with the need to reduce carbon emissions. Why are companies encouraging more face-to-face meetings – often overseas – and attendance at conferences in different countries, when this escalation in business travel inevitably contributes significantly to a company’s carbon footprint? Add in the carbon emissions associated with offices, including cooling, heating and lighting, and Working from Home (WfH) offers a clear opportunity for businesses to reduce their energy consumption – even if it is only by mothballing certain unused areas or floors within a building. Both investment and reputation are increasingly predicated not only on pledges to minimise carbon emissions, but on tangible evidence of progress toward the United Nations’ 17 Sustainable Development Goals (SDGs) and regulatory plans to enhance and standardise climate-related disclosures for investors. So why are so many businesses still failing to grasp the value of an effective digital working environment to their Environmental, Social and Governance (ESG) strategy?

Embracing Changing Employee Values

What makes this reluctance to change even more surprising is that employees are already on board with this shift in working experience. Each employee may have their own reasons for preferring WfH to the commute, or a collaborative video call to the flight to a different country, but this is a corporate win-win situation. Encouraging employees to alter their behaviour in order to reduce scope 3 emissions is unnecessary; they have already embraced the Future of Work. Additionally, showcasing the carbon reduction benefits of this approach can earn significant praise from environmentally conscious younger generations. Rather than resisting this change, companies should be prioritising the delivery of a high-quality digital human experience that accelerates and encourages this changing behaviour. This entails investing in a digital workplace to enable seamless remote work for all employees, ensuring a compelling virtual experience for meetings, even with international clients. The focus should be on swiftly recognising individual experiences to maintain consistent engagement and productivity across the workforce. But in order to do this, it is essential to understand and gain an appreciation of each employee’s unique individual experience, in detail and continuously, and use that insight firstly to reveal and address any areas of digital inequality and then to build on the quality of experience to create an even more compelling digital workplace experience.

Conclusion

An effective digital workplace should foster cultural change, empower employees to demonstrate the value of remote working to management, and offer tangible carbon reduction metrics for the ESG team.
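Those metrics do not need to be complicated to produce. As a rough sketch only, using invented employee figures and a generic emission factor rather than recognised conversion factors, an estimate of commuting-related scope 3 emissions might look like this:

```python
# Rough, illustrative estimate of annual commuting emissions. The employee
# data, emission factor and working weeks are assumptions for this sketch;
# real ESG reporting would use recognised conversion factors and actual data.
EMISSION_FACTOR_KG_PER_KM = 0.17  # assumed average for a car commute
WEEKS_PER_YEAR = 46               # allowing for leave and public holidays

employees = [
    {"name": "A", "round_trip_km": 38, "office_days_per_week": 5},
    {"name": "B", "round_trip_km": 52, "office_days_per_week": 2},
    {"name": "C", "round_trip_km": 12, "office_days_per_week": 0},  # fully remote
]

total_kg = sum(
    e["round_trip_km"] * e["office_days_per_week"] * WEEKS_PER_YEAR * EMISSION_FACTOR_KG_PER_KM
    for e in employees
)
print(f"Estimated annual commuting emissions: {total_kg / 1000:.1f} tonnes CO2e")
```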
Armed with proper insight into employee location, it is simple to track activity and demonstrate change. How frequently do employees commute to the office or remote working hubs? Does this involve travelling overseas? By employing standard carbon emission figures, companies can quickly evaluate the present situation, gauge the influence of digital experience improvements on employee behaviour, and showcase year-on-year enhancements. The substantial carbon emissions from daily commuting, compared to those primarily working from home, highlight the compelling reduction in scope 3 emissions that can be achieved through an empowered digital workplace. All of this raises the question: when will companies acknowledge the significance of integrating ESG strategic thinking into all aspects of their business decisions? ### How to end network firefighting by securing end-to-end visibility In the world of network security, the old adage “out of sight, out of mind” simply doesn’t cut it anymore. The rise of multi-cloud architectures and remote working has shifted the security goalposts, making 360-degree network visibility harder to attain, and even harder to maintain. Security teams need to build for a world that never stops - we now have an “always on” mentality when it comes to technology, and network footprints are expanding, creating an increasing number of vulnerabilities for threat actors to capitalize on. Workers are distributed, endpoints are no longer localized, and the sheer number of devices we use day-to-day is growing – all adding to this ever-expanding footprint. Even prior to the pandemic, the average number of devices connecting to a network per employee was 4.9 – that’s likely to have increased with the sudden boom in remote working. According to Infoblox’s Global State of Cybersecurity report, nearly a third (29%) of remote devices are employee owned, and Wi-Fi access points and cloud platforms have been prime sources for organizational breaches in the past year, accounting for 34% and 33% respectively. What’s more, a 2023 study found that almost 9 in 10 businesses (87%) actually rely on their employees to use their personal mobile devices to access company apps and send emails. Businesses need better performance and protection, and that can only be achieved through uniting networking and security.  Company networks are expanding at a time when data breaches have become an occupational hazard. Exposure to risk is simply the cost of doing business in 2023, and how businesses manage that risk and deal with threats as they arrive is a measure of their resiliency. Seeing and stopping critical threats earlier has become a top boardroom priority - the same Infoblox report referenced above also found that the average cost of a data breach is around $2 million. Among the organizations that experienced breaches, most said their attackers were most likely to steal data or hijack credentials, while others experienced system outages and data manipulation.  However, in order to mitigate and remediate threats, real-time visibility and control are key. No matter how focused a network security team is on firefighting threats as they emerge, they will always be on the back foot without proper end-to-end network visibility. That’s because the majority of network firefighting time is spent trying to figure out what’s happening, where threats originate, who instigated them, and how long the network has been compromised. 
For even the most advanced security teams, figuring these things out is an incredibly time-consuming endeavor. Of course, prevention is better than remediation, so network discoverability and uncovering rogue endpoints before they become a problem will always be preferred.  All of this boils down to a need for shared real-time visibility, and businesses know it. A recent Forrester research report showed a significant correlation between visibility and security, with 81% of surveyed decision-makers agreeing that better network visibility would improve their organisation’s security posture. Three in five respondents (61%) also agreed that investing in network discovery infrastructure was the best way to boost their security capabilities. These are non-negotiable requirements if businesses are to maintain control of a constantly changing environment. The synergy between visibility and security Visibility has become the central pillar of network security in recent years. Visibility helps to plan network availability, assess bandwidth utilization, and predict when network capacity might fall short of future requirements – all of which are crucial to managing a network safely and effectively. Visibility also affords network security teams a way to identify anomalous patterns in traffic activity that could point to a potential threat, or to uncover rogue endpoints or devices that have no business being on the network. Comprehensive, end-to-end network visibility reduces the time security teams need to spend firefighting because it provides contextual awareness. Instead of manually scanning the network for threats and identifying vulnerabilities, they can jump straight into the remediation phase. By reducing the mean time to detect (MTTD) in this way, threats can be isolated and taken care of far more quickly, reducing dwell time and increasing the overall resilience of the network. This is now critical, as Infoblox’s Global State of Cybersecurity report highlights that only half (52%) of global organisations have accelerated digital transformation with remote workers in mind. The importance of integration and visibility as a philosophy Having end-to-end network visibility tools in place doesn’t count for much if areas of the business, including security teams, remain siloed and separate from one another. According to the Forrester report, 79% of business decision-makers see a fully integrated network visibility solution that benefits their overall networking and security objectives as “appealing” or “very appealing”.   Siloed visibility is not enough. The integration of network visibility across an organisation is critical because it provides a holistic view of the entire network infrastructure, allowing security teams to monitor, analyse, and manage network traffic in real time and ensure that any threats are identified and addressed quickly. Businesses should be looking to unite networking and security by providing real-time application, user and device context so they can detect and respond to who and what connects to their network. This joining of forces between networking and security will be critical in the coming years, allowing businesses to share context-rich data in real time and deliver a faster, safer user experience.  When visibility is siloed across departments or locations, it’s virtually impossible for security teams to identify threats that put day-to-day operations at risk before it’s too late. 
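To make the point about spotting anomalous traffic patterns more concrete, here is a minimal sketch of one common approach: flagging devices whose current traffic deviates sharply from their own baseline. The thresholds, field names and sample data are illustrative assumptions, not any particular vendor’s method.

```python
# Minimal sketch of the kind of anomaly detection end-to-end visibility enables:
# flag devices whose current traffic volume deviates sharply from their own baseline.
# Thresholds, field names and data are illustrative, not a specific vendor's method.
from statistics import mean, stdev

def anomalous_devices(history: dict[str, list[float]], current: dict[str, float],
                      threshold: float = 3.0) -> list[str]:
    """Return device IDs whose current bytes-per-minute is more than `threshold`
    standard deviations above their historical mean."""
    flagged = []
    for device, samples in history.items():
        if len(samples) < 2:
            continue  # not enough baseline data to judge
        mu, sigma = mean(samples), stdev(samples)
        if sigma > 0 and (current.get(device, 0.0) - mu) / sigma > threshold:
            flagged.append(device)
    return flagged

history = {"laptop-042": [120, 135, 110, 128, 140], "sensor-7": [15, 14, 16, 15, 15]}
current = {"laptop-042": 132, "sensor-7": 480}  # sensor-7 is suddenly very chatty
print(anomalous_devices(history, current))  # ['sensor-7']
```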
With a fully integrated network visibility solution, however, all network traffic can be monitored from a centralised location, providing a complete view of network performance and security and vastly reducing the need to firefight. If all relevant network data is readily available, security teams have what they need to not only chase threats reactively, but get ahead of them proactively.  There are business performance gains too. Integration of network visibility allows for better collaboration across departments. When all teams have access to the same network data, they can unite to resolve issues more efficiently. This collaboration can lead to better decision-making and greater network uptime, as all parties have a comprehensive understanding of the current state of the network.  Fully integrated security and visibility solutions are becoming the preferred choice for organizations, but there is still work to be done. As detailed in the Forrester report, only 24% of organisations are confident they have a fully integrated network security solution in place. Teams responsible for buying security solutions are also quite often siloed, with only 29% of organisations reporting that there is a strong, integrated relationship between their network security and solutions procurement teams.  In today's security landscape, fully integrated network visibility solutions are no longer a luxury, but a necessity. Expanding network footprints, remote working, and activity across multiple networks and multi-cloud environments mean that manual network firefighting or siloed solutions simply won’t cut it. A unification of networking and security is needed. What you can’t see can hurt you, so it’s vital that security teams are able to monitor, analyse and manage network traffic in real time from a centralised location. ### The Cloud ERP debate – security versus operational disruption? Recent demands for tighter security measures and reduced operational costs have caused cloud adoption rates to rise as many businesses make the switch from on-premise to cloud ERP. This switch comes with plenty of business benefits, from decreased security and administrative burdens to reduced capital expenditure. However, some CIOs are still on the fence about adopting cloud ERP due to concerns that untested updates and version lag may cause disruption to their business operations – but is this really the case?  An in-depth assessment of cloud ERP’s strengths and weaknesses A well-configured cloud deployment offers significant cost, efficiency, and end-user benefits over more ‘traditional’ on-premise deployments, but no system is fully immune from disruption. If businesses adopt an ‘evergreen’ approach to updates, they will benefit from a trustworthy, regular stream of bug fixes and security updates – but this does not mean IT departments will not face challenges along the way.  When compared to the previous long-term, on-premise ERP strategy that can only be described as ‘find a version that works for you then sit on it for as long as possible,’ the Software-as-a-Service (SaaS) cloud model has very much established itself as a superior alternative.  By selecting a cloud ERP system, businesses gain a specialist team provided by the vendor, which can work 24x7 to ensure their SaaS solution is secure, relieving IT teams of the security burden they usually face with on-premise deployments. 
The magnitude of a cloud upgrade can be a gamechanger for IT departments, who will no longer need to worry about having to respond quickly to resolve bugs and integrate new patches – an approach that often means other IT tasks are pushed to the side.  A first-hand example of this was when we helped the charity Alzheimer’s Research UK by implementing a scalable cloud-based business management solution. This gave them better data reporting, increased remote accessibility, and improved security, all from changing to cloud ERP. With their old financial software, they had struggled to access critical data remotely and were held back by restricted data reporting features.  Approach updates the ‘evergreen’ way for top-notch security The Microsoft ‘evergreen’ approach to keeping ERP systems updated, whereby patches are automatically applied on a regular, scheduled basis, is a major shift from the approach to updates previously held by many IT departments. Once a system is deployed and customised to be fully functional, many businesses avoid ‘rocking the boat’ with updates or patches – often leading to a significantly outdated version. The ‘evergreen’ approach takes the update burden out of the business’ hands, ensuring a cloud ERP system such as Dynamics 365 is always kept running on a supported and security-patched version, easing end-of-life concerns. This ensures businesses are not running versions with limited functionality or known security vulnerabilities. Reduce operational disruption with a helping hand from managed services providers While this faster, predictable update cycle tightens systems from a cybersecurity perspective, the highly integrated, customisable nature of today's cloud ERP systems can also be seen as a double-edged sword in terms of operational ‘security’. ERP vendors naturally cannot test these updates for every individual business environment – many of which operate highly customised or extensively integrated ERP systems – so there is a low-lying risk of operational disruption to a critical system. If an update does go ahead, the difficulties don’t end there, as many businesses lack the time or resources to analyse all the release notes an ERP vendor produces. These notes contain details of the updates, and it’s up to the business to take this responsibility in-house and assess how a rollout would affect their system in terms of downtime and user disruption. To ensure business continuity and no unexpected threats to day-to-day operations, having support from a managed service provider, along with testing updates and patches against critical processes prior to deployment, will be vital – a task that is increasingly being automated to ease the manual burden. Take the case of United Oilseeds, a long-standing Columbus customer which has gone on to become one of the UK’s most successful farmer co-operatives. Due to issues with a previous third-party infrastructure managed service, United Oilseeds reached out to Columbus to unite their application and infrastructure managed services. After an Azure migration project to modernise and futureproof their ERP system, United Oilseeds began to see the benefits of a complete managed services package. The company has been able to eliminate the back-and-forth between separate providers, and the more proactive approach, with a single point of contact for their managed services, results in less downtime. The newer, more up-to-date infrastructure also enables them to maximise the ROI of their ERP system. 
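As a rough illustration of what that automated pre-deployment testing can look like, here is a minimal smoke-test sketch run against an ERP sandbox after an update is applied and before it is promoted. The endpoints, payloads and environment variables are hypothetical placeholders, not a real vendor API.

```python
# Illustrative smoke test run against an ERP sandbox after an update has been
# applied, before it is promoted to production. Endpoints, payloads and the
# ERP_SANDBOX_URL/ERP_TOKEN settings are hypothetical placeholders.
import os
import sys
import requests

BASE = os.environ.get("ERP_SANDBOX_URL", "https://erp-sandbox.example.com/api")
HEADERS = {"Authorization": f"Bearer {os.environ.get('ERP_TOKEN', '')}"}

CRITICAL_CHECKS = [
    ("list customers", "GET", "/customers?limit=1", None),
    ("create draft sales order", "POST", "/sales-orders", {"customer": "TEST-001", "lines": []}),
    ("run trial balance report", "GET", "/reports/trial-balance?period=current", None),
]

def run_checks() -> bool:
    ok = True
    for name, method, path, payload in CRITICAL_CHECKS:
        resp = requests.request(method, BASE + path, headers=HEADERS, json=payload, timeout=30)
        passed = resp.status_code < 400
        ok &= passed
        print(f"{'PASS' if passed else 'FAIL'}: {name} ({resp.status_code})")
    return ok

if __name__ == "__main__":
    sys.exit(0 if run_checks() else 1)  # a non-zero exit code blocks the rollout pipeline
```

Run from a pipeline, a script like this turns “did the update break anything critical?” into a pass/fail gate rather than a manual checklist.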
Mitigate end-user mishaps with effective cyber training and application security  Since the Covid pandemic-induced mass shift to remote working, there’s been a noticeable spike in business cyberattacks as more employees connect corporate devices to their own personal networks, which often have poorer security than corporate networks. Unfortunately, there have been many cases where the end-user is unintentionally the reason corporate systems are compromised. Back in 2021, all it took was for one user to click on an infected file in an email to introduce a major ransomware attack on the entire Irish public health system. To avoid similar outcomes, many businesses have prioritised teaching their employees about cybersecurity and online safety.  IT departments that take a granular approach to security can relax knowing that, should a user account be compromised, this will not heavily impact access to critical systems and data. If configured correctly, an attack on one user will not be able to spread to other users, with audit trails, restricted privileges, and additional traceability measures, including automated checks, helping to contain and trace the incident.  Take a malware attack on a manufacturing company with operations running around the clock. A compromised on-premise ERP system linked to the factory floor would force the entire business to shut down to prevent the attack from spreading further and causing more damage to other back-end systems. This would have a catastrophic impact on the business because operations and manufacturing would have to cease until the issue is fixed, wasting time and money. However, with a SaaS deployment, where only a client on a single device is compromised, this will not be the case. Correct implementation can lead to a safe and smooth future with cloud ERP Before rushing to make the switch over to cloud ERP, businesses must first understand the operational responsibilities that can minimise disruptions. Adopting an ‘evergreen’ approach to updates still has its own security and compatibility challenges, as cloud ERP is not a fix-all solution after all. End-user training, application security, having the right managed service provider, and correct configuration will be crucial for businesses to unlock the true benefits of cloud ERP – from reduced capital expenditure to skipping version lag and enhanced efficiency.  ### Why is Modern Customer-Centric Design so Important? Putting consumer demands at the centre of product creation is a rapidly growing trend. One of the primary reasons individuals invest in technology is to make their lives easier, from unlocking smartphones with facial recognition to managing home temperatures from work. This is no different for corporations. In an increasingly competitive climate, business leaders are continuously looking for ways to enhance their IT software in order to better serve their customers. Customer-centric design refers to the process of developing technology, products, services and solutions based on consumers' true requirements and challenges. A great quantity of data may be collected by placing the customer at the centre of software design and merging their demands with Customer Relationship Management (CRM). Obtaining particular information on customers and their behaviour creates a complete picture, allowing for improved customer experience design. Customer First Modern customer-centric design is essential in software development since it focuses on the end user's expectations. 
This strategy guarantees that the software being produced meets the needs of the consumers, leading to higher customer satisfaction and loyalty. Software may be adapted to the individual demands of the target audience by collecting user data during the development process, resulting in a more user-friendly and efficient final product. A customer-centric strategy also saves costs by avoiding the need for substantial post-launch adjustments and maintenance. Furthermore, a study by Forrester Research found that companies that have a customer-centric approach to software development can save up to 30% in development costs. Therefore, using a customer-first approach to software development can result in a more successful and lucrative product. Use Case: Uber Uber has come a very long way since it launched as Uber Cab in 2009. Uber went public in May 2019, ten years after its inception. Despite its difficulties, Uber remains a key player in the ride-hailing landscape. Uber reported 1.5 billion journeys on its platform in its most recent quarterly financial announcement for Q2 of 2021.  Uber’s disruptive approach allowed the company to build a service that satisfies the demands of its target audience and sets it apart from its rivals by concentrating on the needs of its consumers and consistently obtaining input, enabling them to remain a leader in the ride-hailing sector. User-friendly App Uber's app is designed to be user-friendly and easy to navigate. It allows users to quickly and easily request a ride, track the progress of their driver and rate their experience. Personalisation Uber allows users to save their preferred pickup and drop-off locations, making it easy for them to request a ride in the future. Additionally, the app remembers users’ previous rides and automatically suggests similar options for future routes. Safety Uber places a strong emphasis on safety. The app includes features such as the ability to share trip details with friends and family, and a panic button to quickly contact emergency services. Flexibility Uber offers a variety of ride options to customers, such as UberX, UberSHARE and UberGREEN. This allows customers to choose the ride option that best suits their needs and budget. Customer Service Uber has a dedicated customer service team that is available 24/7 to assist customers with any issues or concerns. The company also has a rating system for drivers, which allows customers to rate their drivers and provide feedback on their experience. Use Case: Amazon In his 2017 Letter to Shareholders, Amazon’s Jeff Bezos called out the underlying nature of customers’ ever-increasing expectations. “One thing I love about customers,” Jeff wrote, “is that they are divinely discontent. People have a voracious appetite for a better way, and yesterday’s ‘wow’ quickly becomes today’s ‘ordinary.’” Amazon focuses on its customers' long-term needs – not just the requirements they have today but those they will have in the future – enabling long-term, sustainable innovation. Personalisation Amazon's website and app are designed to provide a personalised shopping experience for each customer. The platform utilises customer data, browsing history, and purchase history to recommend products, making it easy for customers to find what they're looking for. Search and Navigation Amazon's search and navigation features are designed to allow customers to find the products they're looking for with ease. 
The site's search bar enables customers to quickly find products using keywords, while the navigation menu is organised by product category, making it simple to browse. Easy Checkout Amazon's checkout process is designed to be fast and easy. Customers can save their shipping and payment information, making it straightforward for future purchases. Additionally, Amazon offers a variety of payment options, including credit cards, debit cards, and Amazon Pay, making it easy for customers to pay for their purchases. Fast and Reliable Shipping Amazon offers a variety of shipping options, including free shipping, same-day delivery, and one-day delivery. This allows customers to choose the shipping option that best suits their needs and budget. Strong Customer Service Amazon has a dedicated customer service team that is available 24/7 to assist customers with any issues or concerns. The company also has a rating system for products and sellers, allowing customers to rate their experience and provide feedback. A Future Workforce As younger generations enter the workforce, customer-centric design is expected to continue to evolve and adapt to their wants and needs. Younger generations have grown up with technology and expect a seamless, personalised and convenient experience from brands and businesses. A customer-centric design approach focuses on offering individualised and personalised experiences. This may be accomplished by utilising artificial intelligence, machine learning, and big data analytics, which enable organisations to collect and analyse client data in order to better understand their preferences. Additionally, younger generations place a high value on transparency and sustainability, so it is expected that customer-centric design will focus more on ethical and sustainable practices. This includes the use of sustainable materials, the implementation of fair labour practices and the reduction of waste and carbon footprint.  Maintaining focus on addressing customer requirements quickly becomes more challenging as additional business challenges emerge. However, understanding consumers' needs and demands, and quickly devising solutions to suit those requirements, is more important than ever for businesses wanting to remain innovative in an increasingly unforgiving business climate. ### Improving cloud migration efficiency with automated testing The benefits of moving to the cloud are well known - enabling businesses to scale up operations faster and with more efficiency, become more flexible and provide the level of quality in digital experience that customers expect and demand today. According to the 2022/2023 World Quality Report, almost half of enterprises have most of their non-production environments running in the cloud. However, making the journey to the cloud is not always an easy one as evidenced by recent research from McKinsey, which found that three quarters of cloud migrations run over budget and over a third run behind schedule. The balance that must be met for business leaders and IT teams alike is keeping operations on track during their cloud migration journey while at the same time keeping disruption across increasingly hybrid and disparate environments to the absolute minimum.  However, what many enterprises are not aware of is the vital role that automated testing can play in helping to minimise these risks. 
The 2022/2023 World Quality Report also found that half of respondents admitted their strategy for cloud testing is only somewhat effective, while only just over a quarter include cloud testing as part of the software development process during cloud migration projects.  This is somewhat surprising given that there is now over two decades of research showing that automated testing provides key capabilities to identify and fix errors early in the software development process and helps ensure that timelines and budgets stay on track. Migrating a system to the cloud is rarely simple, and without the right cloud testing approach, organisations could be introducing risk to their most critical data and processes, including: the performance and scalability of applications, data integrity, and business continuity. Balancing performance alongside scalability  While cloud infrastructure can theoretically scale to any size, it isn't that simple. The complexity of environment, configuration, and architecture means that, without adequate testing, how an application will perform isn't known. Many businesses have found this out the hard way when their systems fail under customer load, leading to trust and reputational loss. Performance testing your system early and regularly is critical to ensuring you understand how the system will behave when deployed on the cloud. Performance testing your migrated projects, whether those are ‘lift and shift’, re-platformed, or re-architected, enables you to analyse the impact of migration. For example, analysing the latency and bandwidth implications between your migrated system and other cloud or still on-prem components is critical when evaluating the suitability of the new cloud environment. Preserving data integrity  Cloud migration projects can often see complex situations where applications are reliant on shared data centre storage, including directory services for data management. Undoing what is likely years of integration and evolution to move some or all of that to the cloud is a very complex and delicate task. Alternatively, companies may be in a position of migrating a data warehouse from entirely on-premises to the cloud. In this scenario, whether it’s Microsoft Azure, AWS, Google Cloud or any other offering, the process is simpler, but not simple. In both cases IT teams and business leaders need to ensure that all company data is accessible and maintains integrity while the applications that rely on it change and evolve. A well-designed and repeatable test plan can play a vital role in ensuring reporting tools maintain operations and that all company data maintains integrity. This means that, during the evaluation stage of a migration, all potential issues must be identified and resolved before the process begins. Automated data validation and reconciliation can help IT teams identify and prevent unintended changes to the data both before and during migration.  One of the key benefits of this automated testing process is that it can detect changes as soon as they occur, meaning any unplanned changes caused by the migration can be dealt with right away, while they are quicker and easier to address. Additionally, the same tests can be used post-migration to determine if any modifications to systems are at risk of compromising data or other company processes.  
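To show what automated data validation and reconciliation can look like in practice, here is a minimal sketch that compares row counts and per-row checksums between a source table and its migrated copy. The connection targets, table and column names are illustrative assumptions; a real migration would point these at the actual on-prem and cloud databases.

```python
# Minimal sketch of automated data reconciliation between a source table and its
# migrated copy: compare row counts and a per-row checksum of key columns.
# Connection targets and table/column names are illustrative placeholders.
import hashlib
import sqlite3

def table_fingerprint(conn, table: str, key: str, columns: list[str]):
    """Return (row_count, {key: checksum}) for the given table."""
    cols = ", ".join([key] + columns)
    rows = conn.execute(f"SELECT {cols} FROM {table}").fetchall()
    checksums = {
        str(r[0]): hashlib.sha256("|".join(map(str, r[1:])).encode()).hexdigest()
        for r in rows
    }
    return len(rows), checksums

def reconcile(source_conn, target_conn, table, key, columns):
    n_src, src = table_fingerprint(source_conn, table, key, columns)
    n_tgt, tgt = table_fingerprint(target_conn, table, key, columns)
    missing = src.keys() - tgt.keys()
    changed = {k for k in src.keys() & tgt.keys() if src[k] != tgt[k]}
    print(f"{table}: source={n_src} rows, target={n_tgt} rows, "
          f"missing={len(missing)}, changed={len(changed)}")
    return not missing and not changed and n_src == n_tgt

# In a real migration these would be connections to the on-prem and cloud databases.
src_db, tgt_db = sqlite3.connect("source.db"), sqlite3.connect("migrated.db")
assert reconcile(src_db, tgt_db, "invoices", "invoice_id", ["customer_id", "total", "status"])
```

Because the check is scripted, it can be re-run before, during and after the migration, which is exactly how unplanned changes get caught while they are still cheap to fix.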
Ensuring business continuity - legacy applications Migrating legacy applications to the cloud presents a potentially significant risk, as these applications are often embedded within critical business processes, heavily customised, and strongly integrated with other applications. Moreover, it is frequently because of this that many enterprise resource planning projects do not run to time or budget.  Manually verifying whether legacy applications will continue to function post-migration is time-consuming, with no guarantee of identifying all the potential ways in which changes may negatively impact other business processes.  However, automated testing enables migration teams to evaluate these legacy applications, incorporate a safety net, and make sure business continuity is maintained before any changes are deployed. The knock-on effect of this is that migration teams have more time on their hands to focus on other tasks, safe in the knowledge that they have the capability to fix any potential errors before they have the chance to negatively impact customer relations or overall productivity. Migrate with confidence In conclusion, with Gartner estimating that 80% of businesses will close traditional data centres by 2025, and with 92% of companies having a multi-cloud strategy in place, there is no excuse for not being adequately prepared when it comes to cloud migration.  A key factor that is too often overlooked is the need to implement and deploy a comprehensive migration testing approach for applications, processes, and data across the entire business. The companies that do this will be the ones to benefit from reducing risks to critical data and business processes.  ### Inside ChatGPT plugins: The new App Store for brands? The evolution of ChatGPT this year has touched on nearly every sphere of society, sparking intense debate over the impact of AI on education, employment, culture and the very nature of communication. So profound and rapid is the change, it's prompted industry leaders to liken its transformational impact to previous technological firsts, including the launch of the internet and the smartphone. Amid a storm of clickbait headlines over the power of conversational AI, however, a quieter revolution is gaining pace. In March this year, OpenAI announced the launch of dozens of plugins, enabling early adopters such as Klarna, OpenTable and Shopify to gain third-party access to ChatGPT’s 100 million plus user base. Allowing brands to piggyback on the momentum of ChatGPT is a smart move that, in effect, transforms an already highly sophisticated tool into a transactional platform akin to an app store. The ability to build plugins opens up a world of functionality for participating brands: in the B2B space, OpenAI is already planning to launch a marketplace allowing developers to sell the AI models they have created on OpenAI’s technology.  While many of the capabilities that drive ChatGPT plugins already exist, OpenAI's technology coalesces multiple services simultaneously thanks to a natural language interface. By doing so, it forms an intuitive and evolving ecosystem that promises to redefine consumer behaviour – but to what extent? OpenAI insists it is still early days for ChatGPT plugins, yet already we can see several emerging trends.  1. A shake-up in the Big Tech power rankings: All of the big tech powerhouses have made generative AI a priority, but OpenAI currently looks to have a competitive edge. 
If OpenAI manages to keep up the pressure on rivals with ChatGPT plugins, two brands have cause for concern: Google and Apple. For Google, the key competitive worry is the risk to its dominance in search (which, together with YouTube, is already under pressure from the phenomenal rise of the social platform TikTok). ChatGPT's plugins represent a major disruption that will provide users with AI-powered natural language access to real-time personal and public data. In the short term, this will directly challenge the hegemony of Google’s own revamped search engine. That said, Google DeepMind’s Gemini AI system, which is currently in development, is claimed to have greater capabilities than ChatGPT, including planning and problem-solving. As for Apple, the advent of plugins means that ChatGPT now has the capacity to become its own App Store. Rather than end users accessing apps one at a time, the plugins integrate third-party capabilities into ChatGPT, with the result that brands will start to meet customer needs through natural language interfaces instead of old-style apps. Not all Big Tech entities stand to suffer from OpenAI’s emergence onto the scene. Microsoft made a $10bn investment in the brand, propelling it into pole position for generative AI. It might even be enough to resuscitate Bing. It’s also worth stressing that GAMMA (Google, Apple, Meta, Microsoft, Amazon) have developed a Teflon-style resistance to competition. And the ChatGPT app’s appearance in the Apple App Store, coupled with talk of a similar app for Google’s Android OS, suggests that OpenAI’s route to growth may require it to collaborate with Big Tech rather than compete. 2. Scope for faster “startup to scale” acceleration: In his latest TED Talk, OpenAI co-founder Greg Brockman illustrated the potential of ChatGPT plugins by showing how easy it is to conjure up menus by integrating existing apps. In this case, Brockman used ChatGPT to suggest a post-TED meal and draw a picture of it. With no further input from Brockman, ChatGPT summoned up digital image creator DALL-E, grocery delivery service Instacart and workflow automation platform Zapier to plan a dinner, including creating an image of the meal, ordering the ingredients to be delivered and tweeting a picture of the end result. The point of the story isn’t what Brockman has for dinner – but how ChatGPT can stimulate new business models that amplify user experience. By allowing brands to integrate, OpenAI is providing an all too rare opportunity for tech first-movers eyeing rapid expansion.  More established brands will inevitably face cultural and logistical challenges in pivoting to generative AI. But for early-phase entrepreneurs, the plugins enable a simple and relatively inexpensive way to innovate their offer – taking on rivals via the mantle of ChatGPT.  3. Renewed focus on user experience and data: UX and data have been touted as key commercial differentiators since the early years of the digital era. Now ChatGPT plugins take the debate to a new level. Regarding UX, the interplay between plugins and third-party providers makes ChatGPT a centralised hub for a myriad of digital interactions. By allowing users to interact with brands through a single natural language interface, the platform presents a streamlined process where users can delegate complex, multi-dimensional tasks to ChatGPT in an enjoyable, conversational way and with minimal effort.  
If all brands are delivering a similar standard of great UX, however, then data will become more important as a differentiator. ChatGPT’s advice and recommendations can only ever be as good as the data it is provided with. Brands with meaningful and well-managed proprietary data will be better placed in this new plugin landscape.  Follow this idea to its logical conclusion, and there is a compelling argument in favour of shared data ecosystems. Car giant GM is already exploring the potential of ChatGPT in vehicles. But imagine the potential of GM data layered with that of a charging network, satnav operator and finance provider – all plugged into a generative AI business model. Swap out the names, and you can imagine a similarly effective collaboration within financial services or healthcare. 4. The looming threat of job erosion: ChatGPT plugins will not single-handedly reinvent the global labour market, but in many sectors, they create the potential for companies to replace human employees with machines.  People dispute how far and how fast the shift will come, but the IMF has warned generative AI could result in “substantial disruption to the jobs market.” A worrying trend in 2023 is that large corporations are using the promise of AI-inspired job losses to grab the headlines. BT, for example, plans to cut its workforce by 55,000 before 2030 – with 10,000 jobs replaced by AI. Elsewhere, IBM said it intends to replace 30% of back-office jobs with AI alternatives. 5. A wave of healthy scepticism: AI’s remarkable potential has led to a flood of investment in recent months, such as AI start-up Anthropic’s plans to launch a competitor to ChatGPT, backed by Google and Spark Capital. But its perceived threat to society has also led to serious pushback, with AI “godfather” Geoffrey Hinton’s departure from Google seen by many as a “canary in the coal mine” moment.  There’s evidence of some early caution around ChatGPT’s pioneering AI model in the wider business world, too. A survey of Fortune 500 companies last month found that the vast majority of CEOs still see data analytics as their current key priority – not the unproven potential of generative AI. In fact, just 12% of leaders saw generative AI as having the biggest influence over their business within the next decade. Many companies don't feel ready for the leap into the unknown just yet, and may not know how to get going with AI. Once they do, it’s essential that technologists and designers with first-hand experience of generative AI have a seat at the table: a recent study of digital progression at big companies found that 60% of leaders did not understand how to use emerging technologies.  While there’s no question that ChatGPT and its competitors will become increasingly important, plugins will open a Pandora’s box of unexpected consequences. For all of their potential, brands need to guard themselves against risks such as reputational damage, consumer alienation and the potential for privacy blunders as they navigate the opportunities ahead. ### Creative Cloud: Driving Brand Identity The digital world is developing at a rapid rate, and its constant evolution pushes businesses to keep innovating, entering new markets and industries, which ultimately increases competition. As a result, brands must find ways to set themselves apart from the rest. While much of that relies on solutions and offerings, brand awareness and identity are essential in today’s highly competitive market.  
The cloud has undergone truly significant advancements in recent years, with its functionality and utility growing on a weekly basis. Similarly, the growth of the creative industries, such as graphic design, videography and photography, has been a highly promising prospect over the past few years, with Deloitte forecasting that the global creative economy will grow by up to 40% by 2030. A number of cloud-based providers are now looking to embed creative offerings within their packages to cater to a brand new market and a new era of creative possibility - fuelled and supported through the use of the cloud.  The Importance of Brand Identity The relationship between the cloud and the creative industries creates an enormous opportunity for businesses to capitalise on the ability to craft and shape a strong brand identity. Possessing a well-developed brand identity helps companies to stand out from their competitors, allowing them to mould a brand personality and create loyalty and awareness among their customers, employees and the general public.  Organisations that invest in the ingenuity of their marketing teams have a competitive advantage in this new age of the creative cloud. Creative businesses are able to tailor their brand to meet the needs and motivations of their customers in new, inventive and personalised ways, helping to build a memorable experience and placing them in a considerably more favourable position compared to their competitors. Through the integration of creative solutions into the cloud, businesses receive not just the advantages of a stronger brand identity but also all of the additional benefits of the cloud, including increased flexibility and availability, scalability, advanced security and easier collaboration.  Creativity During Times of Economic Uncertainty  Throughout times of economic uncertainty, budgets are regularly evaluated and constrained as businesses search for ways to reduce their spending and remain financially stable. While cutting a marketing budget may seem like a viable way to save some money, it may not be the best outcome for companies. A business’s marketing is what helps it stand out from the competition. It's what gives it originality, brand presence, and the recognisability that ultimately plays a key role in driving future prosperity.  In a recent study reflecting on the 2008 recession, the ICAEW reported that advertising expenditure was reduced by 13%. However, statistics showed that companies and organisations who maintained their marketing output received 3.5 times more brand visibility than their competitors, greatly strengthening their market presence, brand identity and capacity for future business growth. It is clear that investment in marketing and creativity should be a top priority for businesses, especially in times of economic downturn. Seizing the opportunity As so many organisations choose to reduce their marketing budgets during tough times, this actually presents an opportunity for others, because it increases the available market for those who want to draw attention to their brand. A recent report conducted by Analytic Partners revealed that 63% of brands that increased their investment in marketing after the 2008 financial crisis saw a 63% return on their investments, and brands that increased their media investment also recorded 17% growth in incremental sales. 
Not only does continued investment in marketing drive consistent revenue, but it can also boost revenue during periods of economic downturn as a result of diminishing investment from competitors.  In addition, cloud-based systems are typically more cost-effective than their more archaic counterparts, such as internally hosted servers, which are expensive to run, provide limited room for growth and can lead to a greater volume of information silos in comparison to modern cloud solutions. The cloud also provides users with additional control over, and management of, their investments.   A recent study from McKinsey provides additional evidence of the benefits of the cloud, stating that businesses that use it effectively can cut their IT infrastructure expenditure by up to 20% due to its increased potential for automation, enhanced efficiency and resource monitoring functionality. The cloud also creates the opportunity for improved collaboration within businesses, allowing organisations to view and share information and resources with ease. The Future of Creativity  The creative industry is in a constant state of evolution, developing in tandem with the advancements and technological successes of society. As emerging technologies such as Artificial Intelligence (AI) continue to become increasingly embedded in our lives, they can be harnessed to elevate the industry to new, innovation-led heights.  When considering the future of the creative industry, AI and machine learning can be used to dramatically streamline and simplify often complex operational workflows, helping businesses to make more informed decisions at a much quicker rate. As additional emerging technologies such as virtual and augmented reality continue to develop, they create the foundation for a brand new level of creative possibility.  Combining these emerging technologies with the benefits of the cloud, such as centralised data management and security, creates a truly promising future for creativity in business. This helps pave the way for businesses to showcase their design talents, helping them to differentiate themselves from their competitors and stand out amidst a crowded market. A Visionary’s Future In a world where so many ideas have been ‘done already’, creativity is integral to a successful business in today’s environment, and every business should aim to develop a high-quality creative team as a result. As creative solutions are integrated into the cloud, this provides the ideal opportunity for cloud marketplaces to enter new potential client pools, diversifying and further growing their offerings and packages. This significantly improves the prospects of success for a myriad of businesses spread across the channel, from top to bottom.  ### Cloud migration: Easy to start, harder to finish? Cloud migration – a no-brainer? Yes…and no. Despite all its very real benefits, not everything should make its way onto the cloud. Developing a cloud migration strategy helps businesses evaluate which workloads will work best in the cloud and prepare for any complexities. Cloud migration has become a leading objective for virtually all businesses, regardless of industry. If strategic, impactful business modernization is your organisation’s goal, cloud offers increased agility, enhanced security, better backup and recovery options, and greater cost efficiency. 
Furthermore, it allows organisations to pivot at short notice and to innovate more freely and effectively. At a time when data volumes are rising and workforces are becoming increasingly distributed, shifting workloads into the cloud has become the obvious path forward.  Yet, despite its apparent benefits, not everything should make its way into the cloud. Part of developing a good cloud migration strategy is recognising which workloads will thrive in the cloud, and prioritising them accordingly. While it may be easy to ‘lift and shift’ certain apps or processes into the cloud, some workloads that deal with sensitive information or depend on legacy systems will need more care and attention. This is just one example of how complex cloud migration can be, and one of the key reasons many cloud adoption strategies stall mid-process.  Stalling after the first hurdle False starts and failed attempts are common when it comes to cloud adoption. The UK’s Ministry of Defence has just revised its cloud adoption strategy as it prepares to move sensitive data off-premise by 2025, highlighting the importance of remaining responsive and agile during the process. Put simply, the initial roadmap to cloud adoption will not always chart the most efficient path.   Organisations should bear this in mind. According to a 2021 Gartner report titled ‘The UK: Cloud Migration, Governance and Cost Control Are the Most Frustrating Aspects of Public Cloud’, 79% of UK businesses agree that shifting their workloads into the cloud is a high priority, but almost a third report unsuccessful or ineffectively implemented cloud initiatives in the past three years. This issue is reflected worldwide – in McKinsey’s report, titled ‘Cloud-migration opportunity: Business value grows, but missteps abound’, it is estimated that £124 billion of wasted migration spend is expected over the next three years – a major inhibitor, and potential disincentive, to cloud adoption. Cloud adoption isn’t an event, it’s a process. So, what can organisations do to ensure that process isn’t just started, but finished in the right way? Establishing a cloud migration strategy Migration for migration’s sake can’t be the goal for organisations. Like all transformation processes, it needs to be executed with broader business objectives in mind. What is the current baseline situation? What legacy systems are involved? Which workflows are most important, and how is the workforce distributed?  Organisations need to identify the business goals that the cloud migration will support, determine the types of applications and data that will be migrated, and select the cloud service provider(s) that will best meet their needs. One of the best ways businesses can prepare for cloud migration – and devise the best migration strategy – is by using the Six R’s of Cloud Migration – a concept developed by Gartner analysts and later refined by AWS.  The six R's represent a framework designed to help organisations determine the best approach to migrating their applications and data to the cloud. Each of the 6 R's represents a different migration strategy, and organisations must carefully evaluate each option to determine the approach most appropriate to them.  Rehost (lift and shift): This approach involves migrating applications and data to the cloud without significantly changing the underlying architecture. Essentially, the application is "lifted and shifted" from the on-premises environment to the cloud. 
Rehosting is typically the quickest and least risky migration approach, but it may not take full advantage of the benefits of cloud-native technologies. This is where many businesses consider their cloud migration project ‘finished’, but in reality only scratches the surface of what’s possible.  Refactor (re-architect): Refactoring involves making significant changes to the application architecture to take advantage of cloud-native technologies. This may include redesigning the application to use cloud-based services such as serverless computing or microservices. Refactoring can significantly improve scalability and agility but may be more time-consuming and complex than other migration approaches. Revise: This approach involves making minor changes to the application to enable it to run in the cloud. This may involve updating the application code to take advantage of cloud-based APIs or services. The revised application is then migrated to the cloud, where it can take advantage of cloud-based scalability and other benefits. Rebuild: Rebuilding involves completely redesigning and rebuilding the application from scratch using cloud-native technologies. This approach is typically used when the existing application is outdated or not well-suited to the cloud environment. Rebuilding can lead to significant improvements in performance and scalability but is also the most time-consuming and risky migration approach. Replace: This involves replacing the existing application with a new cloud-based application that provides similar functionality. This may involve selecting a new application from a cloud-based marketplace or developing a new application from scratch. Retain: Retaining involves keeping the existing application on-premises and not migrating it to the cloud. This approach is typically used when the existing application is critical to the organisation and cannot be migrated to the cloud for regulatory or technical reasons. Going through this framework step by step, and pairing the right migration approach to the right app or workload, will help businesses carve out the most effective path toward ‘finishing’ their cloud migration process.  Risk mitigation in cloud migration Cloud migration is not without risk. More than half (56%) of organisations are hesitant to press forward with cloud migration due to security concerns around data loss and leakage. This is yet another reason many cloud migration processes stall after they have begun, and why many stop at the relatively easy ‘lift and shift’ stage.  To mitigate these risks, organisations must ensure that they have appropriate security controls and security best practices in place. Chief among these is data encryption. By encrypting data before transferring it to the cloud, organisations can ensure that their data remains secure both in transit and at rest. Access controls are also essential for preventing unauthorised access to data during the cloud migration process, and should be established on the basis of ‘least privilege’ – this will ensure that data is only accessible on a ‘need to access’ basis as determined by employee roles and functions. Monitoring is also crucial for detecting and preventing data loss and leakage during cloud migration. Organisations should implement real-time monitoring of their cloud infrastructure and applications to detect any unauthorised access or data leakage. By implementing logging and auditing, they can track all activity related to their data. When it comes to cloud migration, preparation is king. 
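As one concrete illustration of the ‘encrypt before transfer’ control mentioned above, here is a minimal sketch using the Python cryptography library’s Fernet implementation. The file name and key handling are deliberately simplified placeholders; in practice keys would be issued and stored by a managed KMS or secrets manager, never kept alongside the data.

```python
# Illustrative client-side encryption of a file before it is transferred to cloud
# storage, using the `cryptography` library's Fernet symmetric encryption.
# Key handling here is simplified; in practice keys would live in a managed KMS
# or secrets manager, never alongside the data. File names are placeholders.
from pathlib import Path
from cryptography.fernet import Fernet

def encrypt_for_upload(path: str, key: bytes) -> Path:
    """Encrypt `path` and write `<path>.enc`, returning the encrypted file's path."""
    token = Fernet(key).encrypt(Path(path).read_bytes())
    out = Path(path + ".enc")
    out.write_bytes(token)
    return out

def decrypt_after_download(path: str, key: bytes) -> bytes:
    """Decrypt a previously encrypted file and return the plaintext bytes."""
    return Fernet(key).decrypt(Path(path).read_bytes())

key = Fernet.generate_key()          # in practice: fetched from a KMS/secrets manager
encrypted = encrypt_for_upload("customer_export.csv", key)
print(f"Upload {encrypted} to the target cloud bucket; the plaintext never leaves the host.")
```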
Rushing into an ad-hoc migration process will result in failures and false starts, and will see the migration process stall after some initial ‘lift and shift’ work. While there are some benefits to be gained here, securing a bright and sustainable future in the cloud requires more planning and greater alignment with overall business objectives. With careful planning, building the right relationships with the right vendors and partners, and doing what they can to minimise risk, organisations can ensure a smooth and seamless transition to the cloud in a way that delivers on its true potential.  ### Compare the Cloud interview - Ed Dixon - Bayezian In his enlightening speech, Ed placed clear emphasis on the importance of the specific skill sets graduates need to thrive in today's fast-paced industry. He underscored that, apart from academic proficiency, the modern industry demands well-roundedness in individuals, which includes critical thinking, adaptability, effective communication and emotional intelligence. He further elaborated that it is this blend of hard and soft skills that sets successful graduates apart. Ed's talk provided a fresh perspective, urging students to go beyond their traditional curriculum and cultivate these essential skills to gain an edge in the competitive professional world. ### Core to cloud: the next phase in telco cloud adoption  Our world is driven by the connectivity that communications service providers (CSPs) deliver to both people and businesses. As our demands for connectivity evolve due to the introduction of technologies such as 5G, virtual reality, augmented reality, mixed reality, IoT and Generative AI, CSPs must evaluate how they can innovate to create and monetise new services and experiences, without increasing prices, lengthening time to market or compromising the efficiency of their existing offerings. How can they achieve this? Undoubtedly, one key pillar of the strategy is adopting cloud.   The cloud brings benefits that CSPs simply can’t access when hosting their core systems on-premise. From enhanced agility, scalability and network resilience to rapid innovation and improved security and regulatory compliance, the cloud is key for telecoms companies looking to up-level their offerings and provide amazing experiences. Cloud will also enable them to adopt enhanced analytics and AI-driven solutions, giving them greater cost control and more, all of which will result in more compelling and personalised experiences for their customers.  Where are we now?  Recent research from Capgemini reported that telcos will invest an average of over $1 billion in telco cloud over the next three to five years to support their mobile networks. So, while we know that many CSPs are a considerable distance along their paths to the cloud already, significant investment is still being pumped into this technology, which needs to be directed towards the areas that will give them the greatest advantage. But before looking ahead, let’s evaluate where we currently stand.  So far, much of the telco cloud investment has been focused on the initial steps and ‘low-hanging fruit’ of a cloud adoption program. This includes strategy, establishing partnerships with the leading cloud providers, the creation of cloud ‘Centres of Excellence’ and building out knowledge and cloud capabilities. When it comes to migration, most of the work done has been with non-critical workloads as a way to trial and initiate the move without putting too much on the line. 
However, we’re quickly approaching the point where telcos will have to address the question of how to migrate their mission-critical core systems to the cloud – and that layers in a new level of complexity.  Core system migration is a significant undertaking as it touches the lifeblood of the organisation. Any glitches in the migration could impact levels of service, customer experience and revenue streams – so extra care is needed. But as the industry powers ahead with new technologies and business opportunities, telcos must not allow themselves to fall behind the competition or miss the early mover opportunity. They need the power and scale of cloud in their core system capabilities to remain competitive, offer innovative services and avoid losing customers. ‘Core to cloud’ migrations are taking place now and will be completed within the next three to five years, so the time to act is now.  Finding the right path To fully realise the benefits of core to cloud migration, telcos should design a custom strategy that will address their unique needs and situation, taking into account factors such as their business priorities, staff skill sets, existing data centre strategies, funding and resources. For example, their cloud migration timeline may be designed to address a number of compelling events and dovetail with data centre closedown, hardware and software refresh cycles or end of life, plans for skills acquisition and more. After working through cloud programs with dozens of customers, I can tell you that there are no templates. Each cloud plan is determined for each operator, according to a very specific individual set of needs and priorities.  There are several high-level paths that a cloud migration may follow. For some CSPs, cloud migration is part of a larger digital transformation initiative. For others, it may be more of a ‘migrate and modernise’ approach, driven by cost control, technology or a cloud-first strategy. T-Mobile US and Vodafone Germany, for example, are both deeply engaged in transformations. T-Mobile US has chosen to transform its next-generation hybrid-cloud operations to unlock efficiencies and improve its abilities around 5G. Vodafone Germany is also transforming to a cloud-native environment and adopting a DevOps strategy that will enhance the customer experience it provides.  On the other hand, AT&T Mexico and Vodafone Ireland have adopted a core to cloud approach – migrating their core BSS systems to the public cloud. The former is migrating its database and application workloads to the public cloud to benefit from greater flexibility and capacity, and similarly, the latter is migrating its infrastructure and application workloads to the cloud with the ambition of improving its customer experience and offering 5G services.  Whichever path telcos choose towards cloud adoption, they don’t have to travel it alone. Spending the time to decide which partners they want or need to work with on their journey can go far toward ensuring a smooth route with minimal disruption. This will enable them to seamlessly maintain the experiences their customers are used to receiving, and ensure they have all the skills they need to make it a success.  Getting ahead with the cloud As is true with many difficult decisions, delaying will often only make it worse. The time is now ripe for telcos to evaluate what they want to get out of their cloud migration, the type of budget they have to conduct it, and how to go about it. 
Whether it's a complete transformation, a core to cloud migration or any other configuration of steps on the journey towards cloud-native, the urgency to start and speed through the journey is growing. My advice: don't delay any further. ### Supercharging Financial Planning with Generative AI When it comes to financial planning, the ability to make informed and strategic decisions quickly is paramount to the success of any business, be they early-stage startups or industry titans. Traditionally, financial planning involved either using expensive software that required specialist knowledge and training, or relying on complex, error-prone spreadsheets. The latter especially is not only time-consuming but also poses challenges when updating financial plans as an organisation's numbers change over time, often needing the model to be torn down and rebuilt almost from scratch. However, the emergence of generative AI has the potential to revolutionise financial planning, enabling the rapid creation of bespoke and complex financial plans that can be edited in real-time. This breakthrough will democratise financial planning, allowing businesses and leaders to stay ahead of the game without requiring strong financial or modelling skill sets. The Limitations of Traditional Financial Planning Traditional financial planning, modelling, and forecasting methods often required significant investments in expensive software packages. These tools were typically designed for financial experts, requiring specialised training and expertise to operate effectively, as well as heavy implementation projects. Small and medium-sized businesses, lacking the financial resources to invest in such software or hire dedicated financial modelling professionals, found themselves at a disadvantage. As a result, financial planning became a daunting task, often requiring ad hoc external assistance or relying on error-prone spreadsheets. Spreadsheets, while more accessible than specialised financial planning software, presented their own set of challenges. Building comprehensive financial models in spreadsheets is a time-consuming process that demands a deep understanding of finance and modelling techniques. Moreover, this is not a one-off effort; maintaining these models as the organisation's financial landscape evolves requires constant tweaks, increasing the risk of errors and compromising the accuracy of the financial plans. This traditional approach hinders agility and limits the ability of smaller businesses such as startups to adapt quickly to changing market conditions. The Rise of Generative AI in Financial Planning Generative AI, a subset of artificial intelligence, is emerging as a game-changer in financial planning, modelling, and forecasting. By leveraging machine learning algorithms and vast amounts of data, generative AI enables businesses to create and update financial plans rapidly, efficiently, and with a high degree of accuracy. This technology promises to empower organisations of all sizes, including those lacking in-house financial expertise, to make informed, confident decisions based on robust financial models. One of the key advantages of generative AI is its ability to generate bespoke financial plans tailored to a company's specific needs and objectives. By analysing historical financial data, market trends, competitive analysis, and other relevant factors, generative AI algorithms can create intricate financial models that account for various scenarios and contingencies.
These models can be customised to reflect the unique dynamics of each business with a high degree of granularity, ensuring that financial plans align with strategic goals. Real-time Editing and Adaptability Generative AI's ability to facilitate real-time editing and adaptation is another significant advancement. Financial plans are no longer static documents that require extensive and painful manual updates on a recurring basis. Instead, generative AI will enable dynamic modifications to financial models as circumstances change. This flexibility ensures that businesses can respond swiftly to market shifts, regulatory changes, or internal developments, keeping them one step ahead of the competition. Real-time editing is made possible through user-friendly interfaces and intuitive tools that allow non-experts to make adjustments to financial plans effortlessly. Complex calculations and intricate financial analyses, which were once the exclusive domain of finance professionals, can now be handled by individuals without strong financial or modelling skill sets. Generative AI simplifies the process, enabling decision-makers from various departments to actively participate in financial planning. Risk Mitigation and Scenario Analysis Financial planning is inherently fraught with uncertainty, and businesses must navigate an ever-changing landscape of risks and opportunities. Generative AI empowers organisations to conduct comprehensive risk assessments and scenario analyses, helping them make informed decisions in the face of uncertainty. By simulating various scenarios and assessing the impact of external factors on financial outcomes, generative AI acts as a ‘virtual CFO’, providing valuable insights. It enables businesses to identify potential risks, optimise resource allocation, and develop robust strategies to mitigate adverse effects. This proactive approach to risk management allows organisations of any size or stage to maintain financial stability and make strategic choices that lead to sustainable growth. Data-driven Decision Making Generative AI's reliance on vast amounts of data is instrumental in enhancing the accuracy and reliability of financial planning. By analysing historical data and incorporating external data sources, such as market trends, consumer behaviour, and macroeconomic indicators, generative AI algorithms can generate more precise financial models. Data-driven decision-making becomes a reality with generative AI, as it eliminates the biases and limitations of human judgement. Financial plans are based on objective analysis and are not influenced by individual cognitive biases or subjective interpretations. This data-centric approach instils confidence in decision-makers and facilitates more effective resource allocation, investment strategies, and business expansion plans. The Road Ahead As generative AI continues to advance, the future of financial planning appears even more promising. Ongoing research and development in the field will lead to further improvements in accuracy, speed, and usability, making generative AI an indispensable tool for businesses worldwide. However, as with any disruptive technology, challenges must be acknowledged and addressed. Organisations must ensure the ethical use of generative AI in financial planning, protecting sensitive data and ensuring compliance with regulations. Transparency and accountability in AI algorithms and decision-making processes are crucial for maintaining trust and confidence in the technology. 
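To make the scenario analysis described above more concrete, here is a minimal Python sketch of the kind of percentile-based view a planning tool might surface. It deliberately leaves the generative AI component out, and the growth, churn and volatility figures are invented purely for illustration.

```python
import random

def simulate_revenue(start_revenue, months=12, runs=5000,
                     growth_mean=0.03, growth_sd=0.02, churn=0.01):
    """Monte Carlo sketch: project monthly revenue under uncertain growth.

    All parameters are illustrative assumptions, not real benchmarks.
    """
    outcomes = []
    for _ in range(runs):
        revenue = start_revenue
        for _ in range(months):
            revenue *= 1 + random.gauss(growth_mean, growth_sd) - churn
        outcomes.append(revenue)
    outcomes.sort()
    return {
        "p10": outcomes[int(0.10 * runs)],  # pessimistic scenario
        "p50": outcomes[int(0.50 * runs)],  # median scenario
        "p90": outcomes[int(0.90 * runs)],  # optimistic scenario
    }

print(simulate_revenue(100_000))
```

In practice, a generative AI planning tool would propose the drivers and assumptions itself, but a percentile summary of this kind is the sort of output a decision-maker can interrogate and adjust in real time.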
The advent of generative AI has tremendous promise to supercharge financial planning, financial modelling, and financial forecasting, enabling businesses to create bespoke and complex financial models rapidly. This technology will democratise financial planning, freeing start-ups, scale-ups, and other SMEs from the no-win dilemma of expensive software or complex spreadsheets. Real-time editing, adaptability, risk mitigation, and data-driven decision-making will all become achievable even without strong financial or modelling skill sets. Simply put, spreadsheet skills will no longer determine your ability to run a successful business. Day after day, the media is full of stories in which the leaps forward in Generative AI are empowering businesses to make informed and strategic decisions, navigate uncertainty, and drive sustainable growth. For instance, my own company, Blox, is building the world's first generative AI planning platform. As this technology continues to evolve, its potential for supercharging financial planning will only grow, enabling businesses to stay ahead of the game and achieve long-term success in today's dynamic and competitive business landscape. ### Can AI help companies transform their ESG performance? The latest cloud-based AI capabilities could be used to track patterns across diverse, previously untapped environmental, social and corporate governance (ESG) performance data - from signs that valued employees are burnt out to insights into resource wastage in the supply chain. Here, Dr John Bates, AI technologist and CEO of SER, considers where those insights might be buried and how to surface and harness them. It's no secret that today's businesses are being swamped by a deluge of data. In 2023 alone, 120bn terabytes of content will be created, captured and consumed by enterprises – rising by a further 50% by 2025. Hidden somewhere inside that data is a gargantuan opportunity to understand the way those companies behave as employers, and how that affects everyday outcomes, including their ESG performance – something for which they are increasingly being held to account. The practical challenge is that as much as 80% of the content in an organisation currently exists in static documents, emails or some other unstructured form, making it very difficult to find and combine as a source of decision-supporting insight. Worse still, 54% of all enterprise data is thought to be 'dark data' – in other words, completely undiscoverable (e.g. locked in people's heads, or gathering dust in paper form). A whole spectrum of AI capabilities now promises to unlock the value of these overlooked sources of business intelligence, via cloud-based platforms. What's at stake? To do better at ESG, employers need to be able to identify and pre-empt critical scenarios such as underrepresented members of the workforce or high-potential people becoming stressed, demotivated and leaving the company. Clues to employee burnout, neglect or unfair treatment may reside within appraisal notes, calendar schedules or sick leave records, for instance. Another pressing priority might be to curb environmentally-inefficient use of resources along the supply chain – from new insights into order duplications, logistics mileage, and excessive energy consumption which ordinarily would be dispersed across purchase orders, invoices and delivery notes.
The challenge is not only to capture and structure all of this intelligence digitally and assign to it rich metadata (to aid its discovery), but also to link it in a meaningful way with associated data, and to then harness the latest AI techniques and tools to monitor, cross-analyse and distil meaningful insights from all of those inter-related knowledge assets - to support or trigger targeted actions. Levels of AI & their interplay There are three stages through which companies can apply different forms of AI to move closer to their ESG and wider business transformation (for instance, through the improvement of stakeholder experiences; a honed product or service vision; and/or improved cost efficiency), and cloud-based platforms are an important enabler of all of these. First, AI technology is very effective in pattern matching, and never more so than today, thanks to a wide range of deep learning capabilities – from visual analysis/image recognition to natural language processing (NLP). These can help to precisely identify and capture what the content is, through a process of continuous scanning and metadata generation (detailed tagging/indexing of content). Then comes the application of ‘contextual AI’ to understand what the content is about and how it adds to the company’s intelligence about a given topic. This is about joining the dots between content with related metadata, to capture the context of content and compare/contrast related information over time. This builds the ability to understand correlations, trends and outliers/red flags – or untapped opportunities – on demand. It is through this application of AI that a company might determine the link between a particular manager and colleagues feeling held back or under-developed, for example. A further opportunity for AI surrounds intelligent content assistants, which is about AI’s role in search and discovery. Think of this as a ChatGPT equivalent for the workplace – a bot that can query an enterprise’s metadata-enabled content to distil insights such as “Show me high-potential individuals in our employment who are not satisfied/showing signs of restlessness”. Connecting insights via an intelligent cloud content platform In an ESG context, the opportunity might be to reverse staff attrition and/or enhance employee wellbeing by boosting fair treatment and targeting new development opportunities; or identify new opportunities to limit carbon emissions across the supply chain. More broadly, it could present the chance to enhance the customer experience, or to hone product/service development as new insights are discovered from across helpdesk exchanges, sales/indirect channel feedback, and review forums.  Even just transforming the everyday lives of knowledge workers, who still on average spend over a third of their day hunting for information to complete a task (60% of this across more than four different IT systems, according to our own analysis), can contribute significantly to their improved wellbeing - by reducing stress and enabling them to complete their work more efficiently and with greater sense of achievement. Keeping cloud-based content infrastructures and platforms as flexible and as open as possible will go a long way in ensuring the organisation can keep embracing the latest AI advances. The rest is down to company culture and the foresight of its management in wanting to be ahead of the curve on ESG – both out of a sense of corporate responsibility, and as a means of attracting future talent. 
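As a rough illustration of the metadata-generation stage described above, the sketch below tags unstructured content with simple ESG-related labels. A real system would rely on NLP models rather than keyword lists, and the categories and terms here are invented for the example.

```python
import re

# Illustrative tag dictionary; a production system would learn these signals with NLP models
ESG_TAGS = {
    "employee_wellbeing": ["burnout", "overtime", "sick leave", "workload"],
    "supply_chain_waste": ["duplicate order", "excess stock", "mileage"],
    "energy_use": ["energy consumption", "kwh", "cooling"],
}

def tag_document(text):
    """Attach simple ESG metadata to a piece of unstructured content."""
    lowered = text.lower()
    found = {}
    for tag, terms in ESG_TAGS.items():
        hits = [t for t in terms if re.search(re.escape(t), lowered)]
        if hits:
            found[tag] = hits
    return found

print(tag_document("Delivery note flags a duplicate order; driver logged extra mileage."))
```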
### The Changing Role of Partners in SAP's New Cloud Mindset Recent changes from SAP have left many partners wondering what the next steps are in providing cloud services, especially given SAP's insistence on developing a 'cloud mindset' and its emphasis on the Activate Methodology and Fit to Standard workshops to achieve this. Up until this point, a change of direction to the cloud has had little impact upon consultants' abilities to facilitate a deployment, or to provide daily customer support services. But the introduction of RISE with SAP S/4HANA Cloud, public edition or private edition, has brought with it a different way of working from its predecessor, HANA Enterprise Cloud (HEC). Given SAP is now driving software infrastructure delivery, the role that SAP partners play has changed. From initial consultation to implementation, support to maintenance, as Robert MacDonald, Innovation & Technology Manager at Absoft, explains, partners need to adopt a new cloud mindset and skill set to enable them to adapt to the change, and to successfully deliver RISE with SAP. Identifying the change Over the last ten years, SAP partners have been moving away from providing on-premise ERP solutions toward cloud-based systems, such as Microsoft Azure. Even with what felt like a significant change as a result of switching licences and adapting to more varied cost/flexibility models, partners were able to keep disruption to their business to a minimum. This was due in part to the fact that partners were still in control of key aspects of the process, from initial scoping all the way through to implementation, and as such, did not need to train their staff in any new specific skills. RISE with SAP has ushered in a great deal of change. Irrespective of whether a customer opts for RISE with SAP S/4HANA Cloud, public edition or RISE with SAP S/4HANA Cloud, private edition, the entire approach has changed – and partners need to change with it. On the surface, the process has been simplified. SAP has created a standard infrastructure and offers customers small/medium/large architecture options to streamline pricing. An 'adopt not adapt' mindset means customers are encouraged to avoid any customisation – indeed customisation or extensibility, if required, can only occur outside the core S/4HANA product, using Application Programming Interfaces (APIs) to link to complementary cloud solutions. So where do partners fit into this new model? Embracing new skills One significant impact will be felt by consultants who specialise in providing more traditional expertise, offering services in scoping and implementation and outlining business-specific requirements. These services are no longer necessary, having been replaced with SAP's Fit to Standard workshops, which negate the need for custom development specifications and gap analyses where the SAP ERP solution does not support a specific customer need. The challenge for partners now is convincing prospective customers of the benefits of a standardised best practice approach, and emphasising that, in a standard cloud-based deployment, customisation should be reserved for the areas where customers genuinely differentiate themselves from competitors. Because of this, consultants need to learn new skills. They need to learn to assess a customer's processes, identify those areas of differentiation that would justify the development of extensible solutions and work with department heads to achieve the change management required to match the SAP standard process.
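To illustrate the 'adopt not adapt' extensibility model mentioned earlier, the sketch below shows an external application reading data through an OData-style API rather than customising the core. The base URL, entity set and credentials are placeholders for this example, not real SAP endpoints.

```python
import requests  # third-party HTTP library

# Placeholder values: substitute a real tenant URL, service name and credentials
BASE_URL = "https://example-tenant.example.com/odata/API_EXAMPLE_SRV"
AUTH = ("api_user", "api_password")

def fetch_entities(entity_set, top=10):
    """Read a small page of records from a hypothetical OData entity set."""
    response = requests.get(
        f"{BASE_URL}/{entity_set}",
        params={"$top": top, "$format": "json"},
        auth=AUTH,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

# Example call against the placeholder service (commented out because the URL is not real)
# records = fetch_entities("A_ExampleEntity")
```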
Partners must take their ecosystem of consultants, whose extensive skill sets are built on identifying problems, writing development specifications and managing project delivery, and help them make the transition to this new approach. They need to dedicate time and resources to moving customers towards SAP's new cloud mindset, and to learning new management skills. Providing Support and Enabling Delivery The new skill set requirements extend far beyond the initial consultation stage. RISE with SAP is delivered using SAP's Activate Methodology, which has been updated to support the implementation of this standard cloud project. This again requires that Project Managers learn a new set of skills. From provisioning systems to testing, connectivity to networks and configuring interfaces, every request has to go via SAP. For Project Managers who prefer to work internally with their own teams on these processes, it will take time to get to grips with SAP timelines, processes and people. For example, SAP may insist on providing a week's notice before connectivity is turned on, which is something that could be achieved within hours if working internally. If the Project Manager is not familiar with these processes, the entire project could become rapidly derailed. In essence, this new approach and mindset from SAP combines a more modern standardisation method with a more old-fashioned service request system, over which partners have no control. It also has implications for where SAP's influence exists and where it doesn't, which muddies the waters when determining which areas of the service will incur an extra cost and which areas do not fall under the remit of SAP. The new skill set is not limited to implementation – the same issues arise during ongoing support. From system patches to updates, it is vital to ensure changes fit in with business timelines - avoiding month ends, for example. Despite not being in control of these processes, partners still have a key role to play in liaising with customers when an update is set to occur. The key to avoiding the increases in cloud expenses that have impacted organisations in the past is the availability of a service that can organise downtime, alert any affected business areas, handle change control, and oversee testing. Conclusion SAP recognises that significant changes in skill sets and processes are required to facilitate this new generation of cloud solutions, and it is investing in supporting its partners. But partners themselves will have to buy into this new cloud mindset and meet SAP halfway if RISE with SAP is to be deployed successfully. Partners can no longer rely on the same on-premise product that they have become familiar with over the last 20 years and set it up in the same way across the board. Every partner must now collaborate closely with SAP, use the company's methodology, embrace the lessons learned and work with the customer success teams. This is fundamentally changing every aspect of the SAP partner role, and it is something that took some partners by surprise - especially those that did not expect RISE with SAP to take off in the first place. How many partners have proactively recognised and documented the new support and maintenance model to ensure customers understand the changing roles of suppliers and partners in this new cloud mindset? How many have been through their first SAP Activate project and now understand SAP's processes and timelines?
Critically, how many are genuinely committed to creating and embracing a new cloud mindset, and the skill sets that go with it, to support staff and enable a smooth transition to this new model? Ultimately, the success of each customer's implementation is now inextricably linked to the speed with which partners adopt and embrace the new cloud mindset. ### Optimising multi-cloud environments: The workload-first approach Over the past few years, we've witnessed a shift towards multi-cloud infrastructures as companies scrambled to transition to digital platforms amidst the disruptive conditions brought about by the pandemic, in an attempt to survive and compete. However, whilst multi-cloud is often employed to capitalise on specific domain expertise, this has led to a landscape marked by suboptimal performance that impacts user experience, hefty expenditures, and complex setups. These setups, in many instances, have trapped businesses in restrictive contracts, hindering their capacity to fully optimise their cloud environments. It's vital to make the cloud work smarter, not harder, for you. While this is easier said than done, businesses need to be guided by the needs of their workloads and data, rather than the capabilities of their cloud service provider. The solution to these issues lies in putting the needs of your workload first. With a little more breathing room than at the height of the pandemic, businesses are now re-evaluating their cloud deployments and looking for the most effective ways to match their specific workload requirements to the ideal cloud provider. A workload-first approach might not be the most straightforward initially, but in the long run, it allows businesses to prioritise their needs above those of the providers, thereby maximising the utility and efficiency of their cloud investments. Here's how to get started. Building a multi-cloud architecture with workloads in mind Cloud adoption is often motivated by an organisation's desire to innovate and drive cost reductions, as well as reap the agility and scalability benefits. While these benefits are real, many organisations have discovered that without the right architecture and governance, the desired cost reductions fail to materialise. Thus, businesses must ensure that their cloud setup is designed with their specific business needs in mind and align the right workload to the right cloud platform based on which provider is best suited to meet those needs. Instead of arbitrarily deploying workloads to any available cloud, this means auditing workloads and understanding what they need to run most effectively. Some workloads are more intensive than others and may require two or more different platforms to cater to their needs. Others would benefit from being closer to the end users they serve. This workload-first approach will enable organisations to benefit from a breadth of providers serving multiple needs without worrying about complex processes or high costs. As IT decision-makers weigh up their options, they should also ensure that their chosen cloud offering can provide flexibility, with services that can be modified according to a company's business needs and with the organisation charged only for what it eventually consumes. The right provider will offer businesses the flexibility to choose the most optimal workloads to host with them as part of a multi-cloud environment.
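As a minimal sketch of what a workload-first audit could look like in practice, the snippet below scores generic providers against each workload's stated needs. Provider names, capability scores and weights are all invented for illustration; a real audit would use measured performance and pricing data.

```python
# Invented capability scores (0 to 1) for three generic providers
PROVIDERS = {
    "provider_a": {"latency": 0.9, "data_processing": 0.6, "cost": 0.5},
    "provider_b": {"latency": 0.5, "data_processing": 0.9, "cost": 0.7},
    "provider_c": {"latency": 0.7, "data_processing": 0.7, "cost": 0.9},
}

def best_provider(workload_needs):
    """Pick the provider whose capabilities best match the workload's weighted needs."""
    def score(capabilities):
        return sum(weight * capabilities.get(need, 0)
                   for need, weight in workload_needs.items())
    return max(PROVIDERS, key=lambda name: score(PROVIDERS[name]))

# A latency-sensitive, cost-conscious workload (weights are illustrative)
print(best_provider({"latency": 0.6, "cost": 0.4}))
```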
By building your multi-cloud architecture with these kinds of insights in mind, businesses enable their cloud costs to be determined by their workload needs. Cloud providers should then be motivated to tailor their services to their needs - meaning businesses are only paying for features such as portability, data processing, and disaster recovery when needed.  The importance of developer-focused simplicity  By adopting a workload-centric approach to multi-cloud environments, IT leaders are unlocking powerful opportunities and experiences for the wider business. For instance, the developer experience is fast being recognised as a key metric for success. Multi-cloud platforms are most valuable when they’re built to suit developers’ needs by enabling them to choose their preferred tools and providers, thereby enhancing productivity. This strategic decision enables businesses to regain control over their digital direction and steer their companies towards a future of smarter, more efficient cloud usage. Scale, performance, and security  It goes without saying that security is a vital component of any multi-cloud setup and must be a priority from the very beginning. As cybersecurity threats continue to plague all businesses, the cloud presents another in-road for bad actors to exploit organisations’ weaknesses within their cloud infrastructure. Thus, IT teams are needing to rise to the challenge of meeting a large and growing number of requirements, from application performance to infrastructure scaling to security.  Given the increasing complexity of cyber threats, IT leaders are recognising the need to place stringent security measures at the heart of their cloud strategy. Choosing cloud providers who truly understand all elements of cybersecurity becomes imperative, and those who include these measures as standard are likely to prevail. Such a proactive approach helps businesses mitigate risks associated with data breaches and cyber-attacks, thereby fortifying their digital infrastructures. Zero trust architecture operates on the principle of 'never trust, always verify'. Every access request is thoroughly authenticated, authorised, and encrypted before being granted, regardless of its origin within or outside the network. By adopting the workload-first approach, businesses have the flexibility to ensure that different types of data, such as that related to HR or payroll, are receiving the appropriate level of protection. Opting for incorrect levels of security can be both a costly and dangerous mistake. Thus, it is important to let the workload define the security requirements for our data, rather than the cloud provider.  Maximising value from the cloud  It’s clear the journey to effective multi-cloud management is not a sprint but a marathon. The rapid transition to the cloud during the pandemic may have been a necessary survival strategy, but businesses now have the opportunity – and the responsibility – to ensure that their cloud strategies are aligned with their workloads and completely secure. ### 6 Key Practices To Achieve High Reliability In Cloud Computing Achieving high reliability in cloud computing is critical for modern businesses to keep up with user demands and maintain a competitive edge. This article outlines some essential practices to help establish and maintain a highly reliable cloud environment.  
These practices include selecting the right cloud provider, continuous monitoring, implementing redundancy, embracing automation, planning for disaster recovery, and optimising resource usage. By following these key practices and employing a proactive approach to managing your cloud infrastructure, you can ensure optimal reliability of your applications backed by efficient use of resources and investment allocation — all while minimising potential risks associated with downtime or data loss. As cloud computing becomes increasingly integral to modern business operations, achieving and maintaining high reliability within your cloud infrastructure is essential for meeting user expectations, mitigating risks, and optimising performance. With countless factors to consider, from selecting the right provider to handling potential outages, it's crucial to adopt a proactive approach that addresses various aspects of cloud management. In this article, we'll explore key practices that help you build and sustain a highly reliable cloud environment, ensuring your organisation continues to thrive in today's fast-paced digital landscape. Let's get started. 1. Choose the Right Cloud Provider The first key practice to achieve high reliability in cloud computing revolves around choosing the right cloud provider. This decision is crucial because it lays the foundation for your entire cloud experience, including performance, costs, and security. In essence, you need a reliable partner capable of meeting your specific requirements and industry standards. Begin by researching cloud providers that have a strong reputation for delivering exceptional services and uptime guarantees. Take the time to thoroughly examine their service level agreements (SLAs) to ensure they match your expectations regarding performance, availability, and support. Also, peruse customer reviews and case studies to obtain valuable insights into other users' experiences with the provider. Don't be afraid to compare multiple providers' offerings before landing on your final choice. Think about factors such as ease of use, integration capabilities, available features, and pricing models during your evaluation process. Ultimately, you want a provider that aligns well with your organisation's objectives while offering top-notch reliability for your cloud infrastructure so that you can focus on driving value for your business. 2. Monitor Continuously Staying vigilant and keeping an eye on your cloud resources can help you detect fluctuations in performance or potential issues before they escalate into full-blown problems. By using effective network monitoring tools, you can gather critical metrics and performance data, allowing you to make informed decisions about resource allocation or system adjustments. Monitoring tools come in various forms, from basic pre-packaged services offered by some cloud providers to sophisticated third-party applications that cater to specific needs like security, cost optimisation, or application-specific customisation. Regularly monitoring your cloud environment helps ensure optimal performance and uncovers areas that require improvement. In addition to tracking metrics, set up alerts for crucial events that warrant immediate attention, such as abnormal spikes in resource usage or potential security threats. Swift responses can mitigate the impact of these issues on your environment's overall reliability and minimise downtime.
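To illustrate the alerting idea just described, here is a small, provider-agnostic Python sketch that flags abnormal spikes against a rolling baseline. The metric values and the three-standard-deviation threshold are placeholders; real monitoring tools offer far richer detection.

```python
from statistics import mean, stdev

def spike_alerts(samples, threshold_sd=3.0, window=20):
    """Flag samples that sit more than `threshold_sd` standard deviations
    above the rolling baseline, a crude stand-in for a real monitoring tool."""
    alerts = []
    for i in range(window, len(samples)):
        baseline = samples[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma and samples[i] > mu + threshold_sd * sigma:
            alerts.append((i, samples[i]))
    return alerts

# Illustrative CPU utilisation series with one obvious spike at the end
cpu = [32, 35, 31, 34, 33, 36, 30, 34, 35, 33,
       32, 31, 36, 34, 33, 35, 32, 34, 33, 35, 92]
print(spike_alerts(cpu))
```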
By investing in ongoing monitoring efforts and harnessing the insights they provide, you'll be better equipped to maintain a highly reliable cloud infrastructure that consistently delivers on your objectives while avoiding unexpected setbacks. 3. Implement Redundancy Redundancy means having backup systems, components, or even entire architectures in place to safeguard your cloud environment from downtime and failures. In simple terms, it's like having a safety net that ensures your applications continue running smoothly, even if one part of the system encounters issues. When building redundancy into your cloud infrastructure, consider multiple layers of protection. For instance, you can integrate load balancers to distribute traffic evenly across resources or employ distributed architectures that automatically reroute tasks should one component fail. It's important to remember that designing for redundancy might involve additional costs and considerations like data synchronisation and latency. However, the benefits of maintaining uptime and providing uninterrupted service to users far outweigh these initial investments. By incorporating redundancy into your cloud strategy, you ensure continuous operation under diverse circumstances and ultimately bolster the reliability of your applications within the cloud environment. 4. Embrace Automation Automation plays a vital role in streamlining administrative tasks and reducing the potential for human errors, particularly within complex cloud environments where multiple configurations and moving parts are at play. By automating routine processes such as backups, deployments, scaling, or even security measures like patching, you not only improve efficiency but also ensure these tasks are performed consistently and accurately. This consistency helps maintain a stable cloud infrastructure that remains resilient to fluctuations in workload or resource demands. To implement automation effectively, consider investing in appropriate tools and platforms tailored to your specific requirements. Many cloud providers offer integrated automation services, or you can explore third-party options for more specialised needs. Incorporating automation into your cloud computing strategy provides significant benefits, from minimising human error and increasing overall performance to maintaining a reliable infrastructure that stands up against unexpected changes or issues. 5. Plan for Disaster Recovery Despite best efforts to maintain a stable and secure cloud environment, unforeseen outages or issues can still occur. This is where a robust disaster recovery plan comes into play, providing a blueprint for how your organisation will respond and recover if faced with unexpected disruptions. Developing an effective disaster recovery plan involves identifying potential risks, prioritising critical resources and applications, defining backup and restore procedures, and assigning responsibilities to relevant team members. Remember that communication is crucial during this process. Once you've established your disaster recovery plan, don't let it collect dust on the shelf — carry out regular tests to ensure its effectiveness and update it as needed based on any changes within your cloud infrastructure or organisation. By actively addressing potential threats and preparing your response strategy ahead of time, you minimise downtime in the face of adversity while maintaining the resilience and reliability of your cloud computing services.
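To ground the backup side of disaster recovery, here is a minimal sketch that assumes an AWS environment with the boto3 SDK and configured credentials. The volume ID is a placeholder, and a real plan would also cover restore testing, retention policies and cross-region copies.

```python
import boto3  # AWS SDK for Python; assumes credentials and region are configured

ec2 = boto3.client("ec2")

def snapshot_volume(volume_id, reason):
    """Create an EBS snapshot as one automated step in a disaster recovery plan."""
    response = ec2.create_snapshot(
        VolumeId=volume_id,
        Description=f"Scheduled DR snapshot: {reason}",
    )
    return response["SnapshotId"]

# Placeholder volume ID; replace with a real one before running
# snapshot_id = snapshot_volume("vol-0123456789abcdef0", "weekly DR test")
```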
6. Optimise Resource Usage Ensuring that your applications make full use of available cloud resources without overspending or underutilising capacity aids in maintaining efficiency, performance, and cost-effectiveness within your environment. Begin by regularly reviewing instance types, memory allocation, and storage options to confirm that they align with your current requirements. Take advantage of scaling features offered by cloud providers to adjust resources based on demand automatically. This way, you can avoid over-provisioning services like CPU and memory while still meeting user expectations. Additionally, consider adopting a pay-as-you-go model or utilising reserved instances if they suit your organisation's needs. These flexible pricing models allow you to optimise costs based on actual usage rather than allocating unnecessary resources upfront. By keeping a close eye on resource consumption trends and making adjustments as required, you can maintain reliability in the cloud and achieve better overall performance while minimising waste and potential inefficiencies. Conclusion Now that you're equipped with these key practices for achieving high reliability in cloud computing, begin by assessing your current cloud environment, and identify any areas that could benefit from these insights. Remember, adopting a proactive approach and regularly reviewing your infrastructure ensures its continued stability and optimisation. So go ahead and make reliability a top priority in your cloud strategy — your organisation, users, and IT team will all appreciate the benefits in the long run. And don't hesitate to share your experience or any additional tips you've discovered along the way with fellow professionals navigating the cloud landscape. Together we can build a more reliable and efficient future for our digital endeavours! ### What can businesses do to prevent easy email mistakes? It's no secret that threats to email security are on the rise globally. According to a recent survey, 92% of organisations were victims of successful phishing attacks in 2022, while 91% of the respondents admitted to experiencing email data loss. When companies fail to implement sufficient email security strategies, they expose themselves, their clients, and their customers to cyber security incidents such as phishing, data breaches, and business email compromise (BEC). It's not just external cyber threats that businesses need to be mindful of; the human element also needs careful consideration. With so many email-related incidents resulting in data loss, the question arises of how businesses can do more to prevent these events. Oliver Paterson, Director Product Management, VIPRE Security Group, explores more. Emailing the wrong person Since the pandemic, hybrid working has increased and the traditional single office-based computer setup is becoming less common within businesses. As employees are under pressure to work harder, better, and faster, it is easy to understand why they do not always verify the validity of the email address they are sending information to, especially in an age when smarter technology like autofill in Outlook is advancing rapidly. But, while it might just seem like an innocent mistake, it could have huge consequences. For example, that was the case with a university in the UK, where the personal medical details of a student were wrongly sent to the whole campus.
Or when Australia's Registered Organisations Commission (ROC) accidentally leaked confidential information, including a whistleblower's identity. An employee entered an incorrect character when emailing someone with the same last name but a different first initial. It only takes one incorrect character or autocorrect taking over for sensitive information to land in the wrong inbox. And what if that recipient is a competitor, or the message is intercepted by a cyber criminal? The risk of email attachments Another common user error is sending the wrong attachment to the wrong person. This could put company data at risk. If confidential corporate information is released to the wrong person or into the public domain, it can result in a major advantage for the competition or even cause irreparable damage to the company's reputation. In addition, organisations now face severe consequences for violating data protection regulations, including GDPR and other industry-specific regulations. Data loss awareness tools that improve email security can help businesses ensure that their intended distribution list is correct by prompting employees to confirm all internal and external recipients, and by flagging attachments that contain confidential information. For example, Surrey County Council was served with a penalty of £120,000 after three data breaches that involved misdirected emails. This included a staff member sending an email with the personal data of 241 individuals to the wrong email address. The information was not encrypted, so it was instantly accessible to the recipient and constituted a direct breach of data protection regulations. BCC or not Adding email recipients is a task that may seem simple, but if not done correctly, it can have devastating repercussions for businesses. The misuse of CC and BCC functions could expose your entire contact database, revealing customer email addresses to potential hackers or competitors. In March 2023, NHS Highland was reprimanded for a data breach which revealed the personal email addresses of people invited to use HIV services. Such a mistake is a common error when sending emails, and one that often goes undetected or unreported. However, it is considered a data breach because none of the involved parties consented to sharing their contact details with others. On the technology side, companies should look to implement solutions that warn and educate people to use the CC and BCC fields properly. However, the problem extends beyond BCC and CC misuse; companies should treat the risks of sending information as much broader. It is imperative for businesses with sensitive information to be aware of the security risks posed by autocomplete, reply all, and mistakes when adding attachments. Data breach – accident or intent? More than 300 billion emails are sent each day, so it's no surprise that misaddressed emails are the largest source of data loss for organisations. Hackers can take advantage of complacency within email culture with a number of techniques. For example, sending emails that appear to be internal, but are actually messages from a spoofed domain that looks almost identical to the real one. Due to the large volume of emails sent every day and the speed at which they are sent, employees may not notice this and fall victim to a malware or ransomware attack, exposing sensitive information and the network. On the other end of the scale are data breaches conducted with malicious intent.
For example, the Morrisons insider threat breach was carried out by a disgruntled former employee who stole and published payroll data of nearly 100,000 staff members online. His aim was to damage the reputation of his former employer after a disciplinary matter. The breach reportedly cost the company £2 million to rectify. Since emails make up a large part of our professional communication, especially when working remotely, it's important to be aware of and educated about the common email errors that occur. Businesses can support their employees and reduce the risk of a data breach by implementing intuitive technology that detects and highlights potential errors and threats. Organisations can quickly reduce errors by implementing technology that warns users about poor email security practices and prompts them to recheck a message before sending it, all without affecting employee productivity. These solutions can prevent organisations from revealing the wrong information to the wrong person by allowing a quick double-check of the recipients of emails and attachments before they are sent. Conclusion While foresight is essential, so is the ability to prepare a smart defence. Businesses can implement best practices to protect themselves from email threats and avoid becoming the next easy target. These best practices include implementing a layered email security strategy, training employees for better security awareness, and deploying email-specific security controls. The email safeguards businesses can implement today will have a broader and more lasting impact as the organisation grows. When implementing these best practices, it's essential to partner with the right email security vendor to ensure the company's email security solutions are tailored to the company's size and can scale with the business's growth. ### Navigating the Energy Crisis with Efficient Cloud Infrastructures By now, businesses everywhere will have felt the impact of energy prices that have risen significantly since the end of 2021 – a trend given massive impetus just a few months later when Russia invaded Ukraine. In the 12 months to February 2023, for example, UK electricity prices went up by two-thirds, with gas prices reaching even more alarming levels, rising by almost 130%. Despite better news recently, energy prices are still much higher than levels seen historically. As one expert put it, the latest wholesale price reductions have "taken us down from six times normal pricing to double normal pricing." This forms just part of an extremely challenging economic and business backdrop that has seen organisations facing issues ranging from volatile trading conditions on the one hand to a severe lack of qualified staff on the other. For many, however, the financial burden created by the unprecedented rise in energy prices has given the most cause for concern. Indeed, faced with little scope for reducing costs by switching suppliers, organisations are instead examining the options available from more energy-efficient technologies to reduce consumption for the short and long term. In many of today's digitally-centric businesses, the obvious place to look is within their power-hungry data centres and, in particular, how cloud-related energy costs can be more effectively managed.
But what is the scope for making energy-related savings across the typical cloud infrastructure strategy, and how can IT and business leaders be sure that any choices they make to achieve savings in one area won’t impact performance elsewhere? Workload optimisation For many organisations, the ideal starting point is to look closely at their data hosting costs. One useful option is to focus on economies of scale by moving more workloads to private or collocated data centres. While the issue of migration costs must always be factored in, this approach can be among the easier and cheaper options available and is often described by IT teams as a ‘lift and shift’ approach. Another option is to look at the options available by moving to the public cloud. In this context, data migration can be more complex (and therefore more costly), but the scope organisations have to scale resources up and down according to need can deliver some useful financial benefits over time. For businesses whose requirements are more seasonal, such as the travel and retail sectors, for example, the flexibility to match business peaks and troughs with the use of IT resources in the public cloud can be a game-changer from a cost perspective. This becomes particularly apparent when compared to hosting workloads all year round in more rigid on-premises setups with the associated high energy costs. In contrast, however, organisations that need constant access to data and technology resources might find that public cloud costs more. The important point is to consider the relative merits and energy costs of each option carefully because, for every requirement, there will be an optimum solution that balances energy efficiency with tech performance. The Role of Managed Service Providers  The same kind of issues apply to businesses that partner with managed service providers (MSPs) to deliver some or all of their cloud-based infrastructure requirements. Good MSPs are fully aware of the financial pressure their customers are currently facing and will work hard to demonstrate a compelling ROI, including the costs associated with energy consumption. But it’s healthy to consider how flexible the MSP’s model is when issues such as energy price fluctuations are in play. How quickly do they pass price reductions on to customers, for example? Don’t forget, choosing an MSP should not be about which is the cheapest. Decisions about who to partner with should also be based on their ability to optimise the services hosted on behalf of the customer, their Service Level Agreements, and the costs associated with scaling resources up and down (energy included). Leaving a Legacy What’s sometimes overlooked in energy efficiency discussions is the impact of legacy technology. Outdated data centre equipment is not just more likely to be power-hungry, but it also leaves organisations significantly more vulnerable to the risks of security breaches. Many cybercriminals, for instance, rely on the security gaps left open when organisations fail to keep technology up to date or prioritise other spending decisions in an effort to control costs in general. In this scenario, migrating to the cloud or a colocation environment can deliver a win-win whereby security can be delivered by an experienced and well-resourced MSP and, at the same time, inefficient legacy technology can be recycled in favour of much more modern and cost-effective technologies. 
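A back-of-the-envelope illustration of the peaks-and-troughs point made above: comparing an always-on estate sized for peak season with elastic capacity that scales down for the rest of the year. Every figure here, server counts, wattage and PUE alike, is invented for the example.

```python
def annual_kwh(servers, watts_per_server=400, pue=1.8):
    """Rough annual energy use including cooling overhead (PUE); figures are illustrative."""
    return servers * watts_per_server * pue * 24 * 365 / 1000

# Invented scenario: 20 servers sized for a three-month peak
always_on = annual_kwh(20)
elastic = annual_kwh(20) * (3 / 12) + annual_kwh(6) * (9 / 12)  # scale down to 6 off-peak

print(f"Always-on: {always_on:,.0f} kWh per year, elastic: {elastic:,.0f} kWh per year")
```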
Focusing on Sustainability While it's understandable that many businesses are focused on short and medium-term cost control, there are also some bigger, long-term issues that every organisation should be focusing on, particularly sustainability and the environment. As a major energy user, the cloud industry worldwide is under the spotlight to improve its efficiency significantly. According to data from Synergy Research Group, there were at least 700 hyperscale data centres operating worldwide in 2021, with the number continuing to "grow at an impressive pace." As demand for cloud-based services continues to rise, so does the requirement for always-on energy consumption for processing and cooling, bringing with it the associated increases in CO2 production if power isn't generated by renewable sources. To address these valid concerns, leading cloud operators have embarked on major investment initiatives in order to reach net zero targets and ensure the industry and the modern economies that rely on it have a long-term future. On the Google homepage, for example, the company claims to have been carbon neutral since 2007 and goes further to say that its ". . . 24/7 carbon-free energy programme aims to eliminate emissions associated with our operational electricity use and run on clean energy every hour of every day by 2030." This isn't just about corporate responsibility and doing what's right – energy efficiency is increasingly a commercial imperative, with more enterprises using it to inform their cloud investment decisions. Looking ahead, those organisations that fail to deliver on their environmental obligations will be squeezed out of the market by those who have looked long-term and made the right decisions for the benefit of everyone. ### Ethics in Artificial Intelligence Artificial Intelligence (AI) is changing the way we live. It is transforming diagnosis and treatment throughout healthcare, improving patient outcomes. Through robotics, it is unleashing new productivity and quality in manufacturing, accelerating drug discovery, and improving road safety. As emerging technologies like ChatGPT have become more widely adopted by individuals, the ethical and political implications of AI adoption are becoming increasingly important. The benefits of AI are undoubtedly numerous, but if algorithms do not adhere to ethical guidelines, is it safe to trust and use their outputs? If the results are not ethical, or if a business has no way to ascertain whether the results are ethical, where is the trust? Where is the value? And how big is the risk? A collaborative effort to design, implement, and refine ethical AI is more effective when it adheres to a number of ethical principles, including individual rights, privacy, non-discrimination, and non-manipulation. Zizo CEO Peter Ruffley discusses the ethical issues surrounding AI, trust, and the importance of having robust data sets to support bias checking for organisations on the verge of an enormous shift in technology. Unlocking Pandora's Box Calls from technology leaders for the industry to hold fire on the development of AI are too late. With the advent of ChatGPT, everyone is now playing with AI - and individual employee adoption is outpacing businesses' ability to adapt. There is no way to tell whether work is done by people or by machines today, and managers are unaware if employees are using AI.
And with employees now claiming to use these tools to work multiple full-time jobs, because they allow work such as content creation and coding to be completed in half the time, companies need to get a handle on AI policies fast. Leaving aside the ethical issues raised by individuals potentially defrauding their employers by not dedicating their full time to the job, the current ChatGPT output may not pose a huge risk. Chatbot-created emails and marketing copy should still be subject to the same levels of rigour and approval as manual content. But this is the tip of a fast-expanding iceberg. These tools are developing at a phenomenal pace, creating new, unconsidered risks every day. It is possible to get a chatbot to write Excel rules, for example, but with no way to demonstrate what rules have been used or data changed, can that data be trusted? With employees tending to hide their use of AI from employers, corporations are completely blind to the fast-evolving business risk. This is just the start. What happens when an engineer asks ChatGPT to compile the list of safety tasks? Or a lawyer uses the tool to check case law prior to providing a client opinion? The potential for disaster is unlimited. Risk Recognition ChatGPT is just one side of the corporate AI story. A growing number of businesses are also embracing the power of AI and Machine Learning (ML) to automate health care and insurance processes. As a result, the rate of AI adoption by UK businesses is expected to reach 22.7% of companies by 2025, with a third of UK businesses expecting to have at least one AI tool by 2040, according to research commissioned by the Department for Digital, Culture, Media and Sport (DCMS). These technologies are hugely exciting: AI and deep learning solutions have demonstrated outstanding performance across a wide range of fields, including healthcare, education, fraud prevention, and autonomous vehicles. But – and it is a large but – can businesses trust these decisions when there is no way to understand how the AI drew its conclusions? Where are the rigorous checks for accuracy, bias, privacy and reliability? To fully realise AI's potential, tools need to be robust, safe, resilient to attack, and, critically, they must provide some form of audit trail to demonstrate how conclusions and decisions were made. A Trusted Relationship Requires Proof Without this ability to 'show your workings', companies face a legal and corporate social responsibility (CSR) nightmare. What happens if bias and discrimination become embedded in decision-making as a result of algorithms that operate against the organisation's diversity, equality, and inclusion strategy? The Cambridge Analytica scandal highlighted the urgent need for AI-related regulation, and the power of AI has since continued its frenetic evolution without any robust regulatory or governance steps being put in place. As opposed to calling for an unachievable slowdown in AI development, data experts must now work together in order to mitigate risks and enable effective, trusted use of these technologies. It is incumbent upon data experts to develop technology to support the safe and ethical operational use of AI. Data governance and data quality procedures must be in place to ensure that both the data used and the output of AI and ML activities are accurate and accessible.
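One concrete form the bias checks mentioned above can take is a simple demographic-parity comparison of model outcomes across groups. The sample data and the idea of flagging a gap against an agreed threshold are illustrative only; real bias auditing involves many more measures.

```python
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs; returns approval rate per group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def parity_gap(decisions):
    """Largest difference in approval rate between any two groups."""
    rates = approval_rates(decisions)
    return max(rates.values()) - min(rates.values())

# Illustrative model outputs: (group, approved)
sample = [("a", True), ("a", True), ("a", False),
          ("b", True), ("b", False), ("b", False)]
print(f"Demographic parity gap: {parity_gap(sample):.2f}")  # flag for review above an agreed threshold
```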
Collaboration Providing essential transparency throughout the AI production pipeline requires the development of trustworthy components to enable businesses to understand how AI reached its conclusions, what sources were used, and why. Such 'AI checking' technology must also be inherently usable, requiring a simple data governance and risk monitoring framework that could alert users to bias, discrimination, or questionable sources of data, as well as allow the entire AI process to be reviewed if needed. By creating a simple tool that bridges the gap between domain experts and AI experts, companies will be able to understand and trust the AI system more easily, allowing them to embrace AI confidently. To expand the data available and increase the context and accuracy of Internet-only information, there is a global need for collaboration and data sharing, both within and between organisations. As a result of this collaboration, we'll be able to counter AI-generated bias and discrimination, and combine that with AI's "explainability" to provide organisations with the tangible business value they need. Conclusion These changes must, of course, take place while AI continues its extraordinary pace of innovation. Even with collaboration and trust-enabling technology on the agenda, the next few years will not be without risk. Mismanagement of AI usage at both the employee and corporate levels may lead to large-scale corporate failures. In this regard, organisations must now develop robust strategies to safely manage AI adoption and usage, with a strong focus on CSR and corporate risk. By adopting an ethical approach to AI, some organisations will, therefore, not progress as fast as others that rush headlong into AI and ML, but they will be safeguarding their stakeholders and, quite possibly, protecting their business's future. ### Dashing across AI's highway to the danger zone Reflecting on the growing number of industry conversations around artificial intelligence (AI), particularly in the software and service sector, I believe we are witnessing a significant shift. The accessibility of AI has made it easier than ever for companies to integrate AI capabilities into their software. But I wouldn't rely on ChatGPT to help me cross a busy road in 2023. Would you? With a cloud-first and API-first approach becoming the norm, numerous organisations have embraced AI to enhance their offerings. The benefits are clear, as AI can save a tremendous amount of time and perform tasks far more efficiently than humans. Personally, I find myself utilising AI on a daily basis for various purposes, from generating speech texts to creating visuals for presentations. There is no doubt that AI is here to stay and can be a positive force for good, provided it's handled with care. The untapped potential of AI-enhanced SaaS services is impossible to ignore, as the financials involved are mind-bending sums of money when it comes to investment and revenue. The global SaaS market is valued at $197 billion this year and is estimated to reach $232 billion by 2024. Meanwhile, artificial intelligence as a sector already commands a global value of around $207 billion in 2023, and this is expected to race to almost $300 billion next year. And by the end of this decade, the global valuation of the AI sector is expected to be closing in on an incredible two trillion dollars.
To give some context, that 2030 valuation of the AI industry compares with today’s market capitalisation of the entire global information services sector. Who knows what unexpected developments may occur in the next seven years that could make even those eye-watering figures look naive when we’re looking at them in the rear-view mirror? False promises However, there is a growing concern that comes hand in hand with the rise of generative AI, which is the issue of confabulation or hallucinations. These events involve the generation of content that is remarkably convincing, but completely false. As a result, we are faced with the challenge of redefining the concept of quality within the AI landscape. While AI undoubtedly brings immense value, we must remain cautious when relying on it for critical decision-making processes. Like many others, I have come to realise that using AI for tasks like search or gathering factual information requires careful consideration due to the potential risks associated with confabulation and the rise of fake news. And the risk is even greater as the impact of AI extends beyond just the software industry. It has the potential to revolutionise education, how we access information, and the way we work. As more individuals begin to appreciate the benefits of data-driven decision-making, there is an increasing need to address the issues of confabulation and hallucination in this process.  This is where AI in analytics becomes crucial. While generative AI or large language models alone may not render data-driven decision-making obsolete, they do require additional layers of quality and explanation. Analytics plays a vital role in AI-generated decisions, providing metrics and context to substantiate their plausibility. By bringing AI-generated decisions out of the black box and incorporating traditional analytics, we can instill confidence and add a level of quality assurance to the decision-making process. Data dashboards as guardrails To address the risks associated with AI-generated content, dashboards and analytics serve as vital guardrails. Analytics offers valuable insights into the decision-making process of AI algorithms, enabling users to understand the rationale behind recommendations and the supporting metrics. By integrating analytics into AI-driven workflows, businesses can enhance the quality and reliability of generated content. This ensures decision-makers have a comprehensive understanding of the recommendations, enabling them to make informed choices based on accurate insights. Imagine standing at the edge of a busy road, contemplating whether to cross. In the past, when you saw the traffic, you had absolute certainty that what you perceived was real. But now, imagine discovering that you occasionally hallucinate, seeing cars that aren't actually there or missing ones that are. Would you still have the same confidence to step onto that road? I don't think so. This analogy perfectly illustrates the importance of analytics in the realm of AI. Analytics act as our guide, our way of ensuring that we have the most accurate view of the road ahead. Just as analytics can inform us when a car passes at a given moment, it boosts our confidence to trust analytics combined with generative AI to automate decisions. If we genuinely desire to automate processes, we must consider the chances of being run over by a car if we don't incorporate analytics into our decision-making. It becomes a matter of establishing a similar level of trust and ensuring our safety. 
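Purely as an illustration of the guardrail idea described above, the short sketch below only lets an AI-generated recommendation through for automation when supporting analytics back it up; anything else is routed to a person. The thresholds, metric names and Recommendation structure are hypothetical assumptions, not a reference design.

```python
# Illustrative guardrail: automate an AI recommendation only when analytics
# corroborate it; otherwise send it for human review. All values are assumptions.
from dataclasses import dataclass

@dataclass
class Recommendation:
    action: str               # e.g. "increase ad spend on campaign 42"
    model_confidence: float   # confidence reported by the generative model (0-1)

def guardrail_check(rec: Recommendation, metrics: dict,
                    min_confidence: float = 0.8,
                    min_sample_size: int = 500) -> str:
    """Return 'automate' only when the supporting metrics back the suggestion."""
    enough_evidence = metrics.get("sample_size", 0) >= min_sample_size
    positive_trend = metrics.get("conversion_uplift", 0.0) > 0
    if rec.model_confidence >= min_confidence and enough_evidence and positive_trend:
        return "automate"
    return "send to human review"

decision = guardrail_check(
    Recommendation(action="increase ad spend on campaign 42", model_confidence=0.9),
    metrics={"sample_size": 1200, "conversion_uplift": 0.04},
)
print(decision)  # "automate", because the analytics corroborate the suggestion
```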
The pace of change is relentless, much like the cars on the road. We can either stand by the side, overwhelmed by the speed at which things are moving, or we can choose to engage and keep up. However, we must acknowledge that there are times when we might be hallucinating, mistaking false realities for the truth. This is where analytics becomes our ally, helping us discern between what's real and what's not. It provides us with the most accurate view of the traffic, ensuring we don't get blindsided by a truck we didn't even know was there. The dangers of speed The acceleration of business is undeniable, and AI, including large language models, contributes to this increased speed. As it speeds up our work, it simultaneously accelerates the work of everyone else. To keep up with this rapid pace, automation becomes essential. However, we must be cautious not to automate blindly, ignoring the potential for hallucinations and false information. By combining automation with analytics, we mitigate risks and navigate the increased speed with greater confidence.  Businesses must incorporate analytics as a safeguard against such risks. Analytics empowers decision-makers to question and verify AI-generated outputs, facilitating sound decision-making based on reliable insights. The integration of AI into the SaaS industry presents both opportunities and challenges. While AI offers remarkable efficiency gains, it demands cautious implementation. By embracing data dashboards and analytics as essential components of AI-driven workflows, businesses can navigate the challenges posed by confabulated content and misleading outcomes.  Ultimately, the synergy between human expertise, AI capabilities, and analytics leads to informed decision-making and propels businesses towards sustainable success in the era of AI. But we must all remember to look both ways while we’re crossing the particularly busy road ahead. ### Instilling data-centricity through the right company culture How well an organisation puts its data into action defines its success in the hyper digital economy. For all its business benefits, however, data is growing in volume - sourced from a vast array of silos and endpoints – and can create internal obstacles which can hamper productivity if left in its raw state. A single source of truth can be harder to attain if a complex cloud environment results in siloed departments.  In our survey, nearly 40% of data professionals admit they don’t fully grasp how data is being used in their organisations. Compounding the issue, 44% find it hard to manage the diversity of data they are dealing with. All factors considered, data professionals are facing more barriers to making data available in a state that is useful for analytics than they used to.  It goes to show that, even for those businesses with the right tools for extracting, transforming and loading data, a strong foundation must support the process if it is to be truly effective. This foundation comes in the form of an internal data culture that is felt and appreciated across the organisation. The management of data and metadata embedded within a business needs to be communicated in a way for employees to effectively understand and use data in their roles.  It ultimately comes down to how data is turned into something employees can use – including those belonging to the less technical parts of the business, not solely the traditional data team. 
All departments should be encouraged to buy into modern data architecture: the kind that uses the latest technologies and aims to empower business users to make the most of the data they already have at their disposal. Leaders can take three actionable steps to nurture a strong data culture in their business. First, they should create a common language across teams to prevent misunderstandings. Then, they should make clear to all teams the impact data can have on the business's bottom line. Finally, leaders should showcase the value that users, beyond the IT team, get from using data. Let me explain how this process can help to unleash the power of data for all. All teams on the same page Regardless of which roles team members have within an organisation, it is essential to have a shared understanding of where data comes from and how it is being used. Having a centralised data catalogue accessible to anyone across the business helps democratise these trusted datasets and extend their reach further, especially when a significant number of employees are using such datasets on a regular basis. With this in mind, the first step to developing a data-centric mindset is to switch from a process-oriented mentality to a data-oriented mentality. This is a step change focused on prioritising data over operations while at the same time granting teams access to data that will improve their decision-making. It is important to make a proactive effort to explain to employees how data-centricity will impact their roles and why it is important to use data from multiple, trusted sources. Growing and maintaining the data hype The idea of ‘data culture’ helps to dismiss the myth that only data professionals, those that manage the demand for refined data, can handle and understand it. This mindset is particularly unhelpful given the skills shortage affecting data roles - relying solely on them for data production will inevitably create bottlenecks. In fact, most roles (if not all) within a business can generate and act on useful information in their own way. For example, a marketing executive might be interested in audience behaviour insights, as these will amplify team efforts, as well as help meet customer needs more closely and deliver tailored experiences. However, it will always remain the senior leadership team's duty to make more optimal, data-driven choices that will have a measurable impact. Generative AI is already helping to inform this; summarising, categorising and presenting data so the decision maker can be more informed and productive. Increasing visibility on the impact data can have and how it can help teams level up their work is an easy way of raising awareness within the organisation. Business leaders can do so by openly talking about how data analytics is adding great value to all points of the value chain, not just reporting teams. Delivering informed business decisions From making processes more cost-efficient through to hastening fulfilment times, data can be the source of real positive impacts for customers. However, its biggest advantage for companies is the ability to inform decisions that are based on evidence, rather than just the instinct of a small group. It has been proven that companies that adopt this model are much more likely to gain a competitive advantage. Dealing with different data sources is quite complex and it can be detrimental to productivity and the speed at which data moves within a company.
But it’s not the only overwhelming process: think about how teams feel when it comes to meeting their deliverables too. The smallest breakdown in the value chain caused by resource pressure can have knock-on repercussions on the bottom line. Teams can prioritise different items for the end user and work more efficiently if they learn to do so in smaller increments. Minor improvements in the process might seem trivial, but even the slightest change can help provide enhanced value to customers, plus it can also show what an organisation can achieve. All of this adds to wider company efforts to build a cohesive data-driven culture. Shifting to a strong data culture In a bid to make their own data more useful and usable, organisations should first focus on their own team. The whole company has to buy into the idea and have an overall understanding of the benefits data can bring, in order to overcome barriers in communication and start appreciating the advantages it can unlock. Teams don’t automatically become more productive just because their company has migrated to the cloud. In addition to updating the software and hardware and integrating tech into a business, there has to be a mindset change in how businesses think about data. Shifts in data culture and literacy have to be prioritised too. Increasing business value by implementing a data-centric culture is a collaborative effort. ### Why the Fusion of ChatGPT and Knowledge Graphs Will Make All the Difference According to analysis by UBS, ChatGPT is the fastest-growing app of all time. The analysis estimates that ChatGPT had 100 million active users in January, only two months after its launch; TikTok took nine months to reach 100 million users. As you probably know, ChatGPT can do more than answer simple questions. It can compose essays, letters and emails, have philosophical conversations, and create simple code. The generative AI underpinning it allows it to create good-quality prose, art, and source code at the level of a well-informed person. It can even pass some very well-respected university exams, simply by predicting which words should come next. As you also probably know, even the well-informed individuals it has used to build its answers can be mistaken, and ChatGPT makes it hard to detect those errors due to the certainty of its tone and the complex reasoning of the underlying model. The bot itself says: "My responses are not intended to be taken as fact, and I always encourage people to verify any information they receive from me or any other source". OpenAI also notes that ChatGPT sometimes writes "plausible-sounding but incorrect or nonsensical answers" – its famous hallucinations. In systems where compliance or safety are important, we simply cannot take the chatbot at face value, nor can we ignore the potential benefits of generative AI. Is there a way to move forward with the technology that powers generative AI, and other LLMs (Large Language Models), in ways that cut down these false positives and the overconfidence that would otherwise erode trust in a chatbot's answers? Our response needs to blend the best of ChatGPT with other means of ensuring rigour and transparency. A conversational interface over all that complexity What I call a ‘small’ Large Language Model radically cuts down the kinds of errors that can occur with ChatGPT.
While this approach may limit the range of responses the LLM can generate, because it will typically have been trained on far less data than a general-purpose model consumes from the Internet in one large sweep, it also means that the responses it generates will be more reliable. Consider what Amazon could achieve with a small LLM incorporating all of its product documentation from its databases, loading it into ChatGPT, and offering customers a conversational interface over all that complexity. It’s important to note that you can’t achieve these outcomes simply by connecting ChatGPT with a document cache. If the CIO wants to start exploiting the untapped potential in their internal data stores by applying LLMs, then building and refining knowledge graphs using proven graph database technology is the way ahead. Here, a real breakthrough has been made by a group of researchers through the creation of BioCypher. This FAIR (findable, accessible, interoperable, reusable) framework transparently builds biomedical ‘knowledge graphs’ while preserving all the links back to the source data. And what made the difference was using a knowledge graph, built on a graph database, to organise data from multiple sources, capture information about entities of interest in a given domain, and create connections between them. Let’s see how. Going beyond generative AI’s current limitations with Small LLMs The team behind BioCypher accomplished precisely this. The team took a big corpus of medical research papers, built a ‘Small’ Large Language Model around it and derived a knowledge graph from this new model. This approach allows researchers to more effectively interrogate and work with a mass of previously unstructured data in a well-organised and well-structured way. And having this information resource in a knowledge graph means it is transparent, and the reasons for its answers are clear. And no hallucinations! There is nothing to stop you collecting a substantial amount of information in text form and running an LLM to do the natural language ingestion, too. That will give you a knowledge graph to help you make the most sense of your vital corporate knowledge. The reverse is also true. You can take control of the training of a small language model by feeding it a knowledge graph, as this would allow you to control the input to the model, resulting in a responsive, easy-to-interrogate natural language interface on top of your graph. Analyst James Governor, co-founder of RedMonk, agrees. He has recently said that “As it touches all business functions, from legal to accounting, customer service to marketing, operations to software delivery and management, Generative AI is beginning to remake industries. Enterprises are justifiably worried about the dangers of incorrect information or ‘hallucinations’ entering their information supply chains, and are looking for technical solutions to the problem: graph databases are one well-established technology that may have a role to play here, and so-called small language models are an interesting approach to the problem.” ### The future of cloud cost optimisation In today’s fast-paced and ever-changing business landscape, cloud computing is a vital part of many companies’ processes. According to a study by the International Data Group, 69% of businesses are already using cloud technology, while 18% say they plan to implement cloud-computing solutions at some point in the near future.
Indeed, Amazon Web Services [AWS], the world’s largest cloud platform provider, has approximately 1.45 billion users worldwide alone. Given its effectiveness in enabling firms to scale and adapt at speed, drive innovation, improve agility, streamline operations, and reduce costs, it is hardly surprising that so many organisations have taken up the technology, or intend to do so soon.  While cloud computing brings a range of benefits to organisations of all sizes, implementation is not without its challenges. Investing in the technology is one thing, but ensuring that it is providing as much value as possible is another altogether. Indeed, AI Multiple estimates that 30% of cloud spend is going to waste, while companies generally over budget nearly 24% for their cloud needs. It is clear, therefore, that a large number of organisations are causing themselves undue budgetary pressure through mismanagement of their cloud spend.  More businesses are now waking up to the advantages of cloud cost optimisation, but this in itself is creating a new challenge. With so many companies investing in the cloud, gaining a competitive edge through the technology is becoming increasingly difficult. This is prompting organisations to explore new ways of achieving cost optimisation and, in doing so, helping to shape the future of cloud computing – but what exactly might this look like? Greater uptake of cost optimisation tools As businesses seek to gain a competitive advantage through their investment in cloud computing, it is highly likely that we will see more businesses adopting cost optimisation tools to achieve this goal. With research showing that optimisation tools can help produce cloud savings of over 30%, using them can be something of a silver bullet for organisations eager to make their cloud services more cost effective.  Given that the International Monetary Fund [IMF] is forecasting that the UK will be the worst performing of all G7 nations this year, budgetary pressures on businesses are set to persist for the foreseeable future. As such, this will prompt a greater number of organisations than ever before to explore how they can save money while simultaneously getting the most out of their cloud budgets. Next-gen tools will enable optimisation With organisations increasingly incorporating multi-cloud and hybrid infrastructures, the threat of cost overrun and loss of control is growing exponentially. According to Gartner, 60% of infrastructure and operations leaders will encounter public cloud cost overruns that negatively impact their on-premises budgets through 2024. This is consistent with existing trends, given that the Flexera 2021 State of the Cloud Report found that 61% of respondents had significantly higher than planned costs during that year. To respond to the challenges presented by cost overrun, optimisation tools will evolve to enable businesses to fully optimise not only the entirety of their cloud environments, but also their future edge deployments. This will prepare them for taking full advantage of every potential saving, as well as performance-enhancing solutions and other innovations. Reliance on cloud will grow  Cloud adoption has grown at such an alarming rate, to the point where over half of all companies now run at least some of their workloads in the cloud. Platforms like AWS have grown exponentially, with the Bank of America anticipating that AWS will experience growth of 11% this year. 
With usage already so high, and with businesses continuously discovering the benefits, reliance on the technology as a means of remaining competitive will grow significantly. After all, Markets and Markets has predicted that the global cloud computing market will be worth $832.1 billion by 2025, while Gartner expects that enterprise cloud spending will make up 14% of IT revenue worldwide by 2024. With cloud set to expand ever further in the coming years, businesses’ reliance on the technology will grow significantly as they aim to stay competitive. This will accentuate the need to optimise cloud cost ever further, increasing the demand for managed cloud service providers who have the knowledge and skill to help companies get the very most from their investment. Cloud is here to stay With the cloud having revolutionised the way that businesses carry out many of their key processes, and with it set to continue to do so for the foreseeable future, firms must seek new ways to optimise their spend if they aim to benefit from the technology, and gain an advantage over their competitors. While next-generation tools are in development to respond to the growth of the market and the challenges this presents, there is much that organisations can gain from working with managed cloud service providers who can guide them through the optimisation process, and help them to truly unlock the potential that the cloud holds. ### You might believe you’re doing a great job but what do your clients think? Is doing a consistently exceptional job for your clients enough to make you a stand-out professional services firm? That might seem like an obtuse question and the answer may appear equally facetious – even unfair – because it is ‘not necessarily’. In today’s highly competitive commercial environment, every client demands an exceptional service. Clearly, they can’t all receive one because, by definition, for a provider, or group of providers, to be exceptional, there must be others who are not. There may be sectors in which everyone is providing an excellent level of service but most appear to include the gilded few, who win the most lucrative contracts, attract the best reviews, and receive the greatest number of referrals. What marks them out as ‘exceptional’ is not necessarily the quality of their work or the high level of their expertise. It may simply be that they are paying more attention to another element of business delivery that is attracting a growing amount of attention and importance – customer experience. The top 20 tennis players in the world may have roughly the same level of ability. What sets apart the grand slam winners from the rest can be measured by the finest margins of advantage gained by paying attention to what are often seen as more peripheral factors, such as diet, fitness, and mental strength. Similarly, in business, companies that pay attention to every aspect of their performance and seek to constantly improve those that others may regard as less important are likeliest to gain their place among the gilded few. Research by global management consultancy McKinsey suggests that, of the 900 firms it has advised in the past decade, those which implemented enterprise-wide improvements in customer experience saw 15%-20% boosts in sales conversion rates, 20%-50% reductions in service costs, and 10%-20% improvements in customer satisfaction. So, what is customer experience, how is it measured and, crucially, what can be done to improve it?
A customer journey describes your client’s end-to-end experience, as opposed to their satisfaction with individual touch points or outputs.  A legal firm’s customer journey might begin with the onboarding of a client – the initial meeting and scoping of the parameters of the project. It might include the number of subsequent contacts – phone calls, emails and follow-up meetings – required to create and deliver a formal proposal and, ultimately, the signing of a contract or contracts. At a very basic level, it will be influenced by non-technical but nevertheless important factors such as punctuality, the politeness of its staff, clear communications, and promptness in responding to questions and queries. According to McKinsey, there are three basic elements to ensure good customer experience delivery. Observation – by putting yourself in your customers’ shoes you are better able to see the challenge from their point of view, helping to organise and mobilise your employees around your client’s needs.  Shaping – when designing a customer journey, interactions must be reshaped into different sequences, for example digitisation of processes, reorientation of company culture, and nimble refinements in the field.  Performing – Prioritising your customer journey can be a journey in itself that takes years and requires deep engagement from everyone in the company, from corporate leaders right down to the shop floor.  A law firm might be helping a corporate client with a business sale or takeover. Getting the right price – or a better-than-expected price – for its client within the required timescale might be seen by the firm as a great commercial success. But what does the client feel about the transaction? How was their customer experience? They may have a more negative view for any number of reasons that had by-passed the law firm. Its administrative staff may have been rude or unresponsive; the project management process might have been too clunky and inefficient, causing frustration for the client; sensitive documents may have been shared too widely for the client’s liking; the company may have been asked to perform tasks it was not expecting; the final bill may have included costs not flagged up at the start of the process.     The customer service element of the project may have been satisfactory, good, or even excellent. The law firm did what was promised and what was expected of it, and the sale or takeover progressed without any significant hitches. That doesn’t necessarily mean the client felt positively about the experience and that is more likely to influence whether they will use the law firm again, recommend it to others or even to speak negatively about it in the golf course locker room or at the next chamber of commerce black tie event. Professional services firms live or die by word-of-mouth referrals and recommendations. Business-to-business companies may not be hostage to the same level of public scrutiny faced by their business-to-customer counterparts on social media channels and review websites – but they do talk to each other.  That ability of a corporate client to refuse a request by a law firm for a testimonial to put on its website and its promotional literature can be very powerful, particularly if word gets out to others in the sector. For any professional services firm, ensuring its clients have a good, or a great, customer experience is a requirement for all of its staff, no matter how big the company, and so its importance needs to be embedded within its culture.  
Businesses whose employees believe their expertise is the firm’s greatest asset need to re-educate them, so that they understand that good customer experience delivery is at the heart of its strategy for differentiation, cultivating a mindset of continuous improvement at all levels. Equally important is to ensure that everyone in the organisation communicates authentically, so that the promises made by its brand identity and messaging are those that are delivered. Measuring and improving customer experience is best achieved by investing in hardwired technology that captures feedback on an ongoing basis from multiple channels, integrating survey results and other data into centralised dashboards and reports. Today, it is important for firms not only to react to client feedback, but to actively shape their decision journeys. Predictive customer insight can help to unlock a powerful understanding of how your clients feel about you. Having the ability to anticipate customer experience can help you stay ahead of the pack and respond quickly and effectively to expressions of dissatisfaction when they occur. Being a great firm is not just about doing a great job; it’s also about how you work and interact with others. At the end of the day, we are all human beings and the way we experience things matters as much as what we achieve. ### Why is integration in Smart Lockers important? Smart lockers have been around for a while now, and have gained popularity in various industries, but integration is transforming the way they operate. With integration, smart lockers are becoming more than just secure storage units; they are now multifunctional solutions that streamline processes, enhance security, and provide a seamless experience for end-users. The concept of smart locker integration refers to the process of connecting smart lockers to a larger system or network for increased efficiency and convenience. Integration involves connecting smart lockers with other systems such as access control, payment, and inventory management, to create an ecosystem that offers a range of services. Why is integration important for businesses looking to adopt smart locker technology? Integration is a crucial aspect of any successful business, as it allows for the seamless connection of different systems and processes. In today's fast-paced and interconnected world, businesses must be able to operate efficiently and effectively across different departments, platforms, and technologies. Integration plays a vital role in achieving this goal, and it offers numerous benefits that can help businesses to remain competitive and grow. What are the benefits of integration? Integrating smart lockers into a larger system can provide several benefits, such as: Streamlined management: Integrating smart lockers with a central management system can simplify the management and monitoring of locker usage. The system can provide real-time data on locker availability, usage patterns, and maintenance requirements, making it easier to optimise locker use and maintenance. Improved security: Smart lockers can provide enhanced security features such as biometric authentication, remote access control, and surveillance systems. Integrating smart lockers into a larger security system can further enhance security by providing real-time alerts and monitoring capabilities.
Increased convenience: Smart lockers can be integrated with other systems such as electronic payment systems, access control systems, and inventory management systems to provide a seamless user experience. This can save time and reduce the potential for errors or delays. Cost savings: Smart locker integration can reduce the need for manual labour and paper-based tracking systems, leading to cost savings over time. Additionally, smart locker systems can be configured to optimise space usage, reducing the need for additional storage space. ServiceNow solution: Some companies such as Velocity Smart Technology already offer a complete integration with ServiceNow, a comprehensive end-to-end solution for enterprises. The platform's powerful features and functionalities enable businesses to optimise their collection operations, improve efficiency, and enhance customer satisfaction. What are the best practices for smart locker integration? To fully realise the benefits of smart locker technology, businesses need to integrate them with their existing systems. However, deciding which systems are best suited for integration can be a complex decision that requires careful consideration of several factors. The first factor that businesses should consider is their workflow. By analysing their workflow and identifying processes that can benefit from smart locker technology integration, businesses can determine which systems are best suited for integration. For example, businesses that need to manage the distribution of equipment, documents, or supplies could benefit from integrating smart lockers with their inventory management or automated dispensing systems. Security is also an important factor to consider when integrating smart locker technology with other systems. Businesses should ensure that the integrated system provides adequate security measures to protect sensitive data and valuable assets. By carefully considering these factors, businesses can make informed decisions about integrating smart locker technology with their existing systems and maximise the benefits of this technology. The integration of smart lockers with larger systems and networks has already transformed the way businesses operate. To fully realise the benefits of smart locker technology, businesses should consider their workflow, compatibility, and security when integrating smart lockers with other systems. With careful consideration and planning, smart locker integration can be a valuable investment that can enhance operational efficiency and improve the overall experience for end-users. ### 5 Barriers to Cloud Modernisation Despite the widespread embrace of cloud technology, tangible benefits often remain frustratingly out of reach for many businesses and enterprises today. The finish line — where IT scaling, maintenance and innovation are made easier — is fully visible, but the legs running towards it feel like they’re moving in place. Why? What’s holding companies back from modernised systems that simplify mobility and remote work, that offer a foundation for business continuity regardless of global disruption? In our experience working with companies on cloud adoption, we’ve identified five common hurdles that prevent a successful transition to the cloud. In this blog, we’ll discuss exactly what they are and how you can overcome them to achieve true cloud benefits for your business. Creating a Clear Cloud Migration Strategy A smooth transition to the cloud begins with a comprehensive, detailed plan.
A basic migration strategy includes goals and objectives for the project — for example, replacing a physical data centre with virtual infrastructure or enabling field workers to use mobile solutions to update business systems in real time. However, a truly effective migration strategy takes more into account. It identifies business outcomes, mission-critical systems, and data, and determines how to minimise disruption in the move to the cloud. It also includes realistic timelines and initiatives to get buy-in from employees and other stakeholders throughout the transition. Unless a business or enterprise has resources with cloud migration experience on staff, planning a cloud migration is best done with experts. Engaging with professionals who have a track record of helping businesses successfully transition to the cloud will provide valuable information on technology and proven processes for the project. Choosing Optimal Cloud Migration Options There is more than one path to the cloud. Every cloud migration should begin with desired business outcomes and a full technical assessment to understand the current IT environment, identify options for a cloud platform and determine the best way to migrate business applications. An experienced cloud partner can also ensure businesses mitigate technical debt effectively throughout this process. Outdated or unnecessary components that companies carry into the cloud have a tangible cost associated with them. Some businesses and enterprises have learned by experience that the best plan for cloud modernisation isn’t to insist it happens all at once. In fact, careful, incremental progress can often be the best way to ensure minimal disruption to operations and that business processes work as expected in the cloud. Additionally, other businesses may determine that the best configuration for their IT environment is to maintain some on-premises systems. For example, a hybrid configuration ensures continual access to data or eliminates latency associated with data transport to and from the cloud. Again, an experienced cloud partner has the expertise to help determine the best configuration. Continuous Improvement Another issue can stand in the way of businesses seeing maximum benefits from a transition to the cloud: cloud migration does not equal cloud modernisation. A “lift-and-shift” may move infrastructure and applications to the cloud quickly, but any inefficiencies or gaps in processes that existed with on-premises systems will move along with them. Migration should be looked at as the beginning of the journey, not the end. Cloud modernisation and business transformation require close alignment between business and IT leaders. Alignment on business outcomes, and a determination of key process, tool or personnel modifications, is extremely helpful prior to moving to the cloud. Then, if approached systematically, cloud modernisation — with guidance from an experienced cloud partner and with clear objectives designed to produce measurable returns — will advance businesses on their transformation journeys. Additionally, staying up to date on new tools and services that hyperscalers release each quarter is vital to enabling new capabilities and innovation. Striking the balance between continuous improvement and cloud governance is key to a successful cloud deployment. Prioritising Security and Governance Unfortunately, it’s common for businesses not to have an integrated security strategy. Cloud platforms include best-in-class security.
However, they also have a shared responsibility model for security. For example, the cloud platform provider will protect its infrastructure, but users are responsible for creating immutable backups of their data and ensuring there is no unauthorised access to their accounts. Businesses should never assume that because they are using cloud solutions, they’re protected from cyberattacks. Working with a partner with expertise in cloud security will help companies to assess their current security policies, vulnerabilities, and threats, and to adapt their strategies for secure cloud computing. The question of data governance is a related barrier to cloud modernisation. When data is only generated and used onsite, it’s easy to understand who owns and controls that data. The answer is less clear when data is generated and stored in the cloud. Clarifying this issue early can eliminate confusion, allow the business to make informed decisions about protecting personally identifiable data, IP, and other sensitive information, and stay in compliance with data privacy regulations. Data stored and used in the cloud may also be more accessible to users who log on to systems, even when that access is not necessary for them to perform their jobs. Migrating to the cloud makes role-based access control and a zero-trust policy (one that verifies every user’s identity) crucial to protecting sensitive data and remaining in compliance with data protection and privacy regulations. Pursuing Maximum Cloud ROI While a transition to the cloud may align perfectly with the primary objectives of the project and generate a return on investment, businesses may overlook ways to gain even more from the move. Digitalising processes via cloud solutions can create new data streams, providing businesses with new insights as well. Companies can use this information to optimise business processes, evaluate employee performance, find and eliminate waste, and learn more about their customers – all factors in a better bottom line. Data from cloud solutions may even allow businesses to build new revenue streams. For example, remote product or equipment monitoring can enable automatic replenishment of consumable products or notify customers that it’s time for maintenance and repairs. Moving Forward Whether initiating a transition to the cloud or looking for ways to advance a stalled migration, it’s vital to keep pressing on toward the goal. The benefits of the cloud are rapidly becoming table-stakes for operating a competitive business, and organisations that don’t modernise their IT systems will fall behind. Fortunately, enterprises can rely on cloud migration experts for help crossing these hurdles and maintaining momentum. Their expertise, tools and processes will help businesses and enterprises build a workable plan, strengthen security and data governance, and maximise cloud ROI to finally reach their goals. ### How AI can streamline procurement and supply chains The AI revolution is underway, with more and more businesses starting to adopt new technologies – including within the procurement function. A recent survey found that high-performing Chief Procurement Officers are 18 times more likely to have fully deployed AI/cognitive capabilities, while 63 percent of procurement organisations have increased system automation using intelligent technology. Procurement and supply chain management is one of the largest (if not the largest) areas of business expenditure, and therefore there is much to gain from utilising AI.
It can simplify supplier selection, identify and minimise risks, highlight opportunities to reduce costs, and continuously improve performance. Simplifying vendor discovery and selection Historically, vendor selection has been a time-consuming and arduous process, with procurement teams often spending hours or even days sifting through the internet and internal databases in search of suitable suppliers to engage with. But the whole process can be simplified using AI. AI has the capacity to complete strategic sourcing in a fraction of the time needed to complete the process manually as it can analyse large volumes of data from multiple sources efficiently and accurately. It allows businesses to search for suppliers based on set criteria including location, product and service specifications within minutes. Teams then have an accurate list of new suppliers to explore and shortlist that matches the scope of work. From this point, procurement teams can invite potential vendors to bid on a request, replacing traditional methods and reducing the amount of risk introduced into the supply chain. Some AI tools are continually learning, producing accurate results that are aligned to the latest market insights - allowing teams to strengthen their network and supply chain with more opportunities for cost savings, and better value for money. Reducing risk and human error Risk is unavoidable in the procurement sector – exposure exists with every action and decision that’s made despite people’s best intentions. Any error can have wide-reaching consequences for a company, affecting its ability to achieve its objectives and impacting the bottom line. To minimise levels of risk in procurement and supply chain management, businesses can lean on AI technologies to complete supplier risk assessments, conduct contract reviews, establish airtight agreements and use predictive analytics to forecast supplier performance and anticipate difficulties ahead of time. A thorough analysis of a supplier can be completed using financial and historical data and other relevant factors to ascertain suitability and potential risk exposure. Vendors that have a poor financial or performance history are deemed a significant risk and should therefore be avoided. It can also identify evidence of fraudulent activity. AI can collate this data at a much faster pace than humans, all the while eliminating error and misjudgement. Once vendors have been selected, automated solutions can further protect a business by reviewing contracts to identify ambiguous language, compliance issues, and any negotiations which may expose the company to significant risk. Contracts are only enforceable if they are clear and abide by applicable laws, so wording must be concise and carefully written to avoid leaving an organisation vulnerable to legal consequences. Reducing costs and unlocking efficiencies Manual completion of procurement tasks is time-consuming, mundane, and expensive as larger procurement teams are typically required to complete these tasks and they need to take their time doing so to avoid error. According to a survey, 69 percent of procurement professionals consider cost reduction a strategic priority, and this can be achieved with AI. Task automation can help to complete supplier spending reviews and implement continuous learning and tracking to identify cost-saving opportunities in the supply chain. 
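As a rough illustration of the kind of automated spend review described above, the sketch below groups transactions by category and flags suppliers whose unit price sits well above the category median as candidates for renegotiation. The data, field names and tolerance are illustrative assumptions rather than a description of any particular tool.

```python
# Illustrative spend review: flag suppliers charging well above the category
# median unit price. Sample data and the 5% tolerance are assumptions.
from collections import defaultdict
from statistics import median

transactions = [
    {"supplier": "Acme Ltd", "category": "laptops",  "unit_price": 950,  "quantity": 40},
    {"supplier": "Globex",   "category": "laptops",  "unit_price": 870,  "quantity": 25},
    {"supplier": "Initech",  "category": "laptops",  "unit_price": 1120, "quantity": 10},
    {"supplier": "Acme Ltd", "category": "headsets", "unit_price": 45,   "quantity": 200},
]

def flag_savings_opportunities(rows, tolerance=0.05):
    """Flag suppliers charging more than (1 + tolerance) x the category median."""
    by_category = defaultdict(list)
    for row in rows:
        by_category[row["category"]].append(row)
    flags = []
    for category, items in by_category.items():
        benchmark = median(item["unit_price"] for item in items)
        for item in items:
            if item["unit_price"] > benchmark * (1 + tolerance):
                overspend = (item["unit_price"] - benchmark) * item["quantity"]
                flags.append((item["supplier"], category, round(overspend, 2)))
    return flags

print(flag_savings_opportunities(transactions))
# e.g. [('Initech', 'laptops', 1700)] - a candidate for renegotiation or re-sourcing
```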
AI’s ability to automate essential procurement processes is an excellent means of reducing labour costs by eliminating the need for manual tasks to be completed by humans, which cuts salary, training, and additional human resource costs. By reducing the margin for error, costly mistakes are also avoided. Digitisation and the application of AI is key to future supply chain resilience and efficiency, putting real-time visibility and centralisation first. AI can bring this much needed visibility, providing accurate, normalised, connected and timely data to deliver real-time insight and predictions about what’s happening next across the chain. If an AI tool identifies a potential logistic disruption, businesses can act before it’s too late. Improving performance and agility Continued uncertainty can cloud the supply chain sector, resulting in teams being forced to make extremely complex, and potentially costly, decisions without accurate or available information. We have witnessed first-hand the risks that unpredictable events pose, so business agility and continuity are key. Most information systems are built for ‘steady-state environments’. But, in a more volatile world, these systems can easily come under pressure as they aren’t built for the modern day. This translates into stiff and uninformed decision making. It leads to more stressed supply chain teams who must continuously make trade-off decisions with little to go on, ultimately harming business performance, agility and the bottom line. The main purpose of AI as a tool in procurement is to improve the performance of a business and increase profit, and 93 percent of organisations that have implemented AI solutions are satisfied with the ROI. AI can power smarter, data-driven decision making, creating more agile supply chains that can react faster to volatility and protect business continuity. Instead of teams collating information from disparate and non-interoperable systems and spreadsheets (a very time intensive process), AI can help procurement model different scenarios based on what’s known. The result? Reactive and dynamic decision making across the entire supply chain and less pressure on internal teams to deliver it. ### The perfect time to invest in cloud technologies Back in 2021, VC firm Andreessen Horowitz’s captivating think piece, "The cost of cloud: a trillion-dollar paradox," ignited a firestorm of discussions surrounding the control (or loss of) businesses have over their costs when entrusting their operations to cloud providers. The bars and cafes in every tech hub around the world from San Jose to Singapore were buzzing as industry professionals pondered this intriguing paradox. The crux of the paradox lies in the critical questions it raises about the optimal approach to cloud infrastructure. On one hand, businesses readily acknowledge the benefits of starting workloads in the cloud, including on-demand capacity and access to a diverse array of services. However, as companies grow and scale, the costs associated with the cloud can sometimes outweigh these advantages, exerting pressure on profit margins. This conundrum has left many businesses wondering if they would be deemed "crazy" for not embracing the cloud initially and equally "crazy" for staying on it as they expand.  As the CEO of VeUP, I disagree with a key theme in the article: the notion that large firms are "crazy" to remain with public cloud providers even after achieving significant scale. 
However, I wholeheartedly agree with the other contention: for growing or cash-strapped firms, public cloud providers are key to their success. This is even more relevant during economic downturns. Therefore, in this article, I will use this perspective to explore why the cloud is the way forward when the economy finds itself in the doldrums. 3 Main Reasons to start your relationship with Public Cloud During recessionary periods, when companies find themselves in the midst of economic challenges, the most prudent action they can take is to engage in sustainable cost optimisation while retaining their core strengths. It becomes crucial for businesses to streamline their operations, eliminate unnecessary expenses, and make strategic decisions that ensure their long-term viability. To achieve this outcome, software companies should embrace the public cloud, where they can unlock cost savings, scale up or down based on fluctuating demands, and focus on core competencies. I will expand on each reason below. Cost Savings: A Path to Financial Efficiency For small firms with limited capital, avoiding large upfront infrastructure costs is essential. Cloud services offer a pay-as-you-go model, allowing businesses to scale their usage based on their needs and pay only for the resources they consume. This cost-effective approach eliminates the need for hefty investments in on-premises infrastructure, making the cloud an attractive alternative. By reducing capital expenditure and shifting to operational expenditure, businesses can allocate their financial resources more efficiently, ensuring maximum value for every dollar spent. Scalability: Adapting to Fluctuating Demands During times of economic downturn, cloud services provide unparalleled flexibility and scalability, allowing small firms to effortlessly scale their resources up or down based on demand. Whether experiencing sudden spikes in workload or navigating quieter periods, businesses can dynamically adjust their cloud usage without the hassle of procuring and managing additional hardware. This agility not only helps maintain a smooth operation but also enables businesses to optimise resource allocation, avoiding unnecessary expenses during periods of reduced demand. Focus on Core Competencies: Unleashing Business Potential Managing and maintaining IT infrastructure can be a drain on a business's limited resources, particularly for firms looking to manage costs. By embracing cloud services, businesses can offload the management and maintenance of infrastructure to the service provider. This strategic move allows them to redirect their valuable time, energy, and expertise toward their core competencies. By focusing on what they do best, businesses can increase operational efficiency, innovation, and overall competitiveness. The cloud becomes a catalyst for unlocking their full potential and driving growth, unburdened by the complexities of managing IT infrastructure in-house. The main reason to invest in a long-term relationship with Public Cloud During this recessionary climate, businesses are continually seeking to optimise costs while maximising their cloud investments. To support their customers, leading cloud service providers (CSPs), including AWS, are proactively releasing services with cost optimisation in mind. During AWS re:Invent 2022, the world's largest cloud conference, Adam Selipsky, CEO at AWS, highlighted the crucial importance of cost optimisation during difficult economic times.
Recognising the challenges faced by businesses, Selipsky emphasised AWS's commitment to enabling partners through cost optimisation initiatives. AWS then showcased three new services to prove its commitment to cost optimisation. AWS Supply Chain integrates seamlessly with existing ERP systems like SAP, providing actionable insights and eliminating manual data mapping. This empowers businesses to mitigate risks, reduce costs, and enhance transparency in their supply chain management. Amazon EventBridge Pipes streamlines integration between AWS services and self-managed systems, minimising the learning curve and custom integration code. By leveraging built-in transformations and seamless integration, businesses can optimise resources and drive cost savings. AWS Application Composer simplifies the rapid deployment of serverless solutions through a browser-based visual canvas. Developers with minimal serverless experience can create architectures and build functions using a drag-and-drop workflow. Real-time synchronisation with infrastructure as code (IaC) ensures alignment and reduces operational complexities, accelerating development cycles and enabling businesses to focus on delivering innovative products. These actions from cloud providers such as AWS show they are listening to the market and are willing to develop new services that deliver additional value through cost optimisation. Conclusion In summary, the current economic climate presents a prime opportunity for businesses to invest in cloud technologies. By embracing public cloud providers, companies can unlock cost savings, adapt to fluctuating demands through scalability, and focus on their core competencies. This approach allows businesses to allocate their financial resources efficiently, optimise resource allocation, and redirect their energy towards innovation and growth. Furthermore, cloud service providers like AWS have demonstrated their commitment to cost optimisation by releasing services specifically designed to address these challenges. With the continuous development of new services that deliver additional value in cost optimisation, cloud providers are meeting the needs of businesses during economic downturns. Therefore, developing trusted and active relationships with cloud providers is essential for maximising the value of cloud investments and navigating the uncertainties of the market. ### Why a ‘cloud strategy’ alone can’t stave off your competition For businesses to grow in the digital economy, they need to be smart. They not only need to optimise their workflow to increase efficiency and speed of production so that they can deliver competitive products to market faster, but they also need to ensure the quality and performance of digital products throughout their lifecycle, wherever their customers go. It does not matter whether we are talking about a connected car and its infotainment system, a financial management app, or an electric toothbrush with its associated dental hygiene app – these products all need to work seamlessly and flawlessly while their users are on the move, travelling to business meetings or off on holiday with the family. The advantages of the cloud are being leveraged to achieve this. The average employee today uses more than 30 cloud-based apps to carry out their daily business, with the total number of cloud-based environments, applications, and workloads per company running into the hundreds.
Quite apart from reducing Capex IT investment and enabling flexible scaling according to needs, the myriad of services in the cloud allow the automation of processes, the training and operation of AI software, and the provision of environments for software and product development, testing, and innovation, not to mention location-independent communication and collaboration tools for a mobile workforce.  So, the cloud has become the all-purpose enabler for business success. But with everyone else also using it, how can you get ahead of the competition? The question which is often neglected in the formulation of a hybrid or multi-cloud strategy is how to even get to the cloud. Even in regions with high cloud adoption, we can see that the most common way of accessing the cloud and transferring data to and from different services is via the public Internet. There are good reasons to avoid this route. The preferable route is via direct interconnection using a distributed Cloud Exchange – this provides access to a single integrated ecosystem of cloud providers and allows the corporate network to be connected directly to the cloud networks in use and bypass the public Internet. What companies need, more than just a cloud strategy, is a cloud interconnection strategy.  Connecting directly to clouds accelerates cloud services You may have already heard the saying “latency is the new currency”, and I can assure you, it’s true. The lower the latency, the faster the response time, and the better the performance of cloud applications and ultimately user experience, as well as the proper functioning of connected systems. Latency basically means delay – it’s the time it takes for a data packet to travel from a device connected to the Internet, such as a smartphone, to a server (such as the ones where your cloud services are hosted), and back again, so that you can actually see the result. Think about the heart-stopping delay that can occur after you’ve pressed the “buy now” button for an exclusive item on an e-commerce site, while you wait to actually see that your transaction has been processed and the product has been purchased. And delays are something modern business simply can’t afford. Shortening the pathway between the corporate network and the cloud by ensuring a direct connection improves the performance of not only enterprise applications (such as production-related analytics and design applications, ERP and financial apps, and virtual conferencing apps), but also customer-facing applications. Therefore, direct interconnection and short data pathways accelerate your work in the cloud. Connecting directly to clouds adds control There are also major advantages in terms of security to directly connecting the enterprise infrastructure to relevant networks. Using the public Internet is, again, a risky choice here, because there is no way to control the pathways that the data takes. The Internet was conceived as a “best effort” tool for communication. This is not sufficient for securing business critical data or sensitive end-customer data. On the Internet, data is passed over from network to network based on the current situation and the policies of each network, and there is no way of controlling where the traffic is exchanged and who might be lurking in the shadows. This way, there is not only the threat of data leaks, but also of phishing servers disguised as legitimate destinations – luring either an employee or a customer to a fake log-in field where they input their credentials.  
Setting up direct interconnection between the networks – for example, your corporate network and the networks of the cloud service providers you are using – makes it possible to bypass the long and potentially dangerous route over the public Internet, thus lowering latency and improving security. Through direct interconnection, you know exactly which network is sending traffic and which is receiving it. This gives the legitimate partners full control over the data flows and minimises the risk of security breaches. By accessing each cloud from multiple geographical locations (e.g. using different data centres within a metro area) via a distributed interconnection platform, you also increase resilience, reducing the chance of downtime caused by localised outages at one of the access points. The connectivity to the cloud can be made even more robust by remotely accessing additional cloud regions from each of the cloud providers – this is possible through an interconnection platform that connects different metro markets as well as offering local interconnection. This protects systems against more broad-scale incidents.

Connecting clouds directly to each other lets them work at top speed

There’s one more step that’s needed to make your cloud connectivity state-of-the-art. This is to enable cloud-to-cloud communication by connecting your cloud environments directly to one another using a cloud routing service implemented on a Cloud Exchange. For many automated processes, data from one cloud needs to be provided to applications running in another cloud. Let’s take, for example, a smart factory producing products for just-in-time delivery – this will need to be fed customer order data, and it will need to be synchronised with inventory databases and logistics systems. It will also need real-time access to the AI systems monitoring product quality and the condition of the production plant. These and countless other processes, which may well be running in different clouds, are needed for a seamless and high-quality execution of the order. The way data transfer between clouds is normally handled is that the data from one cloud is transported to the corporate infrastructure, and from there on to the second cloud service provider. Depending on how close the corporate network is to the nearest cloud on-ramps for each cloud service provider, this can mean a substantial detour for the data. This not only results in latency issues, but also in higher costs for the backhaul of the data. Using a cloud routing service massively shortens the path that data needs to travel. It does this by interconnecting the cloud services at the point of the on-ramps, directly on the Cloud Exchange platform. This reduces the length of the data pathway, regardless of where the company itself is located, speeding up transmission and also cutting costs. The corporate infrastructure can still be directly connected to the clouds, but this pathway would only be used when needed. Time-sensitive use cases – such as database applications – can thus interact with each other across clouds at the lowest possible latency, as if they were in one single cloud environment. Now, this is what we can call a smart cloud interconnection strategy.

Cloud interconnection – breaking through the barriers to business success

There’s a lot more you can also do once you’ve set up such connectivity. It’s possible to increase the resilience of the connectivity by ensuring redundancy in the lines being used to each access point.
It’s possible to take a global approach to cloud connectivity and connect your worldwide operations and workforce to the nearest cloud on-ramps for their given location, ensuring the lowest latency wherever they are based. It’s also possible to set up robust back-up and disaster recovery processes using a secondary cloud (with close to real-time synchronisation through the cloud routing service), so that this cloud can seamlessly take over operations in the event of an outage at one of the primary cloud providers. There is no doubt: the cloud is an unparalleled tool for boosting productivity. But clouds themselves can also be accelerated using the right connectivity. For some applications, it is actually not feasible to run them in the cloud without optimised interconnection. And this will become increasingly important with each new iteration of applications. Developing a cloud interconnection strategy will help you to make the most of the benefits of the cloud. Direct connections to clouds boost security, performance, and the bottom line, while enabling cloud-to-cloud communication accelerates processes even further. With your clouds working at top speed, the sky’s the limit.

### The Rise of Smart Vending Machines

Self-service technology demand is increasing and is a trend to watch for 2023, so it’s time to make smart vending machines a part of your business’s IT strategy. The smart vending machine market is growing, thanks to all the benefits businesses can enjoy from making use of this innovative tech. With the rise of hybrid working and many more employees demanding greater flexibility, this technology is coming into its own. Businesses hoping to take advantage of smart vending machines and other similar technologies can support the remote growth of their business - offering employees access to key equipment at a time that suits their own personal schedule.

How does it work?

In businesses that have a disparate workforce logging on from different cities, regions, countries and even different time zones, there is a need for 24/7 IT support when problems with IT hardware occur. The modern workforce expects workplace technology to evolve and support their working habits - this is where smart vending technology can deliver real value to a business. Incorporating smart vending machines into an IT support strategy can grant employees access to low-value, high-volume technology items like keyboards, headsets, charging cables or other computer peripherals, saving businesses time and money, while improving efficiency between employees and IT departments.

Flexibility is key for modern businesses to thrive

Smart vending machines are providing more accessible solutions to the modern workforce by delivering replacement IT items quickly and efficiently. Smart vending machines empower both the IT support helpdesk and the employees by creating a process that allows employees to ‘self-serve’ when they need to replace a broken or lost item of kit. Optimising inventory control and improving user experience are just two advantages of smart vending solutions. Going further with the technology, businesses can look to use AI to predict stock requirements and model patterns of usage in different locations to understand where the pressure points are.

Data analysis can help your decision making

Using data collected by smart vending machines can help inform future IT strategies by analysing customer behaviour, machine usage and performance.
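To make that kind of analysis concrete, here is a minimal sketch of the reporting such data supports. It assumes a hypothetical dispense log with made-up column names (timestamp, site, item, employee) rather than any particular vendor's export format, and uses pandas to summarise demand by site and flag heavy users for follow-up.

```python
import pandas as pd

# Illustrative only: synthetic dispense-log records with hypothetical column names.
# A real smart vending platform would export something similar (timestamp, site, item, employee).
events = pd.DataFrame([
    {"timestamp": "2023-01-09 08:14", "site": "London HQ",  "item": "headset",     "employee": "E102"},
    {"timestamp": "2023-01-09 09:40", "site": "Manchester", "item": "keyboard",    "employee": "E219"},
    {"timestamp": "2023-01-10 11:02", "site": "London HQ",  "item": "usb-c cable", "employee": "E102"},
    {"timestamp": "2023-01-11 14:55", "site": "London HQ",  "item": "headset",     "employee": "E331"},
    {"timestamp": "2023-01-12 10:20", "site": "Manchester", "item": "headset",     "employee": "E219"},
])
events["timestamp"] = pd.to_datetime(events["timestamp"])

# Which sites and items see the most demand? Useful for deciding machine placement and stock levels.
demand_by_site = events.groupby(["site", "item"]).size().rename("dispensed").reset_index()
print(demand_by_site.sort_values("dispensed", ascending=False))

# Which employees replace kit more often than average? A simple flag for follow-up, not a judgement.
per_employee = events.groupby("employee").size()
print(per_employee[per_employee > per_employee.mean()])
```

The same grouping extends naturally to time-of-day or month-by-month patterns, which feeds directly into the placement and stock decisions discussed next.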
One example is choosing the right location through analysing the data to provide further efficiency savings. Understanding how often items are replaced, and by which employees, allows businesses to decide where the vending machines are best placed to have the most impact. It can also surface trends, such as employees who regularly replace broken or lost items, leading to improved management of stock levels and of how employees use company equipment. Looking back at 2022, the smart vending machine market went from strength to strength in spite of economic uncertainty. The market will continue to build as remote and flexible access to hardware solutions is something that businesses will increasingly need. With time and money savings evident, the transition to smarter and more efficient technology is especially pertinent during times of financial crisis. Smart vending machines are not something to ignore when putting together your digital transformation strategy. Find out more about the different smart vending options available.

### Designated Driver: Why Complex Technology Environments Require a Steady Hand

Monitoring software is designed to run in the background, operate quietly and only intervene when needed, much like a designated driver. While other applications and services are busy driving the business forward, monitoring software is diligently running scans and comparing states, silently examining behaviours, and checking that everything is acting responsibly. It’s a sober, repetitive, and often thankless task, but the cost of not correctly monitoring a complex IT environment is high. And while the fundamental goal of monitoring software isn’t necessarily to drive us home safely at the end of the night, it does provide configuration and compliance assurance, and by doing so it helps to keep the IT estate, and by extension the company’s data, safe from harm. The importance of hardening the IT infrastructure has never been greater. In 2023 so far, there have been numerous headlines pertaining to breaches from around the world that have caused multi-million-pound damage to the companies concerned. In January, T-Mobile announced it had been hacked, with the details of 37 million customers released. Likewise, JD Sports, Atlassian, WH Smith, MailChimp, ChatGPT, PayPal and Twitter have all reported breaches just this year. Along with stolen credentials, the fundamental attack vector is via misconfigurations of core infrastructure, something that reliable monitoring software is able to manage. But sometimes, the downtime can be self-inflicted. The large-scale closure of US airspace by the FAA was reportedly due to a damaged database file. Clearly, having an obsessive, detail-orientated companion to track your configurations is a sensible plan.

Maintaining a clear head and a clear view

Before monitoring software can get to work, it needs visibility of the IT estate. Organisations need to understand the full scope of their IT infrastructure, which includes knowing what assets they have, where they're located and who is responsible for them. In this respect, monitoring and compliance software is extremely social and will build a rapport with every node it encounters. Adoption of a configuration management tool gives CTOs oversight of the whole IT suite, which previously may have been opaque.
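Visibility starts with a basic asset inventory. As a rough illustration (a generic sketch, not a feature of any particular monitoring product), the snippet below uses boto3 to enumerate EC2 instances in one AWS region and record an owner tag, assuming credentials are already configured; a real configuration management tool would cover far more asset types and environments.

```python
import boto3

def list_instances(region: str = "eu-west-1"):
    """Return a basic inventory of EC2 instances: ID, state, and an 'Owner' tag if present."""
    ec2 = boto3.client("ec2", region_name=region)
    inventory = []
    # describe_instances is paginated, so walk every page rather than just the first.
    for page in ec2.get_paginator("describe_instances").paginate():
        for reservation in page["Reservations"]:
            for instance in reservation["Instances"]:
                tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
                inventory.append({
                    "id": instance["InstanceId"],
                    "state": instance["State"]["Name"],
                    "owner": tags.get("Owner", "unassigned"),  # who is responsible for this node?
                })
    return inventory

if __name__ == "__main__":
    for asset in list_instances():
        print(asset)
```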
Allied to visibility, another important element of a hardened security stance is data integrity validation, which is about verifying the authenticity and integrity of data stored either in the cloud or on-premises, ensuring that the applications and workloads are operating within expected thresholds and are therefore not compromised. Integrity validation uses security controls such as access controls, firewalls, and encryption to protect data and applications from unauthorised access, tampering, or deletion.

Keeping everything present and correct

But once every node has been discovered, it needs to be tamed. Monitoring software will perform a test to detect misconfigurations across devices and digital assets. Often, after hotfixes, ad hoc requests and the daily thrills and spills within an IT environment, ports can be left open or cloud storage containers can be set to public. Our humble guardian angel is required to check that everything is neat and tidy and as it should be, by comparing the current settings to industry-benchmarked configuration states and alerting those who need to know if something is amiss. Setting configurations to the Center for Internet Security (CIS) benchmarks helps to identify any gaps or vulnerabilities in the organisation's security posture, allowing them to take steps to address these issues and improve their overall security and compliance. With the correct software, this can be a simple operation to set up and maintain. Allowing a monitoring tool to do this removes a lot of complex configuration work from busy, stressed IT staff, letting them focus on key priorities and adding value to the business as a whole. And in doing this, the software will also manage the configuration state and monitor configuration drift. Drift can be a serious problem because it happens slowly and is therefore very difficult to detect. When it’s present, it means that the current configuration doesn’t align with the backup or failover. In fact, configuration drift is almost always the reason disaster recovery and high-availability systems fail when we need them most. Monitoring software makes sure this isn’t allowed to happen. Good monitoring software will also check an organisation's custom policies against its digital assets, examining how these assets, such as files, data, and applications, are used and handled within the organisation. These custom policies outline the required configuration of digital assets to meet regulatory and audit requirements, through controls such as access controls and security protocols. Once they have been created, they are integrated into the monitoring and compliance software to ensure that the assets do not drift away from the defined configuration standard. It’s through monitoring of these policies that organisations can detect and prevent potential security breaches, data leaks, or other failures by safeguarding sensitive information, preventing unauthorised access, and maintaining security postures to give confidence in and proof of regulatory compliance.

Delivering accountability

Another useful feature that monitoring and compliance software contains is the ability to undertake process auditing. Auditing involves using monitoring and compliance software to automatically record and track the operations and actions related to digital assets.
Monitoring and compliance software can then be used to audit these operations and actions against a set of predefined rules, policies, and standards to ensure that they are compliant and aligned with the organisation's goals and objectives. The auditing process can also help to identify any potential security breaches or compliance issues that may have occurred, allowing the appropriate personnel to take corrective action and improve the organisation's security posture. For many organisations, knowing where the vulnerable areas exist in their IT systems is half the battle in ensuring compliance, and for some, many issues will not be noticed until an audit is completed or an unfortunate consequence reveals the vulnerability later down the line. What happens in the cloud, stays in the cloud While the function of monitoring software is to provide configuration and compliance assurance, in achieving these tasks, typically the solution will actually deliver much more than this. Through creating visibility and ensuring data integrity validation, it provides a path to delivering fundamental security hardening by managing configuration state, while protecting against configuration drift and misconfigurations, which means that infrastructure and assets will be resilient to larger failures. So, like a designated driver, monitoring and compliance software delivers the dependability that all complex IT environments require, without asking for more attention. ### Building the Business Case for Satellite IoT  The Operational IoT market continues to expand as organisations across the world imagine an extraordinary range of opportunities to leverage sensor technology. Weather monitoring stations are transforming the efficiency and environmental performance of remote copper mines and helping farmers to safeguard crops and livestock in a changing climate. The shipping industry is improving cargo traceability to mitigate on-going disruption. Charities are monitoring water quality across Africa to ensure remote communities have reliable access to safe drinking water.  With the arrival of robust, proven, cost-effective satellite connection, the true potential of these IoT applications can be realised. With estimates suggesting there will be tens of millions of satellite IoT devices in use by 2030, access to reliable, global coverage is now enabling new opportunities for systems integrators (SIs) across the world.  It is now time for SIs to build a business case for Satellite IoT, says Eric Ménard, Vice President Strategy and Business, Astrocast.  Market Expectation  Satellite connectivity may have been available for years, but the market has been waiting for a satellite connection designed specifically for widescale IoT deployment. Many of the key target applications – from agriculture to supply chain – do not require the continuous or real-time communication associated with high-cost legacy satellite connectivity. These solutions play a critical role but they are too expensive and power hungry to support a compelling business case for most Operational IoT deployments.  A farmer requires only daily or twice daily updates of cattle location to track herd health. A copper mine uses intermittent updates on the water table level to provide operational visibility and meet environmental regulation. A shipping line does not require real-time updates of the temperature of its containers . 
Transmitting data either once or twice a day – or taking multiple recordings which can be buffered and uploaded every 12 hours – is perfectly adequate. The value of this data is significant – especially in areas such as shipping. The use of IoT sensors can ensure high-value cargo, including pharmaceuticals, is kept at the right temperature and left untampered. Any deviation will prompt an alarm and allow remediation where possible, resulting in less wastage and better integrity.

Building Confidence

However, while the business case is compelling, such IoT operations are incredibly cost-sensitive. When a deployment may extend to tens of thousands, even hundreds of thousands of devices, small differences in performance and lifetime will fundamentally change the return on investment (ROI). The business case becomes even more sensitive when extended to remote areas that lack terrestrial network coverage and require satellite connectivity. How can the sensors be deployed to remote locations cost-effectively? What is the cost of satellite transmission? How long must the battery last on a sensor to ensure the ROI is not compromised? Plus, how can the data be collected and used to drive tangible commercial benefits? Even before exploring the technology, SIs need robust due diligence to ensure confidence in the business credibility and model of the satellite provider. Ensuring excellent satellite coverage, including across international waters, is essential. Business longevity is also fundamental for deployments that could be in the field for a decade. In addition to verifying strong financial credentials, it is also important to assess the billing model, contractual arrangements, warranties and support structure. Is the company committed to supporting its SIs not only in the prototyping and field-testing phase, but also through industrialisation, production and taking the solution to the market? Each stage of this process will raise new challenges. Having a partner in place with both the knowledge and commitment to overcome problems will transform the likelihood of commercial success.

Proof of Concept

Only once the foundations of a business case have been confirmed should an SI make the investment in a technology assessment. For many SIs looking to expand existing IoT solutions, speed of integration is an important consideration. From the quality of documentation to availability of training, the way a satellite company works with its SIs to ease the integration of SatIoT into the existing IoT solution set can make a significant difference in time to market. For the past few years, a number of innovative SIs have been testing the latest generation of cost-effective SatIoT connectivity to determine the viability and requirements of an industrial-scale deployment. They have built prototypes and invested in field testing. The process has highlighted the importance of ultra-low battery consumption to minimise the need for replacements in situ. Typically, a business case may only stand up if the battery lasts five to ten years. In some locations, the Satellite IoT solution can be integrated with a solar panel, overcoming the need for a dedicated battery. SIs have also worked closely with SatIoT providers to optimise antenna design and ensure the antenna is both reliable and easy to integrate. A small, flat antenna may be essential but additional questions will arise specific to an area of deployment.
For example, lightweight but robust enclosures are now used to securely attach an antenna to livestock to track their movement across remote farmland and identify any that leave the herd, indicating ill-health or injury. Or the simple addition of a Bluetooth connection between the device and the SatIoT antenna provides an excellent way to achieve indoor satellite IoT deployments in rural locations with no terrestrial networks. The availability of bidirectional connectivity also provides SIs with a future-proofed solution. Updates can be downloaded remotely to the sensors as required – for example, if a customer wants to change the frequency of data recording.

Conclusion

These innovators have led the way, discovering how to optimise SatIoT solutions and antenna design to deliver a robust, viable and cost-effective deployment. Critically, these companies have proved the business case for Satellite IoT. While the demand was never in question, the technology is now in place to enable it. Whether it is shipping containers or agriculture or environmental monitoring or animal tracking, SatIoT developments are now moving into the next phase of industrial-scale deployment. And this is just the start. The shipping industry, for example, has an array of complex operational challenges in its management of 50 million containers across the globe. Tracking location and temperature monitoring are delivering financial benefits. Adding the ability to identify whether a container has been entered or tampered with during the voyage will support the war on piracy and drugs. Adding smoke detectors will raise the alarm when fire breaks out on board – an increasing concern if owners fail to inform the shipping company that the container holds self-combusting cargo, such as lithium-ion batteries. The door is open for SIs across the world to build on the knowledge gained over the last few years, explore the global reach of cost-effective satellite connections and build a compelling business case for Satellite IoT solutions that will transform operational efficiency for organisations of every size across the world.

### Using Cloud To Avoid Bloated Business Models

There has been an exponential rise in the volume of cloud-based business tools and cloud-first businesses, further propelled by the pandemic, which necessitated a rapid transition to remote working. As business working models settle after these significant changes, cloud is becoming an increasingly popular choice for organisations looking to improve productivity and long-term business strategy. However, some aspects are coming under increased scrutiny, as the promise that cloud is the cheaper option is proving hard to keep for some businesses selling cloud-based platforms or tools to their customers. Cloud providers using hyperscalers, once touted as cost-effective solutions, are now being re-evaluated in terms of affordability. The exponential growth in the volume of cloud storage required by businesses has created challenges for hyperscalers, and as a result, running costs have increased, with rising costs passed down to their customers. This includes the likes of SaaS providers, who, in turn, are starting to pass these on to their customers. This process has seen cloud become more expensive than previously promised, resulting in a bloated cost model whereby costly inputs are forcing costly outputs from the SaaS provider to the end customer.
To mitigate these rising costs, some businesses are opting for a hybrid cloud model, combining both public and private cloud solutions. Some organisations even choose to go entirely private. For this to be successful, in-house expertise is required and maintenance costs are high. However, this is simply out of reach for some due to limited capital, a lack of modern IT infrastructure, or a shortage of the skills needed to carry this out effectively. Risks such as reduced security protection arise if the change is introduced without sufficient resources. In an era of heightened cybersecurity threats, it is crucial to have robust security measures in place to counter cyber-attacks. This can act as a barrier to adoption, making a change in infrastructure strategy unviable for some businesses.

Choosing the right partner

Organisations need to plan carefully and consider their provider when moving to the cloud. This involves assessing the specific needs of the business and aligning them with those of the chosen service provider. It is important that providers are transparent about their own business strategy. As a business grows, prices can increase, so getting a clear idea of how scaling up may affect the price of the service is essential for long-term planning and for continuing to offer value to customers. Understanding a cloud partner’s long-term ambitions is also essential, gathering insights into their market sustainability, potential mergers or acquisitions, and their plans for innovation and growth. These factors will affect customers in the future. Aligning business values and strategies with the cloud partner is essential to ensure a seamless customer experience and mitigate the potential for unwelcome surprises in the long run.

The benefits of cloud

There is no question that a cloud-based business strategy will provide unrivalled business benefits. Organisations should recognise the long-term implications of incorporating the cloud into their business strategy, such as more flexible and cost-effective IT infrastructures, while weighing up the potential risks it may pose to their business. With the right strategy to enable education and adoption throughout an organisation, cloud computing enables businesses to run more efficiently and can provide business insights in real time from anywhere. As part of this, employees can access files and applications from anywhere, essential in a hybrid working world. With employees more dispersed than ever before, cloud allows people to connect and collaborate seamlessly. It can automate many processes, saving significant time and labour for business operations. It can also ensure a stronger customer experience, offering a complete picture of the customer journey as well as speeding up processes in customer engagement. And it can provide deep insights available in real time to enable speedy and strategic decision-making to drive business success. Cloud can enable a business to increase its agility, flexibility and adaptability. Businesses can gain insights to look ahead to potential risks or turbulence and take steps to tackle these before they impact performance. Businesses can scale up or down to suit demand much more easily and quickly. Cost savings with the right provider are another benefit. Businesses can significantly reduce their IT expenditure by enjoying benefits such as cheaper initial expenses, pay-as-you-go pricing, and a lower total cost of ownership over time. All of these benefits can help a business remain competitive in any market.
However, the right considerations need to be made to mitigate risk.

The risks involved

An emerging and major risk is the possibility of significant cloud price rises as a company grows, which can have a cascading effect on other sectors of the business. These 'bloated costs' may force budgetary cutbacks in areas such as recruitment or marketing, or lead to the costs being passed on to customers. However, with the right provider, these risks can be all but eliminated while still delivering strong ROI. A chosen provider must be robust and protect business data. Cyber-attacks and data breaches, which can result in data, financial and reputational loss, are also potential risks. However, most SaaS providers have stronger security expertise than most businesses could muster if they took data storage in-house. While investing in cloud can bring huge business benefits, the potential risks must be evaluated carefully to inform implementation strategies and help the customer realise the full benefits of cloud.

Cloud Economics

We have already explored the benefits of running a cloud-based business. These benefits allow organisations to future-proof their business, setting them up for growth while also preparing them for unforeseen circumstances. Recent events, such as the pandemic, supply chain challenges, geopolitical conflicts, and high inflation, have demonstrated how economic conditions can rapidly change without warning. This highlights how important it is for decision-makers to prioritise building a robust business model that can withstand such turbulence. In the face of these challenges, businesses have encountered significant hardships, with many struggling to survive. A lack of agility, flexibility and adaptability can lead to an organisation's downfall, and cloud computing can provide a route to building a resilient and robust business that can navigate unstable conditions.

Wider purpose

On top of the benefits cloud computing can bring to businesses, it can also provide wider socio-economic benefits, offering increased opportunities by reducing geographical barriers and expanding the available talent pools. The cloud enables individuals to work from their preferred locations. This can improve employee experience, allowing increased flexibility and improving work-life balance. It also provides opportunities to people all over the world, expanding the talent pool for recruitment. Organisations no longer have to face restrictions when hunting for new talent or choosing a desirable destination, as the cloud enables a seamless and collaborative working environment that is not location-dependent. Additionally, as the demand for highly skilled labour increases, cloud can help to eliminate the war for talent while restricting talent erosion and improving skills distribution. Skills no longer need to move away from areas towards traditional cities and business towns, as the cloud enables talent to stay in local communities, building prosperity rather than eliminating it.

The road ahead

During challenging economic times that present uncertainty and unpredictability, businesses must now assess their cloud strategy with a long-term view in mind. Organisations considering a cloud-based model must carefully consider all key factors to maximise benefits. This involves assessment and strategic planning in line with company objectives to ensure success.
With the right approach, cloud computing can bring huge benefits, helping businesses become resilient, so they can adapt to any unexpected market disruptors now and in the future. Now, the promise of cloud is as strong as it has ever been.

### Elevating healthcare services with cloud computing

Healthcare is going digital as a route to improving patient care and the use of clinician time. Everyone gains through faster access to up-to-date patient medical history, better diagnostic support, the wider use of telemedicine and video consultations, and patient access to test results, appointment booking and repeat prescription requests online. Patients are buying into the change, with the NHS App currently the most downloaded free app in England, with more than 30 million sign-ups and a target for 75% of the adult population to be registered to use the NHS App by March 2024. The sector is facilitating it by adopting cloud technology because it enables faster digitalisation and delivers it at a much lower cost. Healthcare was named as one of the biggest beneficiaries of cloud adoption by McKinsey & Company in their 2021 report. There are multiple reasons for this. Patient data is integrated across healthcare teams and incorporates sources such as medical devices and smart personal monitors in phones and wearable tech, allowing physicians to have a full picture of the patient’s health updated in real time and accessible anywhere. New diagnostics capacity is being developed to enable image-sharing and clinical decision support based on artificial intelligence (AI). These technologies support testing at or close to home, streamlining of pathways, triaging of waiting lists, faster diagnoses and levelling up under-served areas. Physicians can intervene earlier with proactive treatment and preventive advice. The replacement of paper medical records usefully provides savings in manpower, space and time. Payoda report that by hosting their healthcare ERP in the cloud, 88% of healthcare organisations have noticed a 20% drop in infrastructure expenses. The cloud can hold vast amounts of information at minimal cost and can be scaled up and down effortlessly on demand so that users only have to pay for what they use. The most significant advantage of using cloud for storing data is the benefit in budgeting and financial management. Organisations no longer need to make large upfront investments in hardware or overpay for data capacity they might not use (CAPEX). Instead of buying, they can access technology services such as computing power, storage and databases on an as-needed basis, together with buying software as a service (SaaS) from a cloud provider, saving licensing and update costs (OPEX). One of the biggest problems is legacy systems: transferring the large amount of existing data to the cloud and/or enabling new systems to “talk” to the old systems. Both of these are increasingly addressed by artificial intelligence (AI) powered bots acting as an interface between programmes, and by breaking down the barriers between monolithic programmes in legacy systems. In one healthcare setting, a system for healthcare centres for elderly people in Japan, we achieved performance 300 times faster: processing a day’s data decreased from 27 hours to only 5 minutes; the huge volume of data accumulated over 3 years was processed within 7 hours; and there are significant gains in highly automated administrative processes and increased business agility.
Typically we see 20 to 30% improvements in operational efficiency and reductions in security risks, plus savings in operational costs of 12% and ownership expense down by 16 to 22%. According to a BCG analysis, moving to the cloud can help businesses save between 15 and 40% on infrastructure costs. One of the early barriers to adoption was the security of sensitive patient information, though as cloud service providers now undertake best-in-class security measures, including encryption, this has become less of a concern, and government encouragement to move to the cloud has reduced most qualms. Hybrid-cloud strategies have offered the option to keep sensitive information on-site while shifting everything else to the cloud, and multi-cloud strategies help back up data in case disaster recovery is needed. Cloud security can still present some challenges, but most of these are caused by cloud misconfiguration and human actions, mitigated by quality training and monitoring. Cloud service outages do occur occasionally, caused by power failures, human error, technical issues, periodic maintenance or, in rare cases, cyberattack, all largely mitigated by backups at other sites. Cloud adopters should always be aware of regulations and make sure cloud solutions are compliant with regulations on medical data storage (such as HIPAA, GDPR) and look for certification such as HITRUST, which verifies that a company meets the strictest requirements for handling high-risk data. The adoption of cloud technology, coupled with the pandemic experience, has enabled a massive leap forward in areas such as the use and acceptance of telehealth, now 38 times higher than pre-pandemic levels. Tele-home monitoring, virtual wards and outpatient video consultations have brought more services into people’s homes over the course of the pandemic. This has been shown to help deal effectively with a sudden surge in demand and has a role in mitigating the growing shortages of physician face-to-face capacity. The cloud is also facilitating the move towards personalised medicine, which requires analysis of various criteria including an individual’s genes; a patient’s genome sequence alone has 200GB of data, so the cloud proves helpful in storing this massive data pool and allowing collaboration from anywhere within the care team. The cloud is enabling healthcare to evolve and is helping to address the challenges of budgets, manpower and other resource shortages while treatment and expectations grow exponentially. Artificial intelligence (AI) and machine learning (ML) coupled with human ingenuity will deliver ever more sophisticated and personal medical intervention options. Addressing these within constrained resources will only be possible with effective cloud strategies.

### Top 5 trends impacting multicloud management

An effective multicloud strategy can improve quality of service while enhancing security and strengthening partner connections – the keys to value lie in how organisations manage their multicloud infrastructure, says Jerome Dilouya, CEO, InterCloud. As businesses negotiate a period of economic uncertainty, changing working habits and heightened cyber threats, adopting and refining multicloud capabilities will support efforts to scale sustainably and meet strategic objectives. However, the benefits of a multicloud approach can quickly diminish if the environment is not managed effectively.
Multicloud and hybrid infrastructures present an array of challenges including varied skillset requirements, siloed teams, security issues and ensuring visibility across all environments. For businesses to avoid these pitfalls and fully unlock the potential of a multicloud strategy, they need to address five key trends impacting the market today.

Bolstering cyber security is a business imperative.

The desire to enhance cybersecurity is a common trend across businesses, and with good reason. It is driven by rapid increases in attacks such as ransomware, with Microsoft’s Digital Defense Report 2022 recording a 40% year-on-year growth in attacks between June 2021 and July 2022. Global spending on security and risk management is already rising dramatically, forecast to increase by 11.3% in 2023, with cloud security expected to have the strongest growth. Furthermore, with hybrid and remote working now ubiquitous, organisations are adding operational layers and security teams are implementing point solutions to address specific security needs, all of which adds to management complexities. Hybrid cloud deployments often experience security breaches because there are too many moving parts and variations in security configurations, challenging organisations’ ability to implement consistent security across hybrid environments. Securing networks with an agnostic multicloud platform provides an effective filter against harmful traffic, while application programming interfaces (APIs) and telemetry reduce risk through real-time monitoring, event correlation and automation. A multicloud approach also secures interconnection with multiple cloud service providers (CSPs), while ensuring data confidentiality via a private data backbone. As a result, network operations and data movement are secure, smooth, and seamless – all governed by enterprise-grade service level agreements (SLAs) with just one provider.

Economic uncertainty requires better cost management.

60% of infrastructure and operations leaders report significant public cloud cost overruns that negatively impact their budgets, underlining the importance of choosing a service that aligns with business needs. The attractive economics of cloud are sometimes diluted by the cost of migration, modernisation and platform construction, with the cloud’s hidden costs capable of throwing IT budgets into disarray. CSPs also have different pricing structures, which is why multicloud must be supported with visibility and observability to unlock its true value. With a global reach from a single multicloud connectivity platform, enterprises can lower costs by increasing efficiencies and removing the need to manage connectivity to multiple CSPs at the same time. Businesses should also look for a multicloud connectivity platform with in-built SLAs that integrate into all clouds, eliminating the costly need to maintain multiple cloud network agreements.

Scale at speed with digitisation.

Digitisation has torn up and pieced back together the corporate rule book, with disruptive business models forcing traditional organisations to embrace digital in order to remain competitive. Digitally enabled businesses are equipped to scale quickly, with services that can be consumed by customers anywhere in the world.
Providing exceptional customer service is crucial to global enterprises and, if managed correctly using the right platform, a multicloud strategy helps to deliver transformational experiences, ensuring a consistent level of service no matter where customers are located or how they move around. An agnostic multicloud platform also gives businesses full visibility of all cloud operation data, across all cloud environments, allowing for better forecasting and planning.

Analytics at the edge is driving the need for distributed data management.

Recent estimates showed that the world could generate more data in the next three years than over the past 30, so businesses must be equipped to handle this torrent of information. Additionally, the demand for local data analysis and data exchange to support digital workflows is outpacing organisations’ ability to classify, secure, transport and process data across regions. Multicloud connectivity platforms enable businesses to implement a data architecture that captures data at the edge and transports it securely to the nearest locations for processing to deliver real-time insights. Edge devices also capture sensitive information that must be encrypted and secured with a security system that has layered access to data. A multicloud connectivity platform can help facilitate, monitor and orchestrate these capabilities from a single access point.

Expansion of the value chain creates disruptive digital ecosystems.

Digital business ecosystems thrive in shared platforms, as the curated environments allow companies to join forces, co-create, and innovate with partners. They also encompass a network of partners, developers, and customers, and the cloud is increasing interconnectivity across these different stakeholders. This has led to the need for an open environment for modelling and building interoperable system integration. Cloud integration offers a cost-effective business model to implement and leverage an interoperable digital supply chain with disruptive capabilities. Enterprises seeking to harness the benefits of digital ecosystems must implement a cloud network infrastructure that supports a globally distributed network, facilitating partnerships with smart interconnection within a secure environment. A multicloud connectivity platform also allows organisations to plug into any application, as orchestrating cloud interconnections becomes hardware-agnostic. Cloud-agnostic plug-ins allow enterprise systems to directly embed a platform without complex modifications to create network mesh connectivity.

It's time to embrace change

Managing individual cloud systems separately creates a dispersed cloud management system that leads to operational silos, blind spots, inefficiencies, and cloud sprawl. It exposes the organisation to security vulnerability, challenges in performance management and cost inefficiencies. A multicloud strategy is an exciting proposition to help organisations reap the benefits of digital transformation at speed and scale, while eliminating the risks mentioned above. However, operational complexities persist, as multicloud requires integrating differing technologies, controls, and systems, all of which can weigh down the ITOps team. This underlines the importance of balancing adoption with a thoughtful approach to how different clouds are interconnected. If businesses rush their approach, the cost of mismanagement may far outweigh the benefits.
Instead, leaders should take their time to ensure things are done well, while not losing sight of the urgent need to make changes.

### CRM testing: a six-step guide

Programmatic solutions for customer relationship management (CRM) have become a top choice for businesses across diverse industries. According to IBISWorld, the US CRM market size has already reached $46.2bn in 2023, growing faster than the whole technology sector. A CRM system is the backbone of many businesses, helping to streamline sales and marketing processes. It can also provide valuable insights into customer behaviour and preferences. However, your CRM should be properly tested and implemented to give you maximum benefits. Let's see what it takes.

Why should you test CRM?

The fact that CRM software is often an out-of-the-box (OOTB) solution provokes a misconception that if a trustworthy CRM vendor has already tested the tool, you don’t need to do it anymore. However, CRM solutions require customisations or new code to fit particular business requirements. Logically, adding new lines of code calls for thorough testing of new features and their interactions with the existing functionality.

How to test a CRM in six steps

CRM is a multi-module system with complex interactions across different components. Therefore, we offer a comprehensive six-step guide to help you make the testing process efficient.

Step 1: Designing test data

Test data for CRM testing should be well-drafted. For example, you shouldn't use confidential customers' information from your databases. Instead, use test data that resembles the actual data from the system. For example, you can add the formats of addresses, zip codes, or phone numbers that are typical of your country or region. Using such test data helps avoid UI issues commonly caused by random test data, such as wrong address or currency rendering.

Step 2: Functional testing

Functional CRM testing usually involves checking how well the system aligns with user requirements. It’s critical to verify that the program has all the required features and that all its components work together as planned. Functional testing is based on customer requirements. Hence, it checks features and their performance and also helps verify the established workflows for different user roles. Moreover, functional testing helps prevent data mismatch for users with similar personal names, task names, etc. It’s also essential to ensure that the introduced configurations and settings don’t provoke any malfunctions in the application.

Step 3: Integration testing

After successfully migrating to a new CRM, you must check if the data can travel throughout the system according to the established business rules. To test the integration between the CRM and other internal systems, you can use the inbuilt search function. Employing targeted search queries, you'll check how input and output data correlate and assess workflow implementation. If the input and output data coincide, the workflow is working correctly. If not, it requires bug fixing.

Cross-system integration

Many CRM platforms leverage integrations with other systems for more efficient operation. For example, Microsoft Dynamics CRM has a solid link to SharePoint. However, for such integration, you should also check the following: whether the data is synced across the systems; whether user permissions transfer smoothly across the systems; and whether document sharing works in both directions without CRM access. CRM data should remain consistent during cross-system transfers.
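To illustrate the kind of cross-system consistency check described above, here is a minimal sketch. The record sources are stood in for by hard-coded dictionaries (a real test would pull them from the CRM and the linked system via their respective APIs), and the field names are invented for the example.

```python
# Minimal sketch of a cross-system consistency check.
# In a real test the two dictionaries would be fetched from the CRM and the
# integrated system (e.g. a document store) via their respective APIs;
# here they are hard-coded purely for illustration.

crm_record = {"account_id": "A-1001", "name": "Acme Ltd", "owner": "jsmith", "shared_docs": 3}
linked_record = {"account_id": "A-1001", "name": "Acme Ltd", "owner": "jsmith", "shared_docs": 2}

FIELDS_TO_COMPARE = ["account_id", "name", "owner", "shared_docs"]

def find_mismatches(left: dict, right: dict, fields: list) -> dict:
    """Return the fields whose values differ between the two systems."""
    return {f: (left.get(f), right.get(f)) for f in fields if left.get(f) != right.get(f)}

mismatches = find_mismatches(crm_record, linked_record, FIELDS_TO_COMPARE)
if mismatches:
    # In a test suite this would be an assertion failure; here we simply report it.
    print("Data drifted between systems:", mismatches)
else:
    print("Records are consistent across systems.")
```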
Because of this, any adjustments made to the data through user actions or automated processes should be securely documented for future reference. Furthermore, the modifications should be visible to all system users so they can track the history of changes.

Step 4: Performance testing

Performance testing will help you check if your CRM is scalable enough to manage large amounts of data. To attain the desired performance, you can simulate various scenarios by adding or editing a significant number of records. What's more, many employees usually use a CRM system simultaneously. Hence, the system should sustain a high load without sudden outages or data loss. Therefore, you should test how the CRM works under both peak and minimal loads. When testing, monitor the system response time and loading speed.

Step 5: Security testing

As CRM software hosts business-critical information, its bulletproof security should be among your top priorities. As a rule, such systems have several security levels:

Role-based access and multi-factor authentication. These measures help protect the application from unauthorised third-party logins. At the same time, they ensure that each user has access to the data they need for work and no more.

Vulnerability scanning. Cybersecurity experts will study the system and find weak links prone to leaks. As a rule, such specialists provide tips on the enhancements necessary to patch the loopholes.

Penetration testing. CRM software is not immune to hacking attacks. In December 2022, such an attack occurred at SevenRooms, a restaurant CRM platform from New York. After the attack, malicious actors threatened to sell 427GB of the stolen customer data on hacking sites, providing data samples. Penetration testing can help prevent such incidents. Ethical hackers simulate attacks using the same tools and techniques malicious actors employ. They do so to detect dangerous weaknesses that hackers could exploit and fix them before a real attack occurs.

Step 6: Regression testing

Regression testing is a critical step that helps ensure that none of the previously delivered functions get broken after a new feature deployment. This testing type also helps verify that the system is stable and runs smoothly after all the modifications. Regression testing is a top candidate for automation due to its repetitive nature. In this case, automation is a worthwhile effort that helps reduce testing time and the rate of human error.

Conclusions

Testing a CRM system thoroughly before implementing it in your organisation is crucial for the solution’s future performance. To make sure the CRM software works well, you should prepare realistic test data and run the following testing types: functional testing to check if the system works as required; integration testing to ensure all modules process data correctly; performance testing to check if the system handles different user loads efficiently; security testing to ensure the system data is securely stored and protected from outside threats; and regression testing to ensure new features don’t cause malfunctions in previously added functions or the whole system. CRM testing will help you identify any potential issues before they can cause any damage or disruption in operations.

### How AI can improve battery cell R&D productivity

Everywhere you turn these days, there is news about the power of artificial intelligence (AI) to transform our lives - from diagnosing cancers to writing movie scripts.
Now AI practitioners are turning their attention to electric vehicle (EV) batteries. Data-driven, machine-learning-based approaches have received much attention from both academia and industry over the past decade. Research into the use of AI has shown promise for accurately predicting the dynamics of nonlinear multiscale and multiphysics electrochemical systems. Applications range from battery state of health (SOH) estimation, safety and risk prediction, cycle life prediction and battery lifetime prognostics, to closed-loop optimisation of fast-charging protocols and identifying degradation patterns of lithium-ion batteries from impedance spectroscopy using machine learning. The predictive ability of AI is, however, challenged when it comes to EV lithium-ion battery research and development, where extensive datasets do not yet exist. This is particularly true when predicting the cycle life of the battery and the cells making up the pack. Small changes to the chemical and/or physical properties of the anode, cathode, electrolyte, and even the separator can have a significant impact on the battery’s performance. The enormity of the task is better understood when the time required to predict a cell’s life through experimentation alone is considered. Due to the long service life of batteries, simulating the cycle life of the cell during operation in an EV would require a test regime of at least 500 cycles. Following the extreme fast charge, discharge, and rest cycle would normally limit the number of cycles to about 17 per day. This means that for every iteration of even the smallest of changes, it would take more than a full month of uninterrupted testing (roughly 500 cycles at 17 per day) to predict the cycle life. However, in practice it is not uncommon to conduct up to 200 experiments in parallel – often comprising groups of 8 cells or more. Running these test procedures while recording all the measurements every second from every one of the thousands of independent battery test channels creates an enormous amount of data. Even with the solution being programmed, automated, and cloud-based, allowing all these experiments to be conducted every week, the batteries would still need many months of testing to simulate the cycle life. This is a time-consuming and expensive bottleneck in battery research. When conducting their cutting-edge research into Extreme Fast Charging (XFC) battery technologies, researchers at StoreDot found they needed a better way to predict each battery’s end of life without losing valuable data insights and transparency. By choosing to use artificial intelligence to do the ‘heavy lifting’, StoreDot became one of the first battery-tech companies to implement AI in the R&D phase of EV battery development.

Deploying AI in battery R&D saves time and money

Analysing the results obtained from the complex set of experiments carried out on diverse battery-cell chemistries and designs is very similar to clinical trials typically conducted on sample sets of patients. This led the research team to investigate the possibility of merging the world of battery chemistry with the concepts and methodology applied to such patient clinical trials. Using Evolution Intelligence’s AI-for-AI platform, StoreDot built and optimised their models, which indicated that the Kaplan-Meier (KM) graphs and methodologies would be ideally suited to investigating, learning, and predicting the battery lifetime.
This would speed up the battery R&D cycle – in particular, predicting the battery lifetime and, equally important, learning from the explainable algorithm what its significant markers and metrics were: a very valuable lesson in understanding our unique chemistry and its boundaries. The genetic KM algorithm takes into consideration the hundreds of “genes” originating from chemistry formulations, cell design decisions, and production process measurements. Coupled with aggregated and augmented test measurements, a forest of decision trees is created dynamically to explain the results. This ad-hoc clustering method calculates the survivability of each battery in the resulting tree leaf and uses that information to maximise the pre-determined parameters. In essence, as the machine learns and improves the tree leaves – making the batteries in each leaf more similar to each other – the lifetime predictions become better. The resulting nodes in the trees are the algorithm’s insights about how to improve battery life and can be used to establish cell design parameters, manufacturing parameters, or specific timed measurements. Thus, the lifecycle can be predicted to within 15 percent accuracy after only 125 completed cycles, instead of 500-1500 – thereby freeing up resources and slashing the time needed for evaluation and decision-making. What is more, the algorithm allows researchers to cut short any experiment that does not meet the set objectives, whilst adding the successful candidates’ results to the database for inclusion in future experiments.

Moving to the next generation of predictors

As the project moves from R&D onto real-life pilot lines, the transparency of the models becomes less important, while accuracy and early prediction points become crucial for the manufacturing and operability phases of the battery. At this stage, the KM graphs were replaced by tighter clustering algorithms that learn from the highly augmented time-series measurements of the battery’s life as it progresses. With thousands of calculated trajectories of key performance indicators over the current lifetime of the battery, the algorithm optimises for statistical significance and chooses the X most important such vectors that hold most of the signal to accurately predict the remaining useful life of the pack. Here, Evolution’s AI-for-AI platform was used to optimise and select thousands of “genetic algorithm” generations from millions of potential combinations. The results allow researchers to accurately predict multiple future targets at earlier prediction points (for instance: cycles 32, 62, and 122 – equal to 2, 4 and 8 testing days, respectively) with appropriately high accuracies. In the Predicted vs Actual “Cycle @ retention 85%” graph below, the training set is shown in blue, while the newer, never-seen-by-the-algorithm test set is in orange, demonstrating a high prediction accuracy. Clearly, the benefit of AI and machine learning during the R&D process is proving invaluable in enabling researchers to evaluate the impact of changing more than one variable at a time, thereby speeding up the process of accurately determining a battery’s lifecycle. However, AI and machine learning’s ability to forecast battery life has other benefits for high-tech research companies such as StoreDot – it provides management, and investors, with a valuable tool to predict the viability of untried nascent technologies.
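As a rough illustration of the survival-analysis idea described above (not StoreDot's actual models or data), the sketch below fits a Kaplan-Meier estimator to a handful of synthetic cell cycle lives using the open-source lifelines library; cells whose experiments were cut short are treated as censored observations.

```python
from lifelines import KaplanMeierFitter

# Synthetic data for illustration only: cycles completed before the cell reached
# end-of-life (or before the experiment was stopped early).
cycles_completed = [310, 540, 420, 700, 650, 480, 390, 820, 560, 610]
# 1 = the cell actually reached end-of-life, 0 = experiment cut short (censored).
reached_eol =      [1,   1,   1,   0,   1,   1,   1,   0,   1,   1]

kmf = KaplanMeierFitter()
kmf.fit(durations=cycles_completed, event_observed=reached_eol, label="baseline chemistry")

# Median cycle life estimated from the (partly censored) sample.
print("Estimated median cycle life:", kmf.median_survival_time_)

# Probability that a cell survives beyond selected cycle counts.
print(kmf.predict([300, 500, 700]))
```

Fitting separate curves per formulation or cell design and comparing them is what lets cut-short experiments still contribute information, which is the point the article makes about censored test runs.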
AI used in EV battery R&D delivers other important hidden benefits In the rapidly developing EV battery sector, with new breakthroughs being announced and delivered almost daily, it may be difficult to identify which technologies to pursue or invest time and money in. Even more so if you have a small team of researchers competing with large, well-funded corporations. Prediction and forecasting of key battery performance metrics in the early stages of a project can give decision makers and management a more tangible indication of the likelihood of success for a given idea, technology, or even business model. In a landscape filled with exciting technologies and noble ideas, not all will eventually converge into a successful product or technology. Predictive results obtained through AI improve confidence in the chances of a successful convergence. At the same time, the deployment of AI and machine learning can often supplement hard-to-find skills and talent in battery R&D. This is particularly helpful for smaller organisations that are competing in the same space as highly funded and well-established industry stalwarts. Effective implementation of AI gives these smaller operations a better ‘David and Goliath’ shot at levelling the playing field. Conclusion With the rapid advances being made in AI, the technology is set to assume a pivotal role in reducing costs and improving performance across the EV battery industry. StoreDot’s R&D success in developing a cutting-edge extreme fast charging battery technology is proof of the effectiveness of AI in saving time and money previously spent on processes such as repetitive life cycle testing. However, even though AI is ideally positioned to revolutionise EV battery R&D, it has many other, equally profound applications in the all-electric vehicle industry, from managing sensitive global supply chains to aggregating and evaluating data gathered from fleets through cloud computing to improve battery management. By creating smarter batteries with embedded sensing capabilities and self-healing functionality, connected EVs’ battery management systems can continuously monitor ‘state of health’ and even rejuvenate selected battery cells or modules if required. What is more, when enabled through the IoT and over-the-air updates, EV performance and safety can be enhanced to operate closer to the extremities of the envelope. Whether this comes in the form of optimising range or reducing the time to charge, one thing is for sure: AI will play a key role in reducing “range anxiety” and speeding up the adoption of BEVs. ### Top Three Priorities for a Successful Cloud Security Strategy In recent years, cloud computing has grown in popularity as a means for businesses to store, process, and manage their data. This is because cloud services provide numerous advantages over traditional on-premises infrastructure, such as scalability, cost-effectiveness, and ease of use. As a result, many enterprises today extensively use cloud services at scale and are looking to move further workloads to the cloud. This rapid adoption of cloud services has, however, not been accompanied by an equal emphasis on cybersecurity measures. While cloud providers offer some security measures, the enterprise assumes the ultimate responsibility for data and application security.
An attack could be devastating; according to IBM, the average total cost of a data breach is $4.35M. While it’s important to look at the numbers, data breaches also result in other costs such as reputational damage, legal issues and decreased levels of trust. Organisations are leaving the door open for bad actors and hackers to exploit vulnerabilities as a result of this gap in cybersecurity measures. Cyberattacks on cloud systems are becoming more common, often with catastrophic consequences: 69% of organisations reported data breaches caused by multi-cloud security misconfigurations. These attacks can cause data breaches, sensitive information theft, and even business disruption. Enterprises must take a proactive approach to mitigate the risks associated with cloud-based cybersecurity threats. Regulatory Compliance Organisations have a legal and ethical responsibility to comply with all relevant regulatory requirements, and this is especially true when it comes to cloud security. The consequences of non-compliance can be severe, including the imposition of fines, potential legal disputes and a loss of goodwill. Many regulatory frameworks, such as the General Data Protection Regulation (GDPR), have specific requirements for cloud security, including data encryption, access control, incident response planning, and regular security assessments. Failure to comply with data protection regulations may result in severe penalties: under the GDPR, organisations can be fined up to 4% of their annual global revenue or €20 million, whichever is greater. Non-compliance can also result in reputational harm and loss of customer trust, in addition to financial penalties. Data breaches often make the news headlines, with social media leading to greater awareness of incidents. Furthermore, regulatory compliance is a legal requirement in some industries. The healthcare industry, for example, is highly regulated, and organisations that fail to comply with regulations such as the Data Protection Act can face significant fines and reputational harm. When highly sensitive or personal information is involved, regulatory compliance takes on a whole new dimension. IoT and the growth in mobile technology have resulted in companies possessing data that is far more sensitive than before. Therefore, organisations must prioritise regulatory compliance and ensure their cloud security measures are as extensive as possible and align with relevant standards. By doing so, organisations can reduce the risk of penalties and protect their reputation and customer trust. It is also essential for organisations to regularly review and update their cloud security policies to stay up-to-date with changing regulatory requirements and emerging threats. Ensuring Observability The phrase "prevention is better than cure" certainly applies to the cloud. Observability in the context of cloud security refers to the ability to monitor, measure, and analyse various aspects of the cloud infrastructure, such as applications and data flow, to detect security threats and vulnerabilities before they turn into incidents. With data volumes constantly growing and the threat landscape becoming more complex, cybersecurity teams are under pressure to keep up with a range of shifting demands and requirements.
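As a hedged illustration of what that monitoring can look like in practice, the sketch below scans a simplified access log for one basic anomaly signal – a user suddenly appearing from an IP address never seen before for that account. It is a toy heuristic with an invented log format, not a substitute for a full observability platform.

```python
# Toy anomaly check over a simplified cloud access log (hypothetical format).
# Flags events where a user appears from an IP address not previously seen
# for that account - one of many signals an observability pipeline might raise.
from collections import defaultdict

access_log = [
    {"user": "alice", "ip": "10.0.0.5",      "action": "read_bucket"},
    {"user": "alice", "ip": "10.0.0.5",      "action": "read_bucket"},
    {"user": "bob",   "ip": "10.0.1.9",      "action": "list_keys"},
    {"user": "alice", "ip": "203.0.113.77",  "action": "download_all"},  # unusual source
]

seen_ips = defaultdict(set)
for event in access_log:
    known = seen_ips[event["user"]]
    if known and event["ip"] not in known:
        print(f"ALERT: {event['user']} acted from new IP {event['ip']} ({event['action']})")
    known.add(event["ip"])
```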
Because observability provides visibility into the various components of the cloud infrastructure, security teams can detect anomalous behaviour or suspicious activity that could indicate a potential security threat. Furthermore, observability can assist organisations in better understanding their cloud environment, such as how data flows through the network, what resources are being used, and who is accessing them. This data can be leveraged to develop more effective security policies and controls, as well as to prioritise security investments based on the most critical infrastructure areas. Ultimately, ensuring cloud observability is critical for successful cloud security. It enables businesses to stay ahead of potential security threats and vulnerabilities and to respond quickly and effectively in the event of an incident. Organisations can ensure the security and integrity of their cloud infrastructure and protect their valuable data and assets from cyber threats by investing in observability tools and practices. A window to the future can make all the difference between a successful and unsuccessful cloud security strategy; cybersecurity measures must keep up with the pace and scale of the cloud. Cost of Implementation vs Cost of Inaction Businesses face numerous challenges in the digital era, and it is critical to ensure that investments in cloud security provide a solid return on investment (ROI). A successful cloud security strategy not only protects the organisation from cyber threats but also improves the cloud infrastructure. Businesses can reduce the risks of security breaches and incidents by investing in cloud security measures. They can also improve the reliability, availability, and performance of their cloud-based systems by preventing bad actors from accessing and disrupting systems. This can result in increased productivity and efficiency, as well as higher customer satisfaction and, ultimately, higher revenue and profitability. Because trust has become a watchword on the business agenda today, investing in security demonstrates a strong intention to protect what matters to people, in this case, their data. While the cost of cloud security measures may appear to be high, the cost of a data breach or security incident can be far greater and more damaging. Data breaches carry direct costs such as legal fees, regulatory fines, and customer compensation, as well as indirect costs such as lost revenue, reputational damage, and decreased customer trust. Furthermore, the impact of a security incident can go beyond financial costs to include operational disruptions, legal liabilities, and brand damage. Businesses can help to prevent such incidents and protect themselves from potentially disastrous consequences by investing in cloud security. ### Maximising ROI in B2B Digital Marketing Every business aims to generate revenue, and B2B digital marketing can help achieve that objective. However, to maximise ROI in B2B digital marketing, businesses must first understand the best strategies and practices. This includes identifying the target audience, performing competitor analysis, and staying updated with industry trends. Some strategies that work for maximising ROI in B2B digital marketing campaigns include: Search Engine Optimisation (SEO), Content Marketing, Pay-Per-Click (PPC) Advertising, Email Marketing, Social Media Marketing, and Account-Based Marketing (ABM). Best practices for maximising ROI in B2B digital marketing include: developing a strong content strategy,
using data to inform decision-making, personalising the approach, measuring and optimising campaigns, and aligning sales and marketing teams. A strategic approach that meets the business’s specific needs is essential for maximising ROI in B2B digital marketing. Maximising ROI in B2B Digital Marketing: Strategies and Best Practices With the ever-changing digital marketing landscape, it can be challenging for businesses to maximise their return on investment (ROI) effectively. To do so, businesses must understand the best strategies and practices for optimising their digital marketing efforts. In this article, we have shared the key strategies and best practices that businesses can use to maximise ROI in their B2B digital marketing campaigns. The Power of ROI in B2B Marketing ROI, or return on investment, is among the most important factors in B2B digital marketing. Every business’s main objective is to generate revenue and growth, and the success of its digital marketing campaigns is often measured by the ROI they generate. ROI measures the effectiveness of a marketing campaign by comparing the amount of money spent on the campaign to the revenue generated as a result. This way, B2B companies can identify the most effective campaigns and adjust their strategies accordingly. Ways to Understand the B2B Digital Marketing Landscape To maximise ROI in B2B digital marketing, you must first understand the landscape in which your business operates. This includes learning who your target audience is, conducting competitor analysis, and staying up-to-date with industry trends. Identifying your target audience Knowing your target audience is very important, especially in B2B digital marketing. You need to know your customers, their pain points, and what can make them purchase. Using this information, you can customise your marketing message to get their attention right away. Competitor analysis This is the process of identifying and analysing your competitors’ strengths and weaknesses to figure out whether you need to do something differently. By understanding what your competitors are doing well and where they fall short, you can identify opportunities to differentiate and improve your business. Industry trends Staying up-to-date with industry trends is a significant step towards effective B2B digital marketing. As a business, you must keep track of emerging technologies, changes in consumer behaviour, and new marketing tactics. Strategies for Maximising ROI in B2B Digital Marketing Search Engine Optimisation (SEO) Search engine optimisation (SEO) improves your website’s visibility in search engine results pages (SERPs). To maximise ROI with SEO, you should first identify and target the keywords and phrases your target audience uses to search for products or services like yours. Many online SEO tools, such as Ahrefs, Moz, SEMrush and SERPstat, can help, so you don’t have to do all the research yourself. Content Marketing Content marketing is focused on creating and sharing engaging content that offers value to a specific audience. To maximise ROI, create high-quality content that resonates with your target audience. The more visitors click on your content, the higher the chances of driving traffic, generating leads, and building brand awareness. Pay-Per-Click (PPC) Advertising Pay-per-click (PPC) advertising is a popular form of digital advertising where advertisers pay each time a user clicks on one of their ads.
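Before looking at how to succeed with each channel, here is a minimal sketch of the basic ROI calculation described above, applied to a hypothetical PPC campaign; all figures are invented for illustration.

```python
# Hypothetical PPC campaign figures - an illustration of the basic ROI calculation.
clicks = 2_000
cost_per_click = 1.50           # spend per click
conversion_rate = 0.03          # share of clicks that become customers
revenue_per_conversion = 400.0  # average deal value

spend = clicks * cost_per_click                               # 3,000
revenue = clicks * conversion_rate * revenue_per_conversion   # 24,000
roi = (revenue - spend) / spend                               # 7.0 -> 700%

print(f"Spend: {spend:,.0f}  Revenue: {revenue:,.0f}  ROI: {roi:.0%}")
```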
Similar to SEO, to succeed in PPC it’s essential to identify the right keywords and target the right audience. Email Marketing Email marketing focuses on sending targeted messages to a specific audience via email. To get great results, you must first segment your audience, develop highly targeted campaigns, and continually optimise your messaging and content. You can build trust and loyalty, grow an audience, and drive sales by delivering personalised, relevant content to your subscribers. Social Media Marketing Social media marketing is the practice of connecting with your target audience and building an online presence via social media platforms. To maximise ROI with social media marketing, you should find the right platforms for your business, develop a strong content strategy, and engage with your audience on a regular basis. Account-Based Marketing (ABM) Account-based marketing (ABM) is a marketing strategy focusing on specific accounts or customers. To get the most out of it, it’s necessary to identify and prioritise high-value accounts, develop highly personalised campaigns for your audience, and align your sales and marketing teams. Best Practices for Maximising ROI in B2B Digital Marketing Develop a strong content strategy Good content will naturally align with your business goals and target audience. If you know your audience well, creating high-quality content that resonates with them will be easier than ever. A strong content strategy can help you attract your target audience, drive traffic, and generate leads that can convert into sales. Use data to inform decision-making Using data to inform decision-making is critical to maximising B2B digital marketing ROI. For example, you can gain valuable insights into your audience’s behaviour and preferences by tracking and analysing data from website analytics, customer feedback, and engagement rates. This way, you will know where to focus your efforts and allocate resources to the campaigns driving the highest return. Personalise your approach Customising your messaging and content to your target audience’s specific needs and preferences will contribute to higher engagement and stronger relationships with prospects and customers. This can lead to increased customer loyalty and higher conversion rates, resulting in greater ROI for your business. Continuously measure and optimise your campaigns Analysing data from your campaigns will show the areas you can improve and adjust for better performance. This includes testing different messaging, targeting different audiences, and adjusting ad spend to optimise for the highest return. Align your sales and marketing teams Developing a cohesive strategy for your business goals is better achieved through teamwork. Some strategies that work are developing a shared understanding of your target audience’s pain points, running targeted campaigns that align with the buyer’s journey, and developing plans to nurture leads and close sales. Conclusion Maximising ROI in B2B digital marketing requires a strategic approach that meets the target audience’s preferences, leading to data-driven and profitable results. By refining their digital marketing efforts, businesses can position themselves for long-term success and differentiate themselves from competitors. And with the strategies and best practices outlined in this article, marketers can confidently optimise their digital marketing efforts and achieve the maximum ROI possible.
### <strong>The Importance of Investing in EDR for SMEs</strong> In the first six months of 2022, over 10,000 new ransomware variants were discovered. As threat actors increase in number, the frequency of attacks witnessed across the globe will continue to rise substantially. No business is immune from a breach, as highlighted by the many cases making headlines today. So, how can SMEs counteract this? High-end, enterprise-level security tools may be regarded as out of reach for the majority of small businesses, but that thinking is rapidly changing. For the new year, companies should consider smart endpoint detection and response (EDR) solutions that incorporate a robust incident management portal that effectively traces all open threats. Without investing in smart software, smaller businesses are underprepared, and consequently at risk, explains David Corlette, Vice President – Product Management, VIPRE Security Group. The Heat Is On Not only are small organisations facing additional pressure to protect their business from cyber attacks, but threats are also becoming more innovative, with ready-made tools accessible in numerous forms available even to casual attackers, from Ransomware-as-a-Service and Phishing-as-a-Service to Malware-as-a-Service. The average breach costs upwards of £4.5 million, but small businesses only make an average of £2.8 million per year. That’s a tight margin, and one with severe repercussions – with one out of eight small organisations closing down as a result of a data breach. Discouragingly, a new poll discovered that the majority of small business owners are not suitably concerned. Only 4% put cybersecurity as the top risk facing their business, and a further 61% were not worried about the possibility of their company becoming the target of a cyberattack. This data alone shows that SMEs need to be more cautious – and more prepared – than ever. Closing The Gap Just as bad actors have targeted bigger organisations, small businesses are progressively becoming targets too, but they are more likely to be victims due to their "it won't happen to us" or “too small to hack” mindset: they lack efficient solutions to protect themselves. Deterrence is difficult when the common methods – hiring big IT teams, levelling up solutions, training existing employees – are often cost- and resource-prohibitive for smaller organisations. Nevertheless, small businesses can draw on existing expertise to create an enterprise-grade detection and response strategy using newer Endpoint Detection and Response (EDR) technology. When it comes to protecting their business, some EDR solutions make it easier for smaller companies to keep up with the larger players, as they deliver the sophistication of high-performing, cloud-based solutions without the problems that users may anticipate. This advanced technology detects and surfaces more abnormal behaviour than users would see from standalone antivirus file, process, and network analysis tools, while simultaneously providing investigation and remediation tools to quicken response times. By automating much of the elaborate work – threat detection, remediation, and response – EDR can minimise the skills gap and even put small organisations ahead.
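To illustrate the kind of behavioural signal such engines look for, the sketch below implements a deliberately naive heuristic – flagging a process that rewrites an unusually large number of files in a short window, a pattern typical of ransomware encryption. It is a toy example with an invented event format and thresholds, not how any particular EDR product works.

```python
# Toy behavioural heuristic: flag processes that modify many files in a short
# time window, a pattern often associated with ransomware encryption.
# Event format and thresholds are invented for illustration only.
from collections import defaultdict

WINDOW_SECONDS = 10
THRESHOLD_FILES = 3   # deliberately tiny so the toy data below triggers an alert

file_events = [  # (timestamp, process_name, path) - synthetic telemetry
    (1000.0, "backup.exe",  "C:/data/report1.xlsx"),
    (1000.5, "unknown.exe", "C:/data/report1.xlsx.locked"),
    (1000.9, "unknown.exe", "C:/data/report2.xlsx.locked"),
    (1001.4, "unknown.exe", "C:/data/photos1.jpg.locked"),
    (1001.8, "unknown.exe", "C:/data/photos2.jpg.locked"),
]

def suspicious_processes(events):
    recent_writes = defaultdict(list)   # process -> timestamps of recent file writes
    alerts = set()
    for ts, proc, _path in events:
        recent_writes[proc] = [t for t in recent_writes[proc] if ts - t <= WINDOW_SECONDS]
        recent_writes[proc].append(ts)
        if len(recent_writes[proc]) >= THRESHOLD_FILES:
            alerts.add(proc)            # candidate for automatic isolation / process kill
    return alerts

print(suspicious_processes(file_events))  # {'unknown.exe'}
```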
Relying on this kind of automation can be more cost-effective than directly employing someone full-time to keep pace with the latest threat landscape; not only are new employees becoming increasingly difficult to find, but they are also expensive to hire. What’s more, having an EDR tool that utilises security automation is paramount for detecting and stopping an attack in its early stages. Using behavioural engines, security teams can trace each part of the attack as it happens in near real-time. With the help of AI and Machine Learning (ML), a modern EDR can natively recognise anomalous activities, such as zero-day ransomware behaviour, and immediately stop these processes upon detection. Looking Ahead According to experts, the market for EDR solutions will continue to grow; Gartner, for instance, estimates that by 2025, more than 60 percent of businesses will have shifted from traditional antivirus software to solutions that provide endpoint protection and endpoint detection and response. Owning a modern Endpoint Detection and Response tool is fundamental for any security team’s arsenal, but it is particularly important for SMEs. It offers the holistic security approach needed to mount a successful defence in the latest threat landscape, as EDR solutions provide vital and rapid containment measures, preventing a breach from doing additional harm to a network. But EDR solutions also provide strategic long-term benefits by reinforcing security posture and allowing organisations to protect against known and zero-day threats. Right now is the prime time for smaller businesses to invest in cybersecurity technologies, such as EDR, to help small-scale teams meet the same security demands as bigger businesses, and to protect the SME and its staff from cyber attacks. Evidently, utilising the appropriate tools can make a substantial difference for small businesses and their security teams in the upcoming year. ### <strong>Creating consumer peace-of-mind with all-encompassing data security</strong> Personalised experiences are a much-needed aspect of customer communications today. Google research shows that nine in ten (90%) marketers believe that personalised strategies make a tangible impact on business profitability. However, tailoring communications to the needs of specific customers requires in-depth knowledge of their actions, wants and desires. This means leveraging potentially identifiable and sensitive data to create experiences that resonate with the end consumer. Organisations across both the retail and travel sectors need to ensure that when such data is being used to deliver personalisation strategies that meet customer requirements, personal data is being handled responsibly. However, high-profile incidents such as the Facebook-Cambridge Analytica scandal and the data breaches at Twitter have raised awareness among consumers of how their personal data is collected and used. The implementation of GDPR has also shone a spotlight on how companies store and leverage consumer data. As trust in data protection wanes, how can organisations build the necessary confidence and ensure compliance with regulations? Rising consumer awareness The way that data is processed and consumed by organisations is increasingly entering the consciousness of consumers. Many are now taking action to protect their data privacy, such as by asking companies they deal with to disclose the information they have stored about them, or requesting that they be removed from mailing lists that aren’t relevant.
Some are even threatening to boycott organisations that fail to protect their personal information or are slow to act when it comes to data deletion requests. Any technology perceived to be privacy-invading by nature is now under close attention, with pressure from both consumers and regulators for it to be eradicated or altered to better protect people’s privacy. A prime example of this is third-party cookies. Once the go-to tracking solution for organisations and integrated on all major web browsers, this technology has now been removed from most of them. Google Chrome remains the only browser to utilise these cookies, but intends to remove them by 2024. With a cookie-free future almost a certainty, any organisation hoping to rely on such solutions will struggle to process relevant data in a way that meets consumer expectations and to use it for personalisation strategies. As the consumer landscape continues to evolve rapidly, organisations that fail to take steps to tighten up their data processing practices risk losing business from customers that expect more. However, any strategy to improve data protection and privacy will struggle to gain traction from the very beginning if internal processes aren’t fully optimised. A common challenge is the numerous data sources that organisations need to manage in their marketing activities, making it an almost impossible task to keep tabs on every single element of sensitive data. Employees can also disagree on how to move the strategy forward, which threatens the application of best practice. Being responsible with consumer data So that organisations can keep tabs on all their data sources, a customer data platform (CDP) can provide comprehensive security when processing sensitive information while enabling personalised actions. For example, local laws can apply depending on where sensitive information is stored. This is therefore a key consideration when choosing the country or region in which data resides. Where European organisations need to adhere to GDPR regulations, the US also follows its own laws that businesses must align with if they have operations there. The AICPA, for example, sets standards on how businesses should manage the privacy, security, availability, confidentiality and processing integrity of customer data. Enterprise-grade CDPs can ensure that SOC 2 compliance is delivered, enabling organisations to meet AICPA standards alongside other laws such as CCPA and PECR. Technology brought in should operate with everyone’s privacy front-of-mind, giving businesses the opportunity to commit to protecting individual data rights. Navigating a world without cookies Technology can provide the foundation for best-practice processing and storage of personal data, enabling businesses to win the trust of customers. However, the next hurdle that needs to be overcome is the adoption of personalisation strategies while staying compliant with regulations. Businesses that attempt to rely on third-party cookies will ultimately fail to effectively identify the people visiting their website, leading to an ineffective user experience. The answer is to utilise a one-of-a-kind privacy solution which is able both to identify a user within the restrictions of a cookie-less browser and to share these identities with authorised services where applicable. Businesses are able to avoid using cookies at any stage of the tracking process during a session.
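One common cookieless pattern – a generic illustration, not necessarily how the specific solution described above works – is to derive a pseudonymous first-party identifier server-side, for example by hashing a logged-in user’s email with a private salt, and to share only that opaque identifier with authorised services. A minimal sketch under those assumptions:

```python
# Minimal sketch of a first-party pseudonymous identifier (a generic pattern,
# not a description of any particular vendor's product). The salt stays
# server-side so the identifier cannot be reversed or reused by third parties.
import hashlib
import hmac

SERVER_SIDE_SALT = b"rotate-me-and-keep-me-secret"  # hypothetical secret

def first_party_id(email: str) -> str:
    """Derive a stable, pseudonymous ID for a logged-in user."""
    normalised = email.strip().lower().encode("utf-8")
    return hmac.new(SERVER_SIDE_SALT, normalised, hashlib.sha256).hexdigest()

# The same user always maps to the same ID, with no third-party cookie involved.
print(first_party_id("Jane.Doe@example.com"))
# Only this opaque ID - never the raw email - is passed to authorised services.
```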
Authorised third-party services, however, remain fully able to leverage insightful data with no disruption. This technology means that the work undertaken by marketing teams, who may have invested significant sums in bringing in leads and driving conversions from potential customers, is not wasted. It’s then as simple as leveraging a single interface to manage the different access levels for each service. Any consent or right-to-be-forgotten requests can be actioned quickly and easily. With insightful customer data driving decision-making, businesses are best placed to combine this information with multi-channel and automated personalised actions. There are a variety of ways that this can be facilitated, but some of the most common offerings include consistent buying experiences to encourage repeat purchases from loyal customers, product suggestions driven by the specific purchasing activity of a given user, or content that is closely related to a user’s search query. Adapting to an evolving landscape Data security and privacy are only going to move further into the consumer spotlight over the next few years. Concerns continue to grow for consumers who hand over their personal data to the businesses they interact with and purchase goods from. At the same time, data regulations are becoming stricter and fines are becoming more severe. Third-party cookies are also set to become a technology of the past. None of these trends needs to be a detriment to personalisation strategies, however. An enterprise-grade CDP gives businesses all the tools they need to drive consumer insight in a fair, transparent and compliant way while providing customised experiences. ### <strong>Bridging the gender gap in public transport is crucial for achieving equality</strong> No matter how established, extensive, and sophisticated a city’s public transport system is, it probably wasn’t designed with women in mind. If you’re a woman, you might have suspected that’s the case—but it’s not just in your head. As Caroline Criado Perez notes in her book Invisible Women: Exposing Data Bias in a World Designed for Men (2019), it’s something that impacts every aspect of transportation, starting with the routes. Picture the public transport routes where you live; or better yet, pull up an online map and plan a route from your home to the city centre. Now, do the same thing but add in trips to a school, a grocery store, and an elderly relative’s home (if you have one living in the area). Notice how much simpler the first one is and compare it to the complexity of the second one (even if it’s broadly in the same direction). Now, think about which trip is more likely to be taken by a man and which is more likely to be taken by a woman. Whether or not the people who planned those routes realise it, their designs actively contribute to gender inequality. Transforming that state of affairs means that everyone involved in public transport needs to take a far more consciously gendered perspective. Understanding the gender transport problem Before exploring what that perspective might look like, it’s worth taking a deeper dive into how transport routes contribute to gender inequality. Existing route patterns make it more difficult for women to make the journeys they need to, sometimes forcing them to change routes multiple times along the way. But this also means that women’s journeys can work out more expensive than those taken by men.
The time women spend on those trips also has a cost, making it more difficult for women to study, take on extra paid work, or spend time with friends and colleagues. Safety is also a major concern, and with good reason. According to research released in 2022, some 55% of women in London have experienced sexual harassment on public transport. That’s to say nothing of the fear that comes with waiting at a bus or train stop in the dark. Those safety concerns are only magnified for women travelling with children and other dependants. In the Majority World, those challenges are often much greater. For billions of women in emerging-market countries, formal public transport networks are either underdeveloped or completely non-existent. As a result, they have no choice but to rely on informal public transport networks. Make no mistake, those networks do fulfil a vital function and often have the agility necessary for enabling women to take the complex trips they need to. But their informal nature can also make navigating a city more challenging. Bus systems made up of hundreds of individual operators and lacking centralised coordination can mean higher costs and more time spent travelling. Safety challenges are particularly compounded in these contexts. Designing with a gendered perspective Those realities were something we knew we had to be cognisant of when designing our consumer public transport app Rumbo. Remember, women are, on average, 21% (Goel et al, 2022) more likely than men to use public transport to travel to work, with the gap increasing significantly when all trips are combined, making it even more important that public transport solutions are designed with them in mind. To ensure we built Rumbo with a gendered perspective, we took a human-centred approach using data and research. Through a suite of bespoke tools developed in-house, we mastered the mapping of public transport networks in multiple emerging-market megacities, digitalising every mode of formal and informal public transport, however it operates. But we also talked to women in those markets, joined their commutes, and learned about their day-to-day transport experiences. The lessons we learned through that initial research influenced every aspect of building Rumbo, up to and including font choice and typography. Given the stressful situations women all too frequently find themselves in when using public transport, we knew that we had to make the app as simple and intuitive as possible to use. In other words, by designing for women, we helped ensure that Rumbo was as safe and usable for everyone as possible. But that research didn’t stop once the product was launched. We’ve also worked with the think-and-do tank Data-Pop Alliance to survey women using Rumbo. The qualitative survey revealed a number of interesting results. We found that women travel to study, work, visit relatives, and for entertainment purposes. This is likely why, in both Lima and Mexico City, women frequently travel by public transport outside of rush hour: 44% in Lima and 53% in Mexico City. We also found that the number of women using public transport decreases as daylight fades, largely as a result of security concerns. That’s starkly illustrated by the fact that just 9% of women in Lima and 12% of women in Mexico City travel after eight o’clock at night. By taking a research-driven approach, we’re helping to make commuting safer, more affordable, and faster.
That’s not conjecture either: we’ve found that, over the past year, Rumbo has given commuters back 24 minutes every week. Getting that time back is useful to everyone, but it is especially so for women, who most frequently have to balance work with the role of primary caregiver. Incremental changes add up Of course, we know that a single app can’t magically make public transport more equitable for women. But our data shows that it can help, saving time and creating greater certainty in an extremely uncertain context. As we build information solutions alongside others building infrastructure and creating cultural change, those changes will add up. As a result, we’ll start seeing public transport systems built with women in mind. Ultimately, this will enable us to unlock fairer and more equitable cities for everyone. ### <strong>Backup Security: Protecting Business-Critical Assets from Ransomware</strong> A litany of research reports cite the inevitability of cyber-attacks and the ongoing sleepless nights this reality can cause. For instance, it’s estimated that a ransomware attack occurs every two seconds – and these attacks come at major cost, both financially and mentally. While ransomware and cyber-attacks top the list, they’re certainly not the only significant risks to critical data that compound the problem and cause IT leaders to lose sleep: Disruptions and outages: According to Uptime Institute, around 75% of data centre operators report having experienced an IT service outage in the previous three years.1 A single hour of downtime can cost $300,000 or more for 91% of mid-sized and large enterprises, according to ITIC.2 Human error: While security issues are the biggest cause of outages and disruptions, human error is also a major issue, cited by 54% of respondents to the ITIC survey as one of the top issues negatively impacting systems. Talent gap: Even prior to the current recession woes, the IT skills gap was causing consternation. It’s estimated that by 2025, 50% of all workers3 will need reskilling to keep up with technological advances, and 87% of executives4 say they’re already facing skills gaps. Creating a stronger data management strategy that’s rooted in resiliency – including an emphasis on backup strategy – can go far in helping teams tackle these persistent challenges. Improving backup strategy Just backing up your data isn’t enough. In fact, backups have become a major target for bad actors. The Veeam 2022 Ransomware Trends report5 found that backup repositories were targeted in 94% of attacks – and in 72% of incidents, the cyber criminals were at least partially successful. When an organisation’s recovery lifeline is at risk, it increases the likelihood that they’ll pay the ransom – and that’s what bad actors are banking on. However, creating a stronger, better-protected backup strategy is possible – and it’s much more complex than just storing a second copy of data. Building ultra-resiliency with immutability Key to data recovery is immutability. Immutable storage keeps data in a form that can’t be modified, deleted or tampered with in any way. This not only protects your data against ransomware, but it also protects it from inadvertent or malicious human action. Today, it’s widely perceived as a must-have when it comes to combatting ransomware. To truly protect mission-critical data, teams need to achieve ultra-resiliency, which describes air-gapped, immutable or offline copies of data.
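Many object stores expose immutability as a write-time setting. As an example of what that can look like in practice, the sketch below writes a backup object to Amazon S3 with Object Lock in compliance mode using boto3; the bucket name, key and retention period are placeholders, and the bucket itself must have been created with Object Lock enabled.

```python
# Hedged example: writing an immutable backup copy to S3 with Object Lock (boto3).
# Assumes a bucket that was created with Object Lock enabled; names are placeholders.
from datetime import datetime, timedelta, timezone
import boto3

s3 = boto3.client("s3")

retain_until = datetime.now(timezone.utc) + timedelta(days=30)

s3.put_object(
    Bucket="example-immutable-backups",          # placeholder bucket
    Key="daily/backup-2024-01-01.tar.gz",
    Body=b"...backup archive bytes...",          # in practice, stream the real archive
    ObjectLockMode="COMPLIANCE",                 # retention cannot be shortened or removed
    ObjectLockRetainUntilDate=retain_until,
)
# Until retain_until passes, the object cannot be overwritten or deleted, even by
# the account's administrators - which matters precisely when ransomware or a
# compromised credential tries to destroy the recovery copy.
```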
Having a copy of backup data that fits at least one of these criteria makes recovery far more resilient if a ransomware attack occurs and you need to restore data; a single copy may possess more than one of these characteristics. Thanks to immutability – now widely recognised as essential in the fight against ransomware – data is safeguarded from unsanctioned change or destruction, whether intentional or not. Many object storage solutions today include support for immutability. Unlike a file system, an object storage solution divides information into discrete, variable-sized objects and uses a unique key to identify each one; you supply the key, and the system retrieves the object on your behalf. And what’s perhaps most appealing is that object storage can store unstructured data at massive scale – petabytes and beyond. While most conversations around the need for data immutability focus on ransomware, immutable object storage can also protect your data from inadvertent or purposeful deletions or overwrites. Smarter data storage for better data security IT teams have their work cut out for them in today’s complex environment. Ransomware is a continuous and cruel reality, and these threats continue to grow in prevalence and sophistication. Backups have become a more tempting target for cyber criminals, and there is always the risk of human error. Keeping data safe has never been a bigger challenge. And on top of that, most IT teams are grappling with the ongoing challenge of having to do more with less – they must keep a growing amount of data stored securely and accessibly, but often with fewer people to tackle this. There is an obvious need for more robust storage and data management, as the complexity of applications and the value of data are both rising because of digital transformation. Immutability can be used to create an ultra-resilient backup system that keeps your data secure while remaining easier to manage. With attacks on backups on the rise, it’s not enough just to have backups – you have to ensure they’re protected even if (or when) an attack occurs. ### Getting Web Analytics Right: The Power Of Accurate Data If you’re looking to establish a strong online presence, you must have a strong grasp of web analytics. The insights gained from web analytics can help you understand your audience, improve your marketing strategies, and ultimately drive a significant increase in revenue. However, collecting accurate web analytics data can be quite challenging, especially if you’re dealing with a large volume of data. This is why, in this post, we will explore the power of accurate web analytics and how to get it right. Let’s get started! What Is Web Analytics? By definition, web analytics refers to the measurement, collection, analysis, and reporting of website data. It may include information about website visitors, page views, click-through rates, conversion rates, and bounce rates. The primary purpose of tracking web analytics is to measure your website’s performance. It helps identify areas of improvement, recognise your best marketing strategies, and, consequently, optimise the website for the best results. It’s possible to track web analytics using all-embracing tools like Google Analytics or Hotjar. But as your website grows and you start dealing with large amounts of data, there’s a chance your web analytics configuration goes awry, and you end up with inaccurate data (leading to wrong decisions).
This is where tag auditing tools like DataTrue come in. They allow you to monitor, audit, and validate your website’s tagging and analytics configuration. As a result, you can rest assured that your decisions are backed by the most up-to-date and accurate data. Types Of Web Analytics Broadly, there are two categories of web analytics: off-site and on-site. On-site analytics refers to the data collected directly from your website, including page views, bounce rates, and click-through rates. These analytics mainly help a business improve its website’s user experience and conversion rate. On the other hand, off-site analytics refers to data collected from external sources, such as social media, email campaigns, and referral links. These analytics help a business improve its search engine optimisation and overall online presence. Why Measure Web Analytics? We briefly discussed the power of web analytics above. Now, let’s look at some benefits of accurate web analytics in more detail. To Improve Website Performance Web analytics provide insights into how visitors interact with your website, i.e., which pages are most popular and where visitors are dropping off. You can use this information to make data-driven decisions about which pages to optimise so that they deliver the best user experience. This can increase engagement and ultimately drive more conversions. For example, a retailer may use web analytics to identify which product pages are the most popular and optimise these pages to improve the user experience. By doing so, they can increase the store’s conversions and revenue. To Understand User Behaviour Web analytics also provide insights into how much time web visitors are spending on certain pages and how frequently they interact with others. You can use this information to understand the likes and dislikes of your target market and produce more of what they like. For example, let’s say you have a news website and publish posts on topics X, Y, and Z. Say your audience spends the most time on pages related to topic Z. You can now create more content around this topic and boost your overall reader engagement. To Optimise Marketing Strategies Web analytics also offer insights into which marketing channels are driving the most traffic and conversions. You can use this information to optimise marketing efforts and improve return on investment (ROI). For example, let's say a startup uses web analytics to identify which social media channels are driving the most traffic and conversions. Once identified, the startup can reinforce its efforts and focus on building these channels more actively. To Personalise User Experience Since web analytics provide insights into visitor behaviour and preferences, they can help you personalise your audience’s experience. For example, if you have an e-commerce site, you can use web analytics to identify which products a visitor has previously viewed or purchased. Using this information, you can display related products and offer personalised recommendations. This may increase the visitor's average order value and drive more revenue. To Identify And Resolve Technical Issues Lastly, web analytics help businesses identify issues such as high bounce rates or low engagement rates and take steps to address them. Constant monitoring can help identify trends and support informed decisions to improve website performance. This is particularly useful for landing pages.
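To ground this in something concrete, the sketch below computes a per-landing-page bounce rate from a simplified, made-up pageview log using pandas; real analytics tools calculate this for you, but the underlying arithmetic is roughly this.

```python
# Rough bounce-rate calculation per landing page from a simplified pageview log.
# Data is invented; a "bounce" here means a session with exactly one pageview.
import pandas as pd

pageviews = pd.DataFrame({
    "session_id":   ["s1", "s1", "s2", "s3", "s3", "s4"],
    "landing_page": ["/home", "/home", "/home", "/pricing", "/pricing", "/home"],
    "page":         ["/home", "/features", "/home", "/pricing", "/signup", "/home"],
})

sessions = pageviews.groupby("session_id").agg(
    landing_page=("landing_page", "first"),
    views=("page", "count"),
)
sessions["bounced"] = sessions["views"] == 1

bounce_rate = sessions.groupby("landing_page")["bounced"].mean()
print(bounce_rate)  # e.g. /home bounces when a visitor views only that one page
```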
If you notice a high bounce rate for the homepage and low scroll depth, it’s an indicator that the hero section of your page is not compelling enough or your site is difficult to navigate. How To Acquire Accurate Web Analytics? To acquire accurate web analytics data, businesses need to ensure that their web tags are properly implemented and functioning correctly. Web tags are small pieces of code embedded in a website to track user behaviour and collect data. However, if they are not implemented correctly, they can result in inaccurate data and skewed analytics reports. One way to ensure accurate web analytics data is to conduct a web tag audit. A web tag audit involves a comprehensive review of a website’s web tags to ensure that they are implemented correctly and functioning as intended. The audit can also help identify issues such as duplicate tags, missing tags, and incorrect tag placement. By regularly conducting a web tag audit, you can ensure your web analytics data is accurate and reliable. This, in turn, can help you make data-driven decisions to optimise your website. Web Analytics Mistakes To Avoid Lastly, here are some common mistakes you should avoid regarding web analytics: Not Setting Up Goals: Without clear goals, you cannot accurately measure your ROI or identify areas for improvement. A good starting point is to build a web analytics measurement plan that aligns with your goals. Not Segmenting Data: Segmentation organises data and allows businesses to analyse user behaviour based on different criteria, such as demographics, location, and device type. Without segmentation, you may never be able to target your marketing campaigns effectively. Focusing On Vanity Metrics: Vanity metrics (like page views and social media followers) can mislead and may not provide a clear picture of your website's performance. Hence, you should focus on metrics that directly impact your business’s bottom line, such as conversion rates and revenue. Ignoring Mobile Users: With more and more people using mobile devices to access the internet, it is essential for businesses to optimise their websites for mobile users. Summing it up, gathering and managing accurate web analytics is a demanding task. But it is also the first step towards building a solid online presence that truly benefits your business. So, the sooner you start, the better. Good luck! ### <strong>A cloud migration strategy: a 5-step checklist</strong> A cloud migration strategy requires significant initial investment before it starts bringing you benefits. It is a complicated process, often requiring many difficult decisions. At some point, the need to migrate to the cloud falls upon your tech experts, from data scientists to machine learning engineers, who could be unprepared to handle this task alone. To alleviate the situation, our professional IT consultants and cloud migration services experts have joined forces to help you along the way. The following guide provides five essential steps to keep your cloud migration project under control. 1. Define how much cloud capacity you need The traditional providers of internet infrastructure don’t sell computational power. Instead, they sell resources or, to be more specific, servers that run 24/7, charging you by the hour even when you aren’t using these resources. In most cases, your cloud-based application or system is not fully loaded during the prepaid time, so you’re wasting your budget paying for the server’s idle time.
Now, the numbers will vary depending on your scale of operations. For example, if a local company has no night-time operations like reporting or maintenance, its servers will sit idle for 12 hours a day, meaning that 50% of its infrastructure budget goes down the drain. Scalability is a sensitive matter Simply purchasing a large server based solely on abstract traffic predictions can be too optimistic, while buying a small server can backfire if you end up going viral and your system crashes under immense pressure. Therefore, the right approach to your cloud migration strategy would be to go for a cloud capable of auto-scaling or to use serverless backend systems based on AWS Lambda, Azure Functions, or similar specialised services. Such services enable automatic backend scaling, meaning you can benefit from pay-as-you-go and pay-as-you-use models. However, using the cloud is not always a good idea. For example, when hosting in the cloud, you don’t know on which physical servers your components will be deployed. Network-attached storage could be less efficient than local drives. And in the case of highly loaded systems where response time is critical, local servers can work better. 2. Adjust your architecture If you migrate your system as is, the cloud will always be more expensive than running it locally. To optimise costs, you should adjust your architecture before migration and choose the right services to facilitate this complex process. You should also factor in the cloud provider’s management costs and scalability overheads, since cloud providers offer fully managed services. Considering all these factors, the only correct approach to cloud migration is deciding which system components must be left on premises and which need to be migrated to the cloud. Balancing service utilisation Before you make any big decisions, analyse which services are fully loaded and which are underutilised. Optimising the architecture first will make the cloud cheaper than a dedicated server. To help you with that, here’s a list of the most typical out-of-the-box steps you can take: Move all your static resources to a CDN – this should take some of the load off your application servers. Use in-platform infrastructure components such as load balancers, message queues, and caching services. Replace rarely used routines and services with serverless handlers (from AWS Lambda or Azure Functions). The same replacement can be made for simple backend services with a sporadic load and those that are tolerant of response time. To discover more services that can be replaced, you can consult cloud computing providers such as AWS and Azure, currently the two most popular ones. 3. Calculate costs Answering the notorious pricing question is incredibly difficult. The calculations can easily sway to either side depending on what you think is the best option—on-premises or the cloud. Let’s provide a straightforward example. Imagine you have a system that was initially designed to run on-premises. This is a very busy system with more than 10 TB of data, an update rate of 8,000,000+ events per day, and a throughput of 30,000 events per minute. You’re going to need a cloud migration strategy for this system. In this case, you will be migrating in two steps: first, you will rewrite the infrastructure components to benefit from serverless services; second, you will choose the best out-of-the-box cloud services. By optimising and re-engineering your infrastructure, you will be able to migrate while requiring fewer resources.
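As a hedged illustration of the serverless pattern mentioned above, a rarely used backend routine can be rewritten as a small AWS Lambda handler that only incurs cost when it is actually invoked; the report-generation logic below is a placeholder, and only the handler shape is the point.

```python
# Minimal AWS Lambda handler (Python) - a sketch of moving a sporadically used
# routine (e.g. a nightly report) to a pay-per-invocation model.
import json

def lambda_handler(event, context):
    # 'event' carries the trigger payload (an API Gateway request, a schedule, etc.)
    report_date = event.get("report_date", "today")

    # Placeholder for the actual work previously done on an always-on server.
    summary = {"report_date": report_date, "status": "generated"}

    return {
        "statusCode": 200,
        "body": json.dumps(summary),
    }
```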
And, by using serverless services, you will see a dramatic cut in overall costs. A few perks of migrating this way will be automated scaling, simplified development and support, and resource optimisation. 4. Protect your data For security reasons, specific industries can’t fully embrace cloud migration: for example, banking and finance, the public sector, insurance, and healthcare. However, many highly regulated organisations, including government agencies, that work with different types of sensitive data have found a way to host their systems in the cloud. They are opting for a hybrid model where they only have to store segments of their data in the cloud, thereby safeguarding themselves from attacks by maintaining strict user access limits and utilising robust government firewalls. Even if you have multiple security concerns, moving to the cloud is still a possibility if you and your chosen cloud provider can share the following responsibilities. Your responsibilities: choose your cloud provider based on their security acknowledgments; create an effective SLA that covers provider-client relationships, third parties, and cloud brokers; choose hybrid models where you can keep your mission-critical solutions secured on the premises; and maintain a flexible security model that keeps evolving in response to threats. The cloud provider’s responsibilities: analyse security risks and scan your architecture for loopholes; create a case-specific threat management plan with a clear incident resolution roadmap; and counteract security risks with virtual machines that are separate from the corporate network. 5. Choose open source Many businesses that rely on external software development services and are looking to adopt the cloud fear so-called vendor lock-in, that is, being unable to leave the cloud or change their provider after committing to one. Also among the most significant concerns are the interoperability and portability of applications when switching vendors, and the fear of paying even more to change the infrastructure. To address lock-in concerns, you should discover what exactly is behind the provider’s brand name by researching their documentation or asking an expert. For example, here are some services from the AWS stack that are open source or interoperable: AWS ElastiCache, a caching service using either Redis or Memcached on the backend; Amazon Aurora, a database service that is compatible with MySQL; and Amazon Redshift, a data warehouse service initially built on top of PostgreSQL that is still compatible with most SQL features. You can also choose cloud services that utilise either standard or open protocols, allowing you to switch providers when necessary. In addition, open protocols make room for equipment interoperability without using proprietary gateways or interfaces. Now, if cloud services offer their own proprietary protocols alongside open protocols, the advice is always to opt for the latter. For example, Azure Service Bus provides its own API as well as supporting AMQP, an open messaging protocol. By choosing an open standard, you get to remain flexible. This means you can switch your provider or even deploy your own service instance that supports the same protocol without making any significant changes to client components. Conclusion Cloud migration is a complex, multi-level process that requires a solid amount of planning.
This guide outlined the critical cloud migration steps that can help you minimise your infrastructure costs, avoid vendor lock-in, and address the common fears that come with cloud adoption. ### <strong>Why cost remains king in the on-prem vs cloud debate</strong> Over the last few years, we’ve seen a number of companies that were previously ‘all in’ on cloud moving their data back to on-premises infrastructure. With cloud previously hailed as the silver bullet for driving flexibility and agility, many data leaders have recognised that cloud also has its fair share of drawbacks. Regional regulations and cost are the driving force behind many of these decisions. Companies are rightly wary about the financial implications of getting governance or data sovereignty wrong. Additionally, the cost of cloud itself is another consideration; cloud can become increasingly expensive as companies scale their usage. Companies need to have a better understanding of their data and workloads before making any decisions on cloud.  Ultimately, a data driven approach to planned migrations is highly recommended.  The scaling conundrum Many organisations have recognised that running certain workloads in the cloud is significantly more expensive than initially anticipated. This is prompting them to re-evaluate where some workloads reside. The first question data leaders should ask when they’re looking to shift data to the cloud is why? If the answer is “to save money” then they may be better off remaining on-prem, because the main benefit of cloud is flexibility. That’s not to say organisations can’t reap the storage benefits of cloud in on-prem environments. Companies are starting to understand how approaches such as containerisation and virtualisation can be achieved in their private clouds, delivering qualities such as elasticity, workload isolation and improved storage density on-prem. So, we have moved past the cloud-first ethos and are now in a workload-first era. Decisions around whether a workload is better suited to cloud native deployment in shared public cloud, or  an on-prem environment must be driven by good data. Workload analytics enable companies to observe the performance of a workload before making a call one way or the other. Workloads that are more predictable and consume a relatively stable level of resource are often cheaper to run on-prem. Whereas a customer-facing service that’s more variable may be better in the cloud because of its elasticity. The cost of compliance There is no denying that data compliance and governance is front of mind for many organisations, especially those operating in highly regulated sectors. The governance landscape is becoming more complex by the day. For example, regulations like Schrems II have changed the requirements around citizen data and privacy – introducing tighter controls and steeper financial consequences. Against this backdrop, many organisations are opting to play it safe and move their data back on-prem to gain control over where data resides and ensure it doesn’t leave their jurisdiction. While the issue of sovereignty is less of an issue for companies in the US because the major cloud providers are based there, it is a growing concern for those based in EMEA and APAC. To increase control over data, it’s crucial that companies have cohesive security policies in place across all environments. Ensuring governance is consistently applied ‘always and everywhere’ makes it significantly easier for a company to remain compliant. 
With one set of globally defined policies in place, enterprises can replicate security standards across all cloud and on-prem environments. This saves time, reduces risk and limits the scope for human error.

Optimising your cloud investments

Organisations need the capability to securely move data from cloud to cloud, or from on-prem to any cloud, and vice versa. Until now this has been a challenge. But with the emergence of modern data architectures, organisations can drive more value from their data and optimise their cloud costs at the same time. This is a win-win for organisations looking to drive efficiencies in an ever-changing business climate.

### <strong>Changing People's Mindsets When Measuring ROI</strong>

Any business, regardless of its size or industry, can benefit greatly from the ability to calculate return on investment. Return on investment (ROI) is a key performance indicator that businesses frequently use to determine the profitability of an expenditure. It's especially useful for tracking progress over time and removing uncertainty from future business decisions. ROI is used in business analytics and serves as a benchmark for developing future strategies. This allows organisations to determine which tactics are effective and which operational areas can be improved. There are numerous advantages that help explain why measuring and analysing ROI can benefit a business and provide a competitive advantage. In a recent roundtable hosted by Future Processing, Daria Polonczyk, Head of Analysis & Design at Future Processing, Rowan Jackson, Co-Founder and Chairman at Promising Outcomes, and Eric Moe, Chief Operating Officer of Whitefoord LLP, took part in a discussion that explored the reasons why measuring ROI has become increasingly important and how businesses can shift their team's mindset to focus on this key performance indicator.

Why Does ROI Matter for All Businesses?

One of the most obvious advantages of measuring ROI is that a company can learn where it should be spending its money. If an organisation discovers that one aspect of its operations isn't performing or yielding the desired results, IT executives can consider redistributing funds to a better-performing strategy. This ensures that a company's spending is optimised, and that money is not being allocated to underperforming activities. When a company starts analysing its ROI, management will be able to set realistic goals based on analytics and see where things can be improved over time. Instead of focusing on the short term, brands can start thinking about the long term, setting and defining goals for the coming year. This enables a business to improve its overall investment strategy and increase brand awareness. Tracking the return on investment of employees will help a company better understand whom to hire and whom to let go. It is useful to understand whether certain employees are increasing or decreasing a company's profitability. Similarly, this approach can help determine a department's profitability and highlight opportunities for growth. Finally, it enables a company to modify its strategy based on how its customers behave. Measuring ROI assists brands in determining when to pivot their efforts and the overall impact of their investment strategies. Understanding the value of ROI is essential for business success now and in the future.

How Can Businesses Adopt an ROI Mindset?
While the importance of calculating ROI is obvious in almost every business case, it is less clear where to begin. Before becoming overly fixated on complex models and calculations, it's best to adopt an ROI mindset. While ROI often gets labelled as a financial measurement that doesn't look at the total cost to the organisation, it is important to apply findings to help prioritise tasks, compare expected outcomes to actual results, draw conclusions and learn for the next project. The first step toward understanding ROI is determining what data is being collected. Conducting a data audit across the organisation will identify where information sources are located, how to access them, what questions each source can answer, and which gaps still need to be filled. After the data audit is completed, each initiative that has been completed or is in the works will be visible. This is only half of the equation; the other half is connecting the information gathered with actual results to establish a baseline for future improvements. Communication is one of the most important aspects of an ROI mindset. It is best to involve teams in determining key goals at the start of a project. In addition, organisations need to ensure that objectives are clearly defined so that everyone in the organisation understands them. Many people in an organisation will be unfamiliar with ROI and may not be used to digesting this kind of analysis, so bring them along for the journey by setting up training to get them up to speed on any new terminology, while incorporating ROI into future initiatives. Finally, organisations need to ensure that stakeholders who receive ROI reports are familiar with the inputs used and how results are calculated. Creating an ROI culture increases engagement throughout the decision-making process, making it easier for those involved to see the impact of initiatives. Executives must summarise the team's activities and link those to quantifiable, financial outcomes at the end of a venture. When projects are completed, they must report back to the team on their success and how it compares to previous initiatives.

Learning from the past

Measuring ROI is critical because it informs organisations about the effectiveness of their ventures. It uses hard numbers to quantify the effectiveness of each campaign and provides businesses with information to help them measure the efficacy of ongoing projects and the success of a venture on completion, as well as to justify future undertakings. It encourages team accountability because everyone is responsible for delivering results. Furthermore, organisations can calculate project accuracy by comparing the predicted ROI at the start of an endeavour to the actual ROI at the end. The primary function of ROI is to aid in decision-making, especially since changes to a project early on are much simpler and less expensive. The simplest of all ROI measures are those based on the customers' desired outcomes and how well the organisation performs against them. These outcomes are the customers' expectations of an ideal experience, and they are easily measured. Once measurement is completed, it is easy to assess the ROI. ROI calculations are not meant to be exact methods of measurement but rather means of approximation. More accurate projections are always beneficial, but some error is to be expected with ROI. Understanding the ROI of any project aids in the identification of profitable business practices.
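As a simple worked illustration of the approximation described above, ROI is usually expressed as net gain divided by cost. The short Python sketch below compares a predicted figure with an actual one; the project and all the figures are invented purely for the example.

```python
def roi(total_return: float, investment: float) -> float:
    """Classic ROI: (total return - investment) / investment, as a fraction."""
    return (total_return - investment) / investment


# Hypothetical project: a 120,000 migration expected to save 50,000 a year
# in infrastructure and support costs over three years (illustrative figures).
investment = 120_000
expected_return = 50_000 * 3

predicted = roi(expected_return, investment)   # 0.25, i.e. 25%
actual = roi(135_000, investment)              # what was really achieved: 12.5%
forecast_error = predicted - actual            # 0.125: over-estimated by 12.5 points

print(f"Predicted ROI: {predicted:.0%}, actual ROI: {actual:.1%}")
```

Comparing the predicted and actual figures in this way, as the article suggests, is what turns ROI from a retrospective score into a planning tool for the next initiative.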
Many businesses use ROI to determine which strategies will yield the highest return based on previous successes. When an organisation places an increased focus on ROI, it becomes not only a measure of past success but also an estimate for the future.

### <strong>The passwordless future</strong>

It's nearly two decades since Bill Gates predicted the passing of the traditional username and password, warning that this archaic security combo simply wasn't up to the task of keeping information safe and secure in the long term. Taking an educated guess at what the future might hold, he told a security conference: "There is no doubt that over time, people are going to rely less and less on passwords. People use the same password on different systems, they write them down and they just don't meet the challenge for anything you really want to secure." Fast forward to 2023 and people are still relying on usernames and passwords as the lock and key to personal and sensitive information. What's more, they're still failing to make their passwords secure enough. Research published by NordPass in 2022 shone a light on just how difficult it seems to be to ditch traditional usernames and passwords. For example, it found that "password", "123456" and "guest" were all in the top ten most common passwords in 2022, all of which could be cracked in a split second. According to Microsoft, hackers launch an average of 50 million password attacks every day — or about 580 per second. And according to Verizon, 60% of data breaches are attributed to compromised credentials. What's more, password resets are a top reason workers call IT help desks, adding to IT estate maintenance costs. To paraphrase an infamous quote, the reports of the death of the username and password approach to security appear to have been greatly exaggerated.

Passwordless authentication – what is it?

But, if we are ever to get serious about cyber security and the protection of data, we have to find another way. That's one of the main reasons why passwordless authentication is gaining so much popularity as a more secure and convenient alternative to traditional passwords. And it seems the industry agrees. Tech giants Apple, Google, and Microsoft announced last May that they would support FIDO2 — authentication specifications based on public key cryptography and international standards — to enable passwordless authentication across devices. It's a move that's seen by many as a major step forward in providing better protection against phishing attacks and stolen passwords. And it's easy to see why. For the most part, passwordless authentication works by allowing people to rely on other forms of authentication — such as biometric data like fingerprint or facial recognition — instead of a username and password. But other solutions are available as well. For instance, one-time passwords (OTP) are randomly generated codes sent to a user's registered device, such as their smartphone or email address. Public Key Infrastructure (PKI) authentication uses digital certificates to verify a user's identity. Other technologies, such as security keys, use a physical USB or Bluetooth device as a unique identifier. And most people are familiar with multi-factor authentication (MFA), which is increasingly being used to access everything from financial services to on-demand subscription services, e.g. sending you a text message with a one-time code.
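To show how one of the factors mentioned above works in practice, here is a minimal, self-contained Python sketch of a time-based one-time password (TOTP) generator in the style of RFC 6238, the scheme most authenticator apps use. The secret shown is a throwaway example value, and a production system would of course rely on a vetted library rather than hand-rolled code.

```python
import base64
import hashlib
import hmac
import struct
import time


def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time code from a shared secret (RFC 6238 style)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # current 30-second window
    msg = struct.pack(">Q", counter)                  # counter as 8-byte big-endian
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)


if __name__ == "__main__":
    # Example secret only; in a real deployment this is provisioned per user/device.
    print(totp("JBSWY3DPEHPK3PXP"))
```

Because the server and the device each derive the same code from a shared secret and the clock, nothing reusable crosses the network, which is why an intercepted code is worth very little once its 30-second window closes.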
Challenges of going passwordless

Of course, passwordless authentication is not without its challenges. For one, it requires widespread acceptance and a significant shift in mindset from those who may not be quite ready to ditch old behaviours. While vendors are more likely to drive passwordless adoption, consumers are particularly guilty of clinging onto old habits unless they're forced to make the change. It also requires a fair chunk of investment in infrastructure. Part of the foot-dragging in adopting passwordless security solutions is simply down to the fact that many organisations still rely on older systems and devices that are not compatible with passwordless technology. But other issues exist as well. For instance, an organisation's preferred passwordless solution may not be interoperable with all systems, platforms, or devices across its IT ecosystem. The reality is that it's incredibly hard to remove the plumbing of an old system. Of course, that's not a problem for those firms that have grown up with cloud computing. They tend to be much more adept at adopting passwordless authentication because they are not held back by legacy systems.

Cloud-based systems are the way forward

The same is true for those established organisations that have already embarked on their digital transformation programmes and migrated to the cloud. For them, switching to passwordless authentication is a far easier process than for those that have yet to swap out legacy infrastructure. For the companies that are already underway with overhauling their systems, the cloud plays a pivotal role in passwordless authentication. Cloud platforms provide a secure, reliable, and centralised location for storing and managing key information, such as user credentials. What's more, it's much easier to upgrade and adopt emerging security technology if it's based in the cloud, as there's no need to rip and replace legacy infrastructure. Done well, not only does it ensure that data remains secure and protected from unauthorised access, but it can also create a more positive user experience, especially among those nervous about adopting biometrics or reluctant to embrace change. Despite this, there are still plenty of organisations that have yet to make that investment, which may help to explain why they continue to rely on password manager software. Having said that, even this technology is not without its pitfalls. In 2022, the popular LastPass password manager fell victim to a security breach, potentially jeopardising the security of personal data. If nothing else, incidents such as this are a reminder that everyone needs to evaluate their security exposure on a regular basis and maintain cyber hygiene.

Will the future be passwordless?

For many, there is one outstanding question that still needs to be answered. When will the obituary for passwords finally be written? It's a tricky one to answer. While significant strides have been made with regard to the consumerisation of IT, the unfortunate reality is that there is still a multitude of technologies using old authentication systems that haven't adopted single sign-on or the use of biometrics. In terms of security, convenience, and the overall user experience, usernames and passwords don't come close to passwordless alternatives.
And with the rise of mobile devices and cloud computing, one-time codes sent via email or text message can provide an extra layer of security without requiring users to memorise — or scribble down — yet another password. Only time will tell if passwords will still be around in another 20 years, but passwords are only one part of an ever-larger business attack surface. A multi-layered approach to endpoint protection through a single platform enables fast reaction times to any threat when the inevitable occurs.

### <strong>Cloud-delivered Windows: What Are the Best Options?</strong>

It's a cliché to say, but it remains true: uncertainty is the only constant in today's working world, and it can be difficult for businesses to know how to navigate it. With Q1 in the rear-view mirror, we can begin to make some deductions about what the rest of the year is going to hold, and what organisations should be bracing themselves for in terms of economic and financial circumstances going forward. Simply put, challenges will continue to impact today's professional landscape. Though the UK just about avoided a recession in late 2022, inflation remains high and the economic environment is unlikely to turn especially favourable anytime soon. We're already seeing the negative effects of this at a business level, with many organisations starting layoffs and rolling out pay cuts. The current situation offers a great opportunity for reflection, consolidation, and understanding. Current conditions are pushing leaders to closely interrogate the reasons behind every call they make. As a result, decisions made amid these challenging times should be thoughtful, motivated by genuine necessity, and designed to improve the business over the long term.

Cloud-based virtual desktops: a worthwhile investment

Business leaders and IT decision-makers have to balance the inflationary environment and the need for cost savings with the continued need for flexibility and changes in workplace scenarios for most people. The cost-of-living crisis is pushing some employees to prefer working from home due to high travel expenses, while at the same time, stricter working conditions are increasing presenteeism in some organisations. Other businesses are still unprepared to welcome all employees back on premises full-time, and continue to incentivise working from home. In such a diverse set of circumstances, the wisest thing for businesses to do is to remain adaptable to any and all situations. The flexibility that cloud-based solutions allow is unmatched by slower on-premises alternatives. Alongside this clear advantage, cloud-based virtual desktops are also more scalable due to the system's elasticity. If they are rolled out and managed well, cloud-based solutions can also improve security practices across an organisation. Moving to the cloud can come with a fairly significant upfront cost, but considering the long-term effects and diverse advantages, it is well worth the investment. There are two main groups of options available to businesses. Let's explore them.

Options for Cloud-delivered Windows

Non-native options: AWS, Google Cloud, Citrix, VMware

While on-premises desktop virtualisation solutions were market-leading for many years, it is safe to say that they are predominantly becoming obsolete, especially with the advent of native cloud options.
In recent years, Microsoft has taken huge steps towards advancing its cloud-based desktop and application infrastructure and management solutions, which are largely preferable from the point of view of IT administrators. Technologies such as Citrix were originally designed to be deployed in on-premises data centres, meaning that, even if they are repurposed in the cloud, they will cause similar problems to those they would cause in an on-premises environment. They are also difficult to inter-operate with modern cloud infrastructure. And VDI vendors and other public cloud providers like AWS or Google Cloud can all be quite expensive due to the extra licences that must be purchased on top of Microsoft's – something to bear in mind in the current economic environment. Considering the complicated infrastructure, higher cost, and difficult scalability of these solutions, we can quite confidently say there are better options out there.

Native options: Azure Virtual Desktop and Windows 365

Many businesses turn to Microsoft when it comes to selecting their cloud-based Windows solutions, and it's easy to understand why. As a well-known, hugely popular tech giant in the industry, Microsoft has long represented the gold standard in cloud-based practices. It has preserved and perhaps even improved this standing with its latest Azure-based desktop virtualisation services, Azure Virtual Desktop (AVD) and Windows 365. These solutions offer the significant benefit of optimised integrations with native Microsoft products such as Teams and M365, which can also result in improved performance, agility, and speed. Both Azure Virtual Desktop and W365 can be great options for businesses, and both deliver a familiar Windows experience on any device, but there are important differences between them. Under the hood, AVD and Windows 365 leverage a similar set of cloud technologies. Technically speaking, Windows 365 is built on top of existing AVD components but uses a different transactional model. There are two versions of Windows 365 Cloud PCs. Enterprise Cloud PCs are designed for large-scale organisations that have invested in Microsoft's Intune endpoint management product and are using this platform to manage existing physical desktops. Business Cloud PCs, on the other hand, are designed for individual users and very small businesses, and are managed entirely by the user, similar to a standalone PC. The key difference between these is that Windows 365 Enterprise SKUs require an Azure subscription and all management tasks are performed through the Intune portal, whereas Business Cloud PCs can run without an Azure subscription and an Intune licence is not required. With Azure Virtual Desktop, certain elements are required to be contained within an Azure subscription. Another significant difference is the IT admin experience. AVD relies heavily on Azure management concepts and provides maximum flexibility. Windows 365, on the other hand, aims to simplify management by making it practically identical to managing existing physical desktop assets, leveraging the same set of Microsoft tools to manage physical and virtual PCs. This is reflected in elements such as the management portal: AVD components are managed via the Azure portal and PowerShell or third-party tools like Nerdio Manager, whereas Enterprise Cloud PCs are managed via Intune, and Business Cloud PCs are not integrated with Intune. An element that is essentially the same in both Windows 365 and AVD is the end-user experience.
Users connect to AVD sessions and Cloud PC sessions using the same client application. In fact, Windows 365 is built on top of AVD's global infrastructure, so this will be familiar to those with AVD experience. This means users have the advantage of a unified experience across Windows 365 and AVD. Costs are one of the most important factors that come into play when deciding on the right virtual desktop technology for an organisation. Licence costs vary significantly based on many variables, from the number of end-users to the specific hardware specs that a user needs. Overall, Windows 365 Cloud PCs are the more cost-effective option when users require dedicated, persistently available desktops for over 50 hours per week. However, AVD is the superior option if users can be grouped together into AVD host pools, as in this case auto-scaling enables a significant infrastructure cost saving.

All in all, what are the best options?

As we have seen, there are many aspects that require close consideration when it comes to choosing the right form of cloud-delivered Windows. We recommend that each organisation take the leap, given the immense benefits of cloud computing in the current economic and business climate – but that they do so carefully and while taking into account their specific needs. This way, they can unlock the long-term benefits of cloud and make their investment worthwhile.

### <strong>How businesses can boost security with a strong data privacy culture</strong>

In the corporate world, data privacy stands for the ethical business decision to use collected consumer data in a safe, secure and compliant way. Data processing provides many benefits, but the data must be acquired with people's consent, or else businesses risk damaging their compliance, reputation, brand value, and security. Customers increasingly care about data privacy, as highlighted by a recent global Axway survey revealing that 85 per cent of people are concerned about the security of their online data and 90 per cent want to know the specific data that companies have collected about them. Businesses should treat access to customer data as an earned privilege, but in recent years this access has been taken for granted and abused, and the legal and consumer backlash against data processing has begun.

Inspirations: companies with privacy-centred cultures

There are only positive outcomes for companies that adopt a privacy-centred culture. Once explicit consent is obtained, they are afforded the ability to track, offer opt-ins and exchange data, allowing them to understand their markets and optimise their products, services, and marketing to better serve their customers. Necessary data must be collected in strict compliance with regulations. This means businesses must ask for permission to collect, process and store sensitive data, adhering to legal frameworks such as the General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA). In simple terms, companies with a privacy-centred culture treat consumers' data with the utmost integrity and security – and provide reassurances of ethical data usage readily and transparently. Consumer data analytics has been around for decades, but digital technologies, omnipresent connectivity, social media networks, data science and machine learning have led to an increase in the sophistication of customer profiling, and the possibility (and prevalence) of lax, unethical practices, whether deliberate or inadvertent.
More and more, big tech companies such as Google and Facebook capture millions of user data points every day, from general demographic data like "age" or "gender", to more granular insights such as "income", "past browsing history" or "recently visited geo-locations". When combined, such personally identifiable information (PII) can be used to approximate the user's exact address, frequently purchased goods, political beliefs and medical conditions – with the information sold on to third parties, potentially without consent.

This is when ethical issues arise

The 2018 Cambridge Analytica data scandal, which saw 50 million Facebook profiles harvested to target American voters, is a prime example of consumer data being unethically exploited. Google has also faced a series of regulatory issues over the years surrounding consumer privacy breaches. In 2021, a Google Chrome browser update put 2.6 billion users at risk of "surveillance, manipulation and abuse" by providing third parties with data on device usage. Google was also taken to court the same year for failing to provide full disclosure of the tracking performed in Google Chrome's incognito mode, and still has a five-million-dollar lawsuit pending. Failing to securely protect customer data is a costly endeavour. As of 2022, Google Analytics is considered GDPR non-compliant and was even branded "illegal" by several European countries. Newer products such as Matomo and others like it are GDPR compliant by design, making it easier for privacy professionals and marketers to ensure they are treating customer browsing data with respect.

Security risks of non-compliance

With analytics tools making data so easy to access, collecting and processing it can feel like a 'given', but choosing to do so opens businesses up to a spectrum of risks. Data collection also demands data protection, yet many businesses focus on the former and neglect the latter. Relaxed attitudes towards consumer data protection have consequently caused a major spike in data breaches. For example, Check Point research found that cyberattacks increased 50 per cent year-on-year, with companies globally facing 925 cyberattacks per week – many of these attacks occurring as a result of poor data security. With poor data protection, billions of stolen records can be made public or even sold on the dark web. Stolen emails can be used in social engineering attacks, to distribute malware, and to harvest even more data.

Legal risks of non-compliance

Globally, 71 per cent of countries have legislation covering data collection and customer privacy, yet businesses continuously try to evade existing rules and occasionally break them – with regret. In serious cases, companies that fail to comply with GDPR and/or suffer a data breach could face a fine of up to €20 million, or four per cent of the company's annual turnover. As for the CCPA, civil penalties can be up to $7,500 per wilful violation or $2,500 per inadvertent violation after notice and a 30-day opportunity to cure is provided. Data protection should accordingly be respected to reduce such compliance burdens in the future and protect the business from direct losses.

Reputational risks of non-compliance

As trust is the new currency, data negligence and consumer privacy violations are the fastest ways to lose it. This loss of trust can lead to brand damage, with lasting effects on business results. When businesses fail to collect and use data with integrity, consumers are fast to cut ties.
CMSWire even reported that 71 per cent of people said they would purchase less from a business that had lost their trust. A company can lose market share, brand equity, and competitive positioning the moment information about privacy violations becomes public, let alone the associated security implications. Post-data breach, an AON report estimates that companies can lose around 25 per cent of their initial value, and losses can be significantly higher in other cases. By respecting consumer data, organisations construct additional layers of protection around their business, saving it from direct and indirect losses. A strong data privacy culture is therefore essential to enhance security posture and regulatory compliance. Organisations that embed a culture of respect for personal data across their teams as part of the discipline of good data practice will retain the trust of valuable customers, and set themselves up for future success.

### <strong>5 tips to tackle the cloud vs on-premise dilemma</strong>

Cloud computing, generative AI, big data analytics: these fast-moving technologies are driving rapid innovation in the way organisations handle and understand data. Cloud computing, in particular, has proven to be a big boon for firms: by moving business-critical workloads to the cloud they can enable greater flexibility and scalability. According to Gartner, 51% of IT spending on infrastructure software, business process and system infrastructure will have moved from traditional solutions to the public cloud by 2025, up 10% compared to 2022. Furthermore, almost two-thirds of spending on application software will be directed towards cloud technologies. Compared to on-premises computing, cloud-based technologies are often praised for helping enterprises significantly improve time and cost savings while reducing the burden on IT administration. However, there have been recent concerns surrounding the cloud, such as upfront expense, along with performance, security and regulatory compliance. The debate this has ignited amongst businesses is causing some to re-evaluate their decision to move their data to the cloud. So just what are the key considerations when deciding whether to embrace the cloud or bolster your on-premises solutions?

What can the cloud offer you?

Given the pace of technological innovation, businesses now have a plethora of options for deploying and expanding their operations. But the question of cloud versus on-prem should start with an assessment of business needs and application performance requirements. For businesses with changing or expanding bandwidth requirements, cloud-based services are typically a good fit as they offer the freedom to scale your operation. The concept of cloud scaling is not new, but actually assessing what your business needs from a cloud solution requires more than guesswork. Optimising the capacity of your cloud servers to meet rising and falling demand requires ongoing performance testing, with IT teams logging user requests and memory usage. Another important benefit of cloud scalability is reducing the cost of disaster recovery in the event of a breach or other incident. With a scalable cloud solution in place, there is no need to build a secondary data centre to preserve data. When appraising third-party cloud providers, a key question is what capabilities they have in place to enable seamless scalability.
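To illustrate the performance-testing point above, here is a small, illustrative Python sketch (not from the original article) of the kind of threshold-based decision that sits behind cloud autoscaling: observed utilisation is compared against a target to decide whether capacity should grow or shrink. The metrics, thresholds and figures are invented for the example; real platforms expose equivalent rules through their own autoscaling services.

```python
from dataclasses import dataclass


@dataclass
class Observation:
    """One sampling window of the metrics the article suggests logging."""
    cpu_utilisation: float       # 0.0 - 1.0 averaged across the fleet
    requests_per_second: float


def desired_instances(current: int, obs: Observation,
                      target_cpu: float = 0.60,
                      min_instances: int = 2,
                      max_instances: int = 20) -> int:
    """Size the fleet so average CPU sits near the target utilisation."""
    if obs.cpu_utilisation <= 0:
        return min_instances
    # Proportional rule: if CPU is double the target, double the fleet.
    proposed = round(current * obs.cpu_utilisation / target_cpu)
    return max(min_instances, min(max_instances, proposed))


if __name__ == "__main__":
    # Quiet overnight window vs a morning spike (illustrative figures).
    print(desired_instances(8, Observation(0.22, 120)))   # -> 3: scale in
    print(desired_instances(8, Observation(0.85, 900)))   # -> 11: scale out
```

Run continuously against logged metrics, this kind of arithmetic is what lets a cloud platform (or an elastic private cloud) track demand rather than being permanently sized for the peak.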
Flexible data for a flexible world

Recent research from KPMG indicates that 42% of executives believe the main benefit of cloud computing is that it allows for the maintenance of a flexible workforce - a near-essential requirement in the age of hybrid working, where being able to manage and retain talent from diverse geographies provides a clear advantage. Of course, being able to decentralise both the database location and the staff required to operate it can lead to huge cost efficiencies in terms of office space and the associated maintenance of premises. However, economic uncertainty is leading many companies to adopt a more binary view of business expenditure, with spending gates being introduced to limit their expenses per quarter. As well as limiting new spending, there is also clear scrutiny of tech stack ROI and a need for procurement teams to rationalise tools and services. This outlook can dissuade some companies from properly considering cloud providers.

Save now might mean pay later

A key component of the cloud is the diverse ways in which various providers price their offerings. Some might be cheaper up front but are more likely to incur additional costs over time, while others might seem more expensive to begin with but come with services, such as remote database administration, that can help businesses save money in the long run. When consistent revenue isn't guaranteed and the economic landscape feels volatile, it's natural for enterprises to tighten budgets and expect guaranteed returns before investing in new cloud solutions. Given the higher costs associated with the cloud, some businesses are moving workloads back to on-premises servers to have total control over their infrastructure and more predictability in terms of financial outgoings. Nonetheless, given its scalability, agility, flexibility, and security, many businesses continue to use cloud computing. To ensure the best possible resource allocation in the cloud, business leaders are looking to cloud financial operations techniques to update existing budget processes. Comparing platforms can be challenging due to the vast differences in pricing structures between suppliers, but determining your company's requirements and usage patterns is crucial. Cloud and on-premises are both viable solutions; the decision will depend on how well you can estimate the performance needs of various applications and keep track of workloads across the board. There are certain situations where some applications might not be a good fit for the cloud, and the same goes for on-premises.

The future is in the hybrid cloud

Rather than go all-in on-premises or in the cloud, many IT leaders are opting for a hybrid cloud approach. The hybrid cloud unifies public cloud, private cloud and on-premises infrastructure to build a single, flexible, cost-optimal IT infrastructure. It serves as a means for businesses to cut costs, lower risk, and extend their current capabilities to support efforts related to digital transformation. According to a study by Statista, the worldwide hybrid cloud market will grow from $85 billion in 2021 to an estimated $262 billion in 2027. The Asia Pacific region is anticipated to develop at the fastest rate during this time.
This data suggests that adoption of hybrid cloud will continue to rise, with it acting as a clear middle way: a route to the flexibility of the cloud that preserves the security and control of on-premises infrastructure.

All things considered

The choice of cloud or on-premises differs from business to business. Assessing the performance needs of various applications and keeping an eye on workloads in both settings is crucial to making the right decision. Businesses can achieve cost-efficiency as they grow—without compromising essential capabilities—by implementing financial operations practices which include conducting the necessary research, evaluating the available options, and working with the financial, development, and IT teams to select the best database solution.

### <strong>The value of colocation data centres in IoT</strong>

The evolution of 5G networks, cloud computing and automated technologies is picking up pace today as Internet of Things (IoT) adoption becomes more widespread across UK businesses. Bridging the gap between the real world and the digital space, IoT is well placed to form intelligent data connections, leading to improved customer experiences, streamlined processes, and increased operational efficiencies. In line with this, a recent report by Technavio found that the IoT market in the UK is expected to grow by US$21.81 bn from 2021 to 2026, accelerating at a CAGR of 11.80%. The market is also anticipated to post a year-over-year growth rate of 10.86% in 2022. Many sectors are driving this market growth: from retail to healthcare to industrial manufacturing, utilities and, of course, connected vehicles. The advent of the new 5G wireless standard is acting as a catalyst for IoT. It also offers the ability to handle extremely high volumes of data messages with minimal delay, allowing applications and communications running on 5G to connect and share data in almost real time.

What's driving IoT

IoT offers a raft of benefits to businesses. It supports and enables machine-to-machine connectivity, allowing companies to generate and collect data to drive operational efficiency and make more informed decisions about future business strategy, for example. If big data collection is ultimately to be a success, organisations need to develop their core strategy and their approach to interrogating the data in order to generate the insights and information that they need. They must ensure that the approach will deliver real value for the company itself, whether that is around reducing outages on the production line, cutting costs from poor logistics or reducing health and safety incidents on the shop floor. And they also require buy-in from the very top of the organisation to ensure the approach has committed backing and can be driven through consistently. That said, there is great potential for organisations to start using IoT to drive business benefits. By sending messages to consumer devices, for example, businesses can start communicating more effectively with their customers, access customer feedback and use that to improve both the product and the ongoing service they provide. At the same time, IoT can be used to build connected processes that enable a business to accelerate its supply chain workflow and streamline key aspects of its operations: from ordering to production.

Migrating to the cloud is key

While businesses continue accelerating their digital transformation strategies and jumping on the IoT train, however, a note of caution is required.
For the projected growth we see in market studies today to be realised, legacy IT infrastructures will need to be modernised and systems migrated to the cloud to power the variety of IoT applications and to respond to the explosive growth in connected devices. And with so many connected systems to manage, organisations will also need to ensure that all this is done securely, reducing the number of security holes in the system and eradicating, or at least limiting, risk. Cloud-based computing is, of course, becoming ever more popular generally. A growing number of organisations are looking to move away from the traditional scenario, where servers were located on-premises. They want to outsource the running of systems, as well as ensuring they can achieve 100% uptime on their services. And many organisations conclude that outsourcing into the cloud offers them the best way of doing this. In line with this, many of the new services being developed alongside IoT are today being hosted in the cloud, especially in large public clouds like Amazon Web Services, Google Cloud Platform and Microsoft Azure.

Beyond cloud

As we have seen, cloud computing is often fundamental to the success of IoT strategies, but the most powerful IoT applications also rely on high-quality connectivity. They need the right infrastructural foundations in place to ingest the enormous volumes of real-time data that IoT generates. They also need enough bandwidth to bring that big data in for analysis, utilising it for more efficient decision-making. IoT implementation is a complex process that currently faces several obstacles: systems that need modernising, a lack of scalability and flexibility, and shortages in the skills needed to fully engage the business with IoT technology. Access to highly connected ecosystems, available with the right data centre partner, is vital to get business data from one hub to another and ultimately on to its destination, 24x7 and in a secure manner. Effective IoT, and the resultant big data being delivered, require the shortest possible distance between sensors, data analytics applications, and the end-users of the processed data. Colocation service providers can effectively serve IoT framework needs by delivering an abundance of choice, including major cloud providers and broad peering options, among others. IoT data processing is increasingly being pushed out to the network edge to get as close as possible to the source sensors and the end-users of the resulting data analytics. Today's interconnected data centre colocation providers can provide the best means of filling the gap in IoT's edge computing landscape, while offering a cost-effective approach for managing, storing, and organising big data. Beyond all that, they can deliver the technology solution, of course, but just as importantly they can hold the customer's hand when they are undertaking the transformation, and make sure they get to where they want to be. These providers and the technology solutions they facilitate are therefore key to ensuring the ongoing growth and success of IoT, and more specifically to making sure that businesses can use it to drive benefits for themselves: from enhanced customer engagement to improved operational efficiencies and a sharper competitive edge.

### <strong>Strategy and anticipation are key to securing against cyber threats</strong>

With technological progress come increased security risks.
Sophisticated and co-ordinated cyber groups are working every day to find potential entry points into organisations' networks. And it's getting easier for them to exploit companies' vulnerabilities, as businesses often neglect security when adopting new technology. Following an attack, an organisation's activity may be paused for between 20 and 30 days on average, which, from our experience, may lead to a revenue loss of £274,000 per day for a business with a yearly turnover of £100m. In fact, ransomware attacks are expected to cost businesses $265 billion per year by 2031. In 2021, the estimated cost of cyber-attacks was $20 billion, meaning we will have observed a 13x increase in ten years if that prediction comes true. Still, companies fail to realise the scale of the risk they're exposed to. A survey commissioned by HYCU reveals that only 15% of IT leaders are currently prepared to defend themselves against a ransomware attack. In order to resume activity and limit the financial consequences, many organisations feel their only option is to pay the ransom. In the UK alone, over 80% of companies concede to cyber attackers' demands, as shown by research from Proofpoint. If businesses keep paying ransom demands, conducting cyber-attacks will remain a profitable career route, and cyber gangs will have access to even more resources to continue performing increasingly advanced attacks. As the stakes get higher, companies must move away from their apathetic attitudes towards cyber security and begin thinking strategically if they are to put an end to this vicious cycle.

The well-organised structure of modern cyber-attacks

Ransomware is now a very lucrative industry that is only growing as it recruits new talent. Thousands of individuals around the world have found full-time employment as part of a co-ordinated cyber group. Their mission? To analyse companies' security systems and access points to ensure their attacks can be conducted effectively, swiftly and without detection. These sophisticated and organised cyber groups recruit people with a wide range of skills, including money specialists, data miners and coders. Every person knows exactly what they have to do, and they all contribute to ensuring the attack is perfectly coordinated so that businesses that fall victim cannot anticipate the risk. And as more businesses are threatened, bad actors are even more motivated, whether by financial gain or simply the satisfaction of hacking companies' systems. Some gangs are even state-sponsored, making it impossible to implement legislation at a state level to protect businesses. Without the appropriate defences, businesses stand little chance against these coordinated attacks and are often left with no choice but to pay ransom demands in the hope of saving their own and their customers' data, avoiding further financial loss, and protecting their reputation.

Moving away from cyber apathy and towards defensive strategy

In times when cyber-attacks are so prolific, implementing preventative measures is key for companies to protect themselves as best they can. It is also important to remember that large-scale or well-known brands are not the only businesses that are vulnerable. Companies of any size and across all sectors are potential targets. Raising awareness of this threat within the business is crucial.
Research from Tessian shows that, currently, one in three employees don't express enough concern about cybersecurity at work and a quarter of employees do not understand the importance of reporting that they have been involved in a cyber incident, showing there is no collective responsibility amongst the workforce. Businesses cannot remain cyber apathetic – helping employees understand the high risk of falling victim to a ransomware attack and making them feel involved in protecting the business is essential. Employees must realise it is their responsibility, and not only that of the IT team, to act to prevent data breaches. Adopting a 'military-grade' defence strategy is key to helping companies empower their teams. Such a strategy helps to minimise the impact of these unavoidable attacks.

Thinking beyond traditional security measures

Avoiding the huge financial loss that an increasingly sophisticated range of attacks can incur means businesses must adopt a new approach. The fast-paced cyber landscape requires businesses to think outside the box; standard and traditional security measures are no longer enough to defend against the evolving threat landscape. More specifically, organisations should follow the example of the once-mysterious defence sector. Making sure the business is secure does not require investing a huge amount of money or developing a completely new security system – companies must simply learn to use existing technologies efficiently to strengthen their infrastructure and anticipate and prevent attacks. For example, air-gapping technology allows companies to keep specified devices off main servers so that immutable data is not stored in the main system but in isolated environments. Think of them as backups or 'safe zones'. This technology is very useful in helping organisations recover in the case of an attack, because businesses can quickly regain access to some of their most crucial data. Technological innovations like this must also be supported by in-depth recovery plans. The 30/3/3 model is often recommended to make it clear to everyone which data needs to be recovered in 30 minutes, 3 days, and 3 weeks, should attackers manage to gain entry to a company's systems. As a result, all employees know exactly which data must be prioritised for the organisation to resume business as usual as quickly as possible. Everyone ultimately has the direction they need to make the right decisions when facing a crisis, and can work together to help mitigate potential chaos.

The importance of always being prepared

Keeping pace with the ever-changing cyber landscape is a real challenge. Businesses must make it a priority to regularly adapt their defence strategies, minimise their vulnerabilities and avoid falling victim. In times when the consequences of a breach could be disastrous, inaction is not an option. Companies should not lose faith; none are condemned to losing millions, provided they work hard to implement the right defence strategy.

### <strong>Raising talent attraction and retention with IT investment</strong>

The demands of modern employees are evolving. The shift towards remote working during the pandemic has led to hybrid expectations as the world returns to normality. Flexibility is now a requirement for many potential joiners and longstanding employees navigating a post-Covid world.
Despite these expectations, new research undertaken by Apogee reveals that almost two-thirds of IT directors (63%) lack confidence in their IT estate's ability to fully support the hybrid workforce. Even more concerning is that this shortfall is not being addressed with sufficient investment: over seven in ten (71%) organisations are not placing IT spend at the top of the priority list. This trend is deterring prospective jobseekers from applying for roles and culminating in unhappy employees, which is a far from ideal scenario at a time of talent shortages and with technology skills at a premium. It's time for organisations to invest in IT solutions that meet the requirements of the modern worker.

Considering collaboration capabilities

Due to the current limitations of IT setups in many businesses, almost nine in ten organisations (89%) say that their collaborative capabilities are lacking. Millennials and members of Generation Z have become accustomed to using the latest technologies to communicate and share files seamlessly in their personal lives. They expect the same in the workplace. The top expectation among employees today is the ability to collaborate effectively with colleagues (38%), with the opportunity to work flexibly coming in at a close second (31%). Seamless video calls are a basic prerequisite for prospective joiners in the modern world. With 48% of remote staff failing to gain access to the same solutions as office workers, any platform adopted must also match the expectations of employees working under a flexible model. Inconsistent access to applications, depending on the location of the worker, can affect business performance and raise staff frustrations. Talented potential staff, who may need to work flexibly or remotely due to childcare or caring commitments, will also suffer unfairly. Organisations must not only focus investment on video conferencing to provide parity between office and remote workers, but also consider the acoustics, the quality of hardware used and the software setups. Managed collaboration solutions are scalable to fit a variety of room sizes, feature easy-to-install hardware and require little technical expertise. Employees and potential new starters can make use of technology that matches their expectations, speeds up meetings, allows for rapid file sharing and enhances team efficiency.

Adopting greener practices

Effective hybrid working practices with readily available collaborative solutions can also help to retain and attract staff due to their compatibility with sustainable measures. Employees and prospective joiners are all too aware of their personal carbon footprints and the steps they must take to contribute to a greener planet. Investment in a comprehensive strategy can ensure that fewer cars are on the road for the morning and evening commutes and that the carbon footprint of each member of staff is reduced. But that's just the tip of the iceberg in terms of green practices. Organisations can also invest in IT solutions that make a direct impact on the environment. Platforms can, for example, automatically convert business paper footprints into actual trees, with replanting taking place in the forests that most need it. The battle against e-waste, one of the leading sources of hazardous chemical waste, also needs to be considered. Rather than incorrectly disposing of ageing devices in landfill, businesses can make use of refurbished devices.
To ease any concerns around quality and reliability, these solutions can be provided with a full service history to ensure trust in the technology. This provides staff with the same value that new solutions would provide, but at a reduced cost. Incorporating a mix of these sustainable measures will help businesses to meet environmental targets and contribute to governmental commitments towards net zero.

Supporting security measures

Under-investment in IT solutions can not only threaten talent attraction and retention due to a lack of collaborative and sustainable solutions, but also pose a risk to security. More people are becoming aware of the need to protect their personal data, and any information stored on work systems needs to be secure across disparate devices. As many as a quarter of organisations (25%) are concerned about the security challenges of remote and hybrid working, affecting IT transformation progress. Key prospective talent is also likely to be deterred by an organisation that fails to take steps to audit its IT estate and protect sensitive data. Nearly one in three businesses (29%) only audit their equipment somewhere between once a month and once every 4-6 months, leaving networks and devices vulnerable to evolving cyber-attacks and data breaches. Multi-function devices, which incorporate internet, e-mail and network access, should be fitted with hard drive encryption and data-erasing software, which prevents data from being stolen directly from the hard drive. Pull printing, also referred to as user authentication, can be integrated to ensure that documents are only printed when the user is physically next to the device. This ensures that private information isn't left exposed to other employees. To further meet the needs of the workforce, effective training measures should be incorporated to ensure that all users understand and conform with data protection mandates. This will help encourage responsible behaviour and also reinforce the message to sceptical employees that data protection is taken seriously.

The making of the modern workforce

Organisations are ultimately failing to invest in the technologies that meet the needs of the modern workforce. This is not only preventing current employees from working efficiently and securely but is also deterring premium talent from applying for roles. Failure to address such issues will likely lead to skilled staff leaving the organisation due to poor morale and a growing number of technology vacancies remaining unfilled. To be at the centre of talent attraction and retention, businesses should make use of workplace technology that enables them to integrate collaborative, secure and sustainable measures into their operations.

### <strong>How NIST started the countdown on the long journey to quantum safety</strong>

Quantum computing holds the potential to be one of the most era-defining innovations. So much so that it's almost impossible to predict the exact effects it will have across the world of technology. But there's one thing that most in the tech industry agree on – it will eventually signal the end of asymmetric (public-key) cryptography, which underpins the system of machine identities enabling our online world to exist. Now, the world is racing to discover algorithms resistant to cracking by quantum computers and achieve "quantum safety". And NIST has taken the lead by announcing the first four contenders. Forward-thinking CISOs will want to start preparing now, despite change not being imminent.
And they should assume that the switch between the pre- and post-quantum worlds will be defined by hybrid use of both new and old machine identities.

Cracking the cryptographer's enigma

Today's digital systems use a binary numerical system – zeros and ones – to store and process information. Quantum computers, on the other hand, use qubits – quantum particles that don't behave according to the traditional rules of physics. This allows them to be a zero and a one simultaneously, which theoretically will reduce the overall time required to solve certain mathematical problems and process data. At its core, this presents significant issues for cryptographers. Current public-key encryption systems rely on mathematical problems that classical computers lack the processing power to solve in any practical timeframe. Quantum computers, by contrast, have the potential to solve these problems in the blink of an eye, meaning they could break current encryption standards with ease.

The internet's transfusion won't happen overnight

Why does it matter if our current encryption standards are upended? RSA, first published in 1977, established public key cryptography as the primary mechanism for determining trust and authentication online. This underpinned the digital certificates and cryptographic keys that give machines an identity and laid the foundations for our entire system of encryption. Now, these machine identities are the primary method for securing all our online communications – from sensitive customer data to financial transactions or even national security secrets. They ensure that all machines can communicate securely, including everything from servers and applications to Kubernetes clusters and microservices. They run through our digital world like blood travelling around the circulatory system of the body. So, replacing these standards with quantum-resistant ones will be akin to giving the internet a transfusion. We've all seen the discussions around the so-called "crypto-apocalypse" – when quantum computers come online and crack our current systems of cryptography wide open. In truth, the reality isn't quite as dramatic. There won't be a single catastrophic doomsday event where the world's secrets are brought into the light and the global economy ceases to function. No, we're likely to see a slow and steady journey to quantum safety, driven by the needs of leadership teams and markets. It's now been more than 40 years since the inception of the original RSA crypto-system, and the journey to achieve our current encryption standards has been long and onerous. The move to quantum resistance is likely to take decades too, if not longer.

Establishing a standard

Leading the charge to develop a post-quantum cryptographic standard for organisations is the US government's National Institute of Standards and Technology (NIST). There has been a lot of progress since 2016, when NIST called on the world's leading minds in cryptography to devise new ways to resist attacks from quantum computers. None more so than July's update, when the world of cryptography reached a vital milestone as NIST announced the first group of four quantum-resistant algorithms. And we are set to see four more announced soon. By selecting eight algorithms, NIST recognises that cryptography is deployed in a multitude of use cases, and therefore diversity in encryption is a must. It's also essential to mitigate the risk of potentially vulnerable, early-stage algorithms.
To cover this breadth of use cases, NIST selected the CRYSTALS-Kyber algorithm for "general encryption", due to its relatively small encryption keys and operation speed. And for digital signatures, such as the ones used within TLS machine identities, it selected the CRYSTALS-Dilithium, FALCON and SPHINCS+ algorithms. As the primary algorithm, NIST recommends CRYSTALS-Dilithium, while FALCON is regarded as useful for applications which require smaller signatures. Meanwhile, SPHINCS+ is larger and slower than the others, but is useful as a backup option due to its slightly different mathematical approach. With things accelerating from a standards perspective, organisations now have a clearer path towards planning their own post-quantum journey. Beginning the journey Many will be tempted to turn a blind eye to these early algorithms. They'll no doubt see that this kind of planning will take considerable effort – after all, we're talking about a transformation on the same level as changing the way you ride a bike. Yet, while the current machine identity system is working fine now, this won't always be the case. And sooner or later, CISOs will have to act. While the standards are still at an early stage, it makes sense to start planning laboratory-condition testing now. Start by choosing a single application and understanding the performance impact of the new algorithms, how to deal with larger machine identities, and how to operate dual pre- and post-quantum modes. The latter point is especially key, because for the next few decades, the world is likely to transition to quantum safety via a hybrid approach – much like how we've seen the switch to electric vehicles via hybrid cars. The old will work alongside the new. Having a control plane to automate the management of these machine identities will be crucial to this hybrid mode, enabling visibility over which machine identities are being used in different contexts, and how they perform.  Of course, it will be difficult to truly predict how long this transition period will last. It's likely that many currently within the industry will not see the end of it. But, like climate change, it's not something that we can push down the road for a future generation to deal with.  So, pick an application to test and factor it into next year's budget. Set yourself a five-year plan to have the first quantum-resistant app up and running. While the road may change course, the destination certainly won't. It's time to take the first steps. ### <strong>Overcoming economic uncertainty with cloud flexibility</strong> As rising energy prices and interest rates work their way through the supply side of the economy, consumer demand is beginning to fall as prices go up. Consequently, in January Meta, Amazon, Apple, Google and Microsoft all announced large numbers of redundancies citing economic uncertainty. But there is a sense of inconsistency in the message.   For example, just days after Microsoft announced 10,000 layoffs, it was reported that the company was also investing $10 billion into OpenAI, the firm behind ChatGPT. There is no denying the economic pressures that organisations are trying to solve, but for most companies there could be more practical solutions to taking unnecessary cost out of the business. This is particularly the case for companies that underwent rapid digital transformation during the pandemic.
Flexibility of the cloud is perhaps being underutilised by some of the biggest adopters and advocates of the technology, and through better use, organisations could find efficiencies that relieve the pressure to make cuts. The tech layoffs were a hangover from the pandemic There have been many reasons proposed to explain the wide-scale tech layoffs recently seen, some of them contradictory. Most obviously, the firms misjudged demand during the pandemic and overstaffed as a result. With a limited talent pool, as some companies began aggressively looking for new hires, other companies began to hire defensively, and the market turned into a sort of feeding frenzy. An alternative view is proposed by Financial Times correspondent Rob Armstrong, who suggests that the layoffs could be driven by volatility in the stock market and by activist investors within these big tech firms. But for most firms in the technology industry, or which rely heavily on technology, slashing the workforce to drive costs down is not a rational option. Luckily, there are other ways to drive down costs that are easy enough to implement. What has cloud optimisation got to do with saving money? Anticipating future demand and sizing an organisation accordingly, especially when there are external factors to consider, is an incredibly complex task. However, whilst salaries make up a significant proportion of costs, compute costs can also be vast and tend to increase over time, especially as companies go through a rapid growth period. When growth occurs quickly, services can be thrown together with hastily coded applications running on infrastructure that makes poor use of the resources, ultimately adding additional and unnecessary costs to a company's overheads. Ad hoc services created or updated in this manner may serve little purpose to justify their cost, and, even worse, they are unlikely to scale optimally. Even small changes to the efficiency of those resources can provide some buffer against needing to make substantial layoffs when there is a squeeze on the bottom line. A healthy, efficient cloud environment should cost less to manage, maintain and scale than an inefficient, sub-optimal cloud. This is true both in terms of the cost of the hardware and the time (and therefore people) required to manage it. Organisations often think they are powerless to manipulate the productivity of their cloud platform – but it's absolutely possible to drive those assets harder. The cloud is designed to be flexible, and that's one of the many reasons it has been such a game changer for businesses as a whole. Optimising an existing cloud service is also an obvious way of dealing with uncertain market conditions. For example, when companies plan ahead, they can smooth demand for staff through better automation of processes. In order to deploy cloud optimisation as an approach to reduce costs, organisations need to develop a mindset of cloud elasticity. This means being prepared to flex your platform, and understanding exactly how to do it as business priorities change. There are multiple reasons why organisations struggle with this, chief among them being a lack of in-house skills.  Face your fears to fund your future So why aren't we seeing more organisations seize the cloud as a tool for financial security? In my experience, the root cause is fear and misunderstanding. The biggest savings in cloud optimisation are driven by application design, which many organisations see as a black box.
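As a toy illustration of what that elasticity mindset means in practice, the sketch below sizes a hypothetical service from measured hourly demand rather than leaving peak-time capacity running around the clock. All figures and names are invented for illustration; they are not drawn from any real engagement.

```python
# Toy illustration of the "elasticity mindset": size a service from measured
# demand instead of leaving peak-time capacity running around the clock.
# All numbers and names below are hypothetical.
import math

INSTANCE_HOURLY_COST = 0.40   # assumed on-demand price per instance-hour
REQUESTS_PER_INSTANCE = 500   # assumed sustainable load per instance

def instances_needed(requests_per_hour: int, headroom: float = 0.2) -> int:
    """Return the instance count for this hour's demand plus safety headroom."""
    return max(1, math.ceil(requests_per_hour * (1 + headroom) / REQUESTS_PER_INSTANCE))

# Hypothetical 24-hour demand profile (requests per hour).
hourly_demand = [1200] * 8 + [9000] * 10 + [3000] * 6

static_fleet = instances_needed(max(hourly_demand)) * 24          # sized for peak, all day
elastic_fleet = sum(instances_needed(d) for d in hourly_demand)   # flexed hour by hour

print(f"static:  {static_fleet * INSTANCE_HOURLY_COST:.2f} USD/day")
print(f"elastic: {elastic_fleet * INSTANCE_HOURLY_COST:.2f} USD/day")
```

Even in this invented example, flexing hour by hour roughly halves the daily bill; real savings depend entirely on the workload's demand profile.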
Yet in some of our most successful engagements, the code is exactly where the problems were. A great example was a software company where we discovered that an authentication call was executed as part of every transaction, adding up to millions of unnecessary code executions, each consuming cloud resources.  Embracing the application as a source of optimisation is a necessary step of the journey. Secondly, organisations are fearful of making changes to platforms due to the risk of introducing instability. Of course, making changes to a production platform is a risk and that must be mitigated, but fortunately the cloud is designed to be flexible. We are no longer shackled to our data centres and we can spin up environments almost instantaneously to try out code or configuration tweaks. This is a fantastic technological advancement and we need to take advantage of it. Finally, cloud optimisation is overlooked due to the misconception that as revenue grows, cloud cost has to grow at the same rate. It is just accepted as the status quo. However, true cost optimisation occurs when you influence the shape of the cost trajectory, breaking the link between cost and revenue and achieving economies of scale. These are not complex obstacles to solve, and once solved properly, enterprises unlock an engine that can protect against future cost pressure. Another common misconception is the use of forward commits in isolation. We are seeing these being increasingly used as a mitigation against inflation and rising costs. This is because cost optimisation is often seen as a purely financial tactic - getting the same compute for less money. However, true optimisation, which is up to four times more effective, is about engineering and efficiency. Without technical efficiency, forward commits are just a commitment to be more wasteful. Optimising the cloud is easy, but it does require the right mindset Cloud optimisation doesn't need to be difficult, especially given the recent growth in observability and monitoring tools. It is ultimately a data-driven activity, and most organisations are already collecting this data. In our experience, true, consistent optimisation is a cultural issue. Organisations have to have a mindset of creativity, appreciating that making small changes and measuring the results delivers foundational savings.   Companies typically fall on a spectrum in terms of how they manage their optimisation. Companies at the higher end of the spectrum put in place a culture of efficiency, which means that all technology changes are assessed against efficiency metrics, and this is used to flex the platform in relation to the business demand drivers. Particularly for companies that jumped into the cloud headfirst, taking the time to optimise existing processes is a remarkable way to reduce infrastructure costs and free up OPEX for delivering business value. In this way, cloud optimisation holds the keys not only to unlocking capital, but also to using the workforce in ways that directly drive business goals.  ### <strong>"The need for speed" - Finding a way to unlock agility for today's businesses </strong> Building business agility is high on most board-level agendas today. That's largely because technology has developed rapidly in recent years and enterprises can increasingly digitalise their operations and evolve them to deliver faster, and more personalised, services to customers.
In line with this, C-level directors and decision-makers are placing a growing emphasis on making use of this new technology to drive more rapid time to value as they seek to differentiate their business offering from that of the competition. As such, complex multi-year implementations of the latest enterprise resource planning (ERP) suites are no longer a viable or sustainable option for enterprises. By failing to deliver business value quickly, they leave those businesses unfortunate enough to invest in them trailing behind competitors. Unfortunately, protracted ERP deployments are still prevalent. In fact, many get so bogged down that they consistently fall short of all their targets. Underlining the point, Gartner estimates that 55% to 75% of all ERP implementation projects fail to meet their goals and objectives. Given that many are complicated undertakings lasting many months and even years, these kinds of project failures may significantly impact the profitability, and long-term viability, of businesses. That's unsustainable. Organisations have a clear need for greater clarity of execution and faster time to insight. Delivering value across the broader business In a recent BearingPoint survey, 82% of businesses polled said fast reactions to new conditions or changing customer requirements were an advantage that agile organisations have in a time of crisis, while 55% referenced fast approval and decision-making processes through already-reduced bureaucracy and waste. Businesses above all need to be agile because they need to deliver more, and faster, as competition increases and customer expectations grow. As a result, they are no longer looking for technologies like ERP, enterprise asset management (EAM), or field service management (FSM) operating in isolation. Such technologies typically solve functional problems specific to departments, like HR or sales, but serve to reinforce silos and rarely deliver value quickly to the broader business. Organisations are now moving away from products designed to address a specific function or issue. Instead, they are searching for a precise solution to their problems that can give them the capability they require at the point of need, rather than months down the line. What works in such an environment is not a traditional technology software implementation. It is a composable approach that allows the customer to rapidly configure the solution in a way that meets its precise needs across those traditional software categories. Looking for a single platform approach Enterprises are increasingly coming to the realisation that a single platform delivering class-leading capability across all these areas, from CRM to EAM, represents the way forward because it enables them to bring in the functionality they need when they need it. That, in turn, will help them optimise the use of their assets. Regardless of what industry they are in, assets will be a key component.  For example, a washing machine manufacturer will need to orchestrate its operational equipment assets in the most efficient way possible to streamline the production of finished products. A property maintenance business with field service engineers must harness its people assets. That entails assembling the right crew with the right set of skills for each individual job and then ensuring it gets the most out of them.
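To illustrate the "right crew, right skills" point in the simplest possible terms, here is a small, entirely hypothetical sketch of matching field engineers to jobs by the skills each job requires; the names, skills and jobs are invented.

```python
# Hypothetical sketch of "right crew with the right set of skills":
# pick field engineers whose skills cover everything a job requires.
from dataclasses import dataclass

@dataclass
class Engineer:
    name: str
    skills: set

jobs = {
    "boiler repair": {"gas-safe", "plumbing"},
    "smart meter install": {"electrical", "metering"},
}

engineers = [
    Engineer("Asha", {"gas-safe", "plumbing", "electrical"}),
    Engineer("Ben", {"electrical", "metering"}),
    Engineer("Cleo", {"plumbing"}),
]

def qualified(job_skills: set) -> list:
    """Engineers whose skill set covers every skill the job needs."""
    return [e.name for e in engineers if job_skills <= e.skills]

for job, required in jobs.items():
    print(job, "->", qualified(required))
```

A real field service platform would layer availability, location and cost on top of this, but the underlying question is the same: does this person's skill set cover what the job needs?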
The final piece of the jigsaw is for businesses to ensure they are harnessing all this capability to deliver what the customer wants, when they want it, at the moment of service. That way they can ensure that the customer buys from them again. To deliver this, technology vendors need to be completely open in the way they develop and use technology. Platforms must be natively open and application programming interface (API)-enabled at the core to empower businesses to be successful and allow them to make use of the latest tools to help achieve their strategic goals. How enterprises can change it up and succeed Making a shift of approach like that outlined above requires board-level endorsement. Senior decision-makers need to take the reins here and initiate change throughout the organisation, as well as carefully manage the switch in approach needed from employees. To fully support agility, the solutions chosen will need to enshrine all the latest innovations in areas like artificial intelligence, machine learning or prescriptive analytics. That way, the provider can continuously update them and make them available to the customer to drive agility, rather than having to integrate IoT or virtual projects that the customer has carried out itself, for example. This capability enables the business to focus on driving operational efficiencies and delivering the best possible moment of service for its customers instead of getting bogged down by long, protracted IT implementations. As a result, the board will be better placed to register a rapid return on its original investment and, more broadly, see the benefits of its approach in the bottom line. New model gathers pace Senior decision-makers and business leaders within many enterprises are increasingly moving away from point products and complex system installations that address specific challenges in isolation. Today, they want broader capability that can be tapped into quickly and efficiently. This will enable them to attain fast time to insight to achieve the outcomes they are looking for, orchestrate their assets, both equipment and people-based, and harness them to efficiently deliver the moment of service to customers. This kind of functionality will ultimately be key in allowing organisations to achieve the agility, optimum levels of customer service and competitive edge they crave. ### <strong>Preventing data sovereignty from confusing your data strategy</strong> Data is immensely valuable to all organisations, a significant resource for the digital economy and the 'cornerstone of our EU industrial competitiveness,' says the European Commission. But its value is determined by how it can be protected and used by those who own it.  The challenges of managing and storing sensitive and critical data are growing. The volume of highly sensitive data now hosted in the cloud is on an upward trajectory. Sixty-four percent of EMEA organisations have increased their volume of sensitive data, and 63% have already stored confidential and secret data in the public cloud, according to IDC.  Managing this exposure of highly sensitive data, which could be financial, personal, national or critical information, is driving the need for data sovereignty – where this intelligence is bound by the privacy laws and governance structures within a nation, industry sector or organisation.  This exposure of such data in the public cloud should be influencing every organisation's future cloud strategy and the imperative of sovereign clouds.
Yet challenges exist. To date, there is no standard definition, nor European certification, to assess a   cloud as ‘sovereign cloud’, and not even common terminology ‘sovereign cloud vs. trusted cloud’. But what is crystal clear is the set of key requirements associated with confidential and sensitive data, such as data and metadata control, residency and exposure to external jurisdiction. Agreements on data sovereignty must come first, so organisations understand how to keep control of their data and to choose the appropriate platform to host their data and innovate in a secured way.  However, this is not always understood outside of technical teams. If your organisation’s management has its head in the clouds when it comes to data sovereignty, here are five key messages to help you demonstrate the value of a secure data strategy, and explain why there is no data sovereignty without cloud sovereignty:   Data classification determines choice of cloud   The days of customer information sitting in a single, monolithic database are well and truly dead. It is now essential for organisations to manage their data and applications in a multi-cloud environment, with the application, workload and data type determining the cloud used. Our research shows that nearly half (47%) of organisations understand that using multiple clouds will help them address security and privacy concerns, while better enabling them to monetise their data. And ultimately, organisations that fail to embrace this will inevitably get left behind.  So, while it is now common for organisations to use several clouds to secure and manage their data and applications, with the drive for sovereignty, we’re seeing a review of usage to allow a mix of clouds with different levels of control and certification. This boils down to the type of data, for example its volume, sensitivity, criticality and exploitability; the data owner’s priorities in regard to it, such as its privacy or economic advantage; and regulations.  Data sovereignty therefore needs to start with the classification of data, to ensure specific assurances and capabilities on data residency, data protection, interoperability and portability. Organisations can then choose the best clouds for the job, from sovereign private clouds to sovereign public clouds to trusted public clouds - ensuring they comply with sovereignty and jurisdictional rules.   Until now much of this has been conducted with the confidence that cloud providers are upholding their promises of data sovereignty. Unfortunately, recent closer scrutiny by regulators suggests that not all providers are equal, with some being very publicly examined to ensure they’re not missing the mark. One provider is under investigation in Germany to ensure it’s meeting GDPR compliance, while another has just launched a new digital sovereignty pledge, leaving some customers questioning their track record up until now.  It’s therefore also essential that decision makers aren’t tripping themselves up by automatically assuming all global hyperscale cloud providers will support data sovereignty because the portfolio, data and applications will be limited to only what can be run in a region. The physical location of data isn’t enough to give the sovereignty stamp of approval. Almost all require jurisdictional control, which cannot be assumed to be met with a data resident cloud, particularly for U.S. or global cloud providers subject to the CLOUD Act and FISA ruling. 
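As a concrete, if simplified, illustration of classification-led placement, the sketch below maps data classifications to the cloud tiers that may host them. The classification labels, tier names and rules are illustrative assumptions, not a formal standard or any provider's taxonomy.

```python
# Illustrative sketch of classification-led cloud placement: the data's
# classification, not the workload, decides which cloud tier may host it.
# The classifications, tiers and rules below are examples, not a standard.
PLACEMENT_RULES = {
    "public":       ["trusted public cloud", "sovereign public cloud", "sovereign private cloud"],
    "confidential": ["sovereign public cloud", "sovereign private cloud"],
    "secret":       ["sovereign private cloud"],
}

def allowed_clouds(classification: str) -> list:
    """Return the cloud tiers permitted for a given data classification."""
    try:
        return PLACEMENT_RULES[classification]
    except KeyError:
        # Unclassified data is treated as the most sensitive until reviewed.
        return PLACEMENT_RULES["secret"]

for label in ("public", "confidential", "secret", "unlabelled"):
    print(f"{label:>12}: {allowed_clouds(label)}")
```

The design choice worth noting is the default: anything that has not yet been classified falls into the most restrictive tier, so gaps in classification never become gaps in sovereignty.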
The flow and management of the data is also crucially important, as are the consumer rights within the country you're collecting that data from. It can be an incredibly confusing web to unweave and make sense of.  Secure data drives success  Data is undoubtedly the driver of success, and decision makers know this. A shining example is McDonald's, where the company successfully used visitor data to assess the effectiveness of its iconic Piccadilly Circus billboards, and redirected marketing spend towards smaller, personalised adverts instead. This increased footfall to desired locations and, ultimately, drove up sales.   Research we conducted earlier this year shows that by 2024, 95% of organisations across EMEA will be looking to their data as a key revenue driver, with 46% recognising it as a significant source of revenue – up from 29% today. And with the data monetisation market already at $2.9 billion, and another $4+ billion to be captured by 2027, it's no wonder that more business leaders want to tap in. According to the European Commission, the data economy in Europe is expected to grow from 2.6% to 4.2% of GDP by 2025.  At the same time, companies are highly aware that their data strategies must be handled with care to ensure customer privacy. Consumer concerns are increasing and getting louder in this growing discussion. There are plenty of fresh rules and regulations on the way, like DORA, which will help harmonise hard-to-reconcile regulations and reporting standards in banking across EMEA. Even with simplification like this on the horizon, meeting these regulations can be a complex journey for companies that operate across international borders.  Local laws don't need to be a minefield  Whilst the value of data is clear to see, there are often understandable reservations about regulations. Data sovereignty laws differ from one country to the next, with over 100 countries having their own standards on how data should be treated and stored within their sovereign borders. Nor do they stand still; they change constantly. Organisations that fall foul of these can end up paying fines of hundreds of millions of dollars and be seen as unreliable and untrustworthy in the eyes of the consumer. Meta, for instance, is currently facing a €390 million fine from the Irish Data Protection Commission after privacy breaches at Facebook and Instagram.   Most people (87%) are willing to walk away from a company if it suffers a data breach. Their trust is just as valuable as hard currency. So, how can organisations perform this delicate dance in a way that allows them to mine customer data without betraying their customers' trust? The answer lies in the ability to share, monetise and protect data that resides across multiple clouds.   Forge relationships with a network  Those looking to run without being tripped up should form relationships with one of the newly formed global networks of sovereign cloud providers who have specifically joined forces to ensure that data is protected, compliant and resident within a national territory. Working with an entity that has both national and local partners guarantees an organisation will be meeting niche requirements across the board. It also gives decision makers the ability to choose the right cloud for a specific data classification, with better governance around data mobility. By definition, these specialised clouds are operated by a sovereign entity, so they're exempt from foreign jurisdictional control.
With a sovereign cloud, data is managed by national citizens with the relevant national security clearances.  As more organisations focus on monetising their data to capture revenue, sovereign clouds are becoming an integral part of a “cloud-smart” strategy, enabling organisations to run their business operations across multiple clouds to better serve their end customers and to gain strategic advantage. If management doesn’t have a clue about data sovereignty, make it your new year’s resolution to ensure these five key messages are understood by all. In a world where trust is everything, both between B2C and B2B, don’t let your data strategy get tripped up by misplaced assumptions about data sovereignty.  Data sovereignty drives innovation   Ultimately, the reason why sovereignty is so important, is that it enables organisations to be innovative with their data and deliver new digital services. Historically, there has been a distinct lack of trust in the cloud, leading to a lack of innovation. Some of the biggest and most important creators of data, such as finance and healthcare, continue to avoid use of public cloud because of privacy fears. This significantly handicaps their ability to innovate, and means they are losing out on other benefits of cloud technology, such as cost-reduction, agility and scalability. It is therefore paramount that moving forward we avoid the mistakes of the past and ensure sovereign data from the start. Today, sovereign cloud is more and more perceived as a key enabler for a ‘data-driven’ innovation.  ### <strong>Edge and cloud joining forces</strong> Edge computing and its big brothers in the hyperscale cloud are often painted as locked in some sort of dog-fight across the datasphere. The reality is each serves a different purpose. Cloud computing provides efficient, centralised storage at scale, processing data that is not so time-sensitive. It is constantly expanding. A report from Marketsand Markets, estimates the global cloud computing market size should grow from US$ 545.8 billion in 2022 to US$ 1,240.9 billion by 2027, which is 17.9% compound annual growth. For a decade-and-a-half now, centralised cloud computing has been considered the standard IT delivery platform and little is set to change on that front. However, with new applications, workload services require an architecture built to support a distributed infrastructure – which is where edge computing is taking off. Emerging problems with latency (time lag) and limited bandwidth when moving data have exposed the cloud’s shortcomings in certain situations. Edge computing is how the infrastructure market responds, providing capabilities to process time-sensitive data without latency. It is also preferred over cloud computing in remote locations, where there is limited or no connectivity to a centralised location. The factors behind edge’s growth are only going to become more powerful. With the increase of remote working, edge data centres provide the important, reliable ‘last mile’ of connectivity, bringing critical data ‘nearer’ to users, increasing reliability of access, security and worker productivity. It addresses regulatory compliance requirements for information to be managed and processed in a specific area, and enables autonomy, offering the necessary separation without loss of performance where technology needs to function in isolation from a dedicated network. 
Its high bandwidth is ideal for systems generating vast quantities of data, which cannot feasibly be sent to be processed remotely. Covid led to a surge in edge computing and has been followed by increases in video streaming and online gaming. In the wings are a myriad of industrial internet of things applications that industries are moving towards. All of this demands compute power at the edge, especially in remote locations away from central cloud hubs. One global edge computing market report, covering trend analysis by component, application and industry vertical, predicts the global edge computing market will reach $155.90 billion by 2030 – a compound annual growth rate of 39%. Cyber-threats and outages Being complementary rather than bitterly competitive, edge and cloud are set to prosper as data volumes continue to explode. Both, however, face a common enemy in cyber-crime. Cloud computing is centralised, which makes it more susceptible to threats such as distributed denial of service (DDoS) attacks and outages. Breaches of multiple kinds proliferate, however. An Ermetic-commissioned IDC state of cloud security survey, conducted in the first half of 2021, revealed almost all companies (98%) surveyed had suffered at least one cloud data breach in the previous 18 months – a significant increase from 79% in the previous survey. Edge's vulnerability arises from the pressure on distributed networks imposed by the burgeoning consumer demand for faster, more efficient services. This ramps up the likelihood of outages. Edge locations often have less redundancy built in, and no on-site engineers, which can make them less resilient than traditional data centres. This new world of cloud and edge will require organisations to adjust their network management approaches to continue delivering the always-on uptime that customers expect. New approaches to maintaining uptime Providers now need proactive monitoring and alerting to keep their cloud infrastructures and edge data centres up and running. They need to ensure they can remediate networks without having to send an engineer on site. Their options include Smart Out of Band (OOB) management tools, which diagnose the problem and remediate it, even when the main network remains congested after a disruption, or is down entirely.  A technology such as Failover to Cellular™ (F2C) provides continued internet connectivity for remote LANs and equipment over high-speed 4G Long Term Evolution (LTE) when the primary link is unavailable. Easily integrating with existing IT systems and network infrastructure, F2C restores WAN connectivity without boots on the ground or human intervention. Organisations also combine automation with network operations (NetOps) for zero-touch provisioning of their Smart OOB devices. The benefit is getting the Smart OOB network provisioned and up and running without the risk of manual mistakes. Often, they will want to 'zero-touch provision' their own devices. This technology is also employed to orchestrate maintenance tasks and automatically deliver remediation in the event of an equipment failure or other technical problem. That effectively means that after shipping new or replacement equipment to site, an organisation uses Smart OOB to quickly bring the site up via a secure cellular connection. This allows for the remote provisioning and configuration of the equipment where it is, without having to send a skilled network engineer.
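The sketch below is a conceptual illustration of that failover idea, not any vendor's product or API: it probes the primary WAN path and, when the probe stops succeeding, switches to a cellular backup path and later fails back. The probe address and the switch-over steps are placeholders.

```python
# Conceptual sketch (not a vendor API) of the failover idea described above:
# keep probing the primary WAN path and fall back to a cellular link when it
# stops responding. The probe target and switch-over steps are hypothetical.
import socket
import time

PROBE_HOST, PROBE_PORT = "203.0.113.10", 443   # example address (TEST-NET-3 range)

def primary_link_up(timeout: float = 3.0) -> bool:
    """Return True if a TCP connection over the primary link succeeds."""
    try:
        with socket.create_connection((PROBE_HOST, PROBE_PORT), timeout=timeout):
            return True
    except OSError:
        return False

def switch_to_cellular() -> None:
    # Placeholder: in a real deployment this would reconfigure routing so
    # traffic egresses via the 4G/5G interface, then notify operations staff.
    print("primary link down - routing traffic via cellular backup")

def monitor(interval: int = 30) -> None:
    on_backup = False
    while True:
        if primary_link_up():
            if on_backup:
                print("primary link restored - failing back")
                on_backup = False
        elif not on_backup:
            switch_to_cellular()
            on_backup = True
        time.sleep(interval)

if __name__ == "__main__":
    monitor()
```

The point of the sketch is the shape of the logic, detect, switch, fail back, with no engineer on site; a production system would add hysteresis, alerting and an out-of-band management path rather than a simple print statement.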
Companies using this approach achieve huge cost savings when implementing new edge deployments, especially those trying to do so rapidly in multiple territories. Then following deployment, if a problem develops that results in a loss of connectivity to the production network and one that cannot be resolved immediately, business continuity is upheld with organisations continuing to pass any mission-critical network traffic across the secure OOB LTE cellular connection. Positive outcomes across edge and cloud As cloud services expand and edge computing applications grow in sophistication and ease-of-implementation, organisations will have to adjust their network management processes to continue delivering the always-on uptime that customers have every right to expect. Achieving this will require hybrid solutions that fully exploit internet and cloud-based connectivity, as well as physical infrastructure.  By using the combined approach of Smart Out of Band with the latest NetOps automation, service providers will give themselves full confidence they have the always-on network access they need. In reality this is the surest way to deliver a level of network resilience that transforms delivery of cloud and edge capabilities. ### <strong>Cybersecurity stress: how to safeguard your organisation and avoid IT burnout </strong> As more facets of our lives move online, the number and impact of data breaches continue to soar. Couple this with unprecedented levels of inflation, and it’s no surprise that IBM’s latest Data Breach report found the cost of an average hack to businesses has hit a record $4.35 million. With so much pressure on IT teams to protect their organisations, employees are reaching their breaking point. One-third now say they’re considering leaving their role over the next couple of years due to stress and burnout, while over half reported that their work stress and mental health have worsened year on year. Sadly, it’s a vicious circle that further weakens security teams and makes them even more vulnerable to future attacks.  To future-proof their cybersecurity, organisations may need to rethink their priorities. Let’s explore how business leaders can protect their companies from cyber-attacks, and support employees to safely shoulder the increasing workload.  Move resources—and responsibilities—to the cloud Though many modern organisations have long harnessed cloud computing services, studies show that COVID-19 lockdowns accelerated cloud migration by as much as four years. Today, 94% of companies use some form of internet-powered cloud resource to optimise their operations.  After all, the cloud brings innumerable high-profile benefits to organisations, such as greater collaboration, remote data recovery, and increased scalability. But one of its lesser-publicised advantages is security—and the option for businesses to offload some security responsibilities and updates to the vendor, rather than have the entire burden fall on their hard-pressed internal IT teams.  This ’Shared Responsibility Model’ is a framework that many cloud service providers (CSPs) use to outline their own responsibilities for securing the cloud environment, and those of their customers. Simply, the model details that the CSP must monitor and tackle security threats affecting the cloud and its infrastructure. Meanwhile, the customer must take on the protection of the data and assets they store within.  This framework still places a large portion of responsibility on the organisation. 
But it also offers far greater efficiency and protection than a traditional on-premises model. The shift to the cloud frees up security staff to focus on other tasks, while reducing the pressure of their workloads. Meanwhile, organisations also enjoy state-of-the-art data protection through the expertise and hyper-vigilant measures that CSPs use to safeguard their customers' assets. It's a win-win for all—as long as customers take extra care when selecting their providers.   Choose software with identifiable contents As product shortages and rising prices have brought physical supply chains back under the spotlight, their software-based cousins are also becoming increasingly fragile. Much like a real-life supply chain, a software supply chain consists of all the components, tools, and processes used to create software. Many modern software applications are no longer built purely from custom code. Instead, they're created using numerous types of open-source components and libraries from third parties.  This trend of code reuse and cloud-native approaches enables vendors to rapidly craft and deploy software, but it also exposes their customers to vulnerabilities outside of their control. Usually, an attack occurs when a threat actor infiltrates and compromises a vendor's software before it's deployed to end-users. Last year, software supply chain attacks grew by more than 300% compared to 2020, becoming so widespread that Gartner listed them as its second-largest security trend of 2022.  For organisations to improve their security, they must seek visibility of all the components that go into the software they use. This can be achieved by looking out for, or creating, a Software Bill of Materials (SBOM), which lists all third-party components and dependencies within the software. Then, organisations can either monitor and investigate suspicious activities themselves—or avoid software that uses open-source modules entirely.  Log access to sensitive data Alongside leveraging new software, it's also vital that businesses carry out a continuous risk assessment. In other words, consistently log and review employees' access to sensitive data, and set up alerts for abnormal events like late-night database downloads or logins from an unusual location. Even if hackers manage to access data or store malware, a faster response helps to reduce the impact of the breach and simplifies the search for its source.  Typically, a log will record as much information as possible, from the date and time and source IP address to the HTTP status code and the number of bytes received. A specialist Audit Trail product will also enable managers to create custom trackers and reports that keep tabs on who accesses what data, whether for security purposes, compliance, or even the management of workloads.  Keep your security simple Ultimately, data security is an inescapable necessity of modern businesses. But it shouldn't drastically impinge on your day-to-day operations. Using well-meaning but complex authentication methods, which restrict your employees' abilities to get on with their jobs, can instead increase the risk of breaches. A 2021 study found that tech-savvy younger workers are likely to swap security for speed, with 51% of those aged 16-24 and over a third of all workers admitting they'd sought out security workarounds to simplify their workdays.  It sounds counterintuitive, but to ensure optimal levels of security, managers must seek out the simplest methods of authentication.
This begins with basics like multi-factor authentication, but it can also involve leveraging day-to-day software, such as cloud storage systems that automatically produce data logs and use strong encryption, such as AES-256 for stored data and HTTPS over TLS 1.2 or higher for data in transit.  Fortunately, there's no longer a need to trade stress for safety. By harnessing easy-to-use programmes with watertight data protection readily built in, you can establish organisation-wide security without heavily impacting the cost to your business—or the mental health of your IT professionals.  ### <strong>Improving industrial cyber defences in the cloud</strong> As we continually seek increased connectivity to support the digitalisation of infrastructure services, legacy infrastructure and an escalating series of interlinked cyber threats, each with different motives and threat vectors, are placing significant demands on large-scale industrial and business networks. At the same time, attack surfaces are increasingly vast, with most organisations now making operational technology (OT) systems accessible remotely, over the internet, and in the cloud, further exacerbating the potential threat. Understanding the risks  We need only look at recent news headlines to recognise the threats security teams are confronting. Rising risk from human-operated ransomware and nation-state attacks, ongoing digitalisation and cloud migration, coupled with growing integration between information technology (IT) and OT, is forcing organisations to be hyper-vigilant to an ever-widening range of cyber-risks.  According to new Bridewell research, cyber attacks against critical national infrastructure (CNI) have increased significantly in the UK since the start of the Ukraine war, with over seven in ten cyber security decision-makers reporting a rise in cyber attacks since the start of the conflict.  Ransomware is another ever-present risk factor. Data breaches resulting from ransomware attacks jumped by 13% from last year — equal to the last five years' increase combined, according to the Verizon Business 2022 Data Breach Investigations Report. No organisation is immune, yet many still do not have critical measures in place to help prevent, detect and deal with ransomware, according to the Bridewell research. And supply chain risks are also escalating, with attacks providing a foothold and allowing criminals to compromise large sections of an organisation.  The cost of connectivity  Levels of concern are likely being exacerbated by the increasingly complex IT and networking infrastructures which act as the backbone of these businesses.  Traditionally, CNI organisations have managed industrial control systems (ICS) and critical applications on their own closed private networks, with limited connectivity back to central management systems and data custodians. However, this basic air-gapped setup meant that cyber security was never a clear focus, as physical security was the greater risk. To support digital transformation and the drive to use automation to boost efficiency, many SCADA-based systems are now being connected with wider IT infrastructure and the internet. This means that cyber security issues that affect IT now impact operations, making IT a common attack vector into the OT environment. At the same time, with the drive to embrace data and automation in ICS, many are turning to the cloud as an enabler, but without sufficient skills to manage cyber security in a cloud environment.
Indeed, the Bridewell research reveals that many are struggling to manage cloud security, with four in ten admitting to not having the skills to monitor threats in the cloud. But cloud risks are growing and organisations cannot afford any gaps. We are continuing to see cloud-based systems, services and data being targeted by ransomcloud - attacks that target or take advantage of weaknesses or legitimate functionality in cloud resources to deploy malware, encrypt data and extort money from businesses. No organisation is immune, but those that lack maturity in architecting secure cloud services are particularly vulnerable. Facing this complex web of risks, it is incumbent on industrial organisations today to review their own security posture and put in place measures to boost cyber maturity and resilience. Building cyber resilience The government has already taken some important steps to improve the cyber resilience of the UK's critical national infrastructure. However, there is still further opportunity for operators to strengthen defences. Regulations will only ever go so far in tackling the issue; organisations must also develop a holistic view of cyber security that ensures visibility into site-level OT traffic and vulnerabilities, protection and understanding of cloud and SaaS assets, and comprehensive analysis of user and identity behaviour. Traditional preventative methods lack the sophistication and agility needed in today's complex cyber landscape. Instead, organisations must adopt a more proactive, intelligence-led approach that spans IT, OT, cloud and end-user devices. This means shifting from pure security monitoring and notification to managed detection and response (MDR) that leverages multiple detection and response technologies, including those sensitive to OT environments.  By combining human analysis, artificial intelligence (AI) and automation, MDR enables organisations to rapidly detect, analyse, investigate, and proactively respond to the earliest signs of a potential breach. Importantly, an MDR solution also allows organisations to develop a reference security architecture that facilitates the safeguarding of on-premise and legacy systems, SaaS solutions and cloud-based infrastructure and applications. It also helps security teams to protect against and respond effectively to emerging security and user identity threats while reducing the dwell time of any breaches.  The best forms of MDR utilise Extended Detection and Response (XDR) technologies which allow detection and response across endpoint, network, web and email, cloud and – importantly – identity, alongside a service wrap that goes above and beyond the capabilities of the technology. This means all users, assets and data remain protected, regardless of where the attack comes from. Seizing the cloud opportunity The cloud presents both an opportunity and a risk to operators of CNI. Those organisations that use it as an opportunity to make cyber security investments that reduce risk and support transformation will be better placed in the long term than those that simply add new technology as another entry on the board's ever-larger risk register. Basic cyber-security hygiene practices, such as regular testing and patching of any systems connected to the internet and segmentation of networks, still have a critically important role to play.
However, the nature of current threats also necessitates a strategic change in cyber security that focuses on enabling operators to detect and respond to active threats. With the right strategy, based on threat intelligence linked to MDR and supported by ethical hacking techniques to test defences, organisations can reduce the time from intrusion to discovery and limit the damage attackers can do.  It's an exciting period of change for the industry, as new technologies are woven into operations to streamline services and enhance the customer experience. However, steps must be taken to ensure transformation does not come at the expense of cyber security. With the right strategy, and the support and guidance of a trusted security partner, operators can reduce cyber risks, even as the possibility of cyber attacks increases, and reap the benefits of a stronger, structured system for managing, isolating, and reducing threats. ### <strong>Top Cloud Computing Applications</strong> The world of technology is constantly evolving, and cloud computing has been a game-changer in the last few years. The cloud has a variety of uses in our daily lives, including data storage and programme execution. As cloud technology continues to evolve, it's important to keep yourself updated with the most recent applications that can benefit your business. In this blog, we will cover the top cloud computing applications that are revolutionising various types of businesses. Top Cloud Computing Applications In recent years, with major cloud providers like Amazon Web Services (AWS), Microsoft Azure, and Google Cloud Platform offering a wide range of services to businesses and individuals, cloud computing has become increasingly popular.  Software as a Service (SaaS), Platform as a Service (PaaS), and Infrastructure as a Service (IaaS) are the three main types of cloud computing services. SaaS providers deliver online software applications, PaaS providers offer a platform for developers to create and deploy software, and IaaS providers offer virtualised computing resources including servers, storage, and networking. What is Cloud Computing? Cloud computing is a model of providing computing resources such as servers, storage, databases, networking, software, and other services over the internet. Instead of owning and maintaining physical hardware and infrastructure, cloud computing enables users to access resources and applications from a shared pool of computing resources that are maintained by a third-party provider. The cloud computing model offers several advantages, including scalability and flexibility. With cloud computing, users can easily scale their resources up or down as needed and only pay for the services they use. Because traditional on-premises hardware and infrastructure are no longer required, significant capital expenditures and continuous maintenance costs can be avoided. The cloud computing model has transformed the use of technology, making businesses more adaptable, creative and competitive in today's digital economy. Top Applications of Cloud Computing Several cloud computing applications in the current market make a big impact on both individuals and businesses.
These cloud computing applications provide a range of features and advantages that can help organisations streamline operations, improve productivity, and reduce expenses. As adoption of the technology increases, we can expect to see more creative cloud computing applications in the coming years. Here we will discuss some of the most popular types of cloud computing applications that can be built using cloud platforms. Big Data Analytics- The technique of analysing large and complicated data sets to derive insights and information is known as Big Data Analytics. Cloud platforms provide the infrastructure and resources required to carry out extensive data processing and analysis.  Businesses can create big data analytics applications using the cloud's resources to analyse and process large data sets in real time. Machine learning, data warehousing, and real-time data analytics are examples of cloud-based big data analytics applications. Internet of Things Applications- The Internet of Things (IoT) is a network of real-world objects like automobiles, appliances, and other items that are equipped with sensors, software, and networking capabilities to collect and exchange data.  Data from IoT devices, such as sensors and connected devices, can be managed and processed using cloud platforms. Data collection, storage, analysis, and visualisation are all included in cloud-based IoT applications. Web applications- Web applications are software programmes that can be accessed using web browsers and are hosted on web servers. Online applications can be developed and hosted using cloud computing platforms like Microsoft Azure or Amazon Web Services (AWS). Businesses can control their application infrastructure expenses by using cloud-based web apps that can be scaled up or down based on demand. Social media platforms, e-commerce websites, and online productivity tools are examples of cloud-based web applications. Accounting Applications- The financial sector has been significantly impacted by cloud computing, and many accounting applications are now built on cloud platforms.  Accounting businesses can use cloud-based accounting applications to protect their clients' financial data by using security features offered by cloud providers, like encryption and multi-factor authentication. These applications have automation features that can help accounting firms simplify their workflows.  For example, cloud-based accounting applications can automate tasks like data entry, record-keeping, and invoicing, freeing up time for accountants to focus on higher-value work. Data Storage Applications- Cloud-based data storage applications have completely transformed data storage management. Users no longer have to worry about purchasing and setting up physical hardware, because with cloud storage they can easily add or remove capacity as required. This makes it simple for businesses to increase their data storage capacity as their needs change, and it is an affordable option for companies of all sizes. With cloud storage, businesses only pay for the storage space they need, and there are no upfront costs for hardware. This makes it a viable solution even for companies without the budget for large data storage systems. Medical Applications- The way medical research is done has changed as a result of cloud computing.
Researchers now have access to a variety of medical data sets from the cloud, including Electronic Health Records, clinical trials, and other research projects.  This enables more effective and accurate research, which leads to discoveries and advancements in the medical field. By utilising cloud computing, Healthcare practitioners can quickly and easily find trends, patterns, and discoveries in patient data. This allows for better decision-making and improved patient outcomes. Conclusion The use of the cloud has changed the way we work, communicate, and store data and has evolved into an essential part of our daily lives. In this blog, we have looked at some of the most popular applications of cloud computing. Cloud-based solutions can help businesses become more productive, efficient, and competitive.  We may expect to see even more creative and advanced applications in the future as cloud computing technology grows. The cloud has the power to revolutionise industries & improve the convenience and effectiveness of our lives.  As a result, it's crucial for organisations and individuals to remain aware of the latest developments in cloud computing and to explore how they can use this technology to improve their operations and achieve their goals. ### AI and ML: Transforming Industries and Shaping the Future Artificial Intelligence (AI) and Machine Learning (ML) are two of the most talked-about technologies of the modern era. AI and ML have revolutionised many industries and have the potential to shape the future of our world. This article will discuss what AI and ML are, their applications, and their impact on society. Artificial Intelligence and Machine Learning: What are they? Artificial Intelligence (AI) is the ability of a machine to perform tasks that typically require human intelligence, such as visual perception, speech recognition, decision-making, and language translation. AI is a broad field that encompasses various subfields, including machine learning, natural language processing, robotics, computer vision, and more. Machine Learning (ML) is a subset of AI that involves using algorithms to learn from data & improve the performance of a task. In other words, ML is a way to teach machines to learn from examples without being explicitly programmed. Instead, ML aims to enable devices to make decisions based on data, just like humans do. Applications of AI and ML AI and ML have numerous applications across various industries. Let's take a look at some of the most popular ones. Healthcare AI and ML have the potential to revolutionise the healthcare industry. They can be used to develop personalised treatment plans, predict disease outbreaks, and improve the accuracy of diagnoses. For example, machine learning algorithms can analyse medical images and detect cancer early. Finance AI and ML are also making waves in the finance industry. They can detect fraud, improve credit scoring & automate routine tasks. For example, machine learning algorithms can analyse financial data and make investment recommendations. Manufacturing AI and ML are transforming manufacturing by enabling automation and predictive maintenance. They can be used to optimise supply chain management, reduce downtime, and improve quality control. For example, machine learning algorithms can predict when a machine will likely fail, so it can be repaired before breaking down. 
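As a minimal illustration of the predictive-maintenance idea mentioned above, the sketch below trains a classifier on synthetic sensor readings to flag machines at risk of failure. It assumes scikit-learn and NumPy, and the data, features and thresholds are invented purely for demonstration.

```python
# Minimal illustration of the predictive-maintenance idea: learn from past
# sensor readings which machines went on to fail. Assumes scikit-learn and
# NumPy; the data below is synthetic and used purely for demonstration.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Synthetic history: [vibration, temperature, hours since last service].
X = rng.normal(loc=[0.5, 60.0, 400.0], scale=[0.2, 8.0, 150.0], size=(1000, 3))
# Label machines as "failed soon after" when vibration and temperature ran high.
y = ((X[:, 0] > 0.7) & (X[:, 1] > 65.0)).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(random_state=0).fit(X_train, y_train)

print(f"held-out accuracy: {model.score(X_test, y_test):.2f}")
# Flag a machine currently running hot and vibrating heavily.
print("failure risk:", model.predict([[0.9, 70.0, 520.0]])[0])
```

In a real deployment the training data would come from historical maintenance records rather than a synthetic rule, and the model's output would feed a maintenance schedule rather than a print statement.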
Transportation AI and ML are being used to improve transportation systems by enabling autonomous vehicles, optimising routes and reducing traffic congestion. For example, machine learning algorithms can analyse traffic patterns and predict the best route for a driver. Customer Service AI and ML are also improving customer service by enabling chatbots, virtual assistants, and personalised recommendations. For example, machine learning algorithms can analyse customer data and recommend products or services based on their preferences. Impact on Society The impact of AI and ML on society is both positive and negative. Let's take a look at some of the most significant effects. Job Displacement AI and ML have the potential to automate many jobs, leading to job displacement. For example, autonomous vehicles could replace truck drivers, and chatbots could replace customer service representatives. While this could lead to increased productivity and efficiency, it could also result in job loss for many people. Bias and Discrimination AI and ML can also perpetuate bias and discrimination. Machine learning algorithms learn from historical data, which may contain biases. For example, if a loan approval algorithm is trained on historical data with discriminatory practices, it could lead to biased loan approvals. Privacy and Security AI and ML also raise concerns about privacy and security. For example, facial recognition technology can be used to track people's movements, which could violate their privacy. Similarly, machine learning algorithms can be vulnerable to cyber attacks, leading to data breaches. Increased Efficiency Despite the potential negative impacts, AI and ML can also increase efficiency and productivity. For example, autonomous vehicles can reduce traffic congestion, and chatbots can provide instant customer service. Improved Healthcare AI and ML also have the potential to improve healthcare by enabling personalised treatment plans, predicting disease outbreaks, and improving diagnoses. This could lead to better health outcomes and a reduction in healthcare costs. Environmental Impact AI and ML can also positively impact the environment by enabling the development of sustainable technologies. For example, machine learning algorithms can optimise energy consumption, reduce waste and improve resource management. Ethical Considerations AI and ML also raise ethical considerations, such as the responsibility for decisions made by machines and the potential consequences of those decisions. For example, who is responsible for the outcome if an autonomous vehicle is involved in an accident? In conclusion, AI and ML are two of the most transformative technologies of our time. They have the potential to revolutionise many industries and significantly impact society. While there are concerns about job displacement, bias, and privacy, there are also potential benefits such as increased efficiency, improved healthcare, and a positive environmental impact. We must consider AI and ML's positive and negative effects and work towards responsible and ethical use of these technologies. As the field evolves, it will be exciting to see what new applications and innovations emerge. ### <strong>A guide to broadband for home workers</strong> With better connectivity, working remotely has become a much more viable professional proposition. But with broadband crucial to clocking in, what home office setups perform best? Working from home is becoming much more commonplace for employers and employees alike.
Companies are seeing value in the remote office and using the distributed team model to run an increasingly digital workforce. Economic, logistical and motivational benefits are among the many advantages that hinge on adequate infrastructure. If reliable networking and IT are fundamental, then good domestic broadband is essential. So what kind of connection and equipment is worth the investment to make your home office a reality? What is the best provider for working from home? Picking an Internet Service Provider or ISP might depend on an employer or business type. Companies may have preferred corporate partnerships or schemes for staff to use. Installation and contracts might be subsidised or available at discounted rates. More typically, home workers will either have existing broadband packages or be expected to arrange this privately. Making the right choices here will depend very much on balancing requirements across home and business usage. Are any packages advantageous to home working?  Some providers such as BT and Vodafone offer mobile broadband backup solutions alongside landline deals.  Often referred to as “hybrid” broadband, this primarily improves connection reliability. Any line drop-outs are seamlessly switched to mobile (4/5G) network to give “always on” service.     Similarly, business broadband packages are an option. These tend to work off new, dedicated lines being installed to deliver a fast connection separate from any other home networks. Again, the relevance of these solutions will be determined by your type of work. What speed should I get? Generally, broadband ISPs try to differentiate themselves on speed versus price.  Standard package offers are often comparable in terms of speed, with entry-level fibre connections (~35Mbps) always a good starting point. Very often, this is enough, depending on household capacity and expected usage.  For shared connections, it's worth budgeting an additional 10Mb per person, plus 10Mb for each heavier task user. Consider any streaming, gaming and large downloads as intensive activity that could throttle bandwidth. A good place to start is to use a broadband comparison service such as Broadband.co.uk. This will allow you to look up and compare deals available at your address. Time to take a faster route? Remote workers within a busy shared home may feel faster is necessary. Likewise, professionals who fulfil intensive tasks like large file transfers or video conferencing as part of their business could require bigger bandwidths. In terms of superfast UK providers, Virgin Media or a full fibre operator like Hyperoptic will offer the highest speeds. For example, Virgin Media offers Business packages ranging from 400-1000Mbps. Full fibre options rely on network availability, and keep in mind that superfast connections typically require specialist installation to the property. What can I do if I can't get Wi-Fi in my office? If your home office area is some distance from the main living spaces, the Wi-Fi signal might not reach. In this situation, there are four viable solutions worth trying: Wired connection via Ethernet. Powerline network adapters.   Wi-Fi signal extension kits. Add a mobile connection. Hardwiring the connection with an Ethernet cable is always the fastest and most reliable option. This could be a DIY job or something requiring potentially expensive installation by an engineer. Powerline adapters are a good compromise here. These plug in to the electric mains to build a wired network between outlets. 
But they are not always the most reliable solution. The options for boosting your signal: Wi-Fi boosters and mesh systems are used to extend Wi-Fi signal coverage. Mesh Wi-Fi kits are powerful and can extend wireless in and around a large home, but can be expensive. Boosters aren't always the most powerful option but may be enough for extending Wi-Fi to a home office if there's sufficient signal nearby to boost. Lastly, adding mobile broadband could be the answer. Investing in a 4G or 5G service could give a home office its own dedicated connection. This also has the benefit of not being shared and compromised by any other home users. What can I do if my broadband goes down? When any broadband service drops, it's never a good time. This can be doubly frustrating, and maybe costly, if your home working is compromised. Thankfully there are at least some emergency backup options to try when trouble strikes: Mobile tethering to create a hotspot from your phone. Not ideal for lengthy usage because phones are not designed to be routers, and data limits can quickly be exceeded. Also make sure your network provider allows tethering. Public services like free general access Wi-Fi hotspots in libraries or cafes could be more affordable than paying for a backup. Similarly, borrowing a friend's connection is a way around an outage if rarely needed. In general, prolonged outages should be uncommon. If they are very frequent or are expected to last for a long period, then you may be entitled to compensation. Check for local issues and keep a diary of problems before speaking to your provider. Some can provide backup options if required, or credit the account as an apology. Can I expense my broadband to my employer? Charging an employer for home office costs will vary by working situation and company policy. Most shouldn't expect any financial help, as a rule, but it never hurts to ask politely. Some companies who employ home workers will pay expenses, help with utility bills, and even contribute monthly towards broadband services. Those who are self-employed as independent contractors are probably less likely to have this additional support. More generally, HMRC tax reliefs for home working are available, but the criteria for eligibility are strict. Making good broadband your business Working from home is only going to become more popular. Even if you don't plan to operate that way full-time, having the facility is hugely useful when required. Broadband is fundamental to this, and not necessarily business packages: most standard, fibre-based services are very capable of handling professional demands alongside the recreational. By considering the environment and usage, it shouldn't be hard to build a home office setup you can hang your career on. ### <strong>Three New Year's resolutions to streamline your Cloud migration</strong> Migrating to the Cloud can be a daunting prospect for businesses of all sizes - from SMEs to global business titans, it can be fraught with difficulties and challenges. Transitioning to the Cloud is more than simply moving data from one location to another; it is a process that takes into account other aspects such as resilience, geo-location and scalability. Technology leaders today must consider the risks, rather than just thinking about leveraging technology to bolster their day-to-day operations. For Cloud migration to be a success, the challenges need to be addressed so that the benefits can be maximised.
As 2023 continues, now is the time to leverage the advantages of Cloud to the fullest and to remember the ingredients for success that can help organisations take their migration efforts to the next level. Taking the first steps Gartner has forecast that by 2026, Cloud spending as a percentage of total IT budgets will reach 17%, and that it will account for a significant 25% of global IT spending in the near future. Whilst there's currently speculation regarding a decrease in Cloud spending, it's important to remember that not all organisations have migrated and many have more work to do on their journey to the Cloud. With this in mind, it couldn't be more vital for the future of many businesses that their Cloud migration takes place seamlessly. The Cloud is pivotal in giving businesses the opportunity to achieve transformation value and, as PwC's 2023 Cloud Survey uncovered, business success doesn't stem from piecemeal migration but from a thorough reinvention of organisations. From improved decision-making to increased productivity, agility and cost savings - the Cloud can truly help businesses at a time when they are under financial pressure and when digital transformation has become expected from stakeholders. Getting it right first time Before the migration takes place, IT leaders need to define the business strategy and objectives, determining what they want to achieve from Cloud adoption, such as improved flexibility, lower costs or enhanced efficiency. When these goals have been defined, organisations can work out how best to achieve them and identify any challenges that they may come up against in the migration process, before creating a concise and actionable plan that covers the foreseeable scenarios. By thinking of the end goals first, the Cloud migration won't be out of touch with the wider business strategy, preventing the financial and resource constraints that can be associated with transformation. Getting it right first time is that much easier when the strategy has been fully thought through and carefully considered. A company's existing technology infrastructure also needs to be taken into account, as new Cloud solutions should build synergies rather than require an additional set of investments to be made. Businesses in many cases already have significant amounts of existing infrastructure that have been developed over the past decades, and a move to the Cloud has the potential to affect other systems if not undertaken correctly. By prioritising solutions that complement fundamental elements of their technology, businesses can highlight which areas need to be updated in line with the new Cloud solution. Furthermore, Cloud migration should take the entire business into account, not just the IT department. Not only does this ensure that the company's technology experts are not overburdened with the migration, but it also removes decision-making siloes to ensure the Cloud solution is beneficial on a larger and longer-term basis. Fostering a culture of collaboration and communication with relevant stakeholders across the company can make a huge difference to the overall success of implementation. It's this integration between technology and the wider business that can take Cloud migration to another level, with a synergy between technology and business needs.
Choosing the right technology partner  McKinsey predicts that by 2024 wasted Cloud migration spend will reach $100 billion, and enterprises cite costs around migration as a major inhibitor to adopting the Cloud. Without a cost effective and successful migration, budgets can be quickly squandered and so choosing the right technology partner to help couldn’t be more important. This partner can provide an invaluable external perspective that can often mitigate some of the challenges associated with migration.   Technology partners can maximise the potential of the Cloud by choosing what is optimal for an organisation based on its requirements and unique characteristics. A consultant can first conduct a Cloud requirement analysis to determine what the organisation needs, by hosting interviews with stakeholders, reviewing documentation and building it into business flows. They can advise on the optimal Cloud provider, explain potential challenges and risks, and present the concrete benefits that will apply. Migrations are not easy and by supplementing an internal team with the Cloud experience of an external provider, the process can be that much smoother. Whether a migration is from on-premise to Cloud or by modernising systems towards a Cloud native application, a technology partner can devise a strategy and roadmap to give a migration the best chance of success.  When it comes to security, the weakest link often comes down to humans and not the technology. With organisations now facing large fines for breaches and non-compliance with GDPR, it’s vital that all the potential business risks are considered ahead of the migration. CloudOps consulting can even design a hybrid or Cloud only infrastructure from scratch. The initial migration is not the end of a consultants involvement, as they can improve the performance and optimise the costs of existing infrastructure. Whether it’s security enhancements or advanced disaster recovery services, consultants can ensure that a migration continues to be a success in the long-term.  As we move into 2023, now is the time to leverage the advantages that the Cloud brings and take note of the factors that help companies streamline their migration or Cloud management to stay ahead of the competition. It’s never too late for organisations to take their Cloud migration to the next level and by working with the right partner, the challenges associated with a migration can be successfully managed. ### How Cloud Native Application Protection Reinforces Your Cybersecurity Posture The rapid adoption of the cloud has broadened the horizons for businesses embarking on a digital transformation journey, and organisations are swiftly taking the giant leap.  Cloud-native applications are becoming increasingly popular in modern organisations. These applications are designed to run on cloud-based infrastructure, making them more scalable and flexible than traditional, on-premise applications.  While the benefits of cloud-native applications are clear, they also pose new challenges to cybersecurity teams. As such, it is essential to have robust protection measures in place to ensure the cloud-native security of these applications and the sensitive data they handle.  Let’s discuss the importance of cloud native application protection and critical steps that organisations can take to ensure the highest level of security. 
The Importance of Cloud Native Application Protection in Today's Era Cloud computing has revolutionised businesses by providing scalable and cost-effective solutions for storing, processing, and accessing data. However, with the increasing popularity of cloud computing comes an increased risk of cyber threats, making cloud cybersecurity a critical concern for organisations. Hence, businesses must incorporate the highest level of cloud security. While many enterprises have implemented traditional forms of protection, such as firewalls and antivirus software, on their servers to prevent attacks from external sources, these are not always effective against modern threats that can originate from within the organisation itself. To combat these new types of threats, it's essential to employ a more advanced solution capable of detecting application vulnerabilities before they cause damage or compromise sensitive data stored on systems within your network environment. Let's look at what can be done from an enterprise's end to reinforce cloud-native application protection. #1. Implement Strong Access Controls Organisations should implement strong access controls to ensure that only authorised users can access sensitive data. This can be achieved through strong passwords, multi-factor authentication, adaptive authentication, and role-based access controls. Strong passwords are critical for protecting against unauthorised users gaining access to sensitive data or systems. Hackers can easily guess weak passwords, so organisations must enforce strong password policies that include minimum length requirements, complexity requirements, and periodic password changes. Multi-factor authentication is another method for bolstering security, as it requires users to enter a password and an additional piece of information (such as a PIN) before they can log into their accounts. Role-based access controls allow administrators to grant different levels of access based on job function or seniority within an organisation. For example, developers may only need read-only access, while product managers require complete control over all aspects of development projects. #2. Encrypt Sensitive Data Encryption is critical to ensuring that sensitive information is protected from cyber threats. It helps to protect data both at rest and in transit, and it should be used for all sensitive data stored in the cloud. Organisations should also ensure that encryption keys are properly managed and stored securely to prevent unauthorised access. In addition to encryption, organisations must consider other security measures, such as vulnerability assessments and penetration testing of their applications. These tests help to ensure that there are no vulnerabilities that hackers could exploit if they were to gain access to the cloud environment where your application resides. #3. Regularly Monitor and Audit Cloud Environments Organisations should periodically monitor their cloud environments for suspicious activity or potential threats. This can be done through security tools such as intrusion detection systems, log management tools, and security information and event management (SIEM) solutions; a simple example of the kind of check these tools automate is sketched below. Additionally, organisations should conduct regular security audits to identify potential vulnerabilities and implement the necessary remediation measures.
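As referenced above, here is a minimal sketch of one such automated check. It assumes a simple, hypothetical event format and flags accounts with a burst of failed sign-ins; real SIEM and intrusion detection tools apply far richer correlation rules than this.

```python
# Minimal sketch only: flag accounts with many failed sign-ins in a short window.
# The log format, field order and threshold are assumptions made for illustration.
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical events: (timestamp, account, outcome)
events = [
    ("2023-03-01T09:00:05", "alice", "failure"),
    ("2023-03-01T09:00:09", "alice", "failure"),
    ("2023-03-01T09:00:14", "alice", "failure"),
    ("2023-03-01T09:00:20", "alice", "failure"),
    ("2023-03-01T09:00:27", "alice", "failure"),
    ("2023-03-01T09:05:00", "bob", "success"),
]

WINDOW = timedelta(minutes=5)
THRESHOLD = 5  # failed attempts within the window that trigger an alert

failures = defaultdict(list)
for ts, account, outcome in events:
    if outcome != "failure":
        continue
    t = datetime.fromisoformat(ts)
    # keep only failures for this account that fall inside the sliding window
    recent = [x for x in failures[account] if t - x <= WINDOW]
    recent.append(t)
    failures[account] = recent
    if len(recent) >= THRESHOLD:
        print(f"ALERT: {len(recent)} failed sign-ins for '{account}' within {WINDOW}")
```

In practice an alert like this would be routed into the incident response process rather than printed to a console.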
Security audits are conducted by third-party experts who comprehensively assess your organisation's security infrastructure for compliance with industry standards and best practices. This includes identifying gaps in your security posture, weaknesses that need to be addressed, and areas where you can improve your existing controls. The results of these audits can then be used to improve your current processes or procedures so that they align more closely with best practices and with regulatory standards, including the CCPA (California Consumer Privacy Act) and GDPR (General Data Protection Regulation). #4. Use Trusted Cloud Service Providers Organisations should choose a cloud service provider with a proven track record of providing robust security solutions. This includes strong access controls, data encryption, and regular security audits. Organisations should also look for cloud service providers that comply with relevant security standards and regulations, such as the Payment Card Industry Data Security Standard (PCI DSS) and the Health Insurance Portability and Accountability Act (HIPAA). When selecting a cloud service provider, organisations should consider how well the provider can meet their unique requirements. For example, if an organisation has specific compliance requirements or needs to collaborate with a particular set of partners, it needs to find a provider that offers these capabilities. #5. Train Employees on Cloud Security Best Practices Finally, organisations should train their employees on cloud security best practices. This includes educating employees on the importance of strong passwords, the dangers of phishing scams, and the proper handling of sensitive data in the cloud. Employees should also be aware of the potential consequences of security breaches, such as loss of sensitive information, financial loss, and reputational damage. To Conclude Ensuring a robust cloud cybersecurity posture is critical for organisations that rely on cloud-based systems and data. By implementing strong access controls, encrypting sensitive data, regularly monitoring and auditing cloud environments, using trusted cloud service providers, and training employees on cloud security best practices, organisations can take the necessary steps to protect their cloud-based systems and data from cyber threats. Beyond this, it's always a good idea to incorporate modern tools and technologies that help detect and contain a security breach, reducing the chance of financial or reputational loss. ### Discovering Rules Through Machine Learning Back in 1868, George Boole's wife paraphrased his thoughts on the capabilities of machines: Between them they have conclusively proved, by unanswerable logic of facts, that calculation and reasoning, like weaving and plowing, are work, not for human souls, but for clever combinations of iron and wood. If you spend time in doing work that a machine could do faster than yourselves, it should only be for exercise. We've come a long way since Claude Shannon applied Boole's work to the design of switching circuits, heralding the age of computers. Just as Boole had predicted, most calculations and reasoning chains are now done by "clever combinations of iron and wood." We just realised silicon might be a bit better at doing the job. What Boole himself did throughout his life, however, is something he'd not have thought possible for a machine: he was investigating the rules that governed thought itself.
In other words, he was moving towards a higher level of reasoning, above the regular calculations of daily life. While we’re still far away from artificial intelligence that would be able to make use of philosophy in a similar fashion to Boole and others, we’re coming close to a more subtle form of computational thinking. Machine learning can be used to discover unintuitive rules in some areas of life. Machine learning’s capabilities Much has been written about how machine learning is about to replace all other modes of solving problems. One of the most popular suggestions is that we should forgo rule-based approaches for machine learning. Such a view, in my eyes, is overly idealistic. Solving problems with machine learning that could be solved with a rule-based approach is a waste of resources. Models, especially more complicated ones, can be prohibitively expensive and require much maintenance to keep them accurate. In an ideal world with unlimited resources, both computing and fiscal, these differences wouldn’t matter. In business, however, we’re always working within tightly defined boundaries, as any usage of resources also means an opportunity cost. Preferably, then, we would opt to solve all problems with rule-based approaches. However, that runs into other complicated issues, such as not all problems having defined boundaries that can be solved through rules. Machine learning is great at solving two types of challenges. Any problem that requires a probabilistic answer is likely much better done by a model rather than anything rule-based. Another area where machine learning is immensely valuable is when the rules are not clear. In business, we might sometimes not be sure on how to answer specific questions. For example, what rules should govern a self-checkout process? There are nearly infinite possibilities for structuring such a feature, but we’re always looking to maximise the outcome. In other words, we’d prefer that a self-checkout would lead to the most conversions. Inferences from machine learning models A common objection might be that some machine learning models, such as Deep Neural Networks, are essentially black boxes. We’re never quite sure what’s going on under the hood, so extracting rules from them is as much guesswork as without them. Fortunately, in business applications, we don’t need to be as exact as logicians or scientists who attempt to uncover the foundational blocks of minds, language, or the universe. Insights that point us in the right direction are enough to create a case for doing things one way or another. In other words, when building a model that predicts the best outcome for a self-service customer  system, we’re not trying to define some immutable laws of human behavior. We’re simply looking at an admittedly ever-changing set of circumstances and attempting to wrestle out the best way to go about them. So, going back to the same example, a Random Forest algorithm, fed with enough data from event sessions and user activities, could outline the most predictive outputs. These would indicate what users are most influenced by during the self-service process. These outputs might not be ground-breaking or even wide-ranging as they only work in a fairly confined space of circumstances. But they’re more than enough for the engineers, designers, and content writers to perform optimisation that would lead to better conversions. These insights can then be turned into rule-based algorithms. 
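As a hedged illustration of that last point, the sketch below fits a Random Forest on synthetic self-checkout sessions and reads off which inputs were most predictive of conversion. The feature names and data-generating assumptions are hypothetical, not a claim about real checkout behaviour.

```python
# Illustrative sketch only: fit a Random Forest on synthetic self-checkout
# sessions, then inspect which inputs carried the most predictive weight.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
n = 5000

# Hypothetical per-session signals
page_load_seconds = rng.gamma(2.0, 1.5, n)
steps_in_checkout = rng.integers(2, 8, n)
saved_payment_method = rng.integers(0, 2, n)
discount_shown = rng.integers(0, 2, n)

# Synthetic "converted" label: slow pages and long checkouts hurt, saved payment helps
logit = (1.0 - 0.4 * page_load_seconds - 0.3 * steps_in_checkout
         + 1.2 * saved_payment_method + 0.2 * discount_shown)
converted = (rng.random(n) < 1 / (1 + np.exp(-logit))).astype(int)

X = np.column_stack([page_load_seconds, steps_in_checkout,
                     saved_payment_method, discount_shown])
names = ["page_load_seconds", "steps_in_checkout",
         "saved_payment_method", "discount_shown"]

model = RandomForestClassifier(n_estimators=300, random_state=0).fit(X, converted)

# The most important inputs become candidates for simple, rule-based changes
for name, importance in sorted(zip(names, model.feature_importances_),
                               key=lambda pair: -pair[1]):
    print(f"{name}: {importance:.2f}")
```

Feature importances do not explain every individual decision the model makes, but they are usually enough to justify a concrete rule-based change, such as capping the number of checkout steps, which can then be implemented and maintained far more cheaply than the model itself.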
As such, machine learning models can give us a way to discover circumstantial rules that we can implement in our business practices. Conclusion Hopes that machine learning will replace rule-based systems are ill-founded. The latter is often much more efficient and cheaper to build and maintain than complicated machine learning models. As businesses are always turning one eye to efficiency, rule-based systems are here to stay. Machine learning, unlike commonly thought, can be used to supplement rule-based systems. While there are possible ways of combining one into a single system, the former can also be used to garner insights that can then be implemented into the latter. In the end, machine learning shouldn’t be thought of as the cure-all for technical problems. It’s one of the many possibilities that should be used thoughtfully. One of those is to ensure we make better decisions in other systems. ### <strong>How 5G is changing business operations in 2023</strong> Due to the increase in worldwide 5G deployment— across more than 2,000 cities in over 70 countries as of April 2022 — global connectivity has transformed and has subsequently impacted how enterprises manage their business operations. Over the coming years, 5G is expected to reach one billion new users, making it the most rapidly adopted mobile data transmission technology when compared to all previous generations.  While the increased services that 5G enables will boost revenue for communication service providers (CSPs), it’s not all smooth sailing for them, as 5G's shift to a cloud-native architecture has introduced significant complexity. Kubernetes architecture, mandatory encryption, 5G standalone (SA), edge computing, and network slicing are just some of the new challenges. If providers do not design in observability – which measures end-to-end performance from the application to the network level –from the beginning, new services will take off slower than anticipated, and the customer experience cannot be guaranteed. Observability solutions must seamlessly provide continuity between existing 4G and new 5G networks – in other words, visibility without borders. 5G technologies and the shift to the cloud  It is clear that 2022 brought major improvements to realising 5G's potential, with actual and quantifiable results. However, this is not the only driver behind the transition to cloud-native architectures, as they also offer several benefits that would otherwise be unavailable.  For example, the cloud makes it easier to evolve network capabilities and with a higher degree of flexibility. The cloud simplifies updating the network repository function (NRF) responsible for maintaining an updated repository of all 5G network elements and the services they provide. If an operator wishes to roll out new features, the only requirement, with a cloud-native foundation, is an easy software upgrade. This significantly reduces the impact on service during upgrades. Generally, a cloud architecture directs network resources to where and when needed, thereby ‘right-sizing’ operations. This flexibility is particularly important when considering two aspects that are unique to 5G. First of all, the telecom industry is shifting from a ‘horizontal’ to a ‘vertical’ model, and a crucial feature of this move is the implementation of network slicing. This enables multiple independent and virtualised logical networks while using the same network infrastructure. 
5G network slicing requires a cloud architecture to ensure critical management and general operations. Each network slice is isolated within an end-to-end network that is designed to meet a series of requirements made by a specific application. Secondly, the use cases for 5G are far more critical than those of previous generations —and not just from a business perspective. With 5G, peoples’ lives could depend on service reliability. For example, ultra-reliable low-latency communications (URLLC), is a 5G service for mission-critical, ultra-latency-sensitive services. URLLC enables new applications such as virtual reality, autonomous driving, remote surgery and more. With URLLC, CSPs must be able to guarantee that these operations and services can operate without downtime. Being able to quickly detect and implement fixes for network issues is therefore paramount and further demonstrates the importance of observability.  With the move to the cloud for 5G taking place at the ‘core’ and ‘network edge’, an open and interoperable framework will be essential for assuring a flawless end-user experience, and once again, it is only possible through the cloud. How companies can deliver a better customer experience A focus area for some organisations looking to improve their business operations is how the cloud can accelerate the artificial intelligence (AI) revolution within the 5G ecosystem. Achieving this requires a more sophisticated level of automation. Without it, the next generation of networks will not be able to operate correctly – affecting everything from cybersecurity protection to IT troubleshooting. Through the use of faster and better cloud-based AI tools and automation, these elements can help organisations understand the initial cause of problems and prevent them from escalating further. The complexity of the 5G network and the broad range of partners in the ecosystem is unlike former wireless generations and therefore necessitates advanced, AI-powered automation. Without that assistance, customers are more likely to have a negative experience. Additionally, with the current staff shortages affecting organisations worldwide, and long wait times for newly available data scientists, network automation can ease the workload for overwhelmed teams while giving them time to focus on the most mission-critical issues.   Ultimately, improving customer service can only be achieved if businesses can measure it – you can’t guarantee what you can’t measure. Service providers that build in observability from day one can measure and know what kind of customer experience they are delivering and analyse what is or isn’t working, and why. Observability and security While there are, of course, many benefits to 5G when it comes to improving business operations, the added complexity and barriers that it brings, means that observability is no longer a ‘nice to have’. It is now essential for ensuring and securing mission-critical business operations – especially given the increased use cases where lives could be at risk. Furthermore, in contrast to the centralised design of 4G, 5G’s disaggregated architecture means the threat landscape has expanded and diffused, which makes things more challenging. As such, observability and security solutions have become ‘must haves’ when it comes to addressing the barriers introduced by 5G. The good news is that observability is now possible in a cloud-native 5G standalone environment, and it can work regardless of the vendor, so interoperability can be maintained. 
However, given the way in which 5G fosters a more innovative and competitive business landscape with multiple vendors, enterprises must ensure that they are working with an independent security and observability provider that is vendor agnostic to effectively secure and assure the infrastructure. Both cloud-based networks and modern cyber threats are increasing in sophistication, and this shows no sign of slowing down. This means that organisations must continuously look out for new attack methods or trends, as well as new measures to circumvent them. To improve business continuity, every second counts in the race to resolve network performance issues that may hinder business operations or the online experience for customers. Visibility without borders allows organisations to take charge of their cybersecurity position and transform their digital operations for years to come.  Overall, the ongoing rollout of 5G is giving rise to a whole host of new enterprise business applications and services which opens doors for CSPs to reach new vertical industries and boost revenue. However, the mission-critical services that 5G enables adds serious pressure on CSPs to guarantee that those services can operate without downtime. The only way to achieve this is by measuring the end-to-end performance by building in observability from the onset. ### <strong>Organisations are not standing still in cloud adoption</strong> With businesses facing concerns from rising cyber security threats, increasing global supply chain issues and the impact of geopolitical instability, having cloud technology in a company’s IT arsenal is key to optimising their use of data and remaining both competitive and agile in a changing world.   The investment into cloud infrastructure has come largely from the small and medium-sized businesses, as they look to get support with workload migration, data storage services, and cloud-native application development. When it comes to making use of real-time data in business, it’s becoming ever clearer that the cloud is a must-have. Recent research backs this up, with IT decision makers (ITDM) and business decision makers (BDM) confirming that cloud adoption is on the up — and it’s moving fast. More than half (54%) of respondents stated they were using the cloud either at an intermediate or advanced level.  This is hardly surprising. After the pandemic forced millions to pivot working from home almost overnight, the tech that was rushed in to help support that flexibility was always going to remain around afterwards — particularly as many workplaces now face hybrid and home working as a permanent fixture. However, what will be interesting is seeing how businesses continue to build on this kick start and move forward with maturity in the next year to 18 months.  More storage please With more data to hand, comes the need for greater storage. While many organisations have moved key files, applications and data into the cloud, the next step is to find the storage for all the new data they acquire following to the move. The survey found that cloud-based data warehouses, data lakes and lake houses played a prominent role in 2021 for exactly this reason, with respondents citing these as a top initiative (48%) as well as one of their top use cases (57%). These modern data management approaches offer businesses a way of seamlessly accommodating legacy systems so they can work alongside newer, more sophisticated cloud-based systems. 
That's good news, considering the popularity of a hybrid approach continues to rise as organisations are either unwilling or unable to give up their legacy systems entirely. Among those surveyed, nearly double the number of ITDMs and BDMs chose hybrid (37.5%) over pure public cloud (20%), showing a widening disparity between the two deployments, despite the increased cloud footprint. Furthermore, almost all organisations (93%) said they were either using, evaluating, or considering leveraging cloud-based data integration, management, and analytics — including powerful technologies like data virtualisation and logical data fabric. This allows for a more fluid "instant access" experience across on-premises and cloud systems. This comes with a number of benefits, including regulatory compliance, scalability and greater agility, thanks to real-time application use improving employee productivity. Data is complex But what are the main issues that stand in the way of businesses looking to become wholly "data-driven" organisations? Those surveyed listed the complexity of data integration, accessibility and accommodating different data formats as the biggest difficulty (79%), with a lack of resources to turn raw data into insights coming in second (62%). Other key challenges identified were difficulties with accessing and analysing data, with 44% only able to access half or more of their data and just 17% able to leverage three-quarters or more. This is going to be one of the biggest roadblocks to businesses getting to a position where they can make key decisions based on data. In order to harness the full potential of cloud migration, organisations need to ensure they have the right people on board to be able to manage and use the data they have access to. The importance of this has not been overlooked. While the role of IT in the cloud modernisation journey a year or more ago was more focused on choosing a provider and the migration itself, the focus now is on having the skills in place to make the most of cloud systems. Almost a third of respondents (31.3%) said their IT teams are now more focused on receiving the training they need to take their cloud capabilities to the next level. Looking to the future Reporting and dashboards, self-service BI and ad-hoc analytics remain important priorities, but in the future, data virtualisation, data preparation, data quality and blending should become more prominent as the cloud becomes a seamless part of an organisation's IT structure. Despite companies being at different stages in their journey to the cloud, what's clear is that no one is standing still. Companies are looking to the future, looking to prepare for what's next, and wanting to maximise the data and cloud systems they have with strong cloud-based repositories. The cloud is here to stay, and the companies that invest in making the most of it will reap the benefits at every turn. ### <strong>Data is the key to unlocking investment for emerging markets</strong> Over the past few years, the rapid economic growth experienced by emerging markets since the turn of the century has faced significant challenges. In the face of the pandemic, they were forced to be more adaptable than their mature market counterparts, taking on fresh debt. And while many economies have rebounded, recovery has been far from even. If emerging market economies are to return to their previous growth trajectories, then investment will be critical.
Those investments, at least in the cumulative sense, have to be big too. In 2020, for example, South Africa revealed that it would need just under US$87 billion just to finance its infrastructure development needs over the next decade. A year earlier, meanwhile, Bangladesh revealed that it would need somewhere around US$24 billion annually to meet its infrastructure needs. Those figures, which have undoubtedly climbed in the wake of the pandemic, are mirrored to varying degrees in emerging-market countries around the world.  Nowhere is the need for such investment clearer than in the transportation sector. Transport remains at the centre of life in busy and bustling cities. It is what keeps the city moving, getting people from A to B, and ensuring a stable economy. Without efficient, reliable, and safe public transportation systems, any attempts at improving a city’s economy is futile. This is especially true in emerging markets. But if these markets are to attract the kind of investment they need, data will be critical.  The power of transport  In order to understand why effective transport data is so critical to economic growth in emerging markets, there are a few things worth considering.  First off, there’s the time spent commuting and ultimately lost from an economic productivity perspective. Take Lagos, for example. A 2021 report found that by the time the average Lagosian reaches 55, they will have spent nearly seven years in traffic. Imagine what you could do and achieve if you were given an extra seven years back. Multiply that by the millions of people who live in the city and you can see how much of an economic cost traffic represents.  That’s to say nothing of the environmental costs (according to the World Economic Forum, mayors in more than 100 cities have said that investing in public transport could create 4.6 million jobs by 2030 and cut transport emissions) that come with all that traffic or the mental health costs that place a burden on healthcare systems around the world.               Even if you can’t give a city’s inhabitants back all of the time they spend in traffic, half an hour on either end would allow commuters to study further, spend more quality time with their family and friends, or even launch a side hustle.  In order to unlock that value, data is critical. After all, unless you know where pile-ups and traffic jams take place, you can’t begin to address them. That’s especially true in emerging market cities where public transport networks are often informal and don’t follow fixed routes. But the value of transport data goes much further than just making life easier for commuters.  Making business and investment easier  That’s true for governments and transport authorities: with the right data, they can map and plan new transport routes, tailored to each and every city. That, in turn, can drive investment. We’ve seen this first-hand, with data from WhereIsMyTransport used by the Sarajevo Canton in Bosnia and Herzegovina in evaluations that helped them secure an infrastructure investment loan from the European Bank for Reconstruction and Development (EBRD).     But public transport data can also be useful for attracting private sector investment. Retailers, for instance, can get a much better idea of where to locate their stores. In emerging market cities especially, it makes sense to have stores as close to transport hubs as possible. 
With most people relying on informal public transport, rather than their own vehicles, it's vital to ensure that companies receive maximum return on their investments. There are further benefits from this kind of data too: developers and construction companies can more easily figure out where to put their next big real estate project, and logistics companies can plan the most efficient routes for their drivers, along with various other gains. Ultimately, the more they know about the opportunities in a specific city or market, the more willing organisations will be to invest and keep investing. Beyond recovery, towards growth It should be clear then that data, and transport data in particular, is crucial to securing the kinds of public and private investment that will take emerging market economies from recovery towards sustained (and sustainable) growth. In order for it to have that effect, however, it's critical that this data is accurate and up-to-date at all times. The fact that cities can attract this kind of investment while also improving the lives of their citizens should make gathering and sharing the required data a no-brainer. ### <strong>A New Journey to the Cloud</strong> ERP implementation has changed. And for those companies facing the 2027 maintenance deadline for SAP ECC 6, that is good news. In today's cloud-first, 'adopt not adapt' model, there are no more whiteboards. No more consultants offering to customise software to meet any business need. And no more long, drawn-out implementations – followed by expensive and disruptive upgrades and a complete loss of commitment to using technology to accelerate change. Instead, SAP is leading a push to 'protect the core', advising companies to avoid customisation, use Best Practice Scenarios to define processes and embrace the six-monthly product enhancements. But this is a significant cultural change – for everyone involved. As Don Valentine, Commercial Director, Absoft, explains, companies need to look hard at their SAP consultants, ensure they are on board with standardised implementations and have the skills to support a different approach to innovation and competitive differentiation. Ready to Move Companies are gearing up for the move to SAP S/4HANA, with research from the UK & Ireland SAP User Group (UKISUG) revealing that 70% of companies planning to move to S/4HANA will do so in the next 36 months. While the 2027 maintenance deadline for SAP ECC 6 is, of course, a key driver in migration plans, new functionality (49%) and wider business transformation (47%) were also cited as main drivers. But while companies are getting ready to migrate, how many are comfortable about the right way forward? SAP's cloud-first model has fundamentally changed the approach to software implementation. Wherever possible, SAP is advocating a 'no customisation' strategy, pushing an 'adopt not adapt' model. With this approach, companies can not only fast-track implementation but also avoid the costs and complexities associated with upgrading customised solutions. And yet, the SAP User Group survey revealed that almost three-quarters (72%) of organisations say existing customisations present a challenge when moving to SAP S/4HANA. This is down on last year's results (92%), and nearly a quarter (24%) of organisations plan to remove customisations entirely (replacing them with standard functionality), compared to 12% in 2021.
While this suggests the standard model is gaining traction, more clearly has to be done to provide SAP ECC users with an understanding of what a move to S/4HANA in the cloud both offers and entails. Three Options S/4HANA is only available to companies in the cloud – but that doesn't mean there is no choice. Businesses have a range of options to explore. The first option is Software as a Service (SaaS) S/4HANA, where companies use the standard product in a public cloud. There is no need to think about customisation or choice of hosting location, and product enhancements are automatically available with every new release, providing a route to continuous improvement. The second is the Private Cloud Edition (RISE) of S/4HANA, where companies can choose a hyperscaler – Google, AWS or Azure – to provide the hosting service. This approach can appeal to companies with concerns about the security of a multi-tenanted cloud model – as well as those wanting to control when upgrades and enhancements are deployed. The third option is to replicate an on-premise model, but in the cloud: essentially a business opts to maintain its own SAP system, with complete customisation flexibility. Business Differentiation Both the public and private cloud editions of S/4HANA are appealing to the growing numbers of companies that are questioning the level of customisation in existing ERP deployments. Why incur the cost and complexity of customisation, especially when the core functionality meets a business' needs? Avoiding the upheaval and investment associated with creating, maintaining and upgrading customisations completely changes the total cost of ownership associated with an ERP investment. With SAP's Best Practice Explorer, companies can quickly gain access to a range of best practice scenarios across a number of areas, such as purchasing or procurement. It is the way the business uses the ERP and the information that enables business differentiation, rather than specific customisations. Of course, not every company will find all the features it needs in any ERP. For those companies with specific requirements – such as field services support – the private cloud approach with RISE provides customisation options. But even here, the advice is clear: avoid customising the core, and instead use SAP's Business Technology Platform to support the creation of specific apps that can be integrated with S/4HANA. With this approach, companies can successfully add functionality without adding the cost and complexity of testing every single change in the core product. New Consultancy Attitude A cloud-first, 'adopt not adapt' model is very different. But it provides a fast-track route to innovation. There are no long, drawn-out implementations; no hugely expensive and disruptive upgrades. Indeed, with smooth, faster deployment, companies should never experience that 'post-ERP' hiatus, when planned investments and changes are not achieved because the business needs to recover from months of upheaval. Consultants will, therefore, play a different role in the implementation – and that is key. How many SAP partners are actively changing their consultancy skills and mindset to support the cloud-first, standardised approach? From Fit to Standard workshops that demonstrate the power of the core product to ensuring the business remains on standard implementations, a good consultant can shorten implementations and reduce costs.
Plus, of course, by streamlining the approach, a consultant can help the business to create a commitment to continuous improvement, leveraging tools such as BTP to create business differentiating apps and explore other cloud solutions such as analytics that can add value. Conclusion Growing numbers of businesses are embracing this cloud-first, ‘adopt not adapt’ approach. Very few are opting to convert their existing ECC systems, with their existing customisations and data, into S4/HANA. Instead, they are embracing the chance to simplify, to move away from complex and unnecessary customisations and achieve a streamlined, effective and manageable ERP deployment. They are embracing a standard model, in the cloud, to create the foundation for seamless, incremental change that should give a business confidence to innovate and achieve continuous improvement. Through partnering with the right SAP consultancy, they can ensure that they choose the most appropriate model with the best migration path to achieve the optimal results as quickly and cost-effectively as possible. ### How to add AI to your cybersecurity toolkit  In recent decades, rapid technological development has transformed the ways we work, communicate, and live our day-to-day lives. However, this progress has been associated with the proliferation of malicious exploitation of systems, networks, and data by cybercriminals looking to take advantage of vulnerable systems.  An overview of modern cybersecurity landscape The traditional cyber defense methods are growing less and less adequate against increasingly more sophisticated attacks. Cybercriminals have become proficient at using emerging technologies and are constantly developing new nuanced tactics to breach organisations’ security systems. Thus, to counter these advanced attacks, organisations are looking into emerging AI use cases in cybersecurity to strengthen their defenses.  It's no wonder that the global AI in the cybersecurity market is booming, with organisations increasingly turning to AI-powered solutions to protect their networks and data. According to a recent report by Acumen Research and Consulting, the global AI in cybersecurity market size is estimated to reach a market value of USD 133.8 billion by 2030.  AI in cyber defense offers numerous advantages, from improved threat detection and prevention capabilities to cumbersome malware analysis automation. However, AI technology is complex, so implementing it can be challenging for organisations that do not have the necessary expertise or resources. Therefore, it’s important for businesses to understand the risks associated with AI-powered solutions before deploying them and ensure they have the right team to manage and monitor the implemented AI systems. Below, we will discuss the major ways in which AI can be used to bolster corporate defenses, as well as considerations to take into account when implementing an AI-powered cybersecurity solution. 4 tips for successful implementation of AI in cybersecurity  Create a solid data governance system The success of any AI-based system largely depends on the quality and quantity of data fed into it, and cybersecurity AI tools are  no exception. To make AI a valuable cybersecurity asset, companies need to ensure that it has access to data from across the entire IT infrastructure. Yet setting this up often becomes challenging because different AI models require different data structures and formats.  
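As a rough, hedged picture of what this can look like in practice, the sketch below maps two differently shaped log feeds onto one shared set of fields before anything downstream consumes them. The field names and target schema are hypothetical and not drawn from any real product or standard.

```python
# Hedged sketch only: normalise two differently-shaped security log feeds into
# one shared set of fields before they are fed to downstream (AI) tooling.
# The field names and the target schema are assumptions for illustration.
from datetime import datetime, timezone

def from_firewall(record: dict) -> dict:
    """Map a hypothetical firewall record onto the shared schema."""
    return {
        "timestamp": datetime.fromtimestamp(record["epoch"], tz=timezone.utc).isoformat(),
        "source_ip": record["src"],
        "destination_ip": record["dst"],
        "action": record["verdict"].lower(),   # e.g. "allow" / "deny"
        "event_type": "network_connection",
    }

def from_auth_service(record: dict) -> dict:
    """Map a hypothetical authentication record onto the same schema."""
    return {
        "timestamp": record["time"],            # already ISO 8601 in this feed
        "source_ip": record["client_address"],
        "destination_ip": None,                 # not applicable for sign-ins
        "action": "success" if record["ok"] else "failure",
        "event_type": "sign_in",
    }

unified = [
    from_firewall({"epoch": 1677661200, "src": "10.0.0.5",
                   "dst": "10.0.0.9", "verdict": "DENY"}),
    from_auth_service({"time": "2023-03-01T09:00:05+00:00",
                       "client_address": "203.0.113.7", "ok": False}),
]

for event in unified:
    print(event)
```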
To overcome this roadblock, make sure to consolidate all data feeds in one place before adding AI to your cybersecurity toolkit. The most effective way to standardise data is to employ a common data model (CDM). Revamp incident response frameworks Given the superior detection capabilities of AI in cybersecurity when compared to traditional methods, the number of threats that organisations detect could increase exponentially. So quite often, security teams become overwhelmed with the sheer number of alerts after AI implementation and struggle to adequately prioritise and address them. This is why organisations must review their incident response methods and establish detailed response plans for every possible cyber threat. Ensure the availability of sufficient talent While AI implementation results in improved automation, it doesn't mean that organisations need fewer security specialists. It's quite the opposite. The implementation of AI calls for a new set of skills and expertise to manage AI-related tasks and workflows. Therefore, organisations must decide whether they want to retrain existing personnel or bring in outsourced security teams with specialised skills to maximise the value of AI-powered cyber defense. First, organisations should consider their current team's capabilities. If the team is already well-versed in cybersecurity and data-literate, retraining them to work with AI is the best option. On the other hand, if the existing team lacks expertise or experience in cyber defense, outsourcing a specialised security team will prove more cost-effective. Establish documentation practices When it comes to AI adoption in the cybersecurity context, maintaining detailed documentation of how exactly AI integrates into IT infrastructure is essential. This includes: Storing logs related to the development of the system and establishing a standard revision control system to accurately track who made changes, when, and what they were. Maintaining compliance with GDPR and other relevant regulations and keeping a record of all audit logs to facilitate further audits. Keeping AI tools up-to-date through ongoing monitoring and regular maintenance. Since cyber threats evolve quickly, AI models should be able to adjust to new or changed conditions. This can be done by regularly monitoring the system's performance, implementing patches when necessary, and ensuring that all models are continuously validated. The bottom line A successful implementation of AI in cyber defense requires a solid data governance system, reimagined incident response frameworks, sufficient talent and expertise to manage the new system, and established documentation practices. By taking these steps and ensuring that all stakeholders understand their roles and responsibilities within the new security framework, organisations can significantly boost their cybersecurity capabilities. AI can be an incredibly powerful tool for enhancing cybersecurity, but only when it is properly implemented. With thorough planning, preparation, and training, organisations can use AI's unmatched potential to better protect themselves from today's sophisticated cyberthreats. ### <strong>The Metaverse: Virtually a reality?</strong> Internet giants such as Meta, Microsoft and Google; major high street brands, the likes of Adidas, Gucci and Samsung; banks like HSBC and JP Morgan; and tech giants like Siemens are all moving into the metaverse. Only there is not one metaverse, there are lots, at least potentially.
As yet there is nothing that fulfils all the criteria to be a ‘metaverse’. The aspirations are far from a reality. Meta’s president of global affairs and former UK deputy Prime Minister Nick Clegg has said: “These innovations aren’t going to happen overnight. We’re in the early stages of this journey. Many of these products will only be fully realised in 10–15 years, if not longer.” The Metaverse can perhaps best be described as a direction of travel, a journey several major players in the software field have embarked upon and are in the process of investing very substantial sums of money in bringing it into existence. So, the question is not what is a metaverse but what might a metaverse be at some point in the future? The expectation is that it will be an immersively-experienced virtual world; a three-dimensional environment in which you, as a digital avatar, can be highly interactive with people and objects; and entry will probably be via some form of headset. Most also hypothesise that the metaverse worlds will be persistent, continuing to exist when the individual users are not on-line; infinite, supporting massive numbers of users; and real-time, providing live experiences, maybe even with avatars reflecting real-world facial expressions and body language. Some also imagine levels of interoperability that will allow users to move avatars and digital assets between different metaverses on different platforms.  This latter has moved a little closer with the recent formation of the Metaverse Standards Forum, which has been joined by some, but by no means all, of the major players including: Adobe, Alibaba, Epic Games, Khronos, Meta, Microsoft, NVIDIA, Sony and the World Wide Web Consortium. Most notably absent are Google and Apple but other major players not yet signed up include Niantic, Decentraland, The Sandbox and Roblox. Buying into the evolution The metaverse and Web 3, which to many are interchangeable, are seen as the natural evolution of the World Wide Web, as it moves on from Web 2, where its orientation evolved from presentation to increasingly user-generated content and a participatory culture. It proposes that in the near future a sizeable number of us will create a parallel life in metaverses via our avatars and will want to buy ‘land’ (investments have already topped $500m) - build homes, cafés, venues, offices, laboratories and so on; create cultural experiences and digital assets; as well as socialise and collaborate on creative, social or commercial projects. Recent developments have made possible the potential for such things as blockchain-powered, decentralised metaverses, where there is no single entity in overall control. Blockchain also enables non-fungible tokens (NFTs) - certificates of ownership for digital assets that exist on the blockchain and can be bought and sold using cryptocurrencies, which ultimately may have real world values as well as value in the virtual world. Where are we now? Artificial intelligence (AI) is making an enormous contribution to the potential for the metaverse to be persistent, responsive with, for example, functionality of non-player characters and through analysing images to generate realistic avatars with different expressions, clothes, and characteristics. Engineers and software developers are working on areas such as building infrastructure: cloud, edge computing and photogrammetry pipelines (software that utilises images to create 3D renderings). 
The AR and VR interfaces needed to access the metaverses - headsets, sensors and gloves, or their successor devices - are achieving mixed results and, in their present form, remain crude tools for the kind of accessibility and duration of engagement envisaged. Foundational methods such as cryptographic protocols and non-fungible tokens (NFTs) are gaining ever more traction as the means of making trading possible and building economies in the virtual world. Other tools that have been envisaged but not yet invented - for holograms, or for users to touch and manipulate digital objects in the metaverse - have still to appear. As always, answers may or may not be found, sooner or, potentially, very much later. Some endeavours will inevitably turn out to be dead ends, and investors in those will lose money. Other concepts will not find the large numbers of interested users hoped for and will become niche backwaters. The ultimate goal of fully functional, mass-appeal virtual worlds in which people enjoy parallel existences may turn out to be little more than enhanced gaming worlds. The day-to-day output of engineering and computer science will form the core pillars of the metaverse, and its successes and failures will constantly adjust the aspirations and goals set for it. It might enable life-changing collaborations that produce medical breakthroughs, cheap energy and increased leisure time for all, or free humankind to travel the universe as an avatar. The future is always very difficult to predict.

There are some scenarios in which current virtual realities are evolving quite rapidly. In scientific research, computer simulation has become common practice: theories are tested and validated in computer environments rather than in physical laboratories. Virtual reality is reducing physical limitations and facilitating collaboration on experiments. Participants can dynamically change parameters and use the power of cloud computing to visualise results, saving on material preparation costs and providing a safer, more effective working environment. The ever-growing use of digital twins in design, manufacturing, operations and customer support is also moving into the metaverse. Here the concept of the digital twin is enhanced and expanded, envisaging a new wave of work that can only be done expeditiously and cost-effectively in virtual worlds.

The future: virtual or real

Metaverses have the potential to enable virtual worlds to expand beyond the gaming genre to encompass all manner of social and commercial activities. Some of these will pave the way for things to exist in the real world, but others will only ever exist in the metaverse. While Citi says the metaverse represents a potential $8 trillion to $13 trillion opportunity by 2030 and could boast as many as 5 billion users, there is never a guarantee of what technology will produce, when it will arrive, or that the public will want what it delivers. We can only wait and see. The idea of the metaverse is ambitious. The attempt to make it a reality will undoubtedly produce many innovative contributions to life in the real world and may provide a viable parallel existence for those that want or need one.

### <strong>Cybersecurity and Cloud: A Look Back at 2022 and What to Expect in 2023</strong>

With the exponential growth of the cloud in 2022, businesses are looking to leverage this technology further in 2023. To deploy the cloud more effectively, it is important to understand how it evolved in 2022 and what it is likely to bring in 2023.
The world recovered from the pandemic in 2022 and a number of businesses started to return to their brick-and-mortar workspaces. However, the digital habits of employees have made it essential for organisations to operate in a hybrid setting. Amidst this growing digital landscape, businesses cannot cut corners on their cybersecurity. Organisations today are harnessing the power of digital technologies to sustain, grow and innovate. However, this brings risks, such as the ever-increasing volume of personal data and assets available online - which means businesses must take active responsibility for protecting that data. Organisations that skimp on their cybersecurity face reputational and economic consequences. The question remains: how can businesses best secure their digital assets? Organisations that deploy cloud computing in combination with other technologies, such as additional security layers and zero trust architecture (ZTA), are better placed to remain secure.

Cloud Computing in 2022

Cloud computing has been facilitating a range of functions across different sectors. In the past few years, the use of this technology has advanced more than ever. In 2022, with many businesses operating in hybrid working models, cloud computing continued to be a key enabler of remote working. Cloud offerings such as Virtual Desktop Infrastructure (VDI) and Desktop-as-a-Service (DaaS) have allowed businesses to run their functions seamlessly. Moreover, the use of cloud isn't limited to facilitating remote work; it extends well beyond that. Today, cloud computing complements other emerging technologies such as artificial intelligence (AI) and machine learning (ML), helping businesses implement and scale their automation services. What's more, cloud also enables organisations to meet the storage requirements of data-driven technologies such as AI and ML in a cost-effective way. Throughout 2022, many industries either started or advanced their usage of cloud. For example, cloud computing is helping retailers store relevant consumer data and analyse it with the help of AI and ML to deliver personalised experiences, while automotive manufacturers have been taking advantage of cloud storage in their smart factories and automated vehicle inspection systems. The technology has grown across many other sectors and will continue to gain ground in the future.

A Look Back at Cybersecurity Lessons

There's no denying that businesses have benefited from digitalisation, but it is important to recognise that many organisations have faced serious cybersecurity challenges too. 39% of UK businesses identified a cyber attack in 2021-22, and it is evident that with growing digitalisation, the need for cybersecurity has grown exponentially. From a cybersecurity perspective, humans are the weakest link and the prime target for hackers. Cyber criminals have been using a range of tactics to mislead employees into clicking on a malware link and so penetrate their systems. For instance, exploiting the rising cost of living, hackers targeted individuals with fake emails promising energy rebates. The industry has also experienced malicious insider threats, where employees acted on their own malicious intent - so the question is, how can organisations protect themselves?
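Zero trust architecture, introduced above and discussed further below, is one widely cited answer. As a rough illustration of the per-request model it implies, the sketch below evaluates every access against role, device state and authentication before allowing it. The policy table and function names are hypothetical assumptions, not any particular vendor's API.

```python
from datetime import datetime, timezone

# Illustrative role-based access policy: which resources each role may touch.
ACCESS_POLICY = {
    "finance-analyst": {"invoices", "payroll-reports"},
    "developer": {"source-code", "build-logs"},
}

def authorise_request(user_role: str, resource: str,
                      device_compliant: bool, mfa_passed: bool) -> bool:
    """Zero-trust style check: every request is evaluated; nothing is trusted
    just because it originates inside the network."""
    if not (mfa_passed and device_compliant):
        return False
    return resource in ACCESS_POLICY.get(user_role, set())

def handle_request(user_role, resource, device_compliant, mfa_passed, audit_log):
    allowed = authorise_request(user_role, resource, device_compliant, mfa_passed)
    # Every decision is logged so suspicious activity can trigger revocation later.
    audit_log.append((datetime.now(timezone.utc).isoformat(),
                      user_role, resource, allowed))
    return allowed

log = []
handle_request("developer", "payroll-reports", True, True, log)  # denied: outside role
handle_request("developer", "source-code", True, True, log)      # allowed
```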
Appropriate employee training and awareness sessions about cybersecurity must be an integral part of every organisation’s strategy. This not only provides employees with the basic knowledge about cybersecurity but also keeps them informed about the latest trends and threats which can help them deal with any potential attack.  Furthermore, businesses can implement ZTA to treat every login with zero trust. Through this technology, each access is authenticated and monitored throughout the online session. At the identification of any malicious activity, the user access is revoked and relevant authority can be alerted. In addition, ZTA allows businesses to grant specific access levels to employees depending upon their roles and responsibilities in the company. This can significantly help businesses minimise the risk of insider attacks.  Predictions for 2023: What To Expect Cloud and WFH As businesses continue to embrace digital technologies, the role of cloud is likely to become more important. One of the driving factors for the growth of cloud is remote and hybrid working. Certain cloud offerings such as VDI and DaaS will continue to be the key technologies in facilitating WFH efficiently. However, 2023 will experience further advancement in the growth of cloud. Cloud in Industry 4.0 With exponential growth of smart factories, smart cities, and the Internet of Things (IoT) - cloud computing will continue to play its role in this era of Industry 4.0. Moreover, with an ongoing economic uncertainty, many businesses are likely to reduce their expenses in the upcoming quarters of 2023. Cloud computing serves as a cost-effective option for businesses as it allows them to pay only for the resources used and eliminates the need to run their own data centres which can be expensive. In fact, with cloud computing businesses can scale up their growth with reduced downtime and increased efficiency. This makes cloud computing a ‘must have’ choice in 2023. Cloud for Security Cyberattacks continue to pose a significant threat to businesses that have already started to leverage the cloud to secure their data. With a range of advancements such as - multi-cloud and hybrid cloud - the technology is playing a key role in helping organisations maintain data privacy. Cloud security is likely to advance more than ever before in upcoming years - especially with the growth in the use of AI and ML, cloud computing will continue to help these technologies facilitate their functions allowing them access to the data they need in a secure way.  Businesses are continuously reassessing their resources and options to fill their tech stack. In this competitive digital landscape, the innovative use of technology will be something that would generate a competitive advantage for organisations. All things considered, the future of cloud computing seems full of possibilities and is set to unveil new advancements and innovations in 2023. Businesses that continue to upgrade their solutions amidst this technological advancement will remain ahead of their competitors. ### TD SYNNEX, NVIDIA, IMSCAD interview TD Synnex is a leading global distributor of technology products and services, offering its customers a comprehensive range of products and services. The company has recently announced a strategic partnership with NVIDIA, a leading manufacturer of graphics processing units (GPUs) and AI hardware. The partnership will allow TD Synnex to distribute NVIDIA's products to US, Canada, and Mexico customers. 
Through this partnership, TD Synnex's customers will access NVIDIA's cutting-edge technology, including its AI-powered GPUs and data centre solutions. NVIDIA's products are widely used in various industries like gaming, healthcare, manufacturing, and research, and this partnership will help TD Synnex to expand its customer base in these industries. In addition to the partnership with NVIDIA, TD Synnex has also announced a collaboration with IMSCAD, a provider of data centre infrastructure solutions. This partnership will allow TD Synnex to expand its offerings to include IMSCAD's data centre management software. This software offers advanced monitoring, reporting, and capacity planning, enabling TD Synnex's customers to manage their data centre more effectively. The partnership with Imscad will also allow TD Synnex to provide its customers with a comprehensive data centre solution that includes hardware, software, and services. This will help customers to increase their data centre efficiency and reduce costs. Overall, the partnership with NVIDIA and Imscad will allow TD Synnex to offer its customers a more complete and advanced technology solution. As a result, the company's customers will now have access to a broader range of products and services that will help them to improve their operations and achieve their business goals. This partnership will also help TD Synnex to expand its customer base in various industries and increase its market share. ### Unveiling the Top 10 Cybersecurity Threats to Watch Out for in 2023 As technology advances, so do cybercriminals' methods to gain unauthorised access to sensitive information. With the increasing reliance on technology in both personal and professional settings, it is crucial to stay informed about the top cybersecurity threats to watch out for in 2023. Ransomware: Ransomware attacks involve a hacker encrypting a victim's files and demanding payment in exchange for the decryption key. These attacks have become increasingly common in recent years and are expected to continue to be a significant threat in 2023. Phishing: Phishing attacks involve a hacker disguising themselves as trustworthy entities to trick victims into providing sensitive information. This type of attack is often executed via email but can also be done through text messages or social media. Phishing is a type of cyber attack that involves tricking individuals into giving away sensitive information, such as login credentials or financial information. This is typically done by disguising the attacker as a trustworthy entity, such as a bank or popular online service. One of the most common forms of phishing is email, where the attacker sends an email that appears to be from a legitimate source, such as a bank or other financial institution. The email may ask the recipient to click on a link to update their account information or to confirm a recent transaction. When the recipient clicks on the link, they are taken to a fake website that looks similar to the legitimate website but is controlled by the attacker. The victim is then prompted to enter their login credentials or other sensitive information, which the attacker can then use to gain unauthorised access to their account. Cloud Security: As more and more businesses move their data to the cloud, the threat of cloud security breaches increases. Hackers can target vulnerabilities in cloud infrastructure to gain access to sensitive information. 
IoT devices: Internet of Things (IoT) devices, such as smart home devices and connected cars, are becoming increasingly popular. However, these devices often have poor security, making them easy targets for hackers. AI-based attack: Attackers use AI to create highly targeted and sophisticated attacks that can evade traditional security measures. Supply Chain Attack: Supply chain attacks are when a hacker infiltrates a third-party supplier to gain access to a target's network and steal sensitive information. 5G Security: The increasing adoption of 5G technology also brings new security challenges. 5G networks are more vulnerable to cyberattacks than previous generations of cellular networks, due to their increased complexity and speed. Social Engineering: Social engineering attacks involve tricking victims into providing sensitive information or access to their devices. These attacks can take many forms, such as phishing, pretexting, and baiting. Social engineering is a type of cyber attack that uses psychological manipulation to trick individuals into giving away sensitive information or access to their devices. This attack is often used with other cyber attacks, such as phishing, to gain unauthorised access to sensitive information. Cryptojacking: Cryptojacking is when a hacker uses a victim's device to mine cryptocurrency without their knowledge. This attack can slow down a victim's machine and use up their battery and data. Insider threats: Insider threats refer to security breaches caused by employees, contractors or other insiders. These threats can be accidental or intentional and can result in data breaches, theft of intellectual property, and different types of damage. Individuals and organisations must stay informed about the latest cybersecurity threats and take steps to protect themselves. This includes implementing robust security protocols, regularly updating software and systems, and being cautious when providing sensitive information. ### <strong>Endpoint management: Common challenges and trends for 2023</strong> The surge in remote work and the growing trend of using the same mobile devices for work and leisure have challenged traditional on-premise IT management. When work is no longer tied to a certain place or time, there’s a clear need for more flexible cloud-based device management. In this article, Jere Jutila, Director of Business Development at Miradore will explore the common challenges faced by organisations when it comes to managing and securing endpoints and outline the key trends we can expect in the field over the next 12 months.  These include prioritising frictionless user experience, automation over manual actions, and taking a whole lifecycle approach to endpoint management. Endpoints are a doorway through which employees access the data they need to fulfil the tasks their roles demand.  But while these hybrid IT environments offer many benefits to organisations and their workforces, security risks and management costs can grow without the right strategy in place.  Effective and secure endpoint management is fast becoming a top priority for companies thanks to accelerated digital transformation, hybrid working models, pandemic disruption and new security threats. And in 2023 we can expect investment and focus to grow even further. Specifically, managing data sprawl, a growing number of devices connecting to the network, and securing device fleets.  The sustainability benefits of endpoint management are also being more widely understood. 
Yet, barriers and common challenges remain. So, looking ahead to the next 12 months, what endpoint management trends can we expect to see and how are organisations working to resolve these shared challenges? More endpoints, more problems? Endpoint management challenges are affecting business large and small, across different countries and industries. There are many reasons why this is the case. Mobility is on the rise. The mobile endpoint count is growing as hybrid working becomes the new default model.  Work-life has been in the middle of change for a long time and the current pandemic has further accelerated the transition. Now, employees are constantly switching between the office and remote work, processing large loads of data, and collaborating over the internet, using the same laptops, tablets, and smartphones for both business and leisure.  As a result, the data sprawl is getting wider. The more devices connected to the network, the bigger the data sprawl organisations must manage, keep secure and maintain compliant. Demand on IT resources is growing, too. Managing endpoints manually is placing more pressure on already overstretched IT teams, as well as on individual employees who have many other more important tasks to prioritise.  Lifecycle management is also becoming more important. A lack of visibility across the hardware and software lifecycle from purchase, set up and through to disposal is creating a headache for IT departments and causing budgetary confusion. Finally, security threats are rising. Endpoints serve as the main point of access to an enterprise network and therefore are often exploited by malicious actors. In fact, in a recent survey by the Ponemon Institute, 68 percent of organisations reported experiencing one or more endpoint attacks that compromise data assets and/or IT infrastructure. So, within this context, what will be the top endpoint management trends in 2023 and beyond? Frictionless user experience Today’s mobile workforce deserves a frictionless experience no matter their location or choice of device. And a generous 69 percent of employees are allowed to use personal devices for work, according to a recent survey. Every employee deserves and has come to expect a consumer-grade digital experience, no matter how or where they connect to the corporate network. And for this to happen, organisations must continue to focus on two priority areas: Unified Endpoint Management (UEM) and Digital Experience Monitoring (DEM). In 2023, we can expect increased adoption of UEM tools, allowing IT teams to manage, secure, and deploy resources and applications on any device from a single cloud-based platform.  Going one step further than Mobile Device Management, UEM supports BYOD flexibility, where employees can move from personal usage to work usage on their devices anytime or anywhere. Combined with DEM tools, companies will be able to better identify technology performance issues across their environment and align performance to support commercial objectives and minimise friction in user experience. Automation replaces manual action Manual management of endpoints is taking up precious time within IT teams and responsibility often falls on individual employees, distracting them from their workload and opening security vulnerabilities. As part of the wider adoption of UEM in 2023, mundane and time-consuming tasks will increasingly become automated. 
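As a simple illustration of the kind of routine check that lends itself to automation, the sketch below flags devices with stale patches or disabled encryption. The device records and the notify() helper are hypothetical stand-ins for a real UEM platform's inventory and remediation API.

```python
from datetime import date

MAX_PATCH_AGE_DAYS = 30  # illustrative policy threshold

# Hypothetical device inventory; in practice this would come from the UEM tool.
devices = [
    {"name": "LAPTOP-014", "last_patched": date(2023, 1, 3), "encrypted": True},
    {"name": "TABLET-221", "last_patched": date(2022, 10, 18), "encrypted": False},
]

def notify(device_name: str, issue: str) -> None:
    # In practice this would raise a ticket or push a remediation task.
    print(f"{device_name}: {issue}")

def check_fleet(devices, today: date) -> None:
    for d in devices:
        age = (today - d["last_patched"]).days
        if age > MAX_PATCH_AGE_DAYS:
            notify(d["name"], f"OS patches are {age} days old")
        if not d["encrypted"]:
            notify(d["name"], "disk encryption disabled")

check_fleet(devices, today=date(2023, 1, 31))
```

Run on a schedule, a check like this turns a recurring manual task into a report that only demands attention when something is out of policy.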
Many tasks associated with device administration aren’t particularly complex, but they are important and need to be completed frequently.  This predictability creates an opportunity for automation to streamline the management and protection of today’s highly distributed end points and automatically monitor systems and/or devices for updates, tampering, degradation and failure. Automation will also provide more IT departments with full visibility of the composition and state of their IT infrastructure and supports, which in turn can provide greater clarity when planning budgets and investment priorities for the year ahead. It will also support compliance with ever-tightening data regulation. A whole lifecycle approach There are many critical stages each device goes through within its lifecycle, from purchase and enrolment to configuration and security, maintenance, and updates. The process ends with retirement and replacement. In 2023, we can expect to see more companies move towards a whole lifecycle approach to endpoint management in pursuit of 360-degree visibility of device fleets - from purchase through to disposal. Efficiency and sustainability will continue to be key issues for businesses across the device lifecycle. Effective endpoint management will form an important part of wider sustainable IT strategies, giving workforces access to the latest technology where and when they need it, while minimising the impact of device fleets on the environment. Taking a whole lifecycle approach to asset management can also support organisations to keep control of the spiralling total costs of device ownership, with management and maintenance estimated to account for 80 percent of the total cost of ownership (TCO). ### Cloud Vs On-Premise Server Cloud computing and on-premise hardware server farms are two different data storage methods. The advantages of using cloud computing include: Scalability: Cloud computing allows businesses to quickly scale their computing resources up or down as needed without needing to purchase or maintain additional hardware. Cost-effectiveness: Cloud computing can be less expensive than maintaining an on-premise server farm, as businesses only pay for the resources they use and do not need to invest in expensive hardware. Maintenance: With cloud computing, businesses do not need to worry about the maintenance and upkeep of their own servers, as this is handled by the cloud provider. On the other hand, the advantages of using an on-premise hardware server farm include: Control: With an on-premise server farm, businesses have more control over the security and configuration of their servers. Data sovereignty: Some businesses may have legal or regulatory requirements that prevent them from storing data in the cloud, so an on-premise server farm may be the only option. Latency: If a business needs to process data quickly and with low latency, an on-premise server farm may be a better option than cloud computing, as data does not need to be transmitted over the internet. Using cloud computing has many advantages that make it a compelling option for businesses of all sizes. One of the key advantages is scalability. With cloud computing, businesses can quickly scale their computing resources up or down as needed without purchasing or maintaining additional hardware. This allows businesses to respond quickly to changes in demand, such as a spike in website traffic, without incurring additional costs. Another significant advantage of cloud computing is cost-effectiveness. 
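A back-of-the-envelope comparison makes the pay-for-what-you-use point. The figures below are purely illustrative assumptions, not vendor pricing, but they show why an always-on workload and an occasional one can land on opposite sides of the cloud-versus-on-premise line.

```python
# Rough monthly cost comparison: fixed on-premise server vs metered cloud capacity.
# All numbers are illustrative assumptions.

def on_prem_monthly_cost(hardware_cost, amortisation_months, monthly_upkeep):
    return hardware_cost / amortisation_months + monthly_upkeep

def cloud_monthly_cost(hours_used, price_per_hour):
    return hours_used * price_per_hour

on_prem = on_prem_monthly_cost(hardware_cost=24_000, amortisation_months=36,
                               monthly_upkeep=350)                 # ~1,017, used or not
always_on = cloud_monthly_cost(hours_used=730, price_per_hour=1.20)  # ~876
bursty    = cloud_monthly_cost(hours_used=200, price_per_hour=1.20)  # 240

print(f"on-prem: {on_prem:.0f}, cloud always-on: {always_on:.0f}, cloud bursty: {bursty:.0f}")
```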
With cloud computing, businesses only pay for the resources they use and do not need to invest in expensive hardware. This can be significant cost savings for businesses, especially small and medium-sized businesses that may not have the budget to invest in an on-premise server farm. Finally, cloud computing eliminates the need for businesses to worry about the maintenance and upkeep of their own servers. With cloud computing, the cloud provider handles maintenance and upkeep, allowing businesses to focus on their core competencies. On the other hand, an on-premise hardware server farm has its advantages. One of the key advantages is control. With an on-premise server farm, businesses have more control over the security and configuration of their servers. This can be especially important for businesses that handle sensitive data or operate in highly regulated industries. Another advantage of an on-premise server farm is data sovereignty. Some businesses may have legal or regulatory requirements that prevent them from storing data in the cloud, so an on-premise server farm may be the only option. Finally, an on-premise server farm may offer better latency than cloud computing. If a business needs to process data quickly and with low latency, an on-premise server farm may be a better option than cloud computing, as data does not need to be transmitted over the internet. In conclusion, both cloud computing and an on-premise hardware server farm have their advantages and disadvantages, and the choice between them depends on the specific needs of the business and the application. Businesses should weigh the pros and cons of each option and choose the one that best meets their specific needs. ### Unplugging for Success: Why IT Workers Should Digital Detox and Meditate In today's fast-paced world, technology has become a constant companion. We are constantly connected, always online, and always available. But this constant connection can take a toll on our mental and physical health. Information technology workers, in particular, are at risk of burnout and stress due to the nature of their work. They spend long hours before a computer screen, staring at code and dealing with constant distractions. To combat these negative effects, IT workers should consider taking a digital detox and meditating. A digital detox is a period during which a person refrains from using electronic devices such as smartphones, laptops, and tablets. This can be a great way to unplug and recharge. By taking a break from technology's constant stimulation and distractions, IT workers can reduce their stress levels and improve their overall well-being. Meditation, on the other hand, is a practice that involves focusing the mind and calming the body. It can help IT workers reduce stress and anxiety, improve focus and concentration, and increase well-being. It's important to note that digital detox and meditation don't have to be done in one long time. Incorporating small breaks throughout the day can be just as effective. For example, IT workers can take a short break every hour to step away from their computer and take a walk, or sit and breathe for a few minutes. This can help to refresh the mind and reduce eye strain. Another example is to schedule a longer period for digital detox and meditation, such as a weekend or a week-long vacation. During this time, IT workers can disconnect from technology and focus on other activities such as hiking, yoga, reading, or spending time with family and friends. 
This can help to recharge the mind and body and return to work feeling refreshed and re-energised. Additionally, IT workers can use different apps or programs that can help reduce distractions and increase focus, such as apps that block social media or remind you to take breaks. It's important to note that digital detox and meditation are not a one-time solution but an ongoing practice. IT workers should make it a habit to unplug and meditate regularly to maintain their physical and mental well-being. In summary, as an IT worker, it's essential to take a break from technology, unplug, and meditate to improve your overall well-being, improve your focus and concentration, and reduce stress and anxiety. It's a simple yet effective way to take care of yourself and perform better at your job. Some resources that may be helpful for IT workers looking to unplug and meditate include: Headspace, Calm, Freedom, The Center for Humane Technology and The Digital Detox. Make sure to select the resources that suit your personal needs and preferences, and always do your research. ### How Synthetic data is used to remove bias in AI Synthetic data refers to data that is artificially generated rather than collected from real-world sources. It is often used in machine learning and AI applications to train and test models, as well as in computer simulations and other research. There are various methods for generating synthetic data, including computer programs, mathematical models, and physical simulations. There are several ways synthetic data can be used in AI and machine learning applications. Some examples include: Data augmentation: Synthetic data can augment existing real-world datasets by generating new examples similar to the ones already present. This can help improve machine learning models' performance by providing them with more diverse and representative training data. Anonymisation: Synthetic data can replace real-world data that is sensitive or private, such as personal information or medical records. By using synthetic data, it is possible to preserve the patterns and relationships in the original data while removing any sensitive information. Overcoming data scarcity: Synthetic data can be used to generate datasets for machine learning models in situations where obtaining real-world data is difficult or prohibitively expensive. For example, there may be legal or ethical barriers to obtaining real-world data in specific industries like finance or healthcare. Pre-training: Synthetic data can be used to pre-train machine learning models before fine-tuning them on real-world data. This approach can improve the model's performance by giving it a general understanding of the task before it is exposed to the complexities of real-world data. Generative models: Synthetic data can be used to train generative models, such as Generative Adversarial Networks (GANs) and Variational Autoencoders (VAEs), which can generate new data samples that are similar to real-world data. Synthetic data can also be used to evaluate the robustness and generalisation of AI models. By generating synthetic samples similar to real-world examples but with variations, it is possible to assess the models' ability to generalise to new data and evaluate their robustness against adversarial attacks. It's important to note that not all synthetic data is created equal, and the quality of the synthetic data will affect the performance of the trained models. 
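As a minimal sketch of the augmentation idea - and of the rebalancing use discussed below - the Python snippet generates extra rows for an under-represented group by jittering existing examples. Real synthetic data generators, such as the GAN- and VAE-based models mentioned above, model the joint distribution far more faithfully; the data and jitter scheme here are illustrative assumptions.

```python
import random

# Tiny illustrative dataset; group "B" is under-represented.
real_rows = [
    {"group": "A", "age": 34, "income": 52_000},
    {"group": "A", "age": 41, "income": 61_000},
    {"group": "B", "age": 29, "income": 48_000},
]

def synthesise(rows, group, n, jitter=0.1):
    """Create n synthetic rows for one group by perturbing real examples."""
    source = [r for r in rows if r["group"] == group]
    synthetic = []
    for _ in range(n):
        base = random.choice(source)
        synthetic.append({
            "group": group,
            # Perturb numeric features slightly so rows are not exact copies.
            "age": round(base["age"] * random.uniform(1 - jitter, 1 + jitter)),
            "income": round(base["income"] * random.uniform(1 - jitter, 1 + jitter)),
        })
    return synthetic

balanced = real_rows + synthesise(real_rows, group="B", n=1)
```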
Additionally, synthetic data may not always reflect the real-world data distribution, so evaluating the model's performance on real-world data is essential.

Synthetic data can also be used to overcome data bias by generating a more diverse and representative dataset. In real-world datasets, specific groups or demographics may be underrepresented, leading to biased models that perform poorly on those groups. By generating synthetic data that is more representative, it is possible to train models that are fairer and more accurate across different groups. Synthetic data can be a valuable tool for addressing bias in AI and machine learning, but it is not sufficient on its own. It can produce more diverse and representative datasets, which helps reduce bias in the trained models; however, it can also introduce bias if it is not generated carefully or does not reflect the real-world data distribution. In addition, bias in AI can come from multiple sources, such as the data, the model architecture, the training process, and the evaluation metrics. Addressing bias therefore requires a comprehensive approach involving multiple steps: data preprocessing, model selection, monitoring and debugging during training, and evaluating model performance on real-world data. It is also critical to ensure that synthetic data reflects the diversity of the real world - including age, gender, race, and other factors that can introduce bias - and that it replicates the real-world distribution of the data.

The most common source of bias is the data used to train the AI model. If that data is not representative of the population the model will be used on, or if it includes certain biases, the model will likely reflect them. For example, if a facial recognition model is trained on a dataset composed mainly of images of people with light skin, it may not perform as well on people with darker skin. Bias can also come from the algorithms used to train the model: some algorithms may be more prone to certain types of discrimination than others, and the way an algorithm is designed and configured can introduce bias too. For example, if the algorithm is designed to optimise for a single metric, such as accuracy, it may inadvertently perpetuate existing biases in the data. A further source of bias is the people who design, build, and operate AI systems, who may consciously or unconsciously introduce their own biases. In summary, bias in AI can come from various sources, including the training data, the training algorithms, and the people who design, build, and operate the AI systems.

There are several ways synthetic data can be used in simulations and virtual environments: Training agents: Synthetic data can be used to train agents, such as robots or self-driving cars, in simulated environments similar to the real world. This provides a safe and controlled setting for training and testing without the risks and expenses associated with physical experiments. Testing: Synthetic data can be used to test agents and robotic systems in a virtual environment by generating test cases tailored to the system's specific characteristics. This helps identify bugs or other issues that might not be apparent when testing with real-world data.
Scenario simulation: Synthetic data can be used to simulate scenarios that are difficult or impossible to replicate in real life, such as extreme weather conditions or rare diseases. This can be useful in self-driving cars, weather forecasting, and medical research. Synthetic data can train and test reinforcement learning models, providing the model with many possible scenarios and actions. This can improve the model's performance by giving it more diverse and representative training data. Creating synthetic data for simulations and virtual environments requires specialised knowledge and expertise. For example, it might require knowledge of physics, computer graphics, and programming, depending on the nature of the simulation or virtual environment. Additionally, synthetic data should be as close as possible to the real-world data distribution to be helpful. Several companies and organisations offer synthetic data for purchase or licensing. These include: Data generation companies: Several companies specialise in generating synthetic data, such as Synthetic Minds, DataRobot, and Cognac. These companies can generate synthetic data for various applications, such as machine learning, computer vision, and natural language processing. Data annotation companies: Some companies, such as Scale AI and Appen, offer synthetic data generation services, including data annotation, data labelling, and data curation. Industry-specific providers: Some companies focus on providing synthetic data for specific industries, such as self-driving cars, weather forecasting, and medical research. For example, Waymo, a subsidiary of Alphabet (Google), provides synthetic data for autonomous vehicles. Open-source datasets: There are also open-source datasets that can be used as synthetic data, such as the popular ImageNet dataset, which contains millions of images and their annotations, and can be used for computer vision tasks. Research institutions and universities: Some research institutions and universities also offer synthetic datasets, such as the Center for Machine Perception at Czech Technical University in Prague, which provides the SYNTHIA dataset, a synthetic dataset for the semantic segmentation of urban scenes. It's important to note that the quality and suitability of synthetic data can vary widely depending on the source, so it's essential to evaluate the data carefully before using it for training or testing models. Additionally, it's critical to check the data's terms of use and license agreement before using it. In summary, synthetic data can be used in various ways in AI and machine learning, from data augmentation and anonymisation to testing and scenario simulation. It can also help overcome data scarcity, pre-training, and evaluation of models' robustness and generalisation and overcome data bias. However, it's important to note that the quality of synthetic data must be high enough to be helpful, and the models trained on it must also be evaluated on real-world data. ### <strong>Digital disconnect: the hidden barrier to genuine transformation</strong> “No one expects all board directors to be data scientists or coders. But they should be able to ask the right questions about technology transformation and its implications for their company, colleagues and customers." 
Outgoing Salesforce president Gavin Patterson's recent comments highlight an emerging digital literacy crisis within the boardrooms of UK companies, as senior leaders pay lip service to the need for digitalisation without understanding how to implement it within their business.  A new study by Elsewhen, in partnership with market research company Vanson Bourne, casts light on this digital disconnect and its impact on a range of leading UK industries. Tasked with charting UK companies' progress towards becoming digital-first, the survey canvassed 200 senior decision-makers across FMCG, retail and financial services sectors.  These respondents are overwhelmingly influential players with sizeable budgets. Most (86%) of them work at businesses generating over £1bn p/a, and 37% work with yearly IT budgets exceeding £10 million. Yet, nearly all those interviewed (96%) have encountered problems with digital projects. While more than half are “extremely proud” of their organisation’s approach to digital transformation,  two-thirds of the study interviewees (64%) report a yawning gap between strategy and what could be achieved in their company's digitalisation efforts. This failure to bridge the gap between strategy and execution comes with serious costs; companies report negative impacts of failed digital projects ranging from lower-than-expected annual revenue (42% report this) to delayed product launches (36% cite this) and IT security risks (35% mention this). While there’s no simple off the shelf answer to the digital disconnect UK companies are experiencing, here are three learnings from the survey findings:  Digital is not an optional extra A clear insight from the survey data set is a lack of commitment and perspective in acknowledging digital problems at a senior level. Nearly a fifth of decision-makers surveyed had a digital project plan that could not be implemented, while 96% had faced at least one significant obstacle. These include challenges such as removing internal productivity blockers (88%), migration to the cloud (84%) and unlocking the hidden value of their data (84%). The fact that well-resourced businesses are wrestling with these digital transformation basics suggests an issue higher up the chain. Confirming this, 31% of decision-makers reported a lack of commitment from their organisation to necessary digital changes. Among IT teams, this figure rises to 38% even though 89% of respondents worked with an annual IT budget of £1m+. By sector, financial services are experiencing the biggest resistance to change, with almost half of respondents reporting that their company has digital commitment issues. This fits a pattern, with many mainstream banks struggling to improve basic metrics such as account uptake, loan approval, customer retention and fraud control through digital transformation. New research by IBM underlines the fact that the difference between success and failure in this sector often comes down to a holistic operational commitment to digital transformation. While most companies acknowledge the importance of digitalisation, the overall picture is that many are failing to place digital at the core of their business model and operational architecture. One mindset that is still very prevalent is that digital can be bolted on top of a legacy business model, almost as an optional extra. 
The reality is that digital transformation should create value for the consumer from the ground up; it should be leading the agenda and driving radical change at every step of the customer journey.  Strategy is not a destination. Part of the reason why digital transformation projects often fail to deliver is a tendency to disconnect strategy from implementation. This can take the shape of poor understanding of what it actually takes to deliver on a vision or how to design something customers want as well as a failure to anticipate obstacles. A digital vision which fails to clearly define attainable goals makes for a redundant strategy. It's also important not to confuse a business strategy with a digital strategy. Digital strategies contain specific elements crucial to the success of digital projects, including a digital vision, a digital operating model, a prioritised list of initiatives, a roadmap and a service blueprint.  Interestingly, our survey found that while 64% of companies say they have a fully implemented digital strategy, one-fifth of these (21%)  have neither a digital vision nor a digital operating model. What's more, under half (39%) have a prioritised list of initiatives. So perhaps unsurprisingly, 85% agree that their organisation needed guidance in measuring how well their digital transformation projects were going – and what else could be possible. Retail organisations, in particular, appear to lack a roadmap and are unlikely to include a definition of their digital vision in their digital strategy (31% of retail businesses have a definition of a digital vision compared to 52% of companies surveyed overall).  While many brands feel a sense of achievement in having a digital strategy, the absence of clear goals and priorities invariably hampers implementation, with critical projects becoming backlogged, counterproductive, going over budget or being scrapped altogether.  There's a question about self-awareness and openness here, too. To deliver true digital transformation, organisations must be transparent about their capability to plan and run a digital project end-to-end. If senior leadership can understand where the weaknesses are, they will have already made progress towards plugging knowledge gaps as and when they occur. Different departments will be able to support one another more fluidly, too.  Put tech talent in leadership positions. Digital advocates often discuss the importance of inverted hierarchies or embracing learnings from the edge. But when it comes to digital transformation, companies are being hindered by leaders who are ill-equipped to tackle or even acknowledge the challenges at hand.  Many of the organisations that participated in the survey use emerging tech in their digital projects, with lead elements including AI, AR, machine learning and automation. Yet, six in ten respondents (60%) claim that their organisation’s senior leadership team does not understand how to deploy this tech to drive growth and transformation. A third of respondents also highlight a lack of digital product experience among leadership teams.  In this fiercely competitive, constantly evolving digital age, It's no longer enough for senior management to exist in silos as overseers to other specialists. Instead, companies need smart technologists in leadership positions. They need to be specialists, product experts, designers and engineers who can connect the dots and make change happen. 
Talented technologists should be immersed in big conversations regarding digital innovation and growth. It's only by moving these people to the highest levels of leadership that a robust digital strategy – one that carries through to implementation – can be shaped. It's no coincidence that successful digital-first brands, including Facebook, Google and Amazon, are led by engineers and product people. Of course, appointing technologists to leadership positions is made more challenging while there continues to be a chronic shortage in both current and next-gen tech leadership. Seventy-one per cent of participants in the study agree that poor staff skill sets limit their organisation’s ability to maximise the impact of digital transformation projects. This in turn drives a need for third-party support in areas ranging from digital product design (48%) to product strategy (43%), software engineering (36%) and solution integration (35%).  Consultancies, then, have a critical role and can also clarify what is possible in digital transformation (further bridging the gap between strategy and implementation). With the right mindset and talent in place, there is no reason why UK companies shouldn't be justifiably optimistic about their digital capabilities. Many have the budget, tech prowess and ambition required to make digitisation their superpower. But true transformation also requires unwavering commitment, executional capabilities and tech-savvy leadership to become a reality. ### <strong>Weathering the unpredictable, with cloud migration</strong> The UK is entering a tumultuous period, with the threat of recession looming. At the end of September, following a new budget announced by the government, the pound fell to its lowest rate ever against the dollar, the upheaval continuing as markets follow political headlines. On top of this, average UK wages are falling and dragging spending down with it, against a backdrop of low consumer confidence and business insecurity. And this all comes as the new energy price cap is going to see energy costs skyrocket, with only some potential insolvencies likely to be prevented by further government action. This is a frightening time for businesses that have weathered the storm of the pandemic and only just emerged safely at the other side.  In light of this unpredictable and alarming economic situation, businesses are having to reevaluate the way they operate. Will they have to absorb rising energy costs and increase prices as a result? Will they have to encourage their employees to work remotely again in an attempt to keep bills at a minimum? What will happen if people stop spending? How will the weak exchange rate impact the business they conduct abroad?  In times of uncertainty, technology that is flexible, adaptable and can cater to their evolving needs should be adopted or fully exploited in order to help businesses stay afloat and keep costs under control. The migration of their data and workloads to the cloud could be the advantage they need to handle whatever the future might hold.  Cloud cost savings Gartner predicts that spending on cloud computing will overtake spending on traditional IT by 2025. Its research also found that 75% of all databases will be migrated to the cloud by the end of this year, a shift being driven by the enhanced data and analytics capabilities it will provide. 
However, when looking at how the cloud can be used for Modern Data Protection, the Veeam Data Protection Trends Report 2022 found that just 65% of UKI businesses use the cloud as part of their data backup tooling, leaving 35% that rely solely on on-premises solutions.

Businesses that haven't yet migrated any of their data or workflows to the cloud should recognise that the cloud can be used to offset the costs they are likely to incur during this challenging economic period, and it can even be a cheaper alternative to on-premises solutions. This is especially true with a subscription model, which spreads the cost and enables businesses to pay only for the cloud services they're using, rather than burdening them with the cost of maintaining their own on-premises solutions. In fact, the cost savings provided by the cloud are multifaceted. For example, migrating processes such as data protection to the cloud can ensure that data from across a business' estate is monitored and secured, preventing the financial and reputational repercussions of a data breach. Cloud migration will also benefit employees across the company, rather than just those within the IT team. Hosting all of its data in the cloud – using a solution such as Microsoft 365 – can enable employees to work from home, not only minimising the financial burden of commuting on staff, but also reducing operational costs for businesses.

Cloud native for agility

The Veeam Data Protection Trends Report 2022 also revealed that 65% of UKI organisations are running containers in production, with 29% planning to do so in the next 12 months. The key benefits many firms expect from this are improved agility, accelerated time to market and competitive edge. As containers bring mobility to workloads - allowing workloads and data to move from one platform, cloud or Kubernetes cluster to another - backup strategies will need to evolve to protect vital data wherever it is.

Partnering for success

When simultaneously trying to manage an unpredictable landscape and migrate to a new or expanded cloud ecosystem, working with a skilled cloud services provider can be greatly beneficial. Nobody knows which direction we're heading in, but a partner that is knowledgeable about the cloud space, and about how best to unlock its agility and mobility in pursuit of each business' goals, can be a real help. Not only can cloud services providers bring unmatched experience, but they can also help to plug the digital skills shortage the industry is facing. This ever-looming skills challenge is reportedly stopping 54% of businesses from progressing with digital transformation initiatives such as cloud migration. A valuable partner is one that can truly integrate into a business' IT team and ensure the business achieves the maximum return on its cloud investments by using the cloud in the most impactful way. Furthermore, a good partner can also work to ensure that a business' cloud estate is compliant where necessary, helping the data owner meet regulatory requirements.

In-built resilience and security

Once organisations have understood the business case for cloud migration – such as the cost savings it can provide – and chosen which cloud services provider to partner with for their unique needs, they then need to migrate their data into this new environment. People often assume that resilience is built into their infrastructure, but this isn't always the case.
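Testing that assumption can be as simple as checking recovery points across every workload, wherever it runs. The sketch below flags anything without a recent enough backup; the inventory and the 24-hour objective are hypothetical assumptions - in practice the data would come from the backup platform's own reporting.

```python
from datetime import datetime, timedelta

RPO = timedelta(hours=24)   # assumed recovery point objective: no more than 24h of data at risk

# Hypothetical workload inventory spanning on-premises, Kubernetes and SaaS.
workloads = [
    {"name": "crm-db",         "platform": "on-premises", "last_backup": datetime(2022, 11, 14, 2, 0)},
    {"name": "orders-cluster", "platform": "kubernetes",  "last_backup": datetime(2022, 11, 12, 23, 30)},
    {"name": "m365-mail",      "platform": "cloud-saas",  "last_backup": datetime(2022, 11, 14, 4, 15)},
]

def out_of_policy(workloads, now):
    """Return every workload whose most recent backup is older than the RPO."""
    return [w for w in workloads if now - w["last_backup"] > RPO]

for w in out_of_policy(workloads, now=datetime(2022, 11, 14, 9, 0)):
    print(f"{w['name']} ({w['platform']}) has no backup inside the 24-hour objective")
```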
However, migrating data to the cloud can enable inbuilt resilience when it comes to data security and disaster recovery that just isn’t possible when it comes to on-premise solutions. Obtaining this level of resilience is critical as breaches become more damaging and, as cloud-enabled remote or hybrid working continues or even increases, breaches can often be more frequent due to the ‘insider threat’.  When putting your data in the hands of another entity, it’s crucial to ensure that it is being handled securely. Fortunately, more cloud services providers are taking cloud-based, Modern Data Protection further than Backup-as-a-Service, and turning their focus to disaster recovery as well. The best cloud services providers will have tried and tested systems to get businesses, and their data, back up and running as soon as possible after a data breach or ransomware attack, and this should be a key consideration for businesses when choosing which provider to partner with. Again, in these tumultuous times, each minute a business is offline or unable to carry on business as usual will cost money they simply cannot afford to lose.  Ultimately, as an adaptable and flexible tool, the cloud can enable businesses to be dynamic and resilient in the face of unpredictability. Not only can it help to cut costs and save money, but it can give businesses access to the skills of an experienced cloud services provider that can enhance the company’s own security tools and processes. Veeam found that 81% of businesses are expected to adopt cloud services within their data backup tooling, which is a positive increase on this year’s figure. If the companies that make up this 81% choose the right partner and leverage their skills and inbuilt data security, backup and disaster recovery processes, businesses can certainly use the cloud to weather this volatile storm.  ### Use Video to Educate Employees About Benefits Benefits are complicated, especially for young employees who don’t fully understand their full benefits packages. Employers often struggle to find meaningful ways to educate and communicate with employees about plans and perks, which is where video can be an asset. Even if you offer competitive compensation and benefits packages, if your employees don’t understand what’s available and how it compares to other organisations, it’s virtually meaningless. Video is the perfect way to upgrade your communications efforts and ensure your employees understand what they have. Benefits of Video According to research, viewers retain 95% of a message when they see it in a video compared to 10% with written text. That’s an 85% increase in retention. Over 300 hours of video are uploaded to YouTube each minute and 5 billion videos are watched each day. That’s a lot of video consumption! Video content performs better, for many, because of its passive and visual means of supplying information. Viewers are riveted, so they absorb the information better and retain it. For most people, if given a choice between video or written communication, they’d choose video. It’s that effective. If you want your employees to actively engage with the information you’re presenting, it’s important to integrate video into your educational planning. Innovative Ways to Use Video in Your Benefits Communications Get creative! There are numerous ways to include videos in your benefits communication strategy, from offering a library of video resources to making announcements. 
Announcements Open enrollment is a critical time for the human resources department. Employees may miss out on critical information with an email, memo, pamphlet, or other means of communication. Instead, use a video to announce important open enrollment information and eye-catching visuals like animation, Q&A, and bulleted slides. This information can also be repurposed across different channels to ensure employees can reference it as needed. Benefits Education Some benefits enrolment platforms offer a built-in video function to create videos that inform the employees’ experience. On-demand education like this ensures that employees have the necessary details when it matters most – while they’re shopping for the benefits that will impact their lives for the next year. Plus, you can use these videos in an on-demand video reference library throughout the year. Support If employees aren’t sure what their benefits provide or how to choose the best options for their needs or budget, decision support is important. Decision support should be included in the educational process to ensure that employees understand the terminology, evidence of insurability (EOI), plan comparisons, pricing, and their out-of-pocket expenses. Nurturing Benefits education isn’t just for open enrolment for employees. Your human resources department should have year-round benefits education with reminders, events, and resources that can be accessed at any time. Video should be included in this education to communicate plan features, changes, and updates, helping employees to understand what’s being offered. Best Practices for Video Getting started with video can feel overwhelming, but it’s easier than you think. There are plenty of options for different experience levels and budgets. However you choose to create videos, however, it’s important to plan your content accordingly. Keep It Short and Direct Videos don’t need to be complex or long to be effective. The most effective video content leverages the appeal of the medium – succinctness. Keep it short and direct, focusing on one topic at a time. This makes the information easier to absorb and suitable for different points in the enrolment process. Partner with Professional Video Producers If you don’t feel that you can create a video on your own, partner with a high-quality, professional video producer to design attractive videos that break down complex information into short, digestible bits. Don’t make it complicated – take the approach of breaking the concepts down into the simplest terms and concepts. Make It Convenient Investing in educational videos, whether on your own or with the help of a producer, is a waste if no one watches them. Make sure your employees know these video resources exist and how to use them effectively. Share videos through email and text alerts, announce the addition of a video library and inform employees at the start of the enrolment process. If employees come to human resources with questions, this is an opportunity to share the video library and help them find the resources they need. Is Video Always the Best Choice? Video is an excellent choice for many concepts and educational opportunities, but that doesn’t mean they’re perfect for every situation. Videos are more effective when they’re used to explain complex topics that can be aided through narration, illustration, animation, and more. Before you create an educational video for a topic, consider the following questions: Will the content translate well in a video format? 
Is the content suitable for the short and direct approach with video? How long will the video have to be to communicate the message? If there will be employees in the video, what happens if they leave the organisation in the future? Is the video accessible for people who lack equipment, have a disability, or face other limitations? Can the video be enhanced with visuals, captions, or other details to improve accessibility? How often will the content in the video need to be updated? When will that be? Benefits videos are focused on helping employees learn and understand their options to make informed decisions about their benefits. You have to deliver a lot of information in a short period of time. Videos are excellent for this purpose, but they're just a tool – it's up to you to make the content approachable and engaging. If your video doesn't achieve the goal, or it's boring or poorly organised, it won't be much better than written communication. Set Your Employees Up for Success Helping employees understand their benefits is a key responsibility for an employer and human resources. If you haven't considered video yet, you're missing out on an opportunity to present complex benefits information in a way that makes it fun and meaningful for your employees and less time-consuming for you. ### <strong>Overcoming IT Complexity in a Hybrid World</strong> It's clear cloud adoption isn't slowing down any time soon. Gartner® says end-user spending on cloud services has grown by a further 20% in 2022 to almost US$500 billion. And the spending spree isn't stopping yet. In fact, it's expected to grow by another hundred billion dollars in 2023. The fuel for this growth is the fact that all businesses are under pressure. Competitors are investing in resilience while also spending on improving products and services. At the same time, customers are demanding better experiences from companies. The distributed workforce also has high expectations—they want flexible work options. The fast pace of business innovation forces companies to continue striving to become more efficient and effective. Any way you look at it, companies must catch up, and this requires spending. As teams try to meet these competing demands, it's never been harder to work in IT. Pain points include running applications and workloads across both cloud and on-premises infrastructure while simultaneously supporting new technologies and legacy infrastructure. And as these new features and applications are added, complexity throughout the tech stack grows and spreads exponentially. A particular challenge is integrating new and legacy technologies, all while managing the transition to different ways of working. This is a challenge facing large organisations in particular. With IT complexity affecting the confidence of teams, including the tech pros maintaining these systems, it's becoming a problem we can't ignore. A confidence problem According to the SolarWinds® IT Trends Report 2022, complexity has negatively impacted the ability of IT teams to both support their businesses and the bottom line. Worldwide respondents to the survey revealed the top drivers of increased complexity include new technologies and tools, growing tech requirements from multiple departments, and fragmentation between legacy and modern technologies. These growing demands indicate a potential crisis looming on the horizon. Additional findings from the IT Trends Report show IT pros lack confidence in their ability to manage today's complex environments.
When asked how confident IT professionals are in their organisation’s ability to manage complexity, only 16% of respondents said they felt extremely confident. More than a third of respondents (34%) also admitted they weren’t fully equipped to manage complexity, and an additional 6% weren’t confident at all.  At a time when budgets are being watched carefully, every penny counts. For every department, spending in the right places is crucial. With the majority of IT leaders (75%) believing return on investment (ROI) has been impacted due to increasing IT complexity, it’s vital business leaders work collaboratively with IT teams to fix this ever-expanding issue and help IT pros get back on track.  Thankfully, there are four key ways to help combat this crisis in confidence. 1. Choose the right payment model When working with an IT vendor, it’s important to keep your options open and shop around for the best deal or payment plan. Subscription or one-off payments have budgetary benefits, but pay-as-you-go IT services shouldn’t necessarily be ignored. Providers of these services must commit to making you happy, or you’d be wise to look elsewhere.  Additionally, a pay-as-you-go approach allows your team to see a direct link between cost, waste, and inefficiency. It forces IT professionals to stay on top of maximising the usage of what they buy. For example, if there’s a technological lag, they’ll reach out to the provider for a faster fix. It also means you’re not paying for more than you need, and it’s easier to scale up or scale down as the needs of the business change. The result is a better understanding of ROI for the entire IT team. No matter which method you choose, selecting the right option for the task at hand and the more general business requirements is essential. This choice should also be re-evaluated and defended at future budgetary meetings to make sure it still aligns with the company’s needs as they change in a rapidly evolving business environment.  2. Seek solutions suited to your size Each organisation faces different challenges, but the size of the organisation can play a significant role in determining what technology, tool, or strategy could help better manage IT complexity. For example, The SolarWinds IT Trends Report also found more than a third (38%) of enterprise tech professionals indicated fragmentation between legacy and new technologies was the leading cause of increased complexity, compared to slightly over a quarter (29%) of their small business counterparts.  Before you select or purchase anything, factor in the size of your organisation. For example, a smaller company may be able to tackle complexity by bringing in one or two external consultants to lay out a strategy. However, this might not have the same impact for a larger organisation.  In fact, larger organisations are better off undertaking a cost-benefit analysis to determine the most effective way to manage big legacy tech stacks. Larger organisations may also have more complexity to contend with, making it especially important to consider how to integrate new and legacy technologies most smoothly.  3. The domino effect of training  When it comes to emerging technologies, the role tech professionals play in helping their companies make strategic business decisions can’t be overstated. Along with the benefits, IT pros will know the limits of their technology initiatives. They also understand where the compliance and security risks are—and how technology can be best used to meet business goals.  
Despite this, confidence is waning. The report notes many IT pros feel they have suboptimal visibility into infrastructure and networks and understand they require comprehensive, first-hand training to get there. These upskilling sessions, which should include the time to experiment and learn new technologies, are needed to create an IT workforce secure and confident in its skills.  In communicating with management, tech leaders should stress how improved app performance, visibility, customer experience, and product resolution have a knock-on effect on business growth, which is something every team member at every level should be conscious of.  4. Remember, this isn’t the first time  Take the time to remind yourself: everything changes. The system you preferred last year has since been changed or upgraded. The company you previously relied on seems to have shifted its focus and mission. The business environment may also have pivoted, sending the business down a new path which eventually trickles down to the IT department. On the surface, tools like Zoom® and Slack® may connect us, but with every new connection comes an additional layer of IT infrastructure to track and manage.  IT pros have to take a step back and get perspective on the pace of this change—and why it’s necessary. This last point doesn’t require any technology, but it does require the ability to remember your job has always been about change.  Fighting the problem of complexity is a battle we all face. While there’s no miracle cure, try these four steps in the first instance. But buckle your seatbelt, too, as cloud adoption isn’t going anywhere anytime soon. ### <strong>5 key considerations for moving your data to the cloud</strong> It can seem like everyone’s moving to the cloud nowadays: back in 2017, Gartner estimated that businesses spent just over $38bn globally on database management systems. Today, just five years later, that figure has more than doubled to $80bn. This is a massive shift, and shows that many businesses are well on their way to transitioning to the cloud for their database needs. But ‘moving to the cloud’ is easier said than done. There are many different strategies and tools businesses can use for their cloud databases, depending on their budget and business use case, and not all are created equal. It’s often a confusing picture, and it’s a passion of mine to make sure that businesses have the complete picture when considering their options. A cloud migration can be an expensive and stressful undertaking, and it’s essential that businesses go into it with their eyes open. Understand why you’re moving from on-prem to the cloud It might sound obvious, but it’s vital when embarking on a lengthy cloud migration project.  There are dozens of compelling reasons why your business might be considering a move to the cloud. You might want more freedom, flexibility and agility for the organisation’s technology infrastructure in order to innovate faster. Cloud services provide automation and abstraction tools that often allow users to be more flexible, responsive, and focused on the work that drives business value rather than spending time on maintenance of on-prem hardware and other tasks that are often just overhead. Cloud databases help businesses move faster and implement agile, devops ways of working. 
For example, with on-prem solutions, testing a new prototype would require the ops team to procure hardware, rack and stack it, install the OS, and integrate everything into the corporate infrastructure before turning it over to the development team. In a cloud based environment, this capacity can be spun up in a matter of minutes. And that’s the bottom line: cloud services offer businesses the chance to experiment, try new things, fail fast, and innovate with a lower degree of risk. On-premises deployments commonly require a bigger commitment to technology regarding licences, hardware, and operational skills. This makes it much harder for enterprises to quickly try something new. And once they do try something new, they tend to be committed to long-term licences. For most businesses, the drive to move to the cloud will be a mix of these factors, but there’s normally one that will be coming out on top. Identifying that at the start of the migration process will pay dividends later when it comes to shortlisting potential solutions, agreeing SLAs and measuring results. Making sense of ‘as a service’ There are three options to take a database to the cloud: virtual machines (VM), containers, or Database-as-a-Service (DBaaS). The DBaaS market is growing at an exponential rate, with recent research predicting the market size to grow to $2.8 billion by 2025, doubling within five years. While DBaaS is currently the most popular option for moving to the cloud, VM deployed through Infrastructure-as-a-Service (IaaS) and containers with Kubernetes also have significant success in the market. Deep dive into DBaaS With DBaaS growing so quickly, it’s worth pausing to take a closer look at what this type of cloud database offers businesses and when it would be worth considering for your deployment. In DBaaS, the cloud database operator (CDO) assumes key responsibilities: managing the software, the hardware, and the network, while assuming responsibility for the service availability.  This is the biggest advantage DBaaS offers its customers – the freedom to not have to think about these maintenance tasks. On the other hand, as a customer of the CDO, you’re totally dependent on them to perform these tasks well and in a timely manner, and if their service is interrupted for whatever reason then you could be in trouble. You’ll miss out on some ability to customise your deployment, and you may also not have control over when your CDO decides to update, or not, your database.  One example to illustrate the hazards of this will help. PostgreSQL, otherwise known as Postgres, is a free, open-source relational database management system, and it frequently releases new updates with lots of new features. Postgres 15 was just launched a couple of months ago. However, as a customer of a DBaaS provider, you need to wait until your CDO implements those new features in their own product before you see the benefits. One major DBaaS vendor for Postgres lagged in providing the latest version with the new features to customers for over two years – that’s a long time to wait. Check your DBaaS provider’s record on providing these features before committing to anything. The value of DBaaS Even with these few drawbacks, more and more enterprises want their IT and developer resources focused on innovation and business-oriented value add, such as data stewardship, analytics and building new applications. 
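One practical way to act on that advice is to confirm, before committing, exactly which Postgres version a candidate managed instance is actually running and how far it trails the current major release. The sketch below is a minimal illustration assuming the psycopg2 driver; the hostname, credentials and the version-15 baseline are placeholders, not any specific vendor's values.

```python
# Sketch: query a managed Postgres instance for its server version.
# Connection details are placeholders; requires `pip install psycopg2-binary`.
import psycopg2

conn = psycopg2.connect(
    host="your-dbaas-host.example.com",  # hypothetical DBaaS endpoint
    dbname="postgres",
    user="readonly_user",
    password="********",
)

with conn, conn.cursor() as cur:
    cur.execute("SHOW server_version;")
    version = cur.fetchone()[0]
    print(f"Managed instance is running Postgres {version}")

    # Compare the major version against the release you need (15 used purely as an example).
    major = int(version.split(".")[0])
    if major < 15:
        print("Provider lags the current major release - check their upgrade cadence.")

conn.close()
```

A check like this, repeated when new Postgres releases land, gives a rough picture of how promptly a provider rolls out new versions.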
In our experience, businesses are more than happy to leave the behind-the-scenes grunt work that keeps the lights on to a cloud database provider. In an increasingly hybrid world, you need a provider that can help you shift your workloads from on-prem to the cloud in a way that works for your specific needs. DBaaS eliminates vendor lock-in and allows organisations to reap the innovation benefits found with open-source Postgres. ### <strong>Why a Digital Twin Graph Will Help Manage Real-World Complexity at Scale</strong> Digital twins are computer models of the real business world. They help asset owners understand the details of what is happening, in real-time as well as retrospectively. They are increasingly being applied in the context of asset tracking, building and engineering maintenance, transport or cargo tracking, operations management, oil and gas flows, financial flows, compliance, and more. Modern digital twins are built on top of knowledge graphs, platforms which can not only scale to the vast amounts of data accrued by assets and people, but also deal with the intricate structures and relationships between them. In turn, this deepens CIO visibility into key business processes. The reason behind the success of this next generation of digital twins is the convergence of analytics, data science, machine learning, and AI (Artificial Intelligence). Knowledge graphs are being deployed because they make data smarter, and provide a superior underlay for those techniques. A knowledge graph is an interconnected dataset enriched with business semantics. It allows you to reason about the underlying data, and use that data for complex decision-making at scale. And that's not something traditional data management systems (like relational databases) can offer. Connecting your digital twin with external data opens up use cases That's because when data gets ingested into a graph database, relationships are stored as first-class citizens, not as some afterthought to be expensively computed later. Business assets in the knowledge graph are natively connected to their neighbours, and their neighbours, and so on. That means that as the knowledge graph grows and gets richer, it becomes more useful. Moreover, the rich network of data in your new knowledge graph becomes very useful to your data science team. The topology of the knowledge graph provides opportunities for powerful graph analytics and graph machine learning – tools which are only available to graph users. Add graph visualisation on top of that, and users who adopt a graph-based digital twin get an immediate head start in understanding and managing their physical business, as well as being able to respond quickly to events and look for future problems. And as knowledge graph users find, connecting a digital twin with external data from assets, sensors, markets, and even weather forecasts opens up many use cases. The kind of living simulation mirroring the real world a digital twin embodies allows security experts to run vulnerability tests without disrupting everyday services. It's a technology that can be used to simulate cyberattacks, providing a great aid with threat detection and smart decision-making should a breach occur. Digital twins can be used to carry out network analysis across connected IT systems to rapidly help the security team better identify vulnerabilities and quarantine them before they spread to other parts of the infrastructure.
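To make that idea a little more concrete, the sketch below models a tiny slice of infrastructure as a graph and asks which downstream services are reachable from a failed or compromised component. It uses the open-source networkx library and entirely invented node names; a production digital twin would live in a graph database, but the traversal idea is the same.

```python
# Sketch: a toy infrastructure digital twin as a directed graph,
# used to answer "what is impacted if this component fails or is compromised?"
# Node names are invented for illustration; requires `pip install networkx`.
import networkx as nx

twin = nx.DiGraph()
# Edges point from a component to the things that depend on it.
twin.add_edges_from([
    ("rack-7", "server-42"),
    ("server-42", "payments-db"),
    ("server-42", "auth-service"),
    ("payments-db", "checkout-api"),
    ("auth-service", "checkout-api"),
    ("checkout-api", "web-storefront"),
])

def impact_of(component: str) -> set:
    """Everything downstream of the given component - the 'ripple' through the twin."""
    return nx.descendants(twin, component)

print(impact_of("server-42"))
# {'payments-db', 'auth-service', 'checkout-api', 'web-storefront'}
```

Running the same traversal in reverse (nx.ancestors) answers the complementary question of which upstream components could affect a given service.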
In fact, creating and analysing a graph digital twin of your infrastructure is one of the most effective measures you can take for improving your cybersecurity posture. It's also very helpful for managing the endless, dynamic complexity of cybersecurity vulnerabilities and threats. Mapping complex what-if queries and event impacts A great example of a digital twin knowledge graph comes from Lending Club, a US peer-to-peer lending company. Its knowledge graph perfectly complements its expansive, microservices-based architecture. The Lending Club knowledge graph listens to the systems around the network, including its hundreds of microservices and the underlying servers, switches, and racks the microservices need to operate. This real-time view of its systems is piped into the knowledge graph, delivering one central view of all events and metrics, so that Lending Club can make sense of a connected domain. Engineers can explore what-if scenarios against real-time events, asking 'What happens if this router dies?' and mapping the ripples through the graph to see which applications, which customers, or which loans might be affected. Finally, the UK's Transport for London (TfL) is using graph technology as the basis of a digital twin to achieve quicker identification of traffic disruptions. It states that a graph-based digital twin is more capable than relational databases of handling this interconnected data, enabling faster and more effective interventions. Andy Emmonds, its Chief Transport Analyst, reports that when COVID hit, graph technology was the only way to accurately estimate how long it might take an ambulance to travel from a London hospital to one of the newly-established Nightingale Centres. These use cases demonstrate that graphs are the perfect solution for data complexity. If you're an enterprise looking to build a digital twin to capture real-world complexity and volume, you should model it as a knowledge graph. Non-graph technologies struggle to cope with the complexity and might well fail to give you the meaningful, timely analysis your business needs. ### <strong>Overcoming the increasing challenges to data solution providers</strong> The energy market is in crisis. Russia has reduced supply via a key pipeline to 20% of capacity, causing the cost of natural gas to hit an all-time high and driving the cost of energy across many countries in Europe to prices that could lead to industry-wide shutdowns. Pubs and restaurants are being forced to close as they can no longer afford to heat their buildings, and even schools are concerned about how they will meet their energy costs this winter. Heatwaves hit Europe this summer, with the UK experiencing the hottest day since records began this July. Nine of the 10 hottest days in the UK have been recorded since 1990, and the temperatures continue to hit record highs. Drought frequently follows heatwaves, and agriculture is experiencing one of the least productive years in living memory, with livestock well into winter feed as no grass has grown, and crops failing due to lack of water. Finally, the US government has acted, and the Senate passed the Inflation Reduction Act, confirming a $369bn spend on climate-related projects, including investment in renewable energy, incentives for EV purchases, coastal protection zones, biofuels and hydrogen plants. Though some say it is 10 years behind the position of the European Union, it is, at least in part, a welcome investment.
According to French President Emmanuel Macron, the best energy is that which we don't use, so if your business is looking to reduce both energy use and costs, there could be a solution in the cloud. Data centre energy and water consumption is under the spotlight, and rightly so, given that the typical data centre consumes around 3-5 million gallons of water per day, roughly equivalent to the water consumption of 10,000 people. PeaSoup offers a different solution by using Liquid Immersion Cooling technology. It is a simple method in which servers are submerged in a dielectric liquid within a self-contained unit that cycles the liquid around the energy-hungry electronic components, reducing the need for traditional air-cooled systems. As a result, energy consumption is reduced and reliability is improved. Companies are facing an ever-growing chorus of calls to improve their sustainability, but it is a moving target, as rule changes, government initiatives and laws are introduced and amended on an almost weekly basis. The global scale of climate change, its causes and effects, is being realised every year; record high temperatures in the UK in 2022 are just one example, and the actions being taken to combat and reduce the effects have been likened to a sticking plaster on a gaping wound. There are changes that businesses are obliged to make to reduce their effect on the environment, which include reduction of waste, reduction of their carbon footprint, and switching to more sustainable practices. Data is often a forgotten factor in sustainability, but every email sent has a carbon footprint, and if you consider the number of emails sent by both individuals and corporations each day (globally, around 3.5 million emails per second), the carbon footprint of that alone is phenomenal. Another factor that has become more prevalent this year is water security. As more and more data centres are built (the amount of data will always increase), usually in the legacy style of a large warehouse, with atmospheric controls including air conditioning and humidifiers, both with high-level energy consumption, the demand for water also increases. For example, a 15-megawatt data centre could consume around 1.6 million litres of water per day, roughly the same consumption as around 10,000 people (about 160 litres per person per day). There are obviously factors influencing the consumption of water by data centres, such as their location and size, but the fact remains: they are high consumers of energy and water, yet essential to everyday life. To address this situation, companies can choose to store data in a more sustainable way, with a company that focuses on renewable sources of energy, sustainable practices and technology with a proven energy consumption around 40% lower than a legacy data centre. Additionally, the lifespan of servers held within a legacy data centre is short. The heat generated in the constant processing of data burns out the components quickly, meaning more frequent replacement is needed, thus increasing the costs of traditional data storage in comparison to liquid immersion cooling. The hardware can more easily withstand the constant, closely controlled submerged state in the dielectric fluid and lasts longer before a replacement is needed. The cost of energy has created a great many front pages over the last few months, and still influences headlines while we wait to fully understand what form the intervention will take for businesses and consumers alike.
If this is only a temporary solution, then we perhaps need to look to alternatives, reducing our consumption rather than the cost, which falls outside of our control. Alternative sources of energy are one route to go down, and PeaSoup has always used, and will always use, renewable energy sources to power its data centre storage. Marketing Manager, Art Malinowski, commented: "We were the first Cloud providers in the UK to use Liquid Immersion Cooling, and this sustainable and energy saving technology is proven to be a more planet friendly way to store data in the cloud. When businesses are looking at solutions for data storage it really should be a consideration to reduce their carbon footprint." Put simply, data (and the processing thereof) will continue to grow exponentially, and we need to combat the growing need for both energy and water consumption to process that data - and there is a sustainable solution. Cloud-based platforms using liquid immersion cooling have an additional benefit: the heat generated from the cooling of the technology can be recycled. Projects are underway that cycle the heat generated by the hardware to warm buildings via a heat exchange system. Similarities to air and ground source heat pumps can be drawn, as it uses similar technology and has the same sustainable advantages. The future is sustainable; data can be used as a force for good. Find your way to a greener and more sustainable solution with PeaSoup.Cloud. ### How edge cloud is transforming online retail Retail, like almost every other industry, experienced an acceleration in digital transformation brought about by the pandemic and the overnight shift to digital-first. For a time, e-commerce was the only viable shop window and retailers across the globe were presented with the opportunity to prioritise a safe, secure, and scalable online presence to better cater to, and drive revenue from, their suddenly remote customer base. Fast forward some years and e-commerce shops have come to favour flexibility and creativity, both of which they need in order to serve more and more customers from the edge of the network. In doing so, they're able to reduce costs, improve performance, and transform user experiences, all whilst keeping their applications secure under these new demands. Edge computing is key to catering to this sudden shift and enabling great shopping experiences quickly, safely and more effectively compared to legacy CDN and security offerings. An edge cloud platform sits between e-commerce infrastructure, either in the cloud or data centres, and edge devices in order to process, serve, and secure customers' applications as close to end users as possible — at the edge of the internet. Ultimately, it allows online retailers to create a better, faster and more personalised online shopping experience for customers - improving conversion rates and generating greater revenues. Here's how. Work bigger and faster The entire edge cloud evolved out of the CDN and, by providing some core features during this evolution, has now created a suitable home for powerful edge computing features. It's evolved as a function without the need for physical kit, allowing industries worldwide to get closer to their global users than ever before and enabling far greater user experiences than were previously possible.
CDNs have always provided a better scalability model for applications due to the sheer size of their networks, which allows them to take a lot of the burden from existing business applications. With edge computing, however, businesses have been able to go one step further and deliver more content to the edge. Whereas they previously delivered simple, static content like images, CSS and JavaScript, edge computing has allowed more complex parts of the application to sit at the edge of the network. Today, existing CDN features like real-time purge allow edge clouds to serve more complex and dynamic content from the edge, enabling richer, altogether better content. Edge computing takes things even further: besides the content of the application, retailers can now run some of the application logic at the edge. And after all, the more at the edge, the better. This is a huge step for online retailers who, over the past decade, have had to split their applications into microservices, distributing their content across a cloud infrastructure from a range of third-party providers. Today, all of those building blocks can run at the edge of the network. This means reduced latency, which in turn means better performance and more sales as more of the application sits increasingly closer to the customer. The past three years have proven that the retail industry is competing in an ever-changing and unpredictable landscape, and must remain nimble and ready to switch gears at a moment's notice to remain competitive and keep up with consumer demands. With legacy CDNs, it may take time to engage with professional services and more time yet to get into their release queue, which doesn't allow for that nimble mindset. While it's more efficient for devs to do this themselves, they historically haven't had the tools to do so. In the edge cloud, devs get the efficiency and automation they need, allowing them to spend less time managing day-to-day tasks and more time focusing on the innovation that will differentiate their retail business. A new way to build security infrastructure When it comes to security, retailers have traditionally wanted to have experts and their products protect their online presence while they focus on infrastructure. Security has always sat at the perimeter of the network; antivirus evolved to firewalls, firewalls evolved to cloud firewalls and so on. As such, it makes sense today to deploy security at the edge. If a business chooses to do so for its own application, it can use the aforementioned building blocks to build a security setup, combined with available products, that is bespoke to its needs. After all, given the right tools, the retailers themselves are always going to come up with smarter, more innovative solutions than any provider could offer. Their application is unique, their challenges are unique, and they need a unique solution to help them overcome those challenges. Encouraging that building and experimentation is one of Fastly's founding principles. We like to give our customers the building blocks to create whatever they want for their application in a way that suits their needs. Whether they want to build a homegrown security solution, incorporate a vendor's service or build with Fastly products, we want them to be in creative control. Stay agile to keep costs down When it comes to cost savings, it depends on the stage and scale of your e-commerce business.
For example, CDNs are still a must for large global operations, but other options are open to smaller, local setups. For the latter, when considering how to build your application and what kind of infrastructure you'll rely on, there's more freedom to stay agile, and to avoid giant infrastructure investments. By offloading content to the edge, you can save on pricey bandwidth and storage costs, reduce the burden on your servers, and cut the costs you pay for moving data out of the central cloud - something that simply isn't achievable with legacy CDNs, despite them sitting at the edge, due to their low cache hit ratios and the added latency and costs this incurs. These problems cease to exist with edge computing, which promises to cache more than ever before due to the ability to instantly purge content, which in turn eliminates any worries over serving stale content. Simply put, the more you house at the edge, the more you're able to cut typically problematic infrastructure and bandwidth costs. And converting e-commerce shoppers ultimately comes down to offering them a fast and reliable online experience every time they engage with your site or app. Beyond allowing retailers to cost-effectively offload content, edge computing also helps them deliver that reliable customer experience by radically simplifying image delivery, as image transformation is handled at the edge. Image delivery is particularly important for virtual shop windows, and creating the best online shopping experience for consumers lies in serving images optimised for their specific device, connection and location, among other variables. Doing that at scale will become very costly very quickly, but doing it through an edge cloud allows them to deliver a premium service whilst saving money; prior to this, retailers were forced to store, process and serve multiple versions of each image, or perform offline image preprocessing, which, as well as being terribly inefficient, was significantly more expensive to host. Go edge or go home As consumers continue to rely on digital channels, there has never been a better time for online retailers to speed up their digital transformation and meet their customers' needs head-on. Those who do will see steady margins, while those who fail to adapt will suffer the consequences. To get there, edge cloud should be a pivotal piece of the development puzzle for e-commerce businesses, not just another service that is subscribed to. By enabling developers to do what they do best, retailers can grow and adapt their applications in conjunction with their customers' needs, addressing unique business challenges with unique functionality. More importantly, this kind of developer enablement will undoubtedly lead to a bespoke and altogether better user experience that will help online sales soar. ### <strong>Why you must enhance visibility and control of your expanding supply chain</strong> In today's extended digital landscape, the concept of a truly secure perimeter is a myth. With so many endpoints, network assets, and endless digital supply chains, it's almost impossible to ensure that your network perimeter is impenetrable. However, most organisations are still relying on an inside-out approach to cybersecurity - looking out for incoming threats and trying to arrange their defences accordingly. As organisations' supply chains grow and become more digital, threat actors are increasingly targeting third parties to gain access to the enterprise network.
Organisations can spend significant resources building their internal defences and security strategies, but how can they guarantee that a vulnerability won't be introduced by a third party? In the past year, nearly 60% of all data breaches were initiated via third-party vendors. These attacks are sometimes undetectable by the usual outwards-facing approach to security until they have already breached the perimeter. So, it's high time businesses rethink their security strategy, and apply a more proactive 'outside-in' approach to complement the traditional inside-out approach. The critical threat of supply chain attacks Supply chain attacks target one of the most critical foundations of modern business operations: the trust organisations place in their third-party suppliers. The 2020 SolarWinds attack targeting US government organisations is perhaps the most notorious example, with a highly skilled threat group exploiting a trusted software vendor to push out malware-laden code that bypassed normal security precautions. While such attacks were once largely the work of high-level threat actors, rapid digitalisation has made supply chain tactics more accessible. Many businesses today have comprehensive digital supply chains, so even low-level criminal organisations have a greater attack surface and more connections at their disposal. It is possible for an attacker to gain access to an organisation's data or systems via a third party. If an outside supplier or contractor is breached, attackers may use their credentials to gain access to your network. Data analysis and accounting are only two examples of outsourced services that may store copies of private information on servers the organisation doesn't manage. The security state of these third-party sites is unlikely to be visible to SOC teams or other conventional security monitoring methods. Therefore, even the most meticulously planned outward-looking security procedures may be undone by a third-party partner with less mature security practices. Every supplier has its own set of connections, which just adds to the complexity of the situation. Each company today operates inside a complex supply web, rather than a linear supply chain, and security breaches at any point may have far-reaching effects. Establishing visibility using an 'outside-in' approach An outside-in strategy for data monitoring entails implementing external attack surface management solutions in addition to conventional internal vulnerability assessments and threat monitoring. This requires continuously scanning the internet for any company-related assets that might be exposed. This strategy will aid in uncovering "unknowns" that the firm was previously unaware of. A sales partner, for instance, may have a client database as part of its service, but may mistakenly expose it by keeping it in an unprotected, publicly available AWS bucket. External attack surface management solutions will identify such susceptible assets, independent of their location or creation method. Crucially, this view extends to "Nth-party" connections – the suppliers-of-suppliers that extend outwards through the entire supply web. The average network of connections is now so complex that it is impossible to accurately trace back all the possible routes of risk exposure. An outside-in approach bypasses this complexity to find the vulnerabilities directly. When a breach or leak does occur, external monitoring can quickly identify any company-related data by monitoring the open and dark web.
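The unprotected-bucket scenario mentioned above is also something worth checking on your own estate, even before bringing in an external attack surface management tool. The sketch below is a minimal illustration using boto3: it flags S3 buckets whose public-access block is missing or incomplete. It is an assumption-laden example (it only covers your own AWS account and this one misconfiguration class), not a substitute for a full external scan.

```python
# Sketch: flag S3 buckets that do not have all public-access blocks enabled.
# Requires boto3 and AWS credentials with s3:ListAllMyBuckets and
# s3:GetBucketPublicAccessBlock permissions.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

def is_publicly_exposable(bucket_name: str) -> bool:
    """True if any of the four public-access-block settings is missing or disabled."""
    try:
        config = s3.get_public_access_block(Bucket=bucket_name)[
            "PublicAccessBlockConfiguration"
        ]
    except ClientError as err:
        if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
            return True  # no public-access block configured at all
        raise
    return not all(config.values())

for bucket in s3.list_buckets()["Buckets"]:
    if is_publicly_exposable(bucket["Name"]):
        print(f"Review public access settings for: {bucket['Name']}")
```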
Monitoring the open and dark web in this way might give the data owner the opportunity to contain or minimise a breach before it escalates. In the best-case scenario, the data can be removed; at the very least, the organisation can get an accurate understanding of the scope of the breach. This allows the company to provide a precise and measured response, as opposed to a generic public announcement of a breach. The optimal method for continuous external monitoring is a combination of AI and human-led analysis. This combines the speed and accuracy of AI with the context and expertise of human analysts to provide actionable insights. An automated, AI-assisted approach has fast become the only way to untangle the complexity of dense supply webs. As firms continue to invest in digitalisation, extended supply chains will only become more convoluted, and present an even greater attack surface for threat actors. Firms must have the ability to monitor and mitigate these risks before they fall victim to the latest supply chain attack. ### <strong>Gaining a grip with the strangler pattern</strong> Big bang migrations, in which all traffic is cut over to a new system when it is sufficiently close to feature-parity with the legacy system, are inherently risky and almost always delayed. Risks arise from complexity built up over time in the legacy system as well as uncertainty about how it works and how it is used. The migration process itself presents a risk due to its sheer scale. Failure can easily be catastrophic, and rollbacks become just as complex as the migration itself. A dual-running period with some kind of bidirectional data synchronisation can mitigate some of these migration-related risks, but this still locks up value in the new system until late in the development process. This is an inherently Waterfall delivery approach. Not only does the new system need to reach feature-parity (or thereabouts) before any value can be unlocked, but during this time fixes and new features are likely to be required on the legacy system too, duplicating work. What if the legacy system could be gradually replaced piece by piece and start delivering innovative new features to users within a few weeks? The term "strangler pattern" was coined by Martin Fowler, after he noticed a strangler fig growing around its host tree (the host it will eventually kill) during a hike through the rainforests of Queensland. Using this pattern, developers build new components around the legacy system, gradually migrating features as and when it makes the most sense for the business. Where the pattern can be applied This pattern is applicable whenever a complex system needs to be replaced. It's a great approach to use when decomposing a monolith into a suite of microservices, though it also works really well when migrating away from a complex SaaS solution. Regardless of whether the SaaS solution has a monolithic or microservices architecture behind the scenes, it's probably going to look like a monolith from the outside and almost certainly going to be billed as such. Continuing to be billed in full for a platform that is only being partially used may not seem to make sense at first glance, but it is worth looking more closely at the business case. The hosting and support costs for the new platform are likely to be very small when using a serverless architecture, even more so when usage is low.
Many SaaS solutions are billed on usage or monthly active users (MAUs), and with the right migration strategy these metrics can be reduced gradually as the new system comes online. Unlocking new features can have massive business benefits that may completely outweigh the additional cost, and the hidden costs of a complex big-bang migration likely outweigh even a generous dual-running period. The pattern in action The following real-world example, drawn from experience, illustrates this concept in technical terms. For the last two years, a specialist team has been gradually replacing parts of a large-scale Online Video Platform (OVP) for a global media company. The system stores information about TV content and allows monetisation of that content catalogue. There are many parallels to more generic eCommerce systems: there is a catalogue of products that users can browse, search and purchase, users have to sign up and share personal data and billing information that must be protected, and there is widespread personalisation of the experience. Placing a focus on personalisation The first target feature area for innovation with this project was personalisation. Increasing personalisation was predicted to significantly boost user engagement and retention, thereby increasing revenue. The new features included resume playback functionality and targeted content recommendations driven by machine learning. However, large volumes of usage data (e.g. video plays, clicks, favourites) are required to power these kinds of features, and the data was at this point all stored in the existing platform. The best practice strategy in this case was to wrap the existing system early in development so usage data could be intercepted, and then migrate the personalisation APIs as the new features were ready. One of the first things the development teams did was stand up an API and persistence layer to handle this data and insert a middleware service in the path between the apps on end user devices (Web, Mobile, TVs). This is often referred to as the Backends For Frontends (BFF) pattern. This BFF provided the ideal place to implement a data fan-out, with usage data being propagated to both the existing and the new systems in parallel (a minimal sketch of this fan-out appears below). When the new personalisation APIs were ready for primetime, it was relatively easy to put them into service by releasing a new version of the middleware. Enough data had been captured at this point in the new system to provide the required features to end users without having to do any bulk data migrations, significantly reducing risk and complexity. Devices were migrated incrementally from the old to the new middleware version by device type as the new features were written into the client code. Because data continued to be sent to both the old and new backend services, this switch was seamless for the end users. In this example, the strangler pattern enabled new features to be delivered rapidly to end users in a fraction of the time it would take to create a complete replacement of the existing platform. Furthermore, a risky big-bang data migration was avoided by implementing data fan-out in the BFF layer. Content management The second feature area for innovation was content management. User focus groups had shown that enhanced navigation and search capabilities would significantly increase user satisfaction and engagement. It was vital that new features were added quickly to stay competitive, and replacing the existing system in its entirety was not feasible.
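Before looking at how that second phase was approached, here is the minimal sketch of the usage-data fan-out referenced above: a BFF-style endpoint that forwards each usage event to both the legacy platform and the new service. The endpoint URLs, payload shape and the choice of FastAPI and httpx are assumptions for illustration, not the team's actual implementation.

```python
# Sketch: a BFF endpoint that fans usage events out to both the legacy and new backends.
# FastAPI and httpx are illustrative choices; URLs and payload fields are invented.
import asyncio

import httpx
from fastapi import FastAPI

app = FastAPI()

LEGACY_URL = "https://legacy-ovp.example.com/usage-events"    # hypothetical
NEW_URL = "https://personalisation.example.com/usage-events"  # hypothetical

@app.post("/usage-events")
async def record_usage_event(event: dict):
    """Accept a usage event from a client device and propagate it to both systems."""
    async with httpx.AsyncClient(timeout=2.0) as client:
        results = await asyncio.gather(
            client.post(LEGACY_URL, json=event),
            client.post(NEW_URL, json=event),
            return_exceptions=True,  # one backend failing shouldn't break the other
        )
    # A real system would log failures and retry via a queue rather than just counting.
    return {"forwarded": sum(1 for r in results if not isinstance(r, Exception))}
```

Because both systems receive every event, the later cutover to the new personalisation APIs becomes a routing change rather than a data migration.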
The strategy was to insert the new CMS in the flow of data between the existing system and the applications on the end user devices, to enable the catalogue data to be augmented and new features to be launched quickly. The existing system would become fully wrapped by the new CMS in a later phase. In the first phase, a new CMS and search engine were built that polled data from the API of the existing system at a regular interval. As soon as the new APIs for browse and search were ready, they were integrated into the BFF layer and put into live service, even though the new CMS UI did not yet allow editors to make changes to the data. Over the next two years, the team followed a strategy of only adding features in the new CMS. When a user-facing feature required additional data, the ability to create this data was built into the new CMS. Eventually the team started to work on migrating the remaining editorial functionality from the existing system. To make this an iterative process and avoid a big bang migration, they set up a two-way sync between the systems. This meant that an editorial user could make a change in either system, so no roll-backs would be required. It also meant that there was no pressure to immediately migrate the other consumers of data from the existing system. Here, as in the previous example, the strangler pattern enabled much faster time to market for new features. Using this pattern also avoided two big-bang migrations: the migration of apps on client devices from the old to the new browse and search APIs, and the migration of editorial users from the content management UI in the OVP to the new CMS UI. Rapidly bringing new features to market While a number of organisations will remain steadfast in their commitment to a big bang migration, it is always worth trying to dig deeper into the details and find ways to make the process more iterative. As the above real-life example shows, detractors should look again at the strangler pattern as the starting position whenever the replacement of any complex system is being discussed. With this pattern, organisations are able to remove ageing components and functionalities at their desired pace. Most importantly, however, organisations can ultimately bring new features to market more quickly and gain a competitive edge. ### <strong>Selecting a sovereign cloud: A data guardian's guide</strong> We're hearing more and more these days about sovereign cloud. While it is not a new concept, it has risen to prominence recently due to a changing geopolitical landscape and new regulations that affect the degree of sovereignty organisations and individuals are able to exercise over their data. This is because a sovereign cloud provides a smart solution as organisations increasingly appreciate both the benefits of well-defined, territory-level jurisdictional requirements for data and the risks of operating without them. The size of the global government cloud market is expected to reach $71.2 billion by 2027 from $27.6 billion in 2021, according to market research firm Imarc Group. Microsoft recently unveiled Cloud for Sovereignty - a new offering designed to help the public sector comply with regulators' increasingly strict requirements to keep data within a certain geographical area, particularly in Europe. European holiday season With all the various perspectives on sovereign cloud and the sovereignty of data - and its relevance to cloud consumers - being aired publicly across the continent, it is becoming difficult to understand what it all means.
This is especially true for those responsible for corporate or public data, when there would appear to be very little in common between the many competing definitions of sovereignty as it pertains to different forms of customer data and how best to address the concerns raised by the likes of GDPR, the US Cloud Act and Schrems II. At the time of writing, we’re very much in the European holiday season so, to avoid this being a dull technical discourse, we’ll use the analogy of planning family holiday accommodation as a light relief similar to the situation.  While seemingly leftfield, there are more similarities than you would think on the surface. Both involve multiple parties, with varying needs and a huge amount of influencing factors. Similarly, a well planned and executed family holiday can make amazing memories, whereas a badly planned holiday can leave permanent scars. In this respect it is not unlike the decision as to where to host one’s valuable corporate or public data.  Three main choices It’s a situation akin to choosing from a variety of accommodation when going on holiday, notably; the international hotel, the smaller, localised one or a boutique offering. The former is big with lots of support services and comes with a well-known reputation. Its guests consume the basic package of the room, but all other facilities and activities are chargeable individually. The second option is like the first but more localised. Many of the services, activities, and facilities of the international hotel brand are available at this franchise hotel, but where a local partner company owns the operation and does the overall management. This has some appeal as the services are more localised as are the staff who have received additional training that is more locally relevant in line with regulations and jurisdictions. However, as this is a smaller operation some of the services offered by the international hotel brand are not available, nor are the added advantages of accruing loyalty points or familiarity and greater certainty over service that a large international brand offers. The final choice is a local boutique hotel that has been operational for many years in a local area, and which provides a very tailored experience. This varies from the first two options in that it works with the guests to create specific packages of activities and services and on balance is more expensive and labour intensive as the hotel makes a real effort to understand their guests and consequently tailor a very relevant package of activities and services. This entity understands all the requirements of operating locally and can offer benefits accordingly. Five defining factors There are clear pros and cons with each and, much like a cloud provider, the key question is which to choose? The answer is not simple and comes with several defining factors, particularly: People (staff and booking operators) - Data guardians should be aware that certain tiers of data should ideally only be managed by certain types of individuals. Part of the journey to assessing the type of sovereign cloud a business needs is understanding the type of individuals that should have access to the different classifications of data for which the data guardians are responsible. 
Access (support services) - This is critical in understanding the choice of sovereign cloud: how all associated account and service metadata relating to the customer's data are handled, according to which regulatory frameworks and auditing standards, and which jurisdictions the sovereign cloud provider is subject to from the perspective of governance, oversight and compliance. Process (ease of booking, autonomy in consuming services and providing feedback to the provider) - The systems used to aid the people in carrying out their duties. This is all about the accountability of the sovereign cloud when it comes to how the customer's data, as well as all the associated account and service metadata generated by the provider, are managed and potentially leveraged, and by whom and where. Activities (what you get) - Relates to what people, through access and leveraging processes, can do with the data - both customer data and account and service metadata. It speaks to the level of expertise, accountability and training of the staff, as well as what they are allowed to do. For example, with children you don't use generic support staff; you use staff trained to work specifically with children. Understanding the data classification - how those data types should ideally be accessed, and what levels of management are enabled by which types of personnel using which systems - is critical to selecting the right sovereign cloud. Technology (the accommodation structure and ergonomics) - The need to have a robust and resilient architecture, located locally within the jurisdiction, and optimised to reflect the sensitivity and value of the data hosted on the platform, whether that is at an individual customer level or more broadly at a data classification level. The facility should be secured and operated at the highest levels of resiliency, but with the data also needing to be always available, this creates a need for backup and disaster recovery solutions that exist beyond a single-site architecture while remaining wholly within the local jurisdiction. The right choice, for you So, if your sovereign cloud provider was holiday accommodation, which would you choose? Like all families, there's no one-size-fits-all solution, and what works for some will not work for others. Classifying and understanding a business's data types is the first step one should take as a data guardian when looking at selecting the right sovereign cloud for your business. ### <strong>Top security threats to the cloud</strong> Cloud is an important mechanism for digital transformation. However, many companies still operate with an outdated mindset, which can cause them to make many mistakes. When businesses increase cloud capability and cloud velocity, they often produce new risk areas outside of their scope. For example, over the last few years, application developers have been known to quickly adopt cloud computing due to the demand for faster coding capabilities. Yet digital infrastructures that cannot support this demand often fail in the process of migrating from development to testing to production. Nonetheless, as Agile and DevOps methodologies become increasingly utilised, businesses must accept cloud as critical infrastructure to their operations. However, many businesses are still operating in legacy on-premise environments.
While the issues, challenges, and risks involved when using cloud computing differ according to the context, companies need to evolve and adapt with the times, as on-premise tasks are not always easily mapped to cloud operating environments. Relying on a single provider also comes with its risks when outages occur. For example, customers of Amazon Web Services, including Slack, Okta, and Jobvite, were left stranded and vulnerable when an outage occurred in December 2021. Even though cloud providers secure servers and infrastructure, many breaches still happen due to insecure default configurations, poor architecture and complexity in hybrid and multi-cloud settings. Therefore, it falls to the client - not the cloud service provider - to ensure these are secure. This is the genesis of the shared security operating model. How to mitigate cyber risks The Cloud Security Alliance outlined the top 11 challenges CISOs have found around cloud consumption to help raise awareness of the underlying threats, risks, and vulnerabilities. They are famously known as the Egregious 11: Data breach A data breach is when an unauthorised individual views, steals or uses sensitive or confidential information. To help data breach victims, the CSA advises organisations to develop a well-tested incident response plan, which also considers the content security policy and data privacy laws. Misconfiguration and inadequate change control Misconfiguration happens when computing assets are built incorrectly, which leaves them open to malicious activity. To combat these issues, companies should embrace automation and employ technologies that scan for misconfigured resources continuously and remediate problems in real time. Lack of cloud security architecture and strategy On a global scale, organisations are moving portions of their IT infrastructure to public clouds. One of the most significant issues relating to this is the application of the necessary security architectures needed to withstand cyberattacks. Insufficient identity, access, and key management Cloud computing can impact an organisation's identity, credentials and access management. Avoid security incidents by implementing two-factor authentication, rotating keys and removing unused credentials or access privileges (a short key-rotation sketch appears later in this article). Establish guard rails to minimise access drift. Account hijacking Account hijacking is when malicious attackers gain access to highly privileged information. According to the CSA, a way to manage account hijacking is through "defence-in-depth" and Identity and Access Management (IAM) controls. A defence-in-depth strategy is one that enforces multiple layers of security controls. Insider threat This is when a person - who has, or previously had, authorised access to an organisation's networks, computer systems and sensitive company data - uses their privileged access to behave in a way that could negatively impact the organisation. Avoid this by taking precautions to help minimise insider negligence, such as employee security training. Insecure interfaces and APIs Cloud computing providers allow customers to manage and interact with cloud services by exposing a set of software user interfaces (UIs) and Application Programming Interfaces (APIs). Best practices recommend correctly protecting API keys and not reusing them. Weak control plane A weak control plane means users cannot protect their cloud-based business data, which can cause data loss, either by theft or corruption.
Metastructure and applistructure failures Metastructure and applistructure are two crucial elements of a cloud service, and failures in either can have critical consequences. The metastructure, also referred to as the "waterline", serves as the boundary between Cloud Service Providers (CSPs) and their customers. Above the waterline, cloud applications must be properly hardened to reap the full benefits of the cloud platform.   Limited cloud usage visibility Limited cloud usage visibility leaves an organisation unable to see and analyse whether cloud service usage is safe or malicious. Complete cloud visibility needs to be developed from the top down to manage these risks. Abuse and nefarious use of cloud services Malicious actors can use cloud computing resources to exploit users, organisations, or other cloud providers. To mitigate the misuse of resources, CSPs need to have an incident response framework.  The 'Egregious 11' list highlights that businesses struggle to establish a clear cloud security architecture or identity strategy. The solution is to efficiently and safely ensure that all relevant identities can access the resources they need, and that there are clear instructions for responding when a bad actor takes over an account. Cloud systems can be infiltrated by targeting gaps in identity management. Moreover, application developers frequently work to tight deadlines and consequently struggle to embed security into their operations and culture - yet doing so is paramount to maintaining a resilient cloud environment. Companies finding it difficult to manage cloud configuration and monitoring effectively should adopt a "shift left" approach, integrating security into the application development lifecycle earlier.  How to develop cloud visibility Although cloud migration improves operational capabilities, increases speed, and helps to reduce costs, if executed poorly it exposes the business to cyber risks that carry financial, reputational, and material repercussions for business leaders. Furthermore, it is essential to oversee identity governance and access when operating in a fragmented cloud environment, as a lack of foresight can cause irreparable damage to an organisation. Intra-cloud resilience is only attainable once there is full visibility and transparency in the cloud. With that in place, organisations can establish a clear understanding of how data is accessed and who is granted access. Essentially, a company must weave cybersecurity into its cloud roadmap, all while focusing on mitigating and minimising lateral movement. Security teams need graphical visualisations that easily show how data and identities are connected, so that maturity levels can be baselined and reinforced (a minimal sketch follows below). This can help with prioritising the enforcement of identity, data classification and entitlement (access) as the standard controls for their multi-cloud security strategy.  All customers, whether SMEs or large enterprises, are expected to use more than one cloud. This means they must have a clear understanding of what 'multi-cloud' looks like and need to ensure access across that architecture is secured with the right strategy to gain the maximum benefits of the cloud. The challenge is ensuring this is done without compromising operational and cyber resilience.  Cloud security that is below average will, sooner or later, lead to a breach. 
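To make the identity-and-data mapping mentioned above concrete, here is a minimal, illustrative sketch of the idea: a hypothetical inventory of identities and entitlements modelled as a graph so that unusually broad access paths can be spotted and baselined. The inventory, entitlements and threshold are invented for illustration only.

```python
# Hedged sketch: map identities to data stores via entitlements and flag
# identities with unusually broad access. Inventory data is hypothetical.
import networkx as nx

entitlements = [
    ("analyst-svc", "read",  "customer-db"),
    ("analyst-svc", "read",  "billing-bucket"),
    ("ci-runner",   "write", "artifact-bucket"),
    ("admin-role",  "admin", "customer-db"),
    ("admin-role",  "admin", "billing-bucket"),
    ("admin-role",  "admin", "artifact-bucket"),
]

graph = nx.DiGraph()
for identity, action, data_store in entitlements:
    graph.add_edge(identity, data_store, action=action)

# Baseline rule for the example: flag any identity reaching more than two data stores.
for identity in {i for i, _, _ in entitlements}:
    reach = list(graph.successors(identity))
    if len(reach) > 2:
        print(f"Review entitlements for {identity}: {reach}")
```

In practice the edges would come from the cloud providers' own IAM inventories rather than a hand-written list, but even this simple graph view makes over-privileged identities and potential lateral-movement paths easy to see.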
The 'Egregious 11' explains well how businesses today can ensure a plan of action is in place for when bad actors attempt to corrupt their resources. Reports like those from Cloud Security Alliance help raise awareness of the risks involved with poor cloud security and provide guidance to organisations, helping them to reduce security breaches and data theft. Such attacks can cause reputational and financial harm to businesses.  Moreover, cybercriminals are counting on corporate leaders to move quickly and forget about making sure the basic precautions are in place. Businesses must remember to 'shift left' and design security upfront into the process instead of suffering the consequences after the damage is already done.  ### <strong>Bring Your Own Device: navigating the cybersecurity risks</strong> A hybrid work model and the abundance of smartphones, tablets, and laptops in the consumer market are prompting many businesses to allow “Bring Your Own Device” (BYOD) in the workplace. While there are clear productivity benefits, businesses must be well prepared to counter BYOD’s potential cybersecurity risks.   The risks became evident in a recent Impero survey of UK employees about their cybersecurity experiences and behaviours. It revealed that one in five respondents had been involved in a security breach or a loss of sensitive data while conducting work. Nearly all (91%) of the employees involved in a security incident use their personal devices to access business applications and data.  To reduce or eliminate the risk, many business owners would instinctively ban the use of personal devices at work in their entirety. Yet, such a knee-jerk reaction could be detrimental to the business since one-third (34%) of the survey respondents stated that they consider the practice of BYOD as a key requirement when job-hunting.  So, instead of undermining employee expectations, it would be more sensible for employers to put proper guidelines and safeguards in place. Implementing cybersecurity policies and encouraging risk-mitigating behaviours will enable them to benefit from the distinct advantages BYOD offers. These include increased productivity, greater employee satisfaction, and organisational flexibility.  The Covid-19 pandemic forced work from home to shift from a luxury to a necessity almost overnight. In many cases, employees needed to collaborate with colleagues and access company networks from their personal devices, creating an environment in which BYOD could thrive.  Hybrid work is a continuing trend, and organisations that do not have a formal BYOD security policy need to implement one without delay. Here are five helpful tips for success. Involve all stakeholders Blindly creating a policy that only serves the company’s interests will likely face resistance. Therefore, obtaining buy-in from all stakeholders and employees is critical to ensure everyone agrees to and supports the proposed policy.  Decision-makers need guarantees that the policy adequately addresses security concerns and that the overall benefits outweigh the drawbacks. They must also see proof of a definitive plan and support for a BYOD policy, especially from IT leaders. Bring Your Own Device security will place additional responsibilities on their departments, and they need to agree on and approve the level of resources and support earmarked for the task.  Moreover, stakeholders from various departments with different interests can contribute to creating a more balanced policy. 
Build a BYOD project management team with representatives from the HR, Finance, IT, and Security functions to contribute to policy development.  Employee input is equally critical for creating an effective BYOD policy. Building a policy framework that doesn't cater to their interests or needs may backfire. While it's necessary to spell out which devices and operating systems are allowed, being too restrictive will lead to a low participation rate among employees. Failing to offer adequate support for the right devices will have the same effect, wasting resources in the process.   Increase employee cybersecurity awareness Human error and ignorance both pose a severe threat to BYOD security. Our survey revealed that almost a quarter (24%) of employees lack confidence in recognising cybersecurity threats at work. Therefore, regular, mandatory cybersecurity awareness training should be a cornerstone of any security policy in a constantly evolving landscape. It will equip employees with the know-how to recognise and report common threats, such as phishing emails and suspicious links.  Make awareness part of the working culture by sharing best practices on the security elements employees encounter daily, such as password protection and usage.  Cybersecurity training must also become part of the onboarding process for new employees, including instructions on how to use essential tools.  Develop a clear BYOD policy Perhaps the most critical aspect of managing BYOD security challenges is to have a well-thought-out policy that governs how personal devices are used at work. Yet, in our research survey, 42% of respondents revealed that their place of work does not apply any security policy to control the interaction of personal devices with sensitive information.  Security policies can be complex and detailed, so it's necessary to formulate an employee-friendly version that is easy to understand. This version should cover key aspects, such as whether personal devices may be used for personal communications only, or also for work tasks. Understandably, the latter presents more significant security risks.   However, due diligence is necessary in all cases. Therefore, the policy should clearly stipulate the permitted device types and the authorised cybersecurity tools to use alongside those devices. The same goes for business applications. Include a comprehensive list of approved packages and ban the use of unapproved packages. It's also essential to specify the IT support level available to employees who use personal devices. Make cybersecurity tools readily accessible Our research revealed a surprising lack of ready access to essential cybersecurity tools. For example, only around half of employees, or fewer, have access to critical tools such as remote access software (57%), virtual private networks (52%), laptop encryption software (50%), and multi-factor authentication (45%).    Yet these tools are no longer the exclusive domain of IT security specialists. All employees must be well-versed in the correct usage of cybersecurity tools if the organisation is serious about reducing risk. Once employees are familiar with and confident in using these tools, the business must introduce measures to ensure they adopt and use them.  Constantly monitor, review, and refine This seemingly obvious point is also one of the most important. As with any business strategy, cybersecurity processes cannot be run on a "set it and forget it" basis. 
Organisations must frequently evaluate the effectiveness of their controls, investigate new capabilities, and make regular updates and improvements. Remember, cybercriminals never sleep and are constantly on the prowl. ### <em>Nastiest Malware of 2022 from a business perspective</em> With 2023 around the corner, we've seen yet another eventful year for the threat landscape, and malware remains centre stage among the threats posed to individuals, businesses, and governments. OpenText's annual survey of the cyber landscape paints a nuanced picture of the biggest threats of the year, but it is more than just doom and gloom. In addition to sharing insights into hackers and their latest techniques, our yearly round-up also allows us to consider the best ways to prepare for, and defend against, increasingly sophisticated cyber threats and double down on cyber resilience. Given that cybercriminals are constantly refining their tactics, cybersecurity experts need to do the same to stay prepared and well-positioned to fend off incoming attacks. 2022: a year in cybercrime Overall findings from the past year show a generalised increase in malware activity. Phishing, in particular, saw an almost 1100% increase during the first four months of 2022 compared to the same timeframe in 2021.  Meanwhile, the ransomware double extortion tactic continues to be a favourite for criminals - and absolute hell for victims. Some ransomware groups threaten to leak data even if the victim attempts to work with ransom negotiation services that are typically included in cybersecurity insurance. As a result, we have seen organisations shift from relying on cyber insurance policies to increasing the strength of their layered defences to be more resilient against ransomware attacks. Ransom payments have also been increasing at a much higher rate than inflation. The peak average ransom payment was above $300,000 this year; last year it peaked at $200,000, and in 2019 the average payment was $40,000.  However, it's not all bad news. Earlier this year we saw law enforcement agencies strike back against ransomware gangs more forcefully than ever before. REvil, which has featured on the OpenText Security Solutions' Nastiest Malware list consistently over the years, was hit the hardest. Russian authorities arrested members of the REvil gang and seized their computers and other assets. The REvil shutdown represents a positive sign for 2022 as law enforcement agencies develop more robust capabilities to arrest, prosecute and incarcerate ransomware gangs. Nastiest Malware of 2022 unwrapped This year's Nastiest Malware includes: Emotet once again returns to its pole position as the most successful botnet in existence, following a brief shutdown last year. Emotet's objective is to send billions of malicious spam emails a day. It creates a foothold on a victim's computer, with follow-up malware that then moves laterally and compromises the rest of the environment before bringing in the final payload of ransomware. LockBit is this year's most prolific - and successful - ransomware group. While the group has been around for about three years as a ransomware-as-a-service (RaaS) operation, it continues to advance its tactics. In addition to taking data, holding it for ransom and threatening to leak it, its triple extortion approach adds a third layer: a distributed denial-of-service (DDoS) attack on an entire system to completely lock it down. 
Conti, a ransomware-as-a-service malware, has been on our Nastiest Malware radar for quite some time. In February 2022, Conti released a statement of support for the Russian government on its leak site. Shortly after, a Twitter account, ContiLeaks, leaked the group's internal chats dating back almost two years, resulting in the dismantling of their leak site and command and control servers. Conti has since rebranded into multiple operations, most notably HelloKitty, BlackCat, and BlackByte. Despite being possibly the oldest of its kind, Qbot (AKA Qakbot) is still very much active today and continues to wreak havoc on systems globally. Qbot is an info-stealing trojan that moves throughout the network and infects the entire environment while "casing the joint", exfiltrating as much data as possible for extortion and preparing for the final stage: ransomware payloads. Valyria is another former banking trojan turned malicious spam botnet, delivering email attachments that are converted into malicious scripts which start an infection chain typically resulting in ransomware. The greatest challenge posed by Valyria is the complexity of its components and its ability to evade detection. Cobalt Strike and Brute Ratel are adversarial attack simulation tools. Cobalt Strike is a pen testing tool designed by white hats, while Brute Ratel was created for red teams. The purpose of these tools is to help teams simulate attacks to understand the tactics hackers use, determine security gaps, and make the appropriate changes. Unsurprisingly, then, Cobalt Strike - and now Brute Ratel - are both frequently abused by hackers with malicious intent. Of course, the above list is hardly exhaustive; unfortunately, the list of tactics used by hackers to infiltrate systems and obtain data is vast and continues to grow. Voice phishing and MFA (multi-factor authentication) fatigue attacks are just two examples where hackers rely on the vulnerability of human beings as opposed to targeting much more resilient computer systems; the idea being that it is far easier to hack the former than the latter.  Social engineering attacks are particularly dangerous, given that they rely on human fallibility. One simple error can undermine even highly secure, seemingly hack-proof systems. Computers might be near-perfect, but their users never will be. A key takeaway from 2022 is that we need to prepare for attacks on both. Navigating malware in 2023 and beyond No business, regardless of size, is safe from cybercriminals. Putting policies and technology in place to minimise the effectiveness of potential ransomware attacks is essential. Creating cyber resilience requires strong multi-layered security and data protection policies to prevent, respond to, and quickly recover from threats.  A key step is locking down Remote Desktop Protocol (RDP). Using RDP solutions that encrypt data alongside multi-factor authentication protects against vulnerabilities when remoting into other machines. This ensures awareness of all the remote desktop software being used – an important facet of appropriate defence, as criminals are now installing legitimate remote desktop tools to backdoor environments while avoiding detection (a quick exposure check is sketched below). Of course, installing reputable cybersecurity software is also crucial. It is best to implement a solution that uses real-time, global threat intelligence and machine learning to stop threats. 
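As a first, very rough check on the RDP exposure point above, the sketch below simply tests whether hosts still accept connections on the default RDP port from wherever the script runs. The host names are hypothetical, and a real assessment would cover far more (MFA, gateways, encryption and monitoring).

```python
# Hedged sketch: flag hosts that still accept connections on the default
# RDP port (3389). Host names are placeholders for illustration only.
import socket

hosts = ["jump-host.example.com", "finance-vm.example.com"]

for host in hosts:
    try:
        with socket.create_connection((host, 3389), timeout=3):
            print(f"WARNING: {host} accepts RDP connections - review exposure")
    except OSError:
        print(f"OK: {host} did not accept a connection on port 3389")
```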
Protection with multi-layered shielding can detect and prevent attacks at numerous different stages. Even with all these steps, hackers might still get through, and so it is also important to have a backup and disaster recovery plan in place. As new threats emerge and evolve, so must security awareness training. Keeping users up to date on the latest scams and attacks will help transform employees from a weakness into the first line of defence. Running regular cybersecurity awareness training sessions and phishing simulations keeps data safe and secure. Also, make sure employees know when and how to report a suspicious message. Hackers never sleep, and neither should our defences. ### <strong>Why data is often a double-edged sword in financial services</strong> Financial institutions, whether oriented towards corporates or individuals, arguably have more customer data at their fingertips than any other type of organisation. That much we know thanks to the rise of fintech, which has transformed the industry by making banking in particular far more accessible to the average consumer. But unlocking insights from this goldmine is often a major challenge. With 73% of financial services firms in the UK planning to invest in "better data analytics to enable more informed decisions" in the next 12 months, it is crucial to have the right building blocks in place to enable this. Every financial services organisation has had to rethink its market position to deal with the impact of the Coronavirus crisis and how it has reshaped business models. As a result, digital transformation strategies that may have been hovering in the background are suddenly top of the business agenda. Keeping pace with the complexity of financial data The access to, validity of, interpretation of and insights from data are the keys to digital transformation success, and to delivering a better user experience in the so-called "phygital" hybrid reality. However, the skills required to unlock the value of data are hard to come by right now, in one of the most challenging recruitment phases we've ever known. In fact, according to DCMS research, almost half of businesses (46%) said they struggled to recruit for roles requiring data skills. Add to that the fact that financial services organisations are grappling with legacy technology and scalability challenges in their transition to the cloud, and you quickly see why they're often restricted in what they can do with the data they have.  So how do we keep pace with the complexity and volume of data and empower data teams to make it easier for all areas of the business to tap into data-driven insights? How do we liberate data from being the preserve of small, centralised data teams, who are themselves already overloaded and hit by the continuing war for talent?  Democratise your data – Organisations should democratise data and enable the wider workforce to glean insights directly. Data silos will no longer cut it in financial services, with businesses increasingly using analytics to inform decision-making on anything from compliance to customer experience. It is imperative that every business unit realises the value of the data that both it and the wider business generate. What this means in practice is using tools that don't require bespoke coding, and instead enable businesses to access their data wherever it lives, run analytics locally and reveal genuine insights in real time (a rough illustration of the idea follows below).  
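As a rough, hedged illustration of that self-service pattern, the sketch below takes a raw export, applies a simple transformation and writes an analytics-ready file to a shared store. The file paths and column names are invented for the example; a real pipeline would add validation, lineage and governance.

```python
# Hedged sketch: a tiny extract-transform-load step producing analytics-ready
# data. Paths and column names are hypothetical placeholders.
import pandas as pd

def run_etl(source_csv: str, lake_path: str) -> None:
    # Extract: read the raw export.
    raw = pd.read_csv(source_csv, parse_dates=["transaction_date"])

    # Transform: keep completed transactions and add a reporting month column.
    cleaned = raw[raw["status"] == "completed"].copy()
    cleaned["reporting_month"] = cleaned["transaction_date"].dt.to_period("M").astype(str)

    # Load: write an analytics-ready file to the shared data store.
    cleaned.to_parquet(lake_path, index=False)

if __name__ == "__main__":
    run_etl("exports/transactions.csv", "lake/transactions_clean.parquet")
```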
By extracting, transforming and loading data in the cloud, agile financial services businesses can convert raw data into actionable, analytics-ready data in minutes for new insights and better business decisions. Empower data teams – All too often talented data teams are bogged down with slow data migration and maintenance. Our recent research revealed that two-thirds (66%) of data professionals believed their organisation was wasting too much time on data preparation. However, manual integration is increasingly becoming a thing of the past, giving way to more automation and low-code/no-code tools that take away a lot of this pain. They’re making it easier for less technical business users to easily analyse data sets, and freeing up valuable time for skilled data engineers to invest in more technically challenging and value-adding tasks,  and taking full advantage of what the cloud has to offer. Eliminate data blind spots – In our aforementioned survey, nearly 40% of data teams admitted they don’t fully understand how data is being used in their organisations. On top of that, a further 44% are concerned about the challenge presented by the diversity in the types of data they work with. All of which suggests certain data types are being left behind in many organisations, leaving new, growing information gaps in their data strategies. Cloud data and IoT data were noted as the most commonly unavailable or unsuitable sources for business intelligence and analytics. Cloud-native solutions like data lakes or lakehouses take on great importance here, by providing a centralised repository for businesses to collect and store data of any scale and format. When managed correctly, such data stores promise to help create a consolidated, single source of data truth, making it easier to govern, manage and transform into an analytics-ready state. Making your data business-ready  In a landscape permanently adjusted by Covid, financial organisations have a vast amount of data at their disposal, presenting an unprecedented opportunity. Capitalising on this can put them in the strongest position possible, unlocking more informed, real-time decision making across all areas of the business.  Having a holistic, cross-departmental data strategy is the first step, as too often data is siloed, mismanaged or incomplete. Putting the right tools and considered processes in place is a key part of achieving that, and will go a long way towards democratising data for the benefit of all employees. Now is the moment to  complete a major piece of the digital transformation puzzle and accelerate the modern data journey of financial services organisations all over the world. Don't let yours be left behind. ### <strong>How the cloud could hold the answers to the world’s biggest questions</strong> For any business with a growth mindset, cloud adoption can no longer be a “maybe”. When you consider how fast technology has moved in recent years, and the pure amount of data that organisations now have to handle, utilising cloud infrastructure to unlock the potential of that data is critical — not to mention makes it easier and faster to access from anywhere in the world. However, there is more to cloud computing than businesses might initially think. While its face value benefits are more than enough of a reason for companies to jump on board, look a little deeper and the full potential of cloud computing is more far reaching than you might think. 
In fact, when used correctly, it could help humanity to tackle some of its most difficult challenges — namely sustainability and cybersecurity. These are two global issues that businesses everywhere are struggling with, due to the current solutions being so wrapped up in political and economic concerns. However, with wider adoption of cloud computing and the consistent exploitation of its positive features, businesses could find a way of tackling these challenges, while also reaping the more obvious benefits that cloud computing has to offer. The data journey to carbon neutral It's taken a while, but big businesses have finally started to realise that they cannot focus on profits alone, and that financial success cannot come at the cost of the planet. Awareness of reducing energy consumption and controlling CO2 emissions has become a core focus for us as individuals, and companies have started looking at how their business, as well as their products and services, can be more environmentally friendly. For example, Microsoft is making efforts to be carbon negative by as early as 2030, but it's also going one step further and aiming to remove all the carbon the company has ever emitted — both directly and from electrical consumption — by 2050.  What might not be immediately clear is how companies can use universally available cloud services to help reach these targets. Acting as the roads and bridges of the online world, these services connect information resources and software on different servers across the world, making them an essential part of global infrastructure.   Whether it's robotic process automation (RPA) services or an e-commerce site, the benefits and potential of the cloud are huge for all kinds of businesses. International market research company Gartner predicts that by 2025, more than 95% of new digital workloads will be deployed on cloud platforms, up from around 30% in 2021. Moving away from locally managed on-premises systems makes for more efficient use of resources, and with less need for hardware or storage facilities, companies can reduce their infrastructure and energy costs almost immediately. This transition to the cloud is predicted to prevent at least 629 million metric tonnes of CO2 from entering the atmosphere between 2021 and 2024, according to market research firm IDC. But the carbon benefits of cloud services stretch even further than that. Cloud providers themselves are looking into ways they can proactively reduce their own carbon footprint, so the impact can be twofold.  For example, one of the largest cloud providers, Amazon Web Services (AWS), has committed to using electricity from 100% renewable sources for its own use by 2025. Google, its biggest competitor, is carbon neutral for operations today, but has a goal to run on carbon-free energy 24/7 at all of its data centres by 2030.  Overall, it's thought that a company can achieve an 88% reduction in its carbon emissions by simply switching to the cloud from traditional enterprise data systems, making it an easy move for businesses looking for ways to operate more sustainably and efficiently. Preventing cyber attacks Cyber attacks remain a constant threat for businesses, and as companies use more and more digital services, criminals have more data to target than ever before. The CSIS found that new and extremely damaging cyber attacks have been occurring almost daily for the past few years, and the frequency of these threats is only expected to increase.  
Putting more data into the cloud may seem like a strange move when online attacks are the very thing we are trying to protect against, but the nature and innovation of cloud computing solutions actually make them a significantly more secure alternative to storing data on-premises.  For a start, cloud storage providers use dedicated operations centres that have cybersecurity experts monitoring the infrastructure 24/7. This means there is always someone looking out for any cyber threats or suspicious behaviour, and that someone is always available to move quickly should a hacking attempt be detected. This physical presence also ensures the servers' physical safety, with heightened security protecting them from harm caused by illegal activity or natural disaster. Secondly, data in the cloud is backed up several times, and instantly copied to multiple servers in independent data centres. If an organisation's original data sources were somehow damaged or compromised, it is possible to access an identical copy from another location. Finally, all three of the largest cloud providers — namely Amazon Web Services, Microsoft Azure and Google Cloud — are not leaving security to chance. They have spent years strengthening and improving their security, responding to threats and improving every time they do. They are committed to security innovation in their market, and have a track record of acquiring or adopting the best practices from the most prominent cybersecurity companies, meaning their users can feel sure that their company's data is in safe hands. The digital landscape is evolving quickly, and businesses are being faced with new challenges to navigate every day. The cloud holds the answer to most of them, offering simple solutions that bring both peace of mind and huge benefits — from security and sustainability, to convenience and speed. Considering the cloud is already doing its part to solve the huge problems of climate change and cybersecurity, its functionality holds enormous promise to contribute further and solve even more of the world's most pressing issues.  ### <strong>Clarifying UK cloud adoption patterns</strong> There is no doubt that cloud migration continues to gain pace within the UK industry. What is less clear is the exact form this transition is taking; from cloud only to cloud first to fully hybrid, there are many routes to take. Recently, Leaseweb conducted a survey of 500 UK-based IT professionals and found a number of interesting trends to unpack. Here, Leaseweb MD Terry Storrar explores further. Today, especially in the wake of a pandemic which has transformed how we work, cloud platforms are taken for granted as a flexible, cost-effective and scalable way to provide companies with the tools and applications they need. However, our research has unveiled a few key surprises in how UK IT professionals evaluate, view and use cloud technologies. Cloud First versus Cloud Only There is a common preconception that 'cloud only' is the leading trend when it comes to cloud migration; however, our results show that this is not always the case. 
While cloud is indeed a key component of many IT infrastructures, 'cloud only' is neither dominant nor considered a cure-all for every IT requirement. Before the pandemic struck, a 'cloud first' strategy was the most popular response (36%), followed by a 'preference but not commitment' to use private cloud (26%). By contrast, only 19% reported a dedicated 'cloud only' approach. The pandemic changed priorities - perhaps prompted by the huge change in working arrangements and the related need for secure, flexible remote working. 'Cloud only' adherents increased to 25%, while 'cloud first' decreased to 25%; post-pandemic, 'cloud first' bounced back up to 31% while 'cloud only' remained the same. These variations are explained by the impact of the pandemic and the size of the company. The research shows smaller companies with under 500 employees were more likely to take the 'cloud only' route. While a 'cloud first' IT strategy has become more normalised since the pandemic (whereas 'cloud only' has not), the key takeaway is the search for flexibility. Weakening trust in cloud solutions  The findings show that trust in public cloud has diminished over the past two years. One primary reason for this is the challenge of controlling cloud costs. Half (49%) of respondents said they have found it difficult to gauge public cloud usage costs. And while almost three quarters (72%) of respondents said they had been able to do so, nearly half (46%) somewhat agreed it had been challenging. Migration challenges have also contributed to lower trust, with 57% finding it challenging to migrate workloads out of a public cloud environment. Infrastructure managers were most likely to have experienced challenges with migration (78%). Poor communications also factored into the overall reduction in trust levels; when almost half (49%) of respondents struggle to contact public cloud customer services, it is going to undermine the relationship.  Transitioning from legacy infrastructure The shift away from on-premise infrastructure, however, is not as pronounced as the widespread industry narrative of on-premise fully giving way to the cloud would suggest. The research suggests that while legacy infrastructure is becoming less important in the overall IT mix, it still plays an important role in many organisations' environments. While it is regularly argued that an over-reliance on legacy systems can act as a barrier to innovation and speedy deployment, the results of this survey suggest that does not appear to be the case for many organisations. Instead, companies are focused on deploying applications in the environment best suited to them, which, in some situations, is on-premise.  Nevertheless, it is clear that for on-premise infrastructure, the end is nigh – eventually. 66% of respondents agreed that the industry will see the end of legacy infrastructure within two years, with 29% 'strongly agreeing'. Considering some respondents are still making investments in on-premise infrastructure, this is a strong statement of intent. Despite the challenges related to supporting legacy technologies, only 19% of respondents said this was forcing their IT strategy to change. As a result, while not the most important component of the modern IT landscape in many businesses, legacy systems remain present as an operational element and do not act as an obstacle to future innovation. The case for flexible hybrid solutions The survey also asked how respondents would describe their organisation's ideal IT infrastructure. 
The most popular choices were private cloud only (23%), followed by a combination of on-premise and public cloud (20%). Next came public cloud only (17%) and a blend of on-premise and private cloud (14%). Trailing in last place, unsurprisingly, was on-premise only with a mere 7%. These results indicate an ongoing dependence on legacy systems for many – in tandem with cloud-based solutions. They also show that the vast majority of companies pick either the public or private cloud route and stick with it: only 8% felt a mixture of all three was optimal and just 10% opted for a combination of private and public cloud with zero on-premise. What does it all mean? By interviewing 500 people involved in IT infrastructure, from IT Managers all the way up to CIOs, we have gained valuable insight into how cloud strategies are viewed today across the industry spectrum. It is clear that the case for hybrid technology approaches is strong because they offer the flexibility and choice that companies of all sizes need in today's changing business environment. They also help mitigate unforeseen disruptions, such as a pandemic - a factor many organisations are now prioritising more than ever before.  It is equally clear that the research points to the end of new investment in on-premise-focused strategies, even though on-premise remains a critical cog in the overall IT ecosystem. Instead, a combination of 'on-premise and private cloud' or 'public and private cloud' points to the versatility the market expects. Ultimately, there is no 'one size fits all' strategy for deploying cloud services, and companies should adopt the configuration that suits them best. ### <strong>SMEs: The Move from Legacy to the Cloud</strong> Over the past ten years, the on-premise IT footprint of both large enterprises and small and medium enterprises (SMEs) has decreased dramatically. With the emergence of the cloud, even the smallest businesses are seeing the appeal of Software-as-a-Service (SaaS) applications versus server rooms, with reduced on-premise data centre costs and fewer potential headaches. As businesses begin to use more cloud-based applications, they are also realising the relative ease with which they can scale their computing power up and down to conserve costs, or scale quickly to support new functionality and lines of business. It's a new era of IT and it suits the 'work from anywhere' model we've all now become accustomed to. However, in many business environments, on-premise applications, which may be considered 'legacy,' are still in use, and for good reason. Many of these applications still perform adequately. For the employees who have been using them for many years, it really may come down to the well-known saying – 'if it ain't broke, don't fix it.' The issue with this approach is that it didn't always make the grade during the lockdown measures of the pandemic, and it does not consistently support a future-proofed, operationally resilient business model. Beyond this, many legacy applications are so complex that if the employees with experience in running and maintaining them move on or retire, businesses may find themselves without the expertise to make critical applications deliver. Still, SMEs do have legitimate concerns when it comes to moving to the cloud. In the Cloud Industry Forum's recent research paper The Transformational Impact of Cloud, 93% of UK company respondents said that the cloud is important to their company's digital transformation strategy. 
However, of these, 35% of respondents said poor integration with legacy technology is holding their organisation back when it comes to faster cloud implementation. When asked why cloud migration projects fail in their organisations, the top reason cited among respondents was ‘legacy technology wasn’t compatible.’ What can SMEs do to address this? Consider Your Approach to the Cloud and Cloud Migration If the process of re-architecting a legacy application so that it can be replicated and used identically in the cloud sounds difficult to you, you’re not alone. The ‘lift and shift’ approach to migrating technology to the cloud is in decline and for good reason. It’s difficult to manage successfully and cost-effectively. In fact, in her 2021 article 6 Ways Cloud Migration Costs Go Off the Rails, Gartner contributor Meghan Rimol goes as far as to identify lift and shift as a mistake. She states, “The best move may be to rewrite and re-release an application in a cloud-native way, or even to replace it entirely with a SaaS-based alternative. Defaulting too quickly to a rehosting approach and deferring the cost to modernise or replace critical applications can result in higher cloud operating costs after migration.” In many instances, the best approach to staying on the path to digital transformation may be to transition from a legacy application to a more modern SaaS application, and export or import your existing data. In many instances, SaaS platforms and services may bundle up suites of functionality, offering more of a one-stop-shop of technology solutions from a single dashboard. Consider Your Data Migration from Legacy to the Cloud One of the foundational benefits of cloud applications is that everyone is up to date and everyone shares the same consistent view of the data, regardless of whether they are working remotely or in the office. However, if you are moving from a legacy to a new SaaS-based solution in the cloud, the reality is that data fields and tables need to be mapped from your legacy database to your cloud database, either by your team, or your new cloud provider. If the time isn’t dedicated to cleansing and mapping data, the reality could be that those data migrations may not run smoothly, particularly if you are migrating data from multiple systems. The move from legacy to the cloud is also the ideal time to cleanse and remove unnecessary data. The importance of having good, clean data and good data governance practices cannot be underestimated. Poor data management in the move from legacy to the cloud can be the root cause of a failed migration, and getting data experts in to accompany a legacy-to-cloud migration may be a good investment. Take the Time to Plan Your Journey from Legacy to the Cloud In the Cloud Industry Forum’s The Transformational Impact of Cloud research, when asked what the top priority in choosing a cloud provider was, over half of respondents selected ‘price’ as their top priority. Price will always be a significant factor, however, the reality is that if businesses choose cloud services and providers only on price, they are setting themselves up to fail. SMEs, in particular, may not have extensive in-house cloud IT management skills, so it is critically important that they choose a provider that specialises in providing services and solutions specific to their sector. The research backs this up further when we look at the other reasons that cloud projects fail (beyond legacy software compatibility.) 
‘Resource and manpower,’ budget, ‘we didn’t test it thoroughly enough,’ ‘we scaled up too quickly,’ ‘lack of planning,’ ‘we chose the wrong cloud platform,’ – these are all reasons cited in the research as to why cloud projects have historically proved to be a challenge or have failed. The reality is that moving from legacy applications to the cloud is a commonplace part of ongoing digital transformation efforts, and it can be managed with the right expertise to hand. While price will always be a consideration in moving to the cloud, it should not be the top differentiator in a business's ultimate cloud partner selection. A Foundation to Increase Cloud Adoption Digital transformation is clearly a mainstream strategy, and the cloud has certainly shown its value in recent times more than ever before. The move from legacy applications to the cloud may be a concern for many SMEs and we understand that. As a business, we've been on a transformational journey for the last 19 years. We are now a digital business and we are well on our way to becoming an expert solutions company, and we understand the challenges our customers face. However, if SMEs can approach the cloud with strategy at the forefront of their minds, they will be well placed to increase cloud adoption over time and see continued benefits and return on investment. ### <strong>Don't lose sight of SAP on Cloud operational excellence</strong> Digital transformation projects can often become complex, with twists and turns that lead organisations to focus solely on the migration itself. This is a problem, as attention is then diverted from the operating model, both in the planning phase and on completion of the project or migration stage.  To avoid this, businesses need to take an approach that is built on delivering operational excellence for SAP on Cloud. This approach should rest on four key pillars: reducing costs, removing complexity, increasing flexibility, and improving security.  Automation lies at the heart of this approach, enabling teams to achieve operational excellence without suffering from the operational burden. For some organisations, this means handing over the keys to their service provider and placing greater emphasis on achieving the standards of the SLA (service level agreement). But for others, being involved in the automation strategy is an opportunity for change and agility, and therefore they want to be in the driver's seat.  Addressing the service catalogues  The IT team is responsible for executing a significant number of services, all of which should be defined in the service catalogue. These include installing new SAP systems and extending and patching existing ones. Altogether, these tasks classically require an enormous amount of manual effort from the SAP technical teams. In order to make a change, the IT team first raises a change request, then provides all the data needed to make the change, works out the impact on cost and security, and goes through multiple levels of approval, before finally getting to the point where the change can be implemented.  For the majority of customers with an on-premises model, this would be done using manual steps and hard-coded documents. However, things can be different when you're on the cloud - this is where automation comes into its own.   
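As a rough illustration of what that automation can look like, the sketch below raises a change request against a generic ITSM REST endpoint. The URL, token, payload fields and response shape are hypothetical placeholders rather than any specific vendor's API.

```python
# Hedged sketch: automatically raise an ITSM change request for an SAP
# resize operation. The endpoint, token and payload fields are placeholders.
import requests

ITSM_URL = "https://itsm.example.com/api/change-requests"  # hypothetical endpoint
API_TOKEN = "replace-me"  # never hard-code real credentials

def raise_change_request(system_id: str, current_size: str, target_size: str) -> str:
    payload = {
        "summary": f"Resize {system_id} from {current_size} to {target_size}",
        "category": "SAP on Cloud",
        "impact_assessment": {
            "estimated_monthly_cost_delta": "filled in by automation",
            "security_review_required": True,
        },
    }
    response = requests.post(
        ITSM_URL,
        json=payload,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()["id"]  # assumed response shape

if __name__ == "__main__":
    print(raise_change_request("HANA-PRD-01", "2TB", "4TB"))
```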
For example, if a business wishes to change their SAP HANA database from a 2TB to a 4TB size, automation can create a change request in your ITSM systems, and provide details on how the development would impact security and cost, saving the team huge amounts of time and with more accurate information. This extra detail becomes the basis for the technical change itself – the automation can assess and execute the change faster and with more accuracy, avoiding the risk of human error or having to invest more time and effort when mistakes are made.  Automation helps reduce the technical effort within the IT process, allowing teams to focus on more value-add activities. Everything within the service catalogue can be prioritised, starting with the most critical, or the most expensive, or the quick wins – whatever the focus is for the business.  A common example of when automation delivers true value, is when teams need to access old SAP systems in archive mode. Every so often, someone needs to log on to query old purchase orders or bank records, ideally without having to manually start and stop a system – which could take days to complete if requested and executed manually. Instead, if the business uploads archived systems onto the Cloud, then with automation available through a start-stop button, they can start the system on-demand, conduct their query, and then shut it down at any given moment – saving costs, reducing wait times and eradicating unnecessary complexities.  Streamlining and improving SAP systems Automation also streamlines the process of taking data from one system and transferring to another – known as system refresh. There are several steps in this process beyond the obvious back-up, restore and pre-copy stages – automation helps export user lists and other items, bringing data back in and reloading the system. This process is different for every business and has long been a challenge for SAP customers. Removing the manual steps with automation, which can facilitate custom changes to fit each environment, means no system is deemed unsuitable.  When used to refresh Test Environments, a test cycle will gain more up-to-date data, which will facilitate higher quality testing and help avoid potential defects. For a Prototyping use-case, being able to try out an idea on a copy of the production system in an isolated environment creates a real opportunity for innovation. With automation, teams can assess how projects are driving innovation and track the outcomes. It gives a structure to innovating around SAP without incurring huge costs (which has always been a challenge in the past) inspiring a culture of change across the team.  These are all great examples of how automation essentially brings innovation to life.  Changing your mindset when it comes to security One of the realities of Cloud is that there is a greater number of moving parts than on-premises environments. This, combined with it being in the public domain, makes for a much more complex security environment.  While the tools offered by hyperscalers are equally as sophisticated as the challenges business face, the complexities often overwhelm teams when it comes to day-to-day management. On public Cloud, privilege access management is a mandatory rule, which means establishing fine-grain definitions across a broad number of security objects. 
The amount of work required to design, build, and maintain the security environment is much greater than on-premises, and teams often fail to realise the importance of automation when it comes to scaling operations post-migration.  Having security ingrained in your way of working is of paramount importance, it is a mindset that you need to adopt. Move your thinking away from the old ‘moat and bridge’ paradigm in data centres, because nowadays there is far more than one way in and out. In Cloud, everything is public until it gets locked down.  Putting the user first It’s easy to tell when an organisation has implemented automation in their operations post-migration by simply looking at user experience. If starting up an SAP system takes days from the request rather than hours, then they’re not automated.  The Uber app is a perfect example of how automation revolutionises a service and prioritises efficiency and customer experience. The company analysed the process of ordering a taxi to get from A to B and removed all possible manual steps. Beforehand, a customer would have to research a taxi company, call them up, provide their location (as accurately as possible), confirm the destination address, be given a quote or started on a meter (so the cost isn’t confirmed until the end of the journey), hand over the cash, and wait for the driver to write up a receipt. Uber has taken each of these steps and automated them through the app.  This is the result of automation being weaved into an operating model from start to finish, or what many people refer to as ‘digitalisation’. Driving innovation through Cloud operation If we go back to the whole reason for most digital transformation projects, it is to effectively future-proof business operations and ensure they are resilient enough to withstand the challenges that may lie ahead. CIOs need to know that their operations are getting the best attention possible, and they need real-time access to dashboards, data and APIs so they can continue to innovate their systems effectively beyond simply operating them on hyperscale Cloud.  ### <strong>Need to reduce software TCO? Focus on people</strong> https://vimeo.com/775107328/da41ceb5da Today's blog summary Investing in software is undoubtedly important for enterprises to stay ahead. However, the process is rarely a simple task for CIOs and IT leaders. The current challenges facing the global economy mean it’s now essential that digitalisation efforts reduce costs and provide a return on investment (ROI) as fast and efficiently as possible. When looking at the impact of digital initiatives, managing cost effectively seems to be a major challenge. A joint study from Oxford University and McKinsey revealed that 66% of large-scale software projects will run over budget and 33% will run behind schedule.  Therefore it is key to look at the total cost of ownership (TCO) on software holistically and identify areas that are challenging to manage. When speaking about TCO, we’re referring to the aggregate cost of a given software product, taking both direct and indirect costs into account. Direct cost is simple to identify: the purchase of the software licence itself or the subscription to a SaaS product. Expenses like software training or ongoing IT support are examples of indirect costs. It’s typically the case that indirect costs are harder to identify, measure, and control – and yet they can quickly accumulate if left unaddressed.  
Most indirect costs of any software project involve - in one way or the other - training and supporting people. Failing to focus on people invariably leads to another key challenge which drives up software TCO unnecessarily: poor digital adoption. In the end, people have to work effectively with new technologies and failing to enable them in the right way can lead to massive support costs and slow down key projects.  This challenge has been further intensified by changes to the world of work. The workplace is increasingly digital and people tend to spend a significant amount of their time working from home, somewhat isolated from their co-workers. Such working arrangements will result in stronger reliance on software to keep operations running smoothly and – crucially – to keep employees engaged with their work.  With this in mind, it’s important for decision makers to embrace a new way of thinking about software TCO reduction – one which prioritises strong digital adoption.  How poor digital adoption affects TCO Poor digital adoption is perhaps the single biggest indirect cost responsible for driving up software TCO. Businesses frequently fail to recognise that successfully implementing a new software product is only half the battle; they must also train employees to use it to maximise its potential benefits. While initial training is commonplace, decision makers often fail to consider whether or not employees actually understood the technology, leading to the need to support and re-train them on a regular basis. Not to mention Ebbinghaus's Forgetting Curve, which stipulates that the average learner will forget 90% of what they have learned within the first seven days. This has a seismic effect on overall TCO, with software training costing UK businesses, on average, £2,086 per employee each year, according to Userlane research.  Conventional training methods are also difficult to use at scale. The number of applications that employees are required to use in their daily work is increasing. Simultaneously, the rise of hybrid working arrangements is making it more challenging to organise effective in-office and virtual training sessions.   One consequence is that IT departments are facing severe ticketing backlogs with, according to Zendesk, an average cost-per-ticket of £13. These tickets can quickly add up and dramatically affect the TCO, especially when you think about the thousands of employees who are struggling with software every day. Alternatively, an employee may turn to a peer for guidance on a task or function they are unsure of. This hampers productivity further, as it takes people an average of 30 minutes to refocus on their tasks after even a simple distraction. This is obviously bad news from an ROI perspective. Users lack the training needed to fully take advantage of the software and are lodging complaints en masse, many of which are easy fixes to someone in the know.  Lack of knowledge also translates into massive under-utilisation of software features. In fact, the average employee uses just 40% of a software’s total features – a considerable waste of resources from an investment point of view. Improper use of software will also lead to poorer process quality – if employees do not fully know the software, they are more prone to making errors.  Addressing these challenges  Left unchanged, this pattern of poor digital adoption will lead to avoidable increases in costs for CIOs and IT leaders. Thankfully, there are solutions to these problems.  
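To put the scale of the problem in perspective before turning to those solutions, here is a hedged back-of-envelope calculation using the averages cited above; the headcount and ticket volume are illustrative assumptions, not survey findings.

```python
# Hedged back-of-envelope: indirect cost of poor digital adoption, using the
# averages quoted above (£2,086 training per employee per year, £13 per ticket).
# Headcount and ticket volume are illustrative assumptions, not survey data.
EMPLOYEES = 1_000
TRAINING_COST_PER_EMPLOYEE = 2_086   # GBP per year
TICKETS_PER_EMPLOYEE_PER_YEAR = 12   # assumption
COST_PER_TICKET = 13                 # GBP

training_cost = EMPLOYEES * TRAINING_COST_PER_EMPLOYEE
ticket_cost = EMPLOYEES * TICKETS_PER_EMPLOYEE_PER_YEAR * COST_PER_TICKET

print(f"Annual training cost:   £{training_cost:,}")                 # £2,086,000
print(f"Annual ticket cost:     £{ticket_cost:,}")                   # £156,000
print(f"Indirect cost subtotal: £{training_cost + ticket_cost:,}")   # £2,242,000
```

Even under these deliberately modest assumptions, the indirect costs run into the millions for a mid-sized workforce, which is exactly the kind of drag on TCO the following approaches aim to remove.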
Implementing a digital adoption platform (DAP) is a highly effective way to reduce software TCO. These platforms can eliminate onboarding and training costs by supporting employees to independently learn how to use their applications.  The DAP provides the user with continuous, interactive guides embedded as an overlay on the software itself, empowering them to “learn while doing”. Employees also become more self-sufficient as they are not reliant on static training materials and classes to make full use of software. A DAP also reduces the burden on internal support resources (human, technological and financial). Furthermore, the decentralised, on-demand nature of training through a DAP makes it far more scalable and cost-effective than regular classroom-style training.  Importantly, a DAP also has a clear positive impact on software ROI. Whereas previously a typical user would leverage less than half of a program’s features, the on-demand training and support offered by DAPs allows them to tap into the full set of advanced capabilities. A DAP also ensures that employees comply with SOPs (standard operating procedures), an important step to ensure process quality is consistent. Overall, it makes the purchase of, or subscription to, a piece of software a much more valuable investment.  In turn, these changes lead to higher productivity and a more forward-thinking culture. Giving users a pathway to mastering a software will instil within them the confidence to take on other applications – this in turn makes them more receptive to digital change.  Lower costs, higher productivity  Even when firms do take the need for continuous learning into account, the repeat cost of traditional training methods causes the software’s TCO to sharply increase. Using DAPs, employees can learn on-demand and receive guidance through every process the program offers – in real time.  This modern approach to training frees up capital and human resources, meaning no further drags on productivity and more funds to invest in further digital transformation. Most importantly, the ability to ensure a consistently a high level of digital adoption allows firms to stay ahead of the curve and truly embrace new technologies. ### <strong>The future of cloud and edge optimisation</strong> https://vimeo.com/772032499/67fe104931 Today's blog summary The big names in public cloud have introduced businesses across the world to data-driven innovation and new levels of operational agility. It is why cloud growth has been so rapid and why, unfortunately, cost is now becoming a major focus for many enterprises using the major hyperscalers. Hybrid working practices and the necessity of employing SaaS applications in everyday conduct of business continue to push many organisations into the cloud. The typical enterprise may now use as many as 1,200 different clouds, including familiar applications such as Microsoft Exchange or Salesforce. Virtualisation and containerisation also continue to make the big hyperscalers more attractive as platforms. Gartner predicts that 95% of new digital workloads will be on cloud-native platforms by 2025, compared with 30% last year.  Alongside their expanding use of cloud environments, enterprises must now prepare for the next big infrastructure event – the advent of edge computing, 5G, and the adoption of AI and machine learning applications.  
Cost problems emerge

As cloud deployments rise, control over costs has become an ever more pressing topic, with even The Economist asking whether what enterprises spend on the services of Google, Microsoft, AWS and Alibaba is fully justified in business terms. Along with resource flexibility, innovation and agility, the cloud should hand enterprises great economies of scale. This has always been a major selling point. Increasingly, however, many companies find they lose control of cost. It becomes difficult to predict requirements or optimise storage and computing when faced with the complexity of major vendors' charging structures. Over-provisioning is frequent, and data egress charges often present unpleasant surprises.

Controlling cloud costs has become a business necessity, especially as hybrid infrastructure becomes increasingly common. Market analysts only see hybrid cloud infrastructure continuing to grow rapidly, meaning more organisations are going to run into cost and management problems. MarketsandMarkets predicts demand will expand at a compound annual growth rate of 17% through 2023.

But hybrid cloud is also more complex, stretching across several cloud vendors and an on-premises data centre. Even in accountancy and financial services, many organisations are compelled to adopt complex hybrid infrastructure because their business-critical applications are not cloud-compatible. The Flexera 2021 State of the Cloud Report found that 92% of enterprises use multiple clouds, and 82% have adopted hybrid infrastructure. But respondents also estimate they waste as much as 30% of their cloud expenditure – galling for companies that pride themselves on efficiency.

When fully optimised, a hybrid cloud strategy should allow businesses to run revenue-generating applications while maintaining older, more traditional workloads cost-effectively. In practice, hybrid infrastructure spirals out of control unless companies have access to efficient and user-friendly management tools or platforms.

Poor visibility racks up costs and misalignments

Lack of visibility is much to blame. In the Thales 2021 Data Threat Report, only 24% of responding organisations said they fully knew where their data was stored. This opacity of cloud environments leads to difficulties in shaping cloud services and forecasting their cost. Organisations soon realise that right-sizing cloud deployments for maximum efficiency is very challenging, especially where IT departments are short on experience. The "lift-and-shift" approach often leads to misalignment of workloads. To squeeze full value from each dollar spent in the cloud, organisations should run as close to their configured performance limits as they can without undermining application performance. Yet in the rough-and-tumble of daily management, resources provisioned for a particular purpose frequently remain live longer than required, with unnecessary costs quickly mounting.

What organisations need for control

Because costs are about usage over a billing period, businesses should be monitoring exactly where their monthly expenditure is going. Cloud providers should offer the ability to analyse monthly invoices by resource type, subscription and deployment location. In a collaborative process, cloud providers should offer a choice of vendors and services to aid with cloud cost management. They should also offer tools that allow engineers to produce cost optimisation reports, identifying the most suitable environment for each service.
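As a minimal illustration of that kind of visibility, the sketch below groups a month's invoice lines by resource type and subscription and flags low-utilisation resources as right-sizing candidates. The invoice records, field names and the 20% utilisation threshold are assumptions made for the example rather than any provider's actual billing format.

```python
# Minimal sketch: grouping a month's cloud invoice lines by resource type and
# subscription, and flagging low-utilisation resources as right-sizing
# candidates. All records, field names and thresholds are illustrative.

from collections import defaultdict

invoice_lines = [
    {"resource": "vm-app-01", "type": "compute", "subscription": "prod", "cost": 420.0, "avg_utilisation": 0.12},
    {"resource": "vm-app-02", "type": "compute", "subscription": "prod", "cost": 380.0, "avg_utilisation": 0.71},
    {"resource": "blob-store", "type": "storage", "subscription": "prod", "cost": 150.0, "avg_utilisation": 0.55},
    {"resource": "vm-test-01", "type": "compute", "subscription": "test", "cost": 210.0, "avg_utilisation": 0.05},
]

# Roll spend up by (resource type, subscription) for the monthly report
by_type_and_sub = defaultdict(float)
for line in invoice_lines:
    by_type_and_sub[(line["type"], line["subscription"])] += line["cost"]

print("Monthly spend by resource type and subscription:")
for (rtype, sub), cost in sorted(by_type_and_sub.items()):
    print(f"  {rtype:8s} / {sub:5s}: £{cost:,.2f}")

UTILISATION_THRESHOLD = 0.20  # assumed cut-off for "probably over-provisioned"
candidates = [l for l in invoice_lines if l["avg_utilisation"] < UTILISATION_THRESHOLD]

print("\nRight-sizing candidates:")
for line in candidates:
    print(f"  {line['resource']} ({line['avg_utilisation']:.0%} utilised, £{line['cost']:,.2f}/month)")
```

A real cost-management platform would pull this data automatically across every subscription and cloud, but the principle is the same: spend becomes explainable once it is broken down by resource, owner and actual usage.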
Controlling costs is also a matter of strategy. Certain services are simply not ready for the cloud, so organisations should focus their attention and budgets on areas from which they can derive the greatest value.  An optimised cloud purchasing strategy should adhere to the key principles of discover, plan, implement and decommission, aligning to a cloud adoption and transformation framework. By these means, the framework will deliver continuous improvements, while the ability to use new technologies will save costs and ensure better value for money.  Current hybrid cloud tools With hybrid infrastructure now so important for business success and innovation, enterprises also need access to next-generation cloud management platforms. The hyperscale cloud providers already supply tools for hybrid cloud management, such as Google Anthos. This is significant, given the intense competition between hyperscalers, but it is questionable how vendor-independent such platforms are.  The hyperscalers are also falling behind on edge computing, which is emerging as the next major advance in enterprise IT infrastructure. Edge computing should remove the disadvantages of location, delivering faster, low latency responses through the roll-out of 5G and importantly, by shifting some data and workloads to regional data centres, close to where businesses want to use them. This is the foundation for AI-driven automation, the industrial internet of things (IoT), self-driving vehicles, remote diagnostics, and treatment in healthcare and fast streaming for gamers. A more comprehensive platform These highly significant advances are why organisations should be using cloud management platforms that encompass edge computing. A comprehensive management platform will optimise all current cloud deployments and connectivity, enabling expanded use of IaaS, PaaS and SaaS applications while preparing for the edge.  These next-generation management platforms deliver a complete view of a company’s hybrid and edge infrastructure from a single screen, enabling organisations to reduce unnecessary costs and ensure the best possible performance wherever assets are. They can view monthly expenditure, with invoices available by resource type. These more sophisticated tools assess cloud workloads and provide comprehensive asset discovery, usage reporting and dependence mapping.  Enterprises benefit from a consistent, all-round, and current view of costs, enabling them to devise and implement their cloud strategy that includes on-premises and edge environments. They can move and adapt workloads for maximum operational efficiency and cost-effectiveness, achieving a tighter alignment of demands and resources through real-time visibility. Analytical capabilities give companies insight into the ways they use cloud and edge resources, enabling them to constantly improve efficiency. As more enterprises use multi-cloud and hybrid infrastructures, the danger of cost overruns and loss of control increases.  A new set of tools and the development of next-generation management platforms, however, means they can fully optimise not only the entirety of their cloud environments but also their future edge deployments, ready to take full advantage of every potential saving, performance-enhancing solution, and significant innovation. The alternative is to remain stuck with mounting bills and loss of agility as competitors streak ahead. 
### <strong>Here is how to stage a public cloud migration</strong>

https://vimeo.com/771485032/ff761608f6

Today's blog summary

The pandemic has accelerated the digital transformation of communication service providers (CSPs) in many ways. In this article, we look at the current state of CSP cloud migration and provide insights into what steps CTOs should consider when migrating their BSS functions to the cloud with telco-grade reliability. BSS functions include customer care, self-care, billing and product cataloguing. And it is precisely these functions that are essential to turning CSPs' assets into revenues. BSS technology is decades old and runs on complex applications with proprietary integrations that leave little room for innovation; however, things are evolving.

Defining Cloud-Native BSS

Cloud migration is the process of moving data, applications or other workloads to a cloud-computing environment. One common model is to transfer data and applications from a local on-premises data centre to a public, hybrid or private cloud. Cloud-native BSS means migrating BSS applications to the cloud in a way that meets a number of criteria:

- The application adapts to the cloud's underlying infrastructure to maximise the flexibility of future cloud developments
- Software components are containerised, which enables faster testing and integrations
- Applications remain isolated from other components without service interruption
- Application logic and the database remain separated to ensure service continuity in case of failures
- Orchestration and automation are optimised as much as possible

Current State of Telco Cloud Migration

In its annual digital transformation tracker, TM Forum reveals that operators' commitment to a cloud-native approach for IT workloads has strengthened over the past year; nearly a third of CSP respondents confirmed their companies' commitment — up from 22% in 2021. CSPs are no longer as willing to remain locked into relationships with network equipment and OSS/BSS providers as they have been in the past.

Adoption of cloud-native technology tops the list of priorities among CSPs, and CSPs recognise the benefits that cloud-native technology offers. These benefits include agility, scalability, faster and easier introduction of new services, increased customer personalisation and lower operating costs. TM Forum indicates that most operators have started with cloud-native BSS, such as systems of engagement and charging systems. However, CSPs are still worried that a cloud-native platform is not suitable for all functions, because concerns about security and privacy persist. Data sovereignty is also an impediment to moving all workloads to the cloud.

Key Stages for Cloud-Native BSS Migration

Many CSPs are already deploying microservices. However, they need to manage practical issues when scaling and adding more complex functions. The following stages should be considered by CSPs wanting to leverage their in-place cloud infrastructure to start their cloud-native migration.

Adopt "shift-left" for new microservices

As CSPs adopt and scale microservices across multiple cloud environments in various parts of their businesses, managing these environments becomes more complex and difficult. One widely accepted approach to operating applications built on microservices is "shift left".
The principle of shift left is to take a task that is traditionally done at a later stage of the process and perform that task at earlier stages, thus "spreading" the task across all stages of the process flow. This reduces risks and costs by enabling early problem resolution and greater automation. In addition, by adopting a shift-left approach to rolling out new microservices with a minimum viable product (MVP) and test philosophy, CSPs can reduce their time to market and time to repair, resulting in a greater net promoter score (NPS).

Telco-grade cloud-native architecture

When a new customer purchases a SIM card and subscribes to a mobile plan, a complex workflow is initiated. The process involves:

- checking payment authorisation
- provisioning the CRM
- creating a billing account
- provisioning all telecommunications systems (authorisation, authentication, HSS/HLR, cost control system, policy control)
- checking the smartphone stock
- ordering the logistics
- ordering the delivery

In old proprietary BSS platforms, that flow can run into various problems because the steps are not integrated with APIs and workloads are isolated from each other. If one program fails, it affects the whole flow, because the components are not containerised and are not based on a standard. Moving that flow to the cloud enables all tasks to be managed centrally as microservices. Microservices running in containers are easier to test, launch and update without causing issues with other applications (a simplified sketch of such a flow appears at the end of this article). From an operational perspective, CSPs will be able to automate more and deliver faster, more personalised BSS services. However, providing telco-grade services with "five nines" reliability and availability requires special telco skills not available "out of the box" from public cloud providers. CSPs will need guidance from their software vendor to ensure a successful cloud migration.

Move to DevOps approach

DevOps is a set of practices that combines software development and IT operations. It aims to shorten the systems development life cycle and provide continuous delivery with high software quality. The major features which CSPs should be looking for in cloud-native BSS include: a decomposed architecture based on site reliability engineering (SRE) methodology, CI/CD, open source, access to data for enhanced analytics and AI, and high levels of automation. CI/CD introduces ongoing automation and continuous monitoring throughout the lifecycle of apps — from integration and testing phases to delivery and deployment. CSPs should adopt an approach of leveraging the add-ons available instead of customising every piece of software.

Leverage cloud skills

CSPs face fierce competition for software and digital skills, some of which are in short supply. People with these skills often find internet and cloud businesses more attractive and lucrative than telcos, which can be perceived as old-fashioned. For most CSPs, however, this shortage means they have a big challenge ahead with onboarding enough cloud engineering staff and getting them into senior positions to deliver change. As a result, CSPs should leverage the cloud skills of their hyperscaler partners while developing their own minimum set of cloud skills. As the relationships between CSPs and cloud providers deepen, CSPs need to develop a clear strategy on how they add value to customer relationships.
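As a concrete (and deliberately simplified) illustration of the telco-grade flow described above, the sketch below runs the SIM-activation steps as isolated units, so a failure in one step is retried and reported rather than silently breaking the whole order. The function names and retry policy are hypothetical stand-ins, not any vendor's BSS API.

```python
# Hypothetical sketch of the SIM-activation flow described earlier, decomposed
# into isolated steps the way a containerised, microservice-based BSS would run
# them. Step names mirror the article; the implementations are stubs.

from typing import Callable, List

def check_payment_authorisation(order): order["payment"] = "authorised"
def provision_crm(order):               order["crm"] = "provisioned"
def create_billing_account(order):      order["billing_account"] = "created"
def provision_telco_systems(order):     order["hss_hlr"] = "provisioned"
def check_smartphone_stock(order):      order["stock"] = "reserved"
def order_logistics(order):             order["logistics"] = "ordered"
def order_delivery(order):              order["delivery"] = "scheduled"

STEPS: List[Callable] = [
    check_payment_authorisation, provision_crm, create_billing_account,
    provision_telco_systems, check_smartphone_stock, order_logistics, order_delivery,
]

def activate_subscription(order: dict, max_retries: int = 2) -> dict:
    """Run each step in isolation; retry a failed step without re-running the rest."""
    for step in STEPS:
        for attempt in range(1, max_retries + 2):
            try:
                step(order)
                break
            except Exception as exc:  # a failing step is contained, not fatal to the flow
                if attempt > max_retries:
                    order.setdefault("failed_steps", []).append(step.__name__)
                    print(f"{step.__name__} failed after {attempt} attempts: {exc}")
    return order

if __name__ == "__main__":
    print(activate_subscription({"msisdn": "+44 7700 900000", "plan": "unlimited-5g"}))
```

In a production BSS, each of those stubs would be its own containerised service behind an API, and the orchestration, retries and compensation logic would be handled by the platform rather than a single script; the point is simply that isolating the steps keeps one failure from taking the whole order down.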
### <strong>The future of work is collaborative</strong>

https://vimeo.com/771100140/ad7c01f913

Digital transformation is reshaping every aspect of the modern workplace. With hybrid working now the norm, the traditional workspace has had to radically change in response to employees' needs and wants. Gone are the days of rigid social and physical structures: companies are now experimenting with more innovative, adaptable designs centred on flexibility and teamwork. But what does this mean for the future of work – and for the workforce itself?

As a new generation of digital natives joins the workforce in ever-growing numbers, they are bringing a new set of expectations about how people should connect, communicate, and collaborate. A recent study found that over a third of Millennials and Generation Z employees rank the ability to both focus and collaborate in their workspace as the top workplace attribute that enables them to do their best work. The signs are increasingly pointing towards collaboration playing a critical role in the workplace of tomorrow. Therefore, organisations should consider the tools and technologies needed to stay ahead of the curve and successfully prepare for the collaborative future of work.

The case for collaboration

Collaborative teamwork has always been important. Even back in prehistoric times, when we faced a very perilous world, our ability to work well in teams elevated us over stronger, faster creatures. Humans have long understood the need to know their place in a team and their responsibility in keeping their group (or tribe) together, alive, and thriving. But fast-forward thousands of years and collaboration has taken on a whole new dimension in the modern workplace. As organisations become ever more global and complex, and the post-covid hybrid world continues to evolve, good communication amongst distributed teams has become a requisite for business success.

In order to survive in an increasingly challenging and competitive world, organisations must have the flexibility to provide new, innovative ways for their employees to collaborate productively from any place, time, or device. To do this successfully, they will need the right tools. Video conferencing software like Zoom and Teams has become part of everyday working life for many businesses, but a plethora of other collaboration tools are also coming to the fore to help employees thrive in their workplace ecosystem.

Preparing for the 'collaboration generation'

For the new generation entering the workforce, these innovations in collaborative tech cannot come soon enough. As digital natives who have grown up completely surrounded by computers, tablets, smartphones, and other digital devices, Gen Z in particular will expect any company they work for to incorporate emerging social and digital technologies into their operations. The new generation does not value technology for technology's sake. Instead, it is a key enabler of two core workplace values: flexibility and collaboration. To have the best balance of their working and personal lives, Gen Z wants the option to determine exactly where, when, and how they want to work. Furthermore, younger employees are natural networkers and desire a variety of tools to help them both focus and collaborate with others. Having grown up using social media and instant messaging, Gen Z understands the value of collaborative tools and now wants them to be incorporated into the workplace.

Organisations therefore need to adapt and evolve their workplace practices to successfully recruit, retain, and engage a new generation of talent. So, how can they go about achieving this?
Bridging the hybrid gap with technology Firstly, organisations must respond to the changed (but still highly important) role of the office. Despite early predictions of the office’s demise post-pandemic, these spaces still play an essential role in workforce collaboration, connection, and camaraderie. But it is not enough for employers to simply leave the office lights on. Careful consideration must be taken of how these offices essentially ‘work’. Many employees are already finding that their homes are now smarter than their offices, with more intuitive and connected technology. This disparity can impact both their productivity and happiness, as well as hampering opportunities for collaborative working. As the hybrid model continues to take shape, more forward-thinking businesses are redefining their offices to meet evolving employee expectations, putting interactive technology in place to allow for more seamless transitions between physical and virtual settings. Businesses now have the opportunity to redesign their office space for a more collaborative hybrid future. How the office looks will vary from organisation to organisation, but employers may opt to combine meeting rooms and working pods with breakout spaces to promote both scheduled and unscheduled teamwork. Most importantly, rooms should be fitted with technologies that allow for the right blend of physical and virtual participation. For example, organisations can invest in a ‘media suite’, complete with microphones, cameras, and other multimedia devices to help ensure a high-quality audio-visual experience for those working remotely, essentially making them feel part of the office environment even if they are dialling in from miles away. In fact, there are many tools out there to support productivity and collaboration in hybrid working spaces. From interactive whiteboards, which blend engagement and participation between the physical and virtual spheres, to visual conferencing technology, businesses can begin today to bring these tools together to provide a truly collaborative ‘smart’ workplace to attract, engage, and inspire employees. The right technology has the power to bridge the gap between office-based working and remote working, ensuring that everybody feels included in a shared environment. It is only when these barriers come down that the most creative and fruitful collaborative work will take shape. Looking beyond the technology However, it is not enough to simply implement the technology and wait for the magic to happen. The key to securing a collaborative hybrid workplace is to ensure that all digital tools are seamlessly working from day one to the end of their lifecycle. This encompasses remote monitoring and maintenance work – ongoing projects that overstretched IT teams may need additional support with. Essentially, the technology needs to be ‘managed’: it is this managed element that is just as important as – if not more than – the technology itself. To ensure that employees consistently get the most out of the collaborative tools at their disposal, increasing numbers of organisations are looking towards trusted managed service providers (MSPs) for end-to-end integrated support. But regardless of whether a business outsources its managed tech or keeps it in-house, the link between technology and collaboration in the workplace is one that must be championed and nurtured from the C-suite down. 
As hybrid work models continue to gain traction, businesses will need to start implementing collaborative tools and processes to meet the needs and expectations of the upcoming workforce, seamlessly integrating them into existing workflows to enhance productivity and performance. Innovations in technology, including AI and machine learning, mean that organisations are in a better position than ever to shape the collaborative future of work – and with the right support in place, they can ensure that these digital tools continue to bring out the best in their workforce for years to come.

### How Business Data Can Be Protected, Even with Remote Workers

https://vimeo.com/770718113/c8d1ca2992

According to a study conducted by OwlLabs, approximately 69% of survey respondents worked remotely during the pandemic or have continued to work from home since. As remote work becomes increasingly commonplace, businesses and security experts are also required to become more innovative than ever, especially when it comes to protecting business data and personal information. Along with the rise of remote work comes a rise in cybercrime. Ranging from identity theft and phishing attacks to data hijacking, cybercrimes can result in the release of sensitive and classified information, which can cost millions of dollars or result in bankruptcy.

Knowing how business data can be protected, even with remote workers helping to run your company, is essential in today's day and age. When you are aware of potential scams, risks, and proper security measures to take online and at work, you can minimise the risk of your business becoming a target of hackers and online thieves.

Implement a Company Cybersecurity Policy

As a business owner, one of the first steps you can take to improve security for your company is to implement a company-wide cybersecurity policy. A cybersecurity policy can help to minimise the risk of a data breach or of having personal and sensitive information exposed to potential hackers or digital thieves. With a company-wide cybersecurity policy, you can integrate the following into the workplace:

- Upholding of brand and reputation: Guidance for upholding your brand's image can help reduce the risk of exposing potentially sensitive or classified information to those who do not have permission.
- Regulatory compliance: Depending on the industry your company represents, incorporate regulatory guidelines, rules, and restrictions directly into your company's cybersecurity policy. Abiding by local, regional, national, and digital compliance requirements can also prevent you from putting your business at risk while reducing your organisation's chances of becoming a high-risk target. When a business is operating outside of regulations or with a high-risk business model, it is much more likely to attract potential hackers or thieves who may have some form of leverage against the business owner themselves.
- Productivity: If productivity is a top priority for you as a business owner, utilising a cybersecurity policy can minimise distractions in the workplace, especially when it comes to employees using tech or browsing the internet for personal reasons.

Use Firewalls, Antivirus Software and Anti-Malware, and other Technology

Incorporating updated firewalls, antivirus software, and anti-malware technologies is imperative for most businesses today, including those that hire workers who are working remotely.
With the right software and security solutions, streamline the work efforts of your employees without fearing potential security breaches or hacking attempts, even when your employees are out of the office. Choosing the right firewall protection, antivirus software, and anti-malware programs can feel confusing and overwhelming if you are unsure of where to begin. When you need to protect all of your business data, even with remote workers in the picture, consider an Endpoint security solution. EndpointSecure, a comprehensive and unified security solution from Allot, can be combined with their other security solutions, such as: NetworkSecure: NetworkSecure is designed to protect any customers that are connected to your website or network, regardless of the devices used. BusinessSecure: BusinessSecure from EndpointSecure is utilised for end-user devices, in the office of your business, as well as with smart appliances and to secure a range of IoT devices. Two-Factor Authentication Two-factor authentication is a method of managing access to a particular platform or login using two forms of identity in order to complete the verification process. Using two-factor authentication in your own business model for your remote workers is highly advisable to minimise the risks of account hacks and data breaches. Strong and Varied Passwords Along with implementing a company-wide cybersecurity policy, require all remote workers to use strong and varied passwords for their personal work logins. Recommend that employees do not use a password that they have used on any other website, from their bank to social media platforms such as Facebook. Request that employees use a combination of upper and lowercase letters, numbers, and at least one special character (!, *, @, etc.) Use a Password Manager Install and use a password manager on all of your company's computers. Password managers can include two-factor authentication or even fingerprint logins in order for users to gain access to a particular program or an entire computer. For remote workers, you can require that their work computers (desktops or laptops) include the preferred password manager. This will help to add an additional layer of security to the computers that are being used daily to access business data and important documents. Ensure You Use a Secure Network Whenever you access the internet, it is best to do so on a secured network, especially if you are logging into any website or application with your private credentials. As a business owner, ensure that your remote workers are always accessing your business portals and their workspace with a private and secured network. Provide your remote workers with the following tips and guidelines to ensure they remain as safe and secure whenever they are working while connected to the internet: Use a secured private network: Always connect to a privately owned internet connection when accessing business data or important information. Ensure that your internet connection's password is strong, varied, and encrypted. Avoid public connections: Prohibit your remote workers from connecting to any public wireless internet connection in order to work. If they prefer to work while on the go, require that they do so only with private and secured wireless hotspots or network connections. Set clear usage policies: If you provide remote workers with computers, implement clear usage policies to ensure that all employees remain safe and secure whenever they are connected to the internet. 
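To make the password guidance above easier to enforce, a simple automated check can be applied wherever credentials are set. The sketch below reflects the rules suggested earlier – mixed case, numbers and at least one special character – with a 12-character minimum added as an assumption for illustration rather than a figure from this article.

```python
# Minimal, illustrative password-policy check reflecting the guidance above:
# mixed case, at least one number and one special character. The 12-character
# minimum is an assumption added for the example, not a figure from the article.

import re

MIN_LENGTH = 12  # assumed minimum; adjust to your own policy

def password_issues(password: str) -> list[str]:
    """Return a list of policy violations; an empty list means the password passes."""
    issues = []
    if len(password) < MIN_LENGTH:
        issues.append(f"shorter than {MIN_LENGTH} characters")
    if not re.search(r"[a-z]", password):
        issues.append("no lowercase letter")
    if not re.search(r"[A-Z]", password):
        issues.append("no uppercase letter")
    if not re.search(r"[0-9]", password):
        issues.append("no number")
    if not re.search(r"[^A-Za-z0-9]", password):
        issues.append("no special character (!, *, @, etc.)")
    return issues

for candidate in ["summer2024", "C0rrect-Horse-Battery!"]:
    problems = password_issues(candidate)
    print(candidate, "->", "OK" if not problems else ", ".join(problems))
```

A check like this is no substitute for a password manager or two-factor authentication, but it gives remote workers immediate feedback and keeps weak credentials out of business systems in the first place.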
You can also use tracking and monitoring software to detect potential intrusions and block potentially intrusive IP addresses from accessing your software and business data. Understanding the importance of developing your own cybersecurity policy alongside proper security measures is essential for a business owner or entrepreneur in any industry today. As you become more familiar with available security solutions and services that are suitable for your business, you can maintain your peace of mind knowing that the protection of your business data is in the right hands.

### Disruptive Seasons - Spring - 2022 - Celerity Limited Panel

Craig Aston, COO of Celerity Limited, and Darren Sanders, a Technical Architect at Celerity, participated in a panel discussion about cybersecurity. They discussed the importance of protecting sensitive information, such as personal and financial data, from cyber attacks. They also emphasised the need for companies to have robust security protocols, including regular updates, employee education, and incident response plans. Additionally, they highlighted the importance of staying informed about the latest threats and trends in the field, and of working with other organisations to share knowledge and resources to help protect against cyber attacks.

They also discussed how the increased use of remote work and the shift to cloud-based services have presented new challenges for cybersecurity: companies need to ensure that employees have secure access to company resources and information while working remotely, and must protect data stored in the cloud. The panel also highlighted the need for companies to be prepared for a cyber attack and to have incident response plans in place to address any security breaches quickly and effectively.

The panel also addressed the importance of compliance with regulations and industry standards, such as HIPAA and PCI DSS, to ensure that companies follow best practices for protecting sensitive information, as well as the importance of continuous monitoring and regular security assessments to identify and address vulnerabilities before they can be exploited. Overall, the panel stressed that cybersecurity is constantly evolving and that companies must stay vigilant and proactive to protect sensitive information and systems from cyber attacks. They emphasised the need for regular employee education, robust security protocols, and compliance with industry standards to minimise risk and protect against cyber threats.

### DevOps Metrics - How to measure success in DevOps?

Even though there is no perfect definition of DevOps, experts see it as a combined methodology for improving the software development lifecycle. It promotes collaborative working between application development and IT operations teams. DevOps delivers enhanced communication, consistency, and collaboration, leading to a more efficient, secure, and faster software development process.

DevOps metrics allow teams to measure success, benchmarks, and milestones. The metrics help in tracking performance and provide valuable analytics that help in improving future processes.

Some Interesting Statistics of DevOps Implementation

Over the past couple of years, the demand for DevOps among app development companies has risen steeply. Let's take a look at some interesting facts and statistics about implementing a DevOps strategy. The word 'DevOps' is derived from Software Development (Dev) and IT Operations (Ops).
The DevOps approach supports agile software development in many situations. The global market size of DevOps is expected to reach $10.31 billion by the end of 2023. The rising demand for fast application delivery among clients has increased the need for DevOps among software and app development companies. Organisations that are good at handling DevOps have very strong teams with well-defined communication channels.

How To Measure DevOps Success?

Measuring DevOps success is essential for modern businesses so that they can understand what their teams are doing on application development and how well they are doing it. The metrics involved in measuring DevOps success also help businesses analyse how efficiently their teams are involved in app development.

Here are a few aspects that play a key role in measuring DevOps success:

- How fast the team can deploy the software while focusing on the technology. Once-a-week deployment is a good option, but this depends on the individual deployment goals.
- Implementing automation in several processes can lead to faster identification and fixing of issues; with this, the team can focus on improving the application and decreasing the failure rate.
- Technology may face downtime due to servicing and maintenance. In such a situation, how quickly can the DevOps team resolve the issues and restore services?
- The time taken from code completion to app deployment. More expertise leads to a shorter lead time.
- Efficient customer service helps in promptly addressing customers' issues and queries.
- Automation helps in continuous monitoring of the process for changes in code, security, or process parameters; this testing allows automation itself to be tracked efficiently.
- Some defects may only be noticed after the software is deployed. The metrics allow teams to identify defects that escape into production.

List of DevOps Metrics to Measure DevOps Success

There are many DevOps metrics used for measuring DevOps success. Some of the key metrics are briefed below.

- Deployment frequency: This metric measures how often the team successfully releases software. With the adoption of continuous integration/continuous delivery (CI/CD), the frequency of release is high. A high deployment frequency helps in the efficient fixing of bugs, enhanced improvements, prompt implementation of new features, and so on. It enables developers to get faster real-world feedback for improved development. This metric enables measuring long-term and short-term efficiency, and it allows teams to identify bottlenecks and solve issues effectively.
- Lead time for changes: This metric measures the time taken to get committed code into production. It is important because it shows how quickly the team can respond to any system-related issue. A shorter lead time implies faster issue solving, whereas complex issues may take a longer lead time. This metric can be improved by implementing quality assurance testing within multiple development environments and incorporating automated testing.
- Change failure rate: This metric measures the percentage of deployments that fail in production and require bug fixing or roll-back. It checks the number of deployments attempted and how many of those failed in production, and helps in gauging the stability and efficiency of the DevOps process.
If the change failure rate is more than 40%, the testing procedures are poor and significant change is required.
- Mean time to restore service (MTTR): This metric shows the time taken by the organisation to recover from a failure in production, and reflects the resiliency and stability of the software. It helps the team understand what improvement is needed in the response process. A longer MTTR indicates poor monitoring and can result in more systems being affected. Deploying software in small increments can help achieve a quicker MTTR and reduce risk, as can deploying automated monitoring solutions that anticipate failures.
- Defect escape rate: This measures the number of bugs escaping testing and getting released into production. It helps in determining the effectiveness of the testing procedure and the overall quality of the product. A higher defect escape rate implies that the process needs improvement and enhanced automation. A lower rate or zero value indicates high-quality software and a well-functioning testing programme. It is important to track defects from all stages.
- Mean time to detect (MTTD): This metric measures the average time between the start of an incident and its discovery. It helps in determining the effectiveness of monitoring and detection capabilities. For this metric, the sum of all incident detection times across all projects and teams is divided by the total number of incidents.
- Percentage of code covered by automated tests: This metric measures the proportion of the codebase that is covered by automated tests. Automated testing implies greater stability of the code in software development as compared to manual testing. The goal is to achieve a higher percentage of coverage.
- App availability: This metric is useful in measuring the proportion of time for which the application is fully functional and accessible to end users. For determining accurate application availability, it is important to measure the actual end-user experience and not just network statistics.
- Application usage and traffic: This metric helps in monitoring the number of users accessing the system and also indicates system uptime. On deployment of the system, users start accessing it and the number of transactions is noted, confirming normal operations. If application traffic and usage increase, there is a greater chance of failure under pressure; if traffic and usage are low, it indicates there are not many users. The DevOps team can use this metric to ensure systems operate as projected and, if usage deviates, take appropriate action.
- Other metrics: Other DevOps metrics include Change Volume, Mean Time to Failure, Customer Feedback, Automated Test Cases, Service Level Agreements (SLAs), Unplanned Work Rate, Process Cycle Time, and more. A short worked example bringing several of these metrics together follows after the closing thoughts.

Closing Thoughts

Implementation of DevOps helps in enhanced collaboration and communication between product development and IT teams. A successful DevOps practice helps in efficient monitoring of the software development process and ensures that the process, pipeline, and tools implemented help in achieving the intended goals for faster delivery of software. The different metrics help in evaluating the success of the software, in identifying issues or defects and improving on them, and in deciding where to incorporate newer technology in software development.
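As promised above, here is a minimal worked example showing how several of these metrics can be computed from simple deployment and incident records. The sample data and field names are invented for illustration; in practice they would come from CI/CD and incident-management tooling.

```python
# Worked sketch of several of the metrics described above, computed from
# simple deployment and incident records. The sample data is invented.

from datetime import datetime

deployments = [
    {"at": datetime(2024, 3, 4), "failed": False},
    {"at": datetime(2024, 3, 6), "failed": True},
    {"at": datetime(2024, 3, 11), "failed": False},
    {"at": datetime(2024, 3, 13), "failed": False},
]
incidents = [
    # started -> detected -> restored
    {"started": datetime(2024, 3, 6, 9, 0),
     "detected": datetime(2024, 3, 6, 9, 25),
     "restored": datetime(2024, 3, 6, 11, 0)},
]
defects = {"found_in_testing": 18, "escaped_to_production": 2}

# Deployment frequency: releases per week over the observed window
window_days = (max(d["at"] for d in deployments) - min(d["at"] for d in deployments)).days or 1
deployment_frequency = len(deployments) / (window_days / 7)

# Change failure rate: share of deployments that failed in production
change_failure_rate = sum(d["failed"] for d in deployments) / len(deployments)

# MTTD: average time from incident start to discovery
mttd = sum((i["detected"] - i["started"]).total_seconds() for i in incidents) / len(incidents)

# MTTR: average time from incident start to service restored
mttr = sum((i["restored"] - i["started"]).total_seconds() for i in incidents) / len(incidents)

# Defect escape rate: share of all defects that reached production
defect_escape_rate = defects["escaped_to_production"] / (
    defects["escaped_to_production"] + defects["found_in_testing"]
)

print(f"Deployment frequency: {deployment_frequency:.1f} per week")
print(f"Change failure rate:  {change_failure_rate:.0%}")
print(f"MTTD: {mttd / 60:.0f} minutes   MTTR: {mttr / 60:.0f} minutes")
print(f"Defect escape rate:   {defect_escape_rate:.0%}")
```

On this sample data the figures come out at roughly 3 deployments per week, a 25% change failure rate, a 25-minute MTTD, a 120-minute MTTR and a 10% defect escape rate; the value of the exercise is in tracking how those numbers move over time rather than in any single snapshot.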
### Most Effective Methodology for Transformation to SAP S/4HANA

The year 2015 witnessed the launch of the SAP S/4HANA solution, which comes equipped with all the advanced technologies you could seek in an ERP solution. Following this launch, SAP announced that support for the legacy SAP ECC solution would end by 2025. However, this deadline has now been extended to 2027 for many different reasons. Nonetheless, SAP customers were quick to grasp that they would soon have to migrate to the SAP S/4HANA solution. But migrating to a new ERP solution is always much easier said than done, and this got many thinking about the ideal methodology for SAP S/4HANA migration.

There is hardly a single methodology that serves everyone; different businesses and organisations have different requirements. Yet here are some pointers you can leverage to facilitate a successful transformation to SAP S/4HANA.

1. Define the Primary Ingredients of the Transformation

Migrating to a new ERP solution means organisation-wide disruption impacting every aspect of the business. As a result, it becomes imperative to have a well-defined plan to go through with the transformation. In this plan, you need to describe every ingredient or aspect needed to facilitate a successful SAP S/4HANA transformation with Accely. So, it is better if you start by understanding the perks you can derive from cloud technology. There are many, such as enhanced visibility, seamless data utilisation, better innovation, and more. Following this, inform stakeholders about the transformation and spark an interest in them. Never undermine the significance of effective communication when it comes to SAP S/4HANA transformation.

Furthermore, you can set up a Transformation Excellence Centre to augment your efforts on solution design, reusability, training, and more. Besides that, the centre can also help you facilitate effective group purchasing and risk management.

Finally, you need to establish a practical model for managing the transformation project. There are many different project models people use these days to facilitate a smooth transition to SAP S/4HANA, so it is recommended you weigh all your alternatives and pick the project model that best serves the distinct requirements of your business. Once you have a model in place, you can choose an agile approach and facilitate an effortless transformation.

2. Leverage Effective Business Leadership

SAP S/4HANA migration is a complex process that drains a lot of time and effort from the business. As a result, it becomes very easy to find oneself lost midway, where nothing seems to make sense anymore. Such adversity can have severely detrimental implications for the overall migration process.

Thus, the best way to deal with such challenges is to have effective leadership to guide you and help you navigate all the complexities. Seek assistance from business leaders to get a practical approach for your SAP S/4HANA migration. Leaders know what is best for the organisation, so it becomes much easier for them to choose the right approach to drive organisation-wide transformation.

The other thing you need to keep in mind is that business needs, and not technology alone, must drive the transformation. When you focus on addressing your business needs, finding the right tools and practices that serve your business requirements becomes easier. Additionally, this will help you review your business processes and understand how SAP S/4HANA can address them.
Focus on key aspects such as outcomes, key roles, processes, experiences, and, most importantly, data.

3. Leverage Transition Options

Every business needs to take a different approach to its SAP S/4HANA transformation. This is important because different businesses have unique requirements based on their industry and the status of their SAP system. Nonetheless, that does not mean all these approaches are always very different from each other. Instead, there are always some core principles that make SAP S/4HANA transformation a business project for everyone. Thus, you need to make sure that your ERP solution is competent enough to serve the future requirements of your business. Apart from that, you need to get a detailed understanding of your existing business processes. This will help ensure you do not carry previous bottlenecks into the new system.

For instance, you can start with a brand-new implementation model that simplifies SAP S/4HANA implementation for your business. Following this, you need to take a smarter approach to data migration. This means leveraging selective data transition, where you let go of unnecessary data that would otherwise clutter your new system. However, you can also choose to keep your historical data and investments in the SAP S/4HANA solution extensions.

4. Focus on Communication

The final thing you need to keep in mind is never to undermine the significance of effective communication with everyone involved in the process. Business leaders have the potential to either make or break a transformation such as this. So, focus your efforts on generating awareness about the solution and providing people with the necessary training opportunities.

When you communicate with your people effectively, it automatically helps build commitment among them. This also helps keep the people around you motivated and inspired enough to go through with the transformation. Also, do not delegate these communications; make sure you are personally involved in the process. This will enhance your credibility, helping you drive an effortless transformation.

Effective training and communication will keep everyone informed about the changes they can expect in the coming times. Remember, the people driving or facilitating the transformation are also your people, so you need to focus your efforts on helping them through it. Informing them about the ways the transformation will impact their jobs or tasks can go a long way in facilitating an effective and seamless transformation to the SAP S/4HANA solution.

Final Thoughts

SAP S/4HANA transformation will always be complex, but that does not mean one cannot go through with it. Be mindful of the aforementioned tactics and tips to simplify your SAP S/4HANA migration with no hassle. Besides that, you can even seek assistance from SAP experts to help you with the transition.

### Top Tips for Launching a Company in 2022

Most people think about the success, power, and wealth they'll gain from owning their own business. And these are motivating factors for many people. However, it's imperative to understand that although we hear many success stories regularly, there are also millions of unsuccessful ventures that you don't hear about. And every entrepreneur trying to start or maintain a company will tell you that it's challenging and daunting work.
So, when building a brand from the ground up, every potential business owner should know that it takes a great deal of dedication, motivation, people skills, and access to specific information. These are the topics we’re going to discuss today. 1. Create an Image for Your Brand Regardless of your niche, your brand is the most useful and visible asset there is for launching a company. If you haven’t learnt anything from today’s big brands, you should at least know that many consumers choose to support businesses not because of their products but because of their brand. As a result, entrepreneurs need to spend quality time determining their company’s values, unique value proposition, mission, and goals. The information that you gather during this step also plays a pivotal role in coming up with a good business name that’s in line with what you want to do in the market. If you’re having a hard time with a name, check out a reputable startup name generator. 2. Grow a Digital Following Today you can do everything from shop and pay bills to make doctor’s appointments online. So, it’s common for customer transactions to both begin and end using online platforms. Brands that successfully connect with their target market online earn consumer love, patronage, and respect faster. But that’s not all; growing a digital following will also help convey your brand’s identity. The best strategy is to take advantage of every online platform possible. Some of the best include Instagram, Facebook, Twitter, LinkedIn, and TikTok. 3. Incorporate Creativity into Marketing Ads are the best way to introduce your brand to the world. However, it’s vital that you ensure all brand advertisements communicate a clear and compelling message that evokes the right type of feelings from your customers. If you fail to do this, people will simply scan your message and move on without taking notes or engaging with your business. 4. Remain at the Forefront of Business The market is already saturated with countless businesses that offer similar products and services. Perhaps, even the same business model as your brand. For this reason, you must remain innovative and provide unique services so that your brand stands out to customers. Brands that manage to stay at the forefront and ahead of the curve tend to better meet consumer needs. They also always find a way to be considered trendsetters. And this alone has the ability to win over your target audience’s commitment, respect, and attention. 5. Openly Show Appreciation for Your Customers There’s one thing that every top brand has in common, and that’s they offer services that consider the customer’s best interest. In fact, it goes well beyond just recognising the consumer; these companies love their customers enough to put their needs first. Hence, business owners should always be looking for methods to improve business processes, lower pricing, or create higher quality services to better serve their business’ target audience. 6. Prioritize Consumer Values The internet has revolutionized the way we do business. In the past, businesses did their best to stay out of social concerns, but in today’s world, consumers don’t want them to remain silent. Customers rely on companies to take the lead when addressing concerns about the environment and climate change. If you can learn to put the consumer’s values first and find a way to relate with them through your brand, then you can portray your organization as more than just another company; you can align as a friend. 7. 
Choose a Niche Wisely

Many top entrepreneurs know that to build a reputable startup, it's crucial to choose a specific area of business and place your concentration there. Just be sure that when choosing a niche, you're doing enough research to avoid oversaturated and unknown areas, as they can drown or restrict your company.

8. Team with Outstanding Partners

Above all else, one of the most valuable assets in business today is an excellent partner. Anyone you choose for your team should have skills and knowledge that you don't already have. Find people who can pick up your slack in areas such as order fulfilment and product testing, because your company will grow faster with them than it will without.

9. Assess Your Product's Credibility

When it comes to business, you should never leave anything up to chance. Your company launch and product releases will be visible to millions of people around the world, so unresolved product defects will be identified. It's your responsibility to have all issues worked out because, if you don't, it may break your company. Remember, you don't have to take this step by yourself. There are lots of professionals you can hire to assemble a team that will properly examine your brand's products and services to ensure that they are perfect and ready to sell.

10. Pay Close Attention to Failed Companies

There are many reasons why companies fail. Some fail because of poor planning or insufficient capital. However, others fail due to reasons that span from management conflicts to unstable market conditions or even legal action. To avoid having history repeat itself, do your research and learn from every organization that's already failed. Learn the reasons why they failed and figure out what they could've done to prevent it in the first place. Knowing this information will be crucial to help you avoid making the same mistakes and to put a more effective system in place.

In the end, starting a new business can be one of the most exciting things you'll do in your life. Just remember, there's no such thing as too much planning, so get started now!

### Decoding Zero Trust Identity and embracing its benefits

The world has undeniably become more digital. Before the pandemic, 'digital transformation' was an aspirational buzzword for technologically innovative companies – but today, with the surge in digitalisation brought on by lockdowns and remote working, everybody is transforming their operations. Today's security landscape now extends well beyond the walls of the enterprise network. Organisations of every size and shape are becoming increasingly reliant on digital and cloud-based applications and services. The amount of data that has been created and that organisations rely on has grown exponentially, and it is prevalent throughout every corner of an organisation's IT ecosystem.

But with this increased digitalisation come extra risks and vulnerabilities. Indeed, as workforces become more digitally dispersed, with remote and hybrid work booming, the risks associated with accessing sensitive corporate information via home networks or using personal devices that don't have that extra layer of company security are also increasing. Additionally, there is the human risk that someone outside of the organisation could gain access to information if an employee is using a shared device at home.

The Great Resignation and labour shortages are also putting pressure on organisations' security.
People are resigning at the highest rate since 2009 in the UK, and 70% of UK tech organisations are experiencing staff shortages, per Robert Walters research. This makes it difficult for organisations to keep pace with changing personnel and working environments which, much like with remote and hybrid workers, increases exposure to security risks. As a result of these shortages, growing numbers of organisations are turning towards third-party outsourced partners and talent to meet short-term demands, reduce costs and accelerate growth. But, if not done correctly, bringing on a third-party workforce without due diligence or proper governance and security controls can also create additional cyber risks.

In today's globally connected, digitally dispersed world, then, the "never trust, always verify" principles of Zero Trust Identity resonate more strongly than ever. With the presence of new vulnerabilities and advancing technology, it's time we shift to the identity-based Zero Trust paradigm, with an increased layer of security at the identity level.

The key factors of Zero Trust Identity

Zero Trust has become a popular catalyst for organisations changing their fundamental security and identity practices, particularly in the wake of the pandemic's digital boom. According to Deloitte, 37% of organisations have increased Zero Trust adoption in just the past two years. Companies are accepting that every core business function — from HR to accounting — now relies on digital technology, and that the only way to protect their primarily remote workforce and growing IT ecosystem is to assume continuous risk and to reassess trust every time access is attempted. This is the era of Zero Trust.

But what exactly does Zero Trust, or Zero Trust Identity, mean? Consider that, per Verizon, 36% of data breaches in an enterprise involve internal actors – employees at that company. This doesn't mean that a third of every organisation's workforce is made up of clandestine hackers; more likely than not, these are people who have not kept a close eye on their passwords, or who have been given access to content and systems unnecessarily. With Zero Trust Identity, everything and everyone is considered untrustworthy until proven otherwise. Each access case is reviewed and assessed independently of the others, and someone is only granted access when absolutely necessary. Each request for access is evaluated based on various identifying data, such as location, to determine trust, and thus, access.

Bring in identity and access management (IAM), Identity Governance and Administration (IGA), and Cloud Privileged Access Management (CloudPAM), and Zero Trust Identity becomes something more than access control. It is continuous monitoring, management, remediation, and recovery that can mitigate risk and prevent mistakes before they happen.

Four key benefits of Zero Trust Identity

The Zero Trust Identity model has significant benefits. Without the presence of standing privilege, organisations can see improvements across the whole business.

1. Secure remote workers. KuppingerCole and HP recently found that half of office workers use their work devices for personal use, and that 84% of IT decision-makers worry this increases their company's risk of a security breach. This is more than understandable when users, data and access are spread across the world. Firewalls will no longer do the trick in this environment, but Zero Trust Identity can.
By providing a tangible perimeter to every employee, user, device and application, no matter where they're working, organisations can rest easier knowing their workers are safe even outside the corporate network. In addition, with the pandemic resulting in the rapid rollout of VPNs and subsequent configuration and security issues, Zero Trust can help streamline access and reduce performance issues, giving employees what they need, when they need it and wherever they're located.

2. Simplify IT management with automation. Continuous monitoring is a key part of Zero Trust, and when automated, it can simplify access management and security enormously. This is especially beneficial at a time when, according to the Information Systems Security Association (ISSA), 62% of organisations report a problematic cybersecurity skills shortage and errors are more likely to slip through the gaps. With a privileged access management (PAM) system, for example, access can be automatically granted to a user based on key identifiers, and requests only need to be manually approved if they are flagged by the automated system. With automation, fewer human resources are needed in the time of a talent crunch. This enables existing security teams to work more efficiently, and to spend more time focused on innovation and elements that really do need manual administration. Automated Zero Trust platforms – which rely on centralised monitoring – can also produce reliable, invaluable data that provides insights about areas where security could be improved and helps the security team to further identify threats.

3. Gain greater visibility. Nothing is trusted in the Zero Trust framework, but organisations can decide which elements of their security strategy are more critical or risk-prone. This means that everything from outside resources such as cloud-based serverless processes to legacy on-premise technology needs a seat at the table – and with Zero Trust, organisations are given this visibility. Then, they can build a solution that best matches their requirements while also covering all assets, giving them insights into who is accessing their network and when.

4. Achieve continuous compliance and improved data protection. Rogue employees, cybersecurity hackers and malware will always be looking for a way to gain access to large enterprise networks, but Zero Trust Identity, particularly without standing privileges, can significantly reduce the risk that they can extract any information. Take just-in-time access, for example, which falls under the PAM umbrella: end users receive the right level of privilege for their immediate tasks and are limited in terms of what they can access and for how long. For malicious intruders, the same rules apply and can significantly staunch the impact of a breach.

This also has a huge knock-on effect when it comes to compliance, ensuring sensitive data is locked away safely. Indeed, Zero Trust architecture is hugely beneficial to continuous compliance: by evaluating and logging every access request and by tracking every request's time, location, and related application, a perfect audit portfolio of evidence is created. As a result, the resources required for any audits are reduced, potentially costly compliance failures can be mitigated, and governance is more effectively upheld. And, according to IBM, organisations with a mature Zero Trust approach can reduce data breach costs by $1.76 million.
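To illustrate the "never trust, always verify" and just-in-time ideas described above, the sketch below evaluates each access request against a handful of identity signals and issues only a time-boxed grant. The signals, policy values and field names are assumptions made for the example, not a description of any particular IAM or PAM product.

```python
# Minimal, illustrative sketch of a "never trust, always verify" access check
# with just-in-time privileges: every request is evaluated on identity signals
# and any grant expires automatically. All rules and values are assumptions.

from datetime import datetime, timedelta, timezone

TRUSTED_LOCATIONS = {"GB", "DE"}          # assumed policy input
GRANT_LIFETIME = timedelta(minutes=30)    # just-in-time: access is time-boxed

def evaluate_request(request: dict) -> dict:
    """Deny by default; grant only when every signal checks out, and only briefly."""
    reasons = []
    if not request.get("mfa_passed"):
        reasons.append("MFA not completed")
    if request.get("location") not in TRUSTED_LOCATIONS:
        reasons.append(f"location {request.get('location')} not trusted")
    if not request.get("device_managed"):
        reasons.append("unmanaged device")
    if request.get("resource") not in request.get("entitlements", set()):
        reasons.append("no entitlement for this resource")

    if reasons:
        return {"granted": False, "reasons": reasons}
    return {"granted": True, "expires_at": datetime.now(timezone.utc) + GRANT_LIFETIME}

print(evaluate_request({
    "user": "j.smith", "resource": "billing-db", "location": "GB",
    "mfa_passed": True, "device_managed": True, "entitlements": {"billing-db"},
}))
print(evaluate_request({
    "user": "j.smith", "resource": "hr-records", "location": "US",
    "mfa_passed": True, "device_managed": False, "entitlements": {"billing-db"},
}))
```

The point of the sketch is the shape of the decision, not the specific rules: every request starts from a denial, every grant is justified by current signals, and nothing stays privileged once the task is done.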
Increased resilience  Without the presence of standing privilege, organisations can see improvements across the whole business, from securing remote workers to gaining better visibility of their security estate. Even if malware or a threat actor does slip through the gaps, then Zero Trust Identity limits the scope of damage and organisations have the agility and resilience to take proactive actions to secure their valuable assets. ### Achieving PCI DSS 4.0 Compliance in the Cloud Any organisation that takes payments from customers where transactions involve storage, processing or transmission of cardholder data must comply with the Payment Card Industry Data Security Standard (PCI DSS).  Introduced in 2004 by Visa, Mastercard, American Express, JCB International, and Discover Financial Services, the Standard provides 12 rules to help organisations to protect their customers’ sensitive payment card data. An updated version of the Standard was released at the end of March 2022, which provides merchant organisations two years to transition to. This provides time to make the necessary changes to their payment security processes and enable them to maintain compliance with the new version of the Standard. Why is the Standard changing? PCI DSS 4.0 has been introduced to account for technological advances in the payments arena, accelerated digital service adoption, the increase in remote working driven by the pandemic, and evolving cyber threats which demand more robust protections around the card data environment. What are the main changes? While the twelve core requirements remain the same, PCI DSS v4.0 introduces three major changes including customised implementation of the Standard for level 1 merchants, mandatory multifactor authentication, and continuous security testing. The requirement to encrypt cardholder data has also been extended to trusted networks as well as public networks. Continuous testing Cyber security threats are constantly evolving and a card data environment that was compliant during the annual audit may become vulnerable to a new form of attack.  In response, PCI DSS 4.0 requires Qualified Security Assessors (QSAs) to test merchants’ environments, processes and infrastructure over an extended period, rather than relying on annual audits, which only provide a ‘snapshot’ of security compliance.   Passwords are no longer enough Password vulnerabilities have been well documented. To comply with PCI DSS 4.0, all access to the card data environment must now be protected with multifactor authentication (MFA). Passwords for accessing payment and control processes must also be lengthened and strengthened by using at least 12 characters and including a mixture of numbers and letters.  Increased flexibility for Level 1 merchants Rather than being prescriptive about how enterprises comply with PCI DSS, version 4.0 allows Level 1 merchants to design their own data security and access controls to comply with the core intent of the Standard (which is to protect customers’ payment card data). This affords enterprises far greater flexibility to adopt new technology or enhanced security solutions that align with their particular operational requirements, while keeping pace with emerging consumer payment methods and evolving threats that face their payment ecosystem.  
Because the customised approach to PCI DSS 4.0 compliance must be documented by a QSA, Level 2 – 4 organisations, which handle fewer than 6 million payment transactions a year and conduct a Self-Assessment Questionnaire (SAQ), are not eligible to use a customised approach. Level 2 merchants, which process between 1 million and 6 million transactions a year, must also complete a Report of Compliance (RoC). Customised security Arguably the biggest change within PCI DSS 4.0 is that the Standard is moving away from a prescriptive tick box approach to compliance to a focus on the outcome of the customised controls that are implemented.  Compliance is subjectively assessed by a QSA to ensure that it meets the intent of the requirement. As an example, requirement 12.6.3.1 of the updated Standard requires that managers ensure that employees have adequate security awareness training to protect cardholder data.  A QSA will need to subjectively assess whether the training provided to employees delivers the level of security awareness to enable the organisation to protect cardholder data against phishing attacks and other forms of social engineering.  Compliance in the cloud: are you ready? In recognition of the fact that public cloud service giants are able to make significantly higher investments in securing their environments and maintaining leading edge technologies, the new wording of PCI DSS 4.0 also allows for adoption of cloud-based hosting services and introduces a new set of requirements for securing cloud and serverless workloads. In the PCI DSS 4.0 document, it states that the scope of PCI DSS requirements applies to the card data environment (CDE), which comprises system components and the people and processes that transmit, process and store cardholder data or authentication data. It states that “system components” include network devices, servers, computing devices, virtual components, cloud components, and software including, “Virtualization components such as virtual machines, virtual switches/routers, virtual appliances, virtual applications/desktops, and hypervisors. Cloud infrastructure and components, both external and on premises, and including instantiations of containers or images, virtual private clouds, cloud-based identity and access management, CDEs residing on premises or in the cloud, service meshes with containerized applications, and container orchestration tools.” It goes on to stipulate the segmentation and access control configurations required for merchants using cloud-based services, stating: “It is important that a multi-tenant service provider defines controls so that each customer can only access their own environment and CDE to prevent unauthorized access from one customer’s environment to another.”  To verify that the intent of the Standard is being met, QSAs must confirm during their audit that configuration is set up to ensure that each merchant organisation only has access to its own cloud-based CDE and cardholder data and that organisations cannot access other organisations’ cloud-based card data environments. The countdown is on When the new Standard was revealed at the end of March 2022, a Summary of Changes was also provided that confirmed the current version (3.2.1) will be retired in the first quarter of 2024.  This means the transition period starts now, providing two years to shift from the old to the new.  
This timeline allows organisations to focus on the organisational changes necessary to achieve compliance, and to budget accordingly. It is recommended that compliance officers create a comprehensive transition plan, which can be implemented once the new Standard is enforced. Balancing compliance and experience Cloud-based services became central to all of our lives during the pandemic, but, when contact centre staff started working from home, this moved the Point of Interaction outside of the trusted network. An international survey we conducted found that consumer concerns around card data security sharply increased during this period. However, adding layers of security can slow down transactions and frustrate consumers. We believe, however, that payment security can be achieved while also allowing consumers to seamlessly pay using their preferred method. One of the most effective ways to comply with the intent of PCI DSS 4.0 - and protect your customers’ data - is to leverage the cloud to descope your infrastructure by not storing any payment card data within your organisation’s systems. Our cloud-based solutions descope payments made over the phone, chat, email, social media, and messaging, to mitigate risks to payment data and maintain compliance with PCI DSS 4.0. Importantly, this is achieved without adding friction to the payment process, which would otherwise have the potential to impact customers’ experiences and risk transactions being abandoned. As organisations plan their journey to PCI DSS 4.0 compliance, it’s heartening to see that the Standard has been updated to enable organisations to benefit from the enormous security investments made by public cloud providers, so that they can ultimately deliver enhanced security and convenience to their customers. www.pcipal.com
### How can security leaders protect their data in a multi-cloud environment?
The use of multi-cloud has grown enormously in popularity in recent years, becoming an essential part of day-to-day operations for many businesses. The adoption of such an approach increases agility whilst minimising vendor lock-in, improving disaster recovery and boosting application performance, all while streamlining costs. In a Gartner study, 81% of respondents said they are working with two or more providers, while IDC predicts that global spending on ‘whole cloud’ services will reach $1.3 trillion by 2025 as a digital-first economy becomes the future of enterprise. Yet data protection issues relating to an increasing reliance on the multi-cloud approach are a growing concern. This is because multi-cloud in the enterprise often comes about organically to meet evolving requirements, so is not always planned. Departments within an organisation can choose to store data in different clouds, resulting in the creation of complicated silos of data. This decreases visibility and can have profound repercussions when it comes to compliance. But what can be done to address this, and what steps should IT leaders be taking to implement a solution? Encrypting confidential data Although a multi-cloud architecture can make data migration easy, managing access to the data and keeping it confidential can be challenging. Regardless of the mode of transfer or method of storage, the key point to remember is that information remains a valuable commodity that is vulnerable at all possible points of connectivity. The most effective method to address such vulnerability is secure encryption. 
Encrypting data both in transit and at rest is critical. For ultra-secure encryption, data should preferably be encrypted with a FIPS-certified, randomly generated AES 256-bit encryption key. Confidential information stored locally on a computer or hard drive, sent via email or file sharing service, or shared via data transfer in the cloud should equally be securely encrypted. By taking such an approach, ongoing protection is guaranteed, giving IT leaders peace of mind that their information remains confidential. Centralised remote management As the use of multi-cloud environments essentially means that sensitive data is stored in silos and transferred across numerous servers, it’s important for security managers to gain a holistic view of which cloud providers hold which data, where that data is located and who holds access permissions within the organisation. This will enable geo-fencing and time-fencing restrictions to be set, filenames to be appropriately encrypted and remote access to be enabled or disabled as required. Such controls will go a long way towards eliminating unnecessary security risks. Key management for encrypted information is also important. Authorised users can be given a copy of a physical encrypted encryption key – a randomly generated encryption key stored within a USB module – to allow ultra-secure and real-time collaboration in the cloud. Having a key management system in place provides greater control of encryption keys when using a multi-cloud solution, helping to facilitate a more centralised administration and management approach to data security. Multi-factor authentication Businesses need to have clear processes in place that all employees follow to uphold adherence to data protection regulations, regardless of where they choose to store the data. Security measures must go beyond simple single-factor cloud login credentials to be truly secure. Incorporating multi-factor authentication will help in relation to data protection governance and is an important step in standardising policies, procedures and processes across multiple cloud providers. If a malicious threat actor obtains a user’s credentials and compromises an account, the breach is likely to remain unnoticed by the cloud service provider, who will not be able to tell the difference between a legitimate user and an attacker. Using an encryption key, but keeping the encryption key away from the cloud, increases the number of security measures from just one level of authentication - the cloud account login - to as many as five factors of authentication. The encryption key should itself be encrypted within an ultra-secure Common Criteria EAL5+ secure microprocessor along with a PIN-authenticated code. As more businesses move toward a multi-cloud setup, security leaders should be looking to follow such recommendations: encrypting and centrally managing their data, and then ensuring that multi-factor authentication is employed for further layers of advanced protection while still enabling operatives to share and collaborate in real time. Managing all devices storing the encrypted encryption key, used to access data in the cloud, will provide a more unified administration and monitoring process, an approach which will bring peace of mind and, ultimately, result in safer data. 
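As a rough illustration of the client-side encryption principle described above, the sketch below encrypts data with a randomly generated 256-bit AES-GCM key using the open-source cryptography package before anything is handed to a cloud provider; it shows the idea only and is not a substitute for a FIPS-certified hardware key module or a full key management system.

```python
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

# Minimal sketch: encrypt data with a randomly generated AES 256-bit key before
# it ever reaches a cloud provider, so the provider only stores ciphertext.
# Key custody (e.g. a PIN-protected hardware module) is out of scope here.

def encrypt_for_cloud(plaintext: bytes, associated_data: bytes = b"tenant-1") -> tuple[bytes, bytes]:
    key = AESGCM.generate_key(bit_length=256)   # random 256-bit key
    nonce = os.urandom(12)                      # unique per encryption
    ciphertext = AESGCM(key).encrypt(nonce, plaintext, associated_data)
    return key, nonce + ciphertext              # the key is kept away from the cloud

def decrypt_from_cloud(key: bytes, blob: bytes, associated_data: bytes = b"tenant-1") -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, associated_data)

if __name__ == "__main__":
    key, blob = encrypt_for_cloud(b"confidential customer record")
    print(decrypt_from_cloud(key, blob))
```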
Learn more about managing and encrypting data in the cloud: https://istorage-uk.com/product/cloudashur/
### Three principles for security leadership in turbulent times
When the world is in crisis, security leaders face a serious challenge. When risk levels are high, every staff member, from non-technical teams to the C-Suite, will look to the cybersecurity department for help, guidance, and reassurance. In turbulent times, it can be tempting to simply pile more technologies on the security stack. But this is a wasteful approach in the short term and a strategic mistake in the long term because solving today’s problems often comes at the expense of preparing for tomorrow’s. When incidents such as ransomware attacks or major data breaches hit the headlines, security leaders have the attention of decision-makers and the chance to enact real change. Stay calm in a crisis and you are more likely to be successful during more stable times. To help you survive and succeed in this era of upheaval, here are three principles that should guide security leaders during periods of turbulence. Principle 1: Focus on Strategy There will never be an “easy button” that can be pressed to trigger total security. Leaders who want to make meaningful, successful, and lasting changes to their company’s security posture must play a long game focused on changes in technology, process, and culture. Strategy and planning are the foundation of these changes and should be established as far as possible ahead of real-time events. Proactivity during calm moments allows for resilience during turbulent periods. The opposite is also true. If an organisation is forced to hurriedly redefine its security posture in the middle of a crisis, risk will soar and resilience will drop. External problems are contagious, and good strategy inoculates against some of the pathogens that inevitably arise in a turbulent world. When a crisis hits, an organisation’s pre-developed strategy should guide its response. Ensuring recommendations made during a crisis align with strategy will build support for leaders’ goals and demonstrate that they and their team are in control. A calm leader will be more effective during a crisis than one who yells “fire” and runs around in a panic. If they are seen to have pre-empted events and put the correct strategies in place to deal with a crisis, leaders will project confidence and show colleagues that competent people are at the helm who can deal with the problem effectively. Leadership teams should support strategies (although this is not the same as funding them). If executives are not behind the security strategy, either more effort needs to be made in clearly explaining it to secure buy-in, or it may ultimately need to be revised or replaced. If leaders are in a crisis and it is too late to build a strategy from scratch, it is advisable to adopt an existing framework. I am an advocate of NIST and its Special Publication 800-207 on Zero Trust Architecture. I believe Zero Trust is the best “working” security strategy available today. Adopting a pre-built framework has two benefits. First, it offers a straightforward, ready-made solution that is not tied to current events. Second, there will be many articles, supporting tools, and potentially a thriving community to help leaders roll out the framework successfully. Principle 2: Embrace Momentum The momentum of crises can be valuable and energising. Yet it can also lead to rushed decisions—the 2012 Hurricane Sandy crisis is an example. 
When power supplies went down, some large colocation facilities went off-line, and outages knocked companies down too.  In the aftermath of the hurricane, companies rushed to build better disaster recovery systems. They spent millions of dollars on changing to a new colocation and networking vendor in a matter of months. Usually, this process takes years—but organisations were forced to do it in months.  The issue that caused the problems was eventually identified and turned out to be an industry-wide design flaw: generators could not remain operational for more than 100 hours without being serviced, but service could not take place while the generators were switched on. Companies expended a lot of time, money, and effort switching from one colocation facility affected by the flaw to another. In the end, the issue could be fixed with a mechanical bypass valve, and the bill would come to about $2,000 and the cost of a day’s work. If leaders had only held back a little, they may have been able to make significant savings. Today, the modifications which prevent future outages are universally adopted.  This example shows that rushing in can be wasteful and end with no return on investment. A strong security strategy reduces wastefulness and ensures ROI. However, a crisis is not the time to be over-cautious. When world events push cybersecurity concerns to the fore, it is the perfect moment to explain why a robust security strategy matters. It may also be a great time to get the resources to finish a project or close gaps. Security leaders should be careful to tailor their ideas to the big picture and wider agreed-on strategy because a crisis is not the time to introduce concepts from left field. Security teams should also be visible across the organisation. This could involve asking staff to train colleagues in cybersecurity best practices and helping them learn to better protect themselves and the broader organisation. During turbulent times it is worth remembering that life will calm down eventually. If security leaders are seen to have a steady hand during a crisis, they will build trust. Then, when it is time to push that boulder up a hill again, that trust will help to lighten the load. Principle 3: Transparency Crises demand honesty. When the world is turbulent, organisations will face serious security challenges. Their security posture will not be in the place it would be during a time of calm.  Security leaders need to be transparent about security weaknesses and risks the company faces. Without full transparency, the correct decisions cannot be made and gaps cannot be closed. Leadership must be fully briefed on the strengths and weaknesses of the organisation’s security and have access to reliable, up-to-date data which clearly illustrates any problems that must be addressed. Without current, accurate data, leadership teams are unable to make the right informed decisions. When presenting challenges, security leaders should be prepared to present options on how they can overcome them.     Leadership In a Crisis  Every company has its own security flaws and risks. Adopting these three principles is a great start when it comes to leading a security team in troubled times. Testing defences should be a critical activity, with tests taking place on an annual basis at least.  Leadership is not easy. 
The world will always be dynamic, so proactively building a strategy, staying calm as events unfold, and being transparent will help security leaders sail through the storm and steer the company to a safer place.
### Challenges of moving to a multi-cloud strategy
Hybrid or multi-cloud strategies are used by businesses for a variety of reasons, from legal requirements to cost savings and application agility. Often businesses want the flexibility to take advantage of the best prices or capabilities from differing public cloud vendors. Avoiding vendor lock-in is another reason a business will desire a multi-cloud strategy. While a business may not want to be tied to any one cloud provider indefinitely, moving to a multi-cloud strategy is not right for everyone. In fact, the risk of being tied to one cloud provider is not necessarily the risk it’s often perceived to be. This article sheds light on why moving to a multi-cloud strategy is not for everyone, and the barriers and challenges to overcome if this option is considered. Not all clouds are the same In theory, an application can be stretched over multiple cloud providers’ environments; however, “True Application Portability” involves barriers that must be overcome before the app becomes portable, or truly multi-cloud enabled. First, Virtual Machines (VMs) must be eliminated from the application architecture. Not all clouds are the same under the covers. Each cloud has its own underlying abstraction layer, often called the Cloud Service Fabric; this is where network, compute and storage are presented to the VM that is usually used to host applications. For example, AWS networking is significantly different to Microsoft Azure networking, and in turn both AWS and Azure networking are significantly different in their set-up from GCP networking. It is the same for virtual machines: a VM running on Azure cannot be automatically moved to AWS, and an AWS VM cannot be directly moved to GCP. If a VM is to be moved to a new cloud provider, the VM image will need to be modified to match the cloud fabric of the underlying target platform. Compute and storage offerings differ between cloud providers and often present their own challenges in relation to application portability. These issues are compounded if the application uses Platform-as-a-Service (PaaS) offerings, for example Database-as-a-Service (DBaaS), WebApps, or message broker components, where the underlying data storage elements, website delivery mechanisms and application logic are proprietary to that particular cloud offering. If the application uses Functions-as-a-Service (FaaS) (cloud services that provide an automated platform allowing customers to develop, run, and manage application functionality without the complexity of building and maintaining the infrastructure themselves) then the task of portability becomes even more complex, as once again each cloud’s FaaS nuances are specific to the cloud provider. But if VMs, PaaS and (serverless) FaaS architectures aren’t available, what can a developer do? Breaking down the multi-cloud challenge This interoperability issue between clouds can be viewed as the problem an international traveller has when they want to charge their laptop in several different countries. You either carry a plug for each power system (so one for the UK, one for Europe and one for the USA) or you carry an adapter that works anywhere. 
For application architecture, that adapter would be a product like Docker, a container framework that is the same no matter what cloud it sits upon. However, containerising an existing application is not necessarily an easy or cost-effective thing to do. Containers are a lightweight hosting architecture designed to spin up and shut down rapidly, scaling to match resources directly to demand. Containers allow you to pay the least while delivering the most from your cloud. The heavier a container, the less portable it is. Challenge one: Breaking an application down to microservices To be able to run containers at scale across any cloud, the first thing you need to do is to break the application up into small component parts called microservices. Each microservice has its own container. Microservices are discrete capabilities within an application which, when put together, present a larger, more complex application to the user. For example, if we look at a simple e-commerce application with a website, a database that holds customer preferences, a catalogue of goods for sale and a payment capability, that would require a minimum of four microservices within its architecture. If you then added in payment notification and fulfilment messaging to a warehouse to deliver the goods, you would need at least six microservices. And so it goes on. Each new application function requires its own service components. Taking an existing application and breaking it down into a microservices-based architecture is expensive and time-consuming. But it is a prerequisite to application portability because the container is the universal adapter. Challenge two: Applying an orchestrator to run containerised applications Unfortunately, the story doesn’t stop there. To run a complex containerised application, you need one essential component - an orchestrator - that will monitor and manage your containers. The orchestrator understands the overall microservices architecture, including which containers, and how many of them, make up a specific application. It can tell each container where to find the other containers used to deliver the application. The most common orchestrator for containerised application delivery is Kubernetes, which is quite complex to operate. Breaking the application down into microservices is the greatest blocker to becoming multi-cloud enabled. You might avoid vendor lock-in from a single cloud provider, but you will still find yourself “locked in” to Docker and whoever provides the Kubernetes capabilities that you leverage to orchestrate your microservices-based application. Vendor lock-in at some level is unfortunately a fact of life; you just need to decide at what level you want that lock-in to occur. Challenge three: it’s extremely difficult to avoid cloud vendor lock-in If cost is your concern, then you need to leverage PaaS or FaaS capabilities within the cloud. However, whilst both these services are cheaper to run, they will increase your level of lock-in to your chosen cloud provider. If you rely on FaaS or PaaS and the vendor decides to retire or alter those services, you in turn will be forced to rework the services that rely upon them. You are locked into the provider’s upgrade cycle as well as their technology! Unless your application is already microservices-based, the redevelopment effort will massively outweigh any cost savings made by using a single cloud provider. 
Additionally, if it’s a commercial, off-the-shelf software application - you are completely at the mercy of the software vendor as to when or if they ever intend to convert the software into a microservices-based architecture suitable for containerisation and Kubernetes delivery.  Using a cloud that’s best for each workload  Most multi-cloud strategies in use today do not involve stretching a single application over multiple clouds (see above). It is much better to focus on leveraging the strengths and cost advantages of running the workloads that are best aligned to that cloud. For example, if you build your own software by knitting together Open-Source capabilities with a DevOps base you will probably run that workload on AWS. If you need huge amounts of compute to run scientific or statistical modelling, you’ll likely do that on GCP, or if you have information security assurance concerns, then you may choose to run those in Microsoft Azure.  (Disclaimer: each cloud provider mentioned will happily expand on why their corner of the cloud is super-wonderful for each of the workloads mentioned above – the stereotypical labels are blurring as capabilities mature – although…).  There are three important considerations to make before moving your cloud service.  Avoiding downtime is an important consideration  Many vendor-specific services are delivered from single regions within a cloud but are presented to multiple regions as if they are native to that region. Some recent high-profile AWS and Azure outages have been tracked down to specific failures within one or two USA based datacentres - be careful which services and regions you rely upon. Centralised DNS and authentication are the lifeblood of applications, so you might want to build in resilience for these critical services. One thing to note is Hyper-scaler outages are infrequent and when they do happen, they are often resolved much faster than traditional IT outages in an on-premises datacentre. If an outage affects multiple customers, you can guarantee the cloud provider will be analysing how the outage started and how to stop it from occurring again, which often means investing to make the service more resilient.  Make architectures and applications geographically resilient   Distributed consumption of services should be delivered from at least two regional datacentres, keeping the data close to where it is consumed, reducing latency for users. All the hyperscale cloud providers have a good regional split of datacentres, so this isn’t necessarily a reason for a multi-vendor cloud strategy, it’s just simple best practice to avoid a failure in a single datacentre affecting all of the users in that region.  Securing multiple clouds is always far more complex than securing a single vendor cloud architecture.   Leverage CIS (Center for Internet Security) benchmarks for cyber resilience, use multi-factor authentication, and if you do move to the cloud have one SIEM system that monitors what is happening in every environment in one console. If you don’t have one pane of real time visibility into your environment, it will be difficult to look for suspect activity, and you will never spot an attack whilst it is happening. This means the damage done will always be significantly higher than it would be if you are monitoring things in real time.  Ultimately, public and multi-cloud environments grow beyond human scale to monitor and manage.  
This is where cloud agnostic application resource management (ARM), advanced observability, and extended detection and response (XDR) systems, fuelled by artificial intelligence allow the automated optimisation, remediation, and protection that manual human efforts will never achieve.   In summary  I would always recommend that an organisation starts with a single cloud provider and becomes proficient in using that provider’s native tooling and services. Then as the organisation matures its use of cloud and has mastered all the aspects of operating and utilising that cloud provider’s capabilities, it can begin to assess what real benefits can be achieved by moving to a multi-cloud strategy. Don’t try to boil the ocean - it is better to be an expert on each cloud than a generalist on multiple clouds. Third party toolsets will be required for multi-cloud strategies, but vendor native toolsets usually have a more complete feature set for that vendor’s cloud capabilities.  ### Five Ways the Cloud Can Unlock New Opportunities for Your Accounting Practice The accountancy profession is awash with talk of cloud adoption. Indeed, many firms have already recognised some of the tangible benefits that the cloud can bring. However, with digitisation of services competing with traditional accounting processes, there may be fears that cloud adoption could threaten the very existence of the traditional accountant. If your firm is questioning whether the transition to the cloud is worth the new opportunities it may bring, here are five potential benefits to consider. The Cloud: Delivering New Opportunities to the Role of the Accountant Cloud adoption and automation go hand in hand and, for some practices, this has sparked fears that the cloud may render people obsolete. However, it’s worth taking a deeper dive into the potential opportunities that may come with cloud automation.  First and foremost – what is cloud automation? The term refers to the various processes and tools that reduce or eliminate manual efforts used to manage cloud computing workloads and services. While practices may be wary that cloud automation will get rid of many of the activities that accountants normally do, the potential for automation to liberate the accountant is vast. Automation enables software to carry out the same task repeatedly and accurately. While existing systems can deploy automation, the cloud drives the use of more intelligent tools which require a dynamic data set to learn, adapt, and improve in the form of Machine Learning and the application of Artificial Intelligence.Moving to cloud automation can reduce the processing time for the repetitive tasks, helping advisors to focus on working practices that enrich their lives and those of their clients. They’ll derive learnings about their clients using a broader set of data than any single practice alone, which may help practices to support their people in exploring more diverse career paths, and switch focus from compliance to new business development.   Data in the Cloud: Opening Opportunities in the Advisory Space Simply put, the cloud is a great opportunity to consolidate data and bring it together to fuel new services in the advisory space. Most cloud accounting platforms can use Application Programming Interfaces (APIs) to connect to data, which means that advisors can bypass manual data entry and move directly on to data analysis. It’s an opportunity to move past number crunching and to manage data on a larger and more granular level.  
The Power of Real-Time Data in the Cloud The cloud is giving advisors a 360-degree business view of their clients’ accounting landscape, and it could open up new opportunities in the advisory space. The power of real-time reporting, and having that data at your fingertips while you speak directly to clients is extremely powerful.  In a sense, the cloud is driving this new data-driven approach to client relationship management and helping advisors to improve relationships and get to know their clients better. Having real-time data to make informed decisions is a big differentiator. For example, in the past, you may have had to wait for quarterly billing data to come in to report on what may be impacting a client’s revenue streams. With the cloud, that data could be available in real-time, giving you instant access to pivotal business insight that advisors can interpret for clients.  Another important point surrounding real-time data is that having it readily available greatly reduces the possibility of unwelcome surprises. With real-time data, advisors can observe trends gradually, and give pro-active (rather than reactive) advice. There will always be exceptions, however, real-time data can help to introduce foresight into advisory advice, which is once again a strong competitive differentiator for any practice.  Introducing Agility, Flexibility, and Scalability with the Cloud For small and medium-sized practices, scaling is often one of the biggest challenges. For these practices, the cloud may be the answer simply because of the time savings it introduces. If you’re not taken up with data entry and manual number crunching, think of the additional hours you’d have left in the day to take on more work and more clients.  The agility and flexibility the cloud delivers are also critical for business growth and resilience. If practices want to be agile, they need to be able to adapt and change quickly and easily. They need to be able to flex during busy periods such as a financial year-end, or during holiday season. Being able to adapt and be nimble as an organisation is absolutely critical, and the cloud is quickly becoming the most practical way to introduce this flexibility into businesses of all sizes. Even the ‘pay as you go, for what you need’ model is indicative of the cloud’s flexibility. It’s a completely different paradigm from the way organisations paid for infrastructure in the past, and it puts the power back in the hands of business operators.  Company Culture and Cloud Collaboration One cloud benefit that is often overlooked is the power of collaboration that the cloud drives. By nature, the move to the cloud has got to be a collaborative effort. No one person can be the sole driver of change and everyone across a practice must work together to make the move successfully. Practices may worry that cloud-enabled remote working may dampen a company culture, but what is overlooked in this view is the communication and sharing culture driven by the cloud. The technology encourages data sharing and transparency where appropriate, encouraging advisors to engage and collaborate.  Future-Proof with the Cloud Change is constant, and the pandemic has been a pivotal event that has forced companies to change and innovate. As organisations look to the cloud, there are many opportunities for accountants to ready their practices for the future. 
By starting small and moving to a cloud-based model in a considered way, practices are well positioned to unlock a host of transformative opportunities, paving the way for new lines of business and increased productivity.
### Serverless Best Practices in software development
Serverless function-based compute platforms like AWS Lambda are built for scale. They automatically provision computing resources as needed and are designed to handle tens of thousands of requests per second. This makes them a great fit for modern web applications and APIs. But "serverless" doesn't mean you don't have to think about servers or architecture anymore, or that you can completely ignore best practices used in software development. It just means you don't have to worry about infrastructure management and scaling, so your focus can shift to building new features and delivering value faster, which is the ultimate goal of serverless. Here are some best practices we've learned at Serverless over the past few years while building serverless applications on AWS Lambda. What is serverless? Serverless computing is a cloud-computing execution model in which the cloud provider runs the server and dynamically manages the allocation of machine resources. Pricing is based on the actual amount of resources consumed by an application, rather than on pre-purchased units of capacity. It is a form of utility computing. The term "serverless computing" was popularised by Amazon Web Services (AWS) with the launch of AWS Lambda. Serverless best practices Start locally From day one, this is the best practice for working with serverless code. If you can code it locally, you can likely deploy it to AWS Lambda and run it successfully in production. Start locally and build your function using the same language runtime and same SDK as you would use on Lambda and AWS. The role of Lambda is to execute your code on demand, so focus on writing good code first and then start exploring how to deploy that code on Lambda. Use one function per route This will help with debugging and code maintenance. If you want to change the execution path of your API, you can do it by changing a single file instead of making changes to multiple files and routes. Use error handling middleware Anything can go wrong with an API request and you should be prepared for it. Your API might get a request that is not valid, or an internal error can occur during the processing of the request. You should be able to handle these errors gracefully and inform the client about what happened and the possible next steps for them. Log everything Logging helps with debugging and monitoring application performance. It also reveals API usage patterns, which can be used for user segmentation, product improvement, and more. Use AWS Lambda Layers You can use AWS Lambda layers for common package dependencies across all functions so that you don't have to include them in every function separately. This also reduces deployment package size, which in turn decreases deployment time. A Lambda layer is a ZIP archive that contains libraries, a custom runtime, or other dependencies. With layers, you can use libraries in your function without needing to include them in your deployment package. Manage code, not configurations The serverless programming model requires a different approach to configuration management. Rather than managing configurations across all your services, you should manage code. You can use Lambda layers to do this. 
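As a rough sketch of how that can look in practice, the snippet below publishes a shared-dependencies layer with boto3 and attaches it to an existing function; the layer name, function name, runtime and ZIP path are illustrative assumptions rather than anything prescribed here, and in most teams this step would live in deployment tooling rather than ad-hoc scripts.

```python
import boto3

# Sketch: publish a layer containing shared libraries once, then attach it to
# functions so each deployment package only carries the function's own code.
# Names and paths below are placeholders for illustration.

lambda_client = boto3.client("lambda")

with open("shared-deps.zip", "rb") as f:          # pre-built ZIP of shared libraries
    layer = lambda_client.publish_layer_version(
        LayerName="shared-deps",
        Description="Common libraries shared across functions",
        Content={"ZipFile": f.read()},
        CompatibleRuntimes=["python3.12"],
    )

layer_arn = layer["LayerVersionArn"]

# Attach the layer to an existing function instead of bundling the libraries.
lambda_client.update_function_configuration(
    FunctionName="orders-api-get-order",          # hypothetical function name
    Layers=[layer_arn],
)
```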
Layers allow you to separate concerns and reuse code across all the functions. As a best practice, use Lambda layers to manage shared dependencies like libraries, frameworks, SDKs, or runtimes. This approach also has the benefit of reducing deployment package sizes (and thus deployment times) because only changes in your function code need to be packaged and deployed. You can also implement configuration management through environment variables and configuration files, such as .yml files, that are included with your function code at deploy time. Avoid the anti-pattern of putting configurable parameters directly in your function code. Perform load testing This is very important to ensure that the capacity of your applications is in line with what you need, especially during high-volume periods. You can do load testing manually by having a group of people run through your application and perform different actions at the same time. However, an even better approach is to use a load-testing tool that can simulate thousands of users sending requests to your application simultaneously. One such tool is Apache JMeter. Serverless security best practices Deploy API gateways for security API gateways are a standard feature of modern software architecture, and they have several important functions, including the handling of authentication and authorisation. API gateways provide a single point of entry for a variety of services and allow you to hide direct access to other downstream services. This can make it easier to work with third-party APIs and also provide some added security. If you are using an API gateway to connect with other services, be sure to use HTTPS throughout so that you don't accidentally expose sensitive data at any point in the process. In addition to providing simple API endpoints, API gateways can also handle more advanced features like analytics, rate limiting, and monitoring. As your application grows, these features can become increasingly important as they help you optimize your app for performance and scale. Properly Handling Secrets To secure your serverless applications, you must focus on managing the secrets that your functions use. The first step is to avoid hardcoding secrets into the code itself. Hardcoded secrets are a security concern because they are visible to anyone who can view the source code. AWS provides Secrets Manager, which you can access from any Lambda function, and it makes it easy to rotate secrets without redeploying your application. In addition to using a secrets manager, you should also ensure that you specify minimum-permission policies for the IAM roles associated with each function in your application. These policies describe which resources each function can interact with and which actions they are allowed to take on those resources. Properly configured minimum-permission policies make it harder for an attacker to leverage one function to gain access to other resources or privileged permissions within your account. You can learn more about IAM best practices in the AWS Security Best Practices whitepaper. Both of these best practices help reduce the blast radius of a compromised function by limiting what it is capable of doing within your AWS account. This means that even if an attacker does manage to get access to one of your Lambda functions, the damage they're able to cause will be limited. Limiting Permissive IAM Policies When using a serverless stack, most of the permissions for AWS resources need to be set in an IAM role. 
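To make both of these ideas concrete, here is a minimal sketch, under assumed names, of a function fetching its database credentials from AWS Secrets Manager at runtime, alongside the kind of narrowly scoped IAM statements such a role might carry; the secret name, table name and account ARNs are placeholders, not a prescribed configuration.

```python
import json
import boto3

# Sketch: fetch credentials from Secrets Manager at runtime (no hardcoded secrets),
# cached outside the handler so warm invocations reuse the value.
_secrets = boto3.client("secretsmanager")
_db_credentials = None

def get_db_credentials():
    global _db_credentials
    if _db_credentials is None:
        response = _secrets.get_secret_value(SecretId="prod/orders/db")  # hypothetical secret name
        _db_credentials = json.loads(response["SecretString"])
    return _db_credentials

# The function's IAM role would then only need statements like these
# (resource ARNs are placeholders), rather than blanket access to the account:
LEAST_PRIVILEGE_POLICY = {
    "Version": "2012-10-17",
    "Statement": [
        {"Effect": "Allow",
         "Action": ["secretsmanager:GetSecretValue"],
         "Resource": "arn:aws:secretsmanager:eu-west-1:123456789012:secret:prod/orders/db-*"},
        {"Effect": "Allow",
         "Action": ["dynamodb:GetItem", "dynamodb:PutItem"],
         "Resource": "arn:aws:dynamodb:eu-west-1:123456789012:table/orders"},
    ],
}
```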
In general, these roles should be given the least amount of permission needed to function properly. However, this can be difficult to accomplish with serverless functions because the code is not necessarily known at the time the role is created. Therefore, it is common practice to give a serverless function's role full access to AWS resources. This is a very dangerous practice and should be avoided! When you grant a role access to all AWS resources, you are inviting disaster. You must assume that at some point someone will deploy malicious or careless code on your serverless stack. This can happen in many ways: multiple developers could deploy code that they wrote while experimenting across your AWS accounts; an intern could copy and paste code from a StackOverflow example that has malicious intent (this has happened); a disgruntled engineer could deploy code that intentionally deletes your production database; or a compromised CI/CD pipeline could upload malicious code without anyone noticing. Restricting Execution Times It's not safe to assume that your functions will always execute within the expected time once you deploy serverless applications. They might take a little longer for some reason, or you might have a sudden spike in traffic, or something else could happen that causes an unforeseen delay. These delays can lead to timeout errors and even deployment failures, which are far from ideal. To prevent this, configure the maximum amount of time each function is allowed to run. If the execution time exceeds that limit, the function will automatically terminate. This will give you peace of mind knowing there's at least one safeguard in place to stop workloads from running for too long. Consistent Conventions Consistency is the key to building a product that is easy to use, understand, and maintain. Proper conventions around naming, configuration, and directory structure will help you build a serverless architecture that scales with your codebase. Here are some of the conventions you should consider: Naming Conventions One of the first things to consider when creating a new service is how you are going to name everything. In addition to naming particular service functions, components, events, etc., make sure to have a process for naming environment variables. My recommendation is to use one word in all CAPS for environment variables. This will be consistent with the other services and frameworks you use. It also makes it easy to search through your codebase for instances of that variable because it won't be confused with any other word in your application code. Directory Structure Serverless apps are highly scalable, so it makes sense that they can grow quickly as well. To keep your codebase maintainable as it grows, I recommend having at least one more level of directories than AWS has defaults for. Takeaway Serverless computing is a powerful way to build web applications and APIs with high availability, fault tolerance, and low maintenance costs. It requires an architectural shift, but the payoff is a faster time to market and a new way to focus on your product while leveraging existing computing services. The potential pitfalls of serverless architectures should be addressed carefully to ensure your cloud project will deliver what you need. Hopefully, this article will help you "think outside the box" and approach a serverless application with the right mindset. 
There is no silver bullet here, so make sure to combine your learnings with your specific context and projects to find what works best for you and your team. In the end, it's not about doing things a certain way, but rather delivering business value faster, cheaper, and more efficiently—which should be every developer's main priority anyway. ### The Two Ms Critical to IT innovation As digital acceleration continues unabated, enterprise-level IT departments face a complex logistical challenge. As a result, they must find ways to manage their multiple digital and technology partners efficiently in a way that still delivers a cutting edge customer-first solution. The answer lies in applying a Multi-Vendor Environment model and appointing Managed Service Providers to run it. Organisations across the globe have added various new technologies to their IT infrastructure in the past decade to meet their customers' needs more efficiently. By pivoting towards micro-services, API and cloud-based infrastructures, companies have embraced innovation, allowing them to improve the way they serve their customers. However, this explosion in new technologies means firms now have more vendors than ever on their roster. A 2020 consumer study by the Ponemon Institute revealed that the average enterprise-level organisation has around 5,800 third party vendors – this number is expected to grow by 15% by the end of this year. What are the implications of this expanded vendor base in terms of impact on resources, and what are the most viable solutions available, now and in the future? The unexpected challenges of embracing innovation While many organisations have successfully increased their investment in standardising and optimising their IT and digital estate, there is room for improvement when it comes to maximising their return on investment. This is because most businesses typically spend too much time on multi-vendor management. A recent survey of global IT leaders by the MACH Alliance found that IT teams are spending 39% of their time delivering upgrades, with 28% saying more than half of their IT team time is dedicated to this - including managing the partners supporting this work. That represents a considerable portion of time (and budget) that could be focused on innovation and delivering crucial customer experience improvements. Not surprisingly, the same MACH Alliance survey found that this leads to strong resistance to change among IT/implementation teams. As a result, it’s increasingly clear that organisations with complex IT infrastructures have to find new ways of ensuring in-house teams and their various partners work together towards a common goal. Seamlessly integrating multiple solution partners is critical. In turn, this translates into effective time management for IT teams, better customer experiences and ultimately, higher revenues. The challenge is that the multiple partners these organisations rely on are often unaware of the interdependencies at play.  Consequently, IT teams have to allocate additional time and resources explaining to each vendor why their company is not just another client – and why they need more bespoke solutions than the other businesses in their industry. After all, having third-party partners who are misaligned increases the chances of mistakes and can create obstacles to improving the customer experience.  
Working with companies across various sectors, I have learned that there are common recurring problems that many clients experience when managing a complex IT landscape with numerous multi-vendor moving parts. These include: high costs; staffing challenges; a lack of visibility into the interdependencies between partners; the increased amount of time that IT spends with partners to maintain the status quo; inefficient performance measurement and optimisation; a lack of understanding of responsibilities between partners; and vendors and partners being reactive rather than proactive. Inevitably, these issues contribute to inefficient or less-than-optimal service performance. Why A Multi-Vendor Environment is worth considering A Multi-Vendor Environment is a fluid model that allows organisations to manage multiple partners to facilitate collaboration and coordination between them, ultimately freeing up valuable time and resources within in-house IT teams. It's a hybrid approach that uses technology to help enterprises improve how they integrate their partners and suppliers, saving time and effort in solving recurrent issues. Leading consultancy Gartner estimates that organising multiple vendors effectively can lead to a 40% saving in management costs. So it's surprising that more businesses aren't taking advantage of a Multi-Vendor Environment to optimise their relationship with their providers and make life more manageable for their IT teams. What's more, with most enterprise solutions moving toward a cloud-first approach, automation needs to be at the core of Managed Services, and a Multi-Vendor model enables organisations to develop a more automated environment. It facilitates a 360-degree view of IT activities, making it easier to optimise and automate tasks. By implementing a global service platform such as Jira Service Management (JSM) from Atlassian, organisations can benefit from greater visibility and accelerate the flow of work between vendors. Another core benefit of a Multi-Vendor Environment is its customer-first delivery model. While a customer-first approach sounds obvious, in practice it requires companies to anticipate their customers' needs and expectations and go above and beyond to deliver a more meaningful experience at every touchpoint. A customer-first delivery model helps to promote more engaged and personalised customer relationships in the long run, which in turn helps drive revenues. That said, the ultimate goal of implementing a Multi-Vendor-based ecosystem is to provide time-pressured in-house IT teams with the room they need to focus on driving innovation and digital transformation instead of troubleshooting and supporting day-to-day tasks. However, the success of this approach also depends on bringing all the vendors together under one roof with a single point of contact (POC). This is where the role of Managed Service Providers is key. By serving as a single POC, Managed Service Providers help facilitate communication and monitor vendor performance. So what can companies expect from Managed Service Providers? As a baseline, Managed Service Providers should deliver a wide set of services, including 24/7 maintenance and support, continuous development, performance audit, analysis, remediation and real-time monitoring. The aim is to help clients improve their understanding of their ecosystems, suppliers and partners. Meanwhile, by optimising companies' use of data, Managed Service Providers help companies make decisions that add value and deliver more substantial return on investment. 
Managed Service Providers aren't limited to operational issues: they can also help facilitate procurement efforts, orchestrate complex product upgrades and minimise staff training and churn, stretching client budgets to do more. By taking over the primary responsibilities surrounding performance optimisation, Managed Service Providers can help companies refocus their IT team's efforts where it matters most: on driving innovation.
### Under the Hood of a Data Resiliency Cloud
As organisational data becomes more valuable and threats constantly evolve, businesses must ask themselves if they are doing enough when it comes to keeping their data safe. The challenges are clear: organisations are creating vast quantities of business-critical data which is spread across multiple environments, creating more risk. Meanwhile, IT staff are stretched thin due to a global talent shortage, and simply cannot take on more work. The digital talent shortage has been particularly challenging for UK businesses. In fact, recent research shows that as tech job vacancies continue to skyrocket, two-thirds of digital leaders in the UK are unable to keep pace with change because they lack the talent they need. These IT teams are all too often running from behind to ensure this data is protected and managed efficiently. However, to keep their business resilient and on pace with growth, they must advance their core functions of data management and protection. To do so, IT teams need to ask themselves the million-dollar question: how do we streamline data management to increase efficiencies, while also ensuring data is secure from threats? Businesses need a single solution that can meet these requirements. An approach that is grounded in the simplicity, scalability and security of the cloud is the only way to deliver on such a promise. A Modern Approach A cloud-first approach enables organisations to maximise their data's value to the fullest extent, while also addressing the core challenges every business is facing. IDC predicts that by 2025, 55 percent of organisations will have migrated their data protection systems to a cloud-centric model to ultimately simplify management and protection. So, what is a data resiliency cloud? A data resiliency cloud is a modern platform that delivers automated capabilities designed to keep a business protected and efficient. By leveraging the elasticity and efficiency of the cloud, organisations can take advantage of unlimited resources and additional layers of protection that will enhance their cyber and data resiliency. The Key Pillars This modern solution operates on a Centralised Model, which allows an enterprise to bring all data-related operations into a single platform. A centralised system will break down silos and integrate data management functions automatically without IT oversight. This convergence of management into one interface, no matter where data is located, makes the task of data protection, security and, above all, resiliency far less resource-intensive. Next, as organisations migrate their business processes to multi-cloud environments, they must be able to protect all of their data within this centralised platform. A Comprehensive Control Plane that supports multiple clouds will enable this functionality and help businesses regain control of their data environments. A multi-cloud control plane is built on three principles: no infrastructure, global policies, and self-service with central oversight. 
Without infrastructure, businesses can run and scale in any environment, and with global policies, they will have oversight across all environments. A self-service functionality will also help IT teams delegate responsibility to certain application owners, while retaining centralised control. While a multi-cloud control plane will provide central oversight, organisational data must be protected. As cyber threats and ransomware attacks continue to advance at an alarming rate, organisations increasingly need to not only prevent threats to their data but bounce back with minimal disruption when they inevitably experience an attack.

A Zero Trust Approach, the third pillar, will ensure that different layers of the security stack work independently from each other. This mindset is about always assuming that the layers above and below one another are already compromised. It's about never trusting, and always verifying. Zero trust is based on a verification process which should treat every access attempt as if it originates from an untrusted source. Access to an organisation's network should only be granted after an identity has been authenticated through single sign-on (SSO) and/or multi-factor authentication (MFA). This is critical because it will help prevent bad actors, whether internal or external, from being able to gain access to an organisation's network. In addition, a cloud-native solution that enables cyber and operational resilience backs up data regularly, while keeping it air-gapped, immutable, and always available to restore. It also helps organisations orchestrate their recovery from an attack with minimal downtime and disruption.

A data resiliency cloud is also fully Autonomous. Automated capabilities can detect threats and keep a business' protection environment up to date on security updates, security patches and best practices without depending on people. This eliminates daily management and allows IT teams to focus on higher-priority tasks that add greater value to the business. In addition, an autonomous engine is able to provide insights into unusual activity, which helps businesses to rapidly detect, fight and mitigate internal and external threats.

Last, these pillars deliver a Simplified Cloud Experience, which means that the solution is 100% SaaS and designed for the cloud - not just ported. With SaaS there is no hardware to invest in or manage. Organisations can reduce costs by never needing to buy or install software applications themselves, only paying for what they actually use. A true cloud experience provides organisations with a level of self-service. This means that when IT teams need to perform daily tasks such as creating a backup copy or running a restore, they do not need to wait for a central team and the task can be completed almost instantaneously. Of course, with self-service there should always be some form of central oversight to provide a safety net for the organisation and its data.

Choosing What is Right for the Business

Data environments are only becoming more complex, and cyber threats are more severe and destructive. Companies need to move past legacy approaches to achieve resiliency, and leveraging the cloud is a great place to start. A data resiliency cloud provides organisations with unlimited resources to protect their data.
When such a system has all five pillars – a Centralised Model, a Comprehensive Control Plane, a Zero-Trust Approach, Fully Automated Operations, and a Cloud-Native Experience – organisations will be able to remain resilient no matter what gets thrown their way in the future.

### Finding your silver lining: What multi-cloud means for your data strategy

There's a chance you might have heard about multi-cloud in the last year or so. And for good reason, too. According to IBM's State of Multicloud report, 30% of organisations are using three or more clouds, while multi-cloud strategies are rapidly becoming an established part of the digital playbook, with 21% of businesses overall describing it as their 'most important initiative'. We hear plenty about its potential for getting apps into production quickly. But what about doing the same for data? Well, when it comes to cloud data warehousing (an enterprise system used for storing data from various sources), it's generally the case that two (or more) clouds are better than one, given the variety and complexity of data handled in the modern day. A multi-cloud architecture can present any number of combinations of cloud platforms and cloud data warehouses (CDWs), and there are a multitude of factors that might compel an organisation to set up their infrastructure in this way. However, there are several common use cases and reasons that we see every day:

- Meeting regional needs – The need to fulfil data compliance and sovereignty requirements, as well as the freedom to scale up and down in the different markets that make most sense for your business
- Data recovery – Multi-cloud architectures can be a safeguard against cloud outages, disasters and other unexpected downtime by having your data stored across multiple platforms and warehouses
- Vendor diversification – Multi-cloud allows greater flexibility for your organisation should either pricing, storage or compute offerings change on their end, or demand for data on yours. It's also well known to be a boon to businesses trying to avoid vendor lock-in, especially those tied to legacy, data-intensive applications and platforms
- Different tools for different needs – Data and IT teams are not homogenous; they contain people with a whole range of experience and preferences using different tools, technologies and datasets to support the varying needs of different lines of business, meaning a diversity of cloud environments is key to meeting their needs.

However, multi-cloud isn't a silver bullet solution and, like any technology, comes with its own data risks and challenges to navigate. Without a solid underlying strategy for where to store and execute against your data in the cloud, organisations risk facing serious problems. Let me explain why.

Solving the multi-cloud puzzle

Inherently, a multi-cloud configuration creates data silos, with data stored in various data warehouses and lakes across multiple platforms, locations and environments. This outcome is by no means intentional, but leads to inconsistencies in output as individuals apply their own rules to and work on different streams of data. Ultimately, such silos are a major obstacle to that "single source of truth" that businesses so desperately seek, as well as their efforts to become truly data-driven. Portability is also a major cause of this; organisations find it hard to break out of data silos because their data is stored in different formats across a variety of technologies and platforms.
Vendor lock-in is so common partly because of this lack of interoperability between providers, and while portability solutions exist, at present they're expensive to obtain and maintain. Both data silos and the lack of portability are ongoing issues, because moving data between platforms or regions can pose a data security risk without the correct governance, processes and security controls in place. So how can businesses safeguard against these data challenges and ensure they capitalise fully on the freedom, flexibility and efficiencies associated with a multi-cloud architecture? First, we need to remember that multi-cloud architectures are not one-size-fits-all, just as business problems aren't either. They can comprise a mix of public and private cloud infrastructures, or use a variety of different cloud data warehouse providers, such as Amazon Redshift and Snowflake. Or perhaps you might be hosting operational data stores in AWS but migrating and doing data analytics in Azure. Often multi-cloud is all of these at once! Whatever the scenario, a unifying data management layer that allows for the secure passage of data across warehouses, platforms and regions is always paramount. It sets the foundation for 'Cross-cloud' data sharing, in which organisations deploy a single type of cloud data warehouse that can operate on multiple cloud data platforms. Snowflake customers can launch their CDW on AWS, Google Cloud and Azure, for example. Or you could flip this configuration, and use a single underlying cloud platform and multiple CDWs. Whatever solution you opt for, a strong data transformation and loading process is essential if you're to extract maximum value from your data in a multi-cloud infrastructure. Although these architectures can be complex without the right integration strategy, they ultimately help data teams to be more productive, flexible and efficient. When fed into a robust data strategy across the organisation, this then allows companies to take more value from their data and make more informed decisions – the ultimate silver lining sought from any cloud.

### How Cybercrime-as-a-Service is luring youngsters into a life of criminal activity

A new generation of hacker is here. Research by the UK's National Crime Agency (NCA) suggests that people as young as 12 could be at risk of becoming involved in cyber-dependent criminality. Let that sink in for a second. The societal implications of adolescents being involved in serious crime would stir the emotions of many. And it is set against a challenging landscape for authorities trying to police it, often across international borders. One such example is the LAPSUS$ cybercrime group, which has claimed several high-profile scalps including Microsoft, T-Mobile, and the Brazilian Ministry of Health, to name a few. Seven people aged between 16 and 21 with alleged links to the group were arrested by City of London Police in March, and two teenagers have since been charged with hacking offences. Putting aside the size and scale of these breaches, it is perhaps the age of the alleged perpetrators which is most shocking. It is just one example in an overall trend which raises the question: how are teens ending up involved with underworld activity? The reasons are complex and nuanced. Some may be looking to reap financial rewards, but for others the objective is to complete a challenge, gain a sense of achievement, and win a 'badge of honour' within their peer group.
Their initial motivations may also have been driven by wider issues in modern society, forcing them down a path of criminality. For others, they may now be trapped into doing the bidding of criminal gangs with no easy way to escape. And it is today's increasingly cloud-based economy which could be fuelling this trend further. In this article I'll explore the impact of the cybercriminal marketplace switching from a direct sales model to a managed service model. The result is a swell of young people sucked into a life of crime because of 'cybercrime-as-a-service'.

Teen hackers arming themselves

The security community underestimates the younger generation. We forget teens today have not only grown up with computers, but also have access to an unprecedented number of educational resources on programming and offensive security. While the list of targets LAPSUS$ has breached is impressive, the techniques used to compromise them were not novel, nor were zero-day exploits used. They took advantage of the difficulty large organisations face in managing dispersed workforces and networks, with their success likely enabled by the 'cybercrime-as-a-service' market. It works by following the same idea as other 'as-a-service' offerings in business, but with a criminal twist. Those who have written malicious code rent out access to their own "cybercrime solutions" to lower-level criminals who either don't have the resources or know-how to design, write, and execute cyberattacks on their own. The pay-off is a cut of any profits made in an attack which uses the code. For those starting out in this illicit world, it gives access to a suite of tools for criminal enterprise, including malware, stolen databases, social engineering attacks, and more. And as these malicious actors continue to specialise, it increases competition within the marketplace as well as the effectiveness of their code. Ultimately, with the barriers to entry reduced, it is much easier for new entrants to launch threats.

How do teenagers learn to carry out these kinds of attacks?

Age is increasingly less of a factor when it comes to cybercrime, as many hackers start out as tinkerers from very early on. Whether it's playing with code or hacking items around the home, in most cases these individuals are self-taught. And there's a thriving collaborative community online which caters to aspiring hackers. Everything is available, from tutorials on how to perform different types of attacks to services that help attackers monetise stolen data. Couple this illicit opportunity with lower barriers to entry and it fuels a growing swell of young people involved in crime. But this comes with clear risks. It goes without saying that using their coding skills to develop malware or getting involved in other cyber criminality is breaking the law. If caught, the penalties are extremely serious, such as imprisonment, or even extradition to a foreign power. The naivety, gullibility, and inexperience of youth are also ripe for exploitation by other criminals. It can lead to some kids becoming a pawn in a wider game, or carrying out the bidding of a criminal gang online – while carrying all the consequences of their actions.

Cybercrime-as-a-service creates a level playing field

The size of cybercrime groups can vary considerably, ranging from nation states with near unlimited resources, to organised groups of differing sizes who are mostly motivated by profit. These hacking collectives will cherry-pick targets, looking for the biggest reward.
Where nation states will hack for intelligence, smaller groups such as LAPSUS$ can inflict just as much damage, though we often see different goals – such as stealing source code. We also see individual lone wolf 'script kiddies' looking to make a name for themselves, often armed with code procured from the cybercrime-as-a-service marketplace. It levels the playing field with these larger groups, so to speak. Ultimately, it's this marketplace which is putting frighteningly powerful tools in the hands of anyone wanting to exploit them, as well as exposing young people to extremely serious crimes which could appear novel from their perspective. Much greater education is needed as a result, both for the businesses tasked with protecting their customer data and IP, and in the education system more broadly. The risks for young people being drawn into a life of crime need to be clearly communicated, in a similar way to other serious crimes such as drugs and violence. It also calls into question the wider societal reasons why young people are drawn into crime generally, and the need to protect them. And just because a cyber attack can be carried out from a bedroom at home doesn't make it any less serious.

### Addressing the infrastructure requirements needed for effective AI adoption

The COVID-19 pandemic has greatly accelerated the rate of artificial intelligence (AI) adoption throughout the business landscape, but in doing so, it has also placed a huge amount of unexpected extra demand on many organisations' existing computing resources and supporting infrastructure. A recent report into the state of AI and machine learning found that throughout 2020/21, over half (55%) of businesses questioned had significantly accelerated their AI adoption compared to the previous year, with more than two thirds (67%) of respondents expecting to increase it even further over the coming 12 months. With adoption accelerating at such an unprecedented rate, the cost of overheads is becoming more central to discussions – so much so that business leaders who fail to properly plan ahead can quickly find themselves with a serious headache. The rapidly growing trend towards AI adoption was reinforced by Ritu Jyoti, group vice president for AI and Automation Research at IDC, who recently commented that COVID-19 and the problems it created had quickly "pushed AI to the top of the corporate agenda, empowering business resilience and relevance." Jyoti added: "We have now entered the domain of AI-augmented work and decision making across all the functional areas of a business. Responsible creation and use of AI solutions that can sense, predict, respond and adapt at speed is an important business imperative." In light of these comments, it's perhaps unsurprising that IDC expects the value of the global AI market to increase by more than 15% to $341.8 billion this year and surpass $500 billion by as early as 2024.

Flexibility is crucial in modern business infrastructure

When it comes to modern business infrastructure, flexibility is absolutely crucial. This makes cloud – and specifically hybrid cloud – the ideal foundation for AI, particularly as the volume of data involved continues to grow over time. The versatility of hybrid cloud solutions means organisations of all sizes can meet the technology demands of AI at a cost that is sustainable to them, even as their business naturally fluctuates over time.
Adopting Infrastructure-as-a-Service (IaaS) gives organisations the ability to use, develop and implement AI without sacrificing performance. However, choosing a suitable IaaS provider is key, and there are a number of elements that every organisation needs to address before making a final decision.

1. Storage scalability and capacity

For many businesses, the ability to scale storage as data volumes grow is now a fundamental requirement. However, the type of storage best suited to an individual business's needs depends on a variety of factors including whether they need to make real-time decisions, as well as the level of AI they plan to use. For example, a financial organisation that uses AI to make real-time trading decisions will likely require an extremely fast, all-flash storage solution. However, other companies in sectors that don't move quite so quickly tend to be better served by larger capacity but less rapid storage options. Another major consideration for businesses is how much data their AI applications will generate. The more data AI applications are exposed to, the better, more accurate decisions they are likely to make. Databases continually grow over time as well, so storage and expansion capacity need to be under constant review in order to avoid unanticipated issues down the line.

2. Overall computing performance

In order to take full advantage of the benefits AI has to offer, organisations need access to powerful computing resources. In particular, machine learning algorithms require speed and performance to constantly transact vast numbers of calculations. While CPU-based environments can handle basic AI workloads, deep learning involves multiple large data sets and the capability to deploy scalable neural network algorithms, which can make GPUs a better bet. The superior performance provided by GPUs can significantly accelerate deep learning compared to CPUs. However, it's perhaps no surprise that this increased speed and performance comes at a significantly higher cost, which means it may not always be financially viable to make the switch. Ultimately, finding the right balance for the required task is key.

3. Data security

AI can involve handling large amounts of sensitive data, including personally identifiable information and financial records, so it's critical that any infrastructure used is fully secured with the latest end-to-end encryption technology. It goes without saying that a data breach can be disastrous for any business, and history is littered with examples of this. However, the introduction of AI adds another element of danger to this because any infusion of bad data (whether malicious or otherwise) could potentially cause the system to make incorrect inferences, leading to flawed decision making that could significantly impact operational efficiency.

4. Reliable networking infrastructure

Networking is another key component of AI infrastructure. Good, fast, reliable networks are essential to maximising the delivery of results. Deep learning algorithms are highly dependent on communications, so networks need to keep pace with demand as AI efforts expand. Scalability is a high priority and AI requires a high-bandwidth, low-latency network. It is important to ensure the service wrap and technology stack are consistent for all regions.

5. Overall cost-efficiency

The more complex an AI model is, the more expensive it is to run and maintain. Consequently, it's important to ensure that the extra cost can be justified by the extra performance achieved.
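To make the GPU-versus-CPU trade-off in point 2 concrete, here is a toy back-of-the-envelope sketch. Every figure in it (hourly prices, training time, speed-up) is an illustrative assumption rather than a benchmark or a real provider price; the point is simply that the more expensive instance per hour can still be the cheaper one per training job.

```python
# Toy cost-per-job comparison: hourly prices and the speed-up factor below are
# illustrative assumptions, not real benchmarks or provider prices.
cpu_price_per_hour = 1.00   # assumed on-demand price of a CPU instance
gpu_price_per_hour = 4.00   # assumed on-demand price of a GPU instance
cpu_hours_per_job = 50      # assumed training time on the CPU instance
gpu_speedup = 10            # assumed deep-learning speed-up on the GPU

gpu_hours_per_job = cpu_hours_per_job / gpu_speedup

cpu_cost_per_job = cpu_price_per_hour * cpu_hours_per_job   # 50.00
gpu_cost_per_job = gpu_price_per_hour * gpu_hours_per_job   # 20.00

print(f"CPU job cost: ${cpu_cost_per_job:.2f}")
print(f"GPU job cost: ${gpu_cost_per_job:.2f}")
```

Run against an organisation's own measured workloads and quoted prices, the same arithmetic helps decide whether the higher hourly cost of GPUs is justified by the performance gained.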
Businesses need to make careful choices and identify IaaS providers that can offer cost-effective dedicated servers as a means to boost performance and enable them to continue investing in AI without increasing their budget. Any organisation looking to successfully incorporate AI into its business operations for the first time must ensure the right infrastructure is in place to support the move. Finding the right IaaS provider will play a huge role in this and can ultimately dictate overall success or failure. As such, it's crucial that organisations take the time to find a provider that is best suited to their overall business needs and objectives. Focussing on the five key areas specified in this article is a great place to start and will help set you on the path to long-term success.

### AI Security VS AI Cybercrime – A Game of Cat and Mouse

With cybercrime continuing to rise and become ever more complex, Philip Bindley, Managing Director of Cloud and Security at Intercity, discusses how artificial intelligence (AI) is being 'weaponised' by cyber criminals. Earlier this year, the UK government released the results of its seventh Cyber Security Breaches Survey exploring cyber security policies and the threat landscape for businesses of all sizes. The report found that 39% of UK businesses identified a cyber-attack in the last 12 months, with phishing named as the most common threat, having affected 83% of businesses. The creation of phishing and spear-phishing attacks, where artificial intelligence is used to curate the content of an email to make it look more genuine, is just one example of how cybercriminals have weaponised AI so they can infiltrate an organisation and raise the cyber security threat level. These increasingly sophisticated emails often use AI to emulate the tone of voice and mimic the content of previously sent emails. They may even include a topic that is currently being discussed within the organisation – perhaps a new product or service, which someone from outside of the organisation wouldn't, or at least shouldn't, know. The purpose is to make these emails, and any action required by employees, as convincing and realistic as possible to encourage someone to act, giving cybercriminals access to the organisation's wider systems and data. Another example is through altering AI algorithms and creating intelligent malware that learns how to bypass the security tools used by an organisation in a way that doesn't look suspicious. Over time, this technique allows cyber criminals to slowly infiltrate an organisation and cause harm.

How can organisations use AI to fight back?

Traditional mechanisms for identifying cybersecurity threats have often relied on recognising a pattern or a known threat. AI can augment that in a number of ways by learning what "normal" behaviour looks like and creating a baseline for the kind of events that occur in everyday operations. In this instance, AI would be used to flag events that fall outside of the norm to make users aware of the potential threat. However, this method of identifying threats, such as a piece of ransomware, relies on the AI detecting activity that does not normally occur, and is therefore often unreliable in detecting true threats. For example, a flagged event might be a device trying to communicate with a system based in a different country, or large amounts of data being sent to a destination that normally does not receive such payloads. Visibility and vigilance are the key.
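As a rough, hedged sketch of that baseline-and-flag idea (not a production detector – the event features, thresholds and data are invented purely for illustration), the snippet below trains scikit-learn's IsolationForest on a set of 'normal' traffic events and flags new events that fall outside the learned norm:

```python
# Minimal baseline-and-flag sketch: learn what "normal" events look like,
# then flag events that fall outside the norm. Feature choices (MB sent,
# destination id, hour of day) are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(42)

# Baseline of everyday events: modest transfers, familiar destinations, office hours
normal_events = np.column_stack([
    rng.normal(5, 2, 1000),      # ~5 MB transfers
    rng.integers(0, 3, 1000),    # usual destination ids
    rng.integers(8, 18, 1000),   # 08:00-18:00
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_events)

# Two new events: one routine, one large off-hours transfer to an unusual destination
new_events = np.array([
    [6.0, 1, 10],
    [900.0, 7, 3],
])

for event, label in zip(new_events, model.predict(new_events)):
    status = "FLAG for review" if label == -1 else "looks normal"
    print(event, status)
```

A real deployment would use far richer features and route flagged events to a human analyst rather than acting on them automatically – exactly the blend of AI and human oversight discussed below.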
AI-based technologies help by allowing organisations to spot activity that might not in itself signal a breach but could be the precursor to something more serious. Here are some notable examples:

• Email scanning: The common consensus is that traditional methods of email scanning are not always effective when it comes to identifying potentially malicious emails. However, AI email scanning is forward-looking and can understand whether or not an email is potentially malicious based on its content and tone, and the actions it is trying to solicit.

• Antivirus products: AI in antivirus products is predominantly designed to detect anomalies, creating a challenge in finding the right balance between the 'machine' making all the decisions and the level of human intervention. Overzealous technology could, potentially, block legitimate traffic and programs. Having a blend that uses both AI and human oversight is arguably the best way forward. Endpoint Detection and Response is the favoured approach for combining AI and human interaction, whereby technology on the endpoint is deployed to protect and inform against known threats based on AI and the machine. It's then down to human interaction to further diagnose the situation and make a choice on the best course of action.

• Machine Learning (ML): ML provides a huge advantage when looking to spot behaviour that is out of the ordinary. In security products, ML is designed to learn how a system normally reacts under various conditions, helping the AI to sharpen its anomaly detection powers. However, ML is less helpful when it comes to detecting a potential piece of malicious code or a suspicious email based on patterns or acquired knowledge of what something bad might look like. That said, both networks and endpoints can benefit from a layer of ML that creates a picture of what normal patterns of network traffic or device activity look like and can report abnormalities, or even block traffic and access to systems, based on the detection of something unusual.

• Natural language processing (NLP) apps: NLP apps form another layer in the security paradigm that makes the detection of malicious code easier. There are often patterns in the way in which harmful code is written that point to a likely threat based on the approach of the attacker. Common patterns in the language or code will also point to certain individuals or groups that are creating the malware, helping to identify the attacker's 'voice' and a potential individual behind the attack.

What does this mean for the future of cybersecurity?

There is a constant battle between cyber-criminals and cyber-defenders, often with humans and machines working together to try to outsmart each other. But we are moving into an era where, due to the sheer volume and complexity of these attacks, the more that can be automated and continually improved through AI, ML and NLP, the more we will see defences improve. However, on the other side of the fence, cybercriminals will be using the same technology advancements to further their cause, which in turn leaves each side constantly trying to be one step ahead of the opposition. Keeping ahead of the cyber-criminals will continue to be the game of cat and mouse it always has been. However, the playing field is only going to get bigger. The move to automation with autonomous vehicles and smart cities will exponentially increase the surface area for cyber attacks.
This only serves to underline the continued need for AI, but also shows how the stakes are likely to get even higher.

### Tackling Cybersecurity Challenges in the Hybrid Age

Is technology changing how we work, or is work changing how we use technology? Regardless of the cause, remote and hybrid working is now the norm and, with a potential four-day week on the horizon, it's clear organisations can no longer rely on a centralised network to provide their people and customers with the required level of support or security. The rapid move towards remote and hybrid working has exposed IT systems to a host of new, and previously unconsidered, security challenges. Organisations are now only as secure as their weakest point, which can expose systems to a range of potential threats. The urgent need to safeguard personal data and enhance cybersecurity activities should sit at the heart of all organisations' security strategies for the future. Recent increases in data breaches and cyber-attacks on both private and governmental organisations globally have highlighted the growing demand for comprehensive cybersecurity strategies. The pandemic has shown how the workplace can, and needs to, adapt to decentralised working without decreasing the level of accessibility, security and data protection.

Cybersecurity Challenges at the Heart of Digital Transformation

The pandemic forced the hand of many organisations and posed various cybersecurity challenges around how employees access, share and work with data. The sudden lockdown measures meant cyber security managers had to meet the technology demands of a WFH set-up at short notice, with limited procedures and policies in place on how to manage cybersecurity for remote workers dealing with customer data. It also demonstrated that there was no 'one size fits all' solution to those problems, which varied from team to team. Part of the reason why IT departments initially struggled to monitor and protect data and mitigate cyber risk was the lack of industry standards on how to navigate and protect sensitive data.

Personalising the approach to building a human firewall

It's not just down to the technology. A tailored approach that assesses the level of cybersecurity knowledge within an organisation on a case-by-case basis can help decision makers better understand the current gaps in data safeguarding and the effectiveness of existing operational practices. Senior leadership may, in all likelihood, find a personalised approach will allow them to identify where additional training may be needed for individuals and key stakeholders. Ongoing employee training is consistently identified as a key driver in enhancing cybersecurity, regardless of industry or sector. An employee who completes regular training on how to use technology securely is less likely to make poor decisions or find a workaround which could put an organisation at risk. This approach can also help when it comes to gaining employee buy-in, through clear communication and the offer of professional development.

A well-managed cyber security strategy

While on the frontline of defence, IT departments cannot stand alone in providing the resources and solutions being called for. They need metrics, they need collaboration, and they need organisational buy-in. Ricoh's Leading Change at Work report identified a set of general actions companies across different industries can take to start building their digital transformation strategy and cybersecurity policy.
Here are some top-line recommendations which could serve as a guideline and ease the framing of operational gaps:

- Revisit strategic objectives for IT departments, as digital transformation and security are the lifeblood of business continuity
- Adopt a layered approach with defence in depth
- Move from a reactive approach to a proactive one, with 24/7/365 monitoring and threat hunting
- Get employee buy-in through training and communication to help protect data by reinforcing good practices
- Build policies on a personalised foundation and a joint effort between robust IT teams, with an in-depth technical understanding of vulnerabilities, and senior managers who understand the specifics of the mitigation strategy

It's been a steep learning curve for many, but with the way we work continuing to evolve at a rapid pace we need to be able to pivot and adapt to meet security needs.

### Managing cloud costs

Cloud-based solutions can offer your business a way to provide a scalable and reliable IT infrastructure that supports the company's development and growth. But it needs to be actively managed to control potential hidden costs (and it can be harder to ensure cloud spending is actually driving growth). Cloud helps businesses to ensure business continuity by guaranteeing backup solutions and reliable disaster recovery, which would otherwise be costly and time-consuming to manage on-premise. The ability to gain quick access to data after a failure results in less downtime and saves the business time and money. But, for a business to get full value from the cloud, it needs a cost-efficient and scalable infrastructure in place – one that constantly adapts to business needs and that doesn't cost a fortune in the process. Cloud-based solutions also reduce IT management overheads because the organisation can use a pay-as-you-go model, only paying for what it uses. This allows the business to scale more efficiently.

Tackling the rising cost of cloud

There's been a rise in cloud costs as more businesses adopt the technology. This is contrary to what you might expect from a technology that is often considered a means to reduce technology costs in an organisation, rather than increase them. It can be hard for individual businesses to manage, as costs are often extremely difficult to track (and things like undetected architecture mistakes often account for a significant increase in cloud costs). So, it's not uncommon to see businesses left with higher than predicted costs to get the same value from the technology. But where do these costs come from? Well, for one, unexpected costs could come from misjudging the organisation's IT infrastructure requirements. The business could over-estimate the cloud resources it needs, and some of that stays idle – contributing to the cost but not the value. Conversely, if the business' resources are handling a higher capacity than it expected, it will see an increase in the cost of managing the excess. Then there are hidden costs that drain the company of resources, often without people realising what's happening. These include: Data transfer fees. Businesses often don't account for the cost of transferring data. (Amazon Web Services (AWS) and other major cloud providers have fees associated with data transfer, either between AWS and the internet or between multiple cloud services.) These transfer costs can add up quickly. Fluctuations in traffic demands.
Another thing to consider is if the business requires an infrastructure that can cope with enormous spikes in outbound traffic from time to time. Having a system that can manage these demands will dramatically increase the business' costs. The need to adjust infrastructure needs. Businesses also often try to adjust their infrastructure needs according to the executable code, which usually incurs higher costs. Dealing with undiscovered flaws in the code. These flaws can lead to additional unexpected costs. For example, when inefficient calculations are used, the organisation can end up using more intensive algorithms than it really needs, which costs the business more money. Businesses should focus on improving the application code itself, which will lead to more predictable costs and cost-efficient applications.

How can businesses reduce the cost of cloud?

Visibility and optimisation are key factors in monitoring cloud resource usage and reducing costs. Billing information needs to be transparent. An invoice isn't enough – costs should be categorised by service, so the organisation can identify exactly where the cost is going. That spending can then be tied back to a team or project, which gives greater visibility into organisational structure and means it can track what resources are being used by various teams. Detailed, service-based cost information means organisations can evaluate what they are spending where, identify where resources should be reduced or increased, and optimise the service (and associated costs). Businesses also must ensure that each cloud service is right-sized for the applications running on it. One advantage of working with the major cloud service providers is that they offer various billing alarms, which businesses can configure to warn them of unexpected usage or unpredicted cost spikes. Knowing what services and resources the organisation needs and adjusting them to the needs of the business is really important. When it comes to data storage, organisations must ensure that data is stored using the best solution for their specific needs (for example, if the business needs to access data regularly, it will want to ensure the storage system it chooses lets it do that efficiently, otherwise it risks wasting additional resources on gaining access). Amazon Web Services (and other providers) let their clients configure different lifecycle rules for their stored data, which means organisations can lower storage costs based on how often people need access to the data (and dependent on its size). Organisations should also look at implementing suitable scaling solutions that let the business adapt its infrastructure to spikes in internet traffic without increasing costs. For example, constant retrieval of stored files is subject to data transfer fees, which could drastically increase cloud costs. A great optimisation solution would be to use a CDN (Content Delivery Network), caching static data – this could significantly reduce the costs of transferring data. While cloud technology can save organisations significant time, money and resources, it is important not to underestimate the need to find the right services for the business, and to implement and manage the technology well. To see the full benefit of cloud – and get the most value from it – the most important thing for organisations to do is to have a constant process for monitoring cloud performance – cloud implementation isn't a one-off project.
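As one hedged illustration of that ongoing monitoring, the sketch below uses the AWS SDK for Python (boto3) to create a billing alarm of the kind mentioned above. The alarm name, spending threshold and SNS topic ARN are placeholder assumptions; AWS publishes its estimated-charges metrics in the us-east-1 region.

```python
# Hedged sketch: a CloudWatch billing alarm that notifies an SNS topic when the
# estimated monthly charge passes a threshold. Name, threshold and topic ARN
# are placeholder assumptions, not recommendations.
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")  # billing metrics live here

cloudwatch.put_metric_alarm(
    AlarmName="monthly-spend-over-500-usd",                      # assumed name
    Namespace="AWS/Billing",
    MetricName="EstimatedCharges",
    Dimensions=[{"Name": "Currency", "Value": "USD"}],
    Statistic="Maximum",
    Period=21600,                                                 # check every 6 hours
    EvaluationPeriods=1,
    Threshold=500.0,                                              # assumed budget ceiling
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],  # placeholder ARN
)
```

Equivalent alerting exists on the other major providers; the wider point is that an alert is only useful if someone reviews it and adjusts the underlying services as part of that constant monitoring process.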
Cloud services need regular management, review, and adjustment across every part of the business. It is by monitoring how the business uses cloud, and making ongoing changes as needed, that organisations can avoid unexpected cloud costs. If your organisation has not conducted a review of its cloud services in a while, it is a good idea to start reviewing the situation – and making potential savings – now.

### The End of Backup, As We Know It

Organisations must always be prepared for a disaster scenario. Whether it's a cyber-attack, a natural disaster or a system failure, companies of all sizes need to formulate disaster recovery plans that will allow them to continue to operate when something goes wrong. The traditional approach to disaster recovery, particularly for large enterprises, has been to build and maintain a separate datacenter at a safe distance from the primary datacenter site, connected by expensive high-speed WAN links between the two locations. The doubled cost of maintaining a fully operable secondary datacenter just for DR was never really viable for companies with smaller IT budgets. Another, more economical choice is employing a backup solution to ensure data is not lost. The backup solutions employed by most enterprises involve shipping incremental daily backups to off-site storage. This approach leads to recovery times and recovery point objectives measured in hours or days. But backup solutions, as we know them, are built on an archaic principle of scheduled backup windows and lengthy, productivity-halting restore sessions. And waiting hours or even days for availability is simply unacceptable by modern enterprise standards. The problem with the backup and restore approach is that it is error-prone, slow, and tends to fail when needed the most. The most common comment from companies that have been hit by a major data loss incident, such as a ransomware attack, is that they realised only in retrospect that their backups were not as good as they thought. This caught them off-guard with a recovery process that was way more complicated and time-consuming than they had imagined. The takeaway from this is that in a world of rampant cyber-crime, ransomware, and state-sponsored attacks, legacy backup solutions are woefully inadequate at keeping data secure. Organisations need to shift their emphasis from backup and restore to intelligent storage solutions that are self-defending. A new security strategy is needed, one that is based on three major innovations in data storage: continuous data protection, instant disaster recovery, and proactive disaster prevention.

Continuous data protection

Periodic backups, based on backup windows, are being phased out in favour of modern storage solutions that provide continuous data protection in the cloud. Continuous data protection is a technique that continuously captures changes to data in real-time and streams these changes to a cloud-based, immutable repository. This means a file in use can be changed and that change is immediately captured and stored. This is in contrast to traditional backup techniques, which typically only capture data once or twice per day. The benefits of continuous data protection are that it allows for recovery from data loss much more quickly and easily, and it allows recovery to any point in time. For example, if an organisation's data was damaged by ransomware, it can simply revert to a very recent previous version, minimising the amount of lost work.
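As a very rough sketch of that change-capture idea (not any vendor's implementation), the example below uses the third-party watchdog library to watch a directory and ship every modified file to a versioned store as soon as it changes. `upload_version` is a hypothetical stand-in for whatever write-once, immutable API the platform actually provides.

```python
# Rough continuous-data-protection sketch: capture file changes as they happen
# and stream each change to a versioned, immutable repository.
# `upload_version` is a hypothetical placeholder, not a real vendor API.
import time
from pathlib import Path

from watchdog.events import FileSystemEventHandler
from watchdog.observers import Observer


def upload_version(path: Path) -> None:
    # Placeholder: a real system would stream the file to an immutable,
    # point-in-time-versioned cloud repository here.
    print(f"captured new version of {path} at {time.time():.0f}")


class ChangeCapture(FileSystemEventHandler):
    def on_modified(self, event):
        if not event.is_directory:
            upload_version(Path(event.src_path))


if __name__ == "__main__":
    observer = Observer()
    observer.schedule(ChangeCapture(), path="/data/protected", recursive=True)
    observer.start()
    try:
        while True:
            time.sleep(1)   # keep the watcher running
    except KeyboardInterrupt:
        observer.stop()
    observer.join()
```

Because every change is captured the moment it happens, the restore point can be 'seconds ago' rather than 'last night's backup window', which is what makes point-in-time rollback after a ransomware hit practical.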
Instant disaster recovery

Instant disaster recovery solutions replace lengthy restore sessions with the ability to immediately roll back any quantity of affected data, at file-level granularity, to any previous point in time. This removes the need to take systems offline for lengthy periods of time in order to restore them. How can we instantly recover terabytes or petabytes of data from the cloud? The trick is to perform the recovery process in the background, while users already have full access to the data. Whenever a user accesses an already-recovered portion of the data, it is immediately accessible, and whenever a user accesses an unrecovered portion, the data is recovered from the cloud in real-time. This process can be completely transparent to users, who will not notice any downtime.

Proactive disaster prevention

The best way to protect data is to prevent disasters from happening in the first place. Next-generation storage solutions are beginning to use artificial intelligence and data-based insights, combined with enterprise-grade antivirus and threat protection, to monitor data and identify potential threats. This means that data is protected before a disaster strikes, it is easy to identify the scope of the lost or damaged data, and the recovery process becomes quick and easy. Additionally, adopting a zero-trust architecture stance, where every user, device, or endpoint that attempts to connect to the network must be authenticated before gaining access, is a beneficial layer to add to the proactive disaster prevention toolkit. Over the past two years, traditional, office-based work models have ceased to be the norm, and that has led to a workforce that is distributed across home offices and remote offices. At the same time, we see sophisticated malicious actors taking advantage of weakly secured edge locations. The new distributed topology of work makes traditional solutions that rely on location to determine security posture mostly irrelevant. That is why zero-trust architectures, based on the notion of "never trust, always verify", are now absolutely essential. Multiple layers of security should be implemented to prevent not only intrusions but also lateral movement of attackers within the corporate perimeter.

A new era of intelligent data protection

It is the end of backup as we know it, and the beginning of a new era of intelligent data protection. As an industry, we should rise to the challenge, and deliver storage solutions that meet the needs of the modern enterprise. Storage solutions that proactively protect their precious contents from ransomware and keep their businesses running in the face of disaster. That is the future of data storage, and that is what enterprises need and deserve.

### What 'flavour' of cloud is right for your organisation?

Cloud technology has never been more important in the business world. The pandemic had a significant effect on the adoption of cloud services and now it is predicted that the budget for cloud computing will make up 51% of IT spending by 2025. In fact, Gartner estimates that by 2025 "over 95% of new digital workloads will be deployed on cloud-native platforms, up from 30% in 2021". It is clear businesses should embrace cloud technology as a vital part of their digital transformation strategies in the future. But with such a wide range of technologies available that offer increased security, scalability, flexibility, and value, how do IT leaders know which 'flavour' of cloud is right for their organisation?
No one-size-fits-all approach

Businesses have all sorts of different requirements when looking to upgrade services to cloud. This can be based on the size of their company, the technology already in place and the reasons they are looking to migrate to cloud in the first place. For example, newer private sector companies who are looking to integrate cloud-native services into the business have found a full migration to cloud is best, whereas more traditional organisations across the public and private sectors may encounter problems with legacy technology which is not as easy to migrate wholesale. In this case, a 'hybrid cloud' solution might be more appropriate. This all needs to be assessed before making an informed decision. In many circumstances – perhaps all, and despite what some providers might tell you – there is no 'off the shelf' solution that will work for all organisations. Making good decisions around cloud technology can seem complicated, especially without the right explanations or guidance. But it doesn't have to be.

'Flavours' of cloud

Public cloud

A public cloud solution is hosted by a third-party provider. This changes the traditional on-premise IT architecture to a virtual, scalable alternative which is provided over the public internet to multiple customers. This reduces the amount of server management the customer will need to do, allows for stronger security measures and saves costs, particularly compared to on-premise infrastructure. But whilst this reduces IT operations for the customer, it's wise to look out for contractual 'lock-ins', which can limit flexibility, and any compliance concerns over regulations that the business might need to adhere to.

Private cloud

For a security-conscious company, this option is by far the most secure. It isn't shared with other customers and is only used by the one business, instead of multiple organisations as with public cloud. Private cloud services can be hosted internally by the business itself or hosted by a third-party provider off-premise. Customers can benefit from improved performance, greater scalability and better control over the service. Although this may come with increased costs to deploy the technology, the longer-term Return-on-Investment (ROI) makes it desirable.

Hybrid cloud

This option is exactly what it says on the tin – a combination of cloud and on-premise technology. Much like the hybrid car, this creates a solution that is powerful, efficient and offers greater functionality. It's the easiest way to get up and running in a cloud environment – which makes it an attractive option for larger, more traditional organisations using legacy tech. This allows businesses to benefit from the scalability of public cloud, as well as the security of private cloud. The connection between these services and, in some cases, legacy infrastructure must be seamless. The stumbling block for this 'flavour' of cloud is managing multiple environments with different security risks and integrations, which can increase complexity. As soon as a business uses its first cloud product, it is effectively employing a hybrid cloud environment, which tends to be the most popular choice if a full migration isn't possible.

Multi cloud

With multi cloud solutions, businesses can use various public clouds. This would be for several purposes, but they would need to be complementary to each other, working in harmony.
The main driver for this type of cloud solution is business continuity and meeting regulatory demands. Large enterprises – which cannot risk any downtime – opt for a multi cloud environment to guard against the unlikely event of their primary public cloud provider experiencing issues.

As-a-service solutions

On top of this, there are even more options for businesses to consider in their search.

Software-as-a-service

SaaS applications are hosted on cloud servers and are accessed over the internet. The vendor has done all the hard work; effectively, all the business must do is remember its login details. This is because SaaS solutions are accessed through everyday web browsers, making them accessible from any device and incurring little to no operational management.

Platform-as-a-service

With PaaS, businesses can build their own apps by using tools, infrastructure, and operating software from suppliers. This builds on the above. For a business looking to take the next step on its cloud journey, this is a cost-effective way to initially develop, run, and host cloud-based applications.

Infrastructure-as-a-service

IaaS consists of renting server space from a cloud provider and building bespoke applications. Rather than having to invest in creating your own private data centre, businesses can utilise IaaS partners' compute, storage, and networking services on demand. This is usually a pay-as-you-go model, which makes it more affordable than designing your own.

Function-as-a-service

This can also be referred to as 'serverless computing'. It means cloud applications only run the elements needed per function. This strategy allows IT leaders to focus on what's most important – delivering value to their customers. By using the cloud service provider to scale and manage the infrastructure needed to run the application, organisations can roll out products faster and more efficiently.

The benefits outweigh the risks

With so many options, it can initially seem overwhelming and sometimes expensive to take the leap to cloud. But guidance and education about different cloud options can help IT leaders of any scale – from a small independent company to an international enterprise – make informed decisions and architect the ideal solution to suit their needs.

### Want your finance team to drive innovation? Here's how to treat them right

Finance is the unsung hero of the business world. For those who work in other departments, it can seem like oxygen - essential to life, but not exactly noticeable. The wheels keep turning, the invoices keep getting paid, and the annual audit just sort of… happens. When everything's working, why stop to wonder about the processes and people who make it happen? But finance professionals know the reality is often quite different. Like the proverbial swan, all calmness above the surface and mad paddling beneath, many finance teams are delivering results - but at a significant cost.

The state of play today

FloQast's recent research, 'The State of Play in UK Accounting', found that increasing requirements around tightening deadlines (36%), existing applications and technology (35%), and complex compliance and regulatory tasks (32%) are acting as major obstacles to operational excellence. And as a result, almost nine in ten (87%) say their team has lost staff due to burnout. It shouldn't be this way. Too often, finance teams are battling with their tools, processes, and deadlines, rushing to get manual tasks completed for the umpteenth time, and struggling with ongoing talent shortages.
Small wonder that the regular crunch of month-end is causing people to leave their jobs - and even the profession: of those accountants who'd seen colleagues leave due to burnout, nearly half (46%) cited the stress of month-end close as the specific cause of their departure. The root of these challenges is often a fundamental misunderstanding of the role finance teams can play in their organisations. Rather than just acting as glorified calculators, totting up the figures and popping out spreadsheets for the monthly budget meeting, finance teams can provide true innovation and strategic value to their organisations. A clear-eyed view of the numbers and, crucially, a deep understanding of what they mean can make the difference between a successful strategy and a costly flop. Businesses need to look at their finance teams as one of the main drivers of innovation, not just housekeepers and bean-counters for the departments doing the real strategic work. Here are three key areas to consider to make that a reality.

Lean on technology to free employees from grunt work

The fact that we can even have this conversation about finance playing an innovative, strategic role comes down to one thing: digital technology. The essential work of keeping track of income and outgoings against budgets can now in large part be handled by automated systems paired with well-trained staff. No software can operate independently of the finance team, but a good, ergonomic system can lift the burden of repetitive grunt work off their shoulders. For example, when it comes time to reconcile the accounts, imagine if, rather than having to manually sift through mountains of receipts and transactions, taking up hours or even days of valuable time, the finance department could simply set their automated system off on the job and then deal with any issues it flags up. With that kind of tech-enabled process, accuracy and visibility are maintained (no worrying about what the auditor will dig up in the algorithm's numbers) but the heavy lifting becomes a lot less heavy. It's clear from our research that improved technology is high on the wish list for many finance professionals. For example, accountants who are mostly working from home are the most stressed (75%) during month-end, suggesting some teams are still running antiquated systems that get in the way of operational excellence and, simply put, drive staff to distraction. Overall, if done properly, automating regular workflows and making manual check-ins more efficient, as well as consolidating collaboration tools in one place, can totally transform what it looks like to work in finance. When technology has been designed with accountants in mind by teams who understand the real pressures they face day by day, it can free up significant amounts of time for more strategic work.

Embrace ideas, create systems, achieve practical outcomes

All that said, how should businesses look to redeploy the time their finance teams gain back by using advanced software? In short, by involving them in the discussions about key challenges and opportunities facing the organisation, and empowering them to think outside the box to help address those challenges and opportunities. That might mean addressing localised issues within their own team - for example, figuring out better ways to create and deliver dashboards or financial KPI tracking to senior leadership.
That could be a technological project, involving partnership with external providers, or it could simply be an internal quality improvement initiative, identifying bottlenecks and duplications and improving workflows to remove them. Either way, when the finance team itself functions more efficiently, the organisation as a whole benefits. Equally, it could mean assigning major, organisation-wide needs to the finance department to see what they can do. The most common demand from the C-suite along these lines is, bluntly put, to predict the future - to read the numbers as quickly as possible and accurately explain what they reveal about the health and trajectory of the business. Achieving this kind of intelligent, strategic insight is no small task, but it is possible. And when finance teams are no longer burdened by routine tasks, they are more likely to have the capacity to work with teams from across the business to integrate a wide variety of data into financial figures - and develop models that give greater insight into the future implications of current results.

More advanced roles mean greater satisfaction

When finance teams are freed up and given the opportunity to get involved in more strategic, innovative work, our research shows that their overall job satisfaction tends to go up. The change is already underway: 99% say their role has evolved during the pandemic, with the top changes being a greater involvement in technology decisions (46%) and a growing demand from the executive team to provide strategic business insights (44%). The impact has been positive: three quarters (75%) would like to be responsible for at least the same, or more, strategic tasks in the coming two to three years. Given the impact that stress and dissatisfaction are having on employee retention, these figures should be taken seriously. An innovative, strategic finance team is a happy finance team - and that benefits everyone, right across the business.

### Why Cybersecurity Asset Monitoring in the Cloud is So Hot Right Now

Cloud-native infrastructure holds the promise of greater ease of deployment and faster time to value. It has elasticity built in and leverages cloud speed. Moreover, cloud-native infrastructure has the potential to be more budget-friendly – allowing enterprises to scale their financial investments to the resources they require by using a software-as-a-service (SaaS) model. At the same time, enterprises migrating their workloads and infrastructure to cloud-native security architectures are faced with a wide variety of challenges, not the least of which is handling Big Data – incredibly large volumes of logs coming from new data sources such as Virtual Machines (VMs), SaaS applications, containers, and other cloud-native resources. Moreover, cloud migration involves implementing a completely new operational model for infrastructure in the cloud. New asset types need to be tracked and managed, and their associated vulnerabilities need to be addressed.

The Challenge: Higher Costs & Data Overload

Cloud-native Security Analytics solutions like Microsoft Sentinel provide the ability to ingest vast amounts of data at cloud speed and scale, supported by numerous data connectors ensuring data velocity, variety, veracity and value. A cloud-native Security Information & Event Management (SIEM) platform simplifies the ability to integrate cloud security data from multiple cloud environments – enhancing Security Analytics capabilities.
The exponentially greater quantities of data can also lead to higher costs – as cost is a direct reflection of how much data has been ingested. Moreover, “data overload” can result. Without the right strategy in place, cloud migration can be likened, metaphorically, to a miner standing on a goldmine without a shovel. “Data overload” raises the risk of security analysts losing sight of the key insights. These challenges can be addressed by implementing strategies that optimize the organization’s threat monitoring capabilities while simultaneously keeping costs low. Let’s look at four optimization strategies that are key for any enterprise undergoing cloud migration:

Reducing the Cost of Log Ingestion & Storage
The difficulty of managing Big Data on the one hand, while keeping costs down on the other, can be approached by means of a well-defined data collection and pre-processing strategy. Advanced data collection solutions can be used to optimize data collection through data tagging, filtering, custom parsing, indexing, aggregation, and targeted routing to the data storage & analytics workspace. To enable this, a robust data collection layer is needed, with the ability to support various data collection methods such as syslog, database event pull, API, texts and more. The recommended approach is to:
- First, securely collect and stream all available data, leveraging the advanced data collection infrastructure.
- Then, store the data in optimized storage options such as Microsoft Azure Data Explorer (ADX).
- Finally, route the higher-value data into Microsoft Sentinel for real-time threat analytics.
In this way, you do not compromise on the ability to collect & store data, and you can leverage this information to perform custom visualizations, reporting and hunting – in addition to custom querying. High-value data is used for real-time correlation and analytics, so you maintain the ability to glean the right security insights from data with custom visualizations and reports while getting real-time threat insights. At the same time, you reduce the costs of log ingestion and storage (a simplified sketch of this routing logic appears below).

Monitoring Across Multiple Clouds & Locations
Many enterprises today leverage a multi-cloud environment and manage their cloud resources & workloads in specific locations – and there are good reasons for doing so. For example, a typical enterprise-level organization might leverage the best capabilities of different environments:
- Providing a custom application development & hosting environment for developers in AWS
- Utilizing the Azure environment for cloud computing, workplace collaboration, and productivity
- Leveraging Google for data warehousing and cloud analytics
The data and resources associated with each of these environments need to be monitored while they continue to reside in their respective environments. The security situation is further complicated by the fact that the data for each of these clouds may be stored in multiple locations. This may be done to meet various business challenges, including processing data locally and filtering out sensitive data – as well as to align with regulatory requirements. As a result, an enterprise will typically have multiple instances of clouds – multiple subscriptions for Google or AWS, or multiple tenants for Azure. Each location is likely to have its own management policies.
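To make the collect–store–route strategy above concrete, here is a simplified, purely illustrative Python sketch: every event is archived to low-cost storage, while only tagged high-value events are forwarded to the SIEM. The field names, routing rules and the send_to_archive / send_to_siem stand-ins are hypothetical, not part of any specific product.

```python
# Purely illustrative tag-filter-route pre-processing sketch.
# Everything is archived to cheap storage (think ADX / data lake);
# only high-value events are forwarded to the SIEM for real-time analytics.
HIGH_VALUE_SOURCES = {"firewall", "identity_provider", "endpoint_detection"}
HIGH_VALUE_SEVERITIES = {"high", "critical"}

def tag_event(event: dict) -> dict:
    """Attach a simple routing tag based on source and severity."""
    event["high_value"] = (
        event.get("source") in HIGH_VALUE_SOURCES
        or event.get("severity", "").lower() in HIGH_VALUE_SEVERITIES
    )
    return event

def send_to_archive(event: dict) -> None:
    print("archive:", event["id"])   # stand-in for a data-lake / ADX write

def send_to_siem(event: dict) -> None:
    print("SIEM:", event["id"])      # stand-in for a SIEM ingestion call

def route(events: list) -> None:
    for event in map(tag_event, events):
        send_to_archive(event)       # all data lands in low-cost storage
        if event["high_value"]:
            send_to_siem(event)      # only high-value data hits the SIEM

route([
    {"id": 1, "source": "web_proxy", "severity": "low"},
    {"id": 2, "source": "firewall", "severity": "high"},
])
```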
The question therefore is how to maintain effective monitoring across multiple clouds – each of which uses different languages, schemas, and standards – and bring it all together successfully into a single view. A cloud-native SIEM provides the solution, supporting integration across all cloud providers – offering a centralized, consolidated view of threats by monitoring data wherever it may be. Moreover, a centralized data lake such as ADX, which is designed to handle Big Data, facilitates cost-effective management across multiple clouds and locations. Today, advanced Managed Detection & Response (MDR) providers like CyberProof, a UST company, offer this kind of multi-cloud, overlapping security monitoring capability.

Implementing Zero Trust
Zero Trust refers to the continuous validation of access at all steps of a digital interaction. It means providing each user with the minimal required access or capabilities, while offering access from any location and providing endpoint detection, identity protection, and vulnerability management per device. Because Zero Trust is complex, it involves more sophisticated asset management, which requires that an enterprise maintain full visibility of its assets. Visibility is essential to establishing the right level of security control and to monitoring all assets through threat detection & response, threat intelligence, and vulnerability management.

But how do you maintain this level of control with the exponentially greater quantities of information associated with a cloud-native environment? The answer involves quantifying risk based on an in-depth understanding of exactly what you’re looking for. If cybersecurity in a cloud-native environment must operate from a risk-based approach, then the first step in evaluating risk is identifying which threats and threat actors pose the greatest risks to a particular enterprise. Each organization must map out its assets, then evaluate (1) what types of threats these assets are likely to face, and (2) which threat actors are most likely to attack. To be effective, this assessment must be based on high-level Cyber Threat Intelligence (CTI) input that relates to a wide variety of parameters specific to the enterprise, including:
- Industry
- Region
- Data type
- Who has received access

Prioritizing Risk with the MITRE Framework
Once you have evaluated the types of threats and threat actors that an enterprise is likely to face, you can prioritize threat detection & response capabilities by leveraging a framework such as MITRE’s Adversarial Tactics, Techniques, and Common Knowledge (ATT&CK) framework. The ATT&CK framework creates a categorized list of all known attack methods and marries each attack method with:
- The threat intelligence groups that are known to utilize it
- The unique methods used in implementing the attack
- The mitigations and detection methods for identifying the attacker technique

The beauty of the ATT&CK framework is that it provides direction for security teams making decisions about how to develop their security operations center (SOC) strategy. More specifically, it helps ensure that use case development is continuously geared toward meeting the most likely threats to the business. At CyberProof, for example, we optimize use case development using our Use Case Factory, which provides agile development of threat detection & response content and continuous improvement, to align with changing cyber trends and evolving threats.

Migrating to the Cloud?
First – Address the Risks  While cloud-native infrastructure offers so many benefits for enterprises over on-prem. IT environments, it must be managed correctly to keep down data ingestion costs and mitigate cybersecurity risks. The drastic changes involved in migrating an enterprise organization to the cloud impacts all aspects of security operations and creates new risks and challenges for the team, which explains why cybersecurity asset management is currently such a hot topic. Working with an advanced MDR provider gives you access to the expertise you need to explore, develop and implement strategies such as: reducing the cost of log ingestion & storage, monitoring multiple clouds and geolocations, implementing Zero Trust, and mitigating risk through use of a framework like the MITRE ATT&CK. If you're interested in speaking with Jaimon Thomas or another member of the team about optimizing cloud migration to reduce risk, contact CyberProof-a UST company. ### Emma Sue Prince - 7 Skills for the Future -Author Glassboard ### 6 Cloud Monitoring Use Cases and Why You Should Care https://vimeo.com/714023696/ed8718275b According to recent research from Flexera, 92% of enterprises have a multi-cloud strategy. However, according to Cisco, network teams are struggling to keep up with the pace of cloud change, with 73% of networking teams spending more of their time maintaining the status quo rather than focusing on multi-cloud deployments. Today, we’ll be looking at how you can get visibility into multi-cloud deployments and overcome some of the key challenges by reviewing six use cases, which include cloud network and application visibility, hybrid IT, security incident response, cost consumption, cloud migration and application visibility control.  First, let’s start with some of the challenges of public cloud. Network engineers have traditionally found it difficult to visualize how traffic traverses cloud networks. It doesn’t easily map to the typical mental model. There’s a lack of end-to-end visibility in a single pane of glass to understand application traffic from on-premises to the cloud and vice versa. For example, network engineers need to be able to understand how cloud entity instances – like EC2 instances or other virtual resources – communicate with one another in the same subnet. And there are other challenges with Inter-Availability Zone traffic to understand how the back-end application communicates with cloud database services in another AZ.  Furthermore, on-prem to cloud visibility is important as traffic traverses Virtual Private Gateways. There are challenges with visualizing how internet traffic goes in and out of a VPC or VNET and how to verify that everything remains secure. Along those lines around security, there are difficulties in validating security configurations (e.g., ACLs) with real-time accepted/rejected traffic in a format that’s simple and easy to explore and explain. Let’s look at some use cases in more detail. Cloud Network and Application Visibility – Engineer’s struggle to visualize how cloud traffic is flowing and behaving because existing products and tools do not match mental models. Users should consider a solution that visualizes this data, giving teams the mental model of network users and clearly showing applications, paths, and the interfaces they traverse. 
This gives network admins more visibility into which applications are primarily going through cloud deployments, as well as the bandwidth and utilization of applications as they traverse different segmentation tunnels across regions.

Hybrid IT – It’s hard to provide an end-to-end path for an application that goes from an on-prem network to the cloud and vice versa. Users should consider a solution that offers end-to-end path analysis of applications from on-prem to cloud in a single-screen workflow, but that can also triage issues and focus troubleshooting efforts on-prem, in the cloud, or in between. Monitoring solutions that provide hop-by-hop analysis of end-to-end application paths are important, along with analysis of KPIs such as application and network latency, utilization, packet loss, QoS configuration, and VoIP performance. Network monitoring solutions should also support day-2 operations such as monitoring and troubleshooting network and application behavior.

Security Incident Response – It’s often not clear which traffic is getting accepted or rejected due to a mismatch of the mental model. Users should consider a solution that can identify the origin and destination of traffic into and among VPCs and visually determine whether certain traffic is accepted or rejected (a short flow-log sketch follows below). This allows network operations managers to analyze their network security definitions, separating required from rogue traffic, and to get reports and alerts on unwanted applications that might be traversing the network. Once an intrusion is found, network admins need a recording of the activity – the network packets – to determine both the fingerprint and the extent of the breach. Monitoring solutions that provide the extra capability to capture and store every packet are helpful. Armed with the packet data, admins can respond quickly and confidently.

Cost Consumption – The inability to slice and dice network traffic in n dimensions for deep analysis costs time and money. Users should consider a solution that measures bandwidth utilization of applications, services, and internet gateways, and can compute baselines and trends. This allows network operations to manage the types of cloud-hosted services (email servers, conference solutions, etc.) that are using most of the bandwidth to the cloud. These can then be compared with the least-used services, and the service mix redesigned based on utilization. This enables IT teams to measure the performance and utilization baselines of cloud applications and services against trends over time to facilitate application and services capacity planning and optimization.

Cloud Migration – A lack of understanding of pre- and post-migration KPIs, historical baselines, and the ability to measure them precisely impacts cloud migration. Users should consider a solution that has application and services visibility, that measures bandwidth usage and application performance baselines pre-deployment, and that validates bandwidth and performance post-deployment. Cloud migrations vary greatly depending on a company’s needs and objectives. Whether migrating limited portions of your enterprise systems (such as a few specific databases or servers) or an entire application stack or datacenter, NetOps teams need a clear understanding of pre- and post-migration KPIs. Network monitoring solutions that provide the ability to accurately measure historic baselines and changes over time are key to understanding this.
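As a concrete illustration of the accepted/rejected analysis described under Security Incident Response, here is a minimal Python sketch that summarizes AWS VPC Flow Log records by action and source address. It assumes the default flow-log record format, and the sample lines are invented for illustration; in practice these records would be read from CloudWatch Logs or S3.

```python
# Minimal sketch: summarize VPC Flow Log lines into ACCEPT/REJECT counts per
# source address. Assumes the default flow-log format; sample lines are made up.
from collections import Counter

sample_log_lines = [
    "2 123456789012 eni-0a1b2c3d 10.0.1.15 10.0.2.20 49152 443 6 10 8400 1650000000 1650000060 ACCEPT OK",
    "2 123456789012 eni-0a1b2c3d 198.51.100.7 10.0.1.15 53211 22 6 1 60 1650000000 1650000060 REJECT OK",
    "2 123456789012 eni-0a1b2c3d 198.51.100.7 10.0.1.15 53212 3389 6 1 60 1650000000 1650000060 REJECT OK",
]

def summarize(lines):
    """Count accept/reject decisions per source address."""
    counts = Counter()
    for line in lines:
        fields = line.split()
        srcaddr, action = fields[3], fields[12]   # positions in the default format
        counts[(srcaddr, action)] += 1
    return counts

for (src, action), n in sorted(summarize(sample_log_lines).items()):
    print(f"{src:15} {action:7} {n}")
```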
Application Visibility Control – Migrating applications from the core network into the cloud is often challenging. NetOps teams need a deep understanding of the application’s current state before designing or redesigning the network, and need to confirm that everything will work as intended post-migration. Network monitoring solutions that provide application and service visibility allow teams to measure bandwidth usage pre-deployment (while the application or service is still on-premises) and leverage topology visualizations to identify application paths to help plan the migration effectively.

There are lots of challenges NetOps teams encounter when monitoring the public cloud. Hopefully these 6 use cases help shed some light on how to overcome some of them. As the industry moves forward, it’s important that solutions consume more data at a more granular level. It’s also important to allow teams to look at deployments globally, then drill down to a location, a single hop, a packet, or even a phone number. End-to-end visibility of application and network performance from on-prem to the cloud is critical for efficient and accurate network monitoring.

### Alison Reynolds - The Qi Effect - Author Glassboard

### Make your technology assets work harder - how can businesses get the most out of their cloud investments?

https://vimeo.com/714050237/f1319fce3b

Many businesses accelerated their digital transformation and transition to the cloud at the start of the coronavirus crisis to gain efficiency, enable remote working and reduce costs. The increased regulatory scrutiny around data and user privacy has also caused business leaders to make sure they really understand their risk profile in terms of data sovereignty, control and regulatory requirements across different markets. As businesses emerge from the pandemic, CIOs and CTOs are increasingly looking to understand how to transition to new ways of working in a way that supports existing regulatory requirements and the long-term objectives of their business. Now that things are starting to get back to normal, businesses should take a step back and take a closer look at the mix of on-premise, cloud and hybrid cloud solutions that they are using to ensure they are getting the most out of their pandemic investments. Here are a few tips on how they can achieve this.

Tip 1: Choose the communications architecture that fits YOUR needs. The cloud can streamline and strengthen your communications and collaboration infrastructure. By combining disjointed communications tools into an all-in-one, cloud-based solution that offers collaboration and mobility, businesses can become more agile and scalable while also controlling costs – key priorities for business leaders. Cloud technology also helps with operational efficiency, as organisations can outsource functions which are outside their domain. Nevertheless, there is no one-size-fits-all approach to cloud communications, and only by offering flexibility can organisations best achieve their business goals. This means offering the option to choose their own path, whether that means using public cloud, private cloud or a combination of both. By focusing their investments on what fits their business model best and scaling their existing solutions, organisations will be able to maximise their technology investments, while ensuring they have the right infrastructure in place to meet changing market needs.
Tip 2: Ensure your cloud infrastructure can support your long-term hybrid working strategy. To cope with the pandemic many businesses were forced to make unbudgeted investments to support a vastly remote workforce. With hybrid working becoming increasingly prevalent across organisations, businesses have to identify the challenges within their company in order to make sure that their teams have long term solutions that allow them to work efficiently and effectively no matter where they are. According to recent research by Forrester, 70% of UK organisations now support remote employees, compared to 31% before the pandemic, while 86% plan to permanently adopt a remote working policy or have already done so. With remote working already in place in many organisations, it is now the right moment to take a step back, re-evaluate what can be scaled up or replaced and assess if your organisation’s technology assets can effectively support the business objectives. Understanding the infrastructure and applications that underpin hybrid working better will help technology leaders identify gaps where they need to focus their investments but also to better understand what they might need in the future. Tip 3: Ensure your investment priorities are closely aligned to your business objectives. The next step is to ensure the technology strategy you have in place can effectively support your organisation’s business objectives. If you are planning to increase your workforce by recruiting more remote workers, for example, are your existing collaboration capabilities scalable enough to support a larger workforce? As companies move to hybrid setups, they’ll need to ensure they focus their technology investments in the appropriate areas. For instance, when it comes to hybrid working, there are three key areas for innovation that need to be considered: business continuity, customer experience, and internal communications and collaboration. While these are all interconnected, ensuring the business has the right technological capabilities in each of these areas will be essential for transitioning successfully to a hybrid working model and maximising ROI from existing investments. Businesses also need to make sure they are offering flexible tools that do not compromise on data security or compliance. This further highlights the need for flexible technology solutions that cater to different business needs. For instance, a distributed cloud model can support the next stage of customer lifecycle management by addressing all hybrid deployment types. While, an on-premise solution could be preferred for critical business applications that contain highly sensitive information.  Tip 4: Don’t be afraid to ask for help. Another step to maximise cloud investment is to understand that businesses don’t need to do it all on their own. Organisations are looking for simplification and integration within a single platform and this can be offered by partnerships that bring expertise and integrated technology capabilities to the table. A technology vendor can also assist with the migration to the right cloud path by understanding the intricacies of a particular usage model and can help businesses do more with the solutions they have and migrate to new ones at a pace that’s right for their business.  Investments into the cloud made during the pandemic have a lot of potential and present opportunities for businesses to improve their agility and enable more productive ways of working. 
But if the communications platform put in place isn’t providing the functionality needed to grow and meet new challenges, now is the right time to pause and re-evaluate what is in place and needs to be optimised. With the right help, businesses can figure out their key investment priorities to scale and support their long-term objectives. Ultimately making the most out of their existing investments will increase ROI and create more efficiencies across the business.  ### Identity Modernisation: The backbone of hybrid cloud environments https://vimeo.com/714038255/7ec166db7c The work environment is constantly changing, and the work-from-home era is simply the next step in that process. Even before the COVID-19 pandemic hit, we were starting to see the shift to remote work– as a result, many organisations moved towards hybrid cloud environments in order to ensure that both on-prem and cloud environments were supported.   While it might seem obvious that we were always going to head in this direction, it doesn’t mean organisations have not hit problems. Research by Radiant Logic shows that only 4% of tech executives have completed a full cloud migration. Many of the pitfalls that organisations have hit when trying to transition to the cloud link back to the problem of failing to modernise identity data systems before venturing on their cloud migration journey.  Identity modernisation is not a quick-and-easy fix, instead it is a stepwise problem that must be executed systematically. These steps include consideration of single-sign on, access management security, data governance, and lastly contextual access – all crucial factors when organisations are trying to migrate to the cloud. However, without accurate and actionable identity data, identity modernisation will never happen, and ultimately hybrid cloud environments are destined to fail.  Organisations need a single source of identity data which can offer efficiency, accuracy and security. With one resource, organisations no longer feel like they have to “swallow the frog” when it comes to managing their hybrid cloud environments.  Why does identity management cause so many headaches when it comes to hybrid cloud environments?  For decades, organisations have stored identity data across dispersed identity sources which all use various protocols, resulting in extensive customisations and a lot of repeated processes when it comes to meeting new business demands. These siloed systems lead to identity sprawl, as well as overlapping, inaccessible, and often conflicting identity data.   If our identities are clashing with each other, then how are IT teams able to figure out if “John Smith” in one directory is the same “John Smith” in another directory? This disjointedness in identity data results in organisations never building accurate and complete user profiles.  Security teams then have the unenviable task of trying to figure out which employees have access to what. It is not surprising that Radiant Logic’s research showed for 52% of tech executives, the manual provisioning and deprovisioning of access caused the greatest amount of stress.  It’s not just IT teams that have to deal with the frustration of poor identity management, but also employees as well. The endless number of usernames and passwords employees have to remember in order to access different applications can be irritating, with 64% of tech executives reporting user frustration with the number of different credentials needed to access different apps.  
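To make the “John Smith” correlation problem above concrete, here is a minimal Python sketch that folds records from two hypothetical identity silos into a single profile using a shared employee ID. The directory contents, attribute names and matching rule are invented for illustration; a real Identity Data Fabric has to handle far messier matching across protocols and formats.

```python
# Illustrative sketch: merge records for the same person from two identity
# silos into one unified profile, keyed on a shared employee_id attribute.
# All directories, attributes and values below are invented for illustration.
from collections import defaultdict

active_directory = [
    {"employee_id": "E1001", "cn": "John Smith", "mail": "jsmith@corp.example", "groups": ["vpn-users"]},
]
hr_system = [
    {"employee_id": "E1001", "full_name": "Smith, John", "department": "Finance", "manager": "E0042"},
]

def unify(*sources):
    """Fold records from every source into one profile per employee_id."""
    profiles = defaultdict(dict)
    for source in sources:
        for record in source:
            profiles[record["employee_id"]].update(record)  # later sources add/override attributes
    return dict(profiles)

for emp_id, profile in unify(active_directory, hr_system).items():
    print(emp_id, "->", profile)
```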
Identity sprawl also poses a huge security vulnerability, as well as restrictions to usability. Threat actors are always looking for targets where they can remain virtually undetected when they breach, and siloed systems give them a perfect opportunity because the attack surface is increased, and gaps are created which can be exploited.  Siloed systems that are forgotten about by security teams can result in past employees not being removed from the organisation’s system. Threat actors can use these forgotten identity credentials to access restricted areas of the network and cause significant damage to an organisation while remaining effectively hidden within the noise of normal everyday activity. In order to gain the full potential out of hybrid cloud environments, organisations must reign in their identity data and gain proper control over their identity access. Without the modernisation of identity then hybrid cloud environments can never meet the needs of the organisation.  If these problems have such a significant impact on organisations, then why aren’t they doing something about it?  Organisations have tried to integrate Identity Access Management (IAM) solutions in order to help rein in their identity data, but they have had little or no effect, with IAM projects being halted or restarted at great cost and time.  Problems with identity sprawl is nothing new to organisations, but the realisation of the problem is. In the early 2000s, organisations already had identity siloes, but as networks were only on-prem, the problem was easy to handle through customisation and home-grown solutions. The relative safety of the network perimeter afforded some level of protection. When organisations started implementing cloud and hybrid cloud environments effectively erasing the network perimeter, the problem grew exponentially. The explosion of remote working and the rise of shadow IT in 2020 further compounded it, and organisations, all too late, soon realised what had happened with their identity data.     Once discovered, organisations struggled to install the correct IAM solution to a problem which should have been sorted years ago. Radiant Logic found that 67% of organisations have a modern access control and governance solution, but a lot of apps and users are left out.   Many identity and access management solutions were not built to unify identity stores, which is crucial in a hybrid cloud environment, when 68% of tech executives see legacy systems as either “very” or “extremely” important. Organisations need an IAM solution which is able to unify and allow communication between both on-premise and cloud technologies.  Why will a single source of identity data help organisations that are in a hybrid cloud environment?    Radiant Logic found that 47% of tech executives would be able to move forward with their digital transformation projects if they had one single on-demand source of identity data. With many digital transformation projects taking the form of cloud migration, a single source of identity data is the way forward. An Identity Data Fabric approach enables unified identity data to be delivered to an application on-demand, including all the attributes of a user irrespective of which identity source it came from.    Identity Data Fabric works by unifying distributed identity data from all sources in order for applications to access data as needed. 
This unification of different sources means applications can access identity data no matter what protocols or formats are being used, and irrespective of whether it is stored on-premise or in the cloud. Beyond just unifying and presenting identity data from disparate sources, effective identity fabric solutions allow the enterprise to present subsets of identity and attributes to systems or administrators. Such fine-grained scoping of identity presentation enables the organisation to maintain the concept of “least privilege” in ways that are simply not possible with siloed and dispersed identity stores. Having one regularly updated resource means that IT teams no longer have the stress of manually provisioning and de-provisioning users’ access, ultimately meaning that user access is correct and the chances of the network being breached are reduced.

Efficiency, accuracy and security are the three components needed to ensure that hybrid cloud environments meet the needs of users. An Identity Data Fabric approach offers a single source of identity, which can serve identity modernisation and cloud migration projects, both immediate and long-term. Hybrid cloud environments are now the cornerstone of every successful and modern enterprise, with identity data an integral part of that. Organisations need a flexible and accurate source of identity data which can meet the usability and security needs of users, applications and systems in hybrid cloud environments.

### Jon Burkhart - Constant Curiosity - Author Glassboard

### 9 Tips for Marketing Innovation in an Old School, Traditional Industry

https://vimeo.com/714039756/6771329d96

Traditional industries often struggle with marketing innovation. Banks, law firms, manufacturers, and others are quick to rely on what works, thinking “if it’s not broke, don’t fix it.” But these industries are often dry, and need something extra in their marketing strategies. They’re not as sexy and creative as fashion, art, food, or travel, though they can be with the right ideas. We’re in an amazing time for marketing innovation. Disruptive technologies like mobile-delivered coupons based on television commercials, movie trailers that adapt in real time depending on the viewer, and augmented reality are just a few of the newest opportunities marketing offers. And there’s no better place to leverage the newest and hottest trends than an old-school industry.

1. Leverage Facial Recognition
Facial recognition holds a lot of power for marketing, especially in traditional industries. Virtually any company can use this technology to promote its products and give away free stuff, which every customer enjoys. Companies like Unilever have used this technology to create viral challenges in exchange for free products, such as ice cream. You’ll recall that Unilever offers personal care brands, but it also makes ice cream. With a vending machine and the “Share Happy” campaign, the company leveraged facial recognition to reward people with ice cream bars for smiling. The campaign went viral on social media, organically. Facial recognition can also be used for digital displays and signage to detect demographic information like age bracket and gender, serving more targeted audiences. These messages are contextualized and tailored to the audience, increasing engagement.

2. Hyper-Targeted Ads
Targeting ads is nothing new, but ads are getting more specific with the help of technology innovation.
Every person leaves a digital footprint online (where most people are shopping) that can be tracked by data mining. Marketers have incredible volumes of data at their disposal, so there’s no excuse for generic advertising. Consumers can be targeted based on hyper-segmentation, delivering ads to the people most likely to engage with them. For example, you can send ads at the right time to customers that are nearby your bank or retail store, or within your defined audience segment, or both, to let them know about a flash sale.  3. Create Immersive Augmented Reality Experiences Augmented reality is a huge game-changer in marketing, especially with the effects of the pandemic and more people shopping virtually. This technology can help shoppers put products and services into context, such as trying on a watch or glasses virtually, even going as far as allowing them to play with the features. AR can also be used via apps to help customers better, such as offering a GPS overlay with icons to show nearby coupons, store locations, and more. 4. Adopt Virtual Currency and Incentivized Ads Virtual goods are growing in value. Many app games feature incentivized ads that offer credits or in-game goods in exchange for watching ads or engaging with sponsored content. Virtual currency also offers value in exchange for surveys and feedback. Building referrals is an important part of growth, especially for traditional industries. Virtual currency or discounts can be offered to customers for promoting products they like with friends and family. When customers share a link to a sale, for example, and friends click the link, the customers get a credit. 5. Maximize QR Codes Quick Response (QR) codes are pixelated squares that are similar to a barcode. Many products have them, restaurants use them for virtual menus, and they’re used for security verification on apps and accounts. Televisions also display them. All the audience needs is a smartphone with a camera and a QR reader app to scan the code, which opens a web page, shows text, and more. Brands can use QR codes to link to a locator app and find nearby stores or retail locations, access content, or find product information. They’re also empowering businesses to do spot targeting within a geographic location, which has plenty of value for location-based marketing. 6. Use IoT The Internet of Things (IoT) is a multifaceted marketing innovation that can be used for a variety of creative campaigns. These devices are connected on a network and gather data from customers, providing drilled-down data to provide contextualized experiences. Like hyper-targeting, IoT empowered marketers to analyze customer buying experiences and get incredible insights into the customer journey as they interact with smart devices. IoT can also pair with other technologies on this list, including QR codes and AR experiences. One exceptional example of IoT in marketing for traditional industries is smart product labels. A simple QR label on a product can direct users to related products, trivia, discounts, or how-to guides. This is especially valuable for tech companies with complex products. For example, a customer may order a printed circuit board prototype with a QR code on the box, which directs to a page about bare board design or a discount on future purchases. 7. Be Present on Social Media Social media is here to stay, though many traditional companies are hesitant to start and maintain a presence on the most popular platforms, especially on the newer channels like TikTok. 
There’s no way around it – consumers expect brands to be online, whether it’s their favorite clothing label or their local bank. They want to get to know the brand’s culture and story through social media content and interact in the virtual space, no matter how “dry” the industry may seem. And even if brands aren’t online directly, customers make sure they are. Customers will leave a review about their experience and discuss a brand on their social media accounts. Because of this, brands need to take control of the conversation and manage their online reputations. If you’re not on social media, now’s the time to start. If you are, come up with a strategy to promote your brand and content on the channels that have the biggest portion of your audience. Depending on your industry, that could be Facebook or TikTok or YouTube. Even a dry brand can gain leverage with content on social media. For example, a video about the basics of a printed circuit board could capture users’ attention, leading them to read more in-depth posts about the topic and eventually making a purchase. It takes time to build a following on social media, but it’s worth the effort and patience. Several tools are available to improve marketing efforts and plan content marketing effectively, such as content scheduling, campaign management, and calendar management. 8. Provide an Omnichannel Experience Traditional industries need not only marketing innovation, but innovative thinking, to succeed. Some of the most creative campaigns have come from traditional companies, such as Esurance, Trident gum, and Tipp-Ex correction fluid. Along with that, companies like Progressive Insurance put themselves in the minds of the customers to improve the experience. Instead of sticking with what worked, Progressive revolutionized the industry by adopting direct channels, a website, and online quotes. At the time, these ideas were as new and disruptive as AR and IoT. Ultimately, this changed the way people shopped for insurance and focused on the entire process, all the customer touchpoints. Omnichannel marketing does the same by integrating all customer channels to create a unified, positive brand experience, whether customers are shopping in-store, at an event, on the website, or on social media. Customers can start in one place, like a social media page, and finish on the website. Omnichannel experiences are making waves everywhere, from healthcare to finance to legal. Many brands, no matter the industry, understand the value of putting the customer first and providing a consistent, personalized experience across all channels. 9. Create a Marketing Mix One of the challenges of old-school industries is marketing to a wide audience. Finance and healthcare companies, for example, often have audiences across different age groups, genders, locations, and income brackets. It’s not enough to stick with just traditional advertising or just digital advertising – modern companies need both. Old-school methods, such as radio ads, television commercials, and direct mail are still valuable to companies and target a specific audience. Likewise, social media, online content, and high-tech marketing innovation like AR and QR codes have value for younger, tech-savvy generations. The best way to reach both is with a mix of traditional and digital marketing that uses available technology in creative, contextualized ways. 
Depending on the audience, a touch of everything from direct mail flyers to word-of-mouth referral programs to social media shopping can be effective. Harness the Power of Innovation in Your Old-School Industry Traditional industries are what make the world go round. They’ve survived numerous revolutions and disruptions without faltering, but for some, refusing to grow with the times can leave them at a disadvantage in an increasingly competitive market. Not all marketing innovations are ideal for every traditional business or industry, but including more innovative ideas into the marketing mix can keep old-school companies in the spotlight for growth and profitability. ### Microlearning – The Key to Balancing Workload with Manager Development https://vimeo.com/714036085/71dc77f2a1 As the workforce becomes more competitive, it can be beneficial and challenging for both candidates and businesses. Rising technology complexity, increasing consumer demands, non-traditional work models, and remote work are creating multigenerational and diverse workforces that require a new type of virtual leadership training. For businesses, the increasing resignations and labor shortages can create talent gaps in necessary roles. It’s a critical time for employers to focus on teaching current employees more skills, in addition to searching for outside hires. Skills can become outdated quickly with emerging technology, so investing in a workforce that’s willing to learn and grow is worth its weight in gold. But devoting time to training is increasingly challenging. It’s difficult to fit time for training into busy employee schedules, especially as teams take on the work of vacant job roles. Employees can’t be neglected or ignored in light of business challenges, however. Challenges with Manager Development Businesses are struggling with a reduced workforce from mass resignations, increased consumer demands, skills gaps, and labor shortages. All of this means employees are forced to wear many hats, take on new roles and tasks, and struggle to get all their work done in a day. If workloads and time are already constrained, there’s little left over to spend on learning and development efforts for employees. Another challenge is remote and hybrid workforces. Though virtual training opportunities exist, it’s more difficult to keep employees engaged and on the same page with training. Remote teams usually can’t attend in-person workshops, and may even be on a flexible schedule that makes asynchronous learning more challenging. There will always be challenges, so how can we strike a balance between day-to-day demands and employee growth and development. Ensure that Content is Timely and Relevant Training and development are excellent for employee satisfaction, but only if the topics are something they want. If your training program is poorly suited or outdated, your manager may be thinking it’s not worth the time they’ll devote to it. Employees are also more likely to become disengaged with boring training programs if they feel that they’re not going to get anything out of it. Make sure the training content is timely and relevant. It needs to provide value to the manager to get them on board. If it seems arbitrary or like something handed down as a box to check, they may not fully engage with it. Consider asking your team for feedback and recommendations. What do they think they need to work on? What topics and programs would they like to see offered? 
Consider breaking up the training by levels to ensure that it’s not over- or under-whelming for your employees. In all likelihood, a manager has a wider skill set than a junior employee, so tailor the training to those skills. Consider the fundamental skills you could develop in your junior employees, new manager training, and your rising leaders, then develop programs that cater to each of them. Clarify the Career Connection With employee training and development, make sure your employees know what’s in it for them. Discuss the training in the context of larger skill development and long-term growth. ensure that your employees know that it’s not a one-off effort, but an investment in ongoing training and development. You should have a strategy in place to discuss with them and include them in the conversation. Plan mandated, manager-led career conversations at least twice a year, if not more. Think through the whole year and include topics that align with your employees’ needs, your development resources, and your current budget for training. Prioritize Micro-Learning Prolonged in-person training that combines live interactions with hands-on practice can do wonders for employee engagement. But take that and turn it into a video conference or an extended training series, and you may have a lot of employees falling asleep. We’re just not designed to learn that way. Microlearning addresses the engagement problem and the scheduling concerns to deliver better training in bite-sized chunks. The idea behind microlearning is that it’s broken into small portions and delivered via digital learning platforms. Unlike other online learning programs, which usually include long videos and activities designed to be completed in one session, microlearning uses brief, frequent interactions via modules to improve learning, retention, and engagement. With microlearning, employees can learn a new skill within hours – or even minutes. Microlearning also offers options for different learning styles and presents the material in a way that’s suited to the subject. For example, some programs may have a combination of video and reading, while another may have an interactive model. If you’re considering including microlearning into your employee training and development, search for management training programs whose formats are flexible and succinct to fit in with busy manager schedules. Remember, time constraints are tight and attention spans may be short, so your managers are likely even less tolerant of outdated or boring content. When you’re considering your options, ask for referrals and suggestions from your network. You’re not alone in your struggles to fit employee training and development into increasingly strained schedules, and you could save time and money by asking others about the microlearning options they’ve used. Microlearning isn’t a be-all, end-all for your training, however. You should include microlearning into a larger training initiative that seeks to develop goals over a longer period, such as a year. Asking employees to engage with two or three smaller learning experiences over the course of a year is manageable and shows that you’re committed to their growth in the long term, not just right now. Build Your Dream Team If the goal is to develop stronger managers, the key is striking a balance between their immediate deliverables and the business’s long-term goals and objectives. 
Finding time for training and development is always challenging, and there will always be obstacles, but the solution may lie in shorter, flexible, and relevant content that’s designed to accommodate busy and strained schedules. There’s no question that investing in your team can pay off, but it’s up to you to find a way to make it work for everyone.  ### Ben Ryan - Sevens Heaven - Author Glassboard ### Top tips for organisations looking to tackle common objections to ZTNA migration https://vimeo.com/714041868/bee9910814 With the increased adoption of remote working practices, employees are positioned outside of the traditional security perimeter. As remote employees continue to access valuable business assets using their personal devices and private/public networks, a company’s attack surfaces widen, exposing them to critical threats emerging from implicit trust practices and unknown vulnerabilities.  For most businesses, the solution lies in Zero Trust Network Access (ZTNA), a security solution framework that provides secure remote access to an organisation’s systems, applications, data, and other assets based on clearly defined access control policies. However, organisations often perceive ZTNA migration as a complex process, resulting in objections from business leaders and reluctance from the workforce. So, how can organisations overcome these objections and successfully implement ZTNA to build a secure and protected remote working ecosystem?  Understanding the significance of Zero Trust Network Access Most legacy security technologies are perimeter-based. The core idea of perimeter-based security is that anyone inside the corporate network is trusted. So, if a user is connected to the corporate network, they don’t need further verification to access the assets within that network. Such technologies leave the system wide open after the initial authentication; they also often fail to secure cloud environments and prevent threats occurring from cloud access. Instances like exposed IP addresses, exposed credentials, infected devices and breached Wi-Fi networks often go undetected by legacy systems.  More specifically, in a remote working environment, the majority of the network traffic comes from outside of the corporate perimeter. Legacy solutions cannot verify or authenticate the users or devices accessing the corporate information system. This unparalleled risk of unmanaged devices and access privileges can lead to critical security incidents. Traditional solutions like virtual private networks (VPN) are inadequate because they essentially provide access to large segments of the network once the user is authenticated. This goes against the principle of least privilege, which states that a user should only be given access to the resources needed to complete their job. It also opens the door for lateral movement across the network, should an attacker somehow find a way in through a compromised device or user.  The ZTNA model addresses this problem by following a ‘never trust, always verify’ principle. Zero Trust solutions authenticate every single connection before allowing network access, whether the user or device is inside or outside the secured network perimeter.  It continuously validates every aspect of user access privilege, including the user’s identity and location, their IP address, data or service being requested, and even their endpoint security posture. 
This approach ensures that every user inside the organisation’s network is verified and trusted, and there are no overprivileged or unauthorised accounts accessing the organisational assets.  Implementing ZTNA ensures that users only connect directly to the apps and resources they need, instead of connecting to the entire network. So, in case devices or applications are compromised, the risk is contained within a small area, restricting it from infecting other assets. Moreover, it is different from conventional firewall solutions, as Zero Trust solutions terminate every connection as soon as any malicious traffic is detected—cutting down attack paths before they can reach the target and eliminating attack vectors. A Zero Trust model is the definitive answer to reducing critical cyber risks arriving from unknown vulnerabilities. Objections to ZTNA migration and why they occur  Most objections and reluctance to ZTNA migration occur due to incomplete planning. Most implementation plans are often rushed and focused on overhauling changes in the current network security infrastructure. Business leaders and CISOs need to understand that legacy systems have existed for a long time and workforces are likely to resist sudden changes that require a complete modification of their work habits and business processes. Instead, a more refined and gradual approach to ZTNA implementation can help organisations slowly shift toward the new model and increase their adoption over time.  First, business leaders must communicate the importance of Zero Trust across the entire organisation. Every employee and staff member needs to be on the same page and understand the critical need for this shift. This requires increasing organisational awareness through planned training sessions and regular communication/updates. Once the importance of Zero Trust is defined and understood across the workforce, security teams can focus on the implementation phase.  Before implementation begins, organisations should incorporate a Zero Trust framework. This framework will clearly define which part of the organisation needs to have Zero Trust policies in place. What’s more, the combination of security and agility delivered by a successful Zero Trust approach can provide a powerful advantage to any organisation’s digital strategy. Indeed, the capabilities of Zero Trust can help define the scope of what digital transformation can achieve. With their expanded digital infrastructure effectively safeguarded, businesses are free to be more ambitious.   In terms of implementation, ZTNA should be rolled out gradually. Instead of completely replacing the old network structure and security policies in one go, CISOs should start with a defined protection surface. Begin by clearly identifying which assets and services should incorporate ZTNA as a priority. For example, organisations can begin by implementing a Zero Trust solution on specific applications such as Active Directory, or on certain critical assets such as the consumer or financial database.  Once a certain set of applications or assets have been migrated to the Zero Trust model, security teams should monitor its effectiveness for a certain period of time before moving on to the second phase of implementation. It is also important to monitor how the employees are adapting to the new security model. Employees should be made aware of the pre-defined Zero Trust policies and be trained on how to access different resources under the new security model.  
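To illustrate what the “never trust, always verify” checks described earlier might look like in code, here is a minimal Python sketch that evaluates a single access request against identity, device posture and resource entitlement. The policy table and attribute names are hypothetical simplifications; production ZTNA platforms evaluate far richer signals on every connection.

```python
# Hypothetical sketch of a per-request Zero Trust policy check: every request
# is evaluated on its own merits, with no implicit trust from network location.
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user: str
    groups: set            # group memberships asserted by the identity provider
    mfa_passed: bool       # multi-factor authentication result
    device_compliant: bool # e.g. disk encrypted, EDR agent healthy
    resource: str          # the application or asset being requested

# Hypothetical policy: which groups may reach which resources (explicit grants only).
RESOURCE_POLICY = {
    "finance-db": {"finance-analysts"},
    "hr-portal": {"hr-staff"},
}

def evaluate(request: AccessRequest) -> bool:
    """Grant access only if identity, device posture and entitlement all check out."""
    allowed_groups = RESOURCE_POLICY.get(request.resource, set())
    return (
        request.mfa_passed
        and request.device_compliant
        and bool(request.groups & allowed_groups)   # least privilege: no match, no access
    )

print(evaluate(AccessRequest("jsmith", {"finance-analysts"}, True, True, "finance-db")))   # True
print(evaluate(AccessRequest("jsmith", {"finance-analysts"}, True, False, "finance-db")))  # False - device posture fails
```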
By taking it slow and incorporating a planned, gradual implementation process, organisations can overcome objections to ZTNA migration and help the workforce transition smoothly into the new security system.

The Zero Trust model can reduce business costs
While some business leaders might be reluctant to adopt ZTNA because of its implementation cost, it’s important to understand that a Zero Trust approach can actually reduce business costs significantly. Shifting to Zero Trust means organisations can facilitate more rapid digital transformation and make more resources available in the cloud, leveraging a ZTNA architecture that includes secure cloud network access. ZTNA allows for unified access policies for remote as well as on-premise users, which drastically reduces complexity and can replace or reduce the number of security tools in the existing corporate perimeter. ZTNA solutions can automate least-privilege network access, leveraging your existing identity and access management and application metadata, allowing security teams to focus on other critical business processes. It will also significantly increase an organisation's security efficiency and help to protect the business from potential data breaches. In 2021, the average cost of a data breach was $5.04 million. Applying ZTNA will help organisations to avoid costly data breaches and ransomware attacks.

Therefore, business leaders need to understand the wider impact of a Zero Trust approach and make ZTNA a critical part of their security infrastructure. The cost of maintaining ineffective legacy systems and rectifying potential data breaches far overshadows the cost of implementing ZTNA solutions. By focusing on the powerful benefits around cost, agility and productivity that ZTNA can deliver, security leaders can overcome the common objections and get their projects moving. In today’s rapidly evolving digital space, threats can come from a wide array of sources and it’s an almost impossible task for security teams to monitor all aspects of secure access. Implementing ZTNA takes some of the pressure off security teams and eliminates critical attack paths before a system is breached. It ensures that organisations can carry out their critical business processes within a secured digital ecosystem, helping them reduce their risk while empowering their teams’ productivity.

### 4 ways sales teams can perform better in the field

https://vimeo.com/714012019/020a61308d

Whether you are advertising or marketing a new product or offering any type of service, most B2B organisations include a sales team. In simple terms, a sales team is the group of salespeople who interface with customers and make sales for your company. The sales team is the department responsible for meeting the sales goals of an organisation by selling its products. It consists of sales managers, sales executives, sales representatives, and sales specialists who work continuously to meet daily, monthly, quarterly and annual sales goals. They must stay up to date with the latest sales trends and technologies, and they focus especially on generating sales to drive business growth. Managing a successful sales team takes a lot of effort. As a sales leader or manager, you need to support your sales representatives as they work hard to convert prospects into customers. You need to create an environment where they can perform at their best.
So, let’s look at 4 ways sales teams can perform better in the field:

Make maximum use of field sales software
Field sales means outside sales: sales teams go into the field or market to sell to prospective customers and market to them in person. Field sales management software lets the field sales team manage prospects and customer data more effectively through mobile or desktop applications. This software handles the challenging components of field sales, including location tracking, facilitating communication, organising field data and activity, and much more. So, to make your sales team perform better, your business needs to turn to field sales software. It helps you gain speed, efficiency and productivity. Examples of such field sales software include Delta Sales App, SalesRabbit and Salesforce. Field sales is a type of selling in which the vendor visits clients in person to sell products and services. Field sales software packages are commercial applications designed to help field sales teams manage prospects, customer data, and activities. They provide companies’ field reps with location tracking, communication tools, event management capabilities and more opportunities to increase productivity. So, when you’re looking to boost productivity in the field with mobile applications or desktop software that can track your company’s progress, motivate reps during those long days and support their efforts behind the scenes, it’s time to move siloed teams towards being proactive: quickly assessing genuine opportunities, seeing who has been talking with whom, and strategising accordingly. This can lead to new breakthrough growth both locally and globally, perhaps as soon as early next year.

Be a transparent organization
Transparency here means that peers, managers and executives should know how salespeople are performing in the field. The top sales organisations make each team member’s goals and progress known. The number of calls made every day, the amount of time spent on the phone, and what each person’s pipeline looks like should be made available for everyone to see, so that salespeople know their work is visible and stay focused. Transparency is more than just a publicity tactic for businesses. We’ve seen that there are many advantages to being open and transparent, including increased productivity, trust, culture, and morale. It leads to trust, which is the foundation of great teamwork. When you’re open with your customers and blog readers, it helps build that trust. That’s why we’ve embraced transparency on our own blog, sharing information about everything that’s going on, including features and pricing. We believe that being open helps us build this trust and share the reasoning behind our choices.

Encourage your sales personnel to work in a good, competitive environment
Healthy competition at work means interaction that promotes striving for higher achievement while creating an environment where everyone in the group hopes that everyone else will do well. Competition can be good: when team members work together towards a shared goal, they push one another to be more creative, productive and motivated in the field. Sometimes it is very important to publicise individual salespeople’s progress. Let people see one another’s daily and quarterly progress via a gamified dashboard.
Sales team members have to work under pressure and in a competitive environment, which can be gruelling at times. There are no days off from sales, and little chance to relax. Slow periods in business and fewer sales conversions add up to more stress and can take a toll on people's enthusiasm as well. Still, healthy competition amongst team members can ignite the drive to outperform one another, enhancing performance, and achieving goals together is good for business. Reward your sales representatives For any business, no matter what industry, a successful and cohesive sales team is a must. A great product or service won't do much if you don't have anyone doing the selling. There are many different ways to incentivise top sales performance and motivate your staff, but one of the most popular and most effective is simply offering monetary rewards for reaching specific targets. For example, if your team brings in a certain number of clients per month, you could offer profit sharing once they hit that target, which ultimately acts as an incentive to bring in even more sales. Your employees will be highly motivated once they see how they can benefit financially from working hard, and that benefits everyone at the company. One of the biggest challenges that sales managers face is coming up with an effective way to train their staff. Just as there are many different types of people in the world, there are individuals who excel at different things. The most effective sales teams are not necessarily those who work the most hours or have the most resources, but teams whose members know how to empower each other and work together towards the same goals. It's difficult to maintain a team that motivates and inspires each member to give themselves completely for the success of the company, but it's possible if you find ways for your team members to get along on a personal level, not just a professional one. No matter what industry you're part of, an effective sales team will always focus on several core values: high performance, self-improvement, teamwork and putting the customer first. ### Harness the Power of Hyper-contextual Advertising https://vimeo.com/714026212/bb9318717b While contextual data is often used by marketers and advertisers to provide a better experience for their customers, it's also the key to a new level of performance marketing. Contextual analytics, paired with big data sources, can help marketers analyse their customers' behaviour patterns, contributing towards improving the customer experience. To better explain harnessing the power of hyper-contextual data, let's look at how to implement it in telemarketing and advertising via mobile devices. Traditionally, marketers had to collect contextual data - information that is unique to the device and its current usage: where the device is located, how it's being used, and what's happening around it - such as: 1. Geolocation data - where the user is in the world and where they are going. 2. Device data - what's going on with the device at any given time. 3. Sensor data - information collected from the device's sensors, including information about the device's environment, such as its velocity or the ambient temperature. 4. Behavioural data - information regarding what the device is doing, such as how users interact with their devices.
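To make the four categories above concrete, here is a minimal, hypothetical sketch of how such a contextual record might be modelled on the device itself; the `ContextRecord` class and its field names are illustrative assumptions, not part of any particular SDK.

```python
# A minimal, illustrative model of an on-device contextual data record.
# Class and field names are hypothetical; real SDKs will differ.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class ContextRecord:
    # 1. Geolocation data: coarse position and heading
    latitude: float
    longitude: float
    heading_degrees: float
    # 2. Device data: current state of the handset
    battery_percent: int
    network_type: str              # e.g. "wifi" or "5g"
    # 3. Sensor data: readings about the environment
    speed_mps: float               # velocity of the device
    ambient_temp_c: float
    # 4. Behavioural data: how the user is interacting
    foreground_app_category: str   # e.g. "travel", "banking"
    screen_on_minutes_today: int
    captured_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

# Example: a record assembled locally; nothing here needs to leave the phone.
record = ContextRecord(
    latitude=51.5072, longitude=-0.1276, heading_degrees=270.0,
    battery_percent=68, network_type="5g",
    speed_mps=1.4, ambient_temp_c=18.5,
    foreground_app_category="travel", screen_on_minutes_today=42,
)
print(record)
```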
Nowadays, with so many smartphone applications and different types of software, collecting contextual data in the cloud is neither necessary nor easy, because users are increasingly sensitive about their privacy. Instead, contextual data stays on the mobile device along with any PII (Personally Identifiable Information). This is a big plus for end-users but can present challenges to advertisers with regard to the accuracy of their ads. Indeed, PII plays an integral role in developing personalised marketing strategies. At the most basic level, PII indicates how to address, greet and send tailored marketing messages to customers. Going further, PII-based data tracks and monitors the customer journey. This enables marketers to gather the insights they need to run and maintain high-performing, personalised, omnipresent marketing initiatives. So, how can advertisers work around current limitations while increasing ad value? 5 Benefits of hyper-contextual data collection: On-Demand Targeting By collecting real-time contextual data via mobile apps, marketers and advertisers can gain a better understanding of the user experience - attributes such as location when users are engaged with the device, what they are doing, and what they're interested in. This data can then be used to personalise content, enable on-demand marketing, and provide highly tailored communications, resulting in: behaviourally targeted ads, delivered in real time while respecting people's privacy; personalised web experiences (such as the ability to register for an event or a webinar on the business's website while on the road); the ability to target users with real-time marketing communications and offers; and the ability to deliver contextual ads based on sensor-triggered information about the device and user. An untapped targeting interaction is cross-app intelligence, believed to be the next frontier of hyper-contextual advertising. Several companies, such as Digital Turbine, have been able to convince OEMs to embed their software at the OS layer, enabling them to perform local, PII-friendly targeted ads or app suggestions. In the same vein, the virtual keyboard can also gather non-PII information and cross-app intelligence as the user navigates between multiple apps during the day. Website Targeting Contextual data collection can help elevate the online user experience. For example, the data collected via online engagements can help to better customise website experiences, leading to increased engagement and conversion rates. The correct data sets can be used to dynamically change the design, layout and content of the website in question, generating a richer and more customised user experience. With website targeting we also have to keep omnichannel communications in mind. There's a lot of data going around that marketers and advertisers can leverage in order to make website targeting more successful. Not only does contextual data help marketers build a customised user experience on their website, it also assists in driving website traffic from other channels and platforms, depending on how contextual data is implemented across each channel. Topic Targeting/Custom Keyword Historically, to target users, advertisers would create lists in which each user was represented by a number of keywords or topics. Using this approach, advertisers would bid on all of the keywords in the list and pay for every user that was reached. This is referred to as "non geo-specific" targeting.
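As a rough illustration of the keyword-list matching just described, the sketch below scores a user's on-device topics against an advertiser's keyword list; the `match_keywords` function, the sample lists and the threshold are hypothetical, and the geo-specific refinement discussed next would simply add a location check evaluated on the device.

```python
# Illustrative only: matching an advertiser's keyword list against topics
# inferred on the device. All names and data here are hypothetical.

ADVERTISER_KEYWORDS = {"hotel", "city break", "flight", "travel insurance"}

def match_keywords(user_topics: set, keywords: set) -> float:
    """Return the share of the advertiser's keywords this user matches."""
    if not keywords:
        return 0.0
    return len(user_topics & keywords) / len(keywords)

# Topics derived locally from recent app usage (never sent off the device).
user_topics = {"hotel", "flight", "running shoes"}

score = match_keywords(user_topics, ADVERTISER_KEYWORDS)
if score >= 0.25:  # arbitrary threshold, for illustration only
    print(f"Eligible for the campaign (match score {score:.2f})")
else:
    print("Not eligible")
```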
Now marketers and advertisers have the ability to create geo-specific keyword lists based on the hyper-contextual data that is available. For example, a hotel brand can target users in a particular geographical area who have just booked a room, without any data being shared outside the device. Another example is a tour operator showing a relevant, non-intrusive ad inside a specific travel booking app, making the hyper-contextual reach highly relevant for both advertisers and users. Viewability If you're talking to someone but they are not looking at you, what's the point of talking? The same applies to advertising. For marketing to be effective, the initiatives need to be visible and be seen in the correct places, on the correct platforms. Viewability is a metric used to determine the percentage of an ad that is in view for a minimum amount of time. Hyper-contextual targeting ensures that advertising is delivered to the right user at the right time and in the right way, as a non-intrusive display ad. Brand Safety Contextual data helps to make smarter decisions about where to place your ads and offers. By analysing the data, marketers can be assured that the ads they place will not appear on inappropriate sites and that their users will not view ads that are irrelevant or offensive. Conclusion To conclude, the world has changed, and we need to change with it. We are moving from a world of identifiable users to one of anonymous users. Hyper-contextually targeted marketing is the future of marketing, and we're already seeing marketers and advertisers use it together with PII systems to customise their messages in accordance with each customer's unique behaviour. Therefore, the companies harnessing the power of hyper-contextual data today will be the companies that dominate tomorrow. ### The role of big data in enabling ERP to trigger unprecedented growth https://vimeo.com/714028849/2ec96f16a6 Businesses across every industry are operating in an era of vast globalisation and fierce competition. Unfortunately, given the current cost-of-business crisis, a 30-year-high inflation rate, and supply chain disruptions, this also coincides with an inability to predict the challenges that will occur from month to month. As a result, it is crucial that businesses prioritise efficiency gains by gaining a clear perspective of their operations and preserving control across every aspect possible. Although most businesses will have implemented some form of internal management system to manage the challenges created by finances, human resources activities, ecommerce, and the supply chain, many of these systems are limited in their ability to expand beyond day-to-day business processes. However, the intelligence of enterprise resource planning (ERP) has provided a much-needed solution to these limitations. By gathering valuable data insights, ERP systems are able to streamline internal business processes and break down restrictive barriers across an entire organisation. By enhancing visibility, automating inefficient tasks and preventing the loss of essential resources, ERP adds value to businesses across virtually every industry. A crucial element of ERP's success, enhanced data insights enable businesses to proactively address areas of improvement that may previously have limited their success. From improved compliance with regulatory standards to reduced risk, real-time insights have enhanced operational efficiency for data-driven business leaders on a global scale.
With research carried out by Experian finding that 85% of organisations view data as one of the most valuable assets to their business, the efficiency gains of ERP implementation are indisputable when it comes to insight and evaluation. The research also uncovered that 50% of organisations have experienced improved customer service, 45% believe ERP has equipped them with greater insight for decision-making purposes, and 44% have increased their capacity for innovation. Inadequate data handling Whilst extensive ERP data insights have revolutionised the capabilities of organisations, they have also created new challenges for businesses to navigate. With the primary intention of ERP systems being to increase organisational efficiency and reduce costs, it is concerning that the avoidable consequences of poor data management are estimated to cost businesses an average of $15 million a year. Whilst 78% of organisations are now affected by problematic data debt, 66% are having to implement new data management initiatives as a result. Likewise, half (51%) of organisations believe that the data collected by their ERP systems isn't clean enough to be utilised, and 28% of businesses suspect their current and prospect data to be broadly inaccurate. With poor-quality data and ineffective management systems weakening an organisation's competitive standing and undermining critical investment objectives, businesses must look to implement additional support mechanisms to combat these challenges. The consequences for future business success The data obtained by ERP systems is endlessly varied – it must be continually captured, tracked, managed, and analysed. As 44% of transactional data is created by customers, suppliers, and external providers outside the immediate organisation, one of the biggest challenges for ERP systems is ensuring that the quality and consistency of this data reflects business requirements. Worryingly, with 88% of data unable to be leveraged and only 12% being effectively analysed, research by Forrester found that ERP systems alone are unable to combat the challenges produced by large and complex data capture. Besides the potential for data manipulation, incorrectly monitoring market dynamics and failing to respond to customer preferences can have a detrimental impact on business credibility and damage future opportunities. In the short term, poor data management creates costly errors – and the longer they take to rectify, the greater the cost to the business. Big data as a solution Big data refers to data that is so large or complex that it can be impossible to process using traditional methods. Unlike ERP in isolation, big data is able to distinguish between various data sources and, as a result, can convert manipulated data into a user-friendly form for businesses to utilise. Not only does this prevent ERP systems from obscuring the intelligence that businesses require to make accurate decisions, but integrating big data into ERP systems also expands companies' potential at an unprecedented rate. Accuracy, availability, validity, and reliability are all characteristics of high-quality data. By overcoming technical deficits, big data provides organisations with data that consistently matches these qualities, unlocking financial prospects that would previously have been wasted.
The intelligence of big data analytics provides previously unattainable visibility into customer behaviour, supply chain requirements, and critical areas of engagement amongst key stakeholders. With an extended view of their assets and the ability to monitor data as and when it arises, businesses can optimise the routes involved in the flow of products and their movement within the supply chain cycle. Likewise, a deeper understanding of sales patterns and predictions for future supply and demand is particularly beneficial for the revenue of retail businesses. With greater insight into the desired inventory, supply management is greatly enhanced by reducing the potential for overstocking materials. Big data shaping business prospects Global competition is rising across every industry, industries are facing the worst supply chain crisis since the mid-70s, and the full extent of the inflation rate rise is yet to be discovered. As economic circumstances continue to present unpredictable challenges, it is essential that businesses do all they can to strengthen their competitive advantage. High-value data insights hold the potential to determine whether an organisation thrives or is left behind. Whilst optimised inventory, scheduling and management processes help businesses to combat external pressures, avoiding big data mechanisms drives organisations closer to achieving the opposite – productivity losses, ineffective business strategies and poor financial management included. About Nexer Nexer is a leading digital transformation expert, specialising in Microsoft Dynamics. It advises, implements, develops, and manages Microsoft Dynamics applications, alongside other technologies, to help clients strengthen their market position, grow revenue, and improve productivity. Working with businesses in the manufacturing, field service, wholesale distribution, construction, retail and fashion industries, its leading product lines include Microsoft Dynamics 365, Internet of Things, AI, Business Intelligence, Connected Field Service, the Smart Office and Smart Retail. Nexer has been operating successfully since 2008 and was established in the UK in 2019. It is part of the Nexer Group (formerly Sigma IT) – a Swedish IT services firm with over 5,000 employees across the globe. The business is able to leverage its relationship with the group in Sweden and worldwide to add significant and unique value to its customers. ### How Could an IT Consultant Benefit You? https://vimeo.com/714027212/34c65fe193 New technology systems and networks have been integrated into almost every business in recent years. Whether based online or not, nearly every business will be using some form of technology to help with its operations. This is now fundamental to every new start-up in most industries, which means it's nearly impossible to shy away from the IT sector. Obviously, it would be very helpful if you were an expert in the technology systems you need to use, but this is not always the case. As a business owner, your expertise will likely be in daily management and niche industry knowledge - IT systems are not always tailored to this. This can be a dilemma for many start-ups. Many firms experience sluggish progress or extended periods of decline due to problems with IT.
Poor IT systems can be the difference between a struggling business and one turning over a lot of money. IT consultants can be of great help with any issues regarding technology and networks in your business. Read on to learn more about IT consultants and how to pick one that will benefit your business. What is an IT Consultant? An IT consultant will work with you to improve any IT systems that your business is using. This means increasing the efficiency of any current IT networks to reach your business objectives. Furthermore, IT consultants can also increase the integration of IT with your business, recommending and implementing any IT technology you are not currently using. IT consultants will outline the ways you can improve your operations and how technology can make production more efficient. They can also help if you're running into problems with your current IT systems, identifying and fixing any issues which may be harming your production. When Should you be Looking for IT Consultants? The great thing about IT consultants is that they can be helpful at any stage of your business journey. As long as there is the possibility of using IT in your business, a consultant can help with that. Alastair Williamson-Pound, the CTO of Mercator IT Solutions, states, "[You should be looking for IT consultants] when the company is lacking a key skill and doesn't have the time to train up someone internally. Consultants will apply years of experience and know-how and be able to start solving the problem within a day or so (depending on the nature of the problem and amount of information available). The application of this skill may be something that the company doesn't necessarily wish to invest in, but only needs tactically." Alastair continues, "When you want to get up to date with best practice. Working practices can get out of date. Bringing in a consultant can enable you to see what good looks like, learn best practice and help shape your own internal employees' working practices to optimise their productivity." IT consultants can offer immediate help by integrating technology with your business. Since this technology improves your production methods, you'll find yourself with lower average costs. This should allow you to pass on lower prices to consumers and stand out from competitors. Consultants can still be of use if you're further down the line with your business and IT integration. IT consultants will prevent any standstills where production can no longer continue because of problems with IT. Technology is constantly changing and improving. This means consultants will always be able to implement more efficient systems throughout your business journey. Picking out a Reliable Consultant There are many IT consultants in the market, all claiming that they are the best in the business! Therefore, it is pretty challenging to know if you're actually working with someone who will provide long-term benefits for your business. Use these tips to help you pick an IT consultant that will help you reach your business objectives. Be Specific Most IT consultants will cover many industries and know about many different systems. However, it's not possible to be an expert in absolutely everything, so some IT consultants may not be as valuable to you as others. Different consultants will specialise in different industries. If you don't pick a consultant familiar with your industry, they may provide sub-optimal solutions that aren't of great help to your operations.
Therefore, you'll need to research consultants and ask them what they specialise in. If you're hiring a consultant to iron out problems with your IT systems, then being specific is even more crucial. Try to understand the issue as much as possible. This will allow you to search for consultants who know your IT system very well and can easily spot issues, allowing quick fixes instead of waiting for a diagnosis from inexperienced consultants. Who do you work best with? If you've ever looked for an IT consultant before, you'll know there are many different options, ranging from smaller individual consultants to huge consultancy companies. Different types of firms will have different potential benefits. Very large companies have the advantage of many reviews and strong reputations. This may work better for you if you're in a well-known market and looking for quick solutions. However, smaller firms or individuals could be more valuable if you're working in a niche market. You're more likely to find individuals specialising in these markets than larger firms, which may result in more effective solutions. Communication is Key When working with any third-party company, communication is vital for efficient service. Make sure you're working with a consultant who communicates regularly and shares in-depth information with you. Ongoing communication will make sure you know how they plan to help you and whether or not you're ready to implement their method. This will make the process much faster and have you back to a booming business in no time! This is why you may prefer working with small firms and individuals. Communication can be much faster since you are not on a waiting list behind the other clients that large companies may be working with. You also don't have to go through multiple chains of command if you need to pass on urgent information; it will be relayed immediately if you're working with an individual, since you are already talking to the boss! Alastair states, "The best way is to go to a trusted and reputable consultancy. They can have a large network of consultants and they will be able to select the most pertinent person to match your requirements. It also depends on the nature of the role. If you needed a top-tier management consultant then you could approach any of the large management consultancy firms (KPMG, BCG, EY etc). If it's a technology consultant then Mercator would be the best option." Keep these factors in mind as you start the process of finding IT consultants to support your growing business. Without a doubt, your team will benefit! ### Dave Birss - Divergence - Author Glassboard ### Campbell Macpherson - The Change Catalyst - Author Glassboard ### Can Anyone go Phishing? Modern popular culture has a habit of depicting cybercriminals as genius masterminds with an extensive list of computing skills at their disposal. While these individuals do exist, the image does not truthfully represent the majority behind today's cyber attacks. Phishing attacks have long been one of the most common threat vectors, ranging from the typical blanket email campaign to more targeted and sophisticated methods, like business email compromise (BEC) attacks. As one of the more straightforward cyber threats, phishing is appearing in more and more of the biggest attacks today.
And now, thanks to the availability of phishing kits and easy-to-use website hosting platforms, pretty much anyone can launch a career in cybercrime as a phisher if they find the right video tutorial. How has phishing evolved? As technology habits have progressed, so have phishing tactics. The latest trend we've seen catch on is a rise in phishing kits that allow cyber attackers to focus specifically on mobile devices. The continuation of remote and hybrid working models makes mobile devices far more lucrative targets for criminals, as dispersed workforces are no longer directly under the watchful eyes of IT teams. Thanks to shadow IT and limited device visibility, it is far harder for security teams to monitor and protect all relevant mobile devices. Additionally, we're noticing that phishing campaigns are less likely to be driven by the theft of random credentials, and are instead targeting far more valuable data, including bank details and social security numbers. Previously, harvesting basic credentials like email addresses and passwords would give criminals the resources to carry out further phishing campaigns for the bigger prizes. Now, however, we are seeing more adversaries trying to cut out the first stage and instead make a beeline for the pot of gold. For example, our latest research has shown a 300 per cent increase in phishing sites targeting Chase Bank account holders. Each fraudulent site behind the URL in the phishing email was created using a phishing kit. While banking and financial data have always been a main target for criminals, there has recently been a significant shift towards cryptocurrency and crypto-wallets. Once a threat actor breaches a crypto account, they're free to transfer the contents without much interference or risk of getting caught. Multi-factor authentication (MFA) and other layers of defence make it harder for attackers to steal company assets, but criminals are constantly finding new methods to bypass security. Giving attackers a boost As well as being deployed as standalone attacks, phishing is also used at the start of larger, more sophisticated campaigns. By targeting employees – who are considered to be the weakest link in any cybersecurity program – phishers can gain entry-level access to the company network and paint a picture of the entire infrastructure. This intel is extremely valuable for groups looking to launch invasive ransomware attacks as a secondary campaign. Some of the most devastating attacks in recent months, including the SolarWinds compromise, started with a simple phishing attack. And with the necessary resources and step-by-step instructions on how to launch a phishing attack readily available on the internet via phishing kits, we can expect to see the numbers continue to rise. So, what is a phishing kit? In its basic form, a phishing kit is an all-encompassing package for setting up and deploying a phishing attack. Very few people realise how accessible these 'starter packs' are – you definitely wouldn't need to venture as far as the dark web before you stumble across one. According to our research, one of the most used kits is the Chase XBALTI, and the primary targets are Chase and Amazon account holders. The kit includes code for setting up the phishing site, which is easily uploaded once a domain has been acquired. An individual would then simply have to configure the phishing site to direct stolen credentials to a separate location and get sending.
There are usually large volumes of email addresses available online, or sometimes phishers will purchase specific datasets from the dark web if they want the phishing campaign to be more targeted. There are even services that will do the sending for you, so once it's deployed, it can be left to run. Even if one of these phishing sites is discovered, taking it down is not always a straightforward task. To align with the latest trends in phishing, online kits are now far more sophisticated – but still as easy to use – meaning they can now be deployed to collect financial details, social security numbers, home addresses and other personal information. They've even been equipped to bypass common security measures so that they can capture one-time codes for MFA. As part of their appeal, our research has also demonstrated these phishing kits' evasion capabilities. In some instances, attackers use free dynamic domain name services to point a URL to the server of their choice. This feature allows users to change the destination of the URL should the phishing kit be discovered and shut down. Some online stores are selling these kits for a few hundred dollars, and phishing sites can be set up in an hour or so. Crime has never been so affordable and accessible. The rules to live by when fighting phishing The simple and accessible nature of phishing means it will remain a popular threat vector for attackers in years to come. With tools out there to make anyone a phisher, businesses need to do the groundwork now and batten down the hatches before they become overwhelmed. There are three security policies that businesses should adhere to as standard practice: never re-use passwords, always use MFA where possible, and never underestimate the power of common sense. It is also important to introduce solutions that are able to analyse email messages to determine whether or not they are malicious. Organisations should look to blend the latest best practices for preventing phishing attacks with traditional email filters, specialised detection frameworks and user training. Email filters provide an organisation with high-speed detection of spam, malware, and well-known phishing URLs, while specialised detection works at the application layer to learn and identify unique and targeted threats using machine learning, natural language processing and behaviour analytics. User training adds another layer of defence by helping to address suspicious messages that are neither deterministically clean nor malicious. By combining these three best practices, organisations will be in far better stead to avoid becoming the victim of phishing attacks. Human intuition is one of the most powerful tools we have against cyber threats. If something seems suspicious, then it probably is. Criminals are experts in social engineering and deception techniques, so it's vital that we don't assume something is legitimate if we have even the slightest of doubts. For example, we know that banks will not send SMS text messages requesting further personal information, so these should immediately be treated as suspicious. If in doubt, it's always worth contacting the company in question to confirm before acting upon any requests. The accessibility of phishing kits will keep businesses on their toes. Knowing that any individual could become an amateur phisher is a daunting thought.
While they may not have the advanced skills to break through sophisticated defences, if enough of them make you their target, you can be sure they'll find an overlooked weakness. ### Why do digital transformation programs fail? For all the talk of companies accelerating business transformation and moving to the cloud, there are still many firms grappling with how they will achieve operational efficiency and move processes online. After all, it's complex and expensive to move to the cloud, integrate apps and ensure the supporting culture, structure, security, and governance are in place. While it may be true companies are accelerating their plans, it doesn't necessarily mean they are delivering on them. In fact, Forbes says that 84% of digital transformation initiatives fail because the technology selected isn't appropriate or easy to integrate. Experience has shown that any digital transformation program viewed purely as an implementation of technology is doomed to fail. This is because a 'technology first' approach neglects to consider and accommodate the importance of strategy, culture, structure and governance to delivering a successful business transformation program. Transformation failure points First, there is no 'one size fits all' template for success. Every company is unique in design. A transformation undertaken in a 'Command & Control' environment will require a very different approach compared with a holding structure or a loosely federated enterprise. This is largely due to differences in stakeholders' span of control and decision-making processes within these different organisational structures. Either way, most will have legacy architectures to consider, which bring with them cumbersome data management processes and security models and put pressure on resources. Second, digital transformation should always be considered with a strategic goal in mind. Any attempt to implement technology tactically without a clear vision of why it is needed is unlikely to deliver the intended results. For instance, the goal might be to increase online sales by 20%. However, it's impossible to achieve this objective without considering the stock levels, payment platforms, supply chain integration, and the impact on the workforce that manages the increased customer orders and product fulfilment. Third, projects will fail without an executive-level sponsor to sell the value and lead the transformation efforts. From a governance perspective, considerable thought must be given to which processes must be modified so that everything from the smallest of changes to the most radical transformation is smooth and has a positive, long-lasting effect. Next, culture must be aligned. All efforts must be made to identify and influence stakeholders to champion the digital transformation program. So often, initiatives fail because people resist the change. Painting the picture of a better, brighter future is essential. However, leadership teams must be able to not only articulate the vision but also empathise with the reasons why people might resist change. Is there the perception that a machine or system will replace jobs? Organisations that understand this complexity often recognise it takes time to develop the supporting competence needed to deliver the strategy. It's why some companies choose to accelerate their efforts by hiring external talent that has the experience needed to manage the digital transformation from both the point of view of systems and people.
Finally, there must be an appreciation for macroeconomics and even the political landscape if the transformation is going to be successful. For instance, if legislation or policy is holding up the implementation of specific technologies, then the adoption of transformational technology could slow. The public sector and financial services sectors are examples of those that are beholden to regulatory frameworks. They often cite this as an inhibitor to progress, especially when it comes to the adoption of public cloud compared with other, less regulated industries. There are also economic factors to consider. If the financial investment isn't available, nor the grants and tax breaks that encourage innovation, then progress will slow down. During the COVID pandemic, many organisations found it difficult to appropriately fund digital transformation projects despite considerable clarity and consensus that they needed to happen. This left many companies stuck between a rock and a hard place. While there were huge swathes of consumers who wanted online options, there were businesses who couldn't, and indeed still can't, react to the demand. Business resilience was greatly undermined. On the flip side, there are also consumers who just aren't ready for the move to online. Digital banking is an excellent example of where transformation left behind those not used to banking online or who were part of the minority of 'unbanked'. The right technology choices Having a broader view of the considerations related to culture, structure, security, and governance is critical for transformation success and should inform technology choices. Aside from making the right decisions on platforms and applications, companies must be able to deal with the ocean of data that transformation creates. Data can drive decision-making and create more agile businesses, but only if it can be organised and mined. However, the scale of data involved would completely overwhelm most human beings. That's why it's so important to adopt the right technology for the business setting and vision. Only with the right applications in place can organisations find new data to support their decision-making processes, accelerate the time to make decisions and decrease operational costs. Businesses that get this right will strike gold. However, where there is gold there are criminals, so every digital transformation program must include security measures as part of the governance. Without sophisticated cyber defences in place, any organisation is an easy target. Cybercriminals are using advanced tactics and techniques to engage in their nefarious activities, so defences must match up. This is where some businesses can be tempted to take shortcuts. For example, if a retail team can't detect bad bots on its eCommerce platforms, then they won't know criminals are scraping credit card details, blocking inventory and committing fraud. It's important to make the links between data laws, operational technology, and strategy, and it must be a top-down mandate to do so. ### How hyperscalers can capitalise on the edge The rapidly growing demand for edge computing in the UK presents major cloud providers with significant opportunities, provided they form new strategic partnerships to overcome the latency problem. Edge computing is all about low latency. It means bringing low-latency, high-capacity connectivity to all areas of the country so organisations can use advanced, data-intensive applications, regardless of location.
Outside the South East, however, the hyperscalers are currently unable to provide the necessary level of performance because distance increases latency, threatening their ability to fully participate in the evolution of the edge and to seize its almost limitless array of commercial opportunities. If, on the other hand, the major vendors forge the right partnerships with vanguard edge platforms that already have a strong regional presence, they stand to reap substantial rewards from a market that Statista predicts will grow to $250bn by 2024. The UK edge is part of a global trend and will accelerate 5G roll-out The tide of edge adoption is already flowing in the UK. Research from Aruba in 2020 found 44 percent of UK organisations were using edge technologies to deliver new outcomes and another 21 percent were planning to do so. Pulsant research among UK IT decision-makers and business leaders shows the pandemic has increased demand for low-latency cloud connectivity by accelerating digital transformation in 75 percent of organisations. Businesses are migrating workloads to the cloud and implementing SaaS applications to improve customer experience, deploy automation and increase productivity. Change in data and processing is very much on the agenda. Market intelligence company IDC predicts that by 2024, 80 percent of enterprises will overhaul relationships with infrastructure providers "to better execute their digital strategy for ubiquitous deployment of resources and more autonomous IT operations". Hyperscalers are already forming partnerships with telecoms companies As more organisations understand the significant advantages of processing data workloads at the edge of the cloud, closer to the devices or businesses that generate and use them, the roll-out of 5G is certain to speed up, to meet the need for fast, universal connectivity. Telecommunications companies will therefore play a central role, which is why the big public cloud providers are entering into partnerships with them. AWS has, for example, entered partnerships with Verizon and Vodafone, while Google has launched Anthos for Telecom. With the expansion of 5G coverage, the use cases should grow exponentially. SaaS providers will deliver their solutions more directly, straight through the network. Solution-builders will create apps and orchestrate the network functions and computational infrastructure needed for secure delivery to their end consumers, using network slicing and private 5G where necessary. Edge computing will meet the demand from companies for Industry 4.0 technologies or more seamless, immersive, and personalised customer experiences. Can hyperscalers overcome the challenges of providing edge computing outside the South East? The way forward for the hyperscalers now must be to expand their partnerships beyond the telcos and link up with high-speed, high-bandwidth edge data centres in the UK's regions. This is how they can provide edge computing to their thousands of commercial customers outside the South East. A list of one of the global cloud providers' edge locations, for example, shows that all seven of its centres, and its cloud content delivery network (CDN), are concentrated in London. Facebook's analytics data, derived from its servers in the UK, shows 186 miles equates to a millisecond of latency. This is a problem when gaming, automated trading applications, AR, VR and industrial IoT implementations require latency as low as 1ms to 10ms.
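To see how the 186-miles-per-millisecond figure quoted above translates into round-trip times, the rough sketch below estimates the distance-related latency for a few UK routes; the distances are approximate assumptions used purely for illustration, and the round-trip figure simply doubles the one-way estimate.

```python
# Back-of-the-envelope sketch using the figure quoted above:
# roughly 186 miles of distance equates to about 1 ms of latency.
MILES_PER_MS = 186

def added_latency_ms(distance_miles: float, round_trip: bool = False) -> float:
    """Estimate the latency (ms) attributable to distance alone."""
    one_way = distance_miles / MILES_PER_MS
    return 2 * one_way if round_trip else one_way

# Approximate straight-line distances from London (illustrative assumptions).
routes = {
    "London to Manchester": 160,
    "London to Edinburgh": 330,
    "London to Belfast": 320,
}

for route, miles in routes.items():
    print(f"{route}: ~{added_latency_ms(miles):.1f} ms one way, "
          f"~{added_latency_ms(miles, round_trip=True):.1f} ms round trip")
```

Even on these rough numbers, distance alone can consume most of a 1ms to 10ms latency budget before any processing time is counted, which is why regional points of presence matter.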
Gartner analysts found common business collaboration solutions show reduced performance at latency above 25ms. Unfortunately, the popularity of the hyperscalers also causes access bottlenecks resulting in data-congestion, which is only likely to worsen with the growing adoption of data-intensive applications.  If the major cloud providers do not take steps to resolve these latency problems, they will lose out, as businesses seek alternative vendors with edge capabilities. The big cloud vendors need more regional points-of-presence The big cloud vendors need new edge partnerships so they can quickly establish points-of-presence in data centres that are strategically sited around the country. Since success depends on bringing data and processing as close to customers as possible, they should select edge platform providers that have invested in locations that provide geographically ubiquitous coverage of the UK. Then they can provide low latency to customers and end-users as 5G or fibre-to-the-premises infrastructure expands. The more advanced edge platforms are already capable of achieving sub 5ms latency, regardless of location.  Hyperscalers should also ensure their edge computing platforms are linked by high-speed fibre networks, with full route diversity and fast, secure on-ramps to their services. The edge networking approach employed by such platforms spreads the load across several regional data centres, meaning data is processed closer to each end-user, reducing the congestion that undermines performance. Hybrid and multi-cloud-ready Forging these new partnerships makes sound commercial sense because existing edge platforms are ideally situated between the on-ramp to the public cloud and network-to-network interfaces. They are also well-equipped to facilitate the trend for hybrid and multi-cloud models which many enterprise customers prefer. It is an approach the major cloud vendors have themselves already started to address with dedicated tools, recognising customers want flexible access to the cost, performance and security advantages of different clouds while maintaining some workloads in on-premises data centres.  The better-prepared edge platforms also give hyperscalers an easy route into the world of software-defined access and its potential for business growth. In many cases, edge data centres are already implementing software-defined networking (SDN) to increase efficiency and performance. SDN on-ramps provide fast and secure access to multi-cloud ecosystems through scalable private networks. This means that essentially, anyone with an SD-WAN solution can access the network. They can make their way around UK locations and find a branch connection to the cloud. A SaaS provider can offer solutions and benefits to clients, irrespective of location, in a way that simply would not have been possible before.  The bottom line is that through edge data centres, businesses can easily employ hyperscale cloud vendors to power their growth, their entry into new markets and delivery of new services to almost anywhere on the planet. At the same time, they can fully benefit from global advances in analytics, AI, machine learning and business application innovation. Businesses will use edge data centres for the data their applications need to process at low latency, while fast and secure connectivity continues to provide them with resilient access to the wealth of data and services in the public cloud. 
Edge partnerships must be the future for the major cloud providers By increasing their presence in the regions through partnerships with purpose-built edge platform providers, hyperscalers will maximise their impact on the UK edge market. They can avail themselves of a vast range of opportunities and the explosion in SaaS applications. This will be vital as businesses shake off the pandemic, demanding faster, more efficient data processing and computing at high capacity and very low latency. Only the edge will achieve this, providing significant revenues for those who get it right. By enlarging their presence in all regions, the hyperscalers will become the pulsating hub of this major evolution of infrastructure, driving new business models and creating new revenues. ### Attacks are inevitable, but criminal success is not When a cybercriminal strikes, we're quick to believe that they're a highly skilled and sophisticated threat actor with an extensive toolbox of weapons to hack through a network perimeter. But this often isn't the case. On more occasions than perhaps we'd like to admit, sensitive information is left out in the open, just waiting to be discovered. Cybercriminals often target the low-hanging fruit, and so by leaving databases exposed and neglecting basic cyber hygiene practices, businesses are making it all too easy for attackers to access their crown jewels. The biggest issue today is that organisations often don't realise that they've left sensitive data exposed, and therefore believe they're completely secure. Given how much data is stored within one business, it's become extremely difficult to monitor every single movement. The risks of losing track of data There are several different ways that businesses can unintentionally leave data vulnerable to cyber theft. These include exposed databases, forgotten databases, and third-party weaknesses. Over time we've noticed that a major cause of exposed databases is human negligence, whether because of skill shortages, overwhelming workloads or lack of visibility. To keep databases secure, teams must stay on top of patching, although this can be complicated and time-consuming. Additionally, if open API access is misconfigured then all efforts will go to waste and the database will be left exposed anyway. One wrong move could result in devastating consequences. As we've established, it can be hard enough trying to secure the databases businesses know about – so what about those that they don't? Without sufficient visibility over all existing databases – whether they're still in use or not – businesses will never be able to guarantee complete security. When new databases are created, the old ones are meant to be removed, but this often doesn't happen. The fast-paced nature of business means teams get distracted and tasks fall off their lists. These forgotten databases remain in the system, often unprotected, some still containing sensitive information, just waiting to be harvested by criminals. Finally, data is often shared with third-party companies, and once the information leaves the perimeter, the original company loses all control. In fact, IBM research reveals that around 60 per cent of businesses have suffered a major data breach because of a third party. By sharing information with another company, teams are essentially putting their trust in someone else's security measures.
Extraction to exploitation Once a vulnerable database is located, it doesn't take much for criminals to break in and harvest valuable information. Simple methods, such as using stolen credentials like emails and usernames and other forms of personally identifiable information (PII), will grant criminals access in no time at all. From here, threat actors can do whatever they please with the data. It can be sold on the dark web, used for further data exploitation, or held to ransom. The cost of a data breach is climbing, with IBM's research revealing a 10 per cent rise from 2019 and the average cost now reaching $4.24 million. This shows that the initial damage caused by a breach is just the tip of the iceberg – the secondary effects of a successful attack are monumental. The domino effect While some may assume that it takes a while to locate historical databases that have been out of action for years, unfortunately, that's not the case. Exposed databases are ideal for a threat actor, and so they're constantly on the lookout for a quick win. Looking beyond the initial fear of losing sensitive data, there are several other consequences that can soon follow if an attacker is successful. Businesses risk severe reputational damage and loss of customer and partner trust in how the company protects their data. Rising levels of suspicion and doubt can impact business in the future. On a more basic level, however, once an attacker gains access to the network they will endeavour to keep their foothold so they can breach more databases. No part of the system will be safe. It can also be hard to tell which areas of the network are infected. Even if the initial point of entry is discovered, criminals can navigate undetected, causing major damage before they are finally discovered. The more databases that attackers are permitted to access, the more it will cost the organisation to recover lost assets and re-establish the company's security posture. Securing the future An effective security strategy must be built on strong foundations – which starts with getting the basics right. As patching is a crucial element of securing databases, organisations must ensure the necessary training is provided to avoid human error, especially if there is a skills shortage. Additionally, IP scanning solutions can help identify existing data leaks and which databases need priority action. Turning attention to the attackers, businesses should make it as difficult as possible for them to break into databases. Criminals are less likely to spend time trying to bypass the perimeter if doing so means they are likely to get caught before they succeed. Digital risk solutions are available to disrupt their kill chain by blocking the footholds that attackers rely on. Organisations will be able to uncover existing exposures and correct any weaknesses within databases before any damage is done. This increased visibility is vital for maintaining and strengthening defences and keeping attackers out of all databases. Any database could become a liability if managed incorrectly. To stay ahead of attackers and avoid the frighteningly high cost of data breaches, businesses must maintain sufficient visibility over all databases, whether they're still in use or not. It's important to remember that while breach attempts are inevitable, criminal success is not. ### Fighting the Ransomware That Seeks Out Your Data When 30 countries get together to discuss a problem, you know it's serious.
Last October, the US hosted a multi-country meeting to discuss an online scourge: ransomware. The problem is getting so bad that it isn't just costing companies millions each year - it's also threatening critical national infrastructure. So what is it, why has it become such a problem, and what do we do about it next? Ransomware has become a headline issue in the last few years, but it existed long before that. Consumer-targeted malware froze victims' machines and demanded payment twenty years ago. The problem for criminals was getting money from the victims' accounts to theirs. Then, along came cryptocurrency. Ransomware criminals traditionally had to try to convince victims to buy gift cards or make payments via money transfer services like Western Union. Starting with bitcoin in 2009, cryptocurrency offered a fast, friction-free payment method that exploded over the next decade, with more digital currencies appearing almost weekly. Some, like Monero, were specifically geared to be as anonymous as possible. This gave criminals a perfect payment channel, which paved the way for more professional attacks. The first ransomware strains were often poorly coded, enabling victims to share encryption keys and recover their own data. Later strains tightened up encryption and also used more sophisticated techniques. Ransomware began deleting shadow volume files, which are backups of files created locally by Windows machines and a critical tool for restoring files on the local system. Ryuk has been spotted automatically crawling and deleting any shadow volumes or other backup files that it finds, using simple scripts. Locky, WannaCry, and Cryptolocker all target shadow volumes. Most ransomware variants will also crawl networks looking for shared volumes, meaning that backing up to a network drive won't protect you. Introducing human hackers In the last few years, ransomware evolved again. It became more like a business. The criminal community behind it separated into different groups that operated on an affiliate model called ransomware as a service. The ransomware authors license their malicious software to others who find victims to infect. They then pay the authors a fee. Ransomware groups began sourcing large volumes of vulnerable attack vectors on the dark web. These included not only stolen login credentials but also vulnerable remote desktop protocol (RDP) ports that they could use to infect endpoints with ransomware. They then automated attacks, hitting vulnerable points using bots to see which networks they could gain a hold in. Once they infect a vulnerable network, many of today's ransomware attackers do far more than just let the software run. Instead, they spend time manually picking their way through a victim's network themselves, finding more machines to infect. This lateral movement enables them to find the victim's most valuable resources. They often use everyday administrative tools that already exist on the target's network, like PowerShell and Windows Management Instrumentation, to avoid raising suspicion. This process is called 'living off the land'. This more manual technique allows ransomware thieves to do more than encrypt data. Today, they're stealing it too. That way, if a company is able to recover its data from a backup, they can still try to extort money by threatening to publish the information. The result? Ransomware has evolved from a time bomb to a smart missile, seeking out the most valuable information in your organisation.
But it doesn't stop at one data cache; it finds all the targets it can, maximising its blast radius. Those attackers don't stop at primary data. They'll do their best to access a victim's backups too. This is often relatively easy, as some backup files have headers containing detailed information about their contents. Those backups are often an easy way to collect large amounts of sensitive data in a single raid. Cloud backups are even better, because ransomware thieves that gain access to those accounts can often steal the backups without triggering any alerts on the victim's internal network. They can then pursue sensitive data at their leisure. Criminals that find those backups can delete them before detonating their ransomware, which stops victims from restoring data from them. The alternative is not to delete the backups at all, but instead to leave the ransomware lying dormant for weeks on the network. The ransomware files will then get backed up along with everything else. After it eventually detonates, the victim might restore the files only to find themselves infected again immediately. The problem is getting worse How can companies protect themselves against these ransomware attacks? Basic cybersecurity hygiene measures apply. Training end-users to watch for phishing attacks and scanning incoming emails and outgoing web sessions are all good lines of defence. Using multi-factor authentication for online accounts will help stop ransomware thieves from hacking accounts, while switching off unused RDP ports will close down attack surfaces, as will regularly patching software. Beyond that, though, companies need security solutions built for ransomware - especially with the rise of Ransomware-as-a-Service. Ransomware-as-a-Service is exactly what it sounds like, and it is becoming a significant threat as more and more threat actors turn to it, which makes your solutions for recovery and cyber resilience even more crucial. The world isn't what it used to be, that's for certain. No longer do you only need to fear those with the capability to build a ransomware attack themselves; you now have to make sure you're protected from every angle, because RaaS has enabled those with little knowledge and know-how to unleash attacks at their leisure. The kits are easy to access on the dark web, which ultimately means more attacks, and attackers will usually utilise a "spray" tactic, hoping something lands. What does this mean for you? It means your defences and backups have never been more important. The ransomware scourge isn't going away. It's going to get worse, and more companies are going to get hit. As we've seen, relying on traditional backups is cumbersome and unreliable. When online crooks come calling at your organisation, will you be ready? ### From 32 to 64-bit: Ensuring a smooth transition The arrival of Windows 11 came with some surprises for users, requiring much clarification from Microsoft in the days following its initial reveal. For many, the news that the upgrade would only be available as a 64-bit OS signified a substantial disruption to the status quo. Organisations may well be wary about the implications of this change for the applications they currently rely on. While Microsoft is keen to communicate that 32-bit apps will continue to be supported on Windows 11, the retirement of the 32-bit edition raises questions about how consistent the functionality of these legacy apps will be.
In addition, there is no guarantee that these applications will remain secure without the necessary updates. Businesses, therefore, risk the possibility of an app being unable to provide critical services to the organisation and its customers as they transition to a new OS. When making the shift, businesses need a solution that enables them to guarantee app reliability throughout. Having key processes in place will allow organisations to plan ahead and pre-empt any incompatibility issues, thereby making the transition as painless as possible. Compatibility packaging technology Businesses can be forgiven for not celebrating every aspect of the move to Windows 11. Upgrading all devices can be an expensive and complex venture – but it is a move that is essential to avoid the security risks of operating an unsupported end-of-life system. Delaying the migration for too long may result in an even more costly reliance on Microsoft Extended Security Updates (ESUs). The priority must be to ensure 32-bit apps remain compatible with the new system. This is especially the case within public sector organisations. There are noteworthy precedents of vulnerabilities in legacy IT systems being exploited by cybercriminals: the 2017 WannaCry ransomware attack derailed the NHS, resulting in the cancellation of thousands of appointments and a repair bill running into the tens of millions. The subsequent investigation found that the attack was “relatively unsophisticated” and “could have been prevented” using “basic IT security best practice”, further underlining the need to consistently adapt to the latest systems. Organisations large and small need to protect reliable applications whilst keeping up with the pace of innovation and change. Application compatibility tools and packaging services provided by a technology partner like Cloudhouse are a means of allowing 32-bit apps to run on newer operating systems (OS). The technology works by using install capture and run-time analysis to initially package the application. Redirections are applied to the application’s files and registry entries, runtimes are isolated so that only the packaged application uses them, and packages are then deployed with existing management tools and processes. While the most important result for organisations is continued reliability, this method also avoids the need for comprehensive training or infrastructure changes. Therefore, the significant costs and resources associated with such measures are not incurred. The importance of planning ahead With the official end-of-life date for Windows 10 almost four years away (14th October 2025), some businesses may be tempted to put the transition to 64-bit Windows 11 on the back burner to address more immediate issues. This approach is unwise: features of Windows 10 are already being deprecated on a regular basis ahead of that date, with critical functions becoming unsupported or even removed entirely. It is therefore critical for businesses to plan ahead as soon as possible and make the move to Windows 11 before these changes create unnecessary problems. Windows 10 was previously described by Microsoft as the ‘last version’ of Windows; the changed messaging in the light of its journey into obsolescence illustrates how technology is constantly evolving and altering the status quo.
It, therefore, shows that businesses not only need to employ solutions that allow applications to be effectively shifted from OS to OS, but they also need technology that can flag any potential incompatibility issues as the treadmill of change continues. Businesses that are proactive in employing robust technological solutions stand to reap the rewards of moving to a new operating system. The right technology solutions allow visibility across the entire IT infrastructure, allowing them to monitor what is out of date and automatically achieve compliance through the pursuance of best practice configuration. Historical data can also give businesses insights into how a system has changed over time and how it might further evolve in the future. With change comes opportunity With the optimum processes in place, organisations can keep pace with the continuous cycle of change. They will be able to plan ahead and mitigate potential 32-bit compatibility issues as Windows 11 is introduced, lessening the risk of downtime or data loss. Furthermore, they will be well-positioned to address any further changes as Windows 11 itself and its successor become end-of-life systems in the years to come. Crucially, no server or operating system provides value for an infinite amount of time. 64-bit operating system editions are unlikely to be the very last. Therefore, businesses must capture external factors as they change and capitalise on the opportunities such developments bring. In order to remain efficient, competitive, and profitable, organisations need to keep an agile and forward-thinking mindset. The right technology allows for legacy applications to provide optimum value as the landscape around them continues to shift. During this transition and the next, businesses must invest in the optimum tools and solutions to stay ahead of the game.    ### Extract greater value through better hybrid cloud management Hybrid cloud is now the default infrastructure for most enterprises. In fact, 82 per cent now have a hybrid cloud strategy – with an average of 2.6 public and 2.7 private clouds per organisation. But a lot of this adoption has been tactical, so many lack a strategic cloud management platform. This limits the value enterprises get from the hybrid cloud and creates governance, risk and operational problems. Instead, organisations must pivot towards higher-value, more robust use of the hybrid cloud. The optimal position is to have a single-source view of cloud environments in order to simply optimise cost, standardise automation and manage security. This makes it easy to deploy workloads to the right location – public, private or non-cloud – to exploit the capabilities the workload needs, at the right price points. Good cloud management platforms enable enterprises to understand their resource needs and assign them appropriately using one tool. Monitoring resource allocation easily and transparently also allows the IT function to make better decisions in relation to cost and the distribution of resources. But this is difficult for many enterprises: Many inherit complex legacy environments which lack effective management tools, and they also struggle with the fast-paced development of cloud management tooling. There are also the limitations of either cloud-native or cloud-agnostic management platforms in delivering the full end-to-end needs of an enterprise to contend with. 
So, how can IT teams implement the most appropriate hybrid cloud management system for their needs, and how will this translate into cost savings and efficiencies? Align cloud management with infrastructure requirements  Firstly, enterprises need to understand how their infrastructure helps them drive value. Specifically, which capabilities help them deliver their core services better, and which should be simply managed for cost. This insight can then be used to design the best architecture and management platform for them. For example, an enterprise that drives value from data through analytics will pay particular attention to this as they architect micro-services spanning cloud and on-premise and select and configure the management tooling in relation to data. To qualify any cloud management software, it is important to consider what the target cloud footprint is going to look like in the future. Given the acceleration of digital transformation, the need to plan ahead has never been more vital. Important questions to consider are, “Which business services span multiple cloud platforms and why?”, “What is my operating model?”, and “How will I manage business continuity?”. Once the most suitable cloud infrastructure management software for the business is determined, the next step is to consider how to actually use it to its full potential.  Federated vs. centralised cloud management  Broadly speaking, there are two options: federated cloud management (where the cloud infrastructure management software sits on top of each platform) or a centralised cloud management approach (using one piece of cloud infrastructure management software that manages everything under the enterprise’s cloud footprint in one place). With the former, if business services span more than one cloud platform, IT teams immediately lose the complete view of their cloud infrastructure estate. On the flip side, federated cloud management is very easy to roll out because it is normally offered alongside the cloud platform, meaning there is no development or management involved.  A centralised cloud management approach does what it says on the tin – it is a platform where information from each cloud can be integrated into one place. The benefit of this is that it creates an in-depth overview of the entire cloud estate. For example, the finance team can access spending information, the security team can receive a continuous compliance report (or a security posture report), and the platform owners can see how many services are running, as well as how many are up or down.  The drawbacks are that IT teams must integrate each individual platform into the centralised cloud management software, and they will need to customise it to some extent, which can be quite time-consuming (but it can be automated). That said, many come with features that help guide IT through the customisation process. Ongoing management is also required – another drain on resources. Despite this, when a multi-cloud infrastructure is used, centralised cloud management is the strategic approach. How to avoid cloud inefficiencies Even with the most finely tuned hybrid cloud management process, no enterprise is immune to cloud value leakage. This is where the automation of hybrid cloud management comes to the fore. Starting with the ethos of automating everything from the outset is a strong choice. If an enterprise has started without that, then pivoting to that mode today is the next best option to save time and increase control for the long term.
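As a simple illustration of what centralised, automation-friendly visibility can look like, the sketch below aggregates inventory and cost information from several clouds behind one common interface. The adapter classes, resource names and figures are entirely hypothetical placeholders for whichever platform APIs or management tools an enterprise actually uses:

```python
# Hypothetical sketch of a centralised cloud-management view: each adapter
# wraps one platform's API and exposes the same interface, so cost and
# inventory can be reported from a single place. All figures are dummies.
from dataclasses import dataclass
from typing import List


@dataclass
class Resource:
    platform: str
    name: str
    monthly_cost: float
    running: bool


class CloudAdapter:
    """Common interface that each platform-specific adapter implements."""
    def list_resources(self) -> List[Resource]:
        raise NotImplementedError


class ExampleCloudAdapter(CloudAdapter):
    def __init__(self, platform: str):
        self.platform = platform

    def list_resources(self) -> List[Resource]:
        # A real adapter would call the provider's inventory and billing APIs.
        return [Resource(self.platform, "app-vm-1", 210.0, True),
                Resource(self.platform, "orphaned-disk", 35.0, False)]


def report(adapters: List[CloudAdapter]) -> None:
    resources = [r for a in adapters for r in a.list_resources()]
    total = sum(r.monthly_cost for r in resources)
    print(f"{len(resources)} resources, estimated monthly cost {total:.2f}")
    for r in resources:
        if not r.running:
            print(f"possible orphaned resource: {r.platform}/{r.name}")


if __name__ == "__main__":
    report([ExampleCloudAdapter("public-cloud-a"),
            ExampleCloudAdapter("private-cloud-b")])
```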
By driving automation to the next level, enterprises can reach top-quartile efficiency. It also allows them to manage their infrastructure uniformly and securely across multiple cloud platforms and improve application portability between platforms. Similarly, having a clear understanding of which hybrid cloud platforms should be used for what purpose helps to avoid inefficiencies in application placement and operating costs. Defining a strategy on how each cloud platform is to be used and consistently managed against those rules is key. Finally, having a clear view of cloud-native cloud infrastructure management software can save time and money, but often cannot alone give a complete view of the cloud estate – a single source of truth which is crucial for optimising the management process. Working out whether this is appropriate for the organisation, based on how hybrid the target architecture is, enables a good strategic choice to be made. Moving from poorly managed, to well managed For many enterprises, the big challenge is how to take today’s imperfect solution and improve it. Having established what the optimised target architecture should look like, the question is then how to pivot towards it. Firstly, this requires investment. Even outside the management tooling, the cloud requires constant investment – or “lifecycle management” – to maintain modern, cost-optimised architecture. If well managed, this investment should be consistent and deliver a return in terms of reduced like-for-like run costs. Secondly, it also requires time. It may take two to three years for a reasonably sized estate to be aligned with a strategic cloud management approach. Finally, it should be automation-led – prioritising automation both reduces delivery costs and risks but also sets the right culture on the programme. Don’t miss out on the value opportunity  Hybrid cloud has numerous benefits for the enterprise, but it’s important that IT teams give considerable thought to how the technology is going to be used in the long term and to determine the best management software for their specific needs. Not getting this right at the outset has the potential to erode value, create cost inefficiencies and drive up the cost of change. Furthermore, there is a chance that the opportunity will be missed to extract the most value from the good, focussed use of the cloud. To exploit cloud capabilities and maintain a continuous, high level of governance, organisations need to tailor how they govern, monitor and manage their cloud services.  ### On-premise vs. Cloud: The Implications On Data Security When a business is considering which Cloud infrastructure is the best fit for their company, there are a number of factors that they need to consider. Over the past decade, Cloud computing has significantly grown in popularity, offering organisations newfound flexibility and scalability. On-premise software, however, has been tried and tested by businesses and may continue to adequately meet their technical needs. In today’s digitally evolving marketplace, IT leaders are now considering the move to emerging Cloud applications to better achieve their business goals in a cost-effective manner.   One of the biggest elements that companies need to evaluate is on-premise versus Cloud security. It’s clear that data security is becoming even more prevalent for enterprises as cyberattacks are on the rise. 
With on-premise, the business’ servers and data are physically located in the office, and IT executives can use backup and disaster recovery software to extract data during a cybersecurity threat. It’s important to remember that the management and maintenance of the network is the sole responsibility of the business. On the other hand, with Cloud-based security, a third party company hosts the organisation’s servers and data in a data centre and can provide additional support to manage the network.  Peace of mind with on-premise  With on-premise solutions, businesses can benefit from end-to-end customisation across the entire network, including servers and data in the organisation. Not only does this ensure the solution fits their specific needs, but it also gives IT leaders complete peace of mind that the security measures are shaped to their purpose. On-premise is also ideal for companies that operate in legal, healthcare or financial industries, and need to follow stricter cybersecurity policies. These businesses may feel more comfortable hosting their data onsite, rather than across the country in a data centre.  What’s more, on-premise solutions benefit companies that already have an internal IT team. Whilst, initial investments for software and hardware may seem daunting, having a dedicated team of staff who can manage the infrastructure will ensure the business sees a return on investment. This team will be responsible for keeping the on-premise security running seamlessly which means the data is both secure and close by, as well as reducing the need to outsource support for a costly fee.  Improved flexibility and scalability with the Cloud  Although on-premise may suit some businesses, others are considering making the move to the Cloud for improved security quality. With the Cloud, a company’s data is not secured physically at their office for cyberhackers to take advantage of. This can reduce the risk of security breaches or threats onsite. In addition, with Cloud-based security, data centres have bolstered security features and a dedicated team of employees who can protect businesses data. Furthermore, Cloud systems will learn about an organisation’s network over time, ensuring that the solution grows with the company and can become more secure than on-premise.  With Cloud storage, an organisation’s data can be housed in a data centre for an unlimited amount of time. This is especially important in today’s fast-paced environment as more and more jobs move to digital, so having the enterprises’ data online already can streamline business processes. Scalability is also one of the biggest advantages of Cloud computing. Data centres can quickly and easily evolve their resources to meet business needs or demands. This may include company growth or expansion across the globe, as well as the move to remote working in the wake of the COVID-19 pandemic.  To further bolster security, Cloud solutions back up a business’ data in multiple places on a regular basis. Downtime can have a staggering impact on an organisation, and Cloud applications can keep this to a minimum with data restored from a backup faster than an on-premise situation. Finally, large corporations must abide by a growing number of compliance regulations. Cloud-based security can help these types of businesses to maintain the required infrastructures and ensure current and future regulatory measures are met.  
Introducing hybrid Cloud solutions There is no one-size-fits-all solution when it comes to considering on-premise or Cloud security. Whilst both applications can be customised to fit the needs of a business, some organisations are selecting hybrid options that boast the benefits of both on-premise and cloud solutions. A hybrid cloud is an environment that combines an on-premise data centre with the public Cloud, allowing data and applications to be shared between them. Businesses whose data grows beyond the capabilities of their data centre can use the Cloud to instantly scale capacity up or down to handle the demand. It also mitigates the time and cost of purchasing, installing and maintaining servers onsite that they may not need in the future. ### A Guide to Solving Damaged SQL Server Issues Many people work with SQL Server every day without being production DBAs. A point may therefore come when you must consult professionals for performance troubleshooting and projects, not least because data security has become a crucial part of everyday life. That is no bad thing, as it is a valuable way of learning new techniques and concepts related to SQL Server. If you want to sharpen your skills, you have to keep building on them as part of your current role.  Over the years, various issues have cropped up that have made computer users consider SQL Server in detail. There is no point avoiding these issues when a fix is usually within reach. For senior administrators, the benefits, costs, and risks of the related application design are critical considerations. You cannot avoid these glitches because they will affect data security in the long run. Before delving into the solutions to SQL Server issues, people must understand the common problems in detail.  Indexes One well-established problem area in SQL Server is indexing. That is not because SQL Server indexes badly - these days it performs indexing very well indeed - but because it is so easy for users to make mistakes when indexing. Wrong indexes, missing indexes, too many indexes, too few indexes, and outdated statistics are common issues that users face every day.  Although the area covers a lot of ground, the truth is that regular maintenance and a little caution will take care of most of these problems and help them disappear. Keep in mind that you must stay alert and perform the indexing task with precision as an end-user. If you feel the process is too complex for you, you may take the help of professionals who know everything about queries, indexes, table statistics, and proper design.  Wrong design decisions Good database performance starts with robust database design. Problems such as choosing the wrong database platform, failing to archive information, misusing relational features such as nested views, or defining no keys at all cannot be taken lightly. You might even have Excel spreadsheets compiled with PowerShell running on a cluster node with a terabyte of RAM and flash drives. It is challenging to alter existing systems once they have been deployed to production, so poor design choices can linger for a long time and create a poor impression. Making sound decisions about design and overall layout up front is therefore your responsibility.  Bad code 'Poor code' is a subjective term, and every user has their own definition.
Improper code covers unnecessary cursors, user-defined functions, incorrect clauses, and much more. Combined with wrong design, destructive code leads to concurrency problems that result in locking, blocking, and deadlocks. Since poor code and poor design combine in many ways, the importance of examining the database is well established. In such a scenario, only a competent individual with years of experience and expertise in this field can help you solve the issue.  Object-relational mapping ORM, or object-relational mapping, is a class of tool used all across the globe. These code-first generators generally work well; however, improper use of these tools results in bad performance and poor use of resources. ORM-related problems crop up frequently and are not easy to identify - they behave like a fugitive - but ORM does leave its fingerprints at the scene, so a careful examination will reveal the underlying cause. Plenty of blog entries document the performance issues associated with ORM, and some tools will help you summarise an ORM deployment, which you can use to your advantage.  Default configuration Installing SQL Server is an easy task that requires no understanding of the default configuration options, and the same applies to virtualised SQL Server instances. The trouble is that the default options you are offered are sometimes not kind to SQL Server. Tempdb configuration, MAXDOP, and default file growth are examples of settings worth reviewing before switching the server on. When similar issues keep cropping up, you should also build firewalls inside and outside SQL Server to help ensure decent performance. An appropriate understanding of SQL Server configuration is necessary if you want to get the best from the tool.  These are some of the issues you may discover while working with a damaged SQL Server. Remember that the Internet is full of hackers and fraudsters who constantly seek access to your systems, which creates both security and performance trouble. By reviewing the design, backing up current data, and creating firewalls, you can deal with these issues and ensure the security of your data.  Easy ways of dealing with SQL Server issues If you want a worthwhile experience with SQL Server, you must back up your data regularly. Moreover, you must understand the underlying causes of performance issues and the best ways of avoiding them; good performance will then follow.  •    Check the wait statistics: Checking wait statistics is the first step in dealing with an issue. SQL Server continually tracks what its execution threads are waiting on, and this information is critical when trying to pinpoint the cause of performance problems. Discovering the reason is easy, but interpreting the information and solving the issue requires time and technical knowledge, so it helps to have a professional who knows the environment and can pinpoint the underlying cause.  •    Run index maintenance: Running index maintenance is another reliable way of dealing with the workload. Frequent data changes and an increasing workload often hamper the system, so check index fragmentation regularly and act whenever it reaches 30% or more. Scheduling weekly index maintenance also helps keep statistics up to date.  These are some simple ways of ensuring that indexes work properly and that SQL Server keeps functioning; a brief sketch of both checks follows below.
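As a minimal sketch of the two checks above - assuming a SQL Server instance reachable over ODBC, with the connection string, driver name and 30% threshold as placeholders rather than recommendations - the script below reads the built-in wait-statistics and index-fragmentation DMVs so the results can be reviewed or fed into a weekly maintenance job:

```python
# Minimal sketch: read SQL Server wait statistics and index fragmentation
# via the built-in DMVs. Connection details and thresholds are placeholders;
# requires the third-party pyodbc package and an ODBC driver for SQL Server.
import pyodbc

CONN_STR = ("DRIVER={ODBC Driver 17 for SQL Server};"
            "SERVER=myserver;DATABASE=mydb;Trusted_Connection=yes")

TOP_WAITS_SQL = """
SELECT TOP (10) wait_type, wait_time_ms, waiting_tasks_count
FROM sys.dm_os_wait_stats
ORDER BY wait_time_ms DESC;
"""

FRAGMENTATION_SQL = """
SELECT OBJECT_NAME(ips.object_id) AS table_name,
       i.name AS index_name,
       ips.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(DB_ID(), NULL, NULL, NULL, 'LIMITED') AS ips
JOIN sys.indexes AS i
  ON i.object_id = ips.object_id AND i.index_id = ips.index_id
WHERE ips.avg_fragmentation_in_percent >= 30;  -- candidates for maintenance
"""

with pyodbc.connect(CONN_STR) as conn:
    cur = conn.cursor()
    print("Top waits:")
    for wait_type, wait_ms, tasks in cur.execute(TOP_WAITS_SQL):
        print(f"  {wait_type}: {wait_ms} ms over {tasks} waiting tasks")
    print("Fragmented indexes (>= 30%):")
    for table, index, frag in cur.execute(FRAGMENTATION_SQL):
        print(f"  {table}.{index}: {frag:.1f}%")
```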
If you are still not sure how you will manage these issues, you may contact professionals to help you out.  ### Head In The Clouds: The Importance of Cloud Data Encryption For several years, the rate of cloud adoption has been steadily growing as businesses of all sizes realise its numerous cost and productivity benefits. The emergence of COVID-19 and the ongoing disruption to traditional office-orientated business models, however, have driven adoption levels to unprecedented heights. In fact, Microsoft CEO Satya Nadella recently claimed that his company saw two years of digital transformation in just two months at the start of the pandemic, with cloud adoption playing a huge role in that.   With cloud adoption now on a steep upward trajectory, businesses are increasingly focused on its security, and what they must do to protect their sensitive data once it leaves the safety of on-premises storage servers. One of the best solutions currently available -- cloud encryption -- warrants closer examination. This article will explore the benefits of cloud encryption in more detail, as well as some of the challenges associated with it, before laying out some cloud encryption best practices to follow to ensure data protection at all times.  What is cloud encryption? Cloud encryption is an umbrella term describing the range of different solutions that encrypt data before it’s transferred to the cloud for storage. Many cloud storage providers offer some form of cloud encryption as part of their services, with applications ranging from encrypted connections to limited encryption of only sensitive data, to full end-to-end encryption of anything/everything that’s uploaded. In all these models, the cloud storage provider will encrypt the data upon receipt and then share the encryption keys with the customers, allowing them to securely decrypt it whenever required.  What are its key benefits? Encryption is widely seen as one of the most secure approaches to cloud data protection because it scrambles the contents of any file, system or database in a way that makes it almost impossible to decode without the correct decryption key. As such, by applying robust encryption to their sensitive data and maintaining good decryption key management practices, businesses can fully safeguard sensitive data even after it leaves the physical premises.  Should any encrypted sensitive data get stolen, copied, or lost, anyone trying to access it will find it completely unreadable without the relevant decryption key, stopping them firmly in their tracks. This is particularly beneficial for data stored in the cloud because it means that even in the event of an account, system, or even provider being compromised, everything remains fully protected. Cloud encryption is also extremely useful for any organisation that operates in regulated sectors/industries such as financial services. This is because encryption, when combined with a range of other security measures, allows businesses to enjoy the benefits of the cloud while also meeting the strict compliance requirements for things like the Payment Card Industry Data Security Standard (PCI-DSS) and the General Data Protection Regulation (GDPR).   What are its primary challenges? Uptake is perhaps the biggest challenge currently associated with cloud encryption. Despite its undoubted effectiveness, cloud encryption still remains somewhat under-adopted. 
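To make the encrypt-before-upload model described above concrete, the sketch below encrypts data locally before anything is handed to an upload routine. The key handling and the upload step are deliberately simplified placeholders - a real deployment would use a proper key management service and the provider's storage API - so treat it as an illustration of the principle rather than a complete solution:

```python
# Illustrative sketch of client-side encryption before cloud upload.
# Uses the third-party 'cryptography' package; key storage and the
# upload_to_cloud() call are simplified placeholders.
from cryptography.fernet import Fernet


def encrypt_bytes(data: bytes, key: bytes) -> bytes:
    # Fernet provides authenticated symmetric encryption.
    return Fernet(key).encrypt(data)


def decrypt_bytes(token: bytes, key: bytes) -> bytes:
    return Fernet(key).decrypt(token)


def upload_to_cloud(name: str, blob: bytes) -> None:
    # Placeholder: in practice this would call the provider's storage API.
    print(f"uploading {name} ({len(blob)} bytes of ciphertext)")


if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice: store in a key management system
    ciphertext = encrypt_bytes(b"example sensitive record", key)
    upload_to_cloud("record.enc", ciphertext)
    # Only the holder of the key can read the data back.
    assert decrypt_bytes(ciphertext, key) == b"example sensitive record"
```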
Fortunately, this is changing as more organisations ask their cloud providers to deliver better levels of security and regulatory compliance. Even with this progress, however, uptake remains below the level many security experts expect.  Why is this? A big reason is cost. Encryption can be expensive for cloud storage providers because of the additional bandwidth required to encrypt data before moving it to the cloud. Many providers look to pass this cost onto customers, making it unattractive. As a result, many providers only offer limited (and therefore, cheaper) cloud encryption services, while cloud customers often prefer to simply encrypt their data on-premises before moving it to the cloud. Some cloud customers will choose this approach regardless because it saves costs and it can be more convenient to keep the entire encryption process in their environment. Tips to ensure cloud encryption best practice For organisations looking to increase their cloud encryption activity, there are a few things to consider to maximise effectiveness. Choose your cloud provider wisely: When selecting a cloud storage provider, it’s important to start by identifying all sensitive data that will be moved to the cloud and mapping out the security needs associated with it in advance. Pinpoint everything that needs protecting and only go with a provider that offers a suitable level of encryption to meet these needs. For example, a media and production business using cloud storage for video footage may only need encryption for specific account credentials -- not for any of the material being uploaded to the cloud. Conversely, a software development company using cloud storage services to share IP and source code would need a much higher level of security, so is better suited to a provider offering end-to-end encryption. Look after your keys: Effective management of encryption keys (both those owned by a business and those provided by cloud providers) is critical. All keys should be kept well away from the data they are associated with, with any backups stored offsite and audited regularly. Keys should also be refreshed on a regular basis, particularly if they expire automatically after a set amount of time.  Some organisations choose to encrypt keys themselves but doing so can create unnecessary layers of complexity. Another best practice for key management is to implement multi-factor authentication for both the master and recovery keys, for added peace of mind. With cloud adoption currently skyrocketing because of the pandemic, there’s a renewed focus on ensuring any sensitive data being transferred and stored is truly secure. One of the most effective ways to achieve this is using cloud encryption, and there’s a growing range of providers and applications out there to help businesses achieve the level of security they need and desire. While not without its challenges, a well-planned, well-implemented cloud encryption solution can help businesses enjoy all the benefits of the cloud, while ensuring highly sensitive data is kept away from prying eyes and hands. ### Disruptive Seasons Autumn 2021 - Vera Loftis - Solution Junkies https://vimeo.com/619887016 Solution Junkies' brought a team down to speak at Seasons! An excellent keynote on a fantastic looking stage! ### Disruptive Seasons Autumn 2021 - Sri Ambati - H2O.ai https://vimeo.com/619888231 Check out this brilliant virtual presentation from Sri Ambati! ### Mixing Without Muddle: Perfecting the Hybrid Cloud Cocktail The dynamics of business have changed. 
Not just as a result of the global pandemic, although it has served to amplify and accelerate many of the business functions and supply chains that make the world go around, but also due to the wider impact of cloud, connected commerce and mobile device ubiquity. Top-down is now bottom-up There has been a marked reversal of momentum in many aspects of the technologies we are using. Organisations used to use top-down development practices as the primary rationale for the platforms they adopted and the applications and services they developed. Users largely took what they were given, only opting to complain if they really felt like a fight. As a blanket IT development policy, that practice no longer works. Users have become empowered and have been able to put the power of choice into their own hands. If flexibility and functionality aren’t up to standard, users will move on, so enterprise IT now has to accept the reality of bottom-up user-driven development. At the same time, IT teams around the world are transforming datacentre operations (spanning people and processes) and shifting infrastructure investments towards highly automated solutions to now focus on business-centric (rather than infrastructure-centric) decisions.  This is the point of flux at which organisations will need to assess which applications, which workloads and which data repositories they put where across a hybrid cloud landscape now spanning both public and private on-premises resources. It can feel like a mammoth task, so what aspects of IT operations should the business be thinking about? Calculating workload lifecycle Most organisations’ C-suite teams will know they have some data and understand that they operate a database to serve it. Equally, they will be aware of a certain amount of the organisation’s application base, some of which they will inevitably use themselves in the normal course of business. What these upper C-suite managers are unlikely to know is the rather more granular detail that lies under the surface of these data resources and applications. With decisions surrounding cloud migration responsibility now coming to the fore, a business will need to examine the stage and nature of an application and data workload's life cycle and the value to be attained by moving each workload to the cloud.  There is no single model for a workload lifecycle calculation, but if there were, it would be a calculation based upon a piece of technology’s mission criticality, its place in the business value chain, its number of touchpoints with users, partners and customers, plus its fit with the governance and compliance responsibilities that the company has to adhere to. If the above factors stem from business variables as validation points, then there will also be a corresponding set of IT operations variables that will help ‘describe’ an application’s workload lifecycle status. These factors would include (but are not limited to) its use of memory, the number of data analytics calls it makes, its Input/Output (I/O) requirements and the amount of processing power it requires. Retain or retire, not rip & replace Naturally, some application workloads will not be easily migrated to the cloud in an economically viable manner. These workloads should remain on existing platforms or be migrated to an updated traditional infrastructure, which can now be delivered with on-premises cloud deployments.
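One way to make that retain, migrate or retire assessment less subjective is to score each workload against the business and IT operations variables listed above. The attributes, weights and thresholds below are entirely hypothetical placeholders - every organisation would substitute its own - but the shape of the calculation is the point:

```python
# Hypothetical workload-lifecycle scoring sketch. Attributes, weights and
# thresholds are illustrative placeholders, not a recommended model.
from dataclasses import dataclass


@dataclass
class Workload:
    name: str
    mission_criticality: int  # 1 (low) .. 5 (high)
    user_touchpoints: int     # 1 .. 5, users/partners/customers touched
    compliance_fit: int       # 1 .. 5, how cleanly cloud hosting meets obligations
    migration_effort: int     # 1 (easy) .. 5 (hard), from memory/IO/processing profile


WEIGHTS = {"mission_criticality": 0.35, "user_touchpoints": 0.25,
           "compliance_fit": 0.25, "migration_effort": -0.15}


def cloud_value_score(w: Workload) -> float:
    """Weighted score; higher suggests more value in migrating to cloud."""
    return (WEIGHTS["mission_criticality"] * w.mission_criticality
            + WEIGHTS["user_touchpoints"] * w.user_touchpoints
            + WEIGHTS["compliance_fit"] * w.compliance_fit
            + WEIGHTS["migration_effort"] * w.migration_effort)


def recommendation(w: Workload) -> str:
    score = cloud_value_score(w)
    if score >= 3.0:
        return "migrate to cloud"
    if score >= 1.5:
        return "retain on existing or on-premises cloud platforms"
    return "candidate for retirement or replacement"


if __name__ == "__main__":
    for wl in [Workload("order-api", 5, 5, 4, 2),
               Workload("legacy-reporting", 2, 1, 2, 5)]:
        print(wl.name, round(cloud_value_score(wl), 2), recommendation(wl))
```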
Other application workloads will be at a stage of their lifecycle where the most prudent course of action is to retire them. The case for securely managed retirement may be made because the original vendor has discontinued support for cloud deployment, or, more likely, because superior cloud-based alternatives exist. Whether a workload is retired or replaced with a public, private or hybrid cloud alternative, organisations must make all decisions methodically and carefully only after a full evaluation process has been carried out. Any idea of a rip and replace process should be avoided at all costs. Are we there yet? Once all those thoughts are processed and the business has further evaluated the privacy and security factors needed to assess which workloads need to stay on-premises, which are suitable for public cloud and which need to straddle the hybrid world in between… can we ask whether we are there yet? The answer, in the always-on and essentially dynamic world of cloud, is often ‘yes, but there is more to think about’. Other key factors will include cloud skill set availability in the existing workforce. Organisations will need cloud-migrated and cloud-native skills not just to bring application and data workloads to the cloud, but also to build new cloud services and operate the total cloud estate once instances are ‘spun up’ and in motion. The level of cloud skills an organisation has access to will organically determine the degree to which it can deploy, leverage and expand upon all types of cloud resources, particularly public cloud. For organisations with limited cloud skills, it may make sense to start by using cloud platforms as a form of data protection target for traditional infrastructure. The path to cloud migration The actual process of cloud migration may be a combined mix of the four Rs: rehosting, refactoring, rearchitecting or replacing, which is another story in and of itself. What we can say is that an organisation in any business vertical that takes stock of these core factors is on the road to taking advantage of cloud computing technologies at the right time, in the right workflows, at the right cadence and at the right price point.  Cloud migration can be something of a cocktail given the diversity of ingredients (workloads), the range of shakers and blenders (migration vehicles and techniques), the number of serving vessels and glasses (workload endpoints and user interfaces) and the additional promise of a cherry on top or a twist of lime (cloud accelerators, visualisation dashboards and so on) if required. Cloud is a cocktail, but take a considered approach and you can avoid a muddle. ### Disruptive Seasons Autumn 2021 - Darshna Kamani - Barracuda https://vimeo.com/619886801 Darshna Kamani from Barracuda put on an excellent keynote. Watch it now! ### Breaking The Decision Deadlock: Positioning Your Business to Thrive Amid The Sunsetting of 2G and 3G Networks Decision Deadlock  The IoT market tends to look to the future, with a focus on innovations and new solutions. However, we don’t believe this means the industry should lose sight of the here and now. Whilst 4G and 5G networks are being deployed globally, 2G and 3G networks are simultaneously being turned off to make way for their successors. Although each new generation of technology seems like a step towards the future, vast amounts of IoT technology depend on 2G networks.
The prospect of networks sunsetting such services is unsettling for customers with products that rely on 2G, especially as 4G equipment is considerably more expensive. This predicament means that both IoT suppliers and buyers are hitting a stalemate: what equipment do they buy, and when do they swap existing devices for 4G-supported technology?  To combat this deadlock, it is our belief that suppliers and buyers should look to multi-network providers who can ensure that, as long as 2G or 3G networks exist in the UK, they will be able to guarantee a connection for hardware out there in the market. This switch will help the industry buy some time during which companies can start to roll out new equipment. Setting the 2G and 3G scene The G in 2G stands for ‘generation’, so 4G and 5G technologies are simply the latest phase of standards for mobile networks. 2G networks were the start of mass deployment for mobile services, and since then, the focus for new generations has been to develop and improve connectivity. The component soldered into a 2G or 3G supported device at the point of manufacture is entirely different from that used in 4G and 5G, so as the networks shut down, the SIMs in 2G or 3G devices cannot just be swapped out. The only way to switch networks is to send an engineer to swap out the equipment that is monitoring the device. This means that, at the point of purchase for IoT-connected devices, customers are required to make an almost irreversible decision on which network they want their equipment to connect to.  The 4G conundrum If all networks were to remain in place, we wouldn’t be facing such a dilemma. However, following the announcement from EE that it would be shutting down its 2G network by 31 December 2025 (disclosed in documents available to GSMA members) and Vodafone announcing that 2G in Europe will be live until “at least 2025”, the decision is more complex. Imagine you are buying life-changing equipment for medical monitoring purposes; you would want to be sure that device will remain connected and fit for purpose for as long as possible. With that in mind, you’d opt for the 4G device, right? But what if that device were up to three times more expensive than the 2G option? Considering that, you might revert to the 2G connected equipment, especially as it will be running for at least another five years, and maybe longer. In a nutshell, this is the decision deadlock buyers and sellers in the IoT market are facing right now: to buy, or not to buy, the 2G chip, or the 4G chip. To go with the most cost-effective decision or longevity. And so, the uncertainty lingers. Commercial costs If it’s causing such ambiguity in the market, you might be wondering why it’s happening in the first place. Mobile Network Operators (MNOs) are constantly striving to improve their offering – 5G only just emerged, and 6G is already hotly anticipated in the industry. Keeping older networks running that can be replaced by ‘better’ alternatives is costly and time-consuming. MNOs only have so many licences and frequencies they can use, so to free up capacity and cut costs, the simplest solution is to shut down older generations of networks to make way for new technology. After all, their decisions have to be based upon what is best commercially for them as a company, not the IoT industry.  The Smart Meter  Although the sunsetting of 2G will affect all devices connected to the network, the closure will affect some sectors more severely.
Industries that use vehicle trackers, medical monitoring equipment, and wearable devices that provide an emergency signal for staff working alone all rely heavily on 2G devices. The uncertainty in these industries was illustrated perfectly in a Freedom of Information request submitted to Ofcom in 2020 by a company that manufactures vehicle security and tracking equipment. It asked when the termination of 2G networks was expected to happen, as the products it provides carry a five-year warranty. Ofcom could not give confirmation of this situation, but in an update in February 2021, they did reveal they were working with MNOs to provide more information on 2G and 3G networks closing. However, they were still unable to provide clear timescales, which further demonstrates the vagueness that persists in the market. For other industries, the main difficulty is the sheer volume of equipment that would need to be replaced as and when 2G networks shut down. One example of this is smart meters. Over the past decade, energy providers have been rolling out smart meters that use IoT connectivity to record and send readings automatically to the provider, removing the reliance on households to self-submit readings and resulting in more accurate bills throughout the year. In the latest Government Smart Meter Statistics quarterly report, published in August 2021, an estimated 25 million households and small businesses in the UK now have smart meters installed - the vast majority of these will be connected to a 2G network. So, if MNOs decide to close 2G networks in 2025, there will be tens of millions of smart meters that no longer work. The cost of replacing the 2G components and hiring the staff to undertake such a large operation will be huge. Breaking the deadlock What we are now left with is an industry that is in a deadlock. Suppliers are worried about pricing out customers if they sell 4G components, and buyers are hesitant to invest in new technology when there’s a chance cheaper historical equipment will be around longer than expected.  It is this hesitancy that is affecting the market. But there is a solution: choose a multi-network SIM when you purchase 2G or 3G equipment. A multi-network provider has access to networks across the countries they operate in, so when the major networks start shutting down their 2G or 3G services, the multi-network provider simply moves your supply to another 2G or 3G network. As long as there is a network operating in the UK, your equipment will remain connected. Contrastingly, if you buy from a single network provider, you will be at the mercy of their decision. The market needs confidence if it is to grow and trade smoothly; purchasing a SIM from a multi-network provider means you can be certain that equipment will remain connected for as long as possible. And while companies have this confidence, they can start phasing out 2G or 3G equipment and introducing new technology in the natural cycle that would have existed before the threat of losing connection. We hope that by 2025, there will be a resolution to the debate around 2G and 3G networks. By then we hope fixed dates will provide certainty on any planned closures and the market will be able to make a measured and confident decision on what technology to invest in. Future-proofing IoT devices So, how can we future-proof devices to prevent the same issue from occurring in another ten years, when the newest generation of network surpasses 4G’s capabilities?
Enter the eSIM and the iSIM: the long-term resolution for the IoT industry. The eSIM is a component embedded into the equipment that can be connected to any operator. The iSIM, although still in its infancy, is software, rather than a component, that’s soldered into equipment. Both options remove the reliance on a single-generation network, overcoming the hurdles we are facing with traditional SIMs.  However, a lot of devices deployed across the UK right now contain plastic SIMs, making multi-network providers the go-to, short-term solution. This will give the market time to do business today, with the confidence to plan for tomorrow. Then, with the introduction of eSIMs and iSIMs when - or if - the next network shutdown comes around, the IoT market will be safe from the worry and uncertainty that arose with the sunsetting of the 2G and 3G networks.  ### Disruptive Seasons Autumn 2021 - Allan Behrens - Taxal https://vimeo.com/619885849 Join Taxal's Allan Behrens for his segment on Disruptive Seasons. An excellent session is guaranteed to leave you with new insight. ### The Company in the Age of Digital Communications The third industrial revolution was enabled by the development of mobile telephony and satellites, as early as the 1980s, and the positive adoption of these technologies was boosted by growing user demand for the mobile experience.  Currently, thanks to the rise of broadband and cloud services, telecommunications are increasingly fast, native, omnichannel and no longer only carried out between people, but also between applications and connected objects (IoT). This is now known as the fourth industrial revolution. In the enterprise field, this digital transformation has been amplified by the health crisis and the need to have the right information, at the right time, for the right person, regardless of the medium used. Beyond the flex-office: the Digital Workplace The digital transformation has been driven by the users themselves and by the generation of digital natives, born in a world with connected devices. In particular, Generation Y has been the instigator of many changes in the way tools are used and the way enterprises communicate. Thus, “Bring Your Own Device” (BYOD) has paved the way for the acceptance of portable devices in companies, such as smartphones, in order to gain mobility, nomadism and the ability to work from anywhere and everywhere. The multiplication of connected objects, which can be used on-site or remotely, is also a key aspect of digital transformation, addressing the problem of communicating and transmitting data in real-time. Since the health crisis, enterprises have begun to rethink the office, beyond the flex-office, bringing it into the era of digital communications. The digital workplace is an office adapted to all new forms of communication which is more agile, more customisable and also accessible from any place or device. However, it is important to respect the limitations, including connection time for end-users, at the risk of creating frustration among employees or customers. The advent of high-speed technologies such as 5G or fibre makes it easy to capture and transmit data in real-time and to process and analyse it with the power and speed of the central network; while the accessibility of the cloud makes data available at any point. 
The acceleration of these technologies creates a major challenge for companies: to be equipped to address the digital transformation autonomously and also effectively implement real-time communication throughout its digital ecosystem. Connect all company assets As real-time communications and collaboration become quickly widespread, they also become the heart of companies' performance challenges. However, there is no single scheme for migrating from a historical model to the digital age, since each company must adapt its communication solutions from an existing ecosystem. In the same way as an employee, applications and connected objects must now be considered as assets of the company. Although applications and connected objects respond to a simple and automated function, communication in the digital age will make it possible to have people and applications interact with each other, with data collected and processed by machines, regardless of location and device used. The goal is, through the organisation of these elements and automation in business processes, to optimise the performance of the entire system to offer a real-time communication experience without latency, which is collaborative and secure, to process the data or information needed at the right time. This digital transformation is happening step by step: first by adopting a more flexible model to migrate to a digital workplace, then by rethinking communications to put them at the centre of business processes.  Migrate to the cloud at your own pace Voice over IP (VoIP) technology was the first step towards unified communications. However, when we now talk about communication in the digital age, we imagine much further than telephony. We think of communication between people, by voice through telephony and collaborative tools such as conference platforms and online chat. But now communication also includes objects and machines communicating through AI to transmit useful information to operators, such as temperature or pressure sensors for predictive maintenance or geolocation beacons for tracking operations. In order to optimise digital communications, companies must take control of the entanglement of their network infrastructures and the new possibilities offered by the cloud (hybrid, private, etc.). In the not-too-distant future, networks will have to support a 5G, LAN or Wireless LAN technology and integrate harmoniously into the digital ecosystem chosen by the company. A mix between network infrastructure and cloud, which must be tailor-made to best meet the rapidly changing needs of a company. Every day, new services available on-demand or "as-a-service" appear and this will continue to evolve because we are only at the beginning of this new revolution of work. Many companies do not have the internal resources to develop these new ecosystems. It is then a question of finding truly agnostic partners in terms of technologies, communication and network infrastructure, applications and connected objects, to build a flexible model from all the existing elements and the expected prerequisites according to the sector of activity. The challenge will be to make what is already in place coexist with "all-digital" solutions, to limit the impact on existing processes, to develop a secure hybrid cloud to fully take advantage of data and applications, and to anticipate regulations related to this ecosystem, including digital sovereignty and ensure the protection of company data. 
For many companies, accelerating their digital transformation guarantees their future performance and success, impacting the talent recruited and the customers retained. A recent study published in Nature Human Behaviour shows that teleworking, as productive as it is, is optimised only if real-time communication tools are interconnected, reliable and suitably sized for the company and its employees. Currently, the question is not whether to integrate communications into the digital age, through the implementation of a digital workplace, but when and how to do it. More than 18 months of health crisis have highlighted the need for companies and organisations to migrate to a digital environment that will allow them to communicate in real-time, with people and devices, no matter where they are. This is the future we are working towards, and through digital transformation it is a goal we will reach. ### Cloud Migration Success: What You Need to Know With public cloud end-user spending forecast to grow 23% in 2021, global cloud computing growth shows no signs of slowing. Largely driven by digital transformation, 5G adoption, the Internet of Things (IoT) and Artificial Intelligence (AI), as well as the pandemic, large enterprises have led the way in cloud adoption. However, small and medium enterprises (SMEs) are now mirroring their larger counterparts to emerge as one of the fastest-growing cloud migration segments.  In what is turning into a race to the cloud, SMEs are increasingly looking to Amazon Web Services (AWS) and Microsoft Azure as their preferred choices – with Azure being particularly attractive to them. This is especially true for SMEs that already rely heavily on Microsoft solutions. Aside from delivering a seamless experience with Microsoft server products, Azure Active Directory provides a link for cloud services, making the move to Azure a natural choice. A key financial benefit Azure offers current Windows Server and SQL Server clients is the ability to use these licences in the cloud at no additional cost. The race to the cloud: hurdles and opportunities Reaping the same cloud migration advantages as large enterprises, SMEs are benefitting from reducing or completely eliminating a large capital outlay for hardware. The cloud enables them to better control costs, improve efficiency with fewer resources, and gain the flexibility to scale up or down as demand requires. In addition, cloud applications and services make it easier for companies to support remote workforces, which, as we continue to work through the pandemic, has become a key necessity. Moreover, transitioning to the cloud provides businesses with immediate access to the latest technology and cutting-edge tools, giving them a solid foundation for growth and forward-looking IT strategies.  On the other hand, some SMEs hesitate to make the move to the cloud for a variety of reasons. Cloud contracts and cost models can be complex, and the right-sizing of the cloud is important to guard against unpredictable costs and bill spikes. Additionally, there are data transfer charges, and SMEs with limited internal IT expertise could feel overwhelmed by the task of migrating data to the cloud. For example, moving data between regions – often necessary for compliance reasons – typically incurs an egress charge. However, one of the main reasons that SMEs are reluctant to transfer their data and applications to the cloud is a perceived loss of control.
Prompted primarily by endless reports of ransomware and other cyberattacks, SMEs question whether the cloud is secure. To some, keeping their applications and data in the local server room seems safer, while moving to the cloud raises considerable data protection questions, such as handing over control of, and liability for, their systems to a third party. The truth is, if done correctly, migrating to the cloud can actually improve an organisation’s security level. Cloud providers and third-party vendors alike have developed sophisticated security solutions and are constantly improving them to mitigate threats. By using the available security tools and services in combination with support from an experienced IT partner, SMEs can put in place a comprehensive cyber security strategy. But moving to the cloud takes forethought and planning. For a successful migration, businesses should take into consideration the following five strategies. Secure a professional with proven experience Shifting your infrastructure to the cloud is a complex task and not one that should be handled without cloud migration expertise. A trusted managed service provider (MSP) can help you make the right decisions on a number of matters, including when to migrate, which parts of your infrastructure to move to the cloud, and how it can be done with the least disruption. Additionally, make sure the MSP has strong security capabilities, as well as the necessary knowledge to put in place the security controls needed to meet your specific requirements. Before selecting an MSP partner, ensure they have proven experience in cloud migrations, and take a look at their pricing model(s). This will help you to determine if their pricing is transparent and predictable or whether you’ll need to plan for cost fluctuations. Create a strategic migration plan Work towards creating a 3-5 year plan with your MSP for transitioning hardware and applications to the cloud. By working together you’ll have the opportunity to review your business goals to determine which parts of your business would run more effectively in the cloud, then consider each application individually to determine migration timing. It’s important to remember that cloud migration isn’t a ‘one and done’ initiative, but rather a series of events that will span on-premise, the cloud, and hybrid. Simplify migration with a step approach By its very nature, the cloud can be optimised and adjusted over time. Initially, it will be important to focus on easier migration executions. For example, by moving servers to the cloud first, they can be monitored and adjustments made before migrating applications or moving users to Azure Virtual Desktop. Incorporate change management initiatives Moving your infrastructure to the cloud will change how end-users access and use applications, which makes it important to keep communication channels open. To keep staff informed and effectively manage expectations, communications will need to be carefully planned. Additionally, spend time understanding how the user experience will be different and plan for adequate training to ensure that staff will be able to work efficiently within the new infrastructure. Draw on your MSP’s expertise to help navigate this process, provide user training, and develop documentation. It’s also beneficial to use this time to revisit security basics and provide employees with security awareness training. Implement a strong security strategy The importance of implementing a strong security strategy can’t be stressed enough.
Since Infrastructure as a Service (IaaS) providers don't cover data protection and recovery out of the box, you'll need additional data protection measures that meet your requirements. For instance, Azure Data Redundancy, which is provided free of charge, doesn't store historical data, nor does it provide automated recovery. Alternatively, Azure Site Recovery provides the ability to recover from a historical backup point beyond 72 hours – which may or may not meet your recovery time and recovery point objectives. For true business continuity, it's best to distribute risk by performing cloud backups both within Azure and to a separate private cloud. This ensures that your operations remain up and running and provides the ability to restore them quickly if something does go wrong – even in the case of an Azure outage. Further to the point, work with your MSP on implementing a third-party business continuity and disaster recovery (BCDR) solution. Developed specifically for the public cloud, these solutions offer additional security measures such as ransomware detection and data deletion defences. Regardless of whether the workloads reside on-premises or in the cloud, a BCDR strategy ensures reliable data recovery in the event of deletion, corruption or a ransomware attack. Migrating to the cloud can be a key driver in modernising business practices, facilitating continued remote working, and improving collaboration, performance, and security. When handled correctly, cloud migration will optimise your infrastructure and open up new opportunities. The key to a successful migration is to plan carefully and onboard a trusted MSP that has the necessary cloud migration expertise. ### Disruptive Seasons Autumn 2021 - Chris Newlands - Tripsology Group https://vimeo.com/619919926 Chris Newlands from the Tripsology Group made an appearance on Disruptive Seasons; check out his presentation – it is out of this world! ### Disruptive Seasons Autumn 2021 - Bill Mew - Crisis Team https://vimeo.com/619921954 Bill Mew from the Crisis Team delivered an excellent presentation at Disruptive Seasons. Watch now to gain expert insight! ### Disruptive Seasons Autumn 2021 - Audrey Tang - Psychologist https://vimeo.com/619921486 Dr Audrey Tang, psychologist, came to Seasons and delivered an excellent, thought-provoking presentation that will boggle the mind! ### Disruptive Seasons Autumn 2021 - Team Qual https://vimeo.com/619921176 Check out the presentation from the team members at Qual. We loved having them on Disruptive Seasons – it was an excellent presentation. Watch now! ### Disruptive Seasons Autumn 2021 - Stuart Simmons - Denodo https://vimeo.com/619920644 Join Denodo's Stuart Simmons as he appears on Disruptive Seasons with an unbelievable presentation. ### Disruptive Seasons Autumn 2021 - Darren Swift - Google https://vimeo.com/619920477 Join Google's Darren Swift as he presents an excellent keynote! ### Disruptive Seasons Autumn 2021 - Mandy Chessell - IBM https://vimeo.com/622294565 Join IBM's Mandy Chessell for her Disruptive Seasons session! ### Disruptive Seasons Autumn 2021 - Jez Back - Thebes https://vimeo.com/622294445 Jez Back from Thebes had a great session at Disruptive Seasons! Check it out now. ### Disruptive Seasons Autumn 2021 - Michael Foote & Rupert Fallows - Birlasoft https://vimeo.com/622293838 Michael Foote & Rupert Fallows from Birlasoft appeared on Disruptive Seasons – check out their session now!
### Disruptive Seasons Autumn 2021 - Chris Gilmour - Axians https://vimeo.com/622293477 Watch now to see Chris Gilmour's session from Disruptive Seasons! ### Disruptive Seasons Autumn 2021 - Isabel Kelly - Profit with Purpose https://vimeo.com/622293293 Join Isabel Kelly as she speaks at Disruptive Seasons Autumn 2021 about Profit with Purpose. ### Why Misconfiguration Remains the Biggest Cloud Security Threat Misconfiguration remains the number one cause of data breaches in the cloud because change control is so challenging in this environment. It routinely tops the Cloud Security Alliance's list of threats and was the top issue in the State of Cloud Security 2021 report, while the Cloud Configuration Risks Exposed report found 90 per cent of organisations are vulnerable to security breaches due to cloud misconfigurations. It's also a growing problem. An IBM study found that of the more than 8.5 billion breached records reported in 2019, seven billion – more than 80 per cent – were due to misconfigured cloud servers and other improperly configured systems, compared to 2018 when such records made up less than 50 per cent. What is misconfiguration? Misconfiguration can broadly be interpreted as a failure to adequately apply restrictions on a service or system residing in the cloud. It occurs when applications are spun up in the cloud and new services are activated, and it can happen in various ways. There could be a failure to configure from day one, leaving systems with default settings, or a failure to apply access restrictions and enforce least privilege. Or perhaps unapproved changes were made in contravention of the security policy. Or systems were left publicly exposed to the internet – a common failing with object storage buckets. The potential for misconfiguration is vast – the State of Cloud Security report found 49 per cent of teams experience over 50 misconfigurations per day – but there are hotspots. Identity and entitlement access management is a case in point: due to the proliferation of identities within the cloud, the permissions for each of these have to be nailed down. Other areas include security group and firewall rules, whether logging has been enabled or disabled, and encryption controls for data at rest and in transit. And there's also the potential for orphaned resources which then fly under the security radar. Current and future issues The ramifications of misconfiguration can be devastating, with any data breach potentially providing a foothold into the cloud environment. This can lead to the harvesting of credentials which are then leaked or sold on and used for credential stuffing attacks, the automated injection of username/password pairs into website log-ins. Or it can pave the way for lateral attacks, such as ransomware or cryptojacking, whereby cloud resources are hijacked and used to power cryptomining operations. And it's a problem that is set to worsen because of the way the cloud is evolving. We've seen rapid migration to the cloud during the pandemic, but also expansion leading to higher uptake of hybrid and multi-cloud environments. Using a number of different platforms can make it difficult to maintain visibility, and because service provider offerings are platform-specific and few organisations have pan-cloud security, gaps can result, leaving the business exposed. Many also don't have the internal expertise needed to manage and maintain these environments. Mitigating misconfiguration Keeping on top of configuration requires a multi-faceted approach.
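As a simple illustration of the kinds of automated checks this involves, the sketch below flags some of the misconfiguration hotspots described above – public exposure, disabled encryption and disabled logging. The resource records and field names are invented for the example; in practice this logic would live in a policy-as-code framework or posture-management tool of the kind discussed below, fed by live configuration data from each provider's APIs.

```python
# Minimal, illustrative misconfiguration audit. The resource records and
# field names are invented; a real check would pull live configuration
# from each cloud provider's APIs rather than a hard-coded list.
resources = [
    {"name": "customer-uploads", "type": "object_storage",
     "public": True, "encrypted_at_rest": False, "logging_enabled": False},
    {"name": "billing-db", "type": "database",
     "public": False, "encrypted_at_rest": True, "logging_enabled": True},
]

def audit(resource: dict) -> list:
    """Return the policy violations found on a single cloud resource."""
    findings = []
    if resource["public"]:
        findings.append("publicly exposed to the internet")
    if not resource["encrypted_at_rest"]:
        findings.append("encryption at rest disabled")
    if not resource["logging_enabled"]:
        findings.append("logging disabled")
    return findings

for r in resources:
    for finding in audit(r):
        print(f"{r['type']} '{r['name']}': {finding}")
```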
A priority is maintaining visibility of cloud assets, entities and identities. To do that, you'll need to consider whether your Identity and Access Management (IAM) is fit for purpose and can right-size privileges to ensure the appropriate level of access is assigned to cloud services. You'll also need to consider how you can maintain a unified approach to configuration as your cloud footprint grows, and ensure policies are applied across different computing environments, i.e. hybrid and multi-cloud. That's often more difficult than it sounds, because determining who is responsible for configuration can be complex, with implementations typically spanning multiple teams, i.e. DevOps, security/compliance teams and external advisors. The need to continually spin up and make changes to services, often on a daily basis, means you'll also need oversight of all APIs and interfaces, requiring some form of automation of cloud compliance. Policy as code (PAC) and infrastructure as code (IAC) can both be used to help monitor the environment, while Cloud Security Posture Management (CSPM) can help with managing compliance and monitoring across multi-cloud – although it can't perform identity and entitlement management; for that you'll need Cloud Infrastructure Entitlement Management (CIEM). Yet even with these advanced automation tools, you'll need eyes-on to resolve issues. Having the expertise to manage, interpret and respond appropriately remains a real concern for cloud teams, with 35 per cent of those surveyed in the State of Cloud Security report saying they need better guidance on the remediation of misconfiguration in cloud environments and IAC. For this reason, many organisations periodically carry out a cloud security configuration review using a third party to establish where misconfigurations are and obtain remediation advice. This inventories the cloud estate across all cloud provider platforms and highlights any weaknesses in the security profile, giving a true indication of where the issues are as well as a clear means of resolution. ### Disruptive Seasons Autumn 2021 - Lee Biggenden - Nephos Technologies https://vimeo.com/622293166 Catch Lee Biggenden from Nephos Technologies as he appeared on Disruptive Seasons Autumn 2021! ### Rapid Migration Leaves Vulnerabilities In response to the pandemic, organisations were quick to adopt new technology, such as cloud computing, to keep work flowing remotely. The problem is that when you adopt new technology, you also have to adopt new methods, solutions and techniques to secure it. In the urgency of the move to remote working and digital operations, many organisations lacked the time or resources to fully secure the new solutions. While ultimately delivering significant business benefits, this rapid digital transformation can leave organisations exposed to attacks – particularly as, according to VMware, ransomware attacks alone have increased by as much as 900% since the start of the pandemic. And our security operations centre (SOC) data shows a 63% increase in cyber attacks in just the last quarter. For CISOs, this presents a challenge – and an urgent one at that. How can a single security team monitor all this information, at all times, no matter where in the world their users are? The answer, of course, lies in technology – specifically a cloud-native security platform. Here are three ways the right cloud security platform can help you automate security and sleep easy.
1. Hacker vs. holiday: How to govern access for remote workers The cloud lets employees log in from any device and location. So how do you tell the difference between a hacker in Barcelona and Bob the HR manager checking emails on holiday in Majorca? Assessing and authenticating users has always been a key part of security. But it was much easier when users logged into office-based computers that were wired up to a corporate server. Now, users can log on using unsecured devices from anywhere in the world. So, how can you protect yourself from hackers without stopping Bob from checking his emails on holiday? The key is to tailor the security response to the situation, creating risk-based user authentication rules. If someone is logging into their office desktop as normal, then a simple password is sufficient. But if they are in Majorca, perhaps logging on out of hours, it makes more sense to ask for multi-factor authentication (MFA), or perhaps even biometric authentication, to enter the corporate network. And if someone is in fact a hacker from Barcelona, logging in from an unregistered device in a new location at a strange time, then perhaps CISO oversight is needed before access can be granted. Organisations today have a wide range of authentication methods to choose from – not just passwords. The key to an effective and friction-free sign-in policy is to carefully tailor authentication to the risk profile of each sign-in event. With a cloud security platform like Security Information and Event Management (SIEM), CISOs can easily automate these rules. This gives end users a seamless sign-on experience wherever they are while taking pressure off the IT/security team to authenticate every sign-in. 2. The weakest link: Protecting your users from costly attacks Since the pandemic, there has been a sharp spike in security attacks – specifically ransomware and phishing. Much of the time, the weakest link is hiding in plain sight: your users' email inboxes. The truth is, human error is behind a shocking majority of security attacks and breaches: as much as 95%, according to IBM. The problem is that if a user accidentally responds to a phishing email and leaks sensitive information, there's little a CISO can do to get it back. So how do you protect your organisation against your own users? User training is obviously a good start – but the best training in the world can't guarantee you'll stay abreast of every ransomware, phishing and malware attack that hits your organisation. For that, you need technology. Ideally, technology that identifies phishing and ransomware attacks and stops them from entering your users' inboxes in the first place. A cloud security platform can achieve this by automatically detecting suspicious-looking emails, attachments and downloads, and quarantining them so users can't access them. Then, a CISO can assess the suspect emails and take appropriate action – either sending them on to the user if they are legitimate or keeping them quarantined. 3. Automate security and sleep easy The cloud makes it easier for users to collaborate across different locations and time zones – independent of traditional office hours. But what happens when an employee in the US accidentally shares sensitive data at 10pm, after the UK-based CISO has gone to sleep? Automation is the key to any effective security response here. Even if the CISO is online, they don't have the time or resources to manually monitor every potential security alert – particularly in a large organisation.
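To make the idea of codified, risk-based rules concrete, here is a minimal sketch of the kind of sign-in policy described under the first point. The scoring, thresholds and field names are invented for illustration; a real SIEM or identity platform would express this as policy configuration rather than application code.

```python
# Illustrative risk-based sign-in policy. Scores, thresholds and field
# names are invented; real platforms express this as configuration.
def required_authentication(event: dict) -> str:
    """Map a sign-in event to an authentication requirement."""
    risk = 0
    if not event["known_device"]:
        risk += 2   # unregistered device
    if event["new_location"]:
        risk += 2   # sign-in from a location not seen before
    if event["out_of_hours"]:
        risk += 1   # outside the user's normal working pattern

    if risk == 0:
        return "password"         # normal office sign-in
    if risk <= 3:
        return "mfa"              # e.g. Bob checking email on holiday
    return "block_and_review"     # looks more like the hacker in Barcelona

# Bob in Majorca: known device, new location, out of hours -> "mfa"
print(required_authentication(
    {"known_device": True, "new_location": True, "out_of_hours": True}))
```

The same pattern extends to the automated, out-of-hours responses discussed in the third point: encode the decision once, and let the platform apply it on every event.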
An effective cloud security platform should therefore identify and respond to threats such as ransomware and phishing – without input from the security team. Cybersecurity is a top priority for business leaders everywhere, and rightfully so. But with the right technology, it doesn't have to be difficult. Cloud security platforms like SIEM can automate monitoring, detection and threat prevention to keep your business data safe, whatever happens. ### Disruptive Seasons Autumn 2021 - Daniel Blondell - McLEAR https://vimeo.com/622292404 Join Daniel Blondell from McLear as he appeared on Disruptive Seasons Autumn 2021. ### Is The Potential Of Blockchain Being Realised In IoT Networks, Or Is It Destined To Remain A Niche Technology? The integration of blockchain and IoT has successfully addressed many issues in intelligent connected systems. Organisations across different domains already embrace blockchain, and they will realise the technology's true potential in the coming years; it won't be 'just' a niche technology any longer. The IoT or Internet of Things functions by connecting people, devices, places, and technology to create value for individuals and organisations. Today, IoT exists in sophisticated and connected sensors, chips, and actuators embedded in different physical entities. As the IoT ecosystem continues to grow globally, billions of devices are getting interconnected. The world is expected to have more than 50 billion connected devices by 2030. The very nature of IoT makes it vulnerable to various security threats because connected IoT devices exist in many forms and brands, tested or untested, approved or unauthorised. With the number of devices in the smart ecosystem growing, users are increasingly susceptible to data theft and hacking. Even a single vulnerable connected device can lead to a compromise of data. Starting with Bitcoin, distributed ledger technologies (DLTs) like blockchain continue to gain popularity at an unprecedented scale. There are three main features of this technology that might benefit the IoT ecosystem. Trust: DLTs have been conceived to add a trust layer between all the participant parties. Because the ledger is append-only and reinforced by cryptography, no participant can modify or delete any of its blocks, making the blockchain immutable and every transaction traceable. Digital Identity: With centralised identities becoming vulnerable to data breaches and identity theft, digital identity often becomes unreliable. By incorporating blockchain technology, users gain control over their data and information. Thus, blockchain technologies are helping users control the use of their digital data and manage digital identities. Smart Contract: With real-time enforcement, blockchain-based smart contracts are being used to establish an agreement between multiple parties without intermediary involvement. The contract lives on a decentralised, distributed blockchain network. Presently, such arrangements have become a staple in different spheres involving human transactions, such as real estate and healthcare. The integration of IoT and blockchain can go a long way in preventing disruption, thereby building trust during data sharing, processing, and storage. This fact explains why organisations across various sectors are integrating blockchain into IoT ecosystems to secure their data across IoT (Internet of Things), IIoT (Industrial Internet of Things), and IoMT (Internet of Medical Things). Is The Potential Of Blockchain Being Realised In IoT Networks?
While the use of blockchain is still in its nascent stages, the coming years will see its diversification and adoption in IoT environments. By 2025, blockchain IoT is set to be a $3 billion industry. However, IoT networks are yet to exploit blockchain technology to its full potential. Firstly, specific concerns regarding the processing power necessary to encrypt every object in a blockchain-based environment need to be addressed; considering the diversity of IoT ecosystems, the key players need to make strategic moves. Secondly, IoT is different from ordinary computing networks, as many diverse devices constitute an IoT environment. Accordingly, computing power varies across the system, and not every device will be able to maintain the same desired speed while running similar encryption algorithms. Thirdly, many firms are presently unable to deploy blockchain in IoT due to storage constraints. Although no central server is required to store device IDs and transactions, the ledger has to be placed on the nodes, and over time the ledger will keep growing. Considering such challenges, organisations deploying IoT systems have not fully capitalised on blockchain technologies yet. It won't be wrong to acknowledge blockchain as the missing link between IoT and data security today. Between 2019 and 2025, global blockchain IoT is likely to register a CAGR of 91.4% – a projection that speaks to the potential blockchain holds for IoT in the coming years. A recently published report has highlighted many use cases that leverage a combination of IoT and blockchain technology, such as supply chain (e.g., food safety and anti-counterfeiting), IoT network management (e.g., smart cities and healthcare), and smart contracts and compliance. It shows that blockchain cannot remain a niche technology and will make a widespread impact in IoT and many other spheres of human activity. IoT, IIoT, IoMT Device Authentication: It is imperative to track the origin of a physical or digital asset, and a blockchain used to trade it can verify the authentic developer or manufacturer, owner, supplier and so on. Blockchain will therefore not remain a niche technology confined to a few industries; it will be of great help for the authentication of connected devices across various networks, and in the coming years almost every player in diverse sectors will embrace it to secure credentials and identity across devices and platforms and to manage digital rights. Intel has already started working on blockchain technology in collaboration with JP Morgan, Microsoft and others. It is also exploring how blockchain in combination with IoT – blockchain-based IoT (BIoT) applications – can be used to authenticate IoT devices and help solve the security issues of IoT-based systems. Logistics: Even with digitisation, the logistics industry struggles to cope with a shortage of transparency and communication. Besides, with the number of players in this sector multiplying in recent years, organisations often lose time and money. Global logistics organisations are therefore leveraging blockchain's transparency and legitimacy attributes to improve their automation processes. Automobiles Sector: Blockchain and the IoT ecosystem will be key technologies of tomorrow, spanning automobile manufacturing, payments, driving, tracking, parking and other everyday aspects of people's lives.
Toyota Blockchain Lab has already started working on customer contract digitalisation, vehicle rights management, vehicle lifecycle management, supply chain, finance, etc., by leveraging highly tamper-resistant and fault-resistant characteristics of blockchain technology along with IoT. Other Applications: Presently, the experiments are in the early stages. However, the applications range from medical records to identity management to micropayments and virtual customer blockchain wallets. Apart from the above-mentioned use cases and applications, other IoT-Blockchain projects such as VeChain are trying to enhance supply chain management. On the other hand, IOTA is developing a standard model for conducting transactions on various devices. Final Words Blockchain and IoT are transforming the digital space, and the world has started to realise its potential. While it might prove to be a labour-intensive pursuit, sensors and IoT devices can expand the digital assets and solutions developed on blockchain. Though the current scenario shows that both blockchain and IoT technologies are in their developmental stages, statistics hold enormous promises for the future. Blockchain cannot remain merely a niche technology in the modern world of convergence, interconnectivity, and diversification. Instead, it will revolutionise IoT and any other conceivable field of human endeavour on a global level. ### What are the Main Concerns for CISOs Today? The responsibilities of a Chief Information Security Officer (CISO) are forever growing as the cyberthreat landscape develops. The task of keeping up with these evolving threat vectors is no mean feat, and the pressure is on to achieve top results at speed. Needless to say, a lot can come to weigh on a CISO’s mind. Organisations are frequently being targeted by cybercriminals and they must be quick out of the gate when an attack occurs. However, Deep Instinct’s SecOps report reveals this is not always the case. On average, the UK response time to a cyber breach is 20.9 hours. That’s more than two working days. Not only will their data be long gone, but adversaries will have plenty of time to infiltrate other areas of the network and increase their take.  The delay seems to be creating a somewhat despondent attitude to security, with 86 per cent of UK organisations stating they don’t think it’s possible to prevent all ransomware and malware attacks. Even though cyber attacks appear to be inevitable, we shouldn’t just roll over and accept this fate. Teams must rise up and fight fire with fire.  The first step is to recognise the challenges that need addressing.  So, what are the main causes of concern?  The daily concerns of a CISO cover extensive areas of a business’s operations. There are countless numbers of potential weaknesses in one system, and it can be a monumental task to secure them all.  One issue that’s causing havoc for CISOs and their teams is the rising volume of security alerts and the high number of false positives they bring. Notifications that alert businesses to potential threats and breaches should be a vital part of a security strategy, but instead, they can present a serious threat. When teams are receiving thousands of alerts each day, the time it takes to investigate every single one undermines the entire purpose. Additional resources are often assigned to resolving these alerts taking the focus off basic cyber hygiene. 
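The scale of that alert problem is easy to underestimate, so a rough, worked example helps; the alert volume and triage time below are assumptions chosen for illustration, not figures from the report.

```python
# Back-of-the-envelope illustration of alert fatigue.
# Both inputs are assumptions for the example, not survey figures.
alerts_per_day = 1000      # assumed daily alert volume
minutes_per_alert = 5      # assumed triage time per alert

analyst_hours = alerts_per_day * minutes_per_alert / 60
print(f"{analyst_hours:.0f} analyst-hours of triage per day")   # ~83 hours
print(f"{analyst_hours / 8:.1f} full-time analysts, before any "
      "real investigation begins")                              # ~10.4 people
```

At thousands of alerts a day, triage alone can absorb an entire team before a single genuine incident has been investigated.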
Beyond this, one of the main takeaways from our research is that attackers use an array of tactics to breach networks – and this is one of the biggest concerns for CISOs. Even if a company successfully blocks an attack one day, there’s no guarantee of the same result the next time.  Here are a few more elements of security that remain concerns for CISOs.  Endpoint protection  Endpoints have always been potential weaknesses in a business network, but the dissolving perimeter and increased hybrid workforces have exacerbated this vulnerability. As endpoints now represent the growing attack surface, 31 per cent of security professionals are committed to installing as many endpoint protection solutions as necessary in order to keep the network secure.  However, there is still concern around balancing security with productivity. If an increase in security measures impacts system performance, then teams may be reluctant to move forwards with it.  Hybrid workforce The last year flipped security on its head. Permanent offices became a distant memory and homeworking took hold of the modern workforce. Even now, very few teams are committing to returning to offices full time and instead have adopted the hybrid model. The impact of this major change is clear as a mere five per cent of security professionals, from both large and small companies, stated they have no security concerns about the hybrid workforce. The main two concerns regarding this future way of working are teams being able to ensure employees have secure remote access and preventing the use of unapproved services. The speed at which teams were forced to adapt meant that a huge proportion of visibility was lost, and it’s been a priority to try and claw this back.   Human risk At the centre of a CISO’s role is an element in all security strategies that can be the most unpredictable. People.  Human workers hold a vital role in any cybersecurity plan, but they are often targeted by attackers as they are perceived as being the weakest link. Armed with extensive social engineering techniques, criminals are well prepared to take on the defence line of employees, picking out the vulnerable and using them to gain access to the network. And unfortunately, no matter how much training is given to workers, adversaries will always find a way through.  CISO concerns are evident, with only 14 per cent of research respondents confirming complete confidence in their employees’ abilities to identify malware.  What does this mean for CISOs? The sheer number of daily concerns for CISOs calls for greater control over the security stack. The time pressure of investigating threats and mitigating risks, combined with the ongoing SecOps skills shortage, makes a CISO’s role incredibly challenging.  One thing is certain – traditional security solutions are no longer enough to defend against advanced cyberattacks. CISOs have lost confidence in their abilities and so it’s time for a change. Strategies must move beyond mitigation and reaction and should instead adopt a proactive approach.  Become an opponent, not a victim Reassessing their security strategies will help businesses level the playing field and improve their chances of thwarting adversary attacks. Having to chase after various alerts without sufficient resources can cause greater challenges for teams, which is why automated processes have become more and more critical.  
Machine learning (ML) has been the go-to solution for businesses looking to automate their processes and free up employee resources for more high-value tasks. Unfortunately, cybercriminals have caught on, and have developed ways to manipulate ML for their own uses, leaving organisations with an even wider attack surface.  However, there is an advanced solution that can offer the next level of defence to fight against this new wave of threats. Deep learning (DL) is an advanced subset of ML and essentially replicates the neurological networks of the human brain. This technology works independently from employees and uses raw data to learn how to differentiate malicious code from benign. Unlike ML, it does not use pre-classified data. Over time, the system will learn to recognise incoming known and unknown threats, providing organisations with end-to-end proactive protection. By removing the need for human support, DL allows teams to reallocate resources on areas of security that most need attention. The system can automatically assess and respond to false positives so that teams are only made aware of those that require further action.  CISOs should feel supported in their role, and they need to feel confident that their employees and the processes in place will provide an effective line of defence for the business. With deep learning, SecOps teams can prioritise high-value tasks knowing that the technology in place is working hard to keep the new business perimeter secure from incoming attacks, and therefore, put the CISO’s mind at rest. ### The Cloud Conundrum: Tackling Inefficiency and Value Leakage in Cloud Over the past 18 months, the pace of digital acceleration has quickened. A recent KPMG survey of business leaders found that 67 per cent have sped up their digital transformation strategy and 63 per cent have put greater investment into their digital transformation plans since the start of the pandemic. With the technologies that underpin digital transformation being extremely cloud-dependent, it is no surprise that IDC’s latest figures forecast 2021 cloud infrastructure spending to increase 12 per cent to $74.3 billion.  However, we know that historic cloud investments haven’t always delivered value, and in some cases have actively generated wasteful spending. Some estimates show that almost a third of organisations’ cloud investment is inefficient, equating to approximately $25 billion of spending that is not creating real value for the enterprise. The reason this is happening is primarily a result of architecture inefficiency and value leakage. If these predictions for cloud infrastructure spending come true, this value gap is only going to get wider.  Why is Cloud Waste a Problem?  Many companies are so focused on migration throughput, that they fail to identify how their enterprise will drive value from the cloud. This mindset results in a poor quality of transformation and intensifies architecture inefficiency and missed value. Cloud value requires more than just moving IT systems; a focus on business outcomes, resilience and regulatory controls is a must, as it’s hard to retrospectively introduce the controls or transform the architecture to support these areas. Architecture inefficiency relates to the use of architectures that cost more than they need to on the cloud. 
Perhaps they inhibit auto-scaling and utilisation, or they use older or more expensive instance types than needed, or they have failed to refactor workloads when implementing Function as a Service (FaaS) and Platform as a Service (PaaS) capabilities. In other cases, they use tooling around the cloud inefficiently. Value leakage comes when spending is not targeted directly at where the enterprise needs to get value from the cloud. Understanding at the outset where the most value can be generated (whether that is analytics, scaling cyber security, AI, or a combination thereof) is crucial. Either under-investing in those areas, or over-investing in undifferentiated areas, generates waste. Closing the Cloud Value Gap CIOs are seeking to maintain or increase the pace of digital transformation, which has been founded on cloud infrastructure, so they must extract value from every pound they spend. Fortunately, there are several measures that organisations can take to help achieve this: Identify your organisation's strategic reasons for cloud use The enterprise can use the cloud for many things. The basic functionality is commodity infrastructure, which provides data storage and compute power. At an intermediate level, the cloud can also be used for increased agility, scalability and speed of provisioning to accelerate the speed to market of technology products. For the most innovative companies, the cloud can be used for analytics and advanced technologies including AI and machine learning. Some businesses might use all these capabilities, while others only employ a handful. Organisations must laser in on what they are actually trying to accomplish with their cloud deployment. Without this focus, engineers will simply build a slightly better cloud than the last one, rather than create an architecture focusing on the value that needs to be delivered. Quantify impact using a 'cloud value map' After the strategic purpose of using the cloud is agreed upon, ongoing usage should be mapped against key metrics and the business capabilities they underpin. The simplest way to do this is by using a value map that connects the differentiating business services at the top of the pyramid with the enabler capabilities in the middle layer and the foundational architecture and operations levers at the base. Put simply, based on the business services outcomes that need to be delivered, the enablers and foundations will be organised differently. For example, a cloud engineered for insight will organise data in a different way to one engineered for cost. This then helps enterprises to identify if the cloud is supporting their ability to execute higher up the stack, whether business processes and systems are truly being enabled by the cloud, and if it is affecting customer journeys and processes. It's also possible then to point at spending which is being wasted on functions that don't support business value. Through a combination of these processes, cloud teams can be certain that the cloud infrastructure they are operating is cost-effective. Don't forget engineering teams and Risk functions Having a cloud controls framework in place to show to the Board and regulators will give them the confidence that any cloud migrations and updates can be done with speed, accuracy and minimum risk. The framework incorporates the operational, security and resilience risks that emerge from cloud migrations and ongoing operations.
It also helps to understand, mitigate and manage any risks, while complying with existing regulations. Promote collaboration Cloud implementation, whilst on paper is often done correctly, often misses the beneficial opportunities when business and technology functions are not properly aligned. When business and IT teams are brought together it encourages both parties to evaluate current cloud usage, logjams and ways of igniting faster progress. Cloud adoption is usually driven by CIOs, but they must work collaboratively with other departments to decide on a strategy and communicate how these decisions will impact cloud use, as well as what this means for the rest of the business. Employ a sophisticated cloud-appropriate strategy The benefit of hybrid cloud is that it enables organisations to choose the most suitable infrastructure for the job. One workload may benefit from a particular AI capability more prevalent in one cloud, whereas another workload may be purely commodity to be placed in the most cost-effective location, cloud or on-premise.  There is no doubt that cloud technologies will play an integral role in helping businesses recover from the effects the pandemic has had on the economy. It will be a significant differentiator for those businesses that implement it well. IT teams that are quick to adopt a measured approach to maximise value and minimise cloud waste will secure lasting competitive advantages for their enterprise.  ### Be Cloud-First, or be Left Behind Take a moment to think about your network and the applications being used by staff in the company.  How much of that technology is cloud-native or has a cloud component?  Ten per cent, a quarter, half, or maybe more?  Whatever the answer, in almost all cases, it will be higher than it was five years ago and likely more than two years ago. And if not, then it’s time to think seriously about the importance of being cloud-first. Cloud platforms from providers, such as Google, Amazon and Microsoft, have proven to deliver a level of security, resilience, raw compute power and expertise that no one but the largest enterprises could hope to build. Even if they could, the geographical reach, ongoing innovation, investment in tech talent, and affordable price points that today’s cloud platforms provide would be almost impossible to match.  Applications, too, have made their mark, whether it’s productivity suites such as Office 365, Infrastructure-as-a-Service (IaaS)-based versions of popular server platforms, or CRM solutions like Salesforce.  Most software vendors already offer a subscription-based cloud version of their applications.    The on-premises model of technology is being superseded by the cloud, and whilst it may not be possible to get every application an enterprise wants as a cloud offering, in most cases, it is just a matter of time. Evolution towards cloud For enterprises, this evolution in the way technology is delivered creates a challenge. Do they embrace the cloud and go ‘cloud first’ or continue to maintain and invest in the on-premises infrastructure they already have? While it’s important to recognise there are specific characteristics that define the way industries embrace technology and every business has a tempo of its own, the evolution towards cloud is universally applicable. Many enterprises have been operating with hybrid models, but increasingly on-premises applications give way to cloud offerings. 
This means that whilst hybrid models have helped companies dip a toe in the water and experiment with the cloud, this is not the endgame. Vendors want customers on the cloud version of their products, and in as little as five years, most software solutions will only be available as a cloud service. The onus is now on enterprises to embrace cloud-first and move away from the traditional on-premises model. This seems like a tall order, but cloud-first does not mean ripping everything out of the server room tomorrow. Transitioning from on-premises or hybrid versions of products is now much easier than it was three or four years ago, when the process could be quite painful and require specialist skills. Vendors have worked hard to develop the migration tools and support teams to make it a lot easier – precisely because they want their future in the cloud. Getting to cloud-first Going cloud-first starts with accepting that the on-premises technology and IT skills in the company today either are, or will become, a technology debt in the next five years. Businesses that do not adopt a cloud-first strategy will find themselves burdened with outdated infrastructure and huge maintenance costs and only able to look on as their competitors become more agile and innovative. That technology debt will keep growing over time as the gap between the haves and have-nots widens, making it even harder to play catch-up when there is no choice but to transition. Cloud-first means giving fair consideration to whether a cloud solution can be a suitable replacement for any new or existing on-premises solution when it aligns with an upgrade cycle, project, or business imperative. There will be situations where it is not the right choice yet. If a cloud-native solution is not going to work, then decision-makers need to understand how long the on-premises version will be supported and maintained, what the vendor's plans are for migrating the service to the cloud, and how easy that migration will be for them as a customer. The speed and approach used to replace on-premises technology with cloud solutions will be unique to each business's size and circumstances, its applications, network architecture, and its business strategy. But there are some things that companies can do to make sure they are ready to capitalise on cloud technology at the appropriate moment: Zero-trust security strategy: The adoption of zero-trust security models is one of the most effective ways to protect a business against hackers. Hybrid working practices, distributed workforces, and the use of cloud technology alone have made it essential to protect employees, devices, apps, and data wherever they are located. Every interaction between these elements must be authenticated, and only the minimum level of access should be granted at any time. Formalise the strategy: The business case for going cloud-first needs to be evangelised across all teams, departments, and the C-suite. Everyone must be onside to make the strategy effective. Formalising it and communicating it throughout the business is critical to success as individual applications and resources move into the cloud. Architecture inventory: Enterprises can only move what they know they have. Resource modelling and discovery tools can help to shine a light on every detail of the company's architecture, so it is easier to plan and understand where refresh cycles sit. No company should assume it already knows what it has.
There is always the chance that undocumented alterations or 'temporary' fixes have been made, which could have catastrophic effects on deployment efforts in the future. Importantly, this is not a one-off exercise; the reality of an IT infrastructure is that things will keep changing, so this exercise should be revisited consistently. Using automated tools can make this a very simple exercise. Plan, plan, and plan some more: Depending on the size and complexity of the business, the speed and approach taken for a cloud-first strategy will be unique and on a timeline that fits that business's goals and IT refresh cycles. As business goals and priorities change, it is important to keep the IT strategy closely aligned with them. This, combined with the discovery phase, will help to identify any risk inflexion points and determine how best to mitigate them. Some systems may be too risky to move without running them in parallel first, for example. Though this should only ever be a temporary solution: maintaining two systems over time always leads to a delta between them that can cause any number of problems. Talent and skills: The skills required to deploy and manage cloud systems are different to those for on-premises. They are also skills that are in short supply in the market. Enterprises must consider how they will attract the right people or invest in upskilling the existing team. Not everyone will want to learn new skills. Ultimately, organisations need a strategy for building a team that can deliver the cloud-first strategy, and it's worth investing time and effort to get it right. Planning a cloud strategy The future of IT systems is cloud-first. For many companies, this might be an uncomfortable reality to accept, but setting up a strategy and planning for the future now will minimise the technical debt in the organisation. It will lower CAPEX costs and provide the level of flexibility, scalability, and resilience required for modern business. And it puts enterprises ahead of the game. ### How To Avoid The Expensive Route To Becoming A Cloud-Native Enterprise When big business moved to the cloud, the original impetus was pretty modest: reduce costs. Slowly but surely, it is now seen more and more at board level as a way to increase competitiveness and business effectiveness – becoming a platform for doing more: not just doing the same things cheaper, but doing things you couldn't do before. And of course, that also means learning new ways of working to exploit the opportunities in front of you. That has sparked the creation of what many CIOs now know as the data layer – work that needs to be done to ensure that data, which worked fine on big monolithic SQL databases in the old datacentre, makes a successful transition to this new promised land, and which typically starts as a move to virtualisation, then containers, and eventually adoption of a microservices deployment style. NoSQL was turned to – but unfortunately proved not up to the task In my experience, retail organisations tended to be at the forefront of all this. Why? Because they're the sort of people who need to do lots of different things for lots and lots of customers in a very competitive industry. As a result, many embraced a cloud-native development approach very early, investing big in it because they tended to be, of course, organisations with huge data centres and therefore the most to gain from moving to the cloud and digital ways of working.
But what they have found is that the problem is not the applications using data in the cloud, but the database supplying it… so while the world of application development and delivery has come on in leaps and bounds, the world of the database has been held back, because the monolithic relational databases that were running the whole business just aren't very good at working in the cloud (to make something of an understatement!). Hence the emergence of the idea of a data layer that can deal with distributed transactions, and that can also support other kinds of workloads. I have worked with two separate organisations that had faced just these problems, but for (interestingly) different reasons – and who both found a distributed SQL database to be their best way out of them. (Both happen to be large North American retailers, too, but with very different business models and approaches to the market.) The first example is one where NoSQL had been turned to, but unfortunately proved not up to the task in hand – resulting in a lot of complexity, corrupted data, and dissatisfaction among end-user lines of business. This is a brand that operates a huge e-commerce site featuring thousands of sellers and millions of products, all backed up under the hood by a wide array of applications for handling large volumes of data and managing how products get listed and presented, to give the user a nicely personalised, seamless buying experience. That 'wide array of applications' had been instantiated as a clutch of microservices powered by a cloud-native technology stack exploiting several well-known open-source data and NoSQL systems, all deployed in a multi-datacentre topology using a multi-cloud deployment strategy. So far, so good: this was an excellent internal team with the skills, experience and technology resources to deliver cloud-native applications rapidly… but a major pain point was hampering all its good work: inconsistency in the product catalogue. What does that mean? Essentially, a lack of confidence that what happened at one end of a sale would be properly reflected at the other: a big problem when you have high volumes of updates arriving, all of which need to be reflected consistently, with both sides of the ledger always balancing! The cost of inconsistency was considerable and soon, in the eyes of the business, unacceptable in terms of a) the danger of user dissatisfaction, b) the risk of mismanaged orders, and c) the mounting cost of fixing all this while live (a potential issue for both end customers and partner sellers, remember). The ultimate cause of all this was all too clear: the NoSQL database being used was just not robust enough for the enterprise level of transaction consistency needed at this scale. Instead, the team identified a need for a totally different database approach: one that could deliver strong consistency at scale, offer high availability and strong resilience, and support its multi-datacentre/multi-cloud strategy. I'm delighted to say that my company's technology was the eventual winner here, but it's more important to see why than for us to do any point-scoring: distributed SQL was the only way the developers could see to eradicate the inconsistency issue plaguing their carefully built cloud business platform.
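In practice, what distributed SQL buys you here is ordinary transaction semantics preserved across a distributed, multi-region deployment. The sketch below is a generic illustration and assumes a PostgreSQL-compatible distributed SQL endpoint; the connection details, tables and columns are invented, not the retailer's actual schema.

```python
# Generic illustration of a multi-table ACID transaction against a
# PostgreSQL-compatible distributed SQL database. Connection details,
# tables and columns are invented for the example.
import psycopg2

conn = psycopg2.connect(host="db.example.internal", dbname="catalogue",
                        user="app", password="secret")
try:
    with conn:                       # commits on success, rolls back on error
        with conn.cursor() as cur:
            # The product record and the regional listing change together:
            # either both updates become visible, or neither does.
            cur.execute("UPDATE products SET price = %s WHERE sku = %s",
                        (19.99, "SKU-12345"))
            cur.execute("UPDATE listings SET price = %s, updated_at = now() "
                        "WHERE sku = %s AND region = %s",
                        (19.99, "SKU-12345", "eu-west"))
finally:
    conn.close()
```

The point is not the specific driver but the guarantee: with strong consistency at the database layer, application code like this needs no compensating logic to keep the two tables in step.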
(Not to get too technical, but this was determined by two proofs of concept: one making sure a unique product catalogue identifier approach worked, and one on a mapping service that properly supported lookups across various key identifiers in different services and systems.) As a result, that consistency bugbear has been totally quashed, and this user is now enjoying very high-volume ACID (atomicity, consistency, isolation and durability) transactional consistency on multi-table transactions across a multi-region deployment, as well as the elimination of product catalogue inconsistencies and the ending of any remediation work to restore consistency. Sub-10 millisecond performance – and no need for any kind of expensive database transaction management I mentioned above that there were two examples of customers I've seen faced with the cloud business database problem: let's now talk about the second. In this instance, despite a lot of investment into cloud-native digital business infrastructure, the team couldn't escape single-site data layer implementations without a lot of sharding or replication that introduced complexity, cost and what the team leader expressed with some frustration to me in an email as "brittleness". The context here is that, again, the company had made the commitment to move to a microservices architecture to make it easier to deliver incremental improvements without risk of disruption to its existing heritage systems and processes. But it felt it couldn't fully pull the trigger and move to its new model until it was sure it had the right database ready to plug in – one capable (again) of supporting distributed ACID transactions, scalability, multi-region active-active deployments, multi-API support, and automatic data sharding. And I'm delighted to say that our software met these requirements, and as a result, the company has turned on its cloud-native back end and, even better, is seeing sub-10 millisecond performance, with no need at all for any kind of expensive and fiddly database transaction management. What's the takeaway from these two different, but in many ways parallel, data layer experiences? Both were big, superbly competent retailers with big datacentres and all the rest of it… but both recognised in their separate ways that with the advent of cloud, not only would new ways of working with data be needed, so would very careful evaluation of what these new technologies could do for them in the cold light of the transactional, data layer 'day'. But perhaps more critically, that you just can't run any risk of losing database transactional consistency as part of a move to cloud – and unfortunately the first generation of data products just couldn't deliver that, though luckily all of that has finally started to change. ### DRIVING THE RIGHT SECURITY OUTCOMES FOR GDPR https://vimeo.com/637871723 Celerity Presents: Driving The Right Security Outcomes for GDPR The ICO released an update relating to Article 32 of the General Data Protection Regulation (GDPR). The speakers will explain the implications for businesses securing data from cyber-attacks and what can be done to bolster cyber postures that protect business-critical data and comply with GDPR. ### Taking the Cloud Concept to the Edge The combination of dispersed working and the exponential growth of IoT devices is changing the working landscape.
As businesses require more reliable and faster access to data and applications, many have adopted cloud computing architectures, where data is processed in a centralised location such as a data centre. The adoption of public cloud services such as AWS and Microsoft Azure have typically enabled organisations to benefit from agility, efficiency, and enhanced security. Edge computing takes the basic principles of cloud access and transforms it for the new age. However, user expectations for rapid performance are growing, regardless of location, which is changing infrastructure requirements. While the public cloud is still a crucial piece of the puzzle, the diverse working practices of today are leading to the deployment of multiple clouds to support multiple locations, thereby creating a need for applications to sync and interact much more rapidly. Centralised models lack sufficient speed, creating network latency issues. Businesses, therefore, need a solution that keeps latency as low as possible, particularly with the evolution of IoT and 5G, which will facilitate new applications that place greater strain at the point of use.  Rather than centralised connectivity via a data centre, the edge is focused on decentralised strategies which enable important data, applications and content to be processed and managed closer to the end-user. Crucially, this allows for faster access and lower latency, meaning higher productivity and efficiency and support for emerging technologies such as AI and machine learning. Additionally, sensitive data is able to be processed locally, separate from the cloud, and sovereignty of data that needs to remain in the UK can be ensured. However, while the cloud market and available services have been long established, many regional enterprises and service providers are still unsure of how to access the edge opportunity. The role of edge computing With the cost of public cloud escalating, an increasing number of organisations are turning to multi-cloud, where multiple services are utilised in a single architecture, or hybrid cloud environments, which allows data and applications to be shared between both public and private clouds.  An interesting concept when looking through the lens of a hybrid cloud setup is the tendency for some workloads to run in the public cloud and others to be hosted in local data centres for security or privacy reasons. This, in effect, allows data processing to take place in data centres closer to the end-user than centralised cloud locations. While bringing workloads closer to users is not the typical reason for a business choosing a hybrid cloud solution, this same concept applies to the edge and how it fundamentally works. But while the benefits of edge are clear, many businesses don’t know how to implement it within their business and are constrained by concerns over cost, complexity and access. So how can organisations make the edge a reality? Connecting the nation When it comes to accessing the edge opportunity, businesses will only realise the benefits if the right national edge infrastructure is in place. What is needed is a grid-like architecture to enable data to be processed close to edge devices and bridge the gap between centralised platforms and micro-edge. The development of a nationwide edge computing platform will be fundamental to delivering critical connections to digital ecosystems, including distributed multi-cloud models, via strategically located regional data centres. 
This will enable data to be processed more quickly, regardless of where the end-user is located, and for regional businesses and service providers it would be a game-changer, allowing them to take advantage of edge through one unified network with reach across the UK. A nationwide platform is also particularly beneficial for regional businesses that have typically been underserved by suitable infrastructure and connectivity due to their lack of proximity to technology hubs in London and the South East. Many have been at a latency disadvantage as network traffic has to travel further. However, by spreading the load across three or four regional edge data centres, rather than through one central hub, businesses can enable data to be processed closer to each end-user, reducing congestion on the network and improving latency. Choosing the right edge partner As with any new technology, choosing the right partner will be key to success. However, with so many vendors offering edge services and solutions, it can be hard for organisations to know where to start. Arguably the most fundamental ingredient will be the data centre locations, so organisations need to look for a provider that can offer a good geographical spread across the UK – not just metropolitan areas – along with maximum coverage and seamless connectivity to cloud services. Few networks are capable of the fast data transfer speeds and ultra-low latency needed for edge computing, so organisations need to consider whether the provider possesses sufficient capacity to match their bandwidth demands both today and in the coming years. It is also key to ensure they have full route diversity around the UK to guard against the impact of any major fibre outages. Finally, when it comes to continued use of the cloud, considering a provider that can offer true wrap-around support and connectivity between cloud and colocation services can enable true resiliency. Unlocking the edge opportunity Ever-expanding volumes of data and growing numbers of devices have put pressure on response times and transfer rates that are critical for today's advanced applications. As a result, edge computing is set to become the new standard across enterprise IT. However, when looking at the relationship between edge computing and cloud computing, pitting one against the other is not the answer. While cloud and edge computing are distinct in how they work, edge computing will enhance cloud architectures by encouraging decentralisation and shifting resources closer to end-users. Combining the best of edge and cloud will be the most effective strategy for many businesses, but this will depend on how organisations harness the opportunities that the edge can provide. And for regional businesses and service providers, a nationwide edge computing platform might just provide the key to unlocking the edge opportunity. ### Disruptive Live Announces Seasons Winter 2021 Find out more here and apply to view or register as a speaker. https://disruptive.live/virtual-events/disruptive-seasons/ https://player.vimeo.com/video/636110437?h=69b6e162eb&badge=0&autopause=0&player_id=0&app_id=58479 ### Looking After the Pennies: Remote Accounting Likely to Fail Without Cloud Computing The recent shift to remote working has caused a range of technical issues for many businesses. One issue causing particular exasperation is the way accounting systems often struggle under the pressure of multiple users accessing them remotely.
Generally, accounting systems are designed to cope with a finite amount of data and basic transactions. This has meant that many of these systems have begun to slow and become difficult to access as they attempt to deal with unprecedented levels of traffic coming into them from remote sources. To maintain financial health, businesses need to react and adapt to new technologies quickly and effectively if they want to succeed in the ‘new normal’ corporate environment. A fact especially true for expanding companies on the verge of outgrowing their current accounting system.  Accounting system providers at the forefront of industry innovation are constantly investing in new processes and technology, and in none more so than cloud computing. A modern and sophisticated cloud accounting system can help businesses work smarter, faster, and in a more integrated way, regardless of how many of the workforce are operating remotely. Finding the suppliers of such systems though can be an arduous task.  In this piece, we discuss the tell-tale signs that a business is partnered with a supplier that is behind the digital curve; clinging onto legacy technology and dated business models. We will also look at the benefits of cloud accounting to support new ways of working and what businesses should consider when seeking a new supplier. Spotting the symptoms of a failing accounting system For the majority of businesses experiencing friction with their accounting systems, the phenomenon has been relatively recent. In the early months and years of trading, the starter software they had invested in performed sufficiently well, required as it was to only manage relatively simple accounts and provide day to day information. However, for those businesses whose revenues, client bases, and workforces all grew, so did the pressure on their entry-level systems. As with the human body, with growth came pain. These pains include but are not limited to: Data corruption: A common consequence of an overloaded, legacy system that was never designed to be scalable. Occasionally, suppliers can help retrieve data but, where it must be done on a regular basis, it is both expensive and inconvenient.  Unreliable data extraction: Unable to access the required information, businesses are forced to purchase additional, third-party software or use time-consuming alternatives. Excel is a common tool for removing data from the main system so it can be worked on in isolation before being returned but often leads to multiple errors, along with security risks. Poor performance: As legacy systems struggle to process heavy traffic of large files; overall performance begins to suffer. The only solution is to extract data that, as noted above, comes with its own raft of problems.  Overspending: With key staff unable to access the latest budgets, overspending becomes commonplace. Moreover, as potential investors cannot readily acquire a transparent view of the accounts due to inadequate drill-down capabilities, a subsequent lack of trust in the accounting system may lead them to walk away. Businesses simply need more from their accounting systems Businesses experiencing any of the symptoms outlined above should take comfort from the fact that each is a signal of a well-performing business. But, to further enhance growth, businesses mustn’t just look to eliminate these pains, rather they must look for solutions that bestow greater capability.  
Ultimately, businesses must appreciate that if they're not working at maximum efficiency, many of their competitors will be. They also need to understand the risk that suppliers failing to evolve with market forces are often themselves phased out of the market, leaving their customers with no supplier and an inventory of outdated kit.

The proficiency and reliability of an accounting system can be assessed by looking at its suite of features and methods of working. For example, today, changes to data can be updated instantly, in real time, from any location and across all systems through the power of transactional interoperability. Suppliers unwilling to offer this capability hinder their clients' progress, and that should serve as an early warning. The sheer flexibility of modern systems also means that data can be shared freely with all other business systems. With integrated IT systems, businesses can free up finance personnel from mundane, manual tasks so they can concentrate on strategic, commercial objectives.

Other tell-tale signs include a need to rely on one person in the finance department to sign off every process. Modern technology can clear the logjams that result from this by giving certain staff tailored access to the finance system. Through these devolved technologies it is possible to transform processes across the business, increasing speed, reducing workloads on the core finance team, and allowing senior managers to access their own figures and make instant, informed decisions, without having to grant them access to areas you don't want them in.

Then there's the fact that required information can be accessed from anywhere. Businesses moving their financial systems to the cloud gain easy access to crucial data from any location via a web browser, accelerating decision-making and processes across the board. Budget holders are suddenly able to approve orders from their smartphones and sales teams can update their expenses and revenues via tablets. The result is teams that work faster and more efficiently from financial data that is always current, which is particularly important in the ever-evolving flexible working landscape.

Choosing the right cloud system for your business

In what is becoming an increasingly crowded market, choosing the right cloud supplier to support your accounting system can feel a little overwhelming. Nevertheless, there are certain features you can look for in a partner to help you arrive at an informed decision. For starters, ask any potential supplier about their current clients. How many are using the cloud and why? When did their first cloud customer go live? When did they migrate from on-premise? What business benefits are they now enjoying? The answers to these questions will provide a rounded picture of why the supplier created their solution, and what their journey with the cloud has been like.

From here, uptime becomes critical. If a supplier has been providing cloud services for a while, they should be able to immediately communicate their uptime statistics and monitoring processes. It's also worth looking into their policy on updates: how often they're installed and what kind of impact they might have on your business processes. Of course, underpinning any cloud supplier's proposition is the technology behind it, so it's vital to check you're getting access to the industry's best. Reliability, scalability, security, and user-friendliness are all critical components of any cloud offering, and you need to know the details for each.
The best technology in the world is useless though if it’s stored in a wooden hut so find out where your cloud will be housed and assess its physical security. The likes of bomb-proof data facilities are becoming standard, so if a supplier can only offer something less robust, it’s worth considering whether you’d feel confident with them hosting your financial data.  Technology is moving with the times, are you? Today, accounting is completed remotely in a way not witnessed in human history. This is not something to be afraid of. Indeed, for the flexibility it affords, it should be embraced. However, remote accounting can only ever be as efficient as the technology that powers it. Bolstered by solutions that are accessed across the latest cloud computing software, the practice will only continue to get faster, more versatile, and more productive. It will also increasingly expose late adopters and, for some, the exposure could be harsh enough that it leads to failure.  ### Disruptive Seasons Autumn Event 2021 - Day Two Evening Session Welcome to the final part of Disruptive Seasons Autumn 2021! This event is showcasing some of the up and comers, as well as some of the giants in the tech and business world. Watch this segment for the chance to see a presentation from IBM’s David Spurway, Thomas Vosper from aisle 3...and much more! https://vimeo.com/618429912/c52a9e2b44 ### Disruptive Seasons Autumn Event 2021 - Day Two Afternoon Session Welcome to the penultimate session of Disruptive Seasons Autumn 2021! This event is showcasing some of the up and comers, as well as some of the giants in the tech and business world. Watch this segment for the chance to see a keynote from Tom Richards from Northdoor as well as Tom Shrive from AskPorter...and much more! https://vimeo.com/618428154/15bbc177dd ### Disruptive Seasons Autumn Event 2021 - Day Two Morning Session https://vimeo.com/618413217/d180645ed0 ### Disruptive Seasons Autumn Event 2021 - Day One Evening Session https://vimeo.com/617351351/7c77740509 ### Disruptive Seasons Autumn Event 2021 - Day 1 Afternoon Session https://vimeo.com/617350075/cb575c6091 ### Disruptive Seasons Autumn Event 2021 - Morning Session https://vimeo.com/616940899/e936ff50b0 ### Disruptive Live Launches Brand New Quarterly Big-Ticket Showcase To Register to speak or find out more, click the link below! https://disruptive.live/virtual-events/disruptive-seasons/ Airing 29th & 30th September 2021 - A brand new quarterly showcase of  Disruptive Live and other big players in the tech industry. Disruptive Live is launching a brand new quarterly showcase, airing at the end of the quarter. The show will be in different formats including keynotes and panels, with multiple different hosts . The show will be available on all major platforms including Linkedin Live, Roku, Amazon and more. ‘Disruptive Seasons’ will be the next big show to hit the community.  “The technology industry is ever-evolving, Disruptive Live know this and are with the times and producing some really first-class stuff.” A member of the Disruptive team explained. “The innovation coming from Disruptive Live needs to be done justice.” The Disruptive team member continued “The tech and business world is all about having something new, impressive and sometimes on a grand scale. We want to channel that into a show with those same roots, really express the innovations around us every day. 
You won't have seen anything like it yet!" Not only will this show take the industry by storm, but we also have some huge names in line to take the stage. It will contain the exclusive insight that everyone craves and some stunning visuals. You'll be able to access all of this amazing content through the Disruptive Live app, so you won't have to miss a minute! Do you, or anyone you know, have what it takes to stand up and talk on our Virtual Stages in an event broadcast to thousands of viewers? The Disruptive Live app is free and available for download now from the following app store links or by searching for "Disruptive Live" in the app store on your device. iOS and Apple TV - https://apps.apple.com/gb/app/disruptive-live/id1535818369 Amazon - https://www.amazon.co.uk/Compare-the-Cloud-Ltd-Disruptive/dp/B08LH1TNBW Android - https://play.google.com/store/apps/details?id=com.maz.combo3344 Roku - https://channelstore.roku.com/en-gb/details/3d07000543f962995ab95d584587b46b/disruptive-live

### The New Ransomware Reality

As the headlines showed this year, the pandemic has been a breeding ground for ransomware attacks. Since moving to decentralised systems, businesses have been targeted by ransomware regularly. Ransomware has become such a frequent occurrence that many now say it represents the biggest threat to online security for most individuals and businesses in the UK. Unfortunately, the situation is getting worse, and hackers have only just begun. Whether they are state-sponsored attackers or ransomware-as-a-service practitioners seeking a profit, cybercriminals are attacking the highest-profile targets they believe they can penetrate and that will pay.

A global threat

From hospitals and schools to infrastructure and government sites, no one is immune from ransomware. Because of this imminent threat, ransomware is no longer just a single organisation's cybersecurity issue - it has become a global security risk that can impact public safety. Recent attacks have reinforced this point. In 2020, we saw the first case of a fatality being linked to ransomware. An attack on a German hospital infected 30 of its servers, causing the hospital's IT systems to gradually crash. As a result, the hospital wasn't able to provide emergency care and had to reroute a patient to another hospital. The patient tragically passed away shortly after. In May, an attack on the Colonial Pipeline shut down its systems for days, causing widespread panic in the United States over fuel shortages. Temporary outages were reported in 11 states, with many governors declaring states of emergency. If these attacks tell us anything, it's that ransomware is not only disrupting businesses but impacting people's lives and their access to critical services.

Putting an end to a continuous cycle

When ransomware strikes, organisations often end up paying the ransom to prevent further disruption or, in some circumstances, even risk to life. This year we've seen this alarming trend reach new heights with some of the biggest ransomware payouts in history. Examples include the CEO of Colonial Pipeline paying $4.4 million in the hope of getting systems back online. Meanwhile, chemical distribution company Brenntag paid the same amount after experiencing an attack from the same ransomware gang, DarkSide. However, the largest known payment thus far has come from insurance firm CNA, which paid $40 million. Security practitioners have continued to say "it's not a question of if, but when" every organisation will be impacted.
The only thing left to decide is how organisations will meet this moment.   Being prepared is key  The more an organisation prepares and is able to quickly respond when an attack occurs, the less damage there will be to systems, finances, and its reputation. The first step in ransomware protection is data segmentation. Whilst an organisation would not want any of its data compromised by ransomware, the reality is some data is more valuable than others. With a rise in exfiltration attacks, it is critical that businesses are more vigilant in keeping their critical data away from hackers. Therefore, businesses should separate their data into defined buckets and understand what needs additional protection. In order to further reduce the attack surface, maintain data lifecycle and retention policies. This will also allow an organisation to more easily maintain compliance with data privacy regulations.   When attacked, a business must be able to recover its data quickly. This means they need to manage all of their data by backing it up automatically and securely. Backup security is also paramount. Ransomware hackers target backup systems because they are a business’s last line of defense and a central pool of all data. Utilising a cloud-based solution can be an effective and powerful way to protect these backups. Don’t forget to frequently run test restores of those backups as part of an active disaster recovery mechanism.  A key step that should not be overlooked is to educate teams on cybersecurity hygiene. Employees need to be made aware of common security threats such as phishing and know how to avoid them. Every employee should complete cybersecurity awareness courses on at least an annual basis; even more successful programs weave training throughout the year with simulations. The road ahead It’s clear that ransomware has become too large of an issue for one organisation to overcome on its own. But whilst hackers will continue to relentlessly extort and exploit companies, progress has been made. Efforts such as the Ransomware Task Force have been a great first step at addressing this issue. The coalition published a report earlier this year that advocates nearly 50 interlocking government and private sector strategies to tackle the criminal scourge. And there are steps we can all take. The more businesses that have tools readily available to recover from an attack, without giving in to ransom demands, the less incentivised hackers will be to make their strike. By understanding and segmenting data, securely protecting it in the cloud, and boosting cybersecurity awareness across organisations, ransomware can become less of an imminent threat, and just a manageable inconvenience. ### Ansible Automation Over the past couple of years, and especially during the pandemic, we have been approached by a number of clients looking to adopt more automation in the way they administer and deploy their IT estate.  As a result, I want to address some of the questions that arise when considering the adoption of automation.   
In terms of why businesses are increasingly turning to automation, a number of factors are driving the trend, for example:

- Meet end-user and line-of-business expectations of 'on demand' IT: where historically a lead time of days or weeks for a new environment was accepted, cloud adoption has led today's business users and end-users to expect near-instant access or self-service provisioning.
- Free up IT resources from menial tasks: by removing the repetitive grind, the IT team can concentrate on new initiatives to drive business value.
- Provision complex environments without consuming valuable IT resources: as environments grow increasingly complex, often involving multiple technologies and skillsets, automation allows these tasks to be completed without tying up one or more highly skilled consultants.
- Achieve consistent, repeatable outcomes without manual intervention: whilst many organisations produce a runbook so that any member of an IT team can complete a task, there can often be nuances or discrepancies in how the task is carried out. Through automation, the task gets completed the same way, every time, removing the chance of user error.
- Manage growing environments with the same or fewer staff: whilst most IT environments are growing, either organically or via the adoption of containerised microservices to add business value, the IT organisation almost certainly isn't. As a result, teams need to work smarter, prioritising the important tasks and offloading routine jobs to automation.

When considering automation, most clients think of Ansible, but for many this is a new technology in which they have little or no experience. We appreciate that in IT, as in many professional disciplines, there is often an unhelpful tendency to avoid the simple questions, even though this basic understanding is a critical foundation for everything that follows. As a result, we normally start the conversation at the ground and build up, beginning with the most fundamental question of all: 'What is Ansible?'

What is Ansible?

In a nutshell, Ansible is a very simple IT automation platform that makes your systems and applications easier to deploy. Ansible is open source and can either be used for free or with a commercial support agreement from Red Hat. This gives the option to try out Ansible's capabilities for free and to enrol in a support agreement if the decision is taken to roll the platform into production. There are no agents to deploy to or manage on end systems, and it provides highly flexible and configurable systems management, with rollback capability in the case of configuration error. A key concept in Ansible is the Playbook: a workflow of tasks that Ansible will execute to complete an activity, such as patching a system or provisioning an environment. Ansible Tower is available on a commercial subscription from Red Hat and extends the value of Ansible by providing a user-friendly graphical user interface. This GUI centralises the inventory, provides a health and activity dashboard, and allows execution of Playbook activities.

Why Use Ansible?

Due to its highly flexible and configurable nature, Ansible can be used to address a number of challenges. The key Ansible use cases include configuration management, application deployment, continuous delivery, provisioning, orchestration and security automation.

Can I use Ansible with my AIX and IBM i estate? Yes - all POWER, Z and x86 operating systems are supported Ansible endpoints.
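To make the playbook concept above a little more concrete, here is a minimal sketch of how a playbook might be driven programmatically from Python using the open-source ansible-runner package; the directory layout, playbook name and inventory are hypothetical placeholders rather than anything prescribed by the article.

```python
# Minimal sketch: launching an Ansible playbook from Python with ansible-runner.
# Assumes `pip install ansible ansible-runner` and a project laid out the way
# ansible-runner expects, e.g. /tmp/patch-demo/project/patch.yml plus an inventory.
import ansible_runner

result = ansible_runner.run(
    private_data_dir="/tmp/patch-demo",  # hypothetical working directory for playbooks and artefacts
    playbook="patch.yml",                # the playbook: an ordered workflow of tasks, e.g. patching
)

print("Status:", result.status)          # e.g. "successful" or "failed"
print("Return code:", result.rc)

# Per-host count of changed tasks, handy for a simple report back to the team
for host, changed in (result.stats or {}).get("changed", {}).items():
    print(f"{host}: {changed} task(s) changed")
```

The same playbook could just as easily be launched from the Tower GUI mentioned above or from a scheduler; the point is simply that a playbook is a repeatable artefact that people and other systems can trigger in exactly the same way every time.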
Ansible can be used to perform routine maintenance operations such as patching, application and program deployment, work management, security management, or the running of ad hoc scripts, CL or SQL for specific administrative tasks. Expanding on administrative activities, Ansible can be used to automate the build, unit test and deploy processes that support continuous development workstreams, and it can be combined with IBM Cloud Automation Manager (CAM) to deploy workloads in the cloud or on-premise. Because Ansible works with heterogeneous environments across platforms, automated activity can occur within a structured playbook to build or modify environments on both Power and x86. As an IBM Platinum partner, a specialist in IBM Power and Storage infrastructure and a Microsoft Gold partner, Northdoor can help guide you through the options open to you.

Summary

We firmly believe that the use of automation is going to play a key role in allowing businesses to improve their efficiency and level of service to both internal and external users. As the high-level overview above demonstrates, Ansible is a very powerful, cross-platform automation tool that can be leveraged to great effect when applied correctly. If you're interested in finding out more, please get in touch. In a typical engagement, the first step would be an introductory workshop to understand the challenges you would like to solve via automation, along with a broader outline of the Ansible stack. Following this workshop, we will propose an architecture to address your challenges - all of our solutions are bespoke to our clients' needs. Should you decide to proceed, Northdoor can implement and proactively manage your Ansible deployment.

### A Tour of Dell Technologies Tech World 2021

Dell Technologies Tech World 2021 has been and gone. It showcased Dell's latest technologies for the next generation of computing, data analytics and artificial intelligence. If you're a tech enthusiast or want to learn about what the future has in store, this event is perfect. The fact that it was online and virtual did not take away from the atmosphere, which was boosted by a stunning performance from Chris Martin. I'm so grateful to have been part of it. In this post I want to detail some of my favourite sessions and what I myself have learnt. I can't possibly cover it all, but I'll try.

How do you empower your remote workforce?

The first and most important way, according to Dell, is to deploy laptops and desktops directly from the factory to your end-users. This is the best way to ensure your users are productive from day one. Dell also has a very interesting program called 'Dell on Demand'. It's an all-inclusive service that provides end-to-end workforce solutions and includes everything you need to get started with IT or refresh your existing organization: hardware, software licensing and deployment infrastructure setup. They have got it covered! Then you must secure valuable data with built-in protection and deploy Dell Data Protection; monitor your organization's environment with Insight and gain new insights from the data you have, to make better decisions on how to optimize the performance of mission-critical workloads; and manage and support devices with minimal IT intervention using Dell Services. The best way to improve performance at home is a virtual desktop infrastructure.
It's the answer to providing employees with the high-end performance to run compute-hungry applications, and providing you with peace of mind knowing your data is secure.

What are the problems with brick and mortar shops?

The main problem that brick and mortar shops have is that they are tied to a physical location. This limits the opportunity for customers to visit, purchase products or services, or return them if necessary. It has become more difficult than ever just to get into one because of how crowded cities are becoming nowadays. In addition, many consumers now rely on online shopping instead of going out, based on the surprising 2020 we have had. There is a solution, however. M3T, in collaboration with Intel, has developed an ingenious solution that solves the problems of traditional retail shopping by combining an onsite experience with the convenience of online shopping. With the M3T solution, customers can experience the next wave of retail through a system of in-store kiosks that quickly find the right product and provide a convenient and secure purchase process. Amazing! Just what we needed. That's just the stuff I came to Tech World for.

What is imposter syndrome?

My favourite part of Dell Tech World came when Adam Grant was joined by Jenn Saavedra. Their piece on imposter syndrome was inspiring. Imposter syndrome is a feeling of fraudulence and inadequacy: you believe you do not belong where you are. It is more common than you might think; almost everyone comes across imposter thoughts at least a few times in their life. And it doesn't come without its benefits. One positive of imposter syndrome is that it can propel you to work harder than anyone else, which for many people is the secret ingredient to success. You push yourself because you feel that your success is not going to be sustainable if people find out the truth about who you are and what you know. Another positive is that it can lead you to 'fail fast', as it were. The important point is that many people give up on what they are doing entirely; the better first step is to give up the way you are doing it, your method. People affected by imposter syndrome tend to rethink their approach earlier, meaning the desired outcome is achieved sooner. Incredible, I would never have looked at it that way!

Wrapping up

The Dell Technology World 2021 event was incredible and there were many great points. My favourite by far was the piece on imposter syndrome, which I've always thought had negative connotations, but in this context it's a positive thing! This conference has made me think about how we should use our strengths to be successful at work and in life instead of being afraid that they're not good enough. This is something I wasn't expecting to come away with, but I'm glad I did. The pieces on smart offices and on how Dell EMC Storage Services for Multi-cloud help you get the most from cloud providers for block and file data also improved my understanding. I can't wait for the next Technology World and hope I can attend in person!

### Establishing One’s Digital Maturity: A Roadmap to Effective Digital Transformation

Digital transformation has been high on the business agenda for many years, with leaders investing heavily in new, promising technologies as part of their overall development plans. This trend shows no signs of slowing down.
In 2019, IDC forecast that investment would reach $7.4 trillion by 2023, with the critical drivers being market competition, rising customer expectations and ongoing business challenges. More recently, Covid-19 has been the most unplanned reason for accelerating most businesses’ digital transformations; out of sheer necessity, leaders have opted for short-term solutions, with the hope of transforming their business as quickly as possible. Naturally, teams will have encountered a number of bottlenecks throughout this transition, as a result of unnecessarily complex processes across the business cycle. These companies have looked for quick fixes merely as a means to keep the business running while their workforce continues to operate remotely. These initiatives lack the foresight of a long-term transformation plan and rarely result in success. In fact, according to Conga’s latest research, only 36 per cent of all projects relating to the transformation of commercial operations are considered somewhat successful. Even prior to the pandemic, businesses would often rush their transformation programmes. Initiatives were often driven by the desire to keep up with competitors or to embrace the latest technology, rather than establishing clear business objectives based on a sound understanding of the outcomes that digital transformation can and should drive. Unfortunately, the pandemic has only accelerated this issue. How to approach a digital transformation programme The first step in formulating an effective strategy is to measure where the business currently stands in its digital transformation journey, that is, what works, and what does not. Before adopting any new technology or starting any transformation initiative, companies need to take a step back and establish their digital maturity. This will likely involve reviewing the whole commercial operational model. By reassessing its suitability, especially in the current environment, leaders may identify unnecessary or complicated processes across the operational cycle which hinder the business’ overall performance. Once they have assessed their business’ overall digital maturity, leaders will have more of an idea of how and where change needs to occur in order to progress and improve the company’s overall operability.  Businesses will have a clear picture of the current state and what the next stage of their company’s digital transformation journey should be, as opposed to simply guessing, or learning through trial and error. Only then can leaders establish clear objectives and a strategy, incorporating new technology to help achieve these goals. The digital maturity model – a phased digital transformation journey  The maturity model framework does not prescribe a linear change programme, where every business begins in the same place and follows the same steps to digital maturity. It is vital that every stage is assessed before any change is implemented to determine the next best steps. By evaluating maturity of any given process or area, businesses can gain a clear perspective on their current state, and how to move it on to the next level of maturity. In the initial stage, companies will likely review key processes, such as managing proposals, issuing quotes or contracts, or even invoicing. Then once they reach the foundation stage, business leaders will start to understand the organisation’s core business logic and identify key data flows and areas for development. 
By which point, the next step will involve reviewing the system and ensuring processes are fully optimised against agreed business goals. Following this, businesses can then consider some level of automation, or at least, the acceleration of everyday business processes, that enable teams to work on and prioritise more important tasks. By the next stage, leaders can consider the possibility of further integration between systems, such as customer lifecycle management (CLM) or enterprise resource planning (ERP) to deliver more multi-channel management. Before proceeding to the final stage: intelligence. This stage allows businesses to utilise advanced insights and reporting for unparalleled sales performance reviews, making sure that all teams are fully up to speed with each customer’s interaction with the business’ different teams. Businesses will of course proceed through these stages of digital maturity at different rates depending on the complexity of their structures, and how many roadblocks they encounter across the business cycle. Undoubtedly, some will have to go back several stages to tackle any issues regarding the business operability or efficiency.  Digital transformation is a process – it never ends  At a basic level, it is all about reconsidering the relationship between people, processes, and data. Simplifying and streamlining a critical business process that touches upon all three areas is a key part of the strategy that makes transformation projects far more successful.  As companies progress along their digital transformation journeys and gain greater digital maturity, they will continue to streamline processes, break down silos, and enable working across team and departmental boundaries. Businesses will become far more agile and adaptable, and able to identify other areas of their commercial operations that may need to be tweaked or fine-tuned, but this is very much a continuous process. No digital transformation journey ever really ends. After all, with flexible working and a hybrid workforce likely in the near future, businesses will no doubt have to consider new workflows to ensure they are better suited for the ‘new normal’. All companies should remain commercially agile. As recent experience has proven, digital transformation and indeed, commercial agility is vital in today’s business world. By establishing their digital maturity, businesses leave themselves better prepared for all outcomes. Hopefully, businesses will take stock of lessons learned over the past year, and as leaders plan the return to the office, whenever that may be, they can draw upon these experiences and establish a much clearer transformation strategy.  ### 5 Tips for Fintech Companies to Handle Compliance The FinTech industry has been developing quite rapidly over the last decade or so, and in so doing, FinTech companies managed to completely reshape the modern financial landscape. Regardless of whether you are making a petty online purchase, paying for a cup of coffee in your favourite coffee place, receiving money from your client, or performing convoluted management-related tasks for your company’s entire financial plan - all these processes are now performed quickly and painlessly thanks to the evolution of FinTech solutions.   However, the Financial Technology environment - in terms of both business and legal aspects - is fairly dynamic when it comes to legal compliances that these FinTech companies need to tackle in order to stay ahead. 
The legal ecosystem of compliance requirements, regulation and potential data security threats is a complex one, which means that FinTech companies must take it seriously and always approach these issues in a professional manner. Even though awareness of how critical it is to have a proper compliance strategy in place is relatively high, there is still an alarming number of organisations (that either belong to the FinTech industry or rely on some aspect of it in order to operate) that lack the knowledge or resources to implement an effective compliance strategy within their workflows and legal structures. This article offers five useful tips for tackling compliance concerns that most FinTech organisations should find helpful.

Bring Your Data Security to Top-tier Levels

Whether we are talking about data that belongs to your clients or employees, or information about invaluable company secrets, protecting these types of information should be one of the highest priorities when running a business. This is especially true for FinTech companies, which need to approach data protection with even more attention to detail, given all the security rules and regulations that often differ depending on the country and/or region in which the business operates. This compliance landscape also tends to change fairly frequently within the same region, making it that much harder for these businesses to comply. For instance, the European Union has its own rules, established and implemented through GDPR across all EU states, while the United States has its own set of rules and laws implemented and managed by the Federal Trade Commission (FTC), the Consumer Financial Protection Bureau (CFPB) and others. This is why FinTech companies, regardless of their location, must tackle data protection with the utmost seriousness and close attention to detail when drawing up their own security policies and data protection layers.

Implement Proper Email Retention for Improved Data Accessibility

A huge part of the aforementioned data protection layers involves email. Email-based information is an enormous part of the data flow of any modern company. Sensitive data circulates through email platforms on a daily basis, and that data needs both protection and high levels of accessibility. Being able to manage, store, and retrieve critical pieces of data is of paramount importance, especially in scenarios where, for legal reasons, you need to access specific information pertaining to the case at hand. Otherwise, a single lawsuit for which you are unable to produce the appropriate data can severely damage your company, in terms of both ROI and overall reputation. A great way to take care of these issues is to have your own email retention policy and an optimal data archiving plan. This can be very helpful for the proper handling of data security and accessibility (especially in terms of how long certain files are kept available), particularly if an organisation must quickly get hold of data requested by a regulatory body.

Get a Firm Grasp of Payment Protection

Since modern payment methods are the lifeblood of both business and the current global economic landscape, making sure these systems are regulated and well protected is the job of numerous regulatory regimes.
These regulations vary by geography, and one of the most talked-about is the second Payment Services Directive (PSD2), which regulates these payments across the EU. Preserving payment security and consumer privacy is among the basic goals of PSD2. The directive also aims to design and implement a regime of rules that is more balanced and user-friendly, without jeopardising the necessary strictness.

Peer to Peer (P2P) Lending

In the UK, peer-to-peer lending is regulated by the Financial Conduct Authority (FCA). The FCA requires P2P lending services to inform investors about any potential risks, and it provides fairly detailed and granular guidance on this aspect of overall compliance. In the US, however, peer-to-peer lending has slowly become a somewhat controversial matter, especially after the numerous fraud-related cases that have occurred in China over the years. In light of these events, countries around the world have been renewing their P2P rules in order to close potential gaps and tighten up regulation. FinTech organisations should ensure they are complying with all the necessary peer-to-peer lending rules in the regions in which they operate.

Mitigate Any Fraud-Based Activity and Money Laundering

FinTech companies are, unfortunately, often at the centre of fraud of various magnitudes. FinTech products and services can be exploited by fraudsters, criminals and terrorist groups to commit offences such as money laundering, identity theft and the financing of terrorist attacks. This is why companies should inform themselves about all the rules and regulations that deal with these malicious activities and opt for solutions that secure both customer data and the information belonging to the company itself. Businesses operating in the UK may be subject to oversight from the Financial Conduct Authority (FCA), Australian organisations from the Australian Transaction Reports and Analysis Centre (AUSTRAC), while US-based companies are overseen by federal regulators including the Office of Foreign Assets Control (OFAC), the Financial Crimes Enforcement Network (FinCEN), and the Securities and Exchange Commission (SEC).

### Thinking cloud? Think again.

It might have been around as a concept for more than a decade but, for many organizations, cloud computing is now becoming core to the way they actually deliver their ‘business’. Others, however, are joining the ‘cloud club’ - often, it seems, without really knowing why. They just want cloud. The problem is, they are finding it is not as simple as they thought. It's not just the next place to ‘run your stuff’, but a whole new world of opportunities. Cloud should not be about repeating the mistakes you already made, only cheaper and somewhere else; it should be about learning from the past, righting those wrongs and doing things differently. The bottom line is that nearly everyone is behind the curve on using the cloud and the opportunities it presents. In the EU, only a third of businesses are in the cloud. Given the seismic acceleration of the digital economy in 2020, it is inevitable that this percentage will change, as businesses try to create new, app-first business models.
They’re doing this to capture market share and building stronger relationships with customers; cloud will play a major role in that success, enabling the right apps to run in the right environments, to deliver the right experiences.  It’s the way the world is going: digitally transformed organisations are projected to contribute to more than half of the global GDP by 2023, accounting for $53.3 trillion. But adopting cloud solutions and services without really knowing why you need them for your IT, and more importantly, what they deliver for the business, is probably not the right strategy.  Simply wanting to join the club is not the starting point. Without knowledge of what it can do for your business, it will never live up to its ideal, but simply be a place to repeat mistakes in a slightly different way. So, if you are thinking about ‘going to the cloud’, think again.   From cloud-centric to app-centric Today, businesses must recognise the problems they are likely to encounter tomorrow. They need to understand what the digital and application landscape is going to look like in three to five years and make decisions accordingly. Data is going to be everywhere, with the volume created, captured, copied, and consumed annually worldwide set to more than double in the next 3 years. And increasingly, it is going to be residing anywhere but the traditional datacenter.  This is going to have a major influence on how companies engage with customers, and in turn how they use apps to facilitate these interactions. That’s why rushing into cloud now could well hamper how effectively business-critical apps that need to use distributed data in the future can operate. An app that does great things now, for example, is going to need to adapt over time, and the environment it is hosted in will need to do the same. An app that’s been deployed somewhere just because it’s ‘in the cloud’ could well struggle to meet those requirements. So ‘app first' is your first strategic guiding light.  Then there’s the challenge that, even as they rush to the cloud, many organizations are still unpicking their legacy infrastructure, technology they’ve probably held on to for longer than they should because, frankly, it was so resource-intensive implementing it they didn’t feel they could cut their losses earlier and didn’t have an exit plan. From spiraling costs to deploying environments without first considering how apps are going to use them, there is a real risk of experiencing the same problems that legacy IT created, just somewhere else this time. Ultimately, however, the biggest issue is that there isn’t one silver bullet. When we talk about cloud, we’re actually talking about a whole variety, from companies running apps on-prem in their data center, through software-as-a-service (SaaS)apps like Salesforce and Workday, even app components as a service like communications, to the public ones of the likes of Google and Amazon, all the way to highly distributed ‘edge clouds’. There are even sector-specific clouds, such as the financial sector developing a FinTech cloud, with all the compliance and security controls required for a highly regulated sector. CIOs expect the number of clouds – private, public and edge environments – they use to build, manage and run their apps to increase 53% in the next three years. This proliferation brings with it complexity – a VMware study noted that 63% of organizations state inconsistencies between clouds as one of the top multi-cloud challenges. 
To business-centric, for any app, over any cloud This is why it’s so important to understand what cloud will do for the business as a whole, and specifically how the business can use cloud services to develop, modernize and deploy apps as fast and securely as possible. Just imagine what a business could achieve if it didn’t matter what cloud their apps were in because they could use the same tools to manage everything?  What you should be doing now is “Application Portfolio Planning”.  That’s looking at every application and working out the roadmap for it – does it stay? Do you rewrite it locally or in the cloud? Is the answer to migrate it to run on a cloud? Or just bin it and move to SaaS?  Don’t plan based on where something is or what it runs on – that’s bottom up. Plan based on what it does now and what you want it to do in the future. Why? Because it is the app that is the gold in the digital economy. Just think, the number of applications has increased six-fold in recent years, and by 2024 more than three-quarters of a billion applications will be developed. The cloud is thus the developer platform, the security blanket, and the enabler. The foundation both for modernising existing apps and developing tomorrow’s next-gen applications.  That’s what cloud should be about. Providing the environment and the options CIOs need to drive the full-scale modernization of their business, deliver the operational capabilities demanded of digital businesses, and build and host applications in a way that is future-proofed. That means delivering modern apps at the speed the business demands; operating them across any cloud, with the flexibility to run apps in the datacenter, at the edge or in any cloud; and driving rapid transformation while delivering resiliency and security, in what is an unpredictable world. This is all made possible through a single platform optimised for all applications: virtualised and containerised, artificial intelligence and machine learning, traditional and modern. A platform that can be used across all clouds, from private to hyperscaler, with consistent infrastructure and operations, thus reducing complexity, risk and total cost of ownership.  But businesses need a fast and simple path to the cloud, and the flexibility to choose any cloud. Organizations can match the needs of each app to the optimal cloud, with the freedom to use the most powerful cloud services and app modernization. And to be able to deliver cloud migration without the cost, complexity or risk of refactoring.  The right environments for the right apps But what does that look like in practice? Here’s a handful of scenarios: You’re a public health provider, with a variety of systems, services and applications that serve patients and employees. Some services, such as appointment booking apps, the front ends of digital workspace platforms and entertainment networks for in-patients, might quite happily sit in a public cloud; but patient data, the interface that controls and secures your connected generators and other utilities, may need to be on-premises, some in edge devices, with much higher levels of security. You’re a multinational bank, one of the new challenger brands. Your whole selling point is that you are customer-first, with an intuitive app that acts as a window to consumers’ financial services. Again, to manage scale and agility, the front-end of that app will happily sit in a public cloud, but the data it collects may need to be securely stored in either a private cloud or on-premises. 
It may need to be the latter to comply with financial regulatory and compliance legislation, along with legislation surrounding data sovereignty. You're a retailer gathering lots of customer data. You're gaining insights from it in the cloud with large-scale ML, and using that to train artificial intelligence models that run in-store to interact with shoppers' mobile phones in near real-time, providing help, advice and guidance (such as price comparison) to customers. Imagine being a logistics provider with a new eCommerce service that allows customers to book online. During the traditional peak season, you manage the increased demand by moving the front-end services onto a public cloud, ensuring you have the speed, scale and resources to stop the app from toppling over. Once the demand drops off, you scale the app back into a private cloud where you can better control the operating costs, while still keeping the service available to customers. Maybe you then consider moving it all to the cloud in future; or maybe it's already all in one cloud, but that cloud provider is now suddenly a competitor and you want to relocate it to a different cloud.

These are all examples of how businesses are using the array of environments at their disposal to operate more effectively. While there's always going to be variation between different businesses, having a cloud foundation that provides the freedom to pursue a true multi-cloud strategy, running apps on more than one cloud, is the goal. And it is the hybrid cloud, providing consistent infrastructure, management and operations between private and public clouds, where apps can easily be moved with no modifications, that will deliver that vision.

A hybrid future

This hybrid cloud future is unavoidable. Yet those businesses that are blinded by the lure of the cloud could end up recreating many of the issues they were attempting to get away from. Your business goals should always be your starting point for any project. Then you should assess how a digital strategy will deliver them, underpinned by the right environments, including cloud. A mix of clouds that are transparent, secure and potentially interchangeable, to meet the needs of apps, and more importantly users, both now and in the future, is essential. The cloud should be about liberating and elevating your applications and their architecture, not just building them again in a different silo somewhere else. The businesses that take this approach to the cloud will be the ones that build the environments their apps need, deliver the experiences users are demanding, and ultimately take advantage of the data-driven economy. Thinking cloud? It's time to think again.

### The Distinction: Data Privacy versus Protection

Data breaches are a source of problems for both companies and their customers. When data is exposed to unauthorized access, it can cost a lot of money to recover from, and full recovery is not always possible. The situation can also sully the company's image. The stolen data can vary from innocuous information to highly personal details, depending on the company affected. Yet despite how often one hears about such breaches and the damage they cause, many individuals still do not understand how their data is stored, nor do they know the difference between data privacy and data protection. But the distinctions between data privacy and data protection are vital to understanding how one relates to the other.
One must understand the difference between these terms to avoid confusion about legal obligations and rights. Though they are closely interconnected, data privacy and data protection are not synonyms, nor are they the same thing. Privacy concerns arise whenever personal information that can identify an individual is collected, stored, used or passed from one party to another. Below, we discuss the distinction between data privacy and data protection and the importance of having the right policies, tools, and technologies in place to protect digital assets.

A Simple Comparison

While data privacy governs who has legitimate access to information, data protection focuses on safeguarding users' details from unauthorized use. Data protection is usually a technical issue, whereas data privacy is a process or legal matter. Simply put, data protection is about securing data against unauthorized access; data privacy is about authorized access. Generally, data privacy is a legal issue, while data protection is a technical one.

The User Dictates Privacy While the Company Executes Protection

The primary difference people should understand about data privacy and data protection is who controls what. With data privacy, the user is the one who owns it, defines it, and controls it. The user, as the subscriber, willingly submits his or her data when requested by a company during a registration or a purchase, and can usually control which information is shared with whom. Data protection, however, is the company's responsibility, and it deals with how data is handled and managed inside the organization. When an individual willingly submits his or her data to a company, it is the responsibility of that institution to ensure that the user's chosen level of privacy has been set and enforced, and that the data is protected.

Sold and Stolen Data

Data privacy is about keeping the user's information from being sold or shared by those who have authorized access to it, while data protection is focused on keeping that information away from hackers and cyber thieves. Another way of saying this is that data privacy concerns what companies that have legally gathered your data can and should do with it, and what control you have over that retention and use. Data protection ensures that your information is safeguarded from unlawful access by unauthorized parties. Technology alone is not enough to ensure personal data privacy, because authorized individuals might still have access to the data; at that point it becomes a privacy breach rather than a protection failure. Therefore, companies must have a policy covering what they will do for each type of data intrusion. And though technology cannot be relegated to the background, no technological armour can replace the central role of trust in ensuring data privacy. The same goes for file transfers: because of the way data passes through Web nodes, any server that forwards a packet can, in principle, read an unencrypted message. At some fundamental level, there is no privacy and almost zero security for anything sent unprotected across the open Web.

Enjoying Data Privacy Does Not Guarantee That You Have Data Protection

When distinguishing between data privacy and data protection, it is necessary to note that you cannot ensure data privacy unless personal data is protected by technology. If personal information is stolen while you are submitting it, data privacy has been breached. Yet personal data can be protected while still not being completely private.
For example, when you swipe your credit card with a service provider, such as an online writing service recommended by writing service reviews, you are doing two things. First, you are trusting the service provider and the payment system with your data protection: they should ensure that unauthorized access to your card information, whether by cybercriminals or other third parties, is not possible. Second, you are trusting that those authorized partners will honour your data privacy by not misusing the information, even though you willingly provided it to them.

Addressing Data Privacy

Data protection is the safeguarding of data a company has already obtained. Most individuals, however, ignore the fact that there has to be a privacy process before the question of protection even arises: you need to ask yourself whether submitting your data is necessary before you start wondering about how well it will be protected. Below are ways companies can ensure that data privacy is always respected:

Breach notification: data processors must notify customers without delay once they become aware of a data breach (sensitive or confidential data being accessed in an unauthorized manner).
Right to access: data providers have the right to confirm whether their information is being processed, where the processing occurs, and for what purpose. They also have the right to obtain a copy of the personal data.
Right to be forgotten: data providers have the right to ask the data controller to stop processing and erase their data.

How to Solve the Issue of Data Protection

With cybercriminals becoming bolder by the day, data protection becomes essential. When personal data is in transit, the only mode of protection one can depend on is encryption. With encryption, an unauthorized third party may see the data but cannot access, read or collect it, and with end-to-end encryption only authorized users holding the right keys can get through the privacy shield and access the data. That is about as far as technology can take you when it comes to data privacy and data protection; the rest is up to you. Check any data protection solution you have deployed regularly, to spot errors early.

Conclusion

The difference between privacy and protection can be summarized as "who one intends to share data with" versus "how the recipient plans to protect your data from everyone else." We must understand this distinction to avoid confusion and misunderstanding about legal requirements. Businesses and companies have data protection obligations and should have a legitimate basis for collecting, using or processing personal data.

### 5 Steps to Build a Cloud Strategy on AWS

Are you looking to build a cloud strategy on AWS? Amazon Web Services, commonly abbreviated as AWS, is a secure cloud services platform that provides compute power, database storage, content delivery and other functionality for your business. The worldwide cloud service helps you connect with businesses and clients from all over the world using a Content Delivery Network (CDN). Beyond the connection to the cloud, you can store all your files in the cloud and access them from anywhere, run web and application services in the cloud to host different websites, and send bulk emails to your clients. To get the best out of AWS, you must come up with a suitable cloud strategy. Here are five steps to follow when building that strategy.
### 5 Steps to Build a Cloud Strategy on AWS Are you looking to build a cloud strategy on AWS? Amazon Web Services, commonly abbreviated as AWS, is a secure cloud services platform where you can use computing power, store your database information and perform other content delivery functions for your business. The worldwide cloud service helps you connect with businesses and clients from all over the world using a Content Delivery Network (CDN). Beyond that connection to the cloud, you can store all your files in the cloud and access them from anywhere, run web and application services in the cloud to host different websites, and send bulk emails to your clients. To get the most from AWS, you must come up with a suitable cloud strategy. Here are five steps that you should follow when building a cloud strategy. 1. Have a Vision and Build a Strategic and Tactical Plan The first step in creating a sound cloud strategy for AWS is setting the right expectations and realistic goals. These can be derived from the problems your business faces and from the recommendations of your stakeholders. During this stage, you must clearly define your goals and requirements for the cloud. You should also recognise that building a cloud strategy requires motivation, accountability, and strong communication, so your vision and foundations must be well thought through. 2. Design with Flexibility and Get Help As you design the cloud strategy on AWS, stay flexible. Plan in good time, design in iterations, and prepare for unknown and complex failures. You should also incorporate the technology into the design so that you can easily define your network's servers and specifications. A flexible cloud strategy can be scaled at any time, which saves you a lot of time and money in the future. When your business grows over the next ten years, you will only need to change a few things and can continue using the same strategy on AWS. You can also get in touch with AWS consultants and let them assist you with the flexibility aspects. 3. Automate Your Infrastructure Technology evolution is one of the best things to have happened to businesses. You must think about how the cloud strategy will handle the data centre, networking, and procurement. Other cloud processes include asset management and RMA, which track your assets' life cycle in the cloud. Use automation so that these processes require minimal human intervention; that way your operations keep running whether or not anyone is watching them. 4. Test the Cloud Strategy Repeatedly As you progress with the cloud strategy and design, you need to test what you build constantly. After each round of testing, note down the faults or issues you experienced and restructure the plan. Before you put it into production, you should have tested it at least 100 times and fixed all the bugs and issues. 5. Migration and Optimisation The last steps are migration and optimisation. Migration means moving your entire business to the cloud. After the migration, optimisation comes into play, helping you transform ways of working, such as access to the cloud and the restructuring and refactoring of applications on the cloud. Conclusion Building a cloud strategy for a business on AWS is a demanding job. It is a journey, and you must follow all the steps without fail. As you build the cloud strategy, be very clear about what you intend to achieve and whether it fits your multi-cloud strategy. If you feel a little overwhelmed, you can always seek help from AWS consultants and get guidance on building a cloud strategy on AWS. ### Making Sense of Cloud: Understanding Multi-Cloud, Hybrid Cloud, Distributed Cloud There are a variety of cloud computing solutions that you can leverage to meet your business needs. The most popular are multi-cloud computing, hybrid cloud computing, and distributed cloud. When deciding which solution to implement in your infrastructure, you need to know how they work and the benefits of each. In this article, we will help you understand the critical information when it comes to making sense of cloud computing. What Is Multi-Cloud Computing?
Multi-cloud computing is similar to what it sounds like and what you’d imagine; it uses multiple cloud computing and storage services in a single network architecture. This architecture has cloud software, assets, apps, and more hosted on several cloud environments — hence the “multi-cloud” part. It can be all-public, all-private, or a combination of both. Multi-cloud computing uses two or more cloud services from multiple cloud service providers.  Businesses decide to choose multi-cloud computing to minimize data loss and downtime. This advantage comes because your cloud system is not dependent on one environment to run. If one ecosystem goes down, you will have another to back up your infrastructure. Multi-cloud systems also increase storage availability and computing power. The recent cloud computing developments have moved users from private clouds to multiple public clouds to ensure a more efficiently run environment. Multi-cloud management requires companies to have mastery in complex cloud management as well as multiple cloud providers.  Benefits of Multi-Cloud Computing A multi-cloud strategy gives you the freedom and flexibility to go with multiple vendors rather than being locked in with one vendor that may excel in one service but come up short in others. With so much data being processed, outages are bound to happen — either due to natural disasters or human error. Multiple cloud environments protect your business by ensuring the necessary resources and storage solutions are always running, limiting downtime. This type of computing can also help organizations with administration, risk management, and compliance regulations. Hybrid Cloud Computing Defined Hybrid cloud computing allows for the communication between the servers of your private cloud with multiple public cloud solutions. Proprietary software enables communication between each service. This strategy allows companies more flexibility between cloud computing workloads as resource needs and costs fluctuate. Hybrid cloud computing services allow organizations to have more control over their data, making it an ideal environment for many companies. This gives businesses the ability to store their data on private cloud servers while also leveraging the resources utilized on their public cloud. Unlike multi-cloud computing solutions, hybrid cloud computing is managed from a single location. This lets administrators control everything on one platform instead of individually managing each platform on a multi-cloud system. Having multiple cloud resources creates more avenues for security breaches, and having one location to manage with hybrid cloud computing prevents this.  Advantages of Hybrid Cloud Computing Hybrid cloud computing gives IT administrators more control over their company’s data. It also offers stakeholders the option to choose the environment that best suits each specific networking need. Most organizations don’t use the same amount of power each day; hybrid cloud computing allows administrators to use “more power” when necessary. For example, insurance companies have open enrollment applications submitted once or twice a year, while the environment supporting this may go virtually unused for 10 or more months of the year. There is no reason to be paying or using a robust environment that is not used for the majority of a year. Businesses who opt for hybrid cloud computing save on costs because of this.  A hybrid system also requires less space than a private model. 
This is beneficial for startups or small companies that can’t afford to invest in a large private data center. What Is Distributed Cloud Computing? Distributed cloud computing is a collection of computers working together while acting as one centralized network. This allows operations to go as expected if one or more devices happen to fail. In a distributed cloud computing environment, the computer systems can be run in the same physical location or in multiple locations that are close together. If the environment is run in multiple locations, their connection will be completed through a local network or by a WAN (Wide Area Network).  Distributed computing can be made up of various configurations, including personal workstations and mainframes. How Distributed Cloud Can Help Distributed cloud computing allows organizations the ability to easily scale horizontally by distributing resources among new machines as workload increases. If one data center goes offline, the others can pick up the slack with distributed cloud, making this system more reliable than others. Distributed cloud provides better performance and lower latency because the traffic will come from the closest data center. This type of cloud system is often less expensive than other alternatives, especially larger centralized systems. This is because a distributed system is made up of multiple smaller computers, making it more cost-effective than a mainframe machine. Distributed cloud breaks data into smaller sizes, which cuts down the time it takes to solve issues within the network. In terms of scaling costs, using multiple devices is more efficient in the long term on a larger scale. Which Cloud Computing Solution Fits Your Needs? After gaining a better understanding of multi-cloud computing, hybrid cloud, and distributed cloud computing, deciding which computing system to go with should be easy. Remember to choose the best solution to resolve your organization's problems. Do your due diligence in researching each cloud computing system and understanding the pros and cons of each before implementing a new environment for your company. ### Enterprise Content Growth Highlights the Value of Cloud-Native CSPs That enterprises have been using ever-growing volumes of content over the past decade is a given. Content is at the heart of digital customer experiences and few successful businesses exist that do not have content front and centre in their strategy.  There is more content in existence than ever before, and industry analyst group IDC has predicted that the volume of data will increase by 61% each year so that by 2025 there will be 175 zettabytes of data in the world. How enterprises manage this data has become increasingly important, a challenge exacerbated by the Covid-19 pandemic that has meant a majority of the workforce is now working from home, or remotely. Each employee needs to access and work with enterprise content many times every day, which has led to an upturn in the use of cloud technology for Content Services Platforms (CSP). IDC has forecast that by 2025 49% of the world’s stored data will reside in public cloud environments and it is clear that enterprises will also be using cloud-native ECM solutions in ever-increasing numbers. The value of content The importance and value of content should not be under-estimated by any organisation. 2020 research from Nuxeo revealed that more than half of consumers (54%) would switch to a competitor if the digital customer experience did not measure up. 
This digital experience includes smart personalisation; up-to-date and accurate product information; and meaningful and hi-resolution product photography – all important elements of content. Furthermore, additional Nuxeo research revealed that UK Financial Services (FS) firms store information and content across multiple systems, many of which are not fully connected. This is having a significant impact on knowledge workers who need to access content as a regular part of their jobs, with the research showing that they are spending almost one hour a day just searching for content because it is not readily discoverable. This is hugely impactful in terms of wasted resources and productivity. Content is intrinsic to a successful business in 2021, both in terms of creating a good customer experience and improving internal operations, so organisations need to find ways of connecting people to the content they need to do their job effectively. With so many employees working from home because of Covid-19 – and when things eventually return to normal, we will likely see a good number of those continue to do so – connecting people to content grew much harder. Going cloud-native Most enterprises understand that the cloud enables them to innovate faster, modernise ageing infrastructure, scale globally, gain new insights from their information, and create better access to content for employees. But not all cloud-enabled solutions are created equally.  Some journeys to the cloud are short, but these “shortcuts” often offer limited value or do not work at all. Cloud-deployed legacy platforms can run in the cloud, but don't take advantage of the scalable architecture, making them very expensive and difficult to maintain over time. There’s a world of difference between 'the cloud’ and ‘cloud-native’, with the latter a solution or application that has been specifically designed and built with cloud principles in mind. These take greater advantage of the scalability and flexibility of the cloud to maximise focus on driving business value and delivering a better customer experience.  When evaluating content services in the cloud, it is important to consider new architectures and emerging technologies that are simultaneously transforming information management options. Legacy solutions offer much slower innovation, so rather than shifting legacy ECM solutions to the cloud, it is time to consider how CSPs allow users to take greater advantage of cloud-based services including Artificial Intelligence (AI), Machine Learning (ML), and automation, offering benefits such as: Intelligent scaling - some of the most obvious benefits of the cloud are expanded scalability and cost efficiencies. Since most companies pay for cloud usage ‘as they go’, organisations can dramatically reduce the large, up-front hardware investments that limit innovation, while the cloud also positions businesses to scale the operational expenditures up or down, as needed.  But not all cloud-enabled solutions provide this flexibility. Legacy solutions often require businesses to forecast scalability requirements in advance. Cloud-native content services platforms, on the other hand, allow administrators to scale individual services automatically, eliminating the need to reconfigure the entire system or pay for unnecessary compute power or storage. 
Secure access and collaboration - cloud solutions are inherently more accessible than traditional, on-premises solutions, a quality that has become valued more with so many people working outside of the office network infrastructure because of the pandemic. It is also important for working with external partners, who rarely have access to corporate networks and internal systems. While cloud applications may lack the network firewall security provided by multi-step VPN protections, cloud applications frequently include sophisticated user security. Content-centric security in most content services platforms is fine-grained and configurable to meet specific business needs, from viewing documents to changing ownership and the ability to create or delete documents. Eliminate siloed and duplicate information - another challenge that most businesses face is the proliferation of business applications and document repositories. Nuxeo’s FS research revealed that content was spread on average across nine different systems. This can make searching for and accessing the right information much harder, significantly impacts employee productivity and creates the risk of the wrong information informing decision-making. By consolidating enterprise information in a federated solution that offers a centralised vision of all the content across the enterprises, much of the confusion created by these silos of information is removed, and search tools can find content irrespective of where it is stored. With a cloud-based, federated solution, end users can quickly reference a ‘single source of the truth’ regardless of the content’s storage location.  Add context to business applications - consolidating business content in a single content management system is not always a panacea to information management challenges. ERP, CRM, and HR systems, for example, often focus on storing data, but related documents are often stored outside of these business applications. While many solutions can manage these related documents, storing this information separate from primary business applications can be problematic.  Many organisations have therefore identified an improved solution: make this information accessible directly within the context of their primary business applications. Users can quickly see the most relevant content without switching applications or creating complex queries. As many of these business-critical applications are cloud-based, cloud-native content services solutions are more easily integrated. Maximise the value of cloud-based technologies - beyond faster implementations and innovation cycles, there are emerging technologies available as services in the cloud. These new tools augment a content services platform, providing the range of technologies necessary to deliver content-centric business applications. Examples include AI and ML which can manage translation, transcription and sentiment analysis. Cloud-native CSP vendors can provide specialised tools, including content-aware AI and focused ML to streamline the user experience and automate manual tasks. Legacy systems designed to manage content in an earlier era do not address today’s needs for agility, global scalability, omnichannel delivery, and automated processing and enrichment. With content services shifting to the cloud in ever-growing numbers, only cloud-native CSPs truly unlock the value of enterprise content.  ### Cloud-based SaaS application development, How does a SaaS model work? How Does a SaaS Model Work?  
If you are planning to integrate a SaaS solution with your business, it is worth improving your knowledge about SaaS. Software as a Service, or SaaS as we commonly call it, is a delivery model wherein a centrally hosted software solution is licensed to users through a subscription plan. Any organization that rents out its software via a cloud-based, central system can be called a SaaS development company. Some examples of SaaS include Google Workspace, SAP Concur, Salesforce, Cisco WebEx, Dropbox, and GoToMeeting. Not to forget, the finest example of SaaS is our favourite Gmail, which we've been using for years to communicate through email, both personally and professionally. Google hosts Gmail and users access it through their browser as a client. Why Should You Adopt the SaaS Business Model? Not sure why the SaaS business model is recommended for you? Let's try to understand with the help of the benefits it offers. There are some impressive advantages of SaaS that are motivating most companies to adopt it. The biggest advantage of SaaS over traditional models is that SaaS arrives already installed and configured. Other SaaS advantages include:
- Reduced costs
- Easy integration
- Scalability
- Upgrades (new releases) are easily available
- User-friendliness
- Easy proofs-of-concept
It is no exaggeration to say that SaaS is the dominant business model these days: as many as 40-50% of businesses have already started using SaaS to run their operations. SaaS operates in the cloud and is usually accessible through desktop and mobile apps as well as web interfaces. SaaS is owned, managed, and supplied remotely by one or more providers. SaaS companies are responsible for maintaining the servers, software, and databases that enable their products to be used online (over the web). Users can easily access and make use of the software from any device connected to the web, and usually pay a recurring subscription fee for access. With SaaS, people get access to very powerful, high-performing online tools at a reasonable cost (compared with the cost of having the software developed from scratch). SaaS companies, on the other hand, gain from the recurring revenue and can roll out additional features whenever they are ready. But for new enterprises stepping into the software domain, SaaS business ideas and their implementation may be harder to comprehend than other business models. Let's quickly understand how the SaaS business model works. The SaaS services most popular among businesses include:
- CRM or Customer Relationship Management: Used by businesses to track sales and manage client information.
- ERP or Enterprise Resource Planning: This SaaS solution suits big organizations looking to manage resources efficiently.
- E-commerce & Web Hosting: A company's web presence is handled through remote servers using this type of SaaS app.
- Human Resources or HR: This SaaS product is used for tracking employee engagement, running the hiring process, and managing payroll.
- Data Management: This SaaS product is used for protecting business data and is commonly used for data analysis.
- Accounting & Invoicing: As the name indicates, this SaaS product is used for invoicing and billing services.
- Project Management: Project management becomes easier with SaaS in place; it helps ensure better communication among teammates working on the same projects.
Why Should Your Business Adopt SaaS Business Ideas?
You need SaaS if you want things to work for you with minimum investment. SaaS relieves users from the stress of OS errors and issues that may arise from installing new software, because there is no need to install SaaS solutions at all. You can access all the features of a SaaS product without ever installing it on your machine. SaaS is also relatively cost-effective compared with software sold through other billing models, which is why users are keen to adopt it. SaaS app developers love it mainly because it is developed consistently and runs on the company's own infrastructure. Businesses, in turn, love SaaS for its ability to help them generate recurring and predictable revenues. SaaS Business Stages There are three stages of SaaS development, as discussed below. Stage 1. Startup: In this stage, a fully functional product is created and marketed to potential customers. Stage 2. Hypergrowth: This is the stage of rapid growth as clients adopt the product. It includes bandwidth, data expansion, and the other technicalities that support user accounts. Many companies fail here because of their inability to handle this stage efficiently and successfully. Stage 3. Stability: This is the third and highly critical stage, in which the SaaS model levels out. This is when you make healthy profits, acquire more users, and finally bring churn under control. Adopting the SaaS business model instead of standard software installation is highly beneficial to the vendor as well as the customer. A SaaS model is great for your target customers as it helps them reduce their operating costs and gives them more flexibility in how they use the product. Customers love SaaS for the following reasons:
- SaaS solutions help them reduce their costs as they are distributed on a subscription basis. As a result, customers don't need to spend money on licensing fees (as is the case with traditional software installation). SaaS enables clients to reduce or increase their expenses in line with their usage. In addition, since SaaS products are cloud-based, the cost of infrastructure is eliminated for customers.
- Scalability & flexibility: The SaaS model allows customers greater flexibility. By basing your product pricing on a usage metric, you can have clients pay more when they use the SaaS solution more heavily (see the simple pricing sketch after this list). Customers can grow their business with your software and avoid paying a huge advance fee upfront for a product that may not even fit their purpose and needs.
- Benefits are instant. Since SaaS tools and solutions are cloud-based, gains can arrive immediately for clients. In the vast majority of cases, it's as easy as signing up with your name and email to access the product's functions. As SaaS tools are now easily usable across the world, their adoption has also increased significantly. Needless to say, if software provides instant benefits to users, they will not switch to other products unless there is a strong reason to. Hence, SaaS adoption is high today.
- Upgrades are available for free. Most companies cannot afford downtime (even for a few hours), and product upgrades are notorious for causing significant downtime. SaaS product upgrades usually have a very short maintenance window or cause no user downtime at all.
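The usage-based pricing idea above can be made concrete with a small, purely illustrative calculation; the fee, quota and per-call rate here are invented for the example and are not taken from the article.

```python
# Illustrative usage-based SaaS pricing: a flat subscription fee plus an
# overage charge for usage beyond the plan's included quota.
# All numbers are hypothetical.

def monthly_bill(api_calls_used: int,
                 base_fee: float = 49.0,        # flat monthly subscription
                 included_calls: int = 10_000,  # quota covered by the base fee
                 overage_per_call: float = 0.002) -> float:
    overage = max(0, api_calls_used - included_calls)
    return round(base_fee + overage * overage_per_call, 2)

print(monthly_bill(8_000))    # light user: 49.0 (stays within the quota)
print(monthly_bill(60_000))   # heavy user: 49.0 + 50_000 * 0.002 = 149.0
```

The point of such a scheme is exactly what the article describes: light users pay little, heavy users pay in proportion to the value they extract, and nobody pays a large fee upfront.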
How Vendors Can Benefit from the SaaS Business Model? Some of the greatest benefits that SaaS brings for vendors are discussed below for your quick reference.
- Budget-friendly: Most SaaS solutions come at a per-month or per-user rate, which allows customers and end-users to easily calculate the cost of the software. It also reduces the sales friction caused by IT budget approval.
- Recurring & consistent revenue: Among the more compelling benefits of the SaaS business model is that it opens up a recurring and consistent revenue stream, helping businesses keep churn under control.
- Improvements: SaaS allows easy, continuous product updates. As a result, your product is constantly fine-tuned, and you can increase customer retention while attracting new customers in the process.
- A free trial is also available: On-premise products are tedious and heavyweight, but SaaS gives you a break here too. You can extend your SaaS product to users immediately on a trial basis; users may then continue with the free trial or upgrade to a paid tier as and when required.
- Support and updates are easy: As a SaaS mobile app development company, you have full control of the system as well as the environment in which the product runs. When you create a product to be installed on more than one device, you have to support different operating systems, edge cases, and so on. Browser-based SaaS products, on the other hand, give you full control of the infrastructure they run on.
How Can Your Business Take Maximum Advantage of SaaS? To ensure you get the maximum benefit from SaaS for your business, you will need to hire dedicated developers with extensive experience in the domain. Some of the key parameters to bear in mind as you start looking for a mobile application development company are discussed here to help you decide.
- Portfolio: Hire mobile app developers who have a strong SaaS project portfolio. If they have already worked with this technology, you may benefit greatly from their expertise, so explore their portfolio to get a clear picture of their actual competencies.
- Experience: When looking for mobile app developers for hire, pay attention to their experience. Shortlist those with rich experience working on SaaS technologies in particular.
- Cost: While cost is less of a concern for bigger agencies, for startups and smaller firms it matters a lot. Cost is therefore a parameter worth considering if you have a limited budget set aside for this purpose.
- Get both Android and iOS apps: Hire an Android app development company if you are targeting the mass market, as Android is the most popular and common OS today. On the other hand, if your target audience is narrower and you are aiming exclusively at iPhone users, hire an iPhone app development company for the job.
- Consider full-stack development services: Want to leverage the maximum potential of SaaS? Get full-stack development done.
Some of the top mobile app development companies may be at the top of your list if you are looking for a high-security, high-performing SaaS product for your business. But don't just go by the 'top mobile app developers' tag. Carry out enough research and do your homework well before finally choosing your SaaS solutions developers. Take time to discuss your detailed requirements and expectations with the agencies that you shortlist.
You would surely not want some unpleasant surprises later on in the guise of unexpected bugs, safety issues, and performance issues.  Wrapping Up The cloud-based SaaS model of business brings along immense business opportunities. It is accepted by a vast community of entrepreneurs and SaaS adoption is expected to rise faster in the years to come, owing to the amazing benefits it offers to the customers, vendors, and businesses alike. Coupled with the competition and demand in the marketplace, all you need to do is pay close attention to the dynamics of the SaaS industry and work towards providing SaaS solutions that are unique and extend value to the customers. And even before you hire a mobile app development company for creating or customizing a SaaS cloud solution for your business, it is highly advisable that you talk to the experts in the domain ahead of time. Adding a cloud-based SaaS solution to your business could be the best decision you would ever make towards taking your business one step or in fact many steps ahead of your competition. ### 5 Ways to Boost Employee Satisfaction with Technology If we have learned anything in 2020, it’s that investing in our employees is the only way to stay afloat. This has always been true, of course, because even without a global crisis on our hands, there is no denying that our employees are the main driving force behind sustainability and growth. Your employee collective is your biggest asset, and right now it’s more important than ever before to invest in their happiness and overall satisfaction. Luckily, there are many ways you can do this, but aside from good leadership and some additional perks, you should leverage technology as well. Adopting the right tech such as cloud computing will provide you various digital features that can streamline your entire operation, enhance the quality of life at work, and facilitate collaboration and productivity. When you make an effort to make the lives of your employees easier, you can bet that people will start coming to work with a smile, after all. You can use technology to automate various processes, but you should also use it to better manage your workforce. Here’s how to do it with the right tech. Better communication builds a better culture Firstly, let’s try to leverage the right software and hardware solutions to enhance the communication in your company, whether you’re operating on a remote work model or not. Good communication is the foundational pillar of a successful company culture, but it also facilitates growth and success across the organisation in many ways. That’s why digitising and unifying your communication is so important, especially in the new normal when the company’s culture is the first thing to take a hit. By implementing a VoIP (voice over internet protocol) solution into your company, you can facilitate horizontal and vertical communication among internal and remote teams. Given the fact that VoIP is a cloud-based solution with numerous communication features, it allows your employees to communicate via phone, video, chat, mobile apps, and use advanced features to communicate more efficiently, leading to a more positive culture across the board. Facilitating collaboration is equally important The way your employees communicate will not only affect your culture, it will also affect productivity and output across your organisation. What’s more, it will help improve collaboration within and between teams, and it will eliminate department silos in your company. 
However, having a communication tool is not enough to achieve these goals. You also need a feature-rich project management tool that will help people work from home seamlessly and collaborate with their colleagues wherever they are. With a cloud-based PM tool, your employees will be able to work on every project and task together on a centralised digital platform where everything is tracked and logged, which will also improve accountability.  Engaging and motivating employees in real time Managing your workforce is no longer just about monitoring progress or tracking their productivity, it’s about actively engaging them and motivating them to achieve their goals. As a leader, you always have to keep the goals of the company in mind, but you also have to guide your employees towards their own goals. After all, you can’t expect them to be motivated by your company’s objectives only. No, people want to be motivated by their leaders. To that end, it’s important that you stay in touch with your employees and through tech-driven employee engagement solutions that allow you to engage with your employees in real-time across the channels that they prefer. You also need to use this tech to inspire employees to provide daily feedback at the end of their shifts so that you can capture invaluable information that you can use to better run your business and manage your workforce. Emphasising personal data protection and privacy Of all the tech you can employ to motivate and engage your employees, advanced cybersecurity measures might be the last on your list. In reality, though, preventing phishing attacks and elevating cybersecurity for your remote teams is one of the best ways to keep your remote workforce happy and feeling safe at all times.  Stronger passwords generated by a password management tool, two-factor authentication for all accounts, end-to-end encryption in all communication, network monitoring, all these solutions will help keep your employees safe and motivated to do a better job. Delivering better training and education Finally, to keep your employees happy, you should give them an opportunity to advance in their professional realm. People don’t like to feel stuck or like their careers are stagnating, which is why you should use online training and education to help them upskill quickly and assume new roles and responsibilities. You can leverage automated training tools and intuitive training software to make upskilling more efficient and effective, but also more engaging for your employees. What’s more, you can leverage onboarding technology to help new hires become productive members of your organisation from day one. ### How to Manage Your Digital Certificates Efficiently It is no surprise that many companies have lost millions of users based on hacks and data breaches. However, one thing was common in each of these companies, which is a fallible cryptographic control system. As a security leader, you must bring out your 'A' game and be on the top of guarding your cybersecurity with the right tactics. That is when digital certificates come into the picture and protect access to all of your applications and devices. Any overhaul in the management systems can put a full stop to your business and can also cut a sorry picture in front of your target customers. Well. You save you all that melodrama, here are six such bullet-tested points to get you some thinking and protect your business before it is too late. 
Here is what we will be delving into today:
- Refrain from inefficient Cryptographic Practices
- Build Visibility into your Certificate Inventory
- Segregate your Digital Certificate Vendors
- Configure safe SSL Protocols when establishing Servers
- Eradicate stagnant and unused Certificates, right away!
- Yes to Automatic Certificate Lifecycle Management
Refrain from inefficient Cryptographic Practices It is your encryption techniques that determine whether the security of your infrastructure remains safeguarded or not. It is therefore important to have the sound knowledge needed to choose well from the sea of options. DSA, RSA, SHA-1, and SHA-2 are the most popular algorithms on the market today and the most commonly used. At the same time, steer away from techniques with a proven record of vulnerability; some of the known ones include MD5, RC4, and RSA moduli shorter than 1024 bits. Note too that the greater the bit length, the more computing power it demands, so choose wisely. Additionally, limit access to your private keys and rotate them regularly within the organization. One more thing to consider: when you issue self-signed digital certificates, restrict their validity to a maximum of around 1.5 years. The shorter the lifetime, the smaller the window in which hackers can exploit them. Build Visibility into your Certificate Inventory What is paramount for an effective certificate management system is transparent visibility. Comb through your infrastructure and search for your certificates regularly. Ensure that all digital certificates are discovered on a timely basis and are well maintained, and that the inventory records all the necessary details, such as the type of certificate, expiry dates, and deployment status. Every once in a while, monitor the validity of each of these certificates, be it root, intermediate or end-entity. Also note that the expiry of any one SSL certificate can put the others in the chain in jeopardy. It is therefore crucial to carefully track each certificate's expiry date and renew it 30 to 60 days before it lapses at the latest. Once you do that, double-check that the expired certificate has been removed from the system, and do this diligently; expired certificates are one of the most common causes of incidents and can impact your company in many negative ways. Segregate your Digital Certificate Vendors It is an unwise policy to club all your certificates under one vendor's umbrella. It is neither safe nor wise to trust just one vendor for your business: if that vendor compromises your trust, there is no easy way back from the mess. As they say, prevention is better than cure, so take precautionary steps before that happens. Go for certificate vendors that suit the nature of your business and your budget. Also, try to obtain a unique certificate for every server that you use and refrain from sharing one copied certificate among them; this, again, can work against your business. If one of your subdomains is compromised, a shared certificate can take the rest of the subdomains down with it, and all the measures you have taken up to now come to nothing if that happens.
Moreover, any Tom, Dick, or Harry could create a subdomain using your name and enjoy protection under that shared certificate. This can magnify risks of all kinds and is a big no-no for your company's health. Configure safe SSL Protocols when establishing Servers SSL certificates are a blessing for the online security industry; it is hard to imagine how security could be maintained without them. As you know, it is the SSL or TLS protocol that turns an HTTP URL into HTTPS. Some of the most commonly used protocols include TLS v1.0, TLS v1.1, and TLS v1.2. If you still have SSL v3.0 enabled anywhere, you are advised to remove it immediately: anything that provides weak encryption or a weak protocol becomes an easy target for attack. Did you know that if a client browser finds that TLS negotiation is unavailable on your server, it may fall back to SSL v3.0? That is why you must support your systems with the latest protocols as soon as possible. Furthermore, if your server permits renegotiation of an SSL connection, it is wiser to disable it with immediate effect and spare yourself the cost of a data breach and the resulting financial loss. Eradicate stagnant and unused Certificates, right away! So, what is meant by an unused certificate, anyway? These are certificates bought to add an extra layer of security to your system but never deployed. Similarly, unknown certificates are ones that were never properly documented and can easily fall into the rogue category. As you build or check your inventory on a regular basis, make sure that you remove both the unused and the unknown certificates from the system without further ado. Even a slight delay in eradicating them can lead to significant losses and call the security of the rest of your estate into question. When your developers wish to procure certificates for internal testing purposes, limit the use of free SSL certificates to lessen the likelihood of these unknown or unused certificates spreading. To save yourself the trouble, check your certificates regularly and let no file go unnoticed; a single rogue certificate can contaminate the rest of your inventory in no time. Yes to Automatic Certificate Lifecycle Management It is a given that the number of digital certificates will grow as new business use cases multiply. It can then become a daunting task to enter data manually and manage these certificates on the go; manual handling leads to more mistakes and adds overhead. That is why automation matters. Done right, it provides more capability, lessens the workload, and reduces the number of mistakes that happen daily. An automated certificate management utility can seamlessly discover, renew, issue, and install certificates without any human intervention. Moreover, you will get a prompt notification when a certificate is about to expire (a minimal sketch of such an expiry check appears below). These tools are also validated to support multiple certificate authorities. Not only that, they give you the power to respond quickly to newly discovered vulnerabilities and to carry out migration activities too.
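As a purely illustrative companion to the expiry-notification point, here is a minimal sketch, using only Python's standard library, that fetches a server's certificate and reports how many days remain before it expires. The host name and 30-day threshold are placeholders; a real lifecycle-management tool would track an entire inventory and handle renewal as well.

```python
# Minimal expiry check: connect over TLS, read the certificate's notAfter date,
# and report how many days remain. Illustrative only; real tooling also covers
# discovery, renewal and multiple certificate authorities.
import socket
import ssl
from datetime import datetime, timezone

def days_until_expiry(host: str, port: int = 443) -> int:
    context = ssl.create_default_context()
    with socket.create_connection((host, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=host) as tls:
            cert = tls.getpeercert()   # includes a 'notAfter' field
    expires = datetime.fromtimestamp(ssl.cert_time_to_seconds(cert["notAfter"]),
                                     tz=timezone.utc)
    return (expires - datetime.now(timezone.utc)).days

if __name__ == "__main__":
    remaining = days_until_expiry("example.com")   # placeholder host
    if remaining < 30:                             # renew well before expiry
        print(f"Renew soon: only {remaining} days left")
    else:
        print(f"Certificate valid for another {remaining} days")
```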
To put it in simple words, by investing in an automatic certificate lifecycle management tool, you can bid goodbye to the hassles that used to take place earlier and reduce the overall complexities of your digital certificate infrastructure. That’s a Wrap  Better yet, get yourself one of those cheap code sign certificates, and they will take care of the rest. So, what they do is they use a digital signature to validate the integrity of your system to that of the industry-standard encryption. While this will secure your software right, it will also tell you when the software is safe to be downloaded and not altered. Although you might not get a free code sign certificate just yet, you can get them in a genuine, inexpensive offer that will not burn your pocket in any way. Hope the above mentioned six tips would have helped throw some light on the topic and enlighten you with some of the major dos and don'ts. Above all, ensure safety and check your systems periodically before it is too late. So, why is the wait? Foolproof your digital certificate management today itself. ### Cloud-Native Transformation is Changing the Role of the CISO The pandemic has forced many businesses to reconsider the pace at which they are accelerating their digital transformation journeys. Many enterprises were not prepared for the overnight change to remote working, so over the past year, they have found themselves rapidly updating their cloud-native infrastructures to ensure that they thrive in the “new normal” for their business environment. Over this time, security has become a key focus as the decentralisation of employees is naturally followed by the decentralisation of security.  However, the businesses which have embraced and adapted to these new demands on security will have also found it to be an enabler rather than just an obstacle to innovation. This is where the evolution of the CISO becomes important. Now a key leader in the digital transformation process, the CISO must fully embrace new technologies such as easy-to-use cloud-native technologies, while revamping security practices to protect the “new normal” for IT.  The need for speed Digital transformation is not a new trend, but the past year has rapidly accelerated the process for businesses across all sectors of society. Broadly speaking, digital transformation involves the gradual but necessary updates to processes to create a digital-first customer experience, reduce reliance on paper and pen processes and instead make software and applications central to the running of businesses. Applications such as mobile banking and online document storage are examples of this.  These processes served as a good foundation for the overnight changes that were required when the current pandemic first caused major facets of everyday life to move online. With increased user demand and expectations for quality of service, IT was under pressure to not just move businesses online but to do so quickly and efficiently.  One core element of the solution to this problem is cloud-native technology. To cope with the demand and the essential need for a seamless customer experience, going cloud-native is the only option.  Cloud native transformation Cloud-native technologies are crucial to the digital transformation process. Cloud-native technologies can be deployed in stacks and with rapid delivery cycles which makes it possible to implement changes in days that would otherwise take months. 
The business impact of this is clear – rapidly adapting to customer needs will improve customer retention and growth and therefore increase revenue.  Another benefit of using cloud-native technologies is the resilience and agility that comes with being able to operate on a microservice level. Operating applications on a microservice level means that individual microservices can be independently updated when necessary without needed to revamp the entire application. Indeed, companies with agile practices embedded in their operating models have managed the impact of the pandemic better than their peers.  To ensure that this transformation is carried out without increasing the organisation’s exposure to risk, the CISO, and security in general needs to take on a central role. Security cannot be considered an afterthought, instead, it needs to be factored in from the very beginning and work in tandem with the development process. Just as security must be considered at all stages of the process, so must the CISO be involved throughout. The role of the CISO can no longer be siloed as a separate function, it must be central to the process of digital transformation.  The new central role of the CISO  The role of the CISO has been traditionally focused on reducing risk and protecting the organisation against cyber threats. Now, with digital transformation becoming a key initiative for many organisations, the CISO is relied upon to keep the business up to date on new initiatives and potential business opportunities as well.  Security of course remains a vital element of the role, especially as new technologies come with new risks that could impact the digital transformation process. The role of the CISO needs to evolve to not only monitor security but also use security to enhance business opportunities and stay ahead of competitors.  The first step towards this change is to break down walls between teams and encourage knowledge sharing. For security to be fully embraced into the development cycle businesses need to embrace the DevSecOps method and encourage collaboration between the security teams and the developers. The CISO is a key player in this change and will perform a vital role in ensuring that the change is implemented effectively. The result will be that security is no longer considered a hurdle at the end of the development cycle, but risks are spotted and managed throughout, speeding up the overall process.  Successfully combining these teams and speeding up the development process is the first step towards creating a general digital transformation mindset. As a result, businesses will be much better equipped to react quickly to real-world changes, risks, and customer requirements. Further collaboration between the CISO and other departments across the business will ensure that these changes happen without increasing costs or damaging customer relations.  Leading the pack The past year has shown all too clearly that businesses need to be able to rapidly adapt to real-world situations, and the new role of the CISO is central to this. With a modern CISO leading a new DevSecOps team at the centre, businesses will be able to successfully embrace cloud-native technologies and digital transformation.  Businesses that embrace these changes will outshine their competitors and excel in the new digital landscape. 
### Data Protection Day – Don't cloud your security judgement The inauguration of Data Protection Day in 2007 signalled the importance of securing sensitive information in a world where poor data security hygiene affects both enterprises and consumers. Fourteen years on, the discussion around the use of data and data privacy is more pertinent than ever. For enterprises, the ability to collect and mine data has exploded as business processes are digitalised. This digital transformation was accelerated further by the pandemic, which forced many businesses to re-evaluate how employees could access data securely, without compromising privacy, from their new work-from-home offices. In fact, over half of UK business leaders admitted that shifting to the cloud saved their company from collapse during the height of the pandemic. This all sounds promising, but the security risks associated with rushing into cloud services will not only affect the integrity of data but also potentially jeopardise compliance with data privacy and security regulations such as GDPR. The distraction of the pandemic has already presented a serious challenge for organisations trying to keep their systems secure, and on top of this they are also tasked with keeping up with privacy demands. Nevertheless, that is no valid excuse for any business to overlook its data security and privacy commitments, especially as pandemic-related cyberattacks have grown. Cybercriminals have been persistent in their attempts to exploit cloud systems using common methods like phishing (largely COVID-themed), malware and cryptomining. Additionally, hackers have been quick to compromise cloud services through unpatched vulnerabilities and cloud misconfigurations, both of which are triggered by human error. If security hygiene continues to be neglected, then any hope for cloud and data security will be gone. As we enter the next era of cloud computing, the safety and privacy of data has become vital. Organisations can no longer afford to overlook security and must be proactive in safeguarding the data within their cloud systems. If not, businesses are gambling with the troves of sensitive customer data they have harvested. To avoid this situation, security leaders must consider the following advice to secure cloud environments: Avoiding cloud misconfigurations Misconfigured cloud systems are low-hanging fruit for cybercriminals. It doesn't take much skill to steal the data, especially if the cloud system has public access enabled. To reduce this threat, enterprises must ensure configuration settings follow security best practices such as the CIS Benchmarks, checked continuously using automated cloud security posture management (CSPM) solutions. This will greatly reduce the chances of a security misconfiguration – the number one cause of cloud data breaches. Other basic security controls should also be implemented, including installing firewalls, backing up and regularly testing systems, and educating the workforce on what security practices to follow when accessing these systems remotely.
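As one small, illustrative example of the kind of automated check a CSPM tool performs, the sketch below uses the AWS SDK for Python (boto3) to flag S3 buckets whose ACLs grant access to everyone. It assumes boto3 is installed and AWS credentials are configured; a real posture-management solution would cover far more services, settings and providers.

```python
# Illustrative misconfiguration check: flag S3 buckets whose ACL grants access
# to all users or to any authenticated AWS user. Assumes boto3 is installed and
# AWS credentials/region are configured; a real CSPM tool checks far more.
import boto3

PUBLIC_GROUPS = (
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
)

def publicly_readable_buckets() -> list[str]:
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        acl = s3.get_bucket_acl(Bucket=name)
        if any(grant.get("Grantee", {}).get("URI") in PUBLIC_GROUPS
               for grant in acl["Grants"]):
            flagged.append(name)
    return flagged

if __name__ == "__main__":
    for name in publicly_readable_buckets():
        print(f"Bucket with public ACL: {name}")
```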
Protecting your cloud workload with shared responsibility Contrary to popular belief, the security of the cloud environment does not rest solely with the provider. If an application is vulnerable to SQL injection, moving it to the cloud will not protect it from that threat. How responsibility is shared is determined by the operational scenario in use: IaaS, PaaS or SaaS. For IaaS/PaaS environments, where the cloud provider secures the back-end data centres, networking, servers, and virtualisation, the enterprise needs to extend its workload protection best practices to the cloud workloads through continuous system hardening and vulnerability assessment. Known vulnerabilities are the first thing attackers will target, so it is crucial to keep your operating systems assessed and in check at all times. In a SaaS scenario, security for the application and data is the responsibility of the service provider, while access security rests with the enterprise and its users, who must enforce consistent security policies between on-premises systems and the cloud services used. Big data and multi-cloud considerations Cloud is unique and requires its own specific knowledge base and skill set to establish proper controls. In a hybrid or multi-cloud deployment, the security controls from your cloud service providers – Azure, AWS and Google Cloud Platform – will differ. You do not want to use different tools with different scopes for different clouds. It is important to implement homogeneous controls and have visibility across all your cloud services to help security teams remove blind spots and reduce costs. Your cloud security management process should also be automated and continuous to keep up with the dynamic nature of cloud services. By obtaining a unified view of their cloud environments, organisations can effectively protect cloud applications and the sensitive data that resides within them without impacting IT operations and resources. Data Protection Day is beneficial in reinforcing the need for better data security; however, securing data is not a one-time event, and it must be a continuous and automated process. The digital adoption cycle shows no signs of stopping, with modern enterprises migrating to more cloud services because of the benefits. In essence, the cloud has become part of the new normal, but we must make sure cloud security is being normalised in the right way to ensure our critical data isn't breached. ### Graph Data Science Gets to the Heart of Corporate Analytics Over 28,000 peer-reviewed scientific papers about graph-powered data science have been published in recent years. But access to the benefits of graph data science is no longer limited to scientists and those with deep pockets. The technology reveals contextual data connections, fuelling intelligent systems and enhancing machine learning predictions. Google was among the first to use graph-based page ranking to revolutionise search engines. Now graph technology is experiencing exponential growth in usage. Interest in graph data science overlaps with AI and machine learning as companies seek to get the best insights from data. Graph data science can reason about the 'shape' of the connected context for each piece of data through graph algorithms, enabling far superior machine learning modelling. Graph data science lets businesses make predictions in many diverse situations, from fraud detection to tracking customer or patient journeys. It helps companies learn from user journeys to present accurate recommendations for future purchases, supported by evidence from their buying history to build confidence in suggestions. Knowledge graphs are also being put to work to identify new associations between genes and diseases, discovering new drugs. Analyst firm Gartner has identified graph data science as a key data and analytics technology trend.
When asked about the use of AI and machine learning, 92% of companies said the plan to employ graph techniques within five years. Gartner believes a quarter of global Fortune 1000 companies will have built a skills base within three years and will be leveraging graph technologies as part of their data and analytics initiatives.  Graph technology in central government Graph technology is being used at the top of government. Data scientists Felisia Loukou and Dr. Matthew Gregory deployed their first machine learning model with the help of graph technology to recommend content to GOV.UK users, based on the page they are visiting. The scientists explain that their application learns continuous feature representations for the graph nodes, which can then be used for machine learning tasks, such as recommending content. They note that creating the necessary data infrastructure underpinning a model’s training and deployment is the most time-consuming part. Graph tech powers marketing Graph technology can contribute to identifying the right content even for casual visitors to websites. Ben Squire, senior data scientist at media and marketing services company Meredith, has used graph algorithms that transform billions of page views into millions of pseudonymous identifiers with rich browsing profiles. He says: “Providing relevant content to online users, even those who don’t authenticate, is essential to our business. Instead of ‘advertising in the dark,’ we now better understand our customers, which translates into significant revenue gains and better-served consumers.” Supports the medical supply chain Graph data science is also supporting the medical supply chain. Medical device manufacturer Boston Scientific is using graph data science to identify the causes of product failures. Multiple teams, often in different countries, frequently work on the same problems in parallel. Yet engineers were having to resort to analysing their data in spreadsheets, leading to inconsistencies and, most importantly, difficulty finding the root causes of defects. Boston Scientific set up a graph data model consisting of three nodes, with relationships that trace failures to parts and connect those to finished products. Analytical query times are much faster, helping to increase overall efficiency and streamline the entire analytical process. The company can now identify specific components that are more likely to fail. Another benefit is that because the graph data model is so simple, it’s easy to communicate to others. “Everyone involved with the project, from business stakeholders to technical implementers, is able to understand one another because they’re all speaking a common language,” confirms Eric Wespi, Data Scientist at the company.  Boosts health outcomes In healthcare, New York-Presbyterian Hospital's analytics team is using graph data science to track infections and take action to contain them. The hospital wanted to log every event, from the time a patient was admitted to all of the tests they undergo and their eventual release. Graph data science offers a flexible way to connect all the dimensions of an event – the what, when and where it happened. The team created a ‘time’ and then a ‘space’ tree to model all the rooms patients could be treated in on-site. This initial model revealed a large number of inter-relationships. To build on this, an event entity was included to connect the time and location trees. 
The resulting data model allows the analytics team to analyse everything that happens in its facilities. These examples are just a small taster of the possibilities of graph data science. Graph technology goes hand in hand with predictive AI and machine learning initiatives and is set to become a critical pillar of corporate data analytics.

### The Best Tech Tools for Working from Home

The global economy is changing, technology is rapidly advancing, and the number of remote workers is higher than ever before. To tackle the crisis caused by the current climate, employees have quickly shifted to remote work. On the one hand, this particular work arrangement allows employees to balance work and family commitments better. On the other, employee productivity often takes a hit at some point because of the many distractions at home. To keep their productivity levels high, employees can resort to various tools and gadgets. Here are some of the best tech tools employees can use to nail remote work. Project Management For those having to manage a virtual team, project management tools are of great use. There are many options available on the market, some with more features than others. Depending on the size of the virtual team and its needs, managers can easily find the best fit. Project management tools, such as Basecamp or Asana, primarily allow managers to assign tasks and projects to employees and track their progress. Moreover, every team member gets a complete to-do list with their duties and responsibilities. More importantly, they can work and brainstorm together and help each other, making sure all projects meet tight deadlines. Internal Communication  Effective internal communication is crucial for remote teams. Remote employees have to communicate not only more but also better than usual. Keeping in touch makes the job easier. They get all the relevant information right away and interact with coworkers like they would in a regular office. That is why managers should encourage their employees to communicate regularly, not only as a team but also across departments. The easiest way to boost communication in the virtual workplace is with internal communication software. Instant messaging tools such as Slack, Google Hangouts, and Telegram are just some of the many options, and there are great alternatives to Slack and many other apps. They all connect employees and management and allow them to communicate daily through text messages. File Sharing  Virtual teams don’t have access to what is commonly known as the file room. However, team leaders and managers should make sure they have access to the virtual one. Keeping all files in the cloud allows employees to upload, share, or download corporate files. Procedures, regulations, policies, guidebooks, and similar useful documents can be uploaded for employees to access whenever they need to. The most common document management tools are Dropbox and Google Drive. With these, teams can save, share, edit, or add new documents entirely online. However, the downside is the limited free storage space. If teams need more storage, they will have to switch to a paid version of the tool. Video Conferencing Remote teams don’t only struggle with productivity but also with the lack of face-to-face communication. While it is possible to maintain communication solely through text, remote workers still need body language to send and receive messages effectively. This is where technology comes into play.
Many platforms and tools allow remote teams to organise, attend, and record video meetings. Skype, Zoom, and Google Hangouts are the go-to tools for most virtual teams. While their primary purpose is video conferencing, some tools can also be used for file sharing, audio calls, texting, and teamwork. Whatever video conferencing tool managers choose, it will surely bring their staff members closer and make their communication more efficient. Time-tracking  Knowing how easily employees get distracted, software developers have designed time tracking apps to help them stay focused at all times. The apps are particularly useful for remote employees. When they work from home, superiors can’t monitor their activity in the workplace. For this reason, time tracking tools show employees how they spend their time during working hours. Apps such as Clockify or Toggl are some of the most common ones. These apps generate reports at the end of the workday, which employees can analyse to determine which tasks take the most time to complete. Being aware of how they spend their time helps employees increase their productivity and prioritise.  Distraction Blocking  As mentioned previously, home offices are full of distractions that prevent remote staff from performing well on their work assignments. That is why they should install distraction-blocking apps on their devices. Most of these apps block distractions on all synced devices at once. Forest and Freedom are every remote employee’s best friend. The former allows users to plant a virtual tree that keeps growing as long as the user is focused. The latter can be used to restrict access to a web browser, social media, or any other app or website the user selects. When employees avoid distractions, nothing is stopping them from meeting deadlines. Organization  Remote staff have to stay organised to get things done at work. Lucky for them, there are numerous organisation apps they can use to help them stay on top of things in the workplace. Nozbe and Doodle are two of them. Nozbe is a multifunctional app that can be used for time, project, and task management. Meanwhile, Doodle is an online calendar that makes sure no meeting is double-booked and no deadline missed. Since remote work is on the rise, virtual teams need all the help they can get to stay productive and engaged in the workplace. There is no better sidekick than technology. The apps, tools, and gadgets suggested here can help them excel despite all the distractions their home offices contain.

### Here's why Wi-Fi Range Extenders won't connect to a Wi-Fi Router!

A Wi-Fi extender acts as an intermediary, or bridge, between a Wi-Fi router and a Wi-Fi device that is outside the range of the router's signal. The extender connects to an existing Wi-Fi network and acts as a wireless access point for Wi-Fi devices.  Come closer to the router: Most people assume their Wi-Fi router will broadcast a signal throughout the home so that their wireless devices can have internet access. But sometimes a wireless signal is not strong enough to cover the entire house. When you come closer to the router, you will get a strong signal; move far away from it and the Wi-Fi signal weakens. Building structure may affect your Wi-Fi router signal: The signal bars on mobile phones and laptops show the strength of the Wi-Fi connection. Sometimes the building structure weakens the signal as it passes through the building's walls and floors.
Repositioning the Wi-Fi router can help solve the problem. Reposition or fit Wi-Fi extenders in your building: If repositioning or any other method is not practical for your building, use a Wi-Fi extender. This extends the reach of the wireless connection into further areas. The SSID of the Wi-Fi extender is usually based on the router's name. For example, if your router is named "Sam," the SSID of your Wi-Fi extender may be "SamExt." When users cannot connect to the weak router signal, they can connect to the Wi-Fi extender using its SSID. Reasons for connection failure between a Wi-Fi router and extender: Sometimes the Wi-Fi extender will not connect to your Wi-Fi router, which prevents you from accessing the home network. Here are the top reasons for a poor connection between extender and router.  Improper wireless settings: You should match the settings of the wireless extender to the Wi-Fi router's configuration. If the settings are wrong, your extender will not connect to your Wi-Fi router. Connection failure due to distance issues: Place the wireless extender at an appropriate distance to receive the router signal; too much distance between the devices leads to a poor connection. Position your extender so that it reliably receives the Wi-Fi signal.  High-traffic devices sharing the same network: Check whether other devices on the network generate a lot of traffic. You can move such a device within range of the Wi-Fi extender, or connect it directly to a long-range Wi-Fi router using an Ethernet cable. External interference: Sometimes the Wi-Fi connection may be affected by other devices or radio signals in your building. This interference can disturb the connection between the Wi-Fi router and the Wi-Fi extender. Bluetooth devices, neighbouring wireless networks, 2.4GHz cordless telephones, microwave ovens, and baby monitors are the main causes of this problem. Mismatched firmware configurations: An extender running outdated firmware may fail to connect to your Wi-Fi router, because its configuration no longer matches that of the updated router. Upgrading the extender's firmware to the latest version should solve this problem. Forgetting the router name or SSID: Sometimes users forget the SSID or name of the router, so they cannot connect the Wi-Fi extender to it. Nearby Wi-Fi networks will be shown on your device, which can help you identify your Wi-Fi router. Forgetting the Wi-Fi password: Using the wrong password will never allow your extender to connect to your Wi-Fi router. In this case you will need to reconfigure the router and change the password settings.  How to solve this problem? If your Wi-Fi extender cannot connect to the Wi-Fi router, you need to reset your extender. The location of the reset button varies by brand, but it is usually at the back of the extender; press it for 10 seconds using a pin. This resets all the current settings and allows you to configure the extender as you wish.  Check whether another similar network is available in your area by using other devices such as phones, tablets and laptops.
You will need to identify your Wi-Fi router's IP address to change the settings of your extender and router. Try rebooting your router and extender and rescanning for networks. Change the password and username of the router and extender, and note this information for future connections.  What can be an alternative to a Wi-Fi extender? Long-range Wi-Fi routers are specially designed to cover more area and provide a strong, reliable connection to the user, without needing a separate extender. Advantages of having a long-range Wi-Fi router:
- It can cover more area efficiently without low-signal problems.
- It supports multiple devices at once by distributing coverage evenly.
- Dedicated frequencies can be allocated for better signals.
- It is available at affordable prices.
- Google Assistant support is also built into some routers.
Conclusion: The above information will help you buy the best long-range router and do away with Wi-Fi extenders for a better connection. Share these points with your friends, family, relatives, and others. We welcome your suggestions for keeping this information about these technical issues up to date.

### Shopping Online? Think Again. How to Avoid Identity Theft

With the increase in online shopping during the pandemic, many people are at risk of account takeover from online scams. Many people on the internet have made sure to safeguard their personal information with credit monitoring, to receive alerts and ensure they don’t fall victim to scams. Thankfully, you can learn how to protect yourself from identity theft. How Are Identities Stolen? Identities are stolen when someone's personal information is picked up and used to hack their accounts. An account takeover is when scammers gain access to a person’s financial account information, giving them access to personal information and data. The scammer can use this information to buy other financial products, running up potentially thousands in debt for the individual and ruining their credit score. This type of identity theft can take individuals years to recover from. What Are Online Shopping Scams? These scams are popular with the growth of eCommerce and online shopping. In account takeover scams, the scammer collects a user’s financial information when they enter it at the checkout of an online shopping webpage. Little does the online shopper know that the entire page is fake. The products don’t exist and will never be shipped. A scammer has set up this website with the sole purpose of collecting credit card numbers or other payment platform information. The scammer is taking advantage of the trust that many have put into online shopping platforms in the modern era. How Online Shoppers get Scammed There are a few different ways that online shoppers get scammed, becoming victims of account takeover. An eCommerce site may be set up to look like any other eCommerce site, providing a range of specialty products. Quite often, luxury brands are displayed on these websites at much lower prices, making the offer enticing for someone who otherwise would not buy. In these situations, abnormally low prices are the main indicator that a scam is taking place. Quite often, when things are too good to be true, it’s because they definitely are. How to Protect Your Identity Online It can be difficult to protect yourself from identity theft, but there are a few things that can be done to safeguard yourself and your personal information. With a few habits and some help from technology, you can continue to enjoy eCommerce online shopping.
●  Be Sceptical Many online shopping scams are perpetrated through social media. A user will be searching for something they are in the market for and then get targeted on social media for that exact product. If you see a name brand on social media that is fifty percent cheaper than what you see elsewhere, there is a good chance it is fraudulent. If something is too good to be true, it’s almost always a scam. You can also adjust your cookie settings so this is less likely to happen. ●  Conduct Credit Monitoring There are many services available online for credit monitoring, which is a useful way to protect yourself from identity theft. With these in place, you can get an alert when your personal information has been used to apply for credit or there is spending that is out of the ordinary. This way, in the event of an account takeover, you can act accordingly and rectify the situation. There are many platforms that enable this type of personal protection. ●  Check Website Addresses Get in the habit of checking the address of the website that you are on. Scammers devote a lot of time and effort to replicating websites of name-brand online retailers and eCommerce websites. In order to protect yourself from identity theft, make sure to verify the URL of a website that you have clicked on, especially from ads in a search engine result or social media feed. It could be that you have been taken to a fraudulent site with the intention of stealing your personal information. What to do if Your Identity is Stolen If you fall victim and your account information has been taken and used by a scammer for their own benefit, it is best to act quickly. Most financial institutions are quick to act in these scenarios and they may be able to identify who the scammer is, or at least find a way to block them in the future. Follow these steps to keep your personal information safe:
●  Notify the authorities. When claiming that someone has used your account fraudulently, many businesses will ask for a copy of a police report for their own records. The police will also be interested in finding the perpetrator.
●  Notify your financial institutions. In some cases, you may need to cancel credit cards or have them reissued. Be sure to let any business that is affected know that your account information may have been tampered with.
Conclusion Many of us enjoy the freedom of shopping on the internet. Online shopping and eCommerce are among the biggest developments of recent times, largely on account of the pandemic. It is widely believed that online shopping will continue to expand. Everyone needs to exercise vigilance moving into the future to ensure that they are not the targets and victims of rising fraudulent activity.

### Top 7 Uses of Cloud Computing

Cloud Computing is the delivery of in-demand IT resources via the Internet on a pay-as-you-go basis. Rather than buying, running and maintaining physical servers and data centres that are difficult to use and access, companies can use the cloud. Cloud services allow you to access technology online, including storage, databases and compute power, as and when required. Amazon Web Services (AWS) is one of the leading cloud providers, offering a wide range of cloud functions and services. To learn Cloud Computing and become an expert in using numerous cloud services, you can enrol in Intellipaat’s AWS certification program and gain in-depth knowledge and skills in this domain.
Now that you have gained a brief understanding of Cloud Computing and the role of AWS in it, let’s discuss some of the most common uses of this in-demand technology. 1. IaaS, SaaS and PaaS Services IaaS stands for Infrastructure as a Service. The IaaS model allows the cloud to deliver network, storage and computing resources on demand to customers via the Internet. Companies save significant expense by paying per use for existing infrastructure. This saves investment costs that would otherwise go into acquiring, managing and maintaining the infrastructure. SaaS, or Software as a Service, allows clients and customers to access numerous software applications online that are hosted on the cloud. Companies can use these applications for several business tasks. The cloud also offers Platform as a Service (PaaS), which gives customers a complete platform consisting of software, hardware and infrastructure. This allows organisations to build, run and manage applications without the additional expense and complexity that generally come with creating and maintaining on-premises platforms. Often, companies use these services to speed up the development and deployment of applications. 2. Hybrid Cloud Hybrid cloud, in simple terms, is a computing environment that allows professionals to connect their organisation’s private cloud services with any public cloud and use the two as a flexible infrastructure to run the company’s workloads, processes and applications. This combination of public and private cloud services gives the company the option to use the optimal features for all the required tasks and applications. Moreover, companies gain the flexibility to move to and fro between the two cloud services whenever required. Multi-cloud, on the other hand, goes a step further, allowing organisations to use several clouds from multiple providers. With this, you will be able to mix various IaaS, SaaS and PaaS services and work out which cloud best suits each workload as per your needs. 3. Development and Testing The development and testing phase is one of the areas where cloud services are most useful. Setting up a traditional environment demands budget, time and physical assets, followed by installing and configuring the platform. Cloud Computing instead offers a readily available environment that meets your demands, along with physical and virtual resources that are automatically provisioned. 4. Big Data Analytics Big Data Analytics is another of the most significant use cases of Cloud Computing. It allows organisations to access a large volume of structured and unstructured data that can be used to gain valuable insights for the company. This has helped several organisations, big and small, understand the behaviour and buying patterns of their customers and use personalised marketing campaigns to convert them into leads. 5. Storage Compared to other storage facilities, the cloud offers numerous advantages, including storage, access and retrieval of your data via an Internet connection. You can easily access the data at any time and from anywhere over a fast Internet connection, with storage that is scalable, highly available and secure. This helps companies save a lot of money, as they only pay for the storage they actually use without having to worry about maintaining the storage infrastructure.
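As a simple illustration of the storage use case, the hedged sketch below uses AWS's boto3 Python SDK with a hypothetical bucket and file names; it assumes AWS credentials are already configured (for example via environment variables or an IAM role). The same store-and-retrieve pattern applies to other providers' SDKs.

```python
import boto3

# Hypothetical bucket and object names, for illustration only.
s3 = boto3.client("s3")
BUCKET = "example-company-archive"

# Store a local file in the cloud...
s3.upload_file("quarterly_report.xlsx", BUCKET, "reports/quarterly_report.xlsx")

# ...and retrieve it later from anywhere with an internet connection.
s3.download_file(BUCKET, "reports/quarterly_report.xlsx", "quarterly_report_copy.xlsx")

# List what is currently stored under the reports/ prefix.
response = s3.list_objects_v2(Bucket=BUCKET, Prefix="reports/")
for obj in response.get("Contents", []):
    print(obj["Key"], obj["Size"], "bytes")
```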
Companies can also choose to store data on-premises or off-premises, as their requirements dictate. 6. Disaster Recovery One of the advantages of Cloud Computing is that it is cost-effective when it comes to disaster recovery solutions that allow organisations to recover their data. This process is faster and more cost-efficient than recovering data from multiple physical storage facilities spread across numerous locations, which takes a huge amount of time and effort. 7. Backup It has always been difficult and time-consuming to back up data. On top of this, companies also need to collect the data manually, maintain various drives and tapes, and dispatch the data to their preferred backup sites. During these steps, all sorts of issues can affect the data between its origin and its destination backup facility. The backup itself can suffer numerous problems, including the large amount of time it takes to load data onto backup devices for later restoration. Moreover, the data is prone to human error and device malfunction. The cloud's backup facility allows companies to dispatch data automatically over the Internet. This data is easily available and secure, with no storage-capacity issues. Start Your Career in Cloud Computing This article on ‘Top 7 Uses of Cloud Computing’ has helped you learn briefly about Cloud Computing. Further, you have also learned about its various benefits, which are extremely useful for organisations for multiple reasons. If you are looking to learn one of the latest technologies and build a career in that domain, Cloud Computing is a strong option. So kick-start your career in AWS and become a certified Cloud Computing professional.

### 7 Tips to Prevent a Successful Phishing Attack on Your Remote Team

Since more and more companies have switched to a remote workforce, there has been an increase in cybersecurity risks.  As per a survey conducted by CNBC, over one-third of senior technology executives reported that cybersecurity risks have increased since most of their employees began working from home. Communication that usually took place inside a secure corporate network is now being done at home. This gives hackers the perfect setup to take advantage of weak points in business communications and access sensitive data.  Statistics suggest that over 60,000 phishing sites were reported in March 2020, and 96% of all targeted attacks are made with the intention of gathering intelligence. If you are managing a remote team, these seven tips will help prevent a successful phishing attack: Enhance email security Blocking the emails at the source is an easy way to prevent phishing attacks. Though the standard Office 365 – Exchange Online Protection (EOP) anti-phishing solution can block spam and standard phishing attacks, it can’t effectively block zero-day threats. To enhance email security, you need layered defences. Consider a dedicated third-party anti-spam and anti-phishing solution. It should feature predictive threat detection and advanced anti-phishing mechanisms to identify zero-day threats.  SpamTitan is a strong choice, as it features machine learning, threat intelligence feeds, dual antivirus engines, predictive technology, sandboxing, and more to add an extra layer of security and see that zero-day threats are blocked.  Look out for keyloggers. Another security threat that is gaining popularity is credential theft through a keylogger.
The keylogger program can track the keystrokes that you make when you type on the keyboard.  If hackers install a keylogger, they can scan your keystrokes to find out sensitive information like usernames or passwords.  It is difficult to detect this cyberattack, as the hackers can log in as you. The program can be installed when you click on phishing emails or download malicious attachments. Use a web filter Web filtering entails stopping your team from clicking on and viewing suspicious URLs. It works by preventing your team’s browsers from loading pages connected to such sites.    Naturally, this can help you add an additional layer to your system, and you can easily block sites that are more likely to engage in phishing and malware attacks. If a phishing email or message asking you to click on a suspicious URL does reach your inbox, the web filter can still prevent your team from clicking on the hyperlink.  You can also use plenty of online tools that can help you block web-based attacks aimed at your office and remote workers. Such tools also allow you to change and set different controls depending on your team’s browsing habits. Don’t click on the link yet. Be extra careful when you receive emails or instant messages asking you to click on a link, even when you know the sender. At the very least, hover over the link to ascertain whether the destination is the right one.  Since hackers use sophisticated tools for phishing attacks, some destination URLs can look precisely like the genuine site's. To prevent such attacks, go straight to the website via your search engine instead of clicking on the link. Spot a phishing email The best way to protect employees from phishing attacks is by teaching them how to quickly spot phishing emails. Since hackers utilise real company logos and add details to make their emails appear genuine, it is challenging to spot red flags unless you know exactly what you are looking for. Keep these elements in mind while trying to spot phishing emails:
- Since hackers are unlikely to have writers crafting their emails, they usually end up making noticeable mistakes. If you notice apparent typos or unclear text, this is a big red flag.
- Phishing emails usually seem generic, without your name, reference, or other identifying information, because hackers won’t spend time personalising emails.
- If you check the sender’s email address, phishing emails usually won’t come from the company's own domain. Most reputable companies use their own domain email, while hackers will alter the address, even if they have made an effort to make it look real.
- Hackers will usually send unsolicited attachments or ask for sensitive information via email.
Implement two-factor authentication and strong passwords Implementing two-factor authentication will provide an extra layer of protection, safeguarding users’ credentials and sensitive company data from hackers. Consider using security keys like the Yubico YubiKey for an additional, concrete layer of protection against phishing. Also, ask your employees to maintain strong passwords and to use different passwords across different services. If a password is revealed in a data breach, this prevents hackers from gaining access to other accounts. Conduct company-wide cybersecurity training The main defence against phishing attacks in your company is security-savvy employees. It is thus crucial to conduct company-wide security training so as to safeguard your company’s data.
You can build this training into your onboarding process and, of course, schedule regular refresher courses. Consider using employee training software to train your remote employees. Whatfix’s real-time in-app training programs will let you train employees on demand while offering insights to measure training effectiveness. Moreover, your employees can access all the training resources from within your web application. Try to make your program as effective as possible so that you can engage your employees. Training should cover best practices, but that’s not all. See that your employees are taught the steps to take if they notice something suspicious and how to alert management. Wrapping Up Since remote work has surged, phishing attacks on businesses and companies have increased too. Keeping your employees’ credentials and your company’s data safe should be your top concern, even more than before.  Start by enhancing your email security, looking out for keyloggers, using a web filter, and more. Also, make sure to avoid clicking on links before you ascertain whether the destination is the right one. Additionally, learn how to spot a phishing email, implement two-factor authentication and strong passwords, and conduct company-wide cybersecurity training. Finally, always keep yourself updated with the latest trends in the cybersecurity industry.

### IaC: IT Infrastructure Management Model That Codifies Everything

Managing a complex IT infrastructure is no joke, and as you know, it requires a dedicated team of professionals. Well, at least it used to. Nowadays, IaC (infrastructure as code) aims to provide business leaders with a concrete solution to their IT needs, making sure that any infrastructure is easier to manage through automation and numerous handy features that allow you to minimise financial waste and redirect talent in your company towards bigger and better things.  This is especially important in the era of cloud technology: now that companies all over the world are increasingly migrating to the cloud, provisioning complex infrastructures and building efficient data centres can seem like a daunting task. This is why it’s important to demystify the IaC model and highlight the benefits it brings to the modern business world. Today, we are taking a look at the five key benefits and features of IaC and how it can help you manage your infrastructure with ease while allowing you to grow and take your company forward as a whole.  Ensuring consistency in configuration When a business migrates its operations to the cloud, IT experts are tasked with deploying the infrastructure and configuring it to avoid human error and inconsistencies, prevent resource waste and mismanagement, and ensure that all applications are performing as intended. During the setup phase, there are many things that can go wrong, and even the subtlest differences in configuration can be difficult to spot and debug. This invariably leads to subpar results and can even prevent you from running your operation without any downtime – which is usually the key objective during a migration. Deploying a complex cloud infrastructure takes time and effort, but the infrastructure as code model helps alleviate the pressure and expedite the process by completely standardising the setup of your infrastructure to minimise the risk of errors or inconsistencies. In turn, this reduces the risk of coming up against incompatibility issues that would stifle your applications and operational efficiency.
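To illustrate the standardisation idea behind infrastructure as code, here is a deliberately simplified, tool-agnostic Python sketch of the "declare the desired state, then reconcile" pattern. Real IaC tools (Terraform, CloudFormation, Pulumi and the like) apply the same principle against actual cloud APIs; every name and value below is hypothetical.

```python
# Desired infrastructure, expressed as data that can be version-controlled
# and reviewed like any other code.
desired_state = {
    "web-server": {"size": "medium", "region": "eu-west-1", "count": 2},
    "database":   {"size": "large",  "region": "eu-west-1", "count": 1},
}

# What currently exists (in a real tool this would be discovered via the
# provider's API, not hard-coded).
current_state = {
    "web-server": {"size": "small", "region": "eu-west-1", "count": 2},
}

def reconcile(desired: dict, current: dict) -> list:
    """Return the actions needed to make `current` match `desired`."""
    actions = []
    for name, spec in desired.items():
        if name not in current:
            actions.append(f"create {name} with {spec}")
        elif current[name] != spec:
            actions.append(f"update {name} to {spec}")
    for name in current:
        if name not in desired:
            actions.append(f"destroy {name}")
    return actions

for action in reconcile(desired_state, current_state):
    print(action)
```

Because the whole environment is declared up front, running the same reconciliation twice produces no further changes, which is exactly how IaC keeps configurations consistent across teams and environments.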
Improving efficiency in software development Software development is a complex process that requires the careful collaboration of numerous developers, and even cross-collaboration between development and operations. To make the software development process as efficient and effective as possible and prevent setbacks, companies will often employ the DevOps model to bridge the gap between operations and development, automate and standardise processes, and ensure incremental progress on a daily basis. DevOps also helps transform testing to make it more efficient and coherent for all teams, so that development can move along without setbacks. Infrastructure as code allows you to easily deploy the cloud architecture in numerous stages to aid and expedite development, but it also allows your developers to build their own sandbox environments where they can develop with ease. On the other end of the spectrum, QA specialists can have constant access to copies of the production cycle in real time, allowing them to test quickly and act at a moment’s notice to uncover any bugs and possible errors. What’s more, you can use IaC to spin down the environments you’re not using, which leads to better resource management and greater financial savings. Reducing the risk of human error One of the biggest benefits of the infrastructure as code model is that it takes a comprehensive approach to infrastructure automation in an attempt to bring the risk of human error down to a minimum. Basically, this is a model that codifies everything and reduces the amount of manual labour required to deploy and maintain your cloud infrastructure or your on-site infrastructure. This is an important feature to have if you want to keep managing your infrastructure with ease should your lead engineer leave the company to pursue other career opportunities. When that happens, the last thing you want is to be left with an infrastructure designed in a way that no other engineer can figure out. With infrastructure as code, you’re getting full documentation and reports on how the infrastructure is set up and how every application works, allowing any new engineers to pick up where the previous ones left off without setbacks or the risk of inadvertently introducing new issues into the system. Supporting long-term financial savings Through automation and standardisation of core infrastructure management processes, the infrastructure as code model is able to facilitate long-term financial savings for your business in the cloud. This allows for more efficient database management, but also for smarter payroll decisions and the reduction of extraneous business expenditure over months and years.  With IaC at your side, you’re able to make better hiring decisions and only bring mission-critical talent on board, and it also allows your IT staff and engineers to shift their focus from menial tasks to mission-critical objectives. Along with the ability to create standardised scripts that eliminate the use of resources that you don’t need, you’re able to ensure long-term financial savings.  Automating infrastructure management to boost productivity Finally, it’s important to note that the biggest advantage of IaC is probably its ability to maximise the performance and productivity of your IT experts, engineers, and software developers. IaC achieves this mainly by automating infrastructure management and delivering advanced features that make the whole process more sustainable and less demanding on your employees.
Ultimately, this can help you scale down your IT requirements and make better long-term decisions while making your operation more efficient and flexible in the process. Wrapping up The IaC model revolutionises the way we look at data centres and cloud infrastructures, and now is the time to integrate it into your operation. Use these insights to implement IaC and pave the road to higher productivity, efficiency, and financial savings in the years to come.

### Digital workforces: Automation processes in workforces for SMEs

Let's start with a definition. A digital workforce comprises a variety of automation technologies working alongside the human workforce. These technologies include driverless vehicles and AI, for example, but the term is currently mostly used to describe software robots or robotic process automation (RPA). We have seen the digital workforce evolve from car manufacturing robots and CNC machines to more integrated supply chains, through IoT to its current incarnation where RPA, chatbots, digital assistants (e.g. Siri) and the like are almost becoming separate entities. Hence the term workforce. The benefits and challenges of a digital workforce for SMEs The most obvious benefit of a digital workforce is that it does not require rest, holidays or even a desk. Arguably the most critical factor in favour of 'employing' a digital workforce is that robots can perform routine tasks much faster and more accurately than humans can. And they do not get bored in the process. In other words, the human workforce no longer needs to work on routine, tedious and often time-consuming tasks that mostly provide little satisfaction. In a way, that is also its most significant challenge; the human workforce can easily perceive robotics as threatening their jobs, so SMEs should consider this. The previous automation and internet revolution created more jobs than were lost; however, an adjustment took place in the type of work we, the humans, do. In the coming 10 to 20 years a much larger shift than we have seen before will likely occur, but there is no reason to believe that the emerging new technologies will not create many new jobs. They will, however, require a different skillset.  Not only that, but the pace of business will increase. As robots can work between 4 and 10 times faster than humans, productivity can improve immensely. Think of the time you now need to produce a report on, let's say, process efficiency: collecting data, structuring it and interpreting it. A robot can deliver data and analyses much faster than we can (a simple sketch of such a task appears at the end of this section). What will be left for us humans is to translate such reports into actions and creative innovations.  The critical technologies for a digital workforce All of the technologies mentioned above will impact all businesses. The first technologies that will be widely adopted are RPA, 'smart' chatbots and mobile robots like self-driving forklifts, drones and probably trucks. Gaining experience with these will enable faster adoption of more advanced technologies like AI and quantum computing, once those are commercially available and affordable. Key to operating a successful digital workforce will also be the speed at which we can adapt and learn new skills. In this, AI combined with AR/VR will probably have an important role to play.
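As a small, hedged illustration of the routine reporting task mentioned above, the sketch below shows the kind of job a simple software robot could take over. The process_log.csv file and its columns are hypothetical, and a real RPA tool would add scheduling, error handling and connectors to business systems.

```python
import csv
from statistics import mean

def build_efficiency_report(path="process_log.csv"):
    """Read a hypothetical process log and summarise run times per process."""
    durations = {}  # process name -> list of completion times in minutes
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            durations.setdefault(row["process"], []).append(float(row["minutes"]))

    lines = ["Process efficiency report", "=" * 26]
    for process, values in sorted(durations.items()):
        lines.append(
            f"{process}: {len(values)} runs, "
            f"avg {mean(values):.1f} min, worst {max(values):.1f} min"
        )
    return "\n".join(lines)

if __name__ == "__main__":
    print(build_efficiency_report())
```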
The best strategy for SMEs to build a digital workforce and invest in change management At this point, it's important to stress that a digital workforce is about augmenting what your staff do — and freeing up their time for better things — not replacing them. Your goal is to solve business issues with the right solutions. That requires business acumen as the starting point. Building and implementing a digital workforce will require thought, preparation, and communication with your human workforce. A good strategy that I see employed by many companies is to think big but start small.  First, thinking big. It means that you need to start thinking in terms of certainty: businesses that do not get on this train soon will fall behind rapidly. Set a goal, have a vision for a future where your human workforce works together with the digital workforce, and start communicating that vision. Be sure as well to have a human strategy. Jobs for humans will change. The skills and competencies most in demand will be in the creative, complex problem-solving and communication categories, so what does that mean for your employees?  The second element of this strategy is starting small. Rather than taking a 'big bang' approach, split your challenge into bite-sized chunks. Solve little bottlenecks first, or glean specific insights that inform your decision-making.  Start with one or two business processes to learn and gain experience with RPA at a relatively low cost. As soon as you are ready to scale up, spend time implementing a governance structure for the digital workforce. I see a recurring issue with companies starting this kind of transformation: it is either left up to IT or operational units to manage. The consequence is that central coordination is lacking, resulting in situations where two departments may spend time and money building very similar solutions or using different software to build robots. Establishing clear goals will help protect you later against being dazzled into buying flashy technology for its own sake. Technology is a means to an end.  Once you have framed it the right way, it will be much easier to write requirements or compare and contrast solutions and providers. Most importantly, it will be easier to involve your people, bring them on board with your vision and strategy, and ultimately transform your business organically and sustainably. About Eggcelerate:  Eggcelerate is a B2B growth specialist supporting the growth of small and medium B2B tech companies in Britain and Italy with international aspirations, helping them manage growth and change while streamlining their operations and improving their bottom line and cash flow position. Eggcelerate has worked with, among others, British start-ups in the fields of FinTech (equity crowdfunding, supply chain finance) and IoT (immersive events, drones), and with more mature Italian SMEs (a software house and a manufacturer of electronic components).

### Top 5 trends that will shape data management and protection in 2021

How we live and work changed dramatically in 2020. As a result, the way many of us manage and protect data in 2021 will be very different. The massive switch to remote working has led to a huge increase in the use of Microsoft 365 and Google Workspace, among other applications and SaaS. This seismic shift in collaborating and storing sensitive data has exacerbated the risks of widespread data loss.
At the very time businesses are preparing for an uncertain future, IT departments are facing an alarming number of internal and external threats. This has prompted greater awareness across industries of the need for third-party data protection technology. Questions are being asked about whether the traditional on-prem approach to data protection remains fit for purpose, especially when employees are rarely onsite and the data is in the cloud. With many organisations entering the new year on tighter budgets amid a pandemic-hit economy, there is a real need to ensure all expenditure represents value for money. Some businesses have been quick to adapt already. For others, 2021 is expected to usher in a transformational year for data management and protection.   The rush to set up employees’ home offices at the outset of the pandemic forced organisations to bring forward cloud-native application plans.  As a return to normality seems unlikely any time soon, many enterprises have decided to stick with remote working, if not forever then for the foreseeable future.  According to a survey of chief financial officers by research firm Gartner, almost three-quarters of companies (74%) expect at least 5% of their former on-site employees to work from home on a permanent basis, while nearly a quarter of firms are planning to keep at least 20% of their workers out of the office post-pandemic. What was supposed to be a temporary change in working arrangements has become the new normal for many organisations. Just 5% of IBM’s workforce currently work from the company’s offices. Remote work has resulted in greater cloud usage. Many businesses will be moving more of their infrastructure to the cloud and having to deal with the security challenges that arise from a hybrid infrastructure. That creates problems for companies that rely on perimeter and on-premises security software and appliances to keep their systems and data safe.  Six in 10 remote workers are using personal devices to carry out their work, and almost all of them naively believe that their devices are secure. Keeping everyone’s data protected is no easy task.  Here we look at the top 5 trends to watch out for in 2021 as organisations strive for smarter, safer and simpler ways to manage their data.  As ransomware becomes an even bigger threat, the cloud will be critical to securing endpoints When organisations are more vulnerable, cyber-criminals need little encouragement to capitalise. As remote work disrupted and weakened security processes, the number of ransomware attacks increased – and there is no sign of any let-up. The first half of 2020 saw an approximate 35% increase in total attack volume compared to the second half of 2019. No one is safe: cybercriminals have even set their sights on life-saving vaccine supply chains. The targeting of the most vulnerable victims, and tactics that make it more difficult to recover encrypted data, will keep ransomware the most profitable “line of business” for cybercriminals in 2021 and the single biggest threat for all organisations. It is impossible to know what the future will be like over the next 12 months, but we can be better prepared.  Michael Sentonas, chief technology officer at security services firm CrowdStrike, said: “It’s my feeling that after the pandemic has subsided, we are going to see a major shift in the workplace as more businesses turn to remote-friendly cultures.  “This shift will cause cloud and SaaS adoption to be more important than ever.
The cloud will ultimately secure workloads regardless of where employees are located, which will be critical to securing endpoints now and moving into the future. “With no sign of attacks slowing down, it’s more important now than ever for companies to be vigilant about their security posture and train employees on possible risks to protect and defend against rising threats.” That means investing in people and technology to help stop attacks, and focusing on the basics - multi-factor authentication (MFA), regular application of security updates and especially comprehensive backup policies.  Organisations will undergo a total reassessment of security strategies Cloud is here to stay and employees are going to be hooked on collaboration tools that make their jobs easier and more productive. However, attackers will focus on any resulting security weaknesses. Protecting all endpoints is a major challenge, though, when they can be anywhere and on devices organisations don’t control. Nation-state sponsored attacks are becoming more prevalent, and keeping up with the increasing sophistication of organised cybercriminals, who are getting ever more resourceful, is a major issue.  The capability to adapt what may be a long-standing security infrastructure and ensure staff pivot quickly to meet new demands is crucial. Security measures adopted on the assumption that remote working is only temporary must be revisited. Many companies are incapable of locking down their employees’ laptops. When the workforce is operating outside of any perimeter security that previously existed within an office space, an organisation will be left relying on:
- security built into the endpoint
- security awareness instilled in users
- forced connectivity back to the infrastructure via a VPN
Employees’ decisions to use unapproved cloud services for work, so-called shadow IT, add to the new vulnerability, while remote privileged users pose a further risk to network security.  If a ransomware attack cannot be prevented, recovering from it is absolutely paramount.  Without an isolated, up-to-date backup of data, IT systems have no previous working state to revert to. Offsite, air-gapped backup will therefore be a top priority in 2021. Cloud-based protection that guarantees recovery from ransomware attacks will be in huge demand, while the burden of managing manual, time-consuming backups will be consigned to the past. Regardless of where the data is stored, organisations will demand instant data recovery - even when structured and unstructured data is spread across fragmented silos.  A specialist cloud data management service typically means robust processes that offer vital protection against backups being deleted accidentally or intentionally by hackers or rogue employees.  Modern solutions now instantly restore individual files or whole systems, using user-driven recovery methods. Users and customers can access and work on priority data while the rest recovers in the background. Software-only solutions (especially pertinent in the current climate) with military-grade encryption and full automation will become more and more popular. CISOs will face increased responsibility and demands  As cybercriminals aim to profit from disruption, Chief Information Security Officers should seize the chance to take a bigger role at executive level.  The pandemic has undoubtedly raised the profile of security. A greater number of ransomware attacks has caught the attention of boards of directors, and they are looking to CISOs to respond.
Almost half (43%) of CISOs feel that they are competing with other business initiatives for funding, according to 451 Research and security firm Kaspersky. However, almost every expert recognises that businesses need to take security more seriously than ever before.  The sudden need to safely support scores of remote workers has raised concerns over the vulnerability of systems and data – and not just to ransomware. According to Forrester, insider incidents, accidental or malicious, will be a factor in a third of all data breaches in 2021. That’s a 25% year-on-year increase. This will be down to a combination of the rapid shift to remote working, fear of job loss, and the ease with which data can be moved. Organisations should keep in mind that maintaining control is everything, while making threat defence and employee engagement big priorities. As the financial impact of breaches grows, CISOs should find it easier to make their case for funding - especially since Gartner is predicting that by 2024 as many as 75% of CEOs will be personally liable for cyber-physical security incidents. CEOs will be eager to understand the impact of a ransomware attack, the speed of response and the impact on the business.  Having plans in place now is far better than trying to contain an attack later, when loss of earnings and company reputation are on the line. To fully protect a business, a ‘recovery-first’ approach is essential. More businesses will turn their data deluge into a data-centric advantage The value of data continues to rise exponentially, yet many organisations remain data rich but information poor. Data silos prevent businesses from exploiting the true value of their data. Sales has its databases, finance has enterprise applications, product usage often involves a third party, while agencies have web/log data and suppliers may have their own ways of doing things too. Nearly nine in 10 (89%) IT leaders report these data silos are creating business challenges for their organisations’ digital transformation initiatives, up from 83% last year. Data is everywhere, but tracking it down and establishing who is accessing it is problematic. It varies in type, location and rate of change, while collaboration is complicated. As the data deluge continues, organisations must be able to manage and secure the data across their IT estates. But that’s not as straightforward as it sounds. The most successful CISOs have always viewed the security function in a business context. Now that they are the focus of so much more attention, it is vital to press home that viewpoint.  That means going beyond talking about threats and mitigations and explaining how the right kind of protection also enables the business - as opposed to holding it back. Understanding and classifying data across an entire ecosystem can be a huge issue. Legacy backup technologies can lead to data silos, while storing data offsite on tape can lead to delayed access. Restrictions on visibility can mean lost opportunities, and this has monetary implications. Apart from addressing the challenge of storage costs, CISOs now have access to technology that gives businesses complete oversight of their estate. Having the capability to see data in one place can help simplify governance across workloads with centralised controls. The right automated protection also reduces the operational burden as data management becomes smarter, quicker and easier.
The importance of Kubernetes-native software will increase - along with the need to protect it There continues to be a huge increase in cloud-native technology. From a business point of view it is highly desirable, as applications are always on, always available and can be updated by a development team with zero downtime.  Development teams address customer requests more or less as they come in, instead of waiting weeks. In 2021, when applications are built and taken into the cloud, they are being deployed in containers.  This is because containerisation brings the capability to run all kinds of different applications in a variety of different environments - on-premises or within the public cloud, such as AWS, Azure, and GCP.  The scheduling and orchestration of operations is essential, though, to eliminate many of the manual processes involved in deploying and scaling containerised applications. Kubernetes is an increasingly popular way of provisioning and keeping track of all these containers, as it eases the burden of configuring, deploying, managing, and monitoring even the largest-scale containerised applications. Without an orchestration framework, services will be left to run wherever they have been set up manually – and if a node is lost or something crashes, it is manual work to fix it.  With Kubernetes, you determine how your environment should look, and the framework ensures it looks like that, dramatically scaling up or down if necessary. Organisations can schedule and run containers on clusters of physical or virtual machines, while automating many operational tasks. By facilitating the deployment of applications in this more efficient way, Kubernetes saves time and money because it takes less manpower to manage IT. Infrastructure costs can be slashed for an organisation operating at massive scale. Apps are also more resilient and performant, as they can be moved more easily between different clouds and internal environments. As the usage of containerised software increases, expect more organisations to develop software specifically with Kubernetes in mind.  These containers will need protecting if an organisation is to recover individual files quickly, recover from user-driven errors, or recover configuration information. There will also be times when data needs to be pulled out of sources like databases rapidly and injected back into an infrastructure that has just healed itself. So 2021 is likely to see an increase in demand for third-party data management platforms that specialise in protecting Kubernetes environments.

### Fog Computing And IoT: The Future Of IoT App Development

In simple words, fog computing stands for a decentralised computing infrastructure where the computing resources (e.g., applications) are placed between the cloud and the data source. The term "fog" actually refers to a cloud's periphery or edge. Fog computing pushes the power of the cloud closer to where data is created and used, meaning more users can stay connected to the Internet at the same time. It offers the same network and services as the cloud, but with added security and compliance. Characterisation of fog computing According to IDC, 45% of the data worldwide will move closer to the network edge by 2025, and 10% of the data will be produced by edge devices such as phones, smart-watches, connected vehicles, and so on.
Fog computing is believed by some to be the one technology that will stand the test of time and even outpace Artificial Intelligence, IoT app development, and 5G over the next five years. It is a highly virtualised platform that offers storage, compute, and networking services between the traditional cloud computing data centres and end devices. Fog computing can be characterised by low latency, location awareness, edge location, interoperability, real-time interaction between data and cloud, and support for online interplay with the cloud. Fog applications involve real-time interactions instead of batch processing, and they often communicate directly with mobile devices. Fog nodes also come in different form factors and are deployed in various environments. Fog players: providers and users Although a lot has been written and researched about fog computing, it is not easy to say how different fog players will align in the future. However, based on the nature of significant services and applications, it is safe to conclude that:
- Subscriber models (smart grids, smart cities, connected vehicles, etc.) will play a massive role in fog computing.
- Providers angling towards global services will have to cooperate.
- New entrants will join the fog realm, including transportation agencies, car manufacturers, public administrations, and so on.
Some known players in fog computing include open source cloud infrastructure providers such as Apache CloudStack, OpenStack, and OpenNebula. Fog computing and IoT app development: the connection Do not be surprised when we tell you there are almost 31 billion IoT devices in use as of today. No wonder we produce 2.5 quintillion bytes of data per day. It is obvious we need an alternative to the traditional method of handling data. That is where fog computing enters the picture. When an application or a device collects enormous volumes of information, efficient data storage becomes a challenge - not to mention costly and complicated. Heavy data puts a load on the network bandwidth. Setting up large data centres to store and organise this data is expensive! Fog computing gathers and distributes storage, computing, and network connectivity services, reduces energy consumption, enhances the data's performance and utility, and minimises space and time complexity. Let us take two IoT examples: 1. Smart cities Data centres were not developed to handle the growing demands of smart city apps. As more and more people use more IoT devices, more data is transmitted and accessed. Fog computing can help such ill-equipped smart grids deliver the real value of IoT app development. 2. Utilities The term "utilities" covers applications for hospitals, transportation, law enforcement, and so on, which need the latest technology to deliver data that supports their operations. For instance, information about carbon emissions, potholes on the road, and water leakages can be used to update billing information, save lives, and improve operations. How fog computing enhances the value of Internet of Things solutions IoT and end-users are becoming increasingly powerful. A large amount of data is now being processed directly in the cloud. Adding to that, here are six benefits that fog computing can deliver to the IoT app development process: 1. Greater business agility With the right tools, you can build fog applications and deploy them as needed. Such applications program the device to operate in the way a user wants.
2. Better security Fog computing acts as a proxy for resource-constrained devices, updating their software and security credentials. Fog nodes are deployed using the same policies, procedures, and controls used in other parts of the IT environment. And when data is processed by a large number of nodes in a complex distributed system, it is easier to monitor the security status of nearby connected devices. 3. Low latency Have you noticed how quickly Alexa responds when asked to do something? That is low latency at work. Since the "fog" is geographically closer to users (and devices), it can provide near-instant responses, which makes the technology ideal for time-sensitive actions. 4. Network bandwidth efficiency Fog computing enables fast and efficient data processing based on application demands, computing resources, and available networking. Pieces of information are combined and filtered at different points instead of being sent to a single data center over one channel. This reduces the volume of data that has to be transferred to the cloud, saving network bandwidth and considerably reducing costs. 5. Uninterrupted services Fog computing can run on its own and maintain services even when network connectivity to the cloud is disrupted. Moreover, because there are multiple interconnected channels, a total loss of connection is very unlikely. 6. Improved user experience Edge nodes run power-efficient protocols such as Zigbee, Bluetooth, or Z-Wave. Fog computing enables near-instant communication between devices and end-users, largely irrespective of wider network connectivity, thus enhancing the user experience. Fog computing vs. cloud computing: the difference Although fog and cloud computing may seem similar from the outside, they form different layers of an industrial Internet of Things solution. Here are a few parameters on which the two technologies differ: 1. Architecture Fog architecture is distributed and comprises millions of small nodes located as close to client devices as possible. It is organized into several layers that together form a network. Cloud architecture, on the other hand, is centralized: large data centers are located across the globe, far away from client devices. 2. Communication with devices Fog bridges the gap between the hardware and the data centers and hence is closer to end-users. Without the fog layer, the cloud interacts directly with devices, which is far more time-consuming. 3. Data processing Data processing and storage in fog computing happen close to the source of information, which is crucial for real-time control. The fog layer decides whether to process data from multiple sources using its own resources or to send it on to the cloud. In a cloud computing solution, the same work happens far from the source of information, in remote data centers. 4. Computing capabilities Cloud computing capabilities are higher than those of fog. 5. Number of nodes Fog consists of millions of small nodes, whereas the cloud has only a few large server nodes. 6. Analysis Fog performs short-term analysis because of its instant responsiveness between device and end-user, whereas the cloud is suited to long-term analysis because of its slower responsiveness. 7. Security Fog computing uses a range of security measures and protocols, and its distributed architecture makes the risk of cyber threats and data leakage much lower. Cloud computing, by contrast, is not possible without an internet connection, and because it is centralized the risk from cyber threats is higher.
Fog computing vs. edge computing: are they the same? This is a tricky question, but to keep things simple: yes, fog computing and edge computing are essentially the same thing. Both technologies push intelligence down to the local area network level of the architecture, so that computation does not have to be carried out in the cloud, saving time, resources, and money. Both can also help businesses reduce their reliance on cloud-based platforms for data analysis, which reduces latency and shortens the time taken to make data-driven decisions. One significant difference, perhaps, is in the data processing. In fog computing, data is processed within a fog node; edge computing processes the data on the device or sensor itself, without transferring it to any other infrastructure. Nonetheless, both technologies save time and resources when it comes to maintaining operations, with proper data collection and analysis in real time. Imagine getting near-real-time analytics that help optimize performance and increase uptime: that is what both fog and edge computing make possible. Real-time use cases of fog computing and IoT app development There are many areas where fog computing makes its mark as a robust, dynamic, and advanced technology. This section discusses four such real-time use cases: 1. Video streaming Data transmission in video streaming apps is well-organized at the fog layer, thanks to the elasticity and capacity of fog networking combined with real-time data analysis. Fog also enhances communication in virtual desktop systems, enabling real-time video analytics for surveillance cameras. 2. Healthcare monitoring systems Smart healthcare decisions in the future will be incomplete without robust, real-time health observation and monitoring. This can be made possible with the deployment of fog computing frameworks that enable data transmission in real time. Another important use case is the "U-Fall" application, which automatically detects falls, for example in the case of mild strokes. 3. Gaming Just like the cloud, fog computing brings power closer to gamers. For instance, SEGA's fog gaming system uses local game centers and arcades as a server farm to ensure low latency. So, instead of streaming from the cloud, gamers harness the CPU power of local arcade equipment. The distributed devices use fog nodes to deliver a better quality of experience, ensuring smooth multiplayer online gaming. 4. Smart traffic light system Imagine a smart traffic light system. Its node interacts locally with many sensors to detect the presence of cyclists and pedestrians and to measure the speed and distance of vehicles. Based on that information, it can send warning signals to approaching cars and control the lights accordingly. In another example, because it already monitors video surveillance cameras, it can spot an ambulance by its siren and emergency lights, and the traffic lights can be changed to let it pass through the traffic. Summing it up Fog computing is a companion to the cloud and handles the large volumes of data generated daily by Internet of Things solutions. As mentioned earlier, processing data closer to the source of information solves the challenges of exploding data volume, velocity, and variety, and gives businesses better control over their data. Fog computing also accelerates awareness of, and response to, events.
It eliminates the need to go to the cloud for every analysis. That means no more costly network bandwidth problems in Internet of Things solutions and no need to offload volumes of data onto the core network. Fog computing also aims to protect sensitive IoT data by analysing it within the company's firewalls. Ultimately, it leads to increased business agility, improved security, and higher quality of service. ### How Cloud accounting can help your business continuity The events of this year have led many businesses to revisit their business continuity plans - in some cases for the first time since they were created. As lockdown restrictions return to England and other parts of the UK, businesses face a 'lessons learned' phase, where they are in a position to use what they did right, and what they did wrong, when lockdown first hit in March this year to help the business this time round. It means that for any business that has revised its continuity plans, now is the time to put them into action. Business continuity plans In the past, businesses have typically created a plan, filed it away and only come back to it when needed - and often it's only then that they realise it isn't fit for purpose. Going forward, businesses will have to revisit their plan on an ongoing basis and consider wider factors that may affect its effectiveness. For most businesses, the original continuity plan was probably based around market conditions, identifying ways that revenue might be hit. But continuity plans have to consider the impact on people and processes too, as well as other factors like cyber security as we become more reliant on technology and data. Depending on the sector you are in, you may have found your sales rocketing, plummeting or staying somewhat consistent during lockdown earlier this year. But your success or failure may have lain in how you were able to react and equip your team to keep doing their jobs. A lot of businesses will have had their teams split across multiple locations since March, working remotely, or will have recently returned to this way of working because of the lockdown restrictions imposed. It's likely that everyone is now well versed and confident with working from home - but there might still be ways to better equip your team for working from anywhere. For finance leaders, I would certainly expect that cloud accounting is now firmly on the radar, if not being seriously considered as pivotal to the role finance teams play in their business's continuity. After all, the finance team's key role is to track the numbers to protect the business and help fuel its recovery. Cloud accounting can play a major part in ensuring finance teams can do that job. Investing in cloud solutions It's hardly surprising that Q1 of 2020 saw a surge in spending on cloud-based solutions, with total spend reaching more than £21bn globally. I am quite sure that any finance team that has continued to work with more traditional accounting software over the past few months will hope to be equipped with cloud accounting moving forward. By its very nature, a centralised finance function is designed to eradicate errors, decrease repetition and reduce low-value work. Automating tasks such as generating invoices and reminders, so you don't have to manually keep track of debtors, can free up a significant amount of time for finance teams to focus on strategy and the bigger picture.
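As a purely illustrative sketch of that kind of automation (the invoice records and reminder wording below are invented for the example; a real cloud accounting product would surface this data through its own API), flagging overdue invoices and drafting chasing messages takes only a few lines:

```python
from datetime import date

# Illustrative invoice records; a cloud accounting platform would normally
# supply these through its own API rather than a hand-written list.
invoices = [
    {"number": "INV-1041", "customer": "Acme Ltd", "due": date(2020, 10, 15), "paid": False},
    {"number": "INV-1052", "customer": "Birch & Co", "due": date(2020, 11, 20), "paid": True},
    {"number": "INV-1057", "customer": "Cedar plc", "due": date(2020, 11, 1), "paid": False},
]

def draft_reminders(invoices, today=None):
    """Return a reminder message for every unpaid invoice past its due date."""
    today = today or date.today()
    reminders = []
    for inv in invoices:
        if not inv["paid"] and inv["due"] < today:
            days_overdue = (today - inv["due"]).days
            reminders.append(
                f"Reminder to {inv['customer']}: invoice {inv['number']} "
                f"is {days_overdue} days overdue."
            )
    return reminders

for message in draft_reminders(invoices, today=date(2020, 11, 9)):
    print(message)
```

The point is not the code itself but that nobody in the finance team has to keep this list by hand.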
This frees the finance team to help guide the business through every step of its recovery and progress, rather than being bogged down in what's gone before. And with lockdown restrictions reintroduced, finance teams find themselves in a similar position to March, only this time with some experience of how to handle the situation and knowing exactly what tools they need to perform their responsibilities efficiently. The nature of cloud accounting solutions means that the balance sheet of the business can be viewed in real time, making it much easier to provide an accurate snapshot of current company finances. This is vitally important when businesses are working through their business continuity - allowing them to focus on the business's exact position as well as looking ahead to identify opportunities that will help the business recover or stay ahead of the competition. Planning ahead It may be an understatement to say that the last six months have been unpredictable, and the same has to be said about the months ahead. However, ongoing continuity plans are going to be crucial, and central to those plans is making sure that your team has the tools in place to ensure your processes are smooth and efficient, your people thrive and your business has the capability to withstand financial hits and bounce back from them. Cloud accounting could be one of the tools that helps your business continuity. ### Today Red Hat announced a definitive agreement to acquire StackRox Today Red Hat announced a definitive agreement to acquire StackRox. This is an exciting milestone for StackRox and a tremendous validation of our innovative approach to container and Kubernetes security. It combines the industry's first Kubernetes-native security platform with Red Hat's leading enterprise Kubernetes platform, OpenShift - helping businesses further accelerate their digital transformation initiatives by more securely building, deploying and running their cloud-native applications anywhere. StackRox will continue to support multiple Kubernetes offerings such as Amazon EKS, Azure AKS, and Google GKE. You can gain additional insights from Ashesh Badani, Senior Vice President of Cloud Platforms at Red Hat, in this blog post. StackRox was founded over six years ago with an initial focus on runtime security for containers. Over time, based on customer feedback and industry trends around DevSecOps and shift-left security, we expanded the product footprint to cover use cases across the build and deploy phases of the container lifecycle. Over two and a half years ago, we made a strategic decision to focus exclusively on Kubernetes and pivoted our entire product to be Kubernetes-native. While this seems obvious today, it wasn't so then. Fast forward to 2020 and Kubernetes has emerged as the de facto operating system for cloud-native applications and hybrid cloud environments. According to CNCF, Kubernetes is now used by 91% of its annual survey respondents, with 83% of them using it in production compared to 58% just two years ago. Today, DevOps and security teams at cloud-native companies, Fortune 500 companies and government agencies rely on StackRox to implement security and compliance policies across the entire container lifecycle. Why Is Red Hat Acquiring StackRox? 2020 was a watershed year for StackRox.
Despite the challenges presented by the global pandemic, we grew over 2.5x while staying true to our core value of exceeding customer expectations as shown by our customer satisfaction, retention and expansion metrics. We delivered 17 major releases and launched KubeLinter, our first open source project, a static analysis tool to identify misconfigurations in Kubernetes deployments. Red Hat recognized that StackRox’s innovative approach to container and Kubernetes security was a perfect complement to its vision of delivering hybrid cloud software architectures that can be deployed anywhere. With StackRox, OpenShift customers gain a single platform to more securely build, deploy and run cloud-native apps across their entire fleet of Kubernetes clusters. By joining Red Hat, we will be able to accelerate product innovation and achieve far greater scale, on a global level, than we would be able to achieve as an independent startup. Red Hat sees the tremendous Kubernetes security benefits our customers have enjoyed, understands how security remains a top priority, and knows that together we can further increase the value we provide to our customers. What This Means For Our Customers As a company, one of our core values is to exceed customer expectations. We understand how important Kubernetes security is to our customers’ success. As we enter into this agreement, our customers remain our top priority, and we are excited about the additional resources we will be able to employ to provide our customers even greater business value. By joining forces with Red Hat, we will be able to accelerate our product roadmap and be in a position to bring innovative capabilities to our customers faster; many of them were planned in collaboration with our customers and we know that they will help customers enjoy even greater benefit from our platform. With its heritage and commitment to open source, Red Hat also identified synergies with StackRox as a cloud-native company with a high-caliber team and a growing open source strategy. As part of Red Hat, we will be able to share our expertise and technology with the open source community and build on the great work that Red Hat is already doing within the cloud-native ecosystem. The Next Step Bringing StackRox’s Kubernetes Security Platform to Red Hat’s leading enterprise Kubernetes platform, OpenShift, excited us from the day Red Hat approached us with the idea of joining forces. Over the past several weeks, we have gotten to know each other negotiating this agreement and there couldn’t be a better strategic and cultural fit between our two companies. As we look forward to the next step in our journey, I would like to take a moment to sincerely thank our customers for trusting us for container and Kubernetes security. I would like to thank our major investors - Amplify Partners, Sequoia Capital, Redpoint, Menlo Ventures and Highland Capital Partners - for your belief in our vision and for your unwavering support. Finally, I would like to thank our entire team for their continued dedication and relentless commitment to our vision, our customers and to each other. Ali, I and the entire StackRox team are thrilled to join Red Hat and accelerate our vision of enabling organisations to securely build, deploy and run their cloud-native applications anywhere. ### How Can DevOps Make Mobile App Development Simple & Faster? 
DevOps practices for mobile app development continue to be popular, allowing better integration with existing development processes and better utilisation of resources. It is essential to understand this development approach's key value propositions. This article will explain the critical reasons leading mobile development companies are adopting mobile DevOps practices to boost productivity and efficiency. What Exactly is DevOps? DevOps is a modern software development methodology that, for the first time, ensured a collaborative approach to bridging the gap between the development process and business operations. In other words, it ensures optimum collaboration between all the different processes and contributors that play an essential role in the development and deployment of software. Bringing everyone in a project onto the same page ensures excellent productivity, superior output, and higher efficiency. Mobile DevOps Speeds Up App Release By allowing collaboration across development and operations teams, DevOps streamlines the entire process, from conceptualisation through development, testing and deployment. It allows easy sharing of information between multiple teams and can shorten the development cycle. During development, frequent iterations ensure quicker code deployment and testing, saving a lot of time compared with a phase-by-phase process. In addition, mobile DevOps offers a quicker path to fixing software issues throughout the development cycle, instead of waiting for testing at the end of the cycle. This continuous testing and continuous development, woven into one another, speeds up development while ensuring optimum quality. In many parts of the world where budget-friendly development takes the lead role, the faster development enabled by the mobile DevOps approach has come as a big rescuer. Thousands of expert mobile app developers in India, who face stiff competition on cost and efficiency, are increasingly finding DevOps to be the most suitable development approach. Efficient Resource Utilization An integral part of the mobile DevOps approach is automation, which ensures appropriate resource utilisation and streamlined processes. Automation is applied all the way from source code management through to app testing tools. Mobile app test automation ensures continuous iteration, frequent deployment, and rapid app releases, giving productivity and efficiency a solid boost. Creating Winning Apps Becomes Easier Thanks to the mobile DevOps approach, developers and operations staff can collaborate more frequently to create innovative new features, offer unique user benefits, and deliver unique advantages. All of this ends up creating better apps. Since user experience has become the single most important yardstick for judging an app, innovative features have become important, and this is where the continuous iteration of the DevOps approach proves highly effective. A Plenitude of Development Tools and Options There is no dearth of sophisticated tools and diverse development options befitting every platform and development need in mobile app development. For every app feature, you can find hundreds of quality tools, and the DevOps methodology allows developers to draw on a large repertoire of tools and technologies. Thanks to the continuous integration and modular approach of mobile DevOps, incorporating any tool and developing, deploying, and testing the output within a short span of time is easy.
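Purely as an illustration of that automation (assuming an Android project built with Gradle; the stage list and the placeholder deploy step are invented for this sketch), a pipeline script simply chains the test, build and release stages and stops as soon as one fails:

```python
import subprocess
import sys

# Illustrative pipeline stages for an Android project built with Gradle.
# The deploy step is a placeholder; a real pipeline would call the store's
# upload tooling here.
STAGES = [
    ("unit tests", ["./gradlew", "test"]),
    ("release build", ["./gradlew", "assembleRelease"]),
    ("deploy to test track", ["echo", "upload build outputs to the test track"]),
]

def run_pipeline(stages):
    """Run each stage in order and fail fast, so a broken build never ships."""
    for name, command in stages:
        print(f"--- {name} ---")
        result = subprocess.run(command)
        if result.returncode != 0:
            print(f"Stage '{name}' failed; stopping the pipeline.")
            sys.exit(result.returncode)
    print("Pipeline finished: the build is ready for release.")

if __name__ == "__main__":
    run_pipeline(STAGES)
```

Running a script like this on every commit is what turns continuous testing and continuous development from a slogan into a working practice.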
These faster cycles of development, deployment, and testing make room for more experimentation and the use of diverse tools. Fortunately, DevOps teams now have access to a plethora of development, deployment, and testing tools that can be used in a modular way across multiple projects while making way for continuous development, deployment, and testing. Mobile DevOps for Doing Away with Bottlenecks The principal idea behind mobile DevOps is boosting efficiency, transparency, and internal collaboration to develop better software products. Thanks to mobile DevOps, some of the key bottlenecks that can be mitigated include the following. Addressing Inconsistency Across Environments: Mobile DevOps can streamline the entire process, doing away with the inconsistency of different tools and practices across platforms. Automation to do Away with Manual Interventions: In an environment built for full automation, manual interventions only create bottlenecks, and this is where DevOps is effective. Bringing together Old and New Development Perspectives: DevOps can easily incorporate new approaches and innovations into older development paradigms without harming efficiency. Addressing the Absence of Operational Practices: Thanks to mobile DevOps, an organisation can fix operational shortcomings and inefficiencies. By utilising mobile DevOps practices, new and different types of operational practices can be incorporated, and new ways of managing changes, issues, server requests and so on can be established. Replacing Manual or Older Testing: Since testing and quality assurance (QA) are crucial to building high-quality apps, old and manual testing mechanisms are considered bottlenecks for most projects. This is where DevOps can play a great role in ensuring complete automation and continuous testing. Addressing Lack of Alignment: It is a common issue in many big organisations that not everyone in every department is properly informed about and aligned with the software development process. The mobile DevOps approach can bring everyone onto the same page and ensure comprehensive alignment across different personnel and professionals. Conclusion DevOps does not improve just a single aspect of software or app development; it lifts the whole organisational structure. This ensures a better and more efficient process, more effective collaboration, and a higher-quality software product. ### Why Is Security a Stepping Stone to Technology-Driven Marketing Personalisation? A truly personalised consumer experience can only be provided when sufficient data is available to analyse consumer behaviour. Marketing today has become more personalised, contextualised, and dynamic. Acquiring data is the starting point, and improved technology and algorithms have made it possible for companies to collect it. Companies work under the misconception that personalisation and privacy are conflicting efforts and do not believe the two can be interdependent. In reality, treating them together is a positive-sum gain. Research shows that 60% of consumers are frustrated by brands' inability to predict their needs and think they aren't doing an adequate job of personalisation. Companies find it challenging to optimise marketing personalisation amid increasing privacy issues related to obtaining granular consumer data.
On the one hand, consumers expect to be recognised and to have their experiences personalised; on the other, they are concerned about data privacy. With the implementation of privacy laws like the General Data Protection Regulation (GDPR) and many browsers providing enhanced data privacy, it is clear that collecting consumer data has become more difficult. According to a recent Gartner survey, despite consumers having trust issues about whether their data is used ethically, they are still willing to share information in exchange for convenience and personalised experiences. 63% of consumers expect to be recognised and want their experiences personalised. This has left marketers in a "catch-22" situation. Companies are stuck between the need to personalise and consumers' need to maintain data privacy. Growing privacy concerns and predictions for this tech-driven marketing landscape With the increase in the number of data breaches, the data security market is expected to reach $38.23 billion by 2025 over the forecast period 2020 - 2025. Here are a few growing concerns and trends you need to know about. Data security spending will increase. Companies need to be transparent and prove their trustworthiness by highlighting their privacy policies. This is vital for building trust and accountability with their consumers. Data privacy is not possible without data protection. As online threats continue to increase, cybersecurity spending across the globe is estimated to grow to between $41.9 billion and $43.1 billion in 2020. Employees need to be regularly trained on data safety guidelines to ensure proper and ethical use of data. Facial recognition raises privacy concerns. Facial recognition technologies promise accurate identification. However, studies have shown that the technology is still vulnerable, so accurate data generation and user recognition can be a challenge, and an identity mismatch can lead to a security breach. If facial data is compromised, hackers can easily copy identities and carry out illegal activities. Also, as facial recognition software generates a large amount of data, companies need to put ample security measures in place. Protecting healthcare data from wearable technology. Wearable technology provides patient data to healthcare providers to improve diagnosis or treatment. As most wearable technology is interconnected with mobiles or laptops, hackers can use wearable devices as a backdoor to get into the phone and view personal information. This is a major setback for data privacy. Consumers will hold businesses accountable. Consumers are becoming more conscious of the amount of data being stored by companies, and are reluctant to share information due to privacy considerations. Around 72% of consumers have said that they would stop purchasing a company's products or services due to privacy concerns. Also, 63% of consumers feel companies are responsible for protecting their data, and they do not want to deal with companies that share personal data without their permission. By prioritising a few key actions to improve security and privacy, organisations can overcome the privacy dilemma: Communicate and educate consumers on the link between personalisation and consumer data collection. Share privacy policies and practices, and assure consumers that their data will never be shared without their consent. Transparency is essential to gain the trust of consumers. Empower consumers by giving them control over how and where their data is used.
Let them decide what level of marketing personalisation they want. Companies need to train employees on ethical data use: what data can be acquired and stored while remaining compliant with new laws and regulations. To create an optimal consumer experience, consumer data should be acquired within the boundaries of defined privacy rules. Stakeholders should educate themselves on how consumer data can be used without violating privacy. This will lead to greater consumer trust and help generate business value for companies. Companies need to combine identity data with behavioural data to deliver a personalised consumer experience. By collaborating with third-party data sources such as social media to collect data like demographics and interests, and combining this with buying and browsing history, marketers get rich identity data that can be used to personalise marketing campaigns. Keep data use in context. While performing consumer analytics, companies should use only the individual-level data needed to align the consumer experience with consumer preferences, requirements and interests. Conclusion Personalisation and privacy can go hand in hand. Consumers' digital trust is now a prerequisite for companies to gather information and provide a wholesome consumer experience. To overcome the 'privacy paradox', companies need to be aware of legislative requirements and adjust accordingly. To practise good data security, companies need to invest in technology that makes personalisation possible only within the bounds of privacy laws. By assuring consumers that their privacy comes first and respecting their concerns over the details they share, companies can offer a personalised experience and inspire consumer loyalty and satisfaction in the long run. ### Why collaboration is critical to the resilience of the channel. It has always been clear that collaboration is one of the ways that the channel industry differs - and stands out - from others, and the current crisis has illustrated this more clearly than ever. Working closely together is the only way channel companies will be able to persevere and come out of the other side of this pandemic in the best possible health. Indeed, as many companies have hit problems with cash flow, leading in many cases to staff being furloughed or even made redundant, we expect to see more partnerships being formed than ever before. Companies will need to rely on the skills and expertise that partners can bring to the table to continue to satisfy and support their customers. Now that our ways of working are continuously changing, we must appreciate how important it is that we adapt and remain relevant. Redefining our vision and exploring new ways of engagement while we adjust to the 'new normal' will ensure we're better prepared for the next unforeseen challenges that lie ahead. The pandemic may have been an unparalleled disruptive force, but organisations that are willing to innovate can use this as an opportunity to build resilience. Although this cannot be achieved overnight, there is no better time to start than now. Resilience starts with employees To create a resilient channel, collaboration must start within. Businesses will need to transform their model to adapt to the current environment - and it must start at the top. Management teams have an important role: to reflect values and behaviours within an organisation.
Remaining transparent and consistent with employees by communicating key results or key events within the company can have a positive impact on staff mentality and culture. Businesses can also look at identifying ambassadors to champion resilience, who can provide other employees with training and tools and encourage others to be part of the transformation. This ensures constant communication between the business and its employees about what it means to be resilient in the channel, whether through internal communication channels such as Zoom, Teams or webinars. It is also essential to continue to celebrate and reward employees' performance during the pandemic. If companies identify and recognise hard work, this will increase motivation and productivity. It encourages high-quality services and support, for example, leading to a stronger business. This will also encourage other teams to be more efficient in their roles, again supporting a more resilient channel. Take advantage of the skills an outside source can offer One positive effect of the current pandemic crisis has been a renewed commitment to business partnerships. We may still be coming to grips with the new normal, but the one constant is that we must continue to work together and strengthen our networks across organisations as we move towards recovery and prepare for what might happen in the immediate future. When the pandemic hit and lockdown began, both professional and community networks were forced to adopt digital methods of communicating and working with immediate effect. Skype, Zoom, WhatsApp groups and Microsoft Teams rapidly became the de facto ways of working, ensuring that we stayed connected, which in turn led to the widespread acceptance and successful normalisation of home working. The urgent necessity caused by the crisis forced many of us to get over our misgivings about digital and swiftly removed many of the obstacles that had previously been in place for many businesses. Whether these were rules set by IT departments, cultural aversion or just the difficulty of getting everyone on board, the need to satisfy customers in a time of severe uncertainty surpassed everything. And, as a result, it's now much easier to connect with colleagues across partner organisations, and to communicate with them quickly and effectively just as you would have done in an office environment. Taking collaboration forward The pandemic has taught us that not everything goes to plan. However, we must not lose sight of long-term plans, which may come into use sooner than expected. Many organisations have focused on shorter-term goals in their innovation efforts, so longer-term projects are likely to have been forgotten. If companies prioritise future goals, this will give them a head start against competitors, making them more agile and resilient. Remote working is here to stay, and it has never been as important to take advantage of technology. Businesses within the channel must invest time and resources in the software available to communicate effectively. Using these collaborative tools will allow companies to share information, whether internally or externally, with selected audiences. It will also enable employees to create groups and share files, speeding up the communication process. Freeing up human resources and providing employees with the tools to develop solutions, fix problems, and measure results quickly and accurately will create a stronger, more resilient channel company.
Additionally, this also allows team members to develop their skills so that they are more likely to identify failure, celebrate success and create positive business outcomes faster. To be resilient in the channel, collaboration is key. Businesses must be agile, flexible and certainly open to change. By establishing and strengthening innovative strategies, those in the channel will be able to react effectively to disruption. ### Testing times. The parallels between the fight for regular testing in a pandemic and the security of the cloud. The pandemic has created a whole new vocabulary of words and phrases – quaranteams, Zoom, new normal, lockdown, social distance. They will be words that sum up a generation. I for one can't think of a time when the word 'testing' was part of common speech to the extent it is today. We use, hear and read it every day. But we have all come to realise just why it's such an important foundation for rising out of the pandemic. The fabric of our lives hinges on access to accurate tests and a quick diagnosis; without them, lives go on pause with isolation, or worse. But finding a model that works, is cost effective and is resourced with the right people to not just test samples but also conduct contact tracing has proved elusive. What I've found interesting as I've watched the debate unfold are the parallels it has with security. Investment in cyber hygiene is essential, but it's not a one-time occurrence and must be maintained. Invariably, other priorities arise and something else in the IT budget has to give. You can't make progressive investment in new technology if you don't have the money to do so. CTOs face the decision of robbing Peter to pay Paul. Yet the pandemic shone a light on things. You can accelerate digital transformation and find the money when your business survival depends on it. Shifting to ecommerce models, introducing apps that let people order food in a restaurant without touching a menu, through to getting people working from their safe place, has tested infrastructure and the status quo. But it happened. Companies had to make it work. Innovation rose from the ashes. However, they also made compromises, the biggest indicators of which are the spikes in cyber-attacks over the last six months. Companies have opened the floodgates to malicious actors who have sought to capitalise on the situation. Remote working was the first culprit. Rolling out unpatched VPNs and RDPs has taken its toll. Hackers had a field day, exploiting known vulnerabilities and infiltrating systems just for fun, or to do untold damage by slowing e-commerce platforms, stealing data or IP, or scraping customer information. Then there was the extraordinary move to the cloud. Companies brought plans forward by years, such was the need for a radical response. It was a testament to the IT skill in the UK – and around the world, given the many partners that will have been involved – to deliver network change, mobilise digital platforms and move applications online. But just as hackers were waiting in the wings to infiltrate VPNs, so they were waiting for the false moves a cloud strategy prompts. Applications were a hunting ground, of that there is no doubt. Hackers relied on the fact that the speed to release applications would override any security testing. And they were right to place their bets, as evidenced by the exponential rise in application breaches.
It brings to the forefront that any strategy designed to outwit the competition, retain customers and deliver innovative customer experiences must have a security model at its heart. The case for including security testing in the development of apps and cloud models has been proven. You simply can't release apps, overlook security and expect to remain a trusted brand. Consumers won't tolerate it anymore, and neither will data commissioners. However, testing an app or any related infrastructure before launch isn't enough anymore either. Regular testing is essential. Software changes to keep up with innovation and customer habits, and with that come new threats and the discovery of new vulnerabilities. Understanding this and embarking on a cycle of continuous testing has to be the default for any application. But the pressure to deliver this level of service is too great at the moment as IT budgets are pulled in every direction. That's why crowdsourcing models are rising on the radar of hamstrung CTOs. They can now use the skill and expertise of people around the world, scaling it up and down ahead of a new launch or update, and spot-testing applications either by having the service 'always on' and turning it up and down as needed, or by running specific penetration testing programmes that put the network and apps through their paces, uncovering weaknesses that can be fixed before they are exploited. It's a model that is gathering pace in financial services, but also in the airline, oil and gas, energy, mobile and car industries as they develop internet of things (IoT) and 'smart' business models. It's a really exciting future that presents itself, but it can only be trusted if it's secure. Crowdsourced models are overcoming the traditional challenges of budget and skill, and as we move into 2021 this will become a staple rather than a nice to have. ### What Are the Basic Characteristics of an Efficient Data Scientist's Work? As data collection increases globally, so has the demand for frameworks and solutions for data storage. These processes have, in turn, given rise to the need for data analysis and processing, and this is where data scientists come in. The data scientist's role is becoming more vital as more organizations rely heavily on data analytics to drive decision-making. These businesses also lean on artificial intelligence (AI), automation, and machine learning (ML) to enhance staff productivity and the core components of their IT strategies. Today, we'll try to understand what data science is and look at all the elements you'll need for a data science career. What is Data Science? Data science is the process of collecting and analyzing historical data using various analytical tools, algorithms, AI, and ML principles. The ultimate purpose of the process is to discover new, forward-looking patterns in a large amount of raw data. Although it may seem similar to what statisticians do, the difference lies between explaining and predicting. Analysis in data science explains what is going on by processing the history of the data. But data scientists also go beyond exploratory analysis to discover detailed insights from data, and they use advanced ML algorithms to identify both new and future occurrences of a particular event.
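To make that distinction concrete, here is a tiny sketch using only Python's standard library; the monthly sales figures are invented for the example, and the simple least-squares trend line stands in for the far richer ML models a data scientist would normally use.

```python
from statistics import mean

# Invented monthly sales figures - the kind of raw history a data scientist starts from.
sales = [120, 132, 141, 155, 162, 174]

# Descriptive analytics: "What happened?"
print("Average monthly sales:", round(mean(sales), 1))
print("Growth over the period:", sales[-1] - sales[0])

# Predictive analytics: "What will likely occur?"
# A basic least-squares trend line, fitted by hand for illustration.
months = list(range(len(sales)))
m_bar, s_bar = mean(months), mean(sales)
slope = sum((m - m_bar) * (s - s_bar) for m, s in zip(months, sales)) / sum(
    (m - m_bar) ** 2 for m in months
)
intercept = s_bar - slope * m_bar
next_month = len(sales)
print("Forecast for next month:", round(slope * next_month + intercept, 1))
```

The average and the growth figure describe what happened; the forecast is a crude answer to what will likely occur next.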
In essence, data science revolves around four primary areas: descriptive data analytics ("What happened?"); diagnostic data analytics ("Why did something occur?"); predictive data analytics ("What will likely occur?"); and prescriptive data analytics ("What action(s) need to be taken?"). What Does A Data Scientist Do? A data scientist is a professional whose responsibility is to collect, analyze, and discover insights from large amounts of structured and unstructured data to help find solutions crucial to the growth of organizations in different industries. Their role draws on several scientific disciplines, including statistics, mathematics, library science, and computer science. The type of industry determines what kind of data analysis approach the data scientist will use to provide specific solutions to the needs of an organization or its departments. It is therefore vital to get accurate information from the company before a data scientist can find meaning in the bulk of structured or unstructured data. It is also imperative for a data scientist to have expertise in the business domain, in order to translate business or departmental goals into data-based deliverables like automated tools, optimization algorithms, prediction engines, pattern detection analysis, etc. To achieve the primary goal of problem-solving, these are some of the secondary tasks a data scientist does on a daily basis: pull, merge and analyze data; look for patterns or trends in data; use several tools, including R, Tableau, SQL or SAS, Python, Hive, Impala, PySpark, Matlab, Excel, Hadoop, etc., in data analysis and problem-solving; develop and test new algorithms; look for ways to simplify data problems; develop predictive models; build data visualizations; write up analytical results to share with others (if writing isn't one of your strongest skills, paper writing review websites like Best Writers Online can help with this process); and map out proofs of concept. Qualifications: Education, Training, and Certifications To have a data science career, you are required to have sufficient educational qualifications or skills. For those who go the traditional education route, that means an advanced degree or a master's in data science, computer science, statistics, mathematics, etc. For those who don't have a degree in these fields, training and certifications can provide the skills necessary to transition into being a data scientist. These courses can boost your resume and increase your data science opportunities. Certification programs include Microsoft MCSE Data Management and Analytics, SAS Certified Big Data Professional/Data Scientist, Certified Analytics Professional (CAP), IAPA Analytics Credentials, and DASCA Data Science Credentials. Before applying for such programs, it is essential to research your industry of interest; it will enable you to figure out what skills, tools, and software are most relevant. For example, the skill set you will require for working on professional writing services review websites like Online Writers Rating will differ from the skills you'll need to work in the travel or government sector. Required Data Scientist Skills There are skills that can increase the success rate of your data scientist job applications. They include a mix of hard and soft skills, not limited to the following: Programming: This is a fundamental data scientist skill that can improve your statistics skills.
It can also enable you to analyze large datasets and create your own tools. It helps to know platforms and languages such as the Hadoop platform, Python, Apache Spark, etc. Quantitative analysis: this skill is vital for analyzing large datasets. With quantitative analysis you can improve your ability to run an experimental study, scale your data strategy, and implement ML. Product intuition: product knowledge makes it easier for you to predict data system behavior and establish metrics. Management of structured and unstructured data: a data scientist should be able to manage structured data collected and organized by categories using electronic devices. They should also be able to manage unstructured data, which is bulky and more likely to come from human input, such as emails, customer reviews, videos, social media channels, etc. Communication: just as in other professions, excellent communication skills make interaction with clients easy, enabling you to leverage the other capabilities listed here. Teamwork: as with communication, collaboration is another general yet vital skill for a successful data science career. It requires working in harmony with others, being open to feedback and constructive criticism, and sharing knowledge with your team. Data Scientist Salary Data science is a fast-growing and lucrative field. According to a report from Glassdoor, data scientist was number three on the list of the best-paying jobs for 2020 in the US. At the beginning of the year, a data scientist's median base salary was $107,801. Conclusion: The beauty of establishing a career in data science is that there are several data science opportunities to consider. The demand for data management is greater than ever, and this has led to high demand for people with data science and deep analytical qualifications and skills. However, it is an expensive, highly skilled role to fill, so you'll first want to figure out your niche before you start applying for data science positions. Once you understand the client's needs, you can make a significant positive impact on a business's success. ### Move Fast and Don't Break Things With Cloud-Native Data Every business wants to do more with its data. At the same time, terms like digital transformation, big data and service-orientated technology have been flung around for so many years that it can be difficult to understand their real-world value. Instead, we should focus on what we want to achieve. When I hear the phrase 'digital transformation' from a company, that tells me they want to create better user experiences. When a company talks about big data, I see a business that wants to improve how they collect, analyse and use data. Service-orientated technology is a little more technical and covers distributed computing, but the result is a company that can develop, release, iterate and maintain software faster and with more agility. Making data the star Using data in this way involves making use of cloud computing. Rather than simply lifting everything into the cloud, you can now make use of 'cloud-native' services that take advantage of new and more dynamic environments. We're moving away from the hardware world of racking up servers and bringing up ports towards large-scale infrastructure in which a mesh of services communicates with each other. Each of these elements has layers, and they are all being watched, maintained, monitored and secured. However, we have to go further than this.
Developers couldn't care less about individual infrastructure elements, even if those elements deliver scale and availability for their applications. Instead, we should be looking at how to make it even easier to work with data over time. How can new features be delivered in a single sprint and not in months? This involves another layer of abstraction, but by using automation we can make it easier for everyone to work with data, rather than having to understand and run the databases themselves. It may be easier to think of this approach as similar to self-driving cars: while trials of autonomous vehicles are taking place, we have not got there yet and the technology will need to develop over time. What we do have, instead, is more assistance in place to make driving easier and safer automatically. For example, if a car pulls out in front of me, my car can brake and stop before I can even think about it. For cloud-native databases, that same approach can deliver more automation and guidance on how things should be running in the background. Rather than relying on individuals to configure and run these installations, automation can help put together the most appropriate and efficient set-up, then make recommendations on what to do next if changes are needed. Working at speed around data In real-world terms, what all this automation does is make it easier to innovate and enable a business to match its infrastructure to consumer patterns. A good recent example of this is the explosive growth of kerbside pickup and delivery. Almost overnight, this became a critical service for consumers during the pandemic. For large retailers like Home Depot, Walmart, and Target, in-store pickup went from a niche offering to being the most used service for customers in the second quarter. Home Depot, for example, had to launch this in thirty days as COVID took hold. The company saw sales across its digital platforms increase by approximately 100% in the quarter, with its new pickup service being used more than 60% of the time. Home Depot's SVP of Information Technology, Fahim Siddiqui, told the Inspired Execution podcast it had "everything to do with having the people, process and technology basis to scale out and really provide unparalleled service." Making the move to cloud-native What we're seeing in this data-driven world is more automation, based on the reliability and ubiquity of the cloud. However, this goal relies on how data and IT operations management tasks can be handled in the background. DataOps is almost the last frontier for large-scale application infrastructure. What that means is that DataOps is going to be more of a service; you're probably going to be renting your data operations, not deploying it yourself. It will be cloud native, so wherever you are in the world and whatever you need data for, it's available. This will rely on automation to help us make better decisions, save money and make sure everything is as efficient as possible. It will ensure infrastructure is in the right configuration for the use cases that we have. The world we're living in right now has the database at the bottom layer, or what we call the backend. But it is affecting everything: developers want APIs, the queues go to events, microservices and queries, and every part of this should be dynamic, scalable and self-healing.
If you had this block diagram deployed as your application infrastructure, I think you would be very happy, because you can answer questions, you can respond to changes in the world, and you can grow as you need to. If the business wants to extract value out of another part of the organisation, you have the ability to do it quickly. ### Award-winning broadcaster Disruptive Live launches dedicated cross-platform companion app. Watch the latest business and technology trends, insights, and advice any time, any place, anywhere 18th November 2020, London – Available in app stores today, the new app from streaming broadcaster Disruptive Live delivers immediate access to fresh technology and business leader interviews, plus an extensive archive of features, industry panels, and events. Downloading the Disruptive app provides instant access to all of Disruptive Live's award-winning shows, on phones, tablets, and streaming devices such as Amazon Firestick, Android TV, Apple TV, Google Chromecast, or Roku. "The Disruptive Live app is a response to the radical change in how, when, and where people are viewing our shows. Since the move to home working, and the events industry shifting online, we have seen a far broader range of devices and viewing times. Viewers are clearly working hard, on their own time, to keep up to date with business and technology trends," said Kate Bennett, Marketing Director of Disruptive Live. Bennett continued, "We have been seeing people searching for and watching a wider variety of our shows. We realised that doing this across all of the platforms we broadcast to is not the best possible experience, especially if you are trying to use a browser on a TV stick. Bringing our latest shows and huge back catalogue together in a dedicated app, tailored for each device, makes it faster and simpler for people to discover relevant videos." Through the Disruptive Live app, viewers can already access 217 hours of videos across 11 channels including Executive Insights, Lifestyle & Consumer, Cloud & DevOps, IoT & Industrial, Cyber Security, Technology Innovation, and AI & New Tech. Shows and streams are fully searchable, making it fast, simple and productive to take a deep dive into the latest thinking from industry experts. The Disruptive Live app is free and available for download now from the following app store links or by searching for "Disruptive Live" in the app store on the device. iOS and Apple TV - https://apps.apple.com/gb/app/disruptive-live/id1535818369 Amazon - https://www.amazon.co.uk/Compare-the-Cloud-Ltd-Disruptive/dp/B08LH1TNBW Android - https://play.google.com/store/apps/details?id=com.maz.combo3344 Roku - https://channelstore.roku.com/en-gb/details/3d07000543f962995ab95d584587b46b/disruptive-live ### How cloud resilience has been challenged and why security strategy needs a rethink. Migrating business applications to the cloud has saved 2020 for many businesses, to the extent that the vast majority (83%) of C-level executives expect the changes they made to stabilise their revenue to become permanent. What's more, 80% of leaders expect a quarter or more of employees to stay working from home in future. Lockdown 2.0 has underlined this outlook. Unsurprisingly, remote working and the creation of contactless business models and processes accelerated cloud migration for 76% of senior executives worldwide. The unexpected speed of business transformation has brought about some long-term benefits, and it's not gone unnoticed.
C-suite execs report that employee productivity has significantly improved as work-life balance has been redressed, at least in the short term, leading to greater retention of workers who are enjoying the flexibility of working from anywhere. It's also true that geography has taken second place to hiring the best candidate for the job, with skills being the most important criterion, aided by having a much wider geographic pool of potential people to choose from. With this renewed appreciation for how technology can facilitate productivity, we will see more executives forge ahead with network and application plans to support exploding customer and supplier demand for contactless interactions. But it's not been without its challenges, and this will continue to be the case as businesses find their new equilibrium. Budgets have been scrutinised and cuts have had to be made to make it all possible. What's more, skilled IT personnel have been under immense pressure as plans to roll out technology were brought forward by years in some cases. Accelerated pivots have created blind spots in security too. Research shows that a lack of understanding of the threat landscape, and a hope that cloud providers will provide adequate security, has resulted in 40% of senior execs citing an increase in cyber security attacks during the pandemic. That trend is unlikely to reverse anytime soon; even with a vaccine on the horizon, the world isn't about to return to normal just yet. As lockdowns are announced, people's habits online change – they consume more digital entertainment, they buy more comforts for delivery, they stock up on food. This demand, and the restrictions the pandemic imposes, mean remote working, online contactless business models and the need to innovate customer-facing applications will continue for the indefinite future. This will affect the ongoing business strategy execs adopt and their investment in digital transformation, and force them to innovate on traditional business practices that had served them so well for years. The pandemic has prompted a rapid evolution in our day-to-day activities. On-demand content consumption, contactless payments, home delivery, and remote workforces are now business imperatives and are forcing a rapidly increased reliance on the cloud. As the number of attacks companies experience rises, it becomes even more urgent for executives to revisit the technology solutions they have implemented before moving on. They may have agility and scalability like never before, but it's clear from the stats – no matter which security expert you refer to – that they also have gaping holes in the security fortress. New strategic goals will be undermined if those holes are not closed very quickly. For that reason, reverse engineering the security gaps in company networks and applications should be considered mandatory and of the highest priority. As noted above, execs appreciate that the situation is further compounded by the shared responsibility model they have had to accept with cloud providers. Many execs have put huge faith in cloud service providers taking full responsibility for the security of the cloud and all it hosts, not understanding the difference between infrastructure security (which cloud providers inherently handle) and application and workload security (which they do not). However, the shared nature of this model has well and truly been put to the test, and many have found it needs additional help.
The board is now starting to listen to CISOs and recognise that they can’t simply move their critical business infrastructure and applications to the public cloud and assume that what the hosting partner does for security is enough.  Cloud providers typically deliver the same standardised infrastructure security across their customer base, in simple terms a “tick box” offering that meets basic requirements but does not meet the individual needs of a specific organisation or their applications. And of course, the more there is in the cloud the bigger the attack surface. As the volume and sophistication of cyberattacks continue their relentless pace, it becomes imperative to put more than minimal baseline defences in place.  Security needs to be in the core through to the edge from infrastructure through applications and it will take different forms. Automated detection and mitigation are a must. There needs to be a way to find anomalies in traffic patterns and do something about it in real-time, well before a website is scraped, data is stolen and customer trust is quashed. If a cloud provider isn’t on top of monitoring, identifying and remediating security threats on APIs or specifically coming from bad bots then companies need to be asking why and getting a resolution in place fast.  The development of applications needs a rethink too. Security needs to be at the start of the process particularly as new applications are born as cloud apps, not a bolt-on to an existing security framework. Companies should be embracing a DevSecOps model and trading on the virtues it delivers the customer.  These sorts of measures are essential because unresolved security incidents could be disastrous for companies in terms of customer trust, as well as revenue and the financial fines they could face.  The pandemic has affected nearly every aspect of life and work in a concentrated amount of time and looking to the organisations that had strong business continuity plans and an agile IT infrastructure it’s obvious that they fared better. It’s also highlighted just how important cloud technologies are for company resilience in the future. But, and it’s a big but, this assurance can only be guaranteed if it’s secured.   ### Helping businesses to understand the benefits of 5G The promise of 5G network technologies for enabling wide scale digital transformation for the businesses of tomorrow cannot be understated. From turbocharging economic growth to powering future innovations, the world will grow increasingly reliant on this new generation of wireless connectivity as it continues to be rolled out. However, if reports from earlier this year of the negative misconceptions held by some are anything to go by, it seems that the mobile industry still has work to do in terms of educating people on what 5G is, what it will do and what it could enable us to achieve. The risk of cumulative negative sentiment towards 5G is that it could cause governments and businesses alike to pull back on their investment in and adoption of the new technology – right at a crucial moment when excitement about its potential should be ramping up. Delays to the rollout of 5G could put a dampener on the kind of digital transformation that could boost the economic recovery for thousands of businesses both in the UK and around the world – from the biggest international corporations down to the smallest innovative startups. 
For this reason, the mobile industry must redouble its efforts to tackle misconceptions, challenge misinformation and encourage greater awareness and understanding of the enterprise benefits of 5G and how businesses can ensure they are well-placed to take advantage of them. Assessing 5G awareness A recent study conducted by Global Wireless Solutions sought to understand the sentiments of UK businesses when it came to 5G technology. The good news is that awareness of the future importance of 5G appears to be strong. While it is still in the early phase of rollout across the UK, more than half of respondents (56%) reported that 5G is already important to their organisation, with over a quarter (27%) identifying 5G as an important technology for the future of their business. A similar proportion (26%) of businesses are also excited to see a greater range of 5G offerings from their operator. Despite this, it still seems that the provision of 5G applications and services is not considered a top mobile priority by most businesses at this stage, with just a fifth (20%) listing it in their top three for the coming twelve months. Compare this to more than twice as many who highlighted areas such as consistent voice call accessibility and quality (46%) and greater remote working connectivity (45%) as key priorities for their organisation. While awareness of 5G appears to be very high, the question is whether or not companies understand the full range of product and service offerings that 5G could unlock for their business activities – and, perhaps more importantly, whether they are starting to invest now in the new and updated tools and systems that will allow them to make the most of 5G’s potential in the future. Building business understanding Despite strong awareness of the importance of 5G, businesses often appear less clear on areas such as how 5G will be deployed and what it could actually achieve for their operations. This could be fuelling further uncertainty around why they really need 5G. Another key problem is that existing messaging around 5G benefits is often quite limited in scope. Often the focus is firmly on increased speed, which clashes with the fact that the overwhelming majority are often quite happy with the network speeds they already regularly receive. Faster speed is certainly a feature of 5G, but it is far from the only aspect – focusing on this at the expense of other upgrades undersells the true promise of 5G and makes the next-generation technology seem like a less significant transformation. Beyond speed, the benefits of full 5G network connectivity include greater reliability, lower latency and higher rates of critical data-transfer, all of which will be vital for powering a more digitally interconnected future. More efficient utilisation of available spectrum and capacity will allow more people to do more things at the same time on the same network. Meanwhile, the collection and transmission of data will be revolutionised with 5G, as hardware, power and data storage capabilities are all enhanced and less cumbersome. This, for example, will enable billions of low-powered IoT devices to drive value through data right around the world. Communicating enterprise benefits Everything from autonomous vehicles and remote robotics, to real-time applications including the likes of VR and AR, will all be made possible through 5G. 
These are exciting prospects, but it’s also important to ensure that enhanced capabilities of 5G are being communicated in ways that businesses, employees and ultimately their customers will care about. For example, increased remote, wireless working for traditionally office-based workers has been one of the most significant shifts in response to the pandemic; as a result, a 5G-powered future could allow an even wider range of firms to operate remotely on a larger scale – more companies and more employees able to perform tasks and services as if they were in the office, in the field, or with a customer. 5G could enable engineers to perform maintenance tasks and address critical infrastructure issues as they happen in real-time from many miles away. It could give doctors the ability to go beyond hosting virtual consultations with patients via video link to actually performing remote surgeries and operations, all facilitated by real-time VR and remote-robotics applications. Crucially, these enhanced capabilities translate into increased productivity and efficiency for businesses, delivering time and cost savings and positively impacting the bottom line. Explaining the range of capabilities and benefits for businesses and framing things in terms that will resonate will help to make them aware of the actual positive impact that 5G could deliver to their operations. This is a must for the mobile industry to encourage more firms to take a stake in accelerating the arrival of 5G. Without knowing what the new network is, what it could achieve or when firms are likely to see expected returns on their investments in terms of the advancements they expect, there is a risk that businesses will continue to consider the arrival of 5G as a ‘nice to have’ and not a necessity.
### The changing weather of cloud infrastructure
The pandemic has highlighted how integral technology is to help us work, shop and interact with each other. As more of our critical infrastructure and services become dependent on software hosted in the cloud, outages are more than just an inconvenience. But many businesses are playing a risky game with their cloud providers. They often use one main provider, located in a small number of places around the world – meaning if downtime occurs, they are left high and dry. On top of this, there are various data sovereignty and privacy concerns associated with using one sole provider across borders. In this piece we’ll explore the changing weather of cloud infrastructure, including the rise of local providers, the increasing data sovereignty complications, and how diversifying to a multi-cloud approach can help businesses address these challenges.
Local cloud options
When choosing a cloud provider, large organisations are often drawn to using one of the ‘big five’ suppliers. Of these, four of the five (Amazon, Microsoft, Google and IBM) are American. With the US having recently passed the CLOUD act, which has provisions that enable the US government to demand access to data stored by American companies overseas, many companies that handle sensitive information are concerned about the privacy aspect of storing their data with these US-based companies. Businesses are therefore considering building their online presence across providers within each jurisdiction they operate in. By seeking local market providers who provide cloud-based durability, cost-effectiveness and ease-of-use, they can rest assured that they are operating within the legal framework of each country they are established in.
These local options are expected to increase over the next few years, given moves to promote competition, such as the EU’s recent ruling that countries should be encouraging local providers over the large US-based cloud vendors.
Data sovereignty complications across borders
Organisations that operate across several different countries are also impacted by a global web of data protection and residency legislation, which applies to the user data they hold, yet most companies are not even thinking about it. This is because current national and international legislation around tax, data protection and privacy is not compatible with one another, which makes dealing with data and transactions ethically a quagmire. There is a definitive need for a simplification of digital tax and data policy within major trading blocs. For example, although GDPR is a bloc-wide requirement in the EU, handling VAT on transactions is done on a nation-by-nation basis and needs to be managed independently for each country that gets serviced. This is incredibly complicated for a market that is increasingly dominated by digital transactions.
Addressing these challenges
Over the next few years, and in the absence of a universal simplification, the challenge for many global companies will be to ensure they are compliant with the growing body of data protection legislation, which seeks to regulate how they use and store data across countries. This challenge, combined with a much larger public awareness of data privacy and consumers’ rights, means that organisations immediately need to be transparent about their use of data and who it can be accessed by. To do this, awareness and protection are the first line of defence and, as well as getting an experienced lawyer to draft your company’s privacy policies, a risk assessment should be undertaken to determine potential exposure. With increasingly aware customers, businesses should also be prepared for data subject access requests (and, in the case of public bodies, Freedom of Information Act requests) from people who want to know how their data is being used. To prepare, businesses should ensure they have systems in place to handle the formal processing of these requests. Most importantly, in a time when the vaults of data businesses own and use are getting larger and more complex, companies need to ensure they’re compliant and avoid making mistakes. From marketing lists, to customer mailing lists and ad-hoc visitor lists, organisations need to clearly think through how they are working with people’s data and keep track of it.
Building a multi-cloud approach
To help address these challenges, business leaders should consider building their applications across a range of providers within their own borders in order to mitigate their risk around compliance. For businesses, doing this also means that they can access data centres in areas not served by their primary cloud provider and manage costs and resources more effectively by taking advantage of reduced prices or specialised offerings which are not available from the large vendors. A key consideration when looking at moving to a multi-cloud approach is the role of API management. As moving data tends to rely heavily on APIs, supporting a multi-cloud strategy requires evaluating your API management approach - this includes finding an API management solution that is capable of working in a multi-cloud, multi-region configuration, while ideally providing a centralised view.
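To make the jurisdiction-by-jurisdiction idea concrete, here is a minimal, illustrative Python sketch of routing customer records to a provider approved for the user's own jurisdiction. The provider names, endpoints and record fields are hypothetical placeholders, not recommendations.

```python
# Illustrative only: map each jurisdiction to a locally hosted cloud provider.
# Provider names, endpoints and the record shape are hypothetical placeholders.
PROVIDERS_BY_JURISDICTION = {
    "DE": {"provider": "local-provider-de", "endpoint": "https://storage.example.de"},
    "FR": {"provider": "local-provider-fr", "endpoint": "https://storage.example.fr"},
    "GB": {"provider": "local-provider-uk", "endpoint": "https://storage.example.co.uk"},
}

def placement_for(record: dict) -> dict:
    """Choose where a record should be stored, based on the data subject's residency."""
    jurisdiction = record.get("residency")
    try:
        return PROVIDERS_BY_JURISDICTION[jurisdiction]
    except KeyError:
        # Fail closed: better to reject than to store data in the wrong jurisdiction.
        raise ValueError(f"No approved provider for jurisdiction {jurisdiction!r}")

if __name__ == "__main__":
    customer = {"id": "c-123", "residency": "DE", "email": "anna@example.com"}
    print(placement_for(customer))  # -> the German provider entry
```

In practice a mapping like this would sit behind whatever API management layer is chosen, so the placement decision is made once and enforced centrally rather than in every application.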
With countries around the world beginning to build their own internal cloud infrastructure and with the increasing demand for domestic data storage solutions, the future for businesses is multi-cloud. Although the temptation may be to simply think short-term amid the pandemic, true business leaders will be focused on building for the future. Along with enabling remote working, this means investing in improving agility and efficiency. Considering a multi-cloud strategy, with all the flexibility, cost benefits and competitive advantages it offers, will help them to do that.
### Essential Points to remember before starting mobile test automation
Smartphones have become powerful, sophisticated, and advanced over the years. Once simply a communication device, the smartphone is now used for almost any personal or professional task. From browsing social media to handling emails, smartphones have become a multipurpose gadget for users all over the world. More than $350 billion of revenue was generated from mobile apps alone in 2018, and that figure is expected to surpass $900 billion by 2023. Amid the exponential growth in mobile users, several challenges have emerged for mobile app testing service providers. Cutthroat competition has compelled businesses to launch applications at a faster pace. Apart from a shorter time-to-market, QA professionals need to consider multiple aspects of an application, including multiple OS versions, cross-platform capabilities, network, data usage, and in-built native features. With such a large ecosystem of mobile applications, it is also essential to use mobile test automation for enhanced efficiency, test effectiveness, and speed-to-market. However, it is not just technology and tools that can help you to implement mobile test automation efficiently. There are several other crucial factors that you need to consider to offer flawless mobile test automation services.
Cost
It is crucial to think about the cost factor with a thorough approach. You should consider factors including automation tools, timeline, IT infrastructure, training, and the available bandwidth of resources. Analyze all these factors comprehensively to calculate ROI and assess the feasibility of a mobile app. Automation testing methods will also help in reducing cost and effort: you can reuse already-built test cases again and again without manual effort, and the time saved will help reduce the overall cost of the mobile app. Test automation can also enable QA teams to perform parallel app testing to find and fix bugs at an early stage of software development.
Scope of the Test
It is also crucial to analyze the coverage and scope of test automation for mobile apps. By applying the right approach to executing complex test automation, you can easily enhance the quality of mobile applications while reducing time-to-market significantly. It is also advisable not to limit test coverage to just the functionality of mobile apps; you can also cover other areas, including database tables and memory usage, among others. Apart from that, QA engineers should focus more on creating new automation test cases instead of merely automating manual ones. Also, you should not rely solely on test automation; in fact, you can get better test results by leveraging a blend of both manual and automated testing.
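To illustrate the kind of reusable test case described above, here is a minimal sketch using the Appium Python client with pytest. The server URL, app package and element identifier are assumptions for illustration only, and the options API differs slightly between client versions.

```python
# Minimal illustrative Appium check (pytest style). The capabilities, app package
# and accessibility id are hypothetical; adjust them for your own app and devices.
import pytest
from appium import webdriver
from appium.options.android import UiAutomator2Options
from appium.webdriver.common.appiumby import AppiumBy

APPIUM_SERVER = "http://localhost:4723"  # assumed local Appium server

@pytest.fixture
def driver():
    options = UiAutomator2Options()
    options.set_capability("appium:appPackage", "com.example.mobileapp")   # hypothetical
    options.set_capability("appium:appActivity", ".MainActivity")          # hypothetical
    options.set_capability("appium:deviceName", "Android Emulator")
    drv = webdriver.Remote(APPIUM_SERVER, options=options)
    yield drv
    drv.quit()

def test_login_button_is_visible(driver):
    # A small, reusable functional check that can run unchanged on every build.
    button = driver.find_element(AppiumBy.ACCESSIBILITY_ID, "login_button")
    assert button.is_displayed()
```

A check like this can be run unchanged against every build, locally or on a cloud device farm, which is where the reuse and parallel-execution savings mentioned above come from.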
Proof of Concept
With a solid proof of concept (POC), test engineers can mitigate the risks linked to implementing a test automation strategy. It can also help in finding out whether a single tool is enough to meet your test automation requirements or whether you need a set of multiple tools to deliver cross-platform, multi-device automation solutions.
Selecting the Right Tools and Technologies
You need to select an appropriate set of automation tools to ensure the successful implementation of mobile test automation. Look for features like easy integration, multi-device support, cross-platform compatibility, and user-friendliness to finalize an automation tool. You also need to ensure that the selected automation software is capable of testing interruption and battery state change parameters, among others. To save the upfront costs of expensive licenses, you can consider open-source automation tools. However, consider open-source solutions only when you have specialized resources to handle these tools efficiently. Consider all these points judiciously before zeroing in on tools and technologies for your mobile test automation project.
Leverage Cloud-based Automation Tools
Cloud-based automation tools can help you significantly reduce the overall cost of the project. Costs of essential components, including hardware, operating expenditure, per-user licenses, and depreciation, can be avoided with cloud-based automation solutions. Cloud solutions can also help you easily scale and reuse resources at any point in time, which makes them a credible option for performance and load testing requirements. They are also a boon for enterprises with many remote workers, enabling QA engineers to perform test automation from any location in the world while having full access to test results and reports from centralized dashboards.
Wrapping Up
Test automation is an efficient and effective solution for mobile testing services. However, its implementation is not the same as a regular test automation project, because mobile app testing involves multiple complexities, including test coverage, toolsets, devices, and environments. Hence, to achieve the desired results, you need to follow all the best practices religiously. Doing so will also help you achieve robust quality at minimal cost and with a shorter time-to-market. It is therefore highly recommended to follow the above-mentioned points carefully before implementing mobile test automation in the real world.
### Serverless Computing – what to consider when investing
Until relatively recently, physical servers were the backbone of every data centre. While on-premise servers have by no means disappeared, more and more infrastructure components are finding their way into the cloud every year and hype around serverless computing is growing. But what is it? Serverless computing is a form of software architecture whereby the responsibility for time and resource-intensive infrastructure management processes such as patching, securing and scaling servers is outsourced to cloud providers. This means resources spent on building, scaling and securing can now be more profitably invested in the applications themselves. Though the name suggests otherwise, serverless computing cannot do without servers entirely – physical infrastructure is needed, but only on the cloud provider side.
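To make the model concrete before weighing its pros and cons, here is a minimal sketch of what one such function might look like on AWS Lambda in Python. The DynamoDB table name and the event fields are hypothetical and stand in for whatever event source an application actually uses.

```python
# Illustrative AWS Lambda handler: one short-lived function invocation per incoming event.
# The DynamoDB table name and event fields are hypothetical.
import json
import boto3

dynamodb = boto3.resource("dynamodb")
scores_table = dynamodb.Table("match-scores")  # assumed to already exist

def handler(event, context):
    """Record a single scoring event; the platform takes care of servers and scaling."""
    body = json.loads(event.get("body", "{}"))
    scores_table.update_item(
        Key={"match_id": body["match_id"]},
        UpdateExpression="ADD goals :inc",
        ExpressionAttributeValues={":inc": 1},
    )
    return {"statusCode": 200, "body": json.dumps({"recorded": body["match_id"]})}
```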
The cloud provider, whether it be AWS, Google, Azure or another, is also responsible for the dynamic allocation of resources to individual applications and tasks. This brings several advantages. Greater ability to innovate Container orchestration and mesh management using serverless methods fundamentally change the daily business of internal company teams in infrastructure management, redefining their objectives. It means they are now the beneficiaries of an infrastructure that is optimally adjusted for them, and that they no longer have to build, scale and secure it themselves. With the responsibility for infrastructure management in the hands of cloud providers, serverless computing frees up organisations’ time and money. They can channel resources into ensuring their applications function better, creating the opportunity to innovate faster and more easily. Instant scalability Serverless also promises considerable advantages in terms of scaling. A high degree of configuration preparation is not required, so it is therefore far more agile than conventional scaling based on cloud servers. For example, AWS Lambda has proven itself in the scaling of any number of short-running tasks. It doesn't matter whether there are hundreds of events or several billion – anything is possible. This can be explained with the hypothetical use case of a broadcaster’s sports app providing football fans with the latest match scores and real-time statistics on top players, such as the number of goals scored to date in their career. Assume this service provider originally developed its applications on-premise and, given its monolithic server structure, now has a classic scaling dilemma. As part of modernisation efforts, the user could rewrite code and extend it with Lambda. For every single move on the playing field – short, event-oriented tasks – a data object is created, forwarded to the servers and processed by them. The app developers can focus on building the best app possible without also having to manage servers. More efficient access validation Serverless is invaluable for users needing to get through high volumes of small tasks. This is perfectly demonstrated, for example, when customers are granted access to a specific resource. Take video games on smartphones for instance. In order to display high scores to users, they need access to the data on the backend servers. This involves requesting a number of different short-term tasks in the app’s backend. In the specific case of Pokémon Go, the developers couldn’t have anticipated just quite how successful the game has become. Nevertheless, they were able to scale and handle high data loads very well because they had designed the backend to scale quickly from the beginning. Serverless technologies like AWS Lambda make this possible. Serverless computing is still in its infancy however, and there are some teething problems and disadvantages to consider before investing in this approach. Increased system structure complexity As much as it simplifies infrastructure issues, serverless computing means more complexity elsewhere. Its excellent suitability for many short-lived tasks automatically leads to a significantly higher number of these. Each of these tasks may be greatly simplified in itself, but the increased volumes of transitions and interfaces also makes the overall system structure more complex to monitor. Examples include increased configuration effort, additional deployment scripts and tooling artefacts. 
As a result, the development of software is more efficient, but the continuous checking and monitoring of the applications is more complicated and harder to achieve. Potential impacts on performance Another setback is that the performance of individual requests is much less predictable with serverless and precise performance forecasts are therefore much more difficult. One request may take three milliseconds and another one hundred times that. However, as an application scales, its performance also becomes more predictable. One reason for this is that the more data that is cached and the more often it is accessed, the better the performance. Upfront costs Upfront costs can vary considerably, with investments in serverless computing possibly reaching between three and seven figure sums. Investment can prove extremely cost-effective long term though, so when making the case for the technology, organisations should consider each provider individually to make sure they maximise ROI. Questions to ask include: - To what extent can configurations, guidelines, technologies etc. be modified as the company continues to grow? - Are integrations with other providers possible? - Can new, innovative services be easily added? From a greater ability to innovate to instant scalability, serverless computing offers huge advantages. There are downsides to fully consider too though, particularly as serverless is still in its infancy. Spending time researching each vendor option and undertaking thorough cost-benefit analysis is therefore highly recommended before investing to make the most of this groundbreaking technology.[/vc_column_text][/vc_column][/vc_row] ### Designing geographically distributed environments with the Google Cloud Platform [vc_row][vc_column][vc_column_text]Every cloud provider makes it possible to create geographically distributed environments. I'm going to take a closer look at this using an example of Google Cloud Platform - GCP. Why do we need geographically distributed environments? What motivates the creation geographically distributed environments? There are three main reasons. High Availability High availability suggests a system that is available most or all of the time. It avoids single points-of-failure and leads to redundancy of system components and risk diversification. Redundancy means that system components have backups, and it one instance fails another remains available. This is called "failover." This simple method is used even in aerospace engineering. Distributing infrastructure geographically allows us to avoid single point-of failure and mitigate risks related to the geographical location of a system and data. Disaster Recovery Disaster recovery, the set of procedures and tools used to recover infrastructure in the event of disaster, affects high availability. DR plans always includes RTO and RPO. RTO is a recovery time objective. This is time in which infrastructure can be completely restored following a disaster and function at full capacity. RTO affects high availability directly. If RTO is long - there's no high availability. RPO is the recovery point objective, or maximum time in which data can be lost as a result of disaster. It doesn't affect HA directly but is very important. Geographical distribution can be part of a disaster recovery plan. If we experience a natural disaster (e.g. earthquake) that impacts a system in one location, there will still be a system functioning in another location and we can failover to it. 
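As a simple worked example of how these two objectives constrain a design, the sketch below, with purely illustrative numbers, checks whether a replication interval and an estimated failover time satisfy a stated RPO and RTO.

```python
# Illustrative check of a DR plan against its stated objectives.
# All numbers are hypothetical and chosen only to show the arithmetic.
RPO_MINUTES = 15   # maximum tolerable data loss
RTO_MINUTES = 60   # maximum tolerable time to restore service

replication_interval_minutes = 10   # how often data is copied to the second region
estimated_failover_minutes = 25     # time to detect failure and switch traffic

# Worst case, everything written since the last replication run is lost.
worst_case_data_loss = replication_interval_minutes
meets_rpo = worst_case_data_loss <= RPO_MINUTES
meets_rto = estimated_failover_minutes <= RTO_MINUTES

print(f"RPO met: {meets_rpo} (worst-case loss {worst_case_data_loss} min)")
print(f"RTO met: {meets_rto} (estimated failover {estimated_failover_minutes} min)")
```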
Performance
Any application must transfer data from the user to the app and back, and probably between application components. In the era of high-speed Internet and networks, this is usually not a problem. But what if your data center is in California and you have many users in Russia who transfer significant amounts of data and receive huge amounts in return? This could be an issue because of network overhead: transferring data could take too long, leading to performance that doesn't satisfy system requirements. In that case, geographic distribution is beneficial. Moving datacenter/application components closer to the user's location makes it possible to decrease network overhead and improve performance.
Implementing a Distributed Environment using GCP
GCP infrastructure distribution
Currently, Google's infrastructure extends to 23 regions, 70 availability zones and 140 network edge locations. Let's take a look at these regions, zones and edge locations. Regions are independent geographic areas that consist of zones. They are connected to each other by a network, but remain largely independent. The picture illustrates geographic regions on a map. A zone is a deployment area for Google Cloud resources within a region. A zone includes one or more datacenters. Zones should be considered a single failure domain within a region. Usually there is more than one zone in a region. When you deploy components of your cloud infrastructure in different zones within one region, it is called a multizone deployment. If you deploy them in different zones within different regions, it is called a multiregional deployment. GCP's services themselves can be global (multi-regional), regional, or zonal. An edge location is an edge (a tip, so to speak) of GCP's custom dedicated network. It might be a point of Cloud Interconnect (which allows you to connect your network directly to GCP’s dedicated network) or a CDN edge point that makes it possible to deliver content faster. We'll take a look at the Cloud CDN service a bit later. The picture shows GCP's network and edge locations.
Components for Designing Geographically Distributed Systems
Now we understand how geographic distribution can help and how GCP’s infrastructure is distributed. The next step is to take a look at particular services, the bricks with which we can build our geographically distributed environment. Let's start with Cloud VPC and discover how it can help build such an environment.
Cloud VPC
VPC stands for Virtual Private Cloud. VPC is global, scalable, and flexible. It consists of VPC networks inside which you can create resources such as VMs, Kubernetes nodes, etc. These networks can be placed in different zones and regions and are logically isolated from each other. All Compute Engine VM instances, GKE clusters, and App Engine flexible environment instances rely on a VPC network for communications. The network connects the resources to each other and to the Internet. Routes and firewalls make it possible to set up rules for communication between networks, resources, and the Internet. Let's take a look at an example. The main difference, and one of the advantages, of GCP's VPCs in comparison with other cloud providers is that they are not tied to a region. They are global and multi-region by default.
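As a rough illustration of this model, the sketch below creates a custom-mode VPC and one regional subnet using the google-cloud-compute client library. The project ID, resource names and CIDR range are placeholders, and the exact client API can vary between library versions.

```python
# Illustrative sketch: a custom-mode (global) VPC with one regional subnet,
# using the google-cloud-compute client library. Project ID, names and CIDR
# range are placeholders; check the client version you have installed.
from google.cloud import compute_v1

PROJECT = "my-project-id"   # placeholder
REGION = "europe-west2"     # placeholder region close to the workload's users

def create_network_and_subnet():
    network = compute_v1.Network(
        name="geo-distributed-vpc",
        auto_create_subnetworks=False,  # custom mode: subnets are added per region as needed
    )
    compute_v1.NetworksClient().insert(
        project=PROJECT, network_resource=network
    ).result()  # wait for the operation to complete

    subnet = compute_v1.Subnetwork(
        name=f"app-subnet-{REGION}",
        ip_cidr_range="10.10.0.0/24",
        region=REGION,
        network=f"projects/{PROJECT}/global/networks/geo-distributed-vpc",
    )
    compute_v1.SubnetworksClient().insert(
        project=PROJECT, region=REGION, subnetwork_resource=subnet
    ).result()

if __name__ == "__main__":
    create_network_and_subnet()
```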
You only create subnets in necessary regions and put workloads there. You don't have to do VPC peering or pay for inter-region data transfer. GCP's VPCs are global by their nature, which makes things easier for architects and developers. Using VPC networks, we can achieve geographically distributed environment resources. But how we can set up a failover and route users from particular locations to particular regional resources? Cloud Load Balancing The load balancer's task is to distribute user traffic across instances of application. Though its main goal is to spread the load, it also has great features like health check, which makes it possible to use for "failovering" and can also distribute the load between regions to resolve performance.  There a few types of load balancers in GCP.[/vc_column_text][vc_single_image image="53988" img_size="full"][vc_column_text]Internal load balancers spread the load inside Google Cloud. External load balancers distribute traffic from network to VPC network.  External load balancers can be regional or global. The global load balancer makes it possible to distribute traffic between instances in different regions but Regional LB, as suggested by its name, distributes load only between instances in one region. Let's consider the design in the illustration below.[/vc_column_text][vc_single_image image="53989" img_size="full"][vc_column_text]We can use the external global load balancer as an entry point for users around the world. It allows us to route users in Asia to instances in the Asian region and users from the US to the US-central region accordingly. The internal load balancer makes it possible to spread the load between internal tiers in one region and between zones. And this resolves the third problem mentioned at the beginning of the article - performance. But what about high availability and disaster recovery? Health checks will do the job. You have to set up a health check for each load balancer. GCP has global and regional health check systems to connect to backends on a configurable, periodic basis. Every attempt is called a probe, and each health check system is called a prober. GCP records the result of each probe (success or failure). The backend that responds with a configurable response in a configurable time and/or number of probes is called healthy. The backend that fails to respond successfully is considered unhealthy. Cloud load balancers automatically failover. They stop sending traffic to unhealthy instances and send only to healthy ones. This resolves the disaster recovery and high availability issues I mentioned at the beginning of the article. Having external load balancers also makes infrastructure easier to maintain, deploy and scale. You don't have to use "DNS hacks" to achieve traffic distribution based on location or simply availability of service in some particular region.  I am convinced that it makes automation and DR actions much easier to implement. So, considering the design in the illustration above, we can say that we can use the advantages of a geographically distributed environment to improve high availability and performance, as well as create a good disaster recovery plan. However, there is one more tool that can create a more distributed environment and improve performance. Cloud CDN Sending user traffic to the closest region is good but expensive for performance purposes and doesn't resolve performance issues completely. Regions cover large areas and a regional datacenter can be far from a user. 
Cloud CDN (Content Delivery Network) uses Google's global edge network to serve content closer to users. Cloud CDN simply caches data at the edge location closest to the user. When a user requests content for the first time, the CDN takes the content from the origin server (e.g. an instance or storage) and caches it at the edge location. When the user requests content for a second time, or another user requests it from a nearby location, it is served directly from the edge location. It is typically used for static content like pictures, videos and web server responses. It doesn't make sense to cache frequently changed content. The picture below illustrates how Cloud CDN works.
Conclusion
In this article, we have highlighted the importance of geographical distribution and how it can improve availability, disaster recovery and performance in your system. Also, we’ve provided a high-level overview of several GCP cloud components: Cloud VPC, Cloud Load Balancing, and Cloud CDN, all of which can help us leverage the advantages of Google's distributed infrastructure and implement a geographically distributed environment. Going global with GCP is much easier because most of its services are global by nature. GCP makes it possible to be more efficient in aspects such as cost, automation, infrastructure design and performance. In my opinion, one of the most important skills in engineering (and software engineering is no exception) is the ability to select the proper tool for a specific task. If you’re building a global, geographically distributed environment, GCP is an excellent choice.
### How Social Media Can Improve Customer Retention Ratio
Today, whether you are running a well-established business or want to launch a venture, you cannot imagine your business without social media. Social media has gained huge attention in the last couple of decades as it enables business owners to reach out to their customers in a highly personalised way. Gone are the days when print media and radio advertisements were enough for brand owners to grab the user's attention. Today, if you want to expand your business's reach without any geographical boundary, you must consider digital tools that help your business flourish. These days, following Uber's huge success and that of other on-demand business models, many entrepreneurs have started to transform their traditional businesses because it ensures a quick return on investment. Now imagine you have developed a website and designed a fully functional app - how will you attract customers to your products and services? Having a strong customer base is key to success. According to HubSpot, good customer retention strategies are more affordable than acquiring new customers. 90% of businesses today leverage a social media strategy as it gives optimum results and helps the business increase its customer retention ratio. Here are the most effective methods that help you increase customer retention ratio through social media.
Importance Of Customer Retention
The above chart clearly shows that acquiring new customers is more challenging than retaining an old one. Hence, you need to craft a robust customer retention strategy that helps you expand your business and bring in new customers.
An excellent and well-depth customer retention strategy is important because it helps you reach your target goals. Here we are talking about the online taxi business. If you have adopted the uber taxi app script for your venture, then a good customer retention ratio will add value to your transport business and help you attract new clients on a daily basis. According to American Express, customer retention ratio is not all about offering affordable rates to clients; you also need to have loyalty programs that keep engaging customers with your business. So let's discuss in detail how social media plays an important role in customer retention.   # Keep Track Your Followers There are endless ways to increase followers, but you also need to entertain and educate your existing followers and customers through quality content. Social media platforms are not only about posting content and getting likes on that. Social media metrics allow you to gain insight into your follower's likes and dislikes. It will also give you an idea about their status and personal traits. Your online taxi business must know about your customer's behavior pattern and their status. For instance, if your majority of followers belong from the job background, they need a cab to commute. You can attract them by giving maximum discounts. It eventually increases the customer base and builds brand loyalty in the market. This will also help you make better customer-centric products. Moreover, you will also come to know about their pinpoints. It will help you create reliable social media content that will keep engaging them. This is the first and most important step towards making solid customer retention.   # Make Your Account Accessible Undoubtedly, publishing quality content will give exposure to your taxi business. Publishing sponsored content will promote your brand and increase reach among the proper audience. But how consumers will find your page, here are some of the affordable methods: Place your company logo as a display picture so that people can easily recognise your brand. Your page's name should be the same as the business name so that people can easily search without getting confused with other similar brands. Use relevant hashtags and quality graphics to improve your taxi business's visibility that will also attract a new set of customers and start to follow your page. Besides, keep replying to your existing customers; they will spread positive word of mouth about your taxi brand and generate revenue. # Listen To Customers One of the premium ways to increase customer retention ratio through social media is when businesses listen to their customers and take serious actions based on their activities. Social media channels enable your customers to share and exchange their opinions about your taxi business. As a brand, you have to listen to them carefully because that will help you craft a well-versed social media marketing strategy for your taxi business. It means listening and responding to them on social media adds value and builds faith among the customers. Depending on their comments and feedback, you can improve taxi operations and assist customers in a better way. For instance, you analysed user's comments within various social media channels and observed there are lots of users talking about cab types. If you are right now only dealing with Sedan, you should now focus on offering an SUV. It will increase the customer base, and your existing customers will also prefer to book your cab over others.   
# Loyalty Programs Loyalty programs, attractive deals, and special offers can enhance your customer retention ratio. If you offer hot deals and discounts, they will likely stay with your brand. Even if they are new customers, loyalty programs, and deals will compel them to think twice before booking a cab from anyone else. Give an incentive that entices your consumers so that they will choose your brand over anyone else. For instance, you can get five free rides to the user who installs your app. Another idea is to give special services or discounts to those who refer a new customer to your business. Once you give a loyalty program and special deals to the users, they will share positive reviews about your brand. More positive reviews on social media mean people will automatically follow your brands and will think to access your service. Even observing more stars and positive reviews on social media, new users will also take advantage of exclusive offers.   # Post Sharing Time Timing plays a key role when you are using social media to increase customer retention. Due to the corona pandemics, usage of smartphones and social media has suddenly risen so if you want to retain customers, you should pick the right time. If you want to increase the post's reach, you should know when your majority of audiences are active on social media. If you post quality content, but no one notices them, your efforts will go waste. On the other side, maximum likes and exposure will amount to maximum active engagement. Social media allows brand owners to interact with their audience. Usually, brand owners choose a night and office timings when posting the content. Advanced tools like Buffer and HootSuite enable business owners to schedule posts as per their time. If you post content at the appropriate time, you can also easily engage with them. Simply listening to your customers will feel them valued so that they can easily trust your brand.    Final Thoughts Thus, promoting your products through social media will effectively increase sales and help you make a solid customer base, helping you improve business operations even better. Focusing on your existing customers streamlines your marketing results. Make sure keeping customers happy through world-class services and loyalty programs increase your followers rapidly and build brand credibility in the long run.[/vc_column_text][/vc_column][/vc_row] ### A Beginner’s Guide to Azure File Storage [vc_row][vc_column][vc_column_text]Azure File Storage (AFS) enables you to store and manage file directories in the cloud, and access your data via Server Message Block (SMB). You can use AFS to share files across multiple machines, and distribute storage with automatic, local data duplication. AFS also provides data encryption for data at rest and in transit, via SMB 3.0 and HTTPS connections. In this article, you will learn what Azure File Storage (AFS) is, what you can do with Azure Files, and what are the cons and pros of AFS. Azure File Storage (AFS): A Brief Introduction Azure Files is a storage service that you can use to achieve file-directory storage in the cloud. With it, you can store any data you would in a traditional file system, including documents, media, and logs. It is based on the Network File System (NFS) protocol and allows access via Server Message Block (SMB). With Files, you can re-create your on-premises file solutions with the following benefits: Shared access—you can share file systems across multiple machines, applications, and instances. 
This enables you to allow distributed access while ensuring that all users have access to the same assets. Users can access files via SMB, REST API, or client libraries. Scripting and tooling—you can configure and manage Files via PowerShell cmdlets, Azure CLI, or a built-in UI. Resiliency—storage is distributed with automatic, local data duplication for greater resiliency and availability. You also have the option to duplicate data across availability zones for disaster recovery. Scalability—you can autoscale your file share as needed up to 5 PiB. If you need more storage than that, you can attach additional storage accounts (up to 250). Security—data is encrypted at-rest and in-transit via SMB 3.0 and HTTPS connections. You can also restrict access to files via Active Directory (AD) controls. Use Cases for Azure Files There are multiple ways you can integrate Files into your cloud workflows. Below are some common use cases: File server—serves as a replacement for on-premises file stores or NAS-attached storage. You can set this up solely in the cloud or as a hybrid share with the addition of Azure File Sync or other native file storage solutions, like Azure NetApp Files. Lift and shift—you can use to migrate applications that require a file share without modification. You can achieve this by moving the application and application data or by connecting your on-prem applications to Files. Monitoring and analytics—allows the centralised storage of log files, metrics, and reports for analytics tools and enterprise monitoring. This ensures analytics are performed uniformly and prevents loss of data due to instance or service failures. Development and testing—enables you to create a repository of code and utilities and software used for software testing and development. This allows you to share configuration files, diagnostic data, and provide access to tooling from cloud environments. Azure File Storage Pros and Cons Azure files can be an excellent solution for creating a cloud file share, however, it may not be perfect for everyone. When deciding whether Files fits your needs, consider the following pros and cons.[/vc_column_text][vc_single_image image="53959" img_size="full"][/vc_column][/vc_row] ### Building trusted, compliant AI systems [vc_row][vc_column][vc_column_text]Companies are increasingly deploying artificial intelligence (AI) as they race to extract value from the personal data they hold. As with any project using personal data, AI is subject to data protection requirements arising from the GDPR in Europe and similar laws elsewhere. However, some types of AI, particularly systems based on machine learning, pose specific data protection risks and challenges. The Information Commissioner's Office (ICO) recently issued new guidance on AI and data protection, which aims to support companies as they embark on AI projects. It helps companies to identify and assess risks, and to design and implement measures to mitigate those risks. This article offers practical suggestions for companies as they implement the new ICO guidance and build trusted, compliant AI. More AI, more data and more AI regulation More companies are using AI to process personal data, in some cases without a comprehensive risk assessment. AI often uses large volumes of personal data. Large scale, complex systems carry specific risks. This has prompted regulators to issue specific guidance on AI. McKinsey’s 2019 survey of over 2,300 companies showed that 80% had trialled or adopted AI. 
Of those, 63% reported revenue increases and 44% realised cost savings, but 41% didn't comprehensively identify and prioritise AI risks. Worryingly, only 30% were working to mitigate privacy risk. McKinsey defined AI in terms of machine learning, excluding rules-based systems or those based on decision trees. Machine learning often grabs the headlines, but other data-driven decision making can have significant impacts on individuals. For example, Ofqual’s system for assigning grades to students whose exams were cancelled because of COVID-19 was a statistical model based on attainment data, not a machine learning system. Broadly speaking, machine learning systems work by identifying patterns in ‘training’ data, then applying those patterns to make inferences about individuals. For instance, a bank’s AI could analyse historical data on loan repayments to identify factors correlated with defaults. The system could then make a prediction about the likelihood of a new loan applicant defaulting. The number of factors an AI model can take into account is staggering. GPT-3, an AI system for natural language processing, considers 175 billion parameters to suggest the most likely word to follow the one you just typed. In a personal data context, Acxiom, a data broker, claims to offer 5000 parameters on 700 million individual consumers. Companies can combine data from a broker with internal data (e.g. purchase history) to build even richer individual profiles. The GDPR focuses on protecting individuals and is technology neutral. It doesn’t distinguish between machine learning, rules-based or other types of processing. Instead, it focuses on the outcomes: profiling, automated decision making, and the risk of harm. The ICO guidance joins a growing list of regulatory interventions specifically on AI. The EU first investigated AI in 2014, and has published a number of documents covering safety and liability, setting the future regulatory direction and advising companies on how to build trustworthy AI systems. In the United States, the National Institute for Standards and Technology (NIST) recently launched a consultation on their draft AI explainability guidance. Specific regulatory guidance on AI is a clear trend; companies should take note. What are the risks? Effective risk management helps to build trust and confidence in your AI system, and helps you to comply with the GDPR. The ICO notes that “in the vast majority of cases, the use of AI will involve processing likely to result in a high risk to individuals’ rights and freedoms”. The specific risks involved will depend on the context, including: how the system is designed and built, how and when it is deployed and the impact that using AI has on individuals. The scale and complexity of AI systems can make it difficult to demonstrate compliance. The ICO guidance provides a useful framework to structure your approach to assessing risk. It highlights three focus areas: the ‘lawfulness, fairness and transparency’ principle, data minimisation and individual rights. Scale Many AI systems process large volumes of personal data. Processing data on a large scale can cause issues for data minimisation, raise surveillance concerns and might trigger the requirement for a data protection impact assessment (DPIA). Scale also carries a legal risk. Class action lawsuits, such as Lloyd v Google - a legal challenge to Google’s data collection on behalf of four million iPhone users - could result in a significant award for damages. 
Similarly, the Privacy Collective say that their class action suit against Oracle and Salesforce could cost those companies up to €10 billion. If these cases succeed, they would put the regulators’ power to impose fines of up to 4% of turnover or €20 million into perspective. Companies may find themselves at greater risk from legal claims for damages than from regulatory fines. Lawfulness, fairness and transparency Lawfulness requires that you have a legal basis for processing personal data. In an AI context, that may mean relying on different legal bases at different stages of the system’s life cycle. For instance, you may be able to rely on legitimate interest to use personal data you already hold, with appropriate safeguards in place, to develop an AI model. Deploying the model into a specific context will often require another legal basis. Bear in mind that the requirements for processing special category data (e.g. in a facial recognition system) or automating decision making are particularly stringent. Fairness requires that your AI system is sufficiently statistically accurate, avoids discrimination and that you consider individuals’ reasonable expectations. The GDPR requires you to ensure that personal data is accurate and up to date. It empowers individuals to challenge the accuracy of data about them through the right to rectification. The ICO distinguishes between accuracy in the data protection sense and statistical accuracy, which refers to how closely an AI system’s outputs match ‘correct’ answers as defined in the test data. Making decisions about individuals, such as whether to approve a loan application, on the basis of statistically inaccurate information is problematic. It can undermine trust in the system and may breach the fairness principle in data protection law. It can also be bad for business, as inaccurate decisions could mean missed opportunities for revenue or higher costs. The ICO describes transparency as “being open and honest about who you are, and how and why you use personal data”. In an AI context, this means understanding when and how you use AI decisions, how the AI system itself generates its output and being able to explain both of these to the individuals affected. You should also provide information on the purposes for which you use personal data and your legal basis for doing so in your privacy notice. A machine learning system that processes thousands of parameters relating to millions of individuals to identify patterns may be difficult to explain clearly. How does each parameter contribute to the model? Transparency also underpins fairness. Understanding how a system works means that you can check for bias. In some contexts, the risk of bias or discrimination triggers further regulatory requirements. For example, the UK’s Equality Act bans discrimination on the basis of the nine protected characteristics, and the New York State Department of Financial Services investigated the Apple Card after allegations that it discriminated against female applicants. Data minimisation Big Data often assumes that more data means better AI, implying a tension between AI and the data minimisation principle. The ICO is clear that using data on the basis that it might be necessary breaches the data minimisation principle even if the data, in retrospect, turns out to be useful for the AI project. I’ve argued that effective data minimisation can improve AI. 
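As an illustration of what data minimisation can look like in practice, the sketch below uses pandas with hypothetical column names to drop direct identifiers, pseudonymise the customer key and keep only the fields the documented purpose requires. It is a simplified example under those assumptions, not a compliance recipe.

```python
# Illustrative data minimisation step before curating a training set.
# Column names are hypothetical; the real decisions belong in your data governance process.
import hashlib
import pandas as pd

DIRECT_IDENTIFIERS = ["name", "email", "phone"]                          # drop entirely
NECESSARY_FEATURES = ["income", "loan_amount", "missed_payments", "defaulted"]

def minimise(raw: pd.DataFrame) -> pd.DataFrame:
    df = raw.drop(columns=DIRECT_IDENTIFIERS)
    # One-way pseudonym so records can still be located for rectification or erasure
    # requests without exposing the raw customer ID (a real system would use a keyed hash).
    df["customer_ref"] = df["customer_id"].astype(str).map(
        lambda v: hashlib.sha256(v.encode()).hexdigest()[:16]
    )
    # Keep only the fields the documented purpose actually requires.
    return df[["customer_ref"] + NECESSARY_FEATURES]

raw = pd.DataFrame({
    "customer_id": [101, 102],
    "name": ["Ann", "Bob"],
    "email": ["ann@example.com", "bob@example.com"],
    "phone": ["07000 000001", "07000 000002"],
    "income": [42000, 31000],
    "loan_amount": [9000, 15000],
    "missed_payments": [0, 3],
    "defaulted": [0, 1],
})
print(minimise(raw))
```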
Individual rights The ICO recognises that using personal data for AI may make it harder for companies to fulfil individual rights to information, access, rectification, erasure, restriction of processing and notification. Some of these are connected to the principles above; transparency can help you to provide information to the data subject. In some cases, AI will involve profiling and automated decision making. The GDPR specifically requires companies to provide “meaningful information” about how the AI system generated its output and what that means for the data subject. The ICO’s AI explanability guidance provides much more detail on what explaining AI means in practice. How can companies respond? Companies can respond to these challenges by implementing robust data governance. A data governance process helps you to understand your data, think broadly about risks and put the necessary checks and balances in place to manage them. Training data is crucial to AI systems. Understanding the data on which a model is based can improve transparency and explainability, and help to identify and reduce bias. Most companies curate a training dataset, usually a subset of data they already hold, as a basis for their AI project. Data governance models should include a process for deciding which data is necessary, checking for risks and unintended consequences and deciding how the data should be treated once the model has been developed (e.g. should it be retained for explainability purposes?). The training data may be de-identified, to mitigate privacy risk to individual data subjects, or may be weighted to ensure that it is sufficiently representative. The ICO’s AI explainability guidance includes a section on collecting and preparing your training data in an explanation-aware manner. This includes documenting decisions so that you can audit them later. This could become more important as other AI regulations emerge. The European Commission’s AI White Paper proposes that companies should be required to document the training dataset (i.e. its characteristics, what values were selected for inclusion, etc.) and in some cases retain a copy of the training data itself to allow issues with the model’s performance to be traced and understood. The White Paper is a consultation document, so it’s too early to say whether these specific recommendations will become law. However, it’s clear that the trend is towards a greater focus on training data as a key element of building compliant AI systems.[/vc_column_text][/vc_column][/vc_row] ### 5 Great Tips to Build a More Efficient Data Center for Your Business [vc_row][vc_column][vc_column_text]Higher operational efficiency is one of those long-term goals that every company strives to achieve, and there are many ways you can make your business more efficient in the digital age. One of the most powerful changes you can make is to transform your data center and your entire IT infrastructure to become cost-effective and power-efficient in order to maximise your investment and minimise waste. While this project might seem like a grand investment in and of itself, it’s important to remember that nowadays there are ways to ensure efficiency without breaking the proverbial bank. In fact, you can do this quickly and without wasting resources or experiencing any downtime. 
The tips and solutions we'll be talking about today work wonderfully well for companies of all sizes, and some will be geared towards in-house IT infrastructures while others pertain to your IT infrastructure in the cloud. With all of that in mind, let's talk about the power-saving and cost-reducing tips that will make your data center more efficient in the long run. Measure and monitor performance First things first, always keep in mind that it's impossible to make the right decisions in business if you don't have relevant and actionable data. After all, you can't control and improve something that you can't measure, so your first order of business should be to monitor and measure the performance of your data center. This will give you invaluable insights into the areas that might be wasting energy and resources, as well as the areas where you might be wasting storage capacity on things you can easily eliminate from your data storage. You have to know where all of your power is going at all times, so you have to measure the power consumption of your IT systems, UPS control, chillers, and lighting for your on-site IT infrastructure. You should have a comprehensive dashboard where you can monitor all of these metrics, after which you can create actionable reports to discover the most wasteful areas. You can then start devising a data-driven strategy to minimise waste and maximise efficiency. Optimising data center cooling Cooling is one of the most wasteful processes in any IT infrastructure, at least when it comes to energy consumption. It's important that you always monitor the power draw of your data center cooling systems so that you know whether you're meeting the minimum requirements for efficiency. That said, you should also strive to implement better cooling solutions over time to gradually elevate efficiency and ensure better financial and energy savings while prolonging the lifespan of all the components in your data center. Some of the best practices when it comes to data center cooling include installing economisers, containing the heat through isolation structures and heat funneling systems, and optimising air conditioning systems. Optimising your air conditioning system might be the toughest of the three, though, as you need it to be operational at all times. Nevertheless, you can turn it off periodically to let other cooling systems take over, or you can vary the speed from time to time to lower the overall energy expenditure. Making the shift to HCI When it comes to cloud-based IT infrastructures, you first have to find the right vendors to work with, preferably ones that emphasise data center efficiency so that you get the best performance, high scalability, stellar security, and more. Some of the best hyperconvergence vendors on the market, for example, offer advanced HCI solutions that simplify the traditional IT infrastructure and provide excellent stability along with quick and easy scalability. For these and many other reasons, companies are increasingly migrating to the cloud and transitioning to a hyper-converged infrastructure that can meet all of their business requirements. Of course, it's not just about the efficiency of your data center; it's also about ensuring top-of-the-line security on all fronts that will keep your data safe at all times. To achieve this, an HCI infrastructure has advanced firewall solutions as well, like a distributed firewall, along with comprehensive disaster recovery and backup plans.
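To put the measure-and-monitor advice above into practice, the most widely quoted data centre efficiency metric is Power Usage Effectiveness (PUE): total facility power divided by the power drawn by the IT equipment itself. A minimal sketch, with invented readings:

```python
# Illustrative only: the readings below are invented for the example.
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness = total facility power / IT equipment power.
    1.0 is the theoretical ideal; higher values mean more overhead
    (cooling, UPS losses, lighting) per useful watt of IT load."""
    if it_equipment_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_equipment_kw

readings = {"it_servers": 180.0, "ups_losses": 12.0, "chillers": 95.0, "lighting": 4.0}
total = sum(readings.values())
print(f"PUE = {pue(total, readings['it_servers']):.2f}")  # (180+12+95+4)/180 ~ 1.62
```

A PUE that falls over time is a good sign that cooling and power-distribution improvements are paying off.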
Optimise and improve IT power consumption Another crucial element of an efficient IT infrastructure and data center is power management, which is why one of your priorities should be to safely lower the amount of payload power you need for your data center. Given the fact that the majority of that power is consumed by the servers, you can lower energy consumption by eliminating unnecessary workloads, consolidating virtual machines, virtualising more workloads than before, eliminating the servers that are consuming too much power without adding value, and replacing old servers with new and efficient ones. Manage your data wisely and continuously More often than not, a thorough audit of your existing business data will reveal that you are storing way more data than you need to, and that most of it could be eliminated without batting an eye. Instead of letting useless data take up too much space in your data center, be sure to eliminate all the data you no longer need. You can start by eliminating duplicate data – and you can expect to find thousands of duplicates if you’re thorough. Your IT experts can employ solutions such as deduplication, cloning, and thin provisioning to eliminate unnecessary data with ease. Wrapping up Achieving greater data center efficiency might not seem like an easy task at first, and granted it can be a complex process at times, but that doesn’t mean that you don’t have some powerful and affordable solutions at your disposal. Use these tips to make your data center more efficient and boost the efficiency of your entire business as a result.[/vc_column_text][/vc_column][/vc_row] ### Collaboration without the security chaos [vc_row][vc_column][vc_column_text]Research from the ONS estimates that – at the height of the pandemic - a record 49.2 percent of employees worked remotely. It sparked widespread reliance on cloud-based collaboration tools like Microsoft Teams and SharePoint, which quickly became synonymous with the crisis. By the end of April 2020, Teams use had grown to 75 million daily active users, adding 31 million in just over a month. Today, as businesses start to emerge from lockdown, it is becoming apparent that the new normal will be a hybrid mix of home and office working. While advice for employers is to start bringing people back into the workplace where it is safe to do so, it is unlikely there will be a widespread return to pre-COVID levels of office-based staff. The flux between home and office networks exposes new entry points for cyber criminals  and, if access privileges are not managed carefully, it can leave organisations  exposed to fresh security exploits. As businesses now embark on a new phase where flexible working is standard, and our reliance on cloud-based collaborative tools looks set to continue, here are some steps that all organisations can take to fine-tune their security and avoid any business disruption in the months ahead.   With remote access comes risks Remote working carries a variety of different security risks. Cyber criminals are aware of this and are quick to probe for weaknesses. Brute force attacks against VPNs, alongside credentials phishing and command and control-based attacks are commonplace. The likely success of these attacks is heightened by the fact that many more workers are accessing corporate resources from personal machines and devices that do not meet corporate cyber security standards. 
Applications like Microsoft SharePoint and Teams have made sharing data and collaboration between colleagues, regardless of location, extremely easy. However, what most troubles security professionals is what happens behind the scenes. Establishing a new group on Teams automatically creates a new site in the company's SharePoint for the whole group to access. At the same time, new Azure Active Directory (AD) groups are nested within the local groups, and a hidden mailbox is created in Exchange. OneDrive is also used as a data store – any file shared within a Teams chat is actually saved in OneDrive, and a sharing link is created for the recipient(s). All of this means that Teams users in the group can share and work on documents freely together, completely outside the IT department's control. The owners of a group can also invite other users, even if they are external to the organisation. This all adds up to a happy hunting ground for threat actors. The risk is that malicious insiders and external attackers with stolen credentials may be given free rein to access sensitive data and exfiltrate it while neatly sidestepping several layers of security defences. Sharing without compromising security SharePoint and Teams have been indispensable for firms forced to temporarily abandon office-based working. The danger, however, is that with control over access to company data now in the hands of uninitiated users, weak links in the chain are almost inevitable. It is possible to mitigate the risks associated with collaboration tools with a few basic security steps. Step one is to implement a least privilege policy. This ensures users only have access to the information they need for their job. It's a move that immediately reduces opportunities for external intruders and malicious insiders to access and exfiltrate sensitive data. A second step is to introduce a range of control measures - from rules preventing the transfer of company data to unauthorised devices to banning users from sharing links to company files without permission. Where individuals need to share sensitive company information with third parties, we recommend adding them as guests to Azure AD and granting them appropriate access from there. Another sensible precaution is to set an expiration date for all user-created links. Finally, organisations should monitor SharePoint for signs of suspicious activity. Examples might be unusual folder or admin activity, or alterations to group membership. Anomalous SharePoint activity could be a sign that either an employee or an external threat actor is abusing the capabilities of SharePoint, Teams and other services. In summary, best practice dictates that firms have a full inventory of sensitive data assets and know exactly where they are stored – whether on premises or in cloud platforms. At the same time, the principle of least privilege, confining user access to the data needed for their job, should be applied. These simple steps can do much to compensate for the lack of visibility and control associated with cloud collaboration tools. In this way, companies can enjoy all the benefits of cloud collaboration while avoiding the chaos of unfettered data access.[/vc_column_text][/vc_column][/vc_row] ### Will On-prem Databases Survive?
[vc_row][vc_column][vc_column_text]As companies of all sizes modernise their applications and systems, they are increasingly turning to the cloud as a means of enabling faster innovation and agility, saving cost, remaining competitive, growing revenue and adapting to evolving markets. Indeed, by 2022, three-quarters of all databases will be deployed or migrated to a cloud platform, with just 5% ever considered for repatriation to on-premises, according to Gartner. Already, cloud databases accounted for almost 70% of the market growth in the database market in 2019. Yet, there’s another factor that’s accelerating this migration to the cloud at an even faster pace than originally predicted. COVID-19 changed the world suddenly and dramatically. In March 2020, nearly the entire world came to halt in an effort to control the pandemic. Businesses not enabled for an all-digital, no-touch world had to quickly figure out how to move forward, how to evolve and how to keep up with the changing world. The cloud served as a key resource to help companies adopt digital approaches quickly. Whether it’s to reduce the reliance on a physical location for work or business, or to gain agility to handle increased demand at a time when most people are ordering everything online. For many businesses around the world, the desire to move mission critical applications to the cloud suddenly became a matter of a company’s survival. Recent research in polling over 500 enterprise engineering and IT professionals at manager level and above across the UK, US, France and Germany, found that 40% of respondents are now accelerating their move to the cloud because of COVID-19. The cloud enables businesses to increase their compute capacity without the need to invest in hardware and facilities. Cloud-based services, such as a database-as-a-service (DBaaS), can instantly provision the technology a company needs to power their digital businesses. Without relying on physical locations and by eliminating many time-consuming tasks, a DBaaS can change the way data is stored, accessed and analysed, freeing existing teams to work on more strategic initiatives. What’s more, companies realise that many structural changes are here to stay and future disruptions - be it a second round of shutdowns as a result of COVID-19, another pandemic or an entirely different disaster - need to be anticipated and planned for. The same survey found that nearly three quarters (74%) of respondents indicated that they expect a second wave of the pandemic, and over half (51%) plan to move more applications to the cloud to prepare for it. However, accelerated cloud adoption is not the universal response to the crisis. Close to a quarter (24%) of respondents said they were slowing down their move to the cloud because of the impact of COVID-19. This varied from country to country, with the UK indicating the lowest percentage of slowing down (12%) and the US the highest (36%). Careful planning is necessary to complete a successful migration to the cloud. Companies who have not yet embraced the cloud should invest in best practices to successfully move their mission critical applications in a way that isn’t disruptive to the business. There are many cloud solutions to choose from. It’s important to fully understand your requirements and objectives. Is it important to have database support for your mission critical application? What is your HA/DR strategy? Are you comfortable committing to a single cloud provider or is multi-cloud important to you? 
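To make the provisioning point concrete, here is a hedged sketch of creating a managed PostgreSQL instance with high availability and automated backups on AWS RDS via boto3. The identifiers and credentials are placeholders, and other clouds offer equivalent APIs.

```python
import boto3

# Placeholder names and credentials; in practice pull secrets from a vault.
rds = boto3.client("rds", region_name="eu-west-2")

rds.create_db_instance(
    DBInstanceIdentifier="orders-prod",
    Engine="postgres",
    DBInstanceClass="db.m5.large",
    AllocatedStorage=100,
    MasterUsername="app_admin",
    MasterUserPassword="replace-with-secret",  # never hard-code in real code
    MultiAZ=True,                  # synchronous standby in another AZ for HA
    BackupRetentionPeriod=7,       # daily automated backups kept for a week (DR)
)
```

Note how the HA/DR questions above map directly onto parameters such as MultiAZ and BackupRetentionPeriod, which is why it pays to answer them before committing to a provider.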
Understand not only what your needs are now, but also what they may be in the future. One business that has successfully pivoted its operations during COVID-19 with the help of the cloud, and of earlier decisions around its technology stack, is Tock, an established online restaurant reservation and payment platform. While the pandemic has impacted global economies in countless ways, the shutdown has been especially devastating to the restaurant industry. To help address the crisis, Tock pivoted overnight to offer 'Tock To Go'. Within days of shelter-in-place orders being implemented, Tock To Go offered a solution for restaurants to sell pre-paid meals to go on a time-slotted basis. This provided clients with a lifeline to keep revenue coming in so they could keep staff employed while also managing their kitchen capacity during peak hours. Tock was able to act so rapidly and address the need for pre-payment from customers thanks to the solid technological underpinnings of its application design and architecture. Building the application on top of an open source database running on Google Cloud Platform (GCP) allowed the Tock team to innovate quickly to support its business and technical requirements as the world shifted so rapidly. Tock established itself as a game changer for its customers, not just through the immediate pivot to take-out, but by supporting the longer-term shift in the restaurant industry towards lower-capacity indoor dining and a higher volume of take-out and delivery, while avoiding the margin hit of up to 30% that many delivery services command. COVID-19 has impacted every business around the world and has changed our behaviours in significant ways. Companies, big and small, are realising the benefits of moving to the cloud to deal with these unprecedented challenges and keep up with changing times. Are you ready?[/vc_column_text][/vc_column][/vc_row] ### Reinventing the Enterprise Cloud For the New Normal [vc_row][vc_column][vc_column_text]A global pandemic has fundamentally changed human behavior. Where we go, the products we consume, how we consume them, the supply chains that deliver them and the way we work and interact with each other have all been altered. To suit stay-at-home and quarantine orders, reliance on digital means has increased dramatically. As of April 2020, Pew Research Center reports that roughly half of surveyed U.S. adults (53 percent) say the internet has been essential for them during the pandemic, while other reports observe that, compared to March 2019, daily average in-home data usage increased by 38 percent in March 2020. Naturally, as a result of this disruption and increased virtual dependence, businesses and their digital transformations have become hyper-accelerated. Redefining the future of how and where work gets done through a newly remote workforce has meant that large corporations are shifting toward Artificial Intelligence (AI) for business automation processes and insights, and increasing their focus on constructing the enterprise cloud as a business platform. These two trends are tightly interrelated, in that AI-driven businesses motivate enterprise cloud adoption. Yet, at the core of cloud is one key process: governance. The policies and procedures through which the cloud is operated can be the difference between cloud success and failure, and in this era of disruption, it can be the difference between keeping ahead of the curve and getting left behind entirely.
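One practical way to express those policies and procedures is as policy-as-code: governance rules written once, kept separate from any particular implementation, and evaluated automatically against the resources actually deployed. A minimal, illustrative sketch in which the rules and resource records are invented:

```python
from dataclasses import dataclass
from typing import Callable, Dict, List

@dataclass
class Resource:
    name: str
    properties: Dict[str, object]

# Policies are plain rules, kept separate from how any one cloud implements them.
POLICIES: Dict[str, Callable[[Resource], bool]] = {
    "storage-must-be-encrypted": lambda r: r.properties.get("encrypted") is True,
    "no-public-network-access": lambda r: r.properties.get("public_access") is False,
}

def evaluate(resources: List[Resource]) -> List[str]:
    """Return human-readable violations, suitable as audit evidence."""
    return [
        f"{r.name}: violates {policy_name}"
        for r in resources
        for policy_name, rule in POLICIES.items()
        if not rule(r)
    ]

inventory = [
    Resource("orders-bucket", {"encrypted": True, "public_access": False}),
    Resource("legacy-share", {"encrypted": False, "public_access": True}),
]
for violation in evaluate(inventory):
    print(violation)
```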
The enterprise cloud is the future of not just corporate computing, but corporate survival, offering speed, agility, AI, business continuity, disaster recovery and beyond. In short, the enterprise cloud is the solution to the post-COVID-19 new normal, and in order to meet the needs of the new normal, a cloud driven by automated governance is key.   Automated Governance: The Answer to Lingering Questions Cloud-based services and technologies promise increased speed and corporate agility thanks to access to instantaneous compute capacity and an ever-increasing range of development tools. Yet, underlying cloud infrastructure lacks comprehensive visibility, and it often undergoes upgrades without informing enterprise customers or tenants. Lack of cloud infrastructure visibility within an application’s dependency map creates a lack of trust and confidence, and management becomes difficult because you can’t effectively manage what you can’t measure or see. Furthermore, a lack of visibility can mean compromised compliance without the organisation’s knowledge. Enterprises need a way to ensure that every element of the cloud deployment pipeline is protected and compliant, even as demands expand and the speed at which results are required increases. This is the role of automated cloud governance, or rather, an automated process for tracking governance throughout the software delivery process. Automated cloud governance provides the tools that enable the shared responsibility model that has been talked about but not yet implemented on a large scale. However, before wide-scale realisation is reached, there are many considerations to navigate. For example, when it comes to deploying strong cloud governance, three steps should be kept in mind. The first is outlining the cloud strategy, which means understanding individualised compliance and regulatory needs of the data involved and defining and documenting appropriate control practices. Next comes the operation and monitoring phase, ensuring independent verification of controls can be performed, providing visibility and transparency into third-party consumption and more. Finally, the audit and evidence stage is reached, which involves creating a historical inventory of cloud controls, provenance and applicability scope, as well as a history of raw service consumption, usage and configuration with internal identity linkage. To simplify the process, a cloud governance policy should provide for abstractions that can help guide implementations independently and manage the complexity of cloud computing environments. Not planning for any abstraction between implementations and policies means thousands of separate guidelines would exist, making the environment nearly unmanageable. Policy that is separate from the implementation simplifies everything because it enables the implementations to be backed by the policy instead of being directly tied to it. Resources that discuss aspects of automated cloud governance like those above and outline deployment models are key for ensuring the IT and business world understand how to bring the associated benefits to fruition. The process of discussing and educating the market will be key to ensuring the cloud can evolve to meet the needs of the new world.   Continuing on the Problem Solving Continuum In the ongoing recovery from the pandemic, organisations have realised that business and digital strategy are synonymous. 
A result of this is a keen need for a model that increases secure cloud consumption through automated cloud governance. While the variability of the cloud makes it difficult to track and maintain management and operational aspects, strong governance in hybrid and multi-cloud computing is achievable — but questions still remain. As the automated cloud governance model gains traction, the next questions to be answered involve understanding how much infrastructure stays on-premises, how private data centers should be deconstructed and more.[/vc_column_text][/vc_column][/vc_row] ### The impact on workplace IT [vc_row][vc_column][vc_column_text]The Covid-19 pandemic has undoubtedly changed where and how we work, with businesses having to adapt to remote working as soon as lockdown measures were introduced back in March. As we come out of the crisis, it could be argued that remote working will remain over the coming months and even into next year and beyond. This is to protect employees during a staged return to work and while social distancing measures are required, and because some will want to continue to work from home moving forwards. The likes of Shopify, Twitter and Facebook have all told employees they can now work from home 'forever', and smaller organisations will have to follow suit to offer the same work/life balance. To facilitate remote working, businesses had to turn to new technologies and processes, and below we discuss what these are and how they can be used moving forwards. We will also talk about what businesses need to do to ensure that employees can work from home safely and securely from a cyber security perspective. Network capacity: According to the Office for National Statistics, 70% of the UK workforce (23.9 million people) had no experience of working from home prior to Covid-19 lockdown measures. But with businesses forced to send employees home, there was a huge surge in the demand being placed on IT infrastructures and networks not necessarily designed for remote working. To cope, organisations were forced to upgrade systems to ensure they had sufficient bandwidth and licensing to handle the rise in VPN traffic. Organisations should maintain this increased network capacity over the coming months while employees are still working remotely, and to ensure they are ready in the event of a second wave. The good news is that network capacity can be dialled up and down in line with demand, so organisations can assess on a month-by-month basis what they need. However, as businesses adjust to the 'new normal' way of working, it is likely that the percentage of those who continue to work remotely indefinitely will increase. Businesses should therefore look into a long-term solution for boosting their network capacity. Mobile connectivity should also be a consideration, especially with the introduction of 5G into the country. Cloud solutions: Cloud computing is absolutely the most effective and efficient way to run IT systems and networks, allowing employees to access information, data and apps no matter where they are. We saw a surge in the number of organisations migrating to the cloud in the early days of the lockdown to ensure their staff could continue to work without any major disruptions. For some, this meant upgrading their systems and products, but as a result they can leverage the many benefits of the cloud, including added security and streamlined costs, now and into the future.
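Returning to the network capacity point above, the sizing itself is simple arithmetic; the harder part is revisiting it as remote headcount and working patterns change. A hedged sketch, with invented figures:

```python
def required_vpn_capacity_mbps(concurrent_users: int,
                               avg_mbps_per_user: float = 1.5,
                               headroom: float = 0.3) -> float:
    """Estimate the gateway bandwidth needed for remote workers.

    avg_mbps_per_user should come from your own monitoring (video calls
    dominate); headroom covers peaks and a possible second-wave surge.
    """
    return concurrent_users * avg_mbps_per_user * (1 + headroom)

# Example: 800 staff, with 60% connected at the busiest point of the day.
peak_users = int(800 * 0.6)
print(f"{required_vpn_capacity_mbps(peak_users):.0f} Mbps")  # 480 * 1.5 * 1.3 = 936
```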
For those who do not wish to migrate completely to a cloud environment, a Hybrid solution can be the best of both worlds. A Hybrid Infrastructure is both an on-premise and cloud environment that works in synergy to bring a seamless and efficient experience. It permits your business to be more efficient yet still allows for support of legacy applications that need to remain on-premise, whilst using the cloud to boost business agility. Added security: While the world was on lockdown, cyber criminals were hard at work launching a barrage of attacks on unsuspecting businesses – in April alone, Microsoft released 113 security updates to fight this. With employees working from home, the risk of a breach increases significantly, as human error is the number one cause of hackers gaining access to systems. From phishing scams to DDoS attacks via personal devices and undertrained employees, organisations and their employees are under constant fire. Whilst many are working remotely, some businesses may have forgotten about their backup processes, and some backups may not even run properly when managed remotely, exposing a weakness in your cyber security defences. Businesses need to get on top of this, especially if staff are going to be working from home for the foreseeable future and perhaps forever. To do this, we recommend taking the following steps: conduct a full security audit now to identify potential flaws; don't neglect email security – make sure that email filters are switched on; remember that mobile security matters – all devices used by staff need to be configured with security-optimised passwords, usernames and multi-factor authentication; ensure that every end-user device assigned by the business also comes with up-to-date anti-virus software; protect corporate systems – firewalls are vital; and regularly back up data – also have a strategy in place for disaster recovery. Communication is key: Communication and collaboration are some of the biggest challenges when staff are working from home, whether due to the crisis or just generally. Now is the time to adopt a cloud telephony solution; it will help aid communication now and in the future. Cloud telephony takes your business phone system to the cloud, so you no longer need to be burdened with wires and clunky desk telephones when working remotely or when returning to the office. Thanks to powerful apps for mobile, laptop and desktop, teams can make and receive multiple calls, regardless of where they are located. To help, organisations should look to technologies such as Skype for Business, Slack and Zoom, as well as file sharing products like Dropbox and Google Docs. However, something to consider is that these products are often used in isolation, meaning that employees have to download and use different applications that are not connected. This is why we recommend Microsoft Teams – it provides a communication channel but also links up to other Microsoft products such as Word, Outlook and SharePoint, making it more cohesive. For example, you can video call colleagues and arrange meetings with multiple contacts, store, share and work on documents, and instant message in real time. It also has the capability to integrate many third-party apps. This means that you won't need to start all over again with integrated systems but can simply connect them into Teams.
The new way of working: The Covid-19 pandemic has changed the way we work, and we believe some of these changes will be permanent, especially when it comes to employees working from home. The good news is that most organisations have already adapted to this, or at least have adapted in part, and have the ability to facilitate remote working safely and securely.[/vc_column_text][/vc_column][/vc_row] ### Facilitating the future of remote working with the cloud [vc_row][vc_column][vc_column_text]Following the recent updates from the UK government, many of us are now uncertain about when we will be able to return to our offices and when work will continue as normal. This being said, remote working has gradually become the new normal, as we have steadily adapted to the working environment that we suddenly found ourselves in. Within the past few months, since lockdown began, remote workers have found their flow with this dramatic change, and cloud solutions have been there every step of the way to ease this rather tricky process. Businesses are realising the answer to the question that has been asked many times before: would productivity suffer if employees worked from home? And the answer, for many, is no. Thousands of us are waking up to the benefits of working from home, and this change has created a new question to consider - could remote working be the new normal? The rise of remote working Even before lockdown began, in 2019 over 1.5 million people worked from home in the UK, a 50% rise from 2009. Around 79% of businesses in the telecoms industry use cloud computing; and in the energy sector, as many as 87% are cloud users. This adoption has been fuelled by various technologies such as cloud solutions and collaborative platforms, which are making the traditional working model obsolete. Twitter is one of the recent examples of a company that is embracing working from home, allowing its employees to work remotely 'forever'. This announcement follows many others, including the likes of Facebook and Google, which are also extending their work-from-home policies into 2021. Not only are the tech giants encouraging remote working using cloud technology; smaller businesses are also embracing working-from-home practices as a result of low-cost cloud solutions. This new level of flexibility aims to safeguard workers and reduce the spread of the virus, and employers are able to enforce it seamlessly thanks to the benefits offered by remote working platforms. It's not just the short commute from one's bed to the home office that employees can take advantage of. Despite being in the middle of a global pandemic, staff are reporting greater levels of productivity and job satisfaction. Looking towards the long term, remote working offers a number of economic benefits for a business. With fewer people in a physical working space, overhead costs can be reduced in many areas, from office rent to staff equipment. Challenges of remote working However, there are several aspects that could make bosses hesitant to implement a mass-scale remote working approach within their organisation. If not addressed properly, we may see our previous normality resume, and working from home could become a thing of the past. Despite the reported productivity gains, other studies suggest that those working from home can feel a lower sense of community, with less social interaction and reduced informal collaboration.
Combined, these factors can make employees more willing to return to the office, thus affecting the future of remote working. With the requirement to work from home happening so suddenly, many organisations failed to sufficiently plan for the effect this could have on their security procedures. Data security has always been a sensitive topic, and almost overnight many of us were using private end devices such as laptops, tablets and smartphones to complete our work – devices that were not protected by the corporate network. Around 39% of companies fear unauthorised access to confidential company data and 29% have uncertainties regarding the legal situation. Not to mention, employees no longer have the benefit of having on-site IT professionals to monitor traffic and keep an eye out for suspicious activity. Furthermore, many can be put off just by the concept of implementing a remote working solution across their entire business. Technology is woven into many aspects of our work, and it can often be tricky to find workers with the skills to navigate various software and systems, so adding another to the mix can be daunting. For existing employees, learning how to use a new platform can also hinder productivity, especially with the more complex solutions. Overcoming obstacles of remote working with the cloud When the time comes that workers can return to their offices, it will be common practice for many to have a choice of working either from the office or at home. But in order for this to be a possibility, organisations must first implement a suitable solution in which employees can easily access their applications no matter where they are, without harming the workforce's productivity. There is an array of cloud-based solutions which can offer businesses the ability to enable employees to work from home, both productively and securely. The combination of software, platform and infrastructure as a service, otherwise known as Everything-as-a-Service (XaaS), creates a seamless process for employees, as they can access any on-premise and cloud environments behind a single web portal. Security is one of the greatest obstacles that businesses face when implementing a remote working solution. To address any potential security concerns, there are a number of factors that decision-makers need to consider. To secure files and keep important documents safe, solutions can restrict access to certain data. If there is no authorisation, no files can be downloaded from the app server. Furthermore, the cloud can offer multi-stage authentication, which determines the unique identity of the user. The latest streaming technology enables businesses to easily, quickly and securely provide applications to employees in the home office without VPN connections or client installation. This works for any size of company and any application environment, meaning a new solution can get up and running in no time. Employees can easily pick up this new platform and continue their work as normal. With global sales for cloud computing expected to increase to $331 billion in 2022, vendors need to make sure the solution they are providing to users now doesn't affect their performance. In this future of working, it will be crucial for staff to have the same ease in their own homes as they would have in the office, to ensure productivity is not affected.
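The multi-stage authentication mentioned above typically pairs a password with a second factor such as a time-based one-time password (TOTP). Here is a minimal sketch using the pyotp library; enrolment flows and secret storage are deliberately left out.

```python
import pyotp

# Enrolment: generate a per-user secret and share it via a QR code or setup key.
secret = pyotp.random_base32()
totp = pyotp.TOTP(secret)
print("Provisioning URI:", totp.provisioning_uri(name="user@example.com",
                                                 issuer_name="ExampleApp"))

# Login: after the password check succeeds, verify the 6-digit code from the
# user's authenticator app as the second factor.
submitted_code = totp.now()          # stand-in for the code the user types in
if totp.verify(submitted_code, valid_window=1):
    print("Second factor accepted")
else:
    print("Second factor rejected")
```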
A recent report from Gartner suggested that 74% of CFOs are now planning to shift some of their staff to remote working permanently, and it is up to vendors and cloud service providers to facilitate this change and encourage the new era of working we could soon be in.[/vc_column_text][/vc_column][/vc_row] ### How to Make Working Remotely, Work [vc_row][vc_column][vc_column_text]The current circumstances have impacted almost every area of people's lives – from their social activities through to how they operate and work remotely. Technology has been crucial to handling these shifts positively. It's allowed employers to communicate with their workforce and ensured employees can collaborate with one another. Yet, logistical challenges have arisen. There are new issues which people might not have addressed before. How are employees accessing sensitive business data? Should employees be using a Virtual Private Network (VPN)? How do individuals navigate working from home if they live in an area that's poorly connected? With working from home becoming the 'new normal', it's vital that businesses find answers and solutions to these kinds of questions. Tackling all these questions at once can be difficult, especially when working remotely or without wider team support. However, it is possible for businesses operating within all sectors to make remote working a success, providing they plan around four specific areas of their day-to-day operations. Access all areas Working from home can throw up all kinds of disruptions, from limited bandwidth with children streaming videos to poor video quality due to connection issues. But productivity will really suffer if employees are unable to access the resources they need, especially those which were commonplace before the need to work remotely. Businesses need to focus on keeping the things they can control easily accessible and secure. Employees must have immediate and easy access to all the files, tools, applications and data they are accustomed to in order to minimise loss of productivity. It's a learning process Once employees can access data remotely, risk is the next challenge to address. It's an inescapable fact that working from home increases the potential for both malicious cybercriminal activity and accidental data loss, especially when employees aren't protected by the full array of protective measures they might have had in their usual work environment. This is where education becomes incredibly important. Keeping staff and data safe from cybercriminal harm only happens when employees are thoroughly educated on cybersecurity best practices. The same is also true of accidental data loss. If a business doesn't want to see its information disappear into thin air, it needs to make staff aware of how to manage it securely. From a security standpoint, a VPN can be incredibly useful. It protects data while also allowing employees to access the data remotely. For example, VeeamPN enables employees to connect to their desired VPN without hassle or delay. IT teams also need to remain vigilant in ensuring all remote workstations are backed up to secure endpoints and installed with up-to-date, protective anti-virus software. Back up, back up, back up! Conducting regular backups is crucial to ensuring that all activity is stored safely and resiliently across all storage environments.
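A backup that has never been verified is little more than a hope, so here is a minimal sketch of the idea, with placeholder paths: copy each file to more than one destination and confirm every copy by checksum, in the spirit of the widely cited 3-2-1 approach.

```python
import hashlib
import shutil
from pathlib import Path
from typing import List

# Illustrative paths only; in practice at least one copy should be off-site.
SOURCE = Path("C:/Users/remote-worker/Documents")
DESTINATIONS = [Path("D:/backups/documents"), Path("//nas/backups/documents")]

def checksum(path: Path) -> str:
    return hashlib.sha256(path.read_bytes()).hexdigest()

def back_up(source: Path, destinations: List[Path]) -> None:
    """Copy every file to each destination and verify it by checksum,
    so a backup that silently failed is caught immediately."""
    for file in source.rglob("*"):
        if not file.is_file():
            continue
        for dest_root in destinations:
            target = dest_root / file.relative_to(source)
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(file, target)
            assert checksum(file) == checksum(target), f"Corrupt copy: {target}"

back_up(SOURCE, DESTINATIONS)
```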
The current crisis has shone a light on the importance of continuity plans that include a solid backup and disaster recovery strategy – one which is regularly tested, even for the technical matters which might seem less important. Take VPNs, for example. The best-prepared CIOs don't even let the usage of certain VPN tools pass them by, ensuring that they plan and test the potential effects of entire departments accessing their internal company network through a particular solution. Repeatedly. Automation is key While working from home for an extended time, IT teams can be overstretched with providing desk-side support, tightening cybersecurity and monitoring network capacity. Automating business continuity procedures should be top of mind for all progressive CIOs. Enabling automation of these processes allows IT teams to manage their time effectively and reallocate it to key business priorities which might require a higher level of thought. What's next? There's no firm end date to the current remote working environment. But embracing this new way of working will help us get to the other side. Amidst this uncertainty, there are things that can be done to help make operations as smooth as possible. Ensure the right resources and tools are available. Make working environments feel secure. Back up data regularly and embrace automation. With this kind of planning in place, being able to run like 'normal' should become more of a reality for any business – for however long that may be.[/vc_column_text][/vc_column][/vc_row] ### How should businesses communicate to customers? [vc_row][vc_column][vc_column_text]As lockdown measures in the UK continue to ease, many businesses have begun to speculate what the post-Covid-19 landscape may look like. As experts estimate that China is roughly four weeks ahead of the UK with regards to the pandemic's impact, we can look at its business recovery rates for some indication of what to expect. As it stands, early signs point to a U-shaped recovery, which suggests a lingering effect over the coming months before trade returns to pre-Covid-19 levels. Throughout this lingering effect, it'll be vital for businesses to communicate with their customers in a bid to remain top of mind. The pandemic has caused a lot of uncertainty, leaving many individuals experiencing both health and financial anxiety, so it's essential that businesses get their communication right. After all, brands want to be top of mind for the right reasons. So while customer safety will remain the priority for the majority of businesses, the question remains: how and when is it appropriate for businesses to start communicating with customers after Covid-19? To assist, we've highlighted some important considerations to ensure customer communications are effective, while remaining sensitive to the situation at hand. Provide open, transparent updates As the UK is still in the earliest stages of recovery, there are still a lot of unknowns and decisions which remain dependent on external data. As such, even decisions around businesses reopening can be overturned to ensure public safety. For example, on 30th June 2020, after what was described as a "surge in coronavirus cases", the city of Leicester was the first UK city to announce a "local lockdown". The tightening of restrictions has meant that non-essential shops have had to shut once again. As this situation is unprecedented, it's important to remember that customers aren't expecting businesses to have all the answers.
Businesses should be truthful about what they don’t know. By providing open and transparent updates, businesses can build customer trust and ensure messaging remains consistent.   Assess what your customers need right now By assessing what customers need right now, businesses can provide resources which are responsive and informative to ensure clients are properly supported. For example, at the beginning of lockdown, Hubspot reported that while sales decreased, website traffic actually increased, particularly to their educational resources. This suggests that even in the earliest stages of lockdown, customers still wanted to engage with brands, but they were simply more interested in learning and education. Furthermore, in a bid to boost customer engagement while adapting to the current landscape, many businesses have invested in creating content which is focused on providing better value to current customers, rather than attracting new leads. As Hope Horner, CEO of Lemonlight explains - “The biggest shift for our marketing team was related to the content we were publishing and sharing. Overnight, nearly all of the content we had written became irrelevant or tone-deaf.” By focusing communications to deliver what customers need now, businesses are much more likely to hit the right note.   Get customers excited about your reopening In a recent study published by YouGov, adults in the UK were surveyed to see how comfortable they would feel returning to businesses with a physical premise. For example shops, hairdressers, bars and cafes. For some, figures were notably low - only 37% of recipients would feel comfortable returning to a restaurant, and just 32% returning to a beauty or nail salon. Thus for some businesses, simply opening the doors again may not be enough to see customers flood back. While it’s crucial businesses communicate the measures taken to make the premises safe for both customers and staff, it’s also important to try and create an authentic buzz around your reopening. For this, businesses could try introducing special offers or even organising a small gesture to thank customers for returning. For example a free goodie bag for the first 50 customers.   Communication channel is key When creating a communication strategy, businesses need to consider what communication channels they are going to use. While we would always recommend utilising multiple channels, it’s important to note that in recent years, SMS has become an increasingly accepted choice for brands interacting with consumers. So much so, that 74% of individuals report an increased impression of a brand who communicates with them via text message. Furthermore, as back-to-business will be a priority for many companies right now, many businesses will be communicating with consumers. So it’s important to ensure business communications aren’t lost among the sea of other brands. Since consumers are 35 times more likely to see a brand’s SMS than their email, utilising text messaging could really help boost planned campaigns. Moreover, when you use SMS you are sending messages directly to people’s mobile phone, which on average they pick up 58 times a day. So it is a great choice for communicating time sensitive messages, for instance time-bound offers during the first week of business reopening.   Consider opportunities for virtual interaction During lockdown some businesses were able to get innovative, and respond to the pandemic digitally. 
For example, beauticians introduced online consultations so they could send customers specialised facial kits, agencies pivoted annual conferences to take place completely online and many businesses either created or enhanced their eCommerce offering. As some customers may not feel comfortable returning to brick-and-mortar premises and, once open, many businesses will need to adhere to reduced capacity restrictions, it makes sense for businesses to consider or continue communicating opportunities for customers to interact virtually. Exactly what form these virtual interactions will take will largely depend on the business and industry, but it's worthwhile getting creative, particularly as this could create a new revenue stream. There is no questioning the economic impact of the pandemic, but businesses must decide how they will evolve to move forward safely. As trade settles during this uncertain time, it's crucial businesses are able to communicate effectively with their customers, as those that do place themselves in the best position for success. After all, there can be no business without customers.[/vc_column_text][/vc_column][/vc_row] ### New Tech Developments for Children with ASD [vc_row][vc_column][vc_column_text]More than one in 100 children in the UK have some form of autism spectrum disorder (ASD) — a disorder that typically appears in early childhood, impacting one's social skills, relationships, and self-regulation. Many intervention and therapy options currently exist, but just as there is no one behaviour present in all children with autism, there is no single treatment that will benefit all children in the same manner or to the same extent. Therapies range from cognitive behavioural intervention to activity-based intervention, and many new technological developments – including the use of collaborative robots — are also proving useful. Read on to discover the ways in which technology is helping children and families lead a more productive, happier life. Augmented Reality for Learning and Communication Computer software company PTC has developed a new augmented reality application that helps children learn and communicate. The app, designed alongside scientists at the Boston Children's Hospital, was found in clinical studies to hold children's interest and attention for far longer than traditional learning activities did. It is based around a virtual replica of a traditional children's toy farmhouse with different components such as eggs, chickens, a barn, horses, etc. A visual replica of this toy overlays augmented reality features on the farmhouse. It also reveals nouns, verbs, and prepositions on icons that children can use to create three-word sentences. A voice 'reads out' a correctly formed sentence structure, encouraging children to repeat the sentence. Researchers have found that through this system, children who were unable to form three-word phrases are now doing so successfully. Apps that Talk Similar and already accessible are a host of autism-centred apps such as Avaz, Talk With Me, Jello, and Proloquo2go, all of which are meant to encourage children to voice words and phrases. Parents report that children can become very attached to these apps - so much so that they may become fond of the voices and intonation used. If you have a little one who loves spending time on his or her tablet, choosing child-friendly toys and apps should be a priority. The good thing about apps created for children with autism is that they are child-friendly in many ways.
That is, they allow them to use their imagination, introduce new words that are age-appropriate, and contain many of the fun elements that children love in everyday toys (including representations of kitchens, toys, cars, children's characters, and more). Socially Assistive Robots Researchers at USC's Department of Computer Science have developed personalised learning robots for people with ASD. The robot is most effective when it can accurately interpret the child's behaviour and provide the appropriate response. In one of the largest studies on this subject, researchers lent socially assistive robots to children in 17 homes for four weeks. The robots processed the child's needs and reacted appropriately, providing customised instruction and feedback. The study found that the robots were able to detect whether or not a child was engaged with a learning task with an impressive 90% accuracy. As one of the study authors stated, "Human therapists are crucial, but they may not always be available or affordable for families. That's where socially assistive robots like this come in." How can Robots Personalise Instruction? The USC robot works essentially via reinforcement learning (a subfield of artificial intelligence). The robot provided the child with different tasks, congratulating them if they answered correctly and providing tips if they got it wrong. Based on the child's answers, the robot was also able to tailor learning sessions to them. The key to keeping children motivated, scientists noted, was to provide tasks with just the right level of difficulty. Autism, they added, was the ultimate test of the usefulness of robotics precisely because every child with autism has their own, individual set of symptoms that can differ in number and severity from another child. Finding a system that can take these differences into account is key if progress in robotics for learning is to be achieved. Parents of children with autism often report that their child's preferred method for learning is via technology - especially apps for tablets. Studies have shown that new developments in artificial intelligence, robotics, and augmented reality may all have an important role to play in education. Children with autism all have different symptoms, which is why AI in particular, with its ability to customise learning experiences, will undoubtedly be the buzzword in education over the next decade and beyond.[/vc_column_text][/vc_column][/vc_row] ### Is it time to consider migrating or replatforming your ecommerce site? [vc_row][vc_column][vc_column_text]Replatforming your ecommerce site can be one of the most daunting, yet most necessary, changes a growing online business can consider. Many merchants hold back on migration through fear of losing site traffic and customers, but they also run the risk of being left behind by competitors who are putting in the work to create heightened user journeys. Is it worth the risk? Why consider replatforming your ecommerce site? The online retail landscape is ever-evolving. The ecommerce site you opted for initially might no longer be serving you or the needs of your customers. It could be holding you back from offering the best possible customer experience and it might be forcing users to use your competitors instead.
Other factors to consider when migrating or replatforming your site include: Site speed – the faster your page speed, the higher your site will rank in Google Cost – you could potentially cut web hosting costs dramatically Maintenance – easier site maintenance means quicker implementation and better performance Limited functionality – increase the ease of use of the site and its interactivity You could be asking yourself if there’s an alternative platform out there that may help you to solve these problems. If you could get your hands on a site that’s easier to maintain and has incredible capability to help you scale and provide unrivalled user journeys, is cheaper and quicker – that would be the ecommerce Holy Grail. Yes, there’s a lot involved in getting the process right and it’s a big business decision to make but migrating your ecommerce site can drastically improve what you offer as a business. It can dramatically broaden your reach, boost your rankings, transform your conversion rates and prove to be an incredibly valuable investment. For example, according to a 2019 study, migrating to Magento Commerce 2 saved 62% in costs – savings such as that could prove invaluable to any business, no matter the size. Whether you’re in the early stages of considering migrating or replatforming your ecommerce site or are already searching for an agency partner to assist, you’ll need a thorough idea of what the process involves. The experts at specialist ecommerce agency, Xigen, have pulled together a checklist of the most important things you will need to consider when planning a site migration. What are the rewards of replatforming or migrating? In an ever-changing online landscape, customers have increasing expectations and an incredible breadth of choice, meaning what you offer needs to hit the mark immediately. To satisfy your customers, you must always consider their experience on your site. Are you providing the best possible user journey? According to data from Xigen, 73% of consumers say customer experience (CX) is very important in their purchasing decisions and a massive 60% of people would be unlikely to return to a website with a bad user experience. Keeping up with the requirements of your online customers is a huge consideration when deciding whether it’s time for change. If you’re hoping for business growth, but your current platform is restricting the number of products you can display, you have slow loading pages, or the shopping cart isn’t meeting requirements, you’ll be unable to scale. These are limitations for any business website, but especially for enterprise-level companies hoping to increase their reach into fresh markets across the globe. If you’re having trouble with basic elements such as product display – how can you expect to perform effective transcreation and tailor your site to the needs of all the international audiences you’re hoping to attract? If executed properly, the impact of your migration could be hugely successful. Not only can migration have a significant impact on organic traffic and users, it can also increase organic ranking, site speed, revenue, and orders. For example, Xigen replatformed the website of Jackson’s Art Supplies to a customised Magento platform and implemented several development changes. 
The statistics show the site experienced a 189% increase in website users, a 178% increase in website revenue, a 167% increase in website orders and a 151% increase in organic traffic. Increases in website users and organic traffic are clearly an indication of a successful replatform project, but it's the heightened orders and revenue that prove its value. Regardless of the reasons for your site migration, you need to have clear objectives from the outset that will help you set and manage expectations. Planning such a complex project can span several months, so a project plan is imperative to the success of your migration or replatforming. The success of your website migration or replatform hinges first on a thorough plan, coupled with expert execution of the work. Common pitfalls include poor strategy, poor planning, and late involvement, so it's vital to have the process thought out all the way through from pre-launch to post-launch. The value of a successful ecommerce website migration or replatform is clear; however, cutting corners could cost your business – so it pays to commit time, finances and quality resources. Want to learn more about this topic? You can view and download an infographic on the topic here: Xigen - Migrating your Ecommerce Site Infographic[/vc_column_text][/vc_column][/vc_row] ### Data privacy in a pandemic world. Can our own data save our lives? [vc_row][vc_column][vc_column_text]The current pandemic has come at a pivotal time in the evolution of digital systems. We can now understand and manage the crisis like never before. Networked devices and systems can help track, understand and predict viral transmission. Big data analytics approaches can help policymakers design appropriate society-scale responses, and assess not only the risks to life and health of the virus itself, but also the economic impacts of different policy actions such as shelter-in-place, social distancing, face mask requirements, and other interventions. Yet ultimately these models are driven by very personal data: where you have been, who you have been proximate to, and what your various health attributes are (ranging from underlying conditions, like asthma, to recent test results from blood tests). Could your personal data save your life? The personal data revolution was a long time in coming and has been accelerated by widespread consumer adoption of connected devices. Smartphones, in particular, are packed with sensors and radios to link personal, individual data (ranging from location to heart rate) into cloud-based systems that can rapidly and readily aggregate information from hundreds of millions of people into population-scale analyses. The rise of technology oligopolies, such as the mobile operating system duopoly of Apple and Google, means that two decision-makers can access data on most of the world's population. When we add platform players like Alibaba and Facebook on top of the devices, and network providers like Vodafone, Orange, China Telecom or Reliance Jio, we have near-ubiquitous coverage of billions of people. Today, that data is relatively siloed, particularly when we try to integrate it with health data such as electronic medical records. Until 2020 there was a growing body of personal data privacy laws and regulations arising around the world, from the GDPR in Europe to the CCPA and HIPAA in the US, which served to create protections for individuals from the depredations of large corporate interests.
In the exigencies of the COVID-19 crisis, we have seen government-supported waivers of digital privacy laws, enabling sharing of data across siloes.  Now, medical records and health data can be crossed with telecoms data to enable virtual contact tracing and align it with known positive tests for virus, accelerating public health management.  Public safety and national interest overrode demands for personal liberty.  To do this, multiple copies of multiple databases have been created, making for a personal and cyber security nightmare.  More on that in a bit. A step further from tying together your telecom data and your health records is the integration of quantified-self technology from companies such as Ginger.io.  What if we could use ubiquitous smartphone technology to assess and predict your individual health profile, and tell you whether or not you need to go to a hospital to get tested?  Select symptoms of COVID-19 can be picked up potentially before even you, the user, are aware that you have an issue. These predictive health systems have been under development for more than a decade, and now perhaps are finding their moment in the current crisis.  The classic inertial reluctance of people to try new things is getting upended as new behaviour patterns are emerging around the pandemic.  As tragic as the crisis is, it paradoxically has created a wealth of new opportunities for innovators.  And for individual consumers – new capabilities around these integrated data sources and technologies can be tried out in a way that weren’t practical even 6 months ago. You now have, at least for the moment, the opportunity to take advantage of all of the big data that has been collected about you by Big Tech, and use it for your personal safety. This moment is fraught with peril in the long term.  Hard-won privacy protections have been rolled back not for weeks or a couple of months, but potentially for a year or two.  And once privacy laws are held in abeyance for a couple of years, it becomes that much more difficult to reinstate them later.  Once this data is available and widely accessible, it can be exploited by unscrupulous actors, ranging from pecuniary corporates to subversive state actors.  (Think this is hyperbole?  Personal data was demonstrably used by a foreign government to sway elections in 2016, efforts were made in 2018, and already security experts are warning efforts are underway for the 2020 election cycle). Personal data can save your life, yes, but at what cost? The tragic reality is that many of the new data security risks that have opened up in the name of public safety were unnecessary.  A group of privacy researchers from multiple institutions such as MIT, Imperial College and the World Economic Forum, industry players like Microsoft and Orange, and even the French government, have been working on a new data security model that lets you access, share, and control your personal data, gaining the benefits we describe above, without creating data leakage or additional cyber risk.  It’s open-source, it’s called the OPAL Project, and it has been under development for several years.  In the race to formulate public health responses, however, it was not employed in the COVID crisis. We, unfortunately, don’t have enough cyber literacy for people to understand why trying to save your life with your personal data also creates a data security risk.  
We’re trying to remediate this with the Oxford Cyber Futures programme that we’ve launched with Mastercard, but that’s only a tiny part of what’s needed.  Governments need to understand alternative approaches to data sharing in crisis, and consumers need to understand what trade-offs they are making.  Hopefully, in the next pandemic, we will be better prepared around data, and what to do with it.[/vc_column_text][/vc_column][/vc_row] ### How to Make an IoT App: Benefits, Tech Aspects, Detailed Guide With technology attaining milestones frequently, seamless integration of physical things and digital data is not in the realm of impossibility. To understand the concept, let’s take a look at some examples. For instance, can you control your thermostats with your phone? Does the heart monitoring device send data assigned to a particular IP address?  The integration of the digital system between a network of connected physical objects is a phenomenon widely known as IoT or the Internet of Things. In this post, we will learn more about IoT apps, their benefits, tech specifications, and more.  What is IoT? The IoT, or the internet of things, is a network of computing, digital, mechanical devices, people, or objects that are interrelated. The network is integrated with UIDs or unique identifiers and has the ability to transfer data without human-to-computer or human-to-human interaction. In essence, IoT is any object or item that has embedded software that connects it to the internet to exchange data with different systems and devices over that same network.  The IoT technology industry is encountering a massive spike at the moment. Amidst the deadly pandemic, the industry has registered a dramatic 22% growth. In fact, the market is predicted to cross $566.4 billion within 2027. Considering the growth statistics now is the perfect time to invest in an IoT app. What is an IoT app?  An IoT app is an app that allows gadgets to communicate or send/receive data without any human interaction. The complex culmination of sensors, networking, connected devices, and analytics can result in the development of new-age applications. IoT apps use machine learning and AI to add real-time intelligence to different devices. Moreover, these are created for specific industries such as healthcare, wearables, automotive, etc. To make the best use of this technology, you may need to hire expert developers.  Top benefits of IoT app development There is a wide range of advantages one can gain from IoT app development. Take a look at some of them listed below - Insights on user behavior It is possible to use IoT to gain knowledge about user behaviour via data collection. For example, IoT tools can be used to analyse customer behaviours on different platforms such as social media, websites, etc. With information on hand, business organisations can take active steps.  More productivity The integration of IoT tech increases work productivity. It aids in the automation of various processes, ensuring a seamless workflow. IoT solutions offer assistance in managing various processes, giving alerts about different problems, and so on.  Reduces manpower Automation of different processes due to the IoT tech leads to a dramatic decrease in manpower. There is no need to invest effort, which can be utilised in other aspects. It is easy to monitor the process from an app.  Enhanced data security The core features of any IoT app are authorisation and authentication. 
Due to the availability of real-time information in the cloud, security needs to be considered very seriously. That is why the architecture of the system is designed in a robust way, making the solution secure from online threats and malicious cyber-attacks. Improved flexibility IoT applications fulfill the need to connect to any desired device from anywhere. The scope of flexibility makes the app even more appealing. It is possible to control various actions or items with one device.  Challenges to overcome in IoT app development Despite the extensive list of benefits that come with IoT app development, there are some shortcomings too. It is vital to address them in order to find out potential solutions. Check out the challenges – Privacy The absence of user privacy is indeed a topic of debate in IoT technology. Consumers are right to worry about data privacy when it comes to using IoT applications. In order to offer protection, a top-grade encryption level is needed. That way, the data is not left vulnerable to malicious online attacks.  Security One of the most challenging aspects of IoT app development is data security. The presence of multiple interconnected devices leads to the creation of more than one entry point. These points become the source of data leaks for hackers. They can use these points to attain access to the information present in the system. Securing every node becomes extremely difficult. That is why it is vital to connect fewer devices in order to limit vulnerabilities and financial losses.  Cross-platform compatibility Maintaining the balance between software and hardware while developing the IoT app is a real challenge. That is why developers need to focus on creating an application that delivers optimum performance without compromising device updates and error-fixing features. Moreover, it must be compatible enough to run on both mobile devices and the web. Also, ensuring that the app follows the pre-set standards of the industry is a must.  Different components of the IoT system There are primarily four distinct components that make up the IoT system. These are – Devices In the first step, devices or sensors work towards the collection of data from the environment. The data collected can be simple or complex. Moreover, more than one sensor can be put together within a device to collect data.  Cloud connectivity After the data is collected, a wide range of methods is used to send it to the cloud. The different methods that one can use include satellite, cellular, LPWAN, Wi-Fi, etc. However, it is crucial to select the right connectivity option. It mostly depends on the type of IoT app you are developing.  Processing data Now that cloud has the data, it is time to process it. Data processing of the information the cloud receives can be simple or extremely complex. Whatever the outcome is, it helps the users immensely.  End user The data is processed so that it can help the end-user somehow. The results can be sent via different methods such as emails, notifications, etc. Apart from this, the user might be able to conduct an action that affects the IoT system. That mostly depends on the app and its functionalities.  Are IoT apps different from conventional mobile apps? IoT apps are significantly different from traditional applications. For instance, Artificial Intelligence (AI) and Machine Learning (ML) are used to integrate intelligence into the app. The combination of these technologies will aid in customising the user insights based on the industry. 
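Before moving on to the other differences, the four components described earlier (devices, connectivity, data processing and the end user) can be tied together with a short sketch. It is purely illustrative: the in-process queue below stands in for whichever connectivity option a real deployment would use (Wi-Fi, cellular, LPWAN and so on), and the device name and alert threshold are assumptions rather than anything prescribed in this article.

```python
# A minimal, self-contained sketch of the four IoT components described above:
# a simulated device produces readings, a queue stands in for the connectivity
# layer (in a real system this would be MQTT, cellular, LPWAN, etc.), a
# processing step interprets the data, and the "end user" step is a simple
# printed alert. All names and thresholds are illustrative.
import queue
import random
import threading
import time

telemetry = queue.Queue()          # stand-in for the connectivity/cloud layer
ALERT_THRESHOLD_C = 28.0           # illustrative processing rule

def device(readings: int = 5) -> None:
    """Component 1: a sensor collecting data from its environment."""
    for _ in range(readings):
        telemetry.put({"device_id": "thermo-01",
                       "temperature_c": round(random.uniform(18, 32), 1)})
        time.sleep(0.2)
    telemetry.put(None)            # signal end of stream

def processor() -> None:
    """Components 3 and 4: process the data and inform the end user."""
    while True:
        reading = telemetry.get()
        if reading is None:
            break
        if reading["temperature_c"] > ALERT_THRESHOLD_C:
            # In a real app this might be a push notification or email.
            print(f"ALERT: {reading['device_id']} at {reading['temperature_c']} C")
        else:
            print(f"OK: {reading['device_id']} at {reading['temperature_c']} C")

if __name__ == "__main__":
    threading.Thread(target=device).start()
    processor()
```

In a production app, the queue would be replaced by an MQTT broker or a cloud ingestion endpoint, and the alert would reach the end user as a push notification or email rather than a printed line.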
Another difference is that IoT mobile apps have an in-built feature that allows the user to update the hardware with easy clicks. Even the process of app development is different than the conventional one.  Conclusion Undoubtedly, the Internet of Things or IoT is the technology of the future. The uprise in the growth of the IoT tech market is an indication that the tech is going to revolutionise different industries in the upcoming years. Now is the best time to enter the market with an attractive yet functional IoT app. ### ThousandEyes Releases Inaugural Internet Performance Report, Revealing Impact of COVID-19 [vc_row][vc_column][vc_column_text] Global Internet Disruptions Spiked 63 Percent in March; Internet Service Providers Hit the Hardest   SAN FRANCISCO – August 4, 2020 – ThousandEyes, the Internet and Cloud Intelligence company, today announced the findings of its inaugural 2020 Internet Performance Report, a first-of-its-kind study of the availability and performance of Internet-related networks, including those of Internet Service Providers (ISPs), public cloud, Content Delivery Network (CDN), and Domain Name System (DNS) providers. Measuring performance over time, the report examines the impacts of changing Internet usage due to COVID-19 and how those impacts varied across different regions and providers. “The Internet is inherently unpredictable and outages are inevitable even under normal conditions. However, with the overnight transition to a remote workforce, remote schooling, and remote entertainment that many countries experienced in March, we saw outages spike to unprecedented levels -- especially among Internet Service Providers who seem to have been more vulnerable to disruptions than cloud providers,” said Angelique Medina, research author and director of product marketing at ThousandEyes. “With the Internet Performance Report, businesses can benchmark Internet performance pre and post COVID-19 and plan for a more resilient IT environment as they continue to build out infrastructures that can manage the external dependencies on cloud and Internet networks that employee and consumer experiences now rely on.” Rapid adoption of cloud services, widespread use of SaaS applications, and reliance on the Internet has created business continuity risks for enterprises. ThousandEyes is an enterprise software platform that enables organizations to see the Internet like it’s their own network. Based on an unmatched number of vantage points around the globe that perform billions of measurements each day to detect when traffic flows are disrupted and measure performance, ThousandEyes leverages this unique Internet intelligence to monitor and detect how Internet, cloud, and other third-party dependencies impact end-user digital experiences. Based on measurements collected between January and July 2020, the Internet Performance Report uncovers important insights into the resilience and behavior of the global Internet, helping organizations apply a data-driven lens to their IT and business planning. Key findings from the 2020 Internet Performance Report: COVID-19 Impact Edition, include: Global Internet disruptions saw an unprecedented rise, increasing 63% in March over January, and remained elevated through the first half of 2020 compared to pre-pandemic levels. In June, 44% more disruptions were recorded compared to January. 
ISPs in North America and APAC experienced the largest spikes in March at 65% (North America) and 99% (APAC) respectively versus January, and have since returned to levels typical of those regions. In EMEA, however, outages continue to increase month over month with 45% more disruptions in June versus January. ISPs were hit the hardest, while cloud provider networks demonstrated greater overall stability. Between January and July, cloud providers experienced ~400 outages globally versus more than ~4500 in ISP networks. Relative to total outages, more than 80% occurred within ISP networks and less than 10% within cloud provider networks. Though the total number of outages increased across all regions, impact on Internet users varied. Following pre-pandemic patterns, a larger proportion of disruptions in EMEA tend to occur during peak business hours as compared to North America, where a majority of large outages typically take place outside of traditional business hours and therefore may not have a meaningful impact on Internet users. Overall, the Internet held up. Despite unprecedented conditions and an increase in network disruptions, Internet-related infrastructures have held up well, suggesting overall healthy capacity, scalability, and operator agility needed to adjust to unforeseen demands. Negative performance indicators, such as traffic delay, loss, and jitter generally remained within tolerable ranges, showing no evidence of systemic network duress. Increased network disruptions due to operator adjustments. Many of the network disruptions observed post-February appeared to be related to network operators making more changes to their networks to compensate for changing traffic conditions. “Initially, we saw both businesses and service providers scramble to adjust, overnight, to work-from-home environments. However now, we see a definite shift towards accommodating a more permanent scenario of serving a remote workforce,” said Paul Bevan, research director, IT Infrastructure, Bloor Research. “This is creating a realignment of network infrastructure that will look very different from pre-March network platforms. The findings from ThousandEyes’ research will be critical in helping organizations understand the inter-dependencies that are at play between internal and external networks, and how to strengthen IT infrastructures now that the Internet has become a core component to manage.” For a complete list of findings and to learn more about the 2020 Internet Performance Report: COVID-19 Impact Edition, please download the report here. Learn more: Download the 2020 ThousandEyes Internet Performance Report: COVID-19 Impact Edition View the infographic: Five Data-driven Insights About Internet Performance and Resilience Register to attend the webinar: How the Internet Responded to a Pandemic -- and What it Means for Your Business, taking place August 13, 2020 at 10:00 AM PDT Read the blog: A Tale of Two Internets: Internet Performance Pre and Post COVID-19 Find current job openings at ThousandEyes Read the latest ThousandEyes announcements and news coverage   About ThousandEyes ThousandEyes, the Internet and Cloud Intelligence company, delivers the only collectively powered view of the Internet enabling enterprises and service providers to work together to improve the quality of every digital experience. 
The ThousandEyes platform leverages data collected from an unmatched fleet of vantage points throughout the global Internet, from within data centers and VPCs and on end user devices, to expose key dependencies that impact digital service delivery, empowering businesses to see, understand and improve how their customers and employees experience any digital website, application or service. ThousandEyes is central to the global operations of the world's largest and fastest growing brands, including 150+ of the Global 2000, 80+ of the Fortune 500, 6 of the 7 top US banks, and 20 of the 25 top SaaS companies. For more information, visit www.ThousandEyes.com or follow us on Twitter at @ThousandEyes.

### Mitigating the carbon cost of AI

Artificial intelligence (AI) operates most efficiently when it is commoditizing intelligence and decision making. We are starting to see proof of the scientific and business benefits that come from this streamlining and processing of data, and the use of AI modelling is playing out across a wide and growing spectrum of market sectors. However, as with any form of progress, there is a cost. Training AI models alone can lead to significant carbon emissions. Studies have shown that state-of-the-art models can result in hundreds of tons of emissions, and researchers and companies alike are individually training the models for their own purposes. As a result, we now have the power of a new technology that invites an exponential growth in emissions without awareness and action.

Higher compute power means higher energy consumption

Across the financial services and healthcare industries, we see examples of how machine learning applications are revolutionizing products, services and research. Within financial services, machine learning is changing quantitative investing from a set of algorithms based on historical data to a set of models that capture and actively react to fluctuating changes in the market. With these new tools, the focus shifts from past to future, with the potential to dramatically reduce the friction required to achieve the next advantage in the market. For healthcare, AI modelling is enabling better disease diagnosis and prevention, all the more urgent in the current atmosphere of a global pandemic. The amount of compute power required for today's applications, when applied at scale, is orders of magnitude greater than in previous generations. The greater the network depth and the greater the data quantities for input, the greater the compute complexity, all of which requires high-performance computational power and longer training times.

Prioritise efficiency as a criterion for AI models

The enormous computing power needed for machine learning and deep learning applications equates to markedly high energy consumption. The conversation around the drive for accuracy - the current standard-bearer of AI research success - is beginning to be tempered by a growing concern for computational and energy efficiency. A study from the University of Massachusetts, Amherst, raised a flag last summer when it concluded that training a large AI model can emit more than 626,000 pounds of carbon dioxide equivalent - nearly five times the lifetime emissions of the average American car, including the manufacture of the car itself.
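To see how figures on that scale arise, a rough back-of-the-envelope calculation helps. The sketch below is purely illustrative - the GPU count, power draw, training time, PUE and grid carbon intensity are assumed values, not figures from the study - but it shows which levers determine the final number.

```python
# Back-of-the-envelope estimate of the carbon cost of training a large model.
# Every input below is an illustrative assumption; a real project should use
# measured power draw, actual training time, the facility's PUE and the
# carbon intensity of the local grid.
def training_emissions_kg(gpus: int,
                          gpu_power_kw: float,
                          hours: float,
                          pue: float,
                          grid_kg_co2_per_kwh: float) -> float:
    """Return estimated CO2-equivalent emissions in kilograms.

    energy (kWh) = GPUs x power per GPU (kW) x hours x PUE
    emissions    = energy x grid carbon intensity (kg CO2e per kWh)
    """
    energy_kwh = gpus * gpu_power_kw * hours * pue
    return energy_kwh * grid_kg_co2_per_kwh

if __name__ == "__main__":
    # Illustrative run: 64 GPUs at 0.3 kW each, training for 30 days,
    # in a data centre with a PUE of 1.6 on a 0.4 kg CO2e/kWh grid.
    kg = training_emissions_kg(gpus=64, gpu_power_kw=0.3, hours=30 * 24,
                               pue=1.6, grid_kg_co2_per_kwh=0.4)
    print(f"Estimated emissions: {kg:,.0f} kg CO2e ({kg * 2.20462:,.0f} lb)")
```

The PUE factor is where facility overheads such as cooling enter the estimate, and the grid intensity term is why the choice of data centre location matters so much, as the following sections discuss.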
The carbon cost of AI becomes even more substantial when you add the energy required to keep equipment cool and prevent overheating – at least 40% of all energy consumed in a conventional data centre goes towards cooling. There are, however, steps companies can take to minimize their carbon footprint while still accessing cutting-edge supercomputing to drive their innovations. MIT President, L. Rafael Reif, reminds us that “technologies embody the values of those who make them, and the policies we build around them can profoundly shape their impact. Whether the outcome is inclusive or exclusive, fair or laissez-faire, is therefore up to all of us.” The process begins at the start of the AI project by thinking clearly about the data that is important to your business. Understanding the data at your disposal ensures higher quality results and can help identify the technology that is truly needed to support the solution. If AI is the right way forward, you can proceed more efficiently and not waste precious time and resources building and applying unnecessary models. Businesses need to approach new and innovative AI projects with the same sound practices that have been applied across other technology implementations. Choose less carbon-intensive data centres  A practical way of balancing the potential of AI with the need for greater sustainability is within the data centre. Many conventional corporate data centres are not well equipped to deal with the large compute required to train neural networks that are the foundation of machine learning applications. In the case of medical or financial trading applications, the training never stops and a traditional data centre that is not designed for extreme conditions will not be able to provide dependable performance. How can this be managed? Through modified data centre design. One best practice is to place server racks closer together to maximize the bandwidth capacity between servers while minimizing the overall cost of the deployment. Traditional data centre designs have relied on widely spaced racks to help reduce the cooling stress, but this practice sacrifices the closely coupled designs required for at-scale machine learning applications. Data centre developers also want to minimize the size of their data halls to reduce the overall cost of construction and to improve the return on investment for the infrastructure. But for today’s specialised GPU boxes, this can present a challenge. For example, air-cooled systems that are positioned too close together can result in cooling deficiencies as the extreme airflow requirements of high capacity servers, such as the NVIDIA DGX-2 and the newly announced NVIDIA DGX A100, can actually blow against each other and create backpressure on the cooling fans within the equipment. As a result, data centres that are to be designed with high-intensity AI applications in mind must balance the financial pressures of reducing the footprint of the data hall with the requirements to provide sufficient space for proper cooling conditions. In addition to cooling considerations, data centres must be structurally capable to handle very heavy equipment. The best data centres today allow fully populated, very heavy AI computing cabinets to be rolled from the loading dock to their final computing location all on the same level and all on reinforced concrete slabs. Data centres that are ready for extreme AI computing must be properly designed and operated. Location of the data centre is also paramount. 
A data centre campus with free cooling and powered by inexpensive and renewable sources can significantly reduce operational expenses. The majority of the equipment involved in training machine learning models does not need to be located near the end-user for performance or accessibility. As such, racks can be comfortably housed in data centres that are serious about their power sources. For example, at Verne Global’s Icelandic data centre campus, our managed colocation and cloud solutions make it simple to connect the most powerful computing resources directly to a 100% renewably powered grid that is the cleanest on the planet, all while benefiting from ambient free cooling thanks to the favourable conditions outside of our secure data centre campuses. One exciting change that we have started to see in the conversations that we have with our customers is that a growing number of businesses are beginning to give their leadership teams financial incentives to meet sustainability targets. These companies are putting their money to work and using executive compensation as a lever for the greater good. No business will be issuing executive incentive plans that neglect the bottom-line health of the company. Therefore, savvy and sustainable businesses must search out solutions that not only protect our ecosystems but also save on the expense side of the balance sheet. Finally, no matter how good our intentions are, if we don’t hold ourselves accountable for making these changes, it will be for nought. In the age of climate action, the public is no longer asking for claims, it is demanding action - and accountability. True accountability establishes a solid foundation for growth and enables future generations to carry our visions beyond. As we look ahead to opportunities AI can offer, we must tackle the sustainability head-on.[/vc_column_text][/vc_column][/vc_row] ### The role of AI in powering enterprise content [vc_row][vc_column][vc_column_text] Of all the challenges, applications and real-world use cases that Artificial Intelligence (AI) has been directly involved with, one of the most important and prominent is the role that it can play in managing content for enterprises.  Content is an integral element of modern business strategy, used to help enterprises design and launch products, drive marketing, increase sales and much more. In fact, content is essential to effective customer experiences (call centre operations, customer self-service, etc.) as well as in numerous back-office processes (e.g. claims processing, underwriting, etc.). But content has long been a challenge from an information management perspective. It can be nearly impossible to find due to inadequate and inconsistent metadata attributes, limited search functionality within core business applications, and the simple fact that it is often scattered across any number of disconnected different systems and repositories. With legacy content management systems proving to be less effective at managing modern content volumes and types, more modern content management systems providing AI capabilities have emerged to fill that void. These modern solutions can extract critical data from content and, in doing so, transform content into intelligent information that can be easily found, readily used to perform work, and always accessible. 
An avalanche of content In a recent Nuxeo survey of UK financial services companies, 80% of respondents indicated that their systems were not fully integrated and their organisations had an average of nine different content management systems in place. This is most likely very similar across other sectors, and as the importance of content has grown, so too has its volume and variety, making it even harder to manage. In fact, the volume and types of content are growing at an unprecedented rate. Many enterprises have accumulated billions of documents and scanned images over the last 20 years. But today, some are looking to capture and manage as many as 500 million new objects per month. That’s an astonishing amount of new information and one that is only likely to increase rather than slow down.  Extracting information from content and entering it into fields and tables is work that people inherently don’t like to do. Doing this work across 1000s or even 100s of 1000s of new documents, every day, is challenging, expensive, and difficult to do with consistent accuracy. This is why so many organisations have struggled with enterprise content management (ECM) for so long - most traditional systems required extensive human input, which is both time-intensive and expensive. The value of custom machine learning models To manage this content and extract the maximum value from it, enterprises are turning to AI and, in particular, machine learning. Many technology firms, including Amazon, Google and Microsoft offer commodity AI services that can be leveraged for working with various forms of content.  Many of these machine-learning offerings are focused on providing greater insight into and understanding of content, whether that’s text-based documents, photos and images, or even audio and video files. A lot of value can be derived from these generic models and services, particularly in performing routine tasks with high volumes of content. OCR, sentiment analysis, translation, or even transcription for audio & video files (i.e. speech to text) are all examples of generic machine and deep learning services. But the real value of AI and machine learning can only be truly achieved when these models are trained with an organisation’s own data and content. A custom model -- again, one that has been trained with an organisation’s own data sets -- can produce much more accurate and contextual data about content which, in turn, is more valuable to the business. For example, with a custom machine-learning model, a user could identify specific products in photographs or videos, classify accident damage in automobile claim photos, identify vital corporate records and even detect potential fraudulent claims. While it is still early days for this technology, some of the initial use cases for custom models include: Data Enrichment – this is all about extracting data from content and using that data to make that content more accessible, more contextual and more valuable to the business. Enrichment can take on many different forms, depending on what type of content a business is working with. Within a Content Services Platform, data provided by custom machine learning models gets applied to the object – an image, video, document, or other content types – as metadata which is indexed and can later be used to find and retrieve that object. 
This is the difference between a generic AI model auto-tagging a car in a picture as blue, compared with a custom model that provides more valuable information such as model, manufacturer, and even trim level. Such data can also be used to trigger workflows and initiate processes and can be passed to other, integrated systems.  AI can also be used to enrich traditional content, like documents and scanned images. Many financial services firms for example, have large volumes of existing TIFF images that they want to convert into PDF documents, so they can be indexed and searched based on the entire context of the document. Users can then search for content within a PDF document, perhaps to identify a particular word, phrase, or even a contractual term.  To accomplish this, a public AI service like Amazon Textract can be used to OCR the content from the TIFF and extract and index the text. A transformation service (not AI) is used to map this text back to the original image and convert the image to a PDF document. This document is then ingested into the Content Services Platform and properly indexed for search and access. Automation – another use case for machine learning is to help companies better automate critical business functions and processes. Most enterprises still process millions of paper forms every year, many of which are handwritten. A critical challenge is to first determine if the form has been completed correctly, before processing it. This is highly labour-intensive and therefore also costly, with humans required to determine what type of form it is, and then validate that the necessary responses have been provided, signatures are in the right place, and if required, that confidential customer information is present and correct. Machine learning models can do all this formerly manual work. Organisations can capture these forms from a variety of sources – fax, email, or physical forms – and convert them into digital images/ documents. Machine learning is then used to correctly identify the different forms and perform the necessary validation on the provided information, in most cases before a human even looks at the document. For some large financial services organisations, for whom over 60% of their forms are still completed by hand, this simple bit of automation can save millions annually. Insight – for many organisations, the insight contained within their content is actually vastly under-utilised. AI can help extract insights and intelligence from existing business content and deliver that insight back to users so they can make more informed decisions. One example is insurance. Fraudulent claims remain an issue and it is not uncommon for claimants to use the same accident photos in multiple claims. For a large P&C insurer, with thousands of claims processors, the chances that the same claims processor will handle two claims with the same accident photos are extremely small. Yet AI can help detect insurance fraud by leveraging machine learning to compare new claims photos with a vast repository -- or Content Lake -- of existing photos, quickly identify duplicates or near-duplicates, and then automatically launch a fraud investigation process. As the volume and type of content in enterprises grows, so too does the need to manage that content more effectively. AI and machine learning can unlock the insight and value within content and applying these technologies to content will become one of the more widely deployed AI use cases over the next few years. 
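The article does not prescribe a particular technique for that duplicate-photo check, but one common approach is perceptual hashing, sketched below. The Pillow and imagehash libraries, the file paths and the distance threshold are all assumptions used for illustration; in practice the comparison would run against hashes stored for the whole Content Lake.

```python
# Illustrative duplicate / near-duplicate detection for claim photos using
# perceptual hashing. Library choice, file paths and the distance threshold
# are assumptions; a production system would compare new photos against
# hashes stored for the entire photo repository.
from pathlib import Path

import imagehash                 # assumed third-party dependency
from PIL import Image            # assumed third-party dependency (Pillow)

NEAR_DUPLICATE_THRESHOLD = 8     # max Hamming distance to treat as a match

def hash_photo(path: Path) -> imagehash.ImageHash:
    """Compute a perceptual hash that changes little under resizing or re-encoding."""
    with Image.open(path) as img:
        return imagehash.phash(img)

def find_matches(new_photo: Path,
                 known_hashes: dict[str, imagehash.ImageHash]) -> list[str]:
    """Return IDs of previously seen photos that look like the new one."""
    new_hash = hash_photo(new_photo)
    return [photo_id for photo_id, known in known_hashes.items()
            if new_hash - known <= NEAR_DUPLICATE_THRESHOLD]

# Example usage (paths are hypothetical):
# known = {"claim-1042.jpg": hash_photo(Path("repository/claim-1042.jpg"))}
# if find_matches(Path("incoming/new-claim.jpg"), known):
#     print("Possible duplicate - trigger fraud investigation workflow")
```

Because perceptual hashes are robust to resizing and re-encoding, the same accident photo submitted across different claims can still be flagged, which is exactly the kind of check that can launch an automated fraud investigation process.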
[/vc_column_text][/vc_column][/vc_row] ### Educating the digital world – the rise of eLearning [vc_row][vc_column][vc_column_text]The 21st century has rapidly come to be known as the era of technological evolution. With technology taking over nearly every niche and the use of mobile devices like laptops, smartphones, and tablets registering a profound rise, the world has well and truly integrated itself into the digital space. Studies suggest that the number of active internet users crossed 4.33 billion in late 2019. This paradigm shift has helped trigger an evolution for the better across the industrial spectrum, and education is no exception. The modern generation of students is no longer confined by the limitations of traditional learning systems, as technology opens up a multitude of opportunities to fulfil their quest for knowledge and experience. Prominent among these avenues is the emergence of education on the digital platform – eLearning. With the education sector becoming increasingly more tech-savvy, the e-Learning market is set to witness a tremendous boost over the coming years. eLearning is an educational system that builds on the foundations of conventional teaching and delivers information through the adoption of electronic sources, most notably the Internet and computers. Although it faced considerable resistance in its earlier years, owing to the belief that the module lacks the integral human element required for effective learning, consistent evolution in the technological space, as well as the growing awareness regarding the benefits of a digitised learning system, has led eLearning solutions to become more accepted by the masses in recent years. The benefits of online learning are many. For instance, studies have shown that retention rates can be increased by 25% to 60% through eLearning solutions; a significant number over conventional face-to-face learning which registers only 8% to 10% retention. Additionally, eLearning increases the flexibility of the education sector, by allowing students to have more control over the learning process, such as revisiting the material as and when required and allowing them to attend courses as per their convenience. eLearning also broadens the reach of the educational material, by catering to a vast audience, unbound by geographical or time-based limitations. The evolution of digital content and its impact on learning Change is the only constant; this adage is well inculcated within the technology domain and is evidenced by the profound change that technology has brought on in nearly every aspect of human life, from the way people work, communicate, eat or learn. In the past, the focus of academics was limited more or less to theory, rather than practical applications. Students were expected to gain knowledge about particular topics only by referring to myriad books and taking copious notes, which was a relatively restrictive approach to learning. Furthermore, not all educational institutions had access to faculty with suitable expertise towards a certain subject, which also posed a challenge to holistic learning. However, the emergence of technology has allowed the educational system to break past these boundaries and opened up limitless learning avenues for aspiring students. One of the key ways this has been achieved is through content digitisation. Digitisation refers to a means of creating digital proxies of analogue materials including newspapers, books, videotapes, microfilms, etc. 
It has numerous benefits, including more widespread access to material, especially for distant patrons, collection development through collaborative initiatives, enhanced preservation activities and higher potential for education and research. In the educational space, digitisation has the potential to not just make learning easier, but also to make it more engaging, interactive and exciting for the audiences, in turn leading to a significant boost in productivity. Digitisation has paved the way for eLearning to transform the educational landscape. eLearning is a way for digitised content to make its way to a vast global audience, even in previously hard-to-access rural areas. It also acts as a boon for working students, by providing them access to recorded eLearning courses at their convenience. One of the most prolific benefits of content digitisation in the education industry is its ability to merge together the best aspects of both online learning and traditional classroom learning methods. Emulating the characteristics of both learning disciplines, digital content builds a stronghold for modern students and also generates savings in terms of resources, by cutting down the need for additional printing of learning materials as well as mitigating paper usage by implementing online examination platforms. What is on the horizon for the eLearning industry? The recent pandemonium created by the onslaught of the COVID-19 crisis has asserted a significant impact on the world. However, for the e-education landscape, this impact has manifested in the form of a surge of opportunities, with over 90% of the wealthiest economies in the world rapidly shifting their focus from traditional classroom schooling to virtual education. Online learning platforms are responding to this surging demand in earnest, with many offering free access to eLearning courses and digital learning solutions. For instance, Bangalore-based ed-tech company BYJU’s has announced free live learning sessions on its Think and Learn platform. This move resulted in a nearly 150% rise in the number of new subscribers to the eLearning platform. Meanwhile, Tencent classroom has seen a massive rise in usage since mid-February 2020, in response to the Chinese government’s declaration that a quarter of a billion students were to resume their education through online platforms. This declaration consequently led to the largest “online movement” in the history of education, with more than 81% of students from K-12, attending eLearning courses through the Tencent K-12 Online School portal in Wuhan. Some people are of the belief that a complete transition to e-education, especially with inadequate focus on training, bandwidth access, and overall digital preparation would be detrimental to sustained educational growth. However, an increasing volume of the global population is advocating in favour of a hybrid educational model, that combines the social benefits of a classroom environment with the superior material access and convenience of eLearning solutions in the years ahead. Global Market Insights Inc. 
has a market report dedicated to E-Learning, available at: https://www.gminsights.com/industry-analysis/elearning-market-size[/vc_column_text][/vc_column][/vc_row] ### AI in the home: Smart devices do’s and don’ts [vc_row][vc_column][vc_column_text]The growth of Artificial Intelligence (AI) technologies in our homes have been increasing exponentially in the past few years, and with a large number of the UK workforce embracing the pandemic-imposed working-from-home culture, for many of us our homes have become our office. In this article, we consider the impact of this shift from a data perspective with a particular focus on smart speakers and other devices containing “virtual assistant” technology. Know Your Devices Smart speakers, such as Amazon Alexa and Google Home, are becoming mainstream in homes around the country. Many owners of smart speakers are by now aware that these devices are always “listening” for their “wake word” – which enables the device to listen to user queries and commands. The reality is that smart speakers are most likely not the only technology in your home which are “listening” for a “wake word” – mobile devices, tablets and computers also often have built-in virtual assistants, and it is becoming increasingly common for other devices (for example, TVs) to integrate virtual assistants too. You should become familiar with the smart tech in your home. Mute Voice-Input A smart speaker may inadvertently start recording confidential conversations if it hears its wake command, and therefore ensuring that voice-input is muted can be effective in preventing this. However, in line with our recommendation above about getting to know your devices, remember it is not only smart speakers which may constantly be in “listening” mode. If you are having confidential conversations - analyse whether it would be best practice to mute voice-input on each of these devices. Wear Headsets. It might not always be practical for you to mute voice-input on all of your devices. Consider whether there are other ways to prevent data from being heard, and potentially recorded, by smart devices. For example, wearing a headset on your calls may result in only your voice being heard by your devices, thereby minimising the amount of data which would be recorded – it might even have the added benefit of clearer audio! Change Your “Wake Word” Some smart speakers allow you to change the “wake word.” If you are finding that your device is constantly “waking” unintentionally, think about changing the wake word to something that is less likely to be used in every-day life. Know Your Suppliers Many, if not all, of the large tech companies producing smart speakers will use state-of-the-art technology to protect any data they collect – but even the best security systems can be vulnerable to rogue attacks. There are also a number of start-up companies which are developing exciting tech to be used in the home. Whatever company you are purchasing your smart tech from, you should ensure you know and trust them and understand their data policies. In the UK, tech companies should all have readily available privacy policies which will explain how your data will be used. Although these can be relatively lengthy documents, most should be written in a way which is easy-to-follow and if you are concerned by what data a company may store and how they may use your data, this can be a good place to start. 
If you are an employer, consider how you can help your employees to develop sensible working practices at home that will prevent the unintentional recording of data by third parties. Regardless of how much you trust the tech companies that store this data, once the data exists, the risks associated with that data exist. The most effective method of managing this risk is to ensure that confidential conversations are never inadvertently recorded by smart devices in the first place. With many of us working from home for the long haul, these tips can help reduce the risk of confidential conversations being recorded by your devices and, where data is recorded, help you understand how that data might be used.

### Your next cloud move & the importance of control

If the lockdown has got you thinking about moving your onsite IBM Power infrastructure into the cloud, you're not alone. For companies that rely on remote working for business continuity, cloud computing can be a valuable asset. Using a managed service provider's IBM Power cloud service means you can store your company's data and applications on their remote cloud infrastructure, enhancing accessibility by allowing you and your team to access it anywhere in the world. Of course, managed service providers themselves rely on physical data centres; secure locations where your server is kept along with others' – much like a traditional storage facility for people's furniture and other physical items. With this in mind, if you're thinking about moving to the cloud or changing cloud providers, make sure you choose a vendor that's in complete control of its infrastructure. After all, you're only as strong as your weakest link in the supply chain.

Blue Chip has complete control

"We own the leasehold on our data centres," explains Chris Smith, Director of Sales & Marketing at Blue Chip. "The main reason we've been able to stay operational throughout the lockdown is because we've invested in them." Two years ago Blue Chip built a software-defined data centre, investing in the networking, storage, and automation and orchestration software that enables its team of engineers to operate systems remotely from home, delivering a mix of IBM i, AIX, z/OS, Linux and Microsoft platforms. In fact, Blue Chip is one of the only managed service providers with skills in IBM Power infrastructure and operating systems to own its data centres. Those who rent a data centre as opposed to owning it can't be 100% in control. Ultimately, the control lies with the landlord. So why is ownership so important? Essentially, managed service providers who don't own their data centres have to rely on the people that do own them to keep them maintained, secure and up to date, and when something happens, such as flooding, it's out of their control.

Managing data centres remotely

Blue Chip is one of the only vendors in the UK with ISO 22301 certification for business continuity. As such, the team already had a plan in place prior to the lockdown and were prepared for almost every eventuality, resulting in a smooth transition. "When we were delivering systems for our customers over the Easter weekend, none of our engineers or the customers themselves needed to be on site," continues Chris.
“Everything, from the data migration and the transition into our cloud infrastructure was done remotely from home on behalf of the customer.” Blue Chip’s software defined data centres locations are far enough apart for resilience yet close enough for low latency communications. Blue Chip also connects to other data centres around the world, and although most of its customers are UK-based, it works with customers in 150 different countries. Other aspects of control As well as data centres, Blue Chip is in control when it comes to the recruitment process. Many IT companies outsource to third parties nowadays, but Blue Chip prefers to attract and retain its own talent to run, maintain, recover and sell its systems. The company doesn’t rely on one sector for income, either. “Our customer base also covers every vertical,” says Chris. “Because we focus specifically as experts in IBM technology alongside Linux and Microsoft based solutions, we’ve been able to deliver our services to every kind of business you can imagine, from large financial service providers to specialist logistics companies.” To find out more about how we can transition, manage and maintain your IT systems in the Blue Chip Cloud, get in touch with our team today. [/vc_column_text][/vc_column][/vc_row] ### Enabling Intelligence-Driven Security: the rise of CISO as a service A global pandemic coupled with a global cybersecurity skills shortage, make these uncertain times for organisations of all shapes and sizes. The shortage of skilled IT Security workers in Europe has doubled in the past twelve months rising from 142,000 to 291,000, according to a recent report. Globally that figure is now over four million. A lack of skilled or experienced security personnel was declared as the number one workplace concern in the same report. With a lack of skilled technical resource inhouse organisations need to seek alternative ways to keep secure. Skills aren’t the only issue; the threat landscape is continuing to evolve at pace. As highlighted in our latest Global Threat Intelligence Report, attackers are innovating faster than ever, and more recently are looking to take advantage of the current pandemic to proactively target vulnerable organisations. Adding to this complexity is the rise in cloud-based services, mobile devices, big data, and the Internet of Things blurring traditional network boundaries and creating a broadening footprint. It’s unsurprising then that 57 per cent of respondents cited security as the biggest challenge of managing IT inhouse in NTT’s 2020 Global Managed Services Report. Evaluating security postures and dealing with security risks is a perennial challenge, not least amidst constant change. Now more than ever organisations are looking to services providers, calling on their expertise to fill any internal gaps and resource as their needs increase. The responsibility of managing this challenge and ensuring information assets are properly protected, usually sits with the Chief Information Security Officer (CISO) but they are in high demand and short supply. As a result, we have seen an emergence of a hybrid approach to procuring security services called ‘CISO as a Service’. Designed to bridge a widening gap in cybersecurity knowledge and experience the service is delivered to organisations by a third party and provides them with access to highly skilled security people and tools. 
Digital drivers

One of the key drivers for this service is the desire for organisations to digitally transform whilst remaining secure. They recognise the need to be 'secure by design', which means making sure that security is at the heart of the business's overall strategy and building it in from the start. Despite the benefits that digital transformation offers - such as increased productivity, the ability to reach new markets and improved business processes - it's almost impossible to achieve without there being at least some risk. Digital isn't the only driver though. Enhancing rather than hindering user experience is important, but it's hard to balance security with ease of use while being innovative. Adhering to the appropriate governance and compliance regulations is a constant struggle, as is preserving operational security alongside compliance, all while keeping up to date with changing practices. Furthermore, maintaining visibility and control over a fast-changing hybrid IT environment with disparate monitoring systems and tools adds to the mounting task. Last, and by no means least, is designing, building and operating a proactive cybersecurity environment. The reality is that most organisations would require a sizeable, skilled team to achieve all of the above, which is not only costly but challenging when we're faced with a global shortage of skilled IT security professionals.

Why CISO as a Service is a viable proposition

The challenges listed above will be a familiar story for many organisations, and it's likely to be a similar tale even for those with a CISO on board. What we do know is that cybersecurity is a complex subject that isn't getting any easier. Using the expertise of a third party to provide a CISO as a Service capability gives organisations the necessary levels of assurance that their cybersecurity strategy is being driven by highly skilled experts with access to the latest threat intelligence. CISO as a Service also offers considerable flexibility, so organisations are able to flex up and down depending on their requirements. It can be used to solve a specific task, such as managing a compliance project or developing an incident response plan, or it can cover the full range of security services. Gaining access to skilled cybersecurity professionals and knowing how to deal with the fast-evolving threat landscape are ongoing challenges faced by organisations globally. They either don't have the skills or expertise, they aren't able to source them, or they can't afford the associated investment costs that additional headcount involves. For them, the concept of CISO as a Service offers a very welcome alternative and should certainly be considered a viable proposition, whether it's to support all or part of an organisation's cybersecurity services.

### For Business Continuity, Choose A Cloud Provider That's in Control

More than ever, we realise how important it is for us to be in control of our data centres, technology and team of experts – not just for our own business continuity but for the sake of the customers who rely on our services. Cloud providers that aren't in control open themselves up to vulnerabilities when faced with disruption. It's harder to remain resilient and protect your business from the unprecedented when you rely on outsourced infrastructure and staff.

What does lack of control look like?
Outsourcing is a common business practice nowadays and in some contexts makes a lot of sense. After all, outsourcing can help businesses keep costs down, reduce risk and eliminate the need for training and upskilling. However, we believe that control is critical in our business. Providing cloud service support and maintenance to a high standard requires ongoing investment, both in terms of the IT infrastructure itself and the people who build, deliver and maintain it. We also understand that if we were to hire contractors to deliver services, we’d increase the risk to our customers. For instance, if a migration project were to overrun, for whatever reason, contractors might have to abandon the project prior to completion. “We’ve been able to stay open because we’ve invested in our data centres.” The same goes for data centres. We know that using third party data centres would mean relinquishing control; we’d have to rely on the owner or the landlord to maintain it to a high calibre. Also, what’s to stop them from closing and using the real estate for other, more lucrative purposes? What does control look like? So, how do you know if your cloud provider is in control? In short, a cloud provider that manages everything in-house – from migration and maintenance to recovery and security – is in control. Data centres Blue Chip owns and runs its two software-defined data centres. “We invest in the power, networking, infrastructure – the whole thing,” explains Chris Smith, Director of Sales & Marketing at Blue Chip. The automation and orchestration software the Blue Chip team has developed also enables the systems to be delivered remotely. “Everything from the data migration to the transition into our infrastructure and cloud was done remotely from home on behalf of the customers.” “We’ve been able to stay open because we’ve invested in our data centres,” says Smith. “For instance, when we were delivering systems over the Easter weekend, neither the customers nor the engineers needed to be on site in our data centres. “Everything from the data migration to the transition into our infrastructure and cloud was done remotely from home on behalf of the customers.” Technical expertise Blue Chip employs 230 people, from maintenance engineers to security personnel. We choose not to outsource; instead, we invest in our staff to ensure that we stay at the forefront of innovation and change within our industry. “As we’re in control of every aspect, we’re able to continue to operate even in the unprecedented circumstances we find ourselves in today,” Smith adds. “We’re still delivering customers into our data centre completely remotely.”   Is my cloud provider in control? Checklist :  Do they own their data centre(s)? ☐ Are their data centres software-defined? ☐ Do they employ in-house teams? ☐ Do they have ISO22301 for business continuity? ☐ Do they serve multiple verticals? ☐ Are they able to operate remotely? ☐ [/vc_column_text][/vc_column][/vc_row] ### Adapting your data protection measures for the cloud [vc_row][vc_column][vc_column_text]We have witnessed an explosion in data. Driven by the growth in connected devices, it’s predicted that more than 79 zettabytes of data will be generated in 2025. This has changed the landscape of almost every enterprise, affecting IT teams on virtually every level. They are being forced to find new ways to store, manage, protect and efficiently utilise this information. 
The consequences of failing to do so can range from missed opportunities to, in the case of breaches, reputational damage, financial loss and regulatory punishment. Added to this is the pandemic-driven demand for remote working, adding a new layer of complexity to the challenges of securing critical data.

Is this proving too much for enterprise security functions? It certainly seems to be the case that, in the early stages of the crisis, employers were prioritising continuity of service over data protection. The rapid introduction of collaboration tools and employees' own devices may have kept the wheels turning as staff worked from home, but in the rush to make this a reality, organisations were perhaps inadvertently taking dramatic risks with employee and customer information.

Given the proliferation of poor data privacy and management strategies and the widespread use of consumer-led remote working apps, it can only be a matter of time before we begin to see data breaches and hacks directly linked to pandemic-driven working changes. To avoid this, organisations must adopt operating practices that place privacy at the heart of their data management strategies. But how?

Delivering balance: Productivity and Security

It is about balance. There is a pressing need for all organisations to ensure that productivity and connectivity remain consistent, yet this has to be done in a secure manner. Consider, for instance, the need to have documentation, including playbooks and running sheets, centrally located and easily accessible for all team members remotely. Cloud computing offers a route to this, but simply moving an on-premises workload to the cloud will prove to be a perilous and potentially reputation-damaging move. The accepted view is that data storage and management in the cloud is a cheap and easy solution for the enterprise. However, that doesn't necessarily make it the right choice for most organisations, and even less so for government entities. Just as applications cannot simply be lifted and shifted into the cloud, legacy policies and procedures should not simply be applied to the deployment of cloud environments. More agile technologies require similarly agile approaches to security.

Every organisation has information it needs to protect, whether that's products in development, customer or user data, contracts, financial reports or other sensitive material vital to day-to-day operations. However, most enterprises do not operate a strict classification system through which they can define access policies. If they were to categorise data based on sensitivity, they could determine, for example, that a sales manager needs data relating to their accounts, but not necessarily company financials, employee contracts or even information relating to other accounts.

Once that categorisation has been undertaken, it is time to think about how a decentralised workforce will access, store and share information. Does each byte of data need to be encrypted, or will a secure connection be enough? How should that affect the environment in which it is stored? What about the use of mobiles and laptops? Again, relevant customer accounts could be available through most endpoints, but financials could require a specific type of device, coupled with multi-factor authentication and other security protocols.
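As a minimal illustration of this graded approach – a sketch with invented tier names, roles and control rules, not a prescription – a classification-driven access check might look something like the following in Python, where each sensitivity tier dictates the device and authentication controls required before access is granted.

```python
from dataclasses import dataclass

# Hypothetical sensitivity tiers and the controls each tier demands.
# Tier names, controls and rules are illustrative, not prescriptive.
POLICY = {
    "public":       {"mfa": False, "managed_device": False},
    "internal":     {"mfa": False, "managed_device": True},
    "confidential": {"mfa": True,  "managed_device": True},
    "restricted":   {"mfa": True,  "managed_device": True},
}

# Simple role-to-tier entitlements: a sales manager can see account data
# but not company financials, mirroring the example above.
ENTITLEMENTS = {
    "sales_manager": {"public", "internal"},
    "finance":       {"public", "internal", "confidential", "restricted"},
}

@dataclass
class AccessRequest:
    user_role: str        # e.g. "sales_manager", "finance"
    classification: str   # one of the POLICY keys
    device_managed: bool  # corporate laptop vs. personal mobile
    mfa_passed: bool

def is_access_allowed(req: AccessRequest) -> bool:
    """Grant access only if the role is entitled to the tier and the
    device/authentication controls required by that tier are satisfied."""
    if req.classification not in ENTITLEMENTS.get(req.user_role, set()):
        return False
    rules = POLICY[req.classification]
    if rules["managed_device"] and not req.device_managed:
        return False
    if rules["mfa"] and not req.mfa_passed:
        return False
    return True

# A sales manager on an unmanaged personal device is refused internal data,
# because the 'internal' tier requires a managed device in this sketch.
print(is_access_allowed(AccessRequest("sales_manager", "internal", False, False)))  # False
print(is_access_allowed(AccessRequest("finance", "confidential", True, True)))      # True
```

The value of encoding the policy in one place is that reclassifying a data set, or tightening the controls a tier demands, immediately changes who can reach that data and from which devices.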
By taking this graded approach and matching access policies with user and device identification, enterprises can balance the need to secure data in all environments, whether cloud or on-premises, with the requirement to access it remotely.

A new approach to data management for a new approach to working

As enterprises adapt to more decentralised ways of working, and the data protection challenges that brings, adopting an appropriate strategy now may be the most critical IT decision they make all year. This much is clear: while the global pandemic is upending 'business as usual', enabling secure collaboration and communication for employees has never been a more business-critical issue. That means having a more streamlined, agile approach to data management and compliance – one that defines storage and access through the categorisation of data. By doing this, enterprises can deploy the environments their data needs in order to achieve the balance between complete security and agile working.

### Five pitfalls to avoid for a successful cloud migration

The overnight transition to remote working has been a lesson in needing infrastructure that's flexible, scalable and reliable. And successful deployments go beyond just a migration: they require a cohesive, phased strategy, leadership and bullet-proof security. Right now, we're seeing businesses looking to catch up on their cloud strategies. There are some simple pitfalls to avoid when you're leading a cloud migration – from failing to put a cloud-first culture in place through to neglecting data protection. We'll unpack these for you here, to ensure your deployment's a success.

Migrating to the cloud without a business goal

This sounds obvious – we get so many enquiries about migrating to the cloud because companies "think they should." Bringing the cloud into your enterprise infrastructure has to have a well-defined business purpose, as does transitioning your data: what do you want from your cloud migration? Get clear on organisational pain points so you know how cloud becomes part of this strategy. Are you looking to shift your remote working policies over the long term? Are you looking at prioritising agility within the business? Is this about taking costs out of a legacy server? Each of these requires a different strategy to deliver a successful deployment. Being clear from the outset is essential.

Not creating a cloud-first culture

Transitioning to the cloud is as much a cultural shift as a technical one. It will ultimately change the way your business works, beyond IT, to operations, finance, HR and, of course, communications. Buy-in for this change has to come from the top, working closely with your IT leads to map out the business impact as you move over to the cloud – whether you're looking at a hybrid or complete solution. It requires future-focused leadership. We always recommend understanding the component parts: the drivers, the risks and the changes cloud platforms enable. Listen to your IT teams as they create a communications plan to ensure the business understands and leverages new cloud capabilities and applications to get the most out of your cloud investment.

Planning for a single cloud migration

Once an organisation has committed to transitioning to the cloud, they want to do everything at once. Don't.
Many businesses have a hybrid model to ease the transition from a business and tech perspective, and we always recommend phasing your cloud deployment. By migrating specific sets of data in stages, you can ensure your enterprise becomes familiar with your provider's cloud solutions as you go. We recommend migrating non-essential or test data first. You can then progress to business-critical and sensitive data, minimising any risk.

Now's a good time to review which applications need to be migrated against those you can replace. Essential software can be transitioned, but if there's a creaky old custom-built CRM system on your server, you may want to have a rethink, potentially configuring new software ahead of the infrastructure's full deployment and testing, using interim resources to allow this.

A road map is key to every successful deployment. Cloud brings with it new systems and software that affect workflow. This in turn requires team training and communication, which you need to factor into the plan. Iterating the change is key.

Neglecting cloud security

Data security needs to be front and centre of your cloud migration strategy. We've seen businesses overlook the need to protect their data throughout their transition, as well as those so concerned about sensitive customer data that it delays their cloud migration completely. Think innovatively and strategically to manage this, and review both the data security benefits and disadvantages to navigate the transition. For instance, when transitioning sensitive data, we'll look at segregating customers' servers to ensure compliance with audited infosec standards. Consider the implications of data hosted in public, private or hybrid cloud – work closely with your cloud partner to assess the most seamless option.

Future-proof your strategy

We see this time and time again – a failure to look to the long term and ensure your solution is future-proofed. The recent COVID-19 crisis is a case in point: businesses that wanted their cloud service on-premise to maintain control, have close oversight and avoid sending their people to other sites found that the opposite was true. Looking ahead, as we plan for future black swan responses, it's hard to justify why a flexible, scalable hybrid cloud deployment won't be part of that. You get the benefits of a managed environment, a dedicated or shared hosting provider and outsourced support, meaning your business can operate at the level it needs to.

The crisis has been a lesson in staying ahead of the curve when it comes to cloud deployment. Having the option to ramp up capacity has been the difference between being able to run your business – or not. Look for partners who can design solutions that make future-compatibility of infrastructure possible. This will ensure the resilience and stability is in place to drive your business growth over the long term. And planning around these pitfalls will support the security, reliability and performance of your solution, so you don't have to play catch-up in the future.

### Cloud and Cyber Security Expo launch Techerati Live digital conference in partnership with Disruptive

Online panel discussions will be streamed live and direct from Disruptive's state-of-the-art, COVID-19-compliant London studios.

London, 6th July 2020 – Cloud and Cyber Security Expo has announced the launch of a new digital conference series - Techerati Live.
Developed in partnership with video production and virtual event experts Disruptive, the first one-day event in the series will be Techerati Live - Security Edition. It will gather the leading voices in the world of cybersecurity to discuss the important issues for businesses in the post-COVID world, all streamed live on LinkedIn on Thursday 9th July from 10am to 5:30pm BST.

Techerati Live - Security Edition is an invite-only, free-to-attend event that will provide attendees with access to a range of panel discussions and one-to-one interviews with leading cybersecurity experts, who will address fresh trends and topics in the field. The day's sessions will also aim to provide insight into the security strategy of some of the world's top organisations, as well as plenty of practical tips, to help attendees understand how to better defend their own organisations within the evolving threat landscape.

Techerati Live - Security Edition will also provide first-class networking opportunities, enabling those in attendance to connect and interact directly, in real time via live chat, with cybersecurity's leading figures as well as other attendees.

Additional information about speakers and the full agenda will be available over the coming week. Register your interest now to receive up-to-the-minute agenda updates and sneak previews from speakers: https://www.linkedin.com/events/techeratilive-securityedition/

About CloserStill Media

CloserStill Media specialises in international professional events. In the technology markets, these business events are hosted across six global territories. The portfolio includes some of the UK's fastest-growing and award-winning events, including Cloud Expo Europe, Data Centre World and eCommerce Expo. Having delivered unparalleled quality and relevant audiences for all its exhibitions, CloserStill has been repeatedly recognised as a leading innovator, with its teams and international events winning multiple awards in Europe and Asia, including Best Marketer – five times in succession – Best Trade Exhibition, Best Launch Exhibition, and Rising Star – two years in succession – among others.

About Disruptive

Disruptive, part of the Compare the Cloud network, is the UK's first online tech and business TV channel. We cover some of the biggest events in the world as part of our live shows, produce original content, explore cutting-edge developments across different sectors and interview the disruptive voices in the industry.

### Taking a cloud-first approach to print infrastructure

Technology has revolutionised the way we live and work, and people are now more reliant on technology than ever before to help facilitate various elements of their lives. These innovations have led to changes that could not be achieved even a few years ago. One such change is the evolution of the working environment, with an increasing number of businesses now beginning to adopt remote working. A technological advancement that has made this possible is the introduction of the cloud. The cloud delivers live computing services over the internet, which means anyone can access their data whenever they want to, wherever they want to. Because of its importance in day-to-day life, you will struggle to find a business that does not use cloud computing for at least some of its day-to-day functions.
However, one area of the business that companies often overlook when it comes to the cloud is printing. Despite the rise of paper-light initiatives in recent years, printing is still heavily relied upon in the office environment. It’s an essential part of most jobs, but historically, many people have had a negative experience when using workplace printers. The rise of cloud computing has helped change this, but as companies adapt to the modern workplace, what should they look out for when deciding to undertake a cloud-first approach to print infrastructure? Infrastructure improvements Investing in a managed cloud print service can allow customers to use a variety of devices to print, which can be anything from computers to tablets. This can be achieved without having to worry about the burden of compatibility and drivers, making it easier to access a printer from an array of locations by holding the print document in the cloud until released by the user. Doing this gives the customer the choice of allowing their employees to work where they are most comfortable, which could be in the office, in a café, or even in their own bedroom. Employee wellbeing is important in any workplace, and the flexibility of cloud printing can align with the notions of the modern workplace. Reduce the IT burden A cloud first strategy for print infrastructure not only allows employees to work from various locations, helping increase work efficiency, but it also reduces the burden on IT departments. IT departments deal with most things computing related, from fixing the internet to making sure systems are secure enough to tackle cyber-attacks – dealing with the print system only adds to the pressure. It can take numerous hours and a hefty bill to keep a physical print system up to date, and the manpower to achieve this is considerably large. As everything is connected and centralised with cloud printing, basic tasks that would have totalled to hours of work at the end of the year now happen automatically. This is anything from software upgrades to ordering toner ink. Cloud printing helps reduce this burden, allowing customers to have access to the latest and fastest cloud print management technology, with upgrades added to the solution itself at times. Not only does this save the IT department time and money but it also means employees do not have to constantly go out of their way to install new software updates. Having the right printer solutions complemented by the best software means everything can be managed on one browser. Doing this saves time, money, and constant maintenance from the IT department. Operational expenditure Helping reduce maintenance costs is one thing customers should think about when adopting a cloud-first strategy for their print infrastructure, but it shouldn’t be the only thing. Operating a printing system through the cloud can save money in other ways. Firstly, it allows customers to print on demand, helping document-heavy customers save massively on printing costs. Maintaining a cloud print infrastructure also allows companies to be creative when it comes to print management. Creating specific printer rules or groups can help stop overprinting. For example, some departments within an organisation might be limited to a few pages a week – capping overly excessive departments from overprinting. Security & privacy We are now living in an age in which hackers are becoming more advanced, and cyber-security attacks are becoming more common. 
Customers who want to put security at the heart of their business should consider cloud printing. When cloud printing was first introduced, data was transferred back and forth through an open network to a cloud provider, making it an easy target for hackers. New technological innovations have now made this data more secure through encryption, which means the data is protected throughout the whole printing process. Without encryption, hackers from outside the organisation could connect to the network and read the data being transmitted between a device and the printer.

When looking to adopt a cloud-first print strategy, there are a number of questions customers need to ask. Do they want to reduce the IT burden, enhance their security, or help improve the working environment? Cloud printing addresses all of these elements, and as the technology continues to evolve it will continue to play a major role in companies across the globe, no matter what sector they work in.

### Making healthcare more connected with 5G

The proliferation of IoT devices connected by 5G is transforming virtually every industry by connecting businesses, environments, and customers with non-stop streams of information. Perhaps no industry has adopted IoT more aggressively than healthcare, as it streamlines operations, monitors patients, and provides remote care. To combat the pandemic, however, we need even more interconnectedness between healthcare professionals and their patients. Still, even as we fight the current battle, it is important that the healthcare industry adopts stringent privacy and protection practices for highly confidential and sensitive patient data and services.

A case for greater interconnectedness

As our NHS heroes face enormous new challenges during the pandemic, IoT and 5G technologies have helped them respond. As with many other sectors, healthcare is using IoT to streamline daily operations. For example, IoT enables hospitals to better track the state of beds and patients. Not only can it help identify the nearest supporting nurse or physician for a patient, but it can also alert staff when beds are free for a new patient. By replacing manual paperwork, patients' waiting time can be cut in half, which improves both the effectiveness of treatment and patient dignity.

IoT and 5G also improve the quality of care by giving doctors access to more accurate and complete data. Devices now monitor bodily functions regardless of location, so doctors can assess a patient's health even when they are not in the office. Whether it's a change in blood pressure, insulin levels, or oxygenation, a member of staff can be alerted and contact the patient before the patient even knows something has changed. Not only do the devices help alert medical staff to health risks, but the constant stream of data enables them to track trends in the patient's health. Increasingly, at-risk individuals are adding IoT devices to everything from pill bottles to home appliances to furniture to provide an even more holistic picture of the state of the patient and how they are functioning in their environment. Instant alerts and more data mean better care.

Better connectivity supports a rise in telemedicine, allowing patients to access medical expertise without having to travel. Home-bound patients with underlying health conditions can connect with their doctors remotely.
Since the doctor has the information from the connected devices, many check-ups need not happen in person. Furthermore, remote connectivity allows a patient to see a specialist without gruelling long-distance travel. This will not only improve patients' quality of life but also reduce their risk of having to visit a medical facility and ease the strain on hospitals. By lowering the barrier to getting a medical consultation, at-risk individuals can address issues before complications develop.

Finally, aggregating the data of potentially hundreds or thousands of patients holds great potential for improving healthcare. Such a collection of data can help us better understand and combat pandemics like COVID-19, and detect and quell future ones. The data can also be used to track, diagnose, and treat broader health issues that affect our population. More information helps provide better care for our patients.

The risks of modernisation

As the healthcare sector adopts new technology and data to help patients, it must ensure that it protects patients from outages, errors, and privacy violations. Healthcare organisations must protect the technology infrastructure they now depend on to run their hospitals. Among the most dangerous threats is ransomware. Comparitech reports that, since 2016, nearly 1,500 healthcare organisations have been attacked, affecting over 6 million patients. Hospitals need a robust ransomware protection solution so that patients are not held hostage by these attacks.

Even as they save lives, medical institutions also need to protect their patients' privacy. With so many devices, hospitals must manage "small data sprawl", a process whereby data is spread outside a single location, which increases the risk of possible data leaks. Whilst such data may be spread across multiple locations, it still needs to meet the same standards of data protection as a data centre. This means that not only do the streams of data need to be backed up in case of error, but personal health information (PHI) cannot be stored in insecure locations. Laptops, tablets, and cloud have been sources of leaked PHI, so they must be both protected and scanned.

Healthcare organisations must also manage access permissions to central pools of information. While the centralised information is vital to current and future treatments, people's information must be anonymised, and medical professionals must be authorised to access the information. Therefore, it becomes critical not only to set up clear access permissions but also to track unusual data access patterns.

Protecting modern healthcare data

While IoT and 5G will transform healthcare, they require equally innovative solutions to protect patients' health and privacy. As a baseline, healthcare organisations should comply with stringent legislation such as the Health Insurance Portability and Accountability Act (HIPAA) and the General Data Protection Regulation (GDPR). As technology evolves, traditional data protection methods simply cannot scale to meet this need. Organisations will need to adopt cloud to protect modern healthcare data. First, cloud provides offsite data protection, so that backups will be safe from ransomware attacks. Instead of having to give in to the cyber criminals, hospitals can restore their applications from the cloud. Second, cloud allows businesses to seamlessly manage their data from one central location because it can connect to IoT devices and consolidate data protection.
Third, cloud enables healthcare professionals to access healthcare records and data because it offers the scale to hold and process vast amounts of data on demand. Finally, with the power and analytics capabilities in the cloud, organisations can detect unusual data access patterns and validate that only the right people are accessing patients' most private information.

Even the most beneficial technology comes with risks. As we fight a pandemic, treat at-risk patients, and support an ageing population, we need to adopt new technology. IoT and 5G will transform healthcare for the next generation, but to support that transformation, we need to protect the new environment. With more cyber threats, chances of errors, and privacy risks, data protection must evolve. Only a cloud-based solution can connect to all the IoT devices, store large volumes of data, manage safe and high-performance access, and enable physicians to rapidly analyse the information. Data protection needs to support our healthcare heroes by giving them the tools and information they need to put patients first.

### Age Is No Barrier To Smart Home Tech

With the number of people aged 65 and over growing faster than any other age group around the world, families face significant challenges. It can sometimes prove difficult to safeguard the welfare of elderly loved ones without encroaching on their feelings of independence. While setting a cooker timer or changing the thermostat may not seem like a huge task, it can be a significant and sometimes dangerous obstacle for an elderly person. But while care homes offer one option, they aren't suitable for everyone. Many people don't want to leave their homes, while family members would prefer their elderly relatives to be able to remain comfortable and settled where they are. Smart home tech has been quick to respond.

Elderly Care And Smart Tech

Smart tech for elderly care is a rapidly growing market. In the UK alone, nearly a quarter of households now own at least one smart home device, while one in ten has at least two, according to YouGov data. Meanwhile, the percentage of the UK population aged over 65 is growing each year. In 2016, there were around 11.8 million people aged over 65; some 25 years earlier, there were around 9.1 million. As our healthcare services continue to improve, this trend is only expected to continue in the coming decades.

Smart Homes Versus Care Homes

As the senior population expands, the Internet of Things (IoT) and the latest smart home technology could pave the way for independent living and remote care monitoring. While there are naturally limitations to smart home devices, they can provide added safety and peace of mind to individuals, their families and care providers. There are now apps that can tell you whether your elderly parent has their kettle on and which room in the house they are using. The NHS has even announced that it is partnering with Amazon to get Alexa automatically searching the NHS website when people ask for advice on health-related matters.

Smart Home Devices

In the last few years, technology firms have developed a range of smart home tech solutions that can help older people maintain their independence for longer. These easy-to-use gadgets can control your boiler, speakers and home temperature, turn the lights on or off, monitor the front door or play some music.
These smart devices connect to a tablet or smartphone via a hub and allow people to control them with simple on-screen or voice commands.

Simple Automation

The over-65s have become a crucial market for smart home tech. As more older people continue to live at home while they receive care and support, devices such as Google Home, the Alexa app and the Apple Home app enable older people to enjoy more independence. Google Home, for instance, can automate things around the home and integrate with other technologies such as Google Maps and Google Calendar, enabling even those with limited vision or mobility to benefit from the technology. Its recently upgraded menu also offers a more streamlined way of accessing the device's key settings and features.

Enhanced Home Safety

The concept of smart homes is becoming an accepted reality, but it's not simply focused on convenience and accessibility. Safety is one of the key benefits of a smarter home for many older people. Some of the most popular smart devices used to enhance safety include home alarms, security cameras, video doorbells and motion sensors. There is now the potential to harness assistive care solutions with the latest generation of assistive home technology, which can respond to emergencies such as a fire or gas leak while also offering simpler enhancements to people's lives such as medication dispensers.

Improve Social Contact

A loss of friends and family, and reduced mobility and income, can result in older people becoming susceptible to loneliness and isolation. Technology can play a role in helping in-home care agencies, relatives and the individuals themselves reduce social isolation. The use of WiFi and networked hubs increases the contact a person can have with their friends and family and provides access to online services and activities that can enhance their wellbeing. This smart technology can also make communication between individuals, care providers and family members more efficient by using apps for reporting and sharing information.

Today's technology is moving fast. New devices are launched every week that work smartly and offer the potential to enhance the quality of life of seniors, whether they remain in their own home or move to supported living.

### How to capitalise on the ERP trends of 2020

The global ERP software market is projected to reach $70.40 billion by 2026 and shows no sign of slowing down. Companies are realising the importance of Enterprise Resource Planning (ERP) software in enabling the modernisation of operations, no matter the size of the business or the industry it operates in. With the growing need to monitor consumer data, having a system in place that provides a holistic view is imperative for companies to prosper. Across industries – from retail to manufacturing to construction – ERP holds the key for businesses wishing to unlock their full potential by automating manual processes, gaining significantly enhanced visibility and improving business security. But businesses looking to migrate to an ERP system must have a clear and structured vision of what they want it to achieve, and they must select a partner carefully in order to truly maximise its potential.
ERP and its importance

ERP is a complete system, capable of integrating point of sale (POS), e-commerce, Customer Relationship Management (CRM), marketing, order management and inventory, Service Resource Planning (SRP), Business Intelligence (BI) and finance. 2020 will see more agile ERP systems come to the fore, along with increased personalisation driven by new technologies and Artificial Intelligence (AI). New technologies are changing the way in which organisations function and operate, as a growing impetus is placed on the customer being able to function as seamlessly and effectively as possible.

The more data a business can access and utilise, the more competitive it can be. Data can influence the products a business might sell and the areas of the market it can target, and in turn the business can tailor its services to maximise revenue. ERP can organise large amounts of data, which reduces errors and benefits productivity. This newfound way of working can include automation and integration of critical processes, as well as the sharing of information across the organisation. As businesses grow, it becomes more critical than ever before to have information easily accessible in one place.

2020 ERP trends

2020 will see a significantly higher adoption rate among small and medium-sized businesses (SMBs) as they look to benefit from ERP. Personalised ERP solutions are also set to come to the fore. Companies that sacrifice control of their own systems in order to get a specialised solution from a provider, one much more personalised to their industry, will free up costly time and effort in the long run, making this a smart move. This ensures efficiency isn't lost in the day-to-day running of the business, while the enhanced flexibility ensures that ERP components can integrate with new technologies. A solution that can move with the times and industry trends will ensure that companies never get left behind. One that is entirely flexible and scalable will help meet any company's needs as it grows in terms of the number of users and functionality. This is why agile ERP will also be a key trend this year.

Mobility is set to be another key feature in 2020, providing users with mobile access and enabling them to tackle problems in real time. This increased mobility will be felt right across the supply chain, from end-users to management, giving them the ability to track Key Performance Indicators (KPIs) and gain key insights into day-to-day business performance. As a result, flexibility is key, and ERP features must be able to connect devices, departments, and customers in one versatile and easy-to-navigate database. Every business can use data provided by the ERP system to plan its business strategies and find smarter and more intelligent ways to use this information to stay ahead of the competition.

Another important trend of 2020 will be the number of businesses choosing to adopt cloud-based ERP systems. Not only does this come at a more affordable price, it will also ensure improved management of a business's assets. ERP is a proven solution with the flexibility to meet a company's demands. Being able to have 360-degree visibility across all channels from a single platform will ensure better customer understanding.
Investing in an IT solutions specialist

Updating or changing an infrastructure that has been in place for a long period of time may seem like a troublesome task, but having a partner with a history of success in the ERP industry can make the transition smoother. Businesses reluctant to change could be left behind in an increasingly competitive world, but those that choose to migrate will reap the immediate benefits of improved productivity and increased savings across all areas of their business.

The expenditure and amount of time needed can often discourage businesses from using ERP. The integration of the new system involves staff needing to be re-trained at basic and higher levels in the new software, which can be challenging. However, when using a partner that specialises in developing and implementing tailored cloud-based IT solutions for businesses, companies will be able to garner the advantages of its cloud-based solution from the first moment. Able to analyse the strengths and weaknesses of a business, an IT solutions specialist can recommend the best solution tailored to each individual company. It will trial the system first, provide its users with all the necessary training, and fix any problems they may encounter once the solution has gone live. The potentially arduous process of upgrading to an ERP system therefore becomes a simple and efficient one. ERP software that has been modelled to suit all business sizes across an array of industries, and that can be implemented and personalised to a business's specific needs with the aim of making it more efficient, agile and profitable, is preferable.

ERP: the path for success

The growth of technology has meant that businesses can target their customers more precisely than ever before. The future-proof design of ERP means it is able to grow with businesses and the changing wants of the consumer. In a market with an ever-growing online presence and ever more data on display, businesses cannot afford to miss out on the fundamental insight that ERP offers.

### Remoting even remoter?

What happens to cybersecurity when remote office workers become home workers, connecting to remote customers who are themselves working from home? VPNs are not the solution. Zero trust networks are.

National security agencies and the military know that there is only one sure way to protect an IT device from malware, and that is to isolate it completely. "Air gapping", rather than networking, means that there is no room for connections to allow malware to spread between devices. Unless, of course, some malicious or fallible human carries it over – but that is another story.

Humanity is currently threatened by its own version of malware – the coronavirus. Scientists have advised governments that humans must be "air gapped" to reduce the risk of it spreading. The tactic is to enforce social isolation via lockdown. Office workers, for example, must now work from home – and phone, social media, messaging and conference calls have replaced the business meeting, the open-plan office, the family reunion or friendly handshake. The irony is this: now that people are (relatively) safely air-gapped, the attack surface across their means of communication has ballooned.

Consider an online financial transaction: this process would normally entail a call via the office through to the financial services company or its helpdesk.
But under lockdown, the office worker is now calling from home, and the finance staff are also working from home. How can we ensure edge to edge security when the edge can be anywhere? The Virtual Private Network (VPN) is not up to the task There used to be a sure way to connect branch offices to headquarters: you lease a private line between them. As a dedicated private line, it was intrinsically “air gapped” from other communication lines and so very secure. But it was a costly solution, requiring manual connection at either end, and very inflexible. It took time to fix contract details and install and, if you needed a little extra capacity, it would require setting up a second leased line. For the last twenty or more years VPNs have been the de facto solution for securely connecting business to business and home users across the Internet. A branch office could sign up for a VPN offering secure connection to the VPN provider’s nearest server, from there it can be securely connected across the globe to a server closer to company headquarters, from where a final VPN connection links to head office. VPN looked like the ideal solution, except that cybercrime has since become so much more sophisticated and pervasive. As remote workers become even more remote in response to coronavirus social isolation, criminals are turning their attention to vulnerabilities in this ballooning attack surface. On March 13th 2020, the Cybersecurity and Infrastructure Security Agency (CISA) issued a warning to business about the growing security risks: As organisations elect to implement telework, the Cybersecurity and Infrastructure Security Agency (CISA) encourages organisations to adopt a heightened state of cybersecurity. In particular, they warn that: As organisations use VPNs for telework, more vulnerabilities are being found and targeted by malicious cyber actors. As VPNs are 24/7, organisations are less likely to keep them updated with the latest security updates and patches. Malicious cyber actors may increase phishing emails targeting teleworkers to steal their usernames and passwords. Organisations that do not use multi-factor authentication (MFA) for remote access are more susceptible to phishing attacks. Organisations may have a limited number of VPN connections, after which point no other employee can telework. With decreased availability, critical business operations may suffer, including IT security personnel’s ability to perform cybersecurity tasks. There are several problems here. A VPN connects the endpoint blindly to a public or private data center and imposes no access or authentication restrictions. Malware travelling through the VPN gets full network access and can start hunting for vulnerabilities in any server on the network. 63% of network breaches are traceable to third party access. There are other limitations too. As VPNs divert traffic between servers they trade performance for security, adding latency and reducing bandwidth. Whereas it is easy to establish a single VPN between two offices, setting up and managing hundreds of VPN tunnels to every home worker becomes a major headache. Is there a better solution? The Zero Trust Alternative “In today’s business, it is critical to give everyone in the organisation the ability to access all apps whenever they need them, no matter where in the world they are. To have this access securely is no longer optional but a must."  
So says Chakib Abi Saab, CTO of OSM Maritime Group – who might as well be speaking for every organisation currently adapting to social isolation and home working.

There is now a proven alternative to VPN links. It is a software-only solution available as Software-as-a-Service (SaaS). No additional hardware is needed; you just install the software – already available for Windows, Mac, Linux and mobile – on each endpoint. Every endpoint then has secure, strictly controlled access directly to permitted private cloud, public cloud and SaaS services. It is a "zero trust" network, only allowing specific, authenticated application sessions. You are no longer "tromboning" data around your corporate infrastructure, adding latency, packet loss and failure points. Instead of digging new VPN tunnels to every shifting endpoint, you become an air traffic controller, with simple, centralised visibility and control of the organisation's entire network.

VPNs are notorious for impairing app performance. It is even worse now, with VPNs tromboning apps to a few VPN concentrators, then to a corporate network and then finally to the cloud. A "scenic route" – but the added latency and failure points are not so pretty. The zero trust solution, however, routes data directly along the best-performing paths. Application performance is noticeably improved, and your home or remote workers are no longer at any disadvantage. For the most critical business, such as financial services, it really is possible to ensure secure, high-performance communications across the globe.

### The benefits of cloud technology for life science firms

Developing a new therapeutic takes time, especially when it's being held back by legacy data management systems. Using the traditional pen-and-paper approach to capture and store data is unreliable – it means data is siloed, and most likely then forgotten. We know that without access to data and the ability to share it with colleagues, drawing insight is impossible. The knock-on effect is that innovation stagnates and throughput is reduced.

Research shows that 90 percent of businesses now use cloud technology to increase productivity. Teams use Slack to communicate, marketing and sales use Salesforce, and creatives use Adobe. Similarly, implementing cloud software in the life sciences sector can increase productivity – speeding up research and development (R&D) by 20 percent. This presents a solid business case for technology that scales along with needs and supports all aspects of R&D in one single, integrated platform.

For organisations looking to upgrade their data management system, there are a variety of reasons why moving to the cloud is a good idea. It is more scalable and cheaper than on-premise, as well as being faster to deploy and easier to use. Cloud software requires minimal training – if you use the internet, then you are already familiar with it. In R&D, where time and costs are major decision drivers, it's a gamechanger – there is great potential to significantly increase the throughput of viable drugs and therapeutics.

Strengthen collaborations and share data instantly

When collaborating with an internal or external team, it's important to be clear on what needs to be done, who's doing it, and when. But if your data capture tools are outdated and ineffective, data might be incomplete, inaccurate, or missing altogether.
Communicating with partners and gaining insight from that data becomes challenging at best. There is room for these issues and many more to develop when using simple pen-and-paper approaches to log the torrent of data – from experiment results and parameters to calibration/batch records, materials, and so on. Using spreadsheets, or a combination of paper notebooks and spreadsheets means transferring data between systems – even if this means simply moving bits of paper around. This is time-consuming and can result in inaccurate data. To access data, you must find it in its logbook or spreadsheet, which also takes time – an IDC report even showed that employees can spend 30 percent of their workday searching for information. And even then, you can’t be sure that the information is correct due to accidental human error. Inefficiency is rife and creates significant business issues. With cloud technology, data capture is automatic, so you don’t have to worry about writing everything into your logbook and finding it later. Storing your experiment data in the cloud also means that you can access it at a touch of a button and share it instantly with colleagues in other departments. This allows everyone to gain timely insights while streamlining the overall throughput. For example, an analytics team can share product attributes with the formulations team, enabling them to choose the right combination for a stable end-product. For those companies collaborating with external partners working across multiple sites, or with employees working remotely, storing data in the cloud enables them to securely access information regardless of location. There is no need to set up a VPN or rely on IT to offer technical support at unsociable hours. A cloud-based system will do the heavy lifting, providing a flexible and secure environment to collaborate. On top of this, with cloud deployments, no information is lost in the black hole of emails or sent to the wrong person by mistake. Instead, all data is available instantaneously and accessible to all those with the correct permissions – scalable and secure.  Get more for your money with cloud deployment  Most companies that already have a system on-premise may be reluctant to make the shift to the cloud due to fears of preliminary cost and disrupting operations. But a software as a service (SaaS) model can be up and running in just a few hours, and at a fraction of the cost of on-premise installations. Since SaaS is subscription-based, the cost is spread out throughout the year and includes regular upgrades and maintenance. Unlike on-premise and perpetual license models, SaaS can be used on any operating system, and developers have an incentive to continue to roll out updates, fix bugs quickly, and ensure the highest level of security. By running the software on its own servers in the cloud, SaaS removes the need for expensive specialised equipment, infrastructure databases, or the hiring of an IT specialist to maintain and support its running. SaaS vendors will have their own team of specialists and technical support to help you with the migration and continue to offer any support you might need throughout your use of the software, saving you both time and money. Even if you scale up, you’ll be able to see what the costs are, making budgeting much easier. Having assistance on hand when you need it (much like your data), drastically reduces the IT department’s workload, enabling them to focus on critical projects to drive innovation.  
Meet regulatory and compliance demands with ease

Cloud software can also help you comply with the high regulatory standards of the life sciences industry. Cloud models are typically architected to adhere to regulatory frameworks, which is vital for drug development firms, as failed audits can ultimately result in business failure. Some cloud vendors have ensured that their products help you comply with industry best practices, including GxP compliance. How? Organisations can build in controls that flag issues before they cascade, and ensure that all data is captured and stored in one location to reduce errors. This helps to boost data integrity.

This approach means that data is tracked, so organisations can see who accessed what data, when they did so, and what they did with it. This level of traceability ensures the information is complete and secure, making audits easier. Data stored in the cloud is routinely and automatically backed up, a process which is normally laborious but necessary to ensure data integrity, meet regulatory demands and propel a product to market.

What R&D organisations need is an end-to-end scientific informatics platform in the cloud that boosts collaboration, streamlines workflows and encourages regulatory compliance at the highest level. It should be scalable, flexible and secure – everything a life sciences business needs to drive innovation and get therapeutics to market faster.

### IoT will empower retail resilience post pandemic

In recent years, retailers have had to adjust to rapidly changing conditions. With the maturation of online shopping habits, the high street has been forced to revise its approach to customer experience. Although only in the early stages, brick and mortar stores had begun to rediscover their purpose in the retail landscape, fuelled by the increasing adoption of new technologies, recorded worldwide at +20% CAGR. The arrival of COVID-19 changed all of this. Stringent lockdown regulations brought almost all physical retail to a standstill, and the ongoing situation means that even as shops reopen, social distancing measures will greatly impact how they do so.

Some retailers, economists and consumers have argued that the pandemic may mark the end of brick and mortar stores and, by extension, the need for smart technology in retail. Retailers would be wise to disagree. On the contrary: with such a high level of commitment to high street premises in retailers' business models and no clear end to social distancing in sight, technology solutions will offer an antidote to this industry dilemma – from stockroom to store-front.

Managing resources and logistics

Supply chain management needs an overhaul to become more flexible. Many retailers were beginning to embark on the right path before the lockdown, but now accelerated adoption of smarter systems will be a crucial factor in minimising costs to mitigate against unpredictable demand and revenues. The availability of supplies is now unpredictable, since social distancing conditions have made supply chain disruptions harder to rectify. Retailers need faster and more accurate tracking and recording to optimise stock management and cost control. Starting with stock, GPS tracking and RFID (radio-frequency identification) technologies use live tracking to optimise supply chain and inventory management.
Used in conjunction with analytics capabilities, these technologies allow retailers to automate precise stock management, simplifying the task of adjusting stock to better cope with unpredictable demand. With reopening on the horizon, retailers should remain prepared for closure at short notice. To help, even energy controls can be optimised for cost-efficiency savings on outgoings and resources. Whether humans activate the sensor controls or IoT devices manage storage conditions to reduce the spoiling of products, with automated smart sensors, electricals and lighting will only operate when they are needed.

Taking an analytical approach, AI and data will empower retailers to identify potential excess before it becomes a burden on profitability. Despite continuing demand volatility and the influence of fewer purchases per capita, as suggested by Deloitte, AI algorithms eliminate the potential for wastage, unsold stock and irrational last-minute stock adjustments.

Social distancing in a digital age

Privacy concerns, particularly following the introduction of GDPR, have slowed the adoption of shopper behaviour diagnostics in retail. Now that these practices are becoming necessary for retailers to responsibly monitor social distancing, retailers should consider solutions that obfuscate their customers' identities, even at the edge level. Using computer vision in cameras or LiDAR light detection sensors, retailers can measure footfall, crowding and movement around the store to control the spread of infection. Simple measures like more accurate and automated people counting can make the difference in delivering on safety, whereas more high-tech analytics such as temperature sensors can potentially detect early infections. These tokenising strategies retain shopper anonymity, which will empower retailers to harness valuable data insights to inform stock, merchandising and innovation spend once lockdown measures ease.

Meanwhile, as UK retail spaces begin to consider reopening, they will need solutions to sterilise wide-area premises, and technology can assist here too. UV robotics devices will be used to perform and measure sanitation to ensure standards are met; in critical areas, automated RFID soap dispensers will help employees maintain safe personal hygiene practices; and non-contact access controls using voice or NFC will limit secondary contact.

Similarly, creative application of biometrics will benefit the customer experience and bring back the recreational element to shopping, or indeed window shopping. Touchscreens will go touchless, leveraging vocal input and Natural Language Processing instead. User-centric engagement technologies, such as smart mirrors, will reduce fitting room congestion and encourage product interaction. AI capabilities open up a host of possibilities for retailers to promote safety and enhance the retail experience.

Introducing the new normal

Even with the right systems in place, retailers will undoubtedly need to adjust to decreased levels of footfall in the physical store beyond the current lockdown phase. This is where there needs to be a change of mindset. Retail spaces may need to be redesigned entirely and rebuilt around technologies that inform new, safer processes, such as one-on-one-off display items. Retailers should look to the market for assistance with integrating appropriate systems which align with existing processes.
Taking advantage of proof-of-concept trials is a simple process that helps demonstrate ROI and fine-tune solutions before full implementation. The sharp rise in online shopping is inevitable, but it will have other long-term effects which further merge the online and offline worlds. For example, Click-and-Collect services and showroom experiences, where customers view in-store and order online, are on the rise. Stock management systems need to reflect these changes and work in harmony between stores and warehouses through the adoption of analytics and automation.

AI technology, cloud services and automation will power the retail industry for years to come, and will propel it into its next chapter. Retailers who are ready to take steps to protect their business from social distancing risks should look to digital transformation programmes that meet their specific needs. Turning this from potential into reality will require businesses to innovate within their supply chains, adopt smart safety practices and embrace complex data to navigate the new parameters for in-store experiences.

### How banks can reflect on the pandemic to reinforce strategies that create more resilient systems

We are currently living through one of the most difficult times in recent memory. The impact on our lives, including our financial lives, has been profound. However, once the storm passes, there will be a time for reflection and reorganising the way we do things. For banks, this will be a perfect moment to reinforce strategies that create more resilient systems.

COVID-19 has dramatically changed the way we shop. In lockdown, people and businesses are relying on digital payments to keep things moving. Research has revealed a 209% increase in online retail sales during April, compared to the same month last year. But it's not all about consumer spending; as supply chains become stretched, cashflow and processing business-to-business (B2B) payments faster are also of concern. Meanwhile, businesses are relying on banks to quickly scale up and disburse government-backed funds to keep them afloat. Figures from UK Finance show UK banks nearly doubled the number of business loans for those impacted by coronavirus in a week, but it's not nearly enough. The pressure to quickly process loans and transactions has never been higher, but two things in particular can cause a hold-up: controls and capacity. An effective way of addressing both lies in the public cloud.

Control Concerns

Many tier one banks, operating with outdated systems and facing a heavy compliance burden, have become overwhelmed during this time. Many are already feeling the need to do things differently, and not just for the short term. Cloud-first fintechs are liberated from legacy and free to focus on agility, navigating market trends and, fundamentally, their customers. However, they don't yet serve the majority of customers in either business or retail banking.

The pandemic has highlighted the limitations of the status quo. For example, in the UK, rigid controls around communication and confidentiality mean most banks' customer call centres cannot operate from home. Anyone with an offshore team is now struggling with the fact they can't access networks. It makes them ask: 'Are our operational control processes fit for the world we are living in, when we need to react to a rapidly changing environment, innovate effectively and better serve our customers' needs?'
A renegotiation of these complex and lengthy operational control checklists – which have mushroomed out in an on-premise, siloed world and not the hyperconnected environment we currently find ourselves in - is therefore required. Banks now realise that they need to reconsider their thinking around control, with many staff having had issues in getting access to their networks, particularly those with offshore teams where VPN may not be great, many are looking to revisit how these operating models will work.  These, in turn, allow for greater workplace flexibility which may allow Banks to reassess their costly real estate footprints as they search for a more nimble and agile model. Turning to cloud deployment on the basis you can access it from anywhere is a natural solution and allows business process to catch up with the technological advancements of the last few years.  This is the first step in a wider shift likely to be accelerated post-pandemic where banks reassess the roles and responsibilities of operating their technology stacks shifting towards more consumable services which allow them the ability to focus on customers and not maintain technical infrastructure. So, what could current, on-the-hoof changes mean for customers long term? A more agile, Cloud-based and digitally informed approach could have a massive impact at street level – literally. We have all seen the long queues of people at cash machines, many of whom have received a government credit payment. As a consequence of the pandemic, we will see banks improve their controls to disburse money more effectively so that people don’t have to queue up to check if their funds have arrived. In today’s digital-first world where the majority of people have smartphones, there is no reason why you can’t issue them with prepaid cards or prepaid accounts. This goes back to data, leveraging insight analytics, machine learning and artificial intelligence (AI) to segment data better and figure out ‘who’s in most need of X, Y, and Z? Capacity Issues Linked to control is the battle for virtual space. For years, data storage and compliance headaches have gone hand-in-hand. Issues around confidentiality and security (not to mention compliance teams being busy with pressing frontline work and time-sensitive audits) meant that, for many, Cloud got side-lined. While investing in virtual storage is a compelling time and money saver long term, it hasn’t been the top priority. Now, many banks are searching for more effective backup, storage and hosting solutions. If they are seeing more transactions but don’t have the capacity to monitor them, there’s more chance of the system failing to work. A Cloud solution can help prevent this; it doesn’t use additional resources, and banks have what we call a ‘sleep-at-night-ability’ when experts handle their Cloud infrastructure. Managed solutions providers invest on the client’s behalf in reserved instances – the data providers’ way of forward-selling capacity as insurance against the kinds of data volume spikes seen in recent months. Reserved instances are crucial because, during a spike, everyone tries to push functions into the Cloud and everyone wants to access the same computing at the same time, resulting in a slow response. Reserve instances ensure we can scale up solutions in the Cloud quickly, avoiding any drop in response times. While coronavirus has created a plethora of new challenges, it has also highlighted problems that have long needed fixing. 
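To make the reserved-instances point above a little more concrete, here is a rough, illustrative comparison of paying on-demand rates for everything versus reserving capacity for the steady-state workload and buying on-demand only for spikes. The hourly rate, the 30% reservation discount and the instance counts are assumptions for the sketch, not any provider's actual pricing.

```python
# Illustrative comparison of on-demand versus reserved capacity costs.
# Hourly rates and the 30% reservation discount are assumptions for the sketch.

HOURS_PER_YEAR = 24 * 365
on_demand_rate = 0.50          # assumed $/hour for one instance
reserved_discount = 0.30       # assumed discount for committing in advance

baseline_instances = 20        # steady-state workload covered by reservations
spike_instances = 15           # extra capacity needed during a demand spike
spike_hours = 6 * 7 * 24       # e.g. a six-week surge in transaction volume

reserved_cost = baseline_instances * HOURS_PER_YEAR * on_demand_rate * (1 - reserved_discount)
spike_cost = spike_instances * spike_hours * on_demand_rate
all_on_demand = (baseline_instances * HOURS_PER_YEAR + spike_instances * spike_hours) * on_demand_rate

print(f"Reserved baseline + on-demand spike: ${reserved_cost + spike_cost:,.0f}")
print(f"Everything on demand:                ${all_on_demand:,.0f}")
```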
In supporting efforts to save businesses and individuals from financial ruin, banks have discovered their limitations. Perhaps there is a silver lining. Cloud gives you, at its simplest, a clearer focus on customers and the ability to be more scientific in how you target them and win business. It also ensures organisations have time to think about customers, as opposed to all the other distractions they have in running their infrastructure. For me, the biggest benefit of the Cloud is that it allows you to get closer to customers and focus on what matters. ### How Does SSL/TLS Client Authorisation Work? TLS, or Transport Layer Security, is a component of almost every web server as of 2020. It is a protocol that allows a client computer to authenticate the identity of a server before sending any data, which ensures that sensitive information is not being sent to a fraudulent endpoint. After verifying the identity of the server, all further communication is encrypted under a set of session keys unique to the client and server. This article will unpack the details of TLS a little later on, but with an important distinction; this article isn't about TLS authentication, but instead TLS authorisation. Authentication vs Authorisation While the concepts of authentication and authorisation might at first seem to be essentially the same, the difference between them is crucial for Internet security. To use a real-world comparison, imagine going to a bank in which you have an active account—be it savings, checking, or anything else. If you want to access your account, you will be asked to present your card as proof that you are the legitimate owner of the account. This is authentication: the process of verifying the identity of a party. Presenting your credit card to the teller does not, however, grant you access to other bank accounts, nor does it give you permission to wander into the back vault, hop behind the desk, or otherwise engage with restricted areas. This is because although your card is able to authenticate you, it is not able to authorise you to interfere with the workings of the bank; for that you would need to present something like a bank-issued ID badge proving your level of authority. This is why there are protocols on the Internet that ensure both authentication and authorisation of a client are possible. A common and practical method is to use TLS. As it stands, a lot of sites on the Internet that use TLS authentication do not also use authorisation. However, there are some occasions where you might want to deny access to certain parts of your site to certain users or, conversely, only allow access to certain parts of your site to select users. A lot of university sites, for example, have pages that are only accessible to staff. Facebook also uses authorisation tactics when determining whether someone has the ability to see a post. If this is your goal, then you'll also need to set up TLS authorisation. Important Concepts SSL/TLS: Standing for Secure Sockets Layer and Transport Layer Security, respectively, these protocols are often treated as one and the same. TLS has long been the upgraded version of SSL; however, SSL is still more common in IT vernacular than TLS. TLS/SSL Handshake: This is the name for the process that occurs the first time a new client makes contact with a server. This process generates unique keys to secure that particular session, predictably called session keys.
The server is then capable of remembering the unique session key, and can continue the session if it recognises that the client TLS certificate has an identical key. Can TLS truly allow authorisation?: Even though TLS certificates were only designed to authenticate clients, their infrastructure makes them ideal for authorisation as well. Doing so does require additional software, which will be discussed further shortly. Ultimately, having a client-side TLS certificate serving double-duty is one of the most secure and simple methods of enabling authorisation to specific white-listed certificate holders. Authorisation Typically, when a client computer accesses a web server using TLS, only the server is required to present a certificate for the client computer to verify. However, it is possible to configure a TLS protocol to check both the server AND client certificate in a process called mutual TLS. Since authorisation requires review of the client certificate, a mutual TLS is necessary for TLS authorisation to work. Getting TLS for your site is easy, and can be done at a small cost. In simple terms, TLS authorisation checks aspects of the client-side certificate to see if a certain certificate-holder is meant to be given the power to interact with select API. Once a client visits a site, mutual TLS will check their TLS certificate. If this is a first visit, there will be no record of that client’s unique certificate values, so a normal TLS handshake will occur. If this is a return visit, then the web server will recognise the session key that was used last time, and can resume from where the client left off. Before giving access to the site, however, a server with implemented TLS authorisation will first check the client certificate for authorisation details. Typically the server will request an identity token. If you recall the earlier example of the bank, an identity token is like a security badge. This data is passed to a security manager, a separate software element that can parse the token and determine the level of clearance the presented certificate has. It uses data from the provided token and replaces certain information in the properties tree identity fields. According to IBM, the data that is affected is as follows: “The IdentitySourceType is set to the user name. The IdentitySourceToken is set to the subject name of the client certificate. The IdentitySourceIssuedBy is set to the issuer name of the client certificate.” Once this process is complete, the session will continue using a set of session keys. However, the server will then be able to allow or restrict the activity of the client based on its predetermined parametres. A key component of this process is that a client certificate must be capable of being propagated throughout the entire process, called the message flow. Since basic TLS authentication only allows for verification of the certificate, additional software is required to carry the client certificate onward through the extra steps. OAuth is an example of a popular and widely used service for this. Spring Security is another such software and may be ideal for a less experienced developer. Should you decide to enable TLS authorisation on a website of your own, make sure to carefully read the details of your chosen software. The benefit of using TLS as a method of authorisation is that it already has a solid infrastructure in place. Since a TLS handshake involves the generation of unique keys, these are sometimes used in the generation of a security token. 
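Pulling the pieces above together — mutual TLS plus an authorisation check against the client certificate — here is a minimal Python sketch using the standard library's ssl module. The file names and the allow-list of certificate common names are assumptions for the sketch; in a real deployment the authorisation decision would be delegated to dedicated software such as the security-manager layer described above, rather than a hard-coded set.

```python
# Minimal sketch of mutual TLS ("client authentication") with a simple
# authorisation check on the presented client certificate.
# File names and the white-list of subjects are assumptions, not a real deployment.

import socket, ssl

context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.load_cert_chain(certfile="server.pem", keyfile="server.key")  # the server's own certificate
context.load_verify_locations(cafile="trusted_client_ca.pem")         # CA that issued client certificates
context.verify_mode = ssl.CERT_REQUIRED                               # demand a client certificate

AUTHORISED_SUBJECTS = {"staff-portal-user"}  # hypothetical white-list of certificate common names

with socket.create_server(("0.0.0.0", 8443)) as server:
    with context.wrap_socket(server, server_side=True) as tls_server:
        conn, addr = tls_server.accept()          # the TLS handshake happens here
        cert = conn.getpeercert()                 # the validated client certificate, as a dict
        common_names = {v for field in cert["subject"] for k, v in field if k == "commonName"}
        if common_names & AUTHORISED_SUBJECTS:
            conn.sendall(b"HTTP/1.1 200 OK\r\n\r\nrestricted content")
        else:
            conn.sendall(b"HTTP/1.1 403 Forbidden\r\n\r\n")   # authenticated, but not authorised
        conn.close()
```

The distinction from the article is visible in the last few lines: the handshake authenticates the client, while the check against the allow-list is the separate authorisation step.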
Additionally, because a web server can remember the session key of a client computer, it can also remember the changes made to the properties tree identity fields. If you're interested in learning more about TLS, especially about the traditional authentication aspect, check out this article for more information. ### Securing the Work from Home Digital Transformation The standard of working in an office is no longer the standard. Today, work from home, and all that it encompasses, is the new normal. In many ways, things haven't changed. The focus for security is still the same: access, managing entitlements, securing endpoints and controlling expense. So, what does that mean now? And what does that mean in the next six months when people come back to a more typical work environment? Robert Meyers, compliance and privacy professional at One Identity, explains. Digital transformation is the modern phenomenon of enabling users to work from anywhere, with access to all the same resources as if they were working in the office. It often refers to the concept of using cloud resources. Perhaps this transformation was something organisations were putting off; however, in the end, it happened with a sudden crush of additional work. But it happened. Thank you, global pandemic. The forced digital transformation was done in such a way that IT departments worked like mad to make the changes that no longer could wait for tomorrow. To keep businesses running, things were done as rapidly as possible, not as securely as possible; not as automated as possible. Changes were done because they had to be. During any digital transformation, there are four areas on which to focus: remote access (getting users working remotely); managing entitlements (giving users the right to do the things that they need to); securing endpoints; and controlling costs. Let's look at the four focus categories in more detail as they apply to dealing with the changes of today and in the future. Remote Access To keep business running, users need to have access. When we talk about remote access, we're not just talking about giving somebody a VPN client. They need to have all the tools they need to access their workloads. That could be a multifactor authentication (MFA) system or a privileged access system for getting into admin tools or securing a remote access gateway of some form. It could also be tools to get into Azure, AWS or Google Cloud, and maybe even work directly on a database. In fact, it could be moving critical resources into the cloud so they can be accessed remotely. The VPN client only goes so far. Not very long ago, the T and E series Internet connections were common. These included the T1, E1, and T3 for example. The problem is some organisations still hold them. Others go for bargain-basement Internet connections because all their workloads are on-prem. So remote access may not work with the VPN if a company has 500 users trying to connect into something that looks more like a coffee straw than a fast, wide pipe to the Internet. And that's a normal reality. Remote access needs to focus on giving workers the ability to work. Sometimes this is moving the workloads to remote physical locations, and sometimes this is moving the workload to remote services or infrastructure (i.e. the cloud). Don't focus on one over the other. Today, we have to deal with managing it the way that an organisation can most easily maintain it.
However, remember that time will cause change to occur. Many of these remote workers will be returning to offices. Some will return to factories. Others, who didn't have access because they were furloughed, will also return, and some of those might actually end up being remote workers. So remember, when moving workloads, employees may need the ability to work on these workloads from the office as well. Plan for both. Managing entitlements Managing entitlements is a complicated concept. Many cut security out of it entirely. The goal in the past was to be flexible and fast, but now things have changed. Now there is a distributed workforce using distributed technology. They're trying to be resilient and flexible, but the lines of distinction between internal/external rights and security have inextricably blurred. The entitlements that give users rights to do things need to be managed. And that means they need to be put into a format – such as true identity management or account management. Additionally, we have to automate processes. Why is automation so important here? This dynamic environment that we are in today will change again. Furloughs will cease and users will return. When you have 5,000 users move to remote access in one day and another 5,000 have their accounts disabled on the same day - that's a lot of work for IT. And if you're using traditional or manual processes to make these changes, this work is not going to happen in a day. Most likely, it will take weeks. Now think what happens when you have to undo that. Inevitably, it will be even more complicated than doing it in the first place. Controlling entitlements is key, but it also has to be methodical and has to use automation. People make mistakes and they take shortcuts. If a person doesn't have enough time to do a task and it was completed quickly, it means shortcuts were taken. When it comes to entitlements – particularly elevated rights that control applications and access or can access personal data or vital company assets – shortcuts are not an option if you want to stay truly compliant. So, when a company is returning to normal, automation is required to remove human error from the equation and to accelerate processes (a minimal sketch of this kind of bulk change appears at the end of this section). Secure Endpoints and Identity Systems, such as PCs, laptops and mobile devices, that are issued to employees need to be secured. The problem is a lot of these machines are distributed, given to employees on a thumb drive as a virtual machine, without a thought of security. There has never been such a push to get people remote and operational as we've seen recently. But in the end, organisations must secure the endpoint. Defining an endpoint to include workstations, tablets, and virtual machines running on personal computers is imperative. But this security needs to be extended to include a user's identity and access, and management of cloud applications, because all of that needs to be secured as well. Start with the physical endpoints, then get those mobile device management systems enrolling laptops and tablets. Deploy desktop-management tools to the desktops. Enable the helpdesk to actually help users. This is a significant issue right now. Systems are being sent out without a lot of the management tools that IT has used for years. The visibility into these endpoints is significantly less than at any time since IT security became the norm. Visibility into access controls is also low. However, security has to be prioritised and implemented in the quickest and easiest way possible.
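As promised above, here is a minimal sketch of what automating entitlement changes in bulk can look like, so that a wave of furloughs or returns is a repeatable job rather than weeks of manual work. The enable_account and disable_account functions stand in for whatever identity-management API an organisation actually uses, and the CSV layout is an assumption for the sketch.

```python
# Minimal sketch of bulk entitlement changes driven from a reviewed change list.
# enable_account / disable_account are hypothetical stand-ins for a real
# identity-management API; the CSV format is an assumption.

import csv

def enable_account(username):   # hypothetical call into an identity platform
    print(f"enable  {username}")

def disable_account(username):  # hypothetical call into an identity platform
    print(f"disable {username}")

def apply_workforce_changes(path):
    """Apply a reviewed change list: one row per user, target state 'active' or 'furloughed'."""
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            if row["target_state"] == "furloughed":
                disable_account(row["username"])
            else:
                enable_account(row["username"])

# apply_workforce_changes("workforce_changes.csv")
```

Because the same script applies in both directions, undoing the change later is the same controlled process rather than a second, improvised effort.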
In some cases, that MDM is going to be deployed by an email. The MFA is going to be given after a user connects into the VPN without it. Is that the best solution? No. However, if the machines have already been deployed, and users are already logging in, then a company needs to start layering in all the security they can immediately, not in six months. Controlling Costs There are interesting explosions of cost centres today. People are moving technology budgets around at an unprecedented pace right now. At the same time, everyone needs to focus on ways to control their costs. So how can you do this during such a chaotic time? There are three areas you can focus on to control costs. The first is managing logs and SIEM; the second is managing license agreements; and the third is closely managing your migration to the cloud. It's interesting that logs come before licensing. Right now, there are a lot of new log sources being sent to SIEM. Remote access logs that went from 50 devices to 5,000 overnight. New web applications and services. Dramatically larger utilisation of auditing logs, possibly in Microsoft Office 365. And logs have to be taken more seriously today because users are out there and so is the data that needs protecting. But most SIEMs charge by the gigabyte per day, and they often charge for how much data is stored. As security teams try to keep up with all these new streams of log data, organisations are starting to worry about causing a denial of service of their own SIEMs (a rough sketch of the ingestion-cost arithmetic appears at the end of this section). Another controllable cost is licensing. Why do so many organisations keep excess licensing? It is not uncommon to learn that an organisation holds too many licenses for email or service desk systems "to be safe". Those extra accounts should be de-provisioned and the licenses released. Why do so many organisations waste money on licenses that are often purchased on a month-to-month basis? More and more services have gone to this subscription concept, which allows users to be added back in nearly seamlessly. So, organisations should plan to use this type of licensing when conditions move back to a more normal setting. Lastly, in the controlling costs category, migrations to the cloud will happen, but they need to be the right migrations. There are a lot of services to help identify an infrastructure provider or SaaS provider that will be economically advantageous. Today, that assessment should include the interface and ease of use. The key here is to reduce costs by putting the right workloads in the right cloud infrastructure. Use the right services to control those spiralling costs. These are escalating costs of a kind we normally haven't seen in most businesses. In many organisations, IT spend is considered a pain point, and what happens is it's strangled to the point where everything is behind. As an example, how many organisations are still basing their infrastructure off Windows Server 2008 R2 or Windows Server 2012 R2? Very few organisations have converted to either Windows Server 2016 or 2019 as the backbone of their infrastructure. When you talk to IT admins it's always on the horizon. Just a year ago, it wasn't uncommon to see people moving to 2012 R2. The argument is always about costs. But some of those more up-to-date operating systems have cloud interconnections, making it easier to move workloads.
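Here is the rough SIEM ingestion-cost arithmetic referred to above. The log volumes per source and the price per gigabyte per day are assumptions for the sketch, not any vendor's actual rates; the point is simply that new log sources multiply quickly once a workforce goes remote.

```python
# Back-of-the-envelope sketch of why new log sources matter when a SIEM
# charges per gigabyte ingested per day. Volumes and the price are assumptions.

log_sources_gb_per_day = {
    "vpn-gateways":     1.5,   # a handful of devices became thousands of remote sessions
    "office365-audit":  4.0,
    "new-web-services": 2.5,
    "endpoint-agents":  6.0,
}

price_per_gb_per_day = 3.00  # assumed SIEM licensing rate

daily_gb = sum(log_sources_gb_per_day.values())
print(f"Ingestion: {daily_gb:.1f} GB/day")
print(f"Estimated monthly SIEM cost: ${daily_gb * price_per_gb_per_day * 30:,.0f}")
```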
So, sometimes controlling the cost means spending the right money, and that could be doing the right upgrades or buying the right tools to do a migration, and then doing the migration. Spend wisely, and always look for additional advantages with every purchase. Digital Transformation The digital transformation has truly occurred; and what’s going to come out of it? No one knows. It wasn’t controlled. It wasn’t planned. It happened. Now, we get to play catch up. When planning next steps, please remember to keep in mind the following: Businesses need to focus on access, managing entitlements, securing endpoints and identities, and controlling costs. These focus areas should include what happens next. The world as it stands and business as it stands are not going to be the world we work in next year. Plan accordingly.[/vc_column_text][/vc_column][/vc_row] ### 7 Reasons for SMBs to Invest in Cyber Security [vc_row][vc_column][vc_column_text]Recently it has become common to read breaking news about a significant cybersecurity breach at a major corporation. These stories dominate the news and lead to the idea that cybersecurity should only be the concern of large corporations. This is untrue, as some 43% of data breaches impact SMBs, according to a recent report. SMBs tend to be softer targets for hackers and threat agents because their cybersecurity systems are not as robust. As many as one in three SMBs have no recognisable cybersecurity system. Here are some reasons why smart SMB owners need to take it more seriously: 1. SMBs face the same threats as big businesses Threat actors are finding fewer reasons to concentrate their efforts on large businesses only. Where once it was thought that big corporations were the best targets for ransomware and other attacks, SMBs are being targeted in increasing numbers. Hackers realise that SMBs also have reputations to protect and they also have access to funds to pay cybercriminals if required. 2. Quality IT consulting services are now available to all As the internet has changed the way the world works, it has also democratised previously niche services. IT consulting services were once primarily interested in the money available from big business. But as more tailored and affordable product offers open up, it is just as easy to get IT support in Cairo as it is IT support Los Angeles. Besides these services, it is also a good idea for SMBs to be good at the basics of self-protection. These include updating software regularly, educating employees how to deal with threats, and implementing a crisis response plan. Other measures like multi-factor authentication and good online habits like strong password protection can go some way towards not leaving everything to a cybersecurity system. 3. IT breaches are expensive The vast majority of malicious attacks are financially motivated. The costs of resolving an attack can cripple a business. The toll is not only financial. Many victims find their operations severely disrupted for extended lengths of time. This causes a loss of customer trust and other unwanted effects such as loss of market share. The road to recovery from a severe breach can be a long one. 4. Interconnected devices are risky One of the great benefits of cloud computing is the ability to access files from a number of different devices. The trouble is that many times, SMB employees in particular use unsafe home devices, which expose company devices and systems to risk. Employees often use unsecured networks while working remotely. 
With no advice or training to modify behaviour, staff unknowingly expose the company to cyber threats. It is particularly common for SMBs to have a range of different hardware items at their disposal if they have grown too fast. IT security policies are also not consistent and do not cater for growth. Many SMBs have a range of different hardware items from different manufacturers. This indicates a procure-as-you-grow approach which shows that IT practices may not have been coordinated. In this environment, threats can enter. 5. Protecting client data is a key responsibility SMBs often hold large amounts of client data from project collaborations. This information may be housed on an extranet or on hard drives. Either way, protecting a customer’s data is any firm’s largest priority. Those companies who have had sensitive customer information such as credit card details stolen from them know all too well the public backlash. Vocal customers who relay a negative experience can do much to hurt a brand. Competitors are also quick to seize on any data lapse and will naturally try to take market share. 6. The reputation of your business is at stake It is not uncommon for businesses to fail soon after a cyber attack. As we have seen, a large driving factor is the loss of reputation. In a world of infinite competitor choice, it can be difficult to recover from a high-profile breach. If reputation loss happens and market share loss follows, this can be a hammer blow to many SMBs who are just not set up for the fallout. Smart SMB owners should be thinking about the hurdles around the corner. These yet unrealised disasters that can sink a business should be addressed as soon as practically possible. 7. SMBs are a soft target SMB owners are often distracted with the more pressing issues of growth and survival. Many times their IT services, be they in-house, or outsourced, are not adequate for the requirement of the SMB. Many owners leave cybersecurity measures by the wayside, because only the noisy wheel gets the oil. By underestimating the threat level, SMBs leave themselves wide open to attack. Many SMB owners believe that they don't have anything worth stealing. This is untrue, and is often proven to be so when a threat actor appears demanding a ransom for sensitive client information. SMB owners should constantly be testing their IT systems for vulnerabilities in the way large corporations do. Conclusion Cybersecurity is more than just an anti-virus program or a firewall. It is a concentrated and organised effort to plug all the potential holes in your system. Therefore, your IT consulting services partner must have a robust threat protection plan that includes defense, detection and fallout mitigation in case you fall victim to a cyber-attack. For SMB owners, coming back from a bad attack can take all the resources they have. Frequently they fail. This is why smart SMB owners know that prevention, in this case, is much preferable to a cure. Robust cybersecurity measures are not only the domain of large corporations.[/vc_column_text][/vc_column][/vc_row] ### Considering the Sustainability of Cloud Energy Use [vc_row][vc_column][vc_column_text]When COVID-19 started causing havoc around the world and forced governments to implement lockdown measures in an attempt to slow its progress, daily life changed quite significantly for most people. Business operations of all kinds were disrupted — offices closed, inessential brick-and-mortar stores shut down, and jobs were lost — and a lot of panic resulted. 
On the whole, though, it’s remarkable how well organisations everywhere (private or public) have adapted to these difficult circumstances, and it’s mostly due to the convenience and ubiquity of the cloud. Without it, the sudden move to remote working would have been too much for many: if this pandemic had struck 15 years ago, the effects would have been very different. After a short period of remote working being standard practice and traffic dropping enormously across the board, you probably saw the stories about pollution levels dropping drastically and air quality improving proportionately. People started to focus on the positives of the lockdown: after the pandemic is eventually thwarted, surely we should stick with this new approach to business. There is one major concern that the new business world has highlighted — and that’s the energy use of cloud technology. In this post, we’re going to consider how sustainable the cloud really is, and what can be done (or is already being done) to make things better: The drain of early blockchain technology Concerns about the energy use of technology really hit the mainstream when cryptocurrencies started to pick up steam and people everywhere invested heavily in crypto mining. High-power GPUs were set up to run at full capacity on a 24/7 basis, all in the hope of producing some profit, and environmentalists were understandably frustrated. The proofs being generated weren’t valuable outside of finding and allocating coins: it was purely about making money. Over time, things began to change. Blockchain itself matured, being embraced as a decentralisation framework by ambitious enterprises such as Everipedia, a more open-minded alternative to Wikipedia — and the benefits of pursuing crypto went down at the same time as manufacturers started disincentivising the use of consumer-grade hardware for mining. Computing solutions to environmental problems Blockchain didn’t just become more energy efficient as it grew: it also started to be used to address environmental issues. Companies arose to use cryptocurrencies to encourage the trading of resources like additional electricity from solar panels. Simultaneously, cloud processing was brought to bear as a powerful tool for calculating energy solutions. Machine learning deployed at a massive scale can create models on everything from climate change to the potential efficacy of alternative energy systems — models that go far beyond what we could ever achieve without it. There’s still so much that we don’t know for sure about these things, which makes it extremely difficult to make things better because any given change could ultimately have unforeseen consequences, but the cloud can lend us prescience. Yes, cloud computing requires an enormous amount of electricity to function, but if it ultimately devises new ways to reduce consumption elsewhere and produce more efficient systems, then it’s sure to constitute a net positive in the long run. It’s a sacrifice that must be made. Huge data centers are actually more energy-efficient Lastly, the biggest reason why the sustainability of cloud computing isn’t as big a problem as some think is that it’s actually more efficient than conventional computing. Think about various personal computers working hard to process calculations: the parts will vary in energy efficiency due to their age and quality levels, particularly since improvements in processing technology consistently make processors more energy efficient. 
Now redistribute that work to one massive data center that features all the latest hardware and software innovations. Everything is built around optimisation. That center will use an incredible amount of power, certainly, but not compared to the net energy use of all the home computers that would handle the tasks otherwise. If every personal computer in the world could be replaced with something like a cloudbook that passed almost all of its processing to the cloud, energy use would go down quite heavily. You can contend that it would be even better if less processing were done overall, but there's no putting that genie back in the bottle: we'll all continue to use computers all the time regardless, and cloud computing is by far the lesser of two evils. With so much work being done remotely, it's understandable that people are getting concerned about the sustainability of cloud processing, but there really isn't much reason to be worried. Using the current form of cloud computing is already vastly more efficient than relying on personal computers to get things done, and technology will continue to improve at a rapid clip. If we want a more energy-efficient world, we need to embrace the cloud, not shy away from it. ### For Blue Chip, Cloud Control is also about Promoting Sustainable Practices In 2012 Blue Chip built its own software-defined data centre (SDDC) – a move that would prove critical in keeping customers' IBM Power systems up and running at full capacity during the 2020 lockdown. "We've been able to stay open because we've invested in our data centre," explains Chris Smith, Blue Chip's Chief Marketing Officer, adding "as well as the networking and storage, the automation and orchestration software that we've developed allows us to deliver systems completely remotely." As a result, Blue Chip's team were able to deliver systems over the Easter weekend without anyone having to travel to the site. "Everything from the data migration and the transition into our infrastructure and cloud was done remotely from home on behalf of the customer." When it comes to control, data centre ownership is key. Using a third-party data centre, on the other hand, takes the control out of the managed service provider's hands and ultimately compromises the security and resilience of the end user's system. As well as investing in technology to enhance and safeguard customer data, data centre ownership also enables Blue Chip to have complete control over its carbon footprint. (Again, something that wouldn't be possible with a third-party facility.) Designing an eco-friendly data centre To keep it safe and functioning in peak condition, the cold aisle air in a data centre needs to be at a constant temperature of 20 degrees Celsius. "We have two different types of aircon systems," says Chris. "We have a traditional air con system based on refrigerants, but it's not very eco friendly so we reserve that as a backup. Instead, we use a technique that involves utilising the air from outside." So, how does it work? Chris continues: "In the winter when the air is colder we mix it with the hot air that comes out of the IBM Power systems to bring it to the correct level and it gets recycled. "When the temperature exceeds 20 degrees in the summer we use adiabatic (free air) cooling. Cold water filters are implemented to drop the temperature by around 10 degrees.
“30+ degrees is when the backup air con kicks in – but let’s face it, how often does that happen in the UK!” Then there’s Power Utilisation Effectiveness (PUE). Traditional data centres that aren’t software-defined use an extra amp to run the air conditioning and lighting for every amp they use to power a computer system and keep it running. “That’s what we’d call a bad data centre,” Chris adds. “We’re using .1 of an amp, so our carbon footprint is greatly reduced compared to pretty much any data centre out there, mainly because we’re using free air cooling.” Blue Chip has also been planting trees in partnership with a local trust in Marston Vale, UK since its data centre was built in 2012. Each time Blue Chip utilises a full rack they plant 1,900 trees (their target is 700,000 which they are well on their way to achieving). “It’s high on everyone’s agenda to be green at Blue Chip, which is why lots of our staff volunteer to plant the trees,'' continues Chris. “When we built the data centre we got the largest ever grant from the carbon trust at the time because we were planning to build something so eco-friendly. “It’s also another aspect of control – we’re trying to minimise our electricity spend so that we can reinvest the money back into the data centre. It’s a unique approach centred around selling a managed service, not electricity.” To find out more about how we can transition, manage and maintain your IBM system in the Blue Chip Cloud, get in touch with our team today. [/vc_column_text][/vc_column][/vc_row] ### Oracle Database in the Cloud: Azure vs AWS vs Oracle [vc_row][vc_column][vc_column_text]Oracle databases offer multi-modal database management, which you can deploy on-premise or in the cloud. For cloud deployments, Oracle offers the use of its managed cloud platform. Alternatively, you can decide to go with different cloud providers, and set up an Oracle database on Azure or AWS. This article reviews cloud-based Oracle database options offered by these three cloud vendors. Oracle Databases in Azure In Azure, Oracle databases are typically hosted on standard virtual machines (VMs). You can configure these VMs with either the Standard or Enterprise editions of Oracle and either Windows Server or Oracle Linux hosts. For licensing, you can bring an existing license or purchase a new one from Oracle. When configuring VM images, you can manually create an image or you can use one of the images that are preconfigured in the Marketplace. If you decide to use a custom image, you have the option to transfer an image created from your existing on-premises database or to configure an entirely new image. Oracle Management in Azure Once you set up your Oracle database in Azure, you can use a variety of native services as well as Oracle support solutions to manage your deployment. Ensuring high availability To ensure that your data remains highly available, Azure enables you to deploy databases across multiple availability zones or regions. It also provides the Azure Site Recovery service, which enables you to failover to a backup database in times of outage. You can also implement features and service that are native to Oracle, such as: Data Guard—a set of services that enable you to create, monitor, and manage standby databases for recovery purposes. GoldenGate—a software package that enables real-time data replication, filtering, and transformation between database instances. Sharding—a database architecture that enables you to horizontally partition data across multiple databases. 
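To make the sharding idea above a little more concrete, here is a minimal Python sketch of hash-based horizontal partitioning: each row is routed to one of several databases based on its key. The shard names and customer IDs are illustrative only, and Oracle Sharding manages this routing for you in practice; the sketch just shows the principle of spreading data across multiple databases.

```python
# Minimal sketch of horizontal partitioning (sharding): a key is hashed and
# mapped to one of several databases. Shard names and IDs are illustrative.

import hashlib

SHARDS = ["shard_eu_1", "shard_eu_2", "shard_us_1"]

def shard_for(customer_id: str) -> str:
    """Map a customer to a shard deterministically, so lookups go to the right database."""
    digest = hashlib.sha256(customer_id.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

for cid in ("C-1001", "C-1002", "C-1003", "C-1004"):
    print(cid, "->", shard_for(cid))
```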
Backing up your data In addition to the above disaster recovery methods, Azure also supports two methods of database backup. Azure Backup—a native Azure service that enables you to automate and manage backups, and you can use it with multiple Azure services. Oracle Recovery Manager (RMAN)—a native Oracle feature that enables you to backup and restore databases. Oracle Database in AWS In AWS, there are two options for running an Oracle database. You can either host on EC2 instances or use Amazon Relational Database Service (RDS). Amazon EC2 Setting up an Oracle database in EC2 instances works roughly the same as setting up any other instance. This method requires more work on your part but also provides greater control over configurations and flexibility over deployment. With EC2, you are responsible for managing Oracle updates, backups, and storage. When using this method, you can employ AWS native services for monitoring, backup, and recovery. You can also use Oracle native services, such as Data Guard and RMAN. Amazon EC2 is a good option if you: Need full administrative control including access at the OS level or SYS/SYSTEM users Have a database that is larger than 80% of the max size in RDS Need to retain specific IP addresses of databases or applications Amazon RDS RDS is a fully-managed database service that you can use for Oracle as well as Aurora, PostgreSQL, MySQL, MariaDB, and SQL Server. In RDS you can select between two instance types — standard or Provisioned IOPS. The latter is designed specifically for I/O-intensive, transactional workloads. You can also set up Read Replica and multiple availability zone deployments for improved availability and resilience. When you use RDS, you can either bring your own Oracle license or you can choose a license included option. Bringing your own license offers cheaper rates but requires a bit more work for configuration. Amazon RDS is a good option if you: Don’t want to worry about backup, recovery, updates, or storage management Want synchronous data replication to a recovery instance for Oracle Standard Edition Don’t want to purchase a stand-alone license Oracle Database Cloud Service The Oracle Database Cloud Service is a managed platform as a service (PaaS) offering from Oracle in which your databases are hosted on Oracle infrastructure. When you use this service, you are responsible for your databases but Oracle takes care of managing and maintaining the underlying infrastructure. In Database Cloud Service, you can choose from three different infrastructure options—bare metal, VMs, or Exadata. Bare metal When you select the bare metal option, you gain access to a dedicated service with up to 51.2TB of local NVMe storage. You can use this option with Oracle Database 11g, 12c, 18c, or 19c. For management, you have access to native Oracle utilities and tooling and can perform tasks from a built-in console, software development kits, or REST API. For backup, you can store data on regional Oracle Cloud Infrastructure (OCI) Object Storage or Block Volumes. Virtual machines When you select the VM option you have access to machines with up to 24 CPU cores and 40TB of network NVMe SSD storage. You can use this infrastructure with database versions 11g, 12c, 18c, and 19c. For management, you have the same tools and options available as with bare metal infrastructure. Additionally, VM deployments in OCI are the only supported way to operate Oracle Real Application Clusters (RAC) on VM. 
Oracle RAC is an architecture that enables you to create a clustered database. Oracle Exadata Cloud Service The Exadata service enables you to operate your databases from Exadata machines. These machines are specifically designed for high-performance and enterprise-scale workloads such as OLTP, data warehousing, and real-time analytics. The Exadata service is for the Oracle Enterprise Edition with database versions 11.2.0.4, 12.1.0.2, 12.2.0.1, 18c or 19c. When you select the Exadata option, you can choose from a quarter, half, or full-rack deployment. With these servers, you can access up to 368 CPU cores on up to eight nodes. Each node has up to 5.7TB of RAM, 300TB of NVMe flash cache, and 340TB of storage. In the Exadata service you have the option to operate multiple databases and database versions at once for mixed workloads. You are also supported by automatic backup and recovery. For management, you have access to all of the same tools and utilities as bare metal and virtual machines. You also have support for RAC. Conclusion Oracle databases offer many benefits, including a high level of portability which enables you to deploy databases on a wide range of infrastructure environments. However, it is important to note that each environment offers different configurations and resources, which can greatly impact the performance, reliability and usage of the database. When deploying Oracle databases in Azure, you can ensure high availability of VMs by using Data Guard and GoldenGate. You also get to use Azure's built-in backup service and native integration with Oracle RMAN. When deploying Oracle databases in AWS, you can set up your database in EC2, and then connect to native Oracle services like Data Guard and RMAN. This will give you a lot of control over your infrastructure. If you prefer simplicity over control, you can deploy your Oracle database using Amazon RDS. When deploying Oracle databases on the Oracle cloud, you get to use Oracle's managed cloud PaaS—Oracle Database Cloud Service. Once you set this up, you can choose between bare metal, VMs, and Exadata. The latter is a great option for high-performance and enterprise-grade workloads. ### Can autonomous mobile robots improve your warehouse operation? In the current climate, warehouses are under more stress than ever before to keep up with demand, but could autonomous mobile robots ease the load a little? Here, Ed Napier-Fenning, from supply chain tech experts Balloon One, investigates the impact these machines could have on your facility. With shops closed and lockdown conditions in place, the past two months have seen more and more people turning to online ordering when they need groceries or other goods. For many businesses in the logistics sector, this unprecedented demand has placed a strain on their supply chains like never before. Therefore, it makes sense that you may be looking for ways to optimise your operation. One of the ways that many firms are beginning to boost productivity is by incorporating robotics into their warehouse facilities. Nearly a third of warehouse and distribution hubs already have robots or are planning on adding them soon (Logistics Manager), while the autonomous mobile robot industry is set to be worth $75 billion by 2027 (IDTechEx). For the warehousing sector, it's autonomous mobile robots (AMR) that are leading the way towards an automated future — but what are they and what advantages can they offer?
In this article, I will provide an overview of this technology, as well as look at some of the benefits. What are autonomous mobile robots? Autonomous mobile robots are robots that can handle a variety of stock activities working alongside warehouse operatives, including picking, replenishment, and stock movement. In some ways, they're similar to automated guided vehicles (AGVs) in that they can carry out bulk stock movement, but they are equipped with sophisticated sensors, operational mapping, and processing systems that allow them to make route planning and process decisions in real time. This means that they can engage in a much wider range of tasks than AGVs, which are purely designed for stock movement. Another big difference is that, unlike AGVs, they don't require significant amounts of infrastructure — like magnetic tape routes on the floor — to function. This means that they can quickly adapt to an existing facility without the need for big changes that can be expensive or lead to increased downtime. Typically, AMR units are used to transport stock that has been picked by a warehouse operative. This means that the employee does not need to move the picked goods to the packing area, effectively de-coupling the warehouse operative from the walking component of a pick run and allowing for far greater efficiency. AMRs can also be used to transport stock between locations in the warehouse, from production lines to storage, or for virtually any other movement task. Some units are equipped with RFID scanners that are used for stock checking activities, giving businesses access to even more accurate, real-time stock data. What benefits can AMR offer your warehouse? Productivity will be boosted The primary role of an AMR in your warehouse will be the movement of stock, which will greatly free up your workforce by reducing the time they spend doing basic handling tasks. In terms of functionality, AMR units can typically go for 12 hours without needing to charge thanks to their high-capacity batteries. What's more, when the robots are low on power, they can even return to a charge station themselves. This means that the requirement for human involvement in general day-to-day maintenance is minimal. While there are still some limitations to the sort of operations suited to AMRs — such as product portfolio and order profile — if you are able to incorporate AMRs into your warehouse, you will see a boost in the capacity of orders you are able to process, as well as elevated stock fulfilment. In the current climate, when demand is higher than ever, you will be able to hit your targets and benefit from customer satisfaction. Improved health and safety In the UK, nearly 500,000 workers suffer from musculoskeletal disorders each year, and manual handling is estimated to be a major cause (HSE). In a warehouse environment, where this type of activity is commonplace, the option of automating a large proportion of this work should result in less strain being put on your workforce each day. Reduction in human error Incorporating AMR units into your warehouse will see human error reduced. For instance, a robot that is programmed to do a monotonous task automatically, like stock movement, will be less prone to error than a human performing it. And, for jobs like stock counting, an AMR with an RFID sensor will be much more accurate than someone counting manually.
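To illustrate the kind of decision an AMR makes between jobs — take the next transport task, or head back to the charger when the battery runs low — here is a deliberately simple Python sketch. The battery figures, task costs and thresholds are assumptions for the sketch, not vendor specifications.

```python
# Toy sketch of an AMR choosing between the next transport task and returning
# to a charge station. Battery figures and thresholds are illustrative only.

class AMR:
    LOW_BATTERY = 0.15  # assumed threshold for heading back to the charger

    def __init__(self):
        self.battery = 1.0

    def next_action(self, pending_tasks):
        if self.battery <= self.LOW_BATTERY:
            return "return_to_charge_station"
        if pending_tasks:
            self.battery -= 0.05          # rough cost of one pick-to-pack transport run
            return f"transport: {pending_tasks.pop(0)}"
        return "idle"

robot = AMR()
tasks = ["pick zone A -> packing", "replenish aisle 7", "goods-in -> bulk storage"]
for _ in range(4):
    print(robot.next_action(tasks))
```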
Great return on investment Thanks to the greater productivity levels an AMR can offer, your warehouse will soon see a return on its investment as profits and customer satisfaction increase. And, if you want to incorporate automation but keep initial costs low, you'll find AMR units can adapt to your existing set up without the need to invest in overhauling your space. Are AMR units safe to use? As with any new technology, there's bound to be questions over safety. The good news is that AMRs are more than capable of becoming part of your busy warehouse environment thanks to their clever safety systems. The devices are fitted with advanced sensors that can detect obstacles, allowing them to stop and reroute very quickly. Generally, AMRs do not move quicker than the average walking speed, either, so they shouldn't pose a hazard in this regard. I hope this article has provided you with an overview of autonomous mobile robots and some of the benefits they can offer. Consider them for your own warehouse and you could increase productivity and customer satisfaction.[/vc_column_text][/vc_column][/vc_row] ### Security risks of increasingly popular cloud collaboration tools [vc_row][vc_column][vc_column_text]Due to recent unavoidable circumstances, many organisations around the world are having to actively utilise cloud collaboration tools such as Microsoft Teams, Microsoft 365, One Drive, and others. For example, Microsoft Teams recently announced that it had set a new daily record of 2.7 billion meeting minutes, which is up 200% compared to March of last year. While such tools are great enablers of remote work, they can increase security risks, and especially the risks posed by the insider threat. In fact, a recent study found that only 23% of remote employees had received any guidance on how to use platforms like Microsoft Teams. The result is that the majority of employees might not even think that they are putting company data at risk when they share sensitive files in chats and channels, assuming that it is someone else’s responsibility to protect the data. The problem is that software and collaboration platforms such as Microsoft Teams rely heavily on SharePoint Online to store files which are shared in conversations, on OneDrive to store files in private chats, and on Azure AD to manage and authenticate team members. Such storage locations appear automatically once the user creates a specific team or chat, which lacks sufficient security controls. Once users indiscriminately use Teams, numerous locations in OneDrive and SharePoint Online appear, of which users do not ever think of. There is a high risk of data overexposure in such storages. Flexibility comes with risks Popular cloud collaboration platforms such as Microsoft Teams are often very useful in supporting the collaboration needs of a remote workforce. Yet, the side effect of this flexibility is a high risk of human errors, as many employees might ignore security best practices just to do their job faster. The most common types of mistakes to be avoided are: Privilege elevation – Since groups in Teams are very adjustable, it is very easy to lose track of user access rights. In fact, group owners might grant access rights to their colleagues even though some may contain files with sensitive data such as financial materials or intellectual property. This can result in uncontrollable Azure AD changes and manipulations with sensitive data in SharePoint Online. 
Insecure data sharing – Sharing sensitive data or credentials via collaboration platforms can lead directly to the risk of data leaks and compliance fines. For example, some employees might ask their colleagues to share information or credentials via chats or team conversations, because they might not have access to password manager, and do not want to wait until the IT team resolve their request for this access. However, insecure data sharing results in sensitive data or credentials residing outside of the secure location, where they can be easily copied by other employees, which might eventually result in a data leak. Data downloading – Downloading sensitive data from collaboration platforms to employee’s devices increases risks of data leaks and compliance violations. Working remote, employees are more prone to this mistake. Such obstacles as poor internet, a slow VPN-connection, and the need to spend too much time searching for the necessary document in the corporate storage might be so frustrating that some employees decide to download data to their devices as an obvious option to simplify their job. Best practices for risk mitigation It is important that every organisation considers a cloud collaboration platform as a new element of its IT infrastructure that requires a modern security approach. The very first fundamental aspect of this is establishing a solid design structure of groups and teams that reflect business needs, as well as development of dedicated security policies. Moreover, it is vitally important to arrange a series of training for end-users, and to educate them on the ‘dos’ and ‘dont’s’ when working with cloud collaboration platform. Another important aspect is to ensure that an organisation can control how well the employees follow these rules. This can require implementing technology that is capable of monitoring activity and permissions around sensitive data. For example, in case of Microsoft Teams, it is important to monitor interactions with sensitive data in SharePoint Online, as any ‘team’ in the application is backed up by a dedicated site to store all data exchanged in on the platform. It is also important to track Azure AD changes, as it is used to store and manage authentication to these new environments​. Such measures will help an organisation to minimise the risk of an insider threat. Secure against the weakest link When an organisation implements new technologies, especially if they lead to new ways of work, it inevitably brings new risks that the organisation must address. Even if the technology provider offers high levels of security, employees are often the weakest link since a new environment can contribute to errors. Therefore, an organisation must continually assess risks that occur under new circumstances and outline a range of preventive measures.[/vc_column_text][/vc_column][/vc_row] ### The changing role of connectivity in healthcare [vc_row][vc_column][vc_column_text]In May 2020, the NHS posted a notice for software to help scale up their remote monitoring capabilities amidst the COVID-19 pandemic. Faced with a population under lockdown, senior officials are seeking ways to utilise health-tech and telehealth – supported by robust networks – to provide patient care. Even before the pandemic, Internet of Things (IoT) technologies played a significant role in the industry. 
Healthcare specialists could monitor their patients through connected sensors, make assessments via video calls, and even perform surgery on a patient thousands of miles away using virtual reality. A flexible network is needed to link devices between doctors and patients, and given the variety of legacy systems and services, full interoperability is critical. The ability to handle huge volumes of data in the instances of video and streaming demands reliability along with near-zero latency. It goes without saying that security is key due to the intrinsic sensitive nature of patient data. Exposure to cyber threats could have catastrophic results for both patients and the organisation, leading IT leaders to consider using private cellular networks in addition to Wi-Fi, in order to create the most secure solution possible. Life-saving connections Inside a facility, a network of sensors can be used in a number of ways to improve processes and support medical staff. For example, the status of every patient could be monitored using IoT devices programmed to send alerts if an abnormal reading is recorded. This is particularly important in large hospitals where staff have more patients to monitor than ever, given quarantine situations, and where remote monitoring is essential to minimising contact with the disease. Behind the scenes, administrative processes could be automated and streamlined to further support hospital staff. For example, by automatically registering the number of available beds in each department or sending test results directly to medical professionals. Connectivity also allows patient care to begin before they even enter the facility. Ambulance units equipped with IoT devices remotely connect doctors and first responders, enabling early diagnoses and implementation of treatment plans through in-transit situations. This type of intervention could improve the prognosis for patients in emergency situations. Telehealth solutions can also support medical staff by allowing them to regularly contact patients living in rural areas or with limited mobility. Video conference technology could be used to perform consultations, prescribe drugs, and treat patients, for example, by guiding them in physiotherapy exercises. This model allows healthcare staff to provide care in a way that suits a variety of lifestyles, thereby deepening relationships with patients. The biggest challenge: security To get the best out of healthcare IoT, a secure and reliable network is crucial – patient data is intrinsically some of the most sensitive. Private networks create a space separate from the public internet, drastically reducing the risk of successful cyberattacks. Each IoT device is connected to the network via a SIM card, enabling secure authentication and user access control. As with any emerging technology, there are areas to consider when implementing private networks. In the case of telehealth, it’s also worth noting that discrepancies in the connectivity solutions between the medical facility and the patient’s home can potentially introduce vulnerabilities. Devices in the hospital may run on a private network, but that’s not the case for the patients, who will likely use the public internet. This can potentially create a way in for malicious users and underscores the importance of prioritising security at the edge of the hospital’s network. As the pandemic has proved, there is no “out of the box” solution for healthcare, especially in the instance of urgent scale up to deal with a crisis. 
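Returning to the remote-monitoring point above — connected sensors programmed to send alerts if an abnormal reading is recorded — the sketch below shows the simplest form of that rule in Python. The vital-sign ranges, patient identifiers and the notify() stub are illustrative assumptions; a real system would sit behind clinically validated thresholds and an alerting integration.

```python
# Simple sketch of a threshold-based alerting rule for connected patient sensors.
# Ranges, IDs and the notify() stub are illustrative only.

NORMAL_RANGES = {
    "heart_rate_bpm": (50, 110),
    "spo2_percent": (92, 100),
    "temperature_c": (35.5, 38.0),
}

def notify(ward_station, message):      # stand-in for a pager/alerting integration
    print(f"[ALERT -> {ward_station}] {message}")

def check_reading(patient_id, vital, value, ward_station="ward-3-desk"):
    low, high = NORMAL_RANGES[vital]
    if not (low <= value <= high):
        notify(ward_station, f"patient {patient_id}: {vital}={value} outside {low}-{high}")

check_reading("P-0412", "spo2_percent", 89)
check_reading("P-0412", "heart_rate_bpm", 72)
```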
In order to build a network appropriate for the facility, each hospital needs to work with a specialist technology partner to determine what functions the connection needs to support, and how to put it together rapidly. The architecture of the solution also has to be easy for the hospital's IT department to maintain and manage in the long term – demonstrating that urgency can be a catalyst for innovation.

Connecting the present to the future

Changing patient demands and expectations mean that improvements in healthcare are becoming ever more closely linked to technological innovation. Healthcare providers will continue to look for the best solutions to support initiatives and provide gold-standard care. As technologies such as the IoT and 5G become more widely available, more use cases will emerge which rely on exceptional connectivity for success. The COVID-19 pandemic has highlighted the crucial role that IoT networks play: from using robots to care for quarantined patients to performing remote monitoring and diagnosis. In the future, these types of applications will become more commonplace as the industry strives to provide and maintain a high standard of patient care using new technologies. Private networks offer the security and flexibility needed to support new applications, ultimately enabling the healthcare industry to continue to innovate and provide superior care.

### The key storage challenges businesses are facing during the increase of home working

Storage pre-pandemic

Traditionally, businesses are used to planning three or five years in advance for storage growth, with the surge in data changing the way companies look at future storage demands. Often businesses don't realise that their data is growing exponentially and, with it, their storage. However, pre-pandemic, many businesses were slow to migrate to the cloud, largely because of confusion as to what it was. Whilst many lines of business storage consumers are comfortable with the cloud's pay-as-you-go consumption model, turning legacy on-premise infrastructure into a flexible private cloud can be difficult to achieve without expensive over-provisioning or an operational overhaul of the way the infrastructure is deployed.

There was also an element of uncertainty around cloud solutions – taking sensitive data and placing it on a third-party server and 'losing control' over its security. This has probably increased even further with the implementation of new data protection regulations such as GDPR. This meant that many cloud migration projects were put on the backburner. Even though businesses wanted and needed to drive efficiencies and cost effectiveness, the complexity of cloud solutions was a key issue for many, with businesses sweating assets to make sure that migration was a worthwhile exercise.

Digital transformation has also proved to be a blocker to cloud migration for many businesses. Generally, digital transformation projects are slow, extremely high-risk and have lots of stakeholders. This, coupled with legacy systems holding data in disparate servers and cyber security concerns, meant that, pre-pandemic, migration to something considered unknown fell by the wayside.

Storage during and post-pandemic

Decision making has been accelerated during the current pandemic, with businesses needing a handle on what data they have, where it is and whether they need the ability to access it immediately or not.
The next step was to understand where their storage is, the kind of storage they need in terms of private, public or hybrid cloud solutions, and the associated costs. These were all considerations to take into account when looking to quickly secure data with staff working remotely off the corporate network. Our world is changing, and fast, and in these unprecedented times it's difficult for businesses to predict their future needs. However, the one thing businesses can control is how prepared they are for such a change.

Cost-effectiveness

Most on-premises solutions essentially involve buying boxes in anticipation of increased storage needs in the future. Companies can end up investing huge sums in technology that will not be used for some time, or not at all. Such a CAPEX cost is increasingly difficult to justify, especially in the face of shrinking budgets. Cloud allows a flexible approach, meaning that companies can ramp storage up or down depending on need, and pay only for what is used. This OPEX approach allows budgets to be used more effectively, with no huge up-front cost to justify in a 'just in case' scenario.

To facilitate this in a private cloud scenario, IBM's Storage Utility Offering enables businesses to benefit from all of the cost-flexibility advantages of the cloud, for on-premises hardware – only paying for storage when it is needed. For businesses that want to build a private cloud, to deploy hybrid infrastructure spanning both on-prem and cloud, or for businesses that know it will take time to migrate to the public cloud, IBM Storage Utility Offering is the ideal solution.

Staying cyber secure

One of the biggest issues for businesses when it comes to data storage is how to keep it secure. Remote working has unfortunately resulted in an increase in cyberattacks. Any data that is stored is subject to an element of risk, with cybercriminals keen to obtain and exploit sensitive information. Securing infrastructure is vital to helping organisations function, monitor for threats and act fast to respond to them. Cloud security must grow and evolve to face these threats in order to provide a defence for businesses and their customers that leverages the efficiencies and advantages that cloud services provide.

Additionally, by offsetting the fear surrounding cloud security through the use of good practices, cloud services can take security one step further. Cloud services can not only secure data within the cloud, but also leverage the transformative cloud industry to secure the endpoint users that rely on it. Moreover, predictive security in the cloud has advanced the field, with the technology collecting and analysing unfiltered endpoint data, using the power of the cloud, to make predictions about, and protect against, unknown future attacks.

Businesses have faced a range of challenges over the past few months, and cloud adoption can help them address their pain points, allowing them to transition to new and secure ways of working. Cloud adoption has obviously been in high demand, and it is a cost-effective solution that can help your business to thrive whilst keeping data secure, providing full accessibility and turning a difficult situation into something that benefits everybody. With IBM Storage Utility Offering, businesses can swap fixed CAPEX for variable OPEX, and migrate to the cloud at their own pace.
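To make the CAPEX versus OPEX comparison above concrete, here is a minimal sketch of the sum a business might run for itself. All of the prices, growth rates and capacity figures below are illustrative assumptions for the sake of example – they are not IBM pricing – and whether pay-per-use comes out cheaper depends entirely on the numbers you plug in, which is exactly why the arithmetic is worth doing.

```typescript
// Illustrative only: compares an up-front CAPEX purchase sized for peak demand
// with a pay-per-use OPEX model billed monthly on actual usage.
// All prices and growth figures below are assumptions, not real quotes.

const MONTHS = 36;                  // three-year planning horizon
const START_TB = 100;               // storage actually used in month 1
const MONTHLY_GROWTH = 0.03;        // assumed 3% usage growth per month
const CAPEX_PRICE_PER_TB = 250;     // one-off price per TB of capacity bought up front
const OPEX_PRICE_PER_TB_MONTH = 12; // pay-per-use price per TB per month

// Usage in each month, compounding at the assumed growth rate.
const usage = Array.from({ length: MONTHS }, (_, m) => START_TB * (1 + MONTHLY_GROWTH) ** m);

// CAPEX: buy enough capacity on day one to cover the final month's demand.
const peakTb = usage[usage.length - 1];
const capexCost = peakTb * CAPEX_PRICE_PER_TB;

// OPEX: pay each month only for the capacity actually consumed.
const opexCost = usage.reduce((total, tb) => total + tb * OPEX_PRICE_PER_TB_MONTH, 0);

console.log(`Peak capacity needed: ${peakTb.toFixed(1)} TB`);
console.log(`Up-front CAPEX purchase: £${capexCost.toFixed(0)}`);
console.log(`Cumulative pay-per-use OPEX over ${MONTHS} months: £${opexCost.toFixed(0)}`);
```

Swapping in real quotes and measured growth figures turns this toy into a quick sanity check on which consumption model suits a given workload.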
For more information on whether your systems are responsive enough for remote working, watch our storage Live Stream Replay here.

### Why Progressive Web Apps (PWAs) are so valuable

Content creators working in a range of industries have found success in the connected world, with the explosion of new channels and tools increasing the opportunities for them to communicate and connect with their audiences. The Rolling Stones are a great example of this – they created a community platform to share their music and connect with over 250,000 fans around the world. The rapid growth of the influencer marketing industry is another testament to the value these creators can deliver, with brands expected to spend $15 billion by 2022. Now, it feels like almost anyone can become an influencer.

Our recent research found that 52% of locked-down Brits are willing to cut the cord with social media giants such as Instagram and Facebook and would look to community apps dedicated to their particular passion. Evidently, there is consumer demand for such community apps, with the same survey revealing that over a third of individuals want to see dedicated apps from their favourite brands and influencers. More specifically, Gen Z would be interested in joining personal passion communities (54%) and brand community apps (41%). This dramatic shift has given rise to the passion economy, which gives content creators the opportunity to capitalise on shared interests and establish niche community groups. As trust in mainstream social networking sites continues to fall, the growth of community platforms and apps has skyrocketed.

Progressive Web Apps (PWAs) are helping to power these passion-driven communities, allowing creators and brands to move away from social media and create more dynamic and interactive platforms. Unlike native apps, which are written to run on mobile devices, PWAs run on a webpage, allowing individuals to find and access them from any device. Of course, not all of us have the knowledge or capability to create a functioning app. Influencers are relying on professional app developers, costing them time and money to build and maintain their own app. PWAs provide a great solution for this.

What do PWAs offer?

The key attributes that define PWAs can be viewed through two categories that reflect their value: connectivity, and a dynamic and engaging user experience.

Connectivity

PWAs offer search engine discoverability and a linkable URL, meaning they're identifiable as applications, so there's no friction, zero installation required and they're easy to share. For any creator, the shareability and power of URLs is significant. The ability for existing users to share your content is essential to the growth of your platform, and having a URL alongside your app enhances that sharing function.

PWAs have the added benefit of a Service Worker, which is essentially a script that runs separately from the main site. Service Workers can play an important role in the development and maintenance of PWAs, giving you access to functions such as caching network requests. As communities grow, platforms collect and store customer data from their web journey - from interactive behaviour to personal and sensitive data. PWAs are accessed through a web browser rather than any app store, so they can be less costly and save you time compared to native apps.
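Picking up the Service Worker point above, the snippet below is a minimal sketch of the kind of caching Service Worker a PWA typically registers. The cache name and asset list are placeholders rather than part of any particular platform; it pre-caches a few core files at install time and answers later requests from the cache first, falling back to the network.

```typescript
// sw.ts – minimal Service Worker sketch (cache name and asset list are placeholders).
declare const self: ServiceWorkerGlobalScope;

const CACHE_NAME = "community-pwa-v1";
const CORE_ASSETS = ["/", "/index.html", "/styles.css", "/app.js"];

self.addEventListener("install", (event: ExtendableEvent) => {
  // Cache the application shell so the PWA loads instantly on repeat visits.
  event.waitUntil(caches.open(CACHE_NAME).then((cache) => cache.addAll(CORE_ASSETS)));
});

self.addEventListener("fetch", (event: FetchEvent) => {
  // Cache-first strategy: serve from the cache when possible, otherwise hit the network.
  event.respondWith(
    caches.match(event.request).then((cached) => cached ?? fetch(event.request))
  );
});
```

A page would normally register a compiled version of this with navigator.serviceWorker.register('/sw.js'); once installed, repeat visits can be served from the cache even on a flaky connection.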
Alongside that, there is no need for users to download content as they do with a native application; PWAs can be accessed directly through a URL, so content can be accessed much faster and users' phone memory is not compromised. PWAs cater to user demand for flexibility and seamless functionality, allowing individuals to connect and engage across multiple devices, including mobiles, laptops and tablets.

Dynamic and engaging

PWAs allow users to experience app-like interaction on a web platform, delivering easy navigation and the ability to install to the home screen from the browser, allowing users to pin and 'keep' selected apps, just like those you download from an app store. The responsiveness of PWAs means that creators and users are not restricted by accessibility, and the increased flexibility across mobile devices, laptops and tablets ensures that content is not compromised across platforms.

For creators, having fresh content is essential to keep views consistent and audiences loyal. Thanks to the role of Service Workers in the update process, it is easier for PWAs to be updated. PWAs also offer re-engagement UIs with functions such as push notifications – a key capability in increasing interaction and building web traffic across platforms. Engagement features significantly differentiate PWAs from native apps, as push notifications also work in the background so users remain connected with the latest content and updates. While native apps are still dominant, it is important to note that these push notifications are not yet supported on iOS, as Apple only began supporting PWAs in 2018; however, support is expected to improve, and the rollout of 5G networks will only serve to improve mobile device performance.

The future of community platforms

The current software landscape is dominated by native apps, but PWAs have been making some headway in becoming a primary UI, with the likes of Google moving Android apps for Chrome OS to PWAs. However, the transition and creation process needs to be simplified to help brands and creators make the switch from a native app to a PWA. When building your community, you shouldn't be held back by the barrier of your technical competence. This is the challenge SaaS players in the industry are looking to tackle and, increasingly, these providers are offering self-service models which allow brands and creators to build their own PWA-based communities.

The incredible flexibility and control that PWAs offer when compared with traditional social media networks and native apps is undeniable. As the battle for consumer attention heats up, creators and brands will increasingly be forced to reassess which platforms and apps they leverage, ultimately selecting those which allow them to engage with their communities effectively and with ease.

### What Will Business Continuity Strategy Look Like In The Near Future?

The business case for cloud technology could not be clearer today. In recent months, some of the reservations that organisations have harboured towards the cloud have been swept aside. Today, cloud technology has enabled wholesale remote working at a scale that, pre-lockdown, would have been unfathomable.

Recent months have put organisations' business continuity plans to the test. As organisations now revisit their business continuity strategy and IT investment models for the future, their challenges and considerations will be different to what they were even a few weeks ago.
Focus on ‘people first’ Predominantly, business continuity plans have been focussed on ensuring commercial operation during a crisis within the confines of a physical location – the office. While people have always been a key component of the plan, here on in, and perhaps for the first time, business continuity plans need to put staff on the top of the list in terms of importance due to the dispersed nature of the workforce. With flexible working no longer being a perk, attention needs to be paid to work culture, properly supported by a cloud environment. Risk considerations from a staff perspective will change too. At a base level, typically staff safety and risk mitigation has centred around physical events such as a fire. But how do organisations ensure staff safety and their availability during a black swan event with wholesale remote working?  And what role will IT play in such a scenario?   Technology alignment with working practices Organisations need to revisit their technology infrastructure to align it with new working practices. While digital transformation initiatives are underway in most organisations, these programmes need re-examining holistically. For example, to ensure easy, intuitive, and yet secure access to business information, centralised, policy-driven, and cloud-based repositories for document and email management become essential for a dispersed workforce. Not all organisations deploy such systems today, and even less in the cloud. In many sectors, despite the push to digitise records, there is still a significant reliance on paper for day-to-day activity – be they physical contracts, internal forms, postal letters and or any such. For the future, organisations need to review some of the older, non-electronic processes to close the loop and ensure that important information isn’t missed, which could impact business operation. By way of an example, during this lockdown, in some law firm offices, staff are having to go in to take deliveries of post, scan and then send electronically to lawyers. With a reduction in face-to-face meetings, perhaps there is a need to make video conferencing capability available to every single employee, unlike currently where in most organisations only a limited number of licences for such tools are purchased.  For example, some Office collaboration features are limited, unless the organisation has invested in the more expensive Office 365 Business or Enterprise licences.   Security measures rethink Cyber criminals are having a field day as organisations scramble to remain operational remotely. With home working becoming ‘business as usual’, information security needs to become a top business continuity priority. A layered approach to information security becomes essential. Foremost, conversations with cloud technology providers need to focus on their future security-related roadmaps, with contractual decisions taken based on the adequacy of those measures. For business-critical functional systems, investment in technologies such as Zero Trust, behavioural modelling and threat detection need to be undertaken. Over the years, huge amount of investment has gone into security systems for the office-based environment. Now with a dispersed workforce, the importance of advanced end point security systems grows in stature. The laptops of employees may not have the same level of protection as they would have in the office environment.  
Potentially, AI-based endpoint security will become an imperative for business continuity and security.

Information disposal is another challenging issue. At a system level, policy-based security measures can be put in place to protect confidential information, but thought now needs to be given to how employees in their home offices can dispose of commercially sensitive physical documents – especially in sectors where paper documents are still widely used. For example, do staff need to be provided with crosscut paper shredders?

A different kind of capacity planning

Even in organisations that have adopted the cloud in some shape or form, capacity planning has been limited to a proportion of the workforce working flexibly or from home – with scalability thrown in for occasional disruptions such as train cancellations or annual snow-related transportation problems. Post COVID-19, organisations need to devise measures to ensure immediately available capacity for 100 percent of staff working from home. There are costs involved, which need to be factored in. In many organisations, this kind of capacity planning has been under-invested in. If everyone needs to be working on the VPN, then more licences need to be catered for. Capacity that can flex in real time will also need to be purchased for much higher volumes of data. The same applies to disaster recovery. Organisations need to reconsider their investment in cold, warm and hot sites, based on new business requirements. There is a strong case for investment in SaaS, PaaS and IaaS models.

Supply chain resilience

The COVID-19 lockdown brought to the fore the true IT resilience (or lack thereof) of organisations. Many businesses have not been able to meet their contractual SLAs. For example, many organisations struggled to get hold of laptops to provide to their staff when the lockdown was announced. Most organisations have single suppliers for different types of requirements – i.e. they might use HP or Dell for PC equipment, another supplier for office supplies such as paper, and so on. Future business continuity plans need to ensure that there are strong backup measures in place for equipment and other business-critical online and IT infrastructure services. Likewise, from a support standpoint, technology suppliers with remote IT implementation capability will merit partnership over those that can only deliver and maintain systems on-site.

Business continuity planning in organisations requires an overhaul. Organisations will do well to consider tabletop testing of their business continuity and disaster recovery planning to ensure that their strategy is fit for the new business environment. It will allow them to redirect and reinvest their resources in the most optimal manner, while ensuring that the needs of all stakeholders in the business are adequately met. Business continuity plans need levelling up, based on future unforeseen events.

### Maintaining business continuity in the face of change

Footfall was down 44.7% in March. High street stores without eCommerce operations saw sales collapse. Primark, for instance, saw its monthly sales of £650m evaporate. And whilst eCommerce retailers have the advantage of being able to maintain online sales, it's evident that even those retailers are scrambling to adopt new business models and maintain business continuity.
As a measure of how difficult it is to keep operations going, consider that Debenhams has fallen into administration, Cath Kidston has permanently shut up shop and Next and many others had to temporarily at least close their online operations altogether. One positive sign amidst the turmoil: shopping habits remain healthy with global data showing online sales up 46% since just before the World Health Organization declared a pandemic and stay-at-home orders began rolling out. The rise in online sales has been so significant, in fact, that the ongoing shift from brick-and-mortar stores to online shopping has been accelerated by a year or two because of the pandemic. While it’s too early to predict how much of the growth in online spending will persist after the pandemic ends, the growth is certain to outstrip the 16.1% global growth rate that eMarketer predicted for 2020. No doubt retailers are buoyed by seeing more consumers — and more new customers — heading online. But the rapid shift in shopping channels comes with challenges, including holiday-like order volumes, supply chain disruptions and the need to continue operating with employees working from home. Unlike the holiday season, of course, the COVID-19 induced changes and challenges came with very little warning or time to prepare. The combination of a spike in online orders and the reality of a dispersed workforce create specific difficulties when it comes to reviewing online orders for fraud and protecting the business from consumer abuse. Retailers that rely heavily on manual fraud review teams might find it difficult to keep up with the dramatic and persistent increase in orders while still accurately assessing which orders are fraudulent and which are legitimate. Workforce woes and data breaches The problem is exacerbated by requirements that fraud teams work from home. Agents’ home offices might not be equipped with the tools they need — including secure laptops on which they can review customers’ personally identifiable information. Their cybersecurity systems might not be as robust as the protections in place at company offices. All of which raises the potential of violating privacy regulations or experiencing a data breach. Moreover, even the most hard-working employees might not be able to maintain the same level of productivity at home, where two or more adults might be working and where children need to be cared for and possibly homeschooled. That creates a situation in which higher-order volume means more work needs to be done at the very time that review agents’ personal circumstances mean they have less time to do the work. All these pressures come at a time when potential for fraud and consumer abuse is on the rise. Fraudsters thrive amid chaos and they often seek to hide among the noise of increased order volumes. Furthermore, COVID-19 and the need to limit the disease’s spread have inspired new ways of doing business. For instance, to keep employees and customers safe, many couriers across the globe no longer require signatures on deliveries. That increases the chances for doorstep theft of an eCommerce package or for someone to claim that they did not receive a shipment that they actually had received. Unfortunately, the ongoing financial strain brought on by COVID-19 could lead to increases in consumer abuse. It’s not all doom and gloom As difficult as these times seem, however, there are steps retailers can take to protect themselves while providing their shoppers with improved customer experience. 
Automated order flow, with fraud solutions that rely on machine learning, allows merchants to off-load overflow orders to systems that can sift good orders from fraudulent ones almost instantly. The best among these solutions also protect merchants from false claims that a package never arrived or that the product that did arrive was not in line with what was described on the merchant's website. These automated systems can scale up almost infinitely, and they provide all the necessary security to protect consumers' personally identifiable information. The technology has been around long enough that the AI-based machines have had years to learn from processing vast amounts of data. By their nature, such systems are able to keep up with changing fraud tactics and targets without being distracted by the surrounding chaos.

Exploring the potential of automated order flow can be a key step in building out a business continuity plan to guide your enterprise through the current pandemic. More importantly, it can be the start of preparing your business for the post-pandemic world and whatever that will bring.

### AI on the Edge and in the Cloud: three things you need to know

The world has seen a rapid upsurge of innovation in smart devices at the edge of the network. From smartphones and sensors to drones, security cameras and wearable technologies, taking intelligence to the edge is creating a market that is anticipated to be worth well over $3 billion by 2025. But it is not a market valuation driven purely by the devices. Rather, it is the value of the opportunity created by a new generation of advanced applications that offer a new awareness and immediacy to events on the network.

The lower latency and higher connection speeds offered by 4G mobile broadband – and, soon, more reliably and ubiquitously by 5G – along with greater power and memory in a smaller footprint, enable applications to number-crunch on devices that are independent and closer to the source. With intelligence delivered on the spot, in real time, AI at the edge allows mission-critical and time-sensitive decisions to be made faster, more securely and with greater specificity. For instance, AI-powered medical technologies can empower at-the-scene paramedics with immediate diagnostics data, while self-driving vehicles can be equipped to make micro-second decisions for safety.

Yet there are many scenarios where decisions still demand heavy computational lifting or where intelligence doesn't need to be delivered in real time. In these cases, AI-driven apps can comfortably remain located in the cloud, taking advantage of greater processing power and specialised hardware. For instance, hospital scans, machine analytics and drone inspections can happily accommodate data transfer lags to and from the cloud. Whether considering deployment of AI at the edge or in the cloud – or perhaps a mix of both – there are three things you need to know:

1. 'Trained' versus 'inference' decisions in AI apps, and what this means for users

A key underlying factor to consider in application development for and deployment of AI is whether the machine learning (ML) element relies on training or inference algorithms. The training approach creates a machine learning algorithm and makes predictions using the feed of data as its training source. It creates a continual feedback loop that modifies behaviour to gradually reduce the error value as the algorithm becomes more accurately trained.
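As a rough illustration of that feedback loop, the toy sketch below fits a single-feature model by repeatedly predicting, measuring the error and nudging its parameters to shrink it. The data, model and learning rate are invented purely for demonstration; real training runs over far larger datasets and models.

```typescript
// A toy illustration of the training loop described above: predict, measure the
// error, and adjust the model so the error gradually shrinks.
let weight = 0;
let bias = 0;
const learningRate = 0.05;

// Pretend "feed of data": inputs and the outcomes to learn (roughly y = 2x + 1).
const samples = [
  { x: 1, y: 3 }, { x: 2, y: 5 }, { x: 3, y: 7 }, { x: 4, y: 9 },
];

for (let epoch = 0; epoch < 200; epoch++) {
  let totalError = 0;
  for (const { x, y } of samples) {
    const prediction = weight * x + bias;
    const error = prediction - y;        // how wrong the current model is
    totalError += error * error;
    weight -= learningRate * error * x;  // feedback: nudge parameters to reduce the error
    bias -= learningRate * error;
  }
  if (epoch % 50 === 0) {
    console.log(`epoch ${epoch}: squared error ${totalError.toFixed(4)}`);
  }
}
console.log(`learned model: y ≈ ${weight.toFixed(2)}x + ${bias.toFixed(2)}`);
```

At production scale, the same predict-measure-adjust cycle runs over vastly larger datasets and models, which is where the cloud resources discussed next come in.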
Generally, this requires heavy computational lifting using a cloud-based ‘inference system’ that is capable of continually revising and updating the algorithm at the same time as it is receiving data and delivering results. Whilst, for the user, this can mean a time lag in data transfer and processing, the advantages are continually more accurate results and more powerful processing of larger data sources. In contrast, the inference approach applies a trained algorithm to make a prediction on the device itself where fresh data is received. There is no training loop and no refinement; each response output is processed afresh using the same algorithm. This is where the inference system that executes the algorithm and returns decisions can comfortably be located at the edge. For users, this means that data is processed on the device rather than sent to the cloud. It means that the device does not need a connection to the network in order to operate, which is useful for remote, off-site workers and devices on the move. Without network latency, it also delivers a near-instant response, whilst also reducing security risks associate with data transfer across networks.   2. Interoperability adds performance and value Any IT infrastructure without a platform for interoperability will end up with a series of disconnected data silos, incapable of sharing intelligence to create value across different parts of the business. For AI devices on the edge, this issue is magnified. IoT devices and AI applications capture an enormous amount of data by the second. They are also often geographically dispersed and equipped to process part or all of the intelligence at source. But deriving value from the data relies on it being made available and accessible across other business operations in the infrastructure, identifying trends and empowering better decision making. If the data capture, processing and delivery of results is all happening on the edge device, there is a danger that the data will not be shared across the wider IT infrastructure. Interoperability creates links between applications and services across the business such that, for example, medical data from wearable devices is crunched at the edge to provide personalised diagnostics to the wearer, but also processed in the cloud for longer-term insight on user groups and input to business analysis.   3. An IT infrastructure to support hybrid edge and cloud-based AI There are situations where AI needs to be wholly based in the cloud or at the edge, such as in manufacturing robotics installations (cloud) or consumer gadgets such as smart speakers and home or wearable devices (edge). However, in the majority of applications, the IT infrastructure will need to support a hybrid of both edge and cloud-based AI technologies. The key consideration is, what will the impact of a delayed or interrupted response be to the end-user? This requires an understanding of critical at-the-edge intelligence versus non-time critical intelligence. With this knowledge, it is possible to build an IT infrastructure that balances demands and capabilities to suit. Edge devices can be selected with the right processing power, interoperability and capabilities to perform the level of number crunching at source, while cloud functionality can also be matched to support trained enhancements to the algorithm and perform further analytical operations.   
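As a hedged sketch of that hybrid split, the example below applies a small, already-trained model on the device for the time-critical decision, and only ships readings to a cloud endpoint, best-effort, for heavier, non-time-critical analysis. The model parameters, alert threshold and https://example.com URL are all hypothetical, not a real service.

```typescript
// Sketch of a hybrid edge/cloud split under the assumptions above.
interface Reading { heartRate: number; timestamp: number }

// Tiny "trained" model applied locally: the parameters are fixed at deployment time,
// so inference works with no network connection and a near-instant response.
function inferAtEdge(reading: Reading): "normal" | "alert" {
  const score = 0.04 * reading.heartRate - 4.5; // pretend trained parameters
  return score > 0 ? "alert" : "normal";
}

// Non-time-critical path: batch readings back to the cloud for heavier analysis
// and longer-term insight, only when a connection is available.
async function syncToCloud(batch: Reading[]): Promise<void> {
  try {
    await fetch("https://example.com/api/readings", { // hypothetical endpoint
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify(batch),
    });
  } catch {
    // Offline or unreachable: keep the batch and retry later rather than block the device.
  }
}

const reading: Reading = { heartRate: 142, timestamp: Date.now() };
console.log(`edge decision: ${inferAtEdge(reading)}`); // immediate, local
void syncToCloud([reading]);                           // deferred, best-effort
```

The design point is simply that the latency-sensitive decision never waits on the network, while the data still reaches the cloud for deeper analysis when a connection is available.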
Striking a balance

As businesses and their customers increasingly embrace highly connected technologies in business and at home, it is inevitable that AI operations will increasingly move to the edge. It is clear to see that intelligent device footprints are ever decreasing, while their capabilities grow. However, whilst it may be tempting to embrace AI-charged edge computing to the max, it is critical to ensure that important data is not locked in the end-point device. There needs to be connectivity for data sharing across the core business ecosystem to ensure that data continues to empower business decision making and add value to an organisation's future direction.

### AI can help finance teams focus on strategy

The ongoing Covid-19 situation has led many businesses in the UK and across the world to take a step back and rethink their processes, infrastructure and strategy. It has had an unprecedented impact on every sector, but it has also given businesses food for thought. The question the pandemic has posed to business decision makers is, 'can you adapt strategy quickly enough for the business to operate when faced with a significant challenge?' In short, are you agile?

One area of business that has played a key role since the outbreak of Covid-19 is the finance team. They have had to shock-proof their businesses and inform business leaders on an almost day-by-day basis of what the figures show, as revenue takes a significant hit. Yet if they have been restrained by outdated software or left without access to the right tools, it will have made their role even more challenging. So, should decision makers rethink the financial tools their finance team has at their disposal?

Shifting focus

During the past decade, the number of tools advanced technology has brought to finance teams' armoury has increased significantly, with many of these proving to have real benefits in the current situation. Some businesses were already adopting digital tax software (accelerated by the introduction of MTD), cloud solutions, mobile accounting and artificial intelligence (AI) - which in the coming years could have the biggest impact. AI could in fact see a shift in how finance teams work. The big four accounting firms were early adopters, and some of them have already developed their own AI solutions. Other businesses are following suit thanks to finance management software providers developing AI tools, or updates to their offerings, that allow finance teams to automate once time-consuming tasks.

AI isn't a new phenomenon, but it's still important to highlight just how big a difference it can make to the productivity, quality and accuracy of accountants' work - not to mention that it will enable finance teams to move away from mundane 'enter and repeat' style tasks towards a more advice-led, analytical, consultancy-based approach. This will have been their top priority in recent weeks; with AI, it can be their only priority.

Illuminating the signs

One of the key benefits of AI in practice is that it can make long-winded, extremely detailed tasks much quicker and make the data much more accurate. An example is auditing, whether that's an external or internal audit. In theory, AI should provide finance leaders with the tools to help ensure the longevity of a business, or at least put the signs of risk in big luminous letters.
Through algorithms, AI can process vast amounts of data and pull out misstatements or risk transactions that directors and key stakeholders need to know about in order to take action. Usually a team of auditors will pull out samples to spot unusual or potentially harmful transactions to a business backed up by other procedures to improve the accuracy of the audit. But it still brings with it a level of risk that the data sets miss the danger signs. AI can remove the risk (of missing the risks) and human error, enabling internal and external auditors to use their time analysing what AI has found in order to inform the entire business. Ultimately, protecting it and helping it to grow.   Lifting the weight It’s not just accuracy that AI has a positive impact on. Productivity can be vastly improved as AI helps to reduce the sheer amount of processes and paperwork they have to work through. AI can handle invoices, analyse expense reports and process POs, all through tools like machine learning, natural language processing and cognitive computing. These can all be integrated as part of a finance team’s ERP software. It frees up the team to spend much more of their time analysing data and advising the business, making it more likely that they spot potential risk or unusual activity. According to a global report from the Association of Certified Fraud Examiners, 13 per cent of organisations currently use AI or machine learning to help fight fraud. It gives finance leaders the power to not only spot this kind of activity in the data but also identify where it may occur in future. And with cyber-crime continuing to increase in frequency and complexity, it’s important that finance teams play their part in helping to minimise its impact through the use of advanced technologies.   Focus on strategy With the extra headspace tools like AI bring to a finance team, it means finance professionals can make sure their department is advice and consultancy led. In a situation like the one we are facing now, where directors and business owners want to know what their financial data tells them at every step to help guide current and future strategy, the right tools can make all the difference.[/vc_column_text][/vc_column][/vc_row] ### How Will Your Workspace be Different in The Future? [vc_row][vc_column][vc_column_text]With many workforces now up and running remotely, senior teams are debating what work will look like in a month or even a year? Returning to ‘normal’ may at first glance be the ‘sunny uplands’ we’ve been dreaming of, but should we return to our old ways? It’s likely we will take many lessons from the pandemic. For some, this will include re-evaluating their outlook on life and the way they work. So, how can we address our work-life balance better, and how should businesses adapt and change post-pandemic? There is already strong research that demonstrates people are more productive when they have a better work-life balance. Research by YouGov in 2019 revealed that a fifth of HR managers believed that staff work to a “slightly higher” standard at home than they do in the office, and a further 7 percent believe they worked to a “much higher” standard. In a Stanford study which monitored 16,000 homeworking employees over several months, they saw a 13 percent performance increase, including more minutes, worked and more work done per minute. Now companies are finding out for themselves the benefits and the problems of remote working. From our experience, the benefits far outweigh the difficulties. 
Although we don’t have hard stats yet around the increases in productivity we’re experiencing; there is strong anecdotal evidence that businesses have seen good increases in productivity. The Stanford study also found that home working leads to 50 percent lower employee attrition, and we know those employees who can work from home are 52 percent less likely to take time off work. Combine these employee wins with the reduction in office overheads, rents and travel expenses and the reasons for continuing remote working look good. And the positives in terms of reducing your carbon footprint should not be overlooked either. Remote working has not come without its challenges – from having to work whilst home-schooling or working from less than favourable home situations like the end of your bed. However, once children are back to school and with the right support to create suitable ‘workstations’ within homes and the ability to connect with colleagues face-to-face at times, there is much to be gained from continuing working remotely, and Ultima is looking at continuing to do so after the pandemic. We have looked at structuring ourselves into two parts – with one part of the company working from home 50 percent of the time and rotating this or having a drop-in centre or office hub that staff can go to for face-to-face time on an ad hoc basis.   Old versus new Many boards are grappling with what the new normal should look like for their companies. It’s all up for debate, with the modernists challenging the need to return to the office. I’ve found that my team is more productive. Some are still working at 7pm to complete tasks that need to be done – but then they have the flexibility to do personal things they have to do at other times of the day (which is mostly home-schooling now). I want us all to be striving for an excellent work-life balance. As a company, we’ve seen massive benefits from remote working. Collaboration tools have seen our team and interdepartmental communication improve with greater visibility of concerns and issues across the business. For example, IT has found where once people might have asked for help verbally, they now log a problem, so the team has better clarity of all queries arising, and our CEO is still able to hold his ‘coffee catch-ups’ with all staff. Cisco Webex and Microsoft Teams are enterprise-grade collaboration tools that work – without them, we would not have been so successful in our remote working.   Technology that enables As they haven’t done it before, empowering departments like finance and HR to work remotely has been more challenging. We needed to educate less technical individuals on how to access and use the technology remotely – from how to connect their devices to Wi-Fi to how to use the increased security measures in place to access relevant business systems. We’re looking at a new learning management system that will address the issues of user awareness that home working is creating. Some enterprises have been surprised that today’s technology can support all their staff working remotely. Systems aren’t falling over as 70,000+ global employees access collaboration tools and business systems. And the use of the latest desktop technology like Citrix Zen means that businesses can still use various legacy business applications in the cloud, enabling them, for example, to use finance and CRM applications as usual. The latest cloud and automation technology have made remote working for all departments a reality. 
By automating the hefty maintenance, security and support requirements of the cloud, organisations can now migrate their critical business apps to the cloud with all the management and security issues addressed much more effectively. The automation technology currently available is set to change the market by simplifying cloud ownership and operations. It finally makes the cloud an option for businesses of all types and sizes.   Further change required It’s been a knee jerk reaction to get people working remotely, but we need more change if we want people to work remotely permanently. Technology companies need to offer end-to-end customer-focused management – where the company’s end users are the customers, and not the IT or procurement departments or technical heads. We need to look at it from the perspective of the individual customer and make it a good experience for them. A fully managed service can support you working from any location – even the beach. There will, of course, always be a point at which hardware fails so companies need to think about innovative service offers to solve these issues. Where companies once signed-up to complex, inflexible contracts, IT needs to be provided on a flexible, consumption-based billing model. As people have been furloughed and companies need to keep cash in their businesses, we have had to be flexible in our billing, not charging for services that aren’t currently being used. If a second phase of the pandemic comes, then this pay-as-you-go model will be even more critical for businesses to survive. One-off large procurement will be a thing of the past in terms of tech investment as businesses move to a cloud-based SaaS model. Procurement departments will need to be reshaped, as will IT departments as they support and enable their users in a different way. Automation and remote working will see businesses shift to a cloud-based user model where you effectively pay for what you consume.   Personal wellbeing Technology aside, the key to enabling permanent remote working is good personal wellbeing in the home workspace. Businesses will need to give staff the means to work effectively from home – from offering the right desks, chairs and potentially clever workstations that can be installed in homes to make working from home comfortable and unobtrusive. And we will need to offer some form of personal face-to-face time too to ensure good mental health for remote workforces. The pandemic has seen changes to the way we work which would have taken five or ten years to embrace. It’s shown companies that the technology for remote working works - employees can be more productive, and businesses operate successfully. The pandemic has forced digital transformation on a global scale, post-pandemic we mustn’t lose the benefits of this change.[/vc_column_text][/vc_column][/vc_row] ### Safeguarding cyber health in uncertain times [vc_row][vc_column][vc_column_text]The outbreak of the coronavirus pandemic has triggered the greatest upheaval of modern life in generations. In response to the pandemic, governments have enforced lockdowns to place limitations on our movement and reduce the risk of infection. Many businesses have been forced to swiftly transition to remote working models as a result. Such rapid change has created a wealth of new opportunities for cyber criminals to exploit vulnerabilities in a company’s cyber defence system. 
Whilst this move is challenging for organisations of all sizes, SMEs are particularly exposed as only 46 per cent report having a formal cyber security policy in place. At this time of heightened risk, it is crucial that cyber security is front of mind for businesses of all sizes and they take steps to protect their cyber health. Risks of going remote Whilst remote working has been a necessary measure in the fight against COVID-19, it has denuded several layers of security normally provided by the office environment. Many employees are now working across multiple devices, including personal ones connected to shared networks. They have become reliant on video conferencing and collaborative platforms to share sensitive communications. Home WiFi networks frequently use basic or factory-standard passwords which are more vulnerable to hacking, whilst shared network environments open the possibility of multiple unprotected connections. This risk is heightened by the strain that such a sudden influx of remote endpoints can cause to a company’s infrastructure. Cyber criminals, aware of these vulnerabilities, are using them as fresh avenues for exploitation and extortion. In March alone, there was a reported 400 per cent rise in cyber attacks globally. Spotlight on SMEs With remote working causing unexpected strain upon a company’s cyber defence system, SMEs in particular may find their cyber security protocols are no longer capable of supporting and protecting their operations. Firstly, to address this growing problem, businesses should conduct a thorough vulnerability assessment. This will show where the business stands in terms of their cyber security exposure by assessing the current state of their security infrastructure. This will inform of the necessary steps required to secure their data. The first step is to ensure the use of a Virtual Private Network (VPN), which is a highly effective way to enhance digital security. A VPN establishes a secure, private connection between the organisation’s servers and the home and encrypts all the data sent between the points. This ensures the integrity of the data and that the connection is bona fide. The added advantage is that the connections can be disabled quite easily and so if there is a network problem, endpoints can be isolated and removed from the equation. Endpoint Detection and Response (EDR) software should be implemented in tandem with the VPN. EDR software is a relatively simple and cost-efficient way of uncovering and being alerted to potential cyber attacks. Companies with EDR software installed can feel safe in the knowledge that their network perimeter is safeguarded against threats. An added benefit of this measure is that IT teams will have greater capacity to handle commonplace IT problems. The weakest link Whilst this is a solid starting point, the greatest weakness in a company’s digital armour is often its own employees. More often than not, a cyber security breach is the result of human error or phishing attacks, which rely on ignorance or naivety for success. Usually taking the form of an email, they can be constructed swiftly and distributed widely within seconds. Such emails deploy social engineering tactics to manipulate the recipient into engaging with links or downloads that contain damaging code. This was already a major issue before the coronavirus pandemic and in 2019, 80 per cent of businesses were targeted. 
Now, cyber criminals are harnessing the confusion of the current climate and posing as official bodies such as the World Health Organisation (WHO), the UK government and the NHS. By offering financial aid, relief or information about coronavirus, scammers lure employees into downloading malicious software or revealing sensitive data such as login credentials and credit card details. The fallout from such an attack can be financially devastating and cause lasting reputational damage. A key part of mitigating this risk is equipping employees with the knowledge and tools to protect themselves, and the business. Robust and comprehensive training empowers employees to spot, report and remove suspicious emails, protecting their company’s digital integrity. Most recently, the Government’s National Cyber Security Centre (NCSC) launched a suspicious email reporting system. This, combined with a strong understanding of the company’s cyber security policies and escalation procedures, will foster the growth of an organic internal security system. Taking charge Yet these simple measures are not always implemented. Despite the growing risk and dangerous outcomes, a Make UK survey revealed that one in three businesses do not provide any form of cyber safety training to their employees and 50 per cent lack the means of tracking their security infrastructure. Cyber security is not being taken seriously by board members. This is particularly prevalent in SMEs where only 38 per cent have board members or trustees responsible for cyber security. With the move to remote working and increased reliance on digital structures for communication, operations and financial transactions, it is paramount that cyber security is a top priority for senior management and the board. Just one compromised account could bring productivity and operations to a halt. When this happens, it is the board who will be held accountable. The fact is, the average cost of cyber incidents for a UK company is $243,000. Given the current reliance on digital infrastructure, this cost would likely be even higher today. The question for board members then, is whether this is a risk they can afford to take. It is critical that SMEs fully consider the steps outlined above and implement them in order to establish appropriate cyber security awareness and protect themselves from the risk of a breach. The move to home working has raised concerns about our digital health, as criminals ramp up operations to exploit the vulnerabilities exposed by remote working. Network insecurities and phishing attacks are preventable threats, but they need to be treated with the seriousness that they warrant. In doing so, businesses protect their cyber health and safeguard their future.[/vc_column_text][/vc_column][/vc_row] ### How to smarten up your video background at home [vc_row][vc_column][vc_column_text]We’ve all heard the expression that first impressions last. So, what if you “meet” someone on video for the first time? How can you make a good impression then, and is the impression of equal importance? Of course, it is. What people see when they watch that little video frame, is all they know about you. And yes, the environmental impression lasts on video too. In this article, I’ll share a few simple ideas on how you can smarten up your video background at home. We’re living in a time of E-meetings and Zoom calls, Skype and Facebook lives have replaced our meeting rooms and coffee shops. 
Conferences, presentations and even networking events are now taking place in your own living-room and people you’ve never met are suddenly invited to see what your private bookcase and kitchen looks like. If you’re in business and you’re representing a brand, you should of course make an effort to scrub up a bit, and this includes your environment too. How do you smarten up the background then? DECLUTTER Decluttering is the first and simplest way to create a better background. If you have toys, food leftovers or even just messy stacks of paper and books behind you, people are going to read this as your personal behaviour and think you’re messy and disorganised. So, declutter, just remove piles of things to the side of the video frame and please take your drying pants off the radiator behind you! Remember that people will only see what’s in the frame, so anything behind the camera will remain secret. A neutral, clear background won’t give a bad impression, but if you try too hard or have the wrong things visible, it might be more damaging than you think. Of course, any branded products that stand out should be removed, such as known fizzy drinks or even plastic cups and water bottles, as they can be an eyesore for environmentalists. BLACKOUT BLIND An easy way to create a professional backdrop, is to get a blackout roller blind. You could fit it on two screw hooks in the ceiling and you’ve always got a neutral background that you can pull down. They of course come in all colours so you can choose something neutral – or why not something in line with your brand! Avoid really strong colours though, as they could get quite overpowering and also reflect colour on your skin, which isn’t ideal. Beware that you need light in your face though, so if you don’t have any video lights, you need to fit the roller blind so you’re facing a window with light coming in when you’re filming. PAPER ROLL AND STAND This is a more advanced way to get the professional look, but you can order studio backdrops from around £40 online. You’d need to fit it on a rope from the ceiling, or you can buy a backdrop stand online. Make sure it’s wide enough to cover the whole frame, video is wider than a still photo, the crop is 16:9, which is the new HD format. GREEN SCREEN If you’re new to video, I would say – no, don’t try green screen. Working with a so called chroma-key background is what many people do to make it look like they’re in a fancy office or on a tropical island. This is what they use in the big films to create exciting scenery in a studio. You can order the type of green colour backdrop you need online, but it’s quite tricky to light it so it looks realistic, so unless you’re quite techy and prepared to experiment a bit, I’d say, don’t go down that route. ZOOM BACKGROUNDS You may have noticed on some Zoom calls already, people have really nice backgrounds, but their faces look like they’re half invisible aliens. This is again what happens when you try and experiment with backgrounds that don’t actually exist, without having the correct setup. However, if you have a plain neutral wall behind you and good light, this option can work. PROPS I have a specific place I go to when I do my video calls from home. I’ve simply painted a wall dark blue and added some plants, a comfortable armchair with a lovely cushion and put some fancy candles on a shelf behind me. People always ask where I am, if it’s a conservatory or if I’m in Morocco. It’s a corner of my living room. 
It’s great, because I know it’s always ready for action, it’s inviting and makes me feel more comfortable in front of the camera too. No matter what, make sure you’re in line with your brand values to give the best impression you can.[/vc_column_text][/vc_column][/vc_row] ### ‘Tiny AI’: the next big thing in intellectual property? [vc_row][vc_column][vc_column_text]Advances in AI have changed the world in the last decade, whether indirect consumer-facing services like Amazon’s ‘Alexa’ or Apple’s ‘Siri’ which have become widespread, or behind the scenes with image processing; less glamorous inference; data processing; or control applications. Whilst large scale necessarily cloud anchored applications will be here for the foreseeable future, cloud processing of AI requests places heavy demands on communications networks. Regardless of how communication technology advances, there will always be some issues of latency and reliability of connection.  In a mobile world and for real-time safety-critical applications - with autonomous vehicles being one notable application - this can still be an important practical barrier to greater adoption. The world is, unfortunately, experiencing the effects of the Covid-19 crisis at the time of writing, which has, among other things, greatly reduced human travel and transport energy demands.  Data still travels, increasingly so, and a rising concern is the proportion of energy consumed by computing and communications.  Although data obviously costs a lot less energy to transport than a person, it still has a finite energy and infrastructure cost.  Rapidly growing individual streams of individual raw data from myriad mobile consumer applications of AI to and from data centres will hit bottlenecks.  ‘Locally sourced’ intelligence can mitigate this. The historical shift between cloud versus local or edge and mixed processing in other areas of computing will be familiar to most readers.  You wouldn’t sensibly post a letter from the UK to California just to ask a question that the person at the post office counter could easily have answered for you.  Using a data centre a thousand miles away to identify a smile could be compared to using a sledgehammer to crack a nut - even if you still need to use a cloud solution for part of the application.  For many AI applications, an acceptable result can be obtained without needing the resources of a data centre, if a suitably trained processor is used locally.  This can avoid, or at least reduce, communication demands and data centre processing demands - thereby making more applications more widely available. A lot has been said and written about patent protection for AI.  Speaking from my personal perspective as a patent attorney working in this area, I have frankly seen disproportionate hype around not much more than application of known techniques to a new data set, whereas some of the more interesting development (to me personally at least) is not really protectable for various reasons outside the scope of this article.  That is not to say there is no scope for valuable innovation in the area of ‘conventional’ AI, but do not expect every application to be protectable, and be prepared to engage creatively with someone to root out what is commercially worthwhile protecting. That being said, I do see plenty of scope for interesting and protectable development in making tiny AI work well.  There are of course numerous applications which do work locally, but AI feeds on data.  
Having plenty of data to train on is important, so gathering available new input to grow that dataset is an important part of bootstrapping knowledge and effectiveness. This does not readily lend itself to individual smaller processors working on their own datasets. Taking the right aspects of what can be done in the cloud, choosing to do them locally, and then managing the interactions between local and cloud components still presents a lot of unsolved problems. Getting the most out of all the data available to the local device’s eyes or ears without simply shipping all of it back to base, and keeping distributed processors efficiently up to date with the latest intelligence, is likely to play an important part in next-generation consumer AI applications.
I see AI as simply following the cycle of other industrial developments, a step or two behind other industries: we are now looking at improving energy efficiency; miniaturisation; data supply chain logistics; and security (which will be an issue - e.g. rogue data corrupting training). There is plenty of scope for innovation here as in other industries.
The backbone technology of AI is potentially valuable, and I predict numerous new companies will emerge to capitalise on niche areas, intellectual property (IP) being a key aspect of their strategy in order to maximise value and avoid simply being swallowed up or bypassed (and they may well be acquired based on that value). Companies working in AI often have a lot of smart people working on getting the product working well as a priority (for understandable reasons), but some give a lower priority to looking ahead and actively maximising ultimate value realisation. Creatively approaching the IP strategy is one component. It is always satisfying to see a company that has developed a great product do well, and frustrating to see one that did OK but could have done better. The key here would be for more companies to capitalise fully on the advancements they bring in giving AI more widespread and sustainable applications.
### 5 ways to build a strong company culture across remote teams
Adapting to remote working can be a challenging transition for many companies. While some employees may slot straight into a routine where their productivity is soaring, others can struggle to motivate themselves without a team of coworkers around them. For those considering remote working for the long term, a healthy company culture is imperative. Company culture impacts how employees enjoy and experience working at a firm, how their voice is heard, the work ethic they possess and the environment of trust which the company can develop. Company culture influences the results of everyone in a company, from the C-Suite to the people on the ground. So, to run a business successfully through remote teams, a strong company culture must first be established.
Listen to what your employees need
Building a strong company culture across remote teams first requires an understanding of what those teams need from management. Employee mental health can suffer from the loneliness and change that remote working can introduce. Companies must, therefore, take the time to understand what their team requires and how the business can adapt to meet this need. Some employees may prefer to work independently without constant encouragement, whereas others prefer to have tasks delegated to them on a regular basis.
By understanding what their specific team requires and how each member operates, companies can mould their strategy to suit the wellbeing of each employee.
Communicate your company culture
According to Deloitte, only 28% of executives understand their organisation’s culture. Understanding this is essential with remote teams, as articulating the culture of a company so that employees can grasp it becomes even more challenging. Firms must document what they envision for the company culture and ensure every individual in the business is clearly aware of it. Each detail should be included, from the expectations of the firm and their performance measures, to the values and the vision which the directors have laid out for their business. To continue growing a positive company culture whilst a team is working remotely, the values and vision of the firm must be constantly evaluated. Employees should be kept up to date with any changes in the details so they feel equally a part of the strategy. Regular updates and clear messaging are key - management should avoid mixed signals at all costs, since these can confuse employees and have a detrimental effect on morale.
Encourage your employees to communicate
Employees can lose the communication they had with other team members when they begin remote working. The initial adjustment period can be hard as teams go from seeing each other physically and casually chatting about their weekends, television shows or work, to needing to do all of this remotely. However, this does not have to impact the team spirit in a firm. To maintain a positive company culture, management should encourage employees to communicate virtually. Team meetings and collaborative work can still take place, and video calls work perfectly for this. Video calls allow colleagues to communicate verbally whilst still observing the physical cues they would witness in person. These calls also benefit companies since they require no travel time, allowing employees to be in the comfort of their own space.
Check on your furloughed staff
In the current pandemic, it is not uncommon for businesses to have furloughed a portion of their employees. When thinking of company culture, management must remember those who are not currently contributing to the workload but are still very much a part of the team. Taking the time to check in on furloughed staff is invaluable. The experience of being furloughed can be traumatic for some as they face financial worries alongside potential dips in their mental health. Knowing their team is supporting them can be a significant help in the journey which furloughed employees are facing.
Continue to organise social events
Social events are a tried and tested method of boosting staff morale. These events can aid in moulding a company culture, demonstrating the firm is a fun and friendly environment to work in. Friendships and bonds are often formed outside of the office, which can lead to boosts in innovation and efficiency, ultimately benefiting the company. Since remote working can challenge the practicality of organising social events, management should take the time to plan virtual social events for their team. Optional events often work better, so employees do not feel obliged to attend, with quizzes or virtual games particularly popular in the UK. The key to cultivating a remote company culture lies within these five pillars, and companies can really take the opportunity to present the values of the firm to employees through each interaction.
With a strong culture to guide employees, firms can put themselves in a strong position for growth and use remote working to their advantage.
### Digital Asset Management: Cloud vs On-Premises
Digital Asset Management (DAM) enables you to standardise the usage of digital assets. You can do that by creating practices that define how assets are used, modified, published, stored, collected, and shared. To support and automate your management processes, you can use DAM tools. You can deploy DAM on-premises, or you can use cloud-based DAM software. If you are unfamiliar with the differences between cloud and on-prem DAM, you can take a look at the extensive list of pros and cons below.
What Is Digital Asset Management (DAM)?
Digital asset management can be used to refer to:
- A set of practices and policies that determine how digital assets are managed, either in-house or by third-party providers.
- A set of tools that are used to enable and support the management of digital assets.
Often, DAM is used to refer to some combination of the above two definitions. This is because it is difficult to separate the practices and policies governing management from the tools those policies are implemented with.
Types of Digital Asset Management Systems
Digital asset management systems have come a long way from being simple cataloguing or editing tools. Modern systems enable you to incorporate most aspects of digital management into a single platform, including organisation, licensing, versioning, and sharing. To suit the changing needs of businesses, DAMs have also expanded in terms of implementation. When you’re deploying a DAM system you have three options to choose from:
- On-premises - self-hosted on company servers and fully managed and maintained by in-house IT. On-premise systems typically provide greater control over your data and can grant greater opportunities for customisation.
- Software as a Service (SaaS) - cloud-based and at least partially managed by your vendor. SaaS systems typically offer less flexibility but can provide easier distributed access to assets and require less effort from in-house teams to maintain. Depending on your organisational resources, these systems may also be cheaper since you are not responsible for purchasing or maintaining host infrastructure.
- Hybrid - a system that is distributed across, or used to interface, both cloud and on-premise infrastructure. Hybrid systems are more difficult to deploy but can provide the distributed access of cloud systems with the greater control and flexibility of on-premise systems.
Cloud vs On-Premises DAM: What Is the Difference?
When comparing DAM systems, it can be difficult to choose which is the right option for you. It may seem like the distinction between deployment types isn’t clear, particularly if you aren’t the one who is responsible for deploying the system or configuring infrastructure. To help clarify the differences, keep the following in mind:
On-premise DAM solutions are software that you purchase once, rather than paying a subscription fee. Your IT team is responsible for installing the software on your organisation’s resources and performing any and all maintenance required to use the solution. This includes support, customisation, integration, and upgrades. Additionally, these systems do not scale easily, so you need to have a good idea of your resource requirements before deployment.
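To make the purchase-once versus subscription trade-off concrete before weighing the pros and cons, here is a minimal, illustrative Python sketch that compares cumulative spend over time. The licence, hardware and subscription figures are invented placeholders rather than real vendor pricing, so substitute your own quotes before drawing any conclusions.

```python
# Illustrative only: compare cumulative cost of a one-off on-premises DAM licence
# (plus hardware and maintenance) with a SaaS subscription over several years.
# Every figure below is a hypothetical placeholder, not real vendor pricing.

def on_prem_cost(years, licence=40_000, hardware=15_000, annual_maintenance=8_000):
    """Cumulative cost of an on-premises deployment after `years` years."""
    return licence + hardware + annual_maintenance * years

def saas_cost(years, monthly_fee=1_500):
    """Cumulative cost of a SaaS subscription after `years` years."""
    return monthly_fee * 12 * years

if __name__ == "__main__":
    for year in range(1, 8):
        onprem, saas = on_prem_cost(year), saas_cost(year)
        cheaper = "on-premises" if onprem < saas else "SaaS"
        print(f"Year {year}: on-premises £{onprem:,} vs SaaS £{saas:,} -> {cheaper} is cheaper so far")
```

With these made-up numbers the subscription wins for roughly the first five years and the one-off licence pulls ahead afterwards, which mirrors the "lower upfront cost" versus "high upfront cost" trade-off in the lists that follow.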
Generally, on-premise solutions are best for organisations that:
- Require system customisation, such as security or integrations
- Have data that falls under compliance restrictions
- Have well-established and experienced IT
- Already own infrastructure for hosting
SaaS DAM solutions are services that you subscribe to with a monthly or yearly fee. These services are cloud-based and the provider is responsible for system hosting and maintenance. You can configure the settings of a SaaS DAM system but typically cannot customise operations and are limited to integrations supported by the provider.
Generally, cloud-based solutions are best for organisations that:
- Prioritise startup ease over customisability
- Use remote teams or require distributed access
- Are growing or foresee the need to scale in the near future
- Want managed support to supplement or outsource IT
Cloud-Based DAM Pros and Cons
If you think a cloud-based DAM might be the right solution for you, consider the following pros and cons:
Pros:
- Lower upfront cost: Since cloud-based systems are scalable, you only pay for the resources you’re using. This also means you can avoid the costs of purchasing hosting hardware or the associated costs of housing or maintenance.
- Mobile-ready: Cloud-based solutions are designed to be remotely accessible. This often means that solutions are more compatible with mobile devices such as smartphones or tablets.
- Security: Cloud-based services manage some aspects of security for you. For those aspects they don’t manage, they typically offer tooling and expertise for securing your data. Unless you have high-level security experts in-house, vendors are likely to have more security expertise than you.
Cons:
- Vendor lock-in: Solutions may use proprietary technologies or formats that make it difficult to export or migrate files in the future. This can leave you reliant on the solution and vulnerable to price or service changes.
- Network performance: Since services are Internet accessible and not available on-premises, access is determined by your Internet connection. If you have a slow connection or lose connectivity, your productivity and system performance suffer.
- Shared resources: Cloud-based solutions typically have multiple clients share servers. While your data should be isolated from these other tenants, their resource demands may negatively affect your system performance. Additionally, if a neighbouring tenant fails to secure their resources, the vulnerability has the potential to affect your security as well.
On-Premises DAM Pros and Cons
If you think an on-premises DAM may be the right solution for you, consider the following pros and cons:
Pros:
- Total control: On-premise systems offer total control over implementation, configuration, data, and security. There is no intermediary and you can choose exactly how your assets are handled and managed.
- Local accessibility and latency: On-premise systems aren’t restricted by your Internet connection. This means that data remains accessible even if connectivity drops. Additionally, because data is hosted on-premises, access speeds don’t rely on bandwidth and request latency is often lower.
- Regulatory compliance: On-premises systems make it easier to meet regulatory compliance since you maintain the auditability of your data. You don’t have to worry about where data is stored or who is accessing it since you have full control.
Cons:
- High upfront cost: While long-term costs may be less, upfront costs are typically more. This is especially true if you don’t already have the infrastructure needed to host a DAM.
- Long deployment time: Setting up an on-premises DAM often takes more time than a cloud-based one because you have limited deployment support. Where cloud-based DAM providers can help you configure your system, an on-premises system requires you to start from scratch.
- Support and maintenance: Any customisations you make to your DAM require integration and ongoing maintenance. This means your IT is responsible for managing updates for any associated applications and for all support needed.
Conclusion
DAM systems can help you ensure you have control and visibility of your digital assets. However, different DAM environments provide different capabilities. Sometimes, there’s an overlap between on-prem and cloud offerings, but most solutions provide different capabilities per environment. In addition to reviewing this article, you should also carefully read official documentation and (when possible) consult with software representatives. They will be able to further guide you through the process of choosing the tooling that best suits your needs. Specifically, be sure to match DAM capabilities with your existing tooling stack and available talent skillsets.
### The Future of Payments
Britain was on track to become a cashless society 'within the next 10 years’ at the beginning of 2020. However, there was undeniably one event in 2020 that will accelerate that process. The Coronavirus pandemic saw UK cash usage decline by 50% in just a few days. This all began after the World Health Organisation advised consumers to avoid handling banknotes and switch to contactless payments. The WHO later said the advice was not a formal warning, but more of a recommendation. This didn’t seem to matter though, as soon after, some countries, including the UK, began raising their contactless payments limits. Consumers’ attitudes towards money have already changed, so what will payments of the future look like?
More Ways to Pay on Mobile!
Sweden is possibly the closest example we have to a cashless society. It’s likely that they will become the first, as only 13% of their total population relies on cash. Swish is driving Sweden into a cashless society with ease. More than 6.9 million people use the mobile app to send and receive money. The app facilitates mobile transactions between individuals and businesses. It enables instant payments to be made between two parties. This is done via the mobile app that is connected to the user’s bank account. The bank account is tied to a “BankID Säkerhetsapp”, which is an electronic identification system used by lenders across the country. However, Swedish police say that Swish is a big risk for money laundering. The lack of supervision means that it’s a preferred means for such activities. Unlike banks, apps like Swish do not have to report suspicious transaction deviations to the authorities. Much like cryptocurrency, some may prefer to use a payment method that the authorities have no supervision over. But there is no denying that this does pose new risks. It also means that Swedish banks have even more data on the spending habits of their users. The amount of data and personal information apps like Swish would hold in much larger countries could put them at risk.
Less Contact During Payments
It’s likely that we’ll have fewer touch screens after the pandemic, as we have all become hyper-aware of touching surfaces. This may open a gap for voice interfaces, machine vision interfaces and facial recognition to be introduced at a greater rate. We have contactless payments, but these could still be just too much contact. They often require you to enter a PIN for transactions that are more costly. With an increase in people wanting to limit what they touch, an option to pay for goods and services that does not require any physical contact is likely to be in demand.
This could mean looking further into technology such as machine vision interfaces. Machine vision interfaces are already used within society, most recognisably in applying social media filters. But this could present an opportunity to offer an autonomous checkout process at some stores. Autonomous checkout stores allow consumers to walk in, pick up the items they want, and stroll out again. This has been seen with the Amazon Go app, where there are no cashiers, no checkout, and no lines. To achieve a seamless and easy transition to autonomous checkout technology, retailers would need to invest massively in their own IT infrastructure and systems. Not to mention the challenge of choosing the right partner to develop their autonomous checkout process.
There is room here to develop current technology that recognises faces and gestures. This method would almost remove the need for any physical contact. Facial recognition technology alone would not be sufficient. It uses a database of photos to identify people by using biometrics to map facial features. There are of course security concerns around using this method. Biometric payment terminals in China require companies to separately store and encrypt facial image data, bank details and other personal information. Merchants receiving the transaction are also not allowed to retain the facial image information. A major problem for the use of facial recognition is the fact that the technology doesn’t provide an all-round experience just yet. Verification could not be solely based on the facial signature, especially for payments that are classed as high risk. These would need multi-factor authentication, and if institutions fail to properly verify identification, they would need to have a compensation mechanism. For an all-round experience, the compensation mechanism would also need to be completely contactless. As it stands, there is a demand for major advancements in technology in order to really pioneer in the sector of contact-free payments. A reactive approach to consumers’ attitudes towards cash post-lockdown will pave the way for the future of payments.
### Database management: Overcoming the duct tape and bubblegum challenge
Companies need their data to work for them and support their business, but many struggle to get their database environments right. Existing approaches to managing data and databases are often haphazard and informal – the digital equivalent of duct tape and bubblegum. While this might work for MacGyver, he only had to get to the end of an episode; he didn’t need to live with his creations long-term. Rather than managing their databases properly, many firms are simply taping them together, cutting corners, and failing to structure them for growth or cost savings.
What is your data problem?
Databases today have to support complex and fluid environments, with a lot more scale than in the past. This often involves multiple database technologies used in one stack. Businesses are no longer solely running Oracle or SQL Server; instead, they have to be “everything shops.” According to our Open Source Data Management Software Survey, over 90% of organisations have more than one database technology in their work environment, and 85% use more than one open source database technology. 89% of respondents have more than one OSS database, with 43% running both PostgreSQL and a variant of MySQL. Meanwhile, 54% are running some purpose-built NoSQL database, and 73% are running a relational database as well as a NoSQL purpose-built database.
This creates a major challenge: firms are managing multiple database technologies, but lack the skill set to run them all efficiently. In addition, with developers taking a more active role in technology stacks and growing the environment, there are often hundreds if not thousands of database installations, large and small, that need to be managed and maintained at the same time. This scattered approach to databases is already having a massive impact. Downtime is prevalent and many firms have to take the costly step of updating their hardware to keep up. There is another problem with scale: the bigger the organisation, the more problems you discover. When there are a large number of developer groups, and teams are stretched, firms may not have the specialist people they need to support their environments. Personnel-wise, companies have shifted away from employing database administrators, focusing instead on more generalist staff.
At the same time, you don’t have to look that far to see firms having massive outages. Database breaches or outages are widespread – and often caused by issues that could be easily resolved. For example, data can be leaked because of configuration issues such as forgetting to set a password. Or someone can make a small configuration change, test it on one server, but accidentally apply it to all servers in the field.
Taking a more structured approach
So, what changes should firms make to achieve the cost savings and growth they are striving for? First, it is important to acknowledge that in the current environment, companies should understand their workload and put processes in place to iterate around environments over time. One challenge in the modern IT space is that usage of apps can change drastically as user numbers grow rapidly over time. From a technical perspective this means that the set-up that worked six months ago won’t work today. To resolve this, it is important to prioritise which apps and databases are the most critical. At the same time, less important apps can be dealt with by additional automation. It is impossible to develop an app without planning for it to be cloud native – so people should ensure their databases run like apps and can scale up and down. Alongside all this, it is important to standardise. All tooling should be in one place, and everyone involved in supporting the applications should acknowledge how app workloads are changing and ensure things are set up to run effectively. There are methods and tools available to help with data backup and recovery management that can replay workloads and ensure systems can fail over properly.
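As a minimal illustration of the kind of guard-rail that prevents the “small change accidentally applied to every server” failure described above, the Python sketch below stages a configuration change through a canary group before touching the rest of the fleet. The apply_config and is_healthy functions, the server names and the max_connections setting are all hypothetical stand-ins for whatever fleet-management and monitoring tooling an organisation actually uses.

```python
# Illustrative sketch of a staged (canary) rollout for a database configuration
# change. apply_config and is_healthy are hypothetical stand-ins for real
# automation and monitoring tooling; server names and settings are invented.
import time

def apply_config(server: str, change: dict) -> None:
    print(f"applying {change} to {server}")  # placeholder for real automation

def is_healthy(server: str) -> bool:
    return True  # placeholder for a real health/metrics check

def staged_rollout(servers, change, canary_count=1, soak_seconds=300):
    """Apply `change` to a small canary group first; abort if any canary degrades."""
    canaries, rest = servers[:canary_count], servers[canary_count:]
    for server in canaries:
        apply_config(server, change)
    time.sleep(soak_seconds)  # let the change soak before judging it
    if not all(is_healthy(s) for s in canaries):
        raise RuntimeError("Canary unhealthy: aborting rollout before it reaches the fleet")
    for server in rest:  # only now touch the rest of the fleet
        apply_config(server, change)

staged_rollout(["db-eu-1", "db-eu-2", "db-us-1"], {"max_connections": 500}, soak_seconds=5)
```

The point is not the specific tooling but the process: standardising on a single, repeatable rollout path makes the “tested on one server, applied to all” mistake much harder to make.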
New ways to run applications and databases In addition to using smart technology tools, many businesses are adopting a DevOps culture with automation. But, even these firms still have outages and the applications themselves require ongoing testing. Teams are also considering tools such as Kubernetes to manage and scale environments that run in software containers using automation and orchestration. These new approaches to running applications offer greater options and allow you to more easily scale up. However, an important consideration is how you communicate the need for change to management. This is not always simple as there is rarely one single thing that will solve an organisation’s data management issues. Instead, expertise and better thought processes are needed. Cost is always a critical problem for business leaders, so using cost metrics in your approach will help convince them that change is needed. For example, firms experiencing growing numbers of users have often had to upgrade their cloud instances frequently over a one year period – for those expanding the fastest, this could happen up to six times in twelve months. Each of these instance changes represents a big jump in spending, so not preparing for growth can be more expensive in the long run. This data can be used internally to unlock investment upfront and prevent costs from skyrocketing. Additionally, skill set is key. Developers and architects are today’s decision makers around technology, but all their staff need proper training and education. This need for better skills has led to trends such as cloud and database-as-a-service. For some firms, these services are useful as things are then managed by a third party. However, there is a layer above that which still requires attention, as cloud providers don’t tune the database instance to the workload of your app, manage potential slowdowns, or help you avoid bottlenecks. As teams take new infrastructure and deployment approaches from testing into production, those new applications will need support. It is here that detailed knowledge and skills are still required.   Making changes and planning ahead Overall, firms need a proper set-up in place and tooling to be able to scale up and – the way the economy is at the moment – scale down. Every time a new app spins up and workloads change, it is important to have tools and capabilities already in place to ensure things are ready to go from a database perspective, as well as being focused on the application side. Utilising this planned approach can remove the need for the duct tape and bubblegum that keeps systems linked together. A best-practice approach that looks at how to make the most of the potential that exists in these new systems can deliver better results. It might seem complex at first, but in the long run solving these problems early can help reduce costs, simplify management over time, and streamline performance.[/vc_column_text][/vc_column][/vc_row] ### Remote workers bombarded with 65,000 Google-branded cyber-attacks in first four months of 2020 [vc_row][vc_column][vc_column_text] Google-branded sites such as storage.googleapis.com, docs.google.com, storage.cloud.google.com, and drive.google.com have been increasingly used to trick victims into sharing login credentials, according to new insight from Barracuda Networks LONDON, 28th May 2020 – Remote workers have been targeted by up to 65,000 Google-brand impersonation attacks, according to the most recent ‘Threat Spotlight’ report from Barracuda Networks. 
This type of spear phishing scam uses branded sites to trick victims into sharing login credentials. Of the nearly 100,000 form-based attacks Barracuda detected between January 1, 2020, and April 30, 2020, Google file sharing and storage websites were used in 65 per cent of attacks. This includes storage.googleapis.com (25 per cent), docs.google.com (23 per cent), storage.cloud.google.com (13 per cent), and drive.google.com (4 per cent). In comparison, Microsoft brands were targeted in 13 per cent of attacks: onedrive.live.com (6 per cent), sway.office.com (4 per cent), and forms.office.com (3 per cent). The other sites impersonated include sendgrid.net (10 per cent), mailchimp.com (4 per cent), and formcrafts.com (2 per cent). All other sites made up 6 per cent of form-based attacks. Barracuda researchers observed that Google-brand impersonation attacks have made up 4 per cent of all spear phishing attacks in the first four months of 2020, and they expect to see this number climb as cybercriminals have success harvesting credentials.
Steve Peake, UK Systems Engineer Manager, Barracuda Networks, comments: “Brand-impersonation spear phishing attacks have always been a popular and successful method of harvesting a user’s login credentials, and with more people than ever working from home, it’s no surprise that cyber criminals are taking the opportunity to flood people’s inboxes with these scams. The sophistication of these attacks has accelerated in recent times: now, hackers can even create an online phishing form or page using the guise of legitimate services, such as forms.office.com, to trick unsuspecting users.
“Fortunately, there are ways to protect oneself against these cyber attacks, such as implementing multi-factor authentication steps on all log-in pages so that hackers will require more than just a password to gain access to your data. Other, more sophisticated methods of cyber protection include using email security software, such as API-based inbox defence, which uses artificial intelligence to detect and block attacks.”
### How channel partners can enable digital transformation success
Digital transformation is one of today’s most discussed issues in business. While it’s erroneously perceived by some as an overused industry buzzword, it is very much a reality, and is impacting all companies in one way or another - with some thriving thanks to the innovation it enables, and others still grappling with how to adopt the necessary infrastructure to support it. The COVID-19 outbreak and the measures implemented to contain it have forced a huge chunk of the globe’s population to work from home, complicating network traffic and causing a boom in the use of cloud and new applications. This has exacerbated some of the difficulties businesses encounter on the journey to digitalisation, making network visibility, monitoring and performance more crucial than ever before.
With so many businesses impacted financially, their focus has inevitably shifted away from innovation and closer to optimisation. That being said, the agility and efficiency that stem from a digitally-savvy approach can empower companies to make the most of their current resources and stay afloat in this difficult climate. Moreover, with remote working the new norm, businesses that don’t have a sound digital strategy in place and are unable to facilitate telecommuting and virtual collaboration risk seeing their productivity take a hard hit.
To successfully navigate these challenges, organisations require not only the right technology but also the right partners on their side. Crucial support at a testing time With overstretched resources and profit margins thinning, it’s no surprise companies’ aim is to work to a lean business model. This means minimising costs wherever possible. As a consequence, the pressure is on all partners and providers to prove their services are essential. While the COVID-19 pandemic has dramatically altered the commercial landscape, digitally-powered speed and efficiency remain vital for a business to stay competitive - so a partner that works as an enabler of digital innovation is a partner any company should want to keep in the long term. At the foundation of optimisation and improvement is sound network visibility. Enterprise networks are morphing into increasingly intricate beasts, spanning across on-prem, cloud and hybrid infrastructures, now further complicated by the sudden surge in remote workers. Thorough monitoring and analysis of all network traffic and tools is the only way to enhance network performance and increase business uptime - that’s why, to fast-track their customers’ digital evolution, partners should enable the implementation of network monitoring technology. Only through clear network visibility can companies gather valuable insight on system and application interdependencies; crucial to predict when an issue may occur, pinpoint opportunities for improvement and identify faster ways of working. Security also plays a key role in ensuring efficiency – Gartner predicted that 60% of digital businesses would suffer major service failures this year due to security lapses. Partners should also be considering how to help customers stay secure in these uncertain times, leveraging tools they already have. Right now, vulnerabilities are exacerbated by malicious actors looking to lure in users with information on COVID-19 with the aim of compromising their accounts. Channel partners should work to empower their customers to enhance protection with solutions that can detect, track and dismantle a hacker’s attack. Successful digital transformation is not just about adopting flashy new tools - it’s also about abandoning antiquated processes and removing blind spots in the network, so that hybrid infrastructures can stay secure, mitigating risk. While sourcing and implementing the right solutions is fundamental, partners need to go the extra mile and demonstrate their business awareness - as well as their tech knowledge - to help customers modernise their operations to ensure efficiency and security. All of these efforts combined will in turn benefit partners themselves - guiding companies on their way to digital success can make a world of difference in terms of nurturing the relationship as, if these projects are successful, customers are likely to come back for more. The role of cloud If partners are to remain successful in this uncertain time, they must act as digital specialists and provide their customers with practical advice on how to evolve and adapt to the current situation. Cloud technology has always been a focal point of digitalisation efforts - the flexibility and scalability it enables are crucial to longevity and improvement. It so happens that this technology has found new relevance in today’s commercial and workplace landscape. With around 20 million UK employees now working from home, companies will increasingly rely on cloud to facilitate file sharing and cooperation. 
As partners work more and more as managed service providers, they must take this opportunity to support their customers through cloud migration, or guide them as they increase their cloud adoption to enable productivity and efficiency for their home workers. The current economic recession is causing customer demand to fluctuate. As a consequence, businesses are having to change their approach, reassess budgets and make adjustments to their workforce to ensure resources are optimised. Cloud technology is synonymous with agility and has the potential to simplify this process and prevent companies from incurring losses. Digital transformation is something all organisations are aware they have to undergo. While many understand its importance and know they risk getting left behind if they don’t evolve, the idea of rewiring their entire strategy is intimidating to say the least, even more so in the current climate. However, COVID-19 can act as a catalyst for positive change, as more and more companies embrace virtual, cloud-powered collaboration. As improvement and optimisation are only possible with a thorough understanding of how their systems run, network visibility is critical. It’s up to channel partners to facilitate this journey and help propel their customers to digital success.[/vc_column_text][/vc_column][/vc_row] ### Where Digital Transactions Will Show Up Next [vc_row][vc_column][vc_column_text]As we progress further into the era of fintech, we’re beginning to get used to the idea of fully digitised transactions. Generally, the thinking has evolved somewhat, so much so that a digital transaction is not just one made through the internet, but rather one that takes advantage of advanced security and processing methods to provide a quick, reliable solution. In some cases, it can even mean a whole different form of currency that exists solely as a digital asset. Methods and assets of this nature are still only in the very early stages of going mainstream, however. And the following are some of the areas in which we expect to see them making an impact next. International Business Transactions Naturally, it’s in the best interest of any business with international dealings to make sure that any payments or contracts it has to handle are managed with the utmost security and efficiency. And since this is essentially the very idea of digitised transactions, it makes perfect sense that this is an area ripe for adaptation. The idea came up quite clearly in our look at Currency Cloud, a fintech company that was founded in 2012 but which is now positioned to make a major impact on modern businesses. That company provides a fast, transparent, and secure payment engine meant to transform the way businesses move money around the world. This speaks to the potential of fintech solutions more broadly — and sure enough at this point, in 2020, Currency Cloud is just one of many specific options. We’re beginning to see large businesses with international activity increasingly relying on solutions of this sort to make their transactions more reliable, as well as to earn the trust and confidence of partners and clients alike. In-Person Retail Stores In-person retail might get more attention than the rest of these areas put together when it comes to transitioning to digital payments. And there’s a chance that attention is actually somewhat unwarranted, in some sense. 
Surveys just last year indicated that most retail shops won’t go fully cashless, which is to say they’re not ready to fully support a digital payment revolution. At the same time, the fact that this is even a discussion indicates how much has changed already. Just five years ago the idea of paying for goods at a retail store by digital means was a foreign concept to most consumers. Most stores couldn’t even support such payments. Now, the same stores are adopting the technology and equipment needed to handle instantaneous, cashless, and touchless transactions from a variety of mobile services. They may still accept cash and cards, but they’re becoming more inviting stores for those customers who prefer digital payments.
Forex Trading
Forex trading, in many respects, represents a fairly ordinary financial market - with a few key distinctions. The forex market is the largest financial market in the world, for instance, and it is also one of the most accessible, with 24-hour trading on business days. Fundamentally though, it’s fairly straightforward: traders make accounts with forex brokers and input currency trades, which are then enacted for small fees. Here too though, we’re beginning to see a gradual transition toward modernised digital payments. With transactions conducted via secure digital networks, forex brokers can offer traders lower fees, faster trades, and in some cases a greater degree of security and transparency. Any full transition of the market to options like these is going to take time, but already some financial institutions supporting forex trades have reported significant savings on transaction fees. That could accelerate the process of digitisation in this specific area.
Central Bank Distribution
Maybe the most fascinating area in which digital transactions are beginning to take hold is in central banks. Institutions in China and the United States have made major headlines by introducing their own “CBDCs” (Central Bank Digital Currencies), and the Bank of England is considering a change as well. Not all CBDCs are arranged the same way, or introduced for the exact same purpose, but they are all quite literally introductions of new digital currencies. In some cases, those currencies are meant primarily for institutional use, or simplified cross-border transactions. These serve a purpose similar to the digitised international transactions for businesses we discussed above. In other cases though, central banks are in the early stages of trying to replace ordinary money with digital alternatives - which could ultimately be the most significant and comprehensive transition of all toward digital transactions.
Beyond these areas, there are also some smaller or more niche spaces in which digital payments are beginning to prevail. For instance, some sports teams are following retail’s direction and embracing contactless pay options in stadiums; some social media platforms are building in digital transaction platforms to facilitate peer-to-peer payments. And the list goes on. But the combination of fintech in business dealings, retail stores, trading platforms, and central banks makes for a fairly significant overarching shift in modern society.
### M247 partners with leading compliance firm MetaCompliance
M247 has joined forces with tech compliance firm MetaCompliance to ensure it can meet the highest standards of information security, both as an organisation and as a service provider.
M247 is a leading Manchester-based provider of international connectivity and cloud services and has witnessed the evolution of cyber threats over its 20 years in operation. As a major cloud-services provider, the company recognises that while the Cloud has had huge, transformational benefits for organisations, enhancing efficiencies and customer experiences, it has also presented a number of cyber security challenges relating to data breaches and increased regulatory requirements. These security challenges have led to an increased responsibility for businesses and organisations to implement measures for protecting the personal information they use and store. That prompted M247 to seek the support of cyber security specialist firm MetaCompliance.
This new partnership will enable M247 to further improve upon its proactive approach to security and compliance challenges, by helping to enhance its already strong policy management. It will also help the company to establish the boundaries of safe cyber security behaviour for its employees, as well as bringing enhanced security benefits to its many customers. A driving force behind the partnership was a desire among M247’s management team to strengthen in-house compliance procedures, especially in terms of enhancing policies that ensured international security standards, such as ISO 27001, were being met at every level of the business.
Recognising that businesses and organisations found to be non-compliant can face large financial penalties, government sanctions and potential lawsuits, M247 believed it was important to be able to offer increased peace of mind for its many end users and customers. Acknowledging the risks associated with inadequate policy management, the company felt the new partnership with MetaCompliance would help it achieve this peace of mind by allowing it to adopt a tool that would enable policies to be ‘automated, auditable, accessible and easily updated’.
M247 Group Information Security Manager Gary Myers said it was important for the company to partner with an organisation that was aligned to the values of ISO 27001 (which specifies the requirements for establishing, implementing, maintaining and improving an organisation’s information security management system) and which could help it transform its approach to policy management and compliance. ‘MetaCompliance has always engaged us to work closely in understanding our requirements and help us to refine this internal service offering,’ he said.
The Cloud environment in which M247 operates means businesses and organisations often have large and complex infrastructures, with vast numbers of end users that present increased risks for businesses. M247 recognised that its in-house policy enforcement had become less defined as the business had continued to grow and departments had expanded. M247 saw a partnership with MetaCompliance as an opportunity to refine policies and procedures to ensure a clear audit trail that incorporates all new colleagues and departmental roles, and which can continue to flex as the business grows. Working with MetaCompliance, who hold the same ISO 27001 certification as M247, is a clear demonstration of the company’s commitment to delivering to the international standard for information security, to providing the highest standards of compliance via approved processes, and to offering reassurance to its many hundreds of customers around the world.
‘The most tangible change [since we partnered with MetaCompliance] has been the ability to actually quantify attestation from our user base,’ said Mr Myers. ‘This has been invaluable not only to put our own minds at ease that our policies are being read and our training completed, but also provides evidence for our external auditors that we take information security seriously.’
Since the implementation of the MetaCompliance policy management software, M247 has successfully built a framework for ensuring policy and compliance form a central part of every role across the company. Policies are now conveyed clearly to employees, and compliance requirements are communicated consistently, helping create a culture of compliance throughout the business. By partnering with MetaCompliance, M247 is reinforcing its commitment to meeting the highest international standards of compliance. The strength of this new working relationship will ensure M247 can evidence staff understanding of policies and procedures, as well as target relevant policies towards the right people, in the right place, and at the right time. With this deeper, company-wide understanding of the challenges and an ability to monitor and manage key policies, the compliance culture at M247 has never been stronger.
### How to protect your organisation from cyber espionage
In a world full of geopolitical tensions, acts of cyber warfare between nation-states are becoming more commonplace. According to MI5, “the UK is a high priority espionage target. Many countries actively seek UK information and material to advance their own military, technological, political and economic programmes.” Historically, espionage activity has been carried out to gain political and military intelligence. But in today’s technology-driven world, the private sector is finding itself increasingly in the crossfire, either as a direct target or as collateral damage. So, how can organisations prepare for the new status quo and the very real possibility they may become a victim of this advanced-level threat?
What is cyber espionage?
Cyber espionage is a specific, advanced type of threat. Attackers are generally looking to dig deep into a network to steal information that advances their capabilities. It can take place over a long timeframe, often without an organisation realising the threat actor is there, infiltrating deep into a system and rendering it untrustworthy, even after remediation has been carried out. As a term, cyber espionage eludes a clear definition. Factors like the extent and nature of the damage caused by the attack, the identity of the attacker(s), and how the stolen information is used can all influence how it’s defined. A guideline taken from The Tallinn Manual explains cyber espionage as “an act undertaken clandestinely or under false pretences that uses cyber capabilities to gather (or attempt to gather) information with the intention of communicating it to the opposing party”. It can be used alongside traditional warfare or as a singular event. For example, in 2014, Russia was accused of disabling Ukraine’s mobile phone communications before employing traditional battlefield methods. In other cases it stands alone: US food giant Mondelez was denied a £76m insurance pay-out after suffering a Russian APT cyber-attack deemed to be “an act of war” and therefore not covered under the firm’s cyber-security insurance policy.
However you define cyber espionage, the fact remains that it’s becoming more advanced, effective and professional.
How to plan for and defend against cyber espionage
No matter the size of your business or your market, every organisation needs to be aware of the developing threat landscape and put measures in place to defend itself. The key is to understand what the threat and your organisation’s threat landscape look like. Understanding the threat allows you to put the right technologies, the right people and the right processes in place to put your best foot forward. It’s clear attackers are moving away from malware techniques in favour of masquerading as the administrator, living off the land and using everyday tools to gain access and move laterally across a network. Such activity could take years to be detected, or may never be detected at all. This can be difficult to spot, as organisations can be extremely multifaceted, not only at the network level, in the applications they’re running and in their supply chain, but also in what they’re providing as an output to customers.
An APT attack could use traditional techniques such as botnets to launch distributed denial of service (DDoS) attacks as a diverting smokescreen for other malicious activity. Social engineering and spear phishing techniques can also be weaponised to introduce an attacker into a system. The insider threat is also sizeable in cyber warfare, with a mole able to introduce a threat directly to the network or exfiltrate highly sensitive or secret material.
Bringing security into the centre of an organisation can help you understand your threat landscape and put in place the right defences. If a security team can get into the mindset of a hacker, it can actively seek out its own vulnerabilities, understand what tactics might be used to gain entry, and what data can be accessed using those methods. Threat exposure and vulnerability must also be analysed as part of every core business decision, such as procurement, international expansion, new product lines, pricing structures and M&A activity, decisions which tend to be made without security professionals present. What seems like a low-risk and profitable business decision can suddenly become the opposite when you factor in its knock-on effect on cybersecurity.
Drawing up a cyber incident action plan
Any organisation, no matter how small or large, should have a cyber incident action plan drawn up to contain a breach. Acting quickly can help to limit the damage – commercially and reputationally. It’s recommended an incident response team is set up, with selected senior staff members across technical, legal, the C-suite and PR/Communications, who can respond to combat the threat and manage the organisation’s messaging around the breach. Securing IT systems to minimise and assess the damage is also a priority. It could be that web platforms, or even the network itself, need to be taken offline in the short term to avoid having to rebuild the system if the breach is too deep and widespread. Rapid internal and external communication is key, alerting all staff to the issue and any steps they need to take to help mitigate the effects, as well as informing regulators such as the Information Commissioner’s Office, and customers and other stakeholders. Once the breach is contained, organisations must review the event and put in place measures to prevent a recurrence. This could involve bringing in an outside team and government-level organisations like GCHQ and NCSC to understand if you can trust your network to be secure again.
The future of cyber espionage and defence strategies
Cyber espionage is something that directly impacts the UK economy and our ability to operate at a worldwide level. And slowly but surely, the issue is being drawn out of the shadows and into the public eye. But as the intricate details of targeted attacks are discussed and shared in public, the playbook is being handed down to average malware authors and criminal threat actor groups. As a result, APTs are no longer advanced; they’re just persistent. In response, organisations need to move away from a rule-based approach, where you have to be declarative about what you want and what you don't want, towards much more of a risk-based approach, whereby you have a sliding scale from zero to 100 and choose a point on it that represents a reasonable level of risk.
As more organisations wake up to this growing threat, we’ll see the maturation of the incident response process, which accepts an attack could take down a whole segment or, indeed, an entire business. Automated mechanisms like machine learning will help more organisations monitor behaviours across their network, look for anomalies and identify how an attacker infiltrated the system. Information can be shared across thousands of businesses too, to improve cybersecurity strategies across the board. No longer will defence be reliant on finding a needle in a haystack.
### How will the emergence of eSIM impact IoT?
SIM cards, those little pieces of plastic that live in almost all of our mobile phones, have existed since the birth of the consumer mobile market and play a critical role in identifying users. For the Internet of Things (IoT) however, basic SIM cards are holding back advances in technology. In fact, for many IoT applications, SIM cards are proving to be costly, insecure, inflexible and not suitable for deployment. Consider how plastic SIM cards affect IoT devices: they require a slot to be incorporated into a device. Once deployed, a SIM isn’t designed to be updated over the air. And if you want to make changes, you have to remove the now-useless piece of plastic and exchange it for a new SIM.
The good news is that the solution to these problems is already here: the eSIM. While some consumer mobile phones are also beginning to embrace eSIM, it’s arguably the IoT where the biggest benefits will be seen. Plastic SIM cards are a problem for low-margin, high-volume IoT applications. When compared to the cost of embedding an eSIM that can be endlessly configured remotely to select the best available network once deployed, the traditional SIM is really blown out of the water for many deployments.
What do we mean by eSIM?
eSIM is a global specification by the GSMA which enables remote SIM provisioning of any mobile device. With an eSIM-enabled device, users can store multiple operator profiles on a device simultaneously and switch between them remotely. This means manufacturers and operators can enable users to select the operator of their choice and then securely download that operator’s SIM application. Additional benefits include simpler device setup without the need to insert or replace a SIM card, devices that can operate independently of a tethered smartphone with their own subscriptions, and the development of smaller devices.
Overcoming traditional challenges
So what else makes an eSIM better for the IoT? Well, the device no longer has to cater for a removable card.
That means the IoT device can be very small, which could be vital for many applications in cellular IoT. It also removes the cost and vulnerability of an external port, as well as the environmental impact of plastic SIM cards. An eSIM holds multiple local network operator credentials, in contrast to a conventional SIM card that can only hold one. An eSIM can also be reprogrammed over the air. Such capabilities not only allow for future technology enhancements but also for repeated updates with profiles suitable for the local network. This eliminates challenges that come from constant roaming and allows the connected device to take advantage of local data rates, which are typically cheaper, and to avoid any data throttling. Minimising costs while maximising data speed and stability will be a key enabler to accelerate IoT adoption.
As Global Distribution Systems (GDS) for eSIM are developed, it will be easy for IoT service providers and end-users to choose the network that is right for them at any time and switch to the best network without having to switch SIM cards or waste time comparing rates. It will be a simple, instant, effective process. Additionally, manufacturers no longer need to provision devices for each specific country at the production stage. Devices can instead hold universal connectivity through a provisional bootstrap network profile. The bootstrap profile enables the device to download a fully operational profile once out in the local market. This saves manufacturers a significant amount of time and cost in product distribution and stock management, with a single SKU able to be sold anywhere in the world.
Flexibility and the IoT
One of the most important benefits eSIM brings to manufacturers, service providers and end-users is flexibility. For IoT enterprises, there is no need to physically install or replace potentially millions of SIM cards for their connected devices. That’s crucial as some devices may be used in remote areas or locations difficult to access. The transition to eSIM is a key milestone and enabler for cellular IoT to take off. The scale of this opportunity is not to be underestimated. The GSMA predicts that there will be 25.2 billion connected devices in circulation by 2025. With the emergence of 5G and rapidly increasing network speeds and capacity, it’s clear to see how many IoT devices could benefit from a cellular connection through an eSIM, especially as eSIM solves the fundamental challenge of managing connectivity remotely.
eSIM also opens new pathways for different uses of mobile technology. Consider the connected car as an example. It may still sound futuristic, but there are millions of connected cars on our roads that have already embraced eSIM. Connected cars rely on networks to provide a variety of services to vehicle owners, including navigation, entertainment, breakdown services, telematics and diagnostics. With critical real-time services like navigation and breakdown provision, there’s a great deal of value in automatically choosing the best available network connection.
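To make the multi-profile idea more concrete, here is a deliberately simplified Python sketch of how an eSIM-style profile store might behave: a bootstrap profile ships with the device, operational profiles are downloaded over the air, and the device enables whichever profile currently offers the best signal/price trade-off. This is a toy model of the behaviour described above, not the GSMA remote SIM provisioning protocol itself; the operator names, signal values and tariffs are invented.

```python
# Toy model of eSIM-style profile management: a bootstrap profile ships on the
# device, operational profiles arrive over the air, and the device enables the
# best-scoring profile. Operators, signal values and tariffs are illustrative only.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Profile:
    operator: str
    bootstrap: bool = False          # used only to download full profiles
    signal_quality: float = 0.0      # 0..1, as reported by the modem (hypothetical)
    data_rate_per_mb: float = 0.0    # local tariff; lower is better (hypothetical)

@dataclass
class ESim:
    profiles: List[Profile] = field(default_factory=list)
    active: Optional[Profile] = None

    def download_profile(self, profile: Profile) -> None:
        """Over-the-air provisioning: store an additional operator profile."""
        self.profiles.append(profile)

    def select_best(self) -> Profile:
        """Enable the non-bootstrap profile with the best signal/price trade-off."""
        candidates = [p for p in self.profiles if not p.bootstrap]
        self.active = max(candidates, key=lambda p: p.signal_quality - p.data_rate_per_mb)
        return self.active

device = ESim(profiles=[Profile("BootstrapNet", bootstrap=True)])
device.download_profile(Profile("LocalTel", signal_quality=0.9, data_rate_per_mb=0.02))
device.download_profile(Profile("RoamCo", signal_quality=0.7, data_rate_per_mb=0.10))
print("Active operator:", device.select_best().operator)
```

In a real deployment the download and switch steps would be driven by GSMA-specified subscription management infrastructure rather than device-local logic like this, but the flow - bootstrap, download, select - is the one the article describes.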
While most IoT devices will perform fewer functions on their own, it’s clear that eSIM could improve their connectivity options and benefit manufactures, service providers and end-users alike.[/vc_column_text][/vc_column][/vc_row] ### Facing the challenge of distributed testing teams [vc_row][vc_column][vc_column_text]As governments around the world attempt to curb its spread, COVID-19 has, in a very short time, radically transformed the way we work, with most people now working from home. And no matter what you do for a living, making such a sharp and unexpected pivot brings a series of complications. Software testers in particular face some unique challenges. Coordinating testing across all the different tools and people involved in the end-to-end quality process has never been straightforward. Today, as organisations deploy urgent updates almost instantly, testers need to find ways of testing even more efficiently as they adjust to the “new normal”. Coordinating efforts Meetings have moved online, with apps such as Zoom or Teams replacing the conference room. However, even when all team members are able to attend a meeting, it can be hard to keep track of who’s doing what and when. Important details may be missed when the connection drops, for example, or when an attendee has to leave for a moment to take delivery of a package. A team member may have to take time off at late notice, requiring work assignments to be redistributed in order to meet deadlines, or an overnight change to a release’s scope or timeline may be missed among the increasing number of emails. Many organisations will have used an Agile test management tool to keep track of any changes and ensure that everyone works from the same page. But whether working with manual or automated tests or a combination of both, a centralised “hub” is invaluable in coordinating testing efforts, housing test cases and steps in a single repository from where they can be accessed and reused. Recording and sharing Remote working, by its very definition, can pose a significant problem for testers. In the past, testers have become accustomed to a “works on my machine” response after reporting a defect. Now, though, there are new challenges to consider. Team members might be working on different systems and environments, and it’s no longer possible for a tester to stand by a doubting colleague’s desk and talk them through what they’re doing. If both parties are online at the same time, they can screen share and discuss their actions, exploring the systems differences that might help pinpoint the root cause of the problem. But differences in time zones and work/life schedules mean this isn’t always possible. In these instances, it’s worth using a tool that can automatically record a tester’s actions, environment settings, and comments, allowing them to capture and share their test actions with their colleagues, wherever they might be located. Maintaining communication Working remotely can make test automation appear more daunting than usual. It takes time, effort, and - perhaps most importantly - collaboration to building out meaningful test automation, so it’s perhaps unsurprising that many teams feel the current wide distribution of testers will impact test automation rates. It’s important for business users to easily communicate business-critical requirements to testers so that test cases can be created, automated and executed. But, without being in the same room, it’s much harder to relay this information. 
It could get lost in an email or Slack message, for example, or be put on hold until everyone is available to join a video conference. Then, once a test case is pieced together, another round of review is needed to make sure it actually covers the original requirements, and extra time is needed to automate it. There are solutions and workarounds for situations like this. Business users can quickly and easily record critical business processes, for instance, saving them as shareable files which they can send to their testing teams via email or on a shared drive as test cases ready for automation. The tester can then build out comprehensive test automation for every business process that needs to be tested, over and over, across teams, as required. Distributing test sets One of the most intimidating obstacles teams face is how to keep their test automation initiatives up and running in this new environment. By providing them with the ability to remotely manage and execute automated test cases, a distributed execution tool allows testers to build, run and monitor tests, regardless of where they're working. What's more, by distributing automated test sets across multiple virtual machines, local computers and cloud environments, such a tool can also boost testing performance (a minimal sketch of this idea follows at the end of this article). Indeed, the ability to distribute test sets across multiple execution clients in complex and diverse system landscapes is a core function for any agile testing team. Not only does it help to increase operating speed, it allows for rapid feedback regarding the quality of the product under test. The global pandemic has changed everything. As businesses adjust to this new reality, and pivot to remote work, their technology teams find themselves facing a new set of challenges. Delivering high-quality software and application updates depends on real-time communication and rapid testing across complex application architectures. But a remote and distributed testing team needn't mean a drop in standards. With the right tools and processes in place, those teams can continue driving high-quality end-to-end test automation, all while working from home.
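The article doesn't name a particular tool, so purely as an illustration of distributed test execution, the sketch below assumes the open-source pytest-xdist plugin; the tests themselves, the SSH hostname and the directory layout are hypothetical.

```python
# test_checkout.py - ordinary pytest tests; nothing in them is distribution-specific,
# which is the point: the same test set can be fanned out across many execution clients.
import pytest

@pytest.mark.parametrize("quantity", [1, 5, 10])
def test_order_total(quantity):
    unit_price = 4.0
    # A trivial stand-in for a real business-process check.
    assert unit_price * quantity == pytest.approx(quantity * 4.0)

def test_empty_basket_total():
    assert sum([]) == 0

# With pytest-xdist installed, the suite can be spread over 4 local worker processes:
#   pytest -n 4
# or pushed to a remote execution client over SSH (hostname is hypothetical):
#   pytest --dist=load --tx ssh=tester@build-agent-01//python=python3 --rsyncdir . .
```

Because the distribution happens at the runner level, the same approach scales from a single laptop to a pool of cloud virtual machines without any changes to the test code itself.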
### Site Reliability Engineering (SRE) 101 with DevOps vs SRE Consider the scenario below. An Independent Software Vendor (ISV) developed a financial application for a global investment firm that serves global conglomerates, leading central banks, asset managers, broking firms, and governmental bodies. The development strategy for the application encompassed a well-thought-through DevOps plan with cutting-edge agile tools. This ensured zero-downtime deployment at maximum productivity. The app now handles financial transactions in real time at an enormous scale, while safeguarding sensitive customer data and facilitating uninterrupted workflow. One unfortunate day, the application crashed, and this investment firm suffered a severe backlash (monetarily and morally) from its customers. Here is the backstory: the application's workflow exchange had crossed its transactional threshold limit, and the lack of responsive remedial action crippled the infrastructure. The intelligent automation brought forth by DevOps was confined mainly to the development and deployment environment. IT operations, thus, remained susceptible to challenges. Decoupling DevOps and RunOps - The Genesis of Site Reliability Engineering (SRE) A decade or two ago, companies operated with a legacy IT mindset. IT operations consisted mostly of administrative jobs without automation. This was the time when code writing, application testing and deployment were done manually. Somewhere around 2008-2010, automation started gaining prominence. Dev and Ops now worked concurrently towards continuous integration and continuous deployment, backed by the agile software movement. The production team was mainly in charge of the runtime environment. However, it lacked the skill sets to manage IT operations, which resulted in application instability, as depicted in the scenario above. Thus, DevOps and RunOps were decoupled, paving the way for SRE – a preventive technique to infuse stability into IT operations. "Site Reliability Engineering is the practice and a cultural shift towards creating a robust IT operations process that instils stability, high performance and scalability in the production environment." Software-First Approach: Brain Stem of SRE "SRE is what happens when you ask a software engineer to design an operations team," said Benjamin Treynor Sloss of Google. This means an SRE function is run by IT operations specialists who code. These specialist engineers implement a software-first approach to automate IT operations and pre-empt failures. They apply cutting-edge software practices to integrate Dev and Ops on a single platform and execute test code across the continuous environment. They therefore carry advanced software skills, including DNS configuration, remediating server, network and infrastructure problems, and fixing application glitches. The software approach codifies every aspect of IT operations to build resiliency within infrastructure and applications. Thus, changes are managed via version control tools and checked for issues using test frameworks, while following the principle of observability. The Principle of Error Budget SRE engineers verify the code quality of changes in the application by asking the development team to produce evidence via automated test results. SRE managers can fix Service Level Objectives (SLOs) to gauge the performance of changes in the application. They should set a threshold for the maximum permissible application downtime, also known as the Error Budget. If the downtime caused by a change is within the Error Budget, SRE teams can approve it. If not, the change should be rolled back and improved until it falls within the Error Budget. Error Budgets tend to bring balance between SRE and application development by mitigating risks. An Error Budget remains intact as long as system availability stays within the SLO, and it can always be adjusted by managing the SLOs or enhancing IT operability. The ultimate goal remains application reliability and scalability. Calculating Error Budget A simple formula for the Error Budget is: Error Budget (%) = System Availability (%) minus SLO Benchmark (%). Please refer to the System Availability Calculator below. [Image: System Availability Calculator] Illustration: suppose system availability is 95% and your SLO threshold is 80%. Error Budget: 95% - 80% = 15%. Error Budget per month: 108 hours (at 5% downtime, daily downtime is 1.2 hours, so at 15% it is 1.2 × 3 = 3.6 hours; over 30 days that is 30 × 3.6 = 108 hours). Error Budget per quarter: 108 × 3 = 324 hours.
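To make the arithmetic above easy to reuse, here is a minimal sketch of the same calculation in Python; the function names and the 30-day month are illustrative assumptions, but the formula is the one given above.

```python
def error_budget_percent(system_availability_pct: float, slo_benchmark_pct: float) -> float:
    """Error Budget (%) = System Availability (%) - SLO Benchmark (%)."""
    return system_availability_pct - slo_benchmark_pct

def error_budget_hours(budget_pct: float, days: int = 30) -> float:
    """Convert a percentage budget into downtime hours over a period (default: a 30-day month)."""
    return days * 24 * (budget_pct / 100)

# Worked example from the illustration above: 95% availability against an 80% SLO.
budget = error_budget_percent(95.0, 80.0)   # 15.0 (%)
monthly = error_budget_hours(budget)        # 108.0 hours per month
quarterly = monthly * 3                     # 324.0 hours per quarter
print(f"Error budget: {budget}% -> {monthly} h/month, {quarterly} h/quarter")
```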
Quick Trivia – Breaking up monolithic applications lets us derive SLOs at a granular level. Cultural Shift: A Right Step towards Reliability and Scalability Popular SRE engagement models – Kitchen Sink, a.k.a. "Everything SRE" (a dedicated SRE team), Infrastructure (a back-end managed service) or Embedded (pairing an SRE engineer with developers) – require additional hiring. These models tend to build dedicated teams that lead to a 'Silo' SRE environment. The problem with the Silo environment is that it promotes a hands-off approach, which results in a lack of standardisation and coordination between teams. So, a sensible approach is shelving the project-oriented mindset and allowing SRE to grow organically within the whole organisation. It starts by apprising teams of customer principles and instilling a data-driven method for ensuring application reliability and scalability. Organisations must identify a change agent who will create and promote a culture of maximum system availability. He or she can champion this change by practising the principle of observability, of which monitoring is a subset. Observability essentially requires engineering teams to be vigilant about the common and complex problems hindering the attainment of reliability and scalability in the application. See the principles of observability below. [Image: The principles of observability] The principle of observability follows a cyclical approach, which ensures maximum application uptime. Step Zero – Unlocking the Potential of the Pyramid of Observability Step zero is making employees aware of end-to-end product detail – technical and functional. Until an operations specialist knows what to observe, the subsequent steps in the pyramid of observability remain futile. Also, remember that this culture shift isn't achievable overnight – it will be successful when practised sincerely over a few months. DevOps vs. SRE People often confuse SRE with DevOps. DevOps and SRE are complementary practices that drive quality in the software development process and maintain application stability. Let's analyse the four key fundamental differences between DevOps and SRE. [Image: DevOps vs SRE comparison] Conclusion – SRE Teams as Value Centre A software product is expected to deliver uninterrupted services. The ideal and optimal condition is maximum uptime with 24/7 service availability. This requires unmatched reliability and ultra-scalability. Therefore, the right mindset is to treat SRE teams as a value centre, carrying a combination of customer-facing skills and sharp technical acumen. Lastly, for SRE to be successful, it is necessary to create SLI-driven SLOs, augment capabilities around cloud infrastructure, ensure smooth inter-team coordination, and embed automation and AI within IT operations. ### How healthcare organisations can navigate through cloud connectivity Healthcare organisations are under more strain than ever before as they grapple with the ongoing Coronavirus crisis. This increased workload is also coupled with the changing work practices accompanying the shift to remote working. IT infrastructure is, therefore, proving crucial in supporting workforces needing access to systems and private networks from home, as well as in coping with the increase in demand. 
Speedy adaptation is a key capability of cloud technology and is, therefore, able to offer options around flexibility and pace of change, helping to solve short-term problems as well as providing future-proofed solutions going forwards. It is clear that the health sector needs to embrace cloud technology and reap the rewards it can provide now and in the future. The ability to scale During the ongoing pandemic, mammoth amounts of data are being produced daily. Secure cloud connectivity can be used to access the Health and Social Care Network (HSCN) or other private networks to share vital patient data in the fight against Coronavirus, and can be utilised remotely for healthcare workers at home. With cloud-native network connectivity platforms you do not require physical hardware to manage and make changes to your infrastructure, so deployment is rapid - essential in these unprecedented times. Facilitating remote working Remote access infrastructure has not been designed to cope with the volumes of traffic currently occurring with most employees working remotely. On top of this, the healthcare industry hasn’t traditionally been designed for remote working, which means organisations must adapt in response to Coronavirus challenges. Putting in place secure access connectivity can alleviate the short-term pain for organisations by enabling them to operate a more distributed architecture as opposed to a centralised architecture for their network. Secure cloud connectivity platforms enable remote access to applications, and private networks and data from a broad range of devices with no additional hardware, meaning productivity can easily continue with people working remotely. For example, technology provider Q doctor, which removes barriers for patients to access medical advice and treatment, needed an end-to-end connectivity solution to connect its software with the HSCN in order to access clinical systems for clinicians to be able to treat patients remotely. Q doctor needed its HSCN connection capacity upgraded to provide virtual workspace software alongside the existing video consultations solution. A week from the initial engagement, the cloud connectivity solution had been provisioned and onboarded, meaning within just 48 hours, over 300 clinicians were able to access, manage, transfer and share patient data remotely so effective patient treatment could resume. Agility versus security Organisations can often make the mistake of investing in new technology without considering the business needs and the risks it can pose. Given healthcare companies have vast amounts of confidential data, it is crucial that key services can continue to treat patients and that such data is safeguarded. Major data hacks have proven that outdated healthcare systems are vulnerable to attack. For example, the 2017 WannaCry cyber attack severely disrupted patient services, affecting a third of hospital trusts and costing the NHS £92 million. The healthcare industry cannot afford a repeat of such an attack, especially during a global health crisis. Consequently, ensuring secure infrastructure exists to route traffic, enforce security policies and reduce the risk of cyber attacks is key for the continued success of organisations. With cloud connectivity foundations, organisations are able to diversify their data and applications to reduce the impact if something goes wrong, and can scale this up and down according to need. 
This can help avoid repeating the damage inflicted by previous hacks such as the aforementioned WannaCry attack. However, without a secure access platform, cloud solutions could provide more entry points for criminals to exploit. This can be solved by ensuring visibility is made a priority as cyber decisions cannot be made with incomplete data. Centralising connectivity helps to establish a single source of truth, so that cyber teams have greater visibility to monitor activity and enforce security protocols across the whole network. Implementing cloud technologies ultimately enables healthcare organisations to improve patient outcomes and proactively build an agile network, while also ensuring better integrated risk management, flexibility and elasticity in the use of applications - all resulting in potential cost savings, particularly for the NHS. The road ahead As well as solving short-term remote working challenges, cloud platforms can form a foundation for future change. Cloud technology can, further down the line, be used as a strategic component for further transformation of IT infrastructure. With the date that lockdown will be lifted still unknown, organisations adopting cloud platforms will be able to scale to cope with whatever remote working demands are made of it. There is no doubt that healthcare organisations can benefit from the flexibility, cost reduction and agility that results in bringing cloud technology into play. It is imperative, though, that an agile and secure network is put in place to ensure healthcare systems can continue to operate in a flexible manner, while maintaining security and confidentiality.[/vc_column_text][/vc_column][/vc_row] ### Connecting the Cloud [vc_row][vc_column][vc_column_text]Technology has provided companies with a host of new ways to do business – from machine learning patterns that can make startlingly accurate business predictions to the growth of IoT delivering on the promise of a more connected environment. These and other advancements have been enabled by new services provided through the cloud. As businesses evolve and grow, we’re seeing more and more cloud-based applications and services brought on board to manage critical information and automate core business processes, with the ultimate goal to improve business productivity, alleviate the strain on staff, and generate faster and better outcomes for the business. However, every advancement brings new challenges, and the move to the cloud has left many businesses with siloed data and disconnected applications as ‘SaaS sprawl’ accelerates and takes over. To get the most out of their enterprise data assets, IT teams need to equip themselves with the right tools and best practices to quickly and easily connect every application and data store across the organisation, regardless of whether it’s in the cloud or not. The Problem Increasingly, cloud applications deployed in enterprises are brought in by individual business functions -- from Marketing to Finance, to HR, and more -- and can often end up operating in isolation from one another. If a cloud application or database, or its underlying data, is needed by another business division, moving the information from one place to another can be a costly, time-consuming, and error-prone exercise. In some cases, IT teams manually code integrations for different applications and data sources, but as the business grows this requires too much time and resource, is difficult to maintain and doesn’t scale to support growth. 
With a divided data and application landscape, it then unsurprisingly becomes more difficult to have a 360-degree view of a business’s data in real-time. This causes problems far beyond just the IT team, leaving any business department that needs access to timely enterprise data with a stunted, incomplete view – for example, the finance team may need employee information located in a HR data store or the marketing team might require customer details from the sales division. Data siloes slow down critical business processes and inhibit fast and accurate decision making. If businesses are harbouring redundant, incomplete, or poor quality data, cloud applications will not be able to deliver accurate or beneficial insights. As a result, IT teams find themselves spending hours keeping tabs on the labyrinth of disconnected apps and data across the enterprise, plugging gaps and connecting what they can, to ensure business teams can obtain the promised benefits. The solution There are steps businesses can take to tame disconnected cloud applications and SaaS sprawl. Firstly, it’s key to ensure that there is a holistic IT strategy for cloud that all departments buy into, one that is not a cloud application free-for-all across an enterprise, or that encourages ‘shadow IT’. It should instead position and empower the IT department as enablers which any other team can go to for advice, guidance, or counsel when choosing a cloud application. In turn, this will ensure procured applications fit with the company’s overall IT strategy, governance, and security mandates, and that IT is able to properly oversee any needed cloud migration, integration, or development requirements. This also means IT will be able to see which other business departments might benefit from a new application, making their own recommendations to help streamline business activity, not to mention negotiate greater discounts and economies of scale if the particular app gains widespread usage across the company. As part of this, many organisations find success identifying a person within each department to serve as the liaison with IT -- typically someone with some level of data and tech proficiency -- so that each department has their dedicated ‘go to’ when wanting to suggest new ideas to streamline business processes or leverage data where IT could add value. Another big step in rationalising cloud infrastructure in a business is equipping IT and LOB teams with the right tools. As the capabilities of the cloud continue to grow, and enterprises migrate more of their data and business applications to cloud platforms, and users from all parts of the business demand direct access to data, having the means to be able to properly manage and move data is essential. Integration is an obvious choice to help with this, but the traditional manual, time- and resource-intensive options just aren’t feasible for modern enterprises. Modern, low-code, self-service integration tools which harness the power of AI are the answer, helping to not only improve the flow of data within an organisation but transform how business users can leverage data as part of their day-to-day work. By automating manual repetitive integrations and automating common workflows and processes, productivity surges and teams across the enterprise are freed up to focus on higher-value strategic projects that drive the business forward. 
Enabling a new wave of business opportunity Cloud applications can significantly change the way companies do business, from streamlining the onboarding of new employees, ensuring a consistent customer experience, and bringing about data democracy within an organisation. However, unless teams take serious steps to break down the barriers and open up data sharing across those disparate applications, and across departments, the promise of a truly data-driven organisation will remain a pipedream. IT and LOB teams must work together to unlock this transformational change.[/vc_column_text][/vc_column][/vc_row] ### Grace Under Pressure: The Maturity of the Cloud [vc_row][vc_column][vc_column_text]The cloud has been “walking the walk” for a while now. But recent events have allowed the cloud to show its real value.   Many of us have been discussing the cloud with customers and prospects for the past eight years, asking the question, “What is your cloud strategy?” In the early days, this question was mostly met with a dismissive scoff: “We’ll never move to the cloud!”. Then, the mood softened a bit: “Hmm, we’ll look to leverage the cloud where appropriate.” Now, however, cloud-first strategies have become the norm. It’s clear that the cloud has converted many of the most ardent nay-sayers, but there are still sceptics. The reality is that we are living in an IT world of variety. We are engulfed in a long and seemingly endless phase of transition between old and new worlds, between on-prem and cloud. Despite this, one requirement is timeless. A user needs to open their computer and access the tools required to do their jobs. Everything else becomes irrelevant. Location, network, destination - it’s all one big wildcard.   No ‘snow day’ Recent unprecedented events have shown how imperative it is that cloud be at the forefront of all organisations' IT strategies. It is unlikely that any of us could have envisaged a global event taking hold so emphatically and causing such a seismic shift in working behaviour. Places of work are closed across all countries and all industries with a huge emphasis now being cast on everyone working from home. Clearly, most organisations already have provisions for remote working, but these systems and services are only stress-tested occasionally during the annual “snow day” (in the UK at least). So fleeting is this interruption, any issues discovered are often either tolerated or overlooked as it's back to business very quickly. Recently, many organisations have been caught short while very much in the eye of the biggest “snowstorm” of all time. In situations like this, people can only be reactive. One of the biggest challenges is being able to provide the platform for all employees to connect to business resources remotely with adequate performance. The first instinct is to go with what you know and trust. In most cases, businesses have existing VPN solutions. Therefore, it should be easy to give your remote employees secure access, right? Just build more capacity, procure more licences, and tune the configuration to eke out every last drop of resource. These measures are able to buy a little more time, but something is different about this snow day. We don’t know when it will end. Therefore, business must continue, no excuses. Good enough is, quite simply, not good enough. Many traditional and already-deployed remote access solutions require physical hardware. They are finite in resource. Normally, this wouldn’t be a showstopper. 
Planning teams would be one step ahead of the game, proactively monitoring and uplifting capacity to meet the demand as it grows. However, there is one commodity that is essential to proactiveness, which coincidentally is the one commodity that is in short supply right now – time. It takes weeks and sometimes months to procure, deliver, install, and enable physical IT infrastructure. However, that is not an option in this hectic time. There must be another way. This is where cloud can really prove itself.   Coming of age Amongst the security industry, existing customers and prospects alike have been turning to cloud providers with a challenge: “Can you help us provide secure application access for ALL of our employees? Oh, and did we mention the fact that we need this right now?” What happened next was unprecedented.  A combination of expertise, skill, determination, and above all, teamwork that enabled several organisations in every corner of the globe to quickly provision new software and start providing their employees access to work applications and data. What is the fundamental component that is making all of this possible? The cloud.   The mother of all stress tests One of the key arguments often raised against cloud adoption is maturity. Many cloud services that are being delivered are relative infants in IT terms. There are nagging doubts from some who suggest cloud is not enterprise-ready for all of their workloads. Granted, it’s an intimidating thought handing over the reins, the horse and the wagon of your business to a third-party to take good care of. Sure, service-level agreements (SLAs) are decent but SLAs do not guarantee the actual service. This recent crisis has been the mother of all stress tests for cloud services. Office 365 and Zoom have never been far from the headlines, with many wondering how they will cope under such an intense, and quickly growing, workload. Unsurprisingly, there have been challenges across the industry; growing pains if you will. A tidal wave of real customer load on any platform, no matter how well it has been lab tested, will flag opportunities for improvement. However, these have been dealt with quickly, intelligently, and effectively, making the overall service more robust and reliable. The past week has fast-forwarded the process of maturity by several months or maybe even years, given the enormous spike in demand and stress. Despite this, the industry is coping and growing, and this is testament to available mature, cloud native architecture.   IT will never be the same again It’s going to be a busy few weeks ahead for everyone, both personally and professionally. Once the dust finally settles on this unprecedented situation, the way organisations are geared up to provide IT to their employees will have changed forever, and for the better. If anyone out there is still not convinced of the power of the cloud, they never will be.[/vc_column_text][/vc_column][/vc_row] ### Embracing cloud: UK world leader in digital maturity [vc_row][vc_column][vc_column_text]Digital transformation has become a necessity for companies in all industries. Long histories and distinguished accomplishments offer no defence against disruptive competitors. Consider the case of Thomas Cook, which abruptly closed in 2019 after 178 years in business. The storied travel business attempted to reorganise and restructure, but in the end, could not survive against online discount competitors. 
Last Autumn, the Infosys Knowledge Institute assessed how enterprises are implementing digital transformation. We surveyed more than 1,000 executives of companies with at least $1 billion revenue. Respondents included executives based in Australia, Canada, China, France, Germany, India, New Zealand, the United Kingdom, and the United States. A top performer Companies in the United Kingdom perform among the world’s elite in advancing digital technology, according to Infosys’ Digital Radar 2020 report. When measured against 22 digital transformation initiatives, UK respondents achieved the second-highest digital maturity globally, with a score of 63. India was the only country surveyed which beat the UK, with a digital maturity index of 71. That’s despite the survey unearthing that UK respondents face the greatest challenges from legacy systems of any nation surveyed. Some 38% of UK companies listed legacy systems as their biggest challenge this year and also in the coming year. Globally 28% of respondents mentioned legacy systems as their primary challenge. Rapidly developing China reported the lowest level of struggles with legacy systems at 17%. Breaking silos and embracing clouds A closer look at the survey shows that UK respondents stand out for the way they embrace collaboration and have adopted cloud computing. UK companies have enthusiastically embraced the initiative of cloud computing, with 44% of companies now operating at scale in this discipline – above the survey average by two percentage points – the highest performance for cloud computing by any country group surveyed. Several top-performing companies from the UK commented on the power of cloud technology. A UK manufacturing company said that cloud was a good way to overcome risk aversion and slow adoption by senior management. A UK telecommunications company noted that engaging in multiple cloud products enabled it to correct previous poor implementations. In contrast with the challenge of operating through legacy systems, British companies excel at working together across business lines – they were the country least likely to name the inability to work across silos as a major business challenge. These two strengths relate: Cloud computing allows for greater collaboration, and that is a critical component for companies to evolve at the pace of technology change. The digital ceiling looms The survey and the rest of the research in our Digital Radar 2020 report revealed significant progress from basic to middle-tier digital maturity, for respondents around the globe and in the UK. However, many UK companies saw diminishing returns from their technology investments, reaching what we at Infosys have termed the “digital ceiling.” A small number of UK companies have broken through the 70% digital maturity score, and this is consistent with the global average. The rate of market change is faster than the pace of a business transformation cycle, and this is a fundamental problem for executives seeking to improve capabilities. Technology also changes faster than companies can react, and keeping pace with future digital evolutions requires a mindset change from company leaders and technology specialists alike. We found the most advanced companies show distinction in their achievements, and in their motivations to pursue advanced technology. These top performers are much more likely to adopt technology to empower employees and improve customer satisfaction. 
Respondents in the United Kingdom also indicated strong motivation to improve customer satisfaction. Some 61% called improving customer experience and engagement a top reason to pursue digital transformation. However, the survey revealed UK companies have an opportunity to improve when it comes to their employees. Only 35% of UK respondents said their companies wanted to put technology to work to empower employees to act and lead change. Employee empowerment is a critical success factor for top digital companies. The most advanced companies in our survey look for ways that technology can make things better for employees and clients alike. They work in quick cycles and behave like a living organism. They sense and respond, learn and adjust, and evolve. Several top performing UK companies list employee education and external training as a critical factor to increase their digital maturity. In contrast, the lower performing companies viewed technology as mainly a tool for cutting costs and increasing efficiency. Faster, better and cheaper technology alone will not provide the improvements that enterprises need. Leading enterprises bring these digital tools to bear with an employee-centered mindset. When companies do that, employees use technology to meet their needs and serve customers better, which ultimately generates the economic returns that power the enterprise to serve all stakeholders and keep pace with global markets.[/vc_column_text][/vc_column][/vc_row] ### The Unpredictable is Predictable, Thanks to Blue Chip Cloud [vc_row][vc_column][vc_column_text]Throughout the COVID-19 pandemic, key players in the business landscape have been working around the clock to maintain essential services.  Preserving critical supply chains has been a challenge for many Managed Service Providers (MSPs), yet the software-defined data centre at the heart of Blue Chip’s cloud service has remained fully operational. In fact, Blue Chip is continuing to operate with minimum disruption. So, how have they achieved this? “Over the Easter Bank Holiday weekend, when a lot of people were having to stay at home, we actually had a project team in full flight. We even transitioned a large managed service customer into Blue Chip Cloud,” explains Tim Stringer, Chief Information Security Officer at Blue Chip. Having invested and created a data centre from the ground up a decade ago, Blue Chip can continue offering its world class cloud services to keep help businesses up and running. Also, Blue Chip Cloud’s flexibility means organisations can have fully functioning environments set up in as little as 48 hours. One such customer, a highly regarded financial institution, was able to move their infrastructure into the Blue Chip Cloud without a hitch during the UK lockdown within safe and sensible parameters. Everything was facilitated remotely, says Stringer. Stringer goes on to explain that everything was “delivered across a multitude of different technologies as code. We've got our cloud infrastructure across all the technologies such as AIX, VMWare, Wintel, networks, security, storage – everything. It was all done virtually and there was no need for a physical presence on site.” In these unusual times, not all parts of the business landscape have been able to adapt. Some MSPs have had to postpone projects because the third-party data centres they depend on aren't accessible. Some organisations, for example, have been unable to get backups out of the systems. Again, this has not been an issue for Blue Chip. 
“Consistent investment within our services and team has enabled us to continue to operate as a business and deliver these projects remotely. “But beyond that, we focus on ‘control’. We own our data centres and employ our own staff; this enables us to have control over our processes and their outcomes,” says Chris Smith, Director of Sales & Marketing at Blue Chip. We can all agree that staying safe is the number one priority at the moment. Everyone knows that social distancing and other sensible precautions are needed to keep risks of infection as low as possible. But what happens when a company’s IT infrastructure needs physical maintenance? Blue Chip’s Data Centre and Facilities Officer, Eddy Conway, emphasises that Blue Chip is well prepared for keeping visitors and staff safe: “Our engineers can look at your system to make sure that it’s functioning well. If you need to bring in a third party, we will facilitate the escort duties so that you don’t have to visit the site. Everyone involved is employed by Blue Chip and your safety is our prerogative.” (Blue Chip also invested in extra spare parts for their warehouse prior to the lockdown to ensure availability.) Of course, it’s impossible to predict the challenges that may arise as we mitigate the impact of the health crisis on the global business landscape. However, Blue Chip is well-equipped to help customers remain operational and navigate through to the other side. As Smith summarises: “In this scenario, you're only as strong as your weakest link. So, what we've tried to do is remove the weak links and make sure we are self-sufficient.”[/vc_column_text][/vc_column][/vc_row] ### Mobile Cyber-Espionage: How to Keep Your Executives Safe [vc_row][vc_column][vc_column_text]In January 2020, the United Nations released a dynamite report. It alleged that the personal smartphone of Amazon boss Jeff Bezos had been hacked by Saudi Arabian crown prince Mohammed bin Salman. According to the findings, a booby-trapped MP4 file arrived on Bezos’s iPhone via a WhatsApp message from the prince, covertly downloaded spyware, and began exfiltrating hundreds of megabytes of data. All of which begs the question, if the richest person in the world can have his phone hacked, how secure are the devices used by your corporate executives? Fortunately, most mobile cyber-espionage threats aren’t nearly as sophisticated. But they are on the rise. Tackling them will require a considered approach combining a best practice blend of people, process and technology. Spying goes mobile Customer data has always been in high demand on hacking forums and dark web marketplaces, where it’s usually snapped up by scammers to use in follow-on identity fraud. Despite the advent of rigorous new data protection regulations such as the EU’s GDPR and CCPA in the US, activity in this area appears undimmed, for now. However, cyber-espionage is typically a more sophisticated marketplace where specific organisations are targeted for high-value internal data that could give rivals a competitive advantage or be used to commit stock market fraud. While desktop, on-premises infrastructure in large organisations is relatively well protected, the same isn’t always true of the mobile ecosystem. From 2015 to 2019, Trend Micro observed a 1400% increase in mobile cyber-espionage campaigns targeting multiple platforms, operating systems and countries. Some are the work of state actors while others are down to financially motivated cyber-criminals. 
The bad news is that organised cybercrime groups traditionally associated with desktop attacks are increasingly scrutinising mobile channels to further their goals. Taking everything Android remains the most commonly targeted ecosystem, as its relatively open approach means malicious apps are more likely to find their way online. That’s not to say iOS is completely untouched. The malware believed to have infected Bezos’s iPhone could be related to Pegasus, notorious spyware flagged in the past by rights groups as being used by despotic regimes to monitor dissidents. Popular with East Asian cyber-criminals, XLoader and FakeSpy target both Android and iOS platforms to harvest sensitive information. Typically, the kind of malware we’re talking about in these attacks is designed to exfiltrate a range of information to remote C&C servers. This could include basic device ID (IMEI), OS version and operator information, but also messages — even on ‘secure’ services like WhatsApp and Telegram — stored files, clipboard contents, call logs, contact lists, browsing history, account log-ins, geolocation and application lists. Attackers could access an individual’s entire digital life and any corporate accounts not protected by 2FA via spyware of this sort, harvesting highly valuable intelligence on the target and their organisation. There are various ways they trick the individual into installing the malware on their device. Most commonly the malicious code is hidden in a legitimate-looking application: perhaps a game, adult content or something more mundane such as a web browsing app or B2B service provider software. It’s then loaded onto a fake website or app marketplace; usually a third-party store, although malware has been found frequently on the official Google Play site. The final step is to persuade the user to visit and download it: usually achieved via phishing email or text message (smishing). Hiding in plain sight Trend Micro has tracked multiple spyware campaigns using such tactics over the years. In 2019 alone there was Bouncing Golf, which used malware capable of recording audio and video as well as stealing information, and which was posted on websites and promoted on social media. CallerSpy featured spyware disguised as chat apps and hosted on a phishing site masquerading as a Google page, while MobSTSPY harvested users’ account credentials and was downloaded more than 100,000 times from Google Play. To increase the chances of victims downloading the malicious apps, hackers will often leverage stories in the news and popular current services. We have already spotted one campaign disguising spyware as a Coronavirus Updates app, for example, while another featured a malicious Zoom installer. In some extreme cases, organisations may even be targeted with highly sophisticated exploits developed by grey-market organisations which typically sell them to foreign governments. This is where Pegasus came from. Threats like this can require little to no user interaction to work, but fortunately are extremely rare, and expensive to buy. Getting ahead The threat landscape is a volatile but dynamic environment, meaning tools and tactics will continue to develop in this space. The bad news is that as cyber-espionage becomes more commonplace on mobile devices, the cybercrime underground is likely to see a proliferation of readymade toolkits enabling even those with few technical skills to get involved. This poses a threat to organisations everywhere, especially high-value executive targets. 
Yet despite this, there are a few things you can do to greatly reduce the risk of being caught out. It starts with user awareness training. Ensure your employees know how to spot the tell-tale signs of a phishing email or text. This should be an ongoing process: the best training kits will allow you to update modules to feature the latest phishing campaigns. Next, revisit your device usage and management policies. In highly regulated industries it may be that employees are not allowed to access any corporate resources from anything other than a highly protected work device. Individual CISOs will need to work out their risk appetite and budget and calculate whether certain users require their own corporate handsets. Policies could be drawn up to restrict downloads of any non-approved applications or visiting any non-approved sites. Back this up with technical controls: AV software on each device to scan for malware and block any malicious downloads. Enhance corporate security further by mandating all accounts accessible remotely have two-factor authentication enabled. Gartner predicts that as many as three-quarters of CFOs plan to shift at least 5% of previously on-site employees to permanent remote working post-COVID. The future is increasingly mobile, so it pays to get ahead of the game to protect the organisation today.[/vc_column_text][/vc_column][/vc_row] ### Automation vs Emotion: Are people still important in customer experience roles? [vc_row][vc_column][vc_column_text]In the past few weeks, investment in digital transformation and technology processes has accelerated to keep pace with changing demands and to meet healthcare needs. One industry that was already being transformed before the pandemic was that of customer service. There’s been a gradual introduction of new technology using artificial intelligence which can do many of the tasks traditionally done by humans. Chances are, you’ve interacted with AI technology in the past week, even if you’re not aware of it. From grocery shopping to banking, ordering a takeaway, to checking the weather, we live in an increasingly automated world where technology reigns supreme. But how much is too much? With technologies ever-evolving, more and more businesses are turning to AI to improve their customer journey, signalling a shift for customer experience. But, striking this balance between human interaction and automation can feel like a minefield. Too little human interaction and your customers are becoming frustrated, too much and you’re simply wasting everyone’s time.   What are the benefits of customer service automation? Streamlined processes If you’re tired of pressing ‘one for service, two for booking,’ customer service representatives are just as tired of repeating the same answers again and again. For those mundane or regularly occurring issues, if a virtual assistant can interpret your request using an automated script response, it’s much more efficient than chatting with a human. Agents aren’t wasting time responding to the same queries that are most commonly asked. Instead, they are able to deal with more complex requests and critical interactions that require a human touch. Similarly, programs can easily automate follow-up emails or tasks, further saving your agents time and removing the margin for human error. Predicting customer behaviour Understanding what your customers want and need — ideally, before they even do — is the holy grail of successful customer service. 
By collecting data from previous interactions, new technologies are able to predict actions, both from the customer and the agent’s viewpoint. This personalisation not only speeds up the customer journey but also improves and enhances the customer experience.   The human touch Some businesses can easily miss the point: great business is built on relationships, and these are essential to success. But without human interaction; these relationships cannot exist. Of course, automation can solve efficiency issues, and it can be gold dust for the time-poor business. But nothing can beat human interaction. A chatbot can send people in the right direction if the question is simple, but it might not be able to fully ‘understand’ all issues. For many of us, empathy is an essential part of great customer service, along with being able to form an emotional connection, and this simply doesn’t exist when communicating with a chatbot. If we’re complaining about a product or service, as humans we often need someone to validate our feelings, and this emotional connection helps customers to remain loyal.   The future of customer service is a human-AI collaboration While technology can help your customers to have the best experience with your business, it comes with its limitations, and a combination of human interaction and technology is blurring the lines of what good customer service looks like. How can you successfully automate your customer service with the perfect blend of emotion? The key is to keep that human interaction, but only where it’s needed. Not everything needs to be automated - while a returning buyer doesn’t need AI assistance to purchase a product, a first-time site visitor will probably benefit from the advice of a chatbot as their questions will most likely have been asked time and time before. Businesses need to be selective about the areas where automation works best - namely those tasks that are repetitive or simple. Make sure your customer service channels are consistent - There’s nothing more frustrating than having to repeat your order number, details and query over and over again, as not only will this put customers off, but important information might get lost in limbo. 33% of customers admit to feeling frustrated by having to repeat themselves to multiple support reps, showing it’s crucial for businesses to remain consistent across all their channels to improve your customer experience.   The best solution for managing customer relationships? The combination of human engagement with the efficiency automation provides. Perfecting the perfect customer service system isn’t an easy task, and there is no one-size-fits-all approach. For businesses, it’s all about getting this balance right, and recognising when a person should take over. With 78% of customers saying they’d back out of a purchase due to poor customer experience, it’s about finding what’s right for your specific customer. From offering the opportunity to chat with a human from the off to asking for customer feedback to assess effectiveness, customer service is constantly evolving, but businesses need to keep in mind that AI was designed to simplify human customer service, not eliminate it entirely.[/vc_column_text][/vc_column][/vc_row] ### Cloud Supply Chain Risk – is your MSP in control? [vc_row][vc_column][vc_column_text]For some businesses, beyond getting homeworking up and running at a whole new scale, lockdown has brought IT strategy to a standstill. 
Those that do want to move forward can face delays from their Managed Service Provider (MSP) and the restrictions lockdown has placed on them to deliver new solutions into their clouds. While this is an immediate impact, it has also exposed some of the risk that was already present for MSPs under the covers of their cloud services. There is nothing like a change in circumstance to really test the viability of vendors and the contractual SLAs you have in place, or would require should you become a customer. Lockdown is change at its most extreme and has therefore produced some interesting insight. Many MSPs do not own the data centres their cloud is housed in, and this can lead to the following issues: Access The ability to gain access is often a step removed from the data centre owner to a third-party contractor, adding layers of supply chain between the end customer and access. This will often affect access when it is needed the most. Many engineers have sat with critical parts in hand, waiting for hours to gain access to third-party data centres while systems were down. All because of this hidden supply chain. So, as far as risk goes, security access is just the start. Network Operations Center (NOC) An MSP with their cloud in a third-party data centre also needs to find a location for their NOC. This is where they will deliver support and management services for their cloud. Often these are remote from the location of the cloud hardware, completely removed from the physical side of delivering a cloud infrastructure. Many data centres offer 'hands and eyes' services, but these are usually only good for changing tapes or hitting the power button on a server. The risk here is more than just distance: most MSPs only have a single NOC, without a backup for business continuity. This should be explored by businesses looking to move to the cloud and looking for an MSP to support them. Transition How is your MSP planning to deliver your project? Many use third-party contractors and retain very few permanent staff. Transition to the cloud can be a long and complicated process, and occasionally it can take longer than anticipated, or delays beyond anyone's control can affect timescales. When this happens, projects can be impacted even further as contractors move on to their next project, leaving somebody else to pick things up from scratch. This is a huge risk to the success of a migration. Software Defined Capabilities Can your data centre provider deliver mixed workloads – IBM i, AIX, Microsoft and Linux-based operating systems – plus processing, storage and networks as code? Deploying a whole interconnected infrastructure, as well as its networks, can be hard if you don't control the networking core of the data centre. Building out this service takes time and investment in infrastructure and staff. Investment Not owning a data centre also restricts a hugely important aspect you want your MSP to control: investment. A cloud provider can have all the latest and greatest hardware within the rack they rent, but if there is no investment in the facilities that house it, the uninterruptible power supplies (UPS), generators, air conditioning and networks are at huge risk from a vendor over which you have no direct influence. If they have a power outage you may get service credits, but how do you know it won't happen again? All of the above is relevant to business as usual, but what about during lockdown? Access An MSP with a third-party data centre has no control over access in lockdown. 
Most non-critical access has been denied, so new projects have been delayed to minimise staff risk. The MSP may not even be able to deliver new hardware for projects due to restricted access. If your MSP cannot deliver new customer solutions, how viable is their business? NOC – This risk is the same, but even larger: if you can't get access to the data centre your cloud is in, the MSP has to do everything remotely. Transition – If you don't have control over your project staff during BAU, lockdown makes it worse. If you cannot deliver the transition seamlessly and remotely, via scripting and a software-defined environment, you cannot deliver projects. Software Defined Capabilities In times of restricted access, when it is a risk for staff to physically attend site, software-defined capabilities are hugely important. They enable existing cloud deployments to keep growing and new deployments to continue, because the work can be carried out by home workers. Investment – Still out of MSP control In addition, even if your MSP does own a data centre for primary production IT services, what do they do for disaster recovery (DR) or high availability (HA)? If they are again utilising a third-party vendor for this, all of the above is still an issue. Even worse, they may split production and DR/HA between two different data centre providers, compounding the risks. Finally, if anything happens to your MSP and they go out of business or fail to pay their bills, the third-party data centre could simply pull the plug on the cloud infrastructure. You have no control over this scenario. So, when you are looking to move your critical workloads into the cloud, run through this checklist with the provider:
- Do they own the data centre?
- How long does access to the data centre take?
- Where is their NOC located?
- Can they deliver everything as a script – are they software defined?
- Who does the actual work (patching, implementations etc.)?
Then ask what happens with the above under lockdown conditions. In summary, you need to look for the MSPs that have continued to deliver the full spectrum of services during lockdown. Those that own their own data centres and utilise their own staff to support these services are truly in control. Contact Blue Chip to find out how we remove these risks to deliver a Cloud Service that we fully control. ### Best practice for cloud migration Cloud migration is a hot topic for SMEs at the moment. If it were a place, it would be Death Valley: scorching hot. However, the danger of dealing with something so hot is that if you don't plan your approach correctly, you're likely to end up with your fingers burned. This is especially true during moments of high business stress, such as the current Covid-19 pandemic, as companies can see the short-term wins of getting as much data off-premise as possible but may unwittingly take problematic short-cuts. Of course, having your key business systems on the cloud brings tremendous benefits, such as remote worker flexibility, system availability, security and scalability. However, getting a robust and fully scrutinised plan in place before rushing into raising purchase orders will ensure that your cloud project is not only a success, but that you manage it without customer, staff or supplier impact. What to migrate? 
The first step of any cloud or digital transformation project is to look at your current systems and workloads and weigh up the benefits of moving each of them to the cloud. Consider your current bottlenecks or service issues – are you struggling with branch office or remote workers accessing your Enterprise Resource Planning (ERP) system? If so, it is a prime candidate to move off-premise. Spend time grading each system or workload so that you end up with a prioritised list, as this will then drive your other decisions such as cost and time planning.   Not everything can or should move Whilst it may be a great goal to aim for, be realistic and remember that not all of your systems may be able to migrate to the cloud. Older line of business applications can be a sticking point, especially where vendors are not supporting the installation of these workloads in Azure or AWS. When you come across these sticking points, it’s sensible to add an item into your IT planning roadmap to re-evaluate these systems at a future date and consider other vendors or solutions that have a cloud-centric approach.   Hybrid has a place Similarly, although it is rewarding to remove every single server from the office, there are times when having a local server to provide quick cached access to data, or the ability for seamless sign-on at a branch office with poor connectivity, can make the difference between a successful cloud migration project and a nightmare of stressed and upset end-users. Being open to the possibility of re-using some physical equipment, as long as it’s in warranty and up to date, is a sensible mindset to have during the planning stage.   Assess your skills & tools Before kicking off the cloud migration project, take a close look at the skills and tools you have internally. Your own in-house team may be the best suited people to move an internally developed system to the cloud, but they may be less suited to move your file shares or e-mail systems. Where there is a skill gap, decide whether or not there is value in training your team to undertake this work. Unless you have numerous similar migrations to undertake, it’s likely to be the right time to engage a consultant or Managed Service Provider ( MSP) who are experts in this type of project and have the staff and tools to make it a success.   Cash is king Make sure you budget appropriately for your cloud migration. Ensure you’ve accounted for software licensing costs – both one-off and ongoing, support charges, migration tool costs, external contractors, and of course the costs of the public or private cloud infrastructure. Understand how the costs scale as you increase your workload or user count, to enable your future cost forecasting needs to be as accurate as possible. Calculating the cost of the public cloud infrastructure is tricky even for experts, as some workloads behave differently once moved into the cloud, so on-premise metrics – while a decent guide – aren’t always perfect. As such, allow a contingency in the budget to cover the need to add extra resources to cloud if required. Don’t stretch yourself too wide or thin, you’re better to over commit to a single cloud migration and come in with budget to spare than to try and push through multiple projects in a short time and get budget constrained if any of them run over.   Keep security front of mind The strongest benefit of the cloud – data accessible from anywhere, any time – is also its biggest weakness, that’s if security isn’t factored in from the start. 
On the plus side, the planning stage of a cloud migration project is the perfect time to re-assess current IT and operational security to ensure that the cloud platform is as secure as possible. Multi factor authentication – MFA – should be mandatory on any sign-in platform that provides access to company data. Consider all your cloud workloads, along with future cloud plans, and see if there is a unified option for MFA that will work on all the platforms you plan to use. This makes IT security administration and support easier, plus the end users have less applications to deal with to gain access to the company cloud. Likewise, single sign-on – SSO – ensures the users have less passwords to remember and again makes the ongoing security administration easier. Most major cloud applications support the largest SSO providers such as MS Azure AD, Okta or OneLogin.   Plan the end One mistake we see all too often is companies having a great plan for their cloud migration itself, but no plan for the steps afterwards. Ensure your plan has steps to make people responsible for monitoring the cloud platform, detail who the key support escalations are in case of future issues, and assign someone to review the migration to learn what could be improved and if there are any actions that may need to be taken that didn’t make the original plan. Allocate a team member, or your external IT support, to document the new cloud platform as soon as possible to ensure all the key knowledge is captured. Finally, once your project is complete you may well have retired several hardware devices. Make sure you have a plan to dispose of them securely and in an environmentally conscious way. Where devices contain hard disks, a certificate of destruction should be retained for each disk for at least 7 years in case of a data breach which could be attributed to poor data disposal practices.[/vc_column_text][/vc_column][/vc_row] ### How machine learning can give you the data security you need [vc_row][vc_column][vc_column_text]If we have the right processes in place to protect our data and control access to files and folders, we will naturally avoid more data breaches and remain compliant with regulations such as GDPR.   There are issues to overcome however and they mainly arise from a very traditional way of managing data. Internally, there's often an element of ‘my data is always more important than everyone else's data’, or ‘It must be kept forever’ or ‘It must be treated in a different way’. But when it comes to compliance, you have to make decisions. You can't just implement cloud based collaboration platforms such as Teams or Office 365 without implementing clear data governance policies. It's about getting people together, making a totalitarian decision. Agree how you are going to manage the data and then go ahead with it, no buts or ifs. The other problem is, even if you make those decisions, how do you police them? You can’t stand behind your business users to make sure they are classifying the data correctly? And you can’t expect your IT team to do this within their role either. You have to rely on an element of automation or machine learning. The automation could be a combination of encryption and rights management services in combination with data loss prevention technologies and cloud application security. We need an audit trail of who's got access to which data, why and when. Torsion is one such automated machine learning platform. 
It works with collaboration tools such as Sharepoint and Microsoft Teams to automatically monitor and detect any inappropriate access, out of date folders and permissions, or the movement of files. If anything doesn’t look quite right it will promptly alert a business user associated with the file and shut down any potential breaches. Other than that, it can run seamlessly in the background until and unless it is required. Owners or creators of files and folders can certify and revoke access themselves, taking the responsibility away from the IT function. Peter Bradley, CEO at Torsion says: “Businesses of all sizes must empower the role that employees and partners play in safeguarding the company’s data. However, expecting them to be aware of security breaches on a day to day basis when there is such a growing volume of files and data is simply unrealistic. “By automating the process of file sharing and prompting the business user if anything looks suspicious, inconsistent or not relevant, they can carry on with their main responsibilities confident that they are also keeping their data safe and secure. “When we show businesses how the machine learning technology works, and that data security doesn’t have to be an IT problem any more, they sit there silent for a minute and then say it’s how they should have been thinking about this problem for the last 20 years. We've been thinking about this problem incorrectly the whole time.” How machine learning is being used for data security Patrick Reynolds, Head of Operations at Neotas, who use Torsion’s machine learning software says: “We are a firm believer in using SharePoint, data encryption and IRM (Information Rights Management). Keeping everything secure and in house is key to what we do.  We share thousands of links around certain structures and have a lot of restrictions on our libraries. “As part of our ISO 27001 accreditation we really need to have a robust system in place and that comes from SharePoint being at the heart of all of our documentation. We need to mitigate all the risks and a solution like this gives us peace of mind by being able to make changes really quickly throughout all of our architecture.” Others are also turning to the automated solution because of the coronavirus pandemic. One particular government agency has had to furlough some of their workforce, but where possible they re-assigned responsibilities so as many employees can continue to work from home as possible. However, to ensure each employee could start working in their new roles, they needed to make sure everybody had access to the right files and folders. Granting temporary access to internal data, within a collaboration platform such as Teams or Sharepoint can often be a logistical headache: first establishing who needs access to what information; then updating permissions manually; and finally revoking that access once it is no longer needed. There can often be a period of a couple of weeks where employees can’t do their job as they don’t have access to the information they need.  If access is not revoked when the temporary roles end, it leads to out of control access and potential security breaches. This is the situation now facing many businesses taking on temporary workers during the pandemic. Using the machine learning from Torsion they automatically manage their files and folder permissions. The technology works within their Teams platform to monitor access, alert business users to any security issues and remove access when it is not needed. 
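The general pattern behind this kind of time-boxed access – grant a permission with an expiry date, then revoke it automatically – is simple to express. The sketch below is a generic illustration of that idea rather than Torsion's actual implementation; the function and resource names are invented for this example, and a real system would persist the grants and call the collaboration platform's own API.

```python
from datetime import datetime, timedelta

# In-memory record of temporary grants: (user, resource) -> expiry time.
grants = {}

def grant_temporary_access(user, resource, days):
    """Record that a user may access a resource until the expiry date."""
    grants[(user, resource)] = datetime.utcnow() + timedelta(days=days)

def revoke_expired(now=None):
    """Remove every grant whose expiry has passed and return what was revoked."""
    now = now or datetime.utcnow()
    expired = [key for key, expiry in grants.items() if expiry <= now]
    for key in expired:
        del grants[key]
    return expired

grant_temporary_access("alice", "Finance/Q2-forecast.xlsx", days=14)
print(revoke_expired())  # [] today; the grant disappears automatically after 14 days
```

The value of automating this, as the example in the article shows, is that revocation no longer depends on someone remembering to do it.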
They can grant temporary access to files and folders for a set period of time and the machine learning automatically works out what information a team member might need, in addition to their existing permissions, and grants them access for a fixed number of days. At the end of the period the access is automatically revoked. The temporary team members are up and running with everything they need almost immediately, and the company has a clear audit trail of who has access to what information, when and why. Summary If we use the right automation tools to stay in control of who has access to what, why and when, we will consequently be in control of our data sharing and compliant regardless of the volume of files being shared. We must remember that good compliance does not necessarily give you data security, but data security gives you good compliance. Proving compliance to the auditors shouldn’t be a headache either. Because data security is being automatically managed and controlled, when it comes to proving your compliance it should be as simple as pressing a button to export a report. If you implement the right technology, we can all be confident that our data is secure and at any one time we can see who has access to what. Compliance and data security can just become part of the woodwork, the way that people work.[/vc_column_text][/vc_column][/vc_row] ### Why Now Is The Time To Enter eCommerce [vc_row][vc_column][vc_column_text]The number of daily online sales has increased by 56% since the beginning of March, coinciding with an 85% reduction in brick-and-mortar footfall. This is because, for many under lockdown, eCommerce offers a source of goods and services they might otherwise have to queue hours for. Large store-based retailers are experiencing massive surges in online orders and are expanding their eCommerce facilities and home delivery options in response. Since 2010, eCommerce has been growing exponentially from a 5% share of retail sales to over 16% in 2019. As the global pandemic continues to transform our daily lives, lockdown and social distancing orders are taking their toll on high street retailers and look to have a lasting impact on the ways in which we shop.  The resources available People have been asking themselves if now is the right time to start an eCommerce business for decades. Simultaneously many have wondered whether they’ve missed the eCommerce boat. The truth, however, is that online marketplaces such as Amazon, eBay and Etsy have made starting your own online business easier than ever. These services provide inbuilt shopfront, payment processing and order management systems which simplify the process of launching your business significantly. Furthermore, services such as WordPress and Squarespace have revolutionised web design. Ten years ago, a website would cost thousands of dollars and weeks of work to complete. Today, they can be built and deployed within days for a small fraction of the price, all without a single line of code on your part. This has allowed entrepreneurs to establish themselves online, taking advantage of existing marketplaces and affordable web hosting solutions to build their businesses with little investment outside of their product marketing. Evolving business models Over 4 billion people now have internet access, and that number grows by the millions each day. This incredible rate of growth brings with it new opportunities and new niches and opens new markets. 
As a result, recent years have seen a number of convenient, small scale business models emerge and become wildly popular. Dropshipping, one of the most popular eCommerce business models today, is a method in which the retailer never actually lays a hand on the product. Instead, after the product is sold, the retailer purchases it from a third party and has it shipped directly to the consumer. This method has a number of advantages: because little investment is required, the retailer negates a considerable amount of risk while simultaneously simplifying their supply chain and distribution requirements. Subscription boxes are a relatively recent eCommerce model in which consumers pay for physical deliveries of goods on a recurrent basis. They require significantly more preparation and design but have proven wildly popular for a number of reasons. Convenience is the key to success in today’s retail markets, and subscription boxes offer ready-made product selections for a range of niches from meal kits to personal grooming. How e-Learning can fill the gap in your skills Online learning (e-Learning) has rapidly grown in popularity over the last five years, becoming an industry worth hundreds of billions. It now plays a leading role in the lives of those looking to learn new skills of all kinds from home. This is made possible by the numerous free sources of learning material hosted across the internet, and by YouTube in particular, where organisations like Khan Academy provide free lectures. For those in need of something more official, many of the world’s top universities offer online degrees in subjects from machine learning to business management. Digital learning is cheaper, more convenient and higher quality than ever before. Whatever skills you feel that you lack can be learned and honed through online learning, giving you the confidence and ability to launch and maintain a successful eCommerce business. Marketing and SEO Search engines like Google now understand user intent and recognise shady attempts at getting into the first page of results. Instead of metadata and tags, their algorithms have become so advanced that they understand user intent and site content. This means that engines now reward quality content and genuine user interaction over spam and fake social media shares. For entrepreneurs new to the eCommerce world, there couldn’t be better news because this allows you to focus on quality content relating to your niche rather than worrying about learning shady SEO techniques and potentially losing out to businesses with better resources. Online advertising is more integrated and flexible than ever, thanks to platforms like Adsense, Google’s website monetisation and advertising service. It allows businesses of any size to serve adverts in a targeted manner, meaning that they’ll only be displayed on pages relevant to the businesses target market. Even better, you’re only charged based on user action. In other words, you are charged based on the number of times your advert is clicked rather than how frequently it is displayed. These factors, alongside budget caps and the lack of long-term contracts, make Adsense ideal for small businesses and entrepreneurs in need of affordable, effective online marketing - something that up until a few years ago just was simply not possible. The internet is expanding at an unprecedented rate, opening new avenues of monetisation and expanding niche markets. 
The opportunities and resources available to entrepreneurs looking to start their eCommerce business are more numerous and easily accessible than ever: whatever skills you feel you lack can be learned through e-learning. Furthermore, modern business models like dropshipping can reduce the risk and investment required to enter the market. Altogether, these factors make now the perfect time to start your eCommerce business.

### Big data transformation: Top analytics trends in 2020

Alongside the resurgence of various Artificial Intelligence technologies, Big Data transformation has brought meaningful developments in the past few years. Today, the Big Data industry is worth $189 billion. Studies show an increase of $20 billion in 2018, and on that trajectory the market is predicted to reach $247 billion by 2022. The Walt Disney Company chairman once said: "Technology is lifting the limits of creativity and transforming the possibilities for entertainment and leisure." Disney has a significant customer base, which ultimately generates massive volumes of data, and by using Big Data analytics the company could extract the magic from it again – a problem that could only be solved with Big Data analytics tools. We cite this example to show how large organisations manage the flow of data and gain insights to improve their business using the latest technologies. With 2020 already under way, we can once again put our caps on and predict the top analytics trends for the year.

What is Big Data?
Big Data is extensive data, structured or unstructured, which helps businesses to establish patterns in human behaviour and interactions. Companies leverage this data to enable better decision-making and to understand what customers look for. In 2001, the analyst Doug Laney introduced the 3Vs: Volume, Variety and Velocity. As the challenges accelerated year after year, an additional characteristic was defined, giving the 4Vs of Big Data:
- Volume refers to the scale of data
- Variety refers to the different forms of data
- Velocity refers to the analysis of streaming data
- Veracity refers to the trustworthiness of data

Big Data Analytics Trends and Solutions
The year 2020 is another year of great innovation and evolution for Big Data solutions companies. Read on for some thoughts on Big Data trends and predictions.

Augmented analytics is the future of data and analytics
Augmented analytics is an emerging trend already used heavily by banks. Data processing is automated using Machine Learning and Natural Language Processing (NLP) to deliver precise results in a simple, accessible format. Data is pulled through a streamlined, automated pipeline from various sources – cloud data, internal data, external portals and other locations – so the analyst can combine it all, process it, check for redundancies and prepare it for analysis. The data is stored in clusters and used for quick real-time analysis with sophisticated tools, and the resulting information is then mined automatically to identify patterns and trends.

The cloud is the new data lake
Cloud-based technologies are growing fast, moving data integration and preparation from on-premises solutions to the cloud. If you want to stay with the prevailing trend, use hybrid deployments. Be an early cloud adopter and move your operations online.
Move data wholesale and use cloud storage for dynamic workloads, following a multi-cloud methodology.

Digital transformation sits at the top of data strategies
Without data, there can be no digital transformation. New technologies are being developed to help businesses move their operations from manual to online. Digital transformation has become a priority for most activities, and the world is rapidly going digital. According to IBM research, one in three leaders uses digital transformation to help their company obtain accurate data.

Artificial Intelligence and Machine Learning will continue to evolve
AI and ML adoption is accelerating in data-driven organisations. ML and AI algorithms can be used within data pipelines and alongside more traditional BI and data integration platforms. 2020 will bring automation frameworks that allow data scientists to get much closer to production-ready output.

Data-as-a-Service is one of the most underplayed trends
Data-as-a-Service provides access to data online from shared spaces. It is useful for large organisations where employees need to share vast amounts of data between departments, and it works much like downloading a volume of data – music or movies – from the internet. Data-as-a-Service is an architecture built around a central hub within an organisation; it promotes self-service and improves productivity. Keeping data in one location helps multiple users access it with ease.

Evolution of healthcare services
Medical data is vast: it records patients' diseases, treatments and preventive measures. Previously there was no proper tool to connect all medical records centrally, but IoT devices now make it possible to manage hospital equipment. Many researchers are exploring IoT devices that track and monitor patients' conditions, and a few have even created robots to attend to patients and perform operations.

R&D across industries
Big Data analytics is now used across R&D operations to manage organisational insights and customer preferences, and to create better products that customers actually want. R&D departments use Big Data analytics for social media management, manufacturing automation, improving product quality, better customer support and automating the sales pipeline. Robust Big Data analytics services can extend a business's reach and increase its ROI.

Quick questionnaire
How do companies deal with Big Data? Companies mainly apply it across three operations (the approach Rolls-Royce recently used in adopting a Big Data-driven approach):
- Design
- Manufacture
- After-sales support
How do companies put Big Data analytics to work? It is typically applied to two operations (as Amazon recently did to serve its customers better):
- Creating a personalised recommendation system
- Improving customer service operations

On a final note
It's clear that new technology like Big Data is transforming how companies operate. Digitalisation is everywhere, and the latest technologies spread rapidly once data is embedded into operations.
Ultimately Big Data is used to enable smart decisions and help companies use a robust analytic environment, which includes predictive analytics, descriptive analytics and prescriptive analytics.[/vc_column_text][/vc_column][/vc_row] ### Intelligent Machines: How They Learn, Evolve & Make Memories [vc_row][vc_column][vc_column_text]Machines are slowly taking over. Every day, complicated algorithms serve up targeted advertisements; dictate the search results on Google, and serve to protect our sensitive data from other algorithms trying to steal them. It would probably surprise most people to learn just how much-advanced computer programs assist and influence our everyday lives. Their increasing efficiency and semi-autonomy make the trillions of daily computations they perform practically invisible around us. One particularly taken-for-granted form of machine labour is the computational shifts put in to keep our payment details safe and guard against fraud. Artificial intelligence systems are locked in an arms-race with malicious programs. They have to keep adapting (i.e. ‘learning’) to stay ahead of their ‘rivals’. Not only that, but the AI has to know to reconfigure to the tune of the PCI DSS and GDPR requirements -- complex security demands from both North America and Europe. In the global, transnational age, our AI systems are constantly juggling data -- and increasingly outpacing and outperforming humans. But what is artificial intelligence? And how do integrated circuits and electrical currents manage anything at all -- never mind learning to manage new tasks?  To understand, we will need to briefly cover a topic that is still unresolved after nearly a century of debate. Defining intelligence It is surprisingly hard to define intelligence, and there is no uncontested definition for it. Some AI researchers think intelligence entails a capacity for logic, understanding, planning, self-awareness, creativity, and learning, to name a few. But opponents would argue that these are human-centric viewpoints. Instead, it might be easier to give intelligence a broad and malleable definition: rather that intelligence is “the ability to accomplish complex goals”. What this means is, there can be broad and narrow forms of intelligence. A calculator, for example, is narrowly intelligent in that it can solve arithmetic much quicker than humans. But it cannot do much else, and therefore an infant could be said to be broadly more intelligent than a calculator. But ultimately, intelligence is thought to be all about information and computation. This has nothing to do with flesh and blood, and with this definition, there’s no difficulty in recognising that the machines around us, such as our card protection relay systems, have a degree of ‘intelligence’ to them. Creating artificial intelligence from nothing If intelligence is just information and computation -- then I can hear you ask -- what are information and computation? If there is anything we learned in physics class, it’s that everything in the universe is made up of really small particles. Particles don’t have intelligence on their own. So how can a bunch of dumb particles moving around according to the laws of physics exhibit behaviour that we would call ‘intelligent’? In order to understand this question, we will have to explore the concept of ‘information’ and memory storage more closely. The nature of information and memory storage space Look at a map of the world and what do you see? 
You see particles arranged in particular patterns and shapes. Particles in the shape of the British Isles, for example, with more particles in the shape of letters that spell 'Great Britain', tell your brain that you are looking at a representation of Great Britain on a map. In other words, the particles -- arranged in a particular way -- have presented you with information about the world. A map is a simple memory device as well as an informational device. Like all good memory storage units, a map can encode information in a long-lived state. Long-lived states are important because they allow us to retrieve information time and time again. Some things in the physical world make terrible memory storage devices. For example, imagine writing your name in the sand on the beach. The sand now contains 'information' about who wrote in it. But if you returned to the beach a couple of days later, the likelihood is that your name would no longer be there. In contrast, if you engraved your name onto a gold ring, the information would still be there years later. The reason sand is bad for storing information -- and why a gold engraving is good -- is that reshaping gold requires significant energy, whereas wind and water will effortlessly displace sand granules. So we can define information as patterns that make sense to us, and memory storage as how good something is at keeping that information in one piece, for retrieval at a later date. In computers, the simplest memory storage unit has two stable, long-lived states that we call a "bit" (short for 'binary digit'): either a 0 or a 1. The information it reveals to the observer depends on which of these states it is in. These two-state systems are easy to manufacture and are embodied in all modern computers, albeit in a variety of different ways.

The art of computing
So that's how a physical object can remember an observation. But how can it compute? A computation is essentially a transformation of one memory state into another. It takes information and transforms it, implementing what mathematicians call a function. Calculators do this all the time. You input certain information, such as 1 + 1, press the equals sign, and a function is implemented to give the answer of 2. But most functions are much more complicated than this. For example, the machines that monitor weather patterns use highly complex functions to predict the chance of rain tomorrow. In order for a memory state to transform information, it must exhibit some complex dynamics, so that its future state depends in a programmable way on the present state. When we feed the memory state new information, its structure must respond, change and re-order itself into the new informational state. This process happens by obeying the laws of physics, but in such a way that the laws of physics transform the memory state into the result that we want. (Rather like how diverting a river does nothing to alter the laws of nature that made the river exist in the first place, yet achieves the end result the diverters desired.) Once this happens, we have a function.

The simplest function: memory storage and information transformation with a Nand gate
A Nand gate is the simplest kind of function. It is designed to take two bits of information (two binary memory states) and output one bit. It outputs 0 if both inputs are 1, and in all other cases outputs 1.
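As a minimal illustration of that truth table, the same gate can be written in a couple of lines of Python (a sketch added here purely for clarity):

```python
def nand(a: int, b: int) -> int:
    """Return 0 only when both input bits are 1; otherwise return 1."""
    return 0 if (a == 1 and b == 1) else 1

# The full truth table of the gate:
for a in (0, 1):
    for b in (0, 1):
        print(a, b, "->", nand(a, b))
# 0 0 -> 1
# 0 1 -> 1
# 1 0 -> 1
# 1 1 -> 0
```

Because Nand is functionally complete, any other logic function – and therefore any computation – can be built by wiring enough of these gates together, which is exactly the point the physical switch-and-electromagnet example below makes.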
For example, if we connect two switches in a series with a battery and an electromagnet at the end, then the electromagnet will only be on if the first switch and the second switch are closed (“on”). But if a third switch was to be placed under the electromagnet, such that the magnet will pull it open whenever it is powered on, then we have a Nand gate -- a situation where the third switch is only open if the first two are closed. There we have a complex dynamic, where one state of memory (and by extent information), follows the laws of physics into another state. Today we have many more complex and efficient types of Nand gates, but the premise is the same. Therefore, seemingly “dumb” lifeless particles seem to exhibit behaviours that change and develop from one state of existence to the other. The final jump: from memory storage to information transformation, to learning. The ability to learn is arguably the most fascinating aspect of general intelligence. Now that we have explored how a seemingly dumb clump of matter can remember and compute information, it is time to ask: But how can it learn? With the functions described above, it is human engineers who arranged the physical states in such a way that the laws of physics are ‘tricked’ into transforming information. For matter to learn, it must rearrange itself to get better at completing the desired functions -- simply by obeying the laws of physics. Imagine placing a rubber ball on a memory foam mattress and then lifting it back up again. What would happen? Likely a small impression on the surface, and nothing more. But keep doing it, and eventually, an indentation appears. The ball nestles in the indentation. The point of this simple analogy is that, if you do something often enough, you can create a memory -- information about where the ball is often placed. The memory foam “remembers” the ball -- information is created about where the ball sits -- marked by the indentation. Machines learn in a manner that is not unlike how the synapses in our brains learn. In fact, the field known as machine learning utilises what are known as artificial neural networks in order to bring about a state of learning. If you repeatedly put bits in certain states in a network, the network will gradually learn these states and return to them from a nearby state. For example, if you’ve seen each of your family members many times, then memories of what they look like can be triggered by anything related to them. Artificial neural networks act as feedforward functions, meaning data is only input to flow in one direction. If we think of a neural network as a lot of nodes connected by wires, then the nodes can perform mathematical calculations at certain time steps by averaging together all the inputs they receive from neighbouring nodes. Do this long enough, and you keep exposing the nodes to the information that each carries. This is very similar to what is known as Hebbian learning and is similar to how synapses in the brain “fire together and wire together”, forming memories and associated memories in the brain. Just by following the laws of physics, after engineering an initial feedforward function, artificial neural networks can ‘learn’ surprisingly complex things. The nodes that get strong input from one another converge, and drop the input from less relevant nodes. Thereby completing an activation function, or entering a new state -- a learned process. Conclusion Intelligence is the ability to achieve complex goals. 
Right now, advancements in machine technology are enabling machines to learn and create their own functions and memories, and adapt to new challenges. All that is needed for this type of artificial intelligence is an initial human-engineered spark, and then for the natural progression of the laws of physics to take over. So there we have it, a way for seemingly dumb, non-conscious, not-alive matter to act spookily a little like our own brains. As of right now, most AI systems are “narrowly” intelligent. But it is estimated that, at some time near the middle of the century, true “broad” artificial intelligence will come to fruition. (If it’s ever possible, that is still up for debate.) One thing for certain is: machines will continue to learn, getting faster; better, and more efficient. Of that, we can expect a rising displacement of many jobs and increasing automation, but also the development of more social occupations. The social world is the last bastion of humanity that is yet to be contemplated by the great algorithm revolution.[/vc_column_text][/vc_column][/vc_row] ### Where are all the truly transformational IT projects? [vc_row][vc_column][vc_column_text]Where does the money that businesses invest in IT actually go? It’s a question that many a CFO has undoubtedly pondered over the years – and the honest answer may be a difficult one to stomach. As recent research from the Cloud Industry Forums (CIF) shows, organisations are currently spending 41 per cent of their budgets on simply managing infrastructure. That’s a remarkably steep percentage - undoubtedly, far higher than it would be in an ideal world. In many ways, it’s a confusing statistic, too. In 2020, should IT really be spending their days manually setting-up employee laptops, worrying whether all their system software is up to date, and firefighting tech problems? In an era of tools like SaaS, automated updates, and rapid deployment systems and services, shouldn’t it be possible to automate many time-intensive tasks? Despite the widespread availability of technology that can automate the majority of infrastructure management, the response seems to be ‘no’. Keeping the lights on remains a costly and time-consuming activity. Where else do IT budgets go? The ever-ambiguous ‘projects’, of course. But what exactly are these projects about? And what do they really achieve? When looking to innovate, many IT departments attempt to run before they can walk, trying to keep pace with business growth without proper strategic planning. Projects often become a ‘quick-fix’ solution. At first, a lot of the goals sound worthwhile: “modernisation of the legacy”, “cloud migration programme”, et cetera. The reality of these projects, however, is rather more boring. Initiatives like these often amount to little more than “lift and shifts”. Organisations are taking what was once on a data centre and dropping it into the cloud with little to no refactoring or consideration of the benefits, or indeed the costs, associated with public cloud environments. This is ‘transformation’ in name only. Much like endlessly raising the foundations on a house without ever actually constructing the building, organisations become trapped in a cycle of trying to maintain infrastructure, or in a cycle of shifting the location of legacy apps. IT justifies these sorts of projects as a necessity before the ‘real’ transformation can begin; but true transformation, in many cases, never gets the chance to begin. IT ends up losing sight of innovation. 
‘Catch-up’ or ‘survival’ mode takes over and, as a consequence of this, the spending on managing and maintaining infrastructure remains at 41 per cent. Until we realise transformation and the real benefits of the cloud, we will never reduce this spending level. It’s an approach that doesn’t come without cost: a business’ IT budget plus whatever opportunities have been lost. Time for more innovation IT isn’t entirely to blame for this situation. IT knows that its role is to innovate, but on such limited budgets, this can seem nigh on impossible. The answer, in part, lies in addressing the issue of budget and resource. After all, an IT professional is only as good as their tools – and team. CIOs regularly blame CFOs for using financial risk as a reason for not approving projects. But CIOs often neglect the fact that they can leverage the conversation around ‘risk’ to their advantage. CFOs are some of the most attuned individuals to the reality of risk – it’s something they identify and mitigate every day. Winning project approval from the CFO involves making a clear business case about what an organisation’s digital transformation will deliver in concrete terms. Another request for server purchases, just because, is not good enough. Digital transformation projects need to focus on an ambitious vision that will move the needle for the organisation and, ultimately, increase revenue. The trick is to think big – but measurably so – as the C-suite will want to track the progress of the project as it advances. The coronavirus situation has really proved that point – IT budgets can be increased when the conversation around risk changes. Necessity is the mother of invention and innovation and never has IT had such an important opportunity to be the driving force of change. Racing car manufacturers have designed and are producing breathing aids, clothing manufacturers are making medical masks and scrubs, engineering companies have invented entirely new ventilators, and the NHS is transforming its IT systems to ensure critical medical equipment is available to the most at-risk facilities. Organisations are transforming at a rate we have previously never seen. And it’s all because the conversation around risk has changed; it’s no longer a case of ‘can we afford it?’ for businesses, but ‘how can we afford not to?’ In other words, while IT’s predicament might seem hopeless, it does have an ‘out’. Business transformation, in essence, is a risk, and for IT to move forward, CFOs need to understand how the benefits outweigh, and even mitigate, the risks. Still, it’s up to IT to make the case and do the convincing.[/vc_column_text][/vc_column][/vc_row] ### 5 Ways AI Is Helping In The Fight [vc_row][vc_column][vc_column_text]We are living in strange and uncertain times and many people are struggling, like the emails that flood our inboxes constantly inform us. As a direct result of the virus, over a third of the world’s population is currently on lockdown in their homes, throwing social and economic systems across the world into dismay. In unprecedented times we must look to unprecedented solutions. While artificial intelligence (AI) certainly has a fair amount of precedent it has up to this point been a fringe technology regarded with suspicion by the average user. However, with the dire need for fast and efficient data management, there are a number of areas in which AI is proving itself to be a powerful ally in the fight against coronavirus. 
Predicting Epidemics AI systems have already proved themselves to be powerful predictors of outbreaks and health trends. Most notably the AI-powered algorithm Blue Dot sent an alert about an outbreak in Wuhan, China, on December 31st, 2019, over a week before the World Health Organisation and the Center for Disease Control released their announcements. Months after the initial outbreak Blue Dot continues to help healthcare officials keep track of new outbreaks and monitor the spread of coronavirus. The power to predict outbreaks may seem a little late to those of us already living in locked down cities, but predictive algorithms like Blue Dot could be a vital part of managing the virus over long periods of time. Some are already asking whether, after we manage this outbreak, coronavirus could return with a second wave of infections. Having AI predictors on our sides could help us be more prepared and better mitigate the damage in future outbreaks. Drug Research and Deployment Perhaps more pressingly right now, AI can also help with the research, development, and deployment of potentially life-saving drugs. With infections and deaths mounting rapidly time is of the essence and if there’s one thing AI systems excel at is speeding up time-consuming tasks. There is some valuable precedent for AI-designed drugs. Exscientia, a British pharma start-up, used AI to develop a drug for obsessive-compulsive disorder (OCD) in just 12 months, a process that would have normally taken 4 to 5 years. It’s still too early to tell how this processing power could be applied to coronavirus drug research, but there are some promising movements. Take Insilico Medicine, who developed an AI system designed to identify the molecular structure of the COVID-19 virus in a matter of days where it could have taken human researchers months. Reducing Face-To-Face Contact One of the contentious aspects of the AI is how it could be used to replace human workers, a criticism that has suddenly become an advantage in this age of self-isolation. Any system that reduces human-to-human contact is of value, especially in high-risk environments like hospitals. That’s partly why Chinese eCommerce giant Alibaba funded the development of an AI system for diagnosing coronavirus, which it claims to do in 20 seconds with 96% accuracy. Another area in which AI is helping distance individuals is in claims processing. Also in China, insurance company Ant Financial uses an AI system to process health insurance claims without the need for the patient to meet a customer representative, further minimising the potential spread of the disease. Managing The Outbreak Fighting coronavirus is more than just curing the disease, it’s also about managing the healthy in order to reduce the chance of further outbreaks. This is the step that may last years, so it’s important to look into technology solutions that could help ease the burden of outbreak management. First came pre-diagnostic systems that told users whether they had the virus or were at risk of contracting it, like the Chinese app that determined exposure and viral contact. However, moving forward it may be the role of machine learning technology to enforce social distancing in workplace and public environments. Landing AI, a tech start-up by Andrew Ng, claims to have a system that does just that, recognising individuals through security cameras and sending an alert whenever anyone gets within the recommended two meters of each other. 
Fighting Fake News While coronavirus is spread through coughs, there is an accompanying psychological virus being spread through fake news. Scammers, fear mongers and the unwillingly misinformed are sharing dangerous misinformation about government conspiracy, symptoms, and potential cures, all of which are distracting from the life-saving information from official sources. Thankfully AI can help here too. Google, Twitter, and Facebook are putting their AI algorithms to task rooting out misinformation and redirecting users to official news sources.[/vc_column_text][/vc_column][/vc_row] ### Protect your productivity with DaaS [vc_row][vc_column][vc_column_text]No matter the reason for working away from the office, as a business owner, leader or IT expert, you’ll want to know that your teams are working in the most effective and productive ways possible. We’ve all seen a lot of articles about how digital transformation goes hand in hand with more industrious working methods – less time wasted, better time used. However, we also know that changing the mindset of a whole workforce from traditional set-ups to more digital-friendly ones can cause friction and there will always be a lag in natural take-up unless encouraged or enforced. But what if you need to make changes in the face of emergencies or to tackle unforeseen challenges? IT teams now need to safeguard the entire organisation’s productivity through homeworking tools, rather than focusing on the everyday continuity via backups and Disaster Recovery. The less structure there is in place, the harder it will be to ensure those necessary levels of productivity. Under pressure For many, the swift need for virtual access to desktops leads to unprecedented pressure – on you as a business leader, and your VPN. If you suddenly need a lot of your employees to use Remote Desktop Protocol (RDP) sessions to access their internal desktop machine, your network will feel the strain, even with a VPN concentrator. Not to mention the data security and compliance risks that increase with this kind of connection. An alternative is to adopt desktop virtualisation using desktop as a service (DaaS). DaaS is a key technology for increasing collaboration and boosting productivity, and something which can form the cornerstone of your workplace strategy – whether in or out of the office. By virtualising the workstation, staff get secure access to desktops, data and applications from any location or device. This creates seamless experiences for the end-user and makes it easier to work as and when staff need to. DaaS accesses the data centre rather than the corporate network, so there is little or no need to run a VPN. On top of the optimisation, because the session is cloud-based, no data is stored on the user’s local machine – reducing your security and compliance risks immediately. Get closer, and lose the latency For those working out of the office, one of the biggest concerns is the drop in their performance. This isn’t (necessarily) because the team is now working in their pyjamas, but more likely the network latency is causing delays between them and your on-prem resources. It’s understandable that your staff require access to these resources at all times, but what if they were just a bit closer to get rid of the lag and make everyone more productive? This can easily be solved by moving your resources into the cloud, so that they are close to the virtual desktops or DaaS environment. 
Keep it in the family
Enabled by the right processes and technology, employees can work far more efficiently and accurately. Remove the things that people find frustrating about the workplace and they become more engaged in their work. User-experience experts will tell you that with any interface, users need clarity, predictability and familiarity. It sounds simple, but making sure your team can get up and running with a new type of technology hinges on their ability to pick it up quickly. That is why DaaS is packaged to provide your employees with full access to all their applications and data, delivered via a familiar Windows desktop experience.

Time is money
When most people think about digital transformation, they see roadmaps, staging and, worst of all, potential delays. DaaS is a step in the right direction, not a complete overhaul. Multi-tenanted DaaS platforms, like ours, are live right now, with new customers deployed in a matter of days. This immediately reduces timescales compared with the average cloud project, allowing your business to deliver against the clock and move the majority of your workforce onto a home-working initiative without worry. If you rely heavily on third-party suppliers who need to log in to access files that currently live on your on-prem services, or if staff need access to on-prem resources, then VPNs or a move to the cloud may still be needed. Don't think of those as blockers; it's about moving in the right direction. The main thing is to look for solutions that deliver without giving you sleepless nights, whether forced remote work happens again or needs to go on for longer. The benefits are clear, and a DaaS set-up can be consumed on a per-user, per-month basis, so you won't be tied into costly technology that might not be needed at the same scale in the future.

### The past, present and future of the CVSS

The common vulnerability scoring system (CVSS) provides a way for organisations to assess the principal characteristics of a vulnerability and produce a numerical score reflecting its severity. The CVSS has proven useful for assessing vulnerabilities consistently and for standardising security policies. However, it has also shown some shortcomings in addressing the needs of users outside traditional IT environments. In this article Jonathan Wilkins, director at industrial parts supplier EU Automation, explains. When fully protected, technological devices, both on and offline, can optimise a number of processes on the factory floor. By connecting devices to the Industrial Internet of Things (IIoT), manufacturers can collect data for a variety of purposes, such as monitoring production in real time, detecting bottlenecks, optimising energy consumption and facilitating predictive maintenance. However, the growing number of devices connected to the IIoT also means that hackers have more opportunities to infiltrate a company, access sensitive data and disrupt production. According to NETSCOUT's Threat Intelligence Report, the average time required to attack an IIoT device is just five minutes. SonicWall reports that IoT malware attacks increased by 215.7 per cent in 2018, and the rate of cyberattacks is expected to keep rising. Take a programmable logic controller (PLC) as an example. It is an automated decision-making tool that monitors the state of connected devices and makes decisions to streamline processes.
As technology has advanced, PLCs have become equipped with remote access capabilities for ease of maintenance and increased flexibility when controlling other devices. To remotely monitor and control processes, PLCs must be connected to the internet. However, this exposes the technology to cyber-attacks, which could lead to extremely serious consequences, such as the Siberian gas pipeline explosion in 1982. The CVSS allows manufacturers to categorise their PLC’s potential vulnerabilities and ensure that the most dangerous are patched before an attack occurs. Understanding the metrics The first version of the CVSS was developed by the National Infrastructure Advisory Council (NIAC) and launched in 2005 with the goal of providing a free and universally standardised method to assess software vulnerabilities. Currently, the CVSS has reached version 3.1 and consists of three metric groups: base, temporal and environmental. The base score, measured from zero to ten, represents the intrinsic characteristics of a vulnerability, which are constant over time and across all user environments. This metric considers the impact of the vulnerability should it be exploited. It also provides information on how difficult it would be to access that vulnerability, such as the level of complexity of the required attack and the number of times an attacker must authenticate to be successful. The base score is composed of two sets of metrics: exploitability and impact. The exploitability metrics represent the characteristics of the component that is vulnerable, typically a software application. The impact metrics represent the consequences of a successful exploit on the impacted component, which could be a software application, a hardware device or a network resource. The temporal score represents the characteristics of the vulnerability that may change over time. It considers the level of remediation available for the vulnerability at the time of measurement, as well as the current state of exploit techniques or code availability. Since these parameters may drastically change, so too can the temporal score. Finally, the environmental score enables analysts to customise the CVSS score depending on the importance of the affected IT asset to an organisation. This score allows businesses to calculate the collateral damage potential of a vulnerability in case of successful exploit. In other words, this is about the impact on other equipment, people and businesses if the vulnerability is uncovered. This may drastically change depending on the sector the organisation operates in. Sets of CVSS metrics are usually represented with a textual vector string, which allows users to record the parameters of a vulnerability in a concise format. All about that base Base scores are usually provided by the company selling and maintaining the vulnerable product. Typically, only base scores are published, since they are the only ones that do not change over time and are common to all environments. Base scores can provide a good starting point to assess a vulnerability, but are not enough to have a clear idea of all the risks involved. For example, you might have a vulnerability that, currently, is very hard if not impossible to exploit. However, one year from now someone might release a new tool that allows hackers to exploit it easily. Moreover, base scores don’t consider how critical the vulnerable component is to the workflow of a specific company. 
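Before moving on, it helps to see what the vector string mentioned earlier looks like in practice. The short sketch below uses a hypothetical CVSS v3.1 base vector (the values are illustrative, not taken from any real advisory) and simply splits it into its named metrics:

```python
# A hypothetical CVSS v3.1 base vector: network attack vector, low complexity,
# no privileges or user interaction required, high impact on C/I/A.
vector = "CVSS:3.1/AV:N/AC:L/PR:N/UI:N/S:U/C:H/I:H/A:H"

def parse_cvss_vector(v: str) -> dict:
    """Split a CVSS vector string into {metric: value} pairs."""
    parts = v.split("/")
    version = parts[0]                               # e.g. "CVSS:3.1"
    metrics = dict(p.split(":") for p in parts[1:])  # e.g. {"AV": "N", ...}
    return {"version": version, **metrics}

print(parse_cvss_vector(vector))
# {'version': 'CVSS:3.1', 'AV': 'N', 'AC': 'L', 'PR': 'N', 'UI': 'N',
#  'S': 'U', 'C': 'H', 'I': 'H', 'A': 'H'}
```

Temporal and environmental metrics, when used, are appended to the same string in the same key:value form, which is why the format works well for recording the parameters of a vulnerability concisely.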
Organisations should, therefore, supplement base scores with temporal and environmental metrics to produce a more accurate scoring, specific to their application and industrial sector. Organisations might also want to personalise the scoring by considering factors such as the number of customers on a product line, the monetary losses in case of a breach and public opinion in case of highly publicised vulnerabilities. A vital parameter organisation should consider is the potential impact of a successful exploit on living beings. This is currently not a metric in the CVSS, however, it is of the utmost importance for businesses working in sensitive environments such as the medical device industry or the automotive sector. Without these considerations, you’ll only be able to tell how bad a vulnerability is hypothetically, not whether it is a cause for concern. Worrying about a vulnerability based on its base score alone would be like worrying about a disease based solely on how deadly it could be, and not on whether you might be in a position to catch it. Current version and future developments Currently, the Special Interest Group (SIG) at the Forum of Incident Response and Security Teams (FIRST) is responsible for developing and maintaining the CVSS. On June 17, 2019, FIRST released the latest version of the scoring system, CVSS v3.1, with the goal of improving the overall ease of use of the 3.0 version without introducing new metrics. This means the latest developments focused on usability and clarity, rather than on substantial changes. For example, the definitions in the user guide were revised. The SIG, which is composed of academics and representatives from a broad range of industry sectors, is currently working on improvements to characterise the next version of the CVSS standard. Based on input from users, the SIG has already created a comprehensive list of potential improvements, which can be consulted in full on their website. One of the most important suggested amendments is the possibility to distinguish attacks available only on specific networks, such as a corporate intranet, from attacks that can be launched from anywhere else on the internet. The SIG is also considering the possibility of introducing new metrics, such as the concept of ‘survivability’ after an attack and ‘wormability,’ since computer worms represent some of the most common and dangerous malware in cyberattacks. Finally, a major future challenge is finding a way to quantify the damage that a successful exploit would inflict on living beings, something likely to happen in sectors such as healthcare, aerospace and automotive. These are just some of the issues the next versions of the CVSS are expected to tackle, and users are encouraged to contribute to its continuous improvement by sending suggestions to first-sec@first.org. It’s virtually impossible for companies, especially small to medium-sized ones, to patch every vulnerability as soon as it is found. When installing new equipment and connecting it to the internet, manufacturers must choose suppliers that prioritise security in both software and hardware. 
By relying on trustworthy suppliers and using CVSS scores as a support, manufacturers can implement digital technologies to improve their workflows without having to choose between security and digitalisation. ### All You Need to Know About Securing Your E-Commerce Store E-commerce is growing at an unprecedented rate worldwide. People of all ages and from all walks of life love to shop at e-commerce stores, and shopping online often brings more satisfaction than shopping at physical stores. Why? The answer is simple: with so many apps and websites focused on e-commerce, finding whatever you want to purchase on an e-store is becoming ever easier. E-commerce development has truly evolved over the years, and growth is predicted to accelerate further in the coming years thanks to the technologies now available. Online shopping is rising rapidly, which leaves e-commerce owners scrambling to keep a strong place in the online race. But because the web is prone to cyber threats, partnering with an e-commerce development company is the way to go when it comes to the security of an e-commerce shop. Before shopping on the web became massively popular, the biggest cyber threats to the retail industry were aimed at physical stores, particularly breaches of point-of-sale (POS) systems to steal customers’ credit card data. UNDERSTANDING E-COMMERCE SECURITY Today, when setting up an e-commerce shop, it’s critical to hire an e-commerce developer to get security right. The sophistication and frequency of cyber-attacks have skyrocketed recently. Security in e-commerce means the measures practised to protect a business as well as its customers from cyber-attacks. Some common acronyms and terms you should know: Personal Data. Also referred to as personal information, this means any data that could be linked back to a specific individual, including names, phone numbers and email addresses. DDoS. A distributed denial of service attack disrupts a service, server or network by overwhelming it with a flood of traffic. ISO. The International Organization for Standardization, an international standard-setting body whose requirements guide business organisations in making sure their processes and products are fit for purpose. TLS (Transport Layer Security), SSL (Secure Sockets Layer) and HTTPS. SSL/TLS helps encrypt and authenticate the links between networked or connected computers. With an SSL certificate installed, a website can move from HTTP to HTTPS, which acts as a trust signal to customers that the website is secure. Ransomware and Malware. Ransomware is a kind of malware that locks a victim out of their system or prevents access to data until a ransom is paid to the attacker. Malware, or malicious software, is software that attackers install on a system without authorisation. PCI DSS. The Payment Card Industry Data Security Standard ensures that credit card information gathered online is transmitted and stored securely. BEST E-COMMERCE SECURITY PRACTICES Unique and robust password implementation. Over 80 percent of attacks are attributed to stolen or weak passwords. Put in extra effort to ensure that you, your staff and your customers follow strong password practices. Strong passwords are at least eight characters long and include both uppercase and lowercase letters, symbols and numbers.
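To make that baseline policy concrete, here is a minimal Python sketch that checks a candidate password against the rules above. It is an illustration only, not a complete password policy: a production store should also hash stored passwords and screen them against known breach lists.

```python
# Minimal sketch of the baseline password rules described above (eight or more
# characters, mixed case, digits and symbols). This only illustrates the policy;
# real stores should also rate-limit logins and check breached-password lists.
import string

def meets_baseline_policy(password: str) -> bool:
    return (
        len(password) >= 8
        and any(c.islower() for c in password)
        and any(c.isupper() for c in password)
        and any(c.isdigit() for c in password)
        and any(c in string.punctuation for c in password)
    )

print(meets_baseline_policy("Summer2024"))    # False - no symbol
print(meets_baseline_policy("S!mmer-2024"))   # True
```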
Never share passwords, and ensure every user has their own private, unique username and password for logging in. Do not reuse the password for the e-commerce site on any other login. Never share sensitive information publicly, such as your social security number, date of birth or any other details you tend to use when answering security questions. Protect devices. Whether you have one computer in your home office or a fully networked computer system, make certain that all connected devices are cyber secure. Use firewalls, anti-virus software and other appropriate methods to protect against threats. One of the best ways to avoid malware infections is to avoid falling into phishing traps. Never provide any personal information unless the identity of the recipient has been verified, and keep in mind that no legitimate organisation would ever ask anyone to share a password. Keep the website updated at all times, since security is an ongoing process: attackers discover vulnerabilities, and software engineers fix them. Today, e-commerce development puts more emphasis on website security than ever. When using a hosted platform such as BigCommerce, updates are handled automatically. With on-premises solutions, however, the business is responsible for implementing updates, vulnerability patches and bug fixes to the software powering the store. Move to HTTPS. Secure HTTPS hosting, which requires an SSL certificate, helps protect an e-commerce site, and an e-commerce development company can help in this regard. It is also a boon to the marketing department, because Google penalises HTTP websites in its organic search rankings. HTTPS sends a positive signal of trust to customers, particularly those who are tech-savvy. Regular third-party and plugin reviews. Take an inventory of all third-party solutions running within the store. Know what they are and assess whether you still trust each third party. When an integration is no longer in use, remove it from the shop. The idea is to give as few parties as possible access to customer data while continuing to drive the store forward. ### Raising the right culture for DataOps success in 5 steps A data boom is enveloping the business world. As information piles up, companies are scrambling to understand DataOps processes and how these can be used for optimal management and maximum value. DataOps is a methodology designed to streamline data analytics processes, improving quality and easing data flows throughout an organisation. A cultural shift is required for DataOps to succeed in a business, and companies need to make considered internal changes to achieve it. DataOps must be adopted as an ongoing process that triggers transformation by improving both data quality and data management. In the current climate, many organisations need to improve their data collaboration processes and how they use the information they hold. Developing pipelines and systems should be a combined workforce task, with a company-wide consciousness of the value of data. As with all change, it must be driven from the top down. There are five key steps that C-suite professionals can take towards DataOps success. Accepting DataOps as a perpetual motion A business’s leadership needs to begin by deconstructing its entire existing system of data analytics. For DataOps to thrive, it must be implemented via a complete restructure.
Culturally, the entire organisation must accept the use of data to drive business decisions. The structure of the company must embrace information-induced change and be open to ongoing flexibility. Slow adoption is the number one obstruction. Poor implementation and distribution of data leads to a fragmented and disjointed approach to reporting across an organisation. Success relies on a complete cross-organisational and long-term commitment to the DataOps methodology. Regarding DataOps as an ongoing process - a constant conveyor belt on which teams can send questions down and keep up with - is the cornerstone of successful implementation. Data-fuelled decisions Information should be central to decision-making. From staff who use data, to dedicated analysts and data managers, the mindset of data-driven business choices should be universal. DataOps empowers the entire workforce to openly offer data-driven opinions. This will allow businesses to stay competitive in even the fastest-paced environments, as the best ideas organically move to the forefront. This process acts as a catalyst for managers to naturally migrate from existing control strategies to data-driven decision processes. Banishing Silo Culture For data to be valuable, it must be accessible to those who require it. Barriers and red tape culture must be removed from businesses, as they develop over time and prevent the free flow of lucrative data insights. The key to this is to switch from stockpiling data to sharing it instead. This is achieved by the coordination of different teams to achieve a flow that still observes access control and security. It is vital that people have immediate access to up-to-date and accurate data. Employees need to be able to obtain and understand all information that influences their work, in order to have a full perspective. Show your mettle DataOps installation requires a bold and brave approach. There are two opposing common standpoints: some businesses fear transformation, and some go rogue in their efforts to achieve it. The former are usually larger companies that are afraid of change and do not want to upset the status quo. These companies tend to have a rigorous framework of rules and regulations, making any change a laborious challenge. The latter tends to be smaller organisations, who have an individual who attempts the entire transformation alone. Corners are often cut, and the business ends up facing unforeseen problems. However, there are processes and programmes that can help an organisation to transform quickly without the risk of major error. A business can succeed in its change without fear. Whenever failure is encountered, it should be converted into an opportunity to improve. By moving away from legacy processes and outdated routines, a business can discover new ways to thrive. DataOps can effect this change, structurally and methodically helping businesses to break out of old systems and impact overall output. Acquiring the best building blocks Investing in data process tools is vital. A team requires the ability to access, share and analyse the right information in order for the operation to be worthwhile. DataOps is not just a standalone tool, but an all-encompassing solution which improves collaboration, pipeline construction, testing and monitoring and the speed of future implementation. With the right resources and time, a company can apply its own DataOps programme by utilising the many different tools. 
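As a concrete flavour of the testing and monitoring mentioned above, the sketch below shows the kind of lightweight, automated data-quality check a DataOps pipeline might run on each batch before publishing it downstream. The field names and thresholds are hypothetical and would need adapting to a real pipeline.

```python
# Hypothetical batch-level data-quality checks of the kind a DataOps pipeline
# might run before publishing data downstream. Field names and thresholds are
# invented for illustration only.
from datetime import datetime

def check_batch(rows):
    """Return a list of quality issues found in a batch of records."""
    issues = []
    if not rows:
        return ["batch is empty"]
    missing_ids = sum(1 for r in rows if not r.get("customer_id"))
    if missing_ids:
        issues.append(f"{missing_ids} row(s) missing customer_id")
    stale = sum(1 for r in rows if (datetime.now() - r["updated_at"]).days > 30)
    if stale / len(rows) > 0.10:  # more than 10% of rows untouched for a month
        issues.append("over 10% of rows have not been updated in 30 days")
    return issues

batch = [
    {"customer_id": "C-001", "updated_at": datetime(2020, 3, 1)},
    {"customer_id": None, "updated_at": datetime(2020, 4, 1)},
]
print(check_batch(batch) or "batch passed all checks")
```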
For agility and timesaving, an all-in-one DataOps platform is another option. These tools must be accompanied by suitable training: external experts can work to ensure a smooth transition into the usage of DataOps. Embracing DataOps could be key to an organisation keeping up with its market and safeguarding its successful future. It should be viewed as a united ecosystem for all data that a business holds. Adopting an effective DataOps culture requires the consideration of all processes, employees and technology, in order to achieve a culture that fosters harmony between teams and capitalises on the value of data. The absolute bottom line is accepting and championing DataOps as a continuous, ever-flowing process, sending a company on its path to an efficient and successful future.[/vc_column_text][/vc_column][/vc_row] ### Top Tips for Preparing for Any Cloud Migration [vc_row][vc_column][vc_column_text]The World Health Organisation has described the world as being in ‘uncharted territory’ due to the coronavirus outbreak. Many working environments have changed significantly in recent weeks, but businesses remain resilient and are working hard to operate on a ‘business as usual’ basis. In professional services, moving from on-premise to cloud solutions continues to be a significant move for those businesses endeavouring to drive sustainable innovation and competitive advantage. The collaborative benefits of the cloud are fuelling better customer engagement and accelerating many communication-based activities. Moving to the cloud means streamlining and digitalising key processes to unlock dead billable time, as well as integration, consolidation and access from any device. However, whether you’re adopting a new application or migrating an existing one to the cloud, it’s useful to start with a clear understanding and documented approach. It is essential that companies understand not only the technical architecture surrounding each of their business applications, but also, in the case of collaborative cloud solutions, exactly how their customers and clients will interact with new cloud-based applications. If you are considering migrating to the cloud, this checklist can help you to prepare.   Dedicate time to tidying up your pre-migration data. If you’ve been using your current on-premise product for some time, you may find some of your data has become disorganised or requires a refresh. Do you have phone numbers where email addresses should be, for example? Do you have old data that should be removed – for example, former clients/customers that you haven’t dealt with in years? If migrating transactions, do you need historical data from five or 10 years ago, or just the most recent transactions? Do you have any GDPR considerations to take into account? Moving to new software is a great time to clean up your data.   Review the mapping of your data fields. Unless your new software is an upgrade of an on-premise product moving directly to its cloud equivalent, you will need to map fields/tables from the old database to the new. This may be best outsourced to experts who have done this work previously.   Pre migration, check out what data validation tools may be available to assess how smoothly your data migration will run, and whether any of your data flags migration related problems. Data validation tools will often show you the errors you need to fix, and if you can’t fix them, then it’s time to arrange for a consultant to assist. 
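As a flavour of what such a validation pass might look for, here is a minimal, hypothetical Python sketch that flags obviously misplaced or stale records before a migration run. The field names and thresholds are invented; any real migration should lean on the validation tooling provided with the target product.

```python
# Hypothetical pre-migration checks: flag email fields that do not look like
# email addresses and contacts that have been dormant for years. Field names
# and thresholds are invented; adapt them to the source system.
import re
from datetime import datetime

EMAIL_RE = re.compile(r"^[^@\s]+@[^@\s]+\.[^@\s]+$")

def pre_migration_report(contacts):
    """Return record names that need cleaning before migration."""
    bad_emails, possibly_archive = [], []
    for contact in contacts:
        if not EMAIL_RE.match(contact.get("email", "")):
            bad_emails.append(contact["name"])
        if (datetime.now() - contact["last_activity"]).days > 5 * 365:
            possibly_archive.append(contact["name"])   # dormant for 5+ years
    return {"invalid_email": bad_emails, "possibly_archive": possibly_archive}

contacts = [
    {"name": "A. Client", "email": "01234 567890", "last_activity": datetime(2019, 6, 1)},
    {"name": "B. Client", "email": "b.client@example.com", "last_activity": datetime(2012, 1, 15)},
]
print(pre_migration_report(contacts))
```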
This can be especially important if migrating from multiple software systems (for example a separate CRM and accounting system) into one new software product.   Consider migrating a couple of days before any cloud product training, as this ensures that your product training can be done on your live data rather than generic data. Once you’ve completed your cloud migration, invest dedicated time in employee training. When it comes to employee training, there are three things to remember:   Whether moving the same software from on-premise to cloud or implementing new cloud software, there will be changes in the software that will necessitate changes in internal procedures and training provided by the software vendor (or their agent) who will help you to identify these changes. Professional trainers have a wealth of expertise gained from working with other companies using the same software and can potentially give you advice not just on using the software, but also around improving your procedures in relation to the use of that software. Additionally, your employees may not have had training on the software you are migrating from, and therefore this is a good time to invest in training for those individuals. In many industries, time spent being taught new software also counts towards employee CPD hours.   Does your new cloud application have client login functionality? Communicate with your clients well in advance. Many professional services-based solutions, including cloud-based bookkeeping, tax and general accounting products feature extensive client dashboard functionality, so this is your opportunity to communicate the benefits to your clients. Consider whether you need to conduct client training sessions so that clients are making full use of the functionality on offer, and understand workflows.   When planning the date and time for your migration, pay attention to your cloud product support hours. In the event that you do need help, you want to make sure it’s at hand.   Plan your actual migration timetable and estimated duration. The time it takes to migrate to the cloud will vary according to how much data and how many applications are being migrated, as well as how much integration needs to be done. Very large migrations may have to take place over a number of days.   If your cloud application has client login functionality, understand how and when new release functionality will be rolled out via the cloud. The beauty of the cloud is that delivering things online generally means that the benefits are felt more widely and quickly, but make sure you’re communicating this with your clients so that they experience the upside immediately.   Consider the opportunity to make configuration changes to your software. Migrating to cloud software will bring benefits just because you are moving to new technology, but potentially you also have the chance to improve the configuration of your software, and your internal procedures, to make it more efficient and easier to use as well.   It's also worth regarding the move to the cloud as an opportunity to take advantage of new software features – the cloud frequently has advanced time-saving features which your on-premise software may not have included. Being taught how to use these features by the vendor at the time of migration is much more efficient and less costly than learning by trial and error. Critically, ensure you partner with an organisation that believes and invests in technology that makes a difference; technology for the real world. 
Done correctly, moving your key applications to the cloud with an integrated solution will set you apart and empower more productive services, giving you more time with your clients to add enhanced value to their businesses which can only help to solidify a strong future partnership between you.[/vc_column_text][/vc_column][/vc_row] ### 8 Cybersecurity Tips For Working Remotely [vc_row][vc_column][vc_column_text]In recent months the Coronavirus, or COVID-19, has spread across the world, dominating the news and having a massive impact on society. One of the biggest changes to many businesses is the sudden shift to remote working. If companies are going to continue to operate then it makes sense for employees who can work remotely to do so and reduce the likelihood of catching and spreading the virus. However, working from home is very different from working at the office, especially when it comes to cybersecurity. Companies put a lot of effort into thoroughly protecting networks and devices in the office to keep private data secure. Unfortunately, your home network and devices aren’t likely to have the same level of protection, leaving you more vulnerable to a leak or cyberattack. Here are 8 simple cybersecurity tips to help keep confidential information secure while you are working remotely during the Coronavirus lockdown. 1. Install reliable antivirus software on your devices Many offices often employ a number of measures designed to protect devices and networks from malware, such as powerful security solutions, restricted access and rules against installing applications on work computers. At home, it is much more difficult to maintain this level of protection which can leave your computer vulnerable to cybercriminals. It is essential that you install antivirus software on any of your devices that are used to access confidential data from work. This is also a sensible precaution to keep your own personal information safe. If you are hesitant to invest your own money on a security solution, there are a number of free antivirus solutions available that will reduce the risk of getting infected. 2. Encrypt your Wi-Fi router The best security software won’t help you if cybercriminals can connect to your Wi-Fi or access your router. This would allow them to intercept anything you send online, such as documents, emails & messages or passwords that pass through the router. This is why it is essential to configure your network connection securely. Similarly, if you haven’t changed the login and password for your Wi-Fi router, you should do so now. The default password for most models is often weak and easily searchable. Make use of a strong password for your Wi-Fi. 3. Make use of a VPN in public places A VPN is an extremely useful tool for both protecting data as it is moved from the office network and your device. A VPN provides an additional layer of security which hides the user’s IP address, encrypts data transfers while in transit and masks the user’s location. A VPN can also be used to protect your device if you are making use of a public Wi-Fi network. Public Wi-Fi networks are rarely encrypted which could allow other users to spy on you through the network. When you are connected through a VPN, all of your personal data, such as passwords, will be encrypted. 4. Always lock your device when you leave your desk It is a best security practice to lock your device when you leave it for a bathroom break or to make a cup of coffee and this still applies when working remotely. 
Anyone can catch a glimpse of your work emails or private documents while you are away from your desk. Even if you are working from home without anyone else in the room it is still a good idea to lock your device, if only to prevent an eager cat sending an unfinished email to your boss or a curious child deleting 2 days worth of work by mistake. In addition, it should go without saying that your device should be password protected. 5. Update programs and operating systems Another best practice for cybersecurity is taking the time to install all updates to your applications and operating systems. These updates often patch new vulnerabilities that are found in software which could be exploited by cybercriminals to infiltrate your device. Cybercriminals rely on people neglecting to install these updates so it is important to regularly update everything installed on any device that you use for work purposes. In addition, it is often possible to activate auto-updates on your device to help keep you up to date. 6. Make use of a secure and approved cloud network Another way to keep confidential data or documents safe is to ensure that it isn’t stored locally on your device. Content storage should be cloud-based where possible and that the cloud storage service used has been verified by your company's IT department. In addition, it is important that you make use of secure cloud-based apps, such as Microsoft Office 365 or a corporate email system for exchanging documents and any other information. 7. Be careful & vigilant As always, a big part of cybersecurity is vigilance. A highly convincing spam email can find its way onto your corporate emails system, especially now the number of digital communications will be increasing. Take the time to double-check the sender is who they say they are and that what they are asking for is legitimate. Be extra careful of emails with links in them. Links in emails are a great way of getting malware onto a secure network. Always check the destination of the link before opening it and if you are suspicious then it is best to ignore it. 8. Create a comfortable workplace This final tip is less about cybersecurity and more about your own health and well-being. Try to create a comfortable workspace while working remotely. You might be tempted to lounge on the couch with your laptop, but your back will suffer for it. Try to find a comfortable office chair and a desk to improve your posture while working remotely. Take plenty of breaks, stretch your legs and drink water to keep yourself healthy and motivated. Make sure you stay in contact with your team and try to maintain regular office hours and get plenty of sleep. Working remotely can be a big change and it is important that you make the effort to protect your work and your own well-being during this time.[/vc_column_text][/vc_column][/vc_row] ### Karantis360 Offers Assisted Living Solution to Care Providers on a Non-Profit Basis During COVID-19 pandemic [vc_row][vc_column][vc_column_text] Powered by IBM Cloud and Analytics Technologies, Solution Helps Keep Older Adults and the Vulnerable Safe During Social Isolation London, UK, April 16, 2020 – UK predictive care management start up Karantis360 is now offering its innovative at-home monitoring solution at cost price to assist caregivers in keeping older adults and the vulnerable safe during the COVID-19 crisis. Karantis360 uses sensors to keep a watchful eye on the daily activities of at risk older adults and vulnerable people being cared for in their own home. 
Running on the IBM Cloud, the solution uses IBM’s AI and advanced analytics to identify and learn an individual’s typical behaviour based on movement, temperature and humidity readings. Through a mobile app, Karantis360 helps to provide reassurance to caregivers and family members who have access to a more complete picture of the wellbeing of the person being looked after and can be alerted to changes in normal daily routine which could point to a potential problem. There is now an urgency to ensure individuals, who are impacted by social isolation, can be monitored for indicators of risks to continued health, due to the current threats of exposure to COVID-19. Just as important, is to enable at-risk older adults and vulnerable patients to stay safely in their own homes, or assisted living facilities. This not only helps to free up much- needed hospital beds, but also drastically reduces the risk of cross-infection. The solution, which is available across Europe, the USA, Hong Kong and China, has already been installed by a number of public-sector and private care providers in the UK including, York City Council and Consultus Care & Nursing. In Italy, care provider Sole Cooperativa is using the Karantis360 solution to monitor the health of individuals living inside one of its social living facilities. During the COVID-19 pandemic, the solution is being offered to NHS, Local Authority and private care providers at the cost price of £10 per patient per month, for at least 6 months, plus the cost of the smart sensor technology from Pressac. In addition, care providers also have the option of an ‘install now, pay later’ finance agreement with Bluestone Leasing. Other project partners include Arrow Electronics, Tech Data, Softbox and IndividuALLytics. Karantis360 is part of the IBM public cloud ecosystem, a new initiative to help clients create, modernise and transform mission-critical workloads on the IBM public cloud. The IBM public cloud offers the industry’s most open and secure public cloud for business. Quotes "It’s extremely reassuring to have access to data which indicates the wellbeing of those in our care. I am therefore very happy to have a partnership in place with Karantis360, especially during this emergency situation, when it is crucial to keep our eye on the activities of daily living." Roberta Massi, Presidente, Sole Cooperativa, Italy “Our ability to help reduce some of the impacts of social isolation and to help & support our social care colleagues in these challenging times through our work with Karantis 360 is one of the silver linings that has emerged over the recent weeks, and will help us face the challenge of a significant rise in the discharge process over the coming weeks and months” Roy Grant, Head of ICT & Digital Services, City of York Council "In unprecedented times of enforced social isolation, the Karantis360 solution helps people to live more safely at home and provides reassurance for carers and family members.” Lucy Abbott, a Consultant Geriatrician, UK “We are delighted to support Karantis360 with our hardware development and production expertise and are committing to reopen our UK factory, in accordance with government guidelines, in order to do so. 
The combination of smart technology and analytics reduces pressure on the health systems by keeping people out of hospital and living in homely environments where they are most comfortable.” Peter Burbidge, Managing Director of Pressac Communications, UK “Today there are tremendous personal, social, but also financial incentives to keep people in their own homes. Evidence-based data, research and intelligence are more necessary than ever to support the strategic concept of ageing-in-place. Technologies like the one proposed by Karantis360 can help make our homes capable of a dialogue based on data, supporting a more transparent relationship between the different stakeholders across the care ecosystem. They also help increase safety and reduce the cost of care for end users, their families and care providers.” Nic Palmarini, Director, National Innovation Centre for Ageing “I'm extremely excited by how technology is being used to support the more vulnerable in our society. It’s not just about monitoring - data is the key to successful care and provides a scaffold for families who will be reassured by knowledge of their loved one’s welfare.” Professor Nigel Holt, Head of Department (Psychology), University of Aberystwyth, UK “In unprecedented times such as these, it’s crucial for us to come together to do what we can to support the national and international efforts to stop the spread of the virus, and ensure that every individual’s health is cared for, no matter the circumstances. Together with our partners, we’re able to offer a solution on a not-for-profit basis that can support the heroic efforts of health workers by enabling vulnerable individuals to stay safely in their own homes. It is especially important that during this period of lockdown, families can be reassured of their ability to remotely monitor for changes in the wellbeing of their loved ones.” Helen Dempster, Founder and Chief Visionary Officer, Karantis360 For more information, register your interest here: www.karantis360.com/monitoring ### How are educational videos beneficial for children? Learning is the foundation of a child’s knowledge, and the learning process has to be engaging and fun so that children love to learn. Few educational tools make lessons more interesting to children than video sessions: video classes are more interesting and engaging for the children who watch them than traditional lessons, and the benefits of educational videos for learners are plentiful. Interesting and useful educational videos support children’s development. Educational videos engage children easily Children who watch videos in order to learn should be engaged by the content, so content must be produced with the learners’ level of understanding in mind. If the content and the teacher’s explanations are of a high standard, children absorb the material without difficulty. Video classes draw children in and make understanding easier, and the positive features of educational video turn children’s mindsets towards learning. Boring video sessions yield no results at all, so the content should be interesting and upbeat. Another option is online video learning, especially when it comes to science-related lessons; a great resource for your children would be Generation Genius.
Content of videos is important Educational video sessions have to be realistic and informative for the children who watch them, and authentic video classes make the sessions more interesting. The development of children depends a great deal on a positive mindset and on their level of knowledge, and both of these help children grow into mature adults later on. An educational video should improve their view of study and its benefits for their life. It is quite clear that children love video games because the games are interesting and involve a child’s mind and soul; likewise, educational videos attract students more than oral teaching alone. Type of video content for children to learn It is now very clear that educational videos play a vital role in fully involving children. Students need to view fun and motivating videos so that they will keep listening for a long time, and sustained interest in learning brings many changes in students and leads to development. Thought-provoking videos can bring significant changes to a child’s life, so videos have to be entertaining and must kindle students’ motivation to achieve their goals. Interactive video sessions can do better than listening alone. Video sessions combined with questions and answers keep children engaged: once a short video session is over, the teacher can ask students what they have understood. This kind of interactive session greatly changes students’ learning behaviour as a whole. Motivation is kindled by external forces, and educational videos in the classroom provide one of them. Students who have a clear vision and motivation engage with the learning process, so educational video sessions should be framed in a way that keeps them focused. Lively video sessions attract students more Concentration is another major key to successful teaching through any educational tool. Only lively video hours attract listeners and make them concentrate more; if the listener’s focus and concentration are strong, the video class sessions will be successful without any second thought. Educational video lessons need to be pragmatic and practical, not only theoretical. Even a tough syllabus is easily understood if a student watches practical videos, and viewers expect educational videos to reflect the real world outside the classroom. Memory power is enhanced Another setback that exists among students is memory during exam periods. This problem is reduced if a student watches educational videos, which help them retain what they have heard: the content presented in video sessions stays with them for many days, which helps during exams. Students with attention disorders are also helped if they are given educational video hours for their subjects, as they can listen to and observe content more easily than in other types of classes. The development of children is closely related to mindset, attitude and focus while learning, and these three features are enhanced by educational video, which instils the habit of engagement more than any other learning tool. ### Don’t let bad infrastructure choices impinge on your data strategy The ability to effectively analyse data can be the difference between success and failure for a business.
Collecting data is one thing, but if insights aren’t being gathered and applied to drive positive business transformation then it’s largely meaningless. It’s therefore no surprise that we’re seeing more and more chief data officers (CDOs) featuring on executive boards as businesses of all sizes see that being data-driven is key to becoming a digital business. As part of this, enterprises recognise they require a coherent data analytics strategy in order to reach the full potential of what they can do with data. But what’s essential is that they don’t fall into the trap of fixating on whether their deployment model should be cloud, on-premises or a hybrid approach as the first step. Infrastructure decisions are important, but they are just one factor to consider. Developing a clear data strategy and data-driven culture led by the CDO has to come first, as this avoids a disjointed approach to data and prevents employees feeling disillusionment or distrust in business processes. In fact, people should be at the heart of every data strategy, because the most powerful results happen when the whole organisation is involved in driving the strategy forward. Furthermore, the most successful strategies are those which are integrated and communicated from the beginning as part of a business’ overall strategy. This step makes sure data is being managed and used as an asset, with business-wide processes, practices, and common and repeatable methods. Making data accessible With common practices in place, employees at every level can have access to real-time data and know how to use it to make faster and better decisions – which potentially open up new business possibilities. Democratising data in this way equips employees with relevant, customised, up-to-date analytics on key metrics, which involves and empowers them in moving the business forward. And here is where the technology infrastructure comes back into play. Anything that thwarts an organisation’s ability to become data-driven needs to be reconsidered, and businesses can’t let their infrastructure stand in the way of data democratisation. When we surveyed 2,000 global data decision makers for our recent report, Data strategy and culture: paving the way to the cloud, it revealed that 81% of businesses with hybrid cloud models found employees at all levels had sufficient access to data to improve decision making. For many this is because a cloud model allows data to be shared at scale, eliminates data silos, and is secure and cost-effective. Analysing the cloud However, it’s not that simple if we take a closer look at the results. For some organisations, it is difficult to make data accessible across the business. When we asked respondents whether their current IT infrastructure makes it challenging to democratise data in their organisation, four out of five agreed. Barriers to success also included too many sources of data (25%), a lack of relevant data skills (24%) and performance limitations (21%). This is concerning. We’re all operating in a real-time world where we’re amassing data at breakneck speed, which makes fast data analysis all the more important. Limitations such as incumbent systems, poor technology infrastructure and a lack of employee buy-in can’t stay unaddressed. Business leaders need to tackle accessibility, integration and employee knowledge challenges now in order to compete.
And with a robust data strategy in place as step one, the important decision of what deployment model comes next – to ensure your infrastructure doesn’t hold you back from making data available across the business. This doesn’t necessarily mean that you need to run straight to a cloud solution though, as it might not be the right choice for every workload. It can be an instrumental part of an effective data strategy, but your decision needs to centre on how the business will evolve in the future and what requirements there might be that may mean some data needs to stay on-premises. In the highly regulated financial services sector for example, an on-premises approach can work better. Whereas if you have predictive/prescriptive analytics and data science workloads or need to speed up the adoption of software and services, you would be better suited to the cloud. Some of the benefits of the cloud include improved ease of access and shareability of data, and faster query/response times. 73% of decision makers in our survey that have migrated data workloads to the cloud have seen a positive impact regarding what they can do with their data. Ingredients for success Ultimately, the deployment model an organisation chooses needs to enable every worker across the business to access the insights they need in real-time. In my experience a hybrid cloud approach can really deliver on this objective; sensitive workloads can be kept on-premises and public cloud offerings can be used to manage less crucial information. As a result, greater agility is achieved, enabling firms to not only quickly adapt as their business evolves, but also turn their data into value faster than ever before with the speed and performance hybrid brings. Get your data strategy in place now and then pave the way to the cloud![/vc_column_text][/vc_column][/vc_row] ### Finding the silver lining in a private cloud [vc_row][vc_column][vc_column_text]Public sector organisations are increasingly making the jump to public cloud. But while this is a positive step in their journey towards modernising services, it also brings its challenges. Migration to the public cloud is by no means a new phenomenon, but the hype cycle has reached fever pitch, and organisations should be mindful not to be swept away by the hype. What many organisations are now learning is that public cloud services are typically extremely costly and can place added pressure on already cash-strapped organisations. These high, and often, unexpected costs are borne from the inflexible solutions that public cloud providers offer. In many cases, the lack of flexibility in these public cloud models means public sector organisation are being tied into lengthy contracts that are difficult to get out of, and that simply do not meet the real needs of the business. In reality, public sector organisation at the start of their cloud journey need three things from their cloud offering: security, reliability, and capacity. These three fundamentals should form the basis of an organisation’s cloud strategy and should see them look beyond public cloud. Indeed, for many organisations a private cloud strategy could meet their current and future cloud needs. What’s in a cloud? The benefits of adopting a “cloud-first” approach are clear. Public sector organisations can prioritise real business transformation through greater access to critical business applications that can be hosted in the cloud, without the need for disruptive and costly on-premise hardware upgrades. 
Additionally, cloud offers scalability and increased reliability through on-demand resources, as well as the ability to dial up security as and when needed. Cloud also facilitates greater mobility, this will be essential when rolling out new applications and services, such as remote healthcare practices or e-learning services. What is critical however is understanding which of the aforementioned benefits, and specifically how many, are included in public cloud offerings. There is a big debate around the price of public cloud. In the initial stages of onboarding customers, public cloud providers have notoriously offered attractive prices for their services, but further down the line these offers have presented hidden costs. The challenge for most organisations is that they do not have a complete view of what they are paying for and where resources are allocated in a public cloud scenario. The lack of flexibility in these cloud models means organisations often unknowingly pay for resources they don’t need. For publicly funded organisations, this of course creates challenges whereby the very solution chosen to help drive down costs, is in fact having the opposite effect. A private solution for public service success For a long time, the assumption was that public cloud is a cheaper alternative than private. Why? Because infrastructure costs are shouldered by the public cloud providers who operate mammoth data centres with vast quantities of storage and compute capabilities. However, in recent years the premise that public is cheaper than private has been found to be largely false. Private is now widely considered to be more cost effective in the long run, as storage and compute power can be added or removed, or scaled up or down in accordance to the needs of individual organisations. But cost reduction is not the primary driving force behind private cloud adoption. One aspect of private cloud that has perhaps deterred public sector organisations in the past is the responsibility of ongoing maintenance. This can be easily solved, however, by partnering with the right provider to manage the private cloud on behalf of an organisation. In this instance the partner provides support, maintenance and upgrades, and shifts management responsibilities away from the customer. Security is another area where private cloud has the potential to shine in the public sector, as it is inherently more secure than public cloud. Public sector organisations hold a duty of care and are responsible for the processing and protection of large quantities of sensitive data belonging to the communities they support. In a private cloud environment, the customer controls the physical servers and access to these, and is able to mitigate risk by setting parameters on what data can be accessed, when and by whom. Firewalls can also be set up based on specific organisational requirements. What’s more, the external risk of cyber-attacks is also diminished with private cloud, meaning organisations are less likely to suffer from data breaches or leaks. In addition, this minimised risk gives organisations more bandwidth to prepare for such cyber-attacks, and with the NCSC announcing that it wants organisations to be more proactive rather than reactive when responding to cyber threats, switching to a private cloud may go some way in helping to achieve this. One final area where private cloud holds weight is in the agility of the providers. 
Public cloud solutions are run by a handful of big industry players that cannot commit the same level of customer service offered by small or medium providers. In choosing to go private, public sector organisations will have more control to adapt their cloud solution, controlling when and how systems connect to it and how teams interact with the solution. This added agility and adaptability has been a clear growing requirement of public services as they look to streamline efficiencies. While the benefits of cloud will be hugely influential in delivering the transformation that public services are crying out for, public sector organisations must assess their options. While public cloud is a popular option, it’s not necessarily the right one. If public sector organisations want to ensure their cloud solution can adapt to their needs and grow with them, they must consider a more bespoke and tailored offering. That offering is a managed private cloud service.[/vc_column_text][/vc_column][/vc_row] ### Write A Successful Robotics Technician Cover Letter [vc_row][vc_column][vc_column_text]Writing an effective cover letter is a crucial part of any job application process, but more so in the world of robotics. A cover letter will allow you to highlight what it is about you personally, as well as your experience and skills, that enable you to meet the requirements of a robotics technician job. Here are 5 steps to help you write the perfect robotics technician cover letter. The Purpose Of A Cover Letter The cover letter is your opportunity to capture the attention of a prospective employer. In order to achieve this, you need to make sure that it is carefully curated to show a selection of relevant skills and stories or examples from your career which help to portray a clear and accurate picture of who you are. These selected examples should demonstrate how you will add value to the company you are applying to work for. A Memorable Introduction Do your research and find out who it is that you should be addressing your letter to. It will make a much better first impression if you are able to address the recruiter directly as it will show that you have a genuine interest in the company. Try to use a unique opening line. Introduce yourself and describe why you are applying for the role of robotics technician within the first few sentences. Outline Your Skills, Experience And Education As a robotics technician, explaining your relevant skillset, experience and education is crucial. Make sure that you provide a concise and accurate overview of your qualifications and highlight how these are directly related to the job you are applying for. Tell your prospective employer about your degrees as well as any other relevant courses or qualifications you have which make you the best candidate for the role. Remember to explain any other experience in other fields as well that you have. Think carefully about how you’re the skills that you’ve acquired in other positions you’ve held or projects you have worked on are applicable to this job role. For instance, mention any robotics test procedures or results that you might have documented, or any health and safety work you’ve been involved in that may be relevant. Remember, it’s about emphasising how your skills are relevant to the job. Problem-solving skills are particularly good to mention. Give examples of specific problems you have solved and clearly explain how your particular skills were key in resolving the issues. 
Be Passionate There will be many other candidates applying for the same position who may have more experience or be better qualified than you. Make your cover letter stand out by clearly explaining why you want the position. Remember to remain professional, but allow your enthusiasm and passion to shine through. End With A Clear Call-To-Action You want your cover letter to end with an open-ended call to action,” explains Vivian Torres, a career blogger at Essay Writers.  “You can include a line at the end in which you suggest that you are looking forward to talking to the recruiter soon or that you’re excited about providing them with more information about your skills and background. Avoid making presumptuous comments or statements, such as telling them that you’ll be in touch to schedule an interview. You want to make sure that they have a clear reason to contact you. Give them a polite call to action.” Edit And Proofread Your Letter You need to ensure that your cover letter makes a good impression, so always proofread it before sending it. Use editing services to help ensure that your cover letter is as high-quality as it can possibly be. Conclusion Spending time ensuring that your cover letter is perfected is important. Make sure that you keep it focused and concise. Share relevant information about your qualifications, skills and experience, as well as specific examples of how you would be a valuable addition to the company. Take time to convey your interest in the company and end with a polite call to action. Remember to edit and proofread your cover letter and make use of editing services to help you do this effectively.[/vc_column_text][/vc_column][/vc_row] ### Events Industry: Using tech to disrupt the disruption [vc_row][vc_column][vc_column_text]Alerts on the news, across our phones and over the radio have announced the postponement of events, festivals and conferences worldwide. The first to announce the changes were festival heavyweights like Coachella, followed by Glastonbury, where the protection of thousands of people gathering in close proximity was paramount. Next sporting fixtures could no longer take place and the Euro’s and now Olympics have been rescheduled to 2021. The Government has taken stricter measures, reiterating the importance of staying in and limiting public interaction, resulting in no physical events taking place until further notice. It’s been predicted there will be multi-million-pound losses for the experiential events industry following Covid-19 as brands review spend during this time. Understandably, it’s easy to feel apprehensive about what the future holds but where there is change or disruption, there is space for innovation and in the case of Covid-19, it's no different. So, what does the future hold for the events industry? Stage one – the immediate reaction In the first instance, while restrictions are in place, digital will lead the way stepping into the space left by physical events, such as Formula 1’s virtual Grand Prix which attracted just under 300,000 viewers in the first week. Formula 1 has committed to running a virtual Grand Prix each race weekend and the opportunity for brands to get involved is surely inevitable. As viewers grow, this captive audience offers a chance to existing fans to enjoy the sport they love and the potential for new fans to be won, using a new channel to engage with them gives the chance to showcase the sport from every angle. 
And it’s not just the sporting industry that’s taken a hit during these unprecedented times – the once lucrative live music industry is also in jeopardy. Although music artists have always been tech fans, understanding the power it holds to connect with audiences since the days of Lily Allen and MySpace. Now technology, predominantly social media so far with a focus on Instagram, is keeping music alive with new releases being shared and gigs streamed live by artists such as Coldplay and John Legend. Stage two – short term impact Up until now it’s been accepted by many people for brands simply to modernise and start using technology in a relatively basic manner, especially when it comes to connecting with the customer or audience. By being able to have a focus on the physical event, a number of brands have seen using social media, for example, as adequate to appear current or even ‘on-trend’. Results from Global Web Index in 2019 found that there had been a 47% increase since 2016 of people watching or following sports events as a main motivation for going on social media and therefore covering off this channel had been seen as simply enough until now. However, as we all start to adjust to our new way of living, we'll see content being digitised and specifically created by brands to offer consumers much needed escapism, positive experiences and a sense of a new normality. Brands will be forced to actually become digital rather than just modernised. This enforced disruption of the physical events industry will see a growth in creative ideas and brands adopting digital options to ensure connection with consumers is maintained, and for a time, will lead the way as other options are restricted. Stage three – long term impact As the current situation ends (which hopefully will be as soon as possible) there is no doubt Covid-19 will leave its mark on how we enjoy events moving forward. We'll see industries such as music festivals selling two types of tickets - physical and digital. It’s likely a digital ticket will provide an access code for people to enjoy streamed music, additional content, and for those who have VR headsets, an immersive experience through using services such as Melody VR. As VR opportunities increase, the demand for a headset will too, hopefully reducing the main barrier to entry for consumers which currently is cost. We’ve already seen brands, such as Verizon, use the recent Superbowl to start breaking the boundaries between technology and sport when it gave us a glimpse into the potential of 5G in transforming the way we consume and enjoy sports. Verizon subscribers with 5G-capable devices were able to access a new multi-camera-angle streaming feature, allowing fans to stream action from five different camera angles that were on the sidelines. Fans could switch between views, rewind and play instant replays from any of these angles. As well as this, through the use of AR, fan could see real-time stats as they're watching the video streams from any of the five camera angles. Verizon’s super dome fan experience was built on a new scale to any previous incarnations and looked to inspire both fans, families, its partners and the sport itself to the possibilities of tech led sport experiences. People are bound to be nervous for health reasons once the need for social distancing is no more, so the growth of physical events could be slow at the beginning meaning event companies and brands must diversify. 
This is a time where we can showcase our skillset, engage with new audiences and demonstrate how technology can both enhance a physical event as well as offer a unique experience. Events have always been a huge part of the experience economy and it’s more important now, than ever, that we offer people an escape – don’t panic, get creative and turn to technology is these turbulent times.[/vc_column_text][/vc_column][/vc_row] ### The Importance of Data Centres in the Cloud Repatriation Era [vc_row][vc_column][vc_column_text]The enterprise IT architecture has been on a fast-paced trajectory of growth and evolution. Today, the emergence of cloud computing — computing that is based on the internet — has presented a notable and nearly ubiquitous change to the IT environment, opening up a world of new opportunity. Now, operating in a cloud-first landscape, cloud adoption has grown from merely an available option to a virtual necessity. Reports forecast that spending on cloud IT infrastructure will continue on its upswing to reach $99.9 billion by 2023, and as cloud-first strategies grow increasingly ubiquitous, Forbes reports that 83 percent of enterprise workloads will be in the cloud by 2020. The cloud undoubtedly offers a range of potential advantages for the enterprise. This survey of 166 IT leaders reveals that 61 percent say cost efficiency is the primary goal of implementing cloud, with the benefit of new features and capabilities coming in at a close second with 57 percent. Boasting added convenience, scalability, flexibility, data mobility and resilience, it’s easy to see why adoption is so attractive. However, the path of IT evolution is now bringing a growing number of businesses back out of the cloud — more specifically, out of the public cloud. This reallocation of workloads (termed cloud repatriation) has been occurring for a couple of years now, with the IDC’s January 2018 Cloud and AI Adoption Survey reporting that more than 80 percent of surveyed IT decision-makers had repatriated workloads in 2017-2018. Repatriation activity reports in the following year were higher still. The importance of the public cloud for enterprises is plain to see, and 61 percent of surveyed technical professionals from industries around the world stated that their organisation was running applications on Amazon Web Services (AWS) in 2019. Furthermore, usage of this cloud is still growing. However, while the public cloud does fulfill certain needs, it’s becoming clear to many businesses that the public cloud does not offer the ideal solution to all workloads, falling short in key areas. The top reasons for public cloud repatriation have been outlined as security, performance, cost, control and governance. While the value of pulling workloads out of the public cloud ultimately depends on the use case (for instance, latency-sensitive apps or those with static functions may fall short here), it’s clear that many need alternatives that can better deliver security, control and performance. The Data Centre’s Enduring Merit While the cloud stores data remotely and provides access over the public internet, a data centre stores data on a company’s own hardware either on-premises or in a hosted environment in a colocation data centre. Sure, the cloud averts the need to invest in the equipment that data centres require, but the benefits that a data centre approach adds to an IT architecture are plentiful. 
An in-house data centre delivers the highest level of customisation for the computing environment by allowing companies to keep server hardware on-location, store and access data over the local network and leverage full control over both their data and hardware. This availability of computing power means that individual application needs for security, performance and reliability can be built and maintained as needed. Furthermore, by remaining private and away from the public internet, the risk of mission-critical information falling victim to the increasing number of highly-publicised data leaks is reduced. This is a substantial benefit in an age when IBM warns that data breaches can cost a business $3.92 million on average. However, on-premises data centres, while highly strategic in their offerings, must be maintained by in-house teams and supported administratively on the enterprise's dime. In truth, many may not have the expertise or available capital for an on-premises solution – especially when data loads are expanding rapidly and every capacity purchase and installation must be managed by the company itself. Luckily, this doesn't mean that the enterprise has to forego the advantages of the data centre.

Colocation's Ideal Value

Colocation has emerged as a great solution for businesses that need an advanced environment for their workloads but can't afford to – or don't want to – maintain their own data centre. Colocation allows enterprises to house their own server equipment in the facility of a trusted data centre provider. This delivers the privacy, infrastructural security and reliability of a data centre while offloading the expenses associated with maintaining one. By retaining ownership and control over the physical servers, enterprises can gain greater security, performance and uptime while still leveraging flexibility and scalability for dynamic expansion. The key to implementing the ideal colocation solution is finding the right partner. Vital considerations here include location; physical security that protects against natural disasters, man-made disasters and unplanned downtime; compliance and proper certifications; and abundant fibre connectivity – to name a few. Of course, a data centre is just one tool that the enterprise can use in its IT strategy today, and colocation presents an advantageous way to achieve the benefits that come with it. In order to get the best from all workloads and optimise data's efficiency and efficacy, many are turning to a hybrid solution that includes an individualised mix of public cloud, private cloud and data centre. Although dynamic workloads encourage enterprises to diversify and reimagine their architectures, the data centre's unique value ensures that it is here to stay.

### Managing The Evolving Data and Privacy Law Landscape

Different countries, states, and industries are enacting privacy regulations that are significantly increasing the requirements for organisations that handle data as a part of their operations. Evolving responsibilities, liabilities, and fines for data breaches and non-compliance are increasing every year. This article will shed light on how to manage the changing regulatory landscape and provide ideas for how to proactively prepare for future changes.

Core Data Privacy Law Concepts

The world of data privacy legislation compliance is an incredibly complex web that can be overwhelming to process when viewed granularly.
To better understand data privacy laws, it helps to know the common intents they tend to share.

Key Definitions

Though the exact wording may vary slightly from one piece of legislation to another, data privacy laws will generally cover terms of a similar nature.

Personally Identifiable Information (PII): Any information that can potentially be used to distinguish an individual's identity. This includes (but is not limited to) names, addresses, medical records and, in the case of GDPR, even internet cookies used in web tracking.

Data Subject / Data Principal: The individual whose data is collected. They could be a customer, website visitor, or another sort of consumer.

Data Controller / Data Fiduciary: The entity that determines the purpose of the data collection and how it will be processed.

Data Processor: The third party that processes data on behalf of the data controller.

Common Themes in Data Privacy Laws

Data privacy laws are enacted to give consumers greater protection, awareness, and control over their data. The exact mechanisms for how this goal is accomplished may vary, but it remains the principal motive for governments to impose data privacy regulations. As informed by current privacy legislation such as CCPA, GDPR, and HIPAA, data privacy laws can be expected to come with a combination of stipulations from the below list:

Consent: Whether opt-in or opt-out, informed or implied, consent will be a key determiner for compliance with future data privacy laws. When feasible, the best practice will be to ensure that data subjects are provided with the option to give informed consent before their data is collected.

Responsibility for Third Parties: Data controllers relying on a third-party data processor will more often than not be accountable for breaches of the data. Controllers must take great care to audit any third parties they intend to share data with.

Data Breach Reporting: Entities that store sensitive data will continue to be responsible for the timely reporting of any breach of that data.

Penalties: Non-compliant organisations will face penalties in the form of fines or lawsuits from the governments and individuals that entrust the entity to act responsibly in the handling of PII.

Complexity With Overlapping Data Privacy Laws

Preparing for the future of data privacy is no small feat, particularly as a mix of local, federal, and cross-jurisdiction data privacy legislation develops and increases the complexity of compliance. Organisations that become subject to overlapping data privacy laws will need to maintain hyper-vigilance of their situation to ensure they perform their due diligence. While the CCPA has exemptions for organisations that are regulated by HIPAA, a California-based healthcare organisation could find aspects of its operations subject to a mix of CCPA, HIPAA, and GDPR. Further regulations that begin to apply to such an organisation will be difficult to manage if jurisdictions do not develop regulations and amendments with existing legislation in mind.

Steps to Prepare for Future Data Privacy Laws

Preparing for the future of data privacy starts with meeting the needs of the present. Proactive planning, leveraging external specialised resources, and modifying operations to prioritise the protection and responsible handling of data will greatly assist in preparing for the future.
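To make concepts like consent and PII classification slightly more concrete, here is a minimal, illustrative Python sketch that tags collected fields by PII category and flags anything gathered without recorded consent. The field names, categories and consent flag are hypothetical examples, not requirements drawn from any specific law.

```python
# Illustrative sketch only: tag collected fields as PII so that later audits,
# classification exercises and DSAR look-ups have something to work from.
# Field names and categories are hypothetical, not taken from any specific law.
from dataclasses import dataclass

PII_CATEGORIES = {
    "email": "contact",
    "full_name": "identity",
    "home_address": "contact",
    "medical_record_id": "health",
    "tracking_cookie": "online_identifier",  # GDPR treats cookies as personal data
}

@dataclass
class CollectedField:
    name: str
    value: str
    consent_given: bool  # informed, opt-in consent recorded at collection time

def classify(fields: list[CollectedField]) -> dict[str, list[str]]:
    """Group collected field names by PII category, flagging missing consent."""
    report: dict[str, list[str]] = {}
    for f in fields:
        category = PII_CATEGORIES.get(f.name, "non_pii")
        label = f.name if f.consent_given else f"{f.name} (NO CONSENT RECORDED)"
        report.setdefault(category, []).append(label)
    return report

if __name__ == "__main__":
    sample = [
        CollectedField("email", "jane@example.com", consent_given=True),
        CollectedField("tracking_cookie", "abc123", consent_given=False),
    ]
    print(classify(sample))
```

A simple record like this is also the kind of structure that makes the data auditing, classification and DSAR activities described below far less painful.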
Appoint a Data Protection Officer (DPO)

GDPR requires many organisations to appoint a DPO to monitor the organisation's data protection compliance and advise the organisation of its data protection obligations. Organisations that are not required to appoint a DPO may still wish to invest in a similar role or service that specialises in data security and privacy compliance.

Privacy by Design

The key to easing the growing pains of future data privacy laws is to make privacy a priority from the start. By making privacy the default operation, organisations that rely on data will be far more agile in responding to developments in legislation. A Privacy by Design approach follows 7 foundational principles:

Proactive and Preventative, Not Reactive and Remedial
Privacy as the Default
Privacy Embedded into Design
Full Functionality – privacy and security work together without trade-offs
End-to-End Security – security through the entire lifecycle
Visibility and Transparency
Respect for User Privacy

By prioritising privacy, organisations will qualify for competitively advantageous compliance certifications such as Privacy Shield and Privacy by Design.

Policies & Procedures

Organisations must thoroughly plan and understand how they interact with data as a part of their operations. These plans should be reviewed quarterly to ensure they are compliant with the latest developments. Areas of focus:

Privacy Policy: Provide readily accessible and up-to-date privacy policies that thoroughly set out the nature of the data collected, the rights the data subject has, and so on.

Incident Response Plan: Data breaches are a matter of when, not if. A detailed incident response plan sets out the responsibilities and procedures to be carried out in the event of a data security incident.

Data Minimalism: With the emphasis on data processors and controllers having responsibility for the safe handling of data, minimising the amount of data that is processed and kept is a must.

Data Auditing: To properly plan for evolving data management needs, organisations need to thoroughly understand the role that data plays in their operations.

Data Classification and Categorisation: Following the data audit, organisations will better understand their data processing pipeline and be in a position to identify the nature and sensitivity of the data they are responsible for. This process will help greatly in the event of a Data Subject Access Request (DSAR).

### You have lots of APIs and Microservices – now what?

In business, IT teams that create APIs and microservices can reap benefits across the entire organisation. However, there can come a time when the number of APIs created becomes too large to manage. This can get out of hand quickly, resulting in APIs and microservices that go unused or duplicate one another, as multiple teams create similar products in parallel. If organisations want to maintain a healthy pipeline of managed APIs and microservices, then a certain degree of discipline is needed. Principles must be established, and it's important for businesses to understand them if they are to enhance their API program successfully.

Manage the API portfolio

Organisation and strategy are crucial in all aspects of business, and portfolio management is the most significant change any IT department can make in taming its APIs and microservices.
A lack of portfolio management can result in a snowball effect for businesses, where new APIs are built without knowledge of others that already exist. When an API is created, there needs to be a clear structure in place, highlighting who is responsible for its upkeep – whether that's the developer team or someone else within the IT department. Today's economy is dominated by digital products, services and experiences that are developed, tested and upgraded at speed. APIs present the only reliable path to growth in this landscape by short-circuiting the product development cycle. However, without any structure in place, a portfolio of APIs and microservices can swell as different parties add to it without communication, ultimately leaving the product as an unorganised failure. Managing a portfolio takes dedication, and adherence to rigorous controls and processes. As APIs grow and evolve, it's important that businesses seek to control their portfolio. This can be done through the following actions:

Set boundaries for the portfolio: Not every API will serve the same purpose (linking customer accounts, processing orders, and supporting an inventory for an eCommerce platform are all different purposes an API can serve). By categorising APIs and microservices into different areas, teams can divide the work, and it sets the groundwork for a much more organised portfolio.

Define the process and stick to it: Once APIs have been categorised, future confusion can be avoided by defining a clear process for adding new APIs to the portfolio.

Manage URL paths: IT teams need to manage their URL paths through the use of prefixes or other naming conventions (a simple consistency-check sketch appears later in this article).

Consistency is key, especially with governance

Strong API governance should encourage consistency across the organisation, combined with the flexibility to support changing market needs. Smaller enterprises may not have the resources available to support this level of governance, and so a single API architect will often 'make do'. While the one API architect may not necessarily meet the terms for true API governance, they are often performing the role of a full governance team – coaching teams on API design techniques, delivering educational material and creating policies for onboarding. In a smaller business, API governance may only consist of one individual, but it's important to develop and expand into a small team. Whatever it looks like, a level of consistency is crucial as the number of APIs and microservices increases.

API adoption needs to be as accessible as possible

Developers are the lifeblood of any API program. However, this doesn't mean API adoption should be reserved for developers alone – it's important to make the strategy and processes clear to all parties involved in order to make API adoption as seamless as possible. Common tasks to drive increased API adoption include maintaining comprehensive documentation, helping developers to get started quickly, and defining a clear onboarding process. Focusing on API adoption helps to build awareness of available APIs, and prevents wasting time and resources constructing APIs that are rarely, if ever, used by developers.

Having good intentions, and sticking to them

All API initiatives start with good intentions. However, the trick to maintaining a healthy API program is ensuring that governance is maintained throughout the entire process. Smaller organisations will never be able to follow the procedures that large organisations can when it comes to API strategies.
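As a hedged illustration of what lightweight, automated governance could look like in practice, here is a minimal sketch of a consistency check over a portfolio registry's URL prefixes and ownership records. The registry entries, domain prefixes and rules are hypothetical examples, not a standard or any specific product's API.

```python
# Illustrative sketch only: a lightweight governance check an API team could run
# over its portfolio registry. The registry entries, domain prefixes and rules
# below are hypothetical examples.
import re

# Agreed URL prefixes per business domain (set during portfolio categorisation).
DOMAIN_PREFIXES = {
    "accounts": "/accounts/v",
    "orders": "/orders/v",
    "inventory": "/inventory/v",
}

# A minimal portfolio registry: who owns each API and where it is exposed.
PORTFOLIO = [
    {"name": "customer-accounts", "domain": "accounts", "owner": "team-crm", "base_path": "/accounts/v1"},
    {"name": "order-processing", "domain": "orders", "owner": "team-commerce", "base_path": "/orderProcessing/v1"},
]

def check_portfolio(portfolio):
    """Flag APIs whose base path breaks the agreed prefix or versioning convention."""
    findings = []
    for api in portfolio:
        expected = DOMAIN_PREFIXES.get(api["domain"])
        if expected is None:
            findings.append(f"{api['name']}: domain '{api['domain']}' has no agreed prefix")
        elif not re.match(rf"^{re.escape(expected)}\d+$", api["base_path"]):
            findings.append(f"{api['name']}: base path '{api['base_path']}' should start with '{expected}<n>'")
        if not api.get("owner"):
            findings.append(f"{api['name']}: no owner recorded for upkeep")
    return findings

if __name__ == "__main__":
    for issue in check_portfolio(PORTFOLIO):
        print("GOVERNANCE:", issue)
```

Even a check this small gives a single API architect something repeatable to run, which is often all the governance a smaller team can realistically sustain.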
For smaller organisations, therefore, having a lightweight set of coordinating processes and a focus on encouraging API adoption will go a long way. For larger organisations and enterprise IT, it is critical to install a governance team that can help organise the API program, drive consistency, and help grow API integrations. Launching and maintaining an API program is a challenge, regardless of company resources. However, if performed correctly, it has the ability to transform an organisation and reap benefits that include a healthy pipeline and a clear onboarding roadmap, providing quality APIs and microservices to those who need them.

### Data virtualisation – the missing link in cloud migration

Over the years, cloud computing has shifted away from being an optional part of a wider business strategy. It's now seen by many as a necessity; a way of life. Thanks to the promise of increased agility, flexibility and scalability, many have already joined the wave of cloud adopters currently sweeping the business landscape. Yet, despite recent reports predicting that cloud-specific spending is expected to grow at more than six times the rate of general IT spending through 2020, the reality is that the majority of organisations have yet to reach the advanced stages of cloud implementation. Whether it's challenges with moving existing data to the cloud, security, worries about cost or a lack of adequately trained employees, something is standing in the way and – as a result – many are failing to reap the full benefits associated with transitioning to cloud. This is where data virtualisation can help.

The missing link

By developing a single, logical view of all the data a business possesses, data virtualisation enables uniform access to all information, regardless of where it is being stored. It allows the creation of a virtual infrastructure – spanning both on-premise and cloud environments. This, in turn, helps businesses to overcome connectivity, security and compliance challenges and feel more confident about keeping data in a cloud environment. In essence, data virtualisation simplifies the whole cloud process; empowering businesses and enabling them to go one step further when it comes to cloud migration. For example, one of the first cloud-related challenges that organisations come up against will often occur in the migration stages. Moving data from an on-premises repository into a cloud one is no easy feat. This is, in part, due to businesses needing to be able to migrate their data to the cloud without affecting their existing applications. Data virtualisation solves this issue by decoupling the data consumers from the underlying technical infrastructure, allowing for a seamless transition. Without this, organisations risk harming overall business productivity and even suffering downtime, both during and following the migration period. Another big challenge is how to integrate the new cloud systems with the data sources which will inevitably remain on-premise, at least for some time. By generating a logical overview of all enterprise data, without having to duplicate information in a physical repository, data virtualisation enables organisations to save time, money and resources.

The promised land for big data analytics

For many businesses, cloud computing is seen as a tool which will enable them to remain relevant to their customers and gain the competitive edge.
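To picture the decoupling described earlier in this article, the sketch below routes queries for a logical dataset to whichever physical source currently holds it, so consumers never change when data moves. The source names, datasets and connection stubs are hypothetical placeholders rather than any particular data virtualisation product.

```python
# Illustrative sketch only: the core idea of data virtualisation – consumers query one
# logical view while a routing layer decides which physical source answers. The source
# names, datasets and connection stubs are hypothetical placeholders.
from typing import Callable

# Physical sources: in reality these would be database clients; here they are stubs.
def query_on_prem_warehouse(sql: str) -> list[dict]:
    return [{"source": "on_prem", "rows": f"result of: {sql}"}]

def query_cloud_store(sql: str) -> list[dict]:
    return [{"source": "cloud", "rows": f"result of: {sql}"}]

# The virtual layer's catalogue: which logical dataset currently lives where.
CATALOGUE: dict[str, Callable[[str], list[dict]]] = {
    "customers": query_on_prem_warehouse,   # not yet migrated
    "web_events": query_cloud_store,        # already in the cloud
}

def query_logical_view(dataset: str, sql: str) -> list[dict]:
    """Consumers call this one function; migration only changes the CATALOGUE mapping."""
    backend = CATALOGUE[dataset]
    return backend(sql)

if __name__ == "__main__":
    # The consumer's code stays identical before and after 'customers' moves to the cloud.
    print(query_logical_view("customers", "SELECT * FROM customers WHERE region = 'EMEA'"))
    CATALOGUE["customers"] = query_cloud_store  # simulate completed migration
    print(query_logical_view("customers", "SELECT * FROM customers WHERE region = 'EMEA'"))
```

The point is simply that consumers bind to a logical name rather than a location, which is what allows migration to happen without breaking existing applications.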
That promise of competitive edge is never truer than in the context of big data analytics. By using big data analytics in the cloud, businesses can analyse larger amounts of data – both structured and unstructured – than ever before. This can help them to monitor business trends and glean a fuller picture of what is happening throughout the entire organisation; insights which can then be used to drive productivity and inform key business decisions. But, for many organisations, relocating big data into a cloud environment can be a cause for concern, especially when it comes to security and governance. This is because big data will often incorporate sensitive information, such as the personal details of individuals working within an organisation or even the customers that they interact with each day. This data needs to be carefully managed so users get access only to the pieces of data they are authorised to see. With GDPR and other data protection regulations coming into force, the potential risks of not fully protecting this type of data could be substantial for any business, regardless of size or sector. The initial fines – followed by the long-standing reputational damage – that a business in violation of GDPR will face could be enough to damage its brand beyond repair. In addition, to ensure data quality it's also crucial to guarantee that the data delivered to the users of analytics initiatives complies with the formats and rules defined by global governance policies. It's no wonder that moving big data to the cloud can be somewhat intimidating! The fear, alongside the sense of a loss of control, so often associated with migrating to a cloud environment can make it difficult to get everyone on board. It can feel almost like taking a leap of faith into the unknown. However, data virtualisation can help to calm these fears and put minds at ease. By integrating all enterprise data across various disparate systems, it helps to create a centralised security and governance data delivery infrastructure. This provides businesses with a single point from which they can apply security and governance policies and rules across all data, whether it is being stored in the cloud or on premise. This is the basis for providing a 'single source of truth' for data consumers, while guaranteeing security across all locations.

Optimising cloud migration

Although migrating to cloud is no longer a 'new' concept, many businesses are still failing to reap the rewards in full. Data virtualisation is changing this, enabling organisations to enter the world of cloud at full speed. By providing a single, logical view of all data no matter where it resides, data virtualisation is opening the door for all businesses, helping them to migrate data smoothly and diminish any security or compliance fears. Data virtualisation is opening a new chapter in cloud migration, full of opportunity for those organisations looking to stand out from the crowd and gain a competitive edge.

### 4 Ways Armoured Vehicles Can Teach Us About Cloud Cybersecurity

Cybersecurity teams find themselves on the front lines of protecting their organisations' data and reputations. In much the same way that militaries have tweaked and optimised armoured vehicle designs over the last few centuries to be both impenetrable and mobile, security teams are tasked with armouring their assets without decreasing the ease of regular business operations.
One of the most rapidly-evolving elements of modern cybersecurity is cloud migration. It's extremely common for organisations to deploy a hybrid combination of traditional, on-premises IT and cloud tools, and it won't be long before all enterprises have at least some cloud infrastructure. How can organisations secure both types of environments seamlessly while keeping up with the breakneck pace of the growth of cloud resources? It's helpful to review the history of armoured vehicles, look at what they can teach us and apply the findings to modern-day cloud operations. Here are four examples we can think about in the context of developing a cloud security strategy:

1. Use Vulnerability Management to Find the Chinks in Your Armour

Long before the modern tank, "war wagons" were built during the Middle Ages by adding steel plates to wagons before they were taken onto the battlefield. As we moved into the 20th century, armoured vehicles got a boost from petrol engines, but "armoured cars", for the most part, were still existing cars with the addition of armoured plates. Trying to find a balance between security and functionality would often end with mixed results. Does this sound familiar? Cloud security teams, especially those working at DevOps velocity, can be seen as a hindrance to efficiency. There's a misconception that security tools always slow down processes, but new DevOps security tools are emerging that do keep up seamlessly. As your business migrates to the cloud, you will need to review your own armour, which may need modernising to reflect current cloud-powered infrastructures. Expanding capabilities with new operating systems and storage types is exciting, but it's important to stay cognisant of the new risks that come with it. If you are utilising a vulnerability management tool, it will give a clear indication of where these risks are located and will help you prioritise and mitigate them in order of the severity of risk they pose.

2. Make Your Battleplan More Flexible With DevOps Security Tools

As the 20th century progressed, armoured vehicles became more popular as there were further developments in their engineering and construction. The greater variety of vehicles across the battlefield allowed for a much higher degree of specialisation and tactical flexibility, but that required new skillsets and further investment in engineering expertise. Reflecting this change in military tactics and specialisation, in 1939 the British Army created the Royal Armoured Corps by joining the tank corps with cavalry units. Technological advances and the rapid growth of the cloud also present a need for the right tools, knowledge and procedures to support a growing, in-demand and ever-changing infrastructure. Adapting to today's practices means not rushing the cloud adoption process and having DevOps-orientated security tools examine the development and testing phases, helping to ensure both security and compliance requirements are met. Having this take place throughout the development lifecycle better reflects the new engines of cloud operations.

3. Specialisation Lets You Respond Quickly to New Demands

As armoured vehicles became more advanced, they were engineered with more specialisation. Vehicles ranged from the small and nippy to the bigger but sluggish armoured tanks. It used to be that the weapons would be fastened to the vehicle.
However, these were then adjusted to meet the specific needs of the underlying platforms or chassis. Similarly, the cloud is driving specialisation of security toolsets. It's important to recognise the best tools for addressing today's risks. An issue could arise if your cloud asset assessments are not supported by your security products, which could end up hindering any scope to expand your current cloud investment. Your strategies also need to account for faster provisioning of servers and applications.

4. Bring the Security Conversation onto Business Turf

Today, some people may find it difficult to identify an armoured car as it goes by. Bulletproof glass can make armoured cars look like all the other cars on the road. Extra protection is no longer limited to military use, and new sensibilities come with that shift: those who seek and can afford these armoured vehicles also want a balance of comfort and consumer-friendly practicality. This kind of balance also needs to be struck among security tools. As security responsibility expands across other areas within the business, security information, tools and processes need to cater to a broader audience. There used to be a time when those within the SOC would have operated in isolation while scanning asset security. With DevOps, this has changed, and users can now test and scan throughout the lifecycle of their deployments. In order to secure the DevOps process, security tools and processes need to work for developers too, not just security specialists.

In for the long haul

Though it may seem completely unrelated, over a century of armoured vehicle innovation and design provides several lessons when considering security for cloud systems. Certainly, the environment has become more challenging as we try our utmost to build security to meet the demands of the industry, but as Art of War author Sun Tzu said, "in the midst of chaos, there is also opportunity!"

### Boost Your Small Business With These AI-Powered Tools

Is it any wonder that artificial intelligence remains controversial? After all, consumers are still complaining about self-checkout machines in retail a good 20 years after their inception – and that's a technology most laypeople understand. When it comes to true artificial intelligence, even experts are sketchy on the full impact of the technology. How will it impact the job market, and what are the dangers of all that shared data? A significant 37 percent of experts surveyed by the Pew Research Center in 2018 still said they believe people will not be better off from AI-related technology by 2030. "Without significant changes in our political economy and data governance regimes [AI] is likely to create greater economic inequalities, more surveillance and more programmed and non-human-centric interactions," Institute for the Future Executive Director Marina Gorbis told Pew in response to the survey, "Artificial Intelligence and the Future of Humans." The Covid-19 pandemic might help quash some mistrust of AI, however. Scientists can do so much more than mathematically plot the expected spread of a disease. Thanks to AI, computing power can be used to incorporate compiled data on variables such as people's actual movements in densely populated areas, creating enhanced progression forecasts.
AI has also been utilised in the rapid development of new antiviral drugs and vaccines, with algorithms helping candidates to be created in as little as 12 months compared with the 4-5 years of traditional research. Vital developments such as these only reinforce what many in the tech sphere have known for some time: artificial intelligence is no longer the technology of the future; it is the tech of the here and now. It just might not be taking the form many expect. Famed science-fiction author Isaac Asimov might have explored the world of AI through his depiction of robots beginning in the 1940s, but by 1950 at least one eccentric mathematician, Alan Turing, foresaw the ability to create artificial intelligence outside of a mechanical body. Turing wrote about the potential for software run on a virtual computer that could observe the environment and learn new things, whether those lessons be in mastering chess or understanding human languages. "We may hope that machines will eventually compete with men in all purely intellectual fields," Turing predicted in his article, "Computing Machinery and Intelligence." Fast-forward 70 years, and Turing's predictions have come to life. AI now allows machines to learn from experience and perform cognitive tasks. Its use in business is without question, whether it's analysing data to the nth degree – making predictions that previously only the most gifted mathematicians could hope to forecast – or saving countless productivity hours on mundane tasks. The same technology that can predict hurricanes can also be used to quickly and accurately balance a budget or create a successful marketing strategy. Still, many hesitate because they don't fully grasp just exactly what AI means – and how it can work for them. Simply put, AI is the ability of computers to observe, deduce, learn and assist in decision-making tasks to solve problems. These can be either problems based on data too advanced for most people to accurately analyse or problems that require productivity, time and money to complete. So that's what AI does, but how does it work? Large amounts of data are combined with intelligent algorithms – series of coded instructions that allow software to analyse and learn from patterns and sequences it locates within data. People have been trying to understand algorithms since Google first began touting its superior PageRank method of determining search engine results. How does the algorithm work? The answer transcends what the average user comprehends, and Google remains mum on the precise formula and its many updates. Fortunately, explanations of other complex algorithms are available. HostScore, for example, explains in detail how its scoring system ranks web hosts. Its algorithms weigh factors including uptime scores, speed scores, editor's scores and user's scores, all based on frequently updated real-time data. It even explains exactly how each of the scores is derived and combined for a final rank. Similar software and algorithms can complete a whole host of tasks and solve complex problems in almost any sphere, but AI is currently taking the small business world by storm. In fact, when HubSpot conducted a survey of 1,400 global consumers, it found that 63 percent of respondents were already using AI without knowing it.
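Before looking at specific tools, the weighted scoring described above is straightforward to picture. The sketch below combines several sub-scores into a single rank using fixed weights; the weights and inputs are purely hypothetical and are not HostScore's or Google's actual formula.

```python
# Illustrative sketch only: combining several sub-scores into a single ranking score
# using fixed weights. The weights and inputs are hypothetical, not any vendor's formula.
WEIGHTS = {
    "uptime": 0.40,
    "speed": 0.30,
    "editor_score": 0.15,
    "user_score": 0.15,
}

def overall_score(sub_scores: dict[str, float]) -> float:
    """Weighted average of sub-scores (each expected on a 0-10 scale)."""
    return sum(WEIGHTS[name] * sub_scores[name] for name in WEIGHTS)

hosts = {
    "host-a": {"uptime": 9.8, "speed": 7.5, "editor_score": 8.0, "user_score": 7.0},
    "host-b": {"uptime": 9.2, "speed": 9.0, "editor_score": 7.5, "user_score": 8.5},
}

# Rank hosts by their combined score, highest first.
for name, scores in sorted(hosts.items(), key=lambda kv: overall_score(kv[1]), reverse=True):
    print(f"{name}: {overall_score(scores):.2f}")
```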
Just as they embraced cloud technology to solve everyday storage and collaboration problems, small businesses frequently are relying on AI to automate mundane tasks, solve problems and boost productivity, including in the following 10 categories:

1. Personal Assistants

Any small business owner can vouch that time is money, and so it's really no surprise that more are relying on the power of AI to serve as their personal assistants, automating tasks such as emailing, text messaging, scheduling and taking notes. X.ai, for example, is a personal assistant designed for scheduling. The tool monitors the user's schedule and availability, manages appointments and meetings, creates notifications and even responds to emails. Other virtual assistant apps complete a more extensive array of tasks. Extreme Personal Voice Assistant, for example, claims to be as helpful as Tony Stark's JARVIS voice assistant, connecting a variety of apps while understanding and responding to questions and commands.

2. Customer Relationship Management

The whole point of CRM is improving business relationships, but the recipe for doing so often remains elusive. Thanks to AI-powered software and apps, that formula is a lot easier to decipher. By using AI to analyse company and industry data, small business owners can gain vital understanding of their internal and external customers' needs, wants and drivers. Crystal, for example, helps contact sales leads in the best possible way based on their personalities. It integrates data from LinkedIn, Salesforce, HubSpot and more sources, then evaluates anyone's personality based on their virtual persona. It then provides personalised advice on the best way to contact, communicate with and market to each lead.

3. Legal Advice

Few small businesses will ever get by without any need for legal advice, whether it's analysing and understanding contracts, abiding by applicable laws and regulations, controlling liability or avoiding litigation. Lawyers, however, are costly – lawsuits alone cost U.S. companies $150 billion a year – and often not in the operating budget. Now, a variety of AI-based tools can provide needed legal advice at a fraction of the cost. Legal Robot, for example, is an AI tool that can understand complex language. It compares thousands of legal documents to create a legal language model and analyse any sort of contract. Likewise, Intraspexion relies on a deep learning platform that acts as an early-warning system to help prevent litigation.

4. Cyber Security

Users have been relying on specialised software to protect their online properties and identities for decades, but now that cyber attackers are relying on AI, it only makes sense to employ it in cybersecurity efforts. Small businesses turn to platforms like Recorded Future to provide real-time threat intelligence that helps them proactively defend their entire companies against cyberattacks. Its security intelligence can accelerate detection, decision and response times thanks to a combination of machine and human analysis based on open source, dark web, technical sources and original research.

5. Human Resources

Balancing human resource-related tasks with other everyday operations can be a challenge for a small business. Fortunately, AI tools now can automate much of the process. Resources like TextRecruit employ text messaging and live chats to interact with job candidates, customising the content to a company's brand and voice.
It also assists newly hired employees with onboarding tasks. Meanwhile, AI tools like Zenefits streamline and automate a variety of HR-related functions, such as payroll and benefits.

6. Lead Generation

According to the Harvard Business Review, companies that rely on AI tools for sales increase their leads by more than 50 percent. Platforms like OptinMonster offer a variety of AI-based marketing and sales tools. Used by more than 1 million websites, OptinMonster has been responsible for 217 million conversions since its 2013 inception thanks to campaigns like exit-intent popup forms, scroll boxes and footer boxes. It uses AI technology to analyse factors like site visitors' mouse gestures and velocity to determine the exact moment a user is about to leave a site, then prompts them with a targeted campaign.

7. Customer Service

According to Oracle research, 80 percent of leaders within sales and marketing already use or plan to use chatbots on their websites. These forms of AI rely on natural language processing technology to decipher customer inquiries based on key terms. Tools like FlowXO's chatbots employ AI to respond to common customer service requests, based on a company's own documentation – saving valuable human productivity time for more in-depth issues. Those that also implement augmented messaging technology, such as Salesforce's Service Cloud Einstein, can even identify opportunities when a human agent should become involved. Sentiment analysis tools go a step further and identify necessary escalations based on the emotion expressed by a customer.

8. Data Analysis

AI-based data collection and analysis tools can help small businesses compete with larger companies when it comes to using that data to make critical business decisions. Data collected from business systems including sales, customer service, marketing and accounting can be integrated into actionable business intelligence. Insight Squared, for example, compiles and analyses historical data from a company's various internal systems to generate recommendations for improved sales, marketing and staffing. Likewise, IBM Cognos Analytics' AI-powered technology can not only help small businesses visualise performance with a variety of customisable dashboards and reports, but it also helps users interpret their data, complete with actionable insights.

9. Accounting

Balancing the budget is a vital aspect of any small business, but not one at which all small business owners are skilled. They can save the money spent hiring an accountant thanks to AI-based technology. Companies like Xero provide accounting software that doesn't just compile and report information, but also automates tasks like reconciliation and analysis. Even QuickBooks, which small businesses have relied on for accounting and bookkeeping software for decades, has introduced AI technology that uses predictive modelling to help companies better manage their cash flow.

10. Content Strategy

No matter a company's size or industry, content remains king in terms of marketing. How many small businesses, however, have the manpower and the budget to employ a professional content strategist? Various AI-powered tools now enable small business owners to develop a successful content strategy and increase their revenue. Tools like MarketMuse can analyse content and compare it to the competition, advising users how best to edit and expand upon early drafts while optimising for traffic and lead generation.
Cortex, on the other hand, helps determine the best timing and channels for publishing content on platforms like Facebook, Twitter and Instagram by analysing competitors' content and its efficiency. It's an exciting time for small businesses to embrace AI-powered tools. Is there a favourite you've already employed, or were you using an AI business tool without even realising it?

### Achieving integration excellence in the cloud

Integration projects were once a back-end IT operation. While important, they would typically involve a small team hidden away from the rest of the organisation, focusing on bringing data into business applications such as SAP or Oracle. Today, integration is recognised as increasingly important to anything an organisation's IT team wants to achieve. A new business application, for example, must be supported by data from established parts of the business, while a cloud migration will require integration to transport data and to stitch new cloud applications into the rest of an organisation's operations and infrastructure. According to Gartner, however, the rate of innovation and digital change today means that traditional methods of integration are no longer up to the task. Instead, as with many areas of business today, the most effective way of connecting data and applications lies in the cloud. In addition to integrating their own applications and data, the deployment of an integration cloud will enable businesses to integrate with their partners, customers and suppliers, and for their products to integrate with those created by other organisations. While an effective integration cloud will significantly improve an organisation's agility and productivity, establishing an integration centre of excellence (ICoE) will help ensure the deployment of that integration cloud works to its best possible effect.

From beginning to end

Sometimes referred to as an integration competency centre, an ICoE ensures that all integration projects are completed to the highest possible standard, following best practices across all relevant disciplines, and making the most efficient use of an organisation's people and IT resources. An ICoE will be involved from the beginning of an integration project to its end and will fundamentally be concerned with governance – including the definition of standards, best practices and team responsibilities, as well as their consistent application across an organisation for each of its integration projects. During its initial engagement with a project, the ICoE will undergo a process of discovery. Meeting with all relevant stakeholders, the team will gather requirements from which it will formulate guidelines and build an implementation plan. By the end of this discovery phase, it will have defined a project roadmap and a reference architecture, along with the deliverables and processes to be used for building and then managing the integration. The ICoE will then ensure work is carried out according to best practices for building, designing, and testing integrations. This will result in a working integration being built much faster, cheaper, and smarter than it would have been without the ICoE's involvement. Once it is up and running, the ICoE will work with the organisation's IT team to help govern and manage the integration. If sufficient monitoring tools aren't in place, for example, it will be up to the ICoE to make appropriate recommendations.
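What a standardised approach to monitoring and alerting might look like in practice is sketched below: a single, shared alert-policy shape that every integration registers and that the ICoE can validate automatically. The fields, thresholds and naming rules are hypothetical examples rather than a prescribed standard.

```python
# Illustrative sketch only: one way an ICoE could standardise how every integration
# declares its monitoring and alerting, so operations teams see a consistent shape.
# The fields, thresholds and channel naming rule are hypothetical examples.
from dataclasses import dataclass

@dataclass
class AlertPolicy:
    integration_name: str
    owner_team: str
    max_error_rate: float       # proportion of failed messages before alerting, e.g. 0.02
    max_latency_seconds: float  # end-to-end latency threshold
    alert_channel: str          # where alerts are routed, e.g. an ops queue or chat room

def validate(policy: AlertPolicy) -> list[str]:
    """Reject policies that don't meet the ICoE's minimum governance standards."""
    problems = []
    if not policy.owner_team:
        problems.append("every integration must name an owning team")
    if policy.max_error_rate > 0.05:
        problems.append("error-rate threshold looser than the agreed 5% ceiling")
    if not policy.alert_channel.startswith("ops-"):
        problems.append("alert channel must follow the 'ops-*' naming convention")
    return problems

if __name__ == "__main__":
    p = AlertPolicy("crm-to-billing", "integration-team", 0.10, 30.0, "billing-alerts")
    for issue in validate(p):
        print("POLICY ISSUE:", issue)
```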
Then, once the necessary tools have been implemented, the ICoE will work with the operations team to ensure a standard approach for monitoring and alerts. This way, the organisation can be sure that its integrations are not only being built consistently, but that they're being deployed and run consistently too, in accordance with the roadmap and reference architecture defined at the beginning of the process.

Making a difference

The introduction of an ICoE represents both a cultural and an organisational shift. To make it easier to resolve any conflicts or concerns that may arise when the ICoE recommends doing something in a way never before tried, it's vital that an organisation's IT leaders show their support for the ICoE and its work. After all, it can be hard enough to implement organisational change at the best of times; doing so without the buy-in of a company's C-suite could be virtually impossible. With this support, though, an ICoE can make a real difference for an organisation. Integrations will be built better, at a lower cost, and rolled out more quickly, enabling stakeholders across the business to access the data and applications they need to do their jobs. The ICoE will enable the business to more easily integrate with its partners, customers and suppliers. By moving integration projects to the cloud, organisations will be better able to improve efficiencies and achieve the level of agility and flexibility they need to thrive and survive in today's competitive business environment.

### What does Machine Learning mean?

In recent years machine learning has been gaining more and more popularity, but what exactly is machine learning? In this section we will dive into the subject; by the end of the topic you will understand what machine learning is, its main types, and how it is used.

Evolution of Machine Learning

The name machine learning originated with the famous games researcher Arthur Lee Samuel, the first person to bring self-learning programs into society. This remarkable work soon laid the foundation for machine learning algorithms. In later years, the rising popularity of machine learning and artificial intelligence gave birth to many innovations in the fields of computing and automation. However, similar definitions and overlapping usage of ML and AI created ambiguity in distinguishing these two fields. In fact, beginners in this field often use AI and machine learning interchangeably, but they are not the same. Artificial intelligence is the broader discipline, which integrates machine learning algorithms among other techniques. Artificial intelligence models are used to perform wide-ranging tasks such as driving autonomous cars or controlling humanoid robots; machine learning, on the other hand, is used to accomplish specific tasks like spam detection, movie recommendation, and image classification. In short, machine learning is a subfield of AI, as the picture below illustrates.

[Figure: machine learning shown as a subfield of artificial intelligence]

Machine learning is broadly segmented into three types.
Supervised machine learning
Unsupervised machine learning
Reinforcement learning

Supervised machine learning

Supervised machine learning is the most commonly used technique; many industries use it to train machine learning algorithms. In supervised machine learning we supervise, or teach, the machine using labelled data. In other words, we show the machine a sample and tell it what the label is, and we do this for every sample in the dataset.

[Figure 1: supervised learning, with a teacher providing labels for each cat and dog image]

Figure 1 explains the workings of supervised machine learning. In figure 1, the dataset consists of 'n' labelled cat and dog images, each tagged with its label. For instance, Image 1 is labelled as a cat, and likewise there are 'n' labelled images from 1 to n. In supervised learning, the teacher holds the actual value for every image in the dataset, while the learning system produces a predicted value for every image. Once the outputs from the teacher and the learning system are available, an error function calculates the error between the actual and predicted values. Using this feedback, the learning system keeps updating its parameters (weights) to minimise the error value. Eventually, this process of learning the parameters (weights) helps the learning system capture the underlying model.

Unsupervised machine learning

In contrast to supervised machine learning, unsupervised machine learning doesn't have any monitoring or teaching system; we let the machine figure out its own model from unlabelled data.

[Figure 2: unsupervised learning, with unlabelled cat and dog images grouped by clustering]

As in figure 1, the dataset in figure 2 consists of 'n' images of cats and dogs, but now the images are unlabelled (we don't give their tags to the learning system). For instance, Image 1 is now unlabelled: it may be a cat or a dog, and likewise there are 'n' unlabelled images from 1 to n. To tackle this, an unsupervised learning system uses the concept of clustering. Clustering is a grouping technique in which objects with similar features are grouped together. In the example of figure 2 we have two different categories (cats vs dogs), so we design our learning system to cluster the images into two separate groups. We achieve this through feature selection; in other words, similar images are grouped together and dissimilar images are kept apart.

Reinforcement learning

Reinforcement learning is a special type of machine learning used particularly in game theory, signal processing and control systems.

What next

Most machine learning algorithms fall under these categories, but recent advances in machine learning have created many more; among them, semi-supervised learning is worth mentioning.

Why and where we use machine learning

According to Forbes, machine learning is becoming a common buzzword in the medical, finance, transportation, e-commerce and many more sectors. But why are companies focusing on machine learning now? The answer lies in its capabilities.
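As a hedged illustration of the supervised/unsupervised distinction described above, the sketch below trains a classifier on labelled points and then clusters the same points without labels. It assumes numpy and scikit-learn are available, and the two-feature "cat/dog" data is invented purely for the example.

```python
# Illustrative sketch only: the supervised vs unsupervised split described above,
# shown on tiny synthetic data (two numeric features standing in for image features).
# Requires numpy and scikit-learn; the data and labels are made up for the example.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
# Two loose groups of points: think of them as simple features extracted from cat/dog images.
cats = rng.normal(loc=[0.0, 0.0], scale=0.5, size=(20, 2))
dogs = rng.normal(loc=[3.0, 3.0], scale=0.5, size=(20, 2))
X = np.vstack([cats, dogs])

# Supervised: we provide the labels (the "teacher"), and the model learns to predict them.
y = np.array([0] * 20 + [1] * 20)  # 0 = cat, 1 = dog
clf = LogisticRegression().fit(X, y)
print("supervised prediction for a new point:", clf.predict([[2.8, 3.1]]))

# Unsupervised: no labels at all – the algorithm groups similar points by itself.
clusters = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)
print("unsupervised cluster assignments:", clusters)
```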
Do you know how Amazon manages to show you the same product that you saw on its website last night? Have you ever wondered how Netflix is able to recommend the movies you are interested in? The simple answer is machine learning algorithms, which excel at forecasting future trends and understanding customer behaviour. Likewise, there are many other applications of machine learning in different fields, such as computer vision.

### DDoS Attacks Vary Based on Date and Time

Distributed Denial of Service (DDoS) attacks are a growing threat to organisations' security and profitability. A DDoS attack that takes down an organisation's web presence can dramatically damage sales numbers as customers give up and visit another page in the face of slow page loads or failed connections. Regardless of when a DDoS attack occurs, having a DDoS protection solution in place is essential to ensuring website availability. However, it can't even be assumed that a DDoS attack will occur when an organisation is able to respond and take steps to remediate it. The patterns of life of DDoS attacks indicate that they are more likely to occur outside of standard business hours, making an automated DDoS protection solution even more essential.

Introduction to DDoS Attacks

A Denial of Service (DoS) attack is designed to degrade or destroy the ability of a system to provide services to legitimate customers. This can be accomplished in a number of different ways. DDoS attacks take the approach of sending a massive amount of data to a system, overwhelming its ability to process the malicious data as well as the connections of legitimate users. Overwhelming the ability of most online services to function requires a great deal of computational power and networking bandwidth under the control of the attacker. At some point, every byte of data that is received and processed by the victim must be sent by some other system. Depending upon the amount of computing power that an attacker owns, it may not be possible to overwhelm a target using only their own systems. Cybercriminals overcome these difficulties in a number of different ways. Some DDoS attackers operate massive botnets. Building these botnets has become cheaper and easier in recent years as the Internet has been flooded with Internet of Things (IoT) devices (with default passwords and insecure protocols) and cloud computing has made it possible to lease cheap, Internet-connected computing power. These devices are also scattered all over the world, making it harder to identify and block malicious traffic. Other innovations in DDoS attacks, like the discovery and exploitation of potential DDoS amplifiers, have also helped to make massive DDoS attacks easier to perform and much more common. The low price of DDoS attacks has also made it possible for DDoS botnet operators to sell their services (and that of their botnets) to interested customers. With a 1,000-device DDoS attack costing only about $7 per hour to run, an enterprise-scale DDoS attack is within the reach of a wide variety of potential customers. As DDoS-for-hire providers tailor their services to certain users, DDoS attacks are starting to fall into patterns, making it easier to predict when an attack is most likely to occur.
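As a rough illustration of how such timing patterns could feed an automated defence, here is a minimal sketch that escalates straight to mitigation when a traffic spike arrives outside staffed hours. The baseline, thresholds and business hours are hypothetical placeholders, not a recommendation for any specific product.

```python
# Illustrative sketch only: a toy check that flags request-rate spikes occurring outside
# staffed business hours, when responders are scarcer. Thresholds, hours and the
# mitigation decision are hypothetical placeholders.
from datetime import datetime

BUSINESS_HOURS = range(9, 18)          # 09:00-17:59, Monday-Friday assumed staffed
BASELINE_RPS = 200                     # typical requests per second for this site
OFF_HOURS_MULTIPLIER = 5               # spike factor that should trigger automation

def should_trigger_mitigation(current_rps: float, now: datetime) -> bool:
    """Return True when traffic looks like an off-hours volumetric spike."""
    staffed = now.weekday() < 5 and now.hour in BUSINESS_HOURS
    spike = current_rps > BASELINE_RPS * OFF_HOURS_MULTIPLIER
    # Off-hours spikes go straight to automated mitigation; staffed-hours spikes
    # could instead page the on-duty security team first.
    return spike and not staffed

if __name__ == "__main__":
    saturday_evening = datetime(2020, 3, 21, 19, 30)
    print(should_trigger_mitigation(current_rps=1500, now=saturday_evening))  # True
```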
Patterns of Life in DDoS Attacks

Despite the fact that a DDoS attack can be performed by systems around the world, patterns have emerged in the timing of DDoS attacks. In general, a DDoS attack is most likely to occur on a Saturday, and, if performed on a weekday, is likely to fall between the hours of 4 and 8 PM. These patterns of life in DDoS attacks suggest quite a bit about the likely attackers. For one, those ranges line up rather well with when gamers are likely to be playing, after school and work hours. Since many DDoS attacks are targeted against game servers to impact gamer rankings, this timing is unsurprising. However, for the DDoS attacks not designed to ensure that the attacker keeps the top spot in the ranking list, the timing has worrisome implications for businesses. Outside of business hours, a company is likely to be operating with greatly reduced security staff, making it more difficult for them to respond quickly to an attack. On the other hand, the hours that these DDoS attacks are being performed are also the same hours that customers are likely to be browsing ecommerce sites. A strategically timed DDoS attack that brings down a company's site during peak sales hours can have a significant impact upon revenue.

Protecting Against DDoS Attacks

As the threat of DDoS attacks grows, organisations need to develop and deploy a DDoS protection plan that can mitigate these attacks when and where they occur. The patterns of life of DDoS attacks indicate that they are unlikely to occur when an organisation is best prepared to defend against them. An attack during off hours, whether during weekday evenings or at the weekend, could catch an organisation off guard and take longer to remediate. This is a significant concern since a DDoS attack can have a number of impacts upon a company. Beyond the possible loss of revenue due to a loss of customer access to the organisation's web presence, DDoS attacks are also commonly used as a smokescreen to hide other, subtler types of attacks. An organisation's security team that is flustered and caught off guard by an off-hours DDoS attack may miss the data breach that the attack is designed to conceal. For this reason, deploying a DDoS mitigation solution capable of rapidly and automatically responding to a DDoS attack is essential to an organisation's security and bottom line. When DDoS attacks catch an organisation wrong-footed, it can't afford to wait until the security staff make it to the office and are able to respond.

### Purifying data for better business

The basic building block of all products, services and customer interactions is, of course, data. And just as water is crucial to life on Earth, fuelling ecosystems across the planet, data is the core component that underpins every business strategy. It provides the facts, insight and intelligence needed to propel activity and growth across every functional discipline. But our water supply faces constant threats that affect its quality, purity and ability to sustain life. One only has to look at places such as Flint, Michigan in the US or the Yellow River in China, where the water is so polluted as to be undrinkable, to see the impact this can have. The same can be said for the purity of an organisation's data. Among other things, poor data quality can leave businesses facing dissatisfied customers, regulatory disclosure risk, and an inability to accurately forecast earnings.
It's vital, therefore, to know that this data is of pristine quality. It may look crystal clear on the surface, but it's what's underneath that truly matters. Only by using the highest quality data can a business deliver a satisfactory customer experience, reduce the risk of compliance failures, and optimise its operational efficiencies. In many cases, this will require organisations to purify the data they use.

Filtering out impurities

The purification process should begin with businesses outlining a series of structured steps they can follow to ensure their data is of the highest possible standard. More than anything, formulating a solid data governance and quality strategy has the benefit of offering greater transparency around an organisation's data: where it comes from, how much of it there is, whether it's all required, and who should have access to it. Knowing this means that when it comes to implementing the plan, organisations can immediately identify opportunities to cut costs and improve efficiencies. Prior to embarking on the plan, research should be undertaken on the tools and technologies that will make the process easier for all involved. After all, with a wide range of analytics and machine learning solutions available to carry out the heavy lifting, manual intervention should only be required for exception handling when it comes to improving data quality. It's then just a case of finding the tools that fit an organisation's requirements and meet its budget. An audit should then be carried out to assess the relevancy of any data currently held by the organisation. The type of data used to help boost sales, for example, will differ greatly from that needed to improve customer service or streamline the supply chain. Likewise, it's important to take stock of those suppliers and customers that are still relevant to a business, and those which can be left behind. The aim of such an audit is to enable organisations to focus on purifying the data they actually need. As a filtration process removes impurities to leave only clean water, so the removal of irrelevant data will leave only that which matters to a business. It's important to avoid cherry-picking data, however. It's an easy and common mistake for organisations to make – we have a natural tendency to look for the results we want and to ignore less favourable data – but doing so leads to biased or narrow-minded conclusions. Objectivity is essential. All relevant data – both good and bad – must, therefore, be included in an audit's findings if businesses are to ensure their data cleansing plan is comprehensive.

Mindsets and mindfulness

As organisations everywhere undergo digital transformation, the volume of data they create is only set to increase. Both operational employees and management sometimes underestimate the value of data to a business, though, so some education is required on the importance of data quality as a driver of organisational success. Maintaining a high level of quality – of data purity – will, therefore, require a company-wide change of mindset. One way of achieving this is to assign data advocates to each business unit, keeping their colleagues abreast of the latest developments in – and benefits of – their company's data purification process, and the ongoing need to ensure that data is kept relevant and up to date. That said, it's important not to set the bar too high. It's possible to over-purify the data supply and thereby remove some of its value.
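To make the audit step described above a little more concrete, here is a minimal sketch of the kind of automated checks it might run over a customer dataset. The column names, staleness threshold and use of pandas are assumptions made purely for illustration.

```python
# Illustrative sketch only: a few automated data-quality checks of the kind a purification
# audit might run before heavier cleansing. Column names and thresholds are hypothetical;
# pandas is assumed to be available.
import pandas as pd

def audit(df: pd.DataFrame) -> dict:
    """Return simple 'impurity' measures for a customer dataset."""
    return {
        "rows": len(df),
        "duplicate_rows": int(df.duplicated().sum()),
        "missing_values_per_column": df.isna().sum().to_dict(),
        "stale_customers": int((pd.Timestamp.now() - pd.to_datetime(df["last_order_date"]))
                               .dt.days.gt(730).sum()),  # no orders in 2+ years
    }

if __name__ == "__main__":
    customers = pd.DataFrame({
        "customer_id": [1, 2, 2, 4],
        "email": ["a@example.com", None, None, "d@example.com"],
        "last_order_date": ["2019-11-02", "2015-06-30", "2015-06-30", "2020-01-15"],
    })
    print(audit(customers))
```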
Organisations should be mindful of data protection regulations such as GDPR, too, to ensure they remain compliant without hindering the flow of data between internal systems, and external customers and suppliers. An awareness of established best practices, industry standards, and similar initiatives undertaken by comparable companies will help avoid this. Data is essential to the life of every business today; it improves efficiencies and helps organisations maintain their competitive edge. But like our water supply, it must be kept clean of any unnecessary pollutants and allowed to flow freely. By taking a structured, company-wide approach to purifying data, businesses will feel refreshed, reinvigorated and ready for the rewards that relevant and timely data offers.

### How IoT Is Enhancing Customer Experience

Customer experience is overtaking price and product quality as the key brand differentiator and driver of customer retention today. Retailers and organisations are keeping up with the demand for better consumer satisfaction with key technologies. One vital technology that organisations are deploying today is the Internet of Things (IoT) – sensors and devices connected across a digital network. Smart User Experience, the paradigm which integrates IoT technologies into customer experience, has taken centre stage in delivering growth for organisations in the past few years, and experts predict that this year will be a crucial point for its growth.

IoT explosion

With fifth-generation connectivity (5G) rolling out in numerous areas this year, industries are scrambling to leverage IoT applications to improve the customer experience. The number of connected devices – predicted to reach 24 billion by the end of the year – creates huge opportunities for organisations to deliver an improved experience to their patrons. A huge part of this will come from the amount of data generated by these devices. Research firm IDC estimates that around 29.4 trillion gigabytes of data will be generated by at least 41.6 billion connected IoT devices in the next five years. This data can enhance personalisation and transparency, and create insights into customer behaviours and needs. While the outcomes of increased productivity and the decreased cost of deploying IoT in industries are undeniable, how is it changing customer experience overall? Here we will discuss some great real-world examples that you can use in your own organisation.

Faster and more customised services

In retail, 70% of surveyed decision-makers say they are ready to adopt the changes enabled by IoT, according to a study conducted by Zebra Technologies. Supermarkets like Asda use RFID sensors and IoT devices to drive sales and build customer loyalty. IoT-connected devices like beacons, connected cameras, smart shelves and RFID devices provide access to huge amounts of customer data. This data in turn provides insights for more customised customer engagement. The data is also used to judge the effectiveness of store layouts, in-store promotions, and product interactions. These are analysed in real time to capture customer preferences and paths to purchase. In turn, this identifies the best experience for customers. Intelligent staff scheduling based on insights generated from IoT sensors also helps eliminate long queues – improving the customer journey in-store.
Services today are also increasingly at the fingertips of customers, who now expect their choices to be fulfilled after a few clicks. However, online groceries are struggling as brick-and-mortar stores implement smarter solutions. Essential Retail highlights how Tesco launched a digital channel on the IFTTT platform to help consumers automate their shopping. This is a clear take on Amazon's Dash IoT button, launched years ago, which helps customers track specific items.

Real-time visibility

Customers today expect delivery to be convenient, fast, and flexible. Especially with the rise of direct-to-consumer marketing led by Amazon, consumers want real-time parcel tracking as an added service. In fulfilment services, DHL utilises IoT sensors in its warehouses and fleets to send continuous updates to its customers. This lets customers know the status of their parcel and any issues it might encounter. In fleet management, Verizon Connect uses vehicle tracking technology to provide real-time insights into the fastest routes and to locate the field workers closest to the requesting parties. This decreases wait time for customers and provides them with accurate delivery estimates.

Better personalisation

Customers today also expect their IoT devices to adjust and improve services based on the gathered data. For IoT devices in the home, iRobot customers receive custom notifications when there’s something wrong with the device. When repair technicians are called, they can easily identify the problem through the vacuum’s sensors. Companies like Hive go a step further. The home IoT device manufacturer recently partnered with Salesforce to analyse its customer data to anticipate device failures and notify users immediately. While the conditions are different for every home – be it weather, location, or usage patterns – Hive aims to create a baseline to better predict the needs of its customers from purchase to use.

Reimagined products and services

While the IoT revolution builds upon the efficiencies of past technologies, it also offers the opportunity to create innovative products and services. Last year saw the launch of the fully functional no-checkout retail store, Amazon Go. Currently the only fully automated store, it enables customers to interact with products inside the store and use their mobile wallets to pay for the items. This creates a seamless experience for customers. Startups like Mobike and other product-as-a-service platforms today are created in response to customers wanting to pay per use rather than lease. With more devices and data streams generated every day, IoT analytics will be a central technology in continuously creating a better customer experience in the future.

### Smart home security: Why cloud isn’t always the answer

With 25 billion connected things predicted to be in use globally before 2021 (according to Gartner), the issue of securing the data that comes from these devices is an important one, especially as organisations increasingly migrate user data to the cloud. This trend is only set to increase as internet-connected devices and services become an integral facet of our professional and personal lives. Since the advent of connected devices, the traditional approach of storing data in the device itself has become almost obsolete.
Developers have become aware of the greater convenience and efficiency that cloud storage offers, especially for smart home devices such as cameras. From easier installation (for instance, users can link multiple cameras to one network, rather than assigning individual IP addresses to each camera) to instant data access, cloud storage for the smart home presents significant advantages – but potentially at the cost of data security.

Understanding IoT and cloud data storage

With manufacturers not wanting to take responsibility for security on the device, many have looked to the cloud to store data. However, transferring data to the cloud can put users in a precarious position, particularly because their personal data is now outside of their direct control. In light of the increasing number of data leaks and device hacks over the past few years, centralising data in the cloud could bring down the whole IoT ecosystem. Helping management understand the root causes of these issues is key to helping the organisation overcome such security issues. Given that management executives are primarily focused on designing and manufacturing the device itself – as opposed to devising other features such as access control – they occasionally lack an understanding of how to manage user data correctly. Though integrating security controls into the design of the device itself would be ideal, this often incurs additional costs – in both money and time – for the manufacturer.

Back to square one: the microprocessor

Understanding the complexities of IoT environments and products is the first step to solving security issues for connected devices. Manufacturers must realise that internet-based products are more complicated than things such as email services and social networks. IoT devices, for instance, contain microprocessors that can be used to process user data whilst simultaneously hosting it on the NAND. The microprocessor can also be used to encrypt sensitive user data from within the device itself, allowing that information not only to be carried securely across the internet, but also to be transferred directly to the user without passing through externally controlled cloud servers. Such features can prevent multiple devices from being the target of mass cyberattacks, and from being hacked at the same time. Additionally, with encryption serving as a non-recurring operation and IoT processors being idle 60% to 70% of the time, IoT devices can allocate their spare capacity towards protecting user data. The device’s integrated NAND storage can be used to host various security components, including device settings and encryption keys. Moving away from cloud storage and hosting data locally can also help manufacturers reduce their technology and operational costs.

Data security for the future smart home

Ensuring that a cloud device is secure is a monumentally bigger – and more expensive – task than securing a device with local data storage. For instance, an organisation could employ a security team to ensure that the cloud connection is always secure; a 24/7 job. Alternatively, an organisation could pay a team of white hat hackers once to test an IoT device, resulting in a one-off payment and a secure device. Both options aim to secure the customer’s data, but maintaining a secure cloud infrastructure is expensive and requires constant monitoring, potentially wasting an organisation’s resources.
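To make the on-device encryption idea concrete, here is a minimal sketch of how a device might encrypt a camera frame before it ever leaves the local network, using the widely available Python cryptography package. It is illustrative only: key provisioning, storage of the key in the device’s protected NAND and the transport step are all simplified away.

```python
# Illustrative only: encrypt sensor/camera data on the device itself so that
# cloud or relay servers never see plaintext. Key handling is simplified; on
# real hardware the key would be provisioned per device and held in protected
# storage (e.g. on the device's NAND), not in a variable.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

def encrypt_frame(key: bytes, frame: bytes, device_id: str) -> bytes:
    """AES-GCM encrypt a data frame; the nonce is prepended to the ciphertext."""
    nonce = os.urandom(12)                      # 96-bit nonce, unique per frame
    ciphertext = AESGCM(key).encrypt(nonce, frame, device_id.encode())
    return nonce + ciphertext

def decrypt_frame(key: bytes, blob: bytes, device_id: str) -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return AESGCM(key).decrypt(nonce, ciphertext, device_id.encode())

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)   # in practice: provisioned securely per device
    blob = encrypt_frame(key, b"jpeg bytes of a camera frame", "camera-01")
    assert decrypt_frame(key, blob, "camera-01") == b"jpeg bytes of a camera frame"
```

Because the encryption runs on the device’s own processor during its idle cycles, plaintext never needs to sit on an externally controlled server; only a holder of the key can read the frames.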
Forward-thinking organisations can even integrate neural networks with video data from within a local IoT device based on low-cost ARM processors. Naturally, using a cloud-based infrastructure may be unavoidable in some cases, and its advantages may mean that it is the best option available for manufacturers. However, rather than automatically opting for cloud services as the default option, manufacturers should consider other aspects, particularly around security. Users’ needs should come first, especially in today’s age of data protection, where it is vital to give them direct control over their own data. For devices such as smart home cameras – which quite literally give glimpses into customers’ everyday lives – data confidentiality is key, and manufacturers should take steps to ensure the right encryption measures are in place, to prevent unauthorised parties from accessing such sensitive information. End user protection is far too important for a blanket cloud approach.

### What does real AI really mean?

Are we sometimes talking about artificial intelligence when another term would be more appropriate? Does it matter? Guy Matthews skirts a terminology minefield. Forbes describes artificial intelligence (AI) as the most important technology of the 21st century. Yet it has a problem. In a rush to associate itself with such game-changing, mould-breaking potential, the tech sector is all too ready to stick an AI label on any solution that offers some degree or other of basic automation. The water, some argue, is thus muddied and the very reputation of AI compromised at a critical time in its evolution. One problem is that there is no centralised definition of AI that everyone agrees on. By rough consensus, a system is only truly artificially intelligent if it is able to become smarter over time with no human input required beyond its creation. True AI grows more capable as a result of its own experience. It thinks for itself, in some fashion, as a human would do, or ideally far more profoundly and reliably than a human would do by removing emotion from the decision process. A system that uses algorithms to replicate or mimic a human action can be said to be robotic, but not necessarily intelligent. AI should also be distinguished from augmented analytics or predictive analytics. A system based on this kind of capability can be used to make decisions, perhaps quite important decisions, but it is not getting more intelligent over time. It is not learning and adapting. At best it is making assumptions and predictions based on the historic data it holds, able to call on that information to turn out ‘what if’ scenarios. It does not have a nuanced and human-like ability to see a bigger picture emerging, or to spot the unexpected in a sea of data and react to it. It will spot what it has been programmed to spot. This perhaps is why it matters when people arbitrarily toss different ideas in a bucket and call the whole thing AI. It might suit a marketing agenda, or play well with shareholders, but ultimately confusion, misinformation and ignorance are the result. “There are a lot of people who are artificially intelligent about artificial intelligence,” quips Nick McMenemy, CEO with Renewtrak, a developer of automated licensing renewal solutions based around machine learning (ML). “We’ve got a lot of armchair experts who don’t really understand what it is or what it does.
Many people mistake AI for what in reality is a derivative of machine learning. But AI is infinitely more complex and involved.” McMenemy is happy to call out those who cynically name-check AI because it seems like flavour of the month and resonates with the investment community: “AI has become something people want to talk about, whether it is justified or not,” he says. “I don’t want to denigrate ML, because that’s on the pathway to the nirvana of full AI. But let’s not create confusion.” True AI, if there is such a thing as true AI, is hard to define because intelligence itself is hard to quantify, argues Kumar Mehta, co-founder of Versa Networks, a vendor of advanced software-defined networking solutions: “What we should be asking instead is for domain specific intelligence,” he says. “An example of true domain specific AI would be where a network is managed and operated without any user intervention – so-called predictive networking.” Mehta suggests that rather than get hung up on terminology that end users ultimately don’t care about, the IT industry should be following the example of the autonomous car sector, which talks about ‘degrees of intelligence and automation’ rather than overlaying complex jargon onto developments. Mehta remains upbeat about AI’s potential, despite issues over what it may or may not be: “We are on track for a huge transformation that will make our life easier and better,” he believes. “However, we need to be realists regarding its progress. It will be incremental and based on what today’s algorithms and the speed of compute and networking technology will allow.” Rik Turner, principal analyst with independent consulting firm Ovum, believes it may be helpful to view AI as no more than an umbrella term for a range of technologies and approaches. “These could be deemed to include machine learning, deep learning, which is essentially ML on neural network-based steroids, and natural language processing,” he says. “These are some of the branches of AI, though by no means all of them.” Others believe that AI needs to be contained in a tighter definition than that. In essence it is the theory of how computers and machines can be used to perform human-like tasks most often associated with the cognitive abilities of the human brain, says Scot Schultz, senior director for HPC, artificial intelligence and technical computing with Mellanox, a developer of networking products. “Machine learning is the study of algorithms that build a mathematical model that a computer will use. Machine learning can be applied in a huge number of applications, including data analytics. Data analytics, however, is fundamentally applied to real-time and historic data to find otherwise unseen patterns and trends and solve for a future situation in industry and business. Robotic Process Automation, on the other hand, automates the operation of, or inputs to, software applications or digital control systems, and it can use ML to optimise those automated inputs.” If the man or woman on the street finds it hard to separate what constitutes real AI from dolled-up analytics, then they couldn’t really be blamed. After all, the tech sector itself is at sixes and sevens over the matter, according to Zack Zilakakis, marketing leader at intent-based networking specialist Apstra: “AI is still ambiguous, and often misunderstood by traditional technology practitioners,” he claims. “The term ‘artificial intelligence’ is hard to define and there is no agreement about it.
There is no real agreement among AI researchers and specialists concerning its definition. By contrast, he defines machine learning as the ability to continuously collect and convert data into knowledge, necessary to make a decision, either digital or from a human. “Machine learning will ultimately eliminate the need for data science,” he predicts. “Many AI or ML projects fail because organisations hire data scientists as machine learning experts. The projects require a focus on academia and, more specifically, mathematics as a baseline.” He sees the correct handling of data as no trivial matter: “Organisations will ‘drown’ in data lakes after a year or two if they do not act on data analytics,” he warns. “These data lakes grow by the minute and are a nightmare to manage. It is imperative not only to collect the data but also convert it into knowledge to make a decision.” On the matter of sorting true AI use cases from false, he agrees that it is more than a minor matter of semantics and very much an issue of reputation: “AI's image problem historically has been to over-promise and under-deliver,” he says. “Customers care about practical solutions to their problems, not definitions. Historically, there is always a pull-back from the over-promising when there is under-delivering.” Zilakakis is sure that the delivery will happen, despite occasional periods of doubt and uncertainty. He sees AI as likely to evolve in multiple waves rather than in conveniently linear fashion: “The initial AI wave will address numbers and repetitive tasks,” he forecasts. “These are low complexity and can be augmented or replaced. This includes some of the work of people like accountants and equity traders. The next wave will focus on automating the middle person from the process. This will include platforms to interpret and provide medical diagnosis, among others. Traditional blue-collar and creative jobs require hardware and will take more time to develop. What will not change is the need for creative and empathetic skills. These are artists, nurses, and architectural development, among others.” Once we are past the need to define what is and isn’t true AI, we can start to embrace some other substantive issues, he believes: “AI will introduce its own set of issues to include an ethical element,” says Zilakakis. “Do algorithms discriminate? Humans are limited in our ability to remember, and that’s our faulty part of the brain, but computers are void of this limitation.” Mellanox’s Schultz agrees that it is early in the AI revolution, but points out that applications under the overall AI umbrella have already been transformative in selected areas of science and industry, healthcare and medicine in particular: “It is also becoming a huge deal in language processing and conversational AI,” he observes. “Going forward, AI-based solutions will become transformative in more and more areas including security, manufacturing, e-commerce, agriculture, transportation, research, marketing and politics.” Perhaps the current quandary over what is and is not true AI is a mere bump in the road on a longer journey.
Try as they might to hijack its good name, hyperscalers and marketeers can only do so much to hold back an idea whose time has nearly come.

### Managing your cloud environment

Now that you’ve reached the end of your new connectivity route and started to deploy cloud generation technology, you will want to manage this environment. Management of the cloud is arguably more important than on-premise management. The reason is that you – the customer – now get to utilise an architecture that is dispersed, covers multiple geographies, has ‘soft edges’, is difficult to contain, runs on multiple cloud platforms, and is all supplied by different vendors. In our previous article, we discussed the management considerations for connectivity and product choice/solutions when moving from an on-premise world into the cloud. So, if a customer is now enjoying all the ‘new’ aspects that the cloud delivers, why do they manage this brand-new environment with an on-premise mentality? A fairly simple answer to this is that early management options in the cloud arrived in the form of familiar on-premise solutions. SIEM tools are an integral part of the data centre management posture, and so it seemed like a great idea to copy that posture to the cloud. However, to get the best out of the architectures that cloud providers offer, they need to evolve. Why, I hear you ask? The fundamental answer is that the architecture of a cloud configuration is very different to the ones we are used to seeing in data centres around the world. Therefore, the management of the cloud needs to follow suit. On-premise solutions are contained, whereas cloud is not. On-premise architectures don’t need to consider internal bandwidth charges or worry about placing agents in every corner of the data centre. Both of these are, however, important considerations in the cloud. On-premise management tools have the luxury of sitting outside the fences, calling in, and demanding every last scrap of data (in the form of large log files from all the players) in an attempt to collate this data and inform the users what they should do should any incident occur. Cloud architecture is different. Control, management, and data planes have been deliberately separated to enable a performance not possible in an on-premise alternative. The separation of planes is an area that the customer needs to appreciate regarding security. Remember the shared security model? Cloud vendors look after parts of the data and control plane, but not the data going through them. In addition, you are responsible for the management plane, and having visibility into all of these is key. In terms of management considerations, you don’t want to generate large log files and then wait for them to be passed to a tool that requires professional analysis. This is not the best use case, and the cloud offers much more flexibility in this field. Instead, the cloud demands a cloud generation solution for management. Size doesn’t matter in the cloud: in fact, smaller is better. A new approach in this area is to utilise tools that have ‘insiders’ talking to endpoints via API calls, gathering all the data available, and sifting through it closer to the source. Once complete, these tools can pass small chunks of pertinent data through to tools that don’t require professional intervention to decipher the information, while offering the ability to automate remediation.
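As a rough, vendor-neutral sketch of that pattern – query endpoints over their APIs, sift the results close to the source, and forward only the pertinent findings – consider the following. The endpoint URLs, the finding format and the remediate() hook are all hypothetical placeholders, not any particular tool’s API.

```python
# Sketch of the 'insider' pattern described above: poll cloud endpoints over
# their APIs, filter near the source, and forward only small, pertinent chunks.
# The URLs, resource fields and remediate() hook are hypothetical.
import requests

RESOURCES_API = "https://cloud-provider.example/api/v1/resources"   # placeholder
SIEM_API = "https://siem.example/api/events"                         # placeholder

def gather_findings(session: requests.Session) -> list[dict]:
    resources = session.get(RESOURCES_API, timeout=10).json()
    findings = []
    for res in resources:
        # Keep only what matters, e.g. object storage left open to the world.
        if res.get("type") == "object-store" and res.get("public_read"):
            findings.append({"id": res["id"], "issue": "publicly readable storage"})
    return findings

def remediate(finding: dict) -> None:
    # Placeholder for an automated fix, e.g. flipping the store back to private.
    print(f"auto-remediating {finding['id']}: {finding['issue']}")

def report_and_remediate(session: requests.Session) -> None:
    for finding in gather_findings(session):
        session.post(SIEM_API, json=finding, timeout=10)   # small, pertinent chunk
        remediate(finding)

if __name__ == "__main__":
    with requests.Session() as s:
        report_and_remediate(s)
```

The point is not the specific checks but the shape: the heavy sifting happens next to the data, and only compact, actionable findings travel onwards.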
Some tools are now able to offer automatic remediation of the issues they find. This is a major step forward in the ability to monitor and manage your cloud infrastructure. Some tools can also integrate their reporting with existing SIEM tools to help with reporting and alerting. If you are an SMB, you may not have the resources to keep this type of skill set on the payroll, and so an extra step in the journey is required.

The journey from on-premise

The journey from an on-premise environment to a cloud-based architecture may be daunting for some, but – if the latest tools are researched beforehand – it can be a smooth process. Security, as always, is key. This means security built in from the start and not treated as a ‘bolt-on’ later. If you utilise the latest cloud management tools to make sure you stay in check, you can be sure to keep a solid eye on your journey and keep everything in shape once you’ve arrived. Things happen quicker in the cloud and, in the world of security, that has both positives and negatives. The positives are clear. Threats are now on a zero-day schedule, and you need to be prepared for the next threat and to deploy the next layer of defence. Failure to plan for this would make it highly likely that your organisation would suffer significant outages, financial pain, or worse: loss of data. So, a quick reaction is key, and being on the inside will give you the advantage. The negatives are not so clear. How do you make sure your new defences are deployed correctly? How do you get them to work with your existing defences? In addition, how do you train your workforce to report back on the latest threats, should they hit your organisation? Because of the speed of change these days, you rarely witness a brand-new product replacing the previous version. What you tend to find is constant improvement of the existing foundation by layering extra features and protection onto the base solution. This layered approach can be seen in email security products. Entry-level protection is common, familiar, and similar across vendors supplying such services. However, as threats developed from spam and phishing to ransomware and spear phishing, additional layers of protection were added to the original stack. It’s unnecessary to create a new master solution that addresses all threats. The existing layers serve a purpose, and additional ones are added to address new and more complex threats. Cloud management tools are now evolving to offer extra strings to their bows. Having visibility into the whole deployment and possessing the ability to report from all endpoints without disrupting the latest cloud architectures is key to good management.

### Remaining compliant whilst embracing social media

Since the global financial crisis, ‘regulated businesses’ in the financial services sector have been significantly impacted by two key disruptors. First, the 2008 crash caused seismic changes in financial regulation. To remedy the mistakes that led to the crash, a number of new national and transnational regulatory frameworks were created to force firms to be more transparent and better protect the data they hold. Operating legitimately now requires engaging in a host of time-consuming and resource-intensive procedures. Second, we have seen the significant disruptive influence of technology, in particular the rapid rise of a whole new set of FinTechs.
In an industry-changing shift, the likes of Starling Bank, Revolut and Monzo have lured customers away from the traditional ‘Big Four’ lenders by offering digital innovations that have changed the way many of us pay for goods, organise our finances, and interact with banking services. But beyond this, technology still of course offers financial firms a vast array of opportunities. The challenge is that these innovations, while advantageous in many ways, can act as potential obstacles when it comes to complying with strict industry regulations. Specifically, I’m talking about social media.

Regulated firms’ issues with social media

Social media has made compliance particularly difficult for regulated firms over the last ten years. Of course, the irony is that given it is a fundamental modern method of communication, financial services firms are often expected to maintain a social media presence as part of their business strategy. Indeed, social media is now an integral part of running any modern business. However, building a regulated business’ reputation on social media does carry weighty implications when it comes to complying with financial regulations, so utilising social media properly requires careful consideration. This can tempt regulated businesses to consider avoiding social media altogether. Yet, leaving aside that such a decision would severely limit a company’s ability to communicate, abstaining from social media completely does not guarantee that a regulated business would still be compliant with the necessary regulations. This is because companies do not have the ability to simply choose not to engage with social media: it is ubiquitous. For example, regardless of whether a company is active on social media or not, an employee’s actions on their own social media channels might well be subject to regulatory oversight. And, no matter how hard financial services firms might try, it is almost impossible to guarantee that all employees’ daily activity remains compliant with existing regulations. But rather than ignoring it, regulated firms should harness the considerable benefits of having a strong social media presence. Indeed, social media has handed modern businesses a powerful tool when it comes to cultivating a public image and relating to consumers directly. And for financial services firms in particular, maintaining a good reputation through social media is crucial to restoring consumer faith in the whole financial system. So, the trick is not to reject it, but to know how to use it.

RegTech is key for social media compliance

Fortunately for regulated firms, a new breed of FinTech companies – RegTechs – has created software to ensure that clients continue to comply with regulations whilst exploiting the business opportunities offered by social media. Whereas some FinTech firms, such as neobanks, have set out to displace well-established financial services companies and disrupt the industry, many of today’s RegTechs were created with the intention of being part of an important financial evolution, not a revolutionary insurgence. Indeed, MirrorWeb – having recently listed in the US’s FINRA Compliance Vendor Directory – has for some time been working with many UK and European regulated firms, providing them with a platform on which to archive the activity which takes place on their social media accounts and company websites.
Already, we service clients like Liontrust Asset Management, Tesco Bank and Zurich Insurance, helping them to comply with current regulatory frameworks like MiFID II. The purpose of the MirrorWeb platform is to improve a company’s digital conduct with retrievable records that not only show fair treatment of customers, but also deliver confidence that they’re not going to fall foul of regulatory standards. In other words, MirrorWeb is a good example of technology providing a solution to a regulatory challenge caused by technology. So what does the future hold? Stephen Covey, the author of The 7 Habits of Highly Effective People, is credited with saying that “if there’s one thing that’s certain in business, it’s uncertainty”. I agree. Regulated businesses cannot be certain how financial regulation will evolve in the future, but they should be able to evidence what was communicated and when – ensuring their digital truth isn’t lost. The innovations in RegTech now make both that and complying with other regulations much easier, by automating processes that would otherwise be very labour- and resource-intensive. All this, crucially, whilst not sacrificing the agility and ability to connect with more people that social media allows. This was surely the point of embracing technology in the first place.

### Setting up a Public Wi-Fi Network – The Essential Dos and Don'ts

Public Wi-Fi networks are a fantastic resource for both consumers and businesses. They are a perfect way to draw more people into the business premises, and once they’re through the door they’re more likely to stick around if they have a good connection. However, setting up a public Wi-Fi network for your own business requires more than just switching on a router. Here, Amvia founder Nathan Hill-Haimes guides you through the process of creating a secure, stable and beneficial Wi-Fi network for your business.

DO ensure your network can manage the traffic

If you plan to simply hand out your password to anyone that enters your premises, then you’re sure to run into problems. In order for customers to make proper use of your network, you should ensure that your business broadband package is able to handle the amount of potential traffic that it will encounter every day. Even what sounds like a large data limit may be used up quickly if there are lots of people logging on to your network freely – it’s therefore best to choose a package that provides you with an unlimited amount of data. For businesses with smaller premises, a fibre optic connection should be enough. It’s recommended that you choose a package that offers very fast internet speeds, as a slow connection will leave customers frustrated and unlikely to use your service again in the future. For businesses with bigger premises, a faster leased line or Ethernet connection may be the best option, as there will likely be a high volume of people trying to access the internet at the same time.

DON’T put your customers and your team on the same network

Creating separate Wi-Fi networks for your team and your customers helps to ensure a higher level of security for both parties. It’s likely that each will have different demands, but by creating separate networks you can allocate bandwidth as required; your team account shouldn’t need too much bandwidth, while the guest account may have much higher demands.
Allocating more bandwidth to your guest network means your visitors get a stronger, faster and smoother internet connection. Setting up a password on your guest network allows you to limit access to users that actually enter the premises, and not other users in the nearby area.

DO create a hotspot gateway

Hotspot gateways are a way of restricting Wi-Fi access by requiring your guests to access the network via a virtual portal – this means a more secure Wi-Fi connection for both your business and your guests. Hotspot gateways add an additional barrier to your network, making it more difficult for cybercriminals to get through, while also allowing you to put additional firewalls in place across your network. In order to access your network, guests have to accept terms and conditions – in turn, this provides you with some legal protection whilst they are on your network. One of the simplest ways to install a hotspot gateway is by purchasing dedicated hardware; this costs anywhere between £50 and £1,000, and the price tag will depend on how much traffic you expect to have on your network at any one time. It may seem like a significant expense, but it will radically simplify the whole gateway process for you.

DON’T go easy on network security

Adding gateways and separating your networks are essential steps for making sure your business and data stay safe, but this alone won’t be enough to ensure you are fully secure. With an open-access Wi-Fi network, you must make doubly sure that the right security measures have been put in place to protect your business from cybercriminals. WPA2 is the standard security protection for all modern routers and offers the most up-to-date Wi-Fi security – just make sure you have it switched on! It’s also good practice to take additional precautionary steps, including regularly changing your passwords, keeping the router locked away in a private location so that it can’t be tampered with, introducing additional firewalls, ensuring that the Wi-Fi signal covers just your premises and does not leak into other locations, and turning off WPS. Hotspot gateways offer their own levels of protection, but each one is slightly different to the next, so it’s worth following the supplier’s advice on this. You should also regularly check that your security software is up to date and running properly, otherwise you could be unwittingly leaving yourself open to the latest hacking techniques. Once you have taken the steps to get your network up to a high level of security, you can begin handing out the Wi-Fi password and drawing in a whole new customer base.

### The Growing Need for Security in Health Data

Who has access to your private information? It’s a question you can’t really answer with 100 per cent certainty. The idea that somebody out there could know information like your full name, your email or physical address, credit card info, or even your social security number, is terrifying. While it is possible to know if your information has been part of a major leak thanks to the website haveibeenpwned.com, it’s not a perfect record. There are bound to be data breaches that haven’t been reported or discovered, or that affect such a small number of people that they wouldn’t make the news. One of the most vulnerable areas of data is the healthcare industry, where breaches are currently on the rise, affecting 29 million patients that we’re aware of.
Many data breaches go undetected for years if the criminals clean up any evidence that they were there. Because the healthcare industry deals with vast amounts of sensitive information, from health conditions to social security numbers, it needs better security.

Better Training for Healthcare Professionals

Many data breaches and cyber attacks can be prevented if those involved follow proper cybersecurity habits. Far too many hacks and leaks happen because a staff member chose a weak password or accessed information from an unsecured computer. Everybody involved with healthcare data needs better training so as to prevent easy access to information.

What Makes a Good Password

The more nonsensical and complex a password is, the less likely a hacker is to get it. While the days of people having their password be “1234” or “PasswordsSuck” are hopefully behind us, most people still don’t have very strong passwords. Common bad habits include using a childhood nickname or a pet’s name, using only a single unique number or character (the number 1 is the most popular), using the same password you came up with when you were 14 for everything, and including any part of your name or birth year. Healthcare professionals need to create a fully unique and complex password for both the computer and the software used to access private data. That’s two different passwords, making both your computer and account doubly secure. While this is a pain at first, just spend 30 minutes logging in and out of the account, letting your fingers create muscle memory for the password. Then, instead of worrying about remembering the password, your fingers will do the work for you.

Avoiding Strange USBs and Email Attachments

Many healthcare records, especially for hospitals, are stored on a private server only accessible through the organisation’s own intranet. If a cybercriminal wants to get hold of that information, they’ll need to gain control of a computer connected to that intranet. Two very popular ways of doing this are loading malware onto a USB drive or into an email attachment and getting it onto a computer connected to the network. Proper training would include teaching administrators and professionals how to treat suspicious emails and USB drives. This should also include policies for healthcare workers on keeping secure USBs and how to request a new one instead of just digging around a desk drawer and using whatever they can find. It’s not just USBs they need to watch out for though, it’s also their phones. A hacker could infect a phone with malware that not only could lead to a data breach, but also cause the phone’s battery to drain quickly. Then, the healthcare worker notices their battery dying, and just like everybody else in the world, panics and tries to find a way to charge it back up. They plug the phone into the computer, and boom, the data is breached.

The Big Data Revolution is Making it Harder

As the big data trend continues to grow, the risk of data breaches rises with it. That means more information that more people can access, leading to more opportunities for criminals to get their hands on sensitive information. While big data can be extremely useful for things like medical mapping or assisting with the latest in medical technology, it needs to be done properly. Any time data is submitted for a big data project, the most sensitive information needs to be scrubbed from it. Names, exact addresses, social security numbers, and anything else that could be used to identify a person need to be redacted.
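As a rough illustration of what that scrubbing step might look like, the sketch below drops direct identifiers and masks anything that looks like a social security number in free-text fields. The field names are hypothetical, and this is not a complete HIPAA de-identification procedure, just the general idea.

```python
# Illustrative redaction pass before records are shared with a big data project.
# Field names are hypothetical; a real de-identification process follows the
# full HIPAA guidance rather than this simple drop-and-mask approach.
import re

DIRECT_IDENTIFIERS = {"name", "address", "ssn", "email", "phone"}
SSN_PATTERN = re.compile(r"\b\d{3}-\d{2}-\d{4}\b")

def scrub_record(record: dict) -> dict:
    cleaned = {}
    for field, value in record.items():
        if field in DIRECT_IDENTIFIERS:
            continue                                        # drop direct identifiers outright
        if isinstance(value, str):
            value = SSN_PATTERN.sub("[REDACTED]", value)    # catch identifiers hiding in free text
        cleaned[field] = value
    return cleaned

if __name__ == "__main__":
    patient = {"name": "Jane Doe", "ssn": "123-45-6789",
               "diagnosis": "asthma", "notes": "SSN on file: 123-45-6789"}
    print(scrub_record(patient))
    # {'diagnosis': 'asthma', 'notes': 'SSN on file: [REDACTED]'}
```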
Then, those handling the big data need to comply with HIPAA guidelines and follow the latest in data security. That means strong passwords, stronger firewalls, and quality security software. It’s everybody’s job in healthcare to protect patients’ information. From the secretary who schedules appointments to the busiest of doctors, anybody who has access to private data needs to be properly trained and make sure their devices are secured.

### They are what they eat: Feeding the chatbots

2018 was the year of the chatbot. Organisations of all sizes embraced artificial intelligence to support an increasing array of functions across customer services, sales and marketing, and began exploring new opportunities in cybersecurity and financial management systems. Back in 2011, Gartner predicted that organisations would be capturing 85% of their customer contact using artificial intelligence (AI) technology by the year 2020. We also expect to see chatbot technology being developed in the hands of the organisation, enabling tailoring to specific workplaces and integration across multiple bots that connect automated functions across the business. For example, in an online retail company, the chatbots that function in front line sales, customer service, purchasing, logistics and all the way through to delivery will seamlessly interact. Ideally, companies want the chatbot to answer as many questions and handle as many scenarios as accurately as possible from the word go. And therein lies the challenge, because AI is a learning technology, not a plug-and-play solution. Think of AI as an iceberg. The tip above the water is the chatbot front end, happily receiving customer questions and delivering information in response. But 90% of its workings are hidden away under the waterline, powered by the capabilities of cloud computing. AI programmes need to consume a lot of data to fill the iceberg that sits below. That data is processed and digested to find patterns, set up algorithms and make predictions to enable fast and accurate output. These patterns and predictions are only as good as the quantity and quality of data that the chatbot has been fed from the outset. And the better the chatbots connect with live applications in use across the business, the better their frame of reference. A chatbot can only be as good as the data it is fed. Cloud technologies have enabled organisations to amass enormous amounts of information. The twin challenges faced are, firstly, integration between the enterprise platform and the chatbot itself, and secondly, making connections between multiple enterprise applications to feed in all of the information the chatbot needs to draw on. Achieving a fully integrated enterprise input is key to supporting a highly effective and intelligent output. Integrating across multiple data sources that are not necessarily designed to be compatible can create a major headache for organisations, particularly those that are seeking to self-install an AI solution rather than outsource IT management. Besides, developers and companies alike want to be spending their money on polishing the face of the chatbot, not on what it is eating for breakfast. Also native to the cloud, integration platform as a service (iPaaS) can enable compatibility across the wealth of data that the enterprise has collected to feed and educate the chatbot on real-life information.
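As a toy illustration of the integration problem, the sketch below pulls records from two enterprise systems and normalises them into a single corpus a chatbot could draw on. The source URLs and field names are hypothetical; in practice an iPaaS handles the authentication, scheduling and schema mapping that this sketch glosses over.

```python
# Toy illustration: gather records from several enterprise systems and
# normalise them into one corpus to feed a chatbot. The endpoints and field
# names are hypothetical placeholders, not a real iPaaS API.
import requests

SOURCES = {
    "orders":  "https://erp.example/api/orders",
    "tickets": "https://helpdesk.example/api/tickets",
}

def build_corpus(session: requests.Session) -> list[dict]:
    corpus = []
    for source, url in SOURCES.items():
        for item in session.get(url, timeout=10).json():
            corpus.append({
                "source": source,
                "customer_id": item.get("customer_id"),
                "text": item.get("summary", ""),   # the normalised field the bot learns from
            })
    return corpus

if __name__ == "__main__":
    with requests.Session() as s:
        knowledge = build_corpus(s)
    print(f"{len(knowledge)} records ready to feed the chatbot")
```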
Whether the cloud is on premises or outsourced, iPaaS can connect the chatbot to any data sources within the enterprise system that are relevant to its function – or to other chatbots with which it is designed to work, side by side. Connecting data sets within the enterprise enables the chatbot to create patterns and predictions based on real information that has optimum relevance to the environment in which it will be expected to function. And with the right iPaaS, any organisation can feed its chatbot using integration built on simple and intuitive dashboards to provide secure, clean data, rather than requiring deeply technical coding capabilities. They are what they eat. So, when inviting a chatbot to join your organisation, don’t let it go hungry!

### Artificial Intelligence and Security

Progress has had an impact on everyone. The 21st century has become the era of big data and technology. On the one hand, growing artificial intelligence capabilities encourage us to trust the machines, but on the other hand, they cause new security problems. It has become really hard to remain anonymous online, since users can be easily monitored by security services, and their activity can be traced with no effort at all. Some network restrictions affect not only anonymity but also commercial activity, which could be significantly expanded if it weren’t tied to a specific IP. How is it possible to solve this problem? Websites such as cooltechzone.com can help you find out how to hide your IP on the web and access any content freely while feeling completely safe.

Artificial Intelligence: New Possibilities

Artificial intelligence has already had a great impact on our planet. Nowadays, it’s used in almost all areas of human life. Let’s try to figure out what benefits AI brings to humanity, and whether it can be dangerous for people. AI is a machine’s ability to learn, think, perform actions and adapt to the real world, expanding human capabilities, saving energy costs and performing dangerous tasks. According to experts, artificial intelligence has sufficient potential to radically change the life of the whole of society. Basic artificial intelligence systems make it possible for people to secure their homes. The newest systems include computer training that adapts to your habits, along with image and audio analysis that can track unfamiliar noises. Some systems use face recognition algorithms to monitor visitors coming to your house and also have HD cameras and audio sensors. Chatbots are also gaining in popularity; they are artificial intelligence products created to help the clients of different companies. In fact, such virtual assistants really help the business. Lots of people are afraid artificial intelligence will replace them in all industries. It’s too early to talk about that, but the fact that high technology helps automate different processes, from sending letters to booking flight tickets, is known throughout the world. The goal is not to replace people, but to make human labour more efficient. The use of smart machines in surgery and agriculture has already managed to establish itself in the market. The robotisation of other areas is only gaining momentum but, according to scientists, the robotics and artificial intelligence market will grow rapidly in the next decade. The artificial mind is great for a variety of mechanical activities. Artificial intelligence is also a new platform in the fight against cybercrime.
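One hedged illustration of AI-assisted threat detection is an unsupervised anomaly detector that learns what normal activity looks like and flags outliers for analysts, so that low-risk noise can be ignored. The features used below (requests per minute, bytes sent, failed logins) and the synthetic data are purely illustrative.

```python
# Illustrative anomaly detection on synthetic activity data. The feature choice
# (requests per minute, bytes sent, failed logins) is hypothetical; a real
# deployment would train on the organisation's own telemetry.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
normal = rng.normal(loc=[50, 2_000, 1], scale=[10, 400, 1], size=(500, 3))
attack = np.array([[900, 150_000, 40]])     # one obviously anomalous burst of activity

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)
labels = model.predict(np.vstack([normal[:5], attack]))   # 1 = normal, -1 = anomaly
print(labels)   # the final entry should be flagged as -1
```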
The use of intelligent technologies increases the level of threat detection, reducing response time and improving the techniques that distinguish real attempts to breach your company’s data centre from incidents that can be ignored due to the absence of risk.

Why AI is Dangerous

Computer software and applications for smart home systems have certain features that make them vulnerable. The situation is getting worse as more and more devices get connected to the global network, and the level of consumer awareness of how data is transferred, stored and processed is really low. In other words, people rely on the internet and gadgets but don’t understand what is happening with the information they send to the network. Here are some measures that reduce the risk of becoming a victim of data mining. Use anonymous networks to work on the web. Anonymous networks such as I2P, Freenet and Tor are available to all internet users. All of them support end-to-end encryption, which means the data transmitted through them is anonymous and can’t be intercepted. Use an open source OS. Everything that has been said about browsers applies to operating systems. Unlike Apple and Microsoft, which use multiple backdoors to collect user data, Linux is more secure. Advanced technologies such as artificial intelligence have brought lots of positive changes to people’s lives. Information collected and analysed with the help of modern programs helps in solving social problems. Nevertheless, AI has its cons: all the information gathered about you can be used against you. In today’s world, a VPN service is one of the most significant tools for safeguarding yourself online.

Why You Need a VPN

Each client uses a VPN to solve particular problems. Using this service, employees who work remotely are able to connect to the central office network. Banks can provide secure access to payment services over a VPN, while operators ensure a secure connection for their customers. A VPN lets you bypass website blocking with the help of an encrypted channel between the client and the operator. It’s a known fact that an assigned IP is not a free service, so VPN providers charge users who are willing to rent a particular IP address. In any case, a VPN service ensures data security and lets you bypass limitations set by both the provider and the government.

### Data ethics: Five things to care about

With artificial intelligence (AI) and machine learning (ML) driving business innovation and operational efficiencies, understanding data ethics is paramount to leveraging AI and ML for successful results.

The promise of AI & ML

AI promises to deliver considerable business benefit – IDC estimates that $52 billion will be spent on it annually by 2021. Companies around the globe are exploring ways in which they can use the right data to feed into their AI solutions to reduce costs, meet regulatory demands, deliver an enhanced customer experience, and innovate. Getting it right – the capturing, processing, managing and storing of data – is not as straightforward, however. The Cambridge Analytica scandal brought the issue of data ethics to the headlines, particularly in the context of social media platforms, but other concerns are being raised within the technology industry that are more subtle. For example, can an AI machine learn morality, and just which set of morals should it learn?
Morals differ dramatically from culture to culture, as a recent Massachusetts Institute of Technology (MIT) experiment showed. Others are asking whether the conscious and unconscious biases of those who assemble an AI solution will then be found baked into that solution. Think tanks are focusing on defining what the ethical treatment of data should look like. The Information Accountability Foundation recently published a paper that asks probing questions about risks and benefits around data ethics within organisations. The Center for Information Policy Leadership has published a report that also examines data ethics issues. Google recently issued its own AI principles, and we should expect other technology companies to follow suit with their own data ethics policies over the next year or two. Investors may not be asking for data ethics policies today, but they will be soon. The reputational risk from a data ethics failure can destroy considerable shareholder value overnight.

Embedding data ethics

Businesses need to avoid a ‘black box’ situation by becoming true data citizens. Understanding data – how it’s handled and managed – will ultimately help businesses maintain control and adhere to data ethics guidelines. While a shroud of uncertainty surrounding data ethics still remains, there are five areas of clarity that any business should ensure it understands and implements:

Take a proactive approach to data ethics – Often companies can find themselves focusing only on compliance with regulatory requirements. This can be a mistake. In the event of a data-focused problem, and without a data governance programme in place, businesses will be on the back foot. Unpreparedness often leads to poor business choices. By keeping a proactive mindset when it comes to data ethics, and thinking, “what could we need to solve next?”, businesses will remain more resilient to the potential data challenges that lie ahead.

Understand the data journey – When thinking about how to treat data ethically, think through the entire journey the data will make. This includes, for example, the data governance framework, data application, model building, model validation, and model assignment. It’s important to consider issues such as data quality, data transparency, and data privacy through all of the stages data may go through.

Explain the importance of good data to the business – Often the business needs educating on why issues such as data quality and data ethics are so important. Neither topic should be seen as something separate from the business. The business needs to be involved in setting policies and owning compliance. Make the connection between ethics, transparency, and what is important, such as delivering shareholder value and creating innovation uplift.

Involve a wide range of stakeholders – In any new business project that involves using data, for example a new retail-focused AI solution, it helps not just to have the CDO or CIO look at the potential data issues. From the IT team and analysts to the CEO, all employees need to be data-cognisant and understand the issues. A diverse group of people looking at potential data ethics issues in a project brings a wider variety of perspectives. A fuller, 360-degree view of the client will increase the likelihood that any challenges will be flagged early on.

Monitor data uses on an ongoing basis – It is not enough to just validate the use of data at the beginning of a project, the development of a model, or the creation of an AI solution.
As business circumstances evolve, so too may the uses of the data. Check in regularly that both the data and the system that is using it are continuing to deliver value and are performing as expected. In the next year, many organisations will prioritise taking the next steps in creating a data ethics culture. Working together as data citizens across a business and as a wider cross-company initiative, we can all help shape each other’s journeys. We need to collaborate, share ideas and update each other on the progress each of our individual organisations is making in promoting a more ethical culture around its data resources. Data ethics is undoubtedly still a work in progress, but it is one that could not only help companies profit, but generate real societal gains.

### Customer success is critical to growth in the digital era

Any enterprise worth its salt will purport to be customer-centric. Yet what this actually means in practice can be a pale imitation of the approach needed for the customer retention and renewal rates required for business growth. We are in an era where sporadic client satisfaction scores and a reliance on dated customer service and sales support no longer cut it. Traditional investment in sales simply isn’t paying off in the way it once did. Typically, the customer journey is now long and fluctuating, and demands a commitment from the service provider to deliver value at each stage in the process, irrespective of customer size. And of course, the real work only begins after the client is won. In short, the game has changed forever, prompted by customers themselves, who have higher expectations around service quality and the innovation required to pique their interest and retain their loyalty. This has gone hand in hand with a technological revolution which has had a significant influence on the dynamics and balance of power between the business and the customer in the enterprise software space. The rise of cloud computing, hosted Software-as-a-Service (SaaS) solutions and subscription-based monthly recurring revenue models has been the game-changer. Now, only customers satisfied by the business value provided by the software will be incentivised to continue making ongoing payments, putting them firmly in the driving seat. As such, keeping them on board has never mattered more, in some instances a strategic endeavour that outranks the chasing of new leads. It’s the reason why the savviest operators in the digital enterprise recognise that managing customer success must now be a fully-fledged function in its own right. And, more broadly, fully ingrained within the business culture. Populated by a team that is accountable for delivering financially sound, scalable, outcome-focused and growth-driven support to clients, all activity stems from the ability to know every aspect of customer needs and behaviour. Only by conducting a continual assessment of their health, and factoring in the change and evolution in the customer journey, will the business be equipped to identify risky renewals and avoid missed opportunities. By utilising the wealth of customer data that we now have at our disposal, and drilling deep into client accounts, we can now identify where the attention and resources should be placed.
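A deliberately simple sketch of that idea is a composite customer-health score built from a handful of account signals. The inputs and weights below are hypothetical; real customer-success platforms use far richer models, but the principle of turning account data into a single signal for where attention should go is the same.

```python
# A deliberately simple customer-health score. Inputs and weights are
# hypothetical and for illustration only.
def health_score(logins_last_30d: int, tickets_open: int,
                 renewal_days_away: int, nps: int) -> float:
    """Return 0-100; higher means healthier."""
    usage = min(logins_last_30d / 20, 1.0) * 40            # engagement, capped
    support = max(0.0, 1.0 - tickets_open / 5) * 20        # open issues drag the score down
    runway = min(renewal_days_away / 180, 1.0) * 20        # imminent renewals need attention
    sentiment = ((nps + 100) / 200) * 20                   # NPS mapped from [-100, 100]
    return round(usage + support + runway + sentiment, 1)

accounts = {"Acme": (25, 0, 200, 60), "Globex": (2, 6, 30, -20)}
for name, metrics in accounts.items():
    print(name, health_score(*metrics))    # low scores flag risky renewals
```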
Only then are we ready to apply the nuanced and anticipatory interventions which can lead to greater advocacy, leading to impressed customers communicating the advantages of the product to others in their network. Ultimately, the business that can best gather, analyse and utilise data to garner the most pertinent insights into their audience, will be best placed to thrive. This is why success hinges on the right data solution, a challenge that can still thwart many organisations. Indeed, customer success specialist UserIQ’s first annual State of Customer Success and Trends found that the greatest challenge for almost 40% of customer success teams was gaining visibility into customer health metrics and user behaviours. The onus, therefore, must be placed upon customer-aligned delivery models, notably the cloud, predictive analytics and a platform that offers a single view of customer data via visual reports and dashboards. Only such granular detail, presented in a clear and accessible way, will enable informed decision making. While this becomes the basis for making a shrewd and timely product recommendation, the visibility and broader picture provided goes beyond metrics. Indeed, it can achieve more meaningful and engaged relationships, in which the customer is helped to derive success from the product offered. It is the antithesis of the traditional, short-sighted approach of simply viewing customers as a mere revenue stream. The result is a successful marriage of technical innovation and the very human qualities of rapport and relationship building found in good customer success managers, who are now empowered by having greater insight at their fingertips.[/vc_column_text][/vc_column][/vc_row] ### Hamilton Rentals Enabled Businesses to Keep the Lights On [vc_row][vc_column][vc_column_text][WOKINGHAM] 1 April 2020 Five weeks ago, with impending danger attached to a virus emanating from the East, Hamilton Rentals anticipated a new change in circumstance was going to affect how people worked. If our hunch was correct, this change was quite likely to disrupt business in offices. We prepared for an upsurge in demand for laptops, which would enable employees to work from home and operate a business as usual policy – as our business has. “I am proud to say that we have been able to rapidly deploy over 3,500 laptops to customers during March, enabling their businesses to operate as usual. What I have learned from this experience, which has created such disruption to our normal daily activities, is the importance of IT rental in a business’s crises strategy”. - Steve Shelsher With the situation changing rapidly each day, we will continue to support our community by supplying AV and IT equipment to businesses for short-term rental. Our warehouses in Wokingham and Portsmouth are fully operational, while employees working from home remain contactable virtually, with video calls a substitute for face-to-face discussions and meetings, while we get through this current period. Hardware hire is structured around the requirement, allowing for the equipment’s return, renewal or refresh at any point, with no financial penalty. In response to demand, additional stock will continue to be processed. Contact us with any queries and we will do all we can to help. Logistics: Hamilton’s internal fleet of vehicles delivered compute equipment to business locations throughout the UK, as well as facilitating distribution to homeworkers. 
Our engineers, as per usual, managed the IT hardware installations and continue to offer support to businesses via telephone, or on-site where possible.

Sanitisation of product: Rental products have always been cleaned before and after each hire as a matter of course. We have introduced additional sanitisation using the cleaning agent and virus sanitiser Isopropanol liquid, which is used in many laboratories and hospitals to clean surfaces and disinfect products. Please be assured that we are taking precautionary measures to minimise and contain any risk of contagion, to protect the health of our employees, and are following guidance and advice from Government.

About Hamilton Rentals
Hamilton Rentals has been in business since 1972, starting with the provision of short-term rental of electronic office, computer and audio visual, and test and measurement equipment. In 2017, the company was acquired by Bell Integration, a global IT services consulting business. Bell is at the forefront of helping companies drive down operating costs and improve their ability to engage with their own customers, either by supporting the growth of their channels to market, or by aiding their responsiveness and service quality. Combining forces enabled Hamilton's to extend its customer base and range of services beyond IT and AV rental solutions. In addition to compute rental, Hamilton's provides Channel Rental and Vendor Hardware Evaluation programmes, Event Management, Ex-Rental Sales, and Installation and Support solutions to the construction, events and exhibitions, film and television, financial, government, IT systems integration and technology sectors. Being part of Bell Integration, the company is also able to offer a range of IT solutions, which you can read about here: https://www.hamilton.co.uk/additional-services/

### Everyday AI | The rise of chatbots

It can be difficult to know whether we're talking to a real person or an automated response bot when we are seeking help online, as the abilities of such bots become more and more human-like and advanced. Chatbots have become increasingly popular with businesses because they can provide automated replies to customers at all hours of the day (a task beyond human abilities), and instead of hiring new staff for the role, businesses can design a chatbot to meet specific requirements. The creation and use of chatbots result from the widespread rise in AI deployment. The most advanced chatbots run via artificial intelligence, giving them the ability to comprehend complex tasks and produce personalised responses. The utilisation of chatbots is part of a larger transition in the way businesses reach and help their consumers. Chatbots are a feature of an increasingly time-conscious and automated customer experience. In the same way that more and more people are likely to choose self-checkout points at the supermarket to avoid the added time expense of social interaction, chatbots give people the ability to get quick and concise information at any time. Chatbots can automate hundreds of interactions during a single day, a task that would take traditional workers many hours. The time that employees save can be spent on other tasks and can dramatically improve productivity. Optimising engagement can be a difficult feat, but chatbots can aid this by communicating with customers on behalf of a company.
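As a rough illustration of the kind of automation being described (a minimal sketch only, not any vendor's product; the keyword rules and replies below are invented for the example), a simple rule-based reply function might look like this:

```python
# Minimal, hypothetical rule-based chatbot sketch: matches a customer message
# against simple keyword rules and falls back to a human handover.
RULES = {
    "opening hours": "We're open 9am-5pm, Monday to Friday.",
    "delivery": "Standard delivery takes 3-5 working days.",
    "refund": "You can request a refund within 30 days of purchase.",
}

def reply(message: str) -> str:
    text = message.lower()
    for keyword, answer in RULES.items():
        if keyword in text:
            return answer
    # No rule matched: hand over to a person rather than guessing.
    return "I'm not sure about that - let me connect you to a member of the team."

if __name__ == "__main__":
    print(reply("What are your opening hours?"))
    print(reply("Can I change my billing address?"))
```

The most advanced chatbots described above replace the keyword matching with machine learning models that interpret intent, but the surrounding pattern of automated, always-available replies with a human fallback is the same.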
There has been debate concerning bots vs apps, suggesting that one must replace the other, but a hybrid of both can be the most effective approach. Rather than having to message a bot and hope for an accurate and helpful response, a bot integrated into an app can sit alongside dashboards, social media, email and more as ways to seek out information. Chatbots can guide people through the app and provide options of what to click on, in order to provide a more accessible and curated experience for the customer. Relying solely on a chatbot can cause problems, as it can actively move your money if you request it, and if a mistake is made, it may be complicated to fix. Within an app, it is easier to track processes, amend requests and seek further help through links and email. In April 2016, Facebook announced its plans to allow businesses to provide automated customer assistance and interaction through chatbots on Facebook Messenger. By enabling the Facebook Messenger app to be used to access a variety of chatbots, a user's interaction with a chatbot becomes much simpler and loyalty to the Facebook brand is increased. Since Facebook and other messenger providers like Kik began to incorporate chatbots into their apps, the chatbot market has ballooned. Developers have experimented with public-figure chatbots, including Shakespeare chatbots. Although chatbots have only started to become mainstream in recent years, in 2001 a system called SmarterChild launched on AOL's instant messenger service, AIM. SmarterChild could inform the user of weather information and the latest news. The system struggled to answer complex questions and could only offer default answers to some, and therefore it is the advancements that have come about recently, and the evolution of the technology, that give chatbots tremendous marketability. The excitement that a chatbot provides its users as they ask it endless questions to test its abilities shows its novelty appeal, but the technology can be deployed to aid a user's budgeting and planning and has the potential to provide far more than a novelty experience. An example of a bot integrated into the Facebook Messenger experience is Cleo, an AI financial assistant bot that can analyse your spending habits and keep you on track with a budget of your choice by alerting you to your spending, and even offering the option to transfer money. The bot uses 'bank-level encryption and security practices' and cannot do anything with your money without your consent. With chatbots replacing some interactions traditionally carried out between humans, the difference between interactions with machines and fellow humans is a significant consideration for users and developers. A considerable difference between chatbots and people is the struggle of chatbots and AIs to pass a modification of Alan Turing's "Imitation Game" test (introduced in 1950). The original test consisted of three rooms, each connected via a computer screen and keyboard to the others. In one room sits a man, in the second a woman, and in the third sits a person referred to as the "judge". The judge's role is to decipher which of the two people talking to him through the computer is the man. A modification of this test was then carried out in which the judge must decide which is human and which is a machine. So far, no chatbot has passed the Turing test, as the technology is not at the stage where it is flexible and able to understand the context of interactions to the same extent as humans.
If Ray Kurzweil, Google's Director of Engineering, is correct in his predictions, AI will reach human intelligence by 2029, and chatbots should have no problem passing the Turing test.

### What the arrival of 5G means for IoT security

This year, EE and Vodafone became the first network operators to offer 5G network technology, with O2, BT Mobile and Sky Mobile set to soon follow suit. Within the next five years, we now expect to see a huge uptake in 5G subscribers worldwide, with some estimates numbering almost 2 billion. Already, consumers are delighted at their ability to access superfast download speeds, which can reach 460Mbps in certain parts of the country. But it's important to remember that 5G isn't simply about improved download speeds for users - rather it has been specifically designed to support 'things' and the machine-to-machine (M2M) connections that enable them to communicate with one another. At the very least, 5G will be able to support one million devices per square kilometre, including smartphones, wearables, energy meters and appliances - and this list is likely to expand in the near future to include driverless cars and medical devices, such as insulin pumps that are permanently connected to the Internet. However, the rapid rise in the number of smart devices worldwide has also coincided with a dramatic increase in cyberattacks, where IoT devices are often targeted for use in DDoS botnets. Everything from toys to traffic lights to fish tank thermometers has been successfully hacked by cybercriminals.

The problem with smart devices
Many people wonder why smart devices have historically been so difficult to keep secure - after all, many of us are familiar with installing antivirus software on our PCs and we've come to expect a degree of security to be pre-built into a laptop or smartphone. The fact is, most smart devices are built to be as small and cheap as possible, so there often isn't enough CPU or RAM available to support on-device protection - particularly if the device is expected to download and install regular updates, which is an essential aspect of any watertight cybersecurity system. The only practical way to protect smart devices is through network-based security, where a layer of protection is provided at the root of the network, ensuring that all connected devices in a household or workplace are covered. This is sometimes offered by a broadband provider as part of a package deal or additional service. Until now, network-based security has provided a thorough approach because the majority of devices connect to the Internet via WiFi or broadband connections - but of course this is all set to change with 5G.

DNS and the "phonebook" of the Internet
Thankfully, the situation is no different for 5G-connected devices: they can be protected by the same type of network-based security via the Domain Name System (DNS), which is sometimes described as the "phonebook" of the Internet. When you type "www.comparethecloud.com" or "www.disruptive.live" into your web browser, DNS converts this web address into a computer-readable IP address (e.g. "192.168.1.1"). Almost all network connections - even M2M ones - require an initial DNS lookup, so this makes it the perfect location to apply network-based security. This way, owners and operators of IoT devices can detect unusual traffic or connections to known malware command and control servers.
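To make the "phonebook" analogy concrete, here is a minimal sketch of the kind of check a DNS-layer security service performs. It uses Python's standard library; the blocklist domains are placeholders invented for the example, not real threat intelligence:

```python
# Minimal sketch of DNS-layer filtering: resolve a hostname and check it
# against a blocklist before allowing the connection to proceed.
import socket
from typing import Optional

BLOCKLIST = {"known-botnet-c2.example", "malware-update.example"}  # placeholders

def resolve_if_allowed(hostname: str) -> Optional[str]:
    if hostname in BLOCKLIST:
        print(f"Blocked DNS lookup for {hostname}")
        return None
    ip_address = socket.gethostbyname(hostname)  # the DNS lookup itself
    print(f"{hostname} resolved to {ip_address}")
    return ip_address

if __name__ == "__main__":
    resolve_if_allowed("www.comparethecloud.com")
    resolve_if_allowed("known-botnet-c2.example")
```

In practice this filtering happens at the network's resolver rather than on each device, which is what allows every connected 'thing' to be covered without any on-device software.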
What this means for IoT security
Within three years, it's expected that there will be 18 billion IoT devices worldwide - that's more than two devices for every man, woman and child across the planet. Furthermore, 70% of these devices are expected to be utilising cellular technology like 5G, and without the correct security measures in place, this is an unimaginable number of devices that could be targeted by cybercriminals. DNS-based network security is quite simply the only solution that can deal with this increasing IoT cybersecurity risk. And not only is it an effective method of identifying, mitigating and responding to infection, but it's a much more convenient method for consumers, too. With DNS security, an administrator can receive real-time protection alerts directly to their smartphone, removing the need to run manual virus scans. It also enables security updates to be applied to every device across the whole network. As 5G adoption continues to grow, it's vital that we treat it just like any other network and apply the same security measures that anyone would expect of a home broadband connection. This is a golden opportunity for network providers to unlock new revenue streams while ensuring that their customers can stay on top of security threats whilst at home, at work and on the go.

### Getting the Most From Modern Data Applications in the Cloud

Increasingly, big data is being deployed in the cloud; the latest research confirms it. Quite rightly so - increased scalability for growing workload volumes, reduced management overhead and the assurance of SLAs is often all it takes to move your workloads to the cloud. A wide array of technologies - Kafka, Spark, Impala, Hadoop and NoSQL, to name a few - is being rapidly deployed. However, truly reaping the benefits and getting the most from modern data applications (such as customer 360, fraud analytics and predictive maintenance) in the cloud requires data-driven planning and execution. So where to begin? Below are key considerations before deciding how, when and why to move to the cloud.

Understanding the current on-premises environment
In order to effectively move data deployments to the cloud, it goes without saying that a business needs to fully understand its current environment and dependencies. Specifically, details around cluster resources and usage, and application requirements and behaviours, are vital for making confident decisions about what makes sense to take to the cloud. Firstly, an organisation needs to identify what its big data cluster looks like and how data pipelines are architected, from data ingest through to consumption of insights. What are the services deployed? How many nodes does it have? What resources are allocated and what is their consumption like across the entire cluster? Secondly, it needs to understand what applications are running on the cluster, as well as how many resources these applications are utilising and which are prone to bursting, for it is these that make a prime case for cloud deployment.

Making the move, one application at a time
Once a thorough understanding of the cluster has been ascertained, the next step of the cloud migration journey is to understand which workloads would benefit most from the advantages offered by the prospective cloud environment.
For example, the applications that are prone to bursting in nature as well as failing due to lack of available resources make good candidates, namely due to the third party’s obligation to SLAs. Not only this, but once it has been decided which applications it makes sense to move to the cloud, the question moves on to timing. Often, most organisations strategically decide to phase out migration to make the transition as smooth as possible. They may decide to migrate applications belonging to specific users and/or queues first, followed by others in a different phase as determined by the analysis made in understanding the on-premises environment. It may sound easy enough, but often this information is scattered across the technology stack, making competent analysis - and subsequently a firm business case - challenging if not impossible. Taking the leap with full faith What is needed, then, is a single pane of glass that can ensure effective cluster reporting across time that reveals patterns in workload and application resource usage, as well as predictive analysis to ensure cloud deployments are designed with peak utilisation times top of mind. A Data Operations and Performance Management  solution can offer all of this information in real-time across the entire data stack that allows for full confidence in mapping an on-premise environment to a cloud-based one. Not only that, but the most advanced solutions will also provide different strategies for this mapping based on individual goals, whether they be based predominantly on cost reduction, workload fit or otherwise. Taking this a step further still, reputable providers will have comprehensive knowledge of the main cloud providers, including the specs and cost of each VM, and can help to monitor the migration process as it happens. Once the workloads are there, the best APM tools can then compare how a given application is performing in its new home compared to before as well as provide recommendations and automatic fixes to ensure performance stays up to par Ultimately, as an organisation migrates its apps to the cloud, a robust Data Operations approach will help ensure it won’t be flying blind. With data-driven intelligence and recommendations for optimising compute, memory, and storage resources, proper planning and the technology ensures the transition is a justified and, most importantly, smooth one.[/vc_column_text][/vc_column][/vc_row] ### 7 steps to a seamless digital transformation project [vc_row][vc_column][vc_column_text]With digital transformation now a must for many businesses – large and small – Alex Wilkinson, Client Delivery Director at Cranford Group, shares his seven-step guide to overcoming the obstacles most commonly related with these change projects… 1. Define the scope of the project For some organisations, a digital transformation programme could involve a relatively simple move from an on-premise to off-premise cloud infrastructure. Other businesses may be readying themselves to build their own secure data centre or roll out a complex multi-cloud strategy. Whatever the venture, the scope of the project needs to be clearly defined and communicated, so that objectives and timescales can be mapped out and agreed by all relevant stakeholders. It is important to remember that everything is relative too – what represents a straightforward transition for one team could prove a huge challenge for the next. 
A well-articulated plan ensures expectations can be managed from the outset, and potential obstacles anticipated and mitigated. 2. Prioritise people Irrespective of the technology or operating model that is selected for a digital transformation project, the involvement of the right people is key. In fact, overlooking the role that humans play in such schemes, is one of the most common reasons why projects fail – or at least encounter stumbling blocks. Around 90% of digital transformation projects take place to make the lives of people easier, in some respect. So, to neglect people-centric considerations throughout the assignment seems ludicrous. This should remain in sharp focus from day one, through to completion, and beyond. The right people are required to deliver the transformation too. Do they have the required skill-set? The necessary mindset? Are there gaps within the team? These questions need to be answered before anything else takes place. 3. Complete a competency framework Digital transformation is one of the most overused phrases in the business environment at present. So, linked to point 1, it is important to understand exactly what activity is taking place, and also which skills are required to execute this brief (as eluded to in point 2). It is then possible to complete a competency framework which clearly demonstrates the existing capabilities of the team, as well as any potential areas of skills exposure. Some soft skills can be taught, providing the project schedule allows for the learning process to take place. Other more technical expertise may need to be brought in, whether to permanently plug a gap or to drive the project execution on a contracted basis – the latter being particularly common in the cloud and DevOps space, given the resourcing challenges currently being experienced (Gartner, January 2019). Ultimately, if you don’t measure you can’t manage, so this competency framework is a crucial exercise to ensure the right skills are in position. 4. Develop knowledge  There are so many ways to upskill staff and fuel knowledge transfer, from investing in formal in-house or external training, to devising a bespoke e-learning programme, and/or leveraging the collaborative mindset of the developer community. The most appropriate route depends on factors such as the project timeframe, training budget and culture of the business. Ideally, the learning plan should be project or operating model specific. Ask yourself – do you need front-end developers, or people with DevOps, cybersecurity, big data, or AI specialisms, for instance? Is AWS, Azure or ServiceNow knowledge a must? Or specialities like Python and Jira? The next question is whether these skills can be taught or whether it is better to bring experienced talent into the team, if only for the duration of the project. 5. Foster a data-driven culture A recent article by Craig Stewart, SVP of products at SnapLogic, perfectly coined this point: “A data culture is, simply, when everyone in the company is switched on to the potential of data.” He also stressed that: “Building a data culture needs to be a crucial part of the digital transformation process.” So, EQ skills matter just as much as the team’s technical capabilities for instance, as does a shared acknowledgement for the role that data plays throughout the organisation. People that have an aptitude for learning and development typically support digital transformation, whereas others who resist – or do not understand – the change, may impede the project’s success. 
Managing cultural change can be a long and complex process, so if an appetite for digital transformation doesn’t exist, the project will always be harder to execute – but not impossible. Strong leadership and communication are paramount before any new strategies or infrastructures are implemented, so that people appreciate the reason for change and are hopefully on board in the earlier phases. Many individuals won’t work in a new way unless they understand why it will benefit the business and/or them. Someone should therefore own the ‘marketing’ surrounding the digital transformation exercise, even if this is only an internal communications process. Looping back to point 1, this should remain a people-centric programme of work. 6. Of course, there’s the tech! This is one of the most obvious scenarios where there can be no such thing as prescriptive ‘one size fits all’ advice. The technology and operating model that suits one digital transformation brief may not be right for the next organisation, even if they have a seemingly comparable objective. There are so many elements at play, but of course in considering a seamless digital transformation project, it is important to reference the need to choose the best-fit tech. A bank, looking to transfer millions of users to a new app, will undoubtedly have more due diligence to carry out than a clothes retailer, who may experience a small amount of downtime, for instance. There is no way the bank could utilise a public cloud environment, for example – they may decide the only way forward is to build a bespoke data centre from scratch. And that’s before they even think about the UI/UX. Everything comes down to the brief concerned. 7. Understand the compliance, governance and security requirements of the project Cloud is often the most secure infrastructure for an organisation’s data, but the implications of a hack could be devastating. Organisations must therefore be acutely aware of their compliance and governance requirements, when it comes to keeping their data secure. It is important to consider things such as the resilience of the proposed infrastructure, GDPR obligations and disaster recovery strategies. Which cloud environment is most appropriate, for example, can the proposed partner provide the SLA you require, and do they have a proven approach to DR, that will minimise any business disruption or reputational damage, should the worst happen?[/vc_column_text][/vc_column][/vc_row] ### Cyber Security Threats Companies Face in the Digital Age? [vc_row][vc_column][vc_column_text]Account takeover (ATO) fraud is an online scam where a trickster gets hold of valid credentials and uses them to take control of an account. Once the fraudster finds their way in, they pretend to be the actual account owners and may go to the extent of initiating multiple transactions, selling important data to black markets, changing account passwords, transferring or redeeming loyalty points, or any other thing that could earn them some cash. What a scammer does with your account depends on the type of account you hold. Customers who are reluctant to change their passwords from one account to the next allows fraudsters to gain control of multiple accounts after a successful breach into one. Tricksters are also exploiting Tech in their account takeover endeavors— for instance, if they already have your email, bots can help them try multiple (even thousands) of possible passwords. What are the red flags an Account Takeover is underway? 
It can be challenging to spot transactions coming from an ATO fraudster because they may look normal, coming from a familiar shopper with a record of initiating transactions. But there are some ATO red flags merchants should keep an eye on; here are the most common:
- The number of purchases goes up beyond what you consider reasonable for the shopper's purchasing habits.
- Massive reward point transfers.
- Multiple sudden changes to an account in one go, for things like email, address, password or device.
- A rise in fraudulent disputes by the customer, because the real cardholder is disagreeing with the charges on their card.
- Several "password reset" requests or multiple login attempts.

Synthetic Identity Fraud
Synthetic identity fraud is when scammers take your real information, such as address, email, name and date of birth, and mix it with fabricated information to form an account. This approach is a fraudster's favourite way of making fake credit or debit card purchases. Be warned: the fraudster uses the real customer's name and exact details, which makes it a very difficult cybercrime to detect. But you can prevent credit card fraud with Ivrnet and protect your customers from exploitation.

Denial of Service (DoS)
This is an attack tricksters launch on large companies with the aim of shutting down the website in question. A Denial of Service attack starts with tricksters exploiting a system's vulnerability and using it to send huge volumes of data across the entire network until the system breaks down. Denial of service can take two forms: DoS, where fraudsters attack through one computer, or Distributed Denial of Service (DDoS), where they attack through multiple compromised computers. One of the most important ways to stay safe from DoS and DDoS is to update your site's software regularly. Also, make sure to keep track of data flow to check for unexplained increases in traffic. If possible, purchase extra bandwidth or use tools that can detect DoS attacks.

Conclusion
Attacks are dangerous to your company, no matter how small they seem. A small attempt could open the door to a massive heist. Remember, the more we adopt tech, the more diligent we must be in discovering loopholes and stopping tricksters from successfully conducting digital assaults.

### What Does an Augmented Workforce Really Look Like?

Businesses are increasingly advancing the use of automated and AI technologies, but are we ready to manage teams with people, robots and AI working together side by side? What does an augmented workforce truly look like and what do C-suite executives need to know about this growing trend to make it work for their businesses? It's important to remember that an augmented workforce is not a new thing. Since the dawn of time, humans have shown a distinctive ability for making and operating tools, and these tools have evolved with us. Making them requires growing specialisation – such as training an AI – but operating them is becoming easy for the end user. With the rise of machine learning we are witnessing yet another industrial revolution (and let's hope not the last one led by humans!) where the level of human supervision over the machine will be kept to a bare minimum and, arguably, there could be a reverse hierarchy with the AI supervising some human tasks. But we do not need to look too far into the future to witness the changes – these are already happening in our working lives today.
No matter how wary we humans are of this type of technology, the augmented workforce is not something an executive can decide whether to embrace or not. It’s happening. Every day, we deal with an augmented workforce, either directly, or indirectly through colleagues who use automation in their daily activities. The pattern is simple: usually, a minor initial investment of time and money in an automated solution generates a disruptive return on investment because a human could not achieve the same task in the same amount of time. The Digital Open Office We are already experiencing much more open and on-demand working environments in many industries through a mobile and remote-centric trend. An augmented office changes the dynamics of the different tools needed to supplement the compromise of office-based collaboration and facilities-based environment. For example, the use of printers, the next-door colleague, office managers or internal companies providing computers which sets the tone for data tracking and compliance from IT teams. When it comes to digital innovation and a mobile workforce, we are seeing technology becoming ever-more crucial for mobile staff. From the transportation of fleets to technical offsite engineers, the workforce is dependent on technologies to carry out their jobs, yet these tasks still require human expertise to execute them, so it’s a co-dependent dynamic. Controlled Environment and Data Protection The anticipated benefits of an augmented workforce – whether it’s an AI-driven, an automation tool or complete AI robot assistance – are to diminish human error and increase productivity. Reducing human error through automation is one of the core reasons companies are deploying intelligent systems that can, for example, support decision-making through the means of eliminating data chaos. Increasing transparency by auto-data gathering and enforcing technology that controls data by establishing a more efficient workflow related to that individual task. Robotic process automation can run 24/7, allowing room and filling the void for out-of-hours routines. This would of course depend on the type of business, but we deploy all kinds of automation from CRM systems to self-help tours based on user behaviour, all of which requires human setup, maintenance and management. An interesting Research conducted by Deloitte states that over 34% of the workforce already adopts AI or robotics, and another 34% are in the process of implementing. From a business perspective, AI can bring a new competitive edge and game-changing processes to deliver better value faster. The Future Uniting ever-more sophisticated technologies into the human workforce will be an experimental process. It will be an exploration of the array of new developments that can meet the changing needs of businesses and support growth through new process innovation. This allows employees to focus on management rather than repetitive tasks. Empowering employees to focus on more cognitive competencies which technology will struggle to achieve. AI-based technology has evolved and is easier to use and manage. Different applications are now talking to each other through APIs, further streamlining data and disrupting the status quo workflow. So we must adapt. An augmented workforce will require a changing role or skill set in everyday lives, as opposed to replacing them. Over the next few years we will face some interesting challenges. 
On the one hand humans will have to accept that machines will get better than them at, for example, recognising patterns, such as with fraud detection. On the other hand, there is a debate about whether we will require machines to make ethical decisions. Or will they design their own ethical rules? AI and automation are unavoidable, they are here already.[/vc_column_text][/vc_column][/vc_row] ### Why Smart Homes Require Smarter Security [vc_row][vc_column][vc_column_text]Smart home applications have taken the world by storm in the last few years, with consumers rushing to outfit their homes with the latest and greatest technology which promises to automate the most mundane tasks. Turn your heating off from work when you’ve forgotten to do it at home. Switch the lights off from your bed. Control your TV with your voice. Monitor your home even when you’re away. The gadgets can also boost sustainability, cost savings and benefit safety. Yet they do pose security risks and users must take precautions. Just as you would protect your computer from hacking your personal information, you need to think about how you will protect your Alexa, which stores all sorts of personal data. Smart devices are categorised under the “Internet of Things”, which connects all devices together, from smart toasters to smartphones and even wearables. It’s all connected. Recent Cisco insight found that by 2020, there will be more than 30 billion connected devices in the world. There is also 2.5 quintillion bytes of data produced every day - that’s 2.5 followed by 18 zeros. Take a look at the below infographic from Domo’s “Data Never Sleeps” campaign which shows just how much data in 2018 was processed every minute of every single day: Security Risks Think about it: a smart home fully equipped with appliances would cover the kitchen, the bedroom, bathroom, garden, living room, etc. If you’re logging your daily activities and rituals, it’s going to have a bucketload of data and the potential to know everything about you, from lighting preferences to individual family members’ daily schedules. Just a few months ago, Amazon faced backlash after the Amazon Alexa assistant was reported to let out random bouts of eerie laughter and a user in Germany was sent 1,700 audio recordings from someone he did not know. Computer scientists at the College of William and Mary discovered during tests that someone using the same WiFi as a stranger on a public hotspot in a coffee shop could hack into their smart home without even stepping into it. This is done by accessing a low-security device such as a baby monitor and then using this access to go into a high-security application which has important information on. How Do I Protect Myself? Many of these devices are tied to our privacy and security, such as doors and cameras. They can then use this information to physically burgle your house - and take your data with them. These are very rare cases but, put simply, a smart home requires even smarter security, and you should have precautions in place. This can be a potentially major hindrance in the future if not protected right.  Multi-layered approach The multi-layered approach should be at the forefront of your home security strategy. You can have all the smart security in the world, but traditional methods are priceless. This way, you can get the best of both worlds and be assured that the deterrents are in place. 
Think about the community that you're living in: make sure that someone picks up your post and checks in on your house when you're away, regularly change your locks, and check that your windows and doors are secure. A smart security system is great, but it's not a magic bullet. It doesn't have arms and legs to prevent burglars from entering your home, but it should be used in tandem with traditional methods to deter burglars and alert you if something happens. Glass break sensors, for example, are a great technology, sending your phone an alert if a window is broken.

Security lighting
A simple security floodlight is an incredibly effective addition to your home security - and when you're spending a fortune on smart home tech, it's pretty affordable. It also boosts the effectiveness of your CCTV and smart doorbells by making facial recognition easier.

Password Protect
Make sure you change the default password of each different smart device. It's customary for websites, but it can easily be forgotten for connected home devices, as we don't always associate them with computers. If possible, keep banking and other sensitive activity on a different WiFi network from your smart devices, which prevents a hacker from getting into your lighting and then being able to reach your bank account.

Be Wary of Public WiFi
Many people who use public WiFi networks are aware that they bring more security risks, yet still use them. They can be dangerous, as mentioned earlier, as those that are passwordless can be vulnerable to hacking attacks. You can take steps to minimise risk by only connecting to networks that require security codes (most restaurants now have these), setting your phone so that it doesn't automatically connect to public networks, and using a virtual private network.

Be Aware
In security prevention, there's no replacement for being aware. Smart home devices are not meant to be a burden and can bring an abundance of benefits, but you must be wary of the risks. If your device is acting strangely, trust your instinct and consult a security professional or throw it away. Traditional security is just as important with smart security as without, and must never be neglected. Let's be aware, take the small steps to increase safety, and then put your feet up and enjoy the benefits of living in a smart home!

### Using Artificial Intelligence to Drive Compliance

There are many different applications for artificial intelligence (AI). Some are ready now, such as using the processing power of AI to crunch big data to gain customer insight. Others, such as deploying AI within driverless vehicle computers, might be another year or two from being fully ready. But amidst the discussion around AI, one element of modern business is generally overlooked – compliance and governance. Being compliant and on the right side of governance laws around the world is vital. The European Union's General Data Protection Regulation (GDPR) that comes into effect on 25 May 2018 is a high-profile example. Every organisation – irrespective of where in the world it is located – will have to comply with GDPR if it holds or collects data on European citizens. Failure to do so will result in fines of up to €20,000,000 or 4% of an organisation's annual global turnover, whichever is greater.
quote="Yet many organisations continue to manage their compliance and governance functions manually. They use no more than an Excel spreadsheet to stay on top of increasingly complex regulations."] Yet many organisations continue to manage their compliance and governance functions manually. They use no more than an Excel spreadsheet to stay on top of increasingly complex regulations. What challenges are organisations facing with regard to compliance? Could AI be about to drive a new era of automated compliance? Compliance challenges Compliance is a never-ending process for businesses, and the regulatory requirements across most industries are constantly evolving. This is especially true in heavily-regulated sectors such as financial services (FS). New regulations such as MiFID II and MAR increase the regulatory requirements for FS firms. This is also true for initiatives yet to take effect such as GDPR. This will impact on what is needed to remain compliant. Barriers to international trade are much lower in modern business, which means that many organisations do business in an array of different countries all over the world. Each country (and trading block, such as EU) has their own regulations and requirements for remaining compliant. This makes the task even more of a challenge. Most larger organisations have their own officers and teams to deal with being compliant, liaising with regulatory bodies and governments. They are, however, still frequently managing it manually. Such approaches can leave an organisation vulnerable to rapidly changing circumstances. It also raises risk of facing penalties for non-compliance.  It all means that attempting to stay on the right side of all this regulation can feel like a significant challenge. Digital compliance However, there is another way of managing governance that smarter organisations are already using – digital compliance. This is an automated approach that ensures being compliant. It helps take the strain from beleaguered teams that currently manage it, too. To be effectively compliant, you need precision. You also need to be systematic, and you need to be up to date. This can be done using a spreadsheet, database or SharePoint. These methods all require human input though, and humans are fallible. However, if you have an automated tool that takes away the hassle of managing it, it mitigates the risk of failure much more effectively. This entails a combination of automation tool, along with input from industry experts and thought leaders. This can help map the compliance requirements faced by an organisation, across territories, sectors and a range of other areas. But for 2018, the next wave of compliance could be based around artificial intelligence. This is something the Financial Conduct Authority (FCA) is giving serious consideration to. The independent UK financial regulatory body is looking into how AI could be used to enforce regulatory compliance, and is evaluating a number of options that would help reduce risk and allow the FCA to put out rules which are written manually but can be fully and unambiguously interpreted by computers at financial firms. AI and a new wave of compliance That would be a significant move by the FCA and is probably still a while away from becoming a reality. But that’s not to say that organisations cannot use AI themselves to manage compliance. 
The sheer volume of data that compliance professionals must now make sense of is growing continuously, and it gets harder and harder for humans to do so, certainly in the desired timeframe. So using AI to analyse large amounts of data and find patterns, trends and connections within that data, feels like a much more efficient and effective approach to compliance. The more data that is given to AI technology, the faster it will learn, helping to detect any compliance issues and delivering actionable insight to a compliance officer or team. Although compliance is a business function relatively untouched by digital technology, the computing power that AI offers means that organisations will increasingly adopt AI for governance and compliance over the coming years. However, we probably aren’t quite yet at the stage where AI-enabled compliance functions will completely replace humans. Compliance officers will continue to have a significant role to play in most organisations, and the deployment of AI should be used to enhance such teams, not replace them. ### Cloud is a Journey, Not a Station ‘Moving to cloud’ is a phrase that’s often used as though it’s a single step.  For all but the smallest organisations, it is very unlikely to be so.  Cloud comes in many shapes and forms, and implementing it is not simply a process of taking an existing service and handing it over to a cloud provider. Instead, deciding to move one or more services from the in-house provision, third party hosting or a managed service to some cloud is likely to be just the first step in a journey. The desired end point may be to consume public cloud SaaS, but there may be intermediate stages such as a virtual private cloud, using an intermediate provider to host legacy applications or buying service management from a hosting provider until the appropriate skills are developed in-house.  This process needs to be repeated for each of the IT services an organisation consumes. Decide on your destination The first step in the journey is to understand your organisation’s current IT capabilities (the start of the trip), align them with your strategic goals, and define what IT services and capability your organisation needs to deliver these goals (the next destination).  A good place to begin is a business and IT alignment review to define the service levels you need for the key operational processes that IT supports in your organisation to realise any misalignment or gaps in provision and to understand their impact plus cost, performance and availability implications. Cloud provides standardised, commodity services that are consumed in a standard fashion using standard processes. You may need to adapt or modify your current IT operational processes to maximise the benefit of using the cloud, particularly if you want to use the public cloud, as all the major public cloud providers have defined, standard processes and you are unlikely to be able to persuade them to change them to suit your organisation. Most cloud providers follow ITIL processes and offer user self-service for some or all elements of the service, so it is likely that your existing incident processes can easily be adapted. Plan your route The next stage designs: mapping and aligning the journey against your business requirements.  Having defined what services are needed, you now need to decide which ones can usefully be provided via the cloud. You also have to decide how much work you want to do for each service. 
If you choose Infrastructure as a Service (IaaS), you will still have to do a lot of the work; slightly less with Platform as a Service (PaaS); and not much with Software as a Service (SaaS), although you will still have to retain responsibility for ensuring your cloud provider meets the agreed SLAs. You may want to retain core business applications in-house, but move non-core or commodity services to the cloud. Most organisations will want to hand over responsibility for areas where they do not have the skills in-house or cannot justify the cost of employing specialists, such as storage systems, backup management or security systems. There are a multitude of cloud providers, with significant differences in the contractual terms and conditions, available SLAs and recompense if these are not met, the legal jurisdictions where data is held, and data recovery terms. For larger, more complex organisations, a current optimum solution could be a hybrid of public cloud, managed cloud and in-house or private cloud. Few of the cloud services currently available offer the ability to easily transfer legacy applications and all the associated data onto them, and so for these you will probably need to use private cloud or managed IaaS as a staging point until more appropriate public cloud services become available (see our previous article https://www.comparethecloud.net/articles/moving-legacy-applications-safely-cloud/).

Take your team with you
The next stage of the journey is migration, which requires planning, dedicated resources, clearly defined and understood success criteria, a test plan and, if you do not have the skills in-house, external expertise to assist with the process. With cloud, there is one further element, your people, who are a more significant factor with cloud than in a 'traditional' infrastructure migration. Simplistically, cloud services replace services running inside your datacentres; if you have teams of staff currently running or managing these services, they will perceive that the change is not good for them, so you need to factor this in. This can be where outside expertise is invaluable, working with companies who have helped with these issues before. It is vital to get people throughout your organisation on board – not just commitment from the top but support from the team at the coal face. Change is always difficult, particularly with the cloud, as staff will be worried that their jobs are at risk and so may not fully commit to the project. You may be moving to the cloud because you have problems in your IT service delivery. If people are going to support the project, there must be something in it for them, which means job security, new skills and hopefully recognition and increased salary. There may be team members for whom the project does not offer anything. If this is the case, managers should address it at the start of the process so that it does not have an adverse impact at a later stage. Communication is also vital. Too much is never enough, as your staff will always assume the worst when there is silence. Keep them informed throughout the change process. Moving to the cloud is a more complex transition than other infrastructure change projects, particularly if most of your organisation's IT services are currently provided in-house.
The key to delivering it effectively is to understand what you want to achieve and why, build alignment across all the people involved, and understand the costs and implications before making the change. While it isn't easy, good things never are, and by carefully planning the journey, cloud can provide significant benefits to your business operations.

### Why Data Acceleration Is Essential to Cloud Computing's Growth

When it comes to modern big data systems and related cloud computing platforms, you'd think that storage capacity, processing power and network bandwidth would be the primary elements of an efficient system. It's becoming increasingly clear, however, that's not the case, especially as more businesses emphasise data acceleration. Data acceleration essentially refers to the rate or speed at which large troves of data can be ingested, analysed, processed, organised and converted to actionable insights. Speeding up these processes, as you might expect, is the acceleration aspect. More importantly, because there is so much that goes into an efficient data system, it's more of a concept that involves all hardware, software and related tools. By focusing on data acceleration as a whole, platform developers and network engineers can deliver targeted solutions to improve the power, performance and efficiency of these platforms. Simply installing a faster server or enabling wider network pathways are just a couple of examples of how you can improve a system in the short term. However, they don't offer the real benefits that a truly optimised and efficient platform can. It's all about operating in the right environment and under the right conditions to create an optimally functioning data facilitation system. It's remarkably similar to edge computing — on the surface, anyway. At the edge, data is analysed and handled closer to the source, to maximise security but also reliability, speed and performance. Data acceleration uses similar principles, except the data in question is not local — it's still remote. Specialised hardware and systems are deployed to mitigate packet loss and latency issues.

Why Does Data Acceleration Matter so Much?
According to BDO, "During the period 2014-2020, the percentage of U.S. small businesses using cloud computing is expected to more than double from 37 percent to nearly 80 percent." Of course, no one in their right mind would argue against the growth of cloud computing and big data. The point is not necessarily how big or fast that's happening — just simply that it is. A growing market means growing demands and requirements. Cloud providers will need to come up with more capable platforms that can store, ingest, process and return the necessary data streams all in real time. Even so, the hardware for doing all this will continue to evolve and get better, bigger and more capable, but that doesn't necessarily mean it's going to be efficient. It's up to the platform developers and engineers to ensure the appropriate data acceleration limits are not just achieved but maintained. Without acceleration, the community may run into bottlenecks, serious latency problems and even operational failures when the data isn't processed or returned in time.
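As a simple, hypothetical illustration of that time budget (the URL and the 200 ms threshold below are arbitrary assumptions for the example), a data pipeline can be instrumented to flag any request that misses its real-time window:

```python
# Minimal sketch: time a remote data request and flag it when it misses
# the latency budget that a real-time use case can tolerate.
import time
import urllib.request

LATENCY_BUDGET_S = 0.2  # e.g. a real-time personalisation window (assumed)

def fetch_within_budget(url: str) -> bool:
    start = time.perf_counter()
    with urllib.request.urlopen(url, timeout=5) as response:
        response.read()
    elapsed = time.perf_counter() - start
    ok = elapsed <= LATENCY_BUDGET_S
    print(f"{url}: {elapsed * 1000:.0f} ms ({'within' if ok else 'over'} budget)")
    return ok

if __name__ == "__main__":
    fetch_within_budget("https://www.comparethecloud.com")
```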
[clickToTweet tweet="Without #DataAcceleration, the community may run into bottlenecks, serious #latency problems and even operational failures when the #data isn’t #processed or returned in time." quote="Without acceleration, the community may run into bottlenecks, serious latency problems and even operational failures when the data isn’t processed or returned in time."] Think: trying to deliver a personalised ad to a customer after they’ve left the channel you’re targeting. Or, better yet: an autonomous smart vehicle that is feeding relational data to a remote system and waiting for a response or key insight. Most of the brands and organisations feeding data into the system only have a small window to work with, so efficiency and speed are crucial. In order for the cloud computing industry to grow to new heights, at least beyond what it is currently, data acceleration will need to become the number-one priority of most — if not all — development teams. Naturally, you’ll want to research and learn more about data acceleration, if you haven’t already. It’s sure to be a driving force in big data, analytics and machine learning markets for the year ahead. More brands and organisations will see the value and potential in improving the overall efficiency of their data systems. Those that don’t will need to improve efficiency anyway to meet the rising demands of their networks, customers and channels.[/vc_column_text][/vc_column][/vc_row] ### Creativity vs Data: The Two Should Not Be Mutually Exclusive [vc_row][vc_column][vc_column_text]Consumers care. They’re loyal. They expect a lot from their favourite brands. These things ring true not only for the product or the service that they’re buying, but also across every brand touchpoint – from the moment they see an advert online all the way to the moment at which they complete the purchase. This is why the advert itself is so crucial to the equation – a loyal consumer will want to see content that excites them, piques their interest and captures exactly why they fell in love with the brand in the first place. Yet through a cloud of data and layers of technology, the ability to deliver creative and engaging online content is becoming increasingly difficult for marketers globally. But as we take our first steps into 2019, we must commit to changing this status quo. With AI-powered adtech solutions only growing in sophistication, marketers must look to the tech for the powerful data insights that they need to produce truly creative advertising and keep their loyal customers just that: loyal. Data killed the creative star…or did it?  Alarmingly, in a study of more than 500 brand marketers from across the US and Europe, 67% of people pointed to the growth in digital advertising as a key reason for the waning creativity from brands and agencies. And this makes sense. As marketers have struggled to come to terms with the rising complexity of the digital landscape, the need to place an ad in front of the right person, at the right time and in the right place has outweighed the need (and the budget) for the content that they’re actually delivering. In fact, 71% of marketers said they find it difficult to achieve ad reach in the face of fragmenting audiences. And with 62% of marketers stating they feel the media landscape has become too complex, it’s no wonder that we’ve started to lay the blame at the feet of digital media; especially as it continues to become more segmented year on year. 
Technology isn’t the enemy, it’s the hero of the story  The challenges we’re facing thanks to evolving technology are multifaceted. On one hand, every marketer understands the importance of data analytics and personalisation, yet they’ve also become well-acquainted with the headaches that come with data access (hint: the GDPR and the impending ePrivacy Regulation). Yet while it may seem that technology is at the heart of these issues, one well-known, much-talked-about solution holds the key: AI. Yes, it’s the biggest buzzword out there but for good reason. Not only does AI deliver a way to analyse and apply data at scale, it also gives marketers a secret weapon when it comes to creativity. The technology can help agencies understand the exact components of an ad that will determine the success or failure of a campaign. From the right colours, images, words or specific product promotions, AI gives creative teams a chance to dive into the minds of consumers, to see what creative content resonates and why. And to take it a step further, AI can analyse all the components of an ad and make small variations almost in real-time. This means advertisers are able to tweak campaigns to suit an individual consumer’s preferences to a tee – whether that’s serving Jack an ad for his favourite brand of golf clubs set against an image of the Scottish Highlands rather than the sands of the Caribbean; or showing Jill an ad for the latest Audi convertible in racing green as opposed to canary yellow. AI can help uncover these idiosyncrasies that make each of your consumers unique, so you can ensure ads are delivering the content that will inspire people to purchase.  The road ahead Moving into 2019, the narrative will only become more focused on the need to prioritise performance-driven creative. Further pressure will fall on marketers to create the adverts that capture consumers’ hearts and minds, while delivering the sales and profits that brands need to survive. In fact, the overwhelming majority (91%) of marketers acknowledge that their digital ads must be more engaging, so they can achieve their brand’s goals. So, while the rise in digital marketing may have brought with it more complexity, more technology and more data, it has also given marketers more opportunities to fine-tune creative content and deliver the perfect campaign to the perfect audience. By combining data, creative and media together and building a foundation that is driven by AI, brands will become more empowered to build campaigns that meet their business goals.[/vc_column_text][/vc_column][/vc_row] ### SD-WAN: the marketing myth [vc_row][vc_column][vc_column_text]We have seen a huge surge in a new software-defined wide-area-network strategy (SD-WAN) in the last few years with many large scale corporations shouting at the rooftops at how the new strategy is going to make networking and communication across large areas more efficient and secure. Of course, when you put it that way, it would be stupid to not jump on the bandwagon and spend big money to adopt it.  However, it seems that this marketing hype is causing many professionals to be led to making decisions that aren’t right for their business. As technology develops, more options are becoming available - which is absolutely fantastic - but the risk of marketing hype clouding decisions is becoming bigger and causing more IT professionals to make the wrong decision.  By demystifying the hype around SD-WAN, it will be clearer to everyone whether it’s required or not.  
Narrative As we become more reliant on the internet for large scale migration to the cloud, traditional WANs have fallen out of favour because of their limitations across enterprise, branch and data centre environments. An SD-WAN uses a centralised control function and software to guide traffic across the WAN, typically over MPLS and other internet services, to increase performance and enhance business productivity. This means that decision-makers and business leaders can have better visibility across their network and understand exactly what is going on in real time, as well as manage an international network at speed. Furthermore, it allows customers to upgrade and enhance without any changes to the infrastructure, mix and match network links without a single bandwidth penalty, as well as provide greater security across the board. On the face of it, SD-WAN sounds extremely desirable and it’s quite clear why businesses would want to spend money on such a service. However, not everything is as it seems and by opening the bonnet, we can reveal the truth about SD-WAN. Reality There was always going to be a natural progression of networking and how it would help enterprises across industries as we all learn to understand how to make better use of the data available. Yet, many technology vendors have taken advantage of this wave and bundled together existing technologies into a sparkling new package called SD-WAN with a great graphical user interface (GUI). These technologies comprise PfR (Performance Routing), NBAR (Network-Based Application Recognition) for application awareness, a traditional Layer 3/4 firewall, IP-SLA, object tracking and per-packet/session load-balancing for good measure. The SD-WAN solution automates these technologies, which makes it more efficient and cost-effective. This does sound desirable, but what isn’t desirable is the fact that companies are completely ditching existing infrastructure such as MPLS in favour of SD-WAN, which removes all of their back-up support systems; as the internet becomes more fragile, this could have catastrophic consequences. Furthermore, many vendors pushing SD-WAN are based in the US and, as such, the technology has been built to be effective on advanced infrastructure. The current situation in the United Kingdom simply won’t keep up with the speed required to make sure that it works effectively; 9% of the country still hasn’t got access to the 4G network. This causes issues in providing real-time services, such as voice, which need to be instantaneous; running them solely over an internet that is already stretched far beyond its means could have serious consequences for businesses and their operations. Build or buy? To try and combat the hefty price tag of outsourcing their SD-WAN, many companies might decide to build it themselves. This does provide IT professionals with the power to make it completely bespoke and ensure that it totally suits the business needs rather than having to answer to an outsourced provider. However, it must be noted that businesses doing this can put a huge strain on an IT department. The department will have to learn, plan, design and implement, and from there, maintain it to the highest quality standard. This means that it is essential to have a consistent team that can troubleshoot the network across the whole enterprise, which can be a vast challenge.
Of course, companies can’t guarantee that they are going to have a consistent IT team and will be constantly up-skilling new members to maintain a smooth operation. Consequently, vendors are charging extortionate amounts of money to remove this responsibility from businesses.  It goes back to the most important point, understand what you need and then decide from there.  On initial inspection, SD-WAN does seem to provide huge benefits for companies but once we understand its true form, we can see that it isn’t all it’s hyped up to be. SD-WAN isn’t fully developed and still raises some security and efficacy concerns. For companies that have fallen for the marketing hype and are ditching their MPLS and adopting an SD-WAN-only strategy, they need to be prepared for the worst. [/vc_column_text][/vc_column][/vc_row] ### Why cybersecurity is essential for physical security peace of mind [vc_row][vc_column][vc_column_text]The era of connected devices promises many things: by combining digital data from previously unconnected analogue systems, we can deliver new insights and innovations to improve efficiency and safety in our homes and workplaces. The first rule of innovation, however, must be to “do no harm”, which in the case of the Internet of Things (IoT) means getting cybersecurity basics right first. The momentum is shifting rapidly in the fields of video surveillance and physical security. While the journey to the cloud had been slower than in other sectors, increasingly CCTV, access control and audio systems are becoming network-connected, and therefore integrated into IT networks and processes to deliver more intelligent physical protection. But this move to the cloud also creates many more business opportunities. Real-time video data from network connected cameras, for example, can be combined with information from other systems to develop new sources of business value via cloud analytics. This can be seen in retail stores, which commonly deploy network surveillance cameras to prevent or identify shoplifters. Connecting those same cameras to a cloud analytics suite can also give retail managers the ability to automatically detect queue build-ups and generate alerts, or gain a better understanding of traffic flow around a store. The ever-present risk of connected systems But if businesses are wary of connected cameras, they have good reason to be. The world’s largest Distributed Denial of Service (DDoS) attack, which disabled many popular internet applications such as Dropbox, Netflix, Uber and more in late 2016, was launched in part from hundreds of thousands of compromised network surveillance cameras. The Mirai malware which directed the attack took control of networked devices by testing common default username and password combinations, such as admin:admin. Two and a half years after these major incidents, it’s still the case that too many professional grade physical security products are being sold with little to no consideration for cybersecurity best practice, and still failing basic oversight tests such as default credentials. The emphasis remains on speed to market and reducing costs. There’s not enough attention paid to quality control, “secure by design” product development and effective processes for post-sale support with firmware updates and asset control. This critically undermines the potential of these devices to deliver. 
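As a purely illustrative sketch of the basic oversight test described above, the snippet below checks devices that you own and are authorised to test against a handful of factory-default credential pairs of the kind Mirai abused. The device addresses, the credential list and the HTTP digest-auth assumption are placeholders; a real audit would use the vendor’s own management API or an ONVIF client.

```python
import requests
from requests.auth import HTTPDigestAuth

# Factory-default pairs of the kind Mirai tried (illustrative subset only).
DEFAULT_CREDENTIALS = [("admin", "admin"), ("admin", "12345"), ("root", "root")]

# Inventory of devices you own and are authorised to test (placeholder addresses).
DEVICES = ["192.0.2.10", "192.0.2.11"]

def still_on_defaults(host, timeout=3):
    """Return the first default pair the device accepts, or None."""
    for user, password in DEFAULT_CREDENTIALS:
        try:
            # Many IP cameras expose an HTTP admin interface protected by digest auth.
            resp = requests.get(f"http://{host}/", auth=HTTPDigestAuth(user, password), timeout=timeout)
            if resp.status_code == 200:
                return user, password
        except requests.RequestException:
            return None  # device unreachable - skip it
    return None

for host in DEVICES:
    found = still_on_defaults(host)
    if found:
        print(f"{host}: still accepts default credentials {found} - change them immediately")
    else:
        print(f"{host}: no default credentials accepted (or device unreachable)")
```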
If the promise of IoT in physical security is that of a better, safer world, it can’t be achieved if equipment is liable to introduce new vulnerabilities into a client’s network. OEMs need to put in a lot of effort to ensure devices are fit for purpose and can be trusted for innovation. It requires investment in skills and processes internally, and clear communication and transparency throughout the entire supply chain. Does every component meet the rigorous standards that a customer has a right to expect? Just as important, however, is the investment that needs to be made into end user education. The market is highly price sensitive, and customers need clear guidance to understand the risks involved when purchasing equipment from unknown vendors that isn’t configured correctly. And beyond that, awareness must be driven beyond the IT department. Everyone in an organisation needs to understand the importance of correct procurement processes, as the falling cost and simplicity of use for modern IoT devices makes them difficult to control. A CIO may have the best strategy in place, but if a business manager is purchasing equipment on a credit card and adding them to the corporate network, who is responsible for ensuring the products are safe? Regulations have increased awareness That said, businesses are becoming more aware of the cybersecurity risks that network connected devices present, and 2018 saw much regulatory change to counteract the threats. The General Data Protection Regulation (GDPR) began being enforced, as did the Directive on security of network and information systems (NIS Directive). Both directives impose the same substantial financial penalties for non-compliance. Large monetary fines have the potential to debilitate businesses, so it is imperative that the relevant companies undertake due diligence in meeting its requirements. We have already witnessed GDPR fines being imposed across the UK and EU related to the deployment of CCTV systems. It was recently reported that hackers in the UK broke into schools’ CCTV systems and streamed footage of pupils live on the internet. Understandably, this gained a lot of negative publicity; after all, we send our children to school with the expectation that they will be safe, and the security systems are there to protect them rather than put them at risk. Selecting the right technology partners is critical In our converging security landscape, selecting the right technology partner has never been so important. Security practitioners need to acknowledge that cybersecurity isn’t just an IT issue and understand the associated cybersecurity risks to a business related to the deployment of physical and electronic security systems. While the digitisation of physical security is a tremendously exciting space to be in at the moment, for it to continue, as an industry we need to better address the issue of cybersecurity, and soon.[/vc_column_text][/vc_column][/vc_row] ### Are you harnessing the power of your business data effectively? [vc_row][vc_column][vc_column_text]Cognitive computing is leading society on a pathway that is genuinely set to revolutionise how we live, work and behave. Previously, new ways of communicating online were big business, now Artificial Intelligence (AI) and machine learning take centre stage. These technologies are driving unprecedented developments in medicine, education and finance, as well as relieving us from a wide range of everyday tasks. 
AI is heralding the fourth industrial revolution and will quickly become indispensable. The opportunity is there for businesses to embrace, but with this comes a warning: if you aren’t taking advantage of AI now, your competitors are. It’s a case of being a disrupter, not disrupted. So, how can you realise the benefits of intelligent technologies today? Know what questions to ask The nature of business is to look forward: at new technology, new products to launch and new customers to win. But to take that step forward with confidence, an organisation must agree on the business insights that will bring the biggest competitive edge. Even with AI, asking the wrong question will always give the wrong answer, while asking the right questions of your data will deliver meaningful results. Think about the business issue that is keeping you up at night, or what you’re most worried your boss will ask you. Imagine how solving that problem will make a difference. Perhaps it will help your company grow or maybe it will save time and money. Harnessing AI may accelerate innovation, or it may help to protect your company and your customers. Cognitive computing consultants can help you identify these critical issues so that you can take the next step in solving them. Undertake an audit/inventory of your data Once you know which questions you want to ask, the next step is to audit your corporate data to find the information that might hold the answers. Most IT teams are used to dealing with ‘structured data’. It’s organised in a format that’s easy for machines to process, such as a database or a spreadsheet. Historically, limited compute power meant that only structured data could be analysed effectively. This left organisations with vast pools of valuable but untapped ‘unstructured data’ such as images, documents, emails, video, audio and social media posts. The lack of a pre-defined structure made extracting answers from this data extremely time consuming, resulting in it being left on the cutting room floor. Advances in high-performance technology mean that neural networks can now work with unstructured data, harvesting information that wasn’t accessible before and drawing out valuable insights previously hidden from the organisation. As a result, Gartner predicts that the volume of data will grow by over 800 per cent over the next four years and the vast majority of it (80 per cent) will be unstructured. AI means that you no longer need perfectly structured data to get answers. The technology can work with all types of data; it’s mainly about locating it, labelling it and making it available. Depending on the complexity of the business challenge, many data sources may need to be lined up, but there are ways around not having your own specialist data scientists. Working with an enterprise performance partner who has access to a pool of data scientists is one of them. Train your AI model Once you’ve figured out the problem and located the information to solve it, the next step is to work on developing your AI model. The first stage is to provide training data that contains the correct answers. For example, creating a model that recognises different vehicles requires images of hatchbacks that are labelled ‘hatchback’, images of tractors that are labelled ‘tractor’ and so on. Given accurate training data, machine learning algorithms will find the patterns that can predict the correct answers. Applying those patterns to raw, unlabelled data is where the AI model shows its value.
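Here is a minimal sketch of that train-then-apply pattern, using scikit-learn and its bundled handwritten-digits dataset as a stand-in for labelled vehicle photos; a real project would swap in its own images and labels, but the shape of the workflow is the same:

```python
from sklearn.datasets import load_digits
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Stand-in for labelled training images: each sample is a flattened image,
# each label is the "correct answer" a human has attached to it.
X, y = load_digits(return_X_y=True)

# Hold some data back to play the role of new, unlabelled records.
X_train, X_unlabelled, y_train, y_held_back = train_test_split(
    X, y, test_size=0.2, random_state=42
)

model = RandomForestClassifier(n_estimators=200, random_state=42)
model.fit(X_train, y_train)                # learn patterns from the labelled data

predictions = model.predict(X_unlabelled)  # apply those patterns to unseen data
accuracy = (predictions == y_held_back).mean()
print(f"Accuracy on previously unseen samples: {accuracy:.1%}")
```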
Proving the conceptual model can be done with a relatively small set of data, however, to support robust decision making, comprehensive data sources will need to be used. The larger the data set, the fewer anomalies you’ll get. Use too little data, and you’ll have an answer based on what that set of data is telling you, but not reflective of a true business pattern. Choose affordable compute power to develop your AI capabilities The volume of data you need to trawl through when training your AI model requires a powerful amount of compute, but once the model is trained to do its job, running it against real world data is not nearly as processor intensive. If you’ve invested lots of money in the kind of high-performance infrastructure needed to train your machine learning models, the chances are you’ll be left with a large chunk of it unused when that first phase is over. An alternative approach is to use a cloud-based AI platform to do all the heavy lifting from the outset in gathering and analysing data, avoiding a considerable capex investment. Designed specifically for this type of data-intensive workload, these specialised cloud services offer ‘pay as you use’ access to some of the most powerful servers commercially available. It’s important to work with a partner that can find you the right cloud environment for your AI model. Cost, performance and cybersecurity concerns need to be finely balanced. For example, once the model’s up and running, it can be hosted either on-premise or on a private or public cloud or a multi-cloud combination depending on what’s best for the application. Real life scenarios of AI and cognitive computing AI isn’t limited to any particular field or function and can boost the capabilities of any sector and sphere in the world. The public spotlight often shines on high profile examples such as analysing digital images for early cancer diagnosis, natural language processing in Siri and Alexa or the predictive capabilities of a self-drive Tesla, but machine learning can transform almost any market and often in seemingly mundane ways. In retail, an AI-based personal sales assistant can prevent shopping carts being abandoned before a purchase is made by delivering real-time product targeting. Modelling millions of users every day can predict a shopper’s intent. It’s then possible to match a brand’s product offering to an individual’s preference and increase conversions. Image processing can reduce waste in ready-meal manufacturing by sorting potatoes by size, so chips are made from the longest ones, hash browns from the medium sized, and mash from what’s left. University students’ progress can be tracked and improved by analysing assignments and recommending personal study where knowledge gaps are identified. Booking flights, hotels and rental cars is simplified when a chatbot can answer traveller queries. The process of sifting through vast quantities of data and spotting correlations and inconsistencies can be used to make predictions and solve complex problems for any business. Often the inspiration for successful AI project comes from exploring examples in other companies. The route to lean innovation There is now so much that the wider enterprise world can do with AI that just wasn’t possible five years ago. Plenty of pre-bundled software now exists from the main ‘cognitive in the cloud’ players such as IBM, Microsoft and AWS. 
For example, an application to recommend podcasts can be stitched together with modules that transcribe audio to text, search text for keywords, index podcasts based on keyword hits and display the results in a colour-coded dashboard. Proof of concept facilities are also being made available by the main players that allow you to undertake a ‘trial’ with a much smaller dataset. This means that you can provide a business case to the board and demonstrate the business functionality of your idea without investing large sums of money upfront. These services are now becoming more widely available demonstrating the many possibilities that AI can bring to an organisation. Imagine being able to sift quickly through your business data to recognise patterns and discern inconsistencies so that you can then make predictions about your business. Well, that time is here. Now is the time to uncover the insights in your data that will give your company its perpetual edge over competitors.[/vc_column_text][/vc_column][/vc_row] ### The Meaning of Influence Maybe it’s time to explain in more depth the meaning and importance of influence, as well as the best ways of measuring it. 1) Why is influence so important? What’s all the fuss about? The world has changed – the way that we all purchase things is very different from the way that we used to do so and unless your marketing strategy and capability has adapted and evolved to address this change, you are being left behind. In the past our marketeers focused on big events, PR and advertising, while sales teams were focused on account based selling. As purchasers we might have reached the consideration point and then attended a trade show, visiting the stands of all the main vendors to find out more about what they had to offer, before reaching the point of intent or evaluation. Buyers simply don’t act this way any more. They do all their research online, searching via google, researching via social media and seeing what the so called experts have to say and drawing up a short list without having seen or spoken to any vendors. Signs that you’re stuck in the old world are: You’re still attending trade shows and wondering why all the foot-fall is other vendors. Why does it feel so empty? Where are all the punters? You’re still spending money on adverts in publications or banner ads on news sites. Even children these days have learned to tune out when they see adverts. Why would you think that buyers would be any different? You’re still focusing on media and analyst relations as if press and analysts were still the only real influencers? Worst of all though: you’re kidding yourself that you’ve moved with the times by posting content on your web site and by setting up a corporate Twitter account. When the reality is that 90% of all content marketing gets NO reaction - a massive wasted effort and opportunity cost. [easy-tweet tweet="The reality is that 90% of all content marketing gets NO reaction" user="billmew and @comparethecloud" hashtags="marketing"] 2) What is Influencer Relations really all about? Influencer Relations is all about identifying and reaching the people that have real influence, and sharing content with them that will have real impact and really get their attention. Influencer Relations needs to be an integrated part of the new marketing continuum that includes both Content Marketing and Social Selling.  
For all of this to be effective you need real data-driven insights and answers to the questions below: 3) So how do you measure influence and how can you identify the real influencers? Here are a few pointers that should help you differentiate the Compare the Cloud influence ranking from the alternatives: Where are they looking? There are many tools out there that will give you various statistical analyses based on social media data, but social media, while important, is only part of the equation. We are very active on social media and are massive advocates of its use, but we are realistic in the share of voice and attention that it represents. Our monthly analysis looks at a far broader set of media. The balance between these is dynamic and based on the volume, level of opinion and real influence being applied across each at any time: currently this balance for cloud computing is as follows: 22.6% from blogs, 74.4% from news articles and 3.0% from social media. Any ranking that purely looks at social media is only assessing 3.0% of the overall spectrum. What are they measuring? For our rankings we use tools that measure influence as it is demonstrated in public news, articles, discussions, and social media. We apply patented natural language processing algorithms to first extract opinions that have been publicly shared relative to a specific topic definition, then determine who held that opinion, and finally calculate an influence score for the opinion, opinion holder, and the topic. This process is repeated every day, using the full-text articles from over 3 million sources including news, blogs, and social media. For Cloud Computing we are currently tracking 637,607 identified opinions, and the day to day variations in our influencer analysis are shown in the graphs that accompany our rankings. By comparison, a table that focusses on social media without isolating individual opinions and analysing each in turn simply analyses tweets mentioning certain prominent hashtags and counts the number of followers. What results do they get? Our rankings change from month to month and are closely aligned to the news agenda. We provide a ‘Global Top 50’ list of the leading #CloudInfluence organisations and individuals and can at any time identify the main opinions expressed by each. If you’d like to scrutinise our results or commission us to do a more in depth analysis of any market segment then we’d be happy to discuss this with you. Important conclusion Don’t let misleading social media based analysis put you off or misdirect you. The thing is that Influencer Relations is an important part of the modern marketing continuum and that identifying and reaching the right influencers is critical. Instead, get the right advice and focus back on the real data-driven insights that we listed earlier that you need to be able to address in order to be effective. ### The Future Of Work? Employees Will Be Gamified + In Control We’re seeing major social change across Britain and globally as people redefine the pattern of work.
More and more workers seek the flexibility to choose where and when they work to achieve a better work-life balance, and the result is a rapidly growing gig economy. The gig economy provides the worker with total control over their work-lifestyle, with ever-increasing levels of compliance and protection thanks to crowdwork sites. Most industries are seeing some level of adoption of the gig economy and this is only likely to grow. According to the Office for National Statistics, the number of self-employed people in the UK has risen by 45 percent since 2001, with more than 15 percent of the UK labour force classed as self-employed in 2017. Since 2010, there has been a 25 percent increase in non-employer businesses within the UK’s private sector, a greater and faster rate of employment growth than other private sector SMEs. For regulated industries such as security, cleaning and logistics, however, the concept of a flexible, on-demand workforce has been overlooked. Because, despite years of global innovation, technology has been unable to infiltrate the regulated service industries with most in the sector fearing digital transformation and excessive compliance red tape. Until now. The birth of Labour-as-a-Service It’s clear that to succeed in a modern, digitally-focused world, the services industry must adapt and transform to the needs of the modern employee, who are increasingly looking to gamify their lives. Many workers use the gig economy model as a secondary source of income to supplement their shift-based work and enable a better standard of living. For example, people on zero hours or part time contracts may look for additional flexible shifts at evenings or weekends and they need a quick and easy way to find job openings and flexible roles. Just like in the goods industry, service workers are increasingly dictating the way they want to work; flexibly, on smart phones and for a range of different companies. They no longer want to go searching jobs boards or trekking to a high street recruiter. They want a smarter way of working that puts them in control and means they can find work, cover shifts and get paid in a much more streamlined and efficient way. Enter Labour-as-a-Service (LaaS). Technology that powers the on-demand service sector Technology that can disrupt and empower the services industry is ready and raring to go, with artificial intelligence (AI), gamification and apps the three main drivers. It’s not particularly revolutionary to suggest this; it’s a trend that’s clearly visible and successful in other markets and sectors. The modern worker wants to gamify their life and their work, whether they are part of a permanent or temporary workforce. The regulated service industries can meet this demand by engaging with and motivating their on-demand staff using game theory and designs. It can also support the candidate screening and job application process – a major compliance hurdle in the regulated industries. Along with compliance concerns with the gig economy model of working, the perceived lack of commitment and loyalty a temporary member of staff is likely to show is another barrier for the regulated services industry. Using gaming principles such as profile rating, performance-related milestone achievements, behavioural quizzes and community engagement, employers can improve performance, commitment and interest in a firm among a transient workforce.  In turn, building meaningful relationships, boosting employee motivation and even supporting training and productivity. 
Critically, in a candidate-driven market, it allows the service industry to reward on-demand staff for good practice, empowering and incentivising them to drive higher standards. Accessing verified and vetted candidates for immediate short-term work has always been a major challenge for regulated industries. Gamification can speed up this process, testing skills such as accuracy, time management, creative thinking and logic. These modern strategies will enable firms within the services sector to establish a real point of difference at a time when unemployment is at its lowest. Future fabric of employment The DNA of Britain’s workplaces will continue to redefine itself, as people strive for better lifestyles, modern benefits and a fair work-life balance. The gig economy is one step towards achieving this, and it is working – for workers, for business, for Britain. But the regulated services industry is playing catch-up to the likes of Uber, Deliveroo and JustEat. For a sector that is ripe for innovation, now is the time to modernise, digitise and transform to meet the needs of a rapidly expanding gig economy model of working. ### Reasons for the hybrid cloud: disaster recovery and cost Hybrid cloud has evolved into a major differentiator in today’s technology landscape. Whilst security was often touted as a major challenge for those businesses considering cloud adoption, the hybrid cloud model has ushered in greater agility and security compared to the other cloud models. Whilst hybrid cloud has emerged as the next step in the evolution of cloud computing, what do we mean when we talk about the hybrid cloud? Is it just a varied number of IT solutions offered up by vendors looking to close more sales in a very competitive environment? The ‘hybrid’ nature usually refers to the location of the cloud, i.e. on-premise or off-premise, whether it is private or shared to some extent and even what type of services are being offered. So, in effect, a hybrid cloud refers to a mixture of on-premises private cloud resources combined with third-party public cloud resources with an accompanying level of management therein. Many businesses will seek to balance their workload mix across both on-premise and off-premise options to leverage a more efficient use of IT resources in order to meet business objectives. For organisations running diverse applications over different legacy systems, hybrid cloud is best placed to manage their complex IT landscape. Enterprises are looking to boost productivity by hosting critical applications in private clouds and applications with fewer security concerns in public clouds, providing environments that are secure, cost-effective and scalable. I see two main reasons for businesses to move to the hybrid cloud. One would be for disaster recovery purposes. You want to run your production or your environment in more than one location using more than one provider; one option that allows you to do that is running a hybrid cloud. Whether it's active or passive, you can run your environment in two locations - one in the cloud and one on premise. In this way you achieve full redundancy along with a disaster recovery plan. With hybrid cloud, companies can use the public cloud as a place to replicate their data to, whilst keeping their live data in the private cloud.
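As a simple illustration of that replication pattern, the hedged sketch below copies a directory of on-premise backup files to a public cloud object store (an assumed S3 bucket, via boto3). The paths, bucket name and file pattern are placeholders, and a production setup would add encryption, lifecycle policies and regular restore testing.

```python
import pathlib
import boto3

# Assumptions: AWS credentials are configured, and this bucket exists purely
# as a disaster-recovery replica of the on-premise backups.
BACKUP_DIR = pathlib.Path("/backups/nightly")
BUCKET = "example-dr-replica"

s3 = boto3.client("s3")

def replicate_backups():
    """Copy every nightly backup file to the public-cloud replica bucket."""
    for backup_file in sorted(BACKUP_DIR.glob("*.dump")):
        key = f"nightly/{backup_file.name}"
        s3.upload_file(str(backup_file), BUCKET, key)
        print(f"replicated {backup_file} -> s3://{BUCKET}/{key}")

if __name__ == "__main__":
    replicate_backups()
```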
Using hybrid cloud to improve your disaster recovery capabilities really means that you are using cloud disaster recovery, but your live system is in a private cloud. Perhaps one of the key considerations for using hybrid cloud as a disaster recovery solution is that the recovery process is complex at the best of times, and failover from your live site to a public cloud requires a lot of planning. The second reason is cost: you may want to move the more expensive portion of your setup to the data centre. Sometimes a specific service produces a lot of outgoing traffic, which is relatively expensive in the cloud but costs almost nothing from a data centre; in that case you can split your environment into a hybrid one, running the costly services via the data centre. If you consolidate hardware and software, then virtualisation can reduce your costs significantly. And hybrid cloud provides scalability: once you outgrow a server, you can scale up alongside your live system. In a hybrid cloud environment, it is possible to obtain additional efficiencies and further reduce the over-provisioning of IT resources while also maintaining the on-premise option. It is now possible to lower overall IT resource consumption and shift those loads to the lowest cost site. So in low use periods, the lowest cost option would likely be on-premise and in periods of peak loads, it would be a mix of both on-premise and the lowest cost cloud provider. In the continuing evolution of cloud computing, it is clear that hybrid cloud adoption is only going to increase. Gartner has already predicted that by 2020, the vast majority of businesses will have adopted a hybrid cloud infrastructure. Clearly technology is a massive consideration when you start seeing the incorporation of automation, machine learning and artificial intelligence into cloud platforms along with the way in which environments will be managed and maintained. But is hybrid cloud just about the technology? Sure, technology will be updated and improved but ultimately hybrid cloud is about a mindset. A mindset that is focussed on outcomes, ensuring that the business delivers on its costs and objectives. ### The multi-cloud paves the way for AI and ML The multi-cloud is happening now The rise in off-premises cloud services is transforming how businesses serve their customers. New mobile apps, websites (e.g., Facebook, YouTube, Twitter, LinkedIn) and services (e.g., Amazon, Skype, Uber) are consolidating decision-making. People who need to act in real time can now gain insight from non-traditional data sources, infusing it into their business processes. The following three major trends have come together, making it possible for enterprises of all sizes to apply analytic techniques to business processes: More data is available from an expanded ecosystem of diverse sources, including connected devices and social media. Location tracking and monitoring individual behaviors and news feeds allows for more accurate prediction of individual needs that vary from moment to moment. Software vendors and cloud service providers (CSPs) are moving to package analytic technology in an easy-to-consume format, allowing enterprises to apply the technology without requiring highly skilled data scientists.
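To illustrate just how little effort those packaged services now demand, here is a hedged example calling a managed sentiment-analysis API (Amazon Comprehend via boto3); the sample text and region are placeholders, and AWS credentials are assumed to be already configured in the environment:

```python
import boto3

# Assumes AWS credentials and permissions for Comprehend are configured.
comprehend = boto3.client("comprehend", region_name="eu-west-1")

feedback = "Setup was painless, but the support team took days to reply."
result = comprehend.detect_sentiment(Text=feedback, LanguageCode="en")

print(result["Sentiment"])       # e.g. "MIXED"
print(result["SentimentScore"])  # per-class confidence scores
```

A few lines like these replace what would once have required an in-house NLP team, which is exactly the "easy-to-consume" packaging the trend above describes.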
Off-premises cloud services, infrastructure-as-a-service (IaaS) offerings like Amazon Web Services (AWS), cloud as a service (CaaS) products like Google Container Engine, and platform as a service (PaaS) offerings like IBM’s Cloud have become more affordable. Enterprises now have access to the large-scale compute environments needed to provide the required high-volume computation. The need to store, process, analyse and make decisions about rapidly rising amounts data has led enterprises to adopt many off-premises cloud offerings that can automate and improve business processes. IT organisations now must evolve alongside this new deployment architecture, allowing for greater agility and adoption of new technologies. Enterprises are not only diversifying the location of their cloud services, from on-premises to off-premises, they are also increasing the number of cloud service providers (CSPs) used to manage their IT needs. North American enterprise respondents to the recent IHS Markit Cloud Service Strategies survey reported they expect to use an average of 13 different CSPs for SaaS and IT and 14 for IT infrastructure (IaaS and PaaS) by 2020. In effect, enterprises are creating their “cloud of clouds,” or multi-cloud. A great deal of variety exists between providers’ offerings, with specialised players meeting particular needs. Get ready for AI and ML: cloud first, then on-premises CSPs offer a variety of artificial intelligence (AI) and machine-learning (ML) solutions from the cloud, by deploying within their infrastructures a combination of CPUs and co-processors, memory, storage and networking equipment, providing customers with an abundance of applications to choose from. Users are taking advantage of these advanced applications, delivering services for use cases like facial, video, and voice recognition for smart cities. Another example is online retail, which uses augmented and virtual reality (AR/VR), image detection, and other innovative technologies. Many major CSPs have introduced new and updated AI- and ML-based products in the past year, including the following: Google Cloud: Vision API 1.1 recognises millions of items from Google’s Knowledge Graph and extracts text from scans of legal contracts, research papers, books and other text-heavy documents. Google Cloud Video Intelligence identifies entities and when they appear in the frame, using TensorFlow, which enables developers to search and discover video content by providing information about objects. For more information on this topic, read “Google Cloud Next 2017: Making Machine Learning Ubiquitous,” from IHS Markit. AWS: Amazon Transcribe converts speech to text and adds time stamps notifying users when certain words are spoken. Comprehend transcribes text from documents or social media posts, using deep learning to identify entities (e.g., people, places, organisations), written language, the sentiment expressed, and key phrases. Rekognition Video can track people, detect activities, recognise objects and faces, and select content in videos. For more detailed information, read “AWS re:Invent 2017: Amazon’s Alexa Becomes Business Savvy.” The use of AI and ML requires that new server architectures include specialised silicon co-processors that can accelerate the parallel computations needed for AI and ML. IHS expects unit shipments of servers with general-purpose programmable parallel-compute co-processors (GPGPU) will make up 11 percent of all servers, exceeding 1.7 million units in 2022. 
In effect, the use of co-processors crosses an important 10 percent threshold, where the technology passes from early adopters to mainstream buyers. At this point, on-premises enterprise operated data centers are expected to also contain GPGPUs; thus, AI and ML workloads that were born in the cloud will also be executed on-premises. The following factors were included in this IHS Markit forecast: Generation-on-generation improvement of co-processor performance makes servers with co-processors more attractive to customers than traditional CPU-only servers. Co-processor options are multiplying, giving customers choices. With a long-term roadmap developing, there has been an increase in vendors offering servers with co-processors, which is creating even more choices for customers. Multi-tenant server software continues to add features, making it possible for customers to virtualise co-processors, increasing the utilisation of servers shipped with co-processors. The bottom line Enterprises are looking for cloud service providers, so they can adopt AI and ML in their business processes as the enterprises go digital. In response, CSPs looking to differentiate themselves have introduced many services with embedded AI and ML. New offerings provide enterprises with a lot of choices, and they have responded by using a variety of cloud service providers, rather than relying on a single provider. They are effectively building a multi-cloud where their software workloads are placed in the specific data centers best optimised to execute them, including their own on-premises data centers. In the next five years, the use of AI and ML will become mainstream. Enterprises will leverage compute infrastructures from a variety of cloud service providers and from their own on-premises facilities. Server architectures have already changed over the last 12 months, with buyers preferring higher end servers with more memory and storage, as data sets increase in size. Servers have also been shipping with additional co-processors for parallel compute, to support AI and ML workloads, and that trend will continue.[/vc_column_text][/vc_column][/vc_row] ### Removing the complexities of hardware with virtualisation [vc_row][vc_column][vc_column_text]As business software advances faster than ever before, the modern workforce demands IT infrastructure that is simple, secure and flexible. That said - due to inevitably longer product cycles - the hardware powering these workplaces is in many ways slow to adapt to the increasing demands of modern applications.  The hardware challenge A recent report by Curry’s TechTalk sheds light on the effects of slow workplace technology, with workers across Britain spending an average of 20 minutes a day opening software and programmes - adding up to a whopping 10.8 days per year. Now, businesses that rely heavily on outdated, on-premise hardware to keep staff productive risk not only falling short on performance, but also risk failing to cater to the needs for a more flexible way of working.  Amidst often dwindling IT budgets, it simply isn’t always feasible for businesses to invest in regular hardware upgrades to facilitate the sophisticated software needed to operate across many roles. Take the example of design-based industries like construction or manufacturing, which require complex and graphically-intensive applications such as AutoCAD to develop high-spec drawings.  
These programmes can quite easily falter when run on hardware that is not fit for purpose, with potential to not only disrupt the user experience, but increase the margin for inaccuracy. What’s more, this type of software would traditionally require staff to be present within the office in order to operate - leaving very little opportunity for flexibility or productivity elsewhere.   A virtual approach The question now for businesses is: how can we achieve all of this simultaneously and within budget? For those running complex on-premise applications, technologies such as Virtualisation can spark a shift towards a more SaaS approach and alleviate some of the common intricacies associated with business hardware. Virtualisation is by no means a new concept, however it seems that only recently are businesses realising its full potential to deliver more cost effective and smarter working.  Essentially, the term defines the creation of a virtual version of something, such as a hardware platform or operating system. Simply put, it is a technique that allows you to host multiple virtual machines, and each machine can be running different operating systems or applications.  By way of business value, this can help to eliminate the need for additional costly hardware; more servers on a machine reduces the need for physical servers, reducing hardware, space and power costs. Managing graphics within a virtual hosted environment also means that end users can operate on any device from anywhere. The result? An ability to run the most demanding software on even the poorest of devices – with the only real requirement being access to the internet and a licence for some specialist applications.  A new industry standard?  With more than half of businesses planning to use storage and application virtualisation by 2021, the opportunity is ripe for businesses to transform with managed virtual graphics and reduce the rate of interrupted working experiences at the hands of inadequate hardware.  Combine this with secure hosting technology and processing can take place in robust data centres, rather than the end user’s device to ultimately decrease reliance on high-spec hardware. This can also bring added security benefits; data is never stored on the end user’s device - giving IT decision makers peace of mind that all business-critical data can’t be compromised. This also begs the question as to whether the technology will be used as the key to flexibility across industries that rely on demanding software. Application isolation, greater workload profitability, improved scalability and accessibility are a handful of tangible outcomes which can make virtualisation attractive to business decision makers, whilst helping their staff to achieve more without investing in expensive hardware - no matter how complex the application.  Unlocking a flexible future Being able to access everything that users need remotely is still a new way of working across many industries. Previously, being able to access industry-grade applications on standard end-user devices has been a tedious and challenging experience. This often meant that working remotely - whether that be from home or even when commuting on the train - was anything but easy. Yet, with managed virtual graphics opening up complex software to the masses regardless of location or device, could the tide be changing?   
It goes without saying that flexible working is the future and any roadblocks to delivering on this trend  can really affect a user's productivity, satisfaction and quality of work. In CIPD’s 2019 UK working lives survey, 54% of UK employees work flexibly in some way, however two in three employees would like to work flexibly in a way that’s not yet available. By creating virtual versions of an on-premise hardware platform or application, users can maximise their outputs across any type of device and decision makers can reinvest budget that would otherwise be spent on implementing the latest, most robust on-premise hardware. By breaking down location barriers, businesses can successfully open workers up to a smarter working experience - one that isn’t bound by the limitations of outdated hardware. Implemented correctly, virtualisation can remove the difficulties of hardware by eliminating slow workspaces, encouraging staff flexibility and productivity, as well as mitigating the growing costs of updating machinery. Over the next few years, we may well even see virtualisation extend across all other areas of a business and offer an extremely attractive solution to smarter and easier working for employees and decision makers alike. [/vc_column_text][/vc_column][/vc_row] ### Hosted physical security delivers benefits for business [vc_row][vc_column][vc_column_text]Rodrigue Zbinden, CEO of Morphean explores the growing appetite for cloud-enabled security services and how businesses can benefit from the intelligent insights to improve operations. The term ‘security’ in the IT world is commonly taken to mean the safeguarding of software, systems and data. Yet cloud also brings physical security into the equation to better protect people, premises and assets. Through advancements in IoT technologies that enable the connection of IP devices to a network, the world of physical security that includes access control and video surveillance, is benefitting from cloud platforms. The Video Surveillance as a Service (VSaaS) market, in particular, is expected to reach USD $5.93b by 2022, growing at a CAGR of 22.0%, in part due to the intelligent insights that connected systems can deliver to improve operations for a business of any size. For many business leaders and IT managers, better security, cost benefit and better functionality were found to be the most influential factors and the most commonly realised benefits of hosted security solutions as cloud continues to transform and streamline business operations across every industry. Legacy IT systems, which are proving both expensive to maintain and increasingly problematic to secure, are being replaced by cloud-hosted solutions buoyed by the low cost set up, the flexible scalability on offer and the increasing demand for real-time and remote access to data.  Understanding the appetite for hosted security We know this because Morphean recently commissioned an independent survey into the attitudes and behaviours of 1000 IT decision makers across the UK and Europe, forming the basis of a new whitepaper, ‘2019 Landscape Report: Hosted Security adoption in Europe’. We wanted to gain a better understanding of purchasing intent for evolving security provision in the 2020s. Results revealed that as many as 84% of IT managers reported currently using (48%) or considering using (36%) cloud-based video surveillance (VSaaS) or access control (ACaaS) systems to enhance their security provision. 
Our survey also revealed that 36% of IT managers identified operational performance as a priority for improvement within the next year. When we consider the many aspects of operational performance, productivity will always be a key measure, and cloud facilitates leaner decision-making, being more scalable and customisable to adapt to the changing business landscape as required. Both VSaaS and ACaaS use a ‘pay-as-you-go’ model which is already the preference of the IT department, removing the large scale capital expenditure associated with on-premise IT equipment and shifting this to the cloud as an operational expense with all the cost, space and security benefits this brings. Unlocking potential for business application Video cameras, or other common physical security devices, are by their very nature ubiquitous at critical points of business infrastructure. This means that they are collecting vital data that can be utilised for a number of different purposes through analysis and combination with other sources of intelligence. Once management of physical security has been shifted to the cloud, an array of possibilities can be accessed depending on budget and business requirement, without the need to upgrade or replace expensive on-site hardware. Cloud-enabled physical security has much potential. A simple example might be: real-time analysis of visitor numbers, gender and age demographic in a retail store on a particular day. This data could then be cross-referenced with current promotional campaigns to gauge effectiveness, or used to identify insights into customer behaviour patterns in-store such as hot and cold zones. The same platform might ‘learn’ what in-store behaviours cause queue build-ups, and trigger alerts more readily to advise management that more till staff are needed.  Similarly, machine learning can be used to identify suspicious behaviour or recognise the start of a violent altercation between visitors and staff, to prevent an incident before it occurs. Building management could benefit too, as cameras could be used to shut down climate controls or turn off lighting in unused space. As more companies experiment with the potential, more use cases for using video surveillance and access control technologies in the cloud will emerge, widening the scope of this technology across multiple industries and delivering improved business intelligence and operational efficiencies. Facing the challenges and reaping the benefits  Both VSaaS and ACaaS offer customers a security solution that is not only invaluable for the management and protection of premises, assets and people, but also integrates seamlessly with existing IT infrastructure. For the business leader committed to improving operational performance across teams, systems and services, unlocking powerful business insights from these converged platforms makes good sense. In a world which is increasingly reliant on the cloud for its daily business needs, hosted physical security will ensure that all businesses, of any size, are more capable of meeting the wide range of challenges we face in our modern age. For further insights read our 2019 Landscape Report: Hosted security adoption in Europe https://morphean.com/whitepapers[/vc_column_text][/vc_column][/vc_row] ### Business intelligence in a multi-cloud era [vc_row][vc_column][vc_column_text]Until recently, the relationship between business intelligence systems and cloud providers has been simple. 
BI tools have traditionally been built with a single-vendor architecture in mind, allowing for straightforward integration with any one cloud infrastructure.  However, since the cloud market has matured and become mainstream, things have changed. Companies are now taking advantage of the many different possibilities available to them, and in turn, shifting the way that business intelligence tools are used. This has resulted in a new era, in which organisations are increasingly opting to take a multi-cloud approach to BI, rapidly embracing the benefits of a build your own, bespoke offering. Every organisation’s data environment is as unique as its business, and success requires an analytics platform that supports your unique data stack. Multi-cloud means multi-choice By next year, it’s expected that more than 80 per cent of all enterprise workloads will be cloud-based, with many businesses using cloud strategies to boost ongoing digital transformation efforts and achieve greater agility. While public cloud engagement will no doubt play a key part of this spike in adoption, organisations are increasingly looking for solutions that are capable of going beyond the usual deployments, seeking those that also support IaaS, SaaS and private infrastructures. It’s unsurprising, therefore, that enterprises are opting for a more tailor-made approach to meeting their cloud infrastructure needs, outside of current individual offerings. No longer required to pick just one of the vast amount of vendors and technologies available, 85 per cent of businesses are already choosing to operate in a multi-cloud environment. Over the next few years, as emerging functions such as Artificial Intelligence (AI), the Internet of Things (IoT) and Machine Learning (ML) continue to change the nature of business functions and operations, this will no doubt grow even greater. Business intelligence as a service (BIaaS) As part of the dynamic and evolving world of multi-cloud – in which personalisation reigns supreme – it’s essential that businesses are also entitled to flexibility when it comes to their data platforms. This freedom of choice allows organisations to change approaches as and when necessary, based on a variety of different infrastructures, applications and needs. While some BI features may appear similar on the surface, regardless of which vendor supplies them, access to capabilities can sometimes be limited, depending on what an organisation and its data team is trying to achieve. Through a multi-cloud business intelligence approach, enterprises can avoid being locked down by one vendor, instead leveraging a catalogue of strengths. This chop-and-change approach is also favoured for those wanting to keep costs down or stick to a specific budget, as vendors are forced to keep pricing low in order to remain competitive. With the playing field levelled, time-consuming and costly migrations are out of the picture.  Data development and dialect your way Because not every BI tool is created equally, an agnostic approach to multi-cloud databases is understandable and often necessary. If an organisation is modernising its data infrastructure, retiring old systems or even adding new technologies, then any platform being implemented needs to be able to support and simplify the process, not add to the confusion. 
This requires the ability to automatically change the SQL dialect as needed for the new system, meaning your existing data models and business logic can be reutilised in your new database, instantly relieving pressure. In a multi-cloud world, every database is different and speaks a slightly different dialect of SQL, so creating universally applicable SQL queries is near-impossible. Multi-cloud business intelligence platforms are much more likely to be able to take on this task, with many having already been integrated to support multiple versions of dialect and more being continuously added. Another new approach also involves abstracting your query from the underlying SQL dialect, allowing data teams to write a query just once. Not only does this leave one less thing for enterprises to carry out manually and on a regular basis, but it also takes away the burden of needing specific expertise within the company to manage the process. For example, HR tech company Namely use the same platform across two different databases as they migrate between enterprise data warehouses (EDWs). This gives them the benefit of not having to rewrite all their queries to make them work with a new database, which is crucial for data privacy and allows them to focus on putting data in the hands of users, wherever it’s located.  All too often, data tools lock users into a single way of viewing and utilising data, with many older approaches to dashboard analytics heavily limiting the value organisations can extract from data. To combat this, businesses need to ensure that they are deploying open systems, letting them deliver data in the way that they want, to who they want. In a multi-cloud environment, this means being able to take immediate action on data and connect users to the insights that are relevant and useful to them. Modern data platforms make it easy to operationalise data and insights, by plugging directly into workflows using connected systems like Slack and Jira, allowing users to deliver data in a setting where collaboration is done in real time. As the importance of multi-cloud for the future of business intelligence continues to accelerate, now is the time for organisations to work out what is important to them and go on to define their approach to data. Regardless of sector or department, BI’s growth cannot be ignored and neither can its role in businesses wanting to transform themselves digitally.[/vc_column_text][/vc_column][/vc_row] ### 4 Things to Look for in SAAS Application Development [vc_row][vc_column][vc_column_text]Today, more and more SaaS companies are taking off the ground, and that's great. The SaaS business is also growing at a very fast rate, and this keeps attracting more individual developers and companies. The organisations keep floating more applications in the cloud. But, what can you do if you need a SaaS application development platform for your business? How will you know an ideal SaaS application development company to work with? Are there some things that must be considered when developing such applications and what's their use? In this article, we will examine four things that your developer must examine to offer you the SaaS application development platform that suits the needs of your organisation. Multi-tenancy SaaS application development naturally offers services for managing data for different clients. This is likely to be accomplished on pieces of joint infrastructure. Security is, therefore, a significant concern. 
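As a hypothetical sketch of what that concern means in practice (the names and in-memory data below are invented), a tenant-scoped access check might look like this:

```python
# Hypothetical sketch of a tenant-scoped authorisation check; names and the
# in-memory "database" are invented for illustration only.
from dataclasses import dataclass

@dataclass
class User:
    user_id: str
    tenant_id: str   # the customer organisation the user belongs to

DOCUMENTS = {
    "doc-1": {"tenant_id": "acme", "body": "Acme price list"},
    "doc-2": {"tenant_id": "globex", "body": "Globex roadmap"},
}

def fetch_document(user: User, doc_id: str) -> str:
    doc = DOCUMENTS.get(doc_id)
    # Authorisation: a user may only read documents owned by their own tenant.
    if doc is None or doc["tenant_id"] != user.tenant_id:
        raise PermissionError("document not found or not visible to this tenant")
    return doc["body"]

print(fetch_document(User("u1", "acme"), "doc-1"))   # allowed
# fetch_document(User("u1", "acme"), "doc-2")        # would raise PermissionError
```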
Applications therefore need to be designed to enforce authentication and to require authorisation before sensitive resources can be accessed. These capabilities must be built in from the start so that users from one company can never see another company's information. Logging should also be in place to track access and modification requests, both to satisfy requirements that may emerge later and to demonstrate the impact of any breach.

Configurability
Even though SaaS applications are essentially "one-size-fits-all", they need to serve the requirements of a broad range of users and organisations. A developer therefore needs to plan for a high degree of configurability from the beginning, so that users can adapt the system to the specific requirements of their businesses. Successful configurability calls for highly flexible dashboards, but it also extends to areas such as custom data fields and labels that can be stored alongside the built-in objects.

Scalability and robustness
One key promise most SaaS providers make is that they can handle capacity management on the client's behalf, so the client never has to worry about provisioning additional resources or adding new users. This forces the developer to think through several concerns up front when architecting the system: How will additional storage be added? How will spikes in usage be accommodated? How will extra connectivity be provided? How will failures in any part of the system be handled? When these concerns are architected in from the beginning, painful upgrades and outages are avoided later on, which is a good sign that you are working with a capable SaaS application development company.

Connectability
One great advantage of a SaaS solution is that it can be customised and connected to other systems. A SaaS developer must therefore take the time to design, up front, the APIs (application programming interfaces) that will be visible to third-party developers who may want to extend the platform's capabilities or integrate it with other systems the firm uses. While this makes the SaaS platform more valuable, care must be taken not to introduce security issues that could expose the API consumers, the platform itself, and every organisation using it to risk. If concerns like these are addressed from the outset, your SaaS application development company will be able to anticipate issues and design solutions for them in advance, helping your company avoid expensive software rewrites later.

### Data analytics, hybrid cloud & stream processing: enablers for an AI-led enterprise

With AI and Machine Learning growing at a rapid pace, companies are evolving their data infrastructure to benefit from the latest technological developments and stay ahead of the curve. Shifting a company’s data infrastructure and operations to one that is “AI ready” entails several critical steps and considerations for data and analytics leaders looking to leverage Artificial Intelligence at scale, from ensuring that the required data processes to feed these technologies are in place, to securing the right set of skills for the job.
Therefore, companies usually begin their journey to “AI proficiency” by implementing technologies to streamline the operation (and orchestration) of data teams across their organisation and rethinking business strategy — what data do they actually need?  This is a natural first step for most organisations, given that Machine Learning and other AI initiatives rely heavily on the availability and quality of input data to produce meaningful and correct outputs. Guaranteeing that the pipelines producing these outputs operate under desirable performance and fault tolerance requirements becomes a necessary, but secondary step. As a recent O’Reilly Media study showed, more than 60% of organisations plan to spend at least 5% of their IT budget over the next 12 months on Artificial Intelligence.[/vc_column_text][vc_single_image image="51863" img_size="full"][vc_column_text]Considering that interest in AI continues to grow and companies plan to invest heavily in AI initiatives for the remainder of the year, we can expect a growing number of early-adopter organisations to spend more IT budgets on foundational data technologies for collecting, cleaning, transforming, storing and making data widely available in the organisation. Such technologies may include platforms for data integration and ETL, data governance and metadata management, amongst others. Still, the great majority of organisations that set out on this journey already employ teams of data scientists or likewise skilled employees, and leverage the flexibility of infrastructure in the cloud to explore and build organisation-wide data services platforms. Such platforms ideally support collaboration through multi-tenancy and coordinate multiple services under one roof, democratising data access and manipulation within the organisation. It comes as no surprise that technology behemoths like Uber, Airbnb and Netflix have rolled out their own internal data platforms that empower users by streamlining difficult processes like training and productionising Deep Learning models or reusing Machine Learning models across experiments. But how do companies step up their infrastructure to become “AI ready”? Are they deploying data science platforms and data infrastructure projects on premises or taking advantage of a hybrid, multi-cloud approach to their infrastructure? As more and more companies embrace the “write once, run anywhere” approach to data infrastructure, we can expect more enterprise developments in a combination of on-prem and cloud environments or even a combination of different cloud services for the same application. In a recent O'Reilly Media survey, more than 85% of respondents stated that they plan on using one (or multiple) of the seven major public cloud providers for their data infrastructure projects, namely AWS, Google Cloud, Microsoft Azure, Oracle, IBM, Alibaba Cloud or other partners.[/vc_column_text][vc_single_image image="51865" img_size="full"][vc_column_text]Enterprises across geographies expressed interest in shifting to a cloud data infrastructure as a means to leveraging AI and Machine Learning with more than 80% of respondents across North America, EMEA and Asia replying that this is their desired choice. A testament to the growing trend towards a hybrid, multi-cloud application development is the finding in the same survey that 1 out of 10 respondents uses all three major cloud providers for some part of their data infrastructure (Google Cloud Platform, AWS and Microsoft Azure). 
Without question, once companies become serious about their AI and Machine Learning efforts, technologies for effectively collecting and processing data at scale become not just a top priority, but an essential necessity. This is no surprise, given the importance of real-time data for developing, training and serving ML models for the modern enterprise. Continuous processing and real-time data architectures also become key when Machine Learning and other Artificial Intelligence use cases move into production. This is where Apache Flink comes into play as a first-class open source stream processing engine: built from the bottom-up for stream processing, with unbeatable performance characteristics, a highly scalable architecture, strong consistency and fault tolerance guarantees, Flink is used and battle-tested by the largest streaming production deployments in the world, processing massive amounts of real-time data with sub-second latency.[/vc_column_text][vc_single_image image="51866" img_size="full"][vc_column_text]Examples of such large scale use cases include Netflix, using Apache Flink for real-time data processing to build, maintain and serve Machine Learning models that power different parts of the website, including video recommendations, search results ranking and selection of artwork, and Google using Apache Flink, together with Apache Beam and TensorFlow to develop TensorFlow Extended (TFX), an end-to-end machine learning platform for TensorFlow that powers products across all of Alphabet. The journey to Artificial Intelligence proficiency might seem like an overwhelming and daunting task at first. Making the right investments and decisions upfront to nurture the right data engineering and analytics infrastructure, thinking cloud-based and considering stream processing as a real-time business enabler will help tech leaders to navigate through the journey successfully and accelerate their enterprise into an AI-led organisation of the future.[/vc_column_text][/vc_column][/vc_row] ### Adopting cloud securely: minimise risk; maximise performance [vc_row][vc_column][vc_column_text]The cloud has brought a range of compelling benefits to businesses, helping them improve performance, become more agile and increase efficiency. For these reasons, and more, its adoption has accelerated since the early days of cloud computing. However, as with all new technology implementations and change management programmes, the cloud comes with its own risks. Awareness of these, and a sound governance, risk and compliance (GRC) approach to cloud implementations, is essential for businesses to minimise risk and maximise business performance through cloud. Understanding risk According to a Cloud Security Alliance report (2017), the top three critical issues to cloud security are: data breaches insufficient identity, credential and access management insecure interfaces and APIs. Cloud adoption strategies must recognise these risks, assess the probability of business impacting issues arising from them, and have plans in place to mitigate any such occurrences. Failure to do so could mean not only the benefits sought from the cloud are not attained, but that the business fails to protect itself from damaging incidents that were potentially avoidable. GRC on the cloud is a way of ensuring that risks are completely understood and can be more effectively managed through a robust technology platform and the effective execution of risk management strategies. 
It is also a way of smoothly managing change – something that impacts all industries - to address evolving requirements. The GRC cloud platform A risk-based approach when adopting cloud computing helps minimise risk and enables management of data and business-critical applications in the cloud to be consistent with the enterprise’s risk appetite and strategic objectives. While cloud security controls, threat monitoring, vulnerability scanners and other tools help minimise risk, a GRC cloud platform goes further than this by bringing all these risk and compliance factors together into a single source of truth, enabling enterprises to have an integrated view of their GRC profile in the cloud. Aside from the scalability, cost-efficiency and agility of a cloud deployment, a cloud-based GRC platform also gives enterprises enhanced visibility into risk exposure and enables them to automate compliance and monitoring processes. Managing change Flexibility is a mainstay of businesses that are able to succeed in continually changing environments. Risk and compliance also continuously evolve – the type and nature of risks change as do compliance requirements and the regulations that businesses must abide by. Change has an impact on processes, ways of working, organisational structures and teams. To keep up, GRC management within enterprises must be flexible, configurable and scalable and it must enable action resulting from change to be handled cost-effectively, if the business is to adapt and grow. Data: confidentiality, integrity, availability One of the main benefits of cloud is its ability to store massive amounts of data and to provide anytime, anywhere, controlled access to it. Managing data confidentiality, integrity and availability is essential and each enterprise should have clear criteria and governance structures around this. The avoidance of data co-mingling is one such requirement and here, a multi-instance cloud architecture can be effective. It can maintain a separate full-stack environment, enabling the complete separation of instances to ensure data integrity. Of course, data volumes continue to grow, and data sets can become fragmented. It is often a challenge to handle such data sets with traditional warehousing and business intelligence tools, let alone support the ways in which organisations need to use them in order for them to be effective. Data is the lifeblood of business and, through it, insights can be gained that can confer significant business advantage but only if it is stored, managed, maintained and accessed in the right way. Tools and processes for data security, monitoring, maintenance and cost must be included in strategic planning, and in this, next gen IT strategies and techniques such as advanced analytics, visualization tools, and parallel data processing are starting to play a part. Risk management for cloud data storage and governance must be robust and for this reason, enterprises are looking at GRC platforms on advanced cloud data centres in order to effectively identify, assess, and mitigate cloud computing risks, while ensuring compliance with data governance regulations. Evolving technology platforms With a GRC framework on cloud, enterprises can ensure that security risks are completely understood, change is smoothly managed, and that informed decision-making puts the organisation in the best possible place to reduce risk, while benefiting from the advantages of cloud in enhancing business performance. 
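As a purely illustrative aside (not taken from the article), a risk-based approach often begins with something as simple as a scored risk register; the toy sketch below scores the Cloud Security Alliance's top three issues by likelihood and impact, with the figures invented:

```python
# Purely illustrative (not from the article): a toy risk register where each
# cloud risk is scored as likelihood x impact, so mitigation effort can be
# focused on the highest-scoring items first.
risks = [
    {"risk": "Data breach via misconfigured storage", "likelihood": 4, "impact": 5},
    {"risk": "Insufficient identity and access management", "likelihood": 3, "impact": 4},
    {"risk": "Insecure interfaces and APIs", "likelihood": 3, "impact": 3},
]

for r in risks:
    r["score"] = r["likelihood"] * r["impact"]   # simple 1-5 x 1-5 scale

for r in sorted(risks, key=lambda r: r["score"], reverse=True):
    print(f'{r["score"]:>2}  {r["risk"]}')
```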
As technology continues to evolve, it is essential that enterprises evaluate, implement and adopt the cloud in a risk-aware way. A detailed, robust and well-maintained GRC cloud programme, together with a technology platform that enables flexibility and scalability, can support businesses in these endeavours.

### How To Use Cloud Tech To Automate Your Sales On Scale

The internet has arrived at a time when companies are automating their services in an effort to be more efficient. To facilitate these changes, numerous applications and software tools are continually being developed to assist with automation and related processes. Cloud technology is one such provision, allowing you to store data in internet-based applications. Today, businesses, including those involved in sales, are leveraging the power of cloud computing to make their work less strenuous and more rewarding. In this post, we look at four main ways in which you can use cloud tech to automate your sales process.

Data management
Storing data in the cloud using applications such as Google Docs comes with several benefits. For starters, you don’t need to worry much about organising the information, as the software does it for you automatically. For example, any time you upload contacts, documents and sales progress details at random, the cloud service will automatically detect whether they belong to the same person and group them accordingly. Better yet, this technology updates the portfolio any time you add new information to the database. This comes in handy when you need a quick recap before a meeting with a particular client: instead of shuffling through an entire portfolio and possibly tens of documents in your storage, the application lets you pick out the key highlights in an instant. Needless to say, this is a time saver and a sure way of achieving increased efficiency.

Lead engagement
Most, if not all, users visiting your site have enquiries to make about your products or services. While it can be a pleasure to serve everyone individually and ensure they have adequate information each time they visit, doing so can be tiring and time-consuming. Luckily, there are numerous cloud-based applications that can do this work for you. A good example is the chatbot. Today, many companies use these tools to make their websites more engaging while providing users with the information that ultimately drives more sales. The bots work by saving details of the questions visitors ask and then surfacing the right answers for them. The result is reduced work for you and your sales team, allowing you to focus your time and resources on getting new leads and closing deals with prospects.

Lead scoring
As a salesperson, one of the most difficult tasks you need to do regularly is keep tabs on current and prospective customers. To make your work more effective, it’s important to know how frequently these potential clients visit your site, as it helps you understand their level of interest in your offerings. This is where cloud technology becomes useful. Using a tool like Google Analytics, you can track not only the number of visitors your site serves daily but also identify anyone who has passed by more than once.
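As a toy illustration of that repeat-visit signal (the visitor log and the threshold below are invented), the idea can be reduced to a few lines:

```python
# Toy illustration of flagging leads by repeat visits; the visit log and
# threshold are invented for the example.
from collections import Counter

visit_log = ["alice@example.com", "bob@example.com", "alice@example.com",
             "carol@example.com", "alice@example.com", "bob@example.com"]

visit_counts = Counter(visit_log)

# Flag anyone who has come back more than once as a warmer lead.
warm_leads = {visitor: n for visitor, n in visit_counts.items() if n > 1}
print(warm_leads)   # {'alice@example.com': 3, 'bob@example.com': 2}
```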
Using this and other forms of information, you’re able to approach these prospects and sell them your product or service based on their preferences and needs. In essence, cloud tech helps you find your customers before they can even find you.

Collective customer relationship management
Before the introduction of cloud CRM, salespeople were usually responsible for managing their own clients, without anyone necessarily monitoring them or able to access information on the progress made on any lead. Consequently, a sales manager who wanted insight into a particular client would have to call the team member in charge of that client, get the information from them, then proceed with the negotiations. That’s now in the past, at least for companies that use cloud CRM software such as Salesforce or others like this Hubspot Outlook integration. These tools automate much of the sales process, including simple analysis of the sales data that each salesperson keys into the system. The findings from that analysis are stored in a central place, making it easy for the entire sales team to access them. In the end, this reduces the number of times a manager has to contact a team member or call a briefing meeting to gather crucial information.

Wrap-up
No one should suffer the consequences of using old and outdated methods of sales management when there are so many effective and more advanced alternatives on the market. Now more than ever, business owners looking to grow their profits need to understand the crucial role that cloud technology plays in making critical processes such as sales easier to carry out. What areas of your business do you think could benefit most from cloud computing? We’d like to hear your thoughts on any plans you might have for using this technology to grow your business.

### AI is on the shopping list for retailers

There is no shortage of reports saying artificial intelligence (AI) and machine learning will transform how we shop. Researchers at Juniper Research forecast that retailers will pump $12 billion into AI by 2023, compared to $3.6 billion today. But how will we get to this vision of AI powering retailers?

AI provides retailers with a variety of benefits
AI could help retailers be more competitive by cutting costs while increasing the quality of experience and engagement with customers. All kinds of processes will be affected, including jobs – though AI will end dull jobs of little value to the customer or the employee. Within physical stores, AI could remove what makes the experience mundane: for example, smart checkouts use computer vision to enable “just walk out” purchasing and eliminate queuing. Furthermore, taking away tills frees staff from low-value tasks for high-value customer-facing activities like personal styling. These higher-value activities can then be augmented by AI to benefit retailers further. For example, artificial intelligence can take a massive catalogue of products and filter it down into a manageable set of recommendations, which the personal stylist can then present to the customer. The combination of human and artificial intelligence – people skills from human beings with data-driven insight provided by AI – provides a better customer experience than either one alone, helping the retailer drive sales more effectively.
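As a hedged, hypothetical sketch of that catalogue-filtering idea (the items, tags and scoring rule are invented and far simpler than a production recommender), the principle looks like this:

```python
# Hypothetical sketch: reduce a large catalogue to a shortlist a stylist could
# present, by matching item tags against a shopper's known preferences.
catalogue = [
    {"sku": "A1", "tags": {"denim", "slim", "blue"}},
    {"sku": "B7", "tags": {"linen", "relaxed", "white"}},
    {"sku": "C3", "tags": {"denim", "relaxed", "black"}},
    {"sku": "D9", "tags": {"wool", "formal", "grey"}},
]

shopper_prefs = {"denim", "relaxed", "blue"}

def recommend(items, prefs, top_n=2):
    # Score each item by how many of its tags overlap with the preferences,
    # then keep only the highest-scoring handful.
    scored = sorted(items, key=lambda i: len(i["tags"] & prefs), reverse=True)
    return [i["sku"] for i in scored[:top_n]]

print(recommend(catalogue, shopper_prefs))   # e.g. ['A1', 'C3']
```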
AI is already making waves in e-commerce
Online commerce is clearly where AI is already having a huge effect – take, for example, how machine learning sorts out product catalogues for online retailers. For decades, category managers and developers have been trying to create the perfect – high-quality, relevant – product catalogue. With advances in machine learning, these day-to-day tasks could be automated. There is huge potential for intelligent algorithms to make sense of the growing amount of data collected as part of digital commerce processes like search, fraud detection or churn prediction. Demand forecasting is where AI will play a transformative role in the background. Understanding consumer demand patterns, predicting supply chain pressures and proactively making changes are highly complex tasks that need to be done in real time. This carves out an obvious role for AI that can learn and meet customer demands faster. Unsurprisingly, the number of retailers relying on AI-powered demand forecasting is expected to triple over the next four years.

Where do chatbots come into play?
However, no matter how intelligent AI systems become, they can’t be useful without the ability to take action on behalf of customers. A good example of this is chatbots, about which big promises were made but which have failed to deliver in most cases. One reason is that the industry hasn’t fully appreciated the amount of training and data needed to make chatbots truly useful. Too many of today’s chatbots lack the ability to change an order address, buy on behalf of a customer or do other things that a customer could do themselves or would expect a human agent to do. This lack of usefulness isn’t a failing of the AI itself but of the systems it sits on top of. These systems were built for an old retail model with two primary touchpoints: storefront and catalogue. E-commerce came along in the 1990s and the catalogue was largely replaced by the desktop webstore. The software that supported both online and physical retail was logically monolithic in nature; e-commerce was treated as just another store in many instances and fed the data it needed. Most retailers still sit on these monoliths. As needs have changed, software vendors have added layers on top of the monolith to expose APIs for use by external applications. This was meant to make processes more agile but was never intended to fully support the high speeds and scalability required by chatbots.

How can retailers make the most of investment into AI?
Because the API was an afterthought, it was never meant to be a truly high-speed entity. Typically, any large load on an API causes the internal processing of the monolith to slow or fail under the unexpected burden. Thankfully, modern cloud-native and API-first commerce platforms are designed to scale, to provide a single system of truth for product and customer data rather than siloed data, and to meet the demands of any customer interaction, be that a chatbot, a VR experience or an in-store mPOS transaction. The growth of AI in retail doesn’t need to end in disillusionment.
The success of AI in retail will depend on whether retailers migrate from legacy systems to modern commerce platforms that can seamlessly accommodate and make the most of new technologies like AI and omnichannel strategies.

### How Technology is Transforming Mentoring

Mentoring is widely recognised as one of the most powerful and impactful ways of helping an individual to gain experience, learn new skills, develop and ultimately grow in their career. This is why mentoring is so often used within organisations of all shapes and sizes. And mentoring is gaining traction too, with more and more organisations looking to offer it to employees as a way of improving engagement, knowledge sharing and retention. It has, however, traditionally been a tricky technique to implement effectively. By its very nature, mentoring is flexible. Based on the very simple concept of one individual (the ‘mentor’) helping another (the ‘mentee’), it can take several different forms. Mentors and their mentees can meet weekly, monthly, quarterly or sporadically. A mentoring relationship might last for a month or several years. It might focus on specific challenges in the workplace or exist simply as a general guidepost for individuals to develop and grow. It’s all these considerations that make the implementation of mentoring programmes tricky. Fortunately for HR directors and those in learning and development roles within larger organisations, technology is beginning to appear on the scene to make mentoring more resource-effective, accessible and, vitally, reportable. It isn’t just companies that find mentoring challenging, though. Individuals have struggled with mentoring too. I was looking for a mentor myself just over a year ago and found it incredibly difficult to find one. Where does someone like me, who is not part of a large organisation and doesn’t have access to mentoring programmes, go to find a mentor? It’s this question and challenge that drove me to set up PushFar, a web-based career progression and mentoring platform aimed at tackling these issues surrounding mentoring. There are a few ways in which technology is helping to transform mentoring, making it easier for everyone involved – from the managers of mentoring programmes through to the mentors and mentees managing their own mentoring relationships. Below are a few traditional mentoring challenges that technology like ours is combatting.

Mentor Matching
The concept of mentor matching is simple enough: pairing two individuals so that a mentor with experience can provide support and guidance to a mentee. The big issue is that, for mentoring managers, it can be a very resource-heavy process. Imagine having just 100 mentors and mentees and trying to pair them based on experience, personality type, office location and a number of additional considerations. It quickly becomes extremely challenging and simply is not scalable for larger companies with thousands of employees. Technology can help drastically. With mentor matching technology that can consider availability, skills, experience, office locations and much more, the resource and manual labour can be taken out of the process, allowing organisations to quickly scale up mentoring schemes and programmes.
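As a simplified, hypothetical sketch of what such matching technology might weigh (the people, skills and scoring rule below are all invented), consider:

```python
# Hypothetical matching sketch: score mentor/mentee pairs on overlapping skills,
# shared office location and remaining availability; all data here is invented.
mentors = [
    {"name": "Priya", "skills": {"leadership", "marketing"}, "office": "London", "slots": 2},
    {"name": "Tom",   "skills": {"python", "data"},          "office": "Leeds",  "slots": 1},
]
mentees = [
    {"name": "Sam", "wants": {"marketing"}, "office": "London"},
    {"name": "Ana", "wants": {"data", "python"}, "office": "Leeds"},
]

def score(mentor, mentee):
    # Two points per matching skill, one bonus point for a shared office.
    return 2 * len(mentor["skills"] & mentee["wants"]) + (mentor["office"] == mentee["office"])

for mentee in mentees:
    available = [m for m in mentors if m["slots"] > 0]
    best = max(available, key=lambda m: score(m, mentee))
    best["slots"] -= 1
    print(f"{mentee['name']} -> {best['name']} (score {score(best, mentee)})")
```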
Mentor Engagement Mentoring programmes often start with enthusiastic employees and members getting behind the schemes and engaging. Unfortunately, as weeks and months go on, mentoring engagement tends to flounder. Work gets in the way. Individuals forget to schedule in meetings. Keeping track of mentoring is challenging and ultimately those mentoring relationships that started strong, begin to die. For those in charge of mentoring schemes, it’s a huge problem and one that I’ve spoken with a lot of HR directors about. The manual process (again) involved in sending out emails and checking in with mentors and mentees, to see how things are going, can take a long time and simply isn’t effective. Technology can help to automate these nudge emails and reminders - making it easier for mentors and mentees to schedule in meetings when they should and ultimately keeping that engagement high. Mentor Reporting Keeping track of mentoring is another big challenge. Who is mentoring who? How long have they been mentoring for? How often do they meet? Have they set goals and targets? Is it helping to keep employees engaged and retained in an organisation? These, and several other questions are commonly asked by company directors, managers and senior leaders. The problem is that the answers usually aren’t available. With mentoring platforms and mentor software reporting is a breeze. These technologies track everything and allow administrators to put together relevant, useful and insightful statistics and reports. There are several other challenges around mentoring too. Being able to keep on top of goals and targets, scheduling in meetings with your mentor or mentee, understanding how mentoring is helping, the list goes on. Technology really is helping to transform mentoring in a big way now. In the present day and in years to come, with younger generations moving between roles at a faster pace, it is key that mentoring be considered at the forefront of company agendas.[/vc_column_text][/vc_column][/vc_row] ### Controlling Cloud Spend in AWS Environments As the market for cloud services for enterprises continues to grow, the amount your organisation spends on cloud services can quickly escalate. If enterprises can gain better visibility on the cost of each cloud service they utilise, they can find the right balance of the services they use, reduce operating expenses, and garner the best possible value from the cloud.  According to the IDC, spending on public cloud services and infrastructure is expected to reach $210 billion in 2019, which is an increase of 23.8% from 2018. By 2022, public cloud services spending should reach $370 billion, with the market forecast to achieve a five-year compound annual growth rate (CAGR) of 22.5%. Public clouds are popular with enterprises because they're more cost effective and less complex to set up, allowing companies to achieve their goals more efficiently. That said, complexities often arise as cloud resources accumulate, and enterprises have little visibility of which cloud resources they’re utilising. This typically leads to a surplus of unmonitored, underused, or idle cloud resources on various platforms in multiple accounts, causing the enterprise to overspend on unnecessary resources. Managing these underused resources and reducing the associated expenditure can be achieved by implementing a cloud performance monitoring solution that provides the following features: 1. 
A holistic view of cloud services Typically, enterprises only get an idea of their cloud spend when the invoices for the various cloud services they’re using land in the accounting department. This makes measuring cloud costs a complicated, time-consuming process. By implementing a platform with AI-driven monitoring, enterprises can optimise spending by holistically viewing cloud expenditures across multiple accounts from a single console. Enterprises can benefit from a solution that displays AWS expenditure broken down by various dimensions, including linked accounts, service type, region, and user-defined tags—with options to schedule them as reports. 2. Set budget controls When cloud spend is managed retroactively by assessing invoices, the enterprise has already accumulated costs that could have been avoided using readily available technology. A cloud monitoring platform with budget control functionality helps reduce operating expenses, preventing runaway spending by capping the amount that can be spent on each public cloud service. The right monitoring platform should enable users to set up notifications to inform them when spending exceeds the established budget for each cloud service. 3. Tracking cloud spend with tags  Tags are a common way to manage workloads, and with the right cloud monitoring platform, tags can be used to track the use of cloud services, too. Tags allow enterprises to recognise which project each cloud service is being used for, which team or customer is using the cloud service, and when that service can be switched off once it is no longer utilised. 4. Security and compliance Security and compliance must be factored into any solution, so enterprises must look for solutions fortified with self-hosted data centres, redundant  grid architectures, encryption of data in transit, strict policies and processes, and features to help with complying to regulatory standards such as ISO/IEC 27001, GDPR, and SOC 2. By empowering teams to regulate their cloud services, enterprises can gain greater control over their cloud spend. It’s easy to accumulate cloud services; however, enabling teams to monitor their own usage will encourage them to effectively manage their own costs. Of course, other internal parties, such as the accounting department, should also monitor the cloud spend of each team or customer as well. Therefore, it’s important to invest in a solution that enables visibility of cloud spend to multiple users across the entire enterprise. Supported by AI-driven insights, a cloud monitoring tool that provides greater visibility and control over cloud resource spending enables DevOps and IT teams to streamline and enhance the performance of their cloud resources while also eliminating runaway expenditures. Continuous monitoring of cloud environments and the ability to control spending within those environments is vital to achieving a competitive edge.[/vc_column_text][/vc_column][/vc_row] ### 6 AI and Machine Development Technologies to Follow [vc_row][vc_column][vc_column_text]There is little to no doubt that the machine and artificial intelligence industry is changing the way we live and work. Advances in the past ten years have outstripped that of the past 50. With the ever-increasing pace of new developments, knowledge of the most innovative and game-changing technologies on the market quickly becomes out of date.  
Developments within the technology industry are vast and at times it can be difficult to keep track of the systems that will make the greatest impact on life as we know it. Below are the 6 fastest emerging machine and AI technologies in 2020 that will set the tone for the next decade of AI and machine development. Digital Twin Modelling “A digital twin is a virtual replica of a living or non-living physical object. The digital replica demonstrates the potential and physical assets of processes, things, people and places and can be used for various purposes,” explains Alice Willson, a tech journalist at OriginWritings. Digital twins utilise the Internet of Things, machine learning, software-network graphs, and artificial intelligence to create living digital simulation models that update and change as their physical counterpart changes, all in real-time. This could revolutionise how many industries work, from providing remote configuration services to customers or allowing trainee surgeons to practice on simulated bodies, instead of actual patients.  Autonomous AI Self-Driving cars, socially-enabled domestic robots and collaborative production assistants are examples of autonomous technology that is being developed at companies like Tesla and Bosch. Autonomous systems are designed to operate in complex and open-ended environments. The research is at an intersection between robotics and artificial intelligence, with a particular focus on the integration of particular methods to create complete cognition-enabled robotic systems, with software available in open-source libraries. Robots can complete tasks that transform lives and accessibility, such as assisting households with complex needs or assembling products in factories. Conversational AI Conversational AI is the technology behind familiar processes such as speech-enabled applications and automated messaging systems. Conversational AI is the gateway between human and computer interaction. The development of the software involves a strict discipline dedicated to designing conversational flow, context, relevance, understanding of intent and automated translation and speech recognition systems. The goal of conversational AI is to develop a system that interacts with the user in such a way that it is indistinguishable from human interaction. The technology will enable users to manage business tasks more efficiently by deploying powerful conversational AI interfaces, such as end-to-end bot hosting platforms. AI Security AI security is probably one of the fastest-growing areas of AI development currently on the market. AI security refers to the tools and cyber techniques used by artificial intelligence systems to identify potential threats based on similar and previous activity. AI shows the greatest potential for automated fraud identification, intrusion detection and risk probability for logins. Current AI security technology is designed to work harmoniously across businesses. The automated frameworks can identify and correlate threat patterns from huge amounts of activity datasets. Advanced cybersecurity can work seamlessly without disrupting business. Around 80% of telecom companies are relying on this technology to increase security on their systems and around $137 billion was spent on AI security and risk management in 2019. Probabilistic Programming Probabilistic Programming refers to the utilisation of algorithms to predict the likelihood of events and make informed decisions in times of uncertainty. 
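As a toy illustration only (the article describes the field in general terms), a Monte Carlo simulation captures the flavour of reasoning under uncertainty; here it estimates a stock-out probability from an assumed demand distribution, with every figure invented:

```python
# Toy Monte Carlo illustration (not from the article): estimate the probability
# that next month's demand exceeds current stock, given an uncertain forecast.
import random

random.seed(42)
stock = 1_000                          # units on hand (invented figure)
forecast_mean, forecast_sd = 950, 80   # assumed demand distribution

trials = 100_000
stockouts = sum(random.gauss(forecast_mean, forecast_sd) > stock for _ in range(trials))
print(f"Estimated stock-out probability: {stockouts / trials:.1%}")
```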
Probabilistic programming is currently revolutionising areas of trade and advertising, where it is used to predict stock prices and recommend products to target consumers with greater accuracy. The technology is developed by borrowing methods from programming languages and incorporating them into statistical models. As computers become more efficient at dealing with probabilities at scale, it will enable us to evolve our current advanced computer aids into intelligent partners for decision-making and understanding.

Automated Machine Learning
Automated Machine Learning is a technological shift in the way companies and individuals use traditional machine learning methods and data science. Extracting relevant insight from raw data sets using traditional machine learning is time-consuming and costly, and data scientists with computer programming skills are in short supply and high demand. Automated Machine Learning means that organisations can build and use machine learning models that encode up-to-date data science expertise: systematic processes run on raw data and automatically extract the most relevant material from huge datasets. Not only does this save time, it also reduces the risk of bias and human error creeping into the results. This technology makes data processing available to industries such as healthcare, sport, the public sector and retail, which on average do not have the resources at their disposal for accurate and detailed data processing. It also allows larger companies to hand data processing over to automated systems, freeing data scientists to concentrate on more complex problems.

### Surviving the Cloud Migration Journey

As the global cloud computing market surges towards the $130 billion mark, most businesses will have at least begun to think about migrating elements of their tech to the cloud. The preferred model – at least in concept – is a hybrid network. A successful migration to a hybrid cloud architecture will enable you to keep your bespoke legacy applications and secure your most sensitive data on-premise. At the same time, you will be able to avail yourself of the plethora of applications and services that run in the cloud. So the story goes. However, there is a huge caveat. Cloud migration is complex at the best of times, and configuring a hybrid cloud that works is the most intricate challenge of all – after all, every business is unique. Stripping away the tempting sales patter, here is a more realistic appraisal of the cloud migration journey – and some tips for surviving it unscathed.

Bumps on the Road
There is a big difference between a 'paper migration' and the real thing. Even small migrations, such as moving from an on-site Exchange server to a cloud-based SaaS service such as Office 365, are more challenging than they look. In fact, DCG Inc., a Los Angeles-based provider of IT support, highlighted five potential problems with this migration alone. If you are one of the majority of businesses envisioning a hybrid cloud, be aware that sharing workloads between your private and public clouds may not be as easy as you imagine. Some businesses have found that cross-cloud collaboration becomes difficult because one set of employees has access to a wider set of features than another. There is also the headache of updating company policies to reflect a different way of collaborating.
In some nightmare scenarios, businesses have found the problems insurmountable and ended up running double the number of workloads at double the cost. That is not what hybrid is supposed to be about. Worse still, when they have decided to change strategy and jump into the public cloud with both feet, they find that the big cloud providers (AWS, Azure and GCP) only provide on-premise-to-cloud migration assistance; hybrid to cloud is not an option. Then there is the misconception that you can simply move all of your assets from on-premise to cloud and they will just work. How many applications do you intend to lift and shift into the cloud in this way?  If you were thinking all of them – or the vast majority – prepare for a wake-up call. If you are fortunate, around a quarter will run in the cloud 'as is' without any problem. The others will need to be either re-platformed, re-built from scratch or decommissioned completely. Smoothing the Path Enough of the negativity. Just be aware that an aggressive, impulsive cloud migration strategy is likely to backfire and leave you in a spaghetti-junction like tangle. On the other hand, a strategic, measured migration approach will get you from A to B with much less risk of detours and false starts. In terms of applications, it is wise to ruthlessly audit them. If there are viable SaaS alternatives in the cloud, they will likely be better and cheaper anyway so create a roadmap to take those applications out of action. Your slimline application estate will then need to be fully mapped in terms of each application's dependencies because this will determine the order in which they should be migrated and whether migration is possible at all.  Often, you will be able to re-platform an app for your chosen cloud service but if an app is vital to your business and you can't migrate it, you will need to make provisions to run it on an independent network. Better still, get your DevOps team to build a new cloud-native app and decommission the existing one. If we've scared you away from the hybrid cloud approach, it's time to backtrack and take a fresh look.  If you are not averse to placing your business with Amazon, Microsoft or Google, all three now offer a complete hybrid solution with virtualisation software running on top of your on-premise hardware. Microsoft Azure Stack has a head start in this area but AWS Outposts is an attractive option for those businesses happy for Amazon to set up, configure and manage AWS hardware in their own data centers. Then there's Google with the fascinating Anthos project which levers Kubernetes to enable workloads and applications to be run across private and public clouds – a true multi-cloud solution (if you don't mind being locked into GCP of course!) Why Consultancy is the Key Of course, all of the public cloud providers above, engaged as they are in a war for hearts and minds, will promote their hybrid cloud solution as the best for your use case. That's where vendor-neutral consultancy comes in. A good consultant will have a broad field of vision that encompasses not just technology infrastructure but also data management, policy standardisation, business communication and overall strategy. A consultant should also have strong partnerships with a wide range of cloud service providers because this will provide an important balance between accessing cost-saving partner deals and avoiding bias in the sales pitch. 
Beware any attempt to lock you in to a specific vendor unless there are real valid reasons that make sense for your business. Even better consultants should be able to help with the migration itself. Proud IT personnel may balk at this idea but migration is a difficult horse to ride. Even DevOps specialists who live and breathe the cloud are likely to find the scope of a full-scale migration overwhelming. As DCG Inc, providers of IT consulting in LA, puts it: “Many businesses...believe that they can do the entire migration process without the assistance of experts. However, without proper guidance, you may run into problems that can harm vital business operations.” In addition to professional consultancy, speak to other companies who have traveled the journey. What experiences have they had? What advice can they give? After Migration: Monitoring and Managing  One of the biggest shocks that companies get when moving to the cloud is that their resource allocation is not automated by design. They may find they spend a lot of time managing virtual machines to avoid excessive charges. Fortunately, there are a few routes they can take. Technically strong businesses can embrace containerisation and microservices, using Kubernetes-based orchestration services. These are offered by all of the public cloud providers (e.g. Azure AKS, AWS EKS, or Google's GKE).  Even better, the Istio service mesh promises to bring multiple microservices together into an overarching system which provides secure communications and in-depth monitoring. You've probably noticed a pattern in the way Google is leveraging Kubernetes to gain power in the cloud – but that's another story. For those businesses happy to outsource their IT completely, a cloud-based managed IT services provider is likely the way to go. Consulting is once again advisable to ensure the SLA is set up to protect your business. In conclusion, when it comes to the cloud, the journey really is almost as important as the destination itself. Taking the right migration path will help you to safely arrive in that promised land of elastic consumption, real-time reliability, industry-leading security and unlimited scalability.[/vc_column_text][/vc_column][/vc_row] ### A guide to cloud-based Digital Asset Management [vc_row][vc_column][vc_column_text]Organisations large and small are generating more and more digital content. The result is a modern-day challenge for just about any business – a volume of files and folders, which when contributed to by many people, can become hugely frustrating to navigate. Team members’ time is often lost searching for previously created files, ranging from imagery to videos, documents to presentations, and audio to design files. In addition, file sharing outside of the organisation is a regular necessity. General file sharing services have their limitations and uploading files from internal shared servers to transfer sites can be an administrative burden. Digital Asset Management (DAM) is a solution designed to help teams overcome this challenge and unleash their potential through better file sharing and collaboration. It ensures entire teams, departments or even your whole organisation is securely organising digital content in a consistent manner. On-premise versus cloud Historically, DAM systems resided on-premise, requiring hosting on your own server as well as owning responsibility for the infrastructure and updates. 
This would typically cost tens of thousands of Pounds, Dollars or Euros up-front, with ongoing maintenance costs and the challenge of implementation as well as user training. Usually it would require an implementation specialist from the vendor side and at least two IT people within an organisation dedicated to managing the system. Your IT staff would be in complete control of the version installed, and be responsible for feature updates and data security – all meaning this was a substantial investment decision. It’s therefore pretty obvious why this limited take-up of a great tool – particularly amongst small-medium or geographically diverse companies. New cloud-based DAM changes the game. Now anyone can gain the benefits of state-of-the-art DAM functionality at a commercially viable price point. And as such, adoption of this new software-as-a-service (SaaS) is rapidly increasing among private, public sector and non-profit organisations of all shapes and sizes. Why do organisations choose cloud-based DAM? There are several benefits to a DAM platform that supports you to safely store your digital content in the cloud: 1. Fast onboarding and intuitive interface – it’s so simple to get started. Thanks to easy importing you can have content in your library in minutes. Uploading selected folders from a shared server or desktop to your DAM platform is fast, and if your team uses services like Dropbox, Box or Google Drive, you can sync these folders in a matter of clicks. Great user experience is an expectation for today’s top-performing cloud-based tools and DAM is no exception. Your team or external partners won’t need support to show them how to use the platform – simply install and go! 2. Usability for teams in multiple locations – as SaaS solutions are accessible via web browsers from anywhere in the world, you can open up access to whoever, wherever – regardless of their geography and IT infrastructure. 3. Provision of third-party access to particular files – collaboration with external parties (i.e. people who don’t have access to your servers) is a necessity in almost every industry. With cloud-based DAM systems, admins can provide accessibility to approved individuals or teams, but limit access to only the assets they need. With numerous individuals involved in creating, sharing and using files, you need everybody involved – including outside suppliers – to work in the same way. One central and secure system provides consistent organisation, searchability and version control of all your digital assets. 4. Cloud storage capacity that will scale – if your DAM platform usage increases, it won’t cause you a headache – it can grow with the needs of your business. Dedicated DAM software uses the likes of Amazon Web Services to offer ample storage capacity. It keeps assets in one central location, with tagging so they can easily be found and shared. Usage often accelerates once DAM is embedded into workplace culture. 5. Constantly evolving functionality – with all updates to the platform managed by the DAM software provider, you won’t have to rely on your IT team to provide you the latest version. New features are frequently added to DAM platforms and you can benefit from these immediately. When choosing a DAM provider, be sure to look for the following: Powerful search filtering using tags, keywords, metadata and customer fields. The ability to search by file size and other factors – including which photographer or content creator – can also speed up searches. 
Facial recognition is perfect if you want to instantly find all saved imagery featuring a particular person. Version control manages updates to all types of files and ensures your digital assets are always up-to-date particularly when different parties might all be working concurrently on the same project files. Portals for external partners and customers that can be customised with your own landing page and branding – you can empower your end-users with the latest content at their fingertips, while all under your own brand appearance. A Google Chrome plugin lets your users search your media library without leaving their browser-based email, document or PowerPoint presentation. Integration with design programmes and other programmes, including Adobe Photoshop and InDesign. This means faster editing and file saves are synced with the centralised DAM storage system so everyone else immediately has access to the latest version. Who is using cloud-based DAM? DAM is used across all vertical sectors and industries. While the need to manage and share files transcends every department, it’s typically favoured by marketing, creative and sales teams who have to store, share and organise an abundance of visual content. You might be a retailer or manufacturer with an extensive library of imagery for current and previous products and collections. Or perhaps you’re a brand owner who needs to provide a range of up-to-date branded assets – both internally to teams and externally to agencies and partners – ensuring your image and messaging is consistently presented. You might be heavily involved in research and development and need a way to carefully organise design iterations across multiple projects. Whatever the needs of your business, cloud-based DAM is unlocking productivity and empowering people to organise, find and share their digital assets in a far better way. When your content is organised, your team is organised.[/vc_column_text][/vc_column][/vc_row] ### Human experiences – the key to happy customers and employees Digital technologies are transforming every aspect of our lives and human experiences. Innovation is booming and businesses are moving faster than ever trying to keep up with customer and employee expectations. Technology has empowered customers as never before. They are now dictating the rules of purchasing and expect from all players the same level of efficiency, performance, personalisation and convenience as from industry leaders such as Google and Amazon. If a company fails to deliver on these expectations, customers simply walk away. Customer experiences According to a recent study by Gartner, when making a purchase 64% of people find customer experience more important than price. At the same time, research conducted by American Express reveals that millennials are willing to spend up to 21% more to get great customer service. Furthermore, a report released by Accenture shows that companies that get customer experience right outperform their competitors and generate higher revenues by 11%. Under these circumstances, organisations have no choice but to make CX a critical business priority and invest significant resources in order to deliver on customers’ expectations and demands. As a result, a significant number of companies rushed to acquire the latest technology and tools that promised to deliver best in class customer experience. However, many now realise that they can’t offer stellar customer experience without also considering the employee experience. 
Employee experiences The old adage that happy employees make happy customers has never been more relevant than in today’s ultra-competitive business landscape, where even a small mistake can derail a company’s progression and growth. Many businesses currently operate in sectors where attracting and retaining talent is increasingly challenging. Offering a superior employee experience is a significant competitive advantage that many companies are trying to use in order to increase staff loyalty and engagement. In order to provide a high-quality customer and employee experience, senior business leaders are deploying complex and sophisticated strategies, investing millions of pounds in state-of-the-art technology, platforms and tools. But despite the invested resources and capital, many organisations still fail at providing a seamless, engaging and consistent experience to both customers and employees. Many CEOs, CIOs, CTOs and other senior executives are struggling to find the faulty elements in their strategy or technology that render their efforts moot. The answer lies with human experiences Many companies do not deliver the expected and promised customer and employee experiences because they fail to identify and understand their journeys. In order to garner customers’ and employees’ loyalty and engagement, the C-suite needs to focus on the complete, end-to-end experience individuals have with the company. Too many companies focus on individual interaction touch points or tech specs, forgetting that the user journey spans a progression of touch points. In this day and age, companies need to excel across the board, both internally and externally. Offering a good product but a poor customer experience, or a good working environment but underperforming working tools, is no longer enough if a company wants to survive and thrive in a highly complex and competitive business climate. Complex digital ecosystems However, considering the sophisticated and intricate digital ecosystems companies manage, finding the faulty elements that prevent them from increasing efficiency and offering a high-quality customer experience is almost impossible. Making things even more complicated is the fact that most of an organisation’s digital platforms, tools and networks are outsourced to third-party providers, where business leaders have no visibility and lack the ability to control the full extent of their infrastructure. Even if they ask for a performance report, they will only receive data and metrics that focus on the engineering performance of a specific piece of technology. In order to elevate the user experience, both internally and externally, companies need to focus on how all elements of the digital ecosystem – IT, third-party networks, data centres, applications, tools, digital systems and platforms – fit together and affect how humans interact with the business at various and numerous touch points. Human experiences analytics Thankfully, senior executives can now use analytics tools that focus on human experiences and deliver razor-sharp insights into each technical roadblock that hampers efficiency, productivity and organisational performance. Armed with an analytics tool that can perform millions of calculations per minute and uncover the cause of any technical misbehaviour in real time, business leaders can cut through the complexity of the digital ecosystem and analyse the whole infrastructure through the human experience lens.
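To make the idea concrete, here is a deliberately minimal sketch of what such analysis boils down to: aggregate per-component timings across user journeys and rank the components that contribute most to a poor experience. The component names and figures below are invented, and real tooling works at far greater scale and sophistication.

```python
# Illustrative only: rank the components of a digital ecosystem by how much
# they contribute to slow user journeys. The data below is made up; in a real
# deployment these measurements would come from monitoring agents.
from collections import defaultdict

# Each record is one user interaction, with the time (ms) spent in each component.
journeys = [
    {"network": 40, "app_server": 120, "saas_crm": 310, "database": 90},
    {"network": 55, "app_server": 980, "saas_crm": 280, "database": 70},
    {"network": 35, "app_server": 150, "saas_crm": 900, "database": 85},
]

totals = defaultdict(float)
for journey in journeys:
    for component, millis in journey.items():
        totals[component] += millis

averages = {component: total / len(journeys) for component, total in totals.items()}

# Components ranked by average time contributed per journey: the "human
# experience lens" on which piece of the ecosystem is hurting users most.
for component, avg in sorted(averages.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{component:<12} {avg:7.1f} ms average per journey")
```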
In this way, they eliminate guesswork, save time and address any potential issues in the shortest time. By constantly monitoring the digital ecosystem and how it affects human interaction with the business, senior executives can deliver flawless services to both internal and external customers. While new technology is exciting and many organisations want to ensure they explore new trends to gain a competitive edge, if the digital ecosystem is not calibrated to make customers’ and employees’ lives easier by removing friction, delays and inefficiencies, the company might find that it has wasted critical resources and assets, and still can’t deliver the desired business outcomes. ### Ensuring Security In An Era Of AI And Cloud Platforms [vc_row][vc_column][vc_column_text]Across over 25,000 cloud services, each enterprise generates more than three billion events per month, including uploads, logins, shares, edits, etc. About 21 percent of the files in the cloud contain sensitive data. On average, an enterprise experiences 12.2 incidents of stolen account credentials stored in the cloud, per month. Cisco’s 2018 Annual Cybersecurity Report examined trends and patterns in data theft, data loss, and other issues. The report revealed that 32% of security leaders are completely dependent on artificial intelligence  (AI) to protect their sensitive corporate information. There is no doubt of how beneficial cloud and AI platforms are to businesses. In fact, their benefits go beyond businesses alone, with personal users backing up emails and other personal information in the cloud. However, the dependence on these platforms opens the data to a number of vulnerabilities. A recent report, The Malicious Use of Artificial Intelligence: Forecasting, Prevention, and Mitigation, indicates that cybercriminals are increasingly using AI for their own immoral gain. Also, several major data breaches have occurred over the years. There seems to be a race now, between cybercriminals and enterprises, to make the better use of AI and cloud platforms -- the former seeks to exploit these platforms while the latter tries to protect everything from business management to network endpoints. Security challenges with cloud and AI services Cloud and AI platforms have helped enterprises of all sizes achieve data efficiency. These innovations have not only increased their operational efficiency, but also have cut down their information technology (IT) costs significantly. From helping organisations deliver faster, better, and cost-efficient results, to effectively managing their results, cloud services and AI have been a boon for enterprises. However, with any boon comes challenges. Among AI applications in cybersecurity, the arrival of Bots or intelligent assistants to improve the accuracy and speed of data analysis is welcomed. As cloud computing continues to disrupt IT (Information Technology), the availability of machine learning capabilities and powerful algorithms address the shortage of security expertise. Nevertheless, such solutions for the security of data architecture and management bring unique challenges. Here are four ways to master security with cloud and AI platforms: 1. Establish strong cloud governance policies For preventing data breaches and maintaining the utmost security with cloud computing and AI, effective policies should be in place. 
Serve your clients as an authoritative security advisor to help them understand their regulatory compliance obligations, the vulnerabilities of their data assets, and their data protection needs. With these critical steps, you can develop and communicate cloud data security policies with your client that will keep their information safe in the cloud. 2. Encrypt sensitive data Encryption technologies are the front line of security on cloud platforms. Paired with a strong security policy that mandates encryption keys, encryption of sensitive data fulfils many compliance requirements. It also ensures that data remains unreadable without the proper permissions, even if it is stolen or leaked. Encryption services are steadily growing in popularity in the wake of increasing data breaches and government spying. Analyse your clients’ requirements and recommend the best and most effective encryption technologies. 3. Educate your employees Educate your employees about proper security practices and ways to minimise cloud security threats. Involve the entire workforce so that they take proper ownership of their responsibilities regarding security measures. Provide security training and set up a response protocol for the users of cloud platforms within your organisation. Run security tests and invest in tools that send simulated phishing emails to check whether your workforce is able to take appropriate action in a given scenario. 4. Get a secure data backup plan With the growing adoption of cloud and AI platforms, the possibility of permanent data loss increases. Employees can cause data leaks maliciously or inadvertently, which poses serious concerns. For this reason, organisations should back their policies and encryption strategies with tools that can detect malicious attempts or potential threats while providing insight into employee activities. Ensure that you have a full and secure data backup to protect and preserve your business data. Final words Today, customers are aware of the price of their private information. Businesses can ill afford to gain a reputation for showing little regard for data security. Enterprises should develop a security platform that allows them to implement consistent data protection policies across cloud and AI platforms. IT managers should distribute data and applications across the organisation for extra protection, as well as strictly follow best practices in off-site storage, daily data backup, and disaster recovery. With a better understanding of how to protect against cloud and AI security threats, you can make more informed decisions about your IT infrastructure. A proactive approach to these technologies helps to refine security and manage risks more effectively.

### Cyber Security: How much could a security breach cost your business?

As businesses of all sizes become increasingly reliant on digital data, workplace data breaches have become more common and, as a result, have gained widespread attention. In this post, Steve Thomas, Finance and Project Based Accounting Expert at The Access Group, takes a look at cyber security and the potential costs businesses could face. Cyber security is a big talking point for all SMEs, and the impact can be deadly to any business, with more than half of SMEs closing operations within six months of an attack because they are unable to recover from the financial repercussions.
The financial and information industry, in particular, is an attractive target for hackers due to the nature of business. Processing and handling large purchases and financial transfers on a regular basis leaves companies at risk - especially without an adequate malware software solution in place. Nowadays it is impossible to understand the digital transformation of businesses and organisations without cloud computing. Time moves quickly and everything has become increasingly digitised, so it is vital for business leaders to face the threat of cybercrime and focus attention on the risks and consequences of a potential attack. The “cloud” is just a server where you store the data, applications and software that you can access from any device, as long as you have an Internet connection. One advantage of cloud services lies in accessibility, and many large purchases and financial transfers are conducted digitally, meaning stronger security measures need to be adopted in order to reduce the risk or threat of a data breach. As John Chambers, the previous CEO of Cisco once said: “there are only two types of companies: those who have been hacked and those who could be”, and small businesses are no exception to this. No matter what size, a business is at risk of several cyber security attacks, with the most common including: Malware and viruses Phishing Ransomware Password hacking Having said that, there is a lot that business owners can do to help protect their business, such as: Firewalls Cloud hosting for key systems Use secure on-site Wi-Fi Staff training to increase awareness of cybercrime Install email threat detection What damages and repercussions can a data breach cause? Currently, £22,700 is the annual average cost for businesses that have lost data or assets after being victims of a cyber attack. This amount would be a drop in the ocean for a multinational, but for SMEs, micro-businesses and sole-traders, it could be the difference between being able to continue trading and going into administration. Small business owners are relatively easy targets for cybercrime as more often than not, they don’t have the budget for a full-time IT employee - or are unaware of the importance and benefits of investing in this type of professional. In this day and age, it is advantageous to create room in the budget to protect your business, cash flow and assets against cyber risks as money is not the only thing at risk - 89% of SMEs reported that a cyber security attack impacted their reputation negatively while 30% reported a loss of clientele. With the correct security measurements in place and eliminating the risk of a data breach, thousands of pounds could be saved and invested back into the business. £22,700 is a lot of money for an SME, and could fund various assets such as 19 used Ford transit vans for your workers and fleet, 1,305,565 cups of tea to fuel your team, or even 19 years worth of cyber security prevention. One thing is for sure, the costs of cyber security breaches can be substantial. What would you spend £22,700 on?[/vc_column_text][vc_single_image image="52658" img_size="full"][/vc_column][/vc_row] ### Top 5 Cloud Security Solutions for Your Business [vc_row][vc_column][vc_column_text]Cloud computing has become a go-to option for the majority of companies that are aiming to digitise their assets. However, the rising number of cloud-based businesses has created new opportunities for cybercriminals who are looking for ways to access corporate networks, too. 
With cyber attacks becoming more frequent and far more devastating, the demand for data privacy and cloud security is higher than ever. Keep reading to learn how to secure your business using a VPN and top cloud security solutions. Secure Your Business with Easy Cloud Security Solutions Cloud computing allows for efficient flexibility and mobility. However, it also brings a certain amount of risk to the businesses that are using it. Every business, no matter its size, needs a cloud security solution to secure its network and data. These solutions secure the connection between cloud-based software and the user, reducing the risk of cybersecurity incidents. Companies susceptible to cyberattacks can protect their data by investing in cloud security solutions. These security solutions can help monitor and track network activity to prevent attacks, as well as block unsafe content online. Other features, such as network scanning and real-time firewall updates, can further enhance security and website speed performance. Besides cloud security solutions, it’s recommended to use a VPN to secure your business. A virtual private network can encrypt all traffic travelling to and from the devices in your network, so that company data is hidden from hackers. Below is a list of the top five cloud security solutions for your business. #1 Qualys Qualys has been around since 1999, a track record that makes it a reliable option when it comes to data protection. Its service focuses on identifying compromised assets and helping stop cybersecurity incidents from advancing. Qualys offers features such as endpoint security and web app security as well. The best part is that it doesn’t have any software or hardware requirements, as it is a fully cloud-based solution. #2 Proofpoint A leader in advanced cybersecurity solutions, Proofpoint is a cloud-based provider that offers services customised for businesses of all sizes. It protects against a variety of cybersecurity threats for both small and large corporations. Proofpoint can tailor its services to your needs, depending on the size and type of business you’re running. Besides securing outgoing data, Proofpoint provides efficient email security management features, as well as mobile solutions for targeted vulnerabilities. On top of that, it offers security products for social media and mobile devices. #3 CipherCloud Founded in 2010, CipherCloud offers security solutions across three different models, including PaaS, SaaS, and IaaS. CipherCloud gives users access to a single platform from which they can secure all customer data. What’s great about CipherCloud is that it doesn’t compromise the performance of the company’s website and assets. It operates in the background across multiple private and public cloud applications. Some of its best features include complete cloud encryption and data loss prevention. Adaptive control and threat prevention are CipherCloud’s two major strengths. #4 SiteLock Created in 2008, SiteLock secures over 21 million websites across the globe. An efficient security solutions provider, SiteLock protects sites from security attacks, sneaky malware, and other threats. It regularly scans websites for vulnerabilities and keeps them safe from SQL injection, DDoS, and XSS attacks. While some security solutions can negatively affect a site’s performance, SiteLock boosts performance through dynamic caching and load balancing. This makes it an excellent choice for large corporations.
Besides website scanning and other security features, SiteLock also offers emergency website repair services, helping businesses reverse the damage of a cyberattack by repairing hacked websites. #5 CloudPassage Halo CloudPassage Halo has been providing cloud security solutions since 2010. It helps organisations establish and maintain compliance with different security policies and regulations, and provides extra security through software vulnerability assessment procedures. These procedures help spot and prevent potential security incidents in time. Thanks to these features, CloudPassage Halo can provide visibility and better security across different cloud-based workloads. It is also worth mentioning that it provides great customer support, which improves the overall user experience. Conclusion Online security has been a major concern for years, especially now that the number of cyberattacks is rising. Without clear cloud security solutions, businesses are exposed to a wide range of online threats, from small data breaches to large DDoS attacks and data theft. However, cloud solution providers can help reduce these risks by monitoring network activity and keeping threats under control. Besides securing your business with cloud security solutions, make sure to encrypt the entire network with a VPN. Educate your employees on the importance of online security and require that all devices connected to the company’s network use a VPN when browsing the internet.

### Tips for implementing good information governance

As businesses finish another financial year, one area that receives a lot of attention around this time is compliance. However, this is only one component of something much larger that businesses of all sizes should be giving attention to: information governance. Rather than being a one-off catch-up activity done at year end, information governance should be an ongoing, critical initiative that runs throughout the year. Short-term solutions that only address particular mandates, and the specific needs at that time, can lead to costly decisions in the long term. Putting good practices into place has the potential to be transformational for businesses. It can help pave the way for a more successful year and help future-proof the business against ongoing change. It’s time for organisations to shift their perspective and put systems in place to address information governance. Taking notice of information governance Information governance is much more than compliance, and the two terms should not be used interchangeably. It is the strategy behind the entire information lifecycle, including effective management of information’s authority, control, accessibility, and visibility. Furthermore, information governance can bring much greater value to organisations as it has the potential to uncover business opportunities and protect them from security threats. Businesses should see compliance as the end goal and information governance as the way to achieve it. Answering these simple questions helps you on your path to good information governance: • Do you know how your employees are working and what applications they use? • Do you know where your business’ information is being stored? • Do you know if you have full control of your business information? How would you answer that last question?
Unfortunately, most organisations would answer ‘no’. A recent Association for Information and Image Management (AIIM) study found that two-thirds of organisations had some level of information governance policy in place, but nearly one-third admitted that their inferior electronic records kept causing problems with regulators and auditors. So what are the hurdles and how can they be overcome? There are common pitfalls Poor information governance varies from the unfortunate to the catastrophic. At worst, hackers get hold of sensitive information. At best, out-of-date information may be used and commitments then have to be honoured based on this inaccurate information. In between lies a range of incidents of information mismanagement and examples of employees using unsanctioned tools, all of which can be prevented. One great example is email. Its very nature puts valuable information at risk on an hourly basis. Potentially confidential information contained within an email is frighteningly susceptible to interception and vulnerable to security threats. Yet countless employees use email as a method for sharing sensitive information. Worse still, employees use both approved work email accounts and unsanctioned private email accounts. A recent Alfresco survey found that over half (54 per cent) of end users have turned to their private email for work, most likely due to the limitations of enterprise email. Many knowledge workers have turned to consumer solutions to provide collaboration and access capabilities not enabled within the enterprise. None of these applications are approved or controlled by corporate IT. These ‘Shadow IT’ solutions can pose a serious security risk for organisations, leading to information leaks from insecure practices and failures of regulatory compliance. Another critical challenge is implementing policies for the use of other tools such as instant messaging and social media. This is borne out by the results of a recent AIIM study, which highlighted that less than 15 per cent of organisations included social postings in their information governance policies. While some conversations are essential to business growth, 37 per cent of respondents agreed that there are important social interactions that are not being saved or archived due to a lack of information governance. Good information governance can be achieved A lot of organisations have compliance, management, and security controls in place, but what is really required is information governance. Here are some simple steps organisations can take: Audit Understand the range of information you have, how it needs to be managed and where it is currently being stored. Prioritise Rank your information and the associated processes to assess the level of risk: compliance risk, regulatory risk, and reputational risk. For ease of management, consolidate these to a minimum. Define Policies need to be decided. What needs to be kept, for what purpose, which employees need access, and for how long? The information should be stored where it can be most effectively used, while also addressing business objectives and risks. Clean Once these protocols are set, there should be regular checks of what information is maintained. Archiving or deleting content once it has outlived its useful life should be encouraged (a minimal sketch of such a clean-up sweep follows below).
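Purely by way of illustration, the sketch below shows what a simple clean-up sweep might look like: move anything older than an assumed retention period out of the live store and into an archive. The paths and the seven-year cut-off are hypothetical; in practice they would come from the policies defined in the steps above.

```python
# Illustrative sketch of the "Clean" step: archive files that have outlived an
# assumed retention period. Paths and the 7-year cut-off are hypothetical -
# real retention rules come from the policies defined earlier.
import shutil
import time
from pathlib import Path

RETENTION_DAYS = 7 * 365
SOURCE = Path("/data/shared")       # hypothetical live store
ARCHIVE = Path("/data/archive")     # hypothetical archive location

cutoff = time.time() - RETENTION_DAYS * 86400

for path in SOURCE.rglob("*"):
    if path.is_file() and path.stat().st_mtime < cutoff:
        destination = ARCHIVE / path.relative_to(SOURCE)
        destination.parent.mkdir(parents=True, exist_ok=True)
        shutil.move(str(path), destination)   # archive rather than delete outright
        print(f"Archived {path}")
```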
Pruning old data will reduce storage costs and the associated management costs. Control Keep Shadow IT in check. Where you can, restrict access to unsanctioned tools and stop employees using personal accounts for business. Create Most importantly, develop an information management system with people at the heart of it. Implement tools that support your employees – ones they find easy to use – so that they will, indeed, use them. Following these steps will enable organisations to take information in any format, analyse what needs to be preserved and protected, and delete what is unwanted. Content can then be easily sorted and managed, and access and monitoring controls can be implemented where needed. Being able to say you know how your employees are working, where your information is being stored and that you have full control of that information will lead to a boost in efficiency and productivity.

### 6 Tech Trends Small Businesses Should Be Aware of in 2020

In the last decade, we’ve seen the rise of a range of technologies, from AI and cloud computing to IoT devices and blockchain. All of these impact small businesses in some way, whether directly or indirectly. With technology seemingly developing faster than ever before, what technological developments can we expect to see in 2020? In particular, what are the tech trends that will impact small businesses, particularly those in the midst of digital transformation? Here are the top six tech trends for 2020 that small businesses need to know about. 5G Will Launch It’s been hyped for some time, and it seems that in 2020 we will finally see the launch of 5G networks across the globe. 5G is the next generation of mobile broadband, and as it replaces or augments the current 4G LTE network, you will notice dramatic improvements in download and upload speeds, because the time your device takes to communicate with wireless networks is greatly reduced. 5G promises faster-than-ever connections, with download speeds reportedly set to be around 1 Gbps. This could have huge impacts on small businesses, which will be able to download files, work in the cloud, and run other processes online much faster than ever before. In particular, businesses will be able to communicate in lightning-quick time both internally and externally, making processes more efficient and improving customer service. Diverse Mobile Technology Consumers and businesses alike have been fairly mobile-centric in recent years. This is set to continue in 2020, with consumers using their mobile phones for everything from searching for products online to making payments. This means that small businesses need to adapt to their mobile-centric customers. In 2020, it is critical that all websites are mobile-friendly. Consumers are now more likely to browse or shop online, so having a website that is responsive to mobile is essential in order to maximise sales. This affects not only online businesses but also brick-and-mortar ones: an Invesp study found that 32% of consumers changed their mind about a product after checking product information online while in the store. Additionally, Google introduced mobile-first indexing midway through last year, which means that having a mobile-friendly site is critical for SEO. Equally, mobile tools and technologies are becoming available that will help small businesses.
Card reader technology has also gone mobile, which means businesses can now take payments literally anywhere, anytime. Customers can also now make mobile payments at home or in-store, making it easier for them to pay – which is good for everyone!  Blockchain and Alternative Payment Methods As Blockchain technology develops, in 2020 we will see more small businesses taking it up as a way of processing payments. Crypto and digital currencies are gaining traction with consumers, who are looking for ways to pay with digital payment methods. In 2019, we saw crypto and digital currencies have an impact on consumer banking. This year we are likely to see them emerge in the context of business users. We’ve already seen apps such as Tide that allow consumers to make payments directly from their accounts to pay for products, skipping the credit card entirely. This could be a big plus for small businesses, representing significant savings on credit card processing. The Dominance of Voice Voice technology is predicted to rise to prominence in 2020. Furthermore, this technology is set to significantly evolve, so that it will be able to answer complex data queries. This means that businesses will be able to use voice to search and analyse a range of information. Additionally, the rise of voice in the home and for consumers means that businesses need to take this into consideration when it comes to their digital marketing. For example, when consumers search for products using voice search they use different words and phrases than they would when typing , which greatly affects SEO. Automation Software You may think that automated technologies and artificial intelligence (AI) are too big to have any impact on small business, however that is changing in 2020. AI technology is having a wide impact on a range of tools and processes, including many that can help small businesses be more effective and profitable. AI-based technology will help small businesses in areas such as email marketing, customer service, data entry, and accounting. This technology has been available to big companies for a while, and soon it will filter down to small businesses, giving those SMEs who are at the start of the trend an edge over their competitors. Big Data Another tech trend that has been highly influential in recent years but until now viewed as too big for small business is big data. Like AI, as this technology becomes available to small businesses, those that get on board first will have a significant advantage over their competitors. Big data will allow businesses to tap into data and harness the power of this information. Around 63% of UK businesses do not tap into their data at the moment, which represents a loss of opportunity and potential profits. [/vc_column_text][/vc_column][/vc_row] ### Ways to Improve the Security of Your Smart Home [vc_row][vc_column][vc_column_text]When you’re shopping for new appliances and devices for your home, you’ll undoubtedly come across the term “smart home”. It’s one that’s rather popular nowadays, and that’s not going to change anytime soon. The concept of a smart home aims to give us a more convenient and more secure home to live in, and the price you’ll pay for all the devices isn’t overly expensive either. However, if you’re going to be investing in a smart home (or you already have), you might have some security concerns. We aren’t talking about concerns that have to do with your home security system, but instead with the way the smart home works and any potential vulnerabilities. 
And quite frankly, we understand that. But there are a few things you can do to improve the security of your smart home that will go a long way towards eliminating those concerns, so let’s take a look at them. Make Sure Your Internet Network Is Secure If there’s one thing that all smart home devices have in common, it’s that they make use of your home’s internet connection. Therefore, it would make sense for a potential intruder to try to gain access to them via that same internet connection. How do you stop them from getting access? You ensure your network is as secure as possible. You’ll want to have access to your router for this, because you’ll need to change a few settings. First things first, the password to access your wireless network should be unique. Don’t resort to using the same password across multiple accounts or logins; instead, come up with something that’s unique and complex enough that nobody can simply guess it. While we’re on the subject of passwords, chances are you’ll get the option to choose between WEP and WPA/WPA2 for encryption – go for WPA/WPA2. This is a much more secure protocol and far harder to breach. The other thing you’ll want to do is make sure no unauthorised users have access to your home’s wireless network. When you’re going through the settings in your router, you’ll find a list of devices that are connected to your network. If you see something that’s not instantly recognisable there, remove it from the list. This might be your neighbour’s kid who asked for your password once because their internet was out, but it could be someone trying to breach your smart home. Don’t Buy Unknown Devices This is something you should be thinking about when you’re buying the items for your smart home. Whether it’s the hub itself, the lights, the speakers, or the security cameras and alarms, when it comes to smart home technology you’d be smart to get products that are all made by a reliable manufacturer. They might come at a price premium, but aside from quality and, often, ease of use, one major thing you’ll get from going with a reputable manufacturer is security. Reputable manufacturers encrypt their devices’ traffic and update the firmware regularly, so even if a new vulnerability is found, chances are you’ll have an update that protects you from it rather quickly. Check Your Security Regularly A common misconception is that a home security system is a “set and forget” thing, and that once you have it set up, you don’t have to do any maintenance on it. While a good system won’t require too much maintenance on your end, it’s still a good idea to test the system regularly. After all, you don’t want to find out that something isn’t working when you’ve got a burglar ransacking your home. If you have a monitored system, you’ll want to contact the monitoring company and let them know about the test, so they know not to react to the alarms being triggered. If you’ve gone the DIY route, you should check all the cameras and sensors you have installed around the house and make sure everything is working as it should be. See if you get alerts on your phone quickly and reliably, check whether the motion detectors trigger the cameras you set them to trigger, and test everything. While we’re on the subject, we should also mention that alerts should never be ignored or neglected. Even if you have a false alarm, make sure to investigate why that false alarm occurred at the first chance you get.
It might’ve been your kid who just got back home from school that forgot to disarm the alarm, but it might be a potential burglar, too. Make sure to follow up on anything that happened – a smart home gives you an overview of everything that’s going on in your home, by all means, take advantage of that. Wrapping Things Up At the end of the day, there’s no denying that your home security is a critical aspect of living. Yes, some of the devices you should be buying might be pricey. Also, regular check-ups are time-consuming, too. However, when you realise that you’re looking at unparalleled peace of mind, it’s all very much worth it.[/vc_column_text][/vc_column][/vc_row] ### The most important driver of AI innovation lies in the cloud [vc_row][vc_column][vc_column_text]When most people think about the increasing sophistication of artificial intelligence (AI), they tend to focus on algorithmic improvements, increased computing power or breakthroughs by technologists. Whilst these are fundamental to progress, what is often forgotten is the need to collect and store huge troves of data. We would never have seen Google’s AlphaGo beat Go master Lee Se-dol or Facebook’s poker bot best some of the world’s top Texas Hold’em players were it not for contemporary data capture and storage solutions. Data is the fundamental resource and utility that is driving innovation in AI, with its progress inextricably tied to the ability to effectively store data.  The value of AI in countless applications and the cost savings it will deliver will drive much greater adoption. AI developments will result in labour savings of over $40 billion in federal, state, and local government, according to a study by Deloitte. And once we apply this to private industry, we’re looking at savings of more than $5 trillion dollars per year. It’s certainly an exciting prospect, but if industries are to break through as quickly as they’d like to, there needs to be a way for all of that data to be captured and harnessed. Understanding the resources AI needs There’s a reason why companies such as Facebook and Google are at the forefront of this nascent technology. Yes, partly because of their ability to attract the best talent and afford the latest tools, but most importantly because of the sheer volume of their datasets. Google receives a staggering two trillion searches a year, while Facebook collects and collates multiple data points every second from each of its 2.2 billion active users. AI uses these databases to ‘learn’ and find patterns, becoming increasingly smarter as it digests more and more data. Just as a skyscraper requires cement, or a human mind requires books, so too does AI require data to function. We are at a tipping point, and once we start to reduce the costs associated with storing data, we will see the technology improve at a dramatic pace. How driverless cars can avoid hitting a brick wall One of the most pertinent examples of AI taking shape is in the automotive industry. It’s experiencing radical change as autonomous vehicles (AVs) start to become a reality. The likes of Tesla and Google X have been spearheading its development, but traditional incumbents are also starting to invest heavily into the technology - one example being BMW’s recent deal with Daimler - to push their business towards driverless cars. Driverless vehicles such as Waymo generate over 11 terabytes of data a day, though they currently only use AI in a limited capacity. 
Progress has been stifled more by a lack of data than by problems with the engineering teams themselves. Today, driverless vehicles are trained to gather a targeted portion of data from their surrounding environment. This is to conserve storage, which is finite, meaning algorithms only collect a subset of an AV’s context. This is not the ideal way to train AI. The best way to train AI in an AV is the same way we train humans to drive: holistically capture all the information that is available in the environment. However, doing so would mean dramatically increasing the data that needs to be collected and stored, from 11 TB a day to around 200 TB a day. What this means in practice is that a single vehicle would need to store 73 petabytes of data every year (200 TB a day multiplied by 365 days), which would cost roughly $21 million to store in the cloud. This may be viable for a small number of flagship or proof-of-concept vehicles, but it makes scaling across millions of vehicles impossible. If we can reduce the costs associated with cloud storage, we can effectively put autonomous vehicles on the market. It’s not just driverless cars making leaps in AI We’re seeing AI and machine learning solutions emerge across countless industries – post-production in the media offers another good example. The possibilities for studios and production teams will become seemingly limitless once AI-driven tools are implemented. For example, the ability to search an entire content library based on specific scenes, words spoken or even characters will be instantaneous. Production teams will be able to collaborate on content in real time in disparate locations, fostering dynamic development processes and making the production process far more efficient. To achieve the faster data flow required to enable this progress, the industry is increasingly moving away from on-premise storage and to the cloud. Currently, 5,000 petabytes are stored in the cloud in this space, but this is expected to grow to more than 130,000 petabytes by 2021. To facilitate the huge data influx, moving to the cloud is a no-brainer. AI has the capacity and potential to disrupt almost every single industry. No matter which industry we’re discussing, the proliferation of data required to make it a reality and the need for faster and cheaper access to it are the true enablers. This explains why cloud storage services that democratise data storage are vital for any company looking to take that next step in their technology. The good news is that cloud storage is exploding as an industry, and many solutions have cropped up in recent years that are more cost-effective than those companies have been tied to historically. Specialised cloud storage providers like ours are tackling issues of scalability by dramatically reducing the costs associated with storing data. If we want to focus on algorithmic breakthroughs and drive forward innovation in AI, we’re looking at data, data and more data. For companies that learn how to harness that data so it can be captured and analysed effectively, moving to the cloud is really the only viable option. For those that understand this, there’s a lot to be gained.

### Black holes and the future of data management

There may be nothing in the universe more mysterious than a black hole, but the first photograph of one such formation in the Messier 87 galaxy made the phenomenon a little more familiar. In April, the image of a glowing red ring got its 15 minutes of fame.
A second photo featured Event Horizon Telescope computer scientist Katherine Bouman next to a table covered with hard drives, which held information from eight observatories that contributed to the black hole image. Some technology professionals responded with incredulity at this physical sign of the “old school” methods the team used for data transmission—namely, shipping via FedEx. The reason for turning to air cargo in place of data cables is simple. Using internet connections to send five petabytes from Hawaii, the Chilean desert and Antarctica, among other locations, to the compilation centre at the Massachusetts Institute of Technology would take years. Shipping mirrored hard drives, however, takes mere hours. Using aeroplanes for data transmission related to a groundbreaking scientific project is surprisingly emblematic of the new era the world is entering. Our growing ability to monitor and track nearly everything around us will soon overwhelm our resources for storing and sending all that data. Creative alternatives are urgently required. Most of us have read the statistics: global data volumes are expected to reach 175 zettabytes by 2025. Sensors are ascendant, and devices will soon generate far more information than humans with our documents, videos, texts, and other outputs. How we will manage the forthcoming data explosion is unclear. Like the black hole photo, we can see the outlines, but the detail remains out of focus. The data transmission problem One trillion gigs of data and counting will present two critical challenges for the technosphere. First of all, it is impossible to efficiently transmit such a massive amount of information over current or next generation networks. It’s not just the Event Horizon project having issues getting five petabytes from A to B. It’s the transportation industry dealing with the five terabytes of data each autonomous vehicle may generate daily. As well as this, it’s the biotech confronting the 500 terabytes one genome analysis can produce, and manufacturers preparing to collect countless status updates from nearly every consumer product imaginable – not to mention their own facilities and assembly lines. There are cases where slow and steady data transmission will suffice. But an autonomous car must know instantly whether to brake, and reports of a fire at a remote factory must be addressed immediately. This brings technology to the edge. Among the possible solutions for dealing with the massive increase in data volumes is to move compute and storage closer to the point of data creation. Edge and fog computing promise to help achieve the low latency demanded by such applications as augmented reality, as well as to compile raw data into more useful and condensed form before sending it to centralised data centres and the cloud. These technologies will together limit the pressures on our networks to some degree, but this will be done using architectures far different than the cloud-dominated prognostications of just a few years ago. Storage of an expanding data universe The second key problem with creating ever more data is storage. For one image, the Event Horizon Telescope generated the equivalent to 5,000 years’ worth of MP3s, which filled half a ton of hard drives. The world’s data that will exist in 2025 will require 12.5 billion of today’s hard drives to be housed. As on-premises and cloud storage are backed primarily by hard drives of the spinning and flash variety, the future of this technology is a chief consideration. 
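As a rough sanity check on the 12.5 billion figure quoted above, the sums can be done in a few lines. The per-drive capacity is an assumption here, roughly what a high-capacity drive holds at the time of writing.

```python
# Rough sanity check of the figures above. The 14 TB drive size is an
# assumption (a typical high-capacity drive today), not a figure from the text.
global_data_zb = 175                   # projected global data by 2025, in zettabytes
drive_capacity_tb = 14                 # assumed capacity of one hard drive

global_data_tb = global_data_zb * 1_000_000_000   # 1 ZB = 1,000,000,000 TB
drives_needed = global_data_tb / drive_capacity_tb

print(f"{drives_needed / 1e9:.1f} billion drives")  # ~12.5 billion
```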
Herein lies the bad news. Drive-based storage has reached the superparamagnetic limit, a big word for the Moore’s Law equivalent in the digital storage realm. Just as processor speeds have doubled about every two years as manufacturers crammed more transistors into integrated circuits, storage has become more powerful. This is at a rate of about 40 percent per year, as OEMs shrank the magnetic grains coating disks. And just as there is a physical limit to how small transistors can be made and how closely packed together these are in an Intel chip, there is a smallest practical size for magnetic grains—and it has been reached for hard drives. Why hasn’t this gotten much attention? So far, manufacturers have responded with more heads and platters to boost capacity in the same footprint, but the gains delivered through these means are slowing. There are technologies in the works to help, including heat-assisted and microwave-assisted magnetic recording, but they remain problematic and costly. These engineering realities have experts looking in various directions for how best to encapsulate the record-setting amount of data humankind will soon produce. There are advocates for tape backup, which despite its latency offers energy efficiency and a still unmet superparamagnetic limit. Farther afield, researchers are exploring how to store information as DNA. In this scenario, all global data could fit into a coat closet, but the fact that the polymer encoding would need to be rehydrated and fed into a sequencer to be read means instant access would not be a feature. There will be good answers to the storage conundrum, but better IT equipment will not change the fact that voluminous stockpiles of data presents cost, security and compliance issues, as well as usefulness challenges. Enterprises will need to address this. The importance of data management In the coming years, organisations will inevitably throw hardware at the data explosion. This will often mean keeping existing storage systems on hand far longer than they may expect, as the budget available to transition archives to newer storage arrays will be limited by the concurrent need to add capacity for incoming data. This will affect procurement decisions, equipment lifecycle management, and hardware maintenance, support, and upgrade choices. Ultimately, the data deluge is not solely a hardware problem. Much like a family home bursting at the seams with clutter, the best, most cost-effective solution is not to buy a bigger house. The technology industry will need a Marie Kondo-esque dedication to tidying up. The choice of what data to keep and what to discard will become increasingly important if companies want their heads to rise above the IoT data floodwaters. It will, therefore, be critical for organisations to develop strong, effective data management strategies to limit the volume of information retained to the most valuable and impactful for their operations. The design of a data lifecycle road map, from the original creation to its final destruction, must be on the agenda. 
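What might such a road map look like in its simplest possible form? The toy policy table below maps hypothetical classes of information to a retention period and an end-of-life action; the classifications and periods are invented, and real values would come from legal and business requirements.

```python
# A toy data lifecycle "road map": each class of information gets a retention
# period and an end-of-life action. All values are invented for illustration.
LIFECYCLE_POLICY = {
    "financial_records":  {"retention_years": 7,  "end_of_life": "secure_delete"},
    "customer_contracts": {"retention_years": 10, "end_of_life": "secure_delete"},
    "sensor_telemetry":   {"retention_years": 1,  "end_of_life": "aggregate_then_delete"},
    "marketing_assets":   {"retention_years": 3,  "end_of_life": "archive"},
}

def end_of_life_action(data_class: str, age_years: float) -> str:
    """Return the action the road map prescribes for data of a given age."""
    policy = LIFECYCLE_POLICY[data_class]
    if age_years < policy["retention_years"]:
        return "retain"
    return policy["end_of_life"]

print(end_of_life_action("sensor_telemetry", age_years=2.5))  # aggregate_then_delete
```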
Final destruction, for security, privacy, and compliance reasons, must be as permanent and irreversible as if the information fell into a gaping black hole.[/vc_column_text][/vc_column][/vc_row] ### The gathering momentum of hybrid cloud [vc_row][vc_column][vc_column_text]The cloud, in all of its forms, is making a lot of noise in the world of enterprise IT, but what would we find if the full extent of the booming public, private and hybrid cloud landscape was evaluated for the enterprise cloud market? We can begin to find answers in a recent survey conducted, on behalf of Nutanix, by Vanson Bourne. Mining insightful data from more than 2,300 companies across the globe, the Enterprise Cloud Index found - among many interesting data points - that over a third (36%) of enterprise workloads are currently running in the cloud. Moreover, this figure is predicted to increase to over half of all workloads within the next one to two years. Public is not a panacea Companies are increasingly aware of the benefits they can reap from the scalability and pay-per-use economics that the public cloud provides, but they are also waking up to the realisation that not all clouds are equal. The decision to switch to a blend of clouds is hardly surprising: not only are these companies now benefiting as they house each app they deploy in the best possible environment, but this application-centric approach to IT is also going a long way to avoid dreaded vendor and platform lock-in issues. However, lock-in is but one issue, in fact it is one of a number of reasons why companies are looking to deploy their workloads across multiple clouds. Increased workload mobility is certainly a key feature of hybrid cloud, and the financial and technical benefits of being able to easily move workloads between clouds is illustrated in the Vanson Bourne research, where mobility was ranked highly by respondents as one of the benefits, even ranking far higher than cost and security. Building hybrid from the ground up The move to the hybrid cloud might not be as smooth a transition as companies would like, not because switching is an entirely difficult process but simply because, as the survey discovered, there are some fundamental issues in the foundations to build hybrid empires upon. This is partly due to a lack of IT staff with the skill-sets to make hybrid cloud work for them, as well as a distinct lack of tools sophisticated enough to ensure visibility when moving applications around multiple cloud environments. It is important to note that the tools that might close the gap are in development, but can’t seem to come soon enough, as companies are crying out for replacements of the all-too often proprietary technologies that are available now. For example, although technologies like microservices, containers and APIs are working to ensure apps are more flexible, we often find that the management, monitoring and deployment capabilities are not quite batting in the same league yet. When adding traditional on-premise infrastructure to the mix, it becomes even more difficult to see what’s actually going on. All of which impacts on security and compliance, as well as availability and performance. What organisations really need in order to reap the rewards of a hybrid cloud deployment, is to be able to clearly see each application on every cloud environment it comes into contact with. This would mean a better way of guaranteeing compliance and availability, and will almost certainly improve the experience of automating processes. 
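As a purely illustrative sketch of what that visibility means in practice, the snippet below pulls a stubbed application inventory from several environments into a single view showing where each app runs. The adapter classes and environment names are hypothetical stand-ins; a real implementation would call each provider's own APIs or use a dedicated management product.

```python
# A sketch of the cross-cloud visibility idea: pull application inventories
# from each cloud into one view. The "adapters" here are stand-ins, not real SDK calls.
from typing import Dict, List

class CloudAdapter:
    """Hypothetical adapter that knows how to list apps in one environment."""
    def __init__(self, name: str, apps: List[str]):
        self.name = name
        self._apps = apps          # stubbed data instead of a real API call

    def list_applications(self) -> List[str]:
        return self._apps

def build_inventory(adapters: List[CloudAdapter]) -> Dict[str, List[str]]:
    """Map each application to every environment it is running in."""
    inventory: Dict[str, List[str]] = {}
    for adapter in adapters:
        for app in adapter.list_applications():
            inventory.setdefault(app, []).append(adapter.name)
    return inventory

clouds = [
    CloudAdapter("private-dc", ["billing", "crm"]),
    CloudAdapter("public-cloud-a", ["crm", "analytics"]),
    CloudAdapter("public-cloud-b", ["analytics"]),
]

for app, environments in build_inventory(clouds).items():
    print(f"{app:<10} runs in: {', '.join(environments)}")
```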
Cruising through the clouds As organisations look for a solution, there is much promise in the emerging technologies and solutions in development that aim to deliver this cloud-wide visibility. These solutions and technologies should meet a few requirements that can be categorised as three useful advancements. The first is expert analytical tools that can successfully read and evaluate the ins and outs of the public cloud, including its financial and technical nuances, to enable IT to place their apps appropriately. These tools should also allow management not only to see but to understand where their investments stand. A second advance would be cross-cloud networking, crucial for delivering IT across clouds with ease and agility. Able to manage the connectivity of applications across numerous vendor clouds and services, it will be critical in unearthing bottlenecks or potential vulnerabilities. Lastly, a cloud-based disaster recovery service that ensures protection of critical data and applications would, if done well, have the potential to resolve a mountain of issues around availability. A clear move towards hybrid cloud is evident, with the current figure showing 18% of companies deploying workloads across a mixture of public and private clouds, a proportion set to more than double by 2020. This is solid evidence that hybrid cloud IT is a game changer, and is on its way to becoming the most popular method for enterprises to implement IT. While delivering the technologies and solutions mentioned will not be the easiest task, the ever-increasing number of enterprises migrating to hybrid means that putting our heads together to make the necessary changes is in everyone’s best interest. If we can get to grips with cross-cloud visibility, and introduce the vital tools needed to make it work, hybrid cloud will live up to its exciting forecast and become the best way to manage and provision enterprise IT.

### How Working With Multiple Technologies Can Impact Your Salary

The tech industry is saturated with opportunities. While many workers fear for their future due to automation, the tech sector continues to thrive. The demand for highly skilled IT professionals is increasing. Companies are placing more focus on promoting digitisation across their organisation. Organisations are using more business management software systems than ever before, and as a result, demand for tech professionals is skyrocketing. The turnover of digital tech businesses increased by £30 billion between 2011 and 2016, growing twice as fast as that of non-digital businesses. Salesforce may currently be the top CRM software solution, but cloud companies including NetSuite and Microsoft are vying for a bigger share of the market. This gradual levelling of the playing field is allowing developers the chance to cash in on multiple technological niches, broaden their skills, and boost their potential income – especially those working as freelancers. Tech salaries are rising quickly Increasing demand for skilled tech professionals means developers can pull in a significant salary, provided they have the right skills. The wage for a mid-level NetSuite developer across the globe has risen by 18.6% since last year, with NetSuite developers in the UK commanding an average yearly salary of £59,122. A full-time developer working with Salesforce can expect a similar wage to their peers in the NetSuite channel.
However, if they decided to go freelance, they could bring in a daily rate of £500–£690. In terms of job security, the role of a software developer could be seen as one of the ‘safer’ roles currently on the market, given that it’s in demand all over the world. The role also allows professionals to gain valuable knowledge quickly, allowing them to jump onto new and trending technologies, such as crypto-investing, and get ahead of others. Wide range of skills Learning new digital skills is a surefire way to make a positive impact on your salary. The industry is changing at a rapid pace, and professionals must embrace a lifelong attitude to learning to fully take advantage of emerging markets and the opportunities they present. The more skills you acquire, the more valuable you will become. As more and more products and platforms land on the market, many businesses are migrating to new, improved solutions, meaning the skill sets required internally are constantly changing. Without relevant, up-to-date knowledge, tech professionals could find that their value takes a nosedive; widening your skill set and becoming proficient in other areas can help make sure you stay vital to your organisation. According to a Gartner report on skills needed in the workplace, there are sweeping changes to computing technologies and processes, which will create ongoing skills gaps, particularly for operations and support staff. By 2022, it’s expected that 75% of IT staff working for end users will have to gain significant skills in at least one more area of technology to remain effective in their job. Freelance opportunities A quickly changing market, coupled with high demand, makes working as a freelancer a great opportunity, either as a side gig or a full-time role. The number of professionals choosing to work freelance is growing; it’s predicted that half of the UK workforce will work as freelancers by 2020. While a full-time role offers arguably better job security, going freelance allows tech professionals to utilise their skills across various technologies, enabling them to keep their fingers in many pies and helping them to earn more. The way we work is shifting quickly, and with technology continuing to advance, working freelance gives tech professionals more freedom to upskill and build knowledge of different products, helping them stay ahead of the game – especially as each software vendor releases several versions of its product each year. Certifications The best way to show your knowledge is by becoming certified. Such qualifications are a great way to give you an edge in your field – take Salesforce’s Trailhead, for example. A free online training scheme, Trailhead offers developers, administrators, and end users the chance to learn more about Salesforce, no matter what their experience level. Trailhead users can also earn badges to verify their new skills and illustrate their competencies to potential employers. Although there are many certifications available, the proof really is in the pudding: they can make a huge difference to your final pay. With companies crying out for certified employees – seven out of 10 of our clients ask us to provide certified candidates – now really is the time to showcase your skills through these official qualifications. A certificate in Microsoft’s most advanced cloud certification, MCSE: Cloud Platform and Infrastructure, for example, could stand you head and shoulders above others in your field and help you land your dream role with a great wage.
The same can be said for a certification in NetSuite. Anderson Frank's survey also found that over a third of respondents noticed an increase in their salary once they had gained a NetSuite certification. Certifications in technology help you make a great first impression, with research showing that certified employees are more confident, more knowledgeable, more reliable, reach job proficiency quicker and perform at a higher level. And with IT positions also expected to increase by 17% by 2022, becoming certified will help you future-proof your career as a tech professional. There are a huge number of opportunities in the tech industry, and it doesn't have to be 'one size fits all'. By branching out and specialising in more than one technology, tech professionals will be more appealing to many employers and put themselves in a strong position to earn more.

### Adopting AI? Set Realistic Expectations To Help You Succeed

It's an undeniable fact that Artificial Intelligence (AI) is changing the way we live. From facial recognition and voice technology assistants to self-driving cars, AI has crept into our lives and, as consumers, we utilise it without a second thought. But its impact across a wide range of business sectors is perhaps the hottest topic in tech right now. AI has developed and matured to the stage where, for some functions and operations, the levels of accuracy have overtaken human skills. Yet with stories of mind-boggling complexity, escalating project timescales and spiralling costs, the much-hyped technology is still regarded by many business owners with confusion and perceived as a risk. Justin Price, Data Scientist and AI lead at Logicalis, believes that knowing what you want to achieve and setting realistic expectations are the best guarantees of a successful first AI adoption.

Choose the right AI tool for your business need

During my meetings with clients, and from talking to CIOs and business users, it has become apparent that confusion reigns over the terminology used to describe Artificial Intelligence. There are three key terms at play. All three fall under the umbrella term of AI, and are often used interchangeably, but each has a different meaning. AI is a technology that recognises or 'learns' patterns and other specified behaviours to achieve a set goal. Crucially, it is about producing something which didn't exist before. Other types of AI behaviour include:

- Robotic Process Automation (RPA) - software designed to reduce the burden of simple but repetitive tasks done by humans.
- Machine Learning - essentially probability mathematics used to spot patterns in very large samples of data.
- Deep Learning - a neural network which mimics the way the human brain works to examine large data sets, HD images and video.

Being aware of the subtle differences and uses of these terms allows a greater understanding of which tool will best support your business' data insight needs.

Make sure your data is up to the job

When delivering an AI project, around 80% of the total effort and time will go into making sure your data is correct. Underestimating the importance of top quality data is a common pitfall for organisations because, just like any other IT tool, AI will perform poorly if you have low quality data. So the focus in every instance must start here. Data must be well structured and it must be in a format that's consistent and compatible with the AI model.
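To make that point a little more concrete, here is a minimal sketch of the kind of pre-flight structure and consistency check this implies, assuming a tabular CSV dataset; the file name and column names are hypothetical examples rather than anything prescribed above.

```python
# A minimal pre-flight data check, assuming a hypothetical tabular dataset.
import pandas as pd

REQUIRED_COLUMNS = {"customer_id", "order_date", "order_value"}

df = pd.read_csv("orders.csv")  # hypothetical file name

# 1. Structure: are the expected columns present?
missing_cols = REQUIRED_COLUMNS - set(df.columns)
if missing_cols:
    raise ValueError(f"Dataset is missing required columns: {missing_cols}")

# 2. Consistency: parse dates and numbers, turning malformed values into NaN/NaT.
df["order_date"] = pd.to_datetime(df["order_date"], errors="coerce")
df["order_value"] = pd.to_numeric(df["order_value"], errors="coerce")

# 3. Completeness: report the share of missing values per column.
print(df.isna().mean().sort_values(ascending=False))

# Only rows that survive these checks should be fed to the model.
clean = df.dropna(subset=["customer_id", "order_date", "order_value"])
print(f"{len(clean)} of {len(df)} rows are usable for training.")
```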
Don't forget that AI is a process which must be regularly re-trained to ensure accuracy, so ongoing maintenance is essential. It's important to have an idea of where your AI solution will be hosted and how many people are going to use it. If you don't know, get some advice. Once your data is organised, it helps to have an understanding of what you have and where you can get it from. This is where traditional business analytics come in, giving you a good idea of what value you can get from your data before you start to use AI to drive recommendations.

Brace yourself for complexity

Never underestimate the breadth and complexity of what is involved in building and delivering an AI project. For many CIOs this will be their first experience of AI and, even with the right data, there are many variables at play that can add to both the costs and timescale of implementation. Working with the right partner is essential to guide you through the first project. We recommend undertaking an initial project with a fixed fee, whereby you can deliver a functioning result while you establish trust and credibility with your solution provider. As well as the importance of good data, another critical factor in delivering a successful AI project is finding a solution that is scalable. It's one thing to write an AI model on a laptop. But it's a completely different thing to write a model in a scalable way that will survive deployment across a business. This is where getting expert advice will help you decide on the correct infrastructure to support your project. We advise clients to consider pre-built services. All the major IT players have been quick to offer a range of on-premise and cloud solutions, and we'd highly recommend looking at some of these instead of trying to build your own.

Know when AI is (and isn't) the right tool for the job

The concept of AI has been around since the 1950s but now, suddenly, everyone is talking about it. That's because it has finally become commercially viable. We have the data, the processing capabilities and the skilled people to harness its power. With a plethora of impressive use cases available to businesses spanning most sectors, it's no wonder AI is the tech tool of the moment. But AI is just another technology and won't be the right choice for every business looking to gain insight from their data. The importance of prioritising people, process and culture in any AI project has already been discussed by my colleague Fanni Vig in a previous blog, and this is absolutely essential to ensure your business isn't trying to use AI where a different, but still hugely valuable, tool could deliver the desired results. At the highest level, AI allows you to work through far larger data sets than previously possible. It can be used to help automate your data workflows, redirecting low-difficulty but high-repetition tasks to bots, which allows people previously engaged in these tasks to work more efficiently. This process creates a new way of working that may have greater implications across the business as roles change and skills need to be channelled in different directions. Introducing AI is a truly cross-business decision. And let's not forget that, at the most fundamental level, using AI to harness your data is an investment that must show a return.
Finally, get advice from the experts

While the impetus to adopt AI may come from the IT department, the results generated can help drive cross-company productivity, help differentiate businesses from their competitors, and delight your customers through more tailored services. The impact should not be underestimated. But neither should the complexity. So if you are a business considering whether AI could help you get more from your data, my advice would be to work with a trusted solution provider to guide you through your first successful deployment.

### Lawyers + Coders + Beer + Pizza = Global Legal Hackathon London 2020

The third Global Legal Hackathon starts this Friday. When you put lawyers, marketers, designers, consultants and developers into a room with beer and pizza, what do you get? If the last two years of this event are anything to go by, we'll get something really special! And by the way, other food and drinks will be available. GLH2020 is happening over the weekend of 6-8 March. Back in 2018, 40 cities joined in simultaneously across six continents. In 2019 we had 47 cities, and this year, even with the Coronavirus scare, over 40 cities will be involved and teams will be able to participate remotely if they want to. We aim to make London bigger, better, and even more fun. First, a disclosure – Agile Elephant and I have been part of the organising team since the start. Actually, the idea for this event was formed when Brian Kuhn, who at the time ran IBM's Watson Legal business, met David Fisher, CEO of Integra Ledger, at a workshop Rob Millard of Cambridge Strategy Group and I ran back in 2017. Rob and I have hosted the London edition ever since, with a lot of help from our friends, sponsors and the University of Westminster. This is a not-for-profit event, free to enter for the participants, with our sponsors covering the cost of some prizes, as well as lunches, evening meals, soft drinks, coffee, tea, beer and wine. A hackathon wouldn't be a hackathon without beer and pizza! Here I am explaining it in a bit more detail: https://youtu.be/Nhnh81qqEs0

What's the objective?

To progress the business of law, or to facilitate access to the law for the public. Ideas will be pitched on the Friday evening, and teams of 3-10 will form to work over the weekend to create an app or a service. We expect ideas using technologies like AI, Machine Learning, Chatbots, Blockchain, or the Internet of Things. Our six judges will deliberate and pick the winning team for London. That team will enter the virtual semi-finals with all the winners from the other cities on 22 March, where 10 teams will be chosen to compete in the grand final in London on 16 May.

What's this Inclusivity Challenge you mentioned?

"Participants and teams around the world, in every Global Legal Hackathon city, are challenged to invent new ways to increase equity, diversity, and inclusion in the legal industry." At the conclusion of the GLH weekend, a local winner of the GLH Inclusivity Challenge will be selected by each city alongside the main winner and will progress to a global semi-final too. This will be an extra stream and, like the main stream, finalists will be invited to the GLH Finals & Gala, to be held in London in mid-May.
On top of that, the overall winner of the GLH Inclusivity Challenge will be invited to present its solution during a diversity and inclusion summit that BCLP is planning to host in September, where leading figures from the industry will be asked to commit to ensuring the idea is brought to life and scaled up to deliver a lasting impact on the legal industry as a whole.

#GLH2020 London is bigger and better

The London stream of the Global Legal Hackathon is being co-hosted by Cambridge Strategy Group and Agile Elephant, and our venue is kindly provided by the University of Westminster, Marylebone Campus, at 35 Marylebone Road, London NW1 5LS (near Baker Street station). All of the details, latest news and how to register are at https://www.legalhackathon.london – and follow #GLH2020 with #London on social media.

Who is sponsoring this?

This year the bills are being paid by law firms Bryan Cave Leighton Paisner and White & Case, and software company BRYTER, who are providing access to their low-code platform for participants. The Law Society, Disruptive.Live and Techcelerate are supporting us too.

How can you get involved in the GLH?

- Hacker teams and team members - anyone involved in the law, interested in the law, involved in technology for the law, developers, marketers, graphic designers, app designers who want to join the fun. We know some firms will submit teams, and new teams will form on the first evening around a great idea at the GLH.
- Helpers - we need volunteers over the weekend to make it happen and keep everyone happy.
- Mentors - we need subject matter experts and technologists who can mentor the teams over the weekend to help crystallise their ideas, challenge them, or keep them on track.
- Judges - we've got six great judges.
- Sponsors - it's not too late to get involved and spend some of that marketing budget you had planned for big events overseas. This is a 'not for profit' exercise for the hosts, but we need to cover our costs.

We think this is going to be something special. What really happens when you get a bunch of lawyers, coders, designers, consultants and marketing types with their laptops and cloud platforms together over a weekend? Please register, come and join us and find out!

### AI - what it is and isn't, and how it can help me

Before getting into discussions involving Artificial Intelligence (AI), I think it's (always) best to clarify what we refer to as AI, for everyone has a somewhat different understanding. I will try to do so in the following paragraphs. In the sense that we will be using, AI refers to the intelligence of machines and robots, from a computer science standpoint. And from that same standpoint, AI is classified in two ways: weak artificial intelligence (weak AI, or narrow AI), which is focused on a narrow task or specific problem, and strong AI or artificial general intelligence, which describes a machine or software with the ability to apply intelligence to any problem, and to learn to solve general problems. It's important to understand that all systems that claim to use AI nowadays are operating weak AI focused on narrowly defined specific problems, and changing the problem in any way, or its input data, will, in most cases, render the solution useless or non-performant.
As a result, if a specific problem can be solved acceptably well with AI, for example Siri performing the tasks of a virtual assistant via a voice interface in English, that does not mean that switching to a language with a different structure, like Chinese, would perform as well. Not to mention that even if voice commands are perceived correctly and consistently, the technology is nowhere close to actually 'understanding' or interpreting natural language on a general level. In order to achieve Artificial General Intelligence, or to solve hard specific problems like text understanding or interpretation, a lot of research and innovation is still required, and experts in this field agree that the timespan for achieving this can range from 10 to 100 years, even though it's plausible it will happen sometime in the next 20.

AI, independent of the techniques used, be it machine learning or something else, is an intelligent, adaptive pattern matching solution based on statistics and probabilities. In other words, all solutions solve the problem of finding specific patterns in a given set of information. Depending on the problem and the data set, some solutions are better than others. With that in mind, let's have a look at what AI is useful for and how it can be used to improve day-to-day businesses. AI can be applied successfully in any field where there is enough structured or semi-structured data available and a problem can be formulated in a way where finding patterns can provide a solution. For example, in voice recognition, the same word is pronounced in a similar manner by most people, even if it may have a different pitch, tonality or volume. In photography or object recognition there are sets of pixels that determine the features, and in turn the patterns, of a specific object or digital creation. In games, certain sets of actions lead to a better performance in the end, and given enough game data these actions can be determined, connected and improved - see AlphaGo or OpenAI.

In any business, AI can help improve performance, but it's always a matter of effort and reward. When it comes to effort, AI requires consistent data, sometimes manually annotated by people - and this can make a potential AI solution very expensive to implement or scale efficiently. Nevertheless, in today's world, it is definitely worth trying to figure out if an AI solution could give your business a competitive advantage in the market and what options you have to implement it. Businesses, both small and large, should be looking at AI as a tool to create value for their customers, their employees, the bottom line, or to improve a critical internal process. Often it involves all four of them. An easy way to assess this is to assess the problems and the potential for improvement - here's how to start:

What's the problem?
- Identify problem areas in your business (e.g. high rates of return, errors on the production line, ineffective marketing campaigns, misplaced products, stock management issues...)
- Identify possibilities for automation (e.g. correcting errors in financial transactions, using a chatbot for support...)
- Identify frustrations from your consumers (e.g. long lines, technical or customer support...)
- Identify employee irritations and dissatisfactions (e.g. inconsistent schedules, doing repetitive tasks…)
- Determine which one or two would make the biggest impact on your business
- Consult with experts in AI to determine which solutions are viable
- Implement the plan created, either in-house if you have an R&D team, through companies who specialise in AI, or through third-party SaaS services
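To ground the 'finding patterns' framing in one of the example problems above (errors on the production line), here is a toy sketch rather than a production solution; the figures and the three-sigma threshold are invented purely for illustration.

```python
# A toy illustration of framing a business problem as pattern finding:
# flag days whose error rate deviates from the historical pattern.
from statistics import mean, stdev

# Historical "normal" daily error rates (%) for a production line (invented numbers).
baseline = [1.2, 0.9, 1.1, 1.3, 1.0, 1.2, 1.1, 0.8]
avg, spread = mean(baseline), stdev(baseline)

# New readings to screen against the learned pattern.
recent = {"Mon": 1.1, "Tue": 4.8, "Wed": 1.0, "Thu": 5.2}

# Anything well outside the usual pattern is worth investigating: a faulty
# machine, a bad supplier batch, a mistyped order, and so on.
for day, rate in recent.items():
    if rate > avg + 3 * spread:
        print(f"{day}: error rate {rate}% breaks the usual pattern (baseline {avg:.2f}%)")
```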
So how to go about it? If your core business is not AI, unless you already have a decent-sized R&D department, hiring an AI data scientist who knows what they are doing may be quite expensive, and it would not make sense from a business standpoint. Relying on a good developer with no data science background or real expertise to build the AI solution your company needs is in most cases doomed to failure. Understanding which techniques and architecture are needed or might work, as well as knowing how to prepare the data, is crucial for a successful application. So that leaves you with either hiring a third-party specialised team - an agency that has actual data scientists - or using an existing service that solves your problem. Our advice is to contract someone to figure out what the best solution would be for your business - not to build it, but to consult. This way the specialist gets paid to make the best recommendation and has no incentive to upsell you on custom development that may not be needed. AI is changing the way we do business. It might not (yet) be in the way we see in sci-fi flicks, but its creative use is helping to create more loyal customers, happier employees, optimal processes and better bottom lines. Give AI a try - figure out your business improvements and get in touch with a specialist.

### Autonomous vehicles - Is the hype warranted?

So where is your self-driving car? Those of us waiting in hope must reckon with the daily news cycle, as tech companies big and small contribute to a hype that obscures the real progress. Elon Musk has predicted several times that we would achieve 'Level 5 autonomy' inside a year, but these predictions have failed to materialise. Meanwhile, there are huge teams of engineers deep in the technical soup, working out the best sensors to use, how much data to collect, how best to communicate with other vehicles, and the dozens of interconnected technical challenges. The goal is full autonomy: a vehicle that drives itself, anywhere and everywhere, with the human optional. And where are we today? Well, you could have your self-driving car tomorrow – just don't expect to get more than a mile or two without "disengagement". It's clear that we're nowhere near our expectation of perfection when it comes to autonomous vehicles. We'd like all the skill and nuance of a human driver, without any of the flaws. Autonomous vehicles should be (very nearly) perfectly safe, all the time, irrespective of the world around them. But even with the rate of technological progress, will that ever be possible? Before we explore this question, it's worth reminding ourselves of the industry's agreed levels of autonomy. Level 1 is 'driver assistance'. This is where a single aspect is automated, but the driver is very much still in charge. Level 2 is 'partial automation'. This is where chips control two or more elements. In broad terms, this is where we are today, where vehicles are intelligent enough to weave speed and steering systems together using multiple data sources. Level 3 is 'conditional automation'. This is where a vehicle can manage safety-critical functions. Although all aspects of driving can be done automatically, the driver must be on hand to intervene. Level 4 is 'high automation'.
This is where vehicles will be fully autonomous in controlled areas. This will see vehicles drive in geofenced metropolitan areas, harnessing emerging technology in HD mapping, vehicle-to-vehicle communications, machine vision and advanced sensors. And finally, Level 5. This is 'fully autonomous': anywhere, in all environmental conditions. The key difference between this and Level 4 is that the human driver is optional.

Software at its core

Investor extraordinaire Marc Andreessen told us back in 2011 that "software is eating the world". That remains true. At its heart, creating autonomous vehicles is a problem of software, or in fact many complex pieces of interleaved software. The first Level 5 vehicle on the road will look very similar to the Level 2 vehicles of today. The body, sensors, data feeds and so on will look the same. The key differences will be the human interaction and the software: the lines of code that make sense of the world around the vehicle, making predictions and creating actions, at lightning speed. Software will consume sensor data and analyse the vehicle's surroundings. Software will help us navigate around difficult terrain, deciding which route to take and when to change course, avoiding dangerous routes when there is heavy rain. Software will detect objects, from cat's eyes to lampposts. Software will track the motion of a child running after a football on the pavement, even going one step further, applying its advanced grasp of physics to predict the risk that the ball's trajectory will land it on the road. It all sounds great, right? But will these advances ever bring us to complete safety, or is it OK that they will simply be safer than human drivers? This is where we enter an adjacent, fascinating field of study. There is something unique about how humans perceive technology. It's my view that we'll be extremely unforgiving about the mistakes of an autonomous vehicle. Statistical proof that they're safer than human drivers won't be enough when there's something… unnerving, even appalling, about a machine making a decision that causes a crash or ends a life. So better will never be enough. We'll need to see a huge leap forward in autonomous vehicle safety for these vehicles to see mainstream adoption. The only wildcard I see is that insurers may provide big incentives to drivers that leave the driving to the car, but even then I can see reluctance continuing. So this is a core challenge facing the industry: these vehicles must be almost absurdly safe as we move up the levels towards Level 5.

The solution is in the code

New vehicles can carry up to 150 million lines of code (more than modern fighter jets) and the role of software is only increasing with each new model that is rolled out. Despite the challenges the industry faces in moving up the autonomy levels, we are seeing rapid advances in the software, particularly where it's powered by machine learning. Machine learning is vital to the way an autonomous vehicle perceives its environment and will have a role in the decisions it makes about which actions to take, although some decisions are likely to be more rules-based. The challenge in a vehicle perceiving the world as a human does is that the road environment - particularly in dense urban settings - is very complex. It's also subject to environmental conditions, such as rain, fog, smoke and dust, which make it even harder to understand what's happening around the vehicle.
This is one of the areas we've been looking at: before Christmas the Cambridge Consultants team demonstrated a technology called DeepRay, which removes this distortion from video in real time and would enable an autonomous vehicle to see more clearly in real-world conditions. These advances in the software must be complemented by historical datasets, which has given rise to the race for tracked miles. This data then feeds through to new software releases, providing reassurance that vehicles will respond to situations as expected and that they learn. None of this is fast. Elapsed time means more learning and better software. Ultimately, it's these advances in software, allied to robust and methodical testing practices, that will move the industry through the autonomy levels, all the while building the user trust that represents the biggest challenge to adoption. And when will you have your self-driving car? Elon Musk's most recent timeline suggests September this year, but with multiple caveats. Today we're engaged in the big leap from Level 3 to 4, in which all safety-critical functions are performed by the vehicle. This is a viable vision for the next three to five years, so perhaps by 2022. The timeframe for these vehicles to be trusted by the mass market remains an open question.

### Security | Is your IoT gadget spying on you?

With the rise of interactive and intelligent everyday items, concern surrounding the power these items may hold over us can arise. The fear is that, in the hands of hackers, advanced internet-connected appliances could act as spies. Although most people may not be at risk of being watched doing anything more thrilling than the washing up or feeding the pets, the thought of a breach of your security and privacy, and of a helpful household object watching you without your knowledge, is an unsettling one. This concern does not simply stem from paranoia; the threat is a reality, as some devices have been found to be fitted with spyware or with features vulnerable to hacking. A group of researchers from Positive Technologies uncovered security vulnerabilities in smart vacuum cleaners. Dongguan Diqee 360 vacuums contain elements such as a microphone and a night-vision camera. By merely acquiring its MAC address, a hacker could share the wireless network of the device and then send a User Datagram Protocol (UDP) request, potentially giving them the ability to control the functions of the vacuum. Logging onto the device is relatively simple, as a lot of devices may still have the default username and password combination. These vulnerabilities are not exclusive to the Dongguan Diqee 360 vacuum cleaners; there may also be vulnerabilities that affect other IoT devices using the same software. Hackers may also target outdoor surveillance cameras and smart doorbells, according to the Positive Technologies research team. Video and microphone access maximises the hacker's invasion into our private lives, as they can obtain information from what they see us do and what they hear us say. Depending on the position of the IoT devices (some of which can navigate themselves around the household), the hacker could obtain your bank details by watching you use your bank card, or use the information you disclose unknowingly to your household gadget to blackmail you.
Although these are worst-case scenarios, once a hacker has a window into your home, anything the often misleadingly cute-looking gadget's microphone or camera can capture is theirs for the taking. We all assume 'it will never happen to me', but that attitude leaves us even more vulnerable and unprepared for a violation of our privacy. Imagine if the questions you ask your home-bots, to save yourself the effort of searching the web, were recorded for sinister purposes: the hacker with access to your device would know what your interests are, and could choose to exploit this.

What can we do to protect our IoT devices?

It is essential that if people are bringing IoT devices into their homes, they are educated about the risks, to protect themselves and their internet-connected household items from hacking. A way to improve the security of IoT devices is to give the subject greater coverage in the media, to spread information and warnings regarding the technology. Additionally, manufacturers can guide their customers on how to protect their IoT products, and in turn, themselves. A solution currently being considered for implementation in Europe is an "IoT Trust Label" to be placed on items to assure security, encourage transparency about the devices and their functions, and importantly, establish a degree of protection from surveillance. For example, if the customer is informed about the product and its abilities, and how these abilities could be exploited, they can make sure they maximise the security of their device. For instance, customers can ensure they do not leave default settings vulnerable, such as the login details required for access to the Dongguan Diqee 360 vacuums. There is a widespread assumption that if a company's product is reported to have flaws, it is a disaster for the company, but with IoT products, reporting and investigating failures or incidents with the technology could help to increase the standards of the products and improve security. So keeping this in mind when purchasing products is vital, as you could help other owners and future owners of IoT items if you make a product's flaws known. For example, Positive Technologies' research into the vulnerability of the Dongguan Diqee 360 vacuums has opened up a conversation about IoT security, and this conversation will inevitably lead to improvement within the company's manufacturing of the vacuums, as well as educating other manufacturers. Although IoT is not yet highly regulated in the UK, there are associations made up of tech and engineering industry professionals that research IT security and monitor advancing technology. One of these associations is the IEEE (Institute of Electrical and Electronics Engineers), and they have begun to look into IoT technology. The IEEE looks into IoT system architecture, IoT network coding, IoT demands, sensor technologies, smart cities, and more. And as long as such organisations continue their research, manufacturers must continuously check the standards of their products to appease such bodies and, in turn, improve the IoT technology that is taking over the home appliances market. With organisations and manufacturers working together to ensure the security of IoT devices, the opportunity for hackers to infiltrate our homes is less likely to arise. With the right procedures put in place during development, in distribution and after the customer takes the product home, the likelihood of hosting a spy in your home is significantly decreased.
Never be afraid to look at your IoT vacuum cleaner or smart doorbell with suspicion!

### How Can We Ease Education with the Internet of Things?

Online curricula and e-learning are already popular and have been leading change in the education sector in many parts of the globe. But the impact of technology in the education sector is no longer limited to the impact of the web. The connected ecosystem of gadgets, largely known as the Internet of Things (IoT), has now become the flag bearer of change in the education sector. Connected gadgets are transforming classroom education and student-teacher communication in many parts of the globe. IoT solutions are impacting the education domain in the following ways.

- Improved efficiency: Education in the modern world mostly corresponds to skill development, and in this respect IoT is delivering great impact. The real-time streaming of educational content through the tablets or mobile devices of the learners can improve learning efficiency to a great extent. Cloud-based anytime-anywhere access to educational content is also helping to boost efficiency.
- Learner-focused approach: Thanks to the connected ecosystem of devices and applications, learners can now share, communicate, and collaborate with each other. This is helping to build a strong learning community focused on improving problem-solving skills and knowledge.
- Pushing creativity and innovation: The connected ecosystem of devices and applications, empowered with Artificial Intelligence (AI), Machine Learning (ML), and other state-of-the-art technologies, is also paving the way for more creative brilliance and innovation. Thanks to VR and AR technology, getting hands-on training on complex systems and outdoor realities is no longer impossible for learners.

Digging Deeper Into the IoT Impact on Education

The Internet of Things has not only added substantial value to modern education, it has also helped solve a multitude of problems. From remote collaboration and monitoring, to cloud-based anytime-anywhere access, to fluid real-time collaboration between digital screens and the device screens of the learners, this connected reality has transformed education in a multitude of ways. Let us have a quick look at some of the most impressive ways IoT has changed the education sector as a whole.

Building a Strong Community

IoT allows global networking among students, academics, teachers, professionals, and parents, shaping a bigger, neater, and more collaborative educational community that can continuously add value to the learning process. Through the use of connected devices, IoT has helped to break the constraints of the classroom and to reach out to numerous stakeholders in the education process.

Access Without Constraints of Location and Device

Most connected ecosystems of devices are now powered by cloud computing and cloud storage. This helps IoT devices to remain accessible anytime and anywhere. In the education sector, this has a huge impact, as connected devices and applications can break through the constraints of location and device. This anywhere-anytime learning capability is helping to reduce stress while maintaining optimum outreach to students and learners.
Tracking Accountability

In educational institutions, monitoring students and keeping track of the activities of the teaching staff and other professionals have been key concerns related to the efficiency and reputation of institutions. Connected devices and systems can help to keep track of student and staff activities in real time and thus reduce unprofessional behaviour and inefficiency.

Efficient Management

In many educational institutions, the management finds it extremely hard and monotonous to keep track of all the school's activities and to take decisive calls when required. This is where the Internet of Things can play a really great role, by streamlining day-to-day tasks and managing them with a comprehensive central interface. By allowing IoT-led automation to take control of repetitive tasks and responsibilities, a lot of productive time for teachers and staff can be created. For instance, RFID technology can be used to keep track of lab equipment and educational gadgets like projectors. This will also help institutions save the budget spent on hiring manpower for these tasks.

Ensuring Optimum Campus Security

With the rise of global terrorism and the exposure to security risks through connected applications and devices, maintaining campus security and safety remains a concern. This is where connected gadgets like digital IDs, wristbands, and wearable apps can play a really great role. These devices, along with their connected apps, can help parents to track their children on the way to and back from school. These devices and applications can also help school authorities to keep track of in-campus activities that could be harmful from a security perspective. Connected applications for surveillance camera systems within schools and colleges can give educational institutions great security cover. Thanks to connected camera systems and real-time tracking ability, security breaches and risks can be minimised.

Helping Differently-Abled Students

The Internet of Things, with its connected ecosystem of devices and applications, can also play a great role in educating and nurturing differently-abled students. Differently-abled students with impaired eyesight, cognitive problems or limb impairments can now stay connected to staff and parents in real time. IoT-based educational devices can also help them learn faster and cope with the reality around them.

Conclusion

IoT in the education sector has emerged as a significant promise for students, teachers, and faculty staff. As the impact of IoT continues to grow, in the years to come we can expect to see more IoT-led innovation in the education sector.

### How IT can support an Agile methodology

Being agile has never been more important for businesses, regardless of size or sector. Against the backdrop of today's uncertain economic landscape, it's critical that teams can act fast and deliver results in a nimble fashion. The original Agile Manifesto - published in 2001 - has shaped the way business leaders think about work management. Although agile was designed with software development teams in mind, it has proven to be a winning methodology and is now being used by marketing, design and other business functions.
Agile's success within different teams across an organisation has driven projects in product development and marketing for years, but frequently those teams are siloed from one another. The next chapter of Agile is one in which large enterprises adopt the methodology business-wide and use it to prioritise and execute projects across teams that connect back to high-level company objectives. Creating this Agile business environment is a massive exercise, typically being undertaken by IT organisations and project management officers. They are responsible for ensuring that Agile at scale is transforming an entire business, from R&D to product delivery and service. When this involves thousands of employees, a significant investment in new technology and process is required. Fortunately, the four core values outlined in the original Agile Manifesto are still relevant and can be used to build an enterprise-wide agile strategy for your business.

1) Individuals and interactions over processes and tools

This pillar of the Agile Manifesto may sound counterintuitive for IT teams – whose role is to enable and secure organisations – frequently by implementing new processes and tools. But thanks to technological advances – such as cloud collaboration – interactions between individuals and tools and processes are no longer mutually exclusive. In fact, today's IT teams have the opportunity to provide the very tools which will ultimately improve collaboration and speed up interactions across geographical locations. IT departments should choose platforms that are multipurpose and flexible, so each team can define and refine their own processes. This means a team can control their own destiny in terms of how they work and where they work, increasing agility throughout an organisation.

2) Working software over comprehensive documentation

This rule was clearly written for software developers, so I would simplify it for Agile across an enterprise by saying "Working over talking about working." In short, teams should focus on reducing the time-to-value of effort and preventing action from dying in committee. After all, in digital work, it is often quicker and more efficient to iterate than it is to make something perfect the first time around. To help with this, IT teams can implement software that will increase the speed of work by connecting different departments with the data to inform decision making. This is also likely to require a culture shift in organisations, so that they feel comfortable breaking down work into smaller chunks and learning on the go.

3) Customer collaboration over contract negotiation

There is no escaping that we are currently living in the "age of the customer". Although not every team within a business will communicate directly with customers, every team needs to take into account customer feedback. Happily, customer feedback is in no short supply. Support conversations, social channels, review sites, and in-product feedback from apps all provide a real-time look into how customers feel about your product and brand. Making sense of this feedback is something that IT can help with. IT teams will become increasingly crucial when it comes to creating an easy way for teams to access customer feedback and interpret the data that helps them make decisions. Integrations between systems can be used to build dashboards, and shorten the pipeline between customers and the teams that are trying to give them great experiences.
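As a rough illustration of that last point, the sketch below shows one way feedback from two hypothetical sources could be normalised into a single dashboard-style summary; the fetch_* helpers are stand-ins for whichever APIs or exports an organisation actually integrates.

```python
# A minimal sketch of pooling customer feedback from several hypothetical sources.
from collections import Counter

def fetch_support_tickets():
    # Stand-in for a help-desk API; returns already-normalised records.
    return [{"source": "support", "topic": "billing", "rating": 2},
            {"source": "support", "topic": "login", "rating": 1}]

def fetch_app_reviews():
    # Stand-in for an app-store review export.
    return [{"source": "app_store", "topic": "performance", "rating": 4},
            {"source": "app_store", "topic": "billing", "rating": 2}]

# Merge everything into one stream, then summarise by topic so product, support
# and marketing teams are all looking at the same numbers.
feedback = fetch_support_tickets() + fetch_app_reviews()
complaints = Counter(item["topic"] for item in feedback if item["rating"] <= 2)

for topic, count in complaints.most_common():
    print(f"{topic}: {count} low-rated items this week")
```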
4) Responding to change over following a plan

A project plan is important, but it must be flexible enough to allow for projects to be prioritised on the fly in response to the market. The ability to adapt and still thrive in any situation is important for an agile business. But this ability also takes a particular mindset. Teams must embrace change and work without ego in order to achieve it. IT teams can help to foster this mindset by ensuring that all employees have the right tools and processes in place to help them. Business intelligence is not just for the C-suite anymore. Frontline workers must also be able to understand the data in real time in order to unlock business growth opportunities and adapt to changes.

The future is agile

By revisiting the Agile Manifesto as a baseline, IT leaders can encourage a business-wide shift to agility. Change management, coaching and buy-in at all levels of a company will play an essential part in this shift but, for organisations that can achieve it, the rewards will outweigh the initial costs. The next chapter in agility is here. And, although the term is often associated with start-ups and smaller businesses, the reality is that all businesses - irrespective of their size - need to be agile in order to improve their own processes and stay one step ahead of the competition.

### Email Security Tips for Small Businesses

With huge advancements in AI and the rise of the Internet of Things, the business technology landscape is being blessed with ever-increasing efficiency and interconnectivity. Though this is changing the way we work for the better, it's also changing the risks we face on a day-to-day basis. Recent years have been beset by high-profile scandals that continually remind us of the importance of data security. From the near-continuous Facebook data breaches to the 25 million passport numbers stolen in the Marriott data breach, 2018 was a year dominated by stories of businesses failing to protect their customers' data. The damages caused by these breaches are well known and, if you're a business owner or executive, it's vital you protect your activity and safeguard your organisation from similar attacks. The cost of the average data breach to a US company now stands at $7.91 million. According to a recent report, 47% of SMEs experienced a cyber attack in the last year. Cybersecurity is a complicated area composed of many interdependent parts. However, given that 91% of cyber-attacks begin with a fraudulent email, email security is an important place to start for any individual dealing with sensitive data.

Use Strong Passwords

It may sound simple enough, but simple passwords are still one of the easiest ways for cybercriminals to compromise your network security. Ensure that you and all your employees understand what constitutes a strong password. You'll also want to avoid having the same password for all your accounts. To manage the different passwords, install a password manager to help you keep on top of your passwords and keep them safe.

Use Two-Factor Authentication

Most email services today offer two-factor verification processes as standard. This combines your unique password with a second validation step, whether it's an SMS, automated phone call, or recovery email. By using 2FA, you can be safe in the knowledge that even if your password is compromised, it will be difficult for any cybercriminals to access your accounts without your phone.
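As a small illustration of the password advice above, here is a minimal sketch that uses Python's standard library to generate a strong, unique password per account; the length and character set are illustrative choices rather than a recommended policy.

```python
# A minimal sketch of generating strong, unique passwords with the standard library.
import secrets
import string

ALPHABET = string.ascii_letters + string.digits + string.punctuation

def generate_password(length: int = 16) -> str:
    """Return a random password drawn from a cryptographically secure source."""
    return "".join(secrets.choice(ALPHABET) for _ in range(length))

# One distinct password per account, so a single leak doesn't expose everything;
# in practice a password manager stores and generates these for you.
for account in ("email", "crm", "payroll"):
    print(f"{account}: {generate_password()}")
```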
Encryption Software

Most emails are sent and received in plain text. As a result, they're easily legible to prying eyes. If you're sending particularly sensitive information over email, it's recommended that you use an encrypted email service. Since the Snowden leaks, several specialist email services have emerged that encrypt your emails in transit. Doing so means that they will be unintelligible between accounts, avoiding the risk of interception. You can also enable encryption on many of the most popular email services. However, their privacy policies and encryption methods are often likely to be weaker, so if you're sending particularly sensitive information it is worth using a specialist service. This should be used in conjunction with other basic consumer security tools like antivirus and Virtual Private Network software, which should be in effect across all of your active devices. It's important to take your time when choosing a service provider here, as some are more secure than others. While the best choices in the market offer strong all-round protection, there are plenty of unreliable options that have been known to store and even sell user data. Generally speaking, services like NordVPN and Cyberghost are safe options which get great reviews.

Avoid Public Wi-Fi

Make sure you don't access any sensitive company information from a public Wi-Fi connection. Public connections present a whole host of security issues and leave you vulnerable to man-in-the-middle attacks. If you're left with no choice but to use an unsecured network, ensure your VPN is turned on.

Educate Employees

Most scams aren't successful because the attacker is highly skilled. They happen because someone's concentration slipped. For that reason, educating yourself and your employees about the tell-tale signs of phishing scams, the dangers of public Wi-Fi and the importance of maintaining strong passwords is paramount. As your company grows, it's important to make sure that cybersecurity compliance grows alongside it. Onboarding and offboarding are particularly significant times for cybersecurity training, so before you hire new employees, make sure that you have a detailed policy in place that will help keep your data secure.

Email Deletion

Review your email provider's Terms of Service to understand their email deletion process. If you delete an email from your inbox, know where it's stored afterwards. Even if you've deleted the email, that doesn't mean the data has disappeared; it may still be accessible if not properly deleted. Some services also offer expiration dates for emails, after which the emails are completely erased. This is especially useful in cases where emails stay unopened in your inbox for a longer period of time.

Understanding Email Security in 2019

With the tide of high-profile data breaches unlikely to slow, 2019 will be an important year for digital privacy and cybersecurity. Significantly, email security and the rise of anti-phishing measures will dominate discussions of small business cybersecurity. While it's important to implement mechanisms such as two-factor authentication and encryption, it's also important to stay educated on the broader issues and make sure that your employees are doing the same.
With a comprehensive cybersecurity policy that incorporates these email security tips, you're more likely to remain cyber secure in the long run.

### Cyberwarfare: the New AI Frontier

Over the past few years, we have seen artificial intelligence move from the shadows into the mainstream. From the smart music speakers in our front rooms to the digital assistants on the phones in our pockets, we probably interact with or are guided by AI on a daily basis - often without awareness of its impact or influence. The ability for a system to rapidly classify an input, as well as to learn from experience and by example, means that security vendors have been quick to leverage AI to underpin their threat analysis process. This has led to increased accuracy, faster threat recognition and an overall reduction in risk when pitted against even the best human-derived threats. The very same reasons that security vendors have adopted AI have made the technology attractive to cybercriminals, as having an attacking system that can very rapidly classify its target, learn from its failures and exploit its successes, all without human interaction, is a perfect storm scenario for any attacker.

AI vs AI

This leads to a very strange and unknown scenario where we will see attack AIs pitched against detection AIs. Unlike a human, whose choices are guided by experience, skill and bias, an AI will often deploy strategies that are unexpected. If we think of the headline AI victories over humans, such as the 4-1 defeat of Go grandmaster Lee Se-Dol by Google's AlphaGo or IBM's Deep Blue victory over Garry Kasparov, AI thinking can be very unpredictable, often playing a crazy, almost suicidal or oversimplified move that no right-minded human would make. This left-field approach is then built on as part of a larger winning strategy. As AI is pitched against AI, the battleground may become a highly misunderstood place for humans, with attack and defence strategies not being understood; as such, who is winning will be almost impossible to call. This toing and froing may, in fact, never end, leading to a continual battle scenario, and even when we think the battle is won this may be just the next stage in the attacker's strategy, with the real battle being fought on a completely different front.

What should we do?

It is therefore important that we adopt security strategies that do not fully remove the human touch. We need to adopt multiple systems that provide enough information from sources so that we can determine if a threat has indeed become a breach, so the impact of said breach can then be understood. Once again, the assessment process is highly likely to be underpinned by some kind of AI-based decision engine that highlights possible threats that ordinarily might have been missed by the human eye. However, if an attacker AI suspects it will be monitored by an AI, it will likely take a strategy that successfully evades detection. You may ask how we operate in this new AI frontier. In my opinion, we need to take a pragmatic approach that assumes that one or more of our defences have been breached and the intruder is well hidden. We also have to validate all activity and any attempt to access systems and data, rather than depend on one single activity at one time.
In short, we need to adopt a zero trust model and only allow access once the burden of proof shows that the request is valid and the response will be equally uncorrupted.

### Managing the Performance of SaaS-Based Applications

Software as a service (SaaS) has reached mainstream status and will continue to grow as businesses of all sizes increasingly rely upon it to deliver apps and IT services. However, the challenge for these services is that their growth is accompanied by a rise in expectations for high levels of availability and performance. Meeting soaring customer demand is complicated by how SaaS providers must monitor and manage end-user experience with remotely hosted and delivered SaaS applications. Whenever they engage with any digital service, people now expect the relevant web content and apps to load within seconds on all their devices and to provide them with a fast and reliable experience throughout. Our latest survey on e-commerce performance and online consumer shopping behaviour showed that consumers can get easily frustrated with slow-loading websites and therefore abandon the page without finalising their purchase. For 60% of all respondents, poor technical performance means that they will avoid making purchases from that website or mobile app again. High performance is equally as important for SaaS companies as it is for e-commerce businesses. So, if SaaS providers want to be successful and continue to grow, they need to adopt a similar laser focus on the high-performing, super-reliable end-user experience that e-commerce brands are realising they must provide. Meeting these performance demands hasn't been easy, so what are the challenges SaaS businesses need to overcome in order to meet their customers' high expectations? It turns out that there are four key elements that could affect SaaS digital experience performance: global SaaS user growth, infrastructure add-ons, the capricious nature of the Internet, and the use of out-of-date experience management tools.

1: Global SaaS User Growth

Believe it or not, one of the main reasons for heightening performance challenges is the extensive growth of SaaS end users on a global scale. This is because application performance tends to deteriorate the further it is delivered from the SaaS provider's data centre. The further end users are from the origin data centre, the more variables, such as networks, ISPs and browsers, sit between them, and these can be the cause of poor performance.

2: Infrastructure Add-Ons

As many SaaS providers expand into new geographic regions to support more enterprise-level businesses, they tend to add infrastructure as well as divide up more existing systems to help share the load. However, additional infrastructure build-outs inadvertently add greater complexity. The end result for the SaaS provider can be decreased visibility into infrastructure health and end-user application performance.

3: The Capricious Nature of the Internet

Full SaaS outages are a rarity; however, when they do happen, they can have a disastrous effect on the end user. The most recent example of a major SaaS outage is the Amazon S3 (AWS) outage that happened this February. Because of the outage, popular and diverse online services - from high-profile websites and apps to IoT devices - were inaccessible for several hours.
The AWS outage makes clear that 100% availability is unrealistic even when your web services depend on an impressive infrastructure like Amazon's.

4: Use of Out-of-Date Experience Management Tools

To get ahead of potential performance issues before their end users do, SaaS providers need to shift from traditional application performance management (APM) to digital experience management (DEM). So, how does DEM operate? It treats the end-user experience as the ultimate metric and identifies how the myriad of underlying services, systems and components influence it. So, the same way you use monitoring to track the performance and availability of your internal applications, you need to put a process and the right tooling in place to monitor your SaaS applications. SaaS providers may find themselves drowning in data due to the increase in variables that impact performance. In order to make sense of this data and identify the cause of performance issues, they need insight from advanced analytics. When analysed efficiently, the data is useful for issues both within their infrastructure, like when a geographical region or data centre requires more capacity, and outside it, like a slow ISP for the final mile to the customer or the performance of the content delivery networks being used. It is important that business users of SaaS services get involved with the performance management process by consistently measuring the response levels of their SaaS providers and monitoring end-user performance in real time at the closest points of geographic proximity. This information can not only help SaaS users uphold provider SLAs, but also identify performance issues in advance and deal with them quickly. Today businesses are depending more than ever before on SaaS applications and, as users move more mission-critical applications to the cloud, SaaS providers will continue to struggle with the growing demand for their services. As end users demand higher than ever levels of performance and productivity, SaaS providers cannot disappoint with poor performance. SaaS providers should deal with the rise in expectations for availability and performance through new approaches that evolve APM to DEM. The involvement of businesses in the monitoring process ultimately translates to more proactive and productive performance management and accurate issue resolution.
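To illustrate the kind of end-user measurement described above, here is a minimal sketch, not a full DEM product, that times a hypothetical SaaS endpoint from wherever it runs and flags slow responses; run periodically from several geographic locations, it approximates the end-user view that DEM treats as the ultimate metric.

```python
# A minimal synthetic check of a SaaS endpoint's availability and response time.
import requests

# Hypothetical endpoint and threshold; substitute the SaaS application you depend on.
ENDPOINT = "https://status.example-saas.com/health"
SLOW_THRESHOLD_SECONDS = 2.0

def check_endpoint(url: str) -> None:
    try:
        response = requests.get(url, timeout=10)
        elapsed = response.elapsed.total_seconds()
        status = "SLOW" if elapsed > SLOW_THRESHOLD_SECONDS else "OK"
        print(f"{url}: HTTP {response.status_code} in {elapsed:.2f}s [{status}]")
    except requests.RequestException as exc:
        print(f"{url}: unreachable ({exc})")

check_endpoint(ENDPOINT)
```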
The moment that Cloud Shock hits

Cloud Shock refers to that particular moment when the budget holder (or worse, the CFO) realises that the company has spent considerably more on cloud-based projects than they had initially bargained for. The irony here is that it is then, most often, that the IT department is first called upon to explain why the cloud bills are so unexpectedly high! And this is set to become an increasingly common problem, with various analysts predicting growth rates of up to 100% for SaaS spend and 200% for IaaS over the next three years.

It's easy to understand why cloud-based services are becoming so popular, as the benefits of moving apps, services and workloads to the cloud are hugely appealing. Yet the downside, where Cloud Shock hits due to under-budgeted and ungoverned cloud spend, is also becoming an all-too-common story.

How to avoid the pitfalls of Cloud Shock

At recent Snow user events in London, Sydney and Auckland we spoke with numerous customers about these issues, and the vast majority of them agree that Cloud Shock is becoming a widespread problem throughout the industry. So how can you best avoid or manage it? Organisations can use asset inventory and automation solutions to avoid or address Cloud Shock by providing technology decision makers with the ability to:

Govern the process of creating IaaS instances – automatically creating approved instances in the right environment with the appropriate attributes and cost centres, and automatically retiring instances after a pre-determined time or period of inactivity to eliminate wasted spend.

Identify the use of SaaS applications across the organisation – pinpointing the use of SaaS applications is the first step to understanding the costs associated with cloud computing, especially when subscriptions are created with no involvement from the centralised IT function.

Pull subscription data from leading SaaS applications – to manage costs, organisations need to understand what subscriptions have been provisioned, to whom and at what level. Effective subscription management for SaaS apps can yield significant cost recoveries.

The overall lesson that really needs to be learned is that monitoring and managing all forms of cloud spend will enable the asset management function to deliver new value to CFOs and line-of-business managers, helping to avoid Cloud Shock and giving decision makers the confidence that they are investing wisely and making the most of their subscribed services.

Carrying out a thorough cloud audit

Most importantly, a thorough 'cloud audit' is required. You need to know exactly what cloud technologies and services are in use across your organisation, no matter who decided to set them up or who authorised the initial payment for each one. As business unit IT becomes prevalent, visibility and insight into usage become a massive challenge for the centralised IT team and each individual business unit. They increasingly find they are spending without sufficient information about exactly what they are getting, or without understanding the true consequences of ongoing costs.

It needs to be reiterated that these exercises are not all about reducing costs, but mostly about curbing the tendency many organisations currently display to overspend on cloud services that are not directly driving business value. Budget holders – and CFOs – need to be confident that the technology investments they are making are being used optimally and wisely.
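As one illustration of the instance-governance idea above, here is a minimal sketch, using AWS's boto3 SDK, of an audit job that flags long-running instances with no cost-centre tag. The tag name and age threshold are assumptions chosen for illustration; a real asset-management tool would cover many more services and act on its findings rather than simply print them.

```python
from datetime import datetime, timedelta, timezone

import boto3  # AWS SDK for Python

# Illustrative policy assumptions: instances should carry a cost-centre tag,
# and anything running for more than 14 days without one is flagged for review.
REQUIRED_TAG = "CostCentre"
MAX_UNTAGGED_AGE = timedelta(days=14)

ec2 = boto3.client("ec2")
now = datetime.now(timezone.utc)

for reservation in ec2.describe_instances()["Reservations"]:
    for instance in reservation["Instances"]:
        if instance["State"]["Name"] != "running":
            continue
        tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
        age = now - instance["LaunchTime"]
        if REQUIRED_TAG not in tags and age > MAX_UNTAGGED_AGE:
            # A fuller tool might stop the instance or open a ticket;
            # here we only report it as input to the cloud audit.
            print(f"{instance['InstanceId']} running {age.days} days with no {REQUIRED_TAG} tag")
```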
### Online Vs Offline Backup: Navigating the Options

Business continuity depends on ensuring files, data and applications are backed up on a regular basis so that, in the event of disaster, recovery will be fast and comprehensive. With cyber criminals now targeting businesses large and small, data backup represents an essential aspect of any organisation's cyber security plan. But when it comes to backup, is cloud or offline – using local physical devices – best? Here's a top-level comparison of the advantages and drawbacks of both local backup and cloud backup across five key criteria.

Local storage

Local or on-site backup is a tried and tested option that's familiar to many. And there's something very reassuring about keeping files and data locally or 'in-house', backing up data on a range of devices and media – external hard drives, USB thumb drives, optical disks, or NAS servers.

Price – USBs and external hard drives are relatively low cost to purchase, but over time there may be a proliferation of devices required to support the growing needs of the business. While budget-friendly, local storage options aren't immune to physical failure or damage.

Security – local devices offer the peace of mind that you have full and independent control of your data. You know exactly where your data is, and can manage exactly who can and cannot access your files. What's more, with external hard drives, your data is as protected as your network and, once disconnected, will be safe from any malicious attack affecting your infrastructure. But should devices be lost or stolen – if there's a break-in, for example – your data may end up in the hands of others if unencrypted and, from a business perspective, it will be gone forever.

Time and effort – typically this involves undertaking regular manual backups. Indeed, creating and maintaining a local storage system can be time intensive and costly. What's more, manual backup is a task that often goes to the bottom of the 'to do' list in the face of other more pressing tasks. And that leaves the business in a potentially vulnerable position.

Storage limits – if you've got an ongoing backup process in play, then you don't need telling that capacity can quickly become an issue. Disk drives fill up and, as backup needs escalate, the devices needed to keep pace with burgeoning demands will have to get larger.

Support services – dealing with networked servers and ensuring that backup and restore times are fast and efficient means you may need to call on specialist IT support to help manage local device backup processes effectively.

Cloud storage

Cloud storage has been around for a while now and offers a variety of advantages for businesses looking to automate and streamline their data backup strategy.

Price – recent rivalry between Amazon, Microsoft and Google means that the price-per-gigabyte cost for storage has plunged, which means cloud backup services are now even more affordable for businesses of any scale. Charged at a monthly price, business-grade cloud backup services typically feature regular oversight and security. Plus there's no need to own or maintain hardware or software associated with backup storage.

Security – any provider worth its salt will offer advanced encryption methods that protect data during transmission and storage, but these cannot prevent you accidentally deleting or editing important files.
Opting for a fully managed provider who takes care of the end-to-end backup and security process will help maximise protection for your data.

Time and effort – thanks to 'set-up-and-forget' technologies, organisations can save valuable time and money. What's more, regular backups can be scheduled and automated to run continually in the background. So, should disaster strike, restoring to the latest backup enables operations to be back up and running fast.

Storage limits – business-grade cloud storage offers limitless scalability and makes it easy to add capacity on demand. There's no need to wait to upgrade or roll out increased storage, or to scale back down if requirements change.

Support services – there are plenty of options available for businesses looking to use cloud backup. Check that providers have experience helping customers back up to the cloud and understand the most effective approaches for restoring data services in the event of an incident.

As organisations increasingly digitise their business operations, making the move from local storage components to a full cloud backup and storage infrastructure can provide the robust protection required to cope with unexpected events and cyber attacks. Finally, for additional peace of mind, adopting a hybrid option – combining automated cloud platforms with local backups for essential data – will offer a truly 'belt and braces' approach.

### The demystifying guide to social AR

Augmented reality (AR) experiences, like Snapchat Lenses and Instagram Effects, have become bread and butter to social media users. Highly popular, they provide an incredibly effective way of reaching customers on social channels. This explains why social AR should be a fundamental marketing consideration for any business that wants to grow its customer base. The problem is that, although social AR platforms are now simple and intuitive to use, many businesses don't know where to start when it comes to developing an AR lens that will draw in customers. So, as an antidote to the densely worded AR guides and forums, this article offers an easy-to-digest overview of social AR's gloriously creative possibilities.

To create an AR experience that will ignite across social channels, the core idea has to be highly shareable. Users are motivated to engage with and share an AR experience when it makes them look cool, makes someone laugh, helps them express something or be part of a cultural moment. Although it's possible to let users augment their environment with digital layers that surprise and delight, giving users an opportunity to manipulate their face offers even more entertainment while tapping into selfie culture. So face manipulation is often the key to creating highly shareable social AR content.

Once you've cracked a killer idea that neatly aligns with your business, it's time to work out where the experience will 'live': Snapchat, Facebook or Instagram? Or should it be the web? Or an app? After these fundamental decisions have been made, it's time to look at 'elements' – the basic building blocks of an AR experience. Elements fall into different categories, ranging from facemasks to 3D portals. And it's these elements that hold the potential for highly shareable experiences. There are many different ways to augment a face.
From duplicating features to creating masks, distorting elements, changing colours and textures, adding 2D or 3D attachments, GIPHY stickers and animations, sprites, environment particles, overlays and photorealistic make-up shaders – the possibilities are constantly increasing with regular platform updates. If you really want to go to town, additional layers can be added to make the experience even more interactive: freeze frames, score counting, portals, geo-location, two-player gaming, pulling in user data, audio, body tracking and painting the sky… these are all great ways of attracting users to the lens.

Although some AR experiences begin instantly without any user interaction, the more engaging ones give users agency over the lens. Gestures that control an AR experience are called 'triggers'. A trigger could be a simple facial movement, or it could involve interaction with the screen. Typical triggers include an open mouth, a smile, a drag on the screen, raised eyebrows and blowing kisses. The type of trigger that's best suited to the AR experience largely depends on which type of AR element is being used. For example, triggers involving facial expressions are best for facial manipulation elements.

Another consideration that may hold businesses back from creating social AR is thinking that it takes too long to create an AR lens. Depending on its complexity, a decent experience can be created in as little as a week, providing it uses basic elements like facemasks. But more elaborate builds – those entailing portals, games and dynamic lenses – will require up to six weeks.

Now that the basic principles have been outlined, there's no excuse not to start leveraging AR to grow your business's social presence. Crucially, though, there's more to social AR than just raising brand awareness. When done right, social AR can move beyond driving awareness to driving sales. Just Eat, for example, reports that its recent Snapchat AR lens generated an order rate of 10%. This, clearly, is an impressive ROI… and one that your business could benefit from too.

### Technologies That Can Revolutionise Business Models in Manufacturing

In the future, the success factors that will separate winners from losers in an increasingly competitive manufacturing landscape will extend far beyond the ability to manufacture products. In its recent 'Exponential technologies in manufacturing' study, Deloitte observes that organisations from every industry now face mounting pressure to transform and make the shift from product-centric business models to capture other sources of value – and the manufacturing sector is not immune to this challenge.

However, interviews conducted by Deloitte with manufacturing executives highlight a worrying trend. Many interviewees expressed concern that their organisations are not preparing or moving fast enough to address future disruptions on the horizon. Prioritising the areas in which digitalisation will deliver the most benefits is just part of the challenge. Many manufacturers are locked in a world of legacy IT that is hampering efforts to adopt Industry 4.0 operational innovations and become digital manufacturing enterprises (DMEs) as quickly or holistically as needed.

Enabling digital transformation capabilities

Overcoming the barriers to digitalisation that exist within an enterprise's own IT infrastructures represents a first vital step on the journey to digital transformation.
Indeed, having the necessary IT services in place that are both available and easy to rapidly scale is a key prerequisite for digitalisation—which that means enterprise cloud strategies will need to be adopted to support these requirements. After which, the advantages of 'disruptive' technologies can be fully utilised to enable future sustainable growth and enhanced market relevance. Furthermore, transitioning to a cloud-environment also makes it easier to deliver demand-oriented access to innovative applications at any time, whether for pilot projects or standard processes. Creating the prerequisites for intelligent systems The second requirement for businesses looking to become a DME is the ability to capture data from all internal and external processes, contextualise it, and use it for decision-making and forecasts. Central systems for enterprise management—typically enterprise resource planning (ERP) systems in the manufacturing industry—must therefore be integrated across all relevant business areas, as well as external applications. As well as creating the basis for intelligent systems, this will also enable the initiation of vertical networks—cascading from production to corporate management—and the design of digitalised, horizontal eco-systems with partners, suppliers, and customers. According to KPMG's Global Manufacturing Outlook, this latter capability will be decisive for enabling maximum supply chain transparency and growth. Modern ERP systems are geared to delivering all this and more. Today’s systems are available flexibly in the cloud, or as an on-premises solution, and designed to support the individual cloud strategies of enterprises striving for a digitally connected future. With the appropriate infrastructure in place, manufacturers can then deploy the modern software architectures and flexible interfaces that make it easier to test and implement Industry 4.0 technology innovations. Industry 4.0 and digital twinning One such innovation is digital twinning—groundbreaking Industry 4.0 technology that represents an important innovation for the manufacturing industry. It will enable engineers to create virtual product prototypes and maintain virtual representations of these products, making necessary amendments to optimise its business performance. In the future, digital twins will be created in a wide range of contexts to serve a variety of objectives that will include driving leaner, and more productive, operations. As PwC explains in its report Digital Factories 2020, digital twinning is not only valuable for product development. Digital twins could be utilised to simulate an asset’s operations—through the creation of digital simulations of facilities and environments that emulate real-life conditions—to enable enhanced predictive maintenance. As manufacturing processes become increasingly digital, and smart connected technologies become more pervasive thanks to the Internet of Things (IoT), interactivity between the physical and digital worlds is set to grow. Indeed, a recent Research and Markets report predicts that, by 2022, around 85 percent of all IoT enabled platforms will have digital twin functionality. Blockchain in the supply chain The growing trend towards digitally networked supplier ecosystems means the demand for maximum transparency and security across the supply chain is gaining momentum. 
With companies coming under increasing scrutiny, and customer confidence a top priority, the ability to facilitate tamper-proof transactions and services is becoming increasingly important. Blockchain is a data structure that makes it possible to create a digital ledger of transactions. This uses cryptography to manage secure access to the enterprise blockchain ledger, and once a block of data is recorded on the ledger, it is extremely difficult to change or remove. Implementing blockchain technology can revolutionise the supply chain, delivering the transparency, scalability, and enhanced security that makes it easier and safer for businesses to work together over the internet. When connected to an ERP system, this technology can also provide executives with real-time, robust data. MIT is currently developing a peer-to-peer platform that organisations can use as a blueprint for initiating highly secure blockchain applications. Artificial intelligence for robotics and planning Similarly, artificial intelligence (AI) is set to transform manufacturing—enabling the development of AI-powered robots that can undertake automated issue identification.  Machine learning will enable us to test and learn from each iteration what works, and what doesn’t to improve production efficiencies. Large manufacturing companies have already begun using AI to empower enhanced material purchasing and allocation decisions, or to more accurately predict delivery times and volumes—based on capacity, and planned, and unplanned, downtime. Others are exploring how predictive analytics and AI could enable them to reinvent capacity planning and achieve a more predictable performance. The emergence of new human-machine interfaces Finally, as the IoT grows, human-machine interfaces (HMIs) are becoming increasingly sophisticated. Technologies like augmented reality glasses and voice recognition offer the promise of transforming production throughput and quality, by enabling workers of all skill levels to efficiently complete complex tasks with minimal formal training. Providers such as Vuzix and Google Glass Enterprise Edition provide wearable devices that give users quick access to information, such as notifications or navigation instructions. These can increase the efficiency of technicians, engineers, and other employees working on the production line or undertaking maintenance in the field. Applications range from the provision of step-by-step instructions, guidance on selecting the right tools, or the photographing, recording, and reporting of quality issues. Voice-driven technologies are set to have a major impact on industrial landscape too. Indeed, a recent study undertaken by Zebra reveals that 51 percent of manufacturing companies are planning to expand their use of voice technology in the next five years. These implementations will encompass far more than simply eliminating the need for workers to juggle equipment in one hand while making inputs on a mobile device with the other, or improving communication between employees in production facilities. When linked with an extensive ERP system that oversees all operations on a factory floor, voice-driven technologies can also help to improve decision-making. The moment a command is issued, a smart ERP system can sync these with a manufacturer’s internal systems to collect invaluable insights over how these are being actioned across the factory floor. 
Delivering this data in real time makes it possible to pinpoint where operational inefficiencies lie and focus efforts on closing these gaps to increase output.

Connecting everything together

The Industry 4.0 revolution encompasses a broader digital transformation that spans processes, functions and technologies, transforming how manufacturers make decisions that impact operations, deliver value and improve performance. Manufacturers need to prepare a pathway toward digital maturity in readiness for the adoption of advanced technologies, and the enablement of real-time access to data and intelligence across the enterprise and beyond. From digital supply networks to AI, the ability to integrate advanced new technologies with existing IT infrastructures will enable ERP to connect everything together and leverage cloud computing, the internet, cognitive computing and robotics to apply live data insights to production and on-demand consumption models.

### Delivering an enterprise mobility strategy that is fit-for-purpose

When IT teams embark on digital transformation projects to automate paper-based systems and improve business processes for field-based or mobile workers, the enterprise mobility leg of the journey often involves selecting the right tablet, rugged device or mobile device strategy. In sectors such as construction, logistics, engineering, manufacturing or the public sector, IT teams can easily make the wrong call: either they select the most seemingly 'cost-effective' consumer device available on the market and enhance it with a tough case to make it 'rugged', or they adopt over-specified and unnecessary military-grade devices. Long term, this procurement strategy can often turn out to be expensive and ineffective.

In cases where popular consumer mobile brands and devices are selected, IT teams mostly procure these devices in bulk and simply replace them as they break, believing that this approach is 'cheaper' and more effective. Over time, though, the cost of devices and of downtime for teams and IT mounts up, making the total cost of ownership very expensive. The alternative approach considered by enterprises is to adopt over-specified military-grade devices. This route is often unnecessary, and it raises a fundamental question for IT teams about whether – as part of their enterprise mobility and digital transformation programme – they are selecting devices that are fit-for-purpose, according to the business processes that the strategy has been designed to improve.

In addition, regardless of whether a device is a standard, well-known consumer brand or a piece of military-grade kit, it will eventually break at some stage. This will cause downtime for the end-user and a reduction in personal and overall business productivity. There will be a knock-on effect on the IT department, as IT will have to deal with the broken device and plug the gap in the meantime. Depending on the severity of the problem, IT might be able to solve the issue with the user over the phone, or it will have to replace the device. This sort of problem happens all too often across mobile workforces, and the latter outcome increases the total cost of ownership of any enterprise mobility programme. This then raises a fundamental question about why any organisation would set their field service and IT teams up for this sort of over-priced headache and IT failure.
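To make the total-cost-of-ownership argument above concrete, here is a deliberately simplified sketch. Every figure in it is an invented, illustrative assumption rather than vendor data, so the point is the shape of the calculation, not the exact numbers.

```python
# A deliberately simplified total-cost-of-ownership comparison.
# All figures below are invented, illustrative assumptions - not vendor data;
# substitute your own device prices, failure rates and downtime costs.

def three_year_tco(unit_price, annual_failure_rate, downtime_cost_per_failure,
                   fleet_size=100, years=3):
    """Rough TCO: initial purchase plus replacements and the downtime they cause."""
    failures = fleet_size * annual_failure_rate * years
    purchase = fleet_size * unit_price
    replacements = failures * unit_price
    downtime = failures * downtime_cost_per_failure
    return purchase + replacements + downtime

consumer_device = three_year_tco(unit_price=400, annual_failure_rate=0.35,
                                 downtime_cost_per_failure=600)
business_rugged = three_year_tco(unit_price=900, annual_failure_rate=0.08,
                                 downtime_cost_per_failure=600)

print(f"Consumer devices, 3-year TCO:        £{consumer_device:,.0f}")
print(f"Business-rugged devices, 3-year TCO: £{business_rugged:,.0f}")
```

With these assumed figures, the 'cheap' consumer device works out more expensive over three years once replacements and downtime are counted, which is exactly the trap described above.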
We believe that a better approach to take would be to develop a mobile strategy that is based on selecting a device that is fit-for-purpose, business-rugged and which offers strong device up-time and fast replacement services. That in mind, what should organisations consider as they develop enterprise mobility and digital transformation strategies to improve efficiency and reduce expensive device costs? It’s all about the end user As UK plc evaluates how to improve business processes within the organisation, it’s important to consider the applications that the users will access to do their work. Do these applications have a mobile version for users to access?  When employees use ‘said’ software, what is the actual user experience of that software like – and do internal trials prove that productivity has increased.  Or at the very least, from a wider strategic perspective, has the software been customised to deliver on the business promise and does it have the potential to truly drive tangible improvements? Once these application design questions have been asked, it’s worth considering the device that employees will use as the interface with the various software applications that are driving the business forward. It is no longer cost-effective in the long run to buy perceivably ‘cheap’ devices. In this scenario, where enterprise mobility forms a crucial part of the digital transformation strategy, business leaders should consider selecting the best device for the business process that requires improvement. Improving BPM with a fit-for-purpose device When selecting a device, it is important for organisations to consider the applications that will function on the devices and the environment that these devices will function in. For argument’s sake, if you ran a construction company, you might just need a tablet that is stronger than a standard iPad, but which isn’t military grade. So, why select either of those devices, when there is a tablet or mobile that is designed to meet the organisation’s needs.  Consult with your device providers and challenge them on the devices that they are advising you to use. Your devices may not be fit-for-purpose. It’s a bit like using a butter knife to take a screw out. Sure, it will be effective once or twice; but in the long-term using that ‘tool’ will become frustrating, the knife will break, and you’ll need to use a proper screwdriver to do the job. The same applies to ‘mobility’. Business rugged The next point centres on whether the device that is selected is “Business Rugged”. Within hardwearing environments you only really need a device that is ‘business rugged’ or rugged enough for the environment that the device is operating within. Again, this is where your device provider should work with your team to identify what physical scenarios the device might be exposed to. For example, is the device used by sales teams, or engineers within manufacturing or construction sites, or is it a case of warehouse and logistics teams using barcode scanners while on the road? What is the physical environment that devices operate within, what types of business scenarios will they experience – will they be dropped, exposed to a lot of dust, what about chemicals or will the screen be exposed to mud? Considering how the device will relate to the environment, and the person it is being used by is crucial and can ensure that you buy the correctly specified devices for your team. Can I speak to a human, please? 
Once you've assessed how your end users will use the software and applications, and what supporting devices are required to improve processes, it's important to ensure that you're happy with your device supplier's support, device repair and replacement procedures. All too often it is difficult for IT teams simply to talk to a human who can help them with their device provisioning. Having a supplier that is 'easy to work with' is key, so ensure that your supplier has the right capabilities and is willing to offer you easy access to them for advice and support.

What is your device downtime strategy?

This is important because at some stage, beyond initially getting set up, your devices will suffer a few knocks and will break. Even military-grade devices break. So the question is, when this happens, what is your wider replacement strategy – which in turn raises a further question about how your team and your supplier's team are set up to support your business. You don't want to discover further down the line that your supplier leaves you 'high and dry' just when you really need them – when a device breaks. That will kill productivity for end-users within the business and destroy confidence in IT and the strategies it is executing.

Conclusion

Too often, IT departments invest in their organisations' enterprise mobility by selecting a set of devices in bulk without scrutinising whether those devices are fit-for-purpose or whether the promised contributions to productivity are really being realised. We encourage CIOs to review device procurement policies and to establish whether their devices truly are fit-for-purpose and business rugged, increase productivity and decrease total cost of ownership.

### Solving Surveillance Storage Woes with the Cloud

There's no doubt we're in the midst of a gold rush for surveillance products and services – and while too much surveillance is problematic for obvious reasons, it remains a critically important part of the security strategy for governments and private enterprises. According to IHS Markit, the surveillance market grew at almost three times the annual rate of previous years in the period 2016-18, reaching worldwide market revenue of $18.2 billion by the end of that period, and market growth is only set to increase. From public sector organisations looking to improve public safety and help catch criminals, to companies building new customer experiences that rely on surveillance systems – like Amazon's futuristic grocery store, Go – this market is projected to grow to more than $68 billion by 2023.

A lot of this growth has come from further migration from analogue to Internet Protocol (IP) based video surveillance systems and digital upgrades to existing hybrid systems – 62% of all cameras shipped in 2017 were network cameras, a figure that jumped to 70% in 2018 – and this will no doubt continue growing over time.

Surveillance has led to an explosion of data

The amount of data stored globally is anticipated to reach 163 zettabytes by 2025 according to IDC research, and the primary driver of this data explosion is imaging, and video in particular. Today, a single 4K video camera running at 30 frames per second and compressing video with the H.264 codec will generate, in the best case scenario, around 255GB of data per day that needs to be stored.
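As a rough sanity check, the arithmetic from camera bitrate to daily storage is straightforward. The bitrate below is an assumed round figure for a 4K/30fps H.264 stream, and real cameras vary widely with codec, scene complexity and settings, so treat the output as order-of-magnitude only.

```python
# Back-of-the-envelope storage arithmetic for a single surveillance camera.
# The bitrate is an assumed, illustrative figure for a 4K/30fps H.264 stream.

BITRATE_MBPS = 24                    # assumed average stream bitrate, megabits per second
SECONDS_PER_DAY = 24 * 60 * 60

bytes_per_day = BITRATE_MBPS * 1_000_000 / 8 * SECONDS_PER_DAY
gb_per_day = bytes_per_day / 1_000_000_000
print(f"One camera: ~{gb_per_day:.0f} GB per day")     # roughly 259 GB with these assumptions

retention_days = 30                  # assumed retention policy
print(f"Keeping {retention_days} days of footage: ~{gb_per_day * retention_days / 1000:.1f} TB per camera")
```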
Extrapolating this for the estimated 420,000 cameras present in London would mean 150 petabytes of data generated a day. Moreover, the 700 million surveillance cameras estimated to exist globally are generating an estimate of more than 100,000 petabytes a day - and that’s if you assume we’re using the newest codec (H.265) for optimum video file compression, which won’t be the case across the board.  The generation of data from these new higher resolution cameras has vastly outstripped most organisations’ storage budgets. Surveillance firms are coping with these massive amounts of data by reducing frame rates and storing data for only a few days.  Neither of these solutions is desirable. The industry clearly needs much less costly storage options than the typical on-premises hardware solutions in general use today.   An increase in the usage of body worn video cameras (BWVs) is a great example of how this evolution is increasing data demands. A pilot scheme by the London’s Metropolitan Police in 2016 saw a 93% reduction in the number of complaints made against police who were wearing BWVs, and has led to wider rollout all over the UK. In the last couple of years, UK supermarket Asda has begun to equip its security guards with BWVs as well, with other businesses no doubt eyeing this equipment with interest.    A typical BWV camera generates around one to two hours of footage per day, or around 3GB of data a day, and this will grow significantly as these cameras move to higher resolutions and greater capabilities. Further to this is the advancement in associated technologies, such as artificial intelligence and machine learning, which are being applied to this surveillance imagery to obtain better insights - and need ever larger data sets. Indeed, this factor may have played a significant part in the Metropolitan Police’s decision to go with an unlimited data contract as part of their surveillance contract with their provider Axon.   Stricter regulations are also forcing businesses and public sector organisations to keep the video footage they capture on file for longer. Today, airport guidelines mandate that footage of on-camera injuries, thefts or conflicts be kept for at least seven years. And if an incident is captured from multiple angles and cameras, that amounts to hundreds of gigabytes of data. The need for sufficient storage capabilities becomes even more pertinent with the increase of 4K cameras (some cameras today are even 8K or 10K) as well as 5G capabilities increasing the scope of wireless data transmission.    Adopting the right storage strategy  Making the decision to move to cloud storage can be tricky. There’s often two ways of doing it - either going with a single vendor solution which covers you for everything - that’s cameras, software and storage - or taking the systems integrator option, where you purchase a bespoke solution via a third-party.  If you opt for a single vendor for your surveillance solution, you risk being locked into using their cloud storage provider which can cost you in the long term. For a hundred police officers for example, a five year commitment for body cams will cost in the region of half a million dollars or pounds, with around 65% of that covering the storage alone.   Each camera will produce around 500 GB of data to be stored in the first three months alone. Typically, such data will be purged and the storage reused unless it’s footage of a criminal activity, in which case, it should be stored for an average of four to five years. 
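Sticking with the article's own figure of roughly 3GB per body-worn camera per day, a short sketch shows how directly the retention policy drives the storage bill. The per-gigabyte price is an assumed placeholder for generic cloud object storage, not any provider's published rate.

```python
# How retention policy drives body-worn-video storage costs.
# The storage price is an assumed placeholder, not any provider's published rate.

OFFICERS = 100
GB_PER_CAMERA_PER_DAY = 3            # figure quoted in the article
PRICE_PER_GB_MONTH = 0.02            # assumed object-storage price in dollars

def annual_storage_cost(retention_days):
    """Approximate steady-state cost of keeping `retention_days` of footage per officer."""
    stored_gb = OFFICERS * GB_PER_CAMERA_PER_DAY * retention_days
    return stored_gb * PRICE_PER_GB_MONTH * 12

for days in (14, 365, 5 * 365):
    print(f"{days:>5} days retained: ${annual_storage_cost(days):>10,.0f} per year")
```

Under these assumptions, stretching retention from two weeks to five years multiplies the steady-state bill by more than a hundred, which is why the trade-offs described below are so stark.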
After speaking with one police department last year, I was surprised to find out they only kept body cam videos for two weeks before reusing the storage. They would ideally keep all footage for at least two years but currently can't afford it. Sometimes they don't even know a crime has occurred for several months, and by then the video has been erased.

Within the five-year timeline of typical usage, another 500GB of data is generated per camera, bringing the total to around a terabyte. So you're paying around $350,000 to store just 100 terabytes of data annually for the lifetime of your contract. It might be hard to make the argument to your CFO to go for this solution when independent cloud storage solutions can store the same amount of data for less than $40,000!

The hybrid cloud is also a solution that is becoming increasingly relevant for data storage needs in a surveillance context. Concerns about a pure cloud solution for surveillance applications can include uncertainty over operational effectiveness (for example, speed if the internet goes down), bandwidth issues due to the scale of camera equipment on site, and legal requirements that may mandate data retention for significant periods of time. Not everything needs to be in the cloud. Using a hybrid cloud solution can enable organisations to keep some surveillance video storage facilities on-premises whilst moving others to the cloud. The most recent video would be stored locally for the sake of speed – where it usually only needs to stay for a day or two – before being copied to the cloud, where it can be kept for as long as the organisation needs it. In doing so, security and surveillance solutions can become more cost-effective and allow companies and governments to safeguard the information they need to do their jobs effectively.

With increasing amounts of data being generated, avoiding becoming locked into a surveillance vendor's cloud storage solution is key to preventing future costs from spiralling out of control. While all-in-one solutions have their benefits, with so much data needing to be stored, analysed and managed, companies should be mindful to research a cloud storage solution that fits their individual needs. With an easy way to take care of all that data, government agencies and security decision makers can carry out their duties as efficiently as possible.

### Critical Open Source Components of Telecom Cloud Management in the 5G Era

The future of ultra-reliable, super-fast telecommunications delivered via 5G mobile networks is much closer to reality than you might think. It's an exciting proposition: some tests have already proven the capability of delivering 10Gbps at the edge, making technologies such as self-driving cars much more feasible. That 10G speed is the same as an enterprise-level cloud direct connect! 5G will also enable diverse industries such as healthcare, retail and travel to use and deploy mobile applications and services that would never have worked over a 4G connection. Augmented Reality (AR) and Virtual Reality (VR) applications will also see the inherent benefits of 5G.

For 5G to realise its full potential, though, sophisticated cloud management solutions that can handle the different types of hardware and software involved must be brought together to act in unison.
In this article, we’ll look at the three critical components of telecom management: orchestration software, containerised network functions (CNF) and infrastructure management technology, and how gateway platforms (e.g. MCP Edge) will be used to tie these components together. Additionally, we’ll also explain how this technology will bring some unbelievable benefits to the telecommunications industry as a whole and its end users. But first, some background on the open source movement, because without open source technology, much of the development required for 5G would be out of budget for most businesses. Open Source Technology Many of the technologies mentioned in this article are either open source or in the process of being released as open source products. These include Kubernetes orchestration software, Magma telecom management software and OpenStack infrastructure manager.  The open source movement was designed to improve collaboration between developers by allowing software to be freely shared and edited by developer communities. Not only does this mean that there is a pool of talented developers constantly improving the software, it also means that developers can use, modify and combine the open source technologies to create their own enterprise solutions without passing on the costs from huge licensing fees to the end user. Since many extremely talented programmers are committed to the open source movement, the technologies that result from this collaboration are as good – if not better –  than many proprietary versions. Open source technology also supports globalisation, an important factor in mobile telecommunications. Technological advancements can be quickly adopted and circulated throughout the world, increasing access to new benefits such as 5G. The structure of open source development is often best compared to an onion. At the center are the core contributors who do most of the development work. Directly outside the center rings are contributors who donate their time responding to bug reports and pull requests. Then, there are contributors who make up the outer layer by submitting bug reports with a further layer of moderators and repository monitors outside of them. Advanced Orchestration Software  The first critical component of a futuristic 5G telecom solution is Kubernetes, a popular open source program which goes beyond traditional workflow orchestration. Kubernetes was developed and used by Google for 15 years before they open-sourced it. It is designed to manage containerised applications and services (which we’ll go into in the next section), as well as for process automation and declarative programming. In simplest terms, Kubernetes takes high-level information on the desired state of a system and then acts to bring about said state.  For example, if a container fails to respond to a health check, it will be terminated and replaced. If an application is developed or modified, Kubernetes can construct a new container and allocate the necessary resources to it. Kubernetes can also perform load balancing by ensuring that both CPU and RAM allocation is within a certain set of parameters. It can also manage the secure management of OAuth tokens, SSH keys and passwords. Kubernetes isn't a traditional platform-as-a-service (PaaS) offering, though. Instead, it’s built to work with any containerised application and/or service in supporting a wide variety of stateful, stateless and data-processing workloads. 
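To make the desired-state idea above concrete, here is a minimal sketch of a Kubernetes Deployment, written as a plain Python dictionary purely for illustration. The image name, port and resource figures are hypothetical placeholders; in practice such a manifest is usually written directly in YAML and applied with kubectl.

```python
import json

# A minimal Kubernetes Deployment expressed as a Python dict. The container
# image, port and resource figures are illustrative placeholders only.
deployment = {
    "apiVersion": "apps/v1",
    "kind": "Deployment",
    "metadata": {"name": "packet-gateway"},
    "spec": {
        "replicas": 3,  # desired state: Kubernetes keeps three copies running
        "selector": {"matchLabels": {"app": "packet-gateway"}},
        "template": {
            "metadata": {"labels": {"app": "packet-gateway"}},
            "spec": {
                "containers": [{
                    "name": "packet-gateway",
                    "image": "registry.example.com/packet-gateway:1.0",
                    "ports": [{"containerPort": 8080}],
                    # If this probe fails, the container is restarted automatically.
                    "livenessProbe": {"httpGet": {"path": "/healthz", "port": 8080},
                                      "periodSeconds": 10},
                    # Scheduling keeps CPU and RAM allocation within these bounds.
                    "resources": {"requests": {"cpu": "250m", "memory": "256Mi"},
                                  "limits": {"cpu": "500m", "memory": "512Mi"}},
                }]
            },
        },
    },
}

print(json.dumps(deployment, indent=2))  # JSON is also valid YAML, so this could be applied as-is
```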
Such flexibility makes Kubernetes the ideal candidate for a 5G telecom system comprised of multiple different parts. Containerised Network Functions An example of a containerised network function (CNF) that could play a big role in a 5G telecommunications network is Facebook's Magma. Magma provides a containerised mobile packet core along with network automation and management tools, which serves to speed up and simplify telecom connectivity and operations at the edge. Earlier this year, Facebook announced that they would be open sourcing Magma, which is huge news for telecom vendors and consultants, who can now look forward to developing and auditing a new wave of edge-focused telecom management solutions. Containerisation decouples applications from a specific operating system, which is how Magma enables simple cloud connectivity for diverse mobile network operators. Magma helps vendors from remote areas federate their systems with pre-existing LTE networks rather than having to integrate with complex centralised evolved packet core (EVP) deployments. Magma is also ideal for private LTE network operators and wireless enterprise deployments. Additionally, it can automate tasks such as element configuration, device provisioning and software updates. However, not all telecom providers use serverless technology. A truly comprehensive cloud management solution would need to enable VMs, containers and even bare metal services in order to share the same cloud ecosystem.  This is where the third open source solution comes in... Open Source Infrastructure Management The third service in our open source trinity is OpenStack.  While Kubernetes is designed for managing containerised workloads, OpenStack is built to manage VMs and other virtualised network functions (VNFs). OpenStack communicates with various compute, storage and network resources via APIs with shared authentication mechanisms.  In addition to standard infrastructure management, OpenStack also provides fault finding and orchestration capabilities through a selection of 'plug and play' services and sample configurations built for specific purposes. Administrators can access the OpenStack dashboard from anywhere via a web interface. Californian cloud service specialist Mirantis has recently used its expertise with both Kubernetes and OpenStack to develop MCP Edge, which integrates Kubernetes, OpenStack, Magma and Mirantis’ own proprietary DriveTrain infrastructure management software into a telecom gateway platform. This platform has the capability to bring 5G mobile connectivity to the edge of the cloud, even in remote areas that traditionally suffer from poor internet service.  Using MCP Edge, vendors in remote areas can configure and deploy mobile networks at the edge, thus overcoming significant geographical, technical and financial barriers to entry.  Indian telecom provider Reliance Jio is a great example of how MCP Edge can be leveraged for cost savings. In just a few months, Reliance Jio was able to capture a significant market share by building networks at 20% of the cost of their competitors and then passing those savings on to their customer base.  What does it mean for the telecom industry? The evolution of 5G wireless connectivity and open source cloud management services offers numerous benefits to telecom providers. 
Reduced cost of entry, more efficient system management and automation for reduced downtime, a more predictable level of service and the ability to quickly add new services and applications are all benefits that can be expected with 5G capabilities. This will all be especially important as the Internet of Things (IoT) develops and, with it, a dependence on real-time, edge-focused computing. In addition to providing connectivity to remote rural areas, edge platforms will enable licence-restricted operators to extend their services using Wi-Fi and Citizens Broadband Radio Service (CBRS) as well. Technology is constantly evolving to meet new challenges and even level the playing field in some ways, but the true impact that 5G will eventually have relies heavily on the open source movement and the success it has in democratising the source code behind it all.

### CISOs Must Lead From The Front On Security

There's a problem at the top. In too many organisations, responsibility for cybersecurity is muddled. Or even worse, it's being assumed by the wrong person. Given escalating threat levels, the complexity of security environments and increasingly acute regulatory challenges, there's an urgent need for clear reporting lines and a foregrounded role for the CISO. With strong leadership at the top, it becomes easier to build that much-needed security culture organisation-wide, ingrained by design and default into everything people do.

Confusion reigns

NTT Security's most recent Risk:Value report was distilled from interviews with 1,800 non-IT business decision makers across the globe. It paints a confusing picture. Globally, 22% believe the CIO is ultimately responsible for managing security – slightly ahead of both the CEO (20%) and CISO (19%). In the UK, the largest proportion of respondents believe the CEO (21%) is in charge, followed by the CIO (19%), with the CISO again in third place (18%). In the US (27%) and Norway (26%) even more respondents voted for CEO leadership in security.

We can deduce a couple of things from these findings. First, the CISO is still not viewed as a standalone leadership role, and second, executives are really split over who's in charge. In fact, with the hiring of Data Protection Officers (DPOs) by many organisations to comply with the GDPR, responsibilities could become even more blurred. Part of the problem may be that in many organisations the CISO still reports to the CIO. This is starting to change. According to the CIO100 survey, the number of CISOs who are seen as peers of the CIO more than tripled, from 5% in 2017 to 16% this year. But our findings show there's still a lack of clarity on the separation of powers between CIO and CISO.

Whatever happens, organisations should not be handing responsibility for the cybersecurity function to the CEO. A non-expert managing this specialist role is likely to do more harm than good, and at the very least may delay crucial security decision-making as other critical responsibilities take precedence.

A bad time to lack leadership

Thanks in part to a lack of leadership, fewer than half (48%) of global organisations, and just 53% in the UK, say they have fully secured all of their critical data. This is despite the potentially huge fines awaiting any who fall foul of the GDPR. It's worth remembering that the average cost of a breach globally now stands at $1.5 million, up 13% from 2017.
But this could rise many times higher if European regulators see something they don't like. A lack of cohesion at the top may also be responsible for the glacial pace at which organisations' cybersecurity preparedness is moving. Just 57% even have a security policy to speak of, a single percentage point higher than in 2017, while 26% say they're working on it, versus 25% last year. Some 81% claim to have actively communicated this policy to the organisation, compared to 79% last year. It's no surprise that the number of executives who believe their employees are fully aware of these policies has not changed since last year, standing at a poor 39%.

Security as a shared responsibility

The bottom line is that boards need to grasp the importance not only of cybersecurity itself, but also of clear and effective leadership. Digital transformation efforts will fail if not built on a secure foundation. CISOs can help where possible by speaking in a language the board understands, which means talking in terms of business risk. Yet once the role of the CISO has been elevated to a standalone board position on a par with the CIO, that's not the end of the story. CISOs must cultivate a corporate-wide culture of security. This is easier said than done but should start with awareness-raising and training programmes for everyone – including temps, contractors and, of course, the board. In fact, according to our data, senior managers and executives are among the most likely to click on phishing links or engage in other risky behaviour.

Your employees increasingly represent the frontline when it comes to cyber threats. Attackers believe they are the weakest link and will relentlessly target them with phishing messages. With a critical awareness of where these threats lie, your employees can be transformed from a weak link into a great first line of defence, complemented by the appropriate security policies and controls. Consider training programmes that use real-world simulation scenarios and can be tweaked to take account of the latest threats. These should be run in short bursts of perhaps no longer than 15-20 minutes. Also crucial is the process of collecting and analysing results and feeding these back to employees. Over time, you will hopefully start to see improvements and changes in behaviour.

Security should be everyone's responsibility. So, on top of phishing awareness and other practical exercises, it pays to help staff understand the importance of their actions. One misplaced click could lead to major fines, customer attrition and ultimately potential job losses. The GDPR states that data protection must be engineered into everything a company does, by design and default. This should apply to cybersecurity more broadly. Helping to join the dots for employees so they can understand the impact of poor digital hygiene is just one way to build that organisation-wide culture of security that every CISO should aspire to.

### Cloud-Native vs Cloud-Enabled – Are You Using the Right Term?

Cloud applications are the talk of the town, and the terms cloud-native and cloud-enabled come up constantly. Quite a few people use the two interchangeably, but there is a subtle yet important difference between them. So, what exactly is the difference between cloud-native and cloud-enabled, and why does it matter? Let's find out!
To start with, let’s have a look at what the analyst community is saying about the cloud applications. As per a research report by IDC, by 2022, 90% of all new apps will feature microservices architectures that improve the ability to design, debug, update, and leverage third-party code, 35% of all production apps will be cloud native. Now the microservices architecture is the one which is used by cloud-native applications. In other words, these are the applications which are born in the cloud; made as microservices packaged in containers and are deployed in the cloud. So, surely the future belongs to cloud-native applications. So, what are cloud-enabled applications then? Cloud-enabled is relatively an older term being used by the enterprises for a long time. These cloud-enabled applications are made in house in a static (on-premises) environment and are a typical legacy or traditional enterprise software enabled for the cloud functioning. While cloud-native applications are made on a microservices architecture (known as micro-services) that are designed as an independent module to serve a particular purpose, cloud-enabled applications are made on legacy infrastructure systems where each module is dependent on each other. These are generally static and making upgrades to such an application means making changes to the whole of it. The cloud-enabled applications were of great use in the older times when there was a limit on the data usage. Today, the enterprises are competing in the digital age. Every enterprise has heaps of data which needs to be analysed for concrete information for further decision making. Cloud-native applications are needed to cater to such requirements as they are dynamically orchestrated and supports full-scale virtualisation through harnessing the complete power of a cloud. In other words, the recent emergence of these cloud-native applications is the reason the enterprises are embracing digital transformation in their day to day operations. To proceed further let’s see the main point of difference between cloud-native and cloud-enabled – 1. Origin – Cloud-native applications are indigenous to the cloud. As described earlier, they are built in the cloud and deployed in the cloud, truly accessing the power of cloud infrastructure. Cloud-Enabled applications are generally made in house using on legacy infrastructure and are tweaked to be made remotely available in the cloud. 2. Design – Cloud-native applications are designed to be hosted as multi-tenant instances (microservices architecture) Cloud-enabled applications are made on in house servers, hence they don’t have any multi-tenant instances. 3. Ease of Use – Cloud-native applications are highly scalable, real-time changes can be made to the individual modules without causing disruptions to the whole application Cloud-enabled applications require manual upgrades causing disruption and shutdown to the application 4. Pricing – Cloud-native applications require no hardware or software investments as they are made on the cloud and are generally available on the licensee, thus relatively cheaper to use Cloud-enabled applications are generally costlier as they require infra upgrades to accommodate the changing requirements. 5. Implementation – Cloud-native applications are easily and quickly implemented as there is no need for hardware or software configurations. 
Cloud-enabled applications need to be customised for the specific installation environment.

The above points showcase the clear-cut edge cloud-native applications have over cloud-enabled ones. This is mainly because cloud-native applications provide a strong foundation for an enterprise to operate in a fast-paced business environment. Changing business needs – whether driven by socio-political or economic factors – can be quickly accommodated in business applications built on a cloud-native architecture.

One of the major applications of the cloud-native approach is the integration platform. A typical small to mid-sized enterprise deals with hundreds or even thousands of applications across departments such as procurement, logistics, transport, HR, finance, legal, sales and marketing. Seamless integration between these applications is required to propel digital transformation. An integration platform built on a cloud-native architecture can connect many applications, systems and devices (whether hosted in the cloud or on-premises) in real time. Such a hybrid integration platform is highly scalable and helps with business process automation.

Conclusion – Digital disruption is here to stay. A cloud-native architecture provides a strong foundation for an enterprise's digital transformation journey as it moves to embrace digital technologies. Changing business needs are driving a shift towards tailor-made cloud applications customised to the desired business case. With a cloud-native architecture at their disposal, enterprises can focus more on their strategic needs, tapping the best of the available business opportunities for further growth and prosperity.

### Beyond the classroom | How technology is changing schools

As technology evolves, so does society and every sector within it. Education, for instance, has transformed enormously since the beginning of the 21st century as more online resources have become available to aid the learning of pupils. Additionally, the teaching of IT and the use of computers for school work mean that the way in which people learn has changed far beyond the traditional concept of a teacher in front of a class, writing on a blackboard. Children learn skills that did not exist before as STEM subjects develop further and garner interest from pupils as well as encouragement from teaching staff.

As well as new subjects being introduced through technological advancement, the possibilities for interaction and collaboration have grown. Classrooms were once fairly limited spaces, with only classmates in the same room to work and generate ideas with. With access to the internet, the opportunity for discovery is vast, and rather than only having books, teachers and other students as resources, the online realm provides endless articles, images, perspectives and information. With technology like video conferencing, students have a platform to communicate with people beyond the classroom walls. For example, if a class had a history project to work on, with the consent and guidance of a teacher, they could find a historian to live chat with and learn from. The education of children and young people is no longer restricted to one school or town: students can talk about what they have been learning with other students across the world using the messaging services available to them.

Technology has also impacted roles within schools.
For example, teachers were once responsible for the entirety of student learning, but now they guide students through information with the help of technology and the internet, giving students the agency to discover information for themselves. With artificial intelligence increasing in quality and deployment, the way in which papers are marked has also changed in some schools. Although many teachers are still responsible, if not entirely then partially, for the marking of work, as AI (artificial intelligence) technology is further tested and modified and becomes smarter, its use as an independent grading tool is fast becoming a reality. There are already cases where AI is being tested specifically to grade work in schools. For example, several schools in China are using artificial intelligence for grading, according to the South China Morning Post. One in every four schools in the country is testing machine learning systems with the intention of having students' work graded automatically. The AI uses an evolving pool of knowledge to understand the "meaning" of pupils' essays. It can identify issues with style, structure and tone. It can read both English and Chinese and is supposedly so perceptive that it can detect when a student is straying from the subject. The grading software, which has reportedly been in development for around a decade, uses deep learning algorithms to compare its own grading with human teachers' grading. In a test involving 120 million people, the South China Morning Post alleges, the AI matched human graders around 92% of the time. Researchers hope software like this will help reduce grading inconsistencies and reduce the time teachers spend grading essays. Such technology would by no means leave teachers without jobs, as grading is only one part of a diverse role. A teacher is essential in order to provide students with guidance and to monitor their behaviour. If teachers are ever replaced altogether, it will be a long time from now. Robots cannot offer the same encouragement or emotional support as teachers, and a positive learning environment would therefore be difficult to achieve without human teachers. Technology has enormous potential and has already achieved a great deal in education, but this does not make it an adequate replacement for teachers; it is simply an effective learning tool and an assistant to the teaching and marking process. Although attending a physical school remains the norm, for children who are home schooled and adults in higher education, online platforms have revolutionised the learning process and the achievement of qualifications. It is now a simple task for teachers, lecturers and learners to contact each other via email and instant messaging services. Work can be submitted online via sites like Blackboard (a platform used by higher education institutions), where reading lists can be accessed, library slots booked and more. With technology like this, education is more accessible to people learning from home or with restricted mobility. The evolution of technology and its application in schools, and outside of school for remote learning, has made the learning experience far broader, more flexible and easier to monitor. With more advanced technology being developed at an extraordinary rate, there are more subjects to learn about (e.g. AI, IoT and medical technology) and more ways to learn. As new technologies are tested for academic use, new possibilities and ways to optimise how people learn will arise.
It is an exciting time to learn, with limitless access to subjects and new ways to consume information. It is difficult to know whether automated grading will reach widespread use, but the results of the testing in China give us some insight into an objective and scientific approach to grading that is showing real potential to meet the standards of teachers.

### Multi-Cloud Strategy: Pros, Cons and Considerations

Multi-cloud has become somewhat of a buzzword in the technology industry over the last couple of years, with some research indicating that 81% of enterprises are investigating or actively pursuing a multi-cloud strategy. Each company has its own business and technical drivers for moving to a multi-cloud strategy, and each is tailoring solutions to its own specific needs. Multi-cloud allows enterprises to deliver services across both private and public clouds, which enables them to host their workloads in the most appropriate location whilst providing a consistent security approach. All of this is good, and there are many benefits to multi-cloud, but it also presents challenges that need to be considered.

The Positives
One of the main benefits of implementing a multi-cloud strategy is the flexibility it offers. It allows businesses to be creative whilst using the right set of services to optimise their opportunities. As with all businesses, the public cloud providers cannot be masters of every service and capability, and the various providers have now started to create their own focus areas, where they have particular strength or referenceable capability. Adopting a multi-cloud strategy enables an enterprise to benefit from this differentiation between providers and implement a "best of breed" model for the services that it needs to consume. Another key reason businesses are turning to multi-cloud is to avoid vendor lock-in. Research found that more than 80% of organisations have high levels of concern about being locked into a single public cloud platform. For many companies, avoiding vendor lock-in is a core requirement and a way to achieve flexibility for their applications. By ensuring they are not locked into a single vendor, businesses remain able to adapt their strategy as the industry shifts. The ability for organisations to choose the vendor that offers the best price for a given workload is another major advantage of multi-cloud. Data storage can be costly, and businesses can often end up spending a lot of money on it. With multi-cloud, enterprises are in a strong position to take advantage of price fluctuations and move their data to the provider that offers the best price, enabling them to better manage the cost models for their workloads.

The Negatives
Despite the numerous benefits of a multi-cloud strategy, there are some challenges that need to be addressed. Using services from multiple providers can get complicated and can become difficult to manage. Each provider has a different set-up process, and failing to manage them all correctly could affect a business's agility. A new breed of platform-independent tools is developing to resolve this, but these can add to the overall cost, and whichever solution is implemented, thorough planning and a deep understanding of the applications and providers will be required.
Due to the different architectures involved in public cloud, multi-cloud does come with additional security concerns, and it is crucial that all providers involved are efficiently coordinated. Threat detection and prevention tools need to be able to seamlessly share security information in order to work together to address any threats that may occur, regardless of when and where. Organisations need to understand the security requirements for each workload and architect their overall solution to allow for this variety. This will also feed into the overall design as to where certain workloads are able to reside, both from a public/private cloud perspective and with regard to geographic boundaries and the physical location of the workloads (data sovereignty). Another potential concern with multi-cloud is a possible reduction in the purchasing power of a business, making it more difficult to track costs. As multiple service providers are involved, costs are divided between them, which means that a business's ability to benefit from cost advantages could be reduced. In all likelihood the public cloud providers will adapt to a multi-cloud future and their pricing models will evolve as their customers' purchasing habits change.

Considerations
With research suggesting that there will be a significant rise in multi-cloud adoption, it is important that businesses consider both the advantages and the challenges in order to take full advantage of the benefits that a multi-cloud strategy could deliver. Careful planning and thought are required to achieve consistent and effective security and compliance, and this may necessitate the use of additional tools and management platforms. Employee training should also be considered, so that staff fully understand the impact of a multi-cloud strategy; by undertaking this preparation work, enterprises will be able to mitigate many of the potential pitfalls. Finally, having clear control of your budget is another point to consider when choosing a multi-cloud strategy. Ensuring cloud spend is monitored and does not spiral out of control will help businesses maximise efficiency. Ultimately, more and more organisations are turning to multi-cloud strategies to make the most of the advantages they offer. We look forward to seeing the future of multi-cloud and more businesses adopting the strategy in order to achieve their digital transformation needs. Chris is an experienced CTO with over 20 years' experience in the networking industry. His focus is on enabling Axians customers to accelerate the success of transformation initiatives through the adoption of new and emerging technologies and solutions.

### How Cloud Computing Can Help Solve Modern Business Problems

From Artificial Intelligence to Augmented Reality, the current deluge of technological developments is unavoidable. As a result, businesses are having to change the way they operate. Everything, from daily operational tasks to the way they interact with customers, is affected in some way.

Keeping up with technological advances
Many businesses recognise the huge impact technology is having. In a new Confederation of British Industry (CBI) report, half of all the businesses surveyed believe AI will fundamentally transform their industry.
In response to such changes, the report concluded that "technologies like cloud which were seen as niche just a few years ago have matured to now underpin much of the UK business infrastructure." But not everyone is ready to wholeheartedly embrace the changes. While 33% of the businesses questioned in the CBI survey saw themselves as digital pioneers, 27% viewed themselves as followers. Some retailers, like Ikea, are pioneers in this area: Ikea began using augmented reality to let customers try out new furniture in their homes back in 2013. The retail experience is changing, so you need to offer customers something that sets you apart. Adopting cloud computing makes sense at such times because it helps you get around some of the reasons people might be reluctant to embrace new technology. Introducing new infrastructure is costly, but cloud computing has minimal setup costs. It is quick to get up and running, so you don't have to wait to access the latest technology. Plus, as technology moves on to the next big thing, you can scale the cloud's capabilities to keep up. Having the ability to adopt new technology and use it to solve problems for your customers can give you a competitive advantage. In an article in Wired magazine, Secretary of State for Culture, Media and Sport Karen Bradley said: "...as we prepare to leave the European Union, the UK's global competitiveness will increasingly depend on not only a vibrant digital sector but also on all our businesses using the best digital technology and data to drive innovation and productivity."

Data - the challenges and opportunities
The secretary of state touched on one of the cornerstones of technological advancement - data. We haven't come close to unlocking the full potential hidden in the data we are all generating, not least because there is a shortage of workers with the skills to interpret it. Airbnb has responded by setting up its own Data University, so all its employees can become data literate. In doing so, the company is addressing the skills shortage but also helping to overcome another problem - the cultural shift that's necessary to make the most of data within organisations. A critical analysis of big data challenges published in the Journal of Business Research this year said "aligning the people, technology, and organisational resources to become a data-driven company is problematic." Since launching the university in Q3 of 2016, Airbnb has reportedly gone from having 30% of its employees using its internal data science tools to 45%. Airbnb refers to data as "the voice of our users at scale" and is using it in all manner of ways, from increasing diversity within its workforce to delivering personalised, targeted and actionable insights to its hosts. One of the points the company highlights is the importance of having a "stable, reliable, and scalable" data infrastructure. If you don't have the systems in place to handle your data, then you can't benefit from all it has to offer. Cloud-based software allows businesses to use their data without making massive changes, or employing a huge workforce to interpret it. It can help you store, process and even analyse data from multiple sources within your business.
Operating in an agile way
You may not want or need to adopt every technological change, but you should certainly be in a position to quickly and easily test it out should you wish to. That's another reason why cloud computing is appealing. It enables you to operate an agile business. If you decide to prototype a new idea, it can support you in doing so, without the risk or cost involved in investing in massive infrastructure changes.

Adopting cloud computing
If you are part of a business that is still sitting on the sidelines while others embrace technology, then adopting cloud-based business management software can help prevent you from falling behind.

### Debunking the AI myth in customer service – what to watch out for

AI is transforming the world as we know it. Its applications in the e-commerce space have led to a massive shift in consumer perceptions of bot-generated content. Almost every e-commerce site now offers some sort of personal recommendation based on customer preferences. This is AI in a relatively simple form. An increasing number of businesses are also investing in chatbots to bridge the gap between the customer and the salesperson in a digital world – capturing and qualifying interest based on online behaviour. While these applications of AI are now commonplace, bot-generated responses are still a long way from making an impact in the customer service sector. We believe some of the reasons for this are:

What if the bots get it wrong? Brands are fearful about the level of accuracy and, more importantly, the relevancy of answers provided by bots. One example is Facebook, which scaled back its AI ambitions after its chatbots hit a 70% failure rate.

What if my customers feel undervalued? There is a fear that bots might remove any scope for personalisation from the interaction. Customers feel comforted, reassured and engaged by human interaction. A bot service might not always be intuitive enough to pick up on the tone of voice and style of language, and tailor the dialogue accordingly.

The Limitless SmartCrowd™ platform is all about democratising AI, offering our proprietary AI service as plug-and-play for businesses looking to automate their customer service operations. Our SmartAI™ has been carefully developed to mitigate the well-known risks associated with automation. Here is how we do it:

1. AI and humans working together
AI by itself doesn't work; SmartAI™ is a blended solution which offers the benefits of automation without losing the human touch, by being a component of the SmartCrowd™ (crowdsourcing) platform. SmartCrowd™ enables enterprises to build their own crowd of expert customers who provide highly personalised and relevant responses to customer queries. This is called CrowdService™. SmartAI™ constantly learns from a stream of relevant conversations, lending a more personalised, human feel to the automated suggested responses. What's more, these suggested answers are being used to train and onboard new customer experts on the Limitless platform.

2. Improved accuracy, faster learning, and reduced costs
Our process collects 100x more data points than other methods, allowing it to predict answers much more reliably and learn faster – reducing operational costs. As the table above shows, a normal call centre handles a million contacts.
A trained and knowledgeable third party, such as a Quality and Compliance Officer, must check every agent interaction for accuracy before the data is used to train the AI. Separately, the system must rely on customers indicating that they found the 'AI-generated suggested answers' helpful in order to train the AI further. Our SmartAI, however, saves time and money through two levels of quality control: firstly, the end customer provides feedback on the answer; secondly, fellow Ambassadors independently peer review every answer provided, and this setting can be configured to drive as many peer reviews as required, depending on the task complexity. This means the AI can learn faster from a wider set of data points and predict answers with a higher level of accuracy.

3. Unique and personalised
Our AI IP is unique and based on real client requirements. Algorithms facilitate advanced routing, personalisation, rewards and anti-gaming features on our platform. SmartAI™ is platform-agnostic and easily slots into your CRM and knowledge management systems, so you can be up and running in no time.

4. Flexibility and agility
When you deploy SmartAI™ along with CrowdService™, it will complement your existing workforce, not replace it. Our blended solution is designed to sit right next to your traditional model, helping to improve quality of service and boost productivity. So why not let the AI handle transactional queries, leaving your crowd of customer experts to deal with those requiring the human touch? This gives your trained customer service agents more time to deal with the most complex and sensitive enquiries. Furthermore, the pay-as-you-go business model ensures you are only paying for resolved cases – no overheads for resourcing or training.

### How Cloud Collaboration Software can Enhance Company Security

In recent times, collaborative working tools have become more and more popular as workforces look for new ways to revolutionise how staff carry out their jobs. Such tools have widespread advantages, including an increase in communication and productivity, a more flexible work environment, the ability to seamlessly manage and track tasks, and the secure sharing of documents. However, with the introduction of any new workplace software, management may have concerns over security, particularly regarding web-based threats, potential data breaches and application-layer vulnerabilities. For these reasons, security concerns remain a primary reason why many company decision-makers are cautious. We discuss several ways in which businesses can overcome these concerns and safely use cloud collaboration software to enhance workplace security.

Restrictive access and full audit trail
Within the workplace, managers often have valid concerns regarding which employees are able to access sensitive company data. Secure collaboration software does, however, adequately address these concerns by allowing managers to grant tool access only to specific people or teams, as well as creating stringent controls over which information can be viewed and by whom. Alerts can also be set up to notify the relevant people when an unauthorised access attempt is made, including the ability to block access in the case of an attempted breach. This means confidential files and data are kept secure and can only be viewed by trusted people in specific roles.
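As a rough illustration of the controls described above, the core of "only specific roles can open a given file, every attempt is recorded, and unauthorised attempts raise an alert" can be expressed in a few lines. This is a sketch only; the file names, roles and helper functions are invented for the example and are not taken from any particular product.

```python
# Illustrative sketch of role-based file access with an audit trail and alerts.
from datetime import datetime, timezone

FILE_PERMISSIONS = {"q3-board-pack.pdf": {"finance", "directors"}}  # roles allowed per file
AUDIT_LOG = []  # in a real platform this would be an append-only, tamper-evident store

def alert_security_team(event):
    # Stand-in for an email, SMS or ticketing integration.
    print(f"ALERT: blocked access to {event['file']} by {event['user']}")

def can_access(user, roles, filename):
    allowed = bool(roles & FILE_PERMISSIONS.get(filename, set()))
    event = {
        "time": datetime.now(timezone.utc).isoformat(),
        "user": user,
        "file": filename,
        "allowed": allowed,
    }
    AUDIT_LOG.append(event)          # every attempt is logged, successful or not
    if not allowed:
        alert_security_team(event)   # unauthorised attempts notify the right people
    return allowed

# Example: a marketing user is blocked and the attempt is recorded.
can_access("j.smith", {"marketing"}, "q3-board-pack.pdf")
```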
Full audit trails, which log each interaction within the platform, can also be viewed by those with top-level access, providing full transparency regarding who has uploaded, viewed or edited specific files.

Password security
The majority of tools on the market require a username and password in order to log in to the software. Basic passwords are, obviously, very easy to hack, and the consequences of a data breach, particularly for small and medium-sized enterprises, can be devastating. To overcome such issues, the best cloud collaboration platforms will have stringent controls around password security, including preventing password re-use, requiring long, complex password combinations, and insisting users set up two-factor authentication, with a back-up code sent via phone. Robust platforms should also automatically log users out after periods of inactivity.

A secure network
One of the core benefits of collaboration tools is that users can work anytime, from anywhere in the world. As such, a highly secure network, with HTTPS/SSL connections, is at the heart of this type of software. The most reliable tools on the market will ensure data is encrypted at every step of the journey, as well as having systems in place to continuously monitor the network for threats. Company data breaches are often the result of negligent or improperly trained staff, which can easily be avoided through thorough training and by making use of all the security features included within the platform. Providers of cloud collaboration tools are on hand to ensure their users are well versed in how the platform functions and how it can securely protect data. Moreover, managers can encourage employees to act sensibly by fostering a company culture whereby staff understand the gravity of their actions and work to minimise security threats overall.

Data that is always available
Good security isn't just about keeping your data safe – it's about giving you peace of mind that your data is always available when you need it. Imagine if one of your critical systems went down for any length of time. What damage would this do to your organisation's productivity? Or worse, to its reputation? That's why you should select a tool that is resilient against network, power, connectivity and hardware problems, so it is available round the clock – preferably one with a service level agreement guaranteeing a minimum uptime of 99.95%, which equates to less than five hours of downtime per year.

Certified providers
Once you have decided to invest in a workplace collaboration tool, it is important to choose one from a valid provider. Those that have been granted ISO 27001 accreditation are able to prove that the security measures used within their tool are secure, rigorous and do not compromise sensitive data. This certification is industry respected, since an impartial, trusted third party tests the software and confirms whether it has met the required security standards. Another way to determine security credentials is to establish whether the platform provider has completed an annual IT Health Check (ITHC) through an accredited CHECK testing partner. Undertaken by security experts, the test is extremely stringent and determines whether the platform remains secure in the event of a hacking attempt. Once a provider passes the test, it demonstrates that the tool is protected from known attacks and that user authentication and session management are secure.
When used correctly, and with security firmly in mind, cloud collaboration tools provide numerous benefits. The risks outlined should be treated as a note of caution, helping to steer businesses in the right direction when deciding which platform is suitable and which vendors meet the strict security requirements laid out.

### Demystifying Cloud Partnership: how to choose a cloud partner that works for you

Cloud adoption continues to grow globally, and this is set to continue. Over the next five years, IDC predicts that the cloud-based business solutions market will increase by 30%. The benefits it provides to organisations are clear - faster time to market, lower-cost platforms and greater agility, all of which help organisations work towards their digital transformation goals. Your cloud adoption is not something that should keep you up at night, although it is important to remember that it is not an effortless process requiring 'no IT', nor does it guarantee a 'lower-cost delivery model'.

Choosing the perfect partner
Deciding on the right cloud solution provider (CSP) for you is a challenging task. When undertaking this, consider:

- The type of platform that will work best for your organisation's needs.
- The potential scale and complexity of your solution.
- Designing the end solution to be technology agnostic, to give you more options in the future.
- Your organisation's current strengths and the areas where you need assistance.

Cloud services do not need to be a "one-stop package" – if you can find the perfect partner that meets all your requirements then great, but a multi-vendor strategy should not be ruled out. A complete cloud strategy may not currently work for all areas of your business; it may be best to start with a combination of on-premises and cloud.

Finding the perfect partner
It is good to understand how your potential CSP partner is regarded. Are they:

- Pioneers: The most complete in their offering and the value they add to you – these partners are leading the shift to the cloud. Typically implementation and migration experts, they also have the ability to resell licences for multiple cloud platforms.
- Builders: They have extensive market knowledge and cloud experience, and tend to work with large enterprises that need assistance with strategic and transformative work.
- Transactors: They mainly sell large volumes of hardware and software services. The limitation of this type of CSP is that they are very limited in terms of the additional services they can provide, meaning a low total value to you.
- Specialists: They normally operate in niche industries and have a limited offering. They excel at what they deliver, but because they are highly specialised, their value and offerings are limited in-house (although they tend to use other partners to deliver more services).

Regardless of the CSP you end up choosing – for example Amazon, Google, Microsoft or Oracle – they will all offer and provide you with similar global and technology services. Unless you are a large global player, getting direct access to these providers will be a challenge. Therefore, a recommendation is to look at smaller Premium or Gold partners of these CSPs. They will provide you with a bundle of managed services (such as consulting, design, migration, implementation and life-cycle management) in a much more competitive, agile and personalised way.
Working with your CSP
Your ideal CSP should be a strategic partner, with the relationship valued on both sides. Work with your chosen partner to:

- Set out your vision and be clear on the goals together, so that both sides understand where you are heading.
- If you have multiple vendors managing your environment, make sure they understand how they each fit into the task of delivering the needs of your business. Including them early on to share the responsibility of delivering these services gives your CSP more ownership and buy-in to deliver against the common goal.
- Create an environment that will stimulate your staff and company to want to take part in embracing cloud services.
- Ensure communication channels are open with your chosen CSP at multiple levels.
- Be clear when communicating what success looks like for these services. Remember, success is modest improvement, consistently done. Have you got the right partner to achieve this?
- Ensure you discuss your exit strategy openly and honestly. In today's market, termination agreements need to be fair and allow you to switch providers. Gone are the days of 3+ year contracts.

Make it personal
Finally, it is important to mention the social aspect of your partnership. It is beneficial to get to know the people and the company that are helping you to deliver your vision and the service expected within your business. Where possible, attend events that are meaningful to what you are trying to deliver, and try to ensure multiple people and teams within your business are interacting on a social and personal level. Building and maintaining these relationships will help each side understand what motivates the other, and build more interest in the partnership and the services being delivered. This can be vital when challenges arise, as the relationship provides the candour needed either to resolve issues more quickly or to collaborate more effectively on new projects.

### The Changing Role of the IT Manager and Where That Fits in the IT Revolution

IT managers are facing a variety of issues, including managing staff, adding value to their organisation and enhancing security and business agility. It has never been a more challenging and exciting time to be an IT Manager. IT Managers need to ensure their business is equipped with the right technology and tools to support its ongoing digital transformation. The way we all approach business is changing. As we constantly try to keep pace with rapidly evolving technology, individual departments within a business are becoming as agile as the larger companies themselves. IT departments are experiencing incredible changes as their roles expand to impact customer service, sales and even business strategy. As a result, companies are increasingly turning IT into a driving force in all aspects of business.

More than a technology facilitator
In years gone by, the role of an IT Manager was focused on infrastructure. In modern business, the IT Manager is more than a technology facilitator. IT is playing an increasingly integral role across many departments, from operations and finance to marketing and human resources. Today's 'company IT guy' has far more to oversee than software and hardware. One thing to understand when managing the growing pains of IT is that it is no longer the work of a single department. The modern IT Manager is a tech-savvy innovator who creates change across the entire business.
IT Managers are driving change at an organisational level and are looked to for competitive advantage. Since our reliance on technology increases with each passing day, IT is steadily moving to the front end of business.

Reliance on technology
The IT department is becoming the hub of business. IT Managers must transform their workforce along with it and look to drive innovation and business growth. We're at a point in time at which IT drives operations at every level of a company. Changing the old model of IT as a separate entity into a living, breathing part of the business will mean a thriving enterprise and one that runs smoothly. Since personal mobile devices are increasingly the technology of choice in the office, all eyes are on IT to maintain security integrity while also ensuring employees can effortlessly work with any device, anytime and anywhere. IT is creating truly all-wireless workplaces where Wi-Fi is inescapable, guest and BYOD security are automated, office appliances are mobile-device friendly and communications applications on mobile devices simply work better. As technology continues to be the critical element in the success of a business, the IT Manager will be the tip of the spear in enabling new revenue streams and employee satisfaction.

IT Managers sceptical about moving to the cloud
There is a growing trend among businesses to move IT departments away from their traditional roles into crucial positions as revenue generators. Departments within businesses are looking to IT to help improve business outcomes by enabling the use of innovative technologies to engage with target audiences in new and exciting ways. With cloud computing, IT managers don't have to worry about running programmes on their own computers if they don't have the right resources, as other computers on the network can take care of it. By migrating to the cloud, you are no longer required to buy equipment and software; instead you will have more time to spend on business objectives and on understanding more about the rest of your business. Cloud computing is above all a window into a new world of agility. It allows resources to be built to prove concepts for short periods, at little cost. IT Managers can utilise shared platforms to deliver services using 'just enough' resources. This flexibility removes time from IT innovation cycles, changing the pace of a business. In the event of security vulnerabilities, cloud computing can plug the hole before any damage occurs. In the past, classic server estates had to be updated one server at a time, making patching a complicated process. Now, with cloud computing, security vulnerabilities can be fixed quickly across pooled resources. Each computer relies on the others to ensure that any requests to accomplish a task are handled instantaneously through the data-handling ability of the entire network.

Increased efficiency
A move to the cloud is a way for IT Managers to adopt new ways of handling their responsibilities. Cloud computing is a more efficient way of going about the business of IT and means lower overall costs and improved efficiency for the business. IT Managers will still be needed to run networks and applications, as well as to coordinate support and helpdesks. Ignore any knee-jerk reactions you might have read about cloud services making IT managers and departments redundant.
Cloud is going to expand your skillset and transform your job description for the better. Running an IT department has always been about staying at the forefront of a field that is constantly changing. IT Managers will still play a crucial role in keeping many cloud-based systems fully operational.

### Digital Transformation Will Help Businesses Rethink Their Practices

Moving to a Digital Future
According to Tech Pro Research, 70% of companies either have a digital transformation strategy in place or are working on one. A conscious effort is being made by organisations to transform their markets before they get disrupted by competitors further along the digital journey. They are preempting future changes with new business models and digitally enabled products and services, and whilst the aspirations are there, experience shows that, in practice, the level of digital maturity is variable across sectors and businesses. For many businesses the move to a digital future comes in two phases. The first is gaining experience through digital business improvement: automating and transforming existing business processes through the use of digital technologies. The second is building business intelligence systems: making use of real-time data to bring about business innovation, creating new digitally enabled products, services and platforms to drive revenue and open new markets. In many of our customer engagements, the first challenge we have seen for business leaders is to understand the "art of the possible" and separate the hype from the "achievable now", where short-term payback can be delivered to build digital confidence for future online product offerings.

Business improvement through digital optimisation
To take these first successful steps, it is necessary to build the digital dream team, bringing together digital leaders with business, technology and data expertise, aligned to deliver the expected business outcomes. In many cases this means working with external digital transformation experts whose practical experience with cloud, mobile, data and integration technologies is needed to deliver. We have seen many successful established businesses, especially those with remote workforces handling their primary customer interactions, still burdened by manual processes and poor access to data in the field. Historically, job management and data recording has either been paper-based or handled through a first generation of hand-held data collection devices. Unfortunately these tend to require a substantial, expensive and inefficient back-office admin function to consolidate and transform data from site or customer visits, and they result in a poor employee and customer experience. The approach we are seeing succeed is to focus on data flows and to deploy the latest cloud and mobile technologies to develop smart Android applications with real-time access, via web services and APIs, to centrally held, secure cloud data that is tightly integrated with existing legacy on-premise systems. Digital business optimisation improves the field worker experience, often allowing more visits per day; it provides real-time data from customer to field worker to head office, improving business agility, customer advocacy and net promoter scores; and it eliminates back-office manual processes, improving data accuracy and delivering a short-term payback purely on headcount savings.
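As a simple sketch of the pattern just described, a field-service app replacing a paper form essentially captures the visit data and posts it to a central cloud API, from where it can be synchronised into back-office systems. The endpoint URL, field names and token below are hypothetical and purely illustrative.

```python
# Sketch of a field worker's visit report being sent straight to a cloud API
# instead of a paper form (endpoint, fields and token are made up for illustration).
from datetime import datetime, timezone
import requests

def submit_visit_report(job_id, engineer_id, readings, notes):
    report = {
        "job_id": job_id,
        "engineer_id": engineer_id,
        "completed_at": datetime.now(timezone.utc).isoformat(),
        "readings": readings,          # structured data replaces re-keying at head office
        "notes": notes,
    }
    response = requests.post(
        "https://api.example.com/v1/visits",          # hypothetical cloud endpoint
        json=report,
        headers={"Authorization": "Bearer <api-token>"},
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["visit_id"]                # assumed shape of the API's reply

# Example call (commented out because the endpoint above is fictional):
# submit_visit_report("JOB-4711", "ENG-23", {"pressure_bar": 1.8}, "No further action needed.")
```

The same report can then be pulled into the legacy back-office systems by an integration job, which is where the headcount and data-accuracy savings described above come from.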
Digital business innovation through data
Providing real-time access to data from customer to field to head office then allows the next phase of digital business innovation to be undertaken: the launch of new digital services, or the repositioning of the company as a digital service provider. Managing data in modern cloud platforms makes it possible to gather data at scale at very low cost compared with traditional data warehousing platforms, with better control over security and access, freeing it from legacy silos. Organisations can then analyse and use this business intelligence to make faster and more accurate decisions, or offer more comprehensive data, results or insights to customers. As clients become more digitally mature, we are seeing them launch new premium digital service offerings, complementing their now more efficient operations with advanced data reporting services for their own customers.

The future - advanced analytics and machine learning
Where do we go from here? Businesses are currently using less than 50% of the structured data they hold, and only 1% of unstructured data. It is apparent that businesses are searching for ways to best leverage their data, but don't have the tools required to achieve that. There is a huge amount of hype around Artificial Intelligence, but we are already seeing practical innovations using Machine Learning (a much better descriptor, in our opinion). Machine Learning can empower data analysts, the primary data warehouse users, to build and run models using existing business intelligence tools and spreadsheets. This enables business decision-making through predictive analytics across the organisation. Machine learning applications are already becoming common in areas such as online retailing, with product suggestion engines building on structured data. We are also seeing innovations in unstructured data such as video and image analysis. We have recently undertaken a successful proof of concept for automatic pest recognition through image analysis in a machine learning model, and for another client we are helping to monetise existing image and video catalogues through automated tagging of metadata with machine learning tools. Developing consolidated data platforms through digital business optimisation and the launch of new digital services, coupled with the ability to leverage machine learning in the future, will help businesses rethink their practices and produce actionable insights. It won't be a surprise to see digitally transformed organisations generating more than 35% of their revenues in the near future from new business models that are 'digital' and have data at their core.

### The Challenges of software testing IOT devices

'Susan Harris is alone in the house when, suddenly, doors lock, windows slam shut and the phone stops working. Susan is trapped by an intruder - but this is no ordinary thug. Instead, the intruder is a computer named Proteus, an artificial brain that has learned to reason. And to terrorise…' The 1977 film Demon Seed, a story about an AI gone rogue, may have been a tad melodramatic about the danger that artificial intelligence poses to man, but it did have some smart insight into home automation and its potential malfunctions. From healthcare and home automation to transport and the oil and gas industry, the Internet of Things (IoT) is rapidly growing.
Research firm Gartner predicted that by 2020 there will be 20.4 billion IoT devices connected to the internet. So whilst you are watching films on your smartwatch, asking your home to warm up to a preferred temperature, and perhaps negotiating what constitutes burnt with your toaster, spare a thought for the testers: the men and women toiling behind closed doors to ensure that these devices actually work as intended. So what are the challenges software testers face when testing IoT devices? Firstly, there's the obvious - is it safe?

Security
A quick search on IoT will bring up a plethora of articles discussing security concerns. However, rather than worrying that your kettle might be spying on you, there is the very real concern of adding new or breachable devices to your trusted networks. With a laptop or computer, software can be installed to protect the device, but for IoT devices the support is slim. Many devices also ship with default usernames and passwords that the average user never changes, making them vulnerable. Testers must also consider the wider effect IoT devices can have on internet security. One only needs to look back to the Mirai botnet attack, which used items such as digital cameras and video recorders to cause huge disruption to large parts of America's internet, bringing down sites such as Netflix, Twitter, CNN and Reddit. For a tester, then, issues such as data flow between devices, data encryption and the integrity of testing software (if automating) are all key concerns. Sometimes testers must simply trust that users will protect their networks. However, with cheaper routers often containing fewer security measures, there lies another issue for IoT testers: that of replicating the environment the item will operate in.

Replicating environments
When testing a website, it's easy for testers to run the pages through the mill. Open the page on the popular browsers, run it on laptops, tablets and smartphones, and you've pretty much tested how it will be used by 90% of users. However, with IoT devices, replicating a user's environment is far harder. Will the device be used at home or at work? Will the internet speed be fast or slow? How many unknown devices are also connected to this network? Will the device need to leave and rejoin the network? Forgetting the technical aspects, what type of furnishings will the device rest on? Apple's HomePods have been documented as leaving white rings on wooden tables that have been treated or varnished. Whilst not strictly an issue for the humble software tester, it shows how the wide variety of environments makes the job of replicating conditions challenging. For this reason testers must ensure they have a strong understanding of the product and try to encompass as many different scenarios in their testing as possible, including how the device will work with others.

How it works with other devices
Compatibility testing is always a bugbear for testers, but in the IoT world it becomes a myriad of challenges. Just take the seemingly endless interactions that can be created on devices that support IFTTT - there are simply too many moves on the chess board to analyse and test every device for every release in every environment. The only way testers can begin to address this task is to focus on the most popular operating systems and communication methods (e.g. Bluetooth) on the most popular devices, so that efforts have the widest reach, before moving on to more niche items.
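A back-of-the-envelope illustration (the device, platform, network and firmware lists are invented for the example) shows why exhaustive compatibility testing quickly becomes impractical, and how prioritising the most popular combinations keeps the matrix manageable:

```python
# Why exhaustive IoT compatibility testing explodes, and how prioritisation helps.
from itertools import product

devices   = ["smart_speaker", "thermostat", "camera", "doorbell", "plug"]
platforms = ["Android", "iOS", "Alexa", "Google Home", "IFTTT"]
networks  = ["wifi_2.4", "wifi_5", "bluetooth", "zigbee"]
firmwares = ["current", "previous", "beta"]

full_matrix = list(product(devices, platforms, networks, firmwares))
print(f"Exhaustive matrix: {len(full_matrix)} combinations")   # 5 * 5 * 4 * 3 = 300

# Prioritise the pairings that cover the bulk of real users first.
popular = [c for c in full_matrix
           if c[1] in {"Android", "iOS"}
           and c[2] in {"wifi_2.4", "bluetooth"}
           and c[3] == "current"]
print(f"High-priority subset: {len(popular)} combinations")    # 5 * 2 * 2 * 1 = 20
```

Even this toy matrix runs to 300 combinations per release; real products multiply that by firmware versions, router models and regional app stores, which is why a popularity-weighted subset is usually the only workable starting point.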
Costs
Another challenge is perhaps one of the most obvious - the cost. Testing something simpler, such as a website, is reasonably quick and cheap. But testing IoT devices in different environments with different integrations can be very costly. Time is a big factor, because testers have to cover so many bases, but another issue is the possible need for specialist testing software. A few studies have suggested that developers failing to review their code for security flaws, combined with a lack of thorough testing of IoT apps, makes testing very complex. And with complexity comes cost.

Power & Backup
Unlike the most common IT solution of turning it off and on again, some IoT devices can struggle if there is a sudden loss of power. Testers need to ensure that they test how a device behaves following a loss of power - particularly for items in the healthcare or industrial sectors. This will become less of a problem as wireless power continues to develop, but until then the variables need to be considered. Consider a smart intruder alarm or a security camera: if the system fails due to a bug or a drop in power or connection, it is essentially useless. How, or even whether, a device backs up is another issue, especially when you consider how new or updated software will respond to older data.

Updates
Firmware is an integral part of many IoT devices, and there's nothing developers love more than pushing out new updates. But for the tester, updates can carry with them the monumental headache of bugs. If a missed bug gets through into a standard computer, the issue can usually be tested and fixed before there is too much damage. But if a bug gets through into an IoT device, it can have far more physical ramifications. Take Nest's Thermostat issue back in January 2016. A software update the previous December had resulted in a bug draining the battery life of some Thermostats, causing the system to deactivate. Eventually, the company rolled out a nine-step fix for users, but proper and thorough testing could have saved the users' shivers and the company's face. So whilst we are not in danger of our homes becoming the demonic AI Proteus, if software testers do not take the proper precautions, they do risk being left in the cold.

### Just do it: Cloud migration the lift, shift and refactor way

Say 'cloud' and by now anyone in the tech sector – and a good few people outside it – will be able to rattle off the benefits. Security, scalability, flexibility, resilience, cost-efficiency. Cloud is no longer a challenger delivery model – increasingly, it's the dominant way of doing things. For new companies, or companies building new applications, that's good news. They can build from scratch in the public cloud and access the benefits straight away. Indeed, Gartner recently published a report revealing that the majority of cloud implementations are new builds. The use of cloud for new builds is now well established. Organisations always want new software, either brand-new, large applications or smaller ones that sit at the edge of their systems. Faced with the decision on where to run these apps, organisations are increasingly putting them directly into the cloud. If you were to start your own business now, you'd put it all in the cloud – it's been done enough times that the pathway to the cloud via public providers like AWS and Azure is well-trodden. How do you move apps to the cloud?
But the flipside of that good news is that for those organisations that have essential, hefty applications sitting in a data centre, things are more complex. It's far from simple to recreate existing, organically grown apps and guarantee that the new version will produce exactly the same results as the old one. Testing and the creation of test data are problematic, especially identifying the edge cases where differences in computation will produce different results. Testing applications is particularly difficult where the passage of time is an important dimension. For example, interest calculations in banking and credit card applications, or utility bill and statement systems, can only be tested over several billing periods. Because the bill is due on a certain date and automated follow-ups are only required after those points, test scenarios are time and date dependent. In other circumstances the complexity and risk go even deeper. In insurance and banking systems, the logic in the application is itself the realisation of the product as it was originally sold. In other words, the way the insurance policy dictates claim pay-outs is built into the on-premise software and embodied in the contract which the customer signed thirty years ago. If the application is rebuilt in the cloud and the change of coding has any material impact on the policy rules, the insurer will be in breach of contract.

Lift, shift and refactor
In short, rewriting an app shouldn't be undertaken lightly. So, what's the answer for companies that need to get out of costly on-premise contracts without altering their apps? In a nutshell, organisations should lift and shift their applications into the cloud with minimal or no changes, if at all possible. You can then modify them once they're running in the cloud, which will still be cheaper and more cost-effective than running them in the data centre. Move your application as quickly as possible to the cloud along with its supporting structures, and then set about redeveloping it. It's important to note that sooner or later there will need to be changes to these existing applications, whether or not you move them to the cloud. Legislative and regulatory changes such as GDPR, for example, have forced many organisations to review and modify their applications to ensure compliance. To make changes, you need a test environment. If the application is running in a data centre, the test environment will have to reside there too, spinning away even when it's not in use, costing money and generating management overhead. Better to run it in the cloud and pay only for what you use. When you put your applications in the cloud, you also have your test environment there with them. It's much cheaper to run development and testing in the cloud, where you can turn off environments when you're not using them.

Migration tools are your friend
Let's look at a worked example. Say a company has an application running with an Oracle database and wants to move it onto AWS cloud infrastructure. You could set up the cloud environment, applications and databases and copy the data into the new cloud environment. Each element that has been hand-crafted (such as table structures, indexes and permissions) will need to be tested to ensure that the copy will run as expected. That has to happen before porting the data into it and cutting over, while keeping the old system live in case you need to back out of the migration. That's the costly way.
Instead, companies should lift and shift straight to the cloud using migration tools such as AWS CloudEndure or Azure Site Recovery (ASR). Using tools that were originally designed for creating disaster recovery environments in the cloud means that the software and data are replicated to the cloud. Additionally, the data can be kept synchronised with the live system until you want to make the cloud version live.

Making money
Clearly, cloud migration isn't the same low-hanging fruit as a new-build application in the cloud. For a large organisation that may have thousands of commercial apps, moving that portfolio into the cloud can be a major headache, and if it's done inefficiently it will reduce the cost benefit of the cloud migration business case. The crunch point is that cloud migration needs to make business sense. If there's an advantage to being in the cloud, then it makes sense to do it as soon as possible. Don't hold off while you tinker about in the data centre getting your applications 'ready' for the cloud – the time that takes could erode the value altogether, given that each application has a limited shelf life. Lift, shift and refactor is the mantra. If you're considering migrating apps to the cloud, make sure you're working with a technology partner who can help you do it quickly, efficiently and intelligently.

### Internet of Things: Preventing The Next Wave of Ransomware Attacks

In 2017, organisations were hit by ransomware attacks on an unprecedented scale. One report claims the average number of ransomware attacks in 2017 was up 23 percent compared to 2016, with detections up almost 2,000 percent since 2015.

The Future of Ransomware Attacks
New research shows nearly a quarter (22 percent) of IT decision-makers say their company has been a victim of ransomware at least once, while another 26 percent believe it's "probable" that someone in their organisation has been hit by ransomware. That means nearly half of the organisations surveyed have been victims of ransomware or are unaware whether they have been subject to an attack. And once an organisation has been targeted, it will often suffer subsequent attacks. Unfortunately, the size and scale of ransomware attacks are only set to increase in 2018, with several factors combining to create the perfect storm for cyber-criminals. The increasing availability of crypto-currencies allows cyber-criminals to remain anonymous while conducting mass attacks and, combined with the smaller payment sizes associated with these attacks, makes it more likely that victims will pay the ransom. At the same time, experts foresee a rise in targeted ransomware, where criminals pinpoint a specific, and potentially lucrative, victim for extortion. Elsewhere, the growth of anonymous payment systems has been a catalyst for the growth of Ransomware-as-a-Service (RaaS). This involves ransomware kits being sold on the Dark Web, making cybercrime accessible to anyone – regardless of their technical skills. One report estimates there was a 2,500 percent increase in the sale of ransomware on the Dark Web between 2016 and 2017.

IoT: A New Entry Point for Ransomware
While most ransomware attacks currently infiltrate an organisation via email, a new delivery system for both mass and targeted attacks is on the horizon with the mainstream adoption of the Internet of Things (IoT). Gartner predicts there will be 20.4 billion connected things in use worldwide by 2020.
The volume and variety of new endpoint devices alone will present a huge challenge for IT managers, who will be tasked with deploying, managing and securing the influx of new devices. The issue of managing endpoints within an organisation is already a challenge: the Autotask Metrics That Matter™ 2017 survey found that 63 percent of IT service providers have witnessed a 50 percent increase in the number of endpoints they're managing compared to 2016. IoT will usher in a raft of new network-connected devices, each one a potential entry point for malicious attacks, particularly while there is still a lack of established security standards around IoT. Many companies' uncertainty around securing IoT devices is highlighted in Spiceworks' State of IT report, which shows that 29 percent of organisations have currently adopted IoT, with an additional 19 percent planning to do so this year. However, the data shows only 36 percent of IT pros feel confident in their ability to respond to cyber attacks on IoT devices. But consider the serious – or even life-threatening – impact of ransomware on smart devices within critical applications. A cyber-criminal could potentially have the means to turn off lighting or heating systems, or lock users out of their homes or businesses. Moreover, they could even affect the safety of drivers by tracking and hacking their IoT-enabled vehicles, turn off entire power grids or access 'smart' medical devices such as pacemakers.

Security Protocols
So perhaps it is unsurprising that a 2017 survey found that almost half of the small businesses questioned would pay a ransom on IoT devices to reclaim their data. With IoT vendors rushing their products to market despite a critical lack of security standards, what can be done today to help secure your organisation from attack? It will take a mixture of technology and processes. Traditional antivirus or endpoint security will only tackle known ransomware, so it's important to deploy solutions with dedicated anti-ransomware capabilities and to keep both devices and operating systems up to date. Creating a process for patch management, so vendors can push out important security updates, is essential. Finally, ensure your entire team is educated and trained on the latest security protocols – human error is often the main cause of security breaches, so take the time to make sure your people aren't the weakest link in your perimeter. Ransomware attacks are on the rise, and the new wave of IoT devices is another entry point into your organisation. The cost of deploying the right technology pales in comparison to the potential costs and damage that a ransomware attack could inflict.

### 5 Ways AI Is Revolutionising Digital Marketing

Whether you operate as a service-based business or an eCommerce storefront, the notion of implementing Artificial Intelligence (AI) into your website may cause some discomfort. Machine learning algorithms have advanced to a point where the end user simply cannot discern whether they are chatting with a sales representative or an AI while shopping.
According to studies published by Venture Harbour, 85% of customer interactions will be handled without a human intermediary by 2020. In addition, 80% of business leaders say they have already implemented AI into their companies in some capacity and have experienced productivity growth as a result. The data clearly shows that AI does impact everyday workflow for numerous companies across the globe. But how does this technology revolutionise digital marketing specifically? Can AI handle the marketing of your brand to potential customers like an advertising agency would be able to? Automated Content Generation While it may seem far-fetched, AI algorithms are already capable of generating content for online publishing. The technology is already so far ahead that you may not be aware that an article was generated by an AI. A perfect example of this practice can be found in Forbes, which uses its own artificial intelligence algorithm to create news articles. The AI collects relevant information from across the web and generates unique blog posts based on what it gathered. AI-generated content is created in mere seconds, something which would take actual writers and journalists hours or days to do correctly. However, without source materials to look through, the technology simply cannot keep up with actual writers as of yet. Even though AI can’t generate content without raw data to work with, it is still a great addition for small companies with few resources to spare for hired writers. Content Curation One of the best reasons to consider implementing AI in your digital marketing is content curation. Customer bases are often made up of people from different professional backgrounds, generations and lifestyles. The same type of marketing content simply won’t appease every one of your followers no matter how well you curate it. However, an AI might be able to curate the content depending on who accessed your website. According to Martech Advisor, Netflix managed to make up for $1 billion in lost revenue by implementing AI curation on their streaming platform. The result of content curation is better targeting and management of what content is displayed to each user that engages with your website. The same principle can be applied to your online store, blog archive or forums, which will make the content more appealing for new visitors. Indefinite Customer Support Machine learning algorithms can often be seen in the form of chatbots across the web. These specialised AI algorithms serve the purpose of customer engagement and personal marketing. According to Chatbots Magazine, over 100,000 bots operate on Facebook Messenger alone in hopes of engaging new customers with popular brands. Setting chatbots up on your site and social media pages is quite easy and can introduce a new layer of digital marketing to your brand. Best of all, however, is the fact that chatbots work overtime and are functional 24/7 while also learning new information with each individual conversation. This means that the longer a chatbot is active, the more fluent it will be in whichever language and industry you operate in. You can also use professional platforms such as Pick Writers and their content localisation services to make your products available to as many people as possible.
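To make the chatbot idea a little more concrete, the sketch below shows the simplest possible keyword-based routing in Python. It is only an illustration: the intents, keywords and canned replies are invented, and a production chatbot on a platform such as Facebook Messenger would use trained language models that improve with each conversation rather than a fixed keyword list.

```python
# Minimal sketch of a keyword-based chatbot reply function.
# The intents and canned replies below are purely illustrative.

INTENTS = {
    "pricing": (("price", "cost", "how much"), "Our plans start at £9 per month; see the pricing page for details."),
    "shipping": (("delivery", "shipping", "track"), "Orders ship within 2 working days and tracking is emailed to you."),
    "support": (("broken", "error", "help"), "Sorry to hear that. A support agent will follow up shortly."),
}

FALLBACK = "Thanks for your message! A member of the team will get back to you."


def reply(message: str) -> str:
    """Return a canned reply for the first intent whose keywords match."""
    text = message.lower()
    for keywords, answer in INTENTS.values():
        if any(keyword in text for keyword in keywords):
            return answer
    return FALLBACK


if __name__ == "__main__":
    print(reply("How much does the premium plan cost?"))       # pricing reply
    print(reply("My order hasn't arrived, can I track it?"))   # shipping reply
```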
While they still can’t fully replace live customer support agents, chatbots provide enough incentive for implementation for both small businesses and large corporations. User Experience Advantages Graphic and web designers often struggle with User Experience (UX) research and design. After all, it’s difficult to cater a single website to millions of users despite extensive customer profiling. AI algorithms introduce a new level of curated design to the web. Menial tasks such as data research, A/B testing and visual style generation can all be handled by AI algorithms. These are technical, repetitive, time-consuming tasks which take away precious work hours from the design team. Despite the troubles in pinpointing the right balance of UX, it is still an important element of a design process in terms of digital marketing appeal. By splitting the workload between web designers and AI algorithms, users will be presented with more appealing websites and be incentivised to engage with your content more frequently. High RoI Incentives While the notion of cost-cutting in terms of manpower may seem counterintuitive, AI can truly transform your business model if you allow it. We’ve already covered the topics of customer support and UX research, but how far can AI really go? These two segments alone would cost a small company a fortune if they opted for third-party outsourcing. AI algorithms are arguably cheaper and faster in terms of presenting actionable results, but they do lack the human touch. According to statistics published by Adobe, 38% of consumers actively seek to talk to an AI instead of a human representative, which drives the point further. It’s important to create a good balance between AI implementation and actual professionals working in your company. No matter how extensive your AI implementations are, the return on investment, as well as the revenue growth you will experience, will be worth the trouble. The Future is Artificial (Conclusion) It’s easy to see the appeal of implementing artificial intelligence into existing content marketing strategies and advertising campaigns. However, companies should tread lightly in terms of delegating important tasks to AI algorithms. Weigh your company’s needs for automation and AI before investing too much capital in a technology still in development. If balanced correctly, AI can transform the way you operate online and help you reach more customers and clients in a relatively short time. ### Changing from network access to application access The workplace is changing, which is leading many companies down the path of digital transformation. Employees are no longer anchored to their desks or beholden to the corporate data centre. The digital employee of today demands flexible access to data and applications regardless of where those resources are stored, which device the employee is using or where the employee might be working. Users accessing corporate resources as part of their day-to-day working lives aren’t concerned with how they are being connected. They just want access to what they need, when they need it. With the proliferation of enterprise apps and a growing number of mobile and remote working policies, organisations need to change their approach when it comes to network security. Gone are the days of allowing employees unfettered access to the network, as these employees are no longer secluded behind the corporate firewall.
Organisations must enable secure application access without necessarily granting access to the corporate network each time, as this inevitably introduces risk. However, any change must first be preceded by the acceptance of a new approach. This is becoming easier by the day, as even the most stubborn deniers can no longer ignore the benefits of a cloud transformation. This is evidenced by the latest survey data from Atomik Research. According to the State of Digital Transformation—EMEA 2019 report, digital transformation efforts are gaining ground among EMEA businesses, with a majority now conscious of its benefits. More than 70 percent of decision-makers in the U.K., Germany, France and the Netherlands within enterprises of more than 3,000 employees are already in the implementation phase of their digital transformation projects or are already benefitting from a digital transformation initiative. Only 7 percent of the companies surveyed have not yet started, with many having already completed their transformation projects. Encouragingly, the survey also found that many of these digital transformation initiatives are being driven from the highest levels within the organisation. While cloud transformation is undoubtedly a priority for most businesses, in reality, some companies remain at least partially locked into their legacy infrastructures. Even those businesses with a portion of their staff working remotely or on the move often retain a large amount of their on-premises resources. The cultural shift to a cloud-first infrastructure approach – along with moving applications to the cloud – seems to be too large a first step for many. Yet, the mere relocation of applications to the cloud is far from a complete and secure cloud transformation. If applications are kept in the cloud and the internet becomes the new corporate network, how must secure access to these same applications be designed? Companies often neglect this network transformation step while in the planning stages. They remain loyal to their traditional structures and instead backhaul users over their legacy network. This detour not only affects speed, but also the security of the entire network. Organisations must factor in the effect that application transformation has on their network performance and bandwidth consumption, as well as the latency added by hub-and-spoke architectures from the outset. Moving applications to the cloud needs to be considered alongside new network infrastructure and security requirements. However, the State of Digital Transformation—EMEA 2019 report found that only 9 percent of enterprises consider application, network and security transformation equally important when planning their journey to the cloud. This holistic view is vital in any digital transformation project, as it plays a key role in the overall user experience. And, especially with today’s digital workforce, user experience is of paramount importance. This means speed, reliability, security, and usability are key factors to consider when embarking on a cloud transformation journey, irrespective of the size of the company in question. As part of that experience, the user no longer wants to differentiate between applications that are kept in the cloud or on the network. Seamless access to applications is critical, whether they are held in private or public clouds, in Azure and AWS, or in the corporate data centre.
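As a rough illustration of what brokering access at the application level, rather than at the network level, can look like, the Python sketch below maps each user to the specific applications they are entitled to and denies everything else by default. The user names, application identifiers and entitlement store are all hypothetical; a real zero trust service would also verify identity, device posture and context on every request rather than relying on a static table.

```python
# Illustrative sketch: authorising per-application access instead of
# placing a user on the corporate network. All names are hypothetical.

from dataclasses import dataclass


@dataclass
class AccessDecision:
    allowed: bool
    reason: str


# Hypothetical entitlement store: user -> set of application identifiers.
ENTITLEMENTS = {
    "a.jones": {"crm", "expenses"},
    "b.smith": {"expenses"},
}


def authorise(user: str, application: str) -> AccessDecision:
    """Allow a connection only to an application the user is entitled to."""
    apps = ENTITLEMENTS.get(user, set())
    if application in apps:
        return AccessDecision(True, f"{user} is entitled to {application}")
    return AccessDecision(False, f"{user} has no entitlement for {application}; denied by default")


if __name__ == "__main__":
    print(authorise("a.jones", "crm"))       # allowed
    print(authorise("b.smith", "crm"))       # denied
    print(authorise("c.doe", "expenses"))    # unknown user, denied by default
```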
Employees also expect business applications to provide the same smooth user experience they get from the consumer apps on their smartphones. This is the start of the transition to a limitless working environment. Whether the desk is in the office or home office, or whether the employee is a road warrior and accesses their applications and data from the hotel or airport, the path to that data must be secure and fast. To ensure an undisturbed experience for the user, secure cloud transformation should also be accompanied by another change – from network access to access at the application level. After all, if the application has already left the company network, why should the employee still be connected to the network and not immediately access the app on the most direct connection? Opening up the entire network to remote users only creates a security risk for the company. When companies undergo a cloud transformation for efficiency reasons, they must incorporate modern approaches to secure access at the same time. The concept of zero trust is one approach. In this model, users are securely connected only to the applications for which they are authorised, with ongoing verification of their access rights. Companies should consider integrating Zero Trust Network Access as part of their secure cloud transformation from the outset to ensure the workplace of the future – as traditional network access is quickly becoming a thing of the past. ### Has your organisation locked the stable door? It seems like hardly a day goes past without news of another major data breach or cyber attack. In fact, recent research from DLA Piper revealed that 59,000 breaches have occurred since the implementation of the General Data Protection Regulation – with the likes of Toyota, Quora, and even Google coming under fire. While it’s widely known that data breaches incur a significant financial penalty under GDPR – up to €20 million or 4 per cent of the company’s global annual turnover – what is far less understood is who’s responsible (and who pays the price) for a breach due to employee negligence or criminality, sometimes referred to as ‘vicarious liability’. A recent example from the UK illustrates the problem – with a major supermarket chain fighting an earlier ruling that made it liable for one disgruntled employee’s leak of the personal details of 100,000 colleagues online. While this battle continues through the courts, what is clear is that companies are currently considered to be vicariously liable for the actions of their employees and the security of both employee and consumer data. Forgetting to hand over the keys Whichever way the issue of corporate responsibility is resolved, businesses of all sectors and sizes need to ensure that their employee and customer information is properly protected. This may seem like an overwhelming task, but the truth is data breaches are often – perhaps predominantly – caused by simple, avoidable errors during day-to-day processes. For example, companies often fail to consider whether employees can still access this information once their employment has been terminated, as seems to have been the case in the supermarket breach. While this should be easily avoidable, it continues to be a massive problem for businesses, as our own research reveals.
In SailPoint’s most recent Market Pulse Survey, we found that almost half (47%) of employees who leave a job still have access to their former organisation’s data via corporate accounts (17%), cloud storage (16%) or mobile devices (14%). That’s an astonishing figure. After all, no landlord would forget to ask their tenant to hand over their keys once they vacate a property, yet this is pretty much exactly what many – indeed, nearly 50% – of businesses are doing with their former workers. It takes just one employee to cause massive, perhaps irreparable damage to a business’ reputation by accessing and sharing enormous volumes of sensitive data. So, beyond ensuring that they remove access to corporate systems immediately after the termination of workers’ employment, how can they best protect their data and avoid a damaging breach? Managing access becomes more complex If your organisation has been lucky enough to avoid a serious data breach, that’s not necessarily cause for complacency. Until you can ensure that you control every worker’s access to sensitive data, including and especially after they’ve left your business, the stable door remains wide open. It’s only a matter of time before an employee accesses and leaks sensitive information, either maliciously or by accident. Instead, congratulate yourself on your good luck so far – and take steps today to improve your organisation’s identity governance. This can seem a daunting task at first, especially if your IT teams currently spend significant amounts of time struggling with the complex question of who has access to what. This difficulty is often compounded when an organisation is going through a period of significant changes, for example during digital transformation projects, when a company may be making many new hires or employees may be changing roles. Any change to the workforce – even the promotion or sideways move of a single employee – heightens the risk of a worker being able to access information or systems that they’re no longer authorised to view. Similarly, it should be obvious that when an employee leaves, their access privileges should be immediately revoked, but sadly we’ve seen how this often isn’t the case. But there’s another side to the coin. When an organisation forgets or otherwise fails to update an employee’s access, they can leave ‘orphaned’ accounts, and these represent a particularly tempting target for hackers. That’s because hackers can use these as cover, hacking into unguarded, unwatched dormant accounts to steal sensitive data through seemingly legitimate access and without raising the alarm. Making identity governance manageable Faced with the growing complexity of access management, how can an organisation respond without further burdening already-overstretched IT teams? The answer, as with so many other areas of business today, is through intelligent automation of access. Choosing the right identity governance solution means that an organisation can manage access far more effectively, removing, at a stroke, the risk of forgetting to update privileges whenever an employee’s role changes or when they leave the company. An effective identity governance system can also help you to manage potential security and compliance risks, while also ensuring that every digital identity throughout the organisation is kept secure. What’s more, they provide a far greater level of oversight so that IT and other parties can easily keep track of who can access what data.
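One hygiene check that such automation typically performs is comparing the accounts that exist across systems with the current roster of active employees, and flagging anything left over. The Python sketch below illustrates the idea only; the roster, system names and accounts are invented for the example, and a real identity governance platform would pull this data from HR and directory systems automatically rather than from hard-coded sets.

```python
# Rough sketch of an orphaned-account check: flag accounts whose owner
# no longer appears on the active-employee roster. Data is illustrative.

ACTIVE_EMPLOYEES = {"a.jones", "b.smith"}

ACCOUNTS_BY_SYSTEM = {
    "corporate_directory": {"a.jones", "b.smith", "c.leaver"},
    "cloud_storage": {"a.jones", "c.leaver", "d.contractor"},
    "crm": {"b.smith"},
}


def find_orphaned_accounts(active, accounts_by_system):
    """Return a mapping of system -> accounts with no matching active employee."""
    return {
        system: sorted(accounts - active)
        for system, accounts in accounts_by_system.items()
        if accounts - active
    }


if __name__ == "__main__":
    for system, orphans in find_orphaned_accounts(ACTIVE_EMPLOYEES, ACCOUNTS_BY_SYSTEM).items():
        print(f"{system}: review or revoke {', '.join(orphans)}")
```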
The lesson is clear: don’t lock your stable door after the horse has bolted. It could be galloping away with your most precious resource: your company’s most sensitive information. ### How AI Can Improve The Employee Experience More and more businesses are realising the power of AI and how much it can impact the success of their business. But something that many may not have considered is how it can be used to impact the experience that employees have whilst at work, and how it can measure company culture. The capabilities of AI are advancing every day and crossing into every aspect of business, so it makes sense to get your company on board and harness its power before your competitors do! Alex Tebbs, co-founder of unified communications company VIA, shares his insights on a few ways that businesses can use AI to improve the employee experience. Revolutionise your recruitment drive Utilising AI-powered assistants in your hiring process can significantly improve the experience that potential job candidates have in the application process. AI can provide support in the initial stages of hiring by responding promptly to applicants, keeping them informed of any updates to boost their engagement throughout the process. Artificial intelligence has an edge over humans here, as it can determine the best time to reach out to different candidates, as well as the language and tone to use. This also helps to cut down the amount of time your current employees need to spend on recruitment, giving them more time to spend on their primary job role. Rather than spending time sending emails and doing hiring admin, your manager can dedicate more time to that big project they’re working on. In this way, AI assistants benefit not only your potential future employees, but they also reduce the number of additional duties your current team members would need to take on. Supercharge your training programs AI can also be used to revolutionise your training processes. Rather than handing out the same introductory binder to each new hire, consider using AI to deliver your training modules. The machine learning algorithm can predict, on an individual basis, the optimum time for your employees to learn and which format suits them best based on their previous training successes. In this way, AI can also offer personalised feedback and, in a way, act as a personal mentor to your team members. If the system recognises that someone has trouble with a particular topic or process, it can recommend additional training or repeat exercises to bring those skills up to scratch. It can also deliver these insights to you as a manager so you can have an in-depth understanding of your team’s knowledge and competencies. Improve collaboration and working relationships The relationships between employees are crucial for a positive and harmonious working environment. But AI can take that to the next level and deliver insights about which people work best together, and which people should form different components of a team. This kind of data simply isn’t available without artificial intelligence, but it can transform your company’s productivity and operational efficiency. If you are part of a large or international business, AI can also help to facilitate successful collaboration across teams which may otherwise find it difficult to share information efficiently.
Machine learning can predict the best time to send data to and from different teams, as well as which formats are most beneficial for easy understanding and processing. The application of this technology isn’t limited to employee-to-employee relationships either. Why not use it to measure the success of your sales team’s interactions with different clients to help you determine who is the best person to put forward as a primary point of contact? Or even who to send to pitch for business from a certain type of company, in a certain industry, in a certain location? The number of variables involved in such analysis is often too great for humans to process efficiently, but AI can offer powerful insights on this and many more aspects of workplace interactions. Advance personal development Not only can AI be a game-changing tool when it comes to training, it can also help to identify which team members would benefit from which personal development opportunities. For example, if someone shows an aptitude for communication but struggles with confidence, this can be flagged to their manager who can arrange suitable training opportunities. This person could then go on to be an excellent public speaker and representative for your business, but this can fly under the radar without the insights from AI. Perhaps you have a team member who you think is nearly ready to take a step up and receive a promotion. AI can help you to identify any aspects of the new role which the employee may not be entirely prepared for, so you can then provide a suitable solution. Machine learning can even advise on which format would be best for this, whether it be additional formal training, a shadowing opportunity, or even just a bit more experience. In this way, you can be sure that you’re providing your team with the most appropriate development opportunities for their skill sets, as well as ensuring your business grows in a consistent, sustainable way. Though AI can seem like a highly advanced and, to some, an unattainable tool for the workplace, it’s one which every company should be investigating to propel their business growth to new heights. ### Fuel Your Engine: 8 Tips for Deploying a Seamless Cloud Migration Strategy Cloud computing enables organisations to dramatically scale their operations and provide greater access to their networks while maintaining security and cost-efficiency. Migrating to the cloud can be challenging and expensive, but is well worth the effort once you have successfully moved your data storage and IT operations to a cloud network. Many businesses are choosing to adopt a cloud environment because the flexibility and scalability of the cloud make it a competitive advantage. A major benefit of cloud computing is that it provides redundancy, which allows you to restore your data. Moving to the cloud can also help companies expand their operations while mitigating the risks of IT outages and data loss. Organisations can choose between various public, private, hybrid, or multi-cloud solutions. Public clouds are multi-tenant environments, meaning that you share computing resources with other users. They are cost-efficient and easy to manage. Private clouds are single-tenant environments, so they are more secure than public clouds and offer the customer greater control.
The drawback is that private clouds require greater management, and their scalability is limited because you have to acquire new infrastructure to expand. Many organisations opt for hybrid solutions, which combine private and public clouds to maximise security whilst maintaining scalability and efficiency. You can rely on public infrastructure for less sensitive requirements, and transition to a private environment when handling privileged data. Multi-cloud solutions are similar, except they combine services from multiple providers. However, these solutions can be expensive and complex to manage. The various cloud services on offer can generally be divided into three categories. Software as a service (SaaS) is the most popular option, as it allows you to access pre-engineered software. The provider handles the technical side so you don’t have to. An example is Gmail. Infrastructure as a service (IaaS) involves renting computing resources like servers and storage space from a cloud provider. Platform as a service (PaaS) combines the two so customers can customise their cloud infrastructure without having to build everything from scratch. Challenges of Moving to the Cloud There are several issues to consider when attempting to implement cloud migration. Many organisations fail to adequately plan the move and end up facing unpredicted costs and delays. Some pitfalls to look out for are: Vendor lock-in—some organisations come to rely too heavily on a single cloud provider, which makes it difficult to switch to a new service. Unexpected costs—if you don’t adequately plan your budget, your cloud operation costs can rise exponentially. If you fail to predict your future needs, you might not select the most cost-effective cloud solution. Security—managing cloud security can be challenging. If you rely exclusively on a public cloud, you have little control over the environment, because you share the same computing resources with other users. Time and complexity—migrating your entire workload to the cloud can be time-consuming and technically complex. Some organisations don’t have the necessary know-how to implement it. Legacy software—some legacy applications are difficult to migrate, especially if you lack visibility into your legacy components, or if you don’t have a proper migration plan. For example, apps built around a monolithic mainframe architecture are harder to rehost. Workplace culture—siloisation (lack of collaboration between departments) is a major obstacle to an agile work process. Development and business teams often view the security and technical aspects of migration as someone else’s responsibility, leaving it to the IT department, and this attitude can hinder a successful migration. 8 Tips to Plan Your Migration to the Cloud It is important to plan your migration in advance. The following practices can help ensure a smooth and efficient transition to the cloud. 1. Understand Which Cloud Solution Fits Your Needs Before you attempt to move to the cloud, you need to analyse your workload and research the available cloud services and tools that can help you migrate. Evaluate which cloud offerings meet your operational requirements, taking into consideration cost, usability, flexibility, scalability, security, and access. You should only select a cloud service that lets you track usage and billing data. Another issue to consider is redundancy and data assurance. You should use a cloud solution that allows you to back up your data easily. For example, AWS allows you to back up your data as EBS Snapshots.
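That kind of backup can also be scripted. Below is a minimal sketch using the AWS SDK for Python (boto3) to create an EBS snapshot of a volume; the region and volume ID are placeholders, and AWS credentials are assumed to already be configured in the environment.

```python
# Minimal sketch: creating an EBS snapshot of a volume with boto3.
# The volume ID and region below are placeholders, not real resources.

import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")

response = ec2.create_snapshot(
    VolumeId="vol-0123456789abcdef0",          # placeholder volume ID
    Description="Pre-migration backup snapshot",
)

print("Snapshot started:", response["SnapshotId"], response["State"])
```

Snapshots taken this way can later be used to recreate volumes, which is useful both for routine data assurance and as a fallback during the migration itself.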
2. Decide on a Cloud Migration Strategy Having a clear migration strategy is the most important element of a successful plan. There are several ways to move your applications to the cloud, including rehosting, replatforming, and refactoring. Rehosting, also known as lift-and-shift, is the simplest and fastest method. The cloud provider hosts your existing software without you having to modify it first. However, rehosting is not the most efficient way to make the most of the cloud. Replatforming is the next step up. You modify your infrastructure slightly, but the cloud provider takes care of most of the infrastructure and maintenance. It allows you to keep your core architecture while reducing management, operation, and licensing costs. Refactoring, also known as re-architecting, is the most advanced, and therefore the most time-consuming, approach. Your entire software and infrastructure are adapted for the new environment, allowing you to take full advantage of the flexibility of the cloud. However, this strategy requires greater expertise and a larger up-front investment. Migrating large networks or heavy legacy applications can be difficult, so you might want to opt for a phased migration strategy, starting with rehosting, and then working your way up to a full refactor. Start with small applications and expand your migration efforts incrementally so you can learn as you go and refine your migration plan. 3. Plan Your Budget: Cost is often the main incentive for moving to the cloud, but not all cloud offerings are cost-efficient for every workload. Estimate your cloud computing costs before you attempt any migration. Factors to consider include licensing, setup, and data storage costs, as well as ongoing maintenance costs. Bear in mind that low upfront costs (as in rehosting) can cost you more further down the track. A greater upfront investment (as in refactoring) can generate significant savings in the future. You can refer to a cloud calculator or migration consultant to help you budget for the cloud. 4. Know Your Applications: Keep track of your open source components and legacy software in an inventory. To ensure visibility, run a discovery to identify any applications or components you may have overlooked. If you lose visibility into your applications, you cannot properly maintain them or resolve vulnerabilities. A discovery can also help you determine which components you can do without, so you can reduce your migration footprint. Some architecture components are easily replaced on the cloud. 5. Embrace Automation: Automate the repeatable elements of your migration strategy to accelerate the process and improve consistency. Automation also helps you streamline builds and edits while reducing human error so you can avoid downtime. 6. Plan for Maintenance: Allocate funds and time for ongoing management. Moving to the cloud reduces your management burden, as the cloud provider handles most of your hardware and software maintenance needs. However, this doesn’t mean you can rest on your laurels. You will need to assess your needs on an ongoing basis, scaling capacity or even switching providers to respond to new demands. 7. Adopt a Culture of Shared Responsibility: This is perhaps the most challenging factor to implement. Your entire organisation, including developers, operations staff, and business leaders, should be involved in the cloud migration process from the beginning. Don’t leave it to IT or engineering.
Achieving this may require training your staff or adopting new communication protocols to ensure that every department collaborates and shares responsibility for the migration. The idea is to shift the mindset at your workplace and make your staff more comfortable with cloud services. 8. Prepare a Disaster Recovery Plan: You need to have a cloud-based disaster recovery (DR) solution in place. This is cheaper than an on-premise DR solution. Use a DR plan that replicates your on-premise infrastructure in the cloud and continuously updates your data. This will allow you to continue working as normal, via the cloud, in the event that your hardware is damaged. Conclusion As speed and scale drive up the competitiveness of apps, more and more organisations are turning to the cloud. If you don’t want your business to fall behind, you should consider migrating your applications to the cloud. However, before you do, make sure you are ready with a well-thought-out migration plan. ### Keeping Safe With Smart Devices & IoT In this article, we will be using the term “Internet of Things” or “IoT” as a catch-all term to describe interconnected smart devices, machine-to-machine (M2M) communication, and related software/hardware technologies. The data insights and automation potential provided by Internet of Things (IoT) technologies have created great opportunities for process improvement and other revolutions in how businesses operate. The popularity of these devices is rising rapidly – the global IoT market is estimated to reach 22 billion actively connected devices across different IoT industries worldwide by 2025. As with any new technology, the Internet of Things is not without its cybersecurity drawbacks. Any business seeking to implement IoT devices needs to do so with cybersecurity as a predominant concern if they wish to keep their devices, systems, and data safe. How Can IoT Help Businesses? Technologies under the IoT umbrella can do wonders for businesses across a variety of sectors, though a significant impact has been demonstrated with its use in industrial companies. IoT devices, along with other rapidly developing technologies such as 3D printing, artificial intelligence, quantum computing, and advancements in energy storage, have sparked an unprecedented revolution in the capabilities of industrial companies, leading to what is becoming known as the Fourth Industrial Revolution. IoT devices allow industrial companies to: remotely control and monitor their supply chains; combine IoT sensors with existing technology to aid in the prediction and implementation of preventative maintenance by monitoring for signs of wear and enacting the needed solution; and operate their processes in a more energy-efficient way through real-time energy data. For businesses in other sectors, IoT devices can bring forth considerable change in the form of improved data insights. These improvements can provide companies with benefits that include greater stock/inventory control, improved efficiency of scheduling, and waste reduction, among other enhancements. The Dangers of IoT While there is a clear advantage that can be gained through the data collection and automation that comes with IoT devices, they are not without their risks. Many IoT and smart device users do not realise how accessible some of their devices may be.
The search engine Shodan scours the web for devices with lacklustre password protection and displays screenshots of what it was able to access - the most notable examples being streams from IoT security cameras used in both business spaces and private dwellings. This list is far from comprehensive; however, it serves as a reminder that insecure IoT devices can be exploited and should be implemented with due caution. Lack of Built-In Security for IoT & Smart Devices Businesses wishing to leverage the power of IoT need to ensure that the manufacturers of IoT devices they use are taking cybersecurity seriously. In an effort to mass-market their devices and keep costs low, many smart device manufacturers have opted to forgo investing in cybersecurity and have instead prioritised the cost-effectiveness of their devices to attract buyers for this evolving market. While governments such as the UK and the US are beginning to take IoT cybersecurity legislation seriously, for the most part, external pressure on smart device manufacturers to prioritise cybersecurity in their device development process has not been sufficient. Many smart devices do not come integrated with two-factor authentication (2FA) features, nor do they encrypt the data they collect and transfer. Until smart device manufacturers are held responsible for implementing security as a priority from day one of development, smart devices will continue to be a viable vector for cybersecurity threats. Examples of IoT Security Breaches Historically, IoT devices have been used as an entry point for malware attacks. To provide further context, here are a few high-profile examples. Distributed Denial-of-Service (DDoS) Attacks In 2016, a botnet known as “Mirai” executed a Distributed Denial of Service (DDoS) attack by exploiting the default passwords of a variety of IoT devices. The DDoS attack led to the loss of internet connectivity for a large segment of the east coast of the United States. The relative ease with which this attack was implemented provides insights into how IoT technologies can be exploited for nefarious purposes, and it was far from the only botnet attack powered by compromised IoT devices. Stuxnet’s Attempt to Destroy Nuclear Machinery While the creators of Stuxnet are not confirmed, extensive study of this highly evasive computer worm has confirmed its purpose - to target centrifuges used in nuclear plants and reprogram them to perform cycles that are damaging to their physical components. Stuxnet provides a cautionary tale that the vulnerabilities of connected machinery can cause issues greater than lost data and disabled software - they can cause serious physical damage to the devices, or even endanger lives. How a Thermostat Led to a Data Breach An unnamed casino in Las Vegas had a database of customer data stolen in the most unexpected way possible - through their aquarium’s thermostat. The casino used a wifi-connected IoT thermostat to monitor and adjust the temperature of the aquarium. As the thermostat was on the same network as their customers’ data, cybercriminals were able to exploit the thermostat’s vulnerabilities to use it as an entry point. Shifting from Cloud Computing to Fog Computing Contrary to popular belief, the Internet of Things can function without an external cloud computing provider; however, the advantages of cloud computing must be strongly considered before deciding to shift to a cloudless solution.
Businesses that wish to have total control over their data can leverage their own locally-controlled “fog computing” infrastructure to allow them to process and transmit their IoT data without sharing it with an external cloud computing provider. Fog computing is a decentralised computing infrastructure that transmits data from IoT devices to a gateway on the local area network (LAN) that handles the transmission of the data to the appropriate processing hardware - the use of local hardware for this data processing is often called “edge computing”. While fog computing and edge computing offer a suite of advantages including reduced latency and greater control of how data is shared and transmitted, they are not without their vulnerabilities. Businesses that leverage fog computing and edge computing to store and transmit their data are the sole providers of the physical, procedural, and technical cybersecurity measures needed to protect the data they collect, which is no small task. Leading cloud computing providers are heavily invested in implementing and maintaining leading cybersecurity measures to protect the data they store, transmit, and process, as their entire business model relies on their reliability and security. Businesses that simply want to use IoT devices as an upgrade to their usual operations may not reasonably be able to maintain a similar level of security as cloud computing providers, leading to greater cybersecurity risks if their use of fog computing is not performed with equally robust cybersecurity measures. How to Use IoT Devices Safely While not without their risks, IoT devices do present an unprecedented opportunity for advancements in data collection and transmission that can lead to incredible gains in capabilities and efficiency. To use IoT devices safely, there are key security measures that can be implemented. Encrypt Sensitive Data While the process of encrypting data before it is sent for processing to the cloud computing provider can cause delays in data transmission, it is an important step for sensitive data. Companies that transmit sensitive data (such as medical data) from an IoT device to the cloud need to take the sensitivity of that data seriously; however, more innocuous data can be left unencrypted. Use Unique Passwords As seen with the Mirai botnet DDoS attack, one of the methods used to exploit IoT devices is software that attempts to gain entry using known factory-default passwords set by the manufacturer of the IoT device. In addition to changing the factory-default password to a unique one, the IoT devices used should be manufactured so that they cannot be reset to the factory-default password, as that ability could provide cybercriminals with an added attack vector. Use a Separate Network for IoT Devices For businesses that are controllers of sensitive data, that data must be held on a network that is separate from their IoT devices. Due to the relative immaturity of IoT device security, keeping them on a separate network reduces the possibility that they can be used as a point of entry to the main network. Choose Your IoT Cloud & Device Providers Wisely Businesses that opt to take advantage of the power of cloud computing for their IoT devices need to carefully choose an IoT cloud platform provider that they can trust to secure the data they store and process on their systems.
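To make the "Encrypt Sensitive Data" step above a little more concrete, here is a minimal Python sketch that encrypts a sensor reading before it leaves the device, using symmetric encryption from the cryptography package. The device ID, reading and key handling are purely illustrative assumptions; in practice the key would be provisioned per device and kept in secure storage rather than generated inline.

```python
# Illustrative sketch: encrypting a sensitive sensor reading before transmission,
# using symmetric encryption from the "cryptography" package (pip install cryptography).
# Key handling is deliberately simplified; a real deployment would provision and
# store the key securely, not generate it inline.

import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()   # placeholder: would normally come from secure provisioning
cipher = Fernet(key)

reading = {"device_id": "pump-07", "heart_rate": 72, "timestamp": "2020-01-01T10:00:00Z"}
ciphertext = cipher.encrypt(json.dumps(reading).encode("utf-8"))

# ...transmit `ciphertext` to the gateway or cloud endpoint...

decrypted = json.loads(cipher.decrypt(ciphertext).decode("utf-8"))
print(decrypted == reading)   # True: the reading round-trips intact
```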
Businesses will also need to choose IoT products that are manufactured with cybersecurity as a significant concern. When deciding on IoT device providers for your needs, consider the following: Does the device manufacturer provide a public point of contact for cybersecurity vulnerability reports? Do they have a history of taking these reports seriously? How long will the device manufacturer provide security updates for their products? Does the device manufacturer prioritise cybersecurity as a part of its production mandate? IoT cloud service provider (CSP) cybersecurity considerations: Does the CSP have a history of taking cybersecurity seriously? What cybersecurity measures has the CSP taken to protect data? Can the CSP offer encryption and other security measures at scale as the IoT infrastructure grows? The above list is far from comprehensive and each business use case will have unique cybersecurity considerations. As IoT technologies continue their rapid growth, businesses looking to take advantage of the evolving insights and features they provide will need to prioritise cybersecurity and continue to research and implement the best possible solutions for their needs. ### How Ready-To-Deploy Widgets Make Building a Website Easier Building a website from scratch was very difficult in the past. Firstly, web developers did not have a proper IDE (Integrated Development Environment) to work in. Secondly, there was little to no scope for making websites responsive; doing so manually was all but impossible. Finally, websites back then were static. Building a dynamic website was something developers could only dream of. With time, however, things have changed. Developers now get to work in proper IDEs. These allow them to work on their HTML and CSS files simultaneously. At the same time, developers get a live preview of what the site looks like as they keep coding. Using frameworks like Bootstrap made it easier for developers to make websites responsive. Lastly, modern websites are dynamic. Not only are they visually appealing, but they also handle their functionalities very well. One of the most useful features of the modern-day website is its ability to host widgets. Widgets are available for almost all types of services these days, and integrating them into a website can make life easier for developers, which makes building a website a lot easier as well. So how is it that ready-to-deploy widgets manage to do so? Let us find out. Reducing the Need to Write More Code Since the widgets are ready for deployment at any given time, developers need not write additional code for them. Be it CMS or HTML/CSS-based, widgets are available for all types of websites. Whoever creates the widget does the work for them. So if you are fiddling with the Tomorrow.io widget, you can copy-paste the code they provide on their website. Tomorrow.io is a super accurate weather app, one of the most reliable available, and can provide better weather predictions than most other weather apps. It took developers months to build something like Tomorrow.io. Just the weather forecasting capabilities probably took two or three months to integrate. Then comes the AI and ML integration. Imagine doing all that by yourself. Coding it from scratch could take you years. That would especially be true if you were an amateur web developer. However, since Tomorrow.io provides a widget, you do not need to code anything.
You can copy-paste the widget or API code from their website and use it on your site. That will help you save time and effort. You will also get to use Tomorrow.io’s servers while using their widget. The same goes for any other widget or API that you plan on using. Your web server will not have to take the load for any of the widgets you have there, unless you made it yourself, or are hosting it on your server. Integrating Artificial Intelligence and Machine Learning with the Widgets Building a widget is difficult in itself. So when you think about integrating AI or ML technologies into it, things are bound to get even more difficult. However, you need not do that either. As per their needs, developers these days use AI and ML to train their widgets. These widgets are therefore smart and carry out a diverse range of functionalities. As one of the world’s most reliable weather apps, Tomorrow.io equips its weather widget with AI technologies. It uses AI and complex event processing to analyse the weather data it retrieves from the nearby weather radar. Afterwards, it provides users with the weather forecast for that area. Thanks to the use of AI, the forecasts are usually accurate. Apart from that, the AI also helps the weather widget give out intelligent weather advice. For instance, it will advise users to take a raincoat or umbrella if there is a possibility of rain. Similarly, it will tell people to drive safely as the roads might be slippery after a downpour. Improving Website Visuals Modern web widgets have stunning visuals. Alongside a simple user interface, the widgets also come with interactive graphics and visuals. Such visuals are there to appeal to the visitors. They help improve the overall aesthetics of the websites. And since most widgets are responsive, you can expect them to adjust to different screens accordingly. That helps keep the website’s look uniform no matter how, or on what device, you view it. Widgets also give the website a more dynamic feel, especially those that are continuously updating. Take the Tomorrow.io widget as an example. The weather data changes once every hour or so. Hence, there is an update to the visuals, or at least the numbers on the screen. Such real-time updates keep the dynamic feeling of modern websites intact. These are some of the most crucial ways widgets are helping to make web developers' lives easier. Widgets help save time and effort. They also help save developers or hosts money on server space. Thus, it is safe to say that a modern website is unimaginable without at least one or two widgets. ### Reaching cloud nine: how to stop the high cost of cloud sprawl As we start a new year and anticipate new technologies entering the market, one solution that will continue thriving is cloud computing. According to McKinsey, cloud-specific spending is expected to increase six times faster than general IT spending through 2020. From a regional perspective, it’s worth noting that the UK is leading the way in adoption. In fact, an Infosys report categorised British companies as ‘torchbearers’ in terms of their cloud adoption rate. The answer to the question of why cloud computing is so popular could be that it offers myriad benefits. These include agility, collaboration and productivity, along with the flexibility to create new cloud environments on an ad-hoc basis whenever they’re required. With benefits aplenty, it’s easy to overlook its downsides.
One of the most concerning pitfalls is that companies often invest in cloud applications and neglect to devise a plan to realise ROI. This is reflected in a Gartner report which cites that less than 30% of businesses have devised a cloud computing strategy. Consequently, this tunnel vision could result in organisations paying for a product that isn’t required or is inadequately managed. With cloud computing growing in necessity, how can companies improve how they manage their investment, reduce cloud sprawl and achieve tangible business benefits? Back to basics As the age-old saying goes, you can’t ‘run before you can walk’, and this is the same when it comes to the cloud. Before investing in cloud applications, it’s important to go back to basics. Firstly, organisations must consider how their teams will actually use the cloud services, what capabilities are required to enhance productivity, which cloud applications will drive results and which will be redundant. Using a solution like IT Service Management (ITSM) enables organisations to monitor their cloud usage as well as capture and track all cloud application requests. From a business perspective, this means that companies can maintain a comprehensive inventory that includes the relevant attributes of each cloud environment. These steps may seem obvious, but are necessary to establish a roadmap which outlines how cloud applications are being leveraged and how they support the business strategy. Problems, protocol, profit Keeping a cautious eye on a cloud application’s lifespan is critical to ensure it is cost-effective; otherwise, unrestricted applications can be a huge money pit and drain resources. It’s important that organisations screen, and in some cases reject, requests for cloud environments in order to ensure that IT budgets are being spent wisely. It’s imperative that, before a company invests in new cloud applications or platforms, these are vetted and undergo a strict approval procedure. This procedure should be based on a business case explaining why that particular cloud instance is required, its cost, and the expected benefits to the business. Additionally, identity management tools can be used to control who can approve and access certain cloud applications, and for how long. In the same vein as approvals, it’s important to have someone dedicated to the business’s cloud landscape, responsible for maintaining it and ensuring that each profile is kept up-to-date based on the cloud application’s lifecycle and business terms. This may come in the shape of an individual or group – for example, a Cloud Approval Board. An obvious choice would be an IT leader, as their skills and involvement naturally lend themselves to this role. They can, for example, call upon their experience to establish how a cloud solution can assist the company to achieve its business objectives. Keep an eye on time It’s unavoidable that companies will require cloud applications for short-term projects as business booms. However, these are often easily forgotten and left running – and draining resources – once the work is completed, with the costs continuing to stack up. The good news is that businesses can overcome this pain point by arranging pre-set end dates for cloud applications. They can save even more time by using an automated solution which can independently enforce these end dates. This gives power back to businesses by allowing them to contain and limit their cloud spend.
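As a rough illustration of that kind of automated end-date enforcement, the Python sketch below uses boto3 to stop any running EC2 instance whose hypothetical "end-date" tag has passed. The tag name, region and the choice to stop rather than terminate are illustrative assumptions rather than a prescribed implementation, and AWS credentials are assumed to already be configured.

```python
# Rough sketch of automated end-date enforcement for short-lived cloud resources:
# stop any running EC2 instance whose hypothetical "end-date" tag has passed.

from datetime import date

import boto3

ec2 = boto3.client("ec2", region_name="eu-west-2")
today = date.today().isoformat()  # ISO dates compare correctly as strings

reservations = ec2.describe_instances(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
)["Reservations"]

expired = []
for reservation in reservations:
    for instance in reservation["Instances"]:
        tags = {t["Key"]: t["Value"] for t in instance.get("Tags", [])}
        if tags.get("end-date") and tags["end-date"] < today:
            expired.append(instance["InstanceId"])

if expired:
    ec2.stop_instances(InstanceIds=expired)
    print("Stopped expired instances:", expired)
```

Run on a schedule, a check like this would catch forgotten short-term environments before they accumulate another billing cycle.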
Automated control also ensures that cloud activities don’t side-step stakeholders, who can access an overview of all cloud resources whilst they are running, not just at the end of the billing cycle. Beyond this, companies need to also consider the lifecycle of cloud-based apps to ensure that they stop running after the employee who was using them leaves the business or changes departments. This timely termination of cloud instances isn’t only imperative to cost-effectiveness, but also to security. Automating application control and revoking access to cloud instances during staff offboarding crucially prevents security breaches or data thefts which could be caused by unmanaged applications or unauthorised access. It’s clear that cloud computing is a core component of digital transformation, and its adoption, and inevitably waste, is set to continue to rise. It’s prime time for organisations to improve their governance of cloud computing and ensure that their resources address their wider business plan. By determining cloud computing requirements and eliminating redundant cloud costs, businesses unlock the door to full ROI potential. ### Virtual reality to transform surgical training Operating on a human body is one of the most daunting tasks anyone can carry out, and one that carries enormous accountability. As a result, a new virtual reality system may be a complete game-changer as it offers a new, safe and immersive way for trainee surgeons to prepare for the operating room. Virtual reality is most often used for entertainment purposes and to wow users, but this application of virtual reality goes far beyond entertainment and could really aid the effectiveness of surgical training before students have access to real live human beings to operate on. Training to be a surgeon is an enormous undertaking that consists of many years of studying, watching other people operate in video form, practising techniques on cadavers and fake bodies, and helping out in operating rooms. A study found that over 30 percent of trainee surgeons in the United States are still unable to perform operations without supervision by the time they graduate. Virtual reality could offer a new way to learn how to be a surgeon. The CEO of Osso VR, Justin Barad, has created a virtual reality system, Osso, which simulates orthopaedic operations. The system is made up of a headset and two hand controllers that enable students to carry out virtual surgery on virtual patients with virtual equipment such as knives, drills, nails, screws and hammers. Students can feel the vibration of the drilling through their controllers. Of course, the simulated equipment is not the same as real tools, but VR can provide some sense of operating through the controllers. Barad tested the system by allowing a group of eight medical students in their first year of study to try out Osso for the first time. To test the system and how it compares to existing surgical preparation, half of the students used the VR system for a period of 15 minutes, and the rest worked from an instruction manual which provided directions on nailing a metal rod into a shin bone. The students who used VR as training equipment were more successful and accurate when they were told to repeat the surgery on a plastic model. VR has therefore proven effective in experiments and could be a more affordable alternative to established surgical preparation techniques.
Barad hopes the technology will offer a less expensive and more effective approach to quickly training students. This does not mean that the intention is for VR to replace the training already carried out by students where they operate on physical plastic models; rather, the technology should be a complementary tool used in addition to other ways of learning, to help them prepare for performing actual operations on live people. A common technique for the preparation of medical students for surgery is simulation-based training, which takes place in a hazard-free environment. At the University of Massachusetts Medical School, VR simulation has become a part of third-year medical student training, by simulating different surgical possibilities. The aim of the surgical simulations is to apply knowledge obtained from studying and reading about surgeries and to help bring the surgical scenarios to life without putting real human beings at risk. Medical simulation is the imitation or recreation of real medical situations. These situations include simulated conflicts and medical procedures, with the simulation treated as a real human being, with consideration for vital signs and the damage that could be caused by mistakes. Kansas City University is also constructing a VR simulation centre as the medical department wishes to embrace the technology already used at the University of Massachusetts to complement their current training with models and cadavers. One student explained that the simulation truly feels like you are in a genuine operating room. The simulation features a patient on a surgical table in front of you and you then must collect equipment from around the room in order to perform surgeries. Although models and cadavers offer physical and tangible operating practice, the immersive experience of simulation adds to the overall training experience by bringing the operation and operating environment to the students. Osso immerses its users in the operating room and confronts them with a realistic-looking body to operate on; they are able to operate by moving the hands of the simulated surgeon with hand-held controls, transforming them into fully fledged virtual surgeons for the time they wear the VR headset. Virtual reality innovations like Osso will likely be further developed and perfected for the training of future surgeons, and if it is made widely usable, students as young as secondary/high school age could have the chance to pick up surgical knowledge and skills without any real human body needed. Virtual reality is likely to be very popular with young technology-minded students as they look for more efficient and engaging ways to learn. ### The Digital Toolbox For The Future Of Work This year we will see more companies change the way they work to improve employee productivity and attract new talent. This is crucial as the demand for more flexible and collaborative working environments is on the rise. By 2025, web meetings business PowWowNow estimates 75% of the workforce will be millennials, and 70% of this group will want to work remotely. In addition, as more companies internationalise and serve diverse markets, it’s natural their workforce will become more distributed across the globe too. Therefore, to keep up, companies need to start thinking now about how to cater to the remote teams of the future.
This requires a better set of digital and cloud-based applications to make sure teams are able to communicate and collaborate as effectively as possible. Our favourite tools for communication and collaboration: working at .Cloud, the smart domain for modern business, I've been exposed to a variety of inspiring companies who are redefining the future of work. For example, Screen.cloud offers cloud-based digital signage that's stunning and simple to use, while RIO.Cloud is making fleet management more comprehensible across brands. At .Cloud we also embrace modern ways of working, and I wanted to share some insights and tools. First and foremost, it's important to find technologies that inspire you and your teams, and it's amazing what the right one can do for better communication and collaboration. There are many applications transforming the modern workplace, like Trello, Jira and Slack. It's important to listen to your team and get feedback, as they are the ones who will be using them. It's also worth considering how the tools can be integrated with other systems and whether they are cloud-based. There are two pieces of tech the .Cloud team loves: Trello and video conferencing. Trello is one of our favourite business apps for team collaboration. I originally encountered it while looking for a simple to-do list, but through the years I've discovered how well it works for team collaboration and visualising any process. We use it to create roadmaps, projects and editorial calendars. It's flexible and fun, and we found it particularly useful for staying visually organised across projects and workflows. Trello Inspiration is also a great place to find new ideas and templates. Another important tool, especially with dispersed teams, is video conferencing. There are different options available such as Google Hangouts, Skype, GoToMeeting and Zoom. I think it's great that some of these services offer a free version, because it allows you to try them and see what fits. At .Cloud we use a handful of free services to be flexible in accommodating different customers and regions; however, we are transitioning to a paid service as our needs change. One of our considerations is how well they support the Asian region, as we are doing more business there and we recently launched .Cloud in China. Disconnect to connect: new apps, if used to their full potential, can prove to be powerful productivity tools that connect teams working in and out of the office. However, they can also lead to sensory overload. As such, there is value in disconnecting to connect – purposely switching off to better ourselves and the relationships within our team. When it comes to team activities, don't forget the importance of meeting up in person when you can. Good hooks for this could be a planning session at the start or end of a year, or a kick-off meeting for a significant project. You could even consider longer-term gatherings such as corporate retreats with team-building activities. Additionally, we see more and more people improving their productivity through meditation and wellbeing. Real creativity comes from deep inside our own minds, and having a clear and calm mind can do wonders for your business and personal life. I recommend Insight Timer, a free app with a customisable timer and plenty of great guided meditations. Businesses that empower employees to explore and use new ways of working are those truly redefining the future of work.
There are already many ways technology can improve employee productivity and enable flexible working policies. Take some time to evaluate what your business needs and which technologies can support you. Above all, being able to transition between online and offline interaction is the true definition of a modern business.

### The changing technology landscape

Technology has evolved drastically throughout the years and will only continue to transform. Advancements in this space are revolutionising the way people work and are also helping to drive businesses forward and keep them competitive. I believe that over the next few years we will see trends such as blockchain and 5G evolve further, and we will begin to see companies looking seriously at these technologies as they consider their digitalisation strategies. Let's take a look at some of these evolving elements in the technology landscape. Blockchain: firstly, I think we are going to see blockchain become a lot more mainstream. There will be high demand for the tech to be implemented across a range of different industry sectors, such as the food industry. Blockchain will be able to assist the supply chain in the food and manufacturing industries, providing an easier and more efficient system to trace and distribute products. It will also allow organisations to react quickly to changes in their supply chain, and be a vital factor in driving them forward. 5G: through 2018 we saw 5G pilots in the UK and around the world, and we will now see the start of the commercial rollout across major UK cities. As with the speed increases of 3G and then 4G, 5G will bring huge benefits and opportunities to businesses as we strive even further towards becoming an "always connected" workforce. More and more people are working from home, with research suggesting that half of the UK workforce is expected to work remotely by 2020 - therefore, 5G will be even more of a necessity. IoT: IoT has impacted many industries over the past few years, improving efficiency and productivity. It is continuing to grow as it becomes a fundamental solution for many organisations, and as 5G widens its reach across the UK, cities, cars, houses and appliances will become smarter, building a vast network of billions of devices. Companies are seeking IoT technology to help them automate and connect processes, building self-sufficient systems and services. The technology will enable SMEs, particularly those in manufacturing and distribution, to strive for Amazon-like levels of service, delivering a huge amount of value to their businesses and their customers. Big Data & AI: as devices and systems get smarter, we are generating more data than ever before, which is hugely beneficial for driving business insight and predicting customer behaviour. The challenge with capturing so much data is how we interpret the information. Over the next few years we will see increased adoption of Artificial Intelligence (AI) technology helping us to derive insight from the information generated. AI technology will become relevant to a wide variety of applications, from providing human-like support communications within software to detecting fraud and irregular transactions. Such technological leaps bring challenges too: with the increasing power of artificial intelligence, we need to consider the ethics of how such intelligence is applied and who it directly affects.
It will be interesting to see what other technology trends start to emerge and how they will affect businesses and the way they operate. This is a very exciting yet turbulent time for the market, but with blockchain, 5G, IoT and big data set to make a big impact within the technology industry, we look forward to seeing it thrive and continue to evolve.

### How to Design a Winning Cloud Procurement Strategy

The procurement and adoption issues around cloud services are many, and their interactions are complex. One of the fundamental problems is that most organisations view cloud procurement as a technology issue based in the IT department, when really it is a core business operations decision requiring stakeholders throughout the organisation. It is vital that CIOs and other decision-makers involve these key stakeholders in the process at the outset so that they clearly understand both the objectives and the desired business outcomes. It is also crucial to acknowledge that cloud computing is fundamentally different from traditional IT, and to understand what this means (in both positive and negative terms) before developing a procurement plan. CIOs must also ensure the cost implications of cloud procurement are fully understood by those who hold the ultimate buying decision. Moving to a consumption model may appear cheaper, but it may not prove to be if services are consumed constantly, without being shut down when not required. For instance, services procured as Platform-as-a-Service (PaaS) may deliver savings over Infrastructure-as-a-Service (IaaS) once internal support costs are considered, and the costs of managing Software-as-a-Service (SaaS) subscriptions and the data in them may also become a factor. Understanding the requirements for managed and unmanaged services – and their true costs – is a key requirement for designing a winning procurement strategy. Security and governance are also different in a cloud environment, and it is crucial that requirements relating to these operational challenges be integrated into the procurement model. It is also important to work with providers who understand this, and who can demonstrate and maintain appropriate services alongside the core delivery process. To ensure that consumption and management of the procured resources are understood and controlled, it is vital to include monitoring and reporting as part of the procured service. Most cloud service providers will provide some level of this, but CIOs must ensure, as part of the procurement process, that the information provided will meet the business needs. Measured service is one of the five essential characteristics of cloud computing as defined by NIST (the National Institute of Standards and Technology), yet monitoring is often overlooked or insufficiently emphasised, leading to failed procurements or abandoned adoption. Crucially, one of the key benefits of cloud computing is its flexibility and agility, and it is important that CIOs are not so overcome by the procurement process that the offering becomes constrained and inflexible. Having the right commercial terms and conditions and the right management controls is, of course, essential, but at the same time there needs to be flexibility for both current and future growth; otherwise, the advantages are lost before transformation can begin.
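The consumption-cost point made earlier is easy to put into rough numbers. The back-of-the-envelope sketch below uses a purely hypothetical hourly rate, instance count and working pattern (none of them taken from any provider's actual pricing) to show how much of a consumption bill can come simply from leaving environments running around the clock:

```python
# Illustrative only: hypothetical hourly rate and instance count,
# not any cloud provider's real pricing.
HOURLY_RATE = 0.20        # assumed cost per instance-hour
INSTANCES = 10            # environments left running
HOURS_PER_MONTH = 730     # average hours in a month

# Consumed constantly, never shut down.
always_on = INSTANCES * HOURLY_RATE * HOURS_PER_MONTH

# Same workload, shut down outside a 12-hour weekday window
# (roughly 12 hours x 22 working days per month).
business_hours_only = INSTANCES * HOURLY_RATE * 12 * 22

print(f"Always on:           {always_on:,.2f} per month")
print(f"Business hours only: {business_hours_only:,.2f} per month")
print(f"Potential saving:    {always_on - business_hours_only:,.2f} per month")
```

Even with these made-up figures, the gap illustrates why shutting services down when they are not required has to be part of the procurement conversation.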
Engaging with the right cloud services provider: it is in the interests of both the consumers of cloud services and the cloud providers to ensure that the requirements and objectives are clearly understood at the outset, and that a clear set of metrics and success factors are agreed and in place. It is vital to establish areas of responsibility and operational demarcations. Is the procurement purely for infrastructure, with the business retaining responsibility for the applications and data, or are these in scope for the cloud service provider? How is the business continuity and disaster recovery strategy to be affected, and who will effect its invocation and testing? How are future updates to the infrastructure, software and applications to be managed? The emergence of hyper-converged infrastructure and its integration with cloud management and cloud brokerage software and applications is now starting to have an impact on the market, and the new capabilities and capacity that this offers are both intriguing and challenging. Meanwhile, we are also starting to see platform offerings from the major application vendors take a real hold in the market and affect the choices available to CIOs. The key is to remember that digital transformation is a journey and not a point of arrival. New options will continue to evolve and new business opportunities will arise as a result, just as old business models will become obsolete. The procurement of services in the cloud will clearly help businesses to become more flexible and adaptable, but it will also increase the competitive pressure to do so. Today's CIOs must not be dazzled by the cloud services options now available but should realise that the right answer for their business is probably a unique hybrid of technology, processes and operations that suits their delivery model and their market differentiation. It is important to establish a trust relationship with a provider or ecosystem of providers that understand this and can help CIOs put the right applications and services onto the right infrastructure components at the right time. Some businesses may be able to move everything to the hyperscale cloud today but, for most, the foreseeable future is hybrid.

### Maintaining control in a multi-cloud ecosystem

Migrating to the cloud can be somewhat liberating. It allows enterprises to leverage operational tools and practices pioneered by the cloud titans. But while these operational tools give enterprises a path to a much more agile IT environment, that speed comes at the cost of control. How can IT teams balance the agility of the cloud with the control required to run a modern enterprise? Command and control: enterprise IT has traditionally operated using a strict command-and-control model. Behaviour is determined by configuration being precisely set across a distributed infrastructure. If something needs to be amended, operators propose changes to the configuration. There are a few drawbacks to the command-and-control model. First, infrastructure becomes quite brittle when it depends on the kind of operational precision required to specify exact behaviour across diverse infrastructure. This is a big reason that many enterprises use frameworks like ITIL. When change is difficult, the best you can do is inspect in excruciating detail.
This, of course, makes moving quickly nigh impossible, and so our industry also employs excessive change controls around critical parts of the year. Second, when behaviour is determined by low-level, device-specific configuration, the workforce will naturally be made up of device specialists, fluent in vendor configuration. The challenge here is that these specialists have a very narrow focus, making it difficult for enterprises to evolve over time. The skills gap that many enterprises are experiencing as they move to cloud? It's made worse by the historical reliance on device specialists whose skills often do not translate to other infrastructure. Translating command-and-control to cloud: for command-and-control enterprises, the path to cloud is not always clear. Extending command-and-control practices to the public cloud largely defeats the purpose of the cloud, even if it represents a straightforward evolution. Adopting more cloud-appropriate operating models likely means re-skilling the workforce, which creates a non-technical dependency that can be hard to address. The key here is elevating existing operational practices above the devices. Technology trends like SDN are important because they introduce a layer of abstraction, allowing operators to deal with intent rather than device configuration. Whether it's overlay management in the data center or cloud-managed SD-WAN, there are solutions in the market today that should give enterprises a path from CLI-driven operations to controller-based control. Minimally, this provides a proving ground for cloud operating models. More ideally, it also serves as a means to retrain the workforce on modern operating models, a critical success factor for any enterprise hoping to be successful in the cloud. Intent-based policy management: abstracted control is critical because it leads naturally to intent-based management. Intent-based management means that operators specify the desired behaviour in a device-independent way, allowing the orchestration platform to translate that intent into underlying device primitives. An IT operator ought not to have to specify how an application is to connect to a user. Whether it is done on this VLAN or that VLAN, across a fabric running this protocol or that protocol, is largely uninteresting. Instead, the operator should only have to specify the desired outcome: application A should be able to talk to application B, using whatever security policies are desired, and granting access to people of a certain role. By elevating management to intent, enterprise teams do two things. First, they become masters of what matters to the business. No line of business cares about how edge policy is configured; rather, they care about what services and applications are available. Second, by abstracting the intent from the underlying infrastructure, operators create portability. Multicloud and portability: portability is a huge part of maintaining control in an environment where infrastructure is spread across owned and non-owned resources. If abstraction is done well, the intent should be able to be implemented across whatever underlying infrastructure exists. So whether an application is in a private data center or AWS or Azure, the intent should be the same. When paired with an extensible orchestration platform with suitable reach into different resource types, that intent can service any underlying implementation. For example, assume that an application workload resides in AWS.
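As a rough sketch of what such device-independent intent might look like in practice, the operator declares the outcome once and a translation layer maps it onto environment-specific primitives. The field names, environments and translation rules below are illustrative assumptions, not any particular orchestration platform's API:

```python
# Hypothetical example: declare business intent once, translate per environment.
intent = {
    "name": "app-a-to-app-b",
    "source": "application-a",
    "destination": "application-b",
    "allowed_roles": ["finance-user"],
    "security_profile": "inspect-and-log",
}

def translate(intent, environment):
    """Map the same intent onto environment-specific primitives.
    The target constructs below are stand-ins for real provider objects."""
    rule = f"allow {intent['source']} -> {intent['destination']}"
    if environment == "aws":
        return {"vpc_security_group": intent["name"], "rules": [rule]}
    if environment == "azure":
        return {"network_security_group": intent["name"], "rules": [rule]}
    if environment == "datacentre":
        return {"acl": intent["name"], "vlan_policy": rule}
    raise ValueError(f"no translator for {environment}")

for env in ("aws", "azure", "datacentre"):
    print(env, translate(intent, env))
```

The prose that follows walks through the same idea: the intent stays constant while the rendered configuration changes with wherever the workload lands.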
The policy dictating how that application functions will be applied to the AWS virtual private cloud (VPC) gateway. If the workload is redeployed in Azure, the same intent should translate to the equivalent Azure configuration, without any changes initiated by the operator. If a similar workload is launched in a private data center on a VM, the same policy should be used. If that application moves over time to a container, the policy ought to be the same. By making policy portable, the enterprise actually maintains control. Even though the underlying implementation might vary, the behaviour should be uniform. This, of course, relies on multicloud management platforms capable of multi-domain and multivendor support, and there needs to be some integration with different application lifecycle management platforms. But operating at this abstracted level is actually the key to maintaining control. Trust but verify: having control but being unable to verify is really no better than not having control in the first place. Ultimately, for an operating model to be sustainable, it needs to be observable. This is true from both a management and a compliance perspective. It means that enterprises looking to maintain control in a post-cloud world will need to adopt suitable monitoring tools that grant them visibility into what is happening. Similar to the policy portability discussion, though, these tools will naturally need to be extensible to any operating environment: private or public, bare metal or virtualized, cloud A or cloud B. For most enterprises, IT operates in discrete silos. The application team and the network team are run separately. The data center team and the campus and branch team are run separately. If a prerequisite for control is visibility, and that visibility has to extend end-to-end over a multicloud infrastructure, these teams need to come together to ensure observability over the full multicloud ecosystem. Tools like performance monitors and network packet brokers will need to be evaluated, not in domain-specific contexts but over the full end-to-end environment. This might mean trading off one tool that is superior in a particular domain for another that is more capable of spanning multiple domains. Ideally, these tools would plug into a broader orchestration platform, allowing observable events to trigger additional action (if this, then that). Start with culture: while there are certainly dependencies on underlying technology, the ultimate key to maintaining control in the cloud will fall back on that common dependency for much of IT: people. Enterprises should evaluate technology, but failing to start with people will mean that the path forward is only partially paved. Coaching teams to elevate their purview above the devices is an absolute prerequisite for any cloud transformation. Breaking free from CLI-centric operating models is critical. And embracing more diverse infrastructure will be essential. The cloud doesn't care about legacy products managed in legacy ways. With a willing and trained workforce, the technology on which multicloud management is built can be effectively deployed in such a way that enterprises get the best of both worlds: agility and control.

### Harnessing the business cloud

In recent years, there has been a major shift in data centre strategies. Indeed, enterprise IT organisations are shifting applications and workloads to the cloud, whether private or public.
Enterprises are increasingly embracing software-as-a-service (SaaS) applications and infrastructure-as-a-service (IaaS) cloud services. However, this is driving a dramatic shift in enterprise data traffic patterns, as fewer applications are hosted within the walls of the traditional corporate data centre. The rise of cloud in the enterprise: there are several key drivers for the shift to SaaS and IaaS services, with increased business agility often at the top of the list for enterprises. The traditional IT model of connecting users to applications through a centralised data centre is no longer able to keep pace with today's changing requirements. According to LogicMonitor's Cloud Vision 2020 report, more than 80 percent of enterprise workloads will run in the cloud by 2020, with more than 40 percent running on public cloud platforms. This major shift in the application consumption model is having a huge impact on organisations and infrastructure. The fact that organisations are migrating their applications and IT workloads to public cloud infrastructure tells us that the maturity of public cloud services, and the trust organisations place in them, is at an all-time high. Key to this is speed and agility, without compromising performance, security and reliability. Impact on the network: traditional, router-centric network architectures were never designed to support today's cloud consumption model for applications in the most efficient way. With a conventional, router-centric approach, access to applications residing in the cloud means traversing unnecessary hops through the HQ data centre, resulting in wasted bandwidth, additional cost, added latency and potentially higher packet loss. In addition, with traditional WAN models, management tends to be rigid and complex, and network changes can be lengthy, whether setting up new branches or troubleshooting performance issues. This leads to inefficiencies and a costly operational model. Enterprises therefore benefit greatly from shifting toward a business-first networking model to achieve greater agility and substantial CAPEX and OPEX savings. As the cloud enables businesses to move faster, software-defined WAN (SD-WAN), where top-down business intent is the driver, is critical to ensuring success – especially when branch offices are geographically distributed around the globe. A business-driven network: to tackle the challenges inherent in traditional router-centric models and to support today's cloud consumption model, companies can embrace a business-driven SD-WAN. This means application policies are defined based on business intent, connecting users securely and directly to applications wherever they reside, without unnecessary extra hops or security compromises. For instance, if the application is hosted in the cloud and is trusted, a business-driven SD-WAN can automatically connect users to it without backhauling traffic to a POP or HQ data centre. In general, this traffic usually travels across an internet link which, on its own, may not be secure. However, the right SD-WAN platform will have a unified stateful firewall built in for local internet breakout, allowing only branch-initiated sessions to enter the branch and providing the ability to service-chain traffic to a cloud-based security service if necessary, before forwarding it to its final destination.
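A minimal sketch of what such a business-intent application policy could look like is shown below; the policy names, fields and applications are invented for illustration and do not correspond to any specific SD-WAN product's configuration:

```python
# Hypothetical business-intent policies: the forwarding decision is derived
# from the application's business classification, not per-router config.
app_policies = [
    {"app": "trusted-saas",  "trusted": True,  "path": "local-breakout",
     "service_chain": []},
    {"app": "unknown-web",   "trusted": False, "path": "local-breakout",
     "service_chain": ["cloud-security-service"]},
    {"app": "erp-on-prem",   "trusted": True,  "path": "backhaul-to-dc",
     "service_chain": []},
]

def route(app_name):
    """Return the path a branch flow should take, based on business intent."""
    for policy in app_policies:
        if policy["app"] == app_name:
            return " -> ".join(policy["service_chain"] + [policy["path"]])
    # Default: treat unclassified traffic as untrusted and inspect it first.
    return "cloud-security-service -> local-breakout"

print(route("trusted-saas"))   # local-breakout
print(route("file-sharing"))   # cloud-security-service -> local-breakout
```

The point is simply that the branch applies an outcome ("break out locally, inspect first if untrusted") rather than device-by-device routing rules.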
If the application is moved and becomes hosted by another provider, or perhaps moves back to a company's own data centre, traffic must be intelligently redirected to wherever the application is being hosted. Without automation and embedded machine learning, dynamic and intelligent traffic steering is impossible. Ensuring security in the cloud: securing cloud-first branches is vital, and doing so requires a robust, multi-level approach. This is not least because a traditional device-centric WAN approach to security segmentation requires the time-consuming, manual configuration of routers and/or firewalls on a device-by-device and site-by-site basis. This is complex and cumbersome, and it simply cannot scale to hundreds or thousands of sites. With SD-WAN, companies can minimise the available attack surface and effectively control who, what, where and when users connect to public cloud applications and services, encompassing the ability to securely connect branch users directly to the cloud. Enabling business agility: for enterprise IT teams, the goal is to enable business agility and increase operational efficiency. However, a traditional router-centric WAN approach doesn't provide the best quality of experience for IT, as management and ongoing network operations are manual, time-consuming, device-centric, cumbersome, error-prone and inefficient. A business-driven SD-WAN centralises the orchestration of business-driven policies, enabling IT to reclaim their nights and weekends. Depend on the cloud: if a business is global and increasingly dependent on the cloud, a business-driven WAN enables seamless multi-cloud connectivity, turning the network into a business accelerant. Unified security and performance capabilities with automation deliver the highest quality of experience for both users and IT, while lowering overall WAN expenditure. Shifting to this approach will enable businesses to realise the full transformational promise of the cloud.

### The damaging effect of cloud outages, and how to stop them

Moving digital infrastructure to the cloud is steadily becoming best practice for businesses because of its numerous associated benefits, including flexibility, cost efficiency and overall performance. This has been accelerated by digital transformation, and consequently the International Data Corporation (IDC) has predicted that global spending on digital transformation will rise to £2 trillion in 2022. But this rush to online and cloud-based processes has left some companies without a clear strategy for their IT monitoring solutions, resulting in a smorgasbord of unconnected and unfit monitoring tools. Consequently, the threat of disruption through outages is ever-present for those who employ unsuitable monitoring tools. The real impact of IT cloud outages is distorted because they often go unreported. Companies proudly disclose figures calling attention to their low number of outages, but just because they haven't experienced a total shutdown doesn't mean an outage hasn't occurred – just that they have managed to keep services running at a lowered capacity. This suggests that IT cloud outages are far more prevalent than the data suggests. Robust, centralised monitoring software is therefore essential for system-wide visibility, to spot performance issues across both physical infrastructure and the cloud before it's too late. No company is safe.
Just think: 2018 saw companies like Google Cloud, Amazon Web Services and Azure experience highly disruptive cloud outages, with far-reaching consequences both financially and for their reputations. The financial sector was hit particularly hard and had a tumultuous year. A Financial Conduct Authority report showed that in Q2 2018, Britain's five largest banks – HSBC, Santander, RBS, Barclays and Lloyds Banking Group – suffered 64 payment outages. In response to this plethora of disruption, the FCA has now decreed that two days is the maximum period for which financial services firms can have their services interrupted. However, those who want to remain competitive should really be aiming for a zero-downtime model, as customers will no longer stand for poor-quality service. The severity of major outages has been shown by an illuminating report from Lloyd's insurance and risk-modeller AIR Worldwide, which calculated that a three-to-six-day incident at one of the top US cloud providers (like Google or AWS) would result in losses to industry of $15bn. It's abundantly clear that organisations cannot afford outages of any kind, and that appropriate measures must be put in place to mitigate the risk of outages happening. The consequences of IT outages: the current digital climate leaves no room for negative customer interaction; modern technology has led people to expect a constantly high level of service. The 'always on' mantra means that any disruption to day-to-day services has a debilitating effect on customer trust. With so much choice, flexibility and at times even incentives to switch providers, disruptions can cause customers to move to competitors – so firms can no longer risk a 'band-aid over the bullet hole' approach. In April 2018, TSB had a catastrophic incident when an error made during an IT systems upgrade led to 1.9 million people being locked out of their accounts, some for up to two weeks; all told, the bank lost £330m in revenue. In the same month, Eurocontrol, which manages air traffic across much of Europe, had an outage that left 500,000 passengers stranded across the continent. British Airways also experienced an outage with its third-party flight booking software. With 75,000 travellers affected over the three-day period, it lost an estimated £80m in revenue and a further £170m off its market value. With a Gartner report asserting that such outages can cost up to $300,000 per hour, a unified solution is key to effective IT monitoring. Although the financial ramifications of an outage are plain to see, regardless of the sector you operate in, organisations also need to exercise effective crisis management and be upfront with their customers. When outages do occur, organisations need to relay reliable and up-to-date information to their stakeholders to mitigate damage to their reputation. TSB showed exactly how not to do this when, after 12 days of its outage, it insisted on social media that 'things were running smoothly', even though some customers hadn't had access to their bank accounts for nearly a fortnight. TSB subsequently lost 12,500 customers. Why a unified approach is key to success: gaining insight into an IT system's performance is always a challenge, especially with the growing issue of 'tool sprawl' that many companies either opt for in desperation or are stuck with due to decentralised systems which don't communicate with each other.
Organisations are often reluctant to update their systems: any disruption during the implementation of a wholly new IT monitoring system can seem daunting, and an update might even look like too great a risk when weighed against a theoretical future outage. The result is that many companies run sprawling IT systems that are continuously patched. The key to countering the problem of cloud outages is a single-pane-of-glass solution that provides visibility across all of a business's IT systems. However, Enterprise Management Associates has said that many companies use up to ten monitoring tools at once, consequently creating data islands, diluting their data and taking between three and six hours on average to find performance issues within their IT systems. Simply put, companies commonly have unfit solutions in place that were built for static on-site systems rather than today's cloud-based and virtualised digital systems. By housing analytics and system data in a single unified tool, organisations will have a clearer picture of system health, availability and capacity at all times. Outages are a fact of life, but companies should do their utmost to mitigate against them and, when they do occur, have the correct tools in place to find the issue and rectify it. This returns service in a timely manner, reduces downtime and prevents loss of revenue. All this should be done while keeping customers informed of progress - unlike TSB's self-destructive 'information vacuum' approach. As digital transformation accelerates across businesses, IT systems will grow ever more complex – and effective monitoring tools will have to be deployed to meet the challenge. Regular threat and vulnerability assessments, along with reviews of configuration and operational process validation checkpoints, can reduce the odds of suffering a critical failure. For this reason, the importance of a single-pane-of-glass monitoring tool, enabling the consolidation of siloed teams and the removal of any blind spots caused by overlapping systems that isolate data and fail to communicate with each other, cannot be overstated.

### Being Successful in Cyber Security

Cybersecurity can often seem like a murky world of espionage and counter-espionage, where men in blacked-out vans kidnap carefully chosen nerds from the streets of the country's small towns to keep us safe from those pesky bad guys. However, to succeed in the world of cyber security you do not require a specific set of skills, as the Liam Neeson of Taken 1, 2 and 3 would have you believe. This begs the question: what do cybersecurity experts look for when hiring the next generation of keyboard-based James Bonds? Thinking outside the box: the short answer is that they look for curiosity and a thirst for knowledge. Most cyber professionals have a chequered history, with many having been kicked out of school for pushing the boundaries that one step too far. However, those successful in cyber are by nature very inquisitive – we like to know what makes something tick and why. If you played with Lego or Meccano as a child and really enjoyed pulling models apart and then putting them back together in a different way, it's likely that you have the raw skill set that cyber professionals look for. One of the key jobs in cyber is taking networks apart to find malware or holes in the security, which requires thinking a bit differently from the norm.
Those who spent their childhood or teenage years inadvertently thinking this way through the art of Lego will naturally do well in cyber. Technical skills: undoubtedly, a certain level of technical skill is required when working in cyber. A good knowledge of IT networks, computing and operating systems will help you get into the industry, but this will only get you a job, not a career. While technical skills can be learned online or through courses at university – some are even accredited by GCHQ – it's the extra push on thinking outside the box that really makes good cyber professionals stand out. That keenness to learn new technical skills while testing the limits of what is possible in practice is key. Leadership skills: there's a long-lived stereotype that cyber professionals are nerds who can't hold a conversation or manage a team. While not completely inaccurate, in reality the best people in cyber are the ones who can explain what they're doing, and why, to anyone up to a C-level executive in simple yet effective terms. They will also stick their head above the parapet and say what they think, leading their team by example by constantly learning new things and trying to find new ways of breaking a network or OS apart and putting it back together. These leadership characteristics should be seen at all levels, regardless of how long you've been in cyber. Courage, self-confidence and humility are essential, as cyber professionals need to lead from every level in a business for their work to be successful. Caring: at their core, cyber professionals genuinely care about looking after people. They care enough to work through hours of code to find the single bug that's causing havoc for an overall network. The key mentality is that of the kid who stands up to the bully in the playground – fundamentally, cyber as a job is about protecting people. Far from what the screens of Hollywood would have us believe, cybersecurity professionals are tenacious rule-breakers with a genuine curiosity and keenness to break apart and fix the digital world. For those who would stick their hand up and ask the awkward questions while working hard to find the solution, cybersecurity could be the profession for you.

### How IoT Can Help Drive the Smart City Agenda and Upgrade the UK's Connectivity

The rise of the Internet of Things (IoT) has changed our expectations of the machines we use. It is estimated that by 2020, 20 billion everyday objects will be connected to the internet. Our desire for constant interconnectedness means we require more from our devices. However, machine-to-machine communication, and the way it will affect our cities, means we require more from our public telecoms too. Our cities will need to be smart if they are to adapt to the demands and challenges of tomorrow. The smart city agenda offers vast potential and opportunity, but only if we build and upgrade the current infrastructure on high streets across the country. In 2017 the Government set out bold ambitions to make the UK a pioneer in new technology such as IoT and 5G. This move was welcomed by the telecoms industry. With the UK currently lagging in 35th place in the world, behind the likes of Bulgaria and Madagascar in broadband coverage rankings, this is the type of innovation the UK needs to embrace to realise the potential of connectivity. It is understandable why the Government is keen to capitalise on new technology such as 5G.
Research by IHS Economics estimates it will enable $12.3 trillion in global economic output by 2035 and support 22 million new jobs. With download speeds up to 1,000x faster than 4G, 5G will enable the IoT applications needed to power developments in autonomous vehicles and to address the city-planning challenges of tomorrow. How can IoT be deployed in the smart city agenda? The ability of the Internet of Things to capture, send and receive data means we are able to create an enormous well of information which can be harnessed to enhance our future cities. These future cities are now not so far away, with much of what can make cities smart already existing today. The environment is one area where we can harness the potential of IoT, monitoring changes in air quality in real time and then using that data to create dynamic clean air zones and help plan future city infrastructure. The same approach can be taken with our transport and the way we monitor traffic. The data can be collated to create a public transport system that adapts instantly to spikes in demand or to disruptions, in turn informing the maintenance and development of transport networks in the future. Driverless vehicles also have a role to play in smart cities, helping to reduce congestion, alongside gathering data that turns them into collective decision-makers, making our roads more efficient and environmentally sustainable. What stands in the way? The UK has a golden opportunity to become a world leader in areas such as IoT and 5G technology, but it can only do so by investing in innovation and with the backing of government, both local and national. However, with the Government's forthcoming consultation on permitted development rights, there is a risk that the rollout of such technology may be slowed, impeding our drive towards building smart cities. Permitted development rights have been the cornerstone of public telecoms in the UK for the last thirty years, ensuring new telecoms infrastructure can be installed without the obstructive planning permission process. The proposed consultation, buried deep in the Government's Autumn Budget, could see these rights scrapped. Their removal would stifle innovation and investment in much-needed new telecoms infrastructure. It would seem that the consultation stems from a misconception on the part of a handful of local authorities. Their grievance is with the next generation of 'public call boxes' built to replace their often badly maintained predecessors, mostly installed back in the 1990s. Call boxes, they argue, are no longer necessary due to widespread mobile phone usage. But the new 'call boxes' bear little resemblance to their traditional predecessors, in the way that a smartphone bears little resemblance to an old rotary-dial phone. Public telecoms for the future: the next generation of call boxes, better known as streetside interactive hubs, are reimagining public telecommunications for the 21st century. On top of free calls, they provide free wi-fi, apps and services, alongside the capability for smart city data collation. What's more, their streetside location and small-cell deployment will be key to boosting 4G and 5G connectivity in areas where coverage is poor or where users are likely to experience dropouts, such as around Waterloo station. The upgrading of the UK's public telecoms infrastructure offers huge potential, helping people to connect in the present, as well as powering how we might connect people, places and things in the future.
If we are to build the smart cities of tomorrow, we need to build out the streetside telecoms networks that underpin them today.

### Can Robotic Process Automation Boost Employee Engagement?

Over the past few years, the tech landscape has seen the emergence of terms that have quickly become buzzwords. From AI to Big Data to GDPR, these words were all generating noise even before their exact definition was agreed upon or understood. After a period of hype and overuse, buzzwords such as these generally cool down and become part of normal tech vocabulary until the next one appears. The latest 'hot term' may very well be Robotic Process Automation (RPA). R-P-A – three letters that are currently being thrown around in technology and business as any trending topic would be. But this topic isn't just a fleeting tech trend; it's also one that could deliver great returns. Gartner recently announced that RPA software revenue grew 63.1 per cent in 2018 to $846 million, making it the fastest-growing segment of the global enterprise software market, with revenue expected to reach $1.3 billion this year. So, what exactly is RPA? How does it really work, and why does it mention robots? We'll start off by setting the record straight and demystifying some of the connotations surrounding the technology, and then we'll provide a few good reasons why businesses should invest. RPA is about people collaborating with machines: people often assume that call center agents spend all their time answering calls and talking to customers. However, this isn't the case. Mundane and repetitive tasks take up much of an agent's time and drain their motivation. For instance, after a call, the agent must write a summary of the conversation and perform any follow-up actions, which takes an average of two to three minutes each time. This is where RPA comes in. The 'robot' part of RPA is a clever piece of software that can interact with any existing application in the same way the agent can (clicking buttons, typing into fields, navigating to different screens). Simple RPA solutions perform work without a person being involved – they take work from a queue and then process it case by case. More sophisticated RPA solutions can work alongside an agent, helping them on their desktop as they take calls, understanding what they need and providing it. This means that robots can not only automate the wrap-up actions for each call, but can also augment the agent during the call to ensure compliance, guide them through complex conversations and even help with cross-sell and up-sell opportunities. They can even trawl through hundreds of items of customer data to provide useful insight that the agent can use – all while the customer is still on the line. As the agent's time is less taken up by mundane and repetitive tasks, they become more engaged and, in turn, more productive. Having a robot helping them as they work really super-charges their engagement and enables them to do things they couldn't possibly achieve on their own. Ultimately, RPA will keep your agents motivated, engaged and productive, as well as reducing the chance of mistakes and cutting waiting times for customers as processes are completed faster. This, in turn, will positively impact customer experience and satisfaction. It's not just contact centers either – any employee can benefit from having a robot 'helper'.
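To make the "simple RPA" pattern described above concrete, here is a minimal sketch of an unattended robot that takes work from a queue and processes it case by case. The work items and the CRM update function are hypothetical stand-ins for the screens or APIs a real robot would drive:

```python
import queue

# Hypothetical work items: call summaries awaiting after-call wrap-up actions.
work_queue = queue.Queue()
work_queue.put({"case_id": 101, "summary": "Billing query resolved",
                "follow_up": "send confirmation email"})
work_queue.put({"case_id": 102, "summary": "Address change requested",
                "follow_up": "update customer record"})

def update_crm(case_id, summary, follow_up):
    """Stand-in for the robot driving the CRM screens or API:
    it records the wrap-up so the agent doesn't have to."""
    print(f"[robot] case {case_id}: wrote summary '{summary}', scheduled '{follow_up}'")

# Unattended robot loop: take work from the queue and process it case by case.
while not work_queue.empty():
    item = work_queue.get()
    update_crm(item["case_id"], item["summary"], item["follow_up"])
```

An attended robot works the same way in principle, except that the trigger is what the agent is doing on screen rather than a queue of finished calls.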
Reaping the benefits of employee engagement through RPA: the correlation between employee engagement and performance doesn't need to be proven anymore. Numerous studies all point to the same conclusion. Recent research from Dale Carnegie covering over 1,500 employees found that companies with engaged employees will outperform those without by up to 202 per cent. According to Gallup's State of the American Workplace, companies with engaged employees can see a 240 per cent jump in performance-related business indicators compared to other firms. RPA helps sustain the satisfaction of your workforce. And as your employees are your first ambassadors, how they think and feel are key factors in whether they advocate for or against your company, which can sometimes have a greater impact than all your marketing and advertising materials ever could. Not only are the financial benefits and returns of employee engagement quite clear, but RPA deployment will also lower your overall running costs by reducing staff turnover and the costs associated with recruitment and training. You will make more and spend less, pushing your results ever higher. As your employees enjoy their new streamlined work life, they will reward you by becoming productive, positive ambassadors and helping push your turnover to new levels. Transforming the work experience, upgrading your customer's experience: so, RPA isn't as much about robots in the usual sci-fi sense as it is about growing turnover and achieving excellence in customer service by both automating repetitive tasks and helping employees as they work. Today's economy is as much about the experience as it is about the product, and customer service is often the first place people interact with a brand. So goodbye to R2-D2 and BB-8, and hello to valuable business returns in a galaxy where every employee in every business can now have their own robot 'helper'.

### Unlocking the world with a look – the growing potential of face-based biometrics

The idea of unlocking your phone by scanning your face would have seemed like something out of "Blade Runner" not so long ago. Concepts like iris tracking, liveness detection and 3D face mapping would have come across as novelties, certainly not technology on the brink of becoming commonplace. Yet here we are today, picking up our iPhones and unlocking them with a look. Apple's Face ID feature has relegated Touch ID to the past and is ushering in a new wave of facial biometrics, as more and more people get comfortable with using their own biometrics as a means of authentication. As face-based biometrics begin to work their way into the mainstream, it begs the question: what other scenarios could be transformed by this kind of identity verification? In light of large-scale data breaches, the dark web and the high volume of identity theft and account takeover, it has become increasingly difficult for companies to know whether someone is who they claim to be. As organisations need to conduct more continual identity proofing, a growing range of scenarios for using face-based biometrics emerges. Changing the status quo of identity verification: while the need for identity proofing has become increasingly important in recent years, and consumer appetite for and familiarity with facial biometrics is nearly where it needs to be, we first have to understand why organisations would look to adopt this form of security.
Traditional identity verification methods range from knowledge-based authentication (KBA), whereby you answer personalised security questions based on information like your mother's maiden name, to two-factor authentication (2FA), whereby you are sent a verification code via SMS to prove that you are indeed the person trying to access the account. However, these methods come with increasingly significant vulnerabilities. For example, hackers are able to easily intercept the four- and six-digit SMS codes that underpin 2FA via the SS7 telecommunications protocol network, or through phishing attacks. A more elaborate hack involves "SIM swapping". If a criminal has some of your identity details, they might be able to convince your phone provider that they are you and request that a new SIM attached to your phone number be sent to them. That way, any time an authentication code is sent from one of your accounts, it will go to the hacker instead of you. When it comes to KBA, simple social media searches can reveal the answers to the supposedly secret questions that this solution relies on. Hackers can also reset the passwords of legitimate customers using their birthdate, postcode and other personal information that can easily be found through simple Google searches or purchased on the dark web. Add to the mix the fact that cybercrime and the dark web are evolving and becoming far more sophisticated, and traditional forms of authentication that were once effective can simply no longer reliably ensure that the person logging into an online account is the actual account owner. Paving the way for new alternatives: facial biometrics therefore presents a very real alternative that both increases security and capitalises on consumer understanding and awareness. And we're already seeing uptake of these new, safer and more secure methods. Using cutting-edge AI and video selfie technology, this technique goes well beyond traditional authentication methods to deliver a significantly higher level of assurance and establish a trusted identity. In addition, these trailblazing solutions mean that variables such as wearing glasses, gaining or losing weight or growing a beard won't disrupt the tool. As companies like Monzo and easyJet begin using face-based biometrics, consumer confidence continues to rise. Research from Experian shows that 74 percent of consumers globally are more confident that physical biometrics will protect their information than passwords will. With passwords being bought and sold on the dark web for next to nothing, this is a compelling statistic. Broadening the horizons of facial biometrics: this form of identity verification is key at the account opening stage. But there is an urgent need to use these methods beyond just this step – primarily in higher-risk scenarios including wire transfers and password reset attempts. Selfie-based authentication utilises a selfie captured during the account opening process as a reference point. If you need to reset your password, you could take another quick selfie that would be cross-checked with the original selfie used to open the account for quick and secure verification. Using face-based biometrics in this way also opens doors to other use cases, including self-check-in for flights and hotels, and unlocking a pre-arranged rental car. Businesses can easily lose customers due to a slow, clunky or unclear onboarding and authentication process. This method, however, improves the experience significantly because of its speed, familiarity and simplicity.
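A simplified sketch of that selfie cross-check is shown below. The `embed_face` function stands in for whatever face-recognition model a real provider uses, and the similarity threshold is purely illustrative; a production system would also run liveness detection on the capture:

```python
import math

def embed_face(image):
    """Placeholder for a real face-recognition model that turns an image
    into a numeric embedding (here we simply treat the input as one)."""
    return image

def similarity(a, b):
    """Cosine similarity between two face embeddings."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b)))

def verify_password_reset(reference_selfie, new_selfie, threshold=0.9):
    """Approve the reset only if the fresh selfie matches the one enrolled
    at account opening; the 0.9 threshold is an arbitrary example value."""
    return similarity(embed_face(reference_selfie), embed_face(new_selfie)) >= threshold

enrolled = [0.12, 0.80, 0.55]   # embedding stored at account opening
fresh    = [0.10, 0.82, 0.54]   # embedding from the reset-time selfie
print("reset approved" if verify_password_reset(enrolled, fresh) else "reset denied")
```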
Perhaps more importantly, it also scares off would-be fraudsters, as it's unlikely they'll want to share their own picture with the very company they're attempting to defraud. Gartner, Inc. predicts that by 2022, 70 percent of organisations using biometric authentication for workforce access will implement it via smartphone apps, regardless of the endpoint device being used. In 2018, the figure was fewer than five percent. Paired with the growing familiarity of face-based biometrics, now is the time for businesses to seize the potential and realise the benefits of this innovative technology.

### Managing cloud migration and the end-user experience

The traditional role of the managed services provider (MSP) has, up until recently, been clearly defined: helping enterprise customers with issues around pre-installed software, internal hardware and servers has been a niche MSPs have filled for years. However, recent changes in IT have forced MSPs to re-evaluate where they fit in the bigger picture of their customers' IT operations. This change has been brought about by the huge rise in software-as-a-service (SaaS) and cloud storage solutions. In its latest quarterly earnings call, Microsoft® announced that the number of Office 365® subscribers had jumped to 31.4 million. This number has steadily risen over the years, and will only continue to rise as Microsoft phases out its yearly standalone Office software and ushers customers over to the Office 365 platform. The widespread adoption of cloud-first computing and virtualised desktops has changed the playing field for MSPs. This is often seen as a challenge, but it shouldn't be; instead, MSPs should see such advances in enterprise IT as an opportunity to forge new revenue streams and responsibilities in relationships with their customers. The cloud opportunity: MSPs might not have a role in distributing and providing a virtualised desktop like Office 365, but they can play a role in managing the software and offering customers specific services that help mitigate the downsides of whatever SaaS solution they are using. Using Office 365 as an example, there are multiple areas where MSPs can provide services to improve upon the overall experience and functionality of the software. One key area is storage. By offering customers a strong third-party backup storage solution, MSPs can provide an essential safety net for the limited cloud-based storage Office 365 offers. The potential importance of this is best explained through the scenario of a malware attack or virus causing the mass deletion of key company files. The recycle bin is empty, and the important documents and files are lost forever. This is, of course, unless you have all your files backed up on a third-party storage solution, which an MSP can provide. 'Cloud-to-cloud storage', and the increased protection it offers businesses, is a service all modern MSPs should be providing to their customers and leveraging to justify their necessity to an enterprise's IT operations. Office 365 can be an expensive option. All IT departments are looking for ways to cut costs, and MSPs can tap into this sentiment by offering some of the same services as Office 365. Customers may want to pay for the Office 365 cloud-based email service but not the full software, and this is where an MSP can come in to provide a more cost-effective backup solution.
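As a rough illustration of the "cloud-to-cloud" backup idea described above, the job simply enumerates items in the SaaS tenant and copies them into independent third-party storage. The source and target clients here are invented stand-ins, not any real vendor's SDK:

```python
# Hypothetical sketch of a cloud-to-cloud backup job.

class SaaSSource:
    """Pretend source: would normally wrap the SaaS provider's export API."""
    def list_items(self):
        return [("inbox/msg-001.eml", b"..."), ("docs/budget.xlsx", b"...")]

class BackupTarget:
    """Pretend third-party storage: would normally wrap an object-store client."""
    def __init__(self):
        self.store = {}
    def put(self, path, data):
        self.store[path] = data
        print(f"backed up {path} ({len(data)} bytes)")

def run_backup(source, target):
    """Copy every item so a mass deletion in the SaaS tenant is recoverable."""
    for path, data in source.list_items():
        target.put(path, data)

run_backup(SaaSSource(), BackupTarget())
```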
Bold steps towards cloud platform management: there are important services MSPs can provide to those customers who increasingly rely on SaaS platforms. Yet MSPs require some blue-sky thinking in order to shift from the role they traditionally play. To be an effective advisor, MSPs need to focus on how they can bridge the gap between the end user and the cloud solution they use. By adding cloud monitoring solutions to their portfolios, MSPs can become a crucial partner in supporting popular enterprise cloud platforms, like Microsoft Azure® or Amazon Web Services™, monitoring and troubleshooting the end user's technical issues. This emerging 'middle man' role between cloud provider and customer is something all MSPs should be examining and adopting if they are to take advantage of the huge numbers of organisations moving to enterprise cloud solutions in the next few years. But to successfully pivot into forward-thinking and cloud-focused outfits, MSPs must reflect upon their overall business models. Stereotypically, MSPs have focused their efforts on providing solutions based around troubleshooting and improving devices. Going forward, MSPs should shift their attention towards improving the overall experience of the end user. They will only be able to do this if they boast a portfolio of services that grants them the ability to administer cloud services; this will help MSPs actually manage the platforms used by their customers in everyday office life, thus providing a genuinely vital service. Migrating to managing end-user experiences: increased enterprise adoption of SaaS and cloud storage solutions shouldn't strike fear into the hearts of MSPs, no matter how naturally that notion may come. There are plenty of opportunities for MSPs to provide new services to their customers, but taking advantage of these opportunities will depend on the MSP's ability to adapt their portfolio of services and shift their overall focus from device performance to end-user experience.

### Using cloud integration to overcome the complexity of a multi-cloud infrastructure

IT cloud service providers are becoming the new IT department, and that's all good. But how can the new IT landscape avoid ending up with a spaghetti mess of isolated cloud vendors? Here's a take on how to enable rapid decision-making and an agile way of working in a fragmented business landscape. Cloud services (SaaS) have exploded during the last couple of years, with companies such as Salesforce and Amazon paving the way. It has become a mature business in which over 90% of companies use one or more cloud services (IDG Expert Network: Navigating the Digital Universe), of which 80% utilize a multi-cloud landscape. But even though cloud services have become widely accepted and carry documented benefits in terms of flexibility, cost reduction and data security, buyers of cloud services are starting to question new SaaS services. This indicates that cloud services are moving into a new phase, with new challenges that need to be understood and addressed. Since the 2000s, companies have embarked on IT application outsourcing to global vendors with the aim of reducing operational cost and improving IT performance. For many companies, it has been a challenging journey, with increased governance and collaboration complexity and fragmentation of information and processes. The complexity has increased exponentially with every new vendor added to the IT outsourcing landscape.
Structural and technical solutions were the backbone of the outsourcing setup and required best practice tools and frameworks to make this new way of working both efficient and effective. However, managing expectations and responsibility across multiple internal and external borders and cultures turned out to be the main headache. Today, we see a new type of IT application outsourcing in terms of cloud services, made possible by digitalization and globalization. It carries a similar operating model where the responsibility for application development and operations is transferred from the internal IT department to the IT (cloud) service providers – available through a subscription model and updated user interface. New cloud services are easy to initiate in business with little involvement of the IT department. But the ease of initiation is also its greatest challenge as information, processes and collaboration easily become fragmented in different cloud solutions. So how do companies enable rapid decision-making and an agile way of working with a fragmented business landscape (information, processes and collaboration)? According to ComputerWorld research, Cloud Services is currently under scrutiny as 50% of companies hesitated to add new cloud services during the last three years – considering the hurdles of cloud integration. In fact, 90% of companies (MuleSoft Research) experience integration challenges and fragmented cloud landscape of multiple user-solutions. With more and more cloud services available, the problem with cloud silos is worsening everyday. It is the cost of moving too fast into the cloud without a proper analysis, planning and governance. Integration of multi-cloud environment (on-site, private or public) is the key priority of IT professionals to boost business competitiveness (ComputerWorld Survey) - but it is also an unfamiliar and uncomfortable territory. A majority (54%) of digital professionals do not believe they will succeed with digital transformation (Institute for Digital Transformation Survey) – mainly due to the increased complexity and process/information fragmentation.  Maybe it is a sound development to view cloud services from a holistic perspective.  Overcoming the complexity of multi-cloud and harvesting business benefits starts with business ambitions, requirements and way of working. How does business intend to compete and create value for its customers in the future? The answer is a business-driven, holistic and technically robust cloud integration enabling scalability, flexibility and business effectiveness. It is a fine balance between technology, governance and business orientation where the shortcomings of one capability will have a negative effect on the whole multi-cloud set-up and limit the business effects. It is therefore important to have the right competence, tools and experience for cloud integration to mitigate the business complexity. One way to ensure the right workings of the multi-cloud environment is to conduct integration stress tests where business scenarios are run in a safe environment. For example, what happens if there is a significant cloud service down-time? It is an excellent way to test the maturity and robustness of the integration set-up and identifying potential threats that need to be resolved.  My recommendations:  Understand the complexity of multi-cloud solutions. How does this affect the Business/IT architecture, governance, ecosystem and leadership of the organization? 
Are we ready to go into a multi-cloud set-up? Start with the technology and structure of cloud integration. Is it the backbone of a multi-cloud infrastructure? Do we have access to experts, frameworks and best practice integration engines to enable efficient and effective cloud integration?  Engage business in stress testing your multi-cloud environment and integration solution. Does it really manage all the different business scenarios according to business expectations? Do not attempt to scale your multi-cloud environment without a mature and robust integration foundation (technology, governance, ecosystem and leadership). It will not work! Cloud Services is the future for IT application development and operations. The resources and competence required, in an unpredictable business and IT environment, to host these applications and infrastructure in-house are difficult to find and manage. Cloud service is the obvious choice. However, it is a tricky journey to reach the required maturity and robustness to enable scaling and realization of cloud services with full benefits. It is a journey with new obstacles that requires a new perspective and mindset of IT and business professionals. The question I ask is whether these professionals are ready to go outside their comfort zone to address the increasing complexity of cloud integration?[/vc_column_text][/vc_column][/vc_row] ### Why successful digital transformation does not exist without IoT [vc_row][vc_column][vc_column_text]Success in the digital economy is no longer achievable by simply having the best product. Today it requires a complex balancing act that brings together design, supply chain, manufacturing, delivery, service and operations to create a differentiated and optimised customer experience that is adaptable, agile, scalable and efficient to deliver. Whilst it can be said that this is how successful businesses have always operated, the difference in the digital economy is the speed at which the customer proposition has to evolve. To adapt at speed organisations require data on demand. Data they can use to understand every aspect of the supply chain and every mode of operation. In the digital world, IoT is the source of this data. According to the latest Vodafone IoT Barometer, 84% of adopters of the technology describe their Internet of Things projects as ‘mission critical’ to success and 72% say digital transformation would be impossible without IoT.  8% are even going further, saying their “entire business depends on IoT”.  Furthermore, and very tellingly, nearly all adopters saw a return on investment, with 95% already having gained measurable benefits from their IoT projects. With the advent of narrowband low power networks (NB-IoT and LTE-M) and with 5G opening up more opportunities for IoT to be deployed at massive scale, it’s clear IoT is not just any technology, but the true driver of digital transformation for businesses. How digital transformation journeys have evolved Historically, businesses have invested in solutions that would ease bottlenecks and improve workflows within and between different departments, effectively breaking down internal siloes and eliminating unnecessary processes. What digital transformation does is take this to the next level, helping existing processes becoming automated as well as de-risking new and more complex ones. Digital becomes then a way of working, not a technology. The technology has to go hand in hand with process re-engineering and with a mind-set change across the business. 
This means it can also have an impact on how organisations interact with the outside world, making their collaboration with partners, suppliers and customers simpler and more efficient. Digitally transforming is as a much a cultural shift as a technology one. It’s not just a case of automating what exists: for digitisation to be a success, the outcome has to be defined in terms of a perfect transaction and then the technology applied to ensure that this perfect transaction can be delivered time and time again with the same consistent content and quality. This means a shift in the way organisations think about the result they want to achieve and the problem they want to solve, how they resource the process and how this is executed. Digitisation has the potential to revolutionise the journey towards business outcomes. This means once a transaction has been digitised and the manual processes has been re-engineered and optimised, it is almost impossible to revert to a set of manual processes, as the whole business framework would have completely changed in order to achieve the desired result. This creates a reliance on the technology and, in the case of IoT, puts the communications network at the centre of the ecosystem that delivers the benefits of the digital enterprise, making the choice of network partner crucial for business leaders implementing digitisation strategies. IoT and digital transformation – the power of data The real power of IoT is the ability to measure and record an ever-increasing pool of highly valuable data sets, with a granularity and completeness that allows businesses to become truly data driven and insightful. In order for this data to deliver value and informative business insights though, it needs to be analysed and processed. Luckily, the combined rise of IoT, cloud computing and the emergence of AI creates an almost perfect environment for the safe, secure and rapid processing of large amounts of data. Whether this is used for predictive maintenance, the accurate scaling of infrastructure or to understand the customer interaction, there has never been a better time to leverage the value of IoT data. With the introduction of 5G, the scope for IoT increases even further. The combination of low latency, high bandwidth and quality of service enables IoT networks to become control networks for the autonomous machines of the future, be these remotely controlled cranes, guided transporters in factories or autonomous cars and lorries. The combination of AI, mobile edge, cloud and IoT further enables the next step in digitisation, as it moves from data driven processes to real-time control. A business fit for the future IoT provides greater visibility and control over how a business operates and is essential for those that want to digitally transform.  Through IoT businesses gain a clearer perspective of performance and can control crucial outputs, quality and levels of service. Additionally, the valuable insights gained from analysing data can help inform decisions that are mission critical to the organisation and stimulate new creative approaches to problem-solving. Some businesses are born digital, some have further to go with legacy systems and infrastructure, but all will ultimately need to undertake a digital journey.  Whatever stage they are at, enterprises of all sizes can safely digitally transform by relying on partners that can help them become more agile, virtual and software defined. 
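As a rough illustration of the edge-processing point made above, the sketch below keeps a short window of sensor readings locally and only forwards an alert upstream when a reading drifts outside a threshold, rather than streaming every value to the cloud. The sensor, the threshold and the forwarding function are hypothetical placeholders, not any particular platform's API.

```python
# Toy sketch of edge-side filtering: process readings locally and only forward
# events of interest, rather than shipping every data point to the cloud.
# read_sensor() and forward_to_cloud() are hypothetical placeholders.
from collections import deque
from statistics import mean
import random
import time

WINDOW = deque(maxlen=20)      # rolling window of recent readings
THRESHOLD = 5.0                # allowed deviation from the rolling average

def read_sensor() -> float:
    return 50.0 + random.gauss(0, 2)       # stand-in for a real device reading

def forward_to_cloud(event: dict) -> None:
    print("ALERT sent upstream:", event)   # stand-in for an MQTT/HTTP publish

def run_once() -> None:
    reading = read_sensor()
    if len(WINDOW) == WINDOW.maxlen and abs(reading - mean(WINDOW)) > THRESHOLD:
        forward_to_cloud({"reading": reading, "baseline": round(mean(WINDOW), 2)})
    WINDOW.append(reading)

if __name__ == "__main__":
    for _ in range(100):
        run_once()
        time.sleep(0.01)
```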
Now is the time to really capitalise on the way that IoT, in combination with mobile edge, AI and analytics, can deliver truly exciting benefits for businesses looking to embrace the digital world.

### 6 Pros and Cons of Cloud Storage for Business

If you are considering investing in cloud storage for your business, you are not alone. Cloud storage has recently gone mainstream, and businesses of all shapes and sizes are adopting the technology. Adopting cloud storage allows businesses to respond rapidly to changing needs and foster IT innovation. As with any kind of platform, there are pros and cons that should be considered in determining whether cloud storage is the right match for your company's IT infrastructure. See six advantages and six disadvantages of cloud storage below.

Cloud: The Good

1. Accessibility: Files in the cloud can be accessed from anywhere with an Internet connection. This allows you to move beyond time zone and geographic location issues.
2. Cost savings: Cloud storage for your business will come at little or no cost for a small or medium-sized organisation. This will reduce your annual operating costs, with further savings because storing information remotely does not depend on your own internal power and hardware.
3. Disaster recovery: All businesses should invest in an emergency backup plan, and cloud storage can serve this purpose by creating a second copy of important files. You can store these files at a remote location and access them through an internet connection.
4. Scalability: With cloud storage, you only pay for the amount of storage you require. If your business experiences growth, the cloud operator can accommodate your corresponding growth in data storage needs; all you have to do is vary how much you pay to extend the storage you have. The same applies in reverse if your business shrinks and you require less storage space at a reduced rate.
5. Speed: Tests have shown that when the cloud is supported by the right enterprise technologies, the speeds achieved can rival onsite performance. For example, an enterprise can have multiple servers backing up data simultaneously, much quicker than backing up onto disk.
6. Storage immortality: The cloud offers the opportunity to bypass the risk of purchasing hardware that will soon be obsolete. Instead, you pay for the capacity and performance your business requires, and your provider can upgrade the environment to keep pace with the latest technology, driven by competitive pressure from other cloud providers.

Cloud: The Bad

1. Security and privacy in the cloud: There are concerns with valuable and important data being stored remotely. Before adopting cloud technology, you should be aware that you are giving sensitive business information to a third-party cloud service provider, which could potentially put your company at risk. This is why it is important to choose a reliable service provider that you are confident will keep your information secure.
2. Bandwidth limitations: Depending on what service you choose, there may be a bandwidth allowance. If your business surpasses the allowance, the charges could be costly. Some vendors provide unlimited bandwidth, and this is something to think about when choosing the right provider.
3. Vulnerability to attacks: With your business information stored in the cloud, there is a vulnerability to external hacking. The internet is not completely secure, and for this reason there is always the possibility of theft of sensitive data.
4. Data management: Managing data in the cloud can be a hassle because cloud storage systems have their own structures. The existing storage management system of your business may not always integrate well with the cloud vendor's system.
5. Lifetime costs: With public cloud storage, the costs can rise over the years and tend to add up. It is similar to leasing a vehicle: the convenience of the payments looks appealing at the beginning, but mileage overages and ongoing charges accumulate, and that is when the lifetime costs hit you. If your applications are local and your data is in the cloud, it can also add to networking costs.
6. Compliance: Depending on the level of regulation within your industry, it may not be possible to work within the public cloud. This is especially the case for healthcare, financial services and publicly traded companies, which have to be especially careful when considering this option.

Do the pros of cloud storage outweigh the cons? Despite concerns about the security of cloud storage, many businesses see that the cost savings, accessibility and disaster recovery are more valuable than the associated risks. Cloud storage is certainly here to stay for some time and worth considering for your company's infrastructure and budget.

### Why The Initial Idea Is So Important When It Comes To Building Your App

Almost everything starts as an idea. In the case of mobile app development, the thoroughness of your initial idea can be the difference between success and failure. Although the mobile app market looks saturated, it is a profitable field if you have the right app idea. Research firm App Annie stated that since 2017 there has been a 60% growth in the number of apps downloaded globally. The report also revealed that customer spending has increased rapidly. In one day alone - on New Year's Day in 2017 - customers bought apps worth about $240 million on Apple's App Store. And, according to Statista, mobile apps are expected to generate about $189 billion in revenue from sales and advertising by 2020. These figures show that the global app market is still very lucrative. However, without a unique app, you may struggle to get in on the action. If you are looking to create an app, take the time to fine-tune your original idea. The importance of building a solid app idea cannot be overstated. Everything hinges on this. If you do not nail this part of the process, not only is your app likely to end up being a misfire, but you may also spend a lot more money and time creating it. In this article, we'll be looking at why it is crucial to take the time to hone your app idea. But before we do that, we need to dispel some misconceptions about app ideas. The idea for the world's next biggest app can pop up in your head at any time. Therefore, you must form a habit of writing down the most striking ideas that come to you daily. Not all ideas are clear at first. Sometimes, things can be a little fuzzy, and you need to take the time to iron them out in your mind. It is important not to give up on your ideas. Dedicate time to fleshing out every idea to estimate its full potential.
Here are a few reasons why you must take the time to nurture your initial app idea: It will give you a full perspective of the problem and how your app can solve it. There are scores of mobile apps on the market. The more unique your app is, the better the chances that it would stand out on the market. When you catch an idea for an app, your next step should be to look at how it solves a problem. Once you can identify a problem and figure out how your app will solve it, you are already on your way to creating a top-selling app. Pinpointing a loophole is just the beginning of the mobile app development process. The next step is research. You need to look at why the problem has escaped everyone else and why has no other person ever thought of developing an app to solve the problem. Doing this will give you a broader knowledge of the situation and will put you in a good position to come up with a solution. It will enable you to identify your target market. The purpose of your app idea must be clear enough to be articulated in one sentence. This is one of the benefits of building the initial idea for your app. Being able to pinpoint the purpose of your app will enable you to identify your target market. Also, it will allow potential customers to grasp the value of your app readily. As it stands, users of app stores only have a short description and screenshots of an app before they choose to download it. You may lose potential customers if the purpose for your app cannot be communicated easily. Apart from helping you to identify your target market, working out the idea of your app will enable you to implement the proper user retention strategies from day one. The ability to identify your target market is invaluable in marketing. You can easily work out their likes and dislikes. This will guide the direction of the design of your app and ultimately increase its chances of being a success. It will give you an edge over the competition. Another benefit of taking the time to work out the initial idea for your app is that you get to analyze the competition and figure out what gives your app an advantage over all the others. Getting to know your competitors will give you a fair idea of how difficult or easy it will be for your app to break into the market. Study the history of your competitors and how they have evolved over the years. Also, check the number of people who have downloaded your competitor's apps as well as their ratings, and reviews. Going through the pains of figuring out what gives your app an edge over the competition during the initial phase of your idea will save you time and resources. You can learn a thing or two from your competitors about how to make your app appeal to your target users more. It will enable you to figure out which platform to sell your app to be profitable. Another advantage of expanding the initial idea of your app is that you can figure out which platform to sell your app. Although there are different stores to sell your app, it is still important to pick the best one. You can figure out which store is best by considering which store is best to reach your target audience. In terms of revenue, Apple's App Store tends to pay developers more for their apps than Google Play and Windows Store. According to Apple Insider, the App Store continues to be a more profitable option for developers compared to Google Play. In 2016, Apple paid out around $20 billion to developers. 
Meanwhile, TechCrunch reported that customer spending on the App Store surpassed that of Google Play last year. Customers on the App Store spent about $38.5 billion, while Google Play took $20.1 billion. Apple's App Store may seem like the more attractive platform on which to sell apps, but it may not be the best fit for your app. The Google Play Store has a broader reach because the Android platform has a bigger market share. However, Android users are less likely to pay for apps compared to iOS users. It will guide the design of your app. The design of your app is key to its success. Laying out the idea for your app will allow you to come up with the best design. Nothing beats an app that users can connect with. You need to work out the best design ideas and figure out how to make your app appealing to your target users. Do not compromise on the appearance of your app, since it is one of the first things that users see. The inspiration for your app should guide every aspect of its design, including colour schemes, the type of icons you use, and even how users navigate the app. Your app is more likely to attract users if it has a design that stands out and is easy to use. When it comes to design, it is always better to do less. Avoid the mistake most app developers make of crowding too many things onto one page. Take your time to choose pixel-perfect graphics. Be sure that the size of your fonts, buttons, and icons is not too large or too small for the screen size of your potential users. Conclusion Your app idea is the foundation of everything. You must take the time to nurture your idea and shape it into something special. Do research, test your idea, and find ways to improve it before you take the next step of bringing it to life.

### Capitalising on the IoT through the 'analytics of things'

The number of connected devices and assets that comprise the Internet of Things is nearly unfathomable. Last year, Gartner predicted there were more than 8.4 billion connected "things" – roughly a billion more than the entire human population. While these numbers vary by analyst and media outlet, there's no denying that this growth will continue as companies in nearly every industry look to cash in on this mega trend. The number of connected things aside, however, the IoT presents a huge business opportunity across a variety of sectors. In fact, according to IDC, spending on the IoT will reach $1.2 trillion by 2022, predominantly in technology and services. For example, equipment manufacturers will invest to improve the customer experience and minimise service disruption, while governments will invest to intelligently monitor and improve the ageing grid and infrastructure. Essentially, all industries will capitalise on the IoT by creating entirely new business models based on the insights gleaned from its data. Simultaneously, well-funded, data-first start-ups will go to market even faster than traditional companies in those industries with disruptive products and services. Managing unprecedented data growth Billions of connected things generate unprecedented volumes of data. According to recent research, 2.5 quintillion bytes of data are created each day. Even more astoundingly, 90 per cent of the world's data has been generated in the last two years alone. Initially, this data growth presents more challenges than benefits for traditional enterprises, particularly when it comes to managing and governing this information. Why?
Because they need to re-platform and modernise their data warehouses to take advantage of modern approaches to streaming, managing, and analysing. This applies not only to sensor data, but all forms of data. On the other hand, data-first companies born in the 21st century can make a first-mover advantage with a green-field approach. By quickly evaluating and adopting proven technology, they are able to form their analytical stack without the baggage of data infrastructure. Both established and emerging businesses have been building up data lakes for years – storing sensor and other emerging data types in cost-effective ways such as Hadoop and S3. These organisations were waiting for a time when IoT business cases matured far enough for them to apply analytics to their data at scale, enabling monetisation. And that time appears to be now – as technology has advanced, enterprises and start-ups can now focus on capitalising on stored data, creating new revenue streams as a result. Why successful IoT use cases require predictive analytics Predictive maintenance. Smart metering. Pay-as-you drive insurance. Telemedicine. Intelligent manufacturing. The list of emerging and proven IoT use cases is growing daily. However, any organisation implementing one or more of these use cases as a strategic part of their business requires one essential technology – analytics, or, more accurately, predictive analytics. Hiring legions of data scientists to build machine learning models in Python or R is not enough. These IoT use cases require highly accurate and constantly updated predictive models based on unlimited volumes of data – not just samples of the data. To provide an example, Nimble Storage sought to differentiate itself in the storage industry through achieving higher levels of customer satisfaction than its more established competitors. To accomplish this goal, the company invested in a high-performance, massively scalable analytical platform that manages and analyses data from millions of sensors to predict and prevent any potential indicators of downtime. The initiative was so successful that Nimble Storage essentially made storage autonomous, predicting and preventing 86% of issues automatically. The company was ultimately purchased by Hewlett Packard Enterprise for more than $600 million, as it hopes to apply this proven predictive maintenance method across its full portfolio of storage and server offerings. Business drivers require action for analytical insight There are three primary business drivers behind IoT use cases. Two of these focus on improving customer satisfaction through less downtime or disruption, while at the same time reducing the costs often related to service, support and maintenance by automating a business process. However, the most impactful business driver revolves around generating entirely new sources of revenue based on the inherent value of the new connected product or service. To pursue these business drivers, equipment manufacturers, governments, industry leaders, and just about any other organisation in the world will invest in IoT analytical technology this year and beyond. They will need to consider new streaming technologies, more scalable, open analytical platforms with machine learning built in, and potentially edge-based analytical offerings. 
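As a hedged illustration of what "predictive models based on sensor data" can look like at its simplest, the sketch below trains a classifier on synthetic telemetry to flag units at risk of failure. The features, failure rule and figures are invented for the example; they are not drawn from Nimble Storage or any real dataset.

```python
# Minimal predictive-maintenance sketch: train a classifier on historical sensor
# readings to flag units likely to fail. The synthetic data below stands in for
# a real IoT telemetry feed; column meanings are purely illustrative.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split
from sklearn.metrics import classification_report

rng = np.random.default_rng(42)
n = 5_000
temperature = rng.normal(70, 8, n)            # degrees C
vibration = rng.gamma(2.0, 1.5, n)            # arbitrary vibration index
hours_since_service = rng.uniform(0, 2_000, n)

# Synthetic failure rule: hot, vibrating, overdue units fail more often
risk = 0.002 * (temperature - 60) + 0.05 * vibration + 0.0005 * hours_since_service
failed = (risk + rng.normal(0, 0.2, n)) > 1.0

X = np.column_stack([temperature, vibration, hours_since_service])
X_train, X_test, y_train, y_test = train_test_split(X, failed, test_size=0.2, random_state=0)

model = RandomForestClassifier(n_estimators=200, random_state=0)
model.fit(X_train, y_train)
print(classification_report(y_test, model.predict(X_test)))
```

A production system would retrain continuously on the full telemetry stream rather than a one-off sample, which is exactly the gap the article argues scalable analytical platforms need to close.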
Whether it's a modernisation initiative or a new IoT project with a fresh approach, organisations will only be successful if they can access predictive insight in mere seconds or minutes from unlimited data volumes, allowing them to uncover trends and patterns before the competition. To be truly disruptive, IT leaders will need to change their mindset and look beyond cost reduction and operational efficiency. This way they can create entirely new businesses where revenue streams are driven by data analytics.

### Emerging Paradigms in Endpoint Protection Platforms

Over the years, cybersecurity solution investments have witnessed a colossal upsurge. Rampant ransomware attacks, online fraud and malware have prompted organisations to beef up their existing security apparatus. In 2019, over 40% of all cyberattacks were directed towards small businesses, given their high vulnerability due to inadequate security systems. With global cybercrime costs expected to reach US$6 trillion by 2021, companies have made cybersecurity solutions their top priority. Apprehensions abound that the incidence of cyberattacks, both from isolated hackers and organized cartels, will record exponential growth. This article discusses future possibilities in endpoint protection platform development in the wake of rapid technological advancements, and the possible opportunities and challenges in store for the market in forthcoming years. Intense 5G Technology Explosion The global technology landscape is unique in the sense that a massive revolution occurs at regular intervals, especially with respect to cellular network standards and broadband connections. One such buzzword for the upcoming decade is 5G, or fifth-generation wireless communication technology. While not expected to replace the existing 4G technology, 5G connectivity is expected to usher in an enhanced user experience, translating into faster speeds, higher data processing capacity and lower latency. Prolific advancements such as instant, real-time haptic feedback, remote surgeries and remote vehicle piloting are on the cards. Although a welcome development, businesses anticipate numerous security challenges across the virtual landscape. Despite being endowed with unprecedented network speeds, global 5G networks will likely become primary targets for malevolent entities seeking to compromise sensitive data, and an increased frequency of DDoS attacks would severely cripple real-time enterprise systems. Taking cognizance of this, numerous endpoint security solution providers are already assembling at the frontier, offering robust threat intelligence solutions. A case in point is Palo Alto, which offers the world's first native 5G security firewall solutions in the PA-7000 and PA-5200 NGFWs, including containerized options. Compounding Ransomware Attacks Heightened security vulnerabilities during the COVID-19 pandemic, due to an increasing digital footprint, have amplified cyberattack incidences throughout the virtual landscape. Statistics report that two out of five cyberattacks in the third quarter of 2020 were ransomware attacks, with nearly 200 million attacks experienced worldwide. While ransomware victims have been on the receiving end of file encryption and operational paralysis, a particularly interesting trend that has emerged is data exfiltration, a comparatively new tactic adopted by attackers since the beginning of the current financial year.
Data exfiltration attacks accounted for over half of all ransomware attacks. While the best way to prevent exfiltration attacks would be to prohibit downloading of suspicious applications, it is a significant challenge because such restrictions are not adequate enough. This is where endpoint protection platforms are likely to play a pivotal role.  Tessian Limited’s Human Layer Security platform is making significant headway in tackling the data exfiltration menace, detecting human errors, one of the primary causes of exfiltration attacks, and prevents dangerous and anomalous activities. It does so through its stateful machine learning technology, turning an organization’s own data into a security mechanism. Growing Vulnerabilities in Healthcare Industry Public health has emerged as one of the most recent operational areas for potential cybercriminal organizations and entities, compromising millions of patients’ health data and exposing glaring security vulnerabilities. According to a report published by the Cybersecurity and Infrastructure Security Agency (CISA), FBI and the Department of Health and Human Services (HHS), the primary attackers include the TrickBot and BazarLoader malwares. These attacks involved large-scale data exfiltrations, credential harvesting and cryptomining.   Since early 2020, actors associated with the aforementioned malware triggered the deployment of multiple ransomwares, including Ryuk. These typically assume the shape of an e-mail linked to a Google Drive document in a PDF format. The ransomware mostly uses Cobalt Strike or PowerShell Empire to steal credentials.   Realizing the debilitating impacts induced if such attacks become widespread, the abovementioned organizations have proposed a viable continuity program to keep businesses functioning. This is likely to expand scope for the entry of numerous endpoint protection platforms players in the global healthcare cybersecurity landscape.  Aggrandising Demand for Mobile-based Solutions As of today, global smartphone ownership has reached over 3 billion, which constitutes over 2/5th of global ownership. This has obviously led prominent manufacturers to equip their models with the latest antivirus and security solutions to protect user data from becoming compromised or stolen.  Unfortunately, the intensity of mobile-directed ransomware attacks have kept pace with technological advancements in endpoint protection platforms development. Hackers have found multiple ways to circumvent the smartphone security perimeters, including developing convincingly deceptive downloadable applications. Amongst all smartphones used, Android dictated over eight out of ten purchases in the global landscape. This has provided plenty of opportunities for cybercriminals to exploit the Google Play Store platform to initiate malicious attacks. For instance, Pro Selfie Beauty Camera and Pretty Beauty Camera were responsible for spreading spyware. While at face value, these applications helped enhance camera functionality by modifying selfie photographs, in reality, they served as tools to aggressively display advertisements and also installed spyware capable of making, tapping and intercepting calls and also pin-pointing user location.  To address mobile security concerns, CrowdStrike Holdings Inc. has developed the Falcon for Mobile Endpoint Detection and Response (EDR) software, providing application shielding, kernel-based vulnerability detection and threat intelligence integration. 
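For a sense of how behaviour-based detection of anomalous activity works in principle, whether for the data exfiltration scenario described above or for flagging suspicious endpoint behaviour, here is a minimal, hedged sketch using an isolation forest on synthetic outbound-transfer features. It illustrates the general technique only; it is not how Tessian or CrowdStrike actually implement their products.

```python
# Simplified sketch of behaviour-based exfiltration detection: learn what
# 'normal' outbound transfer activity looks like and flag outliers.
# This illustrates the general technique, not any vendor's product.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(7)
# Features per outbound email/upload: [megabytes sent, attachment count, hour of day]
normal = np.column_stack([
    rng.gamma(2.0, 1.0, 2_000),            # typically a few MB
    rng.poisson(1, 2_000),
    rng.integers(8, 19, 2_000),            # business hours
])
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

suspicious = np.array([[850.0, 40, 3]])    # 850 MB, 40 attachments, 3 a.m.
print(detector.predict(suspicious))        # -1 means flagged as anomalous
```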
The Future Roadmap It is evident that cybersecurity threats are tightening their grip on the global virtual space. In spite of robust security frameworks being in place, the incidence of malware, ransomware and phishing attacks is climbing exponentially. Hence, CEOs and other executives will need to formulate security strategies based on customer needs. While extensive technological leverage is evident, there is still a long road towards realizing actual success against potential cyberattacks. This is largely a result of inexperience in operating advanced security solutions. The primary focus, therefore, should be to educate employees on security software operations. Training and education of security personnel, IT administrators and management is a top priority. Programs offered by specialists such as Cybrary, Open Security Training, the Department of Homeland Security and the SANS Institute's Introduction to Cybersecurity are highly recommended. With the digital revolution expected to outlast the current pandemic, individuals and organisations are preparing themselves for the new normal, with the bulk of global businesses transitioning to the virtual landscape. Hence, adequate knowledge about endpoint protection platforms will prove critical to ensuring operational consistency.

### The Telecoms Graph Database Promise

Telecoms is a highly multi-faceted business, with carriers, telecom equipment vendors and service providers all striving to innovate constantly and offer competitive solutions that deal with ever-rising solution complexity and high customer demand. It's also an industry with a very wide variety of technology applications — from building out and maintaining great networks, to numerous customer-facing billing and support back-office systems. Very much part of the mix now are Smart Homes, Smart Cities and planning for large-scale Internet of Things (IoT) projects, too. Telecoms companies do not rely on a single technology to manage these different areas. But notably, one form of software is present across the telecoms ecosystem, and it is being used to support everything from network-facing activities (OSS), unified network inventory, planning, traffic engineering and traffic intelligence, performance analytics and fraud detection, to customer journey analytics aimed at preventing 'churn', through to personalisation apps. That one technology is graph database software, almost always in its native form, and it turns out graph software is a platform relied on for often highly demanding uses by four of the top five global telecoms equipment companies. A new way of visualising the world Why have graph databases proven so popular in this particular market? Because graph software — the world's fastest-growing database technology, used by an increasing number of enterprises to solve connected data problems — provides a powerful and flexible new data model fit for multiple business use cases, a fact that hasn't escaped the attention of telecoms network builders. From talking to customers who are doing this, their excitement lies in the flexibility and power of the graph model. Those telecoms enterprise customers who know about graph technology are enthused because they can see it unlocking a new way of not just working with information, but thinking about it — it is effectively a new way of visualising and understanding the world. It's also a new way to solve old problems.
Telecommunications networks are hyper-connected ecosystems of components, services and behaviours, from physical items (such as end devices, routers, servers and applications), to activities (like calls, media and data) and customer information (rights and subscriptions). Yet critical information is often siloed, so telcos are starting to realise that by using graph databases, cross-domain dependencies are much easier to visualise through a single unified view of the infrastructure and topology. Graphs can also capture relationships between data far more naturally than traditional relational databases, and therefore more easily model network and service complexity. Graph software's in-built facility with connections marks it out from its RDBMS forerunners, which require careful schema building and complex joins to map multiple levels of relationships and then prove inflexible when new data is added; graph software was specifically designed to store and process such multi-dimensional associations. Service assurance in the telecommunications context also requires performance and predictability, such as monitoring real-time end-user experience for automated responses — again, graph databases thrive in querying such complexity at scale, easily outperforming RDBMS and even other NoSQL data stores. Graph telecoms pioneers tell us this is starting to really help service assurance, which has had to rely on fragmented views of the network and services that are typically manually curated and often inaccurate. There's a real worry now that sticking to the relational status quo could result in costly and unsatisfactory service quality. A bird's-eye view of even the most complex networks Successful telecoms companies are thus embracing next-generation service assurance that leverages a comprehensive, real-time view of services and infrastructure with an eye on end-user experiences, new service creation and predictive modelling. One example is Orange, which is using graph technology for security insights and overall infrastructure monitoring in its key information systems, "and [so] have a bird's eye view of all its components". Specifically, Orange engineers and operators are starting to use graph technology to create a single view of operations across multiple networks at once, including cell towers, fibre lines, cable, customers and consumer subscribers or content providers. As a result, they are starting to make measurable improvements in the customer experience by minimising the impact of system maintenance or outages, re-routing services during an unexpected interruption, or identifying and preemptively upgrading vulnerable servers based on their maintenance history and availability. Another example is Zenoss, a leader in hybrid IT monitoring and analytics software with many telecoms customers, which provides complete visibility for cloud, virtual and physical IT environments. The company uses graph technology to help users understand what needs to be done to provide the highest quality of service with the greatest uptime reliability — and at scale; Zenoss monitors 1.2 million devices and 17 billion data points a day, roughly 60 million data points every five minutes.
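To ground the "single unified view of infrastructure and topology" idea, here is a small sketch using the official neo4j Python driver to store a slice of network topology and answer a cross-domain dependency question: which subscribers sit behind a given router. The connection details, node labels and data are illustrative placeholders rather than any operator's real model.

```python
# Sketch of modelling a slice of network topology in a graph database and asking a
# cross-domain dependency question ("which subscribers are affected if this router
# fails?"). The URI, credentials, labels and data are placeholders, not a real deployment.
from neo4j import GraphDatabase

driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

SETUP = """
MERGE (r:Router {id: 'rtr-01'})
MERGE (c1:CellTower {id: 'tower-17'})-[:UPLINKS_TO]->(r)
MERGE (s1:Subscriber {id: 'cust-1001'})-[:SERVED_BY]->(c1)
MERGE (s2:Subscriber {id: 'cust-1002'})-[:SERVED_BY]->(c1)
"""

IMPACT = """
MATCH (s:Subscriber)-[:SERVED_BY]->(:CellTower)-[:UPLINKS_TO]->(r:Router {id: $router_id})
RETURN s.id AS subscriber
"""

with driver.session() as session:
    session.run(SETUP)
    affected = [rec["subscriber"] for rec in session.run(IMPACT, router_id="rtr-01")]
    print("Subscribers affected by rtr-01 outage:", affected)

driver.close()
```

The same traversal in a relational schema would typically require several joins and a schema change whenever a new asset type is introduced, which is the flexibility gap the article describes.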
It’s that kind of power that is opening up genuinely fresh perspectives for the kind of problems — and the size of the solution — telecoms firms can finally start to think about.[/vc_column_text][/vc_column][/vc_row] ### IoT in the Cloud: Azure vs AWS [vc_row][vc_column][vc_column_text] As of November 2019, there are 7.7 billion people in the world, and 26.66 billion Internet of Things (IoT) devices. While the world population sees continued growth, it is surpassed by the widespread use of IoT devices. According to the United Nations forecast, the world’s population should reach 8.1 billion in 2025. By that time, IoT penetration should jump to 75.44 billion. The increase in the demand for IoT technology can be attributed to the growth in digitalization. The customers of 2019 look for fast and convenient experiences—when purchasing products, ordering supplies, licensing software, and outsourcing services. Digitalization helps organizations service customers over the Internet. The technology that enables this connection is IoT. What Is IoT and Who Needs It? The official definition of IoT treats technology as an umbrella term, which covers a system of connected physical and digital components. Each IoT component has a Unique Identifier (UID), and can transmit data without the assistance of mediators. IoT is commonly categorized into five applications: Consumer IoT Consumer IoT is perhaps the most talked about application of connected devices. A popular application of consumer IoT is smart home devices, such as home security systems and light fixtures. Also included is elder care IoT, which covers assisting technologies like voice assistance and physical devices like cochlear implants. Commercial IoT Smart healthcare devices, aka The Internet of Medical Things (IoMT), cover a range of technologies that enable the digitization of healthcare. IoMT includes data collection and monitoring systems, pacemakers, Fitbit electronic wristbands, and sensors. Applications in the transport industry include smart traffic systems, infrastructure sensors, and electronic toll collection systems. To enable connected cars, automobile innovators make use of a range of connectivity devices. A notable trend is vehicle-to-everything communication (V2X), which aims to develop IoT technology that enables autonomous driving. The main components in V2X are vehicle to vehicle communication (V2V), vehicle to infrastructure communication (V2I) and vehicle to pedestrian communications (V2P). All of these can only be enabled by IoT. Industrial IoT Industrial Internet of Things (IIoT) devices enable companies to monitor and manage industrial systems. Applications of IIoT in manufacturing include digital control systems, predictive maintenance, statistical evaluation, and industrial big data. IIoT also enables smart agriculture and helps utility companies digitize their services. . Infrastructure IoT Smart cities are made possible by the city-wide deployment of IoT components. A smart city architecture unifies its services—including networking, energy, utility, and transport—under a cohesive digital ecosystem. This kind of operation requires a wide range of IoT components—from switches and routers to sensors, management systems, and user-friendly user apps. Military IoT The term Internet of Military Things (IoMT) refers to the application of IoT technologies in the military field. For example, sensors for reconnaissance, robots for surveillance, and human-wearable biometrics for combat. What Is Cloud-Based IoT? 
To develop and maintain IoT technology, developers require a host of resources. Some developers prefer their own infrastructure. Others opt to outsource some or all of IoT-enabling processes and tools. Cloud vendors offer a variety of cloud-based IoT services. You can find fully managed IoT services, which leave the main tasks in the hands of the cloud providers. There are IoT services that offer on-demand resources, and there are also services that support IoT operations. Azure IoT services Azure IoT Hub A cloud Platform as a Service (PaaS) for IoT development. You can use the hub to build, connect, maintain, and monitor IoT. The hub supports multiple protocols such as HTTP, MQTT, and AMQP, and configuration of other protocols via the Azure IoT Protocol Gateway. Azure offers a free tier, which you can use to experiment and grow at scale. For backup, Azure offers a range of services, including disaster and recovery services. Azure IoT Edge Cloud intelligence and analytics at the edge. Edge computing enables you to lower storage and bandwidth use. Instead of writing data to the cloud for processing, storage, and analysis, you can do these tasks locally. The advantage of IoT edge is very low latency and almost real-time response potential. Azure IoT Central A fully managed IoT Software as a Service (SaaS). You can use this service to quickly deploy production-grade IoT applications. This is an end-to-end solution that enables businesses to deploy, monitor, and manage IoT. A main advantage is the templates offered, so you won’t need to start from scratch. Amazon Web Services (AWS) IoT Amazon FreeRTOS A microcontroller operating system for the development and management of small edge devices. Amazon FreeRTOS is an extension of the open source FreeRTOS kernel. The service enhances open source software libraries with AWS IoT Core and AWS Greengrass offerings. AWS Greengrass AWS IoT Greengrass provides controls for building IoT devices that connect to the cloud and other devices. The main advantage of this service is that it enables the local execution of AWS Lambda code, data caching, messaging, and security. Devices are put together into groups, and each group can communicate over the local network. This enables fast communication, which translates into near real-time response. AWS IoT Core A managed cloud service that enables you to connect devices to cloud components. You can easily connect billions of devices. The service can support trillions of messages, any of which can be routed to AWS endpoints and other devices. AWS IoT Core connects to other AWS services, including Greengrass, AWS IoT Analytics, and AWS Lambda. You can use the free tier to experiment, but be sure to properly calculate all variables before fully adopting the service. Conclusion In today’s digitized ecosystem, there’s no shortage of solutions. As the demand for IoT technologies rises, the cloud IoT market follows. The major cloud vendors—Azure, Google, and AWS—provide a range of IoT offerings. There are many services, so be sure to check out other cloud vendors.  Only you can tell which vendor is a good fit for you. Do your research, assess your situation, and make use of the free tiers. Experience will make you wiser. [/vc_column_text][/vc_column][/vc_row] ### Multi-cloud management demands a single viewpoint [vc_row][vc_column][vc_column_text]Remember when the narrative around the cloud was polarised by the simple choice of going public or private? How times have changed. 
In the ever-evolving world of digital transformation, it all seems rather naïve to think of trying to shoehorn all our digital expectations and requirements into such an either/or and black and white choice. It is why the hybrid option emerged as a natural successor and has now expanded further to give us the multi-cloud proposition. You may wonder to what extent the multi-cloud environment varies from the hybrid incarnation, but the two are far from interchangeable. In essence, multi-cloud goes a step further and represents a more strategic approach to cloud management. Stripped to its core, it is the antithesis of vendor lock-in, instead of a pick and mix approach that takes the best-in-class technologies and services from different cloud providers to create the desired solution tailored to specific business needs. Inevitably, this more nuanced approach gives the business greater agility. It negates the more intrusive and costly intervention that comes with making significant changes to the underlying architecture to accommodate changes in business demands - or being confined by the limitations and cost implications when cloud provision is tied up with a solitary provider. Yet there is still a fly in the ointment. The greater choice, flexibility and reliability that entails comes with a caveat: greater complexity. As applications and workloads flit between diverse cloud providers with their own management interfaces and unique services, we can see how the lack of standardisation is leading to an exponential rise in integrational demands as the technology stack grows. This is compounded further by the struggle organisations have in trying to keep track of a fast-expanding diverse cloud environment via solutions that don’t provide sufficient visibility. The upshot of this is that even in what promises to be the most dynamic environments, the potential can be heavily compromised without the necessary control and structure in place. It will manifest itself most acutely when we consider the migration and management of data from one cloud to another. Without optimal connectivity, the potential for data loss grows and the impact is most keenly felt by the end user who expects and deserves a seamless and consistent user experience. In my opinion, one of the biggest oversights in a cloud strategy remains overlooking the extent of the integration needed and failing to properly take into account the entirety of the ecosystem involved. This includes people, systems, data sources and how each and every one of these must be interconnected. In fact, I don’t think it is an overstatement to say that even some of the most well-intentioned approaches barely scratch the surface. Great customer experience doesn’t hinge on a great app alone however innovative and multi-functional it may be. If it isn’t fully connected, it will not translate to a great service. In fact, the only way to guarantee this is to know exactly what is happening all the time, across the entire estate, and to be in a position to respond and react to changes and the insight that presents itself as fast and intuitively as possible. In short, looking at data in context, through the vantage point of a cloud management platform with a single interface. It underscores, once again, the criticality of bringing control and uniformity to an otherwise eclectic environment, underpinned by the logical premises that the same tools equate to the same user experience, irrespective of the different personas involved.  
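A minimal sketch of what "a single interface over an eclectic estate" can mean in practice: normalise resource listings from each provider into one common shape so they can be viewed, filtered and alerted on together. The provider adapters below are hypothetical stand-ins, not real SDK calls.

```python
# Sketch of a single-pane-of-glass view over multiple clouds: each provider
# adapter returns resources in its own shape, and we normalise them into one
# common record. The adapters below are hypothetical stand-ins for real SDKs.
from dataclasses import dataclass
from typing import Iterable, List

@dataclass
class Resource:
    provider: str
    kind: str
    name: str
    status: str

def list_aws_resources() -> Iterable[dict]:        # stand-in for boto3 calls
    return [{"InstanceId": "i-0abc", "State": "running"}]

def list_azure_resources() -> Iterable[dict]:      # stand-in for azure-mgmt calls
    return [{"name": "vm-web-01", "provisioningState": "Succeeded"}]

def unified_view() -> List[Resource]:
    view = [Resource("aws", "vm", r["InstanceId"], r["State"])
            for r in list_aws_resources()]
    view += [Resource("azure", "vm", r["name"], r["provisioningState"])
             for r in list_azure_resources()]
    return view

if __name__ == "__main__":
    for res in unified_view():
        print(res)   # one consistent shape, whatever the underlying cloud
```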
The focus must be on low code application and integration that adopts a use-friendly, intuitive, design-led approach. This will enable a broad mix of users to build low code apps and the incorporation of visual analytics that speeds up and simplifies data integration, analysis and interpretation. Furthermore, features that enable fast and easy connections to virtually any endpoint take care of the heavy-lifting in a vast and sprawling cloud environment where it is all too easy for things to slip through the net. This potential insight is something that none of us can afford to miss.[/vc_column_text][/vc_column][/vc_row] ### The Artificial Intelligence Revolution Is Here [vc_row][vc_column][vc_column_text]Artificial intelligence (AI) is no longer a technology of the future, it is a technology of the now. It is playing a significant role in our daily lives, from Netflix recommending content based on what you have previously watched to personal assistants such as Siri and Cortana. Despite the technology entering the mainstream in the consumer sector, it has yet to be embraced in the same way by the business world. I believe this is mainly due to the technology simply being misunderstood by organisations of all shapes and sizes. There is a very good reason for this; AI is a comprehensive and complex topic that has lots of different strands to follow and explore. By better understanding what the technology is, how it works and the benefits it offers, businesses can unlock upsides including increased productivity and streamlined costs. Artificial intelligence defined: Before we dig into the scale and scope of the technology, it is important to have a clear understanding of what we mean by artificial intelligence. “The ability for computers to perceive, learn, reason, and assist in decision-making to solve problems in ways that are similar to humans”. AI can be deployed across a range of business functions to take over manual, repetitive and time-consuming tasks undertaken by employees. It can also provide insight into data. Its key capabilities include: It can perform human tasks faster and in a more thorough way. This includes inspecting, repeating and analysing actions automatically It can gather, analyse and process huge volumes of data and help identify key trends that humans might not spot or interpret accurately It can do all of this in real-time, taking seconds to complete tasks that would take humans days or even weeks to work through The business benefits of AI: AI allows organisations and employees to work smarter, better and faster. It drives productivity, boosts efficiency and ultimately makes businesses more profitable. When it comes to data, AI can gather, segment analyse and report on huge volumes of information. It can then make highly accurate predictions based on this information. For consumer facing businesses, AI can be used to deliver a superior customer experience, from automated chatbots to real-time parcel tracking. By doing much of the heavy lifting, AI frees up time for employees to focus on other aspects of their roles and to further educate and upskill themselves. It also removes the risk of human error. AI in action: A growing number of businesses are joining the AI revolution and are already benefitting from the many upsides it offers. Microsoft’s AI report says that organisations on the AI journey are outperforming others by 5% on factors like productivity, performance and business outcomes. 
Here are a couple of examples of how AI is being used in the workplace: The technology is being used to deliver personalised offers to online shoppers based on their previous buying profiles and habits Businesses in the transport industry are using AI to send real-time passenger updates for trains, buses and taxis Recruitment agencies are using AI to scan CVs and covering letters to filter out suitable candidates for roles How to harness the power of AI: In order to unlock the power and potential of AI, businesses need to take a considered approach to the technology. They must also understand that AI does not replace human employees, rather it is augmented into their roles. The human is very much at the centre of it. The first thing to consider when it comes to deploying AI solutions within your organisation is to identify where it can have the most impact and where it can be successfully integrated. We always recommend starting with one area and then increasing its usage from there. As with any technology, it is best to walk before you can run. Microsoft refers to this process at an AI journey and suggests organisations ask the following questions: What are the business problems we want to solve – and how can AI help? What are the opportunities we are missing? Is our data ready? Are our people ready? If not, how can we re-skill and re-train them so that technology augments their role rather than simply automates it? By undertaking this process, you will gain a clear picture of how AI can benefit your business and what action you need to take to successfully integrate it into your organisation. Other things to consider: Deploying AI technologies across your business requires a collaborative approach. This means bringing together senior managers from all sections of the business – operations, finance, marketing and, of course, IT, to ensure its potential is maximised. It is important to understand what benefits AI can bring to your business and where those benefits will have the most impact. By getting it right from the start, you can ensure you invest in the right technologies and implement them in the areas of your business where you will benefit the most.[/vc_column_text][/vc_column][/vc_row] ### How the Cloud and Automation Impact Database Security [vc_row][vc_column][vc_column_text]There has been a great deal of talk around database automation in the past few months, especially as it relates to organizational cybersecurity. Companies around the world are looking for new ways to streamline their processes, and cloud-based automation tools are quickly becoming a popular solution.  They are simple to deploy, cost-efficient, reliable, scalable and enhance information management. For example, Oracle, who announced its cloud-based autonomous database last May, considers its cloud application suite strategy to be a direct contributor to its recent success in Oracle applications. When it comes to database management, automation can eliminate the human labour associated with database tuning, security, backups, updates, and other routine management tasks. It simplifies operations and allows companies to stay focused on their core business. But as more businesses move from traditional software to cloud solutions—and as more software vendors automate monitoring, testing, patching, and tuning—many question the impact on database security. While database automation won’t completely absolve you of your security-related responsibilities, it will certainly play a role in how you manage them. 
Here are a few security tips to keep in mind when automating your database in the cloud. Your existing security policies and tools won’t work in the same way in the cloud. The on-premise world is much different than that of the cloud, especially when it comes to security.  For example, in on-premise environments, there’s no question that anything that happens within the system is owned by the company’s security team. You probably have existing people, processes, and platforms in place to address every part of your database security. However, in the cloud, you have a shared security responsibility with your cloud provider. Though Gartner predicts that just 5 percent of security failures are at the fault of the cloud provider, it’s a fact of the cloud environment your business must learn to operate with and integrate into security policies. Your risk of data breaches will decrease. Organizations of all sizes are becoming well versed with the risks associated with data theft, misuse of data, and inappropriate access to data. While the importance of securing data continues to grow, many organizations are unable to expand their resources to meet the demand. On average, many organizations wait six months before applying security patches to critical systems. A patch is a small piece of software that a company deploys whenever a security flaw is uncovered. Like the name implies, it repairs the vulnerability, which otherwise can be exploited by malicious hackers. Even worse, the majority of organizations that suffered a data breach, attribute the incident to a known, unpatched vulnerability. Finding a tool or solution with automated patch management makes it much easier for businesses to stay on top of their security. Rather than forcing downtime to reboot critical systems, the database can automatically repair security vulnerabilities as they arise. Your database administrators (DBAs) will shift responsibilities.   While your database security might not continue to be reliant on DBAs, they will still continue to play a vital role in database operations. Like technology, the role of DBAs will continue to evolve and grow. As Blake Angove, director of technology services at LaSalle Networks states, “As more businesses are moving to the cloud, it’s changing what [DBAs’] roles and tasks look like. Automation is probably making them a little nervous, but the DBA role isn’t going away—it’s evolving.” The use of the cloud is taking DBAs away from some of the tasks they might be used to doing, like being on-call 24/7 or managing backups, but it also allows them to be more involved in helping their business leverage data. They can take a more proactive role in problem-solving, whether it be fixing a slow database or working with data science teams. DBAs can utilize their data expertise to add significant value to the company. After all, they know the data coming in and out of the organization better than anyone. Rather than just managing the database, DBAs can assess how the company can better use that data. You will improve cyber hygiene. Cyber hygiene, much like personal hygiene, refers to the practices and steps that users of computers and other devices take to maintain system health and improve online security. Practising good cyber hygiene involves properly configuring the system, including patching, but also requires encrypting the data within the system, controlling access to that data, and monitoring access to that data. 
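To illustrate what automating these hygiene practices can look like in the simplest case, here is a small sketch of a scheduled audit job. The inventory format, field names and the 30-day patch window are assumptions made for the example; they do not describe any particular vendor's tooling.

```python
# Illustrative sketch: a scheduled job that audits a database inventory for basic
# cyber-hygiene issues (stale patches, no encryption at rest, overly broad access).
# The inventory structure and thresholds below are hypothetical.
from datetime import date

PATCH_WINDOW_DAYS = 30  # assumed internal policy, not a vendor default

inventory = [
    {"name": "orders-db",  "last_patched": date(2019, 1, 5),  "encrypted_at_rest": True,  "public_access": False},
    {"name": "hr-db",      "last_patched": date(2018, 7, 12), "encrypted_at_rest": False, "public_access": False},
    {"name": "staging-db", "last_patched": date(2019, 2, 1),  "encrypted_at_rest": True,  "public_access": True},
]

def audit(databases, today=None):
    """Return a list of human-readable findings for databases breaching basic hygiene rules."""
    today = today or date.today()
    findings = []
    for db in databases:
        patch_age = (today - db["last_patched"]).days
        if patch_age > PATCH_WINDOW_DAYS:
            findings.append(f"{db['name']}: last patched {patch_age} days ago")
        if not db["encrypted_at_rest"]:
            findings.append(f"{db['name']}: data not encrypted at rest")
        if db["public_access"]:
            findings.append(f"{db['name']}: publicly accessible")
    return findings

for finding in audit(inventory):
    print("FIX:", finding)
```

In a managed or autonomous database service, checks of this kind (and the remediation) run inside the platform itself, which is exactly the shift in responsibility described above.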
Automating as much of that work as possible not only makes it easier for organizations to continuously put forth their best efforts toward cyber hygiene, but also ensures that proper security measures are taken at all times. Ultimately, automating business processes not only streamlines operations, but it makes for simpler security management too. With risks like unpatched systems looming over organizations at all times, IT staff can't manage databases alone. With an expert staff and automation working hand in hand, organizations can elevate their database security and bring operations to the next level.

### 10 Practical Tips for Keeping Your Business' Data Secure

Whether it's occurring at a huge corporation or the newest start-up on the block, web threats and data theft can cause massive disruptions to any business' day-to-day operations. Without the proper security and procedures in place, businesses leave themselves open to the consequences of such attacks, which are at best frustrating and at worst irreparable. As damaging as threats to a business' data security can be, they're also easily avoidable when you have the appropriate safeguards in place. If you want to ensure business continuity, then investing in the right methods is essential. To get you moving in the right direction, here are ten practical tips your business can use in order to keep its data safe and secure.

1. Write up a strategy: Rather than having a vague idea of policy and procedures, businesses of all sizes should have a formal IT security strategy that's as detailed and exhaustive as possible. It's imperative that it not only lays out how to protect data and resources, but what to do should things go wrong. An incident-response strategy ensures you'll be a step ahead, rather than making any rash heat-of-the-moment reactions that might make things worse. Keep it updated and close to hand too; there's no point putting in all that effort writing it up only for the document to collect dust in a drawer somewhere.

2. Protect against malware: Ward off data threats by securing your PCs and network against malware. Malware – malicious software that can cause massive amounts of data damage – can swarm onto unprotected machines without you even knowing about it. It's essential that you protect yourself from malware through the following: Apply the firewall: While not enough on its own, your router's on-board firewall provides the first line of defence, so turn it on. PC protection: Sophisticated security software protects without compromising on the performance of your computer or network. Look for protection that can deal with identity theft, suspect websites and hacking in one fell swoop. Keep emails clean: Antispam software protects against unwanted emails, which can create risks and distractions for employees. Stop them in their tracks with the necessary precautions.

3. Keep your wireless network secure: If you have a wireless network, then beware: hackers are waiting to pounce on it without warning. A weak encryption key may flummox those who aren't especially tech savvy, but to hackers it's a breeze to bypass. Strengthen your router by using the strongest encryption setting you can to protect your business, and turn off the broadcasting function to make your network invisible. As far as hackers are concerned, they can't hack what they can't actually see. 4.
Safeguard passwords Even something as simple as a password can be optimised to fortify your data. They might be a nuisance to remember, but the more complex your passwords, the more protection you can provide. Make your passwords at least eight characters long, and embed numbers and other non-standard characters within them, so they can’t be easily guessed. Changing them frequently can also help – as can employing credentials which aren’t words, but combinations of seemingly random letters, numbers and special characters. Here’s where passwords managers really come into their own, meaning your employees don’t have to worry about remembering them and won’t risk writing them down. 5. Create a plan for personal devices More common in small-to-medium sized businesses, make sure you’re staying abreast of the security risks associated with employees bringing in and using their own devices. Create a plan for the practice in order to provide some protection against legal repercussions and mobile system costs. A clear, comprehensive policy covering pertinent data deletion, location tracking, and Internet monitoring issues can be very valuable. Additionally, businesses should look to make proper provision for employees who work remotely or use their own devices as part of their roles. While these practices can increase productivity and reduce overheads, they can also introduce new security concerns if not properly managed. 6. Set up automatic software updates Hackers love to scan a network or site to see which version of software it’s running on to make it easier for them to exploit the vulnerabilities of older versions. Updating device security settings, operating systems and other software to their latest versions can prevent this from happening. Set any patches and improvements to automatically update in the background to further safeguard against potential threats. 7. Conduct background checks Be extra vigilant with regards to hiring new employees; safeguarding against internal threats plays a key role in effective cyber security. Look into their background and give yourself an idea of what kind of person they are. Additionally, be mindful of changes in the character of existing employees, as this could be indicative of other issues. 8. Dispose of data properly Having the appropriate measures in place to dispose of data which is no longer required is a critical factor in reducing the risk of a security breach. Ensuring that retired and reused devices and storage media have had their contents properly removed will ensure that confidential company data can’t be retrieved further down the line – and won’t fall into the wrong hands. Remember; Reinstalling your operating system, formatting your hard drive or deleting specific files and folders doesn't ensure your data is gone. In fact, in most cases your data is still completely accessible with freely-available tools. Ensure your IT disposal partner is using a tool that overwrites your data multiple times ensuring your data is unrecoverable. Businesses should look to implement a sound data destruction policy which outlines the protocol for each use case (computers, phones, external hard drives and flash memory) – whether these devices are being redistributed within the business or discarded at the end of their lifecycles. 9. Use the cloud If your business doesn’t have the time or expertise to stay on top of all the security issues updates requiring attention, then it might be worth looking at a cloud service provider instead. 
A reputable cloud provider will be able to store data, maintain software patches and implement security. While not likely to be suitable for enterprise-level organisations, this can be a good approach for small businesses looking to provide themselves with a degree of protection. 10. Educate your employees Making sure everyone in your business understands company security policy is important. Whether you opt to do it during onboarding or conduct bi-annual refresher courses, it’s worth carrying out – just make sure everyone is heeding the practices, throughout the entire company.[/vc_column_text][/vc_column][/vc_row] ### Leveraging AI to Revolutionise Underwriting [vc_row][vc_column][vc_column_text]Today, underwriting is tedious, time-consuming and inconsistent. With AI, we can realise a better tomorrow. Leveraging AI to revolutionise underwriting Underwriting is at the core of the insurance industry. Today, however, the process is tedious, inefficient and leads to inconsistent results. Human intervention in underwriting is a significant factor, such as with life insurance. Of course, where people are involved, inconsistencies are inevitable.  Many insurers are further hindered by old technology and procedures that prevent them from leveraging Internet of Things (IoT) tools like smartwatches and telematics. These devices provide real-time data on the activity level and driving performance of clients. The solution to these problems is underwriting powered by artificial intelligence (AI). With AI-powered underwriting, decisions can be reached in days, not weeks. Since AI can derive insights from historical data, decisions can also become more consistent and reliable. A few insurance companies are using AI in select areas, but there is yet to be widespread adoption of AI solutions across the underwriting landscape. There are four techniques under the broad spectrum of AI that would provide underwriters with a comprehensive solution and a competitive advantage: Machine learning Natural language processing Deep learning Behaviour data models Machine Learning Insurance providers have vast amounts of historical data on applicants – medical history, family history, job data – as well as the risk assessments underwriters assigned to applicants based on the available data and a static set of rules. A machine learning algorithm is capable of ingesting that historical information and creating a model that will imitate the decisions underwriters have made to that point. For simpler cases, allowing the model to assign the risk classification would move the industry towards zero-touch underwriting, where decisions are made with zero manual intervention from the underwriter. For more complex cases, the underwriter’s decision can be made quicker and with sounder judgment because of the underlying model that is there to support them. As the model matures and the organisation becomes more comfortable with AI, the percentage of applications underwritten with the zero-touch concept can increase. Eventually, insurers could reach an end state where manual or assisted underwriting could comprise 5-10% of their cases, with the rest zero- touch underwritten. Deep learning With the advent of deep learning techniques, AI is becoming more sophisticated and better able to use reasoning and problem-solving skills in the way a human brain does. This will help in transforming the underwriting of various commercial risks that are currently being underwritten manually using traditional methods by expert underwriters. 
It will also enhance decision making and reduce costs (including underwriting losses). Drones, equipped with infrared cameras, lasers, and sensors help gather data relating to weather, temperature, and environmental conditions, including radiation. They are the game changers that are disrupting the way risk is being assessed in the commercial sector. According to the Federal Aviation Administration (FAA), 7 million small drones could fill the sky by 2020, and as many as 2.7 million will be used for commercial purposes. Here are some examples of the ways drones can be used: Property insurance: inspecting high rise buildings or weather damaged rooftops Agriculture insurance: verifying crops across acres Boiler and machinery insurance: for equipment break down insurance, drones can help collect data from high-pressure machines and unsafe boilers Natural language processing Natural language processing (NLP) has two clear uses within insurance. The first is that applicants can communicate with a bot to gather the data needed for an underwriter to make a risk assessment. The bot is able to understand the human it is communicating with due to the natural language processing technique. The other application is with text- based data mining – currently a manual method for most insurance companies. If a doctor sends in an evaluation of an applicant, that data must be taken from the text document and coded into the system by someone within the underwriting department. With natural language processing, that information can be fed into a bot that instantly creates an electronic form. If machine learning is being utilised, that form can immediately be fed into the algorithm. In commercial insurance, NLP can be used to better support the underwriters by being their virtual assistants. Apart from the usual data entry, NLP can help the underwriters to pull up relevant information on the risk they are writing using search- based analytics to speed up data access. For underwriting a mine, the underwriters would traditionally look at the location, depth and breadth, safety procedures, number of resources and other relevant factors pertaining to mine including past claims. In the coming days, the underwriter can dictate commands to the virtual assistant which using NLP can intuitively pull up not just the traditional data but also analysis on current developments, literature on environmental risk (or climate changes) and the economic conditions affecting the mine, and an insight into future risk trends giving the underwriter a broader and a complete picture of the mine and the mining industry. Mining insurance being specialty heavy industry portfolio, the search analytic tool can also be used to provide data needed to make a decision on reinsurance placement. Behaviour data models Behavioural data models can be used to analyse the real-time customer data from IoT devices for more precise risk classification and product innovation. Using that data, insurance companies can launch new products that incentivise life insurance customers to lead healthier lives, or auto insurance customers to drive safer. When insurance companies leverage behaviour data to combine a wellness product with traditional insurance, they can tap into a new market segment, a new revenue stream. Overcoming barriers to adoption Insurance companies that utilise these AI techniques will see reduced costs due to less time needed for underwriting, as well as fewer mistakes and more consistent decisions. 
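To ground the machine-learning technique described earlier, here is a minimal sketch of a model trained on historical underwriting decisions and used to triage new applications, auto-classifying only the cases it is confident about. The column names, the CSV layout, the confidence threshold and the choice of a random forest are all assumptions made for illustration; they do not describe any insurer's actual system.

```python
# Illustrative sketch only: imitate past underwriter decisions and route low-confidence
# cases back to a human. Feature columns are assumed to be already numerically encoded.
import pandas as pd
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

history = pd.read_csv("historical_applications.csv")  # hypothetical: past applications + assigned risk class
features = history[["age", "bmi", "smoker", "family_history_score", "occupation_risk"]]
labels = history["risk_class"]                         # e.g. preferred / standard / substandard

X_train, X_test, y_train, y_test = train_test_split(features, labels, test_size=0.2, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_train, y_train)
print("Agreement with past underwriter decisions:", model.score(X_test, y_test))

def triage(application: pd.DataFrame, threshold: float = 0.9):
    """Zero-touch classify when the model is confident; otherwise refer to an underwriter."""
    probabilities = model.predict_proba(application)[0]
    best = probabilities.argmax()
    if probabilities[best] >= threshold:
        return model.classes_[best], "zero-touch"
    return model.classes_[best], "refer to underwriter"
```

Raising or lowering the threshold is how an insurer would control the split between zero-touch and assisted cases as its confidence in the model grows.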
Being able to issue policies faster will allow larger, more established companies to compete with technology-driven start-ups that are taking away their market share. However, most companies face barriers to adopting these solutions. First barrier The first being data capability. Historical insurance data is normally spread across disparate systems or applications, making it difficult for companies to group together true and consistent data for a machine learning algorithm to begin mining that data for insights. Second barrier The second barrier is an ecosystem of legacy technology. The existing technology used by most insurance companies is outdated enough that implementing AI techniques would require a significant financial investment, as well as a cultural shift within the organisation in terms of how technology is used in decision making. Given these barriers, a ‘big bang’ approach is not recommended for companies looking to implement AI techniques within their underwriting department. Instead, companies should opt for a ‘start small and fail fast’ approach to achieve immediate wins and reduce the risks both financially and in terms of their time investment. Lemonade, founded in 2015, demonstrates this approach by utilising AI with a simple product like renter’s insurance, which applicants can attain within a few minutes using the company’s mobile app. Ladder is another company that leverages AI to get applicants a quick quote for a fully underwritten term- life insurance policy. In starting with more targeted use cases, insurance companies can begin to engage with AI technologies to gauge effective use techniques. In time, these use cases will compound, generating sustained efficiencies for underwriters across all sectors of the insurance industry.[/vc_column_text][/vc_column][/vc_row] ### Demystifying the cloud for CFOs Inside corporates, The Cloud is on a roll. In the beginning, the IT department was cloud’s natural early adopter. There was initial resistance due to security concerns and the apparent risks to job roles – but the advantages of cost and scalability made the business case for PAYG computing irresistible. Sales and marketing came next, as companies like Salesforce offered CRM and other applications as software-as-a-service (SaaS) – perfect for teams often on the road and needing daily/hourly access to shared data and assets. As Cloud continues to prove itself across the enterprise, CFOs and finance teams should be next to jump on board As Cloud continues to prove itself across the enterprise, CFOs and finance teams should be next to jump on board. But confusion about pricing, benefits, and how to make the transition from on-premise systems is holding many back. CFOs: The last cloud holdouts? Gartner reported last year that around 36% of all transactional systems will move to the cloud by 2020. Finance applications form a big part of that and more will move to the cloud with each passing year. The shift began with SMBs because bigger organisations were less enthusiastic about moving large data sets over the internet. It’s normal for an organisation’s keeper of financial probity to want hard evidence of value before investing in something new, but as proof has become more apparent, resistance to the cloud has begun to break down. [clickToTweet tweet="'It’s normal for an organisation’s keeper of financial probity to want hard evidence of value before investing in something new, but as proof has become more apparent, resistance to the cloud has begun to break down.' 
#CFOs could be next to adopt #Cloud" quote="It’s normal for an organisation’s keeper of financial probity to want hard evidence of value before investing in something new, but as proof has become more apparent, resistance to the cloud has begun to break down. "] Regardless of company size, CFOs are always looking to reduce costs and improve the bottom line, and an increasing number are seeing the cloud’s potential as a money saver – as well as a way to make their own departments more efficient and effective. What’s making those early finance adopters make the move? Efficiency and cost savings Access to big company capabilities Flexibility The financial benefits of the cloud Transferring responsibility for hosting and maintaining applications over to a cloud vendor can be quite empowering. The cloud provider does all the heavy lifting in terms of management of the hosting environment, securing data, and making sure applications are available to end-users at top performance and across devices. There are well-known risk factors associated with traditional IT systems. Sometimes they fail to deliver the promised benefits. Sometimes they fail due to lack of end-user acceptance. The cloud’s subscription-based pricing model eliminates much of that uncertainty. Budget moves from CAPEX to OPEX. The freedom to pay only for the capacity, capabilities, and user licenses you need allows organisations to avoid the high upfront costs of on-premise solutions. In that way, Cloud makes it easier to align technology expenditure with growth. When French electrical-supplies distributor Rexel Group migrated its IT infrastructure to the cloud in 2016, the company was able to adopt an IT operational-expenditure model that lowered costs by 35%, while allowing the office of finance to generate financial reporting five times faster. Rexel found that the costs of managing cloud-based systems are lower and that the experience of using cloud software is more satisfying for end users, who always have the latest versions and the latest functions. Patches, upgrades, migrations from old to new software versions all happen invisibly in the background. Cloud-based systems can also respond more flexibly to changing business requirements. Because they can be deployed without installing new hardware and software, Cloud allows companies to support new business models, harmonise systems with new acquisitions, and more easily test new markets. As a result, organisations can afford to be more progressive and disruptive by introducing the latest innovations in a manner that is easily digestible, and at a pace that compliments its technological maturity. Other cloud benefits for finance:   The finance-focused cloud is growing: more cloud solutions supporting core financial applications like ERP and EPM are entering the market and geared to organisations of all sizes. Excel Farewell: Cloud provides a low-risk, path to move processes away from spreadsheets CFOs can keep IT costs in-sync with business levels: Since cloud assets can be scaled upward and downward quickly against an agreed cost structure, IT spend can be more closely aligned with growth. That includes costs for IT security: The cost of keeping IT security up to date is significant, and the costs of a breach can be severe. The cloud initially raised safety concerns for some, but time and innovation have given cloud vendors some of the strongest security protections available in IT. 
Improved Productivity: Since the cloud makes core applications more accessible, employees can accomplish more. Rather than being bound to a single location or machine, they can access the tools they need when travelling, offsite or out of hours. Reduced Redundancy: Cloud environments eliminate the contingency expenditure built into on-premise systems. Any necessary redundancy is the vendor’s responsibility and built into the data centres where infrastructure and applications are hosted. With organisations of all sizes adopting cloud technologies at a rapid pace, it’s also worth evaluating what the competition is doing – no one wants to be left at a disadvantage.  Moving to the cloud. Can CFOs go it alone? While the benefits of the cloud seem to be transparent and superficially require just a PC and fast internet connection to use, migrating data and processes from traditional IT to SaaS can be complex. There is much more to it than handing over access to your databases to the cloud vendor and assuming all will be fine. Integrating systems and creating new processes can be a lot of work. Significant planning is required in order to achieve data portability and interoperability with the cloud. The stages of migration and deployment can be costly and resource-heavy if not handled correctly. CFOs should consider working with an experienced implementation partner who can help answer the following questions: How much time and expense will we need to invest in evaluating cloud vendors? How can we make the implementation work for finance requirements, time cycles and processes? How much can we save by purchasing licenses as needed rather than having to project our needs up front? Will we need entirely new processes or can we migrate those as well? Looking beyond costs savings, where can cloud software give us a strategic advantage? The advantages of cloud computing should give today’s CFOs lots to think about. The benefits are compelling, but at the outset at least, making the move alone is unwise. Consider working with an expert who can help you design an approach to cloud computing tailored to your organisation’s particular needs and business objectives. ### Setting Your Business On A Positive Path To Digital Transformation w/ RPA [vc_row][vc_column][vc_column_text]Advances in machine learning mean that the cloud can now power greater efficiencies and return on investment than ever before. 2019 will see Artificial Intelligence (AI) and automation capabilities extend from the early adopters to a broader enterprise audience, giving more companies access to cutting edge technology. This change is set to revolutionise the way we do business, putting huge oceans between those that modernise, automate and innovate and those that do not. So how can you embrace new digital technologies to avoid being swept away by the digital tsunami?   A sound approach is to build your Intelligent Automation capability in incremental steps. RPA (robotic process automation) is continuing to gather pace in UK businesses, and recent industry successes make this a good place to start your journey to fully embrace AI. RPA is the process of using software robots to automate mundane, repetitive tasks. Once these are automated, companies can look at moving to more complex AI-based automation, for example, using visual and cognitive intelligence to deliver more advanced automation that draws information from multiple sources and interprets it to provide improved business intelligence. But how do we get from here to there? 
Enterprises need to look practically at their infrastructure, workforce and security, and consider what they need to change to enable their business to be set on a positive path to digital transformation. Take infrastructure to start with. In an ideal world, your employees would have secure access and data sharing wherever they are located, as this offers enormous benefits to performance and productivity. There are a whole host of technologies available from leading companies like Microsoft, VMWare and Citrix that allow secure and easy access to your business applications and data irrespective of location and device. Working with a managed services partner to ensure you have the right combination of technology to allow you to create a successful modern workspace will allow you to operate on a ‘pay-as-you-go’ model. Here you pay, per user, per month and can scale up as appropriate and take advantage of software upgrades without spending vast amounts of capital. Part of your digital transformation is likely to involve moving your data from on-premise servers to the cloud. By engaging with a cloud services provider, you remove the burden on IT and improve business agility by allowing you to provision apps and desktops faster. And, you will benefit from the latest security features and management functionality of a modern operating system like Windows 10 that won’t go out of support. Once your infrastructure and security is sorted, enterprises can begin to automate many of their key business processes using RPA. Combined with Artificial Intelligence-based technologies, the use cases for process automation are even wider, and provide greater returns: One training provider which takes up to 400,000 first line calls annually is using speechbots to answer calls and leverage RPA to verify the caller. This has resulted in reduced operational expenditure in the call centre by 50% and increased efficiency. A large car dealership is automating new and used car ordering from the manufacturer PoV in seven dealerships to improve operational productivity and customer experience, saving £210k per year for that one process alone. Organising legal documents around correspondence, evidence, witness statements, case precedent and indexing using RPA has resulted in a monthly saving of £16,500 for one law firm. Their payback on the initial investment was achieved in just over six months. At Ultima, we have been using RPA technology to automate our back-end operations, and we’ve seen productivity rise by a factor of two since implementing the technology across five processes. We’re using RPA to aggregate sales pipeline data from 32 individual spreadsheets and systems and presenting the data in PowerBI. This has saved eight hours of people-hours per day, saving one full-time admin head at £20k per year. Not only that, but the information can be run at 100% accuracy multiple times per day. We’ve also automated our first-line helpdesk for our managed services clients. They can now self-serve using chatbots and RPA which means our costs are reduced so we can serve more customers with the same headcount. It’s also improved customer satisfaction as they like to solve their own problems in a short space of time which automation allows. Looking ahead The next level of digital transformation will be to move from intelligent automation to artificial intelligence. 
Paul Daugherty, CTO Accenture and co-author of Human+Machine said we need to “initially focus on developing the full potential of your employees by applying automation to routine work; then you can proceed to concentrate on human-machine collaboration.” In terms of outcomes for business, intelligent automation is going to lead to increased productivity through better business intelligence and collaboration, and improved workforce engagement – staff will work on more valuable and interesting projects. Efficiency will improve through process automation, reduced building estate, more intelligent workplaces and automated services. Compliance will improve with automated controls and digital rights management. All this will be driven by RPA, analytics, chatbots, cognitive services and IoT. The result will be improved quality of service for customers and repeat business leading to better profitability. It’s a certainty that automation and AI in business will become all pervasive. Cloud computing is the catalyst for accelerating this speed of change. It makes the hyper-connectivity that AI and the IoT deliver affordable and means businesses can scale as and when they like. Companies will be able to sift through huge data pools and analyse them to provide a level of insight and knowledge about their businesses that humans alone couldn’t do. Automated machines will collate vast amounts of data, and AI systems will understand it. By coupling two different systems – one capable of automatically collecting incredible amounts of data, the other that can intelligently make sense of that information - individuals and businesses will become more powerful. Right now, we are just at the beginning of the intelligent transformation journey.[/vc_column_text][/vc_column][/vc_row] ### Cloud-based AI technology propels cardiovascular medicine forward [vc_row][vc_column][vc_column_text]The UK government has recognised the potential of technology to improve patient care and save lives. Technology is considered central to the future of healthcare, and Matt Hancock recently outlined a digital strategy that defined tech innovations as key to helping prevent, diagnose and treat different diseases. The med-tech sector and the NHS have become close partners, with private and public organisations working together to ensure innovation and science continue to improve the patient experience. Some of the technologies that have been adopted by the NHS in recent years include Microsoft’s InnerEye, which assists physicians to mark-up scans for prostate cancer patients and Babylon Health, whose app uses artificial intelligence (AI) to assess medical symptoms and to help ‘triage’ patients with very basic queries. Another area where AI is helping to advance healthcare is in cardiovascular medicine. Cardiologists can now use AI’s capabilities to collate data, and as more and more of it is gathered and analysed with deep learning technology, doctors can better understand how severe a condition may be, accurately diagnose the disease or explain to patients why they are experiencing certain symptoms. In turn, this can help to formulate the most effective treatment plan possible. HeartFlow is the result of research I began while a doctoral student at Stanford University more than 20 years ago. I worked with Christopher Zarins, MD, former chief of vascular surgery at Stanford University and an expert in vascular biology, and Thomas J.R. 
Hughes, PhD, former professor of mechanical engineering at Stanford University and a leading expert in computational fluid dynamics, to develop computing technology to model blood flow in arteries from medical imaging data.  As a professor in the departments of Bioengineering and Surgery at Stanford, I focused on developing this technology for more than a decade before Dr. Zarins and I went on to found HeartFlow in 2007. At HeartFlow, we brought experts in bioengineering, computational fluid dynamics, cloud computing, computer vision, and artificial intelligence together to revolutionise the diagnosis of heart disease. Supported by robust clinical evidence, including numerous clinical trials and hundreds of peer-reviewed scientific publications, the HeartFlow Analysis received FDA clearance in 2014. HeartFlow has also made its way across the Atlantic where it has caught the attention of the NHS. The condition Coronary heart disease (CHD) is the UK’s biggest killer – responsible for more than 66,000 deaths a year across the country. It affects more people than breast cancer and prostate cancer combined. The wide variety of symptoms exhibited by patients with this condition makes it particularly challenging to diagnose – sufferers can experience everything from feelings of indigestion and jaw pain to breathlessness and tightness in the chest. A lack of public awareness of CHD symptoms may also contribute to the problem. A recent YouGov survey found that two-thirds of respondents could name chest pain or tightness as a symptom of heart disease, while just 16% knew that similar feelings in the abdomen are also symptoms. Even when patients do seek out medical treatment, they often have to wait several weeks to be seen by a doctor and undergo multiple tests before a diagnosis can be made. All in all, it’s tricky to swiftly and effectively diagnose CHD. Technology goes beyond diagnosis HeartFlow is helping to streamline the CHD evaluation process. The technology uses deep learning and artificial intelligence to help cardiologists better understand blood flow in a patient’s arteries. Under the NHS’s Innovation and Technology Payment Programme, the HeartFlow Analysis is being fast-tracked in the UK and is now available in more than 30 hospitals. The process starts with the hospital securely uploading a CT scan into HeartFlow’s cloud-based software system hosted on Amazon Web Services (AWS). HeartFlow leverages deep learning and highly trained analysts to create a personalised digital 3D model of the patient’s coronary arteries. Next, powerful algorithms solve millions of complex equations that simulate blood flow within the model, so that the impact of any blockages on the arteries can be assessed. The completed analysis is securely transferred back to the hospital and allows cardiologists to interact with the 3D digital model: selecting specific areas to investigate, zooming in and out, and rotating the image. This enables a level of examination of blood flow in the arteries ordinarily only possible through an invasive procedure. The benefits of such an approach are clear. Patients can receive a diagnosis without undergoing a surgical procedure, and physicians can have greater confidence in their patients’ treatment and management plans. The future of AI in healthcare Technology is constantly developing, and recent years have seen its application in healthcare accelerate rapidly. 
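As background to the "millions of complex equations" mentioned above: computational fluid dynamics models of this kind are generally built on the incompressible Navier–Stokes equations. The form below is the standard textbook statement, included only as an illustration, not a description of HeartFlow's proprietary solver.

```latex
% Incompressible Navier--Stokes equations for blood velocity u and pressure p,
% solved over the patient-specific 3D artery geometry.
\rho\left(\frac{\partial \mathbf{u}}{\partial t} + (\mathbf{u}\cdot\nabla)\mathbf{u}\right)
  = -\nabla p + \mu\,\nabla^{2}\mathbf{u},
\qquad
\nabla\cdot\mathbf{u} = 0
```

Here ρ is blood density and μ its viscosity (blood is commonly approximated as a Newtonian fluid in the larger arteries); solving for the velocity and pressure fields across the modelled vessels is what reveals how much a narrowing restricts flow downstream.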
HeartFlow is leading the way in this, applying artificial intelligence to help make the diagnostic process for heart disease more accurate and efficient. There is still so much room for further development, though. Technology will most certainly play a vital role in helping to tackle our health challenges. And in the hands of the medical professionals who interpret the data and put it to work, it has the potential to improve and save countless lives.[/vc_column_text][/vc_column][/vc_row] ### CFOs: are you innovating? [vc_row][vc_column][vc_column_text]In a modern, global business environment where competitors are constantly innovating, and start-ups are shaking up industries by using emerging technology in novel ways, the finance department has been under pressure to not only take care of the numbers but to innovate. CFOs are now being expected to lead the way and take a more strategic role to elevate finance beyond its traditional remit. The balancing act When it comes to new technologies, organisations have primarily invested in customer-facing solutions. But businesses focussing their efforts solely on more visible customer-facing capabilities are missing an important part of the smooth running of the organisation: business support functions. Without this investment, these organisations are slower and less accurate at forecasting and planning than organisations who invest equally in both. A study from FSN Modern Finance Forum found that organisations that approach technology investment as a double-pronged approach find that they are better at nurturing innovation. These businesses are more easily able to share ideas and cross-organisation skills and are more willing to make mistakes as they grow: innovation is the top priority. The study also found that businesses that balance their customer-facing and business support functions face fewer obstacles like in-house politics or risk aversion, and they find it easier to spot talent to encourage further innovation. Investing in customer-facing systems is more immediately noticeable to customers, but business support function investment is what will drive an overall better customer experience. Without this foundation, a unique omnichannel experience is almost impossible. By seeing both operational and financial data in one place, finance departments can have a superior, 360-degree view of the organisation—allowing data-driven decision-making, a more personalised customer experience and faster delivery of services. Overcoming roadblocks on the path to innovation Innovation in all forms requires a significant change, even if the finance function knows exactly what type of innovation it requires, and there are often many obstacles getting in the way of this change. According to the FSN global survey, culture, time and a lack of credible and accurate measures of success, like proving ROI, are all key hurdles to overcome in this process. If there is not a strong organisational buy-in being driven by senior leadership, change can be hard to implement. This is especially true in environments where mistakes are relentlessly punished, which stifles naturally-innovative employees. Equally, if the finance function is not viewed as a source of innovation by the rest of the management team, it will be difficult to sell the value of innovation within it and make a strong case for investment. CFOs and their senior executives need to be able to measure ROI to create a successful proposal for innovative investment. 
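For context, the "traditional" calculation usually looks something like the sketch below. Every figure is invented purely for illustration; it simply shows the kind of proposal a board would expect to see.

```python
# Hypothetical example of a traditional ROI and payback calculation for a finance
# automation project. All numbers are invented for illustration.
automation_cost = 150_000        # software, implementation and training
annual_benefit = 90_000          # finance-team time saved, valued at salary cost
years = 3

total_benefit = annual_benefit * years
roi = (total_benefit - automation_cost) / automation_cost
payback_months = automation_cost / (annual_benefit / 12)

print(f"{years}-year ROI: {roi:.0%}")                  # 80%
print(f"Payback period: {payback_months:.0f} months")  # 20 months
```

The limitation is obvious: a sum like this captures time saved but not faster closes, better forecasts or improved decision-making, which is precisely the tension the survey highlights.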
However, the FSN survey found that there are conflicting views on how to go about this. According to the survey, 75 percent of CFOs believe innovation success can’t be measured with traditional methods of calculating, arguing that these methods cannot adequately capture the intangible benefits of digital innovation. However, sometimes, the cultural nature of the landscape in which they operate can impact an organisation’s propensity to innovate and drive change. The study notes that North American organisations are more willing to take risks, whereas European counterparts are traditionally warier of the perceived risks surrounding change. Yet, the opportunity is waiting to be grasped. For European organisations, a risk-taking culture shift could help to attract top talent, break new ground and pave the way for the performance gains they seek while encouraging creativity across other core business lines. One of the most tangible benefits and measures of ROI for digital innovation is time. The rewards of automation are manifold, freeing up finance teams to focus on strategic thinking, and offering improved insights in less time. However, time is also a major reason why innovation is neglected. The FSN survey found that 67 percent of CFOs say that too much resource is tied up in legacy systems and traditional ways of working, which leaves very little room to innovate. Without overcoming these obstacles, the finance function is in danger of falling behind the rest of the organisation. This is a huge risk to the organisation: with the finance team driving so many business decisions, innovation here is crucial to remain in-line with, or ahead of, competitors. Join the movement early The truly innovative finance functions are those that adopt technology early and actively encourage a culture of innovation across the business. These CFOs can forecast quicker, close books faster, plan and budget more effectively and forecast more accurately than those who don’t set aside resource to innovate. Initial investment may be tricky, but in the long run, the benefits far outweigh the cost.[/vc_column_text][/vc_column][/vc_row] ### Cloud service provider: Which questions should you be asking your CSP? For most organisations, it is no longer a question of whether to adopt cloud, either wholly or for specific services, but which services to move and when to move them. Having architected and sized the chosen services, they then need to select the most appropriate cloud service provider(s). If they have done their preparation correctly, the choice of platform is almost immaterial, as these days almost all the technology is pretty good. [clickToTweet tweet="Most of the #PublicCloud service providers are very capable, however there is a multitude of providers offering different types of managed #cloud service. E.G. to host #legacy services until an appropriate public #CloudService becomes available." quote="Most of the public cloud service providers are very capable, however there is a multitude of providers offering different types of managed cloud service. E.G. to host legacy services until an appropriate public cloud service becomes available."] This does not mean the choice of provider is straightforward. While most of the public cloud service providers are very capable, there are significant differences in everything from design criteria, billing models and contractual terms and conditions to SLAs and data recovery terms. 
There are also a multitude of providers offering different types of managed cloud service, perhaps to host legacy services until an appropriate public cloud service becomes available. These too have their own T&Cs, SLAs etc., as well as the legal jurisdictions where data is held. These are important considerations when ensuring services are GDPR (General Data Protection Regulation) compliant. It is vital to know exactly what your organisation is signing up for to avoid problems in the future, so we have developed a checklist which we have been using with our customers to help them compare different options. As we help them define and negotiate cloud contracts, this has been refined to address the most common pitfalls and misunderstandings, and to help them make a realistic evaluation and comparison between different cloud service suppliers. Availability and usage The first consideration is whether your service requires persistent (reserved), non-persistent (on demand) or metered instances. This will depend to some extent on whether it is required 24 x 7 x 365, but there are other considerations too. Most applications require additional systems such as login/authentication, network etc., which need to be powered up beforehand, so a 9-5 requirement quickly becomes 7-9. Shutting down and restarting has to be sequenced, and some employees will want access outside core hours, so one of cloud’s specific cost-saving capabilities is potentially less useful than it could be as 9-5 quickly becomes 24x7. With availability decided on, you need to ask potential providers whether their service offers this and how cost effective it is. It does not happen very often, but AWS’ terms and conditions allow them to shut down on-demand instances without any reference to the client. If there are specific times that your service must be available, you need to know whether the provider will ensure these within a non-persistent service. With metered services, ask the provider what guarantees they will give that all capacity is available even if not used, and find out what actually constitutes usage. Several applications generate keep-alive packets to ensure availability, and these can be used by providers offering metered instances as the basis for charging even when services are not actually being ‘used’. Optimisation and granularity Different cloud providers handle charging in different ways, so it is vital to understand the characteristics of the service being migrated. Will general purpose instances suffice or are computer, memory or storage optimised instances needed? Costs vary dramatically from individual suppliers as well as between providers. For example, Microsoft Azure has five storage options, each with different dependencies. All need to be understood, compared and evaluated when choosing a service. More generally, you need to find out what is included in the charging model and what is an extra. If an extra, how is it charged and what is the likely impact on overall charges? For example, for an IaaS instance in AWS, there is a minimum of five and potentially eight metered costs that need to be tracked for a single Internet-facing server. Azure and other public cloud services are comparable. The complexity increases if your organisation is hosting multiple server environments and if other elements are required to run the application, such as security, resilience, management, patching and back-up, which will appear as additional charges. 
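As a rough illustration of why this tracking matters, the sketch below totals the separate metered charges that can apply to a single Internet-facing IaaS server. The line items and unit rates are hypothetical placeholders, not published AWS or Azure pricing; the point is the number of independent meters, not the figures.

```python
# Illustrative monthly cost model for one Internet-facing IaaS server.
# Each entry is (units consumed per month, price per unit in GBP) - all figures invented.
metered_usage = {
    "compute hours":           (730,  0.05),
    "block storage (GB)":      (200,  0.10),
    "snapshot storage (GB)":   (150,  0.05),
    "data transfer out (GB)":  (500,  0.08),
    "provisioned IOPS":        (1000, 0.06),
    "load balancer hours":     (730,  0.02),
    "monitoring metrics":      (20,   0.25),
    "public IP hours":         (730,  0.004),
}

total = 0.0
for component, (units, unit_price) in metered_usage.items():
    cost = units * unit_price
    total += cost
    print(f"{component:<24} £{cost:8.2f}")
print(f"{'estimated monthly total':<24} £{total:8.2f}")
```

Multiply this by dozens of servers, add the resilience, management and patching services mentioned above, and the value of modelling costs before migration becomes clear.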
This is less of an issue with SaaS, which usually has a standard per user per month charge. Security considerations Maintaining security of cloud services is crucial. First, consider the security classification/business impact of the data within the service. Does this mandate physical location awareness and, if so, where will your organisation’s data be stored? What security, access, audit and compliance controls need to be in place and can the provider guarantee them? If so, how – self-certification or independent testing and validation? Then consider how the potential supplier operates. If they adhere to recognised security standards, they should be able to prove that they have the relevant controls in place. If not, you need to find out how they will guarantee that their infrastructure is secure and patching is up to date. Providers which have to meet public sector requirements will be regularly audited and tested by independent external providers to ensure that they meet the latest security standards and will have tested and audited procedures for dealing with any security incidents. Your organisation is responsible for asking your chosen cloud provider(s) to deliver the appropriate levels of information security and you need to measure and audit the provider to ensure this is applied. This is particularly true with IaaS, less so with PaaS and SaaS. Irrespective of who hosts the data, under both the Data Protection Act and GDPR your organisation retains responsibility for the security of its data. Resilience Resilience is another area where it is important to look under the bonnet to find out what is really being offered. You will be charged for transferring data between domains, so to understand costs you need to know the frequency and size of snapshots and the rate of change of data. If the standard offering does not meet your organisation’s requirements, additional resilience may be available – but what exactly is offered and what are the costs? You should also examine services guarantees closely and find out what compensation is offered if these are not met. A major loss of services such as a data centre failure, security breach or other outages, or even reduced performance, could create significant issues for your business. Under most public cloud service SLAs, the cloud provider will apologise and refund a proportion of the monthly service fee. This recompense covers a very small proportion of the disruption you may have incurred, so needs to be evaluated carefully before moving a business critical service. You should also consider whether to have primary and recovery services, where applicable, hosted by the same supplier, and whether you have or need an independent backup to restore from in extremis. Cloud service management, processes and contracts Look into the details of how the service is run. If operational management is via a portal, find out how the supplier handles escalation and service updates. What processes do they use for Problem Management or Major Incident Management, and do they have SLAs? You need to be confident that the way the supplier operates fits the way your organisation needs to operate. With public cloud, you are unlikely to be able to persuade providers to revise their processes to suit your organisation, so will be better off talking to private and virtual private cloud providers. Making changes to standard terms will always impact on costs, so you need to decide if the business benefits are worthwhile over the contract term. 
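When weighing whether better-than-standard terms are worth paying for, a simple comparison of the standard service credit against the real cost of an outage is often enough to frame the decision. The figures below are entirely hypothetical and exist only to show the shape of the calculation.

```python
# Hypothetical comparison of a standard SLA service credit against the business cost of
# an outage. All figures are invented for illustration.
monthly_fee = 4_000                 # GBP per month for the cloud service
credit_rate = 0.10                  # assumed 10% service credit for a missed SLA
outage_hours = 6
cost_of_downtime_per_hour = 8_000   # lost revenue plus idle staff time

service_credit = monthly_fee * credit_rate
business_impact = outage_hours * cost_of_downtime_per_hour

print(f"SLA credit received: £{service_credit:,.0f}")    # £400
print(f"Cost of disruption:  £{business_impact:,.0f}")   # £48,000
print(f"Shortfall:           £{business_impact - service_credit:,.0f}")
```

If the shortfall dwarfs the premium for stronger resilience or independent backup, the business case for paying more over the contract term largely makes itself.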
You also need to consider contract flexibility – in particular, whether there are exit or data transfer costs should your organisation wish to switch suppliers. Cultural fit Finally, think about the cultural fit between your organisation and a potential provider. This may seem trivial, but your organisation is potentially entering into a multi-year agreement which will impact the services it offers its end users. It helps to ensure that all parties are aligned before committing to any agreements. ### How Technology Is Changing the Supply Chain as We Know It The supply chain is no longer the rigid, human-centric protocol of the past. Instead of positioning real employees in every step of the process, more companies realize the value in supply chain automation. Not only does it free up employees for other areas, but next-gen automation might also be necessary to gain control over big data and put these massive datasets to good use. Radio Frequency Identification Radio frequency identification – RFID – is huge in the current supply chain. It also has the potential to transform the future of manufacturing on behalf of countless brands. Companies around the world are adopting RFID to support various activities in the supply chain, including asset and inventory tracking. Marks & Spencer, based out of the United Kingdom, began using RFID in 2001. One of the earliest adopters of the technology, they're currently using RFID tags to ensure timely and accurate deliveries of trays containing fresh food, flowers and plants. Their initial foray into the world of RFID was so successful, they upgraded their system multiple times since then. By monitoring the movement of goods – both during production phases and during final shipping – manufacturers can identify possible bottlenecks, eliminate redundant or unnecessary stops and streamline the entire process. The Internet of Things With some companies still warming up to ideas like online business, e-commerce and social media, the most tech-savvy companies are graduating to the next level of connectivity with the Internet of Things. Commonly abbreviated IoT, the platform gained a lot of momentum in 2017. It's expected to continue its evolution and popularity throughout 2018 and beyond. According to recent research by Gartner, more than half of all new major business processes and systems will utilize the IoT by 2020. The IoT isn't the wave of the future – it's already here. Volvo, a car manufacturer headquartered in Sweden, uses cloud technology and the IoT to support and streamline supply chain logistics in several different ways. Apart from relying on the technology to order assembly components, they also use the IoT when shipping finished vehicles around the world. Not only does it ensure accuracy and timely delivery, but it also opens the door for real-time communications in the event of an error or issue. Many of the IoT's benefits to the supply chain revolve around operational efficiencies like asset tracking and inventory management, as mentioned above. The IoT also establishes brand-new revenue opportunities by enhancing the overall customer experience and facilitating greater customer service. Business Intelligence and Artificial Intelligence In many ways, business intelligence – BI – goes hand-in-hand with artificial intelligence – AI. With so much emphasis placed on the research and development of AI, it's easy to see the potential for a real supply chain transformation. 
Advanced AI and BI are already used in predictive analytics, long-term forecasting, reporting and more. Traditionally, skilled employees handle many of these tasks. With the dawn of BI and AI, next-gen computer systems are taking over some of the more menial and monotonous activities. With the dawn of BI and AI, next-gen computer systems are taking over some of the more menial and monotonous activities. As a result, the most skilled workers transition to other, more important jobs. Wal-Mart currently uses HANA, SAP's proprietary cloud platform, to enable automated, data-driven decision-making and analytics. Other organizations use this machine-learning framework in similar ways. As a result, 10 of HANA's biggest and most prolific users are forecasting five-year ROIs of 575%. Cloud Computing If BI and AI are interchangeable, then so are the IoT and cloud computing. Many IoT applications wouldn't even be possible without the modern cloud. Companies like Rockwell Automation, a top provider of machine sensors and controllers, use the cloud to support their clients in numerous ways. Some of their most significant IoT-cloud operations include offshore drilling efforts in Alaska, ongoing data collection from liquid natural gas – LNG – pumps and support of various manufacturing operations. Not only does this strengthen operational efficiency across the board, but it also gives senior officials at Rockwell even further insight into industry trends, future forecasts and more. According to Gary Pearsons, vice president and general manager with Rockwell Automation's Services Business, their embrace of IoT and cloud computing "enable unprecedented efficiency" in all of their day-to-day activities. With so much enthusiasm coming from a company of such size and scale, it's hard to ignore these two technologies and their potential to revolutionize the supply chain as we know it. Overcoming the Challenges of the Next-Gen Supply Chain Powered by highly advanced and sophisticated technology, the next-gen supply chain will make it easier to compete in the 21st century. [clickToTweet tweet="'Powered by highly advanced and sophisticated #technology, the next-gen #SupplyChain will make it easier to compete in the 21st century.'" quote="Powered by highly advanced and sophisticated technology, the next-gen supply chain will make it easier to compete in the 21st century."] But there are some obstacles to overcome before we understand the full potential of innovations like RFID, the IoT, BI, AI and cloud computing. While we made a lot of progress in 2017, and we'll continue to tread new ground in 2018, there's still a long path to follow before we see a widespread embrace of the new supply chain. ### Moving to the Cloud? Take the Marie Kondo Approach to Decluttering First [vc_row][vc_column][vc_column_text]Moving to the cloud is much like moving house. When you move house, it’s simple to have someone box up all your belongings and move it all to your new home. After all, it takes much less immediate effort, particularly while you’re frantically sorting out everything else that moving day demands. But there’s a problem with simply calling in the movers – in all likelihood, you will have just paid to move a whole host of ‘stuff’ that you no longer need, use or want. You might even have invested in a bigger, more expensive house to accommodate all these old things when a smaller, more affordable home would actually have been a better fit – if only you’d decluttered first. 
The same principle holds true for moving to the cloud. For many CIOs, it’s tempting – while juggling the daily business demands on the IT department – to simply ‘lift and shift’ everything from the on-premise environment to a cloud environment, and think you’ll refine it after the move. But just like that new year’s resolution to finally clear out the garage, those best-laid plans rarely come to fruition. Because of this, taking the time to declutter your IT environment first will help you save on costs and create clean, efficient systems to drive your business forward once you’re in a cloud environment. Does your IT infrastructure spark joy? Ever since Netflix ran its hit series featuring Marie Kondo, a Japanese decluttering expert, people around the world are ridding their homes of objects that no longer ‘spark joy’ in a bid to get organised. The KonMari approach (only keeping hold of the things which retain use to us) might, at first glance, seem to have little relevance to an IT Decision Maker’s (ITDM) state of mind. But look a little deeper, and there’s plenty for businesses to learn from using a similar approach when moving to the cloud. Much like cleaning out the attic, ITDMs should use a move to the cloud as an opportunity to reassess their existing applications and infrastructure and determine which must be migrated, adapted, or left behind. There are five key steps to take when moving to the cloud to make sure you only take what you need: Assess: Firstly, work with a Cloud Project Manager to assess the infrastructure on premise. Running a specially designed script on the systems will generate a report showing what is in place, the health of the various systems, and if any of them could prevent or slow migration to the cloud. This is the equivalent to looking down the back of the sofa, or in the garage, to discover exactly what’s there before making the move. Envision: Next, that list must be analysed using a tool such as a Data Migration Assistant, to determine which of it should be kept, discarded or updated. For example, if there are systems coming to the end of their life, like Microsoft SQL 2008, it makes sense to use the move to upgrade them. Similarly, if you have a database but 20 percent of the data in it isn’t used, why pay for space for it to reside in the cloud? Much like having a keep, sell or throw out box when moving house, use this stage to identify which of your data estate is worth moving, storing or deleting – and ascertain what elements might present challenges in migrating it to the cloud. Design: Following this, assess which third party integrators have the skills to design the ideal solution based on the findings (e.g. if you’re moving to Azure, find an Azure expert to help). For instance, remember that 20 percent of data that you need to save but don’t need to access regularly? It could be left on premise, housed in more affordable cloud archives, or in cold tier storage to save costs. Migrate: Now, you will have your plan and roadmap to hand, and be ready to migrate. Systems can then be rehosted, refactored, rearchitected or rebuilt for maximum efficiency and minimal downtime. In short, the moving trucks are here. Manage: Finally, once the migration is complete, ongoing management of the cloud environment is critical to ensure costs don’t escalate. Putting in place an overarching cloud management service will help to manage and optimise costs, and ensure you receive better, more affordable support. 
The best services will also give you regular advisory hours, with access to technical help to resolve any issues or to plan for new projects. With Gartner predicting that the worldwide public cloud services market will grow to $206.2bn in 2019, up from $175.8bn in 2018, the pace of cloud adoption is only set to accelerate this year. Moving to the cloud is a significant investment for any organisation, so ensuring a move delivers return on investment and meets business objectives should be top of mind for ITDMs. Taking the opportunity before any migration to streamline and declutter the IT environment will pay dividends further down the line, and ensure successful digital transformation. ### How a customisable ITSM solution can be a platform for success Customer satisfaction is a key goal for any Managed Service Provider (MSP). If the client is not happy with the service being offered, they are likely to vote with their feet and move to another MSP. So, responding quickly and effectively to requests must be a priority for MSPs. Yet MSPs are also businesses and need to service as many clients as possible to be profitable. Getting the balance right between quality of service and quantity of clients served is essential to running a successful business. Spend too much time keeping a customer happy and the business won't make money. Conversely, if a business has too many clients there could be difficulty in meeting service level agreements. For MSPs such as ourselves, the ITSM platform is fundamental to this balancing act. A quality ITSM, or IT Service Management, platform will help streamline efficiencies for both the business and its clients, enabling staff to resolve customer issues in an effective and timely fashion. Flexibility and customisation For the ITSM platform to deliver the benefits an organisation wants to achieve, it must have flexibility and the potential for customisation; otherwise a business is restricted by the capabilities of that particular platform. At Utilize, the rigidity of our previous ITSM solution meant we were forced to develop processes to fit the package, not the business. This is not a good situation to be in for a number of reasons. For example, creating a unique selling point and introducing new services to the market are next to impossible if an organisation has to change its business model to fit in with what is effectively one of its business tools. Also, the full potential of the skills and experience of staff can never truly be realised if they are stifled by such constraints. So, we looked for an ITSM platform that could be built around our own business needs, whilst having the flexibility and scalability to change and grow as we do. One of our key criteria was that the ITSM platform had to be customisable by the team in-house; after all, nobody knows better what a business needs than those working within it. For this to happen, changes should be easily made to the platform without the need for any code. It was with these objectives in mind that we partnered with Cherwell Software, who were able to deliver what we were looking for and could provide unparalleled support. When deciding on what the new platform customisations should be, it's important to get the team involved. So, we set up engineer focus groups, something we'd recommend to other MSPs if they are undertaking similar projects.
In these sessions, engineers can discuss what has worked and what hasn't, explore metrics and use customer feedback to recommend changes to the platform. Metrics explored can include helpdesk efficiency, call closure rates and customer satisfaction through the tracking of the Net Promoter Score. To ensure the smooth running of the business, a customised platform should be robust and sophisticated enough not to encounter any issues with routine system updates. Automation and integration To really drive efficiencies, the ITSM platform must support process automation and the integration of third-party solutions and services. From our own experience, being able to integrate Microsoft Partner Centre with our Cherwell Software ITSM platform has been a game changer. Previously, if customers wanted to adjust or renew Microsoft Office 365 subscriptions, we would have to send out quotes that would need to be signed and returned. This was followed by a manual process internally where our purchasing department had to adjust the subscription, and then accounts would raise the relevant invoices and adjust the contracts. Now, however, a customer can adjust their subscription via the portal, and the change goes live automatically in the Microsoft Partner Centre and their Office 365 tenant. The invoice can then be downloaded, and the customer is billed accordingly. The benefit of automation is that the platform drives processes through a business rather than having to rely on staff. Automating the Microsoft Partner Centre integration has saved us two hours per order, equating to a significant 120 hours per month, freeing up staff to get on with other tasks. How to use ITSM monitoring to scale the business Through ITSM customisation it's possible to better manage and monitor the performance of teams and staff by setting KPIs and creating dashboards. This can then inform training requirements, role changes, allocation of workloads and potentially recruitment requirements. In terms of customer service, using integrated reporting tools means we can track trends and recurring issues and get to the root cause of an issue to prevent it returning. Trend reporting can also be used to help plan customer upgrades and training needs. Providing customers with easier-to-use systems that deliver quicker responses and turnarounds naturally encourages greater use of the system. For example, we have increased portal requests by nearly 25% since the Cherwell deployment, along with positive feedback from customers. Greater customer satisfaction and use of a flexible and customisable ITSM solution can only be good news for ambitious MSPs. ### Why Mass Data Fragmentation Matters The move to the cloud has brought about a revolution in the IT industry. A recent survey from IDG found that 73% of organisations have at least one application or a portion of their computing infrastructure in the cloud already, and that is expected to grow to 90% in 2019.[1] But the cloud isn't necessarily a recipe for success. In fact, in many cases, the proliferation of multiple cloud applications for storage has provided IT teams with major headaches when it comes to data storage and management. Mass data fragmentation a business threat It's an often-repeated mantra nowadays that companies are using data to 'unlock' value for business.
Although data certainly can be valuable, it must be managed correctly or else it may just end up becoming a hindrance and an obstacle. Unfortunately, data management is a bigger challenge than it seems on paper. We all know data is being created in volumes that are almost impossible to comprehend. 90% of all the data ever generated has been created in the past five years, and that rate shows no sign of slowing down. But data growth isn't the only issue to keep in mind. A phenomenon called mass data fragmentation (a technical way of saying "unstructured data that is scattered everywhere, in the cloud and on-premises") is the issue that seems to be wreaking havoc on businesses around the globe. With mass data fragmentation, massive volumes of what's called secondary data are stored in a dizzying array of legacy infrastructure silos that don't integrate and are therefore incredibly challenging and costly to manage, burdening an already over-burdened IT staff. Instead of driving digital transformation, data that's housed in this fashion becomes a hindrance to it. In case you are wondering what secondary data is, it's all the data that's not considered primary data. Primary data is your most critical data, data with the highest service level agreements. Secondary data includes data that's stored in backups, archives, file shares and object stores, and data that's used for testing, development and analytics purposes. You may be surprised to know that secondary data comprises roughly 80% of an enterprise's total data pool! Mass data fragmentation is not just a management challenge. The repercussions of mass data fragmentation can be much worse. All this fragmentation makes it incredibly difficult to know what data you've got, where it's located and what it contains. This raises huge compliance vulnerabilities, which of course can lead to fines and reputational damage. What's more, it's nearly impossible to derive any meaningful insights from all that data. And, remember, we're talking 80% of a company's overall data pool. If you can't get the right insights from your data at the right time, that can lead to major competitive threats and lacklustre customer experiences. Let's probe this further and look at recent survey data that shines a bright light on what's happening within enterprises globally. Cohesity surveyed 900 IT leaders at global enterprises across the UK, US, France, Germany, Australia and Japan, with an average revenue of $10.5bn. Data silos create real management challenges According to the study, close to a third of organizations use six or more solutions for their secondary data operations, and most IT managers store data in two to five public clouds. By using multiple products and the cloud to manage data silos, the burden on IT becomes staggering, and the resource drain starts to creep upwards. Additionally, 63% of organisations have between four and 15 copies of the same data. And as IT struggles to keep up with this dizzying array of copies, the cost to store all these copies also rises, and additionally begins creating compliance challenges for organizations, as we referenced above. IT Teams being stretched 87% of IT leaders believe their organisation's secondary data is or will become nearly impossible to manage long term.
According to the survey, if IT is expected to manage all the organization's secondary data and apps across all locations and technology isn't in place to accomplish that goal, 42 percent say morale would decrease, which is important to keep in mind at a time when finding and hiring top talent is increasingly difficult. As many as 38% of IT leaders fear massive turnover of the IT team, 26% fear they (or members of their team) will consider quitting their jobs, and 43% fear the culture within the IT team will take a nosedive. These findings offer a clear indication that IT teams would prefer to be working on something else – ideally, delivering innovation to the business rather than trying to manage a complicated web of data silos that's only getting more complex over time. Of those respondents who believe their data is fragmented, 49% of the IT leaders surveyed said that failing to address the problems of mass data fragmentation will put their organisation at a competitive disadvantage, while 47% believed that customer experience would suffer. Moving forward, almost 100% believe it will consume significantly more time – up to 16 additional weeks per year – without more effective tools. Resource reallocation could have a huge impact on business 91% of senior IT decision makers said that if half the IT resources spent managing their organisation's secondary data were reallocated to other business-critical IT actions, it could have a positive impact on revenues over a five-year period. Of those who share this belief, nearly 30% of respondents believe this adjustment could increase revenues by at least 6% over a five-year period, while nearly 10% believe it could impact revenues by 8-10% or more. With the companies surveyed averaging revenues above $10bn (€8.7bn), that would translate to roughly $800m to $1bn (€885m) in new revenue opportunities. The search for a solution has been driven by the desire for a holistic view of a company's secondary data on a single platform. To lead this process, organisations need to be able to consolidate data silos onto a single hyperconverged, web-scale platform. Through this platform, they can then take control of their secondary data and immediately address and solve challenges pertaining to mass data fragmentation. With the pressures of digital transformation firmly placed on every business, organisations need to determine how they can take more of a data-first approach, similar to Amazon, Netflix and other data-driven companies. This can create new opportunities to differentiate against competitors and delight customers while advancing your bottom line. [1] https://www.infoworld.com/article/3297397/cloud-computing/cloud-computing-2018-how-enterprise-adoption-is-taking-shape.html ### AI in the Contact Centre – Striking a Balance The ongoing advance of artificial intelligence (AI) represents a real opportunity for organisations to offer enhanced customer service levels without increasing business cost. There are a raft of ways in which AI can be used today to drive benefits for organisations and the customers they engage with. We are already seeing much greater use of robot agents at the front end of customer service, particularly when dealing with the simpler, more routine tasks in the contact centre. Chatbots are great at answering routine questions quickly. Their fast processing speed means they can rapidly provide answers to standard enquiries.
Unlike their human counterparts, they are always available. Moreover, they excel at triaging customers, interfacing with self-service applications and intelligently searching for information to help resolve queries quickly. They empower digital self-service, but they also empower the human agent, freeing up their time for the value-added, non-routine tasks: dealing with exceptions and offering empathy to high-value customers. There is a wide range of other AI applications in use in the contact centre today. AI can be working in the background of a customer interaction to gather relevant information and present it to the agent to help resolve the customer's query quickly and efficiently. Real Time Speech Analytics can be used to effectively perform the role of a personal assistant to the agent. The solution analyses agent and customer speech to provide live feedback to agents, team leaders and quality assurance teams about what is being said and how it is being said. It monitors script adherence, speech clarity and even stress levels, all while the call is in progress. We also expect to see off-the-shelf bots being used increasingly over time to automate very specific tasks, such as password changes. In this vein, industry specialists will develop bots specific to their vertical – solutions designed to match the needs of the housing sector or financial services applications, for example. Sounding a Note of Caution The capability of the kind of technology we have referenced above is advancing all the time and, in line with that, there is a growing recognition that AI can add great value to an organisation's customer service efforts, but it can never be the whole answer across the customer service industry generally. A low-cost airline may want to automate almost all of its processes, but a retailer offering a high level of premium service will certainly want to continue to use human service as a key part of its value proposition and differentiation. There is a wide range of considerations businesses also need to take account of if they want to ensure they get their AI implementation right. First and foremost, they need to take their staff with them as they move to more advanced automation. Before any organisation deploys AI, it is worth considering that one of the biggest risk factors in any IT implementation, system upgrade or system change is the human users of that system. If the business does not communicate in an open, honest, transparent way how this technology is going to benefit them, it will find resistance. It needs to get people involved in the process: ensure it can test out the technology in a safe, sandbox environment and make sure people are comfortable with it before it even starts rolling the technology out. It is critical also that any AI tool brought into the contact centre has a defined business goal. Automation should never be implemented for its own sake. In light of this, it is important to give bots a job description. Humans work far better when they have clear targets and defined goals. AI is the same. Businesses need to give their AI-driven technology a defined goal. They also need to measure the performance of their robots in delivering against these targets and focus on making improvements. In short, businesses need to start treating their automation systems and chatbots like human agents: train and support them, and regularly monitor their progress to drive continuous improvement.
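To make the "job description" idea concrete, here is a minimal sketch of how a contact-centre team might score a chatbot against defined targets. It is purely illustrative and not drawn from any particular vendor's tooling; the log fields (handled_by_bot, escalated, csat) are assumptions for the example.

```python
# Illustrative only: score a chatbot against simple targets from interaction logs,
# assuming hypothetical fields "handled_by_bot", "escalated" and "csat" (satisfaction, 1-5).
from statistics import mean

def bot_kpis(interactions):
    """Return containment rate, escalation rate and average CSAT for bot-handled sessions."""
    bot_sessions = [i for i in interactions if i["handled_by_bot"]]
    if not bot_sessions:
        return {"containment_rate": 0.0, "escalation_rate": 0.0, "avg_csat": None}
    contained = [i for i in bot_sessions if not i["escalated"]]
    rated = [i["csat"] for i in bot_sessions if i.get("csat") is not None]
    return {
        "containment_rate": len(contained) / len(bot_sessions),
        "escalation_rate": 1 - len(contained) / len(bot_sessions),
        "avg_csat": mean(rated) if rated else None,
    }

# Example: two bot-handled sessions, one of which was escalated to a human agent.
logs = [
    {"handled_by_bot": True, "escalated": False, "csat": 5},
    {"handled_by_bot": True, "escalated": True, "csat": 3},
    {"handled_by_bot": False, "escalated": False, "csat": 4},
]
print(bot_kpis(logs))  # containment 0.5, escalation 0.5, average CSAT 4 for bot sessions
```

Reviewing numbers like these regularly, alongside customer feedback, is one simple way to treat a bot like any other agent with targets to hit.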
It is also important to note here that good knowledge management is key to good AI. Just like a human, a robot needs to have access to relevant knowledge and information to do its best job. Businesses need to ensure that when a question is answered in the contact centre, that knowledge is captured and delivered into the Knowledge Management System (KMS) to allow bots and human agents to feed off it. After all, how can AI be used to make decisions when it does not actually know anything? It can learn, but it needs relevant data to do that. That is why it is so important for businesses to have processes and procedures in place that enable them to feed accurate data and intelligence into the KMS. Striking the Right Balance Today, it is clear that the use of advanced automation and AI technologies in the contact centre is growing all the time – and there are a wide range of ways in which these tools can be used to enhance customer service. It is, however, also encouraging to note that among organisations there is growing awareness that these technologies can never form the basis of a one-size-fits-all or plug-and-play scenario. Businesses are becoming more aware that thought and effort need to go into ensuring these technologies can proactively support enhanced customer engagement and drive a better customer experience for organisations today. ### Cloud vs on-premise: striking a balance on security The popularity of cloud platforms has soared over recent years, as businesses everywhere take advantage of the increased flexibility, cost-effectiveness, and apparent data security they offer. According to a recent report, three in five security professionals believe the risk of a security breach is the same or lower in cloud environments compared to on-premise. Despite this, however, the cloud may not actually be as secure as people believe. Both cloud and on-premise environments can have equally devastating flaws. Opportunistic criminals are all too aware that many business strategies now tend to favour a shift to the cloud, especially within those organisations – such as large enterprises and government bodies – that they would consider to be high-value targets. As a result, cloud platforms are being aggressively targeted, and have become key strategic features on the global cyber and information warfare battlefield. The truth is that, although it offers a number of – very real – benefits, the cloud is no safer than on-premise infrastructure, and is made less so in migrations that don't prioritise cyber security. Visibility and control Data is, of course, the lifeblood of any business. The intelligence and insight it provides is what gives an organisation its competitive edge. Any conversation around security risks will therefore come down to the protection and control of an organisation's data, whether on-premise or in the cloud. So, it's hardly surprising that, in a desire to retain control over their data, most large businesses simply won't move it into the cloud. Doing so means they'll lose visibility of it, which they consider to be a significant business risk. Indeed, apart from the cloud service providers themselves, there are very few £250m+ companies that use the cloud exclusively. Instead, in order to maintain visibility of their data, most organisations operate a hybrid model; part on-premise, and part – mostly public-facing infrastructure – in the cloud.
Ultimately, a judgment must be made, based on the sensitivity of certain data, as to whether hosting that data in the cloud represents less of a security or business risk than hosting it on-premise. But data isn’t only valuable to an organisation’s business operations; it’s critical to its security posture too. Why, then, would an organisation give this data away to a tech giant?  Valuable data is being given away to companies that make their money from data. Doing so effectively enables cloud providers to map out all of the securities and insecurities within that organisation’s network. There’s a reason the cost of cloud is so low; hosting companies are getting far more than just CPU cycles, power, and energy. Fundamentally, to any business, data is value – it has to be protected, and visibility is key to this. After all, once it’s gone, it’s gone for good. Retaining control Corporate networks are becoming increasingly complicated, containing elements of both cloud and on-premise infrastructure. Protecting these networks, and the data that flows across them, requires a security infrastructure that both mirrors and is scalable to their growth.  On-premise infrastructure needs to be strong enough and offer complete visibility over an organisation’s data rates both now and in the future. Likewise, a virtual infrastructure is needed that can be deployed in the cloud, and that can be scaled out on demand, to meet an organisation’s changing demands, expansion plans and future data rates, all while still providing full visibility over its data.  Many organisations will utilise the cloud itself in strengthening their security provisions and threat intelligence. When training AI algorithms and machine learning-based anomaly detection systems, for example, organisations will often share threat intelligence data directly into the cloud. Doing so creates a security infrastructure which uses the cloud as central “brain”, from which up-to-date threat intelligence can be derived, and from which they can identify any potential threats that might resemble something seen on a global scale. Once again, however, this raises the question of control. It’s important that any threat intelligence comes directly to the organisation itself, rather than being given away to any third-party hosting its security infrastructure. Similarly, no organisation wants a third-party to have access to the data it’s using to train its AI.  There are clearly benefits to having access to wider threat intelligence, but using the cloud for security purposes is inherently complex. Ultimately, security infrastructure shouldn’t be mixed up with business. An organisation shouldn’t store the cloud services it’s monitoring on-premise, and vice versa; it shouldn’t store its on-premise security data in the cloud. Striking a balance There is a balance to be struck, one that boils down to classic risk management – business versus security.  On a technical level, the cloud is less secure than on-premise. Criminals see it as a high-value target and will attack it more frequently. But its flexibility, scalability and cost-effectiveness are often what businesses need to maintain a competitive edge. On-premise infrastructure, on the other hand, is more expensive, but it does offer organisations greater control over their data, and offers the added peace-of-mind of physical security features. The balance between on-premise and cloud infrastructure should therefore be tailored to an organisation’s needs at any given time. 
It should mirror an organisation's risk appetite and its business imperative. Furthermore, better standards for encryption and engineering are needed on a universal scale, underpinned by the latest legislation; as such, global and technical collaboration is needed by all. In the end, it's important to remember that cloud is actually no safer than on-premise. Given the value placed on data, it's vital that organisations prioritise the accompanying security infrastructure, and ensure it, too, mirrors business needs in the same way. ### IoT security: how to drive digital transformation whilst minimising risk In just a few short years, the Internet of Things (IoT) has radically reshaped the way we live and work. From the gadgets on our wrists and in our homes to the connected buildings and smart factories we spend our working lives in, it's making us safer, happier, and more productive whilst offering new growth opportunities for organisations. The technology is at the vanguard of a digital transformation revolution that is already permeating the vast majority of businesses, helping to make them more efficient, agile, and customer-centric. However, digital expansion also means greater digital risk: the latest figures reveal a 100% year-on-year increase in IoT attacks. The key for IT teams, therefore, is to support business growth whilst minimising these cyber risks through best practice security adapted for the new connected era. Going digital IoT has come a long way in a relatively short space of time. Gartner predicts that there will be 14.2 billion connected things in use in 2019, with the total reaching 25 billion by 2021. With this soaring volume of devices has come an explosion in data. Cisco estimates that the total amount of data created by all devices will reach 600 zettabytes per year by 2020, up from only 145 zettabytes annually in 2015. This is a huge amount of data – a single zettabyte is the equivalent of one billion terabytes. Why has the IoT begun to permeate business environments so completely over the past few years? A Forbes Insights poll of 700 execs from 2018 helps to explain. It finds that 60% of firms are expanding or transforming new lines of business, a similar number (63%) are delivering new or updated services to customers, and over a third (36%) are considering new business ventures. Across the coming 12 months, nearly all (94%) respondents predicted profits would grow by 5-15% thanks to IoT. IoT sensors, and the data they collect and send to the cloud for analysis, offer major new opportunities to enhance the customer experience, streamline complex business processes, and create new services thanks to the insights revealed. This could include utilities firms monitoring for the early warning signs of water leaks or power outages, manufacturing companies improving operational efficiencies on the factory floor, and hardware-makers providing new after-sales services for customers based on predictive maintenance. Risk is everywhere The challenge facing any firm ramping up its use of IoT and industrial IoT (IIoT) systems is that it will have expanded the corporate attack surface in the process. A quick look at OWASP's IoT Attack Surface Areas document illustrates just how many new possible points of weakness are introduced.
These range from the devices themselves — including firmware, memory, and physical/web interfaces — to backend APIs, connected cloud systems, network traffic, update mechanisms, and the mobile app. One of the biggest risks to such systems is that they are produced by manufacturers who don’t have an adequate understanding of IT security best practices. This can result in serious systemic weaknesses and vulnerabilities that can be exploited by attackers, from software flaws to products shipping with factory default logins. It could also mean the products are hard to update and/or there is no programme in place to even issue security patches. Such flaws could be exploited in a variety of attack scenarios. They could be used to breach corporate networks as part of a data-stealing raid, or sabotage key operational processes — either to extort money from the victim organisation or simply cause maximum disruption. In a more common scenario, hackers can scan the internet for public-facing devices still protected only by factory default or easy-to-guess/crack passwords, and conscript them into botnets. This is the model used by the infamous Mirai attackers and subsequently adapted in many other copycat attacks. These botnets can be used for a variety of tasks including distributed denial of service (DDoS), ad fraud, and spreading banking trojans. In the dark The difficulty for IT security teams is often gaining visibility over all the IoT endpoints in the organisation, and some IoT devices that are detected can often be operating without the knowledge of the IT department. This shadow IT factor is in many ways the successor to the enterprise BYOD challenge — except instead of mobile phones, users are bringing in smart wearables, and other appliances that then get connected to the corporate network, increasing cyber risk. Just consider a smart TV being hijacked by attackers to spy on board meetings, for example, or a smart kettle in a staff cafeteria used as a beachhead to launch a data-stealing raid against the corporate network. Even worse for IT security bosses is that they have to manage this growing cyber risk with fewer skilled professionals to call upon. In fact, demand for IoT roles soared 49% in Q4 2018 from the previous three months, according to one recruitment group. Regulators catch up The impact of any major IoT-related cyber threat could be severe. Data loss, IT systems outages, operational disruption, and similar can lead to financial and reputational damage. The cost of investigating and remediating a security breach alone can be prohibitive. A serious outage could require a major investment of funds into IT staff overtime. Regulatory fines are another increasingly important cost to bear in mind. Not only is there the GDPR to consider for any issues affecting customer or employees’ personal data, the EU’s NIS Directive also mandates best practice security for firms operating in specific critical industries. Both carry the same maximum fines. This regulatory oversight is encroaching on both sides of the Atlantic. A new piece of proposed legislation in the U.S. will require the National Institute of Standards and Technology (NIST) to draw up minimum security guidelines for IoT manufacturers and demand that federal agencies only buy from suppliers that meet these baseline standards. Visibility and control If passed, this U.S. legislation could well build on a European ETSI TS 103 645 standard, which itself was based on a UK government-proposed code of practice for the industry. 
It’s certainly a great start and will hopefully push the industry to be more security aware, while empowering consumers and businesses to seek out the more secure products on the market. But in reality, it will take a while for such steps to filter through to the market, and even then organisations may already be running thousands of insecure legacy IoT endpoints. IT security teams must act now, by gaining improved insight into their IoT environment. IT asset management tools should be able to help here by providing the all-important first step: visibility. Next, ensure device firmware is up-to-date via automated patch management tools and that this process is monitored to make sure it is working correctly – some security updates are buried deep within a vendor website and require human intervention to ensure these have been enabled. Additionally, IT teams must check that any usernames/passwords are immediately changed to strong and unique credentials. Even better, replace them with multi-factor authentication. Other best practices could include encryption of data in transit, continuous network monitoring, identity management, network segmentation, and more. Finally, don’t forget the people aspect of cybersecurity: employees should be taught to understand the security risks associated with IoT, and how to use systems and devices safely. This is the way to gain most value from any IoT initiatives, without exposing the enterprise to unnecessary extra risk. It takes just one serious security breach to undermine the innovation-powered growth so vital to the success of the modern digital business.[/vc_column_text][/vc_column][/vc_row] ### The ethics of artificial intelligence (AI) [vc_row][vc_column][vc_column_text]Addressing ethical issues surrounding AI will always be work-in-progress and will need to develop, as AI itself evolves. Imagine you’ve applied for a job or for a loan, and you’re told you’re unsuccessful. You’re curious as to why, and so you use GDPR legislation to request access to the information the company holds on you. You obtain your data – and at the same time, you discover that the decision was made using artificial intelligence (AI) algorithms that screened out your application for no obvious reason. Or imagine this. You discover that AI is being used for surveillance purposes at your place of work – and also that your employer is collecting and processing data relating to your health history using AI algorithms. In neither case has your consent been sought or obtained. In all these scenarios, it would be understandable if you felt pretty aggrieved. At the very least, it would damage the relationship you have with the organization employing the AI; at the worst, it might move you to consider taking legal action, and going public with your story. In short, while organizations are increasingly taking advantage of the benefits of AI, they must simultaneously be mindful of the consequences of their behavior. A recent study by the Capgemini Research Institute has found that consumers, employees and citizens will reward organizations that proactively show that their AI systems are ethical, fair and transparent. The “Why addressing ethical questions in AI will benefit organizations” study surveyed over 1,500 executives from large organizations across 10 countries, and over 4,400 consumers across six countries. 
Developing a plan of action The main findings of the study are perhaps obvious: that ethical concerns are deemed important by pretty much everyone who is served by or employed by organizations; that regulation is deemed desirable; and that companies are rewarded or punished in relation to the degree to which they are perceived to use AI ethically. However, the study goes beyond an analysis of people's attitudes to outline a course of action for organizations seeking to develop an ethical AI strategy. The recommended approach embraces all key stakeholders. For CxOs, business leaders and those with a remit for trust and ethics: establish a strategy and code of conduct for ethical AI; develop policies that define acceptable practices for the workforce and AI applications; create ethics governance structures and ensure accountability for AI systems; and build diverse teams to ensure sensitivity towards the full spectrum of ethical issues. For the customer- and employee-facing teams, such as HR, marketing, communications and customer service: ensure ethical usage of AI applications; educate and inform users to build trust in AI systems; empower users with more control and the ability to seek recourse; and proactively communicate on AI issues internally and externally to build trust. For AI, data and IT leaders and their teams: seek to make AI systems as transparent and understandable as possible, so as to gain users' trust; practice good data management, and mitigate potential biases in data; continuously monitor for precision and accuracy; and use technology tools to build ethics into AI. The need for continuity Perhaps the key take-away from the study is that the structured, planned approach to AI that I have summarized above can achieve two important aims. First, it will earn people's trust and loyalty, and achieve greater market share. And second, it will avert significant risks from a compliance, privacy, security, and reputational perspective. Of course, whatever organizations do won't and can't be a one-time fix. Addressing ethical issues surrounding AI will need to develop, as AI itself evolves: it will always be work-in-progress, or, to use that business cliché, a journey. And the sooner that journey begins, the better. ### Taking the fog out of cloud security Over the last 20 years, digital files of all descriptions have replaced their analogue ancestors. Databases, blueprints and all manner of company files are now created and stored digitally. CAD files have replaced blueprints, resting on servers and clouds rather than in filing cabinets. The convenience is immense. Files can be accessed on the go, anywhere in the world. International business has never been easier, but the convenience comes with a very heavy privacy price. If you utilize a cloud for storage then your data rests on someone else's server, someone else's computer. It is simply not safe, but businesses are still relocating data to the cloud at an unprecedented rate. A recent survey by McAfee found that most organizations store some or all of their sensitive data in the public cloud, with only 16 percent not storing any sensitive data in the cloud. Security remains poor, and heavy reliance on the cloud is cited as a main reason for data breaches. A recent report by Kaspersky put the average cost of a cyber attack on businesses from 2017 to 2018 up 24 percent from the previous year, and 38 percent higher than losses from 2015–2016.
The report states that small and medium businesses lose $120,000 per cyber incident on average — $32,000 more than in the previous 12 months. But if the NSA and the CIA — perhaps the most cyber-paranoid entities in the world — are avid users of the cloud, you can bet that it can be as safe as houses. The NSA is moving most of its mission data to the cloud, while the Pentagon is planning to use its JEDI cloud to hold top secret US national security data. The NSA has already moved most of the data it collects, analyzes and stores into a classified cloud computing environment called the Intelligence Community GovCloud. The IC GovCloud is the NSA's creation. Data including foreign surveillance and intelligence information is pooled into this single lake so it is easier for entitled NSA staff to find. The cloud can be as safe as houses as long as you employ a good encryption method. As long as the data is protected with encryption and passwords, you can stick it in the cloud or anywhere else and no one can steal it or have a peek. As Sean Roche, associate deputy director of the CIA's Digital Innovation Directorate, said recently: "Security is an absolutely existential need for everything we do at the agency—the cloud on its weakest day is more secure than a client service solution. Encryption runs seamlessly on multiple levels. It's been nothing short of transformational." The benefits of cloud computing are true for the CIA and NSA just as they are for a small business owner or your average user. As long as you have an internet connection, the cloud offers an infinite space where data can be sent, received, and stored seamlessly. As long as you employ encryption, your data is sealed. But not all encryption methods are created equal. Advanced Encryption Standard 256 (AES-256) is considered the safest encryption algorithm. It's the one the US government, the NSA, and the International Organization for Standardization (ISO)/International Electrotechnical Commission (IEC) use to protect classified information and national security systems. It's also the method used in our privacy app, get2Clouds. If five supercomputers could check a trillion AES keys per second (no such devices exist), it would still take trillions upon trillions of years for them to exhaust the 256-bit key space. So if your data is protected using that, any hacker that gets their hands on it will only see binary junk. They wouldn't even be able to tell what kind of document it is. But it's not just company and client files that rest in the cloud and need protecting. All digital communications and digital file sharing essentially move through a cloud and can be hacked by malicious third parties. That includes all data in transit: every single email and message. The recent Collection #1 data dump, which exposed 773 million email addresses and their passwords, is the latest in a long list of incidents that show how laughably insecure email is. And that wasn't even the biggest email hack: last year Yahoo announced that every single one of its email accounts had been compromised. Emails are the go-to cash cow for hackers, and they're far too heavily relied upon in the business world. Email was initially made for programmers to talk to each other openly in the early days of the internet, but it went on to become the postal service of the online world. We don't consider it data storage, but it's just one big unsafe cloud where immense quantities of communications, contacts and attachments are stored. Email is a dated and insecure means of sending and receiving files and messages. That's it.
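Returning briefly to the earlier AES-256 claim, a back-of-the-envelope calculation shows why brute force is hopeless. This is purely illustrative and assumes, as the hypothetical above does, five machines each testing a trillion keys per second.

```python
# Illustrative only: rough time to exhaust the AES-256 key space,
# assuming five hypothetical machines each testing one trillion (10**12) keys per second.
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keys_in_256_bit_space = 2 ** 256        # ~1.16e77 possible keys
keys_per_second = 5 * 10 ** 12          # five machines x one trillion keys per second

years_to_exhaust = keys_in_256_bit_space / (keys_per_second * SECONDS_PER_YEAR)
print(f"{years_to_exhaust:.1e} years")  # roughly 7.3e+56 years
```

Even with wildly optimistic hardware assumptions, the answer stays in the region of 10^56 years, which is what "trillions upon trillions of years" is gesturing at.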
The privacy app get2Clouds does everything email does and more. It is a secure messenger, large-file transfer and cloud-sync tool that works with all major cloud providers, and the corporate packages include unlimited cloud storage space. Most importantly, every action taken in the bubble of get2Clouds is protected with AES-256. The patented data transfer technology used within the app to send and receive files already secures over one billion downloads annually for the industrial automation sector. Instead of an email address, users are automatically assigned a 555 number when they register the app on their device or PC. Numbers can be changed to phone numbers or a 555 number of the user's choosing. It is the only free messenger where users don't have to register their SIM number. It also means it works just as well on SIM-less devices. get2Clouds works with all major cloud providers, so users can encrypt the data in their private clouds, but they can also secure and send files via their network-attached storage (NAS), making their NAS their own private cloud. This is the perfect solution for smaller businesses that do not need to buy cloud space. This way, they can store their data locally and securely. ### Edge computing | How IoT is changing the cloud The cloud… how did we get here, and how did it all start? Surprisingly, as early as the 1950s, when mainframes – physically huge and extremely expensive – became available to large corporations and universities. They were accessed by multiple users via terminals that had no compute capabilities and acted solely as stations to facilitate access to the mainframes. Some argue this was Cloud 1.0. Fast-forward several decades and, as high-speed internet becomes commonplace, users can access almost limitless compute resources from any internet-connected device, subject of course to being able to pay for it. As we enter the age of the Internet of Things (IoT) and Machine-to-Machine (M2M) communication, and as the number of data sources continues to grow, the cloud model – with its ultimate physical limitations in terms of network latency – will no longer be the panacea it is touted as today. Enter edge computing. What is edge computing? A relatively new concept, edge computing – also referred to as cloudlets or grid computing – effectively pushes the computational functions to the edge of the network. In other words, rather than pumping all the data back up to the cloud for analysis and action, this processing takes place much closer to the data's source. Edge devices can be anything with sufficient compute capacity and capability; for instance, switches, routers or even the IoT sensors collecting the data themselves. By processing the data as close as possible to the IoT devices that generate and respond to it, the physical limitations of the network become less relevant. The result is the elimination of bottlenecks and of redundant cloud compute and network-related costs. When and where is edge computing useful? Some of the key benefits of edge computing come from its ability to reduce latency. In a network, latency is the time taken for data to get to its destination across the network. Usually, this is measured as a round-trip time. For instance, the latency between California and the Netherlands is approximately 150 milliseconds.
While this seems an insignificant amount of time – and for traditional applications such latency is practically irrelevant – in a world increasingly reliant on IoT-connected devices, that might quickly change. A prime example is self-driving cars. Decisions such as collision avoidance need to be made by the vehicle in as close to real time as possible. In this scenario, even the smallest amount of latency can pose serious safety risks. Another aspect of edge computing relates to the capacity and availability of networks and bandwidth. Despite the increases we have become accustomed to, bandwidth can still easily become a bottleneck in the cloud stack, as the internet is still ultimately a shared public communication medium. Edge computing reduces the load on networks by reducing the volume of data pushed back to the core network. Not all data generated by IoT devices is needed centrally; processing data at the edge of the network therefore allows enterprises to send back to the central cloud service only what's truly relevant. Similarly, the network connectivity we are accustomed to is not ubiquitous. Devices such as wind turbines, typically located in remote areas, could be given a limited ability to self-diagnose and self-heal various issues with the help of edge computing. Security and compliance are a must With the General Data Protection Regulation (GDPR) on everyone's mind, and data sovereignty becoming a key concern for organisations, public cloud often presents a series of challenges. GDPR compliance can be complex in the cloud, particularly for global businesses. With the advent of the 'smart' era comes ever more data from devices such as watches, homes, phones and so on. By hosting all this in the cloud, there could be future issues with data sovereignty, ownership, consent and the right to be forgotten. Offerings in the enterprise and corporate space are already racing to catch up with legislation, with vendors such as Microsoft now offering Multi-Geo options for where you store your data. In terms of security, with edge computing, organisations can limit their exposure simply by storing and processing data locally, and only transferring to the cloud what is really required. Not only does this aid compliance, but it also decreases cybersecurity risk. Less data in transit equates to a smaller exposure surface, giving opportunists fewer attack vectors to exploit. The era of big data is here, and it's everywhere With the ability to store and process vast quantities of data available, we are entering an era of unrestricted data-driven innovation. The business case for edge computing will continue to be driven in the short term by cost savings in network capacity and compute, while in the longer term this will be supplemented by its ability to provide faster and more accurate automation and time-sensitive decision-making at the source. All this will fit into a wider cloud-based ecosystem shaped by the demand for customer services rather than by the limitations of cloud infrastructure itself. Ultimately, the symbiotic relationship between the data we create and the ability to process and store it will define future technology systems. While edge computing is in a state of constant evolution right now, its benefits are undeniable – and increasingly, in demand.
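The data-reduction argument above can be made concrete with a small sketch. It is purely illustrative: the threshold, field names and forwarding function are assumptions, and the point is simply the pattern of processing readings locally and forwarding only what matters to the cloud.

```python
# Illustrative edge-filtering pattern: summarise sensor readings locally and
# forward only anomalous ones to the central cloud service.
ANOMALY_THRESHOLD_C = 80.0  # hypothetical temperature limit for a piece of equipment

def forward_to_cloud(reading):
    # Stand-in for an upload to a central cloud service (e.g. over MQTT or HTTPS).
    print(f"forwarding to cloud: {reading}")

def process_at_edge(readings):
    """Keep routine readings on the edge device; only anomalies leave the site."""
    forwarded = 0
    for reading in readings:
        if reading["temperature_c"] > ANOMALY_THRESHOLD_C:
            forward_to_cloud(reading)
            forwarded += 1
    print(f"processed {len(readings)} readings locally, forwarded {forwarded}")

# Example: only one of three readings crosses the threshold and uses the network.
process_at_edge([
    {"sensor_id": "turbine-7", "temperature_c": 61.2},
    {"sensor_id": "turbine-7", "temperature_c": 84.9},
    {"sensor_id": "turbine-7", "temperature_c": 59.8},
])
```

The same pattern scales from a single wind turbine to a fleet: the heavier the local filtering, the less bandwidth, cloud storage and exposure surface the central service has to absorb.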
### How Cloud, Mobile, and AI Are Shaping The Education Sector There isn't an industry that won't be – or hasn't already been – affected by technological advancements. These rapid changes mean students need to be fully armed with the skills necessary to adapt to the workforce of the future. The education sector, just like other industries, is embracing advances in cloud computing, mobile technology, and artificial intelligence to improve its operational practices and efficiency. Students are fast swapping traditional books for computer tablets and getting the opportunity to experience virtual classrooms. While giving students experience with these innovative technologies in the classroom is important from an educational point of view alone, schools and colleges are also feeling added pressure to adopt. Competition between learning institutions to enroll students is increasing; therefore, they need to embrace modern technology to stay relevant. These technologies are going mainstream in the very near future; as such, students demand access now. As the next wave of digitization strikes, expect these major technologies to continue to shape the future of the education sector. Cloud computing and education Cloud computing is a revolutionary development in the field of education. It offers a range of benefits to everyone in the sector, from teachers and students to educational institutions. Cloud technology in education was valued at $8.13 billion in 2016 and is expected to reach $25.36 billion by 2021. Today, about 70 percent of higher education institutions in North America have adopted, or are in the process of adopting, cloud technology to enhance the sharing of information across campus or across the world. One key development in the cloud computing space is the advancement of virtual classrooms. Virtual cloud classrooms offer educators a paper-free way to conduct a class or course, distribute learning materials and assignments, and track student participation and progress remotely from a computer, tablet or smartphone. As well as providing this flexibility and convenience, the cloud creates new opportunities for teams to work on the same projects as never before. For example, applications such as Google Docs enable multiple users to work together, whether they are in the same classroom or on opposite sides of the world. This opens students up to different countries and cultures, fostering a more immersive, innovative, and globally connected classroom. With AV (audio-visual) technology delivered via the cloud, teachers can interact with colleagues around the world too. This allows guest lecturers or experts on particular topics to attend classrooms for Q&A sessions. In addition, tours of distant or local locations can be organized without the need to leave the classroom. To give you an idea of the potential financial benefit: the cost of resources such as textbooks at the college and university level has even outpaced tuition fees in some cases, putting many financially struggling students in a tight spot. Cloud technology enables educational institutes to store all their learning material in the cloud. This allows on-demand access by students and administrative personnel quickly and economically. Thus, it not only saves time but also reduces the price of study materials.
Further, cloud technology enables institutions to have easy access to updated cloud-based learning materials. Since cloud-based applications can run on even the cheapest smartphone or computer with an internet connection, this eliminates the need to buy or upgrade hardware, including the storage devices needed to access the latest learning material. Mobile technology and education From entertainment and communication to learning, mobile technology has completely dominated our lives. Mobile learning has become pivotal in addressing many long-standing educational challenges by personalizing and accelerating learning experiences. According to a recent study, 86 percent of college students have a smartphone, and nearly 47 percent of the population owns a tablet. Moreover, 50 percent of students use mobile devices for daily school work. Students use mobile devices for a wide range of educational activities, like registering for courses, reading e-texts, checking grades, and a lot more. The majority of students leverage mobile technology to communicate with other students about class-related matters outside class sessions. Another study has shown that over 90 percent of students use their smartphones while in the classroom. Many higher education institutions have found effective ways to harness these habits and are now using mobile technology to improve the learning experience. For example, in the United States, Purdue University launched the Hotseat app, which allows students to provide real-time feedback during a lesson, making class sessions more engaging. There are a considerable number of educational apps that make learning truly interesting and fun for students. Learners can choose any format of study material, such as videos, presentations, and even game-like learning apps, depending on their preferences. AI (artificial intelligence) and education Artificial intelligence has already made its way out of science-fiction movies and into real life. Its applications are limitless, ranging from speech processing to machine learning and beyond. In fact, education is one of the spheres most influenced by AI over the last few decades. We all use tools like Amazon's Alexa and Siri for performing education-related activities. This means we are already using AI in education. The "AI Market in the United States Education Sector 2017-2021" report found that AI in education will grow by 47.5 percent during that period. AI-aided educational technology helps teachers understand the learning curve far better than ever before. They can understand and explain concepts in a much better way using AI-enabled Edu-Bots in a classroom setting. In-depth lessons can easily be imparted through the core AI functions. Such robotic innovations have made it possible for students with learning difficulties to attain mainstream education. In the future, we can expect innovations such as automatic grading and the automation of many other administrative tasks, allowing teachers more time to actually teach. Lessons will become differentiated and personalized, adjusted to each student's needs. AI's ability to identify weaknesses in the classroom – something that is impossible for a teacher managing up to 30 students in a class – is another welcome benefit. Conclusion: The future of technology in education will be bright, because everyone from administrative personnel to students, along with everyone in between, is enjoying it.
We already use many technological innovations in learning. The education sector will need to keep evolving as a result of technology advancements, to best prepare the next generation for the new era of digitization. Sources: https://www.marketsandmarkets.com/Images/cloud-computing-education-market5.jpg https://www.educause.edu/ir/library/pdf/ss14/ERS1406.pdf[/vc_column_text][/vc_column][/vc_row] ### Five cloud pitfalls that create data management problems [vc_row][vc_column][vc_column_text]Migrating to the cloud is now a key IT strategy, often used to enable digital transformation, for almost every organisation. No matter how big or small, or what sector they operate in, almost all have benefits in flexibility and agility to gain from moving some or even all of their data and applications to the cloud. However, the move to the cloud does not always result in better data management. It can create unforeseen complexity. It can reduce visibility, moving data to places where it can no longer be easily seen or controlled. And even though the cloud, in theory, increases possibility for innovation, collaboration and creativity, if users cannot access the information they need then those benefits will go unharnessed. Here are five common pitfalls to avoid in your cloud strategy if you are to achieve improved data management from a cloud migration. Pitfall 1: Include legacy IT systems Integrating legacy IT systems is a complex challenge whether you are working on-premise, with a private cloud or a public cloud provider. Legacy IT systems may not interoperate smoothly with other applications, meaning that data stored in one system cannot easily be shared with another, so separate systems cannot ‘speak’ to each other as they need to. As such, your cloud strategy needs to ensure that legacy IT systems that are not being refreshed are nonetheless modernized by use of middleware, in the form of data and API integration. Pitfall 2: Don’t replicate problems For data to be useful, your organisation needs to be able to access it and query it with ease. After all, information in its raw form doesn’t do very much – you need to be able to turn that information into tangible insights and action. In practice, this means knowing where different datasets are stored, and having an interface through which those datasets can easily be accessed and manipulated. If your data is inaccessible in the first place, then simply moving elements to the cloud from on-premise storage may result in simply recreating the problem in a new application. A data and API integration platform may be needed to continuously and seamlessly connect applications and their underlying data together. Pitfall 3: No unwanted data lakes Many businesses try and create data warehouses as part of a cloud move, so key data sits in one place. It sounds great – a single, clearly organised and structured data source, through which stakeholders can access the information they need on demand. However, the reality is that data warehouses can rapidly become unwanted data lakes, where unstructured data is unceremoniously dumped. Data lakes are huge unwieldy beasts, and very difficult to harness effectively for the business. To avoid this, data needs to be structured from the outset and technologies used that enable and ensure your chosen ontology is maintained. Pitfall 4: Open data policy Cloud strategies are understandably executed by IT teams, but data is used by everyone. 
Will your cloud strategy empower marketing teams to access the analytics they require to tweak their campaigns, or your finance departments to work with the data they need to produce accurate forecasts? Will key stakeholders be able to reach the data they require independently, or will the IT team have to be a facilitator, one that feels more like an uncompromising gatekeeper? Once again, the importance of data and API integration platforms for connecting disparate moving parts and ensuring that even non-technical knowledge workers can work with core business data, cannot be underestimated. Your cloud strategy cannot simply be about making the IT team’s life easier – it needs to be about making the entire business work smarter. Pitfall 5: Focus beyond cloud There are multiple different ways of structuring a cloud migration, and all bring different business benefits and challenges. It is therefore very easy to focus all your attention on infrastructure elements such as cloud providers, hybrid cloud models and networking. These areas are certainly of great importance, but don’t forget to consider other applications that fit into your overall cloud strategy and help promote better access to data. Integration-Platform-as-a-Service (iPaaS) products can help make data more accessible to all within the business and sit within your overall cloud strategy. Digital transformation is a core priority for many organisations today, and cloud adoption can be a core component for making it happen. However, it is important not to lose sight of the data which sits underneath. It is your data assets that unlock the benefits of digital transformation, enabling you to make more timely and accurate business decisions and work more collaboratively, promoting creativity and innovation.[/vc_column_text][/vc_column][/vc_row] ### The importance of data privacy in digital transformation [vc_row][vc_column][vc_column_text]The term ‘digital transformation’ has different connotations for different people. For some, digital transformation is about using artificial intelligence to improve customer experience while for others, it could be about using cloud technology and analytics software to optimise their logistics processes. In a nutshell, digital transformation represents how businesses innovate their processes through technology. Interest surrounding this subject has increased rapidly in recent years. Google trends data shows a steady rise in UK searches for ‘digital transformation’ since 2014, indicating that technological innovation has increasingly become a priority for business owners to consider. This graph reflects recent reports of investment in digital transformation being on the rise. A recent report by Deloitte found that the average budget for medium-sized businesses to invest in digital transformation increased by 25% in the last year alone. Likewise, a report ‘The State of Digital Transformation’ found that growth opportunities (51%) and increased competitive pressure (41%) are the primary reasons for businesses investing in digital transformation. Despite this growing enthusiasm, there still remains a level of uncertainty, especially among smaller businesses. Last year’s implementation of GDPR legislation now means that businesses are required to map their data flows, assess the risks in their data processing activities and identify where controls must be implemented. Innovation therefore now carries a greater risk of sensitive data becoming lost, or even worse, stolen. 
With the maximum fine for non-compliance standing at €20 million or 4% of annual turnover, the consequences of making mistakes are too great to be ignored. When the specific requirements of GDPR were originally announced and Privacy by Design was, for the first time, to be made a legal requirement for IT projects, it was feared that hundreds of digital transformation projects - many of which had been scoped and designed months, if not years previously - would become derailed. Or worse, would reach an indefinite impasse. After all, retrospective Privacy by Design is often impossible, but stopping the project and re-starting from the ground up is typically out of the question. If digital transformation aims for the freeing of as much data and accurate context as possible around the business, and data privacy looks to ensure confidentiality and pseudonymisation, how can the two co-exist? Keep the project focused A prerequisite to successfully implementing digital transformation is keeping your business goals specific and achievable. Businesses far too often overstretch themselves by making bold plans, only to fall short and risk not only their reputation but their sensitive data. We once worked with a major European retailer who wanted to design a more optimal working environment using technology. By tracking their staff’s use of physical and digital resources, they wanted to find areas of the business that were not performing to maximum efficiency and rectify the issue. Moreover, they wanted to integrate this data with employee HR records, looking at usage report and access card data alongside employee attendance and performance metrics to identify which employees were underperforming and which deserved promotions. The idea worked on paper, as it removed the risk of subconscious bias that may have been present in performance reviews. However, the company failed to make the wider team aware of the proposed structure and had not made their individual privacy rights even a consideration during the design phase. Employees were subject to an automated decision-making process, without any form of awareness or thus consent, which directly contradicts GDPR. As a result of this, the project had to be entirely rethought, creating both financial and reputational costs. The technical infrastructure was secure and the workflows and machine learning in place were admirable, but the company attempted to make changes that were too far-reaching and were not communicated with the privacy team. Make privacy a priority from the beginning To avoid potentially damaging data breaches, companies should make the security of their company data a priority from the beginning. In attempting to make widespread changes to a business’ operations, effective security precautions are too often left to be dealt with later, causing sometimes serious issues down the line. We once worked with an international medical organisation which produces devices for the healthcare industry. The developer team used an IoT technology to monitor the use of every device they created with the aim of using the data for product development and maintenance. Being a company that collected healthcare data, the need to keep private information secure was even more important. Last year’s GDPR legislation places special importance on healthcare data and therefore outlines additional rules with regard to its storage and use. 
Nevertheless, at no point were the patients, healthcare professionals or indeed the wider business beyond the developer team made aware that their data was being collected and used in this way. Any company looking to undertake a similar initiative should decide on a set of project oversight practices at the start to ensure that the project is vetted by a privacy or legal expert before it proceeds further. Documentation recording and governing the data's collection, storage and use must also be produced. In the case outlined above, none of these steps were followed, and it was only once the legal team had begun their GDPR preparations and company-wide audit of data use that they discovered this activity. The project was immediately halted, leading to product development delays, unhappy investors and the extra cost of re-launching a similar project without data issues. Key Takeaways Despite its potential pitfalls, digital transformation remains an extremely exciting venture for businesses of all shapes and sizes. The prospect of leveraging cutting-edge technology to accelerate business processes and thereby become more competitive is certainly attractive. However, data privacy should always be a foundation of any digital transformation project; without it, the whole house will start to fall. One consideration for businesses is hiring Privacy Architects to assess their objectives and the privacy legislation they will have to comply with. Privacy Architects are experts in both privacy and technology, a rare yet essential combination of expertise. Without knowledge of privacy law, technology projects can create new risks for a business, the wider effects of which go far beyond penalties and fines to the heart of whether customers can trust you.[/vc_column_text][/vc_column][/vc_row] ### Common Informational Security Loopholes In Data Engineering [vc_row][vc_column][vc_column_text]The world is now constantly connected to the internet, thanks to advances in modern technology. This is an environment that encourages creativity and innovation, and while that is largely an advantage, it has also created room for illegitimate businesses, including the black market, to thrive. It is therefore no surprise that cyber security concerns have been raised across the board, especially when it comes to data engineering. As a business owner, or a data engineer for that matter, it pays to know that there are loopholes in information security that can pose a threat to you. Here are some of them: Overreliance on mobile technology The pace at which technology has grown has popularized mobile technology to the point that today's society is utterly dependent on it. While there is nothing wrong with using smartphones, they have become an information security loophole for any data engineering effort. It is far easier for cyber criminals to gain access to people's confidential information through mobile technology than through laptops and desktops. Overreliance on mobile phones to conduct online transactions increases security vulnerabilities, because it is much harder to secure a smartphone from hacking attempts and threats than it is to secure a desktop device. Social media attacks Social media is hugely popular today and has undoubtedly become a powerful tool for internet marketing, and even data engineering.
Even so, social media attacks are an area you cannot overlook when it comes to information security. As people busily share information from one platform to another, the posts and media they share can come embedded with links that invite malware and viruses onto devices. Hackers and other cybercriminals exploit people's fondness for social media channels as an opportunity to distribute sophisticated 'watering hole' attacks, which infect a cluster of websites at a time. Social engineering Social engineering relies on social networks and interactions among people, along with psychological manipulation, to gain access to their information. On the internet, people are constantly transmitting data from one network to another and from one person to another. In the process, an environment of trust and vulnerability is built on social validation, which creates an opening for social engineering to pose information security risks. This loophole is especially dangerous because it is an insidious form of intrusion into information privacy that can have unpredictable yet dire consequences. Corporate data leakage Every organization has staff who access some company information through their personal devices. However much an organization tries to keep its security intact, there is no telling how much confidential data can be exposed when a hacker gets hold of any kind of information from an oblivious staff member. You cannot always control what employees do with their personal or corporate phones when at home. So even when a company decides to distribute corporate phones to its employees, cybercriminals can easily capitalize on this loophole as a way to gain access to the company's confidential information. Outdated security software It may be common knowledge to you, but not everyone is aware that outdated security software is itself a security risk rather than a means of information security. As you keep generating data and sharing it on different internet platforms, outdated security software can be the loophole that undermines your entire firewall and security system. The reason you must be intentional about updating your security software is that it is built to fight known threats; when new threats are created and you do not update your system, your security software is incapable of fighting them, leaving you exposed to threats and attacks. Improper device configuration It's alarming that something as simple as the configuration settings of your device can cause a major data breach in your data engineering process. Companies have a common habit of overlooking the simple instructions that come with big data tools. Working carefully through the configuration security settings can be overwhelming, but with a single decision and click of a button you can open your system up to threats and attacks.
Be careful not to accidentally enable a critical function while skipping important configuration settings, as this will only open a loophole for security vulnerabilities.[/vc_column_text][/vc_column][/vc_row] ### Cloud-Based Integration Takes The Pain Out Of Digital Transformation [vc_row][vc_column][vc_column_text]Businesses are under increasing pressure to integrate their systems in order to improve processes, enhance the experience for staff and customers, streamline the supply chain and future-proof their operations. But everyone knows this is costly, takes a long time to get right and involves hiring or outsourcing to teams with highly specialised technical knowledge. On top of this, companies have to maintain the highest levels of security across all data exchanges while abiding by data protection legislation. With so many tools and services being used across an enterprise, trying to maintain security across all systems can be a huge challenge. APIs have taken us some way towards simplifying the integration process but, in reality, they're not effective enough on their own for enterprise-wide integration. Plus, APIs constantly have to be developed to link new applications. Integration in the cloud The cloud or 'as a service' concept has completely revolutionised how technology is purchased and deployed within organisations. But integration – often the key to bringing digital transformation projects to fruition – is still perceived as something that needs to be done on premise, and as a process that is costly, both financially and from a resource perspective. Now, Integration Capability as a Service (ICaaS) has emerged to speed up digital transformation and slash the time and costs involved with integration projects. Copying behaviours By using a pre-built, cloud-based integration platform, businesses can cut up to 90% of the prohibitive costs and time involved with integrating data sources and applications – massively reducing the reliance on specialist integration skills. With an integration platform already in place, new integration services can be developed quickly via the deployment of configuration and transformation files. ICaaS platforms are built on a foundation of patterns, each fulfilling a specific integration behaviour - such as Assured Delivery, Request/Response, Distributed Assured Delivery and Orchestration. Pre-building these patterns means IT teams don't need to reinvent the wheel each time. As an example, services such as RetrieveCustomerDetails and RetrieveEmployeeDetails both adhere to the Request/Response pattern of behaviour. By deploying separate configuration data for each service to control the flow of data across the ICaaS platform, the same code base is used to ensure consistency of behaviour. To ensure consistency of operation, the same building-block micro-services are used to construct these behavioural patterns, including:

* Configuration - used to control the process functions
* Authorisation - to determine if the user is authorised to use the service
* Validation - to ensure the input data is in the expected format
* Transformation - to convert the output data into the correct format
* Routing - to ensure the information goes to the correct consumer
* Audit - to maintain a record of the data content
* Exception Handling - to ensure all exceptions are recorded consistently

Beyond APIs The benefit of all of this is that adding new integration services to an ICaaS platform becomes a straightforward task, as the sketch below illustrates.
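To make the pattern idea concrete, here is a minimal, illustrative Python sketch of a configuration-driven Request/Response service. It is not the API of any particular ICaaS product: the service names, endpoint URLs, field mappings and roles are all hypothetical, and a real platform would wire the same building blocks together through its own configuration and transformation files rather than a dictionary in code.

```python
# Sketch: one Request/Response code path shared by several services, driven by config.
# All names (services, URLs, fields, roles) are illustrative, not a real product's API.

import requests  # assumed HTTP client for the backend call

SERVICE_CONFIG = {
    "RetrieveCustomerDetails": {
        "backend_url": "https://example.internal/api/customers/{id}",   # hypothetical endpoint
        "required_fields": ["id"],
        "output_mapping": {"customer_name": "name", "customer_email": "email"},
        "allowed_roles": {"sales", "support"},
    },
    "RetrieveEmployeeDetails": {
        "backend_url": "https://example.internal/api/employees/{id}",   # hypothetical endpoint
        "required_fields": ["id"],
        "output_mapping": {"employee_name": "name", "department": "dept"},
        "allowed_roles": {"hr"},
    },
}

def request_response(service_name, user_roles, payload):
    """One code base for every Request/Response service; behaviour comes from config."""
    cfg = SERVICE_CONFIG[service_name]                          # Configuration
    try:
        if not user_roles & cfg["allowed_roles"]:               # Authorisation
            raise PermissionError(f"caller not authorised for {service_name}")
        missing = [f for f in cfg["required_fields"] if f not in payload]
        if missing:                                             # Validation
            raise ValueError(f"missing input fields: {missing}")
        resp = requests.get(cfg["backend_url"].format(**payload), timeout=10)  # Routing
        resp.raise_for_status()
        source = resp.json()
        result = {out: source.get(src)                          # Transformation
                  for out, src in cfg["output_mapping"].items()}
        print(f"AUDIT {service_name} ok keys={list(result)}")   # Audit (stand-in for a real audit log)
        return result
    except Exception as exc:                                    # Exception Handling
        print(f"AUDIT {service_name} failed: {exc}")
        raise

# Usage: the same function serves both services, differing only in configuration, e.g.
# request_response("RetrieveCustomerDetails", {"sales"}, {"id": "42"})
```

The point of the sketch is the reuse: RetrieveCustomerDetails and RetrieveEmployeeDetails share one code path and differ only in configuration, which is what makes adding a new service cheap.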
All that's needed is configuration data to define how the service works, validation information to ensure that the input data is in the expected format, and transformation details to get the data into the format required by end users. Because it allows interfaces to be reused and employs pre-built connectors, a cloud-based integration platform results in shorter times to production. It also helps to protect digital assets. Data is transferred securely, and it's easy to apply data governance and access control rules while keeping control of integration. The platform means that failures can be swiftly detected and monitored all the way through to resolution. Ultimately, it can make life easier for staff too, by ensuring that the correct data is always delivered to the appropriate person at the right time to allow them to work effectively. Gartner has stated that 'digital transformation is intensifying demands for seamless integration across application and information infrastructures'. Cloud-based integration eases that path towards digital transformation, saving time, money and resources without compromise.[/vc_column_text][/vc_column][/vc_row] ### Video conferencing in the era of AI [vc_row][vc_column][vc_column_text]No doubt many businesses are trying to establish how artificial intelligence (AI) can make a difference to their operations. They're not alone. A recent Gartner Group study found that while there's a lot of interest in AI, it is hard for many users to understand the technology's true value to their business. Hype aside, it is possible to drill down into how augmented reality and AI can be used in video conferencing and collaboration. A recent Zoom survey found that nearly 60 percent of respondents thought augmented reality would be somewhat or extremely useful in their video meetings, and 73 percent said they expect AI to have a positive impact on meetings. On the question of how AI and AR will change meetings, here are just a few of the benefits users can expect cutting-edge technologies to deliver to video meetings today and in the coming years. The New Meeting Scribe: Artificial Intelligence Today, AI has already begun to make video meetings even better. It is no longer necessary to spend time entering codes or clicking buttons to launch a meeting. Using voice-based AI, video conference users can start, join, or end a meeting by simply speaking a command, much in the same way you interact with Alexa (you can even do this using Alexa). Voice-to-text transcription, another artificial intelligence feature now offered by Otter Voice Meeting Notes and other vendors including Zoom, can take meeting notes during video meetings, leaving individuals and their teams free to concentrate on what's being said or shown, which boosts efficiency and collaboration. This machine-learning-based technology gets smarter and more accurate as more people use it. AI-based voice-to-text transcription is advanced enough to identify each speaker in the meeting, and it saves users time by letting them skim the transcript, search and analyse it for certain meeting segments or words, then jump to those mentions in the script. Video transcripts not only deliver easy-to-digest, comprehensive records of meetings, they can also be used to provide feedback for training.
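As a rough illustration of the transcription-and-search idea, the sketch below uses the open-source SpeechRecognition Python package as a stand-in for the commercial meeting-transcription services mentioned above; the audio file name and search keyword are placeholders, and speaker identification is left out.

```python
# Sketch: transcribe a recorded meeting and search the transcript for a keyword.
# "meeting.wav" and the keyword are placeholders; speaker diarisation is not shown.

import speech_recognition as sr

recognizer = sr.Recognizer()

with sr.AudioFile("meeting.wav") as source:     # a PCM WAV recording of the meeting
    audio = recognizer.record(source)           # read the whole file into memory

# Free Google Web Speech backend; other engines (Sphinx, cloud APIs) are also supported.
transcript = recognizer.recognize_google(audio)

# Let participants skim or jump to mentions of a term instead of re-watching the meeting.
keyword = "budget"
for i, sentence in enumerate(transcript.split(". ")):
    if keyword.lower() in sentence.lower():
        print(f"mention {i}: {sentence}")
```

In a real product the transcript would carry timestamps and speaker labels so a user could jump straight to the relevant moment in the recording.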
It is interesting to note that over 65 percent of respondents from the same survey said they think AI will save them at least one hour a week of busy work, with many thinking it will save them one to five hours a week; for industries that charge by the hour, that could make a real contribution to the bottom line. One respondent noted, “For me, the AI advantage would be a ‘virtual assistant’ tool for notes, to-do and follow up items. The advantage would occur post meeting as a reminder or follow up guide to get things done. Currently, the maelstrom known as the ‘day’ swamps my ability to focus on deliverables from key meetings.” AR Promises to Redefine Video Conferencing It is important to make the distinction between augmented reality and virtual reality as the two are often confused, but the two are quite different. Virtual reality is a completely computer-generated re-creation of a real-life environment or situation (riding a roller coaster for example). Augmented reality meanwhile projects images and information onto your actual view of the real world. Though its current use in video meetings is more limited than that of artificial intelligence, augmented reality has the promise to truly redefine and enhance communications in many industries – and it’s getting off to a great start. For instance, companies like Meta are partnering with video conferencing providers to deliver solutions that enable the use of augmented reality in video conferencing to share and manipulate 3D virtual holograms in real-time, and allow others to interact with them. Imagine medical students watching a demonstration of a procedure on realistic anatomical models, thus aiding first-hand experiences beyond anything they can learn from a textbook. It creates a whole new experience to see and interact with virtual replicas of products, and service technicians, for example, can view systems and diagnose problems in real-time. Among those surveyed, 30 percent were excited about the potential AR has to create an environment without limits on how you can collaborate or what you can build together. Right now the future for VR in video communications isn’t quite as strong. The point of video communications is to engage with someone face-to-face over long distances. If you’re engaging with their virtual avatar instead of their face, you lose this important benefit. With AR, you can share, which is the real point of collaboration. What’s Next? Looking ahead, we’ll see AI used in video communication for a wider range of more sophisticated tasks, including language translation, meeting follow-up (creating to-do lists and scheduling subsequent meetings), and maybe even leading meetings. An AI-based assistant may create a top-line meeting summary from the notes it’s taken, and then deliver those summaries to meeting participants. Using natural language processing, the assistant could also contribute to the meeting, perhaps by creating charts to illustrate data being discussed, or proactively accessing and sharing historical data relevant to the discussion. AI-based facial recognition will also be used in video conference rooms for a variety of purposes. It will recognize who is in the room and be paired with technologies that provide other information, such as a meeting participant’s title and company affiliation. Insights into who has used the conference room, when, and for what purpose will also be available to help IT and facilities staff better plan space allocation and usage. 
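To give a flavour of how such room-level recognition could work, here is a minimal sketch using the open-source face_recognition Python library; the photo files, names and job titles are invented for illustration, and a production system would need consent, secure storage of the face encodings and integration with the room-booking system.

```python
# Sketch: identify who is in the meeting room by matching a camera frame against
# enrolled employee photos, then look up their title. All file names and the
# directory of people are hypothetical.

import face_recognition

# Enrolment: one reference photo per known participant (each photo is assumed to
# contain exactly one face, hence the [0] below).
known_people = {
    "Alice Example": {"photo": "alice.jpg", "title": "Head of IT, Example Corp"},
    "Bob Example":   {"photo": "bob.jpg",   "title": "Facilities Manager, Example Corp"},
}
known_encodings = {
    name: face_recognition.face_encodings(face_recognition.load_image_file(info["photo"]))[0]
    for name, info in known_people.items()
}

# A frame captured from the room camera (hypothetical file).
frame = face_recognition.load_image_file("room_snapshot.jpg")

for encoding in face_recognition.face_encodings(frame):   # one encoding per detected face
    matches = face_recognition.compare_faces(list(known_encodings.values()), encoding)
    for (name, _), matched in zip(known_encodings.items(), matches):
        if matched:
            print(f"In the room: {name} ({known_people[name]['title']})")
```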
There's no doubt that some AI applications don't seem to have real-world benefits and exist simply for the sake of saying it could be done or to showcase the potential – like Sophia, the first-ever AI robot citizen. However, the use of AI and AR in video conferencing-based collaboration is truly accomplishing things – and those accomplishments are making video meetings even better than meeting face-to-face.[/vc_column_text][/vc_column][/vc_row] ### Challenges surrounding IoT deployment in Africa The Internet of Things (IoT) is an overarching technological term representing a network of devices capable of collecting data, sharing data over the Internet and sourcing information for users with ease. Many IoT devices also resemble robots, able to assist in household chores and using sensors to monitor health, movement, visitors at the door and more. With IoT technology spreading far and wide, for continents such as Africa, embracing this new form of technology has the potential to offer new jobs and provide new solutions to issues such as water and power shortages. There are a variety of predictions for the number of Internet of Things devices (wirelessly connected devices) on Earth. For example, IHS Markit estimated there would be 30.7 billion IoT devices by 2020, while Gartner predicts 20.8 billion devices other than smartphones, tablets, and computers by 2020. $6 billion is anticipated to be invested in IoT solutions, such as app-connected devices, IoT security, device hardware, further development of new ideas and devices, and the integration of IoT into the homes of the public and businesses around the world. While the West and Asia have been able to invest wealth into the development of IoT technology, more impoverished places have not been able to progress at the same rate. For example, as IoT is integrated into the technological landscape of many nations, continents like Africa may struggle to adapt due to the disparity in wealth and the difference in the types of jobs carried out by their citizens. With the fourth age of industry rearing its head, African people may find they lack the funds to participate in the IoT phenomenon to the same degree as wealthier regions. Unemployment could be a risk as fewer traditional, manual jobs will be required, but given the right skills and a financially viable approach to developing new IoT technology, Africa and other more impoverished regions could play a role in the advancement of IoT. IoT could offer developing countries the opportunity to develop a low-cost and sustainable approach to the technology. The creation of new jobs with a focus on sustainability could help the economy as well as the environment. Although the rich and poor of Africa could both benefit from IoT if it is managed appropriately with an inclusive approach, there is a risk that jobs may not be evenly distributed between all income brackets. Despite the enormous potential for IoT to offer economic empowerment to underprivileged communities, if IoT education and employment remain biased towards the rich, inequality could increase. Although IoT deployment in Africa has the potential to be challenging, there are organisations actively seeking out a positive and efficient approach. For example, the WAZIUP project uses IoT and Big Data technology to enhance working conditions in rural Sub-Saharan Africa. Africa's lack of infrastructure, expensive hardware, and limited technological background are all issues it faces surrounding the deployment of IoT.
According to the WAZIUP project, the implementation of IoT solutions in rural locations needs to tackle four significant issues: "(a) Longer range for rural access, (b) Cost of hardware and services, (c) Limit dependency to proprietary infrastructures and (d) Provide local interaction models". In rural Africa, GSM/GPRS and 3G/4G connectivity is very costly for IoT devices. Instead, short-range technologies like IEEE 802.15.4 can be used with multi-hop routing. Transceivers (wireless communications devices) consume an enormous amount of the power in a radio node, and because long-distance transmission needs immense power, multi-hop routing could be a more energy-efficient option than single-hop routing. There are active startups like Illuminum Greenhouse in Kenya that focus on IoT solutions for agriculture. The company's greenhouses are powered by solar panels and sensors, which work together to create an optimal environment for growing crops. When the sensors detect that the crops need water (when the soil is dry), an automated watering system supplies the precise amount of water required. Illuminum Greenhouse builds cost-efficient greenhouses suitable for farmers with small areas of land. They also promote the local economy by using only locally available materials and solar-powered sensors. This particular startup exemplifies how relatively affordable and straightforward IoT systems can improve the living conditions of underprivileged African people who would otherwise struggle to make money and survive. With a humanitarian approach, the deployment of IoT in Africa could be enormously positive; with companies like Illuminum Greenhouse and their consideration for rural land workers and impoverished regions, IoT could boost the economy and aid sustainability. With globalisation and access to the internet increasing in Africa, more people can learn about IoT and how to apply it in their communities. Of course, not every region in Africa has much in the way of internet connectivity, but Nigeria, where a large share of the population lives in poverty, had 98,391,456 Internet users as of December 2017 (50.2% of the population), up from only 33% of Nigerians using the Internet in 2013. Although there is a long way to go, as internet usage steadily rises, more opportunities become available for the people of Africa. ### Business Applications of IoT and Advantages The thought of living in a world where every device in your home, office or car communicates with the others and can be controlled over the Internet may have seemed like a far-fetched fantasy a decade ago, but it is now very much possible with the advent of IoT technology. Within a few years, IoT has connected the world at an unprecedented scale, with machines and tools communicating with one another over the internet. With the promise of smart homes, workplaces, and even cities, IoT is developing by leaps and bounds, completely changing business and the lives of people everywhere. We have already seen plenty of exciting and life-altering innovations, but the Internet of Things is set to transform our way of life, with objects coming alive, interacting with people, and understanding and anticipating their needs and choices. When it comes to business, IoT is a total game changer.
The technology makes companies smarter and more efficient, optimising logistics processes and improving customer relations. The IoT revolution is still in its infancy, but the fleet industry has been reaping the benefits of IoT and telematics technologies for over a decade now as one of the first industries to adopt the advanced systems. According to the statistics, 8.4 billion connected devices were in use worldwide in 2017, a staggering 31% increase from 2016. The number of connected "things" is expected to reach 20.5 billion by 2020, with over $2 trillion of economic benefit globally. The widespread implementation of IoT is a clear indication of how popular the technology is right now. Fleet businesses were among the pioneers in the field of IoT technology. While individual use of connected-device technology was almost non-existent, fleet companies were already using sophisticated tracking software and telematics devices to keep track of their vehicles and acquire data about drivers and the condition of their cargo. Before the implementation of telematics systems, the fleet industry relied heavily on rudimentary tools such as walkie-talkies and car phones to convey the essential information managers needed to make decisions, handle last-minute changes or plan routes. With very limited data and communication tools, managers had a hard time coordinating the movements of their fleet or checking vital data on their vehicles such as speed, temperature, and GPS location. However, with the introduction of IoT devices into their infrastructure, fleet businesses gained access to a plethora of information about their vehicles and drivers. Field operators require extensive data on every phase of their operations, from driver behaviour to fuel levels to real-time location data. This greater connectivity throughout the fleet and the continuous data flow between the IoT devices and the tracking servers pave the way for sophisticated fleet management software that helps businesses enormously with their management processes. Fleet telematics systems boast a wide range of prominent features such as real-time tracking, geofence zones, fuel usage reports, route history, and driver management, all made possible by the IoT's connected device and vehicle technology. By utilising telematics systems and IoT technology, fleet companies can completely change the way they operate. There are many examples of how connected devices and vehicles affect businesses all over the world. For example, a bakery with a fleet of 15 vehicles performing around 50 deliveries a day needs to be precise about its operation down to the last detail in order to deliver fresh products to its customers. IoT sensors on the vehicles allow the manager to monitor the temperature inside the cargo hold and make sure the cakes and pastries are in perfect condition by the time they arrive at their destination. Telematics devices also help the field manager plan efficient routes that allow drivers to reach their intended locations on time; this is but one example of how IoT and telematics devices connected over the internet can improve business operations. Fleet managers can generate fuel reports based on telematics data or check the humidity levels inside a vehicle to make sure its cargo stays fresh over long distances. IoT and telematics systems present fleet businesses with unlimited opportunities and improvements in every aspect of their operations.
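Picking up the bakery example above, here is a minimal, illustrative Python sketch of how telemetry readings from a refrigerated delivery vehicle might be turned into alerts; the field names, vehicle IDs and thresholds are invented for illustration rather than taken from any particular telematics product.

```python
# Sketch: turn raw telematics readings from a refrigerated delivery vehicle into
# alerts a fleet manager can act on. Field names and thresholds are illustrative.

from dataclasses import dataclass

@dataclass
class Reading:
    vehicle_id: str
    cargo_temp_c: float      # temperature inside the cargo hold
    fuel_level_pct: float
    lat: float
    lon: float

MAX_CARGO_TEMP_C = 8.0       # e.g. a safe limit for fresh bakery products (assumed)
MIN_FUEL_PCT = 15.0

def check_reading(r: Reading) -> list[str]:
    """Return any alerts raised by a single telemetry reading."""
    alerts = []
    if r.cargo_temp_c > MAX_CARGO_TEMP_C:
        alerts.append(f"{r.vehicle_id}: cargo at {r.cargo_temp_c:.1f}C exceeds {MAX_CARGO_TEMP_C}C limit")
    if r.fuel_level_pct < MIN_FUEL_PCT:
        alerts.append(f"{r.vehicle_id}: fuel at {r.fuel_level_pct:.0f}% - schedule a refuel stop")
    return alerts

# Example: a small batch of readings as they might arrive from the tracking server.
for reading in [
    Reading("VAN-07", 6.5, 62.0, 51.5072, -0.1276),
    Reading("VAN-12", 9.2, 12.0, 51.4545, -2.5879),
]:
    for alert in check_reading(reading):
        print(alert)
```

A real fleet platform would ingest these readings continuously from the tracking server and push alerts to the manager's dashboard or phone rather than printing them.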
Understanding the potential of IoT, many tech giants such as Apple, Microsoft and IBM started investing in this technology years ago. With the help of IoT technology and the infrastructure to support it, a company can step into a new age of business and become a pioneer in their business industry while increasing its productivity and profits.   There are billions of connected devices generating an exorbitant amount of data all around the globe. We still can’t fully analyse and utilise all this data, but IoT tech is improving rapidly to increase and expand connected device capabilities. Soon; ordinary objects, vehicles, machines, buildings, and houses; pretty much everything in the world will be able to communicate and talk to each other via the internet and help people with their everyday activities. ### Emerging Trends in Cloud Technology [vc_row][vc_column][vc_column_text]While this decade comes to an end, it’s time to look ahead to what’s in store for the future. One area where business leaders are looking for continued development is the cloud. The cloud helps businesses with reducing costs, boosting efficiency, increasing agility, and assisting in overall growth and innovation. In the coming years, there are likely to be a variety of developments that will impact cloud technology. From new abilities to new members of the workforce, here are some of the trends we can expect in the years to come. Developing Quantum Computing As the amount of data collected increases, there will need to be a shift in processing time. Enter quantum computing: a high level of computing that looks to develop computer technology, based on quantum theory. This advanced type of computing will help with quickly organizing, categorizing, and analyzing cloud data. Using this technology, companies will not be backlogged by an unwieldy amount of data.  Adapting Omni-Cloud Solutions While many companies have selected just one cloud provider, they may find themselves needing multiple cloud vendors to help with future problems. This tactic is known as omni-cloud, where a business uses more than one cloud provider to avoid being locked into one solution. By using more than one cloud vendor, businesses have options for meeting both internal and customer demands. However, some vendors, like Infor and NTT Communications, may not be able to keep up. That’s why having more than one vendor can help bridge any gaps in cloud capabilities. Currently, companies have been using multi-cloud or hybrid solutions, but omnicloud allows them to test a wider range of options. Transforming SaaS to Intelligent SaaS Software-as-a-Service (SaaS) will be infused with extra abilities and become intelligent SaaS. Artificial intelligence (AI) will lead the charge on this transformation. With many benefits like automation, analytical insights, and chatbots, AI will help expand what software can do. In the coming years, it will be hard for organizations not to embrace intelligent optimization within all tools.  AI, the Internet of Things, and blockchain are all technologies that will generate more data to add to the cloud. The real-time data generation from these solutions can help with insights into what’s going on with customers and in the industry. Plus, these tools can run themselves in some instances, alleviating some monotonous administrative tasks. 
Integrating Blockchain Technology Blockchain technology, which can track and record a product’s journey, is helping with tracking through all stages of the product life cycle to improve delivery. When the cloud is connected to this technology, businesses will have an even better understanding of what happened during a product’s delivery time. These insights, such as weather delays or tracking where a contaminated product originated, can save businesses millions of dollars. Although some companies are already executing this practice, it will continue to grow and mature in the coming years. Personalizing Cloud Experience Right now cloud businesses are creating solutions and applications to meet individual client needs. However, there will be more of this special treatment as companies demand software adjusted to their unique requirements. If the company is using a hybrid or multi-cloud approach then offering personalized options will be even more crucial.  An additional thought for cloud companies to consider is how the cloud will possibly become a large storage place for data. Businesses will want to share this data with a variety of people from their employees to clients. That’s where additional personalization might be necessary especially for companies that want specific people to see certain amounts of information. Increasing Security Capabilities With new data regulations, such as GDPR and CCPA, companies are no longer allowed to be careless with consumer data collection and storage. Especially with new security measures in place, businesses do not want to face monetary penalties or legal action. Because there are businesses unaware of the impact of these regulations, cloud companies will need to be well versed in possible weak points. Security itself also needs to improve to face off with thieves using technology like AI or social engineering. Although incidents like this have been small, there may be an increase of new cyberattacks due to new laws. This is why cloud companies will need to maintain and improve security measures. Hiring Digital Natives The new employees joining companies today come highly knowledgeable in technology and the wide variety of tools available. Known as digital natives, these individuals may create a divide between employees who are technology literate and those who are not. Businesses will have to adopt new training techniques or bring on additional cloud solutions to assist new team members.  The cloud keeps growing and developing new abilities to assist businesses. New ways of collecting data, along with new technology capabilities, will lead to further cloud benefits for businesses. What do you see as the next developments in cloud technology?[/vc_column_text][/vc_column][/vc_row] ### 5G Will Transform User Experience But It Is Not The ‘Silver Bullet’ For All Businesses [vc_row][vc_column][vc_separator][vc_column_text]From greater automation to seamless connectivity, the 5G rollout is being heralded as the game changer that will make truly connected workspaces and workforces a reality and unlock the potential of emerging technologies. It’s the network that will give businesses an opportunity to transform the user experience for both staff and customers. While the possibilities created are attractive, businesses should consider carefully how and why they want to use this technology. 
Harnessing the power of 5G 5G has the potential to transform digital experience across multiple sectors, from providing the foundation for a robust IT infrastructure in the NHS that enables the crucial work of frontline, critical services to ensuring retailers meet the always-on demands of their customers. 5G promises to be up to 30 times faster than current network infrastructure and will allow for seemingly instantaneous two-way data transfer. This will unlock the potential for data to drive more efficient management of assets, resources and services and for advanced application-based solutions that once seemed out of reach to soon become a reality. The potential of 5G to improve end-user experience The 5G network will enable connectivity beyond the capability of its predecessors, releasing the full potential of the Internet of Things (IoT). It will advance mobile from a technology that connects people-to-people and people-to-information, to a central and unified process that connects everything to everything (X2X). This potential for hyper-connectivity makes 5G probably the most transformational network since the internet developed. It will be the critical enabler of the smart city and requires a new level of intelligence delivered to the network edge: doctors will receive secure, live updates on patient wellbeing; connected autonomous vehicles (CAVs) will seamlessly navigate a city’s roads and multi-location businesses will have access to central and reliable monitoring systems that will drive productivity across their estate. As well as enabling the growing number of flexible and remote workers to stay fully connected, the speed of 5G will drive a major shift towards using data more effectively to manage assets, resources and services with greater efficiency. Public sector organisations will also have the opportunity to benefit from near-instantaneous two-way communication, enabling efficient management of services like local authority waste collection. Households in some European cities, for example, now deposit waste in municipal smart bins that monitor waste levels and optimise collection routes. Beyond the smart city, one of the sectors most likely to benefit from the high-performance network is the logistics industry. Innovation using 5G and the IoT has the potential to streamline the manufacturing and delivery process; real-time tracking, including insight on delays or traffic surges, would remove the need for scan-points and deliver invaluable data to a central management system, automatically triggering a change in scheduling, routing or stock loading. The challenges for an effective rollout The ongoing development of 5G will drive demand for reliable, seamless fixed networks - all the potential benefits of the technology rely on upgrades in the UK’s core network being ready. It is critical that 5G signal get to a reliable, high capacity fixed network as quickly as possible. Rural broadband connectivity remains an issue and, far from being a quick fix, 5G will still rely on the groundwork being in place. It’s likely that 5G will expand rather than close the digital divide – access is critical and while we are moving in the right direction, the country still needs to work to bring effective network coverage across the whole of the UK. There are some interesting case studies in the US about how agricultural communities are being connected using a combination of 5G and fixed networks. 
The structural challenges posed by 5G means the rollout will be much more gradual than the seemingly sudden launch of 4G. All stakeholders, including the government and technology providers, must collaborate effectively if we are to achieve the end-goal of a fully connected and digital society. 5G will not be the ‘silver bullet’ for all businesses A 5G-enabled society is certainly in the UK’s grasp, but getting there won’t be plain sailing. Hardware security concerns have already stalled the rollout of the UK’s 5G network, which will rely on a robust, future-proof network infrastructure. And, just as the country as a whole has work to do, so too do individual organisations if they are to maximise the potential of the network’s capability. Businesses must consider how they will access 5G, how this will sit with other services in their infrastructure, and how it will be budgeted for. As well as deployment, businesses must address security issues, which continue to be a concern. The challenge for many businesses will be transitioning from traditional legacy systems, cultures and ways of working to enable them to make the most of the smart, new, connected world ahead. Limited budgets and the need for skilled IT teams to manage the transition to new technologies may also slow the progress of change for organisations. While 5G will open up a hotbed of opportunity, it’s crucial that businesses walk before they can run. It is one thing to recognise the potential in the network, but another to ensure deployment will support a business, instead of creating a wave of unforeseen and uncosted threats that could leave you without a business at all.[/vc_column_text][/vc_column][/vc_row] ### IoT devices: The Trojan horses of our time [vc_row][vc_column][vc_column_text]We live in a world governed by connectivity. In many ways, it’s taking over our lives and we need to be prepared to embrace both the benefits and the dangers. For many of us, the word “Trojan” conjures images of the infamous battle whereby the Greeks stormed through the independent city of Troy. Yet in recent years, this word has come to take on a new meaning – ringing alarm bells to those of us that are tech savvy.   In the same way that the Trojan horse became associated with danger during the Greek mythology days, at the beginning of the late 20th century, the word Trojan was applied to describe deceptively benign computer codes that seem legitimate – but are in fact, malware. To this day, malware Trojans remain widespread. However, today’s internet users have the benefit of understanding these dangers and what can be done to avoid potential hazards. Yet, when it comes to connected devices, the same cannot be said. We are still surprised by hacks – because we aren’t prepared. This is why we refer to the Internet of Things (IoT) as the Trojan horse of our time. So - why aren’t IoT devices safer, and how can we rectify this? IoT and security: the challenge We need to work to ensure IoT is safer, however, there are reasons that this is currently not the case. 1. Security is not part of the design process Let’s think of why we buy IoT devices. It’s not because it is a comprehensively well-thought-out piece of technology, but because it amazes us with its futuristic features. The intelligent refrigerator or the IoT lamp do not reinvent the refrigerator or the lamp but enhance their abilities by making them smarter. This does not of course mean the same for industrial solutions, but many parallels can be drawn. 
User studies, as part of the design process, will always come to the conclusion that IoT is a new, fascinating market and the typical IoT users of the first hour seek novelty or usefulness over security. 2. Security would increase the price IoT devices became attractive to the mass market. The average cost of IoT sensors is falling and by 2020 it is estimated to be about $0.38 (£0.28). Even manufacturers of specialised Industrial Internet of Things (IIoT) equipment are in fierce competition with one another. Spending a lot of money on the development of better security features does not make sense for manufacturers. The industry wants to achieve favourable prices through mass purchasing. 3. Security isn’t the number one priority It all comes down to a two-way attitude from users and manufacturers. We need to be talking about IoT security more – but, given its damaging to a business to slow down market growth, we don’t take the time to speak about it enough. IoT and security: the solution IoT devices cannot be completely monitored. Even if the devices have been specifically deployed by a company's IT department, traditional corporate security measures do not work. IoT devices can only be controlled to a limited extent by the IT team because they operate beyond their own closed systems. This means that to improve security, we need to consider three things that can help give us “peace of mind”. 1. Importance We need to pay more attention to data. To secure our data, we don’t need to back up an entire IoT device. Instead, we need to look at the cloud to secure data from IoT devices. However, keep in mind that as soon as a mobile IoT device contains sensitive data, it will be a target to hackers. Not only this, but if an IoT system is managed by a central administration portal which is deactivated, it will no longer report on attacks to individual devices. 2. Trusted storage IoT devices are predominantly mobile. The difficulty here lies in averting any malicious applications from them. One way to prevent this is by storing the device ID in a trusted area. This means you can decide who does and does not have access to communicate to the device – for example, by using biometric identifiers. 3. Look for radiation effects Monitoring, no matter how sophisticated, cannot directly detect whether an IoT device has become the gateway to certain attacks. However, radiation effects can be identified. Via the network distributor, a monitoring tool can recognise when an unusually high amount of data traffic occurs. It can also be detected via pattern recognition if unusual traffic takes place in the network. A warning would then be sent to the system admin and the discovery of the device in question should proceed quite quickly. We may still be at the beginning of IoT security, and we may still have a long way to go. But, if one thing is for sure, it’s that we need to be prepared or risk turning myth into reality as these Trojan horses attack.[/vc_column_text][/vc_column][/vc_row] ### 5 Ways Artificial Intelligence Will Drive Digital Transformation In Healthcare [vc_row][vc_column][vc_column_text]AI and robotics in healthcare is no more a part of science fiction. Instead, it has been acting as a self-sustaining engine reforming healthcare exceptionally. According to one of the studies conducted by Accenture, AI healthcare application development has the potential of contributing $150 billion to annual saving in the US healthcare economy by the year 2026. 
By rendering simple machines smart, AI is shaping healthcare into an intelligent system, significantly reducing the time spent on routine work and putting it to better use in effective decision making. Overall, this fosters faster care, early diagnosis, and the analysis of genetic data to identify predisposition to a particular disease, among much else. Let's explore some of the breakthroughs led by the AI revolution in healthcare. #1. Virtual Assistants to Assist Patients and Healthcare Practitioners The increasing shortage of medical experts has been a major driver of the growing demand for virtual assistance across the globe. This imbalance between demand and availability has put great pressure on healthcare workers and affected the patient experience. AI-powered virtual assistants are here to solve the issue. Their ability to enhance communication between care providers and consumers has drastically improved the consumer experience and reduced physician burnout. Combined with the power of voice recognition technology, voice biometrics, EHR interactions, and more, virtual assistants offer an all-new healthcare mechanism that allows physicians to treat patients better and patients to feel cared for and content. While patients are waiting for a physician, these assistants carry out the initial conversations, so the practitioner and the patient have a better chance of an in-depth, treatment-centric discussion later. Simply put, virtual assistants not only share the load of the practitioners, but also help them offer better services and care. #2. Conversational AI in Healthcare Powered by Chatbots AI-powered chatbots have been gaining massive popularity across the world, not only among users but also among brands. An Oracle study found that 80% of brands aim to include chatbots in their business processes by the year 2020. Moreover, over 35% of users prefer interacting with AI chatbots rather than human assistants. Healthcare is no exception to this trend. Thanks to their ability to reduce turnaround time and automate initial communication, chatbots are expected to bring healthcare massive benefits in terms of cost savings. Chatbots guide patients through a set of questions, with users choosing answers from a set of pre-defined choices, and then decide the best-suited course of action. With the power of machine learning, the chatbot experience will only get better with time. The critical element of a chatbot implementation is the knowledge management system that stores common questions and answers; this in turn feeds the chatbot's learning process. #3. Robots for Explaining Lab Results Explaining lab results to patients is one of the more tedious parts of a physician's job. More importantly, it consumes time they could devote to critical work. AI-powered platforms relieve doctors of this not-so-favorite part of the healthcare process. They operate by making effective use of AI mobile apps along with natural language processing capabilities to explain lab results to users in a way they can understand. Thus, while patients have help readily available, physicians can focus on other important things. #4. Robot-assisted Surgery Robot-assisted procedures can fill gaps in physicians' skills during new and complicated surgeries, which carry higher risks to the patient's health and, of course, higher costs.
While microsurgical procedures require precision and accuracy, AI-powered robots can help physicians reduce variations influencing health and recovery. With this powerful technology, doctors can overcome challenges in the form of imprecisions and anomalies. The best part of this is the presence of machine learning, which only further enhances the possibilities of critical insights and best practices. #5. Constant Health Monitoring with AI and Wearables. Unless you have had a significant memory loss, there is no chance you missed out on the healthcare monitoring applications of AI. With a wide range of health monitors from the likes of Apple, Fitbit and Garmin, users can now keep a check on critical aspects of their health, including heart rate, activity and much more. The data is then transferred to highly advanced AI-supported systems to gain more insights, comparing current statistics to ideal requirements. Capable of detecting workout patterns and sending alerts when routines are missed, these systems enable users to record their habits and access the data whenever needed. Not only this, but they are also capable of raising an alarm in emergencies. For example, if a patient has already reached the upper limit of an activity, they'll raise an alert asking the user to stop. The Takeaway. By driving innovation in the healthcare realm, artificial intelligence has given medical experts a new vision, enhancing service delivery with faster turnaround times and more affordable patient care. It has also enabled patients to remotely access best-in-class services, making sure that diseases are identified in good time and treated accordingly, bringing better healthcare to the tips of our fingers. While the changes are already evident, this amalgamation is only expected to deepen with time as novel concepts are incorporated.
### The forecast: cloudy with a chance of containers
Consider this scenario. An organization, which has a data center, wants to hyperscale. It wants to add an analytics component for an application that requires data parallelism at scale. The data center may or may not have the ability to scale. By utilizing hybrid clouds, the enterprise can offload the compute of analytics to the cloud environment, which provides the virtualized hardware, including CPU, memory, general purpose computation on graphics processing units (GPGPU) and more. Such scenarios are driving the adoption of container-based technologies for multi-clouds, slowly but surely. It is happening despite some confusion and misunderstanding about the technology. A survey from IBM last year found that less than half of enterprises (41%) have a multi-cloud strategy and only 38 percent had the necessary tools to even operate multi-clouds. Without question, Virtual Machines (VMs) will continue to exist for many years. However, for hybrid or multi-cloud development and deployment scenarios, CIOs are increasingly turning to containers. Containers provide an opportunity for enterprises to scale deployments as necessary in hybrid-cloud scenarios. As such, it’s important for CIOs to understand the near- and long-term benefits of containers and the challenges. Container conundrums. One of the biggest conundrums has been to demonstrate the value of containers when making the transition.
In many cases, CIOs will choose containers for new initiatives; however, for existing systems that reside on VMs or bare metal, it’s more costly to move those to a container-based system. Some CIOs have even had to forego containers until costs can be mitigated. Containers do provide a number of cost benefits over VMs. For instance, the compute does not have to run all of the time for analytics components. In a VM setting, the analytics component has to be provisioned on demand, and a VM takes several minutes to launch. With Docker, the launch is reduced to a few milliseconds, and the component runs instantly. With Kubernetes, scaling primitives (called ReplicaSets) are common, and they don't vary across different environments. Additionally, Docker and Kubernetes are environment agnostic, meaning the deployment of applications is the same in local data centers or across public clouds. Provisioning of the environments is also largely common, with very minor differences, making it easy to distribute workloads across clouds – but that can come at a cost. Typically, provisioning container-based services happens in a matter of milliseconds, which helps control the consumption of resources like hardware, network and storage. Nevertheless, certain specialized software programs and scenarios, such as complex graph databases, will need to be run on VMs and may not be adaptable to containers. This has been another challenge IT teams have had to overcome. Security scares. What’s more, there are critical pressure points that CIOs and IT teams have had to overcome when managing Docker across the enterprise. One of the key challenges of containers is security. VMs are relatively secure and have a smaller attack surface because they’re fairly isolated. Containers are more vulnerable to attack and carry with them the possibility of spreading any security issues to other containers they touch. Because container technology is relatively new, some developers have a limited understanding of container security. The good news, however, is that there is a committed effort across the industry to tighten container security using multiple approaches, including encryption and process whitelisting. Additionally, best-practice guidelines for container security are now available for developers. This has been a bone of contention for many CIOs and enterprises. Containers are DevOps’s best friend. Containers provide a new way to approach application development for hybrid cloud platforms from a DevOps perspective too. Enterprises that have not turned to DevOps in the cloud are part of a tiny minority. According to a recent study, just 12 percent of enterprises have not adopted cloud-based DevOps. A separate study from CA Technologies found that DevOps in the cloud could boost software delivery performance by around 80%. Continuous Integration (CI) and Continuous Delivery/Deployment (CD) – two of the most important characteristics of DevOps – benefit from containers. Docker integrates with many CI tools, including Jenkins and Puppet, helping developers continuously collaborate on code and ensure that a build is always stable and successful. Docker and Kubernetes are changing microservices architectures, wherein a single modular function is packaged as a service and consumed by a client. The advantage of this microservices approach is the creation of a distinct entity for each function, so any changes can be implemented without adversely impacting others.
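As a rough illustration of the environment-agnostic scaling described above, the sketch below uses the official Kubernetes Python client to adjust the replica count of a Deployment (which Kubernetes backs with a ReplicaSet). The deployment name, namespace and replica count are hypothetical placeholders, and the same call works whether the cluster sits in a local data center or a public cloud.

```python
# Minimal sketch, assuming the official Kubernetes Python client is installed
# (pip install kubernetes) and a kubeconfig is available. The deployment name,
# namespace and replica count are hypothetical placeholders.
from kubernetes import client, config

def scale_deployment(name, namespace, replicas):
    """Patch the replica count of an existing Deployment (backed by a ReplicaSet)."""
    config.load_kube_config()                    # same call on-prem or in a public cloud
    apps = client.AppsV1Api()
    apps.patch_namespaced_deployment_scale(
        name=name,
        namespace=namespace,
        body={"spec": {"replicas": replicas}},   # only the replica count changes
    )

if __name__ == "__main__":
    # e.g. scale a hypothetical analytics microservice out to five pods
    scale_deployment("analytics-service", "default", 5)
```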
With containers, this approach is reinforced, as each microservice definition becomes a single container image and is deployed and scaled on a Kubernetes cluster. With containers, it also becomes easier to have multiple versions of the same service running at the same time. Despite some misunderstandings and challenges, the key takeaway for CIOs is that containers and Kubernetes are transforming hybrid clouds and the way enterprises can envision the technology solutions they are launching and hosting. According to 61 percent of the multi-cloud experts IBM surveyed, at least 80 percent of new apps will be developed using containers by 2021. Containers are here to stay.
### How hyper-converged infrastructure (HCI) simplifies IT infrastructure and management
The popularity of hyper-converged infrastructure (HCI) is on the up among IT departments across the UK. According to figures from IDC at the end of 2017, global HCI sales generated $1 billion in revenue last year, with the usual suspects of Nutanix, Dell EMC, HPE and Cisco competing for market share in a rapidly growing sector. Gartner recently suggested the HCI market will be worth some $6 billion by 2020. In this piece, we will look at why more and more organisations are opting for HCI and why the industry is expected to grow so much in such a short space of time. What is HCI? Simply put, HCI is a fully software-defined IT infrastructure that virtualises all of the elements of conventional hardware-defined systems. Stuart Miniman, analyst at Wikibon, describes ‘server SAN’ - a collection of commodity servers clustered to act as a single storage source - as being at the heart of the concept of hyper-converged infrastructure. According to Trevor Pott, consultant and writer for The Register, a solid definition of hyper-convergence is “a special class of server SANs where VM workloads run alongside the storage workloads. It was conceived of to be cheaper, denser and more appealing than legacy convergence”. He adds: “Storage in a hyper-converged environment is provided by filling up the compute nodes with disks and creating a server SAN. This uses a part of the compute servers’ RAM and CPU resources. Hyperconverged solutions still rely on proprietary switches for networking [and] might require a few extra compute nodes to fit the same number of VMs, but the overall footprint is generally smaller.” Why is HCI becoming so popular? IT transformation or digitalisation - whether cloud, hybrid cloud or on-premise - is one of the key drivers leading organisations to take stock of what they currently have, and of how they use technology to become more agile and responsive to their business needs and challenges. On the other side of the coin are the organisations who are now investing in HCI, but who sat on the fence when HCI was the new kid on the block. At the time, there was uncertainty around the technology, no long-term reliability figures, and the companies specialising in it were fairly new start-ups. As a result, some companies preferred to wait and see how this new technology panned out within the industry before committing to it. Benefits of HCI.
Many organisations find their current infrastructure complex, requiring a lot of resources, skill, time and cost to keep the lights on. In addition, there is a demand for IT departments to respond to business units quickly, where projects on their current infrastructure can take weeks or months to implement, which can have a negative impact. This is where HCI comes into play, by collapsing the traditional three-tier network, storage and compute - sometimes known as legacy infrastructure - and fitting it all into a commodity ‘tin’ with a software overlay. This simplifies the architecture of the environment. HCI deployments can reduce your data centre footprint - the ability to consolidate from three racks of server, network and storage to one, or even half, can significantly reduce energy costs. With HCI, there is a single, simple management interface, removing the need to log on to multiple interfaces. What’s more, HCI makes it easier for a business to scale, negating the need to size storage up front. You can start small and grow by adding building-block nodes each time. When to consider HCI? According to research firm Forrester, HCI systems are now typically used as scalable platforms for enterprise applications, databases and private cloud. The organisation found, in a survey of infrastructure professionals planning or expanding their use of HCI, that the most common workloads being run on converged systems are database (50% of respondents); file and print services (40%); collaboration (38%); virtual desktop (34%); commercial packaged software (33%); analytics (25%) and web-facing workloads such as LAMP stack or web servers (17%). Getting buy-in from the CFO and CTO. One of the main challenges of implementing HCI is the cost - the initial outlay is considered to be quite high compared to the siloed approach, and the CFO of an organisation will pay most attention to the big sum on a quote. However, if they can see the bigger picture and see past the initial numbers, they will see that the total cost of ownership will be lower over the life of the solution. This can be attributed to reduced heating and energy costs, a reduced data centre footprint, reduced investment in training, and fewer licences and maintenance agreements. In addition, it will mean having a viable disaster recovery solution with more robust recovery point objective (RPO) and recovery time objective (RTO) times. With so much for organisations to gain from HCI, it’s clear to see why leading research firms expect the industry to grow from strength to strength in the coming years.
### Artificial intelligence and the new wave of innovation
To be innovative is one of the most highly prized qualities in modern business. For any organisation described as innovative, one would immediately see that business as progressive, challenging, risk-taking, focused on digital transformation, modern and successful. However, the truth isn’t quite as simple as that. While most businesses might claim to be innovative, for many it is merely smoke and mirrors, a PR exercise that positions the company as innovative without really delivering.
Organisations that are serious about being innovative know that it involves a different culture and a different mindset to be successful. And for innovation to be future-proof and sustainable, it is also going to increasingly involve artificial intelligence (AI). The evolution of innovation. Around a decade ago, with the emergence of agile and digitally-focused start-ups, it became clear that businesses needed to innovate more than ever in order to be successful and maintain their market position. Traditional businesses were coming under threat from new market entrants, start-ups that were structured in a way that put innovation and fresh thinking at the heart of their operations. This led to the accelerated emergence of a technology to capture and manage innovation: the idea management platform. The concept had already existed in different forms, but the increased need for innovation in more traditional businesses saw idea management platforms become more mainstream, offering the ability to capture, refine and realise ideas, and allowing discussion and engagement across multiple communities – internal or external, employees or management, national or international. The last five years have seen this technology evolve even further, as innovation becomes even more highly valued, with the latest idea management platforms making tangible use of AI to power innovation programmes. Artificial intelligence = sustainable and scalable innovation. While AI has existed as a concept since the mid-'60s, for a long period it seemed as though it was never quite going to push through to the business and technology mainstream. The last few years, however, have seen increasing adoption of AI, and it is starting to fulfil its undoubted potential and have a real impact on many businesses. One area where AI has undoubtedly made a difference is in innovation and idea management, adding machine learning capabilities to ensure corporate memories for ideas are much longer, leading to truly sustainable and ongoing innovation. Even when an organisation is using an idea management platform, it is possible for a good idea to be forgotten if it’s submitted at the wrong time. If there isn’t a current need for an idea, it mostly either drifts into the ether or is stored away somewhere, undiscoverable and never used. Using an idea management programme that leverages machine learning algorithms allows the platform to keep building, learning and developing its own memory, which becomes more useful in the future. Just because an idea submitted in response to a specific request isn’t quite right at that moment doesn’t mean it couldn’t be applied to another area of the business at another time. The right machine learning technology can store ideas until such a point that there is a use for them. This means people looking for solutions will be alerted to previously submitted ideas that could work, based on the idea management platform’s understanding of their needs and memory of what has been suggested before. As the volume of ideas grows, it becomes less efficient or even impossible for people to manually make connections between these ideas, whether they were implemented or rejected. AI tools help solve that problem by highlighting unexpected relationships between ideas (a simple sketch of this kind of matching follows below). Retaining the value of good ideas. Similarly, if an organisation is running an innovation challenge across its target audiences, it is possible that it could get thousands of responses.
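To make the similarity matching mentioned above a little more concrete, here is a minimal sketch, assuming scikit-learn is available. The stored ideas and the new request are invented examples rather than real platform data, and a production idea management platform would use far richer models than plain word overlap.

```python
# Illustrative sketch only: surfacing previously submitted ideas that resemble a
# new request, using TF-IDF vectors and cosine similarity from scikit-learn.
# The idea texts below are invented examples, not real platform data.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

stored_ideas = [
    "Use vibration sensors to predict machine failures before they happen",
    "Offer customers a chatbot for order tracking and returns",
    "Recycle waste heat from the server room to cut the office energy bill",
]
new_request = "How can we cut energy costs across our facilities?"

vectoriser = TfidfVectorizer(stop_words="english")
matrix = vectoriser.fit_transform(stored_ideas + [new_request])

# Compare the new request (last row) against every stored idea and rank them
scores = cosine_similarity(matrix[-1], matrix[:-1]).flatten()
for idea, score in sorted(zip(stored_ideas, scores), key=lambda pair: -pair[1]):
    print(f"{score:.2f}  {idea}")
```

In this toy example the waste-heat idea scores highest simply because it shares vocabulary with the request, even though it was originally submitted for a different purpose - which is exactly the kind of connection a long corporate memory makes possible.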
The most immediately relevant ideas will be selected, but many others may be rejected for different reasons, such as lack of resources or poor strategic fit. Only a small percentage of those ideas are going to be taken forward initially. What happens to the remaining ideas? They will be discarded and potentially forgotten, but often there’s still value to be found from ideas that don't find a home the first time around and by applying AI to idea management, these initially overlooked or rejected ideas are folded back into the corporate memory, with the knowledge that in the future they may be useful. The platform learns about each idea and holds it there until that time, which might be a week, a year or even a decade in the future. Businesses must understand that innovating for today is the machine that feeds innovation for tomorrow - culturally, in capability, process, and much more. To do this effectively and to future-proof good ideas, it means adopting AI-enabled idea management to power the next and future waves of innovation.[/vc_column_text][/vc_column][/vc_row] ### How Augmented Intelligence is Changing the Workplace as We Know It [vc_row][vc_column][vc_column_text]Where AI once equated to artificial intelligence, we are now beginning to talk about it in reference to augmented intelligence. Augmented intelligence is resulting in adaptive systems and intelligent tools which can be applied or used to facilitate the jobs carried out by people, as such, it is driving an evolution of the role of people in the workplace. But which industries are seeing the impact of augmented intelligence the most and what will this mean for the humans working in those industries? Financial services The uptake of augmented intelligence is happening on a market by market-basis, with the financial services sector among the first to adopt intelligent systems, which are having a dramatic impact on the industry. In commercial markets, traders are moving away from watching the market, plugging data into their models and executing trades. Instead, traders are overseeing the intelligent systems and components that are spotting trading opportunities and will make those trades. This is freeing traders up to help guide and define what makes a ‘good’ trade and help to develop the rule-base that the intelligent systems are using to evolve different strategies and adopt different approaches. Traders are therefore moving away from execution and, instead, taking on more strategic and less tactical roles. More and more, the intelligent systems are not only handling the execution, but they are also spotting opportunities and providing tools and capabilities that make it easier for traders to see these. This is taking strategies that were once the realm of highly specialised teams and making them widely available to the trading community. This is something that we are have already seen across more strategy-driven trading institutions, such as hedge funds, and means that highly quantitative, analytical trading strategies are being more commonly adopted as organisations don’t need a team of highly qualified people building models to identify opportunities. As a result, it’s much easier to have highly intelligent adaptive systems that are supporting people who don’t have specialised analytical skills to use those tools and techniques to support day-to-day trading activity. 
In the financial services sector and beyond, augmented intelligence is bringing much more sophisticated capabilities and the ability to apply those sophisticated capabilities in more generalised environments. Software development As software tools become more intelligent and get more capabilities built into them, we are seeing fewer people writing low-level code that creates the fundamental building blocks for IT systems and more people are assembling pre-built components in ways to meet novel demand. What we are seeing is the way in which these pre-built components operate provides more intelligent capabilities, and consequently make it easier and faster for people to build systems that incorporate those capabilities. Rather than having a team of data scientists on tap to build augmented intelligence models, for example, it’s becoming easier to have components that you incorporate into a data workflow which will automatically construct models to perform certain functions, such as the de-identification of data. Realising greater value Particularly in businesses that use engineering and development processes, where it is important to get things right the first time, augmented intelligence has the potential to help businesses and industries realise even greater value. In these situations, there are big costs involved with going beyond modelling and design, into the production stages and getting it wrong. This is especially true if the business is making tangible goods with tooling costs associated with those goods, as consequently, the cost of getting it wrong is high. Therefore, as augmented intelligence helps people produce better quality products and performs a number of checks to ensure they are right, it drastically improves the likelihood of getting things right the first time and, as a result, reduces the cost of bringing new or changed products to market. Intelligent vehicles One thing we are hearing more regularly is that we are now entering a phase of augmented intelligence acting as an ‘extension of a human being’ – a theory demonstrated by the likes of Tesla and Google within the field of intelligent or self-driving cars. There is a vast amount of intelligence going into building the capabilities of intelligent cars to identify risks and enable the cars to automatically take action when they identify hazards in order to support the driver. This, therefore, augments the ability of the driver to operate safely and efficiently in those environments allows the car to operate in a much more sophisticated way. Broadly speaking, augmented intelligence is still in its infancy and we are only just scratching the surface of what it can offer society and organisations. The ability to really shorten the decision-making process and bring more appropriate information and present it in a timelier manner, in a way that hasn’t been possible to date, will revolutionise the way in which society operates. As augmented intelligence continues to be adopted by new markets, we will see its impact ever more clearly, affecting not only industry but all aspects of our society.[/vc_column_text][/vc_column][/vc_row] ### Multi-cloud, Fog, Edge & Hybrid Computing - What’s the Difference? The cloud has become a key component in the successful running of many modern businesses, making it possible for companies to transfer some of the burdens of storing and processing large amounts of data on the local network into the hands of cloud providers. 
However, as the way we use technology evolves so too do cloud services, and there are now several cloud computing methods that are evolving to help businesses manage their assets efficiently. To address some of the confusion surrounding the naming conventions for cloud models, we’ve put together a mini-glossary that should help you navigate from Cloud to Multi-cloud, past the Hybrids, through the Fog and out to the Edge. Cloud Computing Also known as simply ‘the cloud’, traditional cloud computing is a term used to describe the on-demand delivery of a range of services, including applications, media and data, via the internet. This process has been designed to both simplify and protect the data storage process for businesses, reducing storage costs, preventing the need for costly updates and releasing in-house resources (such as IT staff) for use elsewhere in the company. Multi-Cloud Computing   Delivering the service its name promises, multi-cloud computing describes the use of multiple cloud services from more than one provider for different tasks. For very small companies or independent contractors this may feel excessive, but as cloud adoption grows, many companies are beginning to appreciate the different types of cloud available, along with their varied benefits. For instance, while one team may require a cloud provider capable of safely sharing sensitive data, another may need a cloud service more suited to sharing and collaboration involving large data, such as media files. This can occur within the company, department or branch office. A multi-cloud approach can often provide the best option when the respective requirements are largely mutually exclusive.  For example, a company’s payroll, customer relationship management, human resources and inventory management may all be run on different cloud platforms. The most widely accepted benefit of this approach is that companies can mix and match best-in-class technologies. Hybrid Cloud Computing Hybrid means something formed or composed of two or more elements and in the cloud computing world, that’s no less true. [easy-tweet tweet="A hybrid solution will often combine the flexibility and scalability of public clouds" hashtags="CloudComputing, Hybrid"] Although often confused with multi-cloud computing, hybrid cloud computing has one significant difference: where multi-cloud uses different cloud providers for different tasks, hybrid computing uses different cloud services for the same task or process. In other words, hybrid cloud solutions can make use of any combination of multiple public or private infrastructures to access or perform certain tasks, with the orchestration of data between them. A hybrid solution will often combine the flexibility and scalability of public clouds with the security and accessibility of private clouds. Fog Computing While cloud computing is an ideal data solution when you’re lucky enough to have uninterrupted, fast cloud access, it’s unable to reliably support the rapid development of the Internet of Things (IoT). Referring to the obtaining, analysing and delivering of data to and from things previously disconnected from the internet, the concept of the IoT has necessitated the development of a new form of the cloud solution that keeps the information a little closer to home. Enter fog computing, a term coined by Cisco to describe a localised cloud solution that collects, processes and stores data within a local area network via an IoT gateway or fog node. 
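As a loose illustration of the fog-node idea, the sketch below shows a gateway that handles readings from devices on the local network and only escalates summaries of unusual batches beyond it; the threshold, device IDs and the forward_to_cloud stand-in are invented for the example.

```python
# Minimal sketch of a fog/edge gateway: readings are processed on the local
# network and only anomalies are forwarded onwards. The threshold, device IDs
# and forward_to_cloud() stand-in below are invented for illustration.
from statistics import mean

ALERT_THRESHOLD = 75.0  # e.g. degrees Celsius; purely illustrative

def forward_to_cloud(payload):
    # A real gateway would POST this to a cloud ingestion endpoint;
    # printing keeps the sketch self-contained.
    print("-> cloud:", payload)

def process_locally(device_id, readings):
    """Handle a batch of readings at the gateway; respond locally, escalate rarely."""
    summary = {"device": device_id, "avg": round(mean(readings), 2), "samples": len(readings)}
    if max(readings) > ALERT_THRESHOLD:
        summary["alert"] = True       # only unusual batches leave the local network
        forward_to_cloud(summary)
    # Normal readings stay local: no round trip to external servers is needed.

process_locally("sensor-42", [68.2, 70.1, 69.5])   # stays local
process_locally("sensor-17", [71.0, 79.3, 80.2])   # escalated to the cloud
```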
The power and advantages of cloud computing are moved closer to where the data is being generated. Data can be sent to the gateway from various local sources in the same network, processed to assess its relevance and to create any necessary reply, and delivered back along with commands to solicit the correct response from the device. This system is particularly beneficial thanks to its ability to process data from multiple sources and respond only where required, using a local network to avoid the need for a constant connection to external servers. Edge Computing. Although it is a term often used interchangeably with fog computing, edge computing is its own unique solution. Taking localised data processing one step further, edge computing moves analysis even closer to the data sources, requiring each device on a local area network to play a part in the processing of the information. To achieve this, each data source is connected to programmable automation controllers (PACs), which take care of the processing and communication systems for each device. Edge computing has the distinct advantage of removing unnecessary communication, and thus reducing the potential for failure. Instead, each item works almost independently, determining what, if any, information should be stored and what should be passed forward, all without consulting any other points in the network. According to Ericsson, there will be a worldwide total of around 28 billion connected devices by 2021, with nearly 16 billion related to IoT. As a growing proportion of transmitted data emanates from IoT, it is important that as much as possible is dealt with at the edge of the network. Where is the Cloud headed? Though there is overlap between cloud computing naming conventions, without question we see a move away from centralised computing, wherever its infrastructure may lie. As IoT proliferates, there is a need not only to communicate with these billions of devices sitting on the edge but to provide secure, fast processing and decision-making capabilities. This needs to happen even in areas of poor (or no) internet coverage and with the help of local data intelligence. A lot of the data will be used and processed on the spot and, as it will never be needed again, doesn’t need to be stored. This will shift companies’ focus from traditional cloud computing out to the edge, decentralising decision intelligence and data technologies while still retaining a holistic overview.
### Latest Advances and Applications in the IoT Technology
The Internet of Things, or IoT, is the name that the IT folks have given to the now billions of physical devices throughout the world that are connected to the internet. These devices not only collect data, but they share it as well. Through the proliferation of wireless networks as well as cheap processors, it’s now possible to turn a multitude of physical objects into IoT devices. These can range from a smart thermostat in your home that you control with your smartphone, to a sophisticated driverless car that’s filled with hundreds of sensors collecting and transmitting data to make sure it is operating efficiently. Devices that are considered to be part of the IoT are devices that we normally wouldn’t expect to be connected to the internet – unlike a PC, laptop, or a smartphone. They are things that can communicate with a network on their own, without human action.
The IoT, along with artificial intelligence, machine learning and cloud technology, has been one of the most important trends in high-tech over the past couple of years. It has been developing at astonishing speed since its inception, often rapidly changing direction and popping up in new and quite unexpected forms. In this article, we will focus on current advancements in the world of the IoT – advancements that only a few short years ago were talked about but had no tangible demonstrations of their viability. Some of these are continually evolving trends – things that experts expect to evolve rather rapidly but that are still in their early stages of development. The rise of 5G technology is the first advance we will talk about. 5G networks are at the forefront of the development of cellular mobile communications. Recent developments will ensure that their spread will mean much more than just a faster internet connection for your smartphone. Their extremely high speeds will offer an array of new possibilities for the IoT, paving the way for a degree of connectivity that is impossible with current standards. Through 5G, data can be gathered, analyzed and managed in real time, virtually without delays, greatly broadening potential IoT applications and opening up pathways to further technological innovation. Edge Computing. Edge computing is in many ways the opposite of cloud computing, the technology that has gained so much prominence just in the last five years or so. Edge computing means that data is stored in micro-centers as opposed to the cloud, providing numerous new options for the IoT. By storing data locally, it offers a cheaper, faster, and more efficient approach to data processing. In this manner, data can be made immediately available to a corresponding IoT device, decreasing the “stress” on the network and the necessary bandwidth. Smart Stores. In 2019, smart lighting devices, video feeds and Wi-Fi-enabled foot-traffic-monitoring software allow store owners to collect information about customer traffic patterns in the shop, how much time customers spend in each of its aisles and how they interact with products on display. After analyzing this data, retailers are able to change the way they lay out their merchandise, decide how much of it they put on display or even change their entire store layouts in line with the knowledge they have gained about customer behavior. Smart Cities. In a relatively simple application, Cary, North Carolina is using the IoT to track the condition of its traffic lights. If one malfunctions, the system notifies utility companies so that they can quickly send a technician to solve the problem. Besides traffic lights, the IoT is being put to use in creating smart common areas in cities around the world. Sidewalk Labs, owned by Alphabet, the parent of Google, is building a smart neighborhood in Toronto. Smart sensors are being installed around the neighborhood that will record everything from shared car use, building occupancy and sewage flow to optimum temperature settings, around the clock. The goal is to make the neighborhood as safe, convenient, and comfortable as possible for those who live there. Once this model is perfected, it could set the standard for other smart neighborhoods and eventually even entire smart cities. Manufacturing and Healthcare. The IoT has already begun to transform manufacturing. Sensors, RFID tags, and smart beacons have been in place for several years.
Advances in these devices are disrupting every part of the production process from product development to supply chain management. Factory owners are becoming able to prevent delays, improve production output, reduce equipment downtime, and of course manage inventory. In the world of healthcare, already more than half of organizations have adopted IoT technology. It is an area where there are almost endless possibilities- smart pills, smart home care, electronic health records, and personal healthcare management. All in the interest of higher degrees of patient care outcomes. Connected Smart Cars At virtually every price point, current car models have much more IoT upgrades than ever before. These come in the form of diagnostic information about the car. Everything from tire pressure, oil level, fuel consumption , and when something goes wrong with the engine is now available to be sent to the palm of your hand via a Wi-Fi connection to your smartphone. In the next year or so, we will see even more IoT advancement beyond diagnostic info. Connected apps, voice search, and current traffic information are just a few.  While self-driving cars are still several years or more away, the prototypes being built now utilize IoT technology in the multitude of sensors they contain that allows them to be monitored remotely as they navigate streets. Increased Security Concerns Despite the promise that the IoT holds for all of us in making our lives easier, at the same time it carries the dual-edged sword of making us more vulnerable to an attack. In the past, a malware infection meant just lost or compromised data. The emergence of the IoT means that a virus or ransomware infection can easily disable vital functions and services. In Atlanta, in March 2018, ransomware crippled the city’s water services and ticket payments systems.  In 2019, hardware manufacturers like Cisco, HPE, Dell and more are building specific infrastructure for micro-centers to be more physically rugged and secure, and security vendors are starting to offer endpoint security solutions to their existing services to prevent data loss and give suggestions about threat protection and network health. These, then, are the latest advances, applications, and trends in the continually evolving technology of the Internet of Things, a technology with the promise of creating a brave new world for us all. At the same time, it  comes with the proviso that it has the capability of upending civilization when malevolent forces seek ways to weaponize its promise. Joseph Zulick is a writer and manager at MRO Electric and Supply.[/vc_column_text][/vc_column][/vc_row] ### Top IoT Trends to Watch for in 2019 [vc_row][vc_column][vc_column_text]The Internet is vast and yet still expanding. One of the most profiting expansions that it is making is in the Internet of Things (IoT); an approximately $200 billion industry worldwide. According to Statista, this figure is expected to get to $520 billion in the next couple of years. Today we are here to familiarize you with some of the top IoT trends in 2019 that will be responsible for the rise of this entity. The year 2019 is going to be a huge one for IoT based innovations and numerous industries like healthcare, smart cities, automotive, home and office automation, usage-based insurance and asset management will benefit from these IoT trends of 2019. So a significant chunk of IoT application development will be revolutionized and upgraded for good. 
IoT Trends to Watch for in 2019. IoT has come a long way since its inception, and the IoT trends of 2019 mentioned below are clear proof of that. So, without wasting any further time, here are the top trends in IoT: Fragmentation in IoT. Just as we witnessed with Android OS, as the OS grew it became more fragmented, and a significant number of versions are active in the market. Hence, Android developers have to account for every active OS version when making even the slightest upgrade to an app. IoT devices and software face the same situation. Fragmentation will create some hurdles for many companies, as they deal with compatibility issues throughout their industries. Fragmentation is a widespread issue for any software-based innovation that is used worldwide by millions or billions of people. The only real solution is to introduce effective standards and certifications for IoT hardware and software development. Qualcomm is trying to lead the standardization race and solve the fragmentation problem as soon as possible, and 2019 may very well be the first step towards this standardization. Edge Computing in IoT. With the massive amount of digital data generated and processed by smart devices, businesses need technologies and algorithms that can quickly process that data and make sense of it. With edge computing, smart devices will become a lot more powerful, enabling local data processing as well as advanced AI capabilities. This year, data transfer volumes will fall, making cloud dependency more agile and flexible. Edge computing is a fantastic upgrade for industries where data has to be handled instantaneously or where real-time cloud connectivity is restricted, such as manufacturing, security, shipping, and logistics. Hyper Personalization in IoT. As mentioned earlier, there is a huge amount of data generated by smart devices. More data means more precise knowledge of consumers’ intent, needs, behaviors, desires, and current location. Data is the currency of technology, so making sense of and using as much of the collected data as possible will allow companies to personalize services for their users. This high level of personalization opens up the possibility of delivering highly relevant and contextual services, as well as marketing content, at the most optimum time and place. Without predictive analysis, the whole process becomes manual and extremely time-consuming, whereas with this trend integrated into IoT, personalization will come close to its peak. Rise of 5G in IoT. 5G is probably the most anticipated network upgrade of 2019, and it is apparent that it has something in store for the IoT sector too. 5G will support a greater degree of device interconnectivity, which will move IoT innovation further into the future. VoLTE technology carries calls over an LTE network, while Cat-M1 technology connects M2M machines over an LTE network. Cellular and wireless chip providers need time to implement Cat-M1 into modules that could be used for IoT applications; this will probably be done by the end of 2019. 5G will allow data to be gathered, managed and analyzed in almost real time, and will expand the IoT sector into areas where bandwidth speed is a crucial aspect of working. Digital Twins. Digital twin technology is a virtual copy of a real-world product, asset, process, or system. This technology is also known as a hybrid twin or virtual prototyping. It’s used as a simulation tool that works together with AI, machine learning, and IoT.
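As a toy sketch of the concept, and assuming nothing beyond the Python standard library, the example below keeps a software replica of a hypothetical pump up to date from telemetry and uses it for a crude "what if" simulation; the asset, its fields and the wear model are invented purely for illustration.

```python
# Toy sketch of a digital twin: a software replica of a physical asset, updated
# from IoT telemetry and usable for simple "what if" simulation. The pump,
# its fields and the crude temperature model are invented for illustration.
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    asset_id: str
    rpm: float = 0.0
    temperature_c: float = 20.0
    history: list = field(default_factory=list)

    def ingest(self, telemetry):
        """Update the twin's state from a sensor reading."""
        self.rpm = telemetry.get("rpm", self.rpm)
        self.temperature_c = telemetry.get("temperature_c", self.temperature_c)
        self.history.append(telemetry)

    def simulate_temperature(self, hours, extra_rpm):
        """Crude model: estimate temperature if the pump ran faster for a while."""
        return self.temperature_c + 0.01 * extra_rpm * hours

twin = PumpTwin("pump-007")
twin.ingest({"rpm": 1450, "temperature_c": 61.5})
print(twin.simulate_temperature(hours=8, extra_rpm=200))  # test the change digitally first
```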
This digital technology trend creates a connection between the physical and digital worlds. IoT-connected objects are replicated digitally for simulation, testing, modeling and monitoring based on the data collected by IoT sensors. The analysis of digital twin data helps businesses make decisions which positively affect their performance indicators. Digital twin technology is set to improve how industries work by streamlining digital data operations. Blockchain. We know what you are thinking: blockchain is a decentralized digital ledger for various records, especially cryptocurrencies, so how will this be a trend in IoT? Well, blockchain is not controlled by a single party and has no single point of failure. So, if we integrate IoT with blockchain, it will help streamline businesses by connecting them directly to customers (B2C) or other businesses (B2B) through smart contracts, thereby saving costs that would otherwise have been pocketed by middlemen. Conclusion. The digital race of the IoT sector is in its final lap and businesses are making a run for the ribbon. If you want to catch up to the industry giants, then these top IoT trends of 2019 will be of great help. Maciej Kranz once said, “Lone wolves won't succeed with IoT. This is a team sport”. You need to integrate these top IoT trends of 2019 into your IoT device or service, and for that, you need a team of experienced IoT developers who know IoT mobile app development inside and out.
### Overcoming Cloud Complexity
The public cloud has the potential to offer businesses of all sizes much in terms of speed, flexibility, scalability, cost reduction, ease of maintenance and improved security. However, getting it right is anything but trivial. Not only is the starting point different from one IT department to the next in terms of what is virtualised and what it is running on, but so is the potential endgame in terms of which cloud service provider (CSP), or combination thereof, is the best fit. Even within the realms of one cloud service provider, making the right selection that matches demand with the best available supply is difficult, given the complexity of their catalogues. As cloud costs rise in a dramatic and seemingly unmanageable fashion for most companies, we hear far too often that the main cloud service providers were so easy to get into, yet so difficult to get out of. And then there’s the “what stays on-prem and what goes into the cloud?” discussion to be had. Followed by “what catalogue selection gives the best match from a resource requirement perspective to ensure the best performance at the lowest cost?” The reality is that IT never truly becomes simplified, just different. A callous outsider might suggest that this is how everyone is kept in a job, but the truth is simply that demands and requirements change as available options evolve. As public cloud reshapes how much of technology is delivered, the way the IT department is structured, the skills it needs and the technologies and tools it requires have to change in step. For organisations that have moved all or part of their business to the cloud, it is all too easy to assume the cloud will prove optimal both for simplifying day-to-day administration and for cost savings. But this would be a dangerous assumption to make. As cloud service providers offer such a vast array of options, selecting the right resources for applications can be complex, time-consuming and costly.
Considering the main drivers for businesses going to public cloud in the first instance, this can potentially be self-defeating. More to the point, this is an ongoing and not a one-time challenge, as application demand evolves and available options change. This is a complex challenge that requires continuous adjustment and optimisation at a scale which is beyond the reasonable capabilities of cloud ops teams attempting to address it with spreadsheets and basic tooling. There are of course tools available to assist with the basic differentiation between CSP offerings, and some that make basic recommendations. However, these are ultimately advisory and lack the deep analytics that such a complex problem requires. With the rapid growth in public cloud adoption, a new breed of vendors is addressing the limitations of currently available tools with technologies that automate the selection and management of cloud resources using analytics and machine learning. Organisations such as Densify are now able to analyse the precise requirements of every aspect of a compute resource and the available cloud-based options to make the correct selection every time. In this way, under- or over-engineering is avoided, while providing better-performing applications at the lowest cost possible. Additionally, many organisations are increasingly using Infrastructure as Code (IaC) tools such as Terraform by HashiCorp to simplify and automate the process of managing and provisioning infrastructure. These tools provide a simple yet powerful way to request cloud infrastructure at the code level, but require developers to hard-code resource requirements or instance selections. This is problematic for a number of reasons. First, developers generally have to guess at what resources to code in, as they simply do not know what the appropriate values to use are. Second, this often leads to increased cost, as there is a tendency to over-specify resource allocations to compensate for the lack of confidence in what an application will actually need. Third, this can lead to performance issues, as the wrong type or family of instance is specified. Finally, hard-coding resource requirements that are wrong is restrictive, as efforts by ops teams to correct the selection are thwarted when apps revert to the hard-coded values the next time that code is run. A better approach is to extend the Continuous Integration/Continuous Delivery (CI/CD) framework with Continuous Optimisation, replacing these hard-coded entries with the selection of infrastructure based on actual application demands on a continuous basis. Optimisation-as-Code (OaC), which goes hand-in-hand with IaC, fully automates the selection, placement and ongoing optimisation of workloads in the cloud. For some businesses, the goal is to run purely on public cloud. However, many organisations want (or have) to adopt a hybrid model to meet their business needs. Cloud management and automation tools must, therefore, fully support a hybrid environment and enable their users to make the appropriate selection on what to run where, and to drive automation and optimisation across multiple platforms.
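The general idea can be sketched without any particular vendor's tooling: instead of hard-coding an instance size, derive it from observed demand and feed the result back into the IaC template's variables. The catalogue, prices and utilisation figures below are invented, and this is not Densify's product or Terraform's own workflow; it is only a rough illustration of continuous optimisation.

```python
# Rough sketch of "optimisation as code": pick an instance type from observed
# demand instead of hard-coding it. The catalogue, prices and utilisation
# figures are invented; this is not any vendor's actual tooling.
CATALOGUE = [
    # (name, vCPUs, memory_GiB, hourly_usd) -- illustrative figures only
    ("small",  2,  4, 0.05),
    ("medium", 4,  8, 0.10),
    ("large",  8, 16, 0.20),
]

def recommend(peak_vcpus, peak_mem_gib, headroom=1.2):
    """Return the cheapest catalogue entry that covers observed peak demand plus headroom."""
    need_cpu, need_mem = peak_vcpus * headroom, peak_mem_gib * headroom
    viable = [c for c in CATALOGUE if c[1] >= need_cpu and c[2] >= need_mem]
    if not viable:
        raise ValueError("observed demand exceeds every catalogue option")
    return min(viable, key=lambda c: c[3])[0]

# The result would be fed into the IaC template as a variable rather than a
# hard-coded value, and recalculated as demand data changes.
print(recommend(peak_vcpus=3.1, peak_mem_gib=6.0))   # -> "medium"
```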
Machine learning-driven cloud automation, migration and management technologies can truly unlock the potential of the cloud, by providing enterprises with fundamental benefits: assisting IT in getting the right application performance, reliability and agility and rewarding the CFO in terms of the cost savings.[/vc_column_text][/vc_column][/vc_row] ### The power and promise of AI in the coming year and beyond [vc_row][vc_column][vc_column_text]In 2018, I posited that artificial intelligence (AI) would become widespread, seeping into our everyday lives. Now, in 2019, we’re starting to see this become reality. Until recently, AI was the domain of large tech companies that employed machine learning to increase efficiency and scale at the back-end. Now, we see more and more business leaders joining the conversation around AI, even if they aren’t yet using it within their own businesses. Companies now realise the benefits of using AI and are seeking to incorporate AI technologies into their mainstream industries. Nowadays, it’s hard to find a company that isn’t at least working on one AI project. As the ability of AI becomes ever closer to that of human employees, it’s interesting to consider what might come next. So far, we’ve seen significant growth in AI around customer support, and that’s likely to intensify this year. In fact, Forbes says that by 2020, 85% of all customer service interactions will take place without a human agent—and most of us are already using these services without realising it. Whether you’re tracking a package, making a reservation for an upcoming holiday, or refilling your prescription, chances are that you have spoken to a robot within the last year. AI and the Customer Customers generally will seek to use their preferred channels, be it mobile messaging, Facebook, Twitter, or e-mail to engage with brands. This means that companies must use these channels as part of their customer service. AI advancements are also happening rapidly in the area of sales productivity. Over the past year, the level at which businesses are utilising AI to grow their business has skyrocketed. It’s become standard for companies to use AI to improve predictive business software and to make more effective decisions. Using heavy duty machine learning analytics as a standard business practice is now widely accepted. Looking even farther down the road, there are those who believe that computers will be just as smart as humans in about two decades. I personally love reading about the subject of singularity and quantum computing. It’s fascinating to hear about its potential. Naturally, one could argue that humans might not want computing to become as smart as us.  We’ve all watched movies centered-around apocalyptic devastation! But, in my opinion, AI stands to improve our lives in ways that we have yet to consider, especially at home. While AI is becoming commonplace in customer service and sales, we are a long way from having a robot cooking us dinner or cleaning our apartments. Nevertheless, if we buy into the idea that robots will do this for us one day, what else can we anticipate? gardening robots? machines delivering food to isolated parts of the world? The possibilities are endless! However, before we see a world in which robots are making our beds, AI must become even more embedded into businesses. The technology is capable of some incredible things. For example, AI has predicted pneumonia better than radiologists at Stanford. Yet, we have been reluctant to trust AI and slow to adapt it. 
AI companies must invest in stronger algorithms, both to prove their own worth and to become widespread to everyday consumers. Businesses need to start investing in this technology now in order to be part of the AI generation. The technology is not only here to stay but is set to drastically transform both business and consumer landscapes in the immediate future.[/vc_column_text][/vc_column][/vc_row] ### How to weather the digital disruption storm [vc_row][vc_column][vc_column_text]Statistics from the European Commission forecast Industry 4.0 will see manufacturing production rise by 20 per cent by 2020, driven by a 50 per cent decrease in downtime. There is no doubt that the age of digital transformation is here, and its influence is growing. Darren Duke, business analyst at leading foam product manufacturer Zotefoams, discusses how manufacturers can harness data to reveal actionable business insights, and how innovative technologies can be deployed to improve modern manufacturing processes. The use of technology does not automatically guarantee growth for manufacturers, but businesses can take advantage of the opportunities it enables if they have the correct mentality to embrace new technologies. Business leaders with an open mind who take time to review all possible technology avenues will be in a better position to make wise investments to gain a competitive edge. And to enable this open mindset, you need to have an open enterprise strategy. As digital disruption continues to gather pace in the industry, manufacturers should be looking at innovative technology applications such as sensors to enable predictive product quality, in addition to the more developed predictive maintenance areas. Every manufacturer will have a differing use case for new technology, but technologies such as Artificial Intelligence (AI) and the Industrial Internet of Things (IIoT) have a wealth of potential and applications. Using technology to predict and prevent An example of AI and real-time monitoring is how they can help act on potential machine failures before the business reports any problems with plant equipment. With large screens in the maintenance workshop, each morning a maintenance team can assess what’s happening, pick up on any potential flaws and take preventative action before a machine stop. This proactive approach is intended to maintain the smooth running of machines and increase overall equipment effectiveness. By using analytic and reporting software, maintenance teams can spot trends or outliers early – enabling them to respond in an agile way. Used correctly, this type of software will aid faster reactions to changing markets and customer needs. Manufacturers should be constantly working to collect information from real-time or near-real-time automated analysis using AI to rapidly identify trends that a human could miss. From the outside in Businesses should also look outside their company as well as inside. In anticipation of market fluctuations, manufacturers should run a regular portfolio management exercise and assess how operational and market changes in the next year could move products around. Tools such as the Boston Matrix, which analyses businesses based on their growth and market share, can weigh up what opportunities or problems will be triggered and prepare for them accordingly. 
The way emerging technology enables today’s manufacturers to pick up social media trends and industry conversations is particularly helpful and can reveal market insights such as unfulfilled customer needs. Take Microsoft Dynamics 365, for example. Applications and widgets within this platform allow organisations to gather information and use data analytics to get an idea of customer sentiment about products, brands and services. Compliments or issues can also be picked up from social media sources, and these capabilities are very effective in helping organisations gain a better understanding of customer expectations. To keep pace with the speed and effects of digital disruption, manufacturers must embrace the transition towards fast data analysis, interaction through social networks and a shift to a light, app-based suite of systems. Younger business system users are typically used to an app-based ecosystem. Modern business management systems are especially valuable in delivering this as they enable businesses to tailor apps to boost productivity and streamline business processes. Customer engagement adds market direction. We must all, as manufacturers, recognise the true value drivers for customers, and then put in place the internal measures to sustain this value. Our technology partner Columbus recently produced a major industry report investigating how manufacturers can effectively prepare for 2020 onwards. Drawing on academia, technical experts and industry leaders, it highlighted the emerging technologies that are enabling manufacturers to develop customer-centric strategies which put the customer at the heart of all business operations. The manufacturer has traditionally had a distant connection with the customer, but a transformation to insight-driven businesses with a better grasp of market direction and developments will move us all much closer to customers and their requirements. This will speed up the movement from risk-averse, capital-intensive industries to more agile, analytical and results-driven businesses with predictive process controls and improvements. Gradual gains with digital disruption. New technologies are helping manufacturers realise the value of products from a customer or end-user perspective and are enabling us to react faster to changes in the market. These technologies allow businesses to limit the time they spend collecting data from a range of ad hoc systems and focus more time on key decision-making. A calculated approach to technology selection and deployment will make the potential of digitised manufacturing operations a reality for businesses of any size.
### Beware the temptation of cloud pick & mix!
Walking past a traditional sweet shop in Las Vegas during AWS re:Invent in November made me reflect on how many organisations still feel the urge to fill their stomachs with the digital equivalent of pick & mix during cloud adoption. Faced with a tempting array of services from cloud providers, it is simply too easy to binge and add them all to a design proposal just because they are available. It’s like filling one of those cinema pick and mix bags to the brim even though you won’t manage to eat all the sweets - and ignoring the damage done to your bank balance and your teeth. The start of a new year, when many of us are trying to adopt a healthier approach to lifestyle and diet, seems a fitting time to propose a more restrained approach to cloud adoption too.
One that doesn’t splurge money on unnecessary cloud services and that has your organisation limbering up for a cloud transformation journey in peak physical health. Here are my tips for resisting the lure of the sweet shop in the clouds and preparing the ground for a smooth cloud adoption programme: Start by creating a technology reference architecture that will keep everyone on task. This is a relatively straightforward activity. Simply list all of your cloud provider’s available services and decide which ones are approved for usage, which ones are contained, and which ones you will leave for future discussions. Plan ahead by having a full and frank discussion with everyone involved to agree on who will do what and secure buy-in. All teams need to adapt for the cloud, and spending time on this (as described in the AWS Cloud Adoption Framework, People Perspective) is essential to a smooth transition. Nominate a subject matter expert as the lead/go-to person for each service area. As the list of cloud services continues to grow, it’s becoming harder for one person to be expected to get their arms around every single service. For example, in the area of machine learning, one team member may know enough from a high level architecture perspective, but as more services are introduced it would likely require a data scientist with specialist knowledge to help determine which services are right for the organisation. Don’t be tempted to get rid of your traditional architecture teams too quickly. Far too frequently organisations let experienced people go, only to realise too late that their expertise and knowledge would have been invaluable during the cloud adoption journey. Decide on your starting point for each service and contain the ones you don’t want, or don’t need to touch immediately. It helps to share details of why particular services are contained or approved to ensure everyone is on the same page. Detail the process for changing any of those initial decisions for clarity and consistency - for example, the required approvals process, who needs to be updated on agreed changes, and how to onboard and test new services. Get buy-in on your initial position from stakeholders, make any required tweaks, and then communicate extensively across your organisation. It helps to include details of how future updates and changes will be managed too, so there is no confusion. Within your agreed reference architecture it pays to include details on specific governance designs and standards or guide rails. For example, if you contain the automated code deployment tool AWS CodeDeploy, then refer to what the standard tooling actually is and reference the internal owner in case anyone has questions. Review your position periodically. When adding new cloud services, be sure to communicate widely the initial status as well as subsequent updates. Adding multi-cloud to the (pick ‘n’) mix While there are definitely well-publicised merits to a multi-cloud approach, go in with your eyes wide open. If your organisation is considering a multi-cloud strategy, create a reference architecture for each cloud platform. And be very clear up front on how you plan to ensure workload portability between cloud platforms as this can be complicated. Splitting your focus, efforts and money between multiple cloud providers should not be underestimated: "Just in case we ever need to," is not a valid reason! It’s also advisable to develop an exit plan for each cloud provider at the reference architecture stage. 
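To make the "approved", "contained" and "future" split concrete, here is a minimal sketch of how a technology reference architecture could be captured in a machine-readable form and versioned alongside other governance documents. The service names, owners and statuses below are purely illustrative assumptions, not recommendations for any particular provider.

```python
from dataclasses import dataclass

@dataclass
class ServiceDecision:
    service: str      # cloud service name
    status: str       # "approved", "contained" or "future"
    owner: str        # internal subject matter expert / go-to person
    rationale: str    # why the decision was made, for transparency

# Illustrative entries only; a real reference architecture would cover every
# service the provider offers and be reviewed periodically with stakeholders.
reference_architecture = [
    ServiceDecision("compute-vm", "approved", "platform-team", "Core workload hosting"),
    ServiceDecision("managed-kubernetes", "approved", "platform-team", "Standard container runtime"),
    ServiceDecision("automated-code-deploy", "contained", "devops-lead", "Internal tooling already standardised"),
    ServiceDecision("ml-model-service", "future", "architecture-board", "Awaiting data science review"),
]

def approved_services(decisions):
    """Return the services teams may use without further approval."""
    return [d.service for d in decisions if d.status == "approved"]

print(approved_services(reference_architecture))
```

Keeping the record in a reviewable format like this also makes it easier to document the approvals process for changing a decision, as suggested above.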
We recommend identifying the correct tooling for each cloud provider's APIs so that you do not inadvertently end up with the lowest common denominator services (typically IaaS). Armed with a comprehensive technology reference architecture document that focuses everyone's minds and is communicated and updated regularly, your organisation can effect a smooth cloud migration. You can be smug knowing you have avoided that bloated feeling that comes from biting off more cloud services than you can chew.

### Chatbots can pose a serious threat if not handled responsibly

Chatbots are quickly becoming a standard method for consumers to interact with a business. Organisations are beginning to identify the benefits of deploying chatbots. They are a revolutionary way in which people, computers and IoT devices can access and update information in real time. Whilst bot technology has been around for a few years now, there are several reasons for the recent uptick in its use.

Chatbots: Language and Communication

First, the core technology that powers chatbots is improving dramatically. This is enabling computers to process language and converse with humans in ways they never could before. Substantial advancements in Artificial Intelligence (AI) and Natural Language Processing (NLP) are making it possible for bots to better understand users' needs and how to meet them.

Secondly, the way we communicate has changed drastically. In-person visits, phone calls and even emails are no longer the primary channels by which companies engage with customers. Social media platforms, such as Facebook, have been early leaders in this new way of communicating. By promoting chatbot technology, developers gain the opportunity to produce a chatbot for their brand's page, which can then interact with consumers and perform specific tasks within the messaging platform.

Controversy and Security

As we move forward in 2018, chatbot integrations will continue to gain popularity and usher in a new era of conversational interfaces. We have already seen many of the major tech companies release their own conversational interfaces, including the Amazon Echo, Samsung's Bixby and Apple's Siri. In each of these, there's no need to click or interact in any other way than using your voice. As chatbot technology continues developing, these conversations will eventually become seamless - so much so that they will be indistinguishable from the conversations that we have with our friends and family. However, it would be foolish to ignore the security issues around bots.

Current bot solutions are not entirely secure and could leave a new door open to attacks from hackers. An insecure bot can give a cyber attacker direct access to an organisation's network, applications and databases. One of the ways to improve this technology is to develop end-to-end encryption, allowing only authorised users to access the information shared. The end-to-end encryption method would reduce the risk of third parties breaking into the system. Chatbots will develop further over time, learning to collect more sensitive and personal information through AI and machine learning capabilities. Therefore, it is crucial that security is strengthened at the foundation level of the platform.

It is especially important for organisations to educate themselves and be aware of the risks when running a chatbot.
They must consider vital factors, including how the information is shared and stored, and most importantly who can access it.

Natural Language Processing and Security Awareness

If companies aren't aware of potential security risks, confidential data will be accessible to any determined hacker. Additionally, attackers may be able to repurpose chatbots to harvest sensitive data from unsuspecting customers. Digital and technology advancements should not, and do not need to, come at the expense of enterprise security. To help prevent attacks originating from chatbot services, it is essential to have a forward-thinking approach. At BOHH Labs, we believe it is critical to ensure that each data request is validated before the bot gains authorised access to a backend system or application. The security service supporting chatbot technology should be able to separate the requestor from the request and securely allow the request to navigate to whatever endpoint is required. The requestor then waits until the response has been collected and checked before it is moved forward and returned to them.

BOHH Labs does this in a number of ways, but mainly with our AI and NLP engines. These manage the data transaction process: the AI engine inspects and cleans any unwanted traffic, while the NLP engine takes the incoming message and determines where to send it. This means the user can use plain text with no command languages. Together, these two technologies can separate, recognise and maintain a secure connection to many different systems and prevent any third parties from trying to hop on the connection and get to the backend database. All of this happens in real time (hundredths of a second), so there is no disruption to the user experience, just the confidence that the transaction or request is secure.

### A brief history of disaster recovery

Driven by high-profile cyber attacks and data losses, disaster recovery has become common parlance for businesses all over the world. But this hasn't always been the case. Disaster recovery has undergone decades of evolution to reach its current state, and there is plenty of development still to come.

1970s

The rise of digital technologies also led to the rise of technological failures. Prior to this, the majority of businesses held paper records, which, although susceptible to fire and theft, didn't depend on reliable IT infrastructure. As businesses began to embrace the mobility and storage benefits of digital tech, they became more aware of the potential disruption caused by technology downtime. The 1970s saw the emergence of the first dedicated disaster recovery firms. These early firms came in three forms: hot, warm and cold sites. Hot sites duplicate a company's entire infrastructure, allowing it to continue working immediately when disaster hits. Understandably, however, these sites are extremely expensive. Warm sites, on the other hand, only allow some of the core processes to be resumed immediately. Cold sites do not allow the immediate resumption of any services, but they do provide an alternative space in the event of a disaster striking the main office.
1980s

Regulations were introduced in the US in 1983 stipulating that national banks must have a testable backup plan. Other industry verticals soon followed suit, driving further growth within disaster recovery businesses.

1990s

The development of three-tier architecture separated data from the application layer and user interface. This made data maintenance and backup a far easier process.

2000s

The 11th September attacks on the World Trade Center had a profound impact on disaster recovery strategy both in the US and abroad. Following the atrocity, businesses placed greater emphasis on being able to react and recover quickly in the event of unexpected disruption. In particular, businesses looked to ensure that their critical processes and external communications could be recovered, both for altruistic and competitive reasons. Server virtualisation made recovery from a disaster a much faster process. With traditional tape systems, complete restoration can take days, but virtualised servers can be restored in a matter of hours because businesses no longer need to rebuild operating systems, servers and applications separately. With server virtualisation, the ability to switch processes to a redundant or standby server when the primary asset fails is also an effective method for mitigating disruption.

2010s

The rise of cloud computing has allowed businesses to outsource their disaster recovery plans, also known as disaster recovery as a service (DRaaS). As with other cloud services, this provides a number of benefits in terms of flexibility, recovery times and cost. DRaaS is also easily scalable should businesses expand, and usually less resource intensive, as the cloud vendor, or MSP, will allocate the IT infrastructure, time and expertise to ensuring your disaster recovery plan is implemented properly. Recovery is no longer about backup and standby servers, but about virtual machines and data sets that may have been replicated within minutes of the production systems and can be running as the live system within minutes. What was once a solution for only the largest organisations with the deepest pockets is now available to all. However, along with improved offerings such as DRaaS, new technologies bring new types of threat. With employees connected to both the internet and corporate systems, companies will see an increase in demands from auditors and insurance companies to protect against developing threats such as ransomware, which is now being targeted at company servers in addition to the desktop. The recognition by organisations of the importance of maintaining business continuity and having a credible disaster recovery plan is reflected in the growth forecast for the sector, with the DRaaS market predicted to be worth $6.4 billion by 2020.

### IoT and the Changing Face of Healthcare

The implementation of the 'Internet of Things' (and the devices consigned to its subcategories) isn't a new idea. Coined as a term in 1999, the technology itself was first envisaged in the 1970s, falling under the classification of pervasive computing. Now gaining significant popularity, industries all over the world are beginning to harness the technology.
Statistics show there are currently 3.6 billion internet users scattered across the globe, and by 2021 it's expected that 28 billion 'things' will be connected to wireless networks and infrastructure, with 16 billion of those classified as IoT. With these stats in mind, experts predict that a large share of these devices will originate in the healthcare industry, where they will be used to supplement the collection of patient information and improvements to treatment. Having already led to huge advancements in the diagnosis of patients in the healthcare sector, the presence of IoT technology has encouraged the introduction of more smart, interconnected devices and sensors to monitor patient symptoms and behaviour. What's more, the data is collected instantaneously and stored in the cloud ready for analysis.

Using IoT in Healthcare

IoT devices have already become widely accepted as a way to personally monitor health. Figures show that there are 23.2 million Fitbit users actively wearing devices which constantly collect data, including the number of steps taken, how long the wearer has slept, and their resting and active heart rate. Following its collection, data is uploaded to a secure cloud server where the user is given immediate access. It's the power of wristbands operating similarly to Fitbits that the healthcare sector has begun to harness. However, to successfully implement the technology and make it completely accessible, medical practitioners suggest there must be an element of standardisation through the early stages of introduction, which in the long term should allow for better scalability and affordability.

Partnered with this IoT transformation is the now-prominent use of Big Data in the healthcare industry. Over the last decade there have been huge advancements in its use, from the prediction of disease outbreaks and the treatment of chronic conditions to the search for cures for diseases such as cancer and the secure storage of electronic health records. Implemented outside of a hospital scenario, interconnected IoT devices that collect and exchange data using embedded sensors, as well as having the ability to speak to users directly, are becoming more common. Acting as a diabetes coach to monitor glucose levels, check heart rates or track dementia patients, the advancing technology in each device allows it to transmit signals directly to a hospital in case of emergency.

When operated inside a hospital, devices offer the increased capacity to monitor a patient's vital signs in real time. This can lead to prompter diagnosis, faster implementation of treatment plans and a speedier discharge. The collected data is analysed using machine learning algorithms, before being further evaluated alongside a database of solutions collated from predictive analysis techniques, with the primary aim of aiding doctors in their diagnosis. IoT devices and the analysis of collected data have also been introduced to help with the care of newborn babies. Wirelessly connecting wristbands worn by the mother and baby to the hospital network keeps both of them safe, and the technology is also currently being integrated into NICU departments to help with the treatment and care of premature babies.
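Whether on the ward or in the NICU, the underlying pattern is similar: device readings are streamed, screened and escalated when they look abnormal. The sketch below is a deliberately simplified illustration of that screening step, with hypothetical patient identifiers and thresholds; a production system would rely on clinically validated models and per-patient baselines rather than fixed limits.

```python
from dataclasses import dataclass

@dataclass
class VitalSignReading:
    patient_id: str
    heart_rate: int     # beats per minute
    spo2: float         # blood oxygen saturation, percent

# Illustrative limits only, not clinical guidance.
HEART_RATE_RANGE = (40, 130)
SPO2_MINIMUM = 92.0

def screen_reading(reading: VitalSignReading) -> list:
    """Return alert messages for any out-of-range vital signs."""
    alerts = []
    low, high = HEART_RATE_RANGE
    if not low <= reading.heart_rate <= high:
        alerts.append(f"{reading.patient_id}: heart rate {reading.heart_rate} bpm out of range")
    if reading.spo2 < SPO2_MINIMUM:
        alerts.append(f"{reading.patient_id}: SpO2 {reading.spo2}% below threshold")
    return alerts

incoming = [
    VitalSignReading("patient-017", heart_rate=72, spo2=97.5),
    VitalSignReading("patient-023", heart_rate=148, spo2=89.0),
]

for reading in incoming:
    for alert in screen_reading(reading):
        print(alert)  # in practice this would notify the care team, not print
```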
Following the introduction of the IoT and Big Data to a medical scenario, the next step should be to hire a Data Scientist who will have experience managing large unstructured datasets and be confident in using non-conventional data management tools. The primary objective of this professional is to identify patterns and trends running through the data that, when compiled in a database, should aid with the diagnosis of patients. When selecting a Data Scientist, especially for a role in healthcare, knowledge of subject matter is preferential. The use of Electronic Health Records is becoming more prominent and is changing the insight offered by modern medicine, thus leading to improved treatments. But it's Data Scientists who are needed to decipher the information in order for that to occur. IoT Architecture and Structure For the implementation of IoT to be undertaken successfully in the healthcare industry, devices need to be structured around specific network architecture to ensure that every piece of technology is introduced and maintained to a high standard, which should theoretically lead to zero breakdowns in communication between the network and devices. The first aspect of the architecture to be introduced regarding IoT is device management. Normally falling under the remit of a member of the Security and IT team, it is the role of this individual to ensure security protocols are followed during the introduction process of new devices and maintained during their lifespan. Next on the architectural structure list is infrastructure management. Often overlapping with device management processes, the speed in which devices are connected is key to how effectively they are implemented. Owing to their computational limits reducing their speed and memory, numerous IoT devices are modified, with a boosted connection to the Cloud to increase their productivity. The primary aim when introducing IoT devices into a network is to harness the power of data, and this can be positively achieved by incorporating data management as an architectural structure. Using the skills of a data scientist to leverage all of the data available, it offers the chance to uncover dark data, allowing for further insights and the positive upkeep of all collected data. Risks and Security There is an element of risk associated with implementing any new technology or devices into a new environment, and many of those centre around security. Throughout the introduction process of IoT devices, questions will arise regarding the secure storage of data collected from ‘things’, how to keep it safe from cyber attacks, and ultimately, how networks can be designed and monitored to stop breaches. Before the introduction of IoT devices, especially in healthcare, it has become common practice to hire professionals with experience of monitoring large networks and technical infrastructure. Working alongside the Data Scientist, who will identify threats to a system, a DevOps engineer will have the know-how to develop, incorporate and update network security protocols, to stop threats and add an extra layer of protection to the collected data. When connected to a network, the data obtained from IoT devices is compiled and stored in a hybrid cloud environment, a system which combines a private onsite cloud server with a public cloud service such as AWS. 
Coupling devices to this infrastructure, where they can securely deposit data, will first ensure they comply with network security protocols, and will also allow not only the data but also the network itself to be constantly monitored. However, it isn't only the system that is susceptible to risks; the data collected through each device is also a prime target for hackers looking to infiltrate a healthcare system to obtain patient information. Preventing network breaches is a key part of the role of a DevOps engineer, who must ensure that data is kept completely safe from outside sources. From the first use of a device, engineers are able to enforce encryption on all collected information, which can then only be decrypted by a member of the data team. Ultimately, furthering the use of the Internet of Things in the healthcare industry to improve treatment, develop cures and monitor health records can only stand to improve performance overall. But in order to make the transformation successful, infrastructure and security should be the top priorities for providers looking to make the change.

### Invisible discrimination: How artificial intelligence risks perpetuating historical prejudices

This year has seen a wave of new research into the unintended consequences of an Artificial Intelligence industry dominated by middle-class white men, teaching itself with unchecked and unregulated data sources. New research published by the AI Now Institute at New York University in April 2019 concluded that AI is in the midst of a 'diversity crisis' that needs urgent attention: "Recent studies found only 18% of authors at leading AI conferences are women, and more than 80% of AI professors are men. This disparity is extreme in the AI industry: Women comprise only 15% of AI research staff at Facebook and 10% at Google […] For black workers, the picture is even worse. For example, only 2.5% of Google's workforce (in AI) is black, while Facebook and Microsoft are each at 4%"

But the level of gender and racial representation of those working in the industry is only part of the challenge - there is an increasing body of evidence that suggests AI algorithms themselves are unhealthily influenced by the discrimination and biases of the past. Let's take three examples. Firstly, Amazon had to take an automated recruitment robot out of service after it was found to be favouring male CVs over female for technical jobs. Secondly, Google had to adjust an algorithm that was defaulting its translations to the masculine pronoun. Our third example adds racial rather than gender bias: Joy Buolamwini, a researcher from the Massachusetts Institute of Technology, found that a facial analysis tool sold by Amazon would not recognise her unless she held up a white mask to her face.

An AI algorithm will diligently go about its task of recognising patterns in data more efficiently than any human - but the old adage "garbage in, garbage out" still applies in the digital world. Or, to update it clumsily for the 21st century: "if the data set contains biases, so will the conclusions"! Our recruitment tool example likely learned from a dataset of successful CVs for top engineers: it identified patterns, and used those patterns to make recommendations.
But if over the past few decades we have seen more men working in technology, studying technology in higher education, and applying for engineering positions at the likes of Amazon, we should not be surprised that the data set had this bias built in. AI translation tools will favour the male pronoun if the data set they feed from does the same; and facial recognition technology fed millions of images of white faces will, guess what, become more adept at identifying those faces than the faces of other races.

So, what to do about this? Firstly, of course, we need to push for a more diverse AI industry in terms of gender and race, but we must recognise that even reaching a balance more reflective of society does not go far enough. We need to take steps now to either correct our data sets and/or engineer specifically to correct the bias of the past. By definition we cannot now identify every possible negative unintended consequence of unchecked AI - but we can make a very good start! It would not have taken much foresight in our recruitment bot example to see the risk and engineer for it. Coding specific controls into AI recommendations or manually changing data sets may seem counterintuitive or even undesirable, but I would argue that positive discrimination, when applied selectively, can be an absolute force for good - whether in the boardroom or deep in an AI algorithm. Even the random play function on your iPhone had to be specifically coded to be less random, so it could appear more random to users. I am calling for a variation on that theme: to engineer into our AI correctives for the human failures of the past.

As to how we do this - there have been calls for increased regulation and government policy in the area of artificial intelligence, but whilst that may be part of a solution, I worry about further unintended consequences of a rapidly growing industry with a global "war for talent" being stifled in regions where heavy regulation is applied. It is not in the interest of any of the companies involved in our three examples above to allow these issues to go without correction; I would like to think we are collectively intelligent enough to correct our simulated intelligence without regulation telling us how to do it. So, for the time being at least, I would prefer we push in parallel for greater equality in tech companies across senior management and AI engineering, and for more public awareness and dialogue around these issues in the tech community. We all need to be cognisant of the risk of allowing the AI algorithms that increasingly blend into our day-to-day lives to propagate historic biases into our future.

### Cloud Vendor Assessments | Done The Right Way

In security circles, there is a famous saying: "trust but verify". These are wise words; yet we still find most security professionals conducting incomplete third-party assessments, leaving out the "verify" aspect. Having sat on both sides of the fence, conducting cloud vendor assessments and filling out questionnaires required by potential customers, it's become apparent that some put very little effort into this process, so it feels like a mere "tick-box" exercise. It begs the question: if it's just a checkbox, then why waste everyone's time? More aptly, when little effort is put into questionnaires, it can seem like the individual works for a low-trust organisation or simply doesn't understand how to verify trust.
Therefore, it’s time for organisations to consider changing the process from something that has become all but meaningless to a constructive way to assess cloud providers and the value they will bring to the company. Admittedly, there is a market for companies to outsource third party risk assessments and a market for risk rating reports on vendors; however, in full disclosure, most are misleading. Companies in truth don’t need to hire a third-party company to conduct the cloud vendor risk assessment and they certainly don’t need a generalised risk rating of an overall cloud company.   So how do organisations know they can trust a cloud vendor? The very first step is to understand the business requirements: what is the business wanting to do with the cloud vendor? What data is involved in this business process? Has the business looked at other vendors? If so, which ones?   Do your homework Once these business requirements are factored and the path to selecting the vendor is chosen, it is advisable to go to the vendor’s website to read its privacy policy. The first question that needs answering is who owns the data? Next, go to the compliance page and get a copy of the vendor’s SOC2 report. The Service Organization Control (SOC) 2 examination demonstrates that an independent accounting and auditing firm has reviewed and examined an organisation’s control objectives and activities and tested those controls to ensure that they are operating effectively. There are five trust principles and the SOC2 report will reflect which trust principles were tested. There are two types of SOC 2 reports: Type I and Type II. The Type I report is issued to organisations that have audited controls in place but have not yet audited the effectiveness of the controls over a period of time. The Type II report is issued to organisations that have audited controls in place and the effectiveness of the controls have been audited over a specified period of time. If the cloud vendor has a SOC2 Type 2 and/or other certifications, companies should ask themselves if they really need them to fill out a lengthy security questionnaire? The answer is no. In fact, to receive the answer “refer to SOC2 or refer to AOC, etc.” is perfectly acceptable in these instances. Then, if questions still remain as to the verification of trust, even after reading the findings of those certifications, send the vendor any other queries that matter to the business. Note: if a potential customer is interested in the vendor’s PCI certification then sending the question, “Do you conduct vulnerability scans?” is a clear indicator that the business obviously doesn’t understand the PCI requirements, so be sure to send only the questions that will help verify that trust.   Red Flags Buyer beware: if the vendor states it has a certification and sends an AWS certification, that is a BIG RED FLAG. In fact, run! The certifications to look out for are those that the vendor has itself achieved, not their vendor. As with all cloud vendors, there is a shared responsibility with security and compliance. In this example, when evaluating the cloud vendor, the company seeking to verify trust will be looking at the cloud vendor’s own controls and responsibilities, and not AWS’ certifications.   What to do if there are no certifications What if the vendor doesn’t have any certifications? No problem; that’s where the lengthy questionnaire is relevant. The Vendor Security Alliance (VSA) has a great questionnaire that is free to download. 
If business requirements include data privacy, then it will be necessary to add some questions to the VSA's questionnaire. In addition, when trying to verify the trust of a vendor without any certifications, first ask what security/compliance framework it follows. If, for example, the answer is PCI, then give the vendor a test: how often does it scan for vulnerabilities? If it states annually, then this vendor obviously does not follow the PCI framework! Remember, the job of organisations is to assess the risk and relay that back to the business. If the business still wants to move forward with a high-risk vendor, then the business owner didn't understand the risk, and the discussion can then move on to consider compensating controls. However, once that path is travelled, the business owner usually instructs the team to look for other cloud vendors. If the business still insists on using the vendor, then ensure a termination clause is put into the contract terms. For example: "Termination due to Change in Security. If Vendor determines that it will not move forward with SOC 2 Type II certification, and/or no longer performs quarterly security scans, then Vendor must notify Customer. At the time of notification, Customer will be granted the opportunity to exit this agreement. In addition, Vendor must notify Customer in the event its security controls do not meet SOC 2 Type II trust principles. Vendor will not materially decrease the overall security of the Service during a subscription term."

With a little guidance and common sense, the vendor assessment journey needn't be a laborious task, and it can save the time and trouble in the long run associated with chopping and changing vendors that turn out to be riskier than deemed acceptable.

### 2020 - the year of cloud

It is clear that cloud computing was one of the key innovations which defined the 2010s. The general adoption of the cloud was preceded by several events, starting with the planning of an "intergalactic computer network" by U.S. government scientist J.C.R. Licklider in the 1960s. In 2006, cloud computing entered the mainstream technology arena when AWS announced the launch of its Elastic Compute Cloud. However, it wasn't until the 2010s that cloud computing really started to lead the IT revolution by fully transforming the way computers and software operate. Cloud was adopted by leading brands and made cloud-native companies like Amazon, Netflix and Facebook the largest digital businesses in the world. In 2010, the cloud computing market was further established when technology giants Microsoft, Google and Amazon Web Services launched their cloud divisions. In the same year OpenStack, a leading open-source cloud software platform, was established, cementing the rise of cloud-enabled technologies. The definition of cloud computing itself has changed dramatically over the years, with the fast growth of edge and hybrid cloud adoption. Although these are still considered early-stage technologies, it's expected that in the coming years the development standard for both enterprise and consumer applications will shift from cloud-enabled to cloud-native. According to Statista, public cloud spending increased roughly fivefold over the decade, starting at $77 billion and predicted to reach $411 billion by 2020.

Enterprises will complete the cloud migration

Despite the anticipation around what cloud computing can do for organisations and how it can underpin modern business infrastructure, a lot of businesses are yet to make the leap.
According to Forrester, less than half of all enterprises use a public cloud platform right now. However, recent research by 451 Research demonstrated that it is, perhaps surprisingly, the financial services industry that is leading the adoption of cloud technologies. With increasing competition brought by cloud-native disruptors in the banking arena, 60 percent of financial services companies surveyed reported that implementing cloud technology will be a business priority from next year. Furthermore, a recent McKinsey survey outlined a big challenge among enterprises in completely migrating their operations to the cloud: there is a big gap between the IT leaders who have migrated over 50 percent of their workload to the cloud and those trailing behind with less than 5 percent.

One of the main reasons for slower cloud adoption is security. According to a study by LogicMonitor, two-thirds of IT professionals state that security is the key concern in cloud migration. In the coming years, improving security will be the key cloud industry objective. With new solutions for compliance and data control in place, companies who haven't moved to the cloud will have less reason not to. Indeed, solutions that answer questions around data control, compliance and user security will be the driving force behind businesses adopting cloud solutions. Solving these questions around security, rather than compute requirements, will be the biggest reason for enterprise uptake. Security responsibilities rest mostly with customers, and more of them are using cloud visibility and control tools to lower the chances of security failures and breaches. Machine learning, predictive analysis and artificial intelligence are set to provide new levels of security and accelerate the number of large-scale, highly distributed deployments. While no computing environment can guarantee complete and flawless security, moving towards 2020 more businesses will likely feel safer working with the cloud than they have over the past decade.

Edge computing will reimagine the cloud

Cloud computing is usually viewed as centralised data centres running thousands of physical servers. However, this perception misses one of the biggest opportunities brought by cloud computing - distributed cloud infrastructure. As businesses require near-instant access to data and computing resources to serve customers, they are increasingly looking to edge computing. Edge computing directs specific processes away from centralised data centres to points in the network close to users, devices and sensors. It was described by IDC as a "mesh network of microdata centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet". Edge computing is essential for the Internet of Things (IoT), which requires large amounts of data to be collected and processed in real time with low latency. Edge computing will help IoT systems lower connectivity costs by sending only the most important information to the cloud, as opposed to raw streams of sensor data. For example, a utility with sensors on field equipment can analyse and filter the data prior to sending it, rather than taxing network and computing resources. Edge computing is not the final stage of cloud computing, but rather a stage in its evolution that is gaining fast adoption across industries.
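As a rough sketch of that filtering idea, the example below condenses a window of raw sensor readings into a small summary at the edge and forwards only that summary, plus any anomalous values, to the central cloud. The readings, field names and anomaly rule are hypothetical and purely illustrative.

```python
import json
import statistics

# Hypothetical raw readings from a field sensor, sampled every few seconds.
raw_readings = [50.1, 50.2, 49.9, 50.0, 63.7, 50.1, 50.2]

def summarise_at_edge(readings, anomaly_threshold=2.0):
    """Reduce a window of raw readings to a compact summary plus anomalies.

    Only this summary leaves the edge site, rather than the raw stream,
    which is where the connectivity saving comes from.
    """
    mean = statistics.mean(readings)
    stdev = statistics.pstdev(readings)
    anomalies = [r for r in readings
                 if stdev and abs(r - mean) / stdev > anomaly_threshold]
    return {
        "sample_count": len(readings),
        "mean": round(mean, 2),
        "max": max(readings),
        "anomalies": anomalies,
    }

payload = summarise_at_edge(raw_readings)
print(json.dumps(payload))  # the only message sent on to the central cloud
```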
Widespread containerisation

Moving into 2020, we'll continue to see growing adoption of containers - the technology enabling developers to manage and migrate software code to the cloud. Recent research by Forrester estimates that a third of enterprises are already testing containers for use in software development, while 451 Research forecasts that the container market will reach $2.7 billion in 2020, with annual growth increasing to 40 per cent. According to a Cloud Foundry report, more than 50 per cent of organisations are already testing or using containers in development or production. For businesses using multi-cloud infrastructure, containers enable portability between AWS, Azure and Google Cloud, and help adjust DevOps strategies to speed up software production. By using operating-system-level virtualisation rather than hardware virtualisation, Kubernetes is becoming the biggest trend in container deployment. It is clear that containers will be less a buzzword and more a widespread development standard as the decade turns.

Serverless gains momentum

"Serverless computing" is a misleading term in some sense, as applications still run on servers. However, with serverless computing, a cloud provider manages code execution only when required and charges only while the code is running. With this technology, businesses no longer have to worry about provisioning and maintaining servers when putting code into production. Serverless computing gained mainstream popularity back in 2014, when AWS unveiled Lambda at its re:Invent conference, and got further traction recently as AWS announced its open source project Firecracker. Serverless computing is predicted to be one of the biggest developments in the cloud space. Still, not everyone is ready for it. Moving to serverless infrastructure requires an overhaul of the traditional development and production paradigm, meaning outsourcing the entire infrastructure. Serverless computing will not be an overnight sensation. Instead, it will be adopted and developed together with a growing number of use cases. Current solutions usually lock customers into a specific cloud provider, but the arrival of open source solutions in this space will enable a wider portfolio of serverless computing implementations across the industry.

Open source is more relevant than ever

Open source software is the most popular it's ever been. An increasing number of organisations are integrating open source software into their IT operations or even building entire businesses around it. Black Duck Software surveyed IT decision makers and identified that 60 percent of respondents used open source software in their organisations. More than half of the businesses surveyed reported contributing to open source projects. The cloud provides an ecosystem for open source to thrive. The large number of open source DevOps automation tools and infrastructure platforms, such as OpenStack and Kubernetes, is supporting the growing adoption of open source. As organisations continue to migrate their operations to the cloud, open source technologies will boost innovation beyond 2020. With the current development landscape, it is clear that the key momentum for cloud computing is still yet to come.

### Digital Transformation: Lost in Translation?

I've worked in the world of tech for over two decades now and the only constant is that our industry loves three things in equal measure: a buzzword, a silver bullet and an acronym.
The problem with buzzwords is that the term quickly loses its meaning, and with it the true potential of the technology. Look at the word - and its various iterations - innovation. If you aren't innovating then you're not an innovator, and everyone these days is innovative. But are they? If everyone is innovative, then who are the true pioneers? In this situation, no-one is a winner. Innovators become a big, indistinguishable, homogenous mass.

The term digital transformation is in danger of suffering a similar fate (a problem for someone with it in their job title, like me). From farming through to pharma, every business is reportedly in the midst of a digital transformation. The risk here is that digital transformation is becoming a catch-all term, applied to all forms of 'digital' technologies. For example, farmers in the US tapping into and benefitting from the Farmers Business Network - described by the Financial Times as 'Google for farmers' - are individuals using digital technologies to transform their business. But it is not digital transformation. You could argue that the difference is akin to splitting hairs. And perhaps you'd be right, but there is a difference - and however nuanced that difference is, it's a distinct one. By bandying digital transformation around, the risk is that we devalue it because people no longer understand what it really means and the potential it can deliver. That's especially true of the c-suite.

Defining digital transformation

So, let's take a step back. What do we actually mean by digital transformation? Rather unhelpfully, it can mean a lot of different things to lots of different people. For some it's all about ripping out legacy infrastructure and replacing it with cloud-based platforms, whereas for other businesses it's about a shift in mindset and culture. I'd argue it's a case of all of the above. Digital transformation, for my money, goes way beyond ripping out legacy IT. I've seen many long-winded explanations of digital transformation, but to me success comes down to the tried and tested triangle of people, processes and technology - one is not optimised without the other two. And it's the same for digital transformation:

People - tech has to be intuitive, generate an engaging user experience and, above all, make the job easier. If it's not, employees will find a workaround. That's what makes culture so critically important - digital transformation is a journey. People need to understand the end destination, why it's important, how you're going to get there and the role that they have to play.

Processes - without the right processes in place, the tech won't work. You've got to understand the dependencies between platforms, servers, apps, data and delivery channels. There is no point in having a shiny self-service app if the back end isn't supported by well mapped out processes.

Technology - this is really important because all too often blueprints for business transformation projects are rooted in the enterprise companies of today - not the businesses they need to be tomorrow or in ten years' time - so they are anchored by the technical debt of the past. Tech is evolving at an incredible pace, so it's vital that the foundations laid now can embrace and support new technologies as they emerge.
Bringing all of this together means that every second of a customer journey is reflected in real time in your systems, allowing you to become 100% customer centric - that's true digital transformation in my book. Driving and underpinning all of this is, of course, data. Information and insights are the lifeblood of any company, big or small. How you harness them in order to become more efficient, more relevant, more agile and more competitive is key. Data is the glue that binds people, processes and technology. How it flows between people, fuels processes and is used within technology platforms is the bedrock of digital transformation success. Data feeds the senses of your organisation; without it, your systems are blind, deaf and dumb.

Equally, it's important to realise that digital transformation is not an out-of-the-box solution. There is no one-size-fits-all. Every single company is unique, and therefore so are its requirements. For one business it could be moving to mobile and internet-based channels; for another it could be using cloud platforms to transform its supply chain. Digital transformation has the power and potential to change a business - to make it more adaptive, competitive and efficient - but only if it is rooted in sound business strategy. Therefore, we need to showcase what it can achieve and let those achievements speak for themselves rather than adding superlatives and hype. Why? Because the hype doesn't do the tech industry - or the very businesses we're looking to help - any favours at all. It creates confusion and eye rolling, and business leaders switch off. And in doing so they could be facing their own Kodak moment, opening the door to a competitor or inadvertently sidelining themselves in the marketplace. Let's be open and honest about what digital transformation is (or should I say DX), and what it is not, because if we are then we can reinvigorate a fatigued C-suite and give companies the athleticism they need to compete in a fierce and unpredictable world economy.

### The Importance of Artificial Intelligence in Space Technology

Artificial intelligence has transformed many areas of our daily lives. From healthcare to transport, tasks usually carried out by humans are now being performed by computers or robots, more quickly and efficiently. What's more, AI is adapting. It's now bringing us closer and closer to the stars, with numerous benefits. Here's a closer look at the importance of artificial intelligence in space technology.

Why Do We Need AI?

Artificial intelligence allows computer systems to mimic the intellectual processes of humans. However, these machines can carry out tasks more dexterously and efficiently than people. They can also enter hazardous environments, such as areas that require deep sea diving, making certain processes much safer. In short: AI is improving the way we work. It's not about replacing humans, but rather changing our workplaces for the better. For these reasons, it's currently being utilised by a variety of industries, including:

Healthcare

AI can mimic cognitive functions. This makes it a perfect fit within healthcare, where many issues need addressing to make treatments faster, more effective and more affordable. One example of this is using artificial intelligence to build databases of drugs and medical conditions. This can help us find cures or treatments for rare diseases. AI is also being utilised to make healthcare more affordable around the world.
The machine can mimic a doctor's brain, recognising how humans express their ailments.

Transport

Artificial intelligence is greatly improving the efficiency of the transport sector. It can help optimise routes, finding the fastest and safest journeys for different vehicles.

Manufacturing

Adopting AI in different factories is helping to: make certain processes safer by automating them; improve engineering efficiency; reduce costs; increase revenue; and plan supply and demand.

The Value of AI in Space

Where does space come in? Artificial intelligence has enormous promise within satellite and space technology.

Global Navigation

Data collected by Global Navigation Satellite Systems (GNSS) can support AI applications. Tracing, tracking, positioning and logistics are all areas that can be greatly improved thanks to precise and consistent data collection.

Earth Observation

Satellite imagery in conjunction with AI can be used to monitor a number of different places, from urban to hazardous areas. This can help improve urban planning, finding the best places for development. It can also help discover new routes, allowing us to get around more quickly and efficiently. Satellites can also observe areas of interest, such as deforestation, and the data collected can help researchers monitor and look after these areas. As well as this, satellites can be deployed to monitor hazardous environments, such as nuclear sites, without people having to enter them. Data collected by space technology and processed by AI machines is fed back to the industries that require it, allowing them to proceed accordingly.

Communications

Satellites can transfer important information to AI machines, providing reliable and important communication. This can be used for traffic needs, for example. Satellites can collect data on congestion or accidents and feed it back to the machines. Artificial intelligence can then be used to find alternative courses, rerouting or diverting traffic where necessary.

What Does This Mean?

We haven't yet scratched the surface of unlocking the full potential of AI. This presents many exciting business opportunities for innovators to take advantage of, whether in exploring the technical viability of new applications or exploiting new space assets. Artificial intelligence in space technology can greatly improve our daily lives, from inside our workplaces to how we get around.

### The future of hybrid cloud

Cloud computing has provided, and will continue to provide, businesses with the flexibility and efficiencies that they need to grow and meet the demands of all users. It provides the necessary infrastructure, software and platforms to meet the requirements of all manner of organisations. But the question remains as to which model is best — public or private cloud? Hybrid cloud is positioned to address this argument as it allows users to get the best of both worlds. Strictly speaking, the term hybrid refers to creating something new by combining two different elements. In the case of cloud, the premise remains the same, but there is no all-encompassing definition — it means different things to different people. However, the right mix — whether that's heritage on-premises hosting and colocation, or leveraging services within dynamic public cloud — will be different for each and every business.
Many of the initial barriers around adoption of hybrid cloud, such as security and costs, have been addressed and businesses are now looking to get the most from their investment. But, what is on the horizon and how is cloud adoption going to occur in the future? What makes hybrid cloud an effective choice? Hybrid cloud can be used in many ways. It’s a good platform for building confidence in cloud, developing a transformation programme on and carrying it out. In the same vein, it may be a deployment model that must be used for compliance, regulatory, risk, latency or data sovereignty issues. Regardless of intent, the first and potentially the most important step for any organisation is to define their cloud aspirations. This can be done in several ways, including conducting a cloud readiness assessment, which determines an organisation’s current state the steps needed to start the journey. Only then can a business formulate a cloud adoption transformation programme. What are the public cloud vendors doing? The public cloud market is dominated by a few large hyperscale players but what are they doing to pave the way for a hybrid future? Microsoft Azure In 2017 Microsoft launched Azure Stack as the answer to the hybrid cloud conundrum. Azure Stack is an extension of Azure, allowing users to run an Azure environment from within their own premises or third-party datacentre. The benefits include pay-as-you-go pricing, and the ability to develop applications in-house in an Azure environment that’s the same as public Azure. In a nutshell, users can create a private cloud with public cloud capabilities. Amazon Web Services (AWS) AWS has been supplying large-scale public cloud to the market for more than 10 years. In 2016 it partnered with VMware for the purpose of allowing VMware’s customers to burst into the cloud from their virtualised environments. But AWS has since taken the move to hybrid further with the recent announcement around EC2 being able to run on-premises with the Snowball Edge device, a hard drive for moving workloads between AWS cloud and client datacentres. Additionally, in 2018, the company took things even further by announcing AWS Outposts. This service consists of fully managed and configurable compute and storage racks built on AWS hardware. It enables customers to run compute and storage on premises, while also easily connecting to the rest of the company’s cloud services. Google Cloud Platform Launched in 2008, the Google Cloud Platform provides users with a suite of modular cloud services. Built on the legacy of its search engine and email platform, Google Cloud made its official foray into the hybrid space earlier this year. The company extended its container and microservices technologies (like Kubernetes) from being used on the Google Cloud Platform, to being used on in-house servers or edge devices. More recently, Google has announced Anthos, its hybrid cloud management product which allows enterprises to use a single dashboard to manage all applications, regardless of whether they are in a private data centre, Google Cloud, AWS or Azure. Looking to the future As vendors continue to develop their services and barriers disappear, hybrid cloud adoption is only going to increase. In fact, Gartner suggests that by 2020, 90% of businesses will have adopted a hybrid infrastructure. Technology will also continue to play a leading factor in the evolution of hybrid cloud. 
The incorporation of automation, machine learning and artificial intelligence into cloud platforms will influence not only the deployment of new technologies and services but also the way in which the environment is managed and maintained. Vendor collaboration will also feature strongly in the hybrid future. We're already seeing this happen; just consider cloud vendor partner programmes such as the Cloud Solution Provider program from Microsoft or the AWS Partner Network. Regardless of which model or vendor organisations choose to adopt, one thing is for sure: the future of hybrid cloud will most likely be focused around outcomes. For organisations, this means effectively measuring the benefit realisation of hybrid cloud and ensuring the organisation is on track in terms of costs and objectives. This is where the support of the right cloud provider will be critical. Only with the support of the right partner can businesses optimise their cloud investment and achieve the desired outcomes in an effective, secure and compliant way.

### Improbable grows to a complete cloud-based multiplayer games tech company, expands existing partnership with Google Cloud

Google Cloud today announced an extended partnership with Improbable, the multiplayer game specialists and creators of SpatialOS, the unique cloud-based platform for the development and management of any kind of online multiplayer game. Building on an existing strategic relationship, this extended partnership will see Improbable and Google Cloud work together to improve solutions and reduce cost and risk for multiplayer game developers. The partners will also work together to help Google Cloud customers making multiplayer games explore how SpatialOS can reduce their risk and speed up gameplay iteration, testing and improvement, even for challenging and innovative game designs. Since announcing its first partnership with Google Cloud in 2016, Improbable has grown from around 100 employees to more than 500, and raised more than $550m in funding. In 2019 Improbable announced the opening of Improbable Game Studios, with three wholly-owned game studios based in the UK, the US and Canada working on games using SpatialOS.

Developers using SpatialOS build games using their preferred combination of both Improbable and third-party tools and engines. These games are then run on the cloud, using SpatialOS' hosting, online services and multiserver networking layer. This enables developers to reduce their project's risk by iterating on gameplay faster, designing and experimenting more flexibly, and launching with a managed solution for online games. SpatialOS leverages the effectively limitless computational power of the cloud by allowing game worlds and gameplay systems to be seamlessly distributed across multiple cores and servers. Using this multiserver architecture, developers can rapidly build, test, debug, launch, redesign and scale multiplayer games - from action-packed, session-based Battle Royale experiences to huge, experimental virtual worlds shared by thousands of players at once. Improbable uses Google Cloud to provide the computational power underlying the SpatialOS platform, providing cost and efficiency benefits to developers:

Increasing reliability and reducing risk: SpatialOS integrates closely with Google Compute Engine, gaining the advantage of GCE's rapid provisioning of virtual machines for multiserver game development and operating environments.
This empowers developers to make games available for playtesting quickly and iterate rapidly based on feedback. Google Cloud Platform's fast networking I/O and in-memory storage enable stable, low-latency gameplay even in fast-moving games running on SpatialOS. Google Cloud Storage and Cloud CDN also enable rapid downloading of assets into the game world and redundant storage of game snapshots. A shared commitment to open development: Improbable uses and contributes to key Google Cloud Platform open-source projects, including gRPC to coordinate SpatialOS' microservices and Kubernetes for container management, allowing games to scale rapidly and efficiently with demand. Matching players quickly and accurately: Improbable is adding support for Open Match, an open-source customizable matchmaking framework co-created by Google Cloud and Unity. Adding Open Match as an option alongside SpatialOS' native matchmaking services increases the options available to game developers for matching players of equivalent skills. Increasing developers' range of integrated tools helps them to build great games with the tools of their choice. Improving operational efficiencies: Improbable leverages Google Drive and Google's office applications to help run the company's internal operations. It also uses BigQuery to underpin its game analytics service, allowing game studios to track events within games and generate insights. Improbable is also working with Google in other gaming-related areas, such as a launch partnership with the Google Stadia gaming service, announced at GDC in March 2019. The multiserver game world management and networking of SpatialOS and the streaming technology of Stadia are complementary technologies, using the cloud to deliver new gaming experiences without computational or hardware limitations. "We were initially attracted to Google Cloud for the quality of its network and infrastructure, and through the course of our partnership have seen that our values and our visions for games are aligned," said Lincoln Wallen, CTO, Improbable. "Whether we are supporting game developers or building our own games, Google Cloud ensures scalability and access to computational power, while empowering our team to deliver better experiences to SpatialOS users." "Improbable has moved from proving its technology to supporting and simplifying development across the range of online multiplayer games, and Google Cloud is proud to be a part of that ongoing journey," said Thomas Remy, UKI Head of Telecommunications, Media & Entertainment, Google Cloud. "Every day, our cloud technology enables Improbable to more efficiently scale their technology, drive innovation in game development and build and test the next generation of cloud-based games." ### E-commerce: Speed and scalability is everything There are many factors to consider when choosing an e-commerce platform, but two of the most important are speed and scalability. The reasons are simple: online shoppers expect a fast online experience, and you need to be able to effortlessly scale your operation as it grows. It might seem an obvious point, but slow e-commerce operations are bad for business, and it's not only about customer frustration. When websites are slow, every business metric suffers, from page views and bounce rates to conversions, revenue and customer churn.
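Those effects are easy to start quantifying. The snippet below is a minimal sketch of a response-time check against the roughly two-second expectation discussed below; the URL is a placeholder, and a full audit would also cover rendering and asset loading rather than just server response time.

```python
# Minimal response-time probe for a storefront; URL and threshold are illustrative.
import statistics
import time
import requests

URL = "https://shop.example.com/"   # placeholder store front
SAMPLES = 5
THRESHOLD_SECONDS = 2.0

timings = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    requests.get(URL, timeout=10)
    timings.append(time.perf_counter() - start)

median = statistics.median(timings)
print(f"median={median:.2f}s worst={max(timings):.2f}s")
if median > THRESHOLD_SECONDS:
    print("Warning: median response time exceeds the two-second expectation")
```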
Customers expect blistering performance There's plenty of research to show that customers expect a website to load lightning fast, typically within two seconds, and the majority who are dissatisfied with website performance are less likely to make a repeat purchase. You might think the answer to slow speed lies in faster networks and devices, but this is not true. Yes, faster networks can increase performance, and devices galvanised with the latest and greatest processors can also make a difference. However, if the e-commerce platform isn't optimised then you certainly won't benefit from great speed; indeed 'crash and burn' may become your favoured expression for its faltering performance. The world's most popular Magento is the world's most popular e-commerce platform, and with good reason. It is built specifically for e-commerce performance and scalability; it's easy to add third-party services such as all-important payment gateways, database applications and shipping and shipment-tracking modules; and it's very SEO friendly. Importantly, it also supports mobile commerce. If it's not implemented properly, however, you won't get the maximum benefit. Magento is a very feature-rich and comprehensive system, and it needs the appropriate server resources to power it. Along with selecting the appropriate server, you also need to ensure you select the appropriate partner. Collaborating with a Magento-certified hosting partner will ensure you get the most out of your e-commerce platform. Hosting providers with expert experience in Magento have teams dedicated to advising and ensuring your site is built for optimum performance. Cogeco Peer 1 is exceptional in its technical support, expertise and the technology solutions offered. Does your platform scale? You also need to consider scalability by looking ahead at how your e-commerce platform will develop in the coming years. Does your platform scale well? E-commerce sales consistently rise year on year, and if your site follows this trend, you'll need to integrate other systems at some point in the future. Depending on your needs this could be anything from accounting, fulfilment and warehouse management to CRM or ERP systems. Of course you may already have considered this and have a good idea of how to distribute the transaction processing load across multiple servers to maintain performance as site traffic grows in volume. Limitations? Have you considered whether there are limits on the number of products you can offer, or whether the number of items a visitor can buy at any one time is restricted? Are you also familiar with bandwidth limitations, or whether the e-commerce platform stores inventory information within the program or connects to commercial databases such as SQL Server or Oracle to gain larger storage capacities? Given that speed and scalability are the most important features for a successful e-commerce operation, they are points that should be uppermost in your mind. ### Three Key Personas in Cloud Deployment As technology continues to reshape the business landscape and intensify competition, many businesses have turned to hybrid – or even multi – cloud deployment as a way of keeping up with innovation and staying agile in today's saturated marketplace.
The benefits of this shift in strategy are obvious – cloud technologies offer an array of opportunities that were not previously possible; from enabling unlimited scalability at lower costs, to facilitating rapid service delivery by improving the efficiency of internal infrastructures.   Yet, for all the benefits that the cloud has to offer, stakeholders can jump the gun in deploying their cloud strategies; often entirely overlooking the data that is actually housed in the cloud. This can lead to bigger challenges later down the line, particularly for the various business units who all have competing needs and must upload, access, and analyse different types of data. As the volume of data continues to multiply at an exponential rate – and will inevitably challenge a one-dimensional cloud strategy – integrating proper data management is the key to empowering organisations to stay agile and maintain an unbeatable competitive advantage.   With time of the essence in today’s fast-evolving market, organisations must understand how interconnected data and cloud strategies are to a building a cohesive and successful business strategy. Given the multi-faceted requirements of different business functions, it is critical for enterprises to understand how they can adapt their cloud strategies and integrate data management to serve the three key personas in cloud deployment, all of whom will play central roles in driving growth.   The IT Decision Maker More often than not, the two key focus areas of the IT decision maker are cost savings and operational efficiency. Cloud technologies offer huge, interconnected benefits for both aspects – for example, it helps reduce spending on data centres and hardware, in turn freeing up more capital to be reallocated to new business opportunities and other growth initiatives.   Despite all its potential, however, some decision makers are finding that their hybrid or multi-cloud deployments are actually creating more challenges than opportunities. As the volume of data continues to rise, regulatory compliance requirements multiply, and external vendor partnerships pose new risks, these new factors are racking up unmanageable and unanticipated costs. This has led to some organisations to move their data facilities back on-prem, leaving their initial investments in the cloud to go to waste.    A solid, comprehensive data management solution can help IT decision makers effectively determine and anticipate current and future cloud deployment costs. By implementing a set of strong data management principles and best practices, IT decision makers can classify and contextualise their vast repository of data and gain a single, 360-degree view of their enterprise. This allows them to evaluate the types of workloads that work best on the cloud, and the kinds of workloads that work better on-prem; thereby facilitating cost-productivity comparisons and preventing wasted capital on inefficient cloud deployments.    The Developer  Central to a developer’s role is the ability to continuously develop, test, and deliver high-quality applications in a rapid and agile manner. However, with the recent proliferation of cloud services such as hybrid cloud, developers are now faced with the time-consuming inefficiency of having to rewrite and re-upload the same code several times for different cloud vendors; causing backlogs in their everyday workloads.  
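Containerisation, discussed next, is the usual answer to this duplication. The sketch below shows the complementary habit that makes a container portable in the first place: keeping every cloud-specific setting out of the code and in the environment, so the same image can be pointed at on-premises, AWS or Azure backends without a rewrite. The variable names and defaults are purely illustrative.

```python
# Illustrative twelve-factor-style service: all environment-specific settings come
# from environment variables, so one container image runs unchanged on any cloud.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

DATABASE_URL = os.environ.get("DATABASE_URL", "postgresql://localhost/dev")
OBJECT_STORE = os.environ.get("OBJECT_STORE_ENDPOINT", "http://localhost:9000")

class Health(BaseHTTPRequestHandler):
    def do_GET(self):
        # Report which backends this instance has been wired to at deploy time.
        body = f"db={DATABASE_URL} store={OBJECT_STORE}".encode()
        self.send_response(200)
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    port = int(os.environ.get("PORT", "8080"))
    HTTPServer(("0.0.0.0", port), Health).serve_forever()
```

At deploy time each cloud's orchestrator injects its own values, which is what lets a single packaged artefact travel between platforms.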
Modern data management solutions often provide containerisation facilities, which allow developers to bundle an application and all its libraries, files, and code into one package. In doing so, the challenge of having to cater to various cloud vendors is eliminated, as developers can simply write their code once and instantly deploy it to multiple clouds. The key benefit here is flexibility: developers can transition and move applications, data, and workloads not only between on-prem and cloud platforms, but also between various cloud platforms themselves. This, of course, presents significant competitive advantages in easing developers' workloads and helping them become more efficient, agile, and productive. Containerising apps also helps mitigate the issue of 'vendor lock-in' that tends to arise with native cloud apps. Once developers write code for a certain cloud vendor and upload all their data and apps onto the system, switching to, or adding, another cloud vendor becomes very expensive and difficult. In its ability to support multiple clouds, however, containerisation prevents developers from being chained to a single cloud vendor, whilst simultaneously helping the IT decision maker leverage arbitrage between different vendors and ensure the right level of associated costs. The Big Data Operator Last, but certainly not least, is the big data operator. To keep up with the fast pace of today's competitive market, big data operators need a data management solution that provides them with a high level of self-service and instant access to the right analytics tools, without having to handle cloud deployment themselves. Without these, big data operators will find that the information they need is scattered across multiple clouds and on-prem. They are forced to spend hours wading through the murky depths of unclassified data, leading to imminent setbacks in productivity and service delivery. A modern data management platform that offers point solutions will allow big data operators to instantly spin up their desired workload on any cloud for any specific use case. These types of on-demand data tools enable them to quickly move from one task to another and ultimately improve their efficiency, without getting bogged down in cloud deployment. Furthermore, robust data management also provides big data operators with a single source of truth that can be applied to stacks running both on-prem and in various clouds. With the GDPR now in effect, implementing a strong, single set of security and data governance policies for all systems is key to not only staying compliant, but also avoiding the huge overhead costs that result from writing multiple copies for each platform. Capitalising on the potential of cloud technology requires more than its isolated adoption. To maintain a forward-thinking business model that maximises the benefits of the cloud, organisations must first anticipate and serve the needs of their key stakeholders through robust data management. Only with these strong data practices in place can cloud deployments successfully take off and empower digital transformation across the business. ### Redefining the Hospitality Industry with Internet of Things With almost all industries showing interest in introducing Internet of Things (IoT) solutions to their workplaces and work processes to automate operations and make them more productive, the hospitality business is no longer an exception.
With hospitality IoT solutions, hotels can provide a more comfortable and personalized stay for their guests. Let us look at how IoT solutions can help hotels improve the customer experience. Customer Satisfaction Hotels interact directly with their customers, known as guests in hotel-industry terminology. Unlike in other industries, guest feedback in the hotel industry does not have to travel far to reach the service provider; it arrives immediately. And what makes guests' feedback positive is the quality of service a hotel offers them. If guests are satisfied, they will naturally provide positive feedback that helps future customers choose a particular hotel. Traditional approaches say little about how to satisfy a customer, but IoT does. An IoT solution for the hospitality industry can help a hotel understand customers' past patterns and behaviors, and suggest customized services that are far more likely to satisfy them. With IoT, hotels can bring in a variety of automated solutions, from booking a room to checking out, to smartphone-based controls for amenities, appliances and much more. Rooms can be equipped with multiple responsive IoT systems, apps, and devices that communicate with each other to deliver an optimized and personalized hotel experience to guests. Enhanced, personalized customer experience Every business gains a significant competitive edge when it personalizes the customer experience, and the same is true for hotels. Guests feel good when they are greeted automatically on entering their rooms, or when the lights switch on simply by sensing their presence. Guests enjoy this level of personalization when rooms understand their preferences and open the curtains, adjust the temperature, play music, and so on. By offering this level of personalization, hotels win customers' loyalty and recommendations. IoT also makes hotels more innovative, modern and comfortable, attracting more customers. Reducing operational costs It takes a lot of spending to maintain and provide a high-quality guest experience, but hotels have to keep their expenditure under control to maintain their profit margins. From the energy consumed by lighting and other appliances to the people serving guests, hotels have to reduce operational costs to stay profitable, and that is hard to manage without the support of technology. IoT can provide the best possible automation for a wide variety of hotel operations and reduce costs. Predictive/preventive maintenance Hotels deliver their services with the support of a large number of appliances, and they run into trouble when those appliances suddenly stop working. Guests express their dissatisfaction with the hotel's service and, as a result, give low ratings; low ratings then affect future customers' decisions to visit the hotel. Because customers pay to get the maximum out of a limited stay, a hotel's problems are of no concern to them. How does this type of situation arise? It happens because something stops working in a guest room and the hotel has no idea it was about to fail. But what if the hotel already knew that a machine or appliance needed maintenance or service? This is exactly what an IoT-driven predictive/preventive maintenance approach makes possible.
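A minimal sketch of what such a predictive check might look like is shown below, assuming a hypothetical stream of temperature readings from an in-room appliance; the threshold, window and readings are all invented, and a production system would lean on the analytics built into an IoT platform rather than a hand-rolled rule.

```python
# Toy predictive/preventive maintenance rule: flag an appliance whose recent
# average temperature drifts well above its normal baseline. All values invented.
from statistics import mean

WARNING_THRESHOLD = 4.0   # degrees above the appliance's normal operating temperature (illustrative)
WINDOW = 12               # number of recent readings to average

def needs_service(readings, baseline):
    recent = readings[-WINDOW:]
    return len(recent) == WINDOW and mean(recent) - baseline > WARNING_THRESHOLD

# Example: readings arriving every five minutes from room 204's air-conditioning unit.
baseline_temp = 21.0
readings = [21.2, 21.0, 21.4, 22.1, 23.5, 25.0, 27.2, 28.9, 29.5, 30.1, 30.6, 31.0]
if needs_service(readings, baseline_temp):
    print("Room 204: schedule maintenance before the unit fails")
```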
Many businesses in other industries are already using IoT for predictive/preventive maintenance. Hotels can use it too, keeping tabs on all the appliances and machines that contribute to delivering a highly satisfying service to customers. Location-based Information Bluetooth, GPS and beacon technologies can also be used to create an IoT environment that delivers location-based information to hotel customers. Many hotels are already using these technologies, and blends of them, to send messages to customers at the moment they need them most. This can take many forms; for example, sending notifications about discounts or offers when customers are close to a hotel, or providing updated information about local transport links or nearby attractions. Businesses can also merge IoT and geo-location data to better optimize staffing. Electronic key cards Most hotels use a typical lock-and-key system to open rooms, but IoT can make this more sophisticated with wireless, keyless door entry controlled via a mobile app. Hotels can send an electronic key card directly to a guest's mobile phone. Taking this technology a step further, hotels could send an electronic key to a guest over the internet before check-in time and sync it with the check-in desk. Guests would no longer need to wait at the check-in desk: with a key already on their phones, they could bypass the whole check-in process and go straight to their rooms. Voice-controlled customer service Last but not least, a hotel can also implement voice-controlled customer service to deliver a superior customer experience. A voice-controlled room assistant, for example, can allow a customer to request room service, book a table or book a spa session just by saying so to the device in his or her room. Hotels such as Best Western, Marriott, and Wynn are early adopters of this sort of service. ### Using Unified Communications to Scale Globally Over the past decade the business landscape has become increasingly digital. The evolution of mobile technology and cloud software has been driving forward this digital transformation, leading to businesses reacting quicker and enterprises moving at breakneck speed to adopt the latest technologies that enable them to remain agile, adaptive and profitable. As a result, this fast-paced, technology-centric world has led to a surge in competition amongst businesses, with spectators and critics looking to see who will be the first to stumble. Yet, whilst competition has increased, opportunities have grown tenfold, with more and more organisations now able to operate on a global level; a feat that was previously achievable only by a few. Effective communication sits at the heart of every growing business, and rapid advancements in intelligent communication systems have kickstarted globalisation across many industries. In order for businesses to scale globally, decision makers must ensure they have the tools in place to unify their networks and enable rapid and productive interactions with customers, partners, colleagues and prospects.
This has led to many businesses investing heavily in sophisticated Unified Communications (UC) solutions to streamline information sharing, video conferencing, collaboration and drive company-wide productivity across large,increasingly scattered and mobile workforces. In the past, businesses would rely on a handful of communication solutions that would allow them to host conference calls and video chats. However, the development of cloud technology and faster internet speeds now means that organisations are able to bridge the gaps between web conferencing solutions to create one frictionless network. Uniting a global workforce In order for organisations to effectively operate in today’s fast-moving environment, the demand for seamless connection is greater  than ever. The world has evolved in such a way that people are used to ongoing connectivity and this also applies for a business that is trying to operate across multiple time zones and split teams. What’s more, without the correct provisions in place, managing a global workforce can also be both daunting and challenging at times. However, UC solutions are able to help businesses overcome these struggles, securely bringing employees together and helping to drive communication and collaboration regardless of geographies. Flexible hours and remote working has become commonplace for today’s businesses, with more employees choosing to work for companies that offer these options.  One challenge for these remote workers is often feeling alienated or removed from being part of the overall company culture. Yet, with solutions aimed at connecting employees globally, businesses can reassure that all staff feel integral to the organisation no matter their location. Enabling rapid communications One of the main challenges for global businesses is the ability to remain connected in order to provide faster responses to customer queries or collaborate on projects with colleagues remotely and securely. Due to the fast acting nature of business, the ability to respond to a customer email more efficiently or provide feedback quicker may well be the difference between profits earned or profits lost. Modern businesses rely on constant communication to succeed. Whether this be communication with customers, employees or line managers, the digital nature of business has meant that all staff need to be armed with an array of tools that can allow rapid communications across multiple devices and channels. For example, one web conference can be arranged for a strategic executive meeting whilst an online planner can be used to book the meeting space with your attendance confirmed via an instant message. In short, businesses should utilise various communication channels to ensure they remain as efficient as possible. In the past, companies were often reliant on expensive hardware to enable access to these channels, but now cost-efficient, web-based solutions remove the need to roll out these devices and instead make the most of existing devices currently used to enable a constant cycle of communication at scale. Productive staff are happy staff A successful business is the result of productive and happy staff. This rings true no matter the region a company operates within, but can present a challenge for businesses operating globally and those having to combat timezones that might cause unnecessary delays. For example, if working on a project with a scattered or mobile workforce that requires each member to contribute could be problematic. 
However, with UC, employees are able to collaborate effectively no matter their location or time zone, making teamwork a much smoother process. Having the tools to access, edit and share files whilst also providing feedback means that employees no longer feel restricted to working from one location, or have their calendars inundated with face-to-face team meetings to discuss ongoing projects. In addition to enabling staff to work remotely, being able to communicate and collaborate via multiple channels means that checking emails out of office hours can be kept to a minimum or completely eradicated. Being able to send quick updates or share files via mobile or instant message means that employees no longer need to be sat in front of their desktop when home with their families, facilitating that much-needed enhanced work/life balance. Enhanced connectivity leading to a new era of UC Unified communications has evolved exponentially over the years, with new tools and functionalities constantly added to drive business performance, boost employee productivity and provide a better working experience for staff members at all levels of an organisation. Its ability to kickstart business growth on a global scale has inevitably caused UC to experience significant growth, with the number of Unified Communications as a Service (UCaaS) users now surpassing 43 million and tipped to grow by 23% each year until 2023. As the technology continues to establish itself within the business world, we will undoubtedly start seeing it integrated with an organisation's legacy systems, such as ERPs, CRMs and supply chain management tools. In addition, artificial intelligence will continue driving forward team collaboration and enabling seamless communication across an organisation. What's more, with 5G technology on the horizon, we can expect mobile technology to play an even larger role in the field of UC. With faster mobile connectivity, we can expect better unified communication for remote and mobile users. For example, mobile video is largely dependent on a cellular connection when not in range of WiFi. However, with some locations still receiving poor signals, this should be less problematic once 5G is available, leading to better connectivity and communication no matter how remote the location may be. With enterprise technology constantly evolving, businesses need to adapt to this change and implement solutions and tools that enable them to remain competitive and scale globally. The key to this is giving employees the ability to remain connected, communicate efficiently and work on projects seamlessly by removing geographical limitations. In today's digital environment, the ability to work anytime, anywhere is no longer seen as optional, but as a necessity for any business looking to grow at scale and maintain a fulfilled workforce. ### The Cloud Journey – A Guide To Migrating Databases To Cloud Cloud adoption continues to boom – as companies seek to meet growing business demands, they are increasingly exploring the power of the cloud and adopting cloud-based architectures. Today, cloud computing has become an integral part of the 'digital revolution' of businesses, and it is estimated that cloud services spending will reach $160 billion by the end of this year.
From mobile working to file back up and disaster recovery, cloud adoption offers a multitude of modernisation opportunities, including the opportunity to modernize legacy databases and migrate databases to the cloud. In fact, due to the advantages of moving to cloud-based database services, 63 per cent of database administrator surveyed for the 2017 IOUG Database Replication Survey, said they plan to migrate to the cloud within the next two to three years. The advantages of moving databases to the cloud It is true that a large portion of IT is moving to the cloud, but companies following suit are not just jumping on the bandwagon solely for the reason of following an IT trend. When it comes to moving databases to the cloud, what is propelling this movement are the key advantages that the cloud offers to organisations: lower costs, flexibility, reliability, and security.  When adopting cloud computing, the first priority for the business is often the possibility of lower maintenance costs, and this holds true when it comes to database migration to the cloud. By migrating to the cloud, businesses are able to permanently eliminate a large proportion of capital expenses for hardware and software. Not only that, but the organisation will be free from the operating expenses of installing, maintaining, updating, patching, and retiring databases without the additional administration overhead. Another benefit of working in the cloud is the flexibility it offers. For example, when managers want to stand up a development environment with a database, they can create it immediately, get their developers working on it in no time and scale it up and down. In turn, this scalability ensures that needs and resources are closely matched. Building on the advantages of lower costs and flexibility, migration to the cloud also offers reliability and redundancy as cloud providers employ great numbers of administrators to run data centres. These administrators ensure that there is no single point of failure, creating a reliable service for the organisation operating in the cloud. Additionally, although moving data out of a business’ on-site facilities and entrusting valuable data to other companies and people may seem dangerous to those with traditional views on cloud and security, in fact, the opposite holds true. Security in the cloud can often be stronger than security on premises. Why? Data is the lifeblood for cloud providers, and it is one of their top priorities to keep it safe. Therefore, providers have armies of experts in security who ensure data kept on the cloud is safe from threats – from tracking security bulletins to undergoing white-hat penetration testing on their own servers for security assurance, cloud providers are dedicated to keeping your data safe, behind a virtual barbed wire fence. There are few companies who would have the resources or technical depth for the same dedication to cloud security.  With the benefits of database migration to the cloud in mind, it is still important to remember that moving databases to the cloud is a complex process done in multiple phases. It is crucial that the business factors in the possible challenges of the process when making the decision to migrate. The threat of downtime When migrating databases to the cloud, the threat of downtime is the number one barrier to be overcome. 
On a migration project for a multi-terabyte CRM database for example, sysadmins usually are required to stop all user input, export the existing database and import it to the new database in the cloud, a process which could take several days to complete. This is why the business must ask the question: ‘Can we afford downtime, and how long can we afford it for?’. Fortunately, whilst a certain period of downtime is unavoidable, it is possible for the period to last less than an hour with adequate planning and risk management. To achieve the short downtime goal, dedicated database replication and migration tools can work to ensure that once the source and target databases are coordinated and database replication has begun, users can go seamlessly from working on the old database to working on the new cloud-based one. It is also important to keep the existing system’s data accessible while database administrators are setting up the new system. With the downtime challenge overcome, it is clear that when it comes to migrating databases to the cloud, the benefits far outweigh the challenges. Once a business makes the decision to migrate databases to the cloud, the next stage is making the move itself. How to successfully migrate a database to the cloud There are a few different approaches to moving databases to the cloud. For example, an organisation may choose to develop brand-new applications using databases in the cloud without migrating old databases at all. In this case, as queries on historical data could arise, on-premises storage remains and as do some of associated hardware and software costs. Alternatively, an organisation can decide to switch an entire on-premises database to a database in the cloud suddenly, over a weekend. However, this is a high-risk approach which increases the possibility of extended downtime. With these two approaches in mind, it is crucial to remember cloud migration is a journey and not a destination, especially when starting the database migration process. Organisations are better advised to approach database migration by identifying low-impact tables and schemas such as development, QA databases or use cases like data integration, disaster recovery and offloaded reporting that require data availability but do not interfere with application uptime. Alternatively, organisations can replicate data from an on-premises source database to a target database in the cloud. Whichever path to migrating to the cloud an organisation chooses; it is vital to remember that planning is key. Adequate planning of database migration will ensure a successful migration with little to no workflow disruption. Users should be able to execute tasks like reporting, querying and analysis throughout the process and applications running on those databases before, during or after migration should not be affected– this makes for a successful database migration. Start the journey As cloud adoption continues to grow, it has become the norm rather than the exception, and offers a wide range of advantages compared to traditional infrastructures, especially in the case of databases. 
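To make the replication-style approach described above more concrete, the sketch below shows only the initial bulk-load stage, using local SQLite files as stand-ins for the on-premises source and the cloud target. Real migration tooling would also capture ongoing changes so users can keep working while the copy runs, and every table name and connection here is hypothetical.

```python
# Minimal batch copy from a stand-in source database to a stand-in cloud target.
import sqlite3

BATCH = 500
source = sqlite3.connect("onprem_crm.db")   # placeholder for the on-premises database
target = sqlite3.connect("cloud_crm.db")    # placeholder for the cloud-hosted target

# Seed the stand-in source so the example is runnable end to end.
source.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT)")
source.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)",
                   [(1, "Acme Ltd", "EMEA"), (2, "Globex", "US")])
source.commit()

target.execute("CREATE TABLE IF NOT EXISTS customers (id INTEGER PRIMARY KEY, name TEXT, region TEXT)")

cursor = source.execute("SELECT id, name, region FROM customers ORDER BY id")
while True:
    rows = cursor.fetchmany(BATCH)
    if not rows:
        break
    target.executemany("INSERT OR REPLACE INTO customers VALUES (?, ?, ?)", rows)
    target.commit()   # commit per batch so progress survives an interruption

print("Initial load complete; cut over once change replication has caught up")
```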
With lower costs, flexibility, reliability, and better security than on-site database sources, organisations yet to take the first step into cloud computing should start today. ### ThousandEyes Introduces Collective View of Global Internet Performance Delivers ground-breaking visibility into Internet outages and the scope of business impact by harnessing real-time, collective performance data across thousands of service providers ThousandEyes, the Internet and cloud intelligence company, today unveiled Internet Insights™, the only collectively powered view of global Internet performance. Internet Insights™ analyzes common points across billions of service delivery paths in real time to identify where business-impacting Internet outages are occurring. These insights enable enterprises and service providers to immediately see issues that directly or indirectly impact their users, accurately measure their scope, accelerate the remediation process and confidently communicate what's causing service issues to their customers and employees. Internet Insights™ leverages the aggregated, de-identified Internet telemetry data from those billions of ongoing measurements to provide a macro view of global internet outages that goes beyond the monitoring scope of any individual organization. Issues detected within ThousandEyes service-level tests are automatically linked to macro views within Internet Insights™ that display the scope of impact to users and services, as well as the severity and duration of the event within the provider network. A historical timeline of availability incidents within Internet Insights™ also gives businesses a record of availability issues across service providers, helping them enforce their vendor SLAs and make informed service provider choices. "For digital businesses, being able to effectively manage their service delivery across a vast Internet that's made up of countless independent service providers is essential to their ability to generate revenue and protect brand reputation, as today's users expect applications and sites to be reachable and high-performing at all times," said Mohit Lad, co-founder and CEO, ThousandEyes. "By leveraging the collective intelligence of every test that's running on the ThousandEyes platform at any point in time, we're giving enterprises and service providers the indisputable evidence they need to proactively improve the quality of their services, allowing them to provide a superior experience for their customers and employees." "However resilient the Internet may be, at the end of the day it's still rife with problems that have material impacts on businesses that rely on it existentially for their day-to-day operations," said Shamus McGillicuddy, research director at EMA Research. "Internet Insights™ gives the collective whole a clear advantage in managing what is inherently a collective problem. Ultimately, faster remediation of service outages improves the overall quality and performance of the global Internet, making worldwide connectivity more reliable than ever."
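The collective principle is simple to illustrate, even though the product itself aggregates billions of measurements. The toy sketch below is not the ThousandEyes API; it just shows how reports from several independent vantage points, all of them invented here, make it possible to distinguish a provider-wide outage from a purely local problem.

```python
# Toy aggregation of reachability reports from independent vantage points.
# Names, results and the "widespread" threshold are all illustrative.
from collections import defaultdict

# Hypothetical probe results: (vantage point, provider, reachable?)
reports = [
    ("london-office", "transit-provider-a", False),
    ("frankfurt-dc", "transit-provider-a", False),
    ("new-york-agent", "transit-provider-a", False),
    ("london-office", "dns-provider-b", True),
    ("frankfurt-dc", "dns-provider-b", False),
]

failures = defaultdict(set)
for vantage, provider, reachable in reports:
    if not reachable:
        failures[provider].add(vantage)

for provider, vantages in failures.items():
    scope = "widespread outage" if len(vantages) >= 3 else "possibly local issue"
    print(f"{provider}: unreachable from {len(vantages)} vantage point(s) -> {scope}")
```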
Since ThousandEyes launched publicly in 2013, the world's biggest brands, including 75+ of the Fortune 500, 140+ of the Global 2000, 20 of the top 25 SaaS providers and hundreds of other enterprises, have made ThousandEyes critical to their daily operations, gaining end-to-end visibility into the service delivery paths of not only their own applications and services, but also the applications and services they critically rely on to run their businesses. Today, ThousandEyes is measuring more than 8 billion service paths per day, and more than 33 million network traces are collected per hour. These numbers are doubling every six months. "Whenever a customer is having an issue with our service, even if we can definitively say the issue isn't in our code or in our direct internet service delivery path, it's still our problem because it's affecting our customer's experience," said David Mann, technology officer at a global education company. "That's why it's important to be able to see beyond our own service delivery paths and gain intelligence from the collective view of what's happening on the global Internet, so we can help our customers pinpoint where the issue is, even when it's not ours. We would never be able to do that without a collective intelligence solution like Internet Insights." "Being able to look beyond our own view of the Internet to instantly verify if the issue is unique to us, or if it's a larger issue impacting others, is incredibly useful information to have when an application performance issue is impacting our customers or employees," said Oleg Onatzevitch, vice president in the enterprise network services group at one of the largest multinational investment banks. "That data allows me to immediately escalate the conversation with the service provider that's causing the issue, because I can show them it's not just us; their issue is impacting several other major businesses too. It helps them effectively prioritize issue remediation, and it helps us confidently communicate what's going on to our customers, which is a win-win for everyone." ### Why the Cloud is Transforming Customer Experience Enda Kenneally, VP of Business Development at West's Unified Communications Services, explains why cloud-based platforms are shaping the contact centre industry of the future. Two years ago, West published UK market research which predicted an explosion in cloud adoption in the contact centre, driven by dissatisfaction with legacy equipment, costly upgrades, technical limitations, long deployment times, and resource-hungry integration projects. This year West carried out a new research project, and the results confirm that the 2015 predictions were accurate. According to the 'State of Customer Experience 2017' report, 39 per cent of contact centres have already migrated to the Cloud, and a further 53 per cent are planning a move to the Cloud within the next three years. It looks like the Cloud has become the infrastructure of choice for those trying to improve their contact centre by streamlining the process of providing modern and personalised services to customers, which in turn improves customer experience and increases customer loyalty and sales. But what makes the Cloud the best option? Lower costs and better experiences, the key drivers There is no doubt that, as the 2015 research predicted, cost is an important driver of cloud migration.
When West asked those who have taken the plunge what they see as the biggest benefits of the Cloud, the top three were highlighted as speed of deployment, cost savings from flexible licensing models and reduced maintenance costs, side by side with access to a more advanced feature set. Cloud-based contact centres offer a more cost-effective business model, and this has a strong appeal for organisations. However, there is another, more powerful reason for the cloud contact centre revolution, which is highlighted by our new study. Consumer behaviours and demands have changed and will continue to evolve, and the customer experience has taken centre stage when it comes to brand competition in many organisations. In fact, a recent Walker study found that customer experience will overtake price and product as the key brand differentiator by 2020. As consumers embrace technology as an integral part of both their personal and professional lives, workforces become more mobile, and businesses place more of a focus on increased productivity, we see people's behaviours changing. Things move faster, both at home and at work, and we are all getting used to instant responses, instant access to information and instant gratification. This extends to the customer experience; we don't want to wait to resolve our query. As customers, we expect brands to go out of their way to help us and make us happy. It is no longer acceptable to be placed in a phone queue and to repeat our issue when finally connected with an agent. We want immediate solutions, wherever we are and on our platform of choice. Some will still like to make a phone call, but others will want digital channels. In both cases, the expectation is that we'll receive a tailored experience. Delivering an effortless experience will improve customer loyalty and see sales soar for organisations that get it right, boosting their bottom line as a result. Therefore, it is not surprising that most firms are racing to find a formula that can deliver the experience their customers deserve and demand. It is clear that legacy systems can't adapt fast enough to meet these business needs. Legacy systems can no longer add value West's research reveals that legacy systems are one of the biggest roadblocks for organisations trying to deliver a satisfactory customer experience. According to the survey, the limitations of traditional systems are a significant obstacle when it comes to providing seamless customer experiences across both digital and voice channels. More than half (53%) of customer experience professionals agree that their contact centre would not meet their own needs if they were to get in touch as a customer. This less than ideal result is partly due to old systems' inability to modernise at a pace that is fast enough to keep up with evolving customer expectations. Traditional call centres were built around the needs of voice communication, and many contact centres have struggled to add and effectively manage new channels. Customers expect a fast resolution of their enquiry whether they contact an organisation via email, web chat or an SMS message. In fact, 88 percent of contact centre professionals in our study said they expected digital interactions to overtake voice by 2020.
At the time of the survey, only 20 percent of respondents said their contact centre has a web self-service capability, and just 29 percent of professionals surveyed strongly agreed their contact centre could deliver seamless customer experiences across multiple channels. The challenge is that perceived costs and difficulty associated with integrating technologies are standing in the way of positive moves to provide a better customer experience, one that breaks down barriers to customer communication rather than creates them. Issues such as this are leading decision makers to adopt cloud platforms, many of which have been designed for multichannel communication. If contact centre decision makers fail to take action, customer defections, demotivated customer service agents and spiralling operational costs will become the norm, leaving the field wide open for new disruptive business models to steal market share. The situation could only become acuter with time. Owning the customer experience Contact centres should be at the core of the customer experience. According to the research, 88 percent of customer experience professionals agreed or strongly agreed the contact centre can play a fundamental role in defining and proactively managing the customer journey between channels. Even more, interestingly, 88 per cent also agreed that digital channels open up new opportunities for contact centres to own the customer experience. So, if there is so much potential for contact centres to have a business-wide impact, why aren’t they? Because only by improving and modernising the technology used in their contact centres will businesses truly see a significant enhancement of the customer experience, and subsequently, of their bottom line. If your customers are frustrated every time they make a query, or contact your organisation, their satisfaction will never reach levels required to earn long-term loyalty. Ensuring that contact centres are up-to-date and able to cope with the demands of the new digital-savvy consumer is essential. The Cloud has proven to be the best-suited infrastructure to meet the needs to the modern contact centre, and the industry has already started to recognise this. Top three tips for migrating to the cloud While moving to the Cloud is the preferred option for most organisations now, it is important to ensure the migration works for both the business, your employees and your customers.  Below are some tips to help you move your contact centre to the cloud and improve your customer experience. Adopt a phased approach Many people don’t realise that moving to the Cloud doesn’t necessarily mean completely ripping and replacing all of the existing on-premises equipment. Particularly in the contact centre, there are some applications that can be deployed to run alongside existing systems. For example, call routeing software, customer relationship management systems, or workforce management software are just a few of the many applications that can be integrated with legacy equipment. This approach is a good way of easing the transition, and it gives managers an opportunity to demonstrate the benefits of cloud-based solutions at a smaller scale. A hybrid solution is a great first step towards a full cloud-based system. The key is to migrate gradually. Make the most of SaaS Cloud-based contact centre solutions are a software as a service (SaaS). 
This means that in addition to the software, customers receive added consultancy, training and help with the solution from day one of the project. In the traditional model of suppliers selling hardware to contact centres, there was often a gap in the process. Once the hardware and the licences were sold, sometimes with a little consultancy on top, it was a done and closed deal. This led to many contact centre professionals admitting that they felt suppliers did not understand their business. For a long time, suppliers were only around if there was an issue or if a contract was due for renewal. From a customer's perspective, the feeling was that suppliers were constantly trying to get extra money. With the SaaS model, however, the relationship between provider and customer is ongoing. When you get a cloud-based solution, service and support are part of the package. Make sure you don't pick a technology partner who'll pitch you, then ditch you. Find one that will understand your business, offer you the support you need, and work collaboratively with you to ensure you get the best out of the systems you are using. Consider the impact of a cloud solution for your entire organisation This is perhaps the most valuable piece of advice. If you make the right choice, your cloud contact centre provider will give you a communications platform for the entire organisation. As industries evolve, companies will need a platform that allows them to react quickly to the needs of customers, partners and suppliers, and offer them multi-channel support. If your cloud contact centre provider has delivered a robust solution, they could become your partner at a broader level. ### Bandwidth and Cloud-Based CCTV | Low bandwidth is no barrier Technologies such as telephones and television have changed out of all recognition in recent years. Both are now highly internet connected, can be personalised through a mix of apps and settings, and have uses that extend far beyond original expectations. In contrast, many of the CCTV systems in use today are relatively untouched by these technological advances. Bandwidth also has a role to play in all this. The time is right for CCTV to reinvent itself and enable business users to take advantage of the accessibility, capacity and scalability that cloud provides. However, one thing that prevents businesses from moving to cloud-based CCTV is concern about not having enough bandwidth. If their premises are in a rural area, where fixed broadband is not available, can the mobile phone network cope? If they rely on an ADSL connection, will that provide enough bandwidth? Fortunately, the idea that lower bandwidth connections cannot support cloud-based CCTV is something of an urban myth, and the answer to these questions is almost certainly 'yes'. Businesses should bear in mind that data security is paramount and ensure that their data is encrypted before it is transmitted to the cloud, as recommended by the ICO. Reducing the amount of data sent Streaming visual data to the cloud 24 hours a day is impractical for the vast majority of businesses. It is an unnecessary and expensive use of both bandwidth and storage capacity to save visual data that shows nothing useful. Additionally, the General Data Protection Regulation (GDPR) calls for data minimisation – only capturing and storing data that has a valid use. An effective approach is to configure cameras to respond to movement triggers or to PIR sensors so that they capture only what is interesting and useful.
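As a rough sketch of how these ideas combine, the snippet below gates a clip on a motion or PIR trigger and a per-camera schedule, and encrypts it before anything leaves the premises. The schedule, camera name and key handling are illustrative placeholders rather than a description of any particular product.

```python
# Illustrative data-minimisation gate: only motion-triggered clips captured within
# the camera's schedule are encrypted and queued for the cloud.
from datetime import datetime, time
from cryptography.fernet import Fernet

SCHEDULE = {"reception-cam": (time(18, 0), time(7, 0))}   # monitor out of hours only
key = Fernet.generate_key()     # in practice, key management would follow the ICO guidance noted above
cipher = Fernet(key)

def within_schedule(camera, now):
    start, end = SCHEDULE[camera]
    t = now.time()
    if start <= end:
        return start <= t <= end
    return t >= start or t <= end          # overnight window, e.g. 18:00 to 07:00

def handle_clip(camera, motion_detected, clip, now):
    if not (motion_detected and within_schedule(camera, now)):
        return None                        # nothing useful: send no data at all
    return cipher.encrypt(clip)            # encrypt before it leaves the premises

encrypted = handle_clip("reception-cam", True, b"...jpeg frames...", datetime(2019, 6, 1, 23, 15))
print("queued for upload" if encrypted else "discarded")
```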
These features can be supported by using sophisticated individual recording schedules on each camera that are centrally set by an administrator. For example, a business may want some of its sites only monitored out of hours, while choosing to have other highly secure or more vulnerable locations monitored for movement at all times. Another way to minimise the amount of data sent to the cloud and so reduce the bandwidth required is to set a low frame rate. As few as three frames a second are enough to ensure visual data is usable for security purposes. Compressing the data is another popular technique. It is also possible to apply analytics to identify visual data that is of interest and to trigger recording and alerting based only on that data. For example, someone moving around a room that should be empty might be of interest, but not if they are checking whether that room needs to be set up ready for a meeting or removing coffee cups afterwards. Sometimes there may be no need to send visual data at all, just metadata that suggests something interesting has been recorded. All these measures, in appropriate combination, help businesses using ADSL, 3G or 4G connections ensure that they send only useful data to the cloud, and minimise the amount that they send.   Handling vagaries in mobile network availability Where a fixed broadband connection to cloud storage is not an option, the mobile network is almost certainly available. While 4G coverage has increased, for many locations the best that will be available is 3G, However, there are systems available which have been designed to work effectively with 3G, particularly in combination with the techniques to reduce the amount of data transmitted already discussed. Systems may also have to cope with ‘blips’ in mobile network availability. To handle this, Cloudview has developed a visual network adapter which constantly checks network speed. If for some reason the network is down or running slowly, it will automatically hold visual data until the network is operating at full strength again. This adapter can be added to existing cameras (digital or analogue) to provide cloud connectivity.   Estimating how much bandwidth you need It is difficult to come up with a hard and fast rule about the minimum bandwidth a business CCTV user might need. Installations vary widely and bandwidth will depend on the number of cameras as well as the methods described for reducing the volume of data transmitted. For example, a single camera that records an hour or so of motion-triggered footage a day will need a lot less bandwidth than a single camera recording footage continuously for eight hours a day. This, in turn, will need much less bandwidth than ten busy cameras. We have a rule of thumb based on our own experience working with businesses which says an upload speed of 200kbps to 250kbps per camera is adequate. It is worth noting that the average upload speed for ADSL hovers between 512 Kbps and 2 Mbps, for 3G it sits at around 0.4 Mbps, and for 4G is around 8 Mbps. It is easy to check your own bandwidth using any one of a number of online tools – for example, the Measurement Lab Network Diagnostic Tool. The following estimates of bandwidth usage across three different types of site will help organisations calculate what might be appropriate for their business.   A remote site using 4G Assume this has four cameras capturing data at 5 frames/second, recording on average 2 hours, 4 hours, 3 hours and 1 hour respectively per day. 
The estimated upload bandwidth required is 1.1 Mbit/sec. An office site using standard ADSL broadband Assume this has ten cameras, recording a total of 30 hours per day at 3 frames/second. The estimated upload bandwidth required is 2 Mbit/sec. A large site using fibre broadband Assume 75 cameras recording a total of 180 hours per day at 5 frames/second. The estimated upload bandwidth required is 20 Mbit/sec. Bringing cloud-based CCTV to everyone The ability to use 3G, 4G and ADSL fixed broadband for CCTV opens up access to commercial premises without high-speed broadband connections. This means that rural businesses, remote utility facilities and other isolated sites can replace outdated CCTV systems that require manual interventions with modern cloud-based systems. The benefits include anywhere, anytime access to visual data, alerts based on pre-set triggers, secure access to footage for authorised users, easy sharing of visual data with third parties such as security teams or the police, and strong audit trails for data protection assurance. ### The importance of choosing the right channel partner Channel partnerships may have changed in recent years, but they remain as important as ever. Previously, the partner model was centred on resale, margins and collecting upfront revenue, but as the technology sector has transitioned to a more service-based industry, the partner channel has followed suit. Nowadays, it is not enough for channel partners to offer good technology; they must also provide responsiveness, flexibility and strategic enablement. Of course, the potential benefits of a modern channel partnership depend largely on who you choose to partner with, which is why this decision is proving so important for businesses across all industry sectors. On the simplest level, partnerships allow resources to be shared between both parties, including consultancy, education and expertise, as well as more tangible resources like technology, capital and marketing materials. Effective channel partnerships have enabled businesses to generate new revenue streams, leverage powerful IT solutions and brand influence, and gain access to professional business tools that may otherwise have been out of reach. On the other hand, ineffective partnerships can see businesses weighed down by bureaucracy and lacking support and open communication. When choosing an enterprise partner it is important to select one that is willing to take the time to understand your business, its customers and what it is that you are trying to achieve. Vendors also need to create and deliver services that reflect your business needs – if the product or service is not relevant to your business, then your partnership is unlikely to be a fruitful one. As a result, collaboration is absolutely vital for any channel partnership strategy to be effective. Communication will allow you both to realise shared goals and strengthen trust between the two parties. It is important that businesses look for a channel partner that has a comprehensive strategy already in place. This will likely include detailed pricing considerations that reflect different channel categories, discount options and a clear reseller model. The product or service being supplied may also require some tweaking to suit both parties.
Is your channel partner willing to modify its resources to better suit its distribution partners and their customers? Will further adjustments be required for international partners in order to reflect local demands? What level of collaboration will take place when it comes to promoting these products or services? At Dell we’ve worked hard to transition from a consumer PC vendor to a reputable channel partner that is fully invested in our partners’ aims and objectives. Through our PartnerDirect programme, we offer a range of benefits to ensure that we remain as committed to you as you are to us. Whether you sign up to the Registered, Preferred or Premier schemes, you’ll receive a number of advantages tailored to your business needs, including account management, access to field marketing support and market development funds. Channel partnerships can prove hugely beneficial for many businesses, boosting sales, creating new revenue streams and opening your company up to a wealth of shared resources. However, before any business can access these benefits they must ensure that they choose their channel partner wisely. The right partnership can truly demonstrate the benefits of business collaboration, but the wrong one can leave you stuck with a business that is ill-suited to your needs and ambitions. ### Digital Transformation: 101 – Challenges, Benefits and Best Practice What does digital transformation really mean? Digital transformation is the integration of digital technology into all aspects of a business, fundamentally changing the way the business operates and the way it provides value to its clients. Apart from this, it's also about cultural change. It is about the necessity for organizations to constantly challenge the status quo, to experiment and to think out of the box. This often means walking away from long-standing business processes in favor of relatively new practices that are still being defined. The concept/term is quite overused these days, which inherently creates some false expectations or misunderstanding. Some of these are listed below: The focus on digital creates the impression success is mainly realized via injecting technology, whilst technology should not be the main starting point The transformation should not be seen as a one-shot project but should be about culture change, new ways of working and continuous improvement Some companies use the hype to market e.g. paperless office tools or transition to public cloud or… mainly zooming in on one specific aspect while in reality it could mean anything or could look different for every company What are the challenges of embarking on a digital transformation journey?   Resistance to change in the organization: digital transformation, by its very nature, brings considerable change. Resistance can come for various reasons: concerns about cannibalizing revenue sources, difficulties justifying or funding projects because the normal project cycle and approach do not apply, fear of job loss or job change, etc. Lack of a clear vision and digital strategy. You need to start from a business perspective, set clear business objectives. Develop a digital strategy.
Experiment broadly with technology but don’t do digital as an objective in itself Speed: the speed with which digitally enabled experiences thriving in the consumer space are creating a new normal has an impact on all industries and is raising expectations in the B2B market. This is pushing companies to go fast and respond quickly and, as such, to rush into high investments without a proper objective and approach Scaling initiatives from experimentation to real transformation: it is rather easy to develop a good approach to incubate ideas and test them. Scaling the best ideas up to a level where they are integrated across a broad international organization, with legacy IT systems and an important partner channel, is sometimes challenging What are the benefits of digital transformation?   Organizations are aggressively pursuing digital transformation to gain efficiencies, lower costs and increase profitability with the ultimate goal of gaining a competitive advantage by delivering the very best customer experience. Digital transformation allows organizations to be the first disrupters in their given markets. Whether they are in finance, manufacturing, retail, healthcare or transportation, why let an Uber-, Amazon- or Airbnb-like competitor come into their space, why not be the digital disruptor amongst their competitors? A big part of that is delivering to the heightened expectations of the new demographics made up of millennials, gen x and z’ers as well as the elevated consumer expectations across all demographics, and that is about delivering an exceptional customer experience whether that is an internal customer – an end user, or an external customer buying a product or service. The expectation is a personalized experience – anytime, anyplace, any device…simple, intuitive and secure. So how does digital transformation help facilitate this exceptional user experience? Rapid innovation allowing companies to quickly respond to the ever-evolving requirements of their internal or external customers. Increased mobility connecting users to the information and applications they require easily and seamlessly. Enhanced agility to help companies respond faster to changes in their market or business conditions. Faster decision making starting with big data analytics to deliver actionable intelligence and informed recommendations to the business. How to do digital transformation right   Establish a compelling change and transformation story, based on a clear outside-in vision and digital strategy. Start from the business model or the customer experience instead of starting from an inward goal like digitizing the existing operating model. Stimulate and embrace digital experiments and initiatives that focus on information sharing, self-serve technologies, etc., to start feeding thoughts and building more tech capabilities in the company. Embrace partnerships for important skills gaps e.g. in the fundamental aspects like journey mapping, process design and data and analytics. Digital transformation project leadership Do not treat this as a standard project. Need for iteration, openness for experimentation. Do not use traditional KPIs and ROI mechanisms to measure success and set objectives. Integrate a “management of change” and culture track from the beginning focussed on people empowerment and attitude and on skill and capability development. Foster collaboration, create cross-functional teams. Why does digital transformation matter? No choice.
Consumers will simply turn away from companies that aren’t keeping pace in favor of a brand that can provide a connected experience across digital channels. Will new roles have to be created? Many of the roles that are required to tactically execute a digital transformation plan already exist: application developers, DevOps managers, cloud specialists, process automation engineers, business data analysts, and so on. The challenge that organizations will have is attracting and retaining some of the more specialized roles in a very competitive labour market. Many of the new roles that will be essential to the success of a company’s transformation will be leadership roles to drive digital initiatives within the organization, titles like Chief Innovation Officer, Chief Digital Officer or Chief Experience Officer for example. What are the best digital transformation companies? The companies that are going to experience the best results around digital transformation in 2019 have the following traits: They talk to their customers about what their competitors are delivering today regarding mobile enabled client applications and portals There is a strong statement from executive leadership that states the company’s 3 – 5 year goals around digital transformation Initiatives are evaluated based upon their impact on the user experience They have the philosophy that everything can be mobile and take it to the extreme, for example even legacy custom apps running on a mainframe They actively engage their top vendor partners in the process They are not afraid to fail and when they do, they fail fast Common questions you have to ask yourself Has there been a formal announcement or communication from executive leadership and/or at the board level stating your 3 - 5 year goals around digital transformation? What percentage of your applications are accessible to users on mobile devices? Do you need to create a separate group within IT to accelerate your digital efforts (e.g., a bimodal - mode 2 group) or can you reach your goals based upon your existing structure? How web enabled are your customer facing applications? Are you actively engaging your vendor partners in your digital transformation plans? Is the time to launch new applications the same now as it was 5 years ago? Do your competitors have mobile applications that customers can access to conduct business? What percentage of your efforts are internal facing to create efficiencies and lower costs versus external facing to enhance your customers’ experience? Are you already using or in the process of rolling out a collaborative suite, e.g., Microsoft Teams, WebEx Teams or Spark, as a part of the tactical execution of your plan? Is creating an environment that attracts and retains millennials/gen x’ers a driving force behind your digital transformation plans? ### Cloud and Web Application Security: Growing Confidence and Emerging Gaps For modern organisations, digital transformation is increasingly the only game in town. CIOs are turning to multiple cloud providers in their droves to offer them agile new app-based models, driving enhanced business agility to meet ever-changing market demands. Yet security remains a constant challenge. Web applications themselves remain a major target for data theft and DDoS.
A Verizon report from earlier this year claimed that a quarter of breaches it analysed stemmed from web app attacks. So, what are organisations doing about it? The results of a new Barracuda Networks report reveal some interesting findings. Cloud maturity grows The poll of over 850 security professionals from around the world reveals a growing confidence in public cloud deployments. Over two-fifths (44%) now believe them to be as secure as on-premises environments, while 21% claim they are even more secure. What’s more, 60% say they are “fairly” or “very” confident that their organisation’s use of cloud technology is secure.  This makes sense. After all, cloud providers are capable of running more modern, secure infrastructure than many organisations could in-house. That means customers benefit from the latest technology, accredited to the highest security standards, versus heterogeneous, legacy-heavy in-house environments. As long as they pick the right third-party security partners and understand the concept of shared responsibility in the cloud, cyber risk can be mitigated effectively. The cloud even offers more options for back-up and redundancy to further minimise risk. Yet this isn’t the whole picture. Respondents to the study are still reluctant about hosting highly sensitive data in the cloud, with customer information (53%) and internal financial data (55%) topping the list. They complain of cybersecurity skills shortages (47%) and a lack of visibility (42%) as hampering cloud security efforts. And over half (56%) aren’t confident that their cloud set-up is compliant.   Could some of these concerns be linked to web application threats? Websites under attack The truth is that web apps are a ubiquitous but often poorly understood part of the modern cloud-centric organisation. As a business-critical method of delivering experiences to customers and productivity enhancing capabilities to employees, web apps are a major target for cyber-criminals looking to steal sensitive data and interrupt key business processes. A Forrester study from 2018 found that the leading cause of successful breaches was external attacks — the most common of which focused on web applications (36%). Fortunately, Barracuda Networks’ survey finds more than half (59%) of global firms have web app firewalls (WAFs) in place to mitigate these threats. The most popular option is sourcing a WAF from a third-party provider (32%), which makes sense, as long as they can protect their customers from the automated bot-driven traffic that dominates the threat landscape. Not all can. Patching and configuring However, of greater concern is the fact that many organisations don’t appear taking the threat of web app vulnerabilities seriously. Some 13% claim they haven’t patched their web app frameworks or servers at all over the past 12 months. Of those that did, it takes over a third (38%) of them between seven and 30 days to do so. For a fifth (21%) it takes over a month.  This is the kind of approach that landed Equifax in a heap of trouble, when it failed to promptly patch an Apache Struts 2 flaw, leading to a mega-breach which has so far cost it over $1.4 billion. It’s an extreme example, but one that highlights the potential risks for businesses.    Another potential area of risk with web app environments is human error. A massive breach at US bank Capital One earlier this year, affecting around 100 million customers and applicants, was blamed on a misconfiguration of an open source WAF.  
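Misconfigurations of the kind behind the Capital One incident are exactly what automated posture checks are designed to catch, and the article returns to this idea later with Cloud Security Posture Management. As a heavily simplified illustration of that approach, the sketch below uses the AWS boto3 SDK to flag S3 buckets whose public-access block is missing or incomplete; a real CSPM product covers far more controls than this single check, and this is not the Barracuda tooling discussed here.

```python
# Minimal illustration of an automated cloud misconfiguration check, the kind
# of control a Cloud Security Posture Management tool runs continuously.
# It flags S3 buckets that do not have every public-access block enabled.
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

for bucket in s3.list_buckets()["Buckets"]:
    name = bucket["Name"]
    try:
        config = s3.get_public_access_block(Bucket=name)["PublicAccessBlockConfiguration"]
        fully_blocked = all(config.values())
    except ClientError:
        # No public-access block configured at all -- treat as a finding.
        fully_blocked = False
    if not fully_blocked:
        print(f"Review bucket '{name}': public access is not fully blocked")
```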
Some 39% of respondents told Barracuda Networks they don’t have a WAF because they don’t process any sensitive info via their apps. But attacks aren’t just focused on stealing data; they can also impede mission-critical services. WAFs are certainly not a silver bullet. But as part of a layered approach to cybersecurity they’re an important tool in the ongoing fight against business risk. Conclusion Growing confidence in cloud is enabling digital transformations across organisations of every shape and size, yet that confidence comes with a cautionary tale. Attackers are also zeroing in on vulnerabilities and weaknesses that may have been ignored in the past, and many organisations are unaware of how these multi-layer attacks can unfold from a single access point. Web application security and cloud posture security are key defences which customers must deploy to continue their digital transformations in a safe cloud. To ensure you are secure in the cloud, here are some tips: Ensure you have WAFs protecting all your apps – don’t assume that just because an app doesn’t appear to have outside visitor engagement it can’t be used as an attack vector. Attackers will exploit any vulnerabilities they discover, even if only to gain access to your network and more valuable resources. Don’t leave app security in the hands of your development team. They aren’t security experts, nor do you pay them to be – you pay them to build great products. Deploy a Cloud Security Posture Management solution – not only will this eliminate many security risks and failures, along with providing your development team with necessary guardrails to “build secure,” it greatly simplifies remediation and speeds investigations when issues do arise. ### Using IoT to support fleet management The Internet of Things (IoT) connects products, assets and infrastructures and is driving new developments and innovations in most industries. By connecting fleet vehicles to the Internet, new opportunities for business efficiency appear. Many companies have already connected their vehicles - General Motors, for example, has over 12 million IoT-enabled cars. The rapid growth and countless benefits associated with this technology mean that the industry is expected to be worth US$15,870 million by 2025, according to Market Research Hub. Connected Vehicles To thrive in competitive markets, fleet operators need current and accurate data to maximise the efficiency of each vehicle and driver. Using connected IoT technology such as telematics, operators can keep track of fleet productivity, as well as gaining valuable insight into driving behaviour and vehicle performance. As IoT and artificial intelligence (AI) technology solutions become more prevalent, fleet-led industries are seeing numerous benefits: By connecting to the cloud, vehicles can be monitored for faults and generate proactive maintenance updates. These intelligent vehicle assessment tools help reduce the cost of unplanned maintenance and unexpected vehicle downtime. Servicing is easier as instant alerts allow fleet managers to schedule maintenance at a convenient time for both the business and the driver. In-cab coaching provides drivers with real-time feedback, so they can improve their performance in line with company health and safety standards.
This data is also available to fleet managers who can proactively assess driver performance and behaviour to identify training needs, disciplinary issues, or to inform employee award schemes.   Duty of care is improved as intelligent fleets enable two-way communication, making it easier for head office to keep in touch with drivers. This can be used in various ways from monitoring wellbeing to providing emergency alerts about route choice or vehicle problems.   Connected vehicles are improving workplace efficiency and making life easier for drivers as more companies adopt electronic logging devices. These automated processes save hours of administration time previously spent filling out logbooks, timesheets and mileage forms. Connected devices using RFID, NFC, or Bluetooth technology also reduce vehicle loading times so drivers can spend less time stock checking and more time on the road, shortening delays and improving the customer experience.   Supply chain managers use live telematics to negotiate with suppliers by providing evidence of realistic mileage and driver behaviour. Costs can be saved on historically oversubscribed services, such as insurance, where telematics can offer vital data. Connected Depots Depots and supply chain managers who introduce IoT solutions experience significant improvements. Although asset tracking is not a new idea, new technologies provide more precise and accurate tracking data of individual items compared to periodic barcode scanning. Specifically, RFID tags allow for items to be accurately tracked at any point of the delivery journey. IoT technology can improve the safety and security of assets and people by employing sensors to disable functionality until the asset is in a safe location. Functionality can also be used to improve in-house health and safety conditions by maintaining an optimal working environment. This powerful technology has the potential to save lives and money, as it lowers the risk of accidents within the supply chain as well as reducing the associated expenses. Depot managers can use the IoT to maximise their capacity by analysing stock quantities in real-time and devising new ways to maximise efficiency. Regulatory requirements are a major part of supply chain performance; IoT data can be accessed on demand and eliminate human error, often leading to significant cost reductions and a more productive warehouse system. IoT connectivity is also entirely scalable, so the system and its data will not be adversely affected by rapid businesses growth. While current technology relies on manual number crunching to identify data trends, the rise of AI will make gains even greater as depots become able to respond to stimuli almost instantly. Depots will also be able to reroute fleets and individual packages to alternative routes to maintain SLAs, efficiency and customer satisfaction. A Connected Future As IoT adoption grows, different parts of a fleet will connect intelligently and depots will be able to manage incoming and outgoing vehicles more effectively and optimise pallet and load movements. Trailers will be able to ‘talk’ to trucks and data gathered from the IoT will provide insights into load capacity, allowing for better fleet utilisation. Managers will also be able to identify areas where fuel is wasted - for example on shorter or less intensive routes. 
Customers, drivers and supply chain managers are all set to reap the rewards of IoT integration, with fleets able to report issues quicker than ever and deliveries becoming timelier and fully trackable, which will improve relations between stakeholders. Fewer journeys will be required as telematics provides information on load capacities and intelligent packing, further reducing overhead costs. Conclusion IoT is set to completely revolutionise the way supply chains and depots work. It is already being adopted by major players around the world and, with the sector growing, all businesses now have the chance to benefit from the many exciting opportunities possible in this new, hyper-connected world. ### Becoming a Digital Accounting Firm The digital transformation that triggered the 21st century banking revolution is now making waves in the accounting sector. It’s an exciting time, but with any significant change come big challenges. Historically speaking, accounting has always been considered a relatively ‘safe’ sector: it’s a necessary service for all businesses with predictable revenue models. Accountancy is still a necessary service – but its delivery has changed, thanks to digital disruptions and innovative solutions from tech-savvy firms, and predictable revenue may no longer be guaranteed. Cloud-based accounting operations have the power to outdo traditional and more manual accounting methods in terms of efficiency, accuracy and speed. As automated accounting becomes more mainstream, accountants need to look for important ways to future-proof their profession and stay relevant. This is why more and more accounting firms are digitising their offer in response to market demands. However, embracing digital transformation is not just about adopting new technologies – it’s about developing a new mindset too. The real opportunity here is for accountants to use technology to offer more than number-crunching solutions and instead become business advisors to their clients. Harnessing the digital mindset   The millennial generation is increasingly present in the workforce as employees, clients and bosses. This group is very switched on technologically – and expects a certain level of digital services and interaction in the workplace. Adoption of digital solutions is crucial for staff retention as well as business efficiency. Making Tax Digital (MTD), for example, is around the corner and will pose a real challenge to a large number of firms that aren’t prepared for it. Businesses that fail to make the necessary plans and investment to ensure they will be compliant by 1st April 2019 risk facing penalties – not to mention falling behind the curve. However, the successful digital mindset is not solely focused on new technologies. It also values human engagement and emphasises the importance of strategy and culture. To get the most from technology, accountants need to strike a balance. Think about it: if you just use technology to automate compliance, what are your clients really paying for? You need to demonstrate value-added services and ascribe them the same level of importance as digitally delivered efficiencies. Packaging your offer How accountants revise their business models and fee structures in line with these developments is crucial. With administrative tasks and data entry streamlined thanks to automation, firms can consider offering, for example, more advanced advisory and reporting services.
This is particularly valuable for firms with smaller clients; helping them compete with bigger firms that could afford to diversify their offer some time ago. In a nutshell, becoming a business advisor means reviewing your client company’s financial data, identifying any problems, and coming up with solutions. Machine learning may eventually do some of this analysis – but it will never 100% guarantee the right answer. The need for human oversight and experience remains key. When putting together a new fee structure that reflects these value-add services, you need to figure out what you’re actually charging clients for - and whether they are getting their perceived value for money. It’s also worth thinking about investing more time into content marketing to boost your market expertise and trustworthiness. Building a digital culture Becoming a digitally transformed accountancy practice requires hiring staff that are digitally savvy – or at the very least, demonstrate an enthusiasm for upskilling.  To attract these employees, you need to tap into the millennial generation’s workplace goals and create a fun, appreciative working environment that offers clear career progression. You need to identify the gaps in your business and then look for the right skills to fill the open positions. If you battle to find the skills you need immediately, look at training opportunities to help your existing employees widen their abilities. Training is also a great way to retain good people. Being a digital firm is all about long-term planning. You need to put together a strategy that plots where your firm will be in say one, three and 10 years. If you want to be able to grow your practice and increase your number of clients, you have to be able to scale all of your systems and processes. This level of planning will guide your business as it grows and help you achieve your objectives. Key business benefits of cloud-based accounting software Cloud accounting helps your business work smarter and faster. It provides better access to financial information and improves collaboration among teams as well as with clients. For accountants, this means you can review data in real-time rather than wait for clients to email outdated documents through to you. Multi-user access makes it easy to collaborate online with all stakeholders, using the latest information and updates at all times. Everything is run online so updates to the systems can be run without disruption. It’s also easier to scale as you are working with software rather than out of date hardware or even physical tools like calculators. Information is backed up automatically and updates are typically free and instant. This means you can get on with doing the work that really matters to you and your clients. It also means that your business costs are reduced thanks to the elimination of on-site software management activities like version upgrades, maintenance, server failures and administration. All of these would be managed by the cloud service provider of your choice. Migrating to the cloud  Many firms are already on their way to migrating to fully digital cloud practices. Now is the time to focus on hiring staff with the right skills, adopting the right technology, and developing relevant processes to build a completely streamlined – and digitally successful – workplace. Migrating to the cloud requires a lot of forward planning; it’s not something that can be done off-the-cuff. 
You need to ensure that data is accessible as this will fuel your advisory services. Think about how all your existing technology fits together and choose the most complementary accounting software, plus accompanying apps, to build a comprehensive and user-friendly set-up. One that functions for both you and your team – as well as your clients. Choosing which apps to use is a bit like completing a puzzle. You need all the right pieces to complete the full picture – and each app will likely provide a specialised service in a key area. For example, Soldo, the multi-user business expense account, has recently integrated with cloud-based accounting software Xero to make it easier for businesses to monitor expenses thanks to an open bank feed integration that automatically syncs transactions. Other available apps can cover things like sales invoicing, payroll, time management, CRM and much more. Becoming a truly digital accounting firm is more than just about technology – it’s about people, culture and wider business goals. The real power of technology is that it can take care of tasks that require little to no human interaction, creating more time for you to connect with your clients and build better, longer-lasting relationships. Part of this means offering them the services they value: good business advice that helps them achieve success. ### The Six Key Metrics of a Successful SaaS Business Transitioning to a SaaS business model can create huge value for your business, but success depends on addressing six key criteria, says Lyceum Capital partner Martin Wygas. It is no secret that the market is moving toward the software as a service model, with SaaS products encompassing every aspect of business services from customer acquisition and marketing to delivery and operations. For the customer, buying cloud-hosted, subscription-based software in place of on-premise licences has several clear benefits: faster and more cost-effective deployment, always-on access, greater update frequency and integrity, more flexible usage and costing, as well as reduced infrastructure costs with improved IT security. For software businesses, the rewards from adopting a SaaS model are equally compelling. Among them are higher quality recurring revenues with greater forecasting visibility; increased direct engagement with the user and better customer alignment; boosted sales, including cross and upselling; lowered customer churn; potential efficiencies in development and support; and an ability to scale faster. However, simply switching to a SaaS pricing scheme won’t deliver a full SaaS transition. It requires you to review and change your business model and create a SaaS culture. This includes how you incentivise your sales force, the approach you take to development and how you track and report performance metrics. It also involves scrutinising how you communicate internally to ensure employees are on the same page, and externally to stakeholders to explain how you are driving growth. Therefore, if you own an on-premise software business, you need to assess the impact SaaS will have in six key areas. Size As in other sectors, buyers pay for size. SaaS businesses of scale are scarce, and therefore attract an enhanced premium. You need to feel confident that a shift to SaaS will help you create a business with annual recurring revenues of well over £10 million.
Historical growth Track record counts, with buyers willing to pay a premium for companies that can show year-on-year revenue growth of at least 20 percent. Our research indicates that businesses with mean historic growth of 27-35 percent are typically valued at a healthy 3-5x revenue or above. At the other end of the scale, SaaS businesses that grow at less than 10 percent year-on-year are unlikely to achieve much valuation uplift. Of course, the timing of your SaaS transition will impact revenue growth. Therefore your business will need to learn how to track growth metrics other than delivered revenue. Annual or monthly recurring revenue and bookings are the most frequently used lead indicators of future revenue growth. Profitability In addition to growing fast, companies attracting premium valuations also have a proven ability to deliver profit growth. A common rule of thumb is the so-called “Rule of 40”, whereby the combined EBITDA (Earnings Before Interest, Taxes, Depreciation and Amortisation) margin and year-on-year growth figures equal 40 percent or more – a simple calculation is sketched at the end of this section. Undertaking a SaaS transition will naturally lead to some level of in-year revenue and profit compression, and maintaining profitable growth throughout the transition can be a challenge. However, there are some measures you can take to protect EBITDA and cash flow during this period: Planning – Develop a clear plan that examines the impact on all aspects of the business and creates a clear pricing structure that includes a minimum number of users and contract terms to ensure a floor to your SaaS pricing. Controlled and limited launch - Launch your SaaS offering across a limited customer set, whether by geography, vertical or product set. You can then build credentials while confirming market pricing for your product. Hosting - Carefully choose the right partner for your hosting needs with regard to scalability and pricing structure. Cashflow management - Start invoicing annually in advance: when a customer’s environment goes live in your hosted environment and not at full customer go-live. Customer success - Offer tiered support on new sales and at the highest level, a dedicated, onsite customer success manager who will also be able to drive further upsell opportunities. Upgrades - Put in place a stratified approach for existing customers and a payback pricing plan to switch to SaaS (a 50% uplift on existing support is a standard target). Quality of earnings Recurring revenue is key to obtaining a premium valuation for your business. Those with 75 percent or more of recurring revenue and new sales predominantly on a SaaS basis can expect an uplift in value. But, when assessing earnings quality, it is not enough to simply focus on the headline percentage of recurring revenue. Management must also examine the contribution from each revenue line. Businesses that do not generate a gross margin of more than 85 percent on SaaS sales (taking into account all related hosting and other infrastructure costs) will not get the full benefit of an uplift in value. Scalability Your business must be positioned to grow in terms of technology and people. Its software platform and infrastructure need to be secure and able to scale in line with the growth of the business. This means fully embracing SaaS as a business model and not just a revenue model. A key component of embracing SaaS scalability is a focus on improved sales effectiveness.
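The Python sketch below works through the Rule of 40 described above, alongside the lifetime-value-to-acquisition-cost ratio that the next paragraph introduces as a SaaS KPI. All figures are invented for illustration; it is not Lyceum Capital’s valuation model.

```python
# Back-of-the-envelope checks for the SaaS metrics discussed in this section.
# All input figures are made up purely for illustration.

def rule_of_40(ebitda_margin_pct: float, yoy_growth_pct: float) -> bool:
    """Rule of 40: EBITDA margin plus year-on-year growth should reach 40% or more."""
    return ebitda_margin_pct + yoy_growth_pct >= 40

def ltv_to_cac(avg_annual_revenue: float, gross_margin_pct: float,
               avg_customer_lifetime_years: float, cac: float) -> float:
    """Customer lifetime value (on a gross-margin basis) divided by acquisition cost."""
    ltv = avg_annual_revenue * (gross_margin_pct / 100) * avg_customer_lifetime_years
    return ltv / cac

print(rule_of_40(ebitda_margin_pct=12, yoy_growth_pct=30))   # True: 12 + 30 = 42
print(round(ltv_to_cac(10_000, 85, 5, 12_500), 1))           # 3.4x
```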
Starting to measure and manage against SaaS KPIs, such as customer acquisition cost (“CAC”) and customer lifetime value, will help illustrate the scalability of your business. Growth potential Businesses targeting a sizeable or fast-growing market support higher valuations. Adopting a SaaS model can counter the limitations of a slowly growing market by broadening market appeal through new verticals, geographies, and new product lines. By delivering increased functionality and exploiting cross and upselling avenues, you can alleviate market pressures and grow by increasing average revenue per user from existing customers. A structured and well thought out approach to upgrading an installed on-premise product to a new SaaS offering can reap great rewards. And continuous focus on existing customers will promote another highly valuable SaaS metric:  negative customer churn. Shaping a customer-centric SaaS strategy that addresses most of the above criteria is not simple and, as experienced software investors, we know that access to additional capital and a ready network of industry experts can be the difference between steady growth and the emergence of a market leader. ### The four pillars of an Enterprise Data Cloud [vc_row][vc_column][vc_column_text]Data has grown exponentially over the last 20 years and the potential for it to transform businesses is greater than it has ever been. IDC estimates that by 2025 the amount of data will hit a mind-boggling 163 zettabytes, marking the beginning of a digitisation wave that is showing no signs of abating. Perhaps unsurprisingly, the value of data analysis at scale – including storing, managing, analysing, and harnessing information – has become an increasingly important part of the corporate agenda, not only for IT departments but also for senior management. While most companies have now realised the business benefits of data analytics, developing the right strategy to harness the value of it can often be challenging. Although companies still need to rely on large data repository for analytics at scale, the widespread use of IoT devices – and subsequently the large amount of data coming from edge networks and the need for consistent data governance – has prompted a wave of modernisation, requiring an end-to-end technology stack underpinned by the power of the cloud. The public cloud has now been experienced by a vast number of organisations, who value its simplicity and elasticity. However, unexpected operating costs and vendor lock-in have prompted enterprises to opt for some other cloud infrastructure models that would allow both choice and the ability to run demanding workloads no matter where they reside and originate, from the edge to AI.  Same problems, new challenges The most valuable and transformative business use cases – whether it’s IoT-enabled predictive maintenance, molecular diagnosis or real-time compliance monitoring – do require multiple analytics workloads, data science tools and machine learning algorithms to interrogate the same diverse data sets to generate value for the organisation. It’s how the most innovative enterprises are unlocking value from their data and competing in the data age. However, many enterprises are struggling for a number of reasons. Data is no longer solely originated at the data centre and the speed at which digital transformation is happening means that data comes from public clouds and IoT sensors at the edge. 
The heterogeneity of datasets and the spike in volumes that is leading to real-time analytics means that many organisations haven’t yet figured out a practical way to run analytics or apply machine learning algorithms to all their data. Their analytic workloads have also been running independently - in silos - because even newer cloud data warehouses and data science tools weren't quite designed to work together. Additionally, the need to govern data coming from disparate sources makes a coherent approach to data privacy nearly impossible, or at best, forces onerous controls that limit business productivity and increases costs. Back to the drawing board Simple analytics that improve data visibility are no longer enough to keep up with the competition. Being data-driven requires the ability to apply multiple analytics disciplines against data located anywhere. Take autonomous and connected vehicles for example, you need to process and stream real-time data from multiple endpoints at the Edge, while predicting key outcomes and applying machine learning on that same data to obtain comprehensive insights that deliver value. The same applies, of course, to the needs of data stewards and data scientists in evaluating the data at different times in the processing chain. Today’s highest-value machine learning and analytics use cases have brought a variety of brand new requirements to the table, which have to be addressed seamlessly throughout the data lifecycle to deliver a coherent picture. Enterprises require a new approach. Companies have grown to need a comprehensive platform that integrates all data from data centres and public, private, hybrid and multi-cloud environments. A platform that is constantly informed about the location, status and type of data and can also offer other services, such as data protection and compliance guidelines, at different locations. The rise of the enterprise data cloud Since enterprises undergoing digital transformation are demanding a modern analytic experience across public, private, hybrid and multi-cloud environments, they are expecting to run analytic workloads wherever they choose – regardless of where their data may reside. In order to give enterprises flexibility, an enterprise data cloud can empower businesses to get clear and actionable insights from complex data anywhere, based on four foundational pillars: Hybrid and multi-cloud: Businesses have grown to demand open architectures and the flexibility to move their workloads to different cloud environments, whether public or private. Being able to operate with equivalent functionality on and off premises – integrating to all major public clouds as well as the private cloud depending on the workload – is the first ingredient to overcome most data challenges. Multi-function: Modern use cases generally require the application of multiple analytic functions working together on the same data. For example, autonomous vehicles require the application of both real-time data streaming and machine learning algorithms. Data disciplines – amongst which edge analytics, streaming analytics, data engineering, data warehousing, operational analytics, data science, and machine learning – should all be part of a multi-functional cloud-enabled toolset that can solve an enterprises most pressing data and analytic challenges in a streamlined fashion. Secured and governed: With data coming from various sources, comes great responsibility. 
Businesses want to run multiple analytic functions on the same data set with a common security and governance framework – enabling a holistic approach to data privacy and regulatory compliance across all their environments. The platform must therefore maintain strict enterprise data privacy, governance, data migration, and metadata management regardless of where the data is located. Open: Lastly, an enterprise data cloud must be open. Of course, this means open source software, but it also means open compute architectures and open data stores like Amazon S3 and Azure Data Lake Storage. Ultimately, enterprises want to avoid vendor lock-in (to not become dependent on a single provider) and favour open platforms, open integrations and open partner ecosystems. In the event of technical challenges, support does not come only from the original supplier; the entire open source community can help. This also ensures fast innovation cycles and a competitive advantage. To achieve their goals of digital transformation and becoming data-driven, businesses need more than just a better data warehouse, data science or BI tool. As new data types emerge, and new use cases come to the fore, they will need to rely on a range of analytical capabilities – from data engineering to data warehousing to operational databases and data science – available across a comprehensive cloud infrastructure. Throughout their journey, they need to be able to fluidly move between these different analytics, exchanging data and gaining insights as they go. Being able to rely on an enterprise data cloud will future-proof their commitment to technology innovation and ensure business objectives are met across any division. ### Artificial Emotional Intelligence: Understanding Human Behaviour The need to better equip the security and intelligence world with greater insight into human behaviour has perhaps never been greater. For instance, the ability to predict human behaviour and how scenarios in public places will unfold could prove vital. Defusing a situation before it escalates, or understanding how someone feels whilst standing on the edge of a train platform, could have a dramatic impact on safeguarding others. It is no secret that the focus on public safety is at an all-time high, so gaining access to unbiased intelligence can be of great value to security providers looking to manage the risk to public safety from a national security perspective. But, if the past has shown us anything, it is that good, reliable intelligence is hard to come by. Emerging AI (Artificial Intelligence) technologies such as robotics, greater advances in machine learning and high-tech facial recognition form only part of the spectrum of possible innovations that could help us in the quest to understand human behaviour better. What is clear, though, is our desire and need to unlock emotional intelligence. Providing greater insight into what human beings are thinking and feeling is the next logical step. Removing ‘in the moment’ human bias But how can innovative AI technologies help to improve levels of insight into human behaviour and how might that translate into something meaningful from a security intelligence perspective?
Artificial Emotional Intelligence (AEI) is an area that has the potential to provide in-depth analysis and intelligence surrounding emotion and characteristics relating to the individual but without the influence of human bias, which often muddies so-called studies and the subsequent data. Using software that has the capability to remove that human intervention could, in fact, take security intelligence to another level entirely, especially when used to read subliminal facial expressions accurately live – in the moment, while it is happening – and then convert them into a range of deeper emotions. Combined with specific characteristics detected in real time, this creates an accurate picture. Imagine knowing ahead of time what someone is feeling… The primary emotions the software deciphers in these cases are the seven universal languages of emotion: anger, disgust, contempt, happiness, sadness, surprise and fear. In line with this, there are also many critical characteristics that such technology can detect as a result of these expressions, such as confidence, passion, honesty, nervousness, curiosity, distress and depression. Knowing when a person is displaying these traits in a public space can be an indicator of potential risk or threat. Standing out from the crowd Fear is a big one. Assessing whether or not someone is fearful of a situation before it occurs could be invaluable information. Anger and contempt are other emotions that could be used to significant advantage in the case of large crowds of people, as they can help to pre-empt anti-social behaviour ahead of time. Honesty is also a characteristic that can be used to detect incidents such as theft, or someone who is planning to steal something, especially if they are displaying unusually low levels of it. In the case of characteristics exhibited such as depression or distress, security teams would be able to evaluate people who may be at risk of a suicide attempt and thus approach them, before it is too late. Of course, as with any technology like this, it is about prevention rather than cure – or at least having the opportunity to take appropriate action at a given point in time. Using facial expressions to detect nervousness is also useful because one can use the data to assess any potential behaviour links, such as an ‘attacker’ displaying signs of nervousness. Ensuring public safety has never been more pressing, so using AEI technology alongside the monitoring of extreme life-threatening and anti-social behaviour, for instance, could provide information that could be used to prevent incidents from arising, such as threats to public safety, criminal activity or indeed suicide attempts. Assessing risk and gaining foresight How? Through the deciphering of live characteristics, technology can now alert security teams to extreme behaviours in real time before they have the opportunity to evolve, for example, highly concentrated levels of anger amongst a crowd of people, or abnormal levels of distress or depression when standing on train platforms. From a security perspective, this advanced insight is invaluable in terms of assessing the risk of particular situations and having the foresight to control them before they get out of hand.
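To make the alerting idea above concrete, here is a deliberately simplified Python sketch of how per-person emotion scores might be turned into the kinds of real-time alerts just described (concentrated anger in a crowd, sustained distress on a platform). The score format, thresholds and rules are invented for illustration; they are not any vendor’s actual model.

```python
# Schematic of turning per-face emotion scores into real-time alerts.
# Thresholds and the alert rules are illustrative assumptions only.

ANGER_CROWD_THRESHOLD = 0.7      # average anger score that flags a crowd
DISTRESS_PERSON_THRESHOLD = 0.8  # distress score that flags an individual

def crowd_alerts(frame_scores):
    """frame_scores: one dict of emotion probabilities per detected face."""
    alerts = []
    if frame_scores:
        avg_anger = sum(p.get("anger", 0.0) for p in frame_scores) / len(frame_scores)
        if avg_anger >= ANGER_CROWD_THRESHOLD:
            alerts.append(f"High concentrated anger across {len(frame_scores)} people")
    for i, person in enumerate(frame_scores):
        if person.get("sadness", 0.0) >= DISTRESS_PERSON_THRESHOLD:
            alerts.append(f"Possible distress: person {i} - consider a welfare check")
    return alerts

print(crowd_alerts([{"anger": 0.9}, {"anger": 0.8}, {"anger": 0.6, "sadness": 0.85}]))
```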
When detecting a person of interest, AEI technology can give you a crucial layer of intelligence that was previously unattainable: for instance, understanding exactly how nervous someone is, or whether they are showing any obvious (or less obvious) signs of depression or distress. If security providers are able to combine this added layer of data-backed evidence with their existing or supporting data, they may be able to pre-empt scenarios before they happen. One could argue that this is real ‘crystal ball gazing’ at its best. To go deeper than merely deciphering human emotions, security providers need technology that uses partial facial recognition, robust camera angles and state-of-the-art pixelated raw data to reveal typical personality and behavioural traits, ensuring the detection of hidden signs that the naked human eye cannot see. Until now, this has been almost impossible to achieve. Psychological behaviour and emotional backup Over the next 12 months, we can expect AEI technologies to be used to significant advantage in providing the security space with enhanced methods of combatting day-to-day risk to public safety. This technology innovation further aids the ongoing pattern spotting and monitoring that exists today. The flagging of characteristics and emotions combined with face detection allows crowd monitoring and tracking to be taken from a previously reactive process to a preventative state. Using advances in AI technology, security teams can now be confident that they are physically acting with the added benefit and back-up of psychological and emotional data as well as a visual track. This benefits detecting wanted or suspected individuals as well as identifying people who may be about to commit an offence, which in today’s world is invaluable. ### The Future of Work in an Age of Virtual Assistants Automation has long been one of the world’s most transformative forces and its impact only promises to accelerate. During the industrial revolution, engineers designed machines to automate physical tasks and completely reinvented agriculture and production in the process. Then computers came along to automate transactional tasks, which sent ripples of disruption into just about every industry. Today, a new generation of Artificial Intelligence (AI)-powered virtual agents are automating cognitive tasks and the effects promise to be just as monumental. Here are three big ways that enterprise-grade virtual assistants will (re)define how humans work. The Semantic Revolution Consumer-grade virtual assistants (e.g. Siri, Alexa or the Google Assistant) have revolutionized the way that people engage with technology in their personal lives, but AI is also transforming the way employees interact with technology at work. Enterprises around the world are increasingly “hiring” digital colleagues like IPsoft’s Amelia to automate the front end of their digital systems. Virtual assistants allow people to engage with technology through the medium most comfortable to them: conversation. Semantic UXs empower all users to use a system, regardless of technical know-how. For example, on the consumer side, even if a user had never used a music streaming service before, they can simply ask their Echo, “Alexa, play some 80s music.” Similarly, semantic interfaces open access to enterprise services.
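As a toy illustration of this front-end pattern, the Python sketch below maps a free-text request to an intent and routes it to a back-end action. Simple keyword matching stands in for the natural-language understanding a product like Amelia would actually use, and the intent names and handler are invented for this example.

```python
# Toy sketch of a semantic front end: free-text request -> intent -> action.
# Keyword matching stands in for real NLU; intents and handlers are invented.

INTENT_KEYWORDS = {
    "order_replacement_laptop": ["laptop", "broken"],
    "reset_password": ["password", "reset"],
    "guest_wifi_access": ["guest", "wi-fi"],
}

def classify_intent(utterance):
    text = utterance.lower()
    for intent, keywords in INTENT_KEYWORDS.items():
        if all(word in text for word in keywords):
            return intent
    return None

def handle(utterance):
    intent = classify_intent(utterance)
    if intent is None:
        return "Escalating to a human agent."
    # In a real deployment this would raise a ticket in the relevant system.
    return f"Ticket raised for intent '{intent}'."

print(handle("My laptop is completely broken, can you order me a new one?"))
```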
With AI, employees will not need to be trained to use a company’s shared or IT service systems -- they will simply tell the virtual assistant to do it. For example, an employee who needs to replace a broken laptop won’t need to navigate an unfamiliar IT ticketing system; they will just tell the virtual assistant, “My laptop is completely broken, can you order me a new one?” The virtual assistant will parse the user intent and independently issue the ticket to the relevant department. This open functionality can be applied to any business area, and thus make enterprise services (everything from HR to accounting to facilities) open to any employee on day one. No More Busywork AI excels at executing regimented processes that previously required a human. For example, as an IT service desk rep, a cognitive agent like Amelia assists users 24/7 with various types of low-level requests (e.g. password resets, guest Wi-Fi access, ticket status, etc.). These low-value requests can account for up to a third of IT workload according to a report from Quocirca and IPsoft; however, they are easily automated by a cognitive agent. While this next level of automation might frighten many human IT workers, it should actually be a source of hope for a more rewarding work day. When AI automates routine (and frankly boring) tasks, human workers are freed to address complex or unique tasks, which will make their jobs more interesting. A New Emphasis on Uniquely Human Skills As AI eradicates low-level tasks that previously required a human, the labor force will naturally begin to emphasize human skills that can’t be automated. For example, businesses don’t necessarily value internal IT service agents with exceptional “soft skills” (e.g. empathy, humor, and the unteachable ability to make other people feel comfortable). While these would be nice attributes to have, most companies want IT service agents who can execute routine services. This doesn’t mean the role of IT worker will go away -- quite the opposite. However, when low-level tasks are efficiently automated by machines, businesses will find it in their interest to recruit and secure candidates with additional sets of skills, e.g. workers who excel at explaining technical issues to non-technical staff, applying creative problem solving to existing IT challenges, or are good at negotiating with vendors to secure the best prices on technology. AI promises to make businesses more productive and efficient. These new technologies won’t just change human roles; they have the potential to evolve them into something more rewarding. ### Cloud Repatriation - Why enterprises are bringing some apps back from the public cloud Organisations leverage on-premise, private cloud, and public cloud infrastructure to move applications and data across all environments - but now a growing number of enterprises are moving apps back home. Just a few years ago, we believed that the public cloud was the future and would replace physical data centres sooner or later. Since then, the trend to migrate applications and data into public clouds has been strong. However, despite the ongoing momentum behind the public cloud, cloud repatriation - the decision to move applications and data back home on-premise - has become a trend of its own. As the hybrid cloud environment is becoming the standard for most organisations, there has been a dramatic shift in thinking.
From a paradigm that the public cloud is the best place for everything, to a strategy to place applications where they fit best – even including pulling some back from the public cloud. But what is causing the trend of cloud repatriation? In fact, there is quite a long list of factors. 1. The on-premise data centre has evolved into cloud-ready infrastructure The prerequisite for the trend to repatriate data is that data centres have become increasingly software-defined. Public cloud vendors once triggered this development by building software-defined or automated IT services, creating attractive tooling and interfaces for developers. And those great software-defined technology advances are no longer unique to the public cloud and can now be found all across the computing spectrum: in private clouds, at the edge, at the distributed core, or even as SaaS or managed services, where they offer cloud-like speed, automation or self-service. This has blurred the line between the data centre and the private cloud even more. Vendors like VMware, AWS, and Azure are even offering a gateway to the public cloud and back with solutions like VMware on AWS, AWS Outpost, and Azure Stack. Enterprises are increasingly starting to use cloud-like infrastructure in their data centre, which now gives them the choice of where to place their applications and data. 2. Data Gravity Data gravity is another factor, mainly affecting on-premise data storage and the cost and ease of moving it. Applications and data are attracted to each other. And the more data there is, the greater the attractive force pulling applications and services to associate with that data. There is a long list of factors that can affect data gravity, but two factors in particular: network bandwidth and network latency. However, it can also be thought of in terms of network effects more broadly; a large and rich collection of information tends to pull in more and more services that make use of that data. IoT systems and anything else working with large quantities of data need to be designed with that reality in mind as data will continue to grow at the edge of the network because it can’t move. Storage-drive density is growing exponentially, but moving data to the cloud is getting harder because cable capacity does not grow exponentially. It is hard to generalise how much data is too much to move. It partly relates to costs associated with moving and storing the data, such as network charges. If sending 2 petabytes (PB) across network links to the public cloud is unaffordable today, then sending 5 PB in 12 months’ time will be even more unaffordable, and 10-15 PB, a year later will be nearly impossible. Even with fibre optic networks, it would take years to migrate big data sets somewhere else. That is leading to companies using Edge Computing to process data where it is created and starting to pull some of their data back in-house while it is still possible. 3. Control, security, and compliance Another main reason for businesses moving certain kind of applications and data away from public cloud is security. At the beginning of the trend to migrate to the public cloud was a misconception that data in the public cloud was 100% protected and secure. In reality, organisations are at risk if they do not architect the right security and data protection solutions. Organisations understand more about what the Public Cloud offers and what it lacks. 
Bringing data and workloads back in-house can provide better visibility of what exactly is happening and control about security and compliance. GDPR for instance has given organisations a reason to keep their data close as a measure of data sovereignty. 4. Cost One of the early reasons to move data to the public cloud was a better cost-effectiveness, especially for huge amounts of data for backup and archiving. But as more and more cloud-ready technologies are available in the data centre, the gulf between the two has narrowed, which in turn reduces the cost benefits of the public cloud. In some use cases, on-premise solutions are already more cost-effective than public cloud for the majority of workloads. The hybrid cloud gives organisations the choice to place their applications and data where they fit best, including in their own data centres. This possibility, paired with rising issues about recent outages, high costs, latency issues and questions regarding control, security, and compliance are leading to the new trend of repatriation of workloads and data from public clouds to private clouds. Another big driver to repatriate applications and data is the increasing issue of data gravity, which will in the future prohibit moving large sets of data due to the growing cost of network transmission. Above all, enterprises are looking for flexibility to set up solutions and services that are flexible to grow with their business and will not likely commit to either side, on-premise or in the cloud. As businesses evaluate the best IT infrastructure for their workloads, hybrid IT with a mix of public cloud, private cloud and on-premise solutions will become the norm. These factors are leading to the emergence of application delivery and services technologies that work consistently on different underlying compute infrastructure including bare metal servers, virtual machines, and containers and across private or public clouds. One such architectural pattern - the service mesh - has become popular for applications using microservices and cloud-native architecture, but the concept applies equally well to traditional applications in data centres. With cloud-ready applications spread out over hybrid cloud environments, IT teams need to adopt technologies — like service mesh — that will connect applications to the services they need, wherever they are.[/vc_column_text][/vc_column][/vc_row] ### Are you a data hoarder? Data hoarding can be damaging to our lives Data hoarding is widely recognised as a damaging activity in our day to day lives, with TV shows such as The Hoarder Next Door helping us appreciate how an unwillingness to throw anything away can quickly have negative effects.  However, I have come across organisations, all across that world, that are guilty of digital hoarding in one form or another, where they are clinging on to data well beyond the point that its continued storage is necessary or even beneficial. Even though these data stores aren’t as visible as piles of rubbish it doesn’t make the practice of keeping hold of them any less destructive. After all, if you don’t know exactly what you have or where it is stored, how are you ever going to know how best to manage and protect it? You could perhaps be forgiven for thinking that as the cost per TB of cloud storage gets cheaper, that it would be ok to let your data stores continue to grow. You might think it wise to keep hold of it “just in case” it one day becomes useful. 
However, the storage costs and risks of a potential data breach often outweigh the benefits of keeping hold of this data. If we forget the security implications for one minute, have you ever thought to consider how the cost of data storage compounds each year? According to the Veritas Global Databerg Report, just 15 percent of the average company’s data is business-critical, a third is redundant, obsolete or trivial and another 52 percent is unclassified (dark) data. This leaves a huge amount of data that could – and should – be erased. So as to make it easier to understand what this looks like, we created our Data Storage vs Data Erasure ROI Calculator. The calculator allows you to enter your own figures which will quickly give you an understanding of how much you’re spending in order to store non-critical business data and how those costs compound each year.   It’s not just data hoarding that’s an issue Making your data more visible reduces risk through easier monitoring. The more data you hold, the greater the consequence should you fall victim to a data breach. Besides, if you don’t have a full understanding of the data you’re holding, it’s all but impossible to fully understand the scale of the problem or how many people might have been affected. Once a data breach has occurred, no executive would want to have to explain to their staff, customers as well as the mainstream media that they had no idea what information was taken or how many customers were likely to be affected. Yet, by failing to challenge this data hoarding mentality within their organisation, this is exactly the risk that the majority expose themselves to on a regular basis. One of the most high-profile examples of this is Yahoo!, which had to triple its estimate of the number of users affected by a historic data breach from 1 billion to 3 billion. This discrepancy shows just how little oversight it must have had into the information held on its servers.   Will EU GDPR transform business thinking? With any luck, the incoming EU GDPR regulations will be the shot in the arm organisations need to address their data hoarding addiction once and for all. After all, how can a ‘Right to be Forgotten’ request be carried out within an organisation that doesn’t know the data it has or where it lives? In order to fully prepare your organisation, the first thing you need to do is classify the data that already exists. Once this has happened you can start thinking in terms of data lifecycle management, or the comprehensive approach to managing the flow of information system’s data and associated metadata from creation and initial storage, all the way through to the point it becomes obsolete and is destroyed. It is also important that companies know how much they’re currently spending on data storage, including both soft and hidden costs. Only once you know how much money you’re spending on storing unnecessary data, will you be able to see where you could save money by erasing that data. When that’s done, you can create processes for classifying and erasing unneeded data and regularly monitor your data management processes. That way, data can start to be routinely erased whenever its value is less than the liability, when customers demand it (when closing accounts, for example), or when it is required for regulatory compliance. 
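To make that policy concrete, here is a minimal, illustrative Python sketch of the kind of per-record check a data lifecycle process might run. The record fields, cost figures and retention periods are assumptions for the example rather than any particular product's data model.

```python
from dataclasses import dataclass
from datetime import date, timedelta

# Illustrative per-record policy: erase when the value of keeping the data is
# below the liability of losing it, when the customer has asked for erasure,
# or when the agreed retention period has lapsed. All fields and figures here
# are assumptions for the example.

@dataclass
class Record:
    created: date
    business_value: float            # estimated value of keeping the record
    breach_liability: float          # estimated exposure if the record leaks
    customer_requested_erasure: bool
    retention_days: int              # retention period agreed for this data class

def should_erase(record: Record, today: date) -> bool:
    past_retention = today > record.created + timedelta(days=record.retention_days)
    value_below_liability = record.business_value < record.breach_liability
    return past_retention or value_below_liability or record.customer_requested_erasure

# Rough compounding-cost view of hoarding: 100 TB of non-critical data at an
# assumed GBP 20 per TB per month, growing 30 percent a year.
tb, price_per_tb_month, growth = 100.0, 20.0, 0.30
for year in range(1, 4):
    print(f"Year {year}: ~GBP {tb * price_per_tb_month * 12:,.0f}")
    tb *= 1 + growth
```

Even a simple check like this forces the classification questions - value, liability, retention period - that most organisations never get around to asking.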
Effective standards for data erasure contribute to overall data hygiene by ensuring that data is destroyed when it reaches the end of its retention date, is no longer necessary or isn’t adding value to the business. This factor is essential in preventing unauthorised access, whether through a security breach or inadvertent disclosure. While it’s remarkable to hear about the mitigated security risks and the adherence to compliance organisations have achieved through data erasure, it’s far more fascinating when companies can see the impact data erasure can have on their data storage costs compounded over time. Suddenly, data erasure becomes a critical data security solution that not only minimises exposure to data loss and ensures regulatory compliance, but also delivers an improved bottom line. ### Five Trends to Define Cloud Computing in 2019 [vc_row][vc_column][vc_column_text]Last year was yet another year of phenomenal growth for the cloud. With the average investment up 36 percent from 2016 across all industries, cloud has become an essential business model enabler for the modern company. However, with so much of IT infrastructure becoming reliant on the cloud, CIOs and CTOs need to make sure they have a fully fleshed out strategy to manage the challenges involved in making the switch. For example, moving to a cloud-first model can help businesses gain vital agility and digital transformation capabilities, but also staying focused on security can be tricky during a period of such drastic change. With this in mind, here are five aspects technologists and executives need to be aware of this year: Being cloud native won’t be optional any more In this software-driven economy, going cloud native will no longer be optional for many businesses. Being cloud native allows companies to get new products and services to market faster, provides greater agility, and enables scalable on-demand computing capacity. In the last year alone the amount of cloud native technology in use has increased by a staggering 200 percent. In 2019, increased IT automation will eliminate numerous manual IT tasks and accelerate delivery of cloud native applications. Alongside continued innovation in the design and construct of applications, this will allow the companies which properly invest in cloud native capabilities to make significant improvements quickly and easily, such as customising apps based on choice of language to meet the needs of different regions. Those that fail to make the shift, however, could find themselves being left behind by faster competitors.  Spotlight on security Cloud has created a host of benefits for CIOs and CTOs but one aspect that needs a great deal more attention is ensuring that information on the cloud is secure. Last year saw huge amounts of enterprise data migrated to the cloud, closely followed by a deluge of data breaches, like the misconfigured Amazon S3 bucket which exposed 48 million records of personal customer data including names, physical addresses, birthdates, Twitter handles and data scraped from LinkedIn and Facebook. This is damaging for all companies but particularly for those in heavily regulated industries like healthcare or financial services, especially now GDPR has come into force. Such incidents can deeply affect day-to-day operations as well as potentially exposing businesses to regulatory fines and reputational damage – as a result, cloud security will certainly shoot up the corporate agenda this year. 
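As a concrete illustration of the kind of guard-rail that will be on that agenda, the snippet below uses boto3 to switch on S3's per-bucket "Block Public Access" settings - the control that would have prevented the misconfiguration described above. This is a minimal sketch: the bucket names are placeholders, and in practice you would enumerate buckets and enforce the setting through organisation-wide policy rather than a one-off script.

```python
import boto3
from botocore.exceptions import ClientError

s3 = boto3.client("s3")

# Hypothetical list of buckets to harden; in practice you would enumerate them
# with s3.list_buckets() and apply the same settings everywhere.
BUCKETS = ["example-customer-data", "example-analytics-exports"]

BLOCK_ALL = {
    "BlockPublicAcls": True,
    "IgnorePublicAcls": True,
    "BlockPublicPolicy": True,
    "RestrictPublicBuckets": True,
}

for bucket in BUCKETS:
    try:
        s3.put_public_access_block(
            Bucket=bucket,
            PublicAccessBlockConfiguration=BLOCK_ALL,
        )
        print(f"{bucket}: public access blocked")
    except ClientError as err:
        # Surface missing or misconfigured buckets rather than failing silently.
        print(f"{bucket}: could not apply settings ({err.response['Error']['Code']})")
```

A script like this addresses only one risk, of course; the broader shift is towards tooling that applies such checks continuously, across every account and provider.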
We’ll see more companies investing in tools to address cloud service risks, enforce security policies, and comply with regulations – even when cloud services are beyond their perimeter and out of their direct control. Accelerate cloud analytics By now, it’s hardly news that firms are struggling to process all the data they’re being inundated with but, thanks to trends like IoT and mobile, the problem is only getting worse. As a result, 2019 is likely to be the year of cloud analytics as companies investigate how the cloud and edge computing can help them manage their reams of data. Cloud analytics will enable companies to store, manage and interrogate their data while still being cost effective. As more data moves to the cloud throughout the year, companies will have to shift gears towards a hybrid mode with analytics, data, and applications spread across on-premise and multi-cloud environments. Evolution of serverless We’ve come all the way from Infrastructure-as-a-Service (IaaS), to Platform and Software-as-a-Service (PaaS/SaaS); this year we’re going to see the explosion of serverless computing. In 2019, Function-as-a-Service (FaaS) or serverless architecture will hog the limelight. We’re going to see companies being able to leverage their own datacenters, porting serverless apps between cloud providers. This means that while they run serverless applications on public clouds, they also have the option to do it on their own infrastructure ensuring speed and cost saving. And serverless won’t just be used to accelerate the development of new cloud-native applications but to modernise legacy apps as well. Pick your own cloud Finally, an emerging cloud strategy that will come to much greater prominence in 2019 is multi-cloud solutions like Polycloud. Companies that have essentially been using a single cloud vendor will start using multiple vendors, for instance, a firm might use AWS for its machine learning capabilities but opt for Microsoft Azure to transition to cloud from on-premise Windows servers. Similarly, more and more companies will be adopting hybrid cloud, for it combines the advantages of both worlds, including more flexibility, tools and deployment options. For example, companies are likely to use public cloud for high-volume activities, while retaining private cloud capabilities for sensitive or complex operations. Making the most of it These are just a few of the changes around cloud computing we’re likely to see in 2019 and it remains an extremely exciting time for the technology. In particular, recent research suggests that manufacturing, telecoms and utilities will be undergo enormous transformation as the pressure to become cloud-based increases. We can expect to see a lot more thrilling new technologies come online – the question is, will your business be ready for them?[/vc_column_text][/vc_column][/vc_row] ### CMMS vs ERP | Asset Management Dilemma Enterprises that are seeking to optimize their core operations often run into a dilemma on how to exactly manage their considerable assets. Is using ERP by modifying one of the existing modules enough or should they look for specific software solutions like computerized maintenance management systems (CMMS) that are designed to help you efficiently manage and maintain your assets? The situation becomes all the more precarious upon considering the fact that both solutions - CMMS and ERP - have a fair bit of similarities as both come with features that control important business processes. 
However, both also come with their own set of differences and limitations. Understanding these differences is the key to making the right asset management decision. The Basics of CMMS A Computerized Maintenance Management System (CMMS) is software used to track and maintain assets and resources in an organization's maintenance unit. It allows its users to monitor and record any and all maintenance work, while simultaneously keeping a historical record of work that has been done and tracking critical asset information for future reference. The Basics of ERP Enterprise Resource Planning (ERP) is a software package that manages every aspect of an organization's business processes. It's a system of integrated applications that work together to manage and automate several operations including CRM, human resources, accounting and financials, manufacturing, inventory, procurement, project management and so on. In the context of this discussion, we will look at the Enterprise Asset Management (EAM) module. The Relationship Between CMMS and ERP Looking at the above descriptions, the relationship between CMMS and ERP is already evident: both solutions claim that they can help you manage your assets, which means their functions must overlap. With regard to asset management, an organization may well choose to use the Enterprise Asset Management (EAM) module within ERP for maintenance, as this will give it access to some of the components typically found in a stand-alone CMMS solution. One such component is the inventory and parts management function that a CMMS comes with; ERP, for its part, calls it the materials management and parts procurement module. Is One Solution Undeniably Better Than the Other? There are several pros and cons to consider when deciding which solution better fits your needs. We have tried to highlight the most important ones. Using a stand-alone CMMS Pros: One of the major strengths of a CMMS is that it's a proactive, dedicated asset management tool. It can manage all reactive and preventive maintenance activities. Most CMMS solutions also offer advanced inventory management features, ensuring that all spare parts and stock items are available for repairs and servicing. Running costs are lower, and it offers quicker ROI through improved efficiency, decreased asset downtime, and increased asset life expectancy. Modern CMMS are easy to deploy and technicians can pick them up rather quickly. They save all critical asset information and allow you to use advanced reporting features to improve your asset management decisions. Modern CMMS also have APIs that allow ERP systems to pull data from the CMMS, although that integration has to be set up and requires IT skills. Cons: By definition and design, a CMMS is focused only on maintenance. If the CMMS doesn't have an integration and isn't able to work alongside other ERP modules, the information flow from maintenance to other departments will be obstructed, which is simply inefficient. Using an EAM module Pros: Multiple functions that are applicable to several different business processes. Allows for fast data exchange and data utilization at the level of the whole organization. It is generally faster to integrate an EAM module than a CMMS (unless the module needs heavy modification to fit your organization). It offers some of the functions that can be found in a CMMS and, in some situations, can be a worthy replacement.
Cons ERP often fall short in ease of use and quick implementation when attempting to use it for the top of the line maintenance management. In some cases, there would be a need for thorough and expensive customization processes to adapt the ERP’s EAM modules to manage critical functions that the maintenance unit cannot do without.   Integrating Both Solutions for Asset Management As the EAM module and CMMS cover the same area, you probably won’t look to use both solutions at the same time. On the other hand, using CMMS in isolation of ERP may not allow asset managers to reap the full potentials of either software. Integration enables both software to share important information. For example, you can generate purchase orders in your CMMS and just forward them to ERP as a purchase request. On top of that, the data about quantities and cost information should automatically sync up which makes inventory ordering a breeze. When would integrating CMMS into ERP work better than just adding an EAM module? In general, if the EAM module you have at your disposal has to be drastically modified to fit your organization, want an easier to use maintenance software or if you want to have deep insight into historical maintenance data and be able to monitor your maintenance operations in great detail, then CMMS is probably the better option for you. However, when it is all said and done, this is something that only you can answer as it needs to be decided on a case by case basis. ### How 5G Will Accelerate Cloud Business Investment [vc_row][vc_column][vc_column_text]5G will bring a massive change to cloud computing. Much of the discussion at the Mobile World Congress, organized in February, was about the upcoming fifth generation (5G) technology. 5G is expected to revolutionise the network and communications industry by providing ultra-fast transmission rates that can be as much as 100 times faster than the existing 4G.  The first commercial 5G smartphones are already set to roll out by the first half of this year. A recent Ericsson Mobility report predicts that, by 2023, there will be 1 billion 5G subscriptions, accounting for nearly 20% of the entire mobile data traffic. A number of industries will benefit from what 5G has to offer. From the healthcare industry to the automotive industry, from smart homes to smart cities and beyond, 5G’s features change the landscape, allowing for several applications that 4G just couldn’t handle. With the increase in potential industries, both tech- and non-tech related, there will be an increased need for cloud services. For example, wearable devices that lack enough internal storage, usually relying on synced larger devices such as smartphones, will be able to function independently with the aid of the cloud via 5G’s low to zero latency. The communication potential opens doors to several avenues for businesses. For example, IBM and Vodafone recently inked a B2B (Business-to-Business) deal aimed at cloud hosting, along with another part providing solutions in a number of areas including IoT, Artificial Intelligence, and other 5G-beneficial applications. In essence, for cloud dependent industries and cloud industries themselves, 5G is a welcome force. The fifth generation technology will accelerate cloud business investment through its widespread application to a number of innovations. 
Impact of 5G on other technology innovations The combination of 5G and cloud technologies will enrich the capacity, functionality, and flexibility of a number of industries, especially for cloud businesses themselves. This combination will allow network carriers to offer competitive services in a way non-cellular IoT (Internet of Things) network providers will be unable to copy. These innovations will introduce several investment opportunities for cloud businesses. Here is how 5G’s improvement of cloud technologies will benefit some innovations. Streaming data and analytics: The existing Big Data processing technology uses cloud infrastructure to support the required storage. However, to stream analytics on Big Data, systems still face major challenges related to latency with current wireless networks. As 5G networks are touted to be surprisingly fast, real-time streaming challenges will be minimized greatly. Industrial IoT (IIoT): In industrial use cases, such as supply chain management and process manufacturing, processing and analyzing the massive amount of sensor data in real-time is crucial for valuable insights to manage cost and efficiency. 5G has the potential to reduce the cost of Big Data analysis and make it even more effective given the remote and variable nature of these workloads. Edge computing: 5G hugely impacts the performance of mobile and remote devices. Remote systems such as location tracking apps, home automation systems, and voice assistants, which are based on sensors, will use 5G to transfer a huge amount of data ten times faster than 4G networks. Artificial intelligence (AI) and natural language processing (NLP): As more and more enterprises are using AI and NLP, they would require the ability to process and manage huge data. Most cloud computing service providers have the necessary computers and storage but need to improve their real-time data ingestion capabilities. 5G will provide the required level of data transmission for AI- and NLP-based apps to work effectively. Virtual reality (VR) and augmented reality (AR): 5G will dramatically enhance the quality of VR and AR applications, bringing innovations in industries such as retail, travel, healthcare, etc. As 5G and its applications evolve, there will be significant technology adoption in the above areas. These areas involve huge and complex workloads, making cloud computing a key component. We are a few years away from when 5G will become mainstream network technology, but such a time isn’t as far as it seems. Currently, the 3GPP Release 15 standards for this next-generation network classified 5G devices as non-standalone devices. It means these devices can’t perform independently and need significant modifications to network standards and infrastructure. Thus, enterprises, telecommunication firms, and cloud businesses have a few years to design and implement their 5G strategy. How will 5G impact cloud services? Today, organizations utilize mobile cloud applications from an operational perspective as well as a customer offering standpoint. The future of mobile cloud applications, from healthcare to banking, and beyond, will see it become more widely used and efficient after the widespread roll-out of 5G technology. For example, unified communications services used by businesses will gain from 5G’s improved speed and service reliability. Everyday mobile apps are also cloud-dependent and will improve given the availability of 5G’s low latency capabilities. 
This means faster, smoother transfers and capabilities that are lacking in today's 4G connectivity. Those shortcomings force applications to be deliberately lightweight, or to have reduced scope, from their development stages onwards. With 5G's improvement to cloud availability, applications are expected to reach their full potential, without any innovative idea being left behind. Many of the benefits of cloud communications will be amplified by 5G. 5G technology will enable cloud service providers to reach enterprise mobile customers easily and reliably. Access to virtual machines from phones will become common thanks to the greater computing capacity and machine-to-machine communication that 5G provides. Cloud computing enterprises will offer more features and options to mobile users, and hotspots will become faster, giving remote workers access to cloud services even where fixed internet connectivity is lacking. Final words: 5G technology will bring major improvements to the cloud computing world, because most technology innovations become more efficient when they are cloud-dependent, and 5G in turn improves that integration with its very low latency, making for smoother communications. Applications in this space span several industries. From healthcare applications to autonomous vehicles, and even down to wearables and mobile apps, the cloud is a useful space for off-device storage. Making use of the cloud over 5G connections will boost the performance of these innovations. Cloud-based products and services are expected to become more reliable, faster, and more efficient. These innovations will, in turn, accelerate cloud business investment.

### Exploring The Business Value of Digital Twins

As cutting-edge companies seek ways to improve internal processes and gain competitive advantage, they're increasingly examining the business implications of digital twins. A digital twin is a virtual representation of physical assets, processes and systems that mimics changes as they occur in the actual physical system. The most common applications today are found in industrial settings, in which companies like General Electric use digital twins of jet engine components to predict maintenance schedules and life expectancy. Doing so enables the company to manage engines to deliver the highest possible profit and performance, while also ensuring passenger safety. The key value proposition of digital twins is their ability to combine real-time data, physical dependency models and intelligence from different platforms to simulate, predict and improve assets and end-to-end processes. According to a recent Deloitte report, the global market for digital twin technologies is expected to grow to $16 billion by 2023, while the technologies used for digital twins – IoT and machine learning, for example – are predicted to almost double by 2020. Further underpinning the importance of the technology, vendors like IBM, Oracle and SAP have launched product portfolios around digital twins. Such vendor support, coupled with a growing number of implementation scenarios, is driving down the cost of digital twins while also increasing their applicability, as organizations recognize their diversity and appeal.
As such, digital twins now are used for applications in retail, healthcare and the public sector to model a variety of processes within companies, including overall product lifecycle management, network planning and design, and the deployment and management of remote resources. Making the Most of a Digital Twin As organizations consider ways in which digital twins can be integrated, it’s imperative to include as many data points as possible. Digital twin development should include asset lifecycle data from product designs, production, sales and marketing, supply chain, operations, services and finance. Additionally, data twins should incorporate real-time information between manufacturers, service partners, customers, OEM partners, regulators and other ecosystems. The successful digital twin will gather information from both cloud and on-premise assets, including cloud IoT integration, enterprise resource planning (ERP) business suites, data management systems, collaboration asset platforms and digital innovation portfolios. When the digital twin incorporates data from these diverse data points, it creates a myriad of potential benefits, including: The ability to test changes to processes before they’re implemented Data-driven decision making Collaboration with both internal and external ecosystems Development of new business models and improvements for existing business models Improvements to customer experience Test-driving Potential Changes Digital twins enable organizations to test scenarios, changes or updates to business processes before they’re implemented. This “try before you buy” capability can pinpoint areas within an organization that would benefit from automation. As such, enterprises gain the ability to walk through multiple scenarios prior to making a decision that could impact business processes, operations and employees. Such modeling also affords companies the ability to innovate and evolve without disrupting day-to-day operations, while also reducing the complexity and inefficiencies of existing models. Data-driven Decision Making Digital twins promote data-driven decisions by building a comprehensive, virtual representation of a company’s processes. With the right data points, digital twins provide a business-level view that can be used to measure and analyze operations across an entire enterprise. This end-to-end visibility enables organizations to understand strengths and weaknesses of its associated processes. With such knowledge, organizations can simulate alternative approaches and restructure entire processes based on hard data, as opposed to making assumptions based on generalized expectations. Innovations Throughout Ecosystems Organizations that incorporate data from numerous ecosystems will find that it ignites innovation, as ecosystems are armed with the necessary data to build next-generation products or manufacturing processes. Likewise, the digital twin can spur innovation from secondary ecosystems, incorporating external assets and technologies from different vendors. Information from an organization’s digital twin can be made available to its ecosystem of partners, enabling them to innovate and improve the design and performance of their products or services. This creates secondary and tertiary systems of data around which companies can base decisions. Product Improvement and Creation Digital twin models enable product designers to prototype new ideas quickly and inexpensively. 
Product improvement is achieved by simulating what-if scenarios involving system interactions, product testing and customer experience. Predictive analytics from digital twins creates a better understanding of how an enterprise can meet the needs of its customers through new product and service development, as well as providing insight for ways to improve service after sales. By studying the digital twin under actual working conditions, companies can see a product in action, enabling engineers to make more informed choices during the design process and use digital twins to make their simulations more accurate. In addition to future innovation and product development for designers and engineers, digital twins build a stronger relationship between engineering and operations teams. Operations teams use the data to optimize performance, service and maintenance over the lifetime of a product, enabling organizations to avoid costly downtime, repairs, replacements and future performance issues. Create customer value for life The overarching result of data twins is that organizations can create customer value and satisfaction throughout the entire lifecycle of its products and services. Customer experience is improved through collaboration, as well as the creation of customized and individual product and service offerings. Industry stakeholders expect lifetime value, especially for capital-intensive products. Digital twins can be used to evaluate and deploy new operating methods that weren’t necessarily a part of the original value proposition, thereby providing new services to customers over an asset’s lifetime.[/vc_column_text][/vc_column][/vc_row] ### Cloud Computing: Differences Between the AI and ML [vc_row][vc_column][vc_column_text]Are Artificial Intelligence (AI) and Machine Learning (ML) the same thing? Many people confuse these two concepts, using one instead of another and vice versa. Unfortunately, companies mislead their customers by promising AI instead of ML or some unrealistic combination of the two. In the realm of big data, AI and ML are often used interchangeably. With all the hype going on about these two ideas, it’s easy to get lost and fail to see the difference. For example, just because you use a certain algorithm to calculate information, it doesn’t mean that you have AI or ML at work. What does? Let’s start with the basics. What Is An Algorithm? An algorithm is simply a set of actions to be followed in order to get to a solution. When it comes to ML, the algorithms involve taking data and performing calculations to find an answer. The complexity of these calculations differs depending on the task. The best algorithm allows you to get the right answer in the most efficient manner. If an algorithm works longer than a human does, it’s useless. If it offers incorrect information, it’s unnecessary. Algorithms get training to learn how to process information. The efficiency, the accuracy, and the speed depend on the training quality. When you use an algorithm to come up with the right answer, it doesn’t automatically mean using AI and/or ML. But if you are using AI and ML, you are taking advantage of the algorithms. All humans have eyes, but not all creatures who have eyes are human. These days, we hear about AI and ML being used whenever an algorithm exists. Using an algorithm to predict event outcomes doesn’t involve machine learning. Using the outcome to improve the future predictions does. What Is Artificial Intelligence? AI is a widely used term. 
It's the science of making a computer behave in ways that are commonly thought to require human intelligence – basically, making a computer act human in some respects. Even though that definition is reasonably precise, the AI field is still broad. For example, in the 1980s, anyone would have told you that a pocket calculator was artificial intelligence. Today, it's an everyday program that doesn't seem to have anything to do with AI. Artificial intelligence takes advantage of numerous technological advances, and machine learning is just one of them. Unlike machine learning, the definition of artificial intelligence shifts as new technological advances enter our lives. It's likely that in just a few years, what we consider to be AI today will look as simple as a pocket calculator. Experts at Miromind offer another definition of AI, which can be easier to grasp: the study of training a computer to execute tasks which humans currently seem to do better. In the digital realm, it's largely applied to marketing and SEO efforts.

What Is Machine Learning?

Machine learning is a branch of AI: the study of computer algorithms that automatically improve through experience. ML is one of the ways to achieve AI. Machine learning requires large data sets to work with, examining and comparing the information to find common patterns. For example, if you give a machine learning program many pregnancy ultrasound images together with labels indicating the gender, it is likely to learn to identify the gender from ultrasounds in the future. ML programs compare different pieces of information to find common patterns and produce correct results. Machine learning has advanced sub-branches, such as deep learning and neural networks. Some people tend to compare neural networks and deep learning to the way human brains operate, but there are many differences between them. Overall, ML is a learning process the machine can carry out on its own without being explicitly programmed to do so. Machine learning involves the computer learning from experience.

AI vs ML

The key differences between the two concepts are these:
• Goal – The goal of AI is to increase the chances of success, while ML's aim is to improve accuracy without caring about success.
• Nature – AI is a computer program doing smart work; ML is the way for a computer program to learn from experience.
• Future – The long-term goal of AI is to simulate intelligence for solving highly complex problems; ML's goal is to keep learning from data to maximise performance.
• Approach – AI involves decision-making; ML allows the computer to learn new things from the available information.
• Solutions – AI looks for the optimal solution; ML looks for a solution, whether or not it is optimal.

AI and ML

Even though many differences exist between AI and ML, they are closely connected. AI and ML are often viewed as the body and the brain: the body collects information, the brain processes it. The same goes for AI, which accumulates information, while ML processes it. When people use the two terms interchangeably, they show they lack a deeper understanding of the concepts, even while intuitively grasping how closely related they are.

Conclusion

AI involves a computer executing a task a human could do. Machine learning involves the computer learning from its experience and making decisions based on that information.
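To make the "learning from experience" distinction concrete, here is a small, self-contained Python sketch using illustrative numbers only: the first predictor applies a fixed rule and never changes, while the second updates its estimate from each observed outcome, which is the behaviour that makes something machine learning rather than just an algorithm.

```python
# A fixed algorithm: always predicts the same delivery time, never improves.
def fixed_rule_prediction() -> float:
    return 30.0  # minutes, hard-coded by a human

# A (very) minimal learner: keeps a running average of observed outcomes,
# so every new observation changes its future predictions.
class RunningAverageLearner:
    def __init__(self) -> None:
        self.total = 0.0
        self.count = 0

    def predict(self) -> float:
        return self.total / self.count if self.count else 30.0

    def learn(self, observed: float) -> None:
        self.total += observed
        self.count += 1

observed_outcomes = [42.0, 38.0, 45.0, 40.0]  # illustrative data points
learner = RunningAverageLearner()
for outcome in observed_outcomes:
    print(f"fixed rule: {fixed_rule_prediction():.0f}  learned: {learner.predict():.1f}")
    learner.learn(outcome)
```

Production systems replace the running average with trained models, but the shape is the same: outcomes feed back into the model as training signal.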
While the two approaches are different, they are often used together to achieve many goals in different industries.

### DDoS attacks - Seven effects they have on cloud environments

A question that I've encountered many times in the field of late is: what are the impacts of DDoS (Distributed Denial of Service) attacks on cloud compute environments? The primary benefit of cloud is that it elastically scales to meet variable demand – scaling up instantly and scaling down when demand subsides, in seconds. So layman's logic might say that cloud-based services are immune from the downtime effects of DDoS attacks, but that gigantic unexpected bills are a given. After exploring the topic with a number of eminent cloud architects and representatives of online organisations with experience running primary services on cloud platforms such as Amazon AWS and Azure, here are seven interlinked complexities to reflect on.

Customer experience is key

The simple fact is that the majority of DDoS attacks have limited ambition to kill a service entirely; rather, they aim to significantly impair customer experience. So, without DDoS protection that can distinguish legitimate traffic from bad, it is not uncommon for DDoS attacks to go unnoticed and instead blight customer experience in cloud environments just as they do in traditional physical data centres.

DDoS infrastructure pains are different between traditional data centres and cloud

In traditional physical data centre environments, any DDoS payload targeting base infrastructure becomes a potential customer experience and economic hit against capacity assumptions. In cloud environments, this pattern changes. For starters, attacks against the underlying infrastructure in front of the customer's front-end apps are generally dealt with by the cloud service provider. There are recorded outages where cloud service providers have not got things entirely right, of course, such as the 2016 Dyn attack, but such incidents remain exceptions to the rule for now. The upshot of in-built protection is that customers do not normally feel the burn of poor user experience, or the bill, when DDoS attacks hit shared internet connectivity or APIs. However, when the attack gets into the customer's own compute domain, the pain occurs.

Lift and shift of a traditional data centre to cloud renders auto-scalability virtually useless for fending off DDoS attacks

If a customer takes their traditional data centre footprint into a cloud environment without transformation, then they will inevitably be lifting the majority of their previous capacity ceilings into their new cloud home. In an attack, the customer would be almost assured of hitting a limitation in licensing, an OS threshold, a messaging queue, or some other interlink between front-end and back-end applications that would cause an outage or major service degradation that no amount of horizontal auto-scaling or additional virtual RAM and CPU could mitigate.
In one sense this scaling failure might protect them from the worst of a so-called Economic Denial-of-Service (EDoS) attack – AKA a huge bill. Not something to applaud, of course.

Excess billing prevention is on you

The major cloud providers simply do not provide utilisation caps. It is not only that doing so runs contrary to their revenue interests (which of course it does), but just as much that they don't want to be the party that brings their customer's house down by being involved in the automatic shutdown of a service. For this reason, it is down to end users to define thresholds for spend alerts and then run countermeasures to limit the economic blast radius of an EDoS attack.

Cloud-architected applications are the road to infinite billing

Customers that have gone through the cloud transformation playbook face the nemesis of infinite cloud scale: infinite cloud billing. Stripping an environment of the traditional limits of scalability in favour of cloud-built apps quite simply takes the gloves off a DDoS attack's ability to hurt economically. For this reason, organisations embarking on this transformative cloud journey will have received the very sound advice – from cloud migration experts and a multitude of best-practice whitepapers – to purchase effective DDoS protection. If they have not taken heed, the clock is ticking until they face down the bill caused by the insatiable demand of an IoT botnet.

Knowledge is power, metrics matter

Organisations that take their cloud play seriously are among the most likely to be using an understanding of utilisation to drive a better understanding of customer experience and to predict forward demand for their products – the so-called data-driven company. As a stark example, a streaming service like Netflix would measure the volume of video commencements, the logic being that a higher-than-normal ratio of commencements could indicate a service issue, with frustrated users attempting to restart content after an initial failure. It is not difficult to see that an unmitigated DDoS attack on the service could completely throw off such metrics and the value they have for the company's bottom line.

Auto-scaling combined with pulse attacks is the primary pain for both economic and experience impacts

One of the key trends is pulse attacks, whereby organisations endure multi-vector attacks that oscillate between volumetric and application-layer vectors over short periods, leaving no room for infrastructure to recover or for manual countermeasures to be formed. Think of how such burst attacks play out against auto-scaling triggers, even if the organisation has DDoS technology: the technology would need to detect and mitigate attacks within seconds to prevent the auto-scaling triggers from firing. Otherwise, fleets of virtual servers come online automatically, remain up for a period of time, and then shut down again when the load disappears. You are left thinking about the unnecessary bills caused by mass over-provisioning – which would certainly add up. More seriously, however, what are the implications for customer experience, and what is the management overhead of supervising services coming online and going offline at high frequency? Most organisations would expect a service to go into overdrive just once or twice a day to meet variable demand – what if that was happening 500+ times a day?
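One partial countermeasure touched on above – the spend alert – at least bounds how long such a scenario runs unnoticed. As a hedged sketch (the alarm name, threshold and SNS topic are placeholder assumptions; AWS publishes billing metrics only in us-east-1 and only once billing alerts are enabled), a CloudWatch alarm on estimated charges might look like this:

```python
import boto3

# Billing metrics live in us-east-1 regardless of where workloads run.
cloudwatch = boto3.client("cloudwatch", region_name="us-east-1")

cloudwatch.put_metric_alarm(
    AlarmName="edos-spend-guard",                  # placeholder name
    Namespace="AWS/Billing",
    MetricName="EstimatedCharges",
    Dimensions=[{"Name": "Currency", "Value": "USD"}],
    Statistic="Maximum",
    Period=21600,                                  # evaluate every six hours
    EvaluationPeriods=1,
    Threshold=5000.0,                              # illustrative monthly ceiling
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:billing-alerts"],  # placeholder topic
)
```

An alarm does not stop the scaling, of course; it only buys time for countermeasures to kick in before the bill compounds.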
The organisations I’ve heard about that have encountered this scenario were proud of how they fought off 48-hour attacks through the determined efforts of DevOps staff to keep putting plugs in the dam to limit the billing blast radius. But into campaigns that go on much longer, the distraction for the ICT org is irrefutably highly problematic without highly effective DDoS mitigation technology.   In conclusion, cloud computing provides no assurance that inherent scalability can mitigate against DDoS attacks.   The pain could be service outage, user experience degradation or unnecessary reoccurring spend, and the only viable mitigation is DDoS technology that is as automated enough to cope. ### Cloud-Based CRM System: Is it Safe and What Are the Benefits? [vc_row][vc_column][vc_column_text]Broadband internet access has transformed the nature of digital commerce. Unlimited high-speed internet access used to be a commodity only the government, the military, and large corporate enterprises could afford. In contrast, the default assumption today is that everyone, including private individuals and small businesses, has high-speed internet. And once broadband access became the norm, cloud computing became a viable solution for business application deployment. Soon after, cloud-based software solutions started replacing on-site applications for practically all business-related tasks. Practically every kind of business software, from office tools to marketing automation platforms, data analysis systems, and customer relationship management applications now has a cloud version. The adoption rate for cloud-based software has been high overall, but many businesses still have reservations about abandoning on-site software deployment. They often cite concerns about costs, speed, usability, and most of all data safety. Fortunately, most of the concerns surrounding cloud-based software and cloud-based CRM, in particular, are unfounded. The technology has reached a point where the downsides have largely been eliminated or minimized, while the benefits keep accruing. If you’re still unconvinced and want to know more about the topic, you’ve come to the right place. In order to help you understand how cloud-based CRM can help your business, we’ve created and outlined its core benefits, which you can find in the remainder of this article. 1. Hassle-free Setup One of the main reasons why businesses are apprehensive about experimenting with new software is fear concerning setup difficulties. The reasoning here is sound – why would you want to invest in an application that is more difficult to use than the one you’re currently using? Luckily, in the case of CRM software, this concern is largely unfounded. Modern-day cloud-based CRM solutions are remarkably easy to set up and run. All you have to do in most cases is install a lean application frontend, and you’re ready to go. Most modern CRM systems can also be run directly from the browser, forgoing the need to install anything whatsoever on the client side. 2. Access From Anywhere Thanks to the development of internet infrastructure, companies are now rarely housed in a single office. Rather than being the exception, remote work is quickly becoming the new norm. Another consequence of the decentralized nature of modern work is that many on-site CRM solutions became obsolete. Being able to access your CRM data from a variety of access points is now one of the costs of doing business. And cloud-based CRM is the easiest way to fulfil this requirement. 
Cloud-based CRM solutions can be used from any internet-capable device – all you need to do is register an account, login, and you’re ready to get started. 3. Affordable Cost Business applications used to have one thing in common – a high asking price. The cost of maintaining on-site applications was also substantial, often necessitating a fully-staffed IT support department. This has made them inaccessible to everyone apart from large corporate enterprises. But thanks to the exponential growth of the software industry, there are now business solutions at every price point. On-premise CRM used to belong to the former category, while its cloud-based counterpart clearly belongs to the latter. Instead of asking for a large sum upfront, cloud-based CRM software uses subscriptions as the default payment model. Most cloud-CRM providers offer different subscription tiers, including a free option, a fully-featured basic option, and a deluxe option with extra features. 4. Built with Collaboration in Mind CRM is a collective effort. In order to provide a quality customer experience, your employees need to be on the same page, especially people from marketing, sales, and customer service. Cloud-based CRM makes it easy to share customer data within your organization. This allows you to implement complex, multi-staged business strategies with ease. Most cloud-based CRM solutions have built-in communication features, including chat, email, and instant messaging, which reduces the need for using third-party communication apps. Best of all, cloud-based CRM solutions can run on any internet-ready device, making them especially suitable for geographically dispersed enterprises. 5. Integration With Other Apps CRM is only a single element of the average business tech stack. Most businesses use additional apps for specific activities such as email marketing, process automation, analytics, and software testing. However, these apps don’t always work well together. There are issues relating to differences in file formats, drivers, system requirements, as well as issues with feature overlap. Cloud-based CRM can’t solve this problem as such, but since they are built with integration in mind, they won’t contribute to the difficulty either. They can even replace some of your other apps since they come with a variety of useful features. 6. Ready for Scaling On-site CRM solutions are built to handle a limited amount of data. The limit itself can be quite high, but you are bound to reach it eventually, which will cap your company’s growth. Another issue is that of hardware. The more customers you have, the more hardware you will need to keep your operation running, which is a costly way to do things. In contrast, cloud-based CRM scales with your business. If your current subscription plan can’t handle the load, you can switch to an advanced tier with only a small price hike. Another benefit of cloud-based CRM is that you can also downgrade your subscription, in case of an economic downturn. 7. Data Security Finally, there is the issue of security. CRM apps are built for handling sensitive customer data, and keeping this data safe is of prime importance, especially in the wake of the GDPR data privacy legislation. Keeping customer data on your own servers, and running on-site CRM might seem like a safe choice at the outset, but upon further inspection, you will find that this is not the case. Unless you have a cybersecurity division within your company, your servers will be vulnerable to hackers. 
In contrast, cloud CRM solutions store data on secure servers across the globe. Cloud CRM providers will also create backups of your data, so even in the event of a compromise, you can easily rebuild your customer database from scratch. Embrace the Cloud Cloud CRM is one of the most successful implementation of SaaS. It has none of the disadvantages of on-premise CRM, and many unique benefits as well, including security, usability, performance, and a low cost. On-premise CRM is a relic of the past, and cloud-based CRM is doubtlessly a worthy successor.[/vc_column_text][/vc_column][/vc_row] ### What is a Cloud Management Platform (CMP)? Jeanne Le Garrec of Hedera Technology provides a detailed insight into the constituent parts of a cloud management platform (CMP). When it comes to cloud platforms, it can be difficult to define the differences between IaaS, PaaS, SaaS ...and "CMP".  The differences between a IaaS (Infrastructure as a Service), SaaS (Software as a Service) and PaaS (Platform as a Service), are reasonably well established although PaaS can still be tricky to define without debate. Today, I'd like to focus on another type of platform: Cloud Management Platforms.  Big IT and cloud vendors have launched their own but what is it for? And is it really necessary? The CMP market is growing rapidly and getting more and more crowded - which is odd given that most clients we meet struggle to define what it is and what it stands for.  So let's try and clarify things: A Cloud Management Platform is software which combines a set of features or modules which enable the management of different cloud environments.  Public, private and hybrid cloud cannot be all handled with a simple virtualisation management console.  A CMP addresses a certain list of characteristics that will be described later on. Perhaps more precisely Gartner defines a CMP as: Products that incorporate self-service interfaces, provision system images, enable metering and billing, and provide for some degree of workload optimisation through established policies. More-advanced offerings may also integrate with external enterprise management systems, include service catalogs, support the configuration of storage and network resources, allow for enhanced resource management via service governors and provide advanced monitoring for improved “guest” performance and availability. Gartner IT Glossary Q: So when should you start looking for a CMP? A: When you have to deal with performance and IaaS management issues! Obviously you won't be checking out such platforms if you have a very small infrastructure or if you just got started in virtualisation. CMP targets enterprises with various IT demands and load fluctuations, those looking to industrialise their infrastructure and start dealing with requests from a “service” and not only “operations” point of view . Piloting an entire infrastructure can become extremely complex, especially if you have as many interfaces as you have technologies.  We are not talking just about the ones consisting of your infrastructure (hardware, network, storage, VMs) but also your IT tools (monitoring, workflows, automation tool, etc). The value of a CMP is to gather your tools, process, technologies in a unique pane of glass to facilitate your cloud environment management. Since companies nowadays have very heterogeneous infrastructures it can be hard to think about optimisation because there are so many aspects to consider. CMP has a role to play there too. 
To put it all in a nutshell, a Cloud Management Platform is a tool sitting above your IaaS to automatically organise and pilot your infrastructure. It's aimed at IT administrators, infrastructure managers and CTOs, giving them a global view of, and control over, their entire infrastructure. As discussed above, there are certain "must have" features required of a CMP. Here is a little guide to help you understand what these features are for.

Self-service: This is the portal, the interface through which you log on and manage your infrastructure. From here, it must be easy to choose your configurations and execute deployments. You have access to a list of configuration templates that you can select and customise.

Provision system images: A CMP deploys system images onto the underlying infrastructure. Via the self-service feature, you have access to different kinds of system images that you can choose and deploy.

Metering and billing: Cloud Management Platforms provide information on your infrastructure consumption and can produce bills based on this data. This is a crucial feature for companies wanting to recharge costs to internal departments (sales, marketing, HR, accounting, developers) or for service providers/cloud providers invoicing based on customers' consumption. You can also track your infrastructure usage trends: when was it most loaded? What time of the day? Which day of the month? What time of the year? Which client? You can then adapt your business and IT with all the cards in hand.

Workload optimisation and policies: All CMP features are linked and work seamlessly with each other. A CMP is a tool to optimise your infrastructure usage, either automatically for some CMPs or as an aid to decision-making for others. The idea is to manage your resources better. You can also create management policies, with instructions such as "provision a new server every time a physical server exceeds 65% utilisation" or "add 2GB of RAM when my VM's resources exceed 75% utilisation". The end result: use less energy and make your own management rules.

External tool management: A CMP will centralise your data centre technologies in one interface. Companies already use IT tools for monitoring, workflow management and deployment automation. A CMP can connect to those existing technologies, either to collect their data or to pilot them directly, "ordering" them when to execute their operations.

Service catalogs: Via the self-service portal, you can choose service templates or configurations and apply them to the service you want to launch. You have access to a list of "prepackaged" services, or you can make them from scratch. Saved as templates, configurations can then be duplicated to other services. For instance, you can start a new service (let's call it Service B) with the same elements as Service A, but customise it because Service B answers a different set of user needs and security concerns than Service A. Policies that are good for Service A might not fit Service B – for example, you won't want the same type of service for your sales department as for your software development team. If it takes more than 15 minutes to provision a cloud, then it's definitely not a cloud management platform!

Network and storage resource configuration: A CMP will aggregate your server management with your network and storage technologies. You can define which type of storage or network you want to use for each service. You can create management policies on those elements as well.
You can set up your storage configuration (NAS, SAN, direct attached storage, Shared bloc level storage) for each service. With CMP everything can be customised so you can purpose your services exactly the way you want. Service governors: A CMP is a smart tool which understands how your company's IT works.  Service governors will analyse requests in order to deploy them with the right configuration and guide the request where it must be. It will analyse which user made the request and for which service. Is it a sensitive service? What are the user rights? Are there any policies linked to this user or service? Must it be deployed on a physical or a virtual machine? Is there a special type of network that you indicated that must be used for this type of service..? This feature makes sure your operations are made the way you want it to be.  It is the CMP “police officer”! High performance management: Gathering and correlating all your policies, from your services to your infrastructure pattern use, CMPs will provision, configure automatically and deploy your services using the right amount of resources. CMP intelligence gathers all the data points into one account to guarantee performance and SLAs, to make the best out of your user experience and satisfaction. Remember the aim is to provision and deploy services as quick as possible, without delay for your users. So, if it takes more than 15 minutes to provision a cloud, then it's definitely not a cloud management platform! One hot topic we can expect in the CMP roadmap is hybrid cloud connectors ...to be able to get [additional] resources from public cloud vendors. If we analyse the market, there are plenty of cloud management products, the most prevalent of which are BMC, Dell , Morph, Hedera & Scalr. Some other software developers call their platform a “CMP” but do not really qualify as such.  Make sure all the features listed above are present and remember this: A CMP manages your infrastructure from a service point of view, it gathers your IT in an “all in one” interface, and pilots your infrastructure with performance and optimises targeted operations. It complements a virtualisation platform vendor and DOES NOT replace it.  Concerning the future, one hot topic we can expect in the CMP roadmap is - without any doubt - hybrid cloud connectors and their management, in order to be able to get resources from public cloud vendors when resources are temporarily missing.  Let's wait and see how that will evolve. ### Emma Sue Prince - Opening Lines [vc_row ][vc_column ][vc_separator ][vc_column_text ]Watch our new Opening Lines video now. This week we have Emma Sue Prince who joined us to talk about her book '7 Skills For The Future' Watch the video and read her article below![/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/342715449/1961a9a350[/embed] [/vc_column_text][vc_column_text]Forget ‘self-help’ – these 7 skills are all you need Oh how we love a quick fix! Many self-help books and articles advocate a mix of tools and affirmations that have worked for the writer and promote “transformational change”. It can be easy to get lost in all these pieces of advice, suggestions, routines and “hacks” which can be quite challenging to follow through for any length of time.  The only route and the fastest route to creating happiness at home and at work is through developing your self-awareness. Do this by strengthening 7 specific skills that you already have.  
When you build more self-awareness into your muscle memory you experience quick wins (which our brain will quickly translate into quick fixes) and real long-term change and results. Think of a professional footballer – they will have spent millions of hours manipulating the ball with different parts of their foot so that when they are in a high stake situation – like a match, they are able to look up, be present with whatever is going on and make that winning kick. That’s muscle memory. And so it is with behaviour and skills only far more interesting and fun! These are the only 7 skills you need and this is how you can use them to create your best life: Adaptability – this skill will help you respond with grace and ease to change and setbacks. Rather than avoiding them you can learn to embrace them and experience growth and unexpected advantages as a result. The best way to build adaptability is through widening your comfort zone. That means looking each day for a way to get yourself into “stretch” – that’s where you experience ‘slight’ anxiety but where the biggest growth is. Say “yes” to new challenges and things you might by default say ‘no’ to. Try new things every day – whether that’s a new food, a new recipe or a new invitation – go from small to big and your comfort zone grows. Critical thinking – your brain is amazing and can do so much more for us than we allow it to but we constantly drain it of energy. We are bombarded with a constant stream of information. Start by creating space in your day for reflection, contemplation, for thinking. Go for daily walks without your phone to not only do yourself good but also give your brain a chance to flourish. Routinise your day, especially the first part of it by deciding in advance what you will wear, eat for breakfast and plan your schedule the night before. This frees up your brain to do what it’s actually designed to do – be creative, resourceful and innovative! Empathy – the most incredible gift you can give to others is the gift of your attention. This gift will immediately supercharge your personal and professional relationships. You cannot give this gift if you do not know how to be present. Start simple mindfulness exercises through slowing down your breath and being completely still for just 5 minutes a day. You can do this on your commute or whilst you are drinking a cup of tea. As long as you are not scrolling mindlessly at the same time! Being mindful every day for small pockets of time develops your ability to be present and truly listen the very next time you have a conversation. Integrity – If you know your values and what is important to you, decisions are the easiest thing in the world to make. The end of the year approaching is a great time to think about what you value most and to what extent that is reflected in your everyday living and action and then build small, consistent ways to ensure that it is. Guaranteed happiness right there. Being proactive – Whatever you are facing or whatever you might be worried about stop right now and make a list of everything about that situation that is within your control and everything about it that is not. When we are stressed we tend to place our energy into things we have no control over. But the minute you put that energy into what you do have control over: your thoughts and behaviours you feel better immediately plus you usually influence the situation for the better! Optimism – Optimism really is something you can bring into your everyday. 
Do this by creating positive experiences – these can be very small things – from enjoying a cup of tea with a friend to making time for looking after yourself more. And remember to be grateful – quickest route to happiness that exists! Resilience – we need to be much more resilient so we are not constantly battered from whatever life throws at us. Build grit by sticking with something – we tend to give up very easily after even just one “rejection” yet perseverance has a super strong link to being resilient. So get these skills into your muscle memory by small easy steps that punch big on impact and you’ll be miles forward to happiness and your best self. Emma-Sue Prince is an inspirational soft skills and effective behaviours expert and author of 7 Skills for the Future, (Pearson) out now, and available on Amazon priced £12.99. [/vc_column_text][/vc_column][/vc_row] ### Industry 4.0 | Cloud technology within Manufacturing In today's E-commerce driven market, manufacturers are under increased pressure to respond to Industry 4.0 and work towards digitising their supply chains. More and more manufacturers are looking at innovative cloud-based solutions such as, Internet of Things (IoT) sensors and devices that talk to the internet and provide real-time dashboards of activities happening throughout their production lines. Despite the rise of IoT and aspiration of digital supply networks (DSNs), recent research has revealed a gap between organisations’ operational objectives and what is actually being achieved. Notably, only 29% of manufacturers understand what having a DSN is, with under 15% implementing a DSN and expecting them to become the norm for the business in the next five years. However, there’s no hiding from Industry 4.0 and the implications for businesses who don’t keep up. So, why are some manufacturers seemingly missing the mark? And more importantly, how can they make sure they don’t get left behind.   Removing silos At present, manufacturers cite securing meaningful intelligence from their end-to-end supply chain (80%), dealing with real-time information (75%) and the ability to deal with the intelligence – as significant hurdles that need to be overcome. However, only a third of manufacturers are able to aggregate information across the supply chain. Although 89% of manufacturers state that they believe a single view of information from supply chain operations is key - to date, only 30% have full end-to-end visibility.  Manufacturers need to act now and take steps to transform. The research also showed positive signs that many are starting to make some vital changes; 38% of respondents are looking to improve supplier collaboration, 35% supplier performance monitoring and 34% predictive alerts to mitigate disruption. Advanced technology such as artificial intelligence (AI), automation and robotics are deemed to be at the forefront of innovation. However, in order to make the most of the latest technology, manufacturers need to ensure they invest in scalable, flexible solutions that unlock extended capability within their legacy systems to enable higher levels of integration, reduce silos and create a collaborative supply chain. If businesses manage to bridge the information gap, management can make better-informed decisions. The benefits of connecting different internal and external information silos will be greater agility and the ability to maximise investments in new technology.   
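Bridging those silos usually starts with something unglamorous: joining records from the ERP, the logistics provider and supplier scorecards on a shared key such as the order number. The sketch below is a minimal, hypothetical illustration of that idea; the data sources, field names and figures are invented and stand in for whatever systems a given manufacturer actually runs.

```python
# Hypothetical sketch: merging siloed supply-chain feeds into one view keyed by order ID.
erp_orders = [
    {"order_id": "PO-1001", "sku": "WIDGET-A", "qty": 500},
    {"order_id": "PO-1002", "sku": "WIDGET-B", "qty": 120},
]
logistics_events = [
    {"order_id": "PO-1001", "status": "in_transit", "eta_days": 3},
]
supplier_scores = {"WIDGET-A": 0.97, "WIDGET-B": 0.81}

def unified_view(orders, events, scores):
    """Join the three silos so planners see one record per order."""
    events_by_order = {e["order_id"]: e for e in events}
    view = []
    for order in orders:
        event = events_by_order.get(order["order_id"], {})
        view.append({
            **order,
            "status": event.get("status", "not_shipped"),
            "eta_days": event.get("eta_days"),
            "supplier_score": scores.get(order["sku"]),
        })
    return view

for row in unified_view(erp_orders, logistics_events, supplier_scores):
    print(row)
```

Once such a unified view exists - typically in a cloud data platform rather than a script - advanced use cases such as predictive alerts and supplier performance monitoring become far easier to build.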
Embracing the Cloud The wider uptake of Cloud technology will play a key role towards enabling organisations to be more immersed in digitalisation. It’s all well and good wanting to implement the latest technology, but if a solution lacks flexibility then it won’t drive cost efficiencies as the business moves forward. Chosen solutions should enable real-time visibility, productivity and quality improvements. Organisations not only need to assess where the addition of a specific technology will provide the most value, but also ensure the technology they want to implement is based on a flexible platform for future scalability. The Cloud facilitates a real-time exchange of data, creating and promoting an environment of digital collaboration and integration. A collaborative supply chain that utilises the Cloud and the information available will better connect key stakeholders, providing real-time visibility, allowing organisations to proactively manage the supply chain, driving efficiencies and enhancing risk management. IoT is also an enabler, reducing the complexity of machine-to-machine communications. The introduction of IoT allows organisations to collect and analyse data through strategically placed sensors. The more useful data an organisation has access to, the more capability it has to utilise AI and implement successful and scalable solutions. So, what is going to be key to realising the benefits of the digitisation movement? Creating an integrated information layer fed by core systems, enabling collaboration between departments and being able to access the right information, at the right time, from anywhere in the supply chain. And, most importantly, empowering management to take the essential decisions required to embrace demand-driven manufacturing and to meet, if not exceed customer expectations.   Conclusion Ultimately, Industry 4.0 is a transition that is driven by customer demand and technological advancement. Industry 4.0 will continually be worked towards and the competitive landscape of each market will produce different paces for each industry. Take, for example, the development of autonomous cars. The use of this more complex technology and personalisation has directly impacted the rate at which automotive manufacturers need to update and accelerate the speed of their supply chains. Manufacturers across all industries know there needs to be a greater focus on speed, accuracy and agility within the end-to-end supply chain if they are to remain competitive and achieve Industry 4.0. And, the only way organisations are going to achieve this is by optimising processes between legacy and new systems, as well as providing key stakeholders with meaningful insight from real-time data sources.   ### Five Benefits of Leveraging the Cloud for Financial Services In March 2017, the Cloud Industry Forum reported that the UK’s overall cloud adoption rate had reached a record high of 88%, an increase of 5% from 2016. This may have surprised the financial services industry, which has traditionally been hesitant to adopt the cloud due to outdated security concerns. At Payment Cloud Technologies (PCT), we believe that this attitude is no longer an option in today’s digital world: the cloud is central to digital innovation, and financial services is perceived as one of the most dynamic and innovative industries around. In fact, new data from YouGov recently revealed that 23% of UK adults see the banking industry as leading the way in digital innovation. 
Slow-moving financial institutions that fail to meet these expectations may find that they lose business to better-equipped competitors that are already satisfying the needs of today’s digitally-savvy customers. And, with 67% of users expecting to upscale their usage of cloud services within the next 12 months, those that fail to keep pace will miss out on 5 key business benefits as a result: Cut Costs For years, the building, developing and operating of a bank has been expensive. New datacentres and servers had to be constructed, and staff recruited and trained to operate them. [easy-tweet tweet="Cloud computing means banks do not have to invest heavily in dedicated hardware" hashtags="Cloudcomputing, Data"] Unfortunately, these costs have deterred potential new market entrants from providing much-needed competition to the monopoly of the traditional high-street banks. But cloud computing means banks do not have to invest heavily in dedicated hardware and software with a limited shelf life, nor the manpower to maintain it. Instead, financial institutions can buy into the infrastructure of a secure, dedicated cloud service provider and focus on driving more money into their business. For example, ME Bank is a Melbourne-based retail bank with 800 employees managing $20 billion in assets and 280,000 customers nationwide. As a digital bank, it wanted to transition from its incumbent on-premises datacentre infrastructure to a more efficient system, and reduce operating costs in the process. After migrating its development and testing environments to the cloud, ME Bank reduced the cost of delivering development and test environments for new services and applications by 75%. Improve Flexibility and Scalability The cloud gives banks the ability to rapidly scale processing capability up and down according to ever-changing market developments and customer demands on a global scale. In today’s fast-paced world, with customer-centric digital banks garnering significant market share, the ability to act swiftly is critical to remaining competitive. Working with the cloud is a highly effective way to achieve this scalability at pace: 65% of organisations noted scalability as their top reason for initially adopting cloud services, and several financial services organisations have already realised this benefit: For example, Auka is a Norwegian mobile payments startup that allows users to send and receive money instantly, pay bills, view transactions, and obtain instant credit. Delegating the business’s server-side infrastructure to the cloud helped Auka to deliver impressive scaling capabilities. Daniel Döderlein, Founder and CEO, Auka, said: ‘’With App Engine, we can go from ten transactions a second to thousands a second without a hiccup. Expanding across borders into new markets becomes vastly more tractable. It scales beautifully." Increase Efficiency The cloud can help financial services organisations to streamline their operations to enjoy improved efficiency ratios and operating leverage. This is particularly important for businesses that are operating across multiple markets with numerous target demographics, and must optimise all aspects of their organisation to retain a high level of efficiency. Key to this process is the ability to interpret and analyse rich market data. Financial services organisations that action this effectively gain an advantage over their competitors by seamlessly negotiating new innovations and developments from that particular market. 
For example, PCT’s cloud-based digital.VISION middleware was developed to keep pace with industry changes, in order to apply the most robust and relevant technologies to businesses depending on their objectives. Further, FIS, a global leader in financial services technology and a strategic partner of PCT, runs US market analysis using the cloud. FIS’ Market Reconstruction Platform can collect and process data, and produce feedback reports within a few hours, with the ability to adjust to fluctuating market activity and support complex analytics. Serve Customers Faster Cloud computing makes new products and services easier to develop and launch, which is particularly important for a financial services industry that has traditionally been slow to respond to evolving customer requirements. Even at the prototype stage, non-cloud based applications can necessitate more than 1.5 years of development before they are ready to progress to market. With PSD2, Open Banking and other initiatives all underway or on the horizon, and ushering in a new wave of customer-centric, fast-moving digital banking, the cloud’s ability to deploy new features in less than 3 months will be crucial to both challenger banks and established institutions as they strive to stay relevant. Unisys, a global information technology company, recently worked with PCT to leverage the Microsoft Azure cloud as a key component of Elevate by Unisys, an omnichannel digital banking platform that enables startups and established financial institutions to deliver secure banking services anytime, anywhere. Crucially, digital.VISION’s agile cloud-native technology allows Elevate’s users to keep pace with the digital banking age by securely enabling quick-to-market current accounts, at a fraction of the time usually associated with delivering new banking features. Forge Stronger Customer Relationships The cloud’s combination of big data and potentially unlimited computing power allows banks to secure better insight into their clients than ever before. Financial services organisations that ignore this unprecedented opportunity to develop systems that are highly customised to their customers’ expectations will pay the price by losing crucial demographics. For example, 52% of UK SMEs say banks are not business friendly, while 44% of millennials in the US do not feel that their bank understands them. In an increasingly competitive market, it is unlikely that banks will survive without understanding these groups and servicing them accordingly. When ING Direct changed its name to Tangerine in 2013, it adopted an entirely new business model to ensure that it could access customer feedback more easily, and deliver services based on that feedback effectively. The bank achieved this by using Microsoft’s Analytics Platform System (APS) and Azure HDInsight. After transitioning 45 business intelligence (BI) end users to a Microsoft BI environment, Tangerine was able to convert customer data into tangible insights faster and more easily than was previously possible. This deep understanding of Tangerine’s customers has allowed it to deliver the incentives and services required to retain and grow its customer base. Further, using the Microsoft cloud, Tangerine has been able to adjust new product rollouts based on customer reactions in real-time, keeping the bank one step ahead of its competitors. 
Conclusion As Bloomberg reported last year, a minimum of 25 of the world's 38 largest financial organisations and insurance businesses have begun to work with Microsoft to start putting applications onto the cloud. As the cloud adoption rate continues to grow, financial services organisations big and small must ensure that they do not disappear into obscurity by failing to keep pace with this key trend. PCT believes in the power of the cloud for business and consumer alike, and we have already helped many financial institutions to secure swift market entry using the cloud. As the industry progresses further and further into full digitisation, the benefits of the cloud will be too important to ignore. This makes it crucial for businesses to act now in order to cope with an unpredictable and constantly changing financial services landscape.

### IT Management: The 3 Golden Rules to Comply with GDPR

The new EU General Data Protection Regulation (GDPR) is coming, and will officially apply from May 25th, 2018. It establishes a single law to enforce European data protection rules, as well as the right to personal data protection. GDPR has been widely commented upon, especially regarding how non-European big tech companies will have to handle personal data under the new extraterritoriality rule (Art. 3). But GDPR is also about how any company must protect and manage its data, and prevent breaches and theft. At a time when Shadow IT and the use of consumer public cloud solutions have never been higher within the enterprise, many companies will be forced to make significant changes to be sure that personal data is not spread across uncontrolled public clouds. It has become a necessity, since serious infringements of the regulation can lead to a fine of up to 4% of a company's annual revenue (Art. 5 & Art. 7). HOW WILL GDPR AFFECT COMPANIES? Making a complete list would be too long, but here are the main points in GDPR that will have a major impact on companies and that any IT manager should bear in mind: The right to erasure and "to be forgotten" (Art. 17): Companies must be able to easily find specific data, target it, and automate the removal of personal data upon request. Implement data protection "by design and by default" (Art. 25): The Privacy by Design (PbD) rule includes minimising data collection, deleting personal data that is no longer necessary, and securing data through its entire lifecycle. Records of processing activities (Art. 30): Companies must implement technical and organisational measures to properly process personal data. Notification of personal data breach to the supervisory authority (Art. 33): This will include having a response plan in place. Data Protection Impact Assessment - DPIA (Art. 35): Companies should create data protection risk profiles, and assess the processing of sensitive data. A Data Protection Officer (Art. 37-39) will be responsible for advising on and monitoring GDPR compliance. GOLDEN RULE N°1: DEFINE A DATA POLICY One could argue that GDPR simply legislates common-sense data security ideas that many IT departments are already aware of. The big change is that they will now have to take action, with a corporate roadmap and strict governance principles in mind. First, focus on your data storage infrastructure: identify where personal data is located, and try to build a consistent architecture so you can track and monitor what becomes of this data.
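To illustrate the "identify where personal data is located" step, the sketch below shows one hypothetical shape a first-pass discovery scan could take: walk a set of file exports and count matches for a few personal-data patterns. The patterns, file types and directory are assumptions for illustration only; a real discovery exercise would also cover databases, SaaS applications and many more categories of personal data.

```python
# Hypothetical first-pass scan for personal data in file exports; patterns are illustrative only.
import re
from pathlib import Path

PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"\b0\d{9,10}\b"),
}

def scan_file(path: Path) -> dict:
    """Count candidate personal-data matches per category in one file."""
    text = path.read_text(errors="ignore")
    return {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}

def scan_tree(root: str):
    """Walk a directory of exports and report files that appear to hold personal data."""
    root_path = Path(root)
    findings = []
    if not root_path.exists():
        return findings
    for path in root_path.rglob("*.csv"):   # extend to other formats as needed
        hits = scan_file(path)
        if any(hits.values()):
            findings.append((str(path), hits))
    return findings

if __name__ == "__main__":
    for location, hits in scan_tree("./exports"):
        print(location, hits)
```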
Then, define an official security policy and share it within your company: data encryption (in transit / at rest), secure access methods (multi-factor authentication), sharing documents with passwords and expiring links, etc. Identify your criteria depending on the specifics of your professional activity and your use cases. Also, be very strict about the use of personal devices (BYOD) inside your company. GOLDEN RULE N°2: TRACK WHO ACCESSES YOUR DATA As soon as you manipulate personal data, you will be accountable for the use you are making of it. Very often, data breaches are caused by a mistake made by an end user and do not involve the infrastructure or IT policy. Sharing data with the outside world can never be 100% safe, which is why you must be sure that you have enabled all protection options. You must understand who is authorised to access personal data in the corporate file system and how they access it, and define permissions based on your collaborators' real usage, beyond their team membership and job titles. In other words, implement "role-based" access controls. GOLDEN RULE N°3: MONITOR YOUR DATA FLOW Data loss compliance and breach notification requirements place a new burden on IT departments and data managers. The new IT golden rule is now "always monitor". You will need to be alerted to suspicious activity and potential security incidents, spot unusual access patterns to files containing sensitive data, and promptly report any exposure to your local data authority. This underlines the need for adequate solutions, especially regarding file sharing and collaboration tools. File sharing solutions must come with a powerful admin dashboard giving IT managers a comprehensive view of what is going on across the platform, user behaviour analytics (UBA), monitoring of sharing processes and the devices involved, and the ability to draft reports based on all the data they can access. GDPR clearly imposes new constraints on companies: turning them into a business advantage requires IT departments to define and explain new processes, adapt the IT infrastructure where necessary, and implement the right sharing and collaboration tools on top of it.

### Five Benefits of Cloud-Based Technology

Whether organisations are moving to a cloud infrastructure, using cloud-based software or storing files in the cloud, many are using some aspects of this handy technology. With around 93% of businesses currently using the cloud in some form and worldwide public cloud services revenues expected to grow 18.5 percent in 2017 to $260bn, up from $219.6bn last year, according to Gartner, there's a reason that companies are increasingly turning to the cloud. Let's look at five benefits that cloud-based technology brings to your business at peak times. Easy collaboration Cloud technology encourages collaboration, making it easy for people around the globe to connect, communicate and share information for better team working. It's simple to store, view and edit information in one central location, and documents are updated in real time. Everyone's kept in the loop and colleagues can 'meet' virtually, no matter where they are in the world. Research shows that unleashing business builders from burdensome admin could unlock over $600bn in lost productivity and save businesses an average of 120 working days per year on administration alone.
Better collaboration brings a heap of gifts to the table, including boosted productivity, improved customer service and engaged, enthused employees. Happy employees According to Centre for Retail Research, Christmas spending has continued to grow globally in spite of the gloom, with spending expected to rise to £747.4bn this year. This upturn in business, combined with employees fighting winter colds, struggling with commuting snow disruptions or keen to see their little angel starring in the school play, means it can be a challenge to keep on top of the work. Cloud-based technology makes it easy for employees to work flexibly, with people able to log on, access, share and edit information from anywhere – and even on the go. With most cloud services offering mobile apps, it’s simple to work from any device, so everyone can keep up to date, including sales staff, remote workers and those needing a bit more flexibility at this time of the year. Flexible working helps employees to enjoy a better work-life balance, without impacting negatively on productivity. Better productivity and flexibility If your business has peak times or different staffing demands, you can really benefit from the flexibility that the cloud offers If your business has peak times or different staffing demands, you can really benefit from the flexibility that the cloud offers. It’s easy to scale cloud services up or down, depending on need. This means you can temporarily increase capacity to handle busy periods like Christmas – without having to shell out for hardware or software that wouldn’t be used at quieter times. Not only that, with cloud-based software all the latest updates are immediately available to users. Getting instant upgrades gives employees new features and functions sooner, making them more productive. Improved data security At busy times, the last thing a business needs is for technology to fail them. Downtime can lead to all kinds of problems, including reduced productivity, lost revenue and reputational issues. A reputable cloud provider will maintain reliable, secure services, giving you peace of mind that all data is secure, safe and recoverable, no matter what happens. It’s easily accessible from anywhere and any device. And if something does go wrong, a cloud host should resolve issues quickly, getting you back up and running in no time. [clickToTweet tweet="Cloud can also solve the chaos of documents & data being stored in different formats & on personal drives & devices." quote="Cloud can also solve the chaos of documents and data being stored in different formats and on personal drives and devices. "] Cloud can also solve the chaos of documents and data being stored in different formats and on personal drives and devices. This causes mass confusion, security issues and plenty of stress. Storing data and documents in one place on the cloud means everyone’s accessing the same, up-to-date, authorised information, leading to less human error and a clear record of any updates. A business boost All of these benefits can lead to fewer stresses at peak times – and, more importantly, will also give business a lift. Your company will be more agile as capacity can be flexed up or down to meet demand, and you can take advantage of peaks in the market without significant costs. With streamlined processes and more engaged, productive employees able to connect and collaborate from around the world, you’ll gain a real competitive advantage putting you ahead of your competitors. 
### The eight most common cyber-threats, and how to mitigate them While attackers constantly come up with new tactics and tricks, their overall strategies remain the same. Understanding the patterns is a critical success factor to protect your organization against cyber-threats. As much as the digital universe grows, so does its shadow world. Cybercrime has become a lucrative ecosystem that is evolving quickly. Damages are predicted to reach US$8 trillion by 2021, according to Juniper Research. While criminals tend to come up with new tactics and tricks, their strategies follow the same patterns. Based upon research conducted by Verizon, more than 88 percent of all incidents fall into one of the following eight categories: [clickToTweet tweet="'...#Cybercrime has become a lucrative ecosystem that is evolving quickly...'" quote="'...Cybercrime has become a lucrative ecosystem that is evolving quickly...'"] Crimeware This includes all kinds of malware designed to automate cybercrime, with Ransomware being the most prominent example. For the criminal, launching an attack and holding files for ransom is incredibly fast, of low risk and easy to capitalize on — especially with cryptocurrency such as Bitcoin that allows them to anonymously pocket payments. How to mitigate it It all begins with constant patching and stressing the importance of software updates – that applies not only for the latest anti-virus patterns, but also for applications and even the operating system itself. Watch out for macro-enabled MS Office documents and train users never to click on suspicious links. Also, create backups regularly to be able to redeploy clean images if needed. Distributed Denial of Service (DDoS) Attacks aiming to interfere and compromise the availability of networks and systems belong into this category. DDoS attacks are often targeted at large organizations. While some poor souls face a constant interruption, most attacks are over within a matter of days. How to mitigate it Understanding the types and levels of mitigation you need is key. Beyond setting up firewalls and putting close monitoring in place, embed DDoS mitigation into your business continuity and disaster recovery concepts. Make sure that you have DDoS mitigation services in place to defeat any attacks, that they’re regularly tested, and that they actually work as planned. Espionage Increasingly, state-affiliated actors are entering the scene aiming to gather intelligence or aid their local economy, for example. Whether it’s a malicious e-mail or other types of malware that paves the way in, this is usually followed by tactics aimed at blending in, giving the hacker time to quietly capture the desired digital assets. How to mitigate it Conduct regular security awareness training and encourage your teams to report phishy e-mails. Make it difficult for the adversary to jump from a rigged machine to other devices on your network. Apart from leveraging networking security to prevent unauthorized access in the first place, again close monitoring will help you to discover suspicious activities. If you have reasons to believe that there have been attempts or an attack is underway, get the authorities involved quickly. Fraud An emerging tactic includes suspicious e-mails where “the CEO” or another senior official suddenly orders wire transfers with an urgent and believable back-story. While it might sound simple, unfortunately it often works. 
How to mitigate it Instruct your teams — especially in finance — that no one will request a payment via an unauthorized process. Moreover, ask IT to mark external e-mails with an unmistakable stamp. Human Error Where wood is chopped, splinters must fall. However, data lost through human error can be harmful too — especially if it’s the customer or a supplier who makes you aware of your mishap. How to mitigate it Implement and enforce a formal procedure for data lifetime management, especially disposing digital assets that might contain sensitive data. Furthermore, establish a formal approval process that at least requires a four-eye principle before releasing corporate information to the public. Insider Misuse Users are often unaware and easy to dupe, while others even act with malicious intent. Some insiders abscond with data aiming to convert it into cash in the future. Whether it’s a case of unsanctioned snooping, taking data to a new employer or setting up a competitor, insiders statistically account for around a third of all cyber-threats. How to mitigate it Implement access control, logging and monitoring of use, and look out for large data transfers and use of USB devices. Enforce encryption on data in use, at rest, and in motion. Skimming Skimming devices are typically placed on terminals that handle payment transactions — such as ATMs, POS terminals or gas pumps. While ATMs continue to be the prime target, the number of gas pump terminals used to collect payment card information more than tripled compared to 2016. How to mitigate it Train employees who carry corporate payment cards to spot signs of tampering, monitor your own payment terminals with video surveillance, whenever possible, and make sure the recordings are reviewed regularly. Web Application Attacks Not all web applications are trustworthy. While they don’t necessarily hold payment card data, they do often request users to submit their names, addresses and other sensitive information. Security is often weaker than online retail sites, so attackers use them as an easy way to capture personal data and credentials for use elsewhere. How to mitigate it Encourage users to vary their credentials and leverage two-factor authentication. Limit the amount of sensitive information stored in web-facing applications. Takeaways Only when aware of the threat landscape can you identify white spots and come up with measures to mitigate risks. If you were off to Everest, you would probably leave the shorts at home and double-up on the thermal wear. The same applies when assessing where to spend your precious budget. The themes above help you understand the most common patterns. Only when aware of the threat landscape can you identify white spots and come up with measures to mitigate risks. You don’t have to be big, rich or famous to become a target. Cybercrime is part of today’s reality and literally affects everybody. It’s often about identity theft, collecting credit payment card data and cloning the identities of everyday people. Similarly, it’s not just households finding themselves on the target list. Start-ups are chased for their breakthrough inventions, blue chips often fall victim for their customer records, and others are identified as a soft target and stepping stone to exploit their partners’ ecosystems. Cybercriminals don’t rely on the status quo. As the value of some forms of data falls, they are casting their nets wider and come up with new tactics. 
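Several of the mitigations above come down to monitoring for anomalies - for instance, the insider misuse advice to look out for large data transfers. The sketch below is a minimal, hypothetical illustration of that idea, comparing each user's daily outbound volume against their own typical day; the log format, figures and threshold are invented, and a real deployment would lean on proper DLP or SIEM tooling rather than a script.

```python
# Hypothetical sketch: flag users whose daily outbound data volume dwarfs their usual activity.
from collections import defaultdict
from statistics import median

# (user, day, megabytes transferred out), as a proxy or DLP log export might provide.
transfers = [
    ("alice", "2024-05-01", 120), ("alice", "2024-05-02", 135), ("alice", "2024-05-03", 2900),
    ("bob",   "2024-05-01", 80),  ("bob",   "2024-05-02", 95),  ("bob",   "2024-05-03", 90),
]

def flag_anomalies(rows, multiplier: float = 10.0):
    """Flag days where a user's outbound volume is far above their own typical day."""
    by_user = defaultdict(list)
    for user, day, mb in rows:
        by_user[user].append((day, mb))
    alerts = []
    for user, days in by_user.items():
        baseline = median(mb for _, mb in days)
        for day, mb in days:
            if mb > multiplier * baseline:
                alerts.append(f"{user} on {day}: {mb} MB vs a typical {baseline:.0f} MB")
    return alerts

print("\n".join(flag_anomalies(transfers)))
```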
While no system is 100% secure, too many organizations are just making it far too easy for criminals. The following seven tips cover simple mistakes that happen time and again: Start with physical security, as not all data theft happens online Restrict access rights Train staff to spot the warning signs and trigger alerts Patch promptly Monitoring, log files and change management systems can give you early warning of suspicious activities Use two-factor authentication to limit the damage of a lost or stolen device Encrypt sensitive data, so that it is next to useless when being stolen ### Do Digital Tools Make us More or Less Productive at Work? [vc_row][vc_column][vc_separator][vc_column_text]Technology has influenced almost every aspect of our life. In fact, we cannot imagine a day or two without the use of tech gadgets. From smartphones to PDAs and laptops to tablets, we are living in an age where spending your time without these handy gadgets is almost impossible. With the evolution of technology, various digital tools emerged to help us work smartly, be more productive and save our time. However, the question is whether these digital tools help us be more productive or just waste our time at work. To give a brief overview, let us have a look at some of the studies and statistics revealing the facts. Employees who work in digital workplaces are not only more productive but also more motivated, have higher job satisfaction, and report an overall better sense of well-being, according to a new global study from Aruba, a Hewlett Packard Enterprise company. In a report by Aruba Networks, almost three quarters (74%) of the employees that work in fully-enabled digital workplaces said their job satisfaction is good or very good, while 70% reported their work-life balance to be good. Employees were also 60% more likely to say they are motivated at work and 91% more likely to praise their company’s vision when using digital workplace technologies. Few people might disagree that the use of digital tools at the workplace has a negative impact on employee’s productivity. However, the article is centered on looking at the brighter and darker side of the statement. Let’s get started. Revenue Growth The purpose of any digital tool is to reduce the amount of work and be more productive. For instance, if an accounts manager has to prepare financial statements for a whole year manually, what are the consequences? There are a number of chances of human error while adding or subtracting numbers. On the other hand, if the same accounts manager utilizes tools like QuickBooks or Sage, the chances are fewer to none for any error. Imagine if the same thing applies to the whole organization. Think about the level of productivity that can be achieved, provided every employee is equipped with relevant tools to help complete their job. Communication on the Go The rise in digital tools has filled the gap of communication that people experienced in the 80s and 90s. With the rise of social media websites and apps like Slack, Skype, and WhatsApp, businesses can now communicate 24/7. With that said, 24/7 access can help organizations communicate anytime and anywhere without any limitations. They can provide support to customers, answer queries, and address any concerns on the go. Moreover, employers can connect with employees and communicate their concerns whenever they want. Imagine old means of communication, such as postal mail where you had to wait for the mail to be delivered and receive a reply. 
Moreover, contacting someone on the landline was also quite challenging, as the availability of the person was uncertain. Scheduling Tasks There are thousands of tools and apps that help you keep track of your important tasks. Missing an important task at a certain time means a huge loss to the company. Imagine that you had to dispatch a parcel at 12 noon but were unable to do so on time because of some other engagements. In the end, you lose the client as well as money. However, on the contrary, digital tools help you keep track of your important tasks. Trello, Todoist, and MyLifeOrganized are some of the tools that can manage your list, prioritize them and notify you on the go. Securing Sensitive Data  Sensitive data in today’s digital era is a big concern for financial institutions and enterprises. Organizations that collect data have to be very careful as the loss of data can result in a huge setback. Especially, if the organization is using hard drive or USB to store data, it might be vulnerable to viruses, malware, and hacking. Digital tools have somehow facilitated enterprises to store data securely. Google Drive, Dropbox and Amazon Drive are some of the tools that can be used to store data as well as scale up with the passage of time. Networking Capabilities Networking capabilities and opportunities have increased with the rise in digital tools. Companies can connect with anyone working in the same industry or approach potential clients interested in their products or services. For instance, Twitter and LinkedIn are widely popular to make connections. It does not require any cold calling or emailing, just connect and say your word. Before the emergence of these tools, imagine how difficult it was for business people to connect with the influencers and potential customers. They had to meet in person with an extra cost to bear if they resided outside the town or country. Remote Access to PCs With the evolution of technology, managing multiple devices has gone a lot easier. For instance, Thin Clients help you and your employees to access virtualized desktops and applications in VDI environments with zero downtime and maximum productivity. You do not need to spend extra money on maintenance of devices or cooling equipment. Thin Client has proved to be a game changer in terms of revamping IT infrastructure. Easy to install, setup and run, you can enhance the productivity of your employees. It facilitates access to virtual desktops and applications anywhere, anytime and from any device. Conclusion Digital tools have played a positive role in enhancing the productivity of employees and led to greater revenue generation. However, sometimes these tools are used in excess, which might hinder workflow and productivity. For example, using social media websites for your entertainment instead of promoting the business and interacting with clients will lead to a negative impact. Organizations that deploy several digital tools without any purpose can also distract employees, their productivity and workflow. It is important to analyze the tools before you deploy and evaluate how they can enhance productivity and efficiency of the company.[/vc_column_text][/vc_column][/vc_row] ### What are the Key Strategy Drivers Moving your Business to the Cloud? Your organisation will move to the cloud. Whether you’ve already started adopting cloud technology, are in the midst of planning a strategy to do so or can’t quite fathom the change just yet, it is inevitable at some point in the not so distant future. 
But why is cloud adoption so inevitable? What's driving organisations to make this move? There's actually a strategic reason behind every cloud adoption effort, and identifying and prioritising your reasons, also known as cloud drivers, is the key to success in developing and executing your cloud strategy. Why You Need to Identify Your Cloud Drivers Understanding why you're adopting cloud solutions in the first place is a crucial step to achieving your desired results. While to many this may seem obvious, more often than not organisations fail to identify their cloud drivers in a documented strategy. Having this information clearly outlined as part of your cloud strategy is important because: It will help keep your cloud journey on track. With all of the "shiny objects" in the cloud world, it's easy to get side-tracked from your original vision and goals. Identifying and documenting your cloud driver(s) from the start will help you stay true to that vision. It will help you prioritise any conflicts among drivers. Over time, your drivers might come into conflict with one another. For example, if you want to cut costs, you might end up inhibiting growth or hurting your customer experience. As you identify your cloud drivers, you should also prioritise them so that you have a clear answer as to which one takes precedence should any conflicts arise. Understanding the Six Possible Cloud Drivers With all of that in mind, what might your cloud drivers look like? When you boil it down, there are only six drivers for cloud adoption: three business-focused drivers (business growth, efficiency and experience) and three technology-focused drivers (agility, cost and assurance). Here's what you need to know about each one: Business Growth Business growth is one of the top benefits organisations realise as a result of cloud adoption. If you identify business growth as a cloud driver for your organisation, you also need to decide how you will define growth. The answer should depend on the maturity of your business. Based on that answer, you then need to outline your plans to achieve that goal and determine how cloud technology will help in that pursuit. Efficiency Efficiency is an extremely common cloud driver, with 71% of organisations worldwide ranking it a top area they hope to improve through cloud technology (2015 Cloud Enterprise Report). At its core, efficiency is about removing unnecessary steps to streamline processes in order to increase productivity or deliver on customer requirements faster. As a result, increasing efficiency also supports business growth, by increasing worker productivity, and experience initiatives, by fulfilling customer needs faster.
Experience Next among the business drivers is improving the quality of the customer experience, which 45% of enterprises worldwide rank as a top cloud driver (2015 Cloud Enterprise Report). Two of the most common ways organisations are achieving this goal with cloud technology are by introducing new channels of engagement and improving workplace productivity. However, the experience driver can come into direct conflict with goals around cost and efficiency. For example, if you introduce an automated phone tree for customer service, you might cut costs and increase efficiency, but your customer experience will suffer. Agility Improving IT agility is a top technology driver for 66% of organisations worldwide Improving IT agility, or enabling IT to be more responsive to business needs and react faster to market changes, is a top technology driver for 66% of organisations worldwide (2015 Cloud Enterprise Report). This goal is very attainable in the cloud environment, as SaaS technologies mean that IT no longer needs to be consumed with traditional application management tasks. Cloud technologies are also easier to enhance and swap out to accommodate changing business needs. However, it’s important to recognise that the cloud environment requires a new set of IT skills around managing and brokering these products. Additionally, while increased agility can benefit the internal customer experience by making it easier to introduce new technology that users want, it can also harm it by diluting ITs ability to provide expert support. Cost The cost driver has two sides: Reducing IT expenses and restructuring these expenses to spread them out over time thanks to the licensing model of SaaS technologies. While 39% of organisations do consider reducing IT expenses a key cloud driver (2015 Cloud Enterprise Report), cost generally takes a backseat to other cloud drivers as organisations typically choose to reinvest any savings to help achieve other goals, such as increased agility and improved experience. Assurance Finally, we have assurance, which is the idea that your data will be more secure in the cloud and you’ll attain better uptime because your solutions are maintained by providers who have built their businesses around these competencies. As a result, it’s no surprise that 73% of organisations report IT has benefited the most from adopting cloud technologies (2015 Cloud Enterprise Report). Although organisations in highly regulated industries do need to be more cautious here to ensure they maintain compliance with data residency laws, the cloud is still a viable option. In general, the benefits provided by assurance can also help further initiatives around IT agility. Beginning a New Cloud Initiative? Always Ask Why! Before you begin any new cloud initiative, it’s essential to identify which of these six possible drivers is behind your decision to move to the cloud. Once you do, you also need to prioritise it against any other drivers you have identified as part of your cloud strategy. As outlined above, doing so is essential to success because it will help keep your activities focused on your desired end goal and make it easier to resolve any conflicts between drivers when they arise. ### What is a treasury management system? Treasury Management Systems: A Primer Treasury management systems (TMS) have evolved significantly over the last few years, in large part thanks to a maturing cloud market and the continued advancement of remote banking communication systems. 
For example, the growing impact of SWIFTNet, emergence of EBICS, and end-of-life cycle for the ETEBAC protocol have conspired to make TMS solutions a near necessity for companies of all types and sizes. Despite the growing need and market for treasury management, however, there’s still a great deal of confusion about the critical purpose and key benefits of using a TMS. Here’s what you need to know. TMS Basics What exactly is a TMS? At their most basic, treasury management systems allow companies to automate critical financial operations — such as communication with banking partners or pulling cash flow data in real time — while ensuring finance data remains secure. As noted by Treasury Today, it’s easy to conflate TMS and ERP (enterprise resource planning) systems since they fill similar roles in your organisation The key difference, however, is specificity: While ERP solutions now offer many treasury functions and banking links, they remain jack-of-all-trade tools designed to provide a “single system” approach to back-office functions. The complexity of treasury demands, however — such as the need for high-level risk management analytics, complex product coverage, and compliance with federal finance standards — makes a TMS the ideal solution for dedicated treasury support. Local Vs. Hosted  If you’re in the market for a TMS solution, there are two basic “types” to consider: local and cloud-hosted. Local systems, sometimes known as “installed” TMS, are either developed in-house or purchased from trusted, third-party vendors and then installed on local servers. This option provides greater control over features and security protocols since the system is used exclusively by your organisation. The trade-off? You need superb third-party support or highly skilled local IT pros to ensure the TMS is properly managed day to day. Cloud-based solutions are your other option. These software-as-a-service (SaaS) deployments offer a number of benefits such as speedy implementation and deployment, high availability and built-in provider security. If you’re comfortable putting the system at arm’s length, this is an ideal way to achieve TMS functionality without breaking the bank. Key Advantages Why choose a TMS? There are a number of key functions that directly impact your bottom line. For example, cash management gives you a comprehensive view of balances for both central and regional treasuries; liquidity management lets you easily determine cash on hand and availability of investment capital; while accounts management ensures that all transactions are reported, stored and available on demand for critical analytics. Ultimately, the right TMS removes the need for IT and Fintech staff to manually enter transaction or revenue data, in turn reducing your total error rate and increasing the amount of time your staff can dedicate to line-of-business projects. Piece by Piece Traditionally, standalone TMS solutions didn’t “play nicely” with other tech solutions, meaning IT staff were required to expend significant time and effort managing and troubleshooting treasury management systems. Cloud-based alternatives, however, offer the critical benefit of flexibility; they are both compatible with a wide range of ERP and in-house tools and yet separate from these tools, reducing the chance of a software conflict. In addition, cloud options let you pick and choose the services and functions that best suit your financial needs rather than forcing you to adopt an all-in-one solution. 
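To make the cash management advantage described earlier a little more concrete, the sketch below consolidates balances reported by several banking partners into a single position per currency - the kind of group-wide view a TMS automates. The feed format, bank names and figures are invented for illustration; a real TMS would ingest statements over channels such as SWIFTNet or EBICS rather than a hand-written list.

```python
# Hypothetical sketch of consolidating bank-reported balances into one position per currency.
from collections import defaultdict

bank_feeds = [
    {"bank": "Bank A", "account": "GBP-OPS-01", "currency": "GBP", "balance": 1_250_000.00},
    {"bank": "Bank B", "account": "EUR-OPS-01", "currency": "EUR", "balance": 430_500.50},
    {"bank": "Bank A", "account": "GBP-PAY-02", "currency": "GBP", "balance": -75_000.00},
]

def consolidated_position(feeds):
    """Sum balances per currency so treasury sees group-wide cash on hand."""
    totals = defaultdict(float)
    for item in feeds:
        totals[item["currency"]] += item["balance"]
    return dict(totals)

print(consolidated_position(bank_feeds))   # e.g. {'GBP': 1175000.0, 'EUR': 430500.5}
```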
Treasury management systems play a critical role in keeping corporate finances on track. The rise of flexible cloud-based solutions, meanwhile, provides both enhanced agility and control to maximise the efficiency and accuracy of all cash reporting. ### How Internet of Things Is Transforming the Food Industry Internet of Things (IoT) is defined as the network of devices that gather and convey data via the Internet. Slowly, but surely, the Food industry is getting acquainted with the Internet of Things. With the number of remarkable applications of the Internet of Things the food suppliers, processors, and retailers are experiencing good opportunities for operational as well as financial augmentation in their food businesses. the IoT network in the food supply chain greatly aids to trim down waste, costs, and risks Each functional area of the food industry has been significantly getting better with the integration of IoT. For an instance, IoT facilitates food companies to ensure superior levels of traceability, food safety and, therefore accountability all through the farm-to-plate supplies chain operations. Moreover, the IoT network in the food supply chain greatly aids to trim down waste, costs, and risks as well, in all stages of the procedure. Apart from this, the list of benefits offered by IoT technology in the food industry is quite long; consequently, the impact is also extremely impressive. Let’s check out some imperative aspects describing the grand consequence of IoT in Food Industry: Better Food Safety The implementation of the internet of things (IoT) in the food industry has been considerably diminishing the risk of food illness epidemic. Different kinds of sensors are being used to monitor essential production state, shipping time and most essentially the temperature. The utilization of real-time temperature tracking sensors allows organizations to closely supervise food safety data points, ensuring effective cold chain management. With IoT, the supply chain will be able to function jointly to become acquiescent with global and local regulations. Automated Hazard Analysis and Critical Control Points (HACCP) checklists are being used throughout the manufacturing, production, and transporting procedures thereby companies can get access to meaningful and consistent data that enable them to put into practice some food safety solutions. Logistics With the help of RFID (Radio Frequency Identification) transmitters and GPS systems, the distribution chain can be effectually monitored all along the whole storage and shipping course at the sales points or stores. This also enables companies to be acquainted with the preferences of customers, better reply to market requirements and decrease surpluses. The advanced Radio Frequency Identification (RFID) tracking provides supreme visibility into the food supply chain, helps automate delivery and shipping processes and effectively monitors and controls temperature. It also enables shippers to track a location of the product with GPS. By collecting the meaningful data, shippers can estimate performance in a number of regions including understanding customer behaviour to diminishing deadhead miles in truck fleets. Supply Chain Transparency Generally, consumers or buyers expect transparency from the agencies that they purchase from. Employing traceability and transparency throughout the global supply chain will aid food agencies to succeed in business by acquiring customer loyalty and trust. 
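Returning to the real-time temperature tracking described under food safety above, the core of a cold-chain check is simple to sketch: flag any reading that falls outside a safe band. The reading format and thresholds below are assumptions for illustration, not taken from any specific IoT platform.

```python
# Minimal cold-chain sketch: flag temperature readings outside an assumed safe range.
SAFE_RANGE_C = (0.0, 4.0)   # assumed band for chilled goods

readings = [
    ("SHIP-42", "2024-06-01T08:00", 2.8),
    ("SHIP-42", "2024-06-01T09:00", 5.6),   # excursion
    ("SHIP-43", "2024-06-01T08:30", 1.9),
]

def excursions(rows, low=SAFE_RANGE_C[0], high=SAFE_RANGE_C[1]):
    """Return readings that fall outside the safe temperature band."""
    return [(ship, ts, temp) for ship, ts, temp in rows if not (low <= temp <= high)]

for ship, ts, temp in excursions(readings):
    print(f"ALERT {ship} at {ts}: {temp} °C outside {SAFE_RANGE_C}")
```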
Even though international and domestic regulations can amplify the intricacy of the global food supply chain, IoT technology makes it convenient for both companies and consumers to track products. Transparency also brings added benefits for companies, including improved inventory management, cost savings and faster lead times. Businesses can realise these benefits by identifying and resolving inefficiencies in the supply chain, meeting and exceeding food safety regulations, and offering transparency to customers. Production and Storage Sensors can significantly improve quality control, product tracking and the monitoring of workers’ activities, while enabling real-time analysis of production. In flour production, for example, sensors continuously inspect colour and specks, which helps rectify any deviation immediately. Sensors can also gauge moisture, protein and ash content, allowing real-time optimisation of the production process. Wastage Reduction According to the Food and Agriculture Organization of the United Nations, nearly one-third of the food produced for human consumption is wasted globally each year. This not only costs money but also damages the environment by adding carbon emissions to the atmosphere. IoT technology makes it possible to monitor the state of food products effortlessly and send real-time information to the responsible person, which reduces food wastage. Addressing Issues Faster Most maintenance today is preventive or reactive rather than predictive. By employing remote equipment monitoring, issues can be foreseen before they occur, saving both money and time. Conclusion With the availability and growth of Internet of Things (IoT) solutions and services, the food industry is experiencing some key transformations, particularly in food safety. Food manufacturers can leverage IoT technology to maintain the highest quality standards, with the assurance of delivering the same product at the same high quality anytime, anywhere. Implementing an IoT solution in any function of a food business could prove to be a highly beneficial decision in every respect. If you are looking to integrate IoT solutions into your food business, get in touch with us to discuss IoT app development services and solutions tailored to your specific requirements. ### 7 Common ERP System Security Problems and Safety Steps ERP (enterprise resource planning) systems have evolved significantly in recent years. Modern systems can now automate practically all day-to-day business processes, including human resources, sales, stock management, and so on. That’s why many organisations are now choosing ERP systems. The advantage of all-in-one solutions like ERP is that they remove the need for multiple software applications, improving data consistency and ensuring all aspects of daily operations are compatible and accessible.
However, as with any sort of fully comprehensive system which covers such a broad spectrum, there are naturally going to be some weak spots and vulnerabilities that are important to keep an eye out for. Here are 7 common ERP system security problems, and handy hints on how you can avoid them: Delayed Updates It’s reported that a whopping 87 percent of business computers feature outdated software, including ERP systems that are not up to date. If your version is currently unsupported, it can make it difficult to rectify any issues, such as crashes. More importantly, it leaves your business vulnerable to risk. Updates happen for a reason; sometimes to introduce new features, but mostly to address weaknesses that have been identified in the software. The world of cybercrime is changing constantly, and hackers are finding ways to get around even the latest measures. That’s why installing updates as soon as possible is vital. How to Avoid: If you’re finding you’re often lagging behind when it comes to installing ERP updates, then it might be worth looking into an automatic updater which applies any software updates when available. Full Access Rights The biggest threat to businesses undoubtedly comes from external sources, but that doesn’t mean we can sit back and ignore potential in-house risks. Full access rights shouldn’t come as default; instead, it’s important to look at who has access to what data. For example, in most cases, a software developer wouldn’t require access to employee salary information. It’s also worth looking into which employees have permissions to make changes to the system. Access rights and permissions will largely depend upon the needs and requirements of your business, but as a general rule, access should be granted on a ‘need to know’ basis. How to Avoid: It’s important to maintain audit logs to track any changes. It’s also worth adding ‘authorizations’ to checklists for new hires, promotions, and any role change documentation. Inadequate Training Following on from the above, it is certainly worth considering the security risk posed by internal sources in more detail. In some cases, the risk may be intentional and malicious, but in most cases, it is more likely to be the result of a lack of understanding. This could be a lack of understanding of the ERP system as a whole, or of what is expected by the organisation in terms of security. This is especially true for new hires who do not have an in-depth knowledge of internal processes. While any errors may be classed as ‘innocent mistakes’, they still leave your business open to security risks. How to Avoid: Ask your ERP provider if system training is included as standard, nominate staff to train new hires, and ensure business protocols are widely available and easily accessible to all employees. Failure to Comply If your ERP system is being used to store confidential sales information, including personal details and payment details, then it’s essential that the system meets local security standards requirements. This could include PCI DSS requirements if credit card data is involved. The system itself should store details in encrypted form only, without retaining the 3-digit security code, and there are requirements for the business, too. You’ll be required to maintain secure passwords, restrict access to ‘need to know’, and track access to the data that you keep. You may also need to comply with regulations within your sector.
How to Avoid: Choose an ERP system that’s designed to comply with necessary regulations. It’s also important to change your vendor-issued password and adhere to good security practices at all times. Use of Unauthorised Systems The whole point of ERP is integration; to remove the need for what is known as ‘Frankensteining’. Frankensteining happens when multiple software programs are used simultaneously to achieve a single goal, such as maintaining sales data on an ERP but running reports using Excel. This practice still takes place across many businesses, even if it is not office protocol. It mostly comes down to familiarity and preference for a specific application, and ease of use. This means that data could exist within a number of different programs at the same time, where it is not adequately maintained, updated, or secure. How to Avoid: Firstly, look into preventing data export unless absolutely required. Secondly, if your ERP system isn’t doing everything you need it to, then perhaps it’s time to upgrade to a new system. Automatic Trust Cloud ERP systems are becoming increasingly popular. This means that any data that you choose to enter into the system isn’t stored locally, but is instead stored by a third-party cloud hosting service. There are a number of advantages to cloud ERP; it can mean much less work for your IT department, freeing them up for more profitable tasks, it can save you money, and it places less drain on your internal networks. However, there is a slight downside, and that’s the need to place 100 percent of your ERP system security into someone else’s hands. Businesses need to have peace of mind that their data is safe. How to Avoid: Consider your cloud provider very carefully, paying particular attention to their security processes and their data regulations. Ask around, read reviews, and don’t be afraid to ask questions. Single Authentication As ERP systems have evolved, they’ve become capable of handling not only a much wider range of information but also more sensitive information. Single authentication — passwords, for example — is standard, but we have to ask ourselves whether 1FA (one-factor authentication) is enough for modern ERP systems. Password cracking is one of the simplest and most common forms of hacking, so it really doesn’t make sense to protect our most important, sensitive, and confidential business data through the use of passwords alone, which can be stolen or even guessed relatively easily by experts. How to Avoid: The obvious solution is 2FA. The good news is that the 2FA industry has changed in recent years and there is no longer a need for a physical device. Instead, a code can be sent to an email address. Weighing Up The Benefits Although there are a number of security factors to take into account when implementing a new ERP system, it’s important to remember that the advantages far outweigh the concerns. In fact, by maintaining a safe and secure ERP system, with high levels of data consistency, the system could actually help to make your business even more secure, providing peace of mind for your staff and your clients. ### The Rise of Digitalisation in Business Digitalisation is a hot topic in the business world right now. But what exactly does it mean?
We explain what it is, how it is impacting businesses now, and what it means for the market in the future. Digitalisation is the process of converting material or information into a digital form. Many businesses are now transitioning online in a bid to streamline the management and day-to-day running of operations. This shift is being powered by a new wave of technology that allows companies of all shapes and sizes to be more strategic and efficient. This trend is set to continue as more businesses understand the benefits of digitalisation and move to capitalise on them. From storing documents, online backups, workflow and document management to remote working, the possibilities are endless. The process is faster, more efficient and safer than offline equivalents, and automation saves you time, space and money. Key benefits: Increases efficiency Reduces operational costs Enables data to be analysed Safer data storage in the cloud Reduced human error Here are three popular processes and how they can be improved by automation: The Cloud The role of technology in the workplace has changed dramatically. Cloud computing is technology that can help your company gain a competitive edge in the marketplace. The introduction of cloud computing has revolutionised how businesses operate, providing new levels of flexibility and access via remote working. Cloud computing is the delivery of IT services online, in the ‘cloud’. A growing number of businesses are adopting this approach for their IT infrastructure. It is more secure - hosting your systems on remote servers protects your information, prevents data loss and allows for a more effective and protected storage solution. It is also more reliable, with automated data backups and disaster recovery. Cloud computing comes in three forms: public clouds, private clouds and hybrid clouds. The right solution for you will depend on the scale and scope of your business, as well as your data requirements. Workflow and Document Management Another way to increase productivity is through automated workflow. This is a system that digitally produces, tracks and manages documents associated with your business processes. Many businesses find they have large paper archives and countless boxes of old documents that are taking up valuable office space. Over the years it’s likely that the organisation and storage of these documents will fall into disarray, leaving a large paper trail of archived work. Effective workflows are crucial; the more efficient, cost-effective and sustainable they are, the better prepared you will be for the future workplace. An improvement strategy will help manage your documents whether they are paper, electronic or digital. There are many benefits to this: reduced costs, improved efficiency and better security, the latter being especially important to those who must comply with rules around legal or sensitive data. You can control when, who and to what capacity certain people access documents, backed by a solid, authorised audit trail. Departments that commonly benefit from this process include digital mailrooms, recruitment, HR and finance management divisions. Essentially it allows you to manage your information, analyse current performance and optimise for continued improvement.
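To illustrate the "when, who and to what capacity" control described above, here is a small, hypothetical Python sketch of a role-based document access check that also records an audit trail; the roles, permission sets and the `access_document` function are invented for the example rather than drawn from any specific document management product.

```python
from datetime import datetime

# Hypothetical role-to-permission mapping for a document management workflow
PERMISSIONS = {
    "hr_manager": {"read", "edit"},
    "recruiter": {"read"},
    "finance": set(),  # no access to HR documents in this example
}

audit_log = []

def access_document(user, role, doc_id, action):
    """Allow or deny an action on a document and record the attempt in an audit trail."""
    allowed = action in PERMISSIONS.get(role, set())
    audit_log.append({
        "user": user, "role": role, "doc": doc_id,
        "action": action, "allowed": allowed,
        "at": datetime.utcnow().isoformat(),
    })
    return allowed

print(access_document("asmith", "recruiter", "contract-001", "edit"))   # False
print(access_document("jdoe", "hr_manager", "contract-001", "edit"))    # True
```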
Data Capture Intelligent data capture manages the process of capturing information or data coming into your business, ensuring it is processed and documented efficiently. Information coming into an organisation typically arrives in paper, fax, web or email format. With intelligent data capture solutions, information can be automatically classified, extracted, validated and shared with digital workflows or existing ERP systems. There are many data capture technologies that organisations can incorporate into their processes; the most suitable will depend on the nature of the business. Digital pens, tablets and OCR-based intelligent document and image scanning are the most popular. The information gathered can then integrate with host systems and pre-defined databases – automating the process from start to finish. When this process is digitalised and automated, it eliminates the human error factor and increases data quality and accuracy. It also saves a huge amount of time, increases efficiency and productivity and lowers organisational and storage costs. It improves business processes and increases transparency internally. Popular for mailroom, field operations and accounts payable automation, data capture is a smart digital solution for many businesses. These three core processes, used together or in isolation, can have a dramatic effect on operations. We now live in a digital age, and as the technology advances so will the way in which we all work. Digital transformation and disruption have been making waves lately across all industries. To stay relevant, companies, departments and individuals need to know exactly where business technology is headed and be sure to stay on top of each shifting digital trend. Those who embrace these technical advancements in their business structure will thrive; those who don’t risk being left behind by the digital revolution. ### The Future of Banking: Digital Account Opening Since fintech has come to the forefront of the financial services industry, the entire landscape has evolved. The rise of new challenger banks built on modern technology, and with a customer-first ethos, has resulted in the market share of the big four legacy banks dropping significantly over the past 10 years (from 92% to 70%). Legacy banks must adapt to the shifting landscape, or possibly join the predicted one-tenth of European banks that won’t survive the next 5 years. Agreements are deep-rooted in the world of business, but the importance and sensitivity of agreements in banking, such as when opening a new bank account or taking out a mortgage, carry with them a higher value and risk than your everyday employee agreements. They’re also subject to legal enforceability and regulatory standards. At present, most banks still rely on legacy approaches to agreements that can require lengthy application processes, branch visits and a sub-optimal mobile experience. As banks move to a fully digital process, they need to balance customer experience and risk/security.
This isn’t about a trade-off between security and customer experience, but rather how to deliver both – offering an experience with the necessary security safeguards while ensuring a frictionless digital journey. Modern-day tools such as ID verification and e-signature have made it possible to fully digitise the agreement process, while enhancing the customer experience and removing the risks associated with manual agreements. Banks and financial institutions (FIs) must adopt these new technologies in order to continue competing in the modern digital banking landscape. E-signatures: Enhancing Customer Experience and Compliance Signatures are a traditional form of verifying identity, but manually “wet” signing documents can be a time-consuming process that can involve visiting a branch, or printing, scanning, and posting documents, all of which carry a higher chance of human error. The pain points associated with manual signatures only become greater if an agreement spans geographical regions. Given this, banks are increasingly adopting e-signature solutions to provide a more seamless and secure e-signing experience that allows them to acquire new customers more quickly and offer a higher quality of service, no matter the customer’s location. E-signatures also help banks remain compliant with GDPR and other regulations, by capturing a customer’s digitally signed document supported by a comprehensive visual audit trail detailing what the customer has agreed to, and when and how they signed. While many banks have already adopted basic e-signature capabilities, the technology alone is not enough to completely automate the new account opening process while reducing fraudulent enrolments. For example, manual identity document verification checks, or introducing paper agreements, are ways in which banks end up with a semi-automated or siloed process, increasing application abandonment rates and application fraud while negatively impacting the overall customer experience. Modern Identity Verification Verifying the identity of customers is an essential component of banking, where interactions are high-value and security is of paramount importance. The most common methods of customer identity verification have traditionally involved a customer visiting a branch and presenting their ID documents, or banks using legacy knowledge-based authentication (KBA) methods that cross-reference the fixed identity information the customer provides against a database from a third party or credit agency. However, as the banking landscape has shifted and technology has advanced, both approaches are no longer adequate. The number of physical bank branches is falling, the appetite for digital banking has never been higher, and challenger banks that employ modern identity verification methods are on the rise. Given this, it’s becoming more difficult for legacy banks to justify asking customers to come in-branch to verify their identity during the account opening process. Because KBA relies on static information that’s becoming more widely available thanks to the number of data breaches in recent years, it’s no longer a secure means of verifying identities. If cybercriminals can get their hands on personally identifiable information (PII), then it becomes all too easy to open fraudulent accounts. Banks need to be able to verify identities without compromising on customer experience or security.
One method of doing this is by implementing context-aware identity verification, which combines digital identity verification methods, such as ID document capture and facial comparison, with risk analytics. Digitising the account opening process by automating agreements results in an improved and secure customer experience. For banks and FIs, it increases application conversion rates and helps achieve regulatory compliance while making them more operationally efficient. The banking industry has never been more competitive, and customers are more open to switching bank providers than ever before. Automating the account opening process allows banks and FIs to provide a secure experience across the digital customer journey, ensures customer loyalty and helps to grow top-line revenue. ### AI for small businesses and how to best optimise it Are you intimidated by the words “artificial intelligence”? Maybe you’re thinking of Internet algorithms spying on you — or robot overlords from movies? However, if you’re a smaller business, AI can be good news and make you a master, not a slave. Artificial Intelligence (AI), Machine Learning and Deep Learning have become mainstream buzzwords. They appear in media stories about self-driving cars, advanced medical diagnosis and military targeting. You’ll also come across stories about companies or organisations using Internet algorithms to bombard you with adverts or attempt to influence your politics. If you’re a small business, AI can seem like a world apart. However, this technology can be simple to understand, practical to use and make a massive difference to your company every day. Although it makes the headlines regularly, AI isn’t ‘new’. It’s been around since the 1950s, when a group of researchers started using simple computers to solve problems and learn how to win at draughts (or ‘checkers’ as Americans like to call it). It’s the availability of vast quantities of data, storage and computational power — alongside plenty of smart ideas and investment — that’s propelled AI onto the big stage today. However, AI is not as complicated as it might seem. It’s not just for Google, Amazon and Netflix. Smaller companies of fewer than 30 people and with modest revenues can take advantage — without needing a data scientist on board. So how could AI help? Well, here are three examples: Predictive Analytics: This can be useful in sales and marketing. AI can look at the profiles of your customers and analyse their behaviours over time. It may predict which product an existing customer is likely to want next, so you can come in with an attractive upsell offer. AI might also spot the tell-tale signs that a client is about to leave, so your account team can be more attentive. Robotic Process Automation: Within any business, there are likely to be routine, repeatable and mind-numbing tasks — such as processing orders or re-keying invoices. With AI, many of these operational processes can be automated and streamlined. You just set some rules and integrate your new tools with your existing systems, saving hours for your staff. Cognitive Interaction: Web chatbots have come a long way. The bad ones we’ve all encountered tend to speak in generic terms and think along the lines of a simple decision tree. The moment the caller goes off the script, they get sent around in a frustrating loop.
However, advanced AI can personalise the experience, calling up account details, while also reading the emotion of the customer. For example, if the customer is starting to type ‘angry’ words, the chatbot can prompt a member of your human team to join the conversation and save the day. What it is — and isn’t At this point, it’s important to stress that AI is about augmenting what your staff do — and freeing up their time for better things — not replacing them. Your goal is to solve business issues with the right solutions. That requires business acumen as the starting point. Think about the core business challenge, not technology. For example, in the marketing world, social media is a channel. I always recommend that my clients have a content strategy that engages with the target audience, not a ‘social media strategy’. The tail shouldn’t wag the dog! Similarly, you shouldn’t have a ‘data strategy’ or an ‘AI strategy’. AI is a tool and data is the fuel to make it work — but the starting point should always be business. So how do you want to use AI? Where’s the pain in your business today, and where’s the opportunity? Perhaps you want to reduce your time to serve each client, you’d like customers to have a better experience, or there’s a new product in the pipeline but you’re not sure which features are the ones to push? Identify a business issue. Next, talk to someone who can help you to spot pockets of data across your business and figure out ways to bring this together, so you can use it to deliver real-time insights. Most likely, this will highlight the specific obstacles that need fixing first or identify patterns that you didn’t realise existed. Then, find someone who understands business and AI. You don’t need an army of data scientists or an expensive consultancy. You don’t need to sink £40k into a bells-and-whistles app that takes two years to produce and misses the mark completely. Rather than taking a ‘big bang’ approach, split your challenge into bite-sized chunks. Use AI to solve little bottlenecks first or to glean specific insights that inform your decision-making. You might discover that the software you need exists already. If not, then think creatively. For example, you could team up with a local university or college and work collaboratively, providing some modest financial incentives and prizes. Smart minds like a challenge and may relish the opportunity to solve a real-life business headache. This way, you’ll solve the business issue, break the ice with AI, and have new tools that sharpen your competitive edge. ### Evolving ID delegation for the digital age: building supply chain trust To survive in today’s fast-paced business environment, digital transformation has become the status quo. So much so that 47% of UK organisations fear they will become less relevant if they fail in their efforts to transform digitally. With the necessity to innovate at an all-time high, companies are now relying on suppliers and partners for business-critical components and a competitive advantage, ultimately meaning that more third parties are interacting with an organisation than ever before. This shift has led to a changing cyber-attack surface. As employee access controls have evolved, cybercriminals are looking beyond traditional methods to breach an organisation's defences, increasingly targeting vulnerable third-party access points.
In fact, recent research revealed that there has been a 50% rise in third-party data breaches over the last year. As such, it’s more important than ever that companies establish relationships of trust with their suppliers – but in most cases this is easier said than done. With that in mind, electronic delegation has the potential to dramatically impact how companies build these relationships across both the public and private sector. Digitising delegation to secure and manage the supply chain can not only support complex scenarios, but also build accountability. Here’s how it can be done. Digitising delegation Delegating rights and roles is an essential part of our daily lives and our economy. For example, we delegate our financial matters to accountants, legal matters to lawyers and estate matters within families through power of attorney. In the offline world, this delegation takes place via signed or co-signed documents, which are often notarised. Occasionally, in-person visits to a local bureau are mandated. These centralised methods of physically establishing delegation rights are costly, slow, inconvenient, and do not scale well. Moving delegation to the digital level creates an opportunity for businesses to decentralise and solve these problems. It not only increases security, but also reduces cost, speeds up administrative processing, is much more convenient for both the source and target of the delegation, and scales to internet levels. Electronic delegation can take complicated, admin-heavy workflows and make them simpler and less expensive for all involved. Yet when it comes to digitising delegation, whether to secure the supply chain or reduce admin-heavy workflows, few organisations are using it and few technology providers get it right. Managing user roles and rights Identity & Access Management (IAM) solutions by definition manage user roles and their rights to protected resources – systems, networks, documents and data, among others. All IAM solutions should provide some level of delegated administration – a function that allows users to create and manage other users. However, to meet intricate real-world needs such as managing the supply chain, electronic delegation needs to be able to support complex situations where individuals delegate to individuals, individuals delegate to organisations (and vice versa), and organisations delegate to organisations. Electronic delegation across industries There are a variety of ways electronic delegation is already being used across a number of industries, which can all be applied in a similar way to supply chain security. For example, on the nationwide platform for tax submissions currently in place in Finland, citizens or company officers can delegate the roles needed for the various government tax submissions. Multifaceted hierarchies and auditable power of attorney are supported: for instance, CEOs delegate to CFOs, who in turn delegate to external parties responsible for specific submissions, who in turn delegate role-based access to their team members. Similarly, within the energy industry, where countless third parties are involved, delegation can support hierarchical access to energy contracts and details in ownership or tenancy structures, with tenants in housing associations being a prime example: management rights for energy supply and usage can be forward delegated to landlords, who in turn delegate to tenants, with tenants controlling their energy ‘mix’ and landlords delegating access to the energy company.
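As a rough illustration of the kind of delegation chain described in the tax example above (an officer delegating to a CFO, who delegates to an external firm, whose staff receive role-based access), the hypothetical Python sketch below walks a list of delegation records to check whether a right has reached a given person; the identities, right names and the `has_delegated_right` function are invented for the illustration and do not describe any particular IAM product.

```python
# Hypothetical delegation records: (who delegates, to whom, which right)
DELEGATIONS = [
    ("ceo@acme.example", "cfo@acme.example", "submit_vat_return"),
    ("cfo@acme.example", "partner@taxfirm.example", "submit_vat_return"),
    ("partner@taxfirm.example", "clerk@taxfirm.example", "submit_vat_return"),
]

def has_delegated_right(source, target, right, delegations=DELEGATIONS):
    """Walk the delegation graph to see whether `right` reaches `target` from `source`."""
    frontier = {source}
    seen = set()
    while frontier:
        current = frontier.pop()
        seen.add(current)
        if current == target:
            return True
        frontier |= {
            to for frm, to, r in delegations
            if frm == current and r == right and to not in seen
        }
    return False

print(has_delegated_right("ceo@acme.example", "clerk@taxfirm.example", "submit_vat_return"))  # True
```

Because every hop is an explicit record, the same structure that answers the access question also provides the auditable trail that supply chain accountability depends on.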
The future of electronic delegation Innovation is now driving both the adoption of electronic delegation and the creation of new services that use delegation at their core. A concept like an electronic “Right to Represent” is, at the highest level, a mechanism that allows the assertion of rights and the delegation of some or all of those rights to other identities, such as one organisation checking the NDA signatory rights of the individual signing the NDA as a representative of another organisation. Crucially, electronic delegation and services built around delegation concepts will become increasingly widespread as companies look to secure third-party access points within their supply chains. And when embraced, it can offer cost savings and auditable security, and allow complex processes to scale. ### Two for one: the top security measures for retailers It would be fair to say that technological developments have completely changed the way consumers interact with retail companies in recent years. According to the Office for National Statistics, one in every five pounds spent with UK retailers last year took place as an online transaction. And on the two biggest shopping days of 2018 – Black Friday and Cyber Monday – Brits spent upwards of £7 billion from the comfort of their digital devices. As we once again approach the peak spending period for 2019, retailers need to be more on top of their digital game than ever. And in today’s tech-driven society, this means ensuring that customer data is well protected. While there are many factors involved in maintaining a safe and efficient online sales funnel, there are two crucial areas that perhaps stand out above the others in 2019. Getting to grips with GDPR Introduced in May 2018, the General Data Protection Regulation applies to all 28 European Union member states. The regulation is designed to protect consumer data, improving the rights of EU citizens whose personal information is processed and held by organisations like retailers and service providers. Retailers deal with a huge volume of consumer data when taking online orders – from payment card details and postal addresses to emails and feedback forms. If this data is lost or compromised, the consequences can be catastrophic for both the customer and the business itself. If a business is found to be in breach of GDPR, severe fines can follow. Prior to the new regulation introduced back in May 2018, these fines were capped at £500,000. Now, fines can be as high as €20 million, or 4% of a company’s annual turnover. To further complicate things, there is currently no hard and fast process for businesses to follow. In fact, a common misconception is that enterprises need to become “GDPR compliant”. There is no certification or accreditation that signifies a compliant business; rather, GDPR is about constantly maintaining data protection best practice through the implementation of rigorous processes and procedures. A company’s Data Protection Officer (DPO) or Chief Information Security Officer (CISO) is often tasked as the gatekeeper for all data-related decisions, but additional support can be sought from GDPR consultants. A fresh pair of eyes will help to pick up on things that may have been missed, and GDPR consultants are specialists in this area with expert knowledge and experience.
While GDPR was originally designed for EU member states, every company must adhere to the regulation if it deals with data belonging to EU citizens, no matter where it is based. This is particularly relevant for internet-based retailers, as customers from all over the world can buy products from a website. Blocking EU traffic to a site is one way to ensure that only non-EU customers can enter their details and purchase items. Payment Card Industry Data Security Standard (PCI DSS): bargaining with Brexit With the Brexit deadline looming ever closer, businesses across all industries are struggling to understand how to prepare. While the UK will no longer be an EU member state, businesses in Britain will still need to maintain their data protection standards. The UK introduced its own Data Protection Act (DPA) in 2018, covering other areas not currently included within GDPR. But as part of the EU Withdrawal Act, GDPR will be introduced into UK law and run alongside the DPA. This will ensure that UK companies can still trade with EU member states without facing substantial fines. It is important that retailers are familiar with PCI DSS in 2019, as it will remain relevant in the future. The Payment Card Industry Data Security Standard applies to all companies that handle card transactions, making it relevant to almost all retailers. Companies must provide evidence of their activity over the last 12 months concerning, for example, how the business has assessed and ranked security vulnerabilities as they have been discovered. As with GDPR, there are trained professionals to help businesses through the PCI DSS compliance process. Qualified Security Assessors (QSAs) can provide critical support to organisations, guiding the process and providing advice on any changes that need to be implemented. QSAs also identify areas of vulnerability where a security breach is most likely, encouraging businesses to be proactive rather than reactive in their approach to data protection. Recent incidents like the attack on US website StockX (which saw the release of 6.8 million customers’ personal data) highlight just how damaging a hack can be to retailers. Indeed, data loss can be damaging for both business and customer, but storing personal and payment information is simply part of the retail process. Introducing up-to-date security practices can not only protect against an attack but also provide consumers with peace of mind when shopping online with a particular brand or ecommerce store. ### Top 7 Mobile App Development Trends To Watch In The Current Era We are living in a decade in which the mobile app plays a vital role in changing how people live. Mobile app development is an industry in rapid growth, and there are no signs of a downturn on the horizon. The technology keeps producing new inventions that make people's lives easier and help businesses succeed more efficiently. Continue reading for some of the top trends in the mobile app development market. 1. Artificial intelligence Artificial Intelligence, or machine intelligence, refers to systems capable of performing tasks that would otherwise require a human. It is an evolving technology and a major advancement that is still at a growing stage across industries.
It is expected that society will change drastically once this technology comes into action across all industries. 2. Internet of Things Today, IoT is mature enough to be implemented in the form of wearable technology. You may have heard terms like smart bulb, smart thermostat and several others. Now, many mobile app development companies are working to adopt this type of technology for mobile apps as well. Research also suggests that around 5 billion people will be using IoT technology by the end of the next half-decade. 3. Cloud-based technology Consider your email: where are all those messages stored? Does your mobile device consume space storing them? No, they are held in the cloud. Why shouldn't the same approach be followed for other mobile apps? Research is being carried out and the first steps are being implemented. This will remove the hassle of storage limits on mobile devices and make apps faster, more secure and easier to deploy. 4. M-Commerce You will have come across online shopping and seen how much importance people attach to it. They love to purchase things, make payments and reach the customer centre through video calls, images and other channels when communicating with an online store. All of this is made possible, and easy, by the development of m-commerce sites. 5. Beacons Beacons are no longer strangers to the market; the technology has become popular. Industries such as medical care, hotels and museums are warmly welcoming beacon technology into their applications. Many business people also believe the technology is safer and easier to understand. It is not yet well established in the industry, but in the future it is expected to bring about a revolution in mobile app development. 6. Wearable apps People are interested in moving towards trendy technologies and witnessing innovative ideas. According to one survey, wearable technology is expected to grow by up to 35% in the current year. Wearables can also help enhance your personality, appearance and style, and they are especially useful in applications such as morning exercise, workouts, cycling and walking. 7. Blockchain technology You may be aware of blockchain technology and the drastic change it has brought about in cryptocurrencies. The financial sector is now reaping massive benefits from such innovative decentralised currencies, which aim to make transactions more efficient without the involvement of third parties. The technology has proven to be a worthy asset for the financial and banking sector, which is a good sign for the mobile app development industry. Final thoughts Mobile apps have become an indispensable part of human life, and it is now impossible to imagine the world without mobile phones and mobile apps. They matter in every area, from booking tickets to ordering food, putting the entire world in the palm of your hand. The continued evolution of these technologies will make people's lives ever more comfortable. ### Retaining talent and boosting business with people data As with many elements of today’s society, the workplace is undergoing an evolution.
More and more people are opting to leave their jobs or change career paths, with research finding that only 28 percent of millennials plan to stay in their current role for more than five years. Employees are no longer entering the world of work with a life-long vision, instead switching between jobs and employers at an increasingly high rate – leading to the rise of the so-called ‘quitting economy’. There’s no doubt that this new environment has created retention challenges for businesses. When it comes to replacing an employee, for example, organisations face issues such as recruitment costs and culture changes. Productivity is lost, workloads are thrown off balance and they have to invest in further training schemes. Alongside this, a range of industries are currently facing a shortage of digital skills, meaning the competition to attract and retain talent is fiercer than ever. Job vacancies for tech specialists are now at an ‘all-time high’, as revealed by a recent report from Tech Nation. What’s more, despite the majority of tech vacancies being advertised in London, a new study from Accenture has highlighted that cities across the country are also experiencing extreme demand for people with tech skills. But with these up-and-coming tech hubs at risk of being overlooked, the ‘war for talent’ could potentially create a North-South skills divide. Each of these elements, combined with the growth of the ‘quitting economy’, has caused business leaders to look seriously at their employee retention strategies – which is why they are turning to solutions like people analytics. By applying this emerging technology, companies are able to assess data trends amongst their workforce, with the goal of predicting and pinpointing any issues and then generating the appropriate solutions to overcome or prevent them. Investing in the people The aim of people analytics is to identify the workforce’s behaviour, trends and habits through collecting and analysing employee data. By interpreting this data, companies of all sizes are able to obtain valuable insight, including a more personal understanding of the lives of their staff. People analytics provides businesses with vital knowledge to help them recognise any existing issues amongst their personnel. And it’s this understanding that can help tackle the ever-growing ‘quitting economy’. For example, by looking at their data, businesses are able to evaluate key elements affecting their workplace, such as working hours, employee happiness and commutes. Understanding an individual’s daily travel, for instance, could reveal that a vast majority of the workforce endures long commutes each day. As a result, leaders could make plans to introduce schemes to reflect this, such as flexible working hours and opportunities to work from home. People analytics is also useful when considering key team members, as the data allows companies to understand their motivations and goals, and therefore foresee any issues that could result in a resignation or a change in career. A personalised approach allows businesses to cater for the individual needs of employees, and therefore improve talent retention. Bringing people analytics into the workplace For the modern business, standard HR methods, such as annual employee reviews, are no longer enough to assess workplace culture. To gain a ‘Google Earth’ view of the workforce, employers need to adapt their HR systems and reconsider how they utilise data science.
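As a small, hypothetical illustration of the commute analysis mentioned earlier, the Python sketch below scans a toy HR dataset for employees with long journeys and suggests them as candidates for flexible or remote working; the employee IDs, figures, threshold and `remote_work_candidates` function are invented for the example rather than taken from any real HR system.

```python
# Hypothetical HR dataset: average one-way commute in minutes per employee
commutes = {
    "emp_001": 25,
    "emp_002": 80,
    "emp_003": 95,
    "emp_004": 40,
}

def remote_work_candidates(commute_minutes, threshold=60):
    """Suggest employees whose long commutes make them candidates for flexible schemes,
    longest commute first."""
    return sorted(
        (emp for emp, mins in commute_minutes.items() if mins >= threshold),
        key=lambda emp: -commute_minutes[emp],
    )

print(remote_work_candidates(commutes))  # ['emp_003', 'emp_002']
```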
Previously, only large-scale companies were equipped with comprehensive, data-focused HR approaches. However, nowadays it’s possible to implement people analytics in any business, thanks to the development of new HR tech solutions. HR systems are able to act as a centralised point for information, gleaning data from every department within the business to gain an extensive overview of the employees. This introduces the entire enterprise to a data-driven approach. Provided everybody in the company has had the correct training, the process of inputting all relevant information should become a force of habit and slot easily into the daily work routine. However, the quality of this data can’t be overlooked – companies need to ensure that it is kept to a high standard. The implementation of regulations such as the EU’s General Data Protection Regulation means that it’s more important than ever that all data entries comply with fundamental data privacy guidelines. Boosting business By analysing employee data, employers can align their company with their people’s needs and make intelligent, informed decisions. Meanwhile, through applying the same methods, businesses are taking steps towards becoming more efficient. This, in turn, is allowing them to retain their best talent, promote productivity and proactivity, and optimise procedures. For example, Chevron transitioned from using a decentralised HR system to adopting a centralised, data-focused system. As a result, the company increased productivity levels by 30 percent while simultaneously lowering costs. The ‘quitting economy’ is here to stay – and for this reason it will be difficult to overcome its impact completely. But the implementation of people analytics can arm businesses with the knowledge and comprehension they need to improve employee retention and counter its effect significantly. Importantly, in an increasingly competitive environment, obtaining a holistic picture of the workforce is the key to finding the best and brightest people and keeping them. And having actionable data is a step in the right direction. ### AI in construction: what’s next? McKinsey claims the construction industry is under-digitised, which is alarming given the sector is expected to be worth $8 trillion by 2030. Construction lags behind other industries in the race to adopt technologies, with companies failing to realise the potential of artificial intelligence (AI) as a driver of exponential growth and efficiency. In this article, Actavo Direct’s James Hepton looks at the first steps for AI in construction and how businesses can adopt the technology to cut costs, save time and race ahead of the competition. Limitless learning A key facet of AI is machine learning – the ability of technology to continuously learn from limitless amounts of data. All the data fed into an AI programme is used to create patterns and algorithms, which can be used to suggest improvements for future projects. Companies set to gain the greatest benefit from AI will be those that take the time to feed in as much information as possible. No amount of information is too much, and while some data may not seem relevant now, it may well help shape strategy and drive decisions in future.
In construction, this may include all details on stock levels needed to complete previous projects, the number of employees deployed on each project, calendar and schedule information, and any other information that may impact projects, like weather forecasts. Using this data to inform future projects means businesses save time and money as a result of greater efficiency and less waste. Timings are accurate, projects are appropriately staffed, and material stock is based on previous, similar projects. Off-site, AI already has applications in daily activities like spam email detection, keeping businesses protected online in a growing threat landscape. However, the benefits extend beyond profit margins, with implications for employees day-to-day on-site and in the office. AI has made live video and image detection possible, so AI platforms can be taught to identify potential hazards on site and alert employees to occupational risks in real time. Design innovation Traditionally, construction design meant paper and pencil drafts and painstaking hours covering every angle and detail for each project. Building information modelling (BIM) has since revolutionised the design stage, combining walkable 3D designs with intricate detailing and built-in project management capabilities to provide a single comprehensive database, available for everyone across the whole project to work from. On top of a 3D site and building map, all project details can be woven into the design, including financial information and planning details. BIM even provides scope for mechanical, electrical and plumbing work to be theorised in the design stage, saving time and cutting costs in wasted material and rescheduled work in the building stage. Businesses can also reap the rewards of greener projects thanks to BIM. With projects being meticulously detailed on a digital plan, they now save on paper traditionally used in sketch designs and, more significantly, the materials wasted or overestimated in the planning and building stages. The reality of AI Far from the robots we see in Hollywood movies, the robotic reality of AI is somewhat less dramatic, but the implications are becoming more relevant every day in construction. Self-operating and self-driving construction machinery is already paving the way in terms of efficient industry tech, taking manual and repetitive tasks like excavation away from employees and allowing them to focus their efforts on more complex management and planning tasks. The anticipated realisation of 5G network infrastructure on a mass scale will pave the way for development in construction technology, facilitating advances such as the remote operation of AI-assisted construction machinery. Construction companies will be able to harness the speed and stability of a 5G network to operate excavation tools and other construction machines from thousands of kilometres away. This offers time and cost savings for companies, which can perform tasks on multiple sites from one remote location, plus health and safety benefits for employees, who don’t have to physically enter sites. E-commerce advantages As the construction industry continues to undergo digital transformation, so will its sales and marketing activity, as businesses move to meet their customers at their common touchpoints.
Those able to adopt AI as a tool for improving their online service offering – for example, offering tailored product suggestions to customers based on search history and previous behaviours – will move ahead of the competition to win business. Machine learning allows companies to log customer data, like key purchasing times, products regularly bought together and more, helping businesses shape their sales and marketing activity. It gives businesses a greater understanding of their customers’ needs at each stage of the buying journey, so they can align their promotional activity: for example, creating informative content to meet people in the early stages of researching construction companies, or recommending products or service packages to those ready to convert on a website. ### Why I launched a business to provide virtual migraine treatment My name is Dr. Risa Ravitz, and I am a migraine sufferer, headache and migraine specialist, and health tech business founder -- in no particular order. I have suffered from debilitating migraines my whole life, with the lack of answers I received growing up leading me to follow a path into neurology. I then became a specialist in treating patients who suffer just like I do, both online and in a brick-and-mortar practice. Suffering with migraines gives me a unique approach Suffering with the condition that I treat means I am a lot more empathetic to my patients’ condition. I understand the struggles that often prevent them from religiously following the lifestyle advice of their physicians, such as doing light exercise and maintaining a consistent sleep schedule. It also means that I understand and recognize the symptoms that might not immediately be associated with migraines, such as vomiting, nausea, dizziness, and simply not feeling “100%.” This extra level of empathy that I have with migraine sufferers heavily contributed towards my decision to launch my own telemedicine migraine treatment business. When people hear the term “telemedicine,” they often imagine a medical professional simply on the other end of the phone. In reality, telemedicine for migraine and headache treatment is much more than that. Why telemedicine for migraine? Telemedicine in 2019 is more than a simple phone consultation, especially for sufferers of headaches and migraines. By offering a one-on-one video chat with patients, I am able to engage in thorough and focused conversations during which I understand their medical history, symptoms, and lifestyle habits. This information can be obtained from the patient while they are in the comfort of their own home. Migraine patients are often in so much pain that they travel to be seen at the emergency room, whose germ-filled waiting rooms generally have glaring lights - a headache patient’s worst nightmare, and one I’ve been through myself. And it’s no secret that wait times to get seen by a healthcare professional are far from speedy, with patients often only getting five minutes face-to-face with a doctor after going to the ER, only then to wait weeks to see the specialist they have been referred to. This means a crucial part of care - listening - is neglected, as well as explaining, reassuring, educating, and empathizing. All due to the time constraints of traditional care. I launched my company as a way to fill the holes of today’s healthcare system in the U.S.
With telemedicine, patients can speak to a doctor on their terms, outside of working hours and from the comfort of their homes. And for those who doubt the quality and promise of telemedicine for migraine, a 2017 study by the American Academy of Neurology found no difference in outcomes between in-person and telemedicine care. However, as a woman founder in a space dominated by male leaders - health tech - launching my business was never going to be an easy ride. Taking the female founder path into health tech When I started my own headache and migraine treatment practice, I knew I wanted to go further in helping those suffering with the condition, but I didn’t expect to experience the “glass ceiling” in health tech leadership quite as much as I did. I felt like men were dominating top roles in the field and the status quo was being upheld. So, unsurprisingly, when I look back at my journey to becoming a health tech business founder, I wonder how much faster I would have moved had I not been a woman. I was even denied a small business loan despite having a great credit record -- I can’t help but wonder whether being a woman business founder had something to do with it. And when you look at the stats, it’s easy to believe. Despite making up over 70% of the healthcare workforce globally, women remain significantly under-represented at the leadership level: 30% of C-level positions in healthcare are held by women, just 13% of CEOs are women, and only 9% of health tech businesses are founded by women. The reasons for under-representation in leadership roles in the field bear resemblance to other challenges we see across the board in tech. It’s harder for women to achieve the level of trust that men, who dominate the leadership positions, gain easily: only 9.7% of investor funding goes to health tech startups founded by women. Women are less likely to self-promote and speak up, they tend to have smaller professional networks, and of course, women are often seen as having less potential for productivity as they are deemed more likely than their male counterparts to leave their career to have a family. But by no means does that mean it’s impossible. My business is up and running, and I regularly treat migraine patients virtually, providing care and relief that they had been struggling to access through traditional avenues. If you have a great idea that you believe in, it’s worth fighting for -- no matter how solid the glass ceiling might seem. ### Moving cloud services towards understanding speech Google has recently announced a number of updates to the technologies that underpin its Contact Center AI solutions - in particular, Automatic Speech Recognition (ASR). ASR goes under several names, including computer transcription, digital dictaphone and, quite wrongly, voice recognition. Speech recognition is all about what someone is saying, whereas voice recognition is about who is saying it. This has been further confused by the launch of products such as Apple’s Siri and Amazon’s Alexa, which are described as voice assistants. However, we shall stick to the term ASR. What’s the point of ASR? The applications for ASR are potentially massive - there are so many situations where it can provide either significant financial or ergonomic benefit. With computers costing far less than people, it’s easy to see why ASR might be a cost-effective way to input data.
For a start, most people can speak three times faster than they can type so in any job involving inputting text, the gains are mainly in saving the time of salaried employees. A lawyer might charge his/her time out at hundreds of pounds per hour and a good proportion of billed time will be taken up writing documentation. Lawyers are rarely fast and accurate typists, so dictating to a computer which costs perhaps £2.00 per hour to run provides an obvious benefit. Increasingly, companies of all sizes are finding that ASR helps enormously with productivity. But there are also situations where speech is the only available control media. This can range from systems that require input from the handicapped, to people who are already mentally overloaded such as  fighter pilots. ASR isn’t easy But recognising speech is not easy. Like many other Artificial Intelligence (AI) problems, some researchers have tackled ASR by looking at the way that humans speak and understand speech. Others have tried to solve the problem by feeding the computer with huge amounts of data, and getting the computer to work out what makes one word sound the same as another. Typical of the latter approach are Neural Networks, which may be loosely described as an attempt to emulate the way that the human brain works. It has the advantage that the ASR system designer need know nothing about human speech, linguistics, phonetics, acoustics or all the other myriad of disciplines involved in analysing what we say. However, its main disadvantage is that it requires large amounts of computer power and, when it works, nobody knows exactly how. Transcription of telephone network conversations is generally more challenging than the transcription of speech made direct into a computer or cellphone.  The quality of the acoustic signal is lower on telephone calls and by definition, there will be (at least) two speakers rather than one - potentially confusing the ASR software. It is now possible to not only digitally record calls but also to automatically transcribe them  - and far more accurately than ever before.  By utilising the context of a conversation, it becomes possible to overcome and compensate for some technical challenges of the telephone system such as the quality of the audio signal and poor enunciation that plague human listeners too, All this is important for businesses looking for services to provide ASR as quality and accuracy have long been a concern. For some tasks with a limited number of outcomes, good speech recognition is already perfectly adequate – for example, hands-free control of vehicles. However, where there are an infinite number of outcomes, the risks associated with confusion or rejection may be much greater - from losing customers to causing death, with applications such as on-line banking to medical screening. This is where speech understanding is essential. When AI is not very intelligent When Alan Turing first coined the term Artificial Intelligence, he defined it in terms of what could fool a human, not what algorithms are used. To quote Roger Schank from Northwestern University in Illinois. “When people say AI, they don’t mean AI. What they mean is a lot of brute force computation.” The history of ASR goes back more than 50 years to the early days of general purpose digital computers. In those days, vast amounts of computer power were simply not available. Even the largest computers available in the 70s and 80s would be dwarfed by a mobile phone of today. 
So rather than let the computer infer what the human brain was doing, there was no other option but to find out by experiment what the human was actually doing and then try to emulate it. This did not stop at phonetics - which studies the basic units of human speech (phonemes are the building blocks of syllables, which are the building blocks of words, which are the building blocks of phrases, and so on) - but encompassed many of the things we learn as children to help us recognise speech: grammar, meaning, gestures, history, mood, and so on. Indeed, we often recognise words we don't actually hear. Even though some phrases are phonetically identical, it is only the context in which they are said that elicits their actual meaning. Take, for example, the two phrases "a tax on merchant shipping" and "attacks on merchant shipping". They are phonetically identical, and even a human could not distinguish between their meanings if they were heard in isolation. Yet, for sure, they do mean something different. Knowing that one was uttered by an accountant and one by a naval commander would make the distinction much more obvious. The person who utters the phrase forms part of the context, and there can be literally hundreds of contextual clues that a human uses that are not contained in the raw speech. A computer which produced either of the above sentences could be said to have correctly recognised the speech that made them, yet only one would be a correct understanding of what was said. As computer power has become cheaper, the trend has been to let the computer get on and infer the relationship between acoustic and written speech. It may be that, in doing so, a computer which is fed with many examples of acoustic phrases and a human transcription of each phrase ends up mimicking part of the human brain. But who cares how it does it? If it works, the end justifies the means. But does it work? The answer is no, not always. Take, for example, the word "six". Suppose a thousand people utter it and the recordings are fed into a computer so that it learns what it thinks "six" sounds like. Then person 1,001 comes along and, for whatever reason, the computer gets it wrong, so the person tries to articulate "six" much more clearly (perhaps indicating their annoyance) by saying "sssssss-ic-ssssss", i.e. over-stressing the "ssss" sound. The "ssss" sounds at the beginning and end of "six" will always be recognised by a human as the same sound no matter how long they last (i.e. how much they are stressed). Other sounds, such as the "i" sound, must be kept short, otherwise it turns into an "ee" sound. This is basic phonetics, and the computer would not necessarily be able to work it out from a great many samples. Because humans often articulate things similarly, basic statistics are not going to tell the computer that. A knowledge of phonetics, however, would. The importance of context And so it is with context. Knowing who is speaking, their sex, where they come from, what job they do, what words they use, and who they communicate with all helps us humans understand what they are saying. Cloud computing has increased the temptation for providers of ASR services to use algorithms that learn rather than are taught, simply because computing power is cheap. Why not use a sledgehammer to crack a nut? The temptation arises because, where many users require simultaneous ASR services, it is far more efficient to apply the shared resources of all users in one highly powerful ASR engine than in many small individual ASR engines. 
It is more efficient because most of the time, small dedicated ASR engines will be doing nothing. When they are needed, they may take many times longer to process the speech. With one shared cloud computer, all this latency can be mopped up and used to provide one fast service. This is why products such as Dragon running on a local computer cannot compete with a cloud-based ASR service. With so much applied resource, computer-hungry approaches become viable in a way they would not be for local computers. However, as indicated above, no matter how much computer power is applied, there will be a limit to the ability of the service to understand what is being said. And by understand, I mean establish the meaning of the phrases and sentences, not simply identify the words. In order to apply context, it is no good taking thousands of random utterances and smoothing them over. What is necessary is to look at specific utterances and establish the local context. This is an approach that has been used successfully in programs which search and retrieve voice  and written data.  The best of these can use any cloud-based ASR service, and apply the context found in emails and text messages to pre-condition the ASR service. So for example, if Bob and Alice exchange emails, they will be more likely to use certain words than might be the national average. The ASR can then be conditioned to favour those words. What’s interesting about the Google announcement is that Google seems finally to have accepted the importance of context in their cloud services. However, Google’s focus of attention is contact centres where the need for ASR is primarily in forensics and damage limitation when things go wrong. The really big opportunities are small workgroups - where the need is to improve efficiency and the context is far more valuable. Even large corporations are built from many small workgroups. Computing power is now cheap. But that doesn’t mean it can replicate human intelligence. If you want a piece of software to act like a human, it has to understand like a human. We think it’s good news that Google has finally seen the light, and businesses using this technology should really start to benefit.[/vc_column_text][/vc_column][/vc_row] ### Collaboration among cloud providers the best way forward [vc_row][vc_column][vc_column_text]Taking on the hyperscalers and their comprehensive suite of services is no easy task. A fact borne out by the most recent Gartner figures that show the big five (Alibaba, Amazon Web Services, Google, IBM and Microsoft) now have a combined market share of 77 per cent.  However, smaller cloud providers, though squeezed, have important advantages against their larger competitors, not least a more personalised, localised and tailored service.  Hyperscalers are so vast that it is impossible to provide all but their largest customers a level of expected service and support. But taking on the hyperscalers takes more than a personalised, tailored service. We would like to see smaller cloud providers working together to combine their niche offers in an effort to provide a comprehensive single service to customers, without losing sight of their individual expertise and heritage. These collaboration strategies would open new doors, opportunities and we believe would sustain a valuable place in the market for smaller cloud providers, and the advantages they bring to the customer. 
With these challenging market dynamics in mind, I will explore more of the drivers for smaller cloud providers to become more discerning about collaborating with partners. Evolution of multi-cloud strategies Different cloud platforms provide varying types of service and are often used for different purposes. Many companies now depend on multi-cloud infrastructures for their operations and it is estimated that more than 75% of midsize and large organisations will have adopted a multi-cloud and/or hybrid IT strategy by 2021. The increasing use of a multi-cloud infrastructure is a great market opportunity for small cloud providers to collaborate and build on each other’s niche specialisation. The difficulty for a hyperscaler is that no matter the question a customer asks, the answer will always be their public cloud offering. With a network of smaller cloud providers working together, customers are really listened to and enjoy a genuine tailored solution to their needs. Furthermore, smaller cloud vendors are often more competitively priced as well as being able to offer a more personalised service. A comprehensive approach to data security Network security is another big challenge for today’s C-suite leaders. Recent reports show how some of the biggest tech companies (Facebook, Yahoo, Equifax) have suffered massive security failures, giving us insight into how big the problem of cybersecurity really is. With increasing dependence on cloud systems, businesses are also facing a sudden rise in the scale and sophistication of cyberthreats. So, as the trend of multiple cloud use grows, a similar tailored approach to data security is required. No two data infrastructure set-ups are the same and it is therefore important for cloud providers to be part of an agnostic broad network of cybersecurity experts. The urgency of the issue is heightened because financial penalties from data breaches alone can pose an existential risk to their operations. Many business leaders are unaware where their data is being held or what steps need to be taken in the case of a data breach. This is especially dangerous after the implementation of GDPR. While no business can ever outsource ultimate responsibility for their security to an external provider, there are a large number of experts who can help. Having a tightknit network of cloud and security companies collaborating together can make it easier to manage the overall security architecture. Know your data location With so much data at the heart of digital transformation, it is important that businesses store this data on a reliable platform. We all know that the loss or misuse of data, accidental or otherwise, could prove to be an existential risk for businesses. It can destroy reputations, result in significant regulatory fines, and cause serious organisational disruption. The threat is even more for SMEs with limited resources but greater interests at stake. It is therefore important that businesses choose their cloud provider with the utmost care and intelligence. Management must be careful and develop a good understanding of the data security processes and policies of the vendor before signing on the dotted lines. However, with the multi-cloud approach, businesses can be assured of personalised, cost-efficient and reliable cloud services, which is definitely not expected from the big five. 
The way forward In such challenging times, I believe the best way forward for small cloud providers is through collaboration with other niche players and by playing to their strengths. Also, as someone who has worked with the cloud services industry for many years now, I feel it is my professional responsibility to highlight the importance of knowing your data location and availing the advantage of multiple cloud partners than just one hyperscaler. Our company Memset is a brilliant example of how a smaller cloud provider has recognised the changing market dynamics and grabbed the opportunity with both hands to build a portfolio of partners to deliver secure reliable services to our customers.[/vc_column_text][/vc_column][/vc_row] ### Be optimistic about the future [vc_row][vc_column][vc_column_text]Technology is constantly changing and advancing. Along with an organisation's workplace and processes, it plays a key role in unlocking business potential. If implemented correctly, digital transformation can revolutionise the workplace, saving employees time and helping to streamline everyday processes. And here’s how... The first step on the journey is overcoming any fears or trepidation and taking the plunge to adopt new technology, whether it be hardware or software. Listening to employees will help them get the most satisfaction out of their working day, and digital innovation is part of this recipe for success. Putting your people first can give that extra impetus to be brave and take the first step towards change. Once you’ve embarked on your road to change, there is, unfortunately, no guarantee of plain sailing. A few hiccups are to be expected within any process of change. These bumps in the road can be off-putting and make giving up during a challenging process look like a tempting option. Maintaining a techno-optimistic outlook will help you to stay focused on the end goal and persevere to achieve success. Think of Monzo. We regularly see the unmistakable, bright coral cards, whether it’s on the underground station or in a coffee shop. Maybe you even have one yourself. Monzo has closed a fresh round of investor funding that now values the company at £2 billion. Although Monzo is now a part of everyday life and may appear to have found success overnight, the reality is that Monzo was initially met with resistance as a disruptor in a cynical industry. Monzo identified that the banking industry was in dire need of disruption and maintained their optimistic outlook and faith in the process, despite initial negative responses from both the industry and media back in 2015 when it launched as a pre-paid card seller. Monzo did not give up and remained focused on solving the problems they had identified. Embracing technological advances and the rise of the smartphone, the Monzo app offers customers a range of features to give them greater control over spending and savings, or even to analyse their habits. Fast forward to today and 55,000 people open a Monzo account every week and it is clear to see that their perseverance has paid off. The moral of the story – change always takes time to prove successful. Having trust in the process is crucial to avoid the pitfall of a pre-emptive retreat, which can lead to giving up on the strategy if results are not seen immediately. Having an optimistic perspective allows leaders to see possibilities where others might simply see obstacles, adapting the strategy but not abandoning the goal. 
Companies of any size and any industry can harness a positive outlook to avoid this pitfall, giving their people time to adapt to new systems and supporting them with training. Advances in technology can make our lives easier, and 57% of European workers believe that technology will make a four-day working week viable soon. The people behind the machines need to be on board for any change to be a success, or new technology can become more of a hindrance than a help. When it comes to introducing new technologies, leaders should encourage a culture of optimism, where employees feel positive about how these changes will impact their day-to-day. This should be backed up by making sure employees are always at the front of mind when making decisions about implementing new systems. Providing the appropriate level of training will make sure employees can get the most out of their new tools and understand how to best use them to suit their individual needs. It is therefore key for leaders to employ a people-first approach, listening to their employees and supporting their employees throughout the process of transformation. 72% of employees across Europe say they want to contribute more to their work and implementing digital transformation can help them to do this. More specifically, 49% of employees believe that technology helps generate creative ideas, compared to only 21% of employers, which shows a significant gap between the views of leadership and executives. If employees feel that technology will help them in their work, leadership needs to take this as fuel for their optimism when implementing strategy across the company. In the long run, exponential technological growth means that businesses will need to constantly review and update systems as the latest advancements become available. Businesses must be optimistic about the continuous stream of new opportunities this will offer. At Ricoh, we understand that to unlock their full potential, businesses need to evaluate how workplace, processes and technology are working for the business, always keeping people at the forefront of any decision. We call this ‘workstyle innovation’. We have developed the Pathway to Productive People, three papers to help business leaders on their journey to bringing lasting, positive change to their companies. The first paper, Bravery, is all about taking that first step to making a change. The second, Optimism, offers advice on how to maintain an optimistic mindset to see change through. Finally, Impact looks at the importance for businesses in recognising the impact workstyle innovation can have, from optimal productivity to wider profitability.[/vc_column_text][/vc_column][/vc_row] ### Reducing costs and increasing flexibility with open APIs [vc_row][vc_column][vc_column_text] The rise of Open Banking has changed the dynamics of the financial sector, making it more end-user friendly with better financial services. With it, we have seen a rapid increase in the adoption of application programming interfaces (APIs). This allows businesses to collaborate and ensure the most innovative solutions are available to the market. Open APIs have given financial institutions and businesses alike the ability to easily offer innovative banking and payment features to their end-users. This is because they have enabled businesses to integrate new solutions into their existing infrastructure, with minimal impact on day-to-day functioning and short ready-to-market periods. 
This new approach is cost-efficient and time saving, allowing businesses to focus on providing excellent core services for customers, while reducing the need to invest in additional resources and staff. But, to truly understand the advantages of this financial evolution, we must look at how open APIs work and the benefits they offer for both businesses and customers. What are Open APIs? Open APIs are a fast and simple way for businesses to integrate their services with one another. They work by giving companies online access to new solutions that can be simply attached to existing frameworks. When developers create APIs, they share a set of rules and protocols describing how the code of a feature must be used and what it is capable of. What's more, the API environment from a supplier is typically written in a universal language, making it easier for other developers to understand and to integrate into other applications. This allows businesses to introduce frameworks, services, and software tools into their existing structure, without having to change their entire framework. Instead, they can adopt a 'plug and play' approach, where banking and payment features from specialised third party suppliers can be attached to the business structure that already exists. As a result, this universal language and clear set of protocols makes the integration process efficient and less prone to programming errors. At the same time, developers and collaborating partners can easily and quickly reach an understanding about how the technology works and the benefits the partnership will bring for both businesses. APIs: more revenue, fewer problems Creating financial services through the integration of open APIs can significantly reduce costs and increase efficiency. By simply connecting API protocols into their existing framework, financial institutions can offer tailored and innovative solutions to their customers via highly specialised third party suppliers. This removes the need to buy a one-size-fits-all package from a bank or other provider where 90 per cent of the services included aren't necessary. Instead, businesses choose what they need and add it to their product offering. What's more, this approach reduces the need to invest in resources and staff to build new features from scratch. Instead, already existing API-based solutions from other parties can be managed in the back-end. The result: financial institutions simply need to plug the desired financial feature into their own systems and offer the new services to their customer base. This is a win-win for all involved; the business will generate revenue from its new product offering and customers will reap the benefits of the new solution. Becoming flexible with APIs As a result of easy API integrations, improved efficiencies and reduced costs, financial institutions can become more flexible with what they offer to customers. This is because they build services catered to customers' exact needs, without making large and unnecessary investments. As a result, financial companies can offer modular financial services with solutions that make customers' lives easier, such as multi-currency IBANs, in-account and in-transfer foreign exchange (FX), and fast cross-border payments. This stems from the simplicity that APIs create. Once a financial institution understands what its customers need, it can swiftly integrate this service using a back-end API. This ensures the financial services offering can be changed to meet any future requirements. 
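To make the 'plug and play' idea more concrete, here is a minimal sketch, in Python, of what calling such a back-end API from an existing system might look like. The base URL, endpoint paths, field names and credentials are hypothetical placeholders invented for illustration - they are not the API of any particular supplier - but the pattern of authenticating once and posting small JSON requests is typical of how these integrations tend to work.

```python
# Minimal sketch of a "plug and play" back-end API integration.
# The base URL, endpoints and fields below are hypothetical placeholders,
# not the API of any particular provider.
import requests

BASE_URL = "https://api.example-payments.com/v1"  # hypothetical supplier
API_KEY = "sk_test_replace_me"                    # credential issued by the supplier

session = requests.Session()
session.headers.update({
    "Authorization": f"Bearer {API_KEY}",
    "Content-Type": "application/json",
})

def create_multicurrency_account(customer_id: str, currencies: list[str]) -> dict:
    """Ask the supplier to open a multi-currency account for an existing customer."""
    resp = session.post(
        f"{BASE_URL}/accounts",
        json={"customer_id": customer_id, "currencies": currencies},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

def convert_fx(account_id: str, sell: str, buy: str, amount: str) -> dict:
    """Request an in-account FX conversion, e.g. selling EUR to buy USD."""
    resp = session.post(
        f"{BASE_URL}/accounts/{account_id}/fx-conversions",
        json={"sell_currency": sell, "buy_currency": buy, "amount": amount},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

if __name__ == "__main__":
    account = create_multicurrency_account("cust_42", ["EUR", "USD", "GBP"])
    conversion = convert_fx(account["id"], "EUR", "USD", "250.00")
    print(conversion.get("status"))
```

The point of the sketch is that the institution's core system only ever calls these two small functions; swapping suppliers, or adding another feature, means changing an endpoint and a payload mapping rather than re-architecting anything, which is what keeps time-to-market short.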
Furthermore, they can continuously update services so they include the latest innovative solutions on the market. This way, businesses avoid being stuck with outdated legacy systems and prevent customers looking elsewhere for the latest technology. What's more, it reduces the time-to-market for these new products, as integration is quick and simple. Making the most of APIs Financial institutions need open APIs to ensure they stay ahead of the competition, while providing the services their customers crave. The technology reduces costs and adds flexibility to what can be offered, all with minimal impact on a business's existing structure. To successfully compete in the industry, open APIs should be an integral part of any financial institution's strategy. At ONPEX, the financial services we provide to our clients are fully API-based. With our financial solutions, such as multi-currency IBANs, seamless FX, European SEPA and cross-border SWIFT payments, businesses are able to improve and create their own financial services. This helps our clients to no longer be restricted by outdated legacy systems. Our clients simply pick and choose the services they need and only pay for what they use. To learn more about how APIs can benefit your business, visit: https://onpex.com/

### How Instagram Uses AI and Big Data in 2019

Social media marketing is about dealing with information. You evaluate consumer data, assess conversion rates, and plan your most effective social content based on your findings. The same rule holds for Instagram, the photo-sharing platform. When it comes to social networking platforms, there is a lot of data you need to process and interpret in order to design your marketing strategies accordingly. That is why most brands today consider customer information when implementing their Instagram campaigns. Studies show that users make almost 95 million new and unique posts on the photo-sharing site daily, so it is impossible to track that volume of data manually. Thankfully, there are technologies such as big data and artificial intelligence (AI) to help. Today, businesses use AI and big data to derive numerous actionable insights into consumer behavior and buying habits. Instagram also has a couple of benefits compared to other social platforms. There are users on Instagram who do not use Facebook or Twitter much. Instagram therefore lets you study unique aspects like demographics and optimize your marketing efforts for specific audiences. Did you know that approximately 200 million Instagrammers visit business pages and almost 80 percent of users follow at least one business on Instagram? There are tools like Social Insight that give you access to detailed analysis of your Instagram marketing strategies. Read on to learn how Instagram uses big data and AI in 2019. Targeted advertising The information collated from Instagram is precious as it offers many useful insights for businesses. The photo-sharing site keeps track of search preferences and user engagement. Accordingly, it sells advertising to brands that show interest in a specific audience type. That is how companies reach out to prospects that show interest in their marketing messages. Since Instagram belongs to Facebook, it already has an audience base of approximately 1.8 billion users. It means there is much scope for analysis. 
Your brand can, therefore, target advertising to people depending on which accounts they follow, what content they like, what posts they share, and what they do not like. Targeted advertising depends on these factors, and targeting is one of the key benefits of AI in social media marketing and email marketing. Take user experience to another level The success of social media marketing depends on user experience; there is no denying the fact. It all depends on how your Instagram followers discover the content that piques their interest. Your posts need to generate the same kind of interest to become useful to your prospects. The more content users upload, the broader the choice becomes, and the more challenging it is to find content that is unique and interesting. Yes, information overload is a problem for social media marketers. Even a couple of years back, Instagram's algorithms showed posts in your feed in reverse chronological order. It was a simple process, but not a practical or effective one. Today, Instagram's algorithms are AI-focused and choose the posts or content people are most likely to engage with. This has made marketers happy, and they can strategize their social media campaigns around Instagram's new AI-based algorithms to increase likes or comments. Besides, if you want to get automatic Instagram likes, there are numerous ways to achieve your goals. Machine learning has also made it easy to derive actionable insights from social media data. The algorithms learn and keep improving over time, displaying the most useful and relevant posts for everyone. The old Instagram algorithms meant people missed half of the content from the accounts they followed; the new algorithm is an improved version that delivers 90 percent of the unique and valuable content to users. Eliminating spam Did you know that the photo-sharing platform also leverages AI to eliminate spam? Instagram has smart technology to detect fake messages and remove them from accounts. As of now, this technology works in nine languages, including English, Arabic, Chinese, and Russian. Spam detection has become possible thanks to AI and DeepText, which reads text and determines its context and relevance. The same technology is used by Facebook too. These little things matter a lot in social media today, be it Instagram or Facebook. Battling offensive and inappropriate content Today, bullying is a problem on social media, be it on a personal account or a business one. Sometimes, posts are created to demean a brand or an individual entrepreneur on Instagram or any other social site for that matter. Studies show that 42 percent of bullying instances happen on Instagram, and the platform has therefore decided to fight offensive and inappropriate content. Instagram now has an offensive comment filter that figures out which specific groups of people are soft targets and run the risk of discrimination. If the AI algorithms detect a definite pattern in offensive or bullying comments or repeated responses, Instagram is notified and takes appropriate steps to fight the menace. AI has taken Instagram to another level and can do much more than it has been doing so far; there is hardly any doubt about it. The latest developments in machine learning have helped Instagram's filters identify bullying in photos - a very complex capability that is still being improved. Big data and AI have taken account protection to another level. 
It has not only improved user experience but also changed users’ lives in a better way. Understanding human behavior and culture When it comes to Instagram, it makes your audience feel like a part of a global community. Therefore, your business needs an understanding of human behavior and culture in different parts of the globe. It means that you need to work with loads of data and interpret them correctly to take the right actionable insights to improve your content and posts targeted to your audience. AI helps in this respect to understand consumer behavior so that you can tailor product-related content that piques audience interest. Conclusion Big data, AI, and machine learning play a crucial role in social media marketing, including Instagram. They introduced new standards of user experience so that brands can create content that grabs consumer attention.[/vc_column_text][/vc_column][/vc_row] ### Don't demonise data: central government, citizens, and digital safety [vc_row][vc_column][vc_column_text]Data has long been an invaluable tool for the development of businesses everywhere. By highlighting continuously changing user and customer behaviour and sentiment, the insight and intelligence it provides allows organisations to identify opportunities to streamline processes and address those areas where improvements can be made for greater all-round efficiency. But the value of data isn’t exclusive to private companies. Possibilities exist for using data for the social good in British communities. Connecting the various capabilities across the country’s public sector can only help deliver better services to citizens, businesses, patients and students. Indeed, the transformation is already underway to some extent. Social care workers are already working more closely with NHS staff, and the increase in effective knowledge sharing between police and security services will better enhance nationwide safety. But the needs of many individuals are currently being overlooked. Social behaviours – like those of a business’s customers – aren’t static. They are continually evolving, and without monitoring the right data it can be difficult to monitor social progress and the overall wellbeing of the country’s citizens. Such is the pace of change, that almost a decade on, the findings of the 2011 census will now be largely meaningless. Government leaders therefore need to be able to compare and connect up-to-date data from across the country if they are to have a better understanding of the social landscape. And as central government typically supervises local authorities to some degree, data such as that relating to council tax benefits, housing benefits, Universal Credit and emergency services, could be used to create a more complete, accurate and current picture of the UK demographic. It’s essential, though, that in enabling this data to be shared and used, the government ensures it will not fall into the wrong hands. The responsibility lies specifically with central government to focus efforts on passing legislation that not only protects citizens from harmful data, but also makes good use of it. And citizens must have complete confidence that any data the public services hold on them will only be used for good. Central cloud-based repository In common with many digital transformations – both in the public and private sectors – cloud will play a key role in this data-driven approach to social betterment. 
Cloud-enabled technologies such as AI and machine learning, for example, are essential for optimising the insights gleaned from the data collected. Most government organisations are still tied to legacy infrastructure, however. So, while migrating to the cloud is necessary, it’s not necessarily straightforward, and existing anxieties around security and data leaks have significantly held back progress for public sector cloud procurement. Fortunately, the Government’s G-Cloud procurement framework offers access to a wide range of cloud providers and support services, collaboration with whom will make the transition easier and more manageable. Rather than dealing with one tech giant, public sector organisations can work with more innovative and agile startups and SMEs, increasing the efficacy of their cloud deployment – usually at a lower cost. What’s more, a multi-cloud environment is vital in these situations, allowing the public sector to modernise existing systems where necessary, while transforming new ones, and avoiding vendor lock-in. Of course, the privacy of personal data should be prioritised throughout. Cloud providers will be obligated to ensure it is robustly protected both at rest and in transit, with strict permission controls allowing it to be accessed only by the right people, for the right reasons. More than anything, the greatest benefits of this cloud-based approach to data sharing come from the fact that the data is held in a central repository, whether it’s consolidated on a local or a national government level. After all, it stands to reason that a centralised database provides quick and comprehensive access to the data, ideal for the purposes of gaining a fuller picture of Britain’s demographics, and achieving joined-up services. Holding data in one location, as opposed to many, is inherently more secure too. What’s more, with less need for maintenance and management, it’s also less costly. For the benefit of the nation A greater understanding of the country’s citizens and their needs will require greater collaboration between public services and organisations and, equally, from citizens themselves. By sharing current data on everything from health and social care, to education and housing, government at all levels can garner a snapshot of the state of society and identify those areas that need addressing. There really is a chance to use data for social good. To do so, the public sector must break away from inefficient legacy systems and invest in secure local cloud services that will enable the sharing, analysis and optimisation of this data.[/vc_column_text][/vc_column][/vc_row] ### Top five considerations to assess when migrating security to the cloud [vc_row][vc_column][vc_column_text]Organisations continue to move more and more of their services and infrastructure to the cloud with Gartner predicting worldwide public cloud service revenue to top $200 billion this year, an increase of 17%. But this continued movement to the cloud has also led to a shift in the security posture of companies since it typically means that most of their data is living in multiple different clouds, versus residing on premise. Because this data is effectively beyond the ‘castle walls’ of their network perimeter, organisations are in something of a state of transition when it comes to the cloud and their approach to security. Security considerations are arguably acting as a brake on overall cloud growth. 
A recent study of 300 large enterprise IT and security professionals found that 43% of respondents said security concerns are the biggest obstacle to cloud adoption, and 37% claimed they are the biggest barrier to SaaS adoption. Yet, if implemented correctly, moving infrastructure and services to the cloud boosts security and can give companies more control over their infrastructure and users, provided some of the following are considered carefully: How 'in control' is my organisation? Moving infrastructure and services to a third party can mean a loss of control if the right questions are not asked. For instance, systems might be upgraded without notice or at times which don't work for your organisation or customers. As a provider of cloud-based security gateways, we often hear that some firms are placed on shared infrastructure, and a large upgrade for one organisation can have a detrimental effect on another, even though they are separate companies. Similarly, organisations can be forced into sharing IP addresses and sharing SSL decryption keys – these have a big impact on being able to use the cloud securely and seamlessly. It's very important therefore that firms ask searching questions of their cloud provider to ensure this doesn't happen. Are security policies consistent regardless of where the user is based? It used to be the case that most workers would be office based for most of the time and security was easier to control. Now that this perimeter has disappeared, it is vital that office and remote workers are subject to the same security policies regardless of where they are based. 'Cloud connectors' provide the ability to apply user-based policies and generate user-based reports regardless of whether the user is in or out of the organisation's network perimeter, wherever they may be geographically and regardless of what device they are using. Will applications perform in the same way when hosted in the cloud versus locally? When traffic is headed toward cloud applications such as Office 365, unnecessarily sending that traffic through private connections to centrally hosted security appliances is not only costly but can reduce user productivity substantially as Internet connections from branch offices are choked. Therefore, traffic must be able to flow through the most optimized path directly to the Internet in order to reduce the load on valuable network resources, including firewalls and routers. Is bandwidth going to get out of control? As mentioned, unnecessarily sending traffic through private connections to centrally hosted security appliances is bandwidth intensive and can introduce latency as well as costs. If security appliances are on premises, there is also an ongoing infrastructure cost, since they need to be continuously upgraded just to keep up. Delivering Internet security in the cloud so that it routes traffic directly to the Internet from branch locations is the only way to keep on top of network demands and provide the best user experience possible. How do I prevent unapproved apps being used in my organisation? Shadow IT is a continuous issue for IT departments as the lines between work and private activity continue to blur and cloud application use becomes unaudited and uncontrolled due to a lack of visibility and controls. Shadow IT usage can present multiple risk vectors for the organisation, including data loss, productivity loss, bandwidth utilization issues and a higher risk of compromise from malware infections and exploits. 
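As a rough illustration of how that visibility gap can start to be closed, the sketch below (in Python) summarises an exported web-proxy or secure web gateway log and surfaces traffic to services that are not on a sanctioned list. The column names, log format and sanctioned-app list are assumptions made purely for the example - a real deployment would use whatever its gateway, firewall or CASB actually exports.

```python
# First-pass shadow IT discovery from a web proxy export (illustrative only).
# Assumes a CSV with columns: user, destination_host, bytes_out - adjust these
# to whatever your secure web gateway or firewall actually produces.
import csv
from collections import defaultdict

SANCTIONED = {
    "office365.com",
    "sharepoint.com",
    "salesforce.com",
}

def unsanctioned_usage(log_path: str) -> dict:
    """Aggregate outbound traffic to destinations that are not on the sanctioned list."""
    usage = defaultdict(lambda: {"users": set(), "bytes_out": 0})
    with open(log_path, newline="") as f:
        for row in csv.DictReader(f):
            host = row["destination_host"].strip().lower()
            if any(host == s or host.endswith("." + s) for s in SANCTIONED):
                continue  # traffic to a sanctioned service - ignore it
            usage[host]["users"].add(row["user"])
            usage[host]["bytes_out"] += int(row["bytes_out"])
    return usage

if __name__ == "__main__":
    report = unsanctioned_usage("proxy_export.csv")
    # Rank by outbound volume so the riskiest-looking services surface first.
    ranked = sorted(report.items(), key=lambda kv: kv[1]["bytes_out"], reverse=True)
    for host, stats in ranked[:20]:
        print(f"{host:40} {len(stats['users']):>4} users {stats['bytes_out']:>14} bytes out")
```

Even a crude report like this - which services are in use, by how many people, and how much data is leaving - is often enough to start the conversation about sanctioning, replacing or blocking an application.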
Gaining visibility into cloud application usage, understanding where data is being stored in the cloud and having the ability to ‘unsanctioned’ shadow IT applications is essential for any organisation with a cloud first strategy. Addressing the above means that organisations can implement their cloud security move in a staged and controlled manner and ensures they can move to the cloud without sacrificing the benefits derived when deploying on premise. Just as importantly it brings security to wherever their users are. Being geographically closer means faster connections and eliminates latency issues. It has the effect of making employees appreciate the experience their corporate IT gives them rather than resenting it. Security effectively becomes an enabler to their work, not a barrier.[/vc_column_text][/vc_column][/vc_row] ### The evolution of the bot: How they beat me to AC/DC tickets [vc_row][vc_column][vc_column_text]Imagine this, it’s 9am and you’re the first in line to buy online tickets for an AC/DC concert. You’re going big this time, getting the best tickets money can buy. Why not? You don’t do this every day. 9am rolls around and up comes the payment form, card details go in and…'sold out'. How could they have gone so soon? Surely the whole world wasn't quicker than you? This is the second time in the last week that this has happened to me. First the AC/DC tickets (I’ve always been a fan of AC/DC: it’s not very IT but everyone has their guilty pleasures) and then I attempted to get the latest trainers for my daughter. I went through a similar 'primed and ready' routine but alas, the same result. They were sold out before I could even imagine someone had the time to put in their credit card details. Sounds familiar? Well, if this has happened to you, it's more than likely you lost to a bot - not a person. Bots are everywhere, and the bad ones can inflict a lot more damage than a disappointed daughter. Bots have been around for quite some time now.  Around the late nineties the first generation of bots showed up in scripts, with names that - to be fair - weren't exactly that scary: GDbot or Sdbot for example. These bots targeted IP addresses with simple requests, which are all very easy to defend against in this day and age. As time has moved on, we've seen a second, third and now fourth generation of bots entering our lives. As the bots have become more serious, the names have become more sinister too: Methbot in one case! Nowadays 'Man-In-The-Browser Bots' (MITB) are typically used within financial fraud due to the high amount of resources and cost involved. MITB attacks are much harder to mitigate due to its ever-changing nature and ability to get in between a user and their security mechanisms in a ‘public space’, such as the internet. These new types of fourth generation bots use more advanced, human-like interaction which allows the bots to be distributed across tens of thousands of IP addresses; moving from storing cookies, to mimicking mouse movements, to triggering scripts and distributed sources. These bots are able to use machine learning to facilitate correlation and contextual analysis, even creating 'bot-farms' or ‘botnets’ that are able to generate millions of dollars in areas such as counterfeit inventory, by targeting the premium video advertising ecosystem. So, what is a botnet? When bots come together, a botnet is created. 
A botnet is a logical collection of internet-connected devices such as computers, smartphones or IoT devices whose security has been breached and control ceded to a third party. Each compromised device is created when a device is penetrated by software, usually from malware distribution channels such as email or poor access controls. The botmaster uses command & control (C&C) servers to direct the activities of these compromised computers through communication channels, using protocols such as Internet Relay Chat (IRC) and Hypertext Transfer Protocol (HTTP). Well documented cases of botnets are showing phenomenally high numbers of accounts targeted, from Linkedin to eBay to Uber to Marriott (who had 500m accounts affected), the attacks are incredibly serious and have gigantic implications for organisations. Bots are big business It is now possible to buy account takeover functionality, carding, captcha beating software, web scraping, scalping and cart abandonment ‘as a service’ - as mentioned, web scraping and scalping are two of the more notorious functions that bots utilise: Web scraping is an important function of the internet. Price comparison and search engines rely on this form of communication to do business, however if compromised, a web scraper or crawler can suck up significant amounts of bandwidth resulting in unusually high charges for businesses, simply because someone just wants to copy their data and repurpose it for their own benefit. Scalping comes in several forms. First is using a method of arbitrage of small price gaps created by the bid-ask prices. The second is fraudulent forms of market manipulation. The third is using the speed of bots vs humans to buy scarce resources – these things are fast! Businesses need protection against this new, fourth generation of advanced bots, which is where WAF (Web Application Firewall) comes in. WAFs are paramount for detecting bots, as they intercept and analyse each HTTP request before they reach the web application. Conclusion  Bots can be used for a multitude of things, good and bad. With research showing that only 12% of web applications (including websites) are protected against bots and botnets, the web is a feeding ground for threat actors using this method. There is no question that protection is required against bots and up to the minute protection is essential for organisations. A Web Application Firewall (WAF) is the most sensible form of defence, whether in hardware, software or ‘as a service’ formats but make sure that your WAF provider is staying up to date with the current threats to offer you the most advanced bot protection available.[/vc_column_text][/vc_column][/vc_row] ### Adding Context Will Take AI to the Next Level [vc_row][vc_column][vc_column_text]If we’re serious about building Artificial Intelligence (AI) that makes choices which are as smart and informed as the ones we humans make, it makes sense to look into an important environmental factor we use to inform our decisions: context. Essentially AI needs context in order to mimic human levels of intelligence. Context is the information that frames something to give it meaning, after all. For example, a person who says “Get out!” may either be expressing a friendly note of surprise, or angrily demanding that someone leave the room. However, you can’t decide that simply by reading the text alone. We solve this by using context to figure out what’s important in a situation, and in turn, how to apply those learnings to new situations. 
For robots to make decisions closer to the way humans do, they will need to rely on something like context as well. After all, without peripheral and related information, AI will require more exhaustive training, more prescriptive rules, and be permanently limited to more specific applications. The problem is that context has to be discovered. AI scientists try to avoid the problem by building narrow, but powerful, systems that do one thing extremely well. Narrow AI is focused on performing one task very well, such as image recognition, but it can’t scale horizontally and doesn’t offer anything like human-level understanding of complexity – around planning, language comprehension, object recognition, learning, or problem solving. Connecting Data and Defining Relationships One way to provide AI applications with context is by extending AI’s power with a graph approach to working with complexity. If you’re not familiar, a graph database is a way of managing data that is quite unlike the traditional relational store approach of Oracle or Microsoft SQL Server. It also differs from NoSQL approaches such as MongoDB. Gartner has identified enterprise interest in the graph database as one of its current top trends and we are also experiencing the ’Year of Graph’. Graph is applicable to a wide variety of use cases, from Amazon-style shopping recommendations to fraud and money laundering detection. Increasingly, graph technology is being used to power AI and ML (machine learning) initiatives. That’s because its native architecture provides the missing context for AI applications, with early results suggesting outcomes far superior to results from AI systems that don’t incorporate this background. Graph technology connects data and defines relationships, and by enhancing AI with related context graph technology, it offers an effective means to empower the development of sophisticated AI applications. Consider the case of self-driving cars. Teaching autonomous vehicles to drive in rainy conditions is difficult, because there is so much variability in wet conditions. It would be impossible to train them for all possible situations, but if the AI is supplied with connected contextual information (rain, light, traffic congestion and temperature), it is possible to combine information from multiple contexts, allowing the vehicle to infer the next action to take. Graphs provide context for AI in at least four areas. The first is knowledge graphs, which are used to improve decision support and ensure the most appropriate answers for a situation are given. The most familiar use case for a context-rich knowledge graph is Google search, but documentation classification and customer support are also common applications. A context-rich knowledge graph works well for organisations that capture  a great deal of knowledge in the form of documents. One example is NASA’s Lessons Learned database, which captures 50 years of knowledge about past space projects. Second, graph accelerated machine learning uses graphs to optimise models and speed up processes. Current machine learning methods often rely on data stored in tables, but teaching a network using such data is resource-intensive. Graphs provide context for improved efficiency because data in this representation is connected. This enables relationships of numerous degrees of separation to be traversed and analysed quickly and at scale. Third, connected feature extraction analyses data to identify the most predictive elements contained within it. 
For example, studies show that your larger friend network is a better indicator of how you will vote than even your immediate friendships. A great use case here is how graph algorithms simplify finding anomalies of hidden communities that might be fraud rings or money laundering networks. Fourth, graphs offer a way to provide transparency about how an AI makes decisions. This ability is crucial for long-term AI adoption, because in many industries, such as healthcare, credit risk scoring and criminal justice, we must be able to explain how and why decisions are made. Context-supported AI helps an AI’s human overseers map and visualise the decision path within the contextual dataset, removing the ‘black box’ aspect of decision-making that can reduce confidence in the conclusions/recommendations offered.[/vc_column_text][vc_single_image image="52881" img_size="full"][vc_column_text] Making AI More Trustworthy Neo4j is so convinced by the importance of graphs to AI that we have formally submitted a graph and AI proposal to NIST, The US government’s National Institute for Standards and Technology, which is creating a plan for U.S. AI government standards. Our proposal states that AI and related applications of intelligent computing, like machine learning, are more effective, trustworthy, and robust when supported and interpreted by contextual information that only graph software can provide. AI that doesn't explicitly include contextual information will result in subpar outcomes, but graph software, developed as a way to represent connected data, can step forward to help. Let’s use the power of graph technology to enrich our datasets to make them more useful and a better basis for the next generation of AI success stories.[/vc_column_text][/vc_column][/vc_row] ### How to Better Optimise Your Cloud Applications [vc_row][vc_column][vc_column_text]According to Gartner, the cloud services market will grow 17.5% by the end of 2019 that totals to $214.3 billion. This is an increase from $182.4 billion in 2018. Cloud offers a cheaper, hassle-free, and efficient way to manage voluminous data. It also provides improved scalability, agility, seamless integration, and faster configuration. Though cloud adoption results in financial and operational excellence, the process doesn’t end with the last app being moved. There is always a need to monitor the deployed resources to ensure consistent productivity, security, compliance, and elevated ROI. Furthermore, optimization techniques like identifying idle cloud instances and accurately meeting usage commitments help slash costs and boost application performance on cloud. Equipped with unmatched experience in providing cloud computing services in Genève, we have listed a few other aspects of cloud optimization that help enhance application performance. 1. Choose Cloud Auto-Scaling According to Forbes, 60% of enterprises allocate funds for additional storage space to manage huge amounts of data. Auto-scaling is a dynamic feature of cloud computing. This attribute enables the application to automatically extend its services like virtual machines and server capacities, depending on the demand. Auto-scaling ensures that new instances are increased seamlessly with the rise in demand and decreased after the completion of such instances. This feature is pivotal in controlling operational costs in a cloud application. Consider a situation where a campaign performs well and there is a sudden upsurge in traffic. 
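As one concrete, and necessarily provider-specific, sketch of what such a policy can look like, the snippet below uses Python and boto3 to set pool bounds and a target-tracking rule on an AWS EC2 Auto Scaling group. The group name, region, size bounds and CPU target are assumptions made for the example, and the other major clouds expose equivalent controls through their own APIs.

```python
# Sketch: a target-tracking scaling policy with boto3 (AWS EC2 Auto Scaling).
# Assumes an Auto Scaling group called "web-asg" already exists; the names and
# numbers here are illustrative, not a recommendation.
import boto3

autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

# Keep the instance pool within agreed bounds so costs cannot run away.
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="web-asg",
    MinSize=2,
    MaxSize=10,
    DesiredCapacity=2,
)

# Ask the platform to add or remove instances so average CPU sits near 60%.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-asg",
    PolicyName="keep-cpu-near-60",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization",
        },
        "TargetValue": 60.0,
    },
)
```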
With auto-scaling configured in this way, the system will be prepared to manage this unprecedented spike. Furthermore, as auto-scaling is geared towards offering the right number of cloud resources based on current needs, it also allows you to define the minimum and maximum sizes of instance pools and the scaling metrics. 2. Manage Instances Effectively An idle computing instance may utilize a minimal part of the CPU capacity while the enterprise is being billed for 100% utilization. This results in a wastage of resources. Such idle instances should be identified and removed. The key to cloud cost optimization is consolidating computing workflows onto fewer instances. It is cardinal that the enterprise decides the size of the virtual server instances before the migration. The focus should be on reducing unused instances by defining clear rules and metrics. Organizations can also benefit from auto-scaling, load balancing, and on-demand capabilities. 3. Explore the Latest WANOP Systems The latest WAN optimization systems seamlessly support cloud-based applications and can be used as a virtual service to optimize traffic and bandwidth by aggregating links. WANOP (Wide-Area Network Optimization) systems can also be used to: Shape traffic by prioritizing and allocating bandwidth accordingly. Control data deduplication by reducing the amount of traffic across the WAN for remote backups, replication, and disaster recovery. Compress data to limit the amount of bandwidth used. Cache data by storing frequently used data on a local server for faster access. Monitor traffic continuously to detect non-essential components. Define rules and regulations for downloads and internet use. Spoof protocols by grouping chatty protocols to speed up processes. 4. Optimize Cloud Storage Optimizing data storage on the cloud will help better manage data from multiple sources. This can be done in the following ways. Delete or Migrate Unnecessary Files after a Particular Period: Rules can be programmatically configured to shift data to other tiers or delete unnecessary data. This saves a lot of storage space. Major cloud providers implement data lifecycle management. For example, in Azure, active data can be placed in Azure Blob Standard storage, but if some files are being accessed less frequently, they can be shifted to Azure Cool Blob, which has a cheaper storage rate. Also, object storage for log collection automates deletion using lifecycles: an object can be programmed to be deleted after a stipulated period from its creation. Data Compression before Storage: Compressing data before storage reduces storage space requirements, and the incorporation of fast compression algorithms like LZ4 boosts performance. Check for Incomplete Multipart Uploads: Object storage often includes incomplete uploads. If an object storage bucket has a capacity of a petabyte, even 1% of incomplete uploads will occupy several terabytes of space. Such unwanted uploads must be cleared. 5. Implement Cloud Analytics Incorporating cloud analytics helps get the most from the cloud investment. It helps analyze cloud spend and usage trends across the application. The advantages of cloud analytics are: Controlled Costs: Enables right-sizing of resources based on usage trends. It also offers recommendations of the best-suited resource size and the estimated cost savings this makes possible. Comprehensive Resource Management: Offers a holistic view of the spend across applications. 
5. Implement Cloud Analytics
Incorporating cloud analytics helps you get the most from your cloud investment by analyzing cloud spend and usage trends across the application. The advantages of cloud analytics include controlled costs, since it enables right-sizing of resources based on usage trends and offers recommendations on the best-suited resource size along with the estimated cost savings, and comprehensive resource management, since it offers a holistic view of spend across applications. It provides intuitive insights into additional cost savings by identifying idle resources and supporting the implementation of best practices, and it helps you stick to a budget, triggering alerts if the budget is exceeded.
6. Incorporate a Governance Structure
A solid governance structure helps organizations determine effective ways to monitor cloud solutions and establish best practices. A governance board can oversee adherence to compliance and regulatory requirements, authorization levels within the cloud application, and the scenarios in which cloud solutions should be implemented for improvement. It also helps curb idle instances and monitors the efficiency of the cloud application as a whole.
7. Select Event-Driven Architectures
Serverless computing is an event-driven style of application design, and AWS Lambda, Azure Functions, and Google Cloud Functions are examples of serverless cloud services. Though servers are still required to run event-driven functions in the backend, the idea of an event-driven architecture is to cut the deployment and long-term operation of instances. The cloud provider manages the underlying infrastructure: code is loaded to define the desired behavior or functions, is deployed, and runs only when triggered by a real-world or programmatic event. After the function completes, the code is unloaded and does not occupy any cloud resources. Event-driven architecture comprises event producers that generate a stream of events and event consumers that listen for those events and respond. Event producers are decoupled from the event consumers, and consumers are decoupled from one another. In some systems, such as IoT, events need to be ingested in huge volumes through an event ingestion layer.
8. Opt for a Microservices Architecture
Monolithic applications include all features and functions in a single executable structure. Though this is a proven development approach, it raises concerns about scalability and performance in the cloud: if a monolithic application has reached its peak performance, a whole new instance has to be created and deployed as a new application. A microservices architecture addresses this concern by splitting an application into several programs that are individually deployed, operated, and scaled, and that expose their functionality as APIs. If one API reaches its performance limit, only that particular service needs to be scaled out, which is a faster and more resource-efficient way to manage applications. In a microservices architecture, independent, self-contained services each implement a single business capability.
Wrapping Up
The cloud computing ecosystem is changing dynamically to keep pace with rising demands. Enterprises need to monitor it continuously and be creative to make the most of the platform. Though there are several optimization techniques, the key is to find those best suited to the goals and operational needs of the enterprise and implement them to obtain maximum benefit.[/vc_column_text][/vc_column][/vc_row]
### 3 Important Tips for Negotiating the Most Successful Cloud Contracts
[vc_row][vc_column][vc_column_text]Whether you're a startup or a large technology company, cloud computing can propel your business forward.
A cloud solution can meet the needs of your company at a reduced cost and with higher reliability. There's an increasing number of vendors offering cloud-based services, so it’s important to assess a service provider against its competitors. Analyze the cloud contract before subscribing. The contract terms are critical in shaping your relationship with your chosen vendor. The quality of service you receive may also depend on the agreement you sign. Here are three important tips for negotiating successful cloud contracts. Understand the Terms of Service Understanding your provider's terms of the agreement is necessary before beginning negotiations. Terms depend on scale of service and type of service. The three main service types are software as a service (SaaS), platform as a service (PaaS), and infrastructure as a service (IaaS). An expert-led negotiation seminar can support and encourage full awareness of the contract terms. It’s important to learn how to navigate through the offer details to claim better value. Avoid skimming through the documents. Don't assume the fine print will be as you expect. Every vendor is different and may present varied terms and conditions. Cloud services aim to increase your business efficiency. However, the provider’s contract terms are naturally more likely to favor the provider. Your company should stand to benefit if you negotiate where possible. Assess the responsibilities outlined in the contract. Understand your duties as the user, and the provider's obligations. Go through the service-level agreements (SLA). Some of the cloud SLA metrics vary from one service provider to another. Read and ensure you understand the different areas outlined. Terms may describe the volume of work, speed, quality, efficiency, and responsiveness. Know the expected standards of the responsibilities. In case the vendor fails to deliver up to standard, negotiate for compensation. Include financial penalties for defaulting on guaranteed terms. A redress is necessary for maintaining accountability. Both the user and the service provider should be clear on penalties. Negotiate Pricing Cloud is essential for hosting apps and storing data, but shouldn’t take up too much of your operational budget. Determine costs in comparison to benefits. Negotiation seminars can prepare business owners to claim higher value for lower prices when negotiating with service providers. When at first signing with a cloud service provider, the rates are often quite enticing. The vendor aims to attract your attention using discounted price tags. Some business owners taking advantage of the reduced rates fail to read the contract further to notice that the rate is an introductory one. Study the fine print to understand future cost implications. Using your chosen vendor's services may create a dependency on the provider's services. Vendors design their agreements to create such vendor lock-in contracts. Vendors may also charge high migration fees if you want to transfer to another provider. You may find yourself with no choice but to renew your subscription. Renewal rates are an element of the cloud contract you should pay close attention to. Discuss the terms of service beyond the first subscription, the duration, and the fees. Some vendors may insist on a long renewal period (many years). Depending on the nature of your business, you may want to negotiate for shorter periods. Ensure there's a specified renewal fee rate in your cloud contract. 
Clear terms can prevent unexpected price increments in the future. If possible, negotiate for an extended period of the initial discounted rates. Let the consumer price index (CPI) guide pricing. The CPI clarifies industry standards to avoid overpricing by the service provider. Prepare for Termination of Contract Business relationships evolve and sometimes come to an end. The end of an agreement may not be top on your mind when contracting a cloud service provider. Analyze what the cost of terminating your contract would be before signing the agreement. Seminars train attendees to prepare for the end of business relationships from the start. Various triggers may lead to breaking ties with a cloud-based vendor. Your business may need more advanced cloud services than your current vendor provides. There may be a failure to meet the service-level agreement by your cloud provider. Either way, ensure your cloud contract allows you to exit without hefty charges. Most contracts include early termination fees to protect the vendor. Be aware of these charges. Negotiate to ensure the penalties are reasonable. For defaulted service-level agreements, the service provider should waive termination fees. The cloud provider should pay compensation instead of the user. Also, take note of data migration fees. When you migrate from a cloud-based vendor, you want to leave with your data. Before signing up, clarify the terms for data migration. Agree on a predefined format of the data and the cost implications of exporting it. Negotiate for favorable or zero charges if possible. Many cloud providers need a notice of non-renewal within a given period. Take note of this timing in case you need to exit. In Conclusion When choosing a cloud provider, plan and review the contract details in advance. Carry out a risk–benefit analysis to test whether the provider is right for your company. If possible, have an attorney assist in clarifying terms.[/vc_column_text][/vc_column][/vc_row] ### Optimizing Your Martech Stack For Maximum Productivity [vc_row][vc_column][vc_column_text]Increasingly, the boundaries of regular marketing and digital marketing are becoming blurred. The reason for this is that we live in an age where marketing almost necessarily requires the use of technology which, in turn, makes it digital marketing. There are some limited uses of marketing which are completely free of technology and certainly the ideas generation phase doesn’t necessarily have anything to do with technology. But, on the whole, marketing means digital marketing. And digital marketing means ‘martech’. Given how ubiquitous martech is now when it comes to marketing, optimizing the process of using it is now a vital and quite challenging aspect to the marketing field. The use of a martech stack is one of the best ways to integrate your marketing-oriented tech solutions but can get a little complex and, a relatively new idea at least in the collective conscious, it requires some real thought and effort to decipher the best policy for maximum productivity. So, without further ado, let’s take a look at how to achieve this. Martech: What’s it all about? Martech is a portmanteau which, broken apart, refers to ‘Marketing Technology’. In an ideological sense it refers to the increasingly important reliance on technology which has wormed its way into marketing. 
“Martech is a broad term which encompasses all sorts of different technological fields, from social media marketing, to software used to track marketing and marketing results”, explains Jean DeLuca, tech blogger at Paper Fellows. It has rapidly become a very important way to think about marketing particularly in connection with the next concept. What Is A Technology Stack? A technology stack refers to a ‘stack’, ‘pile’, ‘structure’ or ‘web’ of interconnected software and digital tools, very likely produced by a wide range of different companies and developers which, when combined, create a complete toolkit, as it were, for handling all of the ins and outs of digital marketing, of ‘martech’ in one solidified digital solution. When the tools are specifically chosen for use in the field of marketing, that is when you have the titular ‘martech stack’. What’s The Point? Stacking your martech solutions and tools in a consolidated way gives you a few benefits. Firstly, it keeps you organized as you can align the software with the different individual tasks that it will be used for, ensuring that everyone on a marketing team is able to see where each tool or piece of software becomes involved in the process of digital marketing. Secondly, it allows you to save money. You can save money by seeing where in the process you have programs that are doing a job that is already being done elsewhere, allowing you to eliminate one. The clearer your overview of all the parts at play the better you will be able to streamline the process. Similarly, you are able to spot where your overall marketing process is a bit thin and reinforce it with further tools if need be. In a more general sense it also promotes a synergy in your marketing tools department. How To Build A Successful Stack Geared For Productivity The first thing worth noting is that no one man or woman can be responsible for stack design. The reason that this is important is that the technology that makes up the stack is relevant at all different stages in the marketing process, in all sorts of different ways. It can’t be up to one person to make the investment in the tech, it needs to be a consultation on what technology ought to be in the stack. This way productivity is likely to rise since all parties are connected to the stack and have an investment in it. Secondly, where possible, applications ought to have pre-existing connections. As much as is possible you want to be building technological threads through your stacks. It’s not ideal having 30 completely disparate applications in your stack, so you might note that your word processor is Microsoft and decide to use Excel as your spreadsheet software, for example. The integration and compatibility of software will lead to much more effective results and ease of access. Finally, constant review is necessary. It can be easy to pour effort into developing a stack, finishing and then washing your hands of the whole thing. Maximum productivity is achieved by having the best products and the only way to ensure that is to do frequent reviews of product performance and of other industry options. Conclusion It’s a lot of a simpler idea than it might sound, and your company might already have a martech stack without you even realizing it, but it’s something that is worth formalizing as an idea. 
That way, by thinking more directly about it, you will be able o optimize your stack for more effective and productive marketing.[/vc_column_text][/vc_column][/vc_row] ### Startups move fast, but they can move a lot faster [vc_row][vc_column][vc_column_text]Startup founders have different technical challenges at the various stages of their journey.  At the early stage, it’s a race to product market fit, and at the growth stage, it’s scaling the business and customer acquisition. A technology stack that does not support your growth strategy holds back product evolution and causes performance issues that erode the customer experience.  So how can startups go faster and retain and grow their customer base?   At the early stages startups pay less attention to the performance, cost and scalability of their technology stack.  Instead, they tend to use the standard infrastructure building blocks and install software to experiment and develop their technology. The goal is to get their first customers signed up, but this approach requires that they manage their infrastructure stack manually resulting in a much lower rate of experimentation, lower developer productivity, and ultimately a much longer journey to product market fit.  Leveraging cloud native applications removes the undifferentiated heavy lifting of installing and managing software. It gives early stage developers the freedom to experiment without having to worry about the cost of managing the technical infrastructure. Leverage the self-service approach to learning. There is an abundance of online training resources to keep up to date on new technologies, for example, review use cases and links to the detailed product documentation.  AWS also has a new Startup Kit, which is a set of resources designed to accelerate startups’ product development on AWS. A core component of the Startup Kit is a set of well-architected sample workloads that can be launched within minutes with code published on GitHub. The code can be extended easily to create a wide variety of applications. Look for opportunities to meet like-minded people to learn from. This could be through classroom training, bootcamps, user groups, meetups, and industry events. Leverage funded programmes, such as AWS Activate, which is a program designed to provide startups with low cost, easy to use cloud computing resources. Many cloud providers also have programs available for funded startups to help with the cost of experimentation during the early stages and benefits are accessible through their angel investors, accelerator program managers and venture capital firms. Architect for scale and cost efficiency. The world’s most successful startups from Airbnb, Slack, and Stripe to Robinhood have all been able to scale globally at a rapid pace because they’re built using cloud native applications. This approach helped them to scale, and remain laser focused on building innovative products and services that deliver value to their customers.  They have also been able to double down on other business priorities, such as hiring great people and driving new customer acquisition. Startups who choose not to build in the cloud risk having to rearchitect their applications, slowing down innovation and wasting money.  Supporting a technology stack that does not scale to meet the needs of the business is a costly mistake. A good example of this is TransferWise.  TransferWise are re-platforming from a monolith on-premises architecture to microservices using AWS Elastic Container Services (ECS). 
Using autoscaling containers was not an option for them 7 years ago when they first launched, but they are now adopting cloud native applications, which has allowed them to scale more cost effectively.   For example Deliveroo developed an internal application called ‘Frank', their new despatch engine running on AWS using Machine Learning models.  'Frank' tracks restaurant preparation time and delivery times then despatches riders based on the outputs, and it has helped Deliveroo increase operational efficiency by 20%.   The costs of machine learning model training are kept low by using AWS Spot Instances, a cloud native approach to consuming spare compute capacity for steep discounts. Don’t get locked-in. Startups are choosing to use cloud native services to avoid getting locked into lengthy contracts with traditional technology providers. Taking this approach means startups are not forced to standardise on the lowest common denominator of features and services.  They can benefit from the elasticity of cloud native applications and cost per use pricing, so are not forced in to overprovisioning and overpaying for infrastructure that needs to support peak time utilisation. Whatever stage of the journey you are at, adopting cloud native applications accelerates your rate of experimentation, increases your developer productivity and scales your startup in the most cost effective manner.  By reducing the heavy lifting associated with managing software, you can move a lot faster and scale globally.[/vc_column_text][/vc_column][/vc_row] ### How to generate sales in the technology sector [vc_row][vc_column][vc_column_text]High quality sales leads are the backbone of any thriving tech business, however, there are many definitions as to what a “quality” lead is. For example, should you class any prospect that has shown interest in your business services or products? Or should you wait until they’re further down your sales funnel until they are considered “high quality”?  Sud Kumar, marketing director at digital marketing agency Origin, discusses the key steps that technology businesses should take to generate high quality leads. In many organisations, marketing teams will generate leads and send them over to their sales teams to close. However, this is not necessarily the best way to boost your business growth, as you may be sending across low quality leads to your sales team, which will mean that they’re wasting their efforts when trying to convert them. So, what should you be focusing your efforts on? In reality, the fewer, better quality leads you generate, the better, as this will enable your sales people to really get to know the leads, get under the skin of their problems and build meaningful, working relationships with them that will grow into lasting customer relationships. Categorising leads the right way… Leads can be categorised by their position in the sales funnel and their level of interest in your business. For technology businesses, you’ll be able to look at how your service and product will have an impact on the lead’s organisation. For example, will your tech drastically improve their operations and provide the solution to their problem? This will present an opportunity for you to communicate all the benefits they’ll gain by implementing your technology which will pique their interest and push them further down your sales funnel. To ensure you categorise your leads in the right way, it’s useful to be aware of the two different types of leads. 
Marketing Qualified Leads (MQL) – this is a prospect that has shown interest in your business by completing an action. This action might be filling in a form on your site, signing up to your marketing updates or downloading a guide or some content from your site. To nurture this lead effectively, you should begin to send them more content that shares your business knowledge. For example, if they’ve downloaded a guide or a “how to” video from your website, have a closer look at the actual topic of that content. This will enable you to identify the types of themes that they are interested in and you can send them more of what they want. Sales Qualified Leads (SQLs) – this is a prospect that has shown immediate interest in your offering, or this is a prospect that has passed through the MQL stage, been pushed down your sales funnel and is ready to make a purchase. Lead generation tips…   Now you understand how to look after and nurture your potential customers that are at different stages within their purchasing journey, it’s time to generate leads. Below, I’ve identified four of my top lead generation tips. Get your sales and marketing teams on the same page To begin your lead nurturing activity, you must properly align your sales and marketing efforts. To do this, the teams or people responsible for your sales and marketing must work together in unison so that prospects are passed on to sales at the right time or are nurtured further through marketing until they are ready for sales interaction. Keep your leads engaged Once your leads sign up to your marketing updates and communications (once you have their contact details), you must begin to nurture them and feed them more interesting information to keep them engaged. But, don’t just send your prospects information about your business and products, send them a mix of useful industry insights, trends, research, advice and sprinkle in references to your business to position your brand as authoritative and not just focused on sales. Create lead questions To help you categorise your leads effectively, introduce forms on to your website pages to gather information and data about your potential customers. The forms should contain questions to help you learn more about the prospect, identify whereabouts they are in the sales funnel and what content they should be sent next. Introduce lead scoring Lastly, introduce lead scoring into your business activity to rank your leads based on their level of engagement with your brand. This will help you determine if they are the right “fit” for your business, how much time your team should be spending on them and what your next move should be, to enable you to really maximise your lead generation efforts and gain the most return. To find out how we’ve done this for other technology companies, visit our website www.origingrowth.co.uk/work/loqate.[/vc_column_text][/vc_column][/vc_row] ### Get your IP strategy right early; reap the rewards later [vc_row][vc_column][vc_column_text]Some technology innovation businesses focus on intellectual property (IP) generation and are well aware of the threats from the outset.  However, despite intangible assets now usually making up the major part of most company valuations, many businesses still deal with these issues reactively when they get thrust into focus by events, rather than proactively, and end up with an outcome which is both less positive and more expensive. Why is this?  
Well, to be honest the issue is that it is not always easy to get IP strategy right, and particularly for the appropriate cost and level.  Some attempts get bogged down in the detail and a lot of money is spent looking through the weeds without finding a clear path,  while some businesses engage in consultancy which produces a nice sounding ‘strategy’ document that looks impressive in the annual report or at the Board but is painfully short on implementable details or specific value-adding output.  Having experienced either in the past, or it simply not being clear what value is to be gained, often sensible (and perhaps cynical) business leaders under time and budget pressures understandably press ahead with the more tangible product development issues. Let’s consider some typical scenarios involving a ‘moderately innovative’ company which doesn’t seek to be a cutting-edge disruptive player but nonetheless is routinely refining, updating and improving its products and adapting to expected customer requirements. Perhaps the company considers it is simply adding automation, integration, connectivity and convenience features that users generally want, or it is improving manufacturing processes or product physical qualities. Occasionally, changes will be made which percolate up as being sufficiently distinctive that the company considers applying for a patent.  At this point, it may transpire that for various reasons there are likely to be serious difficulties protecting what was thought to be interesting, which could be due to the technology area.  For example, what can and cannot be protected in the field of software / AI / medical technology is often far from intuitive to someone who is not an IP expert in this area, or simply because it is a generally crowded field.  However, it often turns out that the company did have some innovations along the way at the design stage that might not only have been protectable, but could even have been the basis for broader and more valuable protection but it is now too late as they have been disclosed or others have got there first.  Moreover, some of the little tweaks that were dismissed or not registered (which were made to solve particular problems) could have been protected, and might have qualified the company for a tax break if patented. Another worst-case scenario is that, whether by a benign means or by being the recipient of a threatening letter or court action, a company discovers that a competitor has patent protection for a feature after having spent time, money and effort developing and incorporating it into a product.  At a late stage, the choices are unlikely to be attractive: either abandon / redesign the product; face the prospect of margin-destroying royalties, if even available; or worse - distracting and expensive litigation.  It is often the case that early visibility of the situation might have enabled the design to be modified to reduce risk, and potentially protection could have been obtained which would have been valuable and possibly tradeable in a cross-licence with the competitor.  The threat itself might have been relatively inexpensively cut down by opposing the competitor patent at the European Patent Office (EPO). Now extensive searching for potential bad news is expensive and rarely worthwhile, but targeted intelligence on competitor IP may be simpler to obtain and can inform general business decisions.  
Identifying promising areas to develop and protect strategically (not simply reacting to what comes out at the end of the design process), and considering the IP angle early can identify less risky routes that are more protectable, might give rise to tax reductions and tradeable assets.  Collectively this could make a company a more attractive acquisition target and a less attractive litigation target. A number of companies opt to take IP more seriously after some successful early stage growth when new advisors and investors come on board.  More often than not, it is discovered that many of the business’ good ideas were not protected appropriately at the time.  A scrabble for second generation protection generally costs significantly more to obtain several narrow patents, each of which might be circumvented, than a broad patent covering key innovations in the first place might have cost. How much does looking ahead strategically cost?  A fraction of what dealing with a surprise usually does.  As little as half a day’s time of a handful of the right people (CTO, key technical/product people and an appropriately technically and commercially savvy IP advisor) appropriately prepared at the start of a new product development cycle can typically identify the more promising and less promising avenues before design commitment, and often generate potential IP in itself early on. This is particularly true for a small to medium size growing company: filing a number of strategically targeted patent applications can add value and can encourage otherwise likely litigious competitors to adopt a more constructive cross-licensing approach.  To put in context, defending patent litigation costs may typically run into millions of pounds spent in short order at a time of a competitor’s choosing, whereas building a credible patent portfolio may cost a few tens of thousands spread over a few years, with timing adjustable to suit budgeting. Sometimes the outcome of a strategic look ahead at IP is simply that there frankly is not much of a hard IP angle to the proposed new product activities. This knowledge, as well as usually being refreshingly inexpensive to acquire, is itself valuable in informing the likely speed of competition, appropriate branding and other marketing strategy to keep ahead. As with most things in business, a little knowledge of the road ahead can be very valuable.[/vc_column_text][/vc_column][/vc_row] ### Trust is key to avoid vendor lock in [vc_row][vc_column][vc_column_text]Digital transformation is more than just a buzzword; it is a critical process for businesses that want to keep pace with today’s mobile-first consumers. It is also non-negotiable, with the Digital Transformation 2018 report stating that 40% of organisations will no longer exist in 10 years if they fail to bring in new technologies to power digital transformation. One of the key enablers of digital transformation is cloud computing, the new normal for organisations large and small. The cloud lets organisations experiment and innovate cost-effectively, helping them move fast to gain competitive advantage and give customers the flexibility and choice they demand. While demand for cloud computing is growing, even in government organisations and regulated industries, so too are the requirements that organisations place on their cloud partners. They are no longer looking for a technology partner for life; instead, they want the solutions and scalability that will help them adapt to the fast-changing needs of their customers. 
A relationship of trust
Cloud computing today delivers much more than virtual machines or virtual storage. It allows customers to move fast, operate more securely and save substantial costs while benefiting from scale and performance. Many cloud providers also deliver services that help organisations deliver on digital transformation projects, from analytics and artificial intelligence to security, virtual reality and application development. However, the crux of an organisation’s relationship with its cloud provider is trust; when your business infrastructure lives in the cloud you need that cloud to be available and resilient. This is why customers expect always-on uptime, world-class security and competitive pricing. This trust can easily be broken by issues including service disruptions, price hikes and security lapses. Should this occur, organisations want the option of walking away. Sadly, some cloud providers make this difficult with efforts to ‘lock in’ customers.
Moving away from vendor lock-in
Before the advent of cloud computing, traditional IT systems were delivered via long-term contracts and upfront payments. Cloud changed this, enabling organisations to pay for only what they need on a subscription basis. This approach not only provides a financial benefit, it also lets organisations walk away at any time, moving to another provider that can restore their trust or better help them evolve in line with changes in their marketplace. At AWS, customers have responded well to the cloud model and are looking to get away from the “old way” of doing things, avoiding the “lock-in” associated with high hardware costs and the additional software licensing and support needed for in-house data centres. In fact, the frustration with vendor lock-in has been exacerbated as many contracts force customers to postpone their move to the cloud. “We are looking to adopt a cloud-based model as far as possible but we have a lot of legacy applications and a lot of long-term contracts which we can’t really seek to re-procure with something already in place,” said Marion Sinclair, Head of Strategy and Enterprise Architecture at Kensington and Chelsea Council.
New platform, same frustrations?
On transitioning to a cloud environment, organisations expect to be free of vendor lock-in. Unfortunately, some cloud providers are pushing technologies that effectively keep their customers locked into their platform. One example is the cloud abstraction layer, where vendors add a layer between their infrastructure and the app. This layer hides the implementation details of how the application’s functionality operates, making it difficult to move to a new cloud platform. Customers moving to the cloud must be careful to avoid further lock-ins; they can make organisations less nimble and adaptable to changing business conditions, and can also greatly impact growth, innovation, business costs and flexibility. Cloud computing has quickly changed the way that IT is built, developed and deployed. Moreover, it has been critical in supporting organisations with a focus on fast-paced digital transformation. However, to truly keep pace with their market and customers, organisations must avoid lock-in, leaving them free to select the best IT building blocks for their transformation efforts. AWS believes that customers should have the choice to change cloud service providers (CSPs) and avoid customer lock-in. We strive to maintain customer loyalty by innovating and by improving and creating new services.
Customers pay only for the services they use and can switch CSPs at any time. AWS provides access to the highest levels of security, but without a large upfront expense, and at a lower cost than in an on-premises environment. AWS offers both a secure cloud computing environment and innovative security services that satisfy the security and compliance needs of the most risk-sensitive organisations.[/vc_column_text][/vc_column][/vc_row]
### How Cloud Storage Solutions Are Coming in Handy for Business Organisations
Cloud storage is one of the best ways to store data online. Long gone are the days when businesses relied solely on in-house servers to safeguard their growing volumes of data and online documents. Cloud servers have emerged as the next big thing, and these next-generation systems also help professionals make better decisions by making sense of the intricacies of unstructured data. Technologies such as machine learning, automated reasoning and data sampling are applied to interpret and decode that unstructured data and to identify, infer and project the best available solutions. Cloud storage offers huge amounts of space and can transfer data to a remote location in moments. Storage space can extend up to 2 TB, and data from an array of folders and files can be sent to multiple locations, helping protect it from hacks or other cyber-crime. Cloud storage is also an excellent platform for collaboration: by allowing people to open, examine, revise and work together on a single document, a cloud environment gives users easy access to important online files from anywhere in the world and lets them collaborate in real time. At the same time, added safety and security features are among the most important aspects of any cloud storage solution. Branded cloud storage services, especially those sold by well-known providers, come with improved data security features, and all that is needed is a solid, uninterrupted internet connection to retrieve online backups securely as and when required. Cloud storage solutions also involve little additional expenditure: vendors of cloud-based storage typically offer technical support to buyers almost 24/7, so users can raise questions and have their queries answered promptly. With a cloud storage platform on board, businesses of any size and scope can reap huge benefits. Mobile-friendly interfaces mean data can be sent to remote servers from any mobile device, with cloud operations carried out conveniently on the go. Even a successful business faces the threat of cyber attacks, and when such an incident spirals beyond control and IT specialists cannot contain it, the organisation can still recover the lost data from the remote servers at a minute’s notice, so the business can keep operating without a hitch. In a nutshell, a cloud storage platform supports uninterrupted business flow. Last but not least, clients can enjoy highly scalable storage space, offloading files to remote servers.
The storage limits offered by branded providers are likely to be far higher than those of local servers, and storage space can be expanded or scaled back according to business needs. The ease and convenience provided by cloud storage solutions is remarkable. Even when data is stored on removable media such as magnetic tape, diskettes or floppy disks, some manual handling is still needed. Data kept in the cloud, on the other hand, is stored safely online and can be retrieved from anywhere; as information keeps flowing in, it is saved automatically, so there is no need to keep every detail in sight. A cloud storage system lets people focus fully on their work without worrying about data loss and similar issues. According to Allied Market Research, the global cloud storage market is anticipated to grow at a significant CAGR from 2020 to 2027. The major factors fuelling the market are a growing preference for hybrid cloud as the prime delivery model, rising cloud adoption across industry sectors, surging demand for low-priced data storage, growing concerns about the safety and security of cloud storage, and an increasing inclination toward cloud adoption among SMEs. At the same time, expensive private clouds and the threat of attacks on public cloud systems have accelerated demand for hybrid solutions, which offer the flexibility to switch between public and private environments. Furthermore, cloud storage paves the way for secure access to data, and the fact that it involves relatively low outlay, with hardly any maintenance costs for an in-house data centre, has supported the growth of the market in more than one way. To conclude, the global market has already gained considerable momentum and is expected to expand its reach further in the coming years.[/vc_column_text][/vc_column][/vc_row]
### How to Turn Your Mobile App into a Lead Magnet
[vc_row][vc_column][vc_column_text]Marketers work hard to attract potential customers to their websites, online platforms, social media profiles, and mobile applications. You might think the same general rules apply to every type of project, but it’s not that simple when it comes to applications that run on different hardware platforms. With websites and online shops, it’s easy to see that you can attract clients with appealing design, ease of navigation, and other tricks. These basic moves guide users straight to you, and you then just have to make them a tempting final offer so they buy whatever you sell. Today we want to describe the ins and outs of turning your mobile app into a real lead magnet, but first we need to start with the basics. What are we even talking about? What is a lead magnet? Just like anyone running an online or offline business, you want to land more sales and convert leads into real deals. This whole process requires you to get the leads first, and that’s where the ‘magnet’ steps into the game. Breaking down the lead magnet is quite easy — it’s a free offer that your client can accept in exchange for their email address and any other additional information you may need. Expert tip: we understand that sometimes having more information is the real deal. However, we strongly recommend keeping it short.
Otherwise, your users may quit and not follow the lead. Among the most popular types of lead magnets, there are: Signup offers Freemium unlockers Bonus content Some people tend to call it a ‘bribe’ that you give out to your clients in exchange for his email or other required info. This doesn’t sound right to us. It’s more appropriate to call it a mutual exchange. Your user gets something valuable and gives out something he is ok to share. Why you need a lead magnet So the final destination of your user’s lead magnet interaction is the mailing list of your company, shop, website, whatever. Magnets allow you to grow your mail audience and get back to them with new offers, convert these leads into sales and make money after all. It’s all business. The bottom line is that having a lead magnet is crucial for your marketing. This is a fundamental part of your business growth and revenue generation. How lead magnets can be applied to the world of mobile apps Achieving good conversion rate numbers is not that easy on mobile devices. You should either drive your users to well-designed and convenient landing pages that can be used easily on a mobile phone or try the in-app purchases techniques. This is something you should think from the very beginning of your mobile app development. The first thing you should highlight is whether the point of the actual sale is located in the app itself or somewhere else on the other page on the internet. We recommend you to choose a multi-platform approach allowing users to buy your products or services in the app itself and online on your platform. Mobile device users love exclusive offers just like you, me, and everyone who uses other types of devices. These promotions and freemiums can include e-books, checklists, video tours, and bonus content available only in your company’s app. The main goal is to bring value to your user and offer him something he can use or utilize on his mobile phone, in your app or online from his phone. Lead magnet formats you can try right now Checklists — tutorials that guide your users step-by-step through complex tasks that your users are challenged with. They should be short and straight to the points. Checklists and other types of cheat sheets are super popular among users these days. E-books and trial offers for online courses — if you’re selling content services like courses, you can support your product with e-books or trial codes that can only be accessed if your user gives you his email. Free consultation — being good in your field of expertise can help you drive many leads to your platform through the app. Just offer your potential customers to try one consultation for free and show him you’re good at giving valuable advice, content, or ideas. Conclusion Lead magnets are not hard to create, but it’s essential to understand what kind of platform and device your users use the most. Having this information will unlock massive potential for your business to bring thousands of emails and top-quality leads. Furthermore, you’ll be able to convert these leads to actual sales and boost your business revenue drastically.[/vc_column_text][/vc_column][/vc_row] ### How to hold onto the magic as your business grows [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/338855371/2db73c748d[/embed] [/vc_column_text][vc_column_text]In 2010, IBM released the results of a study of the world’s top CEOs. They wanted to find out what the most pressing issues were for business leaders. And the results were pretty surprising. 
The number one skill they were looking for from their employees turned out to be creativity. Not time-keeping or salesmanship or efficiency. These big corporations wanted their workforce to be able to come up with ideas.  That certainly surprised me. To understand why this was their biggest concern, we need to take a step back and look at the natural life-cycle of a company. The early days of most businesses are highly creative. Your energy goes into shaping your products, creating your brand, bringing in business, working out what makes you different from the competition and hundreds of other things. Everything you do here is about creating the very essence that makes your business special. It’s an exciting and magical time. But once you've found that essence, your business goes into a new stage. Your focus shifts from creativity to systemisation. The business now needs to work out how to reproduce its offering as predictably as possible. It’s about getting the good stuff out of the founders' heads and turning it into a business algorithm that can scale. And then the last step is optimisation. Once you’ve got a system in place, the company's focus switches to efficiency. If you can cut costs by doing things faster, using fewer people, at a larger scale and with better-negotiated contracts, you’ll naturally increase your profits. All the effort goes into small incremental gains. Changes within this finely-tuned system can screw everything up, so the corporate antibodies destroy anything new or different. And this final optimising stage is where these big companies exist. Their focus is on efficient replication not on creative origination. Like oil and water, these two things just don’t mix. But because these corporate megasaurs are getting increasingly worried about nimble startups eating their lunch, they know they need more creativity if they want to survive the storm. And by that, I assume they mean they need more ideas, more flexibility and more innovation. That’s something they often struggle with themselves. (I should know because many of them call on me to help them). Of course, not all companies fall into the rigid trap these businesses find themselves in. Some of them continue to grow and innovate and evolve over time. But it’s best to bake this attitude in from the start. And you need to make a conscious effort to hold onto it when you move into the systemisation phase and again when you move into the optimisation phase. Because, if you don’t, it’s hard to get it back. Here are some things to build into your business from the very beginning. Schedule time to think beyond the short-term It’s easy to get caught up in issues like cashflow, logistics, new business and other pressing considerations. I’ve run my own business, so I can sympathise. But if you place your entire focus on the now, you’ll always be reacting and firefighting. Schedule regular time to think about the future and adjust your direction. Even established companies can benefit from the occasional pivot. Always have a big goal to head towards If your goal is simply quarterly profits, you’ve lost your soul. Your goal should be a vision that you’re collectively trying to achieve. Your goal is what keeps you striving and keeps your staff on track. It’s what makes things worthwhile and keeps your eye on what’s important. Ask your employees how you can do better When a company grows, the captain at the helm becomes further and further removed from the day to day operations. 
So, if they’re not careful, their decision-making becomes less effective and relevant over time. To counter this, Toyota created a brilliant program that asks their staff for suggestions on how they can improve. Not only does it allow every part of the business to be improved but it also makes everyone feel involved in the organisation. And that’s massively powerful. Of course, there are different ways of holding onto your magic. But it always comes down to keeping a balance between doing and thinking. It will vary according to your industry, your company structure, your size and all sorts of other factors. Your investors will likely encourage you to concentrate fully on the doing, the systemising and the scaling. Naturally. That’s what’s likely to lead to the speedy financial returns they’re after. But in today's ever-evolving business climate, that’s the very approach that could lead to a dramatic disappearing act somewhere down the line. Or at the very least some vanishing coins. Dave Birss is the co-host of the “ultimate anti-conference”, the Fast Forward Forum, due to run in Venice on 6th to 8th October 2019. For more details visit https://fastforwardforum.eu/[/vc_column_text][/vc_column][/vc_row]
### Hybrid-cloud: Getting the Best of Both Worlds
[vc_row][vc_column][vc_column_text]Having access to, and the ability to generate insights from, high-quality data is a game-changer for organisations. Not only are the majority of enterprises beginning to recognise how data insights enable innovation and optimisation, they are waking up to the need to secure a foothold in the expanding digital and data economy. With Cisco estimating that 94 per cent of workloads will be hosted in cloud data centres by 2021, enterprises ignoring the cloud and the data insights it affords will be firmly in the minority. The public cloud has been generating buzz for the best part of a decade now, yet it is only in recent years that Fortune 500 companies have begun to execute their own cloud strategy. A large component of the debate surrounding whether enterprises should embrace cloud migration is that it is often presented as an either/or proposition. This is to say that enterprises are often presented with the choice of either retaining their on-premises data centres or migrating everything to the cloud. For enterprises torn between choosing an exclusively cloud-based strategy or an entirely on-premise one, both options have their own associated advantages. Staying on-premises offers greater control of data and security. On the other hand, moving to the cloud could lower costs (but only if well-managed) and allow for greater flexibility. However, what the last couple of years have taught us is that the choice between on-premise and cloud is no longer binary. Instead, organisations can get the best of both worlds by partially leveraging a cloud solution while still making use of their on-premise environments. This is known as a hybrid-cloud strategy. The appeal of a hybrid-cloud strategy is strong: it gives enterprises security and control where they are needed while still offering flexibility and lower costs. This is precisely why an increasing number of organisations are adopting the strategy. According to the RightScale 2019 State of the Cloud Report from Flexera, enterprises with a hybrid-cloud strategy grew to 58 per cent in 2019 from 51 per cent in 2018. While this trend is certainly good news for cloud providers, it is pertinent to ask what it means for the growth of big data.
ELASTICITY IS KEY
Cloud environments are simply more suited to the requirements and demands of big data than any traditional on-premise infrastructure. This is primarily because the cloud can offer the elasticity needed by most big data deployments. In the majority of data deployments, it is common to see sudden and substantial spikes in resource requirements. For instance, an e-commerce company will typically see consistent, stable levels of traffic throughout the year. However, certain days or events such as Black Friday or Cyber Monday can generate enormous increases in traffic that strain on-premise infrastructure beyond capacity. For e-commerce enterprises whose deployments are integral in analysing the behaviour of users on their site and providing real-time recommendations, having their big data insights fail isn’t an option. To keep providing data insights - and indeed, service - during these traffic spikes, additional servers would be necessary. However, these servers wouldn’t see any use for the rest of the year, when traffic remains at a stable but far lower volume. This is where the elasticity of the cloud represents a fundamental advantage. In a cloud environment, the same e-commerce enterprise could instantly scale their infrastructure to manage the extra traffic and scale it back down again when it passes. This is both cheaper and easier - when well configured. Even on a much smaller time-scale, elasticity is still a big draw for enterprises. For example, most apps and websites will have traffic spikes at particular times of the day and lulls at others. According to Nielsen, Tinder consistently sees an increase in active users between 8 pm and 9 pm. Tinder, of course, leverages big data to provide its users with the most likely matches. Maintaining the infrastructure necessary to facilitate these data insights on a 24/7 basis would generate a lot of unnecessary cost for Tinder. Instead, in a cloud environment, scaling up cloud deployments during these peak hours and scaling back afterwards is as easy as pressing a button.
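As a hedged sketch of that “button press”, the boto3 calls below schedule a recurring scale-out ahead of a known evening peak and a scale-in afterwards; the Auto Scaling group name, capacities and times are hypothetical and stand in for whatever an organisation’s real peak profile looks like.

```python
import boto3

autoscaling = boto3.client("autoscaling")
GROUP = "recommendations-asg"  # hypothetical Auto Scaling group

# Scale out shortly before the recurring evening peak.
# Recurrence uses cron syntax and is evaluated in UTC unless a TimeZone is set.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName=GROUP,
    ScheduledActionName="evening-peak-scale-out",
    Recurrence="45 19 * * *",
    MinSize=4,
    MaxSize=40,
    DesiredCapacity=20,
)

# Return to the normal baseline once the peak has passed.
autoscaling.put_scheduled_update_group_action(
    AutoScalingGroupName=GROUP,
    ScheduledActionName="evening-peak-scale-in",
    Recurrence="15 21 * * *",
    MinSize=2,
    MaxSize=40,
    DesiredCapacity=4,
)
```

A reactive, metric-driven scaling policy can sit alongside schedules like these to absorb anything the timetable does not predict.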
By contrast, in an on-premise environment this would mean maintaining physical infrastructure that sees use for only a very limited number of hours in the day.
CLOUD IS EXPANDING BIG DATA
The cloud is no longer a curiosity or buzzword circulated by enterprises. Instead, it is becoming an integral part of how organisations adopt big data, simplifying the process and reducing costs. That being said, this is not to say that on-premise environments are going to disappear. There will always be a demand for the tighter control and security offered by physical infrastructure. Regardless, it is inevitable that the cloud will be leading the way for big data going forward.[/vc_column_text][/vc_column][/vc_row]
### How to compare cloud-based accounting software
[vc_row][vc_column][vc_column_text]Accounting can seem an intimidating task, especially for a small business just starting out. Before the advent of cloud computing, businesses had to rely on traditional accounting software on dedicated hard drives, which was often managed by an external accountant. This meant the data wasn’t easily accessible to anyone else in the business, and even business owners might see the accounts only annually. Cloud-based accounting software has, however, helped to make the task of bookkeeping easier for businesses.
Rather than each business needing an external accountant or specialist equipment to store their information, cloud-based accounting software has made it more accessible for businesses to balance the books themselves. It has also made accounting a much less time-consuming task as the software can update information automatically in real-time, if it is linked to your bank. Because of the “Open Banking” system, which allows regulated third-party systems to access your banking information, transactions can be automatically entered into your accounts which removes the need to manually enter each individual payment. Furthermore, cloud-based software can be a more secure way for a business to do their accounts, although there is an added cyber-security risk. In the past, when data was stored on one device (and any backups the company kept), there was always a risk that it could all be lost should something happen to that piece of equipment. Cloud software has minimised this risk as the data is stored remotely on servers in the cloud and isn’t tied to any single device. Instead, the accounts can be accessed anywhere by anyone with the login details, whether you’re at work, at home, or even on the go, thanks to mobile apps. Businesses both big and small can benefit from using cloud-based accounting software, but platforms don’t all offer the same features. They come with different properties that are suitable for a range of businesses, from freelancers, to self-employed individuals, to SMEs, to large corporations. Who will be using the software? This is one of the first questions businesses would need to consider, as the requirements and preferences of a qualified accounting professional within a larger company will differ to a small business owner running their accounts alongside their other duties. Some programmes are more suited to accountants who fully understand what they need to do and just want a functional tool to help them carry out their work. On the other hand, some platforms are more targeted at less-experienced individuals (such as small business owners) to help make managing their accounts less complicated and time-consuming. Additionally, rather than one accountant being in charge of the accounts (as with the traditional software), you should bear in mind that cloud-based accounting platforms allow multiple people access to any relevant parts of the system. Different departments can use the cloud software for their particular roles, such as sales invoicing for the sales team and purchase ledger data for the office manager. What are your business needs? Businesses should also consider what they need from their cloud accountancy software. Not only will the size of the business affect what provider you might choose, but also the nature of your business. For example, a business offering a service will need different accounting tools to a business dealing with stock, so it is important to check that the cloud software has all the tools you need. Features of cloud-based accounting software There are some fundamental tasks that you can do on most cloud-based accounting tools, such as monitoring income, tracking cash flow, sending invoices, and filing expense reports. Most businesses would need these features as a minimum, even if they are a freelancer or a self-employed company of one, so all cloud accounting software will generally offer these services. 
Some cloud tools may also be able to help businesses with their auditing and VAT, something which is particularly pertinent with the recent introduction of Making Tax Digital. Larger businesses with employees are likely to need a wider array of tools with their accounting software, such as a payroll feature and a time tracking feature. They may also want a more advanced software that enables them to create data reports and forecasts, to help them make future financial predictions. Furthermore, businesses dealing with products may search for accounting software which has a tool to track their inventory, so they can quickly check how much stock they have left. It is also an idea to look at how different cloud accounting providers can integrate with other platforms, like Slack or Salesforce. If your business already uses certain systems, it may be an idea to see if any cloud accounting tools specifically integrate with them as this will maximise their benefits. Scalability The scalability potential of a cloud-based accounting software should also be something for businesses to consider, particularly if they are small and growing. Whilst a basic, easy-to-understand package may meet a small company’s current needs, they may find that, as they grow, they need extra features that their existing accounting software can’t provide. So, depending on the confidence and aspirations of the business owner, they may find it more beneficial to choose an accounting software that is customisable and has the ability to grow with the business. Cost Cost is an obvious measure to compare the range of cloud accounting tools, with most of them offering a monthly subscription to access their different features. Some cloud-based accounting providers offer several grades of package, depending on what your business requires, whilst some allow businesses to simply pay for the features they use and add any further features for an extra price. Businesses should make sure they look at the pricing structure carefully, as some providers restrict the number of transactions or invoices that can be made on that specific pricing package. They need to assess their current business needs, as well as their future development, to find a cloud accounting software that will work for their business in the long-term, as you do not want to move platforms too often. There are a number of free cloud accounting software tools, but these are unlikely to offer anything beyond a limited number of essential features. Making Tax Digital compatibility Previously, businesses have had to manually send in their tax and VAT documents, but this is being replaced by digitised methods. The UK government launched the Making Tax Digital scheme as an attempt to make it easier and more efficient for businesses to get their tax and VAT statements correct, and cloud accounting can play its part in this. Some cloud-based accounting software providers are compatible with HMRC’s systems, which means you can submit your information direct to HMRC. If, however, a cloud accounting provider is not currently compatible, you would need to transfer your information manually to HMRC’s online system. Over time, more cloud-based accountancy providers are likely to develop their compatibility with HMRC, but this will certainly be something for businesses to look at when choosing their software. All cloud-based accounting tools can help businesses to automate their accounts and update their data instantly on any device. 
Whether they’re looking for a basic and easy-to-understand software or a full accountancy package that can perform numerous tasks, businesses are sure to find a cloud provider that will meet their needs and make the chore of balancing the books much quicker.[/vc_column_text][/vc_column][/vc_row] ### ThousandEyes Addresses Critical Enterprise Application Performance Visibility Gap With Internet-Aware Synthetics [vc_row][vc_column][vc_column_text]Correlated app, infrastructure and Internet performance visibility speeds problem isolation and resolution for complex multi-cloud and SaaS deployments London, September 4th, 2019 – ThousandEyes, the company that delivers visibility into the Internet and digital experiences, today announced ThousandEyes Synthetics, an Internet-aware synthetic monitoring solution for proactive detection of modern application performance issues. Addressing a significant application performance monitoring gap introduced by API-heavy and Internet-dependent application architectures, ThousandEyes Synthetics visually correlates application performance to underlying infrastructure and Internet delivery performance in a single, shareable dashboard for instant root cause identification and collaborative issue remediation. These new capabilities dramatically reduce business continuity risks, enable successful SaaS and cloud-hosted application rollouts, and give IT teams confidence in their ability to troubleshoot SaaS, cloud-based and browser-based application performance issues. "Traditional synthetic monitoring solutions simply don't cut it in today's Internet-dependent, cloud-centric ecosystem. An app-centric view with no knowledge of the underlying dependencies leaves IT, Digital Ops and service delivery teams dead in the water while troubleshooting application performance issues," said ThousandEyes vice president of product, Joe Vaccaro. "ThousandEyes Synthetics enables both SaaS app owners and IT teams to deliver and deploy with confidence knowing they will be able to quickly identify exactly what's causing any issues in end-user digital experience regardless of where the issue lies, eliminating massive business continuity risks." Legacy synthetics were designed for traditional application development approaches and for applications that are hosted in on-premises data centers where the Internet and third-party services aren't potential complicating factors for application performance. Modern applications, however, require an entirely new approach to synthetic monitoring due to the fact they are distributed, interact with multiple third-party services through APIs across multi-cloud environments, and are constantly changing thanks to continuous integration and continuous delivery (CI/CD) models. ThousandEyes Synthetics combines a new, programmable javascript-based approach with deep active monitoring that correlates application insights gathered through synthetic tests with HTTP, network metrics, network paths, Internet routing, and outage visibility, in a single view. This allows for: Comprehensive Insights: Gain actionable insights into application performance with proactive monitoring of commonly trafficked user paths and multi-step business transactions. Cloud and Internet RCA: Expedite the identification of cloud- and Internet-related root causes of application performance issues like ISPs, CDNs, IaaS, SaaS, and the Internet. 
User-centric measurements: Understand how applications are performing from user-relevant locations via pre-deployed monitoring agent locations around the world and new mobile network monitoring agents.

"Our research shows that tool sprawl is a significant challenge for enterprise IT teams, and the need to reference multiple disparate tools and datasets makes addressing application performance issues incredibly difficult and time consuming, which has obvious impacts on end-user experience, revenue and brand reputation," said Shamus McGillicuddy, Research Director at EMA Research. "ThousandEyes' approach to consolidate and correlate different views of all the different potential sources of application performance issues will be very compelling for teams looking for that singular view, so time can be spent on issue remediation versus issue isolation."

Proven Solution

ThousandEyes Synthetics is already in use by some of the world's largest enterprises and fast-growing SaaS providers alike. Schneider Electric, the global leader in energy management and automation, will share why it selected ThousandEyes Synthetics to support its global roll-out of Salesforce Lightning to its more than 45,000 employees in an upcoming webinar, taking place September 17, 2019 at 9:00 a.m. PT. ThousandEyes Synthetics is currently in limited release and will be generally available later this year.

### How to address the technology skills gap

The UK's position as a global technology powerhouse could be in jeopardy due to the lack of new talent entering the industry. It has been well documented, so will come as no surprise to most, that the sector is experiencing a chronic skills shortage, with employers finding it hard to match jobs to suitable candidates. Research has shown that almost 80% of tech employers have indicated that a shortage of suitable candidates is their number one recruitment concern, with 75% having encountered a moderate or extreme skills shortfall. There's a real fear that if this deficit deepens, the country's coveted place at the fore of the global digital economy will be impacted.

The technology sector plays a vital part in the economy, and in order to continue to realise its potential, it is critical that ambitious people with the right attitudes and attributes are found to become the next generation of tech pioneers. It's a sector that's worth almost £184bn to the UK economy, according to the 2018 Tech Nation Report. The same report stated that the figure was up from £170bn in 2016, and that the turnover of digital tech firms rose by 4.5% in the 2016-17 period, comparing favourably with overall UK GDP growth of 1.7%.

It's a fast-paced, dynamic environment that is changing and growing all the time, and skills are needed to develop emerging areas, particularly fintech, cybersecurity, artificial intelligence, biotechnology and eco-tech systems. The jobs out there are many and varied, covering software development, data analysis, artificial intelligence, cybersecurity, cloud and much more. Right across the industry, essential jobs need to be filled to satisfy the global demand for digital. This demand will only get greater, and failure to find the right people for posts could result in other countries overtaking us, pushing ahead in these fields and gaining a competitive advantage.
We all need to shoulder increasing responsibilities to address the recruitment challenges, and can do this working in conjunction with schools, colleges and universities, as well as with agencies and governments. Students across the UK, armed with their exam results, are currently considering their future studies and careers based on their grades. As an industry we need to showcase digital technology as being a rewarding sector to choose because of it offering incredible variety, scope and reward. In England, Wales and Northern Ireland, students recently received A level results, just a few weeks after their counterparts in Scotland received certificates for their Higher and Advanced Higher grades. Higher education won’t suit all school leavers and now is a good chance to source individuals that have achieved great results in Science, Technology, Engineering and Maths – the so-called STEM subjects – by offering bespoke opportunities. It’s one solution to the problem that the industry can itself offer up, by opening up avenues into the industry through using training schemes and apprenticeships. As a small-to-medium enterprise (SME), based in a rural part of Scotland, Clark Integrated Technologies provides apprenticeships to school leavers in an effort to attract raw talent. It might seems that one company sourcing new blood through the apprenticeship route is just a drop in the ocean towards addressing a nationwide skills shortage. However, we have developed excellent links with schools and training providers as a result, and that will stand us in good stead for the future. In addition we are equipping young people with relevant skills for the jobs of today and tomorrow. We hope our young employees will stay with us, but by giving them a first class grounding in sought-after technology skills, we appreciate that there are many other rival opportunities open to them. So far though, this approach has had the right outcomes for Clark IT, as several of our apprentices have progressed into other positions within our company and are making a valuable input to our operation. It’s not just about getting young people into the industry – all untapped pools need to be looked at. Nowadays the concept of a job for life has almost become a thing of the past and people are changing pathways more often. These are employees that have workplace experience and they can retrain to gain the requisite skills for careers in technology. It ticks the box for lifelong learning too. The UK Government’s Technology Innovation Strategy sets the foundation for innovation through emerging technologies, in relation to digital, data and technology. The Scottish Government’s cyber resilience strategy supports developing the ability to be able to prepare for, respond to and recover from cyberattacks, so positioning Scotland towards greater cyber resilience. There’s no quick fix solution to digitalisation’s skills shortage but the big picture is about having a shared responsibility, and using joined up thinking, to ensure future ambitions are met and advances continue to be made. That way we can continue to find the innovative leaders and have the right technological culture for the future. 
However, more effort needs to be taken to achieve that aim, otherwise growing and developing expertise within in the globally competitive disciplines could be a real struggle – and UK technology firms could lag behind without having the right expertise.[/vc_column_text][/vc_column][/vc_row] ### Can you sum up your business idea in three sentences? [vc_row][vc_column][vc_column_text]The answer to this question is definitely ‘yes’. And more, you should sum up your business this way. If you can’t, you probably haven’t thought things through with enough rigour. The three-sentence pitch is often called an elevator pitch, the idea being that you can pitch your idea to a potential stakeholder in the time it takes to ride an elevator (or a lift, if you’re in Britain). We’ve heard many such pitches that assume that the elevator gets stuck half way. You should be able to write your pitch on the back of a beermat. Three simple, clear sentences: pain, premise and proof. The first sentence, pain, does two jobs. It defines your customers and it says what pain you solve for them. Defining customers is important – we often meet entrepreneurs who say their idea is for everyone. This is a danger signal – something for everyone usually ends up so general that it doesn’t appeal strongly to anyone, so misses out to more targeted offers. Yes, there are exceptions. Microsoft sought to ‘enable everyone to harness the power of personal computing’ and haven’t done too badly out of that. However most new businesses serve smaller niches, and a lot of their success comes from understanding exactly what niche they serve and exactly what the inhabitants of that niche need. We say ‘need’, not ‘want’. A serial entrepreneur we met in America told us that his motto in looking for new business ideas was ‘Where’s the pain?’ This struck us as an infinitely superior approach to that espoused by many marketeers – ‘How can we sell this?’ We have seen too many businesses that are solutions looking for problems, and they usually falter. We both come from a services background, so may be a little biased here. But even pure consumer offers are better off thinking of pain. JK Rowling didn’t sit down and think that the world had a desperate need for books about boy wizards. She thought people might enjoy such a book. But once the Harry Potter series took off, her fans did feel a real need for the next book in the series – and that was when the big bucks started flowing in. So there’s your first sentence. An example: ‘We take away all the hassles of relocating offices for businesses in the South of England.’ Your second sentence is the premise. How, exactly, do you solve that pain? There’s no need to go into great technical detail – it’s only one sentence, after all – but a brief description of what you do is needed. If you can frame that as something unique, that’s even better. ‘We deal with all aspects of relocation, from transporting physical objects to helping staff find good houses and schools for their children – nobody else does this.’ If other people do offer a similar service, you should be thinking about how you differentiate, but don’t go crazy worrying about this. If you have a good sales team and provide genuinely good value, you can get business. A ‘USP’ will emerge once you start working with clients and problems emerge which you then solve on the hoof. Proof is essential, because, sadly, people lie, so wise customers are wary. It is best quantified. 
'We have 120 customers, including XYZ PLC.' 'Jan Smith, from XYZ, said we saved them £100,000 on a move they had to make last year.' If you have no numbers you can use, then a customer endorsement is good (though nothing really beats numbers). 'Jan Smith, from XYZ, said we were the best relocation company she'd ever worked with.'

The thing not to do, at any point in the pitch, is to parrot a slogan. Slogans are for big companies trying to lodge their mass product in the mind of the busy, distracted public. 'Should have gone to Specsavers' is a catchy current one. Esso told our dads to 'put a tiger' in their petrol tanks. But these say nothing about the actual nature of the business. Any other optician or petrol company could have come up with the same slogan. As a start-up, your pitch must be specific and full of information, designed to inform an individual whose attention you already have, even if only briefly.

Everyone in the business should know your pitch by heart and should be happy to repeat it at any time. It's worth getting a professional in to train everyone to do this in a way that is natural, not stilted. They can then do it whenever anyone asks who they work for. They can do it at conferences. Even at events where there may be no potential customers, you still want people to know who you are (word gets around). They can even do it socially. Every employee is an ambassador for the business, and their first duty as such is to know the pitch and to be able to present it to the world when asked. The pitch should also appear on your website header and in any relevant LinkedIn or Facebook profiles.

Being able to sum up your business in three sentences focuses your mind, and the minds of your people, on what it is you do. It impresses others, who will like the fact you know what you are about and are proud to tell the world. If you can't do this right now, sit down with the team and work out what your three magic sentences are: pain, premise and proof. It will be time very well spent.

Article by Mike Southon and Chris West, authors of The Beermat Entrepreneur, published by Pearson, available now.

### Big Data - New technologies against fraud

One of the biggest hurdles in the credit card segment is hackers harvesting illicitly obtained credit card data on a global scale, using sophisticated techniques so that fraudulent purchases normally go unnoticed. Improving technologies promise deterrents to credit card fraud. However, the stakes are very high and it is a never-ending battle, since hackers are not only knowledgeable in the very same techniques but also innovate their methods frequently. Compromised credit card information has left e-commerce companies, banks and credit card companies needing highly technical software solutions to avoid losing billions of dollars. Among the many solutions on offer, those based on big data analytics of credit card activity are the safest bet. The adage that prevention is better than cure applies well here, and machine learning techniques combined with big data systems can counter such fraud.

The typical process: Financial institutions that issue credit cards, or companies that offer credit facilities, usually create an exhaustive user profile of their customers.
Details held in this profile include the user's mobile number, call centre conversations, social media accounts, data from the customer's devices and more, helping the analytics systems collect and use information from multiple origins and sources. The AI behind the scenes analyses typical customer behaviour, drawing on very large big data sets across disparate sources. A red flag is raised if any unusual activity or deviation from normal buying patterns is observed. The customer then receives an immediate alert asking whether the transaction was indeed made by them. Previously, phone calls were made, but automated messages now serve the purposes of both privacy and personalisation.

Typical red-flag scenarios pointing to illegal or shady activity that immediately arouse suspicion are (a minimal rule-based sketch of these checks appears after the parting notes below):

- Use of a new, unrecorded device for credit card transactions.
- Several rapid transactions from different devices in a short span of time, all on the same day.
- Multiple transactions on the card occurring in far-apart locales and cities within a few minutes or hours of each other.
- A transaction amount that is higher than the expected monthly spend and the usual pattern of buying behaviour.
- The card suddenly being used for high-volume and large purchases.

Constant vigilance by the system flags unusual parameters and behaviour in its trend analysis. If a transaction is red-flagged, the client is asked to verify the purchase. Since the credit card segment is all about customers and the safe use of their cards, the personalisation features of the software enhance the user experience and also allow customers to track their own purchase history. For example, cardholders can choose to cap their transactions at a particular limit and be alerted if the limit is exceeded. They can also enable authentication requirements for each transaction simply by opting in to alerts on their profile page, which sets off triggers in the analytics behind their card and profile.

Future needs: Automated fraud detection is more complicated than it looks, and blocking credit cards can be a double-edged sword for both the client and the financial institution. Imagine the card being used in a different or foreign locale. Blocking the card is an option that needs caution: it should be used when the cardholder has not informed the bank of their plans and, most importantly, when the mobile number, social media posts and location do not match the transaction details, to avoid a false positive and inconvenience to the client. More complications arise when customers' buying behaviour changes, which affects financial companies as more legitimate purchases get flagged. Thus, to avoid customer inconvenience, fraud detection techniques need constant overhauling and innovation with the latest data, ensuring the AI keeps learning by being fed fresh data.

Parting notes: Data collection is the foundation of fraud detection. Check the data privacy rules applicable to the relevant country and locale before gathering data on clients. New data and new fraud detection techniques should be applied continuously to stay ahead of fraudsters, who use data for wrongful means and are experts at exploiting the gaps in data analysis in the credit card segment.
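To make the red-flag checks listed above concrete, here is a minimal, rule-based sketch. The field names and thresholds are illustrative assumptions rather than any provider's real system, and a production fraud engine would combine rules like these with learned, per-customer behavioural models.

```java
import java.util.ArrayList;
import java.util.List;

public class FraudRedFlags {
    // A simplified view of one card transaction (illustrative fields only).
    record Transaction(String deviceId, String city, double amount, long epochMinute) {}

    // Illustrative thresholds; a real system would tune these per customer profile.
    static final double TYPICAL_MONTHLY_SPEND = 1200.0;
    static final double LARGE_PURCHASE = 800.0;

    static List<String> redFlags(Transaction tx, Transaction previous, List<String> knownDevices) {
        List<String> flags = new ArrayList<>();
        if (!knownDevices.contains(tx.deviceId())) {
            flags.add("new, unrecorded device");
        }
        if (previous != null && !previous.city().equals(tx.city())
                && tx.epochMinute() - previous.epochMinute() < 120) {
            flags.add("transactions in far-apart locations within a short window");
        }
        if (tx.amount() > LARGE_PURCHASE || tx.amount() > TYPICAL_MONTHLY_SPEND) {
            flags.add("amount far above the usual spending pattern");
        }
        return flags;
    }

    public static void main(String[] args) {
        Transaction previous = new Transaction("phone-1", "London", 42.0, 1_000_000);
        Transaction current = new Transaction("laptop-9", "Dubai", 1450.0, 1_000_060);
        List<String> flags = redFlags(current, previous, List.of("phone-1"));
        // If any flag is raised, the customer would be asked to verify the purchase.
        System.out.println(flags.isEmpty() ? "No red flags" : "Red flags: " + flags);
    }
}
```

Even this toy version shows why false positives are hard to avoid: a genuine holiday purchase made abroad on a new laptop trips all three rules, which is exactly the scenario the article warns about.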
If you would like to make a career in the field of big data, consider the big data analytics courses at Imarticus Learning. They not only train and develop your skills but also offer assured placements in this high-paying, always in-demand sector. Why wait?

### 9 tips for creating eyeball grabbing content that drives conversion

Most content begins with written words rather than visuals. So it doesn't matter what kind of content you create, whether it is beauty-based content, health-related blogs, or anything else you wish to write; you can profit from understanding the secrets of professional writers. One of the biggest struggles content marketers face today is producing sufficient content while keeping the quality high. Forget about competing writers and focus on the quality of your own content. That's something professional writers must work through daily. If you're a content writer who is struggling to produce exciting and engaging content for your customers, here are 9 pro tips that can help you make your writing more effective:

Be interested in what you are writing
If one thing is guaranteed to get readers interested in your writing, it's being passionate about what you're writing. You can usually sense someone's passion for a subject from their writing; it makes the writer and the reader equally interested in that subject, and it makes the piece more engaging and lively by injecting excitement and vitality into your words. It's hard to convey passion for your least favourite subjects, and this will come across in your writing.

A great headline is a promise
Your headline makes your content better, and it is the first thing your readers notice about your landing page or blog post. What makes a headline great is that it gets people interested enough to learn more. Whether or not the headline is excellent will decide if people proceed to read the content or bounce. Here are some easy guidelines: highlight the benefit to your reader, ensure the headline is clear and to the point, and make your headlines resonate with your reader's point of view.

Connect readers with your intro
Your headline should draw the attention of readers, and your intro should intrigue them enough to make them stay on your page. Your headline briefly tells them what promise you can make; your three- or four-sentence intro tells them what they can expect from your content. It needs to be clear and to the point, clearly illustrating how readers can profit from the post. It should make them want to read more; otherwise, why should they keep reading?

'Telling a story' helps to bond with your audience
We all love good stories. Storytelling within your marketing or content strategy is one of the best ways to draw the attention of the audience. Along with relatability, stories provoke a more personal and emotional experience. Whether it's happiness, frustration or anger, stories make people feel something. A good story is entertaining, relatable, unique, memorable and relevant.

Give relevant and useful information
Before you begin working on any content, you should consider two questions: is it relevant, and is it beneficial? Determining what your blog will give to your readers will help you work out how to write useful content. Educate them on something they don't know about. Give them something to wonder about, an offer they can't ignore, or some tips and tricks that will improve their business or their website.
And always bear in mind that creating great content is a mixture of passion, technical discipline and your own way of staying productive.

Don't forget to add visuals
Adding visual elements to your text content tends to draw attention better. It is an excellent way of breaking up long pieces of content, making your content easier to understand and scan. Adding visual media like screenshots, videos, infographics, charts and graphs keeps people on the page longer and is more likely to result in likes, shares, and comments.

Focus on quality over quantity
High-quality content is the king of content marketing. Create content that is always "valuable" to your audience. Just because quality is essential, it doesn't mean quantity has no value; quantity matters too. Quantity will get you into the game. Quality will take you to the top. Here are some bonus tips to consider for your next piece of content: use lists, format wisely, use keywords, write for your audience, and use headers and subheaders.

Set the tone
It's time to set your brand's voice. Which do you prefer: cute, humorous, serious or sarcastic? Set the tone based on your brand values and the audience you expect to engage.

Have a compelling call to action
CTA buttons help increase conversions by showing visitors what to do next. That might be encouraging the audience to subscribe to your channel, navigate to a landing page or download your latest e-book. Experts suggest having multiple CTAs on every page; however, the two most productive places for a CTA are the middle of the content and the bottom of the page. These are some of the most practical ways to create high-converting content. By following these tips, you can improve your content and boost conversions.

### Don't let the FUD cloud the cloud

No-one has ever said that traditional financial services companies are leading edge when it comes to IT, but the use of public cloud seems to be an area where some FS companies are still struggling even to climb onto the ledge, let alone fear falling off it. As a technical consultant evangelising about the use of public cloud, it's very easy to attract criticism; the level of FUD (Fear, Uncertainty and Doubt) can sometimes be deafening. The reality is clear though: Fintech and Insurtech companies are already there. And the reasons? Cost, ease of use and the continued innovation that the big tech suppliers are offering (in their ever-expanding cloud product range) all allow the start-ups to innovate quickly, build cheaply and pivot when they need to.

The recent results of Microsoft, when they became a trillion-dollar company, were driven by a 73% surge in revenue from their Azure cloud. Amazon Web Services (AWS) revenue grew 41% in the quarter to $7.7 billion. These companies are only going to increase innovation and offerings with that kind of return, and the gap in capability between what you can do with your own data centre/IT and what you can get from public cloud is only going to grow wider. The Fintech and Insurtech companies are already taking small chunks of big FS companies' business (Starling with their current account is a great example), and you can be sure that if Amazon decides to target your market in FS, they are going to take full advantage of all the functions that AWS can offer them, and at the cheapest cost.
This assumes, of course, that they haven't already - I can now buy my Kindle with 5 months of interest free credit at the touch of the 'Buy' button. If, as a large FS company, you are only just starting to consider moving some development servers to run on 'Infrastructure as a Service' in the cloud, you're not on the slow boat, you've missed the boat entirely. As Werner Vogels, the CTO of AWS, said a couple of years ago “virtual servers in the cloud are already legacy”. Your competitor is going to be running their business with native cloud offerings such as ‘Code as a Service’ and ‘Function as a Service’ (FaaS) - no servers, no patching, no upgrades, no out of support software and no owned data centre costs. They will be paying just thousandths of a pence for each call to the code (did I mention that the first million calls are free AND the solution will scale automatically from 1 to 100 million users?). They will be taking advantage of the huge investment that Microsoft and Amazon have made in AI and Machine Learning, by simply calling functions (via API) to do voice, text and image recognition that before would have been out of reach of most companies. And it’s not just start-ups – some of your competitors are starting to go public cloud native as well. At an AWS finance event late last year, David Knott, Chief Architect, talked through three solutions that HSBC have implemented, using native cloud offerings, to help transform the HSBC business. By embracing and taking full advantage of what these features can offer, they are evolving a legacy estate. So are there perhaps some unique regulatory obstacles to adopting the public cloud? The FCA has tried to help with its FG16/5 Guidance to Outsourcing, which covers cloud usage among other topics. In that guidance the FCA states, “We see no fundamental reason why cloud services (including public cloud services) cannot be implemented, with appropriate consideration, in a manner that complies with our rules”. As always with regulation, however, the devil is in the detail (or lack of) when it comes to practicalities. How much effort do you need to expend to develop an ‘Exit plan’ from your public cloud supplier? Do you really need to build your IT solutions to the lowest common technical denominator so that you can ‘easily’ move to another cloud provider? The time and effort involved in these activities is arguably better spent building secure solutions using native public cloud components which will reduce risk in other areas – no servers to patch, no upgrades to operating systems, no hardware failures to manage. These are much bigger risk areas than the theoretical need to change cloud provider; Microsoft and Amazon are going to be around for quite a while yet. It is also interesting the different perceptions that public cloud has compared to other SaaS solutions. Many companies will completely rule out using a cloud provider’s cheaper, serverless database – as it is shared via segregation across multiple customers – but will quite happily sign up to a Salesforce contract where, of course, their instance is sitting on a database that is shared and segregated. Yes, these risks and others do need to be carefully considered (security, skills, cost, lock-in), but in the meantime, don't let the FUD cloud the future of your IT strategy. (In the interests of transparency – Darren has a Starling current account, uses AWS & Microsoft Azure…and owns a Kindle!)[/vc_column_text][/vc_column][/vc_row] ### Is open source the key to AI’s future? 
[vc_row][vc_column][vc_column_text]AI is one of the key technologies set to transform both our personal and professional lives in unprecedented ways. According to a recent study from Stanford University, in the last 20 years there has been a 14-times increase in the number of AI startups, while in the UK, funding for AI developers from venture capital increased more than 200 per cent in 2018. At the same time, the AI space has seen a growing number of technology giants - including Microsoft, Salesforce and Uber - open-sourcing their AI research.  Investing, or “giving back,” to the open source community has helped developers worldwide create and improve AI & Machine Learning (ML) algorithms faster. Open source software is now crucial to driving fast, reliable, and secure development in the AI space. But why did both industry giants and start-ups alike decide to embrace openness, and how will it affect technology and science moving forwards? Why enterprises are turning to open source Open source software started to play a significant role in IT development across industries following the launch of Netscape Navigator, the first open source program, in 1998. The strategy Netscape chose was to emphasise the business potential of sharing the software’s source code. As with science, if all researchers kept their methods secret, progress and innovation would take place much more slowly. With developers racing to deliver “the next big thing”, secure and easy-to-deploy software frameworks are essential to supporting this. The high costs of AI and ML model development are usually driven by the computing power required, as well as having enough data in place to build and train an advanced model. Additionally, the skills gap is usually a big challenge for enterprises - according to recent reports, despite the availability of millions of AI-focused roles globally, there are only 300,000 professionals able to fill them. Open source software allows IT teams to access frameworks, data sets, workflows, and software models in the public domain and as such reduces  training costs. At the same time, the open source community is always monitoring the code for flaws and vulnerabilities - adding an extra layer of security and also making such concerns a common responsibility. As an example, Kubernetes - the open source platform which automates the deployment and management of containerised applications, including complicated workloads like AI and machine learning - can be a facilitator, as it takes a large amount of ongoing effort required to keep cloud applications up to date. Openness and trust are essential for AI Open source disrupts the development of cutting edge technologies by fully democratising it, allowing any developer or IT team to facilitate cheaper, faster, more flexible and secure deployment. Developing in the open helps accelerate the adoption of numerous frameworks and software solutions through support from a large community of contributors. And, as mentioned, large technology companies are demonstrating a commitment to contributing to and supporting the open source community, making AI and ML frameworks accessible to everyone. Google has been very active in opening their research to the public with Tensorflow, their popular machine learning framework, now used by companies including Airbnb, Uber, SAP, eBay and others. Following in Google’s step, Amazon has started opening up its internal machine learning courses to developers. 
The company sees this as an opportunity to hire more efficient people and accelerate machine learning growth. The latest addition to the AI/ML open-source club is Facebook, with its Deep Focus (the AI rendering systems data and code) becoming publicly available. With such established tech companies betting so heavily on the ‘openness‘ of AI, it is clear that AI development will continue to transform over the next few years. An open source community working with AI and ML can accomplish its goals more easily and quickly by eradicating barriers, such as high licensing fees and limited talent, from getting in the way of delivering true AI workloads. In particular, knowledge sharing between companies allows developers to have access to trusted, secure and easy-to-deploy solutions, moving entire industries forward by making AI more accessible. Join the community and help democratise AI![/vc_column_text][/vc_column][/vc_row] ### The future of programmatic advertising [vc_row][vc_column][vc_column_text]Over the past ten years, programmatic advertising has become the cornerstone of most online media strategies. By harvesting data, advertisers have been able to provide personalised messaging, which is more relevant to the user, and therefore more lucrative for both the advertiser and the publisher. The sheer volume of data companies could access allowed them to dynamically tailor their messaging to be more relevant to users than ever, with ads adapting in response to demographics, locations, devices, the time of day, and even the weather. But recent regulation has limited the ability of publishers and advertisers to harvest and use this data, and it looks like increasingly strict laws are set to turn the industry on its head. So, just what does the future hold for programmatic advertising? In this article, I’ll explore a few of the potential outcomes we could see over the next few years. The regulatory squeeze is starting to take effect Data is the key to success for any programmatic campaign, and the raft of data protection legislation we’ve seen of late — such as GDPR — could be about to make customised advertising much trickier. It’s certainly true that, within the EU at least, there has been an intense regulatory squeeze on companies holding personal data. Companies that were able to freely harvest user data without consequence are now facing a regulatory roadblock that prevents most essential information from being used. There will inevitably be work-arounds, which at the minute seem to be intrusive cookie acceptance notices, but I'd imagine over time (especially within the EU) it will become even harder to harvest so much user data. In my view, programmatic practices need to adapt to new technologies that will completely change the fundamental model of what programmatic advertising is. Otherwise, I think there is a finite amount of time left for it continue to provide value for advertisers in its current state, especially in the EU. New tech could give users greater control of their personal data Starved of the data that is so essential to an effective programmatic campaign, I expect that advertisers will need to use new technology to engage users in a different way. I believe the next evolution in the space will see companies allow users to own, use and monetise their personal data: people will be given control over how — and when— they are advertised to. As a result, both advertisers and publishers will need to work harder to earn their engagement. 
We're already seeing a lot of this sort of innovation coming out of the blockchain and cryptocurrency space, with new browsers that allow users to control their privacy and accept advertising at their discretion. Users could even start paying publishers directly for viewing their content via cryptocurrency.

Cryptocurrency will play a crucial role
One example of the new role of cryptocurrency in advertising is the Brave browser, from the developers behind JavaScript and Mozilla. Brave strips the ads out of sites, allowing the user an ad-free browsing experience. But it also incentivises them to engage with advertising by offering them a chance to earn a form of cryptocurrency if they choose to be shown ads, which they can then spend on voting for their favourite publishers. The model offered by the Brave browser is still controversial; understandably, not all publishers are pleased with the idea of having their ads removed and replaced with ones that drive revenue for Brave. Nonetheless, it certainly represents a shift in the role of the user. With traditional programmatic advertising, users take a passive role, having little say in what data is harvested and how it is used. But with a business model like the one offered by Brave, users have better control over their data, and advertisers are, to an extent, forced to respect their privacy.

Web monetisation
The other piece of technology that I think has the potential to seriously disrupt the space is web monetisation, which is being proposed as a W3C standard using tech called the Interledger Protocol (ILP). It essentially allows for the real-time streaming of micro-payments in the form of cryptocurrency, transferred from the user to the site they're visiting, as an alternative to viewing advertising. Companies like Coil and Puma Browser are examples of businesses building the groundwork for this technology. Much like the Brave browser, they claim to champion the privacy of the user and offer a browsing experience without ads or tracking.

I think programmatic advertising is unlikely to survive in its current form, and companies must turn to new technologies if they want to engage users and continue to profit from ads. Looking to the future, I expect that cryptocurrency and pay-to-surf business models will become much more prevalent, disrupting the traditional relationship between users, advertisers, and publishers.

### How technology will impact and enhance the care industry

Care homes were previously notoriously behind the times when it came to technology, but the sector is slowly catching up, and that old stereotype is becoming less true by the year. The role that technology plays within care homes has become increasingly prevalent over recent years, with many care homes adopting solutions such as cloud software, smart home capabilities and AI technology. This technology enables both residents and employees to enjoy the associated benefits. Having said that, for every care home welcoming this new technology, there is another that shies away from it. Shockingly, only one in five care homes has WiFi, and even those that do often have dead spots and no reserved bandwidth for essential services. Care homes across the country need to stop being wary of new technology and start implementing it so they can feel the associated benefits.
In the article Blueleaf discuss the most recent up and coming technology trends and how they are being implemented within the care industry. Artificial intelligence There are aspects of Artificial Intelligence (AI) that have been introduced into care homes across the UK, allowing for the monitoring of patients to predict the need for early intervention. This particular technology will allow for an extended ecosystem; making the delivery of 24/7 care a possibility. AI-based tools allow the tracking of habitual behaviour in residents, can spot changes in real time and provide a platform for the end to end digitisation of healthcare. These innovations are transforming the way in which care is delivered and improving the quality and personal aspect of care, which in turn will alleviate the pressure on carers and families alike. Pending AI tech from forward thinking businesses like Novacare, aim to provide support to employees that will speed up processes and deliver insightful information to managers. Solution software companies are developing care-related technology which will utilise AI, wearable technology, and big data analytics, helping to reduce the need for admin related tasks. Stephen Wilson, Director of Novacare, said, “There’s no doubt that artificial intelligence and robotics are going to shape the future of care. In fact, by 2023 it’ll be technology that’s commonly used throughout the sector in the UK. However, these developments might not be in the way most people think of. Often, when AI and robots are mentioned in relation to care, people picture robots as the primary carer, with the human side of the profession falling to the wayside but this isn’t the case. “Every way the technology is developing suggests that the opposite will be true. At the moment, care employees face a huge amount of paperwork and routine tasks that eat into their time. AI and robotics aim to relieve some of this strain. Rather than taking time away from care home residents, when used correctly, AI and robotics will allow carers to better focus on peoples’ needs, including social and emotional needs that are often overlooked in the current system.” Smart home technology integration While smart devices increased in popularity across domestic homes in the early 2000s, it wasn’t until more recent years that they were introduced into UK care homes. Home hubs are at the core of smart technology and act as the main control for managing everything from lighting, to thermostats, and playing residents’ favourite songs. The term “smart lighting” encompasses various forms of lighting - from nightlights found in residents’ rooms, to large-scale lighting solutions. The biggest benefit from dynamic lighting was enjoyed by dementia residents, with regular rest helping them to feel less agitated. Lighting can affect patient's body clock as the different colours of light have varied wavelengths that the human body responds to in different ways, but certain lighting methods can be used to regulate patient’s body clock. The cool blue light of the morning kick starts our body clock by promoting a state of alertness, preparing us for the day, whilst warm yellow colours (representative of dusk), make a person feel sleepy.  The hormone melatonin, which the brain releases towards the end of the day is the science behind this, as the release of it causes us to feel drowsy. 
A cool blue light in a care home dining room at the end of the day is not conducive to a relaxed and restful evening for residents as white and blue based lights will inhibit the secretion of melatonin, interrupting our body clock and upsetting our usual sleep pattern.  Software solutions Software solutions have allowed for digitisation of records including care plans, residents’ medical records, and staff employment and management records. This has led to the optimisation of operational and administrative processes in UK care homes. Systems such as eMar have played a large role in reducing the pressure on staff within care homes by providing a faster and more efficient way of recording resident information . Whilst eMar manages medication administration processes, there are several types of software solutions that care homes can invest in which will speed up administrative tasks such as day-to-day care planning, budgeting, staff rostering and supply chain management; allowing for a more efficient business model and enhanced care for patients. By adopting relevant software solutions, in addition to providing an enhanced quality of care to clients, efficiency and productivity will be improved. Many care establishments are recognising the benefits of cloud-based solutions and are performing more effectively as a result. Future predictions  While in recent years, the care sector has upped its game in the implementation of technology; there is still a way to go. Robotics, AI and machine learning technology are expected to play a large role in enhancing and transforming care homes in the future. As previously stated, technology is expected to benefit not just patients by combating severe issues such as loneliness, but care professionals as well. Assisting with daily core tasks, reducing workloads and broadening health care functions will allow carers to devote more time to residents to provide compassionate care.  Robots are in the very early stages of being introduced in care homes across the world, in an attempt to combat loneliness. However, it’s the future possibilities that are especially exciting. Research has shown that loneliness is just as detrimental to a person’s health as smoking 15 cigarettes a day. This epidemic is predominant in the elderly, with over one million UK citizens over the age of 75 not speaking to anyone in over a month. While living in a care home can help to reduce feelings of complete isolation, loneliness increases mortality rates by 26%. It’s predicted that in the future, robots will combat this, by offering companionship. Issac Theophilos, author of How to Get Outstanding: An Ultimate Guide for Care Homes commented on the prediction of adopting robotics and other technological advancements within the care industry: “The ability of the machines to learn is multiple times better than humans. The power of machine learning can be used to provide voice support to help people with simple tasks such as calling for help, turning off the lights, adjusting the room temperature, and so on. “I assume a lot of management or operational roles would be made redundant if technology uses its full potential in the years to come. While several hotels have experimented using robots serving their customers, I am still waiting for that moment when a robot knocks on the door of a resident's room to offer a cup of tea”.[/vc_column_text][/vc_column][/vc_row] ### Is Cloud Computing the Next Natural Step in Tech? 
[vc_row][vc_column][vc_column_text]Cloud computing seems to be on everyone’s mind these days. But is it really the best way to handle large amounts of data at an affordable price? We approached the topic and analyzed it from the perspective of companies that work with big and small data. Let’s face it: a few years back, there weren’t that many companies who loved the idea of storing important information in the cloud. The data had to be kept on-site, in secure servers that could be watched and maintained by the company employees. However, with the rapid development of big data, it seems that local storage facilities can’t keep up. As a company, you either invest more in equipment, personnel, and storage spaces, or you make the leap and put everything in the cloud. Furthermore, due to recent technological developments, cloud storage solutions are more secure, cost less, and don’t require hiring specialized people for maintenance. So, you can see why more and more companies make the switch and give up on standard data centers. In fact, the industry has grown so quickly that, even though it’s the next natural step, changing to the cloud can feel confusing. So, to find the right solution for your needs, we will provide some helpful tips that will come in handy to businesses of any size and type. Choose the Provider for your Needs The good news is that there is a cornucopia of cloud providers on the market, for any type of business. However, this doesn’t mean it is going to be easy to select the one that works best in your favor. In this situation, you need to start with a list of needs. Ask yourself: what exactly do you need from cloud computing? Do you need more storage space, faster data access, software systems to use on the go? Cloud computing offers include a diverse list of services, solutions, and resources, so take your time to understand your needs before signing in for anything. Once you have a well-defined list of needs, it will be easier to select one out of the best cloud providers on the market, starting with Amazon, Microsoft, and Google. Experience on the Market As you can see, some of the largest IT companies in the world are invested in the development of cloud computing services. With such names, you know with certitude that they have experience in handling high amounts of data and that they invest in highly-reliable security solutions. Still, you shouldn’t take the brand for granted! Conduct your research, to make sure the company hasn’t been breached recently (we had several impressive breaches in the last year). And, if they did, learn about how they handled the situation and the improvements made to reduce the risks. Data breaches may happen (they are an unfortunate consequence of tech advancements), but it’s important to learn how to deal with them to avoid the same situation. And this is why a company with experience in the market and resources will always be the best choice. Price Luckily, most reliable solutions available right now are scalable. This means that you will only pay for the resources you’re using, which is a fantastic option! Furthermore, cloud solutions are not just about storage; they also provide users with accessibility, flexibility, and versatility through the apps they offer. Of course, the price will be different from one package to another, which is why you need to have that list of company needs we discussed above. 
Keep highly sensitive information locally
Preparedness is the best practice, which is why we recommend a system where you separate data into "regular use" and "highly sensitive" categories. Data that is considered highly sensitive should be kept in a local data centre, administered by a trustworthy system manager with experience and/or training in cyber security. Access should also be limited to the personnel who strictly need it, to reduce the risk of exposure through human error. By implementing this system, you can make full use of the power of cloud computing while still protecting the core that drives your business.

In conclusion
It looks like cloud computing is no longer the future, but the best solution that's easily available right now. Sure, we still see data breaches from time to time, but these can also happen to a standard data centre. If you know how to select your cloud solution providers and understand how to use this option to your company's advantage, you have nothing to worry about.

### OpenJDK Plans to Bring Java to iOS Platform: A Feat to be Remembered

In recent times, the OpenJDK community has been buzzing about a proposal that could change the dynamics of mobile development: a plan to bring Java to Apple's iOS. The news was broken by Johan Vos, CTO of mobile developer Gluon. The effort aims to deliver the OpenJDK classes and APIs for iOS and Android, and Vos recently outlined the work in an email to the community. Notably, OpenJDK Mobile centres on offering the same APIs found in the current OpenJDK source repository on both iOS and Android, using the same tools Java developers already know. The primary focus at present is on iOS because it does not support Java in the traditional way: Apple does not allow a Java Virtual Machine to run on the platform. To make the most of this trend, it may be worth engaging a reputable iOS app development company for iOS app development services.

Meanwhile, the new JDK 13 is arriving. The feature list has already been locked down: JDK 13 is due on September 17, 2019, with an initial release candidate expected on August 8. Let's look at the JDK 13 feature list (a short illustration of the first two items follows below). The integration of text blocks is at the preview stage; the first goal of this feature is to make writing Java programs simpler by allowing strings that span several lines of source code while avoiding escape sequences in common cases, and the second is to improve the readability of strings in programs that embed code written in non-Java languages. The second preview of switch expressions has also been proposed for JDK 13; although switch expressions appeared in JDK 12, a further change was needed. In addition, the Z Garbage Collector has been enhanced to return unused memory to the operating system. There are other features too, but these were the most important ones to highlight. Early-access JDK 13 builds are available from the jdk.java.net website for macOS, Windows, and Linux. As part of the new plan, OpenJDK Mobile hopes to employ the GraalVM ahead-of-time compiler, which makes it easier to compile code at build time.
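As a brief aside before returning to the mobile plans, here is a small, self-contained illustration of the two JDK 13 language features mentioned above. Text blocks and switch expressions were preview features in JDK 13, so this sketch assumes compilation with the --enable-preview flag; the class and values are purely illustrative.

```java
public class Jdk13Preview {
    public static void main(String[] args) {
        // Text block (preview in JDK 13): a multi-line string with no escape sequences,
        // handy for embedding JSON, SQL or HTML without clutter.
        String json = """
                {
                  "service": "payments",
                  "region": "eu-west"
                }
                """;

        // Switch expression (second preview in JDK 13): the switch yields a value directly.
        int dayOfWeek = 6;
        String dayType = switch (dayOfWeek) {
            case 0, 6 -> "weekend";
            default -> "weekday";
        };

        System.out.println(json);
        System.out.println("Day type: " + dayType);

        // Compile and run with preview features enabled, for example:
        //   javac --release 13 --enable-preview Jdk13Preview.java
        //   java --enable-preview Jdk13Preview
    }
}
```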
However, Vos confirmed that JIT compilation is not an option on iOS. The next step is linking the compiled Java code to native libraries, particularly those of the target operating system, thereby creating executables. This already works for iOS based on Java 11. With the combination of GraalVM native images and the OpenJDK classes, developers can create applications that follow Apple’s rules. This way, Java developers will not need to learn Objective-C or Swift to write software for iOS. Vos noted: “Although Java is late in the game of mobile, the added advantage with this technology is the fact that it is cross-platform, with security as a main feature. This way, it enables secure connectivity with cloud services, which makes it a serious contender for mobile development.” Java has been a part of Android development since its inception. However, Android is not Java 11-compliant and requires its own development tool, Android Studio, on top of its own processes. According to Vos, various developers are facing critical issues using Java projects and libraries on Android. On top of this, there is a plan to keep the fork synchronized with the OpenJDK master repository, implemented using Project Skara. With the help of the Skara-based repository, it would become possible to develop OpenJDK for iOS and Android. While the primary objective is to bring Java to iOS, the community is also planning to bring in Gluon’s own Eclipse plugin on top of the antiquated RoboVM tool for running Java on Android devices. All of this suggests that Java is set to make a major impact in the mobile app field, across both the Android and iOS ecosystems. It is a great opportunity for those working in Android and iOS development, who will be able to exploit the new opportunities opening up in the mobile industry. If you are a Java developer or come from the Android/iOS mobile ecosystem, it is time to keep up to date with announcements about OpenJDK Mobile.[/vc_column_text][/vc_column][/vc_row] ### Is your business Built to Grow? [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/338856259/e471b23dca[/embed] [/vc_column_text][vc_separator][vc_column_text]In recognition that entrepreneurial and business growth is closely tied to impatience, adopting a success modelling approach can certainly ‘fast track’ your success. Here are my top 7 must do’s when growing your business. #1 Understand your compelling WHY Think for a moment about WHY you do what you do. When you are able to answer not just WHAT you do, but elaborate on the reasons WHY you do it, it will change the way you make decisions and the value you place on things. It’s the point you acquire real purpose. Your WHAT becomes nothing more than a means of delivering your WHY. Be mindful of two common pitfalls. Don’t confuse your WHAT with your WHY and don’t fall into the trap of believing your WHY is solely about making money. The broad and diverse facets of building and running a business can be brutal and you will need a compelling reason to put yourself through it. Making money should be a measure of your success, the by-product of what you do. It should not be your primary WHY. Your WHY should be centred on making a positive difference and adding phenomenal value to others. #2 Fulfil a gap, need or want for your target customer Become fanatical about looking at yourself through your customers’ lens.
Adopt ‘outside in’ thinking to ensure your customer is the focal point of everything you do, how you do it and you are fulfilling a real gap in the market. You want your business to fly so don’t invest all your life savings, energy or resources developing your dream unless it meets your target customers perceived need or want. #3 Failing to plan is like planning to fail You have to have a plan which translates how you’re going to turn your WHY – into reality. You cannot leave it to chance. You are the architect of your destiny and should have a three-year line of sight on target financial goals, a twelve-month plan detailing strategic priorities and a ninety-day tactical plan. Initially you will need to invest time to create your plan, but you will reap the benefits as you make regular reality checks against your plan to evaluate what’s working and what isn’t. #4 Focus on your personal strengths and compensate for your weaknesses In creating your plan there will be areas which you enjoy and are good at and there will be areas which are daunting. Here’s the great news. You don’t need to be great at everything! Put in place people, resources, partners who will support you. The growth of outsourcing functions such as HR, Marketing, Finance and the creation of virtual teams you tap in to as & when you need it means you don’t have the overhead/costs directly, but you do have the expertise to hand. #5 Talk benefits not products and be obsessed with customer experience Features of your product or proposition should become apparent after the benefits have been communicated, absorbed, and acknowledged by the customer. If your customers can answer ‘so WHAT?’ or ‘why should I buy from you” you have been talking benefits. Combine this with a healthy obsession with delivering an outstanding customer experience and the prize will be true differentiation from your competitors and greater advocacy among existing and future customers. Your customers then become your sales force! #6 Turnover is vanity, profit is sanity AND… …. cash is reality! Managing your cash flow is critical and can be what stifles your growth potential if you get it wrong. Your cash management strategy is fundamentally about one measure: your debtor to creditor ratio. Nearly every business is both a creditor and a debtor and proactively managing the ratio between these two levers is key to being able to fund your growth ambitions. You are not there to provide a free credit facility for your customers – don’t let your debtors treat you like one. Consider how creative you are in positioning your value proposition and the associated payment terms and your ability to negotiate effective terms with your suppliers and partners. #7 Control the Controllable The important thing is not to get overwhelmed. You cannot control everything. Master the art of controlling the things, which are controllable, and you’ll have a stunning platform for accelerated, sustained and profitable business growth. Royston Guest is author of Built to Grow (Wiley 2016), a blue print to help entrepreneurs, business owners and professionals understand the guiding principles of accelerated, sustained and profitable business. See roystonguest.com for more details.[/vc_column_text][/vc_column][/vc_row] ### Can Blockchain Technology Impact Banking? We Analyse this Important Trend The financial services industry is all set to undergo a sea of change. 
This is because blockchain technology, which powers bitcoin transactions, offers many advantages compared to traditional banking. These include better accessibility, greater transparency, lower fees, and quicker transactions. Ignoring these benefits would be like using a handheld pager for communication instead of a smartphone. Even NASDAQ head Bob Greifeld has predicted that the global economy is ready for this change. Financial biggies such as Visa, Standard Chartered, DBS, and ING are already paying attention to the blockchain. Fintech is all agog about blockchain’s future. This industry has proved itself over the past few years with useful contributions to online forex trading, payment gateways, and bitcoin transactions. Fintech leaders believe they can go a step further with blockchain. Fintech impact on banking Though bitcoin has been used as a currency for quite a few years now, it is only recently that Fintech has attracted the financial sector’s attention. Fintech is expected to make an impact in three important ways: new customer generation, banks’ compulsion to make more profits, and tech improvements that merge the two. Millennial customers are ready Today’s young generation of clients has grown up on peer-to-peer dealings. They include profitable tech businesses, their employees, angel investors, and millions of closely networked customers. These customers are accustomed to online transactions, crowdsourced wisdom, and real-time results on a personal as well as professional level. Whether reading about suitable software solutions and SaaS trends in a reliable B2B directory, hiring the closest Uber cab, or getting details about the latest restaurant deals, these customers are ready to embrace Fintech’s promise of transactions without middlemen. [easy-tweet tweet="Banks face the major issue of cyber security and Fintech is actively addressing this problem" hashtags="#Fintech #Security"] New technology Fintech’s importance lies in the growing use of beneficial technology such as internet infrastructure and access, bandwidth, security, mobile platforms, and mobile app development. All these advancements are making this alternative transaction model go mainstream. Banks face the major issue of cyber security and Fintech is actively addressing this problem. This has resulted in industry-wide adoption and attracted more VC spending. Investment in blockchain was estimated to be $3 billion in 2013 and grew to a whopping $20 billion in 2015. The resources and capital being invested in blockchain show that this technology is growing at full speed. What does blockchain promise? Nobody is sure about the impact blockchain, and its improved versions, will have on banking over the next few years. But right now, it offers three main advantages over conventional banking, which is why Fintech is confident about its future. Quicker transactions Conventional financial transactions consume a lot of time as they travel through several third parties. For instance, a normal payment goes through a gateway, stock exchange, or clearinghouse. As many as ten entities may be involved in a transaction, handling messaging, reconciliation, or security, which means the seller receives payment only days later. Blockchain shows the potential to reduce this extended timeframe by decentralising everything. It can do in seconds the securities and cash transactions that conventional stock trading takes three days to complete.
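The “shared ledger” behind these claims is, at its core, an append-only chain of hashed records. The sketch below is a deliberately simplified illustration of that idea, with invented class names and transactions, and it leaves out everything a real blockchain adds on top (mining, digital signatures, peer-to-peer replication). The point it demonstrates is why tampering is easy to detect: each entry stores the hash of the one before it, so altering any historical record invalidates every hash that follows.

```java
import java.nio.charset.StandardCharsets;
import java.security.MessageDigest;
import java.util.ArrayList;
import java.util.List;

// Toy append-only ledger: each entry is chained to the previous one by a SHA-256 hash.
public class MiniLedger {

    static final class Entry {
        final String data;
        final String previousHash;
        final String hash;

        Entry(String data, String previousHash, String hash) {
            this.data = data;
            this.previousHash = previousHash;
            this.hash = hash;
        }
    }

    private final List<Entry> chain = new ArrayList<>();

    // Append a record, linking it to the hash of the latest entry.
    public void append(String data) throws Exception {
        String previousHash = chain.isEmpty() ? "GENESIS" : chain.get(chain.size() - 1).hash;
        chain.add(new Entry(data, previousHash, sha256(previousHash + data)));
    }

    // The ledger is valid only if every stored hash still matches its contents
    // and every entry still points at the entry before it.
    public boolean isValid() throws Exception {
        for (int i = 0; i < chain.size(); i++) {
            Entry e = chain.get(i);
            String expectedPrevious = (i == 0) ? "GENESIS" : chain.get(i - 1).hash;
            if (!e.previousHash.equals(expectedPrevious)
                    || !e.hash.equals(sha256(e.previousHash + e.data))) {
                return false;
            }
        }
        return true;
    }

    private static String sha256(String input) throws Exception {
        byte[] digest = MessageDigest.getInstance("SHA-256")
                .digest(input.getBytes(StandardCharsets.UTF_8));
        StringBuilder hex = new StringBuilder();
        for (byte b : digest) {
            hex.append(String.format("%02x", b));
        }
        return hex.toString();
    }

    public static void main(String[] args) throws Exception {
        MiniLedger ledger = new MiniLedger();
        ledger.append("Alice pays Bob 10");
        ledger.append("Bob pays Carol 4");
        System.out.println("Ledger valid? " + ledger.isValid()); // true
    }
}
```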
Similarly, the forex experience offered by Fintech, where buyers and sellers deal through an online platform and trade in real time, looks like a promising technology for the banking sector. Transparency and Accessibility Compared with systems locked behind firewalls, mobile encryption has the advantage of allowing users to access transaction details anytime, anywhere with an internet connection. Authorised parties can access the transactions stored in blockchain’s shared ledger. Blockchain records and locks transactions, and users can access the full historical data easily. Thus, it offers greater transparency compared to the more secretive traditional banking sector. Fintech still faces the problem of reliable security. But sophisticated processes are being developed, and transactions are uniquely validated by software, which eliminates the need for third-party verifiers. Likewise, blockchain shows users the entire transactional story, which makes the exchange of messages between parties that takes place in conventional banking look obsolete. Reduced transaction costs Blockchain accelerates transactions, which reduces the costs involved. It renders third parties and their charges unnecessary. Therefore, businesses that transact across multiple countries stand to benefit. Conventional international transactions travel through the central banks of the countries involved and then through local banks. This means a global enterprise may end up paying as much as $25 for a single transaction. Blockchain creates a network of ledger providers, enabling businesses to transact directly with one another across nations. Enterprise Irregulars forecasts that transaction costs will come down to a mere few cents. Challenges faced by Fintech Blockchain still has quite a long way to go before it can become mainstream. To realise its full potential, Fintech needs to address three main challenges: Security Banks face the pressing issue of cyber security risks (KPMG 2016). Three cyber attacks in recent years on the banking industry’s SWIFT network resulted in a loss of $100 million to the sector. Blockchain allows only authorised users to access and view its public ledgers. However, banks are wary of letting financial data sit outside their secure firewall. To alleviate fears, Fintech needs to build use cases aggressively and simplify process information to make blockchain appealing to bankers. For instance, hacking has become associated with online transactions, and blockchain gets dragged into this picture too. But in a blockchain, in addition to the direct parties, multiple other parties, known as miners, verify the records. Because several players take care of verifying the blockchain financial ledger, it is very difficult to manipulate or hack the financial records. But this security measure also has a disadvantage. Since transactions are irreversible, manual data entry errors such as adding an extra digit by mistake can prove catastrophic. Regulatory hindrances Central bank rules, bank secrecy regulations, and other red tape can adversely impact the effort to introduce blockchain into the mainstream financial sector. Blockchain implementation depends on governments’ inclination to adopt it, as well as their willingness to take on powerful banking industry lobbyists who may battle against this disruption. Industry hostility Fintech also needs to tackle resistance from financial institutions and banks that form an old boys’ club.
Plus, there are legitimate fears around cyber security and the possibility of banks spying on one another through the shared transactional platform. Some entities are also not eager to let Fintech take control of the game. This Financial Times report was one of the first to reveal a development which strays from the main goal of blockchain advocates. Many people are enthusiastic that bitcoin and its blockchain model can lead to the democratisation of banking, since they can perform their transactions without intervention from big players such as bankers and the government. Bankers, meanwhile, are open to the idea of using blockchain technology without bitcoin, as it promises cost reduction and efficiency. Peter Randall, CEO of UK-based blockchain vendor Set, reports that banks are discussing a scenario in which only trading insiders, regulators, and bankers can access and use blockchain to improve the financial services sector. Conclusion Who will win the battle, Fintech or the financial services industry? The young Turks are battling hard to change the attitude and mindset of the old guard. Bankers think blockchain has the potential to improve their margins. Fintech will benefit from the turning tide if this technological disruption takes place. Our opinion? As history repeats itself, we think Fintech will win the battle, as no force or industry can stop the constant march of technological growth and evolution. ### Top 5 Considerations Before Buying a GPU [vc_row][vc_column][vc_column_text]So you finally have the money to buy a beefy graphics processor and you’re looking to get the most expensive of them all. However, a gaming video card isn’t everything when it comes to getting your PC to run at high graphics settings. You see, there are ground rules when it comes to building your PC, and a powerful graphics processor alone isn’t enough. Here, let me show you a few important pieces of advice that you need to take note of. And if you happen to be a buyer who is unfamiliar with PC hardware, I’ll explain each consideration as simply as I can so you don’t feel overwhelmed. 1. The GPU needs an equally powerful CPU Before you even bring home the GPU, you need to make sure you have a good CPU to go along with it. CPU is short for Central Processing Unit, also known as the processor. It is the hub of the PC and the center of everything that makes a computer work together and run properly. In short, the CPU is the brain. Without a good one, your PC will just be all muscle and no brains. It’s where the calculations happen, and it coordinates everything the GPU, RAM, and other peripherals do. If you don’t have a good CPU, here’s what will happen: your graphics may be able to run at max, but everything else, including physics, post rendering, anti-aliasing, shadows, and lighting, will run very slowly. Nowadays, you have two main choices: Intel or AMD Ryzen. Intel has long been known for making quality CPUs and has models that are built for gaming, especially the 7th to 9th generation i5 and i7. Meanwhile, AMD Ryzen is a cheaper alternative that delivers great optimization and decent performance, making it a must-have for budget gamers. The biggest difference between the two is overclocking performance. If you don’t know what overclocking is, it means pushing your processor beyond its rated speeds to max out performance.
This is highly dangerous and can eat up more electricity than usual, causing safety issues with the PC as well as increasing your electricity bill. This is more of a concern with Intel processors than with AMD Ryzen. 2. RAM Means More Power Besides the CPU, your GPU needs equally good RAM. It’s a bit complex to explain, so I’ll keep it brief and in layman’s terms. If you want a stable gaming PC with consistent performance of up to 120 FPS, you will need enough RAM to power through a particular game’s demands. Sure, 4 GB of RAM is decent, but for optimal results, please get 8 GB of RAM so you don’t have to experience slowdowns and frame rate issues. This is because most games these days pack in lots of detail, with textures thrown in here and there. The true test of RAM comes in the form of open-world games such as The Witcher 3, GTA V, and the upcoming Cyberpunk 2077. If you want minimal loading times and fast rendering, you need at least 8 GB of RAM. Don’t aim for 16 GB of RAM though - at least not for the next 5 years. 16 GB of RAM is overkill for current games. 3. A Sturdy Motherboard to Keep Everything Intact It doesn’t just end with the CPU and RAM though. You need a motherboard for these parts to latch onto, to keep everything at optimal levels. The motherboard is the base and mainframe for the three major components you have. A good motherboard means minimal overheating and overall safety, keeping the RAM, GPU, and CPU from burning out. Gigabyte and ASUS ROG deliver some of the best motherboards for gaming, so consider those two for your list. 4. Ti is Always the Better Version Always go with the NVidia GTX series. AMD is decent, sure, but it can never go beyond what an NVidia card does. And if you’re considering Intel Integrated Graphics, just forget about playing on PC. Going back to the main point, the NVidia GTX line is already tried and tested when it comes to bringing out the best of a game and everything it has to offer. Ti is the improved version of its vanilla GPU. For example, you have the GTX 1070 and the 1070 Ti. The latter is like a 1070 2.0. As of this writing, a GTX 1050 Ti or GTX 1060 is affordable enough to buy. Both are still good enough for high performance in current-gen games. But if you want the king of GPUs, get the RTX 2080 Ti. It’s enough to get you on a date with an e-girl you found on Discord. 5. HDD or SSD This one is subjective, but if you’re looking for budget gaming, settle for an HDD with at least 1 TB. But if you want to max out the utility of your PC, go with an SSD. The latter may cost more, but the loading and booting times are significantly shorter. Hopefully this article gives you second thoughts before spending your hard-earned money on a GPU alone.[/vc_column_text][/vc_column][/vc_row] ### VPN: Reclaim Your Internet Freedom [vc_row][vc_column][vc_column_text]VPNs are an excellent piece of modern technology. With a VPN, you can not only remain anonymous on the web but also access the Internet freely from anywhere in the world, despite restrictions and limitations. People who don’t use a VPN may face certain problems when surfing the web while traveling outside their country. Is VPN legal in all countries of the world? Unfortunately, the answer is no. There are some countries where VPNs are fully banned and service usage is controlled. Even in our modern, progressive world, there are several countries where the use of a VPN is punishable by law.
No arguments about the VPN benefits for ensuring the safety of people and society as a whole are taken into account by the authorities of those countries where the government intends to tightly control the life of its citizens. The Need of VPN Across the World It is important to consider not the negative sides, but the benefits VPN brings not only to individuals but to entire states. Protection of corporate information. It’s hard to overestimate how important it is to protect the large companies’ data. Remain anonymous ― it is necessary to hide your actions from third parties. Advertisers are actively using information from search history, so lots of people use VPN to hide from them. Protection when connecting to Wi-Fi in public places. Without protection, hackers will get any information from the gadget. Using such technologies, it becomes possible to avoid your favorite web resources blocking. VPN technologies are simply irreplaceable in nowadays world. Installing a VPN on your computer or mobile devices, you will have free access to the web anywhere in the world. When it comes to freedom of information in the media including freedom on the web, it can be noted that blocking VPN services is a huge problem. Any state should strive to solve the main problem, and not deal with the consequences just banning it. After all, people will always look for ways to circumvent restrictions. VPN for Your Protection Ensuring your cybersecurity is as important as physical protection. Every day this issue becomes even more significant. Anonymous web surfing, protecting your personal data and keeping your info confidential is the basis for preserving you from problems that may affect your life. Even in countries where usage of VPN is legal, it is important to rely not only on software but, first of all, on yourself. What VPNs are best to use to get an encrypted connection, access geographically blocked websites, change your IP? Consider the three most well-known services. ExpressVPN This is a high speed, secure VPN service with a large network of servers (over 2000) which not only provides lots of connection options but also helps you choose the fastest one. Strict policies against registering online activities ensure complete freedom and confidentiality. ExpressVPN is available for various programs from Windows to Android. As for browsers, VPN add-ins are available for Firefox, Chrome, Safari. ExpressVPN is a paid service that doesn’t have a free trial. TunnelBear Canadian VPN offering secure connection and great performance. It is fully compatible with all devices, as well as operating systems. It has add-ins for Chrome, Firefox, Opera. It’s enough to install the official extension and sign up indicating your email. Then you should choose one of the 20 available servers, confirm your choice, wait for the browser to reload. TunnelBear has a clear privacy policy. It doesn’t keep IP addresses of those who visit the company’s website, IPs of users when connected, DNS queries or other information concerning online activity. The service has three tariff plans: standard Little Bear and two commercial ones (Giant, Grizzly). The free Little Bear plan includes a limited amount of traffic: 500 MB per month. Paid plans offer unlimited traffic use and differ in payment frequency: once a month or once a year. Windscribe Free extension for Chrome and Firefox with a wide variety of features. 
The application is equipped with a firewall, uses strong encryption, keeps you secure and anonymous when downloading torrents, and blocks ads. When connected, Windscribe selects a fast server automatically, changing it when you request blocked sites. The Windscribe extension is incredibly simple to use; it’s enough to come up with a unique username and password. After that, you get 2 GB of free traffic monthly. To increase the allowance to 10 GB, you will need to enter your email. People want to feel free online, as well as make it impractical for third parties to get hold of their personal info. In this regard, it is necessary to do everything possible. Indeed, propaganda or restrictions should not be allowed on the web. Only by following these principles will people living in any country of the world be able to enjoy a transparent, honest information space that is truly beneficial.[/vc_column_text][/vc_column][/vc_row] ### Data For The Win: Delivering Competitive Advantage in Sport [vc_row][vc_column][vc_column_text]The age of local computing is over. Individuals and organisations are now looking skyward to cloud computing as the next big driver in technological innovation. Cloud is also the building block on which new technologies like blockchain, 5G and AI are built – technologies that are set to define the next stage of the human experience. Cloud adoption has spread through every industry and the world of sport is no different, where athletes, coaches and fans are experiencing the benefits of this technology. Cloud computing offers sportspeople the ability to better understand their own performance by providing applications to collate and crunch the immense amounts of performance data in real time. It can also offer fans new ways of watching, interacting and experiencing their favourite sports. As a result, the sports industry will continue to experience significant change as it goes through its own digital transformation. Real-time review The America’s Cup is the oldest trophy in international sport and, in the nearly 170 years since the competition began, yachts have come a long way. The 2017 British team had a £100-million budget which allowed them to deliver a catamaran capable of travelling at 60mph. One of the next-generation capabilities included on this boat was created by BT, who produced a “virtual chase boat” which shared a live feed of video, audio, telemetry and data, from 350 different data points, back to the team in Bermuda and to mission control in Portsmouth. The 16 gigabytes a day of uncompressed data was moved via the cloud, using then state-of-the-art Royal Navy 4G technology. This enabled immediate performance analysis and subsequent alterations to be made on the fly. Marginal gains The idea of the aggregation of marginal gains, first popularised in the field of cycling, suggests that minor improvements across a large range of different factors can add up to a significant change in performance. It is a mantra that has changed the way many sports are played and now technology is giving the sports industry the chance to build on this philosophy. Cloud computing allows coaches to collate massive amounts of data on athlete or team performance, spanning hundreds of data points and years of historical data. In recent years, this volume of data, and the subsequent analysis of it, has moved from being an interesting novelty to an absolute necessity.
The massive amounts of data generated by sport, from heart rate to wind speed, need to be properly organised before analytics engines are able to generate insightful recommendations on performance. Sports teams need to leverage the right skills to properly handle the range of information produced by athletes and build a data pipeline that ensures data is not left stagnant and unused. This data needs to be integrated and brought together in the cloud, allowing users to connect analytic applications to it. If the data is correctly integrated in the cloud, it will accelerate analysis of performance and therefore drive faster training alterations. Ultimately, the faster a team can collate and integrate their data, the faster they can review insights and improve. Fan interaction Off the field, sports fans’ experiences have also been influenced by the cloud’s growing dominance in the tech sphere. In 2017 Manchester City unveiled the CityPulse wall, which shares real-time statistics and player profiles with fans in the social centre of the Etihad stadium, all made possible by sharing data through the cloud. Similarly, Real Madrid, who have only 3% of their fan base at home in Spain, use cloud technology to tailor experiences for supporters abroad. The Real Madrid digital platform, developed using Microsoft cloud technology, has enabled the club to provide a more tailored experience for fans and has seen digital revenues increase by 30% and fan profiles increase by 400%. More sport The broadcast revolution changed the way in which we consume sports. A TV in every home ensured that sports fans no longer needed to purchase a ticket to watch their favourite athlete compete. Similarly, cloud has changed the way that we can consume content – over-the-top (OTT) broadcasting cuts out the need for a broadcast, cable or satellite distributor, delivering content directly from source to viewer through an internet-based cloud platform. Professional OTT channels can be launched in a relatively short space of time and show a new range of less well-known sports, from Formula 4 to competitive eating, to audiences around the world. The cloud helps platforms fulfil consumer demand and deliver a bespoke experience to fans, whatever their interests. Conclusion The cloud represents the next stage of development not only in sport but in every facet of our daily lives. The power of cloud computing can enable coaches to analyse massive amounts of data in real time and from places where this was once impossible. Fans can now engage with their sporting heroes in new ways, and OTT broadcasting has led to the quick and cheap distribution of a wide range of lower-tier sports. Sport is continuing to evolve in line with the technical revolution and, as long as the major players in the industry continue to adopt new technologies like the cloud, the sky’s the limit.[/vc_column_text][/vc_column][/vc_row] ### The 8 Best Ways AI Will Help Humanity, Not Harm It [vc_row][vc_column][vc_column_text]Today, AI has reached unprecedented proportions. It is difficult to find a person who is not affected by the topic. Artificial intelligence is developing so fast that people simply cannot assess what our future will look like. We all remember computers in the 90s. At the time they did not seem like much of a breakthrough - more a novelty or a toy. But they were a huge breakthrough even then, and the concepts that originated back then gave us the push forward. This has led to a situation where today there are simply no enterprises that do not use machines.
Sometimes what we have achieved with the help of artificial intelligence is fascinating. Today you can use your face as a key to a complex technical device. It is worth remembering that about 30 years ago there was a real debate on this subject. People feared for their future; many believed the technology could lead to global catastrophes, or even that AI would take over. I'm sure many reading this immediately pictured scenes from the famous Hollywood movie "Terminator". Based on the points given below, you will see that bringing artificial intelligence into our lives need not carry any negative consequences - on the contrary, it can be a good thing for humankind as a whole. Enhances Efficiency If we put aside concerns about AI and look at it sensibly from the outside, we can see that this industry opens up new jobs and new sources of income, and simply saves us money by improving efficiency and automation. The most obvious example is the automotive industry, where automation has put everything in its place and greatly simplified the production of cars. Eliminates the Need for People to Do Hard Tasks The main argument here is that this technology allows people to avoid complex, tedious tasks by taking that work onto its own shoulders. A simple example is the banking sector. For banks, this is a very positive direction of development: since introducing AI, banks have accelerated all processes related to applications, optimising how they operate and how customers interact with them. That frees time and energy for something more - creative ideas, personal life, or simply focusing on the consumer. Medical Science It's no secret that machines are increasingly able to interact with our brains, which helps them delve into human psychology. All of this is interesting, but we all care about one question: what is the use? The answer is simple: the quality of treatment will improve. Surgery in particular will be affected, because machines do not have feelings, so we can expect surgical operations to become more accurate, more affordable and faster. If we allow living doctors and artificial intelligence to cooperate, science and medicine will reach new heights. Solves Complex Social Problems Everyone knows that complex social problems arise in society from time to time and are waiting to be solved, and AI has broad prospects in this area. But because people do not yet know how to apply the technology properly, they are simply afraid of it. As soon as enough people understand how AI can engage with our problems in society, new opportunities and benefits will open up in our lives. Translation and Philology Currently, machine translation covers only a fraction of the world's languages - roughly a hundred of them. With the introduction of this technology at large companies such as Apple, we will reach the point where all the languages of the world are available. In other words, people speaking their native languages will be able to understand each other even though the dialogue takes place in different languages. Frees People From Any Responsibility Following this scenario, we should not blindly follow the lead of artificial intelligence.
On the one hand, you can use this technology to predict certain processes or problems in society; on the other hand, you can hand all control over the situation to the system. At some point, it may simply turn out that we are not needed at all. Cyborg Technology It would be great if a person who has lost a limb for some reason could replace it with a new robotic one. Scientists quite seriously argue that in the future a person will have such an opportunity. New limbs will interact directly with the human brain, allowing the wearer to control them. This will allow a person to greatly improve their physical performance. Improves Our Lifestyle The most important question, in my opinion, is: how will AI affect our lifestyle in general? Well, it will affect absolutely all spheres of life, from routine responses to emails handled by an artificially intelligent assistant, to saving energy in a smart home, to more productive marketing, through to better health care and the most accurate diagnoses possible.[/vc_column_text][/vc_column][/vc_row] ### Resolve Acquires FixStream to Deliver Game-Changing Combination of AIOps and Advanced Automation [vc_row][vc_column][vc_column_text] Acquisition empowers agile IT operations with the ability to predict, diagnose, and automate resolution of complex infrastructure and application issues in seconds LONDON, UK — August 20, 2019 — Resolve Systems®, the leading IT automation and orchestration platform, today announced the acquisition of FixStream, a pioneer in AIOps. The acquisition, expected to close by the end of September, will enable Resolve to offer the most robust IT automation platform available on the market by combining artificial intelligence insights into dynamic, hybrid IT environments with powerful automation capabilities that are purpose-built for the complexity of modern enterprises. The unified platform will handle a wide array of IT operations – from AI-driven infrastructure mapping, operational data correlation, and predictive analytics to intelligently automating cross-domain actions based on those findings. This will enable customers to significantly improve infrastructure performance, reduce mean time to resolution (MTTR), increase IT operations efficiency, reduce alarm noise, and proactively and intelligently allocate resources for critical business services. With 7x revenue growth over the last 24 months, FixStream is one of the fastest growing AIOps platforms. Its proven solution provides automated dependency mapping that dynamically tracks the changing relationships between applications and underlying infrastructure to aid in quickly diagnosing the root cause of performance issues and outages. It also leverages artificial intelligence and machine learning to analyse and contextualise large volumes of systems data, enabling the solution to detect patterns that predict future issues across the entire IT stack. Ultimately, the long-term vision for the combined Resolve and FixStream solution is to aid customers in achieving the long-awaited promise of “self-healing IT.” Combining FixStream’s multi-layer visibility and predictive analytics with Resolve’s cross-domain, service-level automation capabilities will arm customers with an unparalleled ability to automatically predict, prevent, and fix issues autonomously.
Leveraging dynamic dependency mapping and AI-driven insights to inform, auto-update, and trigger intelligent automations, Resolve will be able to deliver a closed loop system of discovery, analysis, detection, prediction, and automation. Acquiring FixStream enables Resolve to further tap into the market for AIOps solutions, which is seeing significant growth as IT teams cope with the conundrum of reducing IT costs while managing increasing complexity, including an exponential uptick in data volumes generated by IT infrastructure and applications. Gartner estimates the subsegment of the performance analysis market that includes AIOps, ITIM, and other monitoring tools will reach $5.7 billion in revenue worldwide by 2020.* "We believe that FixStream’s AIOps and infrastructure mapping capabilities are a perfect marriage with Resolve’s enterprise automation platform, providing game-changing functionality that will enable customers to achieve unprecedented agility, speed and simplicity in IT operations,” said John Ferron, CEO of Resolve. “By combining our powerful, cross-domain automation with insights from FixStream’s artificial intelligence, we’ll be able to help IT teams accelerate their digital transformation journey.” “Together, Resolve and FixStream offer IT organisations the complete automation platform they have been looking for. We’ve repeatedly heard the need for a solution that brings together best-in-class AIOps with proven, cross-domain automation capabilities and are thrilled to see this vision become reality,” said Sameer Padhye, CEO and founder of FixStream. “Beyond the synergies that exist within our products, we share the common goal of helping IT organisations address the challenges posed by modern IT infrastructure and improve service delivery.” “Our research has shown that IT operations efficiency has become a critical priority for CIOs as their teams’ workloads continue to grow while staffing remains flat due to a shortage of skilled IT workers and shrinking budgets,” said Carl Lehmann, principal analyst at 451 Research. “Combining insights from artificial intelligence with automation technology enables enterprises to overcome these challenges and to manage the growing complexity of today’s hybrid IT environments, while moving closer to the long-term goal of self-healing IT.” [/vc_column_text][/vc_column][/vc_row] ### 10 Ways to Promote Your Brand Online [vc_row][vc_column][vc_column_text]Unless you have wads of cash for trial and error in your marketing, learning effective ways to sell your brand online is incredibly essential. This is particularly important if you’re running a startup or selling little-known products or services. The aim is to effectively communicate what you have for your potential customer, and this also includes explaining why they should buy from you and not your competitors. Here, we show you 10 simple ways to promote your brand for an authoritative presence online. Leverage the immense power of social media Social media has grown tremendously in the past decade and should now be treated as a necessary investment by any business looking for serious exposure online. The good thing is that you don’t even need to start from the bottom as most social media platforms now offer premium services that allow placement of ads as prioritised posts. With this, you have a direct and quick link to your audience from as early as the first day of setting up your social profile. 
Further, you can invest your time and resources in networking with like-minded industry leaders and target audience for an even wider reach. Consult digital marketing experts The majority of top digital marketing experts and service providers boast ample exposure and industry experience as they specialise in the business of helping others increase their visibility online. In addition, most have the required resources and expertise to research on the latest marketing techniques that work and those that don’t. For example, if you’re looking for an excellent hosting for your website or advice on web hosting and related topics, be sure to check Mangomatter for detailed reviews of both paid and free hosting options available for newly launched businesses. Master key website optimisation techniques Setting up a site and publishing lots of content on it doesn’t always guarantee that you’ll attract significant traffic. Instead, what brings you the traffic you need is to use proven approaches like optimising every aspect of your site including content, web elements, and interface for enhanced user experience. Great content and user experience mean users enjoy visiting and staying on your site – a key factor in improving site visibility on search results. Have an active blog Blogs offer ingenious ways of enlightening, entertaining, and informing your existing audience about new updates and developments in your brand. The content you publish here enables you to include and target keywords and relevant topics that your prospects are probably searching. What’s more, a blog can help to foster a good relationship with your audience especially if you focus on providing quality content that solves problems. Publish press releases occasionally Whenever your firm has a newsworthy item coming up or already happening, be sure to create a press release to get your audience talking. This approach increases the publicity of your brand and also offers an inexpensive way to market yourself when the audience shares your work. Consider hosting live streams With nearly all major social media platforms supporting live streams today, it’s time to use this feature to your business advantage. Live streaming provides a perfect way to engage your audience in real-time. In particular, it allows you to react to customer feedback on the spot thereby helping you add credibility to your brand. Use influencers Influencers are people or entities with large audiences on various platforms including social media, community forums, and websites in your niche or related industries. Reach out and collaborate with them on promotional tactics like guest posting on their spaces, brand mentions and endorsements, as well as product reviews. Offer incentives Let’s face it, we all love giveaways, discounts, and other incentives from the brands we are loyal to. So why not use this powerful marketing technique to your favour and use it to promote your products and services. Freebies make customers feel appreciated and valued which encourages them to recommend your brand to others. Use infographics You probably already know that colourful objects are very effective in catching the eye of the viewers – you see it in the ads all the time. Infographics usually involve designing ads, posts, and brand data in a manner that fascinates whoever is looking at them. Once you have their attention, you can easily communicate the idea that your brand sells in an interesting and no-to-standard way. 
Networking Before the advent of the internet, networking involved attending trade shows, parties, and other business events to interact with like-minded people. Today, you can easily network with real people online by learning from their success stories, promoting their brands on your online platforms, or being a loyal follower of an influencer you wish to collaborate with in the future. Use branded videos Videos have quickly become a powerful means of communicating ideas online. Indeed, statistics continue to show that more and more people are now preferring watching videos instead of reading written content. If you’re already using videos to educate your audience or market your products and services, consider branding them so that anyone who comes across them knows the people behind them with little hassle. YouTube, Facebook, and Twitter are great platforms where you can share your branded videos to reach your audiences. Have a story and tell it Stories sell emotions which humans are hard-wired to remember and relate to. A brand with a convincing story makes its presence memorable and quickly evokes trust. Memories bear loyalty which is something you want once your brand is out there. Besides, stories are excellent communicators of the purpose and belief of your business or brand. Share details about who you are, what you and your company stand for, and anything else that matters to your audience. Which promotion strategy is working well for your business? Share with us in the comments below.[/vc_column_text][/vc_column][/vc_row] ### 7 proven tactics to build a strong relation with customers [vc_row ][vc_column ][vc_column_text ]In this day and age, the internet is an intrinsic component of everyone's daily lives. Apart from apprising its denizens of the goings-on in the world, it also helps them find commodities, facilities, products, and services of every kind imaginable. And as brands proliferate at lightning-fast speeds, the people's search for individualized services has gained center stage, and businesses are here to capitalize on this discovery. As B2B organizations vigorously transition to a more digital manner of managing their business, the severity of the competition between has reached a fever pitch. And with the perennial tug-of-war between organizations churning out different ideologies for consumer satisfaction, the one credo that holds the most value for businesses is building a strong relationship with their customers. With the Temkin Group's findings on how companies that earn $1 billion on an annual basis can expect an extra $700 million within 3 years of investing in customer experience gaining ground, organizations are increasingly adopting customer-centric strategies to woo their patrons. If you too would like to know of tactics that can help you redefine your relationship with your clientele, here's a list of seven such powerful maneuvers that can help you do so. 1. Communicate From prompt issue resolution and result-oriented conversations to round-the-clock support, proactive communication with customers can not only create robust customer relationships but also help you spread a word about your service faster while also gaining the confidence of your patrons. You can do this by deploying advanced customer support tools like Acquire's live chat for your customer support team. There is valid proof that shows how effective live chat support can be for businesses. 
For instance, Forrester claimed that usage of live chat promotes an average of a 48% increase in revenue per chat hour. Apart from this, an Econsultancy report found that live chat contributes to the highest level of customer satisfaction at 73% as compared to 61% for email. With a live chat window, your support personnel can help users find the appropriate product, answer questions about the return policy, and guide them with a steady hand throughout their journey. A good live-chat platform like Acquire's, which comes with modern features like co-browsing is a great step forward in this direction. With collaborative browsing, you can offer help to your customers at different touchpoints in their journey for problems such as payment issues and different modes of payment. Additionally, you can also introduce chatbots to fuel instant gratification for your customers by extending discussions with them after converting them into a customer. By embedding your chatbot with support articles, you can also provide users with suggestions for your products from your help center. 2. Be Aware of Their Requirements “Get closer than ever to your customers. So close that you tell them what they need well before they realize it themselves.” - Steve Jobs Any business worth it's salt should consider these words of the most famous inventor of modern times as the biblical truth. With a burgeoning swell in the market in terms of the variety of products and services available, whimsical consumer decisions are a very common occurrence. So that you don't lose out on your painstakingly acquired customer base, it is always a good idea to inform yourselves of their evolving needs and requirements. Leverage the pull that social media sites like Facebook, Instagram, and Twitter have to hear the opinions of your patrons, and always try to acknowledge their input. In addition to this, you can also create surveys of your offerings, customer service, etc. and ask your customers to partake in it. You can go a step further and share the results of the survey with them. This not only helps paint a better picture of your brand in the eyes of your customers, but it almost always gives you additional information in the form of post-survey reviews. So much so that, an Edelman survey found that loyal customers, those who support a brand over a certain period, spend 67% more than new customers. 3. Exceed Expectations with Great Customer Service Did you know that as per a Walker study, customer service will overtake price and product as the key brand differentiator by 2020? Great customer service is the bedrock of any business. It has increasingly become the priority of many organizations the world over, and 2019 is no different. Touted as the most exciting opportunity for businesses, it is also a great way to redefine your relationship with your customers. To put things into perspective, it was found in a recently conducted PwC survey that customers are willing to pay a premium of up to 13% for services, simply by receiving a great customer experience. All of this points to only one thing; faithful customers are your best salespersons. So, take time out to network with them and also to follow up. Another way that businesses can please customers with is expeditious delivery of products and services. By simply under-promising and over-delivering, you can make an impression on customers, and they will keep coming back to you. 4. Connect One of the biggest mistakes that businesses make is failing to follow up with their customers. 
It doesn't matter how good your product is or how effective your service is, no one is going to separately remember you from the number of brands out there. In fact, a report found out that 70% of customers say service agents’ awareness of sales interactions is very important in keeping their business. By connecting with your customers instantly, you can make sure that you are always on their mind. Send them a nice 'welcome' e-mail or add them to your newsletter. Let them know who you are and what you do. Maintaining a connection is easier than warming up a prospect again and again. From online tools to social media, there are many ways that you can start conversations with your consumers than ever before. However, make sure to keep it engaging and not create a one-way conversation. 5. Reward Loyal Customers Did you know that as per global management consulting firm Bain and Co., a 5 percent rise in customer retention generates profits of over 25 to 95 percent more? Customer incentive programs are gold if you ask us. Since repeat customers are your most profitable ones and spend 67 percent more than new customers, it is imperative that you hold their attention. You can do so by giving them something in return for their time and business, like loyalty bonuses in the form of vouchers, coupons, free shipping, etc. You can keep track of metrics like the number of orders, welcome points, referrals, loyalty points, etc. to ascertain your most loyal customers. This will help both you and the customer enjoy unparalleled benefits. While they keep visiting you for the promise of added benefits, you will enjoy higher customer retention rates. 6. Ask for Feedback As per a Gartner survey, organizations that devise tactics for better customer experience begin by focusing on ways that they can accumulate and assess customer feedback. This is a great starting point as it informs you of your customers’ expectations and helps you revamp the specific needs of your customers so that you can find the right solutions to their problems. It doesn't matter whether the feedback is positive or negative; it will tell you a lot about your services. Always listen carefully to compliments or complaints as they will provide you with an honest gauge of customer experience. Not only does this tell you what works for your brand and what does not, but it also helps you to improve your products and services by collecting opinions regularly. 7. Focus on Their Complaints and Improve Services Accordingly While collecting feedback is necessary, personalizing your services on the basis of the feedback you receive is very crucial. That is why, a data-driven approach to your marketing such as this can help you redirect your marketing efforts, product design efforts, and service. Try to gain as much information as possible and learn what customers have to say about your business. Keep your eyes peeled for any negative and sub-standard mentions of your products on social media and the web. Check reviews and survey results. Listen to the feedback that they give while on the phone with your customer service, and come up with ways to deliver a better customer experience by improving your services accordingly so that they match customer expectations. Business is all about maintaining an amicable relationship with your customers. Find them, rear them, spoil them, and see your sales soar. Conclusion Nothing is more important for a business than the satisfaction of its customers. 
However, with a number of brands competing against each other in the market, vying for the attention of customers, you'll be left high and dry if you don't create a sustainable, loyal, and active customer base. To this end, the idea of relationship building and redefining customer experience are very vital for businesses. Prioritize the customer's experience and the value that they bring and ensure that you consistently deliver high-quality and customized services. This will not only showcase your brand in a better light, but it will also make it easy for anyone to connect with you. Simply employing these simple strategies can help you see a noticeable rise in how people clamor for your products. Your customers will be satisfied with their choice, first-time buyers will feel more confident coming back for more, and prospects will start to recognize your value.[/vc_column_text][/vc_column][/vc_row] ### Cloud Hosting vs Shared Hosting – The Pros and Cons [vc_row][vc_column][vc_column_text]Choosing the right kind of hosting package for a business website is fast becoming one of the most important factors in the longevity of any online brand. A reliable hosting solution that can offer virtually continuous uptime and connectivity to your goods and services for your customers will help to generate a steady revenue stream. Given the plethora of websites that promote the best cloud hosting and shared hosting solutions, it can be truly overwhelming deciding between the right type of hosting. Some platforms deliberately confuse users by using acronyms that the layman simply wouldn’t understand, just so that they choose a bigger hosting package than they need. The two most popular types of hosting solutions available today are cloud hosting and shared web hosting. They are very different types of hosting packages, each with their own pros and cons, so we’ve put this article together to help demystify the cloud hosting and shared hosting landscape: The benefits and disadvantages of cloud hosting Pros Extensive storage space Cloud hosting packages tend to be able to offer far more storage space than shared hosts. That’s because cloud host users don’t have to share the same server and can often host multiple websites through the same cloud hosting solution. Highly scalable for fast-growing businesses While shared hosting packages penalise websites for going above their bandwidth limits, cloud hosting packages can be quickly scaled up or down, depending on the amount of disk space, processing power or bandwidth a website needs to grow and improve. Quicker page load times for improved search visibility Thanks to integrated caching, cloud hosts generally ensure faster page load speed times, which are increasingly used by major search engines like Google as a ranking factor for search engine optimisation (SEO) purposes. Easy to restore lost data In terms of data backup, cloud hosts make it incredibly easy to retrieve lost files as the data is stored in a different location/data centre to the business. Cons Often restricted to single vendors Some cloud hosts will limit you to using only one vendor, making it hard or even impossible to migrate your website from one cloud platform to another. Generally more expensive than shared hosting plans Cloud hosting often costs more than shared hosting because of the freedom and flexibility afforded to website owners. 
The benefits and disadvantages of shared web hosting Pros Easy to deploy and manage Ideal for those with minimal technical knowledge, shared hosting can be deployed within a matter of minutes and is easy to manage via a web hosting management dashboard such as cPanel, allowing for one-click installations of software. Your shared hosting provider will assume all technical responsibilities All shared hosts are responsible for the maintenance and upkeep of their servers, so you don’t require any technical knowledge. Cheaper than most cloud hosting plans As resources are shared across multiple users on a single server, shared hosting plans are cheaper than cloud hosting solutions. Cons Capped speeds can make shared hosting cumbersome The more active users on a shared server, the slower the page speed load times of your website. Server downtime is more frequent with shared hosts Shared servers also tend to be more susceptible to becoming overloaded with user traffic, leading to unwanted downtime and costly server crashes. No control over the features available via your host platform What you pay for is what you get. Shared hosting plans have a set number of services and software available and you don’t have a say in being able to get more software onto your server. As with most things in life, you get what you pay for when it comes to hosting. Cloud hosting might cost you a little more, but it offers improved security and performance. Meanwhile, those with basic website needs can look to shared hosting for a value solution.[/vc_column_text][/vc_column][/vc_row] ### Is too much Noise keeping you from success? [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/338854915/979c961de0[/embed] [/vc_column_text][vc_separator][vc_column_text]Sometimes we seem magically able to give our full attention to the activity that we are doing. Creatives call this being in the ‘zone.’ Whether it is writing an email or playing in a tennis match it has our full focus, our mind does not wander and we look up to see that a considerable amount of time has passed. It is in this state that we perform at our best. In terms of the human condition, however, it is rare. Scientists believe that we have between 20,000 and 60,000 thoughts a day and only about 5% of these thoughts are spent on the task in hand. The rest is Noise. It’s interesting to imagine the potential that we would harness if we found a way to move the focus rate to 50%. So what is this Noise that fills up so much of our mind and our time? Past Noise is the universal tendency to dwell on the past and to replay situations in our mind that cannot be changed. It often results in the negative emotions we felt in the past being re-experienced. This is different to calm, balanced reflection assessing and learning the lessons from what has gone before. In fact it is often said that successful people consider their failures fully but quickly, and then move on to the course-correct. They will tend to dwell on past successes rather than failures. Future Noise, you won’t be surprised to know, is the vast amount of time we spend worrying about the future, or more accurately possible potential futures. There’s a lovely quote from Mark Twain which we can probably all relate to – ‘I’ve had a lot of worries in my life, most of which never happened.’ We imagine each of these potential futures in minute detail, constructing little mental movies in our mind. In extreme cases worrying about the past leads to depression and worrying about the future leads to anxiety. 
It is easy to get lost in the thoughts around Past and Future Noise, to be drawn down a spiral of automatic negative thoughts, with one negative thought leading to the next. In the Mind Fitness programme we call these automatic negative thoughts ANTs. It is important to know how to keep the ANTs at bay. Mindfulness has been proven to substantially quieten the Noise that crowds and clutters our mind. It brings our attention to the present moment in order to experience life more clearly and more fully. In the ‘zone’ we are, of course, operating in the ‘Now.’ This is a quick exercise that will halt the spiral. It uses the mnemonic of NOW, so it’s very easy to remember. N – Notice – just glance around and notice one thing that you can see. O – Observe – bring close attention to the object that you have chosen; observe it in detail. W – Wonder – bring a spirit of curiosity to the object. A negative thought cannot co-exist with positive engaged interest. Another very common type of Noise is Negative Self-Talk. It is the things we tell ourselves we cannot do. Most of us have a fair few that we can bring to mind without too much difficulty. I’m not the sort of person who can learn a language / understand tech / cope with heights / do maths / meditate. The list is endless. Our self-criticism can also be intensely personal and judgemental; we say things to ourselves we would never say to anyone else. For some this negative self-talk can be like living with a gremlin, always ready to leap out and attack. For a week or so keep a notebook and write down the negative self-talk that comes into your mind. Some of it will probably surprise you. Beside each negative phrase, write the opposite, the positive equivalent: ‘I am great at maths.’ ‘I learn languages easily.’ Don’t worry if you don’t yet believe it, still write it down. After a week or so pick a few that you would really like to change and develop a Positive Affirmation for each of these. This is a sentence that you can repeat each day. It works best if it is present tense – ‘I am good with spiders’ rather than ‘I will be…’ – and avoid a double negative as in ‘I am not afraid of spiders.’ There is also some research showing that it works best if you say the Affirmation out loud, so do this if you can. We change our mindset by building new neural pathways related to our new thoughts and beliefs; it takes about six weeks for the new pathway to become the stronger route (our brain always takes the route of least resistance), so give this time to work. It is also worth looking at the distractions that you allow into your life: a pop-up notification can prevent you from finishing an important task, and more importantly, it can make you feel swamped and out of control, even if it is trivial in terms of both time and urgency. Avoid multitasking wherever possible – distraction always results in a performance level that’s less than your best. Clearing your mind of some of the clutter and noise will bring down your base level of stress and improve your concentration and overall performance. It will also give you back a considerable amount of time each day as you avoid the spirals and rabbit holes that have become so much a part of our busy, stressful lives. Unlock You by Andy Barker and Beth Wood is out now, published by Pearson, priced £12.99. 
To find out more go to: https://www.amazon.co.uk/Unlock-You-confident-happy-minutes/dp/1292251123  [/vc_column_text][vc_column_text]See the rest of our Opening Lines videos and articles on our site![/vc_column_text][/vc_column][/vc_row] ### How Fintech Based Tools And Resources Made Handling Finances Easier [vc_row][vc_column][vc_column_text]The amazing advancements in fintech can be used very effectively with a methodical approach, in a fully structured way, to make money, monitor money making, and also fight off debt and financial crises. This can be done in various stages, all through the use of technology. Finance and technology combined have brought forth some amazing apps and tools, which can be used online through smartphones, as well as computers and tablets. That’s why the average person is now well equipped to handle finance, take care of money matters, and grow wealth by utilizing the tools and resources thoughtfully. Here are some ideas that can be used smartly to make money. Investing in the stock market One of the most lucrative ways to make money – and make a lot of it, if you have the luck and an understanding of the market – is investing in the stock market. Many people invest in the stock market, buy shares of companies, and then sell them at the right time to make money. There are apps for this. The function of the apps is to help you trade online. The apps make trading easy without having to rely on any middleman. This means you can directly buy and sell stocks from any computer, smartphone or tablet. All you need is continuous internet connectivity, and an appropriate app that helps you study charts, track prices and place buy and sell orders. Such apps have more features, like presenting your past records, giving graphical views of stock price history, displaying current market trends, and much more. Investing in foreign exchange The foreign exchange market operates similarly to the stock market. The only difference between the two is that in this case, instead of stocks, you trade in various currencies. There are many foreign currencies which have a huge demand, and in this form of trading, the currency is bought when its rate is down and sold when the rate goes up. Supporting apps exist that are highly efficient for both instant and daily trading. These apps are equipped with many features for basic to advanced traders and are available on many platforms. Cryptocurrencies Trading in cryptocurrencies is the new trend, which can help people make huge amounts of money. This surely is a great success of fintech, where a currency that exists entirely in digital form yields cash and, in some cases, can take someone from rags to riches. Crowdfunding platforms Crowdfunding is another great way to get money. This is an excellent way to raise funds. Once again, finance meets technology here, where an online platform and media site is used to interact, present projects, impress people, convince a crowd, and raise the money. That’s why crowdfunding, which is an effective and accepted method of raising funds for business, is a very nice gift of modern fintech. Business watch Apps are there to watch businesses, compare their performances, and tell you which one is doing well at present, which is underperforming, which looks promising, and so on. This helps many people invest wisely in businesses of their choice. Real estate Real estate is another great way to make money. When you invest in real estate, you have a good chance of making money. 
However, the best decisions are made with good research, and these days many apps provide information on land, houses, projects, buildings, and more. That’s why investing in real estate is much easier now with the support of real-time, constantly updated data. This surely is another side of fintech’s success. Commodities The commodities market is also a great money-making machine if you know how to use it. Just use the appropriate financial technology here too. There are fantastic apps and tools to predict the rise and fall of the market and of various commodities. With a close market watch through such apps, and by using the trading apps, you can make good money from this market too. Investment advisory Some apps guide investors with updates, news, economic analysis, finance management tools, and more. The main role of such tools and apps is to indicate which sectors of the market are ripe for investing. Buy and sell Buying and selling used and old items is also a business, and there are apps for this. Many people make money out of it. You can earn cash from a sale and even from commission. Other forms of selling, like affiliate marketing, can also earn you cash, and there are supporting apps for all of this. Economic news Better investment decisions can be made when you stay updated on your region’s, your country’s, and the world’s economies. For this, again, there are excellent tools and apps, which are a gift of modern financial technology; they keep you informed of economic moves large and small. Credit card finder The use of credit cards brings convenience to the average person in many ways. Paying for an item even when you don’t have the cash to hand gets easier with a credit card, and you can transact cashlessly. There are many apps which help you apply for a credit card, track your application, compare and choose between cards, and check your eligibility as an applicant. Loan Finder There are apps which help you find and apply for loans, and many lenders can be reached through them. This is another bright side of using technology to get financial assistance. While such apps can help you find loans in times of need, you can also get assistance with debt management through apps and financial tools. Resources like NationalDebtRelief come as a great help in this. Finance management Digital wallets are a very convenient way to carry cash in digital form, relieving you of the need for a physical wallet. Transactions can be completed from mobile number to mobile number, bank account to bank account, or wallet ID to wallet ID. Finally The technology-based products available for finance management are many and varied. Life can be made simpler with their use, and managing your finances is easier than ever.[/vc_column_text][/vc_column][/vc_row] ### How Data Security and Privacy Can Save Corporations Money [vc_row][vc_column][vc_column_text]Successful corporations hold the trust of their clients: it is the responsibility of the corporation to hold a client’s data securely and to protect their privacy. Without these basic requirements, a company can collapse in the event of a breach and lose its customers’ trust - as well as being heavily fined by government agencies such as the UK’s ICO (Information Commissioner's Office). 
Examples where companies’ reputations were risked Facebook fined $1.6 billion A recognizable example of a corporation taking a large hit and losing a significant amount of trust came in September 2018. Hackers were able to exploit one of the site’s features, allowing them to expose the personal data of over 50 million users. The attackers were then able to steal access tokens which would give them access to users’ accounts. Subsequently, the breach was announced publicly and the company lost a large portion of its users’ trust. The IDPC (Irish Data Protection Commission) also opened an investigation that could result in a fine of $1.6 billion for the company. All of eBay’s users’ data exposed eBay suffered a similar attack in May 2014 which allowed hackers access to personal data such as names, D.O.B., addresses and encrypted passwords of all of their users. The attack wasn’t carried out using brute force; instead, it was completed using employee credentials. This shows the need for internal security as well as external. The CEO reported that the breach resulted in a decline in user activity - ultimately due to losing much of its users’ confidence in the service. Uber’s attempt to hide the attack Another attack which shows the importance of internal security was the Uber attack in 2016. Attackers were able to get access to Uber’s AWS servers, which allowed them access to personal data including driving licence numbers. The thing that made this attack notable is how Uber dealt with the attack. They paid the hackers $100K to delete the data they stole; however, there was no guarantee that the hackers actually would. Uber’s valuation subsequently dropped from $68 billion, earlier in the year, to $48 billion by the end of the year. These are just a tiny fraction of all the attacks which take place; however, they really show the importance of data security and privacy. Not only is external protection (DDOS, XSS, etc.) a necessity, but so is internal protection. Corporations need to implement policies and training for these scenarios. Uber dealt with the situation poorly; if they had had a security and privacy policy in place, it wouldn’t have been as bad a situation as it turned out to be. The unethical decision to keep it a secret and the attempt to pay the hackers to keep quiet cost the company billions and permanently damaged its reputation. How attacks can happen Nearly half of all data breaches aren’t due to a malicious attack: around 36% of the time it is due to human error and 5% of the time is due to system malfunctions. A major source of human error is social engineering attacks - the process of tricking someone into handing over sensitive data such as login details or files. On the malicious attacking side of things, if software isn’t updated, then already discovered exploits, which are openly posted online, can be performed by anyone with little to no hacking experience. However, an attack doesn’t have to be one or the other. Attacks can be combined, using social engineering to gain access and then planting malicious code. Prevention As one of the main causes of breaches is negligence, it is important to have policies in place to deal with situations like these. Policies and training A corporation should have a policy guide to follow in the event of an attack so that it can be handled ethically and legally. This will also help a corporation regain its clients’ trust as it shows professionalism. 
It is also crucial to train employees against data breaches - social engineering attacks mainly take advantage of customer service agents to gain access to sensitive areas. Encryption All corporate computers should be encrypted. The connection through which data passes, from the user’s browser to the server, should also be encrypted. This is one of the best protections against data breaches: even if a hacker gains access to a database, the data they access is useless because it is encrypted and cannot be understood. When it comes to SSL, an EV SSL certificate is one of the strongest options, providing the green address bar and the highest level of business authentication. Updates As mentioned above, all software should be updated regularly as vulnerabilities in previous versions of software are often posted online. The updates patch these vulnerabilities in an efficient and safe way. Penetration tests For larger corporations it is highly recommended to hire a third party to carry out a cyber security analysis test. These will highlight the threats found and may also suggest ways to prevent them being exploited. This is a good long-term option as it can save billions in the future. It is crucial to perform these regularly as systems can change often. Human error As human error accounts for around 36% of breaches, it is worth trying to minimise human involvement as much as possible: repetitive tasks usually performed by humans should be performed by a system instead. Backup data Although not a prevention, backing up data will help with the recovery process in the event of an attack. If data is destroyed or corrupted and no backups have been performed, a lot of user data will be lost forever. Therefore, frequent backups help a corporation restore operations.[/vc_column_text][/vc_column][/vc_row] ### Focus on email continuity and resilience or bust the bottom line [vc_row][vc_column][vc_column_text]Fifty, one-hundred, one-hundred and fifty? How many emails have you received today? Despite a rise in the popularity of instant messaging tools, if you take a look at your inbox it quickly becomes obvious that business still fundamentally runs on email.  Millions of people across the world now use cloud-based email systems, the most predominant being Microsoft Office 365. This has meant that the number of emails in the cloud has ballooned, and adoption of cloud email services is only set to continue. By 2021, Gartner expects 70% of public and private companies to be using cloud email services.  But what happens when the service that supports so many mission critical functions in the business is disrupted? No cloud vendor is perfect and the number of cloud email outages over the past few years has served as a reminder of this fact. No organisation should trust a single cloud supplier without an independent cyber resilience and continuity plan to stay connected and productive during unplanned and planned service disruptions. Every minute of an email outage costs businesses hundreds, even thousands, of pounds. It carries a variety of consequences that range from frustrating to financially devastating. The question is: if your work email and productivity are dependent on cloud email, what would a potential disruption cost you? The risky business of downtime Email reliability – whether managed through on-premises infrastructure or as a cloud service – is dependent on a variety of factors, only some of which are under the control of in-house IT. 
Disaster recovery plans and systems that predict potential IT fails and offer a plan B have been best practice for years. This shouldn’t be any different in a cloud first world. If you don’t have a backup plan in place for when a major cloud service does go down, your email systems will be down until the provider gets it back up again. And you can’t control when that will happen. One hour? Five hours? Days? This can be caused by anything from the reliability of the servers on which email software runs, the operating system, human error, Internet connections, the power grid, weather and much more. If any link in this chain breaks, email systems can suffer downtime. Large cloud-based collaboration suites like Microsoft Office 365 and GSuite are as susceptible to outages as any other provider, ranging anywhere from a few minutes to several hours at a time. A lifetime for any business, and a blow to the bottom-line. It’s important to remember too that it’s not just downtime caused by cloud provider interruptions that businesses need to consider. The risk of ransomware is still growing. Recent research shows that 53% of organisations experienced a ransomware attack that impacted business operation, a 26% jump over the previous year. Data loss, financial loss and customer loss were the top three business impacts. Primary email systems may need to be shut down during and immediately after a ransomware attack to help stop the spread of infection. In this scenario, a backup plan to keep email safely flowing and be able to recover data to a known good state is key. Planning for high availability and recovery The issue is that all too often disaster recovery plans are insufficient when it comes to the critical and timely recovery of email communications. This ultimately contributes to the excessive outage times that occur when things go wrong. When developing a business contingency plan, decision makers must determine the acceptable risk associated with email downtime and data recovery in line with the needs of individual users and the organisation overall. This means focusing on Service Level Agreements (SLAs), Recovery Time Objectives (RTO) and Recovery Point Objectives (RPO). This will allow businesses to reduce the risk of potential data loss and speed up the restoration of an email service. It involves looking at employee productivity and legal requirements, processes that rely on email, including customer communication, and other regulatory obligations. Organisations should then implement an email continuity solution that will enable users to continue working with always-on access to live and archived email. Otherwise businesses risk employees using personal email accounts to keep work flow moving and this has serious implications on security and compliance in the workplace. Businesses should also implement a data recovery protocol as for many employees, email archives have now become the primary repository to save and access important information. It’s fundamental that employees are able to continue with their work as normal with as little disruption as possible, meaning data recovery should be a high priority to help aid continuity efforts. This is critical too for recovery efforts following a security breach. When it comes to planning, businesses also need a clear chain of command, should disaster strike. If any critical systems go down, businesses need to know what action to take and understand who is responsible. 
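To make the SLA, RTO and RPO targets discussed above more concrete, here is a minimal sketch in Python showing how an availability figure translates into a monthly downtime budget, and how a stated recovery time objective can be sanity-checked against it. The percentages, the 60-minute RTO and the incident rate are illustrative assumptions, not recommendations.

```python
# Minimal sketch: translate an availability SLA into a monthly downtime budget
# and sanity-check a Recovery Time Objective (RTO) against it.
# All figures are illustrative assumptions, not recommendations.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month


def downtime_budget_minutes(sla_percent: float) -> float:
    """Allowed downtime per month implied by a given availability SLA."""
    return MINUTES_PER_MONTH * (1 - sla_percent / 100)


def rto_fits_budget(rto_minutes: float, sla_percent: float, incidents_per_month: int = 1) -> bool:
    """Rough check: do the expected recovery windows fit inside the SLA budget?"""
    return rto_minutes * incidents_per_month <= downtime_budget_minutes(sla_percent)


if __name__ == "__main__":
    for sla in (99.0, 99.9, 99.99):
        print(f"{sla}% availability -> {downtime_budget_minutes(sla):.1f} minutes of downtime per month")
    # Example: a hypothetical 60-minute RTO against a 99.9% SLA, assuming one incident a month
    print("60-minute RTO fits a 99.9% SLA:", rto_fits_budget(60, 99.9))
```

Even a back-of-the-envelope calculation like this shows how quickly a multi-hour outage blows through the downtime implied by a headline SLA, which is why continuity planning cannot rely on the provider's service credits alone.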
A regular continuity and data recovery performance test is key here – a one-off simulation is simply not enough. Depending on the size of the business, performance testing must occur regularly, and solutions must be frequently checked to ensure the business is prepared to continue as normal even in the most critical situations. When all eggs are placed in a single digital basket, it is essential to ensure that your business has the capability to recover from a situation that cuts off access, locks down, deletes or corrupts data. Building a cyber resiliency strategy  There are also security risks that must be considered when implementing a critical business system availability plan. Importantly, this should feed into wider cyber resilience strategy, which focuses on defence, but also restoration of operations if things were to go wrong. The first stage of this is for organisations to implement layered security. Email security solutions that might have been adequate several years ago often lack capabilities to protect against today’s modern attacks. This layered security should include cloud technologies that help ensure security solutions are always up to date and ready to protect against the latest and fast-moving threats – something that’s much harder to achieve with legacy on-premises solutions. It helps ensure threats are either captured before they have reached the network or defended against and removed if activated within the business. Combating cyber risk in the long-term also requires organisations to look beyond technology and ensure a greater and ongoing focus on awareness training. Over 90% of breaches are caused by human error, targeting individuals with cleverly crafted phishing emails. Moreover, our research found that 71% of attacks over the last year saw malicious activity being spread from one infected employee to another. It only takes one untrained eye to open a malicious attachment and render an entire system as useless or for ransom. Educating employees on what to look out for and what to do if they see something suspicious is vital. Being able to identify a phishing attack or an impersonated email is paramount for protecting data, privacy and the business continuity of front-line services. What’s more, with the General Data Protection Regulation now in full swing, organisations must have a clear plan in place on how they manage and protect critical data stored in email. Failing to adhere to regulations could result in massive financial penalties and poses serious risks for the business’s existence. Worryingly, research from our State of Email Security Report found that data leaks are on the rise, with 41% of respondents noting an increase. This is why it is fundamentally important that in order to minimise the impact of a potential breach, businesses must ensure email systems and data are effectively protected, backed up and recoverable. The bottom line is that organisations that are highly dependent on email communications and use email as a key repository of business records must focus seriously on the continuity of email. This must span beyond cyber awareness and include remediation and recovery to ensure they can get quickly back on their feet. 
Building this into a wider cyber resiliency strategy is key to ensuring email is available anytime and anywhere, even when the worst happens.[/vc_column_text][/vc_column][/vc_row] ### Why Wordpress and what it can do [vc_row][vc_column][vc_column_text]Deciding on a Wordpress hosting service can be a daunting process for a person just starting out. Even for people with experience, finding the right package to suit the needs of you and your company can be just as difficult. The article below outlines some of the key aspects you should consider when choosing a hosting provider for your Wordpress site. In particular, we’ll cover the pros and cons of managed, shared and VPS hosting services, storage and bandwidth. Before we start, for any readers wondering, let’s answer: What is WordPress hosting? By buying hosting for your site, you are subscribing to a service that allows your website to be viewed on the internet. The provider stores your website on servers that deliver the information requested by a person’s computer when they enter your website. It tells their computer what should be displayed. Essentially, wordpress hosting is a form of web hosting that is designed specifically for websites built on wordpress. There are unique elements to the wordpress platform that can cause issues on some sites if they’re not hosting on a wordpress optimised hosting server. Cost and Pricing No matter the size of your business, the cost of hosting is always an important consideration. For many just starting out, it is the main deciding factor too. Try to resist the temptation to just go for the cheapest option you can find, you often get what you pay for with hosting. Examine all the features offered by multiple hosting companies within your price range. Make sure you include companies towards the upper end of your price bracket. The services offered will impact the cost of package, for example the quality and expertise of  support, the healthiness of the network and the technical additions available to you. If you know you need to have the best, then expect to pay a premium for this. An important tip is to work out what your actual needs are as a company and what demand this will place on a server. People often find that their needs don’t match their expectations for what hosting suits their current stage. By understanding this, you’ll be able to work out a hosting package that meets the demands of your service, whilst not charging you excess amounts of money. If you have a small wordpress site, paying between $3 and $30 a month depending on your need is a sweet spot that most companies at least begin within, particularly if revenue is not being generated from the site at this point. Beginning with a budget friendly shared web hosting is a good starting point, as you grow you should scale up the server size and package to match the increasing demand. Performance and Features It is a classic sales trick to be upsold all the bells and whistles, ending up with you paying for more than you will ever need. Whilst some people like to show off how big a hosting plan they have, it makes no sense to waste resources on features you won’t use now, or won’t use in the future. It helps to plan in advance, work out your needs currently and what you predict your needs will be in a years time. Here are a couple considerations: Do you plan to have more than one Wordpress site? Are backup services included in the price? Can you afford them if not? 
The above is by no means exhaustive, but it should help you consider your business and what you wish to achieve in the future. Working out a solid business plan will direct you towards the type of wordpress hosting you need. Throughout the consideration process, your hosting provider needs to guarantee up time, super-fast servers, bandwidth and enough disk space to match your usage. By choosing too cheap a package, you’ll experience downtime and server related issues. In a sense, it’s a false economy, because you could end up paying out more to fix issues that wouldn’t occur with a more expensive package. Further features to research and ask providers about are: Email hosting Control panels Security How good is the customer support? You can tell a lot about a company by assessing the quality of its customer support. If you have a bad experience with the customer support, do you want to trust they will do right when helping you to fix issues with your online business. Perhaps your site goes offline for an unknown reason, is there instant support from a tech team? How quickly can they resolve problems, even the smallest ones? Read up on reviews and feedback for the prospective companies’ customer service. Types of Wordpress hosting To compound the confusion that some might be feeling, there are then multiple types of wordpress hosting you can purchase from providers. The most popular types of hosting being managed, shared and VPS. We’ll discuss each of these below, although it’s important to bear in mind that combinations of these can also come about, for instance managed shared and managed VPS hosting. Managed WordPress Hosting A managed package means that the company will look after the technical aspects of keeping a server online. You do not have to worry about that aspect of the work, you pay the fee and they will deal with the nitty gritty of it. The downside is that there is less opportunity to customise and optimise for those who are technically capable. Shared Hosting Shared hosting is the most widely used hosting option, particularly so because it is the easiest entry point into hosting for sites just starting out. It is ‘shared’ because multiple websites are stored on the same single server. They share resources, although there is usually a limit on how much can be used. The benefits include cheaper access to server hosting. A downside is that if one of the other websites on the server has a large spike in traffic, or adds increased demand onto the server this can impact your access to the network. Virtual Private Server (VPS) VPS servers are often the next step up from shared hosting. Once your website reaches a certain size and puts increased demand on the server, it could be worthwhile migrating to a VPS to increase performance and keep up with the demand. This package works by having a server clearly divided into several smaller areas, with each website having access to the resources contained within their own private area. Unlike with shared hosting, if another website on the server has a huge increase in demand, it should not have access to the resources available to your website, meaning your site can keep functioning without issue. Conclusion To have a successful online business, getting the right wordpress hosting is vital. It will keep your website online, support you through technical difficulties and remove the headache of server side issues. 
All of these problems will just distract you from the main focus - turning your website into a success.[/vc_column_text][/vc_column][/vc_row] ### ThousandEyes Expands Global Multi-Cloud Monitoring Coverage, Adds Support for Alibaba Cloud [vc_row][vc_column][vc_column_text] [/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] ThousandEyes significantly increases Asia-Pacific monitoring capabilities for challenging Internet region; adds to existing multi-cloud support of leading IaaS providers Amazon Web Services, Google Cloud Platform and Microsoft Azure [/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] London, August 14, 2019 -- ThousandEyes, the company that delivers visibility into the Internet and digital experiences, today announced a significant expansion of its cloud monitoring coverage, including the addition of new global Alibaba Cloud monitoring capabilities. This increased coverage builds on ThousandEyes’ robust fleet of Cloud Agents providing global monitoring vantage points from major cloud providers such as Amazon Web Services (AWS), Google Cloud Platform (GCP), and Microsoft Azure as well as global data centers and major ISPs, giving enterprises unmatched visibility and the ability to measure application and network-layer performance metrics for any website, application or service.   [/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] “Global organizations today run on the Internet, connecting applications and services to end-users everywhere, and making deep Internet visibility non-negotiable, which is especially relevant for companies operating in Asia-Pacific where heavy sovereign controls impact Internet performance and digital experience,” said ThousandEyes vice president of product Joe Vaccaro. “ThousandEyes already provides unparalleled visibility into how AWS, GCP and Azure impacts digital experience delivery, and by adding the ability to monitor Alibaba Cloud performance to the mix, global enterprises operating in Asia-Pacific and beyond are empowered with the visibility and control they need to improve customer experiences and business performance.” [/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] The worldwide infrastructure as a service (IaaS) public cloud services market grew 31.8% in the past year, according to Gartner research. Gartner’s data indicates that Alibaba Cloud is the dominant IaaS provider in China, and it experienced the strongest growth among the leading vendors, growing 92.6% in 2018.[1] Notably, according to the 2018 ThousandEyes Public Cloud Performance Benchmark Report, the Asia-Pacific region showed some of the highest performance variability across all of the major IaaS providers, demonstrating the need for in-region vantage points within and across all major public cloud providers. Traditional monitoring tools and cloud-native point solutions both lack the ability to see beyond internal enterprise networks, or outside of their own networks, respectively, meaning companies that rely on either or both are leaving the broader Internet unmonitored, jeopardizing existing investments and are at increased risk of delivering poor digital experiences. 
[/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] ThousandEyes added 19 Alibaba Cloud regions worldwide, plus 13 new Cloud Agent locations across Asia-Pacific, including four new locations in India, bringing ThousandEyes Asia-Pacific vantage points to a total of 53 cities and global vantage points to a total of 184 cities. This latest expansion adds to ThousandEyes’ existing Cloud Agent locations in IaaS providers, which currently includes 15 AWS regions, 15 GCP regions and 25 Azure regions. Now, IT teams can leverage pre-provisioned and easy-to-deploy vantage points in IaaS to measure inter-region, hybrid and inter-cloud performance, map network paths and monitor connectivity between a combination of on-premises and cloud data centers. Companies are also enabled to adopt a data-driven approach when planning multi-cloud deployments, as well as provide immediate visibility into application delivery, network behavior and inter-service dependencies, and their impact on digital experience.   -ENDS- [/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_column_text] About ThousandEyes ThousandEyes empowers enterprises to see, understand and improve digital experiences for their customers and employees. The ThousandEyes cloud platform offers unmatched vantage points throughout the global Internet and cloud providers, delivering immediate visibility into the digital experience for every user, application, website or service, over any network. ThousandEyes is central to the global operations of the world's largest and fastest growing brands, including Comcast, eBay, HP, 120+ of the Global 2000, 65+ of the Fortune 500, 6 of the 7 top US banks, and 20 of the 25 top SaaS companies. For more information, visit www.ThousandEyes.com or follow us on Twitter at @ThousandEyes. [/vc_column_text][/vc_column][/vc_row] ### Info On Cyber-Security Training And Careers [vc_row][vc_column][vc_column_text]Almost every company has switched to digital services in some way or other, whether it be online banking or informational websites. However, just like thieves robbed banks, hackers can get into online banks and perform just as much destructive crime, so how do we prevent this? Cyber-security, of course, but do you know enough information about the training for a career in cyber-security, or the various careers available to you? All of this information has been collected into this article, so you don’t need to worry about those gaps in your knowledge anymore! Cyber-Security’s Importance Cyber-security matters so much, to everyone on any form of internet, because it can affect exactly that many people. “From government officials, to large businesses, to your grandma on her first computer, everyone is affected by cyber-security, or a lack thereof, whether they know it or not,” Rebecca Curran, a network administrator at Academized, suggests. “Your data lingers on the internet, collected by companies who watch your every click and search, so protecting all of that vulnerable data from the greedy hands of hackers and unsavoury individuals who may use it for criminal purposes is a must.” So, why sit back and let other people do the job of safe-guarding people on the internet, when you could train for it and do it yourself? It’s a perfectly viable and stable career, as you will soon learn, so why not be a part of the cyber-security solution? 
Certification First of all, even if you complete training with a certain company (presumably leading up to a job with them) you will get certification which you can show to any company which you want to work for, so that a concrete record can be given to a potential future employer. Your certification will be different depending on what sort of course you took - they range from general to specific skills like data protection, so bear this in mind when searching for cyber-security training. Thinking of any possible future job paths which you’d like to follow may help you to decide what you need to go forward in cyber-security; the answer to that is the answer to which sort of certification you should go for. Free Training Leading on from that, the government has actually realized and recognized the need for more workers in cyber-security, and the lack of skilled candidates, and, thus, has provided a two-hour course of free training for all employees. While it is meant for civil employees first and foremost, any business can apply for this training, so you may be able to pick up a certification within your current job before moving on to a job focused on cyber-security, which would be a stable and sensible education choice before moving into the cyber-security sector. Importance Of Jobs In Cyber-Security Cyber-security is a constantly growing and expanding sector of careers, which is speeding up in its popularity and positive potential future growth at a faster rate than many other industries: four times as fast, in fact, since most have an average growth rate of 7%, but cyber-security has an average growth rate of 28%. This shows that investing in training for a job in cyber-security, then signing up for a career in it, can actually be a very worthwhile procedure which will give you not only the opportunity to work doing what you love, but also work successfully and for the long-term, without fear of having to switch jobs constantly or haggle for promotions, etc., since that sector will be bustling with achievements and stability. Cyber-Security is creating a multitude of new jobs as every company and every government will now need cyber-security experts and professionals that can handle this. You can work in many different areas and focus your skills on helping people. This is a stable industry which can offer opportunities for people of different backgrounds and help them achieve something great in their lives. Types Of Jobs Within cyber-security, there are actually many different types of jobs, which you may not have known before. And this isn’t just one or two jobs, either, but a whole multitude of different roles within the sector. There are so many different roles out there because of the different skill sets required to fulfill those roles properly, for example, a security general would most likely be working in a smaller company with a minor cyber-security need, while a cloud security engineer would have to dedicate their studies to cloud-based platforms and find work in a company which focused on that area. 
So, you have to consider what type of job you’ll want during training - or take general training if you’re not too sure - and make sure you apply to one which suits your skill set.[/vc_column_text][/vc_column][/vc_row] ### Embracing the benefits of a multi-cloud strategy [vc_row][vc_column][vc_column_text]Almost every business today has Cloud as part of its IT - directly or indirectly, through adoption of Microsoft, Slack, Evernote or any of the other emerging workplace applications - Cloud is there! So Cloud Phase 1 is well underway - but what about Phase 2? Do I use a single cloud provider? Should I silo how I use cloud platforms, or run multiple clouds? How can I cope with the complexity that fragmentation brings? At the heart of these questions is how best to approach a multi-cloud strategy. So what is the best way to embrace Phase 2? There are three considerations: Choose the right platform Whilst there is a perception that every Cloud is the same, the reality is different. Some perform better in certain regions or for certain applications; others are designed to satisfy certain regulatory requirements and standards. You can therefore have a hybrid solution where your business requirements are met by different providers. We are also seeing the growing importance of IT departments forming application development operations for their companies. IT has become less static, and greater fluidity in how the Cloud is used is vital for effective application development. For example, many businesses separate out application development (in the Cloud) and production (on-premises). It is therefore important to understand specific business goals and requirements before choosing how to deploy Cloud services, as this can save you a lot of time and money in the long run. Once you are clear on the technical service requirements, you can start shortlisting potential providers. It is all about relationship building Cloud affords fantastic opportunities for business. This in turn does create some complexity in the IT landscape and uncertainty for the retained IT team. Where do they fit? Key to success is working with a Cloud provider that can help you build and then deploy a tailored infrastructure that is specific to the needs of the organisation - one you can trust. The number of public Clouds that companies should deploy depends on the organisation’s requirements and strategy. Again, this is where a good relationship with a cloud provider can play a crucial role. Instead of pushing products and services, a trusted Cloud provider will understand what an individual company needs. Companies should not add more public Cloud just for the sake of it. Whilst it may be tempting to think that having more options offers more operational flexibility, having too many options can actually have the opposite effect and make a company less efficient. Security The most important consideration when exploring the best way to deploy in the Cloud is security. Keeping data safe has become one of the greatest challenges a business can face, and one of the biggest financial and reputational risks if not done properly. The Cloud has made tremendous advances and may well be more reliable and secure than traditional on-premises solutions. However, some business owners worry about having their information ‘somewhere out there in the open for hackers to access’. Indeed, there is a risk in going ‘multi-cloud’ that everyone will think data security is someone else’s problem. 
We have lost count of the number of times our customers tell us that they thought their data was automatically backed up and stored forever by Microsoft Office 365 (it isn’t!). Data security is vital and can’t slip through any cracks; this concern needs to be front and centre as organisations explore the cloud. Key takeaways So how do you approach the multi-cloud conundrum? There is a different answer to that question for every business, depending on their business model and data sensitivity. In addition, it is not just the Cloud itself that businesses should look to adopt, but also the network and partners of appropriate third-party solution providers. This is crucial, and it is important that they understand hybrid integration when it comes to connecting on-premises IT resources with Cloud services. Ultimately, businesses should look for a managed service provider that has a network of partners that complement and assist in the creation of a robust, scalable and comprehensive Cloud strategy that supports the overall business objectives. A well-rounded partnership allows for a proposition focused heavily on both physical and logical security, which is very appealing for customers today. This is exactly the environment that we have created between EACS and Memset in our new partnership. Cloud has rapidly matured and while some successful companies have taken Cloud-first strategies, many still struggle with their core business processes when it comes to managing their cloud strategy. Managed service providers with strategic partnerships can offer a much more integrated service where businesses can have some of their core processes on a public cloud infrastructure, some hosted internally, and perhaps some hosted as a software subscription service.[/vc_column_text][/vc_column][/vc_row] ### Answer Why? What? How? and What if? - Opening Lines [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/338855592/ce5881e343[/embed] [/vc_column_text][vc_column_text] How to engage everyone in your audience The four questions are based on ‘The 4Mat System’ by Bernice McCarthy. The system is useful for structuring presentations in a way that the maximum number of people will understand. It works because members of your audience will be interested, to a greater or lesser extent, in the four different types of questions.  By covering all four of them you will appeal to most, if not all, of your audience.[/vc_column_text][vc_single_image image="52617" img_size="full"][vc_column_text]Think through the four types of questions on the minds of any audience Why it matters Answering the four questions enables you to: Plan content that will interest all, or most, people Structure content in an easy-to-follow sequence Reduce the risk of missing something vital from your talk Increase the chances of gaining buy-in Enhance the possibility of the audience taking actions you desire What to do Put yourself in the minds of your audience and answer the questions below. 1. Why? Imagine you are an audience member: Ask yourself: ‘Why should I listen?’ ‘Why is this topic important?’ ‘Why are you the right person to speak about it?’ ‘Why now?’ Think of more ‘why’ questions pertinent to your topic. 2. What? Ask yourself: ‘What’s the key message?’ ‘What’s the big idea?’ ‘What’s the theory or model?’ ‘What’s the key information or evidence they need to hear?’ Write down any more you can think of. 3. How? 
Ask practical questions such as: ‘How does it actually work?’ ‘Can you explain the practicalities?’ ‘How about some examples?’ ‘Can we see some evidence to support your theory?’ ‘Can you demonstrate it?’ 4. What If? There are two kinds of ‘what if’ questions. a. Negative ‘what ifs’? These are potential snags such as: Risks in what you are proposing Flaws in your argument Counter-examples that might dilute or refute your case Exceptional circumstances when your guidance would not apply Such concerns might sound like this: ‘What if there is not enough in the budget?’ ‘What if the scope of the project gets wider?’ ‘What if the client doesn’t pay us on time? ‘What if I don’t have all that equipment?’ Think of your own questions specific to your situation. Tip Ask someone to play ‘devil’s advocate’ and think of the most difficult questions that could come up. 5. Positive ‘what ifs’? This is where people are already thinking about the future and imagining: Where the information they have gleaned will be of use The positive results of actions recommended A vision of a successful future Such thoughts can translate into questions such as: ‘Where will this be useful to me?” ‘How can I apply this idea? ‘What if I were to put this into practice?’ ‘When will we see the results?’ Decide which of the above questions apply to your topic and add more that relate to your own talk. Keep the answers and use them to help you decide what to include in your talk. Think through the four questions before every talk and you will be well-prepared Your Turn Get used to using the four questions: Think of a topic you have to speak about. Think of all the questions that could be on the minds of the audience. Write them down under the headings; Why? What? How? What If? Use your answers as material for your talk. Useful if you want to explore the system in depth and is helpful for applications to teaching and learning. Learn more public speaking secrets in ‘The Speaker’s Coach: 60 secrets to make your talk, speech or presentation amazing’ by Graham Shaw. Published by Pearson and available from all good bookshops and Amazon.[/vc_column_text][/vc_column][/vc_row] ### Why “cloud” is no longer a buzzword [vc_row][vc_column][vc_column_text]Moving away from talking about cloud may seem like a mad concept - cloud technology is figuratively and literally everywhere. In fact, Gartner predicts that by 2025, 80% of enterprises will shut down their traditional data centers. In essence, however, this means that cloud is no longer the future - it is the mainstream choice for enterprises looking to host their applications, broadly speaking at least, with obvious exceptions such as defense and government. In some instances this is 100% public cloud, but more often a hybrid cloud structure with a mix of private cloud data center and public cloud depending on individual enterprise’s needs. IDC’s research into worldwide cloud spending saw vendor revenue from sales of infrastructure products for all cloud IT grow 48.4% year on year in the second quarter of 2018. In this period, private cloud infrastructure spending alone reached $4.6bn, an annual increase of 28.2%. IDC estimates that for the full year 2018, private cloud will represent 14.8% of total IT infrastructure spending, growing 20.3% year on year. In the UK, one theory for this increase is that it is attributed to ‘the Brexit effect’ as companies retrench more local and controllable dedicated cloud environments  to protect themselves against the unknown and likely turbulent future. 
Speaking to a wide variety of organisations, however, they are not of the opinion that Brexit is shaping their cloud strategies. What is more likely is that with the emergence of new architectures, such as Azure Stack or Google Cloud’s Anthos, which allow companies the flexibility to create on-premise with the potential of moving workloads between public and private clouds in the future, the understood meaning of ‘private cloud’ has shifted; ‘on-premise’ spend is morphing into ‘private cloud’. Private Cloud Use is Becoming More Sophisticated as Adoption Becomes the Norm  This increase in prevalence of cloud means it’s not whether but how companies use cloud that is important. Increasingly companies are establishing frameworks around cloud implementations and migration - in the majority of cases employing a hybrid cloud framework with both public and private cloud alongside SaaS. This comes with the realisation that for some applications to fulfill their purpose they must remain on-premise, but this needn't limit the whole network infrastructure to on-premise. Insight such as this comes from an established use of cloud already leading companies to consider their network as a whole and incorporate cloud as a mature technology. Not only is it clear from company behaviour that the cloud market is mature, there’s data to support it too. Gartner’s The Future of the Data Center in the Cloud Era suggested that multi-cloud would be a common strategy for 70% of enterprises this year (2019). IDC frames public cloud spending as a $370 billion market by 2022, in its Worldwide Semiannual Public Cloud Services Spending Guide. The report shows a five year compound annual growth rate (CAGR) of 22.5% through to 2022. Finally, Forrester reports that nearly 60% of North American enterprises already rely on public cloud, five times the number that did five years ago. It’s time to shift the conversation. All this suggests that we need to move the focus from whether or not cloud is the ‘next big thing’ and explore how it can be best utilised to benefit all businesses, from small to large. Implementing cloud Looking to the future, companies need support in building out their cloud infrastructure - making sure they can make use of both on-premise and public cloud services. There is a vast array of options out there in a heavily competitive market.  Cloud deployments are no longer pilot schemes, at ThousandEyes we’re seeing our customers build cloud applications into their entire IT strategy. This shift means cloud and applications delivered from it are core components and so must be treated with the same discipline around risk and lifecycle that essential on-premise applications historically have been. When implementing cloud strategies, what is not to be overlooked is the need for companies to see their entire infrastructure in order to manage it. Both public and private cloud networks need to be accessed and understood fully. Increased complexity of networks requires a more sophisticated level of network visualisation, which has typically only been limited to on-premise infrastructures. 
The industry is increasingly recognising this and now a wider range of products is available to support companies who are looking to re architect their digital infrastructure, supporting their digital transformation journeys.[/vc_column_text][/vc_column][/vc_row] ### Big data made smart - Putting confidence back into decision-making [vc_row][vc_column][vc_column_text]Google the word “connectivity” and you’ll be presented with thousands of web pages on the issues and challenges associated with a more connected world. However, it’s important to recognise how to embrace the benefits too. As connected devices proliferate, so too do their capabilities – real-time decision-making is becoming more important than ever before if we’re to make smarter, more informed decisions. This is why our cloud computing architectures need to be able to handle rapid data transactions – but, currently, they are too centralised. As a result, we’re seeing the current movement of computational capacity, away from the cloud and onto the edge. Enter: Edge computing Edge computing is undeniably one of the biggest developments in the blossoming Internet of Things (IoT) sector to date. In fact, McKinsey’s top trends list reports that in many industrial sectors, particularly those with mobile and remote assets like in the oil and gas industry, analysing data at the edge may be more cost-effective than moving data from the edge. This allows analytical decisions to be made in real time. Edge computing therefore presents the opportunity for businesses to make large-scale decisions, by analysing data procured at the edge – at the edge. Real-time insights like these improve both speed and accuracy by removing the need to stream all data from the edge back into the enterprise or cloud. As such, organisations need to push their analytical computer power further out to the edge, if they are to reduce their costs. What’s more, edge computing is set to grow. Predictions from McKinsey suggest that it will represent a potential value of $175B - $215B in hardware. Before we know it, edge will be all around us. Tech companies are disrupting the sector and those opting to move to the edge will be able to make smarter decisions, while feeling more confident in the decisions they are making at the time they make them. Putting it into practice You may be wondering how edge computing can help a business to make smarter decisions when it comes to IoT projects. Enterprise companies now have large volumes of data to deal with. Most of this is never analysed and no value is gained from it. This is because it can be costly and complicated when there are so many transactions happening in multiple systems all the time. This is why businesses are working to join the data across systems, normalise them, analyse the data and make decisions from it. The competitive advantage this offers is the reason that data management and analytics is now such a massive industry. Edge analytics and IIoT Companies adopting IoT initiatives have millions of devices producing data in real time. Even prior to IoT these organisations were struggling to cope with the volume of data they had – today that struggle is epic. This is where edge computing comes into play – enabling enterprises to analyse the data that adds value. Let’s take the manufacturing industry as an example. The manufacturing sector is leading the way in embracing smarter technologies such as sensors, to become more connected – and smarter. 
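As a rough illustration of what "analysing the data that adds value" at the edge can mean in practice, the sketch below (with made-up readings and an assumed alert threshold) aggregates a window of raw sensor samples locally and forwards only a small summary and any anomalies, rather than streaming every reading back to the cloud.

```python
from statistics import mean

# Hypothetical raw vibration readings from one sensor, sampled at the edge.
readings = [0.42, 0.45, 0.44, 0.47, 1.93, 0.43, 0.46]
THRESHOLD = 1.5  # alert level agreed with the maintenance team (assumed)

def summarise_window(window: list[float]) -> dict:
    """Reduce a window of raw samples to the few numbers worth sending upstream."""
    return {
        "count": len(window),
        "mean": round(mean(window), 3),
        "max": max(window),
        "alerts": [v for v in window if v > THRESHOLD],
    }

summary = summarise_window(readings)
print(summary)
# Only this summary (a handful of bytes) leaves the site; the raw stream stays at the edge.
```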
Currently, lighthouse factories are paving the way for manufacturers looking to become smarter. Edge computing is essential for smart manufacturing. This is because, in an IoT world, it is the sensors and the connected devices that live at the edge. With this in mind, the analytics on the data is best placed to happen on location, rather than being moved to a centralised storage location. Edge analytics enable the right data to be put into the right peoples’ hands. As opposed to putting large volumes of valuable data into more peoples’ hands than necessary, it’s better to let AI do the grunt work at the edge – enabling humans to focus on the value-add areas. Ultimately, by combining edge analytics with the Industrial Internet of Things (IIoT), manufacturers can unleash their vision, optimise machine reliability and production, improve quality and drive the creation of innovative business models. If one thing is for sure, it is that most industries have the opportunity to embrace edge and can therefore derive value from this shift. Without it, they’ll suffer from slower decision making, volumes of data too large to manage and an increase in costs. So – what are we waiting for?[/vc_column_text][/vc_column][/vc_row] ### Marketing automation will create captivated and engaged customers [vc_row][vc_column][vc_column_text]It might seem counterintuitive, but marketing automation can actually increase levels of brand engagement.  Customers now expect relevant and personalised marketing interactions with brands across various channels and platforms, so automated marketing makes this much easier and effective. This ability to personalise experiences with your brand for each customer creates better engagement, improves business workflow and increases campaign ROI. We’ll take a look at some of the ways marketing automation can be used to captivate your customers, leading to a more engaging journey with your brand. 1. Identify your customer journey with automated analytics A customer’s journey with your brand is never going to be linear. Increasing marketing channels such as social media, email and online ads mean customers will all take individual routes through brand interaction. Marketing automation tools make it much easier to track a customer journey, easily identifying key milestones in the entire process which can then be used to up engagement. This can include the initial channel, buying behaviours and the different paths customers take within the process. Even better, you’ll also be able to clearly identify key areas where the customers lose interest or disengage completely, allowing you to reconnect with these customers quickly and with relevant content. These insights combine to develop an informed, user-focused strategy to make your sales funnels much more engaging. And that will make them more effective.   2. Use automatically triggered emails to boost engagement Email is an often under-used marketing tool, but utilising email automation is an effective way to segment, personalise and engage with the customer. Automatically triggered emails let you nurture the customer with carefully created content based on their past interaction with your company, with automatic systems making this easy. Sending welcome emails, special date reminders and discount codes all personalised to the customer after a sale or specific action makes them feel valued and appreciated, increasing engagement. 
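As a simple illustration of how such triggers can be expressed, the sketch below (the events, template names and delays are invented for the example) maps a customer event to an email template and a send date. A real marketing automation platform manages this for you, but the underlying rule is this simple.

```python
from datetime import date, timedelta

# Hypothetical trigger rules: map a customer event to an email template and a delay.
TRIGGERS = {
    "signed_up":        ("welcome_series_1", timedelta(days=0)),
    "abandoned_basket": ("basket_reminder",  timedelta(days=1)),
    "first_purchase":   ("thank_you_offer",  timedelta(days=3)),
}

def schedule_email(customer_email: str, event: str, event_date: date) -> dict | None:
    """Return a scheduled send for a recognised event, or None if no rule applies."""
    rule = TRIGGERS.get(event)
    if rule is None:
        return None
    template, delay = rule
    return {
        "to": customer_email,
        "template": template,
        "send_on": event_date + delay,
    }

print(schedule_email("jo@example.com", "abandoned_basket", date(2019, 6, 3)))
# {'to': 'jo@example.com', 'template': 'basket_reminder', 'send_on': datetime.date(2019, 6, 4)}
```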
You'll also be able to send automatic requests for reviews after a purchase or visit to your store, in addition to individualised upsells and recommendations. Automated behavioural emails are a great way to maximise your sales as well as to create a more loyal and engaged customer base. 3. Boost social media engagement with marketing automation Social media is really important for any business when it comes to engagement. Your customers want to interact with your brand via various platforms, including Instagram, Snapchat and Facebook, and marketing automation will underpin all of this activity to amplify your brand message. Marketing automation makes the whole process much simpler, making it easy for you to integrate content, analytics and other marketing campaigns. Social marketing automation lets you: Know which types of posts your customers find engaging, and create future content accordingly. Add sharing buttons to all of your site content to make it easier for your customers to share posts with their friends, increasing levels of social proof and brand awareness. Assess engagement by tracking click-through rates and seeing which content is downloaded the most. 4. Test to maximise engagement Every business needs to test the effectiveness of its marketing campaigns. Marketing automation gives you the tools to easily test new products and marketing promotions with easily identifiable groups. Automation makes data selection and segmentation of your customer base much simpler, so you can test different marketing messages on different demographic or customer groups. Analysing the results is also made simple, giving you real-time insight to develop, roll out or improve certain campaigns. This information will allow you to segment future campaigns to maximise engagement, offering each customer their preferred way to interact with your brand. 5. Use real-time automation on mobiles Marketing automation through Wi-Fi proximity marketing or SMS messages on a mobile gives you the tools to communicate with customers in real time. The ability to send a push notification, a chat-bot pop-up or a special offer through a downloaded app to a customer close to your business increases sales and upselling opportunities. This is particularly useful for retailers or hospitality businesses, who can target customers with relevant but automatic communications. Interaction with customers via their mobiles also gives your customers a cohesive marketing experience across all devices, as well as providing your business with extra data to improve your sales strategies. 6. Website tracking will increase customer engagement too Marketing automation through web personalisation allows you to tailor your website experience to your visitors, even before they've shared their personal information with you. Website ID technology gives you the tools to use browsing data and location information to create a more engaging visit for your customer. For example, visitors can be targeted with ads based on location, or directed to certain products based on their click-throughs. This is a really useful tool to increase engagement; not only can you direct visitors through a more relevant website journey, you'll also gather important tracking information on how and where customers do (and don't) engage. This allows you to tailor and adjust your campaigns to maximise leads and sales funnels.
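To illustrate the segmentation and testing ideas above, here is a minimal sketch – the customer records, rules and thresholds are invented – that places customers into recency-based segments and assigns each one a stable A/B variant, so the same person always sees the same test message.

```python
import hashlib

# Hypothetical customer records with the attributes used for segmentation.
customers = [
    {"email": "a@example.com", "last_purchase_days": 12},
    {"email": "b@example.com", "last_purchase_days": 95},
    {"email": "c@example.com", "last_purchase_days": 430},
]

def segment(customer: dict) -> str:
    """Very simple recency-based segmentation."""
    days = customer["last_purchase_days"]
    if days <= 30:
        return "active"
    if days <= 180:
        return "lapsing"
    return "dormant"

def ab_variant(email: str) -> str:
    """Deterministic 50/50 split so a customer always lands in the same variant."""
    digest = hashlib.sha256(email.encode()).hexdigest()
    return "A" if int(digest, 16) % 2 == 0 else "B"

for c in customers:
    print(c["email"], segment(c), ab_variant(c["email"]))
```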
Marketing automation creates more engaged customers Marketing automation is a key tool to increase customer engagement. Its doesn’t just make regular communication with your customer base easy and instant, it allows you to create personalised and segmented marketing messages based precisely on the past behaviour of the customer. Used properly with triggered emails, website tracking and social media consistency, this data gives you the ability to create truly engaging content for each customer, increasing interactions, sales and loyalty.[/vc_column_text][/vc_column][/vc_row] ### Improving inspections in confined spaces with the right industrial drone [vc_row][vc_column][vc_column_text]Confined spaces like mine shafts, electrical vaults, storage tanks, boilers, culverts, tunnels, pipes, ducts and drainage shafts are all essential parts of industry that require careful inspection, but they’re not built for continuous occupancy. Confined spaces frequently have low air quality, toxic liquids or gases, limited entry and exit, risky electrical setups, free-flowing liquids or granular solids, and other conditions that make them hazardous to humans. For too long, industrial companies have faced a difficult choice: expose employees to massive risks by frequently sending them into confined spaces for equipment and infrastructure inspections, or inspect less frequently and risk equipment or infrastructure malfunction that could have deadly consequences. Further, when a worker is in a confined space completing an inspection, it’s an onerous and time-consuming process, with his or her movements often hindered by the limited space and making it difficult for a thorough inspection to be completed. All told, confined space inspections are an aspect of industrial operations that has been in desperate need of improvement. Thankfully, that improvement has arrived with the development of automated industrial drones. A job fit for a drone Considering that the potential hazards faced by humans in confined space inspections include getting stuck or crushed, running out of oxygen, dealing with extreme temperatures, or being exposed to toxic chemicals and gas, there are few jobs better suited for being taken over by industrial drones, allowing human inspectors to do their work using a high-definition live video stream. Not only does using industrial drones for this job make the process infinitely safer, but it also makes the process much less expensive by reducing the time it takes for the inspections to be completed, reducing the need for operational downtime, and eliminating the need for specialized equipment for inspections, such as scaffolding, cables and rope access, and breathing aids. Industrial drones and their range of sensors also allow for the capture of high-resolution imagery as well as a range of precise data that may be superior to what a human inspector could collect in these high-risk inspections in limited space with often extreme conditions. Overall, industrial drones enable companies to quickly perform frequent, low-cost inspections without enduring downtime or exposing employees to serious health hazards and occupational risks. There’s really only one way using industrial drones for these types of inspections can be made any better, and that’s with automation. The necessity of automation While any high-quality industrial drone represents an improvement in the confined space inspection process, it’s only an automated industrial drone that can truly optimize these inspections. 
Automated industrial drones are drones that handle the entire flying process automatically, without requiring a human pilot. Not only does this cut costs significantly, but it also takes the chance of human error out of the inspection process, producing better results in areas that are difficult to navigate and reducing the potential for damage to the drone. Just as importantly, leading industrial drones built with end to end automation also perform their own maintenance. This includes changing their own sensors so they can stream live video, test for emissions and leaks, and gauge temperature, among other inspection applications. It also includes changing and charging their own batteries, which is essential for industrial inspections of all kinds because it means these drones are ready to launch at any moment, making them invaluable in an on-demand and even emergency capacity. Considering the disasters that can occur at oil and gas facilities and other industrial sites, the quick response of an automated drone could very well prevent the loss of life and damage to the facility and surrounding area. Low-flying, high expectations When you picture drones it’s all too easy to picture high-flying missions over soaring expanses, but some of the most important work industrial drones can do is in tight areas, particularly with confined space inspections in areas like sub-cellars, silos, utility vaults and pump stations. Industries like oil and gas, alternative energy, mining, construction and critical infrastructure will greatly benefit from turning over responsibility for these high-risk inspections to the automated drones that can handle them quickly and efficiently with low associated costs and reduced downtime. This is just one area of industry that automated industrial drones are revolutionizing, and considering the dangers and complications associated with this particular area, that revolution is coming none too soon.[/vc_column_text][/vc_column][/vc_row] ### Top Fintech Business Startup Ideas to find Easy Investment [vc_row][vc_column][vc_column_text]Fintech startups across the globe are now coming up with a lot of fresh new ideas. There are a considerable number of RFS (requests for startups) also being raised daily across the globe. There are many investors also like angel investors who are ready to fund for innovative startups. The objective of many online platforms for startup establishments is to connect entrepreneurs with investors in any industries like software, healthcare, logistics, or infrastructure development, etc. Many of the corporate firms like Y Combinator and Plaid etc. have their RFS platforms for fintech startups through which anyone can raise a request for their business ideas to get funds. When you consider any successful startup, one can find a common thread as they all are focusing on solving some real-world problems. Here, considering the real-world problems of our times, we are putting forth some startup ideas, which the investors may blindly put their funds into. 1. Clear bills You also may have found it troublesome to see many incomprehensible items on the service bills you get. Some of the providers, especially services, may have made it more difficult for the users to identify what the given charges are all about and what they are paying for. 
So, benchmarking these services is a key aspect to consider and has huge potential as a startup business idea: comparing bills with other providers offering the same services, showing customers what they are actually paying for, and working out which provider ensures the best return on investment. 2. Consumer-centric loans In the United States, once a loan has been paid out to the consumer, a servicer is responsible for collecting repayments until it is completely paid off. This is a highly impersonal relationship, and the process can often feel dehumanized. Take student loans, for example: around 44 million Americans hold student loans, adding up to about 1.5 trillion dollars, and navigating the student loan process can be a daunting experience. Some major issues to address in this domain are: Servicers may give inconsistent information about repayment plans, which can ultimately reduce the payments received. Lenders may not administer the services you use in the way you expect. Lenders may report inaccurate information to the credit agencies, which can adversely affect you. Fintech startups that address these issues have clear scope. 3. Hardware and software support for branches Digital banking has now grown big, and the majority of consumers prefer this mode of financial transaction. Most day-to-day interactions with banks are now done through internet banking or ATMs. Financial services providers like libertylending offer their end-to-end service online; however, there is still a significant volume of conventional banking done in person. Some of the applications you see now may seem far-fetched, but many new technology prototypes are already being tested, such as video ATMs that centralize all financial services in a single terminal with remote banking assistance, and conversational AI applications that play a big role in supporting customers. 4. Tax preparation Plenty of software exists for tax preparation. However, many of these applications hardly cover the unique needs of particular taxpayers, such as someone running a home-based side business. It is estimated that only about 30% of tax preparations in the US are done using software. More than three hundred thousand people in the US work in tax preparation consulting, and it is a $10 billion industry, so there are huge opportunities for startups that can dig into this problem and come up with practical solutions. 5. Mobile banking Few people enjoy visiting a bank branch and wading through paperwork to open an account. If this is a chore even for city dwellers, what about the more than four billion people living in rural areas? There is still a huge, underserved category of the population who would like a bank account and access to financial services. A reassuring stat is that about six billion of the world's seven billion people now have mobile phones; a UN study even shows that more people have access to cell phones than to toilets. Mobile banking applications that allow account opening and transactions without visiting a branch can therefore make a huge impact on the financial market. 6. Core banking There are bigger expectations now of core banking systems than simply recording account transactions.
Internet banking services and fintech mobile applications now demand that banks offer end-to-end integration with their proprietary and branch banking systems. New-generation banking is all about branchless banking over traditional banking; however, replacing or upgrading existing systems can be a huge task. Digital banking applications can effectively close this gap, and from the customer perspective, third-party API services offer innovative ways to get around the hassles of traditional banking. 7. Stock brokerage as a service Buying and selling stocks through a brokerage now costs about $65 per transaction on average. This is one major reason why small-scale investors who are otherwise willing to trade stay out of the game. Many fintechs now offer their clients free trading platforms without transaction limits. Thinking this way can create a far more flexible and cost-effective ecosystem of trading services. 8. Personalized insurance New-generation consumers are used to accessing services and products at the click of a mouse or a swipe on a phone screen, and they don't want insurance services to be any different. Rather than running to advisors, consumers want to explore insurance offerings when they need them most and when they have time to do so. Technologies like IoT now offer small devices that collect relevant data and put it to use for the benefit of users. All of this data can be valuable input for insurance providers, helping them offer customized, flexible policies and monitor policy terms and services. Even though fintech is a feasible and profitable startup space, it requires thorough knowledge and careful planning to launch and survive.[/vc_column_text][/vc_column][/vc_row] ### What is Impact, what impact are you having as a leader? | Opening Lines [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/339313639/7e611fa2ba[/embed] [/vc_column_text][vc_separator][vc_column_text]You always have an impact on people and the world around you, whether you pay attention to it or not. Ultimately, everyone is responsible for the impact they have and the impact they want to have. And the more senior you become as a leader, the more people you impact, and therefore the more responsibility you have for making sure you get it right. That's how important your impact becomes. To have impact is closely linked to the ability to influence. The difference between influence and impact, even if they overlap at times, can be described like this – "If you can influence, you have an impact". To influence someone is to get them to take on board our ideas, suggestions or directions. This can happen through logic, facts, behaviours, emotions and peer pressure – to mention a few. When you have influenced someone to take on a new thought, argument, action or behaviour, you have had an impact on them. It's HOW you do business that creates your impact B2B: Business to Business. B2C: Business to Consumer. These are common ways of describing whether a business is providing products and services to another business or directly to the consumer, the end user. This is to some degree an important distinction, as it impacts or even dictates how an organization is organised and how it needs to operate. But ultimately all business is H2H – it's Human to Human.
It’s people who make decisions to buy or not buy, to stay loyal to a brand or not, to recommend a company or not. We may for example think that we have a contract with an organization to deliver service, but that’s only part of the truth – it will always be people who decide to sign that contract or not. If that person or those people don’t have a great experience with us, they may choose to sign a new contract with someone else instead of signing or re-signing with us. But don’t underestimate the power of the human aspect of ALL contacts an organization and its people have. It’s all about people. People make decisions. Connections happen between humans. And this is why HOW we conduct business; HOW we behave in interaction with others, HOW we make others feel, is so important – and if anything, is only becoming more and more important. Every interaction matters and the impact you have in each interaction. You are not just representing yourself, you are also representing the whole organization and what it stands for. Throughout the history of mankind, the ability to create relationships with others, to connect and collaborate with others has been a key success factor. The concept and power of impact has always existed. Impact starts from within How you feel about yourself affects how you think about yourself, and how you think about yourself affects how you feel about yourself. So the most important factor to consider when assessing your current impact or planning for your desired impact is how effectively you are leading yourself and taking control of your “inner self”. This inner system consists of your beliefs about yourself, your self-esteem and level of self-confidence, driven by your thoughts and feelings. It’s your whole mental and emotional self and a big driver of everything you do and what the world experiences with you. If you want to have a great impact on the world around, start first within. Get to know and understand yourself so well that you recognize and can take control of the impact you have. Effective leadership impact comes down to this: what state (of mind/emotion) do you want to elicit in others? Therefore, what state do you first need to elicit in yourself? If you want to be inspiring, you need to be inspired. Your impact is about the ripple effect you create Impact happens on a one to one basis, with individuals, and on a one to many basis, with teams and groups. Negative impact may for example be as simple as checking emails on your phone when in a one to one situation. How do you think that makes the other person feel? What impact are you having on them? Will they want to go that extra mile for you? And who are they meeting next? How may they affect that person as result?  Equally, if you’re presenting to a room full of people, handing out awards and getting the recipients names wrong, your personal brand will be negatively affected. People will feel that they are not important enough to be remembered, or that you didn’t care enough to pay attention to the details. And people tend to remember those situations, so you will now have to work harder to reclaim some lost credibility and achieve the impact you want. Positive impact can be as simple as saying thanks to someone who helped. It doesn’t matter how small or large, just taking the time to stop and say thanks can have a huge impact on people. In the busy world that we live in we can easily forget this important and impactful effort. 
Creating an Impact Strategy Given that impact is that important, you need to challenge yourself to become aware of the impact you have or maybe lack. You may have a lot of strategies, for the business, for change initiatives and more, but you also need to have a strategy for your impact and therefore what that will do for the business. What all leaders have in common is that they always operate through others, they need to enable employees to do a great job. This is why your impact becomes your most important strategy in order to deliver the desired and expected results. Your impact is and should be bigger than you. And as a senior leader in particular, it’s not about raising your own profile, your focus on impact is for the good of the business, the greater good.  You need to recognise that creating impact is a positive, powerful and respectful commitment to excellence. And most importantly you need to do it in an authentic way, a way that suits you. Whatever leadership role you’re in, it’s your duty to ensure you have a strategy for your impact. Things move fast, we’re all surrounded by constant change. Leaders need to create impact in the moment, to not lose the power of that moment. No one is perfect and no one will get it right all the time, but they need to at least seize their most important moments and create the impact that will help them connect with others in a respectful way, to create trust, get others to listen to them, to influence effectively, and to drive results. Going into the future, our ability to have a good or even great impact is becoming more and more important. We all need to think about the effect we have on others and what effect we want to have. ”How” we operate rather than simply ”what” we do is becoming more critical to success. In fact, it is fast becoming the differentiating factor for successful executives, leaders and organisations overall. By actively CREATING THE IMPACT they want, leaders are demonstrating they are more in charge of, and can better predict the outcomes they get. We all need to manage our personal impact, and the effect our impact has on all our stakeholders, both in the short- and the long-term. Whatever your impact ambition is, you need to build a tailored IMPACT STRATEGY for yourself.  Sustainable, long-term impact comes down to behaviours, what you intentionally create, how you react and respond. Understanding your audience Every single person is different to the next and therefore needs to be approached differently, the Platinum Rule comes into play here – ”treat others as they want to be treated” - as opposed to the more commonly known Golden Rule – ”treat others as you want to be treated”. With the Platinum Rule you don’t assume that everyone is the same as you. Great communication, great influence and great impact always comes down to how well you understand your ’audience’ and how well you adapt your style to match them. You are having your people/situation ’radar’ on to tell you what’s going on around you, what’s coming up, what’s around the corner, so that you can act and behave with integrity and presence and relevance. Think about it. If you’re treated as if you’re unique, as if you’re understood, you more likely to feel as if the person treating you in this unique way is having a positive impact on you. Right? So, to create the positive impact you want, you need to think about how you can treat others in a unique way, while still being authentic. 
Being authentic makes you feel comfortable and the effect is that it makes the other person at ease too. Being Authentic and using your ULP Just as every person is different from the next, so are you. You bring something that is unique and that differentiates you. You need to know what that is, so that you can tap into it and use it actively to bring out the best of you. And that in turn brings out the best in others. Powerful impact is fueled by your Unique Leadership Points, the combination of personality, background, experience, strengths and characteristics that you alone have. This is rarely fully understood, as most people don’t have a comprehensive grasp of their strengths and what makes them unique. No one can be best at everything, but everyone is best at something. The best impact you can have is when you are ’being you’. It’s good to be aware of when that works for you and when it doesn’t. There are times when you need to adapt your style to connect with another person while still being authentic, to have an impact. So it’s a fine balancing act between being attuned to yourself and others at the same time How actions and behaviours drive impact Our days at work are full of what we do and how we are. Our actions make up the what while our behaviours make up the how. Having an impact is your job as a leader. It’s not a nice to have, it’s a necessity. It is the job. Mandy Flint and Elisabet Vinberg Hearn are leadership strategists with a focus on future trends for leadership. They are multi award-winning authors, their 3rd Book for FT publishing The Leader’s Guide to Impact is out now.[/vc_column_text][/vc_column][/vc_row] ### How Visual Content is Changing the Company Training Game [vc_row][vc_column][vc_column_text]Words alone aren’t the best way to communicate. In fact, sometimes words can get in the way of what we’re trying to say. But if words aren’t the answer, where do we turn to get our message across? TechSmith research shows that two-thirds (67%) of people understand information better when it’s communicated visually. And people are consuming more and more visual content: a Nielsen report from last year found that US adults spent almost six hours per day consuming video content during the first quarter of 2018, an increase of 11 minutes on the previous quarter.  When it comes to learning and development for your employees, incorporating visuals can be a game-changer. Visual content relays information more quickly and efficiently, and aids retention. It’s a win-win situation: employees receive more engaging, interesting training, and employers benefit from better-trained staff. There are tons of other bonuses too, including cost, flexibility, and scalability. A more dynamic approach to L&D Let’s say that a software company needs to ensure that all their consultants are experts in the software they’re selling and up-to-date with current trends in the industry. Consultants are dispersed across many different countries, making it challenging to deliver efficient, cost-effective training. Instead of attempting to train each new hire face-to-face (or even in multiple group settings) and issuing explanatory documents, the L&D team could instead create a series of instructional videos. Simple-to-use tools for screen capture, screen recording, and video editing like those provided by TechSmith make the video creation process easy. There’s no need to hire a professional, external video production team, as videos can be created easily in-house. 
The same videos can be repurposed for different regions by creating voice-overs in different languages, or adding appropriate subtitles. As all the videos are easily editable, they can be quickly tweaked to keep the training relevant and up-to-date as necessary. Training videos are saved in a central location so that they can be accessed and revisited by all staff at any time. And any face-to-face training that does take place can be recorded, screen-captured, and added to the training video library so that it benefits everyone. This method is kinder on the budget, effective, and scalable. It results in better information absorption and retention. But what’s even more revolutionary is that learning becomes self-directed. Employees can go back and revisit training materials when they need a refresher and can learn at a pace that best suits their needs. And, when they learn how easy it is to create these videos, they can even start creating and sharing videos to help highlight work they’re doing, results they’ve seen, or provide their own trainings on new systems or approaches to the work. Staff are motivated to actively participate in this social, dynamic form of learning, which just can’t happen with traditional methods of training like face-to-face sessions, PowerPoint presentations and instructional PDFs. Lessons in visuals from the marketing team While many marketing teams are adept at creating video content, HR and L&D teams may feel like they don’t have the necessary skills, tools, or confidence to make brilliant videos. But there are lots of tips and techniques that L&D professionals can borrow from marketing to make sure that training videos are as effective as possible. These are our top tips: Tell a captivating story Good marketers know that product-focused content is unlikely to catch the attention of their audience. Instead of concentrating on the hard-sell, they present a narrative that resonates with their customers and makes an emotional connection, in an effort to establish a lasting relationship between brand and consumer. L&D professionals can try the same technique, creating a story and characters that effectively convey the message and learning points with much more impact than a simple instructional video. Think about the overall experience When they’re considering how to engage with consumers, marketers focus on the overall experience and try to ensure that every interaction is on-brand. When designing training videos for learners, L&D professionals need to think through the overall experience too. It’s important to ensure that each element of a video is consistent, and that each video makes sense in the context of a whole learning pathway. A recognisable message and tone throughout will help with engagement and information retention. Personalise the content Marketers are skilled at creating personalised messages to appeal to different target markets, and this is another lesson for L&D teams. An effective training video allows users to choose the way they navigate material, including how fast they move through the content and what topics they learn. Keep it short Recent TechSmith research found that when it comes to video, brevity is definitely best. Almost three-quarters (74%) of UK viewers prefer informational videos that are no more than six minutes long. Marketers are well aware that today’s consumers tend to be time poor and have short attention spans, which explains the rise of super short video content. 
Training videos should be relevant and where possible, kept short and concise too, without too many learning points packed into each video. How to make a training video Now you know the ingredients of a successful training video, follow these steps to get started. Make sure you’re clear on who your audience is and what topic/s the video will cover, including the content you’ll include and the learning points that you want viewers to come away with. Don’t try to overload the video with information. Focus on a topic and get the information across in the most efficient manner. Choose the best format. Will the video be a screencast, demo video, presenter video, role play, or a combination? Write a script for your video and then create a storyboard. This means using images to show the sequence of your video. Once everything is prepared, start shooting! You don’t need fancy equipment; your smartphone and screencasting software should be plenty to start. Edit your video to create a finished piece. The TechSmith Assets library contains resources including motion backgrounds, animations, images and music that you can add to your video. Annotations, text overlays and animations can also help bring your video to life. Once you’ve finished the first draft of your video, get stakeholders and colleagues to take a look at it to check you’re on the right track. You can use TechSmith’s Video Review tool to streamline this process, and then incorporate any stakeholder feedback into your next version. When your video is finalised, you need to render it from the video editor into a video file — usually an MP4 — and then choose a place to host it. YouTube and Vimeo are popular hosting options, but you may just host it on your intranet if you want it to be private to your employees. [/vc_column_text][/vc_column][/vc_row] ### Where will the next big platform for innovation come from? [vc_row][vc_column][vc_column_text]In a world turning digital at speed, entire industries are regularly disrupted and totally reinvented. Faced with such rapid evolution, it’s easy to get caught up in industry hype around the latest technological breakthroughs – but where will the next rich vein of innovation in our work environments actually be unearthed? Typically, improvements in process, increased productivity and enhanced efficiency define innovation – even more so for businesses. Light and lighting meets all of these requirements. You rarely give a thought to the ubiquitous fittings which have the power to unlock significant benefits from efficiency through to enhancing productivity and even enabling the Internet of Things. If nothing else, the sheer ubiquity of light sources makes lighting a terrific platform for innovation. In the planning stages, lighting is one of the first considerations –inefficient lighting is one of the biggest energy expenditures and a cause of greenhouse gas emissions globally. With the UK Government’s 2050 net zero greenhouse gas emissions proposal, the pressure on businesses to make their operations greener is even greater. Believe it or not, simply upgrading to energy-efficient LED lighting can unlock significant budget by drastically reducing energy usage and expenditure. Adding smart sensors and controls to LED lighting through a management system such as Interact Office compounds those savings and creates an intelligent network of sensors across your building. 
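The scale of the saving is easy to sanity-check with some back-of-the-envelope arithmetic. The sketch below uses entirely assumed figures – fitting counts, wattages, operating hours and tariff – to show how an LED retrofit plus occupancy-based controls compounds into a large percentage reduction.

```python
# Back-of-the-envelope saving from an LED retrofit; every figure here is an assumption
# for illustration, not a quoted benchmark.
FITTINGS = 400                 # number of luminaires in the building
OLD_WATTS, NEW_WATTS = 60, 25  # fluorescent fitting vs LED replacement
HOURS_PER_YEAR = 2600          # roughly 10 h/day, 5 days/week
PRICE_PER_KWH = 0.14           # GBP
OCCUPANCY_FACTOR = 0.7         # sensors dim or switch off lights in empty areas

old_kwh = FITTINGS * OLD_WATTS * HOURS_PER_YEAR / 1000
led_kwh = FITTINGS * NEW_WATTS * HOURS_PER_YEAR / 1000
smart_kwh = led_kwh * OCCUPANCY_FACTOR  # controls compound the LED saving

for label, kwh in [("Existing", old_kwh), ("LED only", led_kwh), ("LED + sensors", smart_kwh)]:
    print(f"{label:<14}{kwh:>10,.0f} kWh  £{kwh * PRICE_PER_KWH:>8,.0f}/year")

print(f"Overall reduction: {1 - smart_kwh / old_kwh:.0%}")
```

With those illustrative numbers the combined reduction lands around 70%, in the same territory as the savings described below.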
A modern office building with efficient, smart connected lighting can save up to 80% on energy – significant savings that unlock revenue. However, energy savings are only part of the equation. With centralised lighting and sensors, the potential for both productivity and energy efficiency is unlocked. LED luminaires with integrated sensors can collect data on lighting performance and sense the environment around them. For example, they can collect anonymised data that provides insights into how workers use the workplace. Everything - from heating, ventilation, and air conditioning (HVAC), to cleaning and space usage - can be monitored and made more efficient. As a result, energy usage and costs can be reduced while also boosting productivity. One example of how innovation will boost employee and office productivity is occupancy estimation. Recent studies indicate that 25-50% of office desks in buildings are vacant on any given day. Interact Office's space management helps to understand occupancy rates across buildings, floors, meeting rooms or even at desk level by counting people and analysing the number of workers in a given space. Through accurate reporting on historic and real-time space usage, organisations can easily assess and optimise space requirements and workplace designs. Furthermore, using indoor positioning technology, employees can use software apps on their smartphones to book meeting rooms and navigate within the office. They can use their smartphones to personalise lighting around their workstation, further improving their productivity and engagement. Even better, this is all possible without any disruption beyond the addition of a connected light fixture, which can easily be fitted retrospectively. Offices (and other buildings) can also benefit from enhanced connectivity through their lighting. This is particularly relevant in areas that are sensitive to radio frequencies, such as hospitals, clinics and factories – or areas with poor or no wireless radio communication. Light Fidelity, or LiFi, creates a stable and fast broadband data connection through light waves. By modulating light at a frequency imperceptible to the human eye, we can create a fast, two-way broadband connection of 30 Mb/s. Unlike other wireless communication systems, light cannot penetrate walls. To connect to LiFi you need your device to be in the pool of light created by the LED luminaire and to have a personalised USB access key. LiFi offers an extra layer of security, further enhanced by the selective authorisation of USB access keys. You might be surprised to learn that the energy savings from LiFi, alongside further efficiencies unlocked through the data insights collected by smart sensors and additional revenue opportunities, mean heavily reduced operating costs. The upfront investment cost is quickly offset and can even be avoided entirely through innovative business models, such as Lighting as a Service – where you pay for just the light, not the hardware. Leveraging lighting as a connectivity source and platform for Internet of Things applications also reduces the cost of infrastructure. This is particularly the case for new builds, whereas new wireless options are available for existing buildings – avoiding a costly refit or expensive building work. I'd argue it's a no-brainer, but then I might be biased.
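For a flavour of what the occupancy reporting described above involves, here is a minimal sketch – not the Interact API, just invented anonymised counts and desk numbers – that rolls sensor samples up into average and peak utilisation per zone.

```python
from collections import defaultdict

# Hypothetical anonymised occupancy samples: (zone, hour_of_day, people_counted).
samples = [
    ("Floor 2 / East", 9, 14), ("Floor 2 / East", 11, 22), ("Floor 2 / East", 15, 9),
    ("Floor 2 / West", 9, 4),  ("Floor 2 / West", 11, 6),  ("Floor 2 / West", 15, 3),
]
DESKS = {"Floor 2 / East": 30, "Floor 2 / West": 30}  # desks per zone (assumed)

by_zone = defaultdict(list)
for zone, _hour, count in samples:
    by_zone[zone].append(count)

for zone, counts in by_zone.items():
    avg = sum(counts) / len(counts)
    peak = max(counts)
    print(f"{zone}: avg {avg / DESKS[zone]:.0%} utilised, peak {peak / DESKS[zone]:.0%}")
```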
A lot has changed since the invention of the humble lightbulb in 1878, I’d argue it’s time we rethink our relationship with light to create the next generation of smart spaces and unlock this exciting new platform for innovation.[/vc_column_text][/vc_column][/vc_row] ### The Cloud is the platform for ‘the new’. [vc_row][vc_column][vc_column_text]According to findings from The Cloud Industry Forum, almost nine in ten (89%) businesses are fully immersed in one cloud-based service or another, with adoption levels continuing to rise. For the first time in the cloud’s history, cloud infrastructure receives almost a fifth of businesses’ total IT budgets (19%), surpassing on-premise and legacy expenditure by 18%. One of the biggest lessons learnt throughout this journey is that the cloud is different – and it had to be. The giants that created the platforms wanted to attract the masses, and they wouldn’t have been half as successful if they were simply recreating the customer's environment on a cheaper tin. Innovation was key. And so, the journey began. Customers dipped their collective toes in, and then, once confidence was achieved, implemented a cloud-first policy that completely transformed their IT practice. But, what if you’re not there yet? It’s definitely healthy to look back at the lessons learnt from those who bravely stepped out of the sanctuary of their own data centre and put their prized possessions into someone else’s hands. This wasn’t sitting and worrying about your child’s first ever sleepover; this was sending them away on an indefinite exchange trip. Connectivity is key in the first step to the cloud You are effectively breaking the perimeter for the first time and allowing a path to and from your data centre. Things were simpler with that wall in place. You controlled everything, including the ability to watch and monitor the behaviour of your infrastructure at any given time of day or night. One of the first observations regarding traffic management was that not all routes to the cloud are equal. Some connections gave basic connectivity only. This was a shock to some who not only were used to monitoring traffic, but also shaping it for priority during peak times. Having the ability to assign more bandwidth to mission-critical apps was a given and, even though it is possible to do it in the cloud, not being able to do this between your two locations was a problem. Fortunately, there are some firewall technologies that allow for on-premise behaviour of traffic management to be consistent across the link between HQ, the cloud, and multiple cloud vendor platforms. This is an important area to consider when acknowledging the cloud, as it is a good example of this maturing. If we dig deeper into other areas that caused confidence to increase, the vendors themselves were a contributing factor by adapting their own portfolios to be cloud-capable. Vendors started to realise that it wasn’t realistic simply to deploy their on-premise solutions in a software fashion on the cloud. There were many features that weren’t required or simply didn’t work: a new approach was needed. As the cloud architecture evolves, a new form of tool is becoming available that can monitor things from the inside. Here, the cloud generation of products was born. Why would this be important? We talked about areas of the cloud that are different to on-premise and the evolving steps that have occurred. One of the slower areas keeping pace is the general knowledge of cloud-based platforms within organisations. 
In this era of constant attacks and zero-day threats, this is a big concern. Gartner have said that 80% of future cloud breaches will be due to customer misconfiguration, mismanaged credentials, or insider theft - not cloud provider vulnerabilities. Having a comprehensive view of your infrastructure is vital. The cloud generation demands the next wave of management options and sitting ‘inside’ - as opposed to ‘outside’ - the cloud’s walls. This will enable your shiny new engine not only to perform just as they said it would, but also help keep you one step ahead of the bad guys. One of the attractions of the cloud is developer freedom. This is because the cloud gives developers the ability to test and deploy at lightning speed. CISOs, however, find this much less attractive. Though developers don’t want to be tied to a strict process, CISOs struggle to get the visibility they need when cloud instances are spun up by individuals or groups. The cloud separates the two: developers want to build fast and CISOs want to stay secure. Likewise, current SIEM tools are important and can offer great reporting at incredible speeds. However, speed in this area is not always the only thing you need. If a tool reports back quickly on a wealth of issues with suggestions of areas to address for resolution, you then not only have to factor in the time for these fixes, but also have to figure out if you have the knowledge to address them. The cloud is an enabler, and choosing a platform that offers visibility, reporting, and automatic remediation will free up specialised resources to concentrate on other benefits that the cloud offers.[/vc_column_text][/vc_column][/vc_row] ### How Can Consumers Stay Safe When Shopping Online? [vc_row][vc_column][vc_column_text]As headlines keep reminding us of the latest hacker attacks and data breach incidents, consumers are becoming increasingly aware of the dangers that lie ahead in the digital age. Attack vectors like ransomware – think WannaCry and NotPetya – and phishing scam campaigns that keep popping up and manipulate their victims by using prestigious brand names as a smokescreen, have shown us that everyone is on the radar of hackers. It is not just big companies or government agencies; the average consumer can be at any time hit by hackers, especially during an online activity that involves exchanging funds or using banking details. This includes web banking, booking services online, and - of course - e-commerce. In an era where more hackers roam the world wide web than ever before, how can consumers stay safe when shopping online? Online shopping is on the rise E-commerce is continually growing in recent years, aided by an increasing internet penetration rate as well as an immense rise in mobile web-connected devices, especially smartphones, that give shoppers the opportunity to make purchases on the go. Amazon has been spearheading the online retail industry across the US, Europe, India and beyond, and the company’s success has been a crucial factor in online shopping becoming so mainstream. It is also indicative of the upward trend of the market: as Statista reports, 82% of adult consumers in Austria and Germany and 81% in Italy have stated that they have bought something on Amazon in the past 12 months. The figure climbs to 86% when it comes to the UK and the US and reaches a staggering 88% in India. In Canada and France, 72% of respondents say the same, while Spain at 69% and China at 23% complete the top 10. 
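Misconfiguration is worth dwelling on because much of it is mechanically checkable. As one narrow example of the kind of check that cloud-native security tooling automates, the sketch below uses the AWS SDK for Python (boto3) to flag S3 buckets whose ACLs grant access to all users; real platforms cover hundreds of such checks and remediate many of them automatically.

```python
import boto3
from botocore.exceptions import ClientError

# ACL grantee groups that make a bucket readable by everyone (or every AWS user).
PUBLIC_GROUPS = {
    "http://acs.amazonaws.com/groups/global/AllUsers",
    "http://acs.amazonaws.com/groups/global/AuthenticatedUsers",
}

def publicly_granted_buckets() -> list[str]:
    """Return bucket names whose ACL grants access to all (or all authenticated) users."""
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            acl = s3.get_bucket_acl(Bucket=name)
        except ClientError:
            continue  # no permission to read this bucket's ACL; skip it
        for grant in acl["Grants"]:
            if grant["Grantee"].get("URI") in PUBLIC_GROUPS:
                flagged.append(name)
                break
    return flagged

if __name__ == "__main__":
    for name in publicly_granted_buckets():
        print(f"WARNING: bucket '{name}' has a public ACL grant")
```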
Most of these users pay with debit and credit cards. According to research, 48% of consumers prefer to use a credit card when shopping on the internet, while 44% of respondents stated that they prefer using debit cards as a payment type in general, with 33% opting for credit cards and only 12% preferring cash. Overall, there are 1.5 billion credit cards in the US, with 67% of people owning one.[/vc_column_text][vc_single_image image="52482" img_size="full"][vc_column_text]This trend makes a strong case for finding a data protection solution that will help meet PCI DSS compliance requirements. Short for Payment Card Industry Data Security Standard, PCI DSS is a set of 12 high-level requirements that companies processing debit and credit card transactions must comply with in order to receive certification. These requirements are meant to increase cybersecurity across databases, files, and the web, and to ensure that clients can trust the certified enterprise with their payment details. Among other things, companies are required to establish a process for uncovering cybersecurity vulnerabilities, and to monitor all activity involving access to cardholder data. The requirements also mandate that companies restrict access to this data to specific employees on a need-to-know basis. Coupled with the implementation of the right security tools, like SSL certificates or data masking, PCI DSS can boost the security of online card transactions. One of the most important tips for staying safe when shopping online is to always check whether the seller complies with security standards like PCI DSS – and always choose reputable online retailers that truly invest in cybersecurity. How to avoid phishing attempts Phishing scams are the latest successful attack technique to dupe consumers into inadvertently disclosing sensitive information like banking and payment data. In most cases, attackers pose as a third party that the victim knows and trusts – like your bank or a well-known brand. If you receive an unsolicited email from an e-shop you are registered with, asking you to update your payment details by sending over your credit or debit card data again, beware. A trusted provider would never request sensitive details in a manner that does not guarantee that they will be transmitted and stored safely – so that email is most probably a phishing attempt. This is hardly unique to online shopping: recently Netflix users were targeted by a similar campaign, while every year during tax season many taxpayers are bound to get phishing messages supposedly from the tax authorities requesting critical information. Many online scams will also attempt to lure people in by offering discounts and promotions that are too good to be true. Fake ads that claim to provide unique offers are regularly posted on social media like Facebook – but when users click on them, they are redirected to a fake site that will steal their data. In an effort to catch consumers off guard and make sure they are too emotional to think clearly, these scam promotions are often advertised as countdown offers. This effectively means that online shoppers won't go to the trouble of scrutinizing the target site or pay attention to the little details that reveal the offer to be malicious, since they feel pressured to act quickly and claim the alleged incredible discount.
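Some of those little details can even be checked mechanically. The sketch below applies a few crude, purely illustrative heuristics – no substitute for judgement or for proper anti-phishing tooling – to a link that claims to come from a well-known brand; the example URL and brand domain are invented.

```python
from urllib.parse import urlparse

def suspicious_signs(url: str, claimed_brand_domain: str) -> list[str]:
    """Return simple red flags for a link in a promotional email."""
    signs = []
    parsed = urlparse(url)
    host = (parsed.hostname or "").lower()
    if parsed.scheme != "https":
        signs.append("not using HTTPS")
    if host != claimed_brand_domain and not host.endswith("." + claimed_brand_domain):
        signs.append(f"host '{host}' does not belong to {claimed_brand_domain}")
    if host.count("-") >= 2:
        signs.append("host looks like a lookalike domain")
    return signs

print(suspicious_signs("http://amazon-co-uk.deals-checkout.example/offer", "amazon.co.uk"))
# flags: not using HTTPS, host not under amazon.co.uk, lookalike-style host
```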
If an offer sounds too good, then it probably isn’t true – and consumers should also check whether a website they do not know is legitimate or not before proceeding with a purchase. As e-commerce is set to grow even more in 2019, adhering to a few online security tips can go a long way towards guaranteeing a safe online shopping experience.[/vc_column_text][/vc_column][/vc_row] ### Blue Chip | SAP HANA | EP3 [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/345042832/60b312dab3[/embed] [/vc_column_text][vc_column_text] Watch episode 3 of Blue Chip - SAP HANA Now! Our guests on this show are Matt Lovell, CEO of Centiq, Narendhar Tangella, Data Management Technical Architect from Blue Chip and David Spurway from IBM. [/vc_column_text][/vc_column][/vc_row] ### Blue Chip | SAP HANA | EP2 [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/345042504/e1a25ef7b5[/embed] [/vc_column_text][vc_column_text] Did you enjoy episode 1? Then watch episode 2 of Blue Chip - SAP HANA Now! Our guests on this show are Matt Lovell, CEO of Centiq, Narendhar Tangella, Data Management Technical Architect from Blue Chip and David Spurway from IBM. [/vc_column_text][/vc_column][/vc_row] ### Blue Chip | SAP HANA | EP1 [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/345040902/27a55af478[/embed] [/vc_column_text][vc_column_text]Join us for episode 1 of Blue Chip - SAP HANA. Our guests on this show are Matt Lovell, CEO of Centiq, Narendhar Tangella, Data Management Technical Architect from Blue Chip and David Spurway from IBM. [/vc_column_text][/vc_column][/vc_row] ### James King | Blue Chip | Tailored Solutions For You [vc_row][vc_column][vc_separator][vc_column_text] [embed]https://vimeo.com/342774651/146f6bca71[/embed] [/vc_column_text][vc_column_text]Watch as James King from Blue Chip joined us in the studio to talk about the right solution for you while thinking 3-5 years into the future.[/vc_column_text][/vc_column][/vc_row] ### How can the cloud help small businesses grow? [vc_row][vc_column][vc_column_text]A small business might start out as a one-person operation on their own computer. But as the business continues to grow and grow, an operational rethink is required. For example, vital files only being accessible to one person could really start to hold your business back. The cloud is a modern solution that provides key foundations for a business to grow up and not be held back by their systems. So how can the cloud help a small business grow? Accessibility To run a successful business, key data needs to be quickly and easily accessible, both on and offline and from multiple devices. The cloud is perfect here. Cloud hosting allows you to have a central cloud-based location where all your files are kept, whether that be Dropbox, Google Drive, OneDrive or something else, so that any employees can access the information they need, from their own device. In these cloud-based filing systems you can also usually set particular documents to be available to certain users offline. That way you’re business won’t get caught out by a lost Wi-Fi connection. Email in the cloud gives you not only access on any device but also tools such as shared mailboxes, centralised calendars and management. You can benefit from these features that are often found in larger companies and will be well set-up as your company continues to grow. You can enjoy the combination of familiarity along with flexibility. 
Reliability For an SME or startup, things just need to work. You want to focus all of your energy on running your business and not have to worry about your technology letting you down. You need to be able to trust that the services are there and working when you need them to be. Countless businesses large and small rely on the cloud every day. Because all your files are stored there, you don't need to worry about losing important data if your computer breaks. There are also various features that can help give you peace of mind in this respect. For example, file sync in the cloud provides a cheap insurance policy for document recovery, which could get you out of a tricky situation. Scalability You need a platform which will grow with your business, wherever the journey may take you. The cloud is perfect here. As your business grows you can easily invest in more cloud storage so that you always have exactly what you need to run your business at any one time. Unlike traditional data storage expansion, where you may need extra server space, hardware and maintenance – at vast additional cost – the cloud makes expansion and growth simple and easy. Collaboration In this day and age, the ability for you and your team to work together via a digital medium is practically imperative, whatever the size of your business. Intranets based in the cloud (such as SharePoint) provide easy document storage, compliance and a great central sharing resource. In addition, they can send automatic update notices to staff and create audit trails to demonstrate that staff have seen that a document has been updated. Cloud file storage services such as Google Drive also offer a great way for everyone in the team to access and work on the same documents. These cloud-based forms of collaboration help your company to work flexibly and transparently for optimum efficiency and productivity. Security A security advantage of cloud computing is that your data is not located on your premises, so it cannot be easily stolen or destroyed. In the cloud, your information is protected from onsite power outages, floods, fires or theft, as well as by encryption, anti-virus and other security measures. Cloud companies are currently throwing immense resources at managing the security of cloud computing, so security is only set to keep on improving. Additional business tools Many useful cloud tools are available beyond email, intranet and collaboration. These can make things easier for you and be a great help to your business. For example, Microsoft offers cloud tools such as To-Do for planning your time, or Lens for scanning business cards to Outlook contacts or for recording your expenses. Microsoft also offers a basic CRM and booking system as part of its cloud business subscriptions. These tools save you time and money, while all sitting under one platform – allowing you to do what you do best and concentrate on growing your business. Conclusion As can be seen, the cloud possesses many advantages that you can utilise to help your company thrive and grow, from reliability to scalability to additional tools. However, it is best to get an IT professional to help you assess your options, pick the right cloud option for you and oversee implementation.[/vc_column_text][/vc_column][/vc_row] ### Building supply chain resilience with cloud apps Like most business operations, the supply chain function has been disrupted by COVID-19 too. 
Given the globalized nature of manufacturing and sourcing inputs, the worldwide lockdowns, and restrictions on the movement of goods proved to be the proverbial spanner in the works. However, disruption is not completely new for supply chain professionals because it is often the first function that gets impacted during natural calamities, wars, trade restrictions, and in recent times, epidemics like Ebola, SARS, and MERS.  While the prolonged impact of the pandemic has hit hard and supply chains will likely remain fluid through 2021, lessons from earlier instances and the adoption of technologies like cloud apps have shown what the next generation of cloud-centric supply chain management could look like. Why the supply chain matters, now, more than ever The pandemic has highlighted the importance of having a robust and reliable supply chain, particularly for European companies that have globally distributed manufacturing and depend on imports (including the free movement of material within EU members) to sustain a regular supply of goods. And the situation is further exacerbated with Brexit and other international trade complications that have come to the fore in recent years. From a business perspective, the impact of COVID-19 varies highly between, and even within, sectors. While companies in the luxury segment have experienced a drop of up to 50% in sales, pharmaceutical companies experienced exceptionally high demands for products essential for delivering patient care and for drugs of certain diseases such as influenza, diabetes and infectious diseases.  Whereas, non-critical surgery-related products saw a steep drop in demand, followed by massive spikes as the infection rates plateaued; nose diving once more with the second wave of COVID-19.  Then there is the retail sector which has witnessed the emergence of new business models like direct-to-consumer and servitization of consumer-packaged goods. While each of these presents a unique challenge, what connects them all is the need for a supply chain that is responsive to changing demand and provides resilience against future breakdowns. The role of cloud applications As companies unpack the lessons from the analyses of their supply chains, it is becoming increasingly clear that certain well-entrenched concepts like ‘Just in Time’ inventory and offshore manufacturing need to be revisited. And while some large enterprises may have the resources to undertake such a mammoth initiative, time is also a key factor. On the other hand, smaller companies do not have that kind of access or the time. This is where cloud apps have a pivotal role to play. Owing to its design and tiered structure, cloud apps are relatively easy to integrate with existing systems which makes them quicker to deploy. Being cloud-based, they do not need an expensive infrastructure to run, and offer flexibility in terms of use and computing capacity. Additionally, cloud apps are designed for non-IT users and run on common devices like laptops and smartphones. And since the efficacy of any technology is as good as the manner in which it is used, the user-friendly nature and access of cloud apps make it a more-than-viable solution. Key trends powered by cloud apps Real-time visibility and traceability The efficiency of a manufacturing schedule relies on the regular and timely receipt of material. Any delay, particularly in a multi-step or multi-location production set-up can have a ripple effect through to the retailer. 
So, if a material is imported from China, it needs to pass multiple ports of entry along the route, and with constantly changing lockdown laws, a few weeks of lead time can become a few months.  Cloud apps allow manufacturers to integrate the supply chain and all participants from vendors to last mile sellers which enables real-time and precise traceability of every component that constitutes a finished product. Such visibility enables affected parties to make alternate arrangements or take corrective measures. For example, Infosys helped pharma companies use real-time data, leveraging our cloud-based solution, part of Infosys Cobalt, to increase their planning frequency to manage production schedules.  Enhanced demand forecasting:  The unpredictability of demand is perhaps the biggest challenge companies are grappling with in a pandemic-induced environment. Add to it, changing consumer trends, the restricted labour supply due to social distancing and short supply of raw materials, and the task becomes that much more challenging. To address this, several manufacturers are turning to technology. A cloud-centric supply chain allows all the data to reside on a single platform and offers computing power that is manifold faster compared to legacy systems. This democratization of data allows supply chain managers to run more simulations and scenarios to enhance demand forecasting. For example, the luxury goods sector can use projections to reset their inventory targets and optimize their capacity utilization to account for the drastic fall in demand.  Cross-pollination of technologies The compatibility and interoperability of cloud apps are its biggest differentiators, and its ability to leverage data to aid decision-making, its biggest utility. This also makes it apt to be used in conjunction with other emerging technologies like IoT, AI, ML, Blockchain, and Analytics among others. It also allows a sophisticated level of automation on the shop floor or warehouse using robotics, autonomous vehicles, and unmanned aerial vehicles.  And so, as supply chain becomes a cross-functional activity, the combination of cloud apps and other technologies could become a versatile solution in the hands of future supply chain leaders. ### Finding Ikigai - Opening Lines [vc_row][vc_column][vc_column_text] Out Now! Our new Opening Lines video. This week our author is Paul Hargreaves writer of Forces for Good. You can watch the video and read the fascinating article below! [/vc_column_text][vc_separator][vc_column_text] [embed]https://vimeo.com/337570543/50055e0a55[/embed] [/vc_column_text][vc_column_text]Ikigai, the Japanese concept meaning ‘a reason for being’, has been written about by several others and is described in various ways as fulfilment, happiness or simply ‘a reason to get up in the morning’. Ikigai shows us how work can be -good for our health, wellbeing and happiness. This philosophy appears to result in human beings living longer too, as the Japanese island of Okinawa, where Ikigai has its origins, is said to be home to the largest population of centenarians in the world. It is generally agreed across the world that happiness leads to long life, so if finding our Ikigai leads to happiness, it will increase the tendency to long life. So, we see that Ikigai is the intersection of where we are doing what we love, what we are good at, what the world needs and at the same time earning money. 
Finding that place at the centre of the intersecting circles through our purpose leads to fulfilment and happiness and makes us live longer. Héctor Garcia, the co-author of Ikigai: The Japanese Secret to a Long and Happy Life (2017), says, ‘Your Ikigai is at the intersection of what you are good at and what you love doing.’ He continues, ‘Just as humans have lusted after objects and money since the dawn of time, other humans have felt dissatisfaction at the relentless pursuit of money and fame and have instead focused on something bigger than their own material wealth. This has over the years been described using many different words and practices, but always hearkening back to the central core of meaningfulness in life.’[/vc_column_text][vc_single_image image="52472" img_size="full"][vc_column_text] Transitioning toward the centre So, the key for us is to find out where we are on the Ikigai diagram and aim to transition towards the intersection at the middle, which is the only part that is within all four circles. It is unlikely that you are currently sitting in any of the areas within only one circle in the diagram, i.e. only in what you love, what you are good at, what the world needs or what you can be paid for. For example, if you are being paid well for what you are doing, you are probably good at it; if not, you may not be there very long! If you are doing something you love, but not getting paid well or at all, you are probably doing it for the world and are in the ‘mission’ intersection. Many people are in the intersection between what they are good at and what they can be paid for named ‘profession’. Such people are generally very competent, and are often highly paid individuals, but in a job simply to earn as much money as they can. Other professions, such as nurses, would be in the ‘vocation’ intersection, as they are definitely doing what the world needs and are paid, but not enough, according to most people’s opinion. It is more likely that you are in one of the smaller triangular intersection areas, which are the intersection of three out of four circles. Let’s take the top intersection of doing what you love, what you are good at and it’s something the world needs. The legend on the diagram says, ‘Delighted and fulfilled, but broke’, which may resonate with some. I was in this place during my ten years of charity work in inner-city London. I absolutely loved what I was doing at the time, we were making a difference to the surrounding community and, after a few years of practice, we were reasonably good at it. However, there was a sense of struggle to it all, and a lack of permanence, simply because there was barely enough to pay the mortgage most months. Take another intersection, between what you love, what the world needs and what you can be paid for. The missing area here is what you are good at. This could be the position for someone in an early start-up. In the early years of a business, you are quite often having to be a jack of all trades and master of none. All jobs have to be done and there is probably only one job in eight that you are really good at. This scenario is fine in the short term, as your passion and excitement will sustain you, but unless the business grows rapidly, and you are able to delegate the jobs you aren’t good at, it can lead to frustration. I was always very keen to hand over to others all the areas of the business that I know I am not good at: administration, logistics, finance. 
Finding others who are master at these jobs and who were far better than me led to much greater Ikigai for me, and for them too, as they enjoyed the responsibility in the area they were good at. What if you feel that you are in the third intersection of what the world needs, what you can be paid for and what you are good at? The area that is missing is what you love, and you are comfortable but bored and empty. There is no passion and you are not engaging emotionally with what you are doing. There may be a feeling of emptiness, and whilst the world does indeed need what you are doing, the task in hand has possibly become no more than going through the motions. This can sometimes be from exhaustion, physical, mental or emotional, otherwise known as ‘burnout’. I was in this place at the end of my labours in the inner city, where I was still engaged in charity work, had started a business a couple of years earlier and had three children. I now know I was juggling too many balls and the passion had completely disappeared from everything I was doing. Finally, the intersection area to the left of the centre, where you are doing something you love and are good at it and you are being well remunerated for it. What’s missing? The world does not need what you are doing. You feel satisfied to a degree, but in your heart of hearts you may have a nagging feeling that what you are doing is not benefiting the world. I have met some people working for banks and investment companies in the city who have described a feeling of emptiness and know the world at large is not benefiting from what they are doing, just a few very affluent people. I have also met some after they have quit their job in the city and are finally doing something more altruistic which has given them an enormous sense of fulfilment and satisfaction. If you are in this place, toward the left of the Ikigai diagram, you just need to turn your talents and abilities into significantly changing the world for better. How to find our Ikigai A helpful exercise if you know you need to change what you are doing but are struggling to know what to do would be to write down four lists. Ask yourself the four questions: (1) What do I love? (2) What am I good at? (3) What can I be paid for now? (or what could pay me in the short term?) and (4) What does the world need? The cross-section of those four lists is your Ikigai. If you are still struggling, ask one of your closest friends. Often it may be more obvious to them what you should be doing. A complete change of direction may well propel you towards your Ikigai. You may have been struggling for years in a role that doesn’t ignite your passion and may secretly yearn to explore your interests within, say, the arts, food or culture. Or alternatively, it may just mean a tweak to what you currently do on a day-to-day basis. Let’s give an example from my business, Cotswold Fayre Ltd. I originally started in business as I wanted to change the world for better, loved good food and drink and had the ability to sell. I had always concentrated on the sales and marketing side as that is what I was good at, and I ensured I recruited others to deal with the logistics and administration side, which I really wasn’t good at and didn’t enjoy. I generally remained doing what I was good at, what I enjoyed, was being paid reasonably well and started to make a small difference in the world, i.e. close to Ikigai. 
However, we reached a point in 2015 when the company had grown much larger and was moving over 750,000 boxes a year around the UK from our own warehouse and we had morphed into a logistics business rather than a sales and marketing company. I still had a very good operational team running that day-to-day side of the business, but we ‘felt’ like a logistics company rather than a sales and marketing company, which didn’t suit me. So, after a considerable thinking period, we outsourced all our logistics in 2016 to other companies. The change in our business was transformational and I, and everyone else, certainly increased our joint Ikigai. Suddenly, we had lost the headache of running a logistics operation and gained headspace, allowing us to work with greater creativity. This is just a personal example of how I reclaimed some ground back toward the centre of my Ikigai. The chances are that you are not right at the centre of the Ikigai diagram right now. Think about where you are and what you can change, and then please do something about it if you want to be happier and live longer! Paul Hargreaves is the author of new book Forces for Good: Creating a better world through purpose-driven businesses. He is also the CEO of the fine foods wholesaler Cotswold Fayre.[/vc_column_text][/vc_column][/vc_row] ### PSD2 opens a new era for fintech software innovation [vc_row][vc_column][vc_column_text]The deadline for the Second Payment Services Directive (PSD2) looms in September but, with only weeks to go, many banks and fintech companies are still grappling with the requirements and challenges of a new open banking marketplace. Rather than tightening the banks’ operational model, PSD2 requires banks and fintechs to open their payments infrastructures and customer data to third parties. Businesses in the sector need to consider the long-term opportunities of opening access to data, a move that creates unprecedented opportunities for new and innovative services across a much wider and more dynamic banking landscape. At the same time, this shift heralds a new era for innovation in fintech software and solutions. The competitive face of Europe’s fintech industry The trend towards open banking redesigns the competitive landscape of the fintech industry across Europe. As the marketplace opens, existing service providers will see increased competition from other fintech companies as well as new entrants such as technology companies, retailers, telecommunications providers and even crowdsourcing platforms. The introduction and success of Mobile Virtual Network Operators (MVNOs) such as Virgin and Tesco Mobile has softened the customer’s mind towards new service providers breaking into traditional industries. One strength of these companies is their ability to capture and share data across different areas of business for marketing, customer service and operations in order to create a ‘sticky’ experience for customers. Banks and fintech organisations know that the new competitive landscape requires an agility beyond the reach of their legacy silo-based systems. Furthermore, it opens the door for collaborations with third party providers and opportunities for new services, cross selling and upselling to secure customer loyalty and reduce churn. The common focus across all service providers – long-standing and new – will be on innovation and customer experience, and the spotlight will be on ISVs to deliver new solutions for the market that take advantage of the open environment. 
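To make the idea of opening payments infrastructure and customer data more concrete, here is a minimal sketch of what a licensed third-party provider's call to a bank's PSD2-style account-information API could look like. The base URL, headers, consent identifier and response fields are assumptions for illustration only; real implementations follow standards such as the Berlin Group NextGenPSD2 or UK Open Banking specifications and require strong customer authentication.

```python
# Illustrative sketch only: a third-party provider (TPP) reading account data
# over a hypothetical PSD2-style account-information API. The endpoint names,
# headers and consent flow are assumptions, not any specific bank's API.
import uuid
import requests

BANK_API = "https://api.examplebank.com/psd2/v1"   # hypothetical base URL
ACCESS_TOKEN = "example-token"                      # obtained via OAuth2 after customer consent
CONSENT_ID = "example-consent-id"                   # consent granted by the customer (with SCA)

def get_account_balances() -> list[dict]:
    """Fetch the customer's accounts and their balances via the open API."""
    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        "Consent-ID": CONSENT_ID,
        "X-Request-ID": str(uuid.uuid4()),          # per-request trace ID
    }
    accounts = requests.get(f"{BANK_API}/accounts", headers=headers, timeout=10)
    accounts.raise_for_status()
    balances = []
    for account in accounts.json().get("accounts", []):
        resp = requests.get(
            f"{BANK_API}/accounts/{account['resourceId']}/balances",
            headers=headers, timeout=10,
        )
        resp.raise_for_status()
        balances.append({"iban": account.get("iban"), "balances": resp.json()})
    return balances
```

The significant point for ISVs is not the individual call but the fact that account data, once locked inside the bank, is now reachable by any authorised third party that can consume such an interface.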
Interoperability underpinning innovation Customers are increasingly expecting real-time, personalised and seamless service offerings from their banking service providers. On the business side, this heavily relies on fintech companies deploying a more scalable and flexible SaaS-based infrastructure and moving away from wholly-owned legacy systems in a bid to increase operational agility and scalability. All this disruption creates an unmissable opportunity for software developers to create and market new products for the industry – and interoperability is key. For banks, that means the ability to get maximum value from customer data across the whole of their business, as well as integrating third party capabilities into their core offerings to improve and expand core services.  For the software providers, it means putting integration capability at the heart of their product for newcomers wanting to grab a share of the market and for incumbents seeking to innovate. Forward-thinking ISVs can offer new opportunities to fintech companies that are considering how their legacy operation can integrate with new software services as well as for collaborations and new competitors aiming to create impact and secure customers. Delivering integration with iPaaS ISVs have a choice of how to offer integration to their customers. Software developers can choose to offer a ‘one size only’ product to the market which reduces their own time and cost of integration but instead transfers responsibility to the customer. This strategy risks losing customers that don’t have time and resources to dedicate to the job. Alternatively, ISVs can reactively respond to customers’ demands for integration, re-engineering the solution for each customer. Though not only does this increase the cost of bringing a product to market, it can also limit the field of customers to which the ISV can profitably market its solution. So far, it is a lose-lose situation with one party or the other bearing the burden of integration. The third option, offering flexibility and speed to market, involves building a white-labelled integration platform as a service (iPaaS) solution into the foundation of their own software products. In this way, ISVs can offer complete interoperability with any existing architecture, as well as any applications or collaborations that may be added in the future. With integration capability at the heart of the product, the ISV can guarantee interoperability with any other programmes within a legacy infrastructure, or across systems owned by third party partners. Furthermore, without the need for manual programming for systems integration, iPaaS-based solutions offer a faster launch time for new products and initiatives to be brought to market. In a competitive open market, where time and cost are critical, this factor should not be underestimated. ISVs that can offer reduced integration costs, increased connectivity and agility across systems will have a strong business case to attract – and retain – banking sector organisations. Time and cost to market for new consumer products and services will become even more important in the post-PSD2 marketplace, so reducing the burden of integration is a key component in easing this pressure for ISVs and their customers alike.[/vc_column_text][/vc_column][/vc_row] ### Using AI to Improve Mobile User Experience [vc_row][vc_column][vc_column_text]Pseudo AI technologies are on the rise and are affecting the way we go about our tasks and entertain ourselves. 
There are several examples of artificial intelligence in our day to day lives. It's important to know that they are just machine learning software that runs on behavioral algorithms and adapts to the things we like and don't like. A true AI learns on its own. Google's DeepMind aims to combine neuroscience and machine learning without relying on predefined algorithms, to build the kind of AI we are used to seeing in science-fiction literature and films. There is no doubt that a tiny glimpse of AI can be recognized in mobile technology. Personal assistants like Alexa and Siri have become part of our voice and non-voice based interactions. With their wide acceptance, the data they gather is used to improve their skill set and usefulness. AI In Design AI will center around optimization and speed. Mobile app development efficiency will increase significantly and designers will be able to craft designs faster and more cheaply. Its primary function in the initial phase will revolve around analyzing vast amounts of data and suggesting design adjustments. AI design tools will scale prototyping in design. Once the basic sketches are scanned with a few parameters, a library of established UI components would seamlessly render a prototype for a company's product. An AI algorithm could generate millions of unique designs for a product's graphic identity. AI would also assist designers in creating 3D AR/VR worlds with only a few parameters involved. The best bit about AI inclusion is that it would present designers with multiple options to choose from. Designers Will Boost User Experience Machine learning, a subset of AI, gives systems the ability to learn for themselves so that they can improve without being explicitly programmed. It focuses on the development of programs that learn from the large amounts of data at their disposal. This next-gen mobile experience will raise the level of customer experience and has already given UX designers a free hand to channel their creativity. By delivering intuitive and automatic responses derived from user behaviors, machine learning driven technology can identify design sketches, wireframes and prototypes in seconds and convert them into code in real time. The likes of Airbnb and Nutella have already followed suit, and others will join the party soon. Designers will boost user experience by leveraging: Shared Language The product has to align with a shared vision to create a meaningful and intelligent user experience. Machine learning development, product design and goals have to be prioritized and understood by the whole team. Experts from both fields can be instrumental in optimizing the machine learning model so that it produces more accurate, personalized content for the end user. Fine-tuning a user experience design without a shared language or technical expertise will have its own drawbacks. Use Case Use case methodology is an effective way to analyze, identify, clarify and organize systems and their requirements. A high-end consumer product is one that prioritizes user experience. Mapping out a use case for app features requires UX designers and machine learning experts to draft multiple prototypes before finalizing the design. With input from data engineers, designers and data scientists, features have to be tested and improved in every development phase. It also helps determine KPIs that are closely associated with machine learning metrics. Data User experience design and machine learning can be fully optimized when both qualitative and quantitative data are used to full effect. 
Qualitative data such as questionnaires, user interviews, feedback and reviews on product features can shed light on how users feel about and interact with your app. Any factors that affect machine learning development and user experience can be improved through quantitative data – for example, whether it's a particular feature that is not responsive to user behavior or a feedback loop that is ineffective. Robust data gives a wider perspective on user behavior and is essential for designers and machine learning experts to create cutting-edge apps for the future. Conclusion AI has been called by many names in the mobile world. Wired magazine calls it AI-driven design, while others call it Design Intelligence, and still others have named it Algorithm-Driven Design. The state of progress that we are witnessing is more "Augmented Intelligence" than "Artificial Intelligence", and it's wiser to stick with the former for the short term. Irrespective of how it is coined, in the end it is simply non-human intelligence capable of generating results that seem real to the human eye. Many software companies have started offering end-to-end mobile app development services to leverage the potential of AI in the healthcare, retail, and entertainment industries. Designers co-creating with AI will present us with exciting opportunities. With the combination of art, engineering, science, and design, it will establish new relationships between customers and products. Users are demanding a more in-depth, personalized experience. With the power of mobile, designers will define the context for innovation, with exciting new opportunities lying in front of us.[/vc_column_text][/vc_column][/vc_row] ### The Curiosity Continuum | Opening Lines [vc_row][vc_column][vc_column_text] Our new Opening Lines video is out now! Watch as Jon Burkhart talks to us about The Curiosity Continuum and what this means. You can watch the video and read the tremendous article below! [/vc_column_text][vc_column_text] [embed]https://vimeo.com/338855691/0d3cb94fc1[/embed] [/vc_column_text][vc_separator][vc_column_text]In the work Carla Johnson and I have done with brands from nearly every industry and every corner of the world, we've seen distinct patterns in how curiosity in all its forms plays out in the success or demise of a company. I've taken these observations and mapped them in a matrix that reflects the structure (or lack thereof) in an organization, and the likelihood that its culture prompts people to regularly peek around the corner to see how the bigger world may impact their brand's success. We call this the Curiosity Continuum. On the horizontal axis, we show the range of a company's culture from highly chaotic on the left to highly structured on the right. We define chaotic as a nonlinear, unpredictable process working toward unexpected outcomes. Structure, on the other hand, refers to clearly defined processes and relationships that work toward predefined outcomes. The vertical axis refers to how likely a company is to be curious. The highly curious companies that rank at the top have a strong desire to investigate, learn and know things without specific reasons to do so. In contrast, companies that rank as indifferent show characteristics that imply they have no particular interest in, and are unconcerned with, exploring new outcomes. 
We have characterised the inhabitants of four quadrants of the Curiosity Continuum as four different animals: The Platypus = Highly Curious, Highly Chaotic Self-identified as curious Prefers working in an unstructured, chaotic environment Collaborative Risk and failure seen as learning opportunities High energy High innovation   The Sloth = Low Curiosity, Highly Chaotic Indifferent to the world around them Believes world/industry should and will continue as is Prefers working in an unstructured, chaotic environment Liberal approach to risk Sees failure as exactly that Low energy Low innovation   The Gazelle = Highly Curious, Highly Structured Self-identified as curious Prefers working in a structured environment that balances efficiency and experimentation Collaborative Carefully manages risk through process Sees failure as a learning opportunity High energy High innovation   The Ostrich = Low Curiosity, Highly Structured Indifferent to the world around them Believes world/industry should and will continue as is Single-minded focus on efficiency Carefully manages risk through process Sees failure as exactly that Insists people stick to their swim lanes Low energy Low innovation The sliding scales Chaotic organisations see the need for open-ended idea generation. People who downplay this approach think letting employees follow their curiosity will lead to a costly mess. Execs believe that employees would be harder to manage if people were allowed to explore their own interests. They believe people would disagree more, it would take longer to make decisions, and the cost of doing business would go up. That’s simply not true. While chaos doesn’t always produce something that’s immediately usable, it does push the parameters for possibilities. And that’s the foundation for progress and, eventually, getting to a higher level of performance. While some teams thrive on chaos, others do best with a foundation of structure. It gives them a greater feeling of control which, in turn, helps them manage the emotional aspect of risk that comes with curiosity. Think of it as the bumpers along the lanes when bowling. They have a process through which to channel curiosity, an accepted path to follow within their organizational culture and they see curiosity an iterative process – they don’t settle for the first possible solution, instead they use that to look for better remedies. Clearly, this shows that there’s not one approach that’s better; rather which one launches success for a company depends on what works best for their respective culture. Turning to the gradients of curiosity, however, gives us a much different picture. In curious cultures, leaders understand that the questions asked are more important than the answers themselves. Eric Schmidt, Google’s uber successful CEO from 2001 to 2011 shared: “We run this company on questions, not answers.” Leaders in indifferent cultures believe they need to talk and provide answers, not ask questions. And certainly, they don’t want employees wasting critical productive time by doing that either. They communicate from the top down instead of actively asking employees and customers what they think. “We know best,” and “that’s not how we do things,” are common mantras. So, once you’ve found your place in the Curiosity Continuum, are you pegged to stay there for life? Definitely not, and there are some great examples of companies who have developed in our three areas of curiosity – market, customer and product – with great success.   
Jon Burkhart is also the co-host of the "ultimate anti-conference", the Fast Forward Forum, due to run in Venice from 6th to 8th October 2019. For more details visit https://fastforwardforum.eu/[/vc_column_text][/vc_column][/vc_row] ### Less paper, less work: Moving mortgage advice to the Cloud [vc_row][vc_column][vc_column_text]"Paperwork wouldn't be so bad if it wasn't for all the paper. And the work," one modern American novelist has joked. The hassle of paperwork is one reason why cloud storage is seeing astonishing growth. The market is now worth more than $30 billion, according to estimates, and is expected to reach over $100 billion in less than five years. Within three years, more than 72% of global organizations will have migrated to the Cloud. The advantages aren't just for big businesses running their own data centres, however. The benefits are also important for smaller firms, for whom the efficiency, security, and productivity gains delivered by the cloud can be crucial to ensuring they survive and thrive. This is particularly true in the mortgage market, where small firms have to compete with larger brokerages and need to use technology to level the playing field. For brokers, and others involved in the home buying process, paperwork plays a central role. With IDs, proofs of address, property information forms, EPCs, valuations, pay slips, bank statements, surveys, searches, and contracts, documents can quickly come to dominate and overwhelm the process. Lost paperwork and waiting for the post are key causes of delays. Manual processes to extract information and rekey it into internal systems, meanwhile, bring a high risk of error. Cost, convenience and control There are three key benefits the cloud brings to the mortgage industry when it comes to storing and managing information. The first is that centralising storage and providing secure access to documents significantly increases convenience for everyone involved. For example, Smartr365's DocuStore means that brokers can quickly call up documents from the office, home or anywhere else. They can also easily share these and collaborate with clients, colleagues, estate agents, or protection specialists – all of them viewing the same documents and data, when and where they need them. Customers, meanwhile, can view, access, and download documents they have to review or complete, and can upload these from home when convenient. The cloud significantly reduces the need for travel and meetings, and eliminates delays to the process caused by waiting for documents to be returned by post. It also provides much greater visibility and control of the process – the second major advantage. By storing and automatically organising data online, cloud solutions make finding client information simple, fast and efficient. Once uploaded, information won't be lost – it can be quickly and easily recovered when it's needed. Time spent rifling through papers for the right form can be turned to other tasks like advising clients. These boosts to efficiency and productivity can be felt by brokers, agents and others in the course of their daily work, hour-by-hour, and sometimes minute-by-minute. Finally, those benefits can be achieved with lower and more certain costs than most in-house storage solutions. The cloud is endlessly scalable, helping businesses of all sizes to grow: users will never run out of space, and the costs of maintaining, updating and increasing the capacity of the system are borne by the cloud service provider. 
That also means information never has to be deleted just to make space – when a client wants to remortgage years later, the broker can quickly recall their file with a few simple clicks. Security and suitability Despite the obvious benefits the cloud brings, some are still reluctant to adopt it. In the main, this is due to concerns around security or suitability. It is right that the former should be a key concern. Much of the data brokers store is sensitive, and due diligence before entrusting this to a third party or sharing it with others is essential. However, a good service from a legitimate cloud service provider will provide a fully compliant solution supported by much greater investment in security than most brokers could reasonably be expected to make. It also brings increased resilience for the business itself, with data automatically backed up and stored offsite as a safeguard against hardware failures and fire, flood or theft onsite. Concerns over suitability are, likewise, legitimate, but – provided the right solution is chosen – ultimately misplaced. The rapid rise of cloud technologies has seen a proliferation of services, with the majority aimed at consumers or a general business market. These can and often do lack the tools brokers need to make the most of opportunities, or, worse, are slow to operate, actually undermining efficiency compared to on-site solutions. That cloud boom has also seen a wide variety of systems developed, however, including bespoke mortgage solutions such as Smartr365's – designed by mortgage brokers, for mortgage brokers. These are solutions that can take care of not just the paper, but also much of the work – leaving brokers time to focus on the people and relationships that ensure their continued success.[/vc_column_text][/vc_column][/vc_row] ### Computing and data advances on the final frontier [vc_row][vc_column][vc_column_text]Imagine a server being strapped to a rocket, blasted over 26,000 miles to geosynchronous orbit, and then living a life bombarded by solar flares and radiation while being expected to perform flawlessly. This is precisely the challenge facing IT hardware manufacturers intending to compete in the emerging field of space-based data storage and processing. An interesting new approach from Hewlett Packard Enterprise is showing great promise for delivering performance and lifespan with standard, off-the-shelf equipment sent into space. This development could, in turn, help push enterprise IT in new directions as well. From physical hardening to software protection Until recently, the best way to help sensitive computer systems withstand the forces of takeoff and the environment of space has been so-called "hardening." It's shorthand for what amounts to creating a super-tough Toughbook. Manufacturers build specialised systems or retrofit commercial hardware, doing their best to insulate them physically from potential harm. Hardening is expensive and time-consuming, and that's a big reason why there isn't a lot of compute or storage happening in near-earth orbit or beyond. We just don't send true supercomputers outside our atmosphere today. Instead, data from the International Space Station, Mars rovers, and interstellar probes is beamed back to earth, where there are plenty of servers, storage arrays, and other IT hardware to deal with it. There are downsides to this approach, familiar to any IT pro—namely limited bandwidth and high latency. And they're far worse than for any enterprise. 
The 300 Mbps available for the Space Network is a limited commodity, and the ISS and various satellites must share it to control systems, download images from the Hubble Telescope, and allow astronauts to talk to their families. As nations and private interests look to establish bases on the moon, put people on Mars, and explore the galaxy with increasingly powerful optics and sensors, there is good reason to take a more “edge-like” approach—moving compute and processing closer to where the data is created. For example, this might enable detailed analysis of onboard experiments, with only results, not raw data, being transmitted to ground stations, leading to a significant reduction in network traffic. There are also those who envision a future in which commercial data centres could be put into orbit, a public cloud operating above the clouds, so to speak. Such facilities would benefit from cool temperatures, the lack of humidity and weather extremes, and a zero-G environment spinning drives especially like. The HPE gambit & remaining barriers To make all this possible, however, we need better ways to help expensive, cutting-edge computers survive in space. HPE scientists wondered if a software approach could replace hardware-based protections at a lower cost. Specifically, they hypothesized that slowing down processors during assaults, such as solar flares, could prevent damage and data corruption. To test this theory, HPE sent a Linux system to the ISS in 2017 and kept a nearly identical machine in a lab on earth for comparison. They hoped for a year of error-free operations in orbit. Over 550 days later and counting, everything is still running fine. It’s going so well, in fact, the test machine hasn’t been booked a return ticket. This success is great news, but hardly the final hurdle for space-based computing. After all, any data centre manager knows that without undergoing 5G pressure at max q during launch or being attacked by space radiation once in place, IT equipment has still been known to fail. Fortunately, advances in machine learning are pushing remote monitoring and repair capabilities to space-friendly heights. At present, it is possible for a system to learn to proactively detect faults and diagnose the underlying cause of IT issues from a distance. Such capabilities are already helping enterprises achieve near-constant uptime for banking, social media, and other technologies we all rely on. With AI advances, the industry can expect remote hardware-management systems to become increasingly predictive and able to initiate fixes further in advance of critical failures. Moving to more software-based functionality—as the IT industry is doing with software-defined networking and, eventually, the fully software-defined data centre—will enhance flexibility and remote upgradability. This will be a boon for space-based computing. Such capabilities added to a long-range space mission will allow for more post-launch adjustment and lasting reliability. The remaining challenge is that human engineers are eventually required in data centres to swap out a drive or move in a replacement server, and we don’t have the robotics to take care of these jobs on the ground, let alone in space. Robotics is a budding field, however, so this shortcoming will not remain a barrier. Implications on earth What does all this mean for the IT professional not working for NASA, SpaceX, or any of the other interests in the new space race? 
As we found out with Tang, advances driven by space exploration are often adapted to broader purposes. Putting innovation dollars behind research into more hardy IT equipment could bring the average enterprise more reliability and lower costs. Pushing the envelope of remote monitoring will transform IT hardware maintenance. And if we must have robots available to fix future generations of Google servers in orbit, data centre managers will one day install similar systems for truly lights out operations closer to home. These developments will also help to sustain the computing we are doing in increasingly remote locations. Microsoft has installed an undersea data centre pod off the Orkney Islands for testing purposes. Green Mountain put a commercial data centre underground near an Arctic fjord. And the ALMA Correlator, a unique supercomputer, exists at 16,500 feet in the Chilean Andes. Computing in harsh and isolated environments and sending IT resources completely off-the-grid are becoming common. Advances made in space may help make these endeavors more successful and fuel a radically mobile lifestyle for earthlings. In the meantime, for space geeks everywhere, the successful HPE experiment gives us one more piece of information to feed our interminable debates about what it will take to survive, thrive, and compute happily as we move into the final frontier.[/vc_column_text][/vc_column][/vc_row] ### 5 of the Best E-commerce Marketing Strategies to Explode Your Sales [vc_row][vc_column][vc_column_text]With more brick-and-mortar stores becoming e-businesses, e-commerce marketing strategies have become a necessity if you want to remain on top of the game. Times have changed, and people have changed the way they shop. Online shopping is the norm for most shoppers currently. According to a report by Statista, the worldwide E-commerce sales contributed revenue of $2842 billion in 2018, and this number is expected to shoot to 4.8 trillion by 2021. If you're not using E-commerce Marketing Strategies to get a share of these stats, you are losing out on a lot. E-commerce In case you are unfamiliar with the meaning of e-commerce marketing, it is the practice of driving sales by increasing awareness of your online store. There are numerous e-commerce marketing strategies out there and picking the best one is daunting even to business gurus. The reason is that marketing strategies evolve every day, and what worked right for you yesterday might not work today. However, at the end of the day, you want two or more sales that you didn't have yesterday. Aside from that, you also need your old customers to come back to your online store for more. The strategies below have worked for many online businesses for years. They are the best you can use to get more sales and gain an advantage over your competitors. Here are 5 of the best e-commerce marketing strategies that will skyrocket your sales. 1. Content marketing It is a powerful tool that is quite effective. When done right, it has the power to attract traffic, increase brand awareness, educate readers, and convert them to customers. There are different forms of content marketing tools. Some of these include blog posts, testimonies podcasts, storytelling, infographics, and email newsletters. How do you market your content the right way? You should know what content your readers need. Create a blog on it. Post content that is relevant to your target audience during the opportune time and at reasonable intervals. 
Mix your content with visuals like images and tutorial videos to attract more attention to your site. Post stories, case studies, and testimonials from customers or employees about how your product solves their pain points. Don't forget to increase your visibility on search engines by optimizing your content with specific keywords and strong CTAs to invoke desired actions. Create great content and start blogging now. Make sure it is interactive to encourage more engagements. Ensure the content gets a wider reach by sharing it on your social profiles. With the advanced technology today, there are various tools that can help you to choose proper keywords. You can use them to generate blog post topics if you are experiencing difficulties coming up with one. Your content will be exposed to new audiences and attract more sales. 2. Social media marketing Social media platforms are doing wonders for many e-commerce businesses. If you can learn how to use it the right way, you can count on increased brand awareness and sales. With a current population of 3.2 billion social media users worldwide, you can't risk not using it in your marketing strategies. Social media is growing, and it is high time you started using it to drive sales on your online store. Create a social media profile on available networks such as Facebook, Instagram, Twitter, and LinkedIn. Post unique content with attractive visuals for each platform and post regularly. When people comment on your post, interact with them by giving real-time feedback. Include hashtags and share buttons to enable your fans to spread the word. Leverage the power of sponsored ads on Instagram, Facebook, and other social media channels. If you feel maintaining an active social presence is hectic and make you nervous, hire a social media manager. You can also use incentives such as giveaways if a follower shares your post with others. Partner with an influencer to get more fans to follow your brand. As you engage with more followers, encourage them to purchase from you. You can explode your sales with very minimal investments. 3. Email marketing If you think email is a thing of the past, think again. Email is one of the top communication channels today. According to a study, 91% of consumers check their emails every day. Email marketing involves sending promotional emails to customers and prospects. These emails can have different goals including encouraging a customer to browse your products, informing of a time-sensitive discount, or to retarget a customer who abandoned a cart. You can't leverage email marketing benefits if you have not collected email addresses from leads. You don't want your promotional email landing on the spam box of potential customers. You need to build an email list. You should encourage potential customers signing up to your mailing list. Start with pop-ups requesting them to subscribe. Entice them with offers that benefit them in exchange for their email address. With an email list, you should send the right emails such as welcome, abandonment, orders, or confirmation. To save yourself time, automate the emails. Optimize them to be mobile devices friendly. Increase your sales by using your emails to promote your brand. If you are learning about email marketing just now, start with newsletter campaigns. From there you can tap on upsells, cross-sells, promotional offers, and customer loyalty. 4. 
Search Engine Optimization If you want more potential customers of all types visiting your site, you need to rank at the top of the SERPs. SEO is a marketing strategy that requires getting multiple aspects right in combination. It is a broad topic, and as a result, some businesses have been known to use dishonest tricks to land on the first page of Google results. However, this only gets them in trouble with Google. Do not take shortcuts. Identify the proper keywords for your website. Use keywords naturally in your content, titles, and meta descriptions. Ensure the content is written by a human for other human readers. Optimize the website for mobile devices. Ensure it's quick to load and that users can navigate it and check out with ease. With more visibility, potential customers will easily find you and convert. 5. Referral marketing Also called word-of-mouth marketing, this is another powerful marketing tool that you can't ignore. A referral is more likely to translate to a sale. Think about it in this way. When you are looking for a vacation spot for your family, you ask your closest friends for ideas. That is the whole idea of referral marketing. It leverages the fact that humans are social creatures. What do the numbers say? 82% of consumers consult friends and family before making a purchase. 92% of consumers trust recommendations from people they know. These numbers make it worth investing every dollar and second to get them on your side. How do you do it right? Consumers can be your brand advocates even without getting anything in return, especially when they have a great experience with your product. However, don't rely on this alone. Offer consumers incentives such as gifts, discounts, and perks if they refer your product, service, or site to a friend. When you have decided on the right incentives, the next step is to promote them. Ensure they reach as many people as possible by posting them on your social media or promoting them via email. Wrapping up These are the best marketing strategies to help explode your online store sales. However, what works for another e-business might not work for you. It is up to you to test the waters until you find the one that works for your online store. E-commerce is such a competitive arena right now that it is prudent to get a step ahead of the game with an effective marketing strategy. Keep up with the new trends that come out each day, and eliminate the bottlenecks that are holding your sales back with the above strategies.[/vc_column_text][/vc_column][/vc_row] ### Which Industries Will Benefit Most From IIoT 2019? [vc_row][vc_column][vc_column_text]A common statement amongst historians of the 1849 gold rush was that the people most likely to make money from the endeavor were the ones who made tools for the miners, not the miners themselves. As industries including transportation, manufacturing, technology, energy, and healthcare pursue success with Industrial IoT (IIoT), this colloquial wisdom holds true: IIoT equips them with the information and data to run their businesses more effectively. It's no wonder that over half of the companies that successfully use this technology have reported increases in revenue. To understand why, we must look at the challenges industries are working to overcome and how IIoT helps them cross this digital and business chasm. 
Imagine if you’re part of the power grid in the center of Phoenix, Arizona, with temperatures averaging above a hundred degrees and reported lows of 30 degrees (at its coldest). For your region, controlling temperature to make it livable for everyday life is a critical foundation of the city. In fact, the ability to route power to the specific area experiencing a meteorological event and so efficiently is the source of millions of dollars in energy and utility spend across the region and our country. In fact, by cooling off homes in a specific region before a heat wave hits, regions are saving millions of dollars on the energy grid and receiving money back from the government for doing so. This story goes on across many other industries. As transportation leaders including airlines innovate maintenance of their planes every hour to improve safety and takeoff times or the healthcare industry decreases misinformation by improving the flow of data; the industries who have data closest to the problem or provide people data enough to make decisions faster is key. By 2025, 75% of data in these industries are expected to move out of the company’s environments and in our environments - an area known as the edge. At the edge, companies are moving IIoT closer to where business and enabling key decisions and outcomes to take place in the field much like IoT is allowing everyday consumers like you and me to have better experiences in our homes, commutes to work and leisure today. Overall this massive shift in data is a pretty big jump from the 10% of industrial data at the edge today. This industry is growing at a rate far greater than most and is estimated to be close to 1T dollars by 20252. So why is it that industries across high tech, transportation, energy, manufacturing, and healthcare are receiving such a big value with the introduction of IIoT? In order to see the value it brings to these industries, we have to dive into what’s happening at companies in these industries today and the opportunity they might achieve tomorrow. For many of these industries, their systems, production mechanisms and technology were created up to 30 years ago. From aging facilities and oil rigs in Southern California pumping energy out of the ground to the modern-day automobile engine -- not much has changed in the mechanics that contribute to these highly reliable systems. With Industrial IoT, companies are able to attach sensors or even make decisions at the edge which reduces the time it would take to get information back to the office or a car shop, respectively. With all those round trips of data and information, it’s like traveling across around the country expecting to get to your destination in the city nearby, faster. The change for these traditional systems can bring about capabilities never thought of before. Business processes like predictive maintenance in manufacturing and transportation, a technique which replacing industrial parts before they fail, are both the source of the highest cost and most dangerous systems if they fail in operations. By allowing companies to more effectively operate this high dollar operation, this sector is proving new values. Even in the energy sector and healthcare, where having continuous operations and avoiding system downtime, may mean the difference between life and death. Across many in this arena, industrial IoT is allowing the companies in these industries to achieve new forms of business innovation and transformation unseen in the industry. 
As business leaders, managers and experts across the industry look to where they should bring about their next innovation, it's no wonder that IIoT is key. It will be critical for any industry that operates in the field, even defense and aerospace, to leverage this technology. Introduced into the aforementioned business sectors, it will not only drive advancement but reshape the way they deliver their products, run their businesses and reach these innovative destinations altogether. If you're a manager in transportation, high tech, manufacturing, factory automation, energy or healthcare, look no further for the technology that's disrupting the industry. With over 50% of companies reporting increases in revenue when they successfully get there, it's no wonder that IIoT is seeing, and will continue to see, close to 30% compound growth. Whether so many industries can reach the digital nirvana in which sensors and information drive decisions made in their natural environment is yet to be fully realized. My hope is that, much like the 49ers, these industries get the tools they deserve to bring us into a new age we've never seen before.

### Future proofing your business – Start working in the present

The tech industry is the fastest moving and most innovative industry in the world, but it is also one in which longevity is more the exception than the rule. The average life span of a company listed in the Standard & Poor's 500 index is predicted to shrink to 12 years by 2027, from 33 years in 1964. As global organisations, IT enterprises need to be agile in order to overcome any danger they may face in the future. These dangers range from changes to data and privacy regulation and cloud network issues, to recruiting the right talent and enabling them with the right skills. There is also a need to prepare for challenges they didn't even know they would face. In the age of disruption, change is something tech companies can embrace and be fully prepared for – as long as the right measures are implemented.

Value employees In our industry, it is easy to get distracted by technology, products and commodities, so much so that we might forget about our greatest asset – people. Flourishing organisations keep employees at the core of what they do; investing in them, keeping them happy and considering their health and wellbeing. There is a direct correlation between FTSE 100 companies and those that feature on Glassdoor's 50 best places to work in 2019 – including tech giants like SAP, Salesforce and Microsoft. To foster a culture of innovation, executives need to create an environment that encourages creativity. This can include initiatives such as freedom to experiment, networking at tech industry events and rewards for outstanding efforts. It's about allowing staff to be creative away from their desks. This is exemplified in Rocket.Build, an annual hackathon open to all employees. Staff retention is critical to an organisation's survival, and with over half of the UK IT workforce revealing they feel stressed, underpaid and overworked, creating a positive working environment where people feel valued is important, as a valued employee is more likely to be a committed one.

Train employees As R&D moves at lightning speed, IT enterprises must also ensure their employees' skills adapt and change with the products.
A World Economic Forum report last year revealed 65% of children who enter primary school will end up in jobs which don’t yet exist. Therefore, if executives implemented the same philosophy to training staff as they do for product development, employees would be on a continuous quest for improvement. Yet, unmotivated and untrained staff can destroy the best products and services. It is important for managers to anticipate what’s coming, understand the market and any change of direction to be future-ready. This means training employees at all levels, so they are prepared to understand and meet customers’ needs in the present and future. The Association for Talent Development estimates that ongoing training and education at work could increase a company’s income by a staggering 218%, as well as a 24% higher profit margin. Be versatile In Rocket’s line of work where legacy technology is the one to turbo-charge the business of our customers, identifying risks or possible point of failures is what will prepare our industry for the future. These risks could be in areas that range from people, systems, processes or legal compliance. However, all these influencing factors can be identified and acted upon. For example, if you are too dependent on a single person, vendor or team for the success of the business, this could create big problems in the future if they are not around. 79% of people who leave their jobs cite lack of appreciation as the main reason. Upskilling, valuing and training employees can resolve this. IT enterprises must ensure their organisations are connected; from sales to service, identifying and integrating data from every touchpoint of your customer journey helps executives to understand customer behaviour and anticipate their needs ahead of time. This ultimately leaves you in a better position to make strategic decisions for growth and adapt your business according to their needs. Adapt working environments Just as technology has adapted, so should your business’s working environment - it can play a major role in how satisfied or motivated your employees are. With 41% of UK office workers admitting they are only ‘sometimes self-motivated’, directors are well-advised to create a setting that feels like a community rather than ‘just’ a place where people go to do a nine-to-five. Working practices are extremely important to most employees. Providing a working environment which accounts for the needs of the individuals is certainly worth consideration, especially since many of us are spending 1,792 hours at work every year. Gensler identified four different work modes – focus, learn, socialise and collaborate – all of which need to be accommodated. Businesses can go the extra mile and e.g. bring the outside in by introducing natural elements like plants into the office space, provide break-out rooms for much needed quiet in an open space environment, and consider partitioning to control the noise level which can be distracting. If employees enjoy the workspace, productivity and engagement are likely to increase. Collaboration Remember that many brains are better than one. Strive to create a culture that supports and encourages staff to conceive and come forward with new products, services and technologies for your business. A 2017 McKinsey study revealed 40% of industry is digitised in a time where millennials make up the majority of the workforce. These digital natives need nurturing and looking after. 
Employees should be encouraged to share and think creatively outside of their computers – communication platforms such as Skype for Business are making this sharing easier. According to Jacob Morgan, author of The Future of Work: Attract new talent, build better leaders and create a competitive organisation, "knowledge is now nothing more than a commodity". This means the most important factor is an employee's ability to learn how to learn and to stay adaptable to whatever challenge arises. This is far more valuable than the knowledge you can access from a computer.

The new knowledge workers are the future "Knowledge workers" was a term coined back in the late 1950s by Peter Drucker, describing a new kind of worker whose main capital was knowledge. Nowadays, with working environments that cater to the needs of employees and many tools at their disposal, workers can be considered "professional knowledge workers". By accommodating and equipping these workers, your business will be armed for any challenges in the future.

### Questions to maximise your recruitment technology investment

The recruitment industry is rapidly evolving. With skills shortages in key fields, uncertainty over Brexit, and the ever-changing nature of the workforce, the expectations placed on recruitment firms are evolving. As a result, so too is the technology used by recruiters. In many ways, technology has evolved to meet the increasingly sophisticated techniques that recruiters are now adopting to maximise candidate sourcing and engagement. There are many different recruitment technology options on the market, which often makes choosing the right solutions a long and complex process. The recruitment industry itself is not homogenous – businesses don't all work in the same way, and many have unique requirements that necessitate unique technology solutions. Additionally, since profit margins in the recruitment industry are being squeezed by increasing global competition and a tight labour market, making future-proofed tech buying decisions that will deliver strong return on investment is more important than ever. Building a robust technology stack is not an easy task, but there are some questions you can ask of your chosen technology provider before committing to them, to ensure that it will support your company through its next level of growth – and beyond.

Is your tech supplier continuously innovating? A good technology provider will have a clear roadmap in place for future product developments and – by extension – how it plans to innovate. For recruitment software, this should certainly include a roadmap for more advanced automation and artificial intelligence. Manual processes result in costly errors and damaged productivity for recruiters, so automating processes should be high on the agenda for all recruitment companies, if it isn't already. The future state of the recruitment process will undoubtedly be shaped by machine learning, natural language processing, and other subsets of AI, so it's worth preparing for it now. Your provider should also be able to prove how it stands out from its competitors. This might mean highlighting specific features that competitors don't offer, but could also include couching differentiation in terms of scalability, breadth of integration partners, and the ability to provide specific product versions that meet the needs of different types of recruitment business, including executive search and other specialist firms.
How far can the technology be customised? It's highly unlikely that a high-performing recruitment firm would run with a software platform out-of-the-box, with no level of customisation. Now that recruiters engage with candidates and clients across a wider range of digital channels, it's important that your core recruitment platform can be integrated with email clients, social media platforms, mobile apps, and more. Managing the recruitment lifecycle is complex, so strong collaboration between team members and departments is essential. Therefore, rather than running with several disparate platforms for different stages of the lifecycle, recruitment businesses are better suited to using one platform that can manage the entire process, start to finish. And this platform should be capable of providing tools not just for client and candidate-facing activities, but also for back office activities like billing, invoicing, and time tracking. This is where an open technology ecosystem comes into its own. In other words, it's key to work with a provider with extensible APIs that enable ease of integration, so that your platform can be configured to your specific needs. The platform you choose should also be flexible enough to allow software developers to develop, test, and deploy their own third-party applications on top of the core platform – taking customisation to the next level.

Is the technology future-proofed? Once you've established that your technology supplier is innovative and flexible, you'll need to assess whether or not the platform is truly scalable. Your platform should scale effectively alongside your recruitment business, rather than forcing you to replace it with something else once you outgrow it. Part of being future-proofed is having the aforementioned roadmap for innovation in place, and allowing the technology to be customised as a company's needs change. But it's also about having a reliable, scalable system. Part of this is making sure you use software that is 'true cloud'. There are a few key characteristics that make a solution 'true cloud': it should have minimal installation requirements, be updated with new features in real time at no additional cost, and be reliable, secure, and easily configurable, with open APIs for customisation. In the age of GDPR, part of future-proofing your recruitment business will inevitably be improving its ability to manage and utilise data. Similarly, your solution should also promote better data management practices by acting as a single source of truth for all teams. Poor data management can be a real roadblock for growing recruitment businesses. A common example of this is when recruitment and sales teams are out of sync, meaning the business isn't turning over client work as quickly and efficiently as it could be. Finally, but perhaps most crucially, upgrading your technology infrastructure is time-consuming and expensive – you only want to do it once. This means making sure your provider is financially sound and exhibits signs of stability and growth. It's not uncommon for software companies to drop out of the market, leaving their customers in the lurch. In short, part of making sure your technology choice is future-proofed is choosing an established, secure, and reputable provider. Creating a bespoke technology stack that will adapt and grow with your business is crucial to achieving a strong ROI.
By asking the questions above, you’ll be sure to find the technology that can help your business grow.[/vc_column_text][/vc_column][/vc_row] ### Effective cloud technology is all we need Using the cloud for IT services brings numerous benefits for organisations. The flexibility, security and scalability of the cloud all bring strong advantages over on-premise, resulting in cloud migration becoming the obvious choice for many enterprises. However, it’s easy to overcomplicate the cloud, and to involve a workforce in its implementation and use more resource than is necessary. At Alcatel-Lucent Enterprise, we believe that the cloud works best for organisations when it happens behind the scenes – in other words, when it isn’t even a concern for most employees outside of IT functions. Here are some reasons why. Platform-as-a-Service (PaaS) PaaS is a cloud computing service which provides enterprises with a complete platform, including hardware, software and infrastructure. Using this, IT teams can develop, manage and run applications in a much easier way than if they were maintaining the platform on-premises. The employee experience here has become so seamless that the cloud component is to all intents and purposes transparent. This means that employees don’t have to be involved in assessing or monitoring the various benefits cloud brings, such as lower cost, simplicity and flexibility. The great advantage here for the organisation is the ability to provide a platform that works every time, but won’t run into issues like storage capacity or service interruptions caused by malfunctions or errors in the management of an on-premise platform; issues that can disrupt the employee experience and cause unnecessary concern. Flexibility to innovate One of the key things that hold organisations back from innovation is the time and money that is needed to set up new platforms. Implementing these services on-premise can be an extremely labour-intensive process, which slows down the organisation as it tries to maintain competitive advantage and agility in the market. However, with the cloud, establishing new services is often much quicker. New configurations can be up and running within hours, and because the organisation is only being charged for the time it uses (and not the hardware and infrastructure) cost is greatly reduced too. Using the cloud for new products and services liberates your product development teams, giving them the freedom to innovate without having to worry about new infrastructure and return on investment. The business model associated with the cloud is a key change in how the services are consumed. On a subscription-based model, the customer is free to increase or decrease their bill according to their real use of the solution. A customer can even stop whenever they choose. The subscription model provides real value for customers as they have zero capex investment. Vendors also have to adapt to this model. Instead of selling products, they now need to sell services with impact on how they manage the cash flow. On that side, many changes are happening and require serious adaptation in the vendor’s business intelligence.  Company structure evolution In any company some evolutions are bound to take place, be it extending the offices, opening new locations or even closing some locations; it is important to have a solution that can adapt to those typical changes. For instance, any merger or acquisition requires a great effort to bring together disparate sets of data and records. 
With on-premise systems, months or even years of manual coding can be required to bring information from one system to another. With systems in the cloud however, making that transition is far smoother. Employees from the two organisations can easily exchange and merge data using the cloud, making the merger or acquisition a seamless and almost instantaneous process. In any merger or acquisition, the workforce is concerned with how the process will be achieved. With IT systems merging in the cloud, at least one piece of the challenge is handled, and employees can focus on more important aspects. It’s critical that the data merger taking place within the cloud is kept as seamless - and to all intents and purposes as transparent as possible. This helps ensure the merger of two organisations feels like the creation of a new, single team. Continuity in uncertain times Managing costs and workload during times of market upheaval has always been a tricky proposition for the enterprise. One enormous advantage of cloud working is how quickly and easily activity can be altered and scaled to adapt to changing circumstances.  This means that IT systems can be changed with relative ease when market circumstances (for example, a fluctuation in sales) happens. By keeping cloud IT systems in the background, all the workforce needs to know is that the same services and platforms continue to be available. Issues like scaling of cost and resources simply are no longer relevant. This means a much smoother ride in tougher times, or times of fast expansion.  Never is this more important than in the case of disaster recovery. Cloud backup helps businesses to recover their data quickly should the worst happen. It ensures that operations continue with minimal downtime. Kept out of sight, the cloud enables minimal disruption to the employee experience, maintaining the confidence of the workforce even in the face of major incidents.  More effective collaboration Cloud technology allows collaboration on a much larger scale among employees within the same organisation. It allows multiple users from a range of different departments to access the same information. Thanks to the cloud, organisations can overcome geographic limitations and set up multi-region infrastructure that can be accessed from anywhere at any time. It enables disparate international business units to feel like one team, operating together to achieve shared goals. The key here is that the cloud enables this collaboration in a way that feels transparent – and as much as possible, it should be kept that way. The cloud, managed in a discrete way, means that collaboration can be taken for granted.    Cloud technology – a need-to-know platform? At Alcatel-Lucent Enterprise, we believe that powerful technologies such as the cloud should be orchestrated behind the scenes. A workforce should not even have to think about it, but simply know that it’s there to help them communicate and operate securely and effectively, helping them to work efficiently, anytime and from anywhere.  With the cloud, IT leaders have time to think and act strategically. They are able to provide the insights needed to choose the right resources for the business; they can worry less about how their teams will have to manage on-premise technology. The cloud enables businesses, whether small or large, to provide a comprehensive toolkit that helps a workforce to do its job whether in an office or from home, while maintaining an excellent level of customer service and intimacy. 
The importance of the right cloud-supporting technology To ensure the cloud is always working in a way that delivers its benefits in full, organisations need to adopt certain critical technologies to provide support. A strong cloud infrastructure, consisting of servers, storage devices, network and cloud management software, deployment software, and platform virtualisation, should support the cloud in its behind-the-scenes efforts. For example, Wi-Fi 6 will ensure the strongest and most reliable connection within buildings. This is a business essential to permit cloud technology to perform to its fullest extent. Finally, adequate cybersecurity to protect your cloud network is also vital in ensuring the safety of data held in the cloud. Here again, the cloud provides behind-the-scenes security through the encryption of cloud data. In addition, the protection of the network as a whole is vital in keeping enterprise data secure and allowing the cloud to be your biggest business enabler.

### Creating a Business Development Culture - Opening Lines

https://vimeo.com/338854583/4f8aaf2417

I talk regularly about the business development paradox that exists in businesses where individuals have to both sell and deliver their service. The Business Development Paradox is the situation where the most skilled business developers do the least business development. This is because they have the most established client base, so need to do the least new business development. This leaves the new business development to the junior members of the team, who are the least skilled at it. From a CEO's perspective it seems crazy that their growth plans rest on the least skilled members of their team. From a neutral point of view, it makes no sense to have the toughest part of business development fall on the least experienced members of your business. However, many people have a vested interest in keeping things the same. Who would not want an easy life servicing existing clients, earning well and not having to do the heavy lifting of new business development? As a leader, you may fear the consequences of telling your top billers they need to change. Perhaps you recognise things aren't right, but you feel powerless to do anything about it. You can't just turn the existing business model on its head overnight. But there are some less disruptive steps you can take to create a healthier business development culture. A culture that benefits your whole company, not just certain individuals. You might not turn things round overnight, but you won't alienate your top performers either.

WE NEED TO MOVE TOWARDS A CULTURE OF BUSINESS DEVELOPMENT COLLABORATION

1. INCENTIVISE COLLABORATION Talking about business development collaboration isn't enough. You need to recognise and reward the consultants or sales professionals in your team who actively promote other people in the business. You could formalise it by running quarterly competitions. You could just put more effort into recognising people in team meetings. If you praise people who go out of their way to help teammates, and talk about it regularly, things will start to change.

2. ALL FOR ONE AND ONE FOR ALL Encourage knowledge sharing. Pair up your top performers and less experienced team members during core business development hours. Encourage them to share the tricks of the trade and their market knowledge.
Move towards a culture where senior consultants are expected to support the rookies' development. You can do this through formal objectives. However, it's better if you can persuade them it's in their interests too. It will lighten the load of managers who are often also selling, and it will help identify senior consultants who are ready to step up to a leadership role.

3. USE IT OR LOSE IT Are your top performers creating a culture of fear? Experienced consultants and sales professionals will often ring-fence their best clients and intimidate others out of calling them. They may be making tens of thousands a quarter from these clients, cherry-picking the clients they want to work with, but what are they leaving on the table? How many thousands could your other consultants pick up? If you suspect that opportunities are being missed, you need to call them out on it. You can give them a choice: maximise the available revenue with the client, introducing your colleagues when opportunities arise, or risk losing that client. If the relationship between your company and the client relies on just one consultant, you're always at risk of that consultant leaving and taking the client with them. When you have several consultants working with the same client, you spread your risk and reduce the power of any individual consultant.

CULTURE CHANGE IS NEVER EASY It takes time and inevitably upsets people along the way. If you do decide to try and change things, you need to plot out a clear strategy at the start and work hard to keep everyone engaged along the way. One of the reasons it's hard is because we all react differently to change. Some will feel threatened; others will see opportunity. Regular open conversations are key to understanding where people's heads are at. Give them the chance to express themselves and contribute regularly. Alex Moyle is a business development and sales expert, and author of Business Development Culture – taking sales culture beyond the sales team, published by Kogan Page and available on Amazon.

### How will AI impact the future of different industries

Over time, Artificial Intelligence has become a significant part of our everyday life and is not just a science-fiction idea anymore. Voice assistants like Amazon's Alexa, Apple's Siri, Microsoft's Cortana, and many other such AI systems are used by us to interact with our speakers and phones. Cars built by Tesla can drive themselves by intelligently interpreting and analyzing their surroundings. There are many other applications of AI that surround us, even if we don't realize it. John McCarthy, one of the founders of Artificial Intelligence who coined the term in 1956, is often quoted as saying: "As soon as it works, no-one calls it AI anymore". Though we are using Artificial Intelligence-based systems every day, the question "How will it impact the future?" still remains. Let's have a look at how AI will change the future of work, education and, most importantly, the future of humans.

The future of work: Artificial Intelligence (AI) is reshaping companies, leading to increased productivity, which in turn drives economic growth. This technology will change the nature of the workplace as well as the work itself, since machines will be able to complement the work done by humans, perform a larger number of tasks in less time and even carry out tasks that go beyond human ability. Many believe that the emergence of AI technology will displace workers.
In the next 15 years, it is likely that this technology will take the place of about 40% of human jobs. But many fail to see how this technology will benefit the employment sector. Boosts Productivity: Many companies have confirmed that the implementation of AI in their business has improved their productivity. With the deep integration of software and machines in companies, the workplace will grow continually by enabling humans and machines to work as one. Stimulate Innovation: The future workforce will lean towards the creativity and innovation which will drive the business. With the help of automation, tedious tasks will become completely computerized. This will help employees to focus their time and energy on customer-oriented tasks. Also, promoting creativity gives employees a sense of satisfaction. Create jobs: Contrary to the belief that AI will displace human jobs, it is safe to say that this technology will give rise to new jobs as well. Artificial Intelligence and robots will give rise to as many jobs as they displace. Data scientists and Machine Learning engineers will be of high demand. The future of education: The swift growth of technologies like artificial intelligence and robotics has influenced various industries, including education. AI can be connected to multiple classrooms across the world as it is computer based. While these programs can teach maths or literacy to students, emotional and social skills will be taught by humans. The benefits of AI in education: Customized learning: Learning based on the needs of an individual targeting their strengths and weaknesses is one of the key advantages of Artificial Intelligence. AI systems help to provide a tailored learning experience to students based on their learning abilities. Essential and instant feedback: Students have increasing hesitancy of asking queries or clearing their doubts regarding the concepts of the course with their teachers in front of their peers. Artificial Intelligence has made it convenient for students to raise questions, make mistakes and get the feedback they require for their improvement. Collaborative learning: By examining the data of each student, Artificial Intelligence systems can group students depending on their character, strengths and interdependent skills. This will make the process of learning smooth and productive. The future of humans: Artificial intelligence is the duplication of human intelligence processes by machines like computer systems which include learning, reasoning, and self-correction. Its implementation has improved in the last couple of years. There is a rising fear that this technology will eventually replace human workers. But AI is a friend rather than a foe and while it will completely change the way the work is done, the broader impact of this will be in enhancing the capabilities of humans. Here’s how Enhanced automation: At present, Artificial Intelligence can execute intense human activities smoothly without the necessity for human interference. This has remarkably automated many jobs in various industries. Smart weather forecasting: Artificial intelligence (AI) methods implement reinforcement learning on previous forecasts and original outcomes. By analyzing forecasts with the actual results, it is able to learn and enhance simulation abilities, predicting the weather more accurately in the future. Improved safety: AI can manage the intensity and features that humans cannot approach accurately. 
Advancements in Machine Learning technology mean that AI can adjust to variations in threats automatically and detect issues as they occur.

Conclusion: Though the exact future can't be predicted, it is evident that Artificial Intelligence will become a significant part of our lives. There are many different ways in which this technology can influence our future. While much of this technology is still rudimentary at present, as organizations start to implement AI, advanced AI will eventually have a profound impact on our everyday lives and make our tasks simpler.

### How You Can Streamline Your Business With Blockchain Technology

Did you know 80% of businesses see blockchain as a strategic priority? While many people have yet to accept the arrival of this revolutionary technology, the bottom line is that many aspects of it are due for mass, multi-industry implementation. Businesses which understand that blockchain will not only change but also improve their operations will be the ones to reap the rewards within their competitive industries. So, in this article, we will discuss why you need blockchain to streamline your business.

Improved Security Since blockchain networks are more secure by design, they take much of the guesswork out of security. Blockchain technology uses advanced cryptography to control how data is modified, and the Proof-of-Work technique makes the data difficult to tamper with. Besides, blockchain databases are resilient because the data stored in them is spread out across multiple locations. The database is also decentralized, so it cannot easily be compromised by hackers. To modify or alter the data, a person would need to change every subsequent piece of the chain. This, in turn, guards the business against internal corruption, and customers can also be confident that their data is in safe hands. In addition, accepting cryptocurrencies can even secure transactions. These payments are made without personal identification information, which gives inherent protection against identity theft. With cryptocurrency comparison tools, you can easily work out which currencies your business should start accepting.

Better Supply Chain Management If you are a small-business owner, you probably know who's on the other end of the transaction when placing an order with a supplier, but you might not know who the supplier's supplier is. Blockchain technology brings transparency into the supply chain and allows the owner to see every company that has a hand in growing, creating, or manufacturing any product it sells or service it delivers. Blockchain technology also stores transactions in a decentralized record and monitors them safely, which can reduce the time delays and mistakes that directly cost time and money.

Increased Efficiency Blockchain technology can make transactions faster and more efficient. As there is no involvement of third parties, legal contracts can be completed quickly. The blockchain eliminates the need to reconcile documents like billing statements and invoices, making payments near-instant. This has a huge impact on a business's cash flow, especially for those who rely on insurance reimbursements. Also, as blockchain technology creates a tamper-evident log of sensitive data, it can reduce the burden of regulatory reporting and security oversight.
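To illustrate the point above that altering blockchain data means changing every subsequent piece of the chain, here is a minimal sketch of a hash-chained log in Python. It deliberately leaves out consensus, Proof-of-Work and networking, so it is not a blockchain implementation, only a demonstration of why tampering with one old record is immediately detectable.

```python
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash a record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    """Link each record to the one before it via its hash."""
    chain, prev = [], "0" * 64  # genesis placeholder
    for record in records:
        prev = block_hash(record, prev)
        chain.append({"record": record, "hash": prev})
    return chain

def verify(chain) -> bool:
    """Recompute every hash; any edited record breaks the chain."""
    prev = "0" * 64
    for block in chain:
        if block["hash"] != block_hash(block["record"], prev):
            return False  # a block was altered after the fact
        prev = block["hash"]
    return True

chain = build_chain([{"invoice": 1, "amount": 500}, {"invoice": 2, "amount": 750}])
print(verify(chain))              # True
chain[0]["record"]["amount"] = 5  # tamper with an old entry
print(verify(chain))              # False - the stored hashes no longer line up
```

In a real blockchain the same linkage is combined with replication across many nodes, which is what makes quietly rewriting history impractical.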
Automation through Smart Contracts Blockchain technology enables smart contracts that automate legal arrangements, escrow agreements, and accounting. It eliminates the cost of the third party usually responsible for facilitating these deals. Smart contracts are also more transparent than physical contracts, since the data contained on the blockchain is stored on peer-to-peer networks and is visible to participants. A business wanting to restore faith with customers can demonstrate its integrity with smart contracts.

Decentralization Blockchain data is spread out across multiple nodes, hence the decentralization. This protects against security breaches, and the network also requires fewer resources to maintain. The decentralized nature of the blockchain can quickly scale the storage capacity of your business as your growth warrants it.

Conclusion Undoubtedly, blockchain technology will be a game-changer in the years to come. Be it logistics, healthcare, or banking and finance, this technology has the potential to revolutionize many industries. So, it's high time for businesses to begin exploring the plethora of improvements blockchain technology can bring to their operations.

### Green data: Survival of the sustainable

Pit an environmentally friendly (green) business or household against one that can't tell the difference between recycling and general waste – who are you going to back when it comes to winning the race for sustainability? With the government on board to cut greenhouse gas emissions in the UK to almost zero by 2050, those who don't take responsibility will be left behind. The same responsibilities apply in the datacentre world. When it comes to fitting out and operating a facility, businesses are racing one another to turn their data 'green' and make their facilities as sustainable as possible. Let's take a closer look at which trends are pushing environmentally responsible datacentres over the finish line:

Cool data, hot competition The first step to improving sustainability is reducing the amount of electricity required to operate a datacentre. According to the Storage Networking Industry Association (SNIA), 5% of total global energy usage is consumed by electronics – a figure that will grow to at least 40% by 2030 unless companies make major advances in lowering electricity consumption. Additionally, datacentre cooling is one of the main rising energy costs, along with the demand for datacentre capacity. Developing in-house water-cooling systems is one way to address these challenges. For example, servers that use water, rather than electrically driven air cooling, to remove heat inevitably reduce energy consumption and optimise air flows. This is because the water-cooling systems are built into racks with integrated heat exchangers and power distribution units, so users need less than 10% overhead energy on top of server energy. A typical datacentre, in contrast, needs between 40% and 100% more (a simple worked example of what these overheads mean follows below). Cloud migration service providers allow corporations not only to reduce the cost of operations and develop new uses, but also to reduce their environmental footprint compared to their legacy and generally less efficient facilities. Aggregating cloud computing needs through large hyperscale cloud service providers becomes part of the answer to controlling the risks mentioned above.
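One common way to express the "overhead energy" figures quoted above is Power Usage Effectiveness (PUE): total facility energy divided by the energy consumed by the IT equipment itself. The sketch below simply maps the article's percentages onto PUE values; the 1,000 kWh IT load is an illustrative assumption, not a figure from the article.

```python
def pue(it_energy_kwh: float, overhead_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy."""
    return (it_energy_kwh + overhead_kwh) / it_energy_kwh

it_load = 1000.0  # illustrative annual IT load in kWh

print(pue(it_load, overhead_kwh=100.0))   # ~1.1 -> under 10% overhead (integrated water cooling)
print(pue(it_load, overhead_kwh=400.0))   # 1.4  -> 40% overhead (typical facility, low end)
print(pue(it_load, overhead_kwh=1000.0))  # 2.0  -> 100% overhead (typical facility, high end)
```

The closer the ratio is to 1.0, the more of the facility's electricity is doing useful computing rather than cooling and power conversion.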
Maintaining your stride towards an efficient datacentre The global cloud datacentre traffic was 6 zettabytes (ZB) in 2016, and it is now projected to reach 19.5 ZB per year by 2021. So, wherever possible, datacentre providers should optimise the use of server resources across their customer base. This can be achieved via virtualisation and the re-use or recycling of servers, as well as using an integrated industrial model to build systems that are more energy efficient. In addition, dedicated research teams can work alongside production to ensure the functionality of a product and assess safety requirements before they are manufactured. Innovation that consists of improving efficiency with fewer resources ultimately respects continuous renewal, through the development of agility and the creation of sustainable solutions. Hence, providing the best services for customers. Futureproof and don’t look back The foundations are in place and official targets are set. It’s imperative that the right steps are taken to keep sustainability at the forefront. Those who utilise green energy will be a step ahead in reducing climate change and leading by example, to ultimately grow their datacentre to have a sustainable future.[/vc_column_text][/vc_column][/vc_row] ### Four elements of a successful file services platform [vc_row][vc_column][vc_column_text]When his firm was hit by a ransomware attack and files held hostage by attackers for more than $100,000, IT manager of construction company S.J. Louis, Rob Svendsen, did not panic. Despite the fact that all the records for recent and former jobs were on the servers, including $200 million worth of current construction work. That’s because S.J. Louis has been clever about its data, by being an early adopter of enterprise cloud file services; a holistic approach that moves the gravity of enterprise data to the cloud. With enterprise file services, a file server hit by ransomware, accidentally deleted, or even drinking a beverage intended for human consumption, are not a major cause for concern since the data itself lives in the cloud. And with modern file services that employ clever caching technologies, fully recovering from such a catastrophic loss can be achieved in minutes. In the world of cloud file services, data follows the user as it flows between clouds, offices and endpoints. But as organisations make their transformational journey to the cloud, they face diverse challenges in the areas of infrastructure management, data governance, privacy and security. Successful file services platforms combine four access methods, allowing organisations to achieve safe, seamless access to their entire global enterprise data. But first, let’s talk a bit about how things look in most enterprises today. Legacy silo systems and unstructured data Most organisations rely on legacy systems and architecture in their datacentres which hinders increased versatility. Within these environments data is typically unstructured, with project files located in silos, stored on users’ devices or located in branch offices across multiple locations. This lack of interoperability is not conducive to file protection in the cloud and can present many intractable and costly challenges, such as critical data loss. The challenge of protection How to prevent the loss of valuable information is a major conundrum for most enterprises. It is practically impossible to ensure the protection of data on servers located in distant branch offices, unprotected locations or on employees’ laptops. 
If a laptop goes missing, for example, vital data is likely to be lost with it. Hardware or system failure as well as ransomware and other viruses are very common causes of data loss. Predictably human error is often to blame, whether it’s a result of accidentally pressing that delete button, leaving your laptop on a train or dropping a mobile phone. People always make mistakes. Dark data Another key issue is what we call ‘dark data’ which may represent either a lost opportunity or a security risk to a company. Dark data is dispersed digital information which is stored but unutilised.   By adopting a file services platform, organisations can consolidate all files from all users, servers, NAS devices, etc., and integrate the data into a single solution. The optimum file services platform will remove storage capacity limitations by dynamically caching files from any secure cloud to enterprise edge devices and desktop users. This allows users to access, share and protect an unlimited number of files in the cloud as if they were stored locally without being constrained by local storage capacity or security compromises. The four methods for accessing cloud file services A file services platform should enable the storage of all the organisation’s information, including unstructured files, making access seamless and simple from any location or device. Ideally a combination of four main methods should be employed: Web interface - The first method would be a web interface, where workers can connect to a web portal from which they can access all their files. Endpoint client - The second method is the installation of an agent on users' laptops or desktop computers that allows them to work on files stored in the file services platform. This can be achieved either by synchronising a proportion of the files from the file services platform onto these devices, or by providing cached access. The agent client essentially enables users to see all of the information in their file services, with the structure appearing the same as it did prior to migration. The information that is most frequently needed will be cached locally while the rest of the data will be brought on-demand. Mobile app - The third access method is the utilisation of a mobile client, which allows users to access all an organisation’s information from a mobile device, a phone, or a tablet. The mobile client is the provision of an application that exposes the information so it can be accessed easily from a variety of devices. The caching gateway – While the first three access methods are likely to be familiar to individuals who store data in the cloud, this final method is not available on most platforms. The caching gateway is a client device that allows organisations to access all its documents in the file services platform in the same way that a file server or a NAS device would be accessed, using traditional protocols like SMB (server message block) or NFS (network file access). A caching gateway enables the enterprise to seamlessly migrate existing file servers or filers into a consolidated worldwide file services cloud platform. But from the users’ perspective, there is little change, as they can view the same information that was visible from the existing server and everything continues to work just as before. 
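As a rough sketch of the caching gateway idea described above, the toy class below serves frequently used files from a local cache and fetches the rest from cloud storage on demand. A real gateway exposes the files over SMB or NFS and handles locking, eviction policy and write-back; the class name, method names and the dictionary standing in for the cloud back end are illustrative assumptions only.

```python
class CachingGateway:
    """Toy read-through cache in front of cloud file storage.

    A real caching gateway would present SMB or NFS to clients and manage
    eviction, locking and write-back; this sketch only shows the core idea
    that frequently used files are served locally and the rest are fetched
    from the cloud on demand.
    """

    def __init__(self, cloud_store: dict, capacity: int = 100):
        self.cloud_store = cloud_store   # stand-in for the cloud back end
        self.local_cache = {}
        self.capacity = capacity

    def read(self, path: str) -> bytes:
        if path in self.local_cache:
            return self.local_cache[path]        # served locally, no WAN round trip
        data = self.cloud_store[path]            # cache miss: fetch from the cloud
        if len(self.local_cache) >= self.capacity:
            self.local_cache.pop(next(iter(self.local_cache)))  # naive eviction
        self.local_cache[path] = data
        return data

cloud = {"/projects/site-plan.dwg": b"...", "/finance/q2.xlsx": b"..."}
gateway = CachingGateway(cloud)
gateway.read("/projects/site-plan.dwg")  # first read pulls from the cloud
gateway.read("/projects/site-plan.dwg")  # second read is served from the local cache
```

Because the authoritative copy stays in the cloud, losing the gateway itself loses nothing: a replacement simply starts with an empty cache and warms up again, which is the point the article makes about recovering from a catastrophic loss in minutes.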
The power of combining access methods In order for the decision to migrate global company data to a cloud file services platform to make business sense, it is critical to implement a global file services solution that utilises the benefits of all four access methods. Organisations must be able to continue working without disruption, just as they did prior to migration.

The importance of maintaining file structure Users might have files that are stored in a particular folder structure. For example, employees may regularly work with a number of Excel spreadsheets that need to be maintained in exactly the same drive mappings as before. It would be very difficult for an organisation to migrate to a solution that does not include the ability to continue to serve this type of workload in this way.

Failsafe operations In the unlikely event that a caching gateway fails, all that is needed is the installation of a second gateway, and the files are instantly restored intact.

Non-disruptive migration Non-disruptive migration is an important capability of a cloud file services platform. It is especially important for organisations that have previously invested heavily in traditional systems and legacy architecture, which may become obsolete during an organisation's digital transformation. The beauty of this approach is that enterprises can achieve the agility they are after without having to start from scratch with the procurement of entirely new systems. As enterprises grow in the global market and embrace digital transformation, more and more people are working remotely and expect data to be available to them wherever they are. It is vital to these users, and the companies they work for, that travel and remote office working equate to business as usual. With caching capability allowing for multiple gateways to a single global file services platform, organisational data can be quickly and securely accessed from desktops, laptops and mobile devices at any time, from anywhere.

### How Can AI Help You Design A Bespoke User Experience?

Artificial Intelligence (AI) has become a part of our daily lives. We have all used AI-based tech at least once in our lives – Siri, Alexa, and Cortana (VAs) are all products of AI-based advancements in communications tech. Today, more than 58% of Google searches for local businesses come through virtual assistants. More than 27% of all local website users find the sites via voice searches. The numbers prove beyond doubt that AI is integral to the UX design of digital branding assets.

What Is The Applicability Of AI In E-commerce? It is the era of AI and its subset, machine learning (ML). As users become accustomed to the benefits of easy search, navigation, and product purchase, more and more e-commerce sites are adopting AI and ML. AI typically has several forms and functions. The simplest, yet most effective, of them can be to:
i. Interpret the behaviour of the target audience
ii. Automate analysis of the big data collected through user interactions
iii. Leverage ML to provide optimal solutions to the problems at hand.
AI is the capacity of a machine or software to scrutinize large volumes of data, interpret it, and "learn" to react according to circumstances in a manner similar to humans. Its abilities range from analyzing data sets to detecting patterns in them, which can be used to create a personalized user experience for each visitor and shopper.
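As a very rough illustration of the pattern-detection point above, the sketch below recommends products by counting how often items appear together in past orders. Real e-commerce personalization draws on far richer signals (browsing behaviour, profiles, context); the order data, product names and function here are invented for the example.

```python
from collections import Counter
from itertools import combinations

# Made-up order history: each order is the set of product IDs bought together.
orders = [
    {"running-shoes", "socks"},
    {"running-shoes", "water-bottle"},
    {"socks", "water-bottle", "running-shoes"},
    {"yoga-mat", "water-bottle"},
]

# Count how often each pair of products appears in the same order.
co_purchases = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        co_purchases[(a, b)] += 1

def recommend(product: str, top_n: int = 3):
    """Suggest products most often bought alongside the given one."""
    scores = Counter()
    for (a, b), count in co_purchases.items():
        if product == a:
            scores[b] += count
        elif product == b:
            scores[a] += count
    return [item for item, _ in scores.most_common(top_n)]

print(recommend("running-shoes"))  # e.g. ['socks', 'water-bottle']
```

Even this crude co-purchase count captures the basic idea the article describes: patterns mined from past behaviour feed suggestions tailored to the individual shopper.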
What Are The Benefits Of AI In E-commerce? Adapting new tech can be jarring for any individual, and it is more so for enterprises that have been running successfully for the last couple of years. AI has several benefits, including shortening the path to conversion and adapting to voice-assisted searches. However, e-commerce platforms need to consider the significant advantages mentioned below before shying away from new technology. i. An Increase in Revenue One of the most significant benefits of adopting AI could be the amplification of ROI for e-commerce websites. However, the adoption will come with several clauses including but not limited to the preservation of consumer privacy, creation of new marketing channels, and quicker turnaround for complaint tickets. ii. Providing Personalized Experience While shopping online is convenient for everyone, there is one significant caveat – the loss of personalization. People often complain that there is a lack of personal touch when they are visiting e-commerce sites. There are no "sales reps," whom they can ask for size referrals or trending styles that might suit their body types. It is a common concern among many shoppers and website owners alike. iii. Suggesting Relevant Products to Individual Buyers AI can bridge the gap between the website user’s need for personalization and the owner’s need for selling their products. The beauty of AI is that it can equip any website to meet the client’s demands for customization and personalization. Modern AI technology can present potential buyers with a wealth of information on the products they are interested in, provide a fine-tuned selection of products for them and offer delivery or shipment options convenient for their location. Apart from offering a personalized shopping experience, AI can also provide a plethora of coupons, discounts, and sale options for the customers according to their preferences or buying history. 1-800-Flowers uses a similar service that allows the user to select the kind of events, the categories of gifts and offers suggestions accordingly. iv. Offer Chances of Growth and Scalability AI can live up to the standards of personal assistants that can serve up offers and products according to buyer personas. While e-commerce platforms like Magento already have existing apps that include similar features, the adoption of proper AI technology can accelerate the growth of any e-commerce website. Research shows that personalized services can boost customer satisfaction. Satisfaction is what drives customer retention and loyalty. More than 63% of all consumers between the ages of 19 and 29 begin consumer service interactions online. Millennial consumers no longer evaluate a brand by the prices and offers, but by their customer services. v. Improve Customer Service and Enhance Satisfaction Studies show that millennials are ready to pay 21% more (at an average) for a brand's product and service as long as the brand offers stellar customer service. Aerie, the lingerie brand from American Eagle Outfitters, has been using a simple AI Bot to gain enormous popularity by targeting millennial and Gen Z consumers. More than 33% of the consumers state that speaking to an agent of the company makes them more confident about purchasing from a brand. Most new small businesses and startups do not have the resources to appoint 24/7 agents to address the consumer's messages, calls, and emails. 
It helps to have a "smart" chatbot service that can answer the consumers' queries via the integration of AI and ML. Several e-commerce platforms like Sephora and eBay now include AI chatbot services that respond like humans to queries and complaints. It has enhanced their UX and helped them boost their ROI. Unless there is a complicated issue, most chatbots are equipped enough to increase customer satisfaction by being the "friendly and informative friend" consumers seek. vi. Optimize and Consolidate Data from Multiple Channels Research shows that more than 60% of the customers in the US want omnichannel support that enables them to find solutions through automated services like mobile applications, or websites. It saves the consumer's time and effort. It is only possible for the current e-commerce sites to do so by integrating AI technology with their existing apps and techs. More than 70% of the current search traffic comes from mobile devices. Consumers switch between mobile devices and PCs or Macs according to their convenience and accessibility. The ability to consolidate the data from a registered account from all user devices enables the e-commerce site to preserve the personalization settings and offer a seamless service to each potential customer. Conclusion The integration of AI can transform your enterprise into a customer-centric business. It is a smart investment for a smart future. Advanced personalized solutions can address cart abandonment issues as well as high bounce rates. It is the delicate touch that can humanize your entire e-commerce operations and add a dash of personalization to your current UX. It is the secret sauce for the recipe of success of every e-commerce site. Integrating AI can be a game-changer for your business and brand.[/vc_column_text][/vc_column][/vc_row] ### Amazing FinTech Women We Should All Admire [vc_row][vc_column][vc_column_text]Most people would agree that the level of inequality in the workforce is a major injustice. Women make up approximately 43% percent, almost half, of the working population, and yet many are not given positions in management or senior roles. This is particularly true in technology-focused jobs. Worse, women are often discouraged from going into high-level jobs or starting their own companies. This is mostly due to the prevailing belief that a woman’s place is in the home, that her skills should revolve around cooking and child-care, and that she must serve her husband first. Thankfully, the number of people who hold this belief is constantly decreasing, but vast swathes of society are still feeling the effects of it. Why We Need More Women In Tech Industries Studies consistently show that increasing diversity in a workplace can skyrocket the success of a company. This is true for racial diversity as well as gender equality. In 2012, a study found that having females in top management could majorly increase company success, by as much as $42 million in just over 10 years. “Obviously increasing the number of women in tech industries is good for the industry as well as for the women. We need to put energy into encouraging girls to take an interest in science and mathematics early on in their lives, and destroy the belief that the logical subjects are geared towards the boys,” says Emma Brent, a tech writer at Gum Essays. The Women We’re Wowed By 1) Ghela Boskovich: Initially an economist, Boskovich launched FemTechGlobal in 2015. 
This foundation seeks to increase the belief in diversity and encourage men and women to work together to improve technology. 2) Susanne Chishti: She has a full 14 years of experience in the finance industry, and in 2015 she was one of the European Digital Financial Services “Power in 50.” She started FINTECH Circle the previous year, which looks for the best FinTech startups to get involved with. 3) Ana Baraldi: Working in a very male-dominated environment in Vérios, Baraldi has had to deal with difficult customers who do not believe she belongs in the industry. She has kept powering through, offering the best possible customer service as Head of Customer Experience, and inspiring young women everywhere. 4) Seema Hingorani: An amazing woman who started out as an equity analyst, Hingorani rose quickly through the financial world and began her own company, SevenStep Capital. In 2015, she also began a nonprofit called Girls Who Invest, which has successfully helped over 500 women get a foot in the door of the financial world. If more people like Hingorani were around, our problems of equality would soon disappear. 5) Abby Johnson: A woman whose family have been in the tech industry since 1946, she is considered the seventh most powerful woman in the entire world. Her firm, Fidelity Investments, was started by her grandfather, and today is in her capable hands. She is constantly pushing her company forward, and took a strong stance against sexual harassment issues within the company. 6) Banco Neon: This woman thinks that the industry is finally starting to see some change in terms of equality. She believes the startup environment is more open to diversity and new ideas than an established business, so innovative new tech companies are a good foothold for women seeking to enter the industry. She says that men are still frequently favored over her, but that the tendency is reducing now, as old players in the industry are starting to respect the presence of women. Tech companies are slowly waking up to the fact that women will make them better and brighter, but it’s a long process. Many women attempting to enter this industry face prejudice and a lack of faith from their colleagues, and they often have to work harder just to earn the respect that male workers are given automatically. Thankfully, powerful women in the industry are fighting for the rights of other females, and studies are pointing to the benefits which companies can reap. If we continue the current trends, perhaps one day we will see a fair representation of gender balance in the tech industry. Until then, let’s keep applauding those who are making the effort to realize this future![/vc_column_text][/vc_column][/vc_row] ### The Technology That’s Creating Tomorrow’s Workplaces [vc_row][vc_column][vc_column_text]How do you really figure out what’s happening in an organisation, how people feel about their employer, and how workplace dynamics are actually playing out? Tools like Organisational Network Analysis (ONA) and machine learning are giving leaders an unprecedented insight into their businesses, helping them to reshape their companies for maximum productivity and happiness.   Under the skin of an organisation Culture is crucial. In a competitive marketplace, it can provide a critical advantage, helping to attract and retain the very best employees and impacting organisational performance. Historically, however, organisational culture has been very difficult to understand and assess. 
Organisations have had to rely on paper-based surveys, which capture limited information and quickly become outdated. Now, new technology is revolutionising the way that employers assess attitudes, monitor employee engagement, and gauge workplace culture. Culture Workbench from Temporall collects real-time data, uses rich data analytics and machine learning to interpret it, and then helps leaders make evidence-based changes to their workplace. ONA is central to this technology. ONA analyses the relationships and patterns between people and the impact that individuals have on an organisation. According to Professor Rob Cross, a social network analysis expert, ONA “can provide an x-ray into the inner workings of an organization”. This technology is “a powerful means of making invisible patterns of information flow and collaboration in strategically important groups visible” (‘What is ONA?’). This means that ONA can unveil things about an organisation that were previously almost impossible for leaders to see. ONA can be combined with quantitative, qualitative and sentiment analysis to build a rich, complex picture of the people dynamics in a workplace. This is the true ‘who’, ‘what’ and ‘why’ of an organisation’s culture. Culture is constantly moving and shifting, and it can’t be captured in a static org chart or conveyed on a feedback questionnaire. Using these new tools, leaders can discover which employees are central to the company running smoothly, find out which company initiatives are having the most impact, and get the information they need to successfully implement digital transformations. This happens through real-time data capture, data interpretation, and then action based on these findings. We’ll run through each of these steps in a little more detail. Continuous data collection  Workplaces are awash with data, and data analytics is already a growing practice in HR. By collecting data about payroll, absence and operations performance, HR teams gain a certain amount of useful insight. But there’s far more to be gained from gathering more complex data about the many facets of company culture. If you want to find out about a workplace, it makes sense to meet users where they already work and are most productive. Daily conversations can pop up while people are working, getting rid of the need for surveys. Data can be captured from teams using enterprise systems that are already in place, like Slack, Workplace by Facebook, and Google Cloud. Information can be collected continuously, providing real-time data and creating an insight loop. Sophisticated data analysis So the workplace data is flowing in, but what does it actually mean? Machine learning paired with data analytics and reporting can help make sense of all the information, sifting through data to recognise patterns and draw conclusions. Data-based decision making  The purpose of gathering and interpreting all of this data is to make better decisions, changing an organisation for the better, and turning a company’s culture into a competitive advantage. Leaders and HR teams can use the information to clarify purpose, refine company processes, and make intelligent decisions about productivity and retention. And as changes are made, the continuous feedback loop means that leaders can measure the impact of those changes. The path to a high performing workplace Why is all of this so important? 
Because if a business has a clear strategic direction coupled with a great culture, employees know what they’re aiming towards and feel motivated to achieve it. Understanding and refining company culture is the key to creating healthy, high performing organisations. Emotional intelligence (EQ) is the next frontier for machine learning. EQ is already being used in market research, with companies using algorithms to scan comments about their business online so that they can monitor their brand’s reputation and predict potential crises. It’s not difficult to see how this technology can be applied to understand workplaces, with revolutionary results for business leaders. This is the ‘employee engagement’ piece that so many HR departments are wrangling with. Employee engagement is inextricable from culture, and culture is the beating heart of an organisation. This new technology enables leaders to constantly keep a finger on the pulse. Business leaders can now understand their workplace culture like never before, unlocking the knowledge that will help them make their organisations as brilliant as they can be. This has far-reaching effects, resulting in both more effective organisations and employees who are healthier, happier and more productive.[/vc_column_text][/vc_column][/vc_row] ### What makes a brand great? Opening Lines [vc_row][vc_column][vc_column_text]We’re surrounded by brands. They are so ubiquitous that we rarely stop to question why they exist in the first place. What are they here for? Are they here to provide a veneer of acceptability to capitalist enterprise? Are they the product of mass media? Are they a vehicle for the manipulation of people in society? Are they an inevitable manifestation of the human condition? It’s difficult to judge what makes a brand great without a clear point of view on what role they are supposed to fulfil in the first place. If we view them as oil in the cogs of capitalism, then value creation is the ultimate measure of greatness. But if we believe they play a greater role than mere profit generation, then we need a more ambitious set of standards. So why do brands exist? Let’s go back to the beginning. Right back to the earliest recorded civilisations. The year is around 2300 BCE and we’re in the Indus valley. Merchants in the ancient city of Harappa use distinctive seals to demarcate their goods. These seals perform a clear functional role: to convey the identity of the sender of merchandise – they represent a mark of origin. A guarantee of quality. But the seals also contain symbolic value: some contain totemic animals like unicorns and antelopes; some are some show deities like the God Shiva, who was venerated for fertility and hunting prowess. It’s difficult not to interpret these as the earliest known example of brands. They existed in the oldest known civilisation and they have existed in various forms ever since. Along with cooking, sport, folklore, language and superstition, brands are human universals. They are a chronic aspect of humanity. Why? In the 5th Century BCE, the Greek philosopher-poet Xenophanes of Colophon noticed the tendency of different civilisation to create gods in their own image: Greek gods looked like Greeks, African gods looked liked Africans, and so on. In doing so, he identified a general human tendency to conceive of the world in human terms: anthropomorphisation. We do this from childhood with our teddy bears when we talk to them and pour them tea. We anthropomorphise our pets when we make them a part of our family. 
When Financial Times journalists write about equities suffering in a crash they are also succumbing to our unconscious compulsion to ascribe human qualities to non-human entities. And this compulsion is central to understanding why brands exist and what makes a brand great. Brands exist because people can’t help but think about organisations in human terms. That’s why so much of the language of branding is anthropomorphic: we talk about brand loyalty, brand personality, brand intimacy and even brand desire and brand love. We can measure the extent to which people perceive a brand to be fun or serious, calculating or caring, old-fashioned or optimistic. If you buy the idea that brands exist because people anthropomorphise organisations, then this has fundamental implications for how you define greatness.  How do you define greatness? I’ve been a brand consultant for twenty years and have spent much of that time beating the same drum that the likes of Byron Sharp still continue to bang: that great brands are the product of consistency, repetition and consensus. Consistency dictates that every interaction a person has with a brand should be designed to a strict and fixed set of guidelines, in order to create a clear expectation of the next interaction. Repetition involves the building of ‘distinctive memory structures’ in people’s minds through the repeated use of logo, colour and message. Consensus means that everybody in an organisation aligns around the same elevator pitch and conforms to the same set of prescribed behaviours. For decades, this is how brand marketers have defined greatness – and many still do. The problem with this approach is it promotes an extremely mechanistic, rules-based approach to brand management. Rather than resulting in great brands, it undermines the fundamental role of any brand: to humanise an organisation so that people can connect to it emotionally. Imagine if a person always repeated the same catchphrase, always wore the same outfit and always behaved in accordance with a fixed set of guidelines. We wouldn’t call that person great. We’d call that person a sociopath. Surprise us The anthropomorphic brand champions coherence over consistency. We tend to anthropomorphise things that are unpredictable. The brand experiences we love the most aren’t those that we can predict; they are the experiences that surprise us. The brands we are most drawn to aren’t those that deliver uniformly across markets and channels; they are the brands we can’t take our eyes off because we don’t know what they are going to do next. It’s important to say that this is not to suggest that brands should deliver random or chaotic experiences: if something becomes so unpredictable that it is considered to be random, then we are less likely to anthropmoprhise it. But a little bit of unpredictability is a good thing. The anthropomorphic brand eschews repetition of logo, colour and messaging in favour of interpretation and re-interpretation. This is what stops brands from going stale. It makes them interesting to watch. That’s why we love Absolut Vodka’s limited-edition labels. It’s why we love Lego. Great brands don’t just deliver on expectations; they play with those expectations. They inspire us to see more of life’s infinite possibilities. They make the world more colourful and interesting and beautiful. They certainly don’t nag at us with the same brand logo, brand colour and brand message over and over and over and over and over again. 
Be human, have fun The anthropomorphic brand encourages challenge rather than building consensus. They don’t hand their employees a script to regurgitate. They don’t issue long lists of do’s and don’ts. They encourage the people who work for them to improvise around a theme. They challenge people to decide for themselves how best to apply organisational values to their work.  They place as much emphasis on what is flexible in their brand guidelines as they do to what is fixed. This is how you make a brand more human: make the interactions people have with it coherent but not uniform; give people space to find their own meaning in those interactions; and challenge people to question that meaning and evolve it over time. Not only does this result in more compelling brands, it’s also far more fun.   To find out more, go to https://theclearing.co.uk/[/vc_column_text][/vc_column][/vc_row] ### The irreplaceable security practice [vc_row][vc_column][vc_column_text]Using physical and behavioural characteristics in security to authenticate identity dates back to the first signature used on a bank cheque, and since then, progress has been significant. New technologies like biometrics are increasingly used across industries, with facial recognition and fingerprint sensors becoming the new trusted gatekeepers of our age. The appeal of biometric technology comes from its inherent clarity, speed, ease of use and functionality, mixed with a small hint of Bond‐esque gadget appeal. The spread to consumer products like laptops and smartphones, with biometric technology inbuilt, has served to deepen the widespread comfort in biometrics as a reliable form of security. There’s also an understanding of how unique features such as a face or a fingerprint can provide an infinitely more complex key to protect access with, than any combination of letters, numbers and symbols. Yet, with the impressive functionalities that biometrics provides, it is essential to cast a thought to how the associated data is being stored and what the risks involved are in opting for fingerprint‐only authentication. The idea of opening a smartphone with one’s face is impressive, but facial recognition as a security measure has its own downfalls. A well‐known hacker proved the fallibility of fingerprint authentication recently by using a print left on the screen of the newly released iPhone 5S screen to bypass the authentication request. This scenario is not uncommon. The ability to use a high-resolution image of a hand to collect fingerprints in order to use as access authentication has been previously documented. Biometric recognition also poses a potential pitfall thanks to the personal, unique form of authentication, one which cannot be lost, replaced/reset or disassociated from its owner. As more and more banks and retailers start to use biometric recognition as a form of identification to perform everyday functions such as authenticating payments and provide access, the risk of this type of biometric hacking is increasing. The prospect of a biometric data breach, where users risk the prospect of losing biometric data used to provide authentication, which then cannot be changed, highlights a significant potential pitfall for the technology. For most people, the idea of security ends at their fingertips. Many don’t consider where their biometric data is stored and how that must be protected in itself. Biometric data needs to be stored securely with restricted and monitored access – be it on a device, server, or in the cloud.  
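To make the storage point concrete, here is a minimal sketch of encrypting a biometric template before it is written to a device, server or cloud store. It assumes Python with the `cryptography` package; the function names and the simplified key handling are illustrative only, not a prescription for any particular product, and in practice the key itself would live in an access-controlled secrets store or hardware security module.

```python
# Minimal sketch: encrypt a biometric template at rest before storing it.
# Assumes the `cryptography` package is installed (pip install cryptography).
# Key management is deliberately simplified; keep the key separate from the data.
from cryptography.fernet import Fernet

def encrypt_template(template_bytes: bytes, key: bytes) -> bytes:
    """Return the encrypted form of a raw biometric template."""
    return Fernet(key).encrypt(template_bytes)

def decrypt_template(token: bytes, key: bytes) -> bytes:
    """Recover the template for matching; access to this call should be restricted and logged."""
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    key = Fernet.generate_key()               # in practice: fetched from a secrets manager
    template = b"\x01\x02\x03..."             # placeholder for a real fingerprint or face template
    stored = encrypt_template(template, key)  # this ciphertext is what goes to disk or the cloud
    assert decrypt_template(stored, key) == template
```

Encryption at rest does not remove the underlying concern that a biometric, unlike a password, cannot be reissued, but it narrows the window in which a stolen database is directly useful.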
Investigating the security capabilities of providers is especially important when selecting who holds your biometric and personal data. As part of this issue, Biometric data governance is necessary, not only across business, but in places that might not immediately come to mind. Many schools use biometric technology to monitor attendance or as replacement for library and lunch cards, for example. This kind of data is normally centrally stored on the school’s network and could put children’s personal biometric information at risk if inappropriately secured. Biometric technology is also valuable in terms of corresponding security procedures, just as a password might be if paired with another form of authentication. This is particularly important with intrusion attempts and data breaches being daily occurrences. As the risk of single-factor authentication becomes apparent, the role of additional layers of security, such as biometric technology is rapidly becoming a necessity. Multi‐factor authentication provides an opportunity for users to shore up access rights and reduces the prospect of a potential data breach. The concept is not new ‐ the term two-factor authentication (2FA) has been used since the nineties to describe a second authentication method, often a hardware token that could generate a one‐time password. However, as technology has evolved and personal technologies such as smartphones have become more sophisticated with new features such as push technology, it has become much easier, not to mention more secure, to have two or more forms of authentication. Enabling multi‐factor authentication is a critical step in providing robust security, and in an ever-evolving security landscape, it is quickly becoming a necessity. What was formerly just a novelty, biometric technology is quickly becoming a modern-day requirement in providing an edge in the battle for secure data.[/vc_column_text][/vc_column][/vc_row] ### The new economy of subscription software [vc_row][vc_column][vc_column_text]Before SaaS, enterprise software developers could focus on what they do best – developing great software that answered the business challenges in their target market. There was a purchase price for the CD download, plus periodic updates offered wholesale to all purchasers. It was a one-size fits all model for a mass marketplace, with simple price and volume economics. In today’s online marketplace, SaaS has shifted software vendors (ISVs) onto a new model based on subscriptions accrued from customers over a longer period of time. Software developers may know how to make great software, but it is a different challenge to make it a commercial success in this environment. It is much simpler to expand distribution of a cloud-based service than a physical product, and also more straightforward to connect vendors with customers to deliver maintenance and customisation that meets demands. But therein lies the big challenge for ISVs in the new economy, which has both highlighted and inflated the burden of application integration in the overall cost of customer acquisition. A game without winners With a growing number of SaaS applications being deployed in organisations across all industries, ISVs are expected to make sure their product not only works, but also works well with other applications to create a fully integrated platform. 
The 2019 Okta report confirms that companies the world over are increasingly selecting technologies that ‘prioritise interoperability, automation and offer a broad range of functionality.’ When this is the case for each and every customer, the software vendor is forced to make a series of decisions. They need to consider which customers they make the product available to – a few big fish or many little fish? They also need to decide how they spend limited time and resources; is it meeting the afore mentioned customer ‘interoperability’ needs or focusing on developing their core product? It is a game without winners. Whichever path they choose, they’ll lose by either disappointing existing customers or spending time on these customers’ demands at the expense of acquiring new ones and improving the core offering. Allowing software developers to focus on their core skill set by employing specialist integrators adds an expensive and time-consuming layer to the process of software development. On the other hand, embedding integration – however it is achieved – has the potential to augment the value of the software to enterprise customers and be reflected in the price that they are willing to pay. The knock-on impact on commercial viability of the software after all of this is taken into account can affect how – or even whether – the product is brought to market, and further influence which customers are targeted for sales. Creating a sticky situation With rise of the integrated enterprise and the Internet of Things, time and money spent on integrating APIs for each individual customer can easily spiral, particularly as the architecture evolves over an extended period of a subscription contract. Furthermore, the task of supporting all these different versions into the future becomes onerous, with the potential of making the software’s business model untenable. Our own conversations with ISVs have highlighted loud and clear that this is a critical dilemma. While the old purchase model could deliver a cost-plus return per customer unit, subscription models are less able to guarantee the same return. As a result, SaaS companies are searching for ways to make their product ‘stickier’ with customers, decreasing customer churn to secure more revenue over an extended period, and reducing the burden of integration with an ever-growing range of applications. An integration platform as a service (iPaaS) allows software vendors to achieve exactly that. By building universal integration capability into their own application, ISVs can offer a new kind of one-size fits all model for the subscription economy. At the same time, without the need to employ specialist integration capabilities within their business, the software vendors can save on development costs and still deliver the most comprehensive, premium value product possible. Software developers can integrate a white-labelled iPaaS solution into their own software that provides complete interoperability with any existing architecture, as well as any applications that may be added in the future. In this way, ISVs can focus on what they do best and spend less time on the maintenance and integration of an increasingly complex portfolio of customers. The lose-lose situation can be flipped to a win-win for ISVs with a simple, effective solution to subscription economics.[/vc_column_text][/vc_column][/vc_row] ### Chisel up Your Role with Augmented Analytics [vc_row][vc_column][vc_column_text]Are you considering an advanced analytics solution? 
Then it's natural to wonder exactly how you will benefit from it. Business owners have artificial intelligence and augmented analytics to thank: business users are now much closer to their data and, crucially, to actionable insights. The wall between the questions business users ask and the answers the business can give has come down, making room for far greater precision. The benefits of augmented analytics are numerous. Whether you have just opened a new venture or have been running a business for years, putting an advanced analytics solution to work can lift your profit margin further. When you can plan and forecast with more confidence, you can also make sound business decisions on time. You can identify opportunities, solve problems more effectively, and share data so that everyone in the organisation is equally informed and working from up-to-date information. With the right self-serve tools, you can also build a more data-literate business environment, with smart data visualisation guiding users through the planning process. Augmented analytics has changed how users approach data. Non-technical users can now engage with augmented analytics solutions directly, asking questions and getting answers in moments, which cuts their reporting time and improves their performance. Meanwhile, technical analysts are freed to take on more complex queries and data science projects. In short, augmented analytics can help many different types of users across the business. Freed up to tackle problems that were previously hard to crack, data scientists can produce more advanced output. Instead of churning through repetitive reports and answering simple queries, a data scientist can focus on harder problems with AI and machine learning. This higher-value work gives the company an edge over the competition, and it is a win for everyone involved in the business. It doesn't end there. Marketing teams stand to improve their daily operations with augmented analytics too. From brand managers to chief marketing officers, everyone under the marketing umbrella has typically depended on an analytics team for every level of data reporting. Before augmented analytics arrived, marketers generally had to rely on third parties, whether for one-off questions or in-depth planning. That was not only inefficient but also expensive. The emergence of augmented solutions has made this far easier, putting control back in the hands of marketing users. Sales teams can benefit just as much. With in-depth, automated analysis available on demand, augmented analytics shortens the decision cycle, a key benefit for any sales team, and gives them direct access to whatever data they need. Salespeople are expected to be agile enough to make prompt, on-the-spot decisions, and a responsive analytics solution lets them pivot quickly. Sales executives no longer have to wait for long-term reporting.
Answers are available as soon as they're required, so teams can steer accordingly. Augmented analytics makes improvements happen almost instantly rather than over days or months. Any business intelligence platform will offer dashboards, but an augmented analytics tool uses machine learning and artificial intelligence to explain what is really going on in the dashboard data. With a brand health study, for example, you can go deeper than a plain rundown of standard metrics. There is little doubt that the sooner you embrace the technology, the better your team will be able to capitalise on growth opportunities. According to Allied Market Research, the global augmented analytics market is expected to grow at a significant CAGR from 2018 to 2025. The growth is fuelled by the rising need to democratise analytics, growing awareness among enterprises of how to use fresh streams of data from various sources in innovative ways, and mounting demand to make work easier for citizen data scientists as well as business users. On the other hand, security concerns around critical data across industry verticals and the slow adoption of advanced analytics solutions in less developed regions have curbed growth to some extent. However, increasing investment in bots has largely offset these factors and created multiple opportunities in the segment. To conclude, the augmented analytics market is expanding rapidly, and over the next few years it is set to grow further.[/vc_column_text][/vc_column][/vc_row] ### Can cyber insurance offer comfort anymore? [vc_row][vc_column][vc_column_text]According to Orbis Research, the cyber insurance market is expected to reach $17.55bn by 2023. Ten or fifteen years ago this industry didn't exist. Such are the times we live in that there is now money to be made, and lost, on cyber security insurance. And times are tough. Breaches measured in terabytes are becoming commonplace and companies report continuous, prolonged attacks, day in, day out. 20% have experienced attacks daily, a fifth weekly and a third monthly. What's more, some 57% of companies have reported a data breach as a result of attacks in the last year. That's hard to swallow in a GDPR world. But it's not surprising when you learn that in Europe, two thirds of companies believe their networks are susceptible to attack. It's a natural state, and although companies are spending somewhere between 30 and 40% of their security budgets on new forms of AI solutions, they rely heavily on their security vendor and, unfortunately for some, manual troubleshooting to keep them safe. Is it any wonder, then, that they are turning to insurance? Especially given that half of attacks cost between $500,000 and $10m, and the average attack represents $4.6m in losses and around £100,000 to win customers back after a breach. Yet the current Mondelez case highlights that cyber insurance could leave companies with a false sense of coverage. When NotPetya struck, it caused hundreds of millions of dollars' worth of damage. But it was deemed an "act of war" rather than just a cyber attack and therefore not eligible for a payout, under the 'war exclusion' clause. Looking at the market as a whole, these types of clauses in effect protect the insurer from highly destructive global attacks.
And with nation state attacks become more prevalent and highly effective in stealing both Intellectual Property (IP) as we amass data on the public at large, there is often enough uncertainty to refuse the claim. In addition, some policies are quite narrow in their scope and cover only costs related to direct loss of customer data such as credit check costs and associated legal bills. They don’t cover all the costs incurred, which, as we have already established, can be several times bigger and more permanent than the direct costs alone. Sadly, for companies in this position it’s likely to take years to fight cases in court and finalise if the insurance should be upheld. So, if you should ever find yourself in that position, what do you do in the meantime? Here’s our 9-point checklist. You have to get back to basics and prioritise your own security. Make sure that the plan you have isn’t treating security as an add-on. You can longer be in a position of waiting until you’ve been hit with an attack to beef it up. It has to be built it into the very fabric of your company’s foundation. If you do it from the start you are far more likely to build an approach that can be scaled and focussed.   Scaling and automation are important because the threats are always changing and are increasingly complex. You cannot leave yourself in a position where it’s a scramble to mitigate multiple new threats as they appear. You have to be ready with a solution that can turn up the dial automatically, at a moment’s notice.   Likewise, if you have products and services you sell then baking in security at conception will be paramount to allay attacks in the future. It’s also proving to be a differentiator and a powerful marketing tool. It says to people that you can trust us and builds your brand.   Think about your applications and how important they are to the fabric of your company. They will no doubt be intrinsic to your operations and if they failed your employees wouldn’t be able to work, your suppliers wouldn’t be able to get good to you and your customers wouldn’t be able to buy. That’s why you have to install comprehensive DDoS and application security protection. It’s the only way to optimise business operations, minimise service degradation and prevent downtime.   Don’t overlook mobile applications as part of this as they are pervasive in modern business, and particularly vulnerable. In fact, a fourth of execs say mobile apps are targeted daily. It’s not helped by the fine balancing act of managing mobile access to the cloud which is still proving to be a headache be it private, public or hybrid.   Related to this is the need to better manage permissions. This holds particularly true for organisations operating in or migrating to public cloud environments; excessive permissions are the number one threat to cloud-based data. 75% of execs say that they regularly see unauthorised access to their cloud from employees that are using slack credential checking. Permissions exist for a reason, so poor permission practice allowing too many individuals access just can’t happen. If you can, put in place a specific process that can’t be by-passed no matter how busy people are.   It’s therefore always a good idea to think about your employee behaviour and what you can do to get them thinking secure at all times – on the train reading a confidential document, tailgating on the entry barrier, using in-secure wifi connections are all things that need to be eradicated. This can’t be emphasised enough. 
But employers should also educate their employees about common cyberattack methods, like phishing campaigns and malware, and to be wary of links and downloads from unknown sources. This may sound simplistic, but it’s often why an emergency response team has to get involved.   It’s also a good idea to use multi-factor authentication across your network. Again, this is low-hanging fruit, but it bears repeating. Requiring multi-factor authentication (MFA) may seem like a pain, but it’s well worth the effort to safeguard your network. Without MFA, passwords are simply too easy to crack.   And, as always, let the security experts handle the cybercriminal experts. As attacks continue to get more complex and more effective, leverage the people with the latest tools and training. Don’t hesitate to engage managed security experts in your quest to provide a secure customer experience. The security landscape will always change, and the threats will alter. But with a check list like this you can ensure you have the bases covered with the latest techniques, and ensure the cloud, the apps it runs, and your employees and customers are always protected.[/vc_column_text][/vc_column][/vc_row] ### How Edge-AI will inform the smart environment of the future [vc_row][vc_column][vc_column_text]Cloud computing and the Internet of Things (IoT) – two of the biggest technologies of the last few years, and two trends on a collision course set to completely transform the world around us. Cloud platforms and services are already well established, both from a consumer and a business perspective, having impacted many aspects of our daily lives. New cloud innovations are constantly emerging, but it’s when we add IoT into the mix that things start to get really interesting. The IoT industry is set to experience explosive growth over the coming years. Analyst firm IDC predicts that worldwide IoT spending will reach $745 billion in 2019, a significant increase from the $646 billion spent in 2018. This growth will drive a transition to a world of ambient technology, a future in which unseen enablers exist all around us – central to how we communicate and function day to day, and accessible to anyone. This growth will result in exponential increases in the amount of data being created, analysed and stored, and so cloud service providers will have a key role to play in engineering this view of the future. However, is cloud alone the best answer to the scalability and management challenges that the transition to the world of ambient technology will undoubtedly bring? And if not, could Edge-AI prove to be the missing piece of the puzzle? A step too far for cloud? While it’s true that IoT presents a huge amount of possibilities for businesses in all industries, it also creates some significant technical issues that will have to be overcome. These primarily revolve around coping with growing data quantities as IoT continues to grow. One of the great benefits of cloud platforms is that they enable users to manage and work with more data than they ever could before – from the biggest enterprises right down to the smallest start-ups. The issue is that although the centralised cloud model is sufficient for today’s IoT ecosystems, the infrastructure will quickly become overwhelmed when billions of devices are involved. 
The issues associated with managing the control of potentially billions of smart devices in the cloud, combined with ongoing concerns surrounding the security and privacy of the data being transmitted, suggest that a wholly cloud-based model may not scale as our interactions with AI devices become more pervasive and intricate. As well as requiring huge investments, the complexity of the networks and the sheer volume of data involved will also increase the risk of the user experience being degraded by network connectivity or latency issues. This will be particularly true in high-density areas such as cities that will become inundated with ‘smart’ sensors and connected devices. However, there is a solution. The ability to reduce the amount of data transmitted, to provide higher-level context to the user experience, and to use smart devices on the odd occasion the internet is unavailable, are all on offer with the shift to Edge-AI. Moving to the edge Edge computing refers to the practice of decentralising data processing by moving it closer to the source of the data – in this case the connected device. While artificial intelligence and machine learning computation is often performed at large scale in datacentres, the latest processing devices are enabling a trend towards embedding AI/ML capability into IoT devices at the edge of the network. AI on the edge can respond quickly, without waiting for a response from the cloud. There’s no need for an expensive data upload, or for costly compute cycles in the cloud, if the inference can be done locally. Some applications also benefit from reduced concerns about privacy. When it comes to the issues of preserving data security and reducing the quantity of data being transmitted in the world of IoT, Edge-AI devices pose a viable solution. For example, it removes the latency issues associated with cloud-based control. Although AI in the cloud is thought of as a huge collective intelligence, AI at the edge could be compared to a hive mind of many local smaller brains, working together in self-organising and self-sufficient ways. Essentially, intelligent insights generated from the data will be available in real-time on the device itself. Edge-AI also enables user experience designers to personalise the interaction with remote AI entities through the fusion of sensor data, as well as enjoying full functionality without the support of a connection, meaning service disruptions will be reduced. Ultimately, the issue we will face over the coming years is that the cloud will unlikely be able to keep up with the growing speed of the IoT industry or the diverse needs of IoT applications. Total cannibalisation of the cloud may not be the ultimate endpoint, but Edge-AI will enable a rebalancing of compute utilisation from core to the edge, where algorithms will learn from their environment to make locally-optimised decisions in real time. The future of IoT and the transition to the world of ambient technology will likely depend on decentralising networks in this way, giving industries and businesses the tools to capitalise on the data being collected, rather than being overwhelmed by it.[/vc_column_text][/vc_column][/vc_row] ### Overcoming Cloud-vendor lock-in with Serverless [vc_row][vc_column][vc_column_text]The digital and cybersecurity skills shortage in one of the most well-documented issues organisations are faced with today. 
Recent reports estimate that there are currently 600,000 unfilled tech jobs in the UK, costing the economy a staggering £63 billion every year. As a result of the skills shortage, organisations are heavily reliant on digital tools to bridge the gaps these unfilled roles create, and one of the most widely adopted technologies is the Cloud. An estimated 88 percent of UK organisations now use some form of Cloud, a figure that highlights how popular the technology has become. By moving to the Cloud, organisations can embrace the many benefits it offers and, because data can be managed by Cloud vendors rather than by in-house teams, it can also help to bridge the skills shortage. However, one issue organisations increasingly face is Cloud vendor lock-in. This is when a company becomes trapped with a specific Cloud vendor because transitioning data and applications from one vendor to another creates challenges and costs that are difficult to overcome. Cloud-vendor lock-in  With cybercrime such a huge problem for businesses today, and with data needing to be available 24/7, it is not surprising that organisations moving to the Cloud dislike the idea of putting all their confidential information in one place. If a Cloud provider suffered a security breach or downtime, all of the company's data could be affected, with detrimental consequences. As a result, organisations often feel more comfortable using multiple vendors and taking the features from each that best suit their needs. However, this is an expensive approach: it means investing in multiple Cloud vendors and, of course, having staff trained in each platform, which can be difficult given the digital skills shortage. Another challenge with this strategy is that Cloud vendors would rather an organisation work with just one provider and will encourage them to do so by offering preferential rates, which the organisation may be inclined to accept even if the vendor does not entirely satisfy its needs. If an organisation opts to do this but a few years down the line decides to move to another vendor that better suits its requirements, it may find that its data is no longer compatible with other Cloud vendors. This is an approach many Cloud vendors take to lock in customers and make it harder for companies to move to a competitor. It can leave organisations feeling trapped and tied to a specific vendor, and may deter them from switching out of fear of losing data or because the process is time-consuming and costly. This begs the question: what can be done to avoid Cloud vendor lock-in while still embracing the benefits the Cloud has to offer? Avoiding Cloud-vendor lock-in with Serverless The good news is that new tools are now available to help organisations overcome Cloud vendor lock-in. These open-source solutions are compatible with all the leading Cloud vendors, including Microsoft Azure, Amazon Web Services and Google Cloud Platform, and they allow organisations to move easily between them, selecting the best features from each to suit their organisation.
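As a rough illustration of how this kind of portability is achieved, the sketch below keeps the business logic in one plain Python function and wraps it in thin, provider-specific handlers. The function and handler names are hypothetical, and the signatures follow the common conventions for AWS Lambda and Google Cloud Functions; the point is simply that the core logic stays portable while only the thin adapters change between vendors.

```python
# Sketch: keep business logic cloud-agnostic; only thin adapters know about the vendor.
import json

def process_order(order: dict) -> dict:
    """Core business logic - no cloud SDK imports, so it can move between vendors."""
    return {"order_id": order.get("id"), "status": "accepted"}

# AWS Lambda-style adapter (event/context signature).
def aws_handler(event, context):
    order = json.loads(event.get("body", "{}"))
    return {"statusCode": 200, "body": json.dumps(process_order(order))}

# Google Cloud Functions-style HTTP adapter (Flask-like request object).
def gcp_handler(request):
    order = request.get_json(silent=True) or {}
    return json.dumps(process_order(order))

if __name__ == "__main__":
    # Local test without any cloud at all.
    print(process_order({"id": "A-123"}))
```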
These solutions can also help organisations overcome digital skills shortages as they don’t need to hire personnel skilled in all the major Cloud platforms, which can of course help organisations reduce costs. Cloud vendor lock-in is a very real issue faced by many organisations today, however, fortunately, new tools are becoming available to help tackle the problem. These solutions are built of Serverless, open-source technology and they run natively with all the leading Cloud vendors. This allows organisations to adopt the best Cloud applications for their organisation and help cut down on costs.[/vc_column_text][/vc_column][/vc_row] ### Why is Accessibility Important for a Website and How to Achieve it? [vc_row][vc_column][vc_column_text]As you can see, the benefits of ensuring web accessibility are many. Apart from ensuring compliance to the accessibility regulations put forth by the authorities, web accessibility is an important consideration for business websites too to ensure that their products and services reach to a wider audience by covering people with disabilities too. This is considered to be very important thinking from the humanistic perspective too as it makes sure that things are not shut for the disabled users. Ensuring complete web accessibility for your website offers you with many unique benefits. Most importantly, as noted above, it will expand your potential reach. You can find the math so evident here. If a greater number of users gets access to your site, then you naturally grow your user base. This will put you a step ahead of the competitors who may not have taken steps to ensure accessibility. Provided accessibility, you can get benefitted of all the visitors of all category, including disabled getting access to your website features. For this, the primary consideration is for making your website accessible to all in terms of its usability and overall design. Adding to it, you may also make your website more flexible and future-proof. More importantly, it is also critical to note that many countries have passed specific laws related to technology accessibility to all. To be legally compliant, you need to match the specified standards. Ensuring web accessibility compliance If you are concerned about not being compliant with web accessibility standards yet, thankfully, you are not alone. Covering its various aspects on the go, web accessibility initiative is running since 1997 in order to make the internet the best place for all. We can see many individual initiatives also as how WordPress has done its own Accessible projects for their development platform. There are many other community-driven initiatives, too like A11Y, which provides resources and guidance in order to make accessible websites. An overview of web accessibility terms While making or restructuring a website or web-based application, you need to analyze the accessibility requirement at the first point and also ensure compliance through the development process. If you can identify the accessibility problems at the first point, it becomes much easier to address those on the go. There are also many evaluation tools to help with this. In fact, a single tool alone cannot determine whether the website meets all the accessibility requirements. For this, an audit based on the human evaluation by knowledgeable experts is required. It is also important to run a conformance evaluation to determine how well the web pages and web applications meet the standards of accessibility. 
For this, the conventionally used conformance testing mechanism is the Website Accessibility Conformance Evaluation Methodology by W3C (WCAG-EM). WCAG (Web Content Accessibility Guidelines) conformance can be done effectively with the help of this evaluation methodology. The WCAG-EM reporting tool will get you the detailed evaluation report as per the WCAG standards. It will also help you follow the WCAG-EM steps and generate the reports based on the accurate inputs you provide. Why is it important? Before considering why is accessibility important for a website, let's have a quick check as what the word ‘accessibility' covers. Simply, accessibility considers everyone's ability to access internet content and tools irrespective of their age, gender, devices used, and abilities or disabilities. As per Wikipedia definition, "Accessibility refers to custom designing of products, systems, services, devices, or environments to be used by people who suffer from disabilities." So, we can assume that Web Accessibility is the best way to ensure an inclusive approach to your website development by considering all types of users in mind. To put it in other terms, we can say that: Web accessibility simply refers to the ability of people having disabilities also to use the web. People with disabilities should be able to understand, perceive, navigate, and engage with the web elements, plus they can also contribute to the web. Going a step ahead, web accessibility also should ensure benefits to those with conditional, situations, or temporary disabilities too as old age, medical conditions, slower internet access, etc. Types of disabilities To understand the situation better, a developer needs to know what difficulties are caused by various disabilities in accessing web content. Disability is the condition of body or mind, which makes it difficult for an individual to perform certain activities. There are three primary modes of disability as: Permanent disability: A person is completely disabled, which will last for a lifetime as blind, deaf, etc. Temporary disability: Means a person is suffering from a temporary condition of physical or cognitive disability which hinders him or her from acting normally for the time being. Situational or conditional disability: This is the inability of a user to perform things normally in an adverse situation like slow internet connection. Making web accessible Making your website accessible to all depends on how you develop and design the web elements. Here are some tips. Use alt-tags: Alt tags or alternative HTML attribute are used to describe an image or graphic in the HTML. This is something like using the code as: Even while you use some visual tool which hides the HTML code while building a website, you can effectively make use of the HTML to enter proper image description. In fact, nothing will happen if you leave the (alt=” “) empty in the finish of the web page, but when someone uses the screen readers to understand the content, they won’t get any clear picture about what your page is all about. Similar to using alt tags properly, you may also take care of using better tables, prepare the keyboard navigation with the special needs of disabled people and aged people in mind, Use default HTML tags, proper captions to images and media, maintain the text transcripts of video contents, and also add closed captions for the media. So, always have the need for web accessibility in your mind while building websites or making modifications to your web pages. 
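To make the alt-text advice above concrete, here is a small sketch of an automated check, assuming Python with the BeautifulSoup library; it simply flags img elements that are missing a meaningful alt attribute. A script like this supplements, but does not replace, the human evaluation and WCAG-EM conformance review described earlier.

```python
# Sketch: flag <img> elements with missing or empty alt text.
# Assumes `beautifulsoup4` is installed (pip install beautifulsoup4).
from bs4 import BeautifulSoup

def images_missing_alt(html: str) -> list:
    """Return the img tags in the page that screen readers cannot describe."""
    soup = BeautifulSoup(html, "html.parser")
    return [str(img) for img in soup.find_all("img") if not (img.get("alt") or "").strip()]

if __name__ == "__main__":
    sample = '<p>Sales chart</p><img src="chart.png"><img src="logo.png" alt="Company logo">'
    for tag in images_missing_alt(sample):
        print("Missing alt text:", tag)  # flags the chart.png image only
```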
Doing this will ensure that you are building it for everyone from the humanitarian perspective and also that you reach to more by covering the minorities too from the business promotion point of view.[/vc_column_text][/vc_column][/vc_row] ### 56% of people fear public speaking – How it can hold us back in business [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/338855980/4b46991bd2[/embed] [/vc_column_text][vc_column_text]I believe public speaking fear can hold us back in business.   It’s often how someone presents themselves that prompts an audience to take action or not. It’s often how someone pitches a business that prompts a client to buy or not. Building trust and connecting with other people is important for progression. When we connect with others and we’re trusted we’ve got more chance of getting the job, of being heard in a presentation, of winning the pitch, of being visible in a meeting. People often trust others who are natural, relaxed and authentic.  When we get an idea of the real person behind the content we connect to the content. The problem is that when you are faced with a sea of eyes looking at you and hanging on to each word you say, you might find it difficult to be natural, relaxed and authentic. You might feel scared. Your body language and voice could reveal a lack of confidence, a lack of credibility and give people completely the wrong impression of you.  Consequently you could miss out on opportunities. We are human beings made up of vulnerabilities and there’s nothing wrong with being scared of presenting. I used to be and I know how it feels. There’s nothing wrong with fear. In fact it’s useful. However learning how to manage this fear can make a big difference to your outcomes. I believe learning how to speak in public helps to define us, helps other people to understand our viewpoints, to hear us and see who we are.  If we avoid this, we are always open to other people’s assumptions and misinterpretations. A presentation is an opportunity for you. Sometimes we just need a little help and I’ll share with you my three favourite tips 1- CONVERSATIONAL: Keep it conversational.  Imagine you’re talking to a friend. Remember you are speaking to other human beings, however formal the environment.   Many people adopt a robotic persona when they present because they get nervous or they take themselves too seriously.   An audience will always respond better and listen more if you’re personable and reveal human emotion. Creating a closer connection with your audience will also feed your confidence while you speak. 2- POWER POSE: I love Amy Cuddy’s popular TED talk encouraging you to do some power posing before an event. By this, she means shoulders back, chest out, imagine a piece of string attached from the top of your head to the ceiling and get yourself into a powerful position before any challenging situation – interview, networking, presentation or meeting. Her research indicates that, by doing this for two minutes, you can increase your testosterone levels and walk into challenging situations feeling more powerful. Many years in this industry tells me this works!  One thing I would say, practice in the mirror some time so that you don’t end up imitating a pumped-up superman or a crazed superwoman in the interview! 3- SIMPLICTIY and STORY: Keep it simple and use stories.  In every day life, we don’t speak in bullet points do we?  
For example, when we go on holiday, we don’t come back and say… France Skiing Fun Ate good food If I spoke like that, you’d probably think I was really boring to go on holiday with and unnatural.   So why do we think speaking at people in bullet points, during a presentation, or pitch, is a good idea?  It’s not.  If you can use stories in your presentations and bring it to life with analogies and examples, you’ll take your audience on a journey and they are far more likely to sit up and listen.  A story provides emotion – this is what people connect with and that’s why people act on what you say. Plan your stories rather than a list of facts. Change the fear into an excited and positive energy! Give it a go. Speak Up & Be Heard is out now, priced £10.99. To find out more go to: https://www.amazon.co.uk/Speak-Up-Heard-Lindsay-Maclean/dp/199933650X/[/vc_column_text][/vc_column][/vc_row] ### Fixing Bottlenecks For Your Cloud Analytics [vc_row][vc_column][vc_column_text]Start tracking these KPIs for a successful migration of your data infrastructure to the cloud As enterprises are going digital, the amount of data they have to handle has exploded. New uses cases like machine learning and artificial intelligence are forcing them to re-think their data strategy. It’s a new world where data is a strategic asset - the ability to manage torrents of data and extract value and insight from it is critical to a company’s success. A recent survey of 2,300 global IT leaders by MIT Technology Review Insights found that 83% of data-rich companies are prioritizing analytics as much as possible to gain a competitive advantage. That’s where “data lakes” come in, as the underlying technology infrastructure to collect, store and process data. The constraints of legacy data infrastructure  The original term "data lake" comes from the on-premise Hadoop world. In fact, many enterprises today still run their analytics infrastructure on-premise. But despite years of work and millions of investment into licenses, they've got little to show. And now this legacy analytics infrastructure is running into four major limitations. Limited data. With on-premise infrastructure, data is often distributed across many locations. That creates data silos and a lack of a comprehensive view across all the available data. Limited scale. Scaling your infrastructure to add new capacity takes weeks and months, and is expensive. Limited analytics. The system runs canned queries that generate a descriptive view of the past. Changing these queries and introducing new tools is another time-consuming process. Limited self-service. IT is the gatekeeper of all data. It's a static world, where the consumers of the data have little to no control over changing the queries. It's this set of constraints that are driving enterprises to adopt the cloud for their data and analytics infrastructure. Shifting your data to the cloud The "new" data infrastructure combines cheap storage of cloud-based data lakes and powerful analytics capabilities of cloud warehouses. Cloud-based data lakes combine simplicity with flexibility. It’s a simple and scalable way to store any data, while giving you flexible choices for security, geography and access rights. Putting your data into a single place while keeping tight control means breaking down data silos. Data Lakes also offer pricing flexibility, with different tiers to reduce the cost for storing historical data. With data in a single place, you still need to join it to generate new insights. 
That's where cloud warehouses come in, to run complex analytical models and queries across terabytes and petabytes of data. Cloud warehouses cost an order of magnitude less than on-premise alternatives. Popular cloud warehouses include Amazon Redshift, Google BigQuery and Snowflake. Unlike on-premise products, cloud warehouses can scale on demand to accommodate spikes in data and query volume. Cloud warehouses also offer the flexibility to let data workers use the tools of their choice. And because cloud warehouses are both performant and scalable, they can handle data transformation use cases in-database rather than in some external processing layer. Add to this the separation of storage (in your data lake) and compute (in your warehouse), and there are few reasons to execute your data transformation jobs elsewhere. As a result, much of the processing of large data sets today happens within cloud warehouses, based on traditional SQL. The primary benefit of SQL is that all data professionals understand it: data scientists, data engineers, data analysts and DBAs. Cloud warehouses are where data-driven enterprises run their "data pipelines". And by giving people access to data with easy-to-use self-serve analytics tools, enterprises remove fences around their most valuable data and put it to work where it's most productive. New platform, new bottlenecks In the same MIT Technology Review survey, 84% of respondents confirmed that the speed at which they receive, analyze and interpret data is central to their analytics initiatives. Yet when first embarking on the cloud journey for their data infrastructure, teams run into a number of bottlenecks, ranging from slow queries to data pipeline failures and increasing load times. A frequent reaction is to throw more capacity at the problem, but that adds to the cost and often doesn't solve it. Going into a migration with high hopes, the bottlenecks can be frustrating. What can be even more frustrating is that the root causes are not always clear. The reality is that data pipelines and queries fail for a number of reasons. Unexpected changes at the source can break the workflows that extract the data. A delay in data loads can cause a backlog with downstream effects on data availability. Poor SQL statements can consume excessive resources and slow down an entire warehouse. Defining data infrastructure and query KPIs  As the DBA or data engineer in charge, you want to be the first to know when these bottlenecks happen, or even better, before they happen. This is where your new KPIs come in. For starters, I recommend three basic KPIs when running a cloud warehouse: the 95th percentile of load times; the 95th percentile of runtime for your workflows; and the 95th percentile of execution time for your ad-hoc queries. Just monitoring these three simple metrics will uncover bottlenecks that impact your SLAs. For example, a sharp increase in rows in a table can lead to an increase in query execution time. From there, you can take action to fix the root cause of the slowdown. A common challenge is locating these KPIs, i.e. the metadata and logs that provide the relevant information. That's where a product like intermix.io comes in. It provides these KPIs out of the box via custom dashboards. You can analyze query performance at both the individual and aggregate level, keep an eye on critical workloads, and drill down into the data to look at historic patterns.
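As an illustration of the kind of monitoring described above, the following sketch computes a 95th percentile over recent query execution times pulled from whatever log or system table your warehouse exposes. The durations list and the 15-minute threshold are hypothetical stand-ins, not part of any specific product, and the percentile helper assumes Python 3.8 or later.

```python
# Sketch: alert when the p95 of ad-hoc query execution time breaches an SLA threshold.
# The sample durations stand in for values read from your warehouse's query log
# (a system table or an exported metadata feed), expressed in seconds.
import statistics

def p95(values: list) -> float:
    """95th percentile using the inclusive method available in Python 3.8+."""
    return statistics.quantiles(values, n=100, method="inclusive")[94]

def check_query_sla(durations_seconds: list, threshold_seconds: float = 900.0) -> bool:
    """Return True if the p95 execution time is within a (hypothetical) 15-minute SLA."""
    return p95(durations_seconds) <= threshold_seconds

if __name__ == "__main__":
    sample = [12.0, 35.5, 48.0, 51.2, 300.0, 40.1, 22.9, 75.4, 610.0, 18.3]
    print("p95 seconds:", round(p95(sample), 1))
    print("within SLA:", check_query_sla(sample))
```

Tracking the same percentile for load times and workflow runtimes gives you the other two KPIs; a sustained upward drift in any of them is usually the first visible sign of the bottlenecks described above.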
Follow the process Migrating your data infrastructure is as much about the process as it is about technology. Simple tasks like data pipeline execution can require complex migration steps to ensure that the resulting cloud infrastructure matches the desired workloads. Much of the hype surrounding data lakes and could analytics is justifiable. But careful planning and anticipation of bottlenecks is essential to determining success.[/vc_column_text][/vc_column][/vc_row] ### How Virtual Reality Is Changing the Way We Train Doctors [vc_row][vc_column][vc_column_text]Training healthcare professionals comes with unique challenges. How do you ensure that medics are up-to-speed with the latest technology and techniques when there’s often limited access to new equipment, and it’s not possible to practise on real-life patients? Virtual Reality (VR) could provide the answer. The VR experts at Immerse are creating detailed simulations to upskill healthcare professionals. And immersive, effective training doesn’t just mean better-trained employees, it also means increased engagement, improving staff retention in a sector that’s struggling with understaffing. How tech is reshaping healthcare There’s a technology revolution underway in healthcare training. Cutting-edge tools and techniques are changing the way training is delivered, from students watching live-streamed operations by experienced surgeons, to nurses learning how to deal with aggressive patients through VR simulations. And VR is being used in other healthcare contexts too; it’s been shown to be effective as pain relief, redirecting a patient’s attention away from treatment by engrossing them in an alternative experience or game. The possibilities of VR for healthcare are exciting and far-reaching. And this technology couldn’t come at a better time: the NHS is the UK’s biggest employer, but with A&E admissions continuing to rise and Brexit putting pressure on staffing, cost-effective training and staff engagement and retention are critical. X-ray training Consistency is key when it comes to training medics, but ensuring that all staff are getting the same experience and reaching the same standard when the workforce is huge and dispersed is a challenge. Plus, access to state-of-the-art equipment is often limited, making hands-on training difficult. The right use of VR can mean that healthcare employees are trained on new techniques and technologies in an efficient, cost-effective way. The VR training developed by Immerse for GE Healthcare is a great example of this. CTCA scanning is a non-invasive, x-ray based technique for detecting patients at a high risk of acute heart problems. However, it’s a relatively new technique and only a small number of professionals are trained in it. And because it can only be performed using modern CT equipment, there are currently only 228 scanners capable of doing it in the UK. Immerse developed a training program for radiographers that simulated hands-on experience of completing CTCA scans. The simulation offers various play-through modules and patient types, with first-time users given a brief tutorial, and then directed to start in the fully guided 'learn' mode with the 'textbook patient'. Users can make mistakes in a safe environment, and they have 24/7 access to the tool, reducing the need for onsite training. 
Medical professionals can have the same experience no matter where they’re based at whatever time they’re available, and employers can be sure that all staff are receiving the same level of training. The engagement story Good VR is truly engaging and immersive: the participant is involved completely in the simulation, with little chance for distraction. Gamification can add another element of engagement, with competition elements encouraging staff to revisit training modules to hone their skills and improve their scores. And with an increase in engagement comes higher levels of information retention, which is particularly important in a sector like healthcare. Radiographers’ reactions to Immerse’s CTCA training show that the VR scenarios were both engaging and effective: many in-training radiographers reported on the ease-of-use and the detail of the simulation, with trainees saying that they felt 'right at home' after spending time in the simulation. The engagement benefits also reach beyond the training itself. Staff who find their company’s training enjoyable show higher levels of engagement with their role and with their organisation in general. Delving into data The effectiveness of training programs has historically been hard to measure, but technology like this changes things completely. VR training programs can capture huge amounts of data, helping employers to track progress, measure outcomes, and tailor future training. The depth and complexity of the data available is ever-increasing, with the potential to use eye-tracking software, biometric technology and even brainwave monitoring in VR training environments. Data can be collected on both an aggregated and an individual level, helping to personalise training to individual users. This has exciting implications for healthcare training. In the CTCA training program, for example, all actions within the VR space are tracked in real time, enabling instant review and feedback. In the future, this data capture could mean that the program could be developed into a more formalised assessment tool. New horizons for healthcare training Across many sectors, business leaders are realising the possibilities that VR offers for staff training. Improved training doesn’t just mean upskilled staff, but can also mean better staff engagement and retention. VR also shows true potential in improving both safety and efficiency across the healthcare sector. As the CTCA scanning example demonstrates, using a realistic VR training simulation offers both financial and logistical benefits. It addresses the issues with the current restrictions on training, and provides long term cost saving on sustained training programs. In-situ beta tests at various hospitals are currently planned through 2019. When it comes to VR in healthcare training, the possibilities are endless and there is no doubt expectations will be exceeded.[/vc_column_text][/vc_column][/vc_row] ### How to Streamline Businesses Through Tech [vc_row][vc_column][vc_column_text]Technology can be a headache for businesses owners when they don’t get it right. Poor Wi-Fi can slow down operations and put customers off, unreliable network connections can hinder communications and security breaches can put finances at risk. However, when tech is done well, it can take your business to the next level and make your day-to-day processes go as smoothly as possible. Here are just a few of the ways you can upgrade not only your technology provisions and services, but your business as a whole. 
Switch to a VOIP phone line VOIP, also known as internet telephony or ‘voice over IP’, allows users to make phone calls via the internet rather than with a traditional analogue phone line. The major selling point of VOIP is its cost: it’s cheaper than normal phone lines once everything is set up, and could even mean that you pay nothing at all when making calls – you may even be able to spend hours on a conference call for free! VOIP systems allow voice and video calls to be made from your smartphone, laptop, iPad, tablet or home PC as if you were in the office. A VOIP system can exist wherever you work and gives you all of the functions you may need remotely – calls will be routed through your office contract. In some cases, you will be able to keep your phone number (it depends on the provider if they can swap it over). The cost of your VOIP system depends on your setup, but app-to-app calls are generally free and any calls provided by a business will come with a monthly cost. Improve your Public Wi-Fi Setting up a public Wi-Fi network requires more than just a router and password. You also need to ensure your business broadband package is able to handle the amount of potential traffic that it will encounter from day-to-day. Make sure your internet package offers an unlimited amount of data and very fast speeds, as a slow internet connection will irritate customers and put them off using your service in the future. It’s vital for any open Wi-Fi networks that the right security measures have been put in place. WPA2 should come as standard with most modern routers and is the most up to date Wi-Fi security. You should also consider changing your passwords often, as well as making sure the router is located away from public access and that the strength of the Wi-Fi covers only your premises. It is also worthwhile adding additional firewalls to provide extra coverage. Consider a leased line Whilst standard home broadband makes you share a line with other users nearby, a leased line is a line dedicated to you and your business alone. When sharing a line, the amount of bandwidth available varies depending on how many other people are using the service at the same time, reducing speed at busy times – since leased lines don’t share bandwidth, they don’t have this problem. For small, medium and growing businesses, a leased line can be the most practical and cost effective choice. The bandwidth they provide is symmetric, which means users get identical upload and download speeds – this can be in excess of 10Gbps. Amvia’s Leased Line comparison tool AmviaSearch™ lets businesses see all the best deals available from multiple providers, without having to spend time sourcing them individually. The free to use online tool provides a completely unbiased view of all the providers and saves users an average of 37% on their existing contracted rate – alongside all that valuable time that could be better spent on more pressing business matters! Implement resilience Internet outages can be catastrophic for businesses across a range of industries. If your organisation relies on VOIP telephony, SaaS systems or cloud storage, downtime can have a severe impact on your productivity and even your reputation. Internet failover links can help to mitigate these critical risks – these are secondary internet connections which are setup to ensure valuable service uptime is never lost. 
Relying on a single connection can be risky, but implementing a secondary connection offers peace of mind – your colleagues can remain productive and your customers can continue to receive the services and products they need. For example, a fibre link could be the main connection and a wireless connection could be the backup. If properly implemented, failover solutions will automatically switch from the primary connection to the secondary link, while instantly transferring tasks from a failed internet service to a similar or redundant system. An automated failover system can immediately reroute data and communications, which helps to minimize disruption across the business network.

Install a static IP address

Broadband providers will assign your device an IP address whenever it goes online, letting the network know where the data needs to be sent. IP addresses can either be dynamic or static – in most cases they will be the former, which means that your IP address will change every time you switch on your computer. However, with a static IP address it will constantly remain the same, providing you with a number that is fixed to your device and your connection only. A static IP address will mean less downtime for you, since dynamic IP addresses run the risk of downtime whenever the IP address refreshes rather than providing a stable connection.

Understand today’s security risks

It’s essential that managers understand the variety of threats that a lack of internet security poses to them. Small businesses are often more vulnerable to attack, as they don’t have the same kind of budget set aside to protect themselves as large corporations do. It’s prudent to write a security policy for your business. All employees should be well versed in the policy and be aware of the implications of non-adherence. It’s also useful to have a policy that covers what to do if a breach occurs – if you do fall victim to a cyber attack, you will be glad to have a reference as to which systems need to be closed down and how to retrieve any lost data.

### What Brands Should Know About Millennials in the UK

Millennials are possibly the most enigmatic generation right now. They are highly influenced by all of the latest trends and fashions, although it can sometimes be quite hard to figure out what exactly it is that they want. This can sometimes make them a difficult target for brand managers and marketers. To try to understand this generation more, the team here at Latana surveyed 370 millennials in the UK. Here are some of the interesting facts that we uncovered.

Millennials: The Results

Top 3 Motivations in Life

First of all, we decided to look at the big goals that all millennials have. What drives and motivates them in life? The top 3 results might not be too surprising here. As you might expect, 52% said that they want to spend time with close friends and family, while 50% also listed being happy and having fun as a large target in life. 33% of UK millennials also put taking care of themselves and their own health in their top 3 goals as well.

Spare Time

As part of the survey, the millennials were also asked about their spare time.
Over half (55%) said that they like to spend it with their family; 52% enjoy relaxing in their free time; while 49% play games.[/vc_column_text][vc_single_image image="52218" img_size="full"][vc_column_text]Interests Millennials interests tend to lie in the areas that you might expect from a young generation. 51% are really into their music while just under half (49%) of those questioned said they enjoy movies. In third place, gaming was an interest of 48%.[/vc_column_text][vc_single_image image="52219" img_size="full"][vc_column_text]Spare Time and Technology A large percentage of millennials said they watch TV a few times a week. The exact figure was 64%. Something else that was slightly unsurprising was that 61% said they regularly use social media. When it comes to browsing online on a mobile device, 59% said that this is something else that they do a few times each week.[/vc_column_text][vc_single_image image="52220" img_size="full"][vc_column_text]Millennials Online Have you ever wondered what millennials are looking at when online? We asked them that in the survey as well. 58% watch YouTube for at least an hour a day, while 55% and 41% are spending at least an hour on Facebook and Instagram respectively.[/vc_column_text][vc_single_image image="52221" img_size="full"][vc_column_text]How Millennials Discover New Brands All of this internet usage also has a big effect on how millennials find out about new brands and companies. Even though 50% discover new brands through friends and families, and 45% still discover them through ads on TV, there is still a high percentage that is highly influenced by social media. In fact, around 39% of millennials discover new brands on the likes of Facebook, Twitter, and Instagram.[/vc_column_text][vc_single_image image="52222" img_size="full"][vc_column_text]What Should Brand Managers Take from this Report? If a brand wants to keep up with its audience, then it needs to start listening to all its consumers, millennials included. By improving its consumer awareness and appealing to this younger generation of consumers, a brand can stay at the top of the market by being relevant and aware of exactly what its audience wants. By really understanding these kinds of insights into certain consumer generations, a company. will have plenty of intel into their target market. And that will make it easier than ever before to ensure your brand appeals to them.[/vc_column_text][/vc_column][/vc_row] ### Artificial intelligence goes mainstream? Not before a reality check. [vc_row][vc_column][vc_column_text]It’s widely expected that machine learning and artificial intelligence will follow the trajectory of cloud technology – breaking into the mainstream and impacting our lives significantly. Cloud technology is now widely understood among decision makers and operates in an increasingly mature market - a far cry from the days when the concept of the cloud was only understood by a select few technologically-minded individuals. The Cloud Industry Forum (CIF) confirmed this in 2017, polling 250 IT and business decision makers in large corporations, small-to-medium-sized businesses and public-sector organisations. The research determined the UK cloud adoption rate had reached 88 percent. Yet while cloud technology can now be considered mainstream in most sectors, AI and machine learning still represent unknown quantities to many business decision makers. 
Many of these decision makers work across sectors that could benefit from adopting nascent technology that can streamline business practices and uncover unique data insights. For example, Ocado recently incorporated machine learning to automate the running of its warehouses - streamlining the experience for customers, as well as allowing the organisation to predict demand for products. However, for this technology to follow the trail to mainstream adoption blazed by cloud technology, the tech sector has serious work to do to increase understanding. A good starting point, and indeed a bone of contention within the tech sector itself, would be to address the difference between the two.

Machine learning and AI – what’s the difference?

The interchangeable use of the phrases AI and machine learning is undoubtedly a barrier to understanding - and one that must be overcome before either concept becomes truly mainstream. To this end, it’s important to understand that machine learning is a learning process and a sub-field of AI. In fact, what is commonly described as AI can be separated into three different sub-fields: machine learning, knowledge representation, and algorithmic approaches. Machine learning is best described as a statistical method of identifying patterns in datasets in order to make predictions. Knowledge representation allows a system to encode individual pieces of knowledge, and their relationship with others, to create a knowledge base from which to inform actions. Finally, an algorithmic approach involves using dynamic programming as a method of knowledge gathering.

Machine learning in particular can take place in different contexts, be that supervised learning, unsupervised learning or reinforcement learning. Supervised learning uses labelled data to train a machine to predict outputs based on inputs - for example, the number of ice creams sold as a function of outdoor temperature. Unsupervised learning sees machine learning exploit unlabelled data to obtain insights - for example, whether the data can be grouped or associated in any way, highlighting key features within a complex dataset. In simple terms, supervised learning results in a machine producing approximations, or predictions. Unsupervised training results in a machine providing descriptions of the data it is fed.

Reinforcement learning is a process by which a machine learns in a real-time, interactive environment through trial and error. The process of reinforcement takes place through the use of rewards and punishments, with the machine trying to achieve the goal of maximising the cumulative reward it achieves. This is often demonstrated in an example where an agent tries to complete a game of Pac-Man - where eating the food is the reward and being killed by a ghost is the punishment. The agent plays the game and, through trial and error, learns which route to take to maximise the reward and eventually win the game. It is this incarnation of machine learning that is closest to what most people would identify as AI. However, it is more accurately described as machine learning. In this context, many decision makers considering how ‘AI’ could transform their business could more accurately be said to be considering how machine learning could benefit it. Providing clarity on this point - and fostering a greater understanding of disruptive technology - should be a key goal in the tech sector. To achieve this fully, however, there is another important challenge facing the tech sector. Myth busting.
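Before turning to those misconceptions, the ice cream example above can be made concrete in a few lines of code. This is a minimal, illustrative sketch only: the temperature and sales figures are invented, and the point is simply that supervised learning fits a function from labelled examples and then uses it to predict.

```python
# Supervised learning in miniature: predict ice cream sales from temperature.
# Each (temperature_celsius, ice_creams_sold) pair is a made-up labelled example.
data = [(16, 95), (19, 130), (22, 178), (25, 216), (28, 265), (31, 310)]

n = len(data)
mean_x = sum(x for x, _ in data) / n
mean_y = sum(y for _, y in data) / n

# Ordinary least squares: slope and intercept of the best-fit line.
slope = sum((x - mean_x) * (y - mean_y) for x, y in data) / sum(
    (x - mean_x) ** 2 for x, _ in data
)
intercept = mean_y - slope * mean_x

def predict(temperature):
    """Predict ice cream sales for a given outdoor temperature."""
    return slope * temperature + intercept

print(f"Forecast for 27 degrees: about {predict(27):.0f} ice creams")
```

An unsupervised method, by contrast, would be given the same observations without the sales labels and asked only to describe any structure it finds, such as grouping similar days together.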
Hollywood misconceptions

One of the difficulties around introducing machine learning into the mainstream is that preconceptions about the technology already exist. There is no clean slate from which an understanding of this new technology can be built. Instead, many people already have preconceived ideas fuelled by popular culture. Within this context, people are often told machine learning will be transformational. That it will disrupt our working practices and have a major impact on our lives. That may be true. The problem is that many people hearing these things haven’t encountered machine learning in a business context. Rather, their only exposure to it has been through popular media, where it is entangled within the broader concept of AI. The result is that there is little real public awareness of the capabilities of this technology. Many people believe they have little real-world experience of it, and no reliable reference point to help them understand what the technology can and cannot do. Even those who benefit from the technology, for example by using autocomplete functions when composing text messages or engaging with chatbots, are often unaware of the machine learning technology powering these innovations. Addressing these misconceptions by raising awareness of real-world applications of machine learning, such as predicting sales forecasts, analysing big data and automating factory processes, is therefore essential.

Making the most of machine learning - and demonstrating results

From a business perspective, it is the responsibility of our sector to build a greater understanding of machine learning, so that its potential can be harnessed as part of business transformation strategies. It’s crucial that projects begin with a strong set of objectives and that clients are engaged throughout the process. Simplifying the conversation surrounding this innovative technology also allows us to demonstrate tangible results that reflect the objectives of our clients. Working to very specific briefs provides a framework on which to base tangible and measurable results. An example of this is the work CTS carried out with Leiden University Medical Centre and Google. The brief demanded that the administrative load on medical staff be lowered by automating time-consuming processes. By using speech-to-text technology to record the conversations between doctors and patients, and saving all data to the system automatically, this goal was achieved. The system removed the requirement for doctors to manually input data, and has opened up the possibility of analysing the information provided, minimising the burden of administrative workloads. Delivering against these briefs not only demonstrates the value machine learning can deliver but also manages expectations and ensures that new adopters aren’t discouraged by bad experiences or unrealistic expectations. With so many products that can work in tandem or in competition, it’s clear that jargon must be replaced with straight talking. It’s essential to dispel misconceptions and drive wider take-up of emerging technology.

### What man’s best friend can teach us about good ideas

https://vimeo.com/338855225/b83c2bf96b

I recently spent a couple of days with a friend who had just got a new puppy.
This cute little thing is a cross between an Airedale and a Jack Russell, and it is so full of playful curiosity it made me really happy that humans invented the dog. You may not think of the dog as a human invention, but it is. Historians can even trace the origin of the dog to somewhere in Belgium about 30,000 years ago, when humans took control of which wolves were allowed to breed with other wolves. This created something that would never have existed in nature. In the intervening years, we’ve become so successful at it that we now have about 240 recognised breeds, from the towering Great Dane to the celebrity handbag-rat. We’ve created dogs for specific uses, for aesthetic reasons and for companionship. And there’s a lot the world of innovation can learn from how we’ve done it. We’ll never reach the ideal dog Getting to the Pekingese was a long journey. It didn’t happen by saying we wanted a Pekingese and then having a particularly successful mating between two wolves. Instead, it’s the result of thousands of iterations and applied human effort. It’s been an ongoing process that’s still going today. Businesses need to adopt a similar approach if they want to get the most out of innovation. However, most organizations are based on static processes. Change is seen as a risky, costly, inconvenience. So, their approach to innovation tends to be about coming up with a slightly different static process that can displace their current one. Or a slightly different product that they’ll stick with for a while. Businesses like to wait until it’s too dangerous to simply stand still and then make a leap out of desperation and necessity. It’s uncomfortable, costly and loaded with risk. On the other hand, the most successfully innovative companies have innovation embedded in their organisation. Adapting to change is an ongoing process. And the leadership are focused on constant movement rather than static optimization. They understand that you never arrive at the ideal company, it’s only ever your latest iteration. Embrace unexpected results Do you think anyone ever set out to create the Chihuahua? That several millennia before the invention of the handbag, they had the vision for a tiny, boggle-eyed mutt that could fit inside it? I doubt it. I think dog breeds came around by spotting unexpected results. People spotted a dog with a fluffy tail and thought “I like that” so they bred it with another dog with a fluffy tail. Or they spotted extra muscle mass. Or a desirable temperament. Or an attractive face. Or a diminutive size. Our incredible variety of dog breeds are the result of us embracing the unexpected and then selecting for it. Businesses tend to be pretty awful with unexpected results. If something doesn’t give them the result they were looking for, they categorize it as a failure. And, hence, it becomes a failure because they fail to learn from it and fail to improve what they’re doing based on the knowledge it gives them. One of the things I like to do with clients early on is help them build a learning loop into their system. Whatever the results are of any activity, they sit down and interrogate it to find out what worked, what didn’t, what they’d do differently next time, what skills they need to pick up and lots of other things. They turn these learnings into recommended actions for the organization. It is then the management’s responsibility to put these actions into place. That’s how an organisation keeps improving. That’s how they get better at innovation. 
That’s how they get to more and bigger successes. Single decision makers Growing up in the UK in the 1970s, my family used to watch the TV show “One Man and his Dog”. It involved a shepherd in a flat cap instructing his dogs to get a flock of sheep into a pen in the fastest possible time. The shepherd used a combination of shouts and whistles to control the dogs and it was a lot more compelling than it sounds! The thing to note is that the show wasn’t called “A Committee of Experts and their Dog”. It was one man (these were in the days before equality was even thought of). Most companies have a tortuous process for ideas to go through. Multiple decision makers with different priorities get involved to give feedback according to their agenda. Status meeting after status meeting stops any momentum and turns the process into a slog. And the most important leaders only get involved at the end, giving feedback at the most harmful and expensive stage of the process. It’s a wonder any ideas get through at all. I find that the best way is to remove the meetings, shorten the process and get to a prototype as quickly as possible. And to assign one person to be the ultimate decision maker. They set the vision at the start along with the leadership team. And they have the right to veto everyone else’s feedback. As long as everyone understands that this is the way things need to work, the process is a lot smoother and the end result is a lot better. Even democracies need to be run by a leader. Innovating innovation If you’re ruthlessly loyal to your existing innovation approach, you can take comfort that you’re still in the majority: the 94% majority of CEOs that are dissatisfied with their levels of innovation. But if you want to make some changes that are going to get you into the illustrious 6% club you could do worse than go for a walk. With a dog, of course. And a notebook to write down what you learn.[/vc_column_text][/vc_column][/vc_row] ### Keith Richards, Gig Economies and £100K Service Techs [vc_row][vc_column][vc_column_text]Legendary Rolling Stones guitarist, Keith Richards, once said, “I ain’t getting old, I’m evolving” - a statement that could quite easily be applied to the thousands of men and women who repair, service and maintain the ever growing machines and equipment around us. These field service technicians are seeing their profession undergo massive change. With the exception of The Rolling Stones, most baby boomers are retiring or scaling back, looking for part time work – taking years of product insight and expertise with them. But the experience required today – and tomorrow – is very different than the previous generation of service techs. Today is a brave new digital world. And keeping that world running will command a premium. As an increasing number of consumer products are servitized and organisations shift to outcome-based business models, this has put service in the hot seat, triggering a shift in perception at the boardroom level. Service is now a revenue generator (not a cost overhead), making the service professional an integral part of any organisation. For the first time in the history of service, field service technicians are strategically important, rather than tactically required. They can upsell services onsite, are seen as ‘trusted advisors’ by customers because they’re technicians rather than sales people, and for many product companies, they’re the only human ‘face’ of the product or company a customer will ever encounter. 
Not only must field service techs be more rounded with a consolidation of skill sets, they must also master some of the customer facing softer skills, rather than just technical competency. With connected devices expected to rise to 20 billion by 2020, the unfair image of field service technicians as the stereotypical ‘white van man drivers’ weaving in and out of traffic, costing their businesses money as an expensive but necessary overhead (rather than a revenue generator) is rapidly becoming a thing of the past. As a profession, field service is growing up. Fast. Why The Gig Economy Suits Service With digital transformation sweeping just about every industry, the traditional break/fix model of maintenance has to change. The economics simply don’t work for maintaining all assets the same way. In the next few years, low risk assets, for example, will be serviced by an ‘uberization’ of field service workers with field service management companies, such as ServiceMax, becoming the ‘network’ between its customers and its agents. Of course, an uberized technician talent pool is particularly well suited for simpler, more repeatable service and maintenance tasks. A service execution platform will enable jobs to be auctioned to technicians that are available with the relevant credentials, enabling them to work as much or as little as they like.  While companies may use an Uber-type delivery model for their service and maintenance delivery, customers want to have a consistent brand experience with how a company’s service is delivered. They don’t really want to know if a technician is a badged PAYE employee or third party. Customers want it to go well, regardless of who is sent. It could be ad-hoc on a day-to-day basis, and the work gets thrown out to bids. A service tech could see there are ten jobs in the vicinity, where companies could see who is available to address the workload they’ve got that day. The £100K Service Technician At the other end of the spectrum, premium, higher paid technicians will focus on harder, more complex tasks. The predictable service model of digital twins to manage assets and account work better will result in fewer but more experienced senior technicians. They will be the problem solvers, such as delving into ‘no fault found’. The skill level and complexity level with new technologies will drive up salaries, reaching six figures within the next few years. We expect this to happen first in more scientific and tech roles, with other industries following suit. But it’s not just expertise that will drive up technician salaries. Customer demand for high availability and throughput means businesses want off-hours, weekend and 24x7 support, triggering more paid overtime and increases in shift allowances, lone working extra resource, and compensation for unsociable hours. The Mixed Labour Model But how do you capture the senior skills, hands on expertise and years of knowledge in a generation on the brink of retirement? Technology and time typically provide an answer to most dilemmas. It’s inevitable that the industry will face a skills shortage. In the next five years, the number of baby boomers will shrink dramatically. This will have a twofold affect: 1) driving up wages of existing field service techs as their skills become more in demand and 2) encouraging older techs to still use their knowledge and skills in the Gig Economy, working on their own terms and earning extra cash. 
There will undoubtedly be internal roles too, transitional periods where businesses and field service workers come to terms with knowledge transfer. Ultimately, it will be technology that makes it all possible. While digital twins increasingly predict service requirements and model change, automation will lead to greater efficiency for low maintenance tasks. Service will continue to become more efficient, thanks to technology and data analytics, which will enable senior service skills to be more valued and rewarded, either in terms of educating new recruits particularly on legacy assets, helping to upsell valued clients or working in the gig economy. In the field service industry this means knowledge and the ability to service clients, regardless of machine and regardless of time. Coupled with advances in automation technology, this should mean greater efficiencies despite greater demands from customers. As senior service skills and experience become more valued and rewarded, it will elevate the levels of professionalism in field service higher than it’s ever been before. The potential for a career in service maintenance will soon offer far broader opportunities, ranging from blue collar gig-based jobs to white collar premium positions. And just like Keith Richards, it is indeed evolving.[/vc_column_text][/vc_column][/vc_row] ### AI: Inspiring creativity and fuelling business growth [vc_row][vc_column][vc_column_text] Anyone who knows me will be pretty aware by now that I’m fascinated by the development of artificial intelligence (AI), machine learning (ML) and technology which analyses and exploits our deepest thoughts and emotions. I host the Creative Intelligence podcast, in which I have conversations with senior figures and innovators in the field about their research, as I try to draw together a disparate handful of threads to weave into a single narrative strand to take me through how all of this technology has grown, how we as humans need to interact with and shape it, and how important it is that we use the results of our collective labors to make our lives more fulfilling, not simply to run more efficiently or profitably. AI is a remarkable beast. I guess that people outside the scientific community, laypeople and casual observers, think it’s just another convenient technological advance, like Alexa or the iPhone. It really isn’t. For a start, artificial intelligence isn’t a single thing. It’s not even a process. AI is a philosophy, a whole infrastructure of design and hard science; you might even call it a new way of looking at the world. It offers us a future in which machines, computers, microchips, software are not just our tools or servants to be exploited: if we do it right, they can be our partners. The old paradigm was that humans design machines to do something that we had previously done, but to do it better – or to do something that we couldn’t do. An example of the former would be the mechanized production line of the kind pioneered by Henry Ford. He was still making cars, like everybody else in the automotive industry. He wasn’t making a new kind of car. But he was doing it so much more quickly and efficiently than his competitors that it gave him a commercial edge. The latter is epitomised, maybe, by medical imaging equipment, CAT scans, ultrasound, X-rays. The way they look through our outward carapace and see the physiology beneath is something that the human eye simply can’t do. It doesn’t have the power or the capability. 
So, in that sense, that kind of scientific advance has expanded our horizons. AI is different again. AI works with us; the technology is created and programmed by us, of course, and that remains a fundamental limitation which we always have to bear in mind – but that’s just the starting point. With ‘old’ technology, you invented something, you produced it, and there it was. Boom. Someone might come along and make a better one, steal a march on you. But it was the same process again. With AI, it starts from our human algorithms and programs, but it’s adaptive and responsive. It works with us, it complements us, and it’s starting to learn from us. We’re going to have to debate much more seriously in the future what constitutes ‘consciousness’, but I tell you this: the gap between human and machine is going to get smaller, fast. And this isn’t science fiction. This is happening right now. So how does AI inspire us? How does it fuel our creativity? We tend to think of technology, of machines, as the very opposite of art, of creativity, or the esthetic sensibility. But AI isn’t like that. It puts a hand on ours as we hold the brush, or the pen, or the cutting shears. It allows us the think in new ways, to make connections we would never have made before, between apparently disparate things. One of my guests on Creative Intelligence, Richie Manu from Central St Martin’s in London, described this forging of new connections, of rewiring the way we think, as our “curiosity bandwidth”. The potential outcomes are as wide as our minds can make them. For example, CuteCircuit, the London-based fashion house, produced a little black dress made of graphene, which could monitor, interpret and respond to the way the wearer breathes: more Samantha in Her than Audrey Hepburn in Breakfast at Tiffany’s. So I believe that AI allows us to expand our creativity, for sure. But more important than that, I think it allows us to reimagine what our creativity can really be. Designing a dress isn’t just a good eye and the ability to cut fabric skilfully anymore, it’s also advanced and empathetic data collection, applied science and (forgive the pun) cutting-edge nanotechnology. And you can call me a dreamer, and many do, but I think that’s awesome. It’s a little bit easier to understand how AI can help businesses grow and strengthen their appeal to customers. Retail is, in large part, an emotional process. There are no doubt some purchasing decisions which are ruthlessly commercial and efficient: an office manager buying printer cartridges, or a road builder buying aggregate. These processes probably don’t have much room for choice, influence, esthetic quality, sentiment. But think about so many other sectors. Fashion, we’ve talked about. But what about art? Automobiles? Food? Vacations? All of these are emotionally-driven buying scenarios, and what AI can do, or what it can help us to do, is understand the decision-making, questions and formulae that go through people’s minds. Who buys a product? What’s the demographic? Or rather, what are the demographics? Which of the measurable qualities of a given group of people - age, socio-economic class, educational background, race - are important when a purchasing decision is made? What do people think? AI can help us understand that. It can map human emotions and interpret them, allowing a retailer to produce the most comfortable and enriching consumer interfaces. If you get that right, then of course your market share will increase. 
The key to sales is giving the customer what they want, or even anticipating their needs, and the more information producers and vendors can collect, the quicker and more thoroughly it can be processed and understood, the more perfectly you will bring what you have to sell into alignment with what someone else wants to buy. The key to unlock commercial nirvana. So I think this is really exciting. When we do it right, AI can be a benign force, a helping hand rather than a forceful restraint, helping us to do what we want to do, not telling us to do what we ought to do. It’s complicated, and multilayered, and interdisciplinary, and people who are really leading from the front are few and far between. But their ideas will spread. Once the revolutionary is proved efficacious, it becomes the commonplace, and the essential. I think to myself, what a wonderful world. [/vc_column_text][/vc_column][/vc_row] ### 9 in 10 SME owners still lacking information on GDPR compliance [vc_row][vc_column][vc_column_text]It may have been a whole year since the GDPR (general data protection regulation) laws came into effect, but new research by business insurer Hiscox has found that business owners still aren’t completely up to speed with what is required of them under the new regulations. Among the eye-opening findings, the study unearthed that 9 in 10 SME owners still don’t know the main new rights that GDPR gives consumers and 39% don’t know who GDPR affects. These are concerning statistics considering the vast majority of businesses will be dealing with consumer data on a day-to-day basis to some extent. This lack of understanding perhaps suggests that the efforts made to educate businesses on GDPR compliance were not as effective as hoped. When survey respondents were quizzed on what they found most annoying online in 2018, constant communication about GDPR topped the list, alongside PPI calls and website pop-ups. Is it possible that the abundance of information available was actually more irritating than insightful - failing to engage the intended audience? What should businesses be doing to comply with GDPR? The purpose of GDPR is to improve regulation of data privacy and security in the EU, including how businesses collect data, how they use it and how they store it. It aims to provide the general public with more control over their personal data and to encourage businesses to be more transparent about how they are using consumer data. According to the GDPR directive, ‘personal data’ refers to any information related to a person, such as a name, photo, email address, bank details, social media information, location details, medical records, or even a computer IP address. Some of the key actions that have been taken that the public will be most aware of are the introduction of updated opt-in cookie consent pop-ups on websites and the distribution of emails informing consumers of updated privacy policies and their usage of personal data. A lot of businesses have also made the decision to hire a dedicated GDPR officer, who takes responsibility for overseeing the company’s data protection strategy. This individual ensures the business is complying with GDPR by educating all employees on requirements, conducting audits to ensure compliance, maintaining records of data processing activities and more. Are businesses complying? 
According to a report released by DLA Piper in February 2019, between May 2018, when GDPR came into effect, and January 2019 there were nearly 60,000 reports of data breaches, but only 91 fines had been issued. The highest and most notable fine (€50 million) was made against Google for failing to acquire users’ consent for advertising.

The consequences of a GDPR breach

The Hiscox study found that 96% of SME owners didn’t know what the maximum fine is for breaching GDPR, despite the fact that a breach could potentially land them in hot water financially. If a business is found guilty of not complying with GDPR, it could face a fine of up to €20 million or 4% of the company's annual turnover (whichever is higher). These fines can be issued for failing to report a data breach within 72 hours of becoming aware of it or for failing to integrate data protection policies. All fines are administered on a case-by-case basis, however, and it’s unlikely that many businesses will be fined the maximum amount unless it is a severe case of infringement. Supervisory authorities have the scope to impose smaller fines for less serious cases, or to issue warnings, reprimands and orders to comply with data subject requests.

What next?

There is still time – and a continued necessity – for business owners to get up to speed with the new laws if they aren’t already. Business owners should consider GDPR compliance an ongoing challenge that requires continued attention. Despite the tumultuous journey that the UK has been through regarding Brexit, as it stands, it looks as though the UK is set to leave the EU in the coming months. This doesn’t mean that the nation will be abandoning GDPR, however, as it has now been integrated into domestic law and will also remain incredibly important for anyone doing business with EU countries. While it may appear that GDPR is something that affects global organisations such as the Googles of the world, it’s just as crucial for smaller businesses to ensure they’re staying on the right side of the law. If you think there’s a chance that your business isn’t complying with GDPR, now’s the time to get up to speed.

### How organisations can employ tokenisation using public blockchain to access untapped value

Over the past few years, blockchain technology has piqued the interest of a whole host of businesses, from a rich variety of sectors. According to a recent survey from Deloitte, 98 per cent of UK businesses have either already deployed a blockchain solution or intend to do so at some point in the future. However, enterprises exploring the technology are primarily using private, permissioned blockchain to increase the efficiency of record keeping and data processes. Tokenisation is one of a number of functionalities going underutilised as a result of this limited scope. So, how can businesses harness tokenisation using public blockchain to access untapped value?

Public vs. private blockchain: What’s the difference?

Many are unsure about the difference between a public blockchain like Ethereum and a private blockchain such as Hyperledger. It all comes down to the scale of the network and who is allowed to participate. Public blockchains are hosted on public servers, and anyone can join, read and write data to the chain. These systems are highly resilient and boast high levels of privacy due to the anonymity of each participant. Private blockchains are much smaller in scale, and hosted on private servers.
Only authorised participants can join, read and write. Both public and private blockchains can also be permissioned. In the case of public blockchains, this means anyone can join and read, but only authorised and known participants can write. On private, permissioned chains, only the network operator can write data to the chain. By deploying a private or permissioned network, enterprises can harness the power of distributed databases, but miss out on the true potential of public blockchain—its shared state and trustless design. Both types of blockchains have their uses, but many underestimate the value of public blockchain in enterprise. What is a token? In simple terms, a token is a digital representation of a unit of value. This unit of value can be assigned to an intangible asset such as a cryptocurrency, or a physical asset such as property or a piece of art. Software code in the form of “smart contracts” can represent agreements between individuals, enterprises, and governments, and utilise tokens to facilitate these transactions. Using public blockchain, the combination of these technologies unlocks a wealth of opportunities in enterprise.  How can organisations employ tokenisation? Streamlining value transfer At a technological level, tokens can be utilised as a common protocol for data and value transfer, which can lower transaction costs and increase speed. In any supply chain, a set of complex contracts define agreements between supplier, manufacturers, and retailers. However, on occasions when these terms are not met, a lengthy bureaucratic process is required to determine culpability and financial penalties. By connecting automatic value transfer with contract terms using tokens and smart contracts, overall transaction speeds and costs can be reduced.  Democratising the financial ecosystem In the Philippines, 56 per cent live with limited access to the financial ecosystem. Rural banks are neither connected to any electronic banking service, nor to domestic and international money transfer networks. This challenge can be surmounted using a combination of tokenised assets and a fluid, cross-border platform to create the building blocks for an open marketplace. A cash-backed token, or “stablecoin”, that can be openly traded in a national system can be used to create an open payment network. These digital tokens can be used to instruct and settle remittances between participating rural banks by consolidating messaging, execution, settlement, and accounting of the transaction through one platform. Asset tokenisation Tokens can be used to represent the whole or a fraction of a real-world entity. These entities include natural goods like gold or oil, real estate, financial instruments, and more. Tokenisation allows traditionally illiquid assets to be split into smaller, more liquid components. Liquidity is all about easily getting in and out of assets, and the all-or-nothing ownership principles of our current markets make exchanging assets for value inefficient and sometimes impossible. By allowing for the fractional division of assets, tokenisation addresses this issue, creating greater freedom in the trading assets and decreasing illiquidity premiums. New crowd-funding models Tokens can also be used to represent rights to a future good or service. This allows projects to raise funds in a new way and create buy-in from a potential userbase, even before a product or service has been launched. 
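To make the fractional ownership idea above concrete, here is a deliberately simplified, purely illustrative Python sketch of a token ledger. The asset, the unit counts and the holder names are invented; a real tokenised asset would live on a blockchain such as Ethereum, with smart contracts, signatures and consensus that this toy example leaves out entirely.

```python
class AssetToken:
    """A toy ledger that splits one asset into equal, transferable units.

    Purely illustrative: a real tokenised asset would sit on a blockchain,
    with smart contracts, signatures and consensus that are omitted here.
    """

    def __init__(self, asset_name, total_units, issuer):
        self.asset_name = asset_name
        self.balances = {issuer: total_units}  # the issuer starts with every unit

    def transfer(self, sender, receiver, units):
        """Move units between holders, rejecting overdrafts."""
        if self.balances.get(sender, 0) < units:
            raise ValueError(f"{sender} does not hold {units} units")
        self.balances[sender] -= units
        self.balances[receiver] = self.balances.get(receiver, 0) + units

    def stake(self, holder):
        """Return a holder's fractional share of the underlying asset."""
        return self.balances.get(holder, 0) / sum(self.balances.values())


# A property split into 1,000 units; two investors buy small fractions.
building = AssetToken("12 Example Street", total_units=1_000, issuer="spv")
building.transfer("spv", "alice", 50)
building.transfer("spv", "bob", 25)
print(f"alice owns {building.stake('alice'):.1%} of the asset")  # 5.0%
```

The same unit-of-value bookkeeping also underpins the crowd-funding models just mentioned, where tokens represent rights to a future product or service rather than a slice of an existing asset.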
Whereas investment into early-stage ventures has traditionally been restricted to venture capitalists, tokenisation creates additional investment models based on the principle of crowd-funding. Whilst the uptake of private, permissioned blockchain in enterprise is a positive sign of the value and longevity of the underlying technology, businesses are missing out on the value tokenisation can provide. Certainly, private blockchain has the potential to improve the efficiency and transparency of business processes, but its benefits are limited in scope. Public blockchain has the potential to be far more transformative, because its core characteristics allow for the creation of entirely new business models, unlocking previously untapped stores of value for businesses.

### Women in FinTech: Fresh Perspectives

The financial services industry is well known for being fast-paced, competitive and male-dominated. In FinTech, women are vastly underrepresented, making up just 29% of staff in the sector, despite representing 47% of the UK workforce. But this is also an industry in transition – and growing numbers of companies now recognise the importance of promoting gender-neutral pay structures and flexible working policies. As Rosie Silk, R&D Tax Manager at Kene Partners, explains, with the right working policies and role models, concerns about attracting women to the industry should be rapidly consigned to history.

Zero Bias

Gender pay gaps. Endemic sexual harassment in Silicon Valley. It is easy to assume that the experience of women in the workplace has failed to improve over the past three decades. But that is patently untrue. There are many organisations within the FinTech sector that are passionate advocates for generating a positive working environment, and one that has zero gender bias. Given the challenge of recruiting high calibre individuals, companies need to encourage working practices that suit today’s attitudes towards work/life balance. Both women and men should have equal access to flexible working policies, for example. However, of course, FinTech is a young industry. Fast-growing start-up organisations will often overlook the need for flexible working models to support parents of both sexes. Indeed, in many start-ups, such flexibility is simply not an option. But this is an industry that offers choice: if flexible working is important to any individual, then look for a different employer. There is, without doubt, a divergence in cultural attitudes and behaviours as well as working practices in firms across the FinTech sector, and it is incredibly important for both employer and employees to understand and identify those issues up front. Skills and experience alone are not enough to ensure a great fit.

Role Models

This is also where female role models can play an important part, not only to inspire contemporaries and the next generation but also to set expectations within the workplace. The days when women leaders felt the need to adopt so-called male leadership traits are long gone; successful women are leaders on their own terms in FinTech as much as any other industry, and that is an important message. Attending an innovation conference recently and finding four women on a panel of five was incredibly inspiring. Furthermore, these women can also reinforce a strong culture by acting as coaches or mentors to new starters.
But it is also important to remember that changes to the workplace alone can never achieve zero gender bias without a wider change in culture, at home and in education. Chat to any successful women in FinTech, or just any successful women, and the story will be the same: they were provided with a huge amount of confidence at home, encouraged to try new experiences and treated equally. As a result they expect to be treated fairly at work and to have access to the same opportunities. They are also confident enough to take risks and embrace innovative business areas such as FinTech. For any business leaders carefully crafting good flexible working policies, don’t forget to apply the same encouragement and support once you get home from the office! Conclusion Despite the occasional negative headlines, times have without doubt changed. In addition to the support provided by switched on employers, women leaders are now becoming commonplace and acting as role models to the next generation. Companies across the board are investing in new working practices to support all staff and, as this continues, hopefully, within the next few years this conversation will be consigned to history. [/vc_column_text][/vc_column][/vc_row] ### Three ways blockchain can bolster ERP [vc_row][vc_column][vc_column_text]By augmenting existing data processes, blockchain technology has the potential to address some of the thorniest problems facing businesses today. Providing an increased level of transparency, security and auditability, blockchain is able to deliver value to businesses, in tandem with existing ERP systems and data architectures. Here are three ways blockchain can bolster ERP: Ensuring compliance With the introduction of GDPR, it has become more important than ever that businesses retain control of their data. When data is stored across multiple disparate spreadsheets, this becomes next to impossible. Blockchain-enabled ERP represents one solution to this conundrum. Utilising the functionality of private, permissioned blockchain, enterprises can ensure data is stored and exchanged in an ethical and secure manner. Private blockchain networks are open only to authorised participants, as dictated by the network operator. They are designed to allow the secure exchange of data within an organisation or between an organisation and a trusted third-party. The granular access control afforded by private, permissioned blockchain means that data can only be viewed by and exchanged between those for whom it is role-relevant. By restricting access to specific data to a handful of relevant individuals, blockchain automatically mandates ethical data-handling practices. By minimising the possibility that customer and partner data falls into the wrong hands, blockchain can ensure businesses remain GDPR compliant, and avoid the considerable penalties attached to a breach. Increasing auditability The quantity of data generated in the coming years is set to grow at an exponential rate. The key to gaining business advantage will be in managing this information in the most effective way. Data is at the heart of all business decision making, so it’s vital businesses are able to trust that their data is both accurate and not subject to tampering. Retaining full visibility using a blockchain-enabled ERP system is one way to ensure business data is trustworthy. Blockchain creates a transparent and indisputable log of data additions, transactions and changes, providing an end-to-end audit trail. 
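A toy example helps to show why such a log is so hard to tamper with. The sketch below is not any particular vendor's ERP integration; it simply chains each record to the hash of the one before it, which is the property an end-to-end audit trail relies on.

```python
import hashlib
import json

def record_hash(record, previous_hash):
    """Hash a record together with the hash of the record before it."""
    payload = json.dumps(record, sort_keys=True) + previous_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def append(chain, record):
    """Add a record to the log, linking it to the current tip of the chain."""
    previous_hash = chain[-1]["hash"] if chain else "genesis"
    chain.append({"record": record, "hash": record_hash(record, previous_hash)})

def verify(chain):
    """Recompute every link; an edited record breaks its own and later hashes."""
    previous_hash = "genesis"
    for entry in chain:
        if entry["hash"] != record_hash(entry["record"], previous_hash):
            return False
        previous_hash = entry["hash"]
    return True

audit_log = []
append(audit_log, {"user": "j.smith", "action": "update_price", "sku": "A-1043"})
append(audit_log, {"user": "erp_batch", "action": "post_invoice", "doc": 88321})
print(verify(audit_log))                     # True
audit_log[0]["record"]["sku"] = "A-9999"     # tamper with the first entry
print(verify(audit_log))                     # False
```

Real blockchains add distribution and consensus on top of this chaining, so that no single participant can quietly rewrite an old record and recompute every hash that follows it.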
This means that businesses can see precisely when a piece of data was introduced into the system, who has access to it, and whether any alterations have been made. The consensus algorithms at the heart of blockchain technology ensure that the system (and the data contained within) is near impossible to tamper with or corrupt. Blockchain creates a single version of the truth, providing a strong and trustworthy foundation of data on which businesses can base critical decisions. Driving transparency in the supply chain Though blockchain is still in its relative infancy, it’s already having a tangible impact in the manufacturing and supply chain sector. Organisations are using the technology alongside existing ERP systems to boost transparency and coordination across sprawling supply chain networks. The technology allows organisations to track a product through multiple stages of the supply chain in an efficient and reliable manner, as it passes through multiple stages and even locations, over several months. This comprehensive audit trail allows businesses to move products through customs swiftly, track fresh produce to identify the source of contamination, and even trace the movements of highly valuable and sensitive items. By making supply chain processes transparent and paperless, blockchain has the potential to both cut costs and drive productivity. Many companies are already using blockchain technology to enhance their supply chains. For example, Viant, an Ethereum-based platform for building supply chains, partnered with WWF to track tuna from the moment it’s caught, until it reaches the shop floor. With the industry rife with corruption and accusations surrounding poor animal welfare standards, the ability to track the process from start to finish is a huge step towards reducing the environmental impact of fishing. Though still regarded as an emerging technology, blockchain is becoming an increasing presence in enterprise IT, especially in the supply chain and logistics sector. According to recent figures from Deloitte, a huge 98 per cent of UK businesses have either already adopted a blockchain solution or intend to do so at some point in the future. The technology’s core characteristics and functionalities mean that it’s well equipped to augment existing ERP systems and data architectures, bolstering transparency and auditability, and ensuring businesses remain regulation compliant.[/vc_column_text][/vc_column][/vc_row] ### Spend on Cybersecurity or be prepared to spend much more on ransomware [vc_row][vc_column][vc_column_text]In early May 2019, the city of Baltimore fell prey to a debilitating ransomware attack. Emails and voicemails were crippled. The hackers seized data from the parking authority, water bill system, and real estate transactions. Anyone planning to purchase a home in May in Baltimore would be delayed. Hackers demanded Bitcoin payments equal to around $100k to return the services and data. If they did not receive the money they would wipe out the data, costing the city significantly more in damage. As of the writing of this article, more than 2 weeks after the attack, Baltimore mayor Bernard Young is standing strong against those that have perpetrated the attack. His stance is that paying these ransoms is what makes the practice possible and he posits that the money that is given to hackers is then likely used for them to build more sophisticated attacks. 
However, when asked, after over 14 days of his city being hamstrung, if he planned to capitulate, he is quoted as saying, "Right now, I say no, but in order to move the city forward? I might think about it." In all likelihood, the city of Baltimore was not saddled with a ransomware attack because it was specifically targeted. Ransomware attacks like this one are becoming more and more prevalent, and mainly have to do with opportunism. Something about the city's systems made them particularly vulnerable and hackers were able to exploit that, costing Baltimore what will amount to millions in damage whether it pays or not. Don't be Baltimore Nothing against Charm City, but it is crucial in the age of rising incidences of cybercrime to be better prepared for attacks like these. Unfortunately, as criminal behavior increases, the budget needed to remain vigilant must also increase. Consider that according to CSO Online, damage from ransomware attacks grew 15 times in less than two years to more than $5 billion in 2017. And total cybercrime costs are projected to hit $6 trillion annually by 2021. With this in mind, most companies are following the trend of increasing their cybersecurity budget. According to Varonis.com, over 75% say that they have readdressed this budget line, with expenses going up over 140% in just the past 10 years. If your company isn't keeping up at this pace, you may be inviting a devastating attack. What might it cost you? According to a study by IBM, individual cyberattack costs rose to an average of $3.9 million in 2018. These costs are astronomical, but that is just the beginning. Take into account not only aspects like ransom but also lost revenue over the average 50 days it takes for a company to recover from an attack, as well as reputation costs and customer turnover. What you can do to protect yourself Protect the crown jewels: The largest losses in cyber attacks come from the compromising of crucial data, representing over 40% of costs. Back up and silo the most important data in a dedicated server which is not networked and has limited access. Be Prepared: Put a plan in place for what to do if you are faced with an attack. Be Vigilant: Of course you don't want to compromise the privacy of your employees, but making background checks part of your hiring protocol is crucial; online services such as NetDetective or BeenVerified can do this work for you. Get help: Again, spending money on this issue is likely going to be a fact of life. Get consultants to help you work out where to prioritize spending. Create a culture of cybersecurity: This one doesn't necessarily cost much money, but it is likely the most important measure you can take. Make sure your company takes security seriously and set up protocols that make this clear. Let's talk cybersecurity -- Better passwords are a great start! Remember, like in Baltimore, most of these crimes do not involve hackers specifically looking for your company. 70% of breaches are caused by random process failure, which often includes employees not following password procedures. Remarkably, two-fifths of reported cybersecurity incidents are the result of a breached password. Typical password fails include: Being stuck in a password rut: Over 50% of people use fewer than five different passwords in their entire life! Ancient passwords: Over 20% say they still use passwords that are over a decade old. Very poor password choices: Sadly, "password", "qwerty" and "12345" were among the most popular passwords of 2014.
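These failure patterns are straightforward to screen for automatically. The sketch below is illustrative only (the deny-list and thresholds are examples, not a complete policy), but it shows how simple such a check can be:

```python
# Minimal password hygiene check; the deny-list and rules are illustrative only.
COMMON_PASSWORDS = {"password", "qwerty", "12345", "123456", "letmein"}

def password_issues(password, previously_used=frozenset()):
    """Return a list of the typical 'password fails' described above."""
    issues = []
    if password.lower() in COMMON_PASSWORDS:
        issues.append("appears on a common-password list")
    if password in previously_used:
        issues.append("has been used before (password rut)")
    if len(password) < 12:
        issues.append("shorter than 12 characters")
    if password.isalpha() or password.isdigit():
        issues.append("uses only one character class")
    return issues

print(password_issues("qwerty"))
# ['appears on a common-password list', 'shorter than 12 characters', 'uses only one character class']
print(password_issues("MwaIgmi2008awoohtV!"))  # [] - the phrase-based example from the next section
```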
How to create a better password protocol Length: Adding more characters makes passwords more difficult to crack. Experts say to make passwords at least 12 characters long, as each additional character adds an exponential level of security. Variance: Experts recommend not putting words or proper nouns into passwords at all. Common advice is to take a phrase and to use the first letters of each word to create a memorable password. For example, "My wife and I got married in 2008 and went on our honeymoon to Vegas!" would become MwaIgmi2008awoohtV! Have a different password for every site: This sounds overwhelming, but there are online services like LastPass and Dashlane that use complex encryption to keep your various passwords organized and safe. Two-step verification: Usually this means a code is sent to an employee's cell when they are logging in on a secure site. Taking advantage of this kind of two-step verification is crucial. You'll be surprised to see how easy it is to set up two-step verification on some of the most used websites. Other tips: Communication is key: Keep cybersecurity top of mind, often referencing it in company-wide emails and at meetings. Tell people about trends in phishing scams and convey tips on keeping secure. Be watchful: If someone is acting suspiciously, be wary. Create a system where employees can anonymously report anything they find questionable. Secure your supply chain: Making sure your vendors and any IoT devices you are using are secure is essential. Many breaches are caused by weak security in a vendor company, or on a networked device that is brought in from another company. Also try talking to your bank about their safety protocols. Unfortunately, the problem cannot be ignored and this is a battle you will only win with investment and vigilance. Staying abreast of best practices is key. See below for a useful infographic from Varonis to help you prioritize your cybersecurity spending and avoid becoming another Baltimore. ### The Secrets of SaaS Software as a Service is a driving force for disruption – a way to innovate at the speed that customers and the market demand. So how can companies with a traditional software go-to-market motion make the switch, and capitalize on all that SaaS has to offer? What's the secret to succeeding in a SaaS world? As the CEO of a 100 percent SaaS-driven, born-in-the-cloud company that was bought by a global technology leader that is constantly evolving and optimizing its business, I could preach the benefits of SaaS ad nauseam. Here is a quick list: Velocity. If you decided to set up a new office today, with a new team, it may well take more time to assemble the chairs than it will to provide the IT. With an internet connection and a credit card, an entire business can be given access to cloud-based productivity, CRM, finance and storage tools before lunch. Agility. Adding new tools becomes frictionless. There's the opportunity to test alternatives and find the best fit for each task and integration with the overall business. If a specialist tool doesn't work as well as imagined, then swapping it out for something different is easy: organisations are far less likely to be locked into particular vendors, having paid a large up-front license fee. Freedom to innovate. There's an old technology maxim, "Fail fast, scale rapidly." With SaaS, the business can concentrate on what it actually does.
Buying, provisioning, running and maintaining heavy infrastructure can be left to specialists in their data centres. User support also becomes simplified, since everyone's running the newest version of the same systems. IT becomes a strategic asset for growing the business, as opposed to a cost centre. Cost. Businesses very frequently prefer to face operating expenses over capital expenses. The expense of on-premises data centres and up-front license commitments is difficult to swallow if you aren't very certain of your organisation's profitability over the time those arrangements are expected to serve the business. Monthly fees, in contrast, allow a business to scale its provision up or down in response to the commercial realities of the moment. Collaboration. In order to scale at pace, cross-functional interlock becomes a must. SaaS both enables collaboration and makes it critical to the success of your business. The benefit is that you get a company without siloes, in which every department is high context and high performing. Everyone's a winner? The software market has embraced SaaS irrevocably. Companies reliant on large, up-front license sales are sitting on the wrong side of history – or have a niche that cannot effectively be replicated through cloud services. Applications that require zero latency, have intensive local hardware demands or have very specific compliance directives attached to them might remain on-site licenses for the time being, for example. In other cases, if it's not SaaS already, it either has a hybrid model or soon will. This customer acceptance of SaaS is, of course, the largest advantage for vendors themselves embracing the model. It is easier to overcome resistance to a £30 monthly fee than it is to a £1080 three-year license, so sales are likely to be more frequent, and sales cycles, unless you're in a very crowded sector, are likely to be shorter. These smaller sums of money also provide multiple opportunities for expansion. The lower cost of entry means customers with small budgets, who might have considered your product out of their price range, will be tempted to dip their toes in the water. Then, if your customer is successful using your tools, they may want extra seats soon. Perhaps there are additional, complementary services to offer, or an enhanced support package? Offered in terms of micropayments, the acceptance rate again becomes higher. Next, a possibly counter-intuitive point – the levels of attrition for SaaS products are not nearly as high as you might imagine. Humans are hard-wired to be averse to losing things they already have. Studies show that losing something that made you happy hurts about twice as much as the increase in happiness gained from the same thing in the first place. So long as your service has some utility, people won't want to lose it. The ability to walk away from a SaaS offering is a good reason to get started, but the actual likelihood of people doing so is much lower than they think. This inertia isn't something to depend upon, though, which brings us to our next point. Metrics at Your Fingertips While we've stressed the commercial advantages of a SaaS business model so far, doing this well and creating a long-term business isn't just about selling subscriptions and waiting for the money to roll in each month. It's about using those advantages to direct investment into the service, driving up loyalty, and turning users into fans.
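Loyalty and attrition stop being guesswork once they are measured. As a minimal sketch (the customer records, field names and figures are invented for illustration), two staple subscription metrics, monthly recurring revenue and gross churn, can be computed like this:

```python
# Toy subscription data; fields and figures are illustrative only.
subscriptions = [
    {"customer": "acme",    "monthly_fee": 30.0, "active": True},
    {"customer": "globex",  "monthly_fee": 90.0, "active": True},
    {"customer": "initech", "monthly_fee": 30.0, "active": False},  # cancelled this month
]

def monthly_recurring_revenue(subs):
    """MRR: sum of fees from currently active subscriptions."""
    return sum(s["monthly_fee"] for s in subs if s["active"])

def gross_churn_rate(subs):
    """Share of this month's starting MRR lost to cancellations."""
    lost = sum(s["monthly_fee"] for s in subs if not s["active"])
    starting_mrr = sum(s["monthly_fee"] for s in subs)
    return lost / starting_mrr if starting_mrr else 0.0

print(monthly_recurring_revenue(subscriptions))   # 120.0
print(f"{gross_churn_rate(subscriptions):.1%}")   # 20.0%
```

The same handful of lines, pointed at real billing data, is the starting point for the richer KPI dashboards discussed next.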
Creating and maintaining a SaaS product produces the opportunity for a very different relationship with customers than traditional markets. Historically, developers created a product, shipped it and moved on. If they were aware of how their customers used their products, it was often in a very piecemeal way, or through a piece of very deliberate research. SaaS demands customer intimacy. Part of that comes from easy access to key information: on a daily basis, here are some of the KPIs I check: customer acquisition cost (CAC) ratio, direct customer MRR (monthly recurring revenue), channel partner MRR, MRR by region, annual gross churn, year over year quarterly ARR growth, cost per lead, and Days Sale Outstanding (DSO) – which I then compare to the SaaS average, 76 days. Across the industry, the annual SaaS survey by KBCM Technologies (formerly Pacific Crest) is the gold standard for benchmarking and provides a valuable yardstick by which SaaS companies can metric their own performance. Having this data at my fingertips is invaluable. It embodies the agility which all businesses crave, and which is a primary driver for SaaS transformation and growth. Any good business puts their customer first…if you have a company culture code, I’m willing to bet “customer focus” is one of your core tenets. But SaaS transformation takes this a step further. It requires customer intimacy: knowing what your customers bought, why they bought it, and why they renewed (or didn’t) so that you can constantly provide them with even better value. Just think about how well Netflix knows you. It sometimes knows what you want to watch before you do! Now consider how much convenience this adds to your daily life. Conclusion The ‘service’ part of SaaS can sometimes be misinterpreted as another word for ‘product’: in such a case, you’re giving customers the same thing, but charging a monthly fee instead of an up-front license. To do so would be to willingly be blinded to the greatest advantage of SaaS for vendors – the opportunity to give your customers exactly what they want.[/vc_column_text][/vc_column][/vc_row] ### The Insights Into the Future of Healthcare Technology [vc_row][vc_column][vc_column_text]Recent technological developments have changed the way we do business. The integration of artificial intelligence and cloud computing has reached almost every industry, including healthcare. With new tech trends constantly emerging and evolving, it’s evident that the future of healthcare lies in technological developments. After all, some of these developments can be life-changing, as they allow doctors and patients to communicate faster and thus act faster in emergencies. ​The Benefits of Technological Improvements in Healthcare The IoT brings many benefits to doctors and patients thanks to the improvements in healthcare services. One of the examples of how technology is changing the healthcare sector is the possibility to access healthcare services remotely. This allows patients to manage their data more easily. They no longer have to go to the doctor’s office to get the necessary information. Patients can access online portals from the comfort of their home and see all of their medical test results. It is also possible to schedule real-time appointments or request prescription refills. Another significant tech development integrated into healthcare is chatbots. They can be used to help patients find the necessary information and gain insights into their conditions. 
Cloud computing has become an essential and unavoidable part of healthcare services, too. It increases the accessibility of data for both patients and doctors. Other technological trends that are being used in healthcare include digital patient room whiteboards, as well as patient and staff identification systems. Overall, the integration of technology in healthcare services has improved communication, accessibility, and responsiveness. And there's no doubt that this is crucial in saving lives and treating various conditions. The Negative Side of Technological Improvements in Healthcare Unfortunately, the use of technology in healthcare comes with a negative side as well. With cloud computing and online data becoming more widespread, doctors and patients are exposed to various cybersecurity threats that can lead to data theft and security breaches. Cybersecurity threats have drastically increased in the last couple of years. This leads to concerns over patients' safety and privacy. Therefore, it's crucial to maintain a cybersecurity plan and follow security frameworks to ensure all data and accounts are protected. Maintaining security by complying with certain regulations and standards can make a huge difference in the outcome of healthcare services. It is recommended to follow the NIST cybersecurity framework, a collection of standards, guidelines and best practices designed to keep infrastructure safe from breaches and cyber attacks. Unfortunately, data theft and other forms of unauthorized access can be a result of poor security on both sides. In other words, poor security settings on patients' devices can lead to breaches and other devastating consequences. Keep Your Data Safe Users who access healthcare services remotely or check their medical records while connected to an unprotected network are exposing sensitive data to third parties and hackers. One way to get around this issue is to connect to a virtual private network to encrypt sensitive data and hide information from third parties. VPNs significantly improve security by hiding the user's IP address and allowing them to stay anonymous online. Reliable encryption makes the user's data untraceable, which enables them to safely browse the internet or access confidential files. Choosing a suitable VPN service to protect data and devices might be challenging. Below are the key pointers that can help users find a reliable VPN service provider. Start by avoiding free VPN services, as there are plenty of them currently available on the market. It is understandable how a free VPN might sound attractive. However, such a service does not guarantee full encryption. Moreover, most free VPN services record users' data for their own advertising purposes. Instead of opting for free solutions, users should look for paid services that have a zero-logging policy. Other features that should be included in the package are fast server connections, reliable customer support, and a broad range of available servers and locations. Conclusion The integration of advanced technologies has already transformed healthcare services in a positive sense. Patients are now able to connect with doctors and access their medical records faster than was ever possible before. It is clear that technology is going to play an important role in the future of healthcare services all over the world. However, as beneficial as the IoT can be, it is also a source of risk for both patients as individuals and entire healthcare systems.
The recent developments in the field of technology are turning healthcare services into primary targets for cyber attacks. According to WelchAllyn, a total of 477 recent healthcare breaches affected more than 5.6 million patients’ records. Thus, it is highly recommended to treat the topic of cybersecurity seriously and raise awareness regarding its role in healthcare.[/vc_column_text][/vc_column][/vc_row] ### How The Gaming Industry Uses Big Data to Shape Development [vc_row][vc_column][vc_column_text]Given the continued growth of the gaming industry to reach monumental levels, it is no surprise to see business leaders begin to fully embrace the potential of big data. The gaming industry is following the lead of other businesses in seeking to maximize the trove of data that is stored during online activity. Big data also allows the video game industry to personalize their service to consumers, whether working out what type of advertisements to use or learning how best to create engagement on social media. The video game industry’s adoption of big data will inevitably prompt some gamers to reconsider their level of data privacy, with online security more important than ever in this age of digital transformation. While these intentions to use big data may be transparent and not at all insidious, the knowledge that masses of data are being collected, stored and monitored can be disconcerting for some gamers. Big plans for big data Using data to create targeted in-game advertisements purely has the aim of generating more revenue for the industry. This can be achieved through cooperation with social media accounts, in which video game companies can build up a more comprehensive profile of a gamer. While this may seem troubling at face value, it may be a sacrifice that gamers have to make. If the industry can attract more money through more effective advertisements, more money can be spent on developing new products. A more practical use of big data is to directly improve the games themselves. Companies can now use data analytics to interpret the key facts and habits of gamers, with the aim of being able to create a gaming experience that is ultimately more gratifying for players. The nature of video games lends itself perfectly to tracking player performance in order to gauge the difficulty of certain levels. This means that games can be tweaked subtly to improve the overall experience. If a review of the data reveals that players keep getting stuck at a particular point of a level, modifications can be made. Adam Fletcher of Gyroscope Software posted at Gamasutra to explain the benefits of using data analytics to guide game improvements. Fletcher details how data science can pinpoint what keeps players engaged, with increased engagement generating more revenue for the developers. Therefore, there is every incentive for video game companies to use big data, while it creates a more intelligent product for the consumer. A huge industry LPE has compared the size of the video game industry with its movie and music counterparts. Video games generated $137.9 billion in 2018, substantially more than the music and movie industries combined. With so much at stake financially, it is no surprise that video game companies want to give themselves every possible edge. The gaming landscape has undergone dramatic change in recent years. The rise of mobile gaming has proven lucrative, with almost 50% of the money generated by video games accounted for by mobile products. 
eSports have become a global phenomenon, with figures from Syracuse University suggesting that eSports will account for 10% of all US sports viewing by 2020. Another hugely influential contributor to the video games industry’s success is live streaming. The power of live streaming Gaming has always had the potential to be a social pursuit, but the sustained growth of live streaming is taking that communal element to the next level. Live streaming has also created bona fide celebrities from the world of gaming, which feeds into the lucrative nature of the industry. In January, Business Insider reported how Tyler ‘Ninja’ Blevins earns $500,000 each month for streaming games of Fortnite from the comfort of his bedroom. That kind of earning power would have been unthinkable even just a few years ago. While Ninja may be the most extreme case, live video streaming is hardly an exclusive phenomenon. Newzoo analyzed data from January of this year, uncovering that 63,700 streamers hosted videos on Twitch and 22,000 people streamed video game content through YouTube. Those streams can focus on any form of gaming. While Fortnite and other trending games may dominate, many streamers venture into retro titles or simulation games.[/vc_column_text][vc_single_image image="52105" img_size="full"][vc_column_text]Other parts of the gaming industry have found this trend towards live streaming impossible to ignore. Some online casinos have made a concerted effort to introduce more live streaming games, partly driven by a desire to tap into a more communal form of casino gaming. Betway now has a dedicated live casino page, with games ranging from roulette to blackjack streamed in high definition with real-life dealers. Mobile apps have also got in on the live streaming action. Quiz apps used to be a way for individuals to pass the time and engage their mind in some spare time, but HQ Trivia changed the playing field. Thousands of people across the world logged into the app at the same time to compete for prizes, with the quizzes hosted in real time by presenters and celebrity guests. Live streaming looks destined to continue to expand throughout the coming years. After all, so many different games are attracting huge audiences that there is no reason to slow down. As those audiences swell, so do marketing options. If the gaming industry can use big data to track both streamers and audiences, it can tailor and sell advertising packages for a much greater profit. It is natural that the video game industry would want to use all of the available information in order to improve its products. Data will be created by players' actions anyway, so it makes sense to log and track patterns to ultimately enhance the gaming experience. With eSports and live streaming propelling the video game industry to new heights, the usage of big data may become even more widespread.   
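To make the earlier point about data-driven game tuning concrete: spotting where players repeatedly get stuck is, at heart, a simple aggregation over telemetry. A toy sketch, with invented event fields, might look like this:

```python
from collections import Counter

# Hypothetical telemetry: one record per failed attempt, tagged with level and checkpoint.
failure_events = [
    {"player": "p1", "level": 3, "checkpoint": "bridge"},
    {"player": "p2", "level": 3, "checkpoint": "bridge"},
    {"player": "p1", "level": 3, "checkpoint": "bridge"},
    {"player": "p3", "level": 5, "checkpoint": "boss"},
    {"player": "p2", "level": 1, "checkpoint": "tutorial"},
]

def stuck_points(events, top_n=3):
    """Count failures per (level, checkpoint) and return the worst offenders."""
    counts = Counter((e["level"], e["checkpoint"]) for e in events)
    return counts.most_common(top_n)

for (level, checkpoint), failures in stuck_points(failure_events):
    print(f"level {level} / {checkpoint}: {failures} failed attempts")
# level 3 / bridge tops the list, flagging it as a candidate for difficulty tweaks
```

In practice the same counts would be sliced by player segment and game version, but the principle of letting the data point to the problem level is unchanged.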
If you are fighting a gambling problem, please visit https://www.begambleaware.org/ or phone the National Gambling Helpline (Freephone) on 0808 8020 133. ### Dell EMC Puts the "U" in Universal Customer Premises Equipment at the Network Edge Dell EMC to resell ADVA Ensemble software, enabling a choice of more than 50 onboarded virtual network functions for the Dell EMC Virtual Edge Platform London, UK – June 4, 2019 News summary: Dell EMC now offers ADVA Ensemble software on its Virtual Edge Platform - a leading uCPE system Collaboration and an open platform enable choice from among 50+ virtual network functions Companies work to bring the benefits of the cloud to the network for enhanced efficiency and economics Full Story: Dell EMC significantly expanded the capabilities of its Virtual Edge Platform (VEP), a multi-function network device, by certifying and making available ADVA Ensemble software – delivering a choice of more than 50 virtual network functions (VNFs). Dell EMC's collaboration with ADVA enables software choice for the VEP, supporting a wide variety of VNFs including virtual firewall, WAN Optimization, IoT, virtual test, voice, wireless and many others. Dell EMC VEP and ADVA – Virtualizing Next-Generation Access to Improve Network Economics Virtualising next-generation access is about enabling network functions on a common platform in lieu of expensive fixed-function access hardware. This is the promise of universal customer premises equipment (uCPE). Using software-defined architecture, Open Networking and virtualisation to improve network access, organisations such as service providers can get the benefits of the cloud and accelerate their digital transformation goals. This approach helps them take advantage of new market opportunities more quickly, flexibly and efficiently. Built with advanced intelligence for network virtualisation and software-defined architecture, the Dell EMC VEP provides an open Intel® architecture-based platform to support multiple simultaneous VNFs. Numerous proprietary physical devices can be consolidated into this single uCPE while maintaining the high performance levels needed to host many of them. The modular design includes room to grow, with front panel expandability so the platform can be easily upgraded or serviced in the field as needed. The agreement gives more customers access to the Ensemble Harmony Ecosystem, a unique multi-vendor environment with unparalleled variety and choice. It enables customers to avoid vendor lock-in and unleash new service innovation in an instant. Now, with Ensemble Connector running on the Dell VEP family, many more service providers and enterprises have the freedom to explore product offerings and create try-before-you-buy programs for new dynamic offerings. The combined initiative consolidates VNF management, including the Ensemble Connector's enhanced management and orchestration (MANO) capabilities. It streamlines customer deployments and creates a new level of operational simplicity. Supporting quotes "There is a real need among service providers and enterprises to update network operations to address distributed and cloud-based applications and capitalise on changing economics enabled by cloud models," said Tom Burns, senior vice president & general manager, Networking & Solutions, Dell EMC.
“By infusing Open Networking into access networks to the cloud with the Virtual Edge Platform family, Dell EMC can help customers modernise infrastructure and transform operations while automating service delivery and processes.” “Today’s network operators need solutions that are optimised for uCPE. Dell EMC’s new uCPE platforms are the answer to that challenge. Combined with our high-performance Ensemble Connector NFVi software, it provides an open architecture to support multiple simultaneous VNFs. By connecting the enterprise edge to the cloud, it provides unrivalled choice, drives growth and significantly improves end-user experience,” commented James Buchanan, GM, Edge Cloud, ADVA. “Our technology has already succeeded in bringing the power of virtualization to major customers around the world. Now, by joining forces with Dell EMC, we’re enabling more service providers and enterprises to digitalize their infrastructure, both in data centers and out at the branch IT edge.” Availability:  The solution is available in the United States now with worldwide availability by the end of June 2019  Additional Resources: Realizing the Value of NFV with Universal Customer Premises Equipment (uCPE) ADVA Network Virtualization product pages Connect with Dell EMC via Twitter, Facebook, YouTube, LinkedIn and DECN About Dell Technologies Dell Technologies (NYSE:DELL) is a unique family of businesses that helps organisations and individuals build their digital future and transform how they work and live. The company provides customers with the industry’s broadest and most innovative technology and services portfolio spanning from edge to core to cloud. The Dell Technologies family includes Dell, Dell EMC, Pivotal, RSA, Secureworks, Virtustream and VMware. [/vc_column_text][/vc_column][/vc_row] ### Bloor agrees partnership with Disruptive.Live and launches Bloor TV Bloor TV will bring analyst insights and reports to life in a new, easily consumable way with a range of live stream and video programmes. Following Bloor’s recent acquisition by Globalution Group, the research, analyst and consulting firm has agreed a new strategic media partnership with tech TV channel Disruptive.Live.  Bloor and Disruptive Live will both invest in creating Bloor TV, a new channel with regular programming and live streams from the Bloor website and their social media channels.  Alongside its written content, Bloor will use the Disruptive studios, producers and technicians to produce a new range of video programmes.  These will include glass board presentations, panel discussions, vendor spotlights, interviews and a new approach to briefing customers and vendors on the latest technology. Bloor’s newly appointed Chairman Brian Jones, a four-time FTSE 100 CIO with Burberry, Allied Domecq, Scottish Power and Smiths Group, is leading the charge on its Mutable Business messaging.  Brian said: “The work we’ve done in Disruptive Live’s South Bank studios has been really exciting.  This is a new medium for us.  They’ve helped us create the first four shows that explain the right way to approach transformation for CIOs and vendors, who need a new mindset of being in a permanent state of reinvention”. Joint Managing Director of Disruptive Live, Vikki Kersey, said about the partnership: “Bloor is a very prestigious analyst organisation. We are looking forward to driving the Bloor research and messaging further into new media channels through the use of next generation mediums”. 
Bloor's new CEO Tim Connolly, who joins the company after thirty years in consultancy, added: "We are broadening Bloor's research and consulting focus from technology vendors to the CIOs and customers they support. Bloor TV will develop regular programmes, news and a different style of technology briefs to help us get the message to the new audience, which in turn will help us become more effective for vendors." This new move reflects the emphasis that Bloor places on the business and people factors that are so critical in maximising value from companies' investment in their technology estate. This is the essence of the Mutable Business story that Bloor has been telling for the past three years. About Bloor Bloor, owned by Globalution Group, is a research, analyst and consulting firm focused on the idea that Evolution is Essential to business success and ultimately survival. For 30 years we have helped businesses navigate technology and choose the optimal solution for their needs. To find out more or talk to one of our analysts, go to our website (Bloor), email us at info@bloor.eu or follow us on social media: @BloorResearch & LinkedIn. About Disruptive Live Disruptive, part of the Compare the Cloud network, is the UK's first tech TV channel. We cover some of the biggest technology events in the world as part of our live shows, produce original content, explore cutting-edge tech and interview the disruptive voices in the industry. ### Don't Be A Victim! 10 Ways To Protect Yourself and Your Business From Fraud The rise of fraud means it's more important than ever to be 'security aware', both personally and in business - here are some pointers to protect against cybercrime: 1. Financial paperwork Cybercrime is rife in banking and credit cards, so diligence is imperative. Up-to-date tech helps in business: for example, some modern accounting packages offer secure check facilities where the checks generated are tamper resistant and more or less impossible to forge, with features such as embedded holograms. Statements and bills can offer fraudsters opportunities in terms of account numbers with full postal addresses, so file them safely or consider paperless billing. 2. Bogus phone calls Beware of callers claiming they're from your bank or another financial institution you deal with, asking for your security details on the premise that they're checking a so-called security breach - especially if they're asking you to move money. Genuine bank or credit card company staff don't ask for PINs and other sensitive information over the phone. If you receive a phone call like this, terminate it, contact your bank to see if the original call was genuine, and follow these other tips. 3. Identity theft A rising fraud activity; some burglars are only interested in finding something with your full name and address details on it, or your computer, purely for the information it holds about you. Be aware of the following: Shred any papers showing your personal information, even if it's just your name and address. Be very wary of any letters, phone calls or emails purporting to be from your bank requesting personal details and passwords. Don't reply to unsolicited emails, even to remove yourself from their mailing list; it's a sign your address is 'active' and you'll simply receive more. Ignore mailings telling you you've won a lottery or competition you know you didn't enter. 4.
IT fraud A common fraud method as they rely on some organizations’ security vulnerabilities such as out of date software and the same passwords being used for months on end. They also bank on some staff cutting corners and not being aware of the online threats. To combat both, organizations should ensure their IT infrastructure is up to date with the latest software and that passwords are changed regularly. Staff should be made aware of good security procedures and trained and retrained regularly. 5. Password policies Following on from the above, ensure password security as follows: Change them regularly - 60 to 90 days minimum Set house rules - for example, insist that passwords are a certain length and contain at least some numbers and upper case characters Levels of access - passwords should only allow access to relevant levels of the IT system Sharing - insist no-one shares their password with anyone else at all 6. Email fraud A popular method is ‘phishing’ where an email supposedly from a bank or other company asks you to click a link or phone a certain number. These are nearly always bogus - check for the following: Your bank and other organizations you deal with will usually use your name They wouldn’t ask you for passwords or other sensitive information They’re unlikely to ask you to ‘click this link’ If in doubt contact your bank or whoever the email is from using the number you have for them, not the one on the email. 7. Staff anti-fraud education Training your staff in good security measures when using IT systems and guarding company information is important. It’s also vital to ensure ‘refresher’ training is conducted as threats can change as tech and business practices evolve. 8. Employee background checks Because many staff these days are accessing IT systems and are privy to sensitive customer data, conducting pre-employment background checks is good business practice. Along with cyber crime and helping staff stay secure when using IT, staff should be trained in how to spot counterfeit money, checks and stolen credit cards if they’re involved in financial transactions. 9. Monitoring While not wishing to suggest to staff you don’t trust them, a degree of surveillance is worth considering such as in, say, areas where valuable goods or money is handled. A CCTV security system could be considered - this also guards against possible customer fraud and theft. 10. Review paperwork Especially in larger organizations, it’s possible for fraudsters to ‘try their luck’. For example,  there may be invoices from bogus vendors or forged purchase orders supposedly from your company, and some contractors may bill for work they never actually did or over-bill for work done. These rely on organizations dealing with mountains of paperwork not being diligent when checking and can cost industry millions - as does other fraud.[/vc_column_text][/vc_column][/vc_row] ### The Importance of UI and UX in Driving Better Collaboration [vc_row][vc_column][vc_column_text]Collaboration technology is key to facilitating information management and effective organizational communication, however, a study from Deloitte revealed that 47 percent of business and HR leaders expressed concerns as to whether their existing collaboration tools really helped drive overall productivity. The ability to deliver a good user experience (UX) is crucial in selecting the appropriate communications and collaboration technology.  
A well-designed user interface (UI), which includes all visual elements; pages, icons, buttons and screens, plays a significant role in enabling more effective collaboration. A poorly designed UI can be problematic within a business, and culminate in issues, such as a negative UX.   It can irritate employees, cause overall worker productivity to decline, while failure to collaborate can result in technology adoption being abandoned. In the same study, 47 percent of business leaders also claimed that they felt that the productivity of the “new workplace”, with the flood of new collaboration tools available, was a very important concern.  The approach to an improved UX is for organizations to consider a well-designed UI, which will improve the overall quality of their meetings. According to a survey by Udemy on distractions in the workplace, three out of four workers feel that they are distracted on the job, while 70 percent agree that training on how to adequately utilize these tools would help block out distractions and focus better.  A conference platform with all the necessary functions integrated can still require significant time and effort to locate certain features. The biggest obstacle for unified communication and collaboration users is to recognize, when, where and why employees or customers utilize conference tools and even more so, which specific features are critical, and which are less useful. Using UI to Drive Productivity As mobile devices are becoming the new business norm, IT-professionals must be able to adapt and ensure that the UI is both as functional as it is mobile-friendly. Our own research, in collaboration with the GSMA’s Mobile World Live, showed that 44 percent of people preferred smartphones for unified communication and collaboration services. Joining a virtual meeting should be as easy as walking into one in real life. A well-designed UI is easy to navigate, contains all the necessary prompts and most importantly, should be able to imitate apps that most users are accustomed to dealing with.  These factors should help users have a positive experience with their chosen conferencing platform. The aesthetic design of the UI should not be overlooked, as layout and colours are also important elements to the overall experience.  After all, though usability of the platform may initially attract people to it, the visual appeal of a UI is what really allows it to stand out from the crowd, and for users to adopt it with widespread enthusiasm. The Significance of User Testing User testing is generally the most effective way for product and design development teams to improve the overall design of collaboration technology.   Ongoing user testing allows our team to make minor tweaks until we are able to provide the best UX possible. Techniques we use such as cadence mapping indicate how usage across various functional areas, such as scheduling or chatting evolve into a pattern for our users. For instance, we have discovered that smartphones are often being used by people to join meetings while they are in their cars. Subsequently, UI elements should be designed to be much larger and easier to read, particularly audio control buttons.  Our simple, three-tab design allows users to navigate swiftly and effectively through various functions.  While our text and interface was also redesigned with stationary users in mind, as they often use their devices hands-free  in meetings. So in conclusion, an improved UI will in turn lead to a better UX. 
Widespread adoption of this technology will also translate into a positive ROI. The criteria for an effective collaboration approach includes: a simple navigation system, an aesthetically pleasing interface and relevance to the business.  Once the technology chosen is in alignment with team preferences and behaviours, it will result in a more engaged, efficient and productive workforce.[/vc_column_text][/vc_column][/vc_row] ### BT Chooses Juniper Networks to Underpin 5G Capability and Move to a Cloud-Driven Unified Network Infrastructure [vc_row][vc_column][vc_column_text]BT Chooses Juniper Networks to Underpin 5G Capability and Move to a Cloud-Driven Unified Network Infrastructure Automated, Programmable Network to Deliver Faster, More Robust Services Across Fixed and Mobile Portfolio LONDON – June 3, 2019 – Juniper Networks, (NYSE: JNPR) an industry leader in automated, scalable and secure networks, has been chosen by British Telecommunications (BT), one of the world’s leading providers of communications services and solutions, to deliver its Network Cloud infrastructure initiative. This deployment will pave the way for BT’s Network Cloud roll-out - and also enable a more flexible, virtualised network infrastructure that can deliver the technology requirements of various lines of business for BT from a single platform.   BT will also use this platform to create new and exciting converged services bringing mobile, Wi-Fi, and fixed network services together. Furthermore, with the implementation of the Network Cloud infrastructure, BT will be able to combine a range of currently discrete network functions and deploy them on a cloud infrastructure that is built to a common framework and shared across the organisation, throughout the UK and globally. These include services across BT’s voice, mobile core and radio/access, global services, ISP, TV and IT services, as well as a host of internal applications, thereby cutting operational expenditure and significantly simplifying operations throughout the organisation. News highlights: This project will enable BT to implement a range of new applications and workloads and evolve the majority of its current ones including: Converged fixed and mobile services rollout to consumers and businesses. Faster time-to-market for network services ranging from internet access delivery to TV and business network functions Improved voice and video delivery and scalability. To accomplish the evolution toward a more agile, virtualised network, BT is investing in a range of Juniper solutions across various tenants within the BT network, including: Dynamic end-to-end networking policy and control for telco cloud workloads using Contrail Networking Cloud operations management using AppFormix Highly scalable and flexible spine and leaf underlay fabric using the QFX Series “BT is a global leader in ultrafast services, with growing demand from our ultrafast broadband services and ultrafast 5G services and has the perfect opportunity to combine several discrete networks into a unified, automated infrastructure. This move to a single cloud-driven network infrastructure will enable BT to offer a wider range of services, faster and more efficiently to customers in the UK and around the world. We chose Juniper to be our trusted partner to underpin this Network Cloud infrastructure based on the ability to deliver a proven solution immediately, so we can hit the ground running. 
Being able to integrate seamlessly with other partners and solutions and aligning with our roadmap to an automated and programmable network is also important.” Neil McRae, Chief Architect, BT “As a renowned global service provider, BT is a shining example of how to evolve networks to become more agile. By leveraging the ‘beach-front property’ it has in central offices around the globe, BT can optimise the business value that 5G’s bandwidth and connectivity brings. The move to an integrated telco cloud platform brings always-on reliability, along with enhanced automation capabilities, to help improve business continuity and increase time-to-market while doing so in a cost-effective manner.” Bikash Koley, Chief Technology Officer, Juniper Networks Additional Resources: Follow Juniper Networks online: Facebook | Twitter | LinkedIn Juniper Blogs and Community: J-Net About Juniper Networks Juniper Networks simplifies the complexities of networking with products, solutions and services in the cloud era to transform the way we connect, work and live. We remove the traditional constraints of networking to enable our customers and partners to deliver automated, scalable and secure networks that connect the world. Additional information can be found at Juniper Networks (www.juniper.net) or connect with Juniper on Twitter, LinkedIn and Facebook. [/vc_column_text][/vc_column][/vc_row] ### Turn physical security into business insight [vc_row][vc_column][vc_column_text]There are many misconceptions about machine learning and artificial intelligence (AI). In all the hype around emerging technologies, what is often lost is a clear understanding of the practical steps needed to make it a reality; in the case of AI, it is the amount and quality of data needed to train smart systems before they are ready to be deployed for real-time decision making. While this can lead to confusion about the capabilities of AI, it can also result in lost opportunities. Many people overestimate the ability of machine learning or the ease of producing data suitable for training systems, others, on the other hand, are sitting on data resources that could be exploited with relative ease for business intelligence purposes. Physical security systems, for example, create large volumes of data which is currently often under-managed and under-utilised. Analogue and IP-enabled CCTV systems, for example, are infamous for filling tapes and hard drives which have historically been reused after a short period of time to keep costs manageable. But this data can be put to other uses beyond preventing or investigating incidents of crime or shrinkage. The growth in adoption of the Internet of Things (IoT) and networked devices, combined with cloud data centres and machine learning technologies creates enormous potential for changing the way we think about physical security. Two things are driving this change. First and foremost, video data that was previously only kept for limited amounts of time can now be stored almost indefinitely: the cost of long-term data storage drops dramatically when it’s moved to the cloud, enabling collection and retention of information that would have previously been destroyed relatively quickly. Secondly, cloud computing enables organisations to enrich this data with information from other sources and deploy real-time analytics and AI to derive business intelligence. 
From security to business intelligence Typically, physical security systems are deployed to reduce incidents of petty theft, robbery or aggression, and the digitisation of data offers many ways to improve on these core functions. One current application for AI, for example, is teaching security systems to identify suspicious behaviours. Platforms exist which can tell the difference between an individual loitering suspiciously by an ATM and someone who is innocently digging in their bag for a lost card. The more data they can gather about a specific place, the better these systems become at identifying unusual activity. The same platforms can also help to pre-empt dangerous situations by learning to identify aggressive behaviours against staff. Many surveillance systems in retail spaces are unmonitored and used primarily for forensic investigation after an incident has occurred. A smart (IP) system capable of identifying tension and aggressive behaviours can trigger automated alerts for security teams, or play a pre-recorded warning through the in-store PA, so a situation can be dealt with as it happens in real time, potentially lessening its impact and pre-empting criminal behaviour. Beyond using AI to enhance security, however, the same network of devices is also well placed to deliver valuable information about other business activities. Retailers can enhance the customer shopping experience through automated alerts about lengthy queues, prompting more staff onto check-outs. And combining camera data with marketing platforms, for example, could give store managers a better understanding of how effective in-store adverts are. Is a particular display, banner or video attracting attention? How are people responding to it? Making intelligence a reality Historically, this kind of intelligence would have been costly and slow to gather and analyse. Today, these observations can be turned into real-time decision making. Visual analytics can distinguish between men and women, or old and young, with a relatively high degree of accuracy. Combining the data from security cameras with digital display advertising, for example, means adverts can be updated with real-time targeting depending on who is in store at that moment. This can be achieved by using systems and devices that most retail spaces have already deployed, delivering new value and return on investment from existing kit. The potential for these kinds of systems is still only just being fully understood, and early adopters are continually finding new ways to draw on security data to optimise and deliver business value across their organisation. Airports are combining visual analytics with access control to ensure employees really are who their card says they are; office parks are integrating security cameras with building management systems to identify cars as they enter a complex. The potential applications are limitless, and to really cut through the hype around AI and start feeling its benefits, you have to start using it. ### How software solutions can be used to stimulate growth within hospitality With everything becoming more digitally focused thanks to technological developments, organisations are taking advantage of software solutions and other systems which streamline operations. Many operators within the hospitality sector, however, are still trying to do everything themselves when they could use technology programmes as a solution.
From an integrated kitchen management system and menu engineering, to customer loyalty and online payment tools, right through to staff training and providing an enhanced employee experience, introducing software will reduce the need for manual tasks and give hospitality operators the freedom to do more. A recent survey conducted by The Access Group in conjunction with The Caterer shines a light on the hospitality industry and attitudes towards business in 2019. Read on to find out how you can use technology, tools and apps to reduce admin time, streamline operations and free up time and resources for other areas. Reduce admin time More than 25% of hospitality operations spend a combined total of circa one day a week on administration. This nature of work hinders an employee's ability to focus on customer-facing activities, leaving less time to pay attention to guest experience, training staff, engaging front-of-house staff and driving customer loyalty. Digital systems automate time-consuming admin tasks, such as timesheets, manual file sharing or expenses, allowing key personnel to concentrate on more important tasks and functions. Certain software solutions also strip out old paper-based processes, allowing growing companies to expand without investing in additional administrative staff. Drive customer loyalty and enhance guest experience 33% of respondents have plans to implement customer loyalty apps in the next year, in an attempt to boost customer service. By implementing a system which allows you to identify purchasing patterns and customer habits, you'll have more visibility into how you can attract new customers and retain current ones. Providing your employees with a handheld tablet with an integrated system that allows them to immediately send orders to the kitchen will decrease waiting times and increase table turn. This will increase weekly revenue and, in turn, enhance profitability. The survey also revealed that, whilst improving overall profitability was the number one priority for the next 12 months for more than half of the respondents (62%), guest and employee experience also rated highly. A third of hospitality operations are planning to adopt technology which will aid improved data reporting. Integrating a customer relationship management (CRM) system with an ERP will help to highlight sales trends and uncover customers' needs and concerns, making areas for improvement apparent. Employee satisfaction and staff training Cloud-based workforce management solutions for restaurants, bars and other hospitality outlets make it possible to manage team members effortlessly, in real time as well as on the go. What's more, providing your employees with access to the essential information they need, as well as the ability to request and amend records, is a valuable perk that makes a huge difference to productivity. Whether it's recording absences, swapping shifts, viewing payslips or updating data, online systems that easily allow employees to do this are a credit to efficiency amongst your team. There are also systems out there which allow for staff training and development. This streamlines HR activities and is time efficient, as it allows managers to keep track of (as well as keep safe, confidential, and in one place) personal development needs for every employee, in addition to gaining a holistic view of all training activity across the organisation.
Kitchen management and menu engineering

Almost a third of respondents said that they are looking to improve customer service by introducing technology as part of their menu engineering programme. As part of enhancing their menus, increased attention will be given to providing allergen and calorie information. Satisfied customers are the key to any successful business; if wait times are long or popular menu items are constantly unavailable, you run the risk of damaging relationships with your customers. The good news is that there are solutions available that allow tracking of orders through to payments, which can be integrated with your inventory management systems. By incorporating relevant systems, hospitality organisations are able to improve overall profitability by freeing up employees' time, allowing them to devote more time to tasks and roles that will increase productivity. The responsibility for increasing productivity should lie with managers and business owners, not the employees, so ensuring they have efficient (and relevant) systems in place is advantageous. The technology which is currently available is giving the hospitality sector a much-needed boost in today's digital age.

### The Importance of Artificial Intelligence in Space Technology

Artificial intelligence has transformed many areas of our daily lives. From healthcare to transport, tasks usually carried out by humans are now being performed by computers or robots, more quickly and efficiently. What's more – AI is adapting. It's now bringing us closer and closer to the stars, with numerous benefits. Here's a closer look at the importance of artificial intelligence in space technology.

Why Do We Need AI?

Artificial intelligence allows computer systems to work with the intellectual processes of humans. However, these machines can carry out tasks more dexterously and efficiently than people. They can also enter hazardous environments, such as areas that require deep sea diving, making certain processes much safer. In short: AI is improving the way we work. It's not about replacing humans; rather, it's about changing our workplaces for the better. For these reasons, it's currently being utilised by a variety of industries, including:

Healthcare

AI can mimic cognitive functions. This makes it a perfect fit within healthcare, where many issues need addressing to make treatments faster, more effective and more affordable. One example of this is using artificial intelligence to build databases of drugs and medical conditions. This can help us find cures or treatments for rare diseases. AI is also being utilised to make healthcare more affordable around the world. The machine can mimic a doctor's brain, recognising how humans express their ailments.

Transport

Artificial intelligence is greatly improving the efficiency of the transport sector. It can help optimise routes, finding the fastest and safest journeys for different vehicles.

Manufacturing

Adopting AI in different factories is helping to:
- Make certain processes safer by automating them
- Improve engineering efficiency
- Reduce costs
- Increase revenue
- Plan supply and demand

The Value of AI in Space

Where does space come in? Artificial intelligence has enormous promise within satellite and space technology.

Global Navigation

Data collected by Global Navigation Satellite Systems (GNSS) can support AI applications.
Tracing, tracking, positioning and logistics are all areas that can be greatly improved due to precise and consistent data collection.

Earth Observation

Satellite imagery in conjunction with AI can be used to monitor a number of different places, from urban to hazardous areas. This can help improve urban planning, finding the best places for development. It can also help discover new routes, allowing us to get around more quickly and efficiently. Satellites can also observe areas of interest, such as deforestation, and the data collected by this technology can help researchers monitor and look after them. As well as this, satellites can be deployed to monitor hazardous environments, such as nuclear sites, without the need for people to enter. Data collected by space technology and processed by AI machines is fed back to the industries that require it, allowing them to proceed accordingly.

Communications

Satellites can transfer important information to AI machines, providing reliable and important communication. This can be used for traffic needs, for example. Satellites can collect data on congestion or accidents and feed it back to the machines. Artificial intelligence can then be used to find alternative courses, rerouting or diverting traffic where necessary.

What Does This Mean?

We haven't yet scratched the surface of unlocking the full potential of AI. This presents many exciting business opportunities for innovators to take advantage of. This could be in exploring the technical viability of new applications or exploiting new space assets, for example. Artificial intelligence in space technology can greatly improve our daily lives, from inside our workplaces to how we get around.

### Powering digital transformation with cloud communications

Adopting cloud-based unified communication and collaboration (UCC) solutions will be a relatively new strategy for many businesses, yet it's sure to be a highlight at UC Expo, returning to London next week for its 13th year. This demonstrates the history and strength of the market, with massive innovation and significant developments having taken place in that 13-year period. Further growth is predicted, with the market set to exceed $57 billion by 2024, and cloud-based UCC set for pronounced growth. Ahead of UC Expo, Divya Ghai Wakankar, Head of Cloud Communications Product Management and Innovation at BICS, discusses the benefits of moving comms to the cloud, and addresses the most pressing concerns of businesses and their customers. At present, North America is further ahead in terms of UCC than both the European and Asian markets. This doesn't mean that services are less readily available in these regions, or that there's less innovation: it just signifies the maturity of the cloud market in general in North America. Amazon's AWS and Microsoft's Azure were born in the USA, so the sector has had a headstart on the rest of the world. Cloud technologies and infrastructure have had more time to evolve, and businesses and enterprises are subsequently further along in their digitalisation journeys than their counterparts in other regions. There has also been a lack of awareness of UCC: of its benefits, what companies should look for, what they should do, and how it will benefit their business. The cloud is now considered a key aspect of most companies' digitalisation strategies, with many having already moved a number of key processes to the cloud.
No matter how far along they are in their roadmaps, companies large and small should now also realise the importance of including communications in their migration to the cloud. Why move to the cloud? First, moving unified comms infrastructure to the cloud, adopting cloud numbers and SIP Trunking (which supports inbound and outbound VoIP calls, messaging and videos, and supporting services like emergency calling, lawful interception and directory listing) with anti-fraud measures, gives businesses secure global coverage from a single supplier. This lowers the total cost of ownership, and therefore also reduces CapEx and OpEx, shaving significant amounts from an organisation’s telecoms bill. Sixty-two per cent of companies responding to a 2019 survey said that above all they want to work with a single cloud specialist and to establish a direct and meaningful relationship. Second, SIP Trunking fulfils multiple omnichannel requirements, allowing a business to stand out from the competition with advanced multichannel interaction capabilities with messaging and voice-enabled business numbers. In the future, this will also encompass rich communication services. In addition, many businesses are looking to enhance and augment their comms capabilities in order to support international expansion. However, growing your footprint has traditionally required a comprehensive understanding of the telecoms regulations within each and every country your company is looking to enter. Different regions will have different – and often strict – legal frameworks, governing things like competition, interconnection and pricing, licensing, and universal access and services. Cloud communications – and in particular, a unified, all-in-one UCC solution from a single provider – has changed this dramatically. Now, companies are adopting cloud comms as a means of easing global expansion, with regulatory compliance issues being handled by a single global service provider. What can cloud comms deliver? Migrating communications to the cloud will deliver similar results to those promised by digitalisation strategies more generally. As with other subscription-based, virtualised solutions, cloud comms services can be trialled and tested by an organisation before they have to make a significant decision and investment. Unlike on-premise equipment – which comes with all the hassle of lengthy installation and integration – organisations are able to use cloud comms solutions on a temporary basis, ensuring they are the right fit for their business, and deliver on their requirements. Agility is also crucial; expansion, and the rapid rate of digital transformation means businesses must be able to pivot, adopt new strategies, and embrace technologies with speed and ease. Cloud comms facilitates this, with solutions quick and straightforward to integrate, and 24/7 enterprise-grade support offered by the service provider. The cloud presents a high degree of flexibility and customisation; organisations can choose whether to self-manage, automate, or to take advantage of a managed service for things like cloud numbers, anti-fraud processes, new application integration and so on. Finally, moving telecoms functions to the cloud allows a company to replace multiple, disparate systems (managed or delivered by multiple different providers), and instead consolidate new, enhanced services into an all-in-one package. 
This streamlines things like billing, customer service queries, and ease of use, while at the same time delivering high-quality, unified messaging, voice and video services. This could be a high-definition video conferencing facility for a multinational corporation, or local, toll-free, cloud-based numbers for a small business wanting to expand its pre- or post-sales customer support presence. I'm sure that all of these benefits – as well as emerging technologies and use cases – will be top talking points at UC Expo. Whether you're attending, are already far along your transformation roadmap, or feeling hesitant about major change: moving communications to the cloud should be viewed as the next logical step in any digitalisation strategy.

### AI Tools to Plan Your Content Strategy

No matter the scale or industry a business operates in, original content is what matters the most in the eyes of would-be stakeholders. Whether you work in fintech, IT or as an independent content creator, the artificial intelligence (AI) boom has something to offer for everyone. After all, potential customers and B2B partners will get their first impressions of your brand through the content strategy you employ. According to Martech Advisor, only 30% of companies plan to integrate AI in their production and sales departments by 2020, with only 17% of email marketers planning for AI. This leaves a lot of room for companies such as yours to pioneer AI content strategy trends for others to follow. With that in mind, let's take a look at some of the most prolific and useful AI-driven content strategy tools which can improve your industry authority and subsequent revenue.

1. Node

A great way to kick off your content strategy is to find out who your target audience really consists of. Node is an AI tool dedicated to content analysis across the internet. It effectively draws parallels between content, eCommerce stores, user engagement and other parameters to discover your optimal customer. Once Node extrapolates your data, it can be used to come up with new content, topics and marketing materials for upcoming campaigns and sales initiatives. Node also has AI features such as sales and product development assistance, as well as content engagement monitoring and analysis.

2. Market Muse

The most important part of differentiating your brand from the competition is to produce and publish new content. However, original content will only get you so far without SEO optimization. Market Muse is an AI-driven content strategy tool with an emphasis on cloud-based content research. Depending on your niche, the tool will analyze your content in relation to other posts and articles already published on the web. It will then proceed to advise you on how to edit, fix and build on the first draft in order to get more traffic and leads from each piece of content you produce.

3. Import.io

Social proof remains one of the most important elements in convincing stakeholders to convert into customers. Import.io is an AI tool built with data gathering and visualization in mind. You can export any form of data from the web and shape it into infographics, charts, graphs or other data visualization forms. This data can be integrated into written posts and articles to further prove your claims and to build your brand's reputation as a professional business.
Import.io features a plethora of AI features including sales research, risk analysis and integration with other content strategy tools. 4. Cortex Publishing content on social media platforms such as Facebook, Twitter and Instagram can provide you with new leads and untapped popularity. However, finding the best channels and timing for your content strategy isn’t as simple as that. Enter Cortex, an AI social media analysis tool built with content tracking in mind. You can set Cortex to analyze your competitions’ social media pages and track their efficiency, content types and user engagement. The tool can then extrapolate the best types of content for you to use and advise you on the optimal post length and publishing times. You can keep a close watch on your and other social media pages with Cortex and its AI algorithms to maximize the effectiveness of your content strategy going forward. 5. One Spot Content personalization is an essential part of delivering the right content to the right customers. However, knowing how to address or how to shape your content to certain demographics can be difficult without factual data. This is where One Spot comes in with its AI content personalization capabilities. You can use One Spot to format your content so that the right messages reach the right audiences through segmentation and personalization. One Spot can be used in content strategy creation as well as email marketing depending on your needs. Once you make your content more relevant to an individual reader or customer, traffic and engagement rates will spike as a result of your user-friendly content strategy. 6. Stackla Integrating user-generated content into your content strategy is one of the best ways to build positive relations with your audience. Stackla is an AI tool built with social media content analysis in mind. It scours social media platforms for user-generated content based on the custom search filters you set beforehand. You can reach out to individual stakeholders and offer to feature their user-generated content on your future posts or marketing materials. Stackla also features social event and eCommerce AI analysis options which can help you discover even more content for future publishing. 7. Atomic Reach Even if you produce high-quality content, its effectiveness will drop over time due to shifts in trends and SEO. However, AI tools such as Atomic Reach can be used to monitor your content and audit it on a per-case basis. Atomic Reach will inform you of shifts in SEO, industry trends and whether or not your content is still performing as expected. If not, it will guide you through an optimization process in order for you to improve those pieces of content which are performing below expectation. This is a perfect tool for businesses which produce a plethora of content strategy materials for their online platforms, especially since it automates a lot of otherwise tedious monitoring activities. In Conclusion Cloud-based AI tools for content strategy optimization are the perfect addition to your production arsenal no matter the scale at which you operate. Make sure to combine different platforms in order to maximize your productivity and reach in terms of new stakeholders and revenue streams. Don’t settle for manual content production and optimization – think smart and use every opportunity to build on your brand’s foundations. 
Over time, the right audience will find its way to your content and your AI tools will work even better thanks to their monitoring and analysis capabilities.

### Teaching an Old System New Tricks

It's time to rethink the approach to legacy. Digital and cloud are now familiar, 'business as usual' terms in the modern enterprise. Put simply, organisations that don't 'do' digital – and even those that simply don't do it well – are heading for failure. Adopting a 'digital-first' mind-set and migrating core services to the cloud is at least relatively straightforward for young, agile or digital-native businesses – but how grim is the prospect of a cloud overhaul for businesses that have taken the time to invest heavily in legacy on-premise systems and applications? It would be absurd to peddle the fallacy that a boldly undertaken 'rip and replace' strategy for the myriad technological and organisational legacy applications is the secret to survival. Digital change is an urgent and pressing need – but that doesn't mean it has to be done overnight. A considered, gradual approach seeking effective marginal improvements can be an older, more established business' secret to success.

Don't demand change - enable it

Whatever the field, keeping up with an industry's most agile, digital start-ups is a serious and daunting challenge. Many of these businesses were established with the sole purpose of disrupting or challenging the 'old' ways of doing things. Embracing new ways of working to survive and fight back is necessary, but it immediately establishes significant cost and implementation hurdles. These hurdles often conjure tensions between what the business wants and what the IT department is able to deliver - shifting ambitions, restricting growth and impacting performance. Migration to the cloud, for example, has been a strategic priority for most businesses for years, but there is still a wariness about adopting it at scale. IT teams are asked to deliver the advantages of cloud without compromising on security, while still keeping options open for the entrance of new technology. It's not surprising, then, that trying to meet each of these needs at once can prevent effective progress. What is required to enable change is to alter mindsets. IT teams need to feel empowered and work towards a common goal of updating systems regularly, as opposed to having their arm twisted whenever the CEO takes a liking to a new piece of kit. If this can be achieved, transformation becomes organic and it becomes possible to harmonise legacy investments with new digital ones, unlocking the value inside both.

Think future ambition first, technology second

As with the release of every new consumer smartphone, there is always excitement about the 'latest' and most exciting technology, even if the practical applications, benefits and reliability of that technology are still far from proven. Adopting a strategy aimed at securing a new technology into a business is a very limited, and indeed risky, approach. An investment in new technology can often be 'change for change's sake' – rather than a legitimate exercise in becoming 'future ready'. Organisations of all sizes must create a focus that considers the ambition and outcomes first and then asks what is specifically needed to deliver them. Treating change as a process of continuous improvement, and not a single major event, is the secret to iterative development and improving on legacy strengths.
By allowing time and space to explore new options and think differently about ‘driving change’ as well as offering the opportunity to make ‘digital’ happen in incremental steps, it is far easier to bring relevant teams on board and secure the business-wide backing needed to fully succeed. This incremental approach to change builds confidence with small project successes gradually building to a wider take up of other digital solutions. Although cloud migration is almost always a key component of digital transformation, it is also important to recognise that not everything can or should go to the cloud. Seeking out which digital ‘veneers’ can smooth over current IT assets to deliver a specific experience or a result more efficiently is extremely valuable - not least in ensuring the value of existing IT systems. Some of these steps may be very minor, but all lead towards a better, improved future. If organisations are looking for new ways to make existing legacy systems work harder to deliver impactful, exciting customer centric services and drive greater value while also considering retiring redundant systems or rebuilding IT platforms in the cloud – the most important step on that long journey is the first. They must take the time necessary for deep thought and strategy about harmonising legacy systems with new digital capabilities.[/vc_column_text][/vc_column][/vc_row] ### Pure Storage - Robson Grieve [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/335103228/ecfc83c94e[/embed] [/vc_column_text][vc_separator][vc_column_text]Watch as we are joined by Robson Grieve, Chief Marketing Officer at Pure Storage. In this show, Robson talks about what his role entails and what Pure Storage are doing in terms of cloud and innovation.[/vc_column_text][/vc_column][/vc_row] ### The Opportunities and Challenges of Embracing Relevance in the Channel [vc_row][vc_column][vc_column_text]It is without a doubt that channel companies need to fully understand their customers’ needs when it comes to deploying a customer-centric business strategy and model. If the requirements and demands of customers are not met, businesses can face a variety of challenges. In order to make a business model relevant for customers, a company’s approach needs to be solely focused on their challenges both today and moving forward. If this is done properly, the company will become more aligned to its customers, building longer term strategic partnerships. There can be many challenges associated with moving towards a customer-centric model. Due to this, some companies in the channel have no plans to move towards such a model. There are a variety of challenges that companies can face, from employee training and recruitment, to difficulties understand customers’ needs. According to our recent research, over half of companies surveyed in the channel found it difficult to fully understand customers’ needs, whilst also finding challenges when it came to training existing employees. Additional training and development of existing employees is needed when adopting a more relevant approach for customers. In fact, when we asked decision-makers in the channel about this, 36% of respondents reported that training and development is a key challenge with adopting a customer-centric model. New team members may also need to be recruited, which can prove difficult when companies are transforming their models to focus on their customers’ needs, due to having to teach new employees a blend of the old and new business models. 
Our recent research showed that 30% of businesses consider recruiting new employees to be a challenge they face in adopting a more relevant business model for customers. In fact, 61% of channel businesses said that they thought moving to a customer-centric model would have some impact on their recruitment strategies, with 23% considering this to be a big impact, and 38% anticipating a small impact on recruitment. However, nearly a quarter said that moving to a customer-centric model would have no impact on recruitment. Another challenge which may be taken into consideration is the financial considerations involved in changing business strategy. This is an unsurprising factor, as finance is arguably the underpinning driver of most business decisions. Anyone involved in the channel would advise businesses to ensure they are collaborating and working together with other companies when it comes to moving to a customer-centric model. This enables the company to identify and build a relevant partner network to operate in and meet the demands of customers. Working alongside other businesses will also help companies to be more innovative and offer more for their customers. There are some companies within the channel that see no barriers or challenges associated with operating in a customer-centric model. As a result of this, many businesses have already moved to focus more on customer requirements and taking the partner-led collaborative approach to business. This could be because the benefits outweigh the minimal challenges significantly. Companies that are already operating in a customer-centric model report seeing an increase in sales of up to 20% due to focusing more on customers, as shown in our own research into the channel. With sales increasing, businesses are only going to grow further and faster. Evidently, this can show that there are a variety of benefits for companies moving to, or operating in, a customer-centric model. Operating such a model allows the company to give extra value to customers, which is essential in today’s market. In turn, increasing value for customers will mean that businesses can further increase customer loyalty and reliability, meaning they will continue to develop and strive continuously for customers. On the whole, it is clear that at the moment and into the future, more and more channel companies are making the move towards a customer-centric, as-a-solution approach. With today’s landscape proving to be unpredictable and changing almost daily, channel businesses are now expanding their companies across the globe and embracing collaboration and innovation. It is evident that the need to focus on customers in order to increase stickiness and loyalty has never been more essential than it is today, and if this doesn’t happen companies may cease to remain relevant for customers and shrink in business growth.[/vc_column_text][/vc_column][/vc_row] ### Will AGI redefine human intelligence? [vc_row][vc_column][vc_column_text]Would you be willing to read a novel written by a computer, or listen to a song composed by a robot? As artificial general intelligence (AGI) continues its rapid growth into all areas of society there are strong hints, aspiration even, that it could overshadow human intelligence. Are the aspects of the human mind that make people unique under threat from a technology that we have created and programmed with the aim that it will learn and adapt itself to new levels that we cannot anticipate in advance?   Let’s first consider intellectual property. 
Novel writing algorithms could lead to a situation where some of what is widely considered to be the best literature of the year is written by a computer. The Hoshi Shinichi Literary Award is a Japanese short story competition; of the 1,450 entries in 2016, eleven were said to be written by 'non-humans'. One line from a story co-authored by AGI was profound, considering the situation: "The day a computer wrote a novel. The computer, placing priority on the pursuit of its own joy, stopped working for humans." While one assumes that this line was written by the human author, there is still a lingering feeling that a computer could have this awareness. This would mean literary awards judges will have a significant decision to make. The question is, does it matter who created the literature? If the story is good and the composition regarded at its notional best, surely the creativity is what matters? Computers can now create what appear to be famous paintings with impressive accuracy; conversely, algorithms have been created to identify fakes by humans. Computers can flirt and write jokes, and it seems that with every year that goes by, the differences between humans and computers decrease. The close proximity between human intelligence and AGI has led to questions about what happens when AGI eclipses us. One of the most interesting settings for this debate will be in arts and culture: how easily will AGI authors and musicians be accepted, or will they be rejected outright? The art critic Martin Gayford wrote, "The unresolved questions about machine art are, first, what its potential is and, second, whether—irrespective of the quality of the work produced—it can truly be described as 'creative' or 'imaginative'." If people decide that all work by AGI is not creative because it came from a computer programmed by a human, then a world of artistic possibilities will be lost. Surely, though, this means that it is at its 'heart' still human, as we created the algorithm. Will customers care that what they are reading is created by a machine, using vastly different techniques than people use? Perhaps more interesting, then, is to ask who would gain credit for the work: the computer, or the human(s) who created the AGI? Mark Zuckerberg has predicted that in a decade AGI will start outperforming humans, with Facebook trying to build AGI with better perception than people. This relates to some of the basic human senses, like seeing and hearing. I venture that in many (highly specific) scenarios this is already the case. The main way AGI will redefine human intelligence is by making a lot of it irrelevant; many workers will be looking over their shoulders, not at robots, but at computers. If AGI is more perceptive, that does not mean that it is better, and Zuckerberg made sure to calm the human population by confirming that computers will not be "thinking or generally better [than humans]". For example, a computer may be programmed to possess the ability to beat a person at chess, but the computer cannot stop the game and write a letter about its style of play. The same computer cannot beat a human at backgammon either; for this to happen, it would need to be reprogrammed by a human first. What we are waiting for, and what seems a long way off, is a computer programmed to learn by itself from basic principles as we do, given, as we (humans/animals) are, some critical base pieces of hardware and algorithmic software.
Then, put into the wild with teachers (humans to start with) and allowed to grow from baby steps, will the computer truly be in a position to go beyond human capability, learning and mastering chess, then moving to the backgammon table and mastering that too? One of the most unique creations of human intelligence, written language, must also be adopted by AGI, and as you've seen above, in many respects it has already started. There have been reports of AGI programmes being shut down because the AGI bots created their own language; these stories have been proven to be wide of the mark and somewhat embellished. However, AGI creating its own language could be the key to its success as a new kind of intelligence in the future, and this seemingly scary prospect could be the point where AGI overtakes human intelligence. AGI with its own 'self-taught' method of processing information will not only be seen as the moment it took over from human intelligence, but also the moment people begin to fear how safe it is to keep developing AGI. One of the most uncomfortable aspects of AGI for some humans is that what appears to make us unique will no longer be the case. Humans suppose that they could understand aliens from another planet creating art and music, but the same creativity coming from technology that is controlled and programmed by humans will leave many people feeling uneasy. Taking the positive stance, another question we could be asking is: what is it that we really fear? If AGI does redefine human intelligence, surely that will also allow humans to reach levels they otherwise would never have attained. It could be seen as a new era for humanity where we finally transcend hate and inequality through common purpose and greater understanding of ourselves.

### NHS, technology and OPEL: the systems used in A&E departments

It's impossible to miss how the NHS is portrayed in the news every year. The term 'winter crisis' makes headlines, and A&Es are often put in the spotlight. As the first port of call for emergency care, A&E departments face a difficult challenge. They get the bulk of critical patients – while also dealing with non-critical, trivial or non-emergency cases – plus also bearing the brunt of issues that are prevalent across the NHS. At a very basic level, this includes a lack of staff, funding and resources, but at a wider level, the issues in the NHS are multifaceted.

Introducing 'Crisis Point: A Day in A&E'

Some of those issues are highlighted in a new game created by specialist lawyers Bolt Burdon Kemp. The 'Crisis Point: A Day in A&E' game simulates day-to-day life as an A&E staffer. The player undertakes two 12-hour shifts, under the guise of interchanging members of A&E staff. The mission? To make the right decisions so you maintain a high in-game status level, with Level 0 representing the best score and Level 3 representing the worst performance. Every decision you make affects the status level – choose poorly and you may see your status level steadily dip from green, to yellow, to red and finally black. While playing, you gain an insight into the numerous patient care, staffing and triage issues that contribute to the problems within the NHS.

What is the OPEL framework?

What's interesting about the simulation is that the in-game status levels and colour coding are based on an actual framework used by the NHS: the OPEL framework.
The Operational Pressures Escalation Levels (OPEL) Framework was introduced in October 2016 in hospitals across England. It was designed to replace the older system of ‘red’ and ‘black’ alerts and to introduce a centrally-agreed set of definitions and guidelines for NHS Trusts to use to determine if they’re under undue pressure. Similar to in the game, the NHS uses 4 OPEL levels. Level 1 represents optimal conditions – the patient flow can be maintained well, and the hospital is able to meet anticipated demand without issues. Levels 2 and 3 respectively signify increasing signs of pressure, with the hospital at risk of not being able to meet patient demand. The lowest Level 4 represents a major crisis (the equivalent of the old ‘black’ alerts) and urgent action will need to be taken. The OPEL system gave clarity to hospitals on three major factors: Better identification of problems, at the cusp of it happening, rather than fighting issues retroactively Clear guidelines to allow a better understanding of the actions that need to be taken to prevent problems worsening Giving an overall picture of how the hospital is performing, including on a national level For example, in January 2017, the system made it possible to discover that 52 NHS Trusts reported an OPEL 3 and experienced serious pressures that compromised patient flow. What is the four-hour government target, and why was it ineffective? Another factor explored by the game is the four-hour A&E waiting time target set by the government. The target stipulates that 95% of patients must be seen, then treated, admitted or discharged within four hours of entering A&E. Each patient is given a colour coded indicator level that steadily shifts depending on how long they have been in the department. Unfortunately, while the target made a big difference when first introduced in 2003 – waits of over 12 hours were common before the target came into place – it eventually became ineffective. The focus shifted from patient care to staying within target and, ultimately, around a fifth of all patients were admitted in the final 10 minutes before the deadline. It’s no wonder, then, that in 2018, the government scrapped the target until April 2019; it is currently under review with a new system potentially set to replace it. While some systems have been better for the NHS than others, the use of technology in A&Es should continue to be explored. The potential for tech to monitor patients – beyond their vital signs – can only be realised by critically evaluating the current systems within the NHS. After all, there are plenty of technologies that have the potential to be developed or adapted to help improve patient flow and eventually prevent the ‘winter crisis’ from continuing to be part and parcel of life in A&E.[/vc_column_text][/vc_column][/vc_row] ### Enabling agile embedded integrations in SaaS applications [vc_row][vc_column][vc_column_text]As a developer or manager of a SaaS platform, you will make certain decisions about what non-core functions to outsource to third party services. Why write your own billing or email delivery systems when there are excellent solutions available. Similarly, you shouldn’t have to worry about connectivity between SaaS applications either. The current issue is that the creation and maintenance of integration with other apps and services is a developer-intensive task, which doesn't scale well without additional or dedicated resources. 
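To make that maintenance burden concrete, here is a minimal, purely illustrative Python sketch of what a single hand-coded point-to-point integration often looks like. The endpoints, tokens and field names are hypothetical, and a production version would also need retry logic, rate limiting and handling for every schema change in either application.

```python
import time
import requests  # assumes the third-party 'requests' library is installed

# Hypothetical endpoints and credentials, purely for illustration.
CRM_API = "https://api.example-crm.com/v1/contacts"
MAILER_API = "https://api.example-mailer.com/v1/subscribers"
CRM_TOKEN = "..."     # placeholder
MAILER_TOKEN = "..."  # placeholder

def fetch_new_contacts(since_ts):
    """Poll the source application for contacts created after a timestamp."""
    resp = requests.get(
        CRM_API,
        headers={"Authorization": f"Bearer {CRM_TOKEN}"},
        params={"created_after": since_ts},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json().get("contacts", [])

def push_subscriber(contact):
    """Map the source record onto the target application's schema and push it."""
    payload = {"email": contact["email"], "name": contact.get("full_name", "")}
    resp = requests.post(
        MAILER_API,
        headers={"Authorization": f"Bearer {MAILER_TOKEN}"},
        json=payload,
        timeout=10,
    )
    resp.raise_for_status()

if __name__ == "__main__":
    last_run = 0
    while True:  # naive polling loop; retries, rate limits and API changes are all on the developer
        for contact in fetch_new_contacts(last_run):
            push_subscriber(contact)
        last_run = int(time.time())
        time.sleep(60)
```

Every new pairing of applications means another script like this to write, host, monitor and update whenever either API changes, which is exactly the scaling problem described above.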
Integration solutions exist, but how do you go about finding the best for your business? APIs are just the first step The next generation of API users are not programmers: they’re business users who understand their data and how to make better use of it. So how do you best meet their integration needs? While it might be tempting to let a third party take on the solution delivery task, in the form of a developer building bespoke integrations or an external third party integration platform, not only does this break your relationship with your customer (and potentially expose them to competitor services), but the third party makes 100% of the revenue on your API, paid for by your customer. Embedded SaaS connectivity is here An embedded integration platform (or embedded iPaaS) for SaaS companies is a relatively new proposition that resolves these challenges, enabling you to deal with your customers in a scalable, agile, fashion. With an embedded iPaaS, the creation of integrations is split into two parts: the code and the solution. The code creates the capability for application A to communicate with application B. Every SaaS company that wants to self-create a direct integration starts by understanding the third party application ‘B’ and coding the communication capabilities. An embedded integration platform removes that need to code, as it creates (and maintains) the building blocks, which means that this part is ‘out of the box’. The solution is where APIs are instructed to interact to achieve a real-life objective (e.g. data movement or trigger-based action). Here, the embedded iPaaS will power the creation of the solution (using a GUI interface) but you, as the SaaS company, own and create the solution itself. It means that SaaS connectivity becomes another service (like cloud infrastructure, documentation, messaging, helpdesk and billing) that you can drop in and build your business on, with a visual toolkit that can be used across a wide range of teams. An embedded integration platform, like Cyclr, enables you to rapidly deliver more direct integrations in an agile form, meaning you can adapt and change them quickly yourself. One of the main benefits to this is that it requires less engineering overhead than if you were to create and manage them from scratch. You can focus on building your product, while simultaneously benefiting your API as it is moved from the cost side of your business to the revenue side. Launching integrations becomes a repeatable process which is no longer solely in the developer’s realm.  All or nothing? Although an embedded iPaaS solution is powerful, you may not want to use it to deliver every possible combination of integrations that your end users require. Inevitably, over time, you will develop a strategy for how you handle an individual integration request depending on a combination of strategic value and likely demand. There are three main choices in how you deliver an individual integration: Build – where you start from scratch and write code to achieve each integration; Design – where you design integrations using an embedded iPaaS solution within your platform (like Cyclr); or Outsource – where you deploy a ring-fenced solution that achieves the customer’s integration off-platform (like Zapier or IFTTT); The best solution is usually the one that combines two of the above to give maximum coverage. Why does this provide a better solution? 
Principally because it is better to resolve the key integrations natively (Build or Design) and the long-tail off-platform (Outsource). Choosing the right integration combination for your SaaS Here are six key questions to help you decide what you should do: 1. Q: Are my customers B2C or B2B? A: B2B = build or design, B2C = build or outsource   2. Q: Is it important that I resolve my customer integration at ‘pain point’? A: yes = build or design, no = outsource   3. Q: Is my application lightly or deeply enhanced by third party applications? A: lightly = design or outsource, deeply = build or design   4. Q: Are integration requirements consistent across all 80%+ of my customers? A: yes = build or design, no = design or outsource   5. Q: What is most important, the source code or the end-customer solution? A: source code = build, end customer solution = design   6. Q: Are my customer’s integration needs relatively fixed or do I need agility? A: fixed = build or design, agility = design or outsource If you answer all six questions you will see a general weighting towards build, design or outsource. Armed with this knowledge you can look at your platform, and your platform’s users’ needs, to work out the combination that’s best for you.[/vc_column_text][/vc_column][/vc_row] ### Decentralised Exchanges in the age of AI [vc_row][vc_column][vc_column_text]How intelligent is AI? We are in the midst of a third major revolution. The information revolution has the potential to change society as profoundly as did the agricultural revolution and the industrial revolution in the past. A notable difference, however, is that while the impact of those revolutions took centuries to be felt, the information revolution is transforming society within a few decades. We have moved from the first PCs to a world where everyone is connected all of the time in thirty years. Today, one of the more pronounced impacts refers to the machines' intelligence. We live in an era of rapid sophistication of autonomous agents -- they are precisely the carriers of what we call Artificial Intelligence (AI). The scientific foundations have been laid some time ago, but only recently has the technology begun assuming a critical role in the everyday life of ordinary citizens. Famous examples start with IBM's chess player Deep Blue and Google's AlphaGo, which were very successful in the confined environment of a game, and span a wide spectrum of breakthroughs in medical diagnosis and legal judgments, up to self-parking and fully self-driving cars. If you are still hesitant of a robot offering a diagnosis or taking a legal decision, think of the amount and variety of experience a smart mechanical agent may possess, as well as the speed in which it assimilates new information. When it comes to more mundane matters, as AI and Machine Learning gain traction in our daily lives, smart agents may run essentially all microtransactions and everyday tasks. Ideally, autonomous agents shall be very efficient in learning and refining their understanding of the world based on their own experiences, or from the feedback provided by humans. A related aspect nowadays is, of course, the availability of massive data which, by means of Machine Learning, aspires to improve the quality and robustness of autonomous decision-making. 
The aforementioned examples were cited in order of increasing difficulty, going beyond structured problems and predetermined environments to heterogeneous and unpredictable settings, where making decisions and judgments involves the complex nature of human agents. In the latter realm, a concrete question, which still represents a challenge even for advanced AI systems, is how to enable the smooth functioning of decentralized communities. Let us focus on the case of Digital Asset Exchanges.

Decentralized Exchanges disrupt digital transactions

After the first decade of the digital asset marketplace, a number of issues hinder wide adoption of cryptocurrencies: malicious hacks, technical traps, unreliable governance, stolen stakes, or even flat-out fraud. In particular, looking at exchange platforms, the digital equivalent of a stock market, some major centralized exchanges have pulled back dramatically, typically because of software attacks that made users lose money. Therefore, it seems inevitable for seasoned investors as well as everyday citizens to turn to Decentralized Exchanges (DEXs) that leverage the power of blockchain technology: the exchange platform itself becomes transparent, and the funds are held by each user in their own personal digital wallet. As in so many domains of today's activity, one chooses to rely on peer-to-peer transactions rather than put trust in some unknown, and often dubious, centralized authority. In a nutshell, a DEX is a distributed exchange that emphasizes security, privacy, liquidity and speed in a permissionless, decentralized framework. Besides having no single point of failure, DEXs have a reduced risk of server downtime. DEXs rely on development and governance implemented by a Decentralized Autonomous Organization (DAO), which is managed by the community according to a constitutional code of conduct. This structure spreads decision-making power evenly among the DAO's members, thus ensuring that the interests of all voters are adequately represented. As diversity increases and the system matures, it becomes increasingly autonomous, healthy, and self-sustainable, hence the increasing relevance of an autonomous agent that learns to safeguard the constitutional code by implementing the principles of Machine Learning and exhibiting the virtues of AI, in particular lack of bias. A particular example is when the DAO discusses projects that ask for funding, and the autonomous agent must first confirm that they are congruent with the core values and goals, in other words the constitution of the DAO. Hence AI is the key that shall lead a DAO to self-governance. However, a major challenge here is to seamlessly handle the conflicting goals of the human agents.

The dawn of Collective Intelligence

The power of AI in structured problems and predetermined environments is well established; some major showcases were already discussed above. But assisting a DAO is a significantly harder task because, like driving cars or making medical decisions and legal judgments, it involves balancing the human factor, learning while the behaviour of actors may be selfish or conflicting and, finally, managing human agents with personal, and often selfish, agendas. This is today's major challenge for AI in supporting the autonomous operation of decentralized communities. Game theory in the past couple of decades has focused on the computational hardness of synchronizing such groups of agents, in particular when they wish to engage in transactions.
It has demonstrated the so-called Price of Anarchy, in other words the overhead incurred by the overall system through lack of coordination among its members. Enter Collective Intelligence, where groups of individuals act collectively in an intelligent manner. This of course includes software, people, or agents that switch between the two states over time. It is precisely the case of DAOs, but also of many other instances of human-machine interaction. Eventually, the goal is to seamlessly integrate machine and human intelligence at a large scale in order to yield so-called mixed-initiative systems, one of the holy grails of AI. In this respect, the current efforts of the community target technology that is both intelligent and sensitive in such environments, where autonomous and human agents coexist. The practical aim is to maximize the Machine Learning benefit emerging from interaction with human insights, failures, and struggles. This project lies at the heart of our work at Volentix (https://volentix.io/), where we are supporting an integrated ecosystem to provide the infrastructure for the whole lifecycle of a digital asset, centered around a DEX and managed by a DAO, assisted by AI in the framework of Collective Intelligence. In particular, Volentix develops the decentralized exchange platform VDex, which employs a collection of smart EOS.IO contracts to establish quick and secure transactions, liquidity, and user anonymity, as well as its own digital KYC tool called Blocktopus. All of these aspects rely on AI in order to gain self-governance, since they are eventually managed by a DAO, hence the need to combine smart agents with collective intelligence. The future is bright but we do have some work to do.

### Enhance your Mobile Ecosystem with Cloud Computing

Cloud computing has profoundly changed the way entrepreneurs and developers view successful mobile app development. Cloud app development has made such applications far easier to access, through service models like software as a service, platform as a service and infrastructure as a service. App development companies are keen to integrate cloud infrastructure, as businesses increasingly request it to enhance the mobile ecosystem around their apps. That said, many developers are still not using cloud technology to create their applications. In this article, we will look at some of the new opportunities that cloud technology brings to the world of web and app development.

❏ App Monitoring

You can be completely content with all of your internal tools, off-the-shelf analytics engine and in-house data center, but what happens if the system goes down when everything is running in-house? Monitoring your mobile application's uptime globally, and reporting whether your system has been down for two seconds, two hours or more, requires an external system, and no one is better placed to provide it than your cloud provider, with systems designed with reliability and failover in mind.

❏ Development

Much is said about the benefits of leveraging the cloud at runtime, but many companies are still leery of full-scale deployment in the cloud, which leads them to take small steps by using the cloud for less critical functions first.
Mobile development teams that manage their code and use arbitrary test data in the cloud will not cause any embarrassment if that data is leaked in the event of a service provider breach. Developers writing code are also catered for, with tools that serve these cloud-exploring pioneers, such as cloud-based issue tracking systems, managed source code systems and load testing tools.

❏ Hosting Services

Amazon, with its IaaS (Infrastructure as a Service) EC2 platform, largely pioneered the idea of organizations handing off the job of application hosting to the cloud. But Amazon is only one player in this field, and IaaS offerings like EC2 are pushed aside when it comes to the various Software as a Service (SaaS) options. Here, clients surrender a certain amount of control over their operating system runtime environment in exchange for less configuration and lower administrative overhead.

❏ Payment Gateways

What makes Apple and the iTunes store so successful is their straightforward, easy-to-use payment systems, which have made success routine for suppliers of iPhone and other iOS-based apps. With the mobile market becoming mature, many mobile app developers are criticizing Apple's financial model while looking towards alternative payment mechanisms. Several cloud-based providers have emerged, making it easy for app developers to perform financial transactions with their clients without worrying about lost sales due to reliability issues or software bugs.

❏ Web Analytics

Mobile app developers have taken the principle of 'know your customer' to a different level. Mobile apps often have fewer screens and are more focused in terms of features. This highly focused nature allows developers to quickly turn web analytics into product enhancements. In an effort to make their products better, more mobile apps are leaning on cloud-based services in order to capture, store and render information about users' interactions.

The Road Ahead

We can definitely say there is something about mobile development communities that makes leveraging the cloud a natural fit. The short lifecycles of mobile development projects demand the ready-made services provided by so many cloud vendors. Mobile app development companies also tend to be more adventurous and disruptive than their enterprise counterparts, and more willing to try something new like a cloud-based IDE or monitoring tool. Whatever the reason, it seems that the mobile community is fully embracing the cloud, which is making both the cloud and mobile computing communities stronger together. Keep learning!

### How retailers can harness tech to battle the crisis on the high street

It is no secret retailers face a challenging environment on the high street. The national news is frequently awash with stories of store closures and profit warnings, with huge household names including John Lewis, LK Bennett and House of Fraser significantly affected. The retail sector's most recent bellwether event, Christmas, offered little respite. Instead, it portrayed a mixed picture. Aldi saw like-for-like sales over the festive period increase 10 per cent, but those fortunes were not mirrored by Mothercare, which saw like-for-like sales fall 11.4 per cent.
Sainsbury’s, Marks & Spencer and Debenhams all saw sales over the Christmas period fall from the year before – and HMV fell into administration for the second time in six years. Often, the rise of online giants like Amazon is blamed for the woes of bricks and mortar retailers on the high street. The success of these disruptors has led to growing numbers of consumers shopping online, lured in by convenience, lower prices and a more personalised experience. Yet while traditional retailers face a testing environment, those decrying the ‘death’ of retail and our high streets will not necessarily be proved right. Across the retail sector, greater attention is being given to digital transformation and the possibilities it can bring. A recent survey by IT Pro revealed that 66 per cent of retailers considered digital transformation to be crucial. These attitudes suggest many bricks and mortar retailers are poised to make a step change and adopt technology that can help transform their fortunes. Those that do will find themselves best equipped to fight back against the crisis on the high street. Digital transformation - where to start? As high street retailers try to turn the tide against a startling concoction of falling profits and store closures, innovative tech offers a lifeline. Increasingly, retail executives look towards digital transformation as a conduit through which they can ensure their brands remain competitive. They are alive to the rich potential that customer data can hold - and want to implement systems that will allow them to gain transformative business insights. They are intrigued by the ability of machine learning to create accurate sales forecasts and predict future patterns of demand. They can see how new technology can help present a more compelling offering to customers. What they can’t always see, however, is that to realise these aspirations they need to take some formative steps on the road to digital transformation. For bricks and mortar retailers looking to realise the potential of advanced data analysis and machine learning across their business, the first step is to adopt cloud infrastructure. One reason for this is because the collaboration cloud infrastructure facilitates is key to creating a unified and effective data strategy. To gain meaningful insights from their data, retailers need to have one data set. A data lake that stores data collected across different teams, from which meaningful dashboards and visualisations can be created to uncover genuine business insights. Without this in place, retailers run the risk of holding fragmented data, siloed by the teams that collect it and of limited use to the business as a whole. It is also important to get the correct infrastructure in place from the start, to make sure it can meet the demands that will be placed upon it. IT infrastructure needs to be able to cope at peak times - handling high volumes of traffic. It also needs to provide storage for vast volumes of data. Investing in the scale of on-premise infrastructure needed to achieve this can be prohibitively expensive. It may also constitute a poor return on investment if it is only required to operate at close to peak capacity occasionally. Maintenance, too, can prove expensive. In contrast, the flexibility offered by subscription-based cloud infrastructure, such as Google Cloud Platform, allows infrastructure to scale in line with fluctuating patterns of demand. 
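How that demand-based scaling decision works can be pictured with a small, hypothetical example. The sketch below is not GCP's autoscaling API; it is a simplified Python illustration of the choice an autoscaler makes, with an assumed per-instance capacity standing in for the measured CPU, latency or queue-depth signals a real service would use.

```python
import math

def target_instance_count(requests_per_second: float,
                          capacity_per_instance: float = 200.0,
                          min_instances: int = 2,
                          max_instances: int = 50) -> int:
    """Decide how many servers to run for the current traffic level.

    capacity_per_instance is an assumed figure for the requests one
    instance can comfortably serve; real autoscalers act on measured
    metrics rather than a fixed constant.
    """
    needed = math.ceil(requests_per_second / capacity_per_instance)
    return max(min_instances, min(max_instances, needed))

# A quiet weekday versus a seasonal spike in traffic:
print(target_instance_count(300))    # -> 2 (the floor keeps a safety margin)
print(target_instance_count(9000))   # -> 45 (scales up towards the configured cap)
```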
For example, infrastructure can be expanded at peak times, like Black Friday, and then scaled down after automatically. The ability to scale cloud infrastructure alongside periods of peak activity also improves reliability because servers are no longer overwhelmed with high volumes of traffic. Migrating to cloud infrastructure also offers retailers the ability to simply ‘lift and shift’ their existing IT infrastructure to the cloud. Alternatively, it opens up the possibility for them to work with a cloud services specialist to re-architect the infrastructure to improve its reliability and performance. Taken together, the capabilities afforded by cloud infrastructure provide the essential basis from which the potential data analysis and machine learning can be realised. The power of data Once retailers have infrastructure in place to organise and store data in one location, they can uncover key insights and make smart predictions. This can give them a serious competitive edge. Data analysis can create unique insights into customer behaviour. Differences in purchasing decisions between certain demographics can be uncovered. Seasonal trends can be explored and understood in greater detail than before. The performance of certain products with different audience groups can be better measured. All of this can help retailers become more customer-focussed in their approach - allowing them to be more attuned to customer demands and make better business decisions based on factual insight. More than this, retailers can use their data to harness the power of machine learning. Not only can the data that retailers hold be analysed, through the implementation of machine learning technology it can be used to accurately predict future behaviour and trends. The predictive powers of machine learning can help them forecast seasonal patterns of demand and sales with a greater degree of accuracy. In turn, this can help retailers make more informed decision on stock purchases, which can allow them to maximise sales and minimise wastage. Machine learning could also help retailers serve more compelling product offerings to customers based on their previous purchasing decisions - driving further revenue and enhancing the customer experience Combined, these abilities can have transformative effects for retailers. Digital transformation can unlock a huge array of untapped capabilities and provide unique business insights to drive revenue and efficiency. There is no reason to believe bricks and mortar retailers who harness its potential cannot thrive in the future.[/vc_column_text][/vc_column][/vc_row] ### Technical solutions set to transform the FinTech industry [vc_row][vc_column][vc_column_text]From mobile banking to stock trading apps, financial technology (FinTech) has seen a surge in new innovations in recent years and there are no signs of this stopping. From improved cyber security, to fully-automated processes, I’ll be outlining some of the ways new developments will revolutionise the FinTech industry. Cyber security Security is rapidly becoming a hot topic when it comes to technology and the internet. In the financial industry, security is extremely important as hackers and cyberattacks are a serious threat. Luckily, cyber security is seeing some new developments in technology. Artificial intelligence (AI) is now being used to track and analyse behaviours, picking up abnormalities that could be cyberthreats. 
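To make this concrete, below is a minimal, illustrative sketch of the kind of behavioural anomaly detection just described, using scikit-learn's IsolationForest on a toy set of card transactions. The feature choices, figures and threshold are assumptions invented for the example, not a description of any bank's actual fraud system.

```python
# Illustrative only: a toy behavioural anomaly detector for card spending.
# Feature choices (amount, hour of day) and the contamination rate are assumptions,
# not a description of any production fraud system.
from sklearn.ensemble import IsolationForest
import numpy as np

rng = np.random.default_rng(42)

# Simulated "usual spending habits": modest amounts, daytime hours.
normal = np.column_stack([
    rng.normal(35, 15, 500).clip(1, None),   # typical transaction amount (GBP)
    rng.normal(14, 3, 500).clip(0, 23),      # typical hour of day
])

model = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# New activity to score: one routine purchase, one out-of-pattern outlier.
new_activity = np.array([
    [42.0, 13],      # lunchtime purchase, in line with history
    [1450.0, 3],     # large purchase at 3am - far from the learned pattern
])

for txn, label in zip(new_activity, model.predict(new_activity)):
    status = "flag for review" if label == -1 else "looks normal"
    print(f"amount={txn[0]:.2f}, hour={int(txn[1])}: {status}")
```

A real system would train on far richer features (merchant category, location, device and so on), but the shape of the approach is the same: learn what 'normal' looks like for a customer, then score new behaviour against it.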
By knowing the specific behaviour of these threats, AI can develop ways to prevent them from happening in the future, all without any human input. For example, using AI, banks can get an idea of your usual spending habits then, if anything suspicious happens, they have ways to alert you in case of fraud. When it comes to sensitive data, which is particularly common in FinTech, computer and technology companies are working hard to come up with new ways of protecting it, and they’ve now created new authentication systems. These systems rely on more than just a simple password to gain access to them, which is where two-factor authentication comes in. This second level of authentication usually comes in the form of a randomly generated code that’s given to a specific user to allow them to gain access to the system. Alternatively, some systems use a ‘token’, which can be a USB or mobile device and, without this device, users can’t use a particular programme. Or, just like modern phones, these systems can even use fingerprints and facial recognition to grant access, meaning outside threats are less likely. Some gaming platforms, like Steam, and even big tech companies, like Microsoft and Apple, have started rolling in this two-factor authentication system to protect users’ sensitive data. Peer-to-peer (P2P) lending Now becoming mainstream, peer-to-peer (P2P) lending is revolutionising the FinTech industry by providing a new platform for investment opportunities. It connects those who want to lend money with those who want to secure a personal loan. When bank loans are becoming increasingly harder to obtain, P2P lending is a much more efficient way of doing things, as some loans can be given within a matter of hours. Because it’s such a modern phenomenon, the processes behind P2P are generally entirely digital, which allows for much more control and faster results. Some P2P loans go to small businesses and budding entrepreneurs. With P2P lending, it’s a lot easier for new businesses to grow and for investors to see a return on their lending. These loans remove the need for a middleman in the form of banks and credit unions, and can tailor lending and interest rates according to each individual — it’s often a better, more personalised option for both parties. Platform as a service (PaaS) Platform as a service (PaaS) involves a third-party company that provides the hardware and software tools, via a cloud-type network, that other companies need to create, develop, and use new applications. This means that users don’t need to download, update, or fix platforms to carry out tasks, because it’ll all be done for them by the third-party vendor, reducing the amount of time IT teams need to spend on updating and maintaining operating systems. PaaS means that users can use these platforms from anywhere with access to internet, which is great at a time when remote working is becoming more common. As a growing market, PaaS is potentially a big opportunity for investors to consider. Regulatory processes At Lending Works, we focus on automating business processes to increase efficiency. Advances in technology mean processes, such as regulatory reporting, compliance checks, identity and access management, and monitoring transactions can be automated, too, so there’s less need to hire groups of people to do these tasks for you. 
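As a hedged illustration of that kind of automation, the sketch below applies a couple of assumed monitoring rules (a value threshold and a restricted-country list) to a batch of transactions and produces a review queue. The rules, figures and field names are invented for the example; real regulatory logic is far more involved.

```python
# Illustrative only: assumed rules and invented field names, not real regulatory logic.
from dataclasses import dataclass

@dataclass
class Transaction:
    tx_id: str
    amount_gbp: float
    counterparty_country: str

# Example rules: an assumed reporting threshold and a placeholder restricted-country list.
REPORTING_THRESHOLD_GBP = 10_000
RESTRICTED_COUNTRIES = {"XX", "YY"}  # placeholder codes, not real designations

def review_queue(transactions):
    """Return (transaction, reasons) pairs for anything a human should look at."""
    flagged = []
    for tx in transactions:
        reasons = []
        if tx.amount_gbp >= REPORTING_THRESHOLD_GBP:
            reasons.append("exceeds reporting threshold")
        if tx.counterparty_country in RESTRICTED_COUNTRIES:
            reasons.append("restricted counterparty country")
        if reasons:
            flagged.append((tx, reasons))
    return flagged

if __name__ == "__main__":
    batch = [
        Transaction("t1", 250.0, "GB"),
        Transaction("t2", 18_500.0, "GB"),
        Transaction("t3", 900.0, "XX"),
    ]
    for tx, reasons in review_queue(batch):
        print(tx.tx_id, "->", "; ".join(reasons))
```

The value of automating even simple checks like these is that they run on every transaction, every time, and leave staff free to investigate only what is flagged.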
Automated regulatory practices not only make company processes more efficient and streamlined, but they also mean companies are constantly up-to-date with new laws and regulations — it’s much easier to update a programme than to train a full team of staff. In fact, robotic process automation (RPA) will mean that most of the more mundane repetitive tasks in FinTech, such as data analysis, can be carried out by AI. This will also streamline and make processes more efficient, as well as reduce the risk of human error. Improvements in cyber security, P2P lending, automated processes, and PaaS, are just some of the advancements and solutions we're seeing within the FinTech industry. These technologies are on course to make our work safer and more efficient in a world that’s becoming increasingly more reliant on IT and computerised processes.[/vc_column_text][/vc_column][/vc_row] ### Third-party support in the cloud [vc_row][vc_column][vc_column_text]ERP vendors are very keen for their customers to move to the cloud. Both Oracle and SAP are limiting their innovations to their SaaS services, while making their on-premise solutions less-enticing to their customers. SAP has gone a step further and set a deadline of 2025 for its on-premise customers: no more support for those not using S/4HANA. Why are they so keen? A number of reasons, some selfish, some not. Moving their customers to the cloud can mean moving to an in-memory solution that will be much faster, and any updates will be applied automatically, keeping customers secure. But at the same time, it means higher and more predictable revenues as customers pay regularly for their use of SaaS. It also means customers that used to be able to use third-party support have to move back to official channels, and the additional costs that involves. Efforts to convince customers to move from on-premise ERP to the cloud have not proved to be completely successful. SAP has already moved its deadline as it was proving unrealistic to many of its customers, and Oracle no longer reports its cloud revenue separately, making it tricky to tell just how well it’s doing in convincing its customers to make the leap to SaaS. This isn’t entirely surprising. Shifting ERP to the cloud has its benefits, but it’s expensive and time-consuming. It can also mean losing a great deal of flexibility, including bespoke software and specialist support. So is it possible to keep this flexibility while moving to the cloud, or are the two incompatible? What you want vs what the vendor wants The path that a vendor will lay out to the cloud will be the one that they would prefer. It’s also the “easiest”, or at least is regarded as such: to move to vendor cloud. As the most profitable set up for the vendor—they make money for both the ERP and the cloud infrastructure—it’s what organisations will be steered towards. But there are other options beyond the default of vendor cloud. Both Oracle’s Fusion and SAP’s S/4HANA can be installed on private cloud infrastructure—something that may actually be necessary thanks to regulatory demands. Multi-tenant cloud might actually be impossible thanks to the sensitivity of data. When planning a cloud migration, it’s important to keep at the top of mind at all times what’s best for the organisation. ERP vendors, particularly Oracle, incentivise their cloud offerings and bundle them to create tempting deals. But moving to the cloud can take years, often far longer than expected. 
Systems have to be moved in an order that makes sense, integrated with other systems, and refined through lots of iterative work. We’ve seen cloud migrations that were initially planned to take just one year slip by two years. If public cloud is the best option, even then it’s not necessary to choose vendor cloud. Oracle was actually pretty late to launch its cloud product, and it’s more common to move to Amazon AWS. Cloud migration of ERP can mean better security, better performance, better analytics and easier remote access. But all of these must be balanced against what can be lost. Staying flexible Once migrated to a vendor cloud solution, it can be very difficult to move away. We’ve seen instances of an organisation being given zero support beyond what was absolutely necessary when the decision was made to move from vendor cloud to Amazon AWS. The organisation had to do all the heavy lifting while the ERP vendor claimed that it couldn’t help without giving away proprietary information. But potential problems go beyond vendor lock-in. It’s not uncommon for organisations to run modified ERP systems that meet their exact needs. Moving to a SaaS solution means losing these modifications and settling for a one-size-fits-all solution. Vendor cloud solutions also make it impossible to use third-party support that may understand your organisation and offer a better service at a better price. Vendor cloud means opting in to everything that the vendor offers. The key consideration is balancing what could be gained by SaaS, against instead taking current on-premise software and moving it to the cloud. ERP software is not known for its great leaps in innovation, and security can be handled through virtual rather than official patches. If a business decides to stay with the current iteration of its ERP software, will anything be lost? One of the biggest advantages of moving to the cloud is that it is a flexible technology, but in moving to ERP vendor cloud, a great deal of business flexibility can be lost as options are closed off. By taking the path of least resistance, the advantages of moving to the cloud may be completely overshadowed by what’s lost.[/vc_column_text][/vc_column][/vc_row] ### Up in the clouds with cyber security [vc_row][vc_column][vc_column_text]The last few years have seen the cloud develop from a cutting-edge advantage into a standard part of business strategy required for remaining competitive. Cloud adoption is now so widespread that it has been predicted that as much as 83 percent of all enterprise workloads will be in the cloud by 2020. The rapid adoption has been fuelled by the cloud market’s increasing ability to deliver practically any business service. Organisations of all sizes can now access skills and resources that were previously restricted to the largest market leaders, enabling them to achieve new levels of efficiency and unlock entirely new business strategies. An increasing number of companies have taken a cloud-first approach, while many new firms are cloud-only. As a result, the worldwide public cloud services market is predicted to reach more than $206bn in 2019, a growth of 17.3 percent over 2018. Growing rewards, growing risks Although cloud adoption is delivering powerful rewards to those organisations leading the charge, it can also introduce many new risks without the right precautions. Indeed, cyber security has been cited as the leading concern for IT professionals when embracing the cloud. 
Effectively managing and securing user accounts can present a particularly difficult challenge, especially in hybrid cloud set ups where there are likely to be overlapping systems and overlooked access points. Cybercriminals often exploit vulnerabilities in the cloud to attack in-house infrastructure and vice-versa, particularly via compromised user accounts. Privileged user accounts that have access to administrative powers pose the most serious threat, as attackers can exploit their capabilities in a number of highly damaging ways. The cloud also generally adds an extra layer of complexity to the IT landscape at a time when many organisations are already struggling to keep their infrastructure secure. It’s common to find companies fighting a losing battle against issues such as keeping systems patched and effectively managing and securing user accounts. A deepening skills crisis also means that it is difficult for organisations to find the security professionals they need to address these challenges. Experienced practitioners are increasingly expensive and difficult to recruit and retain. Securing the cloud, through the cloud Ironically, while cloud adoption is causing its share of security headaches, it has also emerged as a major solution to cyber threats. Just as the cloud market provides solutions for business functions like finance, HR and logistics, it can also deliver access to cybersecurity resources and expertise that would normally be too expensive for most companies. While organisations will always need some in-house technical expertise, consuming security solutions through the cloud enables a business to greatly reduce the underlining requirements such as more specific knowledge of areas like SQL and IIS. This will free up resources and enable them to better focus on normal business priorities. Cloud solutions mean that the business does not need to find individual experts for Operating Systems, databases and system backups, as third-party service providers can take care of everything they need. They can simply consume the services – they only need to learn how to use it versus installing and maintaining the solution. Prioritising PAM  While the cloud has opened up access to cyber skills and solutions, the cybersecurity challenge is so broad that many companies struggle to know how to prioritise spending on their defences. It has become increasingly accepted that it is impossible to guarantee security, so the best approach is to focus on investing in solutions that will cover as much ground as possible. With that in mind, one of the most valuable areas to invest in is Privileged Access Management (PAM). Privileged accounts are also known as superusers and are used for essential functions that require a far greater array of powers and access than a normal account. These accounts are highly prized by cybercriminals as they have the highest-level access privileges, enabling an attacker to infiltrate systems and edit data, install malware, and access critical systems across the network. The attacker can also use the privileged account to cover their tracks, allowing them to avoid detection for months or even years. Due to the serious threat posed by the compromise of privileged accounts, leading analyst house Gartner has named PAM as the number one cybersecurity priority in its recent Top 10 Security Projects for 2019 report. PAM was also estimated to be the area of cybersecurity to see the second highest increase in spending this year. 
PAM forms a part of the broader category of Identity and Access Management (IAM), and ensures automated control of user provisioning along with best security practices to protect all user identities. PAM can also be integrated with Security Information and Event Management (SIEM) solutions to create an inclusive picture of security events that involve privileged accounts, and provide clearer oversight for the IT and security teams. Beyond security Aside from reducing the risk of serious cyberattacks, PAM can also provide several powerful advantages for cloud-centric organisations. Securing privileged accounts means it is easier for users to access higher functions from remote locations – something that would usually be an invitation for compromise by cybercriminals. This means organisations can make even greater use of flexible working strategies, empowering their workforce to achieve more without conceding on security. While the cloud will continue to present its share of security headaches along with the benefits, organisations can access powerful solutions such as PAM to balance out the risks, helping to drive growth and efficiency without exposing the company to serious security threats.[/vc_column_text][/vc_column][/vc_row] ### Five cash management essentials banks cannot afford to ignore [vc_row][vc_column][vc_column_text]Traditionally, commercial banking has been one of the most profitable segments for banks. After a slow response to digitalisation and disintermediation in retail banking, banks the world over are strengthening their solutions and value propositions for their corporate business clients. A few FinTechs have already sprung up in the corporate banking space. However, banks have the advantage of long-lasting customer relationships with their corporate clients, and they must deepen these relationships by offering enhanced digital solutions to avoid getting marginalised. In this article, we shine the spotlight on five such solutions. 1. Digital self-service In retail banking, the rapid adoption of new digital channels is evident. The multiplication of digital channels is driving self-service, and leading banks such as Capital One in the US are already offering banking services on smart voice assistants. The growing trend and preference for self-service is on the rise in corporate banking too, where leading and progressive banks are introducing innovative, user-friendly solutions that can be accessed on the customer’s channel of choice. Increasingly, corporates require a single portal from where they can view and manage all their operations across the globe. Take Santander UK - the bank offers cash management, cash forecasting and payment services that its corporate customers can access on any device. Seamless integration of banking services with the ERP systems of large corporates is another solution proving to be a multiplier of organisational effectiveness and productivity for businesses. 2. Cash management Large corporates in most countries are faced with the dual challenge of allocating cash reserves efficiently in times of negative interest rates and managing liquidity resourcefully for their growing international business presence. Corporate clients are looking for solutions that can help minimise liquidity risk and provide a real-time view of their global cash and liquidity positions. Banks are tapping into the new open-banking possibilities and enriching their offerings with data to deliver more value to their business clients. 
Firstly, as account aggregators they can now offer their customers a centralised view of funds across all banking relationships, and also allow customers to extract this real-time information to perform fund transfers for improved liquidity management. Secondly, with data-driven insights banks are augmenting their cash management solutions with superior cash-forecasting capabilities. For instance, modern cash and liquidity management solutions allow a corporate business with multi-country operations to move surplus funds from an account in one country and manage future working capital requirements in another country efficiently. These forecasting solutions provide a unified view of cash forecasting and working capital needs and also allow fund movements through a single self-service portal. 3. Virtual account management Enhancing the centralisation of operations and self-service further is a solution called Virtual Account Management (VAM). Virtual accounts reduce administrative overhead significantly by simplifying reconciliation and enabling greater straight-though processing through efficient in-house banking. They allow corporates to build their own hierarchy of accounts that aligns to their structure of business units or subsidiaries. Enhanced liquidity management techniques such as payments-on-behalf-of and collections-on-behalf-of help a corporate treasurer manage payables and receivables using a minimal number of external accounts. 4. User-friendly liquidity dashboards Growing businesses looking for efficient liquidity planning solutions demand single-view dashboards that provide a global view of domestic and international accounts across multiple banks. Progressive banks are enhancing their cash management solutions with intuitive analytics-driven dashboards. In addition to a comprehensive view of future cash-flow and liquidity positions, these dashboards make recommendations for next best action, provide suggestions and trigger alerts.They allow corporates to set up and manage rules for automated action. For instance, a corporate can set up rules for the movement of surplus funds into money market or foreign investments. Using the information about the risk profile of a business and the defined rules, the liquidity management solution can move funds and make investments to manage surplus efficiently. 5. Integrated software solutions for small and medium businesses The cash management requirements of small and medium businesses are very different from those of large corporate giants. Although the sector promises attractive returns, it has been long-neglected by banks. However, emerging digital technologies are now helping banks provide tailored solutions to their small and medium business clients. Small and medium businesses (SMBs) seek cash forecasting and working capital management solutions that can be accessed and used on-the-move on channels such as mobile and tablets. Enhanced self-service, instant approvals, and quick transactions are the key considerations for an SMB looking for cash management solutions. Apart from multi-channel capabilities, a bank’s solutions to its small business clients should be easy to integrate with accounting packages. For instance, a solution that integrates with accounting packages such as Intuit, allows a shop owner to conveniently carry out transactions through e-banking. These solutions also increase the ease-of-use and adoption of cash forecasting tools. 
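As a rough sketch of the centralised cash view and simple forecasting described above, the snippet below aggregates invented balances from several hypothetical banking relationships into a net position per currency and projects it forward with a deliberately naive weekly flow assumption; the figures, account data and forecasting method are all illustrative, not a description of any bank's product.

```python
# Illustrative only: invented balances and a deliberately naive forecast, standing in
# for the multi-bank cash views and cash-forecasting capabilities described above.
from collections import defaultdict

# Balances aggregated from several (hypothetical) banking relationships.
accounts = [
    {"bank": "Bank A", "entity": "UK Ltd",  "currency": "GBP", "balance": 1_200_000},
    {"bank": "Bank B", "entity": "UK Ltd",  "currency": "GBP", "balance": -150_000},
    {"bank": "Bank C", "entity": "DE GmbH", "currency": "EUR", "balance": 430_000},
]

# Centralised view: net position per currency across all banks.
position = defaultdict(float)
for acct in accounts:
    position[acct["currency"]] += acct["balance"]
print(dict(position))

# Naive forecast: current position plus an assumed expected weekly net flow.
expected_weekly_net_flow = {"GBP": -80_000, "EUR": 25_000}  # assumed figures

def forecast(currency, weeks=4):
    """Project the net position forward week by week."""
    bal = position[currency]
    path = []
    for _ in range(weeks):
        bal += expected_weekly_net_flow.get(currency, 0)
        path.append(round(bal, 2))
    return path

print("GBP 4-week outlook:", forecast("GBP"))
```

A real treasury platform would pull these balances over open-banking APIs and drive the forecast from historical flows, but the unified-view idea is the same.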
Conclusion FinTechs and nimble digital start-ups are fast becoming the preferred providers of lending and payment services for businesses in the small and medium business segment. Although not as rapid, the early signs of disruption in corporate banking have also begun to appear. An established customer base is one of the most significant assets for banks, and banks must find ways to deliver enhanced value or risk losing these relationships. They must act fast, and they must act now. Banks looking to augment their revenues from the corporate and SME segment must build capabilities such that they can serve the diverse needs of corporates as well as SMBs equally well. Their solutions for SMBs should offer quick, easy and on-the-move cash management to time-pressed small business customers, and their solutions for large corporates should provide insight-driven, flexible and comprehensive cash management capabilities. Going forward, corporates expect to be empowered with a relevant digital cash management offering that will be able to manage their operations with agility and optimise their working capital.[/vc_column_text][/vc_column][/vc_row] ### Should APTs be a concern for the healthcare sector? [vc_row][vc_column][vc_column_text]In recent years, healthcare records have become an increasingly valuable commodity on the dark web. They contain far more sensitive, personal information than financial records, and allow criminals to create more personalised and targeted attacks using social engineering techniques, like impersonating friends and colleagues. Compared to healthcare records, financial information has a short lifespan. Short term credit/identity theft monitoring is frequently used to allay the damage done by financial data breaches. The difference with healthcare records is persistence. Persistent data adds value to an attacker, who can repeatedly return to the stolen information. Healthcare data is not just persistent; it is also updated as patients visit healthcare providers who, in turn, share that information between fellow healthcare institutions (for example, clinics to hospitals). On top of the reliable growth of data that existing patients provide, new patients guarantee a new source of information that attackers can sell on the dark web. No healthcare provider is safe from these attacks. But it is not advanced persistent threats (APTs) that cost the healthcare industry well over 2 billion dollars in 2018; it is their close cousin, the targeted persistent attack (TPA). Unlike an APT, advanced techniques are not required to compromise the targeted victim. Common phishing attacks and improperly configured systems are frequently the cause of data breaches. How do TPA attacks work? TPAs are not random, as was the case with WannaCry. The victim of a TPA is chosen specifically for the type of data the attacker desires. Phishing is often the spearhead of a TPA, but access may be obtained by hacking unsecured network resources. Once inside, the attackers employ a variety of techniques that will keep them flying below the radar for extended periods of time. Hackers achieve this in stages: Break in – Attackers leverage a vulnerability, which can be found in the network, through an unsecure application, or through a phishing attack. Once the vulnerability has been exposed, hackers will insert malware into a network undetected. Gain a foothold – Now that the malware is in place, attackers can create backdoors which are used to navigate a network without being detected. 
Once in, hackers will remain anonymous thanks to the oft-seen technique of deploying file-less malware that leverages polymorphism to trick defences and cleans up after itself to remove any trace of its existence. Lateral movement – Once the hacker has established themselves within the network, they will try to navigate the network. They will attempt this by employing various tools, such as a password cracker which can be used to achieve administrative control, giving criminals influence over the infrastructure. Data exfiltration – Sitting undetected inside the system, hackers secretly harvest data until they acquire everything they need. Continuous monitoring – At this stage, attackers may have close to unlimited control over their desired target. From here, they will collect new data as it becomes available and attempt to burrow deeper into the network. Attackers have been known to remain within a network, often for 200 days or more, and can withdraw without a trace, ensuring an effective stealth attack. A common strategy for attackers is to leave a back door open in the network to allow re-access in the future. Why do attackers target healthcare? Whilst the healthcare industry is ripe for APT attacks, APTs should be on the radar of IT leaders in all sectors. A recent study, commissioned by Trustwave, revealed that the mean value of payment card information on the black market is $5.40; the mean value of healthcare records, however, is $250.15, and this is what makes the healthcare industry a lucrative target. Another reason that attackers are seemingly drawn to the healthcare industry is that it is still reliant on equipment that runs on legacy software, most notably unpatched versions of Windows XP. By targeting this operating system, criminals can leverage the vulnerabilities to their benefit, gaining a foothold in the network. From this point, hackers can deploy more advanced malware to gain access to private medical records. Unpatched Windows XP makes it far easier for hackers to acquire sensitive data, but the problem is compounded by strained IT budgets across the healthcare industry, which struggles to cope with the steadily increasing number of attacks. Cybersecurity budgets are difficult to work out in healthcare, as it is hard to measure ROI before a breach occurs. For hackers, this is ideal, as it creates a situation in which encryption, arguably the most important layer of defence, is not always prioritised. A situation like this highlights the benefits of training staff on security awareness, to minimise the risk of further data breaches. Persistence in attacks One example of a persistent threat that has been around for years is the Conficker worm, a decade-old piece of malware. To this day, Conficker is one of the most encountered threats on the internet and can be especially problematic for hospitals. Even though Conficker is 10 years old, the unpatched Windows XP operating system that is prevalent in healthcare ensures its continued relevance to this industry. Once this relatively simple piece of malware is used to gain entry to the network, more advanced malware can be deployed to infiltrate the network. Healthcare organisations are gradually improving their defences, with £150 million pledged to improving the NHS’s resilience to attacks. Despite this, APTs continue to cause problems for IT professionals. To avoid repeating past mistakes, the healthcare industry needs to deploy defensive measures across the board. 
A vital component of this defence involves continuous education and assessments, which ensures staff across the organisation are equipped to handle information in compliance with proper security protocols and recognise potential attacks. Training the entire organisation is a great start for healthcare, as human error, often exploited through advanced phishing efforts, permits many successful attacks across all industries. This, combined with anti-malware software, can be considered an effective first line of defence. Once the basics have been taken care of, IT security professionals can turn their attention to securing the network, server, perimeter, and the endpoint. When it comes to protecting data, no organisation can afford to take a lax approach. Only a layered security offering, complemented by a threat intelligence service, can mitigate against evolving threats.[/vc_column_text][/vc_column][/vc_row] ### Technology: Making Its Way In Healthcare [vc_row][vc_column][vc_column_text]Software like artificial intelligence (AI) and virtual reality (VR) is often viewed as technology of the future; however, these types of futuristic advancements are already being used today. In big corporations, technologies like AI and cloud computing systems are used on a regular basis to store information and make inferences from that data. Today, technology like this is also making its way into the medical field, and many physicians are either currently using, or looking into, ways software can be used to change the way healthcare is provided. Artificial Intelligence Artificial intelligence (AI) is an up-and-coming technology that is widely used in businesses. This technology has the ability to take the information it is presented and learn from it in order to produce an answer or outcome. Software developers take information, translate it into mathematical form, and feed it into the AI system, allowing the machine to learn through algorithms. Once it has the information it needs, it is able to make connections by vetting pieces of information against one another. From there, it is capable of learning new information and creating new algorithms to learn and store. This type of technology is most commonly used by businesses to create insights that drive business decisions. AI can be used to create insights for marketing campaigns or perhaps for financial investments; however, this technology also has the potential to be used for medical discoveries. The use of AI in medicine has been tested by many different groups; developers from New York University’s School of Medicine, with help from Google, were able to train AI software to detect cancer. The software, Inception-v3, was trained with data from the Cancer Genome Atlas uploaded into the system. Once up and running, the technology was able to detect 6 different cell mutations with an accuracy of 0.733 to 0.856. This isn’t the only software that has been able to detect cancer as well as, if not better than, oncologists. Another approach, using deep convolutional neural networks (CNNs), is detecting skin cancer through image processing 95% of the time, whereas physicians in the control group were only able to detect the cancer 87% of the time. This technology can be groundbreaking for rare cancers as well. Canon Medical Research, an Edinburgh-based firm, was awarded a £140,000 grant to test the use of AI in tackling malignant pleural mesothelioma. 
With mesothelioma being such a rare disease with a poor prognosis, the team hopes for a better outcome for patients with this brutal cancer. They hope to show that AI can make a positive impact on other types of cancer diagnoses, treatment, and the cost of cancer drugs. Telemedicine Telemedicine over the years has been most used to treat patients in remote areas. Today, telemedicine has become more commonly used to treat all types of patients, due to the advancement and widespread use of technology. Telemedicine is the use of technology, such as email, phone calls, messaging, photo sharing or video chatting, in patient care, and is used to help doctors treat patients without the hassle of an in-person appointment. This concept is very appealing to today’s society considering our fast-paced world. Most people today would rather skip the waiting room and get right to the doctor, which is why remote doctors’ visits are so widely accepted. The use of technology in the medical field is also more cost-efficient. The United States spends over $2.9 trillion on healthcare each year, which is more than any other developed nation. Not to mention it is estimated that $200 billion of those funds are from unnecessary spending. Telemedicine, if used more often, has the ability to cut costs by eliminating unnecessary ER visits and avoidable hospitalizations. Another benefit could be to help more patients who have specialized or rare illnesses. According to The New York Times, there is a rise in drug-resistant infections and a shortage of specialists who are capable of treating these diseases. In fact, the Infectious Diseases Society of America has had to create aggressive recruiting strategies to help with the shortage. Telemedicine could serve as a solution to this epidemic. With the shortage of specialists in this field, more patients could receive the treatment they need no matter where they are geographically located. Cloud Computing Cloud computing is more often utilized in a conventional business setting; however, the medical community has taken advantage of this technology for medical records. Keeping patient data together and accurate is extremely important, and over the years a record or two can fall through the cracks, only to leave gaps in a patient’s history. A HIMSS Analytics survey found that 60% of the responding healthcare organizations are using cloud computing for backup and disaster recovery, and 51% are using the cloud for clinical applications and data. BCC Research projected that the global healthcare cloud computing market will grow to $35 billion by 2022, at an annual growth rate of 11.6%. NetApp, a cloud computing organization, has stated that there are many benefits to cloud computing, which is why the market is growing at such a rapid pace. Among them are data protection, disaster recovery, automated storage, volume cloning of data, and significantly reduced costs. Not to mention this information can be utilized for clinical research purposes. While the cloud is extremely beneficial, there are often concerns with HIPAA and data protection. However, according to the Department of Health and Human Services, Cloud Service Providers (CSPs) are covered under the HIPAA Privacy, Security and Breach Notification Rules and must comply with them. A covered entity can be anything that is part of a patient’s health plan and is transmitted or stored electronically. 
“When a covered entity engaged the services of the CSP to create, receive, maintain, or transmit ePHI on its behalf, the CSP is a business associate under HIPAA. Further, when a business associate subcontracts with a CSP to create, receive, maintain, or transmit ePHI on its behalf, the CSP subcontractor itself is a business associate.” So while many patients and healthcare providers have concerns about protecting patient information on cloud technologies, one could argue that it could be even more secure. In comparison to using paper files, or even an on-premise technology, cloud computing is often updated on a regular basis, leaving it one step ahead of any cybersecurity threat. Technology like telemedicine and cloud data processing has proven to be great improvements in the medical field. With software like this paving the way, there is a chance that up-and-coming technology like AI will have more of a place in healthcare. Many physicians are working with new technology that can be utilized in different ways for medical advancements like the New York University School of Medicine and the Canon Medical Research teams are. With technology already having such a huge role in this space, who knows what other advancements we will see for healthcare in the future.[/vc_column_text][/vc_column][/vc_row] ### How to Keep up in an Era of Constant Digital Disruption [vc_row][vc_column][vc_column_text]It’s not easy to plan for digital change. If we could predict what would happen in the future, MySpace might have taken up Facebook’s $75 million acquisition offer in 2004, and BlockBuster might not have “nearly laughed Netflix out of the room” when the streaming business asked it to buy it for $50 million. Technology is unpredictable, and that doesn’t make it easy to plan for. With the likes of connected cars, virtual reality and drones appearing in the news everyday, the digitisation footprint follows us wherever we go. To stay in line with others in your industry, keeping on top of these relevant changes is important. But how do we do this? Andy Allan, Owner of CAT Autokeys, explains his five tips for staying in the know... Never Stop Learning It’s easy to get stuck into the daily routine and become an expert at your job, but not your trade. Yet things will undoubtedly change, and if you’re not careful you can easily lose your lead - successful people are always learning. Read everything you can about your industry online and use social media to your advantage to stay on top of the latest news and changes in your industry. You can also set up hashtags to follow, or create a list of influential people in your sector on Twitter. Accept that you don’t know everything in your industry, and it will help you to embrace a new learning culture. Set aside a couple of hours a month to make sure that you’re fully on top of the latest updates and then you can plan out what this means for your business. It doesn’t have to be a massive time expenditure, a few hours can go a long way. Network to Success Connected people are successful, because relationships are powerful. When it comes to staying on top of digital trends, it’s not just what you know but who you know. Along with exchanging information, you can nurture long-term relationships, develop your skill set, improve your knowledge of the latest trends and meet prospects that might help your business development. 
Get involved in industry alliances, standards bodies and networking events which can offer an entire community of professionals in your industry who you can talk to about trends, mutual challenges, and what they’re doing to combat this. One person might give you insight into a trend you hadn’t expected, another might have a solution for a problem you’re facing or help to understand a new customer trend. Determine your Needs Once you know the latest industry trends and technologies, it’s time to determine what your unique business needs are. You won’t need to apply all the new technologies to your business, but instead plan out an analysis of the needs of your customers, the expectations that new technologies could bring to your business, and what processes need boosting.  Acknowledge your struggles and challenges you’re facing, and seek out technology that can help. Work out what your competitors are doing better than you, and see how you can adapt. If there is a benefit to a new technology in your sector, factor this into your analysis and create a roadmap to plan out when it would be implemented. Ultimately, you need to work out whether you can justify business value for digital changes. Establish Resources Set clear, ambitious targets that allow you to achieve digital disruption in your industry. Then, look at the resources that you need to launch the process. This doesn’t just mean the technology, but, depending on your business, you might need to invest in upskilling, nurturing a digital culture, and also factor in the time it will take to change, update or renew your current tools. Technology is often expensive, because it provides so many benefits, so you may need to prioritise. Companies are estimated to invest more in technology across all of their business lines during 2019, so it’s key to budget plan effectively to deliver the best value to your business. Never Make Assumptions Don’t assume that once you have overhauled your configuration equipment, for example, that that’s the end for a few years - it’s a naive mindset to have. Keep learning, keep researching, keep networking and keep investing. Successful people are always learning and learning can help you to unlock new doors. It’s rare to find a business not reaping the benefits of software and for a good reason; cutting-edge software is showing its worth in industries across the world. Digital transformation can boost processes, profits and people, and that’s why no industry will be left untouched by it. But it’s hard to know how to plan for it, and that’s why you need a combination of ingredients to tackle the constantly changing digital environment. The ability to recognise your weaknesses is invaluable, relationship-building is powerful and education is priceless.[/vc_column_text][/vc_column][/vc_row] ### Getting to Grips with Your Data [vc_row][vc_column][vc_column_text]While many organisations are realising the benefits that artificial intelligence (AI) has to offer, for those who have yet to take the plunge, simply knowing where to start on their path to digital transformation can be a real challenge in itself. In particular, companies who want to use AI to gather new insights from their data must first have in place a solid data strategy. Given the large volume of data typically gathered and processed by today’s businesses, creating a strategy that will harness all that information in a clear and useful manner can seem virtually impossible. Where do you start? What exactly are you looking to achieve? 
The following tips will help organisations of all sizes create a data strategy that will set them on the best path to using AI to gather business insights from their data: 1. Identify where data is stored – and how Data is likely to be stored in all kinds of formats, from emails and documents to structured databases, and across various locations and standalone databases. Without a clear overview of their data, businesses could be missing out on all kinds of opportunities, from coordinating promotions to ordering stock in response to patterns of customer demand at different times of year. 2. Keep it in one place Which brings me to my next point: all data should be kept within a centralised data lake to make it easily accessible. A cloud-based service is an ideal place to start, as they are usually far less expensive than an on-site data centre. Ideally, organisations of all sizes should choose a platform that is easy to scale – in both capacity and performance – and think about which format they need their data to be available in. Using a translation tool such as Kafka will make it easier for companies to manage data from different apps. However, with so much important information held together in one place, the issue of security becomes more important than ever. A solid backup and recovery plan is a vital element of any data strategy. 3. Cleanse all data It may sound obvious, but companies should always dispose of data that’s no longer used or of value to them. This is particularly important when meeting GDPR regulations too. Most companies have decades-old data, often customer orders or a contact at a business that no longer exists. Finding this data and destroying it appropriately is critical to being able to manage all your relevant data effectively.  4. Ask the right questions Once an organisation has achieved the task of gathering, cleansing and storing its data in one place, it’s time to think about the questions that need to be asked of the data. This means understanding and looking for the insights that will make the biggest difference to your business. With a set of objectives in place, it’s worth testing these with a smaller data set to reveal whether you are asking the right questions – and whether the right algorithms are being used to answer them. Each of these steps should contribute to the ultimate aim of any data strategy, which is to create a single point of truth. Achieving that goal puts any organisation in the right place to embrace the opportunities offered by AI-driven data analytics, from improved customer service to increased productivity. To stay competitive in today’s fast-paced business environment, it’s a must.[/vc_column_text][/vc_column][/vc_row] ### Cloud strategy: Don’t leave security on the sidelines [vc_row][vc_column][vc_column_text]An increasing number of organisations are migrating to cloud services as part of their digital transformation, attracted to models such as IaaS, PaaS and SaaS by the agility, flexibility, scalability and potential cost efficiency they offer. But there’s one area that is often forgotten: security. Security deserves to be given a prominent role in the transformation process, but it often tends to be left on the sidelines. Some organisations assume that security is the sole responsibility of their cloud service provider (CSP), perhaps because they’re not aware of the shared responsibility model. 
Others believe on-premise security practices and security controls can be directly mapped to their cloud workloads, irrespective of how they might have been modified en route to the cloud. Another challenge is the fact that the security team doesn’t always have visibility of what’s being deployed in the cloud. If they don’t know what’s there, how can they secure it? This all helps to create confusion around how to best secure the cloud. Who is responsible? What levels of security are already provided, and where does security need to be augmented? How does the chosen cloud model affect these lines of responsibility? A complex environment Security in a hybrid cloud environment is even more problematic. Organisations have been securing their datacentres for years, implementing a variety of different solutions and ensuring they follow security best practice. For hybrid cloud environments, however, it is questionable whether there’s a one-size-fits-all solution that maintains the right levels of control for both on-premise and in the cloud. This is a challenge that a lot of organisations are facing today. In fact, the security requirements of on-premise and in the cloud are different, so it’s important to treat them separately when applications and workloads are migrated. However, this ‘two speed’ security strategy can leave the cloud vulnerable if it’s not a joined up approach. As an organisation’s cloud infrastructure and assets are directly accessible over the internet, it’s a vulnerable attack surface through which malicious actors can attempt to gain access, potentially exposing the datacentre too. This is why it’s imperative that the level of security in the cloud is the same as in the datacentre – or even higher, given the increased visibility and accessibility of cloud services. Managing the multi-cloud Today, many organisations – and many departments within them – use a range of different cloud providers. This could be in order to benefit from best-in-breed services, or to prevent vendor lock-in or cloud provider outage. It might even be due to local geo-political considerations. Where there is no defined cloud strategy, however – especially if IT is not directly involved – the result can sometimes be an unmanaged and ungoverned cloud deployment. This will likely involve different management tools, different training requirements and a diverse way of securing the cloud solutions too. To combat this complexity, an organisation needs to establish a clear cloud strategy, which includes proper rules and governance around how, when and why cloud service providers are chosen. You need to monitor your cloud Any security strategy which encompasses cloud, whether hybrid cloud or multi-cloud, needs to include monitoring. Workloads can be deployed at pace both within and across cloud service providers and, from an operational and security point of view, it can be a real challenge to maintain and manage them on a daily basis. Restricting deployments will risk preventing organisations from achieving the full benefits of cloud, however. So it is vital that organisations monitor their cloud deployments from a workload perspective, ensuring security alerts and incidents have visibility and, if applicable, monitoring regulatory compliance levels too. Only once you achieve visibility of your cloud infrastructure can you begin to safeguard its assets and workloads. 
Automate to optimise security Automation is the key to realising the consistent levels of security across both on-premise and cloud environments, while also enabling the business to meet its objectives around time saving, agility, scalability and cost effectiveness, in this vastly changing ecosystem. Unfortunately, all too often traditional security is seen as a cumbersome and not considered part of the automation pipeline. While infrastructure and applications are becoming highly automated, using tools like Terraform, this Infrastructure as Code methodology can often ignore security configurations and controls. However, by building security into an automation strategy – as part of the ‘shift left’ paradigm, rather than being ‘bolted on’ – security can be an integral part of the application lifecycle, rather than an unwanted constraint. Remove security as a barrier Security has been one of the main reasons organisations have chosen not to migrate to the cloud. If good practices are put in place, however, it should no longer be a barrier. Organisations must first determine why they want to move to the cloud, and identify the technical and business drivers for the change. They then need to define a cloud strategy which encompasses every part of the business, from development to security, from legal to operations. Finally, a joined up, cohesive approach is imperative: if every department goes its own way, there’s no way of knowing if the cloud services being used are in conflict with other services deployed across the organisation, at odds with the technical constraints of the business, or – more seriously – whether they’re undermining the corporate strategy. The scalability and flexibility offered by the cloud provides great opportunities for organisations to accomplish their business goals. In this new app-centric world, the rate of change needs to be reflected across all aspects of IT, whether that’s development, infrastructure or security. Cloud plays a vital part in achieving this. However, as outlined, this brings extra responsibilities to an organisation, and anxiety to those cybersecurity professionals required to secure it. A security strategy first approach is key to making sure security moulds to this new cloud environment and, crucially, grows as an organisation’s infrastructure grows, whether that’s on-premise or in the cloud.[/vc_column_text][/vc_column][/vc_row] ### Predictions for regtech in 2019 [vc_row][vc_column][vc_column_text]The global regtech market is growing exponentially, powered by significant regulatory changes such as MiFID II, PSD2 and GDPR, which all brought with them extensive new obligations for financial services firms. Currently valued at $2.3bn, the regtech market is expected to reach $7.2bn by 2023, coupled with a predicted 500% increase in investment in the sector by 2020, from $10bn in 2017 to more than $53bn at the end of the decade. Regtech encompasses the vast number of tech solutions to firms’ regulatory obligations, and addresses regulatory reporting, compliance checks, risk management, identity management, transaction monitoring and more, ultimately helping firms to operate for efficiently and competitively. 2018 was a huge year for the industry, with the introduction of the aforementioned MiFID II, PSD2 and GDPR, all of which have had a transformative effect on how firms operate. But the evolution of regtech is far from over, and the potential impact it could have is only beginning to be understood. 
Below, Matt Smith, CEO of SteelEye, outlines his predictions for regtech in 2019 – and beyond. Regulators will wake up to the value of Regtech So far, regulators have done little to support regtech and there has been minimal if any, collaboration among regulators to help boost regtech’s impact across different industries and regulations. But I predict this is going to change, and regulators will start becoming more proactive in encouraging innovation and collaborating on regtech solutions. Unlike most industries, financial services have no governing body responsible for setting a collaborative regtech agenda, so implementation efforts across different regulations have not so far been aligned. But in 2017, the International Regtech Association (IRTA) was launched, with the aim of promoting standards, certification, innovation and collaboration in the sector. This was supported by a recent report promoting the adoption of regtech by regulators to streamline their own internal processes, which the FCA has already begun by exploring how technology can make it more efficient and relieve the burden on firms. As the industry becomes more established, it’s likely that in 2019 this enthusiasm will be taken further as regulators continue to see the value of regtech and encourage its adoption and consolidation across industries. Investment in the sector will rise While regtech has had a very strong start in terms of investment, it is still a fairly new industry and therefore investment is only likely to increase in 2019 and beyond. Until recently, financial firms were relying on old, legacy systems that were inefficient and extremely costly. Regtech has begun providing an alternative, with scalable solutions that have helped revolutionise firms’ compliance processes – often taking 15 minute compliance processes down to a mere three. It’s likely, then, that firms who have been ‘turned on’ to the ways of regtech will continue looking for more game-changing answers to their compliance problems. By 2020, it’s estimated that there will be 300 million pages of regulations, and fines for non-compliance will rise accordingly. New directives yet to be implemented, such as the Fundamental Review of the Trading Book (FRTB) and the Central Securities Depositories Regulation (CSDR) will further increase firms’ obligations and necessitate more regtech to help meet these demands. With the road ahead uncertain and firms’ obligations increasing, regtech’s job looks set to get much bigger. In the first half of 2018 alone there was an investment of $1.37bn – more than for all of 2017. It’s only right, then, that investment into new and exciting regtech solutions, or the expansion of existing brands, will increase in 2019. Regtech will move beyond regulation  Regtech obviously has a major role to play in helping firms with their compliance obligations. But its benefits go far beyond streamlining long and tedious regulatory processes, and in the year ahead I predict that regtech will move far beyond being used simply for regulation. Today, firms generate data through almost all of their processes, and new data visualization tools and cloud software have only taken this further. With such vast amounts of data being created, technologies such as AI and machine learning are enabling companies to gain insights into regulatory practices and reporting, and carry out meaningful inspections of critical compliance risk areas. 
Companies are increasingly using regtech to deal with this data, not only to help with compliance, as discussed above, but to gain business insights and get an edge over their competitors – by gaining a better understanding of their customers, trying out new products or improving their operative processes. Regulations such as the Market Abuse Directive/Market Abuse Regulation (MAD/MAR), for example, have ramped up compliance requirements for trade and communications surveillance. This has meant firms now have vast amounts of scattered, disparate data. Recently, however, as firms have woken up to the value of regtech, we have seen a trend towards firms leveraging all this data on their trades. By deploying regtech’s analytics, firms have used surveillance data to drive better decisions on risk capital allocation, or to highlight where traders achieve poor risk-adjusted returns – ultimately giving firms the insights to gain a competitive edge, drive efficiencies and enhance their processes. I expect, then, that big data and innovation in regtech tools will allow providers in 2019 to make products that are more commercially exciting, and have a broader range of uses.

An exciting year ahead

It’s set to be an exciting year in 2019 for regtech, with more investment, regulator involvement and even more advanced products. As firms continue to realise how much value these new technologies can bring, regtech could have a truly transformative effect on the financial sector, helping firms not only to better manage their compliance obligations but perform more competitively and innovate faster.

### How Blockchain Is Revolutionising the Crowdfunding Landscape

Blockchain technology is gathering momentum, creating new and interesting opportunities for people around the world. Prior to the advent of blockchain, one of the most innovative fundraising concepts was just beginning to gain viability with the start-up community: crowdfunding. Crowdfunding has grown steadily as an alternative to venture capital funding. As a result, non-traditional projects had a new audience to pitch to and raise funds for their cause. Now, with blockchain disruption, crowdfunding could be revolutionized to unlock new use cases and enable global outreach. Before diving into how blockchain technology and decentralization will revolutionize the crowdfunding industry, let’s take a look at what blockchain is and what it does.

Blockchain Technology and Decentralization

There’s a common misconception that Bitcoin and blockchain are one and the same; however, that is not the case. The blockchain is the underlying technology behind the success of cryptocurrencies. It acts as a data structure that holds various records while ensuring utmost security, transparency, and decentralization. While creating, storing and facilitating transactions of cryptocurrencies is one of the applications of blockchain technology, there are numerous others. Additionally, there are some features that make this revolutionary technology stand out. Perhaps one of the most ingenious notions to come out of the blockchain revolution is the use of smart contracts. Different from traditional deals, smart contracts are self-executing and self-verifying. In fact, smart contract implementation may soon disrupt industries that operate on a contractual basis, such as finance, insurance, and real estate, among others.
For example, smart contracts can manage the supply chain and ensure ethical sourcing of healthcare products.

The Crowdfunding landscape

Crowdfunding is an equally big concept, only less complicated. It is a popular way to democratize fundraising for startup projects and charities, a miracle of the modern Internet age. When crowdfunding works, it brings new products to life. However, the model is still extremely inefficient. It’s centralized, just like venture capital firms, with a central authority controlling the platform. For this reason, preferential treatment is awarded to campaigns the crowdfunding platform may deem worthy of public support. Consequently, approximately 78% of campaigns end up falling short of their funding goals. Also, only 1.2 percent of crowdfunding campaigns go to developing countries, which are actually starved of start-up funding. As such, blockchain disruption in the crowdfunding sector is not only inevitable but also necessary. So, how can blockchain benefit the crowdfunding industry?

Initial Coin Offerings

Above all, the crowdfunding sector is a perfect fit for blockchain disruption. In particular, the blockchain industry’s answer to crowd financing, Initial Coin Offerings (ICOs), is already popular among start-ups today. In a nutshell, ICOs, which are actually analogous to public offerings, involve issuing tokens which act as the company’s shares without any exchange of equity. Investors purchase the crypto-assets (tokens), instead of shares, using either cryptocurrency or fiat currencies such as US dollars. A majority of crowd-sales are governed by smart contracts, which dictate the terms and conditions of the sale of the tokens. Token sale participants send their contributions via smart contracts, which automatically allocate tokens to the participant. In this scenario, the developers are the counterparty. Hence, they collect contributions from crypto-investors who invest in the project’s newly issued tokens. In the end, the company benefits from its ICO by bypassing various hurdles set by venture capital firms and banks.

Future developments in token crowdfunding

Indeed, the crypto-world will never cease to evolve. One of the latest inventions, the Initial Exchange Offering (IEO), could soon disrupt how start-ups raise funds through token sales. In this new system, the tokens are sent to a suitable exchange, which acts as a counterparty, making them available to individual investors. The price and conditions for the sale of the tokens are subject to an agreement between the platform and the developer. In detail, this agreement includes the number of tokens set aside for the development team, the percentage of tokens the platform will hold back from the sale, and the flat-rate price for the tokens. As a result of listing a token, the exchange receives a given percentage from the sale of tokens. In addition, the process of marketing a new token also attracts new users to the exchange. Since the exchange is equally invested in the project, feasibility and viability studies are done prior to showing interest in listing a particular token. Once the exchange is satisfied, it demands a fee from the developers and compliance with the platform’s terms and conditions of use.

Making the case for Initial Exchange Offerings

The number of ICO scams is alarming. The lack of regulation and legislation is the driving factor behind this problem. In response, IEOs will provide a more secure setting for investments into the crypto-sphere.
To illustrate, while literally anyone is eligible to launch an ICO, only select projects that pass the feasibility screening will list on exchange platforms. Thus, the investors are assured of the legitimacy of a company’s token sale. Unlike the preferential treatment awarded to certain projects on crowdfunding sites, this screening is done solely to protect investors from swindling, rather than to drive traffic to certain projects. In the same way that crypto-exchanges employ refined technology to patrol their platforms against fraudsters, IEOs will enjoy an equally high level of security. But wait, there is more. The exchange will conduct AML/KYC on behalf of the participants, instead of the token issuer as in the current ICO model. However, the fact that investors have to invest through exchanges locks out potential investors who like to make anonymous investments in crypto-assets. Additionally, the developers will have to factor in the cost incurred when setting the flat rate of the token, increasing the cost of each token.

In conclusion, a new era in crowdfunding is on the horizon. Blockchain technology is facilitating the democratization of capital for startup companies. However, ICOs, the preferred model for start-up financing on the blockchain, sit on a precipice. Despite this minor setback, blockchain-based start-up financing is making a comeback as crypto-exchanges take on more responsibility for ensuring only authentic projects are allowed into the market. With IEOs, investors can be confident that their investments go into worthwhile projects. What do you think? Is blockchain-based crowdfunding here to stay?

### On-Premises or in the Cloud––What BI to Choose?

The use of cloud-based business intelligence (BI) tools in enterprises is a growing trend. Louis Columbus suggests that in 2018 cloud BI had grown to twice its 2016 adoption level. However, for businesses considering a switch from an on-premises business intelligence platform, following a trend simply for the trend’s sake is not always a good plan. In this article, you can discover the advantages and disadvantages of cloud-based systems and whether they genuinely present more benefits than traditional in-house tools.

Exploring the Cloud

Cloud-based business intelligence has been around for a few years now, gaining in popularity and adoption across various industries. Breaking it down by sector, Forbes suggests 2018 adoption rates of around 62% for Financial Services and 54% for Technology and Education. Such BI tools allow businesses to analyze KPIs, trends and other metrics via a variety of systems, with Amazon AWS, Microsoft Azure and Google Cloud leading the pack. With such impressive rates and providers behind the movement, it’s difficult not to be enchanted by the potential presented by the technology. But is all as it seems?

Advantages of the Cloud

These are the benefits that enterprises engaging cloud-based BI tools can expect to receive:

Scalable Cloud software is often expandable to meet the needs of a business; pricing plans will vary accordingly based on the number of users and required functions.

External maintenance and updates The monitoring of cloud-based tools is the vendor’s responsibility, meaning your staff will not be required to update or maintain the software, only to use it. It should also be updated regularly by the vendor, ensuring the product’s security and relevance.
Excellent reporting standards Such tools are often able to sync with other software, making reporting more straightforward. This saves time for employees and reduces the need for copying and pasting information from one source to another. Challenges of Cloud-based Solutions While the advantages of cloud solutions are impressive, there are some challenges along the way that could prevent a perfect BI experience. These include: Integration issues If your business is seeking a switch from traditional on-premises BI to a cloud-based solution, it’s likely that there will be some integration issues. This may include data that doesn’t transfer well, programs that don’t work together and other problems that prevent a seamless data stream. Additionally, in terms of staffing, a business may experience additional integration issues with the roles and responsibilities of current employees, which may involve training, thus adding further expense. Security concerns While cloud-based software is reasonably secure, many organizations (up to 77% according to RightScale) may still feel discomfort with storing vital business data on the cloud. This is particularly relevant in the case of financial information and corporate secrets which could potentially be compromised. Further, certain businesses are subject to strict data storage regulations, in particular financial and healthcare companies, in which case it is essential that data protection is up to industry standards. For them, using cloud solutions would make compliance more challenging. Reliance on a third-party provider No matter the provider, employing BI cloud services means relying on a third party. This may affect a business if the cloud temporarily goes down. Even if it quickly rectifies, this can lead to a loss. Besides, unlike on-premises systems, in the case of a system failure businesses may find it problematic to employ efficient contingency planning in the case of downtime. Connection issues between cloud and server As much as we like to believe that the cloud is a faultless infrastructure, the issue of connectivity remains. In case of an Internet failure, backup measures need to be in place to ensure data is synced once the system is back online. In addition, it may slow down working time if a lag in the server-cloud connection occurs. Although for those with a solid, reliable Internet connection, this isn’t too biting an issue. Keeping BI On-Premises While cloud BI is improving and booming, for large enterprises the decision to switch or continue with on-premises solutions is often a tough one. In some cases, innovation is not always worth the risk, whereas in others it is a cost-efficient answer to an organization’s BI woes. On-premises solutions generally offer the same BI capabilities as the cloud; however, in practice, the two work very differently. In this section, we will explore the advantages and challenges of an in-house BI tool and why it may prove more suitable for your enterprise. Advantages of On-Premises BI Businesses that decide on in-house solutions can expect the following benefits: No compromises on security Unlike cloud solutions, crucial data is stored on site, meaning essential BI information is less likely to be exposed. In this case, the security measures will rely solely on the business itself that can adjust levels to meet its particular needs. It also means that if a data leak has occurred, it would be much easier to track who had access to the information and when. 
Guaranteed maintenance and contingency When systems are built and maintained on-premises or with the help of an external BI development team, the business itself is responsible for maintaining the services and tools. While this may sound like extra work, it also means flexibility in available tools, planned downtime, and available staff if an issue occurs. It also brings reduced contingency issues, as BI software is already tailored to meet the needs of the company.

Pay-off in the long term While initially setting up an entire BI network within a company may be quite a costly exercise, this pays off in the long term as the solution can be tailored to meet the growing needs of the company without any unneeded supplementary tools. Additionally, staff will become more knowledgeable as their skillset advances. And of course, no one can discount the stability of having a system in-house.

Challenges of On-Premises BI

With such strong advantages, companies sticking to tried and tested on-premises solutions should feel no shame in not following the trend. That said, those choosing this route should be aware of the following obstacles to be overcome:

Installation may not be a smooth process In theory, simply paying for and using third-party cloud BI may be a more straightforward solution than designing and developing your own, at least in the short term. It will take time and testing to install and successfully use tailored on-premises solutions.

Additional staff training needs On-premises often means more staff: those who will work with, engineer and monitor the system. These would be the people that your cloud outsourcer would hire. However, for software produced in-house this onus is on you or your provider, which may prove more or less expensive. Additionally, staff using the system will require additional training to use the new tools successfully, though this is also true for the use of cloud systems.

Scalability and flexibility The ability of your team or provider to scale and personalize your BI tools after they’ve been created is limited by cost, personnel and time. While you can tailor to specifically meet your needs, it does mean investing further funds to keep up with the latest BI trends.

Hybrid––the Middle Ground?

One option that hasn’t been mentioned is the ability to combine cloud and on-premises solutions to work in a hybrid environment. This can bring together the flexibility and scalability of public cloud BI services with the security of private on-premises solutions. According to BI consultants, how this works in practice is individual to every organization but may prove a suitable middle ground for companies seeking quick expansion while mitigating security risks. To sum up, both cloud and on-premises BI present their own advantages and disadvantages. Each business should evaluate for itself which areas hold the most relevance for its sector. However, one thing is clear: following the crowd blindly is not the way to go. Each company should thoroughly examine the advantages and disadvantages, taking into account its specific needs and requirements.

### Adaptive and value-focussed – the future of technology in healthcare

The National Health Service (NHS) has always been committed to universal healthcare, irrespective of age, health, race, social status, or ability to pay. Those values have not changed since its inception in 1948, but the world has.
The whole of the healthcare sector and in particular the NHS must adapt to take advantage of exponential technology change.  Today that change manifests itself in digital transformation.  Successful technology can turn an ugly duckling into a beautiful swan but poor execution can leave you with a very expensive, very fast ugly duckling. It’s a problem that NHS Digital Academy CEO, Rachel Dunscombe, is only too aware of. “Technology is the only option for the NHS to make significant gains in productivity and safety,” she says.  “All other avenues will give marginal gains but not the major impact on our health and care system needs.” Over the years, healthcare has seen as wide a divide between ‘IT’ and ‘the business’ as any vertical you care to name. Evidence from healthcare organisations often points to a lack of appreciation of each other’s issues at all levels so the solution needs to be top down.  Whether by Government-imposed constraints or lack of knowledge, the failure to leverage talent from outside the sector has led to many good ideas falling on stony ground and consequently failing to blossom.  It’s that very reason that the NHS Digital Academy was born.  The NHS Digital Academy is a virtual organisation set up to develop a new generation of healthcare digital leaders who can drive the information and technology transformation of the NHS.  The NHS Digital Academy, through a partnership with Imperial College London, the University of Edinburgh, and Harvard Medical School, provides a year-long fully accredited learning programme (Post Graduate Diploma in Digital Health Leadership) for digital change leaders. Dunscombe is keen to link the success of technology with those driving change: “Leadership is essential in this space as digital is the platform for reimagining how we deliver health and removing geographical constraints and changing the skills mixes we need. This journey needs people who can tell the story of how we will revolutionise the way we work. This will have an impact for almost all the NHS workforce and so strong leadership who can build trust and seek win-win situations are essential to the momentum. Traditionally in healthcare, the gap between technology and delivery has been wide - the best leaders are now bridging this gap - getting everyone pulling on the same end of the rope for positive change.” To futureproof itself our healthcare sector needs to move away from just throwing technology at problems and hoping for the best.  It needs to be supported by cultural change to embrace the potential of AI, Cloud and digital whilst retaining focus on patient outcomes.  Healthcare could learn much from practices such as DevOps, Agile and ITIL 4 where the co-creation of value through collaboration of all stakeholders is at the heart of what is delivered.  That means leveraging the best solution through collaboration between patients, healthcare technology organisations and clinicians.  It is encouraging to note that a significant percentage of the NHS Digital Academy cohorts 1 and 2 are from clinical and non-IT backgrounds.  The CIO of the future must not be blinded by the dazzle of new technology, but rather apply technology solutions to healthcare in ways that co-create value for all stakeholders. Stakeholder expectations are shifting and the healthcare sector is no exception.  Be it the surgeon reviewing new techniques through YouTube, cancer research using AI or the patient checking into the surgery on a touchscreen, expectations of new technology are sky high. 
Can healthcare keep up? If it is to succeed it must overcome four key challenges related to legacy systems. Healthcare CIOs must not let this “Legacy Backlog” stifle innovation. The four challenges are:

- Switching to the cloud – choosing the right architecture to bring together what is often a patchwork quilt of diverse systems
- Failure to execute – there is a marked difference between being alerted to the value of technology and realising that value
- Choosing the wrong AI or RPA projects – it is easy to be dazzled by the bright lights of technology at the expense of value in areas like patient outcomes
- Swamped in data – the sheer volume of data provided by modern tools and medical techniques needs to be structured and utilised

The keys to making technology work are building organisations that focus on value, speed, and agility. Every healthcare provider must be led by people who understand and can execute this. This is where the NHS Digital Academy is trying to augment the abilities of those leading change. Dunscombe is passionate about this, adding: “The biggest capability gap in the NHS is leaders who can work with and at Board level to fast track the opportunities digital can bring. Through the Academy, we are slowly getting to the new world with digital and technology leads on boards.” With the right people in place, twenty-first-century technology solutions in healthcare should see the beautiful swans emerge.

### TAMing the data centre: keep control over assets

Effectively managing physical IT infrastructure - let alone optimising it for best output - is an increasingly tricky feat. Trends such as the upsurge in connected devices, as well as the move to edge computing, mean that assets are increasingly spread over disparate environments, leading to complexity if IT managers are unable to obtain visibility over all environments, assets, and workloads. Couple this with increasing pressure from the wider business to reduce risk, get and stay compliant, and save money, and taming the data centre becomes challenging to say the least. As a whole, the demands of modern-day IT mean that organisations need to prioritise technologies that enable IT teams to monitor, manage and protect not only their physical assets, but virtual ones too - rendering traditional IT Asset Management (ITAM) insufficient. So, from the desktop to the data centre, to colocation and edge as well as IoT devices, how can an organisation achieve full control over its assets?

ITAM or TAM: what’s the difference?

Siloed technology is the bane of any data centre manager, and having a management system which is unable to work or communicate with another system is an unnecessary stress when organisations want to move in real time - not in extended query-respond cycles. While traditional ITAM set out to track and manage all hardware and software within a business, these two areas of infrastructure no longer fully cover everything that comes into contact with the network. IoT devices such as CCTV, medical and industrial equipment all need to be considered too in order to comprehensively acquire information on all assets driving business activity and reliant on the data centre or corporate network.
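As a rough illustration of what such a unified asset record might look like once IoT endpoints are included, here is a minimal sketch in Python. The schema, device names and 90-day patch window are invented for illustration; they do not represent any particular asset management product:

```python
from dataclasses import dataclass
from datetime import date
from typing import List

@dataclass
class Asset:
    """One record in a unified technology asset inventory (hypothetical schema)."""
    asset_id: str
    kind: str          # e.g. "server", "cctv-camera", "infusion-pump"
    location: str      # site, rack or cloud region
    owner: str
    last_patched: date

def stale_assets(inventory: List[Asset], max_age_days: int = 90) -> List[Asset]:
    """Flag assets whose recorded patch date is older than the allowed window."""
    today = date.today()
    return [a for a in inventory if (today - a.last_patched).days > max_age_days]

inventory = [
    Asset("srv-001", "server", "DC1/rack-04", "it-ops", date(2019, 5, 1)),
    Asset("cam-114", "cctv-camera", "HQ/reception", "facilities", date(2018, 11, 20)),
]

for asset in stale_assets(inventory):
    print(f"{asset.asset_id} ({asset.kind}) at {asset.location} needs attention")
```

In practice this kind of check would run continuously against an automatically discovered inventory rather than a hand-written list, but the principle is the same: one record per asset, whatever its type, with simple rules applied across all of them.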
Technology Asset Management (TAM) is a superset of Asset Management that can oversee multiple assets across physical and virtual environments, including software, networked building infrastructure, data centre infrastructure, applications in the cloud, personal computing devices and millions of IoT devices. Acting as a single pane of glass, TAM allows data centre managers to see their entire technology portfolio in terms of status, health, security, and compliance. The most critical component of TAM is its ability to provide real-time data, and it is this data that allows businesses to benefit from the following tangible, performance-enhancing results:

Making risk redundant

The explosion in connected devices has inevitably led to greater security risk when it comes to securing a company network. More devices mean more endpoints that, if reached by nefarious hands, could have the potential to be devastating to a business’ security, productivity and overall ability to operate. It is clear, then, that comprehensive visibility over these devices is crucial. A modern approach to TAM automates the discovery and cataloguing of all things connected to the network. It interconnects with other building, IT, and business systems, providing a comprehensive solution that offers transparency and accountability across the entire extended organisation’s tech backbone. Not only this, but there will always be opportunistic hackers who prey on old systems and outdated security operations. Checking individual software to see whether it is compliant with current security protocols and hygiene is a time-consuming task. Security managers and compliance officers need the ability to clearly see what is installed and where, validate its patch history, and know who has access to it - something that is made achievable quickly through TAM.

Beating the bills

Money and efficiency are at the top of everyone’s agenda, and data centre managers are no different. By gaining appropriate visibility into where data centres are failing to use resources efficiently, TAM makes it possible to redistribute and rightsize asset usage and inventory. This is particularly important for colocation providers renting out space and assets; by identifying redundant, under-utilised or improperly utilised assets, colocation providers can benefit from increased revenue. Equally, accurate billing and chargeback based on actual power usage is made possible, which stands to benefit both provider and customer. TAM achieves this by being able to connect to CMDB, ERP and HR systems for accurate accounting, all in one place.

Compliance

Thirdly, TAM gives the data centre manager peace of mind when the auditors turn up. Organisations in all industries are often subject to an audit, particularly those in the process of a merger, an acquisition or going public. With this in mind, IT operations teams need to understand how to self-audit as well as how to perform requested audits, and traditionally these occurrences might have caused panic as staff scramble to find information on all assets, their locations and life cycles. Furthermore, regulation such as GDPR now dictates that companies provide Data Breach Notifications indicating which assets the affected data subjects’ data ran on, and identify secondary infrastructure locations for the safe handling of data transported across borders.
The best way to mitigate these risks of fines is with strong reporting capabilities that can not only provide evidence that a company is compliant, but can also generate and keep the required documentation in the event of a breach so that licence remediation can be made instantly.

A single source of truth for full control

Taking all of this into consideration, TAM is a must-have in any organisation’s tool kit, as it provides managers with a single source of truth for a very complex, diverse, and fast-moving environment. Having scaled up to accommodate IoT devices and other developing technologies, it would be disappointing for an organisation to fall short at the task of maintaining and optimising them. To tame the data centre is to have full control and visibility of all assets, which leads to the safe running of the entire business - and is essential to the effective running of the hyperscale data centres that today’s enterprises increasingly rely on to operate.

### More than half of British businesses witnessed cyber-attacks in 2019

The number of UK firms reporting a cyber-attack has increased, though most businesses admit they are ill-prepared for a breach, according to research from Hiscox (LSE: HSX), an insurance provider and underwriter at Lloyd's of London. A week hardly passes without news of a major cyber incident, and most of them have had serious adverse effects. Data theft is rampant, ransom demands are steadily rising, and the cyberspace within which businesses must operate is increasingly hostile. The cyber-threat is unavoidable in today’s business world. Hiscox surveyed approximately 5,400 small, medium and large businesses across the UK, Belgium, Germany, France, Spain, the Netherlands and the United States. The insurer found a "sharp increase" in data theft, fraud, sabotage, and extortion, increasing both the cost and frequency of cyber-attacks in one year alone. According to the Cyber Readiness Report 2019, more than 60% of firms reported one or more attacks, up from 45% in 2018.

No One Is Immune

Fifty-five percent of UK firms faced an attack in 2019, up from 40% last year. Despite this, three quarters of firms were still "novices" in cyber-readiness. The report highlights that many businesses "incorrectly felt that they weren't at risk". Yet, in each of the 15 sectors tracked, the number of firms reporting one or more attacks showed a sharp rise. The most heavily targeted sector was the Technology, Media and Telecom (TMT) space. In all seven countries, there was a rise of 21% over the year, with 72% of respondents reporting at least one attack. In second place came the public sector, which saw a 16% increase (71% reporting an attack), followed by financial services (67%, up from 57%). Losses from breaches averaged between $229,000 (£176,000) and $369,000 (£283,000), an increase of 61%.

UK Businesses Are Less Prepared

The insurer said the percentage of firms scoring well on cyber-security ratings was on the decline, particularly in UK organisations. Firms in Britain had the lowest budget for cyber-security, spending an average of just $900,000, in contrast to $1.46 million across the other six countries. They were also as likely as a U.S. firm to have a "defined role for cyber-security" in their organisation. In France, the proportion of firms prepared to deal with cyber-crime was much lower, just one in ten.
Gareth Wharton, Cyber-CEO at Hiscox, says the low UK spending might be attributable to the large number of small businesses in Britain. “They may feel like they won't be targeted, as we tend to only read about large breaches in the press,” he continues. “If they incorrectly feel that they won't be targeted, they may be less likely to spend on cyber-security." But Hiscox also discovered that the average cost of an attack in the UK was substantially less (an average of $243,000) than an attack in Germany ($906,000) or in Belgium ($486,000).

More Firms Now Appoint A CISO And Increase Their Efforts

Among the full sample, the fraction of businesses with ‘no defined role for cyber-security’ has been cut in half (from 32% to 16%). Not all of the rest have their own cyber-security lead or a dedicated cyber team; 19% use a third-party provider to oversee their cyber-security. But three-quarters of small companies have at least one person or an external supplier managing cyber-threats (up from 56% a year ago). They are still a long way behind larger enterprises, the vast majority (95%) of which have a defined cyber-security role, but it still indicates that things are moving in the right direction. Roughly two-thirds of respondents (67%) report their cyber-security spend will rise in 2019, up from 59% a year ago. While more money directed toward technology remains a goal for half of the survey respondents, the numbers of those planning to direct more money to employee training, cyber-security staffing and consultants or third-party services are markedly higher. In other words, more attention is being paid to people and processes – which is positive, too. The cyber-risk may evolve rapidly, but the battle to mitigate and manage it is also changing. The old adage ‘prevention is better than cure’ pops to mind – and indeed, being cognisant of the threats is half the battle. New laws have also spurred action, with 80 percent of UK firms saying that they implemented changes after tough new EU data protection (GDPR) rules came into force last year.

### How to protect yourself from invoice fraud

There are many types of fraud that small businesses might face, including identity theft, payroll fraud and return fraud. But there is one type of fraud that often flies under the radar and collectively costs SMEs £9bn every year – invoice fraud. Like many other types of fraud, invoice fraud relies heavily on social engineering. It occurs when fraudulent invoices are sent to a company and paid due to insufficient or vulnerable accounts payable processes. It’s an easy enough scam for criminals to execute if they’ve done their homework. Often, they send the accounts payable department an invoice marked as late for an amount that they know is low enough to bypass approval and not cause too much suspicion. In other instances, the fraudsters simply duplicate invoices, inflate them or raise the price beyond what was originally agreed. If these invoices aren’t investigated (and they are specifically designed to avoid triggering a closer look) they often just get paid. Businesses with poor internal communication between departments are particularly susceptible - and this isn’t restricted to SMEs. Dell recently had more than $1 million stolen by an employee who filed fraudulent invoices over the course of five years.
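Checks that catch this kind of fraud do not need to be sophisticated. As a purely illustrative sketch (the suppliers, account numbers and amounts below are invented), a few lines of Python can flag invoices submitted more than once and suppliers whose bank details happen to match a payroll account:

```python
from collections import Counter

# Hypothetical records pulled from payroll and accounts-payable systems.
payroll_accounts = {"20-40-60 11223344", "30-50-70 99887766"}

supplier_accounts = {
    "Acme Supplies": "40-10-20 55667788",
    "Quick Print Ltd": "20-40-60 11223344",   # same account as an employee's salary
}

invoices = [
    ("INV-1001", "Acme Supplies", 480.00),
    ("INV-1001", "Acme Supplies", 480.00),    # duplicate invoice number and amount
    ("INV-2042", "Quick Print Ltd", 95.50),
]

# 1. Suppliers paid into the same account as an employee's payroll.
suspicious_suppliers = [name for name, acct in supplier_accounts.items()
                        if acct in payroll_accounts]

# 2. Invoices that appear more than once.
duplicates = [inv for inv, count in Counter(invoices).items() if count > 1]

print("Suppliers sharing a payroll bank account:", suspicious_suppliers)
print("Duplicate invoices:", duplicates)
```

Real accounts-payable data is messier than this, of course, but even simple cross-checks like these are exactly the sort of analysis that catches long-running frauds.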
In another case, the FD of a mature mid-size company ran an analysis comparing payroll bank account details with supplier bank account details. One invoice had been running for ten years and, despite being for a small amount, it cost the business nearly £10,000. It’s a particularly dangerous crime because it’s both easy to understand and easy to execute. Companies have many relationships to keep on top of – suppliers, customers and employees – and so it can be difficult to keep track of all agreements made with each party. While accounts payable departments should be responsible for managing invoices, it’s up to the organisation as a whole to come up with a plan and implement the right processes and systems to protect the business against invoice fraud.

Preventing invoice fraud

Understanding how invoice fraud works and the financial impact it can have is the first step to tackling the issue. In theory, being extra vigilant and just getting a second pair of eyes on the books sounds like a good solution, but small businesses don’t often have the luxury of extra staff and additional resources. More often than not one person takes on many roles, which can be problematic – and as a result, mistakes are bound to happen. This is where procurement technology can come in handy. Small businesses are sometimes hesitant to use new tools and solutions as implementation can take time and resources to manage. However, a short teething period is well worth the reward. In the long term, the right technology can save you time and money. Procure-to-pay systems are designed to eliminate manual paper-based administration and block threats like invoice fraud before they occur. They can help manage invoices, purchase orders and expenses. Most procure-to-pay system suppliers should let you cherry-pick the features that benefit your business most, allowing you to personalise the software to cater to your needs. If you’re thinking of employing this type of software to combat invoice fraud, you’ll want to look for a verification feature. This means that any new supplier must be validated before signing on – and any supplier request to update or enter new payment information is verified before being approved.

Long-term approaches to tackling fraud

Tackling fraud is something that everyone involved in a business – employees, suppliers and clients – should take some responsibility for. Effective security is often rooted in education and awareness, so taking the time to talk about the risks of fraud and the potential effects it can have on the business will go a long way to fighting the problem. But there are other resources and products that can help you. Investing in the right technology, communicating with your peers and even joining relevant communities will go a long way to fighting fraud and stopping problems before they arise – ultimately having a positive impact on your company’s bottom line.

### Don’t lose your head - Consumer Devices

Today’s consumers use an increasing variety of devices, including tablets, smartphones, smartwatches and even voice-activated digital assistants, to do their online shopping. In fact, in 2017, for the first time, smartphones gained a higher share of e-commerce revenues than PCs. At the same time, customer experience continues to take priority over price as the main motivation behind purchases, so businesses must keep working to find new ways of maintaining loyalty.
Bring these two things together and you have a perfect storm. Customers are demanding the same great experience and service, regardless of the channel or device they use, for any shopping activity they undertake. The answer to this challenge is what’s known as headless commerce. Having the opportunity to present customers with rich but relevant product content across all touch points is an important aspect of the integrated customer experience. Headless commerce is about replacing fixed strategies for each channel with a personalised buying experience. This is based on tailoring information regarding new products and recommendations for each customer across all channels. The right approach to headless commerce ensures that customers stay engaged, increasing their propensity to buy, even if they are switching between channels to make a purchase. Even in B2B commerce, the boundaries are being pushed and there is a strong need for headless commerce. According to our recent research, 39% of global B2B organisations are using VR to personalise the customer experience while 40% are using progressive web apps and 39% are enabling wearable technology to place orders. The momentum of omnichannel within the B2B market has already refined the customer experience from what it once was. As a result, organisations are opting to combine direct sales channels with web stores, warehouses and back office functions to achieve seamless commerce. For example, if a customer abandons their shopping cart, they will receive a notification, email or phone call from the sales team to find out why. While traditional e-commerce solutions offer default front and back end layers, headless commerce platforms are able to deliver information without the need for a front-end layer. By decoupling the interface with the back-end, developers have more freedom to customise a webstore and review the buying experience. This means businesses can present content at different touch points throughout the sales cycle, whether it be on a mobile phone or a smartwatch. More and more businesses are also taking advantage of e-commerce 3.0 to build more functional webstores and improve their e-commerce solution for better customer experience. e-commerce 3.0 is the stage where companies seek more integration between their webstore and wider IT infrastructure. Headless commerce is an extension of this development. Where 3.0 mainly focuses on the integration of the webstore and back-end (ERP) system for transactional benefits, headless commerce looks at the front end of the CMS and how content can be delivered most effectively, easily and consistently across a business’ sales channels. Since e-commerce 3.0 enables efficiency across the system and allows for data to be logged within one source, it’s perfect for headless commerce. Integrating the webstore back-end with the ERP system allows an e-commerce solution to calculate pricing, tax, and shipping costs, and also ensure consistent product information management, amongst other benefits. Developers can use APIs within headless commerce to then easily customise what the customer views without having to rebuild the applications into the e-commerce platform. If businesses are able to work with one system to service their webstore and other sales channels, this can save time and resources that can be utilised elsewhere in the business. 
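As a rough sketch of that decoupling, the snippet below shows a front end requesting product content from a commerce back end over an API and shaping it for a particular touchpoint. The endpoint, field names and channels are hypothetical and are not the API of any specific e-commerce platform:

```python
import requests  # third-party HTTP library: pip install requests

API_BASE = "https://api.example-store.com/v1"   # hypothetical headless back end

def product_card(product_id: str, channel: str) -> dict:
    """Fetch product data once, then shape it for a given sales channel.

    The back end stays channel-agnostic; each front end (webstore, mobile app,
    smartwatch) decides how much of the content it presents.
    """
    resp = requests.get(f"{API_BASE}/products/{product_id}", timeout=5)
    resp.raise_for_status()
    product = resp.json()

    if channel == "watch":
        # A wearable only has room for the essentials.
        return {"name": product["name"], "price": product["price"]}

    return {
        "name": product["name"],
        "price": product["price"],
        "description": product["description"],
        "images": product.get("images", []),
    }
```

The point is that the same back-end call serves every channel and only the presentation layer changes, which is what allows new touchpoints to be added without rebuilding the commerce platform.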
While many B2B organisations focus on upgrading their existing webstore, some are implementing new technologies such as headless commerce in a bid to improve their customer experience. By building a headless commerce CMS system, businesses can ensure that their webstores are capable of serving multiple channels. Businesses that fail to future-proof their e-commerce platform and manage their digital sales channels as a single entity will eventually see more savvy competitors with headless commerce strategies pull ahead. Simply using an online webstore is no longer enough. Utilising all the sales channels available to provide a seamless and personalised omnichannel experience is vital to maintaining a high-quality customer experience that results in optimal sales.

### To keep the botnets at bay: regulate or innovate?

As disruptive technologies such as 5G launch, the number of connected devices is poised to increase dramatically. Indeed, Intel predicts that a remarkable 200 billion connected devices could be operating by 2020. But this sharp rise in connected devices creates more opportunities for hackers, and cyber-attacks are becoming increasingly sophisticated. A growing number of attacks are now developed with the express aim of compromising IoT functionality, according to the ENISA Threat Landscape Report. Last year malware was identified that was designed to compromise industrial safety systems, which could have devastating consequences. All of this makes security of the IoT an urgent matter. But while it is crucial to ensure secure IoT devices and services, it is also important that we maintain the high level of innovation the IoT has generated thus far to guarantee further development and growth.

Security – an afterthought

IoT security is often overlooked in the development of connected devices, either because the manufacturer lacks the technical expertise or is facing commercial pressure to get products to market quickly. These factors, combined with a landscape of rapidly emerging standards and technologies, have introduced a new set of risks. One of the most devastating cyber-attacks that originated from IoT devices was the Mirai botnet attack of 2016. The incident saw huge numbers of devices infected with malware that attacked core Internet infrastructure. Approximately 24,000 devices were targeted, with the University of California putting the cost of the incident at $300,000. Alas, Mirai has not been an isolated incident. This year a new variant of the botnet was detected, which targets enterprise networks and has the potential to infect an even wider set of Internet-connected devices. As the IoT ecosystem grows, so does the threat. This is the challenge that industry faces moving forward.

The options to deal with the journey ahead are varied. Regulation of the IoT has been touted as one route to developing a secure and robust network. There are benefits to this approach, such as the provision of more effective protection for consumers. However, while regulation is often said to create a level playing field, smaller companies may find compliance more of an obstacle than their larger competitors. Care also needs to be taken that regulation does not stand in the way of innovation and is able to keep pace with the rate of development in the IoT.
Establishing a centralised regulatory body for something as diverse as the IoT, affecting everything from cars to thermostats, would be a challenge as it would need to bring together an equally broad set of competencies. It is also important that any regulatory interventions are coordinated at a global level, which might be easier with a targeted focus on particular areas of interest. A sectoral approach, in which existing industry regulators team up with industry and security experts, could be a more worthwhile exercise. After all, the Internet was built on collaboration, and this approach could be hugely beneficial to the development of IoT security standards. By working together to self-regulate the IoT, stakeholders could develop a more flexible, successful and secure IoT network. Voluntary IoT standards allow innovation and competition to thrive – but also support the development of products with security in mind.

The self-regulation revolution

This voluntary self-regulation for IoT security is already occurring, with the Department for Digital, Culture, Media & Sport (DCMS) in the UK establishing a Code of Practice in direct collaboration with a range of manufacturers, retailers and the National Cyber Security Centre in 2018. Moreover, in 2019 a group of the UK’s leading universities launched a new National Centre of Excellence for IoT Systems Cybersecurity, which will draw academics and industry experts together to create a secure and trustworthy IoT infrastructure. These efforts are to be commended, as they enable stakeholders to work together to share experiences and best practice around IoT security while competing on product features. The fruits of these labours will be IoT products that are safer for consumers to use. There is still some way to go before IoT security is seriously addressed. But encouragement can be taken from these types of initiatives. They represent the first tentative steps towards establishing the set of voluntary IoT security standards that the market so desperately needs. Alongside these projects, it’s important that the mindset changes too. Companies obviously need to consider their commercial pressures and objectives – but all parties need to recognise that they are operating in the same ecosystem, which needs to be protected. And a unique ecosystem like the IoT will require unique ways of working together.

### Broadband for business resilience and recovery

Every business of every shape and size depends on having an internet presence. If you don’t have a website, an eCommerce store, a way to communicate with your customers or a channel for marketing and advertising, you may as well not exist. Vital to managing all of this is a reliable internet connection. With more businesses doing more online, reliable business broadband is becoming increasingly important. If you’re moving to the cloud, adopting VoIP, using video conferencing to save money on travel or relying on any of the myriad other business functions that can happen online, what would happen if you lost those functions? How would your business cope without being able to get online? If you’re not sure how much it would cost your business to lose internet access, try our simple downtime calculator. You might be surprised at what you find! A business needs to take into account two main elements when considering the reliability of its connection: resilience and recovery. Let’s take a look at each.
Business broadband resilience Business broadband resilience is about preventing broadband outages. It includes practical considerations organisations think about to prevent downtime wherever possible. Four suggestions for improving your business broadband resilience include: Using hardwired connections Wireless may be easy to install and flexible but it isn’t as reliable as a wired connection. In an ideal world, you would have a primary wired internal network using Ethernet and then a wireless network on top. That way, you can connect using whatever network you like and will always be able to rely on Ethernet should anything happen to Wi-Fi. For larger businesses, having a primary broadband connection and a separate backup connection can be well worth the investment. They would require separate ducting, separate entry points, separate power supplies and independent routers for real resilience. Smaller businesses could use powerline adapters as backup. These are plug-in devices that can carry limited traffic over your internal electricity wiring. As a failsafe backup, they are easy to use and require very modest investment. Researching speed and uptime When you’re shopping for business broadband, most of us will look at the headline speed and any data caps. Very few people look at the uptime. It is a vitally important metric that gives you an idea of what level of service to expect when you sign up. Speed is obviously important as businesses are looking more to the internet for cloud computing, backups, storage and applications. The faster your connection, the less time wasted waiting for something to happen. Uptime is a measure of how reliable a provider is. You should expect no less than 99.9% uptime and can easily achieve a couple more decimal points if you shop around. Equal to uptime is response time. How long will you have to wait for an engineer? How quickly will your fault tickets be picked up? All these go towards the resilience of your broadband connection. Invest in your network A network is a living, breathing thing. That first installation of router and ethernet cable was just the beginning. If your business stays the same size, your network will just need maintenance and router updates as they are released. If your business grows, you need to ensure your router is up to the task and that you have the bandwidth to handle the workload. A consumer router that worked when it was just you and an assistant is not going to be able to handle the traffic of multiple staff members or provide robust enough security without an upgrade or dedicated firewall. Buying a faster broadband connection is an expense but if you could count the time wasted waiting on something to download or queueing with staff to upload to cloud storage, there probably wouldn’t be much in it. Our advice would always be to invest in a faster connection than you need right now because it gives you scope to go faster and grow faster. Monitor your Service Level Agreements We mentioned response times earlier when talking about researching broadband providers. A key part of any business contract is going to be the SLA, or Service Level Agreement. It outlines how long you would expect to wait until a fault ticket was picked up or how long you would need to wait for an engineer. The SLA will also outline compensation for downtime and for missed targets. We are running a business here and any downtime costs money. If you used our downtime calculator, you will have a good idea of just how much money! 
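For anyone without access to such a calculator, the arithmetic behind one is straightforward. The sketch below is a minimal illustration in Python; the 80% productivity-loss assumption and the example figures are invented for demonstration rather than being benchmarks:

```python
def downtime_cost(hourly_revenue: float, staff_count: int,
                  avg_hourly_wage: float, hours_down: float,
                  productivity_loss: float = 0.8) -> float:
    """Rough cost of an outage: lost online revenue plus wages paid for idle time.

    productivity_loss is the assumed fraction of work that cannot happen
    while the connection is down (an illustrative figure, not a benchmark).
    """
    lost_revenue = hourly_revenue * hours_down
    idle_wages = staff_count * avg_hourly_wage * hours_down * productivity_loss
    return lost_revenue + idle_wages

# Example: a 12-person firm taking £150/hour online, offline for 4 hours.
print(f"Estimated cost: £{downtime_cost(150, 12, 14.50, 4):,.2f}")
```

Even on modest assumptions like these, a few hours offline quickly adds up, which is why the compensation terms in an SLA are worth reading closely.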
Make sure you understand your SLAs and how to claim compensation if your provider falls short.

Business broadband recovery

Business broadband recovery is the other side of the equation. If, after all of your investment and preventative measures, something happens that is out of your hands, how do you recover as quickly as possible? The simplest way is to plan for the worst while hoping for the best. We touched upon diversity earlier by suggesting a backup connection into the building using a separate entrance point, or powerline adapters for smaller businesses. But what else could you use?

Mobile broadband backup

The improvements in speed and coverage of Britain’s mobile networks mean it is now possible to use them as a broadband backup. You could use a mobile broadband dongle for individual computers or a router with 4G access for internal networks. Each mobile provider has either PAYG mobile broadband access or fixed-price access for just this situation. Prices, coverage, data caps and speed are dependent on the provider you select. Do your research, make sure your area has a strong signal and you should have a cost-effective and flexible backup solution. Routers with 4G LTE capability are now commonplace and do not cost much of a premium over non-LTE routers. They can be configured to automatically fail over to 4G when your primary link goes down and alert you to the fact.

Getting something back

If you do lose your broadband connection, what should you do? Here are some practical steps you can take:

- Test your own equipment first. Reboot your computer(s) and your router, just in case.
- Call your broadband provider to report the fault. Note down the date, time and the agent’s name as well as any ticket number.
- Monitor the fault ticket using a customer portal if you have a backup connection, or use your phone.
- Note any conversations you have with the provider and any promised ETAs, and refer to your SLAs to manage expectations.
- Be calm and be reasonable.
- Note the exact time of service restoration and compare it against what your contract or SLA says.

In most situations, the broadband provider will be able to restore your service well within your SLA. If they don’t, having exact times of failure versus restoration, as well as a running record of who you spoke to, what was said and any promises made, will help you claim compensation. You may be entitled by law to compensation for any downtime that falls outside your contract. Your first call should be to your provider to make a claim. You need to give them all of the evidence and allow them to come up with an offer. You don’t have to accept this offer if you think it is unduly low. You can negotiate until you reach a settlement you are happy with. If your provider will not pay compensation or will not make a fair offer, you can submit your complaint to an independent Alternative Dispute Resolution (ADR) scheme.

### Technology as a work perk: Why recruiters are craving simpler benefits

The conversation around employee perks and benefits is usually centred on the competition between big tech companies that can afford internal office slides, free meals and weekly dolphin rides - or whatever the latest fad in employee engagement is. Recently, however, more studies are showing how less is usually more when it comes to employee benefits, and employees are increasingly yearning for the simple things in life as office perks.
Natural light and good views are seemingly trumping ‘Prosecco Fridays’, whilst chairs with strong back support are far more appealing than Charlie the office dog. This makes sense because, let’s be honest, no one wants to work in the basement - no matter how brightly painted the walls are. Another simple and often undervalued office perk is access to intuitive technology that makes the working lives of your professionals easier and more efficient. A recruiter’s time is precious, and it’s very rare for their day to finish at 5pm. Candidates can often only take calls late at night, and hiring managers seem to always push that feedback call to 8pm. No matter how many times they try to arrange meetings within working hours, professionals in this sector always find their evenings filling up remarkably fast. In the flexible working world, it’s unreasonable to assume your consultants are going to stay chained to their desks until 9pm every night. With that in mind, if they had access to the right technology they would have the ability to continue their work from home - or anywhere outside the office. Stuart Thomas, Recruitment Industry Expert at Access Group, discusses how different aspects of technology can positively impact your employees and business:

Mobile technology

The freedom to log on and work remotely will suddenly open up a lot of possibilities for your consultants, as well as free time in their diaries. Gone are the days of lugging heavy laptops from office to home while spending ten minutes trying to log on to the server remotely - provide your recruiters with easy, remote access to recruitment software and candidate information.

Collaborative platforms

Organisations are becoming increasingly collaborative, and teams spend more and more time delivering projects collectively, which means an internal communications system is a must. Having the right infrastructure in place to support collaborative work is a benefit for the entire group. Internal communication platforms give employees the ability to see updates, track timelines, share information and easily connect from apps or from computer desktops.

Online HR management

Providing your employees with access to the essential information they need, as well as the ability to request and amend records, is a valuable perk that makes a huge difference to productivity. Whether it’s recording absences, tracking holidays or requesting annual leave, online systems that easily allow employees to do this are a credit to efficiency amongst your team.

The technology integrated within agencies and recruitment businesses has a direct impact on employees and their ability to efficiently deliver results and drive high performance. Providing your people with the right tools and technology and ensuring they have access to adequate systems and software might just be the perk your employees are actually looking for. After all, beanbags and ping pong tables are great for the first month, but seamless systems that make your life better and your job easier don’t get old.

### Closing the App Gap with HPE Nimble Storage

In today’s data-driven world, digital businesses are faced with a dilemma: as the volume of data they are expected to process continues to expand, it becomes harder and harder to ensure that their growing customer base is able to reliably access this data when required.
This delay in performance - when an app cannot access and process data from the provider’s storage servers fast enough - has been dubbed the “app-data gap” or simply the “app gap” by the experts at Hewlett Packard Enterprise (HPE), and it is becoming an increasingly common problem. Many businesses are facing up to this issue at the moment, and many are spending a lot of money trying to build capacity in order to eliminate it - but most are missing the fact that smarter storage technology is available that could close the app gap much more effectively, at a much lower cost. The app gap problem Modern organisations frequently rely on apps to reach their customers, serve employees and fuel the growth of their business, but this can only be achieved when these apps are able to guarantee uninterrupted and instant access to data at all times. Without the right infrastructure in place, lags in application performance become inevitable, creating headaches for the company’s staff and customers alike. A widening app gap can slow down business activities, undermine productivity and create negative experiences for the end user, and it can often be difficult to know how to solve this. When an app gap problem arises, the responsibility falls on the IT department to fix it, and the usual approach will be to adopt a brute-force solution of simply buying more storage until the issue is resolved. But this is often a flawed and expensive strategy, as estimates suggest that 54% of app gap issues are not related to storage capacity at all, and can actually be better explained by configuration problems, interoperability issues or a simple failure to adhere to data storage best practice. Rooting out the real problem Clearly, splashing the cash on extra storage will not be an effective solution if deeper-rooted issues are causing your app gap problems, which is why it is so important to ensure that any effort to close the app gap is based on a proper understanding of how your data environment is configured. This means looking at the storage infrastructure from top to bottom - including its hardware, software and performance trends - as well as assessing the network setup and any potential compatibility clashes between various vendor solutions. Has everything been set up in the best way possible, or are there ways in which it could be improved? Are your disparate systems working in a way that’s complementary, or are compatibility issues arising? And does your infrastructure’s performance actually align with your business needs? Without knowing the answers to these questions, it will be impossible to understand where the bottlenecks in your system are or to know what kind of solution you may need to close the app gap without overspending unnecessarily. How can the right technology help? Addressing these problems holistically can often be achieved through a combination of proper analysis and sensible, targeted investment in the right tech solutions, rather than unstructured spending on capacity expansions with no clear guarantee of results. HPE’s Nimble Storage technology is perhaps the definitive example of a storage solution that has been specifically designed to help organisations close the app gap. Rather than focusing purely on capacity, Nimble offers rapid-access flash performance at an affordable price, alongside sophisticated deduplication and compression capabilities that maximise the efficiency of the storage volume. 
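To make the deduplication idea concrete, here is a toy sketch - not HPE's implementation, and the block size and sample data are invented - of how fingerprinting fixed-size blocks lets an array store each unique block only once:

```python
import hashlib

BLOCK_SIZE = 4096  # illustrative block size, not Nimble's actual layout


def dedup_stats(data: bytes, block_size: int = BLOCK_SIZE):
    """Split data into fixed-size blocks and count how many are unique.

    A real array fingerprints blocks inline, stores each unique block once,
    and keeps only references for the duplicates.
    """
    blocks = [data[i:i + block_size] for i in range(0, len(data), block_size)]
    unique = {hashlib.sha256(b).hexdigest() for b in blocks}
    ratio = len(blocks) / max(len(unique), 1)
    return len(blocks), len(unique), ratio


if __name__ == "__main__":
    # Simulated workload: many VM images sharing the same OS blocks.
    sample = (b"base-os-image" * 1000 + b"unique-app-data") * 50
    total, unique, ratio = dedup_stats(sample)
    print(f"{total} blocks written, {unique} stored, ~{ratio:.1f}:1 dedup ratio")
```

The higher the proportion of repeated blocks in a workload, the more capacity such techniques reclaim - which is why efficiency features can matter as much as raw capacity when closing the app gap.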
Most notably, Nimble also incorporates the market-leading predictive analytics tool InfoSight, which analyses and correlates usage data from more than 10,000 HPE systems deployed worldwide to predict and prevent problems before they create any impact. It also gives the user the ability to estimate their storage capacity requirements via trends extrapolated from more than 300 trillion pieces of data, providing a clear idea of how best to configure your own system based on data taken from other organisations with a similar profile or workload. As such, InfoSight can offer an end to repetitive technical issues, a guarantee of near-zero downtime and a sharp decrease in user complaints about performance, as well as helping companies to ensure their system as a whole is set up in a way that maximises efficiency and closes the app gap completely. Mastering the app gap It is a reality of 21st century digital business that clients, customers and staff alike all expect the data they require to be instantly available as and when they need it - and any organisation that is not able to meet this standard will quickly be left behind. With HPE Nimble Storage, your organisation will be able to nullify the threat posed by the dreaded app gap and deliver the level of data availability that all of your stakeholders expect, while still keeping your upfront costs down and making the most of your current assets.[/vc_column_text][/vc_column][/vc_row] ### Unlock the Value of IoT with Performance Monitoring [vc_row][vc_column][vc_column_text]The Internet of Things (IoT) might sound like nothing but a buzzword, but it’s much more powerful than that. Understanding the IoT is easier said than done, especially as technology keeps changing rapidly. In a world of smart devices, smartphones, and endless connectivity, how do you define the IoT? In broad terms, the term IoT refers to anything and everything connected to the internet. It also can refer to any item that’s connected or “talks to” other devices. For instance, your smartphone is connected to your Google Home and vice versa. As technology advances, we have more and more smart devices to choose from, and our lives are becoming more connected as data is shared. How does this tie into performance monitoring? According to a recent MIT Technology Review report, application performance monitoring (APM) might just be the key to unlocking the full power of the IoT. It all comes down to the way in which you diagnose and analyze your data, and that’s where performance monitoring comes in. In this article, we’ll explain why APM might be the key to IoT success and why this matters for the future. Why Connected Devices Need to Share Data First, we need to talk about the reasoning behind so much data being collected from devices in the first place. In a sci-fi universe, more data usually equals less freedom. Many people today argue that just because a device can connect to the internet doesn’t mean it should connect to an internet. While it’s true that all things might not need to be connected, having them connected gives us a lot of unexpected benefits. For instance, it’s an opportunity to be more efficient across industries. Because companies, public authorities, and even governments can learn about how products and services are used in real-time, they’re better able to set goals and meet consumer needs. Much of this comes down to data quality. Too much data can be a bad thing, especially if that data is unorganized or inconclusive. 
That’s where application performance management becomes useful. Because this entire concept is a way to make information and data more usable, it helps with these challenges. Using Performance Monitoring Tools Organizations and even governments can deploy application performance management and monitoring tools to help them solve problems as they occur. What is application performance management exactly? It’s a way of seeing real-time data, logging problems, and preventing errors in our connected platforms, applications, and devices. The more proactive we can be with solving errors, debugging devices, and analyzing data, the more intelligent the IoT becomes. In a world where security matters more than ever, the IoT needs a way to become more trustworthy, especially if it’s going to become better known to the public. Performance monitoring makes safety and security a top priority, and it ensures all data is used in a productive way. The Future of the IoT It’s clear that the IoT is a disruptive technology. There is already much back and forth on how it should be used and who should be in control. Where is the Internet of Things going next? Today, this technology is in its infancy. We’re starting to see smart devices become more common in our homes, but it’s just getting started. You might have a smart device like a Google Home or Amazon Alexa. You likely already have a smartphone. You might have a smart car or other smart appliances like smart bulbs or thermostats. In the future, this technology is expected to play an even larger role in our daily lives. However, there are some dangers at play if the IoT moves too fast. Namely, security is the biggest concern. Many of our devices aren’t secure enough as it is. If we produce more, there will only be a greater burden to find secure solutions. This is why developing and adapting performance monitoring solutions is so essential. Final Thoughts In a world where privacy, security, and data are of chief importance, there’s a bit of a balancing act happening with the Internet of Things. While many technologies are pushing forward, we’re also challenged with finding new solutions that make these disruptive technologies more digestible for our everyday lives. Luckily, all hope is not lost in this Brave New World of connectivity. We are uncovering new ways to manage our data in a productive way. At the end of the day, that’s the best thing we can do until the IoT is the new normal.[/vc_column_text][/vc_column][/vc_row] ### The Dangers of Centralised Databases in the Digital Age [vc_row][vc_column][vc_column_text]Data breaches are nothing new, but those occurring over the past few years have certainly pushed the discussion into the mainstream – a string of high-profile incidents hitting the likes of Equifax, Yahoo and Facebook has seen billions of records containing sensitive user data exposed. Who Watches the Watchmen? It’s growing increasingly difficult to trust that companies can simply patch an overlooked vulnerability and move on with pristine security hygiene. Since the Cambridge Analytica catastrophe, Facebook has been repeatedly hit by further attacks resulting in the compromise of its users’ accounts. The effects are by no means contained to the platform, either – a malicious actor who has gleaned the username/password combination for a given user may very well succeed in accessing their accounts on other websites. Estimates vary from survey to survey, but it’s thought that the majority of individuals reuse passwords across sites. 
It’s easy to point the finger at a business whose database has been breached by cybercriminals. Indeed, even governments have begun to sanction companies that fail to adhere to stringent data protection regulations aimed to secure customer information. One has to wonder, however, how much of the blame can be put on these entities when the problem is evidently a symptom of a fundamental flaw at the infrastructure level. No architecture is perfect, and having multiple developer teams working on large amounts of code is a recipe for the creation of bugs to be later exploited. A Systemic Oversight At the root of the issue lies the very practice of users sharing their data. In order to interact with virtually any company that keeps digital records, customers must surrender a questionable amount of data, ranging from card details and physical addresses to social security numbers and identity documents. The most basic identity information that almost every personalized site asks for is usernames and passwords. They have no assurances as to the protection of this highly sensitive information, other than the word of those holding it. This pattern is repeated around the globe, in spite of it being demonstrably broken time and time again. With the username/password combo maintained by a third party, a central authority is the true ‘owner’ of any data submitted by users, thus making said authority a lucrative point of failure for attackers to target. There’s a veil of privacy/security, but it is rapidly lifted when a breach inevitably occurs. It would be too optimistic to assume that all users effectively partition their online activities. If the aforementioned stats regarding password repetition are anything to go by, then the majority of individuals are put at considerable risk when even one of the services they use is compromised. Though they may only have some fairly innocuous information stored with that particular service, the login could be tried on other sites linking to sensitive data. The breach of a small blogging platform can soon lead to fraudulent spending on e-commerce platforms. ‘Bring Your Own Identity’ – The Saving Grace? It’s perhaps unsurprising that a number of teams are working on protocols to put an end to a problem so detrimental to so many industries. The most interesting of these are developing digital user-owned identities – that is to say, a portable and self-sovereign identity controlled solely by a given individual, which can be used to authenticate the user anywhere. The advent of distributed ledger technology enables these identities to forego the need for a middleman to administer a central database of users – instead, private keys stored in users’ smartphones would grant them access to an encrypted container that holds identity documentation. In doing so, the incentives for attackers to target businesses’ databases are completely destroyed: though the business can still have its users authenticate themselves as is the norm, they would no longer hold a registry with sensitive user information. In order to perform a large-scale breach, a cybercriminal would need physical access to the tens of thousands to millions of phones held by the individuals (which can be rendered even more difficult still with biometric authentication) – a task too expensive for even the most well-funded opponent. In this system, users are the owners of their data. 
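As a rough illustration of the pattern described above - a minimal sketch only, not any specific identity protocol - the service stores nothing but a public key and issues a random challenge, while the private key held on the user's device signs it. This example uses the open-source cryptography package; the flow and names are illustrative assumptions:

```python
import os
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# On the user's device: generate a keypair once; the private key never leaves it.
device_key = Ed25519PrivateKey.generate()
public_key = device_key.public_key()  # the only thing the service ever stores

# Service side: issue a random challenge instead of checking a password.
challenge = os.urandom(32)

# Device side: prove control of the identity by signing the challenge.
signature = device_key.sign(challenge)

# Service side: verify with the stored public key. No password database exists,
# so a breach of the service yields nothing an attacker can replay elsewhere.
try:
    public_key.verify(signature, challenge)
    print("user authenticated")
except InvalidSignature:
    print("authentication failed")
```

The key point is that there is no shared secret to steal: the service holds only public material, and the thing worth stealing stays on the user's device.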
We’ve touched primarily on identity, but in fact, these systems expand beyond identification and encompass sensitive documents of all kinds, with given services able to certify its authenticity without any damage to user privacy. Building for the Future The distributed identity offering appears to be the logical way forward. As with any major technological shift, however, it will not come about without significant effort. In order for such a system to have value to the end user, it needs to be widely adopted by businesses – which means making changes to their existing architecture and educating their customers on how to use these identities. Industries across the board need to consider how to streamline the onboarding process for customers and employees alike. Ideally, such solutions should seek to incorporate existing authentication standards like SAML and OpenID, in such a way that they can easily be deployed within incumbent frameworks (and save enterprises the hassle of needing to rewrite protocols from scratch). This approach provides a bridge between the old identity paradigm and user-owned, distributed identity. The looming threat of data breaches only grows more worrisome with every passing day. In an ecosystem of fragile databases, sensitive data continues to be stacked in insecure containers that sit waiting for attackers to exploit them. We need to do better, and distributed identities promise to be the solution.[/vc_column_text][/vc_column][/vc_row] ### How the rail industry has harnessed the power of the cloud [vc_row][vc_column][vc_column_text]The rail industry is set to face new challenges in the coming years, with critical changes expected from movements such as the liberalisation of the European rail network with the Fourth Railway Package. As the number of rail passengers continues to grow and with the opening up of competition within the industry, investment in the right technology can help create conditions ideal for future innovation and improvement to the passenger experience. Investment in cloud technology will help to combat the new and present challenges impacting the rail industry and as we have seen in recent years, this technology has already had a positive impact on rail operations. Helping rail retain its green status  It is widely understood that rail is by far the most environmentally-friendly way to travel, and as discussions around the impact of climate change continue to dominate the media agenda, more passengers and operators are aware of the rail industry leading the charge. However, if rail is to retain this status, it should embrace green technology across all areas of the industry. Initiatives such as ticketless travel; increasing adoption of electric trains; optimizing rail operations through greater insight into capacity versus demands can all contribute towards reducing rail’s environmental impact. At Amadeus, we believe carbon reporting can be a differentiating factor when choosing a mode of travel. Given rail’s status, carbon reporting should be the next step for the industry. It is our ambition to move towards a future where every rail journey includes both the price of a journey and its carbon costs. Improving the passenger experience As rail passenger numbers increase, and as digital technology continues to impact the industry, customer expectations across travel has changed. Passengers want a rail experience that is personalized, comfortable and easy to shop for. 
The industry is trying to meet these demands, with the aim of deregulation to provide wider choice, cheaper fares and improved quality of travel. Rail is moving towards a more traveller-centric approach and utilising technological innovation such as cloud-based technology infrastructure to help achieve this. The cloud provides rail companies with greater flexibility and agility in how they operate their business. At the same time, it can allow them to more easily adopt new ways of interacting with travellers, whether that is through their mobile application, website, agency partners or other retailing channels. Operational reliability Essentially, the cloud can provide rail companies with better operational efficiency. As computing capabilities have evolved and customer demands have grown, it is important that the travel industry responds to these changing demands. In order to enable greater collaboration across the industry and increase efficiency, Amadeus recently became the first travel technology provider to have retired all mainframes and run fully on open systems. The move to open systems provides a number of benefits for the travel industry, from increased capacity and better system reliability to a global reach that improves proximity and the delivery of new products and services. This open cloud technology has given travel companies the flexibility to scale and meet the demands of the growing industry, without the need to invest in costly infrastructure. Additionally, an open system provides an opportunity for greater collaboration: as teams share the same cloud infrastructure, there are more opportunities for partnerships across the rail and wider travel industry. The rail industry is set to face significant changes in the coming years, and so scalability, reliability and flexibility will be crucial from an operations perspective. However, with the changes and challenges set to come, it is crucial that rail companies put the traveller at the centre of what they do, to ensure they are providing the traveller with an experience that reflects their needs and requirements.[/vc_column_text][/vc_column][/vc_row] ### Measures to Ensure a Safe Digital Experience [vc_row][vc_column][vc_column_text]With current advances in AI, fintech and IoT, it is very easy to overlook problems in more traditional areas of our digital lives, such as cybersecurity. Nevertheless, the constant arms race between hacking tools and digital security tools has been running for years and continues today. There are a number of actions regular users or the employees of a company can take to ensure comprehensive security across devices. Creating Strong Passwords One important action has to do with learning to create a strong password for any web platform (e.g. a payment system), mobile or desktop app, password-protected archive, etc. The truth is that nowadays, an ordinary user has accounts on dozens of websites and remembering all of those passwords can be very challenging. As a consequence, users might make the mistake of choosing passwords that are very easy to remember but also very weak (and can be cracked in minutes by a professional hacker with the appropriate tools). 
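To make "strong" concrete, here is a minimal sketch of the kind of password a dedicated generator or vault produces, using Python's standard secrets module; the length and character set are arbitrary choices for illustration:

```python
import secrets
import string


def generate_password(length: int = 20) -> str:
    """Return a random password drawn from letters, digits and punctuation.

    secrets draws from a cryptographically secure random source, unlike random().
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


if __name__ == "__main__":
    # Generate a distinct password per site, as a password vault would.
    for site in ("email", "banking", "cloud-storage"):
        print(site, generate_password())
```

Passwords like these are impractical to memorise, which is exactly why they belong in a vault rather than in your head.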
Alternatively, users might pick strong passwords and reuse these for multiple platforms/ apps – this is an equally dangerous approach given that data breaches are frequent nowadays and a Dropbox data breach, for instance, can reveal to a hacker the password you are using to log into several other places (this has happened, by the way). To address these, people must choose both individualized and strong passwords for all important accounts. Fortunately, nowadays this can be greatly facilitated by dedicated password vaults (specialized apps that can both generate and save hundreds of passwords, serving as unifying password storage centers). Using Antivirus Software Antiviruses are universal, multifunctional tools that monitor lots of processes and data exchanges occurring on your device. Among the typical functions these perform are: scanning local and downloaded files, email attachments, directories and locations (e.g. flash drives) for viruses, spyware, malware, etc.; isolating suspicious files (putting them under quarantine); enforcing safety practices; preventing access to dangerous websites; warning you of potential risks (e.g. about changes to certain system settings). With regard to data traffic, antiviruses and firewalls actively filter it trying to identify known and unknown threats. Avoiding Phishing Attempts and Suspicious Links, Not Sharing Sensitive Data Irresponsibly One important prerequisite of safe web experience is to avoid emails from entities disguising as credible ones in order to obtain your login credentials, personal data (including intimate), or financial data (phishing emails). If a hacker obtains such data, it can be used for fraud, blackmailing, unauthorized actions in your name, etc.  You should be extremely careful with regard to who you share your personal data or passwords with (it is virtually never justified to share passwords). Another important piece of advice is to avoid links from suspicious addresses/ users delivered via emails, text and video chat apps, social networks, etc. Accessing these can infect your device with malicious content and compromise your security. If suspicious messages come from friends, just write them back asking whether they really wrote it themselves (sometimes the answer would be negative). Using VPNs A VPN is a somewhat less familiar technology to many digital users since it has only recently known a serious expansion and popularity gain. The role of VPNs for safe digital media accessed via the Internet is already significant and likely to become even greater in the years to come. This is explained by the functioning principles and the clear advantages this technology offers. What VPNs do is to establish secure communication channels between users and the Internet by rerouting all communications through dedicated VPN servers and by encrypting all data traffic exchanged through these channels. This has a number of implications: data traffic becomes harder to intercept and decrypt. This is true for hackers, Internet providers, government Internet spying agencies and for any other third party who tries to tap into your traffic. Encrypted traffic looks like nonsense to anyone that does not have the corresponding decryption keys (some really strong keys can be used with many VPNs). Safe Internet traffic ensures secure online data exchange and payments even from less trustworthy networks. users cannot be easily tracked anymore. 
Websites normally track users by their IP addresses – all activities performed by users across the web are associated with their IP addresses (and oftentimes, with unique advertising IDs), and websites frequently exchange or sell this information with each other. By activating a VPN, users become virtually impossible to track by simple means – their IP and other provider-associated data is hidden (and replaced by VPN server data). In theory, users can still be tracked using parameters such as the screen resolution of their devices (in combination with other parameters), but such data is much less informative and cannot be routinely used to identify users uniquely. Not only can users avoid being tracked – they can also choose which region or country they appear to access the Internet from (which can add another layer of privacy). A major bonus is that users can also access geo-blocked content by simply rerouting their traffic through one of the VPN servers in a country of choice (where the corresponding content is allowed). Interestingly, this is one of the key reasons that motivate the use of VPNs, including in countries with censorship practices. To conclude, comprehensive cybersecurity practices involve several complementary measures: using antiviruses for overall safety and for filtering out constantly evolving threats, abiding by safe practices (e.g. choosing strong passwords, avoiding dangerous links/attachments, not sharing personal data), and using VPNs both for private Internet access and to hide traffic from third parties, thus minimizing the risk of hacking. Only by covering all of these aspects can one reasonably expect to stay out of trouble.[/vc_column_text][/vc_column][/vc_row] ### 5 Things We Can Learn from HR Cloud Natives [vc_row][vc_column][vc_column_text]It’s safe to say that business functions are moving to the cloud in droves, but what isn’t as widely reported on is how ‘cloud natives’ are driving innovation and identifying new business use cases underpinned by the cloud’s strengths. Though not commonly recognised as an early adopter and driver of new technology, HR has actually been a cloud evangelist for nearly a decade now, with the cloud transforming Human Resources management significantly. HR cloud natives – those who have knowingly embraced ground-up native cloud systems and digital-first HR – are seeking out new and better ways to harness the latest technologies to the benefit of both their organisation and employees. So, what can these early adopters teach us about the power of the cloud beyond the obvious, and how can we take a leaf out of HR’s book and apply this expertise to other business situations? 1- Always On, Here to Help HR was surely one of the first departments to exploit the employee-empowering element of the ‘always-on’ cloud. Employees expect information at their fingertips so that they can execute at a time that suits them, whether that’s checking holiday and requesting time off, or doing basic admin on the go. With HR in the cloud, you can do your timesheets on the train if you want to, or request your next holiday from the beach. It’s a great example of how well suited the cloud is to most employee-related activities. 2- Driving Engagement Being a long-time pioneer of innovative cloud use, of course, has its benefits. 
HR cloud natives have a long history to tap into when it comes to demonstrating how the cloud has helped increase engagement within the business, and with potential employees and new joiners through online recruiting and more recently onboarding software. Not only has HR technology helped practitioners cut down on paperwork, it has also done so while facilitating better communication and engagement. For years, HR cloud natives have been using social platforms and HR systems to post news, invite feedback, and create a sense of community, while also providing a ‘one stop shop’ where employees can showcase their own talents and interests, and engage with colleagues across the business. 3- Breaking down Boundaries The use case for the cloud has no better example than where HR systems help establish a sense of community across international organisations, allowing employees to connect with each other via a common platform from wherever they are. What HR cloud natives have also excelled at is nailing ‘local’ as well as global; recognising that a one size fits all approach just won’t work. As valuable as it is to have a single consistent view of employee and organisational data to drive strategic workforce decisions, local relevance is even more important. This is not only critical to compliance, but also to employee uptake, a pre-requisite if HR systems are to deliver on their promise. If an employee can’t use the system in their own language, recognise the date formatting or terminology, trust the system to calculate time off entitlements in line with their local legislation, or help them connect with colleague in other countries, the system will quickly be abandoned. 4- Personalisation HR software and the cloud have not replaced conversations between HR managers and employees, but they have supported it, and in many ways, transformed it. Information is more visible, managers and employees more accountable, and outcomes easier to track. In keeping with the theme of customisation, HR managers are also tailoring onboarding processes and performance reviews to be relevant to individual employees, rather than opting for a standardised approach. It means that employees stick around for longer because they get customised regular feedback, which supports their performance improvement and overall career progression. It’s a great use case and highlights the cross application amongst other user journeys. By applying the same logic and processes to the customer experience, for example, we may find that customer churn is reduced and customer loyalty is increased. 5- Agile and Future-Proofed Last, but not least, HR cloud natives have proven that modern, cloud-first multi-tenanted solutions are not just equal to – but better than – older cloud-enabled single tenant solutions. Multi-tenanted systems, which can be switched on almost instantaneously, are purposely designed to be quick to deploy and configurable by non-technical users – all while taking full advantage of the scalability and cost-efficiencies of modern cloud infrastructures. For the most part, they offer the same features as the older systems, but at a lower cost and with less complexity. The ramifications are significant. Since it’s now easy to test drive these cloud-native applications before committing to them, cloud-natives have ditched long-winded RFIs and selection processes in favour of user case scenarios and hands-on experience. 
Rather than taking months or years to go live, company-wide enterprise HR systems can be deployed in just a few weeks – or even days. With more scalable IT infrastructures, HR teams don’t need to turn to their suppliers to add more capacity, but can simply add new employees, departments and countries whenever they want. Contract terms are generally shorter too, in months rather than years. If customers aren’t happy, they can walk. Suppliers are staying much closer to their clients, actively canvassing ideas for improvement, and ensuring that these find their way into their solutions and their services much faster than in the past – some suppliers update their systems almost every month. For businesses, this raises an interesting question. Should you be flexing your business model in the same way as your HR tech suppliers to accommodate feedback from customers, employees and other stakeholders? And, if so, do you have the people, skills and technologies in place to facilitate this? Perhaps start by talking with your HR team.[/vc_column_text][/vc_column][/vc_row] ### Big Data: Even Bigger Questions [vc_row][vc_column][vc_column_text]It can be hard to understand exactly what the phrase ‘Big Data’ means since it doesn’t have an official definition, but it really refers to just what it sounds like: a whole lot of data. Although the concept of Big Data is relatively new, its origins date back to the 60s and 70s as this is when the first data centres and databases were formed. Back then, data collection and analysis were just beginning to shape the business world (a process which eventually became known as ‘business intelligence’) but, with the advent of commercial internet in the 90s, the practice skyrocketed, leading to ever more sophisticated uses for data technology in the 2000s and beyond. The really interesting thing about Big Data, though, is where it comes from and the ways it may be used. Both are complex considerations since the field of data science and data analytics (and the tech they rely upon) are constantly evolving. We can see this at work in our own use of data-driven technology, which is increasing exponentially. Things like smartphone apps, GPS, and fitness trackers, social media, online messaging services, and mobile-banking all process, share, and store large amounts of data about us in order to function. All these technologies also have a relatively short shelf-life, as they are soon superseded by new software updates and new, better functioning devices. With each upgrade comes increased functionality and convenience; the price for which is usually – you guessed it – more data. So, the more that technology integrates with and improves our lives, the more data there is to be captured and processed about us. To put this into perspective, the amount of data we produce is growing at a mindboggling rate of 2.5 quintillion bytes per day! The 3Vs of Big Data The sheer amount of data gathered across multiple internet-enabled devices known, collectively, as the Internet of Things (IoT) is a result of three elements. These are the ‘3Vs’ that are said to characterise Big Data: Volume – The IoT is growing exponentially year on year, as are the amount of internet users and, subsequently, the amount of data available for analysis. Variety – Data doesn’t just refer to numbers and statistics as it did once upon a time. Big Data is made up of both structured and unstructured data, including (but not limited to) non-numerical data such as emails, pictures, and voicemails. 
Velocity – velocity refers to both the speed at which data is generated (which, we know, is enormous) and the speed at which this data may be processed. It’s worth noting that advancements in technology mean data can now be analysed pretty much in real time, as is the case with smart meters, chatbots, e-commerce, and etc. Big Data in Practice It’s hard to envisage Big Data without its comrade, cloud computing. Scalable as well as cost-effective, the cloud provides the necessary infrastructure that Big Data demands if it is to be used effectively for analysis. For example, many organisations virtualise their servers in the cloud to make them more efficient, and to make data management and subdivision more straightforward. As you can imagine, being able to rapidly crunch large volumes of data from numerous sources is useful for businesses that are looking to data for market-confidence. It also allows organisations to form more complete answers about us, the customer/data subject. Let’s take an emerging industry like financial services technology (or ‘fintech’ for short) as an example. In many ways, fintech has thrived because of its user-centric approach to things like insurance, foreign currency, and money transfers. In what is often viewed as direct opposition to the large, traditional banks, fintech has somehow humanised financial services, using cloud-based platforms and mobile apps to bring accessibility, convenience, and transparency to the end user. Indeed, the global adoption rate of fintech products sits at around 52% – and this number is rising rapidly. Big Data is the driving force behind fintech’s better and more efficient customer service because it enables firms to profile and categorise their customer-base. Categories may depend on age, gender, socioeconomic status, geographical location, spending habits, online behaviour, debt history, general health, and much more. Using this data to cross-analyse users, fintechs can easily personalise banking suggestions and target their marketing to meet customer needs. They can also identify the most valuable customers in terms of spending-power, offering certain financial products only to statistically viable subsets. Other examples of Big Data use by fintech companies (as well as other commercial/retail enterprises) is what’s called natural language processing, or ‘NLP’. NLP is a type of machine learning that works by processing data in the form of language and cross-referencing it with the wealth of information already held about the data subject. An example may be chatting with a ‘digital finance assistant’ in your mobile banking app. By accessing Big Data, the chatbot can easily make personalised recommendations that feel natural and relevant. Understanding customers and their behavioural patterns in this way also helps fintechs manage risk, detect fraud, and optimise performance. So, whilst new technologies often appear more human, they are, in fact, artificially intelligent. Their celebrated usability is a result of Big Data and is driven by complex algorithms and AI. Although many argue that consumers will benefit from increased access to more personalised, cost-effective products that encourage fair competition, others are more cautious; after all, Big Data begs big questions: What happens if data security is compromised? Who (or what) is held accountable by regulatory watchdogs for decisions made by robots? Just how do firms using Big Data protect our consumer rights? 
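Before turning to the legal questions, here is a toy sketch of the kind of segmentation described above, using pandas; the customer records and spending thresholds are entirely invented:

```python
import pandas as pd

# Invented sample of customer records; a real fintech would draw on far richer data.
customers = pd.DataFrame({
    "customer_id": [1, 2, 3, 4, 5],
    "age": [23, 35, 29, 52, 41],
    "monthly_spend": [420.0, 1800.0, 950.0, 3100.0, 760.0],
    "region": ["UK", "SG", "UK", "US", "SG"],
})

# Segment customers by spending power, then summarise each segment.
customers["segment"] = pd.cut(
    customers["monthly_spend"],
    bins=[0, 800, 2000, float("inf")],
    labels=["standard", "premium", "high_value"],
)

summary = customers.groupby("segment", observed=True).agg(
    count=("customer_id", "size"),
    avg_age=("age", "mean"),
    avg_spend=("monthly_spend", "mean"),
)
print(summary)
```

Even a crude grouping like this shows how quickly profiling scales once more attributes are added - which is precisely why the regulatory questions below matter.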
Big Data and the Law These questions are particularly pertinent under GDPR (or the UK’s implementation of it, the Data Protection Act 2018). Under this legislation, accountability sits with data controllers (organisations that own data) to adequately protect personal data and to ensure that it is processed ethically and lawfully. This means that, in order to comply, data controllers must: Be transparent about how they intend to use data (including putting measures in place to track and audit data use and for customers to access records about how their data is being used). Obtain informed consent from data subjects to use their data in the manner they want to. Organisations risk breaching data privacy and data security laws if they carry-out profiling on data they only have implied consent for. Ensure that automated decision software is fair and unbiased. Protect data integrity by using only accurate data and updating this data as and when required. One way in which fintechs and other technology-based applications plan to remain compliant with data protection laws involves yet another innovation: regulatory technology, or ‘regtech’. Indeed, the continued crossover between regulation and technology seems essential in the future as firms encounter ever more regulatory and reporting requirements. Extending disruptive digital technologies to regulation does seem like the next logical step.[/vc_column_text][/vc_column][/vc_row] ### Healthcare: last tech frontier in the public sector [vc_row][vc_column][vc_column_text]Technology dominates every aspect of society. This is not an argument, this is a fact that has been apparent for the past three decades. Technology has caused enormous societal changes: social media has enabled us to widen our reach and communicate with different audiences, automation has enhanced supply chain and manufacturing and mobile banking has changed customers’ relationships with a predicted 72% of the adult population banking via a mobile phone app by 2023.  However, the public sector remains relatively un-‘digitised’, least of all in the healthcare sector. Although strides are being taken to reverse this, such as the signing of the Local Digital Declaration in 2018, in which numerous councils and sector bodies across the UK recognised the opportunity to reshape public services through technological development that best meet citizens’ needs, there is still a long way to go on the healthcare sector’s digital engagement journey. Big ideas always start small The Secretary of State for Healthcare, Matt Hancock, has expressed big plans for the future of the NHS. At the Spectator Health Summit in March this year, he set out his vision for the future NHS to include robotics, artificial intelligence and genome sequencing as a “routine, everyday part of healthcare”. Hancock’s enthusiasm to fully embrace the tide of technology is a step in the right direction, but focusing on some of the biggest technological advances the world will ever see is very ambitious. There are several other (and smaller) measures healthcare providers including the NHS can introduce to maximise the NHS’s digitalisation journey. Although the NHS mobile app, launched back in July of 2018, where patients can check their symptoms and connect with their GP practice to book appointments, was a step in the right direction, we can look to other countries for applicable examples of digital healthcare. 
It’s not about the hardware One of the common misconceptions about the challenges faced by the NHS is that its problems are caused by the fact that many hospitals operate on mainframes. Commentators frequently present older computers as archaic or inflexible; some even argue that hospitals with this ‘mainframe’ mentality are a barrier to digital enhancement. But running on a mainframe does not have to be a hindrance in the digital engagement journey; these computers were designed specifically to handle the type of data-heavy applications needed by the medical profession. For example, in the USA, every single provider of healthcare runs on a mainframe – thanks to its scalability, security, adaptability and reliability. These same healthcare providers also offer digital services, including virtual doctor appointments and teleconsultations. Apps can bring ‘appiness’ to patients With over 80% of smartphone or tablet time spent in an app rather than a mobile web browser, mobile apps could be the answer for the NHS to achieve digital transformation: they are fast, interactive, easy to personalise and instantly accessible from anywhere. To prove that healthcare is the next public sector to be propelled on its digital engagement journey, private equity firms and venture capitalists are investing billions of pounds into it. For example, Accrux, a messaging app service for medical teams and patients, has just raised £8.8M in Series A funding. It can be used as a way for doctors to send advice, notify a patient of results, remind them to book an appointment and leave messages. In addition, all this communication is automatically saved in a patient’s medical records. The messaging app has the potential to strengthen doctor-patient relationships whilst also being the answer for the NHS to respond to an app-savvy, more consumer-focused generation. It is both cost-effective and efficient - and could also help to reduce the cost of missed GP appointments, which amounts to £216m per year. Teleconsultations just as popular as face-to-face consultations Looking at our European neighbours, teleconsultations are already highly popular in Nordic countries. According to LIVI, in Sweden, 45% of the queries that come into general practice surgeries can be dealt with digitally. These virtual visits are also two-thirds cheaper than in-person visits. This has the potential to free up doctors’ time and reduce pressure on GP surgeries, where doctor shortages are a growing issue for the NHS due to a fall of 1,000 in the GP workforce over the past year. Furthermore, teleconsultations are a big hit with the younger generation for their simplicity and convenience - in the US, two-fifths of Americans aged 22-38 now seek routine medical services virtually, according to a recent Accenture report. The ease that apps offer to younger people also provides an opportunity to reduce missed appointment rates, as the majority of no-shows at GP appointments are aged 16-30. Switching to online consultations is not only beneficial for patients; GPs are given the flexibility to work from home and choose their hours, ultimately helping working parents. Benefits all round The NHS regularly reaches first place in the poll of what Britons are most proud of – and rightly so. It is on its way to providing patients with digital services, especially following the launch of its first app last July. Nevertheless, there is always time to look to our European and transatlantic neighbours for measures to implement. 
As demonstrated there, mainframes are not a barrier to digital advancement. It is possible for the NHS to offer digital services without having to rip out any of its existing technology – and more integrated technology means a more cost-effective and more efficient service for everyone.[/vc_column_text][/vc_column][/vc_row] ### Improving Your Content Marketing with Big Data [vc_row][vc_column][vc_column_text]The content marketing creation and distribution landscape is changing by the day. Popular platforms such as YouTube and Facebook have become instantly accessible to billions around the world. Whether you operate in startup mode or work as a marketing expert for an international company, big data is certainly worth looking into. Implementing big data into content marketing isn’t a novel idea, but it can turn things around for your brand and public perception. With that in mind, let’s take a look at several ways in which you can use big data in your everyday content marketing workflow for better customer engagement down the line. What does Big Data stand for? Big data typically refers to the mass of user-generated data created in a given digital ecosystem. For example, each click, share, comment or interaction on Facebook or Instagram contributes to a vast pool of big data. Big data began as a specialist concern of data centres and analysts before becoming an integral part of the corporate market. It’s hard to grasp the fact that we have generated over 90% of the world’s data in the past two years alone, with US citizens consuming over 2 million gigabytes of internet data on a daily basis. Traditionally, marketing departments have had to wait for the launch of their campaigns to get feedback about how well their content did compared to other brands in the industry. Big data can be used to better understand how B2C and B2B stakeholders think, their habits, brand or product preferences and so forth. This allows for a plethora of benefits for one’s company, which is exactly what we will cover in the upcoming points of discussion. Better Goal-Setting The implementation of big data in content marketing starts at the very earliest stages of planning. You can use big data to set better goals and KPIs for your upcoming content calendar through audience research. This will allow you to bypass potential bottlenecks and uninteresting content types or topics, as well as plan your delivery times more carefully. Good goal-setting can make or break your entire content marketing campaign cycle, so make sure to take time and explore your possibilities before going forward. Accurate Customer Targeting Audience segmentation and customer targeting are also some of the most important elements to consider before content creation begins. Who is your intended audience, and what kind of lifestyle choices or professional background do they have? Different audience profiles will respond differently to certain content types, especially when there are large age gaps or social status discrepancies between two potential customer groups. Don’t misalign your audience segmentation: make good use of big data before proceeding to the content creation part of your campaign. Streamlined Content Creation Processes Big data can be a great substitute for traditional market research, depending on the scale of your business. You can rely on big data repositories to find out which topics are trending in your industry, what keywords to target and what emotional triggers to use to entice engagement. 
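As a toy illustration - with invented comments and an arbitrary stop-word list - mining a pile of audience feedback for trending terms can be as simple as counting words:

```python
import re
from collections import Counter

# Invented sample of audience comments; in practice this would come from a
# big data repository of social posts, search queries or support tickets.
comments = [
    "Loving the new budgeting feature in the app",
    "The budgeting dashboard makes saving so easy",
    "Wish the app supported crypto budgeting too",
    "Customer support was quick to fix my login issue",
]

stop_words = {"the", "in", "to", "so", "my", "was", "too", "a"}

words = []
for comment in comments:
    words.extend(
        w for w in re.findall(r"[a-z']+", comment.lower()) if w not in stop_words
    )

# The most frequent terms hint at which topics and keywords to build content around.
for word, count in Counter(words).most_common(5):
    print(word, count)
```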
This type of content creation streamlining is akin to SEO research but instead of relying on tools such as Google Analytics, you will have access to audience response and interaction patterns instead. In terms of reliability, most marketing experts would argue that big data provides more tangible data in terms of audience research than what works or doesn’t work in terms of SEO. No matter how you put it, big data is likely to increase your overall content creation productivity and maximize your marketing output, leaving a lot of resources available for other creative or research endeavours. Social Proof & Metrics Showcase Depending on the industry your company operates in, you can use big data to showcase interesting and useful social proof metrics. For example, your sales numbers, review aggregates, and overall customer engagement rates can be great additions to your content marketing. However, it’s worth noting that you should only rely on data that concerns your company or the industry at large and not resort to counter-marketing tactics against other competitive brands. Always work toward long-term gain and brand loyalty without resorting to short-term gain efforts, especially if it can affect your company’s reputation. While “fair play” isn’t a word often used to refer to content marketing, big data implementation in public marketing content should be respectful and professional in nature. It can elevate your brand significantly and bring new stakeholders to your fold if you choose the right data and metrics to showcase through your marketing content, especially in social media platforms. Content Tracking & Analytical Possibilities Once your content is published under big data’s umbrella, it will be far easier to track its performance and engagement rates going forward. You can track your content for its appeal, engage with the audience who share or comment on your published posts and learn about their behaviour patterns in more detail. This will allow you to iterate on your content marketing strategies and create new, innovative ways to approach familiar territories. Differentiating your content from the competitors’ is one of the key principles brought on by the big data revolution in marketing. Don’t settle for using big data in the initial research phases of your content creation and look for ways to rely on the tech indefinitely. In Summary Constant iteration and breakthroughs in content marketing creation are some of the core elements of what big data has to offer. However, it can be overwhelming to go through the plethora of information available to you at every corner. Make sure to create a shortlist of filters, KPIs and targets for your upcoming campaign before you approach big data. Compare your initial goals with what big data has to offer in those regards and work your way forward from there. Even though big data’s information is conclusive with audience’s digital and behavioural patterns, you should still use human logic to make definitive marketing decisions. Find smart ways to incorporate big data repositories into your pipeline and your marketing processes will flourish as a result.[/vc_column_text][/vc_column][/vc_row] ### Why today’s business planning must shift to the cloud [vc_row][vc_column][vc_column_text]The business environment has been growing faster, more dynamic, and more complex for years, picking up pace as it goes. And there’s no end in sight. 
As a result, today’s businesses need to operate with new levels of agility, so that they can quickly respond to changing conditions around them using real-time data from across the organization. But getting that agility requires companies to evaluate the way they plan, execute, and measure activities. And it starts with the cloud. Today, many enterprises still rely on traditional, on-premise corporate performance management (CPM) platforms—and many still rely on spreadsheets for business planning. Shifting to a cloud-based planning platform allows companies to adopt a planning process that is collaborative, comprehensive, and continuous. That combination can deliver the business agility companies need to compete in today’s fast-paced world. Legacy on-premise planning tools          Companies that have older, on-premises planning software typically find that they are expensive to maintain, hard to change without involving IT—and sometimes hard to even operate without expensive consultants—and often too finance-centric to support broad participation in the planning process. The issue with older CPM deployments is that they were optimised for “paternalistic” planning models. Indeed, they were designed to support a small, senior finance audience that would model the business at a macro level. While they deliver a high-level, top-down view of the business as a whole, they only support one-way data flow. Functional business units have no lateral visibility or direct input into their own plans, let alone other areas of the organisations, effectively shutting down participation in a wider, cohesive planning process and limiting collaboration. In addition, these tools typically require a lot of technical oversight from IT to set-up and maintain, creating an inflexible process that can’t adapt with the speed of business today. Worse still, “shadow planning”—planning that’s done in silos with no connection to the broader corporate plan—inevitably wins out with these tools. The reason? Functions and business leaders on the ground actually getting on with running business operations take matters into their own hands and create their own, disconnected plans. Spreadsheets are a tool, not a platform Spreadsheets remain a surprisingly robust halfway house for financial and operational planners. There are reasons why so much planning still gets done on spreadsheets; they are simple, powerful, and, in the right hands, financial and operational planners can make tasks happen very quickly. For example, if they need to analyse numbers in isolation, or do quick ‘what-if’ scenario planning using a static data set, spreadsheets are often hard to beat. However, spreadsheets also have some fundamental shortcomings. They have no lateral scalability and they are not built for complex, multi-department collaboration, or highly dynamic data with a fast rate of change. Plus, when everyone across the business is planning in isolation, it takes a huge amount of elbow-grease and accountability to manually verify all the data and consolidate hundreds of errant spreadsheets into a single version of the truth. And after that Herculean effort, poor version control means inaccuracies often slip through the net. Nevertheless, for small ad hoc planning, spreadsheets are not going anywhere. However, their limitations are being recognized as the need for agility escalates: they are fundamentally static and incapable of consolidating and distributing complex information around the business. 
In fact, they were never really built to be anything more. Moving towards a cloud-based approach to planning And just as the cloud has changed the way organizations work in many other business functions, the cloud is ideal for business planning. Using a cloud-based, Software-as-a-Service (SaaS) platform offers businesses a scalable, flexible planning environment that is quick to deploy and easy to manage. In addition, when participation is easy and accessible across functions, it results in broad adoption. That is why it makes sense to find a cloud-based planning platform that does not just support multi-user collaboration from a technical standpoint but actively drives it with an easy-to-use experience. The benefits of cloud-based performance management systems are plentiful. They enable rapid deployment and customisation through configuration settings, rather than hard coding. Cloud-based performance management systems also enable continual software updates and avoid disruptive software upgrade projects. In addition, there is the benefit of lower initial deployment cost, freedom from investments in IT infrastructure, and reduced administrative overhead. A game changer in business planning Put simply, it is the advent of cloud-based planning platforms that are truly changing the game. The world is moving away from legacy on-premise planning tools and, while spreadsheets still have their role within the enterprise, cloud-based business planning approaches are winning above the rest. With the cloud, business planning can be easy, powerful, and fast—a must-have in today’s fast-paced, data-driven world.[/vc_column_text][/vc_column][/vc_row] ### Mobile payments and what the future holds? [vc_row][vc_column][vc_column_text]Digital transformation in the financial sector was badly needed, it was a sector that lagged behind the others, slow to adapt with heavy regulations. Before fintech boomed, transferring money abroad was challenging. Those who wished to send money overseas had no other choice but to turn to banks. The process was expensive with undesirable exchange rates and high transfer fees charged by banks. On top of this, banks took days to process the transfers, this still happens today – though things are improving now. A mobile first society Breakthroughs in mobile and communication technologies are enabling fintech disruptors to enhance the way financial services are being delivered to consumers. Fintech emerged as promising option as we entered into a mobile first society. According to GSMA, at the end of 2018, 5.1 billion people around the world subscribed to mobile services, that’s 67% of the world’s population. Mobile web services are now so popular that back in March last year, Google began indexing and ranking pages on its search engine based on the mobile versions of websites rather than the desktop ones. Mobile services are convenient. A few taps of the finger on the interface, and you’ve successfully done what you’ve needed to do. A good mobile app is a great customer relationship builder as brands can now be where consumer attention is at. In the UK, OfCom found that an average of two hours and 28 minutes are spent on a smartphone per day by users. Business never has to end and it doesn’t with most apps being used between 6 and 7 a.m. followed closely by 9 to 10 p.m - all completely outside of typical ‘office hours’. This is why I believe mobile payments are the future. 
Fintechs spotted the problem with transferring money abroad - it was slow, laborious, complicated and expensive. But this isn't why these businesses have been so effective. What's taken digital money providers to the next level is the mobile-first society in which we now live. Fintech solutions are digitally native - they're born in the mobile age and so they don't need to adapt to become accessible via a smartphone device - making these apps easy to adopt. The new ways to transfer money Easy adoption has opened the gate for digital cross-border transfer providers like InstaReM and others to join the scene. The new solutions are much more user-friendly and offer cost benefits that banks and other money transfer operators can't offer. It's this approach that has helped digital players move the payments market forward. As with all tech offerings, convenience and speed of transaction are key factors where digital payment companies have an edge over traditional businesses. You no longer have to draw out cash, pop to the shop or bank and track down the nearest ATM. All you need is your mobile phone and an internet connection to perform a much-needed money transaction, eliminating all the faff. Performing money transfers from a smartphone is simple and secure. You can now send money from wherever and whenever you want. Users can keep track of their payments - how much they're sending, where and when they've sent money - all through application features. Mobile money transfers are helping the industry become more transparent. In turn, this has raised customer expectations. They expect a good level of service, with speed and accessibility high on the agenda. They want their money transactions to be safe and efficient and, as technology advances from smartphones and tablets to blockchain and AI, the hard work for fintech firms has just begun. What does the future hold? The digital money transfer market will witness big changes going forward. Fintechs are partnering with each other and with global payments enablers such as Visa and Mastercard to strive for further financial inclusion around the globe. Other businesses will also move into the payments market sooner or later. We're already seeing in Asia how the big tech players such as Alibaba, Tencent and Grab are moving into the remittance market. Even traditional players like Western Union are aggressively pursuing a digital strategy for the money transfer business. InstaReM has closed the biggest funding round for a digital remittance company in the region and I think we'll see more companies around the world move into this space in the coming year – including WhatsApp. Wallet-to-wallet remittances will grow in popularity, making international money transfers as frictionless as possible. I am reasonably confident that mobile will be the default way of sending money across borders sooner than expected. It won't be a surprise if in the coming months you're able to send money from Singapore to the UK to a friend via text message – and for that money to be in their account in a matter of seconds.

### The challenges and opportunities of Office 365

Without a doubt, Office 365 is unlike any SaaS application ever. Organisations have been embracing SaaS and cloud-based applications for a while now, but Office 365 is a game changer in more ways than one.
With approximately 135 million active users worldwide and more businesses adopting it each day, Office 365 is transforming how businesses work and communicate. It’s also transforming the corporate network. Each Office 365 user generates between 12 and 20 persistent connections, which, when considering how many people are using it on a daily basis, is resulting in gigabytes/terabytes of data. For businesses still relying on the traditional hub-and-spoke architectures, managing this volume of traffic is a challenge. When Microsoft Exchange servers and Office applications were on premise, businesses had to backhaul traffic from remote sites and mobile users to the data centre. It was the only way to provide connectivity. But now that these services have moved to the cloud, backhauling traffic to data centres can create latency, which quickly leads to frustrated users and delayed deployments. Indeed, the increased number of persistent connections can overwhelm traditional MPLS links and appliances, resulting in a poor user experience and costly upgrades. Furthermore, the performance of productivity apps, such as SharePoint and Skype, may be impeded by the latency added by centralised gateways. To mitigate these issues, Microsoft itself has issued recommendations for organisations deploying Office 365 on hub-and-spoke architectures with centralised proxies, including performing next-generation firewall (NGFW) capacity assessments and WAN latency assessments. Microsoft also acknowledges that Office 365 was built to be accessed securely and reliably via a direct internet connection, which minimises latency – the enemy of Office 365 – on user traffic. Investing in more bandwidth and upgrading firewalls is just not sufficient. Microsoft has invested heavily in a content-delivery network (CDN) to accelerate the delivery of content to users and, subsequently, wants users to connect to its network as fast as possible. What does “direct internet connection” mean? It essentially means the wide-area network (WAN) has been relegated to second fiddle. The traditional WAN has been displaced and is no longer suitable for the more data-intensive business applications. A direct internet connection ensures the shortest possible path for your data. A big issue around direct internet connections is how to secure them. Direct-to-internet means bypassing the security gateway, but it’s often impractical and far too costly to place security appliances at every branch. However, by embracing a security-as-a-service approach, businesses can deliver the security required for their internet-bound traffic and the user performance needed for their Office 365 connections. Companies with successful deployments are using local internet breakouts at branch offices, which gets Office 365 traffic to Microsoft as quickly as possible, and avoids the latency added when traveling through older hub-and-spoke networks. In effect, these companies are using the internet as their network, whilst still applying the controls and rigour that their hub and spoke network provided. Using cloud-based business applications, such as Office 365, may seem like a daunting task when taking into account the impact they are having on the traditional network. However, the benefits are significant. It enables companies to create a truly global workforce by delivering services, such as Skype for Business, Cloud PBX, SharePoint or OneNote, to users wherever they are located. 
Users also get to experience a frictionless IT environment that enables them to communicate and do their jobs faster and more efficiently. By moving to a direct internet connection and using local internet breakouts at each branch office, businesses can ensure they have fast, secure connections for all application requirements.

### Money for nothing – why cloud instance family moves are a no-brainer

The pace of change in cloud computing is relentless. Yet, for IT decision-makers it's all too easy to focus on the next big change while overlooking your existing infrastructure. When you fail to review and monitor your current cloud instances and servers, you could well be missing out on the possibility of better performance and lower costs. Instance family moves – whereby an organisation moves its workloads to next-generation server instances – provide organisations with this opportunity. By taking a few simple steps, more efficient and cheaper cloud servers are there for the taking. Those who dare can win. What is an instance family move? To set the scene, imagine you have a mobile phone contract you're paying £25 a month for. You've had the phone for a year or so, it still does its job, but it has definitely grown slower. To speed things up you could uninstall some apps and turn off Bluetooth, but why go through the time and effort simply to limit the capabilities of your hardware? Imagine instead that your phone provider offered to take in your old phone, shut it down and then promised that when you started it up, it would be slightly faster and in better condition. What's more, it would only cost you £22 a month from now on and would have all the very same apps and accounts set up as before. Increasingly, this is what more and more cloud providers are offering to their customers. When they update their server hardware, they will offer new cloud instances at a cheaper price. A typical deal might look something like this: a company could be running an m3.xlarge server in Ireland, with Linux, 4 vCPUs and 15GB of RAM, for $214 each month. Through an instance family move, they could gain access to an enhanced m5.xlarge server with 4 vCPUs and 16GB of RAM for only $156 a month. The motivation for cloud providers is that it's more expensive to keep maintaining the old hardware. The incentive for customers to migrate is two-fold: the cost savings can be quite substantial and, in some cases, the newer infrastructure performs better too. What's the catch? If instance family moves represent such a valuable opportunity, then why aren't more companies taking advantage? Indeed, relatively few customers seem to care about these regular hardware releases. At present, the only way to undertake one is as a regular review built into your own Cloud Centre of Excellence, or to directly contact your IT or managed services provider about it. Forgetting to do this is an easy oversight, but it's even easier to fix. There are, however, pitfalls to consider. Starting an instance family move can be as simple as shutting down your server and starting it up on the new family type (a minimal sketch of this step follows below). Yet, there are constraints as to which instance types you can move to. Understanding your workload and its specific needs is important as it allows you to make more informed decisions when choosing the best instance type for your application. It is also important that your architecture is designed so you can shut down a server without it impacting your customer experience.
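To make the shut-down-and-restart step above concrete, here is a minimal sketch of how an instance family move might be scripted on AWS, assuming Python with the boto3 library; the region, instance ID and target type are illustrative placeholders rather than details from the article, and the same caveats about compatible instance types and downtime still apply.

```python
# Minimal sketch of an instance family move with boto3 (assumed setup:
# AWS credentials configured; EBS-backed instance). The instance ID,
# region and target type below are hypothetical placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")
instance_id = "i-0123456789abcdef0"   # hypothetical instance
target_type = "m5.xlarge"             # next-generation family

# 1. Stop the instance (an EBS-backed instance keeps its volumes).
ec2.stop_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_stopped").wait(InstanceIds=[instance_id])

# 2. Change the instance type while it is stopped.
ec2.modify_instance_attribute(
    InstanceId=instance_id,
    InstanceType={"Value": target_type},
)

# 3. Start it again on the new family.
ec2.start_instances(InstanceIds=[instance_id])
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
print(f"{instance_id} is now running as {target_type}")
```

In practice this would sit inside whatever review and change-control process your Cloud Centre of Excellence or managed services provider uses, and behind a load balancer so the brief stop does not affect customers.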
Load balancing and immutable infrastructure should be commonplace in your IT estate in the cloud. In this respect, Netflix's Chaos Monkey – which randomly shuts down servers to ensure the workload can cope – is a good act to aspire to. Throughout all this, analysis and diligence are critical. Furthermore, it doesn't hurt to call in some outside expertise. Some considerations are easily missed when you lack experience of performing instance family moves. For example, some newer instance types are based on CPU credits that are built up over time. Having a trusted partner help you through the implementation process can be invaluable. While change is often good, it can also be challenging and expensive. Yet instance family moves present a largely painless way to effect real, positive change to operations. 'If it ain't broke, don't fix it' seems like common sense, but it won't do you any favours in a rapidly evolving cloud landscape. Undertaking an instance family move for lower costs and potentially faster speeds should be the obvious choice.

### Legit Ways to Prepare Your Site for Peaks in Traffic

Peaks in website traffic are a double-edged sword. On the one hand, they mean more traffic and more opportunities to convert visitors into paying customers. On the other hand, peak traffic can have a detrimental impact on website performance and could even cause downtime that results in the loss of potential customers. When a business knows that a peak is coming, whether as a result of a promotion or in relation to seasonality or a holiday, there are steps they can take to mitigate the risks. Here are some legit ways to prepare your site for peak traffic. Have Website Monitoring in Place The importance of having website monitoring applications in place cannot be overstated. These tools provide valuable insights into what's causing a problem, as well as triggering an alert when bottlenecks arise and identifying when a problem is imminent. When searching for a website monitoring provider, look for a company that helps identify the root cause of an issue, and instantly alerts you to the problem the moment it happens, regardless of where you are in the world. Use a tool that provides guidance on correcting the problem and allows you to share your website's status easily and efficiently. Test the Traffic Before you have hundreds or thousands of people crowding your website at the same time, you can simulate the effect and see what will happen to your site. Stress tests are a way to prepare for peak traffic by simulating the effect of a traffic increase and noting the results based on your current setup (a minimal example is sketched a little further down). This exercise will identify what needs to be put in place for your website to perform at optimal levels during periods of increased traffic. With enough time and preparation, you'll also be able to test the alterations you make based on your initial results. This will help you minimize the potential for downtime and ensure that your real visitors have the experience you want them to. Talk to Your Hosting Provider Understanding how your hosting plan works is essential in preparing for increased traffic. If you don't already understand how your plan works or what flexibility you have, take time to have a discussion and explain your concerns. Your host should be able to look at your current plan and determine what you can handle or what extra service you can purchase to handle the influx of traffic.
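As promised in the 'Test the Traffic' step above, here is a deliberately simple stress-test sketch using only the Python standard library. The URL, request count and concurrency are placeholders, and a real exercise would use a dedicated load-testing tool and run against a staging environment rather than this toy script.

```python
# Toy stress test: fire concurrent requests at a page and report error
# rate and latency. The URL and numbers are illustrative placeholders;
# use a proper load-testing tool for anything beyond a rough smoke test.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "https://www.example.com/"   # hypothetical page under test
REQUESTS = 200
CONCURRENCY = 20

def fetch(_):
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(URL, timeout=10) as resp:
            resp.read()
            ok = 200 <= resp.status < 400
    except Exception:
        ok = False
    return ok, time.perf_counter() - start

with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
    results = list(pool.map(fetch, range(REQUESTS)))

latencies = sorted(t for ok, t in results if ok)
errors = sum(1 for ok, _ in results if not ok)
print(f"requests: {REQUESTS}, errors: {errors}")
if latencies:
    print(f"median latency: {statistics.median(latencies):.3f}s")
    p95 = latencies[max(0, int(len(latencies) * 0.95) - 1)]
    print(f"95th percentile: {p95:.3f}s")
```

Running it before and after any changes gives a crude before-and-after comparison of how the site copes as concurrency rises.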
Fortunately, cloud hosting creates a lot of flexibility in plans. While not all of your website traffic will be directed through your host, having a discussion with your provider can help create a great customer experience for those directed through that point of entry. Caching is Key All website and app developers should use caching to some extent. Caching saves relevant data so that it doesn't need to be reloaded from scratch every time someone visits the site. The idea is that the first visit will have the longest loading period, and subsequent visits will rely on cached data that takes less time to load. Going beyond the website defaults and taking an in-depth look at caching opportunities for both static and dynamic content can help reduce loading and processing time, so that visitors get what they need and leave quickly, making room for more guests. Understand Your Customer Journey Even with enough bandwidth, bottlenecks can occur. For example, your website might be able to handle 500 active users at any given time, which works when you have 400 browsing and 100 going through the checkout. However, if there was a shift, and you suddenly had 100 people browsing and 400 people going through the checkout, problems could arise. To get ahead of bottlenecks, you need to understand your customer journey by looking at your analytics. Take note of the popular points of entry, whether it's a blog post or your home page. Where do your customers go from there? Are there common crossover points? Where do conversions happen and where do customers bounce the most? Understanding the customer journey can help you prepare accordingly, and test various routes to help customers get to the checkout a little faster. It can also help you allocate bandwidth during peak times throughout the day if necessary. Work with a Pro Even with the right tools in place, sometimes people don't know what they don't know. Most businesses use their website as a tool while their specialty and knowledge are focused on what they sell. This is why working with a web developer or someone who is experienced in creating a web traffic strategy is a good idea for businesses that know changes are on the horizon. Rather than struggling with the concepts above, a web developer will be able to see the bigger picture and put it all together in a way that makes sense for the business. Whether your organization aims to handle the traffic peak itself or work with a consultant, following the tips listed above can help prevent disaster and create ample opportunity for success.

### The AI revolution has arrived – but are enterprises ready?

The overall theme at this year's Mobile World Congress was 'intelligent connectivity'. Described by the event's organisers as the 'beginning of a new era of highly contextualised and personalised experiences', it's a theme that's equally applicable to service providers as to their customers, and one that will be increasingly enabled by the adoption of artificial intelligence (AI) and machine learning (ML) technology. Once the sole province of future-gazers, AI has slowly but steadily progressed to become a critical part of our everyday lives. However, while virtually every device launched at MWC last year featured an element of AI or ML, the general opinion was that the industry was only just beginning to explore the technology's potential.
But, as the telecoms world assembles in Barcelona once again, we now appear to be on the verge of a ‘profound technological revolution’. Use cases and opportunities The advent of 5G has opened the door to several use cases in which AI plays an integral part. Edge intelligence, intelligent network slicing, and predictive digital twins for example, will all be greatly enabled by 5G, and all rely on AI to some degree. Edge computing is on the rise. In order to deliver the ultra-low latency promised by 5G, compute resources are being moved to the network edge, reducing the distance that data is required to travel from where it is generated. A serverless architecture at the edge allows for greater scalability and availability than would be afforded by edge-based servers, while the application of AI-powered analytics enables edge intelligence, the level of which was only previously available in on-premise or cloud-based datacentres. Network slicing, another function enabled by 5G, allows an operator’s business customers to tailor their connectivity and data processing capabilities to their specific requirements. The addition of AI, however, automates the process, thus optimising its efficiency and increasing the opportunities for operators to address the evolving needs of their customers. Indeed, according to the GSMA, network slicing open up a revenue opportunity worth around $300 billion by 2025. 5G will provide the connective fabric for a wealth of connected IoT sensors, the data from which can be used to create digital twins - digital replicas of physical assets, processes and systems. Predictive analytics then help organisations make sense of this vast volume of data, turning information into insight. Predictive digital twins can shape the future of a business. According to Deloitte, "digital twins can profoundly enhance an enterprise's ability to make proactive, data-driven decisions, increasing efficiency and avoiding potential issues. And they can make it possible to "experiment with the future" by exploring what-if scenarios safely and economically." The future is closer than we think We’re – fortunately – a long way from approaching Terminator or HAL 9000 levels of artificial intelligence. We’re starting to see the emergence of AI-driven applications, however, that until recently were the stuff of science fiction. Autonomous robots, drones and vehicles, for example, use AI and ML to automate functions previously performed by humans. The manufacturing industry is being transformed by ‘smart factories’, in which collaborative robots, or cobots, are trained to perform a multitude of tasks rather than carrying out one repetitive routine. And self-driving cars, expected to hit the roads this year, rely on AI and ML to continually process the data they need to learn how to drive and perform on the road as a human would. Extended reality, or XR, a mixture of real and virtual environments and interactions between humans and machines, is set to radically transform the way in which we interact with media. The hope is that, in time, people will not distinguish between AR, VR or video, and will seamlessly move between the media that best suits them at any given time. Finally, AI will intersect with quantum computing, creating the ability to present multiple solutions to highly complex problems simultaneously. 
The potential of this 'quantum machine learning' will be hugely disruptive, enabling AI to more efficiently perform complex tasks in a human-like way, and allowing robots to make optimised decisions in a given situation in real time. The future is boring These are all exciting prospects, of course, and updates on their progress will undoubtedly draw crowds at events such as MWC over the coming years. As the technology continues to evolve and develop, however, we must proceed with caution. Despite fears of AI-enabled Terminators, John Giannandrea, Apple's SVP of machine learning and AI strategy, suggests we 'forget killer robots – bias is the real AI danger'. Although AI is viewed as intelligence demonstrated by machines, the algorithms on which it runs, and the data sets used to train those algorithms, are all created by humans. As a result, they can be tainted, containing implicit racial, gender or ideological biases, clearly unhealthy for any applications supported by AI. It's important, therefore, when developing AI systems to maintain an awareness of bias, and use quality, transparent data at all times. AI is clearly a key pillar of 'intelligent connectivity', with the potential to transform our lives. For the time being, though, the AI applications that will make the most difference to our lives are going to be relatively dull when compared to a future of autonomous drones, quantum computing and widespread XR. And there's nothing wrong with that. When it comes to automating and optimising their networks, operators are already realising that it's the boring, background solutions that make the biggest difference. So, while there was a lot of talk about the bright future of AI at MWC – with dancing robots and automated baristas – it's the little things we should really be paying attention to.

### Don't Crash the Party: How to Ensure Your Site Stability

Competing in the modern business landscape takes a lot of hard work and some marketing savvy. Creating an online presence for your business is essential. The first step in spreading the word about your company's products and services online is having a website built. Oftentimes, taking on this difficult process alone is a bad idea. Modern consumers want websites that are eye-catching and responsive. Did you realize that over 35 percent of consumers claim they stop interacting with a website if they consider the layout unattractive? While the overall layout of your website is important, you need to focus on making it stable as well. The following are some of the things you need to consider when trying to keep your new website reliable and fast. Investing in a Content Delivery Network is Vital Are you trying to find a way to make your website load faster for new users? If so, then investing in a content delivery network (CDN) is a good idea. In essence, a CDN is a cloud-based system that helps to optimize the delivery of things like video and graphic elements. The cloud-based servers a CDN supplier uses are spread throughout the world. By implementing one of these networks, you can make it easy for users to get a fast and reliable website browsing experience. Once the CDN identifies where a user is located, it will direct them to the server nearest them. The closer that cloud-based server is to a user, the faster the website will load.
Having a Cloud-Based Backup in Place is a Smart Move The more you customize your business website, the more files there will be in its database. Instead of leaving the safety and security of these files to chance, your main goal needs to be protecting them. One of the best ways to keep the data on your website accessible is by investing in a cloud-based backup. Not only will this backup keep your information protected, but it will also allow you to get your website up and running in the event of a crash. The biggest benefit that comes with using a cloud-based backup for your website is that it updates automatically. Before choosing a backup provider, be sure to take a look at the security features they have in place. Keep Software Updated One of the most popular website building and hosting platforms on the planet is WordPress. For most business owners, the biggest benefit that comes with using WordPress is the endless array of plugins on offer. While these plugins can make your site more functional, they need to be updated regularly to ensure they work. Letting these programs get outdated will lead to slower page load speeds and functionality issues. One of the best ways to catch these issues before they lead to a website crash is by using Papertrail. This program puts all of your website error logs in one easy-to-use dashboard. If you get an error message about outdated software, you need to work on getting it fixed immediately. Penetration Testing is Crucial Getting to know just how much your website can handle before it crashes is important. One of the best ways to get this information is by hiring IT professionals to perform a load and penetration test. An IT company will have the tools needed to simulate the load of thousands of users trying to access your site at once. The results from this testing will help you figure out whether or not your servers are well-suited for the traffic you hope to generate. A penetration test can be used to point out security flaws in your website's infrastructure. Having these tests performed on a regular basis is the only way to ensure your website is safe and functional. Reaching Out For Professional Assistance Trying to maintain a website and run a small business can be extremely challenging. Instead of getting overwhelmed, you need to reach out for some professional help. Hiring an IT professional with a great deal of experience will allow you to keep your website running like a well-oiled machine.

### Inspirations - Part One - Disruptive Live

### The 'Internet of Things' - from buzzword to reality?

The internet of things - or IoT to the majority of us - has been a buzzword for some time now. And like AR and VR, the term is thrown around by large corporates and start-ups, all angling to get a head start on the next big thing. Until now, that's all it's really been – buzz. But things are about to change. Like most mainstream technology, it's rarely a single product that makes a market 'go' but a combination of factors all coming into existence at the same time. IoT is no different.
Here are the factors that I believe will provide IoT with what it needs to finally take off and change how we live, work and play.   A real reason to be At the heart of any successful innovation must be a real genuine reason to be. IoT isn't about connecting things to the internet and making them smart – we’ve been doing that for years. Remember smart fridges? How many people actually have a fridge that shops for them?    Innovation is about the invisible potential in the things we overlook that connect us with one another, and with ourselves. IoT isn't just about more tech – it’s about making things invisible, making annoyances go away and optimizing every-day life.   Once we get past the first wave of mind-boggling smart gadgets - from home appliances that sing, to underwear that will know when it needs a wash - we will start to get into the really interesting stuff. Genuine breakthrough platforms that combine AI to join up the dots and create a network effect, where the combination of products working in harmony with users will create new experiences. At the heart of IoT adoption will be people like me and you, wanting and demanding better.   A new network 5G technology marks the beginning of a new era for IoT. 5G is not just about faster speeds – with any new network offering comes innovation and the potential for disruption. The publication “Telecomlead” says that 5G is “up to 10 times more efficient on a cost-per-gigabyte basis compared to LTE”.   Such an exponentially reduced cost could mean a global democratization of data and network usage, allowing for a universally connected world. We are already seeing signs that the big telcos are gearing up, with the launch of V by Vodafone and a number of hardware manufactures shifting from mid and low-tier products to connected homes, families, pets – the list goes on. Even if IOT devices don’t necessarily require 5G to operate, a new dawn can form a new era.    A new kind of power In order to support a network of smart connected devices, we need to rethink power. The good news? It’s already happening. Companies like Drayson Technologies have mastered a wireless charging technology that can power low level devices over 4G, 5G and Wi-Fi. The wireless charging market is already beginning to disrupt the healthcare and automotive industries – in fact this sector is poised to be worth $22 Billion by 2022. It might sound like magic - but it’s exactly what we need to maximize the potential of IoT, and allow smart tech to come out of the home or office and exist on us, in us, in nature, or in remote environments – and beyond.    A killer example Finally, it will come down to someone doing something surprising and memorable. How many people could have predicted that the millions of dollars in R&D that went into the iPhone would have been showed off by the simple drinking of a humble pint of digital beer? I don’t think this time it will be a single product, or even an experience, that sets off the IoT revolution. The benefit of IoT is more likely going to be brought to life by empowering startups and small cities to do things never thought possible – to compete against the tech hubs such as San Francisco, New York, and London – by giving them the opportunity to become just as connected and powerful. We can already see this happening, with Bill Gates spending $80 Million on creating a “Smart City” in Arizona, and companies such as Cisco creating a fund to help cities around the world build smart infrastructure.   
So whether you're an individual who will benefit from minor life improvements in hidden ways, or a city reaping huge financial, social and logistical benefits, or even somewhere in between – a business, for example – the question of what IoT means for you remains to be seen. The future is up to all of us – all we know is, there probably won't be a singing fridge in sight.

### SDWAN Solutions Launches World First SD-WAN Cloud

SDWAN Solutions Ltd launched the world's first SDWAN Cloud on 5th April 2019 to great acclaim. SDWAN Cloud allows a single site on an SD-WAN real-time access to every IaaS and SaaS instance and to over 250 data centres globally - a real breakthrough for multi-cloud customer environments and a visionary, next-generation product for an industry only just into mainstream adoption in the UK. SDWAN Solutions Limited - a London-based company, soon opening a European office in Brussels - offers an end-to-end, managed service provision with 100% of its core business focused on SD-WAN. SDWAN Solutions partners only with true SD-WAN vendors such as Silverpeak, Talari and Velocloud, and offers complete SDWAN-as-a-Service solutions, including connectivity sourced from nearly 2000 suppliers globally. However, it is their passion for innovating the SD-WAN market that has driven this advance ahead of any other competitor globally. They are now inviting similarly minded, forward-looking businesses across all sectors to take up their offer of a free proof of concept and rigorously test their unique SDWAN Cloud during Phase 1. What are the benefits of the SDWAN Cloud? Anthony Senter, MD, explains how businesses will benefit from the SDWAN Cloud: 'With around 88% of UK businesses using cloud-based applications from many sources, we've developed SDWAN Cloud to allow them to connect and disconnect their SD-WAN network to any cloud application in the world, in under 60 seconds and with no minimum contractual commitment – giving businesses total control, from a single portal. Businesses operating different SD-WAN networks can now, for the first time ever, link up separate SD-WAN networks, regardless of the underlying provider.' How does the SDWAN Cloud work?

- A world first that allows SDWAN Solutions customers to self-provision virtual connections to almost every IaaS and SaaS provider instance
- Connection to over 250 data centres internationally
- Consumption-based billing model - monthly commit or PAYU
- Spin up or spin down connections in under 60 seconds
- Facilitate communication between different SD-WAN vendor solutions, making multi-SDWAN networks a reality
- Self-provision virtual X-connects from a central portal
- No need to install and manage multiple instances of SD-WAN in each IaaS and SaaS environment

Will the SDWAN Cloud save money? Businesses will save money by using the SDWAN Cloud as part of their SD-WAN solution to send data between over 250 international data centres on a PAYU model, without paying for a dedicated international connection. Plus, customers no longer need to install and manage multiple instances of SD-WAN in every cloud, as SDWAN Cloud accesses them all. SDWAN Cloud is the next logical step in facilitating multi-cloud environments and, just like an SD-WAN solution, can be controlled from a single location. Essentially, businesses will no longer be tied into contracts which are not cost-effective, or be paying for lines they don't use.
This will deliver genuine cost savings to businesses. Free Proof of Concept for the SDWAN Cloud Toby Sturridge, Chief Technical Officer, explains the company's invitation to be part of their breakthrough innovation and take advantage of their free proof of concept period. "Our Phase 1 launch is all about extensive testing, to ensure we cover every eventuality. So, we're looking for live customers to partner with us and be part of this exciting project – simply click here to review the offer. "If you have a multi-cloud strategy and are investigating SD-WAN as a future network topology, or if you currently have international circuits and are looking to reduce costs, or even if you need to join different SD-WAN networks together, we would love to include you in an innovative, no-charge proof of concept trial, to fully demonstrate the advantages of the new SDWAN Cloud." Anthony Senter concludes: "We are fast building a reputation for innovation within the SD-WAN sector, emerging as trusted experts in all things SD-WAN, so we are delighted to bring this concept to market as a world first." "Our SDWAN Cloud not only gives the customer full control over their cloud connections; their SD-WAN estate is also reduced, and customers can switch between different cloud providers in under a minute in times of outage or degraded performance from a cloud provider. Plus, for the first time ever, companies can successfully run SD-WAN from different suppliers together in one network – very useful in times of mergers and acquisitions or where different regions use different SD-WAN solutions." What next for SDWAN Cloud? Phase 2 will go on general release to the public immediately after Phase 1 testing with live customers has been completed and will include easy connection to security and perimeter-as-a-service products.

### Open Banking is changing banking, but can one regulation break the system?

On the first anniversary of Open Banking earlier this year, many stopped to question how regulatory standards were working, and what could be done to improve the banking ecosystem. How far have we come? Have we done what we set out to? Are we winning the battle to democratise banking and remove friction for the end user? Open Banking was implemented to enhance consumers' banking experience, but what's become evident is that – while Application Programming Interfaces (APIs) have created a more user-friendly system – problems related to the Second Payment Services Directive (PSD2) have also threatened not only to inconvenience consumers, but to undo the Open Banking ecosystem we've worked so hard to develop. Is regulation hindering FinTech? Open Banking's potential remains enormous and innovation continues to expand, but the incorrect implementation of Strong Customer Authentication (SCA) standards jeopardises the progress of Open Banking. The threats facing Open Banking can be broken down into three groups: unilateral application across non-payment accounts; authentication apathy from consumers; and longer-term threats to consumer data. Unilateral application across non-payment accounts The first challenge facing SCA is the unilateral implementation by banks across accounts, regardless of whether they're PSD2-regulated, or what type of activity requires access to the account.
Right now, legally, only payment accounts need to apply SCA standards, and the European Banking Authority (EBA) says that as such, "security measures should be compatible with the level of risk involved in the payment service". While implementing SCA across all account types seems secure and transparent, if these standards are applied to all accounts - including savings, individual savings accounts (ISAs), mortgages and loans - customers may soon experience significant disruption across their banking experiences. Authentication Apathy As they stand, SCA regulations mean that consumers must reissue consent for their data to be used by third-party providers (TPPs) every three months, regardless of whether they've granted access previously or not. This isn't the case for non-payment accounts like savings, loans or ISAs, so I ask: how can we expect to regulate these in the same way? Longer-term threats to consumer data SCA implementation and its disruption go beyond User Experience (UX) friction. What users need to be most worried about are the potential consequences for consumer behaviour. By making the consumer journey so cumbersome, SCA could ultimately lead consumers to take counterproductive actions that put their data at risk – for example, creating one password across accounts for ease of use. This makes financial data less secure, because if one account is accessed, all accounts are compromised. Because SCA regulations are likely to be applied unilaterally across accounts, as many as 69% of the UK population who use online banking could be affected, and therefore are at risk of fraud. A recent data breach stemming from password flaws like those mentioned left 2.7 billion customer records at risk. Companies can resolve this particular problem by making consumers aware of the issue and instituting various levels of authentication for various functions within their banking experience. For example, viewing an account which has no payment capability shouldn't require the same level of authentication as making a payment. This is a simple but important distinction - as banks begin to roll out SCA, step-up authentication needs to be considered seriously as a viable alternative. Rethinking 'one size fits all' Open Banking is founded on the promise that consumers would be able to safely and securely share their data, to make their financial lives more integrated and manageable. Something needs to change, regardless of the challenges involved. Ultimately, the main concern around SCA is consumer protection, and ensuring users have the tools they need to make the best financial decisions for their lives. This issue is a highly important and urgent one and, if left unaddressed, consumers will undoubtedly be left not only with less helpful banking tools but, more critically, in poorer financial health.

### Disaster will strike, get ready

Get prepared for when, not if, disaster strikes We see all manner of scenarios when it comes to business disaster recovery (DR) plans. There are those who think they're prepared and ready for any type of natural or man-made disaster, and others who think daily data backups to tape will keep them safe in the event of IT failure. And we also see plenty of organisations who haven't got further than thinking about DR. Data suggests that less than a third of small businesses have a continuity plan in place to protect themselves in the event of IT failure.
For large organisations, shockingly, a third have no contingency plans in place either. Whatever the size of business, it's clear that organisations place varying degrees of importance on the impact of business disruption in the event of a disaster. No business is immune to the risk of cybercrime. Today's threats are so sophisticated that even those who believe their DR setup to be pretty robust could see their business paralysed by a particularly nasty ransomware attack. Risky business And it's not just cybercrime that businesses need to worry about. Businesses can also find themselves facing a period of operational downtime at the hands of a power outage, hardware failure or human error. A few years ago, British Airways suffered a major power outage which damaged servers at its data centre. Flights were cancelled, impacting approximately 75,000 passengers, putting the airline's reputation at risk and leaving it facing a hefty compensation bill. Fire, flood or even a terrorist attack may all feel like unlikely scenarios, but if there are no plans in place to get back to business as usual following a catastrophic incident, they all have the potential to stop business proceedings altogether, sometimes for good. The fact is, any of the above scenarios can happen to any business and at any time. A problem we encounter all too often, though, is that many businesses believe that their DR setups are robust enough to handle any form of IT failure, when in fact the DR plan has never been tested to see if it works. Even large enterprises occasionally fall foul of not testing for vulnerabilities in their DR plans and procedures. Sophisticated ransomware attacks can encrypt not only data stored on the primary infrastructure but back-up data too. Even the replicated data stored off site is at risk from a particularly ferocious virus. It's also important to remember that businesses have a duty to protect their clients' data and they're putting it at severe risk if it is compromised in any way. Last year businesses had to ensure they were compliant with the new GDPR regulations to avoid a fine. But how many ensured that their data was protected from IT failure? It is essential that DR processes are compliant too. However, it's never too late to review or test your DR plan to protect yourself against lost sales, reduced productivity and the associated reputational risk that comes with operational downtime. According to analyst group Gartner, the average cost of IT downtime is approximately £4,400 per minute, a high price to pay for situations that can be avoided. System outages are highly preventable if your business continuity plans include the right solutions. A good place to start is to answer a few key questions: Is your DR setup tried and tested? Have you protected your core business-critical services? Does your current DR infrastructure need updating? Does it need further investment? If your primary systems were to fail right now, what would happen? If you struggled to answer any of the above questions or are concerned about the impact a significant period of downtime could have on your business, it might be time to reconsider your approach to DR and outsource it to a third-party provider to ensure you're covered against every eventuality. This is where Disaster Recovery as a Service (DRaaS) comes in – with a third-party provider delivering failover by replicating and hosting physical and virtual servers in the cloud.
DRaaS can provide full recovery by replicating infrastructure, applications and data from multiple locations to the cloud. A set of Recovery Point Objective (RPO) and Recovery Time Objective (RTO) rules can also be built into DRaaS to automate recovery in the event of an incident within an agreed timeframe. The main benefit of DRaaS is that it offers CIOs the reassurance that their business-critical assets are outsourced to a safe pair of hands and can be recovered in the event of a cyber attack, a hardware failure or a power outage. Businesses can't prevent some incidents from happening, but what they can do is adequately prepare for disaster to prevent operational downtime significantly impacting them. Unfortunately, the likelihood of a disaster striking has become very real and it's essential that they are prepared for when, rather than if, it happens to them.

### How a Virtual Data Room Differs From Any Traditional Cloud-based Storage: Which One to Choose?

These days, businesses have more options when it comes to storing and sharing information. Two of the popular platforms being used are the virtual data room and traditional cloud-based storage. How do they differ and which is better? Let's scrutinize them and answer that question. Most people prefer the traditional way of storing data, but more and more are discovering that virtual data rooms have more benefits, and it would be foolish not to make the switch. You need to understand the technology behind the virtual data room before you decide. Let's see how a virtual data room differs from traditional cloud-based storage. Efficiency When it comes to efficiency, the VDR has the edge over its traditional counterpart. Transactions are fast, which increases deal value and lowers costs. Also, a virtual data room can reach a huge number of bidders quickly. Unlike a traditional setup, where you need to schedule an appointment to hold a meeting, a VDR gives all bidders access to the room at the same time. As a result, deal time is reduced, and the bid value is higher by 20 to 30%. These bidders are busy individuals, and they do not have time to travel to a meeting place. A virtual data room also has a higher chance of converting bidders into buyers, since they can access the data room anytime they want. Simple and Secured Your work is simplified in a virtual data room. In an actual data room, by contrast, the work is tedious and time-consuming. Most of the time, traditional data rooms are conducted at a lawyer's office to make sure everything is secure. Although this process does increase security, your maintenance costs also increase with the traditional system. With a virtual data room, all your data is stored on a secure server. You control who is given access to the documents. Copying documents is restricted as well; users can only copy them if you allow it. This is a level of control that a traditional data room cannot offer. Monitoring In a virtual data room, monitoring is easy. The system is fast and simple. Most of the time, you can have your data room set up and running within a few hours, and thousands of documents can be stored and made available on your site within a few days. With regular data storage, monitoring and tracking who viewed a document is not possible, but a digital data room gives you daily reports on which bidders accessed which documents.
A virtual data room provides you with electronic records of who viewed a document and when they accessed it. Communicates Effectively A virtual data room keeps communication organised and confidential. Physical data rooms lack this capability, which is a big advantage of the VDR. The communications between you and the bidders in online data rooms are kept confidential and detailed. Questions intended for a particular company are answered directly without exposing the information to others. Adding new documents for bidders to see is also easy: upload them to the system and you are done. Provides Valuable Information In a virtual data room, interest in the company is easily analyzed, something traditional cloud storage cannot offer since it lacks such reporting tools. This information is very useful; your investment banker can use it to increase value during the bidding process. Knowing the interest of the bidders is an important lever for increasing their offers to your company. Essential Factors to Consider in Choosing a Virtual Data Room Investing in a virtual data room is not as simple as you might think. You need to consider some factors that may substantially affect your business. Security One factor you should consider is security. You need to understand the risk involved. Yes, providers say it is secure, but how secure is it? With today's technology, even government websites are not safe. Reliable online data room providers use systems designed to ensure hackers will not be able to penetrate them. Cost VDR providers charge their clients depending on the amount of storage used as well as the length of time that the data room is running. Some law firms with advanced M&A practices provide their own proprietary data rooms, and some providers charge a subscription for continuous access. Choose a provider that charges per use if you only need the VDR for a one-time deal; if you will need it for multiple projects all year round, choose one that offers a flat, subscription-style fee. Use and Functionality It is essential that your online data room is user-friendly so everyone can access it with ease – anytime and wherever they are. This will save you hours of training. Choose one that is easy to set up, and uploading files in bulk should not be a problem. Remember, one of the purposes of using a VDR is its ability to hold a great number of documents and manage them accordingly. A system that lets you bulk-upload zip files or drag and drop files easily can save you a lot of time and keep your deal moving. Accessibility This is also an essential factor that you need to keep in mind. Bidders might access or view the VDR using mobile devices, tablets and other operating systems. Make sure that your virtual data room is accessible on these platforms so your clients will not have any issues using it. Final Thoughts There are lots of reasons why professionals are favoring the virtual data room over traditional cloud storage. It is the best option for storing and sharing confidential information during transactions and negotiations. If the stakes involve a significant amount of money, never trust your most sensitive documents to just any platform, as you never know who is waiting for the chance to gain access to your most confidential information.

### Cloud LMS vs.
On-premise LMS — Which One Works Best For You [vc_row][vc_column][vc_column_text]In a world where useful knowledge is put out every day and professionals even spend personal time and money to grow their careers, a reliable LMS is a must. And it pays off: structured learning resources that can be accessed anywhere will equip professionals with the skills they need to succeed at a lower cost. Everything looks good so far, but as soon as the decision to implement an LMS is made, another question follows: “How are we going to deploy it?” An LMS can either be hosted on the cloud (and here we mean private cloud) or on a business’s own servers. Deciding on the hosting method matters because it will cost you money and affect the way your LMS serves your needs. There is no blanket answer to this question because different businesses have different goals in mind when they want to develop a custom LMS solution. Looking at successful cases can provide insightful inspiration, but they are not enough for a final decision. You don’t want to follow anyone else’s steps to end up with an expensive system full of features you don’t need or a simpler one that fails to meet your expectations — which amounts to wasted money anyway. The wisest answer would sound a bit frustrating if it came in isolation: “It depends on your e-learning needs.” That’s why understanding our own needs and how certain features can help is fundamental. So here is how different features work in each deployment method: Accessibility Installing an LMS means attaching it to a certain device, something that cloud-based systems don’t require. That makes the idea of an on-cloud LMS more appealing if learning on the go, such as mobile learning, is on the table. Learning platforms that are created with flexibility in mind are recommendable because they give learners freedom to study anywhere they have an internet connection. If a business plans on having mobile learning solutions for smartphones as well, deploying their system on the cloud makes the experience even smoother for the learner. In this case, as long as an employee has a mobile device — they can learn. Their own phone and a decent 4G data package are all it takes for them to engage with the course while commuting, for example. Maintenance Because of their very nature, on-premise software solutions require infrastructure to be in house. Servers will be physically there. That means your IT staff will be able to keep an eye on the system to repair and adjust it as needed. The catch is they are going to have to do so because all maintenance will be up to them, which might leave them with their hands full. Hosting an LMS on a private cloud means renting a dedicated physical server from a vendor who will make sure everything works smoothly without your having to worry about hardware. One of the reasons companies may opt for a cloud-based solution is because they lack sufficient IT expertise or want to keep their IT staff small. After all, whatever issues arise, the vendor will immediately fix them. Costs (and hidden costs) Choosing to install the LMS on premise will involve a steeper upfront expense. Businesses may have to invest in new infrastructure after all. But whatever spending is needed at this stage will not be repeated. Deploying your software on a private cloud, on the other hand, won’t put a dent on your budget the same way on-premise will. The cost will keep returning instead. 
A cloud-based LMS involves a hosting fee that, while smaller than on-premise hardware spending, will become part of your operating costs. So, in the end, the scenario here isn't very different from a couple considering "Should we buy an apartment or rent one?" Well, just like the buy-or-rent dilemma, it's all about determining what you can afford now and how it will serve you in the long run. Before you go to the next topic, however, one thing needs to be clear: both options have their hidden costs. On-site systems slowly grow outdated as time passes, and the antidote is the necessary migration after, say, a few years. As far away as that might sound (really, it's similar to buying a car while already thinking about selling it), it cannot be avoided. Migrating from a legacy system to a new one can cost businesses a hefty sum depending on how obsolete their server has become. On-cloud systems entail no such problem because their servers are migrated by the hosting service without the business having to worry. That's already covered in the monthly subscription fee. Moreover, new features released can be added one by one. But depending on cloud-based systems, especially when many learners use them, demands robust bandwidth. Slow or failing internet connections are the bane of cloud users; besides being disruptive to the learning process, they count as downtime. Installed software, on the other hand, ignores that. Is it raining so much the internet is out? Fine, keep on learning. Customisation Simply put: customising and branding your system is feasible regardless of your deployment method, but more easily done with a cloud-based LMS. The vendor will give a business full control over customisation and the features to be included — again, without needing a team to monitor it later. On-cloud systems are the vendor's responsibility, which means they will mind any issues that might arise. In fact, when the time comes that the system needs to accommodate new users, the business can change its plan and scale more easily. What's the best option for me? Shall we go beyond the "it all depends" answer? Granted, it all truly depends on your needs, but some practical suggestions are possible. An on-premise LMS might be the best choice for you if you want:

- Absolute control over your system and data — no one else handles your LMS.
- An extra layer of security — clouds are secure, but businesses that deal with highly sensitive information that must never leak might want to keep it all inside their own walls.
- Self-maintenance — if you can afford a qualified IT team ready for work, maintenance will be quicker.
- Independence from the internet — an unreliable internet connection will no longer be a problem.

A cloud-based LMS might be the best choice if you want:

- Quicker deployment — not all businesses can afford to wait for their system to be ready, and a private cloud solution will be usable sooner than an installed one would.
- More flexibility for the learner — being on the internet means any laptop with an internet connection can reach it, and learning on tablets and smartphones can be added for maximum accessibility.
- An LMS that's ready for the future — the disproportionate majority of LMSs are now cloud-based, as is much other software, and scaling is easier as well.
- Fewer expenses and worries — leaving maintenance and operational details to other professionals decreases spending on IT resources, and your existing IT staff (which can be small) will have less to worry about.
[/vc_column_text][/vc_column][/vc_row] ### Where AI and Humans Intersect | The Power of Hybrid Artificial Intelligence (AI) is quickly growing in popularity and corporate use, with chatbots reaching the mainstream through companies like Casper with their Insomnobot 3000, and Tesla making self-driving cars a reality. AI functions by analysing the scenarios it is programmed to monitor, turning the data it gathers into action. This is a simulation of human intelligence without the inherently human hurdles of emotion and fatigue. A major benefit of using AI for business is the ease with which an AI engine can constantly detect and analyse data without tiring. A multitude of companies are investing in AI, including Amazon, Google, Microsoft, and IBM. Amazon has its voice-activated bot Alexa and has opened an AI supermarket called Amazon Go in Seattle, Google purchased AI startup DeepMind, Microsoft Ventures launched an AI startup competition, and IBM has had its own interactive question-answering computer system, Watson, since 2010. Companies working with both humans and AI, such as Mighty AI, a Training Data as a Service™ company, and CloudMinds, a provider of cloud-based systems for AI bots, have homed in on these benefits of AI whilst acknowledging the importance of human supervision. Alongside their avid use of machine learning, both companies remain conscious of the human role in effectively programming AI and monitoring its accuracy. Mighty AI hires people to pinpoint content correctly and tag it accordingly, and from that, the machine learning technology is able to do the rest of the work. One of the individuals hired to carry out this task at Mighty AI explains her job as “teach[ing] machines to identify high heels on a photo” in a promotional video for the company. On their website CloudMinds explain that their employees “are vital to making the vision come alive”, and they “are world-class scientists, engineers, business leaders and other professionals, like medical doctors”. When working with AI, humans are responsible for training machines as well as maintaining them, in order to keep standards high and the work carried out by the machines streamlined. The accuracy of the data collected improves with the number of talented and vigilant people checking and correcting what the machines produce. A major consideration for businesses surrounding the growing implementation of AI technology is what jobs will be lost, what new job roles will be created as a result of the technology (e.g. the role of tagging content), and how staff will adapt to working with AI. With the deployment of AI across business comes the need for people to build, programme, and train AI bots and computer systems. AI cannot function properly without human intervention and training. Without this human element, machine learning becomes what is known as unsupervised learning, where no labelled training data is provided as a basis for the machine to learn from. This leaves the AI to fend for itself without the guidance of humans sourcing data and content for it to learn from. Unsupervised machine learning is of particular use when there is no labelled data or when AI is being used for purely experimental purposes. 
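For readers who want to see the difference between learning with and without human-labelled data in practice, here is a minimal sketch contrasting the two approaches. It assumes scikit-learn is available and uses synthetic data purely for illustration; it is not tied to any of the companies mentioned above.

```python
# A minimal contrast between unsupervised and supervised learning, using scikit-learn
# (assumed to be installed). The synthetic data is purely illustrative.
from sklearn.datasets import make_blobs
from sklearn.cluster import KMeans
from sklearn.linear_model import LogisticRegression

# Toy data: 300 samples with 2 features, drawn from 3 hidden groups.
X, true_labels = make_blobs(n_samples=300, centers=3, random_state=42)

# Unsupervised: no human labels are given; the algorithm looks for structure on its own.
clusters = KMeans(n_clusters=3, n_init=10, random_state=42).fit_predict(X)

# Supervised: a human trainer has tagged each example, and the model learns that mapping,
# which is the approach favoured for safety-critical uses such as self-driving cars.
classifier = LogisticRegression().fit(X, true_labels)

print("Cluster assignments (first 10):", clusters[:10])
print("Supervised accuracy on the training set:", classifier.score(X, true_labels))
```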
Take, for example, the 2012 Google Brain project, in which the AI was confronted with millions of frames from YouTube videos without any annotation; by looking for trends and patterns, the system taught itself to identify animal faces. Supervised machine learning is a safer alternative, particularly for the development of things such as self-driving cars where lives may be on the line: knowledge of environments based on human labelling can help correctly identify a threat that the AI might miss if left to its own devices. Supervised machine learning algorithms rely upon training data to continuously learn from. There are different categories of algorithms: regression algorithms predict output values based on input data, classification algorithms assign data to different categories, and anomaly detection identifies data points that deviate from the typical pattern, known as outliers. Anomaly detection, for instance, can be used by companies to detect security breaches and can even identify atypical physical features in the human body, such as a tumour, through scans like MRIs. The human trainer of the AI is responsible for teaching the computer system how to identify these anomalies and what constitutes an anomaly. The human role when working with AI technology is to provide a safety net and a secondary means of resolving and monitoring potential issues in the development and deployment of new and potentially hazardous technology. Additionally, the synergy of man and machine significantly aids productivity and efficiency, as human employees share the workload with their AI counterparts. The importance of the human worker must not be overlooked, as the machine does not work without its trainer. ### 7 Essential Steps of Starting a WordPress Blog [vc_row][vc_column][vc_column_text]Have you always wanted to start a blog, either for business or personal reasons? It can be intimidating, right? However, with the right knowledge, starting a blog will be a piece of cake. Here is a step-by-step guide: Get a domain name and hosting For starters, you'll need hosting and a domain name. But why would you do that? You can have a free blog on WordPress.com, right? Well, yes. But if you truly want to be seen as a professional, you need your own corner of the internet where all of the content belongs to you and you can control it. On WordPress.com, that is not the case. So, if you want a good start, you have to invest some money. This is where Bluehost steps in. Most people start there - it's cheap, simple and partnered with WordPress. You can get it for even less money through various bloggers partnered with them. This may not be the top-quality option you are looking for, but it will do in the beginning. You can switch later, once you start earning money from your blog. You will have to choose a domain name - choose wisely. Changing it later will cost you more money, so be careful. If some of your first options are taken, try something either shorter or similar to them. Remember, your domain name is going to appear in your URL and everywhere connected to your blog. It needs to be memorable, and it needs to look good and professional. Don’t choose a domain name that’s too long - shorter is better in this case. You will be given a temporary domain - don’t get confused by that. Once you launch your site, it will change to your chosen domain name. 
“Some may get confused with Bluehost, especially if they have zero experience or knowledge of this part of the job, but there are plenty of videos and tutorials that show you exactly what you need to do in order to set everything up,” says Josie Bell, a blogger and regular contributor to Phd Kingdom. Install WordPress This is a one-click process on Bluehost and that’s why it’s so often recommended to new bloggers who are still unsure of what they need to do. All you have to do is log in to your fresh Bluehost account, enter your cPanel and find a place where it says WordPress. It’s that simple. Then you’ll have to wait for a while until it downloads. WordPress.org works in a similar way to WordPress.com - when it downloads, you will see a pretty familiar dashboard but there will be more options that you otherwise wouldn’t have. Design your website Now comes the fun part - or the tedious part, depending on where you stand. Choosing your own theme, design and fonts may seem like a wonderful job, but in reality it’s only fun for about five minutes. Then it gets frustrating. The important thing is to keep your head in the game and don’t get too caught up into the details. Elaborate, masterpiece themes look amazing but you really have no use of them at the moment. Choose a simple one that you will spice up with some personal touches like fonts, images, your logo and so on. It’s much better to go for a theme which loads quickly than for a theme that looks good but loads incredibly slowly. Many experts say that you should just focus on choosing something you like and moving on because you’ll get a lot more ideas as you go and you’ll change and tweak a lot of things. No one really sticks with their original design as trends, styles and preferences as well as the nature of your blog changes. You should, however, be consistent with your niche. Assuming you are already aware of which one you want to contribute to, you can choose designs and themes that work well with it. For instance, green or gold colors if your niche is finance. Get some useful plugins Some would say that you should write your first blog post straight after - or even before - choosing a theme, but it’s wiser to stick with the technical side of starting a blog. This means setting up many different preferences as well as installing all of the recommended plugins. For one, you’ll definitely benefit from the Yoast SEO plugin. It has a free and paid version. For starters, you could use the free one. Yoast SEO will be with you as you write your posts and let you know just how readable and optimized your posts are. If you notice a red smiley face, this means that you need to change something - and Yoast SEO will tell you exactly what it is. Next, install and activate WP forms or a different plugin for email newsletter signups. This is really useful later on, as you get more visitors. You should also see if Optinmonster - a plugin that gives your visitors a form where they can leave their email. For speed optimization, download Hummingbird plugin and choose one for all of your social media icons. Overall, there are a lot of options that you can choose from. There is a free plugin for almost anything. Test them and download the ones that work well for your site and your needs. Write a post Now it’s finally the time to write a post. Choose an interesting topic from your niche. Something that people would want to know about. It’s recommended that you start with a few - at least two - pillar posts. 
These are those long, 2000+ word posts where you give an extensive guide on something. These usually have a title starting with How To or A Guide To. Choose a realistic schedule for yourself. Once a week may be realistic for you if you have a job but you can post more if you have more free time. An average would be three times a week but this is not set in stone and different schedules work for different people and different blogs. Add plenty of images and visual aids to make your words breathe and to illustrate your point a bit better. Be concise and find your own voice in writing. People are not looking just for tips but also for some personality. Optimize for Google Yoast is a huge help in this area but there are also things you have to do on your own. For instance, choose a keyword you’ll rank for or optimize the speed of your site. Google takes everything into account so you should learn the general rules and best practices to ensure that you’ll rank well. Let the world know you are there Finally, create social media profiles or pages. While it may be a good idea to do them all, this will actually cost you a lot of time and yield poor results. Instead, focus on two platforms tops. Most experts recommend Twitter as a must and most recently, Pinterest as it’s a great place to get some people to see your content without having to gather a huge following. You can also do LinkedIn, if that suits your blog better or you could do Facebook, although it can be difficult to get followers there in the beginning. Starting a blog may seem hard when you are not skilled in certain areas but it can be really simple if you follow these tips and steps. The most important parts of the process are remaining consistent in your posting and being available across different social media platforms. Hopefully, this guide will help you.[/vc_column_text][/vc_column][/vc_row] ### Creating the excellent experiences today’s customers expect [vc_row][vc_column][vc_column_text]Today’s customers will not accept being treated like everybody else. They want to feel special and feel they are being given the attention they deserve. As a result, businesses will, therefore, have to figure out how to inject personalisation into their interactions with customers. This provision of personalised experiences is no longer a ‘nice to have’, only to be addressed once processes are up and running and shouldn’t be underestimated. It has already become a business differentiator as, according to Deloitte, 1 in 4 consumers are willing to pay more to receive a personalised product or service. But achieving the right level of personalisation is often easier said than done. As often pointed out, when it comes to providing a personalised customer experience, local neighbourhood shops definitely have a competitive advantage over most of today’s big businesses. They know you, probably your family too and your preferences. Knowing and anticipating your purchasing habits and maybe even your daily routine is part of their day to day job. They are also prepared and able to offer you a good deal to stop you from leaving after a bad experience. This kind of personalised customer service is the one that all businesses would like to recreate. However - despite acknowledging the importance of personalisation - many businesses and their contact centres are still falling short. 
Taking on the one-to-one challenge In 2019, one of the biggest challenges for contact centres with regards to providing a personalised service will be that the customer experience is no longer measured on individual interactions. Instead, it is judged on the complete customer journey. In today’s digital world of multiple touchpoints, this presents complications. Customers are interacting with businesses through more channels than ever. With online, social media and mobile applications, many contact centres are struggling to track the channels and deliver a consistent, personalised experience across the board. This shouldn’t be dismissed as a side problem as it can in fact directly impact a company’s bottom line. According to a recent report, 8 in 10 consumers are willing to switch companies due to poor customer service. In this context, businesses and their contact centres just cannot afford to get the customer experience wrong. A seamless customer experience is essential for businesses that want to remain competitive in today’s landscape, which presents a need for technology that can provide the personalisation demanded by consumers. Tools such as cloud computing, for example, provide contact centres with the ability not only to engage with customers much more quickly, but also interact with them in a personalised way by adding new channels or changing Interactive Voice Response (IVR) scripts in real time. But cloud isn’t the only technology with the ability to transform the customer experience. Other tools, such as data analytics, also play a major role in helping contact centres move from the outdated ‘one-to-many’ approach, to ‘one-to-one’ customer interactions.  Analytics will personalise your customer experience If used in the right way, data analytics allows businesses to see how a customer perceives the entire journey. It can combine different forms of measurement across all channels, so past and present journeys can be evaluated to pinpoint the root cause of negative interactions. It can also make it easier to personalise the customer journey by providing valuable insights into how specific customers behave, allowing businesses to re-create the local corner shop experience on a much larger scale. To put it plainly, through data analytics, contact centres can build a complete, real-time view of their customers by capturing customer feedback and interactions throughout their entire lifecycle. This can then be used to manage effective follow-ups and drive organisational improvements. Analytics will also empower contact centres to make intelligent decisions around maintaining performance and satisfaction, from both the customer and the employee perspective. For example, by analysing voices for variations in pitch and tone, or taking advantage of natural language processing to spot patterns in what customers are saying, agents will be able to gauge the emotional state of customers and adapt to it. This will enable them to track the most frequent topics mentioned by customers, as well as identify trending satisfaction issues. This could be the difference between proactively solving a problem for a customer or leaving them frustrated. Analytics can also be used internally to aid employees. Smart reporting can be used to reward employee performance, thereby driving employee engagement. Inefficient processes can also be identified and improved. The business value of analytics can be seen through various metrics and across different functions. 
However, most importantly, the use of data analytics tools can help contact centres move towards one-to-one interactions that are tailored to customers’ specific needs and preferences. In today’s hyper-competitive world, that’s an opportunity that can’t be ignored.[/vc_column_text][/vc_column][/vc_row] ### How can innovation shape the future of teaching? [vc_row][vc_column][vc_column_text]In Damian Hinds’ most recent address, he spoke of his “determination to reduce the workload of teachers and return teaching as one of the most rewarding jobs you can do”. To make this possible, the learning and development of teachers must be given a similar level of priority to the education of students. Teaching is regarded as being amongst the hardest-working professions in the UK, yet the progression of teachers’ skills and careers is not given the same attention as in private sector roles. Technology has largely had a positive effect on the education landscape, and it is up to the people operating within it to integrate innovative technology into helping teachers improve themselves and their ability to raise outcomes for students in a time of tough budgeting. This is in stark contrast to the values of the wider education system. The values of the education industry are centred around the learning and education of children of all ages. This spans the entirety of children’s time in the education system, including both curricular and extracurricular activities. The education industry should take a similar stance with teachers if we are to create well-rounded teachers who want to stay in the profession. Better teachers will result in better student attainment and, ideally, will solve the retention and recruitment issue facing the industry. To implement this, education institutions must use the new technologies available to them to reassure and inspire a new generation of teachers in a time of low budgets and high workloads. The downturn in the popularity of teaching is further proof of a country of educators who have fallen out of love with the profession. However, it is easy to see how this could change if those values begin to orientate towards the care and development of teachers, so that they get over the five-year hurdle by which time so many leave the profession, scarred and burnt out. In the past ten years, we have seen more technology being used in the classroom across the board. In many ways, the use of technology has allowed teachers to inspire school children from different perspectives. It has helped capture the attention of children worldwide, using a wide array of multimedia content and tech to bring even the most mundane topics to life. Current EdTech innovations are largely centred around the better learning of school children. However, in light of damning statistics suggesting that only one in three newly qualified teachers (NQTs) will stay in the education system after five years in the profession (Gov UK), institutions should now be looking to implement innovative technologies that will act as a catalyst to quicken teacher training and development, and give teachers the support they need to cement their skills and confidence. Teachers’ needs must be at the forefront of a new era of EdTech innovation that can streamline lesson reviews, knowledge sharing and remote access, enabling long-distance mentoring where previously it wasn’t possible. Innovation can be the driver that steers the education industry into a new age of teacher satisfaction. 
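As a simple illustration of the kind of analysis involved, the sketch below scores a handful of made-up interaction transcripts for signs of frustration and tallies trending topics. It is only a toy: real contact-centre analytics rely on speech-to-text, natural language processing and sentiment models rather than a hard-coded keyword list, and the word lists and transcripts here are purely hypothetical.

```python
# A deliberately simple sketch of interaction analytics: scoring free-text customer
# feedback and surfacing trending complaint topics. The keyword lists and transcripts
# below are hypothetical placeholders, not part of any vendor's product.
from collections import Counter

NEGATIVE = {"frustrated", "slow", "waiting", "cancel", "broken", "complaint", "refund"}
TOPICS = {"billing": {"invoice", "charge", "refund"}, "delivery": {"late", "waiting", "courier"}}

def score_interaction(text: str) -> dict:
    """Count negative keywords and tag the topics mentioned in one transcript."""
    words = set(text.lower().split())
    return {
        "negative_hits": len(words & NEGATIVE),
        "topics": [topic for topic, keywords in TOPICS.items() if words & keywords],
    }

transcripts = [
    "I have been waiting a week for my refund and I am frustrated",
    "Great service, the agent fixed my invoice query straight away",
    "The courier was late again, I want to cancel",
]

flagged = [score_interaction(t) for t in transcripts]
topic_trend = Counter(topic for result in flagged for topic in result["topics"])
print("Interactions needing follow-up:", sum(1 for r in flagged if r["negative_hits"] >= 2))
print("Trending topics:", topic_trend.most_common())
```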
Technological development will allow teachers to practice more effective and efficient development programs. Teachers can be expected to complete all their standard job functions, while also learning from their own practices at the same time, which is not possible most of the time. Learning from yourself In almost every other profession, you are expected to learn from previous experiences. Why should this change for the people who are educating the next generation? We must use EdTech to ensure that teachers are able to analyse and proactively use evidence to support their teaching methods. There are companies within the education technology sector using video footage and experienced teachers to mentor NQTs on their style of teaching, showing them where to look rather than what to see by allowing them to review their own lessons and the teaching performance, as well as the impact on students within the classroom - including those whom the teacher was not observing at the time. Part of the focus of this technology is to target teachers’ Continuing Professional Development (CPD). This approach is important as it allows the NQT to steer their development to their exact needs, and allows mentors to guide where they need to analyse and recognise shortcomings in their own teaching practice. This analysis and coaching is used across numerous industries but is especially prevalent in retail and sports. This is how businesses continue to work and perform at an optimum standard, whilst ensuring those who are motivated to improve and better themselves are provided innovative technology as the vehicle to do so. As the education industry looks to put teachers at the forefront of its development, it is important that these measures are taken to proactively coach those within the first five years of their career. However, in addition to Damian Hinds’ statement that NQT’s within the first five years of their career, should be targeted for this sort of coaching, such advancements in technology should not only be made to cater to them. For NQTs, the presence of these technologies will be something they have trained with, however, for seasoned professionals, there will be a slight learning curve. There is a chance that those who have had lengthy careers without the influence of such tech might become stuck in their ways and may need to see for themselves how such technologies can be used empathetically rather than critically to allow them to improve themselves, rather than be dictated to. The use of this technology can be a benefit to both the teachers and the ones judging them. Outdated practices such as teacher reviews can be modernised and streamlined through the use of video-led lesson observations. This significantly streamlines the process, and is in many cases, far more reflective of a teacher’s ability to control a classroom. In essence, if the education industry sets upon a path to build innovate around the needs of teachers, the standard of education for both children and teachers can only improve. This investment in teachers will help to attract the talent that the education industry requires, and push both primary and secondary teaching back to the top of the ‘desirable jobs’ list.[/vc_column_text][/vc_column][/vc_row] ### Upgrading data centers to cloud scale efficiency – hype versus fact [vc_row][vc_column][vc_column_text]“Software-defined” famously made networking sexy, and network efficiency is the New Black. But is anybody wearing it? 
- If you want to cut through hype and rumour to find out what is really happening, you ask the people at the coal face. That is just what the latest Futuriom report – Untold Secrets of the Efficient Data Center – sponsored by Mellanox Technologies, has done. Over 200 director level or higher data centre professionals were screened by country and company size to dig deeper into actual working practice and the key trends. “The data centre is being reinvented” according to Scott Raynovich, Chief Analyst, Futuriom. “It’s a real challenge to build a cloud infrastructure that can scale to support demanding applications that can embrace big data, analytics, self-driving cars, and artificial intelligence. The very techniques developed by hyperscale cloud giants are now migrating to the enterprise, where distributed applications now rule. There’s more pressure than ever for networks to perform, and new technologies are beginning to be deployed to make sure that networks don’t become the bottleneck for the cloud. This report provides the most detailed insight into why this matters, and how key players are re-shaping the road map.” Background The report summarizes the results of a survey taken in Q1 2019 by Futuriom, and an independent cloud-based data partner. The respondents included 116 from the US, 52 from China, and 50 from the UK – to provide an international overview based on regions where data-centre infrastructure is being deployed aggressively. By industry, the distribution covered the cloud (49%), telecommunications (26%), and enterprise IT domains (25%). All were screened for IT expertise, with 25% falling into the CxO or SVP category. Roles included: enterprise IT managers (39%), cloud architect/managers (32%), applications development (26%), security (24%), and network manager/architect (22%). The survey was limited to companies with more than 500 employees as follows: 34% of 501-1,000; 41% of 1,001-5,000; 14% of 5001-10,000; and 11% more than 10,000 employees. The key role of the network So how is the data centre to be upgraded? Asked to: Rank The Following Technological Responses to Improving Data Center Performance, the highest average ranking goes to: Improve the efficiency of networks using techniques such as processor offload and SmartNICs, whereas the lowest ranking goes to: Deploy more servers. This theme emerged clearly throughout the survey: that the network is seen as a key engine of performance to the cloud, and it needs specific adaptations to keep up with data centres that have ambitions to be cloud-scale. And the potential benefits expected from these network upgrades include: faster application performance (64%), stronger security (59%), greater flexibility (57%), and application reliability (57%). Overall 84% of respondents thought network infrastructure was either “very important” or “important” to delivering applications such as artificial intelligence and machine learning. The choice of SmartNICs is interesting as it is a relatively new solution (only 10% confess to not knowing what a SmartNIC is). SmartNICs are Network Interface Cards (NICs) with built-in processors and intelligent accelerators using standard APIs. They can be C-programmed to do anything from optimizing traffic flows to recognising and quarantining malicious data before it reaches a server. This takes an enormous load off the servers that the network connects. 
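To get a feel for why that offload matters, here is a back-of-the-envelope sketch. The overhead percentages are assumptions chosen for illustration only, not figures from the Futuriom report, but they show how handing infrastructure tasks to a SmartNIC returns CPU cores to applications.

```python
# A rough illustration of the offload argument. The percentages are hypothetical
# assumptions: they simply show how moving packet processing, encryption and storage
# virtualisation onto a SmartNIC can hand CPU cores back to applications.

CORES_PER_SERVER = 32
# Assumed share of host CPU cycles consumed by infrastructure tasks when not offloaded.
overhead = {"packet processing / SDN": 0.12, "encryption": 0.08, "storage virtualisation": 0.10}

offloadable = sum(overhead.values())          # fraction of host CPU a SmartNIC could absorb
freed_cores = CORES_PER_SERVER * offloadable
print(f"Assumed infrastructure overhead on the host: {offloadable:.0%}")
print(f"Cores handed back to applications per server: {freed_cores:.1f}")
print(f"Equivalent extra servers per 100-node cluster: {freed_cores * 100 / CORES_PER_SERVER:.1f}")
```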
Without them, tasks such as Remote Direct Memory Access, Non-Volatile Memory Express over Fabrics (NVMe-oF), compression, encryption, and network virtualization place a constant demand on the server cores, and this reduces the processing power left to support applications. More advanced SmartNICs can even virtualize networked storage to simplify provisioning to both virtual and bare metal servers. So, basically, SmartNICs create a “Smart Network” that manages itself and takes a big load off the servers, freeing them up to provide optimal application support. Asked: Which of the following use cases for SmartNICs appeal to your IT organization?, the overall winner is: Improve efficiency of VMs and/or containers (56%); second is: Virtualize and share flash storage to use it more efficiently (55%). Other choices are: Enable more software-defined networking (54%); Accelerate hyperconverged infrastructure (50%) and Isolate and stop security threats (47%). There are interesting differences across the three regions polled. For the two most popular use cases – improving the efficiency of VMs and containers and virtualizing and sharing flash storage – the responses from China are higher at 65.38% and 75%, respectively. These compare with 55% and 51% in the US, while the UK showed lower levels of interest for all use cases. The future of Moore’s Law This overall emphasis on increasing efficiency, as a better way to improve data centre performance than simply adding greater processing power, might simply be a response to the supposed death of Moore’s Law. For decades the IT industry has lived with the comfortable knowledge that processing power would increase and become cheaper year on year. If that is no longer perceived to be true, it might explain this shift from adding hardware to increasing efficiency. But when asked: Do you believe that Moore’s Law is decelerating, disappearing or changing how you depend on chip development cycles?, the most common response (38%) was: Moore’s Law will continue to be around for the foreseeable future, with only 12% saying: Moore’s Law is starting to decelerate or subside. So it does appear that the quest for efficiency marks a positive decision rather than a reaction to reducing opportunity. Again there were interesting regional variations, with China much more confident about the future of Moore’s Law than the US and UK. Conclusion This survey takes a detailed look at the data centre environment and concludes that data centre professionals see the need for new solutions to optimize their operations and efficiency. They want to avoid adding more expensive servers, and they see how virtualization and network optimization technologies offer the best way to achieve their goals. To that end, they recognise network optimization and SmartNIC technologies as the most realistic way for the average enterprise to upgrade existing data centres to achieve those hyperscale efficiencies.[/vc_column_text][/vc_column][/vc_row] ### AI, 5G, trust and much more work to do - five key take outs from this year’s RSA conference [vc_row][vc_column][vc_column_text]Year after year, technology and security professionals flock to the RSA Conference in San Francisco. It is the annual opportunity for security professionals to tackle the biggest issues and trends impacting the industry. 
From talking to customers about 5G and what that means for security strategies to listening to brands, such as Uber discuss the future of data protection, here are the five key trends which underpinned this year’s event. 1. It’s all about trust Rohit Ghai, president of RSA, spoke about the future of trust and remarked: “Trust is to the economy, what water is to life”. A powerful statement but one that set the tone for the conference. Large scale breaches are gradually chipping away at consumer trust and businesses need to quickly make changes to navigate the complex nature of getting data privacy and security right. This is something Uber knows better than most, experiencing a breach of 57 million users’ data in recent years. Uber’s CPO, Ruby Zefo, joined a panel on the future of data protection, exploring the importance of protecting PII and defining the difference between who owns and controls data in a bid to show the company is making changes. Uber is not the only business that knows the cost of losing customer confidence and trust - and so speakers around the conference were quick to press how they were putting the right security and privacy procedures in place.    2. AI needs humans to solve problems  According to Cybersecurity Ventures, the cost of cybercrime is predicted to be $6 trillion a year by 2021. This is a huge business and hackers are constantly looking for, and finding, new ways to make money. To counter this, businesses are investing in technologies which strengthen their defences, but this can also fuel attacks. It’s a topic that Steve Grobman, Senior VP and CTO of McAfee, took on at the event, stating that “AI creates as many challenges as it solves”, he also mentioned that people and machines need to work together. While it is imperative to have the right technologies in place to protect data, a robust team that understands the potential vulnerabilities across the organisation will help ensure no stone is left unturned.   3. 5G and the need for transparency Given the imminent rollout of 5G, there was some debate as to what this means from a security perspective. 5G in many ways is great news, providing superfast, broadband-like download speeds, meaning that digital experiences will be more streamlined - encouraging consumers to shop and consume content anywhere, at any time. But with consumers becoming more connected, comes more touch points where data owners could be vulnerable to a breach. Robert Joyce, Advisor for cyber strategy at the US National Security Agency tackled what 5G means for organisations, explaining that transparency with consumers is essential in this new era. There will inevitably be challenges with 5G and security - therefore ensuring there is an open dialogue with consumers is key.   4. More education is needed to improve cybersecurity The need to improve education and understanding of cybersecurity was highlighted by General Paul Nakasone, United States Army, Commander, United States Cyber Command, National Security Agency (NSA). Nakasone compared the need to improve the level of cybersecurity education to the equivalent importance of the 1960’s space race - a big claim. However, other speakers echoed this need for education and upskilling of teams too. This education isn’t just important for IT teams and security professionals, however. There is a need for those at board-level and in the c-suite to truly understand where their company has weaknesses. 
Too many businesses are unaware of their greatest vulnerabilities, which is why we see headline after headline informing consumers of data breaches. Ultimately there should be more investment in people and teams to help this process - this is something the c-suite should have firmly on their radar in 2019.   5. Rethinking efficient third-party management Data breaches are often due to a weak link via a third party. For example, website security breaches can originate from third-party marketing technologies which a business does not control. Some of which companies might not even be aware of. This issue is something that sparked debate in a panel on ‘Rethinking efficient third-party risk management’. Todd Inskeep, principal, cybersecurity strategy, Booz Allen Hamilton, made the point that there are two types of third-party risks: “what you can control, and what you can’t, and the only thing you can control is contract liability.” New risks are always occurring and clear agreements must be in place. But, organisations can and must take steps to mitigate risk and take ownership of their website supply chain. For instance, most website data breaches are preventable by implementing the right technology. Moreover, if businesses do their due diligence company-wide and identify where possible risks are in the first place, this will help prevent breaches and give more control to the organisation. This is a huge and often thankless task but there are partners out there to help businesses build strategies and implement technology to mitigate risk. There’s no denying that our industry has come a long way in the last few years but there is much more work to be done. See you next year![/vc_column_text][/vc_column][/vc_row] ### Wearable AI Gadget | Is telepathy a step too far? The process between thought and action is not often one we think about, but an intriguing gadget could fast track that process by reading our thoughts and acting on them without us having it. The device has been developed at the Massachusetts Institute of Technology  (MIT) by a team of researchers. Back in April, a promotional video was uploaded to YouTube to demonstrate the possibilities of the device. The video featured MIT research assistant Arnav Kapur walking around the campus with a white piece of plastic attached to the right side of his face. Words appeared on the screen to represent Kapur’s internal thoughts; showing how the device could register and process these thoughts. For example, when Kapur thought, “Time?”, a voice replied, “10:35 a.m”. Kapur originally commenced his time as a researcher at MIT after moving from New Delhi in 2016. He arrived at MIT to build wearable devices that would be able to incorporate technology into our everyday lives (at all hours of the day). The gadget is known as AlterEgo and is 3D-printed with electromagnetic sensors. The device also connects to Bluetooth to in order to access the internet for data collection. It has allegedly become so effective at reading Kapur’s mind that it is able to place a request for an Uber ride without the need for verbal or typed instruction. He explained that the aim of the product is to get as close to capturing exact thoughts as is possible. It is astounding such a device can obtain thoughts from a person without it being implanted within the brain itself, but eventually Kapur and his fellow researchers plan to make the device entirely unnoticeable. 
Understandably, many people will find this merging of human and technology quite alarming, but Kapur embraces this synergy, and has said, “I think the future of human society is about us collaborating with machines”. Public fear would likely revolve around the loss of human control and the potential for hacking and privacy violations. If a gadget is able to read the human mind, shouldn’t humans be able to read each other’s minds? Of course, the sensor technology gives the device an upper hand in mind reading but considering this device was developed by humans, humans could use the same technology to access our inner thoughts. Hacking mobile phones and computers is a common and widely understood issue, but the potential for the mind to be ‘hacked’ via an IoT mind reading device such as the AlterEgo is a truly terrifying one. The intention behind AlterEgo is clearly no more than a desire to aid the communication between human and AI (allowing people to truly harness the power of the internet), but, as with any new technology, it is at risk of falling into the wrong possession. My mind wanders to Doctor Who when considering the dystopian possibilities of mind-reading technology- I can picture a scene in which hoards of people wander in unison with earpieces controlling them. The particular episode I am referring to is called ‘Rise of the Cybermen’, which is incredibly relevant to Kapur’s plans for AlterEgo and his perception of man and machine combining. He describes the use of mind-reading and our integration with AI as, “how we’re going to live our lives”. He sees it as an inevitability and clearly plans to embrace it. Although technology resembling AlterEgo may cross privacy lines in the future, Kapur and his MIT colleagues have ensured that the device is user-control based. The device will only identify thoughts when you want it to. The idea is that the user must want to communicate with the ‘computer brain’ or AI in order to interact with it. Professor Pattie Maes, an A.I. expert worked with Kapur and his team, and both are very aware of the ethical considerations necessary. Although calling the device’s ability ‘mind-reading’ is a way of explaining its function, according to Kapur, the device can’t really read your mind and he assures that there is no possibility of it being able to do so in the future. Without our thoughts being in direct communication with the device (requiring our choice and consent), AlterEgo is not able to access our thoughts. The gadget was not created to shock the public by reading our mind without us knowing, it was purely designed to be convenient for its user. For example, to interact with your virtual assistant bot you must verbally communicate with it, but the MIT creation takes away the necessity to verbalise commands, questions or thoughts. When in a location that requires silence, i.e., a library or lecture hall, you can still interact with the device without causing a disturbance. It is not difficult to see the potential for this device and the value it could have in several years. AlterEgo could find extensive popularity amongst large tech companies already embracing AI and its application in business, such as Amazon. For the time being Kapur and his creation remain at MIT, but who knows where this invention will end up?   ### The nuts and bolts of online gaming [vc_row][vc_column][vc_column_text]It’s a gamer’s worst nightmare. The shot is lined up, the monster’s treasure is calling, and then the action freezes. 
Kicked to the lobby or forced to reboot, the gamer loses a hard-played shot at points and booty. The prospect of such negative experiences also keeps gaming developers and infrastructure planners up at night. Competition among massively multiplayer online role-playing games (MMORPGs)—from World of Warcraft to Fortnite—means unreliable server connectivity or latency-plagued action can easily spell doom. It destroys the immersive experience and directly affects gaming companies’ bottom line. And today’s gaming market isn’t child’s play. It’s big business – over £98 billion, with more than 2.5 billion video gamers of all ages worldwide. Enthusiasts run the gamut from mobile to console to PC, with the latter still comprising 57% of global revenues. Even Google is joining the movement, with 2019 plans to launch a new digital gaming platform, Stadia, which will stream games that used to rely on discs or downloads. Demands on gaming infrastructure can be intense. In 2018, Playerunknown Battlegrounds (PUBG)—touted as a Fortnite alternative—hit 550,000 players hourly on Steam, the online-only game store. Another MMORPG, Defense of the Ancients 2 (DOTA 2), averaged 532,000 players hourly. Peak player numbers frequently reach into the millions, and delivering realistic action at these volumes is a challenge which companies use various strategies to overcome. Gaming infrastructure design Most online games use a client-server model. The client running on the user’s computer, console, or smartphone includes the playing board and user’s viewpoint, based on algorithms that dictate the representations of the game world and action that the gamer sees and hears. On the server side, the multiplayer universe is constructed, incorporating the various connected clients into an accurate depiction for all players. Like most other client-server applications, MMORPGs rely on multiple servers to spread out workloads and, hopefully, provide adequate redundancy to minimise freezes and other issues. The functions are generally divided among distinct server clusters, each dedicated to a particular purpose. These include providing login capabilities, supplying access to account information and character statistics and gear, and representing the game’s physical environment. It can also include performing the calculations related to players’ actions and the associated in-world physics, delivering chats, or handling players’ voice traffic. There are usually patch servers as well, because software updates to eliminate outdated code are frequent. Most information about gaming companies’ infrastructure is closely guarded. However, a 2006 disclosure to the U.S. Securities and Exchange Commission, Vivendi, revealed that running World of Warcraft required about 9,000 servers worldwide. Of course, demand for more features and ever more realistic experiences has only increased hardware requirements in the intervening decades. Sources of problems Most gamers understand that their home system must make the grade or their experience will suffer. Games are among the most demanding software out there, and to play the new releases, it’s often essential to invest in a machine with a fast processor and top-end graphics card. Internet connection speeds are also important. Although games tend not to have high-bandwidth requirements, bursts of small packets must arrive rapidly to ensure a realistic experience. 
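To see why latency, rather than raw bandwidth, is the figure that matters, a quick sketch along the following lines can time the round trip of a small connection handshake. The host name is a placeholder and a TCP connect is only a rough stand-in for in-game packet exchange, but it illustrates the measurement.

```python
# A small sketch of the latency point above: what matters to a gamer is round-trip time
# for small packets, not bandwidth. This times a TCP handshake to a host a few times;
# "example-game-server.net" is a placeholder, so substitute a reachable endpoint.
import socket
import statistics
import time

def connect_rtt_ms(host: str, port: int = 443, samples: int = 5) -> list[float]:
    """Time how long a TCP handshake takes - a rough stand-in for gameplay latency."""
    results = []
    for _ in range(samples):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=2):
            pass
        results.append((time.perf_counter() - start) * 1000)
    return results

if __name__ == "__main__":
    rtts = connect_rtt_ms("example-game-server.net")
    print(f"min {min(rtts):.1f} ms, median {statistics.median(rtts):.1f} ms, max {max(rtts):.1f} ms")
```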
Assuming the client-side hardware and broadband performance meets or exceeds a particular game’s requirements, there are other reasons behind why play can freeze or lag. For example, local internet or data centre outages are always a possibility. Moreover, viral popularity is a real challenge. A largely unknown game with minimal web infrastructure can become wildly successful overnight. Gaming companies must determine where and how to invest up front, as failing in that leap to stardom can easily turn players off and kill the surge. Fortnite, with its 125 million players and up to 3.2 million concurrent sessions, relies on hyperscale cloud services to deal with volume spikes and provide robust, global infrastructure. In fact, turning to Amazon Web Services (AWS) is becoming a standard tactic, in part because the talent required to run gaming-capable infrastructure is in short supply. Even with such a partner, however, Fortnite has experienced multiple outages. Some have derived from distributed denial of service (DDoS) attacks launched at AWS systems, while others were caused by overall popularity taxing technical limits in certain regions. To be sure, hackers’ DDoS attacks have historically been a core problem for games, more so than viruses. Minor interruptions that wouldn’t fundamentally alter the Instagram experience, for example, are deadly in an MMORPG world, given the low-latency demands. Another fundamental issue is code. Updates are frequent, as games offer a dynamic experience and need to eliminate redundant code. But like any other continuous deployment scenario, bugs are a risk, and strong testing and change management protocols are required to limit problems in production environments. Keys to avoiding game outages and freezes There are few industries placing as high a demand on infrastructure as MMORPGs, so it’s essential to follow best practices regarding data centre design and maintenance. Less intensive mobile games are more likely to be run on wholly owned equipment, but the high usage rates—reaching some 70% of the population—makes it impossible to skimp on infrastructure. Gaming enterprises operating their own servers will want high redundancy and, for MMORPGS and other action-style games, geographically dispersed facilities to minimise lag. Data centre managers will generally need a solution to track an immense quantity of IT hardware inventory and assets, including the length of time each piece of equipment has been active. This can help identify when a given server or other system may be at risk of failure, as well as list the support coverage and provider information to enable rapid response in case of failure. Properly planned architecture will help to ensure that a single server outage does not result in system-wide downtime, but it’s nonetheless best to make repairs with utmost speed. Software updates and other basics of data centre maintenance will also take on great importance in a gaming scenario. Security-related patches from the original equipment manufacturer, for instance, should be monitored and installed. Scheduling such updates, as well as hardware replacement and upgrade activity for off-peak gaming periods, will reduce or eliminate impacts on players. Beyond high-level data centre management and maintenance, various gaming companies are exploring additional measures. For example: Breaking game delivery into zones is now common and can reduce lag and provide redundancy. 
The prominent gaming companies understand peak player times, and zones will “follow the sun” around the globe as players in different regions log on in droves. The strategy helps gaming enterprises ensure speedy interactions via geographically local servers. Should any zone experience problems, such as internet outages, other nearby zones can typically fill in. Some MMORPGs, among them PUBG, are matching players with similar ping rates, so action appears consistent because players’ internet speeds are. This can help provide better experiences in rural areas and less developed regions and ensure no one’s play is “brought down” by someone else’s slow connection. Many gaming companies are tapping the cloud, whether as a primary provider of server capacity, or as fill-in infrastructure in zones where they would have difficulty operating their own equipment. They may also do so to help cover bursts of activity that challenge the company’s on-premises or colocated infrastructure and threaten the gaming experience. Crowdsourcing beta releases is a viable option for field testing new code. The gaming industry enjoys a core audience of exceptionally enthusiastic customers. Much like the Windows 10 Insider group, it can exchange early access to features or other offerings for honest input and bug testing. Posting bug bounties, where active users are rewarded for identifying issues in the gaming experience, is another option for leveraging the gamer base in a crowdsourcing capacity. Even for non-gamers, the challenges and solutions developed by gaming companies are important to monitor. We are rapidly approaching an age of augmented and artificial reality applications, which will take a gaming-style experience into consumers’ daily lives, for everything from shopping to news consumption. The technology industry has a great deal to learn from gaming companies’ success in achieving low-latency, resilient and immersive experiences capable of delivering hours on end of enjoyment, so we should all continue to pay attention and follow suit as their best practices develop.[/vc_column_text][/vc_column][/vc_row] ### Unit4 acquires Intuo; Extends HCM offering to better support modern workforces in the services economy [vc_row][vc_column][vc_column_text]People, Finance, Project and Planning software portfolio in the cloud for mid-market organisations Utrecht, Netherlands & Ghent, Belgium, 26th March 2019 – Unit4, a global leader in enterprise applications for service organisations, has acquired Intuo, a fast-growing provider of Talent Enablement solutions nominated as a Deloitte Fast 50 Rising Star in 2016. The acquisition combines Unit4’s experience delivering rich transactional HR, payroll and vertical specific applications to services industries, with Intuo’s innovative technology for strategic HR. Unit4 now offers an extensive HCM portfolio covering People, Finance, Project and Planning in the cloud for mid-market organisations. Intuo offers an integrated talent enablement platform that provides an alternative to the traditional yearly evaluation cycle and engagement survey. It meets the performance management needs of a modern, non-traditional digital workforce that is project/task driven and works to a flexible schedule, characterised by a lack of hierarchy, multiple reporting lines, multi-employment etc. 
Business leaders can easily plan and develop talent through continuous conversations and in the moment performance management which leads to a more accurate and helpful understanding of an employee's performance. Continuous people appraisal and feedback loops make work more meaningful, keeping multi-generational teams engaged and helping people grow. This new approach to talent enablement combined with Unit4’s experience delivering rich transactional HR and vertical specific front-office applications to services industries, meets today’s requirements for HR in the flow of work. Intuo ensures peer to peer performance conversations, with an emphasis on career ownership. Key capabilities delivered by Intuo include: 360 feedback and recognition Frequent engagement pulses and surveys Agile and transparent objectives Learning management Guided and customisable check-in conversations Dashboarding and advanced behavioural analytics Unit4 and Intuo share a focus on delivering powerful solutions that support people to optimise productivity. The acquisition extends Unit4’s Human Capital Management (HCM) portfolio so customers can benefit from the latest approaches to talent enablement with data richness from Unit4’s Business World ERP solution as the system of record. Intuo has enjoyed rapid growth since it was established in 2013, more than doubling the business year on year with customers including USG, Hays, BMW, Verisure, Brussels Airport, Thomas Cook, FairFX, Ag Insurance and Europa bank. It brings an innovative start-up mindset to Enterprise HCM that meets the requirements for today’s diverse workforce. The solution will be available both standalone and integrated into Unit4’s People Platform technology foundation this year. Integration with Unit4 Business World, Financials, PSA Suite and Prevero performance management will bring to market a modern, extensive suite uniquely focused on the mid-market. “The way people want to work has changed, and service-sector organisations must respond if they want to remain competitive and attract the best talent,” said Stephan Sieber, CEO of Unit4. “One of the biggest challenges to large organisations is to engage with their workers. New approaches to strategic HR are required to cater for talent diversity and multi-generational teams, as well as remote and contingent employees. These new people challenges provide an opportunity for those that get it right. Adding Intuo extends our HCM offering so we can meet those needs with modern, future-proof solutions and rich data that enables HR and analytics in the flow of work.” “The combination of the two companies will deliver market leading core HR and strategic talent enablement,” said Tim Clauwaert, Intuo’s Co-Founder and CEO. “HCM is a large and fast-growing market and changing HR requirements are reinforcing people centricity and people productivity. Our approach creates a more connected organisation where people interact and coach each other all the time. It fosters faster learning and people development because people know in the moment what they have done well or how they can improve. People work better when they are recognised for great work and there are clear expectations. Organisations want and need to improve employee experience to attract and retain the best talent while learning from accurate performance insight. Unit4 shares our passion for people and we’re excited to bring our combined expertise to market.” Intuo is a privately held company headquartered in Belgium. 
Tim Clauwaert will continue to lead Intuo and drive its international growth as part of Unit4. Unit4’s initial go to market focus will be on the UK and Benelux, with planned investment in feature development, integrations and international expansion. Holger Mueller, Principal Analyst & VP, Constellation Research, Inc: “The area of talent management, from engaging employees and understanding workforce attitudes, to helping managers to raise engagement, is becoming critical. Performance management has been broken for a long time. Without an integrated HCM strategy with modern talent management, it’s impossible to know if you’re hiring the right people, if learning is effective, if performance and productivity deserve rewarding... It’s a challenge that needs solving right now in enterprises.” A video of Unit4 CEO, Stephan Sieber, and Intuo CEO, Tim Clauwaert is available here[/vc_column_text][/vc_column][/vc_row] ### The problem with print servers: using the cloud to improve security and compliance [vc_row][vc_column][vc_column_text]In recent years, growing data risks and tightened legislation has highlighted the need for greater security and compliance across the enterprise, especially in industries that keep sensitive records about health or other private details. Securing the network is obviously a priority so it’s vital to be aware that everything and anything connected to the network presents a potential vulnerability. With a survey by analyst firm, Quocirca, revealing that 59 percent of businesses report a print-related data loss in the past year, it must be recognised that an unsecured print environment, comprising printers and multifunction printers (MFPs), can create an open target for would-be hackers and intruders trying to access the corporate network. The problem with print servers While most organisations recognise that securing the print environment is of vital importance, many focus solely on the device itself and fail to address other infrastructure elements that create far more risk than one might expect. This is because in most enterprises, print environments are more complex than they were just a few years ago. Digital demands mean today’s print environments are made up of multiple devices connected to numerous servers dispersed throughout an organisation. The scattered, disconnected nature of this typical infrastructure creates challenges for IT and risk for the organisation. Required technology updates are likely to be out of sync with IT security policies. Servers may be physically located in an unsecure location such as a manager’s office in a retail store or an unlocked closet in a bank branch. In some cases, servers are simply forgotten. Security and compliance challenges The increased complexity of print environments due to the disparate and scattered character of multiple print servers, makes them difficult to maintain and manage, resulting in the devices and the information they output becoming vulnerable to security breaches and potential non-compliance. Printer fleets comprising multivendor devices is also a factor. A 2017 Quocirca survey found that while 67 percent of respondents operating a multivendor fleet reported at least one data loss, this dropped to 41 percent for those that were operating a standardised fleet. When devices miss their technology updates they become an open target for would-be hackers and intruders. Where access controls are lax or non-existent, unclaimed printouts sitting on output devices also become a security issue. 
Confidential documents piling up in your office print centers and copy rooms can be easily misplaced or land in the wrong hands, putting an organisation in breach of compliance regulations. The answer’s in the cloud The solution to removing these frustrating complexities and security upkeep requirements, is to move to a print solution designed for the digital age. At Lexmark, we recently launched a cloud-based print management solution to enable stronger access controls, security and compliance for our end customers. This is based on the premise that with device management located in the cloud, technology updates and monitoring become easy. When there is no longer a requirement to be on site to perform updates and analyse device usage, the responsibility for these vital tasks can be outsourced to a trusted technology partner who can ensure that every single printer has the latest security updates in a timely fashion. Printing from the cloud also helps ensure user and business security by preventing unwanted or forgotten pages from sitting at the printer unattended. We all know that sometimes it’s difficult to retrieve documents immediately after they have been printed out and that this can be a security risk if the documents contain sensitive information. With cloud-enabled print release, it becomes possible to send print jobs from any device, in any location, to the cloud. The printouts can then be retrieved at a chosen printer at a specific time, significantly reducing the chances of someone seeing it, or taking it, when they are not meant to. Best of both worlds Within an enterprise, certain groups or departments may prefer to keep documents on their own side of the firewall. In this instance, a hybrid cloud configuration provides a flexible solution which combines cloud-based print management with networked servers to suit different user groups and employees. This hybrid option enables control of exactly how, when and where cloud capabilities are used so that you benefit from the best of both worlds. In addition to improved security and compliance, cloud-based print management services can generate powerful, aggregated analytics when implemented across an enterprise. This gives IT managers the insight to uncover potential security gaps and find areas that could be operating more efficiently. Armed with this ‘intel’, better decisions can be made such on which devices to place where to optimise the print environment. The sky’s the limit Even if you have the staff and budget to devote to it, the process of maintaining, supporting and securing a print and document management infrastructure can be complicated and expensive. While printers and the cloud may seem like an unlikely partnership, cloud-enabled print eliminates the complexities of maintaining a server-based environment. Using secure cloud technology, cloud print management delivers a simplified, streamlined print infrastructure that’s more secure and easier to manage. For us at Lexmark, the business case is clear - becoming cloud enabled will arm your print infrastructure, processes and users with the means to handle today’s challenges while proactively preparing for tomorrow’s complexities.[/vc_column_text][/vc_column][/vc_row] ### Let chaos reign and it will: why you need a tight grip on multi-cloud Once, organisations were tentative about adopting cloud, now they are worrying about managing too many. Analyst IDC[i] predicts that by 2020 over 90 percent of enterprises will use multi-cloud services and platforms. 
But before organisations get to this stage they will need to create a structured and consistent framework to manage them. The use of multi-cloud is a growing trend across organisations that have actively adopted public cloud, according to IDC. This is being driven by the need to use differentiated services from different providers and wanting to minimise lock-in to a single platform. Multi-cloud enables enterprises to mitigate risks and leverage the strengths of different cloud providers to improve performance optimisation and improve reliability. But multi-cloud can easily turn into an uncontrolled proliferation of clouds, also called cloud sprawl, if enterprises don’t carefully plan cloud resource management and integration of resources across cloud platforms. Smart organisations are already busy mapping out their multi-cloud strategies. But many are finding the benchmarking, planning and resource allocation processes complex. It is impossible, for example, to simply shift legacy systems and processes into multi-cloud and expect them to work. To tap into the power of multi-cloud, organisations must identify which clouds are most suitable to support each workload. The top challenges facing multi-cloud and how to address them: In our highly competitive digital economy it is paramount that enterprises can run workloads in the most appropriate environments to increase productivity and take advantage of any cost savings. Multi-cloud is one answer. But multi-cloud doesn’t come without its hurdles. Here are five challenges and tips to overcome them to speed up multi-cloud adoption. Compliance For most organisations, compliance requirements such as the General Data Protection Regulation (GDPR) apply to some or all of their data. Some address data jurisdiction issues via multiple clouds and data centers in different geographical regions. But with a plethora of applications it is essential IT departments know what is running where and have the correct tools in place for visibility and monitoring. Policies defined at the IT level to maintain organisational standards are essential. These need to be agile to address dynamic business processes and introduce restrictions or standards quickly across cloud platforms. In larger organisations, a central policy manager is a necessity to maintain compliance across multi-cloud platforms and services. Finally, it goes without saying that organisations need to verify providers can meet their compliance requirements before introducing them to their multi-cloud infrastructure. Security No discussion on multi-cloud can pass without mentioning security. From a positive perspective, a well thought out multi-cloud strategy can help to neutralise the possibility of cyber-attacks because an organisation’s entire cloud services don’t typically sit with one single provider. But as long as cloud data access points operate through the public internet, the risk of threats is still there. With applications spread across different providers they can be difficult to monitor. Some organisations are turning to data encryption and identity key management services to shore up their defenses in a multi-cloud world. Others are opting for a zero-trust model backed up by next-generation tools. Ultimately the only way to secure a multi-cloud environment is through a robust multi-layered approach. 
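One layer of that approach is the kind of central, IT-level policy check described under compliance above. The sketch below shows, in minimal form, what such a check might look like: scanning a resource inventory for personal data sitting outside approved regions. The inventory structure, region list and field names are invented for illustration; a real tool would pull resource metadata and tags from each provider's own APIs.

```python
# Minimal sketch of a central, IT-level data-residency check across clouds.
# The inventory, allowed-region list and field names are purely illustrative;
# a real tool would pull resource metadata from each provider's own APIs.

ALLOWED_REGIONS = {"eu-west-1", "eu-west-2", "europe-west1", "westeurope"}

inventory = [
    {"provider": "aws",   "resource": "s3://customer-archive", "region": "eu-west-1",    "personal_data": True},
    {"provider": "azure", "resource": "sql/crm-db",            "region": "eastus",       "personal_data": True},
    {"provider": "gcp",   "resource": "bq/clickstream",        "region": "europe-west1", "personal_data": False},
]

def residency_violations(resources, allowed_regions):
    """Return resources holding personal data outside the approved regions."""
    return [
        r for r in resources
        if r["personal_data"] and r["region"] not in allowed_regions
    ]

if __name__ == "__main__":
    for r in residency_violations(inventory, ALLOWED_REGIONS):
        print(f"[POLICY BREACH] {r['provider']}: {r['resource']} is in {r['region']}")
```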
This may require the assistance of dedicated security partners, enabling organisations to continue to innovate and leveraging the latest technologies, whilst staying one step ahead of a hostile threat landscape. Cloud skills gap Organisations are often surprised at the technical challenges associated with deploying and running a multi-cloud environment. Multiplying clouds inevitably means increasing the skills necessary to run the entire environment efficiently.  To move forward with multi-cloud deployments, organisations require IT professionals that understand different cloud providers’ platforms and management tools. Organisations will need to be prepared to make the necessary investment in IT skills and look for a commonality in cloud platforms such as operating systems and development languages to ease the complexity. Alternatively, they will need to partner with Managed Service Providers (MSPs) who are adept at optimising and managing application workload deployments across a multi-cloud environment in geographically distributed locations, whilst providing 24/7 support. Controlling costs Effective management and cost control require a deep analysis of an organisation’s entire infrastructure, which often reveals services they are paying for but not using. A single pane of glass and dashboard allow IT departments to micro-manage and optimise performance and resources. In addition to tracking current multi-cloud spend, organisations need to project their consumption requirements in the future. Cloud governance-based tools, for example, can alert administrators when total usage for an account has gone over budget. Partnering with an MSP who can orchestrate the whole multi-cloud environment to consolidate billing from multiple vendors and provide performance and cost analysis can help to take the pressure off internally from monitoring cost control. Visibility Multi-cloud by its very nature is complex and difficult to trace. A cloud management platform (CMP) brings together disparate cloud environments through a single portal with native connections to major cloud solutions. From a security point of view, everything is under the same management and rules. A portal and service catalog provide service capabilities with regards to apps and a single pane of glass by which to monitor the entire multi-cloud environment. CMPs include tools for dedicated cost optimisation, for example, which means organisations can assess several environments such as Azure and VMWare. They can check if better optimisation could be achieved by switching clouds for certain applications or managing cloud bursting from private to public cloud when capacity spikes arise. Some solutions can expand this visibility to on-premises using the same interface. Are you right for each other right now? Multi-cloud is fast becoming the norm. But organisations need to have reached a certain level of cloud maturity before they can dive into the deep end. They must be able to define their multi-cloud journey according to their business goals by considering geographical footprint, legal constraints, security, compliancy requirements, etc. and thus identifying suitable options and expected benefits. Only then will they be able to see multi-cloud as a real game changer when it comes to a competitive edge.[/vc_column_text][/vc_column][/vc_row] ### How to make AI work for your business, today [vc_row][vc_column][vc_column_text]The productivity gains offered by Artificial Intelligence (AI) are huge. 
In fact, PwC estimates the global economic value to be upwards of $15.7 trillion over the next decade. This makes AI the biggest commercial opportunity in today’s fast-changing economy. Applied properly, AI can transform how a business operates, its products and services, and its revenue potential. Yet, many organisations have been slow to respond to the AI revolution and risk being left behind. As humans and machines collaborate more closely and AI becomes a reality of “doing business”, now is the time for adoption, no matter the size of your organisation, industry or IT budget. Don’t delay; here’s how you can make AI work for your business, today… Defining AI In the broadest definition of the term, AI is a collection of intelligent computer systems which have the ability to sense their environment and action a response, much faster than a human would be able to do (if at all), while constantly learning and refining its decision-making process. In the past, AI has been the stuff of Hollywood-inspired nightmares. In some cases, it caused the end of human-kind. While AI application is still in its infancy, its already widely considered to be a game changer, demonstrating its ability to amplify human capabilities and make decision-making faster, easier and more accurate. In truth, AI represents both a threat and an opportunity. It’s re-writing the rules across industries, making it possible for the start-ups of tomorrow to leapfrog their more established counterparts and become the market leaders before anyone’s even looked up. Laying the groundwork No sector is immune to the disruptive forces of AI. However, using the same premise, every business can capitalise on it. The big question is how to leverage it and make it work for your business. AI, when applied to business, can mean anything. However, in all cases, AI is underpinned by data…and lots of it. That’s why laying the groundwork before you invest in AI is vital. The first and perhaps the most important step on your AI journey is to make sure your business has “clean” and well-organised data structures, otherwise, any investment will be worthless. In many organisations, data is still extremely fragmented and this needs to be addressed before AI enters the equation. Duplicate and incomplete data is polluting storage, holding back progress and wasting money. Data cleansing needs to be a priority, which includes developing a data quality plan across all departments to set expectations for your data and understand the root cause of data quality errors. It’s also important to make sure data is standardised at the point of entry which can be achieved through creating a standard operating procedure (SOP) and that data is pooled across an organisation in one place to eliminate fragmentation issues. Targeting your investment Once you have a solid foundation upon which to build your AI strategy, you’ll next need to decide what you want to achieve in order to target your spending and gain maximum ROI. As with any business spend, simply throwing money at AI won’t solve any problems. At this point, you must sit down and work out what AI means for your business, seeking points of view from staff on current bottlenecks, competitive pressures and changing consumer behaviour. This will give you the insight to identify operational pain points AI could help address, today and in the future. Dependent on your sector, the areas of AI’s biggest potential will differ, as will the barriers to overcome. 
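Before turning to sector-specific examples, here is a minimal illustration of the data groundwork described above: deduplicating records and standardising values at the point of entry. It assumes pandas is available, and the column names and mapping rules are invented; a real data quality plan would be agreed across departments.

```python
# Minimal sketch of "clean, well-organised data" groundwork: deduplication and
# standardisation at the point of entry. Column names and rules are invented.
import pandas as pd

raw = pd.DataFrame({
    "customer_id": [101, 101, 102, 103],
    "email":       ["a@x.com ", "A@X.COM", None, "c@y.com"],
    "country":     ["UK", "United Kingdom", "DE", "de"],
})

COUNTRY_MAP = {"united kingdom": "GB", "uk": "GB", "de": "DE"}

def standardise(df: pd.DataFrame) -> pd.DataFrame:
    df = df.copy()
    df["email"] = df["email"].str.strip().str.lower()           # one canonical form
    df["country"] = df["country"].str.lower().map(COUNTRY_MAP)  # ISO-style codes
    df = df.dropna(subset=["email"])                            # route incomplete records for review
    return df.drop_duplicates(subset=["customer_id", "email"])  # remove duplicates

print(standardise(raw))
```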
For example, in healthcare AI is already helping to support diagnosis, appointment scheduling and identify pandemics to track and contain its spread, with the longer-term potential of “robot doctors”. However, the biggest hurdle to overcome is how AI uses and protects patient privacy and sensitive health data. Another example is in the retail sector, where AI offers the opportunity for brands and retailers to provide personalised products and experiences; keep ahead of consumer demand through predictive analytics and drive efficiencies across the supply chain. In future, AI will help the retail industry to create products that anticipate demand before customers know what they want. “To buy or to build?” The next step in your AI journey is to decide the best way to build your AI capabilities: to build or buy? The right choice will depend on your in-house capabilities and it may be a hybrid approach will work best for your business. AI talent is in high demand in the digital economy, meaning hiring inhouse AI talent is expensive. It can also take many months, if not years, to build a well-performing AI solution because a system running off billions of data points takes time to fine-tune. If you lack internal know-how and need an AI solution in the near-term, buying AI as a service from a vendor can bring immediate financial and customer satisfaction benefits via a market-ready solution that can propel your organisation ahead of the competition. The golden rule of outsourcing any kind of business function is to be clued up on the current market and value of different offerings so you can strike the best deal. If you go in blind, it’s likely you’ll pay over the odds for something you don’t necessarily need or risk being locked-in. Thinking longer-term, every business should keep open the possibility of building their own internal AI solutions, particularly when it comes to mission-critical projects. If you consider your company to have an exclusive and highly-value data set, building an in-house AI model will help to protect your competitive edge as off-the-shelf AI models are tested, trained and calibrated using your (and many other companies’) data sets. However, whether you choose to buy or build AI, it’s important businesses start small and constantly review and learn from their mistakes. In this dynamic environment, AI is evolving at a faster rate than many other commonly used business technologies and as such, strategies, training and vendor relationships need to be a constant state of flux too.[/vc_column_text][/vc_column][/vc_row] ### What exactly is cloud testing? [vc_row][vc_column][vc_column_text]In the past few years, IT has witnessed a ‘virtualisation evolution’ in the form of cloud computing. For the uninitiated, this “as a Service” model offers a total solution, delivering IT as a service where resource sharing, allocation and availability are on demand via the internet at any given time. Think about streaming services like Netflix and Amazon Prime, which pride themselves on providing a wide range of easily accessible, on-demand programmes and films. With millions of users and an enormous amount of data, the cloud is a vital tool for them to be able to provide the content consumers crave. It’s undeniable that in today’s data-driven world, everyday activities from music streaming to reading an eBook all require cloud computing in order to work effectively. Another example, but from the business world, is the ever-increasing number of apps moving to the cloud-based model. 
From Microsoft Office 365 to Salesforce, many businesses now offer the option to run their applications through a web browser. This business model reduces or eliminates upfront capital costs, provides more predictable expenditure, lowers the need for support and ensures that the app is kept up to date. But, before they make it big, how do these apps and services start out, and what processes and tests need to take place for them to go mainstream? Cloud testing: the lowdown Even the biggest and most well-known apps have humble beginnings. As with most user-facing products or tools, testing is a vital stage before such a technology can ‘go live’ or be deemed fit for public consumption. Cloud testing is the process of testing the performance, scalability and reliability of web applications in a cloud computing environment. Whether a tech company is extending an existing application or building something entirely new, cloud-based resources can save the company time and money, regardless of how the application is deployed or the size of the organisation. This method can also facilitate more effective developer collaboration than its predecessor, manual testing. Testing in the cloud is performed across three distinct areas: infrastructure, platform and service. It also takes in availability, security, performance, interoperability, disaster recovery and multi-tenancy testing. One common interpretation of “cloud testing” that many vendors adhere to is using the cloud to run or manage the tests themselves. For example, testers can use the cloud to generate massive distributed load tests, simulate a large number of mobile devices, or run functional and performance monitors from all over the world. Although these are all valuable offerings in themselves, they are not specific to testing cloud applications, so calling them “cloud testing” isn’t always strictly accurate. Benefits of cloud testing A main benefit of adopting cloud testing is giving the tester quick access to data whenever it is needed. This model also allows information to be moved to large, remotely located data centres easily and at very little expense, with the user able to access the resources 24/7. As well as being cheap and easy to create, this model makes reconfiguring and tearing down test beds painless. Alongside this, testing in the cloud is more consistent, more easily customisable and allows testers to perform more rigorous performance testing than previous methods. Finally, cloud testing reduces the direct cost of equipment maintenance and management, helps achieve rapid ROI on application assets and brings about faster time to market, which means businesses can start capitalising on the technology more quickly. A look to the future In the past few years, testers have recognised the significant role cloud testing plays in the development effort. As such, testers are seen as an integral part of a project’s different phases, from concept to launch. Using the cloud for testing helps organisations acquire the tools, software licences and infrastructure they need at very low cost, without having to set these up themselves or worry about a project’s maximum utilisation or potential. With big names from both the consumer landscape and the business world jumping on the bandwagon, the popularity of cloud testing is not set to slow down any time soon.
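Returning briefly to the distributed load-testing example mentioned above, the sketch below fires a batch of concurrent requests at a placeholder endpoint and reports latency figures, the sort of check testers might fan out from cloud instances. It uses only the Python standard library and is far simpler than purpose-built tools such as JMeter, Locust or k6; the target URL and request counts are illustrative only.

```python
# Minimal sketch of a cloud-style load test: N concurrent requests against a
# placeholder endpoint, reporting median and approximate 95th-percentile latency.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

TARGET = "https://example.com/"   # placeholder endpoint
REQUESTS = 50
CONCURRENCY = 10

def timed_request(_: int) -> float:
    start = time.perf_counter()
    with urllib.request.urlopen(TARGET, timeout=10) as resp:
        resp.read()
    return time.perf_counter() - start

if __name__ == "__main__":
    with ThreadPoolExecutor(max_workers=CONCURRENCY) as pool:
        latencies = list(pool.map(timed_request, range(REQUESTS)))
    p95 = statistics.quantiles(latencies, n=20)[18]   # 95th percentile cut point
    print(f"median {statistics.median(latencies):.3f}s  p95 {p95:.3f}s")
```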
Due to this flexibility, scalability and the reduced costs this method brings, it’s highly likely that testers will continue to see cloud testing as a future-proof option that provides a clear route to success.[/vc_column_text][/vc_column][/vc_row] ### Driving transformation with digital expense management [vc_row][vc_column][vc_column_text]For many organisations with employees travelling regularly for work purposes, it is important to carry out administrative tasks, such as logging mileage and expenses claims, efficiently and accurately. It’s no secret that paper-based processes are cumbersome and present many challenges for both employees and their managers. With digitisation sweeping many sectors, especially healthcare, there is a keen focus, now more than ever on replacing manual practises, enhancing reporting functionality and ultimately driving efficiencies through data collection. In order to better understand cost drivers and streamline the expenses process many organisations are adopting digital expenses solutions. It’s important that organisations implement systems which can be customised to reflect their expense policy and also integrated with existing Electronic Staff Record (ESR) software. Only having one system to update will be of huge benefit to administrative staff, creating greater efficiencies and time savings. As well as giving them peace of mind that any updates to employee information, such as change of address will be replicated across the board and information will always be up-to-date. Additionally, involving and working closely with other departments such as the IT team will ensure that organisations understand the requirements of their employees, changing technological landscape and ultimately implement a solution which matches the organisations current and future needs, as well as driving transformation and decision making. Training is a key component to digital success By digitising the expenses process, organisations can reap many benefits such as a reduction in average mileage costs and an understanding of how employees interpret or misuse the expenses policy. However, for a successful and smooth transition to a new system, training is a key component as well as managing staff expectations from the beginning to the end of the change process. As part of our own digitisation we implemented Selenity Expenses which also supported our flexible approach to working with its mobile capabilities. The new system also coincided with a number of other digital transformation initiatives being introduced throughout the organisation. During the implementation it was key that we ensured all users were fully informed, comfortable in what they are doing and understood how to claim for the right items and amounts. Whether it is an electronic training package or delivering face-to-face training, employees will need guidance and a little bit of support to start with. Investing time and effort in training not only ensures employees are comfortable with new processes but also makes them feel engaged. Additionally, championing the solution and marketing training packages to the wider organisation not only increases awareness of new technologies but also ensures that the solutions will be readily adopted and used. Increasing accuracy and efficiency Having a system tailored to the needs of the organisation not only raises the transparency of procedures but also reinforces company policies, leading to more accurate and consistent expense claims. 
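As a simple illustration of a system configured to reflect an organisation's expense policy, the sketch below validates a claim against an invented mileage rate and receipt threshold. The rates, thresholds and claim fields are assumptions for the example; real policies and integrations would be the organisation's own.

```python
# Minimal sketch of an expense claim checked against a configurable policy.
# The mileage rate, receipt threshold and claim fields are invented examples.
from dataclasses import dataclass

MILEAGE_RATE_GBP = 0.45        # illustrative rate per mile
RECEIPT_REQUIRED_OVER = 25.00  # illustrative receipt threshold

@dataclass
class Claim:
    category: str      # "mileage" or "expense"
    miles: float = 0.0
    amount: float = 0.0
    has_receipt: bool = False

def validate(claim: Claim) -> list[str]:
    issues = []
    if claim.category == "mileage":
        expected = round(claim.miles * MILEAGE_RATE_GBP, 2)
        if round(claim.amount, 2) != expected:
            issues.append(f"mileage amount should be {expected:.2f}")
    elif claim.amount > RECEIPT_REQUIRED_OVER and not claim.has_receipt:
        issues.append("receipt required for claims over the threshold")
    return issues

print(validate(Claim(category="mileage", miles=40, amount=20.00)))
# -> ['mileage amount should be 18.00']
```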
After going through this process ourselves, we learnt that there had previously been variation in the methods used by employees to record journeys – with some using odometer readings and others calculating distance with online route planners. Features such as policy reminders, mileage verification in the form of GPS capture and postcode to postcode look ups, all help to increase the consistency of mileage claims submitted by employees. For business administrators this means that claims are often easier to review and approve and there is much less checking and chasing for additional information. Streamlining processes such as corporate travel and expenses also provides organisations with the ability to look at how services are delivered and if there’s a way to improve them and make them more efficient. Enhancing the reporting process Digital systems provide a higher a level of transparency and many organisations can harness this detailed information and use it in a meaningful way. By capturing expense information centrally, operational teams are able to continually use real-time information as a tool to help transform the organisation. Not only helping to drive decision making but also allowing them to challenge certain claiming behaviours they may pick up on – something you wouldn’t get from a paper form. Finally, digital systems allow teams to identify patterns in the data and pass those insights onto managers and senior business figures. The transparency of digital expense systems can also help operational teams to field inquiries from the finance department and compare statistics, such as looking at how the average mileage claims compare from one service to another. Harnessing data in this way enables organisations to provide effective value for money services, ensuring that the right people, are in the right places, doing the right things at the right time.[/vc_column_text][/vc_column][/vc_row] ### How enterprises can champion agility and adaptability in today’s shifting business landscape [vc_row][vc_column][vc_column_text]In today’s shifting landscape, traditional business models are quickly becoming outdated and new models are constantly emerging, many of which focus on fluidity and flexibility. This is creating fresh challenges, conflicts, and opportunities for organisations. So, how can businesses best equip themselves to build out a flexible and adaptable model in today’s technological environment? Here are some points to consider: Support operational flexibility Perhaps the most striking feature of today’s technological landscape is its pace of change. The question every manager is concerned with is not how to adapt to change after it occurs, but how to prepare for changes on the horizon. In a rapidly changing environment such as this, the most important traits are agility and adaptability. Senior decision makers must ensure that information systems are set up to support operational flexibility. The infrastructure of an organisation's information systems should be easily scalable, allowing the business to adapt and evolve, and certainly should not stand in the way of growth or change. To this end, it is important to implement core systems that will meet an organisation's needs not only in the present but also in the future, without knowing precisely what these needs will be. Best-in-class ERP solutions, for instance, enable changes to processes and business rules with immediate effect. 
They also harness standard APIs that allow interfacing with applications and external technologies such as IoT sensors. This allows the workforce to conduct tasks from a variety of platforms and devices, from any location. Alongside technological transformation, businesses must also be prepared for the predicted transformation of organisational structures. In future, it’s expected that businesses will employ only a few regular staff, assisted by a team of freelancers, ad hoc project managers, and many part-time employees, most of whom will work remotely. The internal structure of a business will become far less rigid, moving towards a model that looks more like a loose and flexible network. Supporting a flexible business culture will be vital for the longevity of any business. Emerging technologies are friends, not foes CEOs and business owners have a lot to gain from the early adoption of intelligent technologies. They have the potential to enable enhanced performance, efficient resource utilisation, and informed decision-making. To maintain a competitive advantage, it is important that business leaders suppress their fears and embrace cutting edge technologies. Managers must take care to instill a commitment to embracing change across the whole organisation, encouraging innovative initiatives and ensuring employees take an active role in shaping new processes. Businesses will also need to think about how technology will disrupt traditional practices. For example, over the next two decades, autonomous vehicles will fundamentally change how goods and people are transported from place to place. Understanding the impact of these changes on the way a business operates, and the ways ERP solutions can ease this transition, is critical. Doomsday narratives surrounding the implementation of AI in the workplace are fueled by the misconception that humans and machines cannot work alongside each other. In truth, the most innovative companies create synergistic scenarios in which robots perform routine work tasks and data processing, and human workers focus on more complex tasks. These include job scenario management, strategy development, customer focus and long-term thinking. Process automation and emerging technologies are friends, not foes, to business. Organisations must turn their attention to how the workplace will change as a result of their introduction, and prepare the workforce for a future in which they’ll work alongside advanced technologies. Compliance is King Today, consumers are hypersensitive about how their data is stored and processed, and organisations cannot afford to take any chances with data access control. Equipping a business with the tools to adapt to new regulations as and when they arrive will ensure it’s able to stay compliant. Regulation compliance is of real importance to businesses, especially following the slew of fines dished out this year by the Information Commissioners Office. Failing to align with regulation—such as GDPR, VAT or the upcoming MTD (Making Tax Digital)—can prove damaging to security, consumer trust and the bottom line. GDPR, for example, is all about keeping track of data, which can become a complex task if it’s arriving for various different sources, and is stored in numerous different places. Investing in technology that can centralise data, such as an ERP system, will greatly simplify this process. Handling data according to strict processes, ensuring security and auditability, is a must. 
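As a small illustration of handling data according to strict processes, with security and auditability, the sketch below records every read of a personal-data record in an append-only audit log. The store, field names and log format are invented; in practice an ERP or central data platform would provide the real system of record and audit trail.

```python
# Minimal sketch of auditable access to personal data: every read is appended
# to a log with who, what, when and why. All names here are illustrative.
import json
import time

AUDIT_LOG = "access_audit.log"

customer_store = {"cust-42": {"name": "A. Example", "email": "a@example.com"}}

def read_customer(record_id: str, requested_by: str, purpose: str) -> dict:
    entry = {
        "ts": time.time(),
        "actor": requested_by,
        "record": record_id,
        "purpose": purpose,
    }
    with open(AUDIT_LOG, "a", encoding="utf-8") as log:   # append-only trail
        log.write(json.dumps(entry) + "\n")
    return customer_store[record_id]

print(read_customer("cust-42", requested_by="support-agent-7", purpose="billing query"))
```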
Technologies allowing businesses to conduct these processes simply and quickly, whilst remaining compliant, will prove invaluable. Companies able to champion agility and adaptability are setting themselves up for success in today’s ever-shifting business landscape. The key to unlocking the potential of a business is to equip the workforce with the tools they need to work efficiently, and to adapt to the increasing integration of new technologies and automation. Organisational flexibility will lay the foundations for these inevitable changes, securing the future success and longevity of the business.[/vc_column_text][/vc_column][/vc_row] ### Smart Contracts: Where Blockchain will prove its value [vc_row][vc_column][vc_column_text]We start 2019 with many still sceptical about the true impact Blockchain can have on the enterprise. While there’s little doubt about the transformation it can bring to a wide range of industries, the problem is that meaningful use cases are still few and far between. And without being able to see real-world benefits, organisations are struggling to understand the practical applications of this exciting technology. A recent survey by Deloitte of 1,000 executives, underlines the conundrum businesses are facing; showing more than 75 percent believe that without blockchain their business could lose competitive advantage, compared to 33 percent who remain uncertain on the return on investment in the technology. To really propel blockchain applications forward, it’s clear the industry needs proof that these technologies are commercially viable, and that the technology is maturing. How companies are using Blockchain today Despite these reservations, most major Fortune 500 companies are in fact exploring Blockchain today. They recognise the potential benefit it can deliver across business operations and are making tentative footsteps into an area that can add real value in certain scenarios. Contract management is one area in which Blockchain is really showing its worth. Blockchains that use distributed ledger technology allow for contracts that are self-verifying, self-executing, and autonomous. Companies can exchange terms, events, and information throughout the lifecycle of a contract without relying on brokers or middlemen. For example, contracting parties can automate payments due over the lifecycle of a contract. The nature of blockchains and distributed ledgers means that as these contract milestones are reached, and payments are made, they are recorded in such a way that neither party can repudiate or manipulate the record. Transforming Business Contract management is fast becoming the fifth system of record along with CRM, HR, SCM and ERP. Since contracts enshrine the rules of business (what do I do to get paid, and what do I get for what I pay; with what risks and what levels of compliance), the execution of these rules to bring contracts closer to the business transactions they enable is critical. We call this contracting at the edge – just like a rental agreement gets signed at the point of rental, more and more contracts will move closer to the actual transactions, leading to huge benefits in efficiency, lowered risk and better compliance. Blockchain offers an exciting way to bring this to reality, since they can help transactions execute securely and autonomously according to the rules laid down by the contract. It will help to enable a new era of contracts that will transform the way business is conducted. 
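The sketch below is not a blockchain, but it illustrates the property described above: each contract milestone entry carries the hash of the previous entry, so later manipulation of the record is detectable by either party. Contract IDs, events and amounts are invented; a real deployment would run on a distributed ledger shared by the contracting parties, with consensus and replication on top of this basic idea.

```python
# Minimal sketch of a tamper-evident contract record: each milestone entry
# includes the hash of the previous one, so altering history breaks the chain.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

ledger: list[dict] = []

def record_milestone(contract_id: str, event: str, payment_due: float) -> None:
    prev = ledger[-1]["hash"] if ledger else "genesis"
    entry = {"contract": contract_id, "event": event, "payment_due": payment_due, "prev": prev}
    entry["hash"] = entry_hash(entry)
    ledger.append(entry)

def verify(chain: list[dict]) -> bool:
    prev = "genesis"
    for e in chain:
        body = {k: v for k, v in e.items() if k != "hash"}
        if e["prev"] != prev or entry_hash(body) != e["hash"]:
            return False
        prev = e["hash"]
    return True

record_milestone("C-100", "design delivered", 10_000.0)
record_milestone("C-100", "pilot accepted", 25_000.0)
print(verify(ledger))            # True
ledger[0]["payment_due"] = 1.0   # attempted manipulation
print(verify(ledger))            # False
```

Even in this toy form, the second verification fails the moment one party edits an earlier payment figure, which is the essence of the non-repudiation that distributed ledgers provide at scale.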
A network of smart contracts can be used to manage terms, events and information throughout the contract lifecycle by enabling different contracts and assets to work together and learn from one another. Blockchain provides the platform for truly autonomous, smart contracts, minimising the need for conventional human management and managing business continuity by taking control of the transactional elements of day to day operations. Guardrails can be enabled in the blockchain and rules set around what can change in the contract and what cannot, can significantly reduce risk. In practical terms, this means if a company signs a deal with a consulting firm in another country and creates an agreement which has been negotiated to include payments and bonuses when certain conditions were met, disputes over currency values can be avoided by enabling the smart contract to automatically include adjustments for currency fluctuations without the need to trust mediators and to securely include non-repudiation. Clauses like these are codified as “Smart Clauses” and deployed on the blockchain, where they are monitored for compliance and upcoming milestones. As conditions are met based on external inputs, the smart contract verifies the transaction, adjusts the currency for market rates and executes by releasing payment instantly and directly to the seller. By making a contract interact with and respond to its environment during the transaction, it can deliver maximum business value while reducing transaction risk. Blockchain technology also enables businesses to create a private ledger (often called a consortium blockchain), where a group of organizations who regularly do business with each other can provide a safe, “trust-less” platform that everybody in the consortium can trust without prejudice. The application programming interfaces (APIs) provided to the parties in the contract allow them to dynamically monitor, measure and manage the contract evolution including acceptance and execution. By examining ledger entries tracking the execution of a contract, parties can figure out the deviations between expected timelines to develop accurate risk models, using contractual data to support the development of business models. With secure access to an environment to execute transactions within the guard rails set by the contract, a slew of interesting use cases open up. Smart contract use cases Smart contracts are gaining popularity and have already realized benefits in various blockchain projects across different industries: Healthcare Contract management solutions in healthcare can help boost confidence in patient privacy through an immutable record which cannot be manipulated without the correct permissions. Blockchain will also enable healthcare businesses to comply with national and regional regulatory regimes, and ensures ethical sourcing and procurement of healthcare products and equipment. Manufacturing Manufacturers are investing in a blockchain-enhanced supply chain that enables better traceability and transparency, both with partners and suppliers. Improved automation within the supply chain also means a product’s lifecycle can be tracked from origin to the shelf, showing ownership transfer from the manufacturer to the consumer and everyone in between. What happens next? For businesses looking for Blockchain projects that can offer real value, contract management offers fertile ground to use this new technology to accelerate the business, protect against risk and optimize commercial relationships. 
For any company managing multiple contracts, contract management makes perfect business sense - especially in today’s highly regulated, compliance-heavy and high-risk environment. As Blockchain brings increased sophistication to contract management, and smart contracts develop, the benefits of investment in this space become even clearer. It’s this clear recipe for success that we believe will help more organisations see the strategic value in blockchain, with many companies likely to take their first steps into the technology via this route.[/vc_column_text][/vc_column][/vc_row] ### Main Problems With Telemedicine In Developing Countries [vc_row][vc_column][vc_column_text]Telemedicine is a huge resource for developing countries where there are massive infrastructure and accessibility constraints. Frequently, telemedicine is the only viable way to issue prognoses and even prescriptions to people in less developed areas who lack access to proper medical supervision. However, the uptake of telemedicine in developing countries such as Mexico, which borders developed US states like New Mexico and Arizona, has been slow and rife with problems. Here are a few that are stalling progress. Lacking Support From Official Bodies Medicine is an area in which people frequently look for professional validation before committing to any real plan of medical action. It is therefore a serious problem for telemedicine if the professionals giving the advice are not backed by an official body. Advice is less likely to be heeded, and doubts about the system as a whole grow as a result of the lack of validation. Policy Problems Implementing telemedicine requires the full support of government policy, or bureaucracy will hold the whole process up. Medicine is a highly regulated practice because the stakes can be so high. Policy and regulation need to be adapted to support the spread and efficiency of telemedicine, or progress is easily thwarted. Technology Isn’t Optimized Technology has advanced massively in the last ten years. However, that doesn’t necessarily mean there is technology in place that is specifically useful to the telemedicine field. Databanks and call-centre-style phone systems aren’t necessarily available to telemedical professionals. What is more, with so much left to the professionals themselves, there is rarely a support system in place to resolve technical issues as they arise, leaving a great deal to chance and to the availability of technicians at a given time. Given the urgency of medicine, this is often not good enough. Depleted Representation Telemedicine has the potential to be a huge, civilization-changing force if capitalized upon with the requisite enthusiasm. But building that belief, with the right level of enthusiasm and planning, is a difficult thing to promote. There aren’t enough respected, well-established authorities championing the concept to the people who could benefit from it, or who could help implement it, to really spread the word and get it going. Confidence that these programs can work, and can be an excellent alternative to traditional medical practice, is vital to the survival and propagation of the idea throughout developing countries.
Internet Connectivity Issues It’s a pretty obvious point to make but if your telemedical service is conducted over the internet, then you’re going to need internet connection to continue your work. It is quite often the case that developing countries don’t have this available to them. “Internet connectivity can be patchy at best in a lot of developing countries. For a rigorous telemedicine network it’s really important that there is stable connectivity, 24/7” warns Joel Schneider, medicine blogger at Buy assignment service. Before this is addressed some countries simply can’t develop their telemedical field. Conclusion Telemedicine is such a vital alternate to traditional medical practices in developing countries and would seem like an obvious option for pursual in countries all across the globe. However, there are problems as we learnt above. The problems, more often than not, relate to infrastructure and problems that connect directly with the status of the countries in which it is needed most: legitimacy, championing of the idea and technical solutions are all needed to maintain the progress and effectiveness of this tool. Sadly, until this is solved there will likely be a big halt in progression.[/vc_column_text][/vc_column][/vc_row] ### 5 Common Video Types to Market Your Business [vc_row][vc_column][vc_column_text]Different medical video production companies say, Video has become so ingrained in our lives – whether it’s on TV, a YouTube tutorial on how to cook a meal, a creative segment on our social media feeds, a webinar at work, or an interview with the cast of a popular Netflix show, video has pretty much permeated how we connect, interact, and digest information in our everyday lives. On YouTube alone, nearly 5 billion videos are watched per day. Because of this, including video content in your business marketing strategy is an essential part of creating brand awareness and generating revenue. Here are 5 common video types your company should consider incorporating into your marketing efforts. Client or Consumer Testimonials Testimonials are the perfect way to demonstrate the value of your product or services while also building credibility with prospective clientele. Potential customers are always hitting the internet to find what other people are saying about particular products or services, which ultimately facilitates the buying decision. Rather than simply adding a few quotes on your webpage, you can elevate your efforts by putting together a video montage highlighting your best testimonials. For consumers, physically seeing an individual attest to a product or service is a much more powerful source of credibility than just reading words on a website. Explainer and How-To Videos Positioning your brand as a thought leader in your space and delivering valuable tidbits of content for your audience is a great way to build trust with consumers. Whether it’s interview tips, how to use a new app, or advice surrounding HR best practices; educational content accounts for the largest portion of videos watched on a daily basis. By putting together short explainer videos, you’ll establish yourself as a credible source that places value in delivering effective customer service. These videos can also be recycled and used across various marketing channels, such as accompanying a “How to Create a Product Roadmap” video with blog posts that provide more in-depth content. 
By creating complementary content over various social media channels and online platforms, you will be able to broaden the scope of your reach and appeal to the varying appetites of how your consumers prefer to digest new information. What would normally be somewhat boring instructions, can be delivered in an entertaining, easy to understand way through the use of creative videos. You may also want to consider leveraging this type of video content to answer some of the most frequently asked questions your support team receives, which includes crafting tutorial videos that can be emailed to customers who request help with navigating your product. The most complex concepts are easier to explain when you pair it with visuals that a customer can understand. Whether you choose to use whiteboard animation or invest dollars in hiring a professional video service to customize video content for your business, never underestimate the use of visual communication in converting leads and retaining clients. Product Demonstration Videos Rather than explain the benefits and features of your product or services, you can use demo videos as a way to highlight this information in a captivating way. Not only will it improve sales efficiency, but it’ll give your team the perfect collateral to share with prospective customers during a sales meeting. Regardless of what you’re selling, building successful product demonstration videos requires a few best practices. First, you should incorporate your brand message into your script, ensuring that your communication remains cohesive across all business teams and platforms. Next, you want to inject some life and personality into your video in order to separate yourself from competitors and many of the dry, dull demo videos on the web. You’ll also want to focus your attention on the video quality that you’re putting out there. If your video quality is poor, consumers will make a negative connection between the low caliber of your video and the quality of your product or services. Videos Promoting Your Internal Culture Another great way to use video content is in a recruitment capacity to advertise your internal culture and put a face to your brand. Videos showcasing company culture give prospective hires a behind-the-scenes look at what life is like working at your organization. These videos could showcase your employees hard at work, at a company-sponsored event, or highlight some of the benefits and perks your business offers. You can truly get creative here and even ask for ideas and input from your current employees. While these videos are mainly targeted at attracting applicants, potential customers will find value in this type of content too, as it builds trusts and puts your brand’s personality on display for a consumer. Product or Service Advertisement Videos Informative, entertaining advertisement videos, when done correctly, can engage your audience and help potential customers form an emotional connection with your product or services. Although you can go many different directions with your video ads, you always want to keep your brand message and customers in mind. Whether you use influencer marketing or make pop-culture references, you want to keep things fresh, innovative, and professional looking. When it comes to using video content for your business, there’s no denying the advantages it can offer on a multitude of fronts. 
Incorporating video into your marketing strategy, even in a small capacity, can help drive customer engagement, increase revenue, build brand equity, and establish your business as a leader in the space.[/vc_column_text][/vc_column][/vc_row] ### Choosing The Right CMS For Your Business: The Way Forward Whether you are a first-time shopper or simply seeking to upgrade your existing content management system (CMS), the process of selecting an appropriate platform for your business can be overwhelming. With numerous platforms to choose from, how can business owners identify the right CMS for their specific business goals and needs? This article sets out how that choice can be made, and addresses some of the challenges web owners often face long after selecting and paying for their preferred CMS platform. From conducting research and launching demos to comparing pricing and features, you will learn more about making the right choices when it comes to selecting a suitable CMS platform. For the purposes of this article, a number of prevalent CMS platforms, such as Umbraco, DNN, Typo3, Drupal, Joomla, and WordPress, have been analysed, with references also made to other platforms like Wix, Rooftop, Contentful and Concrete5. So, when choosing a content management system to suit your business and budget, here is what you need to consider. Know what you want it to do You should not pick a CMS simply because everyone else seems to be using it and it therefore feels safe. Consider WordPress, for example: it is by far the most popular CMS on the web, with over 24 million installations. Many of those installations, however, are blogs, personal sites or small business sites for the likes of auto repair shops, neighbourhood grocers and salons, and the actual percentage powering more demanding business websites is not known. In other words, raw installation numbers say little about the capabilities of a CMS platform. Other CMS platforms have far fewer installations than WordPress: for comparison, Drupal now has over 1,200,000 installations, Concrete5 around 140,000, Umbraco claims over 380,000, and Typo3 has about 500,000. Yet when it comes to quality, features and complexity, a Fortune 500 company’s website built on a less common CMS can easily outclass a local hairdresser’s site built on the most popular one. So, as a business owner, your choice should not be based solely on the number of people using a CMS. Look at published use cases and the kinds of clients they were built for, and introduce some quality criteria into your assessment. Within your own sector, research how many organisations are using a given CMS and why they chose it. Assess functionality and usability When choosing a CMS platform, it is essential to ensure that it provides the editing functionality you need. When researching your options, make sure you get input from everyone who will use the platform. Is the CMS mostly going to be used by marketers, or developers?
How many users will need access to the platform? Do they have significant technical expertise, or very little? Choosing a CMS that fails to meet the specific needs of your business, or that most of your users find difficult to operate, is a futile effort. Depending on your requirements, a good CMS may well allow you to do virtually everything you need. Here are some things to consider when choosing the right CMS for your business:
- Assuming your CMS has a WYSIWYG editor, are you protected from ‘breaking’ the layout?
- As with app development, do you have to resize images before uploading them to your website, or does your CMS resize them automatically?
- What happens to existing links and SEO when you change the URL of a page? (A small sketch of how redirects can preserve these appears at the end of this article.)
- Do you have to update your website’s navigation manually, or is it updated automatically?
These are just some of the essentials to take into consideration. In many cases the answer will depend on the developer, and some of the customisation work may fall to you. Hire the right developer There are many app development companies that provide web design and development services using their own proprietary CMS. Be careful here, as you may find yourself yoked to the proprietary software of a specific agency: your chances of later finding developers willing to work on it are slim, even if you are lucky enough to obtain the source code of the company’s CMS. Conversely, if you are leaning towards a popular open-source CMS, consider the signal-to-noise ratio among its developers. Finding developers for popular CMS platforms is easy, but you have to watch out for people with little or no real knowledge of the subject. As a rule of thumb, there is a larger pool of inexperienced developers around a more popular CMS platform than around an uncommon one, and most open-source CMS platforms are exposed to this problem. Determine your budget for the web or app development When deciding on a content management platform you will also need to factor in your budget, unless your business has an unlimited supply of funds. It may be worth paying for a more sophisticated solution, particularly if the CMS is a focal point of your business, but if you are paying more for complex features, make sure everyone using the platform benefits from them. In conclusion, deciding on the right CMS platform for your site is not easy. Just because an agency works with a platform you like does not mean the decision should be left entirely in their hands: the engine that powers your website behind the scenes is an essential part of the whole construct. So don’t be swayed by the visuals and designs you see on the surface; do proper research before settling on an appropriate CMS for your business, and take time to plan how you want your site to evolve.
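As promised in the checklist above, here is a small, purely illustrative sketch of one of those questions: what happens to existing links and SEO when a page's URL changes. It keeps a simple redirect map and collapses redirect chains; the paths are invented, and most mature CMS platforms or plugins handle this for you.

```python
# Minimal sketch of preserving old links when a page's URL changes: keep a
# 301-style redirect map and avoid chains. Paths are invented for illustration.
redirects: dict[str, str] = {}

def change_url(old_path: str, new_path: str) -> None:
    """Record a redirect and re-point any older redirects at the new URL."""
    redirects[old_path] = new_path
    for src, dst in redirects.items():
        if dst == old_path:           # avoid chains: a -> b -> c becomes a -> c
            redirects[src] = new_path

def resolve(path: str) -> str:
    return redirects.get(path, path)

change_url("/services/cloud-hosting", "/solutions/cloud")
change_url("/solutions/cloud", "/solutions/cloud-platform")
print(resolve("/services/cloud-hosting"))   # /solutions/cloud-platform
```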
[/vc_column_text][/vc_column][/vc_row] ### Sustaining the Smart City | Compare the Cloud Smart technology is ever increasingly present in the everyday lives of communities around the world. As urban areas become increasingly reliant on information technology, IoT gadgets, and wireless networks - evolving into smart cities-, sustainability becomes more and more significant. Cities like this must remain conscious of sustainable solutions to combat urban growth and CO2 emissions. Luckily with smart technology, many issues surrounding sustainability can be curbed despite energy usage and growing populations. For example, smart meters installed in houses help homeowners and tenants monitor their energy use. Smart meters Several UK companies already offer smart meters to help people choose wisely how and when they use energy, and with companies like OVO Energy, the findings can be accessed online and on a display screen in your home. According to UK Energy Minister, Claire Perry, smart energy could save the UK up to £40 billion from now to 2050. “Smart meters will be the cornerstone of a cleaner, flexible and efficient energy system, saving the country tens of billions of pounds”, she said. Half a million households in the South West of England alone have already had a smart meter installed. And if every household in Britain had a smart meter, the national savings would be so enormous that we could supply energy to power the homes of hundreds of thousands of people. Smart technology can also be used to benefit communities by tracking weather conditions with IoT sensors and apps, calculating water supply - researchers at the University of Waterloo developed an AI system to identify bacteria and contaminants in the water -,  and smart technology such as smart-bins can help monitor waste disposal to improve recycling. The Internet of Things (IoT) IoT takes up a critical role in the construction of a smart city, as IoT items that connect to the internet and communicate with each other are key to the streamlined running of a technologically advanced city. Innovative businesses and local authorities have already started to work on projects surrounding smart cities and how they can work towards sustainability goals. In 2015, the UN commenced the 2030 Sustainable Development Agenda. The goals of this agenda revolve around poverty, health, education, climate change, water, energy, urbanisation, environment and social justice. IoT innovations including virtual power stations, energy efficient heating, sensors to monitor health, low-cost energy usage monitors, and IoT enabled security systems, can help tackle many of these issues. Research has found that 84% of IoT deployments are tackling the UN’s Sustainability Development Goals. As IoT devices become more and more integrated into the running of communities and as the technology becomes more affordable, people are likely to reap the rewards of IoT, improving the quality and ease of their lives in addition to more sustainable living. Smart technology can assist in the smooth running of all stages of the day, from monitoring sleep, planning meals, saving money through efficient energy consumption, and travelling on public transport (with apps to show bus arrival times based on live tracking). How realistic is the smart city? 
A smart city seems like a kind of Utopia when we consider the benefits alone, but of course, the reality of a smart city and how it would function is somewhat a guessing game until it is a reality for more people and has gone beyond an experiment. Giving technology too much power could put the jobs of humans at risk and sacrifice the psychologically necessary interactions between people. Cybersecurity and privacy are also concerns that should be considered. The enormous benefits to sustainability and ease of everyday living are great reasons to embrace IoT technology, but for smart technology to take over all sectors and mitigate human responsibility may be a step too far. Poor implementations of a smart city model could cause exclusion of people without access to the technology necessary or understanding of the technology and could result in a violation of public privacy. But with IoT devices being used to help residents manage their use of energy, monitor their health, communicate with ease, quickly obtain information, and aid environmentally conscious waste disposal communities could function very positively and sustainably. With regulation of IoT devices, strict privacy laws, and public funded projects to provide technology and training to people from all backgrounds, a smart city could work for everyone. But as smart technology is in its infancy, whether we put our trust in the technology enough to embrace the smart city model is uncertain. On the other hand, smart technology and its ability to aid sustainability is a focus area for many companies and organisations wishing to find an efficient way to help people keep track of their carbon footprint and make small changes to their lives (e.g., taking traffic into account before driving and recycling correctly). There are IoT sprinklers that help users monitor their use of water, sensors to monitor crops (the moisture in soil and growth of produce), and more. IoT technology can be deployed to better the environment and our management of it without the need for constant human intervention. For example, The Rainforest Connection have made sensors from old mobile phones to be attached to trees to detect illegal logging and poaching, as well as providing insight for scientists into the lives of endangered animals.   ### Is your business ready for the automation revolution? Even the biggest, most technologically-advanced businesses find artificial intelligence difficult.  There is a slew of examples of AI-gone-bad, where developers have accidentally built applications that have proved themselves to be racist, sexist, or seeming to advocate violence. The problem is not with the technology itself, but with its human creators who so often bring their own unconscious biases to the table. Google, for example, listed 641 people working on “machine intelligence” – of whom only 10 percent were women. This shows how even the biggest, most technologically-advanced businesses can make serious missteps in their journey towards better, more intuitive operations through AI and automation. How can businesses navigate this complex landscape and develop systems that are not only free from bias but which also develop real and measurable business value? AI and automation – the human factor Luminaries from Elon Musk to Professor Stephen Hawking have made dire warnings about the existential threat of AI to humanity. 
On a more prosaic level, many ordinary people have the erroneous (and dangerous) assumption that technologies such as AI, machine learning and automation will soon replace them in their jobs. In truth, we’re a long way from achieving this. Current automated systems are still directed heavily by humans, with pre-set tasks and complex but defined and limited algorithms. There are no AI solutions in current business use whose actions are completely unpredictable, none that are capable of independent thought – and we see this as good business sense. Rather than seeing the relationship between humans and these technologies as akin to that between master and servant, we should instead think of it more like a marriage. Automated and AI systems should be there to support us, not replace us. And, like a good marriage, they find firm foundations in dialogue. The picture on the ground  What does this look like in the real world? Well, to give one example, a business should not seek to replace call centre staff with full automation, but rather examine how they can use AI and machine learning to automate aspects of the process, making it more efficient yet still providing the human interactions that customers crave. Before deploying AI and machine learning in contact centre environments, businesses need to decide precisely what they want to achieve. We would suggest that the most profitable aim would be to streamline the process, and work to shave a minute off each interaction. Companies can do this by categorising the different types of or reasons for customer calls, and then work out which low-value ones are ripe for automation – for example, with a virtual assistant. But this should only be the beginning. Having automated aspects of customer contact, businesses should then be applying machine learning technologies and methodologies to calls to improve them still further. For example, they can identify why customers might drop out of their automated journey by analysing at what stage people left the call, and what patterns of interaction are most likely to lead to an unresolved query. They can then work to improve the process. This could involve manually building in more questions, or it could use machine learning to improve these models based on previous interactions (including person-to-person calls). Cloud – the foundation for automation revolution While the potential benefits of these technologies can, of course, spread far beyond the contact centre, this environment is a good illustration of the principles that will guide and underpin the automation and AI revolution. Many of the new services that businesses develop will be built on natural language processing, such as Amazon Lex, Nuance, and Google Natural Language. These technologies can be deployed to “listen” to human-to-human customer service interactions, learning from human patterns of speech and then feeding into machine learning applications to make future calls smarter. Businesses can also use them to make the customer’s unique voice characterisation their password, as HMRC has done, to remove the need for tedious and hard-to-remember authentication processes. The challenge with such sophisticated technologies is that their very complexity needs an extremely robust infrastructure platform. These aren’t just applications that need to be hosted: they have intense data processing, storage and security requirements, as well as integration with other corporate systems. 
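The drop-out analysis described above does not need to start with anything exotic; a simple count of where callers abandon the automated journey is often enough to show where a virtual assistant is losing people. The following Python sketch is illustrative only: the stage names and the shape of the call log are assumptions for the example, not a real contact-centre schema.

```python
# Illustrative sketch: find the stages at which callers abandon an automated journey.
# Stage names and the call-log format are assumptions for this example.
from collections import Counter

# Each record: (call_id, last_stage_reached, resolved_without_an_agent)
call_log = [
    ("c1", "identify_intent", False),
    ("c2", "authenticate", False),
    ("c3", "self_service", True),
    ("c4", "authenticate", False),
    ("c5", "identify_intent", False),
]

def drop_off_by_stage(log):
    """Count unresolved calls per final stage reached, worst offenders first."""
    abandoned = Counter(stage for _, stage, resolved in log if not resolved)
    return abandoned.most_common()

for stage, count in drop_off_by_stage(call_log):
    print(f"{stage}: {count} abandoned calls")
```

Run continuously over millions of real interactions, and combined with the machine learning models mentioned above, this is exactly the kind of data-hungry workload that places heavy demands on the underlying platform.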
It makes little sense to host these complex and demanding systems on-premises; to work effectively, they need to be based in the cloud, where organisations can benefit from scalable or burstable compute and storage, along with first-class security systems. The cloud also enables businesses to integrate AI, automation and machine learning technologies with the other systems, applications and data into which they feed. Revolutionary as these technologies will be to the customer experience (and much else), their dependence on cloud infrastructure shows how the cloud is actually the most transformational of all. As the foundation of tomorrow’s automation, AI and machine learning technologies, having the right cloud infrastructure to support your business’ ambitions has never been more important.

### 5 Linux Terminal Commands That You Needed The Whole Time

If you’ve never used Linux before, the black terminal screen can feel like a glimpse into the abyss. Linux has always been associated with the geekiest among us, even though it has had a friendly, Windows-like graphical user interface (GUI) for a while now. The reason most people are so apprehensive of Linux is that they are so accustomed to, and deeply immersed in, the Windows and macOS environments. Yet Linux is actually not that difficult to learn. As long as you master the basic and most important commands, you’ll have the foundation you need to grow your knowledge from there. You can of course use Linux’s GUI to accomplish a lot. However, knowledge of text-mode commands can give you a depth of control that isn’t available through the GUI. The following are the five most important commands every first-time Linux user should seek to master.

1. sudo

sudo is simultaneously the best and worst command on Linux. You should treat it with respect and a healthy degree of dread. Using sudo before any Linux command runs that command with root (or superuser) privileges. sudo is necessary when running certain processes, such as altering configuration files or updating the system. But sudo also grants the power to irreparably destabilize or destroy the system, and it can be used to intrude on the privacy of other system users. Since sudo has such unlimited power, you should never use it before a command you do not fully understand.

2. Package Manager Tools (pacman, apt or yum)

The most frequent reason you’ll be deploying the sudo command is when you are adding or removing programs on your server or PC using a package manager. Package managers like pacman, apt and yum may differ in their command grammar and arguments, but they can all perform the basic functions of package installation, package removal and package upgrade. You’ll need to precede these commands with sudo if you aren’t logged in as root. As you become better versed in Linux, you’ll learn other features of these package managers. However, the three core functions are still what you’ll use them for most.

3. systemctl (systemd)

For years, daemons (background programs on Linux) were initiated using a set of scripts known as initscripts. If you weren’t a seasoned user, initscripts were difficult to read, understand, interpret or amend. In recent times, initscripts have given way to systemctl, the service management tool of systemd. Linux is a modular platform where programs and commands are created to do one thing very well. Ergo, one persistent complaint about systemctl is that it does too much.
Still, as a newbie, you need only focus on five functions of systemctl—start, restart, stop, enable and disable. See this guide for more information on using systemctl.

4. ls

The ls command is relatively straightforward but has more capabilities than most Linux users realize. For those familiar with Microsoft DOS commands, ls is the Linux equivalent of DOS’s dir command. ls lists the folders and files in the current working directory by default (though you can specify a system path whose contents you’d want it to list). If you saved a file but cannot remember its exact location, ls can come in handy in your quest to find it. In a nutshell, ls is your scout to help you inspect the contents of your computer.

5. man

If you are having difficulty understanding something about your Linux server, the natural thing to do is search for answers on the Internet. But sometimes, the answer to your question is much closer than you think—on the computer itself. The man command is short for manual and gives you access to local documentation. For instance, if you cannot remember what the chmod command does, type man chmod and you’ll be presented with a detailed explanation. Scroll up and down the text, and press Q (quit) when you’re done. These five are just a fraction of the commands you need to learn on your way to becoming an expert Linux user. Nevertheless, if you can master the use of these five, you’ll have an easier time understanding how the Linux OS works.

### 2019 in data – Where are we now?

2019 is already proving to be a banner year for the big data industry – no question. We’ve seen cloud migration, a critical factor for many big data projects, really increase over the past year. Hybrid cloud models are becoming a very well-trodden path for enterprises, allowing them to stitch together their increasingly complex and business-critical data pipelines with speed, reliability and cost-effectiveness. For the industry, worldwide revenues for software and services around big data were projected to increase from $42B in 2018 to $103B in 2027, according to Wikibon. And according to an Accenture study, 79 per cent of enterprise executives agree that companies not embracing big data would lose their competitive position and face extinction. Cloud has begun to stake its claim as a staple of this in 2019. As data delivery options merge and enterprises seek the scalability of platforms that help them achieve their goals, they also tackle a skills gap dilemma that isn’t closing anytime soon. In fact, expert resources are thin on the ground and command high salaries, even as automation and AI break out and become essential components of delivery and operations. The talent gap within DevOps and big data has rapidly become a barrier to the growth and efficiency of analytic operations. In a recent survey, conducted for Unravel Data by Sapio Research, one in three enterprise business and IT decision makers revealed that one of the biggest pain points was talent scarcity. Perhaps as a consequence, 34 per cent also claimed it takes too long to get to insight. With this premise proving true, we will start to see AIOps converging with DevOps as a top priority this year. For the enterprise, data is funnelled into training and improving AI-oriented applications and their development and delivery.
Where the algorithm can take the strain off an over-stretched employee, it significantly improves efficiency in the business, proving to be one of the data landscape’s biggest volume drivers. No more opening support tickets and waiting patiently for a hounded DevOps team to fault-find and diagnose an issue. Instead, businesses can look forward to efficiency in both process and output, with improving and maintaining operations the focus. The stack has become ungovernable for many enterprises. This is not a problem that can be solved by throwing more resources at the challenge; the issue lies in how the enterprise approaches DataOps, which needs to stay both efficient and effective. All data-reliant enterprises are putting time and money into finding a way to stitch the fabric of the data stack together to secure the goal of extracting fast, usable insights from their BI and various data sources. A case in point: Unravel’s research showed that although 84 per cent of respondents claim that their big data projects usually deliver on expectations, only 17 per cent currently rate the performance of their big data stack as ‘optimal’. This is where application performance management (APM) comes in for the big data stack – just as it is utilised within other stacks. APM for the data stack will ensure that the final data outputs are in line with guidance at the right level – and that all the processes and individual components are optimised throughout the process. From code-level performance profiling to customer application and application log data, optimisation makes ideal performance more than a dream. Given that many enterprises use the cloud along with existing on-premise systems to run dozens of applications, monitoring and troubleshooting them all can be much more difficult for the DevOps team than in other areas that fall under the DevOps remit. For a Chief Data Officer – or data-responsible person – with their eye on consolidating gains made in 2018, the allure of fixing the avalanche of niggles that gives the data stack a precarious wobble will be very high in 2019.

### 7 Crucial Ways To Expand Your eCommerce Emailing List

Emails to potential or past customers can be a crucial way to generate interest and revenue. When done correctly, they can be a form of passive marketing which generates large numbers of clicks on your eCommerce site. When a future customer signs up to a mailing list, they are volunteering to receive tailored advertising for your site directly, alongside their other emails. When an eCommerce company can earn that position, it is an incredible opportunity to turn the potential in that customer into real results. For most online companies, some sort of emailing list is crucial to success in as busy a marketplace as the internet. With that being said, here are 7 ways to help you grow your emailing list and, in turn, grow your business!

1. Hide Certain Content

Probably the most effective way of snaring those potential customers is by withholding website content until they have entered their email details and accepted that marketing emails might come their way. The ‘unknown’ is an incredibly powerful motivator on the internet, and withholding entry to your site, or access to information on new deals, until visitors agree to be on the email list is an amazing way of capitalising on that fact. It also plays on the short-term interest and impulse of most internet users.
For them, eager to access what they want to access, their threshold for allowing an eCommerce site to email them is lowered impulsively opening the door to future marketing materials. Furthermore, this method forces the potential customer to knowingly agree to the marketing which is psychologically advantageous and more likely to produce a long-term customer. 2. Advertise Your Site Through Facebook Facebook Adverts have become a huge resource for eCommerce stores, offering very specifically tailored advertising options to companies with really any budget. If you post a Facebook advert for a particular product or service, then anyone who intentionally clicks on the advert can be assumed to be directly interested in your company. Putting an email marketing sign up at the junction between the Facebook ad and the site that the potential customer is trying to access will most likely result in another email for your list. It also helps identify customers who have a pre-existing interest in what you are selling which helps your email list to be filled with shoppers who are more likely to actually engage with your site. 3. Offer Rewards A normal eCommerce business can afford to offer small rewards in return for being granted the opportunity to put a potential customer’s email address on their mailing list. Rewards can come in many different shapes and sizes and relate specifically to the product being sold by the site. For example, a clothing store might offer a 5% discount on all stock in exchange for an email address, whilst a news or journalism website might grant 24 hours of unlimited access to all their articles for the email. There will always be people who will take advantage of this but the act of subscribing to a mailing list is far more convenient and, with rewards, far more easily motivated than going through the ordeal of unsubscribing immediately afterwards. You can get creative with this as well, coming up with fun rewards as incentives and even fun ways of deciding the rewards themselves (see Spin To Win). It also allows you to begin your relationship with a potential customer on a good note, something which is more likely to generate a lasting relationship which reaps benefits for you and your company down the line. 4. Catch Them On The Way Out If a potential customer has visited your website, no matter what they have done, whether they’ve checked a price and then immediately gone to leave, or they’ve put in a huge order the moment they go to click out of your website is a good time to quickly interject with an entry bar for an email. It can be another junction to offer them some deals, a moment to thank them for their purchase or even just their interest, or a chance to display a bit of your company’s character. If they’ve made a purchase, they are of course much more likely to want to sign up to the mailing, and you might have caught them upon check out anyway. But even if they were just browsing offering them a chance to put their email in plays directly on the idea of the internet being a crowded place. “A user can check a website one day and then a week later they will have forgotten what it was and how to return. The goodbye email list offer encourages a potential customer to avoid losing your page in the enormous world of internet eCommerce and allows you to, for a moment, communicate directly with them. Again, this encourages an active choice, as opposed to an inadvertent sign up, which plays well further down the line,” says Michael S. 
Louis, an email marketer at Professional Essay Writing Service. 5. Have Strong Email Content It’s a somewhat obvious point, but it can be one of the most important things in terms of building your emailing list. When you do manage to get a user to sign up for your mailing list you’ve got a crucial opportunity to impress them. If what they receive at first is a complete disaster, poorly written, with mistakes in grammar and spelling, or if it is formatted badly, you’re going to find it hard to keep a hold of those readers, for future correspondence. Even if the people themselves aren’t especially good writers, the fact that you are a company means that there is an assumption of a certain professional standard you must adhere to. However, the skill of having strong email content isn’t as simple as getting it right, in a baseline sense. Writing an effective eCommerce letter is much easier said than done and can require a lot of quite intense work, simply to even make sure there is appropriate quality control. 6. Write With A Personality Bland marketing content, while safe, is the death of some emailing lists. There’s nothing worse as a consumer than opening an email which, from the get-go, is simply dull. Consumers already know that they are receiving blanket emails so it’s important that at the very least what’s written in the emails doesn’t sound generic. It can be perfectly phrased, spelt and have good grammar but if it doesn’t grip the reader it might as well be gibberish (in fact gibberish might be more effective!). Execute your emails with a sense that you want the reader to know something about the character of the company and to have learned something new about the thing they are signed up to by the end of the email. Don’t be afraid to startle! It’s better to have one or two unsubscribes but grow closer with your target audience than to play it too safe. Include graphics, colours, bold text, gifs, videos whatever you need to so that no-one closes the email without having had an impression left on them. 7. Make It Personal Which leads us to the final point. Nothing grips a reader more than feeling like they’re somehow understood by the company they are subscribed to. Again, as above, it plays on that idea of startling the potential customer with the email. If a customer visits the site once and views an article on tech policy as their free article, then the next email to their account should be with a link to something related to that. To generate clicks and returns to your eCommerce site you want your readers to feel understood. The feeling of being understood encourages a desire to perpetuate that understanding, and a belief that your site is in some way unique in its perception of them as a customer is an incredibly valuable thing. Offer them a deal on whatever they clicked on, send them a link to similar articles to something they’re interested in, send them offers to complete the jewellery set of which they own the earrings. Allow them to begin to form a personal connection with your company and you could have a customer for life.[/vc_column_text][/vc_column][/vc_row] ### Moving to the cloud - what are you waiting for? [vc_row][vc_column][vc_column_text]It wasn’t too long ago that building your own on-site infrastructure was the go-to approach for developing technology solution for most businesses. This approach was driven by factors such as the low level of broadband connectivity, the traditionalism of technical practices and a lack of technical standards in the market. 
However, as technology has advanced, companies now have a variety of options. For example, the cloud has become the ultimate IT-as-a-commodity, providing new ways to accelerate and facilitate business growth. The cloud now represents an alternative to the traditional ‘on premise’ model that is reliable, affordable and can be used in several different ways. For many companies, especially new companies, the cloud is the go-to solution, rather than an afterthought. However, the majority (65 percent) of enterprise workloads are still running on owned or onsite data centres. Just 9 percent of enterprise workloads are cloud-based. One reason for this is that it can be tricky for businesses to choose the cloud platform that best suits their needs. Too often, businesses get entangled in the many unfounded reasons why moving to the cloud might not be best for them. For example, a small business might say it is too small for the benefits to have a real impact, while large organisations often talk about the supposed complexity involved. This couldn’t be further from the truth, as the evidence suggests that moving to the cloud is beneficial for companies of all sizes. Whether your priority is scalability, disaster recovery or digital transformation, your business stands to gain from moving to the cloud.

Scalability made easy

Moving to the cloud makes it easier for your business to adapt as it grows. This growth can either be horizontal, by manipulating the infrastructure to add or remove cloud servers, or vertical, by increasing or reducing the individual components (vCPU, RAM, HD etc.) of a server. This is true even if you’re a start-up like CercaOfficina.it – a website that connects users with garages for vehicle repairs. CercaOfficina.it crossed the threshold of 100,000 requests after just four years in business, so had to be able to scale up quickly. Another example is Tommigame.com, a start-up that supports hospitalised children and uses cloud solutions to collect data about patients’ psychomotor behaviours in order to monitor and personalise their treatments. In this case, the business grew so fast that it had to scale up and provide additional tools and resources very quickly. Cloud-based solutions can be adopted at every stage of growth, enabling businesses to adapt their IT infrastructure according to their needs at any given time. Businesses can start off with a relatively small infrastructure, then gradually scale up to meet demand without getting stuck with dormant physical IT infrastructure.

Disaster recovery made simple

For most businesses that don’t have a specialised IT department, disaster recovery (DR) often means relying on a third-party provider. This happens mainly because most small businesses don’t think they are big enough to need DR, or because they think it is too expensive. Worse still, over two-fifths (43 percent) of SMEs have no contingency measures in place to deal with an IT crisis. For those that do have DR in place, a recent survey found that 18 percent lack confidence in their DR plans and 46 percent don’t test their plans on an annual basis. Whichever one of these scenarios describes your business, cloud-based DR offers a solution that works for you. Whatever the size, cloud-based DR enables businesses to build up their resilience at a price that’s relative to their size, with smaller resource overheads when it comes to creating, implementing and testing a DR plan.
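The point about untested plans is worth making concrete, because even a very small amount of automation helps. The Python sketch below is purely illustrative (the system names, backup catalogue and 24-hour recovery point objective are assumptions, not figures from the article); it shows the kind of simple freshness check a business without a specialised IT department could schedule to confirm its backups are recent enough to recover from.

```python
# Illustrative sketch: check that the most recent backup of each system is
# younger than the recovery point objective (RPO). All values are hypothetical.
from datetime import datetime, timedelta, timezone

RPO = timedelta(hours=24)  # assumed target: no more than one day of data loss

# Hypothetical catalogue of the latest successful backup per system; in practice
# this would be read from your backup tooling or provider API.
now = datetime.now(timezone.utc)
latest_backups = {
    "crm-database": now - timedelta(hours=6),
    "file-server": now - timedelta(hours=30),
}

def check_rpo(catalogue, as_of):
    """Print the age of each system's last backup and whether it meets the RPO."""
    for system, last_backup in catalogue.items():
        age = as_of - last_backup
        status = "within RPO" if age <= RPO else "RPO BREACHED"
        print(f"{system}: last backup {age.total_seconds() / 3600:.1f}h ago - {status}")

if __name__ == "__main__":
    check_rpo(latest_backups, now)
```

A report like this, emailed daily, is no substitute for a full DR test, but it does catch the common case of backups silently failing long before a crisis exposes them.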
More effective digital transformation

The digitalisation of business processes, and the effectiveness of that digitisation, is the biggest differentiating factor in delivering the best customer experience. And the cloud is perhaps the most effective way to deliver this transformation. Take Nexive, for instance, Italy’s number one provider of private postal services. It recently decided to digitise all of its operations and move from on-site to cloud-based data centres as part of a five-year, 3.5 million euro investment plan. This transformation enabled Nexive to ensure reliable physical and digital services for its customers, as well as putting it one step ahead of its competition with a flexible and secure data storage solution. Before moving to the cloud, Nexive’s data was stored on a private server, which proved to be extremely costly. From process management to carrying out regular upgrades and maintenance, the server required significant investment in human resources, not to mention the costs associated with regulatory compliance. Among its various benefits, moving to the cloud eliminated the maintenance and compliance costs and offered a solution that could instantly scale up in case of an activity peak.

Stay one step ahead with the cloud

Moving to the cloud enables businesses to stay on the front foot. Whether it’s providing vital insurance against downtime (e.g. due to human error or a natural disaster) or responding to activity peaks in real time, the benefits are clear to see. What’s more, deploying the right cloud solution provides protection for IT resources, the data the business holds and the business processes it supports, thereby freeing decision makers’ time to concentrate on running the business. Looking forward, it will become increasingly difficult to avoid having cloud-based solutions as an essential part of your business processes. Ultimately, businesses that fail to recognise this will end up missing out on the many opportunities and losing customers to competitors that do recognise the opportunity and have taken the right steps to position themselves for ongoing success.

### Private Cloud: Answering the call of UK business

The rate at which UK business is transitioning to the cloud is at an all-time high. Companies have their sights set on innovation, efficiency and lower OpEx, as delivered by cloud-based infrastructure. In fact, a report from Gartner predicts that by 2025, 55% of large enterprises will successfully implement an all-in cloud SaaS strategy. However, when it comes to digital transformation, a one-size-fits-all approach doesn’t always work. There are many types of cloud solutions, offering businesses the opportunity to choose the service that best meets their needs. Private cloud, in particular, is an especially attractive option for businesses that know what tools they need and want to maintain complete control of their infrastructures as they evolve. By virtualising their network and storage technology, the entire infrastructure can be managed in-house – including the deployment of new resources. Therefore, a rapidly increasing number of businesses are achieving their goals by working with a software-defined datacentre (SDDC), like that offered by OVH. This method virtualises infrastructure setup and design, leaving the hardware behind, with a focus on flexibility, security and control.

Speed and flexibility

So, what are the key benefits of moving to an SDDC?
An almost seamless transition to the cloud is a very important factor for business leaders when making the digital leap. As we all know, downtime is dangerous! Setting up new network infrastructures and new server environments may seem like a goliath task, and traditionally it was. But today SDDCs can be up and running in less than an hour, greatly limiting operational disruption. The increased flexibility of the SDDC also delivers a powerful boost to staff’s speed and productivity, while helping to keep costs down.  In fact, the successful implementation of an SDDC can reduce operational expenditure (OpEx) by as much as 56%. Businesses are reflecting many of the trends that are growing among consumers, and the will to not be tied down is a central one. Businesses want to know that they are not giving up the ability to be agile by bringing a new service on board, and SDDC technology ensures this much needed freedom. . For example, our infrastructure is built to VMware’s industry standards which means customers not only have certainty but have access to a reversible platform. Getting the right fit If your infrastructure is currently based on dedicated servers, there are many ways an SDDC can help businesses. In addition to a growing call among customers for flexible options, businesses also need scalability. Companies using dedicated infrastructures won’t be able to achieve the levels of scalability and flexibility offered by the SDDC, a requirement that is becoming increasingly essential for businesses at both ends of the size spectrum. . An SDDC will allow the customer to use as much or as little capacity as they need, leading to considerable savings and the ability to progress on their digital transformation journeys. Another option for businesses could be to combine an SDDC with Public Cloud instances. This allows an organisation to take full advantage of the SDDC’s security and reliability for essential applications, while enjoying the speed and flexibility of the Public Cloud for any applications that need to be deployed and broken down quickly and cost-effectively. A combination of Public Cloud instances and Private Cloud solutions, a true hybrid cloud solution, is now not only possible but highly attractive for many global organisations. For example, a healthcare company may use an SDDC to keep sensitive patient data secure, while using Public Cloud instances for additional sites, such as blogs, news pages or test and dev environments. With so many options, deciding whether an SDDC is the right solution for a business may seem complex. However, the key to success is knowing your business goals and to work with cloud specialists to establish which combination of solutions will help you achieve those goals most efficiently. The wealth of SDDC choices means that businesses can find a cloud solution that matches their individual needs. Taking back control Having the flexibility to choose a hybrid cloud solution is important but maintaining full control of your infrastructure is what unlocks the real business benefits of an SDDC. Not only is data more valuable than ever, it is also at its most plentiful. Vast datasets are now required for cutting-edge work like training AI algorithms, and technologies like IoT stand ready to generate masses more. Handling and managing this volume of data is not a task that humans can successfully carry out, instead businesses need an automated process. 
Using SDDC technology also allows businesses to control the process of scaling up by adding disk space to accommodate growing datasets and an evolving virtual infrastructure, all without causing costly operational disruption. From concept to reality The SDDC concept is evolving alongside the changing needs of the customer, but the rate of change is swift and constant. Providers like OVH are continually investing in research and development to fuel progress and ensure their services keep up with business needs. Sophisticated private cloud solutions have the ability to change the playing field for businesses of all sizes. Above all, regardless of where your business is in the digital transformation journey, it can benefit from the flexibility, efficiency and control offered by an SDDC.[/vc_column_text][/vc_column][/vc_row] ### Why “NoSQL” doesn’t equal “insecure” [vc_row][vc_column][vc_column_text]The NoSQL database is a product of the 21st century desire to deliver increasingly fast, always-on digital experiences. Where traditional, so-called ‘relational’ databases require predictable, structured data, NoSQL (Not-Only-SQL) provides an extremely dynamic and cloud-friendly way for organisations to manage real-time, unstructured data. However, unlike traditional databases, which are almost always located on premises, some of the first NoSQL databases were exposed to the internet by default. This simple setup configuration has subsequently led to users of some of the most popular NoSQL databases falling victim to catastrophic ransomware attacks and reputation-crushing database leaks. For example, in late 2017, reports suggested that tens of thousands of publicly-facing NoSQL databases had been accessed and held to ransom, with some even wiped when users failed to pay up. More recently, cybersecurity researchers have continued to find troves of sensitive data laying unprotected in open databases. In September 2018, an unsecured database belonging to a Californian email marketing company leaked 11 million users’ personal data, including names, email addresses and physical addresses. And it hasn’t taken long for 2019 to see some potentially devastating leaks either; 202 million job-seekers in China had their CVs leaked by an unencrypted database, whilst smart doorbell company Ring was recently found to be giving access to every single customer’s camera footage to its R&D team – all because the company had not properly considered its security practices. It’s no surprise that the volume of database leaks has garnered apprehension, but it would be wrong to write off NoSQL databases as fundamentally insecure. In truth, the issues primarily come down to user error and poor database design – two things that can be tackled. As long as vendors implement secure-by-default features and users follow security best practices, NoSQL databases are just as secure as their predecessors. The proof is in the plan It sounds crushingly simple, but businesses must plan to prevent leaks before they occur. Choosing the right NoSQL provider is clearly the first basic step. Don’t make the mistake of trusting your database with anyone that considers cybersecurity an addition to their services – it should be built into their systems from the outset. Make sure you question any providers about their knowledge of end-to-end security. Look at whether they show a history of responsibly reporting vulnerabilities. Ask about how easy it is to implement security capabilities around the database. 
Properly researching your provider may seem obvious, but it could make the difference between suffering a leak and not. The next stage of planning your database security is to think about how your data is secured in transit. It’s a given for businesses that data will need to be transferred both internally and externally – this is not dangerous in itself. However, transferring data externally has the potential to expose it to unauthorised third parties, because many businesses forget that encryption is vital for every dataset – regardless of where it’s stored. It’s therefore vital that you can secure data both at rest and in transit, by investing in SSL connections for client/server and server/server communications (a short connection sketch follows below). The security planning for your database doesn’t end when you roll out your NoSQL database. It’s essential that you have visibility of, and continually review, the security roadmap of your database. As NoSQL databases are relatively new to the technology scene and therefore constantly developing and improving, it’s vital to understand exactly how any upcoming improvements or launches will impact your cybersecurity policies or needs.

Five tips to security success

It’s one thing planning your NoSQL security process, but putting it into practice is a different story. Here are some key tips businesses must remember if they want to avoid catastrophic cyber-attacks:

- Never expose databases to the internet. When it comes to database security rules, this is as important as they come. If you don’t store all nodes behind a strong database firewall, you risk the security of your sensitive information.
- Keep your server operating system up to date. Unless you install the latest security patches, no data security can be guaranteed. As the WannaCry and Spectre/Meltdown issues have shown, there’s no substitute for responsible patch management.
- Always delete “default” and sample databases. The security industry may not like the word “default”, but it’s not something to be brushed under the carpet. As those who have suffered cyber breaches will know, it can nearly always be replaced with the word insecure: default passwords are insecure passwords; default settings are insecure settings.
- Use strong and unique passwords. Thanks to the rise of open source, it’s relatively simple to find out about security settings like default passwords. Businesses can therefore never be too careful in ensuring any new project or database is given a strong password that is not used for anything else.
- Report security holes straight away. Even if it seems minor, you can help the wider community by sharing it. Coercive silence helps no one but cyber criminals, so if you see something that doesn’t look right, report it without fail.

A problem shared

As bleak as it sounds, businesses must accept that hackers will always be a threat. With every new technology that emerges, hackers become more sophisticated, constantly creating new methods to exploit vulnerabilities and disrupt enterprises. Rather than viewing this as a threat, however, businesses must see it as an opportunity to invest in education and best practice. Crucially, businesses should also begin to share the responsibility of compliance and security. From web, mobile and app developers through to C-suite and technology executives, everyone involved in databases has responsibility for ensuring they are secure. There’s also an onus on the NoSQL vendors to honour the trust that organisations place in them.
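To make the advice on encrypted connections and unique passwords concrete, here is a minimal, hedged sketch of what a locked-down client connection might look like. MongoDB and the pymongo driver are assumed purely as an example of a widely used NoSQL database; the host name, CA file path and environment variable are placeholders rather than details from the article.

```python
# Minimal sketch, for illustration only: connect to a NoSQL database (MongoDB via
# pymongo assumed) over TLS, with credentials kept out of the source code.
# Host, CA file and environment variable names are placeholders.
import os
from pymongo import MongoClient

client = MongoClient(
    "db01.internal.example.com",   # private address behind the database firewall,
    27017,                         # never a publicly exposed endpoint
    username="app_user",
    password=os.environ["APP_DB_PASSWORD"],  # strong, unique, injected at runtime
    authSource="admin",
    tls=True,                                # encrypt client/server traffic
    tlsCAFile="/etc/ssl/certs/internal-ca.pem",
)

# A simple round trip confirms the authenticated, encrypted connection works.
print(client.admin.command("ping"))
```

Settings like these are only as effective as the defaults and documentation a vendor ships with its product, which is where vendor responsibility comes in.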
Providing a NoSQL database isn’t just a case of implementing a product – vendors also have the responsibility of ensuring users are able to protect their data. To do this, vendors need to make sure their services are secure by default and avoid necessitating extra steps that do nothing but confuse the user. Whilst it would be nice to believe NoSQL data breaches and leaks are a thing of the past, that’s unlikely to be the case. Going forward, however, the leaks that have occurred – and are yet to occur – can serve as a reminder for businesses to take database security seriously.

### Cloud Compliance | Pulsant | Javid Khan

Javid is an outstanding IT expert with a passion for innovation and excellence. He has a deep understanding of IT & Business, with extensive enterprise architecture experience which spans a vast range of technologies and industries including Health, Finance, Energy, Cloud and E-gaming. He has a proven track record of successfully delivering projects both hands-on and as primary technical design authority. As Co-Founder of LayerV, he is able to impart his knowledge, experience, leadership, professionalism and passion across the whole organisation.

Transcription

Today I’d like to talk about compliance in the cloud, and more specifically the challenges of compliance in the cloud. In this example, if we take an on-premise cloud (or your private cloud) and your public cloud, there are four technology layers, broadly speaking. You have your data, you have your application, you have your virtual infrastructure, and you have your physical hardware. Now, from a responsibility perspective, and this is highly referenceable online, you’re responsible for pretty much everything up to the OS. Everything below the OS you don’t have to worry too much about – that’s tackled and handled by the cloud vendor. But if you are thinking and talking about private cloud or on-premise facilities, you have to worry about pretty much the entire stack. Some of the things that you might already have in place to monitor and manage: your security, your controls, maybe some of your compliance tooling around your physical elements, especially on-premise, and around your virtual infrastructure, your network, your CPU, your instances. Maybe a number of tools for application and data. And very similarly, in public cloud you might have the same tool or a different tool to manage your data, applications, and your host more specifically. The burden and the challenge then come with the regulations, the requirements and some of the frameworks you have to align with as an organisation. So if we take ISO 27001 as an example, you might also have GDPR requirements for your infrastructure, and you might want to align with some industry-standard benchmarks like CIS. You might also have some more specific industry-related requirements, such as FCA or PCI, or GxP in pharmaceuticals, and so forth. How do you ensure those controls are implemented across your on-premise and your public cloud? And how do you ensure those controls stay conformant for the duration that you have that infrastructure in situ? We’ve made that a little bit easier. We’ve built the ability to take all those tools that you already have in place, extract that data, consolidate it and provide it in a near real-time dashboard. So you’re able to understand whether your infrastructure is conformant or not conformant to any of these frameworks as I described. So, how do we do that?
We have created the ability to extract data from a number of sources. So in this example, we will use AWS as our public cloud. We might even want to take some Azure, some VMware and some application data, specifically around antivirus, or maybe some bespoke applications. We have something called adapters, and those adapters are able to extract data. We use an adapter fleet, so we have a specific adapter for each of these data sources. We can extract that data, normalize it and centralize it. What we’re then able to do is query against our compliance engine. Our compliance engine has the ability to analyse and query the rules that we’ve created and predefined across these frameworks. So we already have a predefined list of rules, technology rules, specifically for public cloud and for on-premise private cloud. We’re then able to present that information and that data in a dashboard and provide a near real-time view of whether your cloud infrastructure or on-premise infrastructure is conformant or not conformant to the frameworks described earlier. As an option, what we’re then able to do is present that information and integrate it into a SOC (security operations centre), so you can use your own SOC or leverage ours. Ultimately what we want to do is take the alerting that’s been triggered, actually do something about it and remedy it. Once we remedy that alert, it then forms that continuous loop and ensures that the infrastructure is conformant at the next poll. And what we then create is a continuous cycle of compliance for your infrastructure. Now, the benefits of this: it’s highly bespoke. We can take data sources from pretty much anywhere, whether it be on-premise data, cloud data or application data. We’re able to create your own rules and your own policies based on the frameworks you need to align to, and if you have any specific frameworks or policies that need to be aligned with, we’re able to create bespoke rules for that. So what does that mean for you? We’re able to provide you with the confidence that your cloud infrastructure is conformant to those frameworks that you have to align with, as I described in my example. You’re able to ensure that you have a deep understanding of your compliance posture, with near real-time visibility at any given point. And more to the point, it’s compliance made easy, and it’s simple.

### 5 Amazing Ways AI is Transforming Online Courses

State-of-the-art technology demonstrates at least two important things. First, with hard work, the human mind can actualize nearly everything it dreams up. The second thing, which may be even more important, is that intelligence breeds intelligence. The smarter our creations are, the smarter we become. This is the case with artificial intelligence (AI) and eLearning. AI allows us to leverage our LMS software solutions to an even greater extent, create online courses that transcend the boundaries of space and time, and unlock new learning possibilities. In other words, artificial intelligence is transforming the way we learn, and here’s how:

Machines Have (Meta) Learning Abilities

Intelligent technology relies on a concept called machine learning (ML). This powerful capability allows it not only to collect data but also to learn from it. As a result, we now have machines that recognize patterns and make decisions without human supervision.
In the context of online courses, ML ensures a deeper insight into learners’ needs and preferences, as well as their existing aptitudes and skills. It’s like an X-ray for individual learning behavior. But that’s not all. AI-based LMS software can see through anything. Empowered with super-advanced analytics, online educators can now closely monitor their courses, track student progress, and pinpoint obstacles that contribute to knowledge gaps. This is important for the improvement of any learning environment. Thanks to AI, eLearning courses are becoming more effective than before - key insights on learners’ performance allow better knowledge retention, and so does course analysis itself.  The Age of Personalized Learning is Here  The more insight AI helps us gather, the more we realize that the traditional one-size-fits-all teaching model doesn’t suffice in today’s democratic learning environment. What we need is a personalized approach to all learners. The age of diversity is upon us - it’s not only that modern-day students learn at a different pace, but it’s also that they do so in different and highly individualistic learning styles. Instead of the outdated chalk-board method, we have new popular techniques such as gamification and social learning, as well as different devices for distributing online courses. By analyzing learning behavior, AI helps create custom-tailored learning paths within online courses and thus cater to visual, auditory, reading, and kinesthetic learners. Artificial intelligence looks beyond established types to reveal unclassified combinations of learning needs and habits, allowing us to provide blended courses in the true sense of the term. AI Helps Create 10x Educational Content A good book reads us, not the other way around. We should be able to say the same about educational content. But to serve as a motivational tool that boosts knowledge retention, learning materials must be truly remarkable, in addition to being useful and fact-dense. This means that content must also be engaging and interesting. When in-depth or data-oriented, it must not overwhelm with dry facts. It must be 10 times better than that. AI may not be able to do our writing for us, but it can certainly help us with research. Discovering new learning content should be a breeze for an intelligent machine. Just try to imagine the amount of material it can sift through. With powerful LMS software, educators across the world will soon be able to access every single source of knowledge in both the online and offline environment. Digital Assistants Unlock Virtual Mentoring Chatbots are already changing the way we shop online. They are fast, convenient, cost-effective, and consumers simply love them. Though they are less frequently used for providing real-time support in education, this application of artificial intelligence boasts exciting new possibilities in terms of interactive learning. Being mobile-responsive and AI-based, these digital assistants can replace educators in online courses as well, where they can offer 24/7 counseling whenever the need arises. Paired with machine learning, a capability called natural language processing (NLP) enables them to understand human language and respond in a conversational way. Thanks to NLP, chatbots can be trained to monitor learners’ behavior, talk to students, provide additional educational materials, and even impart expert knowledge on various subjects. Virtual mentoring can also help personalize learning paths. 
Without any human supervision, AI can learn to give custom-tailored guidance and feedback, thereby boosting learner engagement and productivity.

Automated Authoring, Tracking, & Grading

All of the above benefits - learning about learning, deep personalization, engaging content, and virtual mentoring - can help you build a better and richer learning environment. But can AI help you create an online course faster and without much effort too? Both machine learning and natural language processing, as well as all other capabilities that AI brings to the table, are powerful automation tools that facilitate authoring, tracking, and grading. With a good LMS software solution by your side, creating a course is smooth sailing. The tricky part, however, is that courses must be regularly improved and updated to meet the requirements of modern-day students and cater to the needs of individual learners. This obviously takes time, but AI can help you cut that in half.

Wrap Up

We’ve only scratched the surface of everything AI can do for educators and students who choose online courses as their primary eLearning tool. A few more innovative applications, and we’ll be able to unlock the full potential of LMS platforms powered by artificial intelligence and smart digital tutors. From where we stand at the moment, the future of AI-driven education is looking bright.

### Misuse of AI can destroy customer loyalty: here’s how to get it right

Once just a marketer’s buzzword, the concept of a ‘multichannel experience’ is now industry standard for consumers. Shoppers want to be able to browse a product in-store, complete the purchase online at its best price and availability, and then have it immediately shipped to a desired location. They expect native functionality, consistency, and a flawless experience throughout the process, regardless of the channel. Increasingly, due to mass adoption of Netflix, Amazon and Spotify, consumers also expect personalised recommendations tailored to their individual taste and preferences. A recent report found that 83% of customers expect brands to serve them a fully-personalised digital experience. However, the successful implementation of personalisation is easier said than done. While nine out of 10 UK companies surveyed by Adobe believe AI equates with future business success, only 30% are happy with the role AI currently plays in personalising their customers’ experience. All too often, customers are either bombarded with suggestions for redundant products they’ve already bought, or receive recommendations eerily tailored to private parts of their lives. For instance, retail giant Target accidentally exposed a teen girl’s pregnancy to her father due to the historical data it had collected from the consumer, who had begun to buy certain products indicating a potential pregnancy. And it’s not just creepy ads; just because your data suggests a targeted ad or marketing campaign doesn’t mean it’s the right thing to do. In a particularly extreme example, an article in Brandwatch references an elderly person in an assisted living home receiving a Christmas basket from a local mortuary. As personalisation integrates more seamlessly into our lives, the line between enticing and scaring off the customer grows ever more blurred. Often brands might be doing more harm than good with some of their personalisation tactics, which can severely damage their reputation and customer relationships.
According to a new report, 75% of consumers state they find certain forms of personalisation ‘creepy’, and consumers are four times more likely to dump a brand after a single bad experience. As personalisation has become an operational necessity, how can businesses find the ‘sweet spot’ of creating lasting loyalty while steering clear from the ‘uncanny valley’? Here are three tips to get it right. Brands must become customer & data-obsessed Before retailers consider implementing a personalisation strategy, they must not only strive to understand the affinities of the end user but must also understand that there is a human being at the receiving end of all this effort. Companies can have the best AI system in place, but if the personalisation strategy is not human-centric, and as a result efforts misfire, they may find themselves in a worse position than had they not implemented the system at all. In order to succeed, retailers need to become customer-obsessed. Who is it that is being targeted and what are these individuals’ likes and dislikes? Retailers need to track every single customer touch point across all channels, merge that information with relevant customer data from other systems, automatically interpret all information to determine affinities, and then store it in a central place, creating a single, unified profile for each customer. Since May 2018, retailers have another hurdle to overcome: the General Data Protection Regulation (GDPR), which grants individuals more control over their personal information. GDPR has laid a new groundwork of individual rights which retailers simply need to obey. For instance, the ‘right to erasure’ forces companies to delete all existing data of a user, and ‘the right to object’ stops all permission-based personalisation efforts for that user. However, GDPR is no threat to personalisation. It should instead be considered an opportunity to refine your marketing content to deliver the most value to the unique user experience. The power of machine learning To be truly effective, companies need to apply machine learning to the vast amounts of data at their disposal. While previous approaches to personalisation were based on traditional rule-based principles, segmenting customers by predetermined criteria, machine learning algorithms can ingest and analyse ‘big data’ and then make suggestions suited to each individual user. Instead of giving the computer specific rules to follow, it’s programmed to learn as much as possible about your consumers in order to select the experience most likely to appeal to them. For instance, a basic machine learning platform might analyse shoppers’ browsing behaviour and address and leverage that information to make statistically-supported, best-effort inferences about other characteristics of that user. The ability to easily do this - and more - at extraordinary computational scale enables retailers to deliver far more engaging interactions than traditional rule-based approaches could ever deliver. The importance of in-the-wild testing Once retailers have built up a picture of their customers with help of machine learning, they need to create effective engagement strategies using transparent messaging. Brands can decide how and when to engage with their users, and which messages and experiences to deliver throughout the entire customer journey. However, the key to ensuring a seamless customer experience is having a rigorous testing process in place that ensures the testing remains human-centric. 
The problem is that regular testing approaches – traditional QA labs, for instance – seem simply outdated in the age of global, omnichannel commerce. Relying on the traditional approach often results in critical issues going undiscovered, because even the best in-house QA team cannot accurately reflect the complex ways that real customers interact with technology. As a result, it becomes increasingly important to put the consumer at the centre of all testing by using real people on real devices. At Applause, we call this 'in-the-wild' testing. In-the-wild testing means real people – your customers, your competitors' customers, your prospects – in the very markets you serve, providing actionable feedback on any part of the digital and physical customer journey. This type of testing perfectly complements machine learning-driven personalisation efforts, as it enables you to promptly validate proposed technical or advertising strategies with real customers (testers) before they get into the hands of your wider customer base. With an effective approach to in-the-wild testing, you can capture results that far better reflect the realities of your real users' experiences than is ever possible with just an in-house QA lab. The emergence of personalisation is one of the most exciting developments in retail technology. Once a unique competitive advantage over others, it is now becoming the norm and has reshaped how brands engage and communicate with their consumers, building lasting loyalty along the way. However, as personalisation integrates more seamlessly into our lives, retailers need to find the 'sweet spot' between satisfying the customer's demands and steering clear of the uncanny valley. The only way to ensure this is by rigorously testing using a human-centric approach.

### Smartdust coming to a computer near you

Micro computers have various benefits, including their ability to be embedded into product packaging to monitor supply chains as well as their medical use, whereby they can enter the human body and collect information regarding a patient's health. As with all technology, the downfalls revolve around who has access to it. If a miniature computer ends up in the hands of criminals and hackers, its microscopic nature could benefit the trade of illicit goods rather than prevent it. As technology continues to decrease in size, it becomes more challenging to monitor its use, as by its very nature it is easy to conceal. It is extremely important to understand the risks as well as the benefits of shrinking technology. There is a competitive side to the development of microscopic computers, as there is with any instance of ambitious companies fighting for the title of "the world's greatest..", "the world's first.." and now "the world's smallest..". For example, in 2015 the Michigan Micro Mote was created by University of Michigan faculty members David Blaauw, Dennis Sylvester, David Wentzloff, Prabal Dutta and several tech-savvy graduate students working at their own start-ups. The Michigan Micro Mote (also known as M3) measured 2mm across and contained a processor, system memory, solar cells to power the battery, temperature sensors, a base station, and wireless transmitters. Once the scientists at the University of Michigan heard that IBM had announced an even smaller device, their focus switched to reclaiming the title of "the world's smallest computer". They went on to release a new device a tenth of the size of IBM's 1mm x 1mm computer.
Understandably, if someone came across this invention and had prior knowledge of IBM's "smallest computer" they might assume that the scientists behind M3 just wanted to overthrow IBM and reclaim the "smallest computer" crown with little regard for its actual use. But in fact, the minuscule size of the invention is not just an effort to reach a new record surpassing IBM. The small size aids its deployment in the medical field (being able to detect ailments in the human body) as well as its use in tracking the movement of merchandise in a supply chain to ensure the items are legitimate, safe and legal. One of the challenges the team at the University of Michigan came up against in making a computer a mere tenth of the size of IBM's was how it would run with little power when the system casing had to be translucent. The light from the base station and the device's LED generate currents within the device's microscopic circuits. Another challenge they faced was how to maintain the device's accuracy while running on minimal power, which makes many of the common electrical signals (such as charge) louder. According to David Blaauw (one of the M3's creators), they had to invent a new approach to circuit design whereby the device would run on limited power as well as being able to withstand light. The technology they have developed is incredibly versatile and is adaptable for various uses. The team, though, has focused on temperature measurements because of their ability to help understand and detect tumours. For example, research has suggested that tumours have higher temperatures than healthy tissue, and a device such as the M3 can help to confirm or disprove this notion as it is miniature enough to enter the human body. The scientists at the University of Michigan have already used the device on mice. Gary Luker, a professor at the University, said that "since the temperature sensor is small and biocompatible, we can implant it into a mouse and cancer cells grow around it". This emphasises how life-changing and even life-saving this technology could be, as it has the properties to potentially recognise ill health in a patient. Although, so far, the Michigan Micro Mote is the only 'complete computer' of such a small size (with IBM following closely behind), small-scale tech devices are not especially unique items in the contemporary world. For example, in 2015, a Swedish company offered its employees tiny chips to inject into their hands. Instead of scanning ID cards to access their workplace, workers can move their hands over scanners. The RFID (radio-frequency identification) chip is around the size of a grain of rice, and the chip also gives employees access to photocopiers. Understandably, the technology had mixed reviews, but its deployment in the workplace shows how brave and innovative tech developers have become, showing a willingness to enhance everyday life with the addition of wearable technology. With the production of microscopic technology comes the risk of human error and corruption. Unfortunately, such technology will inevitably entice hackers and criminals wishing to track objects or even the movements of people for felonious purposes - taking an innovative and positively transformative piece of tech and moulding it for illegal activity.
As long as security remains of the utmost importance for creators and providers of such technology, there should be no issue, but as with all inventions, these devices carry risks, and users and developers of such products should be aware of these risks at all times. Although requiring a licence, background checks and training may seem excessive, the level of risk is far greater than the tiny size of the Michigan Micro Mote and similar items would suggest. Measures such as these may become necessary as miniature technology expands into widespread use.

### Why even the smartest fintech firms still need humans

Algorithms, automation and machine learning are trending towards one thing: reducing or eliminating the need for human labour. But in the world of fintech, humans aren't just necessary - they're irreplaceable. Thirty years ago, it was perhaps unimaginable that enormous financial decisions could be made without speaking to a human at any point in the process. We've grown accustomed to (and in fact welcomed) the fact that banking, mortgages, and insurance can be conducted entirely in a digital manner. But, despite society facing an unprecedented level of technological sophistication and increasing innovation, there's an integral requirement for human intervention, both on the surface and behind the scenes.

Within the business

Although the tech is undoubtedly the focus of any fintech, the humanistic element is integral to the delivery of their product. It's not just the obvious need for human developers to write, maintain and optimise code, nor to imagine the products that make the code necessary. Even if these processes could be automated, the need for humans would remain. That's partly because 'build it and they will come' doesn't apply to the noisy, competitive world of fintech. Even the most disruptive, compelling and revolutionary products risk adding more noise than signal. And so, all things being equal, it's shrewd branding, marketing and PR that will help a product cut through – all things that require unparalleled human input. Online mortgage brokers, for example, face unique challenges in convincing customers to switch from traditional brokers and loyalty to their banks or existing lenders. This is just one area in which human empathy and creativity can't be replaced or emulated by technology.

Consumer attitudes

Consumers can be understandably reluctant to trust small, relatively unknown startups with their personal and financial data, particularly if it involves one of the biggest purchases you can make – which means some fintech firms have a harder job than their more established rivals. A survey of 2,000 consumers conducted by TopLine Comms found a massive 83% were 'unsure' of fintech companies and how they work. Interestingly, more than a quarter (27%) cited a lack of understanding as the reason why they were unsure. Fintech firms can only earn trust by understanding and addressing customers' concerns, or educating consumers through marketing, communication and branding – and this empathy can't be automated or faked using technology. But that's just acquisition. Customer experience is what turns a sceptical user into a customer, and a hybrid approach that uses tech and human interaction may be the key to the best customer experiences in fintech.
To use the same example, fintechs in the mortgage space can use tech to free their human advisers from the laborious and time-consuming parts of the job so that they can spend more time finding the best deal for their customers and building a relationship that fosters trust. These relationships build advocacy, which can in turn convert others. In a survey by Podium, 93% said online reviews affected their decision to make a purchase or not. If positive relationships translate to positive reviews, the human element of the customer experience has a multiplier effect. Of course, for some, the very appeal of an online alternative to a traditional service such as a mortgage broker is that they won't have to deal with a human. Issues around creditworthiness and complex financial situations can be sensitive, and customers in these circumstances may prefer not to speak to someone. Entirely digital experiences, perhaps augmented by chatbots that use Natural Language Processing (NLP) and machine learning, could be the best solution for them. In reality, both kinds of consumers exist – and so a hybrid model that uses tech and human interaction to improve the customer experience may be the most pragmatic approach.

'Human touch'

Richard Hayes is the CEO and Co-founder of online mortgage broker Mojo Mortgages. Richard's experience tells him customers still want to deal with humans, but he knows how much tech can improve customer experience. He said: "There are customers out there that want to transact digitally, and there are also ones that want a human touch. Particularly with complex financial products such as mortgages, we've found there are also customers that want a hybrid of the two. Our balance between human and technology allows us to create the perfect mortgage experience for consumers, thereby giving them a greater sense of confidence when buying."

The future

As fintech goes mainstream, consumer attitudes may change. As tech gets more sophisticated and ubiquitous, people may start to feel less apprehensive about trusting new companies with their data. The advent and widespread adoption of Open Banking could enable this paradigm shift. But in the meantime, fintech firms will still face the twin challenges of making themselves stand out in a crowded marketplace and converting sceptical consumers. In both cases, humans will remain an essential and irreplaceable component of any successful fintech.

### Beyond The Serverless Trend, Why Conversational Programming Should Be Your Next Focus Area

Container Wars, Serverless and No-Code are the 'en vogue' buzzwords in the technology market today, but if you are not competing in this space now, where do you look to leap ahead and regain that crucial competitive advantage? One area that has caught the attention of many is Conversational Programming. Once the stuff of science fiction, the idea that people could talk to a computer and have it comprehend what they wanted and create it from voice commands, with limited or little instruction, inspired the same level of awe and wonder as watching characters in TV programmes such as "Star Trek" make video calls from one side of a planet to the other, all without wires. Both are now reality, albeit that Conversational Programming is still in its infancy. But what is it?
Conversational Programming can be defined as a programming capability that helps you build infrastructure, storage instances, platforms and applications by tracking the state of what you are building. This includes user selection, as a context for interpretation, and it constantly runs your program in order to give you informal feedback and improve it. This enables greater visibility of how your code is working to build more efficient and cost-effective capabilities, as you are better equipped to map the costs of each stage of the application lifecycle and workflow. To give this context, let's look at an exclusively AWS example in order to simplify the proposition as much as possible. Taking an Amazon Alexa, it is possible to build a set of skills that can enable an application to be built from the Serverless Application Repository, as all of the component parts can be defined as services within it. By mapping each of these component parts, it is possible to use the micro-economics of each stage of the workflow to see its cost and value. This can greatly increase the speed at which capabilities can be developed, but it relies on two key things to be successful. Firstly, understanding the cost and value implications. Aggregating the entire workflow to define its value and cost creates a capital cost flow, or more simply put, a way of seeing how Revenue – Cost = Profit in what you are building. Given that there is ever-increasing pressure on IT leadership to control and manage costs, whilst at the same time a rising awareness that spend in cloud is not as easily defined as traditional IT, the ability to map capital flows is a critical component for CIOs to understand and explain the value of what their teams are delivering. CFOs, justifiably, want proof of investment and CIOs must deliver that evidence. However, they cannot do it alone. The future leaders of IT organisations sit in DevOps and architecture teams today, and they must be taught the cost implications of their design and decision making in addition to the technical ones from the outset, to equip them for today and the future. Secondly, the quality of the code and components in the Serverless Application Repository has to be good enough. Fundamentally, this is about good IT practice. Components must be in a mature state, and any changes to them could therefore mean significant impacts across multiple capabilities. This is where a 'go slow to go fast' attitude actually is beneficial, ensuring that unintended consequences of change do not bring havoc. It is a delicate balancing act. In summary, the pace of technology development shows no sign of slowing down. The issues of non-traditional IT practices are still maturing and that leaves uncertainty; however, those that are actively engaging in this space are gaining the competitive advantage as they are forcing good IT practice (cost) to drive better outcomes (value). It is a long-term play, but one that will ultimately provide better evidence to CFOs, and in the meantime will demand close management.

### AI-driven learning essential for workers' digital transformation preparations

As artificial intelligence (AI) becomes increasingly prevalent in modern society, the technology's implications are becoming easier to understand. One of the areas that will be most affected by AI is employment. There are clear signs that business is becoming aware of this opportunity.
In a survey of 3,000 business leaders conducted by Boston Consulting Group and MIT Sloan Management Review, 75% of executives believed that AI would enable their organisations to move into new areas of business, while 85% felt that AI offered them a competitive advantage. However, the impact on the workforce is a mixed picture. According to PwC, AI could displace around 7 million existing jobs by 2037, although the accountancy firm predicts this could be offset by the creation of 7.2 million new jobs. Existing jobs will also see radical change. The OECD has stated that 32% of workers in the world’s richest countries face significant upheaval and would see their tasks changing considerably due to AI. What is clear is that AI will result in seismic changes to the employment landscape. In the new working reality, fostering a culture of continuous learning to adapt to the changed, displaced or newly created jobs will be critical. This new world will be one in which a career is an education in itself. At the same time, continuous learning is also vital for companies to keep abreast of innovation, stay ahead of their competitors and retain top talent. The secret to delivering the new continuous learning workplace? AI. AI is key to preparing workers for the automation transformation In order to seize this opportunity and embed a culture of continuous and adaptable learning to respond to the automation transformation, companies will need to encourage much greater collaboration and knowledge sharing within and without their organisation, drawing on the latest and best content and tools from across business and society. Key to this will be AI-driven technology platforms to support personalised, automated continuous learning. There is now a wide acceptance of learning technology and a renewed focus on learning outcomes as a means of driving organisational performance and, ultimately, revenue. However, in a study carried out by Brandon Hall Group, 58% of companies surveyed described their current learning system as outdated and not fit to meet their business needs today, let alone in five or 10 years when automation takes hold across the economy. Another problem organisations will have to address is the nature of learning content itself. Spherion Staffing’s 2016 Emerging Workforce Study revealed that 45% of workers felt that their company’s existing training offerings weren’t relevant to their daily responsibilities. To provide learners with appropriate knowledge that will help them advance their career in an AI-led world of work, firms will need to identify learners’ requirements and close any skills gaps. Managing this transformation and delivering value from the opportunity presented by AI will require new skills across the workforce. Workplace learning and development will be critical to delivering the skills business needs from the workforce in a fast, efficient and cost-effective way. There simply isn’t going to be time to wait for new skills to arrive from new employees, or the existing human and financial capital to procure skills from the market. Personalised training to reskill workers Making AI the engine that powers the whole employee learning and development process will allow companies to develop more immersive and personalised learning experiences while allowing automation for menial tasks. AI will become the trainer! 
As with all AI-driven processes, the effectiveness of AI-enabled learning depends on how people actually use the system: the more data the system processes, the more the machine learns about individual learner needs, turning the learning platform into a continuous improvement engine that grows alongside learners. As each employee is different and will need different skills to adapt to different challenges, they need personalised content, accommodating personal preferences and learning styles. Delivering such personalisation "at scale" is impossible without AI. Personalised learning involves handing a degree of control over to learners themselves, giving them input into how they progress and develop. For example, learners identifying a particular personal skills gap can receive targeted recommendations and training. An AI learning system can also automatically recognise an individual employee's skill set, enabling it to provide a more comprehensive and less linear learning journey compared to previously human-coordinated training programmes. AI will make learning personal. It will make learning relevant, with insights based on data, on user behaviour and on preference, changing enterprise learning's status quo with its ability to deliver automated and personalised learning. In a world driven by automation, making sure the human workforce keeps pace will increasingly rely on the same underlying technology that is transforming the employment landscape.

### How to avoid tech paralysis

As a business, it's important not to become set in your ways. If things are going well or you're unable to identify which technologies suit your goals, then it's tempting to keep the company's technological methods as they are. However, this steadfast approach could result in a lack of efficiency and your competition overtaking you as you increasingly rely on outdated technologies and software. The truth is, a lot of businesses fail to identify the tech that truly benefits their workforce, which ends up standing in the way of efficiency – as employees lack the appropriate and adequate training when it comes to software, systems and apps. Here, we'll discuss the topic of tech paralysis, touching on the reasons why businesses still use antiquated systems, where businesses can improve, and what a company with a grasp of modern technology actually looks like.

Why do some businesses suffer from tech paralysis?

Copying their competitors. If a competitor is doing something right, then surely using the same tech they have implemented will automatically mean success for you and your business? Not so. There is no one-size-fits-all approach to business; what's most important is to sit down and talk with everyone in the business, identify where current systems are letting them down, or not being used to their full potential, and research the best solutions.

Cost, effort and time. If you're planning on developing and testing a new system, then you'll have to factor in the cost, time and effort that it will take to carry out properly. It can be a significant drain on a business' resources; even when the new system is up and running, you have to consider the cost it takes to train all members of the team to use the technology, too.
Whilst the long-term benefits of updating tech systems may be obvious and acknowledged by stakeholders, they’ll also be constrained by short-term targets – the cost of which could mean that businesses can’t feasibly invest in long-term goals. Fear of investment Along with the cost of new technologies, some businesses remain sceptical when investing in new software and systems because of the perception everything will soon be out of date, and they’ll be back at square one once again. Like a lot of new things, these fears can often be unfounded. If you’re looking to err on the side of caution, then take the pragmatic approach and invest in software and systems which are scalable. Programs and tech such as these offer opportunities to continuously update and improve as technologies develop and the business grows. What does a business with the right tech look like? A business truly using tech to their advantage is identified by their workforce employing tools, systems and software to effectively manage their responsibilities. For some organisations, it may be important to offer a suite of different tech options, so members of different departments can access and utilise the tools which best suit their roles and responsibilities. For example, customer-facing personnel may yield greater benefit from a different system to those working in internal communications. Rather than forcing one department to adapt to a less suitable option, it is preferable that both are available for the teams to use at their discretion. How can businesses improve on their tech? There are simple tests businesses can use to evaluate how they use technology. The most basic process would be to evaluate lead times on tasks, projects and campaigns. If a particular job is taking too long to move from inception to completion, then it could be inhibited by the supporting tech. Consider the Kanban methodology, which is used to manage the creation of products, with an emphasis on continuous delivery. Utilising this method of task management can highlight the stages at which tasks are routinely hampered, and help you identify where a solution could be required. By viewing all items in the context of each other, it helps to limit the amount of work in progress, so teams don’t commit to too much work at once. Then, when something is finished, the hierarchical structure you’ve identified means the team can work on the next highest thing in the backlog.[/vc_column_text][/vc_column][/vc_row] ### Should robots be marketed to children / teenagers ? As the market floods with new gadgets and technological possibilities, excitement can override any consideration for potential issues that could come from these new releases, including a careful approach to whom the devices are being marketed. Of course, parents play the most significant role in managing what their children have access to, but with seemingly harmless robots being sold with children in mind many parents/guardians do not know of or see potential problems associated with the technology. There are many positive reasons to embrace robots and artificial intelligence, but rarely will an advertisement disclose anything beyond the positives about the products. Robots can provide children with entertainment, education, and can even act as substitute child-minders. 
Although everything about this is incredibly convenient and could help busy parents relieve themselves of guilt when they cannot occupy their children, whether these children are missing out on crucial interpersonal skills development and human contact is something that must be considered. There is a constant barrage of articles and news items released regularly that criticise contemporary parenting and the way in which screens have replaced other forms of stimuli such as reading or talking to other people, but there is a lack of criticism aimed at the companies that produce and market these robots and simulated experiences to children. It is not unusual to see a toddler sat in a pram clutching an iPad as their hardworking parent dozes off on public transport, but rather than simply blaming parents, it is important to identify how these devices come to be available to children. Of course, more often than not, parents buy these technological products for their children, but I have no doubt that the advertising of these devices and the ease with which they can be purchased plays a huge role. It is inevitable that children will be exposed to technology, and for the most part, it is a great thing, as they can learn an enormous amount from the click of a button or by asking their bot a question. When the balance between technology and human socialisation is achieved, children can grow into well-informed and social people, but when they are left for too long with nothing more than a robot for company, their communication skills will inevitably suffer and they will not reach their developmental stages (e.g. empathy) at the same rate as more socialised children do. San Francisco robotics startup Anki released a robot called Cozmo, designed to manifest some emotional engagement for child users. The robots are pitched as little buddies. Many will sincerely doubt the emotional capabilities of a robot, but robots dating back to the 1960s have been able to manipulate the emotions of humans. For example, an MIT creation, a computer system called Eliza, acted as a psychologist, asking people questions about their feelings. As a result of this emotional interaction, people started treating Eliza as if she were human. Cozmo is small enough to sit in the palm of your hand, costs a mere £175, and is marketed towards children and adolescents. The cost and strategic marketing towards a younger audience make this an ideal and easy gift for a child. The robot has been programmed with an 'emotion engine', which means Cozmo can respond to environments and circumstances in a manner similar to a human. Cozmo displays emotions ranging from happy and content to annoyed and gregarious. If you agree to play a game with the robot, its eyes will transform into upside-down U's to show happiness. The cynic within me sees this as emotional manipulation to sell a product, and I highly doubt such a product can replace emotional interaction with other humans. The technology of these robots is incredible and shows the enormous capabilities of artificial intelligence, and there will be some lonely children who will find real solace in their relationship with these bots. Research into Socially Assistive Robotics (SAR) has found benefits through therapeutic applications for children with developmental disabilities, like autism, as well as helping to personalise health education for children with diabetes.
On the other hand, I find it problematic that such devices are being marketed as “like humans” because it should not be suggested that robots can replace human interaction. Robots should be embraced as a separate entity from humans, and only embraced in moderation in childhood. As more regulations are put in place to determine a product’s psychological and developmental suitability to children, robots and other advanced technological devices will be evaluated for the impact they have on young minds (good and bad). There are definite areas in which robots can benefit children, including monitoring health and helping children with developmental conditions to interact and learn. Bringing robots into the home for children to use will be much more in-line with the needs children have; eliminating any negative impact the technology could have. ### How to Improve Organisational Resilience with the Cloud Given the pivotal role IT infrastructure plays in the functioning of organisations today, any downtime can be costly. Therefore, the stability of the IT backbone which includes the cloud as an important component plays a huge role in building the organisation’s resilience. For instance, how quickly can the business resume in the face of any unforeseen event, whether a natural disaster or social unrest? How can it cut the risk of crucial data getting corrupted? How can it safeguard itself against deadly cyberattacks? The pandemic is rapidly accelerating enterprise adoption of digital technologies, particularly cloud solutions – projects that are expected to take years are being completed in just weeks or months. One report by Markets and Markets found that the cloud market size is expected to grow from USD 233 billion in 2019 to USD 295 billion by 2021 at a Compound Annual Growth Rate (CAGR) of 12.5% during the forecast period.  With most organisations embracing cloud infrastructure, some of these risks are mitigated to a certain extent. However, cloud also brings in its own set of risks that organisations need to address. Cloud deployments add greater complexity and volatility since they typically involve large-scale transaction volumes, open architecture, and multiple vendors. Then there are challenges around managing the synchronization between cloud and legacy environments and ensuring the availability of network connectivity.  In such a scenario, organisations need to take specific steps to build business resilience in a cloud environment. They need to take accountability for business outcomes by creating a comprehensive strategy that handles everything from provisioning, day-to-day management of a multi-cloud environment to driving innovation at scale using cloud capabilities.   Understand Critical Workloads  As a first step, there needs to be enough visibility into the types of workloads, especially customer-facing ones, that are residing on the cloud. This helps understand the impact of downtime on business continuity. For instance, if an e-commerce application runs on the cloud, it is critical to make sure it is available all the time. Since the application is a revenue generator, any downtime immediately impacts the bottom line, not to mention the negative effect on brand reputation.  Build a Robust Strategy for Business Continuity Anticipating issues that could result in downtime and having strategies to address them quickly and effectively is essential. Companies need to proactively identify security vulnerabilities and have mechanisms to mitigate risks. 
Best practices and plans to ensure quick recovery in the face of any problem that might bring the network down can go a long way in establishing resilience. Choosing an internet-only network or a software-defined network can ensure connectivity at all times, mitigating downtime significantly. One of our clients required that their infrastructure was assured of 99.999% uptime, which translates to a maximum of just over five minutes of downtime through the year. Designing such a solution required us to work closely with hyperscalers to manage configuration, storage, and other operational resilience parameters in great detail. From the security perspective too, it meant putting in place the right patches and service packs and also deploying identity and access management (IAM) solutions to manage security incidents.

Manage Legacy Infrastructure

Legacy infrastructure brings in its own set of challenges, with three key points that need to be addressed. First is identifying and addressing single points of failure that can directly impact the running of critical infrastructure. The second is to watch out for end-of-life or end-of-support infrastructure that is highly vulnerable since the original manufacturers no longer support it. The third is to ensure that all the existing systems are up to date on patches. All hardware and software must have the latest patches, either N or N-1, to be effective.

Choose the Right Cloud Solutions

Different cloud providers bring in various capabilities and strengths. Therefore, identifying the nature of the workload and the intended applications can help identify the right cloud provider to support an organisation's digital transformation journey. For instance, if the workload is purely for collaboration, then a certain cloud provider might be better suited for this application. If the workload consists of an e-commerce application, a different vendor might be a better fit. A consumer products company wanted to diversify and rapidly grow its global younger customer base. One of their key challenges was the prevalence of varying demand and seasonality across geographies. The solution was to consolidate their various fragmented individual ERP platforms into one IT platform that could provide dynamic compute and analytical capabilities. The group made a corporate decision to move to a common SAP S/4HANA platform. Infosys focused on a service offering that brought in business process consulting, SAP consulting and implementation, and cloud consulting and implementation services, seamlessly integrated into a solution. The solution resulted in a standardised business process and a unified IT portfolio that facilitated seamless business collaboration across 120 countries and a centralised purchase system, leading to a significant reduction in operational expenditures. These changes added growth and new business to the company. Having a multi-cloud approach to help manage all the disparate workloads is also crucial in such a scenario. Choosing a technology partner with deep relationships with cloud vendors, the ability to bring the right cloud assets, and industry cloud blueprints goes a long way in driving speed to market and enabling innovation. We recently launched Infosys Cobalt - a set of services, solutions, and platforms that acts as a force multiplier for cloud-powered enterprise transformation.
Infosys Cobalt helps businesses redesign the enterprise, from the core, and build new cloud-first capabilities to create seamless experiences, amplify innovation and accelerate speed to market in public, private and hybrid cloud, across PaaS, SaaS, and IaaS landscapes. And this, we believe, will help accelerate improvements in organisational resilience on the cloud. Most organisations today are highly dependent on their IT infrastructure to run smoothly and ensure business continuity. Therefore, having a well-thought-out plan in place to identify, foresee, and address causes of downtime is extremely critical to building greater business resilience.

### Why Mass Data Fragmentation Matters

The move to the cloud has brought about a revolution in the IT industry. A recent survey from IDG found that 73% of organisations have at least one application or a portion of their computing infrastructure in the cloud already, and that is expected to grow to 90% in 2019. But the cloud isn't necessarily a recipe for success. In fact, in many cases, the proliferation of multiple cloud applications for storage has provided IT teams with major headaches when it comes to data storage and management.

Mass data fragmentation is a business threat

It's an often-repeated mantra nowadays that companies are using data to 'unlock' value for business. Although data certainly can be valuable, it must be managed correctly or else it may just end up becoming a hindrance and an obstacle. Unfortunately, data management is a bigger challenge than it seems on paper. We all know data is being created in volumes that are almost impossible to comprehend. 90% of all the data ever generated has been created in the past five years, and this growth shows no sign of slowing down. But data growth isn't the only issue to keep in mind. A phenomenon called mass data fragmentation (a technical way of saying "unstructured data that is scattered everywhere, in the cloud and on-premises") is the issue that seems to be wreaking havoc on businesses around the globe. With mass data fragmentation, massive volumes of what's called secondary data are stored in a dizzying array of legacy infrastructure silos that don't integrate and are therefore incredibly challenging and costly to manage, burdening an already over-burdened IT staff. Instead of driving digital transformation, data that's housed in this fashion becomes a hindrance to it. In case you are wondering what secondary data is, it's all the data that's not considered primary data. Primary data is your most critical data, data with the highest service level agreements. Secondary data includes data that's stored in backups, archives, file shares, object stores, and data that's used for testing, development and analytics purposes. You may be surprised to know that secondary data comprises roughly 80% of an enterprise's total data pool! Mass data fragmentation is not just a management challenge. The repercussions of mass data fragmentation can be much worse. All this fragmentation makes it incredibly difficult to know what data you've got, where it's located and what it contains. This raises huge compliance vulnerabilities, which of course can lead to fines and reputational damage. More so, it's nearly impossible to drive any meaningful insights from all that data. And, remember, we're talking 80% of a company's overall data pool.
If you can't get the right insights from your data at the right time, that can lead to major competitive threats and lacklustre customer experiences. Let's probe this further and look at recent survey data that shines a bright light on what's happening within enterprises globally. Cohesity surveyed 900 IT leaders at global enterprises across the UK, US, France, Germany, Australia and Japan, with an average revenue of $10.5bn.

Data silos create real management challenges

According to the study, close to a third of organisations use six or more solutions for their secondary data operations, and most IT managers store data in two to five public clouds. By using multiple products and the cloud to manage data silos, the burden on IT becomes staggering, and the resource drain starts to creep upwards. Additionally, 63% of organisations have between four and 15 copies of the same data. And as IT struggles to keep up with this dizzying array of copies, the cost to store all these copies also rises, and additionally begins creating compliance challenges for organisations, as we referenced above.

IT teams are being stretched

87% of IT leaders believe their organisation's secondary data is or will become nearly impossible to manage long term. According to the survey, if IT is expected to manage all the organisation's secondary data and apps across all locations and technology isn't in place to accomplish that goal, 42 percent say morale would decrease, which is important to keep in mind at a time when finding and hiring top talent is increasingly difficult. As much as 38% of IT leaders fear massive turnover of the IT team, 26% fear they (or members of their team) will consider quitting their jobs, and 43% fear the culture within the IT team will take a nosedive. These findings offer a clear indication that IT teams would prefer to be working on something else – ideally, delivering innovation to the business rather than trying to manage a complicated web of data silos that's only getting more complex over time. Of those respondents who believe their data is fragmented, 49% of the IT leaders surveyed expressed that failing to address the problems of mass data fragmentation will put their organisation at a competitive disadvantage, while 47% believed that customer experience would suffer. Moving forward, almost 100% believe it will consume significantly more time – up to 16 additional weeks per year – without more effective tools.

Resource reallocation could have a huge impact on business

91% of senior IT decision makers said that if half the amount of IT resources spent managing their organisation's secondary data were reallocated to other business-critical IT actions, it could have a positive impact on revenues over a five-year period. Of those who share this belief, nearly 30% of respondents believe this adjustment could increase revenues by at least 6% over a five-year period, while nearly 10% believe this could impact revenues by 8-10% or more. With average revenues of companies in the survey above $10bn (€8.7bn), that would translate to roughly $800m to $1bn (€885m) in new revenue opportunities. The search for a solution has been driven by the desire for a holistic view of a company's secondary data on a single platform. To lead this process, organisations need to be able to consolidate data silos onto a single hyperconverged web-scale platform. Through this platform, they can then take control of their secondary data and immediately address and solve challenges pertaining to mass data fragmentation.
With the pressures of digital transformation firmly placed on every business, organisations need to determine how they can take more of a data-first approach, similar to Amazon, Netflix and other data-driven companies. This can create new opportunities to differentiate against competitors and delight customers while advancing your bottom line. For more information and to learn more about the issues covered, view the report here.[/vc_column_text][/vc_column][/vc_row] ### What To Do If You’ve Been Hacked [vc_row][vc_column][vc_column_text]While there is plenty of advice available about what to do to prepare for a cyber attack and how to shape your security to minimise the risk of a data breach, what do you do if you realise that you may have already fallen foul of malicious activity? Becoming the victim of a hack is increasingly common, but that doesn’t mean that victims suffer any less worry and confusion. This article offers some advice on what to do if it looks like the worst has happened, including how to minimise the damage and how to prevent future breaches. Changing passwords If you suspect a breach, don’t wait for ransomware to lock you out of your system. Where there are red flags, decisive action will help to minimise the damage. One of the simplest methods hackers use to get access to accounts and networks is also one of the easiest to prevent. Most people have email addresses that are regularly handed out. If this is coupled with a weak password, such as ‘password’ or ‘123456’, your account is like a door left on the latch, and requires minimal effort for a hacker to gain access to. “Credential stuffing“ is rife, with 90% of online retail login activity thought to be hackers trying their luck. If you have been hacked, chances are it’s due to a weak password and so one of your first steps should be to replace all passwords with secure alternatives. As well as complexity, the key to an effective secure password is to ensure they are not reused on multiple accounts, not matter how convenient this might be. A simple solution to managing all of these new complex passwords is to use a password manager. These are tools that can remember all of your passwords (and many also help to generate new, complex passwords) meaning you will only need to remember one password rather than dozens. Better yet, you could also opt to sign in using biometrics on your mobile devices. Monitoring accounts Attacks will rarely stop at one account. If they have had some success, hackers intent on identity theft will likely try to gain control of other accounts. An attack on one area should always be considered a signal to double-check the security of others. This includes checking your email to make sure it isn’t being used to forward information, and ensuring that social media accounts are not being used to contact people you follow. Even if you have lost control of an account, it can still be recovered. For many major accounts, such as those held by Google, Facebook, Twitter or Apple, there are account recovery tools or processes that will help you to regain control by proving your identity. It isn’t just email and social media accounts that need to be checked, however. If your details have been collected there is a possibility they will be used to try and make unauthorised purchases using your bank account or credit card. So, keep a close eye on your transactions and make sure that you contact your bank swiftly if you spot any unauthorised purchases. 
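Picking up the earlier point about replacing weak passwords with secure alternatives: the short sketch below is a hypothetical illustration using Python's standard secrets module; the length and character set chosen here are arbitrary assumptions, not a recommendation from this article. It shows one way to generate a strong random password when a password manager isn't generating one for you.

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Build a random password from letters, digits and punctuation.

    Uses the cryptographically secure secrets module rather than random.
    """
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

if __name__ == "__main__":
    # Print one candidate password; store it in a password manager rather than reusing it.
    print(generate_password())
```

As noted above, a dedicated password manager remains the better long-term option, because it also solves the problem of remembering dozens of unique passwords.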
Revoke third-party app permissions It may be increasingly common and would definitely have been convenient at the time, but signing up for apps or tools using your Google or Facebook credentials is a shortcut that is often not worth taking. Not only could hackers gain access to these accounts, but they could create new accounts without your knowledge, which they would retain control of even if you change your passwords. If you have been breached, it’s important to remove the permission for every app you have connected to using this sign in method, and they should only be reinstated when you are sure that your accounts are secure. Wiping devices and restoring from backups If it looks as though you have suffered an attack, the safest step is to assume that your devices have been compromised and to run antivirus and malware scanners across your network to make sure there are no nasty surprises waiting for you. If the scan results reveal that your devices have indeed been compromised, the best course of action is to wipe your hard drive, reinstall the operating system and restore your sensitive data from a backup. In most cases, keeping a regularly updated backup is as important a step in seeking to protect against threats as strong passwords and antivirus software. For example, a device that is infected with malware may have its contents held hostage until a fee is paid, which might sound disastrous, but with a backup in place, the only inconvenience would be the few hours it would take to wipe the device and restore your sensitive data from a backup. Prevent future breaches Knowing what to do if you suspect a breach has occurred is vitally important, but hopefully, you are not yet in that position. To make sure that your data is not compromised, the best step is to create new practices. There is no setup that will guarantee 100% protection and assuming hackers are always getting smarter, you should be motivated to continue to monitor and improve your security tools and best practices. Keeping software up to date and implementing strong passwords are good steps, but additional measures like a VPN, which will improve the anonymity of your online activity and offer end-to-end data encryption, can help to bridge any gaps in your existing security measures. While you will certainly be an advocate of best practices after recovering from an attack, the best way to prevent future breaches is to tell others about it. Simply by sharing your experiences with friends and colleagues, you will make the threat of an attack feel tangible and real. By shattering the complacency that comes with breaches becoming a daily occurrence, more people will improve their own security and, in turn, make the hacker’s life as difficult as possible.[/vc_column_text][/vc_column][/vc_row] ### Blockchain: Adopting legal frameworks to gain competitive advantage [vc_row][vc_column][vc_column_text]Technology is developing at an ever-increasing pace, particularly in relation to cryptoassets and the blockchain space. With governments still deciding how they will regulate these developments, what can a business wanting to utilise blockchain technology do to protect itself and gain a competitive edge?  Background Blockchain technology came to prominence as the means to facilitate  cryptocurrency transactions. As a distributed, public ledger, secured using cryptography, the blockchain provided a secure environment which overcame many of the hurdles faced by previous cryptocurrencies. 
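To make the idea of a "distributed, public ledger, secured using cryptography" a little more concrete, here is a deliberately minimal, hypothetical Python sketch of a hash-chained ledger. It only illustrates why tampering with an earlier record is detectable; it is not a description of how any real blockchain is implemented (real systems add consensus, peer-to-peer replication and much more).

```python
import hashlib
import json
from dataclasses import dataclass

@dataclass
class Block:
    index: int
    data: str        # e.g. a plain-text description of a transaction
    prev_hash: str   # hash of the previous block, which links the chain

    def hash(self) -> str:
        payload = json.dumps([self.index, self.data, self.prev_hash])
        return hashlib.sha256(payload.encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev_hash = chain[-1].hash() if chain else "0" * 64
    chain.append(Block(index=len(chain), data=data, prev_hash=prev_hash))

def is_valid(chain: list) -> bool:
    # Every block must reference the hash of the block before it.
    return all(chain[i].prev_hash == chain[i - 1].hash() for i in range(1, len(chain)))

chain = []
append_block(chain, "Alice pays Bob 5 units")
append_block(chain, "Bob pays Carol 2 units")
print(is_valid(chain))                        # True
chain[0].data = "Alice pays Bob 500 units"    # tamper with history
print(is_valid(chain))                        # False - the chain no longer verifies
```

Because every participant can hold a copy of such a chain and re-run the same checks, quietly altering history becomes impractical, which is the property the paragraph above refers to.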
People quickly realised that the features of the public blockchain, such as transparency and immutability, could be applied beyond cryptocurrency. A key part of blockchain’s potential comes from the creation of smart contracts, which are self-executing protocols embedded in the blockchain. Smart contracts ensure that transactions which traditionally might have been managed by third parties (such as the purchase of land or other assets, currency transfers or payment of royalties) are automatically executed when certain pre-defined parameters or rules are met. There is considerable excitement around the opportunities this technology provides for reduction in third party fees and transaction times, whilst enhancing customer access and limiting fraud. It is easy to see why a sector based around this new technology is quickly developing.  However, long term success is likely to depend on creating suitable approaches to best practice. Protect your investment Unless you are a skilled developer, you’ll be hiring someone to build your blockchain. You may wish to develop your own proprietary version, or you may intend to create a DApp (Distributed Application) to operate on an existing platform such as Ethereum. Ethereum operates as an open source platform utilising both permissive and restrictive licences, so it is important to understand how the licensing model will fit with your business model. In simple terms, the blockchain is software.  You should therefore have a contract in place with your developer, as you would for any other software-related project. Contract terms such as payment schedules, acceptance criteria and termination clauses (not to mention a VERY detailed specification) are essential to ensure you get what you expect and are paying for. A smart contract is still a contract and is therefore subject to established legal principles. You should ensure that there is a clearly documented record of the terms that will be coded into the smart contract, which may need to be negotiated if there is more than one party involved.   Having such contracts in place will also demonstrate the legality and functionality of your business for customers and investors. Intellectual property Consider carefully the elements of intellectual property involved. If you are paying for the creation of proprietary software, you will want to become the owner of this wherever possible. You may be using third party software, engaging web designers, artists or other creatives to create intellectual property for you, or you may be using music, art or designs within your business model. Nothing ruins a good business idea like a claim against you for infringement of intellectual property rights, so ensure everything is covered off in contracts well in advance of it becoming an issue. Data Protection Data privacy is a complex issue for all businesses, but the blockchain has additional considerations due to its transparent and unchangeable nature. Let’s consider the wine industry, because the blockchain provides an excellent system to prove provenance and convince buyers they are getting the genuine article. However, the system may depend on the personal details of the current and previous owners being made publicly available forever. Therefore, to ensure compliance with the GDPR, careful consideration of privacy rights and required privacy information is needed, alongside appropriate risk assessments and data management processes. With the increase in sanctions, this is not one to get wrong! 
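Returning to the smart contract idea described above, the following is a simplified, hypothetical Python simulation of a self-executing agreement that releases payment only once pre-agreed conditions are met. Real smart contracts are written for a specific platform (for example, in Solidity on Ethereum) and execute on-chain; the class and condition names here are invented purely for illustration.

```python
from typing import Callable, List

class SimpleEscrow:
    """Toy stand-in for a smart contract: funds are released automatically
    once every pre-defined condition evaluates to True."""

    def __init__(self, amount: int, conditions: List[Callable[[], bool]]):
        self.amount = amount
        self.conditions = conditions
        self.released = False

    def try_release(self) -> bool:
        # On a real blockchain this check-and-transfer happens on-chain and is final.
        if not self.released and all(check() for check in self.conditions):
            self.released = True
        return self.released

# Hypothetical conditions for a property purchase
deposit_paid = lambda: True
title_registered = lambda: True

escrow = SimpleEscrow(250_000, [deposit_paid, title_registered])
print(escrow.try_release())   # True - both conditions met, so payment is released
```

The point this underlines is the one made earlier in the article: the "terms" live in code, which is why a clearly documented and agreed record of those terms matters before anything is deployed.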
Initial Coin Offerings and Security Tokens Although blockchain businesses do not need to create their own cryptocurrency, they may choose to. The main reason for doing so would be to generate income with which to launch or grow the business. A common misconception is that the Financial Conduct Authority (FCA) does not regulate Initial Coin Offerings (ICOs). If the quality of the token is that of a regulated investment and/or the activities of the company can be classed as regulated activities, the regulations will apply. In particular, companies issuing Security Tokens (which represent fractions of an existing, tradable asset), are likely to attract the FCA’s attention. However, the FCA is open to business innovation, so it would be wise to work with a compliance specialist to ensure you do this effectively. Engaging with new audiences and customers Each blockchain project will need to interface with a business’ intended market to generate revenue. For example, if you were looking to develop a blockchain to drive value and options for consumers in the Telecoms sector, you would need to consider both the laws governing the selling of services over the internet (distance selling) and the restrictions around marketing to individuals. Clear terms of business would be needed with consumers (subject to consumer rights) and with the Telecoms companies, so all parties know what to expect, and are aware of their liabilities and potential charges/income.  In addition, regulations may mean you are subject to additional requirements.  Prior approval of the smart contract, by Ofgem, may be recommended to avoid wasted costs. You may look to outsource to cloud or SaaS suppliers in order to establish a customer interface linked to your blockchain, which is less energy intensive and more flexible if your business or customer focus changes. All businesses face these considerations, so they are not barriers and are included to illustrate that no business operates in isolation. Investment and Sale Finally, many businesses require external investment whilst often having one eye on a future sale. Both will be easier by demonstrating you have in place development agreements, IP protection, valuable customer contracts etc. and compliance documentation. Summary The blockchain space provides an exciting arena for entrepreneurs to create new and innovative ways of doing business. New technologies do not operate outside the law.  Understanding the legal framework (or working with a legal team or adviser that does) will provide a competitive edge for those willing to plan ahead.[/vc_column_text][/vc_column][/vc_row] ### Five Cloud Migration Issues You Shouldn’t Ignore [vc_row][vc_column][vc_column_text]Technology used to be viewed as a utility, similar to power, and though it was true that you couldn’t run your business without either, neither were going to give your business a competitive advantage. However, times have indeed changed. Customer experience, revenue and brand reputation are now digitally centred, employee productivity is thoroughly dependent on technology, and even your ability to attract and retain talent requires a compelling, digital workplace. In recent years as companies have migrated their applications to the cloud, modern enterprise IT has been transformed. Enhancements in network speeds, the public Internet, and computing have meant that infrastructure can be anywhere and everywhere. 
In fact, Gartner has forecast that by 2020 the majority of available global compute capacity will exist in the cloud. But while the cloud certainly has a silver lining it also comes with its own challenges, since IT organizations no longer own or control many critical aspects of service delivery infrastructure and software. In this scenario, five cloud migration factors come to fore and need to be addressed: Internet Unpredictability The Internet is not a fixed resource but instead is dynamic, meaning the route you take this second may not be the route you take the next. As a communal network, delivery is the best effort. And since the Internet has no centralized governance, its proper functioning rests on implied trust, which can be easily broken. Given the proliferation of applications and services that you now rely on, your business is now riding on thousands of networks you don’t own and can’t see, heightening unpredictability tenfold. Location Matters, Even in The Cloud When we think about cloud applications we tend to think, understandably, of the app itself, whether it’s Salesforce, Office 365, Workday, SAP Concur, or any number of the thousands of apps now available. But the way that those apps are delivered has a big influence on performance and user experience. While some apps are delivered via a Content Delivery Network (CDN) that has servers embedded in networks close to all of your offices, many software-as-a-service (SaaS) apps are delivered from a pair of primary and backup data centers that are located in one geographical region. That means that your geographically distributed offices may have highly variable network latencies at play when they try to reach those data centres. Planning for this variance is necessary for smooth roll-outs. Changing Operational Processes As you migrate to the cloud and many of your IT assets move outside of your four walls and beyond your control, your operational processes also undergo a significant change. When you directly own and operate IT assets, the troubleshooting domain that operations teams have to work in is relatively contained and results in a “find and fix” process. This doesn’t work so cleanly in the cloud. The troubleshooting domain expands to include potentially thousands of Internet networks and multiple cloud and service providers, so just tracking the source domain of the problem is exponentially harder. Once you isolate the fault domain, you must then take it up with the third-party provider, which often involves overcoming a provider’s reluctance to act and desire for plausible deniability. The whole process can cause mean time to resolution (MTTR) for cloud issues to rise uncontrollably, and your service desk costs may explode due to unclosed trouble tickets. User Experience from Everywhere You have undoubtedly thought about your users’ experience. But no matter where your users are, their experience is your business. The needs of employees and the drive to make sure they’re productive has re-focused many companies onto employee experience. The most progressive of organizations regard the way they equip their employees with the right technology to do their job as a competitive differentiator, and one of the ways they attract and retain talent. Furthermore, as technology is now central to productivity, it has the capacity to make their employees as digitally effective as possible. 
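Whether a user sits in a headquarters office or a regional branch, the practical question raised above is the same: how long does it take them to reach the services they depend on? Some teams start with a simple synthetic probe run from each location against the SaaS front doors they rely on. The sketch below is a minimal, hypothetical example (the endpoints are placeholders, and a production approach would use a proper monitoring platform rather than an ad-hoc script):

```python
import statistics
import time
import requests

# Placeholder endpoints; substitute the SaaS front doors your offices actually use.
ENDPOINTS = {
    "crm": "https://crm.example.com",
    "email": "https://mail.example.com",
}

def median_rtt_ms(url: str, samples: int = 5) -> float:
    """Return the median HTTPS round-trip time to `url` in milliseconds."""
    timings = []
    for _ in range(samples):
        start = time.perf_counter()
        requests.get(url, timeout=10)
        timings.append((time.perf_counter() - start) * 1000)
    return statistics.median(timings)

for name, url in ENDPOINTS.items():
    print(f"{name}: {median_rtt_ms(url):.0f} ms median from this location")
```

Comparing the medians across offices quickly shows which locations sit a long way from a provider's primary data centre and therefore need attention before roll-out.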
It’s often the case that some of your most valuable, and highly paid employees are also the most mobile, and this combined with the rise of flexible and mobile working means that for many companies, their people can be anywhere. Technologies like secure web gateways have become popular as a mechanism for securing employee access to cloud apps. In these cases, remote and travelling employees may be going straight to the cloud without ever touching a single point of corporate-controlled network infrastructure. Evolving your Monitoring Before the cloud, monitoring was largely based on passive data collection from all the infrastructure, network devices and software that IT owned and operated directly. These datasets were often siloed, but at least they were available. In the cloud, traditional passive data simply can’t be collected from networks, infrastructure and software that IT doesn’t own or control. Given this reality and the Internet’s unpredictability factor, new combinations of active and passive monitoring are required to reveal app delivery performance. However, it’s not enough to simply have this data once you’re in operation. Given the unpredictability of the Internet, it’s important to move visibility forward in your planning cycle and utilise modern cloud visibility to set clear user experience success criteria, and to alert when anomalies occur. Finding ways to share visibility data ahead of roll-out helps you create cloud troubleshooting and escalation processes that will work under pressure. The cloud has undoubtedly revolutionised so many aspects of business for the better, but that does not mean companies should blindly migrate what services they can without thinking of the consequences of expanding their IT infrastructure beyond their own networks. Options like network intelligence and monitoring provide an overview of the extended cloud eco-system, meaning you can manage every network as if it were your own. As cloud migration becomes an inescapable reality for all, the organisations that succeed the fastest will be those monitoring their networks and tackling migration issues head-on.[/vc_column_text][/vc_column][/vc_row] ### Medical IT Must Boost Its Immune System [vc_row][vc_column][vc_column_text]Hospitals and medical records have been offering low-hanging, soft and juicy fruit to cyber criminals. It is time to sharpen up. The human immune system exemplifies the potency of distributed defence. When you think of the pain – the inconvenience, the anger, the possible loss – caused by a stolen credit card, it is somehow even worse to be told that hackers typically only charge 10 to 15 cents for stolen card numbers. Talk about adding insult to injury! But did you know that your medical records might fetch anything from $30 to $500? These figures emerged in a CBSN On Assignment news program about New York’s Erie County Medical Center ransomware attack. Rather than pay the $44,000 ransom, the hospital was reduced to pencil and paper for six weeks before going back online. And this has been just one of a number of recent high profile hospital cyber-attacks. Getting them where it hurts The hijacker points his gun at the driver’s head, not at the car to be stolen –aiming the threat where it hurts. Medical data and hospital services are highly sensitive, as the enormous sums spent on treatment, medical insurance and litigation against medical malpractice attest. 
Even if there is little exploitable data in one’s own medical record, the hospital that allows such data to be leaked will face severe penalties – from government, from being sued, and in terms of reputational damage. You might, therefore, expect cybersecurity to be a very high priority in hospitals. The need is indeed recognized by the majority of medical administrators. But it is not so easy to propagate a sense for security across the whole organization, because so much energy is being directed towards one particular very strong human need: to preserve and optimise life and health. Medical staff say they are focused on patient care and don’t have time to worry about cybersecurity – it is seen as secondary to the literally vital job they are doing. In terms of results and medical reputation, medical staff would far rather see money spent on leading edge medical scanning technology than on a security upgrade – but that is only half the problem. Most successful hospitals can afford to spend more on security, but it is a well known fact that security is as much about people management as it is about technology. If staff feel they are too busy to stick to security policies, there is little value in a sophisticated authentication process. Also, some hospital administrators think that meeting state and federal regulations provides adequate security – but those present minimal standards that don’t address the size of the threat. Criminals are aware of these weaknesses in healthcare security, and they also know that a typical hospital network offers a very big attack surface. An enormous number of devices are connected, including mobile phones, tablets, desktop computers and servers; plus a population of patients and visitors with their own devices; plus networked medical devices and monitors that can communicate with medical staff – each offering an opportunity to injecting malware into the network. That is the double problem: hospitals are both a highly lucrative and a soft target for cybercrime. Why now? From X-ray machines to blood pressure monitors, hospitals are creating new efficiencies while simultaneously generating more data than ever before. And the complexity is growing as hospitals inter-connect and telemedicine gains momentum. Millions of patient records are now increasingly stored in the cloud – adding another entry point for hackers. Numerous organizations share cloud resources at risk from malware Trojans whose ability to self-morph, makes them very difficult to detect by conventional signature techniques. Traditional security models focused on protecting only the network perimeter. But the growing number of devices on the hospital network and the cloud vulnerabilities, mean that perimeter protection becomes far too simplistic for today’s complex, data-intensive world. Firewalls and other boundary-based security solutions fail to address threats from within a network. They also do not have the ability to detect malware that has managed to infiltrate the network, nor can they effectively combat internal attacks once detected. The more nebulous the edge, the greater the need for security awareness to be distributed throughout the system. The immune system Cybersecurity can learn a lot from our own immune system. It is neither wholly resident in the skin, as perimeter protection, nor is it centralised in the brain. Instead immunity is distributed throughout our bodies. 
When our protective skin is penetrated – whether by a hard object or by disease – local defences will be triggered even before a warning is sent to the brain to register pain. The result of this distributed attack awareness is remarkably intelligent, it adds up to a self-learning defensive system analogous to today’s more sophisticated artificial intelligences. This is not the intelligence of a massive number-crunching central brain, but closer to the emergent intelligence of a swarm of ants, where simple rules and reactions distributed across the whole colony enable the swarm to behave and defend itself in a remarkably sophisticated manner. That is why today’s most advanced security strategies combine strong encryption and authorisation with a greater use of distributed security hardware. A distributed security approach Distributed security provides a multi-layered defense with protection both at the perimeter and within the network, as well as at individual servers and devices connected to the network. Distributed security scales out as the data center grows and doesn’t require expensive upgrades to perimeter security appliances as network bandwidth increases. There are two ways to distribute security: a software and a hardware approach. The software-only approach is attractive because it is simple to install and runs on existing servers – but that in itself becomes a disadvantage. With multiple virtual machines (VMs) being spun up on each server, it is possible for malicious applications to be hiding in one VM and invisible to security software on another VM or an adjacent server. It also adds to the servers’ load to be managing networking and security functions rather than their expected application workloads – and effective security demands very rapid responses, with minimal latency and jitter. So it makes a lot of sense to follow the immune system in installing security across the network and not just in the brain. Smart network adapter cards are now available that continuously monitor traffic passing through and provide highly encrypted telemetry metadata to a centralized workstation. If any issue is identified, specific traffic flows or servers can be shut down before the entire network suffers. Combine this distributed network awareness with self-learning intelligence and you enable an AI system that learns network behavior under normal conditions and, if anything atypical happens, flags a warning. Having distributed hardware security “agents” throughout the data center provides AI based security with early warning of attacks, as well as a sure way to shut off attack traffic at individual nodes in accordance with agreed security policies. A final advantage is that, while the system is growing, one can simply install more of these smart adapters at the same time, instead of having to add, upgrade or rebuild existing edge security devices. Healthcare, heal thyself? Articles on cyber security love to open with statistical data on the magnitude of the threat. Is it better bedside manners to offer the patient some comfort before outlining the threats? A recent study by Hiscox suggests that small businesses in UK alone are the target of around sixty five thousand attempted cyber attacks every day, of which some UK company becomes a victim every nineteen seconds – at an average clean up cost of over twenty five thousand pounds sterling. Symantec announced twenty four thousand malicious mobile apps being blocked every day… and so on. 
Are there any corresponding figures for the number of viruses, bacteria and other health threats that the average human body fights off every day? It would be an interesting comparison. Either way, it is certain that healthcare could learn something from the very bodies it is dedicated to protecting. A lesson in distributed security.

### Scientific Innovation | Device to combat water shortages

New inventions and gadgets are often highly anticipated and embraced by the public, but there is no invention so crucial to human life as devices and technology designed to combat environmental and social problems. At the University of Akron in Ohio, researchers are working on a device that is able to collect drinking water from the air. The device captures condensation through tiny polymers. A prototype of the water harvester takes the form of a backpack, making it easier to transport and use. Every human on earth should have access to fresh and clean water, and these researchers hope to help provide drinking water to those in need, with a particular focus on people living in drought territories. In tropical locations where humidity is high there is more water vapour present in the air, and as a result, a device called a fog catcher can easily transform water vapour into drinking water. In drier climates such as California, fog catchers do not work effectively due to the lack of water vapour in the atmosphere. Water shortages are increasingly extreme in the poorest parts of the world as the planet warms due to climate change. A professor of mechanical engineering at the University of Akron, Dr Josh Wong, explained that the atmosphere is a major source of water. Native communities living in the dry, high-altitude Andes mountains have developed their own innovative methods of capturing water vapour over thousands of years. They often collect dew captured inside cavities in the desert and send it into large tanks, which provides fresh drinking water. This inspired the Ohio research team in the development of their own system. The team’s water harvester will use nanotechnology to build on pre-existing techniques. Electrospinning, whereby nanoscale fibres are formed by applying a strong electric charge to a polymer solution, can produce extraordinarily small polymers (as small as several tens of nanometers across), so an enormous surface area can be compressed into a tiny space. If developed effectively, the researchers will be able to create a resilient, lightweight freshwater harvester powered by a lithium-ion battery. Wong sees this as similar to the systems used by people in the Andes to collect fresh water, but on a smaller scale, with more potential and greater ease of use through its portability. He compared the approach to wearing glasses in hot weather: when you step from the summer heat into a cool, air-conditioned room, your glasses fog up. It is this fog that can be accumulated at a nano-scale. There have been a variety of efforts to help with water shortages. For example, the company F&T Water Solutions uses state-of-the-art scientific techniques, with the latest computer software, for scientific investigations relating to groundwater development and maintenance. They have installed ponds in locations suffering from droughts to help replenish groundwater.
Similarly, an infiltration pond in California’s Pajaro Valley has become a testing ground for scientists to refine methods for recharging the area’s depleted aquifer. Groundwater provides over 90 per cent of the water used in the Pajaro Valley, but the water levels inside its aquifers have diminished over the past 30 years. They are working to combat this by identifying potential sites and good conditions for groundwater recharge, monitoring stormwater flow across Santa Cruz. F&T Water Solutions also developed something called the Variable Electro Precipitator, which can eliminate contaminants from water through a process called electrocoagulation that standard filtration cannot. Janicki Bioenergy has even created a machine that can extract clean drinking water from human faeces; Bill Gates himself drank water from the machine, which gives it a great deal of credibility. Although other researchers have worked on water harvesters and their own techniques for producing clean drinking water, the Ohio team have said that their product would be more affordable and more easily portable due to its small size. It could take the form of a rucksack, or even be placed onto a mechanical device like a small train on a track so that the water could be collected and then distributed to communities in need. If the device is made, its small size will remove much of the hard labour from water collection, and its light weight will place less physical strain on people in drought-stricken regions. As more companies and universities dedicate entire workforces and research teams to discovering more efficient ways to generate clean water from the appropriately termed ‘thin air’, the perils of climate change, including drought, can be combatted with techniques perfected over time. Inspiration can even come from methods dating back centuries, as the University of Akron showed by applying the techniques of indigenous people in the Andes and adding their own advanced spin to the collection of drinking water from vapour in the atmosphere.

### How Keywords in Data Classification Can Transform Analytics

When different people are voicing different issues, they will use different words and sentiments. Current analytics typically identifies just the keywords used, but this runs the danger of missing the entire context behind the communication. Often a customer will merely imply a sentiment or intention instead of explicitly expressing it with specific keywords. For example, a customer in a restaurant might say ‘by the time my meal arrived, the food was cold’; the keyword would be flagged as ‘cold food’, when in fact the main issue was the slow service. There are also other limitations with using just keywords, such as sarcasm, context, comparatives and local dialect/slang. The overarching message can often be missed, and so the alternative is to analyse text data using ‘concepts’ instead of ‘keywords’. Machine learning can help by providing the missing link between the keywords used, thus giving a much clearer picture of the full concept behind a piece of customer feedback. However, it’s the latest AI technology that is taking this a step further and making the classification of concepts more accessible for all. Using Concepts for Sentiment Analysis When using concepts instead of keywords, each concept has an associated sentiment, graded or otherwise.
You can then do what you like with this, e.g. tot them all up or filter for certain topics. For example, if you have the sentence “the food was great, and I loved the selection of drinks, but the service was terrible and I’m never going back”, you have two markers of positive, one of negative and a clear negative intent. Sentiment can be presented as an overall picture across customer interactions. In the example given above, the overall sentiment may be positive, or neutral, but the actionable insight is apparent in the negative feedback and intent. Overall sentiment scores can give a skewed view of how customers really feel about a service, and real change can only be effected by focusing on negative sentiment and insight. Using concepts instead of keywords provides insight into what the most negative areas to improve are, without clouding this by letting positive or neutral sentiment interfere. By comparison, finding the sentiment using just keywords for the same sentence isn’t straightforward. Essentially it is driven by a black-box algorithm which uses various techniques to try to match the opinion terms close to the keyword terms, scoring them from -1 to 1, and then combines those scores within sentences. In addition to this, the way people talk about things in reality, with sarcasm, negatives and context, can present a lot of challenges that many people who use keywords overlook. Sarcasm often gives a completely opposite sentiment score to that which the customer intended. These limitations have generated a demand from businesses and analysts for accurate, actionable concepts, i.e. those which capture the topic and, intrinsically, the emotion attached. Machine learning alone is not enough Machine learning is helping to solve many of the issues associated with the use of keywords in sentiment analysis, and it does this by classifying by ‘concept’ instead of by ‘keywords’ and rules. It focuses on the concept of the issue, with the emotion attached, and can classify more objectively. For example, if someone contacts a bank and says that they’ve had money taken from their account, then the fact or concept that they have been subject to fraud is captured no matter what dialect, terminology or tone they use. The concept itself will be scored as a negative sentiment and, more importantly, the bank now has actionable insight and can deal with the customer appropriately. The sentiment can also be graded by concept, so rather than having a black-box score, it can be categorised into grades such as “very unhappy” or “unhappy”. The customer might also state their intent, e.g. to leave the bank, or not to recommend it. These intents can also be captured and can be very useful markers in terms of prioritising how the bank chooses to deal with them. Furthermore, the bank doesn’t have to physically read the review again. The concepts have been identified and any input or action required is complete. However, whilst machine learning has the power to classify by concept, setting up traditional machine learning models and running them is difficult. Firstly, you need to train on the same kind of data you will predict on. There are different phrases, words and even characters used depending on the platform you are analysing, e.g. the language used on Twitter is not the same as written complaints. You also need to build the training set, tune it and constantly update it, and this needs to be done by data scientists; the short sketch below gives a flavour of that workload.
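As a rough illustration of that workload, here is a minimal concept-classifier training sketch using scikit-learn. The example texts, concept labels and model choice are all placeholders; a real model would need thousands of examples per concept and ongoing curation as language drifts between channels:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny, hand-labelled training set (illustrative only).
texts = [
    "money was taken from my account without permission",
    "I was charged twice for the same transaction",
    "the app is great and really easy to use",
    "thanks, the adviser sorted everything out quickly",
]
concepts = ["fraud", "fraud", "praise", "praise"]

model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, concepts)

# The wording differs from the training examples, but the underlying concept is the same.
print(model.predict(["someone withdrew cash from my account that wasn't me"]))
```

Every new channel, phrase pattern or concept means revisiting this training data, which is exactly the overhead described above.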
Finally, if your machine learning model doesn’t lead to actions, or is not fit for purpose because of the volume of false positives or negatives then it can fail to be operationalised. Unfortunately, it is all too common that there is a disconnect between data science and how it is judged by others in a business. AI is making concept classification easier and faster The latest in AI text analytics is taking the advancements made by machine learning and classifying by concepts instead of keywords one step further. It automatically and accurately ‘tags’ feedback using actionable concepts along with the intent of the customer through a combination of automation and ‘human-in-the-loop’ technology. Human-in-the-Loop involves somebody validating and training the model when it needs it, typically on records that have high uncertainty. The AI will send an alert to a human asking them to validate something it might not be sure about and then trains and updates itself with the input. Importantly the validation doesn’t need to come from a Data Scientist, it can be a business user or anybody that is familiar with the operation. The impact of this is huge. Because of the automation it means that far fewer, non-data scientists are required to run the model and the validation that is invited by the AI can be done offline. Complex models that would normally take weeks to build by a data scientist and with an ongoing overhead to support and curate, can now be generated and tuned by a non-data scientist in less than a day. Performance using concepts An example of the latest AI, namely PrediCX software from Warwick Analytics, was used to research publicly available customer data on Trip Advisor and Twitter, comparing the outputs for using both concepts and keywords. The precision rate for labelling keywords was 58% compared to 76% for concepts. Precision here refers to how often the identified topic, or sentiment, was correct – a true positive. For example, if a classifier identified 10 mentions of ‘bad service’, but 3 of them actually mentioned good service – your precision is 70%. The recall rate for keywords was unknown as there were 46% of records which were completely unlabelled. For concepts the recall rate was clear and it was 51%. There were no records unlabelled. Recall here refers to what percentage of the relevant data in a set were labelled. To refer to the example above, if the dataset mentioned ‘bad service’ 10 times in total, and the classifier picked up 7 mentions of ‘bad service’, then the recall is 70%. It’s clear that measuring the recall rate is a problem when using keywords. When it came to the intents of the customer, the keyword precision was 68% although it was 54% if you took into account the difference between recommendation and saying they would return. Intents here refer to either emotional intents (disappointed, repeat problem, etc) or sales intents (would recommend, would not return, etc). There were also a lot of inconsistencies e.g. where both ‘recommending’ and ‘quitting’ were predicted together as well as duplication by 14% i.e. double counting. When estimating the recall, the keywords picked up intents in 23% of records, albeit with the accuracy noted above, so in fact around 13% to 16%. However, when using concepts, the intents were picked up 28% of the time. The precision was 88% and recall 45%. For keywords it labelled 23% of records as “recommending” but less than 1% as “quitting”. For concepts, it labelled positive intents 23% and negative intents as 5%. 
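The precision and recall arithmetic used in these comparisons can be pinned down with a short worked example that mirrors the ‘bad service’ figures above:

```python
def precision_recall(true_mentions: int, flagged: int, correct_flags: int) -> tuple[float, float]:
    """Precision: how many flags were right. Recall: how much of the real signal was found."""
    return correct_flags / flagged, correct_flags / true_mentions

# 10 genuine 'bad service' mentions in the dataset; the classifier flags 10,
# of which 7 are correct (3 actually mentioned good service).
p, r = precision_recall(true_mentions=10, flagged=10, correct_flags=7)
print(f"precision = {p:.0%}, recall = {r:.0%}")  # precision = 70%, recall = 70%
```

The same arithmetic sits behind the keyword and concept figures quoted in this comparison.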
This meant that it was significantly better (1000%) at picking up negative intents and arguably these carry the most insight. When it comes to Twitter, the average number of labels per record was 2.3 with keywords but only 1.05 with concepts. This is important as a Tweet usually has one intent and so keywords become less useful for picking this up as much of the verbiage will be context. We can see therefore that performance for keywords is driven by how well defined the taxonomy is. Even so, subtle nuances can return inaccurate classification. Performance isn’t explicit and has to be manually measured. The performance for concepts relies on an algorithm and some training data applying to the domain concerned. It can be tuned and may or may not require intervention from a data scientist. It’s also worth asking what a ‘good’ performance is as a benchmark. We can do this by comparing against what humans could manually achieve. This can range anywhere from 70% to 90% precision but with much lower recall (it’s easier to miss things than to correctly or incorrectly judge something). Another benchmark is to have a reliable self-measurement metric, noting that some labels will have better performance than others, mostly likely the larger classes and/or those with less variable description possibilities. Conclusion There are a range of options for analysing text. Keyword analysis solutions are widely-available and ready to deploy and are certainly fit to serve basic text analysis needs. Keywords can serve to highlight, on a less detailed level, areas of a business that need attention. However, for more detailed, actionable insight, classifying by concepts, particularly with the latest in AI text analytics, is far and away the better option. Accuracy is not only explicit but can be improved with very little work. Machine Learning is helping to make the use of classification by concepts more widespread but there are still barriers to its’ adoption, mainly the resources required to build and maintain the models. AI alleviates many of these barriers through automation and only using human intervention when required, thus using concepts instead of keywords is likely to become more accessible for all.[/vc_column_text][/vc_column][/vc_row] ### Juniper Networks Expedites 5G Transformation for Service Providers [vc_row][vc_column][vc_column_text]Expanded portfolio simplifies the transition to a secure, automated and cloud-centric infrastructure with new innovations in metro, edge and core SUNNYVALE, Calif., Feb. 12, 2019 – Juniper Networks (NYSE:JNPR), an industry leader in automated, scalable and secure networks, today announced a major refresh to its metro, edge and core solutions to accelerate service providers 5G transformation. Comprised of IP optimized silicon enabling industry-leading 400GbE density on a new 14.4 Tb linecard, new ACX access and aggregation platforms and an expanded MX 5G router portfolio, these new Juniper solutions will help service providers achieve a holistic approach to infrastructure transformation. Combined with Junipers’ Contrail software portfolio, these new infrastructure enhancements deliver a secure, automated and cloud-centric architecture that will unlock new revenue opportunities for service providers. 5G networks offer providers an opportunity to increase revenues through new services that simply weren’t possible before, including near-zero latency applications such as IoT, AR/VR and connected cars. 
However, with these services comes exponential traffic growth and heightened performance demands, new security threats, and operational complexity. To meet these demands, service providers must transform their networks to be secure, automated and cloud-ready and capable of delivering diverse services in a simple, yet agile and cost-efficient way. Providers require a holistic approach to transformation, one that encompasses infrastructure, operations and cloud-native service delivery to realize the full economic benefits required to be successful in today’s market. Juniper’s new solutions give providers an operations-driven approach to infrastructure transformation built on best-of-breed security, automation and cloud capabilities. To complement these new infrastructure enhancements, Juniper’s widely-deployed Contrail software portfolio enables service operators to transform from managing individual devices to automating the full operational lifecycle with one integrated software suite, simplifying the design, implementation and operations of the network.  News Highlights: New Metro Innovations: Juniper is introducing an expanded Metro Fabric portfolio to deliver an adaptive, agile and secure network infrastructure for simplified service creation and delivery. New today: 1 RU ACX700 Universal Metro Router: This high-capacity access router provides the necessary IPsec transport, precise timing and bandwidth requirements found in the 5G era for agile service delivery. The 1 RU ACX700 also meets the stringent environmental requirements at remote sites with 24x10GbE and 4x100GbE port densities in a compact hardened form factor. 3 RU ACX700 Universal Metro Router: Delivers high-availability and high-capacity bandwidth required for mission-critical sites projected to scale at fast rates. The 3 RU ACX700 offers 2.4Tb of total system performance in a temperature-hardened form factor for both pre-aggregation and aggregation. New Edge Innovations: The new MPC11E line card in the MX2000 Series 5G Universal Routing Platform, powered by Juniper’s Penta Silicon, delivers a three-fold increase in line card and system capacity, with an industry-leading 4Tb per slot capacity for high-capacity edge routing platforms. New Core Innovations: Juniper’s new Triton Silicon enables end-to-end secure connectivity at scale with an industry-first 400GbE native MACsec support in a networking chip. Triton Silicon will propel the PTX10008 and PTX10016 Universal Chassis into industry leading positions for 100GbE and 400GbE density. The new Juniper Triton Silicon delivers a ~70 percent power efficiency gain (~0.15 watts per gigabit) over existing Junos Express Plus silicon, leading to a 380%  bandwidth increase over the previous generation line cards in the PTX10008 and PTX10016. The Juniper Penta Silicon-powered MPC11E line cards for MX2000 and Triton Silicon-powered 14.4Tb line cards for the PTX10008 are expected to be available during the second half of 2019. The 1 and 3 RU ACX700 will be available in early 2020.  Supporting Quotes: “Service providers seek agility, cost savings and new services from 5G networks, but capitalizing on these benefits requires a complete transformation of infrastructures, operations and services. With today’s announcement, Juniper Networks is giving service providers the building blocks required to create 5G-ready networks built for simplicity and agility that are capable of supporting immense traffic demands for the future. 
By combining the attributes of secure and automated cloud architectures, Juniper is ushering in the next era of service provider networking.” Brendan Gibbs, Vice President, Product Management, Juniper Networks Additional Information: Juniper Networks will be showcasing 5G solutions and demos at Mobile World Congress 2019 in Hall 2, Booth 2I60/2J61 Launch Blog: 5G Ready Network Today Requires A Secure Automated Cloud Architecture

### Customers Don’t Care About Open Banking

So you are not (yet) part of the select group called the CMA9 and you are asking yourself: “Should I wait to be invited or just join the movement? Is this Open Banking standard really a game changer?” Even though Open Banking was introduced by the CMA order in the UK, in reality Open Banking is an industry necessity and it is inevitable. The alternative to Open Banking is to fail. If you are waiting for a sign that your organisation should join this new ecosystem, here it is. Now, before you organise your Open Banking kick-off meeting, please consider this: the new UK industry standard was introduced a year ago and there are lots of people concerned about the level of customer awareness (which is very low). As if people would sleep in a queue, Apple event style, every time a bank launches a new thing… come on. Customers don’t care about Open Banking. In the same way, customers don’t care about the electricity provider every time they plug an electronic device into a power socket. The only expectation is that it should work. The way most banks interact with their customers today is pretty much like utility companies do. Mass propositions, complicated terms and conditions, hidden fees, poor communication and, of course, bad customer service. Yes, there are a few exceptions. Therefore, whether or not customers know about Open Banking is secondary. Banks have a much bigger problem to tackle. What customers really care about is straightforward, frictionless customer journeys when applying for, using or cancelling a product or service. Now, the reason your organisation should care about Open Banking is that it’s truly the only economically viable and efficient path for banks to make this seamless customer experience possible. So let’s stop talking about low levels of Open Banking adoption and start to design better customer propositions instead. Customers don’t care how you operate your business. (Well, they do, but just a little bit.) What they really want from a bank relationship is convenience, transparency, speed and security. However, to be in a position to offer compelling propositions based on Open Banking standards, non-participant banks will have to navigate the choppy waters of the Open Banking sea. This journey will require an organisational boat that first and foremost satisfies the most important criterion: being able to float (i.e. being compliant with the OB standards). No engines, sails or oars are required at this stage. Second, an organisation must be agile enough to respond to unexpected market changes. This means an organisation needs to be willing to challenge and constantly adapt its own business model and practices, from the moment it creates a product to the moment it distributes it. Don’t worry about customers’ knowledge of Open Banking – if you design a compelling Open Banking proposition, the market will pay attention.
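For a flavour of what ‘being able to float’ means in practice, the Open Banking (UK) read APIs let an authorised third party retrieve consented account data over plain HTTPS. The snippet below is a hedged sketch only: the base URL, token and financial ID are placeholders, and a real integration must first complete directory onboarding and the OAuth2/OIDC consent journey that the standard mandates:

```python
import requests

# Placeholders; a real AISP obtains these via onboarding and the consent flow.
BASE_URL = "https://api.examplebank.co.uk/open-banking/v3.1/aisp"
ACCESS_TOKEN = "<token issued after the customer grants consent>"
FINANCIAL_ID = "<bank-assigned organisation id>"

def list_consented_accounts() -> list:
    """Fetch the accounts the customer has agreed to share."""
    response = requests.get(
        f"{BASE_URL}/accounts",
        headers={
            "Authorization": f"Bearer {ACCESS_TOKEN}",
            "x-fapi-financial-id": FINANCIAL_ID,  # header named in the OB standard
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json().get("Data", {}).get("Account", [])
```

The handful of lines is not the hard part; the organisational work behind them, including consent management, security and the agility to iterate on propositions, is what the rest of this piece is really about.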
Open Banking creates a unique opportunity for banks to augment their existing product offerings as well as create new ones. That’s not just what customers want – that’s what they expect from banks. Remember, when we talk about customer experience, customers don’t compare your organisation to other banks. Their reference point is any organisation that offers digital services. Therefore, look outside your industry for inspiration and stimulate some crosspollination. The window of opportunity is literally open, but for a limited time.[/vc_column_text][/vc_column][/vc_row] ### Mendix Unveils Game-Changing Native Integration with IBM’s Cloud Services and Watson [vc_row][vc_column][vc_column_text] IBM-Mendix alliance brings users of Mendix’s market and technology-leading low-code application development platform the awesome powers of IBM Cloud services and Watson’s AI capabilities   BOSTON – FEBRUARY 11, 2019 – Mendix, a Siemens company and the leader in low-code for the enterprise, today announced new and greatly enhanced platform enhancements for Mendix on IBM Cloud to be introduced by Erno Rorive, senior product manager at Mendix, speaking at IBM Think 2019. Mendix is IBM’s only enterprise-grade, low-code development platform fully optimized and cloud-natively integrated with IBM Cloud Services. Rorive will highlight Mendix’s reengineered cloud-native architecture that now fully supports Kubernetes containerization for the IBM ecosystem. Full support for IBM’s Kubernetes implementation provides Mendix low-code app developers with seamless access to the capabilities of Watson, the world’s leading provider of AI cognitive services.   “The goal of the Mendix-IBM Alliance is to place the ease of drag-and-drop, low-code application development into the hands of the Enterprise cloud market, leverage Watson’s AI capabilities for the largest number of users, and vastly accelerate the time-to-value deployment of business innovation,” says Rorive. “The integration of the Mendix platform with IBM Cloud and Watson represents a golden triangle of enterprise-ready solutions that will power the next wave of smart applications to focus the power of AI on vertical industry solutions.”    Mendix’s platform enhancements for the IBM Cloud infrastructure include:   • Broader Access to IBM Cloud Services. Integrating Mendix’s low-code app development and deployment with Watson’s out-of-the-box AI capabilities will enable enterprise customers to experiment with AI-ready services, including access to Weather Channel content, App ID, and App Launch on IBM Cloud.    • Full Support for Kubernetes. Integrated support for Kubernetes production-grade container orchestration is now part of Mendix’s cloud-native architecture. Native Kubernetes support enables streamlined management and interoperable deployment across public, private, and hybrid clouds. This turbocharged integration with IBM Cloud and the Mendix platform eliminates costly IT overhead with automatic healing and horizontal scaling, enabling seamless management for migrating legacy systems or creating new, cloud-native, microservices-based capabilities.   • Single Sign-on Onboarding. Reengineered platform integration creates a frictionless user experience for all Mendix low-code applications with one-click, out-of-the box deployment to IBM Cloud’s entire catalog of third-party vendors. 
This enhancement allows enterprise customers to operate and monitor mission-critical applications in a single IBM-centric dashboard, eliminating the need for additional logins and passwords.   • Integrated Billing with IBM Cloud. The newly streamlined financial platform provides enterprise customers with a single invoice for any purchased IBM Cloud service. By adding the Mendix platform to IBM Cloud’s catalog, customers can quickly begin to explore Mendix’s enterprise-ready, low-code application development, allowing them to deliver commercial benefits at speed and for the lowest possible total cost of ownership, using their IBM Cloud Credits.   “The billing platform integration will encourage new users who are seeking an effortless way to evaluate cutting-edge products,” says Rorive. “It becomes much more flexible for enterprises to experiment and extend their application landscape with low-code development, at a much lower investment risk than normal.”   The IBM Think 2019 conference at San Francisco’s Moscone Center is the largest annual gathering of top executives, business strategists, professional developers, and educational and enablement experts who will share their stories, insights, and best practices for creating value with IBM’s cloud-based ecosystem.   Erno Rorive’s presentation, “Build applications 10x faster with the IBM Cloud low-code platform by Mendix” takes place on Tuesday, February 12, at 1:30 p.m. at Moscone Center South as part of IBM Think 2019’s track on Cloud and Infrastructure: Smart Starts Here.   To hear Rorive’s talk, register for IBM Think 2019. [/vc_column_text][/vc_column][/vc_row] ### Machine learning: A rationalist’s view [vc_row][vc_column][vc_column_text]Given the volume of column inches devoted to artificial intelligence in today’s media, it is all too tempting to treat it as a revolutionary concept. In reality, the concept is anything but new. The historical record shows that the term was first used by computer scientist and pioneer John McCarthy at an American technology conference in 1956. This fact isn’t meant to dissuade those who have faith in the undeniable potential of AI to transform our world – because it surely will – but to encourage a degree of measure, reason and patience in the discussion. First and foremost, all of us who work in the technology world have a responsibility to our customers and the wider public to be clear about what we mean when we say ‘AI’. As a recent editorial from The Guardian points out, the term is commonly used to describe what might more accurately be referred to as machine learning: the process of discovering patterns in datasets to improve a system’s intelligence. Today, machine learning impacts the minutiae of our daily lives in more ways than we may care to imagine. Big brands like Apple and Nestlé are investing huge sums of money in developing AI and machine learning models to capitalise on the kind of in-depth data that helps form a sophisticated picture of consumer choices and habits, aiding their product development and advertising strategies. Ads that pop up on your web browser are anything but random. Hugely popular online streaming services like Netflix and Amazon Prime use similar AI bots to curate a viewing experience that you’ll enjoy, recommending films and TV series by remembering what you’ve previously watched. Do you ever wonder why a TV series that you’ve never heard of has a score of “95%” on Netflix? 
This isn’t a user rating, but the system playing matchmaker, as the machine learning bots learn your viewing patterns and then make a calculated judgement that you’ll enjoy it. In truth, Netflix belongs to a powerful new generation of businesses whose online models are contingent on their ability to shape the services they provide around the behaviours and preferences of their customers. For modern behemoths such as Amazon, eBay, Uber and Deliveroo – with their unprecedented market valuations – this is the new norm. But the reality is that these companies, though ubiquitous, remain the exception and not the rule. While firms such as PwC estimate that AI could add a staggering $15.7 trillion to the global economy by 2030, research from Microsoft just last year found that most do not have an adequate AI strategy in place to take advantage of this opportunity. The challenge lies in the data. It doesn’t really matter how smart or sophisticated the AI bots are. If they are being fed bad or insufficient volumes of data, they will be unable to produce any kind of meaningful insight. This is, in part, what Mars Inc’s chief digital officer, Sandeep Dadlani, means when he says that “AI is still in its infancy in terms of solving problems”. Organisations haven’t even scratched the surface of what’s possible because they still do not possess the data that will make a real difference. In order for machine learning to do its job, it needs datasets that are broad, consistent and unbiased. The phrase ‘garbage in, garbage out’ was first used in 1957, just one year after McCarthy coined ‘AI’, but it is more relevant now than ever before. In fact, IBM has estimated that ‘bad data’ could be costing organisations as much as $3.1 billion in the US alone. One solution is greater investment in the digital talent gap. As organisations move more administrative and sales processes online, they have no choice but to recruit a higher number of computer programmers, web developers and data analytics experts. This is already taking effect. Harvard Business Review described ‘data scientist’ as the sexiest job of the 21st century seven years ago, while online recruitment firm Glassdoor has named the “best job” in America for the past four consecutive years. All of this suggests that demand for intelligent tools that improve data analytics and governance will only increase, as organisations streamline corporate information architecture with tools that help to embed machine learning into business operations, ultimately providing a simplified, richer customer experience. This is why Mitra has recently partnered with Amazon Web Services Marketplace for Machine Learning. Our business has developed a suite of machine learning products to help democratise AI’s capabilities, giving a growing number of organisations the capability to turn their data into genuine insight. Firstly, however, businesses must plant the necessary foundations by looking at the type and volume of data they are collecting, and the objectives they are trying to achieve as a business.[/vc_column_text][/vc_column][/vc_row] ### Top to Bottom Success in the Enterprise WAN [vc_row][vc_column][vc_column_text]Multicloud has changed the world, and as much as the transition is impacting the inside of the modern data center, it is reshaping the networks between them. In championing the migration to multicloud through 2018, Juniper Networks saw growing momentum for our Contrail Enterprise Multicloud and Contrail SD-WAN solutions. 
Given the heavy focus in the industry on automation and overlay (commonly known as SD-WAN), the underlying network (commonly known as Enterprise WAN) is often overlooked. Juniper has been delivering solutions for Enterprise WAN across private and leased networks that enable a secure, reliable and scalable network underneath SD-WAN. Today, we published the enhanced Enterprise WAN 5-step journey, along with leading thinking about the evolution of routing and securing enterprise networks under the sweeping forces of multicloud. So let’s peel back the SD-WAN overlay and see how enterprise customers are securing and automating their Enterprise WAN with Juniper Networks solutions in four key areas: WAN backbone, data center interconnect (DCI), public cloud and internet peering. Routing the Enterprise Here is what our customers are saying: -          “The  entire retail industry continues to evolve, and at Tractor Supply, this starts with our network. With Juniper Networks, we significantly enhanced the backbone of our network, allowing us to streamline operations processes and bring faster alignment across all of our locations. Juniper’s routing technology is like the Swiss Army knife of platforms—the MX Series offered us every feature set we were looking for within one box.” - Raymond Beaudoin, Network Architect, Tractor Supply Company   -           “The MX480 that is responsible for the WAN between the data centers of Tokyo, Osaka, and Yokohama and our Internet connection boast incredible stability. The Internet side sometimes has to handle a full BGP routing table of more than 500,000 routes, but it goes off without a hitch.” - Tomotake Wakuri, corporate senior architect for security and network of Ricoh’s Digital Promotion Division         -          “The City of Portsmouth is on the brink of digital transformation for smart city foundational infrastructure. We are investing in Juniper’s offerings to securely connect our network, community, and region for the virtual generation. In partnership with Juniper Networks, we are securing and automating our multicloud WAN, which will allow our digital growth without constraint and accomplish our goals of increased mobility and open data infrastructure.” - Daniel Jones, CIO, City of Portsmouth   -          “At University of South Carolina, we take pride in the quality of our research. By partnering with Juniper Networks to enhance multicloud routing and connectivity, we will be able to vastly increase the capacity of our network, consequentially opening up a greater breadth of research possibilities for our scientists and academic researchers.” - Paul Sagona, Interim Director of Research Computing, University of South Carolina Juniper Enterprise Multicloud Routing Top to Bottom Hastily jumping into public cloud or proprietary data center solutions can encumber enterprises as they move to multicloud. With multicloud squarely in Juniper’s enterprise strategy, our goal is a future where enterprises are free to move workloads between clouds with optimal networking and strong security in the overlay and underlay networks—top to bottom. Overlays and underlays are well understood in the data center, but in the WAN, examining and optimizing the underlay for the new multicloud era is not well covered, nor well understood. 
Starting at the bottom in the WAN underlay or transport, Juniper’s strength in WAN routing technology and vision is built on a foundation that helps enterprises connect their campuses, data centers and the super-highway of direct connections into public cloud with SDN-based traffic engineering using NorthStar, switches, and routers like the MX10003, MX204, QFX10000 and PTX. On top in the WAN overlay, solving the difficult challenges upfront, Juniper purposely built Contrail SD-WAN to conquer NFV, uCPE and SD-Branch. We are now the provider of the most scalable and versatile secure SD-WAN solution, while other vendors in the space are fettered with architectural revamps to try to add in branch security and LAN orchestration. Best of all, our open standards-based solution bridges seamlessly with WAN backbone routing used in the underlay or traditional multi-vendor routing. Juniper’s portfolio easily serves the largest enterprises that need to direct the WAN backbone and other underlay routing use cases. It also serves enterprises, big or small, looking for SD-WAN control. And unlike some SD-WAN vendors, Juniper acknowledges and addresses WAN solutions top to bottom. Furthermore, the WAN overlay solution is only complete with solid SDN overlay control for the data center interconnect (DCI). New in Multicloud DCI SDN inside the data center was easily the first major use case for more dynamic software control over scale-out switching architectures and plugging into workload orchestration systems. Other use cases are found in the DCI and involve a secure underlay: this could be over the internet with IPsec, over dark fiber with MACsec, or simple DCI. On top of the secure underlay, EVPN gives you the ability to achieve large-scale multi-tenancy and workload mobility. Contrail Enterprise Multicloud handles the data center top to bottom, managing the physical underlay fabric, virtual overlays and security, and can plug into a range of workload orchestration systems. To bridge the SDN journey between data centers in the WAN, we have now expanded Contrail Enterprise Multicloud for overlay control with standards-based EVPN-VXLAN Type 5 routes. The solution to orchestrate L3 DCI is available today and supported with our MX and QFX Series devices. The new DCI features naturally complement the Contrail Enterprise Multicloud solution inside the data center, and round out our SDN orchestration for enterprises end-to-end and top to bottom. The Enterprise’s Reach Should Exceed Its Grasp Enterprises of all sizes need to plan their journey to a multi-vendor multicloud reality, because change is inevitable but network rip-and-replace disruption is not. To this end, Juniper has put together the latest 5-step enterprise WAN journey to secure and automated multicloud. As enterprises advance their multicloud routing and WAN to develop SD-WAN, DCI, multicloud transport routing and SDN optimization, Juniper Networks' building blocks and commitment to open standards can maintain smooth growth with an evolvable architecture.

### How to Get Boards ‘On Board’ with Innovation Governance

Industries around the world are facing disruptive digital transformations. Emerging technologies are demanding new ways of operating as well as delivering unique, sustainable value. In several cases, the winners and losers are already becoming evident.
One such case is the global fast food giant McDonald’s, which, in 2017, implemented new technology for ordering and delivery, including a mobile order-and-pay-ahead app, in-store digital ordering kiosks and a delivery partnership with UberEats. These innovations, augmented by other digital strategic initiatives, have further cemented the fast-food titan’s market position through increased customer traffic and better earnings. On the other hand, several once-dominant enterprises that failed to positively embrace disruptive trends – such as BlackBerry, Kodak and Myspace – have either lost a significant proportion of their market share or have been completely swept out of the way by nimbler and more agile newcomers. The stakes of digital transformation are high. The difference between success and failure, however, often comes down to governance. Using frameworks and improvement models such as COBIT and CMMI can ensure IT investments support business objectives. Successful innovation governance requires understanding and buy-in from both the board and the C-suite. But, these business leaders cannot do it alone. Here are 5 recommendations to help your board and C-suite improve technology governance and adapt to disruptive change: Speak the language of business. While many companies are working hard to recruit leadership with tech expertise, boards and C-suites may not have the depth of knowledge to oversee critical technology-driven initiatives or fully assess digital opportunities and threats. Leaders also may be prone to inertia. They may not see a compelling need to change traditional ways of running the business, especially when these methods have worked well in the past. But IT leaders can influence the board and business executives by talking the language of the board and C-suite – especially addressing specific business metrics: revenues, profits or cost efficiencies.   Improve your business acumen. Be a business leader, not just a technology leader. To achieve this, you need to ask some important questions: Who are your organization’s key stakeholders? What do they want? What drives business results? What is the business strategy and how does it inform near- and long-term objectives? A CIO with this level of understanding is more likely to influence the enterprise’s strategic decisions as part of the executive team.   Know how technology will create unique, sustainable value. If you want board and C-suite support, and enough technology innovation budget, it is important that you clearly explain how a given technology will deliver unique, sustainable value to your organization. The most common goals of innovation are to modernize core functions (including elimination of redundancies), improve efficiencies or innovate products and services – but do not neglect opportunities to create a new business model through disruptive technology. For example, DHL and other companies are testing drones for delivery of medication or blood samples for lab tests in remote and hard-to-reach locations.   Help the board and C-suite assess risk. Business leaders may freeze in the face of disruptive technology. They may convince themselves that it’s too early to invest, or the technology costs too much, or the risk of failure is too high. The choices, however, are binary: Innovate or become obsolete. It is therefore important for the CIO to help the board and C-suite understand that failing to invest in digital technologies may open the door to competitors or have other adverse effects. 
When companies like BlackBerry and Kodak did not drive technology innovation at the highest levels, it cost them their industry leadership.   Build an Innovation Culture. Driving digital transformation is hard without a Digital Culture. While a number of boards are starting to direct business strategies fronted by digital technologies, most boards do not only lack digital understanding, but shy away from the digital transformation agenda. Creating a culture of innovation requires executives and boards that are digitally savvy. CIOs need to assist the board and executive teams to upskill their digital skills by assessing current digital fitness to design nimble digital learning plans and by creating collaborative engagement models, such as innovation labs, alternative workspaces or sandboxes. This will ensure that the board is not only steering the main crew in the desired digital direction but will enable everyone within the organization to contribute to the digital journey. As a business leader in technology, a CIO plays an important role in fostering innovation governance that helps enterprises thrive amid today’s fast pace of change. CIOs who are unable to drive their boards and executives on the innovation bandwagon are being naturally phased out. Being an innovative organization is not about chasing the latest technology fad; it’s about accelerating specific enterprise customer and employee experiences and creating core business value using transformative technologies such as artificial intelligence, IoT, connected clouds, machine learning, virtual reality and robotics to shift business performance and customer experiences. This requires technology leaders to bring their boards on board with innovation governance.[/vc_column_text][/vc_column][/vc_row] ### Taking Label Management Into The Cloud [vc_row][vc_column][vc_column_text]Labelling is a key business process that is widely used internally and externally, across the organisation. From streamlining logistics and distribution, to helping manage suppliers and production, labelling can be a challenging task. There are multiple issues to consider; the hardware involved, ability to design labels, maintaining accuracy of information, and making the correct labels available to the right departments. Accuracy, for example, is key for those organisations working in regulated environments such as pharmaceuticals, chemicals or food labelling. On the other hand, businesses with international offices and distribution centres may find it challenging to cost-effectively design and deploy correct labels in different languages for different regions. Technology is increasingly being used to help overcome these challenges and make the labelling process more effective, despite the fact the industry has been largely static for the past few years. Technologies like cloud can have a dramatic impact on operations especially seeing as so many other critical business processes, like customer relationship management (CRM), have already made the move. So if they can move to cloud, why can’t labelling? The answer is, it can. Moving away from a legacy of complexity As it stands, many organisations are stuck with legacy labelling processes that adds complexity to the design, printing and distribution of accurate labels. Typically, a business has multiple systems, across multiple departments that don’t necessarily work together. In addition, because labelling methods have historically been difficult to use, label design has largely fallen to the IT team. 
If something needs to be updated or changed, or a label designed from scratch, already over-stretched IT teams need to do it. The problem is that it won't be done quickly, which can hold up the rest of the process. In one manufacturing company, for example, getting a label changed requires a minimum eight-week wait for the IT department, which charges €3,000 per change request. This means important business opportunities may be lost. These disparate systems and hardware are costly to maintain and require specialist skills to operate. Democratising label management In the era of the cloud or software-as-a-service (SaaS), the labelling process is democratised. The design element of the process is made easier so that business users can do it themselves, while the quality assurance process becomes more efficient because these systems are typically intuitive and user-friendly, built using familiar interfaces. This leads to one of the main benefits of moving labelling to the cloud. Adopting a SaaS approach means there is no additional specialised (and expensive) IT infrastructure needed, and therefore no need to spend money maintaining it. As a result, businesses of all sizes — not just global enterprises — can afford label management systems and benefit from the productivity gains. In the past, it was larger organisations with the resources and in-house IT skills that could deploy such a system. Now there are far fewer barriers to entry for smaller businesses that don't have the IT skills and budgets needed for hardware, maintenance and management. Better quality assurance, lower risk The advantages of digitalising the labelling process don't stop at cost and ease of use. There are wider benefits, especially when it comes to the accuracy of labelling information. A digital approach to quality assurance of labelling effectively removes the human element from the process, which also takes away risk and cost. It is also easier to secure the process, and it provides an added measure of traceability. Firstly, only people with the right access can amend data or change the label templates. Secondly, digitalising the label management system provides users with an extensive audit trail, detailing which changes have been made, who made them, when labels were printed and what was actually printed. This traceability is vital for meeting compliance requirements in regulated industries such as food and beverage and pharmaceuticals. Building on quality assurance, a cloud-based label management system enables businesses to centralise their label management processes, storing, changing and approving all labels in one location. This ensures that everyone can access the right, most up-to-date information wherever they are located across the business or across the country. The same applies to global operations, making information available to suppliers, partners or affiliates if needed. Conclusion It is indeed a new era for labelling. Digitalising the process brings widespread benefits in agility, efficiency and speed to all organisations, regardless of their size or the industry in which they operate. Cloud-based or SaaS systems ensure businesses can move easily and quickly away from legacy IT solutions that may be more of a hindrance to the process than anything else.
It also means that more people within a business can design and print labels, quickly and easily, and have access to the right versions and the most accurate data.[/vc_column_text][/vc_column][/vc_row] ### Humans Struggle to Turn Off Pleading Robots [vc_row][vc_column][vc_column_text]There is a growing market of interactive robots that provide humans with companionship and help. They even take up typically human roles such as security guards and receptionists. As robots become more and more humanised, where do we draw the line in our relationships with the robots? Is compassion for robots overshadowing human interaction and true sentience? Studies have established that we are easily drawn in by the simulation of social cues and human traits exhibited by machines. A recent experiment carried out by German researchers from the University of Duisburg-Essen shows that the simulated cries of a robot can emotionally manipulate people to such an extent that they will refuse to switch them off. The study consisted of 89 volunteers, all of whom were required to complete two tasks with the help of a small anthropomorphic robot called NAO. These tasks included asking NAO questions and organising a weekly timetable. The volunteers were told that the tasks were designed to improve NAOs learning algorithms, but in actuality, the real test was how capable the participants would be at switching the robot off as it begged them not to. In approximately 50% of experiments, the robot begged for its life, saying to the volunteers that it was afraid of the dark. When NAO desperately asked for mercy, the human volunteers were likely to decide they could not switch off the robot. Of the 43 participants who heard NAO’s pleas for its life, about 30% of them could not bring themselves to switch NAO off and refused to do so. The remaining 30 people participating in the study took twice as long to agree to turn NAO off compared to those who did not witness NAO’s begging. Those who refused to switch NAO off had an array of reasons when asked why. A few people were shocked by the begging, which made their decision difficult, and other participants felt morally responsible for the robot, not wanting to do the wrong thing. For the most part, people chose not to switch NAO off simply because the robot asked them not to. In the paper documenting the study, the researchers conclude that the volunteers were “triggered by the objection, people tend to treat the robot rather as a real person than just a machine by following or at least considering to follow its request to stay switched on”. Now that robots of this calibre exist, a new social dynamic is introduced as humans have to ability to converse in their language with robots, and inevitably people forge some attachment to other social entities whether human or not. Research resembling that of the NAO experiment included a study carried out in 2007. The experiment revolved around a cat-like robot who also begged for its life. Unlike the case with NAO, all participants did turn off the robot after quite a moral struggle. This is perhaps because the participants were in the company of the scientists and were more focused on obeying them as an authority figure than the robot, whereas NAO was left alone with the participants, making for more of a bonding session between man and machine. Human empathy for machines has been identified in other studies too. For example, the first test of humans and robots interacting on an emotional level was carried out in 1966. 
Professor Joseph Weizenbaum from the Massachusetts Institute of Technology created a therapeutic computer program called “Eliza”. Eliza acted as a psychologist, asking participants questions about their wellbeing and feelings. As a result of this emotional interaction, people started treating Eliza as if she were human. It has been found that if a human can relate to a simulation of human life, there is little difference in the way they treat them in comparison to other human beings. The German researchers from the experiment with NAO found that people predominantly prefer interacting with a robot that they feel is similar to themselves regarding personality. Robots can have such an impact on people if they accurately simulate human behaviour. They have such an effect that people can even end up following the commands of robots if they are acting as authority figures. Robots have always been associated with the future and dystopian visions of the universe where robots threaten to overrule humans. This literary and cinematic trope is no longer all fantasy. Given their emotional capacity and their growing ability to interact with humans as if they were human themselves, the robots featured in films like Ex Machina, whereby incredibly convincing humanoids are objects of human affection, are not so fantastical. Relationships between man and machine, whether in the workplace or more controversially, romantically, are no longer solely present in storylines belonging to fiction novels or sci-fi films. A world shared by human and humanoid is ever increasingly becoming our future, as robots like NAO are already pulling on the heartstrings of the people it interacts with.[/vc_column_text][/vc_column][/vc_row] ### Top IoT Devices That Will Become More Useful In 2019 [vc_row][vc_column][vc_column_text]The days of convenience and easement of work are upon us. There are so many things that we have to take care and make it through at the end of the day. At times, it becomes very necessary to find something that will surely take care of such daily chores in the ideal way possible. Humans are innovating every phase of life using technologies and Internet of Things (IoT) is one of them. From industry to household, most of the devices used to make things work every day can now be controlled using the power of internet and innovations. As per a report from Gartner, over 6.4 billion IoT devices were used and the number will triple itself within a span of few years. It means that the number of devices used will reach 20.8 billion by the end of 2020. This prediction was done at least 5 years ago. The recent trends and numbers suggest that the invention of more gadgets will escalate the use of IoT devices up to 50 billion. The overwhelming number of devices used for convenience is remarkable. It is a blessing for the daily life of the busy people out there as the devices can be used perfectly to manage everyday work and chores properly. Leading manufacturers following this trend The biggest players in the digital world such as Amazon, Google, Apple, etc have already initiated finding new ways to make life better by introducing new IoT devices. The majority of the entrepreneurs are also relentlessly looking for smarter options to come up with elegant devices connected via IoT and ready to work as per the command of the users. From doorbells to security cameras, water sprinklers to floor cleaners, any device can now be connected in a single system using this elegant innovation. 
The IoT predictions for 2019 suggest the following things that will surely amaze you in different ways. Edge computing The inclusion of data processing power via machine learning and AI platforms will enhance the elegance of the devices used every day. The data transfer requirement will reduce considerably as the devices will depend less on cloud computing. Edge computing is a part where a decision has to be made immediately based on complex data analysis. Big player domination As mentioned earlier, the big players have already started dominating this industry and will continue to do so. The emergence of smarter devices such as Siri, Alexa, Google Assistant, and other IoT linked devices will be witnessed more. Automotive models for business Data science will become the lifeline of every industry. The automation will become more aggressive and every vehicle in the industry will need to gather more data for seamless service. This is why the devices that will be made in the future will become more data-centric. This particular trend will also pave the way for 5G connectivity and services. Smart IoT devices emerging in 2019 Below is the list of Most Popular IoT Devices in 2019 we will see more in the market. Smart lighting system The households will find smart lighting system a better solution to use and save more energy. Other than illuminating the entire interior with the smart lights, the IoT controlled system will deliver immense flexibility to the user to control the lighting in a room as per the requirement without compromising. Depending on the mood, a specific environment can be created using the lighting system. Smart IoT medical devices The medical devices will now be using the IoT platform so that they can be monitored all the time to check the status of a patient. It will revolutionize the healthcare industry especially for those patients who need constant supervision of the healthcare professionals. In this aspect, the medical platform will become more efficient. The lack of medical professionals can be solved by tackling the rising demand for healthcare necessities of the population using such devices. In fact, the AI platforms will also ensure that the queries and other simpler actions can be done automatically in order to free the professionals on the floor for more important services. The IoT medical devices will also become a smart companion for the patients. They will help to find more information about a prescribed medicine. The device can monitor the vitals of a patient and can also notify to take a pill or procure the stock. From medications to vaccinations, electronic health records to insurance information, IoT devices will be used for various purposes. Smart kitchen appliances Think of an intelligent oven that can provide you with the proper insights regarding what you are cooking. It means that the IoT cooking appliance will use a camera and will find out the specific necessities to provide you with better knowledge. With the advent of smarter kitchen appliances, you can easily keep a tap on the progress and never let your preparations get overdone. These IoT devices examples can be connected to your smart phone and you can enjoy doing other work while the appliances track the progress. Smart home appliances As mentioned earlier, the daily chores such as cleaning the floor, watering the garden, etc can now be done using the IoT devices. From locking the doors, cars, and other assets to clean the floor, the devices can be easily connected and controlled by smart phone. 
In fact, security systems, smoke alarms, pet cams and similar devices can run on the IoT framework. Electronic devices such as televisions, sound systems and lighting can be connected to deliver a better service, with the interconnected devices coordinating to give you the ideal experience. You can also increase the security of IoT devices by using a smart firewall. Devices like these could change our lives and will dominate the market in the upcoming years. Verdict These IoT trends for 2019 and beyond can easily be introduced to your home and office by hiring the right IoT service provider. Once these devices are incorporated, your life will become far more convenient.

### A Connected Future In Fintech

Fintech has dramatically disrupted the financial services industry, with four of the most successful fintech companies launched in just five years from 2010 – Revolut, Monzo, Stripe and Starling Bank. Why have companies like these continued to succeed and emerge as we head into 2019? The easiest way to find out is to look at what each of them has in common – putting the customer first, perhaps only made possible because they weren't being held back by legacy technology or processes. Keeping up with success Fast forward to today and the fintech industry is already booming, attracting $57.9bn of global investment in the first half of 2018, with new companies launching into the market every day. But to ensure that success continues over the coming years, fintechs need to be able to grow and scale at pace in order to support their continually growing customer bases. Scaling up isn't an easy process, as many businesses will know. Though putting in place tools and processes which can grow with the company is essential to empowering the organisation to get on with what it's best at, finding the right approach isn't always straightforward. At the same time as coping with this huge amount of change, fintech companies need to maintain business as usual, yielding the same levels of customer satisfaction as always with the fully digitised customer service that they have become synonymous with. As such, it's vital that companies approach this period of growth carefully and play it right to ensure success. Scaling up will bring with it a whole host of change and, paired with the constant evolution that comes part and parcel of today's market, fintech organisations need to be flexible enough to adapt to the environment and fast enough to make decisions as quickly as change happens. The era of planning The solution to ensuring that businesses are able to cope with and quickly respond to this constant stream of change is all in the planning. Fintechs already benefit from flexibility and adaptability thanks to their highly digitised working practices. But this won't help unless the same qualities are also reflected in their business planning. Being relevant now means nothing if you can't adapt to the changing climate, so companies need to ensure they are planning for the future – not just thinking about the here and now. In a recent survey, Ernst & Young revealed that a third of UK fintech companies believe they are likely to IPO in the next five years – a clear demonstration of the rewards that can be reaped from staying successful. But the answer to truly setting the successes apart from the failures is connectivity.
A more connected company with a more connected approach to how it plans will be leaps ahead of their competitors. Connecting the dots To be able to accurately forecast revenue, costs and liquidity on a regular basis, fintech companies need to first connect the dots – those being the people, processes and data. Doing so will enable them the ability to model and digest significant variations in activity and resources, as well as changes in operating models and growth scenarios. Keeping the business in silos only holds a company back. To be able to have insights into where money is being spent in order to forecast more accurately, and as such, more valuably, fintechs need to break down these silos and connect the business so they can make more informed decisions and ensure they don’t burn through valuable capital unnecessarily. Connected planning is also a great strategy to help businesses remain nimble - which is an extremely valuable asset when the future is so unclear. With new regulations being implemented, political fluctuations affecting currency rates, access to skills and trade deals, no one is able to predict what the future holds. Taking a connected approach to planning allows information to flow more freely across a business, providing a much more holistic view of the entire company that allows visibility into its performance. The data collected can then be used to forecast for years in advance at a time, making the future much more predictable and providing the much-needed agility and insights for adapting and responding to potential unknowns. If changes occur, organisations also have the power to input new variations and quickly run new scenarios to respond in real time, potentially marking a real competitive advantage. Suddenly, fintechs can make critical decisions that before may have seemed too risky, with far greater certainty. Fintechs have digital on their side, meaning traditional banks are unlikely to ever keep up with their agility – but that doesn’t mean fintech companies shouldn’t always be looking over their shoulder. What traditional banks do have on their side is experience, loyal customer bases and money. As such, taking a connected planning approach to ensure fintechs have a clear view of performance and the ability to react quickly to market changes is key for these companies to cement their place in the future of finance.[/vc_column_text][/vc_column][/vc_row] ### How is FinTech Promoting Financial Inclusivity? [vc_row][vc_column][vc_column_text]The financial services industry was traditionally dominated by a select group of institutions that monopolised the way we saved, borrowed and invested our money. The emergence of new technology, as it has done in so many other industries, has disrupted the status quo and broken up the longstanding control of those institutions. The result has been more choice for the customer as well as much greater access for those who previously may not have been eligible for traditional banking products. However, in recent years, it has emerged that although many more people are now able to access the formal economy, there may also be a limit to the impact of digital financial technology (FinTech) which is hampering efforts to improve financial inclusion around the world. With that in mind, we’re going to explore the impact FinTech has already had and look at why it may not be the panacea for financial inclusivity. FinTech in action – what improvements have been made? 
There’s no disputing the tremendous impact FinTech has already had on levels of financial inclusivity around the world. Figures from the 2017 Global Findex Database, which is published by the World Bank, reveal that globally, the proportion of adults with a bank account now stands at 69 percent. That’s up from 51 percent in 2011 when the Global Findex Database was first produced. As you might expect, many of the FinTech success stories can be found in developing countries. For example, in Africa, a lack of access to finance has long been a barrier for small businesses and entrepreneurs. The rate of women’s entrepreneurship in Africa is higher than in any other region of the world. However, a report by the Graça Machel Trust charity found that 71 percent of those businesses were self-funded because of the difficulties associated with raising capital from the banks. A Kenyan FinTech start-up, 4G Capital, has partnered with a Canadian blockchain securities firm to launch Africa’s first bond issued using cryptocurrency. The aim is to bring much-needed investment into Kenya. That investment will be used to lend $40 million to Kenyan businesses over the next 12 months alone. That will impact more than one million people by 2020. Financial inclusivity is not just a global issue A common assumption is that a lack of access to financial services is predominantly a problem experienced in developing markets. However, in the UK, there’s also a significant number of people who are unable to access or understand the basics of formal financial services. For example, 2 million people in the UK don’t have a bank account and 1 in 5 are unable to understand a bank statement. In the UK, the FinTech sector is taking more of an additive rather than a transformative role. It is making life easier for customers and SMEs by introducing additional services that let them make payments more quickly, access and compare affordable sources of debt and manage their finances in a simpler way. For example, new Open Banking rules have been introduced to encourage cooperation between FinTech firms and established institutions like the banks. The rules are designed to give customers a clearer view of their finances so they can see multiple products, such as bank accounts, broadband and mortgages, all in one place. That allows consumers to manage, compare and switch deals more easily so they can make the most of their money. What are the limitations of FinTech? While FinTech has made giant strides in boosting financial inclusion in the UK and around the world, there are some limitations that are restricting its effect. One factor that could diminish the potential impact is the additive nature of many of the products and services that are being introduced. The aim of additive FinTech tools and apps is to enhance the convenience for banked and underbanked populations. However, to promote financial inclusivity in developing countries, what we really need is transformative tools. They are designed to address access to and usage of financial services by the two billion people around the world who are still completely unrepresented by the formal financial system. In recent years, there has also been a slowdown in the FinTech boom that had taken countries like the US, the UK and much of Europe by storm. Economic and political uncertainty in those regions has made it more difficult for FinTech start-ups to access venture capital funding. That has led to a geographical shift in FinTech innovation into new regions such as Asia. 
Another potentially limiting factor on the impact of FinTech is the access to technology itself. Many FinTech tools rely on the consumers’ ability to access digital services through a website or app. In 2018, the world’s internet users passed four billion for the first time, with much of the growth down to more affordable smartphones and mobile data plans. However, that still leaves a third of the world’s poorest consumers without the ability to access the internet and unable to unlock the financial services they need the most. The changing financial inclusion space FinTech has undoubtedly had a positive impact on rates of financial inclusivity around the world and brought what was a relatively neglected topic on to the global agenda. There is now far greater interest, understanding and ambition regarding these issues than there has ever been before. However, with 2 billion consumers still unable to access the most basic financial services, there’s still a very long way to go.[/vc_column_text][/vc_column][/vc_row] ### It’s Time To Stop Mixing Up AI and Machine Learning [vc_row][vc_column][vc_column_text]There is a lot of buzz/hype around artificial intelligence (AI) and machine learning (ML) – some of it for good reason. With DeepMind beating the world’s number one player at Go, and Netflix utilising machine learning to recommend shows to users, AI and ML has many industries excited (and worried), about its transformative implications for how we work. It has various members of the C-Suite getting excited about the potential revenue streams and new business opportunities. Yet, confusion reigns – artificial intelligence and machine learning mean the same thing for a lot of people/too many people. This is simply not the case. This lack of understanding is the first fundamental hurdle organisations are facing around artificial intelligence and machine learning. After all, how are you supposed to extract value if they are used interchangeably when they represent different things? Organisations must know the difference As Bernard Marr states: AI is the broader concept of machines being able to carry out tasks in a way we’d consider ‘smart’ (for example, recognising a face in an image or reasoning an answer based on a question). Machine learning is an AI application based around an idea that we should be able to give machines access to data and let them learn, predict and classify based on the input they are given. It’s similar, but it’s not the same – and it’s making some in the tech industry look foolish! Buzzwords are a problem for the enterprise. They become so overused they become meaningless, leading to miscommunication and misunderstanding – and possibly to wasted investment. It leads to an unfounded belief that AI and ML can be used to solve any business case or problem. The over-hype that exists around AI and ML right now will mean that a great number of enterprises will increase their odds of becoming another failure statistic as they rush to figure out how to use shiny new toys without putting the right platforms and procedures in place based on real understanding. It will be hard to have an analytical culture and really compete on data science without a growing awareness of what tools are what, in the marketplace. It’s hard to get the shiny toys to work The big ‘FANG’ companies (Facebook-Amazon-Netflix-Google) understand the complexities of getting it right. 
They understand that by leveraging the data deluge carefully and laying the right groundwork, they start from a strong foundation. The data they have is in the right place, well kept, understood and curated – meaning a solid foundation is in place for any kind of transformation or experimental project. That's no small part of the reason behind their growth and dominance in tech. However, the C-Suite must note that even the most well-resourced and most 'data ready' of enterprises are not immune to artificial intelligence mistakes. An unfortunate demonstration of this can be seen with Tay, Microsoft's chatbot. Originally designed as an experiment to engage with real people using light conversation, Tay started well… doing just this. Unfortunately, this descended rapidly into chaos – with Tay pouring out inflammatory, offensive remarks from its Twitter account, based on nasty human input. Microsoft's error was a demonstration of woeful utopian naivety, based on the assumption that everyone is as nice to each other online as they are within the halls of Redmond. Failing to prepare its chatbot for the worst elements of human society allowed it to learn bad behaviour in a public setting. Microsoft is not the only one to struggle. Facebook also recently had to scale back its chatbot projects, citing a 70% failure rate. The mistakes these companies made demonstrate the various challenges AI poses. AI is hard. With the vast majority of the industry still not really understanding the nuances of deep learning networks, disaster could ensue, with worse results than those of the bigger players, who already have the data groundwork in place. Lay the groundwork, then focus it It is critical that organisations don't try to run before they can walk. A good example of this is another inescapable enterprise buzzword – the Internet of Things (IoT). Research conducted by Cisco into the failure rates of IoT projects found that three out of four IoT projects fail. Cisco cited challenges such as the time taken to complete, limited internal expertise, quality of data, and integration across teams. These are similar challenges to those facing many other kinds of transformation that require a strong data element to succeed, including AI and ML. The success of any transformation or experimental project hinges on the ability to garner smart data insights to improve processes. However, without the right data tools, or the correct data culture to encourage this, you are on a 'hiding to nothing', as the saying goes. The takeaway is that AI is 'hard' and the industry still doesn't understand the nuances of deep learning networks well enough to apply them everywhere they're needed. Secondly, AI (as ML before it, and descriptive analytics before that) needs to be focused on a tight business problem. It's not a one-size-fits-all solution and requires situational awareness and laser-sharp attention to the customer or end-user need in order to succeed. But that just means there is an opportunity out there to be seized by those prepared to get to the heart of the problem and use the world of analytics smartly: creating a data culture, and having knowledge workers take the lead in making intelligent and positive change with their best resource – data. It's a wild ride!

### Must-Have Logos & Videos for Your Business

Digital marketing is evolving at a very rapid pace.
No matter what type of business you own, if you want to stay ahead of the competition, you must have something that attracts the attention of the people. Undoubtedly, running a business requires good marketing knowledge and advanced strategies to make it successful. Even to stay up-to-date, you need comprehensive and inspiring information from top experts, here website like Compare the Cloud can help! It provides information from top experts from different fields that help you to stay updated. Why a Business Needs A Logo or a website? A business without an influential logo or video will not be able to persuade the audience. As these, both play a crucial role in setting the image of your business in visitors’ minds. As you may know, a logo is the face of your business and is the first thing your audience know about your brand. Whereas, an effective brand video helps your audience to know about your business and relate with it. A solid video puts a business face to a name and allows the audience to know the real nature of your business and offerings. Whether you own a small business or a large enterprise, having a video and logo that stands the test of time is very important. Today, many big names like Nike, Apple, and more have influential logo and videos that leave an impact on people’s mind. Coming to the main topic- must have logos and videos for your business, let’s discuss the importance of logos and videos and their role in your brand’s promotion. Let’s have a look: Importance of Logos A Digital Identity of a Business  A logo is a major component of a business. In short, it is the face of your business audience will instantly recognize when they see it. Every customer prefers to visualize a product before buying or using it. According to a survey, an effective logo has a lot to do with the clients’ appreciation. A good logo contributes a lot to different marketing strategies. For example: When you decide to buy a Nike product, the first thing you will recognize is its swoosh. It is the first thing that comes in the mind. Right? Even they don’t have any second thought about the brand quality. As a business owner, you should get inspired with such brands that are doing so much to make their visual identity. Even, there are also many tools that can help you to get a unique logo and create a unique identity of your business. Make Your Brand Stand Out among Competitors There are several companies in the market offering the same services and products. To be standout from the competitors, every business has to struggle hard to present itself in a unique manner. One best solution to lower your struggle level and make your business stand out in the marketplace is by having an amazing website logo. A creative logo doesn’t only make your business look different but also helps visitors in recognizing your business easily. In addition, a creative outlook helps you in branding your products. So, take out some time to select a simple yet creative logo with an amazing outlook to highlight your identity. Foster Your Brand Loyalty Customers crave for consistency. As your business grows, your logo becomes familiar to a number of clients, and this familiarity builds the perception that you are accessible and reliable. You can add new clients to your list by giving better offers. But to increase the list of sustainable consumers you need something your clients can remember. And a brand logo is the thing customers look for first. 
You can foster your brand loyalty or get sustainable clients to your business only if your brand presents a unique logo with a different outlook. For example, the sports brand “Nike” is a live example of sustainable clients. The logo of this brand is more famous than the products it provides. Importance of Videos for your Business Increase Your Brand Awareness It sounds crazy, but the video has opened doors for many small businesses to educate and entertain the customers. Today, small companies owners are using this powerful marketing tool to grab the attention of the customers. According to the statistics, 85% of companies regard promotional videos as an essential part of their marketing strategy. Adding promotional videos to any website is the best way to help people know about your business. Moreover, it will also boost the customer base easily. By adding a promotional video to your website, you can convey the core values of your business easily. You can enhance exposure to desired clients by providing educational and entertaining videos. These videos can help the visitors to know more about your business and help them remember your brand name. Cost-effective Solution for Boosting Your Client Base Another big advantage of adding promo videos to your website is they help you in reaching out a large number of visitors within a short time, without spending much. A video on the internet can go viral and get you millions of views in a limited time. Just imagine the benefits for your brand. Undoubtedly, video marketing is a cost-effective option for the amazing outcomes it delivers. Convey More and Clear Information in Less Time According to the research done by the psychologist Liraz Margalit Ph.D., the human brain process videos 60,000 times quicker than text. Watching a video is a more passive activity or experience than reading text. A video is the best way to engage your audience and convey your brand message in a positive manner. Remember to create high-quality videos related to your brand to make an emotional connection with the audience and keep them engaged. These are some amazing benefits that are making logos and promotional videos priority of all digital marketers.[/vc_column_text][/vc_column][/vc_row] ### FaceTime on your windscreen? Apple's got it nailed... A recent patent application from Apple has revealed their plans to create an augmented reality windshield capable of using FaceTime. The prospective invention is known as a Heads-Up Display. Apple has detailed this product and described how it will enable “visual communication between an occupant of the vehicle and a remotely located user including an occupant of a separate vehicle.” According to the patent, the Heads-Up Display will also assist the driver by showing them the speed they are driving as well as the speed limit so that they can make appropriate adjustments and abide by the law. Another feature detailed in the patent is the ‘Panicky Occupant Detection’ whereby drivers' eye movements, posture and temperature are monitored by cameras and sensors to determine their anxiety levels. The windscreen would then adjust accordingly, changing the graphics displayed to suit the driver and their psychological state. Apple’s interest in vehicular technology is nothing new. Apple CarPlay is a system they released in 2014 which enables drivers and passengers to connect their phone via USB or Bluetooth to their vehicle and view their iPhone’s content from the car’s infotainment screen. 
CarPlay can be used to navigate, make calls, play music, and even send text messages using voice commands directed at Siri. All of these tasks can be carried out without the driver removing their hands from the steering wheel. The Heads-Up Display offers navigation too, but far beyond that of a standard sat-nav. The windscreen will have an overlay of directions showing drivers exactly when and where to go, instead of them having to look down at a phone or a small sat-nav device. Apple did not invent the concept of AR windscreens, but the addition of FaceTime is undoubtedly unique. Jaguar Land Rover, for instance, revealed its "360-degree Virtual Urban Windscreen" in 2014. Dr Wolfgang Epple, a director at Jaguar Land Rover, explained that their goal is "to reduce road accidents and enhance the urban driving experience. The Jaguar Land Rover research team is developing this technology to improve visibility and to give the driver with the right information at the right time. If we can keep the driver's eyes on the road ahead and present information in a non-distracting way, we can help drivers make better decisions in the most demanding and congested driving environments." Unlike the more versatile plans for Apple's windscreen, Jaguar's displayed mostly safety alerts and images, such as virtual orange cones to warn the driver of areas to avoid. Toyota has also released a car with a virtual display; the Toyota Camry has a 10-inch colour Heads-Up Display. Although Toyota's display does not cover the whole windscreen (at only 10 inches in size), and unlike Jaguar, which said its product would not be on the market for another ten years, Toyota's new feature is fully available as of this year. Although many people will be sceptical about the increasing integration of technology in vehicles, seeing it as a potential distraction, Heads-Up Displays can be used to make the driver more aware of potential dangers. The intention behind the technology is to stop drivers from looking away from the road, not to distract them further. Perhaps Apple's idea of incorporating FaceTime into its Heads-Up system could pose a concentration issue, though. Of course, we do not know quite how this technology will work, but if the display were to be taken over by the face of a friend, it would no doubt be very distracting. It is very likely Apple will have taken this into account and is working towards a safer alternative. As this information regarding Apple's Heads-Up Display plans originates from a patent application, there is no certainty that the product will come to fruition. The application does indicate Apple's intention to expand its technological horizons, though, which is undeniably exciting. Considering its ability to dominate the mobile phone market with the iPhone, one could argue Apple could do the same with Heads-Up Displays. Given the interest the company has already taken in vehicular technology through CarPlay, it is no surprise that it has further automotive aspirations. There's no guessing when this product will be available on the market, but knowing Apple and its incredibly loyal consumers, it will certainly attract many eager individuals. A vast community is already invested in the services Apple offers, such as FaceTime and Siri, and with the release of this new product, many will want to transfer those services to their cars - particularly if the Heads-Up Display aids their road safety.
### A Strategic View For Small & Medium Sized Companies PART 3

TCG Opal is the industry-standard framework for device security. It is designed for mobile devices and laptops and has extremely wide industry support across multiple operating systems. The TCG Opal framework is supported by most top-tier storage manufacturers and covers solid-state drives (SSDs) as well as hard disk drives. Devices that conform to the Opal standard have proven secure encryption built in at the firmware level. Moving the encryption onto the drives means that the reduction in performance is minimised, as the compute-intensive work is done by ASICs (Application-Specific Integrated Circuits) designed specifically for the job. It should be pointed out that, to most users who purchase such a device to install in their laptop, an Opal-certified drive looks like any other drive; only when the TCG Opal functionality is enabled does it become something more than a standard SSD. As good as they are, standards and frameworks will only get you so far. The management tools that utilise them are important. Several vendors provide management tools that integrate with the TCG Opal framework to deliver centralised management and an almost transparent user experience (users still have to enter a password or key on start-up). To utilise Opal, there are two parts to the framework in a client-server configuration. The server works as a management station, allowing control and management of the encryption and the keys. Critically, using a solution such as Opal wraps up the encryption complexity into a set of management tools built on the framework. Vendors include McAfee, Sophos, WinMagic and Symantec. Whilst the exact implementation details vary from vendor to vendor, they cover the key needs and requirements for management. Management functionality includes a wide range of tools and facilities, including: Remote wipe When paired with remote location and management tools, the ability to remotely wipe the drive upon device loss provides additional levels of security and confidence that the data is beyond recovery, even if the keys were available. This is accomplished by removing the device key, so as soon as the device connects to the internet the wipe takes effect. Key management As previously noted, centralised key management is fundamental. It provides help desk staff with the means to manage the encryption infrastructure. Drive-level management Powerful software solutions will also provide monitoring and management of the drives. Such monitoring will alert the administrator or help desk to any potential failure, or to a drive displaying known indicators of early failure, including excessive relocation of blocks or excessive errors. As mentioned, TCG Opal only works on compliant drives, so any drives an administrator wishes to use should be TCG Opal certified. The standard covers all drive formats, whether standard 2.5" SSD, M.2 or mSATA, and it is wise to choose a vendor who has an option in each format. Administrators making the move to TCG Opal need to understand that the management framework comes in a client-server architecture. The clients are the laptops that have certified drives in them, and the management side is a server running the application of choice, which utilises the framework to communicate with the clients and manage the actual SSD devices.
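To make the client-server model described above more concrete, the sketch below models in plain Python how a central management station might escrow each laptop's unlock credential and revoke it to perform a remote wipe. It is a minimal conceptual illustration only, assuming invented names such as `KeyManagementServer`, `enroll` and `remote_wipe`; it is not the TCG Opal specification or any vendor's actual API.

```python
# Conceptual sketch only: a toy, in-memory model of the client-server management
# pattern described above. A central server escrows each laptop's drive-unlock
# credential and can revoke it to perform a "remote wipe". All class and method
# names here are hypothetical illustrations, not the TCG Opal API or a vendor tool.
import secrets
from dataclasses import dataclass, field
from typing import Dict, Optional


@dataclass
class ManagedDevice:
    device_id: str
    unlock_key: Optional[str]   # escrowed credential; None once revoked
    wipe_pending: bool = False


@dataclass
class KeyManagementServer:
    devices: Dict[str, ManagedDevice] = field(default_factory=dict)

    def enroll(self, device_id: str) -> str:
        """Generate and escrow an unlock credential for a newly managed drive."""
        key = secrets.token_hex(32)
        self.devices[device_id] = ManagedDevice(device_id, key)
        return key

    def remote_wipe(self, device_id: str) -> None:
        """Revoke the escrowed key; the client is told to discard its local copy
        the next time it connects, leaving the encrypted data unrecoverable."""
        device = self.devices[device_id]
        device.unlock_key = None
        device.wipe_pending = True

    def check_in(self, device_id: str) -> dict:
        """Called by the client agent whenever the laptop comes online."""
        device = self.devices[device_id]
        if device.wipe_pending:
            return {"action": "discard_key"}   # client erases its key material
        return {"action": "ok", "unlock_key": device.unlock_key}


if __name__ == "__main__":
    server = KeyManagementServer()
    server.enroll("laptop-0042")
    server.remote_wipe("laptop-0042")      # e.g. laptop reported lost or stolen
    print(server.check_in("laptop-0042"))  # {'action': 'discard_key'}
```

The point of the pattern is that wiping is effectively key revocation: once both the escrowed key and the client's local copy are gone, the hardware-encrypted contents of the drive are beyond recovery, which is exactly the behaviour described for remote wipe above.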
The question of how to secure the data is complex, and it is critical to incorporate encryption into standard IT operations and manage it there without hindering progress or risking data loss. Whilst these considerations may sound complex, there is a framework that helps companies manage such items on a practical level. Ultimately, managing device encryption is a mix of well-designed business processes and supporting technology. Centralised management is key to providing a quality service to end users whilst minimising the cost of providing the service and support. Planning is essential, and utilising a framework such as TCG Opal with top-tier SED vendors makes the whole development that much more straightforward. Launched in April 2018, Kingston's new SED UV500 provides end-to-end data protection using 256-bit AES hardware-based encryption and TCG Opal 2.0 security management solutions. As mentioned above, the TCG Opal functionality needs to be enabled to utilise the drive encryption fully. This can only be done via TCG Opal compatible solutions, such as those from Symantec, McAfee or WinMagic. For users who do not already have a TCG Opal environment, the drive's encryption is transparent, automatic and not configurable. It cannot be turned on or off, and these users can instead utilise the ATA Password in the BIOS. The ATA Password is a standard password security feature that functions on most hard drives and SSDs on the market. In summary, the TCG Opal software is needed to unlock the full set of features on the UV500 SSD. Without TCG Opal, the user can use the ATA Password to protect the content on the drive.

### Securing Privacy While Using Devices at School

Accessing the internet and browsing online has become easier than ever with public Wi-Fi and 24/7 wireless connections. Whether you are in high school or attending university, securing your data and protecting your privacy anytime you are online is essential. With the rise of data breaches, hacks, and identity theft, knowing how to properly use the internet on a public connection is imperative to prevent your information from getting into the wrong hands. In some schools and universities, you may lose your rights to your browsing data once you have accessed the school's provided network. Maintaining security and privacy while using your devices at school does not require specialized tools or software in most cases and can be done with minor tweaks and updates before accessing preferred websites. Review Your School's Privacy and Data Policies Before you access an internet connection at your school, review the privacy and data policies that are often provided in your official handbook (or on the official website of your school or university). Not all schools collect and treat data in the same manner, which is why it is necessary to familiarize yourself with any potential limits or restrictions you may face while browsing. Use a Secured Web Browser Use a secured web browser whenever you are searching for information while at school. Secured browsers limit tracking and provide you with more freedom and flexibility than traditional browsers such as Google Chrome or Mozilla Firefox. Some of the most popular secured web browsers include Opera, Brave, Chromium, or even a Tor browser, depending on your needs and the type of surfing you intend to do. Alternative browsers offer anonymity features that are not always available in mainstream solutions.
For more information on top secure browsers, read here. Avoid Logging in When Using an Unsecured Connection Avoid logging into important email or social media accounts while you are using an unsecured connection. If you need to check your bank account, email, or social media, use your phone's 3/4G network without allowing automatic access to your school's unsecured network. Do not use your main email address and password while browsing on your school's connection. Most unsecured networks are extremely vulnerable to hackers and potential thieves who have the ability to access devices that are currently connected. Use Apps With End-to-End Encryption While many apps on the market today utilize encryption to help protect and secure data, most of them do not offer end-to-end encryption. Popular messaging applications such as Facebook messenger do not provide end-to-end encryption, leaving you vulnerable to sharing personal information with complete strangers. Whenever you are using a public internet connection, stick to apps such as WhatsApp to ensure your data is not shared with others and to prevent your conversations from being read or viewed by the provider or others who are also connected to the same network as you. Research each individual application you are thinking of using on a public connection before downloading it or allowing it access to your device while you are connected to your school's Wi-Fi network. Do Not Use the "Remember Me" Function While Using a Public Connection Whenever you do choose to log in to a website or when you enter a username on an unsecured network, avoid checking "Remember Me" when prompted. The "remember me" feature is optimal for home and personal computers only. Even if you log out after using your connection (especially on a public computer or device), your username may reappear to the next user, increasing your chances of becoming a target. Be Aware of What You Post Publicly and to Social Media Accounts While social media has become commonplace and it is more popular than ever, it is still important to remain self-aware when using it, especially if you represent a university or a special program you have been accepted into at school. Using social media improperly or offensively has lead to hundreds of suspensions, expulsions, and real-life consequences, making it difficult for students to rebuild their personal and professional reputations, even after graduation. Review your school's privacy policy when it comes to data collection and storage before updating your social media accounts and official pages. Even if you are using your own personal device but you are connected to the internet using your school's network, you may be liable for anything that is uploaded and shared. Posting inflammatory, offensive, or illegal information using your social media accounts or even your own website and blog can result in real-life consequences that impact your education and future. Use Your Own Hotspot if Possible Whenever possible, avoid connecting to public internet connections. Use your own Wi-Fi hotspot whenever you can to avoid putting your data and information at risk while sharing your browsing history publicly. Personal hotspots are available with most mobile phone carriers and ISPs (Internet Service Providers), although they may cost a nominal fee each month. Delete Your Browsing History Always delete your browsing history once you are done using the internet connection provided by your school or university. 
Avoid storing cookies and cached data on your device so that you begin browsing with a clean slate the next time you reconnect. Log Out When You Are Done Browsing Never forget to log out of any public device that is using your school's connection, even if it is for a website that is unimportant to you or not for personal use. Leaving yourself logged in can lead to others using your accounts to search for more information about you or to post information without your knowledge or consent. Taking the right precautions can prevent unnecessary headaches and trouble if your login information finds its way to a hacker or an individual with malicious intent. Getting familiar with online security and taking measures to protect your privacy is extremely important anytime you are browsing the internet at school or university. With the right tweaks, and by remaining aware of your choices, you can ensure your information remains safe and protected from unwanted consequences.

### A Strategic View For Small & Medium Sized Companies PART 2

With the advent of self-encrypting drives (SEDs), manufacturers have not only removed the performance penalty but also made the devices extremely secure, not least because the encryption keys for the devices are stored on the physical drive itself. The encryption function is also implemented in silicon rather than software, making it more secure and, just as importantly, keeping the encryption overhead down. Due to the way self-encrypting drives are designed, the key never leaves the device, making key extraction virtually impossible. "Implementing self-encrypting drives that provide hardware-based AES 256-bit encryption has fast become an easy to manage and cost-effective solution to stop data breaches through the theft or loss of computers, laptops and tablets containing confidential company, customer and client information." - Pasi Siukonen, Team Leader, Technical Resources Group at Kingston Technology. Now that we have discussed the why, it is time to turn attention to a high-level "how". As part of those business requirements, a forward-looking company will design and develop processes to deliver on those needs and to manage the encryption-related support calls that will come in. One of the most important things to consider is the management of the chosen encryption system. There are several practicalities that must be considered when looking at the "how" of setting up encryption for mobile devices within a business. The business must be able to manage the encryption and the devices in question centrally. Administrators need to be able to manage not only the encryption but also access to the management platform. Good security and auditing of the critical cryptographic platform are key. It should also go without saying that the cryptographic management platform should be redundant; avoid putting all the eggs in one basket (server). Conversely, the encryption and security must be as seamless as possible for end-users. End-user downtime causes lost productivity and therefore directly impacts costs, as well as creating negative perceptions of the IT department. At the same time, any data on the drive must remain accessible. Employees frequently leave, and their data must be available even after its owner has gone, while remaining secure against loss or theft at the same time.
For this reason, solutions such as BitLocker and VeraCrypt, while robust and secure, can be more complex to manage and usually lack the key feature - an agnostic management framework that covers the full range of requirements for deployment at scale - or are restricted to a single operating system. A solid framework is both agnostic and easy to consume. Amongst the most well-recognised frameworks is TCG (Trusted Computing Group) Opal.

### Why Is Blockchain Good For Your Business?

Blockchain is one of the hottest tech trends and an increasing number of companies are starting to realise it has potential for them. Until recently, many organisations associated blockchain with the cryptocurrency sector and didn't really think it had near-term potential impact for their business. A number of enterprises are now realising that blockchain, in fact, offers a far wider range of applications and business benefits than simply crypto markets. The most obvious benefits are for the financial industry, simplifying international payments, remittances and complex financial processes. We now know blockchain can solve many challenges and create new opportunities for industries such as logistics, IoT, healthcare, media and entertainment, oil and gas, agriculture, law, defence, luxury goods and many other sectors. In fact, Gartner predicts that blockchain's business added value will grow to over $360 billion by 2026, then surge to more than $3.1 trillion by 2030. This means businesses are starting to pay more attention to this emerging technology and early adopters are already finding ways to reap its benefits. As a result, CIOs are under increasing pressure to understand and guide their companies in implementing blockchain to drive efficiency, productivity and market competitiveness. So, what are the main benefits of blockchain that businesses should harness? The immediate advantage blockchain provides is increased transparency, privacy and security. Blockchain stores data in cryptographically secured, immutable records, which is a massive asset for security and privacy. Because there is no central point of vulnerability, companies can better protect their data and lower the risk of data leaks and hacks as well. Considering how many world-renowned corporations have reported data loss incidents and malicious attacks lately, blockchain's security advantages are more important than ever in an ever-changing threat landscape. From a privacy perspective, by using blockchain companies can ensure that no other commercial or governmental entity can ever access or manipulate their data without their consent. At the same time, blockchain enables companies to easily trace transactions and operations. This is particularly useful for organisations with large logistics and supply chains. With blockchain, companies can track the movement of goods, their origin, quantity, quality and so on, simplifying processes such as ownership, transfer, production, authenticity and payment. If an irregularity is detected somewhere along the supply chain, blockchain systems can easily pinpoint the origin and businesses can carry out investigations and take the necessary actions. Commercial Transactions Many commercial relationships have been damaged by human error and lack of transparency. By using blockchain for payments, businesses can remove guesswork and human error, and avoid delayed transactions and financial losses.
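The indelible audit trail described next comes from the way a blockchain chains each entry to the hash of the one before it. The following is a deliberately tiny Python sketch of that principle only - not a real blockchain, and not any particular platform - just enough to show why a rewritten record is immediately detectable.

```python
# Toy hash-chained ledger: each block commits to the previous block's hash,
# so altering any earlier record breaks verification. Illustration only.
import hashlib, json

def add_block(chain, record):
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    body = {"record": record, "prev_hash": prev_hash}
    body["hash"] = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
    chain.append(body)

def verify(chain):
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        body = {"record": block["record"], "prev_hash": block["prev_hash"]}
        recomputed = hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()
        if block["prev_hash"] != expected_prev or block["hash"] != recomputed:
            return False
    return True

ledger = []
add_block(ledger, {"payment": 1200, "from": "buyer", "to": "supplier"})
add_block(ledger, {"payment": 300, "from": "buyer", "to": "carrier"})
print(verify(ledger))                 # True
ledger[0]["record"]["payment"] = 1    # attempt to rewrite history
print(verify(ledger))                 # False - the tampered block no longer matches its hash
```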
Furthermore, because each transaction is recorded sequentially and indefinitely, companies can easily provide an indelible audit trail for each transaction, operation or asset. In this way, the audit process becomes much more efficient and faster. Additionally, companies can access data regarding any potential issues in real time and they can address any potential problems as soon as they arise. Paperwork, Data and Storage By using blockchain businesses can also increase agility and efficiency. Nowadays, companies generate huge amounts of data and manage countless transactions and operations externally and internally. With some blockchains, businesses can manage billions of transactions with sub second finality. Of course, not all organisations need this much capacity but it’s encouraging to see how quickly blockchain is scaling to meet the needs of the enterprise. Many companies still use paper heavy processes which are time consuming, prone to human error and lack transparency. Blockchain streamlines and automates all these processes, enabling organisations to become more efficient and agile. Because transactions and data queries are validated and completed faster, companies can increase competitiveness and significantly improve the customer experience. At the same time, the cost of storing all this data and processing endless transactions is significantly lower than traditional storage options. Another area where blockchain helps businesses reduce cost is third party transactions which can be all but eliminated, streamlining many organisational processes and increasing transparency. A recent report by Bain and Company estimates that the savings from the implementation of blockchain technology would amount to somewhere between $15 and $35 billion annually. Smart Contracts Fully automated legal agreements known as smart contracts is a major area where blockchain will change the way organisations operate for the better. Companies can now automate any legal agreements and documentation that need signatures and approval from multiple parties without the involvement of humans that are usually the cause of delays and errors. Smart contracts can have a significant positive effect on cash flow as well. Rather than waiting for several layers of authorisation when the invoice is issued, smart contracts can make immediate automatic payments as soon as a transaction is completed. And these are only some of the core benefits of blockchain for businesses. Considering that the technology is still being perfected and the sector is growing by the day, many more new opportunities for businesses are being uncovered to increase productivity, performance and profitability. One thing is clear - blockchain isn’t a pipedream. Applications and services are available already, many of which are fast, flexible and secure and are starting to improve many business functions. The companies taking advantage of this technology can gain a significant competitive edge and organisations that choose to ignore it risk being left behind.[/vc_column_text][/vc_column][/vc_row] ### When Cloud Meets Machine Learning: Solving The Changing Recruitment Problem [vc_row][vc_column][vc_column_text]Navigating cloud-based recruitment platforms to solve talent problems was, once upon a time, an efficient solution for businesses. 
But a shift in mindset, behaviour and demands from the UK's top talent is fuelling the growth of the freelance technology consulting sector, meaning traditional cloud technologies are now in danger of losing their shine. Perched on the precipice of a new cloud era, Morten Petersen, CEO and Co-founder of business-to-consultant matchmaking platform Worksome, explores the changing tech recruitment landscape and looks at what might happen when cloud meets machine learning. The global consulting market has been on a path of continuous growth these last few years: with a total value of around $250 billion, it is earmarked as one of the largest and most mature markets within the professional services industry. Zoom in closer to understand its composition, and you'll find that technology and IT consulting services account for roughly 20 per cent, translating into an estimated value of $48 billion. A 2.5 per cent year-on-year growth since 2011 presents good news for the technology consulting sector, but it's not without challenge: in 2017, 65 per cent of IT leaders worldwide reported that a skills shortage was holding their business strategies back. And in today's highly competitive and changing climate, where companies are trying to keep pace with the slew of new technologies emerging on the scene, a skills drought could prove harmful. As the pressure mounts, companies have two choices to acquire these new and critical skills: 1) they can train their existing staff, or 2) they can hire new talent who already possess the necessary competencies. But there is still a challenge: talent scarcity and decreasing tenures within the industry make it even harder to acquire new knowledge. Savvy companies recognise that if they want to compete, they first have to win the war for the brightest minds. And an ever-increasing number of those people are freelance consultants. Since 2009, the freelance economy in the UK has increased by 25 per cent and generates about £109 billion a year, according to IPSE, so it's perhaps unsurprising that 51 per cent of IT leaders have now turned to employing freelance IT specialists to fill the skills gap. This makes freelance IT specialists the most popular solution to the skills shortage problem. Model challenges The problem for cloud-based recruitment-associated platforms like LinkedIn is that freelance consultants challenge their business model: they stray from the traditional 9-5 job in order to gain greater freedom, flexibility, and the ability to home in on their skills. LinkedIn offers plenty of opportunities for people who want to work in traditional full-time positions, but it hasn't managed to capture the growing freelance market yet. This has opened up a gap for emerging players. While the IT market is mature, it's still growing - and rapidly, too. There seems to be no way to satisfy companies' appetites for IT specialists, and there's rising demand both for people with legacy IT skills, such as Cobol, and for people with bleeding-edge skills in blockchain and ML. As of now, professionals with these competencies are being flooded - by LinkedIn, phone or email - by eager recruiters trying to meet their quarterly targets. Some recruiters may know very little about the competencies and skills they are recruiting for, but they still charge up to 20 per cent of the first year's pay.
However, the fact that the market is this fragmented, calls for a more sophisticated way for supply and demand to efficiently match the right talent with the right job. Essentially, recruitment is an information problem. Factors such as years of experience, specific technical skills, coding languages, and industry experiences are all part of an equation, which is inherently better handled by cloud-based platforms, than by human beings. It’s important to also factor in the fact that people with in-demand skills don’t want to be flooded with calls from recruiters. They want to take back control, particularly on how they are employed. The best way to give them this, is to provide them with a complete overview of the jobs that match their skills and preferences precisely. Cloud-based platforms do exactly this, by essentially displaying all available options, leaving the candidates with the ability to choose which option suits them the best. But there’s still more work to be done by cloud-based platforms, and many opportunities are on the table if they too embrace technology to create a whole new wave of cloud-powered possibilities. A machine learning curve In essence, cloud-based artificial intelligence lies at the heart of newly emerging business-to-consultant matchmaking platforms. Advanced algorithms take care of the matching of supply and demand, by considering a number of different factors to ensure a perfect match. These algorithms are continuously improved and are becoming more and more sophisticated. The next step for the frontrunners of tech recruitment is machine learning in the cloud. Those who are innovating smartly in this space should already have the setup to develop cloud-based machine learning, it’s just a question of getting the sufficient amount of data to feed the algorithms. The tipping point is coming though, and when it does it will bring a host of new and exciting opportunities with it. What machine learning will have better success at is finding the right candidates for companies’ open positions. The algorithms will be able to find correlations and patterns that humans overlook, which will result in higher-quality candidates. While machine learning can help reduce the recruitment cycle time, cost, and number of bad hires, people still need to handle the “candidate experience.” Technology can’t yet assess cultural fit or imitate chemistry. Thus, humans will still play a vital role in finding the perfect fit for the job - cloud technology just makes the road to getting there so much easier.[/vc_column_text][/vc_column][/vc_row] ### A Strategic View For Small & Medium Sized Companies PART 1 [vc_row][vc_column][vc_column_text] Data security is an extremely hot topic at present. Securing laptops and mobile devices is not only essential but also presents management complexities due to the very mobile nature of the devices that make them so very appealing in the first place. Making the situation that much worse is that loss of sensitive data can destroy a company’s reputation with its customers overnight, as well as the public at large. Who would want to trust a company with data again when they have proven they cannot look after the data they already possess? Whilst it is best practice to educate staff in good data hygiene and data loss prevention, it is the company that must take the steps necessary to secure the data, being ultimately responsible in the eyes of the law. 
No matter how much businesses plan for it, data loss - or, more accurately, loss of control of data and devices - will still occur through accidental loss, theft and other nefarious means. A typical scenario from before data encryption entered the consciousness of most administrators: a salesperson within the organisation left their laptop in their car in clear view while rushing to a client meeting. Upon returning, they were greeted with a smashed window and a missing laptop. While for most companies that may have been the end of the story, the laptop in question had remote viewing and management capabilities installed. It was interesting and horrifying at the same time to see the eventual owner of the laptop rifle through the documents, price lists, customer details and other commercially sensitive data. In the hands of a competitor, it could have had a devastating effect. The lesson learnt for the company's system administrators was that full drive encryption was not an option but a necessity - and that administrators often need to protect the company from its own users. Loss and theft also occur in other ways. "Across Transport for London's (TfL) network alone last year an average of three laptops went missing every day*", says Tony Hollingsbee, SSD Business Manager for Kingston Technology. "This obviously exposes organisations to significant risks such as reputational damage, loss of trust or fines, if there were personal data on the laptop." Furthermore, not all staff can be trusted, and devices can be stolen for personal profit or gain. Encrypting the drives also means that any failed drive that has to be returned to the manufacturer remains encrypted. For today's administrators, drive encryption and its associated security have become much easier to manage and implement on a technical level, but understanding how to put the solution into practice correctly is essential. It is not just a technical matter but a business process that needs to be adequately evaluated and introduced properly. How a business plans for data loss and responds to it is what makes it a data security winner or loser. Adopting a strong, well-managed encryption solution can make the difference between a simple device-loss report to the appropriate authorities and reputational destruction. Moreover, the stakes have just been significantly increased with the introduction of GDPR (General Data Protection Regulation). As part of the GDPR legislation, there is a requirement to provide a reasonable level of protection for data concerning third parties against loss or theft. Failure to do so can come with large financial penalties, of up to four per cent of global turnover. Mobile security is but one aspect of GDPR. Here we look at the requirements, background and practicalities of implementing secure data encryption. There are many types of data encryption but, without doubt, Full Disk Encryption is both more secure and easier to manage. Historically, the downside of full drive encryption was the not insignificant performance degradation it caused, magnified many times over on low-end hardware.
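To get a feel for the overhead being described, one can time software AES-256 on a sample buffer. The snippet below is a rough, illustrative Python sketch using the `cryptography` package; throughput varies enormously with the CPU and whether AES-NI hardware acceleration is available, which is exactly the variability that pushed encryption into the drive itself.

```python
# Rough benchmark of software AES-256 throughput (illustrative only).
import os, time
from cryptography.hazmat.primitives.ciphers import Cipher, algorithms, modes

key, nonce = os.urandom(32), os.urandom(16)   # AES-256 key and CTR nonce
data = os.urandom(64 * 1024 * 1024)           # 64 MiB sample buffer

encryptor = Cipher(algorithms.AES(key), modes.CTR(nonce)).encryptor()
start = time.perf_counter()
encryptor.update(data)
encryptor.finalize()
elapsed = time.perf_counter() - start

print(f"~{len(data) / (1024 ** 2) / elapsed:.0f} MiB/s with software AES-256")
```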
Whilst improvements in CPUs and architecture helped negate some of the impacts, there was still a performance hit.

### Javid Khan - Pulsant - Cloud Adoption

Watch now: Javid Khan from Pulsant talks about cloud adoption. https://vimeo.com/360269832/34062d4315

### Saving Time and Money with Unified IT

ROI is a critical aspect of running a successful organisation, and while incremental changes can be made around many parts of the business to improve this, eliminating the historically siloed approach to IT operations and cybersecurity is one way to swiftly improve a business' finances, efficiency and security. Siloes tend to occur as a result of an organisation dividing broad goals into what could almost be described as micro-actions for different parts of the organisation. While it is good to have skilled teams focused on particular aspects of the business's ambitions, this approach often means that teams unwittingly restrict information from other departments and insulate themselves from the rest of the organisation. What is required for the forward-thinking organisation is an approach that is mindful of the varying aspects of IT infrastructures and one that allows different teams to work better together. Unified IT is a holistic approach to operations which enables the increasingly varied roles within IT and cybersecurity to get the most insight and effectiveness out of their data, hardware and systems. Rather than having teams work as solitary units as they've done for the last few decades, an increasing demand for convergence, largely due to technology trends like big data, cloud computing and automation, is driving organisations to find solutions which best utilise resources and time. Unified IT is all about breaking down the ideological and technological barriers that have caused so many different aspects of IT and Security to view each other as alien, so that they can better collaborate. In practice, this means pooling together all of your resources, and even your budgets. Currently there is a huge discrepancy between the budgets of different IT teams. For most organisations this has not necessarily been a conscious decision. As the focus of the Board has changed - 49 per cent of CIOs now identify cybersecurity as an operational priority, according to the 2018 Harvey Nash / KPMG CIO Survey - other departments within IT may see less of these decision makers' focus. Restricted budgets along with a siloed mentality can often lead to a reactive approach of dealing with technological issues 'as and when', which can in turn set infrastructure up to fail. This can leave organisations trapped in a confusing and potentially dated environment that will get more expensive to deal with the longer it is left to stagnate. Organisations will encounter further roadblocks as they rely on numerous tools from different vendors that are incompatible, preventing potential collaboration. Unified reporting also makes it easier for the CIO or CISO to make well-informed decisions on how to use IT and Security to proactively improve the business in a way that is more cost efficient. But outside of infrastructure costs, one of the biggest areas that will see benefits from Unified IT is cybersecurity. IT Operations and Security have long been siloed departments with very particular roles and responsibilities.
However, as both technology and cybercrime evolve, these two parts of the business are starting to collide. Ultimately, cyberattacks cause IT downtime which results in both a reputational and financial cost. Furthermore, the first person to spot an imminent attack could well be from IT, as they would be the first point of contact when a user reports a PC or other device acting strangely. The effects of a cyberattack can be defended against by both IT and Security. WannaCry is a perfect example of this. Large parts of the NHS were badly hit by this attack because many devices affected were still running on legacy systems like Windows XP. Additionally, others hadn’t properly patched devices using Windows 7. Legacy technology is an IT problem because IT is largely responsible for the digital transformation of the business; yet patching continues to be a security issue. However, as both areas affect both teams surely they should be working together in order to do both of their jobs more effectively. Unified IT or operationalised security is the next step in these departments’ evolution. New advanced tools such as AI can be brought in to enable this evolution by automating and improving processes where relevant, for example by enabling self-service IT for users. In a unified environment, many of the problems associated with siloed teams and tools can be mitigated against. Unified IT provides further benefits in that it can reduce incompatibilities caused by relying on a diverse set of tools from different vendors, which can create siloes within siloes. In addition, unified tools open the door to joint reporting which helps prevent duplication of actions between teams and ultimately aids in giving CIOs and CISOs better visibility into the technology powering and protecting the business, enabling them to make well-informed decisions and have full visibility into the organisations, which is particularly key in a post-GDPR world. The landscape of enterprise IT is evolving rapidly. As the workforce becomes more mobile and organisations become more reliant on the cloud, the same old siloed IT strategies will become harder to upkeep. Collaboration between departments and even different companies is now a vital part of how business is done, and with IT often driving change in the organisation it is hypocritical for IT to remain siloed and stagnant. Unified IT is the way forward for IT Operations and Security teams and is the best way for businesses to properly take advantage of the resources – and talent – at their disposal.[/vc_column_text][/vc_column][/vc_row] ### Jumping To ‘False Causes’ At Point Of Failure [vc_row][vc_column][vc_column_text]One of the biggest time drains for IT support teams when dealing with business critical issues at the point of failure, is jumping to 'false causes’ at the first point of contact with a problem.  Often it is human instinct to jump in quickly and attempt to ‘fix’ a problem especially when you are working with experienced and highly skilled professionals within the team. Companies quite rightly, hire the best people they can find to support their IT, people with knowledge, experience and skill, but interestingly this in itself can lead to its own problems. The likelihood is, 90% of the time IT professionals have witnessed a failure before, and will know exactly what steps to take to resolve it.  At the same time, systems are increasingly complex and often what we see is a symptom of the problem, rather than the actual cause.  
For example, it may be the case that the last five times we saw a drop in transaction rates on a critical payment system, that it was triggered by a security upgrade. This time the symptom looks exactly the same as before, so we roll back the upgrade and that doesn’t help. It later transpires that on this occasion our network provider had performed some routine changes in the routing protocol, which had affected the data verification. Clouded judgement The time wasted on dealing with the false cause is a frequent situation, but this kind of time drain can only be eliminated in the future if we start to place an equal reliance on data as we do knowledge and past experience, because experience without adequate information can cloud our judgement. The assumptions we make are usually exacerbated by an over-reliance on the knowledge and experience we have, but at the same time customer or user pressure to get things fixed instantly is rarely helpful, as it encourages speed at the expense of effectiveness A good analogy is, if your loved one was in emergency surgery, you wouldn’t shout at the surgeon to hurry up with the operation, nor would you make suggestions about where he or she should make an incision. The problem with IT support teams occurs when we only hire people for their technical expertise, not necessarily someone who is confident dealing with stressed-out colleagues or customers while under pressure. Thankfully many firms are starting to realise that being a ‘techy geek isn’t the only, or even the most important qualification to look for when recruiting.  It is far more useful to have an IT team that possess a variety of mixed skills and talents.  Many firms deliberately staff the first line of contact with nontechnical (and sympathetic) personnel, so that the customer has a chance to air their frustrations, before the serious business of diagnosis and correction begins. Count to ten and take three deep breaths Managers also need educating on dealing with business critical scenarios too. Today it’s becoming unacceptable to call yourself a leader and then to simply pass the pressure onto your team when something goes drastically wrong. Managers are under pressure from lines of reporting above them, and they must realise that it’s part of the job to take that pressure and to present a calm and focused attitude to their teams.  Many would argue that the old-fashioned advice actually works: count to ten, take three deep breaths and then speak.  Technical or not, it’s also important for managers to understand the most effective flows of problem solving, so they can quickly recognise when something’s not working and give assistance. Holding large-scale meetings – especially conference calls- in an attempt to solve issues can also complicate matters especially when people go off on tangents.  Different types of issues need different types of approaches.  Where things are confusing, the conversation must be confined to clarification. There is no room for solutions if you don’t understand what you are dealing with.  Where there’s a clear departure from normal function, then people need to be in investigative mode looking for evidence and using that to figure out the causes. In many cases, to get services back on track, there are frequently multiple options, and this conversation has to be about what those options are, and how to decide quickly what’s best.  
Try to keep an open mind too because things can always get worse so understanding and managing those risks involves another kind of thinking.  To help meetings progress usefully, you can draw boxes around these four different types of work: Clarification, Investigation, Solutions and Risk, as this helps to retain focus and avoid confusion and going off track. Mixed messages Gathering the right information at the first point of contact with a problem is vital because incomplete or inaccurate information can lead to an incorrect diagnosis resulting in the wrong actions being taken and more customer frustration. Too often when reporting issues, we tend to mix up what’s really there with what we think is going on and then present our whole story as if it were fact. A simple example from the helpdesk: IT: “Good morning, how can I help?” User: “The server’s down again!” IT: “I’m so sorry. What exactly do you see on your screen right now?” User: “It’s my login screen.” IT: “And what is that doing that it shouldn’t? User: “It won’t accept my password.” Although the IT support team are often technically qualified people, it’s short sighted to assume that all other users are clueless about tech.  But as a user, what almost all of us do is embellish the real data with other stuff we think is interesting or helpful, without necessarily sorting out what’s relevant before we give the story to the helpdesk.  Luckily, IT professionals are often able to use tools to see exactly what is affecting the service and get accurate data that way. If not, then they have to prompt their users by getting a detailed description of what they saw, and ideally the exact time it started, as well as the last time they were able to use that same service successfully. To avoid jumping to false causes you have to focus on using all of the available data. The easiest way to avoid the trap of jumping to conclusions is to ask how the assumed cause fits the data.  So for instance, if we have an outage in Data Centre 1, the Incident Manager immediately points to the change that was implemented overnight. If we know that Data Centre 2 is still working, we can ask: “How come Data Centre 2 isn’t affected? The same change was run on all Data Centres.”  If any suggestion is valid, it must be able to show how it fits the facts on the ground. If it doesn’t, it’s simply not correct. Focus on root cause Probably the most important aspect of solving critical issues in the right way is to ensure that the IT support environment supports this approach of detecting and solving problems long term.  There are some really big challenges here: engineers on the whole, rather like solving problems. Having said that, the time pressure often means that once the symptoms have been made to go away, there’s not much incentive to check they really did correct the root cause. That’s why ITIL has identified the difference between two essential processes: Incident Management to effect a speedy restoration of service and Problem Management to find and correct the cause. In summary, the most important change an organisation can make is to focus on the quality of work: the completeness and accuracy of details recorded by support staff in all roles. How well have they captured the data? How completely have they explained how their assumed cause fits the data?  
Some organisations make regular quality checks of case records and provide feedback; but most don’t, and without that all-important feedback, things are unlikely to improve in a positive direction.[/vc_column_text][/vc_column][/vc_row] ### Why Blockchain Should Orchestrate From Behind The Scenes, Not Take Centre Stage [vc_row][vc_column][vc_column_text]Anyone with any sort of experience in the technology industry will be familiar with its obsession with buzzwords. From digital transformation to big data and the Internet of Things, it is almost impossible to escape or ignore the latest buzzword. As 2018 draws to a close, we’ve certainly had our fair share of buzzwords, but one that has intensified more than any other over the last 12 months is blockchain. Although the technology has been around for some time, it well and truly entered the public’s consciousness this year. So much so that almost every technology company is trying to muscle in on the conversation one way or another. However, despite the hype, practical applications of the technology have been few and far between in visibility. A perception problem When you look at analyst reports, the blockchain industry appears to be going from strength to strength. According to the IDC, US businesses invested around $640 million in blockchain in 2018, followed by Western Europe at $340 million, and these numbers are set to increase over the coming years. But, despite the investment, some key issues are holding back the development of actual products and services. One of the main problems is that many people simply don’t understand the difference between enterprise blockchain (the underlying technology) and cryptocurrency (the first notable application of blockchain). The second issue is that the negative connotations of cryptocurrencies are preventing businesses from seizing the opportunity that enterprise blockchain represents. These concerns have led to uncertainty around wider enterprise blockchain developments, with many businesses not prepared to take what they consider to be a risk on the technology. What’s more, the number of practical blockchain-enabled applications entering the market are being outnumbered by companies taking advantage of the hype and using it as their product’s selling-point. Many are simply using the term ‘blockchain’ as a marketing lever, without actually implementing the technology in a way that solves real business challenges. This all adds up to a perception problem that is making it difficult for companies with genuine products to achieve cut-through. So, what can be done to ensure that enterprise blockchain lives up to the hype and that it realises its promise as a truly transformative technology? Fulfilling its potential Rather than blockchain taking centre stage, the technology should be left to conduct its considerable work from behind the scenes. When used correctly, enterprise blockchain is a business enabler that solves the problems many modern digital enterprises face. Especially when it comes to trust, efficiency and security. For example, enterprises today are spending huge amounts of time and money on expensive software solutions but are still failing to protect themselves from breaches and hacks. Many have also turned to zero-trust solutions in an attempt to access and exchange data more securely, but these solutions are using historical data management techniques and inadvertently making collaboration more difficult and inhibiting business agility. 
Enabling a secure and seamless flow of data – both internally and externally – that businesses can guarantee (via architecture and auditability) hasn't been tampered with is one of the thorniest challenges in today's digital economy. So, how can enterprises securely collaborate while maintaining absolute trust in their data? The simple answer is: by using a data management platform based on private permissioned blockchain technology. By providing an immutable system of records secured with cryptographic keys and contextual access controls, blockchain can allow organisations to retain control of their data no matter who has accessed, read, written or shared it. Through blockchain, enterprises can securely share critical data with authorised third parties outside of their organisation's normal perimeter of control. In this context, blockchain isn't a marketing gimmick. It is simply the best data management solution to a very real business problem. This is how organisations must be prepared to think. For enterprise blockchain to provide real value to businesses and consumers in the future, it can't simply be thrown around as a marketing phrase, or as a meaningless buzzword attached to an ideology. Its true utility is as an innovative, technological solution to a range of distinct business problems. The temptation is to get unrealistically excited by blockchain's potential to transform every way we work and make it the headline act of the show. However, given the current state of confusion, we need to reset the conversation and make sure prospective users don't lose sight of its real benefits and potential for existing operations, as well as its role in digital transformation activities. Once these benefits are communicated and understood, we might just begin to see the difference between the hype surrounding this transformative technology and its tangible value.

### Why More Organisations Will Move Toward An Offline Tape-Based Storage Strategy In 2019

New Year's Eve has closed yet another year of headline-grabbing malware and cyberattacks that robbed firms of their data and exposed them to public fury. You might think only SMBs fall victim to hacks, but tech giants like Google and Amazon weren't spared either. Facing ever-more sophisticated attacks, organisations might consider going back to basics. In 2019, tape storage probably won't supplant cloud, but as offline storage it will offer an additional and much-needed line of defence against malware. Tape and the cloud will not only co-exist, they will co-mingle. This will enable organisations and cloud providers alike to discover the unique benefits of LTO tape. In all, vendors will see the tape market grow, based on large-scale and hyperscale use as long-term storage. A new use case for tape Today's storage strategies aren't about picking one medium over another but rather about knowing how to apply different media. Tape is the best option when it comes to offline storage. Except when read or write operations are performed, files stored on LTO tape aren't connected to the network – they are "air gapped". Data stored on tape is not susceptible to malware or ransomware. Tape fulfils the requirement for an offline storage solution. Backups are copied to tape, and the tapes are kept onsite in a tape library as a "secure, offline copy" of data protected against malware.
As they remain in the library, tape handling is no longer a problem, while minimal handling of the tapes reduces the chances of media damage. Today, tape is no longer about offsite storage; it is about offline storage. Protecting data against ransomware and other malware attacks requires offline data protection. And tape is the most effective—and lowest cost—method of providing it. Tape and the cloud will intersect and co-exist While cloud usage increases as a resource for storing data, cloud service providers face the same challenges as enterprises do: cost of storage, malware attacks, data loss, etc.  Even worse, cloud service providers face these challenges at massive scale!  That multiplies not just the severity of the challenge but the benefits from any technology that can help overcome these challenges. Because tape is by far the most cost-effective mass storage technology, and because it can also provide the benefits of offline storage, cloud providers are embracing tape at a rate beyond what anyone expected.  Cloud providers are clear on the benefits of tape technology and so tape is being designed into the architectures that serve the world’s cloud providers. Unique benefits of LTO tape Tape, as a storage medium, stands out in many ways. Disk and flash-only data storage systems tend to be extremely expensive. This is not only because of the higher acquisition costs for these technologies, but also because they rely on electricity/power 24 hours per day. Whereas tape can store huge amounts of data and be stored offline for 30 years or more. This provides you with a financial gain that can reach millions of dollars, as well as green credentials. Adding to this, tape’s capacity is simply massive. This allows for the storage of huge amounts of data for everything from images, to client data and transactions, to Internet histories, social conversations, and more. The same capacity on disk or flash is much more expensive. In 2019, the tape demand will rise The changing cybersecurity landscape will make tape an increasingly attractive option in fulfilling a deep market need as an inexpensive storage media. Any organisation with significant data sets will be tempted to turn to the cloud. As businesses move their disaster recovery strategies to the cloud, the recognition for tape’s specific advantages in archive and long-term data preservation as well as an offline and cost-effective medium will become much more entrenched. The data migration to the cloud trend will continue, but business with the incentive of lowering the cost per TB of storing large pools of data will increasingly look to tape media as the solution to long-term storage.[/vc_column_text][/vc_column][/vc_row] ### A Decade In The Cloud [vc_row][vc_column][vc_column_text]Cloud has been around for some time now, with the year of the cloud increasingly becoming what seems more like an entire decade. It’s infiltrating more and more sectors as the technology and the industry itself continue to mature.  But barriers do still exist to its adoption and it has a long way to go as new and innovative technologies continue to emerge. Paired with the lowering costs of compute and storage, an increasing number of smaller vendors are on the verge of entering the market. As such, the next ten years for the cloud industry, and the industry giants that currently dominate the market, are uncertain. 
Barriers to cloud adoption still exist Unfortunately, obstacles still stand in the way of cloud adoption, particularly on the human side in terms of legacy resistance to change and risk aversion due to new regulations such as GDPR. Customers also now have the positive challenge of an increasingly wide selection of cloud services to choose from, which presents a new challenge of how to discern the differences between them. Unfortunately, the average-sized business is not experienced or educated enough in the new world of cloud to know what questions to ask or, indeed, how to interpret the answers they receive. As such, it's difficult to know how the average smaller business is expected to best assimilate the cloud services that will add value to their business. Do they just select the traditional big-name technology brands they have heard of (which in cloud may no longer be the best solution) or do they innovate and go for a newer name who can give them better business differentiation to their users and customers? With many of today's enterprises facing this decision, legacy vendors will be forced to cloudify through re-engineering, re-platforming or acquisition in the coming years in order to remain competitive, or risk becoming the next 'Blockbuster' to their Netflix-style cloud competitor. Staying relevant in an increasingly innovative world We still face a breadth of technology sectors that, for the majority, sell as traditional on-network systems. We have seen CRM shift over the last 15 years from 95% on-network systems to over 70% of new systems sold today being cloud-based CRM, mostly due to Salesforce. We have already seen this expand to ERP and Service Desk solutions, but we still face a bulk of technology areas such as telephony, antivirus and software suites that are in the majority sold as local deployments. Cloud is going to continue to disrupt sector after sector, with new entrants typically threatening the traditional brand names because these have a ball and chain around their ankles: the need to protect their existing revenue streams. They do not want to disrupt their own revenues using cloud, more often leaving it to third parties who have everything to gain and no existing business to protect. The future for cloud competitors The new cloud disruptors haven't put their heads above the parapets yet, but soon we are going to see the new Uber, Facebook and Netflix appear from nowhere, because we now live in a world where the computing power required is affordable to any innovator with an idea. The previous barriers of compute and storage cost are gone, and even the smallest of firms are able to develop and deliver quicker than ever and reach a global audience. New competitors to the major players may not have even been born yet. As such, for the time being, cloud powerhouses are in a three-horse 'AGM' race (Amazon, Google and Microsoft), with an increasing number of traditional software vendors re-platforming their apps onto one of these underlying cloud services. We are going to see an increasing number of big traditional corporate technology vendors challenged to survive, recently exemplified by CA's (Computer Associates) sale to Broadcom, after which nearly 50% of staff were quickly axed. Remaining relevant in a new cloud world is going to get harder and harder for legacy firms. The changing face of cloud – what will it look like in ten years?
With all of this in mind, I believe we will see far less private ‘cloud’ hosting firms in the years to come. With the commercial power of Amazon, Google and Microsoft on the race to Zero (i.e. who will be the first to give it away for free?), traditional co-location hosting will be marginalised if not crushed. We will hear less of the term ‘cloud’ as it will be ingrained into a plethora of other technologies as the engine; these will include Artificial Intelligence (AI), Big Data and Internet of Things (IoT). We will see a breadth of co-operating systems as API’s between cloud systems extend and cloud silos become marginalised and a massive growth of edge computing devices with local processing power communicating with and supporting centralised cloud computing engines.[/vc_column_text][/vc_column][/vc_row] ### Blockchain – Creating a Power Shift in The Real Estate Industry [vc_row][vc_column][vc_column_text]When you think ‘blockchain’ you don’t usually think of office space and property sales. Finance and investment maybe, but not real estate. Yet Commercial Real Estate (CRE) transaction volumes currently stand at $341 billion globally. If even a relatively small percentage of that is spent annually on informational data and due diligence, the business case for any technology that can reduce these costs becomes compelling. Despite all the hype, blockchain technology is actually at its best when used to provide transparency, for example in a supply chain. It is already being used to ensure the authenticity of medicines and luxury goods and is poised to drive similar improvements in the office leasing business. Greater transparency will drive better and faster decisions, and ultimately, frictionless transactions. The current landscape As things stand, it takes unnecessary time and money just to rent an office. A series of middlemen have to search and verify, and then negotiate with both parties on terms and penalties, not to mention informational data and due diligence, relying too much on the word of landlords. In this environment it can take more than twelve months and hundreds of pages to lease a single floor. Landlords are unable to rent floors predictably and quickly, leaving them overexposed to vacancy risks and financially stretched. Meanwhile, tenants spend months shopping and negotiating via brokers and lawyers, and then are constrained by long-term leases, imposing onerous long-term liabilities on their businesses. In many cases, tenants also have no way to easily compare the price they are offered with market equivalents, meaning many are left stuck with a poor value deal. Decentralised approach A proposed way of changing this situation is to decentralise the process and provide a large aggregate data source based on blockchain as the platform. The idea behind this is to eventually allow every office property in every major market globally to be indexed and searchable, available to lease with one click. This would begin with creating a distributed and immutable online ledger that simplifies the process by reducing paper trails and transaction times. The resulting ‘trustless environment’ then encourages cooperation between companies that have never worked together before. This approach could prove invaluable for brokerages, financial information companies, and tenants of all sizes globally. 
Not a one-size-fits-all solution While many in the property industry recognise that the technology has the potential to quickly improve data sharing, help with raising capital for real estate assets, and generally make the market far more liquid, blockchain is not a one-size-fits-all solution. As in other industries, companies considering its use should assess whether and where blockchain can be useful, as the technology will not necessarily address each procedural inefficiency. For example, in its recent report entitled "Blockchain in Commercial Real Estate", consultancy Deloitte identifies that "leasing and purchase and sale transaction processes" (among the core CRE processes) "are ripe for blockchain adoption". In the report it outlines five key instances in which blockchain should be considered for CRE processes, as follows:

- Need for a common database – critical for leasing and purchasing transactions such as the multiple listing service, which collates property-level information from brokers' and agents' private databases
- Multiple entities can edit a database – transactions involving several entities such as owners, tenants, lenders and investors who modify a variety of information
- Lack of trust among entities – often participants in leasing and purchase and sale transactions are new to each other and could have concerns over due diligence and data integrity. Here blockchain can help reduce the risk through digital identities and more transparent record-keeping systems
- Opportunity for disintermediation – less need for intermediaries due to increased security and transparency in title management and auto-confirmation
- Transaction dependence – many real estate transactions have conditional clauses and can be executed through smart contracts. For instance, the conclusion of a purchase-sale transaction could be dependent on loan approvals or title clearances

A question of trust What this checklist helps emphasise is that blockchain is at its best when sitting in the sweet spot between application and database. If you are the only writer, for example, using blockchain makes little sense, but if there are many parties involved it can help create trust by providing verification and authenticity, without the need for middlemen. In practice this could mean an end to inefficient searching for office space through the elimination of misinformation and price inflation. It could also allow for near real-time settlement of recorded transactions, which will remove friction and reduce risk during rental and buying services, limiting the ability to charge back and cancel transactions. All this will only be possible, however, if the industry can be convinced to trust the blockchain itself. With enough people on board there is a real opportunity to put the value of property data back in the hands of those who create it.

### Does advancing technology alienate the elderly?

The conveyor belt of technology releases and developments moves like lightning. As many of us excitedly embrace the IoT, smartphones, artificial intelligence and more, members of the elderly community who have not grown up with such advanced technology are at risk of being overlooked. Of course, the elderly are not prevented from purchasing smart devices, but the adjustment to such technology can be quite a feat for the older generations of society.
Ian Hosking from the University of Cambridge's engineering department has said, "There are some very tech-savvy older people around, but there is clearly a large cohort of people who feel excluded by technology. They find it a bit impenetrable". Age UK found that 40% of people aged 65 and over in the UK in 2010 had never used the internet. Some initiatives have already been put in place to help elderly people adjust to the widespread use of technology. To be left unconnected to Wi-Fi, without access to the enormous source of knowledge that huge numbers of people rely on (the internet), and without the devices that can aid communication with loved ones, is an isolating way to live. It is not that older people do not have the means to obtain the technology, although some do not, but rather that they were not brought up in a world where it existed or was necessary to the extent it is now. The youth of today, by contrast, learn to type on laptops in school from as early as four years old, and approximately 84% of children aged between 11 and 12 have a smartphone (according to a 2018 Childwise report). Age UK's 'itea and biscuits week' helps older people learn more about technology and how to use the internet through technology taster sessions. Such campaigns give elderly people opportunities to try out different forms of technology in a low-pressure environment with friendly volunteers to help them. Another example of an initiative connecting older people with technology is Age Concern Edinburgh's Moose in the Hoose Project. The project aims to show the benefits of computers and the internet to elderly people living in care homes and/or spending time in day centres in Edinburgh. The service consists of trained volunteers visiting care homes and day centres on a weekly basis to help the elderly learn to use technology. How technology can help the elderly There is a multitude of reasons for elderly people to use technology to help themselves. Not only do computer and smartphone applications like video messaging services help the immobile and lonely see and talk to their loved ones, but smart gadgets can also help older people improve their quality of life. IoT devices, whereby household items can communicate with one another via Wi-Fi, could be of particular use for older people, as they can ask their home-bot to search the web, make phone calls, add dates to a calendar, and much more. This helps people with limited or no mobility, as they can complete tasks through a voice-activated device rather than having to write, operate a phone or move from their seat. Beyond convenience, advanced technology can also aid more vital aspects of life. Voice-activated bots can help with medical needs by phoning a doctor or requesting an ambulance, and apps can remind you to take your medication (a particularly useful tool for those who have Alzheimer's). Although elderly care robotics is quite a new field and has not been widely deployed yet, there is excellent potential for robots to offer medical assistance and companionship. An Israeli start-up called Intuition Robotics announced its companion for the elderly, ELLI.Q, which is due for release later this year. The robot functions through machine learning and is designed to provide elderly people with activities to entertain them and help combat inactivity and loneliness. The bot also monitors the user's environment in order to inform them of risks and is intended to focus closely on their wellbeing.
The robot can organise transport for them and remind them of plans (appointments, social events etc.). Intuition Robotics secured over $20 million in venture funding in 2017, with a $14 million investment from Toyota AI Ventures. Final thoughts With more technology being created for the elderly, and more classes being held to teach older people how to use it, their lives will be enhanced by increased communication possibilities and the medical assistance of bots. With thorough education in the operation and risks of such technology, elderly people will also be able to use it to its full potential, armed with a knowledge of cyber security and how to stay safe online. In Britain, such technology could help meet the enormous and unfulfilled demand for elderly care and lifestyle improvements. Although this may seem like a risk to the jobs of carers, robots should not replace them (and most likely will never have the compassion necessary to do so) but should complement care by allowing older people to be self-sufficient in the absence of their primary carers.

### How To Ensure Security With Cloud Hosting?

Our world is fast becoming completely digital. This is a good thing, but it also raises some serious questions about data security; every business (be it small or large) needs to be vigilant about data security in this digital era. Most companies store their data on-premise and look after its security themselves. However, a new trend is slowly picking up pace - cloud hosting. The technology has given impressive results when it comes to data security by offering a secure virtual environment. Cloud hosting offers numerous levels of control to enable protection and continuity, and it has become an integral part of creating a secure environment for businesses across the globe. Cloud hosting providers offer multiple security features like data privacy, data integrity, encryption, and more. Let's have a look at some methods by which cloud hosting ensures security. Privacy Protection The first step towards ensuring data security is data encryption and controlling who has access to which data. For this, you need to identify sensitive data - discover where such data resides, classify it into different data types, and create policies based on those types. You can also use tools that automate this process - Amazon Macie and Azure Information Protection are popular examples. There may also be scenarios where you need to grant a person special authorization to access specific data. For instance, testers or developers might need access to live data in order to test code. Your hosting provider will take care of all these things and ensure that your data is safe. Data Integrity The basic definition of data integrity is the protection of data from any unauthorized modification, fabrication, or deletion. Role-based authorization is the key here - you can decide the authorization level and ensure that data is not modified or lost by unauthorized users. The system administrator decides who has access to which data to ensure its integrity. Also, when the data is hosted on the cloud, its integrity is protected. Hosting providers deploy bots that monitor your data for any unusual activity. They prevent unauthorized access by using the latest antivirus and firewalls, intrusion detection, and multi-factor verification.
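The role-based authorization just described boils down to a simple lookup: which roles may read or write which classes of data. The sketch below is a minimal, hypothetical Python illustration of that idea (real cloud platforms express the same thing through IAM policies rather than application code, and the roles and data classes here are invented for the example).

```python
# Minimal role-based authorization check (hypothetical roles and data classes).
ROLE_PERMISSIONS = {
    "admin":     {"public": "rw", "internal": "rw", "sensitive": "rw"},
    "developer": {"public": "rw", "internal": "rw", "sensitive": "r"},
    "analyst":   {"public": "r",  "internal": "r",  "sensitive": ""},
}

def is_allowed(role: str, data_class: str, action: str) -> bool:
    """Return True if the role may perform action 'r' (read) or 'w' (write)."""
    return action in ROLE_PERMISSIONS.get(role, {}).get(data_class, "")

print(is_allowed("developer", "sensitive", "r"))   # True  - e.g. testers reading live data
print(is_allowed("developer", "sensitive", "w"))   # False - but not modifying it
print(is_allowed("analyst", "sensitive", "r"))     # False
```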
Prevention from DDoS Attacks The frequency of Distributed Denial of Service (DDoS) attacks is on the rise, making it a major concern for most companies and organisations across the globe. Fortunately, cloud hosting offers what it takes to stop heavy loads of traffic directed towards the cloud server. There are multiple ways by which the cloud helps. First, the cloud server has far greater bandwidth than a local or private server, meaning that it would take a lot of effort to get the better of the server with a DDoS attack. Second, the cloud is a diffuse resource by nature, which means malicious traffic is absorbed across the network before it is able to cause any harm. Third, hosting providers have a team of IT and cybersecurity experts who monitor and mitigate the traffic from DDoS attacks. BCDR BCDR stands for Business Continuity and Disaster Recovery. Let's try to understand it with a simple example. What would happen to your data (stored locally) if the hard disk is corrupted or your office premises is hit by an earthquake and everything is destroyed? There's very little chance that you'll be able to retrieve your data. With cloud hosting, your hosting provider ensures that regular and multiple data backups are created on remote servers to ensure business continuity, even during a disastrous situation. If one of the servers gets damaged, retrieving your data remains a straightforward task. Encryption of Data in Motion The main aim of data security is that no one should tamper with the data, and no one should understand it except the sender and the receiver. A popular method to achieve this for data in motion is data encryption. It ensures that even if there is a data breach, the attackers do not understand the data and its security is not compromised. Along with encryption, authentication also goes hand in hand. It is done with the help of public and private keys. Only the receiver has access to the private key that'll help in decrypting the data. Cloud hosting uses the latest encryption technologies and strategies to ensure that the hosted data is safe, secure, and travels safely through different mediums. Conclusion Cloud hosting is one of the most popular methods used by companies to ensure data security. With the use of the latest strategies, data encryption, and firewalls from the hosting provider, it has eased data migration in a safe and secure manner. Cloud hosting also focuses on the privacy and integrity of the data; no unauthorised person can access or modify it. Additionally, the technology solves the problem of data storage and associated costs for the companies. In short, it's a win-win for everyone. ### Why is Everyone Talking About VPNs? If you're a bit of a techie then VPNs (Virtual Private Networks) probably aren't anything new. Instead, there's a good chance that they're already an integral part of your setup and somewhat old hat. If, however, you think megabytes are more to do with shark attacks than computers, you might be wondering what all the fuss is about. So, we're going to examine what they are and why 1 in 6 Brits now use them. What is a VPN? A VPN is a tool – often a subscription service – that increases your privacy and security by helping you to access the internet over an encrypted connection. VPNs operate using a number of computer networks dotted around the globe called servers, which re-route your online activities via alternate IP addresses and locations and add a 'tunnel' of encryption to keep data secure.
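That 'tunnel' of encryption - and the public/private-key approach to data in motion described in the hosting piece above - can be sketched in a few lines. The example below uses the Python cryptography package; the payload and key handling are deliberately simplified for illustration and are not how any particular VPN product or hosting provider is implemented.

```python
# Hybrid encryption sketch: wrap a symmetric session key with the
# receiver's public key, then encrypt the traffic with that session key.
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Receiver generates a key pair; only the receiver holds the private key.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

# Sender: encrypt the payload with a fresh symmetric key...
session_key = Fernet.generate_key()
ciphertext = Fernet(session_key).encrypt(b"card=4929...;amount=42.00")

# ...then encrypt (wrap) that key with the receiver's public key.
oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)
wrapped_key = public_key.encrypt(session_key, oaep)

# Anyone intercepting the traffic sees only unreadable bytes.
print(ciphertext[:40])

# Receiver: unwrap the session key with the private key, then decrypt.
recovered_key = private_key.decrypt(wrapped_key, oaep)
print(Fernet(recovered_key).decrypt(ciphertext))
```

Anyone who intercepts the wrapped key or the ciphertext on a public network sees only gibberish; only the holder of the private key can recover the session key and read the data.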
Your IP address is a unique identifier (sort of like a license plate) that allows others to identify your device, its activity and where you are. When you’re using a VPN, the re-routing of your connection means that sites you visit will only see the IP address and location of the VPN server you’re using, rather than the real thing. Ultimately, this allows you to surf the web anonymously, in addition to the secure tunnel of end-to-end encryption that protects you from cyber attacks. Being able to choose from servers across the globe also means that you can give the appearance of connecting to the internet from anywhere in the world, as long as it’s on your VPN provider’s network. This is how many people choose to circumvent certain geo-restrictions on sites like Netflix and HBO. Cyber security VPNs aren’t just a must-have for lovers of American Netflix, they’re also a great way of protecting yourself against malware and cyber attacks. But how? Well, say you run out of data and go to the nearest café to connect to the Internet. You join the queue, buy yourself a latte and ask for the Wi-Fi password. Once connected, you open some emails, do a bit of online shopping and post a few tweets – hurrah! But then you get home and none of your passwords are working, money is missing from your account and you can’t access your social media. This is a worst case scenario, but the lack of encryption on public Wi-Fi networks does make them a hotbed for cyber crime, because they are far easier to hack than your conventional home network. Using a VPN on all your devices (yes, even your phone) can mitigate many of these dangers by adding an extra layer of encryption around your activity. The end-to-end encryption a VPN provides means that if a hacker were to try and access your activity, all they’d see are a load of nonsensical encryption keys - strings of seemingly random numbers and letters - rather than things like your personal emails or bank details. Staying in touch with friends and family Living in the age of social media has its ups and down, but one of the best parts of modern life is being able to stay in touch with friends and family no matter the distance. That is, unless you happen to live in a part of the world that blocks VoIP (Voice over Internet Protocol) services like Skype and Facetime. VoIP bans are in place all over the world, from parts of Africa to areas in the Middle East and East Asia, which can make keeping in touch with friends and family costly. However, with a VPN your loved ones can hide their real IP address and connect to the internet from a server and IP in a different location. This allows them to access the same free services that are available in that location, putting long-distance calls with gran back on the table. Targeted Ads Have you ever been stalked by targeted ads before? If not, count yourself lucky. Targeted ads work by collecting data about you, like your age, gender and income, and combining that information with details from your browser history and online activity to build a detailed picture of who you are, and which products you are likely to buy. Advertisers can then target your IP address with specific marketing campaigns. Targeted ad campaigns are a nuisance and do nothing but remind you of the times you considered buying a dodgy Christmas jumper or an overpriced sandwich toaster. 
Installing a VPN on your device can help you to get rid of those pesky ads and forget about past fashion choices once and for all, by ensuring your browsing activity can't be tracked. You can also use a VPN to grab geo-specific deals that aren't available in your location, in the same way as you might use one for VoIP calls – by connecting to a server in the country you want to shop in, so that it looks like that's where you're browsing from. Bypass bandwidth throttling Bandwidth throttling is something ISPs (Internet Service Providers) do to keep their networks clear. Regrettably, this usually involves monitoring your online activity and limiting your bandwidth when you do something data heavy, like streaming. It also means that you can spend a lot more time waiting for a TV show to load than actually watching it. If you're fed up with watching the loading screen more than your favourite TV show, installing a VPN on your device will mean you're less likely to have your bandwidth throttled and can make for smoother, faster browsing. How does this work? When you hide your activities with a VPN, you aren't just dodging targeted ads. You're also stopping ISPs from knowing whether you're using tonnes of bandwidth or the bare minimum, meaning they can't throttle your connection in accordance with your use. Using a VPN on your device is a great way to stay in touch with friends and family and keep up with the latest offerings from across the Atlantic. They're also pretty indispensable if you're a fan of using your local coffee shop's free Wi-Fi, or downloading songs and movies without your ISP curbing your broadband speed. Most importantly, with cyber attacks and large-scale data breaches on the rise, protecting yourself online has never been more important. Taking the time to look at which VPN services are available to you, and considering which one you might use to encrypt your personal data, could be as vital as using a strong password when it comes to staying safe against cyber crime. ### Incident Response Privacy | IBM Resilient https://vimeo.com/309471574/6b1d52b86e ### What Does 2019 Have in Store for the Internet of Things? How many more connected gadgets have appeared in your home this year? IoT is expanding at a phenomenal rate. Vehicles, wearable gadgets, RFID sensors and software are advancing past basic functionality and the network is growing to include even more advancements. 2019 will see such advancements becoming even more commonplace in the home, in businesses and on the road. So, what are some of the things that we can look forward to over the course of 2019? Specifically, what things will have a significant impact on IoT over the coming year? Will 5G save the day? There are billions of devices connected to the Internet for daily tasks alone, and the total number of connected devices is bigger still; when you talk about the IoT market and connectivity statistics, you are talking in the billions. In 2019, there will likely be a bigger push for 5G connectivity – adding another lane to a very busy web highway to handle the increase in devices. More data and more traffic make for a very congested connected internet. There is also edge computing to consider. Data from IoT devices will be stored closer to the source – taking business from data centres.
To combat this, data centres and edge computing will need to work in harmony. However, all this forecast growth could slow, reflecting the massive lead times for capacitors and batteries (40+ weeks in some cases). Electronics manufacturing industry players don't expect to see much relief until mid-2019, and some think that shortages will last well into 2020. Leading passives suppliers in the industry will need to meet supply challenges and help their customers to stay up-and-running. Will the Low-power WAN bubble burst? LPWAN is a wireless, wide area network technology that interconnects low-bandwidth, battery-powered devices with low bit rates over long ranges. Low-power wide-area networks (LPWANs) are a battleground right now, not just involving major telecoms companies and mobile operators, but a number of new players eager to bypass the traditional networks. A report from ABI Research claims that the latter is already outstripping the established network types such as narrowband IoT (NB-IoT) and LTE-M. The figures showed that LPWAN providers, such as the European provider Sigfox, accounted for a whopping 93pc of connections in 2017. But will we see the LPWAN bubble burst over the course of 2019? Is there a possibility that enterprises will begin to realize that, beyond their prototypes, LPWAN is going to prove to be expensive and won't actually offer the coverage required for global business requirements? 2G or not 2G? Disappearing 2G was a trend that was first apparent in Asia, with KDDI ending its TU-KA 2G product in March 2008. All of Japan's mobile operators abandoned 2G services by April 2012, making it the first country to fully jump to 3G and 4G-only networks. KT Corp of South Korea and Spark of New Zealand also shuttered their respective CDMA networks in 2012, while CAT Telecom of Thailand followed suit in 2013. For many operators, the switch away from 2G has come as part of a wider migration from CDMA-based networks to the W-CDMA and LTE technology path. Refactoring a globally ubiquitous radio spectrum for carrier-specific domestic NB-IoT seems like a badly thought-through strategy by operators who aren't offering an alternative to their domestic customers who operate globally. Will MQTT be recognised by the mainstream? Designed for constrained devices and low-bandwidth, high-latency or unreliable networks, MQTT (MQ Telemetry Transport) is a publish/subscribe, extremely simple and lightweight messaging protocol. It is likely that in 2019 we will see MQTT coming to the forefront of IoT and being recognized by the mainstream M2M brigade as being more useful for IoT than TCP-IP/HTTPS. If we consider smart cities, HTTP is not a practical way to manage a use case where a controller needs to turn off over 1,000 streetlights in one low-cost instruction. On the other hand, MQTT is one-to-many and is ideal (a minimal sketch of the pattern follows below). Will we see more examples of this over the course of 2019? The issue of IoT fragmentation will only continue to grow. A big part of the problem is that there are so many different devices/solutions/protocols/platforms. There are probably way more than needed because, until recently, many IoT solutions have been designed for problems/clients that don't even exist yet. These days it's easier to build solutions based on real-world problems - the industry is maturing and the wider business world is becoming more familiar with the benefits that IoT can bring.
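To make the streetlight example above concrete, here is a minimal publish/subscribe sketch using the Eclipse Paho MQTT client for Python. The broker hostname and topic names are invented for illustration; the point is that a single published message is fanned out by the broker to every subscribed controller.

```python
# One low-cost instruction switches off every subscribed streetlight.
# Broker host and topic names are hypothetical examples.
import paho.mqtt.publish as publish
import paho.mqtt.subscribe as subscribe

BROKER = "broker.example-city.net"          # hypothetical MQTT broker
TOPIC = "city/lighting/district-4/command"  # hypothetical topic

def controller(client, userdata, message):
    """Runs on each streetlight controller subscribed to the topic."""
    if message.payload == b"OFF":
        print(f"{message.topic}: switching lamp off")

# Each controller subscribes once and then simply waits for commands
# (blocking call, so shown here commented out):
#   subscribe.callback(controller, TOPIC, hostname=BROKER)

# The city-wide controller publishes a single message; the broker fans
# it out to however many lights are subscribed - one-to-many by design.
publish.single(TOPIC, payload="OFF", hostname=BROKER)
```

Compare that with HTTP, where the controller would have to issue more than 1,000 individual requests to achieve the same result.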
If the whole solution is designed around customer experience, the problem of fragmentation is diminished because there is a clearer view of what's needed from the outset.[/vc_column_text][/vc_column][/vc_row] ### Why The Cloud and Edge Belong Together [vc_row][vc_column][vc_column_text]We know by now that to extract optimal value from data intelligence, it needs to be highly accessible, captured in real time and at its freshest state and quality if it is to drive immediate operational decisions and responses. It’s why we have seen the decentralisation of the cloud as the sole vacuum of system intelligence and the migration of data capture and processing to the most remote part of the network edge. Not only does this provide for enhanced agility in terms of data access and handling, but also for improved security. The time data spends travelling across the network bandwidth and potential for corruption is much reduced, while the regular bottle necks that ensue as multiple devices communicate back to a centralised core network are fully consigned to the past. Unsurprisingly, traction has been buoyant in the data-heavy environs of the IoT space, where more and more imaginative applications have demanded greater efficiency in the processing and transmitting the volumes of data generated. Specifically, the digital edge has flourished in industrial IoT settings, where increased data usage in remote devices becomes a norm, rather than exception. Here, data must be transmitted across some of the most remote and challenging environments. This means sensors require sufficient processing power to make the kind of mission critical decisions that can’t wait for data to be sent to the cloud. Collecting data fast and flexibly in a gateway solution is a major bonus, not only lowering operational costs, but localising certain kinds of analysis and decision-making in a move that empowers the end user. Yet the complexity of IoT ecosystems dictate that it is not just about getting data to the brink and job done. First, there’s the question of which approach is best to facilitate it, which can see many caught short by an over reliance on the deployment of dumb devices such as the low-cost routers in the field, while saving investment for more sophisticated hardware further up the food chain. Where IoT devices are deliberately designed with simplicity in mind, and to reduce power consumption, the upshot can be endpoints exposed and vulnerable from a security standpoint. This should remind us of the importance of the stability and processing power that the cloud can bring and the benefit of architectural solutions that facilitate both. Indeed, while the credentials of edge were once entirely framed in a debate that pitted it against the cloud as an either/or option, in reality this simply hasn’t panned out. Amid the complexities and choice that define the IoT space, a more nuanced hybrid approach was always going to prove a preferred option to deliver the best of both worlds. This capitalises on the more robust capabilities of the cloud, and the contextual awareness and locality of the edge. Such flexibility is increasingly imperative to thrive in the digital enterprise. Delivering this duality, demands an underlying infrastructure that provides a consistent platform for both entities, one that is based on open architecture with flexibility and modularity that can drive real time integration for endpoints. Simple and seamless integration is the foundation of this approach. 
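The local, mission-critical decision-making described above can be pictured with a small gateway loop: process every reading at the edge, and act on or forward only the readings that look wrong. The sketch below is a simplified, hypothetical example - the sensor feed, thresholds and send_to_cloud() stub are invented for illustration rather than drawn from any real deployment.

```python
# Minimal sketch of edge-side filtering: analyse readings locally and
# only forward the exceptions to the central cloud platform.
import random
import statistics

def read_sensor() -> float:
    """Stand-in for a real sensor read (e.g. pipeline pressure in bar)."""
    return random.gauss(50.0, 2.0)

def send_to_cloud(event: dict) -> None:
    """Stand-in for an upload to the central platform."""
    print("forwarding to cloud:", event)

window: list[float] = []
for _ in range(1000):
    value = read_sensor()
    window = (window + [value])[-60:]          # keep the last 60 readings
    baseline = statistics.mean(window)
    if abs(value - baseline) > 5.0:            # mission-critical decision
        send_to_cloud({"reading": value, "baseline": baseline})
        # a real gateway might also act locally here, e.g. close a valve
```

Only the exceptions travel over the network, which is what keeps bandwidth, latency and exposure down while the cloud retains the full picture for heavier analysis.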
Such seamless integration ensures the transcendence of boundaries that modern app development now encompasses and the accommodation of connected devices that use an increasingly wide variety of data formats and operating systems – or in some cases, no operating system at all. It is why open source development must play a role, having evolved from being seen as simply a cheaper alternative to proprietary software to becoming the primary source of innovation. Innovation that enables the creation of smarter, event-driven microservices and IoT edge applications. Furthermore, using an intuitive drag-and-drop and API-led design approach across cloud, edge, and hybrid environments delivers the speed and agility needed for making changes on the hop. With less overall cost and impact on the existing infrastructure, apps can be connected without the need to write code, a move that plays into the hands of multiple users with varying levels of expertise. ### AR/VR | Through the eyes of history | Compare the Cloud Augmented reality (AR) took the world by storm with Pokemon Go. The app used AR to overlay digital images of Pokemon creatures on to a live view of the gamer's surroundings. Virtual reality (VR) is also an incredibly significant form of modern technology, consisting of devices like the Oculus Rift whereby users can be transported to real-world and fantasy places, totally immersing the user and replacing real-life surroundings completely. Although augmented reality and virtual reality technologies have existed for more than 30 years, neither have been available to the public until fairly recently. To understand the advanced nature of both augmented and virtual reality as it is today, knowledge of the developmental history of such technology is necessary. The birth of AR/VR The beginnings of AR and VR can be identified long before modern computing technology was born, but the head-mounted display technology that we associate with AR and VR today originates from 1968, when Professor Ivan Sutherland created The Sword of Damocles. The system relied upon a general-purpose computer and a clunky headset suspended from above. Although Sutherland is often credited with the invention of the head-mounted display, in 1961 Philco (a company in Philadelphia) developed a method whereby a camera was in one room and a user sat in another with the display. Although this may simply resemble television or film, the VR element comes in because magnets were used to match the position of the camera with the position of the user's head - displaying visuals to the user as if they were witnessing the scene first hand. Where we are now With smartphones being so integral in the contemporary world, screens have become an essential component in our everyday lives. We are constantly enclosed in a world of screens, and companies have embraced this by developing a variety of virtual reality and augmented reality products and devices to optimise the screen user's experience. As I have previously mentioned, Pokémon Go is an immensely successful example of this, as the developers achieved widespread adoption of augmented reality. Snapchat and Facebook have created highly popular filters that can be applied to images, which many people would not associate with augmented reality, but that is exactly what it is.
Earlier this year Snapchat also introduced a "Shoppable AR" feature as an element of its augmented reality lenses: users can simply tap a button to open a webpage which functions to promote a product or sign-up page; there is also a video option designed to give users access to trailers and how-to short videos. Not only are AR and VR used for entertainment purposes, but they can also be excellent tools for marketing and advertising. The most common and popular approach to VR experiences is through smartphones with headsets. This shows that although the early examples of the technology that were experimented with in the 1960s were disadvantaged by a lack of technological know-how, as it stands today, VR and AR have some clear resemblances to the headset style of the early trials. Virtual reality is a lot more sophisticated than augmented reality as the software and hardware to create immersive VR experiences are already established at a high standard. For example, systems like the Oculus Rift, along with 360 cameras, have made virtual reality experiences far more advanced. Virtual reality has applications in various industries such as real estate and tourism. For instance, the Visitor's Bureau in Jerusalem has used virtual reality to immerse tourists in the city as it looked 5,000 years ago. New applications of virtual reality experiences are being released across the globe all the time. Although augmented reality does not quite compare to the standard of virtual reality in terms of maturity, as a result of the limitations of the technology and higher cost, it is already being deployed in industries such as healthcare, construction, and logistics. Often, augmented reality experiences are presented through headsets, such as HoloLens. There are early signs that the technology is set to hold substantial commercial weight on the market. This is clear from the success of augmented reality Snapchat features and Pokemon Go, but more advancements need to come in order for it to catch up with virtual reality. There is a community of people already convinced that we live in a simulation not dissimilar to the 'plugged in' world of the Matrix, and although that may be quite an exaggeration of our current existence and perception of the world around us, as more possibilities are discovered for augmented reality and virtual reality, the technology will undoubtedly become more and more integrated into our everyday lives. Location no longer restricts what people can see; in fact, even time is not a limitation, as virtual reality headsets can take people into the past to view what locations once looked like. Augmented reality will also inevitably be further deployed through apps to add moving image overlays to our surroundings, and I am sure more ways to use the technology will be found to give it commercial power comparable to virtual reality. ### Big Computing Power is the Only Way Forward The reputation of blockchain has taken a beating recently. Following months of hype during which it was proposed as the solution for everything from world hunger to the Irish border question, the technology's sheen is now a little tarnished.
In a recent article the BBC’s technology editor Rory Cellan-Jones recently described his conversations checking out claims from eager vendors that the government was adopting their technology; “Once I have got in touch with those departments, I've been met with weary sighs and suggestions that Whitehall's enthusiasm for blockchain has been somewhat exaggerated.”  Headlines such as “Blockchain is just a very slow database” don’t help. Now that we have got beyond the initial breathless excitement, however, it’s time to consider the real potential – and sticking points - of the technology. Blockchain has already proved itself as a concept and more applications are already underway. The supply chain is a case in point: the significant growth in complexity and the various innovations, such as the emergence of personal computers in the 1980s, have changed the way the supply chain is managed. Globalisation is another factor adding to the intricacy; chain of command is essential. In the global supply chain industry, products pass through the hands of numerous suppliers and authorities. When every party is contributing to the blockchain ledger, all goods transported around the world will be accompanied by a record which can verify authenticity and ethical sourcing at every stage of the journey. A recent report from Cap Gemini showed that while only 3 per cent of organisations are deploying blockchain at scale, 87 per cent are in the early stages of implementation or carrying out proof of concept projects with the technology. And governments, while not necessarily committing to its adoption, are certainly looking into it. In Dubai, as part of the Government’s Smart Dubai Initiative, IBM has recently launched a government-backed blockchain platform service that will allow businesses in the country to test, build and launch blockchain-based products. A lot of power Nonetheless, the sheer size of a distributed platform that is constantly updated in real time inevitably requires a great deal of processing power – and indeed electricity and speed is an issue. By definition Blockchain technology is decentralised; it uses a peer-to-peer network approach that involves many separate elements working together. But while this collaborative approach is hugely beneficial in creating trust, it negatively affects processing speeds and hinders Blockchain’s widespread adoption in industries like banking. To help solve these issues and increase blockchain’s adoption across multiple industries, Hyperledger - an open source collaboration initiative established by the Linux Foundation to ‘advance cross-industry blockchain technologies’, is supported by blue chip member businesses including IBM, American Express, Intel, Deutsche Bank and Accenture. The project has brought together many of the biggest organisations in the technology, banking and finance, Internet of Things, and manufacturing and supply chain fields, to work together to develop a legitimate, standardised enterprise-grade blockchain solution that will allow secure transactions to take place at scale. Bringing in the big guns Part of the solution may be a hybrid approach that incorporates an element of centralised technology that can take care of the heavy lifting. Big global banks are currently able to carry out over 12 billion encrypted transactions per day because of the speed, scale and security provided by a single mainframe. The same technology may be the key to success for Blockchain. 
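The supply-chain ledger described earlier comes down to a simple idea: each new record commits to a hash of the one before it, so history cannot be quietly rewritten. The sketch below is a minimal, illustrative version only - the record fields and parties are invented, and real platforms add consensus, digital signatures and replication across peers.

```python
# Minimal sketch of a hash-chained ledger for supply-chain records.
import hashlib
import json
import time

def block_hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_record(chain: list, record: dict) -> None:
    """Each new block commits to the previous one, so tampering with
    any earlier record breaks every hash that follows it."""
    chain.append({
        "index": len(chain),
        "timestamp": time.time(),
        "record": record,
        "prev_hash": block_hash(chain[-1]) if chain else "0" * 64,
    })

def verify(chain: list) -> bool:
    """Re-check every link in the chain."""
    return all(chain[i]["prev_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

ledger: list = []
append_record(ledger, {"party": "grower", "item": "coffee lot 17", "status": "harvested"})
append_record(ledger, {"party": "shipper", "item": "coffee lot 17", "status": "in transit"})
print(verify(ledger))                      # True
ledger[0]["record"]["status"] = "forged"   # any tampering is detectable
print(verify(ledger))                      # False
```

Appending, distributing and verifying such a chain across billions of real-world transactions in real time is precisely where the processing power discussed here becomes the limiting factor.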
Mainframe technology underpins 87 per cent of all credit card transactions and four billion airline flights each year. The technology is considered so reliable that 57 per cent of enterprises with a mainframe run the majority of their business-critical applications on the platform and, according to Forrester, this is set to rise to 64 per cent by 2019. By hosting blockchain on mainframes, businesses can take advantage of their sheer computing power that comes with this - not to mention the cryptography, security and reliability. It also means that the hosted blockchain can be integrated with an organisation’s existing transaction data and systems already running on their mainframes. It’s still early days for blockchain. We can expect to see many trials, both unsuccessful and successful, before we fully understand where the technology is heading. But wherever it goes, it will need underpinning with a solid mixture of powerful processing power and high transmission speeds if it is to respond at a rate that will be acceptable for our global institutions. By bringing together centralised technology in the form of the mainframe, in a distributed model, it may just be possible to have the best of both worlds.[/vc_column_text][/vc_column][/vc_row] ### How Insurtechs and Big Tech are going to be foes and friends for Insurers in 2019 [vc_row][vc_column][vc_column_text]The insurtech sector is one of the fastest growing sub-divisions of the fintech movement. Indeed since 2012, insurtechs have attracted $2.5 billion in early stage capital and there are over 250 companies in the general insurance segment alone. The boom in insurtechs started three years ago. The popular image is of the insurtechs who are out to disrupt the established insurance market with digital products that cut out traditional insurers and brokers. Like their bigger brothers in banking fintech, disruption was the insurtechs’ original mantra. However, there is a dawning reality that scaling up an insurtech start-up, or unicorn, to compete for customers on a massive scale is super difficult. There are some immovable obstacles to be dealt with. Regardless of how digital and agile an insurtech believes it is, there are legal and regulatory hurdles that are hard to overcome. To move ahead in 2019, insurtechs need strategies to deal with these. Some of the most disruptive insurtechs are those who seek to seize customers from traditional insurers. You could term these as digital attackers and several of these will survive. It is notable that these challenger, independent digital insurers are installing traditional insurance industry veterans to lead their operations for next year. These are exactly the right guys to get over the regulatory hurdles. The main development in the insurtech sector will be how insurtechs make serious efforts to be more attractive, not disruptive, to the insurance establishment. Insurtechs are realising that collaboration with the establishment is critical to their success because traditional insurers have done the hard work on customer acquisition, statutory capital, and regulatory compliance. And what is more, the establishment insurance companies are in the market for insurtech innovation. Successful insurtechs will be those who fit into different parts of the insurance customer lifecycle. 
These can be categorised as process improvers who can make existing processes, like risk analysis or claims assessments, more efficient, or how they can evolve existing products like motor with vehicle telematics or property with IoT. Great examples of these are FRISS and Octo Telematics. A European insurtech, FRISS is one of the most successful providers of fraud detection solutions with over 150 implementations in 25 countries. These implementations include the flagging of potential fraud during underwriting and claims management using machine-learning and other techniques. Octo Telematics, one of the largest global telematics companies, automates the First Notice of Loss (FNOL) process for motor physical damage claims and can digitally reconstruct the dynamics of a crash with their telematics data for a claims adjuster to evaluate. Octo has telematics data on over 200 billion miles driven. Do not expect disruptive insurtech offerings to fade in 2019. Many will keep on inventing new insurance products that may be positioned as disruptive (for example, pay as you go motor insurance or insurance for the gig economy). Nonetheless, insurtechs will be most likely to succeed if they have the backing of an established business that is typically a traditional insurer or reinsurer. As insurtechs look for ways to survive and thrive, they find they need to connect with insurers’ systems, underlining the importance of software platforms that are widely adopted in the industry. These offer the best way for insurtechs to interconnect with insurers. 2019 will see even more enthusiasm for insurance platforms, many of which exist in the cloud. By offering products that operate on the same platform as the insurer, insurtechs remove the obstacle to new technology adoption and realize value more quickly. No matter how easy an insurtech app is to use, if it requires staff training on a new user interface, there is a significant barrier to its success. This difficulty can be avoided if the app’s capabilities can be accessed and employed within the insurer’s familiar systems dashboard. While insurance has been shaken up by the advent of insurtech, there have been persistent rumours in trade press about the Big Tech boys - Amazon, Google, and others - entering and disrupting the market more decisively than an array of plucky insurtech start-ups. Some limited ingress by Big Techs has begun and should be expected to continue. The pattern will be for insurers to both repel and embrace Big Techs. It has been entirely sensible for insurers to evaluate an Amazon aggregator service or other parts of their offering – for example Travelers partnership with Amazon on smart home in 2018. This trend is going to pick up over the next 12 months or more. However, in entering partnerships insurers hold a strong not a weak card. Big Techs who want to offer insurance services must overcome regulatory barriers that insurers understand. It is worth mentioning that Big Techs, like Amazon, Google and Facebook, do not exactly have an unsullied reputation in data or regulatory compliance. This does not mean insurers should hesitate to fight back in 2019.  They need to both understand better how Amazon and the other tech giants operate; and realise their own innate strengths to repulse the attack on their market. 
The key developments to watch out for will be how insurers invest in innovative ways to maximise and leverage the data they already have about risks and customers, and how they create the effortless interfaces that customers want. The latter will be seen in the way insurers learn the lessons of digital experience that is intuitive, uncomplicated and always aimed at saving the customer time and energy. Look at how the Google search page works for a hallmark of computing power rendered with awesome simplicity. ### Data Science - Observing Big Data - QuantSpark's Adam Hadley Compare the Cloud interviews Adam Hadley from QuantSpark. Adam talks about how QuantSpark is aiming to disrupt the consulting industry. What does Big Data mean? Big Data is the phrase referring to a large volume of data, structured and unstructured. As this data is so large it is difficult to process through a typical database and needs to be analysed computationally to reveal trends and associations. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about QuantSpark here: http://www.quantspark.com/ ### Banks need to collaborate to best use APIs Digital transformation is on the lips of every bank's CIO, whether the objective is to offer retail customers new online services or to create cost and efficiency savings for the business itself. Many banks are still working with legacy software running on a mainframe built on hundreds of thousands of lines of COBOL code. This may hinder innovation and agility – incredibly important when digital upstarts are threatening to eat a significant piece of banks' service offering pie. Additionally, the Open Banking initiative, which launched in January 2018, saw many banks open up their APIs for integration with other third party services. This thrust the spotlight onto the importance of APIs and the ability to create new innovative services with the data and the existing business logic that they provide. APIs are crucial because they help reduce the time to market for new services from years to months or even weeks by being able to draw down on logic (code) created by other developers. And, this means that customer-facing apps and services do not have to be developed from scratch and cost millions of pounds. Running the API integration gauntlet However, legacy systems may make it difficult to integrate many APIs. To further complicate the issue, banks may not always integrate APIs that have been certified to meet an industry accepted standard. This is, in part, due to the fact that industry standardisation across API definitions is lacking. These definitions are vital – they help developers to annotate API functionalities and provide guidance on how to implement innovative and intuitive digital services across both back-end and customer-facing functions. With stringent regulatory requirements presiding over the way banks interact with customers, IT glitches and downtime are often met with impatient customers and, often, fines from regulators. If APIs are badly implemented the likelihood that IT glitches will happen increases dramatically.
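The time-to-market argument above is easiest to see in code: rather than re-implementing account logic from scratch, a developer calls a published API. The sketch below is a hypothetical example using Python's requests library - the endpoint, token and response shape are invented for illustration and do not represent any bank's or BIAN's actual interface.

```python
# Minimal sketch of consuming a standardised account-information API.
# The base URL, token and response fields are hypothetical examples.
import requests

BASE_URL = "https://api.examplebank.com/open-banking/v1"  # hypothetical
TOKEN = "replace-with-oauth-access-token"                 # hypothetical

def get_balance(account_id: str) -> dict:
    """Fetch an account balance via the published API instead of
    re-implementing the underlying business logic."""
    response = requests.get(
        f"{BASE_URL}/accounts/{account_id}/balances",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()   # surface glitches early, before customers do
    return response.json()

if __name__ == "__main__":
    print(get_balance("12345678"))
```

A well-annotated API definition tells the developer exactly what such a call accepts and returns, which is why standardising those definitions matters so much.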
Reputational damage, fines and customer distrust all take a toll on a brand's future. An open, flexible, cloud-exchange platform is the key to API integration The key to helping banks implement APIs in line with best practice is to use an open platform which provides access to an API repository. The platform should facilitate the sharing of APIs, while also building a member community around it to enable users to share implementation and integration knowledge and foster a collaborative approach to industry innovation. It should also consist of an online library of free API definitions, a common repository to store API data or source code. There are few enterprise-grade cloud computing API exchange platforms that exist in the industry. There are fewer still which are created by a group of established banking industry participants that have worked to standardise API definitions and ensure that all APIs on the platform are compliant. BIAN has developed just this; in collaboration with over 10 of the world's most prominent banks and technology service providers, they have built an open, flexible, enterprise-grade cloud computing API exchange platform on Microsoft's Azure cloud infrastructure. The platform enables BIAN members and the wider banking industry to explore the world of BIAN-compliant APIs, experiment with member APIs, and integrate them into their solutions at no cost. The platform also allows for contributions to the community, permitting registered participants to on-board their own APIs that follow BIAN standards. The API Exchange features gamification qualities such as a slick UX and public ratings to enhance community engagement and user experience. The gamified environment is also designed to promote healthy and measurable meritocracy to enhance collaboration and contribution amongst the BIAN community. Yet, as the platform demonstrates, APIs also need a place (a wiki) for documentation for developers to understand the API protocols, how they are created, and how they can be implemented and tailored according to the needs of the industry. Collaboration is the cornerstone of true technology innovation Banks should continue to work together to resolve tech challenges, use world-class solutions, and accelerate their technology innovation. Tackling issues alone will only create more fragmented technology solutions, often resulting in unwieldy, expensive and time-consuming implementations. In being able to come together to share best practice – particularly through a central repository that contains compatible APIs – standards will be aligned, and the industry as a whole can modernise itself faster for the benefit of the customer. Not only will open and flexible API-exchange platforms help banks realise the benefits of standardisation, but they will also achieve rapid acceleration in integration, quality assurance, and partnerships within the banking industry. And, critically, the API Exchange will allow banks to modernise their critical legacy infrastructure and adopt standardised frameworks to facilitate more open and easy collaboration with FinTechs and RegTechs. ### AI Mimics Human Intelligence but Not Human Culpability? We teach AI systems to process information with the intellectual acumen of humans (perhaps even surpassing some humans), yet we do not reprimand them for their mistakes.
Naturally, the blame is placed on humans who developed the artificial intelligence technology and perhaps, as a result, AI will never be totally independent. Like a child, AI learns and makes decisions, but it has to be monitored by its creators to ensure it does not go rogue. How do we respond when an unsupervised machine learning AI is left to its own devices and makes mistakes? For many, the obvious answer to this question is that the developer of this technology should be held accountable, but when we risk allowing AI to make its own decisions unmonitored, are we giving it responsibility it does not have the means for? A machine must learn its lesson as a human does if we are to progress further into the widespread implementation of AI technology. Integrating explanation systems For AI to truly mimic human intelligence and thinking, it would have to be able to explain its actions, but this is a complicated matter. Integrating explanations into an AI system would require substantial data and training in its development and also in its application. As AI systems are designed to manage tasks in an efficient and more scientific manner than humans, an AI translating its workings into tangible explanations accessible to humans could delay operations and slow down its productivity. The key question is, how can we make AI accountable without sacrificing the quality of its operations? The absence of accountability and second guessing is a major difference between humans and AI that gives AI the edge in terms of efficiency, but in a human-centric world, an entity capable of making decisions is naturally assumed accountable for those decisions. For some manufacturers there is a fear surrounding explanation, as they believe the inner workings of their systems could be revealed, giving competitors access to their golden secrets. Total transparency in terms of the design behind the AI is not necessary though; rather, transparency in terms of outcomes from decisions should be prioritised. For example, an AI system should be able to explain why a driverless car crashed without giving away the coding secrets behind the AI's system. Instead, an explanation such as "the wall was incorrectly labelled as 'road'" should be provided so that future mistakes can be avoided. The subject of AI accountability is not reserved solely for speculation by the unknowing public; the matter has been discussed and meditated on at a higher level. For example, German EU legislator Jan Albrecht believes that explanation systems are essential in order for the public to accept artificial intelligence. If we do not understand why AI makes the decisions it does or what it has the power to do, people will be far too afraid to embrace the technology. Albrecht believes that there must be someone who is in control of the technology. Explanations could also help developers identify biases: an AI system may only look at men when collating data on scientific endeavours, and this bias could then be eradicated once the developer receives an explanation. The future will reveal much more about AI and what we should expect from it. Questioning how 'human' AI is and whether explanations should be required from it is entirely reasonable as we are still in the process of adapting to AI and deciphering its place within our world. What about criminal responsibility?
Placing liability with developers makes sense while they remain involved in the development of the AI, but in the not so distant future, autonomous machines may have the agency and intellectual freedom that any human adult has. A parent is responsible for a child up until they come of legal age to care for themselves, and if we see the developers as parents, their AI offspring may fly the nest in the near future. AI is not at the stage where we allow it to exist independently, but as we test the technology further, our confidence may increase to the point where we let it run freely and make vital decisions for us. To some degree, with the use of unsupervised learning, there are already examples of relative AI independence, but this is within testing centres where people can monitor it. Unsupervised learning only gives an AI system a narrow range of focused abilities such as identifying potential customers for companies based on data. The issue of liability will become paramount if AI is ever free to choose what it becomes involved in without concern for morality or risk to human life. Many will envisage RoboCop and his flavour of vigilante justice when thinking about the potential for autonomous technology going rogue and although this may be an exaggerated image of what is to come we do not know quite how close to the truth this could be. Although AI technology is often used to detect criminal activity e.g. intelligent sensors in packaging to identify illegal items AI is yet to adopt the nuanced moral codes of humankind, and therefore an AI bot or system may commit crimes. AI robots have been known to commit what humans would consider a crime. For instance, in 1981 a Japanese man working in a motorcycle factory was killed by an AI robot working in his vicinity. The robot incorrectly identified the man as a threat and went on to push him into an active machine. The bot then continued working as its perceived enemy had been eliminated. Although this incident happened many years ago it raises the issue of accountability and whether the developer could be seen as responsible. There appears to have been no intent to harm from the developer and therefore the intent was with the robot. How do we manage criminal activity if it is committed by a non human entity absent of empathy? Perhaps the only answer is to train artificial intelligence more thoroughly on human behaviours and emotions. We cannot threaten to reprimand a robot with jail time as this will mean little to them unless they are trained to feel regret or accountable for their actions. ### Chatbot | Your Latest Recruit More and more businesses are saying "Hello" to a chatbot to increase efficiency and cut costs while offering their customers a more personalised experience. Hannah Jeacock, Research Director at leading HR & Payroll provider MHR explores the potential for chatbots in the workplace. In the infancy of the internet, AOL Instant Messenger was a state of the art means of communication, used as an early social platform and helping office-workers get faster responses from colleagues. A few years on, the dot-com entrepreneurs stormed ahead with innovation after innovation, making early messenger services seem as obsolete as they had seemed groundbreaking a few years prior. In the last few years though, chatbots have given us a glimpse into the future.   The Bad and the Good! 
Millions of people already use platforms like Facebook Messenger and conventional text messaging every day - but chatbot technology is set to take the message format to the next level. Admittedly, chatbots have not been without their problems. While being the world’s largest software firm, Microsoft’s first steps into using this technology were halted due to racist remarks generated by its chatbot, ‘Tay’. What started out as something innocent and helpful, turned into something rather more sinister when the chatbot started posting inflammatory and offensive tweets, as a result of gathering offensive comments that had been posted online. Thankfully, that’s not the case for the many chatbots that are making a positive impact on people’s lives. In fact, Chatbots Magazine has stated that 2018 is the ‘year of the chatbot’, owing to huge advances and investment in chatbot technology by major firms. A great example is the Lark chatbot, which is already saving lives and improving health as a piece of technology that assists people who live with chronic diseases, and it has gathered a great reputation by doing so. By checking people have been eating the right food, or have taken their medication at the right time, support is offered by giving feedback which is comparable to having a nurse on hand - at a fraction of the cost, and with less intrusion. This is just one example of the importance that chatbots can and will have in our lives, and rather than being novelty items, they are extremely useful tools.   Cost Saving and Expenses Reducing unnecessary staff is an example of saving time and effort through a chatbot, but it also saves money for organisations by reducing wage bills and avoiding many costly mistakes that can occur through human error. As well as these benefits, it is easier to monitor expenditure in terms of payroll access and expense processing, both of which can be made faster by giving and receiving employee feedback, to ensure all parties agree over monetary amounts. Expense claims can be a bugbear for many employees who want them processed as quickly as possible, but face delays due to archaic processing methods and outdated technology. Consider this scenario. A salesperson has to stay away from home for two nights to meet with customers. They keep all their receipts in a safe place for when they get back to the office to make an expense claim. They fill in an expense form and post the receipts to the payroll or expenses department. There is then a long wait for the expenses to be approved and the money paid. It’s a frustrating process for the employee, particularly if they are left struggling because they’re out of pocket. Had they used a chatbot, it would have read the image, turned the text from the paper into digital text and automatically sent the info to their expenses software for manager approval; with a response back in minutes. They could also check and confirm the amount, saving precious time, without having to keep mounds of paperwork!   Appointment Scheduling Software Chatbots work for a diverse mix of businesses, including makeup retailer and online brand Sephora. A historic brand, Sephora has moved from its origins on the Parisian high street to using an innovative chatbot through Facebook Messenger, adding value for its customers. Already offering ‘virtual makeovers’ online through an augmented reality app which uses customer photos to show how products would look, the company are a good example of how technology can keep your business fresh. 
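To make the expense-claim scenario described above more concrete, here is a minimal sketch of what a chatbot might do when an employee sends in a photo of a receipt. The OCR call and the expenses system are stand-in stubs invented for illustration; they do not represent any particular product's API.

```python
# Minimal sketch of the expense-claim chatbot flow: read the receipt,
# extract the total, and submit the claim for manager approval.
import re

def ocr_receipt(image_path: str) -> str:
    """Stand-in for an OCR service reading the receipt photo."""
    return "Station Hotel\nRoom x2 nights\nTOTAL 184.50 GBP"

def submit_claim(employee: str, amount: float, currency: str) -> str:
    """Stand-in for the expenses system; returns a claim reference."""
    print(f"Claim queued for manager approval: {employee} {amount} {currency}")
    return "EXP-0042"

def handle_receipt(employee: str, image_path: str) -> str:
    """What the chatbot does when an employee sends a receipt photo."""
    text = ocr_receipt(image_path)
    match = re.search(r"TOTAL\s+([\d.]+)\s+([A-Z]{3})", text)
    if not match:
        return "Sorry, I couldn't read a total on that receipt."
    amount, currency = float(match.group(1)), match.group(2)
    ref = submit_claim(employee, amount, currency)
    return f"Thanks! I've submitted {amount} {currency} for approval (ref {ref})."

print(handle_receipt("j.smith", "receipt.jpg"))
```

The same request-and-confirm pattern underpins appointment-scheduling bots such as the one described next.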
Their chatbot is used by customers to reserve appointments, currently for their US stores only, but with a far-reaching application for all businesses worldwide, and in every sector. By implementing scheduling software of this type, employees can manage themselves, arrange meetings and meet with customers – but the real benefit is that they can do all this whenever and wherever they are. The mobile and intuitive nature of chatbots means that appointments can be easily arranged in environments where using technology is not usually practical, for example on a building site. By using the chatbot, a few pressed buttons can do all the work, and chatbots can memorise several common commands, such as ‘site survey’ or ‘boiler repair’ which are most frequently used, meaning mobile workers can organise themselves mid-job if required.   An Interactive Online Holiday Booking System Busy staff sometimes need a rest from the challenges the day throws at them and they, their managers and the HR department all have limited time and energy for carrying out administrative tasks. When a break is required, booking a holiday doesn’t need to be a challenge for anyone, as chatbots now have the functionality to do this in an instant. Imagine you are that staff member: After checking the dates and availability, your company chatbot can automatically send a message to one of your managers to authorise your annual leave, and the user does not stop there. Once leave is authorised, it is now possible to book your actual holiday through a chatbot, selecting the best flights or using a questionnaire to establish the best kind of holiday for you. Many holiday companies like Kayak and Booking.com are already using chatbots to good effect, taking the hassle out of booking. The traditional airline British Airways took the bold step recently of using emojis to operate its messenger chatbot, after finding that more people prefer communicating by emoji than words. The so-called ‘Emojibot’ works by asking a chain of questions about the kind of holiday you are after, so a quick response can be made to speed up the search. This quick-fire rendition of a chatbot shows that as AI chatbots continue to develop, the time saved for businesses is going to be very significant, and bots a joy to use. The process of employees organising their schedule, booking annual leave, then booking a holiday for that break, shows that the multi-use nature of chatbots is hugely important for the future of business, and the present direction of technology; hugely benefiting businesses and their personnel management.   The Ultimate Attendance App Attendance is a challenge and high priority for many employers. By using a chatbot, staffing levels can be checked through a series of questions. Staff can also virtually check-in when they have arrived at work, training or customers’ premises. This makes it simple to see how many staff are working, and what staff are doing; for example who’s in or out of the office.   Why We All Need a Chatbot From a practical standpoint, chatbots are a great thing. Saving money, time and (when programmed effectively) offering consistently useful help to both customers and employees. Having already been adopted successfully by many international companies, they do not seem to be going away. 
The chatbot can be a tool with a human voice, something more approachable than the software it drives – employees may find self-service procedures dull or complex, for example, but by adding the illusion of a human assistant, the customer journey is far smoother, and the results will speak for themselves. Additionally, using a chatbot on a phone offers privacy to employees who wish to keep certain appointments confidential, such as a medical appointment or an application for a promotion - things they may not feel comfortable having displayed on their desktop monitor or work laptop for their colleagues to see and comment on. When a new technology emerges, consumers and business leaders want to know the benefit of purchasing the product, and what it does. With an AI chatbot, the question is almost 'what can't it do?' From organising, to engagement, to cost saving - a new era of technology has begun, and its possibilities are endless. ### How Fintech Apps Revolutionize Finance Methods? The finance industry presents customers with complex business scenarios, and many find it hard to get accustomed to the environment. While the fintech industry is on the verge of far-reaching change, there is still plenty of scope to improve. With the exponential rise in digitisation, it is imperative to integrate modern technologies that can deliver that change. The growth of the fintech industry has exceeded expectations and creates opportunities for app developers to reshape it through fintech apps. No doubt the finance industry has faced severe pitfalls, but a fall does not mean failure. The finance industry embraced innovation a few years back, when finance combined with technology to give rise to FinTech - financial technology. Several industries are now trying their luck in this fast-paced sector, and one of them is the mobile industry. Financial software development has recently attracted immense attention from startups. Because the next generation spends a considerable amount of time on smartphones, deploying a fintech app is close to a necessity for staying ahead of competitors. Supporting this, one survey found that most payments for financial services are already made via mobile portals, and it is predicted that by the end of 2020 around 90% of fintech payments will be made through fintech apps. Developing financial applications may look like a hard nut to crack, but a plethora of benefits supports the move to mobile. Threats Posed To Companies While Deploying Apps Deploying a mobile app has plenty of benefits, but the road is not smooth. Financial software development companies face a whole set of challenges when planning to build an app for financial services. A few of these are: Security Issues Whatever the domain, apps are prone to data breaches and demand serious attention to security. When it comes to sensitive apps such as fintech apps, a breach might mean the loss of private credentials such as user IDs, passwords or account details, and that is not acceptable. Hence, one of the most important things to keep in mind when you plan to build a fintech app is to ensure optimal security. Abide By Regulations Financial services are regulated by government, and it is hugely important for app developers to comply.
A single compliance slip could lead to your company's registration being cancelled. Hence, you must keep a check on the rules and regulations while your app is being developed. Able To Handle Multiple Requests As the world advances digitally, so does competition. Customers seek results, and fast. If an app is too slow to provide services, there is a high chance of customers switching to another company. Customers have their own basic requirements, and if your app fails to meet them, you will have a tough time succeeding.  Odds That Favour Fintech Apps   Personalized Appearance  Customers are the key to success and, however you have ideated, Fintech apps create a sense of uniqueness. Apps built for Fintech start-ups nowadays map closely to the needs and desires of customers. Matching users with the right products leads to personalization of the app, which in turn builds customer trust and ensures a long-term relationship. Individualization is the key to the success of financial app development for startups. Enhanced Customer Retention Mobile apps ensure long-term connectivity with the customer. According to one survey, selling to an existing customer has a success rate of 60-70%, whereas the equivalent figure for new customers is closer to 20%. Mobile apps therefore not only build a connection but keep the user engaged for a long time, thereby converting that engagement into sales. Push Notifications  One of the most crucial elements of a mobile app is the ability to send updates and offers to potential customers. On-demand apps treat this as an asset and a means to market products and so enhance sales. Additionally, ML techniques help tailor the user profile so that customers are notified about the products or services most likely to appeal to them, such as alluring discounts and special offers for five-star customers. Greater Customer Management  Employing a fintech app paves the way for improved customer support. FAQs have been the norm for a long time, but advances in technology have led to chatbots providing routine assistance to all customers, and live chat is another feature now common in apps. Simplified Data Fintech apps are the voice of simplified data. Fintech app ideas include services that simplify data and streamline the entire purchase process. Fintech apps offer hassle-free purchasing along with a transaction history and a dashboard with a comprehensive summary; tracking information and accessing significant data are all handled in one place. Future Of Fintech  Having said all of the above, a final word with a few stats. Several reports suggest that total investment in Fintech over the next five years will reach $150 billion. Huge! While the future of Fintech looks captivating, it is high time that insurance agencies and app development companies acknowledged the potential of the financial industry and adapted to it. It would not be wrong to say that the financial industry will soon take the market by storm, leaving behind a plethora of opportunities for start-up enthusiasts ready to take on the finance industry.[/vc_column_text][/vc_column][/vc_row] ### Cloud Networking | The common pitfalls of moving Cloud networking introduces a new way to deploy and operate distributed access networks.
It delivers enterprise-class network management capabilities for Wi-Fi access points, switches, and routers, via a cloud infrastructure that requires little or no capital investment in additional hardware and software platforms or in IT resources. By now, almost any company, independent of its industry and size, uses at least one kind of IT cloud service. And that’s not surprising at all, because cloud-computing services provide huge upsides and workload shifts for companies: IT departments no longer have to purchase, deploy and maintain computing hardware and software in-house, cloud services are quick and easy to set up, they scale as needed without involvement from IT, and are automatically updated to the latest release level. In short, they vastly simplify and reduce the complexity and cost of IT service delivery – and organisations’ wired and wireless access networks can now enjoy the very same benefits. Unlike traditional locally hosted solutions, cloud networking simplifies highly complex tasks to make them extremely simple, enabling organisations to deploy locations in minutes and operate distributed networks via a centralised console, providing unprecedented levels of centralised control and network visibility. It also allows for seamless growth without bottlenecks caused by legacy networking components such as wireless LAN controllers, and supports headquarters and remote locations alike, significantly reducing the need for local intervention and truck rolls.   With cloud networking, simplicity is the goal As a result of its inherent advantages, cloud networking adoption is growing continuously. Following the general trend of moving IT infrastructure to the cloud, and consuming it as-a-Service instead of hosting it locally – IaaS is experiencing the fastest increase out of all cloud services, with a compound annual growth rate of 29.7% forecast through 2020. The biggest trend in cloud networking is being driven by a need to move from complexity to simplicity. Planning for a network that should last anywhere between 4-7 years is challenging as corporate, BYOD, guest, and IoT connectivity requirements are rapidly changing. Organisations are already stretched for budget and resource, and without a crystal ball, IT departments need technologies that easily grow and adapt as needed, without worrying about change control, license surprises, or additional infrastructure that wasn’t planned for. Ultimately IT departments are looking for solutions that take care of them, not the other way around. They want to set and forget where possible, but many solutions require ongoing maintenance and support.   Considerations when moving to the cloud Not all clouds are created equal. Here are some of the most common pitfalls to look out for and to research while evaluating different solutions: Forced upgrades: As their network grows in scale, organisations may have to purchase hardware controllers periodically, and/or purchase a new network management system once their network reaches a certain size. Cloud wireless LAN controller masquerading as cloud management: If an access point loses its connection to the cloud, there is a risk of clients losing connectivity altogether, or of certain features becoming unavailable until the access point can connect with the cloud again. Different feature sets for cloud and on-premises deployment options: one or more features are only available on one deployment option, not on both. 
Depending on the organisation's requirements, it may be forced into one deployment model. It may also risk that future new functionality will only be made available on one deployment model, which in turn may make interesting new features unavailable to the organisation. Availability: With traditional network architectures, Wi-Fi service can be interrupted if the management goes offline. With true cloud management, wireless connectivity at customer sites will continue. The same applies if access to the cloud management is deliberately suspended by the network vendor, e.g. in case of delayed payments. No seamless upgrade path: the vendor's portfolio includes optional capabilities and features, such as advanced authentication or APIs, that cannot be supported by all management solutions, and will necessitate an upgrade of the underlying management platform. Non-universal hardware: Different networking hardware is often required for different network sizes and different management offerings – so as the organisation's network grows, or as it is upgraded, access points, switches, and routers will have to be replaced.   For those looking to make the most of moving their network to the cloud, it is important to keep these pitfalls in mind. Cloud networking is a powerful solution and offers a host of advantages compared to traditional, controller-based network management systems. Due to its flexibility and seamless scalability, cloud networking is a perfect fit for organisations of all sizes and industries. The key is to ensure that the time and money invested in deploying a cloud network will stand the test of time. Crucially, organisations need to look to a solution which can scale on demand, and allow them to constantly create new competitive advantages, process efficiencies and cost savings. ### The Cloud Has Disrupted Enterprise IT (and ain't going to fix it) [vc_row][vc_column][vc_column_text]There's no getting away from the fact that cloud computing is having a profound effect on enterprise IT. This is due, in part, to once cautious executives now seeing it as a quick and easy shortcut to digital transformation. However, while some may find it a good fit, for others the brave new world of cloud computing is far from a perfect solution. Indeed many of the executives I talk to see themselves stuck in what Gartner calls the "trough of disillusionment", having rushed into the technology a little too quickly and without a full understanding of what they were buying. It's the cloud, stupid There are, of course, lots of good reasons for moving at least some workloads to the cloud, with by far the biggest lure being the simple on-demand delivery model on which most services are based. Before the cloud, on-premise infrastructure teams would, typically, have to spend months justifying and purchasing the hardware, software and services they required. A task made even more arduous by having to second-guess demand years in advance, with many over-ordering 'just in case' for fear of having to go through the process all over again if they got their sums wrong. With the cloud, that all goes out of the window. Need a couple of virtual servers, a database instance, or a hosted email server - next week, tomorrow, now? In the cloud, they can be up and running in minutes, along with all the storage and network connectivity needed to go with them and with no capital outlay required.
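To make that on-demand model concrete, here is a minimal sketch using boto3 (the AWS SDK for Python). The AMI ID is a placeholder, configured AWS credentials and permissions are assumed, and other cloud providers offer equivalent APIs; it is an illustration of the pattern, not a production script.

```python
# Sketch of launching a virtual server on demand with boto3.
# Assumes valid AWS credentials/region; the AMI ID below is a placeholder.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder image ID
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
)

# The instance is billed only while it runs and can be terminated just as quickly.
instance_id = response["Instances"][0]["InstanceId"]
print(f"Launched {instance_id} - no hardware purchase, no capital outlay")
```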
Moreover, with a lot of cloud platforms, you only have to pay for the resources you actually consume, making the cloud - in theory, at least - the ideal fractional computing platform on which to host digital business workloads. Unfortunately, it often is ‘in theory’ as in many ways, the public cloud is a little too quick and easy of a solution, and careful planning is needed if it is to deliver the fractional computing benefits many expect of it. Moreover, if you get it wrong it can be a lot harder to put right than might be imagined. Choices, choices, choices One of the biggest problems with the cloud is the sheer number and variety of platforms and services on offer. Last time I checked, AWS alone had over 150 products grouped under 20 different categories, from basics such as compute, storage, database and developer platforms to leading edge business analytics and AI tools. Add to that similar portfolios from other providers, plus the ease with which public cloud products can be purchased, and you have the perfect recipe for creating a mixed bag of platforms, services and apps which, while effective and perfectly manageable in their own right, don’t make for good bedfellows. Low buy-in costs also complicate matters, making it easy for line of business managers to acquire public cloud products with little or no oversight from corporate IT teams (so-called Shadow IT). Which wouldn’t be so bad except that, unlike on-premise infrastructure where it’s possible to look inside the data centre to see what’s going on, management visibility in the cloud can vary considerably between vendors and across products, making it hard to even know what you’ve got, let alone stay in control. New issues for old Don’t get me wrong, I’m all in favour of cloud computing, but where once the cloud was seen as a way of fixing the issues around on-premise IT, I’m increasingly hearing of new problems being laid at its door.  One of the most common is that while the cloud can, and does provide a cost effective platform for the development of new applications, when those apps are put into production the costs can skyrocket. Beyond that, the executives that I have talked to complain that they find it much harder to keep track of costs when applications move to the cloud compared to on-premise deployments. In fact, many will secretly admit to having little more than a vague idea of the total spend, plus no understanding as to whether that’s ‘normal’, or what they can do about it. Another common concern is the extra layer of complexity added by a technology widely promoted as a way of simplifying IT. Rather than making IT easier, companies often end up having to build additional specialist support teams, in many cases for each individual cloud platform, over and above those required for on premise infrastructure. And lastly, there’s the little matter of cloud sprawl with isolated silos of data even more commonplace and much harder to eliminate where the cloud is involved. We can fix it Unfortunately, these issues aren’t something that cloud vendors themselves are likely to address, at least not in the short term. Hence why so many organisations are abandoning thoughts of cloud-only IT and taking a hybrid approach spanning on-premise as well as cloud platforms – just to get back some degree of control. 
At the same time, however, they also want the on-premise part to be as quick and easy to scale as the cloud - which is why, while traditional server and SAN sales are forecast to fall over the next few years, interest in hyperconverged platforms is surging ahead alongside growth in public cloud services. It's also why there's been a flurry of activity around tools and technologies to not only empower C-level execs to keep a lid on costs but allow IT teams to move applications around and balance workloads across platforms regardless of the technology, vendor or implementation involved. These orchestration tools are the missing link in the equation to fix enterprise IT by enabling the enterprise to take full advantage of what the cloud has to offer on their own terms.[/vc_column_text][/vc_column][/vc_row] ### Corneas | 3D Printed Human Corneas | Compare the Cloud 3D printing is enticing for its ability to bring ideas and images into the third dimension, allowing the human imagination to run wild, but it is technology far beyond novelty. 3D printing is used to make parts in engineering, and even as a medical tool. For instance, Newcastle University scientists have used the technology to print the first human corneas. The true-to-life details captured in the printed corneas were obtained through the scanning of patients' eyes. The data from the scans allowed the scientists to promptly print a cornea matching the size and shape of the human eyes. The replica corneas were made from human corneal stromal cells mixed with alginate and collagen, creating a 'bio-ink' that can be used in the 3D printer to produce prosthetic corneas. The scientists accomplished this using a basic 3D bio-printer to transform the bio-ink into printed corneas. The printing process took less than 10 minutes. Over 10 million people across the globe need surgery to prevent blindness resulting from infections and diseases, but there are not enough human corneas available to accommodate this need. The introduction of 3D printing to provide transplant-ready corneas means that, if medically approved for widespread use, in the near future there could be a constant supply of corneas. The creation of artificial corneas is an immense achievement for Newcastle University. Close to 5 million people are completely blind as a result of injury or illness, and with 3D printed corneas available for continuous production, such blindness could be avoided for generations to come. It is no easy feat to have an invention approved for medical use, and as a result, the 3D printed corneas will have to be thoroughly tested and perfected before they are ready to use as transplants. Although the corneas are not yet patient ready, the scientists are understandably proud of their achievement as they have proven it is scientifically possible to create corneas from stem cells and printing technology. Now that it has been established as possible, scientists can focus on trialling their invention and working with medical professionals to ensure its safety. The Newcastle scientists are not the first to produce human organs or tissues, but rather just the first to 3D print corneas. For example, researchers at Wake Forest University have said they have a 3D printer capable of producing bones, organs and tissues that could potentially be used as implants for living humans. Their printer works in much the same way as the Newcastle University printer. The Wake Forest printer deposits water-based solutions containing human cells.
Their printer is able to print tissues compatible with blood vessels, which means that the cells can receive the oxygen needed to survive. The products they have printed so far show no signs of cells dying in the tissue. The Newcastle University scientists have also been able to keep cells alive for weeks at room temperature. These incredible scientific breakthroughs have the potential to transform the painstaking transplant process whereby patients must wait until an organ or tissue becomes available. The reliance on the bodies of the deceased for this practice is exceedingly challenging for patients and medical professionals, as so often availability is scarce. Sight and good health are incredible gifts that life-saving donors bestow on the individuals who require transplants, but being able to carry out transplants without the need for an enormous waiting list would rehabilitate and even save the lives of many more people. ### Why AI Will Never Transform Recruitment [vc_row][vc_column][vc_column_text] Digital innovation is transforming temporary recruitment. But is it driven by artificial intelligence, asks Andrew Johnston at TempRocket? No matter what sector you look at, it seems it won't be long before artificial intelligence rides in like a knight in shining armour to transform it. It's almost like some industries are actually coasting along without innovating because they are simply waiting for the AI panacea. And with the lack of innovation in temporary recruitment, up until this year, you might be forgiven for thinking that the sector was a culprit. However, digital disruption finally arrived in the sector during 2018 in the form of online temporary recruitment platforms. By bringing a hitherto disparate industry together in one place online, these new disruptors in the recruitment space offer a quicker, more cost-effective and joined-up way to find and employ temps. These platforms are technologically advanced and innovative and look set to transform temporary recruitment for the better, providing a national or even global one-stop shop where hirers can connect directly with contractors and agencies to find the perfect temp cover in minutes. And of course, because they are digital they are available 24/7/365. The best versions of these platforms also include a bidding system enabling hirers to get the best deal from agencies. Plus they are completely free to use for hiring businesses. So has AI struck again? Are these cutting-edge recruitment platforms being driven by artificial intelligence? Well, the answer is not quite so straightforward. Yes, the platforms incorporate AI, but to say they are powered by the technology is taking things too far. These temporary recruitment platforms employ a specific form of AI known as computational intelligence (CI). They use this particular strand of AI to handle assignments autonomously for hirers and in some cases for agencies as well. The more advanced versions of the platforms combine inputs from hirers, contractors and agencies with their own intelligence to provide a streamlined workflow saving users time and money. This optimised workflow allows a booking to occur in under a minute. Unlike the AI that most of us are familiar with, which is based on hard computing techniques, CI is based on soft computing methods, which enable adaptation to many situations. Hard computing techniques follow binary logic, based on only two values (0 or 1), on which modern computers are built.
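As a toy illustration of that contrast – with thresholds invented purely for the example – the sketch below compares a crisp, binary availability rule of the hard-computing kind with a fuzzy membership function that grades a contractor's availability on a sliding scale.

```python
# Toy contrast between hard (binary) and soft (fuzzy) computing.
# The 8-hour threshold and the linear membership function are invented examples.
def crisp_available(hours_free: float) -> int:
    # Hard computing: binary logic, either available (1) or not (0).
    return 1 if hours_free >= 8 else 0

def fuzzy_available(hours_free: float) -> float:
    # Soft computing: a degree of availability between 0.0 and 1.0.
    return max(0.0, min(1.0, hours_free / 8))

for hours in (2, 5, 8):
    print(hours, crisp_available(hours), round(fuzzy_available(hours), 2))
```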
The problem is that the more random, variable or 'human' a process is, like recruitment, the less successful AI's absolute approach will be. Soft computing techniques, based on 'fuzzy logic', are more useful. In temporary recruitment, if we remove the human decision-making element from the system, we could, in theory, speed the booking process up immensely. If we don't need the contractor to confirm they can undertake an assignment because we've linked a computer up to their calendar directly, then we could save precious minutes in the booking process. If we used AI to predict when a company needs temporary workers, then we could have them booked in a matter of seconds with automatic bookings made across the system. This would be great, right? Except, removing the human from the decision-making process is a mistake in systems where human interaction is key, and the recruitment industry is a case in point. If we remove just one element from the booking process the entire system will collapse. We need the contractors to confirm their availability, the agencies to confirm their willingness to supply, and the hirers to confirm they need temporary workers. A contractor isn't going to want to have their schedule and calendar dictated to them by a machine. A hirer isn't going to want a machine second-guessing their staffing requirements. Therefore, involving the decision makers is key, and that's exactly what the new online temporary recruitment platforms achieve. The best ones combine inputs from agencies, hirers, and contractors with AI to provide a streamlined workflow saving users time and money. This optimised workflow allows a booking to occur in under a minute while continuing to engage with the key stakeholders. In a sector like temporary recruitment that relies on just as much subjectivity as objectivity, you have to question whether the common hard computing form of AI will ever be appropriate. Perhaps AI-powered facial recognition might at some point be used to monitor staff for signs of illness, such as colds, flu, etc, and then alert management to sign up a temp for the following day. The problem is that Debbie comes into work no matter how ill she's feeling and a cold certainly won't keep her away, whereas Bill takes a day off when he's got a hangover or at the earliest possible signs of man flu. How would AI understand people's individual complex personalities? Great recruitment requires a very human touch. Which is why it's a sector that AI alone will never completely automate.[/vc_column_text][/vc_column][/vc_row] ### The Rise of P2P Lending: What Are The Scopes Of This Fintech Innovation? [vc_row][vc_column][vc_column_text]Investment is something most of us know about. While the majority of the population rely on banks either to take out loans or to invest, a new term is buzzing across the globe. Peer-to-peer lending, delivered through Fintech apps, is what we will discuss in this article. Currently in its growth phase, P2P lending, or crowdfunding, has opened up a whole new set of investment opportunities for all stakeholders. P2P lending is trending, and next-generation investors are shifting from traditional channels to P2P lending apps to provide loans. No doubt, cutting out the banks and introducing a new platform signals innovation. But how this will play out in the near future is worth watching.
Given market trends and the fast pace of investment in Fintech innovation, it is imperative to keep an eye on what this industry has in store and how it could help redefine the industrial world. Before we move on to the specifics of Fintech apps and the future prospects of the peer-to-peer lending market, let's look at what we mean by the term P2P lending. P2P Lending - Unboxing Opportunities While you undoubtedly have an idea of what lending means, P2P or peer-to-peer lending is a deal made between two interested parties: one the borrower, the other an investor. A borrower is expected to register on the peer-to-peer lending app, verify their credentials and then post a loan request stating in detail the reasons why an investor should consider them. Investors, on the other hand, scroll through the loan listings and stop at the one that appeals to them most. The loan amount varies from person to person and depends on the reason for seeking a loan: personal loans range between £500-£35,000, whereas loans for setting up startups are dramatically higher. The future of Fintech will reflect how these deals play out. For some it might work in their favour, whereas for others the cards could turn unexpectedly. Reversals are nothing new in Fintech or peer-to-peer lending: where a borrower could be on the winning side, an investor might suffer a loss if the borrower fails to repay. Guessing at this stage would be premature. Fintech is innovating, and there are a plethora of opportunities for start-up founders to venture into peer-to-peer lending. Odds that Favour Fintech Innovation While the future prospects of peer-to-peer lending are multifaceted, there are plenty of benefits that tempt a business enthusiast to try their hand at the game of lending and borrowing. Greater Returns For a long time, most of us have looked to banks as the platform for providing loans, but their rates of interest are undeniably high. With P2P lending, both the borrower and the investor benefit: the borrower pays a lower interest rate, while the investor can earn more than they would from a bank investment. Narrowing The Process Of Application Banks have their own standards and policies for providing loans, with plenty of paperwork and frequent visits to the bank before you are granted your loan amount. P2P lending portals, on the other hand, streamline the entire process with an online application. All a borrower needs to do is set up a profile and request a loan, stating the interest rate at which they expect to repay the amount. Investors can then explore the listings to connect with the requests that look feasible. A faster process of funding Banks can take weeks before they agree to release a loan. By contrast, lending apps automate the process and provide access to funds within a week. Additionally, a user can request an amount as low as £10, which drives greater traffic and participation on the app. Pitfalls Of P2P Lending While there are many who favour investing in P2P lending, many others are concerned about security and the greater risk involved. The P2P lending market suffers a major drawback in terms of uncertainty: even though borrowers need a credit score before listing loans, some investors still prefer lower but more secure returns.
Genuine borrowers may benefit from this weakness, but it erodes the overall trust entrepreneurs have in Fintech investment. Scopes The prospects for Fintech innovation seem predominantly high, not just because of the industry's growing size but because nearly every sector has felt the impact of Fintech innovation. It has transformed financial services and will contribute to the disruption of the financial world. Industries such as RegTech, InsurTech, and WealthTech have already been hit by its storm, and the excitement around the technology does not seem to be fading. Considering current market trends, it would not be wrong to state that P2P lending has revolutionised the entire industry. It has also opened doors for start-up enthusiasts to build P2P lending apps to kickstart their businesses. How the future of Fintech transforms the digital world is what we look forward to seeing.[/vc_column_text][/vc_column][/vc_row] ### Healthcare Data Protection | A Vulnerable Fitbit Generation [vc_row][vc_column][vc_column_text]With smartphones and other data collecting devices, users may be able to decline to share much of their information, but it is inevitable that certain apps and databases will hold information about you if you are an avid smartphone user - whether you know it or not! Whether you use autofill to save and reuse your information to quickly fill in forms (i.e., your name and date of birth), or even your credit card details, or if you fill out a profile on a social networking site, you have given up a significant degree of anonymity. The most extensive source of data regarding a person's health is their smartphone. This can be useful if you fall ill, as a healthcare professional or a person who has come to your aid can quickly access your smartphone, if it is not password protected, and find health details through systems like Apple's Health app and CapzulePHP (an app that holds details regarding fitness, medication and more). In CapzulePHP, emergency information can be accessed via QR code and text forms even when a device is password-protected. Despite the usefulness and benefits of health apps, Privacy Rights Clearinghouse carried out a thorough study of more than forty mobile health apps, which revealed considerable privacy risks for users of the apps. Unbeknownst to the individuals using the apps to store and analyse their personal health details, the information appeared to be unencrypted, precarious and used by the developers of the apps as well as third parties. Privacy violations of health and fitness apps are by no means unheard of. In fact, in 2011 Fitbit mistakenly publicly revealed statistics of users' sexual habits. Although this controversy was undoubtedly the result of mishandling information or a complete accident, it is a reminder to users of such technology and apps to be more vigilant in their consideration of what they share. Of course, it is a well-known fact that the majority of people do not make the time to read the terms and conditions of apps, and although it should perhaps be the responsibility of the user to do so, more accessible and concise tick-box style questions to confirm consent could be a better alternative. Even then, in their study of health apps, Privacy Rights Clearinghouse found that only 43% of apps included a privacy policy. In an effort to protect more people from theft of their health data, programs have been instigated to approve health apps for safe use and disapprove others.
An example of this is the UK's NHS Health Apps Library, which consists of a carefully curated list of apps. Registered apps go through a process of evaluation to determine their clinical safety and how well they comply with data protection regulations. To feature in the list, app developers must disclose all data transmissions and register with the UK's Information Commissioner's Office (the enforcing body of the Data Protection Act). When apps that hold your medical details are hacked, the risk could be enormous. With the hacking of the NHS in May 2017, it is clear that healthcare information is seen as very valuable to the medical world as well as the hackers themselves. The motive behind the NHS ransomware attack appeared to be financial, and with access to personal health details, it is possible a hacker could find information that some people would not like the public to know about and could use this for blackmail. Healthcare technology, devices and apps help people to understand themselves from within - allowing them to adjust medication and exercise accordingly. Although advanced wireless IoT devices that collect health information are yet to see widespread use, there are several on the market that could also raise cybersecurity concerns. There are IoT blood pressure devices that test the user's blood pressure through an armband and then wirelessly connect to an app in order to store the health data. For example, a blood pressure device called the QardioArm uses Bluetooth to connect its monitor (to be attached to the upper arm) to a smartphone or tablet, which then records the results, such as pulse rate, to be automatically synced with the app. The results can then be sent on to a GP. The technology is incredible and gives users real insight into their well-being and health, but if a device such as this was not carefully protected from malware and hacking, in-depth and revealing information regarding your medical conditions and health could be accessed by the wrong people. For users of healthcare apps and devices, it is important to know whether the particular devices or apps you are using are compliant with data protection laws. For developers of healthcare apps and devices, cybersecurity must be taken seriously and upheld in order to protect the consumer as well as protecting the reputation of their app and/or device. Healthcare providers must also look into the apps and devices they are adopting for use with patients in order to ensure the security of data collected. Essentially, from developer to user, everyone must be vigilant about data protection.[/vc_column_text][/vc_column][/vc_row] ### Disaster resilience | Managed services Data-hungry technology is used throughout manufacturing in many ways – to store data, run automated machinery on the plant floor, track inventory and support distribution. This technology is collecting and storing data intertwined with production processes and when manufacturers suddenly find themselves unable to use those processes, production stops and so does revenue. Disaster resilience is a way to prevent such stoppages and protect the data services production depends on.   With ever increasing network connectivity achievable and the rapid advancement of operational intelligence technologies within the SCADA application space, monitoring these systems and maintaining the high availability levels critical to productivity is becoming progressively complex and, historically, plant management's concern has been production – not IT.
The reality is that to manage the infrastructure and applications in this environment, process industry businesses have found themselves scrambling to find the know-how needed to ensure consistent monitoring and disaster resilience. Corporate IT departments rarely understand the specific requirements of the industrial IT environment, whilst on the plant floor, there is a lack of technical expertise about IT infrastructure and significant resource constraints. As a result, where monitoring is happening, its use is limited. There is nobody with the time or expertise to interpret the data. What do those error logs actually mean? Prevention is essential Manufacturers can retrain their teams or bring newly skilled digital talent on board, but in the fast-paced manufacturing environment, there is often neither the time nor the budget to support that route. So, how can plant environments deliver predictive and preventative monitoring and maintenance of their automation infrastructure and combine this with a robust, proactive response to rectify abnormal situations which could impact production? One increasingly popular option is to take advantage of managed services to support these operations and tasks, as this helps to plug the skills gap that is often prevalent within organisations looking to improve their availability levels by outsourcing all of the monitoring required for effective preventative maintenance. Outsourced managed services can also help to protect system health and eliminate the risk of unplanned downtime. Unlike production engineers, external providers have the dedicated resource to ensure continuous services and high availability of data. Crucially, they can also interpret the data and use it to inform improvements and efficiencies and, should the worst happen, they can work quickly to minimise any negative impact on operations. The latest managed service offerings for industrial IT environments include performance management technology which enables the monitoring of the health and wellbeing of both the physical attributes of hardware and the process attributes of software, allowing for abnormal situations to be avoided. Plant floor data can then be transferred via a secure web connection into a secure Cloud environment, while real-time data can be displayed on an insight dashboard, giving engineers access to unprecedented insights into their operations.   Disaster resilience Outsourcing disaster resilience might sound counter-intuitive at first glance, yet in industrial environments backups are often taken and stored in offsite facilities but rarely checked for effectiveness. To implement a successful disaster resilience plan, it is important to understand how long a business can be out of production and how much data would be lost. These limits are captured by the recovery time objective (RTO), which defines how long the business can afford to be out of production, and the recovery point objective (RPO), which defines how much data it can afford to lose. The reality is that manufacturing organisations often inherit IT infrastructures specified at the corporate level which are not robust or secure enough for the plant environment. A recent survey revealed that 40% of manufacturing businesses were less than confident in their organisation's ability to get up and running again after a critical IT failure. Hybrid Cloud Technology In contrast, the latest solutions use enterprise-class Hybrid Cloud technology to improve resilience and give users greater protection over their systems and data.
Workstations and servers are protected locally across the Local Area Network (LAN) to the appliance, and data is then automatically transmitted to the secure cloud, improving fault tolerance whilst reducing the reliance on bandwidth speed. These solutions deliver a very low RTO: backups can be taken and checked by the external team, and instant local virtualisation means that should a machine fail, the plant has a replacement running within minutes. Skills shortages in manufacturing and a continued blurring of the lines between OT and IT are making it difficult for plant environments to drive innovation, so manufacturers in all industries should strongly consider outsourcing areas of their IT operations to enable them to focus more closely on their core objectives. A new breed of managed services can allow manufacturers to combine performance monitoring, proactive alerting and disaster resilience to help reduce downtime, lower capital and operating costs and increase competitiveness. ### The Cloud's Fight Against The Dirty Dozen [vc_row][vc_column][vc_column_text]The shift to use public clouds to support digital transformation has created the biggest and most urgent security problem CSOs face. Business applications from web analytic platforms and domain name services, through app stores and marketplaces, to mission critical ERP will all come under more intense fire as hackers look for a way in, motivated by greed, a social cause or politics. In fact, I and many of my fellow security experts will not be surprised if there's a successful cloud attack of such scale in 2019 that every business will be forced to re-evaluate their use and security. One of the biggest contributors to this prediction is the current global cyber landscape. It's a boom time for hackers and Nation State activity will most certainly capitalise on it. Organised groups will create widespread disruption, either as solo endeavours for profit or in conjunction with armed conflicts. And we should expect that communications systems, the backbone to life and trading, will be an ongoing target. I anticipate we will see attempts to bring about multi-million dollar losses and should expect more governments to be embarrassed, shamed and manipulated, as well as face physical disruption to internet services in 2019 too. That clearly has repercussions for anyone using the cloud and should be prompting a review of how secure and stable the cloud and telecoms provisions are. The reality is that hackers are continually automating their attacks, making them more complex and more lethal. In this environment, moving applications to the cloud is actually making people less secure, not more, since the attack surface is even greater. But it's not going to stop companies from doing it – the cloud is now an imperative for agile computing and business. As we adopt more cloud computing, the cyberattack surface grows; multiple clouds running applications with different configurations and security vulnerabilities. Radware sees attacks on cloud apps every six seconds. More investment in security solutions isn't something the board can dispute. We must mitigate the risk. Investment is definitely needed to turn the cloud from the Wild West to a secure environment for business. Of course, the reason we use the cloud is because it unlocks so much of the promised potential of IoT devices.
But that too brings concern and, based on developments we've seen on the dark web, it leads me to predict that we'll see more attacks that harness the power of IoT to create swarmbots and hivenets to launch larger, more efficient attacks. In the case of swarmbots, hackers will turn individual IoT devices from 'slaves' into self-sufficient bots, which can make autonomous decisions with minimal supervision, and use their collective intelligence to opportunistically and simultaneously target vulnerable points in a network. Hivenets take this a step further and are self-learning clusters of compromised devices that simultaneously identify and tackle different attack vectors. The devices in the hive can talk to each other and can use swarm intelligence to act together, recruit and train new members to the hive. When a hivenet identifies and compromises more devices it will be able to grow exponentially, and thereby widen its ability to simultaneously attack multiple victims. This is especially dangerous as we roll out 5G, as hivenets could take advantage of the improved latency and become even more effective. So which way should you turn when it comes to securing the enterprise next year? Well, firstly start with the fact that anyone who is forewarned is forearmed, and so understanding these risks is a shot in the arm. If you're adopting IoT, moving more applications to the cloud or find yourself reliant on your cloud provider for security, then stop and ask yourself where's the weak link? If your service provider is hit, are you? And what if a supplier is hit – are you the next domino to fall? The next thing to consider is what type of attacks are you likely to encounter and if you close the gaps will you be resilient? As part of this, I'd recommend you note the dirty dozen of attack types – the top 12 most likely methods hackers will use:
#1. Advanced persistent threat or APT
#2. Organised cyber crime
#3. Ransom
#4. DDoS Groups
#5. Hacktivists
#6. Patriotic hackers
#7. Exploit kits
#8. Trojans
#9. Botnets
#10. Insider threats
#11. Defacements
#12. Consumer tools
Next is to work out the technology you'll need to automatically detect, mitigate and defend in real time. Much is spoken of AI in the fight against cyber attacks. I agree, it's a weapon we must have. But it can't be relied upon exclusively – not just yet. There is still much work to do until we can do that, and besides there is no substitute for a human making good critical decisions and planning ahead. So if there's one thing I urge you to do, it's not to rely solely on technology. It is to also ensure your team know about the dirty dozen, understand the consequences of decisions they or other parts of the business make and put in place a plan that combines technology and human intelligence. I believe it is an under-utilised weapon in the fight against cyber crime, and we must invest in both the technology people will use and the skills they need to use it intelligently if we are to win the war.[/vc_column_text][/vc_column][/vc_row] ### Unplanned Effects of Modern Payment Methods [vc_row][vc_column][vc_column_text]One of the unplanned effects of modern payment methods is that we are becoming less aware of the price we pay for things. As we tap and go through transport turnstiles, and at the many thousands of locations where we use plastic to pay for our coffee, our lunch or whatever, the price of the things we buy has been relegated to the background.
For those fortunate enough to have enough wealth not to account for every penny, these transactions disappear in the plethora of digital noise that surrounds our lives. For those less fortunate, however, it makes the business of managing their money far harder. Cynically, it also allows retailers, transport providers and others to increase their prices without the consumer always being fully aware of the insidious increases. The primary driver of these changes has been the introduction of contactless payments. This increasingly successful payment method comes with the expectation that we won't receive a receipt for the transaction, and, as a direct result, the chance to reflect on the price of individual items purchased has been removed. A further cause has been the introduction of card-on-account, one-click shopping services, such as those offered by Amazon and many others. These distance the buyer from the act of consciously considering the cost of the item and make shopping a therapeutic activity rather than a financial transaction that has to be paid for from available funds. And the ever-growing move to link on-line shopping services to instantly arranged lines of credit only serves to further separate the act of shopping from its financial consequences. For those in society able to manage their finances effectively - those happy souls who studiously go through every payment card statement and account for every penny - this situation offers no real threat and many opportunities. But for others, it is a world that offers unaffordable temptations and stealthy price rises. One way of rebalancing the system would be to ensure that an electronic record of what has been purchased, and, importantly, the price paid – an electronic receipt - is easily, if not instantly, available to all, wherever and whenever a payment product is used. This would require either a standardised approach that was adopted by all providers, or a collaboration between providers and an agreement to operate a shared, centralised service. Collaboration, or standardisation, is needed to protect both the consumer and the merchant from the need to have multiple ways of sending and receiving receipts. Commercial attempts to deliver e-receipts have struggled to overcome this problem - and businesses would benefit greatly from an ability to deliver their service from within a standardised, shared environment. Standardisation should also address the form and design of the core elements of both e- and paper receipts. For those who do check their receipts against their statements, or for those completing expense claims, the current wide variations in receipt design, and in the presentation of key information, are frustrating and time consuming. And many receipts contain information that is largely irrelevant to the consumer, unnecessarily consuming paper that simply ends up in landfill sites. A move to e-receipts on a nationwide basis would save millions of paper rolls, reduce retailer costs, improve pricing transparency and support money management for all consumers. It would suppress devious price rises and provide an opportunity for innovative businesses to provide a variety of technology-based services built on the back of the information flows created. It may also allow the tax authorities to gain further insight into expenditure, and make it easier to assess tax. In the past, the UK has been very adept at creating shared infrastructures to support such endeavours.
But whilst Open Banking is designed to address access to accounts, and may offer some support for this idea, it cannot address the need for purchase-based information. Therefore, without a specific and common approach based on UK regulation, and changes to the law governing receipts, the benefits of electronic receipts cannot be realised for the country, its merchants, consumers or tax authorities.[/vc_column_text][/vc_column][/vc_row] ### What Can Blockchain Do For Data Storage? [vc_row][vc_column][vc_column_text]Blockchain. It's a concept that has a great deal to offer – including everything from giving logistics companies constant sight of the location of their supplies to ensuring that proper royalties are paid for music. It feels like the possibilities are increasing all the time, and now it could even prove to be the catalyst for changing the recognised face of data storage. Typically, we think about storage as something that's located on-premises, in the public cloud or in a colocation facility. But blockchain offers the prospect of creating secure yet decentralised storage. It offers the chance for what could be defined as 'a storage marketplace'. An electrical grid, but for storage Storage can easily become a source of real frustration for businesses, who often find there is never the right amount available for what they need. What's ironic in these circumstances is that whilst enterprises are continually caught trying to secure more capacity, storage providers and data centres are scratching their heads wondering what they'll do with their own excess space. Thankfully virtualisation and storage enhancements have mitigated these issues to a certain extent, but spare capacity is still being wasted when it could be put to far better use. The time has never been better to think differently. What if there was a way to allow organisations to buy and sell storage, in a fashion similar to a traditional marketplace? It could be likened to an electrical grid, where power companies buy and sell electricity to each other to match appropriate supply and demand. Shortfalls could be made up as well as monetising excess storage. The end user would still get the services they need, but it would be easier to manage the storage. Akin to the principles of an electrical grid, the marketplace for storage would operate with both storage providers and consumers, all underpinned by blockchain. Blockchain could enable a seamless transition for customer data to the most appropriate location, based on parameters like performance, policy and service level agreements. It would be a new, distributed model for storage. What's wrong with using the cloud? With the use of public cloud continuing to grow, questions could be raised as to why you would want to use this distributed model, given the renowned flexibility offered by cloud technology. But the hyperscale public cloud, for all its popularity, does have limitations. Speed of service, bandwidth and latency – all of these can have an effect. Cloud service providers are also vulnerable to the damage that can be caused by malicious acts and outages, which can be especially harmful to the businesses paying to rely on their services. The distributed storage model eliminates these kinds of risks by allowing organisations to quickly and easily access extra capacity as needed. As a result, businesses can operate more efficiently and bring in a much clearer return on investment.
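As a purely illustrative sketch of that 'electrical grid for storage' idea – the provider names, prices and selection rule below are all invented – a marketplace broker might match a capacity request against providers advertising spare space, something like this:

```python
# Hypothetical matching of a storage request to providers with spare capacity.
# All providers, prices and the cheapest-first rule are invented for illustration.
from dataclasses import dataclass
from typing import Optional

@dataclass
class Offer:
    provider: str
    free_tb: float
    price_per_tb: float   # monthly price, arbitrary units
    latency_ms: float

OFFERS = [
    Offer("datacentre-a", free_tb=120, price_per_tb=4.0, latency_ms=18),
    Offer("datacentre-b", free_tb=40, price_per_tb=3.2, latency_ms=35),
    Offer("edge-site-c", free_tb=8, price_per_tb=6.5, latency_ms=5),
]

def match(request_tb: float, max_latency_ms: float) -> Optional[Offer]:
    """Pick the cheapest provider that satisfies capacity and latency needs."""
    candidates = [o for o in OFFERS
                  if o.free_tb >= request_tb and o.latency_ms <= max_latency_ms]
    return min(candidates, key=lambda o: o.price_per_tb) if candidates else None

print(match(request_tb=25, max_latency_ms=30))  # -> datacentre-a
```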
Three factors for success  Making the storage marketplace a reality starts with implementing high performance distribution via peer-to-peer content sharing – a method which is already in use at organisations like Microsoft. The key advantage comes from the reduction of pressure and stress on one single hub, as well as being able to offer higher speed of service by receiving smaller parts of files from multiple different locations. Security is the next concern that is never far away when storage is discussed. Confidentiality, availability and integrity are the top three considerations here, and any storage method that can't guarantee these will not be useful for long. Confidentiality can be achieved with well-established techniques like encryption. The storage marketplace, particularly if it makes use of peer-to-peer functionality, can also very easily guarantee accessibility. By being able to ensure that every piece of data is stored in multiple and redundant locations, no one individual storage node is a liability. Integrity is about ensuring the content and format of the data remains the same as it's stored, shared and received. The strong user access controls enabled by blockchain technology (and the ledger it underpins) can achieve this. The final concern is the marketplace itself. The distribution model needs a way of tracking every sale and purchase of capacity to truly work. Transactions also need to be secured for anyone to benefit or have trust in the system. Where blockchain fits Blockchain is the technology at the heart of ensuring both the security of the distributed model as well as the operation of a marketplace. Blockchain guarantees that every action is recorded as data is segmented and distributed across the grid. With a known ledger in place, activity outside it can be prevented. Blockchain also supports improved availability, because the physical location of the data can be decentralised. When capacity is removed from the marketplace the data could be automatically moved elsewhere. Finally, blockchain will provide proof of ownership, not only confirming that the data exists but also the contracts between the appropriate buyers and sellers. By providing clear evidence of transactions, blockchain can eliminate the need for manual tracking of exchanges and create greater confidence in the whole system. Overcoming sensitivities about storage The advent of the cloud has seen storage already go some way to becoming a commodity. A marketplace for storage would therefore seem like the next logical step. Adopting such a new distributed model, however, is much more of a hurdle for businesses to overcome in their mindset. The traditional point of view is that secure and trusted data must be stored centrally. After all, this information is increasingly the sole revenue source of businesses, so why should they work to radically change how it's stored if a perceived level of safety and compliance can be assured by sticking to the status quo? More progressive thinking will take time to develop. The marketplace model is not far off, as the technology needed to make it a reality already exists. For businesses willing to take the plunge and adopt a more forward-thinking approach, there could be major opportunities. More responsive networks, boosted efficiency and a reduction in latency are just the start, not to mention a new untapped revenue source.
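To illustrate the transaction-ledger idea discussed above in the simplest possible terms – field names and values are invented, and a real chain would add consensus, signatures and distribution across many nodes – each storage purchase can be hashed together with the previous entry so the record cannot be quietly rewritten:

```python
# Stripped-down, hypothetical sketch of a hash-chained ledger of storage trades.
import hashlib
import json

def add_entry(chain: list, transaction: dict) -> None:
    """Append a transaction, chaining its hash to the previous entry."""
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    payload = json.dumps({"prev": prev_hash, "tx": transaction}, sort_keys=True)
    chain.append({"prev": prev_hash,
                  "tx": transaction,
                  "hash": hashlib.sha256(payload.encode()).hexdigest()})

ledger: list = []
add_entry(ledger, {"buyer": "acme-ltd", "seller": "datacentre-a", "capacity_tb": 25})
add_entry(ledger, {"buyer": "acme-ltd", "seller": "edge-site-c", "capacity_tb": 2})

# Tampering with an earlier transaction changes its hash and breaks the chain.
print(ledger[1]["prev"] == ledger[0]["hash"])  # True
```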
A new storage paradigm is right around the corner.[/vc_column_text][/vc_column][/vc_row] ### How Creative Businesses Can Get The Most Out of AI's Immense Potential [vc_row][vc_column][vc_column_text]Businesses should be applauded for dipping their collective toe into the enigmatic new world of AI. As someone who creates AI applications for marketers and ad agencies, I've seen some very successful initiatives, like the French multinational advertising and public relations company, Publicis, investing big in a bespoke piece of AI management software they call Marcel. Built in collaboration with Microsoft, Marcel aims to coordinate the holding company's global employees. With 80,000 staff on the books, it makes perfect sense to use AI to manage all the data. But for every good use in ad land, there are - unfortunately - many more that are wide of the mark. The advertising world has recently been rocked by the fact that Lexus used AI to write its latest TV commercial script. Perhaps it was a nod to AI's creative potential. But to me, it whiffs of a PR stunt and shows that businesses still have far to go in figuring out how best to capitalise on AI's immense potential. It can be challenging to develop creative AI projects that give people something genuinely useful, rather than just using AI to show off some forward-thinking creds. The hurdles are manifold: Firstly, you need tonnes of data, but the data that's widely available is rarely of use for creative initiatives. And the type of deadlines that people are generally used to make it difficult, if not impossible, to develop a working machine learning system from scratch. Add to that the fact that people tend to be so pumped by AI, they often ask their partners to reach further than the technology allows. These issues, when combined, often lead down the path of failure. Bearing in mind AI is still a relatively nascent field, it's not surprising that we're seeing a few misfires. But there are some clever ways to put creative businesses on a more assured path to success: AI learns from data. The more it can learn from, the better the AI. If businesses don't want to limit their projects to the data that's already publicly available, they'll need to collect it themselves. But collecting the necessary volume on your own is an arduous and lengthy task. Crowdsourcing, however, can be the answer. Huawei's Sound of Light project used AI to analyse patterns in the Northern Lights and then create a piece of music that would interpret nature's most spectacular light show as sound. This project used a crowdsourcing tool that enabled data on people's perception of auroras to be collected at scale. Within just a day, data was sourced from over 1,000 people, each providing multiple data points. This resulted in a 30k+ dataset size, meaning development could begin without delay. Data augmentation techniques can also help speed up this process. In the case of Huawei's aurora images, perception doesn't change when the images are mirrored. They're still the same auroras. But by slightly rotating the images, the dataset can be immediately multiplied. Granted, it's not as efficient as gathering a new set of unique images. But it can help create a dataset that's big enough to get the AI learning. That said, you have to be realistic as to what can be achieved within the time frame. One month isn't enough to create a revolution. Bona fide, innovative AI usually requires more time than other digital projects.
Without proper time investment, the result will be, at best, mediocre. So instead of developing an entire AI from scratch, consider incorporating existing, pre-trained AI. This is how Royal Caribbean’s SoundSeeker app – where AI is used to compose a unique piece of music for people’s holiday snaps – was created. Had we tried to create AI that looked at raw images and then drew conclusions, it would’ve been too weak to use in isolation. But when paired with a parallel pre-trained model – in this case, Google Cloud Vision to understand what objects, places or faces are seen within the images – the AI can suddenly operate on high-level data. Unless a business has the luxury of millions of references for its custom AI to learn from, a partnered approach will massively boost the AI’s initial understanding of the world. Then, and only then, can an extra layer of project-specific understanding be added. Another incredibly helpful approach is the concept of word embeddings. If an AI is supposed to draw conclusions based on image content, using Google Cloud Vision to retrieve the labels such as "person", "boat" or "beach" may not be enough. This is because the AI has to learn by heart how each of the words, completely meaningless to it to begin with, relate to the expected outcome. It doesn't get how "person" is related to "man", or that "coffee" and "drink" may be the same thing. But word embeddings convert words into vectorized numerical representations that preserve the "meaning" and "relationships" between the actual concepts that the words represent. So the mathematical distance between a vector representing "coffee" and "drink" will be very small, while the distance between the embedding of "coffee" and "beach" will be large. In other words, the vector representing "coffee" and "drink" will be very similar in mathematical terms, despite the words not even having a single common letter in them. Such approach lets the AI work on "concepts" and "meaning" rather than on meaningless words that us humans convert subconsciously. A common mistake when it comes to creative AI projects is thinking that an AI can be left to do the entire job on its own, for example: input image, get final music. But developing an AI that creates ‘something’ out of ‘nothing’ is incredibly difficult. More often than not, it's better to use AI as part of the process, rather than trying to make it the only component. So instead of training an AI to create an entire piece of music, why not get it to only output the notes? That way, by asking a composer to turn those notes into music, the project can incorporate a human layer. But perhaps the biggest mistake is using AI for AI’s sake. Sometimes there's a huge engineering effort that goes into developing an AI project, but this isn’t always clear in the final output. The immense effort that’s involved is, generally speaking, grossly under-estimated by the end user; and sometimes it’s completely invisible. I hope this is blindingly obvious, but AI should only be used when it’s of genuine help. Although we should encourage businesses to adopt new technologies, there’s nothing worse than seeing an initiative, AI or otherwise, use the latest new tech just to jump on the bandwagon. That said, AI has so much to offer the business world – whether it’s in the form of Marcel-style practical support, efficient precision medical scan interpretations or building a fun and creative digital toy – that businesses really do have to jump on board. 
They just have to make sure that 'board' isn't a bandwagon. ### Incident Enrichment | IBM Resilient https://vimeo.com/309472599/5b374bf47e Cyber analysts have to deal with massive mountains of work and alerts. Watch as Jessica Cholerton talks you through what all of these different alerts mean internally and externally, and how analysts should categorise them to best tackle the incident. Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient ### Automation In Incident Response | IBM Resilient https://vimeo.com/309472799/36f3c151fb Jessica Cholerton joined us in the studio to speak about automation in incident response: the areas that you could and should automate, along with some of the areas where you should be more cautious. Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient ### AI Revolution | Human input is key Humanity's progress to date has depended on our ability to create new and innovative tools that help us to achieve more as a species. The introduction of AI into our lives will, without doubt, represent the next giant leap forward for humanity, but it will also massively disrupt our society, just as industrialisation, automation, and digitisation have done before. While previous technologies - the steam engine, assembly line, and the computer - each disrupted industry at the time, AI will soon be able to work more effectively than us in the vast majority of sectors. What makes AI and the fourth industrial revolution different is that none of us can ignore the impact that AI will have on our lives. Oh, the humanity! The ad-man, lawyer and doctor of the '50s might have been able to ignore the direct impact of mass production, but almost all modern-day work that can be made into a process can be automated by AI, including digital marketing, case law, and basic medical diagnosis. However, just as computers opened up entirely new sectors and fields of research, and mass automation allowed the factory worker to step into a new thought-based role, AI will also provide us with a new range of jobs based on our human ability to adapt, improve, and think creatively. A by-product of our last technological leap was the creation of the modern office, and since then we have been conditioned to think of ourselves as leaders or followers, as a manager or an office drone. These dichotomies are entirely false and hold people back from achieving their full potential. Within all of us, there is the drive and ability to have genuinely innovative ideas, to develop them to completion, and to lead a project as part of the greater good. To take full advantage of AI, we need to remember what we can do, and stop declaring defeat before the disruptive change has even begun. Plato and Prehistoric thinking If we go back to the usual starting point and forgive the caveman cliché, then we can see AI as the tool that it is. When our distant ancestors first wondered if a sharp branch might be used to kill animals, or if fire might be used to cook the meat, they did it to improve the overall capacity of the group.
Each advance came from abstract thought and initiative, which was used to achieve more, die less, and become the most successful apex predator in history.   As the product of this raw intelligence, modern humans instead use our advancements to make sure we don’t have to put in too much effort. A particularly famous cave devised by Plato allegorises the restriction of man’s consciousness, proposing that we cannot conceive of anything more than our reality until we push ourselves into a new one. Just as our instinct-driven ancestors used their creative thinking to develop tools and move out of their cave, AI will drive us out of our mental cave (a ‘that's not my job' mentality) and make us use our initiative and emotional intelligence to its full potential.   However, completely changing the way we work won't be easy, and companies need to change now to encourage those attributes that make us wonderfully human.   We can make this work The traditional organisation, where managers make decisions and employees do as they are told, creates a process-driven workforce that is not able (or allowed) to see the bigger picture. In our company, Pod Group, we realised AI could handle any process a hundred times better than we can and restructured the entire company to focus on our most human attributes. These attributes, Wisdom, Emotional Intelligence, Initiative, Responsibility and Development (WEIRD), are central to our WEIRD strategy, which encourages everyone to feel responsible and confident about their actions as owners of the company.   Oddly enough, if you treat people like adults who are good at their work, they want to do it. People can choose where they work, and when to take a holiday, and all information about the company is entirely transparent - including salaries decided by each person - so everyone can know the costs and returns of any project. This way everyone feels respected and satisfied at work, and can see how every process fits together to benefit the company, which is precisely what we need to work with AI successfully.   AI will force change Moving to a management structure that lets people decide what is best for their work will also help companies become more agile and streamlined. AI will change many things, so companies must prepare now to make sure they are ready, but pulling ourselves out of a ‘leader or follower' mindset can only be good for the future of humanity in its own right.   Humans didn't get to the top of the food chain by accident; we used each of our technological advances (fire, the wheel, steam power) to reach a new phase in our development. AI will allow us to break through to a better world, but we need to help ourselves first to take full advantage of our most advanced technology yet. ### Mitre attack | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310593996/7a2405893c[/embed] A Mitre Attack framework is a way to map the techniques used by an attacker. Craig Roberts joined us in the studio to talk about the mitre attack framework in security operation platforms. [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### Choice not Compromise: The Best of Both Worlds [vc_row][vc_column][vc_column_text]Today, enterprises may feel like they have to compromise in order to bridge the divide between on-premise and public cloud. 
It’s a divide that is now set against the backdrop of the increasing imperative for enterprise application mobility, driven by the likes of artificial intelligence, machine learning and data analytics. While each approach carries its own benefits, what organisations really need is flexibility and choice when it comes to infrastructure. Workloads and applications should live where it makes the most sense for them to live. This should be an infrastructure choice based on business objectives, not constrained by what the technology can do or where it lives. Flexibility is key here – business objectives change all the time, and organisations need the freedom to adjust and adapt where necessary. Overcoming Barriers to Hybrid Cloud The obvious solution is a hybrid approach. That said, this is easier said than done, and poses many challenges itself. Firstly, most enterprise applications run on-premise, and migrating them to the cloud can be tricky. By that same token, most web scale apps are built in the cloud and so migrating them on-premise can be equally difficult. Today’s organisations cannot afford disruption or downtime, making seamless and agile movement across different clouds a business necessity. In addition, on-premise and cloud storage have different features and APIs, developing applications that run seamlessly across both has been nearly impossible. Modern Requirements If these barriers were not enough to cause an enterprise headache, there’s the added complexity that faces all modern organisations. Data is now considered the “new currency” for businesses, but keeping this in check and using it to its full advantage is no easy feat. To give some context, according to IDC, the volume of data generated by the Internet of Things (IoT) alone will be as big in 2025 as the amount of ALL data created in 2020. That’s a tremendous amount of data, which will dictate the needs of a business’s infrastructure. Enterprises will need to have the ability to seamlessly move an application born in the cloud to an on-premise environment (or vice versa) once this data dictates a different set of requirements. At present, businesses are being forced to compromise, as opposed to having the ability and agility to design an infrastructure that works for them. No longer EITHER/OR… AND is now possible: Fortunately the tide is changing, and soon business will no longer have to make this compromise. We are entering into the era of AND, not OR. Solutions are now becoming available to help businesses take full advantage of multiple clouds – no business should be forced to compromise. The features that organisations will be able to take advantage of include: Enable the building of all clouds: Organisations will be able to build private clouds on-premise or in hosted environments to deliver storage-as-a-service with the performance, availability and ease of use that every business needs and deserves. Create freedom to run applications anywhere: Businesses will have the ability to run applications in on-prem or hosted environments, yet also run them seamlessly in the public cloud. Having a single storage platform will deliver consistent storage services, resiliency and APIs; meaning applications can be built once and then run anywhere in the hybrid cloud model. A new model for data protection: flash-to-flash-to-cloud: The traditional disk and tape-based data protection model is failing to keep up with the demands of the cloud era. 
Data sizes are growing, customers expect global availability, and having PBs of backup data stuck in a vault is no longer acceptable. The combination of flash + cloud allows customers to re-invent data protection, enabling both fast local recovery from flash and low-cost long-term retention and data re-use in the cloud. Data is the lifeblood of any organisation and there needs to be flexibility for businesses to turn this data into value. The era of the cloud divide is coming to an end, ushering in the age of choice and enabling organisations to build a data-centric architecture. These worlds can now co-exist, bridged seamlessly with a common storage layer, so applications and data are free to move easily between owned and rented clouds. This hybrid approach means that applications can be developed once and deployed seamlessly across owned and rented clouds, giving customers the ultimate flexibility to turn data into value wherever that data resides. The ultimate result for enterprises? They can take advantage of the agility, flexibility of a hybrid environment to develop applications faster and have the freedom to free applications from being linked to any one type of infrastructure; allowing them to create a data centric architecture that’s right for the business, and that puts enterprises on the right path for future success.[/vc_column_text][/vc_column][/vc_row] ### Food Industry | Better cost efficiency is key for production 2018 is the year of the blockchain, and it’s much more than a buzzword. The technology is seeing unprecedented levels of enterprise adoption, governmental support, and entrepreneurial zeal, but it is the year of the blockchain because of the tangible benefits that are resulting from the blockchain’s rise to prominence. One, perhaps unexpected industry benefitting from this technology is the food industry. There is mounting evidence of the blockchain’s obvious importance. In June, The Wall Street Journal released its list of tech companies to watch. It’s a future-focused compilation of companies that is developed by experts and industry analysts. This year, five of the twenty-five companies listed are blockchain initiatives. Moreover, earlier this month, South Korea became the first country to integrate blockchain technology into the public sector. Finally, entrepreneurial zeal is producing more new blockchain platforms than ever before. In this first half of 2018, there were more than 400 new digital tokens launched that brought in more than $10 billion in startup funds for these initiatives. For all of its use cases, one of the ways the blockchain is most frequently applied is toward supply chain management. Its decentralized network, tokenized data or value transfers, and unchangeable records make it especially well suited for this work. As research and consulting firm, Deloitte, recommends, “Using blockchain in the supply chain can help participants record price, date, location, quality, certification, and other relevant information to more effectively manage the supply chain.” While this has implications for virtually every business, it is a right on time development for the food industry. Pressed by a growing population that demands higher yields and more affordable food, the food industry is looking for cost efficiency anywhere that it can find it. The Current Global Food Supply Chain The global food supply chain is robust, and it provides a constant supply of diverse foods to people around the world. 
According to the United States Department of Agriculture, the U.S. imported more than $137 billion in food last year, which is nearly triple the amount from just two decades ago. As the report indicates, the scope of these imports increased as dietary patterns changed, the population diversified, and out-of-season produce received year-round interest from consumers. Of course, the complex supply chain that manages these imports adds a lot of miles to a meal. For example, The WorldWatch Institute estimates that meals in the U.S. "travel between 1,500 and 2,500 miles from farm to table." That's 25% further than food travelled a decade ago. It's not uncommon for food to travel through 7-10 transition points before it reaches the consumer, and these transfers are not making food more affordable. Indeed, last summer, global food prices rose 17% from early 2016 levels. The Bureau of Labor Statistics notes that fruits and vegetables are experiencing the most significant itemized price increase. To be sure, the entire food operation is only going to become more complicated. The World Economic Forum estimates that global food production will have to increase by 50%-100% by 2050 to keep up with population growth and increased demand. A Better Method This is where blockchain technology can step in. While it can't solve all of the food industry's problems, it can vastly improve its supply chain management to ensure that the food being produced is effectively and efficiently shipped to market, while verifying its origin, safety, and freshness. The blockchain can bring cost efficiency to the food industry in several ways. #1 Specified Food Industry Recalls Unfortunately, food contaminations are bound to occur, and that food has to be recalled by the distributor. Last year, there were 131 recalls totalling nearly 21 million pounds of food. Because current supply chain data can't pinpoint an item's origin or travel points, companies cast a broad net to ensure that all the contaminated food is removed from distribution. The average recall costs a company $10 million, and those costs are, to some extent, passed on to the consumer. Using blockchain technology, food companies can identify an item's precise point of origin, and they can examine its entire transfer route. Contamination can occur at any point, and the blockchain can pinpoint the cause with precision, which can save lives and dollars. #2 Direct Distribution Using the blockchain and its accompanying digital payments, farmers can connect directly with shipping agents to bring their food to market. In this way, farmers can operate on an independent and direct basis with transport agents. Using things like smart contracts and digital tokens, the blockchain can facilitate direct collaborations that are predicated on trust and validity. This methodology can reduce the transportation transitions that make food more expensive. More direct collaborations can bring food to market more efficiently and cheaply. #3 Authenticity Verification Organic food is a growing sector of the food industry. Organic food consumption is expected to grow by 14% by 2021 as health- and safety-conscious buyers pay a premium for more naturally produced food. The "organic" label comes with strict production and tracking standards to guarantee that the food is actually organic. These processes increase the already high cost of organic food, and the blockchain can help streamline that process.
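As a purely illustrative sketch of the traceability that underpins these use cases (the batch IDs, holders and dates below are invented), each transfer of a batch can be stored as one record, and the full route of a suspect item reconstructed on demand:

```python
from dataclasses import dataclass

@dataclass
class Transfer:
    batch_id: str   # hypothetical identifiers, not a real labelling scheme
    holder: str     # who held the batch at this step
    location: str
    date: str

# A toy provenance trail for one batch of produce.
trail = [
    Transfer("BATCH-001", "Farm co-op", "Valencia", "2018-06-01"),
    Transfer("BATCH-001", "Exporter", "Rotterdam", "2018-06-04"),
    Transfer("BATCH-001", "Distributor", "Manchester", "2018-06-07"),
    Transfer("BATCH-001", "Retailer", "Leeds", "2018-06-08"),
]

def trace(batch_id: str, records: list) -> list:
    """Return every recorded transfer of a batch, so a recall can target only affected stock."""
    return [r for r in records if r.batch_id == batch_id]

for step in trace("BATCH-001", trail):
    print(step.holder, step.location, step.date)
```

On a blockchain, each of these records would be signed, time-stamped and shared between the parties rather than sitting in one company's database.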
Its unchangeable, auditable records can track and verify every step of organic food’s lifecycle to provide consumers with confidence in the products that they buy. For regulatory officials, this technology streamlines the complicated verification process, which means that food producers can spend fewer resources on important but costly regulation.   An Industry In Need The blockchain’s use cases seem as endless as their users’ creativity. As a result, it’s receiving a lot of hype, but hype doesn’t diminish actual value. Blockchain technology will have a felt impact on numerous industries, and none feels as timely as making the food industry more affordable. Food is one of our essential requirements, and the industry and consumers are already feeling the pressure of that. The blockchain can increase cost efficiency for producers, which can lower costs for consumers. That’s a good start to providing what everyone desperate needs. ### Breaking Out Workflows part 1 & 2 | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310595628/92c196cfbf[/embed] Watch as Craig Roberts discusses how they map out an incident response process and the best way to start thinking about designing such a process. [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### Businesses Brace for 2019’s Big Tech Changes and Challenges [vc_row][vc_column][vc_column_text]It’s that time of year again. Businesses are dusting themselves down from a turbulent, fast-paced and opportunity-rich 2018 as they start to map out the year ahead. Now is the time to take stock and prepare for another calendar cycle of relentless forward momentum. As ever, there will be challenges that endure and new tech advances to capitalise on. Here’s a snapshot of emerging trends and developments businesses can ill-afford to ignore if they want to stay relevant, innovative and profitable in 2019. The future is multi-cloud Corporate cloud literacy is becoming an operational prerequisite as technological progress accelerates in EMEA. With a multi-cloud strategy, enterprises can assign workloads to public clouds that are best suited for specific tasks, including speed, agility and security. If harnessed with intelligence and foresight, the expansive opportunities afforded by multi-cloud scenarios will benefit bottom lines and earn customer trust through service excellence.  According to Foresight Factory’s recent F5-sponsored Future of Multi-cloud (FOMC) report, the expert consensus is that those delaying on multi-cloud exploration and adoption will eventually become irrelevant. In the coming years, the FOMC report believes that upfront costs will become less obstructive as cloud vendors continue to demonstrate compelling use cases. As a part of this shift, technologies such as artificial intelligence (AI) and machine learning will be fundamental to driving higher levels of automation and rendering existing obstructions to multi-cloud obsolete. Application services to the fore As businesses invest in digital transformation, it is vital to modernise application portfolios and infrastructures. More than ever, it’s important to architect a system that balances effective controls with innovative freedom. Application services emerged from the disaggregation of capabilities formerly integrated into devices such as Application Delivery Controllers (ADCs). They are now software-defined, loosely coupled, and easily consumed. 
It is finally possible to attach individual services to applications in real-time based on specific needs. A major benefit of application services is that they enable IT to enforce consistent service quality. This means an additional layer of security, availability, and reliability – even if applications don’t have such in-built capabilities. As 2019 rolls into view, businesses will demand services that follow applications wherever they go. This is critical at a time where much of the user experience is digital, delivered via the cloud, and built by developer teams outside of the IT organisation. App environment understanding needs to improve  Unfortunately, businesses worldwide are still struggling to understand, optimise, and protect their rapidly expanding application environments. According to the F5 Labs 2018 Application Protect Report, as many as 38% of surveyed organisations across the world have “no confidence” they have an awareness of all their applications in use. The report, which is the most extensive of its kind yet, also identified inadequate web application security practices, with 60% of businesses stating they do not test for web application vulnerabilities, have no pre-set schedule for tests, are unsure if tests happen, or only test annually. The pressure has never been higher to deliver applications with unprecedented speed, adaptive functionality, and robust security — particularly against the backdrop of the EU General Data Protection Regulations (GDPR). Ultimately, businesses that fail to grasp their application environment big picture will struggle. A company’s reputation is always perilously predicated on a comprehensive security architecture. Technologies such as bot protection, application-layer encryption, API security, and behaviour analytics, as we see in Advanced WAFs, are now essential to defend against attacks.  Millennials wield more influence The oft-perpetuated myth that millennials as lazy, entitled, disloyal, and difficult is patently nonsense. This is especially true in the context of a looming IT skills crisis and a general need for more tech-savvy workforces across all industries. The generational gap is frequently and tediously exaggerated. There are plenty of new recruitment and employee nurturing nuances for business leaders to consider, but none should be incomprehensible. Pre-conceived notions or misty-eyed nostalgia shouldn’t cloud judgements. There is a cutthroat battle going on to identify and secure the workforces of tomorrow. Business leaders clinging to the status quo need to rethink their stance. Multi-purpose attack Thingbots threats on the rise  Towards the end of 2018, F5 Labs fifth volume of the Hunt for IoT report revealed that IoT devices are now cybercriminals’ top attack target. This could prove problematic in the long-term. Lax security controls could even endanger lives as, for example, cellular-connected IoT devices providing gateways to critical infrastructures are compromised. Indeed, the report posits that there are growing concerns that IoT infrastructures are “just as vulnerable to authentication attacks via weak credentials as the IoT devices themselves.” According to F5 Labs, 2018 ended with threats looming from thirteen Thingbots, which can be co-opted by hackers to become part of a botnet of networked things. This includes the infamous Mirai botnet. Distributed Denial of Service (DDoS) remains the most common attack. 
However, attackers in 2018 began adapting Thingbots under their control to encompass additional attack methods including installing proxy servers to launch attacks from, crypto-jacking, installing Tor nodes and packet sniffers, DNS hijacks, credential collection, credential stuffing, and operating fraud trojans. Businesses need to brace themselves for impact. IoT attack opportunities are virtually endless and Thingbot building is more widespread than ever. Unfortunately, it will take material loss of revenue for IoT device manufacturers, or significant costs incurred by organisations implementing these devices, before meaningful security advances are achieved. Therefore, it is essential to have security controls in place that can detect bots and scale to the rate at which Thingbots attack. In addition, bot defenses at the application perimeter are crucial, as are cutting-edge DDoS solutions. Super-NetOps Emerging threat landscapes and multi-cloud possibilities are changing the game. Users across EMEA are demanding rapid, safe and multifaceted services. 2019 will see pressure growing on traditional IT teams to embrace programmability and enable the orchestration and agility needed to succeed in a digital economy.  Regrettably, there is a lingering disconnect when it comes to collaboration between NetOps, SecOps and DevOps teams. This could be remedied in the coming years as the concept of “Super-NetOps” professionals gains traction. With training programmes already rolling out worldwide, we can expect a surge in a new breed of systems thinkers actively and collaboratively supporting organisational needs for rapid, automated application development and delivery. Increasingly, network professionals will learn how to apply their expertise in new ways, becoming integrated service providers to their organisations rather than siloed ticket takers.[/vc_column_text][/vc_column][/vc_row] ### Impact of AI | The impact of Artificial Intelligence on the workplace Digitalisation and the new technological possibilities that artificial intelligence (AI) brings are driving the biggest social and economic changes since the industrial revolution. These are associated with opportunities and risks. Without the right political, economic and ethical framework conditions there is a risk of uncontrolled development and a negative impact of AI. AI ultimately affects all industries more or less heavily. AI or specific forms of it, such as machine learning (ML), can be used in a wide variety of application scenarios. First and foremost, they will be increasingly found in areas where large amounts of data need to be analysed and evaluated. AI systems are better equipped than humans for analysing massive amounts of data, correlating the data with various reference points and thus to create a better base for decision making.   AI is shaping the way we work AI has already entered many areas of our working and private lives. Already classic examples of this are Alexa or Siri. By means of smart algorithms, machines today are capable of doing incredible things with facial and speech recognition. With error rates of under five percent, many systems can perform better than humans. In image recognition, which is also used on Facebook or in self-driving cars, computers are now far superior to humans. And online retailers or search engines, for example, use ML to optimise the user experience and to create buying recommendations. 
In short, AI and ML are already fixed - and also largely accepted - components of our day to day lives. The impact of AI, when it comes to our working environments, AI, ML and digitalisation bring a significant sea change, as they alter or expand the human activity spectrum. This means employees’ work profiles and what is required of them will change considerably. A typical example is industries where more and more machines and robots are used, such as the manufacturing industry.   AI brings risks With regard to the use of AI, there are two principal risk categories: on the one hand, there are risks connected with society and humans and, on the other hand, there is a risk of dependence on technology. Many are concerned about machines and AI taking over human activities. This is not just about apocalyptic fears and Terminator-like scenarios, which have also been cited repeatedly by Tesla founder Elon Musk, but about more elementary, existential fears. People ask themselves questions such as: How do I fit into the digital future when intelligent robots take over my job? Do I still have the right skills? The older generation especially is very sceptical about the technological development and increasing use of smart machines. However, it should be remembered that the world of work will definitely change. Jobs that are indispensable today may not be relevant in a few decades’ time. Currently, the impact of AI is seeing a great need for data scientists and developers, but in a few years from now, machines might be able to do these jobs better and faster. The technical development and the concerns of society make it imperative to start socio-political initiatives that develop and implement not only the technological opportunities but also educational and digital policies. Without a concept, Britain will weaken its international competitiveness. The second risk category concerns the question "What decision-making powers do we give machines and AI?" There is a risk of becoming too dependent and unable to reproduce how a machine or algorithm has come to a conclusion. Neural networks, which have millions of connections and interactions, make decisions that they cannot adequately explain due to a lack of communication skills and the complexity of their "thought processes". Machines are not emphatic narrators, they know more than they tell us, and they do not understand our information needs. In any case, in some critical areas and rapid decision-making processes we will eventually have to rely on information from machines – take for example the decision that the immediate shutdown of a nuclear power plant is necessary, since the complex trains of thought involved in such events cannot be traced and understood fast enough by humans. It is also often overlooked that even machines can have prejudices based on information from the past. An example would be a recruiting system that supports the selection of optimal candidates by analyzing thousands of historical decisions made by the company. If a recruiter in the past - for whatever reason - preferred men over women, then this "prejudice" is also inherited by the machine and becomes part of the algorithm. In addition, there are already AI applications today that develop their own successors without human input, as in the example of Google's AutoML project. The impact of AI could mean it creates a dynamic that can no longer be controlled, which is indeed disturbing. And, it stresses the urgent need for a moral and ethical framework.   
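A toy example makes the inherited-bias problem described above tangible. The data here is entirely synthetic and the "model" is deliberately naive, but the mechanism is the same one that affects real recruiting systems: whatever pattern sits in the historical decisions is reproduced in the scores.

```python
from collections import defaultdict

# Synthetic historical hiring decisions: 1 = hired, 0 = rejected.
# The past recruiter hired men far more often than comparably qualified women.
history = [
    ("male", 1), ("male", 1), ("male", 1), ("male", 0),
    ("female", 0), ("female", 0), ("female", 1), ("female", 0),
]

# A naive "model" that simply learns the historical hire rate for each group.
outcomes = defaultdict(list)
for gender, hired in history:
    outcomes[gender].append(hired)
scores = {group: sum(results) / len(results) for group, results in outcomes.items()}

print(scores)  # {'male': 0.75, 'female': 0.25} - the old prejudice, now encoded as a score
```

Real systems are far more sophisticated, but the lesson holds: the bias lives in the training data, so auditing and correcting for it has to be an explicit design step rather than an afterthought.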
Impact of AI and cybersecurity Before any use of AI technology, elementary questions must be answered. The area of cybersecurity, which uses AI and ML for better threat recognition and the prevention of financial or operational damage caused by cybercrime, is a good example. Threat Detection and Threat Intelligence are the keywords here. The following questions must be considered before AI is applied: • What do I want to achieve? • Which ML methods best support my goals (such as supervised learning, unsupervised learning, decision trees, or deep learning) • What impact does AI / ML have on my organization and employees? • What risks might be created by the changes? • How much decision-making power do I allow the machine? • Where and when do we expect human intervention? Overall, industry and society are still at the beginning of the change process brought about by AI and ML. New opportunities will drive innovation and create new ideas. We will see mistakes, but also unexpected successes. However, one thing must not be overlooked: The effects of AI are serious. Humanity, according to the World Economic Forum, faces the fourth industrial revolution - with social, economic and societal implications. Therefore, politics and business must work together - here and now - to create the appropriate framework conditions that will set the course for future generations. ### Threat Intelligence Platform | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310622492/2c575e887e[/embed] Craig Roberts discusses how threat intelligence platforms and security orchestration automation platforms interact with each other and draws this out in an easy to follow diagram. [/vc_column_text][/vc_column][/vc_row] ### Protecting Your Customers From Persistent Banking Fraud [vc_row][vc_column][vc_column_text]With fraudsters rapidly growing in sophistication, the Financial Ombudsman has announced “it’s not fair” to automatically blame customers for falling victim to banking fraud. With more and more stories surfacing where scammers have been able to bypass simple security checks and move a victim’s money around freely, what can banks do to better protect their customers from the risk of fraud? Regulations like Open Banking and PSD2 have been put forward as an innovative way to deliver an improved service and better deals to banking customers. Part of the legislation calls for a strong authentication process whenever a payment is initiated, or remote account access is requested. This process, known as SCA (strong customer authentication), is intended to reduce online payment fraud and try to help banks converge on the authentication and fraud controls. However, whenever new regulations are introduced, the whole industry is placed under considerable pressure to comply, which drives some organisations to push-back on the regulator. This discourse often leads to exceptions for organisations that can show they are managing the risks within certain constraints; allowing them to avoid the initial risks around non-compliance, without having to make large investments and risk impacting their customers. This means that for every change to these regulations going forward, the organisation will have to react accordingly, forever being one step behind regulatory changes. And yet, a reactive approach isn’t necessarily the cheapest. It’s been proven time and time again that taking a reactive approach to regulation means that resources and technologies have to continually be stretched to meet requirements. 
Not to mention if there’s ever a non-compliance issue, there’s a risk of fines and loss of customer confidence, and you end up having to implement the maximum standards anyway. There is a risk that over time banks will begin to suffer from regulation fatigue, seeing more and more regulation coming into place without the benefits being recognised, they may start to look for the cheapest implementation to combat this. By taking a short-term view, organisations can often end up taking a costlier route over time, than if an initial investment has been made at the beginning. Often the impact this has on customer experience can be considerable, as they encounter more friction due to a standardised catch-all approach taken to security. Whilst the compliance team might be happy, the CISO and the operations teams are paying the price with continued fraud risks and lowering customer satisfaction. It’s certainly a paradox - how do you comply with regulations without increasing the risk of fraud whilst simultaneously maintaining a positive customer experience – and prove it? When it comes to banking, we understand there is a conflict of interest; consumers want to get on with their digital lives without unnecessary layers of security every time they click through a payment journey, however, they also want to know their money is safe and secure and are willing to take the necessary steps to ensure this. A simple code sent to the consumer to verify their identity just isn’t enough in today’s world, as demonstrated by banking lobbyists UK Finance who reported £731.8m lost in unauthorised financial fraud last year. Banks need to be able to intelligently decide when to add additional layers of security, by combining the benefits of both hard and soft biometrics with machine learning, they can be truly confident the person moving money is who they say they are. Technology that learns the unique behaviours of customers, such as typing or swiping techniques, online habits and facial recognition will help determine whether someone’s behaviour is within their normal pattern. Where these identifiers throw up anomalies the bank then knows to introduce further tests. Also, consumers need the freedom to choose their identification method, eliminating the risk of isolating an entire demographic due to technology or restrictions on ability. Combined, this approach avoids a static rules-based method that we have seen can be easily replicated by the bad guys. Financial services firms must also take advantage of better intelligence. This way, banks can protect their customers while still providing the seamless, friction-free service they expect from their digital experiences. Advances in Artificial Intelligence and Machine Learning means tools have been developed which can remove the need for additional authentication methods by utilising Secure User Authentication methods. Rather than asking the user for information, this system relies on learning the customer’s patterns and behaviour, including location - where is the access request being made; device - is the access request being made from an authorised device; and behaviour - assessing the user’s interaction through the log-in process, from key strokes to the ‘style’ of their swipe. With each individual interacting with their device in a different way, this forms a second-layer of unbreakable authentication. 
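The decision logic behind this kind of risk-based approach can be sketched very simply. The signals, weights and thresholds below are illustrative assumptions, not any bank's or vendor's actual scoring model.

```python
def assess_login(known_device: bool, usual_location: bool, behaviour_similarity: float) -> str:
    """Combine a few behavioural signals into a risk score and decide whether to add friction.

    behaviour_similarity: 0.0-1.0, how closely typing/swiping patterns match the customer's profile.
    """
    risk = 0.0
    if not known_device:
        risk += 0.4                                # unfamiliar device
    if not usual_location:
        risk += 0.3                                # request from an unusual place
    risk += (1.0 - behaviour_similarity) * 0.3     # behaviour drifting from the learned pattern

    if risk < 0.3:
        return "allow"       # frictionless: the signals match the customer
    if risk < 0.6:
        return "step-up"     # ask for an extra factor only when the evidence calls for it
    return "block"           # too many anomalies: treat as suspected fraud

print(assess_login(known_device=True, usual_location=True, behaviour_similarity=0.9))    # allow
print(assess_login(known_device=False, usual_location=False, behaviour_similarity=0.4))  # block
```

In practice the weights would be learned from historical fraud outcomes rather than hand-tuned, but the principle is the one described above: add friction only when the behavioural evidence justifies it.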
Although we’re not quite ready to say goodbye to passwords completely, we are well on our way to a world where we can more intelligently and securely identify each individual – facilitating the move towards a solution that finally reduces fraud. By investing in the right tools, the identity paradox is solved, placing banks and their customers in a much better position. It’s clear there’s a third option when it comes to compliance around regulations, one that helps grow a business rather than just defend it.[/vc_column_text][/vc_column][/vc_row] ### Deploying containers securely and at scale with Multicloud Enabler powered by Red Hat and Juniper [vc_row][vc_column][vc_column_text]Multicloud should be accessible for more than the worlds largest cloud providers. Here is how Juniper Networks and Red Hat can make that happen. The move to hybrid and multicloud environments are more than an exercise in deploying technologies. A switch or a router, or an orchestration or management system. It is not enough to lift and shift an application from on-prem servers to servers in the cloud. This transition represents something more than swapping one technology for another. It is a fundamental change in how enterprises approach IT. As with any change of this magnitude, success will not likely be determined by the high level objectives, but rather how well nuances, complexities and ground realities are managed. Like a cloud Most transitions in technology take time. With cloud, we are not looking at an evolutionary step in isolation. We have customer successes that provide a glimpse into what the possibilities might hold. Customer challenges to deliver new services faster and to create differentiation through technology are indeed solvable - and at scale. We believe the key to navigating this change is democratising cloud grade operations. Merely replicating technology choices without the accompanying shift in operational skill, discipline and culture will not yield great results. This is because the cloud is more than just technology and also more than just public cloud. Cloud is a new architecture and operational model and hybrid and multicloud architectures can offer customers the greatest flexibility. Red Hat as an operational change agent The operational change before us is fundamental. It necessitates agents of change that understand the future, approach innovation in new ways and have the market presence to facilitate the migration to a new way of architecting and operating IT infrastructure. Red Hat is perhaps one of the best change agents in our industry. Having driven open source to the enterprise, Red Hat has been able to strike at both the technological and cultural barriers to progress, as well as across the technology stack. Whether it is Linux, Ansible, OpenStack, Kubernetes or other open source technologies, Red Hat has become a standard for bringing open source innovation into the enterprise in ways that are designed to make it more consumable, reliable and secure. Juniper believes in the same principles of open source innovation, open APIs, open standards and software defined automated infrastructure. Of course, no single vendor should be able to drive a movement like this, which is why Juniper and Red Hat are working together to lead this transition to the next era in IT. Diversity and uniformity Customers can run workloads on bare metal, in VMs or containers in private clouds or public clouds such as Amazon Web Services (AWS), Google Cloud, IBM Cloud or Microsoft Azure. 
They can use both proprietary and open source tools to manage workloads, monitor the system, increase security or troubleshoot. The options are seemingly endless. All of these choices are a good thing, as it gives power back to the customer to adopt the technologies that make the most sense for their organisation, to deploy apps in the best environment for their use or to shift workloads as resource needs change. This is the goal of hybrid and multicloud environments. To enable this diversity of choice, creating a consistent foundation and approach to operational consistency across IT environments is paramount. Operational practices should not have to change in meaningful ways depending on where an application is hosted or for whom a service is deployed. Customers are asking for a simplified operator experience and importantly, that simplification should cut across compute, storage and networking. Juniper and Red Hat Juniper’s collaboration with Red Hat is an important step in helping to make this simplification a reality. Our Multicloud Enabler solution combines Juniper’s Contrail Enterprise Multicloud with Red Hat OpenShift Container Platform and Red Hat OpenStack Platform. By bringing these products together, we are aiming to simplify the deployment and operation of multicloud infrastructure. Earlier this year, we announced this initiative. Today, we are pleased to share that the Multicloud Enabler solution powered by Red Hat and Juniper is generally available.[/vc_column_text][/vc_column][/vc_row] ### The year Wearables 'ECG/Cardio-tech' becomes ‘Warnables’ ECG-tech. EKG-tech. Heart-tech. Cardio-tech… all words which we have seen and heard around the new Apple Watch 4 thanks to its use of tech which will read the wearer’s heart rhythms in a revolutionary way.   However, few people know much about ECGs, especially outside a medical setting – what are they, how the technology works or how it might be applied more widely across the health landscape.  B-Secur has been developing ‘heart-tech’ for 15 years. It is now the leading heart-tech company in Britain and works with companies like Cypress Semiconductor and Analog Devices.   It is also the company behind HeartKey, the technology that uses ECG biometric algorithms for authentication, identification and a range of health measures – meaning they are well placed to explain the role ECG could play in the future.   What is ECG? ECG stands for electrocardiogram. It is a direct test that measures the electrical activity and rhythm of your heart. Just like fingerprints and irises, everyones cardiac rhythm is utterly unique, which means that your ECG can identify that you are who you say you are. However, because your heartbeat is never endingly dynamic and says so much about how you are, it can go way beyond ‘just’ identification (which is really where other biometric authentication devices end).  Until recently, ECG readings could only be taken in clinical settings and required twelve electrodes placed across the body.  As Apple has now shown, sophisticated and accurate ECG readings can now be taken via one body placement. Health application Seeing this technology delivered much more widely in a health setting is now almost certain as other wearable and health brands as well as the medical community, wake up to its potential. Gyms & Sport Having a healthy heart is important. 
If you have a well-performing and efficient heart, you will be in a good position to avoid all sorts of health issues and be able to perform well in whatever kind of athletic endeavours you enjoy, from Ironman triathlons through to gardening. It is one of the reasons we work out or take part in sport. For those interested in getting or staying fit, understanding your heart is key. It will help you understand when you are pushing yourself enough, or not enough. Indeed, one of the US's most successful gym chains is called Orange Theory Fitness: based on a coloured heart rate chart, it recognises that a great workout will see you in the 'orange zone' for as long as possible – not in the red zone (pushing yourself too hard) and not in the yellow zone (not hard enough). So, the metric that matters more than the amount of weight lifted, sets and reps, distance run, body fat percentage, or weight gained or lost is what is going on in your heart. Of course, wearables from Garmin, Fitbit and a host of other manufacturers have been able to tell you your beats per minute for a while now, and it's useful information. But ECG/EKG tech goes a step further: it moves trackers from 'fitness' to 'medical', with wearers getting the same level of accuracy that professional medics are used to seeing. It also turns the device from 'helpful' into a genuinely intelligent supervisor, keeping an eye out for weaknesses and inconsistencies and alerting accordingly. Medical heart help Coronary heart disease is the UK's number one killer: 160,000 die from heart and circulatory diseases, 73,000 die from coronary heart disease and 42,000 die prematurely from cardiovascular disease, according to HEART UK. Helping to reduce these numbers with improved cardiovascular disease management and prevention could save the NHS tens, even hundreds of millions of pounds. When it comes to cardiovascular disease management and prevention, patients need to commit to heart-healthy physical activity, diet, medication and self-monitoring, activities which traditionally go largely unmeasured. Wearable ECG tech instantly overcomes this, allowing continuous health monitoring for both the patient and their team of cardio-professionals. Most notably, this can impact on atrial fibrillation (1), an issue which can go undetected and can be difficult to manage. Short-term monitoring and occasional visits provide an unhelpfully limited amount of data, which can result in serious complications. However, use of ECG in wearable devices gives medical teams the ability to monitor on an ongoing basis, helping to prevent strokes, manage symptoms and reduce hospitalisations. In fact, so effective can this technology be that inconsistencies in heartbeat rhythm can actually alert medical teams to likely catastrophic situations before they strike. Head and heart But measuring and monitoring heart performance in order to treat and prevent heart-specific issues is not where ECG technology ends in a medical sense. For instance, heart rate variability (HRV), the variability in the timings of beats, has been shown to have possible relations to both dementia (2) and depression (3). The establishment of this link could revolutionise the identification, prevention and treatment of both diseases, helping millions of people, and can now be done via non-invasive wearable technology.
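For readers wondering what a metric like HRV actually involves, here is a minimal sketch of one common formulation, RMSSD, computed from the intervals between successive beats. The interval values are invented for illustration; a real device would derive them from the ECG waveform itself.

```python
import math

def rmssd(rr_intervals_ms):
    """Root mean square of successive differences between beats - a standard HRV measure."""
    diffs = [b - a for a, b in zip(rr_intervals_ms, rr_intervals_ms[1:])]
    return math.sqrt(sum(d * d for d in diffs) / len(diffs))

# Illustrative beat-to-beat (RR) intervals in milliseconds from a resting wearer.
intervals = [812, 790, 835, 801, 820, 795, 810]
print(f"RMSSD: {rmssd(intervals):.1f} ms")
```

A wearable computing this continuously could flag sustained changes to the wearer or their clinician, which is exactly the kind of ongoing, non-invasive monitoring described here.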
Beyond fitness & wellness With ECG metrics for wellness constantly developing we will see the technology slip evermore into our healthy lives, but also into other areas – like our cars. The applications are easy to imagine; a door handle which recognises your heart beat upon touch and unlocks the car instantly, but refusing to do so for your friendly local car thief; a steering wheel which recognises your heart through your grip and which reads your signs, notifying you or acting autonomously when it senses you nodding off while driving; an in car entertainment system which automatically adjusts to your Spotify playlists when you get in and re-adjusts when your teenage daughter borrows your car. ### Role Based Access Control (RBAC) | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/315656601/91ac0f2c67[/embed] Role Based Access Control is critical in maintaining privacy and security over the investigation of incidents. An incident response system needs to maintain privacy at all times due to the information it holds. Join Andrew Yeates from IBM Resilient as he discusses Role Based Access Control [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### Distinct Differences Between The AI and ML [vc_row][vc_column][vc_separator][vc_column_text]Over the past few decades, the world has changed in a ton of different ways, but none bigger than when it comes to technology. Things most wouldn’t have imagined to be possible even a few decades ago, are completely possible and widespread today. This is only going to continue to occur as technology is still advancing frequently. Technology helps us in many different ways from helping or companies be more efficient, helping us organize our personal lives and solving a ton of life's other problems. For example, there are apps that can control the lights in our homes, tools to monitor distributed Java applications and so much more. It seems that every week there is a new innovation that is continually pushing the industry forward. Some of the biggest innovations in tech over the last few years have without a doubt been artificial intelligence (AI) and machine learning (ML). They aren’t brand new, so to speak, but are two of the most exciting advancements in the space and have a lot of buzz surrounding them. But despite what you might believe AI and ML are not actually interchangeable terms for the same thing. In fact, there are actually many distinct differences between the two. With that in mind, this article is going to look at both AI and ML, and the differences between them. What is Artificial Intelligence? Artificial intelligence represents the overarching concept of being able to incorporate the intelligence and smart decision-making skills of humans into machines. It is an incredibly large and broad topic and dates back to a conference at Dartmouth College in 1956, which is considered the birthplace of AI. Artificial intelligence is usually put into two categories, general and narrow or applied. General AI is capable of exhibiting all areas of human intelligence such as problem solving, understanding human language, recognizing things and more. Narrow AI is great and doing certain things, but will be very limited in others. So a machine that builds a product very fast and efficiently automatically, but can’t do anything else, would be a good example of a narrow AI. 
As you could imagine, general AI machines are much less common as they are capable of doing so much more and thus are much more difficult to create. What is Machine Learning? While many believe machine learning is the same as artificial intelligence, as we mentioned, this is not the case. In fact, machine learning is actually a subset of artificial intelligence. This field of AI is primarily focused on giving machines and computers the ability to learn. Like artificial intelligence, the term “machine learning” was also coined over half a century ago. Artur Samuel coined the term back in 1959 and he defined it as the “field of study that gives computers the ability to learn without being explicitly programmed”. In order to achieve AI through machine learning, you need to provide a ton of data to the algorithm and allow it to learn about the information being processed. Machine learning is widely used today by a variety of different companies such as Facebook, Google and Netflix. The idea of machine learning itself is based on something called neural networks. The idea of neural networks is quite complicated, but they are essentially networks that are built to train and/or learn. Simply put, neural networks help to allow computers to mimic the brains of humans without the bias and with increased speed and accuracy. Once perfected, the network will help the machine to learn and improve itself without any sort of intervention from humans. The Differences Between AI and ML Just to wrap up and conclude the article, we thought we would review and clearly list some of the distinct differences between AI and ML that we touched on throughout this article. The differences between AI and ML include: Machine learning is actually a subset of artificial intelligence and is also a way to achieve artificial intelligence. Artificial intelligence is a much more broad and general concept than machine learning. AI as a technology allows a system to demonstrate human-like intelligence, and machine learning uses mathematical models and data to make decisions and the more data included, the better the machine learning will be. As a result, machine learning doesn’t have any “real” intelligence, so to speak. In conclusion, hopefully, this article has helped you understand a little bit more about artificial intelligence and machine learning, and the differences between the two of them. They will likely continue to grow and be used in a much wider fashion and be adopted by many more product or companies. While they have some things in common, they are not the same and recognizing the difference is important if you ever want to use these in your personal lives or for your business.[/vc_column_text][/vc_column][/vc_row] ### The Future of Data Centres [vc_row][vc_column][vc_column_text]The world consumes an enormous amount of data every single day—about 2.5 quintillion bytes of data every 24 hours, and now even more so with the advancements in Artificial Intelligence and the Internet of Things. Because of this, data centres have become a vital part of our lives. Our devices are becoming more high-tech and are producing more data than ever before. In fact, 90% of all data today was created in the last two years alone. All of this data goes into data centres. Although Artificial Intelligence and the Internet of Things are creating a lot of this data we are producing, these same advancements will be an integral part of data management and an important part of the future of data centres. 
This article looks at the future of the data centre ecosystem and predicts what's in store for the industry. 5G Networks Will Become a Reality As you can clearly see, we are producing data at exceedingly high rates. We are also processing this data quickly and will need to continue to do so because of the advancements in technology and the data these devices produce. 5G networks can help improve data transfer speed, reliability, and connectivity. Verizon is currently using an unofficial version of 5G for home Internet services, but a global standard version will be launching in 2019. More edge-based systems will likely be a by-product of the upcoming 5G networks. Edge-based systems will decrease latency and offer higher speeds, especially for those living in more rural areas. The projected speed of 5G will change the way data centres are built, and connections will be distinctly faster than they have been in the past. And because of the projected speed of 5G, artificial intelligence could also play a major role in data centre technology. Edge Computing Edge Computing will be another advancement that changes the way data centres function. Edge adds small data centre hubs between current data centres and the user. By adding a gateway connection point, the user experience and connection become faster and stronger. Because of Edge Networking, the way data centres are built will also be different. We will see more micro data centres built into preexisting towers. These micro data centres will be handled similarly to current data centres, but they will be located closer to the businesses within the data network. Latency issues will be a thing of the past. When Edge Networking was in its earliest stages, many industry experts (including Data Center Frontier) believed that Edge would cancel out the Cloud, and only data centres would benefit. But we predict that all three will work together, making the world's connectivity stronger than ever before. The Internet of Things will also make Edge Networking a plausible part of future data centres. The Internet of Things Optimizing how a data centre is managed can be tricky—but new technology is making it easier to improve the way data centres are run. The Internet of Things will improve security, data flow, storage requirements, and access capabilities, making data centre managers' jobs easier. Everyday tasks such as updating systems, configuration, patching, and monitoring will be managed by the Internet of Things. Running a data centre with the help of IoT can also ensure the data centre is more environmentally friendly by finding the best settings in real time. IoT is becoming more popular in our homes, but it will also be an integral part of data centre management shortly. Artificial Intelligence Will Be a Standard Artificial Intelligence has already been implemented within some data centres around the world, and because of all the positive benefits it offers, we believe AI will become a standard in the data centre world. AI has already helped Google cut its data centre cooling bill by 40% by optimizing its servers. AI can also be used for security purposes. AI is a very important advancement in technology, especially when it comes to data centre management. Predictive analytics and statistical algorithms are two of the main ways AI will be used to optimize how a data centre runs. And just like IoT, AI will also help manage temperature, security, ventilation, and other potential hazards.
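As a rough illustration of how predictive analytics might be applied to data centre management (a hypothetical sketch only, with invented numbers, not a description of any real deployment), a small model can learn how server utilisation and outside temperature relate to cooling demand and flag when intervention is likely to be needed. It assumes Python 3 with numpy and scikit-learn installed.

```python
# Hypothetical sketch: predicting cooling load in a data centre from simple telemetry.
# Assumes Python 3 with numpy and scikit-learn installed; all numbers are invented.
import numpy as np
from sklearn.linear_model import LinearRegression

# Historical telemetry: [server utilisation %, outside temperature in Celsius]
readings = np.array([
    [35, 18], [50, 21], [65, 24], [80, 27], [90, 30],
])
# Observed cooling power drawn at those times (kW)
cooling_kw = np.array([120, 150, 185, 230, 270])

model = LinearRegression().fit(readings, cooling_kw)

# Predict tomorrow's cooling demand from a forecast and raise a flag early
# if it looks like exceeding the cooling plant's assumed capacity.
forecast = np.array([[85, 32]])
predicted = model.predict(forecast)[0]
print(f"Predicted cooling load: {predicted:.0f} kW")
if predicted > 250:
    print("Warning: schedule extra cooling capacity or shift workloads.")
```

Real systems, such as the Google example mentioned above, use far richer telemetry and far more sophisticated models, but the principle is the same: predict the condition before it occurs and act ahead of time.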
AI-managed data centres could potentially take the place of human data centre managers in the future, potentially saving businesses a lot of money. Dac, the first AI-powered data centre operator, has already been introduced. LitBit's new technology uses an IoT system to diagnose problems within the data centre, including water leaks, cooling issues, and loose wires. Conclusion Given the rate at which we are consuming data (2.5 quintillion bytes every 24 hours), we will need future data centres to be bigger, stronger, and faster. The technology advancements mentioned here will bring data centres into the future. Artificial Intelligence, the Internet of Things, and Edge Computing will be a vital part of the future of data centres.[/vc_column_text][/vc_column][/vc_row] ### Crisis Management | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310608080/0afae566f8[/embed] Cybersecurity is now a board-level issue, and looking at previous cyber attacks it is clear that they affect the business continuity of any organisation. Now is the perfect time to live and breathe crisis response plans. Join Craig Roberts from IBM Resilient as he discusses crisis management [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### The Dangers of Unsanctioned Shadow Apps [vc_row ][vc_column ][vc_separator ][vc_column_text ] The threat posed by unsanctioned shadow apps is a growing concern for businesses everywhere. Shadow apps is the term used to describe any applications that haven't been cleared by a company's information security team, but which employees choose to use anyway. Because these apps are not sanctioned, they usually aren't monitored or secured in the same way that approved apps are, making them vulnerable to exploitation by criminals and/or insider threats. This article will look at some of the most prevalent areas for shadow apps and the dangers they pose to the wider organisation. Browser Extensions Browser extensions are historically difficult to secure but pose a significant threat to data security, making them a perennial favourite amongst cyber criminals. A compromised browser extension can be used to deliver malicious URLs, turning that browser into a potent cyber weapon. Every day, Google is forced to remove dozens of such browser extensions from its Chrome Web Store, and that's just one vendor. Many recently discovered malicious extensions have been loaded with malware used for cryptocurrency mining and click fraud campaigns. Cryptocurrency mining in particular can have a devastating effect on an organisation's network, with the amount of traffic generated causing major performance issues and running up big electricity bills. Unsurprisingly, those behind such cryptojacking extensions aren't too keen on getting caught, with many running their processes through proxy servers or using custom mining pools to separate the mining from the cryptojacking while still deceiving users. Instant Messaging Instant-messaging clients can be found in nearly every workplace, and while the most popular ones, such as Skype, tend to be on the list of authorised apps, it's the use of unknown, unsanctioned messaging apps that can expose an organisation to danger. For example, Pidgin is an open source client used by millions worldwide, but it can do much more than just enable communication between co-workers.
In some environments it can also be used as a tool for running arbitrary commands on infected endpoints and controlling backdoors.    Pirated Apps   In recent years, there’s been a growing number of apps sold outside of official stores. Many of these have been designed to look like legitimate ones, but are instead laced with malware, spyware, or worse. When installed, they can open up a network and the data held within to all kinds of cyber-attacks.    The Wider Issues With Shadow Apps   Aside from the inherent risks that unsanctioned shadow apps present, like those described above, they also create wider issues that can make life very difficult for IT teams. One of the biggest is the fact they aren’t patched like sanctioned apps are. The majority of large organisations operate strict patching regimes across all their main applications, keeping them up to date with the latest bug and vulnerability fixes. However, with shadow apps falling outside of this scope, it can be weeks, months or even years before the employees using them decide to run updates, leaving them open to exploitation and unauthorised access.    In other situations, these apps could be rigged to leverage network functionality to third party sites that an organisation may not even be familiar with. A perfect example would be an attacker using an FTP application that his or her organisation does not monitor at all. Once the attacker has access to sensitive data, he or she could exfiltrate it via the FTP without the organisation even knowing about it.   Once an organisation has established an ecosystem of sanctioned apps, it needs to take great care in ensuring third-party apps that integrate with those sanctioned apps don't proliferate without the IT team’s knowledge. Popular cloud storage solutions like Dropbox and Box are often authorised for use in organisations, but they also interact with a large number of other apps that don’t have the same authorisation. If these avenues aren't identified, they can quickly pose a threat to the organisation’s data security. Lower the Risk by Understanding What’s in Your Environment  For any organisation concerned about the use of shadow apps in its environment, there’s a growing number of security technologies that can be used to gain valuable insight into the apps employees are using, both sanctioned and unsanctioned.  For example, some software can give the IT team complete visibility into the types of data flowing through their system and even block unauthorised apps from executing. Others can be used to educate employees by alerting them when they attempt to open unsanctioned apps that are against company policy. Over time, these kinds of prompts help to change employee behaviour, teaching them to think more carefully before they act and understand when they are behaving in a risky manner.   Shadow apps inevitably find their way into the majority of organisations and while not all of them pose a threat, many of them can if they aren’t carefully monitored and/or controlled. While IT teams may not be able to prevent them altogether, taking steps to know what they are, the data they are accessing and who is using them will all play a key role in minimising the threat they pose. 
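As a simple illustration of the visibility tooling described above (a minimal sketch, not a description of any particular product), an IT team could periodically compare the processes running on an endpoint against an approved-software list and flag anything unknown. It assumes Python 3.9+ with the psutil package installed, and the allowlist contents are invented.

```python
# Minimal sketch: flag processes that are not on the approved-software list.
# Assumes Python 3.9+ with psutil installed (pip install psutil); allowlist is illustrative.
import psutil

APPROVED = {"chrome", "outlook", "teams", "excel", "explorer", "systemd", "sshd"}

def find_shadow_processes():
    """Return process names running on this endpoint that are not approved."""
    unknown = set()
    for proc in psutil.process_iter(attrs=["name"]):
        try:
            name = (proc.info["name"] or "").lower().removesuffix(".exe")
            if name and name not in APPROVED:
                unknown.add(name)
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process exited or is restricted; skip it
    return sorted(unknown)

if __name__ == "__main__":
    for name in find_shadow_processes():
        print(f"Unsanctioned or unknown application running: {name}")
```

Commercial application-control tools do this at much greater depth, and typically block execution and alert the user rather than simply reporting, but the underlying idea of reconciling what is actually running against what is sanctioned is the same.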
[/vc_column_text][/vc_column][/vc_row] ### Dangers of automation part 1&2 | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310608280/1defb12bf5[/embed] [/vc_column_text][vc_column_text] [embed]https://vimeo.com/310608269/772a204f34[/embed] Automation is a big buzzword in the cyber security industry and it can help incident response. However, people need to think about what process they are going to be automating. In order to automate, you need to be completely certain. Join Craig Roberts from IBM Resilient as he discusses the dangers of automation [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### SOAR markets | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310599818/f4191d368e[/embed] Security orchestration, automation and response have previously been disparate, fragmented technology and process areas desperately strung together. This has led to the convergence of a number of technology areas. Join Ben Fleming from IBM Resilient as he discusses SOAR Markets [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### Artificial Intelligence: The Problem with Perfection In today's world, with artificial intelligence, deep learning and machine learning all around, crunching trillions of data points across a plethora of industries and consumer applications, we face a single issue. That issue is one of perfection - with perfection we have a sense of everything attuned and cultivated towards a wonderland that is beyond rebuke… With perfection we remove choice and freedom of choice, negating that which makes something perfect to us on both an emotional and human level. From my perspective, I like things that are imperfect. Movies that are so bad they make me laugh, food that is bumpy and organic and not artificially shaped, songs that don't follow a prescriptive flow or beat and provide a variety outside of machine-generated recommendations. In every industry, you would be hard pushed to find business leaders who are not on the cusp of deploying, or already deploying, some form of machine intelligence project. But will this continued push towards perfection cause a retro movement in my generation, like we are seeing with vinyl records versus digital downloads, or the move away from larger social media platforms? In the future world I imagine, which will be driven by algorithms, the question I ask is: do we need to regulate perfection? We often hear about bias in algorithms, which may be attuned to gender or other causes, but is our real enemy the destruction of choice? Destruction of choice I do not think we will ever experience choice, or freedom of choice, without gaming or manipulation; since the days of early advertising and marketing, we have always been manipulated into a purchase or towards brand recognition. The issue with this manipulation is when it spills over into political or other messaging, as we have seen recently. There are many quotes I could use that describe an all-in or all-out approach, but the issue I am trying to address is that choice is the basis of all human intelligence, and the one thing that provides regulation: from the vote we cast to the movie we watch, protection of choice is key.
The growth of companies we had never heard of only five years ago shows the world how algorithms, machine learning, artificial intelligence and deep learning are fuelling today's consumer economies. As an individual, I don't want to be manipulated into choosing my perfect song, film, or item of clothing. Unlike the Mad Men era of many years ago, data, and the collection of that data, is virtually weaponised against me on a daily basis, from the cookies that are collected and cross-matched against databases I never gave permission to profile me, through to any form of unauthorised use of personally identifiable information (and yes, you anonymise the data, blah blah blah). GDPR and its controls over data collection and privacy are, in my view as a user, a great step towards transparency, but is it enough? Do we need to see the control and monitoring of algorithms to preserve choice? Could we look at regulation and oversight to alert us? Could this be based on a simple solution such as the voluntary food label scheme that shows high salt and fat content in an easy-to-use traffic light system on packaged food goods? "Could a similar system of traffic lights be shown when we are being nudged or persuaded towards a choice, even if that choice is one amassed by the collection of data?" My generation was born on the internet, but that does not make us slaves to online data services; in fact, it is totally the opposite. I strongly believe now is the time to begin looking at the systems that guide all of our choices today, before it is too late. ### IoT | Taking a Stronghold on Mobile App Development [vc_row][vc_column][vc_separator][vc_column_text]Jinkies! It seems like we are all set to create a Jetson-esque future; all thanks to IoT mobile app development. After falling into the general pit of buzzword-vagueness, the Internet of Things (IoT) is now being used creatively in hitherto unheard-of sectors, including the mobile realm, with no signs of slowing down. Back in 2011, Nest (later acquired by the tech giant Google) introduced its smart thermostat, a product that could be controlled by a smartphone. Later, the concept gained wider acceptance with the launch of the Apple Watch, initially tethered to the iPhone. Fast forward to today, and the technology touches a variety of industry verticals ranging from healthcare to agriculture, education, energy and so forth. Smart cities are growing at a frantic pace, providing vast scope for accessing IoT-enabled devices. Tapping this opportunity, many businesses have developed their own dedicated mobile apps to better engage with their customers and stay ahead of competitors. But How Does The IoT Affect The Mobile App Industry? First and foremost, the Internet of Things makes it far easier to connect mobile phones with other smart devices. Take Uber, for example. Of course, there were already apps capable of helping you get a taxi, but when a mobile app combines with IoT, we get something extraordinary like Uber. The app shows you where the nearest driver is and how much you're going to pay for the ride. All this is possible by gathering data, via IoT, from all the devices connected to the app. Below are a few pointers on the impact of IoT on mobile app development. Life becomes easier Typically, IoT is one system managing everything: it comprises multiple devices, apps and functions.
For instance, mobile devices featuring IoT functionality enable you to complete several tasks within a few minutes, such as switching on the lights, tracking a cab or checking a security camera. As a result, life becomes easier for mobile app developers and end users alike: users can now manage everything from one single device. The time and effort developers have to put into building an app is also reduced, which means the remaining time and effort can be spent on other activities without much hassle. And because the internet is integrated with the devices themselves, the technology can manage a lot of tasks for end users as well. Progressively diverse and flexible skill sets Living in the age of smart objects, apps have to be designed to be highly adaptable in order to remain relevant. Do you know why? Because apps can become obsolete almost instantly. In addition to this, developers need to be highly adaptable and committed to investing in their own development. So if you identify any skills gaps, it's time for an update! High-end Security Unfortunately, IoT opens up an unprecedented number of entry points, and the threatening ones can harm you in several ways. Cybercriminals find it very easy to exploit the data and use it in unethical ways. So what needs to be done? The IoT mobile app has to be safe and secure, with the user's privacy protected at every stage. The good news is that despite this, enterprises still look forward to exploring IoT in their respective fields and areas. This has led businesses running IoT-enabled devices to put an extra layer of security in place. Since no enterprise wants to invite a cyber attack, security has always been a key discussion point in the development of applications. Lastly, IoT can strengthen several defense barriers: make your physical devices the first point of entry. This won't just protect the core but will also alert professionals to an incoming threat. Hybrid Apps One must accept the fact that a genuine innovation deserves all the accolades it gets, and hybrid apps are one such technology for sure. Revolution is an ongoing process, and even today most of us are looking to adopt new ways to change the way we interact. Several mobile app development companies have started to use IoT, augmented reality, artificial intelligence and more, together with advanced code, to come up with extraordinary outcomes. An IoT Mobile App Can Serve Many Objectives of the Business Location independence - IoT-based devices can be controlled easily with smart devices from anywhere. More convenience - apps offer better accessibility, because people use smartphones far more frequently than their laptops. Notification facility - an app can give you real-time updates on the IoT network. Social media integration - integrating an IoT mobile app with social media networks means you can remain in touch with employees and customers at once. Brand promotion - promoting a brand becomes far easier, because a mobile app featuring IoT support lets your end users utilize specific devices or products with ease. Increased flexibility - mobile apps have flexible accessibility. I am pretty sure you know the power and effect of offline functionality. [/vc_column_text][/vc_column][/vc_row] ### App exchange community vs enterprise | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310576130/4b4bba732f[/embed] Did you know that IBM has an app store with Resilient-compatible apps?
Although this app store may not be what you think. When IBM says app store they don’t mean a game you can download for your phone. They mean fully packaged integrations that are simple and easy to install. Join Jessica Cholerton from IBM Resilient as she discusses the IBM app store. [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### Three pieces of bad news about botnets you need to know The Council on Foreign Relations, a non-profit US-based think tank, ended 2018 with a report arguing that we need to work towards an internet with zero botnets. Yes, botnets, those malicious powerhouses that are each comprised of anywhere from thousands of enslaved devices to millions of enslaved devices. In the defence of the Council on Foreign Relations, they never said it would be easy, they just said it’s what needs to happen because botnets are the bane of the internet. They are correct. The history of botnets and their associated attacks and other malicious accomplishments like DDoS, spam and cryptojacking is long and storied, since botnets have been at it for decades. However, there isn’t much point in looking to the past when the current botnet situation is so scary. Here are three pieces of bad news you need to know about botnets. A scary-smart malware is building a botnet and no one knows what it’s for If there’s one thing a botnet loves, it’s IoT devices. With the huge number of IoT devices connected to the internet – currently estimated to be somewhere around seven or eight billion – and their typically lax security, they’re ripe for the picking for botnet malware designed to guess default usernames and passwords. Now, unfortunately, it would seem they’re also ripe for the picking for a scarily brilliant botnet malware armed with over 100 variants of its malware payload, a range of commands designed to ensure payload delivery, and the capability of infecting between 15 and 20 IoT architectures. Forget about wiping this malware with a reboot of the device as well, because it has seven different methods of persistence all in use at once. Beyond being a major step up in what botnet malware is capable of, not much is known about the so-called Torii botnet. It’s carefully encrypted, and so far only one of its servers has been analysed by security researchers. No one knows what for sure Torii is being built for or who is behind it, but experts say that in addition to the usual botnet abilities it is capable of stealing data. A vulnerability from 5 years ago is helping to build a massive botnet Ah, yes, the internet. A place where a funny cat picture from two days ago is old news but vulnerabilities from years and years ago continue to aid and abet cybercriminals. A botnet catchily nicknamed the BCMUPnP_Hunter is using a universal plug and play or UPnP vulnerability from over five years ago to feast on home and small office routers to add devices to its ranks. Originally designed to make configuring devices easier for their users, the UPnP protocol has caused a variety of cybersecurity issues. In this case, it causes the routers in question to respond to discovery requests that come from outside of the local network, which then allows the botnet malware to deliver the malicious payload. As a result, BCMUPnP_Hunter currently consists of over 100,000 devices, and researchers suspect they are being used to send spam. 
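To illustrate the discovery behaviour being abused here (a minimal sketch for testing a network you own, not a reproduction of the botnet's code), a UPnP/SSDP "M-SEARCH" probe is just a small UDP multicast message; devices that answer it are advertising themselves, and a router should never answer such probes arriving from the internet-facing side.

```python
# Minimal sketch: send an SSDP M-SEARCH probe and list devices that respond.
# Standard library only; run it on a network you own. Responses indicate devices
# advertising UPnP - the same discovery mechanism BCMUPnP_Hunter abuses.
import socket

SSDP_ADDR, SSDP_PORT = "239.255.255.250", 1900
MSEARCH = (
    "M-SEARCH * HTTP/1.1\r\n"
    f"HOST: {SSDP_ADDR}:{SSDP_PORT}\r\n"
    'MAN: "ssdp:discover"\r\n'
    "MX: 2\r\n"
    "ST: ssdp:all\r\n\r\n"
).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH, (SSDP_ADDR, SSDP_PORT))

try:
    while True:
        data, addr = sock.recvfrom(65507)
        # Each responder identifies itself in the SERVER/LOCATION headers.
        print(f"UPnP response from {addr[0]}:\n{data.decode(errors='replace')}\n")
except socket.timeout:
    print("Discovery window closed.")
```

If a router responds to probes like this one when they originate from outside the local network, it is exposing exactly the surface BCMUPnP_Hunter feeds on; disabling UPnP on the WAN interface closes it.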
The Mirai malware has moved beyond the IoT Mirai had its first run as a terroriser of the internet at the end of 2016 when it launched a series of record-breaking DDoS attacks, including the assault on the Dyn DNS service that took down Twitter and Reddit, among other major online entities. It's been enjoying a second run as one of the biggest malicious forces on the internet since its source code was released and everyone with a passing interest in launching distributed denial of service attacks has created a Mirai variant to build their own IoT botnet. Now, that fervor for the Mirai botnet has extended beyond the IoT and right into Linux land. Mirai variants are now targeting the Hadoop YARN vulnerability on unpatched Linux servers. With the power and bandwidth these servers offer to botnet operators, even a small Linux botnet could be capable of thundering DDoS attacks, ones that rival the attacks coming from massive IoT botnets. Battling back against botnets With a truly concerted effort from business owners, device owners, governments and cybersecurity professionals, we may one day have a future without botnets. For now, however, we have a present rife with them, including the devilishly brilliant botnet gearing up to steal data, a router botnet preying on a vulnerability from five years ago, and a Mirai variant going after all the firepower provided by Linux servers. It's time to start applying patches and checking security on IoT devices because, while that zero-botnet goal may currently feel out of reach, we should all be able to work towards decreasing the size of botnets. ### Service Effectiveness | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/309932707/978d451df6[/embed] Security operations are starting to receive the funding that they need. Now businesses that invested in their own security are starting to look back and ask what value is being delivered from that investment. Join Ben Fleming from IBM Resilient as he discusses service effectiveness [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### How AI applications are being used to transform business AI is a new paradigm in computing. You'll find authors defining it differently in various tech publications, but the general consensus appears to be this: AI, of the type that's prevalent today, determines solutions to various issues without being explicitly programmed to do so. Historically, our software's sole purpose has been automating logic. We've given it problems we knew how to fix and had it perform calculations within predefined rules, to achieve objectives promptly. Programming has been based on proof and certainty. AI, as a computer science, has a more empirical nature. It doesn't require prior knowledge of the truth and applies probability and statistics to deal with uncertainty. In a nutshell, it strives to draw inferences, as accurate as possible, from incomplete information. Why Are We Talking About AI Now? Three factors are considered to have spurred the current AI wave: the recent algorithmic advances in Machine Learning (referring primarily to deep learning algorithms), the emergence of large enough datasets for ML models to detect patterns in, and, finally, the availability of strong computation hardware that can power Big Data processing. If we were to name one company that helped escalate the progress, we'd say it was NVIDIA.
The firm's robust GPU cards, invented initially to conjure up realistic graphics for PC gaming, turned out to possess properties nearly ideal for Deep Learning. Namely, their chips each contained about 4,000 cores, which weren't particularly powerful on their own but enabled parallelisation of computing and thus provided sound platforms on which neural networks could function efficiently. There were also many influential research papers published in recent years, such as Dropout: a simple way to prevent neural networks from overfitting, Deep Residual Learning for Image Recognition, Large-Scale Video Classification with Convolutional Neural Networks and many more. Collectively, they have helped ML practitioners gradually increase the prediction accuracy of ML models from around 60% (10 years ago) to about 90% (now), and thus enabled firms in various industries to invest with confidence in AI offerings to drive tangible benefits. Where is AI being used now? Nearly 95 percent of the overall economic value generated by AI is produced via supervised machine learning. That is why the two terms - Artificial Intelligence and Machine Learning - have almost become synonymous in many a tech outlet. Supervised learning concerns itself with determining the quickest paths from inputs to outputs. It enables us humans to teach computers, once they have been trained on labeled datasets, how to perform specific tasks and maximise specific metrics. ML is being used widely in: Digital Advertising. The abundance of easily available social data helps companies gain deeper insights into their target audience and segment prospects with more precision. Marketers feed information about a person (input) to an AI model and have it calculate, for example, the probability that the user will click on a certain ad. According to Salesforce, around 60% of marketing leaders are now convinced that AI technology will help them better personalise content and increase the efficiency of programmatic ad campaigns. Web Fraud Prevention. Cisco, an international tech giant, has recently launched an Encrypted Traffic Analytics platform, which leverages machine learning to detect malicious behaviour and threats in encrypted traffic. The system spots suspicious actions by monitoring the metadata of the packet flow, without decrypting information. Video surveillance. Relying on the same mapping-input-to-output principle, ML algorithms can be taught to recognise, with great accuracy, people's faces. Panasonic's FacePRO technology (one among many similar tools) can identify a person's features even if their face is partially obscured (by sunglasses or a mask) or they've aged 10 years since their previous photo. Biometrics and facial recognition, empowered by ML models, are expected to become mainstream soon. Oracle's Hotel Report 2025, for instance, suggests that 72% of hotel operators are planning to adopt such systems in the next few years. Voice and speech recognition. With Google claiming that 20% of their queries are voice searches (and Comscore predicting that 50% of all online searches will be conducted via voice by 2020), it makes sense that the market for speech recognition technology, which turns audio clips into text transcriptions, is projected to reach $6B by 2020. Self-driving cars. At their core, self-driving vehicles utilise ML technologies that input images of the surroundings (readings from radars, etc.) and output positions of other cars and objects on the road.
Uber already has a self-driving car test program running, while Ford, Hyundai, Renault-Nissan, and other automotive titans have all promised to catch up by 2021. Does Artificial Intelligence Mirror Human Intelligence? Contrary to the bogus claims some sensation-loving bloggers love to spew, we are nowhere near replicating, in any meaningful way, the interlinked and sophisticated functions of the human brain. Nor are we trying to. The name Artificial Intelligence, which triggers the comparison of machines to people, is largely considered unfortunate. As people, we have: Motor intelligence. Though we take this one for granted (we don't treat it with the same reverence as, say, excellence at math), we do pay millions to athletes who have exceptional coordination and physical skills. Visual intelligence. We're not only able to grasp visual input from our surroundings but also interpret images in a fraction of a second. Our perception system is fine-tuned to capture light (through our eyes), send it to our brains (via receptors), and process it, rapidly, in our visual cortex. Speech intelligence and natural language understanding. Our innate audio recognition system, too, is extremely complex in its anatomy. We're able to pick up air vibrations (sound), push them along through multiple processing mechanisms to the auditory cortex and have them transformed into meaningful information. These are all parts of our interconnected intelligence. One can recall an image from the immense database that is their memory and then immediately find words and gestures to describe it. AI systems are only designed to tackle one specific objective at a time: to move, to summarise or translate text, to recognise faces in photos, etc. No one is intending to transform them into omniscient, human-like beings. The fields AI has made the most progress in so far include: Computer Vision. We've made strides at mimicking human perception of light and colour. Our cameras can capture images in digital form. However, we have yet to train neural networks to comprehend pictures - which, to a machine, are just sequences of integer numbers representing intensities on a colour spectrum - with the speed and precision of a human. Speech recognition. Our AI systems can already perceive natural language and translate texts, especially on technical matters, with remarkable accuracy. That said, plenty of work is still ahead. To interpret language fully, in a human sense, a computer would have to learn to comprehend social context, read humour and sentiment, and get a grasp on human feelings. So, What Does AI's Future Look Like? AI's adoption, experts predict, will come in three stages. At first, industries already sitting on big data - finance, healthcare, ecommerce, education, etc. - will profoundly transform their business processes and incorporate AI to enhance, restructure, and automate operations. Next, we'll see an abundance of new types of data floating around on the web - info from wearables, interactive personal assistants (such as Echo), smart vehicles, various IoT devices, and ubiquitous computing. Finally, as AI platforms become commonplace, we'll be able to teach machines to move autonomously (this means walking robots and self-driving cars will become accessible). According to Dr. Kai-Fu Lee, a renowned AI guru, we can expect the last stage to come into full effect in about 15 years.
AI has moved out of the lab and companies across different verticals are increasingly adjusting their business models to accommodate for its full potential. Now, the goals of AI investors are no longer centred around increasing efficiency and automating simple processes (for which, say, chatbots were developed.) They’re focused on cracking high-grade tasks such as enhancing pricing optimisation and preventive maintenance, which, in turn, can increase uptime. ### Juniper Networks Helps Enterprises Simplify Data Integration to Pinpoint and Prioritise Cyber Threats from any Security Source Updates to the Juniper Advanced Threat Prevention portfolio leverages third-party firewalls and security data sources to offer enterprises a fast, flexible and automated defense against malicious activity   Juniper Networks, an industry leader in automated, scalable and secure networks, today announced new offerings as part of its Juniper Networks® Advanced Threat Prevention (JATP) Appliances, enabling enterprises to detect malware, understand behaviour and mitigate threats with just one touch. This solution leverages data from any third-party firewall or security data source, avoiding unnecessary vendor lock-in. Eliminating complex, time-consuming data collection configurations, Juniper is helping security teams improve their organisation’s security posture by simplifying and accelerating security operations.   Sixty-four percent of security teams surveyed said that speeding up threat analysis and prioritising threats with automation would improve their security posture, according to a Juniper Networks and Ponemon Institute study. High volumes of incident data generated by numerous, disparate sources make threat detection and mitigation increasingly difficult. To uncover critical threat behaviour, already-understaffed security teams spend significant time analysing and correlating alerts, ultimately increasing time to remediation. Security teams also face the manual tasks of creating one-off custom integrations to ingest relevant data from these sources.   To address these challenges, Juniper Networks today revealed new capabilities that build upon the open architecture of its unified cybersecurity platform. Now security teams can easily create custom data collectors right in the JATP Appliances platform, enabling the ingestion of threat data from any Juniper or third-party firewall. Leveraging an intuitive user interface without the need for custom code or pre-defined integrations, Juniper is simplifying operations in multi-vendor environments. This new capability introduces easy-to-use customisation controls for security analysts to collect, parse and pinpoint specific data without relying on outsourced customisations. It also automatically integrates with the single, comprehensive timeline view offered by the JATP Appliances, streamlining investigation and remediation by bringing the most important threat behaviour details to the forefront more quickly. The JATP Appliances provide up to 12x productivity gains over manual processes for malware investigations.   With the continual advancement of its unified cybersecurity platform powered by Software-Defined Secure Networks (SDSN), Juniper helps security teams pinpoint evasive threats hiding deep in the network while showing a temporal view of behaviour to stop threats as quickly and effectively as possible.   
News Highlights:   Seamless Integration of Security Data from any Network Source: Building upon the platform’s open architecture, the JATP Appliances can now quickly capture, parse and leverage data from all security sources in the network through its built-in custom data collectors, eliminating the need for outsourced and time-consuming configurations. Once the dataset is defined, it seamlessly flows into the JATP Appliances threat behavior timeline, empowering security teams to quickly see what happened and when in an intuitive user interface. This new capability supports multiple log format types, including XML, JSON and CSV, and is complementary to existing SIEM functionality.  Juniper Networks® Advanced Threat Prevention Appliance for Distributed Enterprises: As part of Juniper’s continued efforts to provide superior protection from malicious activity, this new on-premises device is the ideal option for security teams that require automated threat prevention capabilities across their distributed enterprise. The JATP400 Appliance works alongside any existing firewall, reducing the need for complex integrations and with the built-in timeline view, security teams are able to mitigate threats with just one touch.  “We have been very pleased with Juniper’s best-in-class unified cybersecurity platform, which has helped protect us against threats more easily and faster than before, mitigating the risk of disruption to our shows, websites and other media sources and protecting our intellectual property. With Juniper’s security platform, we’ve been smarter, leaner and more efficient with our network. We’re looking forward to these new additions to their portfolio and continuing to leverage Juniper’s products to protect our company against a wide range of threats.” -- Dustin Brandt, director of IT, America’s Test Kitchen   “As a longtime partner of Juniper, we have seen its unified cybersecurity platform continue to grow into the robust security offering it is today. We believe that these newly added JATP400 capabilities will enable our customers, including distributed enterprises, to quickly identify and intuitively fight threats by adding much needed automation to their security portfolio.” -- Patrick Zanella, Security Practice Lead, Integration Partners   “We are immensely proud of the progress we have made to date with our unified cybersecurity platform and are excited to announce the newest addition to our portfolio, JATP400 Appliance, along with the addition of our latest threat detection capabilities. The new custom data collectors, in particular, will give our customers a fast and flexible way to gain a better view of their network from all angles, using their security data to quickly identify advanced threats directly from the JATP Appliances. We look forward to bringing these new capabilities to enterprises and taking another step toward truly secure networks.” -- Samantha Madrid, vice president of security business and strategy at Juniper Networks   Follow Juniper Networks online: Facebook |  Twitter  |  LinkedIn About Juniper Networks Juniper Networks simplifies the complexities of networking with products, solutions and services in the cloud era to transform the way we connect, work and live. We remove the traditional constraints of networking to enable our customers and partners to deliver automated, scalable and secure networks that connect the world. Additional information can be found at Juniper Networks (www.juniper.net). 
### Email security in the age of cloud When it comes to sending emails, security is anything but straightforward. As one of the oldest systems still widely used across the Internet, email fundamentally lacks any kind of consideration for security or privacy. Over the years, there have been many solutions proposed and deployed to address this shortcoming, most of which are transparent to end users. Red Sift's OnDMARC assists with setting up the three core technologies widely used to "add" security to email: SPF, DKIM, and DMARC. These technologies work hand-in-hand to protect both a company and its customers from phishing and malicious email. Technical solutions, however, are never enough. In the age of the cloud, it is increasingly difficult to keep a secure email server in one's basement or even datacenter. Not only is it often prohibitively expensive to run such a crucial service internally; staying on top of critical updates without compromising the availability of email also requires a dedicated team. On top of that, there are a number of great products out there that are loved by development, marketing, and commercial teams for their ease of use and distinctive features. Instead of flat-out banning cloud services, it is useful to consider what real threats they pose to our company. We need to develop an intuition as to where things may go wrong, and how they will affect operations when they do. Risk assessment When we start using cloud services for business functions, whether it be the CRM, mailing lists, or as part of company software, we have to realise that we're delegating some trust to a provider. We pay them to do "the right thing": not only to run a reliable product, but also a secure one. Managing this trust can be tricky, so understanding exactly how things will go wrong when they do is essential. Trust does not have to be a yes or no question. I am often happy using SaaS I do not consider trustworthy at all. Knowing that such services serve a specific purpose for a short period of time, and that if they betray me my sensitive information is still inaccessible to them, I will feel safe. On the flipside, there are occasions where only the best of the best will do, like my personal email. Gmail and Outlook 365 are the biggest providers, and they will generally do a very good job of keeping their users secure, even against nation-state actors. But what about privacy? What about company secrets? Realistically, a "big" provider is often more secure just by virtue of being big. A compromise of a centralised provider could be catastrophic globally, therefore they can afford to employ the best security personnel and practices they need. In contrast, a company whose main focus is different from email will put less significance on its email security. Indeed, the scale of a compromise in its case is going to be smaller. A company is going to be an easy target if the easiest way in is through a provider that did not consider security spending important. For example, impersonating a company and sending phishing emails to people by abusing SendGrid will be much harder than by compromising a smaller vendor. Google is going to need to protect more users than Switzerland-based privacy-focused alternatives, and they will do a more diligent job as a corollary. Measure twice, trust once Evaluating the trustworthiness of a service provider is difficult. There are a few angles that we need to look at before making our own decision. Is it the vendor's core product?
If a service provider’s core product is not the one we’re going to use, chances are they pay less attention to it. A small marketing agency running a mail server is not specialised in providing a marketing mailing list. Their focus is providing marketing communications and strategy on behalf of their customers. While seemingly the latter includes the former, in reality, managing a mailing list is a fundamentally technical task while marketing activities are, well, they aren’t. The number of customers. Big providers tend to be more reliable just by virtue of being big. Email sending services like MailChimp and SendGrid are software companies with a narrow focus. They solve one problem well, and because many people put faith in them running their infrastructure well, they need to live up to that trust. Reputation. The reputation of a company through recommendations or press weighs a lot. While anecdotal evidence should always be taken with a pinch of salt, there are a lot of ways for a company to demonstrate their worth. Good documentation and support are usually a positive sign about a vendor staying on top of their product. User experience. Consistent and good user experience often means things are in good hands. If there are reports about an unreliable service, it is best to stay away. User experience doesn’t just mean a flashy website and ease of use, but also stability and reliability. When we start to see constant outages, it’s time to start asking questions and evaluating alternatives. Pace of change. Regular updates can be a good sign. If, however, the upgrades bring instability, that’s quite possibly a sign of some failure in testing. A few products, on the other hand, don’t require continuous development. Slack has not changed their UI in what seems like forever, and it still works. Eggs in many baskets, in baskets, in baskets… Of course, besides analysing products for their security, it’s worth looking at the impact of their potential breach. After all, even the seemingly safest product has a non-zero chance of failing, and it’s worth considering what happens when they do. As a general rule of thumb, using a small number of providers to handle different tasks is a good idea. For instance, consider the scenario when we have multiple providers sending emails from company.com, and an employee in sales gets phished, allowing the attackers to take control of the CRM. Until the breach is detected, they can impersonate the company to our customers. However, after noticing a breach, the CRM can be distrusted as an email sender for company.com using DMARC without any disruption to other business functions. For a low profile domain, we might choose to go with a less trustworthy provider, and its impact should be small enough to “not care”. For the crown jewels, only the best of the best will do. There’s a balance how much we should diversify the services that take care of the online persona, and that balance may be hard to find or, indeed, maintain. But diversification in general is a good thing. Assuming a correct setup of SPF, DKIM, and DMARC, we can always rely on limiting the scope of a breach. Email security is complex, but it does not have to be scary. By assessing the products we use and how they fit into the infrastructure, security-related decisions become manageable. Most importantly, cloud SaaS products almost always offer fine-grained controls over the amount of information we share with them and the actions they are allowed to perform on behalf of the company’s online persona. 
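To make the SPF and DMARC mechanics discussed above concrete, here is an illustrative sketch of the DNS TXT records involved. The domain, selector, key and report address are placeholders, and in practice a DMARC policy is usually rolled out gradually (p=none for monitoring before moving to quarantine or reject).

```
; Illustrative DNS TXT records for company.com (placeholders, not a recommendation)

; SPF: only these services may send mail as company.com
company.com.                TXT  "v=spf1 include:_spf.google.com include:sendgrid.net ~all"

; DKIM: public key published under a selector chosen by the sending service
s1._domainkey.company.com.  TXT  "v=DKIM1; k=rsa; p=MIGfMA0G...truncated..."

; DMARC: tell receivers what to do when SPF/DKIM checks fail, and where to send reports
_dmarc.company.com.         TXT  "v=DMARC1; p=quarantine; rua=mailto:dmarc-reports@company.com"
```

The containment scenario described earlier then amounts to editing one record: removing the compromised provider's include: entry, or tightening the DMARC policy, stops it sending as company.com without disrupting the other services.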
We should not be afraid to tinker with these controls, while staying conservative about which options we select. Last, but not least, solutions such as OnDMARC and OnInbox can help companies understand their email security profile better, and offer advice to remediate configuration issues. ### IR on top of SIEM | IBM Resilient [vc_row][vc_column][vc_column_text] [embed]https://vimeo.com/310345820/526e403619[/embed] Most people have heard of Security Information and Event Management (SIEM) tools, but when you're trying to investigate an incident, SIEMs can make it seem like you're looking through a keyhole. Join Andrew Yeates from IBM Resilient as he discusses IR and SIEMs [/vc_column_text][vc_column_text] Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient [/vc_column_text][/vc_column][/vc_row] ### Why integrating systems makes visitors feel like VIPs Two out of every five people claim their perception of a company or brand has been negatively affected by their experience in the corporate lobby or reception area. That's a potential 40 percent of customers left with a bad impression, according to a survey of 2,000 office workers carried out this summer across the UK and US. From unfriendly receptionists and having their name misspelled on visitor badges, to being late for meetings because of long queues in the lobby or the receptionist being unable to contact the meeting host, business visitors are being let down by a disjointed visitor experience. We all know that first impressions count. And if an organisation fails to make a good one, it can negatively affect its bottom line – especially as there are 11 million face-to-face business meetings taking place every day across the US alone. But by integrating the different elements which make up the visitor experience – from parking and access control to front-of-house and hospitality – no visitor need be left out in the cold. Using – and integrating – the latest smart technologies enables organisations to provide the white glove treatment to every visitor. Imagine the situation where, a few days before their planned visit, the visitor – let's call him John – receives a meeting request by email enabling him to pre-check-in on his phone. In the background, the organisation performs a security check to whatever level is required. Meeting room management and parking systems are triggered to find the best possible space for John, and a unique QR code is generated to enable him to access the building. On the day of his meeting with Mary, John receives a friendly reminder with useful details including a map, directions, health and safety information and even the weather forecast. When he pulls up to the facility, the barrier to the car park recognises his number plate, allowing him in and directing him to a space. At the front desk, he checks in on an iPad and Mary automatically receives a text to say he's arrived. The receptionist makes him his favourite drink, which Mary had pre-registered on the system. He makes himself comfortable in reception, the receptionist having told him that Mary is running late. He logs onto the WiFi with the guest code he received by text and prepares for the meeting. When the receptionist receives a text from Mary to say she is ready, John uses his QR code to open the security gates and follows the wayfinding on his smart phone to the meeting room.
He uses the bathroom on the way – security automatically enabling him access to the bathroom and other common areas with his QR code, but not any commercially sensitive areas of the site. Towards the end of the scheduled meeting, John will receive a text with traffic notifications. After the meeting, he checks out and the system knows he’s left the building. It’s a smooth and seamless process – the integrated visitor experience. The challenge is that no one vendor provides all of these solutions – and often the different technologies aren’t integrated. Visiting buildings still using archaic security and building technologies without smart integration feels like the polar opposite to John’s meeting with Mary.  But automated technologies are now capable of talking to each other to ensure that the safety and security of the building is assured while also extending the same warm, special welcome to an expected business visitor as they would a guest in their own home. The cloud and Software as a Service are now capable of facilitating a smooth and seamless process between multiple technologies. For example, the Visitor Management System (VMS) will sync with the Access Control System (ACS) to automatically grant the appropriate level of access to a visitor, without the need for multiple, clunky security checks. The VMS syncs with the Room Booking system to allocate the appropriate space, the hospitality booking system to ensure the preferred refreshments are on hand to greet visitors, and the parking management solution to book parking spaces to maximise car parking occupancy. The latest technology is taking this a step further: automatic number plate recognition facilitates the automatic opening of gates on arrival; facial recognition can automatically recognise visitors at the reception desk reducing administration time and improving the personalisation of service delivery; and humanoid robots are increasingly working as part of the front-of-house team greeting visitors and performing a range of tasks. Integration take-up is improving. Two years ago fewer than 5 percent of meeting room systems required integration with any other system than basic digital displays. Now one of the first questions asked is about non-proprietary platform integrations. There are numerous drivers towards the integration of these systems: removing the inefficiency of inputting data into multiple systems; the growth of the smart building market; the proliferation of smart phones where people expect a millennial experience through technology; the migration to IP-based networks; and the desire for organisations to use best-of-breed solutions for each application area. But there are challenges. For the Integrated Visitor Experience to deliver optimal value, suppliers must work to open protocols and not be protectionist. Data protection – particularly in Europe which comes under the new GDPR rules – is a concern and organisations must ensure they are managing data correctly. Wider acceptance and use of mobile is threatened by fears over cyber-security in some organisations. And there are concerns that smart technology may replace people, rather than complement them. The reality is that automated technologies are most likely to free up a person’s time spent on menial tasks or operate at night when nobody else is there. In 2019, the corporate world is starting to join the dots between the technology experience customers expect when they visit corporate premises and the apparent ‘disconnect’ there has been to date. 
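As a rough sketch of the kind of VMS-to-ACS integration described above (the endpoint, fields and token format are invented for illustration and do not belong to any specific vendor's API), a visitor management system might notify the access control system at check-in so that a scoped, time-limited QR credential can be issued:

```python
# Hypothetical sketch of a VMS-to-ACS integration: on visitor check-in,
# ask the access control system to issue a time-limited QR credential.
# Endpoint URL, fields and the API key are invented for illustration.
from datetime import datetime, timedelta, timezone
import requests

ACS_BASE_URL = "https://acs.example.internal/api/v1"
API_KEY = "replace-with-a-real-credential"

def on_visitor_checkin(visitor_id: str, host_email: str, areas: list[str]) -> str:
    """Grant common-area access for the duration of the visit and return a QR token."""
    visit_window = {
        "valid_from": datetime.now(timezone.utc).isoformat(),
        "valid_until": (datetime.now(timezone.utc) + timedelta(hours=4)).isoformat(),
    }
    response = requests.post(
        f"{ACS_BASE_URL}/credentials",
        headers={"Authorization": f"Bearer {API_KEY}"},
        json={
            "visitor_id": visitor_id,
            "sponsor": host_email,   # the meeting host sponsors the visit
            "areas": areas,          # e.g. ["lobby", "meeting-room-3", "bathrooms"]
            "window": visit_window,  # access expires automatically
        },
        timeout=10,
    )
    response.raise_for_status()
    return response.json()["qr_token"]  # rendered as the QR code on the visitor's phone
```

The design point is the one the article makes: the credential covers only common areas and expires with the visit, so integration improves the experience without widening access.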
As smart cities are created, and visitors use the latest technologies such as high-speed rail, driverless cars and smart motorways, creating an Integrated Visitor Experience will become a true differentiator. And the result? A robust, safe and warm welcoming experience so that all visitors feel like genuine VIPs. ### NetApp builds on its cloud leadership in global markets NetApp Insight™ Barcelona 2018 showcases solutions that help customers fuel business growth and drive innovation across hybrid cloud and multicloud environments by: Delivering data-rich customer experiences through new applications Optimizing data sources throughout the entire infrastructure Enabling companies to innovate across clouds without fear of data loss BARCELONA – December 3, 2018 – NetApp (NASDAQ: NTAP), the data authority for hybrid cloud, today announced new data services and solutions that empower customers to innovate in the cloud. These new offerings include the expanded availability of the Microsoft Azure NetApp® Files preview, expanded availability of NetApp Cloud Volumes for Google Cloud Platform and NetApp SaaS Backup for Salesforce. As the leader in hybrid cloud data services, NetApp enables users to bring their data to the doorstep of industry-leading public cloud services and respond faster to market changes and to customer demands. The bulk of businesses around the world now rely on data as a strategic advantage. Therefore, it is essential to leverage the cloud to make better decisions, to improve customer experiences, and to develop new revenue streams. Navigating the cloud, however, requires organizations to integrate, to protect, and to optimize data in a post–General Data Protection Regulation (GDPR) world. This challenge requires the hybrid and multicloud expertise that only NetApp, with its Data Fabric strategy, can provide. “Organizations are shifting steadily toward cloud-first strategies. For organizations undertaking digital transformation at the business level, cloud isn't just about picking among a specific set of products or service delivery models such a public, private, or hybrid cloud,” said Stewart Bond, director of Data Integration and Integrity research at IDC. “NetApp is leveraging a combination of technology, partnership, business model, and vision to position itself at the forefront of delivering integrated and consistent data services for the hybrid cloud.” “Building on the many products that we launched at NetApp Insight Las Vegas in October, this year’s European event is a key example of the continent’s dynamic and diverse approach to data-driven transformation,” said Alex Wallner, NetApp senior vice president and general manager EMEA. “Just as GDPR now acts as a blueprint for data regulation around the world, our data-conscious customers are flexing to implement best practices by safeguarding, enriching, and optimizing their data. We are proud to meet this challenge with the expansion of NetApp cloud offerings.” Inspire Innovation with the Cloud Azure NetApp Files is an Azure file-service powered by NetApp’s ONTAP® technology and will be sold and supported by Microsoft.  Azure NetApp Files delivers enterprise-grade storage and data management capabilities for moving and deploying file-based workloads in Azure. This offering is currently in preview in certain regions and will soon expand service capabilities to include additional Azure regions across Amsterdam, Dublin and U.S. South Central, and continue to add capabilities in the coming months.   
NetApp has also announced the expansion of its Cloud Volumes Service for Google Cloud Platform with the addition of the EU-West3 region. This expansion will help companies to: Make data-driven decisions faster by moving analytics to the cloud. Safeguard databases in high-performance file services in the cloud to allow high-speed testing and iteration across applications and workloads, with the ability to quickly revert to the original database if anything goes wrong. Speed time to value by automating development and testing environments with the ability to clone hundreds of environments in minutes. Safeguard Data in the Cloud The general availability of NetApp SaaS Backup for Salesforce is a complementary addition to the NetApp global cloud portfolio to: Empower customers with control of sensitive data through simplified, automated backup and granular on-demand restore. Protect Salesforce data from accidental deletion, corruption, or malicious intent. Help data-driven businesses meet regulatory data compliance requirements. Additional Resources Read more about today's announcements in the NetApp blog posts: Read about MAX Data here. Read about NetApp HCI here. Learn more about NetApp Cloud Data Services. Learn more about NetApp Insight Barcelona. Visit NetApp Insight on Facebook. Read about NetApp Insight on Twitter. Connect with NetApp on LinkedIn. ### How Cloud Computing Will Transform Traditional IT in 2019 Information technology changes rapidly as software and hardware continue to advance, and the gap between traditional IT practices and those shaped by cloud computing widens with every release cycle. These changes are expected to grow more pronounced in the coming year—here are some of the advancements you can expect to see in IT because of cloud computing. Multicloud Systems Until recently, most organisations relied on a single cloud shared across the whole business. Multicloud systems will become more popular in 2019, helping organisations control who can access information and how it is shared, giving them greater control over which employees have access. Additionally, employees can have individual clouds that sync with the main multicloud estate. The major change that will result is an increased need for IT personnel to manage these more complex systems. Quantum Computing While 2019 may not be the year that a practical quantum computer arrives, research shows that IT companies may be closer than ever to creating a computer that can seamlessly interact with artificial intelligence, encrypt data effortlessly, predict the weather, improve financial modelling, and solve complex medical problems, to name a few things. The race is on as technology giants like Google, Microsoft, IBM, and Intel try to create a quantum computer, and some quantum computing services are already available. Cloud Security Could Become More Complicated As the cloud grows, the security measures put in place to protect it will need to grow as well. This could involve the development of stronger security controls, advanced encryption strategies, and an increased need for personnel to manage these systems. This may grow beyond the scope of professionals typically hired by companies, in which case outside IT personnel may need to be consulted. Option of Serverless Computing For some companies, serverless computing may be a good alternative to managing their own infrastructure.
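In practice, "serverless" usually means writing small, self-contained functions that the provider runs on demand. The minimal sketch below assumes an AWS-Lambda-style Python handler behind an HTTP endpoint; the event shape and field names are illustrative, not taken from any particular deployment.

```python
import json

def handler(event, context):
    """A minimal serverless function: the vendor supplies the servers, scaling and
    runtime; the business writes only this handler."""
    # Assume the event body carries an order submitted through a web front end.
    order = json.loads(event.get("body") or "{}")
    total = sum(item["quantity"] * item["unit_price"] for item in order.get("items", []))
    return {
        "statusCode": 200,
        "body": json.dumps({"order_id": order.get("order_id"), "total": total}),
    }
```

The point is what is absent: there is no server to patch, no operating system to licence and no capacity to plan; those concerns sit with the vendor.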
Serverless computing does not mean there are no servers; it means the business no longer provisions or manages the infrastructure, operating systems, or servers itself. Instead, code runs on demand on a platform provided by an outside vendor, which supplies and manages everything from the physical machines and software to the specialists and IT professionals who can be called on if there are any problems with the system. An Increase in Hybrid Cloud Systems For many companies, public cloud systems seem to be the best option. However, these may not offer the same security as organisation-based systems. Organisation-based systems, however, can be expensive, hard to implement, and difficult to manage without the right knowledge. Next year, it is projected that there will be an increase in hybrid cloud systems, which combine the benefits of public cloud systems with organisation-based systems. These will allow clients to have the best of both worlds, meeting their needs for data protection, sharing, and more, as well as falling within their company budget. Option to Build with Kubernetes Kubernetes, the open-source container orchestration platform that originated at Google, is a good choice for companies that have IT experience. It allows containerised applications to be grouped, deployed, scaled, and managed far more easily, which makes running cloud workloads more manageable even for companies that do not specialise in IT services. Some of its advantages include the ability to scale as your company grows, an open-source design that gives you the tools to build a system that works for you, streamlined management of data and company resources, batch execution, and more. Increased Competition The IT field is expected to surge in the area of cloud computing. Research conducted by IDC shows that nearly half of spending in the IT market went on cloud-based technology in 2018. By 2020, it is estimated that 'cloud' spending will amount to 60% or more of IT infrastructure, services, software, and technology. This obvious interest in the market has left many companies clamouring to meet their clients' increasing demand for technological advancement. However driven and competitive cloud computing companies may be, it remains increasingly important that they continue to manage their security needs well. Without proper security, entire cloud systems could fail and leave data vulnerable. The IT competition for the best cloud computing does not just require coming in first—it requires creating the best system possible. As with any form of technology, change occurs rapidly when new systems are developed. With the great impact that cloud computing has had in the last few years, it will be fascinating to see what additional changes happen with time. It has the potential to change the way that businesses operate and how information is shared with the world. ### Data certainty and uncertainty | IBM Resilient https://vimeo.com/309471793/206229e43a To understand certainty and uncertainty in cybersecurity, it is helpful to look at an example from the real world: an aeroplane's autopilot system.
Join Andrew Yeates from IBM Resilient as he discusses data certainty and uncertainty. Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient ### Protecting the cloud - an evolving challenge The definition of the cloud has been in metamorphosis since its conception. The earliest understanding of the term served a useful purpose, as it really was a paradigm shift to scalable, elastic, quickly deployed infrastructure. While appropriate at the time, this definition lacks some clarity, as many now see the cloud as the foundation of the Internet as a whole. What is certain about the future is that 'what goes online stays online'. This points to a future where every device will simply connect to the cloud. Yes, we cannot deny it - "everything" will soon be online. Currently, most things are offline by default, but being online and connected will become the default for everything. The cloud will be the foundation of the data for edge devices. This massive cloud computing power with instant response will make intelligence on demand available for everyone, everywhere. New business models – where devices are boosted by near-inexhaustible cloud-based resources – will begin to emerge. AI will benefit as a result, and we will experience more natural interactions with computers – a kind of super intelligence. This incredible computing resource, combined with fast 5G, will serve us with computing potential previously considered to be in the realm of science fiction. However, with this relentless move online comes a question around protection. In the cloud, true security practices always come at the cost of convenience – a well-known mantra in the security world. What most people are seeking is the right trade-off: a 'certain level' of security that does not eat into productivity. Of course, that 'certain level' will differ, as people have different threat actors to worry about. We have a good understanding of what solid authentication looks like today: a strong password (ideally randomly generated and managed by a password manager) combined with multi-factor authentication via a dedicated app or a hardware token, with biometrics in the mix too. However, we know that a significant proportion of online users are unable or unwilling to adopt these practices. Yes, we can teach some people how to use password managers, and we can get organisations to better integrate password managers with the software and devices we use to enter them, but that will not happen universally. The computer security industry knows that password authentication is broken. With most cloud applications protected by nothing more than a password, this is an increasing challenge. However, it is still a method everyone knows how to use. Until another easy-to-use authentication method surpasses it, we have to live with it and hope that people get the message about the dangers of password reuse across sites and the need to make passwords long (and as random as possible). We are seeing some moves against passwords. Just in October, Microsoft announced that it now supports password-less logins via its Microsoft Authenticator app, working for hundreds of thousands of Azure Active Directory-connected apps. It is not an entirely new avenue for Microsoft: Windows Hello has offered a version of this to Windows 10 users for some time.
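Authenticator apps of the kind mentioned above commonly generate time-based one-time passwords (TOTP, RFC 6238) as the 'something you own' factor. As a flavour of what that involves, the sketch below computes and checks a TOTP code using only the Python standard library; the base32 secret shown is a well-known documentation placeholder, not a real credential.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password from a base32 shared secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32: str, submitted_code: str) -> bool:
    """Constant-time comparison; production services usually also accept a +/- one-step window."""
    return hmac.compare_digest(totp(secret_b32), submitted_code)

print(totp("JBSWY3DPEHPK3PXP"))  # placeholder secret, prints a six-digit code
```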
For Azure Active Directory, the Microsoft Authenticator app essentially mirrors Windows Hello functionality, allowing users to log in to enterprise applications with their fingerprint, PIN or face. The idea is to provide two factors of authentication: something you are (your fingerprint or face) and something you own (your phone). What this indicates is a move towards eradicating the password as the de facto authentication method. It is feasible that biometric authentication becomes the de facto form of providing credentials in the future (although it should be combined with multi-factor methods). Many smartphones already have biometric readers or sensors incorporated into the hardware. Deployment of proper biometric solutions should significantly reduce identity theft, with great benefits for the economy, by removing passwords from the equation in favour of more reliable solutions. Face ID does seem to work quite well. It works by projecting around 30,000 infrared dots onto a face to produce a 3D mesh. The infrared sensor on the front is crucial for sensing depth, which allows the device to verify the 'liveness' of what is in front of it; earlier facial recognition features were easily tricked by face masks and 2D photos. Behavioural biometric authentication methods on mobile platforms are another step in the right direction. They are more than just a one-off identification process, as they allow for ongoing monitoring of a person's behaviour, detecting everything from the way someone types to the angle at which they hold their phone. There is also voice authentication, but it of course suffers from the danger of aural eavesdropping. The most interesting recent move in authentication has been the embrace of hardware tokens by IT giants such as Google through its Advanced Protection programme. At this time, a password plus a hardware token is as good as it gets. Of course, biometrics, authenticator apps and hardware tokens may not yet provide the complete authentication solution we need to fully secure our accounts and systems in the cloud, but they will play an increasingly important role in the days ahead. As our understanding and application of the cloud continues to change and grow, these developments are a step in the right direction. ### Incident response privacy | IBM Resilient https://vimeo.com/309471574/6b1d52b86e If a security incident involves a data breach, it is very important to know which regulations you must comply with. There are many different industry and country regulations that may apply. Join Jessica Cholerton as she discusses incident response privacy. Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient ### How to build your incident response process | IBM Resilient https://vimeo.com/315672026/7289cdc0a8 What do you need to keep in mind when mapping an incident response process? It is always best to look at previous incidents to see what response may work in the future. This technique helps, but many responses are incident specific, so the process will change from incident to incident.
Learn More: https://www.ibm.com/security/intelligent-orchestration/resilient ### 10 Untapped Tech Inventions You Need To Know About Technology plays an ever-growing role in both our personal and professional lives: it is the application of knowledge to organised activities, using machines designed to achieve practical goals. Countless extraordinary tech inventions have shaped modern society. We've all heard of the light bulb that propelled Thomas Edison to fame and Alexander Graham Bell's telephone that revolutionized communication. These are just two examples of products, or rather, solutions invented by brilliant minds. Here, we've prepared our little list of amazing tech inventions you won't believe actually exist. Cicret Bracelet Imagine a wearable device that makes it possible for you to use your own skin as your touchscreen. This handy technology is THE answer to all those questions about quick and simple ways to use smartphones. What do you have? A small bracelet. What does this nifty gadget do? It projects a touchscreen onto your arm and enables you to access all the features of your phone without actually having to take it out. The bracelet comes with its own processor and storage, including a USB port. It uses multiple sensors to guide touch interactions between your fingers and the screen so that you can easily control your phone's applications. Furthermore, this device integrates Wi-Fi and Bluetooth for you to connect to the Internet from your arm. Talk about convenience. Everykey So maybe some of us are not too fond of keys and passwords. Luckily, Everykey is here to save the day. This sleek, secure wristband is essentially a Bluetooth device that uses military-grade security to replace both. All you need to do is keep it close to your phone, tablet, laptop, car door, house door, and similar access-controlled devices. Everykey unlocks them when you are nearby, and they lock back down when you walk away. It also generates secure passwords for your website accounts, which you can then easily log in to. Everykey can be deactivated at any time, and there is no need to worry even if you lose it: you can always freeze it remotely to prevent third parties from using it. STARY the Skateboard This will appeal to the sports fans out there. Weighing a mere 11.5 lbs and capable of reaching speeds up to 30 km/h, STARY has been dubbed the world's lightest and most affordable electric skateboard. It is the fastest skateboard for beginners looking to learn the sport as a means of transportation or simply as a recreational activity. STARY delivers high performance through quick acceleration and regenerative braking. It comes with a remote that powers the skateboard by sliding a control forward or backward. It is a practice-friendly skateboard that works well for different riders. Depending on your skill level, you can choose from four speed modes: beginner, cruise, advanced, and expert. LIX Calling all artists! Technology has worked its way beyond paper, and you can now instantly create what you imagine with LIX, a 3D printing pen. This is a professional tool that pushes your creative boundaries by allowing you to doodle in the air. Its lightweight design gives it the extra comfort and natural balance that come with gripping a 3D pen.
You can create just about anything you like, from freestanding small structures to intricate details and even prototypes. Atomic Fingerprinting What’s the real deal? Everyone wants to find out, and in a world with sophisticated counterfeiting techniques, it can be difficult to tell what’s real and what’s not. Luckily, our scientist friends have come up with a solution that can stamp things with atomic fingerprints to keep forgeries at a distance. This confirms the authenticity of an object and prevents global economies from facing staggering losses that could otherwise result from counterfeit crimes. Zero Breeze Fancy an AC that you can take with you anywhere in hot summer days? This is where Zero Breeze comes in. It is a portable and multifunctional air conditioner that includes an LED nightlight, a Bluetooth speaker, and a power bank for charging your phone on the go. It does not need a power outlet to function, and its battery can run up to five hours. Zero Breeze is suitable for those who spend most of their time outdoors, whether they are at work or on a camping trip. iScout iScout projects all the necessary information for safe driving such as car speed, GPS navigation, fuel level, and maintenance issues as a floating image in front of your vehicle. The device works by syncing to smartphones via Bluetooth and it performs a wide variety of tasks. For example, it enables you to accept or reject calls simply by waving your hand. You can even compose texts through voice dictation and view notifications from applications like Facebook and WhatsApp while keeping the GPS on the screen. iScout also displays reminders, social media content and other notifications along with important features like a forward-looking dashcam to prevent accidents. All in all, this device is designed to promote situational awareness while driving. Cowarobot Any globetrotter will tell you that there is more to having the right luggage. Traveling in style matters these days, and this goes beyond selecting the perfect ensemble to wear. Here’s technology at work again with COWAROBOT R1, the first and only carry-on size robotics suitcase that frees your hands during your travels. This smart invention maneuvers through traffic independently and follows its user while avoiding obstacles in its path. These features of autonomous travel and obstacle avoidance make traveling convenient and pleasant. Thin Client What can be said of Thin Clients, except that they have revolutionized the IT industry in more ways than one? Before we jump the gun, let’s examine the key characteristics that make up a typical Thin Client. This endpoint is essentially a secure, lightweight computer that establishes a remote connection to a server to implement computational functions. Applications, programs, memory, and data that you would generally find on a regular desktop PC are stored securely in a data center when using one of these devices. Thin Client technology primarily replaces slow, power-hungry, noisy and cluttered PC environments. Designed to support Virtual Desktop Infrastructure (VDI) deployments, Thin Client solutions come with several benefits that constitute a modern IT setup. They promote centralized management, minimize carbon footprint, help achieve less downtime and improve productivity by allowing users to access virtualized desktops and applications with time and location independence. Tapplock Here’s one thing. Or, make that two. It is easy to forget important things such as combination codes. 
What about the constant need for keys to access our valuables? Tapplock, the world's first smart fingerprint padlock, changes all that as it brings the best of both convenience and security. You can use it in three different ways: through Bluetooth, fingerprint, or a Morse code backup pattern which involves pressing the power button with short or long presses. Moreover, the device is safe, water-resistant and includes a built-in mobile phone charger. Concluding Remarks This is the year 2018 and, considering the brilliant inventions we already have, we can only expect technology to continue providing us with so much more. Needless to say, it is definitely exciting to get our hands on the products and solutions our genius inventors will come up with next. ### Increasing dependence on tech demands board-level IT representation There is an organisational disconnect between the board and the IT department on disaster recovery. We asked over 400 UK IT decision makers how their Recovery Time Objective (the length of time it takes to restore IT systems following a disaster) compared with the expectation of the board. Around a quarter (26 per cent) said their recovery times were slower than their board's expectation, and a further quarter (24 per cent) didn't know if they were meeting its requirement. The results reflect what we see in the real world: there is a lack of agreement on recovery requirements for businesses. Organisations that do business continuity planning well have recovery objectives agreed and approved by the board. This is important because it sets the goals for business continuity and individual disaster recovery plans. Without a consensus agreement, those recovery plans just aren't working towards a common end. When drawing up a business continuity plan, there is a question we must all address: how quickly do you need your IT systems back after a disaster? Ask the accounts team about billing systems or a sales team about its CRM and the answer is likely "ASAP". It is possible for an IT team to deliver that kind of speed of recovery, but it comes at a high cost. Beyond that initial, knee-jerk reaction, if you get teams to think about how they would be able to continue working using alternative methods, you start to get to a more realistic recovery need. But ultimately the business continuity team needs to collate this information and weigh the costs and implications of downtime against the cost of recovery solutions. Once the business sets these objectives, it's then the responsibility of the IT department to deliver on them – to build the internal capability or to select a service provider to help them meet that requirement. It's therefore vital that these projects are adequately funded: it's pointless to set a very aggressive recovery time but not provide sufficient budget to deliver on it. A growing reliance on technology For that reason, these decisions must also consider expected changes over the short to medium term. The board must factor in that if changes are made to the objectives, it will take time for IT to source and implement new solutions to meet them. We were recently told a story by an IT manager who had just suffered an IT outage. He carried out a successful recovery and had the business back up and running in two days. After the incident, he was called in to explain himself to the board. They asked why it took such an unacceptably long time.
He then showed them that the recovery went exactly according to plan and met the recovery times they had agreed two years prior. This example highlights a specific issue: There is a growing dependence on technology from all areas of business There is an increased expectation of uptime Even in a very short period, for that particular business, the requirement changed. Business continuity plans, risks and mitigation plans should be reviewed and updated every year. But also, consider that the average lifespan of a solution is around 36 months (depending on depreciation lifecycle or supplier contract length). It is therefore important to plan sufficiently far ahead. The case for IT representation on the board This disconnect over disaster recovery also points to a larger issue. We’ve recently seen numerous high profile IT incidents, such as the troubled TSB IT migration and the recent British Airways data breach. It reinforces why organisations simply can’t afford to sideline technology to the back office any more. These aren’t supporting functions – they are the critical operations of the business. It is our opinion that there is a need for greater IT representation at the board level. There is a need for someone with specialist knowledge to be able to translate the specifics of how technology is enabling the businesses – as well as the risks it brings. The board sets the tone for a firm’s digital future, but it is also accountable for the ramifications of tech failures across the entire business. As we’ve seen from our research, it’s clear opinions differ between stakeholders within an organisation. Having a digital leader in place embeds the board with a more realistic understanding of the company’s IT capabilities, opportunities and threats. ### Why cloud may not save you money One of the key reasons usually given for moving services to cloud is to save money by getting rid of expensive IT infrastructure. However, making the business case on cost grounds alone is not necessarily straightforward. First, cloud almost certainly won’t save money unless you fundamentally reengineer your organisation; second, although it takes away in-house IT operations, it creates the need for new skills, such as billing management and managing your cloud provider to ensure they deliver the agreed service; and third, you need to know your way round the ins and outs of cloud usage and charging models to ensure you are minimising spend with your cloud provider. Fortunately, the properties of cloud enable a business case to be made in different ways, focusing on the other benefits that it offers over in-house service provision. Here are five examples where cloud offers specific benefits which can be used as the basis of a business case. Collaborations and joint ventures When you’re setting up a joint venture or collaboration, cloud offers an excellent solution to the dilemma of which organisation’s IT system you should use. If there’s a proven SaaS service available, such as Microsoft Office 365, this can be put in place quickly and ensures that everyone is using the same versions of the same applications, making information sharing much easier. It is also straightforward to add users as the project develops, and wind up at the end of the venture. This was the solution chosen by Fusion, a joint venture (JV) between three leading construction companies – BAM Nuttall, Morgan Sindall and Ferrovial – which was awarded one of the three enabling works contracts for High Speed Two (HS2). 
Fordway implemented a dedicated Office 365 domain for Fusion and now provides ongoing user support through its Service Desk. This ensures that everyone can access project documents, contacts and their calendar on any device from any location. Cloud’s ubiquity of access assists mobile working, which is particularly important for this type of project as most of the users are likely to be working from a range of locations and devices. Short-term increases in capacity One of the advantages of cloud is that you can increase capacity extremely quickly as and when needed. It takes just minutes to spin up new instances, whereas physically buying, building and configuring a new server can easily take a month from ordering equipment to installation, and with cloud there are no fixed asset costs. This makes cloud ideal for organisations whose workload is extremely ‘peaky’, for example retailers coping with busy periods and flash crowds, or organisations with significant increases in workload at month end and year end. You can rent capacity, run the processes you need and then hand the capacity back again until the next peak. We implemented such a service for a government agency which provided funding to thousands of other organisations. The end of every month saw a peak in demand as the client organisations submitted their monthly returns, and these workloads were further multiplied with reconciliations at the end of the financial year. Cloud was ideal for providing this additional capacity, first through Fordway’s managed cloud service and ultimately with the service replatformed to public cloud. Cloud is also ideal for experimentation and testing for new capability. So, for example, you can obtain instant Ruby on Rails development capacity on which to develop new services and then move them into your production environment when completed. This also minimises any risk to your production environment. Data security and DR Perhaps the most obvious benefits of cloud are when it is used for disaster recovery. For this most organisations can show cost reductions, as both AWS and Azure are on demand services which don’t  charge when virtual machines are stopped, so the organisation only pays for data storage (plus any replication costs) until DR is invoked or you are running tests. Organisations who already have a second environment for DR can extend the life of their existing infrastructure by moving this DR capability to the cloud. As well as providing an initial experience of using cloud at a lower risk than moving their production environment, consolidating a passive DR environment into production infrastructure provides increased capacity and potentially a longer life for current systems. For organisations whose DR has been based on restoring from tape, the advantages are even greater – immediate failover to the backup system. As one client, a logistics company, told us: “I don’t lie awake at night any more wondering what would happen if there was a fire, as I know everything is being continually monitored and our data is safe.” Limited internal resource/shortage of qualified staff Cloud has specific advantages for organisations who are struggling to retain skilled staff – a particular challenge for many medium sized businesses – or have a small in-house team. They are unlikely to have the diverse range of skills required to run a complex IT infrastructure, and hence either have to take a ‘best guess’ approach or turn to external experts on a regular basis. 
Carefully chosen use of cloud services will enable them to focus internal resources on the most business critical services, as even IaaS provides, and in most cases automates, basic operational functions, freeing up internal staff time for more useful activities. One of our public sector clients chose managed cloud (IaaS and DaaS) to provide them with a flexible IT infrastructure, which we manage for them. Their in-house IT team handles first line support, while we are on hand to handle what they call the ‘stickier’ issues. We also record all service requests and incidents in a customer portal, so that their IT director can log in at any time and check their status. It’s a seamless partnership in which everyone plays to their strengths. Adding new applications quickly A further advantage of cloud is the ability to get new applications up and running quickly. For example, for one of our clients we set up and now host a cloud based remote desktop environment to run business-critical applications which cannot be supported internally. This enables new applications to be added rapidly to meet user demands – ideal for an organisation in a rapidly changing environment. When they suddenly needed to run a new economic modelling package at short notice, this solution enabled it to be up and running within the required timescale. Cloud, as these examples show, offers benefits in many situations. Organisations need to ensure that when developing a business case they consider the wider picture, not just the financial one. ### Filming Interviews at Define Tomorrow 2018 Define Tomorrow 2018 ### Digital transformation is transforming business We are becoming inured to the word “digital”. Digital clock, digital telephone, digital TV… It usually meant that we would be required to buy some new boxes soon. So forget it for now. But “digital transformation” means something far more significant. More than just a technology shift, it implies a profound, cultural revolution. It signifies a transformation of business and organisational actions, processes, skills and models to take better advantage of opportunities arising from the way digital technologies are infiltrating and impacting business and society. For business digital transformation has become a fashionable buzz-phrase, but it also applies more widely to governments and public sector agencies, including those tackling some of societies’ greatest challenges – such as pollution and ageing populations. In practice, digital transformation will require changes in leadership style, encouraging innovative thinking and new business models, digitisation of assets and smart use of technology to offer employees, customers, partners and stakeholders a better quality of experience. No single example can capture the full scope of this change, but compare the experience and features of shopping at Amazon against a traditional retail store; compare Uber to a twentieth century taxi service, or Airbnb to searching the Yellow Pages for a suitable bed and breakfast. Look at your own business through that same lens. If you offer products or services, is it as easy for the customer to search and compare in depth features and prices – and then order and pay – as in the above examples? This is not simply about customer-facing technology: people interact with a company through many channels. A connected and informed front office offers a better customer experience. Efficiency, agility and responsiveness are enhanced by happier, more informed employees, partners and suppliers. 
Digital transformation is being described as "the fourth industrial revolution" because it impacts every aspect of business and strategic decision-making. With such rapid change, many companies are struggling even to understand how to respond. Others have started rebuilding their organisational structures, operating models, business processes, technology, skills, and cultures to address the new reality of constant transformation. So where should we begin? A modular approach First, it is necessary for business functions to be modularised into independent microservices that can be deployed across the organisation. For example, a customer may be required to digitally sign acceptance of a purchase. If this function is presented as a module, it can be re-used across many different customer engagement experiences. As digital transformation accelerates, the library of re-usable service modules will expand. As the environment changes, business functions can be updated in real time. If, for example, e-signatures are replaced by palm scans, updating that one module means the change is automatically propagated across the entire business. Changes that might have taken weeks to roll out are now almost immediate. A similar modularity is needed for the supporting IT infrastructure. If its components are isolated in terms of the business services they provide, it becomes easier to upgrade them in stages, instead of revamping the entire infrastructure. As a result, cloud virtualisation and infrastructure-as-a-service (IaaS) become the rule, rather than the exception. Making changes and adjustments in such environments is simple, inexpensive, and immediate. With constant change in applications and infrastructure, network agility is paramount, but networks such as MPLS, and the equipment behind them, have remained largely unchanged and are the opposite of agile. Even Software-Defined Wide Area Networks (SD-WANs) are still often location, hardware, and service-provider specific. Application Specific WANs Digital transformation requires a paradigm shift. We are used to thinking of a network as connecting physical locations. Instead we must now think of it as connecting applications and shaping itself to their requirements. In application-specific networking (ASN), application endpoints define the edges and application contexts define the network properties. What is needed is a fast, easy means to launch instant, high-performance, secure, application-specific networks in the same way that today's data centres can spin up application-specific virtual machines on demand. As business services become modularised, these "AppWANs" can be spun up, segmented, and adjusted in minutes to secure and connect them to the appropriate context within the company ecosystem or over the Internet. This is being achieved through the use of software-defined overlay networks. Software-defined networking creates a "control layer" above the network's physical structure. Instead of sending out engineers to manually adjust each piece of equipment, the network manager simply defines the changes on a central controller, and software in the control layer immediately broadcasts those changes right across the network. The physical infrastructure remains the same, but the traffic flow is transformed to define a specific virtual network across that structure. AppWANs extend that principle by creating multiple discrete virtualised network layers on top of a physical network, to suit specific applications or security needs.
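The article does not tie AppWANs to any particular product, so the sketch below simply illustrates the shift in mindset: the overlay is declared around an application and its endpoints rather than around physical sites. The class name, fields and sample values are assumptions for illustration only, not a vendor API.

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AppWAN:
    """An application-specific overlay: the application defines the network, not the location."""
    app_name: str                            # e.g. the e-signature module described earlier
    endpoints: List[str]                     # application endpoints that define the network's edges
    allowed_identities: List[str]            # who or what may connect, and must authenticate
    performance_profile: str = "standard"    # latency/throughput intent enforced by the overlay
    encrypted: bool = True                   # security is part of the network definition itself

    def admits(self, identity: str) -> bool:
        return identity in self.allowed_identities

# Spinning up a segmented overlay for one business module is a declaration, not a site visit.
esign_wan = AppWAN(
    app_name="e-signature-service",
    endpoints=["esign.internal:8443", "crm.partner.example.com:443"],
    allowed_identities=["sales-portal", "partner-gateway"],
    performance_profile="low-latency",
)
print(esign_wan.admits("sales-portal"))  # True
```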
From a central console overlays can be dynamically adjusted to meet the applications’ performance requirements and to define how specific endpoints will be allowed access across the Internet and/or existing private networks. One major benefit of AppWANs is that, being abstracted above the network infrastructure, they are completely service-provider agnostic. This is a vital consideration for larger or global enterprises that need consistent, reliable networking across any number of regions and providers. Another is that security comes built into the network specification. The security perimeter and security needs are defined by the application, rather than the traditional combination of application, network and security infrastructure.  Central control also specifies the access criteria – who or what has access, and how they will be authenticated. Putting it all together Internet connectivity has put power in the hands of the consumer. Your product or service may be first class, but if it is difficult for the customer to access specifications, if the telephone helpdesk is ill-informed, if delivery is slow or the wrong product dispatched, if the payment process is too clumsy… If any part of the interaction becomes too painful for today’s customer they are probably just a few keystrokes away from finding an alternative product or supplier. There can be no weakest links in a digitally transformed organisation. There can be no separation into hierarchical silos where departments become competitors and frustrate the customer by passing blame. Instead connections must run horizontally across the organisation. Everything must be connected to everything else in order to harvest and communicate the wealth of available data. Stocktaking is no longer a weekly event carried out by people with clip-boards, it is an automated function updated in real time by Internet of Things (IoT) chips on every shelved item. Customer choices and buying patterns are also being registered and recorded along with payment details and more. All this data would be just a burden were it not for the power of artificial intelligence to mine it and extract vital market intelligence to inform ongoing business strategy. Connectivity is the nervous system that brings it all into one intelligent and responsive whole. To achieve that, it must enable every business application with the link that best serves it, and that means application-specific connection. The organisation has become a living, intelligent organism. Now you have the opportunity and the challenge, do you also have the creativity to reinvent your business and ride this beast to the front of the field? ### Common cloud security issues: How to address them As organisations continue to carry out digital transformation, cloud computing is on the increase, enabling those that adopt the technology to be on the frontline of innovation. The positives of utilising cloud infrastructures are well documented but a characteristic that often gets overlooked, much to the detriment of organisations, is security.  With businesses looking to migrate away from on-premise technology, it is essential that steps are in place to protect the vital assets that transfer to the cloud. This will also avoid any serious disruption to the business overall. Nevertheless, there seem to be key areas of cloud security that continually seem to be neglected, so here are our tips on how to resolve them. 
Knowing where the assets are Even though the flexibility that cloud offers is seen as a positive, it can also add an element of anarchy for security teams. If a large number of departments have access to a particular cloud infrastructure, it can become a minefield for security professionals trying to keep track of who is accessing what. In addition, when companies transfer offerings to the cloud, visibility of assets can be obscured, especially when departments are given more freedom to deploy and use Software-as-a-Service (SaaS) and Infrastructure-as-a-Service (IaaS) solutions. To avoid this situation, organisations should implement security solutions that include auto-discovery functionality. With this technology, businesses can conduct inventory checks in a timely fashion across networks, servers and workloads without shadow IT risks. As a result, security teams gain a comprehensive view of all deployments in the cloud, providing an accurate picture of the correlated risks between on-premise and cloud assets. Lacking in cloud security skills The cyber skills gap hindering the industry remains highly visible. There are too few security professionals around, and those that are qualified have limited time to take on more training, adding to the stresses and pressures of the job. In fact, a recent study revealed that 16 per cent of IT security professionals admitted to ignoring critical security vulnerabilities because they lacked the necessary skills to rectify them. In addition, over a quarter (26 per cent) confessed they had overlooked a critical security flaw because they had insufficient time to fix it. Furthermore, nearly two thirds (64 per cent) of senior executives claim their organisations are losing out on revenue because their teams lack the expertise to carry out what is required regarding cloud services. Matters are made worse by the sheer number of services that security teams must understand from each cloud provider. For example, Amazon Web Services has 142 services. If the organisation lacks the necessary knowledge of the cloud provider, how can security best practices be applied? One way to address this scenario is to outsource to a Managed Security Service Provider (MSSP) or software company that has the desired level of cloud competency and can guide the organisation. This can take place at the initial implementation stage of the cloud service and be wound down once the internal security team has reached a credible level of experience. Another tip is to empower individuals who use the cloud to become awareness ambassadors within the company. Through security initiatives and further training, the ambassadors can push fellow colleagues to follow better security practices, which will raise the overall security posture of the company. Securing the API 2018 has been a tough year for organisations with poorly implemented Application Programming Interface (API) security. Salesforce, Panera Bread and Venmo were among the brands to suffer highly publicised data breaches, a continuation of incidents that took place in 2017, when sensitive data on millions of users was exposed at T-Mobile, Instagram and McDonald's. The API is an integral part of the cloud infrastructure as it is the gateway or interface that provides direct and indirect cloud infrastructure and software services to users.
Because of this, developers have relied on them to support the delivery and integration of products and services. Yet there are risks associated with this, as cloud services authorise third-party access, which fundamentally exposes the APIs. This is an area of concern and a reason why security cannot be overlooked by the DevOps team. Ensuring that security by design is applied throughout the development process falls on the shoulders of the DevOps team. By following this approach, organisations will gain a clearer understanding of what is required from an overall security standpoint, ensuring the infrastructure is built with adequate authentication, authorisation and encryption, and that any known vulnerabilities are mitigated. With digital transformation fuelling much of today's cloud adoption, companies mustn't rush into deploying or migrating until security measures are addressed. Whether that security is conducted in-house or outsourced, there are measures that can be put in place to reduce most risks and help begin to build confidence in this growing infrastructure. ### US tech firms set benchmark for data privacy When General Data Protection Regulation (GDPR) laws came into force back in May, businesses across Europe had to drastically alter the way they handled their customers' data – and with similar rules now being implemented in the US, attitudes of big business towards data privacy look set to change across the globe. The California Consumer Privacy Act (CCPA) will take effect in January 2020 and start to be enforced by the Attorney General six months later. With many of the world's most powerful tech companies calling the Golden State's Silicon Valley home, CCPA is expected to have a significant impact far beyond California's physical borders. Modelled on GDPR, although less expensive to implement, the legislation is expected to receive the backing of Apple CEO Tim Cook, who spoke out in support of stricter privacy laws at the International Conference of Data Protection and Privacy Commissioners in October. Warning of the "weaponised" use of data against consumers, he confirmed that the company was "in full support of a comprehensive federal privacy law in the United States". Microsoft CEO Satya Nadella has also praised GDPR in the past, while Facebook COO Sheryl Sandberg has previously declared that the social network is "open to regulation". The original ballot initiative came about through a privacy rights organisation, but support quickly grew for a law to be passed through the normal procedures. The provisions of the actual law have ended up being less stringent than originally proposed, but they are not drastically different. If companies want a say in what any future regulations might look like, it makes sense for them to get on board sooner rather than resisting completely. The resulting CCPA is far more comprehensive than previous data protection laws, giving individuals more rights when it comes to guarding their personal information, with greater transparency around what information is collected. People will also be able to opt out of their details being sold and have more power to sue in the event of a breach. In a country with a reputation for being more litigious than others, that means businesses will need to take the new rules very seriously indeed.
Any business outside of California assuming that the legislation will not have an impact on them should bear in mind that many of the companies that will have to comply with the new rules are by no means constrained to the state, with their influence reaching right around the world. As well as Apple and Facebook, tech giants such as Google, Netflix and Twitter have headquarters in California but operate across the globe. Any changes they make to their own policies are likely to affect customers elsewhere too. Then there is the fact that California is famously progressive and tends to lead where others follow. Implementing new rules in the state could open the floodgates for the other 49 to follow. In fact, the US National Telecommunications and Information Administration recently undertook a 30-day public hearing process to gather comments on its policy options for federal legislation. While the appetite for change is clear, it remains to be seen whether there will be a conflict between state and federal rights. From a practical standpoint, though, CCPA will set a new benchmark that could quite easily be used as a model for country wide implementation – unless a federal law gets there first, of course. However any rules and regulations end up being implemented beyond California, there does seem to be a consensus that the focus on privacy is only going to spread, especially with data breaches becoming more common. This means that companies need to wise up when it comes to the way they handle any data that they collect. With the rise of AI and voice assistants, businesses also need to be more aware of the types of data they are in possession of and how they gather them. As populations become more tech-savvy, data breaches are becoming more and more damaging to a company’s reputation, while under new rules, fines for committing them will also be more severe. Being able to protect user info and react accordingly when breaches occur is now essential for any business that handles personal data of any kind. Support for California’s Consumer Privacy Act will protect the integrity of data-driven businesses across the globe, says Grant Caley, NetApp’s Chief Technologist for UK & Ireland A company that already has a GDPR compliance program can make sure its Californian customer base is covered with just a few tweaks. If future laws also use GDPR as a blueprint, those that have already prepared will be in a better position to cope. The overriding message is clear: be prepared, proactive and transparent – because tougher data protection is on the up and it is most definitely here to stay. ### 10 business benefits of an effective cloud system Why are so many businesses moving to the cloud? The answer is that its integration allows businesses to focus on what is most important: their business. And although the benefits are endless here are 10 best ways an effective cloud system can benefit your business. Sales With businesses putting a premium on customer service, it is essential to put in place a reliable system that automates every phase of the order management process. Customer-focused order management software allows businesses to not only simplify complex processes but improve both efficiency and its value. Whether it is tracking or processing orders, all of the data is in one place, which means fewer delays and assured updated information. Purchasing Powerful cloud procurement can help speed up purchase orders as well as manage and track cash flow effectively. 
Not only does identifying fiscal commitments and creating shorter lead times on back orders improve customer service, it also gives businesses more control over their financial operations. Supplier Managing supplier data in the cloud gives you greater visibility through a real-time view of supplier activities. The scalability and accessibility of data gives businesses greater commercial leverage and simplifies supplier information management, as well as making supplier data transparent and easy to share with current and future stakeholders. Inventory Inventory management software gives a business scalability, flexibility and visibility. Real-time visibility into inventory data anywhere, anytime, means you can plan for unexpected changes and meet customer demands. Furthermore, multi-site flexibility means that inventory can be tracked across many sites simultaneously, providing an efficient way to analyse stock levels and ordering patterns for appropriate re-ordering. Pick Pack and Dispatch An integrated cloud system means you can exceed customer expectations through an efficient picking, packing and dispatch process. Prioritising parts of the process – by type of product, customer or delivery service – through powerful filters and invoice rules means that you can automate your business's priorities. Automating a number of processes, from barcoding to picking and packing, removes the worry of time-consuming manual procedures and interventions. Returns A fully integrated returns process, used with an invoice manager, means you can manage all the challenges of product returns. End-to-end returns management increases visibility through a reverse logistics value chain, helps you work more efficiently through unified data, and improves overall customer service and satisfaction through simpler and faster handling of returns. Accounting Cloud accounting can help improve services while cutting costs. Files are updated in real time, all the time, so that teams can work on the same file simultaneously. With automatic updates on balance sheets and real-time reporting, account balances remain accurate and secure. Cloud technology can also aid the challenging and time-consuming task of debt management by helping to quickly and efficiently identify patterns of debt and credit risk. With reduced data entry time, no system downtime and the ability to identify shortfalls, businesses can deliver better quality information. Courier Management As your business grows, so will the number of parcels you are dispatching each day. Through a number of powerful tools, courier management software can provide efficient customer service with instant access to complete, up-to-date information about customers' parcels. This ensures on-time deliveries and no more re-keying errors, making the courier delivery process smoother from both ends. Your Business Your Way Embrace a new analytical approach by using software to generate reports. Data-gathering software can benefit your business, as decisions can be made based on up-to-date and accurate statistics and information. Data is one of a company's most valuable assets, so it must not be misused. The cloud means you can get value from your data by reviewing what orders have been processed, which stage of the pick, pack and dispatch process they are in and how much stock you can allocate. Leveraging this accurate data generated from the cloud can lead to revenue improvements and business growth.
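As a small illustration of the kind of report described above, the sketch below summarises order stages and remaining allocatable stock from in-memory sample data; in a real deployment the figures would come from the cloud platform's own export or API, and the field names here are assumptions.

```python
from collections import Counter

# Illustrative sample data; a live report would pull this from the cloud system.
orders = [
    {"order_id": "A100", "stage": "picked",     "sku": "WIDGET-1", "quantity": 3},
    {"order_id": "A101", "stage": "packed",     "sku": "WIDGET-1", "quantity": 5},
    {"order_id": "A102", "stage": "dispatched", "sku": "WIDGET-2", "quantity": 2},
]
stock_on_hand = {"WIDGET-1": 40, "WIDGET-2": 10}

# How many orders sit at each stage of the pick, pack and dispatch process?
stage_counts = Counter(order["stage"] for order in orders)

# How much stock remains allocatable once open (not yet dispatched) orders are reserved?
reserved = Counter()
for order in orders:
    if order["stage"] != "dispatched":
        reserved[order["sku"]] += order["quantity"]
allocatable = {sku: qty - reserved.get(sku, 0) for sku, qty in stock_on_hand.items()}

print(dict(stage_counts))  # {'picked': 1, 'packed': 1, 'dispatched': 1}
print(allocatable)         # {'WIDGET-1': 32, 'WIDGET-2': 10}
```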
CRM Add value by ensuring you always have an online, single customer view. With customer relationship management hosted in the cloud, you can stay on top of customer needs with a more accessible, mobile and innovative system. Leveraging the most up to date data in an omnichannel world is vital to offering the best customer service. By tracking customer interactions and managing the sales pipeline you can utilise its value to put the customer at the heart of your business. The drive for technological change and advancement has opened the gateway to previously unavailable business control and growth. When implemented correctly, an effective cloud system can be the root cause of positive business change and continued competitive advantage. ### Why augmented intelligence is integral to customer experience Much has been written on the importance of the customer experience in modern life. Customers have so much choice of products and providers and it is easier than ever for them to change supplier if their expectations are not met. This is especially true in the B2C space, but even in the B2B world it is vital to delight your customers, anticipate their requirements and any issues they might be having – if an organisation does not do that then they can be certain that any number of their competitors are happy to do so. But how can organisations go about delivering that first-class experience to their customers? With more data than ever before, how can that data be put to effective use? The answer lies in augmented intelligence, technology that enhances human imagination and delivers enormous insight to any organisation seeking to improve the customer experience. Data is not enough – it’s all about insight with CX Many organisations are now data-driven, or at least would claim to be. But in the customer-focused era we live in, collating and analysing data on customers is really not enough to deliver the experience that will keep them loyal and prevent them from looking elsewhere. For businesses in 2018 (and beyond) it’s all about being insights-driven. Insights-driven businesses comprise some of the most high-profile and disruptive businesses to emerge in recent times. Examples include Facebook, Amazon, Google, Uber, Airbnb and Netflix – these are all businesses that are founded on smart use of data and insight, but there are also older and more established organisations that can be categorised in the same way. In its recent report The Insight-Driven Business, industry analyst group Forrester categorised such organisations as ‘predators’, because they are intent on taking much of the business and profits from more traditional organisations, and indeed are perfectly placed to do so. Common to all these businesses is their on-going application of insights to decisions and customer engagements at every opportunity. Furthermore, the individuals that lead these organisations have a deep understanding of the value of insights and have put in place corporate strategies and cultures that mean leveraging data and insights is simple and embedded in almost every process. This all means that these ‘predators’ are able to give customers exactly what they want. They understand what it is that customers need and are structured in a way that makes it easy to deliver that to them. This is becoming a major problem for any business that cannot or will not adapt to this new insights-based economy – their profits will become eroded and over time their existence will be threatened. 
To become such a customer-obsessed organisation, one that puts the customer experience at the core of everything it does, insight into those customers is essential.

Augmented intelligence and insight

Augmented intelligence solutions really are at the core of insight. They can take data from all across a business, from multiple sources and in multiple formats, including the unstructured data that is usually the richest in insight. They can then use their analytic power to unlock insight that can be used to improve lead generation and the bottom line, anticipate customer requirements or potential issues, and much more besides.

Augmented intelligence will deliver a complete view of any customer, with data insight taken from everything from their most recent earnings call to premium external data sources. This allows an organisation to know and understand its customers more effectively than ever before and makes cross-sell or upsell opportunities much more accurate, targeted and likely to succeed. Because it is capable of managing and analysing so much data, the insight extracted from that data and then presented to the user is deeper and greater than anything previously possible.

In corporate financial services, for example, augmented intelligence can help an organisation with lead generation, identifying opportunities for clients and recommending the best product or solution for them. The right solution will look at data on competitors, partners and markets and identify catalysts that provide additional upsell or cross-sell opportunities to existing clients, and fresh approaches to prospective clients. It will also factor in the context of each opportunity to recommend the best course of action to take. In fact, it will prioritise each opportunity and provide actionable recommendations to account handlers.

These are all actions that will help drive the customer experience that modern customers demand. The Forrester report cited earlier revealed that 90% of businesses are not deploying insights, and that is a situation that has to change if those organisations are serious about customer experience and want to protect their market share. Becoming an insights-driven business should be the number one priority for almost every organisation. The most effective way of beginning and realising this goal is to embrace the right technologies, especially augmented intelligence that enhances and improves human imagination.

### Keep learning and beat the robots

Our reliance as a society on technology is now almost total. Few of us would be able to cope if all of the innovations we use every day, from our smartphones to our high-speed internet, ceased to operate. However, improvements in technology and its ability to replicate human actions are problematic for some. In particular, the belief that Artificial Intelligence will develop to a point where machines will take over our jobs, precipitating an exponential rise in unemployment, is creating anxiety for many. Recently, these concerns have been voiced by extremely credible and well-informed sources, pushing the issue further up the nation’s agenda. The Bank of England’s Chief Economist, Andy Haldane, gave a stark warning this summer about the threat of long-term unemployment which he believes will be the result of Artificial Intelligence.
Haldane spoke of a widespread "hollowing out" of the jobs market, arguing that the dark side of the current technological revolution will be experienced “…when we have machines both thinking and doing - replacing both the cognitive and the technical skills of humans." In my view, Haldane’s comments are much less about scaremongering around technology and more an urgent and timely reminder that we need to get our act together as a country if we are all to fully benefit from the fourth industrial revolution.

Talking about a revolution

We must, first of all, do more as a nation to address the ever-widening skills gap. This requires an informed and practical government-led strategy, created in close consultation with businesses and implemented as a matter of urgency. UK companies are already suffering from a serious shortage of crucial IT skills, and with technological innovation accelerating so rapidly, we could soon find ourselves with an almost unbridgeable gap, completely lacking in the skilled people we need to power our future growth.

Any government strategy designed to counter the skills gap must, of course, be wide-ranging if it is to be impactful. However, I firmly believe that we need nothing less than a skills revolution in the UK if we are to remain globally competitive and not see our citizens suffer unemployment as a result of AI and automation. We have a pressing need not only to address the STEM skills gap, but also to completely rethink how we view education and learning.

Lifelong learning will save us

Future disruption to the job market caused by the automation of cognitive skills means that lifelong learning will become key to ensuring technology creates opportunities and not unemployment. This will require a different approach to education and training, with businesses, entrepreneurs and educational institutions closely collaborating to ensure we meet the rapidly changing needs of students.

Unfortunately, when it comes to lifelong learning, the UK is moving in completely the wrong direction. Recent research by the Sutton Trust has shown that part-time study in England has collapsed over recent years, with numbers of part-time students falling by 51% between 2010 and 2015. This is of concern not only for businesses that need to employ people with up-to-date skills but also for individuals who need to ensure their continued relevance in an increasingly disrupted and fast-paced jobs market.

As the job market radically changes, so too must the way we train and upskill. Meeting the needs of businesses and the economy as a whole therefore rests on widening access to higher-level education and promoting different and diverse routes to study that will appeal to more people. Technology will almost certainly help with this. The promise of free, widespread access to high-quality learning is a very attractive one, and the MOOC (Massive Open Online Course) phenomenon has huge potential for those looking to increase educational opportunities throughout their lives. MOOCs can be an online destination for those looking to increase skills as they progress through life, enabling them to join a community of like-minded fellow learners while working towards some type of accreditation.

Alongside these structural changes, we also need to create a different mindset, so education becomes something we dip in and out of throughout our lives, rather than something we do when young and then forget about.
In a world where an estimated 85 per cent of the jobs that will exist in 2030 haven't even been invented yet, the only way we can prepare for the future is to be as cognitively flexible as possible, open to new ideas and challenges and willing to learn new things and develop new skills throughout our lives. If this sounds daunting, it really shouldn’t. Studies consistently show that adult learning has benefits that extend far beyond our ability to navigate a dramatically different job market. Evidence suggests that it can help foster everything from a feeling of purpose in life to greater levels of wellbeing and an increase in life satisfaction. It has even been argued that continued education throughout life contributes to a ‘cognitive footprint’ which may delay the onset of dementia. Creating this culture of continuous learning will not be easy, but it is completely achievable and will make the difference between success and failure in the face of huge technological change. Given the right mindset and the right tools, that change could be tremendously exciting.

### IT departments are still grappling with cloud and container technology

Blockchain is attempting to recover from a considerable disservice—the rise and fall of cryptocurrency (as of this writing). At Bitcoin’s zenith, so much hype was generated by the financial world that its real value was obscured. And as cryptocurrencies have retreated, some pundits wrote off blockchain with guilt by association. That’s unfortunate, because blockchain provides something truly new in IT—an open, distributed ledger that can record transactions in a verifiable and permanent way. Moreover, blockchain—like cloud, containers, and software-defined infrastructure—needs to be part of real IT conversations, not just noisy buzzing from the HODL crowd.

Blockchain offers genuine advantages in many industries, even as financial services put it through early production use cases. These range from cross-border payments and share trading, to smart contracts and identity management. However, despite endless industry hype for its potential, SolarWinds research, the "IT Trends Report: The Intersection of Hype and Performance," revealed that just 8% of IT professionals surveyed were prepared to hail it as the most important technology to an IT organisation’s strategy today. What’s more, this number only increased to 15% when IT professionals were asked about the most important technologies for digital transformation over the next three to five years.

However, the results from the C-suite told a different story. In fact, the survey exposed a disconnect between the business leaders charged with setting the company’s vision and the IT professionals tasked with executing that vision. Where the C-suite considers artificial intelligence (AI), machine learning (ML), and deep learning to be key elements of digital transformation, IT professionals are looking toward the technology and processes that underpin continuous integration and delivery—which ultimately enable enhanced performance and digital experience in today’s environments. Let’s dig into those a little deeper.

Cloud at the core

It comes as no surprise that cloud computing and hybrid IT will remain IT professionals’ top priority for the next five years. These meet today’s business needs while serving as the foundation for, or at least being closely related to, trending technologies like blockchain and AI.
In fact, 95% of IT professionals surveyed indicated that hybrid IT and cloud is one of the top five most important technologies in their organisation’s technology strategy today, and 66% listed it as their number one. If you haven’t already begun consuming different service delivery models—like moving from Microsoft® Exchange™ Servers to Office 365® and migrating more of your mission-critical applications to the cloud—it’s probably front and center on your roadmap.

It’s also likely driving new nomenclature. For example, “observability”—leveraging combined metrics, logs, and application traces to understand and control system behaviour—has been added to regular conversation about an organisation’s cloud monitoring strategy. As always, IT is expected to maintain a high degree of fidelity, and to monitor with discipline, while carrying forward the same level of granularity and source of truth that has existed in on-premises environments for decades. Though painful at first, finally establishing a baseline of observability across all elements of hybrid IT pays huge dividends.

Containers for speedy delivery

The “IT Trends Report” shows a substantial uptick in new technology adoption and increasing mastery of new technology, containers in particular. IT professionals are throwing their hard-won—and often shrinking—budgets at containers after discovering their utility in addressing the challenges of cloud computing and hybrid IT. In many ways, they’re “containers in the coal mine,” reassuring skeptical IT organisations that they’re not just hype. Some 49% of IT professionals surveyed view them as a very important technology priority today. And so they should.

While the array of possible use cases for blockchain, AI, ML, robotics, and IoT is grabbing headlines at the moment, the reality for IT professionals on the ground is different. IT still must deliver clear business value, and until the application of these technologies is clearer, their potential falls slightly shy of getting onto the project board today. IT professionals, under increasing pressure to show immediate value, should continue to prioritise container deployment, both from an investment and a skills-development perspective.

In many ways, containers are a catalyst—insert storage joke here—that entrain multiple new skillsets: automation, scalability, distributed applications, and orchestration. They drive not just high-level understanding of these techniques, but real-world production experience with technology that is key to an organisation’s ability to transform. Your first step is to check whether IT is already working with the technology. If so, get to know who is involved and engage with them. If IT is not working with containers, it’s simply a case of going to Docker® and grabbing Docker CE for Mac® or Windows® for laptop-based experiments, learning from the tutorials provided by Kubernetes® (especially Minikube), or consuming a platform like Amazon® ECS. There are also many communities, like GitHub®, that allow container experts to freely share their knowledge.

The hype still lives on

It’s natural for business leaders to be eager to implement new technologies that promise to meet the growing demands of their organisations, unlock additional revenue streams, and set them on the path to digital transformation. Of course, these emerging technologies should be on every IT professional’s radar, but never at the expense of laying the all-important groundwork—cloud, hybrid IT, and containers.
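For readers who want the laptop-scale experiment described above, here is a minimal sketch that assumes Docker Desktop (Docker CE) is installed locally and that the Docker SDK for Python has been installed with `pip install docker`; the image name and print-out are just illustrative.

```python
# Minimal container experiment using the Docker SDK for Python.
# Assumes a local Docker daemon is running (e.g. Docker Desktop).
import docker

client = docker.from_env()  # connect to the local Docker daemon

# Run a throwaway container, capture its output, then remove it.
output = client.containers.run(
    "alpine:latest", "echo hello from a container", remove=True
)
print(output.decode().strip())

# List whatever is still running: the kind of basic visibility a
# hybrid IT monitoring baseline would build on.
for container in client.containers.list():
    print(container.short_id, container.image.tags, container.status)
```

From there, the Kubernetes tutorials (Minikube in particular) cover scheduling the same images across a cluster rather than a single laptop.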
The IT professional is ultimately responsible for keeping businesses on a practical track when it comes to implementing technology. Although blockchain, AI, and ML are currently a top priority for business leaders, engineers in operations are optimising cloud/hybrid IT environments now with the tools at hand. They’re creating a critical pathway, enabling them to seize the potential of these emerging technologies as soon as their organisations are ready. It’s about prioritising investments in technologies that will deliver business value well beyond IT.

### Using cloud analytics to navigate stormy market conditions

As speculation continues to gather about the outcome of the UK’s Brexit negotiations, the business community remains unclear about what the UK’s impending departure from the European Union will mean for companies next year. Alongside this, digital disruption is shaking up traditional working models, and companies are becoming more aware of the need to make major IT investment decisions to maintain their competitive edge. These are uncertain times for business leaders. Against this backdrop, the important role of analytics in driving insights, efficiencies and smarter decision-making could not be clearer, yet many companies are still lagging behind in this area.

The road to a data strategy

Business intelligence tools play a critical role in helping companies large and small to gain better insights into data. Increasingly, these tools and their associated strategies are the cornerstone of good financial data management and essential for delivering accurate planning, reporting and forecasting. It’s time for business leaders to start a serious conversation about their company’s data analytics strategy to prepare for the challenges ahead.

One of the main reasons for this is that data is now everywhere and can be used to power all aspects of the business infrastructure. Across sales, marketing, finance, IT and procurement, to name just a few, data on purchasing, processes, performance and customer preferences is easily available and incredibly valuable when correctly analysed and interpreted.

Big data is now also big business, with recent research indicating that big data and data analytics will be a key pillar of the future of the UK’s digital economy. The industry is expected to add £241 billion to UK GDP by 2020 as well as create 157,000 additional jobs, according to official figures from industry body techUK. Companies have expressed great interest in the technology, but there is still slow progress around implementing more sophisticated programmes.

Progressing in the data journey

Our recent Data Surge Report revealed that almost three-quarters (72 per cent) of decision makers in UK companies believe their company’s response to big data has been positive, yet only four per cent recognise that managing big data will lead to less administration. However, when asked about the importance of data, 96 per cent of business chiefs said they understood its importance for their company’s future.

One of the biggest concerns with implementing data analytics is the fear around effective security and management of sensitive information, despite the fact that these systems are much more secure than the manual processes they replace. This issue is frequently cited by decision-makers as a stumbling block to moving away from inefficient and insecure - yet trusted - manual reporting systems, often unnecessarily preventing companies from moving forward and making the savings they need.
Still, many organisations remain at a very early stage of development. Many oversee environments where operational reporting from business systems is carried out ad hoc rather than for strategic purposes. Overall, organisations generally lack defined processes for analytics, performance metrics and decision-making. This in turn often leads to disconnected decision-making at a localised level, which can inadvertently trigger serious business issues.

At the other end of the scale, fully adept businesses are reaping major benefits from mastering analytics. In these organisations, data is seen as a strategic asset that has the power to generate revenue and sharpen competitive edge. As a result, an organisation at this stage will deploy a full range of analytics and BI applications to support cross-functional and enterprise-wide decision-making. Additionally, companies at this stage will start making use of Artificial Intelligence to plan ahead, identifying problems before they can gain a foothold.

The benefits of cloud-based analytics

Increasingly, we’re seeing businesses choose cloud environments to manage critical information and business processes. From an analytics perspective, all the major technology companies are now advocating the cloud approach because it is cheaper, faster and a more efficient way to operate. There are many other operational reasons why embracing cloud analytics can enable faster decision-making. You can say goodbye to reporting bottlenecks, with users able to log in, edit and update reporting dashboards without sharing passwords and waiting their turn. Information can be collated, visualised and acted upon at the touch of a button, instead of at the end of a complex and time-intensive reporting process. Best of all, cloud environments are super-fast, removing the painful, sluggish on-premise processes that drive decision-makers to distraction when there has been poor investment in IT.

So, as uncertainty continues to grip the business community, it’s more important than ever that companies take decisive action to implement analytics to spot problems before they cause damage to the business. Following the cloud-based route is an effective way to gain all the benefits of data analytics without the delays and complications associated with restructuring the IT department. Right now, the future may look stormy, but with cloud analytics you can ensure your future remains bright.

### How Essential is Cloud for New Startups

Startups and cloud computing have always been in some ways very connected. However, the world is changing. How much of an impact did the cloud have on the startup boom, and how will it help startups survive the new digital era?

Cloud and the Startup Boom

We have been living in an era of fast technological change. Some still remember - as if it was that long ago - that only 20 years ago telephones were still stuck to the wall and VHS tapes were a big hit. Global connectivity and digitalisation were just dreams - even some of the industry leaders thought it ridiculous to think that we would all be connected by this global network. However, things have changed. If it looks like the world has changed a lot on the surface, you only have to look slightly below to see just how large that change is. As technology evolved, so did business. Trends like cloud computing and mobile devices have shifted the playing field more towards startups.
While startups used to struggle to compete with the titans of various industries, the cloud allowed them to compete as equals. Startups got access to the same tools and infrastructure that large enterprises had. Through cloud computing, small businesses had a unique opportunity to operate entire networks without even a single piece of hardware. Back in 2015, startups were equal to large businesses because of the cloud.

The New Digital Era

Companies around the world - no matter their size or industry - are embracing new technologies, new business models and a globalised economy. From Airbnb to various learning apps, the change and transformation brought about by the digital era are obvious. So many technologies have emerged that we could only dream of just a few decades ago. Many digital companies recognise that cloud was one of the most important drivers of this digital transformation. Businesses across the world confirm just how essential it was in helping them become what they are today. The agility of the cloud enabled companies to treat change in the market as routine rather than as something scary or dangerous for their business.

“It’s cost efficient, scalable and very accessible - all of the traits that startups loved - and it adds tremendous value to the digitalisation of the way we are doing business across the globe. But organisations are shifting towards different uses of cloud computing. For instance, cloud-native applications are designed especially for cloud architectures. Studies show that the number of new business applications will only grow in the future,” says Jonah Ames, an IT specialist at Lucky Assignments.

Managing the cloud environment is another issue, and organisations are tackling it as a way to boost digital transformation. Experts believe that despite the gloomy economic predictions, new and recently formed startups will lead the next generation of cloud computing, with new inventions and technologies that will further the digitalisation of business. Some take a different view - that the startup boom is over simply because the new technologies companies are using are too expensive and unavailable to startups. But everything depends on the stance an expert takes on this issue.

Could this be the End of the Startup Era?

This is a common question nowadays. For one thing, cloud is what made the first startup renaissance happen. But some people say that the world is shifting again. Cloud still holds the same relevance as it did before, but there are new technologies on the horizon. New technology, by its expensive, complex nature, tends to favour large enterprises, and this may be the reason startups won’t be as successful in the future as they were in the past decade. Experts say that the world simply favours the big over the small and that this is the reason for this swing-back. Businesses and executives will be the rulers of the new age rather than entrepreneurs. This will happen not because of one reason but because of multiple problems and issues that have arisen in recent years. The big markets and opportunities for big enterprises are too difficult to get into from the standpoint of a startup. Areas like AI, AR and VR, drones and so on are all very restrictive and expensive. It all comes down to the resources that startups, in essence, don’t have.
Big names have already established some sort of dominance in these fields, and they are becoming harder to break into in any meaningful way. However, some think that not all is grim for startups. They believe that precisely because of this change, startups and young, ambitious people will come up with new ways to make waves in tech. Startups are, by nature, more adaptable and flexible than large enterprises, and that is exactly what makes them suited to this change.

### Multi-Cloud is the Answer to Black Friday and Beyond

For many, Black Friday represents an opportunity to get the best shopping deals of the year. For enterprises, like those in retail, financial services, and shipping, it is where IT teams test their mettle. Applications, infrastructure, and everything in between need to function flawlessly, scale seamlessly, and provide customers across the globe with an excellent experience. Poor performance or an outage — even if just for a minute — can cost companies millions of pounds.

In years past, IT teams required excess hardware to sit idle all year in their data centre until this fateful weekend. Enterprises would spend a small fortune to have extra servers, routers, firewalls, and load balancers collect dust until they were needed during Black Friday. Today, much of the data centre has migrated to the cloud. The flexibility, scale, and consumption model have all played a part in the adoption—so much so that the question is no longer whether enterprises should migrate to the cloud, but how many clouds they should use.

The concept of multi-cloud, leveraging multiple cloud infrastructures in a single heterogeneous environment, is already a leading initiative for many enterprises. Multi-cloud delivers choice and flexibility. Enterprises can choose the cloud and services that are best suited to the company. Cloud A may have proprietary AI and machine learning that can improve the operational efficiency of logistics. Cloud B may provide better performance for customers in Asia, where Cloud A has no presence, and Cloud C may be a cost-effective contingency plan for Clouds A and B in the event of an outage.

This is the dream, but it is not the current reality. Enterprises are adopting multiple clouds, but have not yet achieved multi-cloud. Instead of managing one data centre, IT teams are managing additional cloud silos. Each cloud works differently from the next. The value of flexibility and choice is lost in the mire of complexity, but it is not the fault of the cloud vendors. Spinning up compute resources on AWS and Azure is similar and simple enough, but deploying applications with the security and high availability required for high-demand usage (like Black Friday) is where the similarity ends. Application services, like load balancing and WAF, are opinionated about infrastructure. They are cumbersome to configure and make delivering applications extremely rigid. Traditional application services divide applications, budgets, and IT teams. Instead of managing just their data centres, enterprises now have to manage their data centres and multiple clouds without consistency.

However, true multi-cloud is attainable, and it does not require a joint Amazon or Microsoft solution or some crazy custom wizardry. All that is needed is a consistency layer of application services that can be deployed across clouds and data centres. IT teams should not deploy and scale applications through the lens of infrastructure.
They should quickly and easily deploy applications wherever it best suits the business. Abstracting away infrastructure-specific application services gives enterprises the advantage of multi-cloud, and this flexibility is what will transform how IT teams address Black Friday and other demand surges.

### Blockchain Security Is A People Problem

Blockchain was hailed by some as a ‘truth engine’ but, according to Melanie Jones, Product Director for cybersecurity portfolios, there is already a spanner in the works, and those with the skills to protect valuable data are finding themselves in one of the world’s most sought-after professions.

If there is one thing we know about human nature, it is that if something is presented to us as unbreakable, unhackable, unsinkable or inedible, there is always someone with the aptitude and audacity to crack it. So it goes with blockchain. Blockchain’s immutability is built into its peer-to-peer, node-based transparency. In theory, the decentralised system, distributed across its network of users/nodes, should be impossible to compromise, as any attempt to alter the data would be instantly recognisable and investigated by the pool of node owners (active users who validate transactions are also known as miners).

Blockchain’s universal adoption as an open ledger by the cryptocurrency markets, sadly, has already proved that - despite its sophisticated software - the open ledger is open to compromise. Since 2014, over $1.4 billion worth of cryptocurrency has been stolen from exchanges by hackers. Some of the targets include popular crypto trading brands such as Coincheck, Mt. Gox and BitGrail. Unsurprisingly, it is not the software letting the side down - it is us.

By design, blockchain cannot be hacked, but its weakness is often at the point where its systems connect with the real world in software or applications. For example, ‘hot wallets’ are vulnerable to hackers. These wallets function like a digital ATM - they are internet-connected applications that store cryptographic keys (needed to access cryptocurrencies). Wallets operated by cryptocurrency exchanges like Mt. Gox and Coincheck have become the scene of 21st-century bank robberies. Cryptocurrency exchanges are increasingly claiming to store their customers’ money in “cold” wallets - on storage devices disconnected from the internet - but as recently as January this year, $500 million in cryptocurrency was taken from the Japan-based company Coincheck.

The nodes can also be compromised from within, either by a hacker taking over a node, or by a legitimate node owner working against the interests of the system. Emin Gün Sirer and his colleagues at Cornell University have shown that what they call a ‘selfish miner’ can subvert a blockchain from within - even with less than the crucial 51% of the network’s mining power. Also known as a majority attack, a 51% attack occurs when a malicious miner gains control of over 50% of the blockchain network’s hashrate, enabling them to reverse transactions, halt payments, or prevent new transactions from being confirmed. A 51% attack is not easy to pull off, requiring a sizeable amount of computing power and therefore a large amount of electricity to accomplish. The construction of mining facilities in places like China, built near dams to benefit from cheap electricity, could become the Achilles heel of blockchain. Smart Contracts, computer programmes that can automate actions/transactions, are also at risk of compromise.
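Before looking at how smart-contract compromise has played out in practice, it is worth putting rough numbers on the majority-attack threshold described above. The sketch below uses the simple "catch-up" probability from the original Bitcoin whitepaper: an attacker controlling a fraction q of the hashrate, starting z blocks behind the honest chain, eventually overtakes it with probability (q/p)^z where p = 1 - q, and with certainty once q reaches 50%. The hashrate values are arbitrary examples.

```python
# Why the 51% threshold matters: the probability that an attacker who is
# z blocks behind ever overtakes the honest chain (Nakamoto's catch-up model).

def catch_up_probability(q: float, z: int) -> float:
    """q: attacker's share of total hashrate; z: blocks behind the honest chain."""
    p = 1.0 - q
    if q >= p:          # at or above 50%, the attacker eventually wins
        return 1.0
    return (q / p) ** z

for q in (0.10, 0.30, 0.45, 0.51):
    prob = catch_up_probability(q, z=6)
    print(f"attacker hashrate {q:.0%}: P(overtake from 6 blocks behind) = {prob:.4f}")
```

Below the threshold the chance of success decays exponentially with every confirmation, which is exactly why confirmations are trusted; at 51% that guarantee disappears altogether.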
Over $80 million was stolen in ether (Ethereum’s cryptocurrency) from the Decentralized Autonomous Organization (DAO) in 2016. The blockchain-based investment fund was forced to engineer the money back by creating a ‘new history’ that replaced the stolen transactions. Not great.

So how can companies apply the cybersecurity lessons we’ve learned to ensure that blockchain delivers on its promise of a secure, decentralised public record? To fight fire with firewalls, businesses are going to need to skill up. Cybercrime has become mainstream, and it’s no longer mainly small-time crooks or isolated hackers causing chaos for dark web kudos. Organised criminals have spotted the potential rewards and, as well as targeting cryptocurrency in the modern-day equivalent of a bank heist, they are now hacking for data - which is a currency all of its own. Data is what drives everything and it’s what the hacker wants, because they can sell it or threaten to share it, blackmailing the organisation for financial gain.

Organised hackers are starting to think strategically beyond phishing and pillaging bank accounts. As we’re seeing played out in our newsfeeds, criminals - and unscrupulous political operatives - are also taking advantage of blockchain’s system vulnerabilities at the international level. New approaches and skills are urgently needed to counter this evolving cyber-criminal landscape. Job descriptions are being updated and expanded to source individuals with experience and know-how from law enforcement and intelligence to technology, coding and analytics.

According to the latest global information security workforce study from (ISC)², there could be up to 1.8 million information security-related roles unfilled worldwide by 2022. In Europe, the shortfall is projected to be about 350,000, with the UK’s share of unfilled cybersecurity jobs expected to be around 100,000. We are doing our bit. Next month, in association with the UK government’s new apprenticeship programme, Global Knowledge Apprenticeships and Qufaro will be launching the Level 4 Cybersecurity Apprenticeship at Bletchley Park - the world-renowned home of British cryptography triumphs.

In today’s connected, cloud-based society, the Internet of Things means your central heating system is as much at risk of a cyberattack as your laptop. The next generation of codebreakers will be coming to Bletchley Park with a different enemy in mind: the widespread threat of invaders who can access our banks and businesses from the tiniest ‘chink’ in our cyber defences. Much as we’d all love the utopian idealism of blockchain to be the ‘golden symmetry’ of the internet, it is far more likely that the resourcefulness and know-how of individual cybersecurity professionals is what’s going to keep us safe - not just from organised crime but from human nature.

### Graph Technology: The Secret Sauce AI Is Missing

We hear a lot about AI (Artificial Intelligence) getting ever more powerful and taking over the world. And it’s true that AI is getting better, but it’s still very domain-specific and actually quite brittle. And, potentially, rather error-prone. Computers are only ever as good as what we give them to work on – GIGO (Garbage In, Garbage Out), after all. Consider all these sexist AIs out there: US researchers trained an algorithm on a set of photos of people in kitchens – where it so happened more women were visible.
Over 100,000 images from the internet were analysed – and the algorithm’s assumption that only women appear in domestic settings grew stronger, amplifying some very questionable biases. A similar problem cropped up for Amazon, which had to turn off a robot recruiter that had been trained on too many male CVs and so was biased against women applying for technical jobs at the firm.

Clearly, we need smarter AI to avoid such social issues, and the missed business opportunities they represent. Just throwing more hardware at the problem, however, can’t be the only way to optimise for AI. Computers can’t understand context, for one thing, and you still need to be able to fit all of your data into storage before you can run any useful AI calculation. The problem is: how do you get the right data when it is missing one crucial axis – relationships? When you think about it, real business insight is based on how things are connected.

The good news is that graph technology – whose sine qua non is working with relationships – can be a real boon here, as there are lots of problems where graphs can help feed the right data into your AI software. As a result, there are lots of ways in which graph databases can start speeding AI work up – a lot – not by just throwing more compute power at the problem, but by throwing the right compute power at it and introducing a data structure optimised for relationships.

The power of connections

Plus there is something amiss with Machine Learning. Too often we’re training these models to predict things via a misleading, unconnected view of the world. We are missing nuance – how things are related. As James Fowler, author of Connected, notes, if I want to predict whether you are a smoker or not, I can either get all the facts about you, namely name, age, medical history, demographics, etc. – or I can get to know whether the people in your social graph are already smokers. Of those two options, the social graph lets me predict with a much higher degree of confidence whether you are (or will be) a smoker. Astonishingly, this is true when you are looking not just at your direct friends, or friends of friends, but at friends three hops out – friends of friends of friends. If a social graph can do that, imagine what it can do working in lock-step with AI.

There are two real, concrete examples I can give. The first is if you’ve got a very large dataset and you want to ask it a simple question: have I seen a customer like this before, and what did they do next? To establish that by brute-force computation would mean having to look at every customer you’ve ever seen, and you might have tens or hundreds of millions of customers. But that’s not how your brain would make that decision. Your brain would leverage context and filter down: of the million customers you’ve seen before, here are the hundred that are the most similar. Only then would you start reasoning to say, “she was looking for something similar and this is the decision she made, and these people tend to do the same.” You’re taking that process of filtering down using context to reach the right subset to analyse against – a difficult calculation to make in the AI space. This ability is what’s called collaborative filtering, and it can cluster your search work in a far more intuitive and performant manner than the alternatives, helping to accelerate your AI/ML project. My next example is how graph databases can filter by helping you avoid trying to pack everything into a giant matrix.
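Before turning to that second example, here is a toy sketch of the "three hops out" idea above: score how likely someone is to be a smoker from the share of known smokers within a few hops of them in a social graph. The graph, the labels and the scoring rule are all invented for illustration; a graph database would run the same kind of traversal at far greater scale.

```python
# Toy friends-of-friends traversal: what share of the people within
# max_hops of someone are already known smokers?
from collections import deque

friends = {
    "ana":  ["ben", "cara"],
    "ben":  ["ana", "dev"],
    "cara": ["ana", "dev", "eli"],
    "dev":  ["ben", "cara", "fay"],
    "eli":  ["cara"],
    "fay":  ["dev"],
}
smokers = {"dev", "fay"}

def within_hops(graph, start, max_hops):
    """Breadth-first search returning everyone reachable in at most max_hops."""
    seen, queue = {start}, deque([(start, 0)])
    while queue:
        node, dist = queue.popleft()
        if dist == max_hops:
            continue
        for neighbour in graph[node]:
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append((neighbour, dist + 1))
    seen.discard(start)
    return seen

def smoking_score(person, max_hops=3):
    circle = within_hops(friends, person, max_hops)
    return len(circle & smokers) / len(circle) if circle else 0.0

print(f"ana: {smoking_score('ana'):.2f}")  # 2 smokers among the 5 people within 3 hops
```

The same traversal, run against customers and purchases instead of friends and smoking, is essentially the context-based filtering step described in the first example.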
Graphs treat relationships as first-class data, which means you can easily reach other people or objects that share those relationships, far faster than you can with conventional column-and-row data structures. This allows you to move your AI/ML endeavours from a batch process – let me have my computer just crunch all these calculations for a week – to something far more real-time. You can then start looking at combinations of things, seeing how they relate to each other and the types of relationships they have, so that you can start to generate real understanding.

Given all these advantages, it’s clear that graph technology could be the missing link you’re not yet taking advantage of – one that can help you better store and understand your AI training data so that you can answer that amazing next question: the one you don’t know you have yet.

### How HR can benefit from AI implementation

Technology, especially AI, is really changing the way we work and has huge potential to improve HR processes. Whether it’s streamlining laborious tasks or analysing information with greater effectiveness, AI is ensuring that employees work smarter, not harder. Many organisations are already harnessing the power of the technology, and it shouldn’t come as a surprise that AI spending is on the up. The technology is being adopted by many organisations across a wide range of industries, including human resources departments.

The benefit of artificial intelligence is clear, but there is also a huge worry that the technology could take away jobs. However, HR managers needn’t panic: the technology will improve areas such as recruiting, assessments and management practices, but it won’t entirely take over the responsibilities of HR managers and the department itself.

Ensuring HR remains human

The technology slots very easily into daily working practices and can help HR departments with compliance issues as well as employee onboarding and expense claims. Looking more closely at expense claims, machine learning technology is well placed to help companies enforce their expense policies. This saves HR teams from sifting through endless piles of receipts and searching for VAT numbers and dates. AI engines have the ability to read receipts and audit expense claims, along with flagging irregularities for human interrogation and sign-off. This ensures that the claims submitted to HR departments are precise and legitimate, as well as rejecting those which aren’t – like claiming two meals on the exact same date. Ultimately, this relieves the HR department of time-consuming tasks, helping it to bring more value to other activities such as training and staff retention.

With automated AI engines carrying out calculations, enforcing compliance and verifying information, HR managers will be better equipped with more reliable and accurate data. This eliminates factors such as human error and offers a greater level of clarity around total business spending and claiming behaviours. This deeper level of insight will enable HR teams to make more educated decisions and evaluate policies and spending habits.

AI won’t replace jobs

There has been significant value put on AI’s ability to transform the HR department, and 66 per cent of CEOs feel it will have a positive effect on employees and organisations as a whole. But how will AI influence the roles of HR managers in years to come?
For many HR departments, automation is removing the need for people to perform physical checks, and with this we’re seeing a job transition rather than a job loss. The technology is enabling HR professionals to improve staff experience, helping to lower costs and gain highly beneficial new insights. This focus on value-added contributions is helping HR professionals to work more openly with their colleagues in other departments, such as accounting and operations. By converting data into practical information, HR managers can easily analyse it and proactively contribute to strategic business decisions.

Alongside the automation of tasks, AI is also improving the accessibility of areas such as duty of care. For many HR personnel, verifying drivers and vehicles is a massive burden which impacts their duty of care responsibilities both financially and environmentally. Failing to adhere to these rules and regulations can lead to financial penalties, which HR professionals are keen to steer clear of. Often, validating employee driving licences can be a laborious process which involves filling in and managing countless forms in order to gain access to the DVLA’s database. Automating this process through an expenses system means that checks can be carried out automatically – saving the HR team much time and money.

HR in years to come

It doesn’t come as a surprise that technology and internet-enabled devices are set to create more interesting and personalised experiences as time goes on. Gartner research suggests that by 2022, 80 per cent of new mobile devices will have built-in AI functionality. This development will affect HR departments, and we’ll very quickly see the roll-out of optical character recognition (OCR) technology, enabling HR professionals to easily edit and search different types of documents, like scanned paper documents and pictures, and giving teams the reliability and performance they desire. We’ll also see greater intelligence from smartphones when performing specific tasks. For example, if a member of staff is driving, the device may help them to action a claim for their mileage, or automatically produce an expense claim by scanning a receipt through the camera function.

Ultimately, the uptake and implementation of AI and machine learning technology has the power to create more opportunities than ever before. The speed of AI deployment throughout business processes shows no sign of slowing down, so it’s vital to integrate these automated tools into HR teams in order to boost efficiency and productivity.

### 5 Key Technologies That Are Driving IoT Development

The Internet of Things (IoT), under the influence of several key technologies, is on the verge of booming into the next major technological wave. Its potential to redefine our lives is hard to overstate. For instance, heart patients traditionally need to visit their cardiologist regularly so their heart rate can be recorded and related tests performed. With IoT, however, these patients can provide their physician with hourly updates without needing to make a trip to the clinic. They can wear an IoT-connected heart monitor that allows their physician to assess the information periodically and suggest the right course of treatment. For this to happen, and for IoT to assume its position as a potent force, it needs support from various technological developments.
These technologies do not necessarily exist to support IoT, but as they advance they will massively boost IoT innovation as a whole. Here are five technologies that are driving the development of IoT.

Cloud Computing

IoT is set to produce a significant volume of data, and as such you will need considerable capacity not only to process but also to store this data - and this is where cloud computing comes into play. Cloud computing is the only technology that boasts the potential to quickly and faultlessly process such a significant volume of data. For instance, where numerous smart devices transmit crucial health data to physicians from across the globe, enormous data volumes are produced. Unsurprisingly, only the cloud can process such masses of data effectively. Several significant innovations have made cloud computing one of the most potent IoT drivers. Identity management platforms are one such solution, offering data security. What’s more, the cloud is becoming ever more scalable and efficient. In an effort to leverage these benefits, numerous cloud-based platforms are under development. This will ensure easier exchange of data between platforms, since IoT is not confined to desktops, laptops or even mobile devices.

Marketing Automation

The main proponents of the activities contributing to the rise of IoT as a dominant force are, of course, multinational tech giants that want to gain commercially. IoT has the potential to offer a substantial volume of information on customers, such as their hobbies, preferences or even what devices they use. International companies can find such data more than valuable, as it can help them customise and sell their products and services to fit their market. IoT can also effectively help such firms to generate customer-focused items. Currently, software developers are working to produce marketing automation software which can automate marketing procedures such as customer segmentation, customer data integration and campaign management. Moreover, there is massive investment going towards creating intelligent marketing automation systems which can utilise the vital information provided by IoT devices. IoT is well equipped to provide the actionable, vital customer data needed by intelligent marketing automation applications. IoT and marketing automation are, in this sense, mutually dependent.

App Technology Boom

App technology is yet another critical component in the development of IoT solutions. The recent boom in app innovation has drastically scaled up the rate at which IoT is developing. Generally, apps allow data exchange between various devices. In essence, they offer virtually everything that IoT offers. Apps have been vital for the development of IoT, and their relevance can best be captured through several examples, including: parking apps that can check all available parking spaces within a city; noise monitoring apps that identify sound levels in otherwise sensitive areas like hospitals and schools; and structural assessment apps, which can monitor the state of materials and vibrations in both buildings and bridges.

IPv6

IoT will facilitate the interconnection of millions of devices. Undoubtedly, all these devices will need IP addresses. IPv4, which is currently the most widely used internet protocol, cannot cope with the resulting surge in demand for IP addresses.
Furthermore, IPv4 has particular weaknesses that can hinder the progress of IoT, as can other security threats. IPv4 is not the most secure internet protocol, and considering the volume of confidential data that will be shared through IoT, it can be a risky option. But IPv6, IPv4’s successor protocol, addresses these concerns. Besides this, it also comes with multiple added benefits, including addresses four times as long (128 bits versus IPv4’s 32). With these extra bits, there are about 3.4 × 10³⁸ possible address combinations, so it can accommodate virtually all address allocation requirements. Furthermore, IPv6 enables direct connection between hosts over the internet, although this depends on an organisation’s firewall policies and security. With IPv6, a device can remain connected via the same IP address even when roaming in another area. Finally, IPv6 comes with an optional feature known as IPsec for more secure connections between devices.

Sensors

Several factors make IoT stand out, and one of them is inter-device interaction regardless of technological affiliation. Sensors fitted in these devices allow them to interact with multiple other devices smoothly and effortlessly. Sensors are among the core components of IoT. For instance, when the sensor in your key unlocks your front door, it can instantly transmit a message for your lights to switch on and your thermostat to regulate the temperature in the house. All these activities happen simultaneously. The science behind IoT sensor design is similar to that behind microprocessors: sensors are made using a lithography process that allows many copies to be produced concurrently. However, an IoT sensor can only perform a particular task. You can subsequently pair a microprocessor with a typical IoT sensor and attach it to a wireless radio to communicate.

Summary

Essentially, the Internet of Things is still a relatively new concept, and while various speculations and projections have emerged about how it will find use and how it will be further advanced, many bets are that IoT might end up growing into an entirely different entity from the one the tech sector imagined or anticipated. The distinct technologies driving IoT development may not only mean a dramatic surge in how many devices we keep in our homes in the near future; they also have the potential to push IoT to incredible heights.

### What you need to take your business digital

In an increasingly digital age, companies are under pressure to develop their technological capabilities or risk being left behind. Digital transformation is both a long and difficult process, but it is an important step in planning for the future success of any business. A change in culture to accept a ‘trial and error’ approach to a digital transformation project is essential. Here’s how to successfully prepare to take your business digital.

Trial and error

While there have been several examples of companies failing a digital transformation project, many need to understand that failing, and learning from it, is ultimately in the best interest of the company’s future. Digital transformation is hugely beneficial to companies looking to improve user experience and remain competitive in their industry. However, incorporating technological innovations into your business as it continues to operate on a daily basis is no easy feat.
Therefore, a culture of experimentation is key to constantly refining and improving the organisation’s relationship with its users. There is no set blueprint for digital transformation, so companies must adopt a ‘trial and error’ method to discover what works best for them and ultimately what’s best for their users. Failing may be hard to accept, but in reality it pushes companies towards correcting mistakes, making improvements and optimising the latest emerging technologies.

The importance of work culture

To make the most out of a failed project, a company needs to have the right culture. Digital transformation won’t be able to proceed if a team can’t admit failure and then embrace it, learning to keep experimenting and thereby keep improving. This begins with ensuring that technology is given its proper place in the boardroom. Executives need to incorporate technology amongst the business’s top priorities, so it is given the attention and budget needed to take the business digital.

In relation to this, having the right team members to guide the digital transformation process is crucial. Executives need to be educated on digital transformation, its benefits, how to plan for it and what is needed from the company to succeed. On another level, technology staff are critical to the success of the project. Companies don’t need more employees when proceeding with a project; they need the right ones. Contractors offer industry knowledge and experience as well as an unbiased opinion to balance out loyal, permanent employees who are highly skilled but heavily invested in the project. Ensuring you have the correct mix of people of different levels, backgrounds and skills is essential to creating the culture needed for digital transformation.

In addition, being bespoke is the only way to differentiate and stand out among competitors. In order to connect with users, you need to provide them with a unique service or product. Digital transformation is all about providing the best experience for your users; if you can understand who they are and what they want, it’ll make the process smoother.

What not to do

While companies need to ‘fail’ if they want to improve, there are certain mistakes to avoid when taking your business digital. Firstly, businesses need to forget the standard market research approach that has previously been the norm when approaching a new development project. While useful for researching physical products, the same method cannot be applied when researching digital ‘products’ like websites or mobile apps. Instead of businesses wondering and then deciding what they think people want, it’s essential to think about what users want. User experience research is hugely valuable for companies looking to improve their digital offering by understanding who their users are and what they’re looking for. This research can then be used to refine and improve the digital offering, which is crucial in digital transformation.

Another important step before beginning the process is ensuring that there is a proper strategic plan in place. Some companies have struggled to go digital because they jumped the gun and immediately started thinking about implementing new technology. While the possibilities of AI, big data, cloud and blockchain are exciting, companies first need to understand what they need, not what they want. A well-thought-out strategy will help companies understand that digital transformation is not a quick fix; it is an ongoing process that embraces change.
Similarly, companies need to understand that the switchover from their old systems and servers to the new technologies cannot be achieved in one go. The average business uses over 1,000 software applications which hold masses of crucial data, vital to the daily running of the business. The transfer of information and applications will need a step-by-step plan to ensure the company can continue to operate. While taking a business digital may seem difficult, long and tough to plan, senior decision-makers need to fully embrace the transformation process. By understanding the need for the right culture, a holistic strategy, and the ability to adopt a trial-and-error approach, businesses will be able to enjoy a successful digital transformation.

### End the cloud lottery and reduce your risk

With pretty much anything we purchase, when given too many options it can be exhausting to weigh up all the pros and cons of each offering. This is the case with the (supposedly) most liberating technology ever, cloud computing. When people select cloud services for their applications from cloud service providers, they rarely get the optimal solution. In fact, they almost never do, because the cloud service providers use an age-old sales strategy of providing so many options - typically in the hundreds - that the purchase decision becomes almost impossible to understand.

The act of securing units of IT on the cloud is a great example of the tyranny of choice. We all like to think we get the best solution for our organisation, one that will reduce our business risk, be cost-effective and increase our margins. However, the truth is that this is almost impossible to achieve, because there are too many variables to take into consideration. Buying bundles of processing power, memory, storage and networking for your applications would be hard enough on its own. It’s even worse when you factor in the matching of your application needs to the right services. Which family of services should I be selecting? Within that family, which products do I select? Perhaps I need to be looking at containers? Do I optimise for memory? Is the application “spiky” in nature, meaning I need to consider a service that matches that type of app? And if so, which one? It is not humanly possible to consider all the variables, as each microscopic decision is a movable beast that is subject to change by the millisecond.

In response to the critique that service providers make their customers operate in the dark, the major cloud providers have devised offerings that purport to offer a solution. These are called Reserved Instances. Amazon Web Services (AWS), Microsoft Azure and Google Cloud responded to customer confusion over the various bundles they offered with a new way to save money: making future commitments and buying in bulk sooner than you need the services. Have they helped? Not entirely, as they are almost as complicated as the system of tariffs they were meant to simplify.

Reserved Instances promise customers cost savings in return for committing to a certain level of consumption. However, buying Reserved Instances based purely on what is currently being used does little but lock the organisation into a long-term purchase of the wrong mix of instances. Better to optimise first and only then realise the financial benefits Reserved Instances bring. Dealing with that many variables - each with a huge leeway for change - you would be more likely to win the lottery than to get it spot on.
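To make the commitment trade-off concrete, here is a deliberately simplified sketch of the break-even arithmetic behind a reservation. The hourly rate, the annual commitment and the utilisation figures are all invented for the example and bear no relation to any provider's actual pricing.

```python
# Hypothetical Reserved Instance break-even check; prices are made up.

HOURS_PER_YEAR = 8760
ON_DEMAND_PER_HOUR = 0.20      # pay only for the hours actually used
RESERVED_PER_YEAR = 1100.00    # flat annual commitment, used or not

break_even = RESERVED_PER_YEAR / (ON_DEMAND_PER_HOUR * HOURS_PER_YEAR)
print(f"Reservation only pays off above ~{break_even:.0%} utilisation")

for utilisation in (0.40, 0.70, 0.95):
    on_demand_cost = ON_DEMAND_PER_HOUR * HOURS_PER_YEAR * utilisation
    cheaper = "reserved" if RESERVED_PER_YEAR < on_demand_cost else "on-demand"
    print(f"utilisation {utilisation:.0%}: on-demand £{on_demand_cost:,.0f} "
          f"vs reserved £{RESERVED_PER_YEAR:,.0f} -> {cheaper}")
```

Even this toy version shows why committing before optimising is risky: if the workload is later right-sized or moved, the utilisation assumption collapses and the commitment becomes dead spend.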
While cost savings are always a priority for the purchaser, in actuality, it is risk that should be at the forefront of their mind. Mismatch between cloud provider resources and an organisation’s application demands can lead to serious business risk in addition to unnecessary spend. Is your application truly getting what it needs at the time it needs it? There are tools to help you monitor change, but nothing to help you to keep pace with it. The most common mistake people make is to use a bill reader to analyse the cloud bill, but sadly, most of these management instruments are too blunt for a job that needs to be looking at the infrastructure issues first. The only real way one can keep pace, is to use machines to track the machines, with deep multi-dimensional permutational analysis. This type of analysis and machine-learning ability can learn all the application workload dynamics, in order to work out the demand for resources. The next step is to work out the supply side. Which isn’t easy, because the supply of processing power, storage and memory is another moveable beast, once again due to constant pricing changes and new technologies that are introduced regularly. Again, it is beyond the capacity of a human brain to keep up with them. The good news is that next generation cloud optimisation solutions are here to help those who are trying to manage the pressures of application performance requirements, business risk and cost pressures. With machine-learning technology to make applications self-aware of their precise resource requirements - and if permitted - allow them to automatically optimise themselves 24/7, the complex mapping of supply and demand is achievable. Only then will you feel like you’re winning the cloud lottery on a daily basis. ### Embracing an optimistic global IT channel through collaboration Globalisation is something that IT channel businesses are increasingly embracing in order to remain relevant to their end customers. Powered by the developments and advancements in technology and the internationally focused, outcome-led market we serve, collaboration has never been more critical in order to succeed globally. As a result of this, IT resellers and MSP companies are now able to explore new markets by entering into partnerships with channel services businesses delivering best-of-breed global solutions. Recent channel research has shown that the global IT channel is expected to expand in the next two years, which demonstrates a clear view of industry collaboration across global partners and customers. Through collaborating with others operating in the channel, companies are able to execute a structured and cost-effective globalisation strategy. For most companies, having a presence in other regions can minimise the potential financial risks of having a purely UK-focused business, which will bring great opportunities ahead of the UK’s official EU exit. Why are Channel Companies Collaborating Overseas With the looming Brexit deadline slowly creeping up on us, channel companies should take this opportunity to embrace the optimism in the global IT channel through collaborating with those who operate beyond the European Union. However, some may find it hard to consider how, why and when they should undertake this significant transformational shift. 
Industry research has shown that the majority of UK-based IT channel companies already operate in markets outside the UK, mostly through collaborations with existing channel partners or by having a fixed presence in a specific region. We have found through our research that eight in ten organisations have an existing presence in Africa, the Middle East and Europe. This shows the channel's growing appetite to move into new markets. UK channel businesses should also reflect on the increasing strength of the European IT channel and should be looking to expand further into other European regions if they want to capitalise on the growing global market. As channel businesses begin to collaborate with those operating in local regions, they can ensure that they have the specific infrastructure in place to deliver a consistently high level of service regardless of geographic location.

Opportunities Created from Collaborating Outside the UK

The factors to consider with regard to international expansion depend on the size of the company, its financial position, and the stage it has reached in collaborating with partners and their associated networks. For most, operating outside of the UK is quite an easy step to take. Smaller firms will often work with existing partners to establish an overseas presence, whereas larger companies have greater flexibility to offer a complete services package to their end-user customers. The development of technologies such as IoT and blockchain proves a strong catalyst for businesses to continue and plan their growth strategies. Crucially, existing technologies such as Artificial Intelligence and Augmented Reality have the power to drive the channel industry to look forward into the future. If businesses take these technologies on board, they open themselves to more possibilities of becoming relevant in new markets and exploring multiple business avenues.

The Future For Businesses That Collaborate

Over the past few years, the UK channel has seen a great number of new cloud-native and born-in-the-cloud organisations entering the market. As a result, some of these young companies have decided to hold back on growth plans and are still working to establish their business in the UK before expanding into international markets. There are a number of channel companies that think the IT channel will shrink within the next 24 months - an inflexible perspective, given that the majority show increased optimism and a growing pace of development towards globalisation. This could be because they are considering other factors within the industry, such as financial and political issues, which can have a dampening effect on expectations for growth. It is clear that, through channel companies collaborating, the global IT channel can remain buoyant despite the current financial and political instability it faces today - with Brexit being a key factor. However, businesses must ensure that they continue to collaborate to innovate, in order to maintain appealing, relevant services portfolios. Optimism about embracing a growing global market has never been higher and, as a result, this is the time to take advantage by collaborating with global partners who are making the complex the norm in order to achieve unprecedented growth for your business.
### Automating the Future of the Workplace

As businesses strive to deliver even greater efficiency and cost savings, Artificial Intelligence (AI)-powered automation has the potential to drive forward the fourth industrial revolution. There are huge factors pushing companies towards this new zeitgeist; however, building and training sophisticated AI models can be hugely complex. Regardless, automation has made an impact in almost every industry, from manufacturing to retail and even transport. Just look at the likes of PayPal: the secure money transfer service uses machine learning to combat money laundering, analysing millions of transactions between buyers and sellers to identify fraudulent activity. It is success stories like this that are pushing businesses to reimagine how they work. In marketing, automation is already well embedded into industry practices. German software provider Sensape is a great example, adding cognition to digital signage solutions by teaching the operating systems to see, understand and interact with people at trade events and in retail locations. Through a combination of AI, computer vision and augmented reality (AR), content is adapted in real time, improving the rate of interaction by 14.7 times in comparison to traditional digital signage. While the list of AI-powered projects grows daily, for businesses within traditional sectors like finance there is still work to be done to achieve full automation. Encumbered by legacy systems and data security fears, these industries experience bottlenecks. But with competition from cloud-native fintechs, the pressure is on for the financial services industry to adapt quickly and bring itself up to speed with technological developments like automation. To do this, it's useful to take a close look at both the obstacles to using AI to automate, and the opportunities.

Overcoming the cost of adoption

Firstly, the cost of developing intelligent AI and machine learning models in-house is often too high. This ultimately boils down to two things: having the computing power and the data available to build and train a sophisticated model. The UK's Digital Catapult Centre found that the cost of a single training run for a machine learning system can be well over £10,000. When you consider that this cost is in addition to the price of storing a large volume of clean and consistent data, it's no surprise that it is one of the main barriers to adopting AI. Despite these expenses, businesses are still rushing to create their own AI applications, which means having the right infrastructure for development is crucial. Increasingly, organisations are turning to cloud and open-source platforms to acquire AI capabilities without the hefty price tag. When storing and computing in the cloud, organisations can exploit on-demand payment models and customisation to streamline tech investment to exactly those capabilities they need. Similarly, open-source models typically come at limited or no cost and require less input from in-house developers. The very nature of open source means that businesses can build on publicly available AI applications to create software that is fit for purpose.

Combatting the skills gap

Despite the expense, the availability of both data and computing power has been on the rise in recent years, resulting in a deluge of automated and intelligent services, which developers must now differentiate between to deliver a successful implementation.
This goes beyond merely finding an AI solution that addresses business concerns. It requires consideration of factors like identifying the right DevOps partner to provide the right consultation and on-hand support, particularly during the initial phase of the programme, where employees require training. Whilst on-hand support is invaluable, businesses looking to expand their automation strategy must address the skills shortage, either by investing in AI experts or in training for existing staff. A recent report found that there are only 300,000 AI professionals worldwide, but millions of roles available. As such, skilled developers have a monopoly on the market and are able to drive up the price of their services. Thankfully, cloud and open-source platforms are helping to combat the global skills deficit. Open-source initiatives such as Fast.ai are delivering free frameworks and training in coding AI models for tasks such as image classification and natural language processing. Aside from being cost-free, this training can be delivered on an ongoing basis, ensuring that in-house developers have the latest information and are therefore able to drive forward an organisation's innovation strategy.

Making the right choices for your business's automation strategy

Whilst the IT team is at the heart of this adoption, successful digital transformation initiatives require a company-wide shift in attitude. With automation, this is even more important, as employees' roles will, in all likelihood, be completely transformed. At its best, augmentation is the collaboration of AI and robots with humans to automate repetitive or strenuous tasks, driving productivity and employee satisfaction. Businesses that implement augmentation see dramatic results, with studies showing that augmented organisations achieved 28 per cent higher overall performance, did 31 per cent better financially, and reported employees being 38 per cent more engaged. The question is not whether organisations should implement AI solutions in their business, but how. Applying automation to labour-intensive business challenges delivers a positive return on investment and greater productivity, enabling companies to retain and grow their competitive proposition. Similarly, an increasingly open technology landscape offers CIOs and CTOs much greater flexibility and agility when adopting new technologies like AI. They have greater access to the tools needed to develop, adopt and innovate in the future, which helps assure the long-term success of these businesses. The focus for business leaders now will be on applying the right mix of technology to achieve this, whilst reducing the risks associated with complex implementations.

### Top 10 Cloud-Based Recruitment Management Platforms For Your IT Consultancy Business

With the development of technology came easier ways for management to find, manage and hire people. This is a great thing, especially considering the size of the candidate markets for some of the best-paid and most sought-after positions. Such is the IT industry, which encompasses a wide array of jobs and tasks. It thus becomes understandable why an IT consultancy would want to automate the process and make it easier to handle. However, finding reliable and effective software is a whole different problem that these companies have to deal with. Luckily, there are plenty of choices, and some of them stand out notably for their quality, features and prices.
Here they are:

Zoho Recruit
This is an online applicant tracking system and recruitment software which can seriously simplify your process. It allows you to track resumes and interviews, and source better talent. Zoho offers 24/7 customer support, requires no contract or set-up costs, and you can pay as you go.

SmartRecruiters
SmartRecruiters is used by some of the best organisations in the IT industry as a hiring tool. It has rich features such as collaborative hiring, recruitment marketing and so on. It's built on a modern cloud platform and is used by companies like Visa and Skechers. You don't even have to worry about candidate experience, as it will stay excellent throughout the process.

Newton ATS
This is an intuitive system built by professionals with experience of recruitment software. It works seamlessly and according to your vision. You can recruit and engage with candidates, work with your team on recruitment or send offer letters - it covers everything. It's also the only fully mobile recruitment system on the market, so if that is something you are looking for, this software is for you.

Breezy HR
This is a modern system with an effortless design and a ton of useful features for any recruiter. It can automate a lot of your manual tasks and still ensure candidate satisfaction. It's efficient and helpful.

Lever
This is collaborative recruitment software which promotes talent engagement by automating tedious tasks and offering a clean, easy-to-operate interface. It also has built-in CRM functionality and can help you build strong relationships with candidates.

CareerArc
CareerArc combines social media management with job distribution and recruitment organisation. It's easy to work with and automates all of the time-consuming tasks, giving you more time for human contact and testing of candidates while providing them with a seamless hiring process.

BambooHR
BambooHR has a set of talent tracking and onboarding tools that can relieve you of the boring details and let you spend time getting to know the candidates. It has a simple design and a wide array of features which allow you to collaborate and provide a good candidate journey.

Job Jet
This is a big data platform which involves finding, contacting and rating candidates based on their competences. It provides an effortless experience and spans all of the necessary workflows to create a coherent platform for recruiters. It has a built-in messaging management system.

Workable
Workable is an all-in-one recruiting software that allows you to track, find and hire candidates. It's easy to use and implement into your company while speeding up the hiring process and meeting all of your hiring goals. Over 6,000 companies are currently using it. "We have used Workable for some time now and we truly appreciate what it did for us. The recruitment was seamless and easy on us without dampening the wealth of candidate experience. You can say it spared us a lot of time and effort while providing optimum results," says David Fulham, a recruiter from Academicbrits.

ClearCompany
This is a software system that provides effortless recruitment for both private and public organisations. It connects and combines multiple points of recruitment, allowing recruiters to automate many manual processes and candidates to have an optimal experience during the hiring journey. It's a simple and powerful piece of software that will seamlessly combine organising, recruiting, evaluating and managing candidates.
Bottom Line

While finding good software isn't easy, no matter what you'll use it for, there are some pretty good options out there. When searching, management needs to focus on its requirements and compare them to the features of the various platforms. Never settle, or you'll just end up with an expensive, unusable product. Look at some of the software above if you don't know where to begin.

### Planet of the Apps – taming SaaS Sprawl

It wasn't so long ago that the IT department held a vice-like grip on the applications and devices employees could use in the workplace. Internal technology challenges and IT-mandated policies prevented different departments from getting the applications and systems they needed to perform their jobs to the best of their ability. Employees' personal devices were deemed unfit for professional use, and everyone was armed with company-supplied laptops and phones that they didn't really want to use. This dark age of applications is thankfully behind us, and we've entered a period of IT enlightenment. HR, sales, marketing, finance and all the other business units are empowered to choose the applications they need, whenever they need them, and on whatever device. This is helping workers become more productive and effective in their daily work lives, and generally helping the business as a whole. This was the idea, at least.

Application Overload

Whilst providing different departments with the tools they need to fulfil their own specific goals is certainly an improvement over the one-size-fits-all policies of the past, it has created a problem all of its own – SaaS sprawl. A McAfee report from 2017 found that, on average, enterprises are using 1,427 distinct cloud services, with the typical employee using 36 different cloud applications during their working day. The large majority of these cloud apps are used in isolation, not connected to one another. With the IT department loosening its grip on the software and devices employees could use, is it possible that our appetite for applications got the better of us? This increasing SaaS sprawl causes a number of issues for both the IT department and the business as a whole. Firstly, and this isn't too surprising, keeping track of all these applications and managing them is a real challenge for the IT department. Security concerns around ensuring these apps are kept updated, and are in compliance with regulations and policies, have been front of mind. Where multiple applications that offer similar functionality exist within different departments of the same business, getting these applications working together seamlessly has been a struggle. With the IT department stretched thin, fighting fires on thousands of apps, it cannot focus on what should be one of its key functions – playing a vital role in the company's greater digital transformation goals. Secondly, with so many SaaS applications running independently, managing all the data that flows into them and then comes out the other side can be a real stumbling block for businesses looking to turn their growing data stores into business insights. In a research report that we commissioned, we found over a fifth of employees from large enterprises were unaware of what data other departments hold. If this is indeed the case, then it's highly probable that, across the business, individual departments are working with different or incomplete datasets, which they're pushing into their various, disparate applications.
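As a toy illustration of how quickly departmental datasets drift apart, the short sketch below compares two invented customer extracts and flags records only one department holds, as well as shared records where the fields disagree. The department names, fields and values are assumptions made up for the example, not data from the research report cited above.

```python
# Toy illustration of the problem: two departments hold overlapping customer
# records that disagree. The field names and data are invented for the example.
import pandas as pd

sales = pd.DataFrame({
    "customer_id": [101, 102, 103],
    "email": ["a@example.com", "b@example.com", "c@example.com"],
})
support = pd.DataFrame({
    "customer_id": [102, 103, 104],
    "email": ["b@example.com", "c-old@example.com", "d@example.com"],
})

# Outer join with an indicator column so we can see where records came from.
merged = sales.merge(support, on="customer_id", how="outer",
                     suffixes=("_sales", "_support"), indicator=True)

# Records each department holds that the other does not see at all.
missing = merged[merged["_merge"] != "both"]

# Shared records where the two systems disagree on a field.
conflicts = merged[(merged["_merge"] == "both") &
                   (merged["email_sales"] != merged["email_support"])]

print("Records held by only one department:\n", missing, "\n")
print("Shared records with conflicting values:\n", conflicts)
```

Even this trivial check surfaces the kind of inconsistency which, multiplied across hundreds of disconnected applications, quietly undermines company-wide analytics.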
It’s a bit of a cliché now, but the old IT adage of “garbage in, garbage out” still holds true, and if departments are pushing poor quality data into their best-of-breed applications, they’re not taking full advantage of them. The benefits of being able to choose the best applications for the job at hand are dulled if the number of apps in use means data is inconsistent, or worse, inaccurate.   Reigning in the Sprawl The issue of SaaS sprawl is not an unsolvable one though. There are steps that can be taken throughout the business to combat SaaS sprawl and simplify the complexity it brings. First and foremost is a realigning of the relationship between the IT team and other departments. Whilst before the IT team held ownership over apps, clearly an application free-for-all with little IT oversight is not an ideal strategy either. IT needs to be seen as a consultative team within the business, providing recommendations and best-practices on how different applications can work together and indicating where other areas of the business could benefit from using the same applications. A key to re-forging this relationship will be identifying those in each department who can be the liaison between their department and IT to ensure that care is being taken in deciding which applications will be of genuine benefit to the business, rather than simply adding to the sprawl. As much as the IT department must be more closely involved with the end user within each department, so too must responsibility be held by the c-suite. If a true cultural shift that encompasses transparency around application use is to take place, a mandate needs to come from the men and women at the top of the business. Connecting Your Application Ecosystem If the above shift in attitudes can be achieved business wide, it will allow IT a far greater level of visibility into what applications are currently running throughout the business, and make the final step of the process far simpler. As alluded to earlier, the true value of SaaS applications cannot be realised if the data being fed into them is inconsistent, redundant, or incomplete. If applications all have the same single view of the data, greater business value can be extracted. With proper data integration pipelines linking all these applications, IT can create a holistic data picture for use by the entire business. Analytics will be more valuable company-wide, insights more accurate, and business leaders will be able to make faster, better-informed decisions on the future of the business. Reigning in SaaS applications isn’t simply a matter of relieving the headaches of the IT department, but ensuring that the business as a whole is properly equipped to become more data-driven and deriving the maximum value from the applications being used. Getting your SaaS sprawl in order now will deliver true business benefits in the future. ### Cloud leaders offer security, should you get onboard? Cloud computing is increasingly growing in popularity among businesses looking to improve operational efficiencies and cut down on technology resources. According to a recent report from IDC, total spending on cloud IT infrastructure in 2018 is forecasted to be $62.2 billion with year-over-year growth of 31.1 percent. These figures highlight that while cloud computing was once only adopted by a small number of organisations, it is now becoming the norm for businesses across the world.    
Among these organisations moving to the cloud, many are turning to major cloud hosting providers like Amazon Web Services (AWS), Microsoft Azure and Google Cloud Platform (GCP), which offer software-, platform- and infrastructure-as-a-service. Having disrupted the IT market with flexible, powerful and competitively priced cloud services, these household-name technology vendors are now making a big push to introduce security features into their offerings. According to recent estimates, AWS now holds a 33 percent share of the entire cloud market, followed by Microsoft Azure with a 13 percent share. There's already little room left for traditional hosting providers, and the latest push into security is another step towards further market dominance and customer tie-in, but should you get on board?

Putting their money where security is

The move into security comes with good reason. 90 percent of cybersecurity professionals are concerned about cloud security, making it one of the biggest roadblocks to cloud adoption. To help overcome migration hurdles, major cloud vendors like Amazon, Google and Microsoft have launched new features designed to secure cloud environments. The cloud providers already hold an organisation's data, applications and virtual machines, so it's a convenient next step to use this trust as an opportunity to sell additional services like security. These security operations centre (SOC) offerings include identity and access management to prohibit unauthorised access to cloud data, encryption for data in transit, multi-factor authentication and secure key management, among other things. The services are integrated into each vendor's cloud platform, which means that uptake has been strong, as there is very little effort required on the customer's part. However, considering today's new advanced cyberattacks targeting cloud environments, are these services enough? While many organisations will believe that the security offered in AWS, Azure and GCP is state of the art, unfortunately this is not the case. The security offered by these vendors works well within their own environments, but can be less effective for an organisation with a hybrid infrastructure. This essentially means that additional security solutions are necessary for these environments.

The challenge of hybrid infrastructure

AWS, Azure and Google Cloud have disrupted the traditional infrastructure market. After realising that security is a major roadblock to cloud adoption, they are putting money and effort into building in security features. But hybrid setups remain a challenge for organisations. With hybrid cloud adoption growing threefold year on year, it is important to look beyond the security tools offered by the leading cloud vendors to help overcome these issues. With 40% of organisations opting for hybrid cloud, they must consider how effective their security tools will be across these environments. A security tool that supports environments from multiple vendors will prove more beneficial than a tool that is compatible with only one vendor. Organisations should also have a clear understanding of the visibility and access control the tool will provide, and strong insight into the level of protection it offers against today's advanced attacks targeting the cloud.
For example, the new Azure Security Center can handle security assessments for non-Azure assets, but customers need to deploy the Azure monitoring agent, which is only available for a small subset of operating systems. With AWS this is not the case: only AWS-hosted assets can be monitored. Therefore, if you have a lot of heterogeneous operating systems and legacy applications, you are limited by the tools and will need to use and integrate third-party security tools to protect your data and assets. In addition, some of the fundamentals and best practices - namely vulnerability assessment and the CIS and CSA benchmarks for cloud security - are not covered by the cloud service providers under the shared responsibility model; it is therefore your organisation's responsibility to provide effective monitoring in these areas.

Six key elements to secure hybrid cloud

1. Identify cloud assets automatically. It is easy for company departments to launch new virtual machines and test storage using IaaS that has not been sanctioned by IT. To prevent cloud shadow IT, security teams must be able to automatically discover cloud assets when they are launched, so that they can evaluate their risk and put appropriate security controls in place.

2. Cloud Security Posture Management. Gartner coined the term Cloud Security Posture Management (CSPM), sometimes called hygiene, hardening or configuration assessment. With Infrastructure-as-a-Service (IaaS) or Platform-as-a-Service (PaaS), cloud providers transfer a lot of risk to the way the user configures the services (for example, on AWS S3 the risk comes from the permissions on the buckets; if developers get them wrong, data is exposed). Therefore, security teams need to run configuration assessments continuously to ensure nothing has drifted.

3. Hybrid Cloud Workload Security. With the IaaS shared responsibility model, organisations need to secure their workloads. This includes vulnerability management and hardening, network segmentation and anti-virus. It is especially important to have solutions that support both cloud and non-cloud workloads.

4. API support for automation. APIs are nothing new, and most cloud services come with their own APIs to facilitate integration with other systems. On one hand, it is important to implement solutions that use the API for discovery and configuration retrieval. On the other hand, the API by which data is accessed remains a weak link. Security teams should extend their assessment to this new attack surface and ensure robust authentication and encryption are in place.

5. Identity and Access Management. Returning to the simplest example of S3 buckets, user permissions are the most important configuration to get right. Therefore, security teams need to assess user rights and access on a regular basis and be alerted when abnormal activities are detected. In a hybrid scenario, this means connections to Active Directory as well as to the AWS IAM API.

6. Data Security. Data is the crown jewels. Security teams need automated ways of identifying data and then protecting sensitive data at rest and in transit through encryption.

Cloud service providers are constantly evolving their solutions to improve their security offerings and hold up their end of the shared responsibility model, and organisations moving into the cloud must do the same to keep up their part of the bargain.
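As a rough illustration of what the first two elements above can look like in practice, the sketch below uses the AWS boto3 SDK to enumerate EC2 instances and to flag S3 buckets that have no public-access block configured. It assumes boto3 is installed and AWS credentials are already set up, keeps the region, pagination and error handling deliberately simple, and is not a substitute for a full CSPM tool; equivalent APIs exist for Azure and GCP.

```python
# Minimal sketch: automatic asset discovery plus a basic CSPM-style check
# on S3 bucket public-access configuration. Assumes AWS credentials are
# configured; pagination and error handling are simplified.
import boto3
from botocore.exceptions import ClientError


def discover_ec2_instances(region: str = "eu-west-1") -> list[str]:
    """Return the IDs of all EC2 instances visible in the given region."""
    ec2 = boto3.client("ec2", region_name=region)
    instance_ids = []
    for reservation in ec2.describe_instances()["Reservations"]:
        for instance in reservation["Instances"]:
            instance_ids.append(instance["InstanceId"])
    return instance_ids


def buckets_without_public_access_block() -> list[str]:
    """Flag S3 buckets that have no public-access block configured at all."""
    s3 = boto3.client("s3")
    flagged = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        try:
            s3.get_public_access_block(Bucket=name)
        except ClientError as err:
            if err.response["Error"]["Code"] == "NoSuchPublicAccessBlockConfiguration":
                # Nothing blocks public ACLs or policies: candidate for review.
                flagged.append(name)
            else:
                raise
    return flagged


if __name__ == "__main__":
    print("Discovered instances:", discover_ec2_instances())
    print("Buckets with no public access block:", buckets_without_public_access_block())
```

Run on a schedule and fed into an alerting pipeline, even a simple check like this begins to approximate the continuous configuration assessment described above.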
### Dell EMC Gains High Performance Computing Momentum At SC18, Dell EMC is announcing momentum in high performance computing (HPC) deployments and new portfolio expansions, designed to help accelerate time to insights in a variety of disciplines, including artificial intelligence, bioscience, weather forecasting and more. Dell EMC also announces that the University of Florida’s Center of Space, High Performance, Reconfigurable Computing (SHREC) has won the 2018 Dell EMC AI Challenge. “Advances in IT are making HPC systems increasingly more powerful and innovative to accelerate the time necessary to reach new discoveries, but many still believe implementations can be complex,” said Thierry Pellegrino, vice president and general manager of HPC at Dell EMC. “Based on decades of experience with leading institutions, technology partners and strategic customers, Dell EMC provides an extensive portfolio of technologies that simplify HPC adoption to advance research and further democratise HPC. We remain focused on leading the way in HPC innovation and helping organisations of all types and sizes further advance expanding opportunities in artificial intelligence and machine learning.” Dell EMC fuelling research for human progress Dell EMC continues to be at the forefront of helping customers adopt the latest HPC technologies to fuel a wide range of discoveries and research. Recent customer momentum demonstrates Dell EMC’s commitment to deliver world-class HPC systems that bring together the latest advances in servers, accelerators, liquid cooling and networking: Texas Advanced Computing Center (TACC) at the University of Texas at Austin has selected Dell EMC to develop and deliver its new Frontera supercomputer in 2019, funded by TACC’s $60 million award from The National Science Foundation. At the time of its announcement, in August 2018, Frontera would have been the world’s fifth most powerful system, the third fastest in the U.S. and the largest at any university if completed. The Dell EMC PowerEdge system plans to combine several technical innovations such as CoolIT Systems high-density Direct Contact Liquid Cooling, high performance Mellanox HDR 200Gb/s InfiniBand interconnect and next generation Intel® Xeon® Scalable processors. Frontera’s early projects are expecting to include analysis of particle collisions from CERN’s Large Hadron Collider, global climate modelling, hurricane forecasting and multi-messenger astronomy.   The University of Cambridge has expanded its supercomputing capabilities with its “Cumulus - UK Science Cloud.” This new OpenStack system is the UK’s largest academic supercomputer, providing more than two petaflops of performance, powered by Dell EMC PowerEdge servers, Intel® Xeon® processors and Intel® Omni-Path Architecture. To help solve the UK’s most challenging data driven, simulation and AI tasks, Cumulus is open to all UK academics and industry and delivered in partnership with Dell EMC and StackHPC, a UK start-up specialising in the convergence of HPC and Cloud. It is funded with investments totalling over £13 million from STFC (DiRAC/IRIS), EPSRC (Tier 2) and the university.   The University of Michigan is deploying its Great Lakes computing cluster for simulation, modelling, artificial intelligence, machine learning, data science, genomics and more. The new system is powered by a Dell EMC-enabled HPC infrastructure built on Dell EMC PowerEdge servers. 
Great Lakes is the industry’s first system to benefit from Mellanox HDR 200Gb/s InfiniBand networking, enabling faster data transfer speeds and increased application performance.   The Ohio Supercomputer Center is deploying its Pitzer Cluster, delivered by Dell EMC. Like TACC’s Frontera system, the Pitzer Cluster will utilise Dell EMC PowerEdge servers with CoolIT’s modular, rack-based Direct Contact Liquid Cooling solution, which allows for increased rack densities, higher component performance potential and better energy efficiency. As a result, it will offer nearly as much performance as the center’s most powerful cluster but require less power and less than half the space. The system will power broad research areas from human genomics to the global spread of viruses. Dell EMC eases HPC adoption with Ready Solutions advancements Today’s HPC workloads require storage infrastructure that scales endlessly and delivers unmatched bandwidth at high concurrency for deep learning algorithms and AI initiatives. To meet these needs, Dell EMC is committed to expanding its HPC portfolio to offer a range of high performance storage options that complement its portfolio of Ready Solutions with the Dell EMC Isilon Scale-out NAS storage powered by the Isilon OneFS operating system. The Dell EMC Ready Solution for HPC Lustre Storage and Dell EMC Ready Solution for HPC NFS Storage are now available with the new Dell EMC PowerVault ME4 storage arrays. Dell EMC built the ME4 Series with 75 percent more drives than the PowerVault MD3 to increase raw storage capacity by 122 percent, while also boosting read IOPS performance by 4X. Its modular design allows for flexible and custom designs, offering increased density, when compared to the PowerVault MD3, and the ability to scale as customers’ businesses grow. Ideal for technical, big data applications, the Dell EMC Ready Solution for HPC Lustre Storage with the new PowerVault ME4 delivers excellent throughput per building block with on-the-fly storage expansion. As an easy-to-use and fully redundant NFS storage solution, optimised for HPC environments, the Dell EMC Ready Solution for HPC NFS Storage on ME4 Series will offer greater overall performance and a denser solution. Both solutions are accompanied by Dell EMC global services and support. Dell EMC PowerEdge servers to support latest accelerator technology Dell EMC PowerEdge R640, R740, R740xd and R7425 servers will support the latest GPU and FPGA accelerators to speed results in data analytics and AI applications. This includes: NVIDIA® Tesla® T4, the universal AI accelerator ideal for distributed computing environments. It is packaged in an energy-efficient 70-watt, small PCIe form factor. Powering breakthrough performance, NVIDIA notes that Tesla T4 provides multiple times the performance of traditional CPUs for both training and inference. Developers can unleash the power of NVIDIA Turing architecture-based Tensor Cores directly through NVIDIA TensorRT and cuDNN software libraries and integrations with all AI frameworks.   From video content streaming to financial services to defence applications, FPGAs allow hardware to be programmed and re-programmed for optimisation. In addition to the Intel® Arria® 10 GX FPGA support today, Dell EMC is now the first server vendor to qualify the Xilinx® Alveo™ U200 accelerator card, adding it to Dell EMC PowerEdge accelerator options. 
Xilinx notes, for machine learning, Alveo accelerators can increase real-time inference throughput for machine learning by 20X versus high-end CPUs alone. Dell EMC AI Challenge winner Dell EMC has selected a research team at the Center of Space, High-Performance, and Resilient Computing (SHREC) at the University of Florida as the winner of the 2018 Dell EMC AI Challenge. The AI Challenge, launched in May 2018, encouraged entrants to demonstrate practical applications of AI technology with a transformational impact on business, research or society. The winner receives 200,000 core-hours on the Dell EMC HPC & AI Innovation Lab Zenith cluster and a spotlight in Dell EMC’s booth at the SC18 conference among other promotional activities SHREC is comprised of more than 30 industry, government and academic partners working together to solve research challenges in missions and applications that drive and can benefit from reconfigurable, high-performance and reliable computing. For the AI Challenge, the SHREC team was recognised for developing and demonstrating a heterogeneous computing (HGC) system that can support a complete workflow–data analysis and pre-processing, model training, deployment and inferencing–for machine learning and be applied to any application domain leveraging machine learning, including healthcare, business, finance, science exploration and more. “For the AI Challenge, our team leveraged CERN OpenLab datasets to determine the performance of the HGC workflow with CPUs, GPUs and FPGAs for machine learning. The study showed performance gains of 1.45-2.22x,” said Chao Jiang, Ph.D. student leader of the team. “These early results were promising, and we are continuing to experiment with more complex 3D-image based techniques such as volumetric segmentation with 3D U-net to improve performance, as well as 3D GAN for accelerated particle-simulation.” Dell EMC at SC18 To learn more about the Dell EMC HPC portfolio, technology partners and customers, stop by its SC18 booth #3218 at the Kay Bailey Hutchison Convention Center.  Availability Dell EMC Ready Solution for HPC Lustre Storage on ME4 and Dell EMC Ready Solution for HPC NFS Storage on PowerVault ME4 available now.   Dell EMC PowerEdge R640, R740, R740xd and R7425 servers have planned support for NVIDIA® Tesla® T4 Tensor Core GPUs in the first quarter of 2019 and Xilinx® Alveo™ U200 FPGAs in December 2018. ### Why visibility is key to successful VMware deployments As IT teams are finding themselves under pressure to ‘do more with less’, the demands on enterprise infrastructure, including compute, networks and storage, are increasing exponentially. Not only is the infrastructure deployment required to be automated, secure and agile but it must also operate at much lower costs. Viewed as cumbersome and process-driven, ‘standard’ infrastructure deployments are no longer enough to meet the needs of today’s businesses. As a result, companies are utilising technologies such as software-defined networking (SDN), which allows a greater flexibility in network architecture that wouldn’t have been previously considered possible. Shifting data centre workloads to the cloud enables enterprises to achieve elasticity that helps accelerate speed of innovation, create sustainable differentiation, and safeguard customers. This has led to virtual infrastructure, such as that provided by VMware, becoming a critical component of the modern enterprise, providing limitless potential for scalability and functionality. 
The adoption of software-defined data centres (SDDC), in particular, is gathering momentum. Indeed, VMware suggests that most organisations have already removed physical compute components from their data centres, with the majority virtualising between 50 and 100 percent of their servers.

Can you give us an example of the benefits?

One of the best-known virtualised SDN technologies is VMware's NSX platform, which forms the networking and security foundation of the SDDC. Structured in a completely different fashion to traditional network design – think 16 million Virtual Extensible LANs (VXLANs) deployed across a new network, compared to around four thousand in a traditional VLAN design environment – it allows flat networks to be built exceptionally quickly to keep up with demand. It also offers micro-segmentation to support fine-grained security policies for individual workloads, and automated policy provisioning for quick deployment. However, the adoption of virtualisation and SDDC technology presents businesses with new challenges to overcome.

What are these challenges?

The complexity of the services delivered in these environments, and their interdependencies with the virtualised and geographically distributed infrastructure, is unprecedented; without proper visibility into these new environments, businesses are unable to assess their security posture, or whether services are performing as they should. Traditional network tools don't have great visibility into the virtual world, though. Tools from vendors such as VMware, on the other hand, will have a great understanding of the virtual infrastructure, but will not necessarily have oversight of the applications and services that run on this infrastructure, or their dependencies. Furthermore, they don't offer visibility into the legacy infrastructure that businesses still rely on. As a result, there is a need for a single service and security assurance platform that can continuously monitor physical, virtual and cloud infrastructures and the applications they support, and provide visibility into their performance and dependencies.

Why is visibility so important?

Visibility into NSX environments is of huge benefit to the DevOps function, for example, whose role in an organisation's digital transformation is to develop new applications and refactor existing apps to make them compatible with virtualised and hybrid cloud environments. In a microservice architecture, which is being adopted quickly by DevSecOps teams, multiple fine-grained and loosely coupled services have to communicate frequently using RESTful APIs and messaging. As these services increasingly communicate with each other, complete visibility of the traffic going across these new, complex and often hybrid networks will be vital in understanding the conversations taking place. Understanding how these processes communicate with each other is key to understanding how they'll function when put into a production environment, and necessary to identify the root cause of service degradations.

How can we achieve this visibility?

It comes down to being able to see network communications which, even though there may not be any physical wire to speak of, are all wire-data. This can be distilled down to create actionable and insightful smart data - metadata based on the processing and organisation of wire-data at its point of collection and optimised for analytics at the highest possible speed and quality.
Delivered in real time, this smart data can be used by all teams – Dev, Sec, Ops and QA – for common situational awareness, and to make decisions that increase efficiency across virtual networks.

What does this mean for a VMware deployment?

To realise the benefits of SDDC technology, organisations need deep insights that go beyond traditional North-South traffic views and span the entire virtualised infrastructure, from the perimeter to the edge. Full and complete visibility of all traffic is crucial, from North to South and from East to West. After all, when an organisation runs hundreds or even thousands of applications and continuously delivers new releases utilising agile technologies, microservices and virtualised environments, the clear waters we're used to when delivering new applications via waterfall methodologies, monolithic software architectures and physical environments suddenly become an awful lot muddier. Only through having smart data insights based on full visibility can businesses gain a complete view into application performance, security assurance and user experience. Without it, they risk blind spots, which can hinder innovation, efficiency and security, while increasing the risk of outages and downtime. As the pressure on enterprises to deliver new applications and services faster, with higher quality and security, continues to grow, an organisation's digital future will depend on its ability to gain end-to-end visibility into its infrastructure, applications and their respective dependencies.

### Cloud: The one certainty in an uncertain post-Brexit world

There is still enormous uncertainty concerning the UK's future after Brexit. But one thing is for certain: business leaders will require flexibility to respond to the impact of the UK leaving the European Union (EU) next year. So it is likely that the upheaval of Brexit will contribute to increased adoption of cloud-based services as organisations continue to prepare themselves for a potentially turbulent environment. This is echoed by TechMarketView, which recently reported that 50% of the UK enterprise software market will be Software as a Service (SaaS) by 2021, double what it is today. Indeed, organisations are increasingly dependent on technology to serve ever-more demanding users, and will therefore spend more on software and cloud-based services that offer greater flexibility and future-proofing. Since 2013, the government has been very clear that any new IT project should consider a cloud-first approach above all other solutions, as part of its Cloud First policy. Despite this, a recent Eduserv report suggests progress towards a 'cloud first' policy is slow, with only 40% of local authorities saying that they have a cloud policy or strategy in place. At the same time, a new report from SolarWinds showed 77% of public sector respondents considered cloud the most important technology in their current strategy. Public sector organisations remain under unprecedented pressure to deliver transformed services to a growing and demanding population, with major budget constraints. While departure from the EU is likely to require the implementation of a high volume of new legislation, it is also clear that agile and collaborative working, together with increased data insight, will be crucial to transformation. So what can the cloud offer in this period of budget constraints and uncertainty?
Enabling further efficiencies With a continued requirement for public sector bodies to work under the constraints of flat or reducing budgets (and no sign of this changing in the near future), organisations need to change the way they operate to proactively deliver better service in a more agile way. The UK’s future prosperity relies on technology innovation and, undoubtedly, moving to the cloud will be a major factor in driving further efficiencies. Not only because the cloud allows a more efficient delivery but also because it helps accelerate the implementation and adoption of technologies such as automation and artificial intelligence (AI). Whether it is increased scalability, improved mass communication or real-time data sharing, cloud computing makes businesses and organisations more efficient, while lowering costs. Offering flexibility around data hosting In pursuit of greater flexibility and controlled IT spending in the pre- and post-Brexit environment, the cloud is a sensible investment. It doesn’t require spend in on-premises hardware and infrastructure (fixed assets) in an unpredictable environment. The cloud can support locally hosted options in either the UK or elsewhere in the EU and, crucially, it is scalable, meaning no need to use a ‘crystal ball’. Organisations can look at what they move to the cloud and when, especially as hardware needs to be refreshed.  Increasing security and streamlining regulatory compliance Cloud solutions have advanced to the point where they are more secure and reliable than traditional, on-premises solutions. In fact, 64% of enterprises report that the cloud is more secure than their previous systems. Hosting all data in a secure cloud solution can ensure ease of access to help organisations meet GDPR requirements. Many organisations will need support to navigate changing compliance requirements post-Brexit, as these guidelines are subject to change, including when new regulations are introduced if the UK finds new trade partners. The flexibility and agility that cloud computing offers will give a clear advantage to organisations, enabling them to adapt quickly. Enhancing agility As the mention of Brexit is unsurprisingly followed by the word ‘uncertainty’, the ability to adapt to sudden changes is vital. Adopting a cloud architecture will help organisations to take advantage of new services and provide them with the agility to adapt rapidly to changes in policies and regulations. The cloud provides the ideal solution to data storage and accessibility issues and is one of the most effective ways for IT leaders to prepare their organisations. Engaging and attracting talent Moreover, to attract top tech talent, it is essential to offer flexible and collaborative working across organisations, especially those that are geographically dispersed. The cloud reduces the need for employees to travel long distances or be firmly rooted to their desks from 9-5pm, as systems can be accessed remotely, on demand. For CIOs, moving to the cloud offers the ability to overcome previous limitations and improve the value they deliver to their organisation through the adoption of tools like analytics or AI, and securing collaboration outside the business premises. Most public sector organisations have, at least partly, embraced the cloud and understand its benefits. In these unsettling Brexit times, it provides organisations with much needed flexibility and agility. 
If you add that investing in the cloud facilitates rapid adoption of innovation and new technologies such as AI which will be vital to the UK economy’s future success, there has never been a better time to turn to the cloud. ### Close collaboration between IT and business leadership is key to success The challenge of getting the C-suite to understand IT’s role and requirements in order to drive business change is not a new one. Talk of Digital Transformation has been happening for years – really decades. In fact, the hope of many leaders that true digital transformation can occur has often been lost due to a variety of fits and starts.  The reality of limitations in available technologies, managing existing infrastructures, and the need to sustain ongoing business operations all have conspired to inhibit real transformation. But in 2018, when the rate of digital transformation within organisations really could determine the winners and losers – the disruptors from the disrupted – executives and IT teams must finally align to ensure their business remains competitive. At Alfresco, we surveyed more than 300 digital transformation decision makers in the UK and US at organisations with more than 500 employees, and it’s clear that there is still a disconnect between IT and the Board. Both business and IT executives agree when it comes to the importance of digital transformation to their organisation. Nearly two-thirds (65%) of each group state that the impact of competitors that are more effective with their digital transformation is a “top concern”. To back up this sentiment, 87% believe their company’s business results would be impacted if a more technically innovative competitor appeared. There is a clear business case, therefore, for digital transformation. Disrupt, or be disrupted. Commitment is the bottleneck Despite the apparent alignment between business executives and their technology team with the business case for digital transformation, IT sees the commitment of the business as the bottleneck in their digital transformation. Alfresco’s research found that seven in ten (70%) want business executives to move faster in their adoption of emerging technologies. This is nearly twice the number (38%) who believe that their technology team is overly cautious and could move faster. According to the study, the top issue, as ever, is budget. Almost two-thirds (61%) of decision makers we surveyed believe that, due to a lack of budget and people resources to execute digital transformation, their organisation risks being disrupted by faster-moving competitors. Of concern to business leaders will be that 48% believe a “lack of vision among our business leadership” could hamper their digital transformation efforts.  Building the business case for digital transformation We’re at an interesting crossroads in digital transformation, where 50% of the organisations Alfresco surveyed think their company will disrupt and 50% believe they will be disrupted. Executives are most likely to believe that their organisation will be a disruptor (56%) compared to team managers (49%) and individual contributors (39%), so optimism comes from the top. Leadership is going to be absolutely key to drive successful digital transformation. We asked IT stakeholders to identify the factor that would most likely propel their organisation into a disruptor position. 
Top of their list was “vision of our technology leadership” (62%), followed by “ability of our technology teams to execute” (58%) – so IT is confident it can deliver successful digital transformation, but it just needs the budget and the team to do the job. To achieve the budgetary commitment for digital transformation initiatives, IT leaders need to emphasise the business case for executing digital transformation, namely: Performance benefits: Nearly three-quarters (74%) of respondents expects to see improved employee productivity while 71% expect to reduce costs through improved efficiencies Business competitiveness: 59% expect to drive better decision making through improved data capture, while more than half (52%) expect to increase revenues through new offerings Customer benefits: Nearly two-thirds (62%) anticipate increased customer satisfaction and loyalty Digital transformation is a strategic commitment, not a quick fix, and executive leadership will need to commit funding, and logistical and cultural support to enable the project; nearly four in five (78%) believe that human changes – including culture and organisation - are more difficult than technical changes. We know from our research that a significant majority (71%) of organisations plan to hire and train in-house experts in 2018 to manage their infrastructure-as-a-service. This talent shift is another key consideration to achieve success. Whether managing in-house or outsourcing some elements to experts, the only way to be successful in digital transformation is to ensure that the IT team is working closely with other key stakeholders in the executive, especially the Chief Financial Officer (CFO) and head of Human Resources (HR). The executive will always have their eye on the bottom line, but the key thing is that IT works closely with its board to agree the objectives and roadmap for digital transformation, how to meet the cultural challenges and be clear on what, ultimately, success looks like. Most importantly, the commitment to disrupt, not be disrupted, must be met. Are you ready to meet that challenge? ### Four Reasons to Adopt a Hybrid Operating Model As more and more businesses shift their IT infrastructure over to the cloud model, organisations are faced with the challenge of how to manage it all centrally, but also facilitate the agility companies now require. In many cases, business units are consuming public cloud services directly and bypassing central IT governance. This often occurs because IT departments focus too heavily on infrastructure as a service (IaaS) and not on the true needs of the business – access to multiple IT platforms and applications on demand. For many organisations, adopting a hybrid cloud model rather than trying to consolidate on to a single private or public cloud, is the best approach to streamlining IT and simplifying management. Through hybrid cloud solutions organisations can benefit from agility and cost savings, mixing both public and private cloud to enable self-service provisioning and advanced automation of key processes. Shifting internal mindsets As traditional companies become digital enterprises, IT departments must transform from managers of discrete IT software products and infrastructure, into cloud service brokers that manage multiple cloud applications and services. 
To do this, three things must happen: Line-of-business managers need to become comfortable that they don’t have to work around IT but can rely on the IT department to procure these new cloud services in timeframes that meet the company’s needs. IT managers must explain to the line-of-business managers that the IT department can get maximum value by buying IT services across numerous departments. Integrating these services in the organisation’s normal internal procurement processes makes it easier for business managers to comply with IT governance and prevents them expensing credit card spend on cloud services, which can also eradicate the cost savings cloud can provide. To transition to a cloud service broker, IT departments can create a single console that provides visibility across all of a company’s cloud services. Typically, each cloud provider has an individual portal, these disparate sources must be integrated into a single, unified portal or dashboard that delivers maximum visibility. Although it can be challenging to change from a traditional model, a corporate IT department stands to gain significant benefits by transforming to a hybrid IT operating model. These are the four biggest benefits the transition can provide: Rapid time-to-market: A hybrid operating model enables self-service and high levels of automation, resulting in the ability to ‘spin up’ the servers needed for a business application in a matter of hours and often in under 15 minutes. This is a huge reduction from traditional IT environments, where it could take several weeks or even months to procure an additional server. Increased visibility: In a hybrid IT operating model, a dashboard is provided that can show how many legacy IT, and public and private cloud services the organisation uses, any cost savings achieved, and show whether the company provisions IT efficiently. A recent study found a high number of IT departments run a significant amount of their servers at 10 percent or less utilisation, meaning they are dramatically over-provisioned (or de-provisioning these servers when a business application was decommissioned was not done). The hybrid IT operating model dashboard strives to eliminate this kind of waste. More efficient and agile DevOps: The ability to easily spin up development environments in the cloud lets developers provide information and services when the company needs them, in the most cost-effective way. For example, during development they may want to run servers in the public cloud for maximum flexibility, but once employing the application, they may want to run the database part on a private cloud for security reasons. Working with a cloud service broker makes it possible to span multiple clouds, which reduces costs and overall cycle time, and increases visibility into all of the company’s cloud activities. Enhanced governance: When all IT activities are managed via a hybrid IT operating model, the IT department can handle security, regulatory compliance, and the IT budget. An efficient hybrid IT operating model also makes it unnecessary for the organisation to provision shadow IT services, thereby saving costs and eliminating the risk of duplication or paying for services no longer in use. Deploying a hybrid IT operating model requires everyone in the organisation to buy into the new strategy in order for IT departments to truly become cloud service brokers and achieve full visibility over the IT infrastructure. 
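To make the "increased visibility" benefit above more concrete, here is a minimal sketch of the kind of check such a dashboard might run behind the scenes. It assumes an AWS estate monitored through CloudWatch; the region, instance IDs and the 10 per cent threshold are illustrative assumptions, not details taken from the study cited above.

```python
# A rough sketch (not from the article): flag instances whose average CPU
# utilisation over the last week falls below 10%, the level the study above
# associates with dramatic over-provisioning. IDs and region are illustrative.
from datetime import datetime, timedelta, timezone
import boto3

cloudwatch = boto3.client("cloudwatch", region_name="eu-west-1")   # assumed region
instance_ids = ["i-0123456789abcdef0", "i-0fedcba9876543210"]      # hypothetical IDs

end = datetime.now(timezone.utc)
start = end - timedelta(days=7)

for instance_id in instance_ids:
    stats = cloudwatch.get_metric_statistics(
        Namespace="AWS/EC2",
        MetricName="CPUUtilization",
        Dimensions=[{"Name": "InstanceId", "Value": instance_id}],
        StartTime=start,
        EndTime=end,
        Period=3600,             # hourly datapoints
        Statistics=["Average"],
    )
    points = stats["Datapoints"]
    if not points:
        print(f"{instance_id}: no data (possibly stopped or never de-provisioned)")
        continue
    avg = sum(p["Average"] for p in points) / len(points)
    flag = "over-provisioned?" if avg < 10 else "ok"
    print(f"{instance_id}: {avg:.1f}% average CPU over 7 days -> {flag}")
```

A hybrid IT dashboard would typically aggregate this kind of signal across private cloud, public cloud and legacy estates rather than one provider, but the principle is the same.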
Achieving that buy-in is why it’s crucial for the IT department to take the lead in developing and implementing an all-encompassing plan that addresses the necessary organisational and mindset changes. ### Get smart and avoid the money pitfalls The upward trend across the SME landscape towards global expansion is set to trigger increased growth of the already thriving business transfer revenue market. Here, we look at the next-generation technologies that will offer better security for businesses operating globally and help them avoid the money pitfalls. With the move towards globalisation, spurred on by the prolific nature of an IoT network that is tipped to consist of 50 billion devices by 2020, the international business playground – once reserved for established players only – is opening up to SMEs, who are demonstrating an increasing appetite for global expansion and outsourcing. SME international trade in the UK is now worth some $935 billion, with businesses transferring an average of around £35,000 every month internationally. With an upward trend towards global expansion, the current SME business transfer revenue market, valued at £23 billion, is set to soar. For businesses operating in the money transfer space, such forecasts present exciting opportunities for young and old players alike to keep evolving and honing their offering to fit the increasing need. The industry’s evolving marketplace could spell good news for SMEs, who currently have to make their way through complex, costly and time-consuming routes to achieve the best deal. While some important moves have already been made to improve the money transfer journey for businesses, there is still more to be done to help businesses capitalise on the more cost-effective service offerings that new money transfer players are bringing to the table. To roll out the most effective experience that will bring more transparency and reduce the squeeze on SMEs’ time and resources, it’s important to get under the skin of the user and appreciate the pinch points from the SME’s point of view. Greater choice of money transfer options is clearly good for driving down costs and pushing the boundaries of better service delivery. However, with increased choice comes greater complexity in the market, meaning more time needs to be spent on research before selecting the best option. In the last five years, the number of companies transferring money abroad has grown from 35 to 112, and that’s just in the online space. In other industries, we’ve seen the challenge of search within a complex marketplace tackled with aggregator comparison platforms. The success of Skyscanner and MoneySupermarket proves that this model clearly works for users, providing a time-saving method of filtering a market’s offering based on search criteria. The changes made thus far to improve the global money transfer experience have been important, but they have not necessarily been transformational – many of the same things are being done, just differently. However, next-generation technologies are on the horizon that will bring together changes in the finance landscape and offer better security for businesses that are operating globally. The problem with money transfers Whether a business is eyeing wider global expansion or not, the experience of funding overseas operations or paying international suppliers often comes at great cost.
Understandably, many will turn to trusted banks – who have earned credibility over long periods of time and offer a convenient route – to make their transfers. According to a YouGov study, more than 25 per cent of businesses believed they were getting a good exchange rate, while almost 20 per cent admitted to having no idea how they were being charged or what the true cost of their transfer was. Moreover, as cited in a report from payments consultancy Accourt, a majority are blind to the fact that banks are charging UK small businesses in the region of £4 billion in hidden fees per year, profiting from the poor exchange rates offered, The convenience factor has no doubt proved to be the lesser of two evils for many businesses, who are deterred by the complex and time-consuming journey of finding the best deal. Since the emergence of Transferwise and TransferGo – who have alerted users to the excessive fees charged by banks – there’s been a proliferation of new-to-market ‘neobanks’ and startups offering cash services for foreign exchange. In trying to create more transparency, this proliferation has actually thrown more confusion at businesses trying to find best deals. For the person responsible for organising international money transfers in a business setting – whether that’s for payroll, paying suppliers or even arranging travel – the road to a best deal is more often than not paved with complexities, from fast-fluctuating rates and volatile markets to an overload of choice and time/budget constraints. Hence, the default decision is to work with existing partnerships to alleviate as much pressure as possible – regardless of the costs. But with the emergence of game-changing technologies and disruptors – who, with customer-centric vision are developing solutions to remove the friction from the experience – the possibility of SMEs benefiting from a best-of-both-worlds scenario is becoming a reality. Change is around the corner With the introduction of blockchain technology – and in spite of it still being in early stages of development – the financial services ecosystem is standing on the threshold of a whole new era. Real transformation is tangibly close, especially in the currency exchange and money transfer space. At its core, blockchain is all about getting data from A to B as quickly and safely as possible, and so for currency exchange, it’s the next logical step towards delivering a transparent, secure and fast service. While we have only skimmed the surface of blockchain technology’s potential, there’s no doubt its impact on the financial services ecosystem will be game-changing. As the world becomes increasingly globalised, with more people travelling and living abroad, industries are getting more competitive in answering the respective needs of travellers. In the mobile network industry, for example, we are starting to see far greater competition regarding usage of data overseas. Only 4 years ago, using data overseas was a cardinal sin met with extortionate fees. There were reports of people forgetting to turn off their mobile data when abroad, only to be met with phone bills in the thousands. We now see a number of providers offering free, unlimited data usage when abroad. While market leaders in the mobile industry were quick to adapt to the increased demand for overseas usage, financial institutions were not so accommodating. 
So rather than market leaders offering more competitive solutions, new companies began to spring up, offering solutions exclusively for overseas spending, money transfers and currency exchange. Many of these startups are quickly snatching up market share from the big banks and are on their way to becoming household names. The trend in the near future will likely be focused on competition between smaller companies, rather than bigger financial institutions.  Open your eyes to the blind spots Historically, global expansion was a vision reserved only for fully-fledged, large scale companies. In the here and now, the emergence of fintech disruptors and their next generation technologies have enabled SMEs to share in the same near-future aspiration for international growth. Whether you’re already actively making international transfers or are looking to take your first steps, here’s a few top things to consider that will ensure everyone benefits – allowing you to put cash aside and reinvest in your own business: The cost of international money transfers, especially through your traditional high street banks, can be potentially devastating. Take time to shop around and compare rates, including those from online money exchange services with lowest fees Seek out real-time currency exchange rates (without the added commission) – a day can really make a difference to your bottom line Save yourself time and energy that you can put into the business elsewhere by setting up alerts of when rates are at their best for you Explore alternative payment methods as you could save you a significant amount if your business requires you to regularly send international payments or pay overseas workers ### EMEA businesses have their head in the clouds In spite of an overall lack of education and skills relating to clouds, friendly networking technologies, the results revealed more and more EMEA businesses are investing in software defined wide area networking (SD-WAN), a technology that helps users access the clouds quickly and securely. Unsurprisingly, considering the fiscal and reputational damage that follows a data breach, the chief priority for IT teams (for 52% of respondents) is reducing the risk of cyber attacks - which overtakes artificial intelligence (AI), machine learning (ML), automation and Internet of Things (IoT) projects. These results prove that businesses are focused on solidifying their security strategy before moving towards further innovation-led development. Conversely, nearly half (47%) cited security as the biggest challenge when using clouds. When it came to wide area networks (WAN), however, performance to the cloud took first prize, with 42% voting it their biggest obstacle. Not all who SD-WANder are lost It is encouraging to see EMEA organisations are utilising SD-WAN to tackle the latency and security challenges across their clouds and WAN environments: over a quarter (26%) have already deployed SD-WAN, with nearly two thirds (64%) either considering, or in the process of, deploying. This leaves 7% of organisations with no plans whatsoever to deploy SD-WAN. Our research found that the key reason for EMEA’s adoption of SD-WAN was to improve application performance between locations (17%), narrowly beaten by the desire to reduce security risks at remote locations (16%). The IT C-Suite (CIO, CTO, CISO, etc.) 
is responsible for driving just under a third (28%) of EMEA SD-WAN deployments, a finding that differentiates EMEA from the rest of the world, where IT networking teams primarily drive these projects (29%). It appears that decisions within EMEA are made higher up the ladder, with CEOs driving 8% of SD-WAN deployments - ahead of both the US (5%) and APAC (7%). SD-WAN enabled organisations are already reaping the rewards. A staggering 98% have already benefited from SD-WAN, citing increased network security (46%) as the most impactful benefit. Respondents were quick to note financial benefits too, estimating an average of $1,312,311 (USD) saved in MPLS and networking costs in just one year thanks to SD-WAN. A long road ahead to the clouds As successful as this may all sound, EMEA still has a long way to go when it comes to SD-WAN education. Worryingly, just a third (33%) believe they totally understand SD-WAN and its offering - far less than the rest of the world (57% US, 41% APAC). Whatever the cause for this may be, all signs point to a shortage of the skills and expertise needed to deploy SD-WAN, a sentiment shared with more than a third (34%) of EMEA respondents. As for those who have yet to deploy the technology, the main obstacle is a lack of internal skills (30%), with a quarter (25%) admitting they simply do not know enough about SD-WAN. A frightening result, likely due to the absence of education on the subject, showed that over half (53%) of EMEA respondent believe SD-WAN is a buzzword that will fail to revolutionise networking. Equally alarming, half of our respondents believe the myth that their SD-WAN solution is fully equipped to keep their network secure, when in reality most SD-WAN solutions require additional security solutions. What’s more, a quarter (25%) of EMEA respondents believe SD-WAN security to be worse than a combination of a corporate firewall and virtual private network (VPN), significantly higher than their US (7%) and APAC (9%) counterparts. Admittedly SD-WAN has different security compared to traditional WAN, but that’s not to say it can’t be equally as secure. An apparent lack of education aside, the EMEA region remains hopeful; 43% believe SD-WAN will replace MPLS as the leading solution, with nearly two thirds (63%) recognising that their current WAN will become outdated if they fail to embrace SD-WAN. Acknowledging their lack of skills, many respondents were quick to address the need for further education within their organisations, with 64% agreeing that they currently don’t have access to the necessary training. Of those organisations who are in the process of, or considering, an SD-WAN deployment, under half (48%) said they will train existing staff, and 46% opting to hire new staff with specialised skills. When deciding on an SD-WAN solution, nine in ten (91%) value technical support and consultancy above all else. We do not need no education While these findings highlight the positive effect the new data regulation has had on EMEA organisations and their perception of cyber threats, it is clear that the region leaves a lot to be desired when it comes to SD-WAN education. Although IT teams are recognising cybersecurity as a primary concern (even becoming a CEO worthy issue), a shortage of SD-WAN skills and education is severely halting their cybersecurity progress, leaving them crying out for SD-WAN training. 
Consider this a call to arms for the industry: only through working together, through better education, and a combined focus on bridging the skills gap can we begin to truly reap the benefits of SD-WAN. Until then, confusion, cynicism and chaos may reign. ### 4 steps for boosting enterprise productivity Despite advances in technology, productivity in Britain is still lagging behind the rest of Europe, with official statistics finding that workers in Britain are about 20 per cent less productive than their counterparts in the United States, Germany and France, and that productivity has barely grown since the financial crisis. Recent research also suggests that there’s a productivity perception gap, with many senior decision makers rating their processes as efficient, but over half still spending the equivalent of a working month on administrative tasks per year. While it is unsurprising that business leaders assume their company processes are efficient, the findings suggest that they’re overestimating just how productive they are. And if business leaders are wasting this much time on admin, it is fair to say that this problem will run right through the organisation. Furthermore, if management and other employees are wasting time on inefficient processes, then attention will be diverted away from long-term planning, strategy and personal development – both their own and that of their employees. With an uncertain economic outlook, the UK finds itself at a crucial juncture for business, and in order to weather the storm, business leaders need to take charge of company productivity. Identify the weak spots in productivity Business leaders should conduct an audit of where they are spending their time, to build an honest picture of where improvements could be made. However, given the broad range of departments and processes involved in the day-to-day running of a business, pinpointing what’s holding the company back can be like searching for a needle in a haystack. Given this, and in light of the fact that business leaders are already over-estimating their company’s efficiency, it is clear that to get an accurate picture, they should be going directly to the source: their employees. Whether it is deploying collaboration tools, transferring core functions to the cloud for real-time progress reports, or simply appraising well-being culture, understanding what needs to be improved will put firms in a stronger position when weathering an unpredictable economic climate. Invest in the right technology As well as surveying employees in the business, it is also important to carry out a comprehensive assessment of the technology infrastructure already used within the business, to see where improvements and investments can be made. The good news is that there is a plethora of enterprise tools on the market for companies of all sizes that aim to make processes more efficient. For many businesses, it starts with moving data away from Excel sheets (or even paper) and into an Enterprise Resource Planning (ERP) system, which can go a long way towards solving the productivity puzzle. ERP systems can also centralise a number of other business processes, including CRM, financials, stock levels and sales. Technology-savvy organisations can easily enhance the ERP system by implementing advanced solutions such as AI-based chatbots, IoT sensors and more to further streamline and accelerate their operations.
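As a rough illustration of the kind of integration described above, the sketch below polls a hypothetical ERP REST endpoint for stock levels and flags items that need reordering. The URL, token, field names and threshold are assumptions made purely for the example, not any particular vendor’s API.

```python
# A minimal sketch, assuming a hypothetical ERP that exposes a REST API.
# Endpoint, response shape and credentials are illustrative only.
import requests

ERP_BASE_URL = "https://erp.example.com/api/v1"   # hypothetical endpoint
API_TOKEN = "replace-with-a-real-token"

def low_stock_items(threshold: int = 20) -> list[dict]:
    """Return items whose on-hand quantity has dropped below the threshold."""
    response = requests.get(
        f"{ERP_BASE_URL}/stock-levels",
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()
    items = response.json()  # assumed shape: [{"sku": ..., "on_hand": ...}, ...]
    return [item for item in items if item["on_hand"] < threshold]

if __name__ == "__main__":
    for item in low_stock_items():
        print(f"Reorder needed: {item['sku']} ({item['on_hand']} left)")
```

The same pattern, a small script or mobile app calling a centralised ERP API, is what allows staff in the field to see the same stock picture as head office.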
However, new technology can also be daunting for some businesses, and is often put to one side as concerns about costs or disruption to the business are pushed to the forefront. According to a report from Bloor Research, 30 per cent of data migration projects fail, and 38 per cent run over time or over budget, so it’s important to get it right. It is also important to keep in mind that a one-size-fits-all solution might not work for everyone. The same analytics tool that’s used for a sales team in London might not be appropriate for the team in New York. Arm employees with the tools to work remotely While the productivity-related benefits of mobile working are tangible, the unfortunate reality is that many companies are lagging behind. In a recent survey, over a third of senior decision makers did not have the correct technology to support mobile working, and 43 per cent could not perform business-critical functions via mobile applications. The good news is that the technology exists to make this possible. Modern ERP systems now support mobile application generators, which allow everyone from CEOs to HR to sales to create a range of applications from their mobiles, and use them to access relevant data and perform core business processes no matter where they are. They can be created in a matter of minutes, and don’t require high levels of IT knowledge. So, a sales rep can be in the field and still have the ability to check factory stock levels, get in touch with delivery drivers, and perform transactions, without having to be at a computer. Put your employees first The happier your employees, the more productive they are going to be. Unfortunately, there’s still a perception within the UK that longer hours and working through lunch breaks equate to increased productivity. Indeed, a recent survey found that almost a third of senior decision makers were taking a full lunch break less than once per week. While this might create the illusion of productivity, working through lunch while wasting time on inefficient tasks points to poor time management and, ultimately, an unhappier and less efficient workforce. It is also important to adopt a flexible working culture that permeates throughout the company, and isn’t just reserved for the elite, so that employees can work according to the schedules that suit them best. Finally, business leaders should also take into account the importance of both mental and physical health, and ensure that the working environment does not cause either to suffer. With so many trade bodies, think tanks and financial analysts all calling for a rise in enterprise productivity, it has never been more crucial for businesses to take action. By following these steps, business leaders can gain a clear understanding of where processes are falling down, and should arm their employees with the tools they need to plug the gaps and work efficiently. ### Innovation | How CIOs Can Help To Unlock Innovation For CEOs “Intelligence is the ability to adapt to change.” These famous words from Stephen Hawking have never been more relevant in the world of business today, mainly because we are currently experiencing the fastest rate of technological change in history. As a result, businesses globally are under pressure to adapt to the opportunities and threats that this brings. Whether a business is a Fortune 500 or an ambitious new start-up, organisations need to embrace innovation and agility in order to not only survive in the digital era but thrive.
Done well, all parties involved will reap the rewards of financial stability, industry leadership and long-term sustainability. That said, it cannot happen in silos. CEOs need the support of other C-level executives to reach their goals and are increasingly turning to their CIOs as the first step to initiating change. That strategy has the support of executives across organisations: in the ‘State of the CIO’ report, line-of-business leaders see CIOs as strategic advisors and consultants when it comes to technology and business decisions. Only in the last six years has the CIO become a strategic player in the C-suite, reporting directly to the CEO. As digital transformation initiatives continue to be top of the agenda for organisations today, that trend will only continue. Digital disruption Thanks to the digital era, the role of today’s CIO has evolved. In addition to their day-to-day operational duties, IT leaders are now responsible for driving digital disruption opportunities forward within the business to bring about transformational change. Those responsibilities can extend even further, to an organisation’s marketing efforts, such as how it communicates with customers and delivers on new experiences. Deloitte’s popular ‘Global CIO Survey’ report looked at how the lines between IT capabilities and business expectations are blurring, and how CIOs are better placed than any other C-suite executive to drive digital strategies across an organisation. The report revealed a significant shift in a CIO’s top priority, from “business performance” to “customers”. Interestingly, the concept of a ‘digital iceberg’ stood out in the Deloitte report. It found that some CIOs and business leaders see ‘digital’ primarily as the customer-facing technologies and front-end tools which a business can utilise. But this is just the tip of the so-called iceberg. Others see digital as an entire mindset, where they believe technology is fundamental to transforming entire business models. This requires companies to embed technology into every part of the business, transforming the role of a CIO entirely. The CIO is now the disruption evangelist in an organisation. They’re responsible for harnessing new innovations and forging a digital-first path forward for how the business operates. By identifying and employing the right business solutions, the way that work across the organisation is delegated, completed, and assessed can and will fundamentally shift. For example, cloud technology is now seen as the new normal in managing an organisation’s IT, as it alleviates the burden of legacy infrastructure management costs and maintenance hassles. As a result, it frees up the technology team for more strategic, growth-focused work – the second most valued business priority for CIOs in Deloitte’s report. Widespread digital transformation is changing the business landscape in unprecedented ways. CIOs are now leading this process, ensuring employees, partners and customers receive the right tools and information. And that responsibility extends even further, to driving business growth through technology and generating new revenue streams. Innovation and its influence In any digital transformation project, business leaders’ and CEOs’ primary focus is on the end-user experience. But that’s only half of the equation; the impact a new digital-first approach can have on the internal organisation is often overlooked.
That’s why it’s critical for CIOs to stimulate a new culture by motivating existing staff to shift their focus away from operational excellence to business transformation, inspiring an ethos of calculated risk-taking and a focus on outcomes. Without the talent to drive digital initiatives forward, the most well-thought-through plan will have little chance of success. Unfortunately, half of organisations (50%) feel that innovation and disruption priorities either do not exist or are still in the process of being built, despite 57% of CIOs surveyed stating that these are now a core part of their responsibility. Logical solutions that complement an existing IT architecture, such as capabilities for the integration of services and the flexibility to configure applications, may not seem to support customer-centric objectives at first, but the effects they have will be felt nonetheless. The reality is that CIOs must assess the impact of innovation and how it will apply to business goals before a digital transformation strategy is implemented. That means understanding the behaviours, demographics, and motivations of all stakeholders, including employees, customers and partners. Striking the balance between building systems and implementing technologies that benefit both internal and external stakeholders is where a CIO, and ultimately the organisation, will thrive. CIOs shouldn’t be afraid of championing short-term disruption, as it can yield long-term benefits at a time when organisations need it most. ### Technology | Alleviating Environmental Effects On The Food System According to a study published recently in a Nature Research journal, adopting innovative technology is essential in order to guarantee that the world’s population can be sustainably fed by 2050 and to ensure that the food system is contained within environmental limits. The research identifies the food system as a significant contributor to multiple environmental impacts, with climate change arguably being the most crucial. However, without a strategy to combat these effects, one that also embraces vital technological changes, the impacts could rise by up to 90 per cent. One significant way to reduce the environmental impacts of the food system is to address the problem of food waste. Over one third of all food produced is either lost prior to sale or thrown away by consumers. Therefore, as part of the drive to halve food loss and waste by 2050, our entire relationship with food needs to be addressed, and this can only be driven from the top. While improving education and expectations for consumers is imperative, as is the focus on increasing biodiversity, we will ultimately see these changes come together within the food supply chain. The solution to delivering these systemic changes and lowering levels of waste at every stage of the supply chain is to embrace digitisation, using IoT technologies to provide real-time insights and visibility into machine performance at a granular and actionable level. It is vital to effectively manage temperatures at each point throughout the entire cold chain to maintain product integrity; from the production stage to processing, to the retailer and through to the consumer’s home.
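As a simple illustration of the kind of edge-based monitoring described here, the sketch below smooths incoming fridge sensor readings and requests corrective action when the rolling average drifts outside a safe band. The unit ID, safe range and window size are illustrative assumptions, not figures from the study or from any retailer’s deployment.

```python
# A minimal sketch (assumed setup, not the article's implementation): an edge
# gateway ingests fridge sensor temperatures, smooths them over a small window,
# and flags a corrective action when the rolling average leaves the safe band.
from collections import deque
from statistics import mean

SAFE_RANGE_C = (1.0, 4.0)   # illustrative band for chilled produce
WINDOW = 5                  # readings per rolling average

class FridgeMonitor:
    def __init__(self, unit_id: str):
        self.unit_id = unit_id
        self.readings = deque(maxlen=WINDOW)

    def ingest(self, temperature_c: float) -> None:
        self.readings.append(temperature_c)
        if len(self.readings) == WINDOW:
            avg = mean(self.readings)
            if not (SAFE_RANGE_C[0] <= avg <= SAFE_RANGE_C[1]):
                self.correct(avg)

    def correct(self, avg: float) -> None:
        # In a real deployment this would drive the controller or alert staff.
        print(f"[{self.unit_id}] rolling average {avg:.1f}°C out of range, "
              f"requesting corrective action")

monitor = FridgeMonitor("store-042-dairy-01")      # hypothetical unit ID
for reading in [3.2, 3.5, 4.1, 4.6, 5.0, 5.3]:     # simulated sensor feed
    monitor.ingest(reading)
```

Layered over existing refrigeration controls, this sort of logic is what turns raw sensor data into the automated corrective action discussed below.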
Refrigeration has an essential part to play in prolonging produce shelf-life, and it is estimated that increasing the shelf-life of a product by just one day could reduce avoidable waste by 5%. With optimised temperature control through automation, critical prevention and automated corrective action, not only can shelf-life be extended, but the quality and aesthetic of the product can be enhanced by preserving flavour, colour and texture. This can ultimately improve the customer experience, whilst also avoiding unpleasant-looking products being discarded by both retailers and customers. Crucially, these outcomes are being realised in some retail environments by layering digitisation over existing infrastructure. For retailers, it is simply not financially viable to rip and replace costly control infrastructure across potentially thousands of locations. Instead, by leveraging real-time data and connecting, mapping and integrating this data through edge-based processing to deliver dynamic monitoring and enable automations, IoT capability can be achieved at pace, with minimal downtime and without the need for significant capital investment. This is an innovative and pragmatic approach to using technology to reduce levels of food waste, but to achieve its full potential across the cold chain, a collaborative strategy towards managing temperature-controlled environments is crucial. Deploying effective and efficient cooling strategies throughout the cold chain can deliver numerous business benefits, including ensuring the highest levels of safety and quality in the end product, and drastically lowering levels of waste at each stage, including in the home of the consumer. Delivering these essential changes in global food consumption and production is not a quick fix, but by embracing technology as part of a complete strategy throughout the food chain, huge progress will be made towards addressing endemic wastage and ensuring that the food system stays within environmental limits. ### Gaining a clearer understanding of automation As more and more organisations look to get the best out of their staff, intelligent automation is the only true driver capable of freeing up employees from mundane tasks. In fact, the World Economic Forum predicts that 42% of tasks will be performed by machines by 2022, compared with almost 29% today. However, that doesn’t mean that employees will lose their jobs: 38% of the 300 organisations that took part in the Future of Jobs report expect to increase their headcount. But how do organisations go about making the best of man and machine? Making the most of what you’ve got Encouraging the human workforce to hone their skills and focus on higher-value strategic work requires good resource planning and people management. However, those taking on first-time automation projects should keep in mind that success lies in taking the right approach; simply thinking in terms of automation for automation’s sake won’t deliver the right implementation or the long-term results. Instead, organisations should work with suppliers that help them focus on transformation: first helping them to adopt people-centric workflows, then looking at transforming these into machine-centric processes. Ultimately, effectively orchestrating collaboration between the skilled workforce (people) and intelligent machines requires a new set of skills, techniques and technologies called Service Orchestration.
These three steps cannot be achieved in one go, so plans must be multi-phase, rapid and iterative to maintain momentum and mindsets. This will help drive results across the entire organisation and reduce variables and risk. The journey to automation is constantly and concurrently happening at different stages. Small and incremental changes in the organisation, the market, or technology trigger novel processes, knowledge, or skills that will also need to take that journey, or risk dragging the organisation back to manual operations. An easy signal of this is the growth of spreadsheets as proxies for management tools. Automation is something that must be embedded in the organisational psyche, in the DNA of every manager, every employee, and every part of the organisation, its operations and its processes. While naive organisations focus on automation, smart organisations are looking towards business transformation. Organisations need to understand how employees can collaborate with the technology, and this is a critical factor in enabling their business to compete harder, operate faster, and deliver better outcomes at a lower unit cost of delivery. Starting the journey towards automation Perhaps the best place to start is by looking at why an employee is in a particular role – what do they actually do? Posing this question helps business leaders to recognise the value of the particular skills, knowledge, or experience that individual employees bring, rather than just the actions they carry out. In turn, this helps to define the domains for which subject matter experts are required, who can then define objectives and build an understanding of the wider benefits automation can bring to the business. It also helps to identify the various types of automation that will best suit the initiative and what the evolutionary roadmap is from there. Automation breaks down into a complicated mapping of workflows, processes, tasks and functions that are simple, complicated, complex, creative, personal, novel, and unknown. Here is an outline of the different types of automation that organisations should be aware of. Task driven automation – Simple tasks require no other input than the starting information; they sit in a singular domain and either succeed or fail. Simple task automation tools such as scripts, macros, visual sequencing technologies and Robotic Process Automation (RPA) are useful here. Examples of simple tasks include updating client profiles, processing forms, and entering and transferring data from one system to another. Decision driven automation – Logical business rules consuming internal and external information, applied both to trigger tasks and to advance processes and workflows. For example, assessing the value of an insurance claim to establish its validity or what level of scrutiny it requires. Workflow driven automation – Arranges work into regular sequences of steps for individual or groups of manual tasks, where the resolution of the process is low and tasks are grouped into operational steps. This is effective in people-centric environments such as document processing or service desks, which require the flow of work between one set of skills, knowledge, or experience and another. Business processes are often defined as workflows, such as claims management, helpdesk ticket processing, and application processing. Shared services such as IT, HR and Finance can effectively apply workflow automation in a wide variety of areas to start the automation journey.
Process driven automation – Enabling the automation of individual business transactions through high-definition end-to-end processes, with or without human involvement, such as “straight-through processing”. This requires capturing the skills, knowledge and expertise of the operators, as well as the sequence of tasks, which are not commonly well documented or even known within businesses. With intelligent automation, processes vary between those that can be entirely automated and those which are personal or creative and must have user/operator/actor participation; even variants or permutations of an otherwise fully automated process may require participation whenever the process asks for it (either a legal requirement or an exception-based scenario). Examples include document processing, such as claims management and adjudication, or the onboarding of employees and the leavers process. Process Orchestration – The evaluation and analysis of situations to sequence decisions, tasks and processes to achieve a prescribed end-state and output. This is a classic approach for joiners, movers and leavers automation – a complex but predictable process. Full process orchestration must allow for managing the unknown and providing for intervention without losing the context of any individual transaction. In turn, this requires good insights and visibility of performance to ensure machines are not getting things wrong. More advanced process orchestration is often confused with machine learning as it adapts to new environments. Service Orchestration – The evaluation and analysis of situations and varying external factors (context) to sequence decisions, tasks and processes to optimally seek, or maintain, a goal. A good example is service orchestration for IT infrastructure: it is always complex and requires constant consideration of the changing, sometimes failing, environment. The full lifecycle of an individual service may run for months or years, during which the underpinning environment will inevitably change and possibly become unpredictable. Ultimately, organisations embarking on a workforce transformation project should look to automation technology to empower their employees to work smarter, be more productive and improve their working environment – all the while freeing employees from repetitive tasks and allowing them to focus on other activities which require human characteristics such as creativity, control, judgement and emotional intelligence. ### How AI helps retailers hear their customers' voice Customer feedback is key to success for retailers. According to PwC’s recent Consumer Intelligence Survey, three quarters of consumers cite customer experience as an important factor in their purchasing decisions, behind product quality and price. Delivering a good experience requires retailers to listen to their customers, and act on what they’re saying. Indeed, Morrisons’ chief executive recently suggested that customer feedback was an important factor in the supermarket chain’s three consecutive years of multichannel growth, saying that “listening to customers, responding, and improving the shopping trip are as important now as when we started this turnaround three years ago.” By monitoring and measuring customer feedback, brands will be better able to respond to customer demands in a timely manner, improve their products and services, protect their reputation, and predict any trends that might impact their business in the future.
The challenge lies in measuring customer feedback and interactions across multiple channels in real time. Opportunities and challenges For generations, retailers have been listening to the demands and bugbears of their customers. Today’s multichannel environment, however, requires a deep-reaching, digitally focused solution. Voice of Customer (VoC) analytics provide valuable intelligence on how consumers perceive a particular brand. By capturing everything a customer says about a retailer and its products – their expectations, their preferences, their dislikes – VoC data can help that retailer shape its offering, and present it at a price that customer is most prepared to pay. What’s more, by combining VoC data with product data, of which most retailers have many years’ worth, it becomes easier to predict trends in demand and sales. However, while a recent survey revealed that the majority of retailers consider VoC to be important, fewer than three in five actually plan to invest in it. Given the variety of data that must be acquired and evaluated for VoC to be effective, this is perhaps unsurprising. While it’s relatively straightforward for software to aggregate and analyse structured data such as pricing information, product ID numbers or quantitative data from opinion surveys, it’s considerably less easy to analyse unstructured data which, unfortunately, is where the real insights lie. Customers of a truly multichannel retail operation will use phone, email, instant messaging and social media to give feedback, make requests or lodge complaints. This information is largely unstructured and written in natural language, which makes it incredibly difficult to merge and analyse, particularly when it is drawn from multiple different sources. Machine reading Handling the sheer volume of data produced by VoC analytics can prove challenging for any retailer. Efficiently and effectively analysing, understanding and gaining value from this information can only be carried out by a machine using a process known as ‘text mining’. This process is particularly important given its ability to read unstructured textual data. As we’ve seen, this data contains more valuable context and insights than its structured, transactional counterpart, by reflecting the opinions, intentions, emotions and conclusions of its authors. Through the application of AI technology, these machines can learn to read written text and not only identify any mentions of people, places, things, events and time-frames that occur, but also assign emotional tone to each of these mentions. It’s even possible for machines to determine whether a document is based on fact or fiction. By acquiring and processing thousands of documents and articles every second, these machines are capable of reading at the pace required for VoC analysis to be effective. It’s important, therefore, that as the volume of data generated by VoC analytics will only increase over time, retailers should invest in an AI solution that is able to process immense amounts of data efficiently and that can scale as necessary. Keep listening The customer is king, and always has been. According to KPMG, they are also telling retailers how to run their business. The problem is “many brands aren’t listening. 
They may collect reams of data about what customers do and what they buy, but they’re not paying enough attention to what customers actually want or value most.” However, by implementing powerful AI-enhanced technology in conjunction with text mining capabilities, it’s possible for retailers to unlock the valuable intelligence provided by VoC analysis, and more clearly understand how their customers feel, what they like, what they don’t like, and how much they’re prepared to pay. This understanding will then, in turn, allow them to better engage with their customers, offering them what they want when they want it, and delivering the experience so important to guiding their purchase decision. In an increasingly complex and competitive multichannel digital environment, the voice of the customer is the most important guide for any retailer, and one that they should pay close attention to if they hope to not just survive but thrive. ### Cloud Migration | The Three Essential Stages Of Cloud Migration Forrester forecasts that the private cloud market will be worth $26bn by 2022. However, migrating data to the cloud can be daunting, especially for organisations that have legacy systems in place and that have had to overcome cultural reluctance to the change. A good cloud service provider will support the organisation through the migration process at the pace that suits them. A rushed process will often fail, as the provider doesn’t take the time to understand the specific requirements of the business they are working with. A thorough migration process entails an analysis of the business’s needs before moving into a detailed risk assessment and project scope. Only then does the operational stage begin. The cloud migration process doesn’t need to be complicated – provided the business has the right support in place – but it should not be rushed. A successful migration process can create the foundation of a strong and agile business, so it’s vital that each stage is managed in a methodical manner, and that the business continues to receive the full support of its provider. Stage one: Analysis of technical and business needs The migration process must start with an analysis to determine the possibilities and the purpose of migration. In our experience, this stage should take from one to three months. During this stage, the provider works with the business to understand the current application landscape and to determine which applications are critical to the business’s success. The provider must take this time to assess how prepared the business is to make the move to the cloud (or from another cloud service provider). The provider should perform a technical analysis to ensure that the applications the business wants to transfer are not only cloud-ready but optimised for operating in the cloud. Will the business have the same standard of network security? Does it want to use the migration process as an opportunity to switch to new solutions? Providers also need to consider business factors. Is the executive team enthusiastic about the change? Is the business ready to change long-established processes? Has the business determined how it will manage its data security? Has it assessed all of the costs? There will be risks and benefits to moving data to the cloud, or even moving cloud providers, so it’s essential that the business be both fully invested in the success of the migration process, and prepared for the changes that will happen.
Stage two: Risk analysis and migration project scope Once the provider has established the business’s unique requirements, it should perform a risk analysis and begin the project management process. What are the main elements that could go wrong during the migration process? What backup plans do the business and provider have in place? How will the business be able to maintain its services while the migration process takes place? The execution stage of the migration process can last anywhere up to six months, depending on the scale and complexity of the project. Limited downtime is essential to minimise service disruption for the business’s customers, and there must be a way to safely roll back changes. At this stage, the business should have a full project plan so that applications are switched over at a time that minimises the impact on the business and its services. Stage three: Managed service operation In this final stage, the business should expect its provider to be focusing on maintaining a strong and reliable managed services solution. Support should be available 24/7, and the provider should have an improvement programme in progress to continually simplify the service and improve efficiency. Businesses need to select a provider that understands the pressures of their industry and has extensive experience in managing migrations across sectors and in different countries. It needs to have a secure and reliable service, but also one that allows the business to be agile and continue to innovate. The success of a migration process depends not just on the experience and skill of, and collaboration between, the client and provider, but also on the importance that the provider places on client experience. The support process must be as frictionless as possible for the migration process to succeed. Businesses continue to migrate their services to the cloud due to the clear benefits that cloud offers. There’s often a cost reduction in making the switch, and you can’t ignore the scalability and flexibility that cloud services provide. Operating services in the cloud allows businesses the flexibility to keep up with new trends and become innovators in their sector. However, these benefits are all dependent on a successful cloud migration process. While it’s a complicated and often lengthy process, the cloud service provider should be there to guide the business through each stage of the migration. ### Data Protection | What is GDPR? The General Data Protection Regulation (GDPR) is a legal framework agreed upon by the European Parliament and Council, superseding the 1995 Data Protection Directive. It came into effect across the European Union on May 25, 2018, as the primary law regulating how companies protect EU citizens’ personal data. It sets the rules for the gathering and processing of private data of people within the EU, while also imposing fines that can be revenue-based. It covers all companies that deal with the data of EU citizens, making it a critical regulation for corporate compliance officers at banks, insurers, and other financial companies. Companies that are already in compliance with the directive should ensure that they are also compliant with the new requirements of the GDPR. Companies, corporations, and firms that fail to achieve General Data Protection Regulation compliance are going to be subject to stiff penalties and fines.
Plainly put, the General Data Protection Regulation mandates a baseline set of measures for companies that manage EU citizens’ data, to properly safeguard the processing and flow of that personal data. The General Data Protection Regulation’s requirements apply to each and every member state of the European Union, aiming to create a more consistent protection of consumer and personal data across EU nations. Some of the essential privacy and data protection provisions of the General Data Protection Regulation include: Requiring certain companies to designate a data protection officer to manage GDPR compliance, Requiring the consent of citizens for data processing, Anonymizing obtained data to preserve privacy, Providing data breach notifications, Carefully managing the flow of data across borders, and much more. Under the General Data Protection Regulation, companies and firms may not lawfully process any person’s personally identifiable information (PII), and are encouraged to pseudonymize the data they do hold, unless they meet at least one of the following conditions: Express consent (permission for something that is given specifically, either verbally or in writing) of the data subject. Processing for the performance of a contract with the data subject or to take steps to enter into a contract. Processing for compliance with a legal obligation. Processing to protect the vital interests of a data subject or another person. Processing for the administration of a task carried out in the public interest or in the exercise of official authority vested in the controller. Processing for the purposes of legitimate interests pursued by the controller or a third party, except where such interests are overridden by the interests, rights or freedoms of the data subject. Pseudonymization means that the data can’t be associated back to a particular person without additional information. The pseudonymization of data allows companies and firms to do some comprehensive data analysis, such as assessing the average debt ratios of their customers in a particular region, that would otherwise be beyond the original purpose of data collected for evaluating creditworthiness for a loan. What is personal data under the General Data Protection Regulation? The kinds of data deemed personal under the previous legislation include names, addresses, and photos. The General Data Protection Regulation extends the definition of personal data, so that something like an IP address can be personal data. It also adds sensitive personal data, such as genetic data and biometric data, which could be processed to uniquely identify an individual. What does the General Data Protection Regulation (GDPR) mean for consumers/citizens? Because of the sheer number of data breaches and hacks which have occurred over the years, the unfortunate reality for many is that some of their data has been exposed on the internet. Under the General Data Protection Regulation, consumers’ and citizens’ (or simply put, data subjects’) rights include: The right to be forgotten - data subjects can request that personally identifiable information (PII) be erased from a company's storage. The company has the right to refuse requests if it can demonstrate a legal basis for its refusal. The right to be forgotten is the concept that individuals have the civil right to request that their personal information is removed from the Internet.
The right of access - data subjects can examine the data that an organization has stored about them. The right to object - data subjects can deny permission for a company to use or process the subject's personal data. The company can ignore the opposition if they can satisfy one of the legal conditions for processing the subject's personal data but must notify the subject and explain their argumentation behind doing so. The right to rectification - data subjects can expect fallacious personal information to be corrected. The right of portability - data subjects can access the personal data that a company has about them and transfer it. ### Blockchain Technology – The Future of Storage? The Current State of the Cloud Today’s cloud solutions are incredibly useful for both individual and enterprise use cases. Options like AWS or Microsoft Azure have become an indispensable tool for hobbyists and businesses alike, offering a great deal of flexibility with regards to storage and computing. The utility of the cloud, however, is largely undermined by its architecture – the servers storing data are, after all, centralised. The fact that a single business controls these raises certain privacy and security concerns, particularly at a time where data breaches have been rampant. Even titans like Tesla have opened themselves up to compromise in this manner. This reliance on a third party to keep data safe is perhaps one of the biggest downsides to cloud storage. It’s risky to aggregate sensitive information into a central point of failure, but, to date, it seems the pros have outweighed the cons – there’s the cost-efficiency consideration, as well as the ease of sharing files with people around the world (handing a coworker a flash drive directly simply isn’t feasible when the workforce is spread across multiple locations). To usher in a superior alternative, one would need to devise a system that is cheap, scalable and more secure. Exploring Blockchain Technology’s Potential I’ll defuse a common misconception straight away – blockchain tech is not the magical, multi-purpose infrastructure (as most of its hype would have one believe). As far as scalability and speed go, it’s actually terrible. Remember that a distributed ledger requires that hundreds, if not thousands, of nodes store a copy themselves. It needs to be slow to allow nodes to download data, and can’t store large amounts of this without requiring nodes to pay significant storage costs (which would have a knock-on effect of hurting decentralisation, as less will participate). Similarly, there’s another dangerous misconception about privacy. With blockchains, which are upheld by the idea that anyone can view the ledger, there is no privacy. If one did, for whatever reason, decide to upload sensitive data onto the network, it would be observable by anyone, forever. Blockchains are excellent at one thing, however – immutability. Once a block is appended to the chain, it’s extremely difficult to remove or edit the entry (most would say it’s technically impossible). The decentralised nature of a distributed ledger further means that two strangers can transact or exchange information in a trustless manner, as their interactions are governed by code and not by oversight of a third party. We saw this with Bitcoin as currency (and a few other limited use cases), but second-generation networks like Ethereum open up a wealth of other interesting applications with Turing-complete smart contracts. 
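Before moving on, here is a toy sketch of the immutability property described above: each block commits to the hash of its predecessor, so editing an earlier entry breaks verification of everything that follows. It is an illustration of the principle only, under simplified assumptions, and not how Ethereum or any production chain is actually implemented.

```python
# A toy hash-linked chain: tampering with any appended entry is detectable
# because every later block commits to the hash of the block before it.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

def is_valid(chain: list) -> bool:
    return all(
        chain[i]["prev_hash"] == block_hash(chain[i - 1])
        for i in range(1, len(chain))
    )

chain: list = []
append_block(chain, "pointer to off-chain content (illustrative)")
append_block(chain, "access granted to address 0xABC... (illustrative)")

print(is_valid(chain))            # True
chain[0]["data"] = "tampered"     # edit an earlier entry
print(is_valid(chain))            # False: the chain no longer verifies
```

Real networks add consensus, signatures and economic incentives on top of this idea, but the hash-linking is what makes the ledger so hard to rewrite.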
Blockchain Technology and Distributed Storage Smart contracts were conceptualised by Nick Szabo in the 90s, but have only recently become a reality, allowing code agreed upon by the parties to execute without a middleman’s interference. Blockchains are an ideal medium on which these can thrive, and I think they are set to play a crucial role in granting, revoking or otherwise altering permissions over data caches. For the reasons elucidated above, storage directly on-chain is impractical (not to mention expensive). The Ethereum blockchain, for instance, can only handle up to 15 transactions per second, by virtue of its Proof-of-Work mining mechanism. It’s worth considering alternative options that would allow blocks to be processed faster and thus boost the throughput of the network whilst enhancing its scalability. Ethereum has proven useful for storing and executing smart contracts and for transactions in ether, but recording even the simplest of text files on-chain incurs extortionate costs. I think the solution lies not in using a blockchain as a standalone tool, but as a complement to off-chain distributed storage mediums like IPFS. On one hand, you have a powerful and unalterable record of which addresses have interacted with a specified smart contract; on the other, efficient and encrypted storage. An individual (or business, for that matter) could easily draw up a smart contract containing an address that points to given content held off-chain. The content is encrypted, of course, and is not available for download unless the person wishing to access it either pays a set amount of tokens (an idea which opens up a lot of potential for content management without going through third parties) or has been given explicit permission to access the content by the owner. With this system, the blockchain keeps an immutable, traceable record of when access to content has been granted and to whom, and at the same time users are able to enjoy the flexibility of distributed storage without the risks of centralisation. Moving Forward We’re starting to see a lot of these projects mature, and that can only be a good thing: for all its advantages, cloud storage poses some major risks to the privacy and security of its users. The idea of a single party being in control of an abundance of sensitive data is one that needs to be deprecated, and few technologies match blockchain in their potential to make this a possibility. ### Worth Stealing | Protect your E-commerce site Hackers love e-commerce websites, whether you’re a major retailer or the owner of an e-commerce site that processes only a handful of transactions. In fact, consider this somewhat sobering fact: 60% of cyber attacks target small or medium-sized businesses over larger corporations. The simple reason is that they’re usually easy targets. The data is normally worth stealing purely because of how easy it is to get. Don’t want your customer data available for download on dodgy chat rooms or being sold to the highest bidder? Read on for the best ways to protect your e-commerce site from cybercriminals. Don’t have anything worth stealing Hackers target websites to obtain sensitive data worth stealing, with credit and debit card details at the top of the list. Several household names have fallen victim to such data breaches, including: Home Depot, 2014 - the American retail chain had data from over 50 million customers stolen.
This hack included credit card details, resulting in roughly USD $179 million in settlements. Hilton Hotels, 2015 - considering the number of bookings this giant chain processes on a daily basis, it’s surprising they got hacked. But it happened, with credit card data stolen from dozens of their brands, including DoubleTree and Hampton Inn. The solution? Don’t have anything worth stealing housed on your server. Considering that it’s now forbidden to store client payment details, it’s a no-brainer. Update your code Most business owners will either use a pre-packaged e-commerce solution or will hire a developer for a one-off job. The site will start off great, forming an unconquerable bastion with all WordPress security plugins (or modules if you’re running Drupal, you get the picture!) up-to-date. But it usually all goes downhill from there. The website is essentially left to its own devices, with all requests for security updates ending up unopened and unread. And this is where things usually get ugly, as a large proportion of updates are a direct response to vulnerabilities that hackers have exposed. Avoid this problem by regularly updating your plugins; it usually takes little more than a couple of clicks, so no excuses! Choose the right host Not every host is created equal, which is a point that many business owners, unfortunately, learn the hard way. In addition to ensuring your code is up to scratch, it’s essential you keep your e-commerce solution running on a decent hosting server. Instead of shared hosting, opt for a securely dedicated provider. Pay particular attention to the inbuilt security solutions on the server and the backup options (should the worst happen!). Think about this decision in terms of the following analogy. Would you run your physical business out of a cheap motel that’s overflowing with guests and where your customers won’t feel safe? The answer is a definite No. You’d rent a proper unit in the right district, providing an environment that encourages sales and keeps your business safe. Hosting works in precisely the same way. Don’t be cheap and buy adequate hosting from the get-go. Use SSL For some of you, this tip will result in a  ‘well, duh!’ reaction. However, it’s surprising just how many e-commerce websites don’t use SSL to encrypt user data, especially that worth stealing. And of those that use SSL, it’s only a minority that force HTTPS across all instances. In this case, our message is clear and straightforward: Use SSL. Enforce https. SSL certificates are essential, period. They protect customer data via encryption, ensuring hackers can’t do anything with the intercepted information. Considering just how easy it is to implement using services such as letsencrypt.org (and it’s free!), it really should be one of the first tasks you complete before making your site available to customers. Require proper passwords Don’t trust your customers with their passwords. After all, the most common options in 2017 were 123456 and password (yes, the word ‘password’!). Nudge customers in the right direction by requiring them to use strong passwords as part of the sign-up process. Most e-commerce solutions will have an option for this, requesting the use of a minimum number of symbols, digits, and capital letters. When it comes to creating a strong password, the traditional advice still holds true: The longer, the better. 12 characters should be the minimum. Mix it up by using a variety of numbers, capital, and lower-case letters, as well as symbols. 
Do not combine dictionary words. Don’t use words that can quickly be cycled through using a simple program. Avoid basic substitutions. No, using a zero instead of an o is not a clever solution. Penetration testing Penetration testing, otherwise known as ‘ethical hacking,’ is a process in which your website undergoes a series of tests in order to identify and exploit any existing weaknesses. For example, attempts can be made to log-in to the back-end of your e-commerce solution, ‘stealing’ data. Penetration testing is often confused with a vulnerability assessment. However, these are two crucial yet entirely distinct processes. The latter use vulnerability scanners to identify technical threats, while penetration testing qualifies the threat by actively attempting to exploit vulnerabilities. This not only clarifies any potential false positives but also lays bare the true extent of the problem. The takeaway message Dealing with hackers is mostly all about prevention. Remember, most are implementing pre-coded scripts en masse and are looking for the easiest victims. As data becomes more valuable and worth stealing to cybercriminals just like petty robbers opt for open doors and unsecured locks, all it takes is a little bit of simple preventative measures to scare off most would-be hackers. ### Best Practices for Cloud Negotiation The journey has been long and full of challenges, but the day has finally come. Linda’s organisation has decided to move to the cloud. As the sourcing, procurement and vendor management (SPVM) leader, it’s her duty to choose the right cloud solution and negotiate the best possible contract. But this is easier said than done. Infrastructure as a service (IaaS) is the fastest growing cloud model worldwide. Gartner predicts that the cloud compute IaaS market, regarding end-user spending, will achieve a compound annual growth rate of 28.7 percent between 2017 and 2022. Linda can choose one of the hyperscale providers such as Amazon Web Services (AWS), Microsoft Azure, Google or Alibaba. Or, she can opt for a more traditional vendor with large-scale cloud offerings such as IBM, DXC or Fujitsu. The selection process goes beyond assessing providers’ technology and functionality. Linda must develop a clear negotiation and risk mitigation strategy to avoid hidden and indirect costs, effectively determine potential risks, and negotiate favourable terms and conditions. Below are three best practices that Linda and other SPVM leaders can use to successfully negotiate on a cloud solution. Keep common mistakes top of mind One big advantage of cloud services is that they enable users to buy services easily and directly. This is also a risk. Nearly half of IT spending on the cloud is outside of the IT budget. That means the IT department, in a number of cases, is unaware of the spending and can’t apply practices and risk mitigation steps. In the era of the General Data Protection Regulation (GDPR) and other data compliance regulations, unsanctioned use of cloud services can be a serious threat to businesses. SPVM leaders must keep these risks as well as others in mind before they choose a cloud provider. They should consider current and future security, compliance and regulatory requirements to ensure cloud IaaS does not pose a risk to the business. Prepare your must-have list A lot of organisations struggle to accept the standard terms and conditions of public cloud IaaS contracts. 
The most challenging factor is likely to be location, as deals across different geographies introduce risks such as varying contractual practices and unfamiliarity with local systems. SVPM leaders have to assess their firm’s main areas of concern first and create a must-have list. The second step is to identify the associated terms and conditions in the IaaS agreements and see if they address concerns or need to be renegotiated. For example, every contract should contain customised end-of-term agreements for moving data back in-house or to another provider. The standard termination notice is 30 days, which may be insufficient for significant amounts of data. Consider hidden costs and total cost of ownership (TCO) Cost reduction is one of the main drivers of public cloud IaaS adoption. In a recent Gartner survey, 50% of respondents considered cost savings as one of the primary reasons for moving to the cloud. However, the ROI is often not as obvious as expected and needs to be carefully analysed. Add-ons, data transfer, security, backups and many other small but necessary capabilities create a pile of hidden costs. Having a good understanding of hidden costs is extremely important to create an accurate business case and estimate TCO. SVPM leaders should develop best, realistic and worst-case scenarios, and include historical data to verify whether or not the calculated future demand is what they think it is.   ### Challenges the industry faces with fraudulent transactions The cloud has fostered incredible levels of innovation and new services that can be rapidly, reliably, securely and cost-effectively deployed.   One such requirement is a cloud-based portal which records and authenticates many business performance processes including: contractual integrity, title and ownership, mission critical processes, customer relationship management and regulatory compliance.  Today’s dynamic environment, that is so increasingly dependent on authenticity and traceability of data held in disparate locations, is driving the need for these solutions.A chain of custody platform recognises and resolves a large performance gap in legacy trust systems at the point where secure documents and artefacts meet data archives. Banks, law firms, healthcare providers, manufacturers, energy producers, transport companies and government agencies are now able to verify in real time the authenticity of secure documents by directly linking them to their respective secure transactions in real time.  The creation of a chain of custody platform has resolved the requirement to link the physical world to the logical world in real time. Large enterprises throughout the EU are confronted with document management costs that generate over €150B per year of direct overhead for companies with over 1000 employees (according to a recent Ricoh study).Unfortunately, it is also estimated that over 50% of this paper mountain does nothing to enhance corporate performance.  A chain of custody platform and service provides the backbone of immutable trust to the four fundamental elements of transactions: Artefact, Activity, Asset and Archive.  New services provide answers to important business process and performance questions including: How can we authenticate the provenance and chain of custody of the: ·      Artefact used by all parties to manage a transactional process? ·      Identity of the holder of the Artefact? ·      Activity that is the reason for undertaking a transactional process? ·      Asset upon which a transaction is based? 
·      Archive upon which a transaction is based? How can we synchronise and validate all these elements (artefact, holder, activity, asset, archive) in real time and before progressing to finality? A chain of custody portal aggregates the answers to all these key business performance questions on a single platform with distributed ledger technology (DLT) immutability driven by a Plug Infrastructure. Traditional legacy trust systems cannot answer these questions with sufficient guarantee of immutability, and certainly not in real time.  Therefore, legacy trust and legal “finality” are rife with risks: financial, reputational and legal. The ‘4Trust Chain of Custody’ platform, which was launched in the UK by The TALL Group of Companies in July, provides two essential, industry-unique capabilities: ·      Creates a singular, cloud-based domain integrating physical and virtual worlds in a seamless chain of custody; ·      Provides real-time links between physical and electronic multi-schema artefacts (e.g., documents, contracts, emails, certifications and titles, purchase agreements, liens, as well as, product parts, controlled items etc.) and their associated processes to a DLT archive, which confirms veracity and integrity of custody.   Today, most issued artefacts and associated processes that provide the foundation of business and public-sector performance are not truly secure or immutable, and perhaps more importantly, neither the artefact(s) nor its holder(s) can be validated as authentic. Such pervasive deferrals of trust expose measurable distributed risk to all concerned parties to include unforeseen and unacceptable institutional and personal liabilities. The current approach to managing artefacts is fraught with all facets of waste, with an incurred cost of billions of dollars attributed to lost performance, tarnished branding, and decrement in company value. As a result, deferrals become fertile ground for liabilities borne from corporate malfeasance, organised crime, identity theft and even terrorists, to name just a few.  So, how does 4Trust’s Chain of Custody platform deliver results? The platform delivers trustworthy artefacts linked to a blockchain record by an embedded NFC integrated circuit; the provenance of which can be monitored and confirmed in real-time through subscribed access to smart phone or desktop application. These artefacts are produced using a robust, waterproof proprietary substrate with the same weights, thicknesses, feel and appearance as paper but with much longer estimated shelf lives. The material is engraved using industrial lasers and specialist software which produces 3D variable or fixed data images with an appearance of being manufactured into the material as watermarks. The artefacts exist as a singularly original and immutable item within a proprietary blockchain infrastructure supported by workflow modules hosted by the end-user within a 4Trust B2B license agreement. The platform’s DLT supports their proprietary chain of custody application. This architecture and serving relationships ensure customer data remains in the licensee’s control and their existing print and document management assets can be utilised in their chain of custody platform deployment.    Three criteria are satisfied: ·      Authenticity of the artefact ·      Identity of the holder ·      Trustworthiness of the transaction. 
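As a purely illustrative sketch of how those three checks might be combined in software - the record fields, identifiers and functions below are hypothetical and are not the 4Trust API - a verification step could look something like this:

```python
import hashlib
from dataclasses import dataclass

# Hypothetical, simplified records: an artefact's NFC identity, a fingerprint
# of its content at issuance, the holder it was issued to, and the transaction
# it belongs to, all held in an immutable ledger entry.

@dataclass(frozen=True)
class LedgerRecord:
    artefact_id: str      # e.g. serial number read from the embedded NFC chip
    artefact_digest: str  # fingerprint of the document content at issuance
    holder_id: str        # identity the artefact was issued to
    transaction_id: str   # the business transaction the artefact supports

def digest(content: bytes) -> str:
    return hashlib.sha256(content).hexdigest()

def verify_custody(record: LedgerRecord, scanned_id: str,
                   presented_content: bytes, presenter_id: str) -> dict:
    """Check artefact authenticity, holder identity and transaction linkage."""
    return {
        "artefact_authentic": scanned_id == record.artefact_id
                              and digest(presented_content) == record.artefact_digest,
        "holder_verified": presenter_id == record.holder_id,
        "transaction": record.transaction_id,
    }

record = LedgerRecord("NFC-0001", digest(b"loan agreement v1"), "holder-42", "txn-9001")
print(verify_custody(record, "NFC-0001", b"loan agreement v1", "holder-42"))
# {'artefact_authentic': True, 'holder_verified': True, 'transaction': 'txn-9001'}
```

The point of the illustration is simply that the artefact’s physical identity, its content fingerprint and the holder’s identity are all checked against the same immutable record before a transaction proceeds.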
This represents a re-invention of “trust” and “custody” that is fully transparent under litigation, regulatory oversight and forensic audit, providing direct and real-time links between immutable physical documents and logical transactions in a single, transparent business performance domain. TALL Group is proud to be a founding 4Trust DocuChain licensee, with three certified secure document plants in the UK (Runcorn (Cheshire), Hinckley (Leicestershire) and Lisburn (Northern Ireland)). http://www.westonpartnership.co.uk/ ### Cloud Computing | Help in Shaping Our Lives Cloud computing is the new dawn in IT. It is not a new idea, but its capabilities are only now being realized, and it promises to tackle two of the most important factors in modern life: time and complexity. Cloud Computing With big data comes the big challenge of managing the huge amount of data generated daily by different websites. On average, 2.3 trillion gigabytes of data is generated every day. Where should this data be stored? How should it be processed? How it should be managed is another big challenge. Cloud computing addresses these difficulties by providing a platform on which the data generated can be stored in virtual form. Data is transformed into small data footprints, which helps ensure accurate analysis, minimal processing time for applications, better data security, easier data management and access to data from anywhere. This will eliminate much of the paperwork involved, companies will not need to invest in on-site servers, and employees will remain productive even when they are not in the office, with security ensured while data is accessed and changed. Artificial Intelligence Amid this wave of transformation, the cloud will support emerging technologies such as Artificial Intelligence and help bring them to smartphones. AI used on smartphones can take a long time to analyze data, as the data is mostly unstructured and the phone has to send it to servers in the cloud, all of which affects the performance of AI. To improve this, AI learns in the cloud, where processing power is abundant, while phones use inference, in which the AI applies what it has learnt to real-life problems. Because inference processes data in real time, all the time, smartphones have to rely on the processing power of the cloud to meet the computing demand imposed by AI. This will allow a device to respond to voice commands almost instantly, organize pictures according to their content and capture photographs under different shooting conditions. At work, the language barrier will no longer be an issue: if you need to speak to a person who knows only French, voice recognition can convert words into each other’s languages, something made possible by the cloud’s computational power. Digital Infrastructure The cloud will provide the digital infrastructure for future cities. With the cloud’s ability to store and analyze data, there will be better management of parking systems, farms, trains and power plants. Even medical care will improve: patients will receive the doctor’s explanation and required medication details on their phones, which increases the likelihood that they will follow the doctor’s instructions, and doctors will receive patient information and test results directly instead of visiting the lab, improving decision-making and the care of the patient. Cloud computing has also changed gaming drastically.
The gaming experience has improved for children and adults alike, as immersive 3D games draw players in. Ease at work For day-to-day work, the cloud makes it easy to connect with people. A community cloud allows business partners to coordinate their activity on a secure platform (protecting secrets even from each other). Similarly, building an e-signature app eliminates the need to physically sign a document and submit it to another party by fax or mail: it can be signed online and emailed to the people who need it, viewed only by an authorised person, saving both time and money. Cloud computing also helps when working with multiple vendors. Information can be stored in the cloud, and access to different parts of it can be given to the relevant person, who can download the data, work on it and upload it again; all of these operations can even be carried out from a mobile phone. Boost in Entrepreneurship In today's world of innovation, it is important to know how interested people are in a product and how much they are willing to pay for it. With the cloud, data can be shared, tested and modified at greater speed. This will change the nature of innovation and hence accelerate entrepreneurship. Small and medium-sized businesses will be able to grow from a small geographic footprint to a global one with lower overhead costs: there is no need to build a physical data centre in a new location when infrastructure can be consumed as a service from infrastructure providers. The time saved gives them an advantage over slower competitors. Cloud computing will help us perform better, faster and more easily, in a much more cost-effective way. ### How to make IOT real with Tech Data and IBM Part 2 Video: https://vimeo.com/295191538/a51b9f92bc Part two of our video series with IBM and Tech Data on the topic of the Internet of Things. You can catch up on part one here. Darryl Kelly is joined by Sujit Bhatt from IBM to talk about the challenges that IBM and their customers are facing in relation to IoT. As well as highlighting best practices, they also discuss the solutions they provide to overcome those challenges. ### How Micro Segmentation Delivers Mega Benefits For The Hybrid Cloud IT security is all about keeping data isolated from everyone except those who need access. When you think about it that way, it’s easy to understand why segmentation is such a vital component of network security. For example, VLANs (virtual local area networks) are a popular form of segmentation that are often coupled with other forms of segmentation, such as firewalls and ACLs (access control lists), as part of an IT security solution. However, the emergence of hybrid clouds and virtualized data centers has fundamentally shifted how IT gets done, and traditional segmentation approaches just can’t keep up. VLANs are unable to extend to cloud configurations and are limited to supporting only 4096 segments. Standard firewalls and ACLs can effectively only block or allow ingress and egress traffic based on a limited set of criteria. Simply put, traditional forms of network segmentation weren’t designed to address the challenges of hybrid cloud. New approaches are needed, and that’s where micro segmentation comes in. What Is Micro Segmentation? Micro segmentation is an increasingly adopted security method that provides security controls at the workload and process level for cloud and datacenter environments.
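To make that definition concrete before comparing it with older approaches, here is a minimal, illustrative sketch (the labels, processes and rules are hypothetical, not any vendor's product) of a policy expressed in terms of workloads and the processes generating traffic, rather than just IP addresses and ports:

```python
# A toy, label-based allow-list. Each workload carries labels (environment,
# role) and traffic is described down to the process that generated it,
# which is what lets policy reach "Layer 7" precision instead of stopping
# at addresses and ports.

from dataclasses import dataclass

@dataclass(frozen=True)
class Workload:
    name: str
    environment: str   # e.g. "production", "development"
    role: str          # e.g. "web-server", "database"

# Policy: (source role, destination role, process, port) tuples that are allowed.
ALLOWED_FLOWS = {
    ("web-server", "database", "postgres", 5432),
}

def is_allowed(src: Workload, dst: Workload, process: str, port: int) -> bool:
    """Permit a flow only if environments match and the exact flow is whitelisted."""
    if src.environment != dst.environment:
        return False
    return (src.role, dst.role, process, port) in ALLOWED_FLOWS

web = Workload("shop-frontend", "production", "web-server")
db = Workload("orders-db", "production", "database")

print(is_allowed(web, db, "postgres", 5432))  # True - expected application flow
print(is_allowed(web, db, "netcat", 5432))    # False - same port, unexpected process
```

The second check is the one a purely port-based rule cannot make: the traffic arrives on an allowed port, but it was not generated by the expected process.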
In contrast, older network segmentation methods divide a network into subnetworks for hosts that share risk profiles and connectivity needs, while application segmentation provides Layer 4 security for hosts used for a specific business process. Micro segmentation operates in a similar way conceptually, with the added benefit of granularity down to the process level. This enables a full view of a network environment along with the ability to implement security policies that have the precision of Layer 7 (processes). This increased level of precision in segmentation enables a highly flexible, secure, and scalable approach to IT security. To understand why this matters, consider the fact that VLANs and traditional firewalls are effectively “blind” to which processes are sending traffic from a given network node, making it highly difficult or impossible to determine whether the source of traffic is malicious or not. Micro segmentation can solve this problem and mitigate the lateral movement of security breaches. How micro segmentation fits into the modern network The first step in incorporating micro segmentation within an organization is to gather and analyze data about the resources and processes of the environment it will affect, to provide a baseline of operations. This is generally accomplished through the use of network and host sensors searching for Layer 4 and Layer 7 information. To help streamline this process, advanced micro segmentation platforms are equipped with automated processes for datacenter and cloud integration. Once this baseline is created, malicious activity is much easier to identify. During micro segmentation provisioning, labels are applied to nodes in the network. This entails identifying environment, regulatory sensitivity, application type, and role (e.g. production, PC, domain controller, web server) details. These labels will serve as the basis for policy decisions and increase the amount of visibility security teams have over their networks. This level of visibility helps to identify the dependencies of applications, enabling increased malicious activity monitoring, rapid containment responses, and micro segmentation policy creation for the prevention of predicted and previously encountered threats. Micro Segmentation Benefits Intrusion Reduction As mentioned, once you have a baseline for expected traffic and behavior on a network, micro segmentation-based solutions can detect, contain, and mitigate or prevent the spread of threats. In instances where creating a baseline from existing traffic proves impractical, IT teams can still use micro segmentation to quickly identify unrecognized activities and limit the spread of breaches. Additionally, enabling notification of security policy violations provides an organization’s network team with preemptive data they can use to modify existing rules and processes to prevent unauthorized activities, allowing a proactive approach to future malicious behavior. Damage Control Breach containment is another concern for IT professionals, with cloud services creating a number of ingress and egress points on a network and DevOps philosophies gaining a large foothold in the industry.
In the event that a resource within a network environment is compromised by an attack, the intrusion will often attempt lateral movement from its entry point to cause additional damage. With micro segmentation practices in place, activities and processes are scrutinized against predefined security policies, enabling real-time responses to any malicious actions observed, which also mitigates the severity of an attack. With this granular, process-level form of security, the attack surface (attacks often move laterally within an infrastructure via exploited nodes) can also be reduced, minimizing the extent of the intrusion. Regulation & Compliance Micro segmentation is also an ideal instrument for managing industry (e.g. HIPAA, PCI) and jurisdictional (e.g. GDPR, data residency) regulations, which is beneficial as companies and organizations migrate to cloud platforms and surrender physical control of where information is stored. This is accomplished via configured policies that isolate systems bound by regulations from the remainder of the infrastructure and govern system communications within regulatory conditions. The takeaway: micro segmentation enables a modern approach to IT security As we have seen, micro segmentation is a significant improvement over traditional approaches to network segmentation. As network infrastructures and hybrid clouds continue to evolve, so will the requirements to secure them. Selecting a platform-independent micro segmentation methodology can help you modernize your approach to IT security while enabling ease of integration and future-proofing your network. ### AI Adoption | How to make your organisation smarter For all the hype and talk of AI over the past year or so, actual AI adoption worldwide has been slow. According to the 2018 Gartner CIO survey, only four per cent of companies have invested in AI, with a mere 20 per cent actively experimenting with AI solutions. As only 46 per cent of companies have developed plans to implement AI solutions, it doesn’t look like widespread uptake is on the way any time soon. And the state of play in the UK is even worse - research conducted by the Royal Society for the encouragement of Arts, Manufactures and Commerce (RSA) in 2017 shows we’re lagging behind, pointing to an AI adoption gap between the UK and other developed economies. Of course, AI, when implemented well, can bring untold benefits to any organisation. Automation means machines can take care of simple everyday tasks that don’t necessarily need a human brain. And when humans are freed up to focus on more strategic tasks, the business becomes infinitely more competitive, with customers and prospects seeing the rewards of working with a more productive, innovative supplier. So why are businesses - especially those in the UK - holding back on enterprise-scale AI adoption, and how should this issue be addressed? Well, the problem is threefold. First, there’s a skills-shortage issue, particularly in relation to finding data scientists. Why undergo a complex AI project now, when you don’t have the right staff to successfully bring it to fruition? Thankfully, with the UK government pledging to unlock the power of data as part of its Digital Strategy, the data science skills shortage should be a relatively short-term issue. The second issue relates to the accessibility and quality of the data.
Yes, a lot of organisations hold a lot of data, but they still aren’t fully realising its potential, keeping it siloed between different business units, rather than fully integrated and centralised, so it can be analysed to give the insights necessary to empower tangible, hype-free AI. The third issue, the most significant, actually comes down to planning and implementation. Many AI projects fail because organisations are still in pilot or proof of concept mode and aren’t considering how they’ll scale their trials across the enterprise. To be successful, an AI implementation needs to be considered as a business change program underpinned by technology. Unfortunately, in many organisations, it’s still viewed as an IT initiative or a tactical, department-level experiment without full consideration to scalability across the Enterprise. With all this in mind, I wanted to share four tips for making sure your AI implementation will make your organisation smarter and savvier.     Consider project scalability from the outset   If you’re looking to implement an AI solution, it is imperative you look ahead – consider how, beyond beta, it will be rolled out to benefit your organisation as a whole.     We’d recommend taking a sprint-based approach to ensure the business and IT teams work in tandem around any implementation. That way, leaders can see the impact of the initiative in weeks versus months and years. Avanade helped a leading European insurer automate over 30 processes using Agile Delivery, but it was only a success because we had everyone on board from the outset.     Understand and implement AI solutions that will work in the short term   For many businesses, Robotic Process Automation (RPA) is an ideal starting point for AI adoption. Through RPA, machines are taught to process repetitive, high-volume, manual tasks that use structured data. AI projects based on RPA should significantly reduce time spent on repetitive tasks and free up employees for more complex and rewarding work. They can also deliver rapid ROI.   Know what you want to achieve   Establish parameters for what the implementation will and won’t achieve. A good way to approach AI project management is to start with a basic prototype – with limited functionality – and then scale it up as support and capability are built within the organisation. This approach allows the project to show results fast and build from each iteration. To keep the project on track, put measurable KPI in place. Ideally, businesses should appoint an ‘AI evangelist’ to drive the project.   Bringing employees with you on the AI journey   Our global research shows 79% of business leaders believe internal resistance to change is limiting the implementation of AI technologies in the workplace. So you need to get employees on board as part of the AI planning and implementation process! We recommend providing information and insights into how roles may change and evolve as AI comes into force, explaining to employees that they’re being freed from respective tasks, to focus on higher-level, more rewarding work. This engagement should also include a roadmap for employee ‘upskilling’, where possible. Overall, AI adoption will not only mean making sure everyone knows how to make a success of the project, but also knowing what’s expected of them to bring it to fruition. It will take time to get everyone on board, but the benefits of allowing automation and AI to flow through the company will help staff to focus on more strategic tasks. 
And that’s something all organisations will reap the benefits from. ### How to make IoT real with Tech Data and IBM Part 1 Video: https://vimeo.com/295191399/d8fe9d2e23 Darryl Kelly is joined by Sujit Bhatt from IBM to talk about what IoT means to IBM and their customers, and how the Internet of Things (IoT) is a really exciting area to work in. They discuss the benefits, challenges and best practices customers find with IoT, as well as the current trend over the next 12 to 18 months in which a large number of companies have IoT as a top project area. Part two of the discussion can be found here. ### CX Excellence | Identifying the best partner for you Recent technology development focused on CX (customer experience) has raised consumer expectations to unprecedented levels. Organisations that are able to offer competitive prices, a seamless and personalised purchase journey and high-quality customer service are leading their industries. The customer journey is becoming increasingly valued; customer service significantly influences the purchase decision. Whether inquiring about new functionality or asking vital questions about a product’s specs, customers expect to be assisted by reliable experts who can solve their issues immediately. Under these circumstances, managing customer relationships is a critical priority for all companies that want to thrive and stay ahead of the competition. Furthermore, organisations are starting to grasp the vital importance of truly listening to the customer’s voice and are using the insight they gain to adjust their strategies, not only to meet expectations but to exceed them by providing flawless pre-sales and after-sales journeys. Sourcing the right CX partner A key component of delivering excellent customer experiences is a high-performance contact centre: one where companies communicate with clients and ensure their needs are addressed quickly and easily. Organisations that have adopted the right CX methodologies and technologies are winning, and can: manage customer journeys and optimise the experience to improve satisfaction; build efficiencies to lower costs; reduce employee churn and engage staff; and improve customer acquisition to fuel growth. Workforce Optimisation is integral to the success of any CX solution. Open platforms that integrate omnichannel functionality and connect the contact centre with multiple CRMs and communications systems are paramount. When implemented correctly and with proper training, they empower agents with easy access to all the data required to resolve customer issues quickly. As a result, contact centres have become a sophisticated ecosystem of processes, technologies, data, knowledge and people. However, sourcing the right partner for a high-performance contact centre and negotiating best-in-class pricing and beneficial terms and conditions is a complicated, time-consuming and resource-intensive process. CIOs and contact centre managers face a wealth of sourcing challenges and an overwhelming number of vendors and distributors. Moreover, while a competitive market allows businesses to compare products and negotiate prices, it can take significant time to evaluate vendors, understand what technology to invest in and find the right fit for the company’s needs and development plans.
The process becomes exponentially more complicated when companies migrate from on premise-based solutions to the cloud. CIOs now have to consider a higher number of functional capabilities and have to decide what will be needed immediately, in the mid-term and the longer term. And while migrating to the cloud is a significant challenge for many companies, adopting solutions leveraging Machine Learning, RPA (Robotic Process Automation), the Internet of Things and Blockchain, it’s not a ‘nice to have’ anymore but a ‘must have’, if companies want to deliver personalised customer experiences that increase brand awareness, loyalty, and revenue. Because there are so many variables involved in configuring a contact centre, in many cases, companies get lost on vendors’ features and functions, losing sight of the primary purpose of the contact centre - improving customer satisfaction and retention rates by managing all communications from customer inquiries to distributing marketing collateral via email or social media and other channels.  The ability to handle the customer on their terms, whether it is by phone, email, live chat, social media or text has become crucial. In addition to being able to handle multi-channel communications, contact centre analytics and Workforce Management Software are also vital components in operating a world-class contact centre. At the same time, every business has unique KPIs, traits, strengths and weaknesses and needs a personalised contact centre configuration to deliver the desired business outcomes. Agnostic expertise Under these circumstances, partnering with sourcing experts who understand the company, its products and services but also the importance of customer satisfaction becomes a critical step that can have a significant impact on a company’s future development. Thankfully, sourcing services available to the contact centre market have kept pace with technological advancement, and nowadays companies can use agnostic sourcing companies that can help CIOs and procurement navigate the complexity of choosing the contact centre technology that is the right fit for the specific needs of an organisation. A significant benefit of using an agnostic tech sourcing agency is that they do all the heavy lifting with researching available technologies and top-tier suppliers, and the organisation can ditch cumbersome and expensive tender programmes, reduce costs when negotiating prices and save time with vetting suppliers. Furthermore, in many cases vendors and suppliers are driven by sales quotas they have to fulfil while a sourcing agency acts as an advisor, only focusing on the best technology for the project. Additionally, a CIO doesn’t have to deal with numerous contracts, SLAs and providers - they can manage everything within a single agreement with one point of contact whose team manage the delivery of the agreed outcomes. The bottom line is that many new-generation technologies can and will provide practical solutions to common day-to-day challenges faced by contact centre managers, but matching the right technology to the company’s priorities and goals might make the difference between a highly successful organisation or a struggling one. ### Converged infrastructure | Why you must invest As we move into a modern-day storage environment, it is critical businesses’ infrastructure follows suit. 
Combining computing, networking and storage hardware into a single integrated architecture, converged infrastructures are today considered a means to accelerate implementation and availability of data-centric structures within businesses. They also reduce IT costs and decrease implementation risk. And, the rapid growth in the adoption of converged infrastructure (CI) is undeniable. Surveying IT professionals, ESG found almost 90% of respondents already use or plan to adopt systems of convergent infrastructure technology solutions in their environments. Its benefits, both technical and financial, are driving the adoption boom for businesses. In fact, Gartner expects it to be the fastest growing segment of the global integrated systems market, reaching almost $5 billion in 2019. Converged Infrastructure Organisations need to accelerate implementation and increase the availability of infrastructure, which are both possible with converged infrastructures. While it has significantly simplified the implementation of computation, unfortunately, most converged infrastructure solutions rely on bulky, complex and slow disk storage systems. These systems simply cannot be maintained in a modern data centre environment. While a great architecture and a comprehensive validation process are a good place to start, more is needed. The fact is that the converged infrastructure must also provide a simpler and lower cost of the ownership experience. This is to allow IT departments to change their focus and maximise the time they have. Instead of concentrating on buying and integrating products from individual suppliers, they can focus more on the workloads that bring commercial benefits and a competitive advantage to the entire business. Cloud With many organisations’ attention having turned to the cloud and the delivery of services, the result is the creation of the new generation of convergent architecture. These are smarter, simpler, smaller and much more efficient solutions that rely on newer technologies and have direct integration with virtualisation and cloud solutions. This architecture also benefits a range of organisations. Everyone from global services that provide historical and real-time information to businesses, to health institutions that provide excellent patient care through faster access to their records and images. The four main benefits of this new generation of converged infrastructure solutions are:   Maximise business results   Studies have shown that reduced latency in applications has a correlation with user satisfaction and commitment. So if you want happy customers to buy more goods and services, increase employee productivity or gain greater insight into your data, then you need to reduce latency. A high-performance infrastructure that combines a powerful primary block for cloud-based, virtualised, and general-purpose datacentres, along with 100% flash equipment, reduces latency and delivers powerful metrics around workload performance.   Scale the cloud   How many of your CIO's initiatives for this year are destined to run on a virtual infrastructure? It is a safe bet to say it will be more than half. Having a number of "too-demanding" applications to be virtualised without the proper architecture can keep the IT department focusing too much time and resources on the wrong things. However, with adequate tools, including powerful servers with service profiles, together with stateless storage, you can configure how, where and when workload instances are implemented.  
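As a rough illustration of the service-profile idea (the field names below are hypothetical, not any vendor's schema), a server's identity can be expressed as plain data and bound to whichever physical node happens to be free:

```python
# Illustrative only: a "service profile" as plain data. Identity settings
# (MAC, WWN, UUID, boot order, BIOS options) live in software and can be
# applied to any available physical server, making the hardware stateless.

service_profile = {
    "name": "vdi-node-profile",
    "uuid": "f3b1c2d4-0000-4000-8000-000000000001",
    "mac_address": "00:25:B5:00:00:01",
    "wwn": "20:00:00:25:B5:00:00:01",
    "boot_order": ["san", "local-disk"],
    "bios": {"hyperthreading": True, "turbo_boost": True},
}

def apply_profile(profile: dict, physical_node: str) -> dict:
    """Bind the software-defined identity to an available physical server."""
    return {"node": physical_node, **profile}

# The same profile can be re-applied to a replacement node in minutes,
# which is where the agility described above comes from.
print(apply_profile(service_profile, "chassis-1/blade-3")["mac_address"])
```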
Stateless technologies allow administrators to configure MAC, World Wide Name (WWN), unique universal identification (UUID), boot details, firmware and even basic input/output system (BIOS) settings in the software, through simple administration interfaces. It also allows the creation of the most agile convergent infrastructure in the industry. Above all, it is important to see how agility translates directly into scalability.   Simplify IT operations   Storage has been the curse of virtual infrastructures. In reality, there has been very little innovation in storage in the last 20 years. Most have been very tactical with the implementation of silos focused on solutions. However, how many solutions do you have to adapt to address VDI, OLTP databases, Big Data analysis or improve the core of the VMware cloud? The use of a converged infrastructure architecture reduces the total number of managed devices, simplifies administration and reduces operating expenses (OpEx). In addition, a validated and verified solution architecture results in many use cases, which facilitates the implementation and reduces the risks when transforming datacentres.   Reinvent the IT budget   The high rates of data reduction and the small size of 100% flash equipment combined with efficient servers translate into substantial savings in space and energy cost, enormous simplification, easier administration and improved data centre economy. In addition, solutions such as FlashStack from Pure Storage and Cisco are easily updated, so customers can add exactly the storage and/or computing capacity they need, as requirements change. Customers can buy components that meet current needs, without having to buy equipment years in advance to support hypothetical future growth. While datacentres have grown significantly in terms of space and complexity in recent times, this hasn’t necessarily translated into growth in scalability or performance. As businesses focus on ensuring systems are operating as they should, the time spent on innovation has fallen down the priority list. However, to remain leaders in their markets it is critical for businesses to deploy a solid IT infrastructure as in today’s digital age these infrastructures act as the critical strategic partner needed for helping business transformation. Therefore, it is imperative to adopt solutions that combine 100% flash storage with online data reduction and powerfully scalable network and server technologies so businesses don’t fall behind. Doing this enables 100% flash workload delivery to be affordable, powerful and most importantly successful. This is what will set businesses aside from the competition. As the number of businesses investing and migrating to a converged infrastructure increases, the benefits are demonstrated further and it won’t be long until it is the norm for businesses. So what’s stopping you? ### SEO | Tips to Increase Organic Traffic in 2018 SEO is an ongoing effort. There is no one-size-fits-all, and there is no one-and-done. Optimizing your website for search engines takes time, effort, and ongoing analysis. If you want organic traffic, you need to be willing to go above and beyond when it comes to your SEO. That being said, there are ways to work smarter, not harder. Organic traffic is hard to come by. Users know what they want, and they trust the top search results to deliver exactly what they’re looking for. If you’re not in those top results, it’s sometimes impossible to land organic traffic from your target audience. 
This guide will delve deep into the world of SEO to talk about things you can do now to actually increase your organic traffic. These are things that are currently working in 2018, and they’re much stronger than outdated techniques in the past. Understand RankBrain Google’s RankBrain is the newest algorithm that integrates machine learning. It takes user expectations to another level, and that’s why it’s the most important thing on this list. Yes, keywords, meta tags, and backlines are important and will likely always be important. However, they’re not the most important according to RankBrain. With RankBrain, it’s all about making Google’s users happy. It’s about providing high-quality content. Think of RankBrain as a set of steps each user does. First, the user searches for a specific keyword in Google. Next, RankBrain shows a set of results for these keywords. The user will then click on a result, and their next action will help RankBrain assess the quality of that website. Namely, did the page satisfy the user? If they quickly click away to something else, that page probably doesn’t have quality content. On the other hand, if they are satisfied and stay on the website without needing a different result, that page is upranked. This all means you need to prioritize your quality content to make sure users are satisfied the first time they visit your website. SEO steps to take now: Create content that adequately addresses your topic or headline Back your content up with high-quality sources and statistics Include a compelling meta title and description to lure organic clicks Perform a Content Audit We all know that content is king, but what happens to bad content? Not all page content and blog posts are created equal, and it’s probably been a while since you took a hard look at your older pages. Low-quality pages will weigh down the rest of your website. Take a cold, hard look at all your pages and posts. Anything that has zero links, traffic, or conversions is probably ready to be deleted. If you’re not willing to part ways with that content just yet, update it at the very least. These are pages that aren’t adding value to your website. They aren’t a part of your industries conversation, and they’re holding you back. How do you decide which pages aren’t gaining any traction? Check out the URL Profiler tool which scales the process for you by using your sitemap to review any inbound or outbound links. You can also review your Google Analytics to see your top performing (and lowest performing pages). What is it about your top performers that makes them stand out? How can you learn from them to boost your lower performing pages? SEO steps to take now: Input your sitemap into a link crawler to determine which pages contribute no backlinks Decide if you’re able to improve these pages Remove any pages you’ve deemed unnecessary and invaluable Rank for More Keywords Many new website owners make the mistake of trying to rank for a single keyword phrase. While this is a smart move, you shouldn’t limit yourself. One of the best ways to gain organic traffic is through more keywords. Start by creating a secondary keyword. This will expand the footprint of your content and help you show up in more SERPs. How do you get started if you don’t know what keywords to use? First, look at articles from your competition. By using a keyword explorer, you can see what terms they’re ranking for and decide if any of these make sense for your own content. 
You can also continue your own research into specific long-tail keywords. Consider synonyms for your current keyword selection, since these will likely be popular searches as well. SEO steps to take now: Create a list of synonyms for your current primary keyword Review competitor keywords with an exploring tool Decide on a secondary keyword for your website and begin implementing it Final Thoughts As you can see from the tips above, there are a lot of aspects of SEO you have to consider in 2018. Focusing on the user experience, quality content, and creating a larger reach with your website should be a priority. An SEO specialist can help by providing expert consultative advice specific to your industry, but many of these steps you can do on your own. The most vital step of SEO is to never stop working towards more organic traffic. This is a marathon, not a sprint, and it’s not something you can set up once and leave on its own. As Google continues to understand more about user’s needs, your own optimization methods will need to grow. Start with these tips above to see your organic reach improve. ### Providers | Whoever controls the multi-cloud controls the future Safely navigating through a worsening threat landscape, controlling burgeoning IT complexity, and protecting gargantuan amounts of data are key to maintaining customer confidence. Going it alone, without attempting to tap into the power of the multi-cloud providers seems like an increasingly foolhardy move. According to Foresight Factory’s recent F5 sponsored Future of Multi-cloud (FOMC), disruptive technologies, new strategic imperatives, and evolving governance practices are radically reshaping business and consumer paradigms. Usage demands are already massive for both consumers and enterprises. Netflix users alone consumed more than one billion hours of video content per week in 2017. Meanwhile, almost five billion videos are watched on YouTube every day. In an average month, eight out of ten 18-49-year olds will watch YouTube-hosted content. As the FOMC report unanimously concludes, businesses need a strong multi-cloud strategy now. A faster moving future According to FOMC expert David Linthicum, Chief Cloud Strategy Officer at Deloitte Consulting, the cloud will continue to develop “at a constant rate of innovation.” Rapid technological innovation and hyperscale providers’ deep pockets mean that change could well be multi-cloud’s only constant. The pace of change is accelerating both in terms of software and hardware. RightScale’s 2018 State of the Cloud report recently identified machine learning as the most popular public cloud service in terms of future interest. 23% of respondents plan to use it, and another 23% are experimenting with the technology today. Other developments include new serverless architectures enabling enterprises to cut down on time-to-market and simplify processes. It could also enable provider agnosticism and make it easier to benefit from the multi-cloud. At the same time, the development of new software and hardware features has created an unprecedented innovation arms race in the cloud. “Amazon is adding ten servers into the cloud pretty much every week, and that is going to be ongoing for the next five to ten years. The same applies to Azure,” says David Linthicum. The necessity for hardware to keep up with the demands of both consumers and enterprises is also apparent in fields such as the Internet of Things (IoT) and edge computing. 
By 2019, IDC predicts that 45% of IoT-created data will be stored, processed, analysed, and acted upon close to, or at the edge of, the network. While the major cloud providers tend to dominate the innovation discussion, there is also plenty of action happening on the (increasingly blurred) margins. “You are going to see more regional clouds provided by telcos, among others, to deliver specialist services to specific areas. Sometimes a local cloud is much more important than having something that is generic,” explains Roy Illsley, FOMC expert and Principal Consultant at Ovum. According to the 2017 Cloudify/IOD State of Enterprise Multi-Cloud report, Software Defined Networking and Network Function Virtualisation (SDN/NFV) are the most critical emerging technologies for the telecommunications, defence and space industries. Containers are much more important for the software, networking and IT services industries. Cloud Service Providers (CSPs) that can integrate new technologies for specific industry verticals will become increasingly valuable. Examples of specialist companies able to constructively mesh into the multi-cloud mix include Navantis, a Canadian vendor that uses Microsoft tools to help companies with application modernisation and integration. It also specifically specialises in Canadian regulation. “You should have at least two of the hyperscale providers and maybe one speciality provider,” advises Eric Marks, FOMC expert and VP of Cloud Consulting at CloudSpectator. “This way you can have competition at the hyperscale level combined with the specialist services of the smaller provider. The smaller provider’s prices could also influence the others.” Having multiple cloud service providers means enterprises can quickly migrate workloads based on their needs at any given time. It also improves enterprise flexibility by avoiding reliance on a single vendor. 47% of industry influencers surveyed by Logic Monitor see vendor lock-in as one of the biggest challenges for organisations dealing with the public cloud today. Innovation into the future Inevitably, workloads will change in the future, influenced by factors such as the need to process data generated from IoT and other nascent technologies. The abstraction of the various layers and constant adaptation to new services do, however, impact on flexibility and cost. While enterprises want to be flexible, it can be difficult when tools that manage different cloud services and containers are hard to find. In addition, maintaining multiple cloud service providers can also be costly, depending on the size of a workload. Dashboards that can be used to monitor multiple cloud services while also providing granular information will be the most common addition to IT professionals’ tool-kits over the next five years. Simple management dashboards are already available, but the incorporation of new technology will be vital. Looking ahead, an abstraction that can reach throughout the whole stack, integrating cloud services, containers and serverless functions will become standard. No single cloud option best serves all infrastructure demands. The era of cloud migration is swiftly accelerating, and the future of the multi-cloud world is set to open a wider spectrum of profitable opportunities for businesses, including improved agility, greater scalability, better aligned operational costs, as well as a clearer focus on business retention and expansion. 
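The monitoring dashboards mentioned above are, at their core, an aggregation exercise. The sketch below is illustrative only - the provider figures and fetch functions are placeholders rather than real billing APIs - but it shows the shape of normalising usage from several providers into a single view:

```python
# Minimal "single pane of glass" sketch: pull per-provider usage through a
# common shape, then summarise it. Real implementations would call each
# provider's billing and monitoring APIs instead of these stand-ins.

def fetch_aws_usage():
    return {"provider": "aws", "vcpu_hours": 1200, "cost_usd": 310.0}

def fetch_azure_usage():
    return {"provider": "azure", "vcpu_hours": 800, "cost_usd": 240.0}

SOURCES = [fetch_aws_usage, fetch_azure_usage]

def dashboard(sources):
    """Aggregate per-provider figures so workloads can be compared and shifted."""
    rows = [fetch() for fetch in sources]
    return {
        "per_provider": rows,
        "total_cost_usd": sum(r["cost_usd"] for r in rows),
        "total_vcpu_hours": sum(r["vcpu_hours"] for r in rows),
    }

print(dashboard(SOURCES)["total_cost_usd"])  # 550.0
```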
With advanced security combined with cloud automation solutions, organisations can dramatically improve their ability to orchestrate cloud usage efficiently and manage their operations more effectively. Historically, cost has been the primary differentiator when choosing a cloud vendor. That is changing. Today, it is more about what the cloud can enable rather than upfront expense concerns. Now is the time to take control of your future.

### Compliance | There's more to it than consent

GDPR should serve as a wake-up call for companies to review ALL of their security and compliance measures, not just customer communications.

There can be few people in Europe who haven't heard of GDPR and the requirement to meet tough new standards when it comes to managing compliance and personal data, or risk penalties of up to four per cent of turnover - not least because of the deluge of emails and texts sent asking customers to give explicit consent to the collection, storage and use of their data, as required by the EU legislation. However, while the spirit of the regulation may be to empower individuals in terms of how their personal data is collected, stored and used, it's down to the businesses they interact with to protect that information and keep it safe.

As such, the recent implementation of GDPR should be seen not so much as a chore, but as an opportunity to review the compliance procedures, safeguards, technologies, and processes employed to keep personal data safe. It should also be taken as a wake-up call to explore new ways of ensuring compliance going forward, as businesses across the board strive for digital transformation and turn to the cloud as a platform for business-critical applications. It's never going to be a simple task, but companies reviewing their compliance with GDPR and other data security laws could do worse than consider these three short questions:

What security tools do enterprises really need?

Over time, most organisations will have invested in a number of security measures, such as firewalls, which are still very necessary. These, however, can no longer be the only line of defence, not least because they work mainly at the transport layer to protect the perimeter rather than the applications or the data itself. Firewalls simply don't have the visibility required to prevent modern, sophisticated web application attacks, which matters because, for most organisations, web applications are how they do business. This is a fact that hasn't gone unnoticed, with web application attacks more than tripling since 2014 to become the leading cause of data breaches. The tool of choice to counter this threat is the specialist Web Application Firewall (WAF), applied to the application itself rather than the underlying infrastructure, to protect against the growing number of attacks seeking to steal data. If not already deployed, a WAF should be added to the security arsenal as soon as possible.

Are our Web Application Firewalls up to the job?

Although still essential, Web Application Firewalls were first introduced long before the advent of the cloud and the many technologies, such as microservices and containers, that go with it. IT managers would, therefore, do well to take this opportunity to ensure that any WAFs already in place remain fit for purpose. One issue is that WAFs are typically built into Application Delivery Controller (ADC) appliances used to balance traffic and smooth out demand across multiple application instances.
Using an ADC appliance for both load balancing and WAF cannibalises performance, leading to enterprises having to significantly over-provision to meet load balancing and security demand during peak traffic. This, in turn, makes traditional WAFs expensive and difficult to configure. They can also be very specific in terms of the infrastructure they will work with, especially when it comes to the public cloud, calling for multiple implementations to provide full coverage. Because it has more than one job to do, the traditional WAF may also have to trade security off against load balancing performance and vice versa. Companies should, therefore, look at software-based alternatives that can be deployed across the component parts of a hybrid infrastructure to provide more complete, scalable, and easily managed protection.

Do we really know what's going on?

A common complaint with security tools in general is the need for specialist interfaces and expert knowledge to configure and manage them. This is especially true of the traditional WAF, which, like other mainly appliance-based solutions, requires custom configuration and setup work unique to each application as well as each on-prem or cloud environment. Additionally, most security solutions do little to provide visibility into application traffic or the applications themselves. As a result, while IT teams may be able to manage security policies well enough, it can be difficult to view logs and analytics to see how effective those policies are and how they might be optimised. This lack of visibility limits how quickly an enterprise can respond to an attack and cripples its ability to apply automation to coordinate rapid countermeasures across all environments and all points of vulnerability at the same time.

Again, GDPR provides a real opportunity for companies to re-evaluate their security measures through the lens of the application and customer data. Enterprises need to be able to trust in their existing measures to ensure security and compliance, but how can they be sure about what they can't see? The online world is becoming ever more diverse and complex, leading to the need for legislative measures, like GDPR, to make sure that data security isn't weakened as a result. Stricter rules are inevitable, but they shouldn't be just a box-ticking exercise. The smart enterprise is one that sees their introduction as an opportunity to build a comprehensive and responsive armoury of security measures - measures able to deliver compliance, regardless of infrastructure or how applications are deployed, both now and into the future.

### Cybersecurity | An opportunity for financial services organisations

Following the implementation of stricter European data protection regulations earlier this year, cybersecurity is at the top of the agenda for most businesses providing services, particularly in the context of protecting customer data. Under the General Data Protection Regulation (GDPR), organisations that suffer a data breach could now face fines of up to €20 million or four per cent of their annual global turnover, whichever is higher. There is also a heightened threat to all businesses from the potential reputational damage data breaches may cause, as we have seen with the example of BA. As such, more businesses are looking to bolster their defences to remain compliant and avoid unnecessary fines and the crippling effects of reputational damage.
Thus, it should come as no surprise that regulators globally have already been focusing on the importance of strong cyber defences. For example, in addition to GDPR, in the UK the Financial Conduct Authority (FCA) has listed cybersecurity as a crucial part of its regulatory compliance agenda and provides specific guidelines for organisations on the disclosure of incidents. Similarly, the Monetary Authority of Singapore (MAS) has made cybersecurity a priority, establishing an international advisory panel and appointing its first chief cybersecurity officer in an effort to drive regulatory standards of compliance for the financial services market.

Cybersecurity

With cybersecurity at the forefront of the agenda of the financial markets regulators, many companies are asking if they can sleep easy at night as the adoption of cloud-based infrastructure grows rapidly to enable business growth. Are these moves from regulatory authorities impacting the pace of technological advancement in the industry and hindering business? The increased emphasis on cybersecurity from financial services regulators is primarily driven by concerns around the continued health of the global financial markets. Regulatory intervention on such matters is often initially perceived as an "additional burden," "over-regulation," or an "unwelcome distraction" from generating revenue. However, since many parts of the financial services market fail to drive change in how they manage systemic risks without regulatory intervention, such top-level intervention should be welcomed. Indeed, the whole ecosystem will be better protected, and market participants will have the chance to collaborate on how the industry mitigates risk as a whole.

The need for a cultural shift

A cultural shift is required, however, when it comes to issue management in financial services. Organisations should encourage a movement away from brushing issues under the carpet and towards a culture of proactive disclosure and day-to-day issue management. As cyber threats advance, financial firms need to see this as an opportunity to develop processes and protections, regardless of legislation or pressure from regulators. With consumers holding organisations to a higher standard than ever before, firms are under growing pressure to stay ahead of the curve and be transparent, making appropriate adjustments early enough to protect their business and, ultimately, their customers. In fact, making changes in advance of regulators could earn the trust of new customers by demonstrating stability, forward thinking and corporate social responsibility. To be proactive in applying best industry practices across the market, organisations should focus on managing an effective transition to cloud technology. Indeed, it would be wise for financial market participants to assess the following questions about their organisations:

### Threats | Can cloud email security defend against these?

It's hard to overestimate how fundamental email threats have become as a route to attack enterprises. While there are numerous ways for attackers to target organisations - vulnerable applications, compromised credentials, poorly-secured infrastructure - email is always a common denominator. An attacker that doesn't try email is probably not one you should really worry about.
Cloud email infrastructure such as Office 365 Exchange and Google's G Suite is no different from on-premise email servers in this regard, with Microsoft's own figures showing a 600% increase in malware incidents recorded on the platform during 2016 alone. While the attacks and threats being designed and put into action within the cloud environment are vast, they can often be divided into overlapping categories. Most begin with the simple 'spam' email; these emails tend to be harmless, but they can clog email gateways and employee inboxes. This then moves on to more serious, albeit still generic, threats like ransomware, or phishing attacks which harness social engineering tactics. For enterprises, however, the most dangerous and fastest growing category is made up of attacks designed specifically to target and threaten employees, business processes and supply chains. The most prevalent attack types in this category include:

Spear phishing and credential theft: Spear phishing is an email attack specifically targeted at an individual, company, or business. These emails are often intended to steal data, but can also allow nefarious actors to install malware onto a victim's computer. Spear phishing uses clever tactics to customise messages, making them appear relevant to the recipient, and once the unsuspecting victim opens or interacts with the email they thought was safe, criminals can get their hands on the data they need.

Whaling: Whaling attacks are like spear phishing; however, the two must not be confused. Whaling only targets employees perceived to be 'high value' - for instance, the CEO, CFO and other VIPs within an organisation. These individuals tend to have access to sensitive information like employee or customer data, and also the power to control large balances in banking and securities accounts, making them more appealing targets to criminals. A successful whaling attack can give nefarious actors access to passwords and other important account details which can, in turn, open up corporate hard drives, networks, and even bank accounts. Some whaling campaigns can even go after secret military and other government information.

Ransomware: Ransomware has been a prevalent cybersecurity threat since 2005; however, over the last three years, events have proven that the threat is increasing not only in frequency, but also in complexity. Many see ransomware as the biggest threat facing organisations today. Ransomware now only needs a single victim to gain a foothold on a network, from where it can spread and potentially bring an organisation to a standstill.

Business Email Compromise (BEC): A BEC attack is highly targeted and designed to conduct financial fraud. Criminals often impersonate a co-worker or trusted third party in order to compromise an email system from within. What makes BEC attacks even harder to spot is the fact that, in most instances, there is no payload. Instead, they rely on intent and urgency, imploring the victim to act quickly.

Whilst these examples of attack types used in the cloud environment can be categorised in this way, it is important to note that attackers can combine methods and utilise them in a single campaign. Cybercriminals (and defenders) are all too aware of how lucrative phishing can be, and so attackers are willing to dedicate time and resources to researching victims and planning attacks over months. It can be argued that each successful attack is simply the prelude to beginning a new one.
When using cloud email infrastructure, it is imperative for organisations to understand how they can stay safe. To significantly reduce the risk of today's advanced phishing attacks, next-generation email security must provide a three-pronged strategy: technical controls, end-user controls, and process automation that continuously monitors and responds. Technical controls block as many phishing threats as possible, while end-user controls improve detection in the mailbox and encourage users to become an active part of the defence strategy. By employing a system that uses machine learning to automatically find the malicious emails that were sophisticated enough to bypass traditional cloud email security and land in inboxes, such a system can study every employee's inbox to detect anomalies in communication habits based on sophisticated user behavioural analysis. Suspicious emails can then be visually flagged the second they hit an inbox, and a quick button link inside the Outlook and Gmail toolbars enables instant SOC team notification while prompting security tools for further investigation and immediate remediation. This 'virtual' security team member reduces the risk of human error in identifying malicious emails and gives organisations a mailbox-level defence to ensure protection and remediation.

With the threat nefarious emails bring to organisations growing not only in prevalence, but in complexity, the time to act is now. Phishing will continue to dominate the threat landscape, and the consequences one email can unleash upon a business can be devastating: ransomware and BEC can bring huge financial losses and lost revenue, and any reputational damage can be difficult for even the most established brand to bounce back from. Organisations should work to address the gaps in their email security in order to stay one step ahead of the bad guys.

### Cyber Threat to SMB | Marc Wilczek | Link 11

Marc Wilczek from Link11 has some interesting insights into why small businesses may be the most vulnerable to a cyber attack.

Transcript

I'm really pleased to be joined by Marc Wilczek again, MD of Link11. Hi Marc, thanks for coming in again to talk to us about cybersecurity and cybercrime. Today we're discussing the recent release of the Cyber Threat to UK Businesses report, published by the National Cyber Security Centre, I believe, and the National Crime Agency. They talk about the cyber threat with a big focus on small and medium-sized businesses - we often talk about enterprise, but can you tell me, what does this report say? What is the threat landscape within smaller organisations?

Right, it's a very interesting piece of research that was indeed just recently released, and what it says in a nutshell is that small businesses still very often underestimate the risks that are involved when it comes to, you know, cybercrime and cyber threats. The report especially talks about ransomware and DDoS attacks as being the number one risks that UK businesses are facing when it comes to cybercrime.

Yeah, don't they call it extortionware when they group it together?

Very much, and often, you know, they literally go hand in hand, because organisations are getting confronted with blackmailing, for instance. There are bad guys trying to squeeze an organisation to pay a ransom, or these guys might launch a cyber attack. Yeah, that's a very common pattern.

So why do you think small and medium-sized businesses are such vulnerable targets?
Right, well, from the standpoint of the bad guys, some of them simply believe that, unlike the enterprise world, these small businesses haven't really put emergency plans in place. They don't have enough capabilities, tools and systems in place to mitigate these attacks. They're perhaps more likely to pay a ransom, and that makes them an interesting target for some of these criminals. And on the other hand, small businesses sometimes think that security is expensive - a nice-to-have kind of gimmick. They're underestimating the risks that are involved when it comes to the digital world, and that makes it a perfect hunting ground for cybercriminals, unfortunately.

So they might think that putting in decent security is too expensive. Why would a criminal come after me? I'm just a small target, I'm not worth much to them - all of these things. But in actual fact, this market is incredibly lucrative for cybercriminals.

Indeed, and it's a very flourishing business, and unfortunately we've seen time and time again that organisations are just not prepared to deal with these threats. And just to share an interesting statistic with you: roughly one-third of all cyber attacks are targeted at organisations with fewer than 250 employees. So roughly one-third of attacks are basically focused on small and midsize businesses, and the impact of them can be quite massive.

So are cybercriminals just doing more, for lower value? They're attacking more businesses, but asking for less?

That's indeed right, and it's more, you know, transaction-oriented - it's a mass business, massive volume that they're basically chasing, and it's a very lucrative ecosystem for them.

So now, in terms of impact, just to make it clear: if you're a smaller business and you get attacked or held to ransom, you're asked for a relatively small amount compared with the sums asked of enterprises, but proportionally that can have a massive impact on a business. And then it's not just the financial aspect - it's how do you clean your systems? How do you get back up and running after an attack?

Yeah, that's right. Perhaps a couple of things in this regard. One is, imagine for instance a wine shop, or somebody who's operating in the e-commerce space, and all they do is promote goods online. So guess what the impact might be if they're under attack and the website is offline for a couple of hours or a couple of days - it's going to be devastating for that small business, because they're entirely dependent on their digital revenues. And at the same time, what's also often underestimated is the pain in the aftermath of that event or that attack, because it might take an organisation literally days to restore the systems, if not weeks, and sometimes these small businesses might not have backups ready. They're just not prepared to restore the systems. So, putting everything together, it can have, you know, devastating consequences, because it just takes forever to get the systems up and running again.

Yeah, it's probably not considered often enough, but there's the consequential loss, not just the actual loss due to the ransomware or the demand that the cybercriminal might be making. So where is all this coming from? I've heard terms lately like cybercrime as a service. Is that true?

Unfortunately, it is. The underground economy is growing very fundamentally, and as a matter of fact, there are websites available on the dark net that offer cybercrime as a service.
So people could, for instance, order a cyber attack - they could order a DDoS attack - for just a couple of euros or US dollars, for next to no money basically, which makes it extremely appealing for bad guys to launch an attack very easily, whether it's, you know, ex-employees or whether it's one of the competitors next door. Cybercrime has just become so easily consumable thanks to the dark net, thanks to cryptocurrencies, which is a real problem at this point in time.

It's quite amazing to consider. Traditionally I would have thought it's organised crime, and I'm guessing it probably is still organised crime behind a lot of these services, but they're essentially outsourcing their capabilities to anybody who can find them.

That's right. Yeah, it's a real fundamental problem, and as I've mentioned before, you know, the world is going digital and so does crime, and obviously it's a very lucrative, you know, playground and hunting ground for the bad guys out there.

It's as easy as going to a popular shopping site and ordering something online.

Yes, I think, you know, one fair point is that the lines between legitimate e-commerce and illicit trading are basically blurring. So it's becoming more or less one and the same, and you can see common patterns in terms of, you know, even criminal sites offering reviews for their consumers. Some of them provide hotline support - it's ridiculous, but that's what's happening out there.

So what can small and medium-sized businesses do? I mean, we've probably scared them enough, but particularly you at Link11 - what type of services are you developing to help small businesses deal with this threat?

Right, so we provide a bunch of different services that make sure that, you know, businesses stay safe. Regarding DDoS mitigation, as one example, we protect small shops and websites against DDoS attacks, and we do that as a managed service, knowing that an organisation, especially a small shop, might not be able to afford dedicated security people. So what we do is provide a fully managed service, 24/7, to make sure that if an attack is happening, we take care of it and the business doesn't need to worry about it. It's our job to make sure that the business stays safe and there are no outages.

So the clear message is: it's not too expensive, you can do something about it, and you don't have to be an expert in it yourself.

Not at all. Yes, absolutely right, yeah.

Great. Well, Marc, thank you so much for coming on and talking to me about the threat landscape for small and medium-sized businesses.

My pleasure. Thanks again.

### Manufacturers | Disrupt or die | Three crucial steps

In the rush to embrace digital transformation, businesses of all sizes will doubtless face pressure from innovative rivals, some with better digital skills or investment capabilities, but that should not deter them. Peer insight from early adopters can help them understand crucial risks, spot real opportunities for growth and avoid hurting their bottom line with unnecessary tech investments. Thomas Honoré, CEO of Columbus, explains how manufacturers can outsmart the competition with a strategy geared towards understanding how customers use products, and how innovation can be applied to improve the customer experience.
Accenture reports that 95 per cent of C-suite decision-makers expect major strategic challenges connected to disruption and Industry 4.0 developments, yet only one in five organisations are prepared for this. I often wonder if these businesses feel that they are watching their industry evolve through the window of a time machine racing forward towards the next new thing, more connected, intelligent and confusing than ever before. When will Industry 4.0 become yesterday's news? How can manufacturers learn to prepare for the future when it is unclear what it holds? There are three fundamental steps every business should start with.

Take IT personally

Car sharing schemes have disrupted car rental businesses, online web stores are challenging high street shops, and smartphones have rendered telephone boxes, the yellow pages and even physical money useless. Today, studies suggest up to half of all jobs could undergo some kind of automation over the next ten years. But watching these trends, small and medium-sized businesses might get the impression that the birth of the next industrial revolution is confined to Silicon Valley. In reality, it's happening right on their doorstep, and they can all take a slice of the digital transformation cake. Keeping a close eye on market trends is useful for many reasons: it informs decision makers of the state, future and key players of the industry, and it helps them pinpoint where the next business-changing disruption is likely to take place. To thrive, businesses must embrace disruption and apply innovation wherever they can.

Solve your customer's problems - technology can help

Businesses that focus too much on the technical side of things risk overlooking the biggest growth potential - customer satisfaction. Can you clearly identify what your customers want? Do you know where your product adds value and where it falls short? A winning digital strategy is driven by a clear understanding of customer pain points and where the company's products or services can solve them. Digital transformation doesn't require businesses to rip and replace a working business model; it just means augmenting capabilities to meet and exceed customer expectations. This notion is widely reflected in the industry. A recent Columbus benchmarking report asked experts from Microsoft, Weetabix and BMW Oxford how customer needs shape their 'Manufacturing 2020' strategy. All agreed it took centre stage in their efforts, with Gunther Boehner, Director of Assembly at BMW Oxford, saying, "Manufacturers need to have a strong focus on their core processes, which directly relate to customers' needs and pain points." Assessing how a company's products are used is not an easy task, but data can help. IoT technology can help manufacturers get closer to the customer and turn data into actionable insights. This only works if you have the organisation, processes and systems to handle the information overload.

Partner up | technology partnerships can separate leaders from laggards

As digital transformation takes businesses on to the next innovation challenge, readily available, on-hand expertise and skills become a valuable currency. But for most manufacturers, developing the solutions that give them a competitive advantage cannot be done in-house. Careful selection of technology partners is crucial because they will have a major impact on the ability to develop better mechanisms, implement new technologies and achieve smarter data management.
For instance, growing data visibility was a strategic priority for Weetabix. In the Columbus report, Neil Clarke, Head of Business Unit, explains: "We are constantly growing our data visibility, through hardware and software. We are also using new machinery technologies to continually automate for consistency and lower costs. Live data gives real-time feedback to the operator on what they need to correct."

Start today

So how can manufacturing businesses of all shapes and sizes reap the benefits of the fourth industrial revolution while preparing for the next wave of innovation? Amid growing competition, rising customer expectations and an ever-evolving industrial technology scene, a good place to start is to prioritise digitally-minded leadership, skills and technologies, and a customer-centric business model. These steps will enable manufacturers to embrace disruption today and tomorrow and stay ahead of the competition.

### Machine Vs Machine | Marc Wilczek | Link11

Marc Wilczek, MD of Link11, has some very interesting thoughts on how cybersecurity is moving towards machine versus machine with the help of AI and automation. This technology should help prevent cyber attacks in real time, as they are happening, rather than having to rectify issues after the attack has happened.

Transcript

I'm really pleased to be joined now by Marc Wilczek, the MD of Link11. Hello, Marc, thank you for joining me. Now, you're here today to tell me about the changing threat landscape. Just as a bit of background, what are traditionally the types of attacks that a lot of companies face, and how is that changing?

Right, to give you a bit of perspective here: you know, the whole world is going digital and everybody talks about digitisation, but at the same time cybercrime is changing very fundamentally, and cybercrime is actually undergoing, if you will, a digital transformation of its own.

So even criminals are going through their own digital transformation?

Very much so, yes indeed. Actually, just very recently there was a report published saying that if cybercrime were a country, it would equate to the 13th largest GDP in the world, equating to some $1.5 trillion in annual revenues produced through cybercrime. So we're talking about very, very big numbers, and as much as the legitimate digital world is growing, so does the crime. And back to your point: in the old days, it was very much the notion of individual versus individual, human versus human. We're increasingly seeing a very fundamental shift, because the bad guys are ramping up - they're abusing and weaponising digital technologies, and attacks are getting much, much smarter. In the digital world, it's more the notion of machine versus machine as opposed to human versus human.

So traditionally a business might employ a cybersecurity professional who would try to mitigate attacks from cybercriminals who are individuals themselves, or hackers, but now the attacks are carried out by bots - they're more automated. Just explain to me a little bit more about what's happening.

Yes, everything you've just described pretty precisely summarises what we're seeing in the marketplace. The bad guys are weaponising digital technology, attacks are getting more complex and more sophisticated, using multiple vectors at once. They're using and abusing IoT devices to produce massive attacks, for instance.
So there is a lot happening, and, you know, the struggle organisations really have is to keep up with that weaponising of digital technology.

So it's a digital arms race. We still call it a crime, but it's more like cyber warfare.

It very much is, and it's a bit of a cat and mouse game, if you will. And yeah, it's increasingly observable that organisations are really struggling to keep up with that massive amount of data traffic that is, for instance, coming in.

So this complexity now, both in the attacks and also, I guess, in the landscape - the environments that businesses are running - how do they deal with that complexity these days?

Right, I think automation is really one of the very big subjects these days, because the days when organisations were able, again human versus human, to defend against attacks are gone because of that increasing complexity, and IT landscapes are also getting more complex. It is really important to automatically, first of all, detect but also mitigate these attacks, preferably in real time.

So for an organisation still taking the traditional approach, with a number of people on their IT security team, there are all sorts of issues that they must be facing, not least around when an attack comes in, the extent of the attack, where it's coming from, the time of day even. And that's not even going into human error. So what is it that they need to do in order to try to keep up with the criminals?

Right, I think it's really important to thoroughly assess the threat landscape and to be aware of upcoming and new threats, because the threat landscape is evolving constantly. That's one thing. The other thing, because of that increasing complexity, is to leverage AI and machine learning mechanisms wherever possible - to use automation ruthlessly, wherever possible - just to stay ahead of the game and to defend against these attacks.

So you talk about artificial intelligence and automation - it's algorithms taking on algorithms. At any point should a human be in the loop, or should these be automated to the point where they identify an attack and mitigate that attack?

Right, in terms of that mitigation, I think let the machines handle it. It's much, much safer, it's quicker, it's real time. But getting the humans involved, or getting an analyst involved, makes perfect sense - do that after the event. Once the attack is over, analysing what exactly happened, what it means in terms of the landscape, and what additional precautions could be put in place, and having a human analyse the situation, makes perfect sense - but let's do that after the event, not during it.

So where do we go from here? Is it just going to be a case of algorithms chasing algorithms, or what are the cybercriminals working on next? And do you still need humans in your security team to make sure that you have the correct weapons and that they're kept up to date?

Yeah, I wish it were easy to answer that question - you know, what's next on their list. That's exactly the struggle. It is important to analyse the threat landscape and then obviously to enhance services and capabilities in order to keep up, but it's not that easy to predict what the next things on the horizon are. That's a view into the crystal ball, if you like.
Yes, it would be wonderful to be able to predict the future - I could use that to my advantage. But all right, from a pragmatic approach, if companies are looking to beef up their cybersecurity and mitigate attacks, how do they start? What are the steps they should go through?

Right. So the first thing, I think, after they've analysed the threat landscape, is to identify possible loopholes and to think about how they can mitigate these risks, leveraging AI and leveraging automation wherever possible. Organisations are strongly advised to think about an automation-first type of policy. So wherever possible, employ automation, because it's just, you know, safer - as you've just said, it precludes human error - it's real time, it just makes the whole thing a lot more reliable.

Yeah, and in terms of the different attacks, is it one size fits all? Do different organisations see different threats depending on which type of industry they're in or which geographical location? Is there that much variation?

There are some common patterns, but depending on the IT landscape, organisations might be exposed to certain threats more than others. For instance, organisations that have a large mobile workforce might face more mobile risks, while an organisation that has a larger web presence might be confronted with more web-related threats. So, very much depending upon the structure of the organisation and the type of business that is carried out, organisations face different types of threats.

It's a fascinating subject and I hope you come back again to tell me how it evolves in the future. But yeah, machine versus machine - it's quite scary.

It is, huh.

Marc, thank you for coming on and telling me about it.

My pleasure, thanks for having me.

### Provider | AWS vs Azure vs Google

As an ever-increasing number of businesses, from start-ups to Fortune 500 companies, move to the cloud, CIOs and owners come face to face with the sheer number of provider options, products and services available for them to choose from. According to Gartner, the market for worldwide public cloud services is predicted to grow from $153 billion in 2017 to around $186.4 billion in 2018 - an increase of 21.4%. When it comes to the type of services you need to effectively migrate to and work in the cloud, there's no one-size-fits-all solution. Every organisation has its own unique set of requirements, and the perfect solution to suit your business may in fact lie in combining products and services from a few different providers.

The Big Three at a glance

While other strong competitors such as Alibaba Cloud and Oracle Cloud have emerged in recent years, Amazon Web Services remains a strong frontrunner in the cloud computing sphere, with competitors Azure and Google Cloud eking out their own respectable share of the market. For the purposes of this article, we will be looking at the 'big three' providers dominating the cloud computing industry.

Amazon Web Services (AWS)

AWS was the first major provider in the cloud market and has been in operation for approximately 12 years, carving out a whopping 33% of market share and generating $1.4bn for Amazon in Q1 2018 alone. The biggest strength AWS possesses is undoubtedly its dominance in the public cloud market, with its success and popularity linked to the sheer scale of its operation.
AWS boasts a huge, ever-growing range of products and services, and arguably the most comprehensive network of data centres the market currently has to offer. According to Gartner's 2017 Magic Quadrant for IaaS, "AWS is the most mature, enterprise-ready provider, with the deepest capabilities for governing a large number of users and resources."

Microsoft Azure

Microsoft was late to the cloud game but made up for the delay by taking its existing on-premises products (Windows Server, Office, SQL Server, SharePoint, Dynamics, etc.) and rejigging them for the cloud. Fast forward to today, and Azure has been around for approximately seven years, providing companies with a broad set of features, open source support, and easy integration with other Microsoft tools. A key factor in Azure's success is user familiarity with the brand, which creates a preference for Azure among loyal Microsoft customers. While Azure is indeed classed as an enterprise-ready platform, in its aforementioned Magic Quadrant report Gartner noted that many users feel that "the service experience feels less enterprise-ready than they expected, given Microsoft's long history as an enterprise vendor". Users also cited issues with technical support, training, and DevOps support as some of the primary pain points when using the provider.

Google Cloud Platform

While it lacks the range of services and scale of global data centres offered by its competitors, Google Cloud provides a specialised service when it comes to Big Data, machine learning and analytics, with formidable scale and load balancing, robust data centres, and very low response times. A key advantage lies in Google's containers offering, having developed the Kubernetes standard now offered by both AWS and Azure. According to Gartner, organisations "typically choose GCP as a secondary provider rather than a strategic provider, though GCP is increasingly chosen as a strategic alternative to AWS by customers whose businesses compete with Amazon, and that are more open-source-centric or DevOps-centric, and thus are less well-aligned to Microsoft Azure."

Choosing the right provider

When it comes to cloud migration, every project is unique and dependent on the particular needs, goals, and resources of the company in question. This year, over 80% of enterprises have opted for a multi-cloud strategy, with 51% of enterprises selecting a hybrid solution (i.e. combining public and private clouds). The best public cloud provider for your business depends on your particular requirements and workload, and the most efficient and cost-effective solution could lie in combining the services of different vendors.

If you're going for AWS: AWS is a strong choice because of its robust range of tools, products and services, as well as the sheer size of the provider. The main downside to Amazon's offering is that it does not provide the more personal relationship offered by smaller operations. Due to its massive size and global spread, it can be difficult for Amazon to maintain close relationships with each customer, but this is remedied by recognised partners and consultants who can offer that level of customer service.

If you're going for Azure: Azure's biggest selling point is, of course, its existing Microsoft products and loyal customer base. Any existing .NET code works on Azure, your organisation's server environment will connect to Azure with minimal to no issues, and you should find it easy to migrate on-premises apps.
If you want Linux, DevOps, or bare metal, however, Microsoft would not be the ideal choice. Azure offers Linux, but it takes a back seat to Windows in priority. DevOps is primarily a Linux/open source play - again, something Microsoft does not specialise in.

If you're going for Google: Google is growing rapidly; however, its cloud offering remains a work in progress. Without an established background in working with businesses, the vendor has some catching up to do in terms of its service and range of products on offer, but it is clearly focused on investing in and building its presence in the cloud market. Google Cloud is also partnered with Cisco, which does know the world of enterprise, and it has a strong reputation for scale and machine learning working in its favour.

### A decade of the App Store: where next for enterprise applications?

2018 marks 10 years since Apple launched its App Store. In the intervening period, apps have gone from being relatively unheard of to a key part of our modern tech ecosystem. They are now critical to almost every smartphone user, and according to Flurry's latest State of Mobile report, app usage is continuing to rise. The App Store started with just 500 apps in 2008 and grew to more than 900,000 in 2013. Today, there are more than 2.2 million. Meanwhile, Android users can choose from 2.8 million apps.

While the latest consumer-facing apps tend to get the lion's share of attention for the entertainment, fun and usefulness they bring, apps in the enterprise have been around for decades, quietly doing their important and diligent work - from simplifying business processes to increasing productivity. You won't find most B2B applications in app stores, however, mainly because businesses often develop their mission-critical applications in-house, tailored to meet their unique needs - either from scratch or as customised versions of "off the shelf" software. Many of today's enterprises run on applications that were developed 10 or 20 years ago and are integral to the business, collecting and processing extraordinary amounts of data about the company, employees and customers. These systems range from back office support systems, such as enterprise resource planning (ERP) and HR applications, to transactional systems like payment and supply chain systems - all of which are continually greasing the wheels of business.

However, the legacy apps of the past will not enable businesses to compete in a digital, cloud-driven future. Modernising critical business systems is paramount to organisations' competitive advantage in a world where customers in every sector expect on-demand access to goods and services. Adding to the urgency is the fact that modernising legacy apps takes time, and it may soon be too late for businesses that haven't already embraced it.

Finding the right path to application modernisation

The major trends driving modernisation of business applications are threefold: reducing time to market of new features and functionality to keep pace with competition, the reliability and maturity of increasingly available technology, and market and user expectations. The explosion of open source technologies in the last decade has led to an unprecedented amount of free, stable, maintained, and configurable tools and applications. This has been enhanced by the move to an "as a service" tech economy. Shifting costs from CapEx to OpEx has caused challenges for enterprises, but early adopters have paved the way for the late majority.
Capital costs for data centres, which can leave businesses tied in to expensive long-term programmes, are being replaced by operating expenditures that can provide a lean cost model and easier expansion. This combination of enterprise-class open source tools and cloud services provides the necessary ingredients to begin a modernisation strategy with relatively low capital outlay. Both of those factors have meant that there is already significant cloud use amongst enterprises. The majority of businesses are cloud users in some form, and as we look forward to 2019, it's safe to say cloud is well beyond its 'early adopter' phase.

The final driver for change is the fact that people have grown to expect highly robust and integrated experiences from the applications they use. Whether it's controlling their lights, doing their banking, or booking a table, people expect apps to be constantly updated with new capabilities, and for these updates to happen without intervention or downtime. If there is one thing that the current business cycle has proved, evidenced by the severe difficulties that many traditionally operated companies have experienced across sectors, it is that there will always be a business prepared to offer the same products and services with an improved digital experience for the customer. The innovation in the consumer space has created the expectation that enterprise applications run as smoothly as the top-rated apps on the App Store.

The good news is that the technology, processes and experience to modernise legacy apps are readily available. Of course, execution remains a challenge, and thus planning, patience, and an incremental, progressive approach to modernisation are keys to success. Transforming over time, not overnight, is the most reliable and effective way to modernise. Companies must examine and deeply understand their infrastructure, development processes, and application architecture before embarking on any modernisation initiative. Once enterprise technology teams understand the scope of their technology estate and have prioritised apps for modernisation, only then can they leverage cloud, open source software, and modern development processes to evolve critical business applications. You can't simply "lift and shift" a monolithic application to a hyperscale cloud and expect it to work like a mobile app, just like you can't tell developers and operations to "do DevOps" and expect high-velocity software delivery. Approach modernisation as an ongoing, collaborative process, with specific goals at each stage, so you gain benefits throughout and enable teams to participate fully in their own success.

On the anniversary of the App Store, now is the ideal time to consider the whole spectrum of applications - not just the consumer-facing apps, but also the longstanding, mission-critical enterprise apps that run businesses everywhere. It will be the modernisation of these core business apps, and not anything you'll find in the App Store, which will define the value of the app in the decade to come.

### Blockchain | 3 ways it's changing digital marketing

A world in which we are accountable for our every action sounds like an unpleasant fever dream to some and like utopia to others, but we are still far from that reality. However, we are edging closer to it in small, universally approved and publicly registered steps, which cannot be denied once taken.
This is why blockchain is the next step in our coming to terms with our imperfections and our acceptance of the fact that, while it would be great to be able to trust people unreservedly, they are not always trustworthy. By allowing you to dictate the terms under which you'll enter an agreement just as much as the other side, and by ensuring that the parties involved are identifiable and accountable, blockchain does a great job of covering for some of our basic human flaws, creating a more trusting and, consequently, a more productive and vibrant environment. Some of the changes that the increased adoption of this technology is bound to cause in digital marketing will be inevitable - something you have to deal with or overcome; others will be welcomed and actively explored because of the potential benefits they can bring. You should be prepared for either.

Better targeting

Blockchain is equally focused on protecting information and on making it accessible. So, while the parties involved in a contract will, depending on their transactions, be required to provide a set of personal information, that information won't be available to just anyone. As a digital marketer, you need to be able to appreciate the potential of this kind of structured data, even though you cannot reach it in any other way than by directly requesting it from the owners. If this doesn't seem like much of an improvement to you, there are two things you need to consider:

Accuracy - while more difficult to obtain, this data will be far more reliable, focused and comprehensive than what you've been able to gather from traditional sources. This means less waste from failed targeting and fewer surprises in general.

Legal aspects - data privacy is becoming more of a burning issue as each day goes by. With GDPR in force, the United States Congress getting involved, and a huge part of the population believing that they can unreservedly exercise their right to free speech without it ever endangering their right to privacy, you want to have a clear record of what you did and didn't do once someone starts pointing their finger your way.

Getting that data from relevant prospects may be as simple as straight-out buying it, but even if you find that you need to invest much more effort into obtaining it, it is going to be worth it as long as you are capable of properly incorporating it into your overall strategy.

More transparency with blockchain

We've already mentioned transparency, but have only touched upon one of its aspects. Needless to say, a radical change in this direction would have a much wider impact on digital marketing than what we've discussed. One of its most anticipated benefits would be the ability to cut out the regulatory bodies that we are paying to regulate us. In other words, the trust that trackable transactions on a blockchain provide would allow us to circumvent the middlemen we once needed to ensure the satisfaction of both parties. The ability to trust each other allows smaller entities to form direct collaborations, avoid unnecessary expenses or limitations, be more creative in their marketing, and carve out a place for themselves somewhere far away from the overcrowded watering holes created around the dominant, perhaps soon to be obsolete, intermediaries. Decentralised finances allow for decentralised markets, which allow for greater freedom in deciding how you are going to grow and promote your business.
Changing consumer habits

Your clients are not your only concern; it's your clients' clients that you really need to keep an eye on. Conscientious and professional digital marketing agencies will pay more attention to helping the clients they already have than to finding ways of attracting new ones, and that is exactly what you are doing by learning to anticipate the behaviour of your client's target audience. The proliferation of blockchain is going to cause serious ripples in all markets and industries, and consumers are going to change accordingly. If you even want to think about formulating a long-term marketing strategy, you will need to anticipate the direction of these changes and account for them in your plan. What is a typical buyer journey going to look like? How much more pronounced are the differences between the marketing funnels of your clients working in different industries going to get? How will you adjust your strategies and ensure that you can absorb such a huge increase in workload? There's no need to go into detail about how protecting your client's interests means protecting your own, so do your best to prepare for the possibility of a drastic change in the habits of their target audiences.

Final word

Since the limits of blockchain's application potential aren't even close to being fully explored yet, we are not left with much apart from speculation regarding the benefits that increased transparency and data reliability are going to bring. However, what is forecasting if not informed speculation? While there is no way to accurately predict the changes this technology might bring in just a couple of years, doing your best to keep coming up with educated guesses on the subject is bound to give you a head start on those who leave the speculating to others.

### The technology helping businesses go green

With a scorching two-month heatwave following the icy cold Beast from the East, it's been hard to avoid climate change in the UK. Globally, forest fires in California, Cape Town's water shortage crisis, and an abnormally heavy monsoon season in Kerala are further signs that the weather is becoming less predictable. While awareness about climate change and the environment has been steadily rising, it's only in the last few years that we've seen governments and enterprises take serious action. The introduction of 5p shopping bags, electric cars, the replacement of plastic straws with green alternatives, and WeWork not expensing meals that contain meat are just a few examples.

But it's not just large corporations and governments jumping on the green bandwagon. Smaller businesses and entrepreneurs are also playing their part, with cycle-to-work or public transport schemes being offered, energy-saving light bulbs installed, and employees being encouraged to recycle. Often overlooked in favour of these more traditional measures is the impact of technology in cleaning up the enterprise. In fact, with the energy footprint of technology already accounting for approximately seven per cent of global electricity, a number which is set to rise further, it's critical that businesses understand where technology can be a force for good, and how it can be used to go green and reduce their carbon footprints instead of increasing them.
Here are three ways technology can help business leaders adopt a more eco-friendly work environment:

Cloud-based computing

Businesses are increasingly adopting cloud-based solutions in favour of on-premise systems, in order to make the most of a range of benefits including scalability, pay-as-you-go pricing, and the ability to pick and choose applications without having to purchase an entire infrastructure. In other words, companies are able to create bespoke systems at a fraction of the cost. But another, less talked about benefit of cloud computing is that it is, on the whole, far greener. Managing, storing and processing data on a local server consumes far more energy, so switching operations to a public cloud will reduce the company's electricity consumption and, in turn, its carbon emissions. A study by the Lawrence Berkeley National Laboratory estimated that moving business software such as CRM, ERP and email to the cloud would lower the total energy consumption of these applications by 87 per cent.

Paperless offices

Switching to the cloud can also reduce the amount of paper printed in offices, as the technology encourages a more digital environment. Even with recycling schemes in place, using paper translates to cutting down trees, which are a key factor in keeping the earth green. As well as cloud computing, there are several inexpensive tools that businesses can invest in to cut down the amount of paper they use. Project management software like Asana, Trello and Evernote can help do away with to-do lists and notepads while having the added impact of fostering a more collaborative working environment. Similarly, e-signature solutions like DocuSign and OneSpan Sign allow companies to securely bypass the need to print paper for a signature while remaining compliant with regulations. Not only can these tools help companies be more green, but they can also boost productivity among employees and reduce running costs for businesses.

Remote working

Going paperless and adopting cloud technology also give employees the opportunity to work remotely, as long as they're supported by the right tools. The way we work is changing, and that change won't slow down any time soon. 9-5 working days and in-person meetings are fast becoming a thing of the past. The gig economy is also booming, and more companies than ever are adopting flexible working schemes and remote-first cultures. Collaborative workspaces are a by-product of this change in working culture, as smaller businesses and entrepreneurs no longer feel the need to have a permanent workspace. To encourage remote working, many enterprises are investing in video conferencing tools, allowing them to span geographical borders and international time zones, as well as bringing together far-flung remote-working employees. Collaboration tools are already fostering remote working, but as they become more advanced, this small shift has the potential to become seismic, so companies need to keep their finger on the pulse to stay ahead.

Not only are these trends providing employees with a better work-life balance, but they're also having a huge impact on carbon footprints. Working remotely, be it every day or once a month, means employees are commuting less and cutting down on carbon emissions. And, if employees are working from home, energy costs within an office can be reduced. Smaller office spaces can be rented, which in turn means less energy spent on lights, heating, air conditioning and so on.
Introducing a scheme whereby all employees work from home on a Friday would cut the weekly office energy consumption by 20 per cent, which is a significant amount. A recent study revealed that growth topped the list for CEO business priorities in 2018 and 2019. And part of this involves looking forward and understanding that the nature of how we work is evolving. Business leaders are already changing and upgrading the structure of their companies; remote first organisations and decentralised companies are becoming more common. Similarly, investment in technology to support these changes is becoming more urgent for companies who don’t want to fall behind their competitors. For the environmentally conscious entrepreneurs, these changes are also allowing them to make a difference to climate change not just in their personal lives, but in the workplace too. Video-conferencing technology, collaborative workplaces, on-demand webinars, not only allow businesses to digitally transform, but also contribute to the fight against climate change and global warming. ### Could Cloud make AI accessible for all? Cathal McGloin, CEO of customer service technology start-up, ServisBOT (www.servisbot.com) explains why the combination of serverless computing, ubiquitous mobile use and the open sourcing of artificial intelligence by Cloud giants are bringing chatbots within reach of every business: It’s time to chat about bots The latest report from Juniper Research, ‘Chatbots: Banking, eCommerce, Retail & Healthcare 2018 – 2023,’ predicts that chatbot technology could save 2.5 billion customer service hours, yielding annual savings of $11billion by 2023. Juniper market analyst, Sanjay Dhanda, looked at organisations in North America; Latin America; EMEA; China; the Far East; the Indian subcontinent and the rest of the Asia Pacific. He examined the potential benefits of chatbot use in banking, finance and insurance, eCommerce, retail, travel and hospitality and healthcare in terms of customer service cost savings and improved services.  At your service Clearly, one of the major advantages of chatbots, or any automated service, is that they are available whenever customers need them. This is what drove the popularity of automated teller machines (ATMs) in the banking sector fifty years ago and we foresee that chatbots and other forms of automated service will follow a similar pattern of adoption. AI adoption - Is it all chat? In spite of the current buzz around chatbots, live chat is a twenty-year-old technology that has seen a resurgence as a result of the Cloud and e-commerce. Juniper Research predicts that eCommerce transactions via chatbots will reach a value of $112 billion by 2023 as retailers harness AI for marketing, cart recovery, upselling and customer service. The artificial intelligence (AI) currently being used to power chatbots has been around since the 1960s. Five things have changed since then: the adoption of Cloud-based services; the abundance of data to fuel AI; the processing power to deal with all that data; open sourcing by internet giants like Google and Amazon; and the ubiquity of mobile devices. Now, the Android device in our pocket has more data and processing power than anything we had in the 1960s, or even the 1990s. That is revolutionary. 
How the Cloud makes chatbots more accessible Using serverless technology, companies can implement chatbot technology faster and run artificial intelligence (AI) more affordably, paying for computing resources only as needed, as well as having the ability to rapidly scale to meet business demands. This means that a small boutique hotel or a rapidly growing challenger bank can run AI technology in order to serve their customers. Two years ago, influential venture capital firm Andreessen Horowitz predicted that AI would become a feature of all software. In my view, AI won't just be a feature, it will be a requirement. Companies are making it so easy to use AI that developers don't need to know much about it to be able to apply it. I foresee that AI, smart conversations and automation will totally transform how businesses operate and engage with customers and employees. We see that the blend of Cloud, data, processing power, AI and mobility allows us to go beyond chatbots to deliver a range of AI-based customer service tools that perform different roles and functions. Service bots can operate at the front-end or in the background, engaging with customers, completing transactions, automating repetitive tasks or providing information and support. These bots can be deployed for small specialist tasks or scale up instantly to deal with very large volumes of work and transactional requests. They can work alone on an individual task, or as part of a campaign: working with other bots and human employees to complete a common mission. Will AI steal our jobs? While there have been understandable fears around AI, automation and bots making human jobs obsolete, once again, we can draw comparisons with the introduction of technology in the banking sector. In the late 1960s, people predicted and feared that ATMs would cause the demise of the bank teller role. However, Val Srinivas, Deloitte's research leader for Banking & Securities, recently explored this concern and commented, "Interestingly, as ATMs expanded—from 100,000 in 1990 to about 400,000 or so until recently—the number of tellers employed by banks did not fall, contrary to what one might have expected." Srinivas found that reducing operating costs, through the installation of cashpoints (ATMs), initially resulted in banks opening more branches as well as providing more advanced services, so the number of employees did not fall as predicted and customers gained 24/7 access to their money. By automating routine tasks, the savings in delivery costs could be applied to drive innovation. Srinivas also draws the distinction between "tasks" and "roles", pointing out that tasks can be removed through automation without making the role itself obsolete. Just as washing machines, kettles and robotic vacuum cleaners make household chores less time-consuming, employees' roles will become less routine and manual, allowing them to focus on more problem-solving, interesting and higher value-adding services. PwC predicts that, over the next two decades, AI will remove 7 million jobs and create 7.2 million, with creative industries and healthcare less affected than sectors such as manufacturing, warehousing and logistics. In spite of the advances in AI and automation, human involvement cannot be entirely removed. If you have a customer with a complex complaint, a bot may not be able to solve this problem. Using customer case history, context and emotion, a bot needs to be smart enough to know when to hand the request over to a human.
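To make that last point concrete, here is a minimal, purely illustrative sketch of a serverless-style chatbot entry point that answers simple questions itself and hands anything it isn't confident about to a person. The handler(event, context) signature follows the common serverless-function convention, but classify_intent, escalate_to_human, the canned answers and the confidence threshold are all hypothetical placeholders rather than any vendor's actual API.

```python
# Illustrative sketch only: a serverless-style chatbot entry point that answers
# simple questions and hands low-confidence requests to a human agent.
# classify_intent() and escalate_to_human() are hypothetical placeholders.

CONFIDENCE_THRESHOLD = 0.75  # below this, a person should take over

CANNED_ANSWERS = {
    "opening_hours": "We are open 9am to 5pm, Monday to Friday.",
    "reset_password": "You can reset your password from the account settings page.",
}

def classify_intent(message: str) -> tuple[str, float]:
    """Stand-in for a real NLU service: returns (intent, confidence)."""
    text = message.lower()
    if "open" in text or "hours" in text:
        return "opening_hours", 0.9
    if "password" in text:
        return "reset_password", 0.85
    return "unknown", 0.2

def escalate_to_human(message: str) -> dict:
    """Stand-in for pushing the conversation onto a human agent's queue."""
    return {"reply": "Let me pass you to a colleague who can help.", "escalated": True}

def handler(event: dict, context=None) -> dict:
    """Entry point in the style of a serverless function: one request in, one response out."""
    message = event.get("message", "")
    intent, confidence = classify_intent(message)
    if confidence < CONFIDENCE_THRESHOLD or intent not in CANNED_ANSWERS:
        return escalate_to_human(message)
    return {"reply": CANNED_ANSWERS[intent], "escalated": False}

# Example invocation (locally, outside any cloud platform):
# print(handler({"message": "What are your opening hours?"}))
```

Because a serverless platform only bills while the handler actually runs, the same pattern suits the boutique hotel handling a handful of conversations a day and the challenger bank handling thousands.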
However, as bot automation becomes more prevalent, human advisors will move to higher-value activities, while repetitive and common tasks are automated. Bots will remove tasks rather than roles and, judging by the Andreessen Horowitz and Juniper Research predictions, that has the potential to benefit every business. ### Deep Learning | How is this technology enhancing AI? Artificial intelligence (AI) isn't a sci-fi trope any longer. Even if you don't realize it, odds are good you regularly encounter AI in your day-to-day life; this includes deep learning. This technology has seemingly countless practical applications across a wide range of industries. Whether it's analyzing your viewing habits to recommend Netflix titles or powering voice apps that adjust to your preferences, AI is becoming increasingly commonplace. Several innovations are responsible for this surge in popularity. Deep learning is one of them. When you understand what deep learning is and how it drives sophisticated AI systems, you can better grasp how AI's value and usefulness will continue to grow. What You Need to Know About Deep Learning AI is a general term referring to computer programs that can "think." Machine learning is a subset of AI that refers to computers using data to more effectively complete key tasks. Within this broad framework is deep learning. This subset of a subset focuses on using algorithms that train computers to perform high-level tasks that typically involve greater amounts of data than simple machine learning tasks handle. This involves exposing the AI's neural net to large amounts of data. As it learns, it grows more effective at making predictions, allowing the system to perform certain tasks more efficiently. Although deep learning can be applied to a wide range of AI functions, it's particularly valuable when used in the following capacities. Speech Recognition Human language is complicated. A single word can be pronounced in many different ways and have different meanings in various contexts. Additionally, speech patterns and syntax choices tend to vary from one speaker to the next. With deep learning, speech recognition systems learn to better understand what a person is stating or asking by becoming familiar with the nuances of language. This innovation also allows speech recognition programs to answer appropriately when a user asks a question. Smart Homes A smart home must learn to predict when certain appliances and features should be activated, and when they shouldn't. Doing so allows the system to decrease energy usage, meet your daily needs, and reduce your bills. For example, an ideal smart home would be able to predict when it should turn the lights in a room on or off based on your previous behaviour. It will also be able to adjust the temperature according to your typical needs. Connected with Internet of Things devices, it could even one day learn to start prepping your morning coffee or set your alarm for you. To successfully perform these tasks, a smart home must use deep learning algorithms to properly glean insights from your behaviour. Image Recognition This is another AI feature you may have already benefited from in your daily life. It's especially likely if you take many smartphone pictures and store them in the cloud. AI programs often identify common traits and characteristics from photos to group them into different albums.
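To give a feel for what sits behind that kind of photo grouping, here is a minimal, hypothetical sketch of a small convolutional image classifier written with the open-source Keras API. The input size, layer sizes and the ten example categories are arbitrary placeholders chosen for illustration; a real photo service would use a far larger network and dataset.

```python
# Illustrative sketch only: a tiny convolutional network of the kind used to
# classify photos into categories. Sizes and categories are placeholders.
import tensorflow as tf

model = tf.keras.Sequential([
    tf.keras.layers.Input(shape=(64, 64, 3)),           # small RGB thumbnails
    tf.keras.layers.Conv2D(16, 3, activation="relu"),   # learn low-level visual features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Conv2D(32, 3, activation="relu"),   # learn higher-level features
    tf.keras.layers.MaxPooling2D(),
    tf.keras.layers.Flatten(),
    tf.keras.layers.Dense(10, activation="softmax"),    # e.g. ten photo categories
])

model.compile(optimizer="adam",
              loss="sparse_categorical_crossentropy",
              metrics=["accuracy"])

# Training would expose the network to many labelled example photos, e.g.:
# model.fit(train_images, train_labels, epochs=5)   # hypothetical dataset
```

Exposed to enough labelled examples, a network shaped like this learns the visual features that let it place new pictures into the right album, and the same pattern, scaled up, underpins the medical-imaging uses described below.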
A basic example would be a program that automatically sorts photos of people into an album specifically meant for these kinds of pictures. Thanks to deep learning, however, image recognition programs will soon be capable of much more than simply helping you organize your photos. For example, deep learning can allow AI to identify commonalities in medical images like X-rays or MRIs. This can help physicians more efficiently diagnose patients. In the near future, such a program could even help medical researchers better understand conditions by identifying similarities between images from patients diagnosed with them.   Looking to the Future It’s clear that deep learning can be an extremely valuable tool. That said, there are a few reasons some businesses may be slow to embrace it. Currently, deep learning programs require very strong hardware to be effective. The neural systems computers use to take advantage of and implement deep learning algorithms are typically very large. Some in the field have begun to realize that creating these networks is easier with GPUs (Graphic Processing Units) instead of the traditional CPUs (Central Processing Units). Although GPUs were originally developed to support 3D games, they also offer the necessary computing power to facilitate deep learning. Furthermore, it’s important to understand that deep learning is currently a highly specialized field. The average programmer doesn’t have the experience necessary to leverage it to its full potential. Thus, businesses need to take the time to research candidates thoroughly before selecting a programmer for a deep learning project. Luckily, as is often the case with emerging technologies, deep learning will become more accessible in the near future. Tech giants like Google and Intel are already working hard to develop hardware that supports it. That’s why it’s smart to get started with deep learning sooner rather than later. Businesses that take advantage of it now will find themselves on the forefront of the next major technological revolution. ### IoT | The quandary facing the telco industry The estimates vary from $250bn to $1trn depending on which industry report you read, but what they all agree on is that the Internet of Things (IoT) is forecast to be one of the key revenue-driving opportunities presented by the fourth industrial revolution. As a result, it’s not a surprise that IoT is already becoming one of the biggest battlegrounds for telecoms companies. However, a recent report from Ericsson, Exploring IoT Strategies, has highlighted that while operators are testing out multiple routes to monetising the IoT opportunity, more than two-thirds lack an adequately defined strategy to do so. This has raised doubts and multiple questions about whether telcos are well equipped enough to capitalise on all of the opportunities presented by the IoT. But of course, it’s not only operators who are looking at ways to incorporate dedicated IoT services and offerings into their business models to increase revenue. Telcos should aim to build partnerships with the businesses who are looking to embrace IoT connectivity and look at the different ways in which they can monetise their services. In fact, telcos are in an advantageous position in terms of being able to provide new connectivity solutions and services for the IoT; after all communications services is at the heart of what they do.  
The telecoms industry is rapidly-evolving and competition is fierce, so it’s likely only a matter of time before operators define a clear strategy and are able to reach both new and existing customers with IoT solutions and services that will ensure relevance and sustainable growth. But how exactly can telcos benefit from the exponential growth of the IoT? Where do the big opportunities lie? And how can operators capitalise in such a way so as to expand their core customer base beyond the telecoms supply chain? Well, one key area is content, and more specifically, content delivery. The potential of the IoT in digital content delivery will mean that providers can move beyond traditional methods of delivering content and entertainment. This is an area telcos specifically can exploit, offering an IoT connectivity service that will enable content providers to package up and simultaneously deliver content to millions of connected devices across a wide range of digital channels. As more and more media companies are evaluating the potential of IoT to enhance and reshape their content delivery strategies, telcos should look to strike and take advantage. Data has been mentioned as a vast by-product of IoT connectivity, and this in itself is often the most valuable commercial tool for businesses. This is also an area telcos can take advantage of given their capability for performing real-time data analytics to determine aspects such as product performance, as well as forecasting network capacity and improving customer experience. Another area operators should engage in is security. Keeping IoT networks free from external threats and security vulnerabilities is something that will be near, if not at the top, of most businesses’ agenda. While the IoT facilitates the connection of multiple devices, the amount of data sourced from these makes it extremely challenging to monitor, track, identify and remove any suspicious traffic over the network. Telcos have a rich history of best in class security services and are therefore in a strong position to provide customers and prospects with a secure offering to protect IoT networks. Services including anti-phishing, data encryption and other proactive security countermeasures offer telcos a clear way of monetising their IoT offering and reaffirming their reputation as trusted partners. With the capability of connecting millions of devices across an extensive region or several regions, an IoT network facilitates the possibility of managing multiple devices efficiently and easily irrespective of their location. However, this is something many businesses have struggled with historically, particularly because of the fact that devices are often prone to damage and malfunction. Telcos can take advantage of the opportunity this presents in terms of the management and monitoring of the lifecycles of connected devices within a company’s IoT network. Providing services such as troubleshooting, repair, restoration and fault-detection of out-dated or malfunctioning devices would unlock a new revenue stream for operators. This ongoing management of multiple connected devices within a business’ IoT network will ensure companies have control, making them more resilient and helping lower operational costs, while also boosting revenue for operators. There is little doubt that there are some extremely viable and profitable opportunities out there for telcos looking to capitalise on the full potential of the IoT. 
However, it's important to remember that these opportunities should be about creating value beyond connectivity. Telcos should be opening up their services to support businesses from all sectors that are looking to utilise IoT connectivity as part of their business models. Then there's the advent of 5G to consider, and the significant impact this is expected to have on the IoT. But while 5G use cases are still being planned, let's just focus on one thing at a time! ### The CLOUD Act | Shaking up the CSP market Julian Box, CEO and Founder of Calligo. It's widely understood that a core pillar of cloud computing is the storage of data throughout various geographical locations. Storing data globally improves resilience, access and performance. But with the recent introduction of various international, national and industry-specific privacy regulations that require data to be kept within prescribed borders, this nonchalant global approach to data storage is at odds with data residency requirements. It also creates problems when data that is potentially relevant to a certain jurisdiction is stored beyond its borders and its legislative powers. The best-known example of this issue is when the US government struggles to access data it deems relevant to national security because it is stored abroad and subject to local data laws, often making it illegal for the CSP to disclose the data. This is the exact situation highlighted in the US government's recent case with Microsoft and attempts to access US-relevant data held by Microsoft in Ireland. The US government's national security concerns therefore directly conflicted with the main principles of cloud computing. Clearly, this was untenable. What is the CLOUD Act? The Clarifying Lawful Overseas Use of Data Act, or more simply the CLOUD Act, was signed into US law in March this year in order to clarify this conflict. It has shaken up the industry, prompting a range of opinions from the cloud, privacy and security communities alike. The Act itself makes it easier for the US government to access information that it believes is of interest for national security when held by US-headquartered CSPs in countries other than the US, without infringing the privacy rights of the individual. Of course, the CLOUD Act also provides mechanisms for Microsoft et al. to quash warrants where disclosure is contrary to local laws. Such a situation is the likely outcome of any use of the CLOUD Act, in which case the US government's main recourse would then be to pursue the diplomatic processes of either a local warrant or a Mutual Legal Assistance Treaty (MLAT) with the country in question. These are currently slow and typically politically sensitive processes, which is why one wasn't relied upon in the Microsoft Ireland case. But the recognised effect of the CLOUD Act is that it will encourage the US to simplify the processes around the international diplomatic agreements between the US and other countries – the so-called Mutual Legal Assistance Treaties, or MLATs – in order to make accessing data abroad easier. So, in total, the CLOUD Act is the trigger for the US Government to accelerate the process of simplifying access to data deemed pertinent to its investigations held by any US-headquartered cloud provider, anywhere in the world. What is the practical impact on the CSP industry? This question is best answered by looking at the CLOUD Act alongside other privacy regulations, not just in isolation.
Privacy has become such a vital issue that many nations, governments and industry sectors are now devising their own rules and legislation. For example, India has just presented its first data privacy bill, following in the footsteps of the GDPR itself, and more nations are anticipated to follow later in the year. However, this will inevitably create an enormous variety of legislative frameworks for businesses and cloud providers to conform to - regardless of how they may conflict with each other. As an example, a global health insurer needs to be simultaneously aware of the CLOUD Act, Canada's PIPEDA, Japan's APPI, HIPAA, the GDPR and the data privacy requirements of any number of financial services regulators. And this is before more nations such as India pass new legislation, or the EU's ePrivacy Regulation finally comes into force. And it doesn't stop there. As we rely more on technology in society, even more legislation will come into play. We have already seen this through the introduction of GDPR, a response to the inadequacy of previous privacy regulations in this increasingly digital age. The use of IoT and smart cities will undoubtedly trigger questions, concerns and more legislation. So, to return to the question, what does this mean for CSPs? We've seen that businesses are becoming increasingly conscious of and concerned with where and how their data is stored, as well as the legislative obligations and ramifications of using cloud services. As such, they will inevitably seek advice directly from cloud suppliers on how best to store data to ensure both privacy and practicality. But few CSPs are able to provide this advice. Few are sufficiently informed to offer guidance on how best to address national and industry-specific laws and regulations. Their teams remain focused solely on the technical aspects of performance and resilience, seeing privacy legislation more as an issue than as an opportunity to offer added value through impartial, authoritative and accurate advice. The reality is that a modern business requires a whole new type of CSP - one that has its finger on the pulse in an intricate industry that is constantly updating. Privacy is simply too explosive an issue not to address properly. As the business world, therefore, continues to transform, and the cloud industry along with it, the CSPs that are able to adapt to offer guidance on the genuinely most privacy-appropriate cloud service will be the ones that shape the industry's future. ### Startups | Can they disrupt cloud oligopolies? It's official: Oracle is finally submitting to the cloud, and it's no wonder why. Gartner recently revealed that the public cloud services market is projected to grow 21.4 percent this year to $186.4 billion, up from $153.5 billion in 2017. Can startups take some of this market share? The war for cloud dominance is raging; Microsoft and Oracle's challenge to Amazon on a recent $10 billion cloud contract is evidence of this. The focus, for obvious reasons, has been on these large incumbent players, but this is misplaced. Until recently, Oracle offered a closed property management system (PMS) for hotels, a one-stop-shop doomed to inefficiency. However, with nascent cloud companies snapping at its heels, Oracle has finally grasped that customers now demand ongoing service innovation and technical expertise, albeit years after those companies arrived on the scene.
In the cloud industry, and the tech sector more generally, it is very often the startup challenges that force the large corporates to evolve.   David, meet Goliath Oligopolies are nothing new, however, in recent years technology has emerged as the tool that can enable the contemporary David to face down Goliath. The US entertainment industry is a prime example of this, with six Hollywood studios currently controlling over 85 percent of the film industry. This market dominance is potentially set to be exacerbated should Disney win the bidding war for 21st Century Fox assets, a contest which has been raging since last December. Late last month Disney agreed to a deal that valued Fox’s entertainment assets at a staggering $71.3 billion, a far cry from the paltry $4.05 billion it used to acquire LucasFilm in 2012. Disney’s ardent pursuit is fuelled by the growing success of alternative online streaming providers disrupting the space. Netflix has disrupted the space to such an extent that the streaming service was actually valued more than Disney in May, albeit for a brief moment. Oracle has now mustered the courage to move to the cloud just as the likes of Disney and Time Warner have succumbed to the power of online streaming; this transition though is fraught with dangers for Goliath and ripe with possibilities for David.   The opportunity for startups Oracle has been on the wrong side of history for a while now, and once we consider the sheer size of Oracle’s infrastructure, the problem is compounded. To put it simply, they cannot re-orientate their offering in the same way nimble startups can. Major transitions such as these provide a rare opportunity for startups to really go after legacy customers; just look at what the likes of fintech players such as Revolut and Monzo have done to disrupt the banking industry. We need only look to the former, who recently announced that 2 million users have signed up to its platform, just months after it announced it hit the 1 million user milestone in November 2017. And what about Goliath? Well, many TSB customers are still unable to access their accounts, presenting yet another opportunity to aggressively seize market share. Startups in the cloud space must do the same, lest face a market filled with only a few quasi-oligopolies. In some aspects, market consolidation has already begun; major vendors such as Amazon, IBM and Microsoft have cemented their positions, and late last year Oracle hit the $200bn valuation mark. However, the world of software has changed; no longer can you simply provide a solid stand-alone product. Customers expect more, they expect on-going service with access to technical expertise, when and where they want it. Take property management software (PMS), an entire sector that has had to be dragged kicking and screaming into the 21st century. Until recently hoteliers had to rely on closed one-stop-shop solutions whose fundamentals remained unchanged for years. This Luddite attitude towards the hospitality sector has allowed disruptive players to thrust forward in an industry traditionally dominated by giants. Startups must be wary of resting on their laurels - the sheer size of these quasi-oligopolies means they have access to alternative growth and development options - such as M&A - which companies early in their life cycle simply do not. Just look at the slew of acquisitions Bob Iger, CEO of Disney, has made over the last decade. 
The likes of Oracle, Amazon or IBM can do the same - but they have one key weakness: the inability to swiftly incorporate a new infrastructure without suffering immense transition pains. The opportunity is there; startups in the hospitality sector, and the cloud space more generally, simply need to take notice and make the leap. ### Is Digital Transformation Clouding our Security Judgment? In the last year, we have seen security attacks tip over into terabyte territory. It's not something we expected to see so quickly or frequently, but the future is here. It's not just private companies like retailers or banks that are experiencing attacks of this scale either; we've seen the public sector succumb and we've seen the infrastructure providers so many companies rely on fall foul. Most notable was the attack on the French hosting provider OVH, which was one of the first to see a terabyte attack. Professionals looking at security trends weren't surprised to see terabytes recorded because they had been expecting them; it was the proliferation of bots behind the attacks that caused most alarm – bots had gone mainstream. Back in 2016, we saw a shift in denial of service attacks when brute force started to become the hacker's tactic of choice. Advanced persistent denial of service attacks using aggressive automated bots started to grow in popularity. Fast forward a year and bots have become more sophisticated, exploiting the digital transformation strategies many companies are embarking on. Take retail, for example. 40% of retailers say that three-quarters of all their internet traffic is from bots. For them, bots are integral to operations, used for everything from online support chatbots, through couponing, to running the price checks that support 'price promise' marketing strategies. But as with so many new technologies, especially ones that depend on applications and the cloud, progress in the use of bots has widened the attack surface and introduced more vulnerabilities that hackers can leverage. For instance, web scraping attacks plague retailers. Bots are proactively used to steal intellectual property, undercut prices and hold mass inventory in limbo. We've seen an interesting trend for using bots to create new black markets. 'Sneaker bots' are the most common, designed to buy up the full inventory of exclusive, limited-edition trainers so that they can then be sold through unauthorised channels at a markup. It's not limited to sportswear either. Airline and concert tickets are highly prized. Yet even though retailers know it happens, 40% are unable to say if they are targets because they don't have the ability to identify bad bots. That's worrying when you consider that earlier this year, Gartner predicted cloud services would grow by 18% as more companies embarked on transformation programmes. Companies can't afford to be complacent when it comes to using the cloud for business transactions. Just look at the fallout from Uber's breach. Although a giant when it comes to developing apps that challenge the status quo and improve life, Uber has shown that you can be made to look pretty small by the hacking community. Its disruptive business model, though well loved by the investor community, has disrupted lives and stirred up social unrest among the driver community, with riots from Paris to India to show for it. It's made itself a target, a symbol of unethical business for 'ethical hackers' to go after. What's most interesting though is that it wasn't a very sophisticated hack.
It really didn't take much to get the data. It would seem that the team developing the apps was sloppy in its processes and that security, which should be integral to the design of the app, was an afterthought. It's well known that the process of developing apps, DevOps, is often pressured by the need to get to market quickly and ahead of the competition, and to give venture capital investors a quick return on their investment. But it's opening up huge risk – in fact, research shows that half of DevOps initiatives don't include security in design. There's no doubt that the use of applications to improve agility is why the cloud is so important to everyone from major corporations through to the latest fintech startup. There's no better way to scale and disrupt industries. However, application development – in particular the continuous delivery models used in around half of all app development – comes at a price: people's privacy. Senior IT directors even admit the risk, with two-thirds believing the approach is flawed when it comes to security, underscored by the fact that half of all apps are not developed with security in mind. Security is well and truly an afterthought. Plus, the APIs used by the apps aren't using encryption. Only 48% of companies inspect the data that is being transferred between APIs, and 51% don't do any security audits or analyse potential security vulnerabilities before launch. Given companies see an attack on their network most days, it seems madness that this situation should occur. However, the need to get to market with innovative apps that will improve transaction times and ultimately attract more loyal customers appears to be deemed more important. It's likely that the GDPR will change attitudes, and frankly, it must. Breaches won't be tolerated and the ICO will no doubt be looking to make an example of anyone who attempts to sweep things under the carpet. But even then there are question marks over readiness – especially from across the Atlantic, where less than 20% of companies trading in Europe think they will be compliant. However, that's six months away. More urgent for retailers is the need to hit trading targets this Christmas. While billions may have been spent on Black Friday, there are no guarantees that the trend for sales will continue. It's still possible consumers have had their blowout and won't spend in the next few weeks. CIOs know they are in the firing line should transactions fail – every sale will be critical. There is absolutely no room for outages. What's more, the investment in bots to make the customer's experience easier and faster will really be tested. Get it wrong and consumers will vote with their feet. That all places a huge amount of pressure on the telco sector to ensure online sites are always available, apps work every time and money goes through the tills. But it's a battle it appears to be losing, as 51% of retailers don't think they can keep applications up and running 100% of the time. And to add further complexity, busy periods really test security measures, with 30% of retailers admitting they can't be sure they can secure sensitive data during peak trading times. Once again, business strategy is taking precedence over security. So how can businesses get it right?
The most obvious answer is to include security in the design of applications. But, of course, designs have to be tested, and no matter how imperative it is to get to market, companies must ensure the apps are secure, data is encrypted and it can be chased. These facets of development have to be part of the business case. Not only that, but companies need to look ahead and consider how the technologies they use for very positive means can be exploited by malicious hackers. In understanding how bots, for instance, can be used against them, they have a chance to develop security processes and policies that protect people and their data around the clock. After all, as retailers fight for survival and the ICO watches on intently, every second will count. ### IoT Security Risks: What could go wrong? The rate of 'things' being attached to the internet is increasing and everyone has heard the term 'Internet of Things' (IoT), but are they aware of the security risks involved? It means that devices and appliances that traditionally operated quite happily in their own way and in their own place in the world are now being forced to connect to the internet, even if it seems like a daft idea. Often when I buy technology, it's because of the thought and the initial magic of owning something new rather than the operation and continued use of it. There's now a fridge with a camera inside of it, in case you're out at the supermarket and need to know how much butter you've got left. Presumably, the light comes on when you use it. Maybe the camera is there to answer that age-old question? The internet-connected fridge seems like it was created because of the idea that everything needs to be connected to the internet. How much use an internet-connected fridge might get after the initial novelty has worn off, I'm not sure, but what is clear is that there is now the potential for someone to use it to access data and other connected devices within your house. Long after the novelty of an internet-connected fridge has worn off, it will continue to cool your food and still be connected to the internet. This illustrates the side effect of connecting more devices to the internet: the risks to security increase exponentially. The more devices, the more attack points, which means more management overhead to patch them and validate compliance. In a business context, those devices are potentially outside of the organisation's perimeter. A lot of these could be remote devices connected to the internet using non-traditional business methods. They could be anything from a sensor monitoring water flow in a river, to the camera on a motorway, or even an internet-connected toaster. The age of apps and cloud services has enabled everyday people to manage and adjust all manner of life-encroaching technology wherever they are, and all through the public internet. More and more businesses are adapting to this style of working and connecting systems for data gathering, monitoring and alerting or remote command and control purposes. Every device increases the attack surface and the risk of a cybersecurity incident. IoT devices could become points of entry onto the network or be the target of a cyber-attack, therefore understanding their normal posture and monitoring their activity becomes even more important.
This can be challenging, as devices are often spread across locations, campuses or even geographies, and supporting security intelligence at scale can be difficult without the right tools. Celerity have a rich portfolio of enterprise-grade, highly scalable solutions that can help to meet these challenges. Using our Citadel Security Threat Detect capability can help reveal security threats and provide insight into cyber security-related activity across the IT landscape, including IoT. This means you can take control of the risks presented by the growth of devices within your network and illuminate the blind spots that they create. ### Business Information | Time To Finally Get Serious David Jones at Nuxeo says the death of Enterprise Content Management and the move to Content Services really has to be dealt with by anyone serious about exploiting business information. In January 2017, Gartner said Enterprise Content Management (ECM) had officially expired and that Content Services is the future. Managing Business Information Well, as we know, analysts make a living making bold predictions – and sometimes, pretty big claims that are well ahead of what the real market is doing. What are we to make of this pronouncement – and what should business information managers be doing about it? Global information management group AIIM asked its members about their situation regarding information and how it is being managed at their organisations. These guys should know, and the answers are well worth hearing. What stands out is that businesses really do have a problem when it comes to multiple information silos. According to the AIIM probe, over a quarter of organisations (27.4%) have ten or more in-house systems. For historical and technical reasons, this is understandable. However, it's often a real challenge when you try to connect these systems in a manner that makes it easy to find information residing within different systems while avoiding duplication of content and effort. And as a result, potentially highly valuable business information ends up residing in multiple places: e-mails, file and folder structures, cloud file shares, social messaging tools and so on. And spreadsheets also form part of the problem – 61% of the information manager respondents in 2018 say they still put business information in spreadsheets! This may feel like a natural way of organising things, but it sets up reasons to fail when done at scale. Five or 10 years ago, the sort of information that organisations were managing was scanned documents, Word and Excel files, PDFs, and e-mails. Today, it's a far broader set of content types – images, videos, and social media conversations are now considered critical enterprise information assets. Try to manage that type of unstructured content in ageing legacy systems that were built before those content types were invented – and you have a problem. The final issue is the size of the content assets in question. A PDF file could be five or 10MB in size, but users are now expecting to store gigabytes' worth of video and images – and high resolution equals big files. If your infrastructure isn't set up to manage that sort of load, then you can also expect many performance issues. At the same time, it's increasingly difficult for users to find what they need. Three quarters (76%) of the information manager respondents AIIM interviewed said that they can't find the right business information in a timely manner.
That's 76% of ‘digital’ organisations that can't access the information they need to work effectively. That translates to major impacts on business efficiency and speed of execution. And if you can’t find the right information, how on earth are you going to be able to share it with colleagues, customers, clients.   Time to dance the 'modernisation two-step’   How do we go about solving that problem? First, if you don't modernise your systems, you can't connect them – which means there’s little point in starting that digital transformation process the CEO is so passionate about. You’d simply end up stopping even more systems from talking to each other and ultimately exacerbate the problem. Two, any solution you come up with has to be mobile and cloud-enabled. It will also need to provide the ability to introduce things like AI, analytics, and blockchain when your organisation needs to start exploiting them. One useful idea is something I call the ‘modernisation two-step’. Think, connecting and consolidating in tandem. On the connection side, work to install a layer to connect all your different organisational systems in order to create an information hub. You don’t move content into it, but this type of central facility approach gives you the ability to know what is stored where – what metadata and content are held, and of course the ability to access and manage it. The next step is to look at consolidating the technology the business uses. After all, each of these systems costs money to run – and a legacy ECM system may be costing tens of thousands, or in larger organisations, millions, just to operate, along with its specialised staff and dedicated servers. By consolidating this technology where appropriate, organisations often realise that all that remains is the data and the content – and with the ability to move that data and content into a modern platform at their own pace, they can sunset those legacy systems when they’re ready. The result is a more streamlined set of information management tools, doing exactly the same tasks as before, but in a much more scalable, flexible, and powerful way. So what to make of the mantra that says ECM is dead? The reality is that information management is changing because the needs of enterprises are changing, and what modern businesses need to do with their information on a day-to-day basis is changing. The way that everybody in the organisation interacts with information is also changing. The good news is that the technology that we need to solve the business problems of today is here – today. We just need to use it. ### Juniper Networks Fosters Adoption of Network Automation Juniper Networks (NYSE: JNPR), an industry leader in automated, scalable and secure networks, today announced a breadth of new offerings to accelerate the industry’s adoption of automation practices. With this latest announcement, Juniper is unveiling a collaborative community that includes tools, labs, libraries and an exchange of innovative applications to accelerate automation adoption for companies and individuals. Juniper EngNet features access to virtual devices that run in the cloud, complete with documentation, along with a full suite of tools to move from manual to automated operations. To complement Juniper EngNet, Juniper designed NRE Labs, a web-based, on-demand automation curriculum focused on the emerging network reliability engineering (NRE) role. 
Juniper is also announcing additional professional services to help enterprises break ground on automated testing, a key thrust for any enterprise looking to leverage automation to do more in its network. Despite nearly two decades of industry effort to drive network automation, network operations are still dominated by manual operations issued through the command line interface (CLI). Compare this to a Gartner report that forecasts that "by 2022, only 25 percent of network operations teams will use the CLI as their primary interface, which is down from more than 70 percent today."[1] Migrating an industry from 70 percent CLI to 70 percent automation requires industry-wide rethinking of the approach to network automation. If automation is to become a mainstream practice, the industry needs to do more than create new products. The success—or failure—of automation in networking will depend as much on the people as on the technology. This means that the education and upskilling of existing network engineering teams are critical in modernising the network to keep pace with advances in computer storage, applications, cloud and multicloud. With the offerings released today, Juniper is making the individuals behind network operations—not the vendors and their products—the centrepiece of this industry evolution. Juniper EngNet and supporting tools and services are about moving individuals forward in both their skillsets and careers. By providing tools, virtual resources and a collaborative community for free, Juniper is putting customers first. Through hands-on practice and experimentation, these new offerings aim to ease and accelerate the adoption of automation across network operations, processes and workflows. This will give NetOps teams an industry advantage by advancing their automation skills, practices, and processes so that automation doesn't remain a mostly specialised practice that only a select few can leverage. ### Digital Transformation | Christian Rule, Flynet Christian Rule, Director of Business Transformation at Flynet, talks to Joshua Osborn about all things Digital Transformation. He covers the three phases of Digital Transformation: Digitisation, Transformation and Reinvention. What stage are you currently facing? What stage should you be facing? Christian takes an in-depth look at the market around us and how others are tackling this journey. Digital Transformation has been a process that's occurred for years, right from evolution version 1 in the 60s up to evolution version 3 in the 00s and today. If you think your company doesn't need to take note, or that it can keep up with competitors and suppliers by doing nothing, you may be headed down a dark path. Join Christian and his insightful, interesting discussion around Digital Transformation and be ahead of the game! Find out more about Flynet here. ### Life with Patti Boulaye debuting 5th October With mental health awareness a hot topic in the media currently, Patti Boulaye OBE has made it her mission to get behind the celebrity scene and break down those stigma barriers. Patti has recently finished recording her new show Life with Patti Boulaye at Disruptive Live's studios in London, a show that takes a good look behind the scenes of celebrities' lives. Celebrities are invited to come on the show and share their real-life stories, successes and failures, and to humanise the celebrity roles they hold.
Behind the glitz, glamour, posh hotels and limos lie normal people, humans just like you and I, with feelings, families and the same worries that we all have. This show is a way to reach out and remind us all that the celebrity name tag doesn't come with an 'I'm invincible' tag too. Everyone has their ups and downs in life, and as key role models for the younger community, celebrities can feel immense pressure to 'get it right' so as not to affect their followers. Guests on the show have ranged from royalty to film producers and everyone in between. The show will be aired on Friday 5th October at 1pm on Disruptive.Live. Patti says: "This show comes from my concern about the alarming state of mental health and the suicide rate among our young people - let's tackle it together, now!" https://player.vimeo.com/video/292326436?html5=1&title=1&byline=0&portrait=0&autoplay=0 About the Series Award-winning singer Patti Boulaye OBE continues to have a distinguished career spanning four decades – and she's made a lot of friends along the way. In this exclusive series, Patti Boulaye will look at the other side of success with people from all walks of life who have lived and breathed it. This series is both inspirational and a cautionary tale for those looking for success. Interviewing the people Patti has met along the road, the series will reveal the highs (and lows) of a successful career and lifestyle. ### CaSaaStrophe | Communication during a crisis CaSaaStrophe (noun): a sudden software service failure causing widespread damage. When caSaaStrophe strikes, you need to act fast. The way you handle customer communication during a crisis can mean the difference between a devastating loss of trust and retained customer loyalty. Do you have processes in place in case of a critical technical failure? If not, even brief outages can see your customers heading out to competitors. Niamh Reed from global SaaS provider Parker Software explains how to communicate with customers during, and after, a caSaaStrophe. Why it's important to communicate Customer communication when your software is down isn't about justification. Nor is it about trying to claw back control. (Although your prompt, upfront acknowledgement is a step in the right direction.) Rather, it's about honesty, accountability, and expectation management. You need to be transparent with the customers who've trusted you with their money, and you need to respond to the stress that a temporary loss of service has caused them. Your software going down isn't only angering customers because they can't access your program. They're also stressed by the loss of work, time and resources — and the costs that implies. They're annoyed at having backed you, or at not being able to serve their own customers. In a worst-case scenario, customers might also be fearing mass data loss, losing their job or even their own customers. During caSaaStrophe, then, you need to put yourself in the customer's shoes. Your communication should be quick and open, and it should help appease alarm. You need to show that you are acting competently, reassure customers that data is safe (or that every measure is being taken to keep it safe), and show that you understand their concerns. Start of the caSaaStrophe When a software meltdown hits, be proactive. Don't wait for each customer to come to you; contact them first and let them know.
When your software is diagnosed as (temporarily) broken, something as simple as informing customers that you are aware of the problem, and are actively looking to fix it, will placate them. You need to communicate early and often. The sooner you can confirm to customers there’s a problem on your end and you’re fixing it, the less time customers are left stewing on the issues they’re experiencing. Don’t just let customers know there is a problem. Point them to places they will be able to find updates and information about the outage, such as social media or a dedicated webpage. Proactive communication demonstrates that you are present and there to support your customers. If you don’t act quickly, or pretend nothing is wrong, customers will be left feeling stranded, unappreciated and frustrated.   Keep calm and carry on updating For the rest of the time your software is down, you’ll need to answer customer queries, manage expectations, and post updates. And you need to do so as consistently and regularly as possible. Social media platforms like Twitter are a great place to post updates and information during a software failure. However, when using social media, make sure you use a dedicated service, feedback or support account, not your main brand account. This way, you keep your updates directed at customers and those affected by the outage.  You’re not interrupting your normal content flow and broadcasting your software failure to leads or followers that it doesn’t affect. You also need to be able to respond quickly to customers. That means keeping your service accessible and readily available to customers around the world. A live chat channel is particularly useful as it gives your customers a convenient way to reach out and get immediate replies. Stress from a software outage often manifests as impatience, so the instant replies from a live chat channel are a useful way to help calm angry customers.   Manage customer expectations When responding to customers, you need to manage their expectations to avoid them feeling further disappointment. That means that any estimations on when the issue will be resolved should be realistic, not a hopeful guess. Unrealistic hope will hurt, not help. Avoiding further disappointment means sticking to the promises you make during the crisis. It can help to let customers know when you’ll next post an update, but make sure you stick to what you say. Be consistent across communication channels. This means that if the software will be mended by the end of the day on Twitter, it’s going to be fixed by the end of the day according to your website, your live chat sessions, and your emails, too. Mixed messages will confuse customers and reduce trust.   After the storm The storm has passed, your software is up and running again. Hooray. But you still aren’t out of the communication woods just yet. After a major caSaaStrophe, no matter how amazing your communication was during the event, there’s still going to be some fallout.   Large software failures damage the trust and confidence that your customers have in your product, and by extension, you. So, when the dust has settled, you need to start picking up the pieces and repairing any damage. This is achieved by following up with customers. If they complain — or mention the event at all — on social media, answer them and ask if there’s anything else they want you to do to help. 
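Picking up the consistency point above: one simple way to stop messages drifting apart between Twitter, your website, live chat and email is to generate every channel's update from a single source of truth. The sketch below is purely illustrative; the update fields and the channel-posting functions are hypothetical placeholders rather than any real tool's API.

```python
# Illustrative sketch: publish one incident update to every channel from a single
# source of truth, so the status page, Twitter and email can never disagree.
from dataclasses import dataclass

@dataclass
class IncidentUpdate:
    summary: str        # what is broken, in plain language
    eta: str            # a realistic estimate, not a hopeful guess
    next_update: str    # when customers can expect to hear from you again

def render(update: IncidentUpdate) -> str:
    return (f"{update.summary} Estimated fix: {update.eta}. "
            f"Next update: {update.next_update}.")

# Hypothetical publishers; in practice these would call your status page,
# support Twitter account and mailing tooling.
def post_to_status_page(text: str) -> None: print("[status page]", text)
def post_to_support_twitter(text: str) -> None: print("[support twitter]", text)
def email_affected_customers(text: str) -> None: print("[email]", text)

def broadcast(update: IncidentUpdate) -> None:
    text = render(update)   # rendered once, sent everywhere
    for publish in (post_to_status_page, post_to_support_twitter, email_affected_customers):
        publish(text)

broadcast(IncidentUpdate(
    summary="We are investigating degraded performance on the reporting module.",
    eta="18:00 UTC today",
    next_update="16:00 UTC",
))
```

However it is implemented, the principle is the same: one agreed message, posted everywhere, with a stated time for the next update that you then stick to.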
You should also release a post-mortem — a report that outlines what happened, why, and what preventative steps are being taken to address the root causes. In the aftermath, demonstrate to customers that you recognise that you messed up, that you've learned from the incident, and that you're taking preventative measures. This acceptance of responsibility encourages forgiveness, and puts you on your way to restoring any lost trust. Every cloud Life isn't perfect; things go wrong. But your software going down doesn't have to set you back. The way we handle crises is telling of our character. In business, responding to a crisis competently shows your mettle. Of course, unplanned downtime is never what you want, and it's never what the customer wants. But the silver lining of an outage is that it gives you an opportunity to demonstrate to your customers that you're dependable, even when caSaaStrophe strikes. ### Data centre | A vital part of the internet When most people think of information stored in 'the cloud' today, they'll rarely put much thought into what that means - as if the information is magically stored in the air around them. Out of sight and out of mind, few will consider the fact that it's really stored in a physical data centre somewhere: a huge centralised hub of information locked away hundreds or thousands of miles away. These data centres form a vital part of the internet's infrastructure, and of the content and services we consume more of every day. It's staggering to think, then, that something so important is so unfit for purpose. Yet that's the situation we are faced with today: data centres are an unsustainable solution that is costly and damages the environment. Worse still, three-quarters of that unsustainable market is controlled by just four technology companies (Amazon, Microsoft, IBM and Google). New and legacy cloud providers have increasingly shown they recognise the problem - and that's where things get even stranger. The latest initiative to address it saw Microsoft launch Project Natick - an attempt to make data centres greener by submerging them under water. It's an expensive experiment that involved sinking a white cylinder packed with servers just off the coast of Scotland, with the hope that the water will keep the machines cool without the expensive systems used in conventional warehouse farms. While it seems churlish to criticise a company for trying to 'do the right thing', it's hard to believe this costly response is a sustainable option for the future. Especially when there might be a smarter solution that lies closer to home. Home as a data centre In fact, DADI believes the solution is inside the home. Instead of short-term solutions to the data centre problem, it's time to start thinking about the wealth of unused data storage in homes and offices across the world - the billions of computers, games consoles, TVs and other connected devices that spend most of their lives in standby mode. Connected together, these devices could power a faster, fairer cloud network that is decentralised. That means no central point of failure, no server farms that harm the environment, and no 'solutions' such as sinking expensive equipment into the ocean. In the same way that solar panels are now seen as a viable way for homes to become energy self-sufficient, or even income-generating, it's not hard to imagine a future where our devices at home and in the office are used to power a more environmentally friendly cloud network. Then there's speed.
The UK’s average broadband speeds of 46Mbps downstream and 6Mbps upstream might sound low in comparison with a tier-one data centre, but by linking devices together in a decentralised network, the potential is enormous. And of course, this aggregated computing power would leverage devices already manufactured and in use – rather than building additional devices to power vast new server farms. Suddenly, the idea of our homes becoming a data centre doesn’t seem far-fetched at all. The added benefits for businesses are also clear. With this collective approach, legacy vendor lock-in will disappear because a quicker, safer, cheaper source of cloud services is available - with minimal technology barriers to adoption. In recent years, interest and investment in ‘smart home’ technologies have grown spectacularly. The industry could be worth $53bn by 2022, built on our fascination with connected homes that make modern life easier - from refrigerators that automatically order your shopping to doorbells that allow you to answer remotely. These may be neat improvements and conveniences, but the ‘home as a data centre’ concept represents a more radical shift in the way we run and govern the internet in years to come. You could argue it offers us a way back from today’s lack of trust and transparency in the technology giants that dominate our digital lives, towards an internet that truly lives up to its own founding principles. Making the net work for you For both sides of that equation, the next question becomes “what’s the incentive?” Any change is a challenge - even for a new network that is faster, fairer and safer. Crucially, then, this new way of thinking offers benefits beyond power: it’s also about profit. At DADI, for example, we recently launched our own decentralised cloud network that will give 85% of network revenues directly back to those who contribute computing power. Those who give the most will also earn the most back - enough to cover their broadband bill, for example, or the option to buy a wireless speaker that pays for itself over time. The remaining 15% is split between network maintenance costs and funding the DADI Foundation - a new independent, charitable body that will back causes which use technology to support democracy. I believe this decentralised route is the smart solution and the right thing to do, resulting in a faster, fairer, safer network powered and majority-owned by the public. Such a dramatic change will take time, but momentum is growing with the first few networks already live - and with so much to gain, hopefully, we’ll be looking at a very different cloud landscape this time next year. ### Endpoints | Do they mean a million risks? You may be surprised by how many endpoints are currently in your organisation. Companies with over a million connected endpoints are increasingly common, and as we add more network-connected devices, that number will only grow. Today, you may have Apple macOS or iOS devices, Android, proprietary IoT operating systems and various flavours of Linux to contend with. Devices will end up connected to your systems via other devices you may not have considered: switches and routers, process control equipment and dedicated consumer-facing devices such as Point of Sale terminals. Segmenting endpoint risk into clear categories means grouping endpoints, software and users by considering what they have in common. For each of these, you can then define an effective mitigation strategy.   
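To make that grouping idea concrete, here is a minimal sketch in Python. The inventory fields and the category rules are purely illustrative assumptions rather than a prescribed schema; in practice you would feed it whatever attributes your management or EDR tooling already exports.

```python
from collections import defaultdict

# Hypothetical endpoint inventory - in practice this would come from your
# management or EDR tooling rather than being hard-coded.
inventory = [
    {"host": "laptop-042", "os": "macOS", "owner": "corporate", "privileged_user": True},
    {"host": "till-007", "os": "proprietary-iot", "owner": "corporate", "privileged_user": False},
    {"host": "byod-113", "os": "Android", "owner": "personal", "privileged_user": False},
]

def risk_category(device: dict) -> str:
    """Assign a coarse risk tier based on what devices have in common."""
    if device["os"] == "proprietary-iot":
        return "segregate"        # keep off the vulnerable internal network
    if device["owner"] == "personal":
        return "restrict"         # limit access and enforce encryption checks
    if device["privileged_user"]:
        return "monitor-closely"  # privileged accounts can pivot into infrastructure
    return "standard"

groups = defaultdict(list)
for device in inventory:
    groups[risk_category(device)].append(device["host"])

for category, hosts in sorted(groups.items()):
    print(f"{category}: {', '.join(hosts)}")
```

The value is less in the code than in forcing an explicit, reviewable rule for every category, so that each group can then be mapped to one of the mitigation strategies discussed below.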
Users and their interaction with endpoints To your average endpoint user, security is an inconvenience rather than an asset. If the user has direct control over their personal endpoint device, they may well deliberately disable security protection measures or defer vital updates to avoid disruption. Even if the endpoint is company-owned and subject to good management practices, privileged user accounts are often necessary for the user to perform their job function (e.g. software development or maintenance). These privileged accounts may mean that malware inadvertently introduced by the user can access other sensitive information on the endpoint and then ‘pivot’ to attack your corporate infrastructure itself.   Ensure you can monitor effectively People assume that a million devices lead to a million risks, which explains the rise in the popularity of endpoint detection and remediation (EDR) tools, which aim to simplify the task of protecting and managing endpoints at scale. It is vital to own a tool that can monitor your environment in every way. This includes not just definitive, up-to-date pictures of hardware and software inventory, but real-time information on network traffic flows and potential anomalies. Without good management tools, you have no baseline on which to build. You may think that you already have good endpoint detection tools. But do those tools track per-user installed software? Older EDR solutions often feature architectural designs pre-dating the mobile and cloud revolutions, thus leaving you with blind spots of potentially significant unpatched vulnerabilities.   Use virtualisation creatively If a physical endpoint can’t be fully locked down, consider using virtualised guests within endpoints. With the overhead of virtualisation minimal on modern hardware, even laptops or other relatively modest devices can now easily support these environments. You can then lock down the virtualised guest much more easily since you’re dealing with standardised images and can restrict the device owner’s rights within the virtualised environment without constraining their control of the physical device itself. In the event of endpoint compromise, you need only roll back to a previous checkpoint of the virtualised image to completely remove all traces of compromise, which proves easier than cleaning up physical devices themselves.   Encrypt all the things By ensuring that the virtualised environment is fully encrypted, you also ensure that sensitive corporate data is protected. Your system management tools should allow you to continuously scan for unencrypted storage and ensure that risky endpoints (e.g. mobile devices) are always encrypted. If the device is lost or stolen, the sensitive information it contains will remain protected.   Partition your network The best approach is to rigorously partition IoT devices within corporate networks so that they are never connected directly to a vulnerable internal network. Again, good system management tools can constantly scan looking for things like MAC address prefixes associated with IoT manufacturers, allowing you to monitor your network proactively. Obviously, this is not practical with IoT devices intended to act as key networking components, such as switches and routers. You need to ensure you can monitor and manage these devices and update their firmware without disruption, so you can respond proactively when a device vulnerability is reported.   What about AI? 
While good security practices in AI are still evolving, it’s prudent to consider the consequences of implementing any kind of AI-based subsystem within the organisation. Ensure that you partition this system off so that its inputs and outputs are tightly constrained and that there are procedures to monitor its operation and determine any anomalies that might indicate an issue. If the AI system starts malfunctioning or is attacked, what are your procedures to route around it, so that critical business processes aren’t stalled?   Conclusion Managing large numbers of endpoints may seem a daunting prospect, but with today’s increasingly sophisticated real-time endpoint detection and remediation tools, you can easily reduce a million risks to a handful, then control and mitigate each of these risks effectively. A recent study by Enterprise Management Associates (EMA) took an in-depth look at the key decision criteria for organisations looking to invest in EDR solutions. The report compares two market-leading EDR offerings, 1E Tachyon and Tanium Core, indicating the qualitative differences between an older (Tanium) and a more modern (Tachyon) EDR solution. EMA recommends carefully identifying the business IT requirements for security, administration, and compliance that exist today as well as those expected to be introduced in future. This will ensure the foundational introduction of a solution that will continuously meet organisational needs without periodic disruptions to production environments, unexpected financial costs, and increased management complexities. ### Industrial disruption | A new generation of innovators Disruption can be found in nearly every industry. Entrepreneurs achieve disruption by developing start-ups that transform the way business is done. Airbnb has changed the world of travel forever. Today, this company enjoys a value of more than $31 billion. Lyft comes in at a cool $7.5 billion, after turning the taxi service industry on its head. FinTech firms like Stripe have created mobile payment solutions that are used by leading financial services companies like Visa. Even specialised services such as computer security are dominated by disruptors like Synack. Every established organisation is at risk of losing its competitive edge. If an entrepreneur hasn't already identified a gap in the marketplace, it is only a matter of time. Successful businesses are not waiting for the next wave of start-ups to disrupt their industry. They are disrupting themselves internally, so they can lead by example, and they are acting to adapt to the Everything as a Service (XaaS) economy. Proven Methods for Creating Internal Disruption The primary issue that traditional companies face is an over-reliance on practices that were successful in the past. Instead of embracing changes that come in the form of new digital and cloud technology, improved access to business insights, and adjusted workforce expectations, they continue to depend on a methodology that is now or will soon be obsolete. Internal disruption can alter the current script, creating a business model shift that is able to compete in today's marketplace. There are three main principles that drive innovation, whether it occurs inside or outside of the organisation. First, disruptors are those that are willing to step outside of their comfort zone. 
From inside the company, that means an openness to exploring alternative opportunities and new methods of getting the job done - even if they deviate from the core business operations that have driven strategy since the day the doors opened. This requires leaders who are willing to challenge the organisation to explore innovation, and they must follow through when it comes to funding and championing projects. A culture of internal disruption starts at the top when leadership encourages staff members to take risks. The second principle flows directly from the first. Employees will only push beyond current practice when they know it is safe to take risks. A large percentage of startups fail in their first year because the initial idea doesn't quite make the mark. Internal disruptors and innovators face the same risk that their idea, when put into practice, will be unsuccessful. Leaders who can sincerely assure their team that responsible risk will be rewarded - even if it fails - can be confident that internal disruptors will bring their ideas forward. Finally, the third driving principle for internal disruption is a singular focus on customer needs and expectations. In a world where the Everything as a Service (XaaS) economy has become standard, disruption means identifying and implementing relationship-building business practices, so that customers can rely on having their needs met whether from inside or outside of the organisation. Areas of Focus for the 21st Century When disruption comes from outside entrepreneurs, businesses quickly fall victim to obsolescence. An ability to disrupt the industry from within the organisation not only removes the risk of obsolescence - it also puts innovative companies ahead of the competition. These are five critical areas of focus for self-disruptors: Everything as a Service (XaaS) Major financial institutions and FinTech startups have created a model for XaaS partnerships. Innovators from outside these organisations have developed a vast array of tools to make it easier for consumers to manage their finances. At first, traditional banks attempted to match the level of service offered by disruptive startups. They created in-house lines of service applications, but they simply couldn't compete. Now, they have changed their strategy from competition to collaboration, with partnerships that give entrepreneurs access to a large customer base. The financial institutions benefit from keeping their customers happy. Internal disruptors must focus their strategy on improving the customer experience long-term, as successful companies are those who can offer XaaS solutions. Digital and Cloud Technology New technology means new opportunities for providing innovative products and services. However, many organisations are unable to pursue self-disruption, because legacy business systems can't support the changes. As in-house innovators develop disruptive designs, CIOs must consider whether it is time to replace IT infrastructure. Business Insights Advanced analytics and the use of big data are not new to the business world. However, advances in technology have dramatically impacted the amount of insight and the level of detail available. Companies with a self-disruptive strategy are drilling down to learn more about customer behaviour, and they are developing the exact solutions needed to fill gaps in their customers' experiences. 
Artificial Intelligence Until recently, AI made for an entertaining movie plot, but there was little or no realistic application in the workplace. Now, AI offers impressive enhancements to an array of business processes, from managing repetitive tasks to meeting customer service demands. This change is driven by advancements in machine learning, which makes it possible for technology to grow smarter as it is exposed to data. Disruptors are exploring opportunities for leveraging AI to develop improved solutions to business problems. Workforce Expectations Millennials are now the largest working generation, and the unemployment rate is shockingly low. The combination of these two factors makes it more important than ever to consider workforce expectations when decisions are made. Millennials are the first generation to grow up with digital technology, and they are comfortable with managing their work and personal lives through mobile devices. Retaining top talent requires management to adapt to employee expectations on when and how work gets done. ### Cloud security | Data protection is essential If recent data breaches have taught us anything, it’s that you can never be too cautious when it comes to adopting the right cybersecurity strategies for your business. For example, high profile Amazon Web Services (AWS) customers, World Wrestling Entertainment (WWE) and Verizon exposed the personal information of millions of customers by accidentally misconfiguring their Amazon S3 cloud repositories. Despite providers like AWS providing ample information about the best practices for cloud security, the volume of AWS-related data leaks continues to grow. The unfortunate truth is that data breaches are here to stay and could come from any point across a distributed organisation. Adding to the complexity is the growth of cloud technology and ensuring that the appropriate cloud security is in place without compromising user experience. With cloud IP traffic set to rise threefold in the next five years, it is vital that businesses have the provisioning in place to protect the huge volume of data flowing through their network. However, for companies that haven’t yet moved to the cloud, there are questions, challenges and a great deal of uncertainty. Specifically, it’s often an issue of control that holds them back; as moving to cloud provision, and provision via a third party, can feel like losing control. Needless to say, the stakes have been increased by the GDPR, as you’re now required to have an understanding of where your data is, perform service requests such as the right to erasure, conduct timely breach notifications, manage cross-border data transfers and provide evidence that steps have been made to secure personal data. As a result, an added level of trust for those companies outsourcing the storage and processing of sensitive data they hold on individuals is required. To put it simply, if a serious breach occurs at your third-party data processor, for instance, businesses must trust that they will be informed promptly and that the third party will work with them to fix the problem and protect their customers. With cloud now the norm, transparency is crucial and the onus is on businesses to ask the tough questions and ensure they have the solutions in place to protect their employees, customers and their brand.   
Moving to the cloud   While the adoption of cloud services has increased over the past few years, many organisations are still unwilling to make the move to the cloud due to security and compliance concerns. There can be an issue of knowing where your important data lives (this is key if you plan on transferring it elsewhere). In their current on-premise model, companies might not know exactly where data is and how it should be classified. What is the information handling procedure for any given document, image, or program code repository? Which database holds your current customer dataset? Then, once you’ve located and gathered all the data to be transferred, you need to consider how to move all of it securely – and be clear on whose responsibility it is to ensure the cloud storage destination is already secure.   Where the responsibility lies   Cloud service providers have a definitive role to play here. Of course, they’re there to provide data inventory tools and services to help fingerprint and hunt for data in customers’ networks, and to encrypt data for secure transfer (you haven’t gone to all the hard work of locating your data only to send it over the internet without encrypting it) – but there’s important work to be done before this too. Cloud service providers have a responsibility to be transparent with their customers. When a business is going through a procurement process and requesting information to help them figure out which provider to choose, the obligation is on the cloud provider to be honest about what they can and can’t do. The stakes are simply too high to behave in any other way. This is where the trust relationship begins: the first job of a security specialist is to help potential customers make informed decisions about how to keep their data safe (and within the bounds of GDPR). At Forcepoint, we’ve set up a cloud trust programme within our business which secures data and reinforces customer confidence in cloud security.  It is aimed at ensuring our company is being assessed for all of the most valuable certificates and accreditations available from industry bodies. In addition, it allows our customers to check that we have earned those certificates. We consider this programme essential to our customers, to ensure they meet the requirements of the GDPR.   Cloud Security for your Future   For most companies, the cloud can and does have real business benefits – increasing efficiency, scalability and driving growth across markets. While GDPR is a response to the increasing prevalence and significance of sensitive data to the functioning of our businesses, security and trust remain the most important aspects of all. The cyber threat is continuous and constantly evolving and, as a result, organisations must stay ahead of potential attacks, while also remaining compliant with GDPR and other security regulations. So, it is imperative that IT managers get their cloud security in order, as failing to do so could result in devastating consequences in the not-so-distant future. ### A Guide to Data Privacy and Translation The more data people generate and the more they engage online, the bigger the issue of data privacy and data protection becomes. The global nature of our data also throws up issues of its own, in terms of cross-border data control and the language that privacy laws are written in. 
In this post, we’ll take a look at the particular challenges that come with the translation of documents relating to privacy laws, along with what can be done to overcome those challenges. Data privacy in translation What is data protection and privacy? Data protection and data privacy are two different but related matters. Data protection relates to the protection of data from unauthorised access and use. Data privacy, meanwhile, relates more to who should be authorised to access the data and how they should do so. It’s an issue around the globe. More than 100 countries and territories have comprehensive data protection and privacy laws in place in order to protect individuals’ data that is held by private and public bodies. A further 40 areas have pending data privacy legislation or initiatives. How important is data privacy? Broadly speaking, it’s incredibly important, as it protects individuals’ rights in an increasingly data-hungry world. Admittedly, not every individual cares about the privacy of their data. Many will happily give away permission to an unknown company to access almost their entire Facebook account in exchange for finding out which Game of Thrones character they are most like. However, this only makes it more important to ensure that data privacy is appropriately enshrined in law and that the meaning of data privacy is not lost in translation. Cultural differences play a significant role here. ‘Privacy’ means different things to different cultures, as well as to different people within those cultures. As such, privacy laws have to cover a broad range of understanding, as well as differing levels of interest. They also have to translate into practical guidance that policy makers and those responsible for enforcing the law can usefully comprehend. In terms of the translation of data privacy documents, these differing views of what privacy means can be something of a challenge. The translator needs to remain faithful to the content of the original document. However, what if that content doesn’t resonate with the target audience, getting lost in the translation process? This is where localisation comes in. Localisation and internationalisation as tools for globalisation Localisation services encompass translation services but take them to the next level. They mould and shape the translated content to ensure that it fits with the cultural norms of the target audience. When it comes to specialist areas such as marketing translation, the value of localisation work is easy to see. Indeed, many companies believe that localisation, internationalisation and globalisation are cornerstones of their international success. Put simply, localisation can make campaigns more effective by ensuring that they are more engaging to those at the receiving end – that is, those with money to spend. That’s not to say that localisation is appropriate for every type of translation. When it comes to the wording of privacy laws, for example, a literal and direct translation that stays true to the original is essential. However, a localised translation may well be helpful when it comes to translating the guidance associated with implementing those legal requirements and the intent behind their original formation. Overcoming the challenges of data privacy translation work Translation tasks that relate to data privacy and data protection certainly bring their own unique challenges, but the same could be said of any translation job, from legal translation to technical translation. 
Thankfully, there are certain guidelines that translation clients can adhere to in order to overcome these challenges. First and foremost, it’s essential to use a translator with relevant experience. Translation is not a ‘one size fits all’ activity. Different translators offer different expertise, not just in terms of the languages that they work with but also their sector-specific experience. A translator with previous data privacy experience will be well placed to overcome the challenges relating to this type of work. It’s also important to use a native speaker of the target language. Doing so can result in a better quality translation. It can also help to flag any areas of the work that might rub cultural sensitivities up the wrong way, highlighting the need for specialist localisation services. Careful consideration of the task at hand can also help to overcome particular data privacy-related challenges. This is an area where translation companies and their clients can work in partnership to assess the likely impact of a translated document. It might be that the translation process identifies the need not just for localisation but for supplementary resources to accompany it, such as checklists or guidance documents. Data privacy translation is, ultimately, about attention to detail. The nature of privacy documentation tends to command this, so any translator working on such documents needs to be incredibly detail-focused in their approach. They also need to have a keen understanding of the context of the translation, both in the original language/country and the target one. Again, this can result in a higher quality translation that better serves the client and their document’s intended audience. The wide implications of data protection and data privacy laws mean that it is essential to interpret them correctly, both in their original language and in translation. They exist in order to protect individuals and test cases in the courts can often be deeply personal. As such, exacting standards need to be applied by all those working on data privacy translation tasks. It is only through the rigorous application of such keen focus and attention to detail that the rights of all those that the laws are intended to protect will be guaranteed. Using an established translation company with experienced, talented translators is, therefore, key to ensuring success and to overcoming the particular challenges associated with working with privacy laws in translation. ### Cloud technology to modernise and secure business payments No one escapes digitisation. With new regulations being introduced constantly and Faster Payments becoming the norm for consumers and businesses alike, the drive towards the modernisation of payments using cloud technology is a reality for many organisations. According to Bottomline Technologies’ 2018 UK Business Payments Barometer, 56% of businesses have adopted real-time or faster payments for regular business payments, an increase of 12% from last year. Add in transformational shifts like Open Banking, PSD2 and new API-driven solutions, and it’s clear that there’s a tremendous appetite for organisations to ready themselves for the digital economy. So how does one prepare to take advantage? By capitalising on new technology and making sure that the basic building blocks of success are in place, namely cloud technology and security. 
Let’s take a look at how these two fundamental elements impact an organisation’s ability to achieve ultimate success in today’s payments landscape. Stepping into the cloud The UK Payments Barometer research shows a 33% year on year increase in cloud migration by businesses surveyed. It’s encouraging to see that organisations of all sizes are steadily moving away from installed software, with some of the outdated perceptions about cloud technology, particularly around security, being ditched in favour of the numerous benefits cloud has to offer. Cloud technology gives unlimited compute and storage power. It can pull together real-time visibility, providing insight into an organisation’s data and its financial position, which makes for improved decision making. The cloud offers flexibility, enabling companies to focus on their business growth without the burden of having to always manage 24x7 IT capacity planning and disaster recovery processes. These are worries that distract from core business, commoditised items that are best managed by outsourced cloud solution providers. When businesses make the switch to cloud-based technology, they can leave behind expensive upgrades, unused backups and license fees – all of which ultimately drives significant cost savings. Additionally, the move to the cloud enables businesses to save time by eliminating system downtime and removes the need for on-premise software updates, infrastructure, maintenance and regression testing. These benefits could not be more evident than when they are applied to the world of business payments. If not already on the agenda, competitive businesses need to consider how cloud-based solutions can better simplify and automate their payment processes. For example, some cloud-based solutions offer finance teams the ability to securely upload, validate, approve, submit and manage payments and collections from anywhere at any time on a variety of devices with a focus on following operational policy and procedure rather than infrastructure uptime and support. Suitable for organisations of all sizes and complexity, the cloud is fully scalable and grows with businesses while also offering significant security and compliance benefits as well. For instance, cloud payment solutions allow new services to be switched on simply as they roll out, and organisations only pay for the volume of transactions processed. Such solutions offer comprehensive reporting and audit trails of all transactions processed and files submitted, providing unparalleled flexibility, control and visibility into businesses. Securing your payments Organisations are in the vortex of a changing compliance and regulatory landscape, which is increasingly soaking up budgets year on year. Simultaneously, fraud concerns continue to rise. The UK Business Payment Barometer reveals a 169% increase in concern about internal payment fraud since 2016. The two most significant growth areas are the internal exploitation of systems and controls (13% in 2016, 35% in 2018) and external exploitation of internal staff, up 15% in two years. While little can be done to entirely eradicate payment security threats that plague companies, steps can be taken to mitigate risk. Smart organisations are increasingly choosing to transfer and share legal risk and compliance with outsourced partners such as bureaus or Fintechs. This is a convenient and cost-effective way to eliminate the overhead and burden of dealing with growing risk and compliance issues. 
This may explain why the research found that fraud prevention topped the list as one of the biggest drivers for companies choosing to move to the cloud. The digital economy has already been embraced by individual consumers and is moving quickly into the business world. Regulators are blessing the progression toward cloud technology, making competition even fiercer. The good news is that the ability to modernise, apply new industry updates and drive efficiencies is well within the reach of every organisation given the new technology consumption models now on the market. Given the volume of regulatory changes in the payments industry and the rapid progression of cloud technology, all organisations need a cloud consumption strategy to compete in the future. It’s just a matter of taking the first step in your digitisation journey by choosing the right long-term partnership to learn and grow. ### The ROI of collaborative robots The UK is facing a labour shortage that is causing anxiety throughout the manufacturing industry. Currently, the UK is heavily reliant on migrant labour and, with Brexit on the way, businesses need a backup plan to ensure that they can continue to operate. Automation, and especially cobots, can fill the inevitable labour gap as they can be easily programmed to perform repetitive but necessary tasks and can be deployed quickly and safely alongside existing workers. It can address immediate labour shortages and help with retention by enabling employees to take on more valuable and fulfilling activities. This is especially true in the case of collaborative robots (cobots), which are designed to work alongside humans rather than replace them. However, when a business looks to automate with robots, many different aspects must be considered in order to build a solid business case and understand the true value of robotic automation. In order to analyse the Return on Investment (ROI), many businesses look only at the payback period, which is traditionally calculated by taking the cost of the robot and then dividing it by the monthly salary of the worker (a simple worked example follows below). However, if this is your sole strategy for developing an ROI breakdown, then you will be excluding multiple other factors that add to the total value and impact that robotic automation will have on your business. ROI shouldn’t be based purely on finances. Other elements, including employee benefits, increased efficiency and stability of products, need to be assessed alongside the total robot investment. Of course, the first step of calculating ROI will always be to start with the initial cost and the payback period, but then a business must look beyond the short-term and consider the long-term benefits that they will receive from investing in this type of automation. Looking at the Big Picture Investment To calculate your total robot investment, you must include the initial investment but then also factor in additional system accessories and integration charges that may be incurred. For example, employing a cobot to undertake a pick and place job would require additional investment in the right model of gripper and a sensor to ensure precision, as well as the cobot arm. Businesses must think about which type of robot best suits their existing working environment and can be integrated with minimal disruption to reduce the amount of downtime. 
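To see why the payback period alone understates the case, here is a minimal worked sketch in Python. Every figure in it is an illustrative assumption, not a benchmark; substitute your own costs and savings.

```python
# Illustrative payback vs. ROI comparison for a cobot deployment.
# All figures are made-up assumptions used only to show the arithmetic.
robot_cost = 30_000            # cobot arm + gripper + sensor + integration (GBP)
monthly_wage = 2_000           # fully loaded monthly cost of the task it takes over

# Simple payback period: cost divided by the wage it offsets.
payback_months = robot_cost / monthly_wage
print(f"Payback period: {payback_months:.1f} months")          # 15.0 months

# A broader ROI view also counts other recurring benefits and costs.
monthly_scrap_savings = 400        # less wasted raw material from consistent output
monthly_extra_output_margin = 600  # margin on additional throughput
monthly_maintenance = 150          # software upgrades and servicing

monthly_net_benefit = (monthly_wage + monthly_scrap_savings
                       + monthly_extra_output_margin - monthly_maintenance)
breakeven_months = robot_cost / monthly_net_benefit
print(f"Break-even with wider benefits: {breakeven_months:.1f} months")  # ~10.5 months

# First-year return as a percentage of the initial investment.
first_year_roi = (monthly_net_benefit * 12 - robot_cost) / robot_cost * 100
print(f"First-year ROI: {first_year_roi:.0f}%")                 # ~14%
```

Even with these toy numbers, counting scrap reduction and extra throughput alongside wages shortens the break-even point by around a third and shows a positive return within the first year, which a wage-only payback calculation would miss.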
As well as this, future maintenance costs, software upgrades, feature add-ons and whether or not a particular robot can be easily redeployed within the business to meet changing needs of production must also be considered. For example, industrial robots require large amounts of space as they have to be locked behind safety cages, their heavy and fast movements making them unsafe for human interaction. It’s for this reason that many SMEs are reluctant to invest in traditional industrial robots as they require a large amount of shop floor space that small businesses cannot afford to spare. As well as this, industrial robots are complex to install and have a large upfront cost. Deploying a collaborative robot, however, means the integration costs will be minimal. Cobots are safe to work around and in the majority of cases have no need for safety guarding, ensuring that they have a small physical footprint. Their lightweight frame means that they can easily be moved around the factory site on a trolley and redeployed to carry out different tasks. The intuitive, simple touch-screen interface ensures that the cobot can be programmed for secondary tasks in no time at all. Due to the ease of programming and simple online courses, existing employees can be upskilled to become the resident cobot programmer for small adaptations to their operation and use. Protecting and Upskilling Your Existing Workforce Automating dangerous and repetitive tasks with cobots can also improve the health, wellbeing and safety of your existing employees. Occupational health and safety issues (OH&S) in relation to manual handling processes are all too common, particularly musculoskeletal disorders (MSDs). With the integration of collaborative robots, these problems can be significantly reduced because workers can be freed up to work on less laborious and repetitive tasks and moved onto higher value roles within a business. Automation will keep employees safe, particularly if you adopt collaborative robots, which are ergonomic and have the ability to work alongside humans safely and effectively. Therefore, by automating a particular step in your production process you could save huge amounts on social costs, health and medical benefits, as well as insurance-related expenses thanks to the introduction of safer work practices. Collaborative robot technology can be set up easily, programmed, reprogrammed and integrated into a business by virtually anyone; in Denmark, cobot programming is even taught in schools. This means that businesses don’t have to employ specially trained engineers or technicians to integrate and maintain the robot, so will save money and time on recruitment and training processes. However, this shouldn’t scare people into thinking robots are going to steal their jobs. When the computer replaced the typewriter, it revolutionised the workplace but it didn’t negate the need for receptionists, assistants or other office workers. Instead, the definition of the worker's responsibilities evolved, allowing them to dedicate more brainpower to other more complex tasks. This is the same for the robot revolution – it will redefine the way we work but there will always be a need for employees on the factory floor. People must remember that in the long run, automation will allow humans to have the capacity to deal with more complex jobs and acquire new skills that require human brainpower and will support their future growth and the growth of the business. 
For example, if a robot is employed within a factory to screw four exact holes into the same size sheet of plastic continuously all day, this actually benefits the human worker in multiple ways. It saves human workers from performing a continuous and boring task, from developing repetitive strain injuries and frees up their time for a more complex job, perhaps in administration, management or quality assurance. All while the robot benefits the business by increasing productivity, efficiency and quality. There is also good evidence that automation does not have a negative impact on employment rates. One example is the Danish company Trelleborg Sealing Solutions, which installed 42 cobots over 18 months. With increased productivity and quality, order numbers surged to the extent that 50 new jobs were created in logistics, quality control, marketing and the back office. Increased Efficiency and Stable Production Automation technology also increases efficiency and the quality of products. For example, if a robot worked on a repetitive task for you 24 hours a day, the turnaround time for your products to go from your factory to your customer’s door would be dramatically shortened. As well as this, collaborative robot technology ensures consistency and a high-quality product output, which significantly reduces the amount of raw materials wasted. As a result, customer loyalty and retention will increase due to your products’ consistently high quality. It’s worth spending the time considering all these factors so you can get a real understanding of the short-term and long-term value robot automation will bring your business. It’s a detailed process because this is an investment in the growth, development, sustainability and competitiveness of your business. However, when building your business case, make sure you look past the payback period to calculate the ROI. Your payback period will always look longer than the true return suggests, because the payback calculation is used only to offset wages, for example, whereas the ROI calculation takes into consideration the overall tangible and intangible business benefits and impact, both short and long term. ### Breaking barriers to trust using Robotic Process Automation The highly challenging global business environment has become even more competitive recently. Technological developments and digital disruptors continue to shape the landscape, against a backdrop of political and economic uncertainty. Concurrently, customer expectations are growing, as they demand high quality products and services delivered to a timescale that fits their needs. This places significant pressure on businesses to meet these expectations, especially as customer trust and loyalty is at an all-time low. To address this challenge, business leaders must look to how they can leverage the ‘soft’ skills within their organisation to provide a better service. In order to best utilise these soft skills, the existing workforce should be supported with technology, such as robotic process automation (RPA), which can be programmed to carry out labour-intensive and time-consuming tasks. However, care must be taken in the implementation phase to ensure that RPA is utilised to its best advantage and cultural changes are addressed early in the process. Tasks fit for a robot RPA is a cutting-edge technology that allows software ‘robots’ to mimic interactions that a human might have with an existing system. 
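As a rough illustration of what such a software ‘robot’ does under the hood, the sketch below re-keys records from one system into another. The CSV layout, field names and REST endpoint are hypothetical assumptions for the example, not any particular RPA product's API.

```python
# A minimal sketch of the kind of task RPA typically automates: re-keying
# records exported from one system into another. The file name and endpoint
# below are illustrative assumptions only.
import csv
import json
from urllib import request

SOURCE_FILE = "exported_orders.csv"                   # hypothetical export from system A
TARGET_URL = "https://example.internal/api/orders"    # hypothetical endpoint on system B

def submit(record: dict) -> None:
    """Post one record to the target system, as a human operator would re-type it."""
    body = json.dumps(record).encode("utf-8")
    req = request.Request(TARGET_URL, data=body,
                          headers={"Content-Type": "application/json"})
    with request.urlopen(req) as resp:
        if resp.status >= 300:
            raise RuntimeError(f"Unexpected response {resp.status} for {record}")

with open(SOURCE_FILE, newline="") as f:
    for row in csv.DictReader(f):
        # Basic validation before hand-off; anything suspect is raised as an
        # 'exception' for a human to review, mirroring common RPA practice.
        if not row.get("order_id"):
            print(f"Exception for review: {row}")
            continue
        submit(row)
```

Commercial RPA platforms wrap this pattern in visual designers, schedulers and audit trails, but the underlying idea is the same: repeat a well-defined human interaction reliably, and hand anything ambiguous back to a person as an exception.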
The majority of RPA tooling is distinct from machine learning and artificial intelligence, which involves a system actively learning from experience, to improve its performance at a task over time. Common applications for RPA include data entry, data manipulation, system integration and the creation of digital documents. These tasks are invaluable given the volume of unstructured and structured data generated by businesses, which can be analysed to directly inform decision-making processes. The new integrated workforce Highly repetitive and time-consuming tasks can be prone to error when performed by a human, but robots will complete the tasks consistently and efficiently, eliminating the potential for mistakes. By assigning these tasks to robots, employees can focus on those high value tasks that previously may have been deemed out of scope, due to lack of resource capacity. Also, in times of high demand, RPA can be scaled, without consideration of factors like sickness or holiday, making it an additional agile workforce that can be recruited and activated as needed. Using RPA to build customer trust Integrating RPA into a team alters the role of human employees and dynamics within the team. Once repetitive tasks have been reassigned, employees are able to focus on the creative, analytical and critical thinking that robots remain incapable of. This involves reviewing processes to devise new ways to meet client expectations efficiently and even identifying new areas where value can be added. RPA allows space for new thinking that might not be possible at times when teams are focused on delivering challenging projects. Customers are much less forgiving of mistakes than ever before and will not hesitate to change supplier if the service they receive is not consistent in quality or timescale. In addition, customers are quite vocal in their displeasure, which can have negative effects on company reputation. With the new ‘space’ created in the workplace, employees can place a greater emphasis on tasks encompassing emotional intelligence, which is a vital facet for adding value and creating longevity in client relationships. And importantly, RPA can provide more time for upskilling, to ensure that businesses are able to meet and exceed their clients’ needs into the future. RPA in action With increasing demands and seasonal spikes, a top telecoms provider was faced with challenges in meeting service level agreements (SLAs). The business identified an inefficient manual process, which involved the transferring of key data from multiple systems to its destination. This required valuable full-time effort, which kept these resources away from customer queries, introduced operational risk and slowed down the availability of valuable data. Using RPA, systems that previously did not interact were able to work together seamlessly. By eliminating the manual tasks, employees could be reallocated to work on more meaningful tasks – including dedicating more time to client facing activities.  Addressing internal trust issues Although RPA provides existing employees the time to build trusting relationships with clients, successfully implementing RPA requires a considered approach. Organisations must be sensitive to the fact that robots represent a change in culture, not just capability. Anxiety about facing unfair competition and the elimination of roles may resonate with employees, particularly given the recent media focus on how automation may replace jobs. 
Equally, there might be fears that managing RPA systems could exacerbate, rather than relieve, workloads. The cultural effects must be managed through a strategy that starts at the top. It’s vital to secure support from the leadership team, including IT and operational leads. Consider appointing a head of RPA, who acts as a consistent figure throughout the process to ensure a successful transition. Equally, an RPA Centre of Excellence can act as a vital resource to provide an organisation with a multitude of benefits and services, enabling the rapid and successful delivery of a digital workforce. Taking the leap forwards Implementing RPA presents the perfect opportunity to evaluate and improve on processes, both on a per project basis and across the business. This ensures that the transition delivers significant lasting value. Ongoing education is a key component of the process where employees are taught how to work with the digital workforce and how to respond when RPA sends an exception. By providing these skills early on, businesses can ensure that employees feel prepared for the changes ahead. RPA can play a critical role in keeping up with client demands, allowing employees to nurture those delicate client relationships and build trust. There is also potential to improve the quality and consistency of outputs. Change must be implemented with thought and care. But get it right, and you can keep up with clients’ demands with a strong, integrated workforce even as these demands continue to evolve. ### Download, Cloud or SaaS? Demystifying ITSM solutions Businesses considering investing in an IT Service Management (ITSM) solution are faced with a plethora of choices in the marketplace. Before exploring vendors, the decision can be made easier by first considering the type of platform that will work best for the organisation’s needs. ITSM software solutions come in different guises: SaaS-based applications, cloud-hosted applications, or downloadable software you install and configure on-premises or in your cloud. There is no definitive answer to which platform is best. This is entirely dependent on the needs and concerns of the organisation. Let’s look at the options available and how they deliver based on these needs. Downloadable Software ITSM Applications – key considerations Flexibility: The key advantage of choosing a downloadable ITSM software application is the high level of customisation and flexibility it offers. As it is hosted on company servers or in an organisation’s own cloud environment, customisation can be made more easily. For example, changing the capacity of the system on demand, improving or customising security features, building custom features and sharing data with other areas using APIs. However, such flexibility is not always a welcome feature. It can lead to organisations ‘over-engineering’ their solution and creating complicated systems that require almost constant management and investment. Talking of investment, when choosing a solution that allows for customisation, organisations must factor the cost of this in as part of their purchase planning. It is lack of investment in personnel and initiatives that is the number-one reason for failed implementations of software ITSM solutions. Cost: While SaaS or cloud-hosted solutions are often billed on a monthly subscription model, downloadable solutions can be purchased as a perpetual license with annual maintenance and support or as a subscription (a simple cost comparison is sketched below). 
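The sketch below makes that trade-off concrete. All of the figures are illustrative assumptions chosen only to show the shape of the comparison; plug in real quotes before drawing any conclusions.

```python
# Illustrative licence-cost comparison for a downloadable ITSM tool bought as a
# perpetual licence versus a subscription. All figures are made-up assumptions.
perpetual_licence = 40_000      # one-off upfront cost (GBP)
annual_maintenance = 8_000      # support and maintenance on the perpetual licence
monthly_subscription = 1_500    # subscription cost per month

def perpetual_total(years: int) -> int:
    """Total spend under the perpetual-licence model after a number of years."""
    return perpetual_licence + annual_maintenance * years

def subscription_total(years: int) -> int:
    """Total spend under the subscription model after a number of years."""
    return monthly_subscription * 12 * years

for years in (1, 3, 5):
    print(f"{years} year(s): perpetual {perpetual_total(years):,} "
          f"vs subscription {subscription_total(years):,}")
# With these assumptions the subscription is cheaper at first (18,000 vs 48,000
# after one year) but the perpetual licence pulls ahead over five years
# (80,000 vs 90,000).
```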
This model can see the upfront cost being significantly higher but can result in net savings in the long term. Cloud-hosted ITSM Applications – key considerations Cloud-based business solutions are increasingly seen as the favoured choice for many organisations, with IDC analysts predicting this market will grow 30% in the next five years. The reason for this growth? Cloud applications can offer rapid, low-cost solutions that can be quickly implemented and therefore assist IT teams to become more agile in their approach. Cost: Arguably the most immediate benefit of a cloud-based solution is the initial outlay, which is minimal in comparison to a downloadable alternative. The service is purchased using a subscription and therefore allows smaller organisations to get a head start on building an IT help desk that can scale as required. Security: Although the popularity of cloud solutions has rocketed, this has been accompanied by concerns over the security of data held off-premises. Cloud solutions see customers rely on vendor security safeguards. With these concerns first and foremost in customers’ minds, and in media headlines, the key players in cloud services know that they must provide state-of-the-art security. This should provide peace of mind for users, but they should also feel able to scrutinise the vendor’s policies and security guarantees. SaaS-based ITSM Applications – key considerations SaaS is also a massive technology trend which provides some unique benefits for companies. SaaS options provide a one-stop package which sees the provision not only of software, but of the supporting infrastructure, applications and data hosting. Cost: A key benefit of a SaaS-based solution is the savings that can be made from the lack of associated infrastructure required. It also offers similar flexibility to cloud models in a monthly subscription, often formed of a base fee, with additional fees for scaling up users. Partner approach: Choosing a SaaS-based service will spare companies from having to purchase, host and implement software themselves. One of the reasons many small, medium and large organisations choose SaaS-based ITSM services is the benefit of having a dedicated partner to ensure successful implementation and ongoing support. A SaaS partner will take responsibility for data storage and maintenance, releasing updates, optimising help desk configuration to improve performance, and maintaining security. With expert oversight of the entire process and infrastructure, SaaS companies know that the customer experience matters, and they will work with you to make the implementation a success. Flexibility: Companies that go for SaaS-based ITSM solutions may lose some flexibility options versus on-premise hosting. You’ll rely on your SaaS partner to generate and deliver updates, ensure application up-time, and build out any customisation you request, though likely for an additional fee if it’s uniquely for your business. When considering a SaaS solution, make sure you explore whether it will allow you to build your own APIs and port data out of the application. Also, be clear as to who owns data, whether you can get a copy of it when requested, and how the data will be used. Closing thoughts Choosing an ITSM solution is not a science – it is an art that depends on myriad factors including the size and scalability of the organisation, the shape of your IT team and infrastructure and the amount of customisation your organisation requires. 
For those who want a cheap, functional option that is self-managed, cloud options can be ideal. However, if cash and experience are limited, a SaaS partner can offer unparalleled integration expertise and ongoing support. Alternatively, if what you are looking for is flexibility and customisation above all, then a software option can provide what you need if you are willing to make a higher initial outlay. Making sure you fully understand your organisation’s strengths, where support will be needed, plans for scale and the complexity of customisation needs is vital to making the right choice. ### Jason Tooley - Building Partnerships and Alliances Compare the Cloud filmed Jason Tooley. Jason talks about building customer relationships and delivering value to customers. ### Collaborative CIOs Can Use The Cloud To Empower And Engage The Enterprise Much of our personal life resides in the cloud – our dating profiles, email trails, cherished photographs and even our life savings. For most people, this is now the norm. But businesses want a part of the action too – migrating to the cloud means data can be accessed from anywhere, faster and with more flexibility. Since the initial, overarching concept of the ‘cloud’ was developed in the 1960s, the technology has come a long way and is virtually unrecognisable compared to where it began.  We’re currently at a point where business leaders don’t necessarily care how technology is delivered. Instead, they want results and they want them fast – ideally at a lower cost and with more malleability than their legacy system. Improved business agility and speed of implementation are key benefits that should be taken into account when a firm looks to move to the cloud. What’s the problem? Cloud services are proven to give a competitive edge – from increasing efficiency to greater security – so why haven’t more organisations deployed a cloud-first approach? In short, most people are scared of change. Implementing new technology, and then making use of it, is one of the biggest obstacles for executives, as employees often resist the change. So, what is the point in a technically perfect deployment if employees don’t take advantage of the system? This is where a collaborative, consensus-building CIO can lay the groundwork for success, well before a deployment is set to go live. Buy-in from the top down When looking to deploy new technologies, getting buy-in from the C-suite is no longer a nicety but a necessity. It is imperative for CIOs to have their support if they are to sell the long-term vision of transitioning to the cloud to their teams. CIOs must adopt a collaborative approach, paving the way for success by ensuring all employees are on the same page. This will increase uptake and longevity in the long run, and after all, it’s the CIO’s job to get staff to use the system effectively. One of the main misperceptions amongst senior executives is that company data is safer on owned premises. But proximity no longer equates to security – dominating the enterprise landscape, cloud solutions provide an economically sustainable option. Additionally, the need to comply with the General Data Protection Regulation (GDPR) is encouraging some companies to move to the cloud sooner rather than later. Compliance is a concern, particularly in the financial sector, and this is driving cloud adoption. Once common cloud misconceptions have been addressed, the senior team need to get buy-in on the vision from the rest of the business. 
It is only at this point that you can effectively choose what technology is best – this is an opportunity to fundamentally rethink the way business is done. CIOs have the ability to lead the conversation away from specific, individual technologies, towards shaping and fulfilling the organisation’s wider digital agenda. This allows them to add value, both today and in the future. Engage, collaborate, train As with any technological deployment – particularly when it affects the whole company – CIOs need to gather feedback and input from all business areas that will be using the cloud, from finance to product marketing. Communicating clearly ensures everyone feels involved and has the opportunity to raise concerns and questions if needed.  The same educational, inclusive, and collaborative approach should be used across all jobs at all levels; however, there are extra sensitivities concerning IT personnel. A move to the cloud can make IT staff worry about what their role will look like moving forwards, as well as their job security. The IT team need to actively help with the transition to cloud, rather than be bystanders in the process – working centrally with IT will ensure strong collaboration and a successful project in the long-term. Training is also important, not only for understanding the incoming cloud system, but for modernising existing applications and guaranteeing they run efficiently. A ‘lift-and-shift’ approach won’t work when it comes to legacy systems vs. cloud. And it’s vital to emphasise to IT staff that because business users are going through the same transformation, there are likely to be new opportunities to collaborate. New places will emerge where IT skills can be used to drive the business forward that don’t exist in the traditional ‘stand up and power up servers’ world of the past. Facing the future Technology is always changing and innovation is constantly on the horizon. Companies that have successfully adopted cloud-first systems are now using agile infrastructures to evolve the business, faster than we’ve previously seen. CIOs who implement the best business tools, whilst clearly communicating to overcome internal resistance, will be on track to not only bring their organisation success, but to also have a long, meaningful and fruitful career. Ultimately, upgrading to the cloud empowers people at all levels of the company—including IT—to accomplish more than they ever thought possible, and cloud is the heartbeat that should be pumping at the core of any modern enterprise. ### How Fast Is The Modern Office? We live in an age of instant communication. We can host a meeting with just a few clicks, complete a Google search in 0.5 seconds, and send and receive an email almost instantly. But truly how fast is the modern office? Is it any quicker than 10 or even 20 years ago? In the business world, the phrase “time is money” could not be more true, but the extent to which organisations are using this phrase to drive purchase decisions or influence how teams are built is still up for debate. It is widely accepted that efficiency and productivity are both key to a profitable and successful office environment and culture. However, according to research by Vouchercloud, the average worker in the UK is only productive for three in every eight hours. A whole host of factors can influence an employee’s productivity, only some of which organisations can control. 
In fact, how people work has been a subject of research for psychologists and the scientific community for years, with many organisations investing considerable time and effort into improving their understanding of how productive teams work. Google, for example, has identified ‘psychological safety’ as the key ingredient of successful and productive teams. Research from Google’s Project Aristotle revealed that the best teams are those where all members feel safe to contribute without being judged by their peers. Another factor contributing to office and team productivity comes in the form of a younger generation beginning to dominate the workplace. Millennials – the first generation that has grown up with mobile phones, social media and email – today make up the greatest proportion of the workforce. These individuals are technology driven and arguably work entirely differently to their older (and to an extent younger) colleagues, meaning that making the modern office as fast and productive as possible has become an even more important priority. For example, millennials are more collaborative and take a more digital-first approach than older generations, so introducing breakout spaces or providing easy access to technology that can simplify manual or paper-heavy processes, such as multi-function printers, can have a dramatic impact on office speed. On the other hand, older employees such as ‘Generation X-ers’ might look for functionality and ease of use. This means that it’s vital for businesses to consider how that category of employees interacts with technology so that devices can work for the users and not the other way around. Navigating various user demands can be done by developing solutions that are simple to use yet customisable. With different generations all in the same workforce, office environments must evolve to meet the productivity issues and obstacles that risk slowing down teams. In addition, digital transformation continues to be a key focus area for businesses, and for good reason. With the number of networked devices expected to grow to 20.6bn by next year, it’s easy to visualise how companies digitalising their workplaces can improve employee efficiency. Moreover, digitising data holds a huge benefit for businesses in that it can help increase visibility and efficiency across the board. In today’s highly competitive market, businesses that embrace digital transformation, and as a result have speed on their side, stand a better chance of staying one step ahead of their rivals. This is especially important from a customer perspective, as unnecessary delays due to productivity challenges can have a dramatic impact on experience. Whether it be hardware or software, a big part of keeping the modern office fast is ensuring that technology is both reliable and performance driven. The effect of this shift in the way the modern office operates is felt most keenly by SMEs, given the cost of driving efficiency with restricted budgets. At Lexmark, we’ve looked to help our customers in this space by providing productive, reliable and cost-effective printing and document management solutions, which help reduce paper-based processes, allowing them to free up time in administration and excel in areas where their expertise lies. We are also helping to meet the demand for speed by minimising the time it takes to go from clicking “print” to one of our devices putting a document onto paper. 
In fact, our latest range of enterprise monochrome devices can take a single-page document from click to print in under four seconds, and produce between 60 and 70 pages per minute. No longer must businesses plan or schedule print jobs ahead of time. Increasing speed in the modern office is about providing an environment which encourages employees to spend more time working and less time completing unnecessary tasks or waiting for technology to deliver. Lexmark printers can help companies to streamline their day-to-day activities, offering reliable, secure, user-friendly printing and sharing of information and documents at a speed that reduces downtime and focuses staff on activity that adds value. ### Juniper’s Contrail SD-WAN solution strengthens Vodafone’s enterprise connectivity London, UK – Sept. 25, 2018 – Juniper Networks (NYSE: JNPR), an industry leader in automated, scalable and secure networks, today announced that Vodafone has selected the company as one of its Software-Defined Wide Area Network (SD-WAN) vendors for its enterprise-focused SDN portfolio, referred to as ‘Ready Network.’ Vodafone’s portfolio will comprise flexible, cloud-based services that are designed to underpin digital business at scale, helping enterprises to compete effectively using agile, open and fast infrastructure with zero-touch deployment. Juniper provides a secure, virtualised and dynamic network platform for one of Vodafone’s SD-WAN offerings. The solution delivers better control and cost optimisation by allowing enterprises to manage multiple sites using a combination of WAN connectivity types to ensure application-level quality of experience. Juniper’s solution is built upon universal customer premises equipment (uCPE) that allows multiple virtualised network functions to be delivered from a single device. Vodafone’s enterprise customers will now have a self-care portal providing real-time, end-to-end visibility and analysis of network performance, as well as centralised selection, provisioning and management of their required WAN services. For example, bandwidth-on-demand can be dialled up and down to suit traffic flows across the WAN in response to changing business requirements, and centralised security policies can be adjusted and enforced across the network. Key Highlights A key element of the Juniper Networks Contrail SD-WAN solution provided to Vodafone is its Contrail Service Orchestration, delivering automation, simplicity and openness to the design. Operating within a multi-vendor, multi-tenant environment, it ensures secure implementation, analysis and management of each unique network. Contrail Service Orchestration is used by Vodafone to oversee parts of its SD-WAN services infrastructure for enterprises. It also provides Vodafone enterprise customers with a customised, easy-to-use self-serve portal. Contrail Service Orchestration is designed to unite policies and security controls across hybrid WAN connections, including IPSec, MPLS, broadband Internet and 4G/LTE wireless. The Juniper Networks NFX Series Network Services Platform is deployed at each customer location to enable dynamic, virtualised WAN service creation and to host the virtual next-generation firewall and advanced cloud-based threat protection capability in a single unit. The software-driven NFX Series enables zero-touch, secure service creation, enabling new branches to be brought online easily and cost-effectively with a single keystroke. 
The NFX Series integrates fully with Contrail, delivering the flexibility and openness required by Vodafone’s large enterprise customers. By delivering comprehensive SD-WAN services, Vodafone is extending the competitive reach of its enterprise connectivity portfolio and focusing on future-looking innovations in line with overall digitisation and underlying trends including 5G, industrial IoT, Big Data and cloudification. “We see SD-WAN as an important capability for many of our enterprise customers and believe that Juniper offers a feature-rich, cloud-enabled secure networking uCPE portfolio that will help us to provide those customers with strong, flexible and agile solutions as they move through complex digital transformations.” Vodafone Enterprise Products and Solutions director Justin Shields  “The advent of 5G, the transition of applications and resources to the cloud, and the prevalence of  IoT in the workplace, make the prospect of digital transformation for many enterprises enticing and daunting in equal measure. Juniper’s focus on simplifying the network will enable Vodafone’s enterprise customers to more easily leverage the advantages of multicloud and a growing mobile workforce, without being burdened by the growing complexities of WAN connectivity. This will free their customers to focus on the most important aspects of their business. We’re proud to be a valued technology partner of Vodafone, employing Juniper’s strong commitment to open, secure networking to bring Vodafone’s next-generation services to market.”  Rami Rahim, Chief Executive Officer at Juniper Networks “Enterprise customers are looking for simplicity, functionality and optimised TCO (total cost of ownership) as they come to grips with building flexible cloud capabilities into their infrastructure. Vodafone’s SD-WAN solution provided by Juniper delivers just that, with reduced hardware requirements, innovative use of the cloud and software-defined infrastructure.” Luca Claretti, Global Account Director for Vodafone Group at Juniper Networks ### Untangling home mesh Wi-Fi networks It’s the ultimate twenty-first century gripe – the internet is lightning fast in the front room, but slows to a crawl in the bedroom. Suddenly that tablet you got specifically to watch Netflix in bed doesn’t seem like such a sound investment. Never fear though, mesh networking has arrived to eliminate those Wi-Fi blackspots and ensure you can still stay up too late watching boxsets. However, before you can enjoy a life of streaming and sleep deprivation, it’s probably best to understand what exactly mesh networking is. Essentially, in your average home Wi-Fi setup, you’ll have a router that spreads a wired signal out as waves across your home. Unfortunately, walls, doors and all the other solid things in your house will weaken this signal, meaning those in larger homes or with thicker walls in particular may receive poor coverage. In a mesh network, additional devices called nodes are placed around your house that transmit Wi-Fi signals more efficiently and eliminate areas of poor reception. Think of it like passing notes in the classroom at school. If you’re next to your friend, you can pass the note directly. If you’re sat at the back of class (the router), and your friend is at the front (a device in a blackspot area), the note will have to pass through the hands of several classmates (mesh networking devices) to reach him. 
What’s important in this scenario is that, not wanting to get caught by the teacher, these classmates will pass the note in the most efficient way possible. However, like most things in tech, it’s a little more complex than that. There are three types of mesh networking currently on offer that fit various scenarios. Before you go out and invest in a mesh networking setup, you should consider what you need the network to do and which option fits best. Shared Wireless The most straightforward and affordable of the three types of mesh networking on offer, shared wireless places all connected devices and the connectivity between the various mesh network nodes on the same radio bandwidth. This means there can be slowdowns if multiple devices are running bandwidth-heavy applications like streaming HD video or downloading. Imagine a road: in a shared wireless setup all traffic is moving in the same lanes. This means that, if multiple HGVs happen to be in the lanes at the same time, traffic inevitably slows until they complete their journey. Essentially, shared wireless will eliminate blackspots, but depending on network stresses at a given time, you can’t necessarily guarantee the speeds you need. If you have a realistic view of your network usage, and know it is not demanding, shared wireless is the most cost-effective means of moving into mesh networking to ensure a signal where needed. Dedicated Wireless Dedicated wireless allocates radio channels solely to deal with the communication between the nodes that handle backhaul traffic. Backhaul traffic is the nodes’ communication back to your router, rather than to your connected devices like laptops. Continuing with the road analogy, this is more like having a separate bus lane. The traffic that’s most important to you (i.e. that headed to your devices) won’t face congestion due to the HGV traffic headed between nodes and the router. This will ensure you get coverage and top speeds, however, it comes at a premium, and mesh network devices that handle backhaul traffic like this currently come with a heftier price tag. If speed is paramount, dedicated wireless offers more reliable speeds. With 4K HD streaming becoming more commonplace, it is perhaps a more futureproof option when compared with shared wireless backhaul. Powerline The third and final type of mesh networking complements the Wi-Fi access to devices by using existing power cabling in your home to send and receive the backhaul and inter-node data. Powerline adds a new route, freeing the Wi-Fi to carry more traffic. Keeping the travel theme going, powerline is more akin to adding a toll tunnel. The signal can move quickly between distinct points uninterrupted, bypassing an obstacle but branching back to the road network once the “shortcut” has been taken. A node is plugged into the wall next to your router, and the others where you need strong Wi-Fi signals. Powerline connections generally offer less latency than Wi-Fi. If you need a low-latency connection for your gaming PC for instance, powerline will do the trick. Powerline works best in conjunction with other mesh networking technologies. Its low latency means it’s useful for handling backhaul traffic, whilst you can use Wi-Fi for your devices. Travelling onwards Mesh networking is the next step in home Wi-Fi, and as our internet usage only increases, it will soon be the standard for ensuring whole-home coverage. If you’re looking to upgrade to mesh networking, be sure to research the brands and devices on offer. 
Right now, in terms of the balance between affordability and performance, a dual approach of shared wireless mesh, with powerline handling the backhaul, is worth looking into. ### Taking Your Cloud Adoption To The Next Level The cloud is changing, and so is your infrastructure – at least, with continual developments in technology, it should be. But how do you keep up? From the latest statistics, we know both public and private cloud are on the increase again, with public cloud adoption up to 92% (from 89% in 2017) and private cloud adoption up to 75% (from 72% in 2017). There’s no doubt that changes to the cloud offer new possibilities, ones your competitors will be looking to take advantage of. This leaves decision-makers with the question: are you happy to tick the imaginary box to say your organisation is in the cloud, or are you continually evolving and innovating with the cloud services available? Dave Martin, Head of Service & Delivery at Brand Experience Engineers, Rufus Leonard, offers his view on how to build the right cloud strategy for your business. It’s not just about individual technology components. The future will involve hybrid cloud deployments that increase complexity and require multi-cloud services to be managed strategically. Infrastructure and Operations (I&O) teams will need to work with multiple service providers to design, maintain and continually adapt the cloud services they adopt. At Rufus Leonard, we advise our clients to use these four steps to get started: identify where you are currently on the cloud adoption roadmap; assess the challenges your company will face at each stage; decide how you’ll overcome the challenges; and choose the right management tools. Location or service? Before we start, it’s worth clarifying that computing strategy teams usually talk about the cloud in one of two ways – location (public, private, hybrid or community), or the service it’s offering, which includes IaaS (Infrastructure-as-a-Service), PaaS (Platform-as-a-Service), SaaS (Software-as-a-Service), or Storage, Database, Information, Process, Application, Integration, Security, Management and Testing-as-a-Service. Step 1. Identify where you are currently on the cloud adoption roadmap Many businesses are already a fair way down the road, and 81% have a multi-cloud strategy. But you still have to decide how ambitious you want to be going forward – up-to-date or pioneering? Ultimately, this is a business call, which can be a challenge if decision-makers struggle to understand the importance of cloud services. The answer – as with all business decisions – rests with your customers and your people. Once you know what customers need, and what your staff need to serve them, you can build the cloud strategy to get you there. Step 2. Assess the challenges your company will face at each stage Change is never easy, and it’s important to understand, and document, the internal processes that will be affected by your chosen cloud service, mapping applications and workloads to the services available. For example, if you’re delivering a digital transformation and overlook cloud strategy, you could end up holding up the whole programme, at significant cost to the business. 
Compared to 2017, we’re seeing IT teams taking more of a lead in planning, including: deciding on the applications to move to the cloud (69% vs 63%); undertaking the management of cloud costs (64% vs 55%); setting the policies around cloud services (60% vs 58%); and ultimately being responsible for brokering cloud services (60% vs 54%). So, involving your IT team early in any significant initiatives is really crucial. A final consideration is providing confidence around the security of the information you are storing in the cloud. Understand where your data needs to be stored and protected, while still ensuring the end performance for your users. A multi-cloud platform will allow this but will require involvement from your Infosec, IT and Marketing teams. Step 3. Decide how you’ll overcome the challenges In our experience, involving departments beyond Infrastructure and Operations always pays off. It’ll help you find the right strategy to support their processes, and other teams, for example marketing, can also see how to make the most of new technologies. Taking time to review processes, namely governance, application development support, automation, operations and (of course!) security, allows you to refine, and hopefully eliminate, inefficiency. You can then implement automation, minimising repetitive manual tasks. Look at your in-house team’s skills. Keeping on top of the latest cloud products and services is a time-consuming job and requires specialist people. Consider supplementing them with an external team in the short term, but don’t rule out building up your in-house expertise over time. (The RightScale report 2018 shows that 68% of organisations either have a central cloud team or are planning to have one.) Choose your service provider carefully. Several providers are trying to deliver cloud services based on outdated thinking, so check their track record’s up to date and they’re genuinely specialist. Step 4. Choose the right management tools The likelihood is that you will end up having a hybrid cloud of multiple vendors, with multiple tools for you to monitor and manage. So do you handle this internally, or go for a managed service? And how will the different systems interact? You will need to analyse the different cloud locations to understand what’s going to work for you, and, if you have the skills, integrate the systems yourself. For Azure, build on or utilise Microsoft’s own tooling, such as Application Insights, or consider external tools such as New Relic or Logic Monitor. If you have the budget, a tool like Power BI will give you meaningful data insights from multiple locations via personalised dashboards or regular reporting. And finally Cloud service life-cycles are shorter than ever before, so it’s not just about choosing the provider that’s right for you now but understanding where they’re going in the future and if that’s right for you. Can they show you a roadmap? Are they sticking to it? Prepare to negotiate heavily and ask for a clause letting you transfer after 12 months if you need to. Ultimately, businesses can’t be passive about the cloud anymore. You need to be proactive on everything from your strategy to working with providers. When you look back at your cloud progress in six or twelve months, where will your business be – and how will you get there? 
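As a practical aside on Step 4, before committing to a full management suite, some teams prototype a single unified view of their environments with a few lines of scripting. The sketch below is a minimal, illustrative example only: the endpoint URLs and JSON fields are hypothetical placeholders standing in for whatever monitoring sources you already expose internally, and are not real Azure, AWS or Power BI APIs.

```python
# Minimal sketch: pull a basic health/cost summary from several monitoring
# endpoints and flatten the results into one view. The URLs and JSON shape
# are hypothetical placeholders, not real provider APIs.
import json
import urllib.request

ENDPOINTS = {
    "azure-workloads": "https://monitor.example-azure-proxy.internal/summary",
    "aws-workloads": "https://monitor.example-aws-proxy.internal/summary",
    "on-prem": "https://monitor.example-dc.internal/summary",
}

def fetch_summary(name: str, url: str, timeout: float = 10.0) -> dict:
    """Fetch one source's JSON summary; report failures instead of raising."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            payload = json.load(resp)
        return {
            "source": name,
            "status": payload.get("status", "unknown"),
            "monthly_cost_gbp": payload.get("monthly_cost_gbp"),
        }
    except OSError as exc:  # covers URLError, timeouts and connection failures
        return {"source": name, "status": f"unreachable ({exc})", "monthly_cost_gbp": None}

def unified_view() -> list:
    """One row per environment, regardless of which cloud it lives in."""
    return [fetch_summary(name, url) for name, url in ENDPOINTS.items()]

if __name__ == "__main__":
    for row in unified_view():
        print(row)
```

Even a toy aggregator like this makes the Step 4 trade-off concrete: the hard part is not fetching the numbers, but agreeing which fields every cloud location must report so that the rows are comparable.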
### SD-WAN as a Service: the solution for today’s organisations For many years, the technology of choice for linking together remote locations into a seamless and secure WAN has been MPLS links, since network boundaries coincided with physical ones. With the rise of the mobile, geographically-dispersed workforce, however, the cost of MPLS bandwidth and the inherent limitations of the traditional appliance-based SD-WANs are becoming too burdensome, even when supplemented with VPN appliances meant to accommodate offsite users. SD-WAN as a service (SDWaaS) platforms are the mobile-ready solution today’s organizations really need. Due to their low latency, packet loss, and downtime, MPLS network links have been a mainstay of enterprise networks, linking the LANs of individual branch offices to each other and to the central HQ and thus forming an organization-wide WAN. MPLS links, however, come with high price tags for bandwidth compared to Internet links. Although traditional SD-WANs based on rack-mounted appliances enabled organizations to reserve MPLS bandwidth solely for the latency-sensitive business applications which needed them, only users at the central datacenter or MPLS-linked remote offices could enjoy the performance and encryption benefits. Why VPN appliances aren’t a fix for SD-WANs For workers such as offsite contractors, staff accessing network resources from their mobile devices, employees who work from home, and freelancers working for a corporate client, this has become a major pain point. Since all of them are outside the WAN, they need a secure way to log in. Inevitably, VPN appliances are used to augment the SD-WAN. Network engineers usually employ one of two methods to do this: Option #1: Have all these people connect securely via a VPN to a single WAN access point (usually the main datacenter), and from there these outside users can access files and applications on the WAN and access the Internet securely. Option #2: Set up and maintain a dedicated Internet access (DIA) link to serve as a WAN entry point, with accompanying VPN appliances, at each remote location. Option #1 naturally leads to the central WAN entry point becoming a traffic chokepoint and single point of failure for an entire organization’s remote workforce. Option #2 increases both the WAN’s attack surface while adding complexity since it requires IT departments to install, configure, maintain, patch, and eventually upgrade multiple VPN appliances, not to mention other complementary network appliances like firewalls. The key to resolving this conundrum is by first realizing that appliance-based SD-WANs are a physical solution meant to manage physical network links between physical permanent locations and provide connectivity to users permanently within those locations. The second step is understanding that all those physical networks should be virtualized and cloud-based, accessed through much more affordable last-mile Internet links. This is what SDWaaS is all about. 
Cloud WAN solutions for the cloud era By moving the SD-WAN to the cloud, SDWaaS delivers the same cloud benefits enterprises are already enjoying in other areas of their infrastructure: redundancy, with no single point of failure; geographical coverage, with no single point of congestion and more efficient routing based on proximity; simplified and unified configuration and monitoring; and access to network infrastructure wherever there’s an Internet connection. SDWaaS also fixes another problem associated with VPNs: the lack of granular access policies and visibility into WAN traffic usage. Since SDWaaS platforms provide real-time monitoring, fine-grained access controls, and historical metrics through an easy-to-use dashboard, the businesses which use them are in a much better position to fulfill the data security requirements often included in the RFPs of potential customers. In this way, SDWaaS can actually drive more revenue through cost-effective security compliance. Lastly, SDWaaS solutions can even replace MPLS throughout the entire corporate WAN. That’s because these platforms combine point of presence (PoP) location proximity to major cloud service providers like Microsoft Azure and Amazon’s AWS with an optimized backbone which comes with an SLA. The result is MPLS-level performance at standard business Internet prices. With so much of the network infrastructure of many organizations already in the cloud (or heading there), it makes a lot of sense for them to migrate their SD-WANs there as well. By leveraging the scalability and flexibility of the cloud, and a global backbone with a comprehensive integrated security stack, SDWaaS provides performance and security to mobile and offsite users without the cost and headache of managing more network appliances. ### Data Protection for Cyber Threats In today’s world, protecting your organisation’s data from cyber threats is becoming a number one priority, not just at the I.T. level but at board level also. Companies are looking for ways to improve their data security to guard against the rising number of ransomware attacks, whilst also protecting their data from insider threats, such as malicious intent from disgruntled employees. The financial penalties imposed by the GDPR have emphasised this massively. A typical ransomware attack consists of five stages: Break-In stage – An employee clicks a malicious URL in an email and triggers the ransomware software install. Latch-On stage – Ransomware probes the network seeking vulnerable machines. Expand (encrypt) stage – Critical servers are infected and encrypted with ransomware. Shutdown stage – Your organisation is paralysed, forced to shut down key services and pay the ransom (the longer you wait, the more expensive it gets). Restore (decrypt) stage – Criminals provide decryption keys; decryption can take days. In some documented cases, they fail to provide the keys at all. Protecting your organisation from the Break-In stage is all about education: educating your users about security, phishing, malware and ransomware. You should have a process in place to provide regular security training to empower your users to become the front-line defence against cyberattacks. Your users should have the knowledge that allows them to recognise risks and know how to report potential threats. 
A well-designed data protection solution can help you recover from the Expand stage quickly, minimising the impact and cost of the attack to the business, reducing the effect of the Shutdown stage and hopefully avoiding the Restore stage. But in order for this to work, you must secure your backup data! Your backup server is a network application, and as such, cannot be completely isolated using an air gap approach, otherwise it would not be able to deliver backup services. What you can do to avoid the Latch-On stage is to minimise the network footprint of the backup server. For example, avoid exposing the backup server through network shares or exports wherever possible, and try not to advertise the application’s services unless necessary. Another approach is to take the backup server out of the domain, or Active Directory (AD). If AD is attacked and production data compromised, this prevents the backup server’s credentials from being obtained and protects the integrity of your backup data. To prevent the Expand stage and stop your backup server becoming infected, you need to secure, or harden, the server itself. This can be done by implementing stronger security settings and disabling any services that are not required. Your firewall rules should be configured to only allow access to the backup server using specific ports, also stipulating the permitted direction of network traffic. Wherever possible, backup client-server communication sessions should be configured to use SSL encryption. Your choice of backup server operating system also impacts the potential security of backup data. As the majority of cyber and ransomware attacks tend to target Windows systems, more and more people are moving their backup servers over to Linux or UNIX-based systems. If using Linux, we would also recommend enabling SELinux to provide additional security functionality. We mentioned the term “air gap” earlier. In relation to data protection and in simple terminology, an air gap solution provides an isolated copy of your backup data that is not visible on your network. We are seeing an increase in the number of companies starting to investigate how an air gap solution can potentially enhance their existing data protection solution. However, care should be taken to ensure that the air gap will deliver what it is required to, not only from a functional perspective but also on a commercial basis. Implementing an air gap using dedicated hardware and software can be an expensive venture, so explore how your current technology investments can be better utilised to provide air gap capabilities. Also take the cost of the recovery process into account. How long will the air gap solution take to recover my environment when compared with a traditional backup solution? Tape storage provides the fundamental requirements for air gap; data is stored on offline media that is not mountable onto the network. To access the data, the tape cartridges must be mounted in drives following media requests from a backup application. If electronic vaulting is used to store backup data in a library that is physically located on a different site, or you create additional tape copies which are sent off-site, then even better. What you must do when you use tape to form an air gap solution is protect your tape volume catalogue, or backup metadata. This data describes what backup data is stored on what tape cartridge and is crucial when it comes to recovering the data. 
Object storage is an alternative to tape that can form part of an air gap solution, as ransomware does not currently know how to access backup data that is stored on object-based storage devices. Another point to consider is the amount of data your air gap will store. If you store a copy of the last good backup to provide a recovery window of up to 24 or 48 hours, and you are hit by a cyber-attack, you need to understand that when you recover the affected systems you will most likely be recovering the cyber threat as well, since it may have been resident in your network for longer than the air gap coverage period, and re-infection may occur. In this scenario, the air gap can provide you with some breathing space to try and disarm the threat before it runs again. Your backup software application can also be used as a tool to help detect (not prevent) when you have been infected by a threat such as ransomware. Backup software vendors are now starting to introduce alerting into their products that performs trend analysis on backup data to highlight patterns that are outside of the norm. For example, when ransomware infects a system and encrypts data, an incremental backup of that system will back up the infected files as new data. This will result in an increase in the daily ingest of backup data for that system and also a reduction in the deduplication ratio of the backup data, as encrypted files will not be deduplicated or compressed. Both of these metrics can be used to alert the administrator of a potential issue. A further type of threat we often discuss is the one that comes from within our own organisation. Dealing with these insider threats, or attacks, is a completely different matter. To combat this scenario, the answer is not just about technology, but a combination of policies and processes working in conjunction with security software. In general, the disgruntled or dismissed employee that we mentioned at the start of this article will not normally perform a malicious act while they are still on the premises, as they could easily be apprehended. This type of attack often takes the form of a destructive macro or script, which is scheduled to run at some point after they have left your organisation. Once again, an air gap solution may not provide the recovery capabilities to restore your systems to the safe, disinfected state they were in prior to the attack, as the malicious code may have been resident in your infrastructure for longer than the period the air gap covers. Therefore, you need to rely on a well-designed data protection solution, backed up with security policies and procedures, all working hand in hand with a Security Information & Event Management (SIEM) system. Managing and monitoring human behaviour is the key. SIEM will provide you with a huge data repository that can be mined by AI to allow you to answer some of the important behavioural questions. How are people accessing my data? What are people doing with sensitive data? Which users are behaving abnormally and how? Hopefully, this information will allow you to take action and go on the offensive! If you’d like assistance with your data protection then feel free to get in touch via Celerity or call us on 01772 542450. 
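As an aside, the trend analysis described above (a sudden jump in daily backup ingest combined with a collapsing deduplication ratio) is simple enough to prototype before you rely on a vendor's built-in alerting. The sketch below is a minimal, illustrative example only; the data structure, thresholds and sample figures are assumptions for demonstration and are not taken from any particular backup product.

```python
# Minimal sketch: flag backup jobs whose daily ingest jumps while the
# deduplication ratio drops - a pattern consistent with ransomware having
# encrypted previously deduplicable data. Thresholds and sample figures
# are illustrative assumptions only.
from dataclasses import dataclass

@dataclass
class BackupStats:
    day: str
    ingest_gb: float      # new data ingested by the incremental backup
    dedup_ratio: float    # e.g. 4.0 means a 4:1 reduction

def looks_suspicious(prev: BackupStats, curr: BackupStats,
                     ingest_growth: float = 2.0,
                     dedup_drop: float = 0.5) -> bool:
    """True if ingest at least doubled AND the dedup ratio at least halved (defaults)."""
    ingest_spike = curr.ingest_gb >= prev.ingest_gb * ingest_growth
    dedup_collapse = curr.dedup_ratio <= prev.dedup_ratio * dedup_drop
    return ingest_spike and dedup_collapse

history = [
    BackupStats("Mon", ingest_gb=120, dedup_ratio=4.2),
    BackupStats("Tue", ingest_gb=130, dedup_ratio=4.1),
    BackupStats("Wed", ingest_gb=460, dedup_ratio=1.3),  # encrypted files: large and incompressible
]

for prev, curr in zip(history, history[1:]):
    if looks_suspicious(prev, curr):
        print(f"ALERT: {curr.day} backup looks abnormal - investigate before it spreads")
```

In practice you would feed this from your backup software's reporting output or job logs, and tune the thresholds to your own daily change rate so that normal month-end spikes don't trigger false alarms.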
### The Cloud Journey: Embracing Software Defined for Hybrid Cloud There are many reasons why a business may move some or all IT services to an Enterprise Cloud, and because of this many organisations, particularly in the public sector, now have a Cloud First strategy. Benefits of running in the Enterprise Cloud One of the main reasons is better cost management, which usually leads to cost reductions. Running on an op-ex model eliminates large cap-ex outlays, and while those outlays are usually planned around 3- or 5-year refresh cycles, the nature of the industry can make them unpredictable. The natural elasticity of the cloud enables you to grow and shrink your consumption based on business, project and customer demand and avoids the risk of making a large investment in an undersized or oversized solution that you’re then stuck with. Another key benefit of moving services to the cloud is to free your internal IT personnel to focus on enhancing IT services, leading to increased business productivity. If your IT department isn’t spending most of its time supporting the current infrastructure or “keeping the lights on”, it can be of far more value to the business, becoming a business enabler rather than, as is often the case, a business inhibitor. Many cloud providers have technologies that can be used to enhance the services that you transition over to them, which your business will benefit from. Things such as SIEM, machine learning, automation and Artificial Intelligence (AI) can offer some real value but are too complex or cost-prohibitive to deploy on-premise. Alongside these key drivers, other benefits include complementing a lack of in-house skills, sharing project risk between your IT team and the cloud provider, and gaining access to new and more secure technologies as the cloud providers adopt them. But there are always downsides… As is often the case with using a new architecture or platform, there are newly introduced risks to be aware of and ‘gotchas’ to look out for when moving to the cloud. A fundamental one is security: while many cloud providers offer very secure services and facilities, their scale makes them tempting targets for hackers and ransomware attacks, and the digital transformation era has introduced new threats. It’s important to understand what authentication, encryption and security technologies are available and that they are included and costed when moving to the cloud. Compliance is another area of concern, as you may have directives that your data must be located in a specific geography or be retained for a long period of time. Not all cloud providers offer all services in all geographies, so it is important to make sure your specific requirements, and therefore compliance, are met. For example, if your production data is guaranteed to reside in the UK, is this the case for the backups of that data? Other typical areas for concern or things to be aware of include the cloud readiness of your application (especially if it uses multiple tiers), your internet connectivity bandwidth and reliability, and the cost of pulling data out of the cloud, either when using the applications or when migrating from one cloud vendor to another or back on-premise. So which ‘cloud’ is best for me? There are several cloud provision models out there and many providers that cover them, so it can be difficult to know who to choose and trust. 
A public cloud is what most people might think of when looking at a cloud first strategy. Industry giants like Amazon with AWS and Microsoft with Azure are public cloud providers whereby you provision from pre-defined services from their resources with no responsibility for (or access to) the underlying infrastructure. If you’re looking for something that doesn’t directly align to the public cloud portfolio or require a more bespoke offering then a private cloud provider may offer the flexibility you require. Traditionally, a private cloud is related to infrastructure on your premise that you own and are responsible for, but increasingly there is a demand for a virtual private cloud whereby you still run services on dedicated or shared hardware that you do not own. You may require the cloud provider to have some of their hardware located on your premise while still maintaining ownership and responsibility for it, or you may have a particular technology requirement that you need to deploy or you want more control over the underlying solution, if so then a private cloud may be the best choice. Virtual private clouds offer more flexibility than traditional public clouds and increasingly public cloud providers also have a virtual private cloud offering. However, it is still a very common topology for a business to use a hybrid cloud model whereby they use a mixture of public, private and virtual private cloud models to deliver their IT services. How to get there? Moving to your chosen cloud platform(s) is not a simple task and it takes a significant amount of planning and a staged approach to give it the best chance of success. Using the cloud to provide a service that supports your on-premise servers, such as backup (BaaS) or disaster recovery (DRaaS) is typically easier and less disruptive to implement than moving your current production servers to the cloud. Typically, a “gateway” of some sorts would be deployed on your premise and used to transfer your data and servers to the chosen cloud. These may convert your data to a different format or require clients to be installed on your physical servers to virtualise them for example. Additionally, they may need to be reconfigured once they have been replicated to the cloud. Due to all this complexity and disruption, you would typically engage a trusted third-party company for professional services to assist throughout this project to ensure a smooth transition and while this is probably the route you would take, there are steps to take to not only best prepare your environment for life in the cloud, but also give you the most versatility once there. ### TALARI CLOUD CONNECT ELIMINATES HASSLE OF DEPLOYING ENTERPRISE CLOUD-BASED INFRASTRUCTURE Leading Unified Communications and Cloud Service Partners Support New Multi-Tenant Talari Platform that leapfrogs SD-WAN competition  18 September 2018—Talari Networks has announced its new Cloud Connect solution for hybrid multi-cloud enterprise networks.  Talari’s multi-tenant Cloud Connect platform ensures businesses maintain MPLS-class reliability and Quality of Experience (QoE) when accessing multiple SaaS and public cloud services.  Mirroring the reliability benefits of Talari’s failsafe SD-WAN site-to-site connectivity, Cloud Connect helps enterprises avoid the time and cost burden around deploying and managing cloud infrastructure. This cloud-first approach prevents proprietary carrier service lock-in with reliable and predictable multilink, multi-path access to SaaS and cloud-based services. 
Talari’s solution enables cost-effective and efficient high QoE connectivity to Software as a Service (SaaS), Unified Communications as a Service (UCaaS), cloud service gateways and managed Network as a Service (NaaS) offerings through a new high-availability connection called a Cloud Conduit. Underlying the immediate commercial appeal of the Talari Cloud Connect solution, Talari is announcing the support of leading UCaaS, NaaS and Cloud Service providers (CSPs). Technology partners including RingCentral, Evolve IP, Pure IP, Meta Networks and Mode are the first to offer Cloud Connect support. Value-added reseller partners endorsing Cloud Connect include Avant, Bridgepoint and Teneo. “Prior to working with RingCentral and Talari, our firm’s cloud services were dependent on separate and expensive dedicated circuits for messaging, voice, video, meetings and conferencing,” said Jason Kasch, CIO at Structural Group. “The addition of Talari’s new Cloud Connect solution within RingCentral’s UC offering helps boost service reliability while simplifying cloud management and network troubleshooting.” Practical Cloud Connect Enterprise Use Cases Reliable, predictable connectivity to multiple third-party cloud services is critical to enterprises’ business strategies. Talari Cloud Connect unifies the management of enterprise connectivity to SaaS and cloud-based services through the shared administration of Talari Cloud Conduits. These Cloud Conduits offer multi-link, multi-path, reliable connections between customer locations and the Talari Cloud Connect technology running at the CSP’s Point of Presence (PoP), along with shared administration. Cloud Conduits deliver the same failsafe benefits of Talari’s patented SD-WAN technology for SaaS and cloud access, without requiring enterprise customers to deploy and maintain SD-WAN software in the cloud. “As enterprises embrace hybrid IT and multi-cloud postures in pursuit of their digital-transformation objectives, they are compelled to overhaul their WAN architectures and management models to reliably, securely and cost-effectively deliver the cloud-based applications and services that are becoming increasingly valuable to business outcomes,” said Brad Casemore, Research Vice President, Datacenter Networks at IDC. “Talari Cloud Connect responds to the challenges posed by multi-cloud, offering an approach to SD-WAN that addresses the dynamic connectivity, bandwidth, and security requirements at the intelligent edge.” The UCaaS market is a popular time- and money-saving enterprise option. Analysts, including IDC Research, predict the UCaaS market will grow to more than $25 billion by 2021. Talari’s Cloud Connect technology helps leading UCaaS partners deliver a superior platinum-quality service for loss- and latency-sensitive, real-time applications. CSPs announcing support today include RingCentral, Pure IP, and Evolve IP, underscoring the immediate UCaaS appeal. High-value UCaaS applications from these partners include managed Voice over IP (VoIP), SIP trunking, contact centre, HD voice and videoconferencing, instant messaging and presence, desktop sharing, and online collaborative workspaces. Improved enterprise cloud application access via cloud-based PoP gateways is another natural Cloud Connect use case. Mode is using Talari’s Cloud Connect to power an ‘easy button’ approach to cloud service gateway deployment that saves both time and cost. Reliable SD-WAN access is delivered to the provider PoP closest to the SaaS / Cloud service. 
Partner Cloud Managed Services Gains Further delivering on the promise of Managed Service Provider (MSP) offerings, Mode will offer enterprises a more advanced solution using the same Talari Cloud Connect technology. Mode will combine Cloud Connect technology with its own reliable, flexible, cost-effective Cloud Private Network to enable reliable NaaS access to every enterprise cloud service, SaaS and Internet location. This NaaS scenario sees each enterprise Talari SD-WAN location leverage Talari Cloud Connect for accessing the local Mode PoP closest to the enterprise location. From there, traffic flows over Mode’s reliable private backbone for efficient, reliable, predictable access to all sites. “Talari’s customers are heavily invested in complex cloud business applications to securely power everyday operations reliably and at scale,” said Patrick Sweeney, CEO at Talari Networks. “Talari Cloud Connect ensures secure, reliable, predictable access to SaaS and cloud-based network services, without enterprises being forced to bear the burden and complexity of deploying and managing SD-WAN infrastructure in the cloud. Cloud Connect helps enterprises and their cloud services providers deliver full multi-link visibility, reliability and bi-directional QoS while accessing cloud/SaaS-based apps.” Talari Cloud Connect technology also enables reliable, predictable first/last mile access to cloud-based security and infrastructure services. Talari partner Meta Networks provides employees, contractors and partners with secure remote access to corporate applications in private data centres or the cloud. This Talari-Meta partnership addresses two of the most important trends in IT today - cloud migration and always-secure employee mobility. Pricing and Availability Rapid deployment of Cloud Connect occurs through multiple high-traffic routes to market.  Talari will offer a multi-tenant Cloud Connect PoP technology in the cloud.  Communication Service Providers can obtain the Cloud Connect PoP platform functionality at no additional cost for connecting with Talari SD-WAN customers. Talari Cloud Connect software is available starting in October to existing customers with no additional cost. More details on Talari Cloud Connect are available here. About Talari Talari is the leader in Failsafe SD-WANs™ and an NSS Labs “Recommended” SD-WAN vendor. Talari SD-WAN offers both MPLS-class high availability and superior Quality of Experience ensuring predictable application performance for both apps and real-time apps like VoIP and videoconferencing.  Talari technology delivers a multi-link WAN with 50 – 400 times more bandwidth per dollar, WAN cost reductions of 40% – 80%. Talari is deployed in over 500 customers in more than 9,000 locations in over 40 countries and boasts a 90+ NPS (Net Promoter Score). For more information, visit www.talari.com. Additional Talari Partner and Analyst Quotes are available here. Media Contact (UK) for Talari Peter Rennison or Sam Morgan, PRPR, +44 (0) 1442 245030, pr@prpr.co.uk / sam@prpr.co.uk ### Shadow SaaS | The risks of employee software purchases Confidential information – personally identifiable data, customer data, trade secrets – circulates like a bloodstream through enterprise applications. With security breaches making daily news and regulations like GDPR and Sarbanes-Oxley proliferating, bad data security practices can land organisations in court and ruin reputations. 
Hidden Risk: Employee SaaS Purchases While a chief security officer’s most important responsibility is making sure corporate data is safe and kept out of the wrong hands, that’s difficult to do when it’s unclear how many SaaS apps are running in the environment and who’s accessing them. A hidden risk that creates this gap is called Shadow SaaS – the ability for employees to pay for and start using SaaS apps easily, whether or not the apps are officially sanctioned. In fact, companies often have 15 times the number of SaaS apps in their environment than IT knows about. For example, a telecommunications company discovered $10 million worth of Shadow SaaS in its environment, including 295 unsanctioned products from 266 different vendors. According to Gartner, by 2020, a third of successful attacks experienced by enterprises will be on their Shadow IT resources. Why? Easy to purchase Today’s employees are used to simply purchasing what they need online, especially if it’s fast and helps get things done. They may choose this easy and convenient route instead of going through a lengthy IT and purchasing process, often without an understanding of the bigger picture issues including security, volume discounts, licensing agreements and more. For example, a developer may purchase Elastic Compute Cloud (EC2) right from Amazon with a company or personal credit card. Employees commonly use free applications such as Google Docs and Dropbox to easily and quickly share information across their teams. The result is Shadow SaaS, where cloud accounts are used across the organisation and not managed from a safety and overall corporate view. In addition to breach vulnerability, costs (which includes staff time) can quickly head out of control. How to Prevent the Risk of Shadow SaaS As with most business challenges, a “block and tackle” approach of setting up a process and taking advantage of IT asset automation can dramatically lower potential problems from Shadow SaaS. The following six steps offer a path to control, not only for Shadow SaaS but also for hidden vulnerabilities across the company: Start with a SaaS inventory. The old saying “you can’t manage what you don’t measure” applies so well here. The first step is taking Shadow SaaS out of the shadows and creating a formal inventory. Discover the risks. Using today’s vulnerability risk technology, you can uncover exactly where the risks exist. This insight enables you to apply precious resources and time to the right spots. Find the threats that matter. Another advantage of modern vulnerability risk technology is that it can do more than tell you where the risks are. You’ll discover what risks are most important to help security and IT teams create a highly targeted plan of attack. Review proper licensing. If the SaaS purchase didn’t go through formal company processes, that also means you may not be on top of licensing. It’s possible to integrate a software licensing solution with your IT asset management system to bring to light important issues to proactively maintain license compliance. Know your usage. In addition to licensing details, it’s important to gain insight into actual usage of any Shadow SaaS. You may discover a tool widely used in the organisation that could benefit from a multiple-user subscription. Duplicate tools may emerge that could be combined. Ask employees what they need. 
Since your employees live the day-to-day reality of what it takes to get projects done, they are a natural and great source of information about important technology needs. By checking in with different teams, you’ll uncover information that can guide technology purchases. Don’t Say No, Empower Employees Many companies have chosen a new programme: setting up an employee app store. It’s the best of both worlds. When you create an enterprise app store of approved software and services, you provide wins for employees and the company. An app store enables rapid access to important tools keeping productivity high, and employees empowered. And it also protects the organisation in multiple ways. IT can vet technology, formally inventory its existence and track vulnerabilities. Procurement can explore volume purchases, manage licensing and more. Employees win. The company wins. While the “shadow” in SaaS may sound scary, it actually provides all sorts of opportunities to apply the latest technology – creating a more agile, action-oriented culture. You’ll also minimize security breaches while creating all the documentation you need to support compliance. Bringing SaaS out of the shadows means saying goodbye to risk, and hello to protection and opportunity. ### How is automation streamlining business? Automation has become the primary driver of economic growth in modern economies, but it can also be disruptive. A cheaper delivery of services and goods can often be accompanied by automation anxiety and the very real prospect of job losses and smaller businesses being left behind. Improving services however can mean far more than just increasing the bottom line profits because automation can deliver so much more for your business. Automation is about improving services for customers and business alike, as well as staff. Many new services are setting out to do just that. Here is how they are streamlining businesses. Profits Less of a dirty word nowadays. Profits are the amount of money a company should invest in its future growth. By continuing to purchase equipment that can improve services, speed up delivery and save money, this automation then drives bigger profits in the future. Not doing so leaves companies hopelessly chasing the success of others. Automated payments are one such advancement doing exactly that. It was recorded by UK finance that in April 2018, 72% of all small to medium transactions were processed through chip and pin or contactless payments. As society heads towards a ‘cashless’ future, handling payments automatically will become increasingly essential. Yet, receiving payments barely scratches the surfaces of this automation. Its arrival can revolutionise the way we monitor money. Digital receipts can automatically update your balance sheet, tax receipts can be calculated for you and the reprocessing costs associated with incorrect or outdated account information can be avoided. Artificial intelligence bank accounts are being launched with intelligent assistants, that monitor your spending or keep an eye on direct debits. These will then do most of the heavy lifting for businesses, preparing reports for you as they go to review your spending. Time Productivity is an issue in the UK, with multiple studies showing the country lagging far behind its closest competitors. In fact, German and French workers achieve by Thursday afternoon the output it takes the equivalent UK workers until the end of the working week to complete. 
One reason for this is that investment in automation is lower than in those countries, particularly among small businesses. Quite simply, when not enough money is invested in new technologies, tasks take longer to complete. The technology, rather than making workers expendable, should free up time away from mundane tasks and give employees genuine opportunities to learn new skills and advance their value to a business. One company in Edinburgh, Murray Collier Sports Therapy, has employed the help of a ‘talking bot’ to answer the phone and take appointments on the owner’s behalf. This means Murray can focus on treating his clients without worrying about potential loss of income from not answering the phone. The bot answers the call and updates his appointments calendar, maximising potential revenue. Sharing As we continue to digitise everything, our requirements for tangible possessions will continue to shrink. Letters become emails, filing cabinets move to the cloud, and smartphones serve the purpose of multiple devices. All this, as well as the attainment of cleaner energy, saves businesses space and money. For what we do need, businesses can get it through a sharing economy which technology gives us access to, rather than each person owning each individual device. Hybrid vehicles can reduce our need to own a car, and we may even subscribe to a service like Mercedes and have one arrive when we need it. Meanwhile, property companies like WeWork use technology to connect demand and supply when it comes to sharing office facilities. These ideas lessen our dependence on the world’s natural resources while significantly reducing CO2 emissions. They also massively reduce costs for business owners, despite improving services. Customer Service Anyone with experience of running a business will know the extent to which customer service can make or break your success. Yet, it can also be one of the hardest parts to get right. Well, chatbots could be about to change all that. Instead of long waits, chatbots can answer phones immediately at any time of the day when problems arise, appropriate for the impatient generations coming through. They can handle an infinite number of requests at the same time and provide immediate answers in a way a human could not. This can reduce service time and operating costs significantly, with many studies indicating even current chatbots resolve more conversations successfully than humans. Further, a professor at Georgia Tech University, using software that predicted questions and scripted answers, produced a chatbot that answered questions with 97% accuracy. Those unable to be answered can then simply be forwarded to a real person. With the emergence of Google Duplex, there is also the very real possibility of one robot ringing another. This means no human involvement in a phone call where both bots work together to fulfil their short and simple instructions. For anyone who has sat on the phone waiting for customer service and grown frustrated throughout the process, it’s a mouth-watering prospect for business owners, staff and customers alike. Which is what innovation should be all about. ### Why aren’t data breaches a thing of the past? The tech industry and the billions of people who populate its platforms are undergoing a reckoning that’s causing many to question the very efficacy of the digital age. 
This shifting sentiment is a spectacular change for an industry that enjoyed unquestioned adoption at both enterprise and individual levels. It's been less than 30 years since the first website launched at CERN, but those three decades have seen nearly unbridled growth and participation in a web-based ecosystem. In virtually every way, internet services are becoming more prolific and more popular every year. It's easy to see why. With nearly four billion users, two billion websites, and readily available access granted by LTE connections and extremely capable smartphones, the internet is the place to be. Moreover, participation is incentivized because the internet feels like an information superhighway where access is cheap, and in many cases, the services are free. Of course, that could not be further from the truth. The digital economy is an expansive and expensive ecosystem that's fueled and funded by user data. According to an estimate by Forbes, internet users create 2.5 quintillion bytes of data every day – a number so large that it almost ceases to have real meaning. Every click, view, like, and share is collected and catalogued to strategically promote products and improve sales. As Nicholas Confessore recently acknowledged in The New York Times Magazine, "All of this...was designed to help the real customers — advertisers — sell him things. Advertisers and their partners in Silicon Valley were collecting, selling or trading every quantum...that could be conveyed through the click of a mouse or the contents of his online shopping carts." Big data may drive the internet economy, but it's quickly becoming more problematic.

The Big Problem of Big Data

All of the data creation and collection would be more palatable if it were secure. Sure, companies would continue making copious amounts of money from their users' information, but at least that information wouldn't be compromised in a more damaging way. Unfortunately, this information is anything but secure. As the string of astonishing headlines over the past several years demonstrates, companies are collecting an incredible amount of user data, and that data is routinely being stolen or misused. Consider just a few of the most obvious examples. Yahoo, the once-dominant web portal, endured a hack so heinous that it ultimately compromised all three billion customer accounts. Other prominent companies including Target and Equifax have suffered from equally public, if not quite as prolific, hacks that compromised sensitive customer information. This is reflective of the Wild West mentality that is pervasive in internet culture. Anything can be stolen at any time, and for most customers, this is starting to feel more like an inevitability than a possibility. After all, if the U.S. electrical grid can't be secured against an intrusion, what hope do ordinary consumers have? Unfortunately, companies have little incentive to change. Although these increasingly catastrophic data breaches are avoidable, companies are entrenched in the current strategies that are delivering short-term revenue results, while ignoring the long-term consequences of their actions. All of this data feeds the targeted ads that, to a significant extent, fund the internet. By owning and maintaining control of their users' data, companies can monetize access to those users. Fortunately, platforms may not be able to behave this way for long.
New technologies, specifically the blockchain, are quickly reshaping the possibilities for managing user data by allowing users to own and monetize it themselves.

The Blockchain & Big Data

The blockchain originated as the accounting backbone for Bitcoin, and it's risen to prominence as cryptocurrencies have burst into the mainstream consciousness, collecting mind and market share along the way. However, despite all the hype surrounding cryptocurrencies, the blockchain has successfully cut through the noise and established itself as the natural successor to the current, flawed internet infrastructure. Many of the blockchain's inherent features, including its decentralized network, smart contracts, and tokenized transactions, reconfigure the possibilities for data security and usability. Most networks are compromised because they have a single point of failure. With a big, bold centralized network to attack and little deterrence or chance of being caught, cybercriminals use rudimentary as well as technologically advanced mechanisms to steal user data. The blockchain's decentralized network thwarts many of these attempts, and it has other features that strengthen its security further. More specifically, sharding techniques, which further divide and distribute data among the already decentralized blockchain network, offer an additional level of protection that truly differentiates data storage on the blockchain. Sharding is not a new concept, and it's frequently discussed as a technique that can support platform scalability, so it's able to accommodate platforms of different sizes and developmental levels. When coupled with a tokenized ecosystem, these security features make user data more secure and usable than ever before. What's more, platforms that adopt this methodology differentiate themselves from the existing tech incumbents that currently dominate the industry. Consumers can only have so much patience with companies that consume their data while treating it irresponsibly. Therefore, it seems likely that the next iteration of the digital age will favour companies that build compelling platforms that also value their customers' privacy and secure their data by allowing them to maintain ownership over it. As the pendulum swings in the direction of privacy and security, organizations should be looking to the blockchain as the obvious next step toward embracing that change.

### Recruitment | Blockchain | A game changer in hiring

Whilst Bitcoin has taken the spotlight when it comes to the application of blockchain technology, we are starting to see a surge of innovation as its potential impact on businesses becomes increasingly understood. Blockchain technology is now being tapped into by many businesses and governments around the world. One such industry with huge potential is recruitment. Having worked in recruitment for nearly a decade, I saw increasing frustration and steadily decreasing trust in the processes involved in collecting references. Outdated processes meant that references could easily be abused and misused. Over eight years, the number of references collected by recruiters decreased by a staggering two thirds. References used to be the gold standard of an employee's skills and experience, yet according to a survey by CareerBuilder, 75 per cent of hiring managers have caught a lie in a resume. Instead, employers and recruiters make judgments based on workers' CVs and 'personal branding' pages like LinkedIn.
In my opinion that's a poor basis on which to make such an important decision, and unfair on a swathe of skilled employees who escape the recruiter's net. Whilst the traditional approach to background checking can be expensive, repetitive and inaccurate, there is no better way to measure a candidate's potential than through verified references. As a continuously growing ledger of records shared across every device in a network, blockchain is considered a secure and reliable way of storing data and an ideal way to ensure worker qualifications are verified. With trust, credibility and reliability of references spiralling downwards, I saw an opportunity for the application of blockchain technology to revolutionise referencing by removing many of the inefficiencies associated with hiring.

Blockchain-based recruitment

A first of its kind, Zinc's automated referencing, feedback and assessment tool allows recruiters to collect screening information with a single click. Recruiters have access to this information on day one of the hiring journey, before interviewing begins. A technical role in the UK takes on average 12 weeks to fill; by eliminating the need to repeat certain tasks such as psychometric tests, references and team-fit analysis, Zinc accelerates and simplifies the process. With an estimated 2 million technology roles available at any time, and an increasing shift towards the gig economy, there is a dire need to speed things up. Zinc 'open sources' the interview process by collating and securely storing a candidate's full work history, aptitude and attitude, inclusive of softer checks such as psychometric tests. Empowering workers is at the core of our work, and for the first time, workers in the technology sector will have absolute ownership and control of their work reputation and identity, and how they choose to share that information. Using their validated digital work passport, workers can be matched more accurately based on their complete profile, before an interview has even taken place. Anecdotally, we hear that a typical worker in the technology sector receives upwards of 50 unwanted approaches weekly through popular recruitment platforms. There is also frustration around the repetition of tests and assessments. A Zinc survey found that 79 per cent of respondents believed they would secure more interviews if references were shared earlier in the recruitment process. It's our mission to empower workers to take back control and ownership of their work reputation and identity.

The future of blockchain in HR

Blockchain-verified credentials open up possibilities for a whole new level of automation and trust in the hiring process. In the next few years, I believe that individuals will have a host of information stored on a blockchain, inclusive of academic certificates and work history. It will be within their control to choose who this information is shared with. In time, digital and verified records will become the norm in every industry. It's an infrastructure that makes complete sense and addresses one of the ongoing issues of today's internet: identity verification. There is a huge opportunity to 'make better' what is already in existence today, using technology that is ready and available. We're working with candidates, recruiters and employers to put trust back where it belongs - right at the heart of the hiring process.
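To make the idea of tamper-evident, verified references a little more concrete, here is a minimal Python sketch of the underlying principle: each reference is hashed together with the hash of the previous entry, so any later alteration breaks the chain and can be detected. This is a conceptual illustration only, not Zinc's actual implementation; the record fields, class and method names are invented for the example, and a real blockchain adds distribution, consensus and access control on top.

```python
import hashlib
import json
from datetime import datetime, timezone


def entry_hash(record: dict, previous_hash: str) -> str:
    """Hash a reference record together with the previous entry's hash."""
    payload = json.dumps(record, sort_keys=True) + previous_hash
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()


class ReferenceLedger:
    """A toy append-only ledger of work references (illustrative only)."""

    def __init__(self):
        self.entries = []  # each entry holds the record and its chained hash

    def add_reference(self, candidate: str, referee: str, statement: str) -> str:
        record = {
            "candidate": candidate,
            "referee": referee,
            "statement": statement,
            "recorded_at": datetime.now(timezone.utc).isoformat(),
        }
        previous = self.entries[-1]["hash"] if self.entries else "genesis"
        digest = entry_hash(record, previous)
        self.entries.append({"record": record, "hash": digest})
        return digest

    def verify(self) -> bool:
        """Recompute every hash; tampering with any earlier record breaks the chain."""
        previous = "genesis"
        for entry in self.entries:
            if entry_hash(entry["record"], previous) != entry["hash"]:
                return False
            previous = entry["hash"]
        return True


ledger = ReferenceLedger()
ledger.add_reference("A. Candidate", "Former line manager", "Led the platform team for two years.")
print(ledger.verify())  # True until any stored record is altered
```

The point is simply that a reference, once recorded and verified, cannot be quietly edited after the fact, which is what gives blockchain-backed credentials their value in hiring.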
### Applying AI | Running a modern-day data centre

For as long as organisations have needed data storage, whether in cloud, hosted or enterprise computing environments, you will find data centres behind them, catering to those needs. They house anywhere from tens to thousands of servers amongst racks of networking equipment, as well as supporting critical infrastructure systems. AI can be applied to the clear benefit of these data centres. The traditional rules and heuristics of the data centre no longer translate to the running of a modern-day infrastructure. With scale, complexity and all-around optimisation needs growing, the need to move with the times and look towards Artificial Intelligence (AI) has become particularly apparent. AI can be beneficial in many direct but also unexpected ways. There is a massive amount and variety of data available for systems to compute, from critical infrastructure to internal IT systems and applications, as well as external environmental changes (e.g. weather patterns). These factors, when analysed and synthesised by AI, can accurately provide the best outcomes for ever-increasing availability and optimisation, helping to address SLAs and ultimately minimise operating expenses.

The Need For AI

Many factors are contributing to the need for AI within data centres. Across data centres throughout the world, efficiency has become ever more critical. A recent study from the U.S. Department of Energy found that a data centre uses up to 50 times more energy per square foot than a standard commercial building, and that data centres consume more than 2 per cent of all electricity in America. With such statistics, operators are now analysing this data using AI to cut costs and consumption and reduce the industry's energy footprint. One of the ways they are doing this is through data centre consolidation. As an industry, one of the main avenues of profit is through economies of scale: the more customers, servers, racks etc. an organisation has, the more money it can make. But this, of course, means more physical space and more power (energy). By consolidating facilities and using AI to evaluate a better method of storing data using less hardware, businesses can densify the way they store enormous amounts of data while lowering power usage. Another method of consolidation, which takes the onus away from organisations, is colocation. Colocation providers allow organisations to rent space for servers and other computing hardware at whatever scale they desire. The beauty of colocation is that providers' efficiency-driven business models already benefit from, and are thus driving, AI. There has also been the rise of Edge data centres. These are smaller set-ups which are geographically dispersed, allowing hardware and data to be optimally placed near to need. Rather than being single entities, these Edge sites help build a sizeable cooperative computing fabric by combining with central data centres or cloud computing. And, best of all, a topology like this, which provides numerous rich inputs and controls for optimisation and availability, is best managed by AI.

Developing AI

A data centre is an ever-changing environment, and as it grows so does the need for AI to evolve with it. As such, there are several areas where AI is being applied in data centres today:

- Optimising energy usage by managing the numerous types of cooling (room, row and rack) with great precision.
It is not uncommon for different cooling systems to conflict with each other through their continual feedback and optimisation algorithms; AI provides an ideal mechanism for managing this complexity. Some of the best and most intriguing examples use weather algorithms to help predict and address hot spots in the data centre.
- Optimising availability by accurately predicting future application behaviour down to the rack and server, so that workloads are pre-emptively moved within or across data centres based on future power, thermal or equipment behaviour.
- Multi-variate preventative maintenance, delving into the component level within equipment to predict failure and nip it in the bud.
- Intelligently managing alarms and alerts by filtering and prioritising significant events. A common problem in data centres is dealing with chained alerts, making it difficult to address the root cause; AI, when coupled with rate-of-change, deviation or similar algorithms, provides an ideal mechanism to identify critical alerts.
- Optimising IT equipment placement by forecasting future states of the data centre rather than simply the current configuration.

Although AI has countless benefits and has firmly put its foot in the door as a trend within data centres, two points are critical for its continued success and development. Firstly, AI thrives on large and rich data streams; it's imperative that the right systems are in place to collect and compile this data across all parts of the data centre, from IT systems to applications to the critical infrastructure. Secondly, expectations need to be set for what AI will deliver, especially around autonomous control. Amongst the many benefits of AI, one of the main ones is real-time analysis of data streams; not following through with this can play a large part in limiting many of the advantages AI technology provides. This is not an issue of renouncing control but rather of putting the appropriate management systems in place to achieve the full benefit of the technology while still setting limits and boundaries. Data centres present an ideal use case for AI: complex, energy intensive and critical, with a very large set of inputs and control points that can only be properly managed through an automated system. This is an ideal breeding ground, and with ever-evolving innovations in the data centre, the need for and benefit of AI will only increase.

### FinTech Services | The future of B2B and banking

Are FinTech services the future of cash flow? The results of a wide-reaching independent survey of 4,000 UK businesses, carried out to find out how companies manage their cash flow, have shed significant light on an area that is reported as a key challenge for businesses across the UK this year. The results of the survey, analysed in the Soldo Spend Management Whitepaper, reveal costly inefficiencies in cash flow cycles, with up to a staggering £102.6 billion of company spending collectively left unreconciled every year by 18% of businesses. Heavy administrative burdens, multiple spending channels creating complex accounting methods and a lack of spending autonomy for employees are contributing to significantly reduced productivity and ultimately costing businesses dearly when it comes to tax returns and reconciliation of spend. The report also highlighted an opportunity for UK companies to utilise new technology and shred the hours that are currently reserved for managing spend, while simultaneously increasing control over company money.
Your business could be draining a huge amount of unnecessary time on locating lost cash and figures. Time is money, and both could be spent more effectively elsewhere. The problem often stems from how cash flow within a company is monitored. Where responsibility lies for how money is allocated and distributed throughout a business can also contribute to a book-balancing headache. Developments in the FinTech sector have paved the way for companies to utilise financial technology to streamline the process and accelerate company growth. However, although there has been some positive improvement, there are still some major flaws in the way businesses conduct their company cash flow. Put simply, the way you choose to allocate company spending within your business is imperative to its success or failure. Invoicing and receipts have been the most common way for companies to monitor spending. But is this the most financially savvy way to keep on track with the way your business is using its money to benefit the company? On the surface, this method may seem foolproof and at worst leave a small percentage of time and money spent on reconciling any mistakes that are caused by poor cash management. However, look a little deeper and the problems become clear. The loss of money through simple mistakes, when invoicing and cashing receipts, can add up to amounts that have the potential to affect your entire business. Forgetting smaller items purchased through the business can be easy, but they can very quickly pile up. Make it effortless for yourself to keep track of receipts by using a prepaid card. Soldo provides a simple solution for this by keeping evidence of any receipts and spending incurred, automatically logging and reporting them in its smart receipt capture software. The YouGov report revealed that nearly half of businesses (45%) stated the level of control over company spending is one of the biggest challenges to businesses in 2018. A further 22% of companies were willing to leave a percentage of company spending unreconciled at the end of each month. The results show a clear correlation between time and control over company spending and how this impacts the way a business is run. With the vast range of FinTech services available, the regulation of company spending should not be an issue you expect your business to encounter this year. 44% of companies who took part in the survey agreed that if employees were authorised with a spending card it would increase trust in the business itself, and nearly half (47%) said it would increase employees' overall trust in a company's financial process. More than a quarter (27%) of financial decision-makers in companies with between 50 and 249 employees believe that if employees had a spending card that could be fully controlled, where expenses were reported seamlessly and spending could be monitored in real time, they would spend more responsibly on behalf of the business. The evidence collected from the large-scale survey highlights the importance of financial trust within a business. Taking advantage of FinTech services can crucially improve the way company spending is reported, freeing up a vast amount of time that your business could use in a much more creative and constructive way. This is where FinTech services and developments step in to fill the gaps that current traditional banking systems are not addressing when it comes to business financial services.
A lot of B2C FinTech providers are trying to compete with high street banks and traditional retail banks for the consumer market. However, the B2B space is quietly building a strong platform that banks will find hard to turn down. FinTech services aimed at B2B clients are seeing a real opportunity for innovative and indispensable ways to work alongside banks, offering complementary services that sit beside business banking services. In short, B2B FinTechs are enabling banks to stay modern in an age where businesses rely heavily on digital innovations. The banks that acknowledge and adopt this now will gain the trust of their business clients in the long run. Take time to research which FinTechs can benefit your business the most - it's a sector that's here to improve your business in more ways than one.

### Netskope | Partnership with Cylance selected to enhance cloud security

Company's industry-leading approach to threat protection combined with Cylance's predictive capabilities provides end-to-end threat protection to Netskope customers. Netskope, the leader in cloud security, today announced a product partnership with Cylance, the global provider of AI-driven, prevention-first security solutions. The partnership provides Netskope customers with the additional benefit of Cylance's AI-driven threat detection throughout the Netskope Security Cloud. In today's connected enterprise, cyber threats are more complex than ever, and many evolving strains of polymorphic malware go undetected by signature-based antivirus approaches in cloud and web-based environments. As a result, businesses can't fully protect themselves against new, unknown threats, leaving their most critical assets at risk of being compromised. Cylance's AI-based prevention technology adds an additional layer of protection that augments the already-robust malware detection, user-behaviour-based anomaly detection, advanced heuristic analysis, and dynamic sandbox analysis of Netskope Threat Protection. Businesses benefit from heterogeneous detection approaches and a defense-in-depth strategy, coupled with the deployment simplicity of the Netskope "one cloud" infrastructure, to accurately detect zero-day and unknown threats across all of their cloud and web security use cases for any files and data inspected. "With the increasing speed and complexity of cyber threats targeting enterprises today, it's imperative for every business to choose a smart security solution that can get ahead of and identify malicious activity before sensitive data is compromised," said Amol Kabe, VP Product Management, Netskope. "By partnering with Cylance, we're supercharging Netskope's threat protection stack, providing customers with a single security platform that offers protection from the fast-moving cyber threat landscape with the highest efficacy in one security cloud." Netskope customers can achieve comprehensive protection of all user activity through the granular visibility and user activity context across SaaS, IaaS, PaaS, and web transactions provided by the patented Netskope Cloud XD technology. Combined with Cylance, this holistic security solution enables enterprises to improve detection efficacy and reduce operational overhead for security teams. "Our AI-based security technology, coupled with our prevention-first approach to cybersecurity, has permanently reinvented how businesses address endpoint security today," said John Giacomini, executive vice president of global sales, Cylance.
“Together, Cylance and Netskope provide modern enterprises with end-to-end threat protection capable of combating both known and unknown malware strains before they affect business performance.” The Netskope Security Cloud helps the world's largest enterprises take full advantage of the cloud and web, eliminating blind spots by going deeper than any other security solution to quickly target and control activities across thousands of cloud services and millions of websites. With full control from one cloud platform, customers benefit from comprehensive data protection that guards data everywhere and advanced threat protection that stops elusive attacks.

### Tech of the Week #11: Contactless cash withdrawals using an ATM and NFC

We use our mobile phones for a lot of things nowadays, from keeping up with social media to projecting augmented reality. In recent years, we've also started using them to pay for our groceries at the shop. Yes, it could be argued that our mobile phone is becoming the most important item we own, promoting itself from a mere gadget to a personal accessory. You could even say it's becoming more important than our wallet. Especially now that you can make contactless cash withdrawals.

Where are contactless cash withdrawals available?

Whilst perhaps not a totally new idea, it's not one I'd heard of until now and it's certainly becoming a more common practice with banks. US finance giant Chase Bank is now offering the ability to withdraw cash with just your mobile phone at nearly all (15,000) of its 16,000 ATMs. For those in the UK with a Barclays account, you can make contactless cash withdrawals. Australia and New Zealand Banking Group (ANZ) has you covered if you're in Oceania. Interestingly, ANZ offers a "Tap & Pin" service, too. Instead of inserting your card into the machine, as you usually would, you simply tap your contactless card and enter your PIN.

How does it work?

Barclays, which we're looking at in this example, uses your Android mobile phone. You'll need the Barclays mobile banking app, with mobile banking already set up on it. You then turn on a feature called "Contactless Mobile" and you're good to go. By using Near-Field Communication (NFC), which is available on most, if not all, modern smartphones, the app interacts with the touch point on the ATM. At this stage, you enter your PIN into the ATM as usual and proceed to all of the options you'd have with your bank card, including contactless cash withdrawals. (A simplified sketch of this flow appears a little further down.) Chase offers the contactless feature across Apple and Android devices through their respective "smart wallets": Apple Pay, Google Pay and Samsung Pay. ANZ offers the feature if the user has an Android phone with the ANZ Mobile Pay app installed. It's important to note that transactions carried out in this way, i.e. to withdraw cash, are not restricted by the usual contactless payment limits (up to £30 in a single transaction in the UK). It appears the reason native bank-app contactless withdrawals aren't noted to work on Apple devices is that, whilst iPhones do have NFC capabilities built in, they can't be used to read NFC tags without a third-party app.

The benefits

Time savings aside, the main benefit will be avoiding having your bank card cloned by tampered ATMs. These smart ATMs provide fully encrypted transactions to help protect your assets. It's also backed up by a money-back guarantee, should something go wrong. The other main benefit is that no bank card is needed to withdraw money.
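Returning to the flow described under 'How does it work?', the sketch below models the sequence in heavily simplified Python: the phone passes a token over NFC, the PIN is entered at the ATM as usual, and the bank authorises the withdrawal. Every class, method and value here is invented for illustration; real implementations live inside the banks' own apps and ATM firmware and rely on strong, single-use cryptographic tokens rather than the plain dictionary shown.

```python
from dataclasses import dataclass


@dataclass
class PhoneApp:
    """Stand-in for a mobile banking app with a 'Contactless Mobile' style feature enabled."""
    account_id: str

    def nfc_token(self) -> dict:
        # In reality this would be a cryptographically signed, single-use token.
        return {"account_id": self.account_id, "scheme": "contactless-mobile"}


class Bank:
    """Stand-in for the bank's authorisation service."""

    def __init__(self, pins: dict, balances: dict):
        self._pins = pins
        self._balances = balances

    def authorise(self, token: dict, pin: str, amount: int) -> bool:
        account = token["account_id"]
        return self._pins.get(account) == pin and self._balances.get(account, 0) >= amount


class ATM:
    def __init__(self, bank: Bank):
        self.bank = bank

    def contactless_withdrawal(self, phone: PhoneApp, pin: str, amount: int) -> str:
        token = phone.nfc_token()                      # 1. phone tapped on the NFC touch point
        if self.bank.authorise(token, pin, amount):    # 2. PIN entered at the ATM as usual
            return f"Dispensing £{amount}"             # 3. cash dispensed, no card involved
        return "Declined"


bank = Bank(pins={"acct-001": "1234"}, balances={"acct-001": 250})
atm = ATM(bank)
print(atm.contactless_withdrawal(PhoneApp("acct-001"), pin="1234", amount=50))
```

Because the PIN is still entered at the machine, the withdrawal is not bound by the £30 contactless payment cap mentioned above.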
Say you've left your wallet at home but need cash for something important that just can't wait: well, now you can get it. If you don't feel like taking your wallet out on a night out for fear of it being stolen or just getting in the way, now you don't have to.

The cons

NFC has repeatedly been given bad press in the past for its susceptibility to being hacked. We've all heard the stories of someone "bumping into" someone else and transferring money from the unsuspecting victim's bank to the attacker. NFC has a usable range of about 10cm (4 inches), so whilst the above scenario was, and likely still is, a possible threat, it is much less of a threat when withdrawing cash. If you were standing at the ATM, you would soon notice someone coming that close to interact with your device in a harmful way. As with everything, there's still a possibility, but not a large one. With banks offering a money-back guarantee, it's a largely safe procedure to use.

### IT Equipment | Companies risk missing top talent

Companies could risk losing out to competitors during the recruitment process due to workers' expectations about IT equipment and technology. According to a survey of 1,004 UK workers in full or part-time employment carried out by Probrand, as many as 51% would reconsider accepting a job offer if they discovered the technology they would be working with was not up to their standards. A further 27% claim that they have had to raise the software or hardware they were using as an issue with a previous line manager. Of those who said this would put them off accepting a job, older devices (which can sometimes operate slowly) were identified as the biggest issue, with 71% claiming this as a sticking point. More than 1 in 2 (53%) said out-of-date or archaic software would be an issue. For 39%, slow internet speeds would be a problem worthy of reconsidering their choice of employment, while 27% felt having to use an operating system they were not familiar with was a problem. Matt Royle, marketing director at Probrand, said: "It's amazing to see the pivotal role IT equipment is playing in the decision-making process when people are deciding whether or not to accept a job offer. However, given how heavily reliant on technology we all are, coupled with a need for flexible and mobile working, it's not entirely surprising. The evidence shows having to cope with outdated or slow running technology, or an unfamiliar operating system, can cause huge issues with regard to employee satisfaction and productivity. "Clearly, employers need to carry out regular audits of their IT equipment and, if necessary, update it to ensure new starters are not being presented with old or out-of-date technologies. This is in business leaders' interests, as poor technology can have a significant impact on workforce productivity levels." According to the data, workers have high expectations of the equipment they have access to for work purposes day to day. The 24/7 nature of communication in workers' personal lives seems to be transferring through into their professional lives, as almost 1 in 3 (31%) expect to receive a company smartphone. Meanwhile, devices which allow more flexible working are key for 42%, who expect to have access to a company laptop.
The top five sectors most likely to be put off accepting a new role based on the technology:

1. Information and communications – 64%
2. Creative and photographic – 58%
3. Media and marketing – 56%
4. Admin and supportive – 51%
5. Professional services (law, accountancy) – 48%

### N4Stack | Node4 launches 'N4Stack' services for Microsoft Azure to help companies get more value from the cloud

Derby, UK – August 7, 2018 - Node4, the cloud, data centre and communications provider, has launched N4Stack, a range of services enabling Microsoft Azure users to move to a highly automated Infrastructure as Code (IaC) model across data operations, DevOps, security and SysOps functions. N4Stack will integrate Azure SQL, CosmosDB, Azure Active Directory and more to provide customers with a seamless hybrid service model across both Node4 data centres and Azure. New DevOps-aligned services and the adoption of Microsoft Azure offer a comprehensive portfolio of transformational services. N4Stack is designed to overcome the challenge faced by many Managed Cloud Providers who try to port outdated service models from their own data centres into Azure, without embracing its full range of capabilities. Adopting an IaC approach and building teams which include DevOps, data and security experts allows customers to build new services that are cloud-native and add real value, especially to Azure developers. The launch of N4Stack will allow Node4 to extend its relationship with Azure customers as they build new applications to take advantage of Azure Platform as a Service (PaaS), AI and cognitive services. In addition, Node4 will work with Microsoft to modernise its service capability with a focus on building new competencies and enabling customers to take advantage of Microsoft Azure services. Paul Bryce, Chief Commercial Officer, Node4, commented: "It's key that we provide our customers with a deep and scalable service capability to address a digital transformation agenda. The N4Stack portfolio showcases the skills of the Node4 technical teams across data operations, security, DevOps, cloud and networking and sets a real benchmark in service innovation. N4Stack has been a catalyst for us to re-think our service models and invest heavily in new technologies, embracing automation and a DevOps-aligned approach across all our platforms." Mark Smith, Senior Director, Azure & AI, Microsoft UK, added: "Microsoft Azure enables Node4 to develop and deliver a hybrid solution that allows customers to take advantage of a host of new digital services, available on-demand and at scale. We are excited to see the value that the N4Stack services bring to Azure, especially in the Data and DevOps space." More information about N4Stack can be found here.

About Node4

Node4 is a UK-based Cloud, Data Centre and Communications specialist that is dedicated to serving its customers to ensure that they benefit from the most effective and flexible application of technology. Since 2004 Node4 has achieved great success and growth based on its focused customer service, market-leading customer retention and comprehensive service offering. In addition to offices in Reading, Nottingham and London, Node4 owns and manages Data Centres located in Derby, Leeds and Northampton, as well as having dedicated space in a Slough Data Centre.
The Data Centres are connected using Node4’s national fibre network, recently upgraded with a multi-million-pound installation of DWDM, which includes points of presence in Manchester and London, as well as interconnects to major UK carriers. Using this infrastructure, Node4’s offerings include Cloud, Colocation, Connectivity, SIP, Hosted Unified Communications and Managed Services. For more information, click here. ### Digital Transformation | Six Degrees and Hitachi Consulting Interview A Compare The Cloud interview with Thomas Konopka, sales director, Six Degrees and Michael Foote, business development director, Hitachi Consulting. Join them as they talk about digital transformation. David Organ: I'm really pleased to be joined today by Michael Foote, from Hitachi Consulting, and Tom Konopka from Six Degree. Guys, hello. Welcome. Now we're here today to discuss digital transformation, both in the public sector and the private sector. So first off, can you tell me a little bit about Six Degrees? Tom Konopka: Sure. So, in a nutshell, Six Degrees are a cloud service provider providing hybrid cloud services. We have a sort of a springboard to the cloud approach, and our vision is really, really simple. It's all about enabling our customers' brilliance. David Organ: And of course, we're here, Michael, with you from Hitachi Consulting. So, can you tell me what the relationship is between Hitachi Consulting and Six Degrees? Michael Foote: Absolutely. So, Hitachi Consulting is the consulting arm of the Hitachi Group. Very much focused on application solutions around Oracle, SAP, and Microsoft. Working together with the Six Degree group to provide a springboard to take people to the cloud using their environments, their communications, to allow our customers to experience the cloud quickly and effectively. David Organ: So, digital transformation. Looking at the public sector first, what sort of problems do they face with their digital transformation journey? Tom Konopka: I think, at times, it's just scale of it. And secondly is really about delivering projects on time and actually delivering real value for the money that they're actually spending. At times, they really need some help about being led, almost like a Sherpa type role, and then working with organizations that provide that agility and flexibility in helping to deliver those projects. David Organ: And in terms of some of the challenges, Mike, can you sort of enlighten me a little bit on the specific challenges that some public-sector bodies face? Michael Foote: Yeah. What we tend to find is, with a lot of the digital transformations, it's about knowledge; it's about information; it's about support; it's about understanding what this new world is and what this new world can do. And again, working with Six Degree Group, we can bring a degree of understanding and a degree of knowledge together and jointly help the customers to succeed. David Organ: And, of course, a lot of these organizations, they have big estates, don't they Tom? I mean, how do you transition, or even go about dealing with such a large estate, when you're moving them, say, to a cloud environment? Tom Konopka: I think a lot of it is around education. And actually, you know, there is no one-size-fits-all model. It's a journey. You have to collaborate with partners. We all have our own skill sets and our own experiences that we can share and actually educate, and take those organizations on that journey and really enable them. 
You know we call it 'springboard' for a reason, which is actually to help catapult people forward. David Organ: And it's not just public-sector bodies. I mean, you bring experience from all sorts of industries with Hitachi Consulting. Michael Foote: Absolutely. Yeah, and I think that's important. We're working in two specific areas because it's right for us to build a relationship that's effective, and works, and brings success to our clients. So, we're focusing very much on the public sector area initially, with manufacturing as a supporting area, because we have a number of joint customers in that space where we can bring a degree of consulting, a degree of understanding and the application side. That way, in terms of roles and responsibilities, we can work effectively together to bring that success. David Organ: It's a great story. Guys, thank you so much for joining me and telling me a little bit about your experience with digital transformation. Michael Foote: Thank you. Tom Konopka: Thank you. For more info on Six Degrees, click here. For more info on Hitachi Consulting, click here.

### Cloud-based identity | Online security

At the IT Director's Forum earlier this year, the main issue for directors across all sectors was online security. One aspect of security which is becoming an increasing concern as they move IT services to the cloud, and hence a significant source of risk, is cloud-based identity. For most organisations, adopting cloud will be a gradual process, and the majority will find themselves managing a hybrid solution from multiple providers combined with some in-house provision. Each service will have different authentication requirements, and the challenge is to know who is accessing each service and to independently authenticate them, whilst ensuring that security is maintained. Multiple systems mean, of course, that users will have to work with multiple, more complex passwords. Every individual will have their own way of handling this, but all too frequently the result is passwords on Post-its stuck on the monitor or office wall, reusing the same passwords, or avoiding logging out completely. Most organisations have implemented policies to try and eliminate this type of behaviour, but it persists, leading to increased security and compliance risks. Other users forget their passwords and have to repeatedly call the Help Desk for resets. We have carried out surveys which found that some 25% of Help Desk calls logged are due to password problems. The ideal solution is secure single sign-on, which would reduce security and compliance risks while increasing productivity and reducing costs. Many organisations have tried and failed to successfully implement such capabilities in the past, mainly due to complexity. However, the cloud now offers a solution, as it can provide an authoritative source of identity to authenticate against almost all IT services available today, including corporate, PSN, N3, web, cloud, internal and hosted systems, while providing secure access from any location. This minimises the time and complexity of brokering authentication and access to cloud services, simplifying the user experience while reducing security and compliance risks and user support costs. It makes secure single sign-on to all key corporate systems from any location both possible and affordable. Cloud identity authentication works by providing a central account or identity and provisioning this into target systems, e.g. Active Directory, SAP, SharePoint etc.
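As a rough illustration of that provisioning model, the sketch below shows one authoritative identity being pushed into several target systems, each of which receives only the entitlements mapped to the user's role. The connector classes, role mappings and attribute names are invented for the example; a production service of this kind would provision through each system's own APIs and rely on standards such as SAML for the sign-on itself.

```python
from dataclasses import dataclass, field


@dataclass
class CentralIdentity:
    """The single authoritative account held in the cloud identity service."""
    username: str
    role: str
    attributes: dict = field(default_factory=dict)


class TargetSystem:
    """Stand-in for a downstream system such as Active Directory, SAP or SharePoint."""

    def __init__(self, name: str, entitlements_by_role: dict):
        self.name = name
        self.entitlements_by_role = entitlements_by_role
        self.accounts = {}

    def provision(self, identity: CentralIdentity) -> None:
        # Each target system receives only the entitlements mapped to the user's role.
        self.accounts[identity.username] = {
            "entitlements": self.entitlements_by_role.get(identity.role, []),
            "attributes": identity.attributes,
        }


def provision_everywhere(identity: CentralIdentity, systems: list) -> None:
    """Push the central identity into every connected target system."""
    for system in systems:
        system.provision(identity)


systems = [
    TargetSystem("ActiveDirectory", {"finance": ["vpn", "file-shares"]}),
    TargetSystem("SAP", {"finance": ["invoice-approval"]}),
    TargetSystem("SharePoint", {"finance": ["finance-site-member"]}),
]
user = CentralIdentity("jsmith", role="finance", attributes={"department": "Finance"})
provision_everywhere(user, systems)
print({s.name: s.accounts["jsmith"]["entitlements"] for s in systems})
```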
This central identity manages user authentication and entitlement (tailored to each user's role) and compliance. It allows single sign-on to web services and access to on-premise applications from any location, and enables the system to act as an identity provider (IDP) for cloud/extranet services using SAML. Multi-factor authentication, such as security tokens or challenge-response systems, can be incorporated for extra security. A key feature of this type of system is user self-service. All available applications and services are published to a portal and users can then select the applications they need and put them into a 'shopping basket' for approval. Configurable workflows through the portal allow authentication and access processes to map to the way an organisation works, streamlining approval. Users can also securely reset passwords without access to any service desk. A cloud-based identity and authentication management system offers three key benefits. First, it enhances application security by externalising authentication and authorisation to applications, web resources, web services and data. This protects systems from direct exposure. Multi-factor authentication can then be added to provide an additional level of security. Second, having a single secure login standard and basing access to all systems on established policies and audited practices eliminates non-secure user practices and ensures that all systems have compliant authentication levels. By providing complete visibility into identity and access management and providing a formal audit trail, it can also help organisations achieve and maintain compliance. Third, by providing user self-service for routine issues, single sign-on can increase productivity and reduce costs, freeing up Help Desk staff to work on other issues. Fordway recently provided a cloud-based identity management service to a government organisation which wanted a centralised authentication system to provide secure single sign-on to all corporate systems from any location, facilitating remote and mobile working, whether the systems were hosted internally, in the cloud or by third parties. Our cloud-based Identity and Authentication Management Service (IDAMS) gave them a single integrated system through which they could manage identity, role and IT service management in line with their security policy while providing user self-service for routine issues. In my opinion, identity and authentication management should be the cornerstone of a hybrid cloud strategy. Organisations need to manage identity across multiple providers, and cloud provides them with a secure solution. Clearly, any cloud-based identity authentication solution is only as good as the hosting company's own cloud security. However, most cloud service providers implement and manage considerably better IT security controls than internal IT departments. Single sign-on does not, of course, absolve an organisation of responsibility for security and compliance. Organisations still need to maintain an authoritative source of digital identity which can be used as collateral for all generally available web services. However, it offers significant security and productivity benefits, and by using standard SAML protocols can reduce the total cost of integration for new applications.

### Cloud Service Providers | Which questions should you be asking your CSP?

For most organisations, it is no longer a question of whether to adopt cloud, either wholly or for specific services, but which services to move and when to move them.
Having architected and sized the chosen services, they then need to select the most appropriate cloud service providers. If they have done their preparation correctly, the choice of platform is almost immaterial, as these days almost all the technology is pretty good. This does not mean the choice of provider is straightforward. While most of the public cloud service providers are very capable, there are significant differences in everything from design criteria, billing models and contractual terms and conditions to SLAs and data recovery terms. There is also a multitude of providers offering different types of managed cloud service, perhaps to host legacy services until an appropriate public cloud service becomes available. These, too, have their own T&Cs, SLAs etc., as well as their own legal jurisdictions where data is held. These are important considerations when ensuring services are GDPR (General Data Protection Regulation) compliant. It is vital to know exactly what your organisation is signing up for to avoid problems in the future, so we have developed a checklist which we have been using with our customers to help them compare different options. As we help them define and negotiate cloud contracts, this has been refined to address the most common pitfalls and misunderstandings and to help them make a realistic evaluation and comparison between different cloud service suppliers.

Availability and usage

The first consideration is whether your service requires persistent (reserved), non-persistent (on demand) or metered instances. This will depend to some extent on whether it is required 24 x 7 x 365, but there are other considerations too. Most applications require additional systems such as login/authentication, network etc., which need to be powered up beforehand, so a 9-5 requirement quickly becomes 7-9. Shutting down and restarting has to be sequenced, and some employees will want access outside core hours, so one of cloud's specific cost-saving capabilities is potentially less useful than it could be as 9-5 quickly becomes 24×7. With availability decided on, you need to ask potential providers whether their service offers this and how cost effective it is. It does not happen very often, but AWS' terms and conditions allow them to shut down on-demand instances without any reference to the client. If there are specific times that your service must be available, you need to know whether the provider will ensure these within a non-persistent service. With metered services, ask your cloud service providers what guarantees they will give that all capacity is available even if not used, and find out what actually constitutes usage. Several applications generate keep-alive packets to ensure availability, and these can be used by providers offering metered instances as the basis for charging even when services are not actually being 'used'.

Optimisation and granularity

Different cloud service providers handle charging in different ways, so it is vital to understand the characteristics of the service being migrated. Will general-purpose instances suffice, or are compute-, memory- or storage-optimised instances needed? Costs vary dramatically within an individual supplier's offerings as well as between providers. For example, Microsoft Azure has five storage options, each with different dependencies. All need to be understood, compared and evaluated when choosing a service. More generally, you need to find out what is included in the charging model and what is an extra.
If an extra, how is it charged and what is the likely impact on overall charges? For example, for an IaaS instance in AWS, there is a minimum of five and potentially eight metered costs that need to be tracked for a single Internet-facing server. Azure and other public cloud service providers are comparable. The complexity increases if your organisation is hosting multiple server environments and if other elements are required to run the application, such as security, resilience, management, patching and back-up, which will appear as additional charges. This is less of an issue with SaaS, which usually has a standard per user per month charge.   Security considerations Maintaining security of cloud services is crucial. First, consider the security classification/business impact of the data within the service. Does this mandate physical location awareness and, if so, where will your organisation’s data be stored? What security, access, audit and compliance controls need to be in place and can the provider guarantee them? If so, how – self-certification or independent testing and validation? Then consider how the potential supplier operates. If they adhere to recognised security standards, they should be able to prove that they have the relevant controls in place. If not, you need to find out how they will guarantee that their infrastructure is secure and patching is up to date. Providers which have to meet public sector requirements will be regularly audited and tested by independent external providers to ensure that they meet the latest security standards and will have tested and audited procedures for dealing with any security incidents. Your organisation is responsible for asking your chosen cloud service providers to deliver the appropriate levels of information security and you need to measure and audit the provider to ensure this is applied. This is particularly true with IaaS, less so with PaaS and SaaS. Irrespective of who hosts the data, under both the Data Protection Act and GDPR your organisation retains responsibility for the security of its data.   Resilience Resilience is another area where it is important to look under the bonnet to find out what is really being offered. You will be charged for transferring data between domains, so to understand costs you need to know the frequency and size of snapshots and the rate of change of data. If the standard offering does not meet your organisation’s requirements, additional resilience may be available – but what exactly is offered and what are the costs? You should also examine services guarantees closely and find out what compensation is offered if these are not met. A major loss of services such as a data centre failure, security breach or other outages, or even reduced performance, could create significant issues for your business. Under most public cloud service SLAs, the cloud provider will apologise and refund a proportion of the monthly service fee. This recompense covers a very small proportion of the disruption you may have incurred, so needs to be evaluated carefully before moving a business critical service. You should also consider whether to have primary and recovery services, where applicable, hosted by the same supplier and whether you have or need an independent backup to restore from in extremis.   Cloud service management, processes and contracts Look into the details of how the service is run. If operational management is via a portal, find out how the supplier handles escalation and service updates. 
What processes do they use for Problem Management or Major Incident Management, and do they have SLAs? You need to be confident that the way the supplier operates fits the way your organisation needs to operate. With public cloud, you are unlikely to be able to persuade providers to revise their processes to suit your organisation, so you will be better off talking to private and virtual private cloud service providers. Making changes to standard terms will always impact on costs, so you need to decide if the business benefits are worthwhile over the contract term. You also need to consider contract flexibility – in particular, whether there are exit or data transfer costs should your organisation wish to switch suppliers.

Cultural fit

Finally, think about the cultural fit between your organisation and a potential provider. This may seem trivial, but your organisation is potentially entering into a multi-year agreement which will impact the services it offers its end users. It helps to ensure that all parties are aligned before committing to any agreements.

### Remote Working | It's time to implement

Organisations that do not have the right infrastructure in place to support remote working will simply be outpaced by savvier competitors. This is according to M-Files, commenting on new research from Ingram Micro Cloud and Microsoft, which revealed that 60 per cent of under-35s value the ability to work remotely over generous holiday allowances. Increasingly, organisations are adopting mobile, remote and flexible working, capitalising on numerous benefits including higher staff productivity levels. But to achieve these benefits, the right tools and technologies must be in place. Under-35s can confidently use cloud-based collaboration, file hosting and sharing tools, and take a dim view of employers which are unable to provide this. According to Tim Waterton, Director of UK Business, M-Files, to satisfy this demand for remote working, organisations need to think smarter about implementing the right technologies: "In the pursuit of a better work-life balance, under-35s are actively seeking flexible and remote working practices, but whether they can find them from their employers is another matter. Not only is it a case of whether an employer offers this, but it's also how. In theory, it should be simple, intuitive and seamless to access work remotely, but we know this is not the case. In fact, a survey we ran ourselves (M-Files) across 250 IT decision-makers revealed that 90 per cent found it at least somewhat challenging to search for and access documents when working remotely." "It's imperative that staff are able to have the same easy access to documents when they're out of the office as they do when at their desks. This will only grow further as digital natives continue to populate the office. "Unfortunately, our research demonstrates that many organisations just aren't providing employees with the means to do this. Firms unable to provide the right infrastructure to support remote working risk losing employees to savvier competitors who can deliver this in a straightforward manner. "To remedy these issues, organisations need to improve the infrastructure that is currently in place. Intelligent information management solutions, for example, can be leveraged to make the management of information much more efficient.
Metadata-driven information management solutions allow organisations to simplify how staff access, secure, process and collaborate on documents." Waterton continues: "Intelligent information management solutions can enable employees to access and manage business information anywhere, using any device, regardless of where that information is stored, in an easier and faster manner. Remote working becomes easier as employees can access information without needing to be connected to the VPN or company network, as well as accessing and changing documents while working offline." "Additionally, this approach can improve security defences – an effective information management solution means employees working remotely are less likely to turn to consumer-level applications to share information. Because of this, IT departments can retain visibility and control over where business information is and how it is being used, which significantly reduces the risk of data breaches. "Millennials and Centennials are often thought to be the driving force behind changing workplace practices – and are often seen as having unreasonable and unrealistic expectations. This is simply not true. Work is increasingly becoming something you 'do' rather than somewhere you 'go'. By enabling employees to access the information they need to do their jobs from anywhere, using any device - in a secure, controlled fashion - companies can ensure that they have the infrastructure in place to support the demands of digital natives," Waterton concluded.

### Telia zone | Smart Cities, Graph, And IoT

The Internet of Things (IoT) hype may have slowed down, but the number of connected devices is still accelerating: a Gartner estimate placed the number last year at 8.4 billion, growing to 20.8 billion within two years. The problem: how to tie these devices together and make the vision of smart cities and other IoT use cases a reality. So far, the answer to that issue has proved elusive. Introducing the Telia Zone.

The consensus around graph is building

According to many technology analysts, the missing element is the emerging non-SQL data approach coalescing around graph databases. Graph technology is a proven data model, used initially by Google, Facebook** and LinkedIn, that's great at linking multiple elements together, at scale, and finding interesting relationships that other data software formats, like relational, struggle to support. Andy Mulholland, VP & Principal Analyst at Constellation Research, has gone on record, for example, to state that, "The IoT revolution is a revolution that depends totally on graph technology. It's simply not a feasible answer to go beyond piloting with anything else. It has to be a graph." Other analysts agree: Alex Woodie, Managing Editor of Datanami, says "One of those key enabling technologies [of IoT] is graph databases." Tony Baer at Ovum: "Graph technology will allow the Internet of Things to be represented transparently… without the need to force fit into arbitrary relational models." And finally, Matt Aslett, Research Director at 451 Research, comments: "Graph databases offer the potential to store and analyse data from the IoT".

How will this work in practice?

A graph can be used to better understand the relationships between data points, such as IoT devices — and those devices and the machine-to-machine relationships a smart city or network rely on are easily captured in a graph data model, say practitioners.
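As a concrete, if heavily simplified, illustration of capturing those machine-to-machine relationships, the sketch below uses the Neo4j Python driver to record devices and the typed, weighted connections between them, then asks which devices sit downstream of a given one. The connection URI, credentials, labels, properties and device names are all placeholders invented for the example.

```python
from neo4j import GraphDatabase

# Placeholder connection details for the example only.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))


def register_connection(session, source_id, target_id, link_type, quality):
    """Record two devices and the typed, weighted connection between them."""
    session.run(
        """
        MERGE (a:Device {id: $source})
        MERGE (b:Device {id: $target})
        MERGE (a)-[c:CONNECTS_TO]->(b)
        SET c.type = $link_type, c.quality = $quality
        """,
        source=source_id, target=target_id, link_type=link_type, quality=quality,
    )


def downstream_devices(session, device_id):
    """Which devices could be affected if this one fails? A simple reachability query."""
    result = session.run(
        "MATCH (:Device {id: $device_id})-[:CONNECTS_TO*1..3]->(d:Device) "
        "RETURN DISTINCT d.id AS id",
        device_id=device_id,
    )
    return [record["id"] for record in result]


with driver.session() as session:
    register_connection(session, "traffic-light-42", "junction-controller-7", "telemetry", 0.98)
    register_connection(session, "junction-controller-7", "city-gateway-1", "backhaul", 0.91)
    print(downstream_devices(session, "traffic-light-42"))

driver.close()
```

The transport and telecoms questions that follow become traversals over exactly this kind of connected data.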
So in a transport network, the question, “What road is experiencing the most congestion?” or “Which train is going to arrive late?” becomes: “What is the impact of the problem on the rest of the network?” In parallel, with telecoms data, each time you call a new person or authorise a new device, you make a new connection. The same is true of the smart parking meters, smart traffic lights, or the cameras in the hospital driveways. When a new piece of equipment comes online, it will look around for the relevant controllers or other devices that it needs to listen to or send data to. The powering up or down of a device will make or break dozens of connections — and the natural way to represent such connections is using a graph database. In addition, connections are more than lines between entities: they each include a richness of information, such as direction, type, quality, weight, and more — all of which can be represented in a graph database as an integral part of each relationship. Finally, those smart cities will, of course, be full of smart homes, and a smart network will be needed to deliver great services to those connected energy meters, thermostats and entertainment devices. A graph is also being seen as the default platform for organising the complex networks these services will need. Take Telia, the incumbent telecommunications carrier in the Nordic market across mobile, broadband, consumer and enterprise markets, for instance. In order to stay competitive, the Telia team decided to reinvigorate an area of least innovation: their consumer broadband business.   The Telia Zone The Telia team recognized their roughly 1 million installed Wi-Fi routers as underutilised assets. Their goal was twofold: help consumers simplify their lives and help consumers better entertain within their homes. The solution: The Telia Zone. After tracking all Wi-Fi connected devices within their customers’ homes, the Telia team provides this connected dataset and a number of APIs to a partner-driven ecosystem of apps and services. The result is a smart home platform that allows consumers to pick and choose apps that meet the two goals of simplification and home entertainment. Telia Zone apps do everything from reminding consumers when they forget to lock the door after they leave for the day to let them know when their kids arrive home after school (and turning on the lights when they get there). By tracking patterns of connected data around when users leave and arrive home, the Telia Zone also offers highly accurate and relevant suggestions to other apps regarding when users will most likely be at home — information that is both elusive and salient to every home delivery service. Powering it all on the backend is a graph database, which can be deployed on-premise or in the cloud. The Telia team chose graph technology because their smart home dataset was already a graph of connected data. Also, with Neo4j’s Causal Clustering architecture, the Telia team was able to horizontally scale their operations without always knowing what given traffic loads might be. Furthermore, the graph data model allows them the flexibility to keep evolving the Telia Zone platform without breaking existing components. A connected world strewn with brilliant smart cities and networks is on the horizon — and we have found the right data engine to manage it at last.   **Thanks to Twitter user MusicComposer1, Paul Colmer for pointing out that Facebook does not use graph technology as originally stated. 
For more information on Facebook's architecture, this Quora article may better explain. ### Incident response | Adapting your cloud security Over the past year, we've witnessed several acquisitions by the big security vendors, multiple funding rounds and the beginning of cyber accelerators; the cybersecurity market is undoubtedly buoyant. The challenge of this, however, is ensuring that firms stay true to their specialism while becoming a fully-fledged part of the wider security ecosystem and implementing a good incident response. Chief Information Security Officers have made it clear: their technology investments need to both work with and enhance their other purchases. For that very reason, no vendor should or can work alone when it comes to cybersecurity. Rather, it takes a collaboration of intelligence and open technology to ensure the very best protection is put in place. For us at Cofense, while our heritage is in mitigating phishing attacks, it is absolutely vital that this is fully integrated into the wider security infrastructure; indeed, finding a live phishing campaign is amazing, but what do you do next to reduce its impact? This is where we must consider where our solution fits into the wider security infrastructure and how best to apply an incident response method to provide the best security protection. A case in point: cloud security meets phishing meets incident response We recognise the increased use of cloud computing and so wanted to take a unique look at cloud security and its connection to phishing tactics. From this, our initiative was to discover how broad and interconnected a phishing defence strategy should and could be. Cloud services have become the key to productivity across businesses in every industry worldwide. They don't require a lengthy procurement process, and the low cost for short-term use can often mean that business users download and deploy cloud applications without involving IT. Shadow IT is consequently a very real problem for the IT department, and Gartner predicts that between 30 and 40 per cent of total IT spend is unsanctioned. But what about Shadow Cloud? While this comes with its own security implications, which undoubtedly need to be investigated and managed, it also plays into the most likely way attackers are targeting victims: creative, personalised phishing campaigns. If an attacker can map cloud technologies with ease, they can use that knowledge to create more authentic phishing campaigns – from phishing a CFO with a request for a hefty wire transfer under the fraudulent guise of their accounting software, to setting up spoofed branded login pages to steal credentials for a SaaS-based file management service. It only takes a few guesses to determine what shadow IT may be in use across an organisation, and cybercriminals are using that information to tailor extremely personalised and targeted phishing campaigns, which may result in an employee handing over their login details or clicking a compromised link that grants the hacker direct access to the corporate network. Example: was that Slack workspace actually set up by your organisation? Or did the bad guys repurpose a name and logo to lure your staff into giving up credentials? If an IT team can intercept this process, however, by monitoring what cloud services are in use and gaining visibility into cloud services configured for a corporate domain, it becomes much easier to defend against.
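As a rough illustration of that kind of visibility, the sketch below checks a domain's public DNS records for hints of configured SaaS services, assuming the dnspython library is installed; the domain and the marker list are placeholders, and real discovery tooling goes considerably further than this.

```python
# A rough sketch of one way to check which SaaS services a corporate domain has
# configured, by inspecting public DNS records. The domain and the marker list
# are placeholders; real discovery tooling goes much further than this.
import dns.resolver

DOMAIN = "example.com"
SAAS_MARKERS = {           # hypothetical mapping of DNS hints to services
    "outlook.com": "Microsoft 365",
    "googlemail.com": "Google Workspace",
    "salesforce.com": "Salesforce",
    "slack.com": "Slack",
}

def discover_saas(domain):
    found = set()
    for rtype in ("TXT", "MX", "CNAME"):
        try:
            answers = dns.resolver.resolve(domain, rtype)
        except (dns.resolver.NoAnswer, dns.resolver.NXDOMAIN):
            continue
        for record in answers:
            text = record.to_text().lower()
            for marker, service in SAAS_MARKERS.items():
                if marker in text:
                    found.add(service)
    return found

if __name__ == "__main__":
    print(f"Services hinted at by {DOMAIN}'s DNS: {discover_saas(DOMAIN)}")
```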
Your SOC should know what cloud SaaS apps you have so they can be on their guard when intelligence is showing them phishing attempts masquerading as those services. The ability to predict what phishing emails might be entering your network, and being able to condition users to reduce the likelihood of them clicking a fraudulent email, is a vital part of cyber protection. A more integrated security infrastructure designed to increase visibility also enables the business to understand its risk profile, align incident response, build intelligence resources and improve preventative tactics. Recognising your role in the security ecosystem Perhaps the next-generation cybersecurity vendor – the kind that will dominate in years to come – is the one that recognises its expertise, understands its part in the security ecosystem, yet actively integrates and collaborates with those around it to provide the most complete defence possible; whether that means collaborative advisory boards, SaaS vendors, technology alliances, or its own internal team members. After all, there's no point an employee being a savvy security sensor if there is no way for them to report threats, no incident response team to analyse that intelligence and no method of responding to and disrupting the attack in progress. A truly complete defence takes a collective. ### OVH | Hiren Parekh ### Intelligent enterprise | Structure in the cloud Seven in ten organisations now use cloud computing as standard*, now that traditional barriers to adoption, such as security, have been broken down. It has become a cultural norm for enterprises to embrace the cloud to achieve business success, and those organisations need to give some structure to their intelligent enterprise, says Hugh Owen, Senior Vice President, Product Marketing for MicroStrategy. Faced with the major forces of technology, regulations, market pressures, and competition, organisations are looking to the cloud for solutions. The time-to-value, low cost, and scalability of cloud offerings are forcing organisations to review their existing data architectures and evaluate cloud and hybrid cloud approaches. These organisations are taking steps towards becoming what we call the “Intelligent Enterprise.” While many organisations may find themselves adopting cloud services in an ad hoc, unstructured way, there is huge value in taking a structured approach. It's time to map the Intelligent Enterprise. Mapping the Intelligent Enterprise So, what should this “Map of the Intelligent Enterprise” look like? This is the architecture that organisations should follow to become a truly cloud-based business, with all the competitive and operational advantages that entails. At the top of the Intelligent Enterprise, you have the organisation's most valuable assets—its people. Every member of staff is a stakeholder in the Intelligent Enterprise. These staff all have different roles and requirements from technology, so they will need the devices to access the right content at the right time, wherever they are. They will also need regular training on how to access and use cloud technology to enable them to work even more productively. Add to this layer the other constituents that work with or rely on those organisations, such as their customers, vendors, the influencers in their space, and others. Organisations face the challenge of providing the connectivity and operability to enable constituents to access the data and applications they need to achieve their business or personal goals.
To map the Intelligent Enterprise, organisations need to take four steps: 1)     Evaluate: IT decision makers need to understand how external forces are impacting the organisation. These factors need to be incorporated into the road map 2)     Catalogue: The next step is to categorise enterprise assets and identify who would benefit from access to information and enterprise systems 3)     Empower: Organisations then need to arm individuals and teams with powerful tools to explore data on their terms, while establishing a foundation for a single version of the truth across the enterprise 4)     Plot a course: Map out the people, processes, and architecture required to build an Intelligent Enterprise These steps provide the basis to create an intelligence platform to build, deploy, and maintain intelligent applications quickly. This is where applications services really come into their own. Services such as analytics, distribution, identity, collaboration, and telemetry provide constituents with the information they need. Add the ability to create quick reports and you have a compelling proposition.   Making sense of the tech stack Underpinning everything is the technology: cloud services, mobile computing, machine learning, big data, the Internet of Things, and possibly even Blockchain. In the tech stack, there are a wide range of options open to users that enable fast, simple, cost-effective data processing. These options can handle complex queries and offer globally distributed, multi-model databases to enable the storage of huge volumes of data. The growing importance of cloud computing for data storage and processes means it is absolutely essential that tools like MicroStrategy make it quick and easy for organisations to access a wide variety of popular cloud services and data sources. That’s why MicroStrategy connects simply with three of the most widely-used Microsoft cloud technologies—Azure HDInsight, SQL Data Warehouse, and Cosmos DB—to provide powerful analytics and mobility that plugs effortlessly into the broader enterprise data ecosystem. To make sense of the complex tech stack, we recommend creating an ‘Intelligence Centre’ at the enterprise that builds a set of intelligence environments with a single version of the truth with federated data sets and applications. Architects can then deploy federated applications to users to empower them to access the data they need and work even more efficiently. Organisation is key to making the most of the potential that cloud computing offers. Are you plotting a course to becoming an Intelligent Enterprise? ### Multicloud strategy | The 5 Steps to transition As businesses are undertaking a new multicloud strategy, many other companies are beginning to plot their courses from legacy to modern architectures. While some will adopt a Big Bang approach, attempting the migration in one fell swoop, successful enterprises will embark on a more measured endeavour, using the natural ebbs and flows of enterprise IT to gracefully make the transition. While every path will be a bit different, here are five steps that enterprises can follow to start building a multicloud strategy.   Make the underlying network multicloud ready Multicloud is mostly about operational changes. Making distributed pools of resources behave as one cohesive infrastructure requires over-the-top orchestration and automation, end-to-end across the entire enterprise network from the data centre and public cloud to the cloud on-ramps in the campus and branch. 
None of that is possible unless the underlying network is capable of plugging into an end-to-end orchestration platform. So enterprises looking to build multicloud networks should begin by ensuring their underlays are multi cloud-ready.  There are two things that must be true for multicloud underlays. First, they must have a rich set of open APIs that make the devices programmable. This includes standard APIs like NETCONF. Without ubiquitous support across all the devices in a multicloud architecture, the orchestration layer lacks the reach required to deliver. Second, given the critical role of automation in multicloud, devices need to support real-time streaming telemetry using standard mechanisms like gRPC.  Of course, no enterprise can rip and replace an entire end-to-end set of network infrastructure. The key is to use every refresh and expansion opportunity to make the underlay more focused on multicloud strategy.   Embrace open fabrics across the data centre and campus Network fabrics allow operators to manage groups of devices as a single unit. The automation built into these fabrics is a solid foundation for a more automated infrastructure required to operate as a multicloud.  But the objective in adopting fabrics is not merely to abstract the network. Rather, enterprises should be looking to unify their infrastructure using open protocols that support heterogeneous IP fabrics. By deploying EVPN, network teams can deploy IP fabrics in a multi-vendor world, providing a common underlay technology on top of which overlays can be managed.  By reducing the protocol diversity in the underlying infrastructure, enterprises will be simplifying the underlay in preparation for a unified underlay-overlay management solution.   Introduce controller-based management At some point within your multicloud strategy, orchestration needs to move to a central platform. While it might be possible to make the leap from largely CLI-based operations to an SDN-driven orchestration model, the reality is that such a shift is more than just technology.  The networking industry has managed networks through precise control over thousands of configuration knobs. Centralized management represents an entirely new operational model. Using intent-based management will force teams to abstract what they do. And that will require changes to both process and people.  Introducing controller-based management in either the data centre or the campus is a good way to get teams more familiar with a different mode of operating. Of course, whatever platform is selected should also be capable of operating in a multicloud environment to avoid an operational rip and replace when the enterprise is ready to manage the whole network from end to end.   Familiarize the team with public cloud workloads Multicloud will certainly involve public cloud workloads, so moving a few select applications to the cloud is an important step in the journey.  For most enterprises, a public cloud is mostly an exercise in lift and shift. That is to say that the easiest move for enterprises is to move workloads that previously ran in the private cloud over to a public cloud instance. This will not drive huge benefits in terms of either agility or cost savings—running infrastructure the same way but on a different set of resources is not transformative. However, it will allow teams to familiarize themselves with a different set of tools required to operate within the public cloud.  
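As a small illustration of that shift in tooling, the sketch below uses the AWS SDK for Python (boto3) to enumerate the VPCs a team would need to reason about before any connectivity design; the region and tag names are placeholders, and locally configured AWS credentials are assumed.

```python
# A small taste of the tooling shift, assuming AWS credentials are already
# configured locally and boto3 is installed. Region and tag names are
# placeholders; the same exercise applies to any public cloud SDK.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Enumerate the VPCs the team will need to connect back to the private
# data centre - the first step before any peering or VPN design work.
for vpc in ec2.describe_vpcs()["Vpcs"]:
    name = next(
        (t["Value"] for t in vpc.get("Tags", []) if t["Key"] == "Name"),
        "(unnamed)",
    )
    print(f'{vpc["VpcId"]}  {vpc["CidrBlock"]}  {name}')
```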
Solving for connectivity between the private data centre and the virtual private cloud (VPC) instances, for example, will allow teams to extend their orchestration platform to the cloud arena. Working with templating tools like CloudFormation or TerraForm will also give teams some exposure to tools closer to the DevOps space. Enterprises should also try multiple public clouds, which will force the thoughtful architecture of an operational model that supports policy and control across diverse environments. Ideally, the orchestration platform of record will easily extend to multiple private cloud instances, allowing enterprises to begin operating as a multicloud.   Instrument multicloud strategy Multicloud operations rely on automation in a way that traditional infrastructure historically hasn’t. The most basic way to think about automation is this: see something, do something. If an enterprise does not have end-to-end visibility, a meaningful part of the multicloud promise simply cannot be achieved.  And while automation is a long journey that will itself require careful thought, the first step of instrumenting everything in a way that it can be shared across an end-to-end infrastructure is a great first step. Enterprises should decide on the tools required to provide end-to-end visibility, along with the collection mechanisms and event-driven infrastructure required to take action once a condition is detected.   Putting it all together Enterprises need to understand that multicloud is going to be a sophisticated journey. It is essential that the path is broken down into individual steps small enough to take without overwhelming the team. While the technology is new and can seem daunting, the real changes will be to the people operating the infrastructure. The silos that have defined teams for decades will need to come down, and that requires a different way of not just operating, but also organizing.  The path is difficult, but if companies dutifully take advantage of every opportunity to make themselves more multicloud-ready, they will find that multicloud can be a natural outcome of graceful technology adoption. ### Cloud Computing | What is it? Sam Barker, from Flynet explains to Compare the Cloud what exactly Cloud Computing is. Transcript: You might have heard a lot about cloud computing in the news recently but what exactly is cloud computing, and why is it so important? Well put simply cloud computing is a system that allows me to stop storing files on my personal computer or server and instead store them in a rented server which I can access through the internet. So, for example, this is my personal computer. I do my work on it, I store my files on it, and I play video games on it. All my files are stored in my computer's hard drive - but the more files I add to my hard drive, the fuller it gets. Through cloud computing, though I'm able to store my files on a rented server and then access those files through the internet. These servers are usually owned by large companies so Microsoft, Amazon or Apple. And using the Internet I'm able to access them at any time wherever I want, whenever I want. The servers might be in one country or they might be spread across the globe. They could be in France or Germany, in Mexico or Japan. The important thing is that I can access them through the internet. Now the same principle that applies to store files in the cloud also allows me to run systems and programs in the cloud. 
So on a rented server, I'm able to run a computer program and access that through the internet. Cloud computing is also great for security and disaster recovery. Let's say I drop my phone or break my laptop. Well, I might damage or compromise the files inside - but through cloud computing, as long as I have access to the internet and my password, my files are safe and ready for me whenever I need them. Now all the benefits that apply to cloud computing in my personal life also apply to my business life. So for us at Flynet, we use cloud computing as an option for our customers through our partnership with the Microsoft Azure marketplace and the Oracle cloud marketplace. We offer the opportunity for our customers to run Flynet systems on a rented server - and access them through the internet. This frees up our customers from having to actually own all the servers that run the Flynet system - which enables them to invest in other areas of the company and grow in ways that they want to. Through our partnership with Dell Boomi, we also allow the companies that we work with to integrate their legacy systems with other areas of the business, and this enables them not just to have a great business system now but also to plot out a roadmap for the future of their legacy systems. As you can see, cloud computing is a dynamic and interesting area of technology and it's only going to become more important as the years go by. Flynet is really at the forefront of integrating legacy systems with the cloud and we're making cloud and legacy systems safer, more dynamic and more accessible than ever before. I've been Sam Barker from Flynet, thank you very much. ### Cloud ERP | 3 tech tips to maintain productivity Despite the advances in cloud ERP technology and automation over the last few years, Britain's productivity has fallen, with a report from the Resolution Foundation revealing that London's longer working hours are dragging down productivity. Key factors in this decline include a lack of adequate technology being implemented, as well as a digital skills gap. There's also a well-known tendency for productivity to fall off a cliff in the summer months, as the weather gets warmer and people are more likely to take holiday. So what can business leaders do to beat the summer productivity slump? And how can they turn a traditionally quiet time into a period of productivity? Here are three tips: 1 - Make the most of summer downtime Summer is always a quiet time of year, so make the most of the downtime to get your technology in order. Carrying out a comprehensive assessment of the technology infrastructure used within the business will allow you to see where improvements can be made. For many SMEs or growing businesses, moving data away from Excel sheets (or even paper), and into a system such as Enterprise Resource Planning (ERP), is a vital cog in the digital transformation wheel. However, it's also something that can get put to one side, as concerns about costs or disruptions to the business are pushed to the forefront. Indeed, according to a report from Bloor Research, 30 per cent of data migration projects fail, and 38 per cent run over time or over budget, so it's important to get it right. Similarly, switching to a cloud ERP vendor that's a better fit for your business can be a daunting prospect. However, whether you're moving vendors, or dipping your toe into the cloud ERP waters, the summer months are the perfect time to do it.
Similarly, businesses can also make the most of the downtime to get rid of old data or unnecessary software. Do you still need that list of customer addresses that's been hiding in a folder which hasn't been touched in years? Or, if you've moved to outsource your finances, is the recurring monthly charge for a piece of accounting software necessary? By utilising the summer months to re-evaluate or upgrade, your business will stand in good stead to tackle the winter months and see in the new year. 2 – Arm employees with the ability to work remotely While there's clearly a desire to go mobile, and the benefits are tangible, the unfortunate reality is that many companies both large and small are still lagging behind. In a recent survey, over a third of senior decision makers did not have the correct technology to support mobile working, and 43 per cent couldn't perform business-critical functions on a mobile application. The good news is that the technology exists, and there's been an enormous amount of innovation in the last few years. For example, many modern cloud ERP systems now support mobile application generators which allow users to create a range of applications from their mobiles and use them to perform core business processes no matter where they are. These can be created in a matter of minutes, and don't require high levels of IT expertise. For example, if you're a sales rep you can check factory stock levels, reach out to delivery drivers, and perform transactions while in the field, without having to be at a computer. Similarly, a CEO travelling for business would still be able to access real-time company analytics from their mobile. But if the technology is there, what's stopping business leaders from jumping on board? The most likely reason is that they're unaware, which is why the summer months are a perfect time to take a look at what's available on the market and explore your options. 3 – Switch to cloud ERP The cloud is regularly seen as a gateway to increased flexibility and efficiency. It reduces costs, encourages innovation and mobile working, is scalable, and allows companies to take advantage of emerging technologies, such as APIs and artificial intelligence. Plus, in an era where security should be of paramount importance to business leaders, cloud solutions are actually emerging as a more secure option compared to on-premise. But there are also myths surrounding cloud adoption that have put some business leaders off. There's a misconception that moving to cloud ERP is an exhausting process that has a high failure rate, will require a complete overhaul of systems, and will end up being more costly for businesses. Indeed, Gartner predicted that nearly all cloud ERP projects would fail by 2018, and yet we're over halfway into 2018 and cloud ERP is still growing. With the right vendor, cloud ERP doesn't have to be a laborious process and will offer a vast number of benefits to your business. The quiet summer months are a perfect time to make the switch, as they give you time to methodically consider your options and choose a vendor that will support your transition. ### IoT | What Are the Risks? In this clip taken from October 30th's CommsBusiness Live show, the panel discusses the risks of venturing into the world of IoT. ### Free software | What businesses should consider When it comes to free software, there is often a catch.
As more legacy enterprise software companies jockey to become relevant in the cloud, the competitive environment is increasingly intense. To quickly expand cloud offers, these large vendors often buy smaller companies with cloud-native point solutions. The goal is to broaden their catalogue of offerings, but these offerings are often not designed exactly to meet business needs. And then comes the deal: switch to this software for “free.”   Is free software ever really ‘free’? But, getting free software can easily be compared to getting a free puppy. While taking home a puppy may seem like an easy decision for some, it will not take long before the puppy no longer feels so free. On top of chewing the furniture, it will rack up endless costs in the form of required vaccinations, puppy food, and boarding costs at the local kennel. While it might seem very different, the same goes for accepting free software switch offers. Behemoth vendors, with thousands of SKUs, see a need for speed to expand their cloud offerings – and as mentioned earlier, this often results in customers trading away a trusted and reliable solution from large software companies to a smaller company with an offering that is not suited to the business. In reality, free turns out to not actually be free. As such, smaller companies need to consider these four questions before acting on what looks like a good software deal: 1. What hidden costs will result from the change? When it comes to “free implementation” or services offers, depending on the length and complexity of implementing and integrating the system, there is sure to be an impact on time and productivity. Migrating data and configuration from existing systems, as well as addressing integration and compatibility issues, is inescapably costly. In addition, businesses should consider the time and investment required for training users throughout the company. Indeed, there might be costs for additional services to compensate for a system that is not suitable. 2. Will people use it? What originally seems like a great deal can turn out to be very costly in the long run if a complex system, that few people want to use, is set up. Instead of deterring broad user engagement, business technology should optimally encourage it. The true value of any solution is its ease of use and relevance to users. In order to enhance company-wide performance and make a business more competitive, it is critical to engage end users with the software they need to use on a daily or weekly basis. Otherwise, this “free” software adds no value. 3. Does the product fit the business’ needs and goals? How a new solution aligns with the needs and goals of an organisation is very much tied to its ease-of-use. For example, if an organization is driving toward a model for more collaboration between finance partners and functional leaders, its software should align with that goal. If the solution is free, however, there is a high chance of it being an off-the-shelf offering from a large organisation that may not be optimized for collaboration or support the level of communication or data exchange an organisation needs. The product might represent a small percentage of the software company’s focus. This can’t compare to the benefits that come with using a solution designed from the ground up to solve specific challenges facing a business. Otherwise, free software actually costs the business in terms of productivity and missed opportunities and ultimately unachieved goals. 4. 
Are support and service given up for price alone? Any new software comes with new issues, and as they arise the vendor of the software will have to be contacted. Vendors of free software are unlikely to have members of their support teams specialise in any one product, opting instead for general training across many of their products. This is a clear downside for most businesses, as the business loses out on a point of contact who understands everything about the product and works for a company that is solely focused on solving the challenges and goals of a specific function. Good support also depends on whether or not the provider is able to respond consistently to customer feedback and is committed to innovating the product based on that feedback, rather than balancing input across hundreds of different product priorities. There is good reason to doubt that vendors of “free” software are able to provide just that. Ultimately, for any business considering new business software, the above questions can all act as a clear guide for determining the true costs of “free.” Put simply, everything is not always as it seems, and there are distinct benefits to going through the pros and cons of any decision rather than giving in to impulse – whether it relates to free puppies or software. ### Supply chains | Is the Blockchain the answer? In February 2018, a Gartner analyst described the “irrational exuberance” of blockchain - a term borrowed from a 1996 Alan Greenspan speech which warned of an overvalued market. On the Gartner hype cycle, blockchain has gone beyond the “Peak of Inflated Expectations” and has been met with a lot of negative press. Despite the adverse coverage, the potential benefits to supply chains mean blockchain is not going to disappear anytime soon. Blockchains may have originated as the enabling force behind cryptocurrencies, but their potential use cases are much, much wider. In the first three months of 2018, VC funding for blockchain start-ups had already reached 40 per cent of last year's total. It has been linked to just about everything, with the potential to be the solution for problems in the public and private sectors. From immigration, transportation and land registry to gaming, software development and supply chains…the list goes on and on. The blockchain is the disruptive solution. Here are three reasons for the scepticism, though there are many more: Overuse: Blockchain is not a good fit for each and every use case. If the use case is primarily within an enterprise, then it might be better served by a secure central database or similar technology. Don't apply blockchain to use cases where it is really not needed. Loss of ownership: When it comes to deployment of a blockchain, some parties are not comfortable with the fact that data will be shared in order to improve the overall visibility of the process amongst stakeholders. If this is an impediment, consider alternative approaches. The blockchain is about 100% transparency. Regulations and technology: Regulations like GDPR impose stringent data management requirements that organisations must adhere to. However, some of these are not aligned with the immutability that characterises the blockchain. Of course, there will be architectural patterns for how blockchain could comply with GDPR over time. However, in the short term this leads to ambiguity about how the technology can support such requirements – and that throws up more uncertainty. Scepticism is a healthy reaction to the claims made of the blockchain.
Ask yourself, why use blockchain over other methods that have existed for years? Change is never easy, especially when it is a disruptive technology with lofty ambitions. Blockchain can deliver value through record-keeping, increased security and transparency. Typically, blockchain comes into its own when there are numerous participants who need access to information, where trust and validation are important and where controlling access levels is key. Don't believe the hype? In a recent supply chain proof of concept (PoC), blockchain demonstrated its ability to achieve absolute traceability, as well as increase efficiency, security and speed. The use of blockchain in global supply chains is not hype: it is real. It is happening now. For instance, tracing a box of mangoes to its point of origin takes around 18 hours. With blockchain, it can be done in two seconds or less. The security and trust that blockchain confers on a system are value creators in themselves. Porsche recently announced it was introducing blockchain to its cars, which would enable owners to perform functions like locking and unlocking the car via an app. It would also enable temporary access authorizations to be granted quickly, easily and securely. Furthermore, the security of a decentralized system like blockchain is not compromised if one point in the system is. (Unlike what happened with Jeep in 2015, when hackers were able to remotely seize control of cars by exploiting a software vulnerability – in one case, while the vehicle was being driven at 70 mph). In the digitally-dominated age, it is not just supply chains involving physical goods that benefit from blockchain. Software supply chains are being transformed by blockchain-enabled secure DevOps, leading to faster launches and better, earlier bug detection. The transparency blockchain enables also means that a project's status updates can be instant, even when the process involves geographically-dispersed, remote teams consisting of suppliers, contractors and/or partners. Similarly, the provenance and security checks of digital goods can be validated through blockchain, which is increasingly valuable in our hyper-connected world. Early challenges with blockchain technology, such as the computational power required or the amount of storage used, are on the way to being solved thanks to newer protocols like Intel's Proof of Elapsed Time (PoET). As blockchain matures and adoption spreads, expect interoperability to become an issue between the different systems of distributed data ledgers. For instance, a smart city might host several different blockchain-based systems regulating its many services, all of which need to ultimately work together. Google, among others, is creating blockchain lattices (“blockchains of blockchains”) to address this issue. Supply chains and the Blockchain While some of the blockchain's biggest real-world impacts may still be some years off, there is no doubt that the excitement around this technology is well-founded. Already, the hype is beginning to make way for grounded, practical examples and use cases across every type of industry. Supply chains have been among the first to benefit. Other major areas set to benefit include supply chain visibility, security, asset tracking and the IoT domain. Examples of blockchain powering smart cities or vehicle-to-vehicle interactions are beginning to trickle in. Many more will follow. There are plenty of problems to be solved with blockchain. But blockchain is not the problem.
So, it is time to stop the blockchain bashing. As the Gartner report from February 2018 put it, “IT leaders must cut through the hype and apply blockchain for maximum benefit”. Blockchains are just getting started. ### Archive | The future lies in the cloud Humans have been creating and adapting communication systems for thousands of years – at least, that's what we know based on fragments of the historical archive that have stood the test of time, preserved often by accident and discovered much later by newer generations that need to piece them together to understand their meaning. Fast forward to 2018, and humans are much better at communicating. We use social media to document what we do, who we spend time with, even what we eat. We use websites to share information, news and opinions with the four billion other Internet users. It's the stuff of dreams for future historians, or it would be, if the thousands of tweets per second, hundreds of thousands of Facebook posts per minute, and overall billions of GB of data created online each day weren't at risk of being lost forever. Organisations across all sectors are now recognising that digital content is a precious asset that can deliver value in the short term and for decades to come, but without the right planning and foresight they risk relying on obsolete technologies or formats, issues with the third-party platform they are publishing on, or dependence on content management and backups that only provide security in the short or medium term. It's already happened with sites such as MySpace, with former users of the once world-leading social media platform losing pages, messages, and photos. Similarly, it is very hard to find older versions of websites or web pages – search engines are designed to show you only the very latest content. Most people assume that content uploaded to the web is safe there, but action needs to be taken to protect these digital communications and make them accessible for future generations. The legacy and history of 2018 will come from these digital communications, and preserving them requires a digital archive. As well as storing the data, it needs to be usable for it to be valuable, and a digital archive can give you a snapshot of what a website or social media looked like at a certain point in time. It creates a permanent and unalterable record of what any organisation, from governments to drinks brands, has been putting online at any given period. This has huge, positive implications – companies are digitally preserving content of legacy and historical significance, and it also allows organisations to demonstrate compliance, especially important in regulated sectors. Cloud technologies are the most efficient and cost-effective way to achieve this. Archive Traditional archiving in the form of physical hardware still holds a lure for many people, but it is limited in its capabilities. When the data you're archiving is growing as exponentially as an organisation's digital communications, hardware could mean a continuous need to invest in new infrastructure in order to accommodate the data. Cloud storage allows the unrestrained option to add new storage whenever it is needed, thanks to an ability to scale to almost unlimited capacity. Reliability Physical hardware such as servers and hard drives can easily be overloaded or fail. By comparison, cloud infrastructure has a higher level of redundancy (supplying duplicate copies of data), which can be utilised in case of problems.
If a hard drive, data centre or server fails, there is peace of mind in knowing that normal service can be resumed with as little disruption as possible.   Security Using an archiving solution that is cloud-native and certified to ISO standards of security means that any organisation can be assured they have full control of who can access their archives, which is a key consideration especially for any personal data held on the archiving platform. It is also incredibly important for public sector organisations that may be holding sensitive information of national importance. The data can be stored in highly secure cloud data centres, protected by strong safeguards, and the scalability of cloud technologies means that the safety is always there no matter the size of the dataset. Compliance Regulated companies and firms that need to record and retain all electronic communications under various rules and legislation – such as MiFiD II or the recently introduced GDPR guidelines – need to be able to prove compliance. Cloud technologies provide the scalability and future-proofing that is necessary to demonstrate that the data has been permanently stored in an unalterable format.   Cost savings Moving to the cloud cuts down on a lot of the overhead that comes with maintaining your own infrastructure – the physical space and the costs and bills that come with it. This allows the flexibility to focus on improving the archive, whether it’s simple user interface updates or much more advanced capabilities, like transferring large amounts of data for large-scale research projects. Usability An archive is worthless if no-one can use it – whether that’s internal staff within an organisation, or students, researchers, historians and anyone else who may wish to access a public-facing resource. Search technologies can be difficult to implement because search engines don’t scan an entire set of documents one by one – they use indexes, to return useful results, faster. Web archives consist of website data stored in a WARC file format. Playback of these files requires indexing, which is essentially a list of all the assets within the web archive, including HTML data or PDFs. Achieving this for large archives can be challenging, sometimes with billions of very small items to index, but content stored in a flexible cloud environment is much simpler to process. Cloud-based search technology gives the ability to scale up or down depending on demand, too. It can also be useful for the quality of the data; for example in the deduplication of any pages. Using a mixture of this, and its own technology, MirrorWeb managed to process 1.4 billion documents for The UK National Archives in just 10 hours. If companies haven’t been archiving websites since the dawn of the Internet then there’s already likely a 20 plus year black hole - a huge void that will never be filled.  Without this, firms will be unable to look back at the contents of their first website or their digital presence when social media first took off. In 2018, the world is collectively estimated to spend one billion year’s worth of time online. To avoid losing valuable pieces of history, organisations must wake up to the urgent need for digital archiving, and the multiple benefits of doing so on the cloud. 
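To make the indexing step described above a little more concrete, here is a minimal sketch that walks a WARC file and records where each captured URL lives, assuming the open-source warcio library and a local file named example.warc.gz; a production indexer (for example, a CDX-style index) records far more than this.

```python
# A minimal sketch of the indexing step: walking a WARC file and recording
# basic details for each captured URL. Assumes the warcio library and a local
# file called example.warc.gz; real indexers capture offsets, digests and more.
from warcio.archiveiterator import ArchiveIterator

index = []
with open("example.warc.gz", "rb") as stream:
    for record in ArchiveIterator(stream):
        if record.rec_type != "response":
            continue  # skip request, metadata and revisit records
        index.append({
            "url": record.rec_headers.get_header("WARC-Target-URI"),
            "date": record.rec_headers.get_header("WARC-Date"),
            "content_type": record.http_headers.get_header("Content-Type"),
        })

print(f"Indexed {len(index)} captures")
```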
### GDPR | Ensuring maximum uptime and compliance With businesses preparing for its arrival for what felt like an eternity, the GDPR finally came into force on May 25th, and organisations across the globe who do business in Europe will now be held accountable for the way in which they handle or process personal data. Indeed, much has been written about the size of the fines that companies could face if they fail to comply: up to €20 million, or four per cent of a firm’s global turnover, whichever is highest. Given the regulation’s focus on data privacy and protection, the security of an organisation’s network and, by extension, the information it holds, are integral to GDPR compliance. Organisations must, therefore, ensure they have measures in place to minimise the effect on their network of any potential breaches, attacks or outages, particularly now that, under the GDPR, data subjects have the right to access any data held on them by an organisation. To protect the privacy of personal data, for example, Article 32 of the new legislation requires its “pseudonymisation and encryption”. It further states that companies must “ensure the ongoing confidentiality, integrity, availability and resilience of processing systems and services” and be able to “restore the availability and access to personal data in a timely manner in the event of a physical or technical incident”. In short, it’s more important than ever that organisations take steps to keep network downtime to an absolute minimum, otherwise, they could find themselves on the wrong side of the regulations, and potentially facing an eye-watering high financial penalty.   Layers of complexity with GDPR The size and complexity of IT networks today means that it’s almost impossible to detect when a network failure might occur. Now, with the GDPR requiring more data than ever to be stored for longer periods, and for it to be available for access at any given time, organisations need to understand what can be done to assure that their networks are able to cope with a sudden increased workload. If and when a problem does occur, IT teams need to be ready to deal with it, with all the information at hand they need to triage and resolve it as quickly as possible. The ideal situation, of course, would be for them to be able to detect when services are degrading before users are even aware of the problem, thereby allowing the IT team to prevent any negative impact it might have on the wider business. Traditional point tools are no longer sufficient for this, however, as they do not account for the interactions between various aspects of the overall integrated system, such as the hybrid network, applications, servers, and supporting services. The situation is complicated further when you consider that much of the functionality that runs on an organisation’s network – its key services and applications - tends to be multi-vendor, requiring IT teams to ensure that everything is working together without friction. Achieving visibility into this environment is hindered somewhat by the fact that these services will be running across both physical and virtualised environments as well as private, public and hybrid cloud environments, which only adds to the levels of complexity. What’s required, therefore, is complete – vendor agnostic - visibility across the entire network; the data centre, the cloud, the network edge, and all points in between.   
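Before turning to monitoring, it is worth pausing on Article 32's call for “pseudonymisation and encryption”. The sketch below shows one common pseudonymisation approach, keyed hashing, purely as an illustration; it is not the method of any vendor or regulator mentioned here, and the key shown is a placeholder that would live in a managed secrets store in practice.

```python
# One common pseudonymisation approach - keyed hashing - sketched for
# illustration only. The key is a placeholder; in practice it would be
# generated and stored in a proper secrets manager.
import hmac
import hashlib

PSEUDONYM_KEY = b"replace-with-a-managed-secret"

def pseudonymise(identifier: str) -> str:
    """Return a stable pseudonym for an identifier such as an email address."""
    return hmac.new(PSEUDONYM_KEY, identifier.encode("utf-8"), hashlib.sha256).hexdigest()

# The same input always maps to the same token, so records stay linkable for
# analysis without exposing the underlying personal data.
print(pseudonymise("jane.doe@example.com"))
```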
The smart approach to assurance Continuous end-to-end monitoring and analysis of the traffic and application data that flows over their organization will provide IT teams with the holistic visibility of their entire service delivery infrastructure they need for full-service assurance. This ‘smart’ approach involves monitoring all of the ‘wire data’ information; every single action and transaction that traverses an organisation’s service delivery infrastructure. By continuously analysing and translating the wire data into metadata at its source, this ‘smart data’ is normalised, organised, and structured in a service and security contextual fashion in real time. The inherent intelligence of this metadata will then allow self-contained analytics tools to clearly understand application performance, infrastructure complexities, service dependencies and threats or anomalies across the network. By continuously monitoring this wire data, businesses will have access to contextualised data that will provide them with the real-time, actionable insights they need for assurance of effective, resilient and secure infrastructure. Without this assurance, however, detection, triage and resolution times would be extended. Customers would suffer and the organisation itself would be at risk of not upholding its duty in protecting their personal data. Compliance with Article 32 of the GDPR, along with much of modern business activity, is dependent on the continuous and consistent availability of effective, resilient and secure IT infrastructure. By taking a smart approach to assuring complete visibility and availability, businesses everywhere can be confident in the reliability of their networks, and in their efforts to comply with the new regulations. ### Cybersecurity | A forward-thinking approach Cybersecurity is perhaps the most important issue in today’s business world. But you don’t need to be told that: all you have to do is look at the latest news headlines to be reminded of the severity of the cyber threat landscape. Who could forget the infamous WannaCry attack from last year, which impacted businesses the world over? Or how about the much more recent Dixons Carphone breach, which involved the seizure of 5.9 million payment cards and 1.2 million personal data records?  Incidents like this are recurring more frequently than ever before. This has led to cybersecurity to go from something that was once left to the IT department, to an issue that the entire organisation is responsible for. All efforts also need to be championed from the very top, specifically the C-level and the board. The rise in cyber attacks cannot be denied, but the question remains: why are we witnessing such an increase? While historically these attacks were targeted primarily at companies within the insurance and financial services industries, today no organisation is exempt from the threat. This is mainly because setting up, carrying out and/or commissioning an attack has never been easier or more accessible. It’s key to remember that the majority of cyber attackers are purely opportunistic: they scan the internet, searching for vulnerabilities in order to breach corporate networks. Therefore, the more secure you are, the harder it is to suffer from a breach and the more likely attackers will move on to easier targets.   Know what you’re up against Despite best efforts, no cyber defence is 100% bulletproof. 
Every organisation is fallible, which means it becomes more about mitigating the risk of an attack and ensuring your organisation is well placed to continue operating during and after an attack. This all comes down to having the right, layered cybersecurity measures in place, shaped by your understanding of the threat landscape and complemented by the skills and expertise of a security partner. The best way to tackle the risks associated with cybercrime is through a risk-based approach. By knowing your business inside out, understanding your attack surface, the defences in place and the vulnerabilities in your systems, it's much easier to prioritise the risks around three key areas: technology, people and processes. Clearing the hurdles Cybersecurity is no easy task. Devising and putting a strategy in place takes time, and there are various challenges to overcome on the way. The two most common are a lack of cybersecurity expertise within your business and the sheer complexity of understanding the solutions and vendors in the market. Wading through multiple options, dealing with multiple vendors and ensuring you're delivering a holistic defence programme is difficult. It's often necessary to work with multiple tools as there is no single solution that can address all areas of your business, but managing, integrating and coordinating those tools only adds to the overall complexity. Anticipating the future The cyber threat will never stop evolving in some way or another. Over time, cybercriminals will get increasingly bold and their tactics ever more sophisticated. The devices being targeted will also change, and moving forward we will see the majority of threats focus on mobile phones (attacks on smartphones will give hackers access to personal and work information), as well as AI, machine learning and IoT. AI and machine learning, in particular, are set to shift the landscape in unpredictable ways, which will consequently change how businesses and cybersecurity experts view the industry. Working with a cybersecurity partner Despite the likelihood of your business being targeted at some point, it is possible for this risk to be effectively mitigated, and part of this lies in working alongside cybersecurity experts. A trusted partner can provide the industry knowledge, in-house skills and investment required to ensure all actions undertaken are successful, while also boosting the skills and knowledge of those within the company at the same time. Conclusion If businesses want to be able to deal with the inevitable risk that cyber attackers pose, it's imperative they have comprehensive cybersecurity measures in place. This also needs to be a board-level issue; something that is deemed critical to wider business success. However, all organisations need to be prepared for obstacles along the way. Not only must efforts be made to understand the cyber threat landscape, but it's also important to source the right combination of tools to meet your organisation's specific needs. To achieve this quickly and effectively, it can help to work with an experienced and skilled cybersecurity provider, who can help you navigate this tricky route and ensure you're doing all you can to stay protected. ### Voice Technology | Making distinctive voices sing in business With one in six Americans now owning a smart speaker, it's clear to see the rapid impact that voice technology is having on our daily lives.
Not all nations are as advanced as the US in their take-up, but it's safe to say that speech-based control of devices and systems has made its mark in a very short space of time and is definitely here to stay. But what of its business application? In many ways, the potential for voice is even greater in business and industrial environments than it is in the home. Waves are definitely being made, but there are stumbling blocks too. Brian Ballard, CEO of Upskill, points to the fact that voice is hugely useful for anyone whose job requires their hands to be doing something other than operating a computer or device. Drivers, machine operatives, nurses, customer services, to name a few examples. As he says, however, voice in the enterprise is currently “muzzled” and, as a CTO looking to develop our own voice-enabled apps for the eProcurement sector, I would certainly agree. Ballard points to two primary issues. Firstly, that voice technology has relatively little value to the “desk-based knowledge workforce who commands the majority of IT spending and attention” - meaning that to date it hasn't been very high on the CIO agenda. As investment in IoT and AI grows, however, we should see this barrier to voice ease. The second issue, and the one I can speak of from experience, is that adding voice to existing, complex business systems is just not that easy to do. “Voice technology is powered by machine learning systems that are constantly updating based on millions of user interactions,” Ballard comments. “These systems and the accuracy they afford require the kind of massive processing power and back-end data that resides in the cloud; and for some customers, the idea of any public cloud implementation raises concerns about data security, access control, user privacy and legal risk. Consequently, enterprises that are reluctant, for whatever reason, to migrate business applications to the cloud cut themselves off from these kinds of advanced capabilities.” While the voice-for-business challenge is tough, it's still one worth pursuing. In the world of business applications, for example, much of the functionality different systems afford the business is now commoditised. In my own area of eProcurement, the user experience is everything. As a key transactional business system, eProcurement touches many employees, many of whom need their hands for things other than operating a computer or mouse! Voice technology has huge potential to improve procurement processes, especially when combined with AI. Employees can use it to place orders by voice, just as someone might ask Alexa to replenish their coffee machine cartridges via an Amazon order from the comfort of their kitchen. It can also be used for analysing organisation-wide spend. Consider the benefit, on remembering a key piece of information you need urgently for your next meeting, of being able to simply ask a system a question like ‘what was our computer peripherals spend against a target in Q1?’ Voice-enablement could also make procurement eAuctions far simpler to perform. Rather than needing to manually check supplier rates, verbal commands can be used to access information and make the right supplier choice. And it's not all about the user asking the questions. It's just as likely that a voice-enabled eAuction platform could do the talking – briefing the user on its analysis of the auction outcome and which supplier it recommends.
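To illustrate the kind of spend question mentioned above, here is a deliberately toy sketch that maps a transcribed query onto a spend-versus-target lookup; the category names, figures and regular expression are invented, and a real system would lean on natural-language understanding and the eProcurement platform's own APIs rather than a regex.

```python
# A deliberately toy sketch of turning a transcribed question into a spend
# lookup. The category names, figures and regex are invented for illustration;
# a real system would use proper NLU and the ERP's own APIs rather than a regex.
import re

SPEND_VS_TARGET = {  # hypothetical quarterly figures, in GBP
    ("computer peripherals", "q1"): {"actual": 48_200, "target": 52_000},
    ("office supplies", "q1"): {"actual": 13_900, "target": 12_500},
}

def answer(question: str) -> str:
    match = re.search(r"what was our (.+?) spend against (?:a )?target in (q[1-4])",
                      question.lower())
    if not match:
        return "Sorry, I didn't catch which category or quarter you meant."
    category, quarter = match.group(1).strip(), match.group(2)
    figures = SPEND_VS_TARGET.get((category, quarter))
    if not figures:
        return f"I have no figures for {category} in {quarter.upper()}."
    delta = figures["target"] - figures["actual"]
    state = "under" if delta >= 0 else "over"
    return (f"{category.title()} spend in {quarter.upper()} was "
            f"£{figures['actual']:,}, £{abs(delta):,} {state} target.")

print(answer("What was our computer peripherals spend against a target in Q1?"))
```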
However, going back to the challenges highlighted earlier, I can tell you that applying voice technology to existing business software is not easy. Voice technology means transforming multiple actions built around structured menus, mouse clicks, keystrokes, and even screen swipes into a single voice interaction. Where traditionally the user would ask themselves the question they need answering in their head and then perform a number of actions in the system to find the answer, voice means simply posing the question and leaving the system to do the thinking. This is where AI comes in. It's currently difficult to find a procurement app that can accurately identify all of the information needed to answer a concise question. That's why it's more challenging to develop a voice-enabled app for business use than for consumer environments. In procurement, for instance, it would need to include purchase order approval functionality and be able to assess the cost-effectiveness of the products or services being requisitioned. Apps that can perform consumer ‘shopping tasks’ are often dealing with much simpler requests which don't require anywhere near the same depth or layers of process or decision making. Voice is a challenge for other business functions too. Businesses as a whole are hesitant, according to analyst CCS Insight, especially given the risk of sensitive information being ‘voiced’ and overheard in open working environments. This raises the question of security. However, progress is being made on building the recognition of individual users' voices into business applications, which means voice becomes the security system too. From a procurement perspective, it means allowing only permitted users to buy items, as recognised by their speech. Beyond security, voice recognition in procurement could also work alongside AI and machine learning to tailor purchase recommendations to individual user profiles. Voice is rapidly becoming the norm when it comes to how we use technology and apps in our daily lives and, while business is currently lagging behind, I'm confident that it will catch up. The potential applications and benefits are huge, not only in procurement but across a variety of business functions and processes. ### Digital transformation is all in the execution It's hard to believe that it has been a decade since the launch of the first Apple iPhone. When the product first hit the shelves in June 2007, the first generation of social media platforms was just hitting the mainstream, and Barack Obama had only just begun the campaign that would take him to the White House.  Ten years ago, most companies had yet even to consider the impact that consumer technology was likely to have on the way they did business. Most did not have a mobile strategy, let alone a mobile presence for engaging with their customers or for helping employees collaborate. How times have changed. Trying to make sense of digital transformation Today, businesses of all sizes find themselves in the midst of a period of immense disruption, one in which the digital art-of-the-possible is pulling away from average practice.  As organisations scramble to keep pace, most are still struggling to define what digital actually means for their business. For all too many organisations, digital equates to massive, complex upgrades to existing systems. Some of these projects will be wildly successful, but many more fail due to a fundamental misunderstanding of what digital projects should aim to achieve. 
Digital is not just another IT-driven project. It is a core part of your business in a new world fuelled by disruption and should be at the heart of your business strategy. Doing what you already do slightly better is not going to put you ahead; you need to embrace innovation and create new experiences with a people-obsessed mindset. A digital strategy will get you to a “me too” point, but to really differentiate and disrupt, your strategy needs to be digital. Great ideas are nothing without the ability to execute Digitally mature organisations define change in a distinctly different way. Rather than focusing narrowly on individual technologies, or over-emphasising existing operations, they fundamentally re-imagine the business by exploring the fresh opportunities that the digital world enables, from new business models and new markets, to reinventing the core business in contemporary technology terms. According to one recent report, 80 percent of business leaders agree that digital transformation will fundamentally change their company's operating model. It's encouraging that such a high proportion of businesses recognise the critical importance of digital transformation to their future success. More worrying, however, is the fact that just 40 percent of those same business leaders have a strategy or the competencies needed to execute that transformation. Execution is worth millions Asked to comment on his secret for success, Steve Jobs once said: “To me, ideas are worth nothing unless executed. They are just a multiplier. Execution is worth millions.” Speaking to many of my clients, I'm seeing that while ideas are not in short supply, the majority are struggling to execute. The question is, why? Hamstrung by legacy A recent report Avanade published looking at digital transformation in the banking sector found that senior IT decision makers are spending as much as 19% of their annual IT budgets managing and maintaining legacy IT systems, something which they say prevents them from focusing on innovative projects designed to drive the business forward. The amount of resource currently being allocated to updating and maintaining the status quo represents a significant barrier to innovation. Modernising your IT is, therefore, a critical first step for any business looking to execute true digital transformation. But the fact is that even getting to this point on the journey often proves an insurmountable challenge for organisations. However, this is only part of the challenge. The type of bold thinking required in digital transformation thrives in a flexible execution environment. Yet the structure of many large organisations tends to rest on rigidly defined processes, chains of command and cascading communication. The resulting lack of flexibility is frequently at odds with the speed of execution that true digital transformation demands. The ‘try things, fail fast’ mentality that characterises the approach of many of today's digital success stories remains an alien concept in many highly structured large organisations. Incremental or iterative changes to the status quo are far easier to sign off because they follow the established process. Meanwhile, good ideas are watered down to the point where they no longer fulfil the original, strategic vision. 
Lacking the skills to deliver Even when transformational ideas do make their way through these layers of process, for many organisations the ability to deliver is often held back by the lack of internal resources and structures capable of taking these projects forward. Successful digital transformation relies on having access to a group of creative people with highly specialist and highly sought-after skills. It would be unreasonable and impractical for the average high-street bank, insurance provider or utility to have this type of specialist available on tap. And besides, even if these organisations could attract this type of resource to join them permanently, could they really keep them motivated or busy in the long term? Creating the right environment  The successful execution of digital transformation projects relies on having the flexibility to execute at speed. As most large organisations simply aren't structured to accommodate this type of approach, they need to look at ways to build that flexibility through other channels. Businesses in almost every sector are increasingly looking towards third parties to provide the mix of agility and highly specialised resource they need to create new digital experiences for their customers. In response to this demand, organisations including Avanade are taking a bespoke approach by creating dedicated digital studios, where experienced designers, data scientists, digital strategists, and developers are brought together in an environment clients can tap into in order to experiment, test and develop digital services in a very agile way. Even visionary geniuses have help It is tempting to attribute great, disruptive innovation to a single ‘Eureka’ moment, from which everything suddenly falls into place. Few would dispute Steve Jobs' position as one of the leading visionaries of the past century, but he had a few other aces up his sleeve too. Not only was he not hamstrung by the need to maintain a costly legacy infrastructure, but he also recognised the importance of trial and error in the pursuit of a great product. Finally, he had an incredible team of specialists around him to execute on that vision.  As businesses look towards a future defined by digital, they will do well to ensure their vision has the same ecosystem of support in place. ### Automation | Providing the key to SMB success New technology, such as automation, has the potential to create positive change in workplace productivity, and this isn't exclusive to bigger companies – small and medium-sized businesses can also benefit if they look around their industry for guidance on where to automate. When small Californian fast food chain CaliBurger introduced Flippy the Robot earlier this year, a machine designed to shoulder the monotonous burden of burger-flipping, it showed that AI can both provide a point of differentiation and drive efficiency within a company. Indeed, rather than fearing technology as a threat to jobs, we should instead think of it as replacing processes in order to free people up to work to their best ability. A recent study from Ricoh and Oxford Economics, titled The Economy of People, found that if businesses were to invest in critical workplace elements that directly affect people's performance – including automation technology – the UK could unlock £36.7 billion in untapped GDP. According to the report, both employees (77%) and employers (90%) agree that technology is the greatest driver of output per hour. 
In the current economic and geopolitical environment, automation has the potential to be hugely beneficial. Introducing workflow automation As we enter a new wave of innovation, the digitisation of information is an important technology trend that can be used to better equip SMBs and their employees to be more efficient. One particular area among SMBs that is generating a lot of interest is workflow automation. Workflow automation can have many applications. Put simply, it is the process of digitising and automating document-intensive workflows. SMBs, in particular, stand to improve productivity and business agility, purely by carefully considering how technology can simplify processes to enable their employees to work faster and smarter. Employees clearly want better processes and greater workflow automation, with our study finding that 82% cited the digitisation of information as the most effective technology in improving their productivity. However, business leaders are at risk of taking the wrong path. Some 93% of companies planning to spend more than 10% of their operating budget on office improvements identified facilities management (sensors, monitoring equipment, temperature, etc.) as the biggest driver of productivity, which is out of alignment with what employees feel will drive productivity. Embracing new tools A good example of workflow automation in practice for SMBs is expense claims. Right now, an employee may have to fill in a form and attach all of their paper receipts. This may have to go to a number of people for sign-off, depending on the amount. This is all before the claim has reached the finance team. Using digital technology, the whole process can be automated and thus streamlined, saving people from a laborious task and freeing them up to focus on more important issues (a simplified sketch of such a routing flow appears at the end of this section). Staying with the expenses example, traditional systems can be inefficient and slow. An SMB needs to be nimble. Having a clear understanding of expenses enables faster, more responsive planning for cash flow. By automating a slow and manual process, a business gains a quicker view of its financial health, while increasing employee efficiency. As an added bonus, staff are happier as their expenses are paid more promptly. A necessary priority SMB leaders already recognise that technology processes are key to remaining agile and enabling them to capitalise on market changes. In a separate study by Ricoh of 1,608 SMB leaders, 52% were found to believe that without the benefits of updated workplace technology, their business will fail within five years. Over half (51%) are specifically introducing new technology to respond faster to trends. Not all are fooled by the hype around technologies such as virtual reality and blockchain and are instead wisely choosing to prioritise investment in the tools that will have a real, positive impact on the bottom line. It takes a creative workplace culture to identify areas where such forms of technology could improve a company's operational efficiency. How this works for different use cases really depends on the process. Often, the higher the volume and the more steps you can remove in a process, the more efficient employees will be, ultimately leading to greater cost savings for an organisation. Uber and Xiaomi, having built billion-dollar organisations by leveraging digital processes and operating models to outperform their legacy competitors, are just two of the best-known examples of automation technology directly contributing to the success of a fledgling business. 
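Returning to the expense-claims example above, the short Python sketch below illustrates, in deliberately simplified form, how a digitised claim might be routed automatically. The thresholds, roles and rejection rule are invented for illustration only; a real system would read its approval chains and limits from the organisation's finance policy.

```python
from dataclasses import dataclass

# Hypothetical routing rules for illustration only; real systems would pull
# approval chains and limits from the organisation's finance policy.
APPROVAL_LIMITS = [
    (50.0, []),                        # small claims are auto-approved
    (500.0, ["line_manager"]),         # mid-size claims need one sign-off
    (float("inf"), ["line_manager", "finance_director"]),
]

@dataclass
class ExpenseClaim:
    employee: str
    amount: float
    receipt_attached: bool

def route_claim(claim: ExpenseClaim) -> dict:
    """Decide who (if anyone) must approve a digitised expense claim."""
    if not claim.receipt_attached:
        return {"status": "rejected", "reason": "missing receipt"}
    for limit, approvers in APPROVAL_LIMITS:
        if claim.amount <= limit:
            status = "approved" if not approvers else "pending"
            return {"status": status, "approvers": approvers}

print(route_claim(ExpenseClaim("a.smith", 38.50, True)))
print(route_claim(ExpenseClaim("b.jones", 1200.00, True)))
```

The value comes less from the code itself than from removing the manual hand-offs: the claim never sits in anyone's in-tray waiting to be re-keyed.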
By streamlining some of the more common tasks, technology can play a major role in empowering employees to work smarter and focus on adding real value to their companies. Rather than having manual processes hamper a company's ability to innovate, the power of automation will enable change and improve the operational capabilities required for SMBs to thrive in today's environment. ### GDPR | The biggest ever shake-up of Cloud contracts? The deadline for GDPR is looming – GDPR is the new data protection legislation which comes into force in May to protect personal privacy and sharpen up companies' reporting of data breaches. By now, with only a few weeks until the new rules become law, many businesses have spent a number of months working their way carefully through the processes necessary for GDPR compliance, and most organisations should be very close to a position where they are internally in good shape. However, most companies' IT provision involves a wider ecosystem of external IT providers. Cloud-based providers offer the majority of all software provision, but their standard contracts will not cover GDPR, unless you've been through an explicit renegotiation process, or you're a new customer and they have addressed GDPR in their contract. Ask yourself how many outsourced providers you use: payroll, data centres, infrastructure provision, call centres or digital marketing; the list goes on and on. The newly appointed position of Internal Data Controller will be responsible for all of the processes carried out externally by third parties and their sub-contractors. While it is fairly challenging for an IT department to evolve to full GDPR compliance by the target date, this last step in the journey may turn out to be bigger than expected, and may well present the greatest challenge of all.  Every single vendor contract needs to be updated to be sure that every partner, provider and supplier will also play their role correctly in the GDPR procedures. It is the last hurdle to cross in the journey to achieving compliance, but it is an unpredictable one as it adds complexity to the supplier relationship and providers' responsibilities, and may involve lengthy discussions with suppliers and their legal advisors to revisit their contracts and update the agreements. This considerable amount of administration may not sit well with the normal fast-moving world of the IT function and its IT industry partners. Some organisations are already quite far down this road. They are working through each in-scope supplier contract that needs to be renegotiated and updated, maybe by adding a GDPR addendum to address the new GDPR requirements. There are likely to be many small details to address within every single contract, as wording needs to be re-drafted, discussed and finally agreed. The amount of work remaining to be done at this stage, and the resources required to execute it, can be estimated at the beginning of the programme. Harder to estimate, though, are two important factors – a particular vendor's willingness to negotiate, and the other demands on their negotiating teams – which will dictate the timescale to complete discussions. With just a short time left to prepare, now is the time to take control and manage the redrafting of contracts to be sure of crossing the finishing line on time. 
The first key task is to identify which IT suppliers fall within the scope of GDPR, the first pass of which can be done fairly rapidly through common-sense discussions with the key stakeholders. This means that negotiations can start while any longer, more detailed data audit is underway, such as a data protection impact assessment (DPIA), which may reveal further suppliers within the remit of the GDPR renegotiations. Once reviewed and discussed, this draft list of IT suppliers, and the priority for each, can be used to drive the appropriate approach for reviewing each contract and how it should be updated and revised. The new legislation requires that all third-party supplier contracts that relate to EU personal data must contain certain wording that meets the requirements of GDPR. Within this, there are no fewer than eight new provisions that need to be addressed in contractual terms and agreed. While these new areas are being negotiated with suppliers, it is natural for there to be some renegotiation of other commercial terms, as there are new risks and liabilities to both parties that will need to be countered with limits of liability, indemnities and similar clauses. At this stage, it may well be worth adding more resources to the GDPR programme. While it is well known that there are heavy fines for any company that is unfortunate enough to suffer a breach or accidental loss of confidential customer details, these are not the only risks. Being late with GDPR compliance could place an organisation at risk of a ban on all data processing activities. There is no doubt that the effort companies have invested in preparing for GDPR has been substantial. Are there any lessons to be learned from this? GDPR highlights the importance of procurement skills and contract negotiation, two areas which may sometimes have been overshadowed by other priorities. ITIL confirms that IT procurement, with good contract management and supplier management, should be acknowledged as valued skills in all IT functions. ### Cloud vendor | The need for a relations role The move from on-premises to the cloud comes with many potential advantages: cost savings, flexibility, scalability, and more. But “simpler” isn't necessarily one of those advantages. Moving to the cloud can actually make things far more complex for a business, even if they are dealing with a great deal less physical infrastructure. So is there a need for a cloud vendor relations role? Moving to the cloud means dealing with multiple vendors who offer and support multiple services. It means multiple subscriptions to services, and for each service, there will be both a service relationship and a billing relationship. Suddenly what was once a one-off or infrequent relationship is now an ongoing one. Cloud migration does not eliminate the need to manage infrastructure but changes the skill sets that are necessary to maintain it.   A discrete cloud vendor relations role—or not... For a business to stay on top of its relationships with vendors, there must be a cloud vendor relations role within the company. However, this isn't as simple as hiring someone and giving them the job title of vendor relations manager. 
Who this role belongs to will depend on the size of the organisation and the staff it already has. A small business may need to simply assign the task of vendor relations to an office manager. Larger businesses may need to assign the role to one or more people as necessary. Businesses will almost certainly already have processes for suppliers, and many of the same rules should also apply to cloud vendors. However, the process won't be identical. With some vendors, a bid process won't be appropriate. For others, a service level agreement (SLA) will need to be established. Some will have an account manager as a single point of contact, while others will have a team. Also, businesses don't often use the software exactly as provided by the vendor and will have requested their own custom changes to make it fit their processes. A vendor relations expert will need to negotiate what this costs and whether that software is available to any other business. Typically, vendors will want to resell any custom software to maximise their revenue—but this has the potential to give trade secrets away. These terms will need to be in place so the customer and vendor position is clear. There is also the work necessary when leaving a vendor. No supplier likes losing business, and the handover from one vendor to another may not go smoothly if SLAs are not agreed up front. The skills needed for cloud-vendor management are not IT skills, though IT knowledge may be valuable. The main skills are being able to negotiate the SLAs and contracts that will give the business the best deal possible, while at the same time maintaining a harmonious relationship that gets the absolute best out of vendors.   The opportunity for MSPs Cloud vendor relations will end up in a tricky no-man's land for many businesses. What was previously the IT team's responsibility should now be taken over by another individual or team—but IT knowledge and experience are still incredibly valuable in making sure the business gets the most from cloud vendors. Many businesses would prefer to enjoy the benefits of the cloud without the extra complication, and this is where managed service providers (MSPs) can step in to manage these relationships. Businesses don't just want to outsource the running of IT to an MSP; they want to outsource as much of IT as they can—including all the complications that come with it. To most businesses, there is no real difference between software installed directly on a machine and SaaS services, and they don't want to directly manage either. The single billing relationship and single support relationship that an MSP offers are incredibly valuable for a business, as it can focus on its core business without worrying about the IT estate. In order to continue offering this complete service, MSPs need to take over the cloud vendor relations role for their customers. Essentially, businesses don't just want cost savings, flexibility, and scalability from the cloud. They want all this and simplicity. MSPs need to add “vendor relations” to their skill set to meet this need and become, in effect, the single IT vendor.   ### Water Sector | Cloud technology could save millions The water sector is coming under immense pressure to improve the quality of its service after Ofwat announced it will intervene in water companies whose business plans for PR19 don't match the “high bar” it expects for customer service. 
With an increased demand for water, an ageing infrastructure and rising costs, some large water providers are exploring the use of cloud-based technologies as a way to make considerable cost savings and manage the network more effectively. Yorkshire Water is one such water company that has been investing heavily in advanced technology to improve the services it provides. Since adopting WorkMobile's mobile data capture app, the company has used it to become more agile and efficient compared to its previous method of data capture. Previously, field engineers were relying on paper-based forms and handheld cameras to capture information on jobs and projects and were then having to drive at least five miles back to head office to load their job data into the company portal. Yorkshire Water recognised that it needed a digital solution that would help to collect and manage essential project information more effectively and also reduce the administration costs arising from fuel and non-productive wage costs. After trialling the cloud-based application with a team of 400 workers, the water provider has deployed the WorkMobile solution to over 1,800 of its employees. The company is also continuously looking for new ways to further increase usage of the app. WorkMobile's flexible form designer allows users to create mobile forms relevant to the specific job at hand, including site inspections, health and safety forms and timesheets for all workers on site. Using a digital form to capture the information for these important documents reduces the risk of data being lost or incorrectly collected. Job details can now be sent to employees in the field and project data can also be captured in real time, with all information integrated into internal project management systems. Work can now be completed quickly and recorded more accurately, making for a more efficient network management process. A Yorkshire Water spokesperson said: “There is increasing pressure to become more efficient and innovative in order to remain competitive and deliver an even better customer service. However, with new, emerging technologies, water companies are now gaining the ability to streamline their working practices and meet the needs of their customers more effectively. “Our main challenge was that our previous data capture process was simply not cost-effective and meant that our teams were spending extra hours travelling back to base to record their job details. We needed a solution that could provide greater efficiency and connectivity, so staff working out in the field could record and share information in a timely manner. “With our aim to roll out the WorkMobile application across various departments in the business, we have calculated that this will result in huge cost savings for us and our customers. The money we save as a result of this switch will help to relieve some of the pressure on our resources and will also help us to provide a better quality of service to our customers.” Colin Yates, the chief support officer at WorkMobile, said: “With pressure mounting from Ofwat to provide a better quality service through the use of innovation, water companies are looking for ways to become more agile and efficient so they can work more effectively. The sector is facing a number of challenges, particularly due to ageing networks that can't cope with the rising demand for water and the inherent leakage synonymous with older pipes. 
In order to keep these networks operational, now is the perfect time for water companies to get smarter and embrace new technologies so they can deal with issues quickly and successfully. “It's great to see that solutions such as ours are helping the water industry to combat its current issues. Yorkshire Water, for example, now has a tool that can help meet the needs of its workforce, so that work can be quickly recorded by employees and information then sent back to the office. The business has now seen greater efficiency amongst its workforce, along with huge cost savings. Every water company across the sector should be looking to embrace technology to achieve similar results in order to create a more sustainable future.” ### AI | Why programmatic has an exciting future In a relatively short space of time, artificial intelligence and the tech that it powers have made a considerable difference to the way marketers reach and engage with their audiences at scale. While in the beginning programmatic relied on past information or segmentation to plan campaigns, now AI is used to spot the patterns in consumer behaviour before making real-time decisions about where to deliver an ad. It's less programmatic and more predictive. In addition, true AI platforms change tactics on the fly based on micro-calibrations and on which features are impacting outcomes. Programmatic has come on in leaps and bounds when it comes to serving ads at the right time and in the right place to engage with customers, but there are exciting times ahead. AI ain't done yet, and the following are just a few of the ways it is going to impact the industry both today and in the years to come.   Smarter marketers There are now oceans of data available to marketers - far too much for the human brain to compute, organise or utilise, which presents both a challenge and a great opportunity for businesses. On one hand, there's more than we know what to do with. On the other, it holds the secret to something advertisers have been attempting to discover since the ‘Mad Men’, Madison Avenue era - a clear picture of who will purchase what they have to sell. AI enables us to understand all this information; it does the heavy lifting by sorting and then utilising the details it finds. This means brands, advertisers and media agencies are better equipped than ever to understand their consumers and what they want. It is arguably the most powerful marketing tool out there when you consider no human could decide in a millisecond where to place an advert across 300 billion real-time daily impression opportunities.   Insights like never before It's true that AI is being used to automate the delivery of advertising based on the information it can glean from consumer patterns online. But it is actually way smarter and much more valuable to an organisation.  Although an AI-powered platform like Sizmek's will provide the brain power to decide which ad to serve, and where, in a heartbeat, the benefits to brands and agencies go deeper. The value also lies in its ability to analyse incomprehensible amounts of data, learn from it, then make recommendations to assist marketing decisions at both tactical and strategic levels. In turn, this provides insights like we've never seen before. And marketers, or even entire organisations, can use these to improve campaigns, fine-tune marketing strategies, reach new consumers and boost the ROI from ad spend.   
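To put the earlier point about millisecond-scale decisions into perspective, here is a deliberately simplified Python sketch of the kind of call a bidder has to make for every single impression: estimate the impression's value from a handful of features, then decide whether and how much to bid. The features, weights and values are invented for illustration; real platforms use trained models served at very low latency within the ad exchange's timeout.

```python
# Illustrative only: real bidders use trained models served at very low latency,
# not a hand-written dictionary of weights.
FEATURE_WEIGHTS = {"visited_site_recently": 0.25, "category_match": 0.15, "mobile": 0.05}
BASE_CONVERSION_RATE = 0.01
VALUE_PER_CONVERSION = 40.0   # hypothetical value of one conversion, in pounds

def bid_for_impression(features: dict, max_bid: float = 2.50) -> float:
    """Estimate the value of one impression and return a bid (0 means don't bid)."""
    score = BASE_CONVERSION_RATE + sum(
        weight for name, weight in FEATURE_WEIGHTS.items() if features.get(name)
    )
    expected_value = min(score, 1.0) * VALUE_PER_CONVERSION
    return round(min(expected_value, max_bid), 2) if expected_value >= 0.10 else 0.0

print(bid_for_impression({"visited_site_recently": True, "category_match": True}))
print(bid_for_impression({"mobile": True}))
```

Multiply a decision like this by hundreds of billions of daily opportunities and it becomes clear why the work has to be delegated to machines.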
Better customer knowledge AI is helping us to identify patterns in data sets, providing insight into the factors that lead a consumer to make a purchase. This means marketers can optimise their targeting towards a specific outcome (think a percentage increase in online insurance applications, for example), rather than solely relying on segments and third-party data to make decisions about ad campaigns. Previously, a marketer would assign a specific set of parameters, such as age or gender, in order to locate and then communicate with a particular consumer or audience group. Now, however, we can identify the elements of a campaign that work before discovering and reaching a world of new customers. This is a major milestone both for programmatic and for advertising in general. After all, this is an industry that has relied on knowledge of ‘target’ customers for decades, and businesses have long believed they understand their customers. AI turns that notion on its head, using technology to discover new information about who these consumers are, where they are and what they need or desire.   Augmented intelligence - AI While many believe the power of AI lies in its ability to make technology and ad serving platforms smarter and more efficient, it is actually people who benefit greatly from its adoption. Those running businesses, marketing departments and ad campaigns become better informed as a result of AI. And this major shift in the way we learn about consumers and then reach them online is going to categorically change the job of the marketer in the future. Contrary to the media scaremongering about bots taking over the world and stealing our jobs, AI will augment our intelligence, providing us with a powerful tool to make decisions about who we should target and how. People, not machines, will be pulling the levers, and they'll have an unprecedented amount of insight in order to make the right decisions. Now it's up to the marketer to become a hybrid - part creative and consumer-focused and part scientific, with one hand on the system that enables them to learn about their audience. Marketers are about to become more integral to the workings of an organisation. If they're willing to let it, AI can propel them forward, enabling them to play a pivotal role in data management, digital transformation and strategic direction. ### Automation | Customer contact for the AI age Artificial intelligence (AI) and automation continue to create immense opportunities for UK businesses. However, scepticism persists around their ability to truly engage with customers. In today's competitive landscape, customer experience remains as important as ever. Yet, in the contact centre industry, there is still uncertainty over how an automation solution can provide the kind of customer experience required without appearing cold or faceless. Far from alienating employees and customers, automation can provide the personal touch that delivers lasting retention. Yet customers are not the only beneficiaries. When deployed carefully and collaboratively, automation can transform a contact centre operation for the business, the customer and the employee.   Dismantling the automation myth The best, most personalised customer service is achieved when we understand customers' preferences and concerns on an intimate level. Yet all too often, the customer contact sector falls at this hurdle. Being disappointed by bad customer service is an almost universal experience. 
Holding for hours on the phone only to find that the agent has no idea who you are or what your problem is has become an unfortunate refrain for the sector. Too often, calling customer support is seen as a last resort. Customers view it as yet another barrier between them and the company they want to engage with. It is rarely the fault of the agent that this is the case. They simply lack the tools to perform to the best of their ability. When viewed from both the customer and employee perspective, automation becomes an opportunity instead of a controversy. Sensationalist ideas about the ‘rise of the robots’ and the displacement of human workers have soured the public's perception of automation. However, while robotics may prove a disruptive force in business, it is an unhelpful analogy for the contact centre environment. Automation is not a zero-sum game. It works best when it is designed not to replace workers, but to enhance their service. When implemented correctly, automation helps employees understand more about an individual customer than they could alone. Collaboration between human agents and machines is the key to creating an experience that is easier, more personalised and more pleasant for all.   The future is now AI was once science fiction. Now, along with automation, it has gone mainstream. Virtual assistants are pervasive in our daily lives and even sit at our kitchen tables. Customer contact centres are similarly no strangers to automation, but as ever, the innovation is unevenly distributed. The industry can be split into those who have fully embraced automation and those who are merely dipping their toes in. Neither approach is ‘right’ or ‘wrong’. Contact centres should automate at a pace that matches customer demand. However, in striving to provide the best customer service, it is important to see what is being achieved through automation. When it comes to speed, accuracy and the ability to perform repetitive work consistently, machines cannot be beaten. This is why the use of chatbots is growing exponentially in customer contact centres. At first glance, a complex conversation between two people seems a strange place for automation to contribute or add value. Machines are usually programmed to do a limited set of tasks, albeit very well, while we expect customers to be unpredictable. Yet the majority of queries that agents answer on a day-to-day basis are predictable and routine. Beyond answering these questions with a predetermined response, there is little if any real value that an agent can add to the interaction. Chatbots, then, are well placed to automate low-level customer requests. They can process the overwhelming majority of customer queries quickly and effectively. The small proportion of queries that can't be answered by chatbots are passed on to human agents. When chatbots are paired with machine learning, the more a customer interacts with these systems, the better they are understood and the more helpful the chatbot becomes.     Chatbots liberate employees from mundane and often repetitive tasks where they can offer little extra value. Instead, they are freed up to focus on jobs that benefit more from their specialist knowledge or creative thinking, such as more technical queries that require human oversight or decision-making.   Adding value behind the scenes Automation is proving capable in front-of-house, customer-facing roles. However, it can serve an equally useful back-end function. 
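Before turning to the back office, here is a minimal, illustrative Python sketch of the front-of-house triage described above: routine questions receive an automatic answer, while everything else is escalated to a human agent. The intents and canned replies are hypothetical; production chatbots rely on trained NLU models with confidence scores rather than keyword matching.

```python
# Simplified illustration: production chatbots use NLU/ML intent models with
# confidence scores, not keyword lists.
CANNED_ANSWERS = {
    "opening hours": "We're open 8am to 8pm, Monday to Saturday.",
    "reset password": "You can reset your password from the account settings page.",
    "delivery status": "You can track your delivery using the link in your dispatch email.",
}

def handle_message(message: str) -> dict:
    """Answer routine queries automatically; escalate everything else to an agent."""
    text = message.lower()
    for intent, reply in CANNED_ANSWERS.items():
        if all(word in text for word in intent.split()):
            return {"handled_by": "chatbot", "reply": reply}
    return {"handled_by": "human_agent", "reply": None, "context": message}

print(handle_message("What are your opening hours on Saturday?"))
print(handle_message("My smart meter readings look wrong and I've been overcharged."))
```

The escalation path is the important design choice: the customer is never left talking to a machine that has run out of answers, and the agent receives the original message as context.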
Indeed, in the first wave of automation that swept this industry, machines made their home in the back office. Though often neglected in contact centres, data management makes a prime target for AI. Contact centres gradually build up deep reservoirs of customer data, but agents have limited ability to make use of it. Knowledge is transferred slowly and imperfectly between humans. A customer who is well-known to one agent may be a stranger to another and receive less useful advice as a result. Yet data transfer between machines is instantaneous. AI has the capability to analyse and process large amounts of customer data fast. A virtual customer profile can be created and exchanged quickly between systems, and between machines and humans. For the agent, AI helps them collect all the information they need about a customer when they need it, helping them provide an excellent experience, no matter the location or customer.    Making machines part of the family The best approach to integrating AI in an existing contact centre is measured and holistic. Many make the mistake of deploying a solution to see how it performs, without having a view of how it will impact employee workflow or the customer journey. It is not right to throw in technology and hope for the best. Automation can have a transformative effect on an organisation, but it needs the right tools, data and back-end processes in place to be effective. Otherwise, automation can become a frustrating add-on and seen as another obstacle by staff and customers. Nor is it correct to take a revolutionary approach to implementation. Stripping out old systems and employees to make way for untested AI can do more harm than good. Contact centres should instead make incremental changes over a period of time. Feedback should be sought throughout from agents and customers, and be iterated upon to improve the system and make it more accessible. In this way, automation becomes a gradual, cultural change that slowly but surely wins the support of its users. Ultimately, humans and machines must work together to provide the optimal customer experience. When deployed correctly and used to enhance, not replace, the work of contact centre agents, everyone benefits. Employees are empowered to do their best work, reduce time on mundane tasks and understand the customer more quickly and better than ever before. Customers will enjoy a faster, smoother and more insightful customer journey. Soon, they will begin to treat the contact centre as the first port of call rather than the last resort. ### Headless | The View from the Trenches The web content management landscape has been steadily shifting over the last few years. The introduction of headless (or to be more specific, API-first) content management systems has moved the goalposts. Headless CMS is nothing new – most have been around for a few years – but the last two years have seen a spike in interest and adoption. Looking for statistics on adoption rates, it is fair to say that these statistics are thin on the ground so it is great to see this report from Kentico which provides some interesting insights into the trends and views in the market. With that in mind, I wanted to add to this overview with our own MMT Digital view on the rise of the headless from the perspective of a digital agency operating on the front lines. I’ve pulled out some of the key headlines and included some advice and recommendations along the way.   Coupled, hybrid and headless A good starting point is to get our language straight. 
There are lots of terms bouncing around, from coupled through to headless. To help frame this post, let's quickly summarise the three key terms. By “coupled” we are talking about the traditional, and arguably monolithic, content management systems that tie the front end to the content, taking control of, or at least heavily influencing, the presentation layer. By “hybrid” we are talking about those approaches, typically driven from the traditional content management systems, where management functionality is separated out from the presentation layer and served up through stateless APIs (or other methods) to multiple channels. You can achieve this with many of the traditional content management systems out there. And, by “headless”, we are talking about the API-first content management systems, effectively content repositories that provide APIs to allow content to be delivered to any channel.   Competition in the market Over the past few years, we've seen an ever-increasing number of API-first content management systems emerging onto the market. These range from cloud-based, SaaS offerings such as Kentico Cloud through to self-hosted, open source offerings, typically developed by agencies and one-man bands. The next few years will be interesting as the various vendors jockey for position in this space. We can expect to see a number fall by the wayside, most likely due to low adoption rates, lack of investment or poor positioning. Many of the API-first platforms are doing the same thing, which is a different ball game to the traditional CMS, where the competition often lay in rich feature sets and added extras (e.g. Digital Experience Management or Digital Marketing). The winners will be those with robust roadmaps, strong positioning and the cleanest implementations of features and APIs. However, traditional CMSs can't be overlooked. It's easy to think that they will simply fall at the feet of this new generation, but many of them have been plotting their own route into this space. Some, such as Kentico and its Kentico Cloud product, have moved into this space with a separate headless offering, while others, such as Sitecore, Episerver and Drupal, are adapting their own platforms to do the same. Arguably, the latter are headless solutions, as the presentation layer is separated out, allowing you to distribute to multiple channels; but, since they are to a degree still the monolithic solutions of old (lots of features for all kinds of jobs), it is easier to think of these as simply hybrid. It will be interesting to see how these solutions fare over the coming years – particularly as their current licence models aren't necessarily geared towards modern architectures in the same way as API-first solutions.   CMS Selection CMS selection is an interesting one. Reviewing Requests for Information and Requests for Proposal from the past three years, there have been plenty containing feature matrices of such magnitude that even Everest seems a trifle in comparison. For traditional CMSs, this is often fine. These CMSs are crammed full of features. However, when you move to API-first, that feature matrix becomes a little redundant. These systems aren't crammed full of features since they serve a specific purpose – to curate and serve content. As a result, we have started to see a change when it comes to those clients embarking on the CMS selection journey. 
Those matrices will occasionally pop up but, more often than not, these matrices are moving aside as businesses start to grasp the capabilities of headless CMS and the accompanying microservices-based architecture. Feature matrices aside, another aspect to CMS selection is that of cost. Arguably, the initial investment in traditional CMS is lower as you are typically investing in one platform to rule them all. With the headless CMS approach, there is not only the subscription but also the cost of the infrastructure, service layer and additional systems required to deliver certain functionality. However, this is a short-term view. To truly consider cost, we need to take a long-term view and factor in upgrades, maintenance, patches. It is at this point that the headless CMS and its accompanying eco-system comes into its own offering a more competitive longer-term cost. The headless approach is a shift in mindset and not necessarily an easy shift. Traditional content management is ingrained in the market and full education on headless takes time. But, over the coming years, I’d expect the figures in the Kentico report to creep up when it comes to both knowledge and adoption. MMT’s recommendation is to think through your CMS selection criteria. Do you really need a long list of features? Based on experience, I would argue that this is probably not necessary. To allow for the inclusion of headless/API-first CMS, take a different approach and think about practical, everyday usage scenarios, e.g. common tasks, potential scenarios. And think through the long-term investment into the platform. You will need to factor in initial investment, infrastructure, upgrades, maintenance and ongoing fees.   Client-side adoption Moving further downstream from CMS selection towards the practical application of headless, businesses are already moving in that direction – some faster than others. It’s important to note that this isn’t simply a lift and shift. Businesses often have lots of systems built on a particular infrastructure (or infrastructures) that may be tightly coupled. Shifting all of them requires careful planning. The key ingredient here is the service layer. Elements need to be separated out and the service layer is used to power all manner of digital experiences and channels. It can be tricky to get right and we have seen good and bad implementations. It’s not cheap by any means but if implemented correctly, can give you big rewards further down the line – flexible architecture, access to the latest technology, improved performance and speed to market. As a result, what we have started to see is an increase in experimentation. Businesses are using headless CMS to power microsites and campaign sites to enable them to understand the architecture they require and to start creating earlier versions of their service layers to aid them in forecasting expenditure and effort. Whether you opt for a hybrid CMS or a headless CMS, this is arguably one of the most important elements to get right. Get this right and you provide a solid foundation for the future. Get it wrong and you could be inviting further rework. Be incremental in creating your service layer, adding to it piece by piece as you uncover the elements you need.   Developers Developers, on the whole, love headless. And why wouldn’t they? 
The separation of concerns and the flexibility that headless offers give them free rein to embrace modern front-end frameworks and patterns to deliver more sophisticated and engaging experiences across a range of channels. They have full control over the presentation layer so, client budget permitting, the sky is really the limit. It’s no surprise that this is reflected in the report and we’re seeing similar trends here in the market. Increasing numbers of developers are diving into the API-first world which in turn has a knock-on effect for recruitment and retention within businesses – both agency and client-side. For agencies, this is a tricky one to manage as they need to ensure they have developers with the right skillsets. The signs are pointing to the headless and hybrid approaches as the future (for now at least) but coupled CMSs are not going away just yet. There’s a balancing act to maintain here while we strive towards that future. Unfortunately, there’s no magic wand to wave here. This hinges on business decisions made on future architectures and development approaches plus, for agencies, new business and marketing activities. Start the conversation now – how could a microservices architecture benefit the business (or the client if you’re an agency), what impact does this have on current business operations and what effort/budget is required.   Editors While developers welcome and embrace the move to headless, the same can’t be said for content editors. The traditional CMS encouraged editors to think of content as content that would be pushed out onto a website and, as a result, all the text and media was architected in the form of pages. The move to hybrid and headless CMS steps away from that notion (in general as there are situations where you may use either of these approaches to deliver simply a website) towards thinking of content as essentially raw text blocks. The content you are creating is material that could be used on a website, an app, a smartwatch, through voice, through VR, etc. Some editors are taking to this like a duck to water – especially those who already have experience of solutions such as GatherContent where content is treated as content with no ties to a particular channel. For others, it is a step into the unknown – much like it was in the early days of web content management.   If you are moving to headless CMS, don’t underestimate the time needed to train and support editors during this transition. Headless CMS removes in-page editing (and in some cases the WYSIWYG editor) which can be a daunting scenario for editors. With carefully structured processes and frameworks which they can follow as a guide, the transition can be made as painless as possible.   Summary The perception of headless on the whole has been very positive from all quarters and the benefits that an approach like this yields, when implemented properly, are clear to see – speed to market, ability to innovate with fewer constraints, flexible architecture, ability to engage with the latest technology, ability to reach different channels and access to a larger developer pool. If you are considering a move to this approach, I would urge you to take heed of the areas I have highlighted. To get this right, the move should be careful and considered so you can make the most of the opportunity. 
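As a practical footnote to the above, the short Python sketch below shows what “API-first” looks like from a developer's point of view: content is pulled as plain JSON over HTTP and can then be rendered by any channel. The endpoint, key and response fields here are hypothetical; every headless CMS defines its own delivery API, so consult your vendor's documentation for the real URL and schema.

```python
import json
from urllib.request import Request, urlopen

# Hypothetical endpoint and response shape for illustration only.
CONTENT_API = "https://api.example-cms.com/v1/items/homepage-hero"
API_KEY = "replace-with-a-delivery-key"

def fetch_content_item(url: str = CONTENT_API) -> dict:
    """Pull a single content item as plain JSON, ready to render on any channel."""
    request = Request(url, headers={"Authorization": f"Bearer {API_KEY}"})
    with urlopen(request) as response:
        return json.load(response)

# The same JSON can feed a website template, a mobile app or a voice interface,
# because the CMS never dictates the presentation layer.
if __name__ == "__main__":
    item = fetch_content_item()
    print(item.get("title"), "-", item.get("summary"))
```

In a real project this call would sit behind the service layer discussed earlier, so that websites, apps and other channels all consume content through one consistent interface.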
### Blockchain Technology | Turning of the enterprise tide Some of the biggest players in the corporate world have been getting into the blockchain game, with most of them looking beyond cryptocurrency to the potential of blockchain technology to help them improve processes, transform business models and realize completely new opportunities. Walmart, for example, recently filed a patent application for a blockchain-based system that would allow customers to register their purchases, with the aim of establishing a marketplace where customers could easily resell items. This is in addition to another patent the retail giant filed in March for a blockchain-based “smart package” delivery system. IBM has also been working with Walmart on a blockchain-based food supply chain tracking system, with successful pilot projects tracking pork supplies in China and mangoes in Central America. That partnership has expanded into a consortium of major companies, including Dole, Unilever and Nestle, that are further exploring the possibilities for this project. Microsoft is working with JPMorgan Chase and several other corporate giants on a similar venture based on Ethereum that would compete with IBM’s system. Microsoft is also collaborating with both Accenture and Bank of America on yet more blockchain projects. That’s only the latest tip of the iceberg in blockchain interest by major companies. Others developing blockchain applications include Chinese e-commerce giant JD.com, shipping line Maersk, Airbus, Daimler, BAE and major financial players including HSBC, ING and U.S. Bank.   What Blockchain Technology Brings Blockchain technology is probably most well-known for cryptocurrencies such as Bitcoin. But its technological foundations offer the potential for many more applications. Blockchain could improve consensus and democracy in political and work environments and deliver operational efficiencies, data verification and workflows that improve customer engagement and trust. Chief among the blockchain’s features is its ability to create trust. A blockchain stores data across a huge network of computers that constantly verify information with each other. Each computer on the blockchain network has a copy of the exact same data, so it cannot be altered after the fact. This decentralised and immutable nature is designed to allow parties who may not trust each other or a third party to recognize a single truth and agree on transactions. Many of the top 300 companies in the world say that between 50% and 70% of their assets are soft assets, meaning data and software. Often, elements of these soft assets could be useful to others. Blockchain and trustless environments backed by smart contract technology can enable them to safely share and monetise these elements in a trackable way.   Offering Proof Blockchain technology is already showing promise for established companies in the realms of the supply chain, data security and finance, among others. It could also be a boon in situations where enterprises need verified proof of provenance, actions or claims, including in court or when dealing with audits or enforcement. In order to rely on evidence, all parties must be able to trust that it is complete, accurate and unchanged. This can be achieved by using the blockchain to record transactions. When digital records, such as data transactions, events and documents, are stored on the blockchain, they are recorded in an immutable ledger. 
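To make the idea of an immutable, hash-linked record concrete, here is a minimal Python sketch of a single-node ledger in which every entry's fingerprint covers the previous entry, so any after-the-fact edit is detectable. It is illustrative only: a real blockchain adds distributed consensus across many computers, which is what turns "detectable" tampering into "practically impossible" tampering.

```python
import hashlib
import json

def _hash(record: dict) -> str:
    """Deterministic SHA-256 fingerprint of a record."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

def append(chain: list, data: dict) -> list:
    """Add a new entry whose hash covers both the data and the previous entry."""
    previous = chain[-1]["hash"] if chain else "genesis"
    entry = {"data": data, "previous_hash": previous}
    entry["hash"] = _hash({"data": data, "previous_hash": previous})
    chain.append(entry)
    return chain

def is_intact(chain: list) -> bool:
    """Verify nothing has been altered after the fact."""
    previous = "genesis"
    for entry in chain:
        expected = _hash({"data": entry["data"], "previous_hash": previous})
        if entry["hash"] != expected or entry["previous_hash"] != previous:
            return False
        previous = entry["hash"]
    return True

ledger = []
append(ledger, {"event": "goods received", "po": "PO-1042"})
append(ledger, {"event": "invoice approved", "po": "PO-1042"})
print(is_intact(ledger))                  # True
ledger[0]["data"]["po"] = "PO-9999"       # tamper with history
print(is_intact(ledger))                  # False
```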
Each piece of data or event is permanent and can be linked to related pieces of data in a chain for a transparent and unchangeable record of a series of events, transactions or other data. Not only that, but the blockchain offers far more security than traditional databases. A hacker would need to breach a majority of the computers in the network at the same time in order to falsify information or otherwise compromise data — which is practically impossible to do.   Proving Provenance One example of how blockchain can help businesses is in the case of manufacturers fighting counterfeiting or proving the provenance of their products. Traditionally, manufacturers have stored data about their supply chain and the provenance of their goods on their own individual databases — a system that is not very secure or automatically trusted among all parties. Using the blockchain, a manufacturer can record all the various points along a particular product's journey through the supply chain, creating a permanent record of each product's provenance. Using this type of record, a company can prove that it did not make a counterfeit item. If a serial number were copied, for example, the company could produce the record of a product's journey to prove that the counterfeit product is a fake.   Living up to Potential Blockchain technology is still in its infancy, and there are challenges that need to be overcome for it to live up to its true potential. But the future holds many exciting possibilities for the embrace of revolutionary solutions in areas ranging from finance to providing verified proof — and many more to come. ### Smart Contracts | The rise of Co-Ownerships The development of smart contracts offers investors security, speed and accuracy in a new way. It gives investors assurance that what they are investing in is trustworthy, key in a market like precious asset co-ownership, in which trust is an underpinning factor. Having grown out of the world of blockchain, ‘smart contracts’ are ushering in a new age of legal agreements and are destined to become mainstream in legal practice.   Smart Contracts Smart contracts are lines of computer code which act very much like physical contracts, with terms that must be met. Once the terms of an agreement are met, the contract is automatically and immediately executed on the blockchain. As smart contracts are based on the blockchain, and so live on the internet, they can execute transactions very quickly. Without the need for processing documents manually, contracts can be organised efficiently and across borders, thereby streamlining the whole transaction process and adding tremendous value to a functioning co-ownership economy. As with any legal document, a contract's terms and outcomes must be explicit. Smart contracts allow no room for miscommunication or misinterpretation, as one of their primary functions is to record all terms and conditions in explicit, immutable detail on the blockchain. Thus, they drastically cut down on time lost to sometimes ambiguous clauses and sentences. Smart contracts essentially help to avoid the pitfalls of undefined conditions or events.   The importance of the Blockchain Many smart contracts, including those at TEND, are written on the Ethereum blockchain. The Ethereum blockchain makes it easy to create arrangements tailored to specific requirements, and then store them on its decentralised and independently verifiable public record. 
A significant advantage over offline contracts is that, because smart contracts are self-executing computer code, all parties can be sure that they will be carried out regardless of human involvement. This makes the system both highly efficient and trustworthy. Blockchain gives a high degree of confidence in the agreements made. Stakeholders gain peace of mind, as it is nigh on impossible to change or tamper with data unnoticed. Critically, this means that users and partners can be assured that the contract will be carried out even if they have never met the countersigner.   Co-ownership Co-ownership is a legal concept where more than one person shares the ownership of an object or asset. Smart contracts enable the so-called tokenisation of such objects or assets, i.e. creating fractions thereof. Tokenisation is thus playing a key role in the next evolution of the sharing economy, opening up access to previously unattainable, precious and collectable assets to a vast audience. Only a few exceptionally wealthy individuals can consider purchasing and maintaining a million-dollar Aston Martin DB4 outright. But tokenise it so that five or ten co-investors can share the outlay and suddenly it becomes a far more interesting and accessible investment. Smart contracts offer an effective and repeatable solution for correctly executing a sale or purchase transaction in a co-ownership scenario. Importantly, smart contracts significantly reduce the need for litigation should something go wrong. As they are self-executing, parties commit themselves to abide by the rules and determinations of the underlying code. This facilitates total transparency of the transaction and gives investors full peace of mind.   Growth of Co-ownership The growing demand for co-ownership can also be attributed to a desire, particularly amongst the millennial generation, for an asset-light lifestyle in which far greater priority is given to life-enriching experiences. Sharing economy companies are enabling people to live this lifestyle while still investing in the precious assets they want to experience. Co-owning an asset rather than buying it outright can enable investors to enjoy a far wider range of opportunities and experiences. Why buy a villa in Mykonos outright when you can potentially have a quarter share in four properties across the globe (and likely use that Mykonos villa just as often)? For sellers, moving to a co-ownership model removes the challenge of finding a single buyer and can make a very expensive asset much more liquid. This model also allows owners to retain a share in an asset to which they may have a long-standing emotional attachment: an attractive proposition for an enthusiast seeking to liquidate an asset (or simply share their passion) whilst keeping a vested, though reduced, interest in it. The modern trend towards an experience-driven lifestyle has led to more people, especially younger people, becoming interested in investing in multiple co-ownerships rather than one single asset. This has led to a growing demand for co-ownership, in which smart contracts play a critical part by ensuring secure and trustworthy execution. The rise of smart contracts is not only helping the co-ownership market to flourish but has also seen the advent of the sharing economy 2.0, where companies and individuals can reach agreements quickly and efficiently and trust those agreements to be immutably recorded.
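As a rough illustration of the tokenisation idea (hypothetical numbers and class names, not a real token standard such as ERC-20): an asset's ownership is split into a fixed number of fractions, and a transfer simply moves fractions between holders.

```python
class TokenisedAsset:
    """Toy model of fractional co-ownership; not a real token standard."""

    def __init__(self, name: str, total_fractions: int, initial_owner: str):
        self.name = name
        self.total_fractions = total_fractions
        self.holdings = {initial_owner: total_fractions}

    def transfer(self, sender: str, recipient: str, fractions: int) -> None:
        if self.holdings.get(sender, 0) < fractions:
            raise ValueError("sender does not hold enough fractions")
        self.holdings[sender] -= fractions
        self.holdings[recipient] = self.holdings.get(recipient, 0) + fractions

    def share(self, holder: str) -> float:
        return self.holdings.get(holder, 0) / self.total_fractions

# A classic car split ten ways among co-investors.
db4 = TokenisedAsset("Aston Martin DB4", total_fractions=10, initial_owner="seller")
for investor in ["ann", "ben", "cara", "dev"]:
    db4.transfer("seller", investor, 2)
print(db4.share("ann"))     # 0.2 - a fifth share
print(db4.share("seller"))  # 0.2 - the seller retains an interest
```

On a blockchain, the same bookkeeping would be performed by the smart contract itself, so no single party controls the register of holders.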
### Big Tech Projects | Quick Business Wins are better I was recently asked a question by an up-and-coming PM who had been tasked with migrating their business's ERP off an old mainframe. The PM was a new customer of mine, so I wanted to protect him if I could. He asked, "What is the most important thing to do when managing large ERP migration projects?" My quipped response was, "Well, aiming to keep your job is a great starting place."  This led to the immediate question, "No, seriously, what are your thoughts?" and, after a moment's pause, I stood by my previous answer. I substantiated it by explaining the difference between a large bridge-building project and any large IT project.  A large bridge-building project doesn't change destination halfway through; the priorities will never change from finishing the bridge.  It also has a fairly fixed and pretty well-known set of costs: the pre-project investigations, surveys and the like are normally very thorough, and ultimately the objective, to get to the other side, is finite and well understood by all.   A large IT project requirement, more often than not, is not completely understood by the customer or the solution delivery agent. We tend to look at the same steeple but from different sides of the church (if you ever want to test this, draw the church from one side and compare a colleague's view of the other side; they are never the same), and the gap in understanding can be a chasm which will invariably create a delta between the requirement and the deliverable.   Often, large IT projects are born in minutes, with very little or only cursory work done to survey the current situation.  In addition, those doing the surveying may not fully understand the current and historical business challenges, or the technology in place, its limitations or its strengths, often leading to an ill-informed project that carries far more risk than was necessary to deliver a given set of business outcomes.   Balance is a constant state of recovery   I've seen many IT projects start out as one thing and then, mid-flight, evolve or change the requirement through a capability epiphany or a new business priority. This is quite normal, and in a world of liquid tools for liquid needs it essentially has to be accepted; moreover, the person who embraces this with the right skills and awareness will be the ring bearer to rule all others. I remember running a project to build a production workflow system that, halfway through, because of the workflow element and the timing of the WorldCom and Enron debacles, also became the solution for achieving Sarbanes-Oxley compliance in limited time. In fairness, I can't remember which branch of the project ended up being delivered first, but it certainly didn't end where the project was headed at the beginning. If you therefore accept that the destination of any large IT project may not be as fixed as the end of a bridge build, then you are more likely to realise that you are in a constant state of change or recovery, constantly reacting to changes in business requirements and priorities.   Life is a Journey, not a Destination   All of a sudden, the deliverables on the journey become far more important than the summit that may never be reached, as essentially they may be the only wins that are achieved.
Knowing how to divide the journey into a clear set of separate business wins is the key to success here; if you can get the business to agree to them as mini projects rather than one big project, then all the better.   The tantalisation of swapping out the old tech beast for a new shiny playmate fades and is quickly replaced with a desire to tie immediate effort directly and quickly to a set of great business outcomes.  As an aside, and for the politically aware and upwardly mobile, outcomes that are aligned to new revenue are always celebrated with more bubbly than making the old horse run a little faster. That is not to say that there isn't value in the latter; occasional sorties here are where the respect of the old guard will be earned. I said to the PM, "How long do you estimate this project will take?" He said, "I think I can get it done in 3½ years."  I said, "In my experience, leaving a mainframe will take you at least 7 years, so that is double your estimate," and then gave him the usual stack of reasons: unknown unknowns, unforeseen costs and the shifting priorities of the business. I asked him what the business wanted to achieve. He said, "We want to reduce the cost of running IT and become more agile."  I asked him to articulate how much they wanted to save and what new agility they hoped to gain. You won't be surprised to hear there was no exact or detailed answer to either question.     Rolex or Timex, the time is still the same, isn't it   Drilling further, there were some projects that had been difficult to deliver because of integration with the old ERP.  So their first answer was to throw the baby out with the bathwater. At this point, I led him to a few articles and software tools that would have opened all the doors they wrongly perceived were locked.  A simple search on the web could have found the same answers, but they had failed to articulate the question and were wedded to the self-perpetuated belief that the problem was the old ERP, along with how much shinier the new Rolex looked compared to their perception of what they had being a Timex. As it turned out, they actually only had a few significant projects that were bottlenecked. The two most significant were a customer self-service suite (mobile app and website) and a modern CRM; they had decided on Dynamics.  He said, "We think we will be using the new CRM system in 3 years." I said, "You could be using it in 3 months without putting your job on the line or your entire business at risk." The conversation went on and he was listening, and moreover wanted others to listen too, so I also imparted some best-practice advice around legacy technical debt reduction to his broader team, so that they now have a way forward that gives them agility today, reduces technical debt and commoditises their asset portfolio over time.   Armed with the new knowledge, I said why not go and ask the business stakeholders what is important to them.  You won't be surprised that two weeks later the plans had changed from a lengthy plumbing-change project to quite an exciting set of new revenue-related business projects to be delivered within the next six months. Being halfway up the mountain reduces risk, delivers business wins sooner, and the PM will be given medals, not his cards.   The Net Net   If you can't bracket a business deliverable to 3 months, then look harder until you can.
Don't fall into the trap of believing new technology will increase business agility with a straight line to revenue and bubbly in the boardroom. When it takes too long to deliver and you aren't responsive to in-flight business needs, it won't matter how shiny the new toy is if and when it eventually arrives, because you won't be there to see it. New shiny tech IN and old rusty tech OUT is great, and sometimes it is the only way, but in the majority of cases it's a fool's errand. Remember, if you can enhance the old rusty tech with some new shiny bells and whistles, you won't waste time reinventing the wheel and will invariably deliver great business outcomes more swiftly, and in turn reap the accolades and rewards. ### Social Housing and IoT | Mark Lowe, Business Development Director at Pinacl Solutions Pinacl has developed a solution which could save social housing providers thousands. We are already working with several housing associations who are keen to trial and adopt this new technology, recognising the potential for a rapid Return on Investment. Pinacl's housing solution aims to reduce the maintenance costs of properties by deploying sensors to gain real-time insights into the health of each property, enabling the Housing Provider's limited resources to better support tenants and avoid costly refurbishments. Alerts can be automatically generated if problems arise or are predicted to, notifying the tenants and housing provider to take corrective action so that the property and tenant remain healthy. To understand the advantages of Pinacl's Social Housing Solution, book a free demo today. Please complete this form to request a free social housing solution demo from one of the Pinacl team. ### Industry 4.0 | Inspired by a technological leap Western economies have gone through three industrial revolutions and are now on the cusp of a fourth: Industry 4.0. As with previous revolutions, this latest one has been inspired by a technological leap. In the late 18th Century, the birth of mechanisation saw the First Industrial Revolution. The development of mass production and automation characterised the Second and Third, but what is driving the Fourth and what does it mean for business and manufacturers?   What is Industry 4.0? Industry 4.0 has been accelerated by the growth of the Internet of Things (IoT), artificial intelligence (AI) and connected machinery, which has given birth to smart factories. These advances have delivered numerous benefits for manufacturers by making the tech work harder than ever before.   Smart factories The Fourth Industrial Revolution, and in particular smart factories, is helping companies shorten lead times, minimise errors and incorporate Just-in-Time (JIT) processes. This means manufacturers are empowered to be more agile than ever before, allowing them to cut costs and increase profits whilst also driving productivity. Smart factories are characterised by cyber-physical systems which monitor processes by creating a virtual copy of the physical world and making automated decisions. Using the Internet of Things, cyber-physical systems communicate and cooperate with each other and with humans in real time. This has revolutionised the way businesses and manufacturers interact and do business, by allowing them to react faster to changing trends and demands whilst also minimising waste. Another huge advantage of smart factories is that they are operable 24 hours a day. These 'always on' factories can help manufacturers increase output and therefore profits.
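Whether the sensors sit in a social housing property or on a factory machine, the monitoring pattern described above reduces to the same loop: read sensor values, compare them against tolerances, and trigger an automated decision or alert. The sketch below is a toy illustration with hypothetical sensor names and thresholds, not Pinacl's product or any specific IIoT platform.

```python
from dataclasses import dataclass

@dataclass
class Reading:
    sensor: str
    value: float

# Hypothetical tolerance bands; a real system would pull these from the
# asset's configuration or digital twin.
TOLERANCES = {
    "spindle_temp_c": (10.0, 75.0),
    "vibration_mm_s": (0.0, 4.5),
    "humidity_pct": (30.0, 60.0),
}

def check(readings: list[Reading]) -> list[str]:
    """Compare each reading to its tolerance band and return alert messages."""
    alerts = []
    for r in readings:
        low, high = TOLERANCES.get(r.sensor, (float("-inf"), float("inf")))
        if not (low <= r.value <= high):
            alerts.append(f"{r.sensor} out of tolerance: {r.value} (expected {low}-{high})")
    return alerts

def act_on(alerts: list[str]) -> None:
    # Automated decision point: here we just print, but this is where a work
    # order, technician call-out or tenant notification would be raised.
    for alert in alerts:
        print("ALERT:", alert)

act_on(check([Reading("spindle_temp_c", 82.1), Reading("vibration_mm_s", 3.2)]))
```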
This also gives smaller manufacturers an opportunity to challenge larger ones, as they can increase their output while remaining agile and responsive to their customers' demands. Mass customisation has only become possible due to Industry 4.0. The ability of modern smart factories to change production tolerances quickly has made minor changes to mass-produced products a possible and cost-effective reality. Previously, manufacturers would have had to change tools and machinery to produce different products. Shorter changeover and lead times mean that factories are able to produce a greater range of customised products efficiently.   AI and IoT A defining feature of Industry 4.0 is the smart and connected aspect of modern machinery. Machines can 'communicate' with each other and the human operator. One of these functions is performing self-diagnostics. This means a machine can detect when it is about to break or wear to unacceptable levels. By doing this, manufacturers are automatically alerted to potential defects and given actionable insights and solutions to solve them.  This form of AI can even go as far as automatically calling out a technician, when it detects a potential problem, to come and resolve the issue. All of this leads to higher reliability and lower repair cycle times. Being able to prevent a problem from occurring by recalibrating the machinery before a batch of defective products is produced helps businesses minimise wasted time and resources and maximises operating profits and efficiency.   Robotics and Machine learning Machine learning – the science of computers learning and acting like humans – has been one of the greatest advancements aiding Industry 4.0. Now, machines improve their learning over time in an autonomous fashion. This is done by feeding them huge amounts of historical big data collected by IoT, which the machines then use to identify patterns and form conclusions. This learning happens in real time, so it can be actioned immediately. The aim is to have the machines optimise themselves and be able to detect when something may be about to go wrong. When twinned with advanced robotics, machine learning can help manufacturers increase or decrease production levels, call technicians and replace parts proactively rather than reactively. With Brexit looming on the horizon, casting a shadow of uncertainty over the British economy, many SME manufacturers may be worried about their immediate future as increasing production costs and overheads put a strain on their bottom lines. Industry 4.0 is helping these companies cope with potential challenges and future-proof themselves by optimising their processes and using actionable insights to minimise production delays, which ultimately increases profits. ### Tech From The Top – S1E1 – Russell Horton – CEO, FluidOne Comms Business Live is delighted to announce the new live stream series 'Tech From The Top', presented by Adrian Barnard, CEO of Alpha Beta Solutions. The series will focus exclusively on top CEOs and directors across the ICT sector and will see Barnard delve into the business lives of the Channel's top minds. In our first episode, Barnard is joined by the CEO of FluidOne, Russell Horton. In this in-depth interview, Russell explains not only his ambitions for FluidOne but also how he structures his time in order to progress towards that goal. Find out how Russell keeps his staff motivated, learn about his business philosophy and much more at 3:30pm on Monday 2nd July.
### Information Governance | It's about business For many organisations, information governance is treated as a means to meet the bare minimum requirements of data compliance regulations. But the benefits of sound information governance run way deeper than that, including cost and efficiency savings, says George Parapadakis, Director of Business Solution Strategy at Alfresco Software. I meet hundreds of IT decision makers every year and I am particularly fascinated when we hit the subject of information governance. For example, I often hear that most organisations are confused about where to start and are convinced that their competitors are in far better shape than they are. It might surprise you to learn that your competitors are probably about the same distance along the information governance path as you are, since most organisations have yet to truly gain control of their data, which often exists – unclassified - in siloes. The challenge comes when IT decision makers try to regain control of their data, seeking ever-cheaper storage solutions as corporate data continues to rise exponentially. Continuously migrating data to cheaper storage is not a long-term solution. Information governance can help contain the exponential growth patterns, by managing information proactively, based on its intrinsic business value. IT decision makers, after years of procrastination, are eventually confronted by the inevitable question: When it comes to dealing with information governance - if we don’t tackle it now, when will we?   Taking a strategic approach to information governance Many organisations do just enough with their information governance to ensure compliance with ever-changing data regulations across the world. Information security and data privacy are often treated separately from data management. If the purpose of IT is to deliver services across the business, then IT decision makers should not adopt a “let’s just stay out of trouble” approach, but rather look to run an efficient set of services. After all, the goals of compliance and wider data management are the same: gaining control of our data to deliver wider business benefits. For example, most information that organisations hold is redundant. This could include duplicate content, out-of-date documents, and draft or trivial information. Reducing content volumes lowers storage requirements and costs, accelerates finding the right information, increases work cycles and makes data analysis and information governance that much easier. This leads to great visibility, which leads to better business decision making. The “keep everything, just in case” mentality of most legal departments, is no longer a viable operational model.   Taking control of information governance Organisations need to shift their mindset away from compliance, to taking control of information. Most companies cannot even begin to consider leveraging Artificial Intelligence (AI), machine learning (ML), blockchain, or any exciting emerging data technologies until they have built a deep culture of data quality and data control. Many IT decision makers I meet are unclear on where to start. It’s a big topic, certainly, but I always reiterate that information governance is not that hard to do. You don’t necessarily need to outsource to a big consulting firm and engage of years of process re-engineering. It all starts by bringing method and structure to information governance. Be sure to align governance to your business strategy and don’t be driven solely by compliance. 
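One way to bring that method and structure is to make governance rules explicit data rather than ad-hoc decisions. The sketch below is a minimal illustration under assumed inputs: the classification names, retention periods and access flags are hypothetical, not any vendor's policy or API. It maps a record's classification to a retention and access decision that could be applied automatically.

```python
from datetime import date, timedelta

# Hypothetical tiers: the most sensitive data gets the tightest controls
# and the longest retention; trivial content is disposed of quickly.
POLICY = {
    "financial-record": {"retain_years": 7, "restricted": True},
    "contract":         {"retain_years": 6, "restricted": True},
    "working-draft":    {"retain_years": 1, "restricted": False},
    "trivial":          {"retain_years": 0, "restricted": False},
}

def disposition(classification: str, created: date, today: date) -> str:
    """Decide what should happen to a record, based on its classification."""
    rule = POLICY.get(classification, POLICY["trivial"])
    expiry = created + timedelta(days=365 * rule["retain_years"])
    if today >= expiry:
        return "dispose"
    return "retain-restricted" if rule["restricted"] else "retain"

print(disposition("financial-record", date(2015, 4, 1), date(2018, 7, 1)))  # retain-restricted
print(disposition("working-draft", date(2016, 1, 1), date(2018, 7, 1)))     # dispose
```

Because the rules live in one place, they can be reviewed, audited and changed without touching the systems that enforce them, which is the essence of the automation argument that follows.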
Not all information is equal, so tier your data, prioritising your most sensitive data that would be most compromising in a privacy or competitive breach. Also, be sure to build a future-proof, self-governing platform by automating information governance. Information does not exist in a void, so be sure to use governance controls in your process design and let the system drive that. Automation reduces risk, ensures consistency and reduces operational overheads. At Alfresco, we recommend the following steps for sound information governance:   1)    Aim for invisible information governance Take advantage of technologies that enable information governance to happen behind the scenes, such as using intelligent classification and embedded governance controls, which take away the onus of compliance from the end-users.   2)    Manage records holistically A strong information governance programme requires a unified records management strategy. Look to manage records with a central hub, so that information disposition schedules run consistently and frequently, with complete transparency and auditability.   3)    Build in extra controls Look at features that allow your business to limit which information assets people are allowed to see, and what they can do with them. In today’s strict data protection regulations, this is essential.   4)    Future-proof your solution Make sure your solution is equipped for the future. This includes looking for cloud-ready solutions with an open architecture that is able to scale and manage multiple file types. Make sure it is certified or can support leading industry governance standards like DoD 5015.02 and ISO:15489 There is so much more to information governance than purely meeting compliance regulations. The question now is where along that journey is your organisation, where do you want to take it and how are you going to get there? How confidently can you answer that today? ### #TheCloudShow – S2E6 – Cloud and the Law It is fair to say that the lawmakers have always been behind the rate of technology advances when it comes to updating laws and regulations to keep in line with the pace of technology. Indeed, it is virtually impossible. But what are the laws that you need to understand and what are coming down the line that will have a fundamental effect on how you manage your technology stack and your data? Indeed, what control do you have? A good example of this is the recent CLOUD Act in the US which would change the legal requirements for law enforcement agencies seeking to gain access to data stored on overseas servers. Born from the dispute between Microsoft and the US Department of Justice after Microsoft won a victory in federal appeals court over the DOJ in 2016, invalidating a warrant requiring the company to turn over user emails stored on a server located in Ireland. One could also look at Net Neutrality laws as well that could fundamentally affect consumer rights. As the world gets more immersed in public cloud, there are fewer players who will control your data, and as such, this can bring risk to your business. So what should we be looking out for? With guests: Frank Jennings and Rebecca Butler from Wallace LLP. ### The GDPR Journey | How was GDPR for you? After the GDPR journey came the GDPR deadline. This came and went and the world didn’t end. Some people asked me why there wasn't a transition phase to get ready in time. I responded that they were in the transition phase as the law actually changed in April 2016. 
A plethora of organisations scrambled to update their systems and processes before GDPR became enforceable on 25 May 2018.   Those dreaded emails Many took this to mean they had to send lots of emails to their database requesting users to opt back in. Some didn’t distinguish between users who had given permission already and those who hadn’t. Or between existing clients and potential clients. Or personal and business users. If they had, they could have saved a lot of unsubscribes by those too lazy to click the button. Or people like me: I simply used it as an opportunity to reduce the amount of spam I receive. In some ways, perhaps it was a good exercise. Those people who actively clicked to give permission are exactly the users the organisations want. I'm waiting to see if the Information Commissioner’s Office will end up handling complaints from people annoyed at receiving emails from organisations they had never given their permission to in the first place.   Higher fines? Not yet GDPR created fear because of the steep rise in fines. Before the GDPR journey began, the most severe fine in the UK was £400,000, which is 80% of the maximum. Fines could now go up to €20,000,000 (about £17m) or 4% of annual global turnover, whichever is higher. This is something businesses want to avoid. The ICO has continued to issue fines after 25 May. This includes Yahoo! (£250,000) after a cyber-attack compromised their network (in 2014) and Gloucestershire Police (£80,000) for revealing identities of abuse victims. But these are under the old law and we still await the first GDPR fine.   Cambridge Analytica & Facebook I had thought Cambridge Analytica would be one of the first. Most people know about this story. Cambridge Analytica took personal information from more than 50 million Facebook profiles and used it to build a system that could target US voters with personalised political ads based on their psychological profile. Employees of Cambridge Analytica, as well as the suspended CEO Alexander Nix, were even filmed boasting about using several dirty tricks to swing elections around the world. The Information Commissioner has been investigating it, assigning 40 or more people. Some say this is the largest investigation ever undertaken by a data protection authority. Of course, because GDPR became law in April 2016 any breaches since then could lead to a new, higher fine. Maybe it’s no coincidence that Cambridge Analytica has shut down.   The GDPR journey and Brexit For a while, people were confidently telling me that Brexit meant we could abandon GDPR. I told them they were wrong on two counts. First, the GDPR journey started and would be in force before Brexit. That's now happened. Second, even after Brexit, if the UK wants to send and receive personal data with its EU neighbours, it will have to adhere to GDPR. The key issue about Brexit is that the UK will have little influence over any possible changes to GDPR or other data laws. Recognising the importance of data flows, the UK asked for a special deal. The EU Commission rejected this. Brexit means Brexit after all. So, we will have to follow GDPR but won't be able to lead on it. Also, if the European Court issues rulings on GDPR, can the UK Supreme Court realistically ignore them and come to a different conclusion? 
The other factor to consider is whether the broad powers under the Snooper's Charter – or the Investigatory Powers Act, to use its proper title – will mean the UK is not deemed to provide adequate protection for data transfers. This will lead to Privacy Shield-style negotiations for the UK. And that will keep Max Schrems busy for even longer! ### Security Models | Changing in a world of digital transformation When it comes to successful internet security models, the following key concept has been the guiding principle – make it as difficult as possible for attackers to get into the network. This has traditionally meant building a security architecture with multiple layers of controls. For a lot of companies, added security was achieved by using multiple vendors for the same task, with the thought process being that if one failed to mitigate a threat, the other one would. This is what I call the coconut model – hard on the outside, almost unbreakable and resistant to all manner of threats. The "crown jewels" are safely inside the hard shell, which is well protected. The problem with coconuts, however, is that they have soft, liquefied insides – and sticking with this analogy, that means that once hackers are in, data can easily be accessed.   The downsides of the coconut model Over the years, attackers have become much more sophisticated, using zero-day vulnerabilities and bypassing all traditional security defences. Indeed, the main reason a dual-vendor strategy does not work is that all vendors are blind to zero-day vulnerabilities. Furthermore, it creates added complexity. Maintaining a cohesive and consistent configuration to enforce a company's defence policy is not easy, whilst the traditional model has also led to a poor user experience. As there are often multiple steps required to get to applications, exasperated users end up finding different means of bypassing the controls in their search for the best application available to support their job. With the coconut model, maintaining a cohesive system is undoubtedly a challenge, and the time it takes to deploy a new policy and ensure its proper deployment has long been an IT department's Achilles heel.   The need for change New security models have subsequently been needed as the geometry of the network changes. The cloud is transforming how companies do business and is accelerating new opportunities and markets. Applications are no longer physically located within the enterprise perimeter. Instead, businesses are turning to the likes of Office 365, Salesforce and Google Drive, which are all hosted in the cloud. Users also increasingly require connectivity from the internet as they are working outside the physical perimeter of the enterprise. Indeed, the world of work has swiftly moved away from the centralised model and is now structured with an increased level of flexibility. Employees are no longer tied to a fixed desk at one location. In fact, in a recent global workplace survey, 20-25 percent of workers currently telecommute with some level of frequency, while 50 percent of the workforce has a job that is compatible with at least partial mobile working. Employee mobility has become part of our day-to-day working life, meaning mobile devices are increasingly being used to access data and applications outside the company network.
Furthermore, in the spirit of getting their jobs done faster and more efficiently, employees are increasingly demanding frictionless access to their IT systems, no matter where they are, and it's up to the IT department to deliver this across their legacy infrastructure – without jeopardising network security. New security models are therefore required to control risk, whilst at the same time taking user experience into account. While the coconut model is based on protecting the inside by creating a very solid outer layer, I believe that we need to consider reversing the concept.   From the coconut to avocado security models This new model would be hard on the inside, where the critical data is protected, whilst the outside would be soft, enabling users to connect to applications and collaborate internally, as well as with their external customers and partners. It simplifies access for users, whilst at the same time protecting the enterprise's key assets. This is the "avocado" model. The fact of the matter is that not all assets need to be protected with the same level of controls. For example, intranet content, which is generally visible to all employees, does not require the same level of controls as the financial database, the HR system or industrial planning data. What businesses need to do is identify key assets and adapt their security controls accordingly for each one. Those with the greatest levels of control would form the core of the avocado. Businesses are increasingly embracing digitalisation; however, there's no doubt that the move needs to be thoroughly planned. It's not just about implementing cloud-based applications and seeing instant benefits. Instead, there are ongoing security implications that need to be continuously addressed. The old-fashioned coconut model no longer works in a world where cloud requirements for performance and bandwidth are dominating. Instead, security strategies need to adapt and evolve to ensure that the most important assets are always protected from today's threats, whilst still giving employees the flexibility and seamless user performance they expect. ### Intelligent Automation | The future of collaborative work Automation has delivered massive gains across various industries and is leaving an indelible impact on the future of work. Human labour is under pressure to adapt (and qualify) to work alongside these 'smart' machines. Occupation by occupation, field by field, intelligent automation is integrating itself into the daily aspects of our work.   So Why Intelligent Automation Now? The rise of 'ubiquitous computing' has tightly integrated cloud-based software services into all aspects of personal and professional life, while progress in distributed computation enables companies of all sizes to learn from this wealth of available data. Many professions are now focused mainly on laying the groundwork for the future of their own jobs with intelligent automation. As our capability to automate more and more cognitive aspects of work increases, we are no longer asking if it is possible to automate, but rather what the costs are – and if there are experts available to deliver it.   The Rise of a Two-tier Workforce This gives rise to an interesting dilemma: automation is 'leveling up' and competing with more and more complex tasks, shifting the mix of occupations and requiring the workforce to move towards higher-level skills.
At the same time, the skills necessary to deliver these solutions are being commoditized into "building blocks", which can be bought and sold in the nascent "skill economy" – freeing human experts from traditional employment. Demand for these critical technical skills far outweighs the supply, and human experts are hard-pressed to automate them in an effort to meet demand by scaling their abilities. Projects are quickly forming around this nascent problem, pools of internal and external workers are assembling into teams, and solutions are being built with scaling in mind.   The Future Experts Human experts analyze problems, create solutions, and teach intelligent automation systems which then scale – and deploy the product to the market. Continuous improvement, collaboration, and innovation are the driving forces in a technological future that replaces the manual creation of a system with a guided solution. Just eight years ago, software was written line-by-line in order to encode years of expert human knowledge for such applications as image analysis and speech recognition. Today, deep learning architectures surpass mere human domain knowledge in almost every practical application. Andrej Karpathy coined the term 'Software 2.0' in 2017 to describe how we are now moving to develop systems by focusing on preparing the teaching material, having the model learn from it, and automatically generating the solution. The result is an abstract pattern of decisions that is applied to perform a task.   Creative Destruction The creative destruction caused by technology is so pervasive that it's practically a cliché. It's difficult to ignore both the speed at which disruption affects society and the acceleration in the pace of this change. Technological innovations brought about through the development of new platforms are disrupting global markets within months rather than years. Apps such as Uber have revolutionized long-standing, ossified global industries like taxi services. Now the arms race is to deploy autonomous vehicles to the streets (and the sky), quickly disrupting the private car-share economy. We are now seeing the disrupters become the disrupted.   Enabling the Experts Artificial intelligence (A.I.) automation is only one piece of a much larger puzzle. As the need for innovation continues to accelerate, we will have to brace for the impact of continuous self-disruption. Knowledge is no longer as valuable as it once was, as it quickly goes out of date. The most sought-after skills are now 'meta-skills' that enable an expert to quickly adjust to change, coordinate the constant learning required, and self-guide their learning to reach proficiency in new territory quickly and effectively. Companies working in fields at the forefront of this societal and technological change have realized for years what is now becoming a commonplace reality: we have to change the way we work, enabling flexibility to work across domains, time zones and industries, from anywhere, whenever it's necessary.   The future of the team Teams may no longer be bound by traditional ways of collaboration, as the pace of change requires us to adapt the process to the problem. Human experts must focus on their creativity, learn to interface with technology, and use modern tools to upgrade and scale their professional impact, resulting in intelligent automation. At DREAM, we believe in the power of people drawn to innovation in order to solve problems.
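To make the 'Software 2.0' point above concrete: rather than hand-coding a rule, we prepare labelled examples and let a model learn the decision boundary. A minimal sketch using scikit-learn with made-up data; the features and labels are illustrative only, not a real maintenance dataset.

```python
from sklearn.tree import DecisionTreeClassifier

# "Teaching material": each row is (hours of abnormal vibration, temperature
# delta in C), and the label records whether the part later failed.
X = [[0, 1], [1, 0], [2, 1], [5, 6], [6, 4], [7, 7]]
y = [0, 0, 0, 1, 1, 1]          # 0 = healthy, 1 = failed

# Software 1.0 would encode the threshold by hand; here the model learns it
# from the examples instead.
model = DecisionTreeClassifier(max_depth=2, random_state=0)
model.fit(X, y)

print(model.predict([[1, 1], [6, 5]]))   # expected: [0 1]
```

The expert's job shifts from writing the rule to curating the examples and checking that the learned behaviour is the one the business actually wants.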
Project managers benefit from the guidance of A.I. bots enabling them to overcome problems (some of which they might not even be aware of without the benefit of the system).  Talent benefits from a more robust and accurate matching system of skills to job requirements, avoiding the pitfalls of human error and bias. DREAM enables projects to quickly ‘level up’ their game by automated solutions drawn from a global talent pool. We work to provide a platform for the global community to scale their professional expertise via an expert system which deploys their solution to a larger audience.  We envision a future of distributed expert teams working hand-in-hand with A.I. enabled bots to quickly deliver solutions.  The revolution will be augmented! ### RPA solutions | Could they hold the answer to our data woes? RPA, Robotic Process Automation, often gives the idea that robots are going to a) kill us all, or b) take over the earth. This has always been a hit in Hollywood movies and frequently keeps us glued to Netflix mega-series. It makes great entertainment and most of us accept it as fiction. But as the Guardian reports, when it comes to industry and jobs, automation is often treated as a serious threat – a prediction that may actually be stifling digital transformation and business growth, in areas utilising RPA solutions. McKinsey predicts that 50-70% of tasks are potentially automatable and that this will form the next generation of new technologies. At a time when dealing with huge volumes of data and avoiding errors is critical to using information systems like SAP effectively, organisations are turning increasingly to Robotic Process Automation (RPA) for repetitive, structured processes and some are seeing benefits that far exceed expectations. Typical use cases include automating processes like customer invoicing, journal entries, or asset depreciation. Deloitte describes the role of “robots” as performing “routine business processes by mimicking the way that people interact with applications through a user interface and following simple rules to make decisions.” Based on this definition, it’s easy to see how automation could help users who work with SAP by streamlining repetitive, rules-based tasks, but there is a big difference between generic RPA solutions and RPA solutions built specifically for SAP. Generic RPA solutions are designed to automate processes across many systems and therefore rely on surface-level record and execution mechanisms to mimic user interactions. This can cause the robotic solution to break when something as small as a change in screen resolution occurs. In addition, even what seems like a simple process from the outset can be complex and time-consuming to automate, as exception handling, delays in system performance and other factors must be built into the process by technical resources. These types of issues can cause project cost and time overruns, or even outright failure. SAP-specific RPA solutions integrate with SAP at a much deeper level, so while it’s simple for non-technical staff to record processes through the SAP interface, the robotic process occurs at the system level, making these solutions much more robust and enterprise grade. Solutions built with SAP-specific RPA technology also enable a user to quickly step in to fix any errors using familiar software such as Excel—resulting in faster rollout and ROI. 
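As a rough illustration of the RPA pattern just described (not SAP's actual interfaces or any vendor's product): a robot reads rows prepared by a business user in a familiar tool, validates them, calls a posting function for each one, and logs exceptions for a human to fix and re-run. The `post_asset_transaction` function and the CSV layout are hypothetical stand-ins.

```python
import csv

def post_asset_transaction(row: dict) -> None:
    # Hypothetical stand-in for the system-level call an SAP-specific RPA
    # tool would make (rather than driving the screen).
    print(f"posted asset {row['asset_id']}: {row['amount']}")

def run_robot(path: str) -> list[dict]:
    """Process each prepared row; collect failures for human review."""
    exceptions = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            try:
                if not row["asset_id"] or float(row["amount"]) <= 0:
                    raise ValueError("missing asset id or non-positive amount")
                post_asset_transaction(row)
            except (ValueError, KeyError) as err:
                # The user fixes these rows in the spreadsheet and re-runs.
                exceptions.append({"row": row, "error": str(err)})
    return exceptions

# failed = run_robot("asset_postings.csv")
```

The value is less in the loop itself than in the exception handling: rows the robot cannot post are surfaced in a form a non-technical user can correct.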
Decreasing transaction processing time The case study below shows how one business is using RPA to manage its financial data processing: A shared financial services approach allows accounting functions to transform delivery by consolidating processing into one specialist unit across multiple businesses. Vodafone's shared financial services centre handles the bulk of the organisation's financial management, covering processes and transactions such as fixed assets, purchase-to-pay, record-to-report and general ledger. Handling a huge number of assets within an SAP system is especially challenging within this dynamic environment, where requirements are always changing. In one area of the business, Vodafone was handling nine million assets and 100,000 postings per month, which would normally take six months to process. The tasks included using five different SAP screens and two different transactions, so a 100-line item would take up to 60 minutes to process. However, using SAP-specific RPA capabilities, Vodafone was able to use an Excel-based solution and reduce the processing time down to 15 minutes. The results were seven times better than Vodafone expected. "The system works very well for us," says Peter Barta, Asset and Project Accounting Team Leader, Vodafone, adding "our complicated processes are handled in fewer steps, which reduces time spent on complex postings and allows us to avoid any internal IT debt." The future for ERP/SAP (and humans) Of course, the question that remains for many people is, what happens to the people running the processes that RPA replaces? According to the McKinsey Global Institute, nearly a third of all jobs could have at least 30% of activities automated by 2030. And although some jobs may cease to exist, others will be created. However, what we're seeing happen in many businesses is robotic solutions assisting, rather than replacing, people. RPA technology can in effect 'take the robot out of the human' and eliminate repetitive and mundane data entry tasks. Instead, employees will be free to work on projects that have the potential to add real value to the business and boost job satisfaction at the same time. It's a 'win-win' for businesses: not only have they effectively freed up an employee to perform other tasks, but data management issues also become a thing of the past. ### Blockchain and Transactions: Here's what you need to know It seems everybody is talking about Blockchain at the moment. Depending on who you speak to, it's either the latest form of IT hype or a transformative power which is poised to reshape the way we do business and transactions. Either way, everybody is talking about it. Here's what you need to know to join the conversation. At its core, Blockchain is a type of database. In particular, it's good for storing transactional data about assets - which could be something physical like a car, something of financial value like a share, or something of abstract value like a customer. But unlike a regular database, Blockchain has four key characteristics which mark it as something very different. First, it is distributed across a number of nodes which keep themselves synchronised. This makes it ideal for multi-organisational solutions, such as financial consortiums or supply chains. Every participant can see all of the data - although to be clear, we use encryption to make sure that they can only read those transactions which they are entitled to read. Second, we need agreement on the transactions.
Whenever a participant wants to add a transaction to the Blockchain, that transaction must be agreed by the other relevant parties. This is a process known as consensus, and it ensures that inaccurate or potentially fraudulent data is never posted to the database. Third we have finality - once a transaction is added to the database, it is final - it can never be deleted or changed.  The status of an asset can be changed by adding another transaction, but you can never hide the fact that the first transaction took place.  This allows us to accurately track the provenance of assets in a business network, to know what has happened to them at every stage of their lifecycle. Finally, the rules of engagement of the business network are encoded into the Blockchain and are visible to everyone - it’s known as a Smart Contract - so that all participants have confidence that everyone is playing by the rules which have been established. These four factors - the distributed nature of the database, the consensus process, the finality of the data, and the encoding of the business rules in the Blockchain itself, can give organisations a high degree of trust in the data, and it’s that level of trust which makes Blockchain so important to the next generation of business applications. ### Cloud adoption, on-premise & data protection in the boardroom Over the last five years, cloud computing has matured to a point when it has become a trusted alternative to on-premise systems. Initially, the technology wasn’t developed enough and there were concerns about its stability and security. Today, a growing number of companies adopt a cloud-first strategy and look at cloud solutions as the preferred option. Even financial or government sectors that tend to be rather conservative are now adopting public cloud technologies. An important point is how the boardroom’s view of cloud computing has changed. For many organizations, the transition to the cloud became part of their digital transformation. While the boardroom executives may not understand all the technical details, they see cloud as a way of increasing their agility which is so important in the new digital age. With cloud, they can simply outsource a big part of the technical complexity and focus on their day-to-day business. What are the main reasons businesses adopt cloud services and applications today? When it comes to the cloud, the biggest benefits have always been the cost savings and flexibility. Initially, a lot of companies were quite sceptical of having their content outside their walls but quickly found the scalability, and potentially tremendous reduction in expenses was well worth the move. These days, integrating systems and finding synergies between platforms is easier than ever, empowering even more organizations to adopt commoditized infrastructure. Because the capabilities of the platform are ever expanding, the barriers to entry are constantly being removed as more organizations realize the value of cloud-hosted solutions. What are the ongoing challenges of cloud adoption? Companies will always be protective of their data and with good reason. With new security breaches announced daily, it’s imperative that organizations define and adopt strict security guidelines for protecting the information they collect and how it is used. Because of geography and compliance limitations, many of these regulations often present a challenge to companies wishing to move to the cloud. 
Cloud vendors are constantly looking to remove these roadblocks; however, some government and private sector organizations continue to shy away from cloud-hosted technologies, citing security and accessibility concerns. Additionally, many businesses have previously made significant investments in on-premise infrastructure, which deters them from abandoning their current solution for a cloud-based model. When it comes to security and data privacy, businesses should always pay attention. However, they shouldn't presume the public cloud is less secure by design. It's always important to look at the given situation and ask yourself "What are the risks of putting my data into the public cloud?" and "How do I make my data more secure by keeping it on-premise?" Data privacy is obviously a major topic due to the GDPR and it's important to understand the implications of storing personal data in a cloud service. In particular, you need to know where the data is stored and how it's managed. Some cloud service providers are ready to answer these vital questions, while others – especially those who do not operate in Europe – are still trying to figure out how to become compliant. Will there be a time when business-critical information is migrated from on-premise to the public cloud? Business-critical information is already being migrated to the public cloud. Just have a look at how many companies rely on public cloud services for their emails, documents, CRM, accounting or the Content Management Systems behind their websites. For these systems, cloud will become the standard for most organizations – widespread adoption is only a matter of time. There will always be organizations with some very specific requirements, and they may keep running certain systems on-premise or in a private cloud. But their number will shrink significantly over time. What advantages do businesses that are 'born in the cloud' have over legacy businesses? The challenge for legacy businesses is that they aren't able to move all their systems to the cloud in one step. So they may need to combine on-premise and cloud services at the same time and ensure they're integrated. As a result, their initial investment is higher and they will see the benefits over a longer period of time. Cloud-native businesses have the freedom to choose the stack of cloud services that best meets their needs and start using it right away. Modern cloud services enable much easier integration using standardized APIs (application programming interfaces) and often come with out-of-the-box connectors to other major cloud services.   Are we reaching a saturation point for the industry or is there still further to go in driving adoption and understanding? Cloud computing will never fully replace on-premise solutions. There will continue to be a market for both, as organizations determine which solution is the best fit for their business. While there has been significant adoption of the cloud in the industry, many other organizations have not yet moved their systems due to previous investments in on-premise infrastructure. Until those companies feel the pain of updating their hardware to the latest systems, there will continue to be a need for the promotion and advocacy of the cloud within the community. And due to the accelerated evolution and innovation of the platform, the need for education on cloud capabilities and usefulness is always present.
In the future, expect to see businesses continue adopting the cloud due to increasing competitive forces in the market. As the markets evolve ever more quickly, businesses clearly need to become much agiler – from the Boardroom down - and that’s where cloud can definitely help. ### #TheIoTShow - Episode 6: Disruption: Automation, IoT and IIoT The IoT Show delivers valuable insights for industrial organisations on the industrial internet of things and topics touching the broader internet of things. We pick engaging, hot topics, and ask our speakers to advise on situations, opportunities, and recommendations and gotchas. This series is not a sales medium, rather a series of discussions for you to benefit from the experiences and insights of others. In this sixth episode, we look at Disruption: Automation, IoT and IIoT ### How businesses must stay ahead of the cloud skills gap It is well known that IT skills are in fierce demand across the country – but evidence is mounting that Cloud experts are now, and will remain, the most sought-after professionals in the years to come. Stay ahead of the cloud skills gap. We have tracked demand in IT hiring by analysing the number of roles sought across the country over the course of the year. Our latest report found that during Q3 of 2017, the total number of roles we saw advertised across leading tech disciplines (Mobile, Web Development, Cloud, Big Data and IT Security) fell by 6%. Despite this slight slowdown, demand for Cloud professionals accounted for 16% of IT roles advertised in the report - an increase of 4% compared to the previous quarter. Edinburgh and Leeds, in particular, showed the highest spikes in hiring demand for Cloud professionals this year. This suggests that organisations in some of the UK’s alternative tech hubs are increasing their investment in Cloud technology, which is a large part of ongoing digital transformation programmes. In fact, a study from the European Commission found that Cloud computing will create 1.6 million jobs by 2020. Below, we discuss what smart employers should be doing to ensure that they have access to the right talent and can stay ahead of the cloud skills curve.   Attract and retain people with the right Cloud computing skills As Cloud computing is experiencing something of a golden age, there is a huge demand for people with cloud skills to help businesses migrate to new systems. The diversity of the Cloud providers available means Cloud professionals are having to develop skills for a variety of platforms. Physically hosting and maintaining their own servers and expensive suites of software has become impractical and costly for a wider range of businesses. As a result, more and more companies are now using hybrid cloud platforms like Microsoft Azure and Amazon Web Services. The combination of public and privately designed platforms means Cloud professionals, who can seamlessly operate between both systems and work with these hybrids, are needed more than ever. This process takes a specific set of cloud skills, putting pressure on a limited pool of experts.   Re-vamp Cloud job specs to include new Cloud skills sets Hybrid clouds have emerged from the evolution of data centres into Cloud environments and they are quickly surpassing both public and private clouds in terms of popularity. According to the International Data Corporation (IDC), markets for hybrid cloud will grow from the $25 billion global market we saw in 2014 to an $84 billion global market by 2019. 
With such large growth predicted, it is no surprise that Cloud hiring is already on the upturn. Organisations around the world are hurrying to move their systems to the Cloud, which means recruiting for talent in traditional software and hardware roles is no longer the highest priority for businesses. That's not to say that those skills are no longer a requirement; it is simply that the biggest transformations are taking place across the Cloud profession, placing greater importance on securing this rare talent. Our latest Tech Cities Job Watch report found in particular that there is an ongoing requirement for professionals with key skills in Hybrid Cloud Platform technologies and Cloud software platforms, such as MS Office 365. One factor that threatens the future of Cloud is the ever-looming skills shortage. As well as a scarcity of talented applicants for roles, it could be argued that existing IT teams are generally more versed in legacy systems. Training programmes should not be neglected, as Cloud presents a vital and ongoing need - not a temporary blip.   How can companies attract more Cloud talent? To combat the shortage in the long term, organisations should offer internal training programmes to their staff. Up-skilling your IT employees has a number of beneficial long-term effects, especially as an incentive to improve employee retention. These programmes take time, however, and leave the problem of what to do in the meantime. Consultancy is another option. Businesses should seek consultants who will dedicate time to mentoring an organisation's in-house professionals so that eventually their employees' knowledge becomes self-sustaining. The most effective way to combat the Cloud skills shortage will be to pay the premium for the skills you need. This does mean offering a salary that aligns with what those skills are worth to the company, which, for many businesses, can be significant. However, other factors, such as opportunities for ongoing training and the quality of life presented by cities outside of the Capital, can help entice talent from larger clusters. As we approach the end of the year, all signs seem to indicate that the strong demand for Cloud systems, and talented IT professionals to deliver them, is only going to increase. Staying ahead of the curve means recognising this trend and taking a long-term view of investment in Cloud technology and development. ### Cloud migration – healing the pain that keeps giving From Healthcare to Financial Services, Manufacturing to Retail, cloud computing and cloud migration are impacting every sector, and that trend is likely to accelerate in the next three years, according to IDC. Enterprise IT operations teams with complex IT infrastructure and business processes face a tough challenge – and hold a key role – in helping their organisation stay competitive. They are waking up, day after day, to the sobering realisation that they can't manage what they can't see. And visibility rapidly decreases with cloud migration and hybrid-cloud applications spanning legacy, private and public cloud infrastructure. At a time when businesses are banking on digital transformation to defend and strengthen their competitive position, their most important resources – IT and DevOps – are slowly going blind. The inability to understand dynamic application chains puts virtualisation and cloud strategies at risk, increasing the chance of penalties in time and cost, sub-par user experience, and failure to achieve desired business outcomes.
Addressing this visibility gap requires giving IT teams access to innovative monitoring capabilities beyond traditional network performance management (NPM) tools. The rise of machine learning and stream analytics is a key contributor to a transformative, 360-degree view of the end-to-end, hybrid infrastructure supporting business growth. There’s no doubt that the challenges for CIOs and IT Operations Managers are complex and varied, but migrating to the cloud is no longer an option, it’s a must. If enterprises are to make this a painless journey, they must take several steps to avoid the friction that can come with moving to the cloud.   Cloudy challenges The reality with cloud migration is that everyone is in the same boat—be it a company from the Healthcare, Retail, Financial, Manufacturing, or Government sector. The challenges faced by enterprises—regardless of their business type or model—are extremely similar from one business to the other. One first common challenge is that of virtualisation. While its benefits as a technology are well recognized, its implementation and integration can lead to significant disruption for an enterprise otherwise accustomed to handling a set number of physical servers in a controlled, private data centre. Virtualisation undoubtedly increases complexity in the network and application stack, and resources which may have otherwise been dealt at the customer premise, now reside in the cloud. This means that the dedicated IT team previously in charge of monitoring performance for the physical machines, is now faced with having to monitor virtual machines that they cannot physically see. In essence, they’re monitoring the invisible. But that’s not the only challenge. For enterprises, cloud migration and the integration of SaaS applications also requires a cultural change. IT teams today often work in silos, independent from one another. While this division of labour may seem logical from a roles and responsibilities point of view, when it comes to monitoring in the cloud, there is no place for fragmentation and silos. Instead, IT teams must work together to have an end-to-end view of their enterprise network. Indeed, if we are to see enterprises truly embrace cloud, virtualisation, and the many new technologies set to disrupt enterprise sectors, a shared responsibility approach must be adopted across all IT teams and all IT departments. This will allow for a “single source of truth”.   The silver lining Enterprises – much like humans – have one common fear: the unknown. And where cloud migration is concerned, the unknown remains a big factor in enterprises’ migration to the cloud. However, this comes as no surprise; traditional monitoring tools do not provide the visibility needed over virtualised, cloud infrastructure so how can IT teams possibly begin to see into parts of the infrastructure they are no longer in contact with? The answer is a simple, but important one: a unified performance management solution. This solution should be “infrastructure agnostic” whereby it can be used to monitor any part of the enterprise infrastructure, whether that be legacy or virtualised. Importantly, this tool will be complementary across all IT departments and layers of the application and network chain. Instead of creating the silos akin to the solutions of yesterday, this unified approach will complement the different tools used for network and application monitoring respectively. 
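To make the idea of an “infrastructure agnostic” monitoring layer a little more concrete, here is a minimal Python sketch. The collector classes, sources and metric values are hypothetical placeholders rather than any particular vendor’s API; the point is simply that legacy and virtualised estates can feed one common view.

```python
# A minimal sketch of an "infrastructure agnostic" monitoring layer.
# The collector classes and the metric values are hypothetical placeholders,
# not any particular vendor's API.
from abc import ABC, abstractmethod
from dataclasses import dataclass
from typing import List


@dataclass
class Metric:
    source: str   # e.g. "on-prem-db01" or "vm-eu-west-1a"
    name: str     # e.g. "latency_ms"
    value: float


class Collector(ABC):
    """Common interface shared by legacy and virtualised infrastructure."""

    @abstractmethod
    def collect(self) -> List[Metric]:
        ...


class LegacyServerCollector(Collector):
    def collect(self) -> List[Metric]:
        # Placeholder reading standing in for an SNMP poll or on-host agent.
        return [Metric("on-prem-db01", "latency_ms", 12.0)]


class CloudVMCollector(Collector):
    def collect(self) -> List[Metric]:
        # Placeholder reading standing in for a cloud provider's metrics API.
        return [Metric("vm-eu-west-1a", "latency_ms", 48.0)]


def unified_view(collectors: List[Collector]) -> List[Metric]:
    """One shared view across every tier, regardless of where it runs."""
    return [metric for c in collectors for metric in c.collect()]


if __name__ == "__main__":
    for m in unified_view([LegacyServerCollector(), CloudVMCollector()]):
        print(f"{m.source}: {m.name} = {m.value}")
```

A real unified performance management product does far more, correlating network and application data and raising alerts, but it is this kind of shared interface that removes the silos.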
As the cloud increases the dependency between networks and applications, this ability to prevent silos and fragmentation will significantly enhance IT teams’ capabilities for greater visibility.   Overcoming cloud migration complexity with simplicity We often believe that the solution to a complex problem is a complex answer. When it comes to application monitoring, enterprises’ have one significant worry: they just can’t monitor what they can’t see. So, the solution is easy: more visibility, everywhere. The traditional monitoring tools have served their purpose – they have provided the passive, reactive insights that enterprises once needed but today, digital and agile enterprises require proactivity, based on real-time and predictive analytics to guide their decision making. It is this extra insight that will prove to be enterprises’ guiding lights in the darkness of the application chain. Without it, the frictions that come with cloud migration will quite simply be unavoidable, and if enterprises are to be persuaded and convinced that moving to the cloud is best, guaranteeing smooth ride will go a long way. ### Challenges for Customers - Jason Tooley - Veritas Technologies Jason Tooley, the Vice President for Northern Europe, from Veritas Technologies visited the studio to discuss what he sees today in the market and the challenges for customers from a business perspective. Find out more about Veritas Technologies here.   ### The cloud conundrum: navigating the world of multicloud Last year, Gartner forecasted that cloud services revenue globally will reach $305.8bn in 2018 – which will be an increase of 17.5 percent from 2017. As businesses increasingly move to more digitised processes, coupled with the pressures on modern-day IT leaders to have a cloud environment that is fit for purpose, multicloud or other, is more important than ever. IT leaders need to address new challenges concerning security, visibility, cost, performance, automation and migration. Yet, how can these challenges be addressed if so many IT decision makers do not have the tools to get a complete oversight of the multicloud environment or are unaware of what their business is currently spending on cloud?   Staying on top – application mapping Application mapping is one way to provide robust visibility into which part of a company’s IT infrastructure supports a particular business service, such as the various components of your multicloud environment. An application map makes it possible to monitor and document business services. Furthermore, it enables IT to assess the impact of potential changes, configurations and check other functions to ensure optimal business support. Auto-discovery application mapping tools can collect data on network infrastructure and cloud services and their relationships to create tiered and robust application maps. By fostering collaboration between application owners and configuration managers, and extending across all cloud services, application maps can truly advance IT- business alignment and help keep the cloud environment responsive to business demands. When it comes to mapping the multicloud environment, it is best to start with a ‘start anywhere’ approach to allow application owners a better experience by enabling them to start with what they know rather than making them work out what constitutes the “top”. 
‘Start anywhere’ mapping can also begin from multiple points simultaneously, which means IT teams can cope far better when applications are incomplete or parts of a relationship are missing, mitigating the challenges that standard tools face in today’s complex cloud-centric infrastructures.   The complexity of cloud and multicloud Our recent survey found that 40 percent of IT decision makers globally do not know how much their businesses are spending on cloud services, a figure that rises to over half (53 percent) in the UK. Couple this with the fact that 78 percent of IT decision makers are looking for ways to integrate emerging solutions like artificial intelligence (AI) into their multicloud strategies, and a worrying trend takes shape. How can new technologies be implemented in cloud environments if there is no real visibility into spend and resourcing? Multicloud has truly changed the game, and the traditional way of looking at IT infrastructure simply will not work anymore. IT leaders must consider new ways to manage multicloud environments to ensure they are getting the expected benefits from public cloud in terms of cost savings, automated performance optimisation and increased security and governance. They must also adapt their management approach using new technology solutions built for multicloud that leverage machine learning and AI to reduce complexity.   Ensuring agility, compliance, innovation and cost control – how to start 44 percent of IT decision makers in our survey agreed that adopting multicloud is essential to maintaining agility. This ties in with avoiding supplier lock-in, which is essential for any competitive organisation looking to have a robust worldwide cloud presence. The lock-in element is going to become a bigger issue for organisations. It’s all very well to have a global deal with Amazon Web Services (AWS), for example, but when you operate in Spain, might it be easier to work with a competitive Spanish cloud provider that can also guarantee data is stored and managed under Spanish law? Multicloud solutions that aren’t too stringent will help to facilitate this approach to global business. Understanding the costs across any public, private and hybrid cloud model is also critical to maintaining business agility and increasing business acumen. The most effective multicloud infrastructure will be one that is coupled with a cost management solution that analyses current and future costs. Insight into, and control over, capital and operating expenditures can and should be streamlined. With a single view of on-premise and public cloud infrastructure spend, organisations can analyse and track costs and utilisation, whilst also identifying areas of overspend and pre-empting future costs. The final considerations are those of compliance and security testing, and of fostering innovation in a multicloud environment. To remain competitive, businesses today have to empower developers to perform critical checks throughout the software development lifecycle. They also must start thinking seriously about applying intelligence, automation, and predictive capabilities so that less time is spent on repetitive tasks.   
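To illustrate the “single view” of spend described above, the short Python sketch below rolls hypothetical billing figures from several environments into one report and flags where spend has drifted past budget. The providers, figures and budgets are invented for the example; a genuine cost management solution would pull them from each provider’s billing API and forecast future costs as well.

```python
# A minimal sketch of a "single view" of cloud spend across providers.
# The providers, figures and budgets below are invented for illustration;
# a real cost management solution would pull them from billing APIs.
from collections import defaultdict
from typing import Dict, List, Tuple

# (provider, service, monthly_cost_gbp)
billing_records: List[Tuple[str, str, float]] = [
    ("aws", "compute", 12_400.0),
    ("aws", "storage", 3_100.0),
    ("azure", "compute", 9_800.0),
    ("on-prem", "datacentre", 15_000.0),
]


def spend_by_provider(records: List[Tuple[str, str, float]]) -> Dict[str, float]:
    """Roll individual line items up into one total per environment."""
    totals: Dict[str, float] = defaultdict(float)
    for provider, _service, cost in records:
        totals[provider] += cost
    return dict(totals)


def flag_overspend(totals: Dict[str, float], budgets: Dict[str, float]) -> List[str]:
    """Return the environments whose spend has crept past the agreed budget."""
    return [p for p, spent in totals.items() if spent > budgets.get(p, float("inf"))]


if __name__ == "__main__":
    totals = spend_by_provider(billing_records)
    print("Spend by environment:", totals)
    print("Over budget:", flag_overspend(totals, {"aws": 14_000.0, "azure": 9_000.0}))
```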
Conclusion In summary, here are the quick tips to stay on top of multicloud environments: Gain increased visibility into cloud assets to get the complete picture Re-think management approaches: simplify and automate as much as possible AI and machine learning can remove repetition out of multicloud management Get a clear picture of cloud costs: it may not make sense to run everything in cloud Take a proactive stance to harden a broader attack surface, ensure compliance By taking these clear steps, multicloud environments can be carefully managed to support optimum business performance. The first step is to get clear on what you are dealing with and take the necessary steps to simplify as much as you can. We expect to see many IT decision makers take better control of their multicloud environments this year. ### Global Carrier Community Meeting - Berlin 2018 - Day 2 - Afternoon We’re pleased to announce our partnership with the Carrier Club to bring you a look inside GCCM Berlin 18th/19th June. It’s all about Cloud and Carriers, including how to facilitate Big Data growth. ### Global Carrier Community Meeting - Berlin 2018 - Day 2 - Morning We’re pleased to announce our partnership with the Carrier Club to bring you a look inside GCCM Berlin 18th/19th June. It’s all about Cloud and Carriers, including how to facilitate Big Data growth. ### CEE 2018 GCCM Berlin Live Stream We’re pleased to announce our partnership with the Carrier Club to bring you a look inside GCCM Berlin 18th/19th June. It's all about Cloud and Carriers, including how to facilitate Big Data growth.   Our guests include: AM Wida Schmidt - CC Digital Cloud Fabio Bottan - PCCW GLOBAL Stuart Evers - Turk Telekom International Isabelle Paradis - Hot Telecom Ivegen Martsin - Bics Sabastino Galantucci - DIME Fabrizio Salanitri - Horizen AG Alessia Capitaneo - Sparkle PM Arun Dehiri - Red Dawn Consulting Alberto Grunstein - 42com Telecommunications GmbH Pal Beres - Turk Telekom Int. Kurt Lindquist - LINX Chris Van Zinnicq Bergmann - Global Coud Xchnage Kevin Ju - China Telecom Miguel Lopes - Diaogic Dr Gregor Schmid - Taylor Wessing ### Jumpstarting your journey to the cloud In the decade since cloud computing first emerged, its influence has spread far beyond early take-up from big organisations, to mid-tier and small businesses. With the latest research from the Cloud Industry Forum (CIF) revealing that the overall cloud adoption rate in the UK now stands at 88 per cent, there are certainly no signs of it slowing down. In fact, some research suggests that we’ll see it more than double globally over the next four years. We’re also seeing more technological advances in the cloud and integration of artificial intelligence (AI), meaning companies are being empowered to make the most of their data to differentiate and find new revenue streams. However, as with the growing popularity of any technological developments, concerns still remain. The Harvey Nash / KPMG 2016 CIO survey lists integration with architecture as the main challenge to cloud adoption, closely followed by data loss and privacy risks. There's also a certain level of apprehension and confusion among businesses venturing into the cloud, prompting questions such as; ‘What exactly are the benefits?’ and ‘What option do I go with?’.   The benefits of cloud hosting To put it simply, cloud hosting offers ultimate flexibility. 
Whether it’s scaling up or down the cloud capacity, agility is at the core of the service, which not only gives a competitive advantage but also improves responsiveness and adaptability. Remember, a company’s infrastructure grows with the business, not the other way around, which minimises hefty and unnecessary costs. The cloud also means reduced IT costs, including hardware and software maintenance, as well as training for IT staff. It accelerates product development and innovation by offering better analytics, for example, which has led towards a seismic shift in understanding customer experience.   Where to start with the cloud When looking to implement cloud hosting services, businesses must take a holistic and flexible approach. Remember, one size doesn’t fit all – it’s all about what works for your business. Ask yourself: ‘What are the specific needs of the business?’ and ‘How will you integrate cloud technology into your existing infrastructure?’ As a starting point, a workshop should take place to give a framework to audit what’s already in place, scope and identify pressure points, and ultimately implement a solution to resolve them. There are dedicated tools available to aid this process, such as SysGroup’s TechWorkshop, which creates an agile, agnostic approach to tech infrastructure. Working with the cloud experts at SysGroup, TechWorkshop activity enables businesses to communicate their technical and commercial objectives into a clear set of requirements. From this, a tailored solution is put in place, ensuring the specific needs of each application are addressed within the cloud. It’s easy to get caught up in the whirlwind of intelligent technologies and services that the market has to offer, so it’s important not to lose focus of what you are actually looking to achieve. It’s crucial, therefore, to have a strategic plan in place based on specific needs and challenges before implementing a cloud solution.   Selecting the best solution   Once your plan is in place, it’s time to select a platform that is best suited to the needs of the business. There are a number of solutions to choose from, including: Private cloud gives users maximum control and security over their data and information at all times. It is guaranteed to meet legal and regulatory requirements, giving businesses the ability to build automation into service provision. Public cloud meets the demand for flexibility, scalability and efficiency. Offering the flexibility of monthly payments, this option is cost-effective, agile and can scale on demand. Hybrid cloud is the most accessible and agile of all the cloud hosting solutions, making it the most practical method for most organisations. If the requirement is to support multiple private and public cloud platforms, services and solutions, a hybrid IT approach is the easiest way to integrate them with existing infrastructure, applications and data. But it’s not just down to selecting the right cloud solution, partnering with the right managed service provider is equally as important in achieving cloud success. A service provider should be solution-focused, educating its customers and aligning the solution with the right technologies and requirements for that business. Reputation and security status should also be assessed, and that’s why SysGroup considers things such as PCI accreditation, ISO and GDPR compliance.   Looking ahead The benefits of moving to the cloud demonstrate that, for the majority of businesses, it is a question of when, rather than if. 
After all, according to KPMG, 74% of tech CFOs said that cloud computing would have the most measurable impact on their business in 2017. The impact of the cloud is – and will continue to be – far-reaching simply because of the unprecedented flexibility and agility it offers businesses, and its potential to disrupt traditional assumptions about investment in IT infrastructure. In the years to come, cloud hosting will continue to push the boundaries and serve businesses operating beyond the tech sectors. The benefits will continue to grow and emerging technologies, such as AI, will have a huge role to play in this. But perhaps most significantly of all, this technology will give smaller businesses the tools to invest in developments previously considered beyond their resources, while improving speed and capacity and offering greater levels of data security. ### Leaseweb Hosts “TWINO Fintech Talks” Event at Latitude59 in Estonia The Latitude59 tech conference was held in late May for the 11th straight year, featuring almost 100 speakers from 20 different countries around the globe. Topics ranged from “The Future of Cybersecurity” to “The Future of Sustainable Food,” and, new for this year, Leaseweb partnered with online investment platform TWINO to host the “Fintech Talks” event. Latitude59 is Estonia’s largest annual tech conference, and during its most recent iteration, 13 different topics were addressed by the speakers. Priit Salumaa, co-founder of Latitude59, moderated the discussion, while the panel included thought leaders such as Roberts Lasovskis, TWINO’s P2P platform lead, Kristjan Kangro, CEO and founder at Change, and Joyce Y. Shen, venture capitalist at Tenfore Holdings and data science professor at Berkeley. Shen’s address, in particular, covered the intersection of commerce and big data that characterizes the world of fintech, in addition to opportunities in the burgeoning fintech industry that are waiting to be filled by the next generation of startups.   A Focus on Fintech These opportunities are emerging largely thanks to the next generation of customers. Young people have vastly different conceptions of banking than their parents’ generations, and recent developments in mobile banking technology mean that many Millennials or Gen Zers have quite possibly never been to a physical bank. Instead, they view finances through the medium of their smartphone or computer, with banking apps and other online payment options such as PayPal and Venmo. Younger people are also growing up with different investing options, and part of the “Fintech Talks” discussion included the necessity of transparency of returns and the importance of vetting potential investment partners. For every legitimate financial institution out there, there are a host of scammers waiting to pounce. Doing due diligence before investing is extremely important in distinguishing good investments from bad ones, and the changing face of investments makes telling the difference especially difficult. As banks look to remain competitive and secure, they’ve begun to partner with much younger fintech firms. Tech-forward startups enable banks to revamp how they offer financial services to today’s customers, and the smaller size of these startups means they’re incredibly agile and focused.   Fintech Meets the Cloud Fintech companies have also started embracing the cloud in order to become more agile. 
Cloud services allow businesses to scale capacity up or down quickly in order to react to customer demand in real time, which can also be cost efficient when you pay only for what you use. The cloud also allows fintech companies to compete with larger (and older) financial institutions that may have more experience in the banking industry but can’t react to changing customer demands as quickly because they’re tied down by legacy IT systems. Integrating cloud technology can help smaller companies compete with the big names, as well as help larger institutions keep up with what their customers want. No one knows for sure where fintech is going and what impact it will have, but it’s a safe assumption that this convergence of the technology and financial sectors will continue to play an increasingly important role. As fintech companies emerge to provide safe, secure, and seamless solutions for their customers, the ones that can do so the most efficiently and effectively will be the ones that rise to the top. You can find out more about TWINO here. ### Why Banks Are Lagging On Open Banking The world of consumer banking is becoming more democratic, as the old banking powers begin to lose their grip and their monopolies. The latest Open Banking regulations have signalled a subtle shift in the world of consumer banking. New competitors are being allowed to compete in a marketplace long dominated by a select few financial giants. These aptly named challenger banks - such as Monzo and Starling - are but the forerunners of a host of other fintech providers that now have a greater capacity to deliver financial services direct to consumers, outside of the traditional banking framework.   Driving Competition and Innovation Open Banking is also the path towards more novel financial solutions, with an emphasis on those that aren’t being offered by the big banks. Moneybox, for example, gives customers the chance to invest their small change. Yolt allows its users to see a full analysis of spending for easy budgeting. The Open Banking Initiative and PSD2 are opening up opportunities for fintech firms in the spaces which have been missed by the bureaucracies of the traditional banks. The rationale behind the Open Banking movement was to enable greater competition, with the view that it would prompt consumer banking services to better cater to customers. It’s now common for people to own more than one bank account, across a number of providers. Gone is the era where customers stuck with one bank from adolescence to the grave. Challenger banks and start-ups are slowly but surely raising the bar of what consumers now expect from their bank. The bricks and mortar banks need to up their game in the face of this significant increase in the variety of products and services. With ever more innovation on the horizon, banks can no longer rely on customer loyalty.   CMA Regulation and PSD2 For the Open Banking initiative to flourish, customers need to be properly informed and be willing to share their financial data with alternative financial providers and services. Understandably, people may be apprehensive about the security of their financial information. Before recent regulation came into force, ‘screen-scraping’ was one solution, in which customers gave third parties their login details for online banking to apps like Yolt and Chip. In January this year, the Competition and Markets Authority began the Open Banking initiative, coinciding with the EU Directive PSD2 (Revised Payment Service Directive). 
Alongside introducing more competition into a stagnant market, it would provide a common payment framework for fintech providers to build on. Together these regulations stipulated that the nine biggest banks in the UK would be obliged to grant customers the right to give access to “read only” copies of their bank statements to third parties. There is obviously huge potential here. These regulations have forced banks to underwrite the safety of customers’ bank accounts that have been shared with third-party providers (many of which are competitors with those very banks), providing a crucial layer of reassurance for customers who may still have been sceptical of the intentions of the challenger banks. So why is the Open Banking initiative failing to make a significant impact? The most prominent explanation is that consumers don’t know what Open Banking is. The Crealogix Group carried out an independent study which showed that 85 per cent of UK consumers are either unaware of Open Banking or don’t understand it. The fault of this lies partially with banks who have failed in their duty to communicate this information to their customers. Less than a quarter of the few who had heard about Open Banking had received this information directly from their bank.   Collaborative Success The key to Open Banking’s success is uptake. If enough people search out the services of third-party providers, then these companies will create for themselves a space in the market that had once been the preserve of traditional banks. Proper education about Open Banking is vital for it to make a significant change to the consumer banking market and the fintech space. Almost half of consumers (45.5 per cent) are concerned about the security of Open Banking. Sharing banking details in this new way is, of course, a big first in the banking world, but with the initiative and regulations in place, consumers should feel secure. The innovation is indicative of a fintech space that is able to move with more agility and creativity than ever before. The likes of Monzo and Starling are able to swiftly roll out services simply on account of their smaller size, the absence of bureaucracy and the culture that they foster within their company. These firms act more in concert than they do in competition. The recent regulatory moves have made the prospect of collaboration between fintech start-ups with banks (both challenger and established) all the more possible. For instance, Starling bank has been able to partner up with the likes of Flux (an app that collates loyalty cards into a single online wallet). The room for innovation here is massive in terms of what this new breed of banks and service providers can achieve together. Seemingly it has been left to start-ups to drive innovation, otherwise little difference will be made anytime soon, and the prospect of a one billion pound boost to the UK economy will be lost due to inertia. To illustrate this, the first traditional banking app designed to compete with the new challenger banks is headed by HSBC and won’t be available until later this month. In comparison, Yolt, Monzo, Moneybox and many more have been live for quite some time.   Proper Education Will Determine Open Banking Success On paper, Open Banking should be a hit with customers. They have access to better money management tools, budgeting services, and will have a better understanding overall of their finances. The benefits are endless and everyone wants to save more money. 
The real challenge is educating consumers about these benefits, and from there, allowing them to make informed choices about how they want to take control of their finances. The Crealogix Group. ### #TheCloudShow – S2E5 – Cloud and Public Sector The Public Sector has traditionally been viewed as a slow or late adopter of technology. One could argue that this is no different for Cloud. That being said, initiatives such as the Government Digital Service in the UK have accelerated adoption and it is easy to forget that there have been many successes in the Public Sector with the adoption of Cloud technologies and services. The eternal question, and the reality, will always centre on budgets, culture and the winds of political change.  But can Government Departments at Central and Local Government do more to achieve their outcomes? ### The dawn of a new digital era for healthcare organisations Earlier this year Jeremy Hunt- the Secretary of State for Health and Social Care- signed off on the first official guidance specifically designed to help the UK’s National Health Service make the move to cloud. Although some parts of the NHS already made use of cloud technologies, this marked the dawn of a new digital era for many working within the sector- one in which the widespread adoption of public cloud services and platforms, like Microsoft Office 365, is encouraged and the benefits can become a reality. Benefits which might help to relieve some of the strains currently facing our overstretched healthcare services. Tighter budgets, fewer staff… it’s well publicised that the NHS is currently having to work with very little resource. But by storing data in the cloud, NHS organisations can rid themselves of some of the costs associated with buying and maintaining the hardware and software required to keep it on-premise. Meanwhile, behind the scenes, IT teams can use cloud technologies to install more comprehensive backup solutions which will reduce recovery time in the event of a local system failure. And for those on the front line- the GPs and doctors seeing patients day in, day out- cloud could give them the flexibility and freedom to work remotely. There’s no denying that the benefits of migrating some data from on-site systems to cloud environments could be colossal. This is now being widely recognised, with a recent report from Digital Health Intelligence discovering that 39% of organisations currently not using any cloud technologies plan on introducing some element of cloud-based infrastructure within the next two years. But, these organisations will only be able to reap the benefits if their new cloud environments are protected. After all- as last year’s notorious WannaCry disaster proved- no one is immune to cyber attack. Not even the NHS... Everyone is a target  Nowadays, cyber criminals don’t care who they hit with ransomware, as long as the victim is willing to pay up. This is what makes hospitals and healthcare providers very attractive targets. The NHS’s most valuable digital asset- confidential patient information- has become 100 times more valuable than stolen credit card details and when faced with losing it, IT teams often don’t have a choice. They have to pay the ransom because human lives are not negotiable. This is why it’s important to secure all networks and all patient portals, including websites. Despite common fears, keeping data in the cloud is just as secure as storing it on-premise- as long as you have a suitable security strategy and invest in the correct technologies.  
Making the cloud safe  The fact is that a truly secure medical network infrastructure will probably contain more firewalls than patients, but most traditional firewalls are not cloud ready. Simply lifting and shifting traditional solutions and processes doesn’t work, because some are not engineered with the cloud’s elasticity and scalability. Therefore, IT teams looking to make the most of cloud may need to think about refreshing their security technology stack. As part of this, they could also consider using machine learning and artificial intelligence in some capacity. These technologies enhance existing security solutions and protect against more cleverly disguised and targeted attacks- such as spear phishing. They do this by establishing a baseline of ‘normal’ behaviour and then flagging any actions that fall outside of it. Anything unusual or out of character is identified immediately, helping IT teams to pinpoint malicious outsiders. But that isn’t all...  Education, education, education Effective security is not just about stocking up on solutions and tools. Instead, it’s a combination of technology, people and culture. NHS organisations can block out some threats with a cloud-ready, up to date protection system but it is just as important to implement some sort of user awareness programme and training. After all, often your employees are your last line of defence- especially when it comes to social engineering attacks- so educating about potential threats and retraining around cloud environments will be essential. As well as providing user awareness courses and materials, organisations can also look to invest in phishing simulation tools. Through mock campaigns, these can teach employees what signs to look out for and how to respond appropriately if- for example- they receive a malicious email. They can also provide useful data for organisations on which employees are most at risk of attack and transform them from a liability to a strength.   Cloud is a whole new world for many organisations within our National Health Service and its benefits could be endless. But, before organisations can reap the rewards, they must embrace a new way of thinking. For those planning on making the move to cloud, security needs to be priority number one. ### The multiple stages of adopting a multi-cloud strategy There is a common pattern when a new technology emerges to disrupt the business landscape. First comes adoption: slow at first, and limited to the most forward-thinking companies and agile start-ups. By the time the technology reaches maturity and universal adoption, early adopters have already begun looking for ways to enhance and diversify their use of said technology and extend their competitive advantage. We are seeing this today with cloud, as more and more enterprises that adopted cloud the best part of a decade ago are implementing a more sophisticated multi-cloud strategy. The reason is simple: vendor lock-in. Enterprises do not want to be at the mercy of any single vendor – an attitude that has been plain for the last 15 or so years, with enterprises increasingly moving from monolithic platforms to open-source alternatives. Lock-in is a particularly pressing issue in the cloud since enterprises not only have no control over the software itself but also little control over the data. Their entire business may be at the mercy of their cloud provider – not a position where any organisation would feel comfortable. 
Consequently, what enterprises are looking for now is flexibility in deployment models, one of the primary drivers for adopting a multi-cloud strategy. Why adopt? Adopting a cloud-neutral strategy gives organisations far greater assurance of business continuity and service uptime. This is regardless of whatever interruptions they might experience at the infrastructure tiers, be it the hardware tier (servers, networks, storage devices) or at the software tier (bugs in OS, routers, or device drivers). Since no digital business can afford extended periods of downtime, this is a significant advantage of going multi-cloud. Even more important than uptime, however, is how a cloud neutral strategy can help future-proof an organisation in terms of cost (CapEx and OpEx) and compliance. On the cost front, if it becomes too expensive to run a public cloud, an organisation that has adopted a multi-cloud strategy is not tied to this model forever. They can pivot, building out their own data centres and, without any extra effort, run their applications as-is in their own private cloud (through their own data centres). This is also true for compliance. If due to some future regulatory requirement, an organisation needs to pull some of the data and the corresponding workload from the public cloud into a secure on-premises environment for security and privacy purposes, multi-cloud makes this possible. By adopting this strategy, they now control where to move the data and workloads, so they can choose another cloud provider that meets the required standards, or they can move the data into their own private cloud. Like any technology strategy, there are some drawbacks to a multi-cloud strategy, namely a lack of standards and portability. This simply means that organisations require careful architecture, so as not to leverage proprietary capabilities of a specific cloud vendor. This is an evolving area where there is a consolidation of technologies and vendors. As this gains importance, innovation at the cloud manageability tier will address this. Choosing, managing and securing a multi-cloud management platform Multi-cloud management is evolving fast, but there are several vital steps that need to be reviewed by every organisation. Understanding these points is essential for enterprises as they first assess their multi-cloud options, as well as managing and securing their data from then onwards. These key considerations include: Deployment & management: Businesses will want one common manageability platform, which can deploy and manage across multiple cloud infrastructures. This is easier said than done, however, as different cloud providers have different virtualisation tiers and various levels of managing them. Furthermore, any platform must also deploy and manage the private cloud. One of the main considerations for this platform should be that it allows seamless movement of services and workload between public and private cloud deployments. Tooling for productivity: After management, comes ease of use. After all, the last thing that any business wants is a tool that makes life harder for those using it or gets in the way of time-to-market. An effective multi-cloud platform must be easy to use for both the operations team as well as the development team, for simple and effective configuration, deployment, and scaling of services. Monitoring & diagnostics: Failing to shut down services is one of the most common issues in the cloud. So much so that it costs companies thousands of pounds every year. 
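As a concrete illustration of that point, the short Python sketch below shows the kind of check that can catch these forgotten services. The utilisation thresholds and service records are hypothetical, and a real platform would read utilisation from each cloud provider’s metrics API and raise an alert rather than print to a console.

```python
# A minimal sketch of the kind of check a multi-cloud monitoring tool might
# run to catch services nobody remembered to shut down. The thresholds and
# the service records are hypothetical; a real platform would read utilisation
# from each provider's metrics API and raise an alert rather than print.
from dataclasses import dataclass
from typing import List


@dataclass
class ServiceUsage:
    name: str
    cloud: str
    avg_cpu_percent: float   # average utilisation over the look-back window
    idle_days: int           # consecutive days spent below the utilisation floor
    hourly_cost_gbp: float


def find_idle_services(services: List[ServiceUsage],
                       cpu_floor: float = 5.0,
                       max_idle_days: int = 7) -> List[ServiceUsage]:
    """Flag services that have sat below the CPU floor for too long."""
    return [s for s in services
            if s.avg_cpu_percent < cpu_floor and s.idle_days >= max_idle_days]


if __name__ == "__main__":
    fleet = [
        ServiceUsage("test-env-3", "aws", 1.2, 21, 0.45),
        ServiceUsage("orders-api", "azure", 62.0, 0, 1.10),
    ]
    for svc in find_idle_services(fleet):
        monthly_waste = svc.hourly_cost_gbp * 24 * 30
        print(f"{svc.name} ({svc.cloud}) looks idle - roughly £{monthly_waste:,.0f} a month")
```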
It’s why organisations must build monitoring tools into their multi-cloud strategy. Effective tools should raise alerts as well as provide a delegated administration capability – a must in this fast-paced world. Diagnostic tools that provide visibility into the infrastructure tier are also essential, in order to troubleshoot the most difficult issues. Security & governance: With more customer data and confidential information becoming accessible from the web, the management platform must provide the ability to quickly roll out and implement security updates at any tier: infrastructure, platform or application. The management platform must also provide the ability for rule-based governance of data and services so that organisations can control the movement of data across geographic boundaries and conform to governmental rules and regulations. In summary, the cloud has matured from private to hybrid and now multi-cloud technology, which will become the de-facto standard for cloud architecture as companies seek to optimise workloads and avoid vendor lock-in. Data sits in private, public, or hybrid clouds and on-premises, but it needs to be managed and secured effectively no matter the environment. Only businesses that can achieve this will unlock the true benefits of multi-cloud. ### NVMe and DCP1000: one of the fastest NVMe data centre SSD solutions In a matter of just a few years, SSDs have rapidly dominated the consumer market. Now it is the turn of a new generation of SSD for enterprise, which makes the mechanical hard drive seem a distant memory. How are NVMe configurations used alongside SSDs in the data centre?   Summary SSDs have increasingly replaced mechanical hard drives in most consumer products, and although the technology is still more expensive per gigabyte, the benefits of SSDs are increasingly significant. SSDs are faster, quieter and more energy efficient than their mechanical predecessors. On the server side, SSD has not had the same revolution. The enterprise marketplace places demands on storage media that consumer products cannot meet. Now, however, a new generation of SSD has been developed specifically for server environments. This new generation of SSDs promises to overcome these restrictions and to offer superior performance combined with competitive prices. This article discusses how this new generation of server SSDs enables data centres to support future growth in areas like artificial intelligence (AI), the Internet of Things (IoT) and the availability of real-time data.   Data trends IDC’s recent ‘Data Age 2025’[1] study predicts that the amount of data generated globally will increase to 163 zettabytes (equal to 163 trillion gigabytes) by 2025, which is about ten times the data generated in 2016. The study not only highlights the tremendous growth in the quantity of data, but also the changes in the role it plays in our lives. ‘Data Age 2025’ mentions the following five major trends that will shape the nature of data: 1) Data will become more and more life-critical: according to IDC’s prediction, 20% of generated data will be critical and 10% hypercritical to our daily lives. 2) The connection of digital devices: embedded systems and IoT will play a vital role in nearly every part of our lives, and IDC forecasts that we will connect and interact with these devices on average 4,800 times a day. 3) Availability of data: data will be increasingly real-time and mobile, which means that instant availability is required. 
4) AI as a game changer: the amount of data generated both requires and enables new technologies such as cognitive systems (e.g. language processing or machine learning) to process it. These systems will themselves produce a vast amount of new data while working with already existing data. 5) Security: the proportion of data that is sensitive, private or should be kept confidential will increase to about 90% by 2025, but only about 50% of it will be secured. The changes mentioned above require significant advances in data processing power as well as data storage. These technology advances then continually drive the development of new applications, technologies and services themselves. The development of NAND flash technology, and subsequently SSDs, can be considered both an essential requirement for processing and storing vast amounts of data and an enabler for the development of new technologies.   PCIe and NVMe: a new standard and its benefits SSDs were initially – and many of them are still being – built with hard drive interfaces. To work in computers, they needed to utilise the existing connections developed for conventional hard drives, as well as look like HDDs in order to achieve initial acceptance. The SATA interface used to connect mass storage devices to a computer, however, is a limiting factor for SSD performance, since it can only transfer a maximum of 600MB/s (SATA 3). To take full advantage of the benefits of NAND flash chips in SSDs, such as their superior speed, a new interface had to be created. The PCI Express bus and technology enabled the NVMe standard to take off, with performance surpassing that of drives using traditional SATA interfaces. This new standard allows for data transfers of up to 2,000MB/s per lane (PCIe Gen4). Using a maximum of 16 lanes, PCIe Gen 4 offers transfer speeds of up to 32,000MB/s. The NVMe driver standardises how SSDs are connected to the PCIe bus. Apart from the ability to transfer up to 25x more data than SATA, PCIe and NVMe come with further benefits, such as lower command overhead. This is due to the NVMe driver sending its commands much faster than the AHCI driver used by SATA, communicating directly with the system’s CPU (rather than through a SATA controller). This means that PCIe and NVMe enable organisations to deal with vast amounts of data. There are additional benefits to using NVMe configurations in a data centre environment. Indeed, a configuration using a single NVMe SSD can replace 10–12 SATA SSDs together with the controller cards associated with them, leading to an 80% reduction in the parts used, meaning that less hardware needs to be deployed in a server environment. For example, in a server farm where hundreds of thousands of racks are employed, a reduction of this size makes a difference, resulting in lower administrative effort, a 70% drop in power consumption compared to SATA SSDs, and reduced cooling requirements, all contributing to a further reduction in TCO.   Kingston’s NVMe solution Kingston Technology invested heavily in product development and innovation to support the growth in this area and partnered with Liqid, a start-up company in the data centre industry. Utilising Liqid’s expertise in the field of PCIe and Kingston’s experience in NAND technology, they jointly developed the DCP1000, one of the fastest NVMe data centre SSD solutions on the market. 
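Before turning to the DCP1000 itself, it is worth putting those interface figures into perspective. The short sketch below simply multiplies out the per-lane rate quoted above against the SATA 3 ceiling; these are theoretical link rates based on the article’s own numbers, not measured drive throughput.

```python
# A small worked comparison using only the headline figures quoted above:
# these are peak interface rates, not real-world drive benchmarks.
SATA3_MBPS = 600              # SATA 3 ceiling per device
PCIE_GEN4_LANE_MBPS = 2_000   # approximate per-lane rate quoted for PCIe Gen4

for lanes in (1, 4, 8, 16):
    link_mbps = PCIE_GEN4_LANE_MBPS * lanes
    print(f"PCIe Gen4 x{lanes}: {link_mbps:,} MB/s "
          f"(~{link_mbps / SATA3_MBPS:.0f}x a single SATA 3 link)")
```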
The DCP1000 is a high-capacity and high-performance solution that enables data centres to reach the previously described next-level performance required for applications like Artificial Intelligence, Virtual Reality or Internet of Things. For data centres, this means they can do more with less by making use of Kingston’s and Liqid’s NVMe solution DCP1000.   by Kingston Technology [1] IDC (2017), Data Age 2025. Available at: https://www.seagate.com/files/www-content/our-story/trends/files/Seagate-WP-DataAge2025-March-2017.pdf ### Learning to love the IT team: how ALM automation can help businesses understand their developers The DevOps concept has taken a firm hold on software development in the past few years and at the heart is ALM automation. Most companies have moved on from the old model of separate development and operations teams in order to work in a more integrated way, enabling them to respond more quickly and effectively to business demands. Gone are the old days in which enormous and unwieldy IT projects took place in isolation, all too frequently producing a product that was already outdated by the time it was completed.   So, where are we with this project? The transformation has not been easy, however, and for many organisations, there is still a long way to go. Application design, development and testing are all now happening simultaneously. This has required a complete change in culture for IT teams used to working in siloed departments. Not only is a far greater degree of collaboration required, but teams are also under an enormous amount of pressure to respond quickly to demands as businesses battle fiercely to innovate faster than their competitors. IT teams may still be subject to the dreaded “how are you getting on with the project?” questions from business managers, regardless of which element of a project they are directly responsible for. Typically, the younger generation of IT workers has been more eager to adopt DevOps and agile working than its older counterparts, but in such a fast-paced environment we have seen IT departments struggling at all levels. Part of the problem is coming from senior executives pressing for the introduction of whatever the latest technology may be – from a new mobile app to an IoT product – without a sufficiently clear understanding of what’s involved. The other major challenge is keeping track of such a complex network of development involving so many different moving parts. ALM automation – taking the pain out of DevOps At the heart of a successful DevOps policy is Application Lifecycle Management, ALM automation technology. ALM systems have always been essential in keeping track of the development process, managing applications from an original request through to the end of their existence, but not all of them have kept up with the complexity of new working practices. Many ALMs still require levels of manual intervention that are unrealistic in an environment where so many people need to be informed of every stage of the process. Asking developers to complete lengthy updates for every change they make wastes time that could be far better spent making another improvement to an application. The only way to manage the DevOps process effectively without investing in excessive administration is to automate it. 
By implementing a system that will automatically track and document everything that happens, you can free developers from administrative tasks, avoid errors and collect invaluable, consistent data to enable continuous process improvement. Collecting data automatically will allow you to maintain an inventory of an application and to understand how the different parts are related, as well as how the various components are being built. This means that when the time comes to deploy a change, you can easily identify which parts need to go to a mainframe, web server, cloud or mobile device rather than having to work it all out again from scratch. An automated system can also detect whether versions have somehow fallen out of sync and take action, either preventing users accessing it until the problem is fixed, or quickly rolling back to a previous version if required. Communication is the key to happiness The DevOps model has transformed the role of IT in business and done much to improve the old “IT Crowd” image of developers. But if your organisation depends on the rapid development and maintenance of your applications, make sure that you have an efficient ALM engine holding it all together. We all know how hard developers work, and that communication with non-technical teams hasn’t traditionally been IT’s strong point. Without a clear system that keeps everyone informed, it’s all too easy for business teams to start questioning what’s happening, and for the IT team to feel sidelined, pressurised and under-appreciated. Automating the process not only improves control and efficiency but also makes for a more secure system. For example, automation can ensure that the principle of “least privilege” is followed, providing access only when an individual absolutely needs it and revoking it by default after a set period of time. Crucially, ALM automation systems also enable clear communication, so business teams can be informed about the progress of the project.  This leads to a greater understanding of the process which serves to highlight the value of the IT team’s contribution to the organisation. And that’s something that will boost everyone’s morale. ### How to Write a Winning UX Proposal A UX proposal, also known as a design brief, is an outline of proposed changes to the User Experience design of a digital product. When looking at a UX proposal the UX designers needs to ensure that they outline the problem, explain why it is a problem, and present a thorough solution. This can be somewhat challenging, as the designer needs to make it very clear what the problem is and how the solution addresses it. They may even be describing them to someone who is not directly involved in the process. All put together, many UX designers may feel put off creating a proposal. But it is actually essential to the designer and the stakeholder, putting them on the same page and keeping both parties informed about each other's vision and concerns. By presenting the stakeholder with a UX proposal the designer can show what better design means to the company, from higher quality products to safer websites, from a more streamlined workflow to better ratings. The proposal will help the designer sell a design improvement to their stakeholder, and avoid confusion and arguments over design decisions you are making along the way. It is their blueprint to better design. Somewhat ironically, after all that hard work the UX proposal could be dismissed because of how it is presented. Stephen N. 
Lewis, a UX Designer at UKWritings explains: "A UX designer, of all people, understands best how important presentation is. When you hand in a UX proposal that is full of spelling mistakes, rambling, or sloppy, the other designers reviewing it may think that you just don't care. And if youdon't care about your proposal, why should anyone else?" The top tips for presenting a perfect UX proposal are: Speak to a writing consultant. Before getting started it is a great idea to get a writing consulting session underway. If you were undertaking any other task you would talk to the experts, wouldn't you? Same applies to writing: talk to a pro so you know where to start. UX designers can also read professional writing guides, like this one, for the same sort of result. Plan, plan, plan. If there is a limited number of words and points to make, it is best to lay out the bullet points and then expand on them later. This way the designer knows they are covering everything they need to. Make sure that it is error-free.  Just like the glitchy interface and awkward layout nobody wants in their User Experience, the designer needs to avoid spelling mistakes and awkward grammar in their UX proposal. These could distract the reader and make it look like the designer doesn't care about their work. Polish the proposal. Error-free obviously is great, but don't just rely on a grammar checker and a proofreader! Once the proposal is scanned, make sure to actually read it. This way the designer will get a proper feel for how the audience will receive it. Take this time to edit anything awkward or unclear. Get a second pair of eyes on it. Although one reader experience may indicate that it's a masterpiece, make sure to ask for a second opinion. A writing consultant should be able to detect flaws or inconsistencies that the first reader missed. Keep an open mind and make any edits they recommend. Keep it brief. The more the proposal rambles on and on about this detail or that detail, the less of the audience's attention it will have. Try and make sure that every point is descriptive, of course, but do not go beyond the essential. Make sure to reference. Claiming that one in ten users has had issues with a certain button location or loading speed time? That needs a reference. Make sure that every point that is central to the argument is properly referenced, with an actual academic study or company report. And make sure that the referencing style is consistent, so that the reader can actually look up the references for confirmation. Check it for plagiarism. After all this effort to create your UX proposal, it would be really frustrating to find out that part of it, or even most of it, is too similar to an article on another website somewhere. There are only so many sentences in the English language, but a minimum amount of originality is needed. What is more, if the proposed solution already exists, why should a company spend all their time and energy developing a new one? So make sure to check the proposal and guarantee that it is completely original. Naturally, it can be difficult for one person to keep on top of all this. Fortunately, there are some great tools and resources to help you with these pursuits. #1 Grammar and spelling. If you need help making sure your grammar is absolutely perfect, consider tools such as Grammar Checkers. #2 Writing consulting. To ensure your work is absolutely perfect check writing guides online. #3 Proofreading and polishing. 
Between these services your own writing skills will be massively boosted. And for anything you miss? #4 Making it readable. Staying within word count is essential, as is making sure your work is properly referenced and plagiarism-free. The best tools for the job here are word counters and citation generators for consistent references. If you make the most of these resources and follow the above steps, you should be able to ensure that your UX proposal is perfect. Now your proposed solution can shine all on its own! ### Multi-Cloud Does Not Equal Multiple Responsibility IT spending on cloud technologies, including public cloud providers, is expected to rise from 12% in 2017 to 18% within the next two years, according our Truth in Cloud study.  This trend is likely to continue as more organisations plan to increase the workloads they have across multiple cloud platforms. With a growing number of apps and services available, and the benefits cloud offers – agility, ease-of-use, time-to-provision – it is easy to see why cloud adoption is on the rise.  In fact, over half (56%) of organisations today operate with a cloud-first strategy. Further, business leaders are realising the potential of using multiple cloud platforms: nearly six in ten businesses (58%) that currently use one cloud provider plan to expand their portfolio across multiple platforms. Multi-cloud solutions offer an agile and cost-effective way for businesses to improve resilience, data security and workload management.  But organisations must pay close attention to selecting cloud service providers (CSPs) that are right for their business and their specific IT requirements. Many organisations have moved past thinking about cloud solely from a cost perspective.  Rather, the areas that are of most importance to organisations when it comes to selecting a CSP include data privacy, security and compliance (60%), workload performance (49%) and workload resilience or uptime (43%). It’s encouraging to see organisations embracing a multi-cloud approach as they strive to achieve digital transformation.  But as they increasingly distribute their data across multiple clouds to advance their cloud journey, it is critical that enterprises understand exactly who has the responsibility for their data in the cloud. Dangerous misconceptions Worryingly, many organisations are making incorrect assumptions about the data management capabilities offered by their cloud service provider, leaving themselves exposed in multiple areas.  The majority (83%) of businesses believe that their CSPs take care of data protection in the cloud.  More than two-thirds (69%) expect their cloud provider to take full responsibility for data privacy and regulatory compliance of the data held on their platform, with three-quarters (75%) of businesses even saying they would leave their cloud provider as a result of data privacy non-compliance. Yet cloud service provider contracts usually place data management responsibility onto the service user.  Organisations need to understand clearly their retained responsibilities in order to avoid the risks of non-compliance, which can have massive implications on their business. With the EU General Data Protection Regulation (GDPR) now in force, businesses can’t afford to mishandle their data since they will be the ones who face the repercussions, regardless of whether their data is stored on their own private servers or hosted on a third-party cloud platform. 
Our research also reveals dangerous misconceptions around the responsibility for cloud outages, with six in ten (60%) respondents admitting they have not fully evaluated the cost of a cloud outage to their business and are therefore ill-prepared to deal with one. A similar percentage (59%) also believe that dealing with cloud service interruptions is the primary responsibility of their CSP, while most (83%) believe that their CSP is responsible for ensuring that their workloads and data in the cloud are protected against outages. Although cloud service providers offer service-level objectives on infrastructure availability, it is ultimately the service user's responsibility to ensure that their critical business applications remain resilient in the event of an infrastructure outage and that data is protected against loss or corruption. The information a company holds is its most valuable asset, and businesses must have full visibility into their data and be accountable for protecting it, regardless of where it's located. Not only will this help avoid the risks of non-compliance or loss of revenue through downtime, but it can also help organisations to glean better insights from their data to improve customer experiences, manage costs, improve research and development, and build brand loyalty.
The future's bright
Despite the challenges that a multi-cloud approach can bring, the opportunities it can provide are far greater. Using multiple clouds enables businesses to increase resiliency and minimise any risk of downtime: if any of their cloud service providers suffer an outage, businesses can simply switch to another one of their platforms. Our research found that resilience and uptime are among the most important aspects organisations consider when selecting a CSP. The ability to move workloads between platforms also allows businesses to pick and choose the right CSPs for specific job functions or workloads. With more companies embracing a cloud-first mentality and introducing multi-cloud approaches into their organisations, the need to navigate the complexities of a multi-cloud world is critical. As with on-premises environments, companies should consider all aspects of data management as they journey to the cloud, from data protection, regulatory compliance and workload portability to business continuity and storage optimisation. Although businesses can export their data to the cloud and reap the benefits of doing so, they must remember that they cannot export their data management responsibilities.
### Comms Business Live: S3E2 - Exploring Team Collaboration
In the next episode of Comms Business Live, David Dungay, Editor of Comms Business Magazine, will be taking his expert panel through the world of team collaboration and the opportunities for the Channel. Joining him will be Zubair Usman, Collaboration Solutions Architect from Cisco; Adam Clyne, Managing Director of Coolr; Mark Phelps, Collaboration Product Manager at Node4; and Ian Rowan, Channel Manager at Wildix. Tune in at 3:30pm on the 11th June to find out the latest developments in the market and where the opportunities for revenue exist for partners.
### The Axians Network – S1E2 – Cyber Security in the Industrial Sector
Joining host David Eighteen for this episode are Juniper Networks' EMEA Security Specialist Lee Fisher and senior consultant for ARC Advisory Group Thomas Menze, discussing cyber security within the industrial sector.
### Businesses are looking to the cloud for transformative ways of scheduling
The workplace is undeniably changing at a fast rate. Instead of using old-fashioned manual processes and complicated spreadsheets to manage shift planning and scheduling, a myriad of businesses – from gyms and hotels to bars and restaurants – are turning to innovative solutions to work together progressively and conveniently.
The problem with scheduling
According to Deputy, more than 80% of small and medium-sized businesses rely on time-consuming solutions, like pen and paper, or spreadsheets, to manage staff rotas. The average restaurant, hotel, or office manager dedicates 2.64 hours a week to the same scheduling task, as well as spending more time on checking the accuracy of pay. That's a staggering 20% of valuable time, which could otherwise be spent serving customers and growing the business. Of course, checking pay and records are essential aspects of running any business, and in this respect employers have a duty of care. Many businesses, however, are still falling short. Last month, the government named and shamed nearly 180 employers for underpaying more than 9,000 minimum wage workers by £1.1 million. Mistakes aren't always made by unscrupulous employers; outdated, paper-based systems can also lead to mistakes with time sheets and pay. For staff in shift-based businesses like call centres and gyms, there can be other challenges, such as timekeeping. Employees requesting time off due to holidays or sickness, especially at the last minute, can also complicate matters.
Replace pens and spreadsheets with the cloud
Rather than trying to figure out how best to fill gaps in work scheduling manually, it can now all be done in real time via apps and online, thanks to the power of cloud-based employee management solutions like Planday. Whether you run a restaurant or a manufacturing business, instead of having to update spreadsheets with last-minute changes, you can now simply drag and drop new shifts into work schedules. And once you have a scheduling system that works, there's no need to waste time creating it from scratch every week. Instead, the same work schedule can automatically be transferred over to the next week. As for timekeeping, at the beginning of a shift, individual employees can clock in quickly through an app, which works on Apple and Android devices. With its ability to connect employees and managers in shift-based businesses, in real time, it's easy to see why some of the biggest names in food service, hospitality, and fitness are now embracing innovative, cloud-based technology to run their businesses more efficiently. These include Ottolenghi restaurants, Soho Gyms and the St. Pancras Renaissance Hotel, among others.
Transforming industries
Cloud technology has transformed how businesses operate across all industries. One industry in which staff management, flexibility and reliability are paramount is the care sector, and this is one area Planday technology has been transforming. In care homes it's vital that communications are foolproof, important documents aren't mislaid and systems are regularly updated. Crucially, when workers can't make a shift, managers need to know immediately. Sam, a shift-worker at Avondale Care, a group of Scottish care homes which implemented Planday last year, told us: "I can pick up extra shifts, swap shifts and request holidays easily and I can communicate with my team effectively.
It's night and day compared to the old system; a real improvement and all the staff love it." One of the major responsibilities of care home management is ensuring that staff are rotated regularly, to avoid burnout, as well as ensuring the right skills are available within the team on duty at all times. The Planday app ensures that staff can instantly clock in and out for shifts, and all changes are instantly updated centrally. Corresponding salary changes are also recorded automatically. The Director of Avondale Care, Adrian Hendry, explained: "Planday provides more transparency. Any contract or policy changes that need to be made can be implemented simply through the system and everyone has sight of it. This, in turn, helps us with employee satisfaction, as all parties are aware of any changes."
Saving Time and Resources
Since the technology is centrally developed, controlled and updated, it's also cost-effective. Our customers save on average seven hours of admin time per week, freeing up time and budget for their essential staff. Whether you work in the care sector, the fitness industry or a restaurant, cloud-based technology like Planday is changing lives for the better by driving new efficiencies and higher workplace satisfaction. To find out more about Planday visit https://www.planday.com/uk/
### From Mt Gox to Now | The Crypto Exchange Transformation
Seven years on from the great 2011 Bitcoin theft, the world of crypto exchanges is undergoing a stunning transformation. The news that those who had Bitcoin stolen in the notorious Mt Gox theft of 2011 may finally be compensated has been welcomed by everyone involved in digital currencies. Revisiting the story serves to remind all of us of the progress crypto exchanges have made in a relatively short period, but also of the steps that still need to be taken. In 2011, Mt Gox was riding high and dominating Bitcoin trading, accounting for roughly 70% of transactions. It turned out to be a sitting duck – a poorly protected centralised exchange that was always going to be subject to a well-organised attack. Thankfully, we now have better-protected centralised exchanges. Even more excitingly, a new breed of decentralised exchange is emerging, offering a whole new level of security and convenience.
From exchange central to de-central
The sheer number of centralised exchanges in existence nowadays means that a cyber attack with such seismic repercussions for the industry isn't likely. But, unfortunately, attacks are still going to happen because centralised exchanges are inherently vulnerable. When individuals pass over custody of their funds to a centralised exchange in order to trade, their private keys become vulnerable to hackers. And unlike bank funds – which enjoy protection from governments – centralised exchanges won't always be able to refund attack victims. Centralised exchanges can also abuse power. They can reduce withdrawal limits without warning, while their unscheduled downtime can prevent users from accessing their funds. The sudden delisting of coins is also a risk, resulting in profit loss in some cases. Whilst the best centralised exchanges have contributed immensely to the ecosystem from a usability and trust perspective, hacks of any centralised exchange ruin the reputation of the crypto space and the value of cryptocurrencies in general.
De-centralised progress
The good news is that de-centralised exchanges are emerging fast. This is hugely important for crypto investors.
De-centralised exchanges stay true to the cryptocurrency ideal that parties don't need to take the chance of trusting organisations anymore – instead, blockchain technology can ensure commitments between parties are met. Exchanges can be completed without the involvement of an intermediary. De-centralised exchanges also offer individuals full control of their assets. They aren't required to hand over their private keys to a third party at any point; instead, value can be exchanged peer to peer.
Wider solutions
The progress that's been made since the days of Mt Gox amounts to much more than de-centralisation. The user experience and functionality being offered to crypto investors is now rivalling that of the banking world. More and more interesting projects are surfacing, offering amazing functionality and beautiful user interfaces that should ease adoption. New protocols are emerging that will enable inter-chain transactions of virtually every one of the 1,500 digital currencies out there. Soon, one digital currency user will be able to easily send a payment to someone who only uses another. People will no longer be limited to exchanging big-name coins.
A SWIFT-like future
The blockchain has been crying out for a solution akin to SWIFT, which was introduced in 1977 to facilitate cross-border payments. Before that, many of the world's currencies and banking systems operated in isolation. The crypto world is about to experience a similar revolution. Solutions are emerging that connect one blockchain to another by enabling them to effectively speak the same language. This type of solution will create a blockchain without borders, helping unlock the potential of every digital coin. Using an overlay protocol, it's now possible to complete transactions near-instantly (as opposed to taking several minutes or hours). Anonymity is also made possible by scrambling transactions to prevent ownership of a given coin unit being traced through a public blockchain. Such a solution can work by operating in parallel to an existing blockchain network (such as Bitcoin's blockchain). As it runs in parallel, the exchange is able to translate unique attributes of clients and nodes on each blockchain (such as the wallet ID number) into a universal code that can be understood by all blockchains; a simple sketch of this idea appears at the end of this piece. That universal code is what effectively enables blockchains to speak the same language to each other. This is huge news for everyone who believes in decentralised currencies, but particularly important for unbanked or underbanked populations in developing countries who already rely on them for financial transactions.
The Mt Gox era is behind us
The development of crypto exchanges into trusted and reputable institutions was always going to be bumpy. The emergence of de-centralised exchanges takes us a giant step away from the Mt Gox days, bringing us alongside the fiat banking world in terms of security and usability. Secure inter-currency transfers on de-centralised platforms are a big part of this story. SWIFT's importance to the progress and security of financial messaging and international trade to this day can't be overstated. The emergence of a fast, secure, SWIFT-like de-centralised solution for digital currencies is similarly revolutionary.
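To make the overlay idea above a little more concrete, here is a purely illustrative Python sketch of how chain-specific wallet identifiers might be mapped to a single neutral identifier. The chain names, example addresses and functions are hypothetical assumptions for illustration only; this does not describe any real protocol or exchange.

```python
# Purely illustrative "universal code" sketch: map chain-specific wallet IDs to
# a neutral identifier that an overlay service could route on. Hypothetical only.
import hashlib

def universal_id(chain: str, wallet_id: str) -> str:
    """Derive a chain-neutral identifier from a chain name and native wallet ID."""
    return hashlib.sha256(f"{chain}:{wallet_id}".encode()).hexdigest()[:16]

# The overlay keeps a registry mapping universal IDs back to native addresses.
registry: dict[str, tuple[str, str]] = {}

def register(chain: str, wallet_id: str) -> str:
    uid = universal_id(chain, wallet_id)
    registry[uid] = (chain, wallet_id)
    return uid

# The wallet IDs below are made-up placeholders.
alice = register("bitcoin", "1ExampleBitcoinAddress")
bob = register("ethereum", "0xExampleEthereumAddress")

# A cross-chain transfer request is expressed entirely in universal IDs;
# the overlay resolves each side back to its native chain and address.
print("route", alice, "->", bob, "on", registry[bob][0])
```

In a real system the registry and routing would need to be decentralised and cryptographically verifiable; the point here is only the translation step that lets otherwise isolated ledgers refer to the same counterparty.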
### Pulsant Partners with Armor, Protecting Organisations’ Cloud Data from Cyber Threats, while Enabling Adherence to GDPR Regulations Pulsant, a leading UK provider of hybrid cloud solutions, has joined forces with Armor, a top provider of threat prevention and response services for private, hybrid, and public cloud environments. Combining Pulsant Protect —a comprehensive solution that covers the entire organisation, from people and networks to cloud infrastructure and applications —with Armor’s extensive portfolio of security services will ensure that each customer’s valuable data is protected from current and emerging cyber threats, while enabling the organisations to meet their GDPR requirements. With the Pulsant – Armor partnership, customers can turn up full-stack security for their cloud workloads in two minutes or less, while complying with regulatory mandates including GDPR, PCI, and more. Armor blocks 99.999% of cyber threats and provides customers with 24/7/365 threat monitoring and instant access to its team of on-demand security experts. Martin Lipka, Head of Connectivity Architecture, Pulsant, says: “We’ve always been impressed with what Armor has been able to achieve in the cybersecurity space, and so it’s great to be working so closely with them to bring these capabilities into our own security solutions. Through this partnership, we’ll be able to offer our customers an even better range of solutions.” Armor’s compliance with a range of regulations further strengthens Pulsant’s own compliance service, leveraging the same operational analytics for intelligent security monitoring as well as continuous alignment to regulatory frameworks. Lipka continues: “This is an important benefit that we can deliver to our customers alongside Armor. By using the same deep monitoring and analysis of their physical, virtual, and cloud infrastructures, we can keep their infrastructures secure while demonstrating to auditors and regulators that they are, in conjunction with their own best practices, complying with applicable regulations.” Chris Drake, Founder and CEO, Armor, says: “After spending a lot of time with the leadership and people of Pulsant, it was clear to me that Pulsant shared our same core values to guide and protect customers as they evolve their organisations. With that, Armor is extremely honoured to be partnering with Pulsant. Working hand in hand, Pulsant and Armor will provide organisations with the robust, real-time protections needed to defend against the ever-changing cyber threat landscape, while enabling organisations to meet and exceed their compliance requirements.” ### Technophobia, artificial intelligence, and the UK economy Technophobia - The idea that artificial intelligence (AI) will become evil and take control. It has been embedded in pop culture for decades. From the antagonist of the Terminator franchise, ‘Skynet’, to ‘HAL 9000’ in 2001: A Space Odyssey, pop culture has not exactly warmed the British public to AI. Now widespread AI is becoming a reality, and it’s facing resistance.   You’ve no doubt heard the scaremongering cloaking AI advancements in a foreboding shadow. AI is taking British jobs. AI will kill the UK economy and increase the rich/poor inequality. AI will take over the world; it’s the end of the human race. But aside from fanciful dystopian narrative, do any of these fears hold true? AI is new, yes, but is it really as bad for the UK as we’re being led to believe? 
Niamh Reed from automation specialist Parker Software, investigates whether AI really is likely to kill the UK economy.   Technophobia Anxiety over artificial intelligence is a modern strain of technophobia. With so many seemingly reputable sources (like Stephen Hawking) declaring AI to be disastrous, unnecessary distress is rife amongst the general public. There's no room for technophobia and this must stop. There is no denying that AI will change the nature of the job market, but it will not destroy it. We’re not even close to reaching a stage where AI could run our businesses and induce mass redundancy. Nor do UK leaders want this future. At present, in fact, the UK is preparing to “forge a distinctive role for itself as a pioneer in ethical AI.” And if the numbers are anything to go by, AI will prove to be a major boon for British business, unlocking the heightened productivity that stems from human-bot symbiosis.   Reality of AI AI isn’t all robots replacing humans and cyborgs sent to kill. More commonly, AI is a tool designed to help humans work more efficiently, like your CRM tool, your live chat software, and your business process automation. Far from taking over jobs, artificial intelligence is helping jobs evolve. Technophobia is redundant. In fact, few occupations are fully automatable, but 60% of all occupations have at least 30% technically automatable activities. This means that AI can alleviate tedious and repetitive tasks – but not replace us altogether. You need only look at the failure of Fabio – the Pepper robot recently sacked from a British supermarket for scaring customers – to see that robots aren’t ready to supplant us all from our roles. Instead, AI in the workplace is great for helping with the repetitive, the manual and the mundane. It’s instrumental in freeing up the average UK employee’s time for more important tasks.   “They took our jobs” It’s true that AI will eliminate the need for some tasks to be performed manually. But it doesn’t necessarily follow that it will strangle jobs and economic growth in the UK. AI is much less tuned to dealing with relationships or anything that needs empathy. A robot will not be capable of calming an angry customer, or thinking around unique problems, or relaxing a frightened child in hospital. No matter the job, there will always be levels of it that are simply much better suited to human employees. In other words, no matter how clever AI grows to be, there will always be room for human flexibility, reasoning and empathy in the workplace.     It’s already here If you need evidence that AI is a help to British productivity, you need look no further than the AI already peppering our lives. Our smartphones carry an AI assistant and our homes hold conversational AI agents that help us with a plethora of tasks. We chat with AI chatbots when we reach out to businesses online, and at work, AI is starting to help us analyse data and create a better class of jobs. Artificial intelligence is supporting jobs in every industry. In offices, it’s automating tedious data entry and admin tasks, freeing up employees to tackle the truly important stuff. Customer service is seeing AI improve personalisation and ease of business interactions. In production, AI is automating the more dangerous manual tasks that require little thought, while being supported by human teams from a safer location.   Artificially intelligent future Used correctly, AI will continue to improve working life. 
It will refine our jobs into ones with high human contact, high creativity, and low tedium. The smart offices of the future will blend human management and empathy with AI-infused insight, and we will see a new class of jobs created as this assimilated future unfolds. The future of AI is not insular but integrated. Plus, AI will deliver substantial productivity gains. This increased productivity means that AI is likely to cause the production costs of goods and services to decrease – meaning lower prices, more income and more money freed up to be invested in British assets like the NHS. In fact, for the UK and its growth and ageing-population challenges, AI could provide an opportunity to create meaningful work while increasing GDP.
Embrace AI evolution
AI is here, and it's becoming more prevalent across British business. It might become more intelligent than us, and it might radically change the kinds of jobs that we humans do. But that doesn't mean it's going to kill our economy. On the contrary, it might just be the saving grace we need. We rely on AI already without the fear, and without our economy suffering. Why should the future be any different? Humans are, and always will be, an essential part of the AI journey that we in Britain are just beginning. Both technology experts and elected officials have a responsibility to quell the dystopian fears surrounding AI, and focus on the opportunities it can create. By embracing the evolution that AI is facilitating, we can meet a more efficient future with open arms.
### #TheCloudShow – S2E4 – Cloud and the Edge
Edge computing can be defined as a "mesh network of micro data centres that process or store critical data locally and push all received data to a central data centre or cloud storage repository, in a footprint of less than 100 square feet." Now there are some crystal ball gazers who are saying that Cloud is dead and Edge computing is the future. But where did Edge come from? A product called NATS started this trend, where enterprise messaging systems and platform technologies could be decomposed into things called microservices, using the processing capacity of devices at the end user rather than in remote computing. There is clear value in utilising the processing power of devices that are close to the end user. The key factor here lies in some basic physics – so if you did not pay attention in school about the speed of light, you might have to concentrate hard now. Historically, the technology industry has gone back and forth between centralised computing and computing all happening at the end user. This is now being disrupted by things like Machine Learning and Artificial Intelligence. As humans, we understand linear processing – our basic instincts demand responses like fight or flight. Essentially, we are 50,000-year-old wetware, and as a result we struggle to understand exponential growth. Artificial Intelligence and Machine Learning are seeing exponential growth, and what is happening now is that basic compute is limited by the speed of light, namely how long data takes to travel up and down the internet. This is where Edge computing comes in: it takes the compute out of the far-away data centres that cloud traditionally runs on and brings it to the devices that are closer to where the action is. The argument is that software and compute will move closer to the consumer in order to meet their insatiable demand for instant access. The fastest response will lie with telco providers delivering 5G and with the devices themselves processing the demand.
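To put rough numbers on that physics argument, here is a minimal back-of-the-envelope sketch. The distances and the two-thirds-of-light-speed figure for fibre are illustrative assumptions, and routing, queuing and processing delays are ignored; it simply compares round-trip propagation time to a far-away cloud region with a nearby edge node.

```python
# Back-of-the-envelope propagation delay: round trip to a far-away cloud
# region versus a nearby edge node. Distances and the two-thirds-of-c
# fibre speed are illustrative assumptions.

SPEED_OF_LIGHT_KM_S = 300_000                      # approximate, in vacuum
FIBRE_SPEED_KM_S = SPEED_OF_LIGHT_KM_S * 2 / 3     # light travels slower in fibre

def round_trip_ms(distance_km: float) -> float:
    """Propagation-only round-trip time, in milliseconds."""
    return 2 * distance_km / FIBRE_SPEED_KM_S * 1000

scenarios = {
    "Far-away cloud region (~6,000 km)": 6_000,
    "In-country data centre (~300 km)": 300,
    "Edge node at the local exchange (~10 km)": 10,
}

for name, distance_km in scenarios.items():
    print(f"{name}: ~{round_trip_ms(distance_km):.2f} ms round trip")
```

Even before any real-world overheads are added, the distant region costs tens of milliseconds per round trip while the edge node costs a fraction of a millisecond – which is the whole argument for moving compute closer to the user.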
Fundamentally, this is about what is known as latency, namely the time it takes for data to go up and down the pipe. Or is it?
### Understand your exposure to third-party risk
How can you trust suppliers and partners to keep personal data safe? If your organisation shares personal information – on employees, customers or prospects – with third-party companies, are you sure they treat it as carefully as you do? Do you audit their data-protection and cyber-security policies? Do you check that they patch their IT systems? Are you sure – right now – that their intranet is secure? And what about all the companies they deal with? Under the GDPR, you have joint liability and responsibility for sensitive data shared with third parties. Now: how would it be if you could see – in real time – an accurate risk rating for all the companies you work with, and for all the companies they work with, and so on? Get a comprehensive, 360-degree view of cyber risk across entire ecosystems of connected businesses. RiskXchange: the global standard for cyber risk score ratings, research and analysis, providing a simple, automated, centralised approach that empowers organisations to conduct business securely in an open, collaborative, digital world. Find out more at https://www.northdoor.co.uk/solutions/riskxchange
### The Digital Workplace - Episode 2: Digital HR Unleashed
In programme two, we look at the rapid pace of HR technology innovation and how to unleash the value of digital HR. This next-generation HR operating model is delivered 'as-a-Service'; it is uniform, higher-quality and lower-cost than standard traditional models, and delivers fast, visible results. Here we touch upon the five key enablers of the next-generation operating model: lean HR and payroll process standards, digital HR, HR-as-a-Service, advanced HR analytics and intelligent automation (AI and machine learning). With this change in approach to the workforce, the question arises: "Who manages the HR Bots?" We will cover this, the changing role of HR, and the five enablers of digital HR throughout this series.
### Why Big Bang switch outs of IT invariably just go BANG
I'm going to take a leap and suggest that if you are reading this you've either lived through a big bang, know someone that got fired over a big bang, don't want to be fired over a big bang, or at least read something in the media about a big bang switch out that went wrong.
What is a Big Bang IT Switch Out
For the uninitiated, the Big Bang approach is where you retire an operational IT system and replace it with a new one in a single manoeuvre. You finish work on Friday using the good old boy that's been your partner for some time, and then you arrive on Monday to a bright shiny new thing that says no a lot, but smiles whilst doing it. You don't have to look too far back to find some jaw-dropping examples of this "pain has no memory" business practice. In the last 12 months I can think of three in the UK alone: British Airways stranding tens of thousands of passengers worldwide, having switched to a seemingly robust but thoroughly untested third-party IT provider; KFC having to close all their UK outlets as a result of switching from Bidvest to DHL; and TSB denying their customers access to their banking accounts as a result of switching from the Lloyds banking platform to the Sabadell system.
High Risk + Big Bang = Fail In this article, I’m going to focus on why Big Bangs go wrong, rather than why organisations need to change, adapt or modernise their incumbent systems, for which there are varied and often valid business critical reasons for introducing the scary change monster into the house. Most IT strategies talk about reducing TCO and commoditizing the technical and human resources needed to develop and run systems. So for most organisations when that technology change decision is accepted the guys in the boardroom expect that costs will go down. They think purely in terms of the cost of the licensing, hardware and people to support it and how much they will save if that changed for something with a lower commodity price ticket. There is, in fact, a clinical syndrome called New Shiny Vendor Syndrome (NSVS), which is basically what happens when a new vendor wows the board with some headlines about a bright sunny future with light speed enterprise, lottery wins and lifestyle business. Anyone that’s been through this will know it has a powerful LSD type effect with an instantly addictive quality that can spread like an airborne pathogen throughout the lofty regions of an organisation. The investment in the existing system at that point is seen as dead money, a decision that no more money will be spent on the old iron is quickly accepted and a warm fuzzy feeling about a brighter future is embraced at break-neck speed, whilst backs are slapped and champagne glasses clink at this new and amazing business epiphany.   The True Cost of Change In those heady moments of expectation and anticipation as to the effects of new IT panacea, the true cost of change is not really considered. There is an urgency to get to the future as quickly as possible so some best practices are often passed over for want of the new life changing supersonic screwdriver. Anyone that reads the news will quickly understand that the cost of change should be more heavily weighted in the cost of getting it wrong than in the immediate cost benefits of the new technology. A change that goes wrong can damage brands irrevocably, after an avoidable service failure that strands clients in foreign lands or leaves them without their money, as sure as Sunday customers will leave your business and seek less risky shores for future ventures. A systemic failure that creates an enterprise “All Stop”, closing stores and more importantly stopping tills can hit the bottom line in a flash. This not only impacts balance sheets instantly but also forces would be customers to seek other sources of service, possibly stealing them elsewhere forevermore. Sometimes it can be a seemingly small failure in a grand victory that can sink an entire ship, anyone who understands the subtlety of the small and hidden piece of errant functionality that led to the MFI closure can attest to the power of the ill-specified grand solution being broken by the smallest of little gremlins, 'twas David that felled Goliath. In all cases the media frenzy and viral nature of all modern news will seek sacrificial lambs and the vilification of all things branded similarly, dragging down both individuals and share prices alike. The true cost of getting it wrong is felt beyond the IT department, out across the enterprise, into the value chain and the world thereafter, with the possibility of jeopardising the future viability of the business itself.   
Accept that Costs Need to Go Up to Go Down
The successful agent of change has to land the plane that delivers value now and on the journey, because the destination is an ethereal point in time that will inevitably keep moving away in the ever-changing landscape of need and capability. The board have to be brought back from the New Shiny Vendor Syndrome-induced haze to accept that most people don't live in a house whilst it's being fundamentally rebuilt or extended. Anyone that's lived in a house being extended will appreciate this: it's hell and not to be done twice. Moving out temporarily has an additional cost, but that additional cost is better than living in hell. The board have to accept that to realise the long-term benefits there will be some additional costs. The additional costs are needed to remove risk from the change and to ensure business continuity throughout the journey. It's a simple equation: the longer the existing system has been in place and the more sizeable the technical and business logic investment, the more it is going to cost to move it. It's as simple as that, and that is OK. If you live in a house you collect stuff; the longer you're there, the more stuff you collect. We all accept that the more stuff we have, the more the house movers charge to move it – an accepted cost in addition to buying a new house. As it is with houses, so it is with IT systems.
To the Promised Land and don't spare the Horses
The aim of this piece has been to guard against the gold rush of great TCO savings which, if not properly considered and managed, can be dangerous to your career and, more fundamentally, your business. However, I couldn't do that without giving a few paragraphs over to how to avoid said costly mistakes. A risk-free journey will require the new system to run in parallel with the old for an amount of time that would see the entire business cycle through. This is often at least a year, and normally longer: when an issue is discovered it needs to be investigated, resolved and tested, pushing the parallel-running end date back to encompass a new, complete business cycle. There are lots of great approaches and strategies for managing the technical and behavioural aspects of this type of change, and I will cover some of my favourites in another piece. But needless to say, closed approaches like Waterfall will likely have known end dates but take too long for the agent to survive the process. Open approaches like Agile will have no known end date but allow for a more holistic journey, albeit at a far greater cost. These are brownfield developments, not greenfield ones; functional equivalence is the minimum viable product, and all too often the journey time will see the new shiny system drug wear off and the perception of a mere plumbing change grow – and no one likes calling out the plumber.
The Net Net
Don't get blinded by the headlines of a new low-cost tech future. Resist the temptation to make it all about cost reduction, remember the headlines of past victims, and avoid becoming one by not using a BIG BANG approach. Remember to use best change management practices, even if they take longer and cost a little more. If you can truly say that the new system will empower you with an agility to maintain and grow your business in ways that the current system won't, then change is inevitable – just don't lose your head, or you'll probably lose your job.
### #TheIoTShow - Episode 4: Analytics, AI, Machine Learning, Business and Manufacturing Insights The IoT Show delivers valuable insights for industrial organisations on the industrial internet of things and topics touching the broader internet of things. We pick engaging, hot topics, and ask our speakers to advise on situations, opportunities, and recommendations and gotchas. This series is not a sales medium, rather a series of discussions for you to benefit from the experiences and insights of others. In this fourth episode, we’re going to look areas of insight, analytics and the like that are often fundamentals or leveraged in IIoT and IoT investments. Select the image above to view the webcast, and click here for the Key Learnings Takeaway. ### Comms Business Live Presents: Knight Corporate Finance Surgery ep.2 – The Sellers Perspective In Knight Corporate Finance’s second live value surgery, David Dungay will be looking at the Exit process from a sellers perspective. Alongside Adam Zoldan, Director of Knight Corporate Finance, will be Paul Harrison who recently sold thevoicefactory to EvolveIP and remained part of the business. Also joining the debate is Adrian Barnard, Founder and previous CEO of Modern Communications which he sold in 2013. Over the course of the discussion the panel will give an insight into the highs and lows of the process along with their advice for ensuring a transaction runs smoothly and the pitfalls to avoid. Once again, the panel will be happy to answer your questions live over social media. Tweet your questions using the hashtag #CBLive18. ### Comms Business Live - Season 3 - CPaaS – is the Channel ready for it? In the first episode of this new season, sponsored by TalkTalk Business, David Dungay will be leading a discussion on the CPaaS (Communication Platform as a Service) market and how the Channel are embracing this trend to enable them to build more relevant solutions for their customers and stand out in an overcrowded market. Joining him will be Dom Black, Senior Analyst at the Cavell Group, and Justin Hamilton-Martin, Centile UK, Craig Walker, VP Cloud Services at ALE, Rick Hawkes, Sales Engineering Manager from Avaya and Robinder Koura, Head of Channel Sales – EMEA at Ringcentral . If you have any questions regarding the CPaaS market please tweet them using the hashtag #CBLive18. Series Supported by: Talk Talk Business ### #TheCloudShow – S2E3- The Great Multi- Cloud Debate We all accept that we all do a form of multi-cloud (i.e. we consume several cloud services in isolation – e.g. on your phone we use different social media, email, etc all separate clouds) – but that is not the point of this discussion. The point of the debate is whether it is realistic for organisations to run on two or more cloud services to run a business process or set of workloads. Here is the argument against Multicloud: No-one can agree what it is in the tech industry, organisations struggle with mastery of one major cloud provider – so how can they do it for many? It is basically impossible to consolidate and manage multiple cloud costs onto the famed single pane of glass and the cost and effort of managing multiple clouds is unproductive. But what about the argument for Multi-Cloud: It is about using the right tool for the right job, it can allow you to use the best athletes in the market for what your business wants to do. 
Multi-cloud can also offer low latency: by having cloud services in one space, it allows data to be transferred across different platforms much more quickly. It is about flexibility, allowing organisations to leverage the relative advantages, price points and geographic locations of the solutions to their best advantage. But which one is right, if either? With guests Bill Mew, Founder @Mew_Era and Steve Chambers, COO @cloudsoft #TheCloudShow #DisruptiveLIVE
### Why GDPR Will Be Good for Business
On the 25th May 2018, the EU's General Data Protection Regulation (GDPR) will come into force. With the UK's data protection watchdog, the Information Commissioner's Office (ICO), reporting year-on-year rises in the number of data breaches, coupled with high levels of public distrust of how online businesses protect and use their personal data, the GDPR is set to offer EU citizens more rights over control of their data. While largely welcomed by the public, some are concerned that the expanded definition of "personal data" and the restrictions being placed on how organisations can use this information will impact the technology industry's economic growth. Fortunately, while the regulation is mandatory for any organisation holding identifying data on EU citizens (regardless of HQ or server location), compliance will likely help boost business.
Protect against fines
First and foremost, the EU is ensuring acquiescence by imposing the largest fines ever seen with regard to data protection laws. Breaches of data, improper management, gathering or selling of personal data without subjects' permission, or failure to respond to data subjects' requests appropriately can all result in a fine of €20 million or up to 4% of your organisation's global revenue (whichever is greater). By contrast, the outgoing Data Protection Directive (DPD), which the GDPR is set to replace, enabled the ICO to fine violators of data protection laws in the UK up to a limit of £500,000.
Big data will continue to drive the market
Setting an upper limit that moves with a business's earnings and closing loopholes, such as storing data in servers outside of the EU, creates a level playing field throughout the industry. The GDPR prevents big tech companies from buying their way out of trouble. Furthermore, by only limiting those organisations who attempt to cut corners with regard to data protection, the regulation works in favour of small and medium-sized enterprises (SMEs) who, by their nature, are more likely to develop a personal connection with their customers and clients (and so are more likely to get consent to store and use personal data). While the regulation will make compliance more demanding, none of the new rules forbid the use of big data to promote marketing and sales. Provided you are honest about what data is collected from customers, encrypt and store it securely, and ensure customer consent before using it, you will still be able to use personal data to drive your sales and marketing strategies.
An era of responsible data management
Consequently, organisations are being encouraged to play a more personal and responsible role when it comes to interacting with customers and clients. This can be as simple as making clients aware that your business is GDPR compliant. Equally, you are well within your rights to assure them of how you will use their data responsibly, how it will be protected and how, in the event you want to use it for other means, you will seek their permission and consent.
In other words, the GDPR promotes transparency in the tech industry. While this requires effort on your part, it will largely be of net benefit. Currently, trust among the general public for big tech and the technology industry is at an all-time low when it comes to data protection. Two out of three Europeans believe online businesses are irresponsible with their personal data, and one third have voiced their intent to exercise their rights to have their data removed or limited in the first month of GDPR. GDPR compliance is an opportunity to connect with your customers and capitalise on the public's demand for greater trust and responsibility.
Trust leads to loyalty
There is an added business incentive besides merely avoiding punishment or damage to your reputation. By cultivating trust with your customers, you not only improve your organisation's reputation, thereby attracting more customers from your non-compliant competitors, but you will boost customer loyalty, to which there are numerous economic benefits: on average, 5% of an enterprise's customer base accounts for 80% of its business; you are 60-70% more likely to sell to an existing customer, while new customers have only a 5-20% success rate; it costs your business five times as much to gain a new customer as to retain a current one; and 63% of customer losses to competitors are down to poor treatment or failure to satisfy their needs. Loyalty is big business, and the GDPR represents an opportunity for organisations to recognise the public's demand for greater respect of their rights. By embracing GDPR and offering a more personal service, technology organisations have a chance to reinvent their public image as trustworthy. This is your chance not only to avoid losing customers by satisfying their rights to greater privacy, but also to strengthen your loyal customer base.
### Cloud Expo Asia Hong Kong LIVE | Day 2
Join us live for Cloud Expo Asia 2018, Hong Kong. Disruptive will be streaming live from the HKCEC, Hong Kong for the two-day event. Don't miss the latest interviews, news and trends from the Expo. Remember to join in the conversation on Twitter using #DisruptiveLIVE
### Tech of the Week #10: TESS - The Transiting Exoplanet Survey Satellite
The Transiting Exoplanet Survey Satellite, TESS, is NASA's latest mission in the search for life on other planets. It succeeds the Kepler Space Telescope and launched aboard a SpaceX Falcon 9 rocket. The mission aims to discover thousands of exoplanets – planets outside our solar system that orbit a star – of all sizes.
NASA, Boeing and SpaceX
SpaceX and Boeing are contractors hired by NASA. The original plans were for SpaceX and Boeing to help NASA with two commercial space efforts. One is for Commercial Resupply Services (CRS) and the other is the Commercial Crew Programme (CCP). Since the Space Shuttle Programme ended in 2011, NASA hasn't been able to send astronauts to space using its own technology. Instead, it purchases seats aboard the Russian Soyuz spacecraft, built by RKK Energia.
So why has NASA partnered with SpaceX and Boeing?
In a bid to bring these operations back to the US, NASA partnered with the two American giants, who have been funded to provide a cheaper alternative. Originally intended to start in 2016, both companies have faced delays to their spacecraft, with flights now pushed back until 2019.
Here is a comparison of costs between using Boeing and SpaceX, and NASA building its own spacecraft:

| Mission type | NASA | Boeing | SpaceX |
| --- | --- | --- | --- |
| CRS missions | $272,000/kg of cargo | N/A | $89,000/kg of cargo (67.3% saving) |
| CCP missions | $1,677 million | $654 million (61% saving) | $405 million (76% saving) |

TESS
Image: TESS sitting on top of a SpaceX Falcon 9 at Cape Canaveral, FL (Spaceflight101)
NASA's new Transiting Exoplanet Survey Satellite (TESS) launched on April 18th, 2018 aboard a SpaceX Falcon 9 rocket from Cape Canaveral Air Force Station, Florida, USA. TESS was developed by the Massachusetts Institute of Technology (MIT) with its ultimate goal being to provide a new catalogue of planets. Interestingly, the satellite will act more like a "finder scope", picking out thousands of planets and paving the way for more advanced observatories. Due to its small size, TESS is unlikely to capture much detail but will be able to catalogue a large area of space at one time. The ultimate goal is to find potentially habitable planets. These will be worlds that orbit their main star at a distance that allows water to exist on the surface, improving the chances of hosting life. Of course, this opens up the opportunity to find life as well. So far, a few planets already discovered fit this description, such as Proxima Centauri b and the TRAPPIST-1 worlds. Researchers are aiming to identify about 50 of these worlds with a diameter of less than four times that of Earth's, and 500 with a diameter of less than twice that of Earth.
TESS's Operations
Image: The Transiting Exoplanet Survey Satellite (The Independent)
In order to catalogue as much of the sky as possible, NASA has split the operation into two parts. Firstly, TESS will operate in the Southern Hemisphere for the first year, and will move on to the Northern Hemisphere for the second year. This is further broken down into cycles of 27 days. Each cycle will consist of the satellite facing a fixed direction, pointed away from the sun, causing it to "oscillate in a 13.7-day orbit between 67,000 and 232,000 miles above Earth's surface". It's worth mentioning that TESS will follow an orbit not seen before. Many satellites use a geosynchronous orbit, which means they are fixed at about 22,236 miles from Earth. TESS's minimum altitude is over triple that, and at its maximum it will be over 10 times that height. During the lowest pass, the satellite will transmit images, video and all other data to stations on the ground. Why does it use this orbit? The resulting orbit time of 13.7 days means it will be able to capture the sky more quickly, in keeping with the two-year operation-time goal. So how does it do it? Like Kepler, TESS will detect small dips in light as it looks towards a star. The dip in light is caused by a planet orbiting said star. This is known as a transit. It will accomplish this with four 100mm-wide 16.8-megapixel optical cameras. Each camera comes equipped with seven lenses and four Charge-Coupled Device (CCD) image sensors. The cameras will cover a 24x24 degree patch of sky, "large enough to fit the Orion constellation".
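To make the transit method concrete, here is a small illustrative sketch using synthetic data (this is a toy example, not TESS's actual pipeline or numbers): a star's brightness is simulated as roughly constant with noise, a periodic dip is injected, and samples sitting well below a robust baseline are flagged as candidate transit points.

```python
# Toy transit detection on a synthetic light curve (not TESS's real pipeline).
import numpy as np

rng = np.random.default_rng(42)

# Simulated flux: ~1.0 with noise, plus periodic 1% dips where a planet transits.
n_points = 1000
flux = 1.0 + rng.normal(0.0, 0.001, n_points)
period, duration, depth = 200, 10, 0.01            # samples, samples, fractional dip
for start in range(50, n_points, period):
    flux[start:start + duration] -= depth

# Robust baseline and scatter (median / MAD) so the dips themselves
# don't inflate the noise estimate.
baseline = np.median(flux)
robust_sigma = 1.4826 * np.median(np.abs(flux - baseline))

# Flag samples sitting well below the baseline as candidate transit points.
in_transit = flux < baseline - 5 * robust_sigma

print(f"Flagged samples: {in_transit.sum()} (50 injected)")
print(f"Estimated transit depth: {baseline - flux[in_transit].mean():.4f}")
```

Real pipelines do far more – detrending, folding on candidate periods, vetting against false positives – but the dip-below-baseline idea is the core of the transit method.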
Due to its planned survey pattern, as detailed above, the satellite will be able to scan for thousands of exoplanets. This opens the way for technology like the Hubble Space Telescope, or the James Webb Space Telescope, to study these new planets in a lot more detail. The James Webb Space Telescope is slated for launch in 2020, with the goal of better characterising planet atmospheres. According to TESS's principal investigator, George Ricker, "public data releases will occur every four months, inviting immediate community-wide efforts to study the new planets." This means we will be able to keep up with investigations and be a part of the discovery.
Next in the series, Wednesday 23rd May 2018… Tech of the Week #11: Ulo – The smart-home-ready, customisable security owl
With a beak that doubles as a two-way mirror, Ulo is the interactive surveillance camera that communicates through eye expressions. More than that, this technology has sensors, microphones, Wi-Fi, app integration, speakers and an LCD screen that can be customised in a variety of ways… If you want to talk tech or astronomy, but not astrology, find me in the Twitterverse @JoshuaOsborn16.
### Cloud Expo Asia Hong Kong LIVE | Day 1
Join us live for Cloud Expo Asia 2018, Hong Kong. Disruptive will be streaming live from the HKCEC, Hong Kong for the two-day event. Don't miss the latest interviews, news and trends from the Expo. Remember to join in the conversation on Twitter using #DisruptiveLIVE
### Is AI The Future Of Collaborative Work?
The way that we interact in the workplace is changing. While the importance of face-to-face meetings cannot be overstated, the influx of millennials into the workforce has led to a significant shift towards electronic communications. Recent research from PWC found that 41 per cent of millennials prefer to communicate via instant messaging or text, as opposed to in person. While this has been well documented, this is just the beginning. There is another new technology making an impact on how we communicate and work: artificial intelligence. Its full potential is yet to be seen; however, a recent report from Gartner has estimated that by 2022, one in five workers engaged in mostly non-routine tasks will rely on AI to do their job. What does this mean for collaborative work? Here are three ways AI can revolutionise the workplace in the coming years:
Facial recognition technology
Largely thanks to companies like Apple, facial recognition is no longer an emerging technology that's reserved for MI5 and those with high-security clearance, but instead is part of the average consumer's everyday life. But its uses aren't limited to user authentication, and gradually we're going to see the incorporation of facial recognition technology in other areas of the enterprise. For example, facial recognition in remote productivity tools will provide a practical way of reading visual cues. Right now, when it comes to making big decisions, be it closing a deal or interviewing a candidate, many still prefer face-to-face meetings as they can take into account cues that don't translate well over the phone. But, by applying analytics to facial images on video, meeting hosts will be able to pivot their conversation if needed, and inform more effective post-meeting follow-up. This "meta-meeting", which focuses more on the feeling of the meeting (body language, tone, etc.)
rather than the actual conversation, will gather insights and learnings that help the meeting host facilitate better human connections and drive positive results.
Virtual assistants
AI-powered virtual assistants are making their mark on the enterprise, and it's already evident that there's endless potential for the technology to have an impact on the collaboration space. By integrating AI within collaboration tools, low-value tasks, such as organising meetings, dialling in to conference calls, and transcribing audio calls, can be taken on by the assistant. Although they may not sound like time-intensive tasks, if an employee spends an hour a week carrying them out, that adds up to around 6.5 working days each year (52 hours, spread across eight-hour days), which suddenly becomes more significant. If virtual assistants can take on these jobs, employees are freed up to focus on higher-value tasks, in turn increasing productivity within the company. And for business leaders who want to see a financial incentive before implementing new technology, a rise in productivity will drive an improvement on the bottom line. Although concerns have been raised that virtual assistants and AI will rise up to replace the human workforce, the truth of the matter is that the rise of virtual assistants will allow us to focus more on the actual work and less on mundane tasks that can be automated. However, to ease any transitions, companies should educate employees on where virtual assistants will be used within the company and how they'll be an advantage. Similarly, if roles are changing, existing and new employees should be given adequate training and support.
The future of collaborative workspaces
By enhancing the collaboration tools we use, AI will also shape how and where we work. We've already seen a rise in mobile, on-the-go working, and the emergence of the gig economy, but AI-powered technology, especially with respect to collaboration tools, will encourage this even more. Employees no longer need to be chained to a desk in an office, with technology giving us the freedom to work wherever is convenient. Yes, emailing from the bus on the way to work might not seem so novel, but with AI, the types of jobs we carry out away from the office will change. For example, as alluded to above, facial recognition technology and the meta-meeting could encourage business leaders to carry out important business meetings remotely, as opposed to face-to-face. Working remotely and hot-desking are already a given in many companies who recognise that the needs and wants of employees are shifting. Similarly, collaborative office spaces are already being seen as the first-choice option for many freelance workers and small businesses, allowing them to access all the benefits of a permanent office without the high rent prices. AI and collaboration technology will only facilitate these styles of working even more. What's clear is that the way we work is evolving, largely due to an increasingly tech-savvy workforce and the technology it uses. Collaboration tools are already fostering remote working, but as they become more advanced, this small shift in working has the potential to become seismic. AI-powered technology will be a main source of the fire that fuels this change, allowing employees to get more out of their enterprise software, be it through virtual assistants or facial recognition technology. Companies who recognise this will be able to attract employees, increase productivity and revenues, and edge ahead of their competition.
### Fog Computing Market
Fog computing market to witness a robust CAGR of 65% over 2017-2024, healthcare applications to drive the industry trends.
The fog computing market, a natural extension of the cloud computing industry, is expected to become one of the most remunerative business spheres of recent times. As the explosive growth of IoT continues to touch almost every business vertical, conventional cloud computing platforms are showing their limitations in quickly processing the vast volumes of data. The operational challenges of the current cloud computing structure are prompting tech giants to move from centralised to decentralised architectures, and the fog computing market looks to be one of the major beneficiaries of this transition. By supporting vertically isolated, latency-sensitive applications through a distributed, ubiquitous, multi-layered architecture, fog computing aims to simplify IoT infrastructure. According to the definition given by the U.S. NIST (National Institute of Standards and Technology), fog computing decentralises management, data analytics, and applications within the network itself. This, in turn, minimises the time between requests and responses, providing local computing resources to devices and network connectivity to centralised services. Additionally, with reliable estimates suggesting that the number of IoT-connected devices will surpass 50 billion by 2020, the fog computing market will undoubtedly see a plethora of opportunities on a global scale. The prediction is backed by a market assessment report that forecasts the overall business space to grow at a CAGR of 65% over 2017-2024. Experts believe the widening application spectrum of the fog computing industry is one of the primary factors behind this growth. Unlike the cloud-based model, where data can be accessed only from a centralised platform, fog computing allows end users to access smaller, more intricate and specific pieces of information locally.
Unveiling the potential of the fog computing market in healthcare applications
The healthcare sector is unquestionably one of the most lucrative growth avenues for the fog computing market. Over the past few years, healthcare has undergone a paradigm shift from a volume-based to a value-based model, a fundamental factor behind the increased penetration of IoT devices in the sector. In fact, as per reliable estimates, worldwide healthcare IoT installations are expected to reach 161 million by 2020. Against this backdrop, extending the healthcare cloud to all parts of the health sector has become increasingly important in order to ensure real-time, smart communications. This, in turn, will fuel demand for advanced fog solutions that deliver both low latency and holistic intelligence, driving the size of the fog computing market. In fact, analysts expect the fog computing industry to markedly influence healthcare's transformed supply chain model, with appreciable improvements in operational efficiency, cost reduction, and power consumption. With the help of specific user protocols and predefined authorisation, fog computing allows patient health data to be shared among all the integrated platforms through a common interface.
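As a minimal sketch of the local-processing model described above (the sensor values, threshold and send_to_cloud stub are illustrative assumptions, not any vendor's actual API), a fog node close to the devices might aggregate readings locally and forward only a compact summary, plus any out-of-range values, to the central cloud:

```python
# Minimal fog-node sketch: aggregate sensor readings locally, forward only a
# summary plus anomalies to the central cloud. Names and thresholds are
# illustrative assumptions, not a real product's API.
from statistics import mean

def send_to_cloud(payload: dict) -> None:
    """Stand-in for an upload to a centralised cloud service."""
    print("-> cloud:", payload)

def fog_node_process(readings: list[float], high_threshold: float = 100.0) -> None:
    """Handle a batch of readings locally; escalate only what the cloud needs."""
    anomalies = [r for r in readings if r > high_threshold]
    send_to_cloud({
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,          # raw values only for out-of-range readings
    })

# Example: 1,000 local heart-rate samples become one small upstream message.
fog_node_process([72.0, 75.5, 71.2, 128.4] + [70.0] * 996)
```

The bandwidth and latency saving comes from the fact that one small message, rather than every raw reading, crosses the network to the centralised service.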
Still at a budding stage, the true potential of the fog computing industry can only be realized through extensive research by both private and government entities. In this regard, scientists at Wayne State University recently made headlines with the announcement of a plan to develop a collaborative edge computing ecosystem. Reportedly, the vision centers on creating a platform able to connect healthcare-related information from disparate organizations via a common IoT platform. It has also been speculated that 40% of globally generated IoT data will be handled at the edge of the network by the end of 2019 - estimates that further back the steep growth curve of the fog computing market. Challenges facing the fog computing market One of the foremost restraints on fog computing industry growth, experts say, is the lack of proper standardization of its deployment. As a result, end users sometimes encounter inconsistencies that ultimately result in unreachable information and poorer data retrieval. In addition, fog also inherits some of the existing problems of cloud computing, with security concerns at the top of the list. Because fog devices are generally set up in public places, they are vulnerable to interference, and with inadequate surveillance, transmitting computation logic to the edge of the network has turned out to be a significant potential threat for the fog computing industry. However, renowned tech juggernauts such as Intel, Dell, Cisco, ARM Holdings, and Microsoft appear to be taking a proactive approach to overcoming these limitations. For instance, in a bid to standardize fog computing technology for future deployment, these organizations, in association with Princeton University, founded the OpenFog Consortium in 2015. The forum mainly aims to solve challenges associated with innovative digital IT infrastructure pertaining to communication, latency, and bandwidth. The competitive landscape is anticipated to witness similar encouraging research initiatives over the ensuing years. All in all, with severe network bottlenecks and the consequent need to address cloud computing's deficiencies, the fog computing market looks set to be the next opportune investment spot for tech giants. According to a report by Global Market Insights, Inc., the business is forecast to surpass USD 700 million globally by 2024. ### #TheIOTShow - Episode 3: The World of (Smarter) Cities The IoT Show delivers valuable insights for industrial organisations on the industrial internet of things and topics touching the broader internet of things. We pick engaging, hot topics, and ask our speakers to advise on situations, opportunities, recommendations and gotchas. This series is not a sales medium, rather a series of discussions for you to benefit from the experiences and insights of others. In this third episode of the IoT Show, we look at the topics of Smarter and Connected Buildings and Cities, and what IoT means in the areas of the Architectural, Engineering and Construction industries and to Owners/Operators. We focus too on Cities and Public infrastructures and services. ### Siemens Industrial Businesses Appoint New Finance Director Siemens Digital Factory and Process Industries & Drives Division in the UK have appointed Cui Yan to the position of Finance Director, effective immediately. 
Cui Yan has held a number of senior positions within Siemens since joining the business in 2003, including Head of Controlling for Siemens China Digital Factory Division & Process Industries & Drives Division. He is a Fellow of the Chartered Institute of Management Accountants and a Chartered Global Management Accountant. He obtained an MBA from the Schulich School of Business at York University. Cui succeeds Robin Phillips, who has been assigned a global role working with Siemens Digital Factory & Process Industries & Drives, based in Manchester. Brian Holliday, Managing Director, Siemens Digital Factory, comments: “We are delighted to welcome Cui Yan to our senior management team. This is an exciting time for Siemens as we work with our customers across many industrial sectors and support them as they embark on their digitalisation journeys. We look forward to benefitting from his contribution on an operational level, as well as utilising his insight to support our strategic vision.” Cui Yan, Finance Director, Siemens Digital Factory & Process Industries Division in the UK, added: “I am delighted to be working in such a vibrant and exciting market, where Siemens are leaders in digitalisation and leaders in the latest technological trends in manufacturing. I look forward to working with our employees and customers to enhance our industrial growth potential, supporting the business in its ambitious growth plans for the years ahead.” ### Tech of the Week #9: Medical drones supplying life-saving treatments Drones are used for many reasons, both business and personal. Amazon can deliver your new pair of shoes and you can get a great aerial shot of your favourite chill-out spot. What if I told you there are medical drones that deliver life-saving treatments such as blood? Currently, blood is transported by many different modes of transport: car, motorbike, even Land Rovers. Drones usually don’t make that list. That is until now, anyway. Since 2016, Rwanda has had the help of California-based start-up Zipline to save the lives of those in need of urgent blood transfusions and vaccines. They do this by using medical drones. Picture: Afrika News The Aim and Practical Uses Zipline’s mission statement reads: “More than two billion people lack adequate access to essential medical products, like blood and vaccines. Zipline is solving this using instant on-demand delivery using small robot airplanes.” These aeroplanes are called “Zips”. Starting its quest in Rwanda, the company aims to help the large proportion of people who don’t have, or can’t get, access to life-saving treatments. This could be for many reasons: the medical centre may be in a hard-to-reach location, such as beyond mountainous regions, or supplies may not arrive quickly enough due to long journey times. Delivery times in Rwanda typically range from a few hours to days, by which time the chances of survival can be minimal. There are 21 hospitals located within a 75km radius of the central distribution centre and the company hopes to use this system in Tanzania next. There could be future usage all over the world as the company grows. Even in locations that are not hard to reach, it would still solve the problem of blood’s naturally short shelf life. Rather than hospitals stocking up on all types and potentially wasting crucial stock, they can simply order what they need at the time it’s needed. How Does It Work? 
There are several processes each medical drone goes through to prepare it, get the supplies to their destination, and return ready for the next flight. Firstly, the remote medical centre in need of supplies sends Zipline a text message with the order they need. These orders are sent through to the central distribution centre where supplies are packaged and prepared for flight. Product integrity is kept to the highest standard and the cold chain is maintained (supplies are kept at the appropriate temperature for the duration of the delivery). These distribution centres have access to sensitive and scarce items, and the items are packed within minutes. The medical drone then takes off, flying at over 60mph. The medical centre that ordered the supplies is sent confirmation that their delivery is on its way. Within 30 minutes (the average fulfilment time) the drones drop the supplies over a designated area at the remote location. The package falls gently to earth on a parachute so it lands safely, and a text message is sent to the doctor to alert them to its arrival. Lastly, the drone flies back to the distribution centre, where it lands briefly for a battery change and any other updates it may need. Then it’s back to the skies with its next delivery. The Medical Drones Once a delivery is requested, the Zip is catapulted into the air via a launch platform. It then uses GPS coordinates to find its location completely autonomously. This is all done whilst in communication with local air traffic control stations to ensure any airlines in the area are aware of potential path-crossing. This isn’t likely to happen, as the Zips generally fly much closer to the ground than airliners ever would; it is when flying over mountains or terrain that requires a higher altitude that potential issues could arise, and this is mitigated by those communications. The medical drones, in total, make up to 500 deliveries per day, in all weather conditions, 24 hours a day, 7 days a week. About 35,000 units of blood are delivered a year in Rwanda, with 50% going to mothers suffering postpartum haemorrhage, 30% going to children under 5 years old suffering from malaria-induced anaemia and the remaining 20% to other crucial causes. Application in Other Countries Currently, there are plans for the drones to be trialled in the US, but we could also see them come over to the UK. The main difference between operating in these countries, as opposed to the African states, is the difference in flight regulations. Currently, there are regulations in the US that state that drones cannot be flown over people, and also cannot be flown beyond a pilot’s line of sight. That is, without a special waiver. However, more data will be needed before such a waiver is given. Next in the series, Wednesday 16th May 2018… Tech of the Week #10: TESS - The Transiting Exoplanet Survey Satellite The Transiting Exoplanet Survey Satellite, TESS, is NASA’s latest mission to find life on other planets. It succeeds the Kepler Space Telescope and launched aboard a SpaceX Falcon 9 rocket. 
The mission aims to discover thousands of exoplanets - planets outside of our solar system that orbit a star - of all sizes… If you want to talk tech or drones, and for more great content, drop by my Twitter @JoshuaOsborn16. ### A Clearer View of the Cloud Despite widespread availability, cloud continues to raise questions for businesses looking at the best way to take their company forward in the digital age. Public, private or hybrid? Everything as a service or applications and workloads on-premises? The variety of cloud configurations is as diverse as the companies adopting them. When most people think of cloud, it’s likely public cloud which springs to mind. It’s certainly one of the most broadly used, in part because public cloud is widely believed to be a cost-effective data storage option. However, there’s continued debate amongst teams on the trade-off between the benefits delivered by public cloud versus perceived issues. For many, one of the biggest benefits of public cloud is the cost savings it can provide. But choosing to utilise public cloud shouldn’t be based solely on cost. Public cloud also provides significant benefits when it comes to business agility and productivity. The ability to quickly scale, collaborate across borders, benefit from the latest updates in applications and ultimately gain a competitive advantage are part of the reasons organisations choose to deploy public cloud applications and services. On the opposite side, when compared to the best on-premises solutions, public cloud can in some instances be a performance, reliability and security downgrade. With changing compliance demands in organisations, this perceived downgrade is prompting a renewed focus on on-premises IT. A recent IDC study into the incoming GDPR [1] found that despite downward pressure on IT spend, 34% of European organisations will increase expenditure on on-premises storage solutions to aid compliance with GDPR. While this focus on on-premises might provide better data estate visibility and ensure clarity of data residency, the reality is that organisations that aren’t using public cloud in some capacity are likely missing out. Public cloud should be a part of a business’s strategy; it just shouldn’t be ‘the’ strategy. Rather than existing as competing options, cloud and on-premises should complement each other, and integration between them should be the focus for building an agile, and secure, IT estate. A significant majority of companies - 72% in the UK according to our Evolution report - do recognise that cloud and on-premises storage should complement rather than compete. As a result, the focus can shift to what the best hybrid environment is for the business. With the rise of data-intensive AI and Machine Learning projects in business, there is a growing demand for high-performance infrastructure. As a result, hybrid and multi-cloud environments also need to meet this high-performance requirement, enabling data to be delivered and processed quickly. To match these workload requirements, many organisations are choosing to use public cloud in combination with on-premises storage. This combination ultimately delivers the performance and cloud-burst compute needed for these data-intensive tasks, while also meeting strict data security, governance and compliance requirements. For many organisations, moving to hybrid cloud can require a shift in mindset and a re-evaluation of priorities within the IT department. 
But it is increasingly clear that the future is going to be hybrid and multi-cloud focussed. As a result, those who don’t make the jump are going to be left at a competitive disadvantage. ### GIANT Health Innovators - Episode 2 - Challenges in Innovation In this episode of the Giant Health Innovators show, our host Barry Shrier joins a panel of health and tech experts to discuss challenges in innovation. Host: Barry Shrier, Founder, Giant Health Event Guests: Dr Gyles Morrison, Naveed Parvez, Jenny Thomas, Dominic Pride Want to join in? Contact us on twitter @DisruptiveLive using the hashtags #GiantLive and #DisruptiveLive ### Pinacl's Social Housing Solution - How Does It Work? Social Housing providers across the UK are responsible for managing tens of thousands of properties. Poor property conditions cause serious problems for both the property and tenant health, resulting in the need for constant refurbishments costing millions of pounds every year. With smart living, or a Smart Home, you can easily imagine a scenario whereby a simple command on your way home turns on the driveway or security light, switches on the hallway and kitchen lights, sets the coffee machine making your favourite drink, and automatically closes your curtains and blinds. In the past 5 years, local authorities in England have paid out more than £35m in compensation and legal fees to people living in “unfit” homes (BBC). Pinacl has developed a solution which could save social housing providers thousands. We are already working with several housing associations who are keen to trial and adopt this new technology, recognising the potential for a rapid Return on Investment. Pinacl’s housing solution aims to reduce the maintenance costs of properties by deploying sensors to gain real-time insights into the health of each property, enabling the Housing Provider’s limited resources to better support tenants and avoid costly refurbishments. Alerts can be automatically generated if problems arise or are predicted to, notifying the tenants and Housing Provider to take corrective action so that the property and tenant remain healthy. Pinacl can provide a full end-to-end solution, from sensor and IoT network to application dashboard. We can monitor humidity, temperature and CO2. The sensors take hourly readings, which are processed and presented on our application dashboard, providing real-time and historical access to the information for reporting and analysis. The temperature sensors can help avoid many problems including vacant properties, faulty boilers, poor ventilation and even fuel poverty. Humidity sensors can provide early detection of conditions which are likely to cause mildew and mould in a house, which can be highly toxic. In an affected property, mould and mildew will likely spread through vents and start to penetrate walls, seeping into every pore of the house and causing costly, time-consuming repair works. Specific temperature and humidity conditions will cause damp. Pinacl can help prevent damp by configuring alarms around specific thresholds, automatically triggered when those thresholds are exceeded, proactively alerting the Housing Provider to take appropriate action. Our sensors will also measure CO2 levels, comparing the usual CO2 levels expected for the number of tenants occupying the property to the actual level of CO2 present. 
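As a rough illustration of how this kind of threshold-based alerting could work, the short Python sketch below checks one hourly reading against configurable temperature and humidity limits and against an expected CO2 level derived from the declared number of occupants. Every figure, field name and the notify_provider stub is an invented assumption for illustration, not a detail of Pinacl's actual platform.

```python
# Illustrative limits only - not Pinacl's actual configuration
LIMITS = {
    "temperature_c": (12.0, 28.0),   # sustained low temperatures can indicate damp risk or fuel poverty
    "humidity_pct": (30.0, 70.0),    # sustained high humidity encourages mildew and mould
}
CO2_BASELINE_PPM = 400               # assumed outdoor baseline
CO2_PER_OCCUPANT_PPM = 250           # assumed rise per declared occupant

def check_reading(reading, declared_occupants):
    """Compare one hourly reading against limits and return any alerts."""
    alerts = []
    for metric, (low, high) in LIMITS.items():
        value = reading[metric]
        if not low <= value <= high:
            alerts.append(f"{metric} out of range: {value}")
    expected_co2 = CO2_BASELINE_PPM + declared_occupants * CO2_PER_OCCUPANT_PPM
    if reading["co2_ppm"] > expected_co2 * 1.5:
        alerts.append(f"CO2 {reading['co2_ppm']} ppm is far above the ~{expected_co2} ppm expected")
    return alerts

def notify_provider(property_id, alerts):
    # Stand-in for whatever alerting channel the housing provider uses
    print(f"Property {property_id}: {'; '.join(alerts)}")

if __name__ == "__main__":
    hourly = {"temperature_c": 10.5, "humidity_pct": 82.0, "co2_ppm": 1900}
    issues = check_reading(hourly, declared_occupants=2)
    if issues:
        notify_provider("FLAT-17B", issues)
```

A single reading is rarely conclusive on its own; it is the pattern over time that matters, as the next point about persistently high CO2 readings shows.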
If the CO2 readings remain high over a period of time, this could indicate undeclared tenants residing in the property, poor ventilation or even illegal activity. Pinacl is already working with several Housing Associations who are keen to trial and adopt this new technology, recognising the potential for a rapid Return on Investment from Pinacl’s IoT Housing solution. For more information on our IoT Social Housing Solutions call us on 01745 535300. ### #TheCloudShow - Cloud Economics ### GDPR IT Readiness and SD-WANs On May 25, 2018, the General Data Protection Regulation (GDPR) will be put into effect. The intent of GDPR is to provide a legal framework to strengthen and unify data protection for individuals within the European Union (EU). Individual data privacy includes identity information (e.g., name, address, ID numbers), web data (e.g., location, IP address, cookie data and RFID tags), health and genetic data, biometric data, racial/ethnic data, political opinions and sexual orientation. While the regulation protects individuals within the EU, it impacts organizations throughout the world, and they will be held responsible and accountable for how they process, store and protect personal data. GDPR suggestions for security actions to be considered against appropriate risk include:
- Pseudonymization and encryption of personal data
- Ensuring the ongoing confidentiality, integrity, availability and resilience of processing systems and services
- The ability to restore availability of, and access to, personal data in a timely manner in the event of an incident
- Processes for recurring testing, to assess and evaluate the effectiveness of the technical and organizational measures that ensure security
GDPR will require many organizations to employ comprehensive systems, policies and procedures to meet its wide-ranging requirements. Compliance with GDPR will entail governance measures to protect personal data and defend against breaches. Fortunately, a number of GDPR security and compliance capabilities are already enabled within many software-defined WAN (SD-WAN) solutions. There are two primary areas where personal data can be at risk. The first is data transferred between users and cloud/data centers over untrusted WAN Internet links. The second is in cloud and corporate data center storage devices, servers, and backup and recovery systems. Protecting user data over the WAN Advanced SD-WAN solutions add a layer of data protection at the network edge. An SD-WAN that supports multiple data security capabilities will provide enterprise IT with network-edge security supported by real-time visibility, policy controls and enforcement mechanisms. Creating hybrid WANs that connect remote offices and users to enterprise and cloud data centers, SD-WANs combine MPLS circuits and inexpensive broadband Internet links. This implementation allows IT to create different security zones for trusted and untrusted WAN links. While MPLS and IPsec VPNs are already secure, broadband Internet links, such as DSL, cable and other connections, are untrusted. SD-WANs can mitigate risk by encrypting traffic over untrusted networks to protect against unauthorized access and data breaches. Encryption to prevent data from being compromised The GDPR advises pseudonymizing personal data when it is transferred. 
This is a term GDPR uses to define a process that transforms personal data so that the resulting data can’t be attributed to a specific data subject without adding additional information. Encryption is a good example, because it makes the original data unintelligible, and the process can’t be reversed without access to a specific decryption key. To prevent data from being compromised as it travels over WANs, advanced SD-WANs will encrypt every packet with 128-bit or 256-bit AES across all aggregated WAN paths. Safeguarding data by leveraging the cloud Protecting data from various forms of intrusion like viruses, worms, Trojan horses, ransomware, spyware, adware and other malicious programs is vital. SD-WANs, with integrated app-centric cloud security services, provide Internet breakouts to secure-cloud gateways. This approach enables organizations to extend their security outside of the corporate data center. Reducing data in transit Some SD-WANs include deduplication as part of their WAN optimization features. Deduplication eliminates the transfer of redundant data across the WAN by sending references instead of the actual data. While this is considered an effective method for reducing bandwidth and storage overhead, reducing data also makes it more difficult for hackers to access the data. Deduplication ensures that only one, unique instance of data is retained at the requesting site. Redundant data blocks are replaced with a reference to the unique data copy, so only data that have changed since the previous backup are transmitted over the WAN. Reliable, cost-effective data protection over the WAN SD-WANs that combine multiple functions within a single physical or virtual edge appliance per location not only lower IT support effort and costs, they also simplify complex security processes. A unified SD-WAN orchestration platform will service-chain diverse security functions, like firewall, NAT, routing, VRFs, VPN concentrator, DHCP and IPsec termination, to ensure security policies flow contiguously through the network edge. In the past, building a service chain to support a new application took a great deal of time and manual effort. It required individually configuring all the network devices, and connecting each device together in the required services sequence. The network edge can be a potential on-ramp for hackers to infiltrate. Applying security policies can be complicated and time-consuming with legacy networks. SD-WAN easily secures WAN traffic by encrypting data with 128-bit or 256-bit keys. To further protect data privacy, other security capabilities can be employed, such as cipher-block chaining, using per-protocol sequence numbers, and enabling per-session symmetric encryption. An SD-WAN with virtual routing and forwarding (VRF) can automatically segregate traffic using multiple, distinct routing tables on a single SD-WAN edge appliance. On that same edge appliance, zone-based security with policy-based filtering can be applied across applications and services, ensuring only valid traffic is permitted into the trusted network. If you can see it, you can secure it By monitoring the network edges, organizations can better meet GDPR guidelines. SD-WANs provide various levels of visibility and auditability of network events and access. This capability allows IT to track how data has been accessed, and identify who retrieved it. By monitoring traffic within the SD-WAN, data is captured to provide a granular and accurate view of network and application usage. 
By collecting event data, IT staff can identify, capture and report on the state of the network and any anomalies that may have occurred. Aligning IT initiatives with GDPR requirements While GDPR will protect individual data within the EU, it will also impact organizations everywhere, as they align their data privacy and protection policies, procedures and practices with GDPR requirements. Yet, it also brings an opportunity for organizations to leverage technology solutions, like SD-WAN, to help them protect user data and privacy, save IT CapEx and OpEx costs, simplify network connectivity, and use WANs to their advantage – as a business enabler for the digital transformation journey. ### The Digital Workplace - Episode 1: Digital HR Uncovered In programme one, we start by uncovering “Digital HR”. What is meant by this? How does it improve business performance? How does it engage with the multi-generational, multi-national workforces now typical in so many organisations? While we are not all digital natives, there has been a huge shift in the way we all communicate as people, and our expectations of the tools we use to do our jobs are now typical of the always-on tablet and smartphone experiences we have at home. ### Tech of the Week #8: Facial Recognition – Caught in a crowd Facial recognition has been in the news a few times lately, and here it is again. As reported by us a short while ago, Chinese and UK police have used special facial recognition technology that can pick someone out of a crowd. China has been building out its surveillance network further; there are now 170 million CCTV cameras in operation, with plans for 400 million more to be installed over the next 3 years. But it’s not all as bad as it sounds… Facial Recognition Sunglasses In January/February earlier this year, Chinese police showcased their facial recognition sunglasses in a trial in Zhengzhou. The aim was to catch a number of criminals and wanted persons during the Lunar New Year celebrations. During this time, around 385 million people were expected to travel around the country to see family and friends and welcome in the new year. As you can imagine, with this many people swarming the country, technology seems almost a necessity for picking a face out of the crowd. The company behind the facial recognition technology, LLVision, claims their technology takes only 100 milliseconds to scan a face and recognise a match from a database of 10,000 suspects. LLVision’s CEO, Wu Fei, said “instant and accurate feedback” is given to police, who benefit from the lack of lag that traditional CCTV facial recognition systems suffer. With a built-in artificial intelligence frontend, suspects are discovered the instant they come into range, allowing the officer in charge to make the next move within seconds – unlike CCTV systems, where the suspect is unlikely to still be in that spot once discovered. The glasses cost $636 and that’s without the facial recognition technology. So far at least 33 people have been arrested, 7 for major crimes and 26 for travelling using fake documentation. 
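Under the hood, systems like this typically reduce each captured face to a numeric 'template' (an embedding) and compare it against stored templates for known suspects. The minimal Python sketch below shows that comparison step using cosine similarity; the toy embeddings, the 0.8 threshold and the database are invented purely for illustration and say nothing about LLVision's actual implementation.

```python
import numpy as np

MATCH_THRESHOLD = 0.8  # illustrative cut-off; real systems tune this against false-match rates

def cosine_similarity(a, b):
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def find_match(probe_template, suspect_db):
    """Return (suspect_id, score) for the best match above the threshold, or (None, threshold)."""
    best_id, best_score = None, MATCH_THRESHOLD
    for suspect_id, stored_template in suspect_db.items():
        score = cosine_similarity(probe_template, stored_template)
        if score > best_score:
            best_id, best_score = suspect_id, score
    return best_id, best_score

if __name__ == "__main__":
    # Toy 4-dimensional "templates"; real face embeddings have hundreds of dimensions
    database = {
        "suspect_001": np.array([0.9, 0.1, 0.3, 0.7]),
        "suspect_002": np.array([0.2, 0.8, 0.5, 0.1]),
    }
    probe = np.array([0.88, 0.12, 0.33, 0.69])   # face captured by the camera
    match, score = find_match(probe, database)
    print(match, round(score, 3))                # suspect_001, comfortably above the threshold
```

Real deployments tune that threshold carefully, since setting it too loosely is exactly what produces the false matches discussed below.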
There are immediate privacy concerns in a nation that is already one of the most watched on the planet; however, it may take some time before the true benefits (if any) are realised. It’s certainly a topic of hot debate. A Man Caught in a Crowd One of the latest stories to come out about China’s use of FR technology is the news of a man who was identified whilst at a pop concert amongst a crowd of 60,000 people. Known as Mr Ao, he was apprehended during the concert after being detected by facial recognition systems upon entering the stadium. Upon his capture, he reportedly expressed surprise. The Use of Automated Facial Recognition (AFR) in the UK China isn’t the only nation to be using facial recognition technology. The UK has also been trialling the use of various FR methods. In December last year, Welsh police used an AFR kit during the football finals to monitor activity and check for suspects. It is used in two ways: The first method uses a database of 500,000 people from their time in custody. The cameras detect a face in the crowd, cross-reference that image against the ones in the database, and if there’s a match, the officer is alerted and sent to apprehend the wrongdoer. The second method uses a more ‘live’ approach. For example, someone gets violent whilst on a night out and it’s captured on CCTV. This footage is then checked against the CCTV film of the current day, and a match is made this way instead. Of course, like any technology in its infancy, AFR has its issues. London police, whilst using the facial recognition technology at Notting Hill Carnival, reported 35 false matches and one wrongful arrest. It’s clear that the technology needs some refinement before it should be heavily relied on, especially when there are already concerns over privacy infringements; arresting an innocent party is likely to spark controversy. How Does Facial Recognition Technology Work? So what does the tech behind the technology look like? FR uses an algorithm to determine certain characteristics of the human face. Characteristics include the nose, jaw-line, mouth, ear size and distance between the eyes. The application would see a subject move in front of, or pass by, a facial recognition-enabled device, from which a sample of their physical appearance is extracted. A ‘template’ is then created from the biometrics and compared to another source. This is perhaps where the worry around privacy comes in. A photo of us would need to be kept in a database and that would make many feel uneasy. With the upcoming GDPR regulations we could see our data kept a lot more securely, thus easing some worries; however, the risk of breach and misuse will always be present. The final step is the system’s decision as to whether the images are similar enough for a match to be declared. Let’s take a look at situations in which FR technology can be used in a way that benefits us in our day-to-day lives. Non-Legal uses of FR Technology Let's end things on a slightly more positive, optimistic note and look at some potential (and in some cases, already implemented) uses of FR technology:
- In restaurants – when paying for a meal
- At a cash point – to save time withdrawing money
- Government offices – to only allow in those with appropriate security clearances
- Airports – when checking in or going through immigration
Next in the series, Wednesday 9th May 2018… Tech of the Week #9: Medical drones supplying life-saving treatments Drones are used for many reasons, both business and personal. 
Amazon can deliver your new pair of shoes and you can get a great aerial shot of your favourite chill-out spot. What if I told you there are medical drones that deliver life-saving treatments such as blood? If you want to talk tech or facial recognition, and for more great content, pick my face out of the Twitter crowd @JoshuaOsborn16. ### Payments Infrastructure - Is yours built to scale, and is it flexible? In 2018, scaling a business is less an ambition than an imperative: over-saturated industries mean there’s a lot of pressure to grow in line with the market and ahead of the competition. So what about your payments infrastructure? But running a fast-growing organisation has its challenges, particularly when it comes to infrastructure. An IT setup that’s designed for 20 employees might work at the beginning – but where does that leave your business in six months’ time, when you’ve hired 80 more people? Payments infrastructure is an essential growth consideration, and when you’re making new hires, opening more offices, and changing the entire shape of your company, it’s not always prioritised. But if you’re scaling up a business, neglecting it is a serious mistake. Payments infrastructure growing pains As your business grows larger, the pressure on your tech infrastructure will only increase. If it’s designed to meet startup-level requirements, the mechanisms in place for distribution, service, and transaction processing will buckle under the weight of a larger customer base using a range of payment methods. As volume increases, payments infrastructure must grow to accommodate this customer base’s varied preferences – as well as addressing other, more technical concerns. An issue such as fraud prevention (while always important) won’t often cause undue strain for a smaller business. For an expanding company, however, processes such as risk identification, threat elimination and containment, user authentication, and reporting become more complicated. Cultural learning A business can’t just grow bigger; it has to grow smarter. If you’re expanding into international markets, it’s necessary to factor in different economies, legal systems, and payment cultures - these differences can be huge. The European Union may have a supranational legal system and a shared currency, but each nation also has its own discrete laws and customs around payments: in Ireland, credit cards account for 87.2% of transactions; in France, they account for 50%. Pre-paid cards in Italy are used in 15% of all transactions – but the decline rate is significantly higher than for regular cards, so businesses with subscription services that require regular recurring debits need to be ready to handle this. Beyond Europe, it’s not uncommon for varying regulations to affect transactions: in the United Arab Emirates, for example, newly issued cards have restrictions on online payments. So any business expanding into the UAE must put in place a plan for reducing decline rates. Your payments infrastructure must have room to grow larger, but it must also have the flexibility to accommodate key market variations. Keeping the customer satisfied
A scaling business can’t afford to maintain a small and inflexible payments infrastructure. An erratic purchasing process can cause customers to become disillusioned and completely abandon the transaction. This can have a significant impact on your brand. Managing growth properly means ensuring that your customers are confident and happy to hand over their money, and a secure, convenient, frictionless payment experience is an essential part of this. A modern payments infrastructure should be able to accommodate the sheer diversity of options available to end-users. Today’s consumers pay with their faces, their fingerprints, and their voices as well as their cards and cash. Some nations have systems that aren’t used anywhere else: in the Netherlands, for example, a number of banks participate in the iDeal e-commerce platform. Your payments provider is key to effectively servicing customers in international markets. They must understand your objectives, collaborate with your team, and reconcile operational priorities with economic, cultural, and regulatory differences. They must, of course, demonstrate technical proficiency, but they also need to understand the needs of businesses in your vertical – and any obstacles that may lie in your path. Above all, they should be able to help you continue delivering great service no matter how quickly you grow, or how much your customers’ needs evolve. Ultimately, scaling your payments infrastructure isn’t just about raising your capability to the level of your new requirements. It’s about preparing for the next stage of your company’s growth so you’re not constantly playing catch-up, and it’s about adopting the philosophy of a bigger and better business: one that maintains and improves quality of service to satisfy customers – whenever they want, wherever they are, and however they prefer to pay. ### Cloud Malware - The perfect breeding ground, ignore it and it will bite you The constant struggle to keep malware out of the organisation and deal with it when the inevitable ingress occurs is a well-trodden path for CIOs. But as organisations reach further into the cloud for essential everyday capability, the cloud malware landscape has taken on a new dimension – one where the cloud is a route for its distribution. Cloud malware is a relatively new phenomenon, but cybercriminals are quickly wising up to the fact that these online environments are a perfect conduit for spreading malware. CIOs need to take it very seriously and ensure they’re on top of any potential opportunity it might have to enter their systems. Fast and silent spreading The purveyors of malware have started to make greater use of the cloud for a number of reasons. We are increasingly prolific users of cloud services and tend to have multiple accounts, increasing the scope for using the cloud to distribute malware. And cloud-only malware can and does spread incredibly quickly – which tends to happen automatically and be undetectable to the user. A case in point is Virlock. This is ransomware which first surfaced in 2014, and which has evolved to spread itself through cloud file-sharing tools. All it takes to start a whole new chain of infection is an infected file in a cloud shared folder that’s synced between two computers. 
When someone clicks that file, the infection spreads across not just the shared folder, but also all the files on the new computer. It multiplies exponentially and will spread further through any files shared with other users. The scale of the cloud malware problem Locating and containing potential cloud malware spreaders is a challenge because it’s highly likely an organisation uses a lot more cloud services than it thinks. There are, in effect, two types of cloud use. One is the formal, centralised services that are set up and managed by the IT team. Think of accounting, human resources, CRM and other organisational management services, as well as systems that relate to the services or goods that an organisation provides. Then there are the cloud tools people seek out for themselves and use because they help in getting work done – what’s known as shadow IT. Many – quite probably most – of these will not have been sanctioned by the IT team. Typically, CIOs estimate there to be 30 or 40 cloud apps in use within their organisation – in reality, it’s closer to 1,000. An audit will provide visibility into what is being used and by whom, but that just exposes the scale of the problem to be addressed. Trust - and the need for education One of the reasons why cloud-only malware is so potentially dangerous is that users tend to have very high implicit trust in files that appear in corporate-approved cloud file shares – as well as files that are attached to records in the CRM system. The assumption is that they must be safe. Whenever a file is uploaded, its context and origin are often lost; it is assumed that the file must have been uploaded by someone internally. Security education and awareness need to be updated to explain the new cloud-only malware threat to users and to treat files in popular file-sharing apps – like Dropbox and Google Drive – in a similar way to file attachments to emails from unknown senders. Blocking the routes to infection Businesses need to take control in order to secure their infrastructure. Of course, the easiest way to do so is by blocking the use of such apps, but, in reality, the IT team will win no friends if they remove access to tools people have adopted to help them work more effectively. So, the holy grail in this situation is finding a way to protect the organisation from cloud malware, while allowing people to continue to use the cloud apps they need and want. Rather than simply banning apps, organisations can categorise behaviours within those services and remove the ability to perform risky actions. For example, it might be prudent to remove the ability to attach files to webmail apps, or to limit access to documents within file sharing apps to view only. But inevitably, these moves will restrict productivity. If a user can’t download a file from a file sharing app, can they do what they need to? What’s really needed is a much more granular approach, which allows full access to cloud services in the way people want. 
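As a loose sketch of what such granular, action-level control might look like, the Python snippet below maps app categories and individual actions to decisions instead of banning whole apps. The categories, actions and rules are invented for illustration and are not any particular cloud security product's policy language.

```python
# Hypothetical action-level policy: app category -> action -> decision
POLICY = {
    "webmail":      {"view": "allow", "download": "allow", "attach_file": "block"},
    "file_sharing": {"view": "allow", "download": "block", "upload": "scan_first"},
}

def evaluate(app_category, action, user):
    """Decide what to do with a single user action in a cloud app."""
    decision = POLICY.get(app_category, {}).get(action, "block")  # default-deny anything unknown
    print(f"{user} -> {action} in {app_category}: {decision}")
    return decision

if __name__ == "__main__":
    evaluate("file_sharing", "view", "alice")        # allowed - day-to-day work continues
    evaluate("file_sharing", "download", "alice")    # blocked - risky action removed, app still usable
    evaluate("webmail", "attach_file", "bob")        # blocked - mirrors the example in the text
```

The point is that low-risk actions such as viewing stay available everywhere, while only the genuinely risky actions are blocked or routed through scanning, which leads naturally to the content-scanning approach described next.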
When the prevalence of email threats became apparent, it became common practice to scan all messages and attachments for viruses or malware. The same concept can be applied to web apps. Scanning all uploads and downloads will reduce the risk, as will applying Data Loss Prevention (DLP), which will typically scan all uploads for keywords or phrases and block anything that looks suspiciously like confidential, commercially sensitive or other data that the organisation wants to keep private. A good time to take steps Cloud malware is a real threat today, and it is growing. Moreover, the new GDPR coming into force in Europe at the end of May could result in far heavier penalties for data leaks than we’ve seen in the past. Rather than wait for the stick to hit, organisations should be thinking of the carrot – enabling their people to use the tools that they rely on for efficient working. There is absolutely no doubt that cloud apps can help a business be more efficient. That goes for the organisation-wide apps formally sanctioned by the IT team and the myriad apps people find and use themselves, but they must be de-risked as much as possible. Frankly, an organisation that doesn’t think about dealing with cloud malware now is an organisation that’s waiting to be caught out. ### Chatbot: The revolution and evolution of the AI-powered technology The way brands and customers interact has evolved dramatically over the past few years. Even phoning up a bank with a simple customer query seems a little archaic these days. Customers increasingly opt for more accessible methods of communication like the chatbot, built-in messaging platforms and even WhatsApp. Additionally, the conversation tone has shifted from formal and professional to friendly chats that resemble the kind you might have with a friend. Customer service is a top priority for every business - the truth being that without satisfied customers, you will soon be going out of business. But successful customer engagement can prove to be challenging in the age of technology, where new channels are constantly becoming available and continually need to be integrated. By 2020 it is anticipated that the planet will host some 6.1 billion smartphone users. And with the majority of consumers moving to mobile customer service as their primary - if not only - method for interacting with a business, the manner in which customers expect to be able to communicate with brands has changed. The ideal customer experience allows customers to interact on the channel of their choice, all whilst maintaining the context of those interactions. Forrester analysts recently reported that the dynamic has shifted away from companies and towards digitally savvy, technology-empowered customers that are driving change. As a result, messaging applications, in particular, have forced businesses to evolve their customer communications capabilities in order to keep up. Helping humans
AI-powered chatbots have become the go-to solution for addressing real-time customer queries both efficiently and effectively, but customers still very much value a human touch. Many businesses have therefore chosen to optimise their customer service with AI that augments human interactions, rather than replaces them altogether. Take, for example, the case of roadside recovery. When a customer calls requesting assistance, an AI-powered chatbot can immediately access that customer’s previous call records and history via the Customer Relationship Management (CRM) database. Using this data in association with other discrete variables - e.g. the time and location of the call - the bot can quickly anticipate the customer’s needs and respond accordingly. This information, displayed to a contact centre representative, can help augment the conversation, subsequently providing a smoother experience for the customer. This type of AI is becoming fairly commonplace across a whole range of industries, from financial services to health services to HR. And brands are increasingly opting to automate the business process of routine tasks in customer service, e.g. updating payment details and scheduling appointments. But we are now seeing more and more businesses looking for applications based on narrower, more refined AI solutions. The evolution of the chatbot While the chatbots of today are rapidly becoming a regular component of the customer service experience, it is important to remember that they are constantly evolving. One of the most exciting ways that we see chatbots evolving is through integration. In isolation, a chatbot is effectively a programmed set of responses - analogous in certain respects to an interactive FAQ. But with integration, a chatbot can access historical data to ensure that all questions are relevant to that particular customer’s query, thus eliminating unnecessary or redundant questions and continuing to provide a smoother experience. Furthermore, sentiment analysis can help bots detect when a customer is becoming frustrated, indicating the need for a transfer to a contact centre representative. Businesses can also integrate bots within their CRM platforms, unified communication systems, or internal databases, optimising their usefulness and making them more intelligent. Bots in the future With more and more companies using APIs to integrate chatbots within their wider ecosystem of communications, there now exists a demand for bots that interact across platforms. We can, therefore, expect to see a rise in SDKs and other frameworks for helping developers build API-driven voice bots that cross seamlessly between channels, from voice calls to message chat and back. In recent years, the line between what consumers expect in their personal communication experiences and what they expect from their interactions with brands has blurred. Companies are increasingly being expected to engage in the same way that customers communicate with their colleagues, friends and even family members. And since the majority of bots are driven by a consolidation of natural language processing and voice-to-text - which can be applied to almost any channel of communication - brands now have the opportunity to offer customers the choice of their preferred messaging channel, e.g. 
Facebook Messenger, WhatsApp, voice, chat or messaging. Chatbots have revolutionised the ways companies interact with their customers, and we are only at the very beginning. As challenging as the incorporation of the omnichannel customer experience might appear to company leaders, it also forms a great opportunity to engage with customers in a new way, building more meaningful and longer-lasting relationships. ### Data Journey - Jonathan Preston - Hitachi Vantara Jonathan Preston, Program Manager at Hitachi Vantara, talks to Compare the Cloud about the data journey. To find out more about Hitachi Vantara, please follow this link. ### #TheIOTShow - Episode 2: IoT Platforms The IoT Show delivers valuable insights for industrial organisations on the “Industrial Internet of Things” and topics touching the broader Internet of Things. We pick engaging, hot topics, and ask our speakers to advise on situations, opportunities, recommendations and gotchas. This series is not a sales medium, rather a series of discussions for you to benefit from the experiences and insights of others. In the second episode of The IoT Show, we’re going to look at some of the underpinnings of IIoT and IoT, as well as the topic of the IoT Platform. ### Tech of the Week #7: SABRE, From Supersonic to Hypersonic – a hybrid of space and flight Concorde still remains a pinnacle of technological success thanks to its faster-than-sound flight speeds. The last flight of this wonder was just under 15 years ago, and since then there hasn’t been another aeroplane to come close to Concorde’s speeds. This could soon change thanks to a super-fast jet engine that doubles up as a space-bound rocket, marking the UK’s ambition to become a space nation. Introducing SABRE. Supersonic flight – a brief history In 1942, a project by the UK’s Ministry of Aviation and Miles Aircraft began, with the aim of bringing an aircraft to market that could fly above Mach 1. The Bell X-1 achieved a speed of just under 1,000mph in 1947, piloted by Chuck Yeager. Mach 1 – the speed of sound, approximately 767mph. The speed record at the time was under half of this, so this was quite a feat. It also spawned the X-planes: a series of experimental US aircraft used to test new aerodynamic concepts and evaluate new technologies. On 2nd March 1969, one of the most renowned aircraft of all time took to the skies for the first time - the Anglo-French Concorde. Built by the British Aircraft Corporation and Aérospatiale, and flown by British Airways and Air France, Concorde remains one of only two supersonic passenger planes ever to enter service. This craft was capable of reaching speeds of Mach 2. You may have heard of the ramjet and the supersonic combustion ramjet (scramjet). These engines are capable of reaching Mach 6, thus breaching the hypersonic speed region. Hypersonic – speeds in excess of Mach 5. Due to the way these engines operate, they cannot be used from a standstill, but must already be travelling at high speeds. Normally a (sc)ramjet would be dropped from a larger aircraft, with its engines firing only after it reaches the required speed. Supersonic flight isn’t in mainstream operation outside of military use, and hasn’t been since 2003, largely due to the tremendous technical challenges, and consequent costs, this type of flight brings. Different aerodynamic design needs arise from the introduction of the “transonic regime” – where some parts of the plane face supersonic airflow and other parts subsonic airflow, simultaneously. 
Coupled with limited technological options and constraints like those of the (sc)ramjet, this technology needs some investment to fully flourish. Notable supersonic aircraft

| Aircraft Name | First Flight | Max Speed |
| --- | --- | --- |
| Bell X-1 | 19th January 1946 | 958 mph (Mach 1.25) |
| Concorde (commercial) | 2nd March 1969 | 1,354 mph (Mach 1.76) |
| Tupolev Tu-144 (commercial) | 31st December 1968 | 1,510 mph (Mach 1.97) |
| Lockheed SR-71 Blackbird | 22nd December 1964 | 2,200 mph (Mach 2.87) |
| North American X-15 (hypersonic) | 8th June 1959 | 4,520 mph (Mach 6.72) |

SABRE Concept aircraft equipped with SABRE engine (image: BAE Systems). So, let’s get down to the tech itself. The Synergetic Air-Breathing Rocket Engine, SABRE, is set to be tested in 2020. Reaction Engines, the company behind the technology, is a British firm from Oxfordshire that has so far raised over £26.5 million. The engine can be used in two ways: as an air-breather and as a rocket. Whilst in air-breather mode, SABRE will operate in its commercial form, propelling the aircraft at speeds of up to Mach 5 (3,836 mph). An air-breathing jet engine draws in air and expels hot gases through a propulsive nozzle, which in turn creates thrust. Reaction Engines has developed technology that can cool air of up to 1,000 degrees Celsius to sub-zero temperatures within a fraction of a second. SABRE takes a slightly different approach in that it uses the hot helium that exits during the cooling process. The high-pressure air is then driven into the combustion chamber. Claimed to be “the most powerful means of cooling air in the world”, the heat exchanger can remove 400 MW of heat from the air, yet weighs only around 1 tonne. Not only will this benefit the aerospace industry and be a catalyst for hypersonic means of travel, but other industries such as raw materials, motorsport and industrial processes could benefit too. Concept SABRE engine (image: Space News). In rocket mode, the engine uses onboard liquid oxygen to accelerate to speeds of Mach 25, or 19,180 mph. This might sound ludicrous, and quite frankly, that’s because it is. To achieve orbit, a space shuttle must reach speeds of around 18,000 mph during a typical NASA launch. That’s a machine purpose-built to go to space being beaten in speed by a machine that doubles as a commercial jetliner. If you were to fire a rifle, SABRE would out-accelerate the resulting bullet by 9 times. By implementing this technology in an aeroplane, as opposed to a traditional rocket, we would see space flights using horizontal take-offs and landings. Materials would be saved, as the probability of them burning up on re-entry is lowered due to the option of a slower descent. A hybrid approach promises to save costs, which can then be re-invested to continually boost our galactic presence. Companies backing this technology Several large companies are backing this technology. Rolls-Royce, known for its cars but also its incredible jet engines, has a vested interest. As do aerospace giants Boeing, specifically its investment arm HorizonX, and BAE Systems. 
BAE paid Reaction Engines £20.6m back in 2015 for a 20% stake in the company. Financial investment is also coming from Baillie Gifford Asset Management and Woodford Investment Management, both investment management firms. Perhaps surprisingly, the UK government is investing too, contributing £60m to Reaction Engines. This comes as part of a move to secure 10 percent of the £400bn-a-year global space market to boost the economy, create high-skilled jobs, further science and research, and more. 2,500mph comparison So, how does a speed of 2,500mph compare to other well-known speeds? It is:
- 833 times faster than the average human walking speed, 3mph
- 35 times faster than the average cheetah running speed, 71.5mph
- 9.5 times faster than the fastest supercar, the Bugatti Chiron, 261mph
- 4 times faster than a Boeing 747, 614mph
- 1.8 times faster than SABRE's predecessor, Concorde, 1,354mph
Current flights from London Gatwick to New York take around 8 hours. With SABRE, the journey would be completed in 2 hours at Mach 5, or only 24 minutes at Mach 25. Next in the series, Wednesday 2nd May 2018… Tech of the Week #8: Facial Recognition – Caught in a crowd Facial recognition has been in the news a few times lately, and here it is again. As reported by us a short while ago, Chinese and UK police have used special facial recognition technology that can pick someone out of a crowd. China has been building out its surveillance network further; there are now 170 million CCTV cameras in operation, with plans for 400 million more to be installed over the next 3 years. But it’s not all as bad as it sounds… If you want to talk tech or aircraft, and for more great content, fly over to my Twitter @JoshuaOsborn16. ### Gartner Says Global Artificial Intelligence Business Value to Reach $1.2 Trillion in 2018 STAMFORD, Conn., 25th April, 2018 — Global business value derived from artificial intelligence (AI) is projected to total $1.2 trillion in 2018, an increase of 70 percent from 2017, according to Gartner, Inc. AI-derived business value is forecast to reach $3.9 trillion in 2022. The Gartner AI-derived business value forecast assesses the total business value of AI across all the organisational vertical sectors covered by Gartner. There are three different sources of AI business value: customer experience, new revenue, and cost reduction.
- Customer experience: the positive or negative effects on indirect cost. Customer experience is a necessary precondition for widespread adoption of AI technology, to both unlock its full potential and enable value.
- New revenue: increasing sales of existing products and services, and/or creating new product or service opportunities beyond the existing situation.
- Cost reduction: reduced costs incurred in producing and delivering those new or existing products and services.
"AI promises to be the most disruptive class of technologies during the next 10 years due to advances in computational power, volume, velocity and variety of data, as well as advances in deep neural networks (DNNs)," said John-David Lovelock, research vice president at Gartner. "One of the biggest aggregate sources for AI-enhanced products and services acquired by organisations between 2017 and 2022 will be niche solutions that address one need very well. Business executives will drive investment in these products, sourced from thousands of narrowly focused, specialist suppliers with specific AI-enhanced applications." AI business value growth shows the typical S-shaped curve pattern associated with an emerging technology. 
In 2018, the growth rate is estimated to be 70 percent, but it will slow down through 2022 (see Table 1). After 2020, the curve will flatten, resulting in low growth through the next few years.

Table 1. Forecast of Global AI-Derived Business Value (Billions of US Dollars)

| | 2017 | 2018 | 2019 | 2020 | 2021 | 2022 |
|---|---|---|---|---|---|---|
| Business Value | 692 | 1,175 | 1,901 | 2,649 | 3,346 | 3,923 |
| Growth (%) | | 70 | 62 | 39 | 26 | 17 |

Source: Gartner (April 2018)

"In the early years of AI, customer experience (CX) is the primary source of derived business value, as organisations see value in using AI techniques to improve every customer interaction, with the goal of increasing customer growth and retention. CX is followed closely by cost reduction, as organisations look for ways to use AI to increase process efficiency to improve decision making and automate more tasks," said Mr Lovelock. "However, in 2021, new revenue will become the dominant source, as companies uncover business value in using AI to increase sales of existing products and services, as well as to discover opportunities for new products and services. Thus, in the long run, the business value of AI will be about new revenue possibilities."

Breaking out the global business value derived by AI type, decision support/augmentation (such as DNNs) will represent 36 percent of the global AI-derived business value in 2018. By 2022, decision support/augmentation will have surpassed all other types of AI initiatives to account for 44 percent of global AI-derived business value. "DNNs allow organisations to perform data mining and pattern recognition across huge datasets not otherwise readily quantified or classified, creating tools that classify complex inputs that then feed traditional programming systems. This enables algorithms for decision support/augmentation to work directly with information that formerly required a human classifier," said Mr Lovelock. "Such capabilities have a huge impact on the ability of organisations to automate decision and interaction processes. This new level of automation reduces costs and risks, and enables, for example, increased revenue through better microtargeting, segmentation, marketing and selling."

Virtual agents allow corporate organisations to reduce labour costs as they take over simple requests and tasks from call centre, help desk and other human service agents, while handing the more complex questions over to their human counterparts. They can also provide uplift to revenue, as in the case of robo-advisors in financial services or upselling in call centres. As virtual employee assistants, virtual agents can help with calendaring, scheduling and other administrative tasks, freeing up employees' time for higher value-add work and/or reducing the need for human assistants. Agents account for 46 percent of the global AI-derived business value in 2018 and 26 percent by 2022, as other AI types mature and contribute to business value.

Decision automation systems use AI to automate tasks or optimise business processes.
They are particularly helpful in tasks such as translating voice to text and vice versa, processing handwritten forms or images, and classifying other rich data content not readily accessible to conventional systems. As unstructured data and ambiguity are the staple of the corporate world, decision automation — as it matures — will bring tremendous business value to organisations. For now, decision automation accounts for just two percent of the global AI-derived business value in 2018, but it will grow to 16 percent by 2022.

Smart products account for 18 percent of global AI-derived business value in 2018 but will shrink to 14 percent by 2022 as other DNN-based system types mature and overtake smart products in their contribution to business value. Smart products have AI embedded in them, usually in the form of cloud systems that can integrate data about the user's preferences from multiple systems and interactions. They learn about their users and their preferences to hyper-personalise the experience and drive engagement.

Gartner clients can read more in the report "Forecast: The Business Value of Artificial Intelligence, Worldwide, 2017-2025." This research is part of the Gartner Special Reports "The Future of Work and Talent: Culture, Diversity, Technology" and "Deliver Artificial Intelligence Business Value." These research collections focus on the new relationship between technology and talent that transforms existing ways of working and doing business, and how to exploit AI for major gains in business value.

About Gartner

Gartner, Inc. (NYSE: IT), is the world's leading research and advisory company and a member of the S&P 500. We equip business leaders with indispensable insights, advice and tools to achieve their mission-critical priorities and build the successful organisations of tomorrow. Our unmatched combination of expert-led, practitioner-sourced and data-driven research steers clients toward the right decisions on the issues that matter most. We're trusted as an objective resource and critical partner by more than 12,000 organisations in more than 100 countries — across all major functions, in every industry and organisation size. To learn more about how we help decision makers fuel the future of business, visit www.gartner.com.

### Comms Business Live: Channel in… Public Sector

In this episode of Comms Business Live, Editor David Dungay is talking about the Public Sector and the current challenges partners are facing, including getting onto the right frameworks, winning contracts from bigger suppliers and how to navigate the current wave of technology disruption affecting the vertical. Joining him are Justin Harling, MD of CAE Technologies, and Alex Tatham, MD of Westcoast.

### Legacy systems in 2018. What makes the best Terminal Emulator?

Choosing the right terminal emulator continues to be an important issue for users of legacy systems in 2018. When choosing a terminal emulator there are two major types to pick from: desktop-based emulators and web-based terminal emulators. Flynet will discuss why a web-based terminal emulator is the best option across four factors: functionality, security, performance and future-proofing. This video animation comes after a previous article around terminal emulators.

### What does GDPR mean for print?

On 25th May 2018, the new EU Data Protection Regulation (GDPR) will come into effect across all EU member states.
From that point on, all companies and government agencies within the EU must ensure their IT infrastructure is compliant with these new regulations. Not only must they comply, but they will also need to clearly demonstrate how their processes comply, documenting the decisions they take to protect personal data. But is everyone clear on what GDPR means? And what do businesses have to do to make sure they're ready?

Firstly, the regulation defines personal data as "any information concerning the personal or material circumstances of an identified or identifiable individual". Here it is important to point out that the scope of what constitutes personal data has broadened. Information that could lead to the identification of an individual now includes everything from economic information, cultural details and mental health information to telephone numbers, IP addresses, social media usernames and more. In addition, organisations will now have 72 hours to report a data leak or face significant fines.

In amongst the discussion about how IT infrastructure will be adjusted to comply with these new regulations, an overlooked point has been data protection in printing. After all, data leaks aren't always large-scale cyber-attacks like we've seen recently – they can be something as simple as a printed document ending up in the wrong hands.

The common issues with printing

Weaknesses in print security range from advanced issues with encryption to rudimentary human error. For example, many businesses may be simply unaware that personal data is often transferred unencrypted via the network when printing. It's also stored unencrypted on servers, or even on the printer's hard drive. And it's often the case that alternative workflows aren't established – this is a crucial tool in preventing sensitive information from ending up in the wrong hands. Without alternative workflows, any personal data could be sent to printers in unsecured locations. Even a document left unattended in a printer's output tray could mean data isn't being adequately protected.

Of course, these are just some of the potential hazards businesses will invariably encounter when securing their data in print. It's inevitable that more complex challenges will become apparent as IoT technology continues to advance and permeate everyday business processes. There's no denying that businesses face a potentially difficult task to ensure that they're GDPR-ready.

How to secure your print fleet

In a nutshell, print security must now become a vital part of a business's IT planning processes. And in doing so, businesses must consider the three key areas of print security. The first is devices; many organisations are filled with ageing, poorly secured print devices. The best defence is to implement secure access features that restrict who can use the output devices, using predefined user access controls. The second area is the network.
With the increased use of mobile devices and the need to support BYOD initiatives, IT departments must strike a balance between providing users with the tools they need to boost efficiency and minimising the risk of intrusion across networks and connections. This could include digital certificates, port filtering, IP address filtering, role-based access control and more.

The third and final area is documents. Malicious printing can be prevented on a device by configuring it to only allow print jobs if the user is authenticated. Businesses can also implement card authentication for access to physical facilities, print release solutions, and secure document monitoring. These deliver greater visibility into physical documents and reduce the liabilities associated with insider threats.

Ultimately, it's crucial that businesses start to make security an integral part of all their printing hardware, and not just of their software and IT infrastructure. From May 2018 onwards, an errant printed document ending up in the wrong hands can prove just as costly as a cyber-attack. The EU is warning of fines of up to €20m, or four percent of a company's annual worldwide turnover (whichever figure is larger). It's therefore crucial that businesses work to prevent this from becoming a reality.

GDPR

As the deadline looms for GDPR, organisations should take the time to understand how they handle the personal data they collect, both externally from customers and internally from their own people. Essentially, it's time for businesses to ensure their compliance with data protection laws extends all the way from their web security to their printer output tray.

### Marketing Automation - Sly Yuen, Traction Digital - Tech For Marketing 2017

Compare the Cloud talks to Sly Yuen, the Regional Director for Traction Digital, at Tech For Marketing 2017. Technology for Marketing connects over 9500 marketers and decision makers with providers of the newest marketing technologies. Along with our inspirational conference programme, TFM offers exhibitors two unmissable days of networking and lead generation.

### #TheCloudShow – Special - Carriers and Cloud | #DisruptiveLIVE

Telecom carriers and their role in cloud. For the last 20 years, it is fair to say that continuous technical transformation and information waves have driven a high level of growth in the telecom industry. However, we now find ourselves in a situation where the telecoms industry is facing a series of challenges. Connectivity is capturing a smaller proportion of the information value chain while content, services and products capture more. I don't think it is that much of a bold prediction to say that one or more major telecom companies could be acquired by a content company within the next 3 – 4 years. Further, trends such as IoT and mobile are adding billions of new connected data sources globally every year – and with that comes an astronomical growth in data volumes. Telecoms companies are also at the heart of the security challenge. As custodians of the networks, carriers play a pivotal role in fighting the new threats that are emerging. Customers expect and demand more proactive protection from the entire internet value chain, and carriers are expected to support these demands with a range of technical and operational innovations or they will fail. So what can they do? With guest Stuart Evers, Chief Commercial Officer for Türk Telekom International.
#TheCloudShow #DisruptiveLIVE

### Video Strategy - Stef Wynendaele, StoryMe - Tech For Marketing 2017

Compare the Cloud talks to Stef Wynendaele, the UK Country Manager for StoryMe, at Tech For Marketing 2017. Technology for Marketing connects over 9500 marketers and decision makers with providers of the newest marketing technologies. Along with our inspirational conference programme, TFM offers exhibitors two unmissable days of networking and lead generation.

### Health Innovators - Episode 1 - Technology Innovation in Healthcare

For the first episode of the Giant Health Innovators show, our host, Barry Shrier, will be talking about Technology Innovation in Healthcare.

Host: Barry Shrier, Founder, Giant Health Event

Guests: Lawrence Petalidis PhD, Head of Innovation and Impact, CW+; Natalie Furness, Communications Director, Medicalchain; Alistair Martin, Director of Business Development, Transforming Systems

Want to join in? Contact us on twitter @DisruptiveLive using the hashtags #GiantLive and #DisruptiveLive

### Online Security: Why It Matters To Your Business

The internet is not always a safe environment, and living in the digital age means that we are exposed to an increasing number of new risks that weren't risks before. Online security is a broad term describing the steps and strategies a business, or individual, puts in place to protect and guard their networks and computers connected to the internet. As a business owner, it is vital that you consider online security and include it in your business planning. If your company, big or small, uses the internet in any way, whether it be for emailing, banking, running a website or online store, social media or publishing content, it is important that you understand the importance of online security and know how to keep your business, your information and your assets safe.

Why Is Online Security Important?

Billions of people use the internet every day and huge amounts of data and information are processed every minute through web browsers. Unfortunately, this means that those online looking to steal data or hack a website have plenty of opportunities to do so, and it is harder and harder to guarantee that all internet users are using the web in an ethical and legal way.

Online security is important because as internet use increases, so do the types of crimes that businesses could fall victim to. In this day and age, one challenge facing businesses is the threat of cyber crimes and cyber breaches. Cybercrime can encompass a whole host of different crimes and fraudulent activity. There are those online looking to steal identities and use the personal details or financial information of other people, there are data thieves who seek to use valuable data from online sources, and there are data corrupters who hack networks and websites to disrupt services. As a business owner, one of the most important aspects of online security is knowing and understanding that the threat of cybercrime is real and taking the process of protecting your business seriously. This is particularly important if you are a small business owner, since almost half of cyber attacks are targeted against small businesses according to an investigation in 2016.
Here are some reasons why your online security matters:

1) Financial Cost - dealing with the aftermath of a cyber attack is expensive and can cost your business tens of thousands of dollars. If you're a small business owner, this can seriously damage the long-term livelihood of your business.
2) Reputation - being open to cyber crimes does not reflect well on your business and can affect your credibility, leading to a loss of customers and clientele.
3) Data Loss - if you haven't ensured your data is backed up, a cyber breach could mean the permanent loss of valuable information for your business.
4) Customers - online security also matters for the sake of your customers. A lack of online security could put your best customers at risk.

How Can You Improve Your Online Security?

One of the first ways you can learn about how to make changes to your online security is to hire a firm to check your system for vulnerabilities by attempting to hack your security system. This will help you to ascertain where the weaknesses are and how you can tackle them. Some ways of improving your cybersecurity include improving password protection by enforcing strong passwords, regularly requiring the use of passwords to access work appliances and using multiple levels of protection. Putting an online security awareness programme in place, as well as providing training for staff, can help to ensure that all employees maintain good online security practices and are aware and informed about how to protect your company's resources. If you're serious about online security, it is worth considering having extra training yourself so that you gain the skills and knowledge to properly enforce an online security plan. Another effective way to protect against malicious cyber activity, hardware failure and theft is to regularly back up your data. Without a backup, you risk losing valuable information.

If you are involved in online sales, an SSL (Secure Sockets Layer) certificate is essential as it provides secure, encrypted communication between a website and browser. Having an SSL certificate means that a secure icon will show next to your URL when users access your site, which lets your customers know that they are shopping in a safe environment.

There are simple and effective steps you can take in order to protect your business, no matter its size. Many businesses make the mistake of not making online security one of their main priorities, and this mistake can have dire consequences for your company. How secure is your business? If you're worried the answer is 'not very', then some changes may need to be made.

### Tech of the Week #6: vGIS – A utility being utilised by the utility industry

Augmented Reality has been finding its place in the world more and more over the last year. One company has found a way to couple the AR platform with a new geographic information system (GIS) visualisation platform, dubbed vGIS. This could benefit not only the utility industry but us as consumers, too.

What is GIS/vGIS?

Example GIS system for water board companies, Software Advice.

GIS is a Geographic Information System that allows the user to produce a map of a certain area, using details specific to that location. As an example, relevant to this article, utility companies currently use this technology so they can keep track of every detail necessary within their workspace.
An electrical company would represent each wire and transformer, their exact location, and possibly other surrounding architecture for future planning. This gives an up-to-date and accurate platform to help with maintenance and the speed and efficiency with which it can be completed.

Example vGIS system showing underground infrastructure, vGIS.

vGIS is a virtual Geographic Information System that is viewed through the medium of virtual reality, augmented reality and mixed reality. It takes the GIS technology and overlays it on a headset that workers can use to "see through" the ground at the infrastructure below. Rather than using a tablet, finding their exact location and then rotating the map depending on where they're looking, the headset allows real-time updates on all of this information.

How can it benefit the utility industry?

Pipes burst, fibre cables require periodic maintenance, electrical cables get damaged in storms – it's a part of life. Companies need to have the ability to react fast to these situations before they escalate. The Wall Street Journal published an article in 2016* that stated quality checks carried out by utility companies were up 20% by using vGIS. Production was up 25% too. This allows necessary works to be carried out with:

- Less disruption to us, the public
- Less time spent on the job, meaning less cost to the business
- More accurate details, minimising the frequency of future issues occurring

The innovation of vGIS has also led to utility companies coming together to further increase efficiency. Rather than 4 or 5 companies turning up to the scene of the problem with their separate GIS systems, leaving the opportunity open for miscommunication, a shared platform can be used. This results in everyone being on the same page; communication is always important in any field.

Where is it being used?

A company by the name of Toms River Municipal Utilities Authority piloted the technology. It uses the Microsoft HoloLens, with an application for tablets and mobile devices too, and is currently mainly being used in New Jersey, USA. However, there are many locations being served by the GIS and AR combination. California seems to be using the technology too, though not on quite such a large scale. These vGIS programs combine geological data, custom maps, 3D building models, various scans, and a whole lot more. By having this data in the cloud, it can be uploaded, converted and delivered to the device platform in real-time.

What does it require?

If you're thinking of using augmented reality in your company, there are a few things to think about before you get started. Why do you want to implement AR? What processes will it improve? What challenge will it resolve? It's important not to jump on the bandwagon because it's the latest "buzzword". Always ask "why am I doing this?" Once step 1 has been fulfilled you will need to go out and map your assets and locations. This could be your underground infrastructure, your warehouses, etc. Accuracy is key as it will shape how effective the experience is. Configure your geographic information system database to include any and all attributes important to the operation.
This could be material and installation date for utility companies, or stock expiration, material, etc. for warehousing. Like all records, they should be accurate and complete. The last step might go against the grain of standard business practice, but partnering with other organisations to share data is quite critical. This will help develop the AR world and allow the technology to progress faster. This benefits other companies, but also yours! Progression means cheaper future processes, thus increasing overall revenue.

The effect augmented reality is having on businesses

vGIS has the ability to provide a new level of operational intelligence, inspiring new services, improving current safety protocols and driving efficiency. Currently, around 400,000 US workers are using smart glasses, and Forrester Research predicts that in 2025 that number will be closer to 14.5 million. In the same period, company spending on the technology is expected to increase from $6 million to $3.6 billion. Businesses aside, it could benefit us in our daily lives too. When planting those summer flowers, I can't be the only one to accidentally dig through an important pipe in the garden. Or when putting up a shelf and having that moment of "Am I about to put a nail straight through a power line?".

Next in the series, Wednesday 25th April 2018… Tech of the week #7: From Supersonic to Hypersonic – a hybrid of space and flight. Concorde still remains a pinnacle of technological success thanks to its faster-than-sound flight speeds. The last flight of this wonder was just under 15 years ago, and since then there hasn't been another aeroplane to come close to Concorde's speeds. This could soon change thanks to a super-fast jet engine that doubles up as a space-bound rocket…

If you want to talk tech, in this reality or another, get in touch via Twitter @JoshuaOsborn16.

*Wall Street Journal article here
Cover image from vgis.io

### Customer Satisfaction - Stuart Warren, Fresh Mail - Tech For Marketing 2017

Compare the Cloud talks to Stuart Warren, the Email Marketing Specialist for Fresh Mail, at Tech For Marketing 2017. Technology for Marketing connects over 9500 marketers and decision makers with providers of the newest marketing technologies. Along with our inspirational conference programme, TFM offers exhibitors two unmissable days of networking and lead generation.

### Siri gets sassy - the next step for AI

Virtual assistants have come a long way in the past few years. Whilst the likes of Amazon's Alexa and Apple's Siri aren't quite in the same league as Iron Man's Jarvis (sometimes feeling like they're deliberately trying to test the definition of intelligence), they do provide a view of things to come. Whether we will reach the point of technological singularity (the point where man and machine converge) in our lifetime is widely debated with both excitement and concern, but a lot will change before we reach that stage, and you might find Siri getting a little sassier in the meantime. Speaking at TED2005, futurist and author Ray Kurzweil showed that the rate of adopting new ideas doubles roughly every decade. He built on this further in his book, The Singularity is Near, summarising that man and machine would eventually converge by 2030. Many notable personalities, including Elon Musk and Stephen Hawking, have raised concerns about humanity's future and what might happen when we reach this point. Yet the plausibility of AI being able to rapidly evolve to surpass the intelligence of a human brain is also widely disputed.
Author Martin Ford postulated a "technology paradox": for the singularity to be reached, the development of the level of technology required would first end up automating most routine jobs in the economy, causing mass unemployment and plummeting consumer demand, thus destroying the incentive to invest in the technologies required to take AI to that point.

Either way, as former President Obama rightly pointed out in an interview with Wired magazine in 2016, a greater focus on the economic implications of the journey towards that point needs to happen. Everyone has a different definition of AI. The simplest way of thinking about it is that if you're using a machine to make a decision on your behalf, that is artificial intelligence. It can be extremely simplistic, or very complex. Regardless of whether the singularity will take place in our lifetime, rather than putting us all out of a job, embracing AI in our daily lives should help us become more productive as a race, automating the mundane tasks and allowing us to focus on the value of being human.

Virtual assistants are the first real evidence of humans interacting with machines in the same way they interact with other humans. As individuals, we turn to the Alexas and Cortanas of this world to augment our day-to-day lives and free up time for other things (despite them being prone to errors and causing frustration when they give an incorrect response). Currently, computers arguably have no real 'intelligence', yet we design them to behave as if they have a certain level of psychology to make them easier to interact with in our daily lives. Improvements in this area are coming fast. In February this year, news leaked that Amazon is reportedly working on building AI chips for the Echo device, enabling Alexa to respond to questions at a much faster and more natural speech rate.

The next battleground for virtual assistants will be inference. As our relationship with technology continues to change, the inability to understand tone in speech remains a real shortfall. Very little of the emotional meaning of a message is conveyed through words alone, which is why the meaning of text messages or even tweets is so often lost, and equally often can land us in trouble. If virtual assistants are going to live up to their promise, they need to be able to read between the lines. Aside from the more obvious benefits of improved communication, the possibilities enabled by better machine emotional intelligence are endless, whether as simple as picking up on the fact that you're in a bad mood, adjusting the lights and putting on some calming music whilst ordering your favourite takeaway, or even early flagging of potential health issues through detecting a change in your voice or behaviour.

Many remain sceptical about whether our changing relationship with machines will benefit us socially or mentally. Certainly the way we communicate with other humans through technology falls short of the rightly-favoured face-to-face conversation. Yet as virtual assistants become more aware of nuances in the user's tone, we may even start to learn something from them in return.
As they stand, virtual assistants don't care about whether you're affectionate with them, speak bluntly, or even shout at them. As this begins to change, and they start to push us towards treating them the way we would want to be treated ourselves, as a society we may start to remember just how much of our communication lies outside of words when we email or text each other. AI, through whatever definition you give it, is advancing fast, and there are lots of great things we're going to see along the way. The singularity may be coming, but Siri's going to get sassy first.

### Cloud Expo Europe 2018 - Mike Bradshaw, Watson IOT Industry Lead, IBM

Compare the Cloud's David Organ interviews Mike Bradshaw, Watson IOT Industry Lead at IBM, at Cloud Expo Europe 2018. Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud.

### Cloud Expo Europe 2018 - Rene Bostic, VP of Innovation and New Technologies, IBM

Compare the Cloud's David Organ interviews Rene Bostic, VP of Innovation and New Technologies for IBM, at Cloud Expo Europe 2018. Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud.

### Cloud Expo Europe 2018 - Dominic Poloniecki, Director Cloud Software, IBM

Compare the Cloud's David Organ interviews Dominic Poloniecki, Director Cloud Software for IBM, at Cloud Expo Europe 2018. Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud.

### Decentralised data storage in a blockchain future

After a busy 2017, there is now a significant mass of crypto-companies who are looking to disrupt a whole range of industries and who have the funding base to achieve that vision. These companies believe that blockchain will underpin a new decentralised internet, and they are hurrying to build the decentralised applications that people will use to access this revolutionary new infrastructure. At the same time, even enterprises that haven't fully embraced blockchain's potential yet are still considering the advantages of decentralised data storage. However, while I am optimistically looking ahead to this decentralised future as much as anyone, we cannot ignore the issues that currently stand in the way of progress.

Blockchain is the future

Before looking at this issue though, it's worth reiterating what lies behind the significant growth in blockchain that's taking place now. It's well established that this technology can, through cryptographic consensus, provide a significantly more secure environment for value exchanges within a whole range of markets. Such benefits are the reasons why it is hailed as a major disruptive force in everything from media to medicine. Significantly for data storage, blockchain and decentralisation provide huge benefits and a clear solution to ongoing security problems.
For example, a major breach at credit reporting agency Equifax last year exposed the personal data of more than 145 million individuals to unauthorised third parties. Today, in the centralised world of cloud storage, the huge databases of leading multinational enterprises act like beacons for hackers looking to access consumer data for criminal activities. This sort of insecure data storage is not only highly vulnerable but also totally unnecessary. Even where hacks haven't taken place, the way in which centralised databases can be affected by outages also makes it clear why something must change. In 2017, a major outage in the AWS US East region affected global web applications such as Github, Slack, Adobe and Expedia for a number of hours. The significant point to note is that decentralised data storage, as demonstrated in blockchains, completely mitigates the issues of hacks or outages that affect centralised databases. This is because decentralisation and distribution mean that, if a single node is compromised or goes down, the network can continue to update and operate. In fact, for this sort of data storage architecture to even be at risk of failure, you would have to imagine an act of God on a global scale.

The flaws in decentralised app development

However, as mentioned earlier, to paint a simple picture of untapped opportunity in the blockchain space would be misleading. While decentralised data storage has clear advantages, there are obstacles to overcome. For example, up to now, the main activity within the blockchain data storage and management space has concerned file storage. Companies such as Storj and Filecoin, amongst others, have raised significant funds to build blockchain-based file storage solutions. However, as we all know, files are only one element of the data storage landscape. While these solutions work well for large, arbitrary files, they are not suited to quickly finding and retrieving the smaller data fields that make up microtransactions, which are of a fixed size, grouped, structured and easily accessed via mobile download speeds. There is also the difficult question of immutability on the blockchain. While it is often hailed as a great benefit of the technology, developers know that it can also be a tricky topic. In fact, many will be aware of software projects that they've worked on where immutability, as a rule, would have been an unacceptable constraint. As a result of these twin issues, many 'decentralised' app developers have had to resort back to the centralised data storage solutions they had previously used. However, by focusing on the benefits of decentralisation and transposing them onto the real-world situation that app developers face, an ideal solution can be found.

How decentralised data storage should work

The best aspects of blockchain can be combined with decades-mature database science to implement decentralised data storage that is mutable and therefore fit for the purposes of decentralised app developers. This can occur through a combination of sharding and swarming. Sharding occurs when datasets are partitioned into logical chunks for storage and later retrieved and reassembled, much in the same way as torrents work. Swarms are groups of individual nodes across which individual shards of data are replicated. If one node in a swarm fails, the other nodes on which the shard is replicated can still function as normal. A minimal sketch of how sharding and swarm replication might look is shown below.
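To make the idea concrete, here is a small, illustrative Python sketch of sharding and swarm replication. It is not taken from any particular product: the shard size, the replication factor and the node names are assumptions chosen purely for illustration.

```python
import hashlib

SHARD_SIZE = 4          # bytes per shard (tiny, for illustration only)
REPLICATION_FACTOR = 3  # how many nodes in the swarm hold each shard

NODES = ["node-a", "node-b", "node-c", "node-d", "node-e"]  # hypothetical swarm members


def shard(data: bytes, size: int = SHARD_SIZE) -> list[bytes]:
    """Partition a dataset into fixed-size logical chunks (shards)."""
    return [data[i:i + size] for i in range(0, len(data), size)]


def assign_swarm(shard_index: int, nodes: list[str] = NODES, k: int = REPLICATION_FACTOR) -> list[str]:
    """Pick k nodes to replicate a shard onto, derived deterministically from the shard's index."""
    start = int(hashlib.sha256(str(shard_index).encode()).hexdigest(), 16) % len(nodes)
    return [nodes[(start + offset) % len(nodes)] for offset in range(k)]


def reassemble(shards: list[bytes]) -> bytes:
    """Re-form the original dataset from its ordered shards."""
    return b"".join(shards)


if __name__ == "__main__":
    record = b"customer-42:balance=100"
    shards = shard(record)

    # Each shard lives on several nodes, so losing any single node loses nothing.
    placement = {i: assign_swarm(i) for i in range(len(shards))}
    for i, swarm in placement.items():
        print(f"shard {i} ({shards[i]!r}) -> {swarm}")

    assert reassemble(shards) == record  # retrieval re-forms the original data
```

In a real deployment the placement function would also account for node health and capacity, but the failure-tolerance argument is already visible here: any single node in a swarm can disappear and every shard remains available on its replicas.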
As a result, a decentralised database will have vastly superior performance, reliability, scalability and security characteristics. For example, it will be able to consistently offer great performance during peak times without requiring the same amount of hardware to be committed during low-usage times. App developers who are finally able to make their apps truly 'decentralised' aren't the only ones to benefit either. Such a system could completely disrupt the existing business model for cloud storage by introducing rewards for individual device owners. This is because one of the dynamics that runs in parallel to blockchain is that of a tokenised economy, where digital assets representing value are exchanged through secure cryptographic consensus. In a world of decentralised data storage, individual device owners could be rewarded for opening up their spare capacity using such a model. The big centralised oligopoly that exists today could be replaced by a flexible and cost-effective network of distributed nodes working together, with users scaling database resources as their needs increase without having to manually manage servers and pay upfront for peak hardware costs. Such a future is much closer than many IT and web operations leaders currently believe. However, with a predicted 30 billion connected devices by 2020 and a whole raft of well-funded crypto-companies building decentralised applications to connect this information, maybe it's time to start taking decentralised data storage more seriously.

### Empowering the World, One Entrepreneur at a Time

Why is there Such On-Going Hype around Cryptocurrency, and is it Actually Masking the Power behind Blockchain Technology?

Bitcoin is already in the throes of a turbulent year; the recent ban on Cryptocurrency advertising from the world's biggest search engine (Google) and social media platform (Facebook) is to proceed in June 2018, with rumours that Twitter will soon follow suit. This year has commenced with a lot of turmoil in the Cryptocurrency world. However, specialists argue that this has been necessary to preserve the long-term health of the market. Despite the industry currently being in overdrive, key observers of the Blockchain say the technology is bound to not only survive but thrive. The general outlook for Blockchain in 2018 looks increasingly positive: fiat service provider Robinhood, which over 1 million people have signed up for, announced zero-fee crypto trading on February 22nd; March 15th saw the official release of Lightning Network's first beta implementation for the Bitcoin mainnet, securing $2.5M in seed funding; and one of the most important Polish Cryptocurrency exchanges, Bitbay, has decided to add support for Ripple (XRP) and Infinity Economics Token (XIN), pushing the IoT towards the mass market. It is clear that mass market–focused products will finally be launching this year, making it a lot easier for the wider public to start building on and using the Blockchain.

What Does That Say About the Future of our Global Workforce?

As widely proposed in recent news, the future is autonomous. We are already moving towards a workforce that could be operated purely through the combined use of artificial intelligence and robotics. Recent evidence from the McKinsey Global Institute's study of 46 countries and 800 occupations found that up to one-fifth of the global workforce will be affected by robot automation.
According to the report, 39 to 73 million jobs may be eliminated by 2030 in the US alone, but about 20 million of those displaced workers may be able to transfer easily to other industries. But what if there was a way that the inevitable influx of automated workforces didn't have to affect the world's rate of human employability? What if there was a solution that could effectively convert the masses into fully equipped entrepreneurs, by applying one straightforward concept? President and Executive Chairman of the Board, Fabio Zoffi, together with his colleague Prof. Pierluigi Riva, founder and CTO at ORS GROUP, are on an incredible mission to transform 1 billion people into small entrepreneurs by the year 2040. How will they do this? By making the proprietary algorithms their company currently uses for the world's largest companies available to small businesses and entrepreneurs, together with Blockchain technology. ORS GROUP's innovative new concept of Hypersmart Contracts provides the mechanism by which they will do this: connecting Artificial Intelligence and Blockchain to apply science to business and make companies of all sizes competitive on a global landscape. President Zoffi says: "We are creating communities of like-minded people around the globe, who want to embrace the new digital alphabet ABC - A.I., Blockchain and Cryptocurrency - to turn their businesses into highly competitive ones, in almost any industry sector."

Behind the Algorithms - The ORS GROUP

Founded in Italy, ORS GROUP is a leading global supplier of cross-industry software solutions for optimizing and automating business processes. Its large international client base includes Fortune 2000 enterprises and spans industries like retail, energy, finance, and manufacturing. For over 20 years the company has delivered sophisticated software using proprietary Artificial Intelligence, Machine Learning and Big Data Analytics algorithms, saving its clients over $1 billion yearly.

Empowering a Global Decentralized Network - It's Easy as ABC

ORS GROUP's new digital alphabet acts as an intelligent system of connectors: activating A.I. algorithms (off-chain) to solve complex efficiency and optimisation problems using data stored on-chain. They can also release instant crypto payments. Together, these technologies can lead to significant improvements in global value chains, which even small farmers can benefit from. For example, algorithms can be used to predict crop yields and for dynamic price optimisation, Blockchain can be used to provide transparency across the whole food chain, and Cryptocurrency can be used to receive immediate payments. As an end result, small farmers can regain negotiating power against distributors and compete globally. Imagine the possibility of a future decentralized network of small companies on a planetary scale, empowered by technologies which enable the "little guy" to put their big ideas into action and to be competitive against the "big boys". Individual entrepreneurs will become empowered once they are provided with the technology to educate themselves, resulting in the establishment and growth of fully automated and successful businesses. ORS GROUP continues to dedicate itself to ensuring that any entrepreneur with a dream will be able to compete on a global scale and in an autonomous world.

### What Value is the GDPR Compliance for Contact Centres?

Most of our contact centre customers initially met the GDPR with as much trepidation as anyone else.
After working with our customers, I have found that those who embrace the new regulation and use it as an opportunity to transform their compliance processes have a lot to gain. By taking a holistic approach to securing and maintaining the integrity of their systems, contact centres establish the basis for receiving a range of benefits that GDPR compliance offers. Getting your compliance ducks in a row ahead of the deadline could hold great potential for a more efficient use of resources and more effective insight generation from information collected in the contact centre, future-proofing your organisation. Here are some of the activities that I recommend to our customers in preparation for the GDPR:

Make the most of Customer Engagement

The new regulation encourages a greater level of customer trust. For example, the GDPR requires customers to give explicit consent before organisations can use their data. The new regulation also encourages trust in more indirect ways – for example, with the new complications around using third-party solutions. As a result, organisations are likely to reduce their reliance on third-party solutions, bringing data control in-house, in a more tightly regulated environment. The knowledge that customers have given their explicit consent to use their data gives confidence to organisations. It shows that the customer has taken a first positive step in engagement. Once you have the customer at your fingertips, this can be leveraged to nurture a more trusting relationship.

Another area of GDPR compliance which brings an opportunity for a touchpoint with the customer is the ambitious Article 20, the Right to data portability. This is the right for an individual to require an organisation to provide them with a copy of the personal data they have previously provided – where technically possible, this should be sent directly to another organisation, presumably a competitor. This right also encourages organisations to be able to receive data in standard formats from other organisations. Here again, you can promote customer trust and impress your customer by being efficient and promoting extra services, such as higher-level data security. The article gives you a month to provide the data; if you can provide it immediately, you will readily demonstrate efficiency and deliver a positive customer experience. Furthermore, if you aren't afraid to maintain a user-friendly interface where customers can receive their data (as per the Right of Access, or the Right to data portability), the transparency and control you allow your customers may give you the edge over your competitors when it comes to being an organisation they want to transfer data to.

Commit your organisation to 'privacy by design'

The old "tick the box" compliance model that used to be the norm in many contact centres will no longer suffice. The GDPR encourages a 'privacy by design' approach, which means privacy is integrated into every part of an organisation's products and services. Also, the GDPR requires organisations to only collect necessary data. To do this, organisations must, first of all, understand exactly what personal data they need, where they'll be storing it and where it will be coming from.
Organisations must analyse their databases and speak to teams across the business, from agents through to managers and CIOs, to understand the amount of personal data being stored and used within their organisation. You may be relieved to get rid of some of it and create new policies regarding which data to store in the future.

Acquire a Data Santa Claus

Introduced in Article 37 of the regulation, all public authorities, or organisations that engage in large-scale systematic monitoring or processing of personal data, should have a Data Protection Officer who has "expert knowledge of data protection law and practices". This data Santa Claus will inform, advise and monitor compliance. This additional expertise will guide you through the labyrinth of data protection, monitoring compliance with the GDPR and other data protection laws, including managing internal data protection activities, training data processing staff, and conducting internal audits. Each organisation can leverage this knowledge and capability to protect and future-proof its business.

GDPR compliance is not all bad

A lot has been said about the challenges of achieving GDPR compliance, but with these challenges also come some great opportunities. The GDPR will undoubtedly promote market competition as it requires more transparency from organisations, allowing consumers to make more informed choices. Those who take compliance under their wing are sure to stay ahead.

### 'Terminator takedown' fears diminishing – OpenText research reveals UK consumers' true feelings on AI technology

New research from the leaders in enterprise information management, OpenText, today reveals the extent to which UK consumers think robot and artificial intelligence (AI) technology will impact aspects of their everyday lives – from work to healthcare – in the future.

Robo-colleagues here to stay

The survey of 2,000 UK respondents revealed growing optimism around robots in the workplace. Over one third (35%) of UK citizens would feel comfortable working alongside a robot, while almost a quarter (23%) would actively encourage their employer to hire robot colleagues if it would mean a reduction in their day-to-day admin tasks. The research also revealed fewer concerns about robots taking over jobs completely. While the 2017 survey revealed that a quarter of UK consumers (25%) believed their job could be replaced by a robot in the next 10 years, this dropped to one in five (21%) in this latest survey. Furthermore, almost two thirds (60%) of respondents did not think a robot would ever take over their jobs, suggesting a greater inclination to work alongside – and not be replaced by – robot technology.

AI technology in the NHS

Faced with a growing population and a tight budget, the UK's National Health Service has already started looking to AI to improve patient service and cut costs. With smartphones due to become the primary method of accessing health services, the NHS is already investing in AI-powered apps and implementing AI technology which will allow NHS 111 enquiries to be handled by robots within two years.
Yet this latest research reveals widespread uncertainty amongst the UK population when it comes to trusting their health to AI:

- A more accurate diagnosis was identified as the biggest benefit of introducing AI into healthcare, yet only a quarter (26%) of UK consumers believe robots would reach the correct diagnosis
- Speed and quick access were also highlighted as major benefits – 21% believe AI technology can offer a quick diagnosis and the same proportion (21%) would appreciate not having to take time off work to visit a doctor

As AI is implemented across the healthcare sector, British consumers will need to put their trust in this technology. Yet this research revealed that two fifths (41%) do not know if they would trust the medical diagnosis given by AI, and a further 26% confirmed that they did not trust the technology. Just 11% said they would trust the diagnosis of AI more than, or just as much as, a doctor's diagnosis.

Interacting with AI

Despite the increasingly ubiquitous presence of AI in consumers' lives – from healthcare apps to AI-powered travel predictions on Google Maps – 45% of respondents were not aware of having interacted with AI in the past twelve months. Yet, in the age of extreme connectivity, with individuals constantly online and connected through multiple smart devices, the reality is that most people interact with AI on a daily basis. Those surveyed showed a wide array of feelings around the growth of AI technology. Though nearly one in five (18%) described themselves as nervous, 17% are excited about the technology – mirroring increased optimism around the benefits of the technology.

Commenting on the findings outlined above, Mark Bridger, VP of OpenText UK, said:

"AI technology is here to stay. Businesses are turning to digital transformation, healthcare organisations are embracing medical technology innovations and, as a result, AI is filtering into every aspect of our lives. It's positive to see that more of us are looking at the benefits this will bring to the workplace and our wider lives – enabling greater efficiency while also taking away some of the strain of day-to-day tasks.

"While sci-fi films can distort the impact of AI technology, it's time to stop viewing AI as an existential threat to our livelihoods and our health. AI will transform the workplace as menial tasks, and some non-routine jobs, are digitalised through robotics and process automation, but it cannot replace people. The true value of AI will be found in it working alongside humans to ease the pressure at work and across the healthcare system, as well as making our lives easier."

### Open-source security: Can OpenStack really protect your cloud data?

A free, open-source platform, OpenStack was created with the ambitious target of giving infrastructure-as-a-service to consumers in a rapid, self-serve manner. It is now one of the most popular open-source cloud projects, with the likes of eBay and Walmart relying on its framework. Speed and simplicity were essential throughout OpenStack's development, with users now easily able to manage it through a web-based dashboard, command-line tools, or a RESTful API. Security, however, took a backseat until recent incidents such as the VENOM breakout and the Heartbleed SSL-related flaw gave rise to no small discussion around its ability to keep data safe as a cloud platform. Safety had apparently been sacrificed for speed and efficiency at the development stage.
That said, when assessing the relative security of OpenStack it is important to remember that the inevitable complexity of public cloud computing, which introduces many layers of interacting technology, is part of the reason security issues are evolving so quickly and tend to introduce many difficult cloud issues.

Existing exploits

OpenStack is relatively new code. It is therefore likely to contain numerous software vulnerabilities and implementation issues, which continue to be uncovered by the OpenStack community. However, with specific portals and projects devised to tackle emerging security issues head on, the OpenStack community appears to be taking security concerns quite seriously. At the highest level, general implementation-based vulnerabilities do exist, such as clear-text RPC communications and the use of plaintext passwords in some of the authentication files. In addition to this, the reliance of OpenStack on other components can pose an issue. For example, if your team were to use an old version of OpenSSL that suffers from Heartbleed, your organisation's OpenStack implementation may be affected as well. The latest vulnerabilities, and advice on what to do about them, can be found on the community's security portal, security.openstack.org. This page will make sure users know of the latest security patches. You can even track the open software flaws, based on Common Vulnerabilities and Exposures (CVE).

Patching progress

The OpenStack Security Project attempts to tackle security directly, and allows the community to share and report vulnerabilities so they get fixed. The security guide is also a great tool for users, as it acknowledges some of the security issues around implementing OpenStack and helps to deploy the platform in the most secure manner. Despite the fact that OpenStack is a "cloud" computing platform, it still manages real servers that physically exist somewhere, sending traffic on real networks. Therefore, all the normal, relevant security controls should be considered (firewalls, IPS, anti-malware, WAF etc.). In certain cases, OpenStack even offers APIs that can help you apply traditional security controls (such as network IPS via the Networking API) using this new cloud model. As with anything, CTOs and CIOs will also need to make sure the OpenStack software is properly maintained and updated regularly.

Best practice

Auditing your system regularly on a set schedule is an absolute necessity to stand any chance of finding vulnerabilities before they are exploited. Make sure you prioritise any potential exploits based on severity and real-world impact; there may well be cases where a vulnerability could be devastating but simply isn't accessible in your company's implementation. One other tip here is to make sure you track the time it takes your team to close or mitigate each threat - they should be closing high-priority or severe vulnerabilities quicker over time. All in all, OpenStack is an amazing platform with tons of potential in the enterprise realm. A minimal sketch of what one such scheduled audit might look like follows below.
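As an illustration only, here is a short Python sketch of the kind of scheduled check described above, using the openstacksdk client to look for security groups that allow ingress from anywhere. The cloud name, the choice of openstacksdk and the idea of flagging 0.0.0.0/0 rules are assumptions made for the sake of the example, not a prescription from the OpenStack project.

```python
# A minimal, illustrative audit: flag security groups with wide-open ingress rules.
# Assumes the openstacksdk package is installed and a clouds.yaml entry named "production" exists.
import openstack


def find_wide_open_rules(cloud_name: str = "production"):
    """Return (group name, port range) tuples for ingress rules open to 0.0.0.0/0."""
    conn = openstack.connect(cloud=cloud_name)
    findings = []
    for group in conn.network.security_groups():
        for rule in group.security_group_rules or []:
            if rule.get("direction") == "ingress" and rule.get("remote_ip_prefix") == "0.0.0.0/0":
                findings.append((group.name, rule.get("port_range_min"), rule.get("port_range_max")))
    return findings


if __name__ == "__main__":
    for name, port_min, port_max in find_wide_open_rules():
        # In a real deployment this output would feed a ticketing system so the time
        # taken to close each finding can be tracked, as suggested above.
        print(f"security group '{name}' allows ingress from anywhere on ports {port_min}-{port_max}")
```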
As with all new technology platforms, however, data breaches are happening at a staggering rate, and the first question every CTO/CIO should ask themselves before implementing is: how secure can I make this network? ### #TheIOTShow - Episode 1: IIoT/IoT – how to justify, and how to get going The IoT Show delivers valuable insights for industrial organisations on the industrial internet of things and topics touching the broader internet of things. We pick engaging, hot topics, and ask our speakers to advise on situations, opportunities, recommendations and gotchas. This series is not a sales medium, rather a series of discussions for you to benefit from the experiences and insights of others. In this episode, we're going to look at the basics of IIoT and IoT. What is it, or what are they? What's their value, who should be interested and, importantly, how does one get started on the journey? ### A Guide to Machine Learning Machine learning is one of the most innovative and interesting fields of modern science around today. It's something you probably associate with the likes of Watson, Deep Blue, and even the infamous Netflix algorithm. However, as sparkly as it is, machine learning isn't exactly something totally new. In fact, the concept and science of machine learning has been around for much longer than you think. The beginnings of machine learning Thomas Bayes, considered by many to be the father of machine learning, published a theorem that was pretty much left alone until the rockin' '50s when, in 1950, famed scientist Alan Turing managed to create and develop his imaginatively named 'Alan Turing's Learning Machine'. The machine itself was capable of putting into practice what Thomas Bayes had conceptualised 187 years earlier. This was a huge breakthrough for the field and, along with the acceleration of computer development, the next few decades saw a gigantic rise in the development of machine learning techniques such as artificial neural networks and explanation-based learning. These formed the basis of modern systems managed by artificial intelligence, the latter being arguably the most integral to the development of systems management technology. Explanation-based learning was primarily developed by Gerald Dejong III at the Chicago Centre for Computer Science. He essentially managed to build upon previous methods and develop a new kind of algorithm: enter the "explanation-based learning algorithm". Yes, the explanation-based learning algorithm was fairly standard in that it created new business rules based on what had happened before. However, what sets this apart as a breakthrough is that Dejong III had managed to create something that would independently be able to disregard older rules once they had become unnecessary. Explanation-based learning was one of the key technologies behind chess-playing AIs such as IBM's Deep Blue. A cold AI Winter However, there was a period during the 1970s when funding was disastrously reduced because people had started thinking that machine learning wasn't living up to its original billing. This was compounded when Sir James Lighthill released his independent report, which stated that the grandiose expectations of what artificial intelligence and machine learning could achieve would never be fulfilled. This report led to many projects being defunded or closed down. This was incredibly unfortunate timing as the UK was considered a market leader when it came to machine learning.
This dark period of time was effectively known as the 'AI Winter' and, bar a momentary slip in the early '90s, was the only real time that the possibilities of machine learning were ever really discounted by the scientific community. Who is pushing the technology forward now? Machine learning has now reached a level where companies such as DataKinetics have the capability to transform legacy systems into business-driven analytics. DataKinetics are at the forefront of their field and have been entrusted by many blue-chip companies, such as Nissan and Prudential, to streamline and optimize complex technology environments. With the advancements in technology today, IT professionals are now capable of achieving so much more due to new innovations in machine learning. However, this is just the beginning - if funding and interest in machine learning and AI remain consistent, there's no telling what can be achieved: machine learning algorithms that can predict future outcomes, giving us – the humans – time to react accordingly. In essence, the main idea behind machine learning is that a computer or system takes a set of data created previously, applies a set of rules to it and provides you with an output that is more efficient. In much the same way, there's a cycle between the innovators and forefathers of machine learning and the companies and groups of people doing it today. That's why companies such as DataKinetics are proud to be associated with such a rich and storied period of human endeavour. Innovators are just as important as pioneers: without innovation we have static evolution that does not progress our species further, and the tech space sees near-constant change. DataKinetics are innovators within technology and have had the foresight to predict the evolution of mainframe, machine learning and analytics with a tech roadmap spanning over 30 years! Don't just take my word for it, have a look at the latest developments at http://www.dkl.com/ ### Tech of the Week #5: 3D printed houses that are fresh off the press You can buy a lot of things with $10,000: a family trip to Florida, a 1970s Volkswagen Beetle, and the wedding every girl dreams of. Usually, a house is not an option to buy for so little, especially not 3D printed houses that can be built in 12 to 24 hours. However, Austin-based startup ICON is making this idea a reality. Eventually, we could see this come to Britain – potentially easing our housing shortage and helping to revive and revolutionise the manufacturing sector. ICON, New Story and their 3D printed houses project First unveiled at SXSW early this year comes one company's attempt to help solve the housing problems faced by over a billion worldwide. The Austin-based startup, ICON, has successfully used a 3D printer and cement to construct a house in 12 to 24 hours.
This is opposed to the typical 6 to 12 months it takes to build a house using current methods. Whilst 3D printing being used in building fabrication is nothing new, printing on-site in the manner proposed definitely is. The first area they’re looking to test out this technology on a grand scale is El Salvador, with a project of 200 3D printed houses to be built next year. ICON has partnered with a company called New Story, who are a not-for-profit housing company that have already been building homes for communities in El Salvador, as well as Haiti and Bolivia. Eventually, the company aims to build 3D printed houses in the US too. Jason Ballard, one of the three ICON founders, is already looking to the future and our potential colonisation of other planets. “One of the big challenges is how are we going to create habitats in space,” Ballard says. “You’re not going to open a two by four and open screws. It’s one of the more promising potential habitat technologies.” The printer and costs The printer is called Vulcan and it’s pretty big… As you may know, 3D printers are metal frames that use a material (like plastic) to print an object in layers. This requires the printer to be bigger than the object it’s printing. In the case of a house, the printer is going to be rather large. The concept image below shows the equipment on rail sliders, allowing for rapid construction by sliding it to the next plot. Time is saved by not having to disassemble and reassemble anything. Photo: 3ders.org Vulcan is reported to cost less than $100,000 and can be packed up for transportation purposes. Operations will require between 2 and four workers to use and manage the printer. Your home and future office Coming in a lot cheaper than the average American home of about $200,000, these 3D printed houses cost $10,000 to build. ICON is hoping that, with improvements over time, they can bring the cost down to $4,000 per house. So what do you get for your investment? Sizes seem to range from 600 to 800 square feet and models come with 3-4 rooms: 1-2 bedrooms, a bathroom and a living room. It even comes with a porch so you can rest on your terrace on those long summer nights. The printer will provide the concrete structure, however other installations such as the electrics and plumbing won’t be printed or installed by the machine. As previously mentioned, the idea was unveiled at SXSW – well, the model showcased is planned to be tested as office space. ICON wishes to install air quality monitors to see how it performs. Presumably, the reason for this is to assess impact to a worker’s routine, health and safety and other health problems that can arise from being office-bound. The need for human labour Due to restraints around the un-printable electrics and plumbing, there will always be a need for human labour (at least for the time being). This is also true from the fact the machine doesn’t print the roof, windows and any other mechanicals that can’t be “poured”. A key to a good home is the foundations it’s built on, and as Vulcan only prints, and doesn’t excavate, humans will be required for that stage too. However, the process will require significantly less human labour as the need for bricklayers, etc will be eliminated. This is likely in part to how the houses can be built for such little cost to the consumer but could face a backlash from labour unions. Cement is cheap and in abundance, but it’s also strong and should help ease potential worries around the sturdiness and durability of the structure. 
Natural disasters In areas like El Salvador, where natural disasters aren’t uncommon, these concrete structures could help to provide some extra safety. For example, a hurricane is going to have a harder time against the toughness of concrete than against some of their wooden constructs. By providing those that lack the safety of a house a place to take refuge there is also the greatly increased chance of saving many lives. In conclusion, this is perhaps an ambitious project to some, but overall it sounds like one with far-reaching, positive implications. If the technology and idea were to come to England, we could see it have a positive impact on our manufacturing sector. Particularly in the north where, historically, a lot of manufacturing took place.   Next in the series, Wednesday 18th April 2018… Tech of the Week #6: vGIS – A utility being utilised for the utility industry Augmented Reality has been finding its place in the world more and more over the last year. One company has found a way to couple the AR platform with a new geographic information system (GIS) visualisation platform, dubbed vGIS. This could benefit not only the utility industry but us as consumers, too...   If you want to talk tech or housing, and for more great content, follow me on Twitter @JoshuaOsborn16.   Cover image: New Story ### Timeless Tuesday - The Crystalline Sliver We take a look back at a pivotal time in history when the tiny, electronic semiconductor, dubbed "The Crystalline Sliver", first came to be. The year was 1948. Bell Telephone Laboratories arranged a news conference to announce an important new invention. "The Crystalline Sliver" "Bell scientists and engineers expect it may have far-reaching significance in electronics and electrical communication." "Not much larger than a shoe-lace tip" came a tiny electronic semiconductor that revolutionised the world we live in. They came to replace traditionally used vacuum tubes that were hot, brittle and bulky. In devices like the radio and TV - putting electronics on its path of miniaturisation and global use. In 1956, the 3 scientists behind the discovery won a Nobel prize: William Bradford Shockley John Bardeen Walter Houser Brattain "For their researches on semiconductors and their discovery of the transistor effect." The total number of transistors in the world today, if counted, would exceed... 1,000,000,000,000,000,000,000 A sextillion. ### 8 Public Cloud Security Threats to Enterprises in 2018 As enterprises continue to invest heavily in public cloud technology, experts now agree that the market is entering a second wave, as we take a look at public cloud security threats. Cloud uptake will accelerate faster in 2018, according to a report by Forrester. ‘Enterprises with big budgets, data centres, and complex applications are now looking at cloud as a viable place to run core business applications’ says Dave Bartoletti, analyst at Forrester. An average of 1031 cloud services is now in use per enterprise -- up from 977 in the previous quarter -- according to Netskope’s January Cloud Report. But the threat of cybercrime in 2018 is massive and data breaches are becoming more commonplace. With the average cost of a breach now a massive $4 million, enterprises cannot afford to consider public cloud security threats an afterthought. But there are numerous security threats out there for enterprises migrating to, or already running critical infrastructure in the cloud. 
Enterprise cloud services are not enterprise-ready Large companies are already using public cloud providers to host critical enterprise applications. And as CIOs become increasingly comfortable hosting critical software in the public cloud, we can expect this trend to continue. Worryingly, 95% of cloud services used in the average enterprise are not enterprise-ready from a security standpoint. And by using unsecured applications, your sensitive corporate data could be exposed without your organisation even realising it. The burden of creating secure applications -- using secure approaches, models, and technology -- belongs to developers. The movement to DevOps and CloudOps now places the responsibility of writing and testing secure cloud applications squarely on the developer's shoulders. Software defects and bugs coded into program logic are a common cause of application vulnerabilities. These flaws can be accidentally built into any application, whether it's hosted through a provider's public cloud or on your local network. Most cloud services now offer APIs for developers to manage and interact with their service. The security and availability of cloud services -- from authentication and access control to encryption and activity monitoring -- depend on the security of the API. Risk increases with third parties that rely on such APIs, as enterprises may be required to expose more services and credentials. Weak APIs expose enterprises to security vulnerabilities. To help secure cloud APIs and the enterprise applications they're used to build, the Cloud Security Alliance recommends security-focused code reviews and rigorous penetration testing. Data breaches Due to the huge amount of data stored on cloud servers, providers are an increasingly attractive target to cyber criminals. The severity of damage depends on the sensitivity of the data exposed. Breaches involving health information, trade secrets and intellectual property are typically the most devastating. Breaches can incur fines, lawsuits or even criminal charges brought against an organisation. Damage to reputation can also have long-term effects, potentially outweighing the initial financial cost. Whilst public cloud providers are actively investing in improved security controls to protect their environment, it's ultimately the responsibility of the enterprise to protect their data in the cloud. And with the introduction of new data protection laws in Europe, organisations wanting to do business with EU firms must comply with the EU General Data Protection Regulation (GDPR). When the GDPR takes effect in 2018, it will bring new accountability requirements and restrictions on internal data flows. Organisations risk fines of up to 4% of annual global turnover for breaching the regulation. Europe's upcoming GDPR regulations set a global precedent in data security. But with just 2% of enterprise cloud applications GDPR-ready, there's a lot of work ahead for developers on every public cloud platform.
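Many of those application vulnerabilities come down to avoidable coding defects. One classic example a security-focused code review should catch is building database queries by string concatenation; the minimal Python sketch below (using the standard sqlite3 module and a hypothetical customers table) shows the difference a parameterised query makes:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, email TEXT)")
conn.execute("INSERT INTO customers VALUES (1, 'alice@example.com')")
conn.execute("INSERT INTO customers VALUES (2, 'bob@example.com')")

user_input = "1 OR 1=1"  # attacker-controlled value

# Vulnerable: the input is pasted straight into the SQL statement, so the
# condition becomes 'id = 1 OR 1=1' and every customer record is returned.
rows_bad = conn.execute(
    "SELECT email FROM customers WHERE id = " + user_input).fetchall()

# Safer: a parameterised query binds the input as plain data, not SQL,
# so the malicious string matches nothing and cannot alter the query.
rows_good = conn.execute(
    "SELECT email FROM customers WHERE id = ?", (user_input,)).fetchall()

print("concatenated query returned:", rows_bad)
print("parameterised query returned:", rows_good)
```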
Lack of encryption The Cloud Security Alliance (CSA) recommend organisations use encryption to protect their sensitive cloud data. Encryption is one of the most basic methods for securing data, but many enterprises make the mistake of failing to encrypt sensitive data. UK-based TalkTalk Telecom Group was recently fined a record £400,000 for security failings which led to the theft of personal data from 157,000 customers. If TalkTalk had encrypted their data, only authorised users with a matching key would be able to access private records. Whilst some public cloud providers are starting to provide customers with more control over their data, information stored in the cloud is often not within an organisation's control. Instead, the integrity of your data may rely entirely on the security practices of third parties. Unfortunately, this is out of your control and impossible to guarantee. Your organisation's security best practices may not always be applied. With the rise of bring your own device (BYOD) in the workplace, employees may be tempted to use their own cloud-based applications to store or share sensitive data with their colleagues. Known as shadow IT, trends like this put organisations at risk. Gartner predicts that one-third of security breaches will result from shadow IT services by 2020. Because of this, shadow IT is in direct conflict with enterprise data security. And the result may leave sensitive enterprise data in the hands of unknown third-party applications. Enterprises should consider using a VPN tunnel to protect their public cloud data. A VPN tunnel enables remote off-site employees to create an encrypted end-to-end connection with their company network and transfer data securely regardless of location or application. Weak authentication and identity management A lack of proper authentication and identity management is responsible for many data breaches within organisations. Businesses often struggle with identity management as they try to allocate permissions appropriate to every user's job role. The Anthem Inc data breach resulted in cyber criminals accessing 80 million records containing personal and medical information. This hack was the result of stolen user credentials; Anthem had failed to deploy multifactor authentication. Two-factor and multifactor authentication systems, like one-time passwords and phone-based authentication, protect cloud services by making it harder for attackers to log in using stolen passwords. Enterprises that need to federate identity with a public cloud provider must understand the security measures that provider uses. Insider threat Poor identity management can leave gaping holes in enterprise cloud security when IT professionals fail to remove user access when a job function changes or an employee leaves the organisation. Insider threat can take many forms: a former employee, system administrator, contractor, or business partner. Often dependent on the industry, the criminal's agenda can range from IP theft (common in manufacturing) to revenge. Within enterprise public cloud, an insider could destroy infrastructure or permanently delete data. Systems that depend entirely on cloud service providers for security, like encryption, are at greatest risk of this type of threat. Insider threat can be disastrous. A recent insider breach affecting Sage resulted in the company's stock price dropping by 4.3% - causing millions in losses.
However, quickly identifying insider threat can be tricky; it's possible to misidentify a poorly carried out routine job as malicious activity – for example, an administrator accidentally copying sensitive customer information to a public server. Proper training and management to help prevent these mistakes are becoming increasingly important in the cloud. Account Hijacking Techniques like phishing and fraud are well known cyber threats, but cloud adds a new dimension to these threats as successful attackers are able to eavesdrop on activities and modify data. Common defence-in-depth protection strategies can contain the damage caused by a hijacking attempt. But, as always with cyber security, prevention is best practice. Enterprises should prohibit the sharing of account credentials between users and cloud services – and enable multifactor authentication where available. The CSA recommend that accounts should be monitored so that every transaction can be traced back to a human owner. The key is to protect credentials from being stolen in the first place. Lacking due diligence Due diligence is the process of evaluating cloud vendors to ensure that best practices are in place. Part of this process includes verifying whether the cloud provider can offer adequate cloud security controls and meet the level of service expected by an enterprise. Enterprises should review accreditations and standards gained by cloud providers, including ISO 9001, DCS, PCI and HIPAA. Enterprises must only embrace the cloud when they fully understand the environment, or risk getting entangled in a myriad of security issues. For example, organisations that fail to read up on a contract may not be aware of where liability lies in case of data loss or breach. Due diligence massively affects application security and any resulting breach. The rise of cloud technology means the responsibilities of shared security have changed. Public cloud providers may be responsible for your infrastructure, but it can be easy to forget that -- as the customer -- you are responsible for the security of your own application and infrastructure. DDoS attacks DDoS attacks are nothing new but can be especially crippling when targeted at your organisation's public cloud. DDoS attacks often affect availability and, for enterprises that run critical infrastructure in the cloud, can be debilitating: systems may slow or simply time out. DDoS attacks also consume large amounts of processing power – a bill that the cloud customer (you) will have to pay. On the upside, cloud providers are typically better positioned to handle DDoS attacks than their customers. However, as the Cloud Security Alliance state, organisations must still have a plan to mitigate attacks before they occur. ### Cloud Expo Europe 2018 - #DisruptiveCIF - Ed Smith, Megaport At Cloud Expo Europe 2018, #DisruptiveCIF had the opportunity to interview Ed Smith, Head of Europe for Megaport, who discussed "Customer Cloud Journey Any Service on Any Cloud". Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud. ### Comms Business Live: Countdown to GDPR | #DisruptiveLIVE This week on Comms Business Live David Dungay will be taking a look at the impending regulation about to hit Europe… GDPR.
With just six weeks to go until the deadline, 25th of May, it is the perfect time to update viewers on the latest information coming out of the ICO and how people's understanding of the regulation has evolved since its inception. The B2B market has some special considerations when it comes to GDPR and, with the help of his expert panel, David will look to untangle those last-minute challenges the Channel has encountered on their way to compliance. ### Cloud Expo Europe 2018 - #DisruptiveCIF - Chris Hill, Barracuda At Cloud Expo Europe 2018, #DisruptiveCIF had the opportunity to interview Chris Hill, Director, Public cloud business development EMEA for Barracuda, who discussed the need for security as a priority for public cloud and how to ensure your security is ready for the cloud. Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud. ### #TheCloudShow - Series One - The Best Of | #DisruptiveLIVE Hosts Jez Back and David Organ look back at the best clips from #TheCloudShow series one. ### Cloud Expo Europe 2018 - #DisruptiveCIF - Mark Turner, T-Systems Ltd At Cloud Expo Europe 2018, #DisruptiveCIF had the opportunity to interview Mark Turner, the Vice President IT Delivery for T-Systems Ltd, who discussed "Customer Cloud Journey Any Service on Any Cloud". Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud. ### Getting The Most Out Of Your Data With A Data Management Strategy As a consequence of doing business in 2018, more data is being generated than ever before. This can cause significant upheaval in network infrastructure and device management, as well as requiring a substantial investment of both time and money. Larger organisations that already put this data to good use report heightened productivity, better optimisation, and a considerable return on investment. The business case for smaller companies is becoming more compelling, but despite the positive advocacy for a big data strategy, a considerable percentage have yet to commit fully. This often comes down to not having a workable management strategy in place, so how can you change this process? Map out your big data requirements Immersing your business in the world of big data can be a daunting task, especially if you're opening yourself up to new data streams for the first time. To stop yourself feeling like a fish out of water, your first step should be to map out the requirements you expect from the data you're collecting. You may require the information to make decisions about business expansion, or to monitor your performance in a given territory. But, one thing's for sure, you'll need to be 100% happy with each decision. If you plan to use data to support these decisions, including this in your data management strategy will ensure you won't miss out on any vital information. In setting out each requirement, particularly those that demand a significant investment from your business, you'll want to receive a positive return.
However, something I've learned is that data analysis can actually tell you what you already know, and on some occasions the results aren't always pretty, but it does make you face the brutal facts. Some people tell you to go with your gut feeling, but from experience, when there is data that can back up a decision, it's always best to do things scientifically. And that means going out and aggregating structured and unstructured data from all possible relevant sources and placing it in a data lake (a storage repository) that we can analyse. This yields not only the results we require, but also hidden patterns and new information that can help us in the future. On occasion, the data validates what our gut and experience told us, but every time, having the right data on hand to back up the decisions and visualise the next steps becomes a vital part of the process. Most companies will start with small-scale big data projects, with low initial outlays and a quick ROI, and typically you'd measure success as increased sales, profit or savings. But ultimately, the use of big data in any business is a necessary game changer, and I would advise you to quickly begin the strategic planning of data collection and increasing network infrastructure to help you get ahead of the big data implementation curve. Having the right big data team in place Mapping out your data requirements is the first step towards taming the growing beast that is big data. However, if you fail to recognise the expertise and knowledge needed to execute your data strategy, you'll face a significant stumbling block in keeping a lid on your storage silo. So to prevent your data from getting out of control, hiring a team of experts should be high on your list of priorities. Unfortunately, there's no manual to help you decipher who should be part of your team, and currently in the big data jobs market there are numerous different roles which, although they have a similar job title, will require professionals to have a specific skill set or expertise. But, when you begin the process of adding new members to your team, or if you're starting from scratch, the final decision should come down to how you wish to use the data. Do you want to create well-visualised reports or increase the speed of decision making? Whichever route you decide to take, ensure you include it in your data strategy, as this can assist you in making your final recruitment decision. The first step I recommend is to guarantee the members of your team have the appropriate level of skill associated with data production, extraction, analysis, and finally visualisation. Following this practice will ensure that what you've set out in your data requirements will be fulfilled, and fall in line with your data management strategy. Naturally, one of the most important things to look for when searching for the right professionals is relevant experience, and during this process you'll begin to realise that big data doesn't operate under a one-size-fits-all approach. However, when you finally come to a decision on the roles you wish to fill, you can still face significant problems: if your business has chosen to implement an unorthodox method of extraction and analysis, it can be tricky to find professionals with the right experience.
If you decide to use a Hadoop system, administrators with the proper certifications and experience are a rare breed, so when choosing to invest in analytical software, I advise you take that into consideration, or you could face quite an ominous task to find the right individual. In summary, leveraging big data will give any business a competitive advantage, whatever size they are — just be sure you have your big data strategy in place, and your people strategy is realistic and reflects the marketplace. ### Cloud Expo Europe 2018 - #DisruptiveCIF - Sean McAvan, Navisite Europe At Cloud Expo Europe 2018, #DisruptiveCIF had the opportunity to interview Sean McAvan, the Managing Director for Navisite Europe, who discussed managed cloud services, multi-cloud and managed end-user services. Cloud is the new normal, but normal doesn't stop changing. Cloud Expo Europe is the only place to remain more than a step ahead. It's the place to make sure your business is harnessing the true potential of the cloud. ### Top 3 Digital Solutions for the Legal Sector How the legal industry can benefit from embracing the revolutionary digital technologies that will help improve performance, efficiency and accuracy. Improving business efficiency is a top priority for most businesses, especially for those where paperwork is constantly piling high.  Richard Fishburn, Professional Services Director of Ben Johnson’s Technology division explains how legal businesses can achieve this. The legal sector continues to face many challenges, one of which is managing and keeping on top of process heavy documentation. Traditional methods simply can’t deal with the volume of data requiring management and organisation. This is where digitalisation comes in. [clickToTweet tweet="The legal sector is renowned for its large paper archives, which must be kept safe, secure and intelligently organised. #Legal #Data #Security #Comms" quote="The legal sector is renowned for its large paper archives, which must be kept safe, secure and intelligently organised."] Efficient digital solutions allow legal firms to boost productivity by automating document and information workflows, saving time, space and money. Here’s how: Mail Room Automation Implementing mail room automation transforms the management and processing of legal documents by lowering the time spent manually classifying and distributing mail. Digitalisation automatically sorts and routes all in-bound communications regardless of whether it’s paper, fax or e-mail. Incoming mail can be scanned, automatically saved and indexed using information from the document, such as the ‘case number’, ‘client name’ or ‘address’. Indexing software has made huge jumps forward using natural language processing to get to the core understanding of unstructured text. Software can now determine whether a letter is for distribution to Family Law, Personal Injury or Residential Property by the words it contains and the structure of its sentences. Workflows can also be set up for mail, and with the required documents can then be electronically sent to the correct recipients, delivering the right information to the right person at the right time. This helps speed up the process of responding to incoming queries, reducing waiting times for clients leading to improved customer service and satisfaction. Document Management The legal sector is renowned for its large paper archives, which must be kept safe, secure and intelligently organised. 
Legal firms will especially benefit from smart document storage and management as it improves their document handling and allows for remote access and tracking. Digital storage on premise or in the cloud is an intelligent way to keep legal archives organised, easily accessible and secure. Additionally, you can add access controls and audit trails helping to comply with regulations and legal admissibility. This includes clear electronic audit trail of who has viewed and amended documents, making document management far more accurate and accountable. This way of storage is far more efficient and has fast searching capabilities – there is no need to spend hours searching through filing cabinets to retrieve documents, giving staff more time to focus on important tasks. Document Distribution for Outbound Communications Postage costs are at an all-time high and clients are demanding access to document at a much quicker rate. Delivery of high quality electronic documents through email or online portals helps reduce the cost of outbound communications but improve the client’s experience at the same time. Intelligent printer drivers can monitor print traffic, make decisions on how best to send outbound communications, apply branding, amend attachments and transition away from print to digital means all through the user simply choosing to ‘print’ a document in the normal way. Where documents absolutely must be sent by post the mailroom can be automated with addresses cleaned and mail sorted for maximum discounts. ### The Tangible Benefits of VR Are So Broad, They're Beyond Tangible Amidst all the hype and wide-eyed growth predicted for virtual reality, it's often difficult for stakeholders to decide which of its two words to focus on. Will VR deliver only indirect, 'virtual 'benefits to businesses or direct, tangible benefits that become bottom-line reality? On this topic, today's hype will definitely become tomorrow's given. Even in these earliest days, VR is already a game changer, and the future holds immense promise for almost every business vertical.  VR might utilize complex technologies, but its innovation is easy to grasp. Simply put, VR is a logical, next-step evolution in our ability to communicate, with each other and with groups. Five thousand years ago, literacy made our thoughts portable. Almost a century ago, the first talking movie made experiences portable. Thanks to technological progress, VR now lets us create and easily distribute new, richer experiences that go beyond the time and space limitations of today's media. Every organization that needs to convey something -- to a customer, a trainee, an audience -- will benefit from the VR paradigm shift. Beyond Sight and Sound  Through VR, perceptual experiences can now take place in three dimensions; a major evolution in and of itself. This benefit is already in growing use throughout the travel and real estate industries, where prospective clients can walk through environments and properties to make informed visiting or purchase decisions. No promotional video will ever match the enticement of a virtual helicopter ride over a rainforest or metropolis, or the sensation of walking along that quiet beach. VR also moves us beyond the limitations of the two senses used in today's communications: sight and sound. By way of hardware peripherals, VR experiences will include tactile and other sensory information to further emulate physical reality. If seeing is believing, what is touching worth? 
Customers will be able to feel the drape before they buy that suit or curtain. Doctors will learn to feel a micro-fracture that an X-ray might miss. Wearables will produce and extract additional sensations for and from the user. This, in turn, will generate valuable, real-time feedback to the content provider. In the near future, marketers will have their hands on the pulse -- literally. Multiple Paths to The "Right" Message  More than any existing media, VR can provide an experience that's genuinely interactive, and able to switch the narrative flow in real time. The user, the author, the publisher, and even the audience can all influence activity to a directed outcome, whether it's a an informed trainee, a successful sale or a storybook happy ending. The gaming industry has long been pioneering these open-narrative experiences through "quests" where the user determines the action, and also through networked games, where multiple participants can affect actions and outcome. Transposing gaming into the business world, a quest is similar to any transaction or negotiation. The action involves the presentation of persuasive information (such as a proposition or product demo) by one party, while the other decides whether the choice is compelling. If so, deeper interest is expressed and deeper content invoked. If not, alternative content is delivered or selected. While this is happening, a third-party distributor of the content might be deciding, through AI and selective data, which content to serve to which user in real time. An outside audience, such as a focus group or other customers, could steer the details and outcome. VR does away with the one-sidedness of linear experiences like video. Liberation from Place  What if there were a Skype for physical presence? Short of the matter transporter that Trekkies dream about, VR will be our closest answer. What if you could invite thousands to a physical venue that only held a dozen? What if a company's best global talent could interact on the same project in real time without travel? What if companies could train new employees through real-world simulations, rather than procedure manuals and explainer videos? The immersive aspect of VR makes all this possible. Today's teleconferencing will pale in comparison to VR's shared group experiences, where one-to-one can take place at the same time as one-to-many. Our most acute human assets will re-emerge through this technology. Physical cues like eye contact and body language will matter again. Who's paying attention and who's not will be obvious once more. Naturally, this will all be captured as data. Which industries will benefit from VR? All of them, once the technology becomes more adept, standardized and accessible. In these early days, sentiment is still mixed. According to the Consumer Technology Association, users have trouble finding the content they want, and without it, headset usage -- and interest -- declines. Once in full motion though, VR will be a complete game-changer, and not only for business enterprise but for the human enterprise. ### Cognitive Computing Is Not Hype: It Is A Must-Have For Organisations Artificial intelligence (AI) has been the most far-flung goal of mankind since the birth of the computer. However, we can certainly say that we are closer to that goal than ever with the advent of new cognitive computing models. 
In layman's terms, cognitive computing is a mashup of cognitive science and computing science, where cognitive science studies the human brain and how it works and computing science deals with the innovative ways of using computers for the betterment of the community. What is cognitive computing? Cognitive computing systems are used to find solutions to complex situations where answers are uncertain or ambiguous, using computerised models that simulate the human cognition process. Although the term is often used alongside AI, it is closely associated with Watson, IBM's cognitive computer system. Cognitive computing intersects with AI and includes several fundamental technologies that influence cognitive applications, such as virtual reality (VR), neural networks, and expert systems. The prime goal of cognitive computing is simulating human thought processes in computerised models using self-learning algorithms that use data mining, natural language processing, and pattern recognition. This way, the computer can mimic the way the human brain works. Even though computers are far faster at calculations than humans, they have not been able to grasp several tasks that we do seamlessly, such as understanding natural language and recognising unique objects in an image. Thus, cognitive computing is another attempt at teaching computers to be more human. How does it work? Cognitive computing models synthesise data from different information sources while considering context and conflicting evidence to find the most suitable answers. For this, cognitive systems include self-learning technologies that use pattern recognition, data mining, and natural language processing. Computers are then used to solve the types of problems that require a huge amount of structured and unstructured data. Over time, cognitive modules refine the way computers recognise patterns and figure out how to process data to anticipate probable problems and structure possible solutions accordingly. There are four vital attributes required to achieve these capabilities: adaptive, interactive, iterative and stateful, and contextual. Adaptive: Cognitive computing models must be flexible enough to learn as the information changes and the goals evolve. In addition, they must be able to comprehend dynamic data in real time and make the necessary adjustments accordingly. Interactive: Human-to-computer interaction is the most vital aspect of developing a cognitive computing model. Users must be able to convey information to the machine and define their goals, and the model must also interact with other processors and cloud platforms. Iterative and stateful: The technology must identify problems by asking questions if the given problem is vague or unclear. Cognitive computing models do this by maintaining every bit of information they have about similar situations that have happened previously. Contextual: This could be the most crucial aspect, as understanding context is a complicated process and cognitive systems must identify, understand and mine contextual data by drawing information from multiple sources. Potential of cognitive computing As cognitive computing systems are based on multiple technologies, including natural language processing and machine learning algorithms, enterprises must be equipped with the required digital strategy and technological infrastructure to incorporate these systems into their business processes and improve the value proposition of their services.
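To make the 'adaptive' and 'iterative and stateful' attributes a little less abstract, here is a deliberately tiny, hypothetical Python sketch (nothing like a production system such as Watson): a model that keeps state between interactions and refines its estimate as each new observation arrives.

```python
class AdaptiveEstimator:
    """Toy model: keeps state across interactions and refines its
    estimate incrementally as each new observation arrives."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def observe(self, value):
        # Incremental (running) mean: no need to revisit old data,
        # mirroring the 'iterative and stateful' attribute.
        self.count += 1
        self.mean += (value - self.mean) / self.count
        return self.mean

    def predict(self):
        return self.mean

model = AdaptiveEstimator()
for reading in [12.0, 15.0, 11.0, 14.5]:   # e.g. context that changes over time
    print("updated estimate:", round(model.observe(reading), 2))
```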
Apart from this, the increase in the number of software-as-a-service-based enterprise management solutions will help more market players to automate their enterprises by providing the essential technological framework. What's more, the right SaaS products can offer the required boost for emerging companies and SMEs as well as big corporations. Cognitive computing systems help enterprises to stay competitive and drive higher revenues in today's tech-savvy world. This innovative technology offers an effective tool to improve decision making, elevate employee expertise, and gain a clearer view of the customer. The rising volume of complex data, increased demand for big data analytics, and the positive impact of cognitive computing on business applications are expected to boost the demand for cognitive computing. According to research firm Allied Market Research, the global cognitive computing market is anticipated to reach $13.8 billion by 2020, registering a CAGR of 33.1% during the period 2015–2020. The potential use cases of cognitive computing are vast and vibrant. From cognitive materials to cognitive engineering systems, and from cognitive cars to cognitive flight systems, there are innumerable opportunities. Wherever there are complex problems or ever-changing circumstances, cognitive computing could help simplify the issue. Since cognitive computing offers an intelligent information tool and is constantly evolving to be more human-like, it will certainly increase human capabilities and knowledge. Over time, people will work hand in hand with human-like machines to solve the complex problems that trouble humanity. ### Opportunities and challenges in the age of open banking In recent years, traditional retail banks have faced stiff competition from new market entrants, including challenger banks and FinTech firms that are often less restricted by costly legacy systems. At the same time, changing consumer demands for faster, tech-focused services have diminished the relationship between banks and the customer, creating a new environment of open banking. Open banking is the UK's version of the second Payment Services Directive (PSD2), which was implemented in mid-January. Open banking and PSD2 force UK banks to open up their data via a set of secure application programming interfaces (APIs), shifting banks from being one-stop shops for financial services to open platforms, where consumers can start to embrace a more modular approach to banking by consenting to verified third parties directly accessing their data. This is set to fundamentally change how banks and FinTech companies act and interact, introducing a range of new market challenges and opportunities for both financial services providers and consumers. Below, Matt Smith, CEO of SteelEye, the compliance tech and data analytics firm, explains what he believes to be the opportunities and challenges facing open banking, and its wider impact on consumers.
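To give a flavour of the API access PSD2 mandates, here is a minimal, hypothetical sketch of a third-party provider fetching account data with a customer-consented access token. The endpoint, token and response shape are invented for illustration; real open banking APIs define their own standards and consent flows.

```python
import requests

# Hypothetical values: in a real flow the access token would be obtained via
# an OAuth consent journey in which the customer explicitly authorises access.
API_BASE = "https://api.examplebank.com/open-banking/v1"
ACCESS_TOKEN = "customer-consented-access-token"

def list_accounts():
    """Call a hypothetical account-information endpoint over TLS."""
    response = requests.get(
        f"{API_BASE}/accounts",
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        timeout=10,              # fail fast rather than hang
    )
    response.raise_for_status()  # surface 4xx/5xx errors to the caller
    return response.json().get("accounts", [])

if __name__ == "__main__":
    for account in list_accounts():
        print(account.get("id"), account.get("balance"))
```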
Product and service innovation One significant opportunity under open banking is likely to be the encouragement of rapid and significant innovation in the financial services market. Open banking's data sharing rules are aimed at encouraging competition between banking providers and helping to drive the implementation of new technologies that will make banks more agile. When startups and challenger banks have access to the same data that established banks have previously held a complete monopoly over, innovation will be encouraged as financial services companies create new products and services to distinguish themselves from their competitors. For businesses, open access to customer data provides insight into consumer behaviour, which allows them to validate and align new offerings and initiatives. For example, with insight into consumer needs, banks can offer deals on products including overdrafts, insurance and mortgages. Through innovation, consumers are provided with the ability to quickly compare accounts, helping them to understand where they can get the best products and savings. Personal financial management could now be offered by an array of financial service providers, from established banks to charities, in a move that encourages customers to shift from traditional 'under one roof' banking services to specific, individualized services that are suitable for their personal financial situation. Industry collaboration Another significant advantage of open banking is that it encourages collaboration between banks and FinTechs. Crucially, both providers offer something that the other cannot. With so many players in the financial services industry and so much choice for consumers, forming partnerships with their FinTech competitors could help banks both to survive and to expand their services in such a rapidly evolving industry. Although PSD2 forces current account data to be shared, other information – such as savings accounts, direct debits, and demographics – isn't included, giving banks an advantage not held by newer FinTech firms. Most forward-thinking banks will use this advantage to form new partnerships with their competitors, and by opening up to collaboration between banks and FinTech firms, any new products will see the benefits of both ends of the spectrum. Partnerships between banks and FinTech firms will include not only the credibility, trust and customer base of a traditional bank, a vast database of account data, and direction on strategy and the regulatory market, but also the new, agile and low-cost technology offered by the FinTech companies. Challenges facing providers and consumers with open banking One of the main challenges facing open banking is security. The majority of companies that are championing open banking are small FinTech companies, not tech giants like Apple and Google, and a lack of homogenous technical standards, such as certain security requirements, combined with complex internal technology systems, may make the process susceptible to corruption and fraudulent activity.
By creating complex chains of data access, it also makes it harder to prove who was at fault following a theft. Another concern about open banking is liability, as inserting third-party providers into the banking process increases the risk of scammers gaining access to customer information and their finances. Under open banking, consumers should have any losses replenished by their bank unless there is reason to suspect fraud or negligence, but with both banks and FinTechs alike facing increased security threats, without proper legal clarification it’s inevitable that providers will try and blame a third party. Finally, a lack of education and awareness around open banking’s capabilities has made consumers less likely to consent to their data being shared, limiting banks and FinTechs ability to innovate. Recent research by Accenture shows that two-thirds of consumers would not be willing to share their personal financial data with third-party providers while in September, independent consumer review body, Which?, revealed that 92% of consumers had never heard of open banking. To negate this, banks and FinTechs must educate consumers on the ways in which open banking can improve how they organize and take control of their finances, including through monitoring spending and making better saving and investing decisions.   ### Migrate your workloads seamlessly to IBM Cloud with VMware Join Chris Greenwood (IBM) and Paul Nothard (VMware) for this special broadcast. Host David Organ will chair this exclusive panel and ask the business why, the benefits and the challenges companies face with cloud migration. You can find out more about Migrating your workloads seamlessly to IBM Cloud with VMware – ibm.biz/IBMandVMwareMigration Broadcast date: Tuesday 27th March, 2018 – 2pm BST Also tweet the panel direct throughout the show on Twitter using the hashtags #IBMCloud #DisruptiveLIVE ### IoT growth pushing the data centre to the edge The IoT has been a dominant trend in recent years, with Gartner estimating more than eight billion connected ‘things’ in use today. But one trend begets another, and in order to manage and process the real-time data that devices and sensors are providing, a new development is emerging in the form of edge computing and its data center use. [clickToTweet tweet="To take advantage of both the #IoT and #5G, you need to localise by building your #DataCenter with a closer proximity to #users. This helps to be more efficient; #data is streamed locally and IoT #projects can be deployed more quickly" quote="To take advantage of both the IoT and 5G, you need to localise by building your data center with a closer proximity to users. This helps to be more efficient; data is streamed locally and IoT projects can be deployed more quickly"] It’s not just the IoT, 5G technology too is just around the corner, and the demands that both these will place on existing infrastructure will require a massive shift in how compute resource is delivered. In a sense, we have been here before. Computers were first deployed into businesses using mainframe architecture to provide centralised processing with terminal access. Networked PCs were introduced to deliver more compute power for the end user, with local processing and storage, connected to larger servers for additional storage. 
Now, thirty years later, after the significant move towards re-centralising processing into the large single entity that is the cloud, demand is once again increasing for distributed localised processing of data – driven by the increase in traffic, complexity of tasks and our need as consumers for low latency connectivity. This trend will likely develop further as our homes and personal transport become smarter and create ever more data. This will generate a demand for localised processing where the endpoint is the home and the car, and aggregated data is fed back into regional edge sites for the next layers of processing, so intelligence can be sent back to the core data centres. If this is the future of the data center, these edge sites need to be delivering continuous real-time data so that no issue is left undetected. If there is a fault or downtime at a site, engineers need to act quickly. A lack of real-time data could mean faults remain unresolved for days or even weeks. Accuracy of asset location and status data will become even more important when engineers begin to manage multiple dispersed edge sites. Auditing and general maintenance tasks will need to be undertaken in the most efficient way and this will require up-to-date knowledge of each site’s status. To put the expected growth of edge data centres into context, around 10% of enterprise-generated data is currently created and processed outside a traditional data centre or cloud, but by 2022, Gartner predicts that this figure will reach 50%. If companies are to take advantage of both the IoT and 5G, they need to localise, and this means designing and building their data center with a closer proximity to users. Not only will this help them to be more efficient because data is only being streamed locally and IoT projects can be deployed more quickly, but it also saves on the costs associated with operating cloud or metro-based facilities. The Internet of Things will challenge the current cloud-centric view of IT by demanding lower latency and higher reliability for real-time applications. This will shift compute, storage, and real-time analytics closer to the edge, which will in turn drive demand for data center technologies. 451 Research Voice of the Enterprise, IoT, Organizational Dynamics 2017 survey respondents indicate that the leading location for initial IoT analytics will be in data centres (53.9%). This will create opportunities for data center technology suppliers who are able to address this demand with micro-modular and regional or local data centres. Who will lead the march into edge data centers? Established colocation companies could capitalise on their market position, buying power and infrastructure knowledge to deploy smaller regional sites. This will mean moving focus from large corporates that require low latency metro-based locations and instead catering to a much larger group of smaller users – all connecting via their data providers. Alternatively, the data providers themselves – telcos – already have the regionally deployed real-estate to enable this with their telecoms infrastructures and their power, they just need to build ‘the edge’ to support the IoT. Huawei predicts that 20% of telco revenue will come from IoT by 2025. Apart from these contenders, there are a large group of potential players such as cable operators, who already have established regional infrastructure, and energy providers, many of whom already have data center offerings and could expand into edge compute. 
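As a rough illustration of why processing at the edge reduces the volume of data streamed back to the core, the hypothetical Python sketch below aggregates raw sensor readings locally and forwards only a compact summary (plus any out-of-range alert) upstream. The threshold and function names are invented for illustration.

```python
from statistics import mean

TEMP_LIMIT_C = 35.0  # hypothetical alert threshold for a rack sensor

def summarise_readings(readings):
    """Reduce a window of raw readings to a compact summary at the edge site."""
    return {
        "samples": len(readings),
        "mean_c": round(mean(readings), 2),
        "max_c": max(readings),
        "alert": max(readings) > TEMP_LIMIT_C,
    }

def send_to_core(summary):
    # Placeholder: in practice this might be an MQTT publish or HTTPS POST
    # to the central monitoring platform.
    print("forwarding to core site:", summary)

# A minute of one-second samples stays local; only the summary leaves the site.
window = [21.4, 21.6, 21.5, 36.2, 21.7] * 12
send_to_core(summarise_readings(window))
```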
Energy providers may consider this to be a strategic move too, given that they will need to add local capability to support IoT-enabled smart metering and smart control of domestic appliances. Finally, there may also be a space for the smaller colocation providers to expand and land-grab in this evolving market. As thousands of edge data centers are constructed and start to become operational, many will be unmanned or employ minimal staff, which means careful consideration needs to be given to monitoring each facility’s assets and their environmental conditions. Given that continuous uptime will be expected, tracking not just the performance of assets, but their locations too, will be essential. Companies need real-time data that allows them to assess an asset throughout its lifecycle, and environmental monitoring at a granular level that allows them to take action immediately when a problem arises. Instant data on temperature, pressure and humidity will reduce potential environmental risks and allow smart decisions to be made and implemented quickly. Real-time monitoring will be a vital element in the successful deployment of the new edge data centers, delivering a single-pane-of-glass, 360° view so that a remote operator can control and manage these mission-critical locations and give companies the reassurance that uptime is guaranteed right to the edge. ### Five tips for adopting Business Intelligence (BI) Recent research conducted by sales-i revealed that 9 out of 10 B2B companies see Business Intelligence – along with other data-driven technologies – as integral to their future success, and it’s not hard to see why. It can boost the quality of your data collection and analysis, allow you to connect with customers and prospects more easily, and help you generate more revenue. Execute your strategy correctly and you’ll have a more informed, prepared, and intelligent team. But successfully executing a Business Intelligence strategy can be a complicated undertaking. In the course of integrating it into your operations and processes, it’s easy to waste time, lose money, and frustrate yourself and your team. The benefits of BI are undeniable – but accessing them requires a coordinated effort across multiple fronts. Keep these five tips in mind and you’ll lay the foundations for successful adoption. Clean your data BI can have a transformative effect on your organisation – but only if it’s working from a bedrock of reliable, accurate data. It follows that the first step in your strategy should be to ensure that your database is as clean as it can possibly be. That means removing any old or irrelevant records: not only do you not need the previous address history of a one-time customer who last bought from you five years ago, keeping it actively drags down the quality of your insights. Get rid of it, and institute processes for getting rid of it regularly. Then it’s time to get rid of any records with inaccuracies, inconsistencies, or missing information. If a customer is in your directory under a misspelt name, they’re liable to take offence when you message them; if their phone number or email address is wrong, nobody will ever hear or see what you have to say anyway. When you’ve gotten rid of these records, create specific data entry rules to avoid making mistakes like this in future. And always, always know the provenance of your data.
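As a closing illustration for this tip, here is a minimal, hypothetical sketch of that kind of routine clean-up – assuming, purely for the example, a customer table loaded into a pandas DataFrame with made-up file and column names:

```python
import pandas as pd

# Hypothetical customer table; file and column names are made up for illustration.
customers = pd.read_csv("customers.csv", parse_dates=["last_purchase"])

# Drop stale records: anyone with no purchase in the last five years.
cutoff = pd.Timestamp.today() - pd.DateOffset(years=5)
customers = customers[customers["last_purchase"] >= cutoff]

# Drop records with missing or obviously broken contact details.
customers = customers.dropna(subset=["name", "email"])
customers = customers[customers["email"].str.contains("@", na=False)]

# Collapse duplicates so each customer appears only once.
customers = customers.drop_duplicates(subset=["email"], keep="last")

customers.to_csv("customers_clean.csv", index=False)
```

Run on a schedule, a routine like this is the ‘process for getting rid of it regularly’ described above; the specific rules will, of course, depend on your own data.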
Just because you can buy data from third parties doesn’t mean you should rely solely on it: it’s always better to maximise the information produced within your business rather than lean on third-party sources. Create a clear Business Intelligence roadmap Business Intelligence implementation isn’t a one-and-done thing: it’s a long-term endeavour and you can’t expect to see immediate results. The true benefits of this technology are measured in months and years, not days and weeks – and only if you’ve created a suitable roadmap for the entire organisation to follow. This roadmap should start with a comprehensive understanding of your current situation. It’s quite likely that this will look very different from your BI setup five years from now: how might you improve it in the short and long term? Who’s going to use it, and who has a stake in its success? Which technology and infrastructure will you need to put in place to facilitate it? What, in the end, are your objectives? Don’t neglect the planning stage. Only when you’ve got a roadmap can you start moving towards your destination. Educate your team Important as strategy is, no Business Intelligence plan survives first contact with your team. Their concerns, their understanding of the technology, and their attitude towards it will determine your success – and you will need to make certain adjustments to accommodate this. A good place to start is by letting them know how they’re supposed to use BI technology, and how it will specifically benefit them. Include them at an early stage, and ensure they receive the right training. Above all, create a culture of data literacy – in which team members are encouraged to make decisions based on empirical insights, rather than educated guesses. Integration, integration, integration How your BI tool integrates with your existing IT infrastructure determines its ultimate success, so your choice of software should factor in its relationship to your ERP system. Many data opportunities that might otherwise be lost can be recovered and capitalised on with the right integration. Certain tools, for example, make it possible to pull data directly from separate financial, logistical, and HR modules, amongst others: delivering superior analysis to several corners of the business. This makes your processes more connected and improves operational efficiency. Monitor, evaluate, iterate This one’s pretty self-explanatory. Keep an eye on the way you’re using BI, apply some intelligence of your own in your evaluation, and iterate so it keeps getting better. Implement reporting processes to demonstrate the positive impact it can have on your organisation, and thereby encourage wider adoption throughout the business. Develop recommendations – and follow them. Essentially: if you’re not using the technology to its fullest potential, figure out how you can. Integrating BI into a busy organisation juggling a number of separate priorities is never going to be the most straightforward undertaking.
But if you keep the above five tips in mind, you’ll enjoy a smooth deployment – and a smoother path to successful long-term customer acquisition and retention. ### Beyond The Call Of Duty: Drones Are A Major Industrial Multitool There are a number of technologies that began as tools for the military, top-secret or otherwise, that have surpassed their armed services roots to take the world by storm. The internet, for one, and GPS for another. In this day and age those are both old news, of course, but the latest military solution to make it big is the unmanned aerial vehicle (UAV), commonly known as the drone. These high-flyers initially zoomed into the consumer market, becoming the must-have gadget for gearheads and photography/videography junkies. But for as massive as the consumer drone market is, it’s the industrial drone market that’s brimming with innovation, the kind that’s revolutionizing operations and business processes across a number of industries. One drone to rule them all Even the average industrial drone is much more advanced than the toys being flown above family picnics or taking panoramic photos of a beach, but there’s one type of industrial drone that’s truly changing the game, and it’s the automated drone. Automated drones are drones that can automatically handle every part of their flying process – deploying, flying, and landing as well as data collection, transmission and processing for both scheduled and on-demand missions. This eliminates the need for a pilot, thereby eliminating the expense of one as well as the delays associated with waiting for a pilot to respond to a request for an on-demand or emergency mission. In industrial environments, a delay in an emergency situation could be catastrophic. Automated flight also eliminates the potential for human error in the flight itself. Automated industrial drones have to be built rugged to withstand the extreme conditions that so often accompany industrial sites, and leading automated drones can take care of their own routine maintenance, including battery swapping and the changing of payloads and sensors, which effectively turns an industrial drone into a multitool, enabling it to perform a wide variety of tasks in a wide range of industries. Blazing the trail A few short years from now industrial drone adoption will be widespread, and virtually all industries will be better for it, but for now there are a handful of industries and drone applications leading the charge. Mining. It’s almost impossible to have a discussion about industrial drones without talking about mining. Industrial drones are completing haul road inspections for safer and more optimal hauling conditions, helping to manage the environmental liability of tailings dams, completing efficient and accurate surveying and stockpile evaluation, and providing greater precision in drilling and blasting, including before-and-after digital elevation models and fragment analysis. Agriculture. Agriculture is another drone-adopting industry, and it’s this enthusiasm for new technologies that is driving the push towards precision agriculture. Drones are commonly used to provide soil and field analysis, including 3D maps essential for planting pattern planning, crop monitoring, and irrigation monitoring. Energy.
The energy industry is full of dangerous equipment, often located in harsh or remote environments. In order to protect employees, civilians, infrastructure and the environment, that dangerous equipment needs to be regularly (and carefully) inspected. For too long these inspections have put human employees at risk. An industrial drone’s ability to take over these inspections is more than welcome. Construction. From initial surveying to work and equipment inspections to protecting sites against theft, industrial drones have a lot of work to do in the construction industry, and they do it all faster, safer and with more precision than could be achieved with traditional methods. Infrastructure development and maintenance. Whether it’s building a dam, inspecting cell towers, or maintaining complicated highway and railway systems, industrial drones bring a sigh of relief. So many duties related to infrastructure development and maintenance are not only slow and costly when using human employees and equipment like helicopters or boats, but they also put those employees at great risk. Security and surveillance. These are two applications in which automated drones excel. Automated drones are ready to fly at a moment’s notice, ensuring they can get to an emerging safety or security-related situation and immediately begin transmitting essential data such as live video, or begin testing for chemical leaks or other hazards – situations organizations definitely don’t want to be sending human employees into. Automated drones can also cost-effectively take the place of CCTV surveillance systems. Unintended benefits When militaries around the world were developing sophisticated unmanned weapon delivery systems, they likely didn’t imagine that they would be responsible for the industrial multitools checking for cracks in bridges, ensuring safe hauling conditions and helping to plot new highways. However, they were probably even further from imagining they would be responsible for aerial pizza delivery stunts and bird’s eye view videos of 12-year-olds falling off skateboards. With their ability to cut costs, increase efficiencies and protect human employees, industrial drones – especially automated industrial drones – may not have been the innovation those military researchers intended when they first developed drones, but they’re a pretty amazing innovation nonetheless.   ### Cloud Expo Europe 2018 - Day 2 Join us live for Cloud Expo Europe 2018. Disruptive will be streaming live from the ExCeL, London for the two day event. Don’t miss the latest interviews, news and trends from the Expo. Remember to join in the conversation on Twitter using #DisruptiveLIVE ### Cloud Expo Europe 2018 - Day 1 Join us live for Cloud Expo Europe 2018. Disruptive will be streaming live from the ExCeL, London for the two day event. Don’t miss the latest interviews, news and trends from the Expo. Remember to join in the conversation on Twitter using #DisruptiveLIVE ### The role of the CFO: Change is afoot for finance and the cloud Cloud technology is revolutionising the way companies do business, including the way they store, process and manage data. As a result, organisations are reaping the benefits of cloud, whether it’s increased flexibility, collaboration or reduced costs. Despite these benefits, there is one department that has historically been reluctant to embrace the cloud - finance and its CFO.
[clickToTweet tweet="73 percent of #CFOs said that they would now trust the #cloud to store financial #data. #Finance is more eager than ever to learn about how cloud #technology can make a more #strategic and meaningful impact on their #organisation." quote="73 percent of CFOs said that they would now trust the cloud to store financial data. Finance is more eager than ever to learn about how cloud technology can make a more strategic and meaningful impact on their organisation."] According to Adaptive Insights’ CFO survey revealing confidence relative to data and technology, 73 percent of CFOs said that they would now trust the cloud to store financial data. This is more than double the number of CFOs who trusted cloud three years ago (33 percent). It is clear that despite previous concerns, finance is more eager than ever to learn about how cloud technology can make a more strategic and meaningful impact on their organisation. Moving towards cloud Finance practitioners are becoming increasingly drawn towards cloud technology in recent years, not least because it frees them from the back-office, number-crunching roles, to positions in which they increasingly generate insights and analysis that have a direct impact on business results. Businesses are demanding more from their finance teams and at a facer pace, but having to deal with a large number of Excel spreadsheets means they often lack the agility and time on analysing performance. With cloud-empowered planning, finance teams can make the well-needed shift away from labour-intensive static planning. Also top of mind for many finance professionals is exploring ways to efficiently integrate data to establish a single source of truth and further streamline manual tasks so there is more time for strategic analysis, forecasting and scenario planning. According to Adaptive Insights’ CFO survey, 60 percent of CFOs indicated data integration as the top technology hurdle that most stands in the way of getting actionable reporting information. Relying on manual spreadsheets only gets them so far to deliver quality, and relevant and actionable data to stakeholders, while cloud technology delivers data that can be relied upon. Other key areas of focus for finance are workforce planning and top-line planning. These are issues in which finance has often played a limited supporting role in the past. However, now that finance professionals can produce a range of scenarios and forecasts - such as the impact of Brexit on business - they are being positioned as the core of the strategic brain trust at many organisations. Skills and collaboration Finance no longer shies away from soft skills. Indeed, finance professionals often have a reputation for lacking key soft skills, such as communication and collaboration. However, many question whether finance teams have just been limited by the tools it has traditionally used. While Excel and other static planning tools have a place in this world, they can often be inefficient at generating insights that lead to great collaboration and interaction. Add to that the fact that manual processes are so time-consuming that there is often no chance for meaningful analysis, and it is little wonder that the stereotype of finance huddled in solitary cubicles has taken hold. The persistence of this challenge was demonstrated in Adaptive Insights’ CFO survey, which revealed that 77 percent of CFOs said major business decisions have been delayed due to stakeholders not having access to data in a timely manner. 
Cloud technology has opened new possibilities for finance to make a big impact. The fast, easy and powerful capabilities of cloud solutions provide finance with actionable data and insights that encourage better collaboration and provide a seat at the table with internal partners and leaders. There is a growing expectation that finance will deliver the story behind the numbers, crafting narratives that provide context and clarity on what the data means to the organisation - both now and in the future. And finance practitioners are learning that they do not have to be remarkable orators. They can add significant value by simply becoming more well-rounded and expert in developing metrics and forecasts that shed light on strategic decisions and future plans. Put simply, as the game changes in finance, better collaborative skills are fast becoming core competencies. By honing them, finance practitioners will bolster their profiles and, ultimately, their careers. Build a community Cloud technology has opened the door for finance departments to make an unprecedented impact on business results. Since many finance professionals will be wading into uncharted waters, it is up to the finance community to offer up thoughts on building cloud technology into their business model. After all, it makes sense for finance to benefit from a community of like-minded professionals. Ultimately, sharing ideas and best practices, tackling new challenges and networking are invaluable, and will make all finance teams much more efficient. Whether they do that via online or in-person interactions, it is the people who are eager to learn, collaborate, and build an engaged community who will make the biggest and most lasting difference. ### Comms Business Live: Channel in…Retail | #DisruptiveLIVE Today David Dungay will be talking to three experts on the evolution of the retail sector. For years we have heard stories of the demise of the high street retailer in the wake of the online shopping revolution, but today the high street is fighting back with its own digital strategies to ensure customers enjoy their shopping experience and keep coming back. Joining his panel are Mike McMinn, MD of Marston’s Telecom, Lee Shorten, Group Chairman of Sabio, and Steve Bieniek, Business Development Manager at Paymentsense. The panel will be discussing how the Channel can leverage the changing retail landscape and take advantage of the new opportunities which are presenting themselves. ### IBM Accelerates AI with Fast New Data Platform, Elite Team - IBM Launches Cloud Private for Data, a major new data science and machine learning platform for data-driven decision making - Unveils Data Science Elite Team, a no-charge consultancy that advises clients on machine learning adoption and assists in their AI roadmaps ARMONK, N.Y. – 16 March 2018: As companies look to embrace Artificial Intelligence (AI) to gain competitive advantage and increase productivity, IBM (NYSE: IBM) today unveiled a new data science and machine learning platform and an elite consulting team to help them accelerate their AI journeys. Powered by a lightning-fast in-memory database that can ingest and analyze massive amounts of data – one million events per second – IBM’s new Cloud Private for Data is an integrated data science, data engineering and app building platform.
Designed to help companies uncover previously unobtainable insights from their data, the platform enables users to build and exploit event-driven applications capable of analyzing the torrents of data from things like IoT sensors, online commerce, mobile devices, and more. “Whether they are aware of it or not, every company is on a journey to AI as the ultimate driver of business transformation,” said Rob Thomas, General Manager, IBM Analytics. “But for them to get there, they need to put in place an information architecture for collecting, managing and analyzing their data. With today’s announcements, we are bringing the AI destination closer and giving access to powerful machine learning and data science technologies that turn data into game-changing insight.” Launching on the IBM Cloud Private platform, Cloud Private for Data is an application layer deployed on the Kubernetes open-source container software and can be deployed in minutes. Using microservices, it forms a truly integrated environment for data science and application development. In the future, Cloud Private for Data will run on all clouds, as well as being available in industry-specific solutions for financial services, healthcare, manufacturing, and more. Commenting on the news, Christian Rodatus, CEO of IBM business partner Datameer, said that “two of the biggest challenges for data scientists are cleansing and shaping data, and operationalizing their insights to deliver value to the business. The direction IBM is headed with IBM Cloud Private for Data is aligned with Datameer's strategy and will enable companies to more quickly prepare data for machine learning and AI projects and operationalize these across their organizations." The Cloud Private for Data solution also includes key capabilities from IBM’s Data Science Experience, Information Analyzer, Information Governance Catalogue, Data Stage, Db2 and Db2 Warehouse. This cohesive set of capabilities is designed to help Cloud Private clients quickly discover insights from their core business data while keeping that data in a protected, controlled environment. In other words, the new solution will provide a data infrastructure layer for AI behind the firewall. IBM Launches Data Science Elite Team Separately, IBM today announced the formation of the Data Science Elite Team – a new no-charge consultancy dedicated to solving clients’ real-world data science problems and assisting them on their journey to AI. According to a report from MIT Sloan, Reshaping Business with Artificial Intelligence, an estimated 85% of 3,000 business leaders surveyed believed artificial intelligence (AI) would enable competitive advantage; however, only about one in five had done anything about it. For many organizations, the task of understanding, organizing and managing their data at the enterprise level was too complex. That’s where the new Data Science Elite Team comes in. This global team of top data scientists, machine learning engineers, and decision optimization engineers is dedicated to assisting clients on-site, helping them better understand and control their data and start making machine learning an integral part of their business. "Nedbank has a long tradition of using analytics on internal, structured data,” said Patricia Maqetuka, Chief Data Officer, Nedbank Ltd. “More data is available now than has ever been available before and analytical tooling has undergone rapid evolution in order to keep up.
Nedbank has embarked on a journey to start leveraging both internal and external data, creating new data-driven business models and new sources of revenue. Thanks to the first IBM Analytics University Live we were exposed to the guidance and counsel of IBM’s Elite team. This team helped us to unlock new paradigms about how we think about our analytics and change the way we look at use cases to unlock business value." Specifically, Data Science Elite Team client engagements center around a use case and begin with a discovery workshop that helps clients understand their data environment and break that use case down into three to four discrete deliverables that can each be realized in two to three weeks. Following the workshop, clients are provided access to powerful data science strategies, technologies and methodologies through data science sprints and validation. The team, which now comprises more than 30 people and is expected to grow to 200 over the next few years, is currently aiding more than 50 clients. Related · IBM THINK Blog: The New Data Developers · IBM Cloud Private for Data ### Cloud and Virtualization – a marriage made in heaven for Financial Services | IBM & VMware We’ve teamed up with IBM and VMware for a series of broadcasts focussing on the value of cloud computing and virtualisation. For this first show we are joined by VMware’s Rory Choudhuri and IBM’s Bharat Bhushan, who will be discussing cloud and virtualisation in financial services. This half-hour broadcast will ask about the challenges and pressures that face the banking and financial industry, new revenue streams, customer satisfaction, security and regulations. How can the financial sector embrace the latest in innovation while remaining competitive and compliant? You can find out more about what IBM and VMware are doing in financial services here: http://ibm.biz/IBMCloud-VMware Broadcast date: Monday 19th March, 2018 – 2pm GMT Join the conversation live on Twitter using the hashtags #IBMCloud #DisruptiveLIVE ### #TheCloudShow – Ep5 – Future of IT Organisations The future of IT organisations has always been a hot topic, but the rules of the game are changing in a cloud-first world. Our host, Jez Back, is joined by David Newbould from TechData and Simon Ratcliffe from Ensono to explore this difficult topic. Look out for the special guest appearance from Ernesto as well! ### China’s Silk Road – Globalised Digital Trade and Intercontinental Business Interactions China’s ‘Silk Road’ has hit the headlines recently regarding its impact on business relationships and industrial growth in the present climate of the commercial world. China Telecom Europe’s products – our VPN services for China, for example – aim to encourage cross-continental connectivity between Europe and China, which is the driving force behind all of CT Europe’s work. The original Silk Road allowed for the exchange of goods between China and other continents in the ancient world and had a huge impact on industry as well as cultures. In our globalised 21st-century world, the new ‘digital’ Silk Road has the potential to create an equally great impact on how we do business in the modern age. The Original Silk Road The ancient Silk Road was a physical route, or collection of routes, which connected Asia with Africa, the Middle East and Southern Europe, both via land and sea. The ethos of this lives on in 21st-century terms, hence the name ‘the digital silk road’.
The ancient network, formally established during the Han Dynasty of China, connected the East with the West and derived its name from the fact that silk, a lucrative trade at the time, was transported in mass quantities via these routes to the different continents. Silk was popular in the west during this period, particularly in Rome, but it was certainly not the only product to make the network successful. Paper and gunpowder were also popular goods for trading; however, the majority of business came from China’s exporting of luxury, expensive goods such as teas, salt, sugar, porcelain and spices. In the other direction of this trade route, China commonly imported materials such as cotton, ivory, wool, silver and gold. Although the network was established principally for the buying and selling of goods, it also saw the exchange of immaterial entities such as cultural aspects, religious ideas, philosophies, technological advances, languages, scientific knowledge and architectural styles. The name “Silk Road” is a translation of the term “Seidenstrasse”, first coined by the German traveller and geographer Ferdinand von Richthofen in 1877. The Han Dynasty officially commenced trading with the West in 130 BC, and from that point the network was used regularly until 1453 AD, when the routes were closed after the Ottoman Empire, following the fall of the Byzantine Empire to the Turks, boycotted trade with the western world. The Digital Silk Road The notion of globalisation means that countries, as well as continents, are sharing both skills and products at an unstoppable rate, and as such few goods remain exclusively available in the country where they were originally manufactured. This is particularly evident in large cities, many of which have become remarkably diverse and multicultural. Improved methods of travel have also been established over the course of the past century, which means that there is no longer the need for a physical trade route. Instead, there is a growing requirement for digital connections between continents in order to enhance communication and business relations within the context of our online world. Many large organisations are already operating in multiple locations across the globe, and are therefore reliant on telecommunications, in the first instance to ensure the smooth running of operations between international bases, as well as to allow effective communication with clients and partners situated throughout the world. China Telecom is committed to promoting international business relationships through offerings such as our VPN services to China, which not only enable those relationships but also further the potential of ICT to increase productivity and minimise the need to travel, which in turn serves to reduce emissions. The Digital Silk Road is an innovation-led development, established with the aim of increasing international connectivity and generating a greater volume of business interactions intercontinentally. Cloud Expo Europe Digital services are the current focus of many organisations, China Telecom being one of them.
IT and computing have become invaluable to industry in almost all sectors, and it is imperative that organisations keep up to speed with the innovation of cloud technology. Industry leaders, including China Telecom, will be showcasing their services and sharing insights at the upcoming Cloud Expo Europe conference, hosted at London’s ExCeL centre on 21st-22nd March. At this event, China Telecom Europe will be discussing their world-class integrated communication solution, as well as their intentions regarding the digital silk road. ### Have Your News Displayed at Cloud Expo 2018! ### Tech of the Week #4: “Alexa, is AI-Powered Voice Transcription Worth My Time?” Real-time, AI-powered technology is fast becoming available that takes what we say and transcribes it with an accuracy of up to 95%. It’s being used by everyone, from paediatricians to the media employees who write up interviews as they happen. But Otter is not your typical voice transcription technology.   Alexa, Cortana, Siri They’re your Tom, Dick and Harry of voice-powered technologies and they have revolutionised the way we go about our day. Alexa, especially, has quite literally become a household name. When she’s not laughing creepily at you as you go about your business, she can do anything from ordering your next batch of goodies to integrating with smart home technology. In fact, Amazon (which recently debuted its “queue-less”, cashless store, Amazon Go) is apparently working on having Alexa translate in real time. Studies, including one by ComScore, say that 50% of all searches will be conducted via voice by 2020. Cortana was built into Windows 10 and the Xbox One by Microsoft to allow just this. It is especially useful on the Xbox: instead of using the joystick and A button to scroll through an onscreen keyboard, speaking directly to the machine saves plenty of time. It also allows for more natural searches. When typing into Google, we use keywords or short phrases to limit results, but artificial intelligence can deal with an entire written (or, in this case, spoken) question. That’s because it learns from what other users are saying when searching for an answer, and understands the context to provide a more accurate result. That, in somewhat dumbed-down terms, is machine learning.   Otter’s Impact on Our Lives Otter is a voice transcription app that takes what’s being said and writes it down for you, like any app that transcribes. The difference here is that this one does it in real time. Rather than recording what is said and presenting it to you later, as traditional services do, Otter's transcription is given almost immediately. The developers behind this machine-learning-powered technology are former Google architect Sam Liang and former Nuance and Facebook employees. Nuance is an award-winning provider of speech and imaging software. Together they have formed AISense. It comes as no surprise, then, that Otter can transcribe with accuracy of up to 95%.
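For readers who want to experiment with the underlying idea, here is a very rough sketch of off-the-shelf speech-to-text in Python. It is not Otter’s technology, and nowhere near its accuracy; it assumes the open-source SpeechRecognition package and a made-up audio file name:

```python
import speech_recognition as sr  # the open-source SpeechRecognition package

recognizer = sr.Recognizer()

# "meeting.wav" is a made-up file name; any short WAV recording will do.
with sr.AudioFile("meeting.wav") as source:
    audio = recognizer.record(source)

# Send the audio to Google's free web recogniser and print the transcript.
try:
    print(recognizer.recognize_google(audio, language="en-GB"))
except sr.UnknownValueError:
    print("Speech was unintelligible - the noisy-room problem in action.")
```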
One of the biggest causes of inaccuracy in Otter’s output is its handling of punctuation, or lack thereof, along with its inability to function properly in a crowded place. The app also provides keywords based on what has been said, which could be useful when searching through a substantial transcription to find a particular section – or, in the case of writing an article, giving a reader a synopsis of what it’s about.   What settings or scenarios could transcription prove useful for? A hospital is one. During an operation, the surgeon or other members of staff may take notes on what is happening. This could be the amount of one particular drug that has been administered, or a rough idea of what steps were taken. These notes would usually be quite basic, capturing only the essentials. Otter would remove the need for this by allowing full details to be captured as easily as the words can be spoken. If this were combined with the Internet of Things, these notes could be transferred between departments faster, potentially saving lives. Or they could be shown to hospital bosses (data protection aside) for use in monitoring staff and the care of patients. An office meeting or lecture is another excellent example. The lecture hall can be a noisy place, sometimes drowning out the voice of the lecturer, or at least causing particular words to get lost. Luckily, AI isn’t as easily thrown off, meaning that (particularly in the case of Otter) by learning different voices, it can pick out the one you need and fill in the gaps. The lecturer could even record the session to reflect on afterwards and improve for future classes, thus giving people an even better chance to learn. In the setting of an office meeting, the recording could then form part of the minutes, or serve as a base point for all employees to refer to. I, personally, have ideas come into my head that are gone before I can even find a pen that works. By using a voice transcription service as intelligent as Otter, I could get that dinner recipe or business idea down and ready to refer to when I need it most. The possibilities are definitely broad, with more emerging every day as AI, ML and deep learning technologies advance. The app is currently free and available for Apple and Android phones.   Next in the series, Wednesday 21st March 2018… Tech of the Week #5: 3D printed houses that are fresh off the press You can buy a lot of things with $10,000: a family trip to Florida, a 1970s Volkswagen Beetle, and the wedding every girl dreams of. Usually, a house is not an option to buy for so little, especially not a 3D printed home that can be built in 12 to 24 hours. However, Austin-based startup ICON is making this idea a reality…   If you want to talk tech (and have it transcribed), and for more great content, follow me on Twitter @JoshuaOsborn16.   ### Stephen Hawking 1942 – 2018 “We are all now connected by the Internet, like neurons in a giant brain” 8th January 1942 – 14th March 2018 In Memoriam: Stephen William Hawking CH CBE FRS FRSA "Stephen Hawking is the former Lucasian Professor of Mathematics at the University of Cambridge and author of A Brief History of Time which was an international bestseller.
Now the Dennis Stanton Avery and Sally Tsui Wong-Avery Director of Research at the Department of Applied Mathematics and Theoretical Physics and Founder of the Centre for Theoretical Cosmology at Cambridge, his other books for the general reader include A Briefer History of Time, the essay collection Black Holes and Baby Universes and The Universe in a Nutshell. In 1963, Hawking contracted motor neurone disease and was given two years to live. Yet he went on to Cambridge to become a brilliant researcher and Professorial Fellow at Gonville and Caius College. From 1979 to 2009 he held the post of Lucasian Professor at Cambridge, the chair founded in 1663 and once held by Isaac Newton. Professor Hawking has over a dozen honorary degrees and was awarded the CBE in 1982. He is a fellow of the Royal Society and a member of the US National Academy of Sciences. Stephen Hawking is regarded as one of the most brilliant theoretical physicists since Einstein." - Hawking's Website ### A Different Approach to Managing Spend - Analytics and Machine Learning At first a novelty, Artificial Intelligence (AI) has been evolving rapidly over the last few years, and we’re starting to see how it can deliver real business value. There has, however, been both a tangible sense of excitement and of concern around AI and machine learning. It seems, just like flying cars, we over-estimate how quickly humans and businesses can implement a new technology. Although still in their relative growth phases, both are making leaps and bounds in innovation. Don’t worry though, we still have a long way to go before a robot takes our jobs. The big data explosion has accelerated the growth of intelligent technologies. Data is the lifeblood of AI. Intelligent computers that can think for themselves have been around for a while, but before the emergence of big data we didn’t have the computing power to drive them, so we are only now starting to reap the benefits of machine and deep learning. It took a small development at Google, however, for AI to go mainstream. Word2Vec is a neural network (loosely modelled on the brain) that changes words into vectors with meaning and context, allowing machines to analyse and learn about context from phrases. For example, take an invoice with a line of text that says “plastic, 500 ml, Crystal Geyser”: the technology will infer that it’s bottled water, even though it doesn’t say so, because the machine has some prior knowledge to establish context. In recent years, we’ve seen a growth in innovation across the finance industry. The growth of technologies such as crypto-currencies, contactless and mobile payments, and the introduction of intelligent algorithms demonstrate it is an industry willing to innovate and embrace new ideas. It is in managing spend for businesses, though, where I believe the next major opportunity exists – particularly for the development of AI and machine learning. It is something we at Coupa are already doing. Our technology gathers data from invoices, purchase orders, card information, travel, and expenses; in short, data about anything that businesses have spent money on. All this data is then joined so that it is consistent before being run through three processes: two machine learning and one deep learning. The machine learning is primarily for normalisation and the deep learning for classification. It’s a blend because machine learning is quick, but neural networks can take hours.
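To make the classification step concrete, here is a heavily simplified, hypothetical sketch. It is not Coupa’s pipeline: it swaps the Word2Vec-style embeddings and deep learning described above for TF-IDF features and a linear classifier via scikit-learn, and the invoice lines and categories are invented for the example:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Made-up invoice lines and spend categories, purely for illustration.
lines = [
    "plastic, 500 ml, Crystal Geyser",
    "A4 paper, 80gsm, 5 reams",
    "return flight LHR-JFK, economy",
    "still water, 24 x 330 ml bottles",
]
labels = ["bottled water", "office supplies", "travel", "bottled water"]

# TF-IDF features plus a linear classifier stand in for the embedding and
# deep learning stages described above.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(lines, labels)

# Classify a previously unseen invoice line.
print(model.predict(["sparkling water, 1 litre, glass bottle"]))
```

A production system would, as the article notes, train on far larger volumes of historical spend data, across many languages, character sets and commodity types.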
[clickToTweet tweet="Once this #data is processed it becomes ‘knowledge’ within the #system and allows us to classify 85-90% of spending. The final step, #DeepLearning, can then take classification up to 95% #AI #MachineLearning #Spending #BigData #Data #Tech" quote="Once this data is processed it becomes ‘knowledge’ within the system and allows us to classify 85-90% of spending. The final step, deep learning, can then take classification up to 95%."] Once this data is processed it becomes ‘knowledge’ within the system and allows us to classify 85-90% of spending. The final step, deep learning, can then take classification up to 95%. With the information this data produces, procurement teams can help to answer business-critical questions such as, who is buying what from whom, when, in what quantity, where are they shipping it to and how are they paying for it. These are all classic spend optimisation questions that procurement teams have been looking to answer for a long time. Insights sourced through AI, machine learning and deep learning can be applied to business spend management by normalising, enriching and classifying spend data, eliminating the need for costly, inefficient rules-based integration and transformation of raw data to understand spend footprints. Businesses taking advantage of AI within a spend management system can also gain visibility on total spend including tail-end, which offers more control of the buying cycle.  It can also provide prescriptive recommendations, such as who the most reliable supplier for a service is, or warnings about risk. AI’s ability to be ‘all seeing’ across entire organisations gives it the power to support security and theft prevention. In spend management it can be used to identify fraudulent and suspicious behaviour by creating a fraud profile based on analysing aggregated data for expenses, purchase orders and invoices. This profile score and related spend transactions can then be used to alert a company’s internal auditors or finance personnel for further review and action. As AI and machine learning technologies require large amounts of historical and ongoing operational data, to truly access and critically benefit from insights, organisations are best served to work with an expert partner. The historical knowledge required extends to multiple languages and character sets, as well as industry-specific or otherwise esoteric commodities. The value of AI to the market is as a result why it is so prominent in Coupa’s latest update – R20. Up until this point spend analytics has always been a niche deep in the back office. Total visibility via analytics is the keystone to addressing the issues that businesses across the globe face, particularly as disruption forces efficiency drives. Utilising these technologies will help companies save money and act more effectively and responsibly. I find it all very exciting, particularly because I know it can make a massive difference in transforming a company for the better. ### Data Beyond Borders: Rethinking the Cloud in Times of Global Change A post-Brexit era may spur demand for companies to rethink their hosting needs. One company, in particular, is ready to help: Leaseweb. Their strategy is simple: use state-of-the-art globally distributed data centres, with strong security and data privacy protections, to keep your data safe and available. 
Offering Infrastructure-as-a-Service (IaaS), the company has chosen to open a new data centre in the UK, allowing EU customers to serve their UK customers directly from a UK location. Leaseweb: A history Leaseweb started in the clouds. Twenty years ago, while still working as professional pilots, its two Dutch founders understood the importance of reliability and global connections. Today they are delivering data across borders on one of the world’s best networks, empowering companies through a comprehensive range of global cloud hosting solutions. Leaseweb’s hybrid-ready hosting solutions include dedicated servers, private cloud, public cloud, colocation, content delivery network (CDN), and cybersecurity services. Additionally, they work with technologies like DellEMC, HPE, Microsoft, Intel, NetApp, VMware, Veeam and Acronis to provide the best customised solutions for their customers. Growth, change and expansion As a global partner, Leaseweb serves as a single point of contact, offers compliance with local requirements, and can “speak the local language”. Leaseweb’s expansion to the United Kingdom was driven by three factors: ongoing growth, change in international markets, and the belief that expansion should be based on customer requirements. Many EU-based customers are looking to have a footprint in the UK to better reach local customers. One of the customers that Leaseweb has worked closely with in developing its UK expansion is TOPdesk. The full ramifications of Brexit are still unfolding. One thing businesses can be sure of is an increase in the complexity of operations in a landscape of new rules and regulations, particularly regarding data residency. Advantages exist in adopting cloud solutions in a post-Brexit world, and the four areas critical to consider before proceeding are the importance of data, hosting, security, and architecture. To learn more, read Leaseweb’s blog on this topic. Cloud Expo Europe Don’t miss the opportunity to meet Leaseweb at Cloud Expo London. Attendance is free so be sure to make an appointment here. There are plenty of great offers available at their booth only during Cloud Expo Europe. Leaseweb’s director of business development, Avneet Sachdeva, will be holding a talk at Cloud Expo Europe on Thursday 22nd March at 10:30 – 10:55 in the Cloud Innovations Theatre. The topic will be “Data Beyond Borders: Rethinking the Cloud in Times of Global Change.” Are you ready or not for the new era of IT flexibility? Find your answer here and don’t miss Leaseweb at Cloud Expo. ### Three e-commerce lessons to be learned from Amazon Go As Brits, we’re generally partial to a good queue (at least, that’s the stereotype). But it’s hard to deny: avoiding checkout queues during the lunch hour sounds too good to be true. Yet it’s exactly this that Amazon Go promises. It’s the first-ever physical store with no checkouts and no queues, and it opened to the public on January 22. The innovative idea uses artificial intelligence (AI) and a series of sensors to revolutionise the shopping experience.
While this has sparked an ongoing debate about the future of automated checkouts, there are still lessons to learn from the retail ripples Amazon Go has caused. Howard Williams, marketing director at digital engagement specialist Parker Software, explains the lessons that the successes and failures of Amazon Go teach us about e-commerce, automation and the AI-infused future of retail.   1. Make sure you can deliver on your tech promises The opening of Amazon Go prompted much amusement on Twitter after shoppers were forced to queue to enter a ‘queue-less’ store. Of course, once the hype around the new tech dies down, the experience could very easily live up to its queue-free promise. There is, however, a lesson to be learned from this, and it’s universal – it doesn’t matter whether you operate online, in-store, or both. When implementing new tech, don’t overgeneralise or exaggerate what it will be able to do. Living up to your technological claims is a must. If you do want to boast about your new innovation – and we get it if you do – only do so once you know that it’s working for your customers. For example, let’s say you’ve launched an advanced machine-learning chatbot on your website. This chatbot could answer more types of question than ever before, and that’s great. But don’t promise customers that it can answer their every query, with smart, fast service every time. It will reflect badly on you and your bot alike when a complex question needs human attention and understanding.   2. Automation is our friend, not our foe Many fear automation disruption – the concern that automation will replace human workers and take our jobs. Amazon Go is testament to the dual need for humans and automation. Despite the store boasting a fully automated checkout experience, there is still a need for human employees. Granted, the roles of the human employees are not those of the standard cashier. Rather, employees are there to ensure a smooth, optimised customer experience. They make sure the shelves are always stocked, that fresh products are prepared in-store, and that customers can be helped in real time should they face any issues. Automation has removed the need for stressful payment interactions. Instead, it has allowed the human staff to focus on providing an excellent customer experience – one with human support available the moment the customer has a question. In other words, jobs aren’t being taken by automation: they’re evolving because of it.   3. Importance of a frictionless journey This successful blend of automation and employees also teaches us the importance of a frictionless, unbroken journey. Technology and humans should be on the same team – not disparate entities. Amazon Go has ensured an optimal customer experience by using people and machines alongside each other in smooth tandem. We live in the ‘experience era’ of online service, and no longer can businesses compete on price or product alone. Customers stay for a seamless, easy journey – and that’s just what Amazon Go has created. It has blended technological speed with the convenience of human support.
The result is a fast, efficient shopping experience. Even better, Amazon Go can maintain this frictionless experience even when something goes wrong. After the team at CNBC accidentally got a free yoghurt, Amazon Go (and the yoghurt company) both demonstrated approachability and were able to quickly quell any stress. How? They were available where their customers were, on Twitter, giving real-time support. Offline went online, and the experience remained consistently quick.   What the future brings with Amazon Go Amazon – the world’s most successful e-retailer – has turned its attention to physical retail. In doing so, the brand has still managed to teach us some valuable e-commerce lessons. Perhaps this in itself is the biggest lesson of all: our traditional definitions of retail are starting to evolve. Technology is blurring lines and breaking barriers. Commerce, e-commerce, m-commerce – it’s all starting to merge into one digitally-steeped melting pot. It’s time to adapt to the new reality of retail. ### Tech of the Week #3: Harley Davidson LiveWire – The Electric Dream Tesla is widely known for its electric vehicles, and for pioneering the electric lorry as well; we have electric bicycles, SUVs, trains and everything in between, but I’ve always felt one vehicle in particular has been missing. A personal favourite of mine: the motorbike. Well, to all you bikers out there, Harley Davidson is providing us with the answer: Project LiveWire. Around 2014/2015, Harley Davidson unveiled its Project LiveWire motorbike, notably used in the film Avengers: Age of Ultron. This wasn’t your typical hog, ridden by a bearded, leather-clad bloke. Instead, it looked more like a Ducati. One of the reasons for this is to appeal to a more diverse group of people, including “18 to 35-year-olds, women, African-Americans and Hispanic riders”.   What is Project LiveWire? LiveWire is the name of the motorbike itself, and the project is Harley Davidson’s ongoing effort to bring an electric motorbike to market. It’s taken a long time, that’s for sure, but there’s good reason for this. Harley has been using big data analytics to get real insight into the risk and reward of such a daring concept, and whether it’s worth pursuing at all. There are other electric motorbike manufacturers: Zero Motorcycles – formerly Electricross, it was the brainchild of a former NASA engineer and was founded in California. The bike had a terribly slow start: it failed to reach its UK/European audience several times, from inception in 2010 right through to its acceptance in 2017. This was due to a combination of things, mainly poor mileage range. Victory Motorcycles – founded in 1997, the company acquired Brammo Motorcycles and subsequently its electric motorbike “Empulse”. It began winding down operations in 2017. There are even more reasons for this one failing than for Zero; however, the main points were its inability to compete with Harley Davidson (they do have the Indian Motorcycle range, which does this better, mind), poor build quality, and again poor mileage range. It’s easy to see why Harley has spent these years doing the research needed to ensure it too doesn’t suffer a similar, common fate.   How Big Data Plays a Part If you were to think of the two phrases “technological advancement” and “Harley Davidson”, you wouldn’t necessarily put them in the same paragraph, let alone the same sentence. However, what they did with Project LiveWire is most deserving of the marriage of the two phrases.
Around 7,000 people have tested the bike in America and about 12,000 people have ridden it on a rolling road. That results in about 19,000 pieces of feedback. Big data is definitely appropriate here. So what were the results? 86% of people who tested the bike said they liked it and a whopping 74% of people would consider buying it. But that’s not all… The project also gave Harley other important insights: How much is someone prepared to pay for it? Are the noises just right? Is it sexy enough? Is it fast enough? Etc, etc. As far as launching a new product goes, Harley has given themselves a massive head start and a greater chance of survival for when they finally release the LiveWire.   The Tech As this is the Tech of the Week series, it’s about time we looked at what goes into such an electric motorbike. We can’t be sure what the final release will look like, but we can take a look at the rumours. Usually, the electric motorbikes the world has seen use a transverse engine. The LiveWire uses a longitudinally mounted engine (as seen in performance cars like the Audi R8, or the Moto Guzzi Jackal motorbike). Even Harley traditionally uses a transverse engine. Does this suggest they’re peeking into the sports market with this model? This oil-cooled, three-phase induction electric motor is capable of 55kW of power, producing about 74 bhp and 52 lb/ft of torque. This puts them ahead of both of the formerly mentioned competitors. It sits on a lightweight, cast aluminium frame and is powered by a 7kWh battery – the same size as the battery in Tesla’s home ‘Powerwall’ range. This is expected to see a range of… 50 miles. Considering one of the downfalls of Zero and Victory motorcycles was the lack of range (and both offered more than 50 miles), this is a little alarming. These figures are based on the 2014 model though, so we should expect to see big improvements upon release. Other expected figures see a 0-60mph time of under 4 seconds and a limited top speed of 95mph. The Bike “Well, that’s pretty amazing” – Joss Whedon, with a grin.   Electric Vehicles In the UK, where the sale of diesel and petrol vehicles is to be banned from 2040, electric vehicles will take the top spot. Companies such as Volvo have announced their plan to give all of their new car models an electric motor by 2019. BMW has also announced plans to have an all-electric Mini by the same year. Billions of pounds are needed to expand the infrastructure to support so many vehicles, and people will start picking their favourite manufacturers now. If Harley Davidson can pull it off with the LiveWire, they could cement their name in the market for centuries to come.   Next in the series, Wednesday 14th March 2018… Tech of the Week #4: “Alexa, is AI-powered voice transcription worth my time?” Real-time, AI-powered technology is fast becoming available that takes what we say and transcribes it with an accuracy of up to 95%. It’s being used by everyone, from paediatricians to the media employees that write up interviews as they happen. But this is not your typical voice transcription technology…   Want to talk tech or motorbikes? Press the little blue bird under my name and send me a message. ### The IBM Cloud with Sebastian Krause and James Carr: Hybrid Cloud and Knowledge Sharing With IBM’s 4th quarter results recently out, Sebastian Krause (General Manager, IBM Cloud, Europe) talks about the staggering revenue that the cloud is bringing to the company and 24% growth year on year.
Sebastian also covers: IBM Cloud and the hybrid cloud strategy; customers, digital transformation and business models; and how cloud drives innovation. James Carr (IBM Cloud Garage Developer) also joins the conversation, discussing his work with IBM Cloud Garage, a platform that allows IBM’s vast knowledge of the cloud to be shared with its clients and more. James covers: design thinking methodology; client proof of concept; and how IBM Cloud helps clients with their projects and goals.
“The IBM Cloud: The cloud for smarter businesses”
### Data Protection for the Enterprise with Backup & Disaster Recovery
In 2017, ransomware appeared as a major cybersecurity threat. It had existed before and had done some damage in isolated events, but nothing on the scale of WannaCry. This event shed light on the importance of data security and protection. Backup and disaster recovery (DR) is one of the ways enterprises have protected, and can protect, their data. However, one thing needs to be crystal clear: backup and DR isn’t meant to prevent malware/ransomware or cybersecurity threats. It’s intended to provide a way to recover. In medical terminology, this isn’t the vaccination to prevent the disease; rather, it’s the medicine to cure it. Ransomware may have put the emphasis on data protection, but that’s not exactly the main reason why enterprises should protect their data, especially with backup and DR.
Why Enterprises Need Data Protection via Backup and Disaster Recovery
According to conventional perception about data loss, there are two significant threats to data: ransomware and natural disasters. This does appear logical. There are plenty of enterprises that have fallen victim to natural disasters and never recovered, while ransomware continues to claim new victims almost on a daily basis. However, the statistics suggest otherwise. The primary cause of data loss and downtime isn’t ransomware; it’s hardware failure. If we isolate the natural disaster perspective and look at the statistics, the picture is clear: we do make a big deal out of natural disasters, but they represent only 6% of the causes of outages.
“Hardware failure and power loss cause more outages than natural disasters; so they are even bigger disasters for enterprise data.”
If we are to summarise the causes of outages, it’s quite clear that it isn’t just two things; there’s a lot more to consider. Consequently, there’s more reason why enterprises should protect their data using backup and DR.
Backup & Disaster Recovery Services: On-premises, Cloud & Hybrid
There are three ways backup and disaster recovery services can be utilised: on-premises, in the cloud, or as a hybrid of the two. All three types of solutions have their advantages and disadvantages. Which solution is best for your enterprise depends on your data requirements.
On-Premises Backup & Disaster Recovery
On-premises solutions comprise local infrastructure that contains the entirety of the backup. This means hot data, cold data and archival data are all backed up on the local infrastructure. This is efficient in the case of hot data but is very expensive and inefficient for cold data and archival data.
That’s because the main perk of on-premises solutions is their ability to process high IOPS efficiently without disruption. As cold data and archival data are greater in volume but lower in priority, it is a waste to use this infrastructure for them. Another downside to this, besides the cost implications and the wasted footprint, is that if a natural disaster strikes, you still lose your backup. For backup, it’s better to integrate geo-redundancy so that the enterprise data has multiple layers of recoverability. Examples of on-premises infrastructure include enterprise NAS storage, SAN (Storage Area Network), hyper-converged appliances and hyper-scale appliances.
Cloud Backup & Disaster Recovery
Cloud backup and disaster recovery use the cloud for backup and failover as well. As the data is offsite, this introduces a certain extent of redundancy. There’s no infrastructure to maintain, and no ongoing consumption of resources like power and cooling. However, the downside is that, compared to on-premises infrastructure, the cloud exhibits more latency for hot/mission-critical data. Conversely, the cloud happens to be the best solution for cold data and archival data. As the cloud is more cost-effective than on-premises, and cold data and archival data make up the greater part of enterprise data, it becomes an efficient mix for the enterprise.
Hybrid Backup & Disaster Recovery
Hybrid solutions happen to be the most interesting and efficient ones, which is why experts highly recommend them for both SMEs (small to mid-sized enterprises) and large enterprises. And there is a good reason for this. Hybrid solutions combine the perks of on-premises and cloud-based solutions. They comprise on-premises infrastructure, which is used for hot data, and the cloud, which is used for cold and archival data. This makes it efficient enough to reduce latency for hot data while also reducing the cost for cold and archival data. The cloud part also facilitates replication of data to offsite data centres. This implies that if an entire data centre or backup infrastructure is compromised in the event of a disaster, the data remains recoverable and the enterprise can still reduce downtime by spinning up offsite virtual machines (VMs) and failing over to them. So, hybrid backup and disaster recovery is the ideal solution for the enterprise.
Difference between Backup & Disaster Recovery
Lastly, I’d like to address another common misconception and clarify that backup and disaster recovery are not the same thing. There are many differences between the two. The most noticeable among them is that backup is meant to ensure that the entirety of data, regardless of its significance, is recoverable. Disaster recovery, on the other hand, ensures that outage/downtime is reduced and/or wholly eliminated via failover and failback by prioritising mission-critical data recovery. That’s why both solutions bear equal importance for the enterprise.
Summary
Ransomware and natural disasters are not the only causes of outages and data loss for the enterprise. Hardware failure and power losses result in more interruptions, and consequently more data loss, than ransomware and natural disasters. This makes the acquisition of backup and disaster recovery necessary for enterprises, regardless of whether or not they’re located in disaster-prone areas or susceptible to ransomware/malware attacks. Enterprises can deploy backup and disaster recovery services on-premises, in the cloud and by using a hybrid solution.
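As a rough illustration of how such a hybrid policy might route data between tiers, here is a minimal sketch in Python. It is hypothetical: the thresholds, tier names and the backup_target helper are illustrative assumptions, not any vendor's actual logic.
```python
from datetime import datetime, timedelta

# Assumed thresholds: recently touched data is "hot", older data is "cold",
# and anything beyond a year is treated as archival.
HOT_WINDOW = timedelta(days=30)
ARCHIVE_AGE = timedelta(days=365)

def backup_target(last_accessed, now=None):
    """Return the backup tier a dataset would be routed to, based on its age."""
    now = now or datetime.utcnow()
    age = now - last_accessed
    if age <= HOT_WINDOW:
        return "on-premises"      # low latency, high IOPS, highest cost
    if age <= ARCHIVE_AGE:
        return "cloud-standard"   # cheaper, higher latency
    return "cloud-archive"        # cheapest, retrieval can take hours

# Example: a dataset last touched 400 days ago is routed to the cloud archive tier.
print(backup_target(datetime.utcnow() - timedelta(days=400)))
```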
On-premises introduces speed but is costly and doesn’t facilitate redundancy. The cloud is less expensive but compromises on speed and makes it difficult to efficiently reduce downtime for the enterprise. Hybrid solutions, on the other hand, combine the perks of on-premises and cloud solutions, making them the best for the enterprise. Lastly, it is essential to understand that backup and disaster recovery are not the same. Backup employs technology and techniques that prioritise recoverability of backed-up data. Disaster recovery services prioritise the reduction of downtime/outage by emphasising mission-critical data recovery.
### What Makes the Best Terminal Emulator? Legacy Systems in 2018
Choosing the right terminal emulator continues to be an important issue for users of legacy systems in 2018. When choosing a terminal emulator there are two major types to pick from: desktop-based emulators and web-based terminal emulators. In this article, Flynet will discuss why a web-based terminal emulator is the best option across four factors: functionality, security, performance and future-proofing.
Functionality
A web-based terminal emulator can be accessed quickly and easily by any device with a web browser. This includes laptops, mobile phones, tablets and desktops. In the modern workplace, where terminal emulator software needs to be used by a diverse mix of users, from salespeople to shop floor staff and programmers, such flexibility is a must. Using the web to host your emulator gives the user and system owners the increased flexibility of an internet-based application. More importantly, the business applications that run on your legacy system can then become seamlessly integrated with other systems and processes within your organisation.
Security
While installing a desktop emulator might seem the obvious choice to maximise security, the reality is very different. A web terminal emulator is actually far more secure than its locally installed counterpart. Web servers and web browsers take advantage of the most sophisticated security features of the moment. This level of security is important, as it allows the user to perform vital tasks safely, such as internet banking. This also ensures that your web-based terminal emulator can benefit from the most up-to-date security available, whilst having the agility of a modern web browser-based application. By contrast, most desktop terminal emulators are quite clunky and the security is often aeons out of date, creating a large threat surface that can then expose your most critical business systems. Desktop terminal emulators rarely receive security patches from their respective providers; they are often treated as old applications by vendors and as such receive little investment. Many years can pass without a single update, leaving you wide open to all the latest threats.
Performance
Most desktop terminal emulators haven’t had any architectural examination or revision for over 20 years, resulting in slow, bloated software that hasn’t really been designed for today’s operating systems. In the past, even some lesser web-based terminal emulators have used Java applets or ActiveX plug-ins, essentially creating a locally installed application that runs slowly in a web browser. However, modern web-based terminal emulators are pure HTML and don’t use applets or plug-ins. Because of this, remaining compliant isn’t left to users anymore; it is all managed centrally in one place. More importantly, the native HTML use ensures you get all the benefit from the most optimised and fastest application on the planet, the web browser itself. The browser wars fought between Google, Microsoft, Mozilla and Apple have delivered arguably the most performant application on the planet; even Microsoft Word is now available as an HTML-based application. Modern web-based terminal emulation will utilise compression and intelligent algorithms to make your legacy application lighter and faster than the original data stream, ensuring it can be used comfortably with mobile devices and in wireless environments. A modern web-based terminal emulator will deliver stateless emulation, meaning dropped sessions are a thing of the past.
Future-proofing
Web-based terminal emulation software can be one part of a wider future-proofing roadmap for companies. Web-based terminal emulation systems allow companies to get all the benefits of the latest technology, without endangering or changing their business-critical systems. Organisations with a cloud-first or mobile-first strategy can comfortably adopt a web-based terminal emulator: a good emulation vendor will already have a cloud-based offering and deliver interfaces that use responsive designs that work seamlessly on any device. The best web-based terminal emulator will offer a roadmap or provide no-code tools that allow the classic green screen interface to be modernised to reflect and have symmetry with other modern applications in the business system ecosystem.
Conclusion
Clearly, a web-based terminal emulator is the best terminal emulator for companies looking to update their legacy systems. A pure HTML browser terminal emulator like Flynet’s TE (Terminal Emulator) is a clear winner in terms of what the best terminal emulator is. Not only does it offer flexibility, increased performance and enhanced security, it is easier to own and manage and offers an inviting pathway into the future.
### Comms Business Live: The Funding Surgery
Since the inception of Knight Corporate Finance in 2008, there has been a series of stunning growth stories in the sector. Daisy, Maintel, XLN, Chess, Six Degrees, Southern Comms and M247 have all experienced stratospheric growth and, whilst they all have different business models, they all share one common success factor: without exception, each of these companies has had access to a plentiful supply of funding, which has been the major factor in delivering growth. In our first interactive surgery we are going to explore the funding options available to smaller businesses, the types of funding and which will be most suited to your strategy.
Alongside Adam Zoldan, Director of Knight Corporate Finance, will be Mark Borzomato, Investment Director of fund:tmt, a flexible fund that seeks to invest in established owner-managed TMT companies, and Max Ward, Director of Knight Debt Advisory, a new business established to open up the market for debt finance to the channel.
### SD-WANs Streamline Migration to Hybrid-Cloud Infrastructure
Hybrid cloud ideally combines the best of public and private cloud. In that sense, it's like the third bowl of porridge tasted in "Goldilocks and the Three Bears" – i.e., something that's "just right," without extreme flaws in either direction. In the case of hybrid cloud and SD-WAN, this balance usually entails a higher degree of control and security than a fully public cloud environment, alongside greater flexibility and scalability than a strictly private one.
SD-WAN: The Engine Behind a Successful Hybrid Cloud
However, these benefits are purely theoretical without an equally flexible and scalable software-defined wide area network to actually produce them. It's no coincidence that both hybrid cloud solutions and SD-WAN architectures have been rapidly expanding this decade. Hybrid cloud is the most popular form of cloud, accounting for two-thirds of deployments surveyed by the "RightScale 2017 State of the Cloud Report"; similarly, more than eight in 10 enterprises had a multi-cloud strategy in place. SD-WAN providers broke through in 2017, achieving what Gartner labelled "mainstream" success. Andrew Lerner even predicted that by the end of 2018, 80 percent of Tier 1 carriers would offer virtualized customer premises equipment, up from only 5 percent in mid-2017. Between them, hybrid cloud and SD-WAN provide a blueprint for digital transformation, with hybrid cloud as the "why" and SD-WAN as the "how"; hybrid cloud can extend and enhance existing IT investments, while an SD-WAN supplies the network intelligence, adaptability and millisecond measurement to help it reach its potential. Migration to hybrid cloud can still be challenging, though, due to various complications such as unexpectedly high costs for additional licenses (e.g., so that on-prem applications can be accessed by cloud services) and the complexity of data fragmentation across disparate environments.
Legacy WAN vs SD-WAN in Hybrid Cloud Environments
For starters, hair-pinning all traffic through a corporate data centre via MPLS is highly inefficient. It increases latency and degrades application performance. Plus, simply adding more MPLS bandwidth to the network – in addition to being expensive – usually won't solve the problem, any more than adding more cars to a train will decongest an overtaxed subway system. The only way forward is to support a distributed network architecture, backed by an SD-WAN. Today's progressive SD-WANs offer platforms that constantly measure different paths for jitter, latency and packet loss, sending traffic down the best one available (a rough sketch of this path-scoring idea appears at the end of this article). For hybrid cloud architectures, in particular, SD-WAN solutions ensure secure, aggregated cloud access via the most practical alternatives to pricey MPLS links, namely commodity Internet and direct-to-cloud connections such as AWS Direct Connect. An SD-WAN from Talari Networks ultimately facilitates scalable performance over any type of transport, whether MPLS, broadband, cellular or satellite, making it ideal for the varied characteristics of hybrid cloud environments. To learn more about Talari's benefits for hybrid cloud connectivity, click here.
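Returning to the path-measurement point above, the sketch below shows the general idea of scoring candidate links and picking the best one. It is a hypothetical illustration only; the metrics, weights and the score helper are assumptions, not Talari's actual algorithm.
```python
# Hypothetical measurements for three candidate WAN paths.
paths = {
    "mpls":      {"latency_ms": 18, "jitter_ms": 1.5,  "loss_pct": 0.0},
    "broadband": {"latency_ms": 35, "jitter_ms": 6.0,  "loss_pct": 0.4},
    "lte":       {"latency_ms": 55, "jitter_ms": 12.0, "loss_pct": 1.1},
}

def score(m, w_latency=1.0, w_jitter=2.0, w_loss=50.0):
    """Lower is better: weight packet loss heavily, jitter moderately, latency lightly."""
    return w_latency * m["latency_ms"] + w_jitter * m["jitter_ms"] + w_loss * m["loss_pct"]

best = min(paths, key=lambda name: score(paths[name]))
print("Send the next flow down:", best)   # 'mpls' with these sample numbers
```
In a real SD-WAN the measurements are continuous and per-flow decisions are made in milliseconds; this toy version simply picks the lowest weighted score once.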
### #TheCloudShow – Ep 4 – Are Ecosystems The New Norm?
Digital transformation is in full swing in the technology sector. But decision makers are now being challenged to go further and create digital and technology ecosystems that encompass customers, employees, developers, suppliers, and even extended networks of related businesses.
### HOME BARGAINS AGREES £950K MANAGED IT SERVICES DEAL WITH SYSGROUP
Managed IT services and cloud hosting provider SysGroup PLC (‘SysGroup’) has today announced a three-year managed hosting services contract with one of the UK’s fastest-growing high street discount retailers, Home Bargains. The company, which sits under the T.J. Morris Limited umbrella, selected SysGroup following a competitive process to deliver more than £950,000 worth of managed infrastructure hosting and support from March 2018. SysGroup will provide this service and implement specialist security measures to comply with the General Data Protection Regulation (GDPR), set to come into play in May this year. Adam Binks, chief operating officer at SysGroup, said: “With GDPR just around the corner, businesses must ensure that their data is not only hosted in safe hands, but their environments are also protected from the vulnerabilities of the public internet.” Discussing the new partnership, Frank Christiansen, IT director at T.J. Morris Limited, added: “We selected SysGroup as our new managed services provider due to its strength, understanding and ability to implement security measures that conform to the requirements of GDPR, as well as its extensive flexible approach to IT solutions.” The contract reflects the Group’s newly refined strategy to focus on the growing market demand for its agile managed IT solutions and cloud hosting among customers from vertical sectors, including retail. “T.J. Morris is a significant win for SysGroup and we are delighted to have been appointed by another highly regarded, household name,” Adam continued. “The deal is a clear endorsement of our capability and managed services offering, and we look forward to working with more businesses like this in the retail sector going forward.” This agreement with Home Bargains comes after SysGroup recorded sales pipeline growth of 17.1 percent in its half-year results published in September 2017 and the £3.9m acquisition of managed IT, hosting and security services company Rockford IT Ltd. The Group has plans to invest further resources in capability, contract wins and strategic acquisitions, alongside organic growth, to expand its customer offering and geographical reach.
[clickToTweet tweet="40% of companies will no longer exist in 10 years if they refuse to embrace new technologies #AI #NewTechnology #Future" quote="40% of companies will no longer exist in 10 years if they refuse to embrace  new technologies"] Yesterday’s sci-fi is today’s reality In an interview with CNBC, Peter Diamiandis, MIT Space Engineer, said that 40% of companies will no longer exist in 10 years if they refuse to embrace  new technologies. These encompass artificial intelligence, robotics and cognitive applications, with 72% of Fortune 500 CEOs stating that their greatest challenge is in keeping up with the rapid pace of technological innovation. But the technology of tomorrow is already here. Sensors are in everything that we do and use. Machine learning technologies are disrupting industries. These are the tools of the future, pioneering new growth and developing new revenue streams. And businesses know this. A recent survey conducted by the Economist Intelligence Unit, intended to gauge the current and future use of IoT by the global business community, shows that almost half (47%) of businesses agree that IoT will be one of the most important parts of their organisation’s digital transformation strategy. It is clear, then, that companies are drawn to the potential that new technologies hold and recognise the benefits they can have. So what’s holding them back from fully employing them? Go beyond just recognition Obviously, simply recognising how great these technological advancements are is not exactly enough to yield tangible results from them. In other words, if you’re fascinated by the potential of, say, artificial intelligence but not ready to take the necessary steps to move your organisation towards implementation, then you’re playing directly into your competitors’ hands. Indeed, if you’re not thinking about implementing these technologies now, you will soon find yourself on the losing side. Sure, the adoption of AI and machine learning technologies has its pain points. Coping with the sheer amount of data is a task in itself, let alone figuring out how to efficiently handle, integrate, model and construct it. This can especially be the case for businesses that wouldn’t necessarily classify themselves as tech titans. Managing the data can be challenging, it has to be extracted properly. Let AI give you a hand Instead of getting lost and drowning in data, companies can overcome this by taking steps to implement powerful development platforms and integrating them with AI technologies – most notably, machine learning and predictive analytics. To do this, organisations must rethink the way they operate and identify the areas where new technologies can create most value. This will allow them to employ these more effectively: they’ll be using the right platforms to analyse the right data, building appropriate models based on what the data suggests, and ensuring only the most relevant data insights are being used to orchestrate future business operations. Organisations will have to be ready to take the necessary steps to overcome the initial challenges in implementing these new technologies, or be prepared to face inevitable decline. Embracing digital disruption will no longer get you ahead of the game – it’s something every business must do in order to survive in today’s rapidly changing world.   ### Tech of the Week #2: Iron Man Style e-Skin The next article in the series is looking at wearable tech, in the form of skin. 
With advances in Augmented Reality coming along further and further each day, we are now looking at electronic skin (e-Skin) that can manipulate devices around you, from the physical lightbulb to a virtual architecture plan. Be like Tony Stark with this ultra-thin, 1.7-μm-thick polyimide foil wearable technology… This augmented reality powered computer chip sits nicely in the palm of the hand, but you won’t even notice it. Thinner than a human hair, at less than 3 micrometres thick, it’s hard to believe it doesn’t just snap on contact. For reference, a human hair is roughly 50 micrometres thick.
How does e-Skin work?
Birds, insects, bacteria and even vertebrates like sharks use the Earth’s magnetic fields for orientation and navigation. It helps them perceive direction, location and the altitude at which they currently sit. Currently, there is nothing to suggest we, as humans, have this magnetic sixth sense, although we do have a protein in the eye called a cryptochrome which could serve this function. The main goal of the e-Skin is to give humans their own ability to interact with magnetic fields (magnetoception). It achieves this by using artificial receptors that interact with said magnetic field, like a bird would. These receptors form part of the whole magnetosensitive skin, allowing for body position tracking, magnetic cognition and touchless object manipulation. This is seen in the case study of a light bulb later on. The technology uses a static magnet to which an electronic appliance is mapped in software. When the user touches the magnet, it replicates touching the physical object.
The Tech
Typical virtual and augmented reality technology won’t work in the case of this e-Skin. The mix of gyroscopes, camera arrays, accelerometers, magnetometers and real-time image processing requires considerable energy supplies. This, unfortunately, requires space, which would defeat the whole purpose of easy-to-manage wearable technology. Instead, the skin uses giant magnetoresistive sensor foils and magnetic field sensors, providing high sensitivity, flexibility and mechanical endurance. This results in proximity detection, touchless control and navigation. Which basically means that when the hand moves along the y-axis, the sensors detect its proximity relative to the magnet. This allows for replication of the touch function of a hand by detecting how far away the hand is from the object. Arranged into two Wheatstone bridges are eight high-performance spin-valve sensors. In order to detect hand movement in the x-axis, the skin uses these sensors to determine which axis the hand is moving in. Much like an Xbox controller, moving the right analogue stick right-left or up-down results in an x- or y-axis movement, respectively. As the proximity sensor explained previously handles the ‘touch’ function, there doesn’t need to be anything as fancy added at this stage.
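To make that concrete, here is a small, purely hypothetical sketch of how readings like these could be turned into a touch-plus-rotation control, in the spirit of the light bulb case study described next. The threshold, the normalised field values and the is_touching, dial_angle and brightness helpers are illustrative assumptions rather than the researchers' actual processing pipeline.
```python
import math

TOUCH_THRESHOLD = 0.8  # assumed normalised field strength that counts as a "touch"

def is_touching(field_strength):
    """Treat a sufficiently strong magnetic field reading as the hand being on the dial."""
    return field_strength >= TOUCH_THRESHOLD

def dial_angle(bridge_x, bridge_y):
    """Recover a 0-180 degree dial angle from the two (assumed) bridge outputs."""
    angle = math.degrees(math.atan2(bridge_y, bridge_x))
    return max(0.0, min(180.0, angle))

def brightness(angle_deg):
    """Map the dial angle to brightness: 0 degrees = off, 180 degrees = full."""
    return angle_deg / 180.0

# Example: hand close to the magnet, dial turned three-quarters of the way round.
if is_touching(0.92):
    print(f"Bulb at {brightness(dial_angle(-0.5, 0.5)):.0%}")  # prints "Bulb at 75%"
```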
You now have a system that can activate electronics as the hand touches it, and can be rotated by turning the hand in a motion similar to turning a dial.
Application and the Light Bulb
The benefit of using a magnetic field is that the user no longer requires a direct line of sight between the object and the skin sensors. So if there were controls inside a radioactive chamber, they could be operated from an external, safe environment without the need to send a human being into hazardous places. As mentioned earlier, there was a case study of engineers and scientists emulating a dimmer switch with this magnetic technology, then using it to turn a virtual light bulb down. The first step was to create the dial, which simply consisted of a ring-like plastic structure with a magnet inside. The next step was to code the light bulb and assign intensity parameters to the dial. The dial was coded as angles between 0 and 180 degrees, corresponding to typical hand movements involved in turning a knob. The bulb was coded to be off at 0 degrees and at full brightness at 180 degrees, with the intervals divided up equally between the minimum and maximum values. Finally, the user put his e-Skin-covered hand on the dial (fulfilling the proximity element) and turned his hand left and right to dim and brighten the bulb, respectively. After the success of this trial, the researchers then went on to replicate the physical dial in a virtual dial with great success.
The Possibilities
There is currently a great lack of technologies and mediums for interfacing the physical and virtual worlds. This approach takes some fundamentals of virtual and augmented reality and opens up potential far beyond our current capabilities. So, what other possibilities are there? Could the e-Skin be used to land an aeroplane in a situation where a human couldn’t? Could it be as simple as sitting at your desk, noticing your clock is wrong and using your new powers to correct it?
Next in the series, Wednesday 7th March 2018… Tech of the Week #3: Harley Davidson – The Electric Dream
Tesla is widely known for its electric vehicles and for pioneering the electric lorry; we have electric bicycles, SUVs, trains and everything in-between, but I’ve always felt one vehicle, in particular, has been missing. A personal favourite of mine: the motorbike. Well, to all you bikers out there, Harley Davidson is providing us with the answer…
I would like to know your thoughts – connect with me on Twitter. Click below even if you just want to talk tech.
### HOSOKAWA AND SIEMENS ANNOUNCE COLLABORATIVE DIGITAL TECHNOLOGY PROJECT
Project utilising Siemens’ MindSphere IoT operating system designed to aid competitiveness, improve productivity and reduce operating cost
Hosokawa will benefit from adoption of the Hosokawa Gen4 digital technology platform at its Runcorn manufacturing facility in conjunction with MindSphere
With the recently announced Government-backed Made Smarter Review calling for the widespread adoption of industrial digitalisation to boost the UK’s manufacturing sector, MindSphere – Siemens’ open cloud IoT operating platform – has been specified on a digital technology project designed to further optimise production and processing performance for Hosokawa Micron Ltd (HML), one of the UK’s leading contract processing companies. Specialising in the single source supply of particle and powder processing equipment, technologies and services for a range of sectors, including pharmaceuticals, chemicals, food and cosmetics, HML has a long-standing working relationship with Siemens going back over 20 years. This partnership has now led to the two companies working together on a strategic project that will focus on obtaining performance improvements by using the benefits of industrial digitalisation – and in particular, Siemens’ MindSphere open cloud IoT operating system and Hosokawa Gen4 technologies. The project – Contract Manufacturing 2020 – will capture real-time and essential operational data at the company’s Runcorn manufacturing facility, which can then be analysed at machine level on the local network, as well as stored and interrogated via the cloud platform. The ongoing data analysis will provide important insight into all aspects of the plant’s production performance and the status of production assets, and aid strategic decision making, scheduling, predictive maintenance and operational availability. Iain Crosley, Managing Director at Hosokawa Micron Ltd, says: “We are delighted to be working alongside Siemens on this important project, which is designed to harness the business benefits digital technologies can deliver for us. Capturing, storing and interrogating operational data from across our production and processes is vital to provide the intelligence we require to further enhance our performance, and I am determined that Hosokawa will be at the forefront of digital technology adoption. “The MindSphere platform brings with it capabilities for remote monitoring, data-driven analytics and secure cloud connection. It will allow us, for example, to reduce our production downtime and associated maintenance costs, quickly highlight the potential for any production anomalies and, ultimately, aid our competitiveness by allowing us to get products to market more speedily.
“It is a robust and affordable industrialised solution with configurable connectivity that will not only assist with our own contract processing performance, but also enable us to better support our customer base.” Ian Elsby, Chemicals Industry expert at Siemens, comments: “Hosokawa is a long-standing partner and we are pleased to be collaborating with them on the Gen4 project. The widespread adoption of digitalisation - the technology behind Industry 4.0 - is vital to enhance the country’s competitiveness and productivity. The MindSphere project for Hosokawa is a clear illustration of the many benefits digital technology can deliver for UK manufacturers. “This is an exciting project and we look forward to securing the tangible and quantifiable advantages it will deliver for Hosokawa as the project progresses over the coming months. The solution will demonstrate sensor to digital twin models and highlight the advantage of digital technology impacting productivity in a positive manner.” The technology delivered on this project will be available to other manufacturers through Hosokawa Gen4.
### Interview with Jonathan Maskew from Clerksroom Direct and Liana Telvi from PSU
David Organ from Compare the Cloud had the opportunity to interview Jonathan Maskew from Clerksroom Direct and Liana Telvi from PSU.
### The Weather’s about to Change: Fog Computing is Drifting Down from the Cloud…
We’ve all heard of cloud computing. Love it or hate it, use of cloud technology has become ubiquitous in modern business and everyday life. But now, staying along the lines of information technology paradigms named after weather, fog computing seems to be the next big thing in the forecast. SONM, a universal fog supercomputer, announced this week that it would unveil its Minimum Viable Product (MVP) on Christmas Day 2017, giving developers the chance to test its range of fog computing services and give feedback ahead of the official release to customers next summer. It’s set to be one of the most significant stories in tech to watch out for in 2018. So what’s all the fuss about, and what exactly is fog computing? Essentially, fog is a step between the data source and the cloud. Normally, edge sensors and devices collect and generate data before sending it to the cloud, but that doesn’t come without its problems. Businesses often hit legal restrictions, privacy issues and security questions – not to mention the fact that you might be wasting time and energy sending vast amounts of completely useless data to the cloud. Wouldn’t it be great if there was a data hub or smart device between the source and the cloud, acting as a gateway and helping to send appropriate information to the cloud? Well, that’s what fog does. It’s basically a concept that complements cloud computing; ‘fogging’ allows short-term analytics to be conducted at the edge, allowing the cloud to carry out the more intensive, longer-term analytics on data that is actually significant. SONM positions itself as providing a viable and more cost-efficient alternative to cloud solutions. Powered by the Ethereum blockchain, SONM runs on users’ computers, the benefits of which include an open infrastructure and a common resource pool with unlimited computing power.
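Before returning to SONM's model, a minimal sketch of what 'fogging' means in practice may help: an edge gateway summarises raw readings locally and only forwards something to the cloud when it matters. The threshold, data shape and fog_filter helper below are illustrative assumptions, not SONM's implementation.
```python
from statistics import mean

TEMP_ALERT = 75.0  # assumed threshold; readings below it are treated as routine noise

def fog_filter(readings):
    """Summarise a batch of edge readings; return None when nothing is worth sending on."""
    summary = {"count": len(readings), "avg": round(mean(readings), 1), "max": max(readings)}
    if summary["max"] < TEMP_ALERT:
        return None      # short-term analytics stay at the edge, nothing is sent
    return summary       # only the significant summary travels on to the cloud

print(fog_filter([61.2, 63.0, 62.4]))   # None - handled in the fog
print(fog_filter([62.1, 77.8, 64.0]))   # compact summary forwarded to the cloud
```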
The idea is that cryptocurrency miners across the globe will leverage their idle computer power to join the network, and then SONM will be able to reward them for sharing their processing power with SNM tokens. SONM CEO Sergey Ponomarev said: “With the launch of its MVP, SONM is excited to provide our existing partners with early access to the marketplace and the ability to leverage the community’s collective computing resources. This will prove to be a critical resource for learning and tests, providing us not only with the opportunity to develop a user-friendly service with advanced capabilities but also with the ability to build valuable partnerships both now and in the future.”
### Blockchain: A New Dawn in Trade Finance
Almost since the dawn of civilization, there has been trade finance. Its early forms can be dated back more than a thousand years, to the beginnings of modern civilization. The current trade finance market is substantial, with $8 trillion currently financed and a further potential $40 trillion on corporate balance sheets. Trade finance has been provided primarily by financial institutions, unchanged for years, with many manual processes based on old legacy systems that are expensive to run and costly to update. Most financing facilities are still bilateral or based on tripartite relationships between a supplier, a buyer and a financial institution mitigating credit risk and offering funding to the trading parties. Such structures are mostly managed manually or through antiquated systems, which are not scalable and result in higher operational costs. Therefore, traditional trade finance facilities are mainly focused on top-tier corporates and their key trading partners, involving large transactional volumes. Two decades ago, new fintech companies started to emerge, offering different types of trade finance solutions and applications, which improved how trade finance was managed and delivered. Today, some of these applications include thousands of suppliers and buyers and multiple funders. The introduction of cloud-based technology, efficient means of communication and economies of scale has resulted in increased scalability and a reduction in the cost of manual processes, all of which have made trade finance more available for a larger number of participants in the trade ecosystem. Currently, there are approximately 30 large destination applications focusing on open account trade finance, with each of them competing for more trading partners and financial institutions. As a result, many corporates have to work with 20 or more platforms, logging into different systems, performing due diligence on the applications, legal agreements and different processes, together with uploading their confidential trade data to third-party applications. Banks offering financing through these destination applications have to rely on the data integrity and security of these systems and need to integrate and operationalize with a multitude of different applications from third parties. This process can take months of due diligence and, in many cases, the bank still processes the data separately on its side because of a lack of trust and the risk involved. Blockchain technology can provide enormous benefits to solve these challenges in trade finance. It can be used for the basic services that are essential in trade finance.
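As a toy illustration of the ledger idea discussed next, the sketch below chains records together with hashes so that later tampering is detectable. It is deliberately simplified and hypothetical; it is not how TradeIX, R3 or any production blockchain actually works, and the invoice fields are made up.
```python
import hashlib, json, time

def add_block(chain, transaction):
    """Append a record, linking it to the previous block's hash (the provenance chain)."""
    prev_hash = chain[-1]["hash"] if chain else "GENESIS"
    block = {"tx": transaction, "prev": prev_hash, "ts": time.time()}
    block["hash"] = hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()
    chain.append(block)

ledger = []
add_block(ledger, {"invoice": "INV-001", "buyer": "ACME", "amount": 120000})
add_block(ledger, {"invoice": "INV-001", "event": "financed", "funder": "Bank A"})

# Tampering with an earlier record no longer matches its stored hash.
ledger[0]["tx"]["amount"] = 999999
recomputed = hashlib.sha256(json.dumps(
    {k: ledger[0][k] for k in ("tx", "prev", "ts")}, sort_keys=True).encode()).hexdigest()
print("tamper detected:", recomputed != ledger[0]["hash"])   # True
```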
At its core, blockchain relies on a decentralized, digitalized ledger model, which by its nature is more robust and secure than the proprietary, centralized models which are currently used in the trade ecosystem. This creates a safer, more reliable system at a much lower cost. Blockchain technology creates a viable, decentralized record of transactions – the distributed ledger – which removes the need for a single master database. As a consequence, blockchain leads to radical simplification and cost reduction for large parts of transactions in trade finance, whilst making them more secure and reliable. It keeps an immutable record of all the transactions, back to the originating point of a transaction, also known as the provenance, which is essential in trade finance as it allows financial institutions to review all transaction steps and reduce the risk of fraud. Blockchain also offers a far better means of establishing and proving identity than present-day systems. By providing unique, non-forgeable identities for assets, along with an inviolable record of their ownership, blockchain greatly simplifies the direct transfer of trade assets and increases confidence in their provenance. This can lead to additional financing services based on the trade of physical goods. Using the elements and benefits of blockchain described above, existing financial infrastructure can be retooled and new business models deployed to radically improve trade finance solutions and generate new revenue streams. In addition, blockchain allows near real-time settlement for trade finance transactions, which reduces counterparty risk, frees up working capital and reduces transaction costs. It also allows new service level models in which trading partners in effect become the keepers of their own accounts. This could give them not only more security, but much more freedom in choosing funders and technology providers. With industry-wide collaboration platforms like the R3 consortium, the stage has been set for the cooperation necessary to bring out the full potential of blockchain technology as quickly as possible. One company working with R3 and making strides in the trade finance market is TradeIX. The company is rewiring the thousand-year-old trade finance ecosystem by providing a connected and secured platform infrastructure for corporates, financial institutions, and B2B networks through standard communication channels (APIs) and leveraging blockchain technology. The race to make history and drive efficiency gains and scalable solutions in trade finance is on. This development puts trade finance at the heart of new blockchain applications.
### Keeping Customers Safe - Marc Wilczek - Link11
David Organ from Compare the Cloud talks to Marc Wilczek, the Managing Director of Link11, about keeping customers safe and DDoS. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, events and workshops to enable you to stand out from the cloud crowd.
### Hybrid Cloud: Trinity Mirror chooses NetApp to Safeguard Real-Time News through Digital Transformation
Instantaneous hybrid cloud disaster recovery means no one misses a beat.
LONDON, UK - February 20, 2018 - NetApp (NASDAQ: NTAP). The UK’s largest news publisher, Trinity Mirror, has turned to NetApp to ensure readers get the latest news and the best user experience by harnessing the hybrid cloud. With more than 150 print and 80 online publications, including the Daily Mirror, the publisher wanted to ensure its content updates remain real-time by moving its disaster recovery to the hybrid cloud. Having embraced the hybrid cloud over the last ten years, Trinity Mirror seized the opportunity to break its infrastructure refresh cycle when the lease came to an end on its secondary disaster recovery facility, onsite at the London HQ. The team turned to NetApp to leverage the hybrid cloud for disaster recovery (DR), having utilised NetApp’s expertise several years previously in its quest to centralise a siloed editorial and advertising system. Applying NetApp ONTAP Cloud for AWS, Trinity Mirror built a reliable disaster recovery platform for the company’s largest business-critical applications in the Amazon Cloud.
Keeping it real-time with disaster recovery
Multiple refreshes of online content throughout the day and various newspaper prints going to press every day mean up-to-the-minute reportage is the bread and butter of Trinity Mirror. Downtime is simply unthinkable as this could devastate the business, with loss of readers, investors and advertisers. However, with NetApp’s ONTAP Cloud, no matter the scale of a potential incident threatening Trinity Mirror’s rolling updates, data can be recovered instantaneously with less than ten minutes of data loss – well below its 60-minute benchmark. Now, there’s no reason for anyone to miss a beat.
An economical hybrid cloud solution
By breaking its refresh cycle and moving disaster recovery into the hybrid cloud, Trinity Mirror eliminates the need for large capital expenditures and recurring migration costs. Meanwhile, ONTAP Cloud’s ‘pilot-mode’ assumes an on-demand cost structure, so while the software is idle in the cloud it is not paid for until the service is needed.
Bridging traditional and digital media by breaking down data silos
Digital transformation has been pivotal to Trinity Mirror’s success. Built upon its ever-growing data, the publisher has embraced a hybrid cloud approach in order to facilitate the explosive growth in video and image content. Ensuring efficiencies across both print and digital operations is essential and has been achieved by breaking down data silos using a centralised content management system and pushing content from print to the web. Martin Warren, NetApp EMEA Cloud Solutions Marketing Manager: “Trinity Mirror is an exemplar of the power to be unleashed by digital transformation. With readers increasingly leaning on real-time updates from its print and digital publications every day and in the midst of important news events, its hybrid cloud approach ensures risks of downtime are minimised – and the show will go on. In a world increasingly reliant on always-on services like rolling news services, there is no room for error when it comes to optimising services with instantly accessible Disaster Recovery.” Peter Raettig, Trinity Mirror, Head of Technical Operations: “Trinity Mirror reaches 38.6 million people each month, who expect 24/7 coverage of the latest news. In an era when many traditional media outlets are struggling to keep pace with changing information consumption behaviour, it is essential for brands to modernise.
NetApp has been a key advisor to us in developing and expanding our hybrid cloud strategy to optimise our business. We are looking at decommissioning our primary data centre and moving it completely to the cloud, and we see NetApp as a core part of that.” ### Join the World’s Largest Legal Hackathon this weekend The legal profession is usually considered conservative on many fronts, but it's a sector as ripe for digital disruption as any. With technologies like Artificial Intelligence, Blockchain and IoT, all underpinned by Cloud Computing, the future of the law firm could look very different in the next 5 years. Well, a lot of legal brains, technology experts, developers, designers and enthusiasts will be working, collectively on exactly that this coming weekend in the world's largest legal hackathon happening simultaneously in over 40 cities and 20 countries. The London stream of the Global Legal Hackathon (GLH) is being co-hosted by Cambridge Strategy Group, Agile Elephant and Pinsent Masons on February 23-25 at Pinsent Masons' London office in the City.  A winner will be declared for London and that team will go through to a global competition with all the other cities, culminating with a winner announced at a banquet in New York on April 21. All of the details and how to register are at: LegalHackathon.London What is the goal of a team entering the Global Legal Hackathon? The goal is to apply innovative ideas and emerging technologies to progress the business of law or facilitate access to justice for the public. Teams of 3 to 6 will come up with a prototype or proposal at the end of the hackathon to present in front of a panel of judges. We expect ideas using technologies like AI, Machine Learning, Chatbots, Blockchain, or the Internet of Things. Where and when? At Pinsent Masons office at 30 Crown Place, Earl Street, London EC2A 4ES, (including their customer centre on the 15th floor with stunning views over London) over the weekend of February 23-25.  18:00 start on Friday, working all day Saturday and most of Sunday, judging takes place from 16:00-18:00 on Sunday and a winner will be announced before 19:00. Who are the Judges and Mentors? Our judges are Christina Blacklaws, Deputy Vice President of the Law Society; Frank Jennings the "Cloud Lawyer"; Dr Richard Sykes, chair of the Cloud Industry Forum; and Joanna Goodman, Author and IT columnist for the Law Society Gazette. Our mentors, to help advise and keep the teams on track include Sophia Adams-Bhati, Richard Tromans, Silvia Cambie, Dennis Howlett, Alan Patrick, Janet Parkinson, Rob Millard, Julie Gottlieb and me. Who is supporting this? There are a lot of people to thank! IBM and Microsoft are providing developers with some free access to their cloud platforms. LexisNexis and JG Consulting are our local sponsors. The Law Society, the Society for Computers and Law and Disruptive.Live are supporting us too. The Global sponsors are Integra, IBM Watson Legal, Global Legal Blockchain Consortium, Cadence, LawDroid and ONE400. Who will be Hacking? We've got teams entered from LexisNexis, Pinsent Masons, Vodafone, and Hult International Business School. 
Other participants are coming from IBM, Fliplet, Jurit LLP, Hook Tangaza, Sumitomo Electric Finance UK, Said Business School, Legalytics, Cliffe Dekker Hofmeyr Inc, The Incorporated Council of Law Reporting for England and Wales, European Banking Authority, The Founder, Legal Utopia, The Law Society, Bryan Cave, Queen Mary University, Thomson Reuters, Kitmobs, Look, YADA Events, Teal Legal, Bank of America Merrill Lynch, City University, Oxford University, and Westminster University. We've got capacity for 110 people and 12 teams, but we still need more participants to sign up. How can you get involved in the GLH? Hacker teams and team members - Anyone involved in the law, interested in the law, involved in technology for the law, or coders and technologists who want to join the fun.  We know some firms will submit teams, and other teams will form around a great idea at the GLH. Helpers - We need volunteers over the weekend to make it happen and keep everyone happy. Mentors - We need subject matter experts and technologists who can mentor the teams over the weekend to help crystallise their ideas, challenge them, or keep them on track. Judges - We've got 4 great judges, as listed above. Sponsors - As well as the venue we will be providing food and drinks, name tags and supplies. We may even add a main prize and additional prizes. We need sponsors interested in helping us fund all of this - modest amounts in the range £250-500. This is a 'not for profit' exercise for the hosts, but we need to cover our costs (mostly pizza and drinks). Follow us on Social Media We will be using the social media hashtags #GlobalLegalHack & #GLH2018.  Follow the GLH on Twitter at @WorldHackathon and the London organisers @robmillard & @DT. GLH has also partnered with legal media sources ArtificialLawyer.com and Legal Talk Network. Our friends at Disruptive.Live will be generating video and live content and diginomica and the Law Society Gazette will be reporting on the event. We think this is going to be something special. What happens when you get a bunch of lawyers, coders, designers, consultants and marketing types with their laptops and cloud platforms together over a weekend?  Please come and join us and find out! ### New PGi Report Reveals the Smartphone as the Go-To Device for Unified Communications & Collaboration Services Findings Reveal a 70 percent Decline in Users Accessing Meetings via Desk Phones in Past Five Years. ATLANTA, February 21, 2018 – Today PGi, the world’s largest dedicated provider of collaboration solutions, published the results of research undertaken with Mobile World Live, which highlights that nearly half (44 percent) of mobile industry respondents rank the smartphone as their chosen device for unified communication and collaboration (UC&C) services in the next few years. Mobile World Live is the media arm of the GSMA. This projected mobile growth contrasts with a decline of 70 percent in the past five years of desk phone usage for conference calls. In the past twelve months, 55 percent also stated that their desk phone usage for conference calls had declined.  After the smartphone, the other go-to-devices for UC&C services were the PC/laptop (43 percent), desk phone (9 percent) and tablet (four percent). 
The PGi global report, entitled “The Future of UC&C on Mobile,” showed that HD audio with background noise suppression is ranked as the most likely feature (43 percent) to encourage greater smartphone usage, followed by ease of access (25 percent), enhanced battery life (16 percent) and improved data plan usage (12 percent). The top network developments to drive future UC&C on mobile include improved network coverage (33 percent), enhanced call quality (29 percent), faster download speeds (13 percent) and enabling 5G (13 percent). The findings also showed that mobile network operators (MNOs) or mobile virtual network operators (MVNOs) are most expected to drive the use of UC&C services on smartphones over the next five years (38 percent). This contrasts with 37 percent of respondents who consider Over-the-Top (OTT) application providers to be well positioned in this sector. The MNO lead could be due to their role in improving network coverage, enhancing call quality and their established UC&C expertise. Frank Paterno, vice president, global carriers, PGi, said: “This global survey of mobile industry professionals shows that collaboration fuels enterprise success and that the smartphone is now the go-to device for UC&C services. At PGi, we have seen the number of mobile devices connecting to our flagship collaboration product, GlobalMeet®, significantly increase year over year. In fact, we’re estimating that over 50 percent of endpoints on our audio or web conferences will be from a smartphone by 2020. This survey and our own user data continue to prove that mobility is a standard professional tool that enables better productivity, more flexibility and powers global, digital workers to get more done from anywhere at any time.” The bring-your-own-device (BYOD) trend within the enterprise means that employees have access to a smartphone both at work and home, making it the obvious choice for communications. Smartphone adoption has grown exponentially, and further strong growth is expected. According to the GSMA Intelligence report, The Mobile Economy 2017, global smartphone connections are expected to rise by 1.9 billion from the end of 2016 to 5.7 billion by 2020.
Survey Background on “The Future of UC&C on Mobile”
The global report surveyed 333 mobile industry leaders on unified communication and collaboration (UC&C) with a focus on mobile usage. The largest segment of respondents included device and network equipment vendors (26 percent), followed by mobile operators (24 percent) and then MVNOs, fixed operators, business service resellers, software developers and mobile specialists (50 percent). The report provides insight into how operators and manufacturers around the world rate their UC&C experiences on a mobile device, and what impact current and emerging technologies will have on the future use of smartphones for UC&C services in the mobile enterprise. The report also explored the role that mobile device features, such as high-definition (HD) audio (or HD voice), are expected to play in improving the quality and usability of these services in the future. Additionally, the survey investigated attitudes towards the use of over-the-top (OTT) applications for UC&C. The survey ran during Q4 2017. Check out the full report here.
### Why SD-WAN is the New Black - Atchison Frazer - Talari Networks
We were joined in the studio by Worldwide Head of Marketing for Talari Networks, Atchison Frazer.
In this interview Matt Lovell, COO of Centiq, discusses the concept of SD-WAN with Atchison, the limitations of traditional networks and why SD-WAN is changing the way enterprise views their networking. ### A10 Networks Enables The Bunker to Deliver Ultra Secure Hosting Without Breaking the Bank READING, UK – 20th February 2018 - A10 Networks, a Secure Application Services™ company, today announced that it is working with ultra-secure managed service provider, The Bunker, helping it to deliver highly available and secure services to its customers. Within its purpose-built data centres in Ash and Newbury, The Bunker has implemented A10 Networks’ Application Delivery Controllers (ADCs), a fully managed network solution with firewall, switching, routing, load balancing and SSL termination. Additionally, The Bunker is using A10 Networks as part of its overall managed service offering and has just closed its first customer with A10 Networks’ ADCs as part of the overall solution. Founded in 1994, The Bunker is now a mature business with a prestigious base of long-standing clients with a stronghold in financial services, healthcare, telcos, mobile and professional services. The Bunker serves businesses that have specific requirements driven by compliance and regulation and are concerned with how their data is protected, the value it has, as well as the reputational risk should that data ever be disclosed. The Bunker’s two data centres operate hand-in-hand and availability and fast access is paramount, but at the same time this cannot compromise security, privacy or expose data to any undue risks.  Up until recently it was using three different solutions to deliver this high level of security and availability. It had an in-house developed solution, a network storage solution and an application management performance solution. This was proving cumbersome and expensive to manage, so the decision was taken to consolidate these three solutions into one. The Bunker invited A10 Networks to pitch for this business and undergo a proof of concept and a number of pre-sales technical engagements within its lab environment. The Bunker was impressed with not only A10’s high performance server load balancers and ease of use, but also its flexible licensing model cost in relation to comparable solutions and its Secure Application Services.  As a result the decision was taken to deploy A10 ADCs. Philip Bindley, Managing Director, The Bunker comments: “We had three separate companies doing what we are now delivering with A10 Networks. By deploying A10 we have combined all three technologies into one physical and virtual appliance which keeps everything a lot neater.” “The performance out of the box compared to other solutions was far superior which means that we effectively get a lot more bang for our buck.  We were really impressed with A10’s SSL mutual authentication load balancing which enables us to authenticate any client connecting to our systems.  This means we can effectively move our perimeter a further layer up in the security stack and stop attacks faster.” The A10 ADC platform allows traffic to utilise multiple CPUs without slowing down or blocking.  As a result, applications run exponentially faster - especially during peak usage conditions.  Also, unlike incumbents, A10 includes all functionality within the A10 ADC platform and does not charge additional software licencing fees. Last year The Bunker had a new round of investment into the business and was acquired by Palatine Private Equity. 
It is now part of the Cyberfort Group, delivering an end-to-end cybersecurity service. The Bunker is at the heart of this capability, augmented with an offensive and defensive security SOC. So not only will The Bunker be using A10 Networks for its own internal systems, but as a service provider it will be offering the platform as its product of choice to its customers, making A10 Networks the de facto offering for all its other customers going forward. Philip Bindley comments: “We have big growth plans and scalability is key. The fact that A10 can consolidate a number of technologies into one solution that the team can easily manage is crucial to our growth ambitions because that way we can look after more customers and provide a better service.” If you are interested in finding out more information on A10 Networks’ Application Delivery Controllers (ADCs) please visit the A10 Networks website www.a10networks.com. ### Tech of the Week #1: Nanobot - The killer robots we want I’d like to welcome you to our new Tech of the Week column, where I will look at upcoming technologies each week. To kick things off I have chosen to write a Medtech piece, focusing on nanotechnology. Specifically, the nanobot that kills cancerous tumours. According to an article published in Nature Biotechnology, their new technology has a “DNA aptamer that binds nucleolin… and the blood coagulation protease thrombin.” This serves “both as a targeting domain and as a molecular trigger for the mechanical opening of the DNA nanorobots". That’s right, these DNA nanobots target the tumours and immediately get to work. Scientists are injecting them into the body, via syringe, at a staggering 10 trillion nanobots per injection. How Does the Nanobot Technology Work? The non-human trials have proven the Artificial Intelligence’s potential by not causing clots in other parts of the body; the nanobots solely hit and starve the cancerous cells they were aiming for. Apparently, scientists also demonstrated the nanobot's ability to ignore healthy tissues on its way to its objective. Most tumours work in the same way, so successful blood clotting in one cancer should replicate effectively across all cancers. The technology works as follows: Imagine you have a rectangle of cloth, and inside are all of the drugs necessary to encourage tumour death. DNA is then used to “sew” the rectangle shut, only opening when coming in contact with a trigger. The trigger is a chemical signature found only on the outside of the cancerous tumours. Once it hits this, the package opens up and delivers the payload. After injecting the site with healing medicines, the nanobot coats the inner walls of the blood vessel with a clotting protein. In turn, this cuts off the blood supply and thrombosis of the tumour occurs, shrivelling it up and killing it. You don’t need to understand all the medical terms to see this technology has vast potential to help us live longer as a species. I’m not advocating the living forever dream, but this would be of benefit to every single one of us in one form or another. Treatment Trials The most exciting part of this breakthrough is its success in mice with human tumours. In the trials, the life expectancy of the mice more than doubled as a result of this treatment. It should be noted that the process behind the technology works better in some cases than in others. Tumours with a strong supply of blood, for example those found in melanoma, stand a better chance of totally regressing.
In the melanoma trials, 3 out of the 8 mice tested showed complete regression. A second round of trials has been carried out on a larger specimen, the Bama miniature pig. Sharing anatomical and physiological similarities with humans makes these animals the perfect stepping stone towards approval of human trials, although more work needs to be done before this can happen. Whilst this isn’t a particularly new idea (trials involving DNA manipulation in cockroaches were conducted back in 2012), it has certainly come on in leaps and bounds. Even as recently as 2016, scientific documentation noted that AI bots were a thing of the future. Technology at the time hadn’t progressed to what we see here today, using purely organic methods to conduct trials. Potential for the Future The potential for this treatment reaches beyond just cancer. It could be used to treat a range of ailments, from the likes of asthma to arteries narrowing from a build-up of fatty deposits. Technically the nanobots could visit parts of the brain and, using neural pattern identification, deliver solutions to mental health problems. Of course, this kind of treatment is a long way off and still in its infancy, but changing the drugs on board, as well as the key that unlocks the package, would have a far-reaching impact. Due to risk factors such as a nanobot going into the brain of the patient and causing a stroke, it could need some refining before we see human trials approved. Nevertheless, this is an exciting breakthrough in the fight against cancer. Next in the series, Wednesday 28th February 2018 Tech of the Week #2: Iron Man Style E-Skin With advances in Augmented Reality coming along further each day, we are now looking at electronic skin that can manipulate devices around you, from the physical lightbulb to a virtual architecture plan. Be like Tony Stark with this ultra-thin, 1.7-μm-thick polyimide foil, wearable technology… Is there a gadget or break-through technology you would like to see reviewed? DM me on Twitter or LinkedIn using the buttons below. ### Comms Business Live: Peripheral Product Battles Episode: Peripheral Product Battles! What makes a good peripheral addition to your portfolio? Broadcast Date: 19th Feb 2018 Series 2 Episode 6 In this episode David Dungay, Editor of Comms Business, will be trying to answer the ultimate question, what makes a good peripheral product for comms and IT resellers to add into their portfolio? With guests Sean Dixon, Head of Channel at Fidelity Energy, David Hall, Sales Manager at Netpay and Simon Fabb, Co-Founder of Lease Telecom. ### How to Overcome Hybrid IT’s Common Challenges Recent times have seen almost every business become infatuated with the great allure of the cloud and its many benefits. As a result, we have seen organisations of all shapes and sizes make systems migration a priority. According to the SolarWinds IT Trends Report 2017: Portrait of a Hybrid IT Organisation, 92 percent of organisations surveyed have migrated critical applications and IT infrastructure to the cloud over the past year. However, amidst the rush to migrate systems infrastructure up and out of the data centre, it is clear that challenges associated with the cloud and hybrid IT are beginning to emerge and cause issues for many businesses. It is essential that organisations and IT professionals now take note and learn from these common challenges, so they can benefit from their hybrid IT experience.
Here are three of the most recurrent challenges that we are now seeing in systems migration, and, of course, how to overcome them: Don’t run before you can walk The SolarWinds IT Trends Report found that 22 percent of U.K. IT professionals surveyed have had to bring back on-premises applications and infrastructure that had previously been migrated to the cloud. With the primary reasons for doing so ranging from security and compliance concerns to worsened performance, it would appear that some IT professionals are getting a little overexcited, hurriedly migrating their systems to the cloud without predetermining whether or not the cloud will provide any real benefit. But using the cloud to enhance services is the whole point, right? Pre-testing, for both performance and security impacts, should always be fundamental to any migration strategy. A comprehensive monitoring toolset that provides visibility into both on-premises systems and the cloud is crucial to accurately determining how any workload, application, or piece of infrastructure will perform in the cloud. This is because monitoring will establish baseline performance metrics that can be used to quickly conclude where a workload is best suited. Visibility can replace authority Hybrid IT environments have opened up a whole new world of benefits to IT professionals. However, there is one major issue that is fundamentally changing how they can operate. The SolarWinds IT Trends Report 2017 found that the number one challenge created by hybrid IT is a lack of control and visibility into the performance of cloud-based infrastructure. This issue is not only damaging the effectiveness of hybrid IT environments but is also a massive barrier to migration for organisations that remain solely on-premises. When a performance issue occurs in an on-premises environment, IT professionals are blessed with the authority to freely search for its root cause. Whereas in hybrid IT environments, it is problematic to determine where accountability lies and where to point the finger. As an internal IT professional, you may be confident that everything is okay on your end, but the cloud service provider (CSP) may claim the same. To overcome this challenge, IT professionals must adapt and learn that visibility can replace authority. But to achieve clarity, and ultimately control, it is critical to perform comprehensive hybrid IT monitoring. This will help you to truly understand how workloads are performing in the cloud, and why they are performing that way. Empowered by the ability to see the cause of the issue, you can then accurately and confidently communicate it to the CSP. This will not only resolve the issue faster but also make sure that relations between internal IT professionals and CSPs are strengthened, helping to ensure system uptime and the creation of lifelong friendships. A culture of continuous learning In today’s on-demand world, no matter whether an application service is hosted on-premises or in the cloud, the end-user will always expect to receive a speedy and high-quality experience. But this level of service can only be guaranteed if the IT professional tasked with managing the environment has a comprehensive understanding of the technology they are working with. [clickToTweet tweet="According to a report, 45 percent of those surveyed do not believe that #IT professionals entering the workforce today possess the #skills necessary to manage #hybridIT environments. 
#Cloud" quote="According to the IT Trends Report, 45 percent of those surveyed do not believe that IT professionals entering the workforce today possess the skills necessary to manage hybrid IT environments."] It may come as a surprise, but in hybrid IT environments, this isn’t always the case. Cloud technologies have brought with them an added degree of complexity that many IT professionals did not prepare for. According to the IT Trends Report, 45 percent of those surveyed do not believe that IT professionals entering the workforce today possess the skills necessary to manage hybrid IT environments. But this doesn’t mean that we are all doomed. Learning on the job isn’t exactly a new concept, nor is it anything to be scared of. This applies to the most seasoned IT professionals, as well as the greenest. It is therefore now crucial for businesses and IT professionals to help ensure that they are prioritising the development of meaningful, usable skills within a workplace culture of continuous learning. This will make not only sure that the full potential of hybrid IT can be enjoyed, but also that future disruptive technologies are not all that disruptive. Thrive in the new era of work The cloud has marked the beginning of a new era of work, and nowhere is this truer than in the data centre. The rate of systems migration to the cloud is not looking to slow down anytime soon, and if these challenges are not addressed, then your organisation will most definitely be left behind. Implementing the aforementioned solutions to overcome the everyday challenges of hybrid IT will go a long way in making sure that your business’s cloud strategy is built to last. ### Going for gold in the Cyber Olympics: a look at cybersecurity in Pyeongchang High-profile events like the 2018 Winter Olympics currently taking place in Pyeongchang, South Korea, are a hot-spot for potential cybersecurity threats. Cyber criminals often use these large gatherings of people and technology as a means to steal personally identifiable information (PII) or harvest users’ credentials for financial gain. The likelihood of these attacks taking place is now so high that US-CERT issued a bulletin ahead of this years’ Olympics reminding travellers to be aware of both cybersecurity and physical security risks – a warning we’d never have had twenty years ago. [clickToTweet tweet="Dubbed the ‘Olympic Destroyer’, a #cyberattack hit the #Olympics’ #computer #systems just before the 2018 Pyeongchang Games’ opening ceremony, crashing the internal #internet and #WiFi." quote="Dubbed the ‘Olympic Destroyer’, a cyberattack hit the Olympics’ computer systems just before the 2018 Pyeongchang Games’ opening ceremony, crashing the internal internet and Wi-Fi."] Despite this, it was a very different form of attack that the International Olympic Committee (IOC) needed to worry about at the start of the Games. Dubbed the ‘Olympic Destroyer’, a cyberattack hit the Olympics’ computer systems just before the 2018 Pyeongchang Games’ opening ceremony, crashing the internal internet and Wi-Fi. What do we know so far? While it seems as though the shutdown didn’t disrupt any of the Olympic activities or the opening ceremony itself, it has been revealed by cybersecurity researchers that the attack was aimed at data disruption and involved a brand new strain of malware which has only disruptive capabilities – similar to that of the Bad Rabbit ransomware. 
From this, we can gather that the real intent of the attack was not to steal data, as originally thought, but was likely intended to disrupt the Games and bring embarrassment upon its organisers. This is where the purpose of this malware varies to the types of ransomware which proved popular for threat actors looking to make financial gains last year. Worryingly, however, this malware follows numerous warnings to Olympics organisations after alleged Russian-endorsed cyber-attacks and phishing attacks were spotted from suspected cyber-espionage group Fancy Bear. They are also known as APT28 – the group which was condemned in 2016 for stealing information from the World Anti-Doping Agency (WADA) about US athletes and publishing it online. Could it happen again? It is currently unclear how these threat actors accessed the Olympic systems, however, there is a very real possibility they will be back. While the damage caused by the outage was seemingly minimal this time, the attackers apparently left a ‘calling card’ on the network, threatening a return to perform destruction, leave computer systems offline and wipe remote data. Alongside this, researchers from McAfee’s Advanced Threat Research team have identified a new implant named Golden Dragon, which is being used to target organisations involved in the Games. Similar to an implant previously used to gain access to targeted victims’ systems and gather system information, this implant could allow threat actors to extract valuable data from the Olympic systems. Who is the culprit? Attribution of the attack is currently unclear and, at this point in time, it is too early to say whether this was a nation-state attack or that of someone looking to show off their cyber skills on the big stage. Rumours are circulating about how the most obvious culprits may be North Korean or Russian threat actors, given growing tension between North Korea and the USA as well as Russia’s ban from officially competing in this year’s Winter Olympics. However, none of these theories have been confirmed. The Fancy Bear hack team should also be considered frontrunners when it comes to attribution, following a tweet that was made in early January by the Fancy Bear Twitter account threatening the IOC and WADA. Hours after this tweet, the same account posted a link to the Fancy Bear domain which hosted leaked information including a set of apparently stolen emails that purportedly belong to officials from the IOC, the United States Olympic Committee and third-party groups associated with these organisations. While attribution is difficult at this point in time, organisations involved in or linked to the Olympics need to be aware and prepare for another potential attack. With the motives behind these attacks unclear, it is important that cybersecurity chiefs remain focused on understanding the tactics, techniques and procedures (TTP) of a threat actor whilst keeping an eye on the evolution of threats in order to assess intent and identify potential future attacks on the Pyeongchang Games. ### #PointOfPrivacy – Ep 2 – GDPR Episode: GDPR Broadcast Date: 16th February 2018 Series 1 Episode 2 Can you keep a secret? Bill Mew is joined by legal experts Frank Jennings and Eduardo Ustaran as they discuss GDPR, Data Privacy and what Brexit will mean to how Data is handled. 
### #TheCloudShow – Ep 3 – Future of Cloud Service Providers Episode: The Future of Cloud Service Providers Broadcast Date: 16th February 2018 Series 1 Episode 3 Compare the Cloud are peeling the layers back on the Cloud Industry. Answering the questions on the lips of both providers and customers – hosts Jez Back and David Organ will cover the latest trends and best interviews in this space. ### Separating cybersecurity facts from fears Digital attacks and data theft are on the rise. As a result, many businesses are growing increasingly concerned about the safety of their systems. But are preventative measures being put in the right places? Are companies right to fear AI, for example, or is it just misplaced paranoia? From malware and ransomware to Wi-Fi security breaches, it’s time to separate fact from fiction in cybersecurity and data vulnerability. Artificial Intelligence A recent survey by Avast Business found that 46% of respondents had concerns about security problems with Artificial Intelligence, despite machine learning and predictive algorithms so far maintaining a positive track record in data use and protection. Current plans for further innovation include AI for the energy industry that can self-heal whole networks in the event of bugs and hacks, though at present this has yet to be perfected. So why are we so afraid of this technology? Speculation in pop culture about the possibility of AI as an adversary to humanity seems a likely cause, but a more legitimate complaint is that nothing is unhackable – even the most intelligent artificial system could be open to attack. As such, AI will have to continue to prove itself as a safe and secure development for some time yet to put the fears and uncertainty to rest. Ransomware [clickToTweet tweet="96% of respondents stated that they were concerned about #malware, #spyware and #ransomware, and of the remaining 4%, not one said that they had no concerns at all about these things #cybersecurity #cybercrime" quote="In the same survey, 96% of respondents stated that they were concerned about malware, spyware and ransomware, and of the remaining 4%, not one said that they had no concerns at all about these things."] In this instance, people are right to be concerned. While only 57% cited their level of concern as ‘very worried’, data shows that ransomware attacks increased 6000% in 2016 and 50% of affected businesses had to pay hackers more than $10,000 (over £7,000) to get their data back. 20% spent more than $40,000 – around £30,000 – so here it is the businesses who aren’t so worried that need to rethink their concerns. Ransomware hits a company every 40 seconds worldwide, with the overall cost of attacks continuing to grow. On top of a need for high-grade internet security software, businesses need to train their staff to recognise threats like these and have processes in place for dealing with them. Mobile Security 91% of businesses surveyed cited concerns about mobile security – but research by CheckPoint has found that only 20% of enterprises have experienced a mobile breach. 94% of security professionals surveyed by CheckPoint said that they expect the frequency of mobile attacks to increase, though currently there’s no sign of a rapid increase to mimic that of malware and ransomware. Businesses are wise to seek protection, all the same, encouraging individuals to install digital security software on their mobile devices just as they would on a desktop computer.
Even things as simple as setting up automatic locking during periods of inactivity, and automated data wiping after multiple failed logins, can be the difference between data cybersecurity and a significant breach. Wi-Fi Breaches Though many individuals and businesses have concerns about Wi-Fi cybersecurity, it doesn’t stop 75% of people sharing data over unsecured networks – and 95% of millennials admit to sharing personal information over public Wi-Fi systems. With a £30 antenna and the right software, a hacker sitting a mile away can steal passwords, emails and other data shared through unsecured networks like these in seconds, so failure to take threats seriously is no joke. Businesses are increasingly at risk from Wi-Fi cybersecurity breaches as employees access work-related files and sensitive data from personal devices or public networks, making this yet another area where business security should be stepped up in anticipation of increased attacks. Use of unlimited data plans can negate the need to use public Wi-Fi altogether. Using SSL connections and VPNs can also help to reduce the chance of data theft. “Virtual Private Networks allow individuals to safely, remotely access protected information stored in private networks or cloud applications – for example when working away from the office,” says Greg Mosher, Vice President of Product and Engineering at AVG Business by Avast. “The transmission of data between the device itself and the local unsecured Wi-Fi access point is made secure. This aids security anytime someone is accessing sensitive data and supplying their credentials, including cloud applications like Google docs or Salesforce.” Corporate data theft: the figures £9.2 billion is lost to cyber-theft of IP and £7.6 billion to cyber-espionage each year in the UK alone. 66% of small businesses have fallen victim to cybercrime, but 20% continue not to use computer cybersecurity software. Only 42% of ransomware victims can fully recover their data, even if they pay the ransom. While it’s no surprise that businesses of all sizes have fears about an array of potential cyber threats, what is surprising is that there is still such a lack of investment in defence systems and internet security. Hackers don’t discriminate: the last few years have seen system hacks affect everyone from the NHS to eBay, organisations we expect to have the highest levels of internet security. But, as the NHS data crisis showed us, it’s a lack of even basic IT security that is still leaving personal data exposed. For those who remain complacent, disaster may be just around the corner. For those who wonder if their fears are misplaced, it should be clear that now more than ever, robust security measures that can keep up with changing technologies need to be implemented across the board. ### Will a cloud-based CRM future-proof your business? Needless to say, in order for a business to thrive in 2018, it must be prepared to utilise the latest technology. Whether it’s big data analysis, sales forecasting, or even social media and digital marketing, technology holds the key to communicating with consumers on their terms, in the least intrusive way possible, and giving them a clear and convenient path to the eventual purchase. Modern technology advances at such a rate, however, that businesses can often find themselves unable to stay on trend.
It may take a significant investment of time and money to implement a new system across a company, so who can blame a business for being sceptical of adopting smart systems if this technology will be redundant in just a couple of years? This problem is one that plagues prospective users of Customer Relationship Management systems in particular. The CRM paradox A CRM implementation is perhaps one of the most transformational yet complex product adoptions that an organisation can undergo, with virtually every department required to cleanse and migrate data, train in using the new system, and then catch up with the backlog of tasks delayed by the change in process. It can be a while before the business really starts to benefit from the new CRM, but this is well worth the sacrifice, particularly if you ensure that your product adoption is future-proofed. [clickToTweet tweet="A #CRM implementation is perhaps one of the most #transformational yet complex product adoptions that an organisation can undergo" quote="A CRM implementation is perhaps one of the most transformational yet complex product adoptions that an organisation can undergo"] Once a business has identified it needs a CRM, the first stumbling block it may encounter is whether its solution should be on-premise or cloud-based. It would be irresponsible to advise either way on this, as all businesses are different and should adopt a product that best suits their needs, but there are clear advantages and disadvantages to both. Should I choose an on-premise or cloud-based solution? Those with a long history in business will be familiar with on-premise CRM systems, which are installed on local networks and usually administered or developed by a permanent member of staff. A local system is generally inaccessible for anyone outside of the network, which makes it a great solution for close-knit teams that don’t need remote access to the CRM. It also helps with security, as only permanent staff should have access to the system. Some businesses also prefer a capital expenditure (CapEx) versus a subscription-based model (OpEx) for accounting purposes, so the decision will involve not only sales, marketing, IT and Ops but also Finance. On the other hand, on-premise solutions can often be left behind as technology advances. Periodic updates are available, but many businesses choose not to upgrade because of lack of compatibility with their customised solution, as well as fear of loss of data or disruption to routine. What’s more, with data stored on local systems being inaccessible from outside of the network, this can sometimes pose a bigger data risk than CRMs accessible from online portals. Users are often tempted to share customer data via email or external communication for networking or sales calls, heedless of the fact that this data could be compromised. By contrast, cloud-based CRM systems are accessible from anywhere. With a simple-but-secure online portal, two-factor authentication, and a unique login for each user, data is stored safely in the cloud with no need to use any third-party communication platforms to share information. Likewise, administrators and developers can access a business’s CRM remotely should there be any technical issues or requests for a customised solution. Businesses often take advantage of the contract market for cloud developers, saving money on a permanent staff member who can spend a lot of time ‘on the bench’ when things are running smoothly.
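To make the secure-portal point above a little more concrete, here is a minimal sketch of how a cloud CRM login might verify a one-time code as its second factor. It assumes the Python pyotp library; the function names and flow are illustrative only, not how any particular CRM vendor implements it.

```python
# Minimal sketch of a TOTP second-factor check for a cloud portal login.
# pyotp is assumed; helper names and flow are illustrative, not vendor-specific.
import pyotp

def enrol_user() -> str:
    """Generate a per-user secret, stored against the account at enrolment."""
    return pyotp.random_base32()

def verify_second_factor(stored_secret: str, submitted_code: str) -> bool:
    """Return True if the six-digit code from the user's authenticator app is valid."""
    totp = pyotp.TOTP(stored_secret)
    # valid_window=1 tolerates a little clock drift between server and device
    return totp.verify(submitted_code, valid_window=1)

if __name__ == "__main__":
    secret = enrol_user()
    code = pyotp.TOTP(secret).now()             # what the authenticator app would display
    print(verify_second_factor(secret, code))   # True
```

Because each user has their own secret, a check like this reinforces the unique login mentioned above: a stolen password alone is not enough to reach the customer data held in the portal.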
The real benefit of a cloud-based CRM system, however, is that it is almost inherently future-proof. With live updates released regularly, online platforms are constantly being optimised as well as protected against the latest security threats. Whether it’s an improved user interface, adherence to the latest data protection laws, or a major product update, cloud-based CRMs utilise regular, subtle updates to relieve the burden on the user and minimise disruption to the business. Integration is also made much simpler when storing data in the cloud, with many web-based systems making product integration as convenient as ‘drag and drop’. This is especially useful given the value of marketing via new social media services. CRM platforms, particularly those boasting marketing automation features, have a duty to embrace the latest communications technology for users to get the most out of marketing. In comparison, businesses often find themselves using on-premise CRM solutions that predate the advent of modern social media platforms, so integration is almost impossible without a major product update. To reiterate, this does not mean that all businesses would benefit from a cloud-based CRM. In industries where the holding of personal or sensitive data is paramount, a local system may be necessary for regulatory and compliance purposes and to mitigate security risks, whilst other businesses must consider budgetary requirements. Is future-proofing ever truly possible? While using a cloud platform can help to future-proof your business, there really is no way to predict the direction in which technology will go. Cloud-based CRMs merely offer you the option of constantly staying up to date, giving you every possible advantage as technology advances. For example, nobody could have predicted the recent meteoric rise (and subsequent fall) of Bitcoin value, and by the time CRMs evolve to be Bitcoin transaction-friendly, the cryptocurrency could have already collapsed. Nevertheless, businesses that are big enough to justify the financial outlay and time investment, and benefit from the seemingly limitless product integrations, should give serious consideration to using a cloud-based CRM system. It at least ensures future-proofing until the next big breakthrough in technology — as for what that will be, your guess is as good as mine. ### Common Online Scams To Watch Out For In 2018 Online scams are big business. In fact, there was over $1 billion in victim losses in 2016 due to online scams. This shows that online scams are as popular today as they ever were – but the scams aren’t the same as the ones that you were told to avoid a few years ago. This is because the internet is constantly changing, from new cloud developments to the introduction of Robotic Process Automation, and this includes new and updated scams that you may never have heard of before. Here are three online scams that you can expect to hear about in 2018. ICO Scams It is very likely that you have heard about the cryptocurrency market recently – after all, it is all over the news! However, many people haven’t heard the term ICO, which stands for initial coin offering. An ICO is a popular funding mechanism for projects, and many companies use ICOs to sell digital tokens.
This is normally a good thing, as it can benefit businesses – but there are pros and cons to investing in cryptocurrency, and one of the main cons is the possibility of being scammed. There are lots of legitimate ICOs online, but there are also lots of ICO scammers, which is why it’s important to learn how to avoid ICO scams. Advance Fee Scams Advance fee scams have been around for decades, but they are still one of the most popular online scams. Advance fee scams normally start with a scammer sending “free” products, services or money to you, and then they ask for a small fee to process the order. Once you have paid the fee they will retract the free items, leaving you with nothing. This scam used to be fairly easy to avoid, but in recent years scammers have gotten smarter and now they often pose as someone you know (such as a friend or an accountant) so that you are more likely to trust them. Social Media Scams Social media scams are also very common. Normally a hacker will hack someone’s profile after they visit a suspicious link, and then they use the account to message other people asking for money and personal details. This one can be tough to avoid, but make sure that you always question anyone asking for your personal details, even if they seem friendly. [clickToTweet tweet="Make sure that you stay safe by being aware of #ICO #scams, #AdvanceFeeScams and #SocialMedia scams." quote="Make sure that you stay safe by being aware of ICO scams, advance fee scams and social media scams."] Online scammers steal money from tens of thousands of people every year. Make sure that you stay safe by being aware of ICO scams, advance fee scams and social media scams. ### Rob Birkett - Bett Show - #DisruptiveCIF Rob Birkett talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### Jez Back - Bett Show - #DisruptiveCIF Jez Back talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### Simon Ratcliffe - Ensono - Bett Show - #DisruptiveCIF Simon Ratcliffe from Ensono talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### Martin Root - Yada Events Technology - Bett Show - #DisruptiveCIF Martin Root from Yada Events Technology talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### Why Love is the Best Phishing Weapon Roses are Red, Violets are Blue, With this e-card, I'm scamming you. Who doesn't love love? And who doesn't want to know that they are loved? And, especially, who doesn't want to have a secret admirer declaring their undying love for us?  
Human Beings are a marvel of evolution - we are by far the most successful species on the planet, top of the tree of natural selection. Just ask Elon Musk how great we are as a species - it's no wonder that he is planning on preserving the human race by ensuring that we become a multi-planet species (negating the dangers of being wiped out in a single hit, which is much more likely if we remain a single-planet species). But despite the greatness of the Human Being, we're by no means flawless. The founding president of Facebook, Sean Parker, admitted back in November 2017 that Facebook was built to exploit a "vulnerability in human psychology." Facebook was designed so that all the interactions and features on the site would give users a little hit of dopamine to keep them online and keep on interacting with the site to consume as much of their time and attention as possible. And there's another recent example of human psychology being exploited - perhaps we're really not so great after all. Just ask Elon Musk how stupid we are as a species - if you now own a (not a) flamethrower, then shame on you for falling for such a ridiculous stunt. The hype went through the roof, and the flamethrowers flew off the shelves. As did the accompanying fire extinguisher. As The Boring Company's website says, "Buy an overpriced Boring Company fire extinguisher! You can definitely buy one for less elsewhere, but this one comes with a cool sticker and the button is conveniently riiiight above." How many people hit the buy button and completed their purchase of a flamethrower (and exorbitantly priced fire extinguisher --- with a cool sticker!!! Like, OMG!!!) just so they didn't miss out on 'the' cool thing of the moment? Perhaps even the coolest thing of all time? After all, there are only 20,000 ever going to be available! I am sure that FOMO - Fear of Missing Out - was behind a huge number of the purchases. But let's get back to romance. Scammers also know how much we want to hear from our secret admirers and they want to exploit flaws in human psychology. As highlighted in PhishMe's "Enterprise Phishing Resiliency and Defence Report", "e-cards are one of the strongest vehicles to deliver email threats, having risen to the top of our susceptibility charts, with average response rates nearing the 25% mark. Valentine e-cards simulations have pulled even higher rates, at times reaching above 50% response." Strong emotional responses like love (and, let's be realistic about it, lust) can keep us from our rational thought processes just long enough for us to react without thinking. How many new flamethrower owners suffered buyers' remorse in the minutes, hours and days after clicking the 'buy' button? (I mean, sure, if you're still really happy with your flamethrower, then none of this applies to you and you made a really good and sensible purchase and it was in no way flawed. Totally.) [clickToTweet tweet="'our brain needs to be conditioned to fully become a security asset'" quote="'our brain needs to be conditioned to fully become a security asset'"] John ‘Lex’ Robinson, anti-phishing and cybersecurity strategist at PhishMe offers the following comment on why our brain needs to be conditioned to fully become a security asset: “The majority of security threats out there do not rely on sophisticated malware or technical vulnerabilities, but the psychology and behaviour of people.
Valentine’s Day is the perfect set-up for malicious actors to craft simple but effective email messages that anyone without adequate conditioning at a cognitive level would deem harmless, genuine and acceptable. Our brain hates change and the success in fighting these seasonal threats lies in our ability to condition those deeply ingrained behaviours to instil a sense of genuine alert in our day-to-day communications”. With so many more people on the planet now I expect that Cupid has had to adjust his methods - he probably sends out eCards himself, and I expect he's already replaced his bow and arrow with a flamethrower. The advice to take care when opening emails on February 14th, of course, applies throughout the year. But PhishMe's caution to be extra aware and vigilant on Valentine's Day is certainly justified. Roses are Red, Flamethrowers are hot, Cupid sends out eCards, But I do not. ### Bill Mew - UKCloud - Bett Show - #DisruptiveCIF Bill Mew from UKCloud talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### Joachim Horn - SAM Labs - Bett Show - #DisruptiveCIF Joachim Horn from SAM Labs talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### Mark Bollobas - Vengit - Bett Show - #DisruptiveCIF Mark Bollobas from Vengit talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet based TV channel. ### What's in a name? The coming rise of gTLDs. "What's in a name? That which we call a rose, by any other word would smell as sweet." These are the immortal words from Shakespeare's Romeo and Juliet, spoken to Romeo by Juliet. In the past, I have often had many women reciting love poetry to me - but talking about that would be going off on a tangent and missing Juliet's point. Juliet's point being - if Romeo's last name were not Montague, then there wouldn't be such a kerfuffle over them being in love. Names shouldn't matter - just their love. But as we know, that wasn't enough and things didn't go too well for the young couple. But do names matter? Should names matter? On the interweb, just to prove Juliet wrong, names matter. A lot. In fact, they are perhaps the most important thing. It's been reported that insurance.com was the biggest domain name sale in history at $35.6m - but that had a functioning business behind it. The largest domain-only name sale in history was --- well, let's just say it was a three letter .com that began with an 's' and ended with an 'x'. I'm sure you can fill in the blank. And that sold for $13m. Yes, that's right. Thirteen million dollars - just for a name. The market in domain names is really interesting - and is only just now becoming ripe for further exploration. It used to be that having a .com or a .net address was such a big thing.
But with all the TLD (Top Level Domain) names now available, so much more is possible. gTLDs (generic TLDs as opposed to country code TLDs) aren't a new thing - .biz and .info were activated in 2001, and a whole raft of new gTLDs began in 2012 and went live by the end of 2013 - but, so far, they haven't caught fire like .com. To date, over 1,200 new domains have been delegated including domains such as .cloud, .online, .shop, and .blog. The man who sold that three letter dot com name for $13m, Rick Schwartz, wrote in a blog entry at the end of 2015 that gTLDs are mostly worthless. There would be a few outliers, he says, that would get traffic and could be valuable (e.g. home.loans sold for $500,000 in January 2018) - but in almost every case a .com would be the better bet and it's not worth wasting time and money on gTLD domain names. I remember really well a technology 'expert' talking a number of years ago about the threat that online music would pose to CD sales. The expert confidently predicted that people like to have something tangible - they want the physical product in their hands - so while online music had seen a huge growth by that time, it was a bit of a phase and it wouldn't mean much in the long run for CD sales. And I think we all know how that expert's opinion has turned out. Similarly, whilst he has made a fortune in domain names, I can't help but think that Rick Schwartz is like the music expert: he's seen something and been used to something for so long that he can't see anything else. And whilst we have to acknowledge his success and expertise, there always comes a time for a new generation to come along and to do things differently. And so, I think, it will prove with gTLDs. Over the last few years we've seen many tech companies registering and using .io domains - and whilst not a gTLD (it's actually a ccTLD - country code TLD - for the British Indian Ocean Territory) I think it shows the direction that the whole domain name market will go. Mou Mukherjee, Head of Registry Operations, .Cloud agrees by saying, "new gTLDs (generic top-level domain) will become commonplace as more recognised brands and startups start getting creative with the unlimited potential of this namespace." Of course, as someone with a leading role in a gTLD registry, we might expect some bias - but I do not think that's the case: rather, it's someone with relevant, up to date, expert knowledge pointing out a clear fact that the world will soon catch up with. Mou went on to say that, "looking forward to 2018, while only 6.4 percent of the 330.7 million domains registered are new gTLDs, the industry can expect to see more recognised and trusted brands dipping their hands into this namespace. With endless possibilities, expect to see brands coming out with creative domain choices to launch new initiatives such as cloud-based platforms, storefronts and communities." It is true that the internet is led by .com / .net sites and that the market for domain names is still dominated by .com sales - but that doesn't mean to say that this won't change in the coming years. .Com used to be the star of the show. But it won't stay like that forever. [clickToTweet tweet=".Com used to be the star of the show. But it won't stay like that forever." quote=".Com used to be the star of the show.
But it won't stay like that forever."] In a recent piece, I suggested that soon the term IoT (Internet of Things) could be cast aside, perhaps even by the end of this year, as what we currently call IoT becomes more and more synonymous with The Cloud itself. In speaking with people outside of the tech industry, I am aware of how many people know about "The Cloud" - but aren't exactly sure what it is. I haven't ventured further to inquire on their thoughts about IoT - but I am sure I would just get blank looks in return. In my mind, it is simple to envisage the near future when non-techy people ask their tech-savvy relatives, "how does my Alexa or [insert name here of whatever IoT device!] work?" - the simplest answer to give to the non-techy will be that it is the cloud that makes it work. While a .io domain name will seem very cool for a tech person, it will mean nothing to the average man or woman on the street. And whilst everyone will be aware of .com or .co.uk etc. it will surely become more and more normal to visit a .cloud website when looking for information on something powered by "The Cloud" in the near future. There are a lot of TLDs available - it's actually hard to see all of them thriving, but there are so many good TLDs that it's almost impossible to imagine that they won't come into their own sooner rather than later. As Mou has suggested, 2018 could be the tipping point where we see big brands taking the leap and showing the power that gTLDs have. ### Robust cloud adoption to foster the use of data centre power solutions Expected to witness some remarkable developments over the coming years, the global data centre power industry is a key component of the digitalisation trend. Data centre power consumption has changed considerably over a relatively short period of time. This can be attributed to the increasing adoption of cloud computing and IoT solutions and the subsequent rise in the number of data centres worldwide. By 2025, 80% of enterprises will migrate away from on-premises facilities in favour of colocation, cloud, or virtualisation alternatives, according to research by Gartner. Data centre facilities are among the most energy-intensive building types, utilising up to 50 times the energy per floor space compared to a typical commercial office building. In the U.S. alone, data centre spaces constitute around 2% of the country's electricity usage. Power is thus an essential consideration in the design and efficient functioning of data centres, and the supply that keeps them up and running cannot be an overlooked component. A simple power failure can have devastating impacts, especially with more and more enterprises turning to cloud and colocation services. A safe and reliable power supply and distribution system is critical to ensure the optimal functioning of a data centre and minimise economic losses. As per a research report compiled by Global Market Insights, Inc., the global data centre power market size is set to hit an annual valuation of nearly US$15 billion by the end of 2026. Currently, organisations like ABB Ltd., Black Box Corporation, Cummins, Inc., Caterpillar Inc., Eaton Corporation, Hewlett-Packard Company, Huawei Technologies Co., Ltd., Toshiba Corporation, Siemens AG, Schneider Electric SE, Mitsubishi Electric Corporation, and Vertiv Group Co. are some of the notable providers of solutions for data centre power requirements.
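To put figures like these into context, the back-of-the-envelope calculation below estimates the annual energy draw and cost of a single facility. The IT load, PUE (power usage effectiveness) and tariff are assumed example values for illustration only, not figures from the Global Market Insights report.

```python
# Rough, illustrative estimate of a data centre's annual energy use and cost.
# All inputs are assumptions chosen for the example, not reported figures.

it_load_kw = 2_000        # average IT equipment load in kW for a mid-size facility
pue = 1.6                 # power usage effectiveness: total facility power / IT power
tariff_per_kwh = 0.12     # assumed electricity price in USD per kWh
hours_per_year = 24 * 365

total_facility_kw = it_load_kw * pue
annual_kwh = total_facility_kw * hours_per_year
annual_cost = annual_kwh * tariff_per_kwh

print(f"Total facility draw: {total_facility_kw:,.0f} kW")
print(f"Annual consumption:  {annual_kwh / 1e6:,.1f} GWh")
print(f"Annual energy cost:  ${annual_cost / 1e6:,.1f} million")
```

Even with these modest assumed inputs the facility draws roughly 28 GWh a year, which is why efficient power supply and distribution systems sit so high on operators' agendas.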
Data centre power consumption to soar substantially through 2026 Data centres, on a global scale, utilise over 416 terawatt hours of electricity per year, which is around 3% of the world's total electricity generated. It is expected that major cloud service providers such as Amazon, Google, and Microsoft will continue to build massive hyperscale facilities across both developed and emerging regions to up their cloud computing game. Add to that the tremendous requirements of thousands of other tech companies, telecom networks and OTT platforms. With more and more facilities being built each year, the amount of electricity required to power them is increasing as well, creating a huge demand for efficient data centre power systems. Additionally, considering the current rate of cloud adoption, global data centre traffic is anticipated to triple in the next few years. This in turn will result in more centres being built, further augmenting the demand for data centre power components. Increasing multi-cloud adoption to boost PDU demand From large enterprises to multi-tenant and colocation data centres, the shift towards multi-cloud environments and hyper-converged infrastructure is forcing companies to rethink their power supply and distribution requirements to keep pace with the transformation. Consequently, the need for high-performance data centre power systems including power distribution units (PDUs), generators, UPS, and cabling infrastructure is greater than ever. Leading companies in the data centre power industry are continuously working on bringing novel technologies to the market. In January 2019, Eaton Corporation announced the launch of its new high-density rack power distribution unit (rPDU) platform in North America. The platform is best suited for environments which require high power density and advanced configurability. These rPDUs were apparently designed with data centre customers in mind and feature a flexible and configurable solution which allows them to optimise operations and minimise risk as power demand rises. In another instance, in July last year, Vertiv unveiled the Vertiv Geist rPDU featuring its latest patented Combination Outlet C13/C19 option, which is compatible with C14 and C20 plug types. What does the future hold for the global data centre power market? The global pandemic had a devastating impact on the manufacturing industry during most of 2020 due to the global slowdown in the semiconductor industry. Hikes in raw material prices, the shortage of skilled labour, and supply chain disruptions have been among the major roadblocks for industry players amid the current crisis. But the demand for cloud solutions jumped in the period, driving data centre use and sustaining the deployment of power solutions and components. While the data centre power industry continues to fight the pandemic and its impacts, it will need to overcome numerous other challenges in the near future. High initial investment is a leading factor which may limit the adoption of data centre power solutions, mainly among small and medium-sized enterprises. Market participants will thus need to design high-performance power solutions while exploring ways to minimise component prices. With more and more enterprises leveraging IoT-based solutions to meet customer demands, latency requirements will increasingly push them towards edge data centres located closer to end-users in emerging markets.
Cisco estimates that more than 94% of computing tasks and workloads will be processed in cloud environments by the end of 2021. As the demand for cloud infrastructure continues to amplify and service providers increasingly shift towards a cloud-based delivery model, more facilities will be required to accommodate power-intensive, high-density servers, strengthening the outlook for data centre power solution providers to a great extent. ### DeepFake - Machine Learning Hero or Villain? If you've not seen a DeepFake Video yet, or even heard of DeepFake Videos - you soon will. And that is despite Twitter and Reddit both banning them in the last few days. Technology drives our society forward - and yet, technology itself is passive. It's how technology is USED by humanity that matters most. It is said that many of the members of the Manhattan Project - the developers of the atomic bomb in the 1940s - were so caught up in the technology and the scientific breakthroughs that they were achieving that the wider questions of whether it SHOULD be done were largely pushed to one side. It was only after they saw the effects of their creation that they had second thoughts. Whether the creators of DeepFakes have the same soul-searching questions remains to be seen. Deepfakes are a marvellous example of Machine Learning (ML) given a practical, obvious, easily-understood, real-life application. The fact that most of the current uses are somewhat "dodgy" is, however, a real concern. It's a well-known fact that the adult industry has historically been a leader in the technology field. It's simple economics - the adult industry is so huge and worth so much that it can have such a huge influence on business and wider society. Kids and young adults today brought up on Netflix and Amazon Prime (and similar music-based streaming services) can be forgiven for finding it strange that we used to have physical representations of the entertainment that we bought. For anyone who thinks that Blu-Ray & DVD are so passé, the very existence of VHS and Betamax would be a complete mystery. Betamax was the superior technology - but it had very stringent rules and regulations which hampered more 'creative uses' of videotape technology. VHS, despite not being as good, was more open and 'liberal'. As such, the adult industry had no choice but to go down the route of VHS. The rest (so far as tape-based video media is concerned) is history: Betamax died out with little fanfare while VHS reaped all the spoils. The 'mainstream' publishers had no qualms whatsoever in following the lead that the adult industry set: even though they knew Betamax was the better product, they knew that VHS had become the better bet. The technology landscape is different now - the power isn't in the hands of a select few with $$$$ behind them. It's up to whoever has the right computing power, the right skills, and the right motivation to produce whatever they like. The wonderful thing about the democratisation of technology is that anyone can do whatever they have the ability to do. The terrible thing about the democratisation of technology is that anyone can do whatever they have the ability to do.
A DeepFake is essentially a video that transposes one person's head onto another person's body by using Machine Learning techniques. Perhaps the most reasonable use of this technology that we can discuss here is to create spoofs of movie stars appearing in other movies. Anyone who knows their memes would know that Nicolas Cage would be a prime use case for this new technology. And so it proved. Taking this a step further, the same techniques can be used in political satire. It has been said that the only true measure of a strong, liberal society is whether satire is allowed free rein. In the UK, particularly, where political satire has been a staple on TV for almost as long as TV has been around, it might seem that questioning whether satire is allowed or not is simply silly. But that's to take for granted the society we live in. In other countries around the world, satire doesn't have the same free hand. Using DeepFake technology for purely satirical purposes in the UK would perhaps be seen as clever and humorous - yet the same technology used in other countries could be seen as an act of treason or sedition - and therefore result in the consequences of such acts. But let's just imagine that satire is OK... and then let's take it just a step further still. It is all too easy to imagine DeepFake technology that obviously ridicules a politician being subverted to portray that politician in a situation or a manner that misrepresents them completely. The Fake News controversy over the past few years centred largely on simple written content. This could surely be dwarfed by the effects of fake videos portraying people saying or doing things that they have not done, or would not even do, in reality. And the above are just the topics that we can safely talk about here. The 'creations' that have made DeepFakes infamous over the last few months have been much less suitable for a serious technology site such as this to talk about. But the point remains - DeepFakes are only possible because of the technology advances that have been developing over the last few decades. Whilst the software to produce DeepFakes has been made available freely, the hardware, and the individual skill to take full advantage of that software, are still limited to those that can exploit them. Without the right hardware and without the right skill, the results obtained are laughable. But with a bit of skill, a bit of motivation, and access to the right hardware, almost anything is possible. And that's where The Cloud comes in. Fifteen years ago - even less - the only way to exploit the best consumer processing power was a huge outlay to buy the hardware yourself and sit it on the desk in front of you. Now, you don't need the processing power sitting on top of your desk - it can reside in the cloud at the end of a broadband connection. GPUs in the cloud are already an obvious use case of The Cloud - Cloud GPU use over the next few years is going to get bigger and bigger. Alvin Feague of DeepFake.ML (DeepFake Machine Learning) is quite optimistic, however. Despite the less-than-tasteful origins of DeepFake Videos, he tells us that "the [machine learning] technology behind them is sound," and that, rather than the "slippery slope" some have used to describe the current deluge of DeepFakes, the technology is a "rocket ship about to take off!". "The Genie is out of the bottle," Alvin continues.
Although he doesn't expect distasteful DeepFakes to disappear, he does expect that more mainstream uses will come to the fore: just as adult content made VHS and the mainstream industry then reclaimed it, DeepFake Videos of a more "questionable" nature won't go away - but more palatable uses will be brought more and more into the mainstream. ### 4 Methods for Adopting Cloud Computing in 2018 Most technology professionals would agree that cloud computing is here to stay. With advancements in cybersecurity and a clearer understanding of how the cloud functions, a large number of businesses will refine their IT strategies and implement cloud-based solutions this year. In fact, recent studies from the International Data Corporation show that spending on cloud solutions has increased more than 20% over the last year. Here are four methods for adopting cloud in 2018. Before reevaluating your cloud strategy for this year or beginning the process of migrating to the cloud, it's essential to consider how the cloud has evolved and will continue to evolve throughout the rest of 2018. Here are four strategies and suggestions on how to carry out your plan for migrating to the cloud successfully. Start With Your Basic Business Data The simplest way to begin integrating cloud-based solutions into your business is to start with the small stuff, such as email and data storage. For many companies, it's likely you're keeping your documents safe and secure on an external hard drive or other data storage device. The cloud is an easy and quick alternative to a hard drive that cannot be lost, broken, or stolen. Previously, many companies avoided using the cloud because of security concerns. However, cloud security is better than ever and will only improve moving forward. While data breaches have happened in the past, they resulted in cloud companies taking steps toward improving their security. In fact, many would now argue that it's safer to keep your data in the cloud than on an external hard drive. Invest in Cloud Migration Services Moving your data to the cloud may seem like a daunting task, but there are services and vendors available that assist with the migration of your information quickly and securely. With this in mind, finding a cloud vendor that offers this service, or even paying for a cloud migration service, is well worth the investment to save your company time and ultimately money by making sure the cloud is implemented properly. These companies are able to create a cloud strategy specific to your needs, which can alleviate stress on your IT department if they don't have an appropriate level of resources. Consider Using a Hybrid Cloud Aporeto CEO Dimitri Stiliadis predicted that hybrid clouds will become more prevalent in 2018. The term "hybrid cloud" is fairly flexible in its definition, but essentially, it's one that utilizes both a public cloud provider and a private cloud platform or private IT infrastructure. The public cloud operates independently from the private cloud or private IT infrastructure, but they communicate with each other for the portability of data. This structure is beneficial because hybrid clouds allow for more control over the individual components and increase flexibility in terms of movement of information.
Your company will have more control over where particular information is stored, meaning your data will be easier to manage and more secure if it's primarily kept on the private cloud network. If you have the capacity to use a hybrid cloud, the flexibility and customization alone make it worth considering as a solution. Phase in Cloud Migration Over Time Thought-leaders haven't shied away from weighing in on cloud migration either. Oracle CEO Mark Hurd is well-known in the industry and is advising companies and prospective customers that while the industry is gradually migrating to the cloud, it's important not to rush it. Although cloud usage is growing rapidly, cloud solution companies like Oracle are still offering on-premises software, as opposed to pressuring customers to make the shift to the cloud immediately and completely. This approach is great for companies as it puts their needs and current capabilities first. It's incredibly important for companies to take their time discovering exactly what level of cloud-computing service is best for them and to take their time transferring their data. ### Mehram Sumray-Roots - Yada Events Technology - Bett Show - #DisruptiveCIF Mehram Sumray-Roots from Yada Events Technology talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet-based TV channel. ### Brendan Murphy - Civica - Bett Show - #DisruptiveCIF Brendan Murphy from Civica talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet-based TV channel. ### Lee Rendell - Kaspersky Lab UK - Bett Show - #DisruptiveCIF Lee Rendell from Kaspersky Lab UK talks to CIF and Disruptive Live at the Bett Show 2018. Cloud Industry Forum (CIF), the industry body focused on raising standards and improving transparency in the cloud sector, has forged a media partnership with Disruptive.Live, the business and technology internet-based TV channel. ### Taking Full Advantage of GPUs Available on Azure to Exploit 3D The growth of cloud services has been extremely prominent over the last few years and will continue to see a significant increase over the next 12 months. A recent quarterly report from Microsoft highlighted that Azure's revenue had grown by 98%, with server products and cloud services revenue also showing an increase of 18%. The rise in adoption of services such as Azure shows that businesses are looking to adopt the benefits of hybrid cloud, as this enables them to 'mix and match' the advantages of both public and private cloud. Once utilising the full benefits of the cloud with a hybrid solution, businesses can further take advantage of the features of the cloud and look to develop 3D applications within the cloud.
3D in the cloud The development of 3D processes in the cloud allows businesses to take full advantage of increased power, make the most of 3D components that are optimised for the cloud and benefit from the dynamic scalability of a cloud implementation. It avoids paying for idle computing resources and helps to accelerate workflows through mobile and global accessibility. 3D printing has revolutionised many traditional manufacturing processes by reducing the time taken from the design process to prototyping, and from sample to mass production. It is estimated that the 3D printing market will be worth $33 billion by 2023. Similar to AI and Blockchain, this technology has not yet been used to its full potential. 3D applications in the cloud are generally a much quicker option and reduce the drain on resources. Using a modern deployment mechanism, businesses would be able to extract the full potential of technology like 3D and, most recently, 4D printing. Supporting 3D applications Businesses can adopt 100% browser-operated software which enables full support for all CAD/3D applications to run in a browser without utilising or depending on computation resources from the end device. This means that users can access the applications provided from anywhere and on any device, irrespective of their location. There are two ways of providing CAD/3D applications in the cloud. In-house hosting environments can be connected to oneclick, or virtual machines can be deployed through Cloud Resource Manager, oneclick's interface for deploying virtual machines directly in the data centres of well-known IaaS providers such as Microsoft Azure. oneclick then uses the latest streaming technology for transmission to guarantee smooth working, even with low bandwidths. The computing and graphics work is performed by GPUs in the data centres of Microsoft Azure, which allows less powerful end devices to utilise resource-intensive software. Through developing 3D in the cloud, businesses can further benefit from the cloud and continue to optimise their business processes. There are companies such as oneclick that are helping businesses to use browser-operated 3D software, allowing them to take full advantage of 3D in the cloud. ### Cloud is a Constantly Moving Shark It's a well-known cliché that if you ask a hundred people what "The Cloud" is, you'll get a hundred different answers. And if you step out of The Industry and ask everyday consumers, it is very likely that many respondents will say that they have definitely heard of "The Cloud"... but they're just not quite sure what it is. Here at Compare the Cloud we (quite naturally) love "The Cloud" - it brings to mind a quotation from one of my favourite films. In the film "Annie Hall," Alvy Singer tells Annie, quite matter-of-factly, that, "A relationship, I think, is like a shark, you know? It has to constantly move forward or it dies." Luckily for this particular shark, "The Cloud" is moving forward. More than that, it's growing, developing and evolving.
Mou Mukherjee, Head of Registry Operations, .Cloud, has stated that "the meaning of 'cloud' for consumers and business users will shift, as it gets more known for the ability to provide dynamic capabilities such as real-time number crunching and self-learning." For instance, the cloud is now a crucial tool for Artificial Intelligence (AI) and Machine Learning (ML). 2018 is only a few weeks old, and all the indicators are that we are set for a year of significant growth and developments in these areas. There are many consumer applications which either already rely on, or will increasingly utilise, cloud and AI. These applications include crowdsourcing, virtual reality and product virtualisation, to mention just a few. According to IDC, in 2018, 50 percent of manufacturers will be collaborating directly with customers via the cloud. Mou continued by telling us, "take virtual assistants and bots for example. Having taken off in the consumer world, we will see 'virtual assistants' transforming our business lives. AI assistants are the future, from scheduling meetings, to taking and sending meeting notes, to providing real-time answers from sales and marketing platforms. These advancements are only possible with the evolution of the cloud." The buzzwords of yesterday will be the new revenue streams of today and tomorrow. More and more new technologies, underpinned by the cloud, will be utilised in the coming months. Manufacturing and industrial brands will be exploiting these not just for optimisation and automation, but looking at how these technologies can go even further beyond what's currently possible to create innovative services for their customers. In our 'always on', 'right now' society, the brands that can provide their customers with valuable real-time data will have a huge competitive advantage over those that lag behind. And as any business owner knows, word of mouth is an incredibly powerful tool to build - or crush - your business: utilise the Cloud to its fullest, otherwise it'll more likely be the latter than the former. IoT is almost synonymous with "The Cloud" - could part of the evolution of Cloud in the coming year be that the term "IoT" becomes obsolete? Not the technology, and not the applications of that technology - just the acronym itself? In an industry that loves acronyms, it is perhaps difficult to imagine one of those precious TLAs being killed off, but IoT is so reliant on Cloud technologies, and what we understand as "The Cloud" is expanding so much, that this is a real possibility. But we live in the here and now, so - for the moment - we'll continue to use the term IoT. According to Salesforce's "The State of Service in Manufacturing" report, manufacturers' most common use for IoT is diagnostic data. In the report, 71% of manufacturers agree that IoT data is key to increasing customer retention. Mou Mukherjee predicts that this percentage will be close to 100% by the end of 2018. The Cloud isn't going anywhere.
Yes, it will change and evolve. It will be a constantly moving shark: it will fully absorb areas that we currently consider separate, and it may well shed things that have been part of the landscape for a long time. One thing that is for certain, however, is that in a year's time, if you ask a hundred different people what is meant by "The Cloud", you will still more than likely get a hundred different answers. ### Dell EMC Expands Server Capabilities for Software-Defined, Edge and High-Performance Computing Industry-leading 14th generation PowerEdge servers, the bedrock of the modern data centre, add AMD EPYC processor-based systems to the portfolio. UK, London – Feb. 6, 2018 Dell EMC announced three new servers designed for software-defined environments, edge and high-performance computing (HPC). The PowerEdge R6415, PowerEdge R7415 and PowerEdge R7425 expand the 14th generation of the Dell EMC PowerEdge server portfolio with new capabilities to address the demanding workload requirements of today's modern data centre. All three rack servers with the AMD EPYC™ processor offer highly scalable platforms with outstanding total cost of ownership (TCO). "As the bedrock of the modern data centre, customers expect us to push server innovation further and faster," said Ashley Gorakhpurwalla, president, Server and Infrastructure Systems at Dell EMC. "As customers deploy more IoT solutions, they need highly capable and flexible compute at the edge to turn data into real-time insights; these new servers are engineered to deliver that while lowering TCO." The combined innovation of AMD EPYC™ processors and pioneering PowerEdge server technology delivers compute capabilities that optimally enhance emerging workloads. With up to 32 cores (64 threads), 8 memory channels and 128 PCIe lanes, AMD's EPYC™ processors offer flexibility, performance, and security features for today's software-defined ecosystem. "We are pleased to partner again with Dell EMC and integrate our AMD EPYC processors into the latest generation of PowerEdge servers to deliver enhanced scalability and outstanding total cost of ownership," said Forrest Norrod, senior vice president and general manager of the Datacenter and Embedded Solutions Business Group (DESG), AMD. "Dell EMC servers are purpose-built for emerging workloads like software-defined storage and heterogeneous compute and fully utilise the power of AMD EPYC. Dell EMC always keeps the server ecosystem and customer requirements top of mind; this partnership is just the beginning as we work together to create solutions that unlock the next chapter of data centre growth and capability." Technology is scaling at a relentless pace and seeing record adoption, which has resulted in emerging workloads that are growing in scale and scope. These workloads are driving new system requirements and features that are, in turn, advancing development and adoption of technologies such as NVMe, FPGAs and in-memory databases. The PowerEdge R6415, PowerEdge R7415 and PowerEdge R7425 are designed to scale up as customers' workloads increase and have the flexibility to support today's modern data centre. Like all 14th generation PowerEdge servers, the new servers will continue to offer a scalable business architecture and intelligent automation with iDRAC9 and Quick Sync 2 management support.
Integrated security is always a priority, and the integrated cyber-resilient architecture security features of the Dell EMC PowerEdge servers protect customers' businesses and data for the life of the server. These servers have up to 4TB memory capacity, enhanced for database management system (DBMS) and analytics workload flexibility, and are further optimised for the following environments: Edge computing deployments – The highly configurable, 1U single-socket Dell EMC PowerEdge R6415, with up to 32 cores, offers ultra-dense and scale-out computing capabilities. Storage flexibility is enabled with up to 10 PCIe NVMe drives. Software-defined storage deployments – The 2U single-socket Dell EMC PowerEdge R7415 is the first AMD EPYC™-based server platform certified as a VMware vSAN Ready Node and offers up to 20 percent better TCO per four-node cluster for vSAN deployments at the edge. With 128 PCIe lanes, it offers accelerated east/west bandwidth for cloud computing and virtualization. Additionally, with up to 2TB memory capacity and up to 24 NVMe drives, customers can improve storage efficiency and scale quickly at a fraction of the cost of traditionally built storage. High-performance computing – The dual-socket Dell EMC PowerEdge R7425 delivers up to 24 percent improved performance versus the HPE DL385 for containers, hypervisors, virtual machines and cloud computing, and up to 25 percent absolute performance improvement for HPC workloads like computational fluid dynamics (CFD). With up to 64 cores, it offers high bandwidth with dense GPU/FPGA capability. On standard benchmarks, the server's superior memory bandwidth and core density provided excellent results across a wide range of HPC workloads. The new line of PowerEdge servers powered by the AMD EPYC™ processor will be available to channel partners across the globe, so they can cover a broad spectrum of configurations to optimise diverse workloads for customers. Availability Dell EMC PowerEdge R7425, R7415 and R6415 are available in the following 30 countries as of today: Americas: US, Canada, Mexico, Brazil EMEA: Belgium, Denmark, Finland, France, Germany, Ireland, Israel, Italy, Netherlands, Norway, Poland, South Africa, Saudi Arabia, Spain, Sweden, Switzerland, UAE, UK APAC: Australia, India, China, Hong Kong, Singapore, Taiwan, Japan, South Korea. vSAN Ready Nodes are available now with the PowerEdge R7425, R7415 and the R6415. Customer quotes: Jacob Smith, senior vice president, Engagement, Packet Hosting Inc. "Packet is committed to bringing the latest and greatest infrastructure solutions to market, meeting the demands of performance-hungry SaaS platforms and Enterprise users around the world. We're thrilled to collaborate with AMD and Dell EMC to introduce the Dell EMC PowerEdge R6415, which will power the first bare-metal cloud solution running AMD EPYC™ on the #1 server platform and a foundation of both our scale-out and edge deployments." David Chaffin, director of HPC, University of Arkansas – Fayetteville "The Advanced High-Performance Computing Center at the University of Arkansas Fayetteville runs Molecular Dynamics simulations using the open-source code LAMMPS (Large-scale Atomic/Molecular Massively Parallel Simulator).
We found the Dell EMC PowerEdge 7415 single-socket server with the AMD EPYC processor runs faster than existing dual-socket servers, allowing us to support our continued need for more bandwidth and additional cores, but using less power in a smaller space." HPC and Research Team, Nikhef (Dutch National Institute for Subatomic Physics) "The Dell EMC servers with AMD's new EPYC processor have proven to be excellent for multi-core applications. Performance scales out very nicely for parallel code as well as OpenMP-based code. With the availability of the massive amount of PCIe lanes, these servers provide us with a lot more storage and network throughput per socket. Also, the fact that the new EPYC processor contains hardware support for both AES encryption and SHA-2 encoding makes it a good candidate for hosting VPN services." Channel Partner quote: Charlie Bellord, Chief Commercial Officer, Hardware Group "Today's modern data centre demands an infrastructure that can handle a variety of workloads, and these new servers provide the capabilities customers need to grow with their data centre requirements. Also, the Dell EMC PowerEdge R6415 and R7415 single-socket servers speak to the innovation and partnership of Dell EMC and AMD, and how they work together to bring enterprise-class solutions to market." ### Beyond belief: imagining a world without open source Whether we realise it or not, open source technology plays a key part in our daily lives. Broadly, the term refers to immaterial goods that are available to any person, at any time, and without (many) restrictions. Most commonly, this is used for software which is available for free as source code. The rise of open source (and its sibling Free Software) became most significant once software vendors started to monetise closed code, contrary to the early, open approach of computing. With open source, everyone is able to use, examine and modify the software and share the outcome with others. Nowadays open source goes far beyond the remit of computer science students and often sees joint development of software projects by many different entities and individuals. Many companies – even those we would usually deem competitors – contribute to open source projects, including Fujitsu, HPE, IBM, Intel, Oracle, Red Hat, and SUSE. Open source helps companies like these accomplish projects together in a safe, "open" space. Undoubtedly, Linux is one of the most famous and substantial open source projects. From humble beginnings as the invention of student Linus Torvalds, it has become a core part of our day-to-day lives. It runs on every Android smartphone as well as in the operating systems of servers and PCs around the world. Not only that, Linux is the base of many public and private clouds, including Amazon and Google. Many high-performance computing (HPC) clusters run on Linux, along with an impressive 99.6% of the world's supercomputers. There's no question that open source has had a huge impact on science, technology, and business. Let's take a minute to imagine what life would look like without open source technology. There would be no C, Perl, PHP or Python programming languages, no BSD (Berkeley Software Distribution) and no GNU project. We'd live in a world of Symbian or Windows phones, where every user would see the same (dull) interface.
The market for PC operating systems would be almost wholly owned by Windows, with some traditional Unix vendors sharing the server market. In short – life would be much more boring. And we would not just miss Android: Apple's iOS (and macOS) are ultimately based on open source, too. Their base is Darwin, which in turn builds on BSD, the Mach kernel and other open source projects. Our entire technology ecosystem would look completely different. Slow (or no?) progress Progress would be quite a bit slower in a world without open source. There would likely be little to no digital transformation and it would be more difficult to cooperate with other organisations. Common standards would need to be found between proprietary systems, where unsurprisingly each organisation would want to promote as many of its own "standards" as possible. This, in turn, would result in more hoops to jump through, longer-lasting negotiations and unwanted compromises. Without vendor-neutral standards, there would be a significant lack of innovation when it comes to fundamental technologies like the Cloud. Modern companies built on this technology, like Uber or Airbnb, would not exist as we know them today – if they existed at all. Even Windows and Apple developers, including Microsoft itself, heavily rely on open source components, standards, and tools. A world without open source would see significantly less innovation, and software would be more expensive and monolithic. To take the idea of open standards and information sharing one step further, open source also plays a significant role in science, space exploration and drug development – areas where knowledge sharing is paramount. The medical profession makes great use of open source-based systems for computed tomography and MRI scans, while supercomputers ensure the masses of data collected by scientists can be analysed. However, it's not just these complex use cases where open source has a key role to play. We come across open source technologies on a daily basis without even knowing it. Take transportation. Toyota is just one example of an organisation that will be using Automotive Grade Linux in its entertainment systems, sharing technology already used by many global airlines. Companies such as Intel and BMW have joined the GENIVI Alliance to help develop these systems further. The natural next step In an age of rapid digitisation and innovation, sharing the workload has never been so important. The beauty of open source is the breadth of the pool of participants along with the minimal requirements to take part. Internet access and curiosity are all anyone really needs to get involved – and this sense of diversity has made the community what it is today. Open source enables variety, rapid innovation and cost savings. Why would we want a world without it? Fortunately, it is highly unlikely that open source will disappear – and even if it did not exist yet, it is highly likely it would be invented. Enterprises will be able to benefit from significant levels of innovation for many years to come and effectively raise their market position thanks to Linux and others. But those who let the opportunity pass them by will inevitably fall behind.
### CMMS Deployment Spotlight: Web-based vs On-premise If your company needs to maintain equipment, manage inventory, deal with stock levels and optimize its supply chain, a CMMS would be a natural fit. With the numerous benefits a CMMS can provide, it is no wonder that more and more organizations are looking to implement one. One of the first questions those businesses face relates to hosting. They need to decide between an on-premise approach and an online-based system hosted on the cloud. Hosting on-premise means your business must maintain its own hardware and software to run the system and store your data. Your business will need its own IT staff to manage all the back-end work on a dedicated server in the office building. Hosting data in the cloud means you rely on the vendor to run the software and host your data as an additional service. This means much less IT work on your end, and the data is accessible through the internet for any user at any time. Less burden on IT Cloud-based CMMS is a good option for startups and small companies without a lot of capital, IT resources, or technical expertise. Without the headache of dealing with installations, server maintenance, unexpected breakdowns and so on, the business can focus on its core business. This means that there's no need for a large IT team to handle the software, and any new requirements can be dealt with by the CMMS vendor. It is also easy to scale the system up or down, as cloud providers like Amazon allow companies to rapidly expand their system cores and storage for a fixed fee. Rather than waiting weeks for a new machine as on-premise systems would have to, a cloud-based CMMS can be scaled up in hours. Software updates Typically, any ERP or CMMS undergoes constant improvements and changes over time to fix bugs, incorporate new features and ensure security. While an on-premise CMMS needs a software team to maintain and apply upgrades, an online CMMS is typically maintained by the vendor and is automatically updated as part of the license fees. Updates to an on-premise CMMS are a lot more difficult. Software updates and patches have to be tested by the IT teams to ensure there is no conflict with existing systems or customizations. This causes additional delays, leaving on-premise CMMS customers living with bugs or security issues for months or even years at a time. For cloud-based CMMS systems, these updates are handled by the CMMS vendor and can be implemented immediately behind the scenes with no negative impact on the CMMS user. This ensures that all the users of the online system have access to the latest version while not having to bother with the intricacies of upgrade planning. Dealing with Downtime While every business wants to ensure 100% uptime and availability of their software to keep operations running smoothly, it's almost inevitable that there will be some downtime. Web-based CMMS systems are typically hosted on multiple servers throughout the world to increase reliability and availability. The cloud hosts offer performance guarantees to their clients to ensure that service disruptions will be minimal, and as long as businesses have internet access, their operations will run undisturbed.
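To put such guarantees in perspective, here is a minimal sketch (purely illustrative, not taken from any vendor's documentation) of how an availability percentage converts into the downtime it permits over a year; the commonly quoted "five nines" works out to roughly five minutes.

```python
# Convert an availability guarantee into the downtime it permits per year.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def allowed_downtime_minutes(availability_percent: float) -> float:
    """Return the minutes of downtime per year implied by an availability percentage."""
    return MINUTES_PER_YEAR * (1 - availability_percent / 100)

for sla in (99.9, 99.99, 99.999):
    print(f"{sla}% availability allows {allowed_downtime_minutes(sla):.1f} minutes of downtime per year")
# 99.9%   -> about 525.6 minutes (roughly 8.8 hours)
# 99.99%  -> about 52.6 minutes
# 99.999% -> about 5.3 minutes
```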
Rather than having your own IT department worry about downtime issues, with a cloud-hosted CMMS you can leave the responsibility to the vendor. The network, servers, and application are monitored 24/7, 365 days a year, often with an availability guarantee of 99.999%, or around five minutes of downtime per year. Security Concerns When comparing the two options, remember that the features of on-premise hosting are more suited to enterprises that have regulatory requirements that force data to be kept internal. Web-based CMMS software can erroneously be perceived as less secure than on-premise solutions, but cloud-based systems are often better maintained, security-patched faster and better managed overall, because these servers are the CMMS vendor's entire livelihood while the on-premise server is just another server to be managed. Another trade-off for on-premise hosting is responsibility for the data. Cloud CMMS vendors will automatically back up and store your data as part of the license cost. On-premise software requires your own backup plans and audits to ensure your data is safe. Software Costs Finally, costs should play a role in the decision of choosing one approach over another. The capital investment in on-premise software is higher, but if managed effectively by your IT staff it can pay for itself over time. However, like all equipment, the value of the hardware depreciates and the servers eventually need to be replaced. Additionally, software updates may need to be purchased, and you should consider the cost of lost time due to the slow bug fixes or software updates inherent in on-premise software. On the other hand, a subscription-based software service is a recurring cost. You get a lot for this recurring cost, but that doesn't change the fact that it is a recurring cost. According to the latest maintenance statistics, most facilities are willing to invest between $50 and $100 per user per month. Conclusion Both approaches are viable options for most companies, but more and more organizations are choosing online-based software, or Software as a Service. Cloud-based software allows companies to rid themselves of distractions that would affect their ability to focus on daily operations and their core business, resulting in a higher ROI. An on-premise CMMS can be a heavy investment, but in certain situations it can be a good fit. If you already have a well-established IT staff, know the software fits your organization, plan to use it for at least 10 years, have regulations to keep data and IT services in-house and can deal with the headaches previously mentioned, an on-premise CMMS might be the way to go. ### Report: Lack of Cloud Expertise Is Losing Businesses Money It's difficult to find a startup business that isn't utilizing cloud technology. While it can be hard to conceptualize what the buzz surrounding cloud expertise is about, it's worth keeping in mind that something as accessible and well-known as Google Drive utilizes the cloud. Cloud technology allows for increased mobility, which means no more fretting over forgetting to send a file to work on at home or on vacation.
The cloud provides real-time collaboration, eliminating the habit of sending the wrong version of a file. Plus, the cloud offers scalability, which means it's unlikely one will run out of space. Given these advantages, it's no wonder that 71 percent of IT decision-maker respondents in a recent London School of Economics poll state they believe their company is losing money by not jumping on the bandwagon. Businesses Missing Out on the Cloud Beyond its ease of use, lack of cloud integration costs companies more than $258 million globally. While artificial intelligence continues to gain investment, human expertise in cloud management is what corporations are clamouring after. An inability to attract and retain this talent is the main reason companies are unable to capitalize on cloud computing. Nuanced insight is what is missing from many data-forward companies. A business can run reports and simulations all day and get well-crafted and well-engineered data, but directing the right experiments, asking the right questions and then analyzing that data is what is missing. The nature of IT is evolution and change. The longest average tenure at top tech companies is just above two years, and the actual average hovers between one and two years. That's hardly enough time for a company to hire an IT professional, have them set up a cloud system, guide other employees in learning the system and then maintain the cloud. Unsurprisingly, in that same London School poll, almost half of respondents found that daily management of the cloud was more labour-intensive than expected. The Cloud Brings More Job Functions With companies needing to devote more resources to integrating technology into everyday operations, the days of having one IT employee to take care of all tech-related needs are vanishing. With critical responsibilities like cloud security, migration, project management and app development, larger companies need to supply an IT department to take advantage of cloud computing. Salary and benefits alone are not enough to attract top talent. Many people working in IT have an innate drive to solve problems, making continued learning an alluring prospect for top talent. Burnout is another affliction for many IT professionals, who are often asked to wear too many hats and frequently bothered during their vacation time for IT emergencies — this is why having more than one tech expert on staff is essential. Not to mention, revenue can decrease between the time when an IT expert leaves and when the business can hire a replacement. Add to the mix that more than two-thirds of companies are looking to expand their cloud operations, and there is a recipe for an IT employment shortage. Protecting a Business from Cloud Ignorance So, how can a business protect itself from failing within a world that's more dependent on cloud services than ever? Take these steps: Keep Employees Happy: Invest in top IT talent and keep them satisfied. Offer them a salary and benefits package that allows them to focus on their job, hire an ample team to distribute the current and anticipated workload and encourage employee education through seminars and classes.
Investigate Ways to Improve: Have your IT staff help take an inventory of how cloud computing is currently aiding the business, as well as identify areas where increased expertise could lead to a higher ROI. Then, give them the tools to implement those changes. Create a list and timeline of how those changes will be implemented — and when. Educate Your Team: Recruit your IT staff to educate the rest of your employees on how to use cloud software. The acquired knowledge will only increase efficiency. While the cloud can remedy mistakes like John from Accounting accidentally deleting all the files from 2012, it still takes time for IT to go in and fix that error. The adage of "a chain is only as strong as its weakest link" could not ring truer for a highly collaborative software system like the cloud. Cloud computing is the poster child for an increasingly automated world, but a lack of human investment in the software is what holds most companies back from fully capitalizing on the service. A company is leaving money in the sky if it isn't investing in staff members who can be innovative and train others to develop and utilize cloud software. ### Spaces Direct and Azure Stack - Mark Mulvany - Dell EMC Mark Mulvany from Dell EMC talks to Compare the Cloud about Dell EMC's Cloud for Azure Stack. Find out more about Dell EMC here: https://www.dellemc.com/en-us/index.htm At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, events and workshops to enable you to stand out from the cloud crowd. ### The Global Service Challenge - Procurri Sean Murphy, CEO, and Zack Sexton, President - North America/Co-Founder of Procurri talk about "How do modern businesses deliver services to non-native territories?". Find out more about Procurri here. ### Creating Products With A Remote Development Team; Opportunities & Challenges When first approached to write an article on agile product development, I spent a lot of time scratching my head. Most of us do agile, and most of us are doing a pretty good job of prioritising our backlog. I started to panic that we'd become really predictable as a product team! After the panic (and a Freddo), it dawned on me that there is something unique about how we build products that other product folk might find useful, and that's the fact that our entire development team works remotely across the world. So my aim with this article is to share with you the opportunities and challenges presented by our approach, with the hope that you might find something useful for your own product development. The Opportunities Let's start with the opportunities a remote product development team can bring... Limitless Talent After making the decision to move to a remote product development team, our talent pool went from a 30-mile radius of Manchester in the UK to a large slice of the world. Everyone knows how difficult it is to recruit good developers, especially with the current skills shortage, so why restrict yourself to one town or country? It may sound a bit cliché, but why not let the "world be your oyster" and scoop up the best talent from around the world?
Scale Your Team Fast We've all been there: you've come up with a great idea for a product, but it then takes at least six months to hire the right people to make it happen. With a remote approach to development, this isn't always the case. Without geographical restrictions, you can hire good people quickly, and as a result will get your idea to market and start generating revenue faster. Flexible Working We have a very low churn of developers in our product teams, and I don't think that is just because we offer a great place to work. Feedback we've had from the team is often around the flexibility that remote working has brought to their lives; for example, parents can do the morning school run, dog owners can go for a lunchtime stroll and the fitness bods (of which we have quite a number!) can go for as many runs as they like. With this approach comes autonomy and a mutual respect that results in great product development. Multicultural On the face of it, having a multicultural team could be perceived as a challenge, but I see it as an opportunity. I love that I get to speak with people from different cultures and with a different outlook on life on a daily basis. I never thought I would spend my days regularly talking to people from Germany, Bulgaria and Brazil (to name a few). Having a multicultural team makes work more interesting, and also enhances the quality of the products that we deliver. Focus Most developers don't like distractions due to the complexity of their work. They need to concentrate for long periods of time without background noise. We have found that working remotely enables this, as teams don't get interrupted or have the distractions of a busy office environment, resulting in them being more productive and getting the necessary work done. The Challenges Working with remote product development teams has its challenges; here is how we have tried to overcome them... Product Vision As all good product leaders know, it's important that your team understand the vision of your product and the direction that you are heading. Compared to working with onsite teams, this has proved more challenging when working remotely. To address this problem we have recently implemented monthly roadmap reviews with the whole team using Google Hangouts; this allows the team to see on a regular basis what they might be working on next. It also allows them to input any ideas that they might have for the product. Company Culture At Cloud Technology Solutions our culture is everything, and with a remote team scattered around the world, this can be hard to maintain. As a result, we have regular company calls led by our CEO via Google Hangouts. This means that everyone feels connected to the overall vision of the company, as well as feeling part of the team. Although it can be an expensive exercise, we also get the whole company to meet up in person twice a year, and for our team, even though we speak every day face-to-face, get-togethers are always a great way to cement a relationship. Collaboration When your team are all sat around a bank of desks together, collaboration is much easier and becomes a natural way of working. We haven't found the perfect answer to this problem, but we have found that the cloud-based technology that we use, such as G Suite, enables our team to collaborate on documents in real time.
Staying Agile Agile methodologies such as Scrum or Kanban lend themselves to an onsite environment. Daily standups, physical Kanban boards, collaboration and team autonomy are the cornerstones of Agile product development, and it is easy to let these slip if you are working remotely as it can feel like more effort. My advice is to follow the methodology you would normally use, regardless of where the team is located, whilst utilising as many cloud collaboration tools as possible, such as G Suite or Office 365. What To Remember Although not always ideal, remote working can be the answer to your development team's needs. Here are five key takeaways: Unlimited talent pool. You can hire the most talented developers from anywhere in the world, and can build your team faster. Improved productivity. It is known that those who have flexibility perform better than those who don't, due to autonomy and mutual respect. Team meetings. Regular face-to-face meetings will encourage your team to work together like they would in an office environment, without the distractions! Be on the same journey. Keep your development team on the same product vision with meetings and roadmap reviews. Utilise cloud collaboration tools. Staying agile is key to keeping remote workers collaborating and feeling part of the team and the company. ### What to expect from IoT in 2018 IoT is no longer the new kid on the block. It's been having a transformative effect on nearly every industry for the past few years, particularly automotive and manufacturing, and the momentum will continue. Connecting everything – and maximising the value of the data those things generate – can drive positive business results, so the potential for IoT is truly massive. But instead of looking too far into the future, let's focus on what we think the next twelve months will bring. Our first prediction is that we will start to see more robust security through a combination of IoT, artificial intelligence (AI) and machine learning. During the first few years of IoT, the focus was very much on gathering data. But we are now at a stage where we can gather and analyse that data in real time, and then automate decisions accordingly. This will be significant in the area of security – both from an attack prevention point of view and from an operational technology standpoint. So in 2018, rather than reacting to incidents, the combination of IoT data with AI and machine learning technologies will help industries predict and proactively prevent IT attacks and operational disasters from happening in the first place. We also predict that 2018 will see the start of interoperability between different device manufacturers and the wider IoT industry and ecosystem. Integration between devices made by different manufacturers is starting to happen, enabling them to work together to make interoperability mainstream. This will really take off in 2018 because of the development and adoption of standards such as OMA LightweightM2M (LwM2M). This is being adopted across the board – at the chip, software and hardware levels – enabling devices to speak to each other in a common language, and to send and receive data. In turn, this drives a need for all these devices to be managed by an IoT data platform that has the capability to apply policies and automate actions based on the data that is being shared.
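As a rough sketch of what "apply policies and automate actions" can mean in practice (the device names, fields and thresholds below are hypothetical, not drawn from any particular IoT platform), a policy engine boils down to evaluating rules against incoming telemetry and triggering an action when a rule matches:

```python
# A toy policy engine: evaluate simple rules against incoming device readings
# and trigger an action when a rule matches. All names and thresholds are
# made up for illustration.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Policy:
    name: str
    condition: Callable[[dict], bool]   # predicate over a telemetry reading
    action: Callable[[dict], None]      # what to do when the predicate holds

policies = [
    Policy(
        name="overheating",
        condition=lambda reading: reading.get("temperature_c", 0) > 80,
        action=lambda reading: print(f"ALERT: {reading['device_id']} is overheating"),
    ),
]

def handle_reading(reading: dict) -> None:
    """Apply every registered policy to a single telemetry reading."""
    for policy in policies:
        if policy.condition(reading):
            policy.action(reading)

handle_reading({"device_id": "sensor-42", "temperature_c": 91})
# -> ALERT: sensor-42 is overheating
```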
Our third prediction is that there will be a battle over IoT data ownership, resulting in a significant paradigm shift. Data made big headlines in 2017, as highly publicised breaches made us all more security-conscious. Indeed, more IoT data was generated this year alone than at any other time in history. With the intersection of IoT and Big Data, we have started to see the collection and analysis of that data become more important as enterprises try to make sense of it to drive tangible business benefits. This is vital when you consider that only 1 percent of IoT data is currently being used. The next step, which we predict will take place in 2018, is to work out who needs to have access to the data, how it is best used and also who owns it. With those questions answered, it will be possible to extract the full value of your data and produce positive business results. The automotive industry has been transformed by the introduction of IoT, and we predict this will go further afield in 2018, to transform the fleet management industry. Transportation as a service (TaaS) will go mainstream – in other words, on-demand trucking and logistics is just around the corner. This means that fleet operators won't have to directly manage their drivers or logistics. Instead, they will be able to consume fleet management as a service so that when lorries need to be in a certain location, the operator can ensure their prompt arrival. This also means the trucks will be loaded correctly and will follow an optimal route. This will be made possible by real-time IoT data, pulled from areas beyond GPS such as weather and weight. This type of data will enable the optimisation of what and how much to load onto a truck and the route it should take, and ultimately greatly improve transportation and fleet management. It may not be the right business model for all enterprises, but manufacturing companies that are already up and running with IoT could quite quickly use TaaS to reduce their transport management costs. To continue on IoT predictions that pertain to the automotive industry, we predict that by the end of the year the leading automotive companies will start to invest in existing ride-sharing services or even introduce their own branded offerings. Most will commit a percentage of their fleets to be leveraged as a ride-sharing service. In addition to this, we're likely to see a global explosion of bike-sharing services to accommodate the increase in people opting for multi-modal transportation. An example of this is when a commuter travels by train to a city, but then uses a bike-share to travel to a specific location in the city that may not be walkable from the train stop. Our final prediction is for low-power wide-area networks (LPWANs) which, as they become more mainstream, will make IoT device connectivity more accessible and affordable, leading to increased adoption. But some barriers still remain before mass adoption of LPWANs becomes a reality. For example, the data usage of most low-power devices might be only 10 percent of that of most IoT devices currently connected to 3G/4G networks. With such low data usage, generating revenue by charging for kilobytes of data per device per month does not make sense.
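A back-of-the-envelope calculation shows why; every figure below is hypothetical and chosen only to illustrate the order of magnitude, not taken from any operator's price list.

```python
# Hypothetical, illustrative figures only: neither the tariff nor the usage
# numbers come from the article or any real operator.
typical_cellular_iot_kb_per_month = 5_000                       # assumed 3G/4G device usage
lpwan_kb_per_month = typical_cellular_iot_kb_per_month * 0.10   # "only 10 percent" of that
price_per_mb = 0.10                                              # assumed tariff, USD per MB

monthly_revenue_per_device = (lpwan_kb_per_month / 1024) * price_per_mb
print(f"Revenue per LPWAN device: ${monthly_revenue_per_device:.2f} per month")
# -> a few cents per device per month, which is unlikely to cover the cost of
#    connecting, managing and billing the device.
```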
A new business model is needed in order to make LPWANs profitable, one that reflects the value of the business outcomes that LPWAN provides – not just the data used. For example, smart city bin collection could be charged each time an action is triggered, such as a request to send out more rubbish trucks. As IoT adoption grows, naturally there are challenges to be faced, but it's exciting to see the creation of new industry sectors like TaaS. So long as viable revenue models develop hand-in-hand with business innovation, the opportunities presented by IoT will continue to impress us, year on year. ### #TheCloudShow – Ep 2 – IT Debt In The Cloud Episode: IT Debt in the Cloud Broadcast Date: 2nd February 2018 Series 1 Episode 2 Compare the Cloud are peeling the layers back on the Cloud Industry. Answering the questions on the lips of both providers and customers, hosts Jez Back and David Organ cover the latest trends and best interviews in this space. This episode asks the question of IT debt in the cloud. Joining Jez and David are guests Matt Lovell, COO at Centiq, and Matt Watts, Director, Technology and Strategy at NetApp. ### The rise of multi-cloud: putting the customer in charge Over the last few years, we have witnessed exponentially growing interest in what is commonly called a multi-cloud strategy, or multi-cloud infrastructure, across the enterprise community. The focus isn't limited to specific sectors; it extends across the board: enterprises from all industries are considering and putting in place multi-cloud strategies, virtualising their infrastructures and opting for a mix of cloud providers, rather than relying on a single vendor. And while it is a deliberate choice for many, this is not the case for all organisations. But where has this sudden hype around the subject of multi-cloud come from? What is multi-cloud? Multi-cloud is the next leap of cloud-enabled IT architecture beyond hybrid cloud. It refers to an IT design where multiple public cloud providers and on-premises private cloud resources are used concurrently to realise certain business objectives and metrics that are difficult to achieve with private-only and/or hybrid cloud designs. These business objectives include the freedom of choice to pick and choose best-of-breed cloud services across public cloud providers, as well as enabling data mobility to eliminate concern over vendor lock-in. In addition, companies are looking to achieve enhanced data availability and durability with data sets spread across multiple cloud architectures, and cost optimisation with the ability to use the most appropriate cloud pricing scheme for each application across providers. In a nutshell, these factors put the cloud customer in charge with optimal leverage and control, thereby fuelling the growing momentum of multi-cloud. Principles of multi-cloud The benefits of multi-cloud outlined above hinge on a number of principles that must be adhered to. First is the need to normalise data access, control, and security across all clouds with a single interface, with the de facto choice being the Amazon S3 API – assuming a full-fledged implementation with Identity & Access Management, or IAM.
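As a minimal sketch of that first principle (the endpoint URLs, bucket name and credentials below are placeholders, not real services), an S3-compatible interface means the same client code can talk to different clouds simply by switching the endpoint it points at:

```python
# One S3-compatible client, pointed at different providers by endpoint.
# All endpoints, bucket names and credentials here are placeholders.
import boto3

def make_client(endpoint_url: str):
    """Create an S3-compatible client for whichever cloud hosts the data."""
    return boto3.client(
        "s3",
        endpoint_url=endpoint_url,
        aws_access_key_id="ACCESS_KEY",        # placeholder credentials
        aws_secret_access_key="SECRET_KEY",
    )

# The same call works against any provider that exposes the S3 API.
for endpoint in ("https://s3.amazonaws.com", "https://object-store.example-cloud.com"):
    client = make_client(endpoint)
    client.put_object(Bucket="shared-bucket", Key="reports/q1.csv", Body=b"hello")
```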
Second is the need to ensure that data always stays in its open, cloud-native format – meaning no opaqueness anywhere – so that it can be accessed wherever it resides and can be moved around freely as required. Third is a transparent data brokering capability that allows data to be placed and moved around automatically based on pre-defined business policies. And fourth is dynamic indexing and searching capabilities across cloud architectures so that data can be found and acted upon wherever it happens to reside at any given time. If not implemented correctly and with reasonable safeguards, multi-cloud could exacerbate the drawbacks and challenges that face cloud customers, such as increased complexity and overhead of data management, reduced flexibility in ways the data can be accessed and used, poor control and tracking of where data is placed, and inflated costs with unnecessary copies sitting on multiple clouds. Many organisations can already be defined as multi-cloud users by default, since the deployment of multiple cloud offerings within a single company has grown organically out of “shadow IT” practices over the past few years. Those organisations are finding they need a solution to deal with these multiple clouds. A second set of organisations are building a multi-cloud strategy more intentionally in order to reap the benefits outlined above. At this juncture, an increasing share of customers is clearly better educated and more savvy in their use of cloud services, prompting them to explore and implement multi-cloud strategies. That said, the undisputed benefits of multi-cloud could be wiped away if an organisation lacks the maturity, discipline, or capability to adhere to the principles outlined above. It is therefore imperative that organisations seek out trusted technology partners with the expertise, know-how, and solutions to ensure that the principles are met. For example, it is vital to implement and follow cloud provider-agnostic standards that ensure adequate control is maintained. Once locked into any vendor or technology, it is very difficult to change. Looking forward As multi-cloud becomes the norm for cloud designs and enjoys mainstream adoption, over the next couple of years there will be demand for a solution that fundamentally changes cloud storage and data management, providing customers with the full power and flexibility of on-premises storage and public clouds so that they can get not only the most value from their data but also the optimal experience in doing so. The rise of such new multi-cloud data controller solutions will help broker and manage information across different clouds. ### Stop Thinking Bitcoin is Blockchain In the run-up to Christmas, it felt like the world was in an uproar about Bitcoin. Suddenly, this sleeping giant was the hot topic of conversation. Bitcoin broke into the mainstream media seemingly overnight, having everyone from the BBC and CNN to the postman in furious discussion over it. “Bitcoin is going to change everything,” they said. “The technology is the biggest thing since sliced bread,” was the general gist. Others concluded that it is simply a bubble, technically the biggest bubble in human history - bigger than the Dot-Com bubble or the Dutch Tulip Bubble of the 1630s.
The big problem with all of this is that nobody really has any answers. Some speculators think that Bitcoin will go much, much higher. Kim Dotcom - the German tech entrepreneur currently fighting extradition to the US - has announced his own plans for a cryptocurrency, and thinks that Bitcoin could easily pass $40,000. Meanwhile, billionaire investor Warren Buffett believes “... almost with certainty that cryptocurrencies will come to a bad end.” We’ll only know who is right with the benefit of hindsight. But if the sudden deluge of media coverage about Bitcoin has done anything useful, it’s that it has raised the profile and understanding of the Blockchain as a technology that may well influence many existing industries, and underpin emerging ones. So whether the ‘Bitcoin bubble’ bursts or not, the big takeaway is that it’s time to stop confusing the two. Bitcoin and Blockchain have come to light together, and they are as important to each other as the chicken and the egg, but they are not the same thing. [clickToTweet tweet="whether the ‘Bitcoin bubble’ bursts or not, the big takeaway is it’s time to stop confusing Bitcoin with Blockchain" quote="whether the ‘Bitcoin bubble’ bursts or not, the big takeaway is it’s time to stop confusing Bitcoin with Blockchain"] Bitcoin was the first decentralised cryptocurrency and as such is the most valuable right now. At heart, Bitcoin is simply a tradable measure of value on a Blockchain, no different to any of the estimated 1400+ other cryptocurrencies which have been created to date. Some of these share the same Blockchain (mostly Bitcoin or Ethereum) and some are based on their own. There are only a finite number of ‘coins’ in circulation, which preserves their scarcity and therefore a value that can rise or fall based on demand. Like the chicken and egg analogy, nobody can conclusively tell us which came first, as Blockchain was created alongside Bitcoin. Bitcoin would not be here without Blockchain; they were created in unison by the mysterious Satoshi Nakamoto. From startups to multinational corporations, people are betting heavily on Blockchain being the future. Somewhat annoyingly, many are of the mindset that Bitcoin directly represents how important Blockchain is, when in fact it is but a mere hint of the potential here. Bitcoin barely scratches the surface of how Blockchain can change the world. A Blockchain - originally known as a ‘block chain’, now universally referred to by the single word - is an ever-growing list of blocks, or records. These ‘blocks’ are secured and linked together using clever cryptography. Each block normally contains a hash pointer as a link to the previous block in the chain, a timestamp for when the block was created, and the transaction data that it was created for. In other words, a Blockchain (it is worth noting that there are several) is a digital, decentralised, public ledger of all cryptocurrency transactions. A Blockchain is constantly growing as ‘completed’ blocks get added to the chain. These are added in chronological order once they are completed, and a number of computers on the network - known as nodes - hold identical copies of the complete blockchain. The decentralised nature of Blockchains means that, assuming you have a big enough hard drive, anyone can download the entire blockchain. Everyone has access, but no single entity has total control.
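To make the structure just described concrete, here is a toy, illustrative sketch of a hash-linked chain of records - a simplification for explanation only, not Bitcoin's actual data structures:

```python
# Toy illustration of a hash-linked chain of blocks - not Bitcoin's real implementation.
import hashlib, json, time

def block_hash(timestamp, data, prev_hash):
    payload = json.dumps({"timestamp": timestamp, "data": data, "prev_hash": prev_hash},
                         sort_keys=True).encode()
    return hashlib.sha256(payload).hexdigest()

def new_block(prev_hash, data):
    timestamp = time.time()
    return {"timestamp": timestamp, "data": data, "prev_hash": prev_hash,
            "hash": block_hash(timestamp, data, prev_hash)}

def chain_is_valid(chain):
    # Tampering with any historical block breaks every link after it.
    for prev, current in zip(chain, chain[1:]):
        if (prev["hash"] != block_hash(prev["timestamp"], prev["data"], prev["prev_hash"])
                or current["prev_hash"] != prev["hash"]):
            return False
    return True

genesis = new_block("0" * 64, "genesis")
chain = [genesis, new_block(genesis["hash"], "Alice pays Bob 1 coin")]
print(chain_is_valid(chain))                       # True
chain[0]["data"] = "Alice pays Bob 100 coins"      # a retroactive edit...
print(chain_is_valid(chain))                       # False - the chain no longer verifies
```

Because each block's hash covers the previous block's hash, rewriting any record retroactively invalidates everything recorded after it.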
There is no one central server that can be hacked to change the data; hundreds of thousands of computers hold a copy. It is part of a peer-to-peer network upon which, once recorded, the data in any given block cannot be altered retroactively without having to alter all subsequent blocks, on all of the machines storing the information. It is the best example of distributed computing with what is known as high Byzantine fault tolerance. DLTs (Distributed Ledger Technologies) are cropping up everywhere now, fast becoming a trusted resource for everything from the obvious, such as transferring funds around the world (using currencies like, but not only, Bitcoin), to the signing of legal paperwork, since documents can be coded into a Blockchain and are therefore, as near as makes no difference, fraud-proof. Once signed, a record is there for all to see. An obvious place to use Blockchain is an area where accurate audit trails are required. For example, global shipping company Maersk estimates that it takes some 200 communications between 30 different parties to ship one of its containers from Kenya to the Netherlands. It sees Blockchain as a way to hugely simplify this process and remove the risk of errors. In the case of land-based supply chains, Walmart, Kroger, Nestle and Unilever are working with IBM to integrate Blockchain into their own processes. From our own perspective as a games company, Blockchain allows something that exists purely in a digital sense to be given its own unique identity and value. What this means is that this technology will open the doors to a new form of creativity that is digital-first. People will be able to share and monetize their creations directly, bypassing the gatekeepers who make far more revenue than the creators they profess to support. And the security provided by the fact that each and every blockchained item is unique means that the value that is created can be protected and retained. The real value of Blockchain is not in the raft of get-rich-quick cryptocurrencies, or people willing to bet their houses on the rise of Bitcoin. Yes, we may well be in the middle of a speculation bubble for all things Bitcoin-related. But its actual success has been to open a door to a Blockchain-driven future. Blockchain is in its infancy, but it could well become the technology that bridges the digital and physical worlds. It’s altogether bigger and more exciting than Bitcoin will ever be. ### AI and ML - A Force For Good Stories in the newspapers and online regarding Artificial Intelligence (AI) generally centre around sensational, scary and dystopian headlines. This is a problem: it doesn't reflect the reality of the situation and it would not do for any serious publication (such as Compare the Cloud) to throw fuel onto the fire by predicting The Singularity coming, leading to AI Taking Over the World leaving Civilisation As We Know It in ruins. It's very useful to remind people that technology has been a force for good over many centuries. It also bears remembering that throughout history, despite the positive effects on people's lives, people have always been afraid of new technologies! In 2018 we will see AI making our lives better in surprising but subtle ways whilst silently making progress towards more exotic - but equally positive - disruptions. [clickToTweet tweet="'a significant and impactful,' milestone for AI will occur in 2018... -> Read more!" quote="'a significant and impactful,' milestone for AI will occur in 2018... -> Read More!"]
Compare the Cloud spoke to Craig Hampel, Chief Scientist, Memory and Interface Division at Rambus, to get his predictions for AI and ML for 2018. Craig predicts that "a significant and impactful" milestone for AI will occur in 2018, whereby AI will develop the ability to extract sentiment from a facial expression, picture, sound or written text, "nearly as well as a human". This is certainly one of the main directions that AI / ML is heading in at the moment, and we can expect the major focus of 2018 to be the wave of AI-derived sentiment - or 'Artificial Sentiment'. This kind of AI can identify - and even manipulate - sentiment, and will surely have a huge impact on industries like advertising, music and even politics. In 2018 AI will have the most significant impact on pervasive data sets like picture, video, DNA, voice and sentiment. AI’s ability to look at billions of data sets and find correlations or patterns and identify traits has nowhere near reached its potential - it is a significantly untapped source of meaning and insights as things stand. Services like Google Photos and 23andMe are, "just the tip of the information iceberg." Craig went on to say that, "the possibilities are huge when moderate amounts of inexpensive AI is applied to these large data sets. "The largest natural data set that we currently process the least is our immediate surroundings. The integration of AI, ML and near-human sensors using our surroundings will facilitate ubiquitous computer vision. This has the capacity to benefit different technological advancements such as self-driving, collision avoidance and detection of real-world anomalies that improve and extend human lives." There is so much data around us that the challenge is determining which data analysis is useful and will provide actionable and informational results. Arguably, all data is 'used' in the strictest sense but much of it is quickly filtered, compressed, or archived. Only a very small amount of even the most useful data collected to date is used to its full potential. Craig expanded on this by saying, "from DNA to consumer sentiment to archival photos, the amount of information that can still be extracted and analysed from these sources is immense. Yet, our ability to use this data is limited by memory and compute capacity — this is one of the challenges we will face in the upcoming year." Right from the beginning of the computing age, computer architecture has relied on hierarchy and filtering to manage data when it exceeds the capacity of the existing infrastructure. These techniques will continue to be applied to process the new waves of data expected in 2018 from the likes of the growing IoT infrastructure and other areas. However, this new wave of data requires improved performance and capacity, which places pressure on the existing hierarchy. This pressure is intensified by the fact that the historic rate of improvements in the capacity and cost of DRAM and storage technologies has slowed. Craig added, "in order to compensate for the increased capacity demand on DRAM, we will see a new class of memory hierarchy referred to as Storage Class Memory or SCM that emerges to help bridge the capacity gap that results from slowed DRAM capacity scaling.
Rambus is developing memory architectures and buffers to allow storage class memories to achieve performance close to DRAM while staying near the cost curve and capacity of storage devices. "While it may not impact 2018, we are going to see emerging storage devices that have more archival properties in the upcoming year. These will provide an additional level of storage that is less expensive and higher capacity than disk or flash. Microsoft’s work in using DNA as an archival storage device is an example of the type of storage that could result in multiple orders of magnitude increase in the capacity and density." Another AI / ML trend that we will see in 2018 is expanded use of data filtering near the edge. For example, MPEG video where the only content transmitted and stored is that which changes between frames. Considering the large video resolutions (and therefore large file sizes) on offer, this use of edge-processing and filtering has obvious benefits. In a similar fashion, devices near the edge like video cameras and sensors will begin to provide purposely filtered data to the cloud. In these cases data is used locally, to determine what information to forward to the cloud for further analytics. Ultimately only data that produces an action or result is useful. And whilst, at the moment, these techniques are applied in retrospect to existing equipment, Craig says that he expects, "in 2018, this type of filtering will become more purpose built into devices as they evolve." In summary, 2018 is set to be a huge year for Artificial Intelligence and Machine Learning. As such, the sensationalist, scary and dystopian headlines are more than likely to continue to increase - in both frequency and shock value. Learning and understanding the reality behind AI and ML are the only way to see through these attention-grabbing headlines. One thing is for certain though - 2018 will see a massive increase in the capabilities of AI and the use of AI will continue to become more and more a force for good in the modern world. ### Server Management Made Simple - Mark Maclean - Dell EMC Compare the Cloud was joined by Mark Maclean, Product Manager for DELL EMC. In this demonstration, Mark discusses Server Management. Find out more about Dell EMC here: https://www.dellemc.com/en-us/index.htm At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. ### Visual Internet of Things Set For Massive Growth in 2018 We're well into 2018 now - and we have given you a number of predictions and analyses of what 2018 might bring. One of the most exciting developments for the coming year is the expected rise of Visual IoT. James Wickes, CEO and co-founder at Cloudview gives us his insights into what we might see - and what might be seeing us! - in the coming year. [clickToTweet tweet="'It’s very easy to be negative about the deployment of technology for sinister purposes...' find out more!" quote="'It’s very easy to be negative about the deployment of technology for sinister purposes...' find out more!"] The rise of CCTV over the last 30 years has led to repeated warnings from many commentators of a "1984" / "Big Brother" style situation coming to pass. 
Whilst recent scare stories have focused on the voluntary admittance of monitoring devices into the home (your smartphone / tablet can hear and see everything you say!), James Wickes views the situation far more positively. He tells us, "It’s very easy to be negative about the deployment of technology for sinister purposes. In the case of 'CCTV', I genuinely believe that collating data more securely in the cloud and using machine intelligence to interpret it offers 'privacy plus', where subject data rarely or never gets to be seen by another human being which, let’s face it, is when the problems start." Of course, privacy should always be a concern - but if machine learning can be used to analyse data only for the stated purposes it was collected for, it may be possible to safely dispose of the footage much more quickly rather than hanging onto it for long periods of time. James went on to tell us that so far the main benefits of IoT have been felt by the manufacturers of the devices and not so much by consumers - to avoid a "quasi-1984" reality we all need to be more aware of the "give and take" relationship we have with our IoT devices and to stop giving up little liberties for the sake of little conveniences. Whilst we can all benefit from and enjoy the fruits that IoT can give us, if we know what we are giving away in return for purchasing and using our IoT devices, there is more chance of parity of power between the consumer and manufacturers. And whilst we should be more aware of - and trying to redress - the balance of power, we do need to keep in mind the genuine and positive benefits Visual IoT can bring. Visual data can be used to empower people in numerous ways - monitoring crowd information can be used to help decide which train to take, which bank branch to go to on a lunch break, or whether you'll find your own slice of peace and calm on a visit to the seaside - or whether that trip will be anything but a relaxing few-hours break! Traditional CCTV numbers have actually started to fall. The cost of traditional CCTV is most evident in the public sector, where the cost of deployment and ongoing use is high. In 2018 we should expect this trend to continue, with the number of IoT-connected cameras climbing as the benefits and cost reductions of Visual IoT really kick in. Even though traditional CCTV is declining, there has been a massive historical investment - but that's the great thing about Visual IoT: it doesn't require existing CCTV to be replaced, it can absorb those existing cameras. The lack of a need for "rip and replace", combined with the financial pressures felt by municipal authorities to reduce their costs, all adds up to huge growth potential in 2018 and beyond. James explains further, "The financial benefits from Visual IoT are easily understood through actual use cases. For example, local councils place cameras in public spaces (along high streets to monitor activity and protect citizens). Historically, footage is fed to manned monitoring centres, who identify issues and coordinate with on-the-ground resources. Attention is typically drawn to incidents in progress through, for example, a 999 call. "Through the use of the cloud, detection analytics can be layered on without the need to replace camera hardware. These analytics can use machine intelligence to identify impending situations and alert the monitoring centre before they turn into actual incidents.
This reduces the manpower required to provide monitoring services, removes the risk associated with fatigue and loss of concentration while improving the quality of service delivered." The future - and the present! - for Visual IoT is bright: monitoring with visual devices will not lead to a drab, grey, oppressive "Big Brother" dystopia as predicted by some in the past. ### Digitisation critical if retail supply chain is to meet customer experience demands, says logistics leader Wincanton Wincanton issues guide to the digitised supply chain to help companies plan for industry transformation Wincanton, the largest British logistics company, has today launched a roadmap for businesses looking to transform their customer experience through digitisation of the supply chain. The rise of the digitally discerning consumer and business customer has brought the logistics sectors to a seminal moment, where recognition of this new world and the uptake of digital technology is critical to remaining relevant. ‘The Wincanton Guide to the Digitised Supply Chain’, launched today, contends that a connected, always on, world has led to a new expectation of experience driven by businesses such as Amazon and Netflix that are able to deliver anything, whether it be goods, services or entertainment, on demand. The guide draws on the knowledge and experience of Wincanton experts, and organisations such as IBM, Zebra Technologies, Russel Reynolds and Segro. It provides businesses that are dependent on a resilient and functional supply chain with insight into the rapidly changing digital landscape, which is, in turn, transforming customer expectations. By adopting a ‘near and next’ mindset, Wincanton has looked at the key themes that are set to define the sector in the short and medium-term, with the aim of providing actionable business advice to businesses that are already starting to make the shift, or prepared to do so immediately. Adrian Colman, CEO of Wincanton, says: “It’s all too easy to be dazzled or confused by talk of the future, and one of the biggest issues facing businesses in our sector is the sheer number of technologies there are to learn about and choose from. “At Wincanton, we are gearing up for the digitised supply chain by developing a deep understanding of what it takes to lead. 
We have produced this guide to cover the most relevant and actionable aspects of digitisation in the supply chain, with reference to the topics and tools that will bring benefits to our customers, and ultimately their customers or consumers as well.” A selection of the themes that are covered in the guide are: Blockchain – In a supply chain context, this has the potential to provide a new, more credible level of traceability and enable frictionless transactions on an unprecedented level Artificial Intelligence – Every asset has a voice, delivering insights that allow operations to run more efficiently and businesses to make better decisions with improved, real-time visibility Warehouse automation – The introduction of automation to the warehouse reinvents what has traditionally been a very labour intensive task, streamlining the process and improving convenience for the customer The Internet of Trucks – Technology will join together the driver, truck and consignment, transmitting data constantly to improve the accuracy of delivery and reduce waste Robotics and the hybrid workforce – Technology is helping to drive better and safe work environments and processes, and considering how to optimise the respective roles of both humans and machines in the new logistics workplace is critical To sign up and receive a full copy of the report, please follow http://go.wincanton.co.uk/digital-guide/chapter-1 ### Total control: how the IoT surpasses M2M Even though both machine to machine (M2M) communication and the Internet of Things (IoT) both concern connecting devices to each other, the IoT is built upon fundamentally more interoperable, more scalable, and more modern technologies. Let’s clear up the confusion many people have about each term and explain why they aren’t interchangeable. From notification to networks Almost ever since we learned how to tell machines what to do, we’ve wanted to know how they do. Cycle counters, oil gauges, error lights, error messages - all these were designed to either assure us the machine was working well or tell us why it can’t function at all. Then, we got smarter. We built sensors into the machines and wired them up to a primitive silicon brain, like a programmable logic controller (PLC) or microcontroller. These silicon brains became the central hub of the machine’s nervous system, a simple electronic network which now carried the constant chatter between the sensors and the controller to more general status indicators: LED and LCD alphanumeric displays which reported more specific error and status messages, like “PC LOAD LETTER”, or “no disc”. Our next step was to make the machines smarter, and so we connected the rudimentary networks inside the machines to larger networks outside the machine. For industrial machines, this led to the wide adoption of wired interfaces like RS-485 and CAN, as well as wireless communication over Zigbee, Xbee, and proprietary cellular networks. While this connectivity became useful for us to remotely access and monitor a device, we quickly discovered the benefits for having large number of machines report their data to another machine, which we would then use to transfer data to and from all the attached devices, as well as to remotely control them. This is the genesis of machine 2 machine (M2M) communication. And M2M was good. 
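As a rough sketch of the pattern just described - many machines reporting readings back to one central machine - the snippet below shows the kind of minimal, bespoke exchange early M2M systems relied on; the hostname, port and terse message format are invented purely for illustration:

```python
# Illustrative only: a made-up, minimal M2M-style exchange over a raw socket.
import socket

def report_reading(host, port, machine_id, temperature_c):
    # Early device links often used terse, vendor-specific line formats like this one.
    message = f"{machine_id};TEMP;{temperature_c:.1f}\n".encode("ascii")
    with socket.create_connection((host, port), timeout=5) as connection:
        connection.sendall(message)

# A field device pushing one reading to the site's central collector (placeholder address).
report_reading("collector.local", 9000, machine_id="PUMP-07", temperature_c=71.4)
```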
The fact that operations and service personnel could receive real-time operational data from their sensor networks and control systems meant that when a machine failed in the field, they were armed with the data necessary to quickly diagnose the faulty component, replace it, and resume operation. Breaking out M2M, however, could be better. The device networks are closed to the outside world, and that means those who monitor the machines (which in turn monitor a lot more machines) have to be physically present in the field or on the premises. Furthermore, these industrial device networks often send and receive messages in their own proprietary format, requiring proprietary software and even equipment to monitor, service, and support the machines on the network. The Internet of Things (IoT) breaks M2M out of its restrictive niches in two ways. The first is through support of the TCP/IP suite of protocols, which allows the machines to send their data not only through the local machine network, but to any other Internet-connected computer on the planet. Secondly, IoT leverages physical interfaces that are based on open, vendor-agnostic standards, like Bluetooth, Ethernet, Wi-Fi, 3G, and 4G cellular. Big data, big capabilities There’s another way in which IoT devices go far beyond M2M, and that is the integration of advanced analytics: trending, anomaly detection, forecasting, machine learning, and visualization. These are the same tools frequently used for business intelligence (BI), and that’s no mistake, as the cumulative value of all this real-time data from IoT devices is expected to drive increased efficiencies and profits in sectors as diverse as retail, manufacturing, shipping and logistics, and even healthcare. Just as M2M utilized niche (and often proprietary) protocols, IoT leverages the cloud, where the incoming data meets the scalable processing power, storage, and bandwidth required for performing analytics on the torrent of data in real time. Web interfaces or mobile apps hook onto the cloud, allowing access to all the IoT devices at any time from anywhere via the web. Connectivity, control, and monitoring are three general areas of overlap between M2M and IoT, but those similarities mask important differences. Where M2M offers networks that are islands and siloed data, IoT delivers integration, comprehensive analytics, and seamless access. While M2M continues to have trouble extending beyond industrial use cases, IoT is already transforming consumer appliances, home security, manufacturing, and logistics. IoT literally brings things onto the Internet, and what do those things bring in turn? Data that reveals endless actionable insights and business opportunities. ### How ‘human-in-the-loop’ will triumph in 2018 to make machine learning applications cheaper and more accessible There is lots of hype about machine learning models. However, the reality is that they are expensive to build, train, validate and curate. We require data scientists to manage this entire process. The more complex the data becomes, the larger the team of data scientists needs to be. Because of the amount of human intervention required, there is also a time lag. This leaves models increasingly atrophied as a business and its market evolve. This is the real bottleneck to machine learning. Even more so when the data involved is text or unstructured data. Introducing HITL. Data Science and Data Transformation Many enterprises have invested in big data infrastructure, machine learning and data science teams.
However, many have reached a crunch-point where the amount of effort required to gain insight from their data is prohibitive. This is especially true with data scientists cleansing and transforming data. The primary reason why most companies are analysing less than 1% is that unstructured/text data makes up c. 90% of data. Estimates place the average time spent cleansing and transforming data at 80% (New York Times and Crowdflower). Imagine a world where machine learning is cheaper and quicker to deploy. We won't require data scientists to tune and drive models. Imagine how nimble companies could be then. HITL and Machine Learning Models Human-in-the-loop (“HITL”) is a branch of artificial intelligence that leverages both human and machine intelligence to create machine learning models. A human-in-the-loop approach involves people in a virtuous circle where they train, tune, and test a particular algorithm or process. It can, however, mean a lot of different things, from simulating with a human observer involved, through to humans ‘labelling’ training sets for machine learning. Focusing on the machine learning use case, whilst universally recognising HITL as the ‘ideal’ solution amongst data scientists, it still carries challenges regarding variability of different human classifiers or ‘labellers’ both in quality and also judgement in certain circumstances. It is also expensive, even when using non-data scientists, and testing and curating the machine learning models happens with a data scientist after each labelling exercise and at regular intervals to check and update the models to prevent them from degrading. Many HITL situations typically involve outsourcing to third-parties which carries risk and compliance issues (e.g. GDPR) as well as the consistency and retraining points referred above. If there are changes in the business or market, these can result in the data changing and therefore the models will need constant refreshing. So if the data science community agree that HITL is the optimal solution, how do you optimise HITL itself to address these challenges above? HITL Optimisation One example addressing this new approach to HITL is from Warwick Analytics, a spin-out from the University of Warwick. The company has a new technology called Optimized Learning. It can dramatically reduce the requirement for labelling in the first place, minimising human intervention. The product resides in PrediCX, and it is a supercharged ‘labelling engine’. The basic premise is that PrediCX judges its uncertainty. It then invites user intervention to label (aka tag or classify) things manually that it needs to maximise its performance. This in turn leads to minimum human intervention. The human intervention just needs to be someone with domain knowledge and doesn’t need to be a data scientist. The labelling required is ‘just enough’ to achieve the requisite business performance. Also, because it invites human intervention when there’s uncertainty, it can spot new topics. This keeps the models maintained to their requisite performance. Where there are inconsistencies in labelling, it performs cross-validation and invites intervention. If there are differences in labelling, labels can merge or move around in hierarchies. If the performance at the granular level isn’t high, then it will choose the coarser level just as a human might. It’s human-in-the-loop but the human element keeps to an absolute minimum, and the human doesn’t have to be a data scientist, making it so much more accessible. 
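As a generic illustration of that uncertainty-driven loop - a minimal active-learning sketch using scikit-learn, not PrediCX itself or its actual algorithms - the model below is trained on a small labelled seed set and then asks a human to label only the items it is least sure about:

```python
# Generic active-learning sketch: ask a human to label only the least-certain items.
# This illustrates the human-in-the-loop idea in general, not any specific product.
import numpy as np
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

labelled = [("my parcel never arrived", "delivery"),
            ("card was charged twice", "billing"),
            ("where is my order", "delivery"),
            ("refund the duplicate payment", "billing")]
unlabelled = ["payment taken but no confirmation email",
              "courier left the package at the wrong address",
              "how do I change my delivery slot"]

texts, labels = zip(*labelled)
vectoriser = TfidfVectorizer().fit(list(texts) + unlabelled)
model = LogisticRegression().fit(vectoriser.transform(texts), labels)

# Score the unlabelled pool and surface the lowest-confidence items for a human to label.
confidence = model.predict_proba(vectoriser.transform(unlabelled)).max(axis=1)
for index in np.argsort(confidence)[:2]:           # the two least-certain examples
    print(f"Please label: {unlabelled[index]!r} (confidence {confidence[index]:.2f})")
```

Each answer the human gives goes back into the labelled set, the model is refitted, and the cycle repeats until the required performance is reached.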
Customer Interactions and Time One example of how this latest technology is being used is a global industrial manufacturing company receiving many thousands of customer interactions per day from engineers working at factories using its products. Most of these queries are handled through the company's abundant online resources, although thousands still reach the company by telephone or via the webforms that raise tickets. Time is of the essence, particularly if a manufacturing line stops. This, combined with the vast array of complex products, means that the company needs a large number of skilled, expert operators in its contact centre. The company used PrediCX to accurately classify the incoming queries and match them with the ‘next best response’, be it a resolution, or a request for further clarification of the symptoms or details needed to provide one. A human operator would validate this before replying to the customer. Accuracy scores for the labelling enable thresholds that automate the reply when the confidence is very high, or guide humans to choose the best response if there are competing options. It is easy to plug into any system via an API. HITL Labels As well as dramatically cutting down the time for operators to assist, the labels themselves proved invaluable for the managers of the knowledge base, helping them understand which queries were being asked about most often, and providing ‘early warning’ of new, emerging issues which would not otherwise be picked up by machine learning. This insight was used to enhance and develop the knowledge base accordingly, and also to optimise the online resources, enabling customers to search for issues in free text and retrieve the appropriate response without the need to contact the contact centre. There was also the benefit that product managers could understand from the labels what was happening with their products, whether enhancements were needed to usability and reliability, and even which factors were likely to predict an issue, helping to prevent recurrence. In conclusion, whilst human-in-the-loop is widely recognised as the optimal solution for machine learning, it is fraught with practical problems. There are, however, technologies appearing which have already been tried and tested and are deployed to great effect using HITL. Truly, man and machine can work together to improve outcomes and maximise the potential of both. ### 2018 will be the year of Robotic Process Automation With everyone now back in the office, renewed, reinvigorated and ready for the business challenges that 2018 is set to bring, it’s timely to make a few cautious but confident predictions about what the ‘next big things’ in business tech will be over the coming year, like RPA. 2017 was a tumultuous period in both politics and society, with Brexit changing the economic scene across Europe and Hollywood held firmly under the spotlight as Harvey Weinstein faced harassment allegations. However, aside from scandal and suspicion, last year was also a transformative time for technology. Virtual reality became the latest must-have for both organisations and consumers. The Internet of Things (IoT) continued to grow at scale. The rapid innovations and developments in connected, autonomous vehicles surged. On the other hand, there has also been much hype and hyperbole reported on the impact of VR, AI and other technologies, including machine learning.
Diluting the AI hype There’s no arguing that AI technologies are rapidly growing. Both in sophistication, and regarding the practical applications that they can offer. SMEs and enterprises alike could benefit. [clickToTweet tweet="We can reasonably predict, with certainty, that 2018 is NOT going to be ‘the year’ of #AI" quote="We can reasonably predict, with certainty, that 2018 is NOT going to be ‘the year’ of AI"] That being said, we can reasonably predict that 2018 is NOT going to be ‘the year’ of AI. This is despite the tremendous amount of excitement and buzz around the business potential of the technology. Not to mention the tabloid scaremongering around the potential for a mass robot takeover. Looking at both large and small businesses, there is still no concrete use-cases of intelligent learning technology. This includes the advantages that this technology can bring to the way organisations operate. CIOs and IT managers alike should focus on the here and now. They shouldn't focus on what AI may or may not offer them in 2022, 2030 or further into the future. This also means that businesses need to think more about their organisation in its current state – by clearing up and improving efficiencies. This year is a year to reflect on how many employees you have doing what and whether or not it is the most valuable means of doing things, regarding both time and resource. Don’t cling to AI, machine learning or other buzzwords in the hope that these technologies will magically solve your basic business challenges over the coming year.   Turn down the buzz terms Indeed, this brings us on to our next prediction for business technology over the coming year. Buzzwords should be banned! Admittedly, this is something that is more of a fervent hope and prayer than a straightforward prediction. The fact remains, the hype around specific technical terms throughout 2017 has led to unnecessary levels of confusion. It has also lead to misinformation across B2B technology media. IoT, digital transformation, AI and other much-overused acronyms and buzzwords risk losing any meaning whatsoever. They need to be used with material examples of the technology in practice. They can then exemplify and clarify how they can be of benefit in specific cases. Again, business leaders and CIOs need to focus on the present, not be tempted to future-gaze too much. Deliver a plan that focuses on the year ahead and how to make the best use of your current technologies to drive value.   The relationship between shareholders and CIOs So don’t future gaze. Instead, focus your time and energy on delivering shareholder value in 2018, instead of believing too much of the hype and bluster around IoT, AI, virtual reality, blockchain or the other popular tech jargon of the year just gone. In 2018, the CIO’s that do well will be those that get to grips with software robotics and figure out the ways of delivering shareholder value from using this much-vaunted technology. Don’t forget that CIO’s don’t get to stick around for a long time, so you are unlikely to be able to keep the shareholders happy over the coming year from the mid to long-term returns promised by those technologies mentioned above.   RPA will be the only acronym that matters Robotic Process Automation (RPA) in the back office is the essential technology acronym to remember in 2018. Business leaders will need to look at how to transform the back room through RPA. 
Using RPA is going to deliver a truly competitive edge to enterprises in the coming year, as it can free up people to work on the more creative aspects of keeping a company competitive. In the year ahead, prioritising RPA in the back office will mean companies won’t just be treading water but instead genuinely preparing for the business wins on offer from genuinely scalable robotics in the near future. ### #DisruptiveLIVE - BETT Technology in Education LIVE - Day 2 Event: BETT Technology Education Event 2018 Live Dates: 24-26 January 2018 About the show: Disruptive will be hosting our live shows direct from the ExCeL London for three days. Join us live as we bring you the latest educational technology interviews, trends and startups. ### Bett 2018 - Tom Thacker - Century Tech Tom Thacker, Curriculum Lead for Century Tech, talks about his company, what they do and why they are going to the Bett Show 2018. ### #DisruptiveLIVE - BETT Technology in Education LIVE - Day 1 Event: BETT Technology Education Event 2018 Live Dates: 24-26 January 2018 About the show: Disruptive will be hosting our live shows direct from the ExCeL London for three days. Join us live as we bring you the latest educational technology interviews, trends and startups. ### Bett 2018 - Iain Bell - MINTclass Iain Bell, from MINTclass, talks about what the company does and why they are going to the Bett Show this year. ### Just what is all the fuss about NSX! - James Manning - Dell EMC Compare the Cloud was joined by James Manning, Network Systems Engineer at DELL EMC. In this demonstration, James discusses what is all the fuss about NSX. Find out more about Dell EMC here: https://www.dellemc.com/en-us/index.htm At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. ### Critical terminology that could help you combat CNP fraud Among the challenges facing ecommerce today is fraudulent orders made with stolen credit card information (including the card number, expiration date, and even the security code). The technical name for this is card not present (or “CNP”) fraud, which gets its name from the fact the card is never physically presented to the merchant. The merchant is thus unable to examine the card and check the name and face of the shopper against a valid photo ID (usually a driver’s license), and thereby both prove the shopper’s identity and confirm that the purchase is authorized by the cardholder. CNP fraud is obviously bad for the person whose credit card details have been stolen - disputing charges and ordering a new card are never fun - but CNP fraud is also seriously unfun for the etailer. That’s because when that victim does call and successfully dispute the charge, a chargeback (a refund from the merchant’s bank to the victim’s account) is initiated. [clickToTweet tweet="Device fingerprinting allows merchants to more easily identify returning customers. #Fraud #Tech" quote="Device fingerprinting allows merchants to more easily identify returning customers. "] Although the cardholder gets their money back, the merchant is ultimately stuck footing the bill…and then some. In addition to the refunded charge, the merchant incurs the cost of replacing the merchandise, and paying a chargeback penalty to their card payment processor. 
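To see how the numbers stack up, here is a rough, illustrative calculation; every figure below is an assumption made for the sake of the example, not an industry benchmark:

```python
# Illustrative chargeback economics; all figures are assumptions, not benchmarks.
order_value = 100.00       # what the fraudster "paid" with the stolen card
cost_of_goods = 60.00      # wholesale cost of merchandise already shipped (assumed)
shipping_cost = 10.00      # fulfilment and postage (assumed)
chargeback_fee = 25.00     # penalty from the payment processor (assumed)
admin_overhead = 15.00     # staff time spent handling the dispute (assumed)

refund_to_cardholder = order_value
total_cost_to_merchant = (refund_to_cardholder + cost_of_goods
                          + shipping_cost + chargeback_fee + admin_overhead)

print(f"Original order value:  ${order_value:.2f}")
print(f"Total cost of fraud:   ${total_cost_to_merchant:.2f}")
print(f"Multiple of the order: {total_cost_to_merchant / order_value:.1f}x")
# On these assumptions the merchant is out $210 - just over twice the order value.
```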
All told, the actual cost to the merchant of a chargeback can be more than double the cost of the original order. Tech to the rescue? As a result, etailers adopt all sorts of technologies and techniques to detect and decline fraudulent orders, from ecommerce fraud protection solutions based on machine learning, to legions of analysts who manually review incoming orders. Even though many of these measures can be quite costly, they can more than pay for themselves by reducing losses from costly chargebacks. In the summary of ecommerce order fraud at the beginning of this article, we’ve had to define two key terms: CNP fraud and chargebacks. If you’re an online merchant trying to combat CNP fraud, that’s only the beginning of the lexicon you’ll need to be familiar with as you consider, evaluate, and adopt the many anti-fraud solutions available. So let’s build up your CNP fraud vocabulary, starting with the names of important data points used by fraud-screening algorithms and analysts alike. One of these is the IP address: the number assigned to any device connected to the Internet, allocated on a broadly geographic basis. IP addresses can indicate fraud if, for example, the IP address from which an order originates is assigned to one country while both the shipping and card billing addresses are in another. Speaking of shipping and billing addresses, our next term is the name of a system which can be used to verify a match between part of a cardholder’s address on file with the card issuing bank and the billing address supplied during the order: AVS. An AVS mismatch can indicate fraud, while a match can be an indicator that the order is legitimate. However, an AVS match can also occur on a fraudulent order (criminals can often buy stolen card details complete with matching billing addresses). Ditto for IP addresses: fraudsters often use IP proxies (basically another computer which relays traffic to and from another computer) to hide their true location, making them appear to be somewhere they’re not. More data, more confidence There are many other clues which can be used to distinguish real shoppers from scammers, but they all have similar caveats. This is why modern CNP fraud protection has shifted to more holistic approaches which consider all the individual data points in an attempt to discern the story behind the order. So, you may hear terms like machine learning (algorithms which allow computers to learn from data instead of being explicitly controlled via commands), and behavioral analytics (the analysis of web browsing and shopping behavior). A third term you’ll come across is device fingerprinting (the identification of specific users and computing devices which allows tracking across web sites or previous orders). Device fingerprinting allows merchants to more easily identify returning customers. Since customers who have placed orders with a merchant in the past (without a chargeback afterward) are extremely unlikely to place fraudulent orders in the future, their orders can be safely approved without much additional scrutiny. Behavioral analytics is able to extract, compile, and compare data from online activity (which pages a shopper looked at, for how long, in what sequence, etc.). This data is very useful because a real shopper’s behavior is very different from a fraudster’s.
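As a toy illustration of how those individual signals might be gathered before any modelling happens - the field names and example order below are invented for the sketch, not a real vendor's schema - each incoming order can be reduced to a small set of features:

```python
# Toy feature assembly for CNP fraud screening; field names and values are invented examples.
def order_features(order):
    return {
        "avs_match": order["avs_result"] == "match",                          # AVS check
        "ip_matches_billing": order["ip_country"] == order["billing_country"],
        "known_device": order["device_fingerprint"] in order["previous_devices"],
        "ship_equals_bill": order["shipping_country"] == order["billing_country"],
    }

example_order = {
    "avs_result": "mismatch",
    "ip_country": "RU", "billing_country": "GB", "shipping_country": "GB",
    "device_fingerprint": "d41d8cd9", "previous_devices": set(),
}

print(order_features(example_order))
# No single feature is conclusive (note the caveats above); a model weighs them together.
```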
Machine learning technology is able to take in all the above data (including AVS match results, IP addresses, and shipping and billing addresses) and learn which combination of what values most accurately predict whether that order can be safely approved or confidently declined. Hopefully, by reading this basic glossary you appreciate how much more there is to know about CNP fraud and how to combat it. By smartly adopting the best tech and practices, your definition of CNP might very well be Chargebacks Not a Problem. ### Backup & Storage Options for Small to Mid-size Businesses (SMBs) Startups and small businesses focus on cost efficiency as it dictates how soon they can reach breakeven. In any enterprise, data is the bloodstream. Efficient data utilisation can improve efficiency and contribute to productivity. Similarly, data loss means a significant hindrance, especially for startups. Even small businesses cannot tolerate data loss, specifically mission-critical data. This is why it is of the utmost importance for startups and small businesses to employ cloud backup solutions. It is assumed that large enterprises need cloud-based solutions while Small to Mid-size Businesses (SMBs) don’t have to invest in it immediately. This is a costly assumption that can bite you in the back. Verily, there are other backup options that SMBs can employ for effective backup strategies. However, an effective disaster recovery plan is incomplete without the cloud. Backup & Storage Options for Small Businesses & Startups The first thing that needs to be thrown out the window is tape storages (like USBs, Internal and External Hard drives). The reason for that is simply because it’s expensive and unreliable. If you compare the cost of backup appliances versus tape storage, you will see a drastic difference. Would you choose to back up your mission critical data on a flash drive? If so, what if the drive becomes corrupt? That means all of that data goes in an instant. Can a small business or a startup endure such data loss? That’s why let’s leave our tape storage outside the premises and keep them there. Storage Appliance: Effective File Sharing & Storage Data storage requirements immensely govern the chosen storage solution. For most SMBs, the best answer is not storage on the cloud, but rather Network Attached Storage (NAS). If they only have file hosting and sharing requirements, then this is the best solution. That’s because data hosting and access over a local network is faster than enterprise cloud storage. This is the reason why even large enterprises, utilise local infrastructure to move data efficiently within a particular office while using cloud storages to move it efficiently and safely between regionally separate offices. With the utilisation of a storage appliance, these SMBs can ensure agile data flow and use cloud connect services to create archives in the cloud. That way, they can have the best of both worlds. Public Cloud Backup & Storage Besides the acquisition of enterprise backup appliances, SMBs can also procure backup solutions directly via virtual infrastructure. To do this, they have to ensure that their Ethernet connection is reliable enough to facilitate scheduled backups and can be counted on if a situation arises where recovery is required. Any disruption in the Ethernet connection can translate to downtime or data loss; neither is acceptable for any enterprise. Several mainstream public clouds can provide storage: Amazon, Google, Microsoft etc. 
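As a minimal sketch of the simplest use of these services - copying a file up and sharing it via a time-limited link, assuming AWS credentials are already configured and using a placeholder bucket name - it can look like this:

```python
# Minimal sketch: copy a file to object storage and share it via a time-limited link.
# The bucket name and file paths are placeholders; AWS credentials are assumed configured.
import boto3

s3 = boto3.client("s3")
s3.upload_file("finance/2018-01-ledger.xlsx", "example-backups", "ledger/2018-01.xlsx")

# A pre-signed URL lets a colleague download the copy without needing their own credentials.
share_link = s3.generate_presigned_url(
    "get_object",
    Params={"Bucket": "example-backups", "Key": "ledger/2018-01.xlsx"},
    ExpiresIn=3600,   # the link stays valid for one hour
)
print(share_link)
```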
You can create copies of your files on these storages and share them using links. The critical thing to understand here is that cloud storage is vastly different from backup in the cloud. Cloud storage solutions are ideal for sharing and working in a parallel manner within a team, increasing work efficiency and overall productivity. However, storage solutions are not reliable for backups. You need to acquire backup solutions to create backup copies of your sensitive data. [clickToTweet tweet="Public #clouds are providing diverse solutions for #backup. #disasterRecovery" quote="Public clouds are providing diverse solutions for backup."] Public clouds are providing diverse solutions for backup. You can backup your data cost-efficiently as long as you analyse it and use the appropriate storage tiers. For instance, if you move your archival data to Amazon Glacier; it promises to be cost-effective in comparison to high access frequency storage tiers like AWS Standard S3 or Azure Hot Blob Storage. While using public clouds, it needs to be stated that enterprises need to focus on security protocols. Recently, several large enterprises classified their sensitive data on publicly accessible folders. This led to costly data leaks, damage to the brand name and loss of clientele. All of which could have been avoided by the efficient utilisation of security options already available in public clouds. Private Cloud Solutions Large enterprises utilise this solution due to the closed nature of their data. For instance, the metropolitan police department would be better off using a private cloud as compared to a public cloud due to various CJIS regulations. Unlike public clouds, there’s a lot of privacy for the data that you have stored. You can control access to your data effectively and ensure that it remains within reach of the users of your choosing. You can utilise a backup infrastructure to deploy your private cloud, or you can acquire the services of a private cloud service provider and use the infrastructure in their data centre to implement it. Hybrid Cloud Solutions The hybrid cloud solution is an exciting mix of many things. Hybrid solutions tap into various options: backup infrastructure, public clouds and private clouds. For instance, an SMB acquires a NAS appliance with cloud connect services. This SMB can configure the device to backup and share their data over a network, then also make copies of this data on a public cloud. By using more than one appliance, they can also tap into private clouds or multiple public clouds and acquire beneficial services from all of them. This makes Hybrid cloud solutions a very flexible and highly recommended solution for SMBs. ### #TheCloudShow - Ep 1 - Are Providers Doing Enough for the SMB Market? Are Cloud Service Providers Doing Enough for the SMB Market? Compare the Cloud are peeling the layers back on the Cloud Industry. Answering the questions on the lips of both providers and customers - hosts Jez Back and Andrew McLean will cover the latest trends and best interviews in this space. Questions from the audience are welcome - Tweet us @comparethecloud ### How Far Can Immersive Technology Encourage Business Growth? We are now living in a time where technology is almost evolving faster than we can keep up – and technologies that are already available to us such as Artificial Intelligence, seem almost futuristic. But, with these advancements come new opportunities - particularly for businesses. To benefit, however, you must keep up.   
We have always looked for ways to make technology more streamlined, so our day-to-day lives are easier and we can focus our efforts in the areas that are most beneficial for growth. But, today, as well as this we have technologies that improve the experience for those working for the business and more importantly, the consumer. So, with this in mind – can cutting edge immersive technology encourage business growth and, if so, how far? What is immersive technology? Immersive technology blurs the line between the physical world and the digital world. The environment created is so realistic that the brain is tricked into believing that what you are experiencing is real.  Augmented and virtual reality technologies are classed as immersive. Can it encourage business growth? Immersive technology is already being used to encourage growth within many businesses. The difficulty comes when determining how it can be beneficial to your particular business and its needs. The Guardian, for example,is already using virtual reality as a new way for us to receive news. It uses the key principles of journalism but let us walk in the shoes of the protagonist. In one example, Limbo tells a story that we are presented with on a daily basis - the plight of asylum seekers. But this virtual experience of waiting for asylum enables us to see it from their point of view and therefore, perhaps, rethink our own views on asylum seekers.   This case shows the power of the format but how far it can encourage business growth will ultimately depend on how you use it and whether your business is compatible with this type of technology. How could this work for businesses? Ultimately, it will depend entirely on what your business is but immersive technology is the perfect way to enable potential customers to experience your product or service when it isn’t physically in front of them. Let’s say, for example, that you are an estate agent. A potential buyer turns up and you discuss what they are looking for in a property. You then have to go and visit those properties - which is incredibly time consuming. The benefit of immersive technology is that you can show them round several houses and allow them to get a real feel for it without ever having to set foot through the door. This becomes even more beneficial when they are looking to move across the country or even overseas. Likewise, perhaps those selling furniture could enable customers to see how that sofa they are eyeing up would fit in with the current décor and size of the room – before they commit to buying it and find out later isn’t quite right. Immersive technology has many benefits for businesses and, if used in the right way, it can encourage growth and let brands ‘sell an experience’. The early adopters will stand out from the crowd and could steal a march on their rivals. But, how far will it go? Only time will tell.   ### Finding The Right Cloud Skills For Your Business When it comes to IT strategies for 2018, cloud computing is sure to be the one technology at the core of almost every business’ plan. The problem with rolling out new tech — especially one which will radically change the very foundation of a company’s technical infrastructure — is that current in-house skills aren’t necessarily going to match up with what is required by the new system. 
As businesses continue to turn to the cloud to improve efficiency, cut costs, and revamp processes, suddenly there are thousands of organisations ready to implement their new cloud-first action plan, and finding they don't have the internal expertise to execute it. In need of new skills, businesses turn to their HR departments, who are tasked with hiring the absolute best cloud professionals on the market. But, as it turns out, every other business with a grand plan to migrate to the cloud is doing the same thing. So, what now?

The cloud skills gap problem

There is already a significant supply and demand issue when it comes to cloud technology professionals; a huge percentage of businesses report that they are struggling to hire people with the right cloud expertise. Despite the many alluring advantages cloud computing offers, many companies are postponing their migration because of the shortage of required skills, both internally and externally. The mass shift to the cloud will create an enormous number of jobs, but at present, IT professionals are playing catch-up to a new and rapidly evolving technology. Those just entering the market — professionals who are "cloud-born" — are just beginning to build their experience, while seasoned IT veterans are attempting to master an expansive new technology, which, until recently, no one was really sure would become so mainstream. The overlap in the Venn diagram between cloud skills and IT experience is slim; it's a pool with too many fishermen and not enough fish. To compound the problem further, hiring managers are not only competing with other end users for talent, but also the big-name cloud service providers themselves, with companies like Amazon, Microsoft, and Oracle jostling to build teams that will help them move forward in the race for cloud supremacy. Demand for cloud talent isn't about to die down anytime soon. A recent report from Nigel Frank found that the percentage of businesses hosting applications in the cloud had increased by 26% on the previous year. Research also showed that tech partner companies expected managed cloud infrastructure services to be the most in-demand technology by their clients in 2018, beating out business and productivity applications for the first time.

Approaching your search

Unfortunately, there is no assembly line pumping out qualified cloud professionals into this combative market. With high-paying roles and new opportunities being dangled in front of them, IT professionals will be working on upskilling to meet demand, but closing the skills gap will take time. Businesses looking to deploy cloud solutions now can't wait around for burgeoning talent; they need to have a plan in place in order to recruit in this candidate-scarce market. So what can businesses do to make their search for great cloud talent more effective, and ensure they have the skills they need to move their cloud strategy forward? The first step is to evaluate your needs. Look at your cloud strategy and work out exactly what you require in a cloud professional. A good cloud pro will have a well-rounded knowledge of cloud computing, but, like all other professionals, will tend to have a specialism. Think about what you plan to use your new cloud platform for. Is it mostly for data storage? Do you want to develop and build apps? Will you be managing customer data in the cloud?
Detail your requirements and decide if you'll need someone with experience in a particular area, whether that be migration, security, databases, or programming and development. With many cloud platforms to choose from, each requiring distinct aptitudes, it'll also help hone your search if you target professionals who have experience with your chosen provider. Due to the flood of wanted ads hitting the job market, businesses may come across applicants looking to meet, and capitalise on, that demand without having a robust enough skillset. Though the pressure may be on to bring cloud professionals to your team quickly in the face of such steep competition, always keep in mind: you can hire right or you can hire twice.

Top skills to look out for

Once you have your basic requirements mapped out, you should consider any additional skills worth looking for in potential candidates. The technical skills required to oversee cloud infrastructure are not the same as those once needed to deploy servers and run internal data centres; with infrastructure housed off-site, traditional management skills become much more valuable. Don't forget to look out for soft skills such as communication, organisation, and sound financial literacy. Naturally the most important thing to look for in a potential candidate is the right experience; do they have an understanding of your chosen cloud provider? Have they performed many migrations? Do they have experience building or developing on a cloud platform? However, with new technologies experience can be hard to come by, making it tricky to verify a candidate's skill set. Certifications can help you see exactly where their strengths lie, and are a good indication that the applicant is enthusiastic about the technology, and keen to keep on top of new developments. Look out for these trusted certifications when sizing up potential candidates:

Any AWS Certifications
CCSK (Certificate of Cloud Security Knowledge)
CCSP (Certified Cloud Security Professional)
Certified OpenStack Administrator
Cisco Certified Network Associate (CCNA) - Cloud
CompTIA - Cloud+
Dell EMC Cloud Architect (EMCCA)
IBM Certified Solution Advisor - Cloud Computing Architecture
MCSE (Microsoft Certified Solutions Expert) - Cloud Platform and Infrastructure
Red Hat Certified System Administrator (RHCSA) - Red Hat OpenStack
VMware Certified Professional (VCP) - Data Center Virtualization

Don't look for a unicorn when a zebra will do

That doesn't mean lowering the bar by any means; it's simply a case of focusing your search. A good cloud professional will have a working knowledge of all the basics, but with such a broad range of skills and specialisms within the cloud, you're unlikely to find someone with expert-level experience in every single area of bleeding-edge tech. Setting your heart on a unicorn cloud professional is going to make things even harder; by evaluating your prerequisites, and distilling the core skills that your organisation really needs to get the most out of its cloud services, you have a much higher chance of finding great talent that fits your business. Of course, it's fine to include "nice to have" skills, but don't expect an influx of applications if you're asking for the moon. The cloud is constantly evolving, and provided you find a professional who is engaged and passionate about their own development, they'll pick up new skills as they go.
One final tip: demand has created something of a bidding war for qualified cloud experts, and those in the market for a new role command high salaries. If you're moving to, or already utilising, cloud services, you might find that you're spending less on infrastructure; your hiring budget would be a good place to reinvest those savings to ensure you're offering a competitive and attractive compensation package.

### Embracing The Cloud Increasingly Integral For Growth Businesses' Commercial Success

Growing adoption of cloud computing has facilitated rapid business change over the past decade. For dynamic businesses, cloud offers new ways to scale their growth and better compete with larger, established firms. For businesses of all sizes, cloud is now a mainstream choice and accepted across industries. A recent survey from Cloud Industry Forum (CIF) revealed that the overall cloud adoption rate in the UK is 88%, with 67% of users expecting to increase their adoption of cloud services over the following year. Small businesses using cloud are growing faster and are often more profitable than those who aren't. Research by Deloitte revealed that small businesses using an above average number of cloud services grew 26% faster than those who did not and were 21% more profitable on average. Further research by Exact suggested that small businesses that had made the move to cloud doubled their profits and achieved 25% additional revenue growth compared to their cloudless contemporaries. In fact, 85% of those surveyed by Deloitte believed that using cloud services enabled their business to scale and grow faster. Key benefits of using the cloud are the pay-for-what-you-use model, the potential for productivity improvement, faster time to market and enhanced cyber-security. Going forward, spending on cloud services will continue to grow as it is increasingly seen as a utility and as apps become cloud native, allowing firms to improve efficiency and drive productivity gains. Below, Paul O'Shea, CEO of Kumoco, the management consultancy firm that specialises in agile and cloud consulting, explains the benefits of the cloud for growth businesses and shares some thoughts on the future of cloud adoption and innovation.

Pay for what you use is cost effective

One of the biggest draws for a startup to use cloud is cost. Somebody else runs your stuff so you don't need to buy the equipment, rent the space to house it or hire staff to manage it. This means that you can focus on adding value to your customers and leave the undifferentiated heavy lifting of running an IT estate to others. With cloud, you need to pay only for what you use, much like a utility bill. So if you have fewer customers at weekends, expect to pay less at weekends. From a software licence perspective, you can add users as your team grows, meaning you can make use of powerful enterprise-grade software for pennies while your team is still small. Cloud-based services allow growth businesses with fluctuating or fast-changing demands to scale up or down at a moment's notice. This level of operational agility gives businesses a real advantage. The pay-for-what-you-use model allows companies to adapt their technology resources to accommodate big swings in their operational business, quickly and cost-effectively.

Faster time to market

'Speed wins in the marketplace' is a truism in today's fast-paced market environment.
To compete in an increasingly digital battlefield and capitalise on new opportunities, companies need to be able to bring new ideas to market at speed. If a project takes months or years to deliver, it can be too late. The flexibility of cloud services can shorten the time between an idea being developed and the product being released. Rather than management having to spend significant time ensuring that there is sufficient capacity and IT resource in place, they can focus on building and launching the new product or service as quickly as possible.

More productive working

All of your team can have access to all of your tools from anywhere, 24/7. This fosters better collaboration and can make you a truly global organisation from day one, allowing you to stay in touch with your team from anywhere. From video calling and raising invoices to managing your customer relationships, all of this can be done from any device or location.

Better security

As a startup, you won't have access to the security resources of large organisations like Google or Microsoft, but thanks to cloud you can avail yourself of them. With your software (SaaS), infrastructure (IaaS) or platform (PaaS) security taken care of, you can focus on securing your application. From a compliance perspective, you'll find that many of today's cloud offerings are already fully compliant with a raft of international regulations like GDPR, whose provisions will soon apply. Additionally, with cloud, business continuity becomes straightforward with a wide variety of options at your fingertips. A small European startup could have their application running across data centres in Ireland and Germany with minimal effort.

Where next...

Spending on cloud is expected to grow at more than six times the rate of IT spending from 2015 through to 2020 and, by 2020, the International Data Corporation, the market intelligence provider, predicts that 60-70% of software services and technology spending will be devoted to cloud. The world is splitting into software vendors and software consumers. With cost as a key driver, all software consumers will eventually move to SaaS products. Only for competitive advantage will it make sense to develop software in-house, and only those developing will have need for IaaS and PaaS solutions. Cloud optimisation: with most having already adopted cloud, in 2018 there will be a shift of focus from adoption to cost optimisation. As cloud estates become larger, managing them from a cost and compliance perspective will only get more complex. To address these needs, Kumoco has partnered with ServiceNow to build a Cloud Management product. With AI as a Service becoming a reality thanks to large investments by IBM, Google, Amazon, Microsoft and Salesforce, it won't be long before we see the general availability of AI capabilities. Trust: with increasing adoption of SaaS solutions, identity and trust will become important topics. How do you manage identity across hundreds or even thousands of cloud services? With the global cloud services market estimated to be worth £190bn in 2017, according to Gartner, and cloud adoption strategies likely to influence more than 50% of IT outsourcing deals by 2020, businesses' adoption of the cloud appears set to accelerate further, helping to drive the development and expansion of growth businesses, which are such a critical part of most countries' economic success.

Key considerations when evaluating Cloud services:

How much will it cost me as I grow?
With most offering 'free to start' tiered pricing, it's important to check what the costs will look like once you've added all the features you need.

Is it as easy to get your data out as it is to load it in? What data migration tools are on offer? What if you want to move away later?

Does it make sense to have 10 different SaaS products when maybe 3 could meet 70% of your requirements at 30% of the cost? The downside of per-user licensing is that the costs can quickly add up if you need multiple SaaS products to get the job done.

Do you really need all those features? Probably the basic package will do. Think about the old versions of Microsoft Word: how many of the thousands of features did you actually need or use?

What is the impact of having your data spread across multiple tools? Can they integrate with each other? Will you be able to run analytics on your data? Check that your provider offers well-documented APIs and that these meet your integration requirements.

Does your provider meet your compliance requirements? Where will your data be stored?

What support can I expect? Check what is offered by your provider but also check for community forums and third-party add-ons / marketplaces. Skills / professional services: can you easily get help if you need it?

Where do I start? Below we have listed some cloud services that we believe are worth a look:

Accounting: Xero
Collaboration: Google G Suite, Office 365
Communication: Slack
Kanban: KanbanFlow, Trello
Storage: Google Drive, Amazon S3, Dropbox, Microsoft OneDrive
Cloud Platforms: AWS, Microsoft Azure, Google Cloud, Heroku, DigitalOcean
CRM: Salesforce, Zoho, HubSpot
Payments: Stripe
Source Code Management: GitHub, GitLab
Issue Tracking: Atlassian (JIRA), ServiceNow
Support/Live Chat: Intercom
Identity: Duo Security, Okta

### What Does 2018 Have in Store for the Cloud?

Cloud computing has come a long way since its beginnings in the first few years of the 21st Century. The origin of the term to describe the outsourcing of IT operations is debatable but, as early as 2006, it was a permanent and indisputable fixture not just in the IT sector lexicon, but in that of the corporate world in general. The concept, as we know it today, is the result of two major developments that came together around the turn of the millennium. The first was the emergence of the internet as a major feature of modern business functions, which enabled companies of all shapes and sizes to access a host of computer services online. The second was the evolution of the outsourcing of IT operations by major companies and even by businesses in the IT sector, as they became more confident that they didn't need to own or manage the infrastructure underpinning their computer systems. These factors combined to provide an opportunity for the likes of Google and Microsoft to offer browser-based applications to businesses to enable them to outsource IT processes over the internet.

Overcoming business concerns

Back in 2006, when cloud computing first appeared in the corporate world's collective consciousness, the model was alien to many people.
At first, there was great uncertainty as to how far organisations should trust cloud service providers with data, and there were questions about the security of the remote server networks where company information was being stored. This unease hasn't quite gone away – even in 2018, business leaders have concerns about cloud computing, bombarded as they are by media reports of high-profile security issues, such as last year's Equifax breach and controversies around WannaCry ransomware. These worries resonate in some sectors more than others – PCT has remarked upon a particular reticence in the financial services industry when considering putting their IT operations in the cloud, due to the perceived risk of hacking.

Changing perceptions

Despite ongoing worries about ransomware and similar issues, over the last two years the perceptions businesses have about the cloud have undergone a radical shift. Business leaders have begun to realise that secure data storage is not a question of being on or off the cloud. Rather, it depends on the overall security of the entire computing environment – regardless of whether it is in the cloud, or on a business's premises, in either a physical or network form. This shift is the result of the growing commitment by cloud providers – large and small – to protecting their customers' data, by creating huge teams dedicated to security and compliance. For instance, Amazon Web Services boasts more than 1,800 security controls around its services. Microsoft, meanwhile, reports that it has helped to rescue as many as 10 million computers infected by malware. Evidence suggests that this diligence has seriously boosted confidence in the legitimacy of cloud computing. Indeed, reports show that 80 per cent of Fortune 500 companies and at least 25 of the world's 38 largest financial institutions and insurance companies trust the Microsoft Cloud today to provide their business with IT operations that are efficient, reliable and, above all, secure.

Looking ahead

As we look forward to the future, there is every indication that cloud computing will continue to play an ever more important role in business operations across all sectors and markets. Forecasts from Forrester, for example, suggest that, by the end of 2018, over 50 per cent of companies worldwide will be using a minimum of one public cloud platform. The global public cloud market will reach a total value of $178bn, growing at a CAGR of 22%. From our perspective at PCT, we expect this trend to continue well into the next decade. By 2028, we believe that only a small minority of businesses will be operating without some form of cloud computing. As we move into the future, business leaders will grow less and less concerned about whether they should move their operations over to the cloud or whether it is secure. More and more, they will look at how they can access the cloud more effectively and efficiently, in a manner that best suits the needs of their business. We can expect the implementation of the cloud in the future to rely less on hardware, and to focus more and more on subtle combinations of software and applications to create an ever-greater array of services designed to meet the specific needs of both businesses and their customers. For instance, the taxi service, Uber, incorporates Google Maps' API for its in-app location finder, as well as Braintree as its payment processor.
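To make the pattern concrete, below is a deliberately simplified sketch of how an application might stitch together two hosted services over HTTP - a geocoding lookup and a payment charge. The endpoint URLs, keys and field names are placeholders for illustration only, not the actual Google Maps or Braintree APIs.

```python
import requests

# Placeholder endpoints and keys - illustrative only, not the real services' APIs.
GEOCODING_URL = "https://maps.example.com/geocode"
PAYMENTS_URL = "https://payments.example.com/charge"


def locate(address: str, api_key: str) -> dict:
    """Ask a hosted geocoding service for the coordinates of an address."""
    resp = requests.get(GEOCODING_URL, params={"address": address, "key": api_key}, timeout=10)
    resp.raise_for_status()
    return resp.json()  # e.g. {"lat": 51.52, "lng": -0.15}


def charge(amount_pence: int, payment_token: str, api_key: str) -> dict:
    """Ask a hosted payment service to charge a customer's stored payment token."""
    resp = requests.post(
        PAYMENTS_URL,
        json={"amount": amount_pence, "currency": "GBP", "source": payment_token},
        headers={"Authorization": f"Bearer {api_key}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


# The application's own code is reduced to orchestrating ready-made services.
pickup = locate("221B Baker Street, London", api_key="GEO_KEY")
receipt = charge(1250, payment_token="customer-token", api_key="PAY_KEY")
```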
In composing services this way, Uber accesses reliable, ready-made systems to ensure its service runs smoothly, without the need to invest in creating its own expensive bespoke solutions. Such models provide a good glimpse of how companies could use and combine different cloud applications to launch new business propositions in the future. As long as businesses can pay for these cloud services as they require them, this trend will continue to develop.

Into the future

Gazing even further into the 21st Century, we can expect cloud computing to transform further. Innovations, such as the advent of quantum computing, artificial intelligence, the growing use of bots and neural networks in big data, as well as the closer integration of humans with computers, will all serve to utterly redefine cloud computing and its place in society. With such changes, the cloud will become more than a nice-to-have. It will be truly omnipresent, integrating seamlessly with every facet of our business and personal lives. Here in 2018, it's clear that cloud computing is still in its infancy – it is still some time before we truly begin to leverage its full potential. When we do so, the possibilities for businesses and the way they operate will be almost endless.

### Why not build your own Cloud environment using OpenStack?

As more and more organisations start to take their first steps into the Cloud, pilot schemes and 'proof of concept' projects lead to buy-in from senior management, and strategic decisions to increase Cloud investment. Set aside the 'cloud native' start-ups and we're now getting to the stage where established organisations are moving at a brisk pace with the Cloud, leading to a DevOps way of working, enabling faster iterations and updates, and an ability to trial new products and services with less risk. This has resulted in more stringent demands on the control and availability of Cloud services, putting power and agility back into the hands of an organisation's IT teams. Public Cloud technologies like AWS, Azure or the Google Cloud Platform provide a lot of possibilities, but they leave a gap which OpenStack is aiming to bridge in the Private Cloud arena. Despite the recent decline in the adoption of Private Cloud operating systems reported in the 2017 'State of the Cloud' study by RightScale, OpenStack is an anomaly amongst its Private Cloud peers, and is steadily rising according to the same study.

But what is OpenStack?

OpenStack is an open source platform for developing, building and deploying private cloud IaaS platforms. It's powerful and can be administered through the command line, RESTful web services and APIs, as well as web-based dashboard controls. You can use these to allocate processing power and storage and associate network resources using your own datacentre kit.

How did OpenStack emerge?

OpenStack has its roots in a joint effort between Rackspace and NASA, but has evolved with the help of over 500 companies and organisations across the world. Red Hat, a big player in both the open source and enterprise markets, recognised the interdependencies of Linux servers and an OpenStack environment and has collaborated with OpenStack to ensure there are closely aligned product teams and support resources, with a focus on infrastructure, security and patching.
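As a small illustration of the command-line and API-driven administration described above, here is a minimal sketch using the openstacksdk Python library to launch an instance. The cloud name, image, flavour and network names are assumptions for the example (they would come from a clouds.yaml you configure yourself), not OpenStack defaults.

```python
import openstack

# Connect using credentials defined under a cloud named "mycloud" in clouds.yaml (assumed name).
conn = openstack.connect(cloud="mycloud")

# Look up the building blocks for the instance - these names are examples only.
image = conn.compute.find_image("ubuntu-20.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

# Allocate the compute, storage and network resources for a new server.
server = conn.compute.create_server(
    name="demo-instance",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.status)  # typically "ACTIVE" once the instance has booted
```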
Like Red Hat, the cloud virtualisation giant VMware recognised a need to deploy VMware-based instances to an OpenStack environment, and through VMware Integrated OpenStack (VIO) it aims to simplify that process. Bank of Ireland is starting to ramp up its usage of VIO, with the long-term goal of making it its primary Cloud platform. Perhaps the largest and most high-profile case study is Oracle, which took the plunge and provided a full OpenStack cloud environment for all of its development and test teams to use. Listening to its developers, who wanted to request and deploy environments, call APIs and build container platforms, Oracle chose OpenStack as a way of providing environments almost instantly whilst still maintaining control and enforcing usage policies, permissions and security. Although it was initially tough to gain buy-in and trust, and with an end game of shutting down 'shadow IT' side projects, Oracle managed to build a strongly resilient environment and grew adoption throughout internal teams through a mix of positive PR spread by word of mouth across business units. This has led to rave reviews internally and decreased the time to market for iterations of most of their new features, as well as decreasing public cloud spend for development.

Why is OpenStack successful?

Its biggest asset is the way it embraces the rapid development of new features, much like AWS, but by promoting a 'DIY approach' to Cloud. These new features are being driven by a fast-paced community solving their own challenges and making that knowledge, experience and lessons learned available to the wider community – the ethos of open source.

So why isn't everyone committing to OpenStack?

The issue with emerging technology is that skills are scarce. In the UK & Ireland there are only a few reference sites, and those who are skilled and experienced with OpenStack are only just starting to emerge. It's largely being driven by a handful of solution providers and systems integrators who are working in conjunction with talented development and infrastructure teams to help them take their first steps. Contractors (with correspondingly high rates) are starting to spread the seeds of some of their OpenStack experience into new organisations, but it's lacking the wealth of positive PR that Amazon and Microsoft are enjoying with AWS and Azure.

How does OpenStack impact the employment market?

Recruitment is difficult. Professionals for permanent positions are rare, which is driving a 'grow your own' type approach, with training and research time given to existing Cloud specialists or a mix of developers and systems administrators on the job. Although training to enhance OpenStack skills is starting to catch up through resources such as openstack.org and a strong community willing to collaborate and share their experience, a large emphasis is still being placed on Cloud leaders and innovators supporting and encouraging their teams to grow organically.

### Can you keep a secret? #DisruptiveLIVE presents Point of Privacy #Privacy

Can you keep a secret? Can your browser? Point of Privacy is a new regular show on the privacy surrounding technology and how it impacts on our private and professional lives. In this episode host Bill Mew interviews Privacy Campaigner Max Schrems. #Privacy #DisruptiveLIVE

### ManageEngine - Raj Sabhlok - Zoho Corp

Compare the Cloud talks to Raj Sabhlok from Zoho Corp about ManageEngine.
At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, events and workshops to enable you to stand out from the cloud crowd.

### How The Cloud Merged Into The Mainstream

While the case for the cloud has always been compelling, perceptions over complexity have remained a persistent deterrent for businesses preferring the security blanket of their on-premise solution. We're well versed in the dialogue that has long surrounded the subject: how the detractors routinely countered the promise of greater efficiencies and cost savings with concerns over privacy and security, fearful of the lack of physical control. But perhaps most prevalent of all was how the technical know-how needed for successful deployment and integration continued to preclude many from reaping the benefits, leaving business process management on premise and the heavy lifting firmly with IT. As a result, this innovative and shapeshifting technology all too often remained divorced and siloed from wider business operations and teams, unable to reach its full transformative potential. Fortunately, 2017 was a year of more meaningful change in this area, as the hype and predictions finally came to fruition, and the cloud became a more mainstream reality. A number of factors conspired, notably innovation that made its integration and deployment simpler and accessible to more people within an organisation. Additionally, the starring role the technology has played in the rising trend for mergers and acquisitions has driven even wider use. Indeed, the number of these transactions snowballed this year, with $52.4 billion (£31.60 billion) in deals conducted across the globe between April and June, representing a 57 per cent year-over-year increase, according to professional services firm EY (formerly Ernst and Young). Amid intense competition, businesses have looked to consolidate their core strengths and focus on their most profitable activities, offloading some of the less well-performing facets of the operation onto other businesses through global deals. Cloud cemented its status as a 'go to' platform for underpinning these transactions. It provided the kind of speed, flexibility and easily scalable infrastructure needed when contending with the challenge of consolidating disparate systems, users and infrastructure, particularly during the post-deal transition period. In doing so, the technology helped to shift the perception of IT as a hindrance in the process of mergers and acquisitions, to one that can drive and facilitate the seamless convergence of cultures and operating systems. Amid soaring application needs, solution developers have responded by offering low-code development platforms, better suited to a business operating environment that demands pace and agility to solve problems and respond to opportunities with minimum disruption. Here, the game changer has been the new wave of solutions which harness visual, intuitive instructions for app building, as opposed to pure programming.
This empowers a far wider range of business users to get involved, with a click from a mobile or laptop. The move translates to greater efficiencies and productivity benefits, as applications are built more quickly and changes are made more easily, creating a faster and cheaper operating environment, while organisational culture is improved through better alignment between business and IT.

### Future Gazing with the Channel - Comms Business LIVE - #DisruptiveLIVE

David Dungay, Editor of Comms Business, will be talking to his expert panel about the BIG trends for 2018. Joining him on his panel will be Vincent Disneur, Sales and Marketing Director at Union Street, Justin Fielder, CTO at Zen, Pete Tomlinson, Commercial Director at TalkTalk Business and Jason King, Sales Director at Virtual1. Technology has come a long way in the last five years from a software and networking perspective. The panel will be discussing those changes and how the Channel can now leverage them to build a robust business in 2018 and beyond. Tune in to hear what the panel of Channel experts has to say; you can even ask them a question… just tweet your questions (anytime between now and Monday) using the hashtag #CBLive18.

### The Link Between Blockchain Technology and Social Media

If you have been paying attention to the news recently, you may have heard about blockchain technology. However, you may not be familiar with what this term actually means or how it may affect you. Blockchain technology refers to the decentralized ledger of cryptocurrency transactions, as opposed to the centralized bookkeeping strategy that traditional banks use. Because Bitcoin and other types of cryptocurrencies use blockchain technology, each new transaction is recorded in a block. While each block is separate, the rest of the network needs to approve it before it joins the chain. This process makes the use of cryptocurrencies secure and transparent. At first glance, you may not think that there would be a significant link between blockchain technology and social media, but this is not the case. Blockchain technology enables individuals to have more privacy when using Internet-based technology. In addition to this, when social media enthusiasts create viral content, they could potentially receive compensation. Here's a closer look that will reveal specifically how blockchain technology may impact social media.

Privacy of Information

Social media privacy is a growing concern, and rightfully so. For example, Facebook now has access to your conversations and other data through the WhatsApp app. Snapchat, which used to pride itself on anonymity, now keeps track of all user data and images that have been exchanged between users. These are only a few of the many examples of your privacy being affected, and there are others that you may not even be aware of. Blockchain technology enables users to make transactions in a private way. Only the sender and recipient are aware of the contents of the transaction. This is the ultimate level of privacy online. However, social media networks operate in the opposite fashion. They are usually controlled and monitored by major corporations, and by simply opening an account, you may be giving away your privacy rights. Each action that you make on social media may be recorded and even used in different ways.
More than that, your account could potentially be deleted or censored without your consent. Blockchain technology in the social media realm could eliminate these privacy issues.

A New Avenue of Payment

Many social media platforms are exploring their own payment platforms. For example, Facebook is unveiling a payment feature through its Messenger platform. As you might imagine, Facebook then records all transactions that are made. This is a major invasion of privacy that may make some users shy away from using the technology. Blockchain technology could be used to protect user privacy in these types of financial exchanges. It could even be used to protect privacy for private messaging and more. This could make users feel more comfortable using some social media platforms. When businesses use these safer transaction exchanges rather than their current formats, they may be able to automate many processes through the use of smart contracts. They could also monitor and analyze data in a more secure manner. As a result, this could save them a lot of time and energy while also generating better overall results. The good news is that this type of technology has the potential to be used in a wide range of industries and in various beneficial ways.

Earning Your Share

Nowadays, social media content producers are losing a significant amount of money because of social media's role as a middleman. However, the use of blockchain technology could disrupt this process by taking the middleman out of the equation and enabling content producers to get paid for creating and sharing their content. In fact, this could potentially create a new economy. Users have full control over how and where their content is distributed, and their distribution efforts give them an improved ability to earn passive income from it. Essentially, the producer of content, rather than the social media platform, gets compensation for the content. Some people may not currently be producing content because they are not properly compensated for it. Content creators understandably want to get paid for their creations. In the current scenario, social media platforms get paid for the content that you produce, and they also sell ad space so that they make an additional profit from those who use social media platforms. When you look at it this way, it is clear that the current practice needs to be disrupted significantly.

Final Thoughts

Many social media users are not aware of how their privacy and even their income potential may be jeopardized by current practices. However, this entire system could be dramatically improved for users when blockchain technology is incorporated into social media platforms.

### Multi-tasking Brits are embracing mobile working… but it's not as easy as it could be

Dubber, the voice services and analytics company, today announced the findings of research designed to better understand how prevalent mobile working is in the UK, and how well supported workers are to maximise productivity when away from their desk. The survey of more than 2,000 service sector employers and employees found that a third of these workers spend more than four hours away from their desk during their working day. Eighteen per cent are away from it for eight hours or more, indicating that, for many, their role is completely mobile.
A tenth of all workers (11%) do the majority of their business calls on the move, while a quarter (26%) make more than five business calls a week while mobile. On average, 13% of UK service sector workers spend more than two hours a day commuting to work, while 46% spend more than an hour doing so. For most, this is 'dead time', with 46% indicating that this time is completely unproductive, and a further 31% stating they are productive for less than half the time they spend commuting. 41% believe they need to be able to multi-task at all times – with a further 37% saying they need to be able to multi-task most of the time. Despite this need, many are being hampered by slow IT equipment (62%), having to take notes on the go (26%), or spending time logging decisions made on feedback gathered in meetings (25%). James Slaney, General Manager and Co-Founder of Dubber commented, "Online interactions clearly dominate today's communication landscape, but for an increasingly mobile workforce, regular communications with colleagues, suppliers and customers are dependent on more traditional methods such as phone calls. The proliferation of email, messaging and other text-based communication channels is in direct correlation to the kind of records and audit trails that they facilitate; enabling complete transparency and limiting miscommunication." When asked what tools would make their job easier on the move, better mobile network coverage (48%) was the top of the UK workforce's wish list, along with better phone battery life (38%), indicating that advances in mobile networks and devices are still not where they need to be to facilitate optimum mobile working. Respondents also identified better scheduling tools (37%), and call recording/transcription services (29%) as tools that could make their lives easier. Sixty-three per cent of service sector professionals still believe they do their best work in the office, while 31% believe they are most productive at home. Meanwhile, a third (34%) agreed that important details had been missed on phone calls, leading to misunderstanding or miscommunication with colleagues, suppliers and customers. Slaney added, "The ability to work anywhere is obviously advantageous and a necessary component of modern work life, but it remains challenging when information is poorly recorded and crucial details missed when workers have phone conversations. Phone calls are often reserved for the important stuff; for the decision that needs to be made, for the question that we want answering urgently. We pick up the phone when we just need to get things done." The survey asked respondents an open-ended question to discover what they could envisage using call recording services for, the results of which are included in the following word cloud. Sixty-six per cent of respondents felt they would gain some benefit from call recording services, with some of the key themes mentioned including taking better notes from calls to refer back to, making better decisions, providing better services, and better time management. Slaney concluded, "When phone calls can be recorded and the information gathered on them digested in the same way we process emails, we will see a resurgence in the number of calls made in the workplace. Once only provided as an always-on service deployed for regulatory compliance or contact centres, now businesses can choose from a range of call recording options and capture their calls at the touch of a button.
"Advances in cloud computing and digital transformation mean that phone calls will better retain their value to businesses, and advances in speech intelligence technology will allow businesses to unlock their priceless data. For the multi-tasking, mobile working Brit, the ability to make better use of the information gathered through phone calls gives peace of mind so that miscommunication and misunderstandings will be a thing of the past."

### Digital Healthcare Marketing - The Giant Health Event 17 - Simon Walker - M3

Disruptive Live interviews Simon Walker from M3 at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about M3: https://www.m3.com/

### Holographic 3D Realisations - The Giant Health Event 17 - Dr Javid Khan - Holoxica

Disruptive Live interviews Dr Javid Khan from Holoxica at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: http://www.gianthealthevent.com/ To find out more about Holoxica: http://www.holoxica.com/

### Updating Your Business Systems? Beware of Cloud Imitations

Today's new enterprises can accelerate quickly and remain agile by adopting subscription-based business apps delivered through the Cloud. Businesses no longer need to own their systems; they simply use them. This new approach threatens to leave behind companies tied to more established, on-premise software. Additionally, the size of these software applications, combined with the extent of their customer bases and distribution networks, can make it difficult for providers to move to a new delivery model. Rather than investing in true Cloud development, many established software providers have chosen to host their old solutions online, even offering subscription-based pricing, so it becomes difficult to distinguish true, "Born in the Cloud" solutions from imitations. Does it actually make a difference? Pre-Cloud software providers have extensive customer bases which have paid up-front licence fees and installed expensive IT infrastructures to run their systems. Unsurprisingly, those customers feel they have made significant investments and want stability. These on-premise solutions are often well-embedded in the business and are therefore difficult to replace. In response to the market trend towards the Cloud, software providers can choose one of two product development paths: the more pioneering will develop brand new Cloud-based applications; others will simply move their aging technology onto a hosted server and offer it as their "Cloud version", desperate to extend its life while giving long-serving users a sense of progress. These pseudo-Cloud imitations may look like Cloud products; some even have browser-based input screens and may even offer payment terms that are akin to monthly subscriptions.
Although these solutions free the business from having to maintain on-premise hardware, the users are simply leasing the asset, and these solutions will never be able to deliver the additional benefits of a true Cloud application.

What will a True Cloud product deliver for your business?

Firstly, real Cloud apps are truly anytime-anywhere services, with the provider looking after everything to do with delivering the service 24/7, including hosting, backups, disaster recovery, ongoing upgrades, scaling the platform as you grow, ensuring full failover resilience etc. However, the real benefit of Cloud is in what is still to come. It is the futureproof platform of choice for businesses that are growing and want to adapt to change frequently and in an agile way. The ability to scale and adapt with the business, allowing new locations or business lines to be added easily, demands a system that can equally adapt and change and won't require you to wait for hardware or software to be installed – it will simply need to be switched on. Cloud is now so prevalent that it's a no-brainer to have a Cloud strategy of some kind in place, but how can you be sure you're making the right buying choices when it comes to business-critical software? Apart from agility and future-proofing, let's look at some of the main functional differences the Cloud facilitates.

Integration – the key to efficiency in business processes

Pre-Cloud, if businesses wanted to achieve greater efficiencies by merging data and processes, they looked for ERP solutions that purported to deliver all the functionality they needed in one, integrated platform. These ERP solutions were very costly to buy and even more costly to implement. Where the ERP solution was too costly or didn't meet the requirements, the next option would typically involve a specialist software reseller writing a piece of customised middleware to run on a server, allowing two systems to exchange data, often only in one direction. This was often cumbersome, relying on files being extracted from one system, passed across in batch form with a program running to update the changes or transactions in the other system. When changes happened to either system, the middleware would require modification. Either of these was costly upfront, with ongoing support costs. True Cloud products come with what is known as an Open API containing a library of "webservices" or "methods" that can carry out various updates from any other system via an authenticated Cloud transaction. These are effectively pre-packaged integrations that can be reused by any other Cloud app with the relevant authority. Integrating with a new application is simple: using a secure partner key, any third party system can access another Cloud system to perform an automatic, two-way data exchange. Once an integration has been completed, it can be published in the Cloud to be accessible to other systems and is automatically updated as the Cloud-based system evolves.

ERP Vs. Best Of Breed

In the context of the Cloud, this old debate takes on a new perspective. ERP solutions are still expensive to buy and implement and there is usually a compromise on the functionality in some modules versus the best available.
Enterprise Resource Planning (ERP) means different things to different businesses: in retail it's EPOS combined with finance; in hospitality it's reservations combined with finance; generically, it can be finance combined with HR and sales databases. Businesses are developing unique business models that are more difficult to address with generic ERP solutions. Cloud delivery and easier integration mean businesses no longer need to spend large sums or compromise on functionality to get fully-integrated solutions. Each business function can pick the system that works best for it and, if data needs to pass from one to the other, they can be integrated to avoid re-keying and errors.

Continuous improvement – automatic upgrades

Another big benefit the Cloud offers is automated software updates; the latest version is always available for every subscriber. The concept of true Cloud accounting is that every user accesses the same system, with upgrades made at source and automatically available to all users. Upgrades to on-premise or pseudo-Cloud applications can be tricky and expensive. Many providers stop supporting older versions and force an upgrade path that can involve significant implementation work to apply the upgrades, involving data conversions and retraining. They can be incompatible if extensive customisation or server-based integration has been done, and may require hardware and operating system upgrades.

Which route will you choose?

The business landscape is moving fast and demands that businesses remain flexible and agile. Change is difficult, so remaining with your existing system may seem the easy option. However, Cloud opens up a whole new world of possibilities that can be adopted without considerable upfront investment and can scale and flex as your business grows and develops. If you want a more viable, long-term solution, look for apps which are truly in the Cloud.

### Why technology will create the perfect retail storm in 2018

According to the latest figures from the Office for National Statistics, Black Friday helped boost sales in November 2017 to 1.6 per cent higher than in 2016. However, over Christmas retailers were on tenterhooks more than ever over whether consumer spending habits would remain consistent with the preceding 12 months. The last couple of years have been a testing time for the retail sector. Hyper-sensitised consumers, blurred lines between the online and physical worlds, intensification of competition and rapidly changing retail dynamics have collectively contributed to the creation of the perfect retail storm. What is evident is that the proliferation of technology to grow and differentiate retailers, as well as drive speed and flexibility, has proven to be a key competitive edge. Retailers must now lean on it to weather the storm. This year we are therefore likely to witness the use of some very innovative and differentiating technologies in the following areas.

Customers at the core

Customers will take centre stage like never before. Primarily, the shift from segmentation to understanding individual buying behaviour has never been more real. While this has been pervasive in the online world, the physical world of retail still operates in the realm of consumers in segments or aggregates.
With the exciting developments in conversational commerce, chatbots and the growth of voice and image recognition technology, vast sums will be spent on better understanding consumer journeys and life cycles to integrate those insights uniformly across all channels. This integration across all devices, platforms, channels and services will become the norm. Amazon is leading the way in this area, with retailers like Home Depot, Best Buy and Nordstrom not far behind.

Reimagining physical stores

Retailers are increasingly disrupting the store to enhance it, rather than to replace it. Regarding stores, therefore, we can expect a significant increase in the adoption of emerging technologies that close the information advantage online retailers currently bring to the marketplace. The Internet of Things (IoT) and artificial intelligence (AI) / machine learning will be merged to achieve new levels of intelligence. As we see the emergence of the next generation of customer engagement, we will also see the deployment of an array of technologies to better understand key customer triggers, the path to purchase, sensitivity to changes in price and display, and service levels in the store itself. We will also see the emergence of multiple store formats that will lean heavily on the use of digital and innovative technologies to help differentiate them.

Artificial Intelligence (AI) will come of age

If 2017 was the year that AI and job loss hit the headlines, 2018 will see the technology taken to the next level. With an increase in the mainstream business application of AI and a greater adoption of cloud, we can expect to see the scale and magnitude of AI implementations grow significantly. AI will continue to be used increasingly in areas like customer experience across all retail formats, and it will also be increasingly relied on to pursue new business avenues, especially in supply chain, store operations and merchandise execution. From a pure technology perspective, AI blends three distinct components together. The algorithms are the brain, the big data platforms the body that hosts everything, and the cloud the legs that allow AI to move significantly faster and cheaper. Though each of these components will evolve rapidly, we will also see innovations in collaboration forums that will unite these elements and create innovative and powerful AI deployments at scale.

Value chains will be transformed

Understanding consumers has become intrinsic to the existence of the retail sector, so delivering on promises at the very least is of fundamental importance. In order to deliver on the promise of a new digitally charged world, retailers will start working on transforming their entire value chain, or at least significant parts of it. What this means is that we can expect to see supply chain planning shift towards the concept of Network-Based Planning, with all nodes of the supply chain on one physical network. This will allow retailers to gain end-to-end visibility, react in near real time, increase velocity and availability, and reduce costs across their supply chain. Fundamental shifts in Amazon's business model (in its Grocery section, for example) will cause retailers to restructure physical networks and create smaller flow-through distribution centres to emulate Amazon's same-day delivery promise. Both of these examples will result in the restructuring of processes, investment in new technologies and physical assets, and greater adoption of the cloud.
In essence, we are witnessing the evolution of what could be termed retail 3.0, which is not only powered by technology but will also experience rapid evolution and transformation with breakthrough technological innovations.

### ad:tech London and TFM partnership to create UK's largest marketing and advertising event

UK industry events ad:tech London and Technology for Marketing (TFM) have joined forces to form the country's largest marketplace for advertising and marketing technology at Olympia, London on 26-27 September. The deal will see ad:tech London co-locate with TFM and eCommerce Expo to create a destination event that mirrors the convergence of advertising technology, marketing technology and ecommerce. More than 11,000 brands, agencies, media owners and tech providers are expected to participate in a programme of one-to-one meetings, multi-track conferences and social activities over the two days. CloserStill Media and retail industry body IMRG will be responsible for delivering the integrated experience of ad:tech London, TFM and eCommerce Expo. The ad:tech brand and its suite of global events will remain part of Comexposium. 1,000 senior brands and agencies attended ad:tech London last year. Highlights included keynotes from former deputy prime minister Nick Clegg and L'Oréal CMO Stéphane Bérubé, an agency CEOs leadership debate and The Next Big Thing, a start-up pitch competition in partnership with Nestlé and McCann. TFM and eCommerce Expo were attended by over 9,000 industry practitioners and 200 exhibitors. Justin Opie, Managing Director, IMRG and eCommerce Director, CloserStill: "I was part of the original team who launched ad:tech London 13 years ago and the event continues to be the beating heart of tech innovation and debate. Bringing ad:tech London, TFM and eCommerce Expo under the same roof is a genuinely exciting opportunity to extend our value proposition, break silos and reflect growing convergence in marketing, advertising and eCommerce technology. "This portfolio creates scale that's unrivalled in the UK and a forum for key decision makers from across the industry to come together, tackle key challenges and build relationships." Jan Barthelemy, Global Brand Director, ad:tech: "CloserStill has pedigree for delivering high quality technology events and this partnership will enable us to create an even more valuable, immersive experience for brands, retailers, agencies, publishers and tech providers. We look forward to building the definitive industry event here in the UK in partnership with CloserStill." Mayfield Merger Strategies acted as adviser to CloserStill Media.

### How data-over-sound is securing the future of the IoT

We live in a world where everything is connected and, in turn, depends on being connected. Our businesses, our homes and our cities are intricately linked by the growth of IoT. And, increasingly, our children's educational and play lives are also becoming more digitally-immersive. But at what security risk? Stories about the threat and the potential fallout of cyber attacks on IoT and connected devices are increasingly hitting the headlines in the mainstream press. This is particularly the case when it comes to connected toys and devices marketed at children, with one recent prominent scare story being that of a watchdog discovering that smartwatches designed for children contain security flaws, allowing hackers to monitor the wearer's whereabouts and even make them appear in a completely different location.
Elsewhere, there have been security concerns flagged by consumer groups such as Which? in the UK and Stiftung Warentest in Germany, following researchers finding a number of security flaws in 'intelligent' toys such as the popular CloudPets and Hasbro's Furby Connect. And these fears over security flaws with kids' connected toys and devices feed into a wider concern about the cyber-safety of IoT devices more generally. Connected toys and growing fears over IoT security These latest findings only confirm growing fears over the security and privacy of IoT, particularly in our homes, with the threat that microphones and cameras on our personal devices, even our children's toys, can be accessed by hackers. Which is why data-over-sound technologies such as those we are developing at Chirp are providing a fundamentally important solution to these security fears. For example, we have worked closely with toymaker Hijinx to create an interactive toy based on the Netflix show Beat Bugs, using our proprietary data-over-sound technology to enable the toys to dance and sing along with the programme when they hear the theme song playing on the television. This is a way of creating a connected toy that offers a fun, connected experience that is completely offline from the internet. And it is the beginning of a revolution in security technologies for connected toys and for connected devices more generally, as removing the need for the device to be connected to the internet results in connected toys that are completely safe, secure and hacker-proof. Data-over-sound protecting IoT devices The rapid growth of the IoT means that this year there are already more connected devices than human beings on the planet, with Gartner estimating 26 billion IoT units are to be installed by 2020 and Cisco and Intel predicting 50 billion in the same timeframe. Security is going to become increasingly important in a world in which everything and everyone is always connected, with data-over-sound offering the most viable alternative connectivity solution on the market, connecting man and machine via sound rather than relying on traditional (and hackable) network infrastructure. And this is exactly why Chirp's ground-breaking work with toymakers such as Hijinx is really only scratching the surface when it comes to the possibilities of using data-over-sound to safely connect new devices to our home networks. As we enter the IoT age, data-over-sound technology will be able to provide safe and secure connections between smart devices and those devices without traditional networking capabilities, such as toys or home audio systems. In the case of Hijinx, for example, we simply embedded Chirp's data-over-sound technology, which runs on a low-cost, low-power Arm Cortex-M4 chip, in the toys. Building on this example, in the future we may find a number of different use cases where it is neither possible nor desirable to use existing networking technologies, due to security concerns, excessive costs, or because of complex and off-putting set-up processes for the consumer to connect the device to their network.
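To make the concept concrete, here is a minimal, hypothetical sketch of the general idea behind data-over-sound: each byte of a small payload is rendered as a short audio tone at its own frequency, and the receiver recovers the bytes by measuring the dominant frequency of each tone. This is an illustration only, written with NumPy; the sample rate, tone length, frequency plan and the pairing payload are made-up parameters, not Chirp's proprietary protocol.

```python
import numpy as np

SAMPLE_RATE = 44_100          # samples per second (assumed)
TONE_SECONDS = 0.08           # duration of each tone (assumed)
BASE_FREQ = 1_000.0           # frequency representing the byte value 0, in Hz (assumed)
STEP_FREQ = 50.0              # spacing between adjacent symbol frequencies, in Hz (assumed)

def byte_to_tone(value: int) -> np.ndarray:
    """Render a single byte (0-255) as a short sine-wave tone."""
    freq = BASE_FREQ + value * STEP_FREQ
    t = np.linspace(0, TONE_SECONDS, int(SAMPLE_RATE * TONE_SECONDS), endpoint=False)
    return np.sin(2 * np.pi * freq * t)

def encode(payload: bytes) -> np.ndarray:
    """Encode a payload as a sequence of tones, one tone per byte."""
    return np.concatenate([byte_to_tone(b) for b in payload])

def decode(signal: np.ndarray) -> bytes:
    """Recover the payload by finding the dominant frequency of each tone."""
    samples_per_tone = int(SAMPLE_RATE * TONE_SECONDS)
    out = bytearray()
    for i in range(0, len(signal), samples_per_tone):
        chunk = signal[i:i + samples_per_tone]
        spectrum = np.abs(np.fft.rfft(chunk))
        freq = np.fft.rfftfreq(len(chunk), d=1 / SAMPLE_RATE)[np.argmax(spectrum)]
        out.append(int(round((freq - BASE_FREQ) / STEP_FREQ)))
    return bytes(out)

if __name__ == "__main__":
    message = b"pair:1234"                 # e.g. a short, hypothetical pairing token
    audio = encode(message)
    assert decode(audio) == message        # round-trips in this idealised, noise-free case
```

A real implementation would add a preamble, error correction and tolerance to background noise, but the round trip above shows why a speaker and a microphone are enough to move small amounts of data without ever touching the network.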
In these scenarios, connecting such devices using the unique affordances that data-over-sound offers is going to bring scalable solutions to the table that previously weren't available, offering consumers and businesses alike a fast, safe, secure and easy-to-use way of onboarding new devices to their home or work networks. So data-over-sound is not only going to play a fundamental role in ensuring safe and secure IoT connectivity; the possibilities for it as a natural technological evolution, one that will increasingly help us to connect new devices whilst minimising the threats posed by exposure to the wider internet, are almost endless. ### The Snapchat Surgeon - The Giant Health Event 17 - Prof. Shafi Ahmed - Medical Realities Disruptive Live interviews Prof. Shafi Ahmed from Medical Realities at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Medical Realities: https://www.medicalrealities.com/ ### Hardware Based Encryption - The Giant Health Event 17 - Pasi Siukonen - Kingston Technology Disruptive Live interviews Pasi Siukonen from Kingston Technology at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Kingston Technology: https://www.kingston.com/ ### What's holding organisations back from DevOps success? The global app economy is set to be worth $6.3 trillion by 2021, with the user base rising to nearly every person on the planet. As a result, organisations are having to rethink the way they build products and services to meet today's always-on culture. The DevOps methodology has emerged over the past few years to address this phenomenon, accelerating the way companies build, test and deploy applications. Now, as we head into 2018, it's the buzzword on every CEO's lips and the priority of most developer and operations teams, as the pressure to release applications faster has never been greater. The reality is that a DevOps strategy is less about the tools and much more about the company's culture. In practice, DevOps establishes a culture and environment to build, test and release software in a rapid, frequent and reliable fashion by embracing agile methodologies across the IT teams. It doesn't matter whether an organisation is operating entirely in the cloud, taking a hybrid approach or strictly on-premise; the core elements of DevOps don't change, only the speed of delivery.
Cloud technology accelerates the speed of DevOps, as steps like automation are easier through cloud-based APIs, database access is faster and scalability is achievable on demand. However, organisations run into problems when they try to do too much at once, ignore core elements like the database, and don't have the expertise in place to drive the transformation forward. Safeguarding success in the digital era There are a number of factors that can stifle even the best DevOps intentions. The first is getting everyone across the tech team – developers, operations, security teams, database administrators – to buy into the DevOps approach, especially when organisational silos have been heavily ingrained for years. For example, one well-known high street bank tried to transform a 120-year-old business using the DevOps methodology, applying it to 1,000 applications across the personal and corporate banking divisions and sales. Unfortunately, the attempt failed thanks to different internal views and agendas, a highly regulated environment, a strict central IT system and limited knowledge of DevOps strategies and supporting technology. Now they're taking that learning and applying it to projects, rather than the entire business. The second factor that often holds DevOps back from success is overlooking the application database. If application updates require changes to the database, the DevOps process often breaks down, because databases are historically developed and managed differently due to their complexity, development process and sensitive nature. Additionally, database development frequently lacks code testing and reviews, source code controls, and the ability to integrate with existing build automation processes, which are critical to preventing errors impacting production systems. Cloud-based tools can certainly help break down the common barriers associated with deploying database changes and automate those adjustments within existing DevOps continuous integration and delivery processes. But it also requires that database administrators have a seat at the DevOps table, to avoid unnecessary bottlenecks and ensure the best possible solutions are being put forward. As business leaders build their DevOps strategies for 2018, there are five factors they should consider to ensure the business remains agile and continues to thrive in the digital era. Skill up: It's critical to make sure that anyone involved in implementing DevOps, especially in the cloud, is properly trained in both technologies and able to pass their knowledge on to others in the organisation. DevOps represents a cultural change, and people are key to it, so it's essential that there is a shared understanding of what is involved and that expectations of what can be delivered are realistic. Avoid vendor lock-in: Don't lock the business into one cloud or database vendor. To ensure success with DevOps, organisations need to be as open and flexible as possible in their choice of tools, as changes to the platforms may be needed in the future. Consider the use of containers for application deployment to provide additional flexibility. Integrate database development: Weave database development into the fabric of any DevOps plans, otherwise it will become a significant bottleneck that could delay application delivery.
Prioritise performance testing: This happens infrequently in on-premise deployments. With subscription-based cloud services, performance testing becomes more important, so you don't find yourself unnecessarily using more cloud resources than you need. Don't over-commit: Try DevOps practices and supporting technology on a relatively low-risk or pilot project first, learn from it, and then roll out additional projects with the enhanced knowledge of what worked, or didn't, previously. The enthusiasm for DevOps will only continue in 2018. The demands of today's digital era mean organisations cannot afford to have delayed releases, security breaches, or application downtime, as these affect everything from customer experience to company profits. Adopting cloud technology as part of a DevOps strategy puts companies in a better position to ensure those issues do not arise, and to respond quickly in real time if they do. Now more than ever, business leaders need to understand the importance of agile work environments, prioritise core elements like the database and take a realistic project-based approach to application development. Only then will the future of the business and its teams thrive in this brave new world. ### IBM Cloud Now Connected To BT's Cloud Of Clouds - News BT today announced the launch of a new service to provide global businesses with direct access to IBM Cloud via the BT network. BT Cloud Connect Direct for IBM offers businesses the benefits of highly predictable, security-rich and reliable network performance when building and deploying critical business applications and data on the IBM Cloud. With Cloud Connect Direct for IBM, BT customers receive a dedicated, high-performance connection to the IBM Cloud so they can access services including compute, network and storage infrastructure as well as an extensive catalogue spanning AI, Blockchain, Internet of Things (IoT), data and analytics capabilities. Direct cloud connectivity helps businesses achieve better performance, security and availability compared to connecting over the open internet. Cloud Connect Direct for IBM is being delivered into IBM Cloud data centres in the UK via IBM Cloud Direct Link, a network service designed to secure and accelerate data transfer between private infrastructure and the public cloud. This will be followed in the coming months with direct connectivity into IBM Cloud data centres in mainland Europe, the US, Australia and Asia, optimising performance and compliance for customers in those regions. Cloud Connect Direct for IBM will be managed and supported from a single BT-managed service desk, providing proactive management and quality of service. Keith Langridge, vice president of network services at BT, said: "Cloud Connect Direct for IBM is designed to help businesses fully harness the potential of high-value digital services delivered via IBM Cloud. Businesses deploying IBM Cloud services can now benefit from improved performance, security and reliability, creating a rich digital experience for their customers and employees." Kit Linton, vice president of network, IBM Cloud, said: "Enterprises are rapidly turning to the cloud to modernize their core applications and build cloud-native solutions that leverage AI, IoT, Blockchain and more.
The collaboration with BT will help enterprises around the world seamlessly connect to the IBM Cloud so they can maximize their data and generate new business value.” Cloud Connect Direct for IBM leverages BT’s global network reach, spanning 198 countries and territories. The service is designed, implemented and supported as part of the customer’s existing network infrastructure. This means less redesign, less change and more consistency with existing network practice when connecting to new cloud services. Cloud Connect Direct for IBM is available to customers globally today. Cloud Connect Direct for IBM is the latest development in BT’s Cloud of Clouds portfolio strategy, which has been helping customers since 2015 on their digital transformation journeys by connecting them easily and securely to a world-class ecosystem of cloud services, applications and data. For further information Enquiries about this news release should be made to the BT Group Newsroom on its 24-hour number: 020 7356 5369. From outside the UK dial + 44 20 7356 5369. All news releases can be accessed on our web site. About BT BT’s purpose is to use the power of communications to make a better world. It is one of the world’s leading providers of communications services and solutions, serving customers in 180 countries. Its principal activities include the provision of networked IT services globally; local, national and international telecommunications services to its customers for use at home, at work and on the move; broadband, TV and internet products and services; and converged fixed-mobile products and services. BT consists of six customer-facing lines of business: Consumer, EE, Business and Public Sector, Global Services, Wholesale and Ventures, and Openreach. For the year ended 31 March 2017, BT Group’s reported revenue was £24,062m with reported profit before taxation of £2,354m. British Telecommunications plc (BT) is a wholly-owned subsidiary of BT Group plc and encompasses virtually all businesses and assets of the BT Group. BT Group plc is listed on stock exchanges in London and New York. For more information, visit www.btplc.com About IBM IBM is the global leader in enterprise cloud with a platform designed to meet the evolving needs of business and society. Moving past productivity and cost improvements, the IBM Cloud is tuned for the AI and data demands that are driving true differentiation in today's enterprise. IBM's private, public and hybrid offerings provide the global scale businesses need to support innovation across industries, while its nearly 60 Cloud Data Centres across 19 countries help clients meet their expanding data locality requirements.To learn more about IBM Cloud, visit: https://www.ibm.com/cloud/   ### What is the geko device? - The Giant Health Event 17 - Bernard Ross - Sky Medical Disruptive Live interviews Bernard Ross from Sky Medical at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Sky Medical: http://www.skymedtech.com/ ### The cost of DDoS attacks and guarding against them Reflecting on 2017, it seems more than fitting to crown it ‘the year of the cyber attacks’. 
Sure, Mirai kicked things off in late 2016, but last year there was WannaCry, the Equifax data breach, and, lest we forget, the recent ransomware campaign Bad Rabbit. These examples are the tip of the iceberg, and across both the public and private sectors nobody seemed immune to cybercrime. This was highlighted when the hacking group known as the Shadow Brokers leaked the National Security Agency's own hacking tools. In fact, as 2017 came to a close, a new Mirai-style botnet, Satori, quite literally 'awakened', infecting more than 280,000 IP addresses in just 12 hours. The scale of the problem is eye-watering. According to Cybersecurity Ventures, it is estimated that we will spend $1 trillion globally on cybersecurity between 2017 and 2021. Cybercrime, meanwhile, will cost the world's economy $6 trillion annually by 2021. These figures raise the question of how effectively cybersecurity measures are being deployed, given such a significant impact on the economy. But at the same time, you wonder just how much worse it could be. One of the main areas of concern is the proliferation of DDoS attacks. These were recently identified in a report from Accenture as being responsible for a significant proportion of Britain's costliest and most damaging cyberattacks on businesses. As such, it is worth examining how DDoS attacks are impacting businesses and what can be done to guard against them. The facts Neustar's recent Global DDoS Attack and Cyber Security Report uncovered that more than four in five organisations have been hit by a DDoS attack at least once in the last eight months. Alarmingly, 36% of these organisations confessed to being in the dark about the attacks, only finding out from their customers when they'd been hit. It comes as no surprise, then, that DDoS attacks have detrimental consequences for a brand, sparking a whole host of trust issues for customers. These figures are only going in one direction, as cybercriminals become more cunning, creative and resourceful in their approach. Whereas previously hackers would launch a large-scale DDoS attack to completely disrupt a website, multi-vector attacks are now the preferred option, with half of the average attack size peaking at 10Gbps. Armed with new tools and aided by the constant sharing and selling of attack codes, hackers will launch more targeted, repetitive hits at a frequent pace. In a recent survey by the Neustar International Security Council, nearly half of respondents (45%) admitted that targeted attacks are a growing threat to their business, with almost three quarters (73%) professing that recent cyber attacks have changed the way they approach protecting their organisation. Theft, malware, viruses: the consequences Motives for DDoS attacks vary. Yet, more often than not, they will be twofold: using a DDoS hit to plant malware, viruses or ransomware. In fact, the Global Report found that organisations attacked just once had a 35% chance of seeing malware activated and a 52% chance of experiencing a virus. Worryingly, 92% of organisations experiencing multi-vector attacks also reported theft of intellectual property, customer data and financial assets and resources. It seems an understatement, then, to simply state that this leads to catastrophic results for a business.
For example, if we look closely at the UK internet services industry alone, it risks losing up to £111 million by taking at least six hours to respond to a DDoS attack. The problem here lies in the detection and response time to an attack, with 30% in the UK taking 3-5 hours to spot a hit, and figures demonstrating an increase in overall reaction time. Interestingly, slower detection and response times, coupled with the growing complexity of attacks, align with an increase in spending, with 83% admitting to investing more in DDoS protection. Moreover, as application layer attacks become more intelligent, there has been a significant increase in the deployment of Web Application Firewalls, with 53% reporting they invested in the technology during 2017. Protect yourself As hackers show no signs of slowing down, it is crucial to understand the DDoS landscape and how it can affect the technical infrastructure of a business. Possessing this information is vital for assessing the necessary defence solutions and cost, in order to select the correct protection method. For companies looking to reduce spending, there are low-cost services for guarding against DDoS, such as "clean bandwidth/pipe" solutions delivered by ISPs and content delivery network (CDN) services. While inexpensive, these defences are generally limited to smaller-scale attacks and, in the case of ISP-delivered services, depend on the user having a single internet provider. A sturdy and economical solution is on-demand cloud mitigation, which works by redirecting traffic to a mitigation cloud. Yet it relies heavily on a speedy failover to the cloud in order to escape any downtime. To counter this, the process can be automated by combining the client's router and the mitigation partner. A successful service will deliver integrated protection and monitor network and application layer (OSI layers 3, 4 and 7) attacks. In comparison, always-on cloud-based protection constantly redirects web traffic, which may cause issues with network latency, even during non-attack conditions. Moreover, extra solutions are necessary to conquer application layer attacks and, as a result, combining this approach with a CDN and a cloud-based Web Application Firewall is recommended. Outside of the cloud, a hybrid mitigation plan is the recommended choice and comprises a mitigation appliance and cloud protection. This plan will halt any form of DDoS attack and automatically activate cloud mitigation if the circuit is threatened. Finally, regardless of the solution, it is crucial to have a unified (layers 3-7) 24/7 Security Operations Centre, including a user interface with real-time monitoring and reporting. With this, an organisation is more likely to be victorious over an intelligent hacker.
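To make the on-demand model concrete, the sketch below shows, purely as an illustration, the kind of automated trigger a client's router and mitigation partner might agree on: track a rolling baseline of inbound traffic and swing traffic to the scrubbing cloud when the current rate far exceeds it. The threshold, the simulated traffic figures and the reroute_to_mitigation_cloud hook are hypothetical placeholders, not any vendor's API.

```python
import random
from collections import deque

BASELINE_WINDOW = 60      # recent samples used to form the rolling baseline (assumed)
TRIGGER_MULTIPLIER = 4.0  # hypothetical: fail over when traffic is 4x the baseline

def should_failover(history: deque, current_mbps: float) -> bool:
    """True when the current traffic rate far exceeds the rolling baseline."""
    if len(history) < BASELINE_WINDOW:
        return False                      # not enough data yet to judge what 'normal' is
    baseline = sum(history) / len(history)
    return current_mbps > baseline * TRIGGER_MULTIPLIER

def reroute_to_mitigation_cloud() -> None:
    """Placeholder for the real action agreed with the mitigation partner."""
    print("Failover triggered: inbound traffic redirected to the mitigation cloud")

if __name__ == "__main__":
    history: deque = deque(maxlen=BASELINE_WINDOW)
    # Simulated samples: roughly 100 Mbps of normal traffic, then a sudden 900 Mbps flood.
    samples = [random.gauss(100, 10) for _ in range(90)] + [900.0]
    for mbps in samples:
        if should_failover(history, mbps):
            reroute_to_mitigation_cloud()
            break
        history.append(mbps)
```

In practice the redirect would be a routing or DNS change agreed with the mitigation provider, and the detection logic would look at packet rates and protocol mix as well as raw bandwidth, but the principle of an automated, baseline-driven trigger is the same.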
As we wave goodbye to 2017, we can confidently assert that cyber attacks are going nowhere and, more to the point, that they will only continue to grow in size, scale and intensity throughout 2018. It therefore becomes everybody's responsibility to understand cybercrime in order to fight against it – and applying the correct, well-researched and most effective solutions is an essential starting point. ### Looking for Medical Information - The Giant Health Event 17 - Dean Thomson - Top Doctors UK Disruptive Live interviews Dean Thomson from Top Doctors UK at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Top Doctors UK: https://www.topdoctors.co.uk/ ### Moving to the cloud will help SMEs capitalise in 2018 Businesses that don't invest in the right tech will struggle to offer their customers the optimal experience, and those customers will go elsewhere - it's that simple! That's why it's essential that SMEs make full use of the cloud in order to grow and improve their customer offering as we head into 2018. The huge variety of technology solutions available poses a challenge for SMEs. Making their way through all the options and ensuring they invest their hard-earned cash into solutions and services that meet their customer needs, while also being scalable and updateable, is crucial. At First Data we found that well over half (60%) of small businesses don't use cloud computing at all, which is a worrying statistic. Why? Below are some of the key reasons why SMEs looking to get ahead next year should adopt cloud-based technology solutions. Achieve a better work-life balance Perhaps one of the first and most obvious benefits of moving to the cloud is the fact that time-poor business owners and their staff can achieve a better work/life balance, as having access to all the business's data and apps in the cloud means that they can work remotely or on the go. Cloud-based business solutions free up business owners from being stuck in the office or the store, allowing them to concentrate on the most important work of running the business – planning, strategy and ensuring that staff and customers are getting everything they need to keep them happy and motivated, whether that's doing the best job they can or being happy to spend their hard-earned cash on the business's services or products. Working remotely adds a necessary level of extra productivity to businesses. Not only that, a move to the cloud can also help SMEs work in real time and meet the demands of customers across multiple areas, no matter where they may be, whether that's in store, on mobile, or on their laptop at home. Move to the cloud to make the most of peak times This is particularly beneficial for retailers around the busy festive period. That's because online and mobile e-commerce has completely revolutionised the way we shop for gifts and seasonal treats. As consumers, we are more demanding and harder to please than ever before, because we expect increasingly tailored and personalised experiences from retailers, restaurants and small businesses. Online shopping and mobile e-tail applications have changed our expectations of what makes a decent retail experience.
We expect and demand far more bespoke shopping experiences, with retailers offering us discounts on the right products at the right time for us to take advantage of them. In First Data's survey of 1,000 UK consumers, 1 in 5 (21%) said that they would avoid independent stores during the festive period because they believe they will be too busy. And yet a third (33%) said that if a store was using the right in-store tech to ensure there were no queues, it would tempt them to spend their money. Happy customers love personalised experiences Happy customers are more likely to be returning customers, to spread great word of mouth among their families and friends, and to have positive and encouraging engagements with businesses on social media. So investing in the cloud ultimately means investing in the best payment technologies and ensuring that you are offering customers a speedy, personalised and therefore quality experience. Offer this type of convenience and seamless experience and you will be sure to boost profitability at this, or any, time of the year. Cloud-based technology, like First Data's Clover Station, ultimately provides SME owners with the flexibility to collect and act upon useful data, allowing them to better understand their business and their customers from wherever they are. It also gives business owners the option to back up their business and de-risks them from over-reliance on a single piece of hardware. Ultimately, moving to the cloud is a cost-effective approach, giving businesses the agility and flexibility to easily try out the next big trend as an add-on, rather than committing cash to a whole new proprietary solution. ### UK businesses need clear direction to keep pace with the economy in 2018 A survey commissioned by Interoute, the global cloud and network provider, has revealed that 51% of UK IT leaders are struggling to secure boardroom consensus for achieving digital transformation objectives. The UK result is significantly higher than the European average, which came in at 41%, and higher than in any of the eight other countries surveyed. "Without a clear route forward agreed by the C-Suite, many IT professionals will find it difficult to progress critical digital transformation projects, risking UK businesses being left behind," explained Mark Lewis, EVP products and development at Interoute. "With Brexit on the horizon, the UK faces unprecedented change and uncertainty in the market and it's fair to assume this is impacting decision making at the highest level. But it's never been more important for UK businesses to haul their IT talent out of running just the day-to-day business systems, and into creating the business processes and customer experiences that will make their products and services outstanding." "It is vital to get buy-in at the highest levels of any organisation to make the changes necessary that will allow businesses to progress and keep pace with the digital economy," Mark Lewis continued.
“IT professionals need the support of the whole business to be able to deliver the digital foundation business needs for future growth.” When ranking challenges for achieving digital transformation objectives, the 820 European IT decision makers surveyed were on average most concerned about their ability to integrate legacy technologies with cloud enabled applications (55%). This was closely followed by uncertainty around the changing political climate (52%) and a lack of talent/skills available to drive projects (52%). The research showed there is a higher cost associated with digital transformation skills, with 61% of UK businesses, 63% of French businesses, and 66% of businesses in Denmark expecting to pay at least 20% more for these, compared with skills required for other IT projects. “IT skills are in short supply for businesses across Europe, which is why leadership is needed to avoid being left behind in this new digital world,” Mark Lewis concluded. “To make the most of the opportunities presented by digital transformation, IT professionals must work together with C-Suite executives to act quickly and decisively. With the support, direction and consensus of the C-Suite, digital transformation can be accelerated to the benefit of the entire organisation.” ### SolarWinds Acquires Loggly, Strengthens Portfolio of Cloud Offerings SolarWinds, a leading provider of powerful and affordable IT management software, today announced it has completed the acquisition of Loggly, Inc., a provider of cloud-based log monitoring and log analytics software. With the transaction, the company also will add to its team of world-class software engineering talent. Loggly® is a SaaS-based, unified log monitoring and log analytics product, that aggregates, structures, and summarises log data so users can analyse and visualise their data to answer key questions, spot trends, and deliver actionable reports. The acquisition complements the company’s existing portfolio of SaaS-based cloud monitoring solutions and SolarWinds plans to continue investing to innovate and enhance Loggly. With the acquisition, SolarWinds also will deepen its cloud-software engineering and analytics expertise. Former Loggly executives Manoj Chaudhary, CTO and VP, Engineering, and Vito Salvaggio, VP, Product will join SolarWinds as leaders in engineering and product, respectively. Members of the core development, operations, support, sales, and marketing teams will transition as part of the transaction. The addition of Loggly extends the current SolarWinds® Cloud ® Software as a Service (SaaS) portfolio, which includes the Papertrail™, AppOptics™, and Pingdom ® products. Collectively, the SolarWinds Cloud portfolio gives customers broad and unmatched visibility into traces, logs, metrics, and the digital experience. The Loggly acquisition advances SolarWinds’ strategy to deliver comprehensive, simple, and disruptively affordable full-stack monitoring solutions built upon a common, seamlessly integrated, SaaS-based platform. It is the latest advancement toward the company’s vision of enabling a single view of infrastructure, applications, and digital experience management. Loggly will offer a compelling solution to address use cases where customers need log monitoring and log analytics with structured log data and aggregated events. 
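What working with structured log data and aggregated events looks like in miniature is easy to illustrate. The sketch below is not Loggly's API or schema; it simply parses a few JSON-formatted log events (the level, service and message fields are assumptions for the example) and summarises them into the kind of counts a log analytics dashboard would visualise.

```python
import json
from collections import Counter

# A handful of structured (JSON) log events, as an application might emit them.
RAW_EVENTS = [
    '{"ts": "2018-01-10T09:00:01Z", "level": "ERROR", "service": "checkout", "message": "payment timeout"}',
    '{"ts": "2018-01-10T09:00:02Z", "level": "INFO",  "service": "checkout", "message": "order placed"}',
    '{"ts": "2018-01-10T09:00:05Z", "level": "ERROR", "service": "search",   "message": "index unavailable"}',
    '{"ts": "2018-01-10T09:00:09Z", "level": "ERROR", "service": "checkout", "message": "payment timeout"}',
]

def aggregate(raw_events):
    """Parse structured log lines and summarise errors per service and per message."""
    errors_by_service = Counter()
    errors_by_message = Counter()
    for line in raw_events:
        event = json.loads(line)
        if event["level"] == "ERROR":
            errors_by_service[event["service"]] += 1
            errors_by_message[event["message"]] += 1
    return errors_by_service, errors_by_message

if __name__ == "__main__":
    by_service, by_message = aggregate(RAW_EVENTS)
    print("Errors per service:", dict(by_service))          # {'checkout': 2, 'search': 1}
    print("Most common error:", by_message.most_common(1))  # [('payment timeout', 2)]
```

At production scale the same idea runs continuously over millions of events, with indexing, search and retention handled by the service rather than in memory, which is exactly the capability the acquisition adds to the portfolio.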
"Rapidly visualising vast amounts of data through log analytics is absolutely critical to solving many problems in today's diverse, complex cloud-application and microservices environments," said Christoph Pfister, executive vice president of products, SolarWinds. "Adding Loggly to our industry-leading portfolio will empower customers to accelerate their time-to-insight and solve problems faster, with our usual, disruptive affordability. "Building on these strengths, we will continue investing in Loggly to innovate and extend its value to customers, while integrating its capabilities with our other Cloud offerings to address even broader needs," he added. ### Converged Infrastructure - The Giant Health Event 17 - Usman Chaudhry - Fujitsu Disruptive Live interviews Usman Chaudhry from Fujitsu at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Fujitsu: http://www.fujitsu.com/uk/ ### How and why AI will evolve in 2018 2017 will come to be known as the year artificial intelligence (AI) took off. While the science isn't new — at FICO, for instance, we've been building AI-based fraud detection models for 25 years — AI is now being increasingly deployed across a range of industries, especially in the financial sector. So what's next for one of the hottest trends in tech? There are five ways AI will continue to expand and flourish over the next twelve months: 1. AI Will Combine with Blockchain Moving forward, organisations will start to use blockchain technology, primarily associated with cryptocurrencies, to record "time chains of events". In these "time chains", people and their interactions are allocated an encrypted identity in a sequential chain of 'blocks'. Since all the confirmed and validated transaction blocks are linked and chained from the beginning of the chain to the most current block, this shared, distributed digital ledger becomes the single source of truth. As such, it allows audit trails of data usage in models, particularly around data permission rights. For example, consider renting a holiday villa. In the future, you'll be able to book the property using a micro-loan that approves you to stay for a set period, like two weeks. This micro-loan will have information about your previous holiday rentals and events attached to its chain, along with a codified history of the property's previous visitors, occurrences and maintenance. Throughout your stay, details about your habits and interactions – for example, energy consumption, breakages and so on – will be automatically recorded on the blockchain. When you leave the villa and lock it, the rental is complete and auditable on the chain. In 2018, blockchains will also be used to illuminate the web of relationships within specific events to generate further insight and, in some instances, highlight out-of-the-ordinary and potentially damaging activities.
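A production blockchain involves consensus, distribution and cryptographic identities, but the "time chain of events" idea itself, where each new record carries a hash of the one before it so the history cannot be quietly rewritten, can be sketched very simply. The event fields below (echoing the villa-rental example) are illustrative assumptions, not any particular platform's data model.

```python
import hashlib
import json
from datetime import datetime, timezone

def _hash(block: dict) -> str:
    """Deterministic SHA-256 hash of a block's contents."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_event(chain: list, event: dict) -> None:
    """Append an event, linking it to the hash of the previous block."""
    block = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "event": event,
        "previous_hash": _hash(chain[-1]) if chain else "genesis",
    }
    chain.append(block)

def verify(chain: list) -> bool:
    """Check that every block still points at the true hash of its predecessor."""
    return all(
        chain[i]["previous_hash"] == _hash(chain[i - 1])
        for i in range(1, len(chain))
    )

if __name__ == "__main__":
    rental_chain: list = []
    append_event(rental_chain, {"type": "booking",  "property": "villa-17", "nights": 14})
    append_event(rental_chain, {"type": "check-in", "property": "villa-17"})
    append_event(rental_chain, {"type": "meter",    "energy_kwh": 12.4})
    print(verify(rental_chain))             # True: the audit trail is intact
    rental_chain[0]["event"]["nights"] = 2  # tamper with history...
    print(verify(rental_chain))             # False: the chain no longer verifies
```

Real deployments replace the single in-memory list with a distributed, consensus-backed ledger and sign each event with the participant's encrypted identity, but the tamper-evidence property is exactly the one shown here.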
Think, for example, about average daily interactions with money, people, places and things. Most days follow a set routine, but sometimes chains of events occur that have new meaning, perhaps indicating bust-out fraud, money laundering activity, even suicide. 2. AI Will Learn to Fight Back Defensive AI – systems that seed their outputs with "faint signatures" to mislead, confuse or identify the attackers learning the AI system's response – will be front and centre in 2018. Although we've long talked about how AI and machine learning (ML) will help companies drive differentiation in competitive markets, this principle also applies in the criminal world, where attackers are already using malicious AI to circumvent the AI models companies have in place. This arms race, in which criminals arm themselves with "adversarial machine learning", tops McAfee's 2018 security forecast. 3. AI Will Have to Explain Itself The need for Explainable AI is being driven by upcoming regulations, like the European Union's General Data Protection Regulation (GDPR), which requires explanations for decisions based on scores, including those produced by AI and ML systems. In the areas of fraud and credit risk decisioning, the challenge of explainability is a longstanding one. While it is being dealt with successfully in these areas, other industries have entire sets of proposed explanatory algorithms that are variously right, ineffective or flat-out wrong. Under the GDPR, there are hefty penalties for inaccurate explanations – making it imperative that companies correctly explain the decisioning process of their AI and ML systems, every time. 4. AI Will Augment the Workplace In 2018, AI will augment much more of the workplace, not just through better software, but also through increasingly better versions of ourselves. Whether it's drawing information together for us to be superhuman at investigation and data recall, or improving how we learn new topics, AI will enhance our ability to process new information, especially in the field of fraud detection. The objective should not be to jump on the AI bandwagon. Instead, it is to further decrease human workloads and increase productivity in compliance departments. In the short term, traditional approaches will not be replaced but rather enhanced with AI that will take detection to the next level. This will help financial institutions generate new alerts as previously unknown patterns are uncovered with AI. 5. Chatbots Will Get Better at Understanding – and Manipulating – Us The chatbots of 2018 will rapidly become more sophisticated, which will help to dramatically reduce the costs of routine customer care activities while improving the overall customer experience. In the coming year, chatbots will be able to quickly understand human tone and speech content, and thus predict the conversational paths that are highest in value and can fulfil different objectives. However, on the dark side, this subtle "engagement" can turn into manipulation when AI learns the words that sway our attitudes and actions and elicit large-scale reactions. ### Vera Loftis - Who are Bluewolf? Compare the Cloud talks to Vera Loftis, from Bluewolf, about who they are and what they do. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, and events and workshops to enable you to stand out from the cloud crowd. ### How is the Giant Event going?
- The Giant Health Event 17 - Barry Shrier Disruptive Live interviews Barry Shrier at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ ### Why Unsecured IoT is a Disaster for DDoS Targets DDoS attacks have almost doubled in the last six months, reaching an average of 237 per month, according to new research from Corero. One of the main reasons behind this huge leap in attacks is the increasing number of easy-to-hijack, unsecured 'smart' devices which can be recruited into DDoS botnets. And this problem is only getting worse – especially considering all the consumer tech products now being purchased as Christmas presents and how they could soon become prime targets for hacker infiltration and takeover. Aside from the personal privacy and security concerns resulting from unsecured Internet of Things devices, another serious danger is the ease with which hackers can harness them for a variety of nefarious purposes, including being recruited into botnets and used in DDoS attacks. Indeed, unsecured IoT devices have powered some of the biggest DDoS attacks against online platforms in the last few years and thus, organizations of all sizes need to ensure their devices, data and networks are safe. The evolution of botnets Botnets have transformed the DDoS landscape. Smart devices are essentially a gateway for cybercriminals into their target's networks, making them more vulnerable to cyber threats and compromises. One of the biggest and most dangerous IoT-related cyber-attacks in the last few years was the Mirai botnet, which enslaved tens of thousands of poorly protected internet devices into bots used for launching DDoS attacks. Looking forward, the continuing proliferation of unsecured smart devices means there will be no limit to the potential size and scale of future botnet-driven DDoS attacks. By using amplification techniques with the millions of devices currently accessible, such as security cameras, DDoS attacks are set to become even more colossal in scale. Terabit-class attacks with the ability to 'break the Internet' – or at least clog it in certain regions – are a reality. Attacks of this size can take virtually any organization offline, and anyone with an online presence must be prepared to defend against them. Besides their growing size and scale, botnets are also becoming more sophisticated in terms of the techniques they use. For example, the Reaper / IoTroop botnet, which is already known to have infected thousands of devices, is believed to be particularly dangerous due to its exploitation of software vulnerabilities to gain control. Acting like a computer worm, it hacks into IoT devices and then uses those to hunt for new devices to spread itself further. But it isn't just the giant attacks that organisations need to worry about. Before botnets are mobilised, hackers need to make sure that their techniques are going to work. This is usually done by launching small, sub-saturating attacks which most IT teams wouldn't even recognise as DDoS.
Due to their relatively small size compared to normal traffic, and their short duration, these attacks typically evade detection by most legacy DDoS mitigation tools. This enables hackers to perfect their methods under the radar, leaving security teams blindsided by subsequent attacks. Indeed, organized cyber-attack groups regularly test their DDoS tools and techniques to see how far they can push the envelope. If these techniques are deployed at the scale possible with IoT botnets, the results can be devastating. Wider DDoS trends observed during 2017 In addition to the proliferation of unsecured Internet of Things devices, Corero has observed a growing availability of DDoS-for-hire services. Due to their cheap price-point and ease of access, they have revolutionised DDoS attacks by giving anyone and everyone access, without needing to have any understanding of coding. A quick search of Google and a spare $50 can put DDoS attacks into the hands of just about anyone. As a result, performing DDoS attacks has never been easier or more cost-effective. Furthermore, hackers are also using sophisticated, quick-fire, multi-vector attacks against an organisation's security. Such attacks use a combination of techniques in the hope that one, or a few, can infiltrate the target network's security defences. For example, DDoS attacks are increasingly launched as a smokescreen that distracts IT staff while the hackers stealthily breach other aspects of a company's database to comb for sensitive data such as credit cards and email addresses. Another key trend observed during Q3 2017 was the return of Ransom Denial of Service (RDoS). In an RDoS attack, cybercriminals send a message threatening to carry out a DDoS attack, or infect an organization's operational systems with forms of ransomware, unless payment is received by a certain deadline. For example, earlier this year the hacker group called Phantom Squad began extorting companies in Europe, the US and Asia. Indeed, DDoS ransom activity is on the rise, with extortion campaigns spanning all industries – from banking and financial institutions, to hosting providers, online gaming services and SaaS organisations. These trends alone make for worrying reading, but factor in the scale enabled by unsecured IoT devices and the result is DDoS attacks that are significantly more powerful and dangerous than previously possible. Indeed, as DDoS threats continue to evolve, organizations of all sizes need to keep up to date with the latest trends and attack vectors, to ensure their data, devices and networks are secure. Securing IoT devices, and how businesses can protect their networks from the growing DDoS threat To avoid smart devices becoming part of the DDoS problem, organizations need to pay close attention to the settings for those devices and, where possible, separate them from access to the Internet and to other devices. Organizations should include IoT devices alongside regular IT asset inventories and adopt basic security measures like changing default credentials. Finally, to stay one step ahead of these ever-evolving DDoS threats, organisations must maintain comprehensive visibility across their networks to spot and resolve any issues as they arise. The sheer volume of devices involved poses a serious security challenge.
After all, any device that has an Internet connection and a processor can be exploited. For this reason, effective DDoS protection requires continuous visibility into the threats, with real-time mitigation as well as long-term trend analysis to identify changes in the DDoS landscape and deliver proactive detection and mitigation. ### Professional Genotype Analysis - The Giant Health Event 17 - Emma Beswick - Lifecode GX Disruptive Live interviews Emma Beswick from Lifecode GX at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: gianthealthevent.com/ To find out more about Lifecode GX: https://www.lifecodegx.com/ ### Robotic Process Automation - The Giant Health Event 17 - Rhiannon Dakin - Flynet Disruptive Live interviews Rhiannon Dakin from Flynet at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Flynet: https://www.flynetviewer.com/ ### NHS Realtime Analytics - The Giant Health Event 17 - Dr Matt Lovell - Centiq Disruptive Live interviews Dr Matt Lovell from Centiq at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Centiq: https://centiq.co.uk/ ### A Headset For Weightloss - The Giant Health Event 17 - Jason McKeown - Modius Disruptive Live interviews Jason McKeown from Modius at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Modius Health: https://www.modiushealth.com/ ### Fashion from the Neural System - The Giant Health Event 17 - Danielle Jordan - Fashion Designer Disruptive Live interviews Danielle Jordan, a Fashion Designer at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. 
More information can be found here: https://www.gianthealthevent.com/ ### Managing big data storage – options for 2018 Data volumes across businesses of all sizes are growing at an average of 40% per annum, and IDC predicts that, worldwide, data volumes will almost double every two years between now and 2020. This trend is presenting real challenges for IT departments, as they are spending an inordinate amount of time and money on their storage infrastructures. Rather than firefighting data growth, new approaches are required, so valuable IT resources can be liberated to develop and implement business solutions which drive productivity and have a positive impact on the bottom line. Influencing factors There are many trends, such as BYOD and the consumerisation of IT, that are driving exponential data growth, as they create increased user expectations which demand ever-present information availability. However, despite the best attempts of IT teams to cater for this, they can be left struggling – and frequently experience such headaches as storage sprawl, spiralling storage costs and an unending cycle of capacity purchases. They face information management and governance challenges too, and – due to the complexities of data protection – these tasks can quickly become unworkable and drain resources even further. Getting storage and archiving right is also key to ensuring GDPR compliance. Our recent UK data archiving survey of IT executives, across 150 UK medium to large organisations, revealed that poor practices are creating GDPR risks. Of those questioned, 70% admitted that they don't have effective processes and systems to swiftly submit data to regulators – and so risk being exposed to penalties. The research also indicated that only 30% of those surveyed claimed their staff are able to find all types of structured and unstructured data quickly and easily, so most are currently unprepared to cope with the demands of GDPR compliance. To avoid the risks of GDPR non-compliance and other legal claims, organisations urgently need to better understand the lifecycle of their data, so it can then be correctly stored, managed, discovered and protected. So, what's the answer? To keep pace with this change and growth, storage technologies need to be more agile, scalable and flexible to address the myriad business requirements that are now an everyday norm. Thankfully, traditional on-premise infrastructure is evolving to embrace the many business benefits of cloud technology, which can positively impact the way businesses store, archive and retrieve data. On-premise or cloud? Businesses now face clear data storage options: in-house, cloud or both. Although cloud storage provides well-documented cost savings, efficiencies and liberated IT operations, decision making should involve thoroughly examining which option best caters for specific business environments. On-premise Almost every business has invested in some form of in-house data storage – be it a manual process, or a more agile solution that can meet both present and future business requirements. On-premise storage scores highly by providing fast access between data and applications, so businesses have the option of using dedicated high-performance disks.
Although these functions are available in the cloud too, the input/output rate is not always as fast. Ultimately, on-premise, high-performance storage holds many advantages for businesses with high-performance workloads, large databases, and performance-dependent virtual machines where the applications are local. Cloud With unstructured data growing faster than ever, storing it in the cloud is the better option. This is because, although unstructured data doesn't necessarily have the same performance requirements as tier 1 data, it is capacity intensive. Cloud storage works well because files of this type are accessed less over the longer term, and the scale and resilience of cloud platforms suit the associated workloads. Using storage in the cloud also has the added business advantage that management and maintenance of an underlying hardware platform are no longer required. What to demand from cloud The new breed of more mature cloud storage solutions has progressed from traditional services, and now offers complete archive capabilities with the security, scalability and integrity of big vendor-owned data centres. They offer fixed operational costs – without owning, running or managing any infrastructure – with minimal upfront investment, and guarantees of a simple transition to the cloud. The best solutions should also deliver an intuitive user experience and the freedom to access data from anywhere, across all devices. There are several non-negotiable, key features that businesses should expect from cloud storage options. Firstly, the cost of migrating data into the cloud from on-premises should be minimal. Secondly, businesses should only consider solutions that can archive and search across all data, including emails, files, IM and social media. Another key feature to look for is customisable policy configuration, including the ability to archive, retain and delete data using specific groupings or policies that suit business requirements. Finally, businesses should only consider solutions backed by enterprise-class cloud infrastructure and data protection, so critical data is extensively covered against outages, loss or corruption. ### Matching Digital Innovation to NHS Needs - The Giant Health Event 17 - Yinka Makinde - DigitalHealth.London Disruptive Live interviews Yinka Makinde from DigitalHealth.London at The Giant Health Event 17. Compare the Cloud's event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about DigitalHealth.London: https://digitalhealth.london/ ### Forget Forms, Conversation is King The marketing industry has experienced digital disruption like no other. The need to innovate, re-think and reinvent is constant, and has been amplified as a result of the global pandemic. Average retail conversion rates from April 2020 to April 2021 dropped, and in-person events are no longer a guaranteed method of generating leads. In a world where data is a new form of currency, creative thinking is now forced towards digital solutions. Form fatigue Online forms have been the traditional means of capturing online leads in the past.
Their direct approach to data collection means that they are still a valuable tool, even in today’s rapidly-changing world. But permanence is a thing of the past in the digital age, and as interactions with leads have evolved, web form fatigue has set in. Context must still be considered; however, methods of capturing leads must develop with the constantly changing marketing landscape. On the business side, web forms cause inboxes to overflow and messages to get lost in an abyss of similar queries. Unsurprisingly, this is problematic and puts businesses at risk of losing valued customers, as time and attention need to be focused elsewhere. The need for a solution is clear, but fixing something that isn’t exactly broken is daunting, especially when it comes to changing a process that has been the norm for so long. On the customer side, filling in a form is standard practice but is never something people enjoy. Some may argue that forms have the potential to damage a customer relationship before it has even begun. They tend to create an additional off-putting barrier to entry, despite their very purpose being to create digital relationships. This is something that needs to, and can, change. In a market where competitive pricing due to supply chain issues is becoming less important, greater emphasis is placed on customer service and satisfaction. The need for change is now. Capturing leads by capturing attention Digital innovation over the past 18 months has accelerated - mirrored by the attention economy free-for-all, where marketers fight for consumer recognition. With competition for clicks and conversions increasing, now is not the time to overlook real-time conversational marketing tools. The technology is readily available and accessible. Sticking with convention over more insightful means of generating leads is now a choice businesses are actively making. Real-time conversational tools such as chatbots provide marketers with the data to learn precisely how a customer would like to interact with a business. They are multi-functional tools with use cases in customer support, entertainment, and data collection. The fact of the matter is, people want to have conversations with, and generally trust, other people. Technological advancements have made two-way interactions possible, even without a human presence. Chatbots can engage in meaningful conversation and deliver responses similar to those of a human. It is no longer a necessity for businesses to hire advanced programmers or technologically savvy staffers to run them. No-code chatbots level up the engagement opportunity as deployment is easy, simple, and accessible. Teams do not require technical skills to automate workflows, as they may have previously needed, freeing up time for more important tasks. They do not look to replace humans but allow our time to be used more appropriately. Data delivery Aside from finding that labour costs could be cut by as much as 30%, IBM’s digital customer care study - conducted 4 years ago - also found that 80% of routine customer service questions could be handled by a chatbot. Since then, technology has advanced greatly, meaning that businesses can now control conversational flow, allow bots to take actions on behalf of teams, and solicit more valuable information about customer needs. A bot’s capabilities can be enhanced through the archive of data that it creates. The data allows unanswered questions to be stored and analysed, so that correct answers can be provided in the future. 
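As a simple illustration of that idea, the sketch below logs questions a bot could not answer and surfaces the most frequent gaps for the team to fill. It is a minimal, hypothetical example in Python – the function names and data are invented and do not refer to any particular chatbot platform.

```python
from collections import Counter

# Hypothetical sketch: keep a running log of questions the bot could not
# answer, then report the most common ones so the team can add answers.
unanswered_log = []

def log_unanswered(question: str) -> None:
    """Record a question the bot had no confident answer for (normalised)."""
    unanswered_log.append(question.strip().lower())

def top_gaps(n: int = 5):
    """Return the n most frequently asked unanswered questions."""
    return Counter(unanswered_log).most_common(n)

# Example usage
log_unanswered("Do you ship to Ireland?")
log_unanswered("do you ship to ireland?")
log_unanswered("What is your returns policy?")
print(top_gaps(1))  # [('do you ship to ireland?', 2)]
```

Even a lightweight loop like this turns the raw conversation archive into a prioritised to-do list for the marketing or support team.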
The bread and butter of a chatbot’s functionality is to turn raw data into conversation. However, chatbots can also be taught to process the data they receive and supply analytics to teams. This can help businesses to better understand, at a granular level, why their conversion rates are not high enough and identify any blocks for new customers. With an aim towards a frictionless experience for customers, where convenience and ease of use are the objectives, the use case for conversational assistants becomes more and more apparent. Conversation is king Businesses can use this information to contact cold leads, profile users into categories for better targeting, and flag any recurring issues raised by customers. Companies can then redirect their efforts to the recurring problems to gain a better understanding of key difficulties. By not incorporating real-time conversational tools, businesses risk losing vital data insights and the chance to enrich their customer relationships. The data-driven marketing economy means that small margins make a significant difference to the success of a business. Data insights from chatbots allow marketers to make informed decisions. These are tailored to a customer’s needs and enable better triage of requirements to improve conversion rates. Chatbot implementation means that speculation is no longer needed and calculated responses can be developed effortlessly and effectively. This is a solution of its time. The digital evolution Conversational assistants are not just the next trend. They are the next stage in our digital evolution. Just as CDs have been replaced by online streaming platforms in the music industry, chatbots will replace their webform predecessors. The scope for chatbots to support businesses does not stop at the marketing department. The technology saves manpower, streamlines everything from customer service support to HR and internal communications, and most importantly allows salespeople to do just that - sell. Navigating the digital world is still a relatively new phenomenon to us, so we need to make its architecture as simple and seamless to manoeuvre as the real world. The scope for chatbots to help businesses achieve their goals is significant and exciting. It’s time for businesses to move out of their comfort zones and embrace the possibilities chatbots present for marketers. ### Parkinson’s Walking Assistance - The Giant Health Event 17 - Lise Pape - Walk with Path Disruptive Live interviews Lise Pape from Walk with Path at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: gianthealthevent.com/ To find out more about Walk with Path: walkwithpath.com/ ### How to Import the Best Disruptive Tech In the struggle to remain competitive, much of what started out as ‘disruptive tech’ is now mainstream and, as costs fall, it is becoming more accessible to small businesses than ever before. However, taking the decision to introduce such technologies is not as straightforward as it sounds and businesses must be clear about what they expect to achieve from their investment before taking the plunge. 
The phrase ‘disruptive tech’ is widely used nowadays to describe virtually any innovation, and despite it being well understood by many in business tech communities, its meaning is sometimes misunderstood by others. So, what exactly makes a technology ‘disruptive’? [clickToTweet tweet="#Disruptive #BusinessModels have the potential to change their markets irrevocably..." quote="Disruptive business models have the potential to change their markets irrevocably..."] In its simplest form, we can say that it is modern technology that challenges the way current business models work - Netflix’s disruption of the home video market serves as a good example. Previously, the way to watch videos at home was to go and rent them from a Blockbuster store and a large part of the company’s revenue was derived from fees for late returns. Netflix’s arrival on the scene allowed consumers to stream and watch content without leaving their own living rooms. This disruptive business model has since changed the market irrevocably. In the business world, cloud accounting software, Xero, is another example of a disruptive technology that is changing things for the better. Rather than relying on clients sending forms and information back and forth to their accountant, data is held in the cloud and is available to view by business managers and advisers, as and when needed. Through this, the nature of the client relationship has evolved and professional services firms are able to work with their clients in a much more proactive and integrated way. Cloud technology itself forms the basis of much ‘disruptive’ tech and is useful mainly because of its ability to cater for mass integration. Cloud-based systems enable the formation of ecosystems; allowing systems, processes and software to be linked up, delivering data-based insights and facilitating greater collaboration. For an SME looking to adopt the latest disruptive tech, diving straight in is not the best option and making sure base-line processes are in order is a good first step. There is little point in forging ahead with robotic or automation strategies if the business still uses legacy systems that require data to be inputted manually, for example. Such transformations may need to be phased in slowly to avoid excessive workforce disruption and to ensure skilled operatives are trained appropriately. Managers or business owners looking to implement innovative technologies should always take a step back and consider their objectives. They should also ask themselves what exactly they will need to do in order to achieve their objectives.  For some, the solution may lie in the integration of robotics or automation, for others, it may be as simple as beginning to move away from physical data servers to the cloud. Technological know-how is moving on a daily basis and even early adopters will find themselves using legacy processes and systems before too long. Successfully capitalising on the latest tech trends involves being prepared to rethink the entire business model, or change its operational infrastructure, not just bolting on new elements. Particularly for SMEs, cost is often cited as the most prohibitive factor when choosing to implement new technologies. However, in reality, people themselves are often the most resistant to change. Winning over teams who may be set in their ways and somewhat resistant to the implementation of new, external ideas is a key goal for senior-level management. 
The ability to sponsor a corporate vision is vital in driving through any significant change in a business. Obviously, once company-wide buy-in has been secured, businesses must prepare an implementation plan whilst keeping a close eye on strategic objectives. Customer experience is a key area of improvement for many businesses and personalisation of service is playing a pivotal role in creating greater value for the end user. Big companies such as Amazon and Netflix are leading the charge in this area with bespoke recommendations, but smaller players are realising that personalisation is integral to forming long-standing relationships and attracting repeat business. For SMEs, novel digital marketing tools are revolutionising the way they can interact with their customer base. Where previously, sending an e-shot was an onerous communications task, CRM and database tools offer user-friendly, personalised mailings that are capable of getting the right information in front of the right people. From a cost perspective, with many tools and systems now offered on a subscription basis, the days of having to finance a large system or infrastructure upgrade are long gone. Most new products provide regular updates, ensuring that businesses are always working with the latest version. This flow of information strengthens intrinsic links between the customer and the business, and new mobile interfaces, in the form of apps or portals, are increasingly recognised as the de facto way of managing customer accounts and services. For SMEs, having an efficient and useful customer portal allows information sharing and transparency to be central to the relationship. Taking cloud-based accounting software as an example, the use of mobile-enabled dashboards means invoices can be created and shared, along with useful benchmarking statistics and data sets, quickly and easily. The value of the service is enhanced by this flow of data and communication. Whilst SMEs aren’t necessarily expected to offer the same level of service to their customers as a large multinational corporation, improved access to big data now means it can be harnessed at all levels. While smaller businesses might find the phrase ‘big data analytics’ daunting initially, many quickly realise that they can drive additional value from the data sets they already have on file. Data can be used effectively to analyse the customer base - who orders what, who pays on time and who delivers on time. Taking time to use data to analyse and streamline performance where possible can be incredibly beneficial for any organisation, regardless of size. Spotting gaps in the market and trends within customer data can also generate cross-selling opportunities; allowing bespoke products and services to be offered to different customers. For any business uncertain about whether to invest in a ‘disruptive’ technology, the potential boost to productivity and/or efficiency should be sufficient to convince them to make the move. The key lies in quantifying the benefit that the investment is expected to bring, over a defined period of time – this is known as return on investment (a simple worked illustration follows below). Long-term survival should also be front of mind when making investment decisions. The sheer pace of technological change means that businesses can quickly be left behind if they fail to invest at the right time.  
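To make the return-on-investment point concrete, here is a minimal sketch in Python. The figures are invented purely for illustration and are not drawn from the article.

```python
# Illustrative ROI calculation – all figures below are assumed, not real data.
investment = 20_000            # up-front cost of the new technology (£)
annual_benefit = 12_000        # estimated yearly saving or extra revenue (£)
annual_running_cost = 3_000    # subscription and support costs per year (£)
years = 3                      # the defined period over which ROI is judged

net_gain = (annual_benefit - annual_running_cost) * years - investment
roi = net_gain / investment    # return on investment as a fraction of the outlay

print(f"Net gain over {years} years: £{net_gain:,}")   # £7,000
print(f"Return on investment: {roi:.0%}")              # 35%
```

However rough, putting the expected benefit, running costs and time horizon into a single calculation forces the conversation away from hype and towards a quantifiable business case.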
Many smaller businesses also have agility on their side, which means they can bring new products and services to market more quickly and, in so doing, challenge the dominant players. At its heart, disruptive technology encourages existing businesses to keep changing, whilst offering newcomers the chance to identify and capitalise on new market opportunities. With businesses facing considerable Brexit uncertainty, disruptive technology also offers them all an opportunity to acquire resilience and strengthen their proposition. Finding ways to improve customer services and communication could help to ring-fence clients at a critical time and allow the business to focus on developing its business model to meet the future needs of the marketplace. ### Prove your organisation’s security and credibility with a cybersecurity score With the growing numbers of data breaches and escalating cyber threats, cybersecurity scores – the better of which tell you how likely your organisation is to suffer from a data breach – have taken on increasing importance. Just as individuals have their own credit scores, businesses arguably need a score that reflects their cybersecurity preparedness if they are to gain public trust. Objective cyber risk measurement brings clear benefits – and there are questions organisations should be asking to get the right scoring solution. Almost all companies have a supply chain of some sort, and the largest companies are often linked to tens of thousands of business partners. Each of these, in turn, may be connected to thousands more. This ‘extended enterprise’ is not a new concept, but in the current hyperconnected environment, it’s creating new challenges. [clickToTweet tweet="in the current hyperconnected environment, the ‘extended enterprise’ is creating new challenges." quote="in the current hyperconnected environment, the ‘extended enterprise’ is creating new challenges."] Business networks, large or small, have significant implications for an organisation’s cybersecurity position. Direct supplier relationships are often difficult to assess, but the risk posed by vendors of vendors is even more intractable. These ‘4th parties’ – the partners of partners – represent an additional threat, as they could bring a multitude of new dangers. When these 4th-party threats are multiplied across the vast number of interconnected businesses across the UK alone, data breaches and malware attacks represent the hurricanes and earthquakes of the cybersecurity industry. A single large cyber attack could trigger a highly destructive chain of events that begins with a 4th-party organisation but spreads rapidly to impact your business. In this hazardous cyber landscape, the question businesses must ask themselves is: How can I identify and quantify the aggregate risks I am facing? A good starting place is cybersecurity scoring. Like credit scores that are used to underwrite loans, cybersecurity scores enable you to compare the risk of multiple enterprises and see at a glance where the risk lies. As an individual, a good credit rating is a mark of credibility that can help you progress with, for example, purchasing a dream house or car — as a business, a good cyber score will be an increasingly important mark of credibility that can help you ally with new partners and suppliers. 
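To give a feel for the mechanics, here is one simple way such a score could be assembled. The indicators, weights and 300–850 scale below are invented for illustration; commercial scoring products derive their models empirically from breach data rather than from a handful of hand-picked signals.

```python
# Hypothetical weighted cyber risk score – indicators and weights are
# illustrative only, not taken from any real scoring product.
indicators = {                                 # each value is a 0–1 "badness"
    "unpatched_critical_systems": 0.40,        # share of systems missing critical patches
    "expired_tls_certificates":   0.10,        # share of public endpoints with expired certs
    "exposed_admin_interfaces":   0.05,        # share of hosts exposing admin panels
    "staff_phishing_failure":     0.20,        # failure rate in the last phishing simulation
}

weights = {                                    # relative importance, summing to 1.0
    "unpatched_critical_systems": 0.4,
    "expired_tls_certificates":   0.2,
    "exposed_admin_interfaces":   0.2,
    "staff_phishing_failure":     0.2,
}

# Convert the weighted risk into a credit-style number: higher means lower apparent risk.
risk = sum(weights[k] * indicators[k] for k in weights)
score = round(850 - risk * (850 - 300))

print(f"Weighted risk: {risk:.2f}, indicative score: {score}")
```

The value of the real products lies in how the weights are derived and validated against observed breaches, not in the arithmetic itself – which is why the factors discussed below matter when choosing a solution.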
The best cybersecurity scoring solutions use empirically derived predictive analytics to profile business systems and the environment they operate in – including inferred behavioural policy indicators – to derive a score. These scores are informative in two ways. Firstly, they’re an indicator of how likely your organisation is to suffer a data breach. Secondly, getting a good score will inspire trust in your business, as customers and suppliers will feel more confident that their data is as safe as possible. Here are four factors to consider when exploring different cyber scoring solutions: Understand your starting point: In order to improve, you need to first understand your current position. A score enables you to set a benchmark against which future changes can be judged. That’s why it’s important to choose a solution that responds to shifting conditions but isn’t so sensitive that it changes with the wind. All organisations suffer network issues or transient risk conditions. If these are fixed quickly, they don’t impact long-term risk factors. Your score must balance responsiveness to new risk conditions with the long-term score stability necessary to make decisions around, for example, investments and vendors. Determine what you want to achieve: While it’s straightforward to use a cyber scoring solution to get a snapshot of your current cybersecurity posture, this won’t necessarily help you understand how likely it is that a breach will happen in the next 12 months. Some scores on the market are not forward-looking but are simply point-in-time assessments. You need a solution with underlying analytics designed to produce a stable, forward-looking indication of security risk in a relevant future time window. Develop risk profiles for different parts of your business: It’s important to select a solution that gives you the transparency to understand the risk of constituent parts of your business. This may include different subsidiaries and locations. Understanding these in-depth insights is as important as getting an overall picture of your organisation. You won’t be able to act on your results unless you understand what needs to change in each area of the company. Make sure the score is explainable: You need a solution that expresses risk in an understandable way that informs action and justifies investment. Scores should help you to explain the likelihood of a breach at your organisation to your business partners and insurers. Cyber scores must also help you make decisions about suppliers – by understanding their score, you can better assess whether they are introducing you to risk. This is essential if they will be accessing your data or systems. Much like the traditional credit score, cybersecurity scores provide a way of pulling together a range of risks and data sources to help generate measurable, actionable insights. This checklist will help businesses better understand what is achievable with the right scoring solution, and how this enables them to improve their relationship with both partners and customers. ### Why the Giant Health Event? - The Giant Health Event 17 - Dr Odeh Odeh Disruptive Live interviews Dr Odeh Odeh at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. 
Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ ### Bespoke Footwear - The Giant Health Event 17 - Iris Anson - Solely Original Disruptive Live interviews Iris Anson from Solely Original at The Giant Health Event 17. Compare the Cloud’s event coverage platform Disruptive partnered with the Giant Health Event 2017 for the first time, interviewing health technology experts from all over. Come join 1000s of passionate healthcare innovators at Europe's largest, most valuable healthcare innovation event. At the famous Old Truman Brewery in London, England. More information can be found here: https://www.gianthealthevent.com/ To find out more about Solely Original: https://solelyoriginal.com/ ### How to compete in the IoT arms race Sam O’Meara, UK Director of Applause, discusses the future of the IoT market and offers advice to start-ups and scale-ups looking to take an early lead VHS vs Betamax. Android vs Apple. Whenever a new technology starts to become ubiquitous, there is always a battle for supremacy, especially between technology giants. As the Internet of Things emerges as “the next big thing”, there’s a lot at stake in the race to market leadership. Today, it’s still unclear whether an elite few will come out on top and dominate market share, or if IoT vendors’ emphasis on open source gives smaller startups real opportunities to thrive. A recent Bain report shows that executives are struggling to determine where or how to invest in the IoT. This is in part because the IoT market is a fragmented one, consisting of device makers, software providers, and operating systems all vying to be the industry leader in their respective fields. [clickToTweet tweet="in the #IoT arms race, there’s more to the industry than #Alexa and #Cortana" quote="in the #IoT arms race, there’s more to the industry than #Alexa and #Cortana"] The usual suspects - Google, Amazon and Microsoft - seem to be leading the way early on in the IoT arms race - but there’s more to the industry than Alexa and Cortana. By looking beyond smart assistants, smaller players like Ring and VersaMe have taken advantage of applying IoT to safety and education, respectively. As the market for IoT technology heats up, it’s very likely that a window of opportunity will open for small, innovative companies to compete with the big players and gain market traction. How can smaller startups seize this burgeoning opportunity to make their mark in the IoT space? Here are four practical tips. Keep your vision broad, but your MVP narrow The success (or failure) of an IoT product will hinge on its product quality. When you’re in the early stages of developing any product, defining your minimum viable product (MVP) is essential - in the case of an emerging market like IoT, you need to build it to an even higher standard. IoT products have to work in ways that other products don’t, and they have to work better than traditional methods to give consumers a reason to change their typical way of doing things. When defining your MVP, focus on the cost-effective features that are most important to implement without being impossible. Keeping features at a minimum will also provide flexibility and the ability to pivot on your idea. Defining your MVP requires you to really focus on identifying your primary target audience by demographic, region and affordability. 
Run marketing campaigns targeting these segments only - the more focused you can be, the better. Focus on the end-user If a consumer has to ask a virtual assistant the same question multiple times before they get an answer, why would they ask the assistant again when they could just type the question into Google? Scenarios like this can be avoided through functional testing on a large scale to reveal any product issues beforehand and ensure that quality is never an afterthought. There’s a whole range of bugs that can negatively impact the user experience, and any glitches to the user experience will make a consumer think twice about using IoT devices or may turn them off from the category altogether. And it’s not just software bugs in the traditional sense. While all brands test to check their devices are functionally working, not everyone has the user experience front of mind. Not only are users wide-ranging and unpredictable in their habits, they have varying expectations for product design. This is where testing your device on a large scale - in different regions and scenarios and on multiple devices - can give you a huge advantage. Prioritise security Studies show that 90% of consumers lack confidence in the security of IoT devices, while the media has so far blamed the IoT for attacks on big and apparently stable companies like Amazon, Twitter and Spotify. It’s clear the fear is very real. Security requirements are particularly high when developing IoT devices and applications as they often carry highly sensitive data. Protecting customer information needs to be a top priority for any IoT startup, so it’s important to integrate safety features into the design when developing products. The best way to counter security fears is of course by making sure products are tried, tested and proofed, making sure there are no bugs that cyber-criminals can exploit. You can fix security issues through stringent testing in order to gain and maintain customer trust. Otherwise, you risk losing customers’ business in the short term and in the long term. Understand the ecosystem within platforms Ecosystems will play a huge role within the IoT. IoT devices are connected to one another as well as to the outside world, so experiences are never stationary. They’re part of an ecosystem within a variety of platforms. IoT players should know that they don’t have to go and create an entirely new platform by themselves. They need to use the ecosystem already in place to their advantage. Perfect the ecosystem experience to perfect the customer experience. As the IoT arms race builds to a fever pitch, smaller companies can get ahead In essence, what consumers want from connected devices isn’t much different from what they want from any new entrant to the tech market: user-friendliness, unshakeable security and excellent quality. As the IoT arms race builds to a fever pitch, smaller companies can get ahead - by promoting security, quality and ensuring they put customers at the centre of design. ### Chatbots on the dealing floor The robots are coming, and for the investment sector, their arrival couldn’t come at a better time. The march of artificial intelligence (AI) into the workplace can be seen in every facet of the corporate world, from the production line to the financial services space. Analytics-driven algorithms are assisting with everything from insurance claims adjusting to currency trading. 
Banks and brokers are benefitting from more informed decision-making, improved analysis and predictions, as well as the ability to trade at machine speed and in greater volume. However, alongside better analytics and greater capacity, AI and cognitive reasoning can deliver something equally important to the financial sector – transparency. Financial trading in the major economic markets has undergone wave upon wave of innovation and change. The ‘big bang’ of the 80s saw the age of open outcry trading in pits on the trading floor come to an end, replaced by computers and phones. Technology and connectivity advancements since then have refined the electronic trading process further. Yet, amid the white heat of the technological revolution, one relic of the financial market has remained in the form of the interdealer broker. Legal, decent, honest and truthful An uncomfortable reality of the Libor scandal was the pivotal role played in it by interdealer brokers. As the middlemen who line up buyers and sellers of securities for banks, a group of interdealer brokers emerged as the key enablers of Libor rate-rigging. The firms involved were fined a total of $2.6 billion for rigging global interest rates. It’s a major reason why interdealer broking needs to be reformed and why it needs a new layer of transparency that AI and blockchain can deliver. [clickToTweet tweet="Libor showed us the risk to the marketplace when brokers operate solely for themselves" quote="Libor showed us the risk to the marketplace when brokers operate solely for themselves"] Libor showed us the risk to the marketplace when brokers operate solely for themselves, instead of being in business for their customers and for the betterment of the market. It is one of the major reasons why the interdealer broking market is a perfect candidate for the implementation of AI as a reforming measure. One of the key benefits of an AI broker underpinned by a blockchain transaction platform is the elimination of the errors, abuse and criminality that plague financial markets that rely on human brokers. By harnessing the power of AI to offer users a virtual broker, the market benefits from having informed, data-driven traders that are not only 100 percent trustworthy, but also completely independent and totally auditable. Pair this with a platform that can record every transaction and market movement, including when those transactions took place, and suddenly the interdealer broking market becomes a marketplace where manipulative trading practices can no longer be hidden from view, and where human traders and automated brokers can operate side-by-side and interoperate. Engaging with AI using a cognitive chatbot AI chatbot interfaces in banking are emerging quickly. Startups such as Cleo and Revolut already use AI chatbots for customer service, but their use need not stop at ordering a new debit card. A virtual trader is useless without an interface to instruct and engage it. Cognitive reasoning platforms such as Rainbird allow AI technologies to make human-like decisions based on data and knowledge maps, and to escalate more complex cases for human intervention. Such a structure ensures a level of certainty and a measurable decision-making process, with a clear audit trail of which trades were executed and why each decision was made. 
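As a rough sketch of what such a capped, auditable decision step might look like, the Python below approves or rejects a proposed trade against a hard risk limit and writes every decision to an append-only log. The risk cap, the function and the record structure are all hypothetical illustrations rather than any vendor’s API, and the in-memory log simply stands in for the kind of tamper-evident journal a blockchain platform would provide.

```python
import json
from datetime import datetime, timezone

RISK_CAP = 50_000   # hypothetical hard limit on exposure the customer will entertain (£)
audit_log = []      # stand-in for a tamper-evident journal of every decision

def decide(instrument: str, proposed_exposure: float, rationale: str) -> bool:
    """Approve or reject a proposed trade and record why, when and against what limit."""
    approved = proposed_exposure <= RISK_CAP
    audit_log.append({
        "time": datetime.now(timezone.utc).isoformat(),
        "instrument": instrument,
        "proposed_exposure": proposed_exposure,
        "risk_cap": RISK_CAP,
        "approved": approved,
        "rationale": rationale,
    })
    return approved

decide("GBP/USD swap", 42_000, "model expects rate divergence")   # within the cap
decide("GBP/USD swap", 90_000, "model expects rate divergence")   # rejected: over the cap
print(json.dumps(audit_log, indent=2))
```

The point is not the trading logic, which is deliberately trivial here, but that every recommendation, limit and outcome is recorded in a form that a regulator, partner or customer can later inspect.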
Rather than being at the mercy of the variables of human traders and their own perceptions of whether a risk is worth taking, an organisation or a customer can tune or cap their AI trader and impose hard limits on how much risk they want to entertain via the AI, for a consistent outcome. By using a chatbot interface, users can interact with it using conversational language and logical questions. In the financial trading space, this removes many of the cultural obstacles to using a non-human trader. It also makes the AI behind it more of a resource, allowing users to ask questions, solicit advice and provide the bot with more information when it doesn’t have all the background it needs to make a call on a trade. Broadening the marketplace with cryptocurrencies The cryptocurrency space is an opportunity for the marketplace. With values changing at such a pace, and new investment currencies emerging on a daily basis, speed is of the essence. To maximise the opportunity, traders need to move faster than humans can. High-frequency trading (HFT) is not new, but being able to interact with an HFT algorithm in real-time to enrich and develop the trading strategy is a change to the status quo. It is also essential in order to fully exploit the opportunities of cryptocurrencies alongside physical fiat currencies. The growing fragmentation of crypto markets is problematic – a stumbling block to those operating in the marketplace. An AI trader can effectively work across these fragmented environments – wallets, exchanges, ICOs, private offerings etc. You can trade safely and effortlessly, with full control of the actions being carried out on your behalf across all crypto and physical currency repositories. Using an AI execution chatbot for trading – with an optional Alexa-style voice recognition system – information can be aggregated from a much wider range of sources than a human trader can realistically keep in their head, while still retaining the experience of real-time interaction and thought process. The key to decentralised trading isn’t actually a new cryptocurrency asset but an effective cryptocurrency broker that can move at the same speed as the currencies. By simplifying and opening markets to the widest possible range of people, the combination of AI, chatbots and a fully transparent journaling platform for trades delivers new opportunities for the financial sector while continuing the automation and technology push that began with the big bang in the 80s. ### Say Hello to our Automated Future Every year, Workfront produces a comprehensive report called The State of Work that checks the pulse of the British workplace at large. Over the years, the report has corroborated what many of us know to be true – that emails are annoying but kind of a necessary evil, that we all wish we could work more flexibly, and that most meetings are a waste of time. This year, however, one important finding was that many office workers are actually quite excited about increased automation in the workplace – 3 in 4, to be more specific. Now, this stat may surprise those who’ve been banging the drum about a machine apocalypse a la Terminator, but I’d wager that it wouldn’t come as news to anyone who has been closely following the developments in automation and robotics, especially as they pertain to improving productivity in the workplace. 
Man vs Robot When we dug a little deeper, we found that the overwhelming reason workers welcomed automation was that it freed them up to focus on their primary work duties. Too many of us spend a large part of our day on non-essential tasks. In fact, according to our research, only 40% of our working day is spent on the primary duties of our jobs. Our findings show that we’re itching to work on the parts of our jobs that require nuance, consultation and wisdom, elements that still require the ‘human touch’, and we’re more than happy to automate the rest. However, as the limits of automation expand further, what we consider “human-only” tasks has begun to shrink at a dramatic rate. Tasks involving data analysis can be automated relatively simply, but what we’ve seen with recent technological advances are more soft-skill tasks like customer service and copywriting being automated as well. British workers don’t seem to be that concerned about these advances. In fact, 92% believe that no matter how sophisticated artificial intelligence becomes, there will always be the need for the human touch in the workplace. Most of the scaremongering headlines we see about AI focus on how robots may displace humans from jobs in the future. When we actually spoke to workers about this, we found their response was quite nuanced. 38% thought workers will have to compete with robots/machines and AI for jobs in the not-so-distant future, but 82% also expressed optimism about the opportunity to learn new things as the workforce moves towards automation. It appears workers are bracing themselves for a future where humans and AI coexist, each doing the jobs the other can’t or won’t do. Disrupting the status quo Several industries and fields are already utilising artificial intelligence in advanced ways that not only improve individual productivity but also fundamentally optimise performance in ways that simply were not possible before. Here are four areas where AI is currently making a big impact in how work is done: 1- People Analytics AI is helping large companies understand patterns in unplanned absences and then pre-schedule extra staff when the data tells it people are likely to be away from work. The automobile industry is already doing this, and many shift-based companies are looking into investing in technology that can predict behaviour based on established patterns. Some HR departments are already using software that analyses email metadata to better understand why some employees are more productive than others. 2- Algorithmic Website Personalisation Website personalisation is the ability for a website to serve visitors with highly personalised and contextual information. It’s of great value to B2B or B2C companies. Artificial intelligence can monitor a website’s analytics every second of the day and enable content customisation based on a user’s location, demographics, online behaviour/history, and other factors. 3- Automated Customer Service Businesses already collect a large amount of data from customers, from their browsing habits, purchases, customer-service inquiries, and even visits to brick-and-mortar locations. The vast amount of data collected can often present a challenge when it comes to processing it. That’s where artificial intelligence comes in. 
AI can analyse the data to predict a customer’s shopping behaviour and serve them with the relevant information on the website or through in-store marketing. Artificial intelligence can use that data to more quickly solve customer-initiated concerns and even anticipate issues before they occur. 4- Automated Resource Management Resource management is a tricky beast. It involves balancing several variables, often out of one’s control. It’s not easy to figure out team schedules, weigh the impact of new projects against current priorities, and assign individual tasks in a way that keeps everyone busy and all workloads in balance. Artificial intelligence can automatically allocate tasks to those who are most available and relevant to the work, giving a manager the final say on whether to use or override the recommendations. The Future is Now So the future is definitely automated, but not in the Terminator or iRobot way that most people think. The future will see happier and more purposeful workers who know they are spending every minute of their day doing things that not only matter... but things that only humans can do. That’s the kind of future we all want to see. That, and a robot that brings you your morning coffee. [clickToTweet tweet="So the future is definitely automated, but not in the Terminator or iRobot way that most people think." quote="So the future is definitely automated, but not in the Terminator or iRobot way that most people think."] Work automation tools are already working their way into future-ready organisations. These tools have made tremendous strides in putting an end to busywork, slashing the hours spent tackling email, eliminating days’ worth of unproductive meetings, and removing other productivity bottlenecks. With these new platforms, employees now have more time and opportunity to focus on actually getting the job done through ingenuity, creativity, and strategy rather than wasting so much time on tasks that contribute virtually nothing to business success or the bottom line. The workforce has sent a message loud and clear: they’re ready for automation tools that free them to apply their time and talents to the real jobs you’ve hired them for. ### Looking Ahead to 2018 Cloud Computing Just before Christmas, we kicked off our look ahead to what 2018 will have in store for us with "Cloud Trends to Shape 2018" by Jim Somer of Log Me In. It takes a look at artificial intelligence, its role in facial recognition and defining Digital Transformation. It also talks about something that has long been heralded, but which may now become a reality in 2018 - a fully 'Cloud-Based' workforce. Will every company be able to ditch their offices? Will every company want to? Michael Segal of Netscout also told us why 2018 will be the year of Hybrid Cloud. Cloud Computing spend is predicted to increase from $67 Billion in 2015 to $162 Billion in 2020 - with 2018 being a critical step along the way. Is Hybrid Cloud already part of your strategy moving forward? If not, should it be? Of course, Christmas is a time for happiness and celebration. And spending. Lots of spending! As more and more transactions take place online, the chance for cyber-criminals to do their worst only increases. Richard Menear, managing director at Burning Tree, took a look at how Christmas (and the New Year Sales!) provide cybercriminals with all the opportunities they need. Luckily, Richard also gives us some tips to stay safe online. 
Following on from Richard's article, and staying with the theme of CyberCrime, we also had some insight from Sophos and SolarWinds about a potential explosion of RaaS in 2018. RaaS, of course, standing for Ransomware as a Service. It's bad enough that Cybercrime takes place - but with RaaS kits being made available online, there is the potential for anyone to become a Criminal Mastermind. Staying on the shopping theme, but switching to a happier note, we also had Gavin Wheeldon, CEO of Purple, talking about how technology will shape the future of retail in 2018. A recent survey found that 55% of people look at their mobile device while shopping - if anything it's surprising that it's not more! That figure will only increase in 2018 - so how are retailers going to use that fact to improve the consumer experience? Next, Dorian Selz of Squirro took a look at the possibility of 2018 being the year of Artificial Intelligence. As with most technological developments, we often have a promise of what's to come coupled with a taster of its benefits long before the big break-through finally takes place. Read Dorian's article to find out what 2018 will mean for AI - its potential and also the concerns it brings with it. Rounding off our articles looking ahead to the coming year, we had Matt Fisher of Snow asking what will keep CIOs awake at night in 2018! One thing that really stands out is how the role of CIO is expanding outward and encompassing responsibilities outside of the traditional CIO role. In the coming weeks we'll have a few more articles looking ahead to 2018, so keep an eye out for those! In the meantime, Happy New Year from all at Compare the Cloud! ### What will keep CIOs awake at night in 2018? Continuing our series of articles looking ahead to 2018, Matt Fisher, SVP of Product Strategy at Snow Software, takes a look at what will keep CIOs awake at night. As IT leaders around the Western world gear up for a much-needed holiday break, sleepless nights have set in as they ponder what 2018 has in store for them. There has arguably never been such a period of tumultuous change in the selection, procurement and funding of technology consumption by the organisation. While the CIO’s world will change in 2018, the following advice will help them sleep more easily at night: Tectonic shift in spending power Analysts like Gartner are already seeing a huge swing in technology spending power in organisations around the world – moving away from IT-procured systems and toward business-funded technology consumption, accelerated by the growing adoption of mobile, SaaS, PaaS and IaaS technologies. As more software (and even hardware) vendors move to subscription models, it becomes easier for business units to select and adopt technologies with no involvement from the IT team or CIO. However, the CIO is still the one the board turns to when trying to understand its increasing spend on technology. With different business units making their own SaaS and IaaS purchases, it is increasingly difficult for the CIO to provide their board with an understanding of what is being purchased and consumed, or why. To remediate this, CIOs need to take visibility and analytics more seriously. By enhancing and evolving their approach to managing IT assets, CIOs can gain visibility not only of the physical and virtual assets inside the enterprise, but also of software and hardware located in the cloud. 
This will enable them not only to report on technology consumption but also to scrutinise it and fulfil their obligation to the organisation to be the guardian of technology spend. To Cloud or Not to Cloud? More and more software vendors are turning to cloud subscription models to build their future revenue streams (and fulfil promises to shareholders). As such, they are putting more pressure on end-user customers to move their on-premise systems to the cloud. They are doing this primarily through what look like heavy incentives for cloud subscriptions and through audit activity. That’s right – the software audit is alive and well. But many vendors are no longer using audits to drive compliance fines. Instead, their goal is to move legacy customers onto the latest cloud apps, versions and licensing plans. CIOs that don’t have full visibility of their current estate and asset costs (both hardware and software) are in a poor position to make a qualified decision on whether moving to the cloud is actually the right thing to do. Software and infrastructure service providers deliberately make their initial cloud costs look appealing, but without understanding current costs and ongoing obligations for on-premises systems, CIOs could be unwittingly overloading their technology budgets for apparent short-term gains. A brave new world Closely linked to the shift in spending power – along with the organisation’s need to innovate and bring new products and services to market – is the changing role of the CIO and the IT team. No longer is the IT team primarily charged with building and delivering technology to the rest of the organisation. Instead, most analysts and fortune tellers agree that the future role of the IT team is to help the business select, procure and consume the right technologies for their needs. Indeed, Gartner recently announced that, “At least 84% of CIOs surveyed [across all major industries] have responsibilities for areas outside of traditional IT. The most common are innovation and transformation.” [clickToTweet tweet="IT leaders that fail to adapt to the new world are likely to go the way of the dinosaur..." quote="IT leaders that fail to adapt to the new world are likely to go the way of the dinosaur..."] IT leaders that fail to adapt to the new world are likely to go the way of the dinosaur, which will no doubt cause many sleepless nights. However, the IT leaders that will thrive will be those that can identify how to work with business units, transforming their role from ‘head of IT’ to ‘go-to person when we’re looking at how technology can help us achieve new goals’. The CIO of 2018 and beyond isn’t a deliverer of IT systems so much as a broker of technology services, helping business units with common goals and technology needs to work together, being innovative and taking advantage of new technologies, but without unnecessarily adding cost or risk to the wider organisation. To summarise, the CIO has a lot on their plate and they’d be forgiven for being a bit distracted this holiday season. But the holiday break is also a great time to take stock, do a little research and form a strong strategy to move beyond the IT silo and integrate more fully with the rest of the business in 2018. 
And key to it all is getting that complete understanding of what’s happening across the technology estate. Be the most informed person with regard to organisation-wide technology spending and consumption and you will be in the best position to provide value to both the board and the individual business units. ### Will 2018 be the Year of Artificial Intelligence? Continuing our series of articles looking forward to 2018, Dorian Selz, CEO of Squirro, takes a look at advances in Artificial Intelligence (AI) and what we can expect in the coming year. Given the impressive results that some organisations have seen from their artificial intelligence deployments – and the constant hype around AI in the press and at IT industry events - one would be forgiven for thinking that AI is already a truly mainstream technology. But despite some notable exceptions, this is not the case, with many organisations still very much in the discussion phase when it comes to AI and machine learning. But that could be set to change in 2018. As businesses begin to realise that the potential for AI to transform certain elements of their organisation can outweigh any concerns or reservations they might have, investment in AI will increase. Will 2018 be the year that AI hits the business mainstream? [clickToTweet tweet="Will 2018 be the year that AI hits the business mainstream?" quote="Will 2018 be the year that AI hits the business mainstream?"] AI concerns Very few observers would doubt the potential of AI to transform any number of business processes. Yet at the same time, very few organisations are deploying AI in any meaningful sense. This is in part due to a number of concerns around what AI really means and how it affects an organisation. Firstly, there are issues around how to go about implementing AI. Some organisations lack the skills, knowledge and resources to do so, and are unsure where to deploy AI and who is best positioned to help them. There is also a certain wariness about the long-term impact of AI on the workforce – are some employees at risk of losing their jobs to technology? Then there is the issue of control. When AI automation is used for number crunching or other mechanical tasks, there is no problem. But AI can go beyond that and suggest recommended best actions based on the insight it extracts. Taking automation to this next stage – quasi-decision making – takes away an element of power and control, and some may not be comfortable with or ready for that. Finally, AI is, to a certain extent, a step into the unknown. If things are being done in a certain way, and that is working for an organisation, it is a risk to invest in a new technology that would approach things differently. Yes, those processes could be improved and made more efficient, but there is definitely an element of uncertainty to it, which could be a factor in signing off budget for AI-based projects. The potential of AI However, despite these concerns, AI is growing in adoption and will do so even more in 2018. A recent report from IDC and DataRobot revealed Japan leading the way with 74% projected growth from 2016 to 2021. The US can expect 49% compound annual growth, and Western Europe can expect around 44% growth. The same report also predicted that an estimated $57.6 billion will be spent across industries on AI over the next five years, with this spending set to increase employee productivity, deliver new business insights, and drive more workflow automation. 
It is benefits such as these that are really driving this wave of increased AI adoption. AI’s ability to automate certain tasks, augment any number of others and bring enormous insight based on big data is causing many organisations to see beyond any challenges that may come with it. What’s more, this data insight is also becoming more readily available to anyone across the business. AI – unlocking data insight In a world where more data exists than ever before, the ability to get true insight from that data is a key objective for many organisations, and AI is pivotal in allowing them to do so. But until recently, this insight was reserved for data scientists, who might share it with senior executives in a digested form, but it was mostly unavailable to the business users that could most benefit from it. Data is one of the most powerful tools in any enterprise, but its value is highly limited if it is only available to a select few. The true value of data only comes when each employee can benefit from the insight it delivers – the democratisation of data. AI is making data much more accessible to business users, and it is this that will really help drive AI adoption in 2018. It’s true that AI is still very much in its infancy. Despite all the hype, no AI technology has yet passed the Turing Test, a test of a machine's ability to exhibit intelligent behaviour equivalent to, or indistinguishable from, that of a human. But the advances of faster computing multiplied by advances in algorithmic computing have been breath-taking. AI-driven solutions have not yet mastered causation, but if business leaders can apply such solutions to specific elements of their own business – the efficiency-driven approach of deploying AI – then results will follow. Approached in this way, AI can have a massive impact for any business and will gradually make the transition to the business mainstream during 2018. ### How Technology Will Shape the Future of Retail It's Boxing Day - the sales start today! As part of our series of articles looking forward to 2018 and beyond, Gavin Wheeldon, CEO of Purple, takes a look at how technology will shape the future of retail. "There is no such thing as one kind of retail space, but as high streets battle to survive, embracing technology could be the only way to thrive." There’s no getting away from it. Technology rules the waves and this is no more evident than in the retail sector. Consequently, customers are leading the charge, expecting to get the best deals, the best service and the quickest routes to purchase. A recent State of Bricks and Mortar survey revealed that 55% of people surveyed said they looked at a mobile device whilst shopping. In China, it’s 92% of the population. 83% of Americans use them to compare prices, 78% search for store discounts and 67% of younger shoppers want redeemable offers straight to their phone. With these figures in mind, retailers are becoming acutely aware of how important it is that they address the consumer’s needs. And those needs are to be tech-based. A one-size-fits-all approach, however, does not work across the sector. Retailers come in different shapes and sizes so their solutions are based upon how they expect their customers to behave once in store. 
Convenience stores and spaces like supermarkets will expect their customers to move in and out of the shop more quickly. Time is now often seen as the number one asset, and in this respect, the easier and quicker the process is, the better the customer experience. Dwell time may not be as important, so the speed of payment and transition will be key to their success. Think about self-checkout, click and collect, as well as same day delivery. Amazon Go is at the forefront: with no cashiers or payments, the customer walks in, grabs what they want and walks out. It’s a system also being trialled by supermarket giant Sainsbury’s within its Euston store. Customers were offered the opportunity to have a ‘checkout free experience’, using an app to purchase their products without having to queue to finalise the process. Whether this is rolled out across all stores is yet to be decided, but if one supermarket is offering its customers the opportunity, others will be sure to follow. On the other side of the coin, luxury brands and well-known department stores will need their customers to visit them with the intention of lingering. These kinds of retailers need to have a brand that resonates in all they do, including WiFi. Every opportunity to increase dwell and frequency needs to be taken, and data capture and intelligent automated marketing through WiFi is a prime example of how this can be achieved. The longer someone spends in a shop, the more chance the retailer has of them making a purchase – this is why it’s crucial. [clickToTweet tweet="Across the board, traditional retailers are having to think about transitioning customers from in-store to online" quote="Across the board, traditional retailers are having to think about transitioning their customers from in-store to online"] Across the board, traditional retailers are having to think about transitioning their customers from in-store to online. This poses a huge challenge. Retailers have tried many ways of gathering information (emailing receipts, for example); however, this often leaves a bad customer experience and isn’t reliable. Purple allows customer data to be captured in a low-friction manner for customers, giving brilliant WiFi in return for some data. A well-known luxury department store uses Purple to redirect customers towards key objectives post authentication, including downloading the app and shopping online. Ted Baker recently announced 4.5% sales growth in-store and a 30.5% increase online. In a time when other retailers are noticing severe slumps, it’s interesting to note the fashion brand has been offering customers the chance of ordering products online, whilst in store. It allows employees to offer a more personalised shopping experience with customers, giving them access to Ted Baker’s whole product range in addition to what is available in store. This service is currently being trialled but is being rolled out across the country soon; the PayPal Here app gives shoppers quick and easy access to the online environment. The fashion brand is actually thriving in a tough market and has certainly embraced a more holistic approach towards shopping, which is definitely the way forward. Over time, physical spaces will become more visible. In the vast majority of cases, most retailers have very little to no information on who is visiting (name/contact data/age/gender/customer history), how long they dwell, how frequently, and as such, they’re missing out on the huge opportunity this data could present. 
Another luxury department store collected over 200,000 lines of detailed CRM data in the first three months of launching with Purple. This data also gives great opportunities to review how different stores are performing beyond basic point-of-sale data. Understanding how different shop fronts and store interiors attract and retain visitors can be vital in ensuring the estate is performing at its best. Finally, we can’t get away from millennials. They will drive the retail world forward. We are now a ‘mobile’ first, not ‘digital’ first culture. Millennials demand to buy products quickly and efficiently. They are also shaping how some retailers do business. There is a retailer who has a gaming area on its sales floor, hoping those who come in to play will also purchase. A luxury retailer in America offers free spa treatments to some of its high-end customers, which can be done within the shop. As a new generation begins to demonstrate its shopping potential, retailers will have to constantly evolve in order to survive and thrive. The future is bright with the right technology choices, for both stores and customers. It can be a win-win situation. ### Don't Be Held to Ransom at Christmas! Whether it’s in January when you read your credit-card statement, or if it’s right now that you check the state of your bank balance, you’d be forgiven for thinking you’ve been held to ransom once again this Christmas time. The ‘need’ to spend, spend, spend in our materialist, capitalist, consumerist society has surely hidden the true meaning of Christmas behind the curtain of having to buy ‘the must-have toy’ or the ‘trendiest gadget’. You’d also be forgiven for thinking that if you haven’t bought the right present for (or spent the right amount of money on) your spouse / partner / child / parent / BFF (delete as applicable) then you’ll surely experience the consequences in the New Year. But at least you’re not literally being held to ransom. Not just yet, anyway. In recent months Sophos has been warning us about ransomware distribution kits that are being sold on the dark web to anyone who can afford them. One for the acronym-lovers out there – these RaaS packages allow people with very little technical knowledge to carry out a ransomware attack with great ease. RaaS, of course, standing for ‘Ransomware as a Service’. Whilst RaaS packages have been available for some time, the threat is continuing to grow and the number of kits that are available will only increase with time. In fact, Sophos is warning of a surge in ransomware fuelled by RaaS in 2018. What makes RaaS particularly ingenious is the fact that creating malware – even selling it – is not currently illegal. According to Leon Adato, Head Geek at SolarWinds, what is illegal is the intent to sell for criminal use. However, intent may be hard – perhaps even impossible – to prove if suitable precautions are used by the cybercriminals. Dorka Plotay, Threat Researcher at Sophos, has even pinpointed one such company – Rainmaker Labs – which, the researcher says, runs its business very much like a legitimate software company, and whilst it sells some RaaS on marketplaces hidden on the Dark Web, it hosts production-quality “intro” videos on YouTube which explain how the RaaS kits work and how they can be customised with a range of options. 
With an anticipated surge in RaaS-originated attacks, we are likely to see renewed efforts to combat these threats from well-meaning security researchers – both those employed by professional security firms and those operating on their own. But Mr. Adato has urged caution here as well. As hard as it is to prove intent, the ability to disprove intent is similarly difficult. Mr. Adato cites the case of ‘ransomware cyber-hero’, Marcus Hutchins – aka MalwareTech – who defused the WannaCry attack. Although Hutchins attempted to remain anonymous, the high-profile nature of the attack generated too much interest in his identity and it was eventually uncovered. In August 2017, Hutchins was arrested over a separate attack relating to malware designed to infiltrate the banking industry. Many in the security community believe he has been falsely charged and he himself maintains his innocence. This is one of the latest in several years’ worth of cases covered by the 1986 Computer Fraud and Abuse Act (CFAA). In theory, it outlaws hackers, but in practice, it does not – and cannot – distinguish between cyber-villains and cyber-heroes. Because of the ambiguity in the CFAA, there is the potential for security researchers to fall foul of the law during the course of their work. A further complication is highlighted by the fact that there’s currently a dearth of properly qualified security professionals (a recent survey revealed that fewer than one in four security professionals have the necessary qualifications to keep their organisation secure) and this shortage could be massively exposed in 2018. The task of fighting any conflict starts with knowing your enemy. Dorka Plotay, Threat Researcher at Sophos, has identified five RaaS kits that we could be seeing more of in 2018 if the proper preventative steps aren’t taken: Philadelphia – one of the most sophisticated RaaS kits; for $389 it is possible to purchase a full, unlimited licence which allows for personalisation. Stampado – Rainmaker Labs’ first RaaS kit, first made available in summer 2016 for the extremely low price of £39. Frozr Locker – offered for 0.14 bitcoin; once infected, the victim’s files are encrypted, and its creators even offer online support to their ‘customers’ to help troubleshoot problems. Satan – allows you to set your own price and payment conditions and collects the ransom on your behalf; a decryption tool is sent to victims who pay up, and 70% of the ransom is paid to you via Bitcoin. RaaSberry – customers can choose from a number of available ‘packages’, from a one-month ‘Plastic’ command-and-control subscription to a ‘Bronze’ three-month subscription. Sophos has also advised that the best ways to combat ransomware in 2018 are as follows: back up regularly and keep a recent backup copy off-site; don’t enable macros in documents received as email attachments; be cautious about unsolicited attachments; and patch early, patch often. However, don’t just assume that malware is PC- or laptop/desktop-specific. Sophos also warns that ransomware is on every platform. When reviewing Google Play, Sophos found that the number of different threats had doubled since last year, and it warns that there could be an explosion of Android malware in 2018. 
And whilst Windows PCs will continue with their well-known, ongoing threats, expect to see further efforts to infect Mac computers in the coming 12 months. The recent problems experienced by Apple only serve to highlight that the Mac’s previous reputation for near-invincibility is an exaggeration and not something to be taken for granted. Being held to ransom – over Christmas presents or your data – is not an inevitability. Just as we can get pleasure at Christmas through the joy of giving, we can get pleasure from our data and applications by giving them the care and attention they deserve as well. Enjoy Christmas! Enjoy the New Year! And enjoy staying safe and secure in 2018 by being aware of the risks associated with RaaS and ransomware in general, and taking appropriate measures to prevent them from affecting you. ### Digital Transformation and A.I.: People are still key In a white paper released in October called Digital Transformation in Financial Services, I described Digital Transformation as the profound and accelerating transformation of the customer experience, business model and operating model brought about by the application of digital technologies. In a subsequent whitepaper written in November - Artificial Intelligence: Current Uses and Limitation – I explained how organisations are deploying A.I. throughout their business processes. Both papers discuss how emerging technologies are driving innovation across industries, but what they also have in common is that they highlight how the involvement of people is key to any successful implementation. Instead of merely replacing human labour with machines and automated processes, Digital Transformation and A.I. are most valuable when their deployment enables individuals, teams, and entire organisations to perform more efficiently, to enhance existing processes, to adapt quickly to changing business environments, and to generate new business channels. Digital Transformation may be driven by multiple emerging digital technologies, but instead of using them in isolation, it is about enabling customer focus and rapid innovation, and developing the appropriate human skill sets to support these goals. Similarly, A.I., in its current state, is best suited to provide insight into large data sets due to its ability to find patterns in them through highly complex calculations, or to automate predictable, repetitive tasks. Human abilities such as critical reasoning, emotional intelligence, creativity, inspiration, and wisdom cannot yet be well replicated by machines, but when combined with A.I. they form a collective intelligence that highlights the power of not simply replacing humans with machines, but of supplementing humans with machines. The Human Role in A.I. In the healthcare industry, great strides have been made through the use of A.I. in the early detection and/or diagnosis of illnesses as well as recommending successful courses of treatment. Rather than replace the doctor, however, A.I. is used to augment their knowledge through the analysis of historical data in order to find patterns that help them to reach the best possible decision in conjunction with their own experience. On occasion, A.I. will provide an output that the doctor disagrees with or that is plain wrong. In such cases, human intervention by the doctor can guard against mistakes. Aside from partnering with A.I. to form a higher level of collective intelligence, humans must also play a role in the use of A.I. 
by evaluating opportunities for its use, by supervising and managing it, and by implementing it. This, in turn, means that there is a human responsibility in ensuring its ethical and appropriate use. Firms should care about this too, in order to help overcome bias or negative reactions to A.I. and to help accelerate its use. Machine Learning algorithms used in A.I. are developed and fine-tuned by learning from experience - otherwise known as data - and being trained (either automatically or manually) to reach ever more statistically accurate decisions. The fact that the decisions they make are based on the data fed into them means that the outcome of those decisions can be influenced when selecting the data used. If a Machine Learning algorithm is trained on biased data, it will generate biased outcomes. Network architectures of algorithms can also be set up so that some data points are weighted in favour of others. This too can generate biased outcomes. This may be done deliberately based on qualitative assumptions (e.g. a convicted murderer poses a greater threat to society than someone charged with littering and should, therefore, be flagged more heavily on a police database), but as soon as human opinion starts to play a part in the decision-making process, even for presumably well-intentioned reasons, the impartiality of the outputs generated can be compromised. Without human interaction acting as a guardrail and left to its own devices, there is the potential for A.I. to make the wrong decisions or to make decisions that can’t be explained due to a lack of traceability in how it reached such a decision. This is one reason why people wary of the potential negative impact and safety of A.I. worry about its use. So-called black box solutions - A.I. tools developed by providers to train on either user or proprietary data without disclosing the underlying network architecture - could be open to untraceable and unexplained manipulation and abuse. Much of the current research around A.I. focuses on interpretability, so that machines can explain the decisions they make, but this is not yet available and, as such, is an important consideration for people to make when deciding how to use A.I. Data is the fuel on which A.I. runs and, in general, the more data it consumes the greater the insights it can provide. As users of A.I. seek to consume greater volumes of data, the implications around where the data comes from and who owns it grow. There are substantial benefits to be gained for firms that can use A.I. to fine-tune their offerings and customer interactions to individual tastes, including boosting sales and revenues through the enhancement of the customer experience. A plethora of data is now available through the everyday digital interactions that people have through their smartphones, computers, and the Internet of Things, including personal data collected through apps, social media, and private photos. Much of this may not have been intended for outside use, however, and firms need to be transparent about where they source the data they use and how they use it. The solution is to form policy that ensures appropriate and ethical use of data, together with potential revisions to government legislation around data privacy laws and access rights. Some of the larger providers, such as Google’s DeepMind, have already started the conversation around this and, although still in its infancy, it is another example of the new sorts of roles and responsibilities being created by A.I. 
that will need to be policed by humans. A.I. Impact on Jobs There is no hiding from the fact that some existing jobs will be automated as technology advances; however, this is not a new phenomenon and is not unique to A.I. A.I. can boost productivity; it doesn’t get tired, it lowers risk by removing human error, and it lowers costs as some human jobs are replaced by machines. Advocates of A.I., however, argue that by using it to automate certain low-level tasks it can free up human workers to focus on more interesting, value-add activities. Indeed, a study conducted by the McKinsey Global Institute demonstrates that whilst most industries have tasks such as data collection and processing that can be automated, they also have other tasks like managing and interfacing that cannot be. This means that in many cases, only specific tasks rather than entire jobs will be handed over to machines. In addition to the requirements for human ownership in implementing and policing it, A.I. could also create new jobs through both short-term bubbles and the longer-term development of new business channels. Much Machine Learning still requires manual and onerous tagging and training to make it work, and this has already resulted in the creation of employment and earning opportunities through services such as Mechanical Turk, Upwork, and Freelancer. Ten years ago, nobody knew they wanted an iPhone, yet today the product line makes up around 65% of Apple’s revenues, a firm that now employs nearly 120,000 people worldwide versus just over 20,000 when it was first released in 2007. This is evidence that in the long term, products and industries that we haven’t even thought of will produce new jobs. The Human Role in Digital Transformation Digital Transformation is a phrase that is often misunderstood. It has been driven by digital technologies such as social media, online applications, mobile devices, cloud computing and A.I. that generate, store and process data, but it is fundamentally about implementing new processes to create, enable, manage and deliver digital products and services with a continuous and iterative approach to integrating business, operations, and emerging technologies. That requires organisational change and strong top-down leadership to foster and drive a change in mindsets, and therefore involves significant human input. Firms that have prospered in their programmes of Digital Transformation have fostered a culture of organisation-wide collaboration between employees and empowered them to be creative whilst failing fast. Rather than focusing on merely cutting cost, successful Digital Transformation can see revenues grow through the creation of new products and increased productivity. New skills are required to implement modern processes and transformation programmes that redefine how firms operate. Firms that are defensive in their business strategy by focusing on cost leadership may actually be better served by directing resources towards building new business models and developing new product lines. Successful Digital Transformation requires a flexible workforce that can switch seamlessly and intuitively between tasks, and the combination of the right technologies and well-trained human labour is fundamental to that. 
Conclusion Digital Transformation and Artificial Intelligence are two driving forces behind modern business innovation, and to be effective they each require the right mix of human and technology input. Today’s A.I. is regarded as “narrow A.I.” and is built with a singular purpose, without the ability to fundamentally change the scope or boundaries of its design. Artificial General Intelligence, where machine intelligence has full human-like capabilities and can perform any task that a human can do, is still many years away. Until machines are capable of doing everything that humans do, we will remain the innovators, policymakers, decision-makers, managers and more. In 30 years, jobs and firms will look very different from today, and whilst upskilling, retraining and education of workers will be necessary in order to adapt to the change, it is no different to what happened during the first and second industrial revolutions. As John F. Kennedy stated in 1962, “If men have the talent to invent new machines that put men out of work, they have the talent to put those men back to work”. ### The Season Of Goodwill: An Opportunity For Cyber Crime Continuing our series of end of 2017 / start of 2018 articles, Richard Menear, CEO of Burning Tree, takes a look at the opportunities for cybercriminals to take advantage of the ever-increasing online spending of consumers - a trend that's only going to continue in 2018. This Christmas thousands will be spent online and in the Cloud by consumers buying gifts or taking advantage of sales before and after the big day. It’s the time of year for big-ticket purchases on new technology, white goods, and luxury items, as well as large volumes of smaller-value sales across many individual transactions. With e-commerce sites hitting peak traffic over Christmas and New Year, it is also a time when cybercriminals are active, finding it easier to go undetected because of so much seasonal activity. E-commerce businesses will already have increased their security measures to handle this spike in volume, but it also falls to the consumer to protect themselves and ensure they don’t get scammed and their data is not used fraudulently. If you’re thinking of hitting the online January sales, or have some last-minute present buying to do, here are our tips for protecting your valuable data and your bank account: Shop on trusted websites It can be tempting to grab a bargain on an unfamiliar site when a product is offered at a very competitive price. However, you may find that these sites are actually scams designed to entice customers with bargain prices, take your money and then disappear overnight. The adage that ‘if it’s too good to be true, it probably is’ is worth reminding yourself of if you think you’ve found a bargain. Moreover, trusted websites like Amazon and Argos generally have very competitive prices at this time of year, often with free shipping or click and collect, and therefore offer similar bargains anyway. Look for security badges Reputable e-commerce sites will also display security or ‘trust’ badges, such as Norton, McAfee and TRUSTe. If these are present it demonstrates that the company has taken measures to protect your data and that transactions are secure. Look also for the green padlock symbol at the front of the website’s URL. That indicates that the webpage is secure and that its certificate has been verified by a third-party authority. 
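For the technically curious, that padlock corresponds to a TLS handshake in which the browser checks the site’s certificate against trusted certificate authorities. Below is a minimal Python sketch of the same check; the hostname is only a placeholder, and this illustrates the general mechanism rather than recommending any particular tool.

```python
import socket
import ssl

def fetch_certificate(hostname: str, port: int = 443) -> dict:
    """Perform a TLS handshake and return the server's certificate details.

    The handshake only completes if the certificate is valid for the
    hostname and chains to a trusted certificate authority -- broadly the
    same verification a browser performs before displaying the padlock.
    """
    context = ssl.create_default_context()  # enables CA and hostname checks
    with socket.create_connection((hostname, port), timeout=5) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            return tls.getpeercert()

if __name__ == "__main__":
    cert = fetch_certificate("www.example.com")  # placeholder hostname
    print("Issued to:", cert.get("subject"))
    print("Expires:  ", cert.get("notAfter"))
```

If the certificate is self-signed, expired or issued for a different site, the handshake above fails outright, which is exactly the signal a shopper should treat as a warning sign.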
If in doubt use PayPal or another alternative payment method Many e-commerce sites use a hosted payment gateway to process card payments online. You will be redirected to a payment service provider like SagePay, WorldPay or Secure Trading, and this should give you confidence that your payment card data will be processed in a PCI DSS-compliant environment (removing it from the retailer’s IT environment). However, some companies have a self-hosted payment gateway, which will be branded as the business. For trusted websites like the big retail giants, you can be reasonably confident that they will have the security tools in place to handle your data securely, but for less well-known retailers you may want to question whether proceeding with a transaction is a good idea. PCI DSS compliance is an expensive process and therefore most small retailers will use a payment service provider. If you’re unsure about the security of a transaction but wish to go ahead, we recommend that you use an alternative payment method like PayPal. This means that you won’t need to share your payment card details directly with the retailer, keeping this data secure. Keep passwords and login details safe Many of us have accounts with popular online retailers that enable us to make quick and easy purchases without needing to enter our details time and time again. Personal details can also be easy to access on our devices when forms have pre-populated fields. Should your device or laptop fall into the wrong hands, it may be possible for a criminal to access your favourite e-commerce sites, make a purchase, change a delivery address, and pay using your PayPal account or even a payment card. Mobiles are most vulnerable as these are easier to lose or have stolen, and m-commerce websites and apps are designed to enable friction-free transactions. Make sure you use robust passwords on all accounts (don’t save them on your device), and, most importantly, protect your phone or tablet with a unique password. Watch out for email and social media scams An advert on Facebook or an email landing in your inbox could entice you to part with your hard-earned cash for no reward. By clicking on a link (especially from an email) you could be a victim of a phishing attack or be redirected to a bogus site. Best practice in this situation is to search for the site online and find the business directly. If the offer is genuine you will be able to find it by searching for the product from there. While reputable online retailers are incentivised to keep your data safe (through fines for non-compliance and potential reputational damage if they suffer a data breach), there are plenty of cybercriminals looking to exploit the season of goodwill. Be sensible and protect your data, and always question whether a link, redirect or business is genuine. ### Why 2018 Will Be the Year of the Hybrid Cloud 85% of IT and business decision-makers believe that the cloud – whether the public cloud we all know or more of a hybrid cloud – will be an essential or vital part of their digital transformation (DX) strategy. Cloud computing is predicted to increase from $67 billion in 2015 to $162 billion in 2020, and cloud readiness will shape companies’ agendas in 2018 and beyond. However, implementing strategies for DX is fraught with risk, particularly while simultaneously accelerating the delivery of new applications and services to customers. 
Hybrid and Multi-cloud momentum This year we saw an increasing number of businesses moving their applications to the cloud. This trend looks set to continue. To support cloud strategies, more and more organisations will seek to adopt a multi-cloud approach. They will utilise a mix of cloud services from different providers, such as AWS and Microsoft Azure. According to research from Microsoft and 451 Research, nearly a third of companies worked with four or more cloud vendors in 2016. Multi-cloud environments enable businesses to mitigate risks while leveraging different advantages from each provider. This number looks set to increase as organisations continue to wake up to the benefits of multi-cloud. In 2018, we expect the increased adoption of multiple clouds to be supported by surging momentum for hybrid cloud solutions, in which public clouds are used to supplement existing on-premises and private cloud infrastructure. Research from MarketsandMarkets revealed that the hybrid cloud market is predicted to grow to $91.74 billion by 2021. This is up from $33.28 billion in 2016, highlighting its vast growth potential. Visibility underpinning business success This increased hybrid cloud adoption will, in turn, drive further changes for businesses. As an increasing amount of data will need to travel back and forth over the wide area network, Software-Defined Wide Area Networking (SD-WAN) will become a critical consideration. By using an SD-WAN approach, organisations will be able to reduce CAPEX and OPEX. They'll also be able to intelligently manage traffic flows and gain real-time analytics. These can then be turned into meaningful and actionable insights. Also, as a growing number of businesses look to leverage cloud technologies, we expect to see more emphasis placed on visibility overall. Consider continuous end-to-end visibility across datacenters, WAN, branch offices, cloud and SaaS environments. A 360-degree view of business services allows IT to quickly identify current or potential problems and rapidly resolve them. Network outages cost large enterprises an average of $60 million a year. With businesses and lives becoming increasingly reliant on connectivity, companies know that they can’t afford to be “off.” To maintain a competitive edge both next year and in the future, they must ensure they have full visibility across their entire infrastructure, including both off-premise and on-premise cloud environments. However, next year we expect assuring networks to become even more challenging, as IoT adoption across different verticals leads to edge computing being used to complement cloud computing. As a result, the service delivery infrastructure will become more distributed, in turn pushing the limits of scalability and real-time analytics for security and assurance. Harnessing smart data In 2018, we are therefore likely to witness an increasing number of organisations adopt a smart data approach. This enables them to obtain a top-down, detailed picture of the hybrid IT environment. They can then truly understand application and service availability, reliability and responsiveness, and troubleshoot service issues both in real time and back in time. 
Applications and services leveraging innovative technologies like cloud and IoT are only as good as the visibility IT teams have of the entire delivery infrastructure, and so by focusing on investment and understanding service and application dependencies, businesses will be able to dramatically mitigate risk. Intersection of public and private Another key trend we anticipate will arise next year is traditional vendors of on-premise virtualisation technologies, such as VMware, looking to extend their infrastructure into the cloud. In 2017, we already saw movement in this direction, with VMware and AWS expanding their partnership and enabling VMware vSphere-based applications to run on AWS infrastructure. Public cloud providers, on the other hand, will continue trying to extend their services to an on-premises environment with solutions such as the Azure Stack appliance that lets enterprises run their own Azure instances on-premises, thus further blurring the lines between public and private cloud. Cloud shaping business agendas With over one-third of IT professionals naming ‘moving faster’ as their top goal for next year, more and more businesses will look to leverage the cloud to improve service agility and flexibility. In 2017, we saw disruptive vendors continue to showcase the huge potential the cloud can bring. As we look ahead to 2018, and a growing number of businesses harness virtualisation, software-defined technologies and public cloud to drive innovation, improve efficiencies and enhance the customer experience, the year ahead will quickly become the year of the hybrid cloud. ### The Cloud Trends Set to Shape 2018 The cloud offers flexibility, agility and scalability in an increasingly connected world, but the technology is moving at a breathtaking pace. With advances developing rapidly, more organisations are looking to achieve higher operational efficiency and move away from traditional networks. Gartner predicts that by 2020 a corporate no-cloud policy will become as rare as a no-internet policy is today. But what’s likely to be in store for 2018, and what trends should businesses keep up with? We’ll run down the current and most significant developments for the coming year. Artificial intelligence It’s no secret that face-to-face interactions improve relationships in both our business and personal lives. Next year and beyond, we’ll start seeing artificial intelligence play a more significant role in the process of building relationships with colleagues, customers and partners. With more and more dispersed teams around the globe, remote collaboration tools have a great bearing on the way businesses communicate with each other. From increased productivity to effective collaboration, the benefits of the software are often touted. But is there room for improvement? Well, gradually we’ll see facial recognition technology incorporated into remote productivity tools, providing a practical way of reading visual cues. With the recent launch of the iPhone X, there’s been a big buzz around facial recognition. This technology can be applied to more than authentication and expressive 3D emojis. Applying facial recognition to an online meeting environment will enable hosts to pivot their conversation if needed. 
It will also inform more effective post-meeting follow-up. For example, a salesperson will know the likelihood that a sales pitch has been successful. Or an advertising executive could be given the heads-up that the idea they’re presenting to the client is falling flat. This “meta meeting” focuses more on the feeling of the meeting (body language, tone, etc.) than on the actual conversation. It will gather insights and learnings that help the meeting host facilitate better human connections and drive positive results. Ultimately, we’ll see the combining of AI and emotional and conventional intelligence to facilitate improved workplace relationships, leading to more successful business outcomes. The defining of digital transformation Increasingly, many organisations are moving to embrace new technologies and implement strategic change. With a goal of improved productivity and collaboration, importance is being placed on seeking new ways of nurturing innovation, all the while maintaining a competitive advantage. With the emergence of technology such as Blockchain, a digitised record of economic transactions, we already see businesses create value through new innovative technology. Similarly, other emerging technologies such as AI and IoT also require vast amounts of data and computing power. The growing demand for this type of innovation will drive substantial investments in SaaS, cloud and big data technology. We’ll also see more traditional industries begin to embrace new operating models, setting their sights on digital transformation. Integrated productivity Modern organisations implement a variety of cloud-based applications and services. However, it’s becoming increasingly important for information to integrate between apps. As cloud-based environments often increase the number of systems being used, businesses frequently find themselves needing cross-platform integration – to pull together all of their relevant data. For this reason, the collaboration landscape is only set to grow next year, and we’re going to see more cross-platform collaboration. However, integration won’t be restricted to applications: with a growing number of businesses storing workloads in both private and public clouds, we’re set to see hybrid computing established as standard in 2018. Creating, managing and enhancing these connections will be of great importance in the years that follow. The rise of the cloud workforce More and more companies in the new economy are ditching their offices in favour of a workforce made up of cloud employees. Gone are the days when a premium is placed on physical office locations. Currently, we’re experiencing the rise of the cloud employee, as it’s no secret that cloud-based applications and technologies efficiently facilitate remote working and personal interactions, making it easier than ever for employees to collaborate with co-workers and receive updates globally – without the need for in-person interaction. Yes, we’re set to see an upward trend. A recent survey at the Global Leadership Summit in London found that 34% of business leaders expect more than half of their company's full-time employees to be working remotely by 2020. Businesses will become more globalised as employees work across various time zones. We’ll see business leaders recognise the benefits of distributed teams. Developing a cloud workforce requires full engagement of all key stakeholders and complete alignment of business strategies and objectives. 
But with increased savings, flexibility and the best use of real estate, many businesses will look towards a cloud workforce as a way to provide significant productivity gains. ### Addressing the AI Engineering Gap Realising the dream of AI is more process than magic. That’s the good news. But the engineering process itself is a journey. Given the talk of AI in the general media, you could forgive yourself for thinking that it will ‘just happen’. It sometimes sounds as if machine learning is on an unstoppable roll, poised to revolutionise industry and commerce through sheer momentum of technology. But of course, it’s not that simple for organisations that want to adopt AI methods. True, the technologies are maturing at an unprecedented rate. There are countless innovations already being developed to help diagnose deadly diseases, make million-dollar trading decisions, learn skills like driving, and automate many other tasks that have the power to transform whole industries. But of course, we know that AI is not magic. What’s less widely appreciated is that, for the most part, modern AI is not even science or math, in the sense of being based on fundamental truths. Modern AI is an engineering discipline: a painstaking process for creating useful things. AI engineering is in some ways similar to software engineering, but with unique properties due to its deep dependencies on large amounts of data and relatively opaque software. Doing AI engineering well requires processes and tools that are still under development. Mind the gap We’re living through watershed years for core AI methods. Development is happening at an unprecedented rate. But the tools and best practices for AI engineering are not maturing nearly as quickly. The effect is that this engineering gap will slow the penetration of AI across industries, while offering opportunities for innovative software solutions that focus on big data and AI engineering processes. Headline breakthroughs in AI have come fast and furious in recent years, fuelled by the rapid maturing of techniques using deep learning, the success of GPUs at accelerating these compute-hungry tasks, and the availability of open-source libraries like TensorFlow, Caffe, Theano and PyTorch. This has hastened innovation and experimentation and led to impressive new products and services from large tech vendors like Google, Facebook, Apple, Microsoft, Uber and Tesla. However, I predict that these emerging AI technologies will be very slow to penetrate other industries. A handful of massive consumer tech companies already have the infrastructure in place to make use of the mountains of data they have access to, but the fact is that most other organisations don’t – and won’t for a while yet. There are two core hurdles to widespread adoption of AI: engineering big data management, and engineering AI pipelines. The two effectively go together, and the rate of their uptake will determine the pace of change. Big data engineering competency, for example, is only now just beginning to take hold across industries – though this is very likely to accelerate in 2018 with the growing adoption of cloud-managed services. 
AI engineering competency is the next hurdle – and it’s likely to be many years yet before it becomes widespread across industries beyond the tech giants. Engineering big data Let’s not put too fine a point on it – success with modern AI depends on success with big data. And as a result, it should be no surprise that data-rich consumer services like Google, Facebook and Uber are leading the AI charge. Recent advances in deep learning only work well when given massive training sets to play with. But accumulating this training data and having the right to use it can be difficult or infeasible outside of marquee application areas in consumer tech. It’s not just about acquiring data either – getting it is one thing, but the challenge of managing that data efficiently is quite another. Successful AI organisations tend to have engineering expertise with core big data technologies like massively scalable storage, data preparation, analytic engines and visualisation tools. However, increasing numbers of traditional organisations have recently succeeded in moving up the big data learning curve, and the emerging acceptance of cloud-managed solutions should accelerate the spread of that competency across more industries in the near future. Engineering AI Many people don’t appreciate that AI today is an experimental science. The latest successes in AI are not the result of mathematical proofs; they’re the result of exhaustive experimentation and tuning of ad hoc machine learning models. That ongoing experimental process requires discipline and tools, or it breaks down quickly and can lead to the kind of false predictive behaviours, hidden feedback loops and data dependencies outlined in Google's paper on Machine Learning: The High Interest Credit Card of Technical Debt. Engineering an AI application involves pipeline development on samples (usually the job of data scientists), training models on massive data sets (often the role of data engineers), and serving models for live predictions in production (usually managed by DevOps engineers). Monitoring of the serving tier then feeds back to the earlier phases of the pipeline. Each of these stages involves continuous iteration, and there needs to be regular iteration across phases and teams as well. Tools and processes need to be in place to manage daily evolution and debugging of code, pipeline specs, training and testing data, live data feeds, live experiments, and metrics of served models. It’s a long and involved process, and one that requires a lot of intense data wrangling to deliver satisfactory results. Creating a brighter future for AI To date, systems for streamlining these processes have not been a focus in AI research or open source. But this issue is an active area of chatter in the tech community, and opportunity knocks for new projects and products in this space. They’ll need the skills, know-how and vision to create radical data productivity, but they’ll be essential to helping the development of AI and machine learning be as good as it can be. ### Access Denied: What DDoS Attacks Are All About and How They Vary The liver disease hepatitis comes in five flavours: A, B, C, D, and E. And while some of them are easily prevented by vaccines, all of them are downright awful. You know what else comes in many forms and has almost as many capital letters? The answer is DDoS, the cirrhosis on the liver of the Internet. Before we discuss the many forms these Distributed Denial of Service attacks take, let’s first spell out what DDoS means. 
Access denied Aside from watching cat videos, the principal reason for the Internet is for people to be able to share information, opinions and creative works, and even conduct commerce with each other across the globe. Each of those things is a category of a service which a computer can offer to others connected on the Internet. That obscure blog on post-postmodernist influences on Silicon Valley office furniture that you maintain? It’s made possible by a computer running a web server application like Apache or Nginx, and the server provides the service of sending web pages and insomnia-curing essays to the web visitors who request to access your blog. However, that service can be denied by many outside causes, like a critical internet backbone cable being cut, or some other computer on the Internet flooding your blog with so many requests for data that your server is overwhelmed by the demand and slows down for all visitors. That’s the “denial of service” part. If that computer is being intentionally commanded to flood your blog, then you have a denial of service (DoS) attack: a bad actor is using a torrent of data to knock you offline and prevent you from sharing your ideas with the world. Our scenario of one computer successfully overwhelming another is a bit far-fetched, as even basic blog hosting services can satisfy that demand. Yet when hundreds or even thousands of computers send attack traffic to a single target website, the ratio of attacking computers to victim systems (sometimes just one) is so lopsided that the target computer is rendered unusable. The use of a massive number of computers in a DoS attack to gang up on one target is called a “distributed” DoS, or DDoS for short. Like other cyber attacks, the underlying motivations may be financial (threatening a DDoS attack against a business in order to collect a ransom payment is a popular reason), or even political (the hacktivist group Anonymous is fond of DDoSing its enemies). The botnet boom DDoS attacks are getting both worse and easier to execute, thanks to the rise of botnets - large armies of hacked computers which follow the orders of their hacker commanders. Some hackers rent out their botnets, offering DDoS-as-a-service on the black market, which provides them with an additional revenue stream. Another contributing factor is the endless supply of poorly secured PCs, servers, and IoT devices, which provide more and more systems that are easily hacked and added to botnets. Once enlisted in a bad guy’s cyber army, a commandeered computer can take part in a few different kinds of DDoS attacks. Let’s explain two of them here: Domain Name System (DNS) Attacks: DNS is the system which allows your web browser to match the address you typed into your browser (the URL) with the string of numbers which actually designates a site’s address on the Internet. Unfortunately, this phonebook of the web can be misused as an amplifier for DDoS attacks. Attackers can make a DNS server send a large volume of data to a target site by sending a much smaller request to the DNS server. Multiply this amplification by the sheer number of publicly accessible DNS servers and the number of attacking computers in the botnet, and you have the recipe for staggeringly large amounts of attack traffic. 
Application (Layer 7) Attacks: In this type of assault, the attackers send packets of information carefully crafted to exploit a known bug in a specific piece of software or a known vulnerability in a network protocol. Rather than relying on the sheer volume of data to drown a website by using up all of that site’s bandwidth, application attacks usually attempt to exhaust the resources (like memory or processing power) of the server. No matter what flavour they come in, DDoS attacks are a menace for businesses and organizations of all sizes and types. Since their website is their storefront, e-commerce companies lose revenue for every second that they are down. Small and medium businesses, on the other hand, suffer when their company blog and other pages time out. All the SEO in the world can’t fill your funnel if no one can actually read the content. DDoS attacks weaponize data and inflict severe financial harm, regardless of the flavour of the attack or the type of business targeted. Companies of all sizes simply must review their cyber defence posture so that they can minimize downtime and losses. ### How ERP can kick-start your business Technology is evolving quicker than ever and customer expectations are growing with it. With the emergence of ubiquitous connectivity comes a demand for information at our fingertips, wherever we are, and from any device. Unsurprisingly, this has accelerated the adoption of cloud software, and today cloud solutions have become so entrenched in businesses that it’s a challenge to find an organisation that doesn’t have at least one cloud system in use. However, while there’s a general shift towards the cloud, when it comes to ERP many businesses are still reluctant to move away from the comfort of their legacy systems. There’s a misconception that moving to cloud ERP is an exhausting process that requires an overhaul of current systems and risky mass data migration that will end up being more costly for businesses. Indeed, Gartner predicted that nearly all cloud ERP projects would fail by 2018, and yet we’re less than a month away from 2018 and cloud ERP is still growing. The good news is that if implemented correctly with the right vendor it doesn’t have to be a laborious process. But if you’re wondering why you should switch in the first place, here are some key benefits for businesses and customers that come with cloud ERP: Reduce operating costs One of the first things business leaders think about when it comes to upgrading software is the cost to the business. It doesn’t matter how great something is, if it’s going to break the bank, the answer will be a swift no. Thankfully, cloud ERP is far friendlier to the bank account than on-premise, plus many vendors offer a variety of payment options, including per user, per month or a flat-rate fee, which is a great incentive for growing businesses. Updates and renewals can be pushed remotely, and your provider should be able to handle system maintenance and upkeep. If you opt for an on-premise solution, however, you need to purchase the hardware, licenses and often any renewals and updates. 
Then you also need to consider paying an IT team to maintain the system and repair it if it starts acting up. Before you know it, you’re spending a significant amount of money on a system you can only access in the office. With cloud-based ERP, your IT department is freed from time-consuming maintenance tasks, allowing more time to build strategies, adopt technological innovations and help your business run more effectively. Work from anywhere, anytime One of the best things about cloud ERP is that, unlike traditional on-premise ERP, it doesn’t need to be installed on a computer and only requires an internet connection. This gives users the freedom to access their system and data in real time from mobile devices and tablets no matter where they are. Having full visibility of the business whenever it’s needed allows employees to stay on top of today’s business challenges as and when they arise. They can drill down using reporting apps and analytics, monitor workflows and processes in your organisation and react quickly to changes in the supply chain. Don’t outgrow your system, grow with it If reduced costs and flexible working aren’t reason enough to consider moving to the cloud, it’s also worth noting that cloud ERP is a highly scalable solution. In other words, it can grow as your business grows. While most cloud ERP solutions will come with primary modules such as finance, manufacturing, logistics, human resources, and a built-in Customer Relationship Management (CRM) module, you’ll also have the ability to add or remove modules and applications as your business sees fit. Security first As business leaders become more security conscious, some are hesitant about keeping sensitive data in the cloud. So if you do decide to make the switch, it’s important to find a vendor who is transparent about the quality of data processing within cloud-based ERP. Ideally, they should be held to industry-recognised compliance standards. Because of this, many vendors have invested heavily in enterprise-level security, which may even exceed what an on-premise solution can provide. This means that small to medium-sized businesses can access enterprise-level security without paying the extra costs. Furthermore, because updates can be pushed automatically, companies are always using the latest versions that have been patched for bugs and other potential vulnerabilities. On a more practical note, storing data on-premise means you have to constantly monitor the conditions it’s stored in, as your data can be exposed to risks of natural disasters or theft. Is it worth the hype? The benefits of cloud ERP are vast, and adoption levels and success rates seem to be flying in the face of Gartner’s prediction. Indeed, in their market guide for cloud ERP product-centric solutions 2017, they assessed that cloud ERP solutions for product-centric companies are maturing and will be adopted by midsize organisations rather than larger ones. By 2025 they predict that at least 50 per cent of large enterprises will be running their core ERP in the cloud. So it looks like cloud ERP is here to stay, and given all the benefits that come with it, it’s not surprising. 
### IoT Networks Unlock Cloud Value for Organisations but Standards Hold the Key The Internet of Things (IoT) has really hit the mainstream over the past 12-18 months as consumers brandish their Fitbits, talk to voice-powered personal assistants and marvel at their smart home gadgets. But what about corporate initiatives relating to utilities, industrial IoT (IIoT) and smart cities? As with any rapidly evolving tech trend, it can be difficult to separate hype from reality. That’s why the Wi-SUN Alliance recently commissioned an in-depth global survey of hundreds of IT leaders. The results reveal that IoT projects are rapidly gaining speed: half of those investing in the technology already have a fully implemented strategy in place, although many have encountered problems. Interestingly, the research also uncovered a growing demand for mesh networks, which could help reduce the load on cloud servers and drive even greater efficiencies. Enabling IoT IoT projects are a major driver for cloud services. The all-important data collected from smart sensors is usually sent back to cloud servers to be processed and analysed, while in many use cases a cloud-connected mobile application front-end is also required. So, the good news from a cloud industry perspective is the strong level of growth in IoT projects we uncovered. In fact, firms in the Oil & Gas (75%), Technology (59%) and Energy and Utilities (57%) industries were even more likely to have a fully implemented strategy in place. Even when data is processed at the “edge” for real-time applications, aggregated data will still be sent to the cloud. After information security, we found that enabling IoT was the second most important IT priority for the next 12 months, with organisations keen to improve citizen safety and quality of life, drive business efficiencies and enhance the reliability of systems and services. The even better news is that 99% of respondents claimed to have already enjoyed concrete benefits as a result of IoT implementations, helping them become more efficient and productive, provide a better customer experience and lower costs. Canadian utility firm BC Hydro has been using a smart grid of close to two million meters connected by a robust wireless mesh network to reduce energy wastage and improve efficiency. This advanced metering infrastructure (AMI) enables dynamic power management and predictive maintenance to help the firm lower costs for its customers and differentiate competitively. At US utility Florida Power and Light, a smart grid wireless mesh network enabled the firm to restore power to 99% of users affected by Hurricane Matthew last year within two days. It claims the system also actively prevented outages at over 100,000 homes. The benefit of robust IoT networks like this in a smart city context is that they can be expanded to add new capabilities. In Florida, 500,000 street lights were connected to offer greater power efficiency, reliability and public safety benefits. It’s a model that has also been employed around the world in cities such as Bristol, Copenhagen, Halifax, Glasgow, London and Paris. 
Improving security However, an overwhelming number of IT leaders (90%) told us they struggled to implement their IoT strategy. Security (59%) continues to be the main challenge to adoption, followed by funding and a lack of leadership commitment (both 32%). It’s not hard to see why security concerns are still rife in the industry, with nation states increasingly looking to compromise critical infrastructure and major new vulnerabilities on the rise. That’s why the Wi-SUN Alliance offers multi-layered protection including encryption and authentication of traffic, OTA update support, and sophisticated certificate-based security to prevent any maliciously modified devices joining the network. When it comes to the other challenges facing organisations, it’s clear that IT bosses must get better at articulating to the board the tremendous business benefits of IoT at scale. Driving cloud benefits It was particularly surprising for us to see that network topology and coverage were the most popular things organisations look for when appraising IoT tech. Most (53%) respondents told us they favour a hybrid approach combining star- and mesh-based networks. Using the latter — which are based around peer-to-peer connections rather than a hub-and-spoke arrangement — enables more processing to be done at the edge, potentially reducing the load on cloud servers. This could have significant cost and efficiency benefits in certain use cases, such as outage management where the connection to the cloud may be lost after a lightning strike, hurricane or similar. Activities including number plate recognition and traffic management can also be done at the edge to reduce the data transmission and cloud processing load for providers. Elsewhere, IT leaders we spoke to pointed to performance (53%) and support for industry standards (52%) as key factors in their evaluation of IoT technologies. At Wi-SUN Alliance we’re passionate about the need for open standards to ensure seamless, system-wide connectivity and interoperability. Especially when it comes to the communications infrastructure of IoT, standards can lower costs, boost choice, and enable innovative new functionality off the back of more open data sharing and interoperability. That’s the IoT we should all be looking to build — one focused on reliability, performance, security and openness. It can only be good news for organisations, their customers and the cloud. ### Navigating the trend to the multi-cloud A growing number of businesses are moving their data and core applications to the cloud. Whether it’s a ‘lift and shift’ or re-platforming, we see that businesses are adopting the cloud at a much faster rate than ever before. However, organisations are quickly realising that a single cloud solution cannot serve their future needs, leading to the growth of multi-cloud as a key strategy for many enterprises in 2018. In fact, Gartner predicts that a multi-cloud approach will become the status quo for 70 percent of enterprises by 2019, up from less than 10 percent currently. However, navigating the multi-cloud trend should come with a healthy awareness of both the merits and challenges of moving to this kind of environment. While greater flexibility is a plus, a multi-cloud approach can increase overhead costs as it splits an organisation’s workload across multiple providers. 
At the same time, increased reliability and uptime are advantageous attributes, but managing a number of cloud environments can result in IT staff being spread too thin. With two schools of thought, organisations should evaluate the key pros and cons to ensure their multi-cloud deployment is a success: Pro #1: Determine which provider offers the best performance As enterprises grow increasingly wary about being “locked in” to a single legacy solution, evaluating and implementing a multi-cloud environment can help organisations assess which provider offers the best performance and support for their company’s particular situation. By way of example, GE re-aligned its cloud hosting strategy to leverage both Microsoft Azure and Amazon Web Services, with the intention to understand the best performing hosting environment and to see which contract provided the lowest cost to pass to their customers. Ultimately, using this means that organisations can trial and choose between best-in-class technologies from various providers to ensure they acquire the best solution for their business, at the best price. Con #1: Reduced buying power and increased overhead costs However, a multi-cloud strategy could diminish the buying power of an organisation. If a company is splitting what they buy across multiple providers, they will not be able to benefit from economies of scale; essentially, a company may find itself buying less at a higher price. Added to this, overhead costs may increase as IT teams work with multiple cloud providers, instead of just focusing on one cloud platform. Pro #2: Open to options    A multi-cloud approach gives organisations increased flexibility as the IT department has the freedom to design the most suitable cloud to meet their requirements. For instance, one cloud vendor may offer greater security controls that are not supported by other options on the market, while another vendor may have superior high availability architecture. In essence, when using multiple clouds, organisations have the flexibility to choose the best deployment option for a wide variety of enterprise IT workloads, taking the best bits from each of these clouds to help them meet business outcomes. Con #2: Management and operational challenges The complexity of different cloud environments can quickly become a challenge, especially as getting to grips with a single cloud platform takes a substantial amount of dedicated time and resource; this manpower could potentially be funnelled into other aspects of the business, such as creating new IT features and customer support. What’s more, if a multi-cloud environment is not monitored and controlled properly, operational issues will start to accumulate, in turn creating obstacles when it comes to maintaining access control and security updates. As such, an effective multi-cloud environment requires plenty of planning and resource allocation as internal developer teams need to learn multiple platforms and have additional governance processes in place, depending on the different environments they have to support. Pro #3: Business continuity and reduced downtime Using just one cloud platform for your business could be risky, as the majority of data is stored in one place.  Fortunately, with a multi-cloud strategy, in the event that one cloud platform fails to respond or is unable to deliver optimal performance, applications can be run on an alternative cloud option. 
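To illustrate the continuity point above, here is a minimal sketch of how an application might fail over between providers: probe the primary cloud's endpoint and fall back to a second provider if it is unreachable. The health-check URLs are hypothetical placeholders, and this is a simplification rather than a description of any vendor's tooling.

```python
# Illustrative multi-cloud failover sketch: try the primary provider's endpoint,
# fall back to the secondary if it is unreachable. URLs are hypothetical.
import requests

ENDPOINTS = [
    "https://app.primary-cloud.example.com/health",    # provider A (primary)
    "https://app.secondary-cloud.example.com/health",  # provider B (fallback)
]

def first_healthy_endpoint(timeout: float = 2.0) -> str | None:
    """Return the first endpoint that answers a health check, or None if all fail."""
    for url in ENDPOINTS:
        try:
            if requests.get(url, timeout=timeout).status_code == 200:
                return url
        except requests.RequestException:
            continue  # provider unreachable; try the next cloud
    return None

active = first_healthy_endpoint()
print(f"Routing traffic via: {active or 'no provider available'}")
```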
Considering reliability and uptime are essential to every business, reducing the likelihood of downtime in this way can boost the overall productivity and performance of an organisation. Con #3: Security concerns Security is a big part of the cloud conversation and this naturally bleeds into the multi-cloud debate, not least because the cloud is growing as a destination for sensitive data. The increasing complexity brought about by multiple clouds has the potential to create a greater risk of security issues, especially if the multi-cloud environment is not managed properly and if security updates are not handled correctly. On the other hand, multiple clouds can actually alleviate the risk of widespread data loss and downtime due to hardware failures. With a clever combination of multiple clouds, organisations can secure data in a private cloud and operate other areas of the business in a public cloud environment. With evidence pointing to a rise in multi-cloud adoption in 2018, organisations will have to manoeuvre through the nuances of this approach by measuring how much of each cloud platform is adopted, the internal usage, and the cost of workload demands and implementation. Getting to the right solution for your company begins by carefully considering both the pros and cons of this trend. ### Research Optimization with Hybrid Storage & Backup Solutions The artificial intelligence (AI) industry is continuously revealing new innovations. These innovations range from robotics to data analytics and more. AI has begun to contribute to every industry. Regardless of the industry, the use of AI generates big data. And the biggest challenge of big data is effective storage and backup. A disaster recovery plan for this data is also important because the data is crucial to enterprise operations; losing it can disrupt operations or interrupt innovation. AI-based technology can generate huge data lakes. These data lakes require effective data analytics, which in turn relies on AI and machine learning. Analyzing a large data lake also requires a lot of computational capacity. Addressing AI Storage & Backup Requirements Storage: Locally & in the Cloud There are two major options for enterprise data storage: local infrastructure and enterprise cloud storage. The best storage is decided by use case requirements. AI generates a lot of data, and AI-based data analytics require high IOPS and reduced latency. On-premises infrastructure can support performance-intensive workloads at greater speeds than cloud-based storage solutions, whereas cloud-based storage solutions emphasize cost-effectiveness. For research-based workloads, enterprises can usually spare the budget needed to acquire results. That’s why, in comparison, enterprise NAS storage is the better choice over cloud storage for research environments.
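As a rough illustration of that trade-off, the toy decision helper below routes a workload to on-premises NAS or cloud object storage based on its performance and cost profile. The thresholds and field names are illustrative assumptions only, not vendor guidance.

```python
# Toy decision helper reflecting the reasoning above: performance-hungry research
# workloads favour on-premises NAS; cost-sensitive, latency-tolerant ones favour
# cloud object storage. Thresholds are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class WorkloadProfile:
    required_iops: int       # sustained IOPS the analytics pipeline needs
    max_latency_ms: float    # worst acceptable storage latency
    cost_sensitive: bool     # is budget the overriding constraint?

def choose_storage(profile: WorkloadProfile) -> str:
    if profile.cost_sensitive and profile.required_iops < 10_000:
        return "cloud object storage"
    if profile.required_iops >= 10_000 or profile.max_latency_ms <= 5:
        return "on-premises enterprise NAS"
    return "cloud object storage"

# A research analytics workload with heavy random I/O lands on local NAS.
print(choose_storage(WorkloadProfile(required_iops=50_000, max_latency_ms=2, cost_sensitive=False)))
```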
Backup: Locally & in the Cloud Similar to storage solutions, enterprises can acquire backup appliances to store their data locally, or they can opt to store it in the cloud. Enterprise data can be classified into three types based on access frequency: hot data, cold data and archival data. Each data type has its own backup requirement. Hot data is data that is frequently accessed, while cold data is infrequently accessed. Archival data is the kind of data that’s rarely accessed and mostly kept for either compliance reasons or future reference. Similar to local storage solutions, local backup infrastructure also reduces latency and supports high IOPS. This makes it the primary choice for hot data backup; however, as enterprises can tolerate latency for cold data, it is better to use cloud services for cold data backup. The same applies to archival data. As the loss of this data doesn’t disrupt operations, enterprises can endure latency in archival data recovery as well. Hybrid Solution for Storage & Backup Evidently, research data storage and backup requirements simply cannot be addressed by a single kind of solution (either local infrastructure or cloud). It has to be a combination of both in order to properly facilitate both requirements: storage and backup. This means that the enterprise will acquire two sets of hybrid solutions. One will be a combination of a NAS appliance with cloud connect services that store cold data in the cloud and keep the hot data locally. This will be the hybrid storage solution. The other set will be a combination of a backup appliance with cloud backup. This backup solution will keep hot data backed up locally while backing up the cold and archival data in the cloud. The enterprise can also opt to create cloud replicas of the hot data stored locally. These sets of solutions will effectively address the data requirements of research environments and will optimize the process by supporting AI-based innovation. ### Why Cloud-based AI is the key to Organisations staying ahead of Cybercriminals Today’s cybercriminals are technically proficient, creative, well-funded, and well-equipped. This year’s high-profile attacks on banks, retailers and critical national infrastructure have proven just how sophisticated and persistent they can be. Indeed, we only need to consider the recent WannaCry attack on the NHS to understand how dangerous modern-day attacks are, for our data and our lives. A typical organisation is data-driven and powered by technology, which means it often holds huge volumes of data that – unless adequately protected – could become easily accessible to hackers. With data breaches covering the front pages on a regular basis and the EU General Data Protection Regulation (GDPR) coming into force next year, businesses are under growing pressure to detect and mitigate threats as soon as they arise. Why prevention is not the silver bullet Up until now, there’s no doubt that the speed of innovation among today’s cybercriminals has increasingly outpaced the ability of their targets to evolve their security defences.
The fact is, cybercriminals only need to get it right once to gain access to what they want, whilst businesses need to get it right every time – a rather daunting task for even the most successful security operations team to take on. Tipping the balance even further, security teams are increasingly being forced to figure out how to do more with less. The amount of information a company processes is not getting smaller – in fact, it’s getting significantly bigger – yet security teams are expected to deliver the same success rate with the same amount of resources. They need to be able to define what ‘normal’ network activity looks like so they can identify and neutralise a potential threat straightaway, and when this is done manually, it’s not an easy job. Security teams are often faced with an overwhelming avalanche of false positives and can easily be distracted by alarm fatigue, which means it could be possible for a compromise to slip through the net. Threat detection has simply become too big a job for security teams to handle on their own – which is why cloud-based Artificial Intelligence (AI) is playing a bigger role. AI and the Cloud Security teams cannot afford to spend time on extensive manual threat-hunting exercises or deploying and managing yet another security product. Cloud-based security enables easy, reliable and rapid insight into their network, which not only saves companies crucial time and money but provides them with access to a class of analytics that are not otherwise technically practical or affordable to deploy on-premise. Combined with AI and automation, this technology gives organisations the rapid detection and response capabilities they need to fight today’s cyber attackers. It enables them to cut through the noise and detect serious threats earlier in their lifecycle, thus eliminating time-consuming manual threat detection and response exercises, and allowing analysts to focus on higher-value activities that require direct human touch. Cloud-based AI ups the ante for businesses as it allows them to analyse different behavioural models to characterise how users are interacting with the IT environment and whitelist a user’s “normal” online behaviour in order to pursue user-based threats, which have previously been difficult to identify manually. Cloud AI from beginning to end Cloud-based AI needs to be applied throughout the cyberattack lifecycle in order to automate and enhance entire systems of work and to enable faster and more efficient breach detection. Unfortunately, hackers are constantly looking for new ways to breach existing defence systems, and are proficient at staying a step ahead of new technology and exploiting new and existing vulnerabilities. While breaches cannot be completely prevented, Cloud AI plays a key role in stopping these motivated hackers in their tracks. The technology is proactive and predictive, automatically learning and evolving to alert on even the smallest changes in events and behaviour models that suggest a hacker might be breaching a system.
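To make the baselining idea concrete, here is a minimal sketch of flagging activity that deviates sharply from a user's learned "normal". The event counts and threshold are purely illustrative, and real products use far richer behavioural models than a single z-score.

```python
# Minimal behavioural-baselining sketch: learn what "normal" looks like for a user,
# then flag observations that deviate sharply from it. Data is illustrative only.
from statistics import mean, stdev

# Hypothetical history: failed logins per hour for one user over recent days.
baseline = [0, 1, 0, 0, 2, 1, 0, 1, 0, 0, 1, 2]

def is_anomalous(observed: int, history: list[int], threshold: float = 3.0) -> bool:
    """Flag the observation if it sits more than `threshold` standard deviations
    above the user's historical mean."""
    mu, sigma = mean(history), stdev(history)
    if sigma == 0:
        return observed > mu
    return (observed - mu) / sigma > threshold

print(is_anomalous(1, baseline))    # a normal hour -> False
print(is_anomalous(40, baseline))   # a sudden spike in failed logins -> True
```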
When AI is deployed in the cloud, businesses benefit from collective intelligence and a broader perspective to get even smarter, and even faster. Imagine incorporating real-world insight into specific threats in real-time. This will advance the ability of AI-powered analytics to detect even the stealthiest or previously unknown threats more quickly and with greater accuracy than ever before. Thus far, the war against cybercriminals has not been a fair fight; however, organisations are increasingly shifting their attitudes about threat detection. No longer are they relying on prevention-only tactics; instead they are starting to look for ways that they can evolve at the same pace as the hackers. AI has the power to help these companies alter the terms of engagement, particularly when coupled with the advantages that cloud security can bring. Automation and intelligence are tilting the scales in favour of the good guys, allowing organisations to automatically detect and neutralise threats with little or no room for error. ### The Benefits and Pitfalls of Cloud Backup The benefits of investing in data backup solutions are clear and can be summed up in one term: business continuity. As a business, your main goal should be to have a resilient network infrastructure that makes data loss unlikely to occur in the first place. However, disaster can always strike, and utilising data backup means that you can ensure a quick recovery and avoid significant downtime and loss of revenue. Over the past ten years, organisations have had to choose whether they want to have their data backed up physically, in the cloud, or through a hybrid solution. While cloud backup has continued to gain momentum over time, physical hard drives are still widely used. It is crucial for any organisation that wants to properly back up its data to continuously and preemptively re-evaluate its backup methods and consider updating or changing them before they become outdated. Having your data backed up on a physical hard drive can be a great security hazard, something a club exclusive to Oxford and Cambridge alumni had to learn the hard way in November. A hard drive was stolen from a locked “comms” room at its London compound. The club has not disclosed what other security measures it had taken to protect the information, which included members’ phone numbers, email addresses, bank account details and photographs. This example makes it very easy to make an argument for cloud backup services being more secure than their physical counterparts. Whereas physical hard drives can be physically removed from your premises, suffer inaccessibility issues due to a hardware failure, or even be destroyed by a fire or flood, cloud backup solutions take advantage of the pre-existing security features of data centres. In terms of resilience, cloud backup also trumps onsite solutions, as scalability issues are easily solved and spare capacity rarely becomes an issue. The beauty of the cloud is that not only does it make it harder to lose data, but it also makes it incredibly accessible. Today employees can interact with key data from anywhere at any time and from almost any device using any modern web browser.
While this is great for productivity and transparency, it also entails major risks. If any browser can potentially access your cloud applications, that includes any buggy, spyware-riddled browsers on your employees’ personal computers, tablets and smartphones, and it is more likely for employees to have their passwords stolen and data to become corrupted or compromised. Another pitfall of cloud backup is that while it can save your business from unreliable on-site hard drives, it can’t save you from yourself. No software program can tell whether you will eventually need the emails you have decided to purge, or whether the purge was even intentional to begin with. As a user of most cloud services there is little you can do in the face of user error, as cloud service providers such as Google or Microsoft simply do not have the time to put a support rep at your disposal every time you accidentally delete a Google Doc or Outlook message. In fact, they are explicit about wanting you to be responsible for correcting your own data loss errors. So if even a cloud-based backup solution isn’t going to prevent data loss, and the cloud service providers aren’t going to come to your rescue, it becomes crucial for users of the cloud to make sure they are prepared for any potential data loss before it occurs. There are two main steps to securing your cloud-stored data: getting on top of your Recovery Time Objective (RTO) and adopting cloud-to-cloud backup. Whenever your cloud applications lose data, work is slowed or stopped until that data is recovered, which will have a potentially severe impact on your productivity figures. RTO refers to the acceptable and expected amount of time it takes to notice and correct data loss and return systems to normal standards of operation. While that may be dictated by whatever software and hardware you are using, it is up to the organisation to plan accordingly and make sure your operations match the expectations set by your RTO. One of the best and most up-to-date ways to ensure you can comply with your RTO is through cloud-to-cloud backup, which differs from traditional cloud backup in that it does not involve any data stored on your local hard drive. Instead, data is replicated in one cloud and duplicated into yet another one. All the reasons for moving your data from hardware to the cloud hold for moving from cloud backup to cloud-to-cloud backup. It offers you a “get out of user error free” card to be played whenever someone within your organisation corrupts, misplaces or destroys vital business data. Paradoxically, even reverse migration of data, from the cloud back to physical hard drives, is made quicker and cheaper when you have used cloud-to-cloud backup, as cloud data can easily be replicated into numerous common file types usable by conventionally installed software. By shaping operations around a realistic RTO, and backing up data from both hardware and the cloud to cloud-to-cloud backup solutions, security professionals can make it much harder for their organisation to lose valuable data, while ensuring data remains accessible.
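As a rough sketch of what cloud-to-cloud replication can look like, the snippet below copies every object from a primary bucket into a second, independent bucket using S3-compatible storage via boto3. The bucket names are hypothetical, and in a real deployment the destination would sit with a different provider or account, with versioning enabled, so user error in one place cannot reach the copy.

```python
# Illustrative cloud-to-cloud backup sketch: replicate objects from a primary bucket
# into a second, independent bucket so corruption or deletion in one copy can be
# recovered from the other. Bucket names are hypothetical placeholders.
import boto3

SOURCE_BUCKET = "company-primary-data"   # hypothetical primary cloud bucket
BACKUP_BUCKET = "company-c2c-backup"     # hypothetical bucket held elsewhere

s3 = boto3.client("s3")

def replicate_bucket(source: str, destination: str) -> int:
    """Copy every object from `source` into `destination`; returns objects copied."""
    copied = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=source):
        for obj in page.get("Contents", []):
            s3.copy({"Bucket": source, "Key": obj["Key"]}, destination, obj["Key"])
            copied += 1
    return copied

print(f"Replicated {replicate_bucket(SOURCE_BUCKET, BACKUP_BUCKET)} objects")
```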
Still, not even the most vigilant security measures available are immune to human error, and staying idle with your disaster recovery solutions will continue to be risky as threats become more and more sophisticated. Whatever demands or wishes an organisation has when it comes to data backup solutions, it needs to plan for unexpected disasters to occur, and have measures in place to mitigate human errors. The key is to make sure your systems are adaptable and can get the job done while making data neither inaccessible nor insecure. By shaping security operations around a realistic RTO and backing up data from both hardware and the cloud through cloud-to-cloud solutions, businesses can ensure continuity in the face of increasing downtime threats. ### Robotic Process Automation - Rhiannon Dakin of Flynet Tells us More Compare the Cloud was recently joined by Rhiannon Dakin, Marketing Manager at Flynet. In this demonstration, Rhiannon discusses Robotic Process Automation - undoubtedly the biggest source of competitive advantage available to insurers this year - possibly this decade. Robotic Process Automation is essentially the enablement of a system to complete work that previously required human interaction. Find out more about Flynet here: https://www.flynetviewer.com/ At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. ### How the cloud is creating new services, and changing the way businesses buy comms services One of the many areas where cloud innovation is having a big impact right now is business telephony services, perhaps more correctly called comms services, since they encompass unified communications (UC) as well as fixed mobile convergence (FMC). These are concepts that have been talked about for many years - and implemented to a certain extent - though they have traditionally been hampered by dependency on physical equipment. Cloud has removed those barriers. As well as giving end-user businesses greater choice, gone are the days of needing to invest in physical private branch exchanges (PBXs), which are typically expensive to buy and maintain. Cloud telephony is also a catalyst for new service providers to enter the market, or for existing ISPs or mobile firms to reposition themselves and expand their portfolios. For instance, a reseller can now become a mobile virtual network operator (MVNO) without needing to invest in and install their own networks; instead, they can choose to simply ‘rent’ the technology (in other words, Platform as a Service). This is also proving the catalyst for more flexible pricing models, with options available from the smallest of SMEs to large enterprises, and changing the way that businesses buy comms services, such as a ‘monthly right to use’. They can also expect far more innovative features, because the cloud enables these to be deployed more rapidly, while eliminating the traditional definitions of the telecom market. Now, the end-user can be at the centre of the service delivery, no longer dictated to by telephony devices, location, network or apps. More on that shortly. These changing market dynamics are great news for businesses purchasing telephony services: never have they had so much choice.
According to telecom industry experts The Cavell Group, the Western European market for cloud communications is set to grow by 17,337,538 users in the next 5 years (representing a CAGR of 24.64 percent). Cavell also says that the UK is the largest Western European market. The big switch off We are also at an interesting juncture in the European telecoms market, with many PTOs such as BT planning to turn off their existing ISDN and PSTN services for good within the next few years. This is a prompt for customers and suppliers to look at what is currently available in the cloud, simply because they will have no alternative but to switch to IP based services. At the same time, many legacy PBXs and early IP-Centrex solutions are nearing the end of their life, generating interest in migration to the latest developments in cloud-based telecom services. However, it is perhaps the sheer array of innovative features and services that cloud-based telecoms enable that is the biggest game-changer: true enterprise mobility; tighter integration with internal IT systems and applications; far better collaborative tools; and a whole host of other services that benefit both an organisation and its individual employees. For instance, users can have just one handset across both fixed and mobile access, either with one phone number or several phone numbers. All routing of services is handled by the provider, rather than on the customer’s premises. With a cloud PBX, it becomes possible to access unified communications services via any internet-enabled device, within the office or working remotely, creating a single, seamless service. Over a third of the workforce was mobile by 2016 according to Strategy Analytics, and there is no reason why this trend should not continue, so this greater degree of freedom can result in tangible benefits, such as greater efficiency and time saved. For instance, more calls can reach the right person first time (thus improving efficiency and reducing the number of call-backs needed), or cost-per-minute savings can be achieved by automatic rerouting of mobile calls and apps via Wi-Fi when the user is in the office, rather than over the cellular network. However, this does not mean that users cannot get away from incoming calls, chat streams, collaboration tools and other apps: in fact, they can achieve far better control over their accessibility, such as integrated voicemail management, with messages delivered to a centralised mailbox (not to an individual mobile or PBX/Centrex mailbox), with consistent forwarding rules and even email notifications with an audio file attached. Another area of user control that is still quite new but expected to become very important is ‘user-driven presence control’, an OTT service. This removes the traditional approach of depending on apps like email and calendar schedules; instead, users choose their preferences from the mobile device based on their status at that moment, such as whether they want to take a call, forward it to a colleague, or simply let it go to voice-mail. Incoming calls are then accepted, forwarded to a team member or department, or sent to voice-mail depending on the user’s selected state, which the rest of the team can also see.
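A minimal sketch of how that presence-driven routing might look in code is below. The states, numbers and routing rules are hypothetical examples rather than a description of any particular provider's service.

```python
# Illustrative sketch of user-driven presence control: an incoming call is routed
# according to the state the user has selected on their device.
from enum import Enum

class Presence(Enum):
    AVAILABLE = "available"
    IN_A_MEETING = "in_a_meeting"
    DO_NOT_DISTURB = "do_not_disturb"

def route_call(caller: str, presence: Presence, delegate: str | None = None) -> str:
    if presence is Presence.AVAILABLE:
        return f"Ring user's devices for call from {caller}"
    if presence is Presence.IN_A_MEETING and delegate:
        return f"Forward {caller} to colleague {delegate}"
    return f"Send {caller} to centralised voicemail and notify by email"

print(route_call("+44 20 7946 0000", Presence.IN_A_MEETING, delegate="sales team"))
print(route_call("+44 20 7946 0001", Presence.DO_NOT_DISTURB))
```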
So, does that mean we all end up with just one handset for everything? That is certainly an option, with a mobile device becoming a gateway for users to manage all services, even with multiple phone numbers attached to that device. However, it is also likely that many companies will have a policy whereby users have two or more devices for the time being. Choosing cloud telecoms services As an emerging market, many organisations are on a learning curve when choosing cloud-based comms services. However, there are enough existing deployments in the market – particularly in the Nordics and mainland countries such as France – from which purchasers of these services can learn. Here are some examples of ‘best practice’ experience, based on working with some of these pioneering end-user organisations: Make sure that the service is a step forward – rather than just replacing old systems. Use this as an opportunity to improve the way that employees communicate externally and internally, with better control, flexibility and access to more advanced services, such as fixed mobile convergence (FMC) and unified communications (UC). Look for solid interoperability and integration support – such as open APIs and connectors to services and apps within the organisation’s existing technology stack. Pricing that’s transparent and predictable – in other words, no hidden costs such as maintenance; for example, a monthly right to use with maintenance costs included. Ability to scale and evolve – if the end user organisation grows, can the cloud comms package meet extra demand quickly? What about upgrades and further developments: is there a roadmap? Reliability and ease-of-use – check that it is easy to manage, with ‘self-care’ options for users, rather than requiring training or complex set-up and configurations. Also, what back-up is available for business continuity? The right fit – some services suit SMEs better than large enterprises and vice versa, plus vertical markets may have extra needs (such as the education sector, which would benefit from reduced comms costs during holiday periods). Do overseas offices need to be supported and, if so, is that covered (for instance, ensuring that remote team members share user presence control and collaboration tools)? Clearly, there are a lot of factors to consider, but on the positive side, businesses buying these services truly are in the driving seat, with a growing number of service providers competing for their business, with increasingly attractive pricing models, greater innovation and services that revolve around users, not around hardware or software. ### The List of Lesser-Advertised Benefits of Cloud Computing Cloud computing has been growing by leaps and bounds over the last couple of years. The global public cloud market is growing at a 22% rate and is expected to reach $146 billion in 2017, up from $87 billion in 2015. It is no longer news that Amazon, Google, Microsoft, and Apple are competing to be the top cloud provider for consumers. Big and small businesses are now adopting cloud computing. According to Forbes, 60-70% of all software, services and technology spending will be cloud-based by 2020.
Also, according to a recent IDG Enterprise survey of 925 IT decision makers, 70% of organizations have at least one application in the cloud and 16% have plans to do so within 12 months. If you are not yet using cloud technology for your business, now is the time to start. One of the most popular benefits of cloud computing is that it increases efficiency and is easy to implement. You can quickly set it up in a matter of minutes, compared to traditional servers/hardware that can take weeks or months to set up. Apart from these benefits, there are several lesser-advertised benefits of cloud computing, such as: Fewer operational issues Cloud computing makes it easy to use standardized services. This helps to minimize time spent on solving operational issues, thus promoting business continuity. When you have fewer operational issues to tackle in your business, it allows you and your staff to spend more time on productive activities that provide value to your business. One major operational issue is network upgrades. You may need to update your network in the office when it becomes outdated. This is a very expensive task. Apart from the cost, your employees will also need to get used to the new technology. But with cloud computing, you don’t need to do any updates because all your data is hosted in the cloud. It is the duty of your cloud hosting provider to update the backend infrastructure, which means you always use the latest infrastructure at the front end. This removes the operational burden of continually updating your network as it ages. Secondly, managing information in more than one location can lead to saving different versions of the same data, which can lead to operational issues or errors. Cloud computing, on the other hand, makes it easy for employees to access the same information or data, updated and revised in real time, which reduces human error and operational issues. This makes it easy for employees to collaborate, share and access work wherever they are. At least 58% of businesses collaborate across the organization and ecosystem through the cloud. Easy for mergers and acquisitions Merging with or absorbing another business can be a complex process. However, with cloud computing, it is easier to merge two or more companies and to acquire businesses. Cloud computing makes it easy for merging companies to integrate quickly. They can easily integrate their email domains, address lists, calendar sharing, directory services, etc. This helps to unify communication processes, making it easy to integrate different devices, media, applications, systems, etc. The integration process determines how efficiently the business will run after the merger or acquisition. Cloud computing makes it easy for information and news to be shared with users in real time. This implies that stakeholders and staff associated with the business will be able to access data or information in real time. Companies that are merging or acquiring show signs of growth. This may make it necessary to upgrade business software. For instance, a company that has about 200 workers may increase even more in size. Upgrading business software will enable the organization to run efficiently.
Using cloud computing makes it easy for merging or company that is acquiring to upgrade their subscription to match their current number of employees, making it easy for employees and stakeholders to collaborate together easily. Traditionally, when companies are merging or buying another company, there is need to consider the hardware and the apps being used.  For instance, if you are going to increase the capacity of your company through the acquisition, you may have to worry about the capacity of your server. With cloud technology, this is not required.  You can extend the applications in the cloud by creating a plan to complete the integration. Cloud services make it easy for mergers and acquisitions to quickly deploy their services, software and empower their employees with the right tools for the business. This will help to keep the company running while in the deployment phase.  Thus, cloud computing is of utmost importance to a globally distributed merger team from different companies and IT landscapes. It fosters tech-savvy Business Executives In this modern age, you need to leverage the latest technology resources to compete favourably in the marketplace. Cloud computing makes it easy for business executives, marketers, CEOs and IT managers to develop their own in-house cloud applications. This is cost saving. Instead of you hiring external professionals, you can easily develop cloud applications that will boost your business growth. The most encouraging thing is, you don’t need to know how to code or have previous experience in the IT world. You can simply learn it online. There are several courses you can take to know more about the cloud computing and also learn to develop cloud applications such as the AWS Certification & Training. The course will enable you to use cloud computing to take your business to another level. When you learn how to develop cloud applications for your business, it will save you a lot of costs and free up your time to focus on more important aspects of your business. You can also use the knowledge you acquired from the course to help other organization to develop their cloud application too. Cost efficiency A lot of business owners have not adopted cloud computing because of its cost. In fact, a report said that 20 percent of organizations are concerned about the initial cost of implementing a cloud-based server. The truth is that traditional desktop software products are expensive, especially if you want access to multiple users. However, if you use cloud computing, it is much cheaper. It is cost efficient to use, maintain and upgrade. With traditional computing, you need to pay for server maintenance, upgrades, power and cooling costs, etc. Cloud-based computing saves you those costs. According to Microsoft survey, 49 percent of SMBs use cloud computing to lower costs. Instead of spending your money on maintaining hardware you may not use, subscribing to software and services for a low cost can help your business to stretch its budgets further. With cloud computing, you can even decide to upgrade to meet your business needs or downgrade during the slow seasons, helping you to further save money. Also, you don't have to pay for features you will not use.  Some cloud services provide pay as you go services. This means you only have to pay for what the cloud service offers. This includes the data storage space. You don't have to pay for too much space you will not use. Bitglass surveyed CIOs and its leaders on the cost saving of using cloud services. 
Half of all the CTOs and its leaders said that they saved cost in 2015 as a result of using cloud-based applications. Also, a cloud survey report carried out by KPMG revealed that one of the top ways businesses are using the cloud is to drive cost efficiencies. If you are a small business owner who has to pay for IT software and manpower, cloud services save you the cost of hiring professionals and managing your hardware. Easy to access information Cloud computing gives you access to your files anywhere on the globe as long as there is an internet access.  You can access information on your smartphones, tablets, and laptops. According to Newzoo, there are about 2.6 billion smartphones in use globally. This makes it super easy to use smartphones to access files and collaborate in the field, while travelling, in restaurants etc. This implies that your employees can access their files and work while commuting, at home and on business trips.  As they work, it allows them to collaborate with team members. Employees can edit files, makes revisions using the Microsoft office at the same time with other users. This improves flexibility and productivity, especially when workers are not able to make it to the office, they can easily work from any location other than their offices. The ability to access information anywhere at any time makes it easy for small businesses to hire remote workers without the need to open up branches at new business locations. Remote teams can easily share, collaborate on files, see updates in real time and use web conference to meet up.   Unlimited storage In the early days of computing, the floppy disk was used to save and access information. Thereafter, the compact disk was introduced (CD), followed by DVD.  Then USB flash drive was made. The problem with these physical devices is limited storage space. The truth is, companies generate enormous electronic information in form of documents, images, emails, videos, charts, etc. Storing your data in physical devices can be limiting. The physical storage spaces have limited spaces, making it easy to run out of space. With cloud computing, you can save unlimited data and access it in remote databases.  Cloud computing provides unlimited storage space. Cloud services can store large amounts of data. When you need more space, you can easily upgrade to bigger spaces in the cloud or even opt for unlimited space. The good thing here is that you don't have to buy spaces you will not need. You can buy the appropriate amount of storage for your company’s current and future needs. According to IDC study, most companies experience between 40 to 60 percent growth in data volume every year.  This will help you to keep your information in one place and not on several USB, SD cards or servers. With cloud computing, you can have up to 200 storage accounts with about 500 TB data for each one. According to a survey carried out by searchstorage on the amount of data businesses are storing in the cloud, 31% of the group surveyed revealed that they had more than 25TB in the cloud, with 8% saying they have more than 100 TB in the cloud storage. With cloud storage, you can also move files from your local storage to your cloud storage. Instead of emailing files to customers or clients, you can send a web link through your email. Easy to set up backup and recovery Businesses do encounter loss of data at one point or the other.  This can be as a result of fire disaster, a bug in the code, downtime, power outages etc.  
All these things can affect service delivery to your customers and impede your business growth. Do you know that 93% of businesses that experience a data loss or disaster without a data recovery plan are out of business within one year? And 43% of the businesses that experience a disaster never reopen? With cloud computing, you can easily avoid data loss and disaster by using the backup and recovery features of cloud computing for your business data. Cloud computing provides a way to quickly recover your data in different emergency circumstances. According to a survey report, while 20 percent of cloud users claim disaster recovery in four hours or less, only 9% of non-cloud users could claim the same. This is why 43 percent of IT executives said they plan to invest in or improve cloud-based disaster recovery solutions. Using cloud computing is cheaper and it saves time. Traditionally, to back up your data, you will need a data centre facility, staff, services, etc. These things cost money. However, with cloud computing, your provider will only charge you when you use the services. You will only pay when backing up your applications in the cloud. If you need to recover data when the need arises, you will not pay additional fees. Setting up backup and recovery is easy with cloud computing. You don’t need to set up your own duplicate data centre or use servers, drives, disks, etc. You can easily set it up in the cloud by using solutions such as Amazon S3 to back up your data. This makes it very easy for you to restore your data if anything goes wrong. As the increase in data generation brings the need for more space, you can easily scale up your demand as needed for cloud backup and recovery. Scaling is simpler, quicker and cheaper too, unlike traditional storage and backup, which slows as usage increases. This will enable your customers to have a great user experience even during downtime. Conclusion The above list of lesser-advertised benefits of cloud computing shows that the technology is here to stay. It is not going to disappear anytime soon. This is why 41% of businesses are planning to increase their investment in cloud technologies, with 51% of big and midsize companies planning to increase spend compared to only 35% of smaller firms. ### Does voice transcription technology deserve its bad reputation? Voice recognition technology has well and truly ‘talked’ its way into the consumer psyche. One of the biggest selling products over the festive season last year was Amazon Echo, and sales of the device that boasts an intelligent personal assistant called Alexa show no signs of slowing down this Christmas. Last year Google revealed that 20 percent of mobile search queries were initiated by voice, and it expects that figure to rise to 50 percent by the end of the decade. It bears testament to the great strides that have been made in the technology that underpins these devices, enabling them to understand and respond to your commands with ever-increasing accuracy and efficiency.
But what impact has this new smart technology had in the workplace? Many would argue that the biggest opportunity to bring new business benefits lies in the development of technology that can transcribe spoken language into text. Why? As the pace of communication continues to increase, aided by the proliferation of cloud-based tools and social media platforms, it has never been more important to have a clear record of who said what – not only to increase accountability and transparency but to increase productivity, cutting out laborious time spent transcribing calls and meetings. However, delivering on that comes with challenges. Transcription tools have had a hard time in the past and there has been a long-held consensus that flawless voice recognition is an impossible challenge for computers alone to master. But things are changing. Why is speech recognition seen as an impossible technology to master? Up until now, high-quality, accurate transcription on a computer has only been possible when those having the conversation were speaking slowly and clearly, with a well-positioned microphone in a quiet room. And even then, these transcriptions still missed the mark. It’s not hard to understand why: these programs were designed to register any noise coming through a microphone, not just the voices of those involved, but also any background noises - the inevitable echoes, the scuffle of an object being moved about or someone sneezing in the next room, etc. All of these elements have an impact on the accuracy of the final transcription. If you then add into the mix different regional and international accents, slang words and unknown acronyms, it’s easy to understand why automated voice transcription is such a tricky thing to pull off without human intervention. The data points created during a single conference call also present a big challenge. For a conference call involving a number of people, the amount of unstructured data created is huge, and almost impossible to pick apart to be able to attribute who said what to the right people. Typically, you still need a human brain to decipher that data and make sense of it all. As anyone in IT knows, collecting data of any kind comes with privacy concerns, and voice recognition brings an added layer to that problem. This kind of technology, as mentioned, is designed to respond to all noise – including potentially picking up sensitive conversations that might be taking place in the background. Clear parameters will need to be part of the process if voice recognition software is to build and maintain a long-term reputation as a valuable business tool. Why is it important to keep pushing for perfection? So, if accurate voice transcription is such a hard task, why bother developing this technology that many have said is ‘impossible’ to master? The reason lies in its huge potential. Companies like Google, Amazon and Microsoft have understood this and have been investing heavily in core transcription technology that is needed to power their ever-increasing portfolio of home products.
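Accuracy in this field is usually quoted as word error rate (WER): the substitutions, insertions and deletions needed to turn a system's output into the reference transcript, divided by the length of the reference. A minimal sketch of the calculation, using made-up example sentences:

```python
# Word error rate (WER) via a standard word-level edit distance.
def word_error_rate(reference: str, hypothesis: str) -> float:
    ref, hyp = reference.split(), hypothesis.split()
    # Dynamic-programming edit distance over words.
    d = [[0] * (len(hyp) + 1) for _ in range(len(ref) + 1)]
    for i in range(len(ref) + 1):
        d[i][0] = i
    for j in range(len(hyp) + 1):
        d[0][j] = j
    for i in range(1, len(ref) + 1):
        for j in range(1, len(hyp) + 1):
            cost = 0 if ref[i - 1] == hyp[j - 1] else 1
            d[i][j] = min(d[i - 1][j] + 1,         # deletion
                          d[i][j - 1] + 1,         # insertion
                          d[i - 1][j - 1] + cost)  # substitution
    return d[len(ref)][len(hyp)] / len(ref)

print(word_error_rate("please send the minutes to the board",
                      "please send minutes to the bored"))  # 2 errors / 7 words ≈ 0.29
```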
But thanks to investment in this space, they have also been able to dramatically improve the accuracy of written transcriptions, with Microsoft claiming that its program can now match the accuracy of human transcribers, having reduced its word error rate from 5.9 percent last year to 5.1 percent in August this year - the average error rate expected of a professional human transcriber. With the ability to automatically transcribe virtual meetings and conference calls, workers could also save themselves a huge amount of time. Not only could candidate interviews be recorded in full for people to refer back to, and customer testimonials captured ready to be posted on your company website, but meeting minutes could be automatically transcribed and agreements between parties traced back easily for increased accountability. It could also help managers keep track of increasingly distributed teams, and doctors could even transcribe interviews with patients in medical trials – the list of applications is endless. The adaptability of this technology means that every company can tailor it to their needs. Mujo, a medical rehabilitation start-up, for example, has recently been using Yack.net, our collaboration platform with call transcription capabilities, to record board meetings to keep a written trace of what was said and agreed. This has allowed all parties to keep track of tasks without someone having to transcribe those agreements and progress updates. Which businesses are driving this technology forward? Although this technology isn’t perfect, it is clear that we should keep pushing forward. And the good news is that there are a few really good applications out there already: In the smartphone world, a great example is Cassette, an iOS app that allows the user to record conversations using their iPhone as a recording device. The recorded speech is transcribed and can be replayed. Designed specifically for the workplace, our Yack.net is a team collaboration tool that allows users to record calls with up to six participants. The calls are automatically transcribed and can be searched and replayed in sections, allowing users to pinpoint who said what. Trint is a transcription tool designed for journalists. Pre-recorded interviews are uploaded and transcribed. The interface is optimised to enable the journalist to correct the automatically generated transcription, where required. So, does voice transcription technology deserve its bad reputation? In a way, yes. It’s a work in progress though - and that progress has come on leaps and bounds in the last two or three years. That development has been and continues to be pushed along rapidly thanks to investment from some of the biggest players in technology, who all believe that voice recognition has a bright future. From our own more humble perspective, we have worked tirelessly on enhancing our transcription engine and will continue to push its limits so that we can play our part in unlocking the huge untapped benefits of this technology. ### 5 Myths about Moving your Email to the Cloud If you are a business leader holding back from migrating your operation to the cloud because of things you have heard, then it’s time to set the record straight. There is plenty of misinformation floating around about the cloud that needs to be challenged.
These myths are preventing some businesses from transforming their business model, making them more economical, efficient and secure through cloud computing. Myth one: The cloud is wide open to hacking Cloud infrastructure provides a safe, walled virtual fortress and when you sign up to a cloud service, your data belongs to you – it’s that simple. The weakest points of any cloud infrastructure are the users themselves. Authentication of users is something that your business can control with simple, effective work policies. Password allocation from management ensures robust passwords are always used. Multi-stage authentication can also be used if security is perceived as an issue. You can educate your staff about phishing scams and how to spot them. Mobile device users also need to adhere to policies of use, which are freely available online. This is simple gatekeeping stuff – it’s not rocket science. Just as with any kind of work you want to keep safe, if you are mindful of protecting it, it will remain protected. Myth two: You have to migrate all of your IT to the cloud There seems to be a common perception that it’s all or nothing when engaging with cloud infrastructure as a service. The truth is that it’s largely up to you how you use the cloud for your business. You can choose to use it for a variety of operations and virtual software – perhaps for emails, or for sharing and working on documents but not for your main databases. The point is, you do not necessarily have to shift your whole organisation to the cloud if that’s not comfortable for you. A hybrid solution might serve your company better than a total migration. Myth three: Your company is too large for migration There is plenty of support out there for businesses wanting help with migration to the cloud – at CloudMigrator365 we have helped some of the largest and most recognisable companies in the world to migrate. Breaking migration into manageable stages is a good way to approach shifting a large organisation over to the cloud. Careful planning and project management take much of the stress away. A well thought out, planned migration will make cloud infrastructure much easier to implement, with minimum disruption to staff. Myth four: Your team will need to relearn everything Office 365 has applications like Outlook, Word, Excel, PowerPoint and SharePoint that will be familiar to your team, making uptake much easier. Whilst the software consists of the tools you will be familiar with, it will also benefit your work with additional productivity and collaboration tools that are available within Office 365. The software you know is the same or similar, it’s just cloud-based and a lot more versatile. You could say that the cloud isn’t about reinventing the wheel – it’s just making that wheel go further. Myth five: Moving over to the cloud is expensive Nothing could be further from the truth. Migration costs are surprisingly affordable and the ongoing licensing costs typically save businesses money in a couple of distinct ways. Firstly, by driving efficiencies and savings compared to on-premise infrastructure, meaning that the vast majority of businesses will quickly see a return on the investment. Secondly, by ensuring that you don’t lose data and can recover quickly with your data remaining intact when systems go wrong or if there is a workplace disaster.
### How cloud is evolving so systems can act autonomously Just as cloud adoption is beginning to accelerate and organisations are focusing on how to manage multiple cloud providers rather than on whether to adopt cloud in the first place, a few ‘experts’ have begun to predict cloud’s demise. This may be a way of grabbing the headlines, but in my view, they are asking the wrong question. Here’s why, and how we can realistically expect cloud to evolve in the next few years. The current growth of cloud is simply the next logical step in the regular waves of centralisation and decentralisation that characterise the IT sector. One moment we all think that the best place for intelligence in the network is at the edge, and then technology changes and the most logical place for that intelligence becomes the centre instead. For example, mainframes never died when client-server came along; instead, the terminal became a PC, and the mainframe became the database server. The aim of cloud was to consolidate data centres and enable organisations to benefit from economies of scale for hosting their business systems. For many, this has been a great success, whether they are small businesses who no longer need to invest in their own infrastructure or large organisations that can free up real estate and benefit from cloud’s flexibility and scalability for many of their routine applications – albeit moving legacy applications to cloud remains more challenging. Cloud also offers significant advantages for archiving, back-up and disaster recovery. However, we are also seeing a rapid growth in intelligent devices, such as robots in manufacturing, medical diagnostic systems and autonomous vehicles, up to and including self-driving cars – what you might term ‘intelligent client mark 2’. For these devices, which are in effect data centres in their own right, there is a need to process information in real time, and so for them the latency of cloud is becoming a major issue. Take an autonomous vehicle, which needs information on the changing obstacles around it, or a robot scanning fruit on a conveyor belt in a factory and picking off substandard items. These need to make instant decisions, not wait for information to transit six router hops and three service providers to reach the cloud datacentre and then do the same on the way back. If you think about it for a moment, it’s basic maths! Having intelligence at the edge is vital for applications that seek to do things in real time, such as robotics, and as ‘smart’ devices, embedded systems and the use of artificial intelligence to train and manage these devices develop, this need will increase. Organisations will need to use cloud for what it’s good at – scale, training and developing algorithms, and large-scale data stores – and then bring the intelligence to make decisions to the edge device to enable it to act autonomously. An example would be a facial recognition system, where you would use cloud to store petabytes of data to enable you to train the system with many thousands of photos, and then load the algorithm developed into the camera control system so that the initial facial recognition is at the edge. The system can then revert to the data stored in the cloud if further confirmation is required.
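A minimal sketch of that cloud-trained, edge-deployed split is below. The functions `load_local_model` and `cloud_face_service` are hypothetical stand-ins for the camera's on-device model and the cloud-side photo archive rather than any specific product: recognition happens instantly at the edge, and only low-confidence results are referred back to the cloud.

```python
# Sketch of edge inference with cloud fallback: a model trained against the large
# photo store in the cloud runs on the camera controller; only uncertain results
# are escalated back to the cloud service. All components are hypothetical stubs.
CONFIDENCE_THRESHOLD = 0.90

def load_local_model():
    """Stand-in for loading the cloud-trained model shipped to the edge device."""
    def predict(frame) -> tuple[str, float]:
        return ("person_1234", 0.97)  # (identity, confidence) - dummy values
    return predict

def cloud_face_service(frame) -> str:
    """Stand-in for a call back to the cloud, where the full photo archive lives."""
    return "person_1234"

local_predict = load_local_model()

def recognise(frame) -> str:
    identity, confidence = local_predict(frame)   # instant, on-device decision
    if confidence >= CONFIDENCE_THRESHOLD:
        return identity                           # no round trip, no added latency
    return cloud_face_service(frame)              # escalate only the uncertain cases

print(recognise(frame=None))
```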
Different sectors will, of course, require different approaches, and each will need to consider where best to place the intelligence in their network to meet their specific needs. IT can provide the tools and capability but, as always, it is up to each business to decide how best to use them for the greatest benefit to their clients and themselves. In the insurance industry, for example, where actuaries have traditionally analysed massive amounts of data to enable underwriters to make policy decisions, the economies of scale provided by cloud processing offer significant advantages. In this scenario, there is no major benefit in moving intelligence to the edge as such decisions are not sensitive to latency. [clickToTweet tweet="To paraphrase Mark Twain, any tales of the imminent demise of cloud have been greatly exaggerated! via @fordwaybloke" quote="To paraphrase Mark Twain, any tales of the imminent demise of cloud have been greatly exaggerated!"] I hope that by now I've convinced you that any tales of the imminent demise of cloud have, to paraphrase Mark Twain, been greatly exaggerated. Each organisation will need to consider its own use case and choose the most appropriate solution, depending on how much real-time processing is required. All will benefit from the scale and flexibility of centralised cloud processing and storage, from construction companies putting together consortia to deliver specific projects such as Crossrail and HS2, who require capacity for a finite amount of time, to public sector organisations who can hand routine applications to a cloud provider in order to focus on their core activities. Even those organisations working at the cutting edge of robotics and AI will benefit from cloud's scale and capacity. However, their smart devices will need to rely on inbuilt intelligence, supported by cloud services, if they are to succeed.

### #ExecEdTV Episode 6: Leadership and Results – It's all about mindset

In this final episode of the series of #ExecEdTV, host Kylie Hazeldine will be discussing Leadership and Results, and whether it is all about mindset, with a guest panel of entrepreneurs: Tim Walker, Managing Director of Aura Technologies, and Darren Hilton, Executive Coach at Tapping the Source.

### Comms Business Live: Distributor Talks

In this episode host David Dungay and guests will be exclusively talking to distributors about their vision of a digital future and how the partner channel will need to evolve to fit into that world. Joining David for this episode are Alison Heath, Business Development Manager – Cisco Collaboration at Westcon-Comstor, Darren Garland, Managing Director at ProVu, Steve Harris, EVP Unified Comms Practice, Nuvias & MD, SIPHON and Will Morey, Director and Co-Founder, Pragma Distribution.

Host: David Dungay, Editor of Comms Business @CommsGuru4U

Panellists:
- Alison Heath, Business Development Manager – Cisco Collaboration at Westcon-Comstor @WestconComstor
- Darren Garland, Managing Director at ProVu @Provu_UK
- Steve Harris, EVP Unified Comms Practice, Nuvias & MD, SIPHON @NuviasUC
- Will Morey, Director and Co-Founder, Pragma Distribution @Wearepragma

### Gartner Says Worldwide Server Revenue Grew 16 Per cent in the Third Quarter of 2017; Shipments Grew 5.1 Per cent

STAMFORD, Conn. 11th December 2017 — In the third quarter of 2017, worldwide server revenue increased 16 per cent year over year, while shipments grew 5.1 per cent from the third quarter of 2016, according to Gartner, Inc.
"The third quarter of 2017 produced continued growth on a global level with varying regional results," said Jeffrey Hewitt, research vice president at Gartner. "A build-out of infrastructure to support cloud and hybrid-cloud implementations was the main driver for growth in the server market for the period." "x86 servers increased 5.3 per cent in shipments for the year and 16.7 per cent in revenue in the third quarter of 2017. RISC/Itanium Unix servers declined globally, down 23.5 per cent in shipments and 18.3 per cent in vendor revenue compared with the same quarter last year. The 'other' CPU category, which is primarily mainframes, showed a decline/increase of 54.5 per cent," Mr Hewitt said. Hewlett Packard Enterprise (HPE) continued to lead in the worldwide server market based on revenue. Despite a decline of 3.2 per cent, the company posted $3.1 billion in revenue for a total share of 21 per cent for the third quarter of 2017 (see Table 1). Dell EMC maintained the No. 2 position with 37.9 per cent growth and 20.8 per cent market share. Inspur Electronics experienced the highest growth in the quarter with 116.6 per cent, driven by ongoing sales into China-based cloud providers, as well as global expansion efforts. Table 1 Worldwide: Server Vendor Revenue Estimates, 3Q17 (U.S. Dollars) Company 3Q17 Revenue 3Q17 Market Share (%) 3Q16 Revenue 3Q16 Market Share (%) 3Q17-3Q16 Growth (%) HPE 3,144,197,027 21.3 3,247,173,253 25.5 -3.2 Dell EMC 3,070,405,586 20.8 2,227,185,685 17.5 37.9 IBM 1,130,618,441 7.7 889,723,595 7.0 27.1 Inspur Electronics 1,085,706,835 7.4 501,144,279 3.9 116.6 Cisco 996,248,000 6.8 929,440,000 7.3 7.2 Others 5,317,865,262 36.1 4,920,169,892 38.7 8.1 Total 14,745,041,151 100.0 12,714,836,704 100.0 16.0 Source: Gartner (December 2017) In server shipments, Dell EMC maintained the No. 1 position in the third quarter of 2017 with 17.8 per cent market share (see Table 2). HPE secured the second spot with 16.4 per cent of the market. Inspur Electronics was the only vendor in the top five to experience positive growth in the quarter. Table 2 Worldwide: Server Vendor Shipments Estimates, 3Q17 (Units) Company 3Q17 Shipments 3Q17 Market Share (%) 3Q16 Shipments 3Q16 Market Share (%) 3Q17-3Q16 Growth (%) Dell EMC 502,845 17.8 452,383 16.8 11.2 HPE 462,777 16.4 492,273 18.3 -6.2 Inspur Electronics 203,306 7.2 119,943 4.5 69.5 Lenovo 151,575 5.4 228,097 8.5 -33.5 Huawei 145,441 5.1 163,355 6.1 -11.0 Others 1,362,727 48.2 1,234,567 45.9 10.4 Total 2,828,617 100.0 2,691,618 100.0 5.1 Source: Gartner (November 2017) Europe, the Middle East and Africa (EMEA)  In the third quarter of 2017, server revenue in EMEA totaled $3.1 billion, a 17.1 per cent increase from the third quarter of 2016 (see Table 3). Server shipments totaled 495 thousand units, a growth of 2.9 per cent year over year (see Table 4). "On the surface, the EMEA server market has some positive results at last," said Adrian O'Connell, research director at Gartner. "However, the increased price of certain components due to supply shortages was the main driver of the revenue growth, as vendors passed increased costs on to users." 
Table 3. EMEA: Server Vendor Revenue Estimates, 3Q17 (US Dollars)

| Company | 3Q17 Revenue | 3Q17 Market Share (%) | 3Q16 Revenue | 3Q16 Market Share (%) | 3Q17–3Q16 Growth (%) |
|---|---|---|---|---|---|
| HPE | 1,037,283,825 | 33.9 | 959,089,295 | 36.7 | 8.2 |
| Dell EMC | 671,200,329 | 21.9 | 444,823,990 | 17.0 | 50.9 |
| Cisco | 230,560,000 | 7.5 | 207,900,000 | 8.0 | 10.9 |
| Lenovo | 223,395,171 | 7.3 | 214,695,205 | 8.2 | 4.1 |
| IBM | 218,462,587 | 7.1 | 165,219,588 | 6.3 | 32.2 |
| Others | 681,559,726 | 22.3 | 622,735,528 | 23.8 | 9.4 |
| Total | 3,062,461,638 | 100.0 | 2,614,463,608 | 100.0 | 17.1 |

Source: Gartner (December 2017)

In revenue terms, all of the top five vendors benefited from this growth in the third quarter of 2017. HPE maintained the No. 1 position, but Dell EMC saw particularly strong share growth compared with the third quarter last year. "An expanded customer base from the EMC acquisition continues to benefit the server business in the new Dell EMC organisation," said Mr. O'Connell. HPE is the last of the top four vendors by revenue to still have a non-x86 server business, so it's been affected more by the fluctuations of that segment. IBM is ranked fifth for server revenue, showing growth after a long period of decline. This indicates that the effects of the System X divestiture are now behind it. "Despite revenues in the third quarter looking relatively positive, the low shipment growth shows the EMEA server market remains constrained," said Mr. O'Connell. "While component shortages will ease, we're not expecting to see an improvement in the underlying business outlook across the region for some time."

Table 4. EMEA: Server Vendor Shipment Estimates, 3Q17 (Units)

| Company | 3Q17 Shipments | 3Q17 Market Share (%) | 3Q16 Shipments | 3Q16 Market Share (%) | 3Q17–3Q16 Growth (%) |
|---|---|---|---|---|---|
| HPE | 170,688 | 34.5 | 171,671 | 35.7 | -0.6 |
| Dell EMC | 107,458 | 21.7 | 97,986 | 20.4 | 9.7 |
| Lenovo | 22,576 | 4.6 | 31,198 | 6.5 | -27.6 |
| Fujitsu | 24,365 | 4.9 | 21,946 | 4.6 | 11.0 |
| Cisco | 19,287 | 3.9 | 20,113 | 4.2 | -4.1 |
| Others | 150,854 | 30.5 | 138,397 | 28.8 | 9.0 |
| Total | 495,228 | 100.0 | 481,311 | 100.0 | 2.9 |

Source: Gartner (December 2017)

### (Cloud + Security + Monitoring) – Challenges = Opportunity

All the cool kids are moving to the cloud and you really should consider it too. But first of all perhaps weigh up what it actually means to put your money, data and trust into what is, let's face it, fairly nascent technology. There are great benefits to be had from moving from traditional in-house servers to the Cloud, and we'll take a look at these later. First, let's consider the challenges you can expect to encounter when moving to the Cloud.

First of all, there are the initial costs to consider. Before you make the move there will be some financial outlay – you may have to put in VPNs and greater connectivity before you start your Cloud journey. Then, of course, there will be lots of technical and contractual i's to dot and t's to cross: this is going to take your time and energy but it is essential that you get it right from the off. You need to get the right people in place to make sure you will get the service you expect and need. You also must know exactly what the costs are and be confident that there are no nasty surprises or hidden extras. In short, you need to know exactly where you and your data stand going forward.

You also need to consider control. Who will have access to data, what will they be able to do with it and where in the world are they? You may be uncomfortable with handing over your company's 'crown jewels' to a third party so if you have any doubts at all it is essential that you know the who, what, where and how.
It is imperative to consider your information security too. For example, will your data be kept on shielded virtual machines that operate on a guarded fabric? Do you want it to be? And what about DDoS protection and back-ups? Until you are fully comfortable with all of these variables then your server should stay firmly within your sight and your physical control.

Moving to the Cloud could also have some unexpected effects on your IT performance. There will be an increase in network length: your data may be several hundred miles away, and this will increase the time it takes for users to speak to their servers, which then have to process and send back your information. Although this may only take a fraction of a second, these fractions of a second add up when it happens many thousands of times per transaction. And it is possible that this retrieval of data could take longer, leading to performance issues that might impact you internally and at the same time become noticeable to external users. Make sure you check for such latency issues before you move.

This all sounds a bit scary but there is help at hand. The availability of monitoring services and software that have full visibility of your IT estate's activity in its entirety means that it is now possible to have much more comfort and faith in making the move. Using a hybrid combination of APM (application performance monitoring) and NPM (network performance monitoring), it is possible for businesses to see exactly who is doing what on their system and where they are. This can be done in real time, but alerts can also be triggered for unusual activity taking place after hours. The monitoring technology is placed 'over' the existing network and takes in all the external pathways and conduits, such as those going to the Cloud servers. It monitors the system and its traffic for a period of time and builds up an understanding of what normal state looks like (a simple sketch of this 'learn normal, then alert' idea appears below). Once this is done it then monitors for unusual activity or any signs of the system slowing down, and actively looks for any malicious intent across the network. Such monitoring services can also be used to identify where performance issues lie, allowing the user to see if their Cloud connection is causing any latency and giving evidence of this to their provider.

The key to any successful IT partnership is clarity and visibility, and this advanced performance monitoring provides just that. It gives visibility of application flows and, by flagging issues, can identify problems before they become user impacting. It can also be used in change modelling, which means that a business could see the effect that making a change – such as switching to the Cloud – could have on the network.

So you've got your monitoring in place and are ready to make the switch. What are the opportunities you can look forward to after moving to the Cloud? As we've said already, all the cool kids are doing it and there must be some benefits, right? It's not done for the sake of appearing to be 'on trend'; there are real advantages and benefits to be had from making the move as long as it is well planned and properly executed. Once the initial costs to switch over are met, the ongoing expenditure will be far lower than maintaining and servicing your own bank of servers.
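To make the baselining idea above concrete, here is a minimal sketch of the 'learn what normal looks like, then flag deviations' approach. It assumes latency samples are already being collected somewhere; the numbers, names and three-sigma threshold are illustrative, not how any particular APM/NPM product works.

```python
# Toy illustration of the baselining approach described above: learn what "normal"
# response time looks like, then flag samples that deviate sharply from it.
from statistics import mean, stdev

def build_baseline(samples_ms: list) -> tuple:
    """Learn normal behaviour from an observation period (e.g. a week of traffic)."""
    return mean(samples_ms), stdev(samples_ms)

def is_anomalous(latency_ms: float, baseline: tuple, sigmas: float = 3.0) -> bool:
    avg, spread = baseline
    return latency_ms > avg + sigmas * spread

if __name__ == "__main__":
    normal_period = [42.0, 45.5, 39.8, 44.1, 41.7, 43.3, 40.9, 46.2]  # ms, observed baseline
    baseline = build_baseline(normal_period)
    for sample in [44.0, 120.0]:  # a routine request vs. a post-migration spike
        print(sample, "anomalous" if is_anomalous(sample, baseline) else "normal")
```

A real monitoring platform adds traffic capture, after-hours activity rules and change modelling on top, but the underlying principle of comparing live behaviour against a learned baseline is the same.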
There are no power, cooling or space costs to worry about and you won't need to pay people to be available 24/7 in case of any outage or service issues. In the long term, this will be a definite saving to your bottom line. The Cloud model lends itself to much-increased agility. Having a subscription model means that you can add and subtract servers as and when you need them. You don't have to think about the specifying, planning and installation of new kit. There's no need to think about downtime at weekends or in the middle of the night. Adding capacity in the Cloud can be done really quickly and is a great change enabler for businesses. This added agility means that you can achieve scale in a much more efficient and cost-effective way.

[clickToTweet tweet="You have to remember this is the service providers' bread and butter, not just a means to an end." quote="You have to remember this is the service providers' bread and butter, not just a means to an end."] The security of your data should be better too. The large Cloud service providers will have up-to-the-minute security technology which most smaller businesses would not be able to afford or move as quickly to install. You have to remember this is the service providers' bread and butter, not just a means to an end. If they are not secure, remaining up to date and releasing the latest patches, they will quite simply go out of business. This knowledge, combined with your performance monitoring technology, should help you sleep better at night.

So the Cloud does offer great benefits and opportunities to businesses but it is not something to rush into. Take the time to plan your migration and identify the risks before you go ahead. Make sure you have the mechanisms in place to manage and control those risks. Finally, make sure you understand how your Cloud estate will be managed and monitored. Once you are happy with all this you can get ready to join the cool kids.

### Pure Anarchy in The Smartest of Smart Cities?

The term "Smart City" covers such a diverse range of topics that it's impossible to give a neat definition, though it's likely that many would agree on many of the contributing features: transport, energy, IoT, health… However, at the top of many people's lists would have to be digital government, and not far behind would surely be retail. After all, what use is a city without shops, and how can you have shops (or any property, for that matter) without a controlling body to keep the city in order?

We now live in a consumerist and capitalist society, and it is now a long time since the Sex Pistols sang about "Anarchy in the UK," with the line: "your future dream is a shopping scheme." Somehow, between the late 70s and now, Punk's leading icon, Johnny Rotten, became a salesman for butter. Yes. Butter. His appearance in a television butter commercial almost ten years ago surely signalled the end of the Punk dream of Anarchy: Punk officially died the day that the butter advert was first broadcast. According to Genius Lyrics, which comments on and attempts to explain song lyrics, the dreams of the song "are often an expression of capitalism's influence on how we perceive reward and consumerism: all anyone wants are 'products' pushed by companies, rather than anything truly original." Are we now in that future dream of commercialism? Without a doubt.
With a massive Black Friday recently, and – for the first time – an even bigger Cyber Monday just a few days later, the prediction of the "future dream of the shopping scheme" has come true. Perhaps Johnny Rotten and the Sex Pistols never envisaged the internet, or even the adoption in the UK of an American holiday that has been warped into an over-exuberant celebration of commercialism, but the essence of their warnings has been realised.

But maybe Smart Cities could come to the rescue to revive some of the Punk Spirit and usher in a 'pure' form of Anarchy? I'm not talking of disruption, riots, or civil disobedience – quite the opposite, in fact. Just as blockchain is redefining the finance industry with its decentralised and distributed model, could Smart City technologies lead to a radical redefinition of how we are governed, at least at the municipal level? Just as modern technology could conceivably lead to a pure form of direct democracy (where every citizen is personally involved in the decision-making process), there is no reason why the governance of cities couldn't also be decentralised. Cities could be run without a central organising body.

Perhaps it's Utopian thinking that would be undone by Human Nature: perhaps without a centralised force at the city level keeping track and controlling things for us, things would soon fall apart – though perhaps no central authority at the city level is a very real possibility? Perhaps cities and communities within cities really could look after themselves?

Of course, the streets need to be kept clean, of course the bins need to be emptied and of course law and order needs to be kept. But will the smartest of smart cities allow citizens the ability to govern themselves without the need to defer to a central city body? History has seen communities and regions coming together to form countries, and blocks of countries: but history has also shown us many amalgamated regions breaking up and splitting apart again – often in troubling and bloody conflict. Even recently, it has been suggested that London could split from the rest of the UK to become an independent state, such are the differing needs and expectations of England's Capital compared with the rest of the United Kingdom. It's this contradictory nature of forces bringing together smaller communities to create a larger whole, whilst also feeling a need to stay in "our own" small blocks, that could signal either the possibility of success or the inevitability of failure of distributed authority in self-governing cities, depending on your point of view.

[clickToTweet tweet="The #future of #SmartCity Governance could radically change due to available #technologies " quote="The future of Smart City Governance could radically change due to available technologies "] The future of Smart City Governance could radically change due to available technologies – but should it? Would decentralisation lead to anarchy as defined as "a state of society without government", or would it inevitably lead to anarchy as defined as "social disorder due to the absence of governmental control"? Only time will tell. And as shown above, time does tell for all things: time moved on for Johnny Rotten and he succumbed to commercialism. Perhaps time will show that no amount of technological advancement can result in the successful decentralisation of governance for Smart Cities.
### To Go Boldly - Where Will Smart Cities Lead? “To Go Boldly,” sounds weird, doesn’t it? However, some would have it that it is the correct form and that the split infinitive version – the more familiar, “To Boldly Go” – is a crime against grammar. I and (more importantly), the Oxford English Dictionary, disagree. So, ignore the first part of the title of this article and know that it is more than okay, “To Boldly Go”. The case against the split infinitive is based on comparisons with how Latin is structured. As an amateur etymologist, I have a big interest in Latin – but I’m certainly no expert and it’s not something near the very top of my list of “things to study in-depth”. Latin is the past, not the future. And as we boldly go towards the future, it’s important to recognise and understand history, but certainly not be a slave to it. [clickToTweet tweet="And as we boldly go towards the #future, it’s important to understand #history, but certainly not be a slave to it." quote="And as we boldly go towards the future, it’s important to recognise and understand history, but certainly not be a slave to it." theme="style3"] Anyone who, like me, loves Star Trek: The Next Generation knows that humans in future are generally kind, courteous, polite and helpful. What a delightful vision of the future. When coming to create the new series of Star Trek (Star Trek: Discovery) the creators decided not to be beholden to the past ways of doing things and they ripped up the well-worn Star Trek TV Formula. Now, in this new series, we see that humans in the future are just like us: cantankerous, rude and difficult to get along with*. Whatever the truth of the matter about the future of humanity turns out to be there is no question that, just like language (and anything else you can think of) the future will have strong links to the past, but it won’t be straight-jacketed by it. Which neatly brings me round to The Starship Enterprise, and all of the other large Star Trek Starships. It is easy to see that they are massive, self-contained Cities in the Sky: the direct descendants of today’s Smart Cities. Smart Cities on Mars – And Beyond In recent times both Stephen Hawking and Elon Musk have talked about the future of humanity and how, being a one-planet species, we are vulnerable to being wiped out in a single catastrophic incident, such as the asteroid impact that wiped out the dinosaurs all those many millions of years ago. While many people going about their busy daily lives wouldn’t take this idea seriously and would only see this as simply fodder for Science Fiction Movies, the arguments are scientifically and logically sound. As per Elon Musk’s current plans, getting off this planet and onto Mars is a definite ‘next-step’, but after that, it would be highly illogical not to go that one step further and diversify the human race with the construction and habitation of Enterprise-style Starships. Make no mistake, those future Cities in the Sky will have their roots in the past; they will have their roots in our present-day Smart Cities. Building the Cities of the now and of the near future is without question an important and worthwhile job. But if you recognise that the DNA of Smart City plans and designs will carry on into the future and will become part of highly connected, integrated, self-contained Starships it becomes, you know… just a little sexier. 
If Star Trek – and ALL science-fiction, for that matter – has taught us anything, it's that in our predictions of the future, whilst the essence is often right, the details can be quite different. We either over- or underestimate what is possible. For instance, although set more than 300 years in the future, in one episode Captain Picard was pretty pleased with himself that he had a thin, plastic keyboard that he could roll up like a small mat. Star Trek: The Next Generation was made only 30 years ago – and yet we already have this technology! Maybe it seemed like a huge leap just a few short decades ago, but the reality has been much swifter in bringing it to fruition.

Because of all the varying technologies and approaches that have been classified under the heading "Smart City" technology, there's no real consensus over what exactly constitutes a Smart City. It's a broad church, but that's possibly one of its greatest strengths: you're not confined to doing X, Y or Z – there is the freedom to innovate and to think outside the box. And it's this ability to innovate, to create, to think laterally, and to think outside the box that makes us human and makes humanity worth saving. The bold steps we're taking now with Smart Cities, IoT, AI, etc. etc. etc. will have repercussions far into the future. Just remember that next time you're playing with your Arduino or Raspberry Pi.

* I myself am neither cantankerous, rude nor difficult to get along with. Usually.

### Smart Cities: can smart buildings be both welcoming and secure?

A warm welcome and reliable office building security can co-exist, says Gregory Blondeau, MD and Founder of Proxyclick.

We've all been there – running late for a meeting, dashing into an impressive corporate lobby and faced with a myriad of paper forms and security checks. And, let's be honest, we've all questioned whether or not the friendly receptionist has even told our contact we've arrived. Cue the awkward do I / don't I go back up to the desk, sitting not knowing whether you're technically even later for the meeting. Sound familiar?

Worth £193bn annually to the UK economy[1], face-to-face business remains king, so how can UK organisations utilise the latest 'proptech' to ensure today's new breed of 'smart buildings' achieve the right balance between a secure building and a welcoming environment for visitors? Moving in line with the UK's industry-wide movement to create smart buildings that understand the needs of the people inside them, new proptech (new technologies designed to refine, improve or reinvent the services we rely on in the property industry to buy, rent, sell, build, heat or manage residential and commercial property) can now provide companies with integrated, smart experiences that bring together access control, CRM and meeting room management.

Disconnected experience

As the smart building market undergoes a period of exponential growth, why should companies invest in a cloud-based Visitor Management Solution? [clickToTweet tweet="In an era where 85% of Britons now own a #smartphone, brands around the world spend millions on digital #strategies" quote="In an era where 85% of Britons now own a smartphone, brands around the world spend millions on digital strategies"] In an era where 85% of Britons now own a smartphone, brands around the world spend millions on digital strategies, using the latest marketing, advertising and e-commerce technologies to deliver upon consumer expectations.
However, there is still a 'disconnect' between what a customer experiences digitally and the physical experience of visiting a branch or HQ. Why is it that companies spend millions on corporate identities and neglect the first thing visitors see? It honestly feels as though the paper logbook is the last thing dating back to the industrial revolution that we still see in today's offices! If your website and e-commerce platform says The Ritz, yet your check-in experience is more akin to Fawlty Towers, there's a serious disconnect for your customers. If we can book our travel, message the nation and navigate to your premises using technology, why are we still met with pen and paper at the front desk? In a world where customers have likely asked digital assistant Siri to divulge a company's history before their morning coffee is brewed, organisations must keep pace with the technological age we're in and deliver the front-of-house experience customers expect.

To understand how things work at busy reception desks, I used to barge into office buildings unannounced. One thing struck me over and over. There was a mismatch between the luxury of the typical corporate office lobby and the old-fashioned visitor register book used by visitors to sign in. How is it possible that a company could sometimes spend fortunes transforming the lobby while at the same time sticking with a cheap visitor register that everyone could see? So much of business best practice is about efficiency. Somehow, writing in the register and pinning on handwritten identity badges seemed inappropriate. One of the first steps towards creating great first impressions is to retire the paper logbook altogether and streamline the visitor journey. Customisable and intuitive web-based software can now manage visitors to your offices, from invitation to sign-out.

Security

On a day-to-day basis, the level of building security seems to have heightened within companies and organisations: where is the warm welcome? It appears to be put to one side while certain measures are double-checked. The dilemma posed is simple: if someone is coming to your office for the first time, they have to get past security. It invariably involves a wait while you hear from reception that your client has arrived. Your client's first impression will be dictated by how they are greeted downstairs. Whilst smart buildings are operationally efficient and provide enhanced comfort for tenants, visitors are still often subjected to frustrating security measures like registering their licence plate and filling in rafts of forms. As smart cities are created, and visitors utilise the latest technologies such as high-speed rail, driverless cars and smart motorways, archaic front-of-house experiences are likely to create a negative impression.

How integrated cloud technology has a central role to play in helping organisations deliver a smart visitor experience

Cloud-based visitor management applications now provide companies with simple and friendly software that connects guests with the person they're visiting. Visitors can be added directly from any electronic calendar into the software and the process of welcoming them begins. Visitors then receive an invitation with all the information they need for a meeting and, upon arrival, simply check in using an iPad. Their host is then notified by SMS, email or via one of numerous integrations, for example Slack and Skype for Business.
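As a rough illustration of that flow, the sketch below models the arrival step: the visit is recorded and the host is notified on their preferred channels. It is a hypothetical example for illustration only, not Proxyclick's actual API; the class, function and channel names are invented.

```python
# Hypothetical sketch of the check-in flow described above (invite -> arrival ->
# host notification). Names and channels are illustrative, not a vendor API.
from dataclasses import dataclass, field
from datetime import datetime
from typing import Optional

@dataclass
class Visit:
    visitor: str
    host: str
    host_channels: list = field(default_factory=lambda: ["sms", "email"])
    checked_in_at: Optional[datetime] = None

def notify(channel: str, host: str, message: str) -> None:
    """Stand-in for an SMS gateway, mail server or chat integration such as Slack."""
    print(f"[{channel}] to {host}: {message}")

def check_in(visit: Visit) -> Visit:
    # Record the arrival time, then tell the host on every channel they have chosen.
    visit.checked_in_at = datetime.now()
    for channel in visit.host_channels:
        notify(channel, visit.host, f"{visit.visitor} has arrived in reception.")
    return visit

if __name__ == "__main__":
    check_in(Visit(visitor="A. Guest", host="B. Host", host_channels=["sms", "slack"]))
```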
The long-term vision for cloud service providers is to make arriving somewhere a frictionless process. The parking facility recognises your car, the access control is open for you, you enter the elevator to the right floor, you can access the Wi-Fi and indoor navigation takes you to the right meeting room where your favorite drink and your host is waiting for you – everything taken care of in an integrated, smart way. Every legitimate visitor should feel like a VIP. In the case of Proxyclick’s new check-in app, the guest experience is even more seamless as it shows employees a photo of their visitor, sends the visitor a Wi-Fi code via SMS, and prints out an access badge. The future will also see VMS technology scan visitor ID cards to confirm their identity. Using a face-matching algorithm, the feature will give organisations certainty about every visitors' identity. First impressions are vital. In business, the first impression can mean the difference between getting that contract or not; between the client taking to you and your business or feeling that it wasn’t for them. If we can extend a warm welcome to our clients, then we have an edge in this slightly edgy world. [1] According to a report by the Centre for Economics and Business Research (Cebr) and UK hotel chain Premier Inn (https://realbusiness.co.uk/sales-and-marketing/2016/10/24/face-to-face-meetings-deliver-193bn-for-uk-economy/) ### SysGroup Delivers 46% Revenue Growth North West-based SysGroup PLC (‘SysGroup’), one of the UK’s fastest-growing managed IT services and cloud hosting providers, has today announced revenue growth of 46.6 percent from continuing operations,  up to £3.93m for the six months ending 30th September 2017. The news follows a number of successes for the Group this year, including the full integration of System Professional (acquired in July 2016), a newly established senior management team and the appointment of a Group Sales Director and Group Marketing Director. Continued investment has also been made in technology, including a newly established presence in a Manchester data centre. Reflecting on the growth the company has seen in the past six months, Adam Binks, Chief Operating Officer at SysGroup, said: “This significant increase in revenue is real testament to SysGroup’s efforts in delivering structural and operational changes to better position the business to execute our combined organic and acquisitive growth strategy.” SysGroup has also recorded sales pipeline growth of 17.1 percent to £4.1m since the end of its financial year in March 2017, which will be further boosted by the acquisition of Rockford IT in November this year; the company’s sixth acquisition in the last three years. This move has bolstered the Group’s growth and competency, taking the total number of employees to over 100 with offices across the UK and expanding its market reach into additional vertical sectors, including hotels and leisure. “The long list of operational achievements to date, coupled with our strong sales pipeline, puts SysGroup in an excellent growth position, leaving us optimistic about our future and our continued commitment to our customers and partners.” Adam continued. This latest announcement highlights the growing market demand for SysGroup’s flexible managed IT solutions and cloud hosting among customers from the financial services, insurance, retail, technology, not-for-profit, education and private health sectors. 
Rockford IT is one of only two WatchGuard Platinum Partners in the UK, which will also give SysGroup the tools to build on existing partnerships and accreditations while continuing to offer fully managed IT services, cloud hosting and IT security solutions to a wide base of customers. Further investment in capabilities and strategic acquisitions will be supplemented with organic growth in line with SysGroup's strategy to expand its customer offering and geographic reach.

### Could we be on the cusp of aligning mainstream financial institutions with cryptocurrencies?

This year has seen Bitcoin skyrocket in value. Last week's news that the cryptocurrency was up 900 percent came amid increasing speculation that mainstream investors may be on the cusp of embracing it. But the problem still remains: how do we create a bridge between the thus-far countercultural crypto markets and established financial institutions? We at Disruptive Live spoke to fintech firm Xena Exchange, who are leading the way by launching a second-generation cryptocurrency exchange. CEO Anton Kravchenko dialled in from Slush, a world-leading start-up event in Helsinki, to tell us about his team's pioneering work. His view of Bitcoin is unexpected; he calls it "a bit of a hype right now", and expects a degree of correction and cooling to occur. This doesn't seem to worry him, however, as his focus is on the technology behind blockchain, rather than on the temporary buzz.

"One of the hallmarks of blockchain is that it can create trust in a network without the need for an intermediary," says Kravchenko. "…One can transfer money with a trusted network without having to go to a bank, or take out insurance on their car without the involvement of an insurance company, and still be sure that it will pay out when time comes. Secure, frictionless, and instant transactions are like the future of economical interactions." The Xena Exchange team is based in London and Moscow, and boasts impressive backgrounds in investment banking – experience which they hope will expedite and support their plans to integrate seamlessly with financial institutions. For those looking to gain price exposure to blockchain tokens, such as Bitcoin, Xena Exchange's role will be to facilitate access to cryptocurrencies and more than 100 tokens from blockchain projects, whilst also closing the gap between traditional and cryptocurrency exchanges.

Are cryptocurrencies really compatible with mainstream banking?

With industry experts like Ruben Galindo Steckel, co-founder of AirTM, suggesting that cryptocurrencies have taken off because 'many have lost faith in the banking system', it raises the question of whether these two fundamentally disparate arms of the fintech landscape can truly align. But with the world economy facing big questions, like the management of inflation rates in developing countries and the huge behavioural shift towards e-commerce in more developed nations, there is hope that platforms that allow cryptocurrencies to be exchanged on a larger scale will allow banks to move towards a more secure and transparent system, and regain some of that lost trust. The hype, confusion and uncertainty that still cloaks Bitcoin and other cryptocurrencies might mean there is still a long way to go before the old guard embraces them fully, but the conversation has begun, and with businesses now looking to facilitate that relationship, perhaps we could be closer to streamlining the two than we think.
### Netskope and Facebook usher in secure collaboration in Workplace Deeper integration with Workplace offers comprehensive security and compliance using award-winning Netskope Cloud Security Platform LONDON, UK — Dec. 5, 2017 — Netskope, the leader in cloud security, today announced a first of its kind native integration with Workplace by Facebook. This deep integration uses Workplace APIs to provide comprehensive security and allows enterprise customers to inspect content and apply policies across Workplace to satisfy their regulatory compliance requirements. The announcement follows the release of the inline version of Netskope for Workplace in April 2017. Netskope for Workplace is the most comprehensive solution available in a cloud access security broker (CASB) platform and the only one available today for every CASB deployment mode. Netskope for Workplace provides security teams deep insight into and policy-based control over files and data shared in Workplace groups and chats, safeguarding valuable intellectual property, and ensuring regulatory compliance through the award-winning Netskope Cloud DLP and Threat Prevention. As cloud-based collaboration tools like Workplace make work simpler and more efficient, it is crucial that security teams find and secure sensitive content and apply appropriate controls based on their risk and governance policies. With the added support for Workplace, enterprises can depend on Netskope to enforce a single consistent regulatory and compliance policy across all their cloud services. “Employees across the enterprise are demanding access to collaboration tools like Workplace, but security teams need to ensure that use is compliant with regulations and security policies,” said Sanjay Beri, founder and CEO, Netskope. “Via our partnership with Facebook’s enterprise teams, Netskope was able to leverage APIs to extend our cloud security platform to deliver a unique and first in industry solution for Netskope for Workplace customers. This solution gives enterprise IT peace of mind when it comes to enterprise requirements in e-discovery, regulatory compliance data exfiltration and app activity monitoring. We’re delighted to further our relationship with Facebook to help secure their widely used collaboration platform.” “Workplace helps change the way people work by connecting entire organisations through tools that are familiar to them,” said Julien Codorniou, VP, Workplace by Facebook. “Security is our top priority. We’re pleased to see how ISV partners like Netskope are ensuring that customers have the most advanced data privacy and protection controls and the flexibility they want when enabling Workplace.” Netskope for Workplace provides the most comprehensive coverage of any CASB with visibility and protection extended to Workplace use taking place on- or off-network, or when collaborating with suppliers, consultants, or partners that sit beyond their network perimeter. Customers can enable integration via the Netskope app, which is now available in the Facebook Partner Marketplace. ### Gearing up for smarter cities Could a sustainable highways project in the USA provide a blueprint for transport and infrastructure planners across Europe? John Catling, CEO at WheelRight, a leading automotive technology business based in the UK, outlines how this could work in practice. Although the term ‘smart city’ has only recently entered the public consciousness, the core concept behind it has been in practice for decades. 
Indeed, there are examples of ‘big data’ and diagnostic techniques which have been used to manage large urban areas since the late sixties. In Los Angeles, California, for example, a research team called The Community Analysis Bureau produced an exhaustive report in 1974, which gave a thorough insight into poverty hotspots across the LA metropolitan area. [clickToTweet tweet="It wasn’t until recently that we’ve had access to the appropriate #tech to fully realise the #smartcity #bigdata" quote="It wasn’t until much more recently, however, that we’ve had access to the appropriate sensory and analytical technology to help us fully realise the potential of the smart city."] It wasn’t until much more recently, however, that we’ve had access to the appropriate sensory and analytical technology to help us fully realise the potential of the smart city. Singapore, for example, has introduced a series of mobility policies as part of its Smart Nation initiative. This uses data for everything, from traffic-calming measures to understanding parts of the city in which the populace is most prone to take up smoking. Boston, Massachusetts, has taken this concept one step further. The business practices of Martin Walsh, the city mayor, show how far things have come since the days of manual data crunching and reams of graph data. Walsh uses a real-time, dynamic dashboard to calculate a range of KPIs for the city’s public services, from helpline calls to the number of potholes filled in that day. The initiative, which was rolled out in October 2015, also includes a ‘CityScore’ figure, comprising 24 separate measures. This includes crime levels, WiFi speed and access to grants – to provide an overall ‘wellbeing’ score for each district. Indeed, North America’s track record in connected infrastructure is incredibly impressive. Another such example is The Ray, an 18 mile stretch of highway in Georgia, US, which is offering a compelling model for transport planners and policymakers. The highway of the future Initially established as a charitable foundation, The Ray provides inspiring insight into how the highways of tomorrow could look. Initiatives that currently line the Alabama to Georgia route include electric vehicle charging points, leading-edge solar roadway technology and WheelRight’s own drive-over tyre monitoring technology which measures cars’ tread depths and tyre pressures within seconds. The genesis of The Ray came as a memorial roadway for its namesake, Ray C. Anderson. A prominent business leader in the region, Anderson achieved recognition for his management of Interface, a multinational carpeting manufacturer which committed to cutting pollution and reducing its use of fossil fuels. In recognition of his commitment to sustainability, a section of Georgia’s I-85 highway was renamed The Ray in his honour in 2014. To ensure that the road tied in with Anderson’s green credentials, his family decided that the route should become a living project for greener, safer freeways. Throughout the next few years, the highway’s scope will expand to include solar panels, sustainable landscaping and Internet of Things (IoT) technology. All of which will go further to making this a smart, supportable and revenue-generating roadmap for a greener future – and a project from which policymakers in other countries can learn. A drive-over solution At the end of last year, The Ray introduced WheelRight’s drive over technology to the project. 
For the infrastructure planner of the future, technology such as the kind in place on The Ray will play a pivotal role in the development of connected cities. The ability to monitor the tyre condition of cars and public transport vehicles, especially in areas of heavy urban congestion – such as London – could play a significant role in easing traffic, as well as boosting fuel efficiency and reducing road accidents. While tyre pressure is taken as the vehicle traverses the road-embedded sensor plates, tread depth is measured via multi-image technology and imaging software. The tech found that one in 50 cars running on US roads has at least one under-inflated tyre, with 7.2% of cars running below the legal tread depth – that equates to 18.8m vehicles in the US running on 'bald tyres.'

WheelRight's cloud-based technology is just one of a new generation of applications that support the use of IoT in transport infrastructures. Collecting real-time data to check the tyre condition of vehicles in a city is a big data opportunity. By driving over the road-embedded sensors, vehicles' tyre data is sent to the cloud, and then on to a mobile phone, laptop or server. Such information is particularly valuable for public transport operators, allowing them to keep a close eye on the tyre quality of their fleet. Our vision for the future is a network of systems in the bus depots and petrol forecourts of major cities around the world, ensuring vehicles on the road are driving with properly inflated, safe and compliant tyres. Adopting smart technology gives city planners a rich bank of data at their fingertips – key to making smart city values a reality.

The route ahead

Inspiring projects such as The Ray provide policymakers and transport planners with an exciting vision of the future. Integrated drive-over tyre pressure monitoring technology, solar road panels and electric vehicle charging points are just the beginning of the technological revolution that will sweep our roads. The question now facing European governments is 'what can we do to replicate this success?' As demonstrated by the original impetus behind The Ray, the key to successful implementation of smart projects is a clear, singular vision. In The Ray's case, that was the creation of a zero carbon, zero deaths, zero waste and zero impact highway.

A fundamental driving force behind the success of The Ray is local government and enterprise support, including Interface, the Georgia Conservancy and Georgia Tech's School of Architecture. This has given The Ray valuable momentum. The Georgia Department of Transportation played a similarly vital role, leading the roll-out of innovative new technologies to improve safety and sustainability. For urban planners and policymakers across the UK and Europe, this should provide a compelling example. Projects such as The Ray need committed and consistent leadership, alongside the proactive support of highways agencies and engagement with leading businesses and tech providers. New technology will play a huge role in determining the successful roll-out of smart cities, and it is crucial that officials are looking now at how they begin to integrate such technology into their planning.

### ANS strengthens cloud app development capabilities with new acquisition

Independent Cloud Services Provider ANS Group has acquired Webantic, a cloud transformation and application development specialist, cementing its position as the UK's #1 Cloud Services Provider.
Webantic, founded in 2011, is an expert in innovative digital transformation and cloud application development, helping organisations build new applications and experiences for the digital era. ANS currently boasts Microsoft Gold Cloud Platform Provider and AWS Advanced Partner status. This acquisition bolsters the company's ability to offer end-to-end managed cloud services, enabling ANS customers not only to migrate and manage their applications but also to develop next-generation digital business applications in the public cloud. ANS has shown impressive growth in cloud services in recent years, having worked with some of the UK's leading enterprise and public-sector organisations. Projects include consultative assessments, supporting customers in their migrations to Microsoft Azure and AWS, as well as big data and machine learning solutions.

Commenting on the demand for cloud-native applications, Andy Barrow, CTO at ANS, said: "Organisations are using hyperscale public cloud services as the most strategic way of addressing the new world business challenges of infrastructure and, even more importantly, modern application development. Disruptive trends, such as artificial intelligence and machine learning, are developing at a staggering rate and are beginning to impact every industry. In response to this, we're now seeing a fundamental shift in the need to provide scalability, speed and flexibility through cloud technology."

Commenting on the acquisition, Luke Grimes, managing director at Webantic, said: "We are excited about this next chapter and are delighted to have found an organisation that shares our vision and passion for innovation. We believe that ANS is the perfect platform for growth and opportunity for the entire Webantic team."

Paul Shannon, CEO at ANS, added: "Webantic brings outstanding expertise in the development of complex cloud-native applications. Like us, it's a business that has a focus on customer need whilst delivering exceptional technical solutions. These are foundations that make for an exciting future together. "The acquisition of Webantic means that our customers will now have a single point of expertise for cloud application development, migration and management services as well as cloud-ready networks."

Webantic legal representation was provided by the Kuits team based in Manchester and ANS legal representation was provided by Weightmans.

### Cloud technology is key to making charities fit for the future

Advanced Trends report shows one in four charities do not have access to real-time data

New research from Advanced has revealed that many charities in the UK are not fit enough for the digital era. According to the Advanced Trends Report 2017*, 65% of charities use Cloud-based technology, yet nearly one in four (26%) do not have access to real-time data and 40% do not have the right tools to do their job effectively. The British software and services company believes charities will be held back if they don't embrace the Cloud at the heart of their operations and use it to run core functions such as donor interactions and financial management.

"Charities are under immense pressure – they need to be communicating closely with donors, efficiently and effectively, to maintain and grow revenues.
This is coupled with the challenges of having the right digital technology so that staff can streamline administrative tasks, to save costs and allow them to invest more time in the people that matter – supporters, members and stakeholders," said Mark Dewell, Managing Director – Commercial and Third Sector, at Advanced. "But the reality is that charities need to have the confidence to embrace Cloud technology fully, to ensure they can transform into digital-first organisations. This will ensure they are fit for the future, ready for real and present challenges around GDPR and better prepared to address threats such as cybersecurity."

Advanced is using the Cloud to help charities move forward in the digital era. It has accelerated its plans to deliver a new Cloud-first strategy. It has introduced CloudDonor, an intuitive donor relationship management system, to help charities process fundraising income, build marketing campaigns and manage merchandising and Gift Aid, for example – this includes all the core elements of fundraising management. Its web-based portals provide information for both staff and donors which is accessible at all times, from one source. This adds to Advanced's portfolio of Software-as-a-Service solutions, such as Advanced Business Cloud Essentials, which incorporates financial management to ensure charities have a unified view of their financial affairs.

Allen Reid, Director of Client Projects at Hart Square – an independent not-for-profit consultancy – concludes: "It's fair to say that charities know the time is ripe for transformation as the digital era impacts every aspect of life – for members, supporters and stakeholders, as well as staff. But not-for-profits want to be confident they are making the right technology choices. Getting on board with Cloud technology is, without doubt, the right choice for many charities wanting to harness the value of their data to drive engagement whilst ensuring the change process is successful – we're definitely big advocates for moving to the Cloud."

### Smart Cities Are Going To Be Ungodly Hot, Unless We Do Something

To kick off our week of articles looking at Smart Cities, Kayla Matthews takes a look at an inevitable side-effect of the ever-increasing computing power that'll be used in Smart Cities – heat production.

People in the greater Toronto area are abuzz about plans for Sidewalk Labs, a subsidiary of Google's parent company, Alphabet, to develop a 12-acre section of land by Toronto's waterfront. The project, dubbed Sidewalk Toronto, aims to turn the area into an extremely tech-equipped city where everything from public transit times to garbage collection schedules is data-driven and flexible, based on resident demand. Sidewalk Toronto and other tech cities sound inspiring at first, and many forward-thinking people dream about living in one of them. However, there's a very important, often-overlooked factor: temperature concerns. Much of the technology planned for Sidewalk Toronto — as well as other smart cities — centres on computers. Naturally, computers generate heat. Processors and graphics cards make computers get hotter, and components such as the hard and optical disk drives can, too. Those realities, combined with the ever-present problem of climate change adversely affecting weather patterns, mean we have to start planning now to keep temperatures cool enough in smart cities and other places that are highly dependent on tech, so the people living in them don't get overheated.
Fortunately, there are several things scientists and engineers could do to keep temperatures down. Some of them are as technologically advanced as the smart cities themselves. Building Data Centers With Integrated Cooling Components People who work in data centres or build them understand there are numerous strategies to maintain temperatures at a level that keeps delicate equipment safe and operating normally. For starters, painting the exterior of the building white instead of black increases light reflection abilities, meaning light from the sun doesn’t get absorbed into the structure. Another perk of all-white data centres is that they have lower electricity requirements. Statistics say simply painting a data centre white could reduce its energy requirements by as much as 25 to 30 percent. Also, promoting airflow in a data centre helps keep it cool. There are methods that prevent hot and cold air from mixing, which keeps the temperature more consistent. Furthermore, the use of recessed power distribution units allows operators to keep tabs on temperatures and intervene when necessary. Using Holistic Building Control Systems to Monitor Conditions and Raise Awareness Keeping cities cool will also involve knowing when conditions become too hot and making changes as necessary. A contractor called Peoples Electric Company offers the Unity connected building intelligence system, which offers a single interface for all lighting and HVAC components in the building. Sensors inside the building’s fixtures give up-to-the-minute data about temperature, among other things. It’s also possible to automatically dim the lighting or activate window shades, both of which could temporarily make the premises cooler when it becomes too hot. A warning system could also theoretically activate to tell citizens that conditions are unhealthily warm, letting them seek cooler places before it’s too late. A major problem related to business construction is that many building teams don’t realize there are issues with the structure until it’s too late. With the help of Unity technology or something similar, it’d be much easier to notice that challenges exist, or conversely, evaluate whether something installed to help the heat problem has worked. Planting Trees Responsibly A significant amount of data indicates urban areas with large amounts of shading provided by trees are cooler than cities that lack green space. A study from The Nature Conservancy notes a $100 million annual investment in trees could provide 77 million people with cooler air and result in lower air pollution levels for 68 million residents. However, the solution to making a stifling connected city more bearable isn’t as simple as merely planting trees. Certain tree species are more effective than others when it comes to reducing pollution or making spaces cooler. Also, trees that are too tall block airflow if used along narrow streets. In those settings, it’s more appropriate to use so-called “living walls” covered with greenery. Experts also say it’s important to plant trees that don’t attract mosquitoes or other pests. If urban planners aren’t careful about the varieties they use, the inhabitants of a smart city could deal with infestations that are so severe, high heat levels seem like only a mild concern. In some cities across the world, keeping the area cooler is a collaborative effort. For example, Viennese citizens often maintain gardens on their rooftop. When each household does its part to increase greenery, the whole city benefits. 
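Returning to the building-control idea discussed above, the 'dim the lighting, activate the shades, warn the occupants' automation can be reduced to a simple threshold loop. The sketch below is purely illustrative; the thresholds and actions are assumptions, not the behaviour of the Unity system or any other product.

```python
# Toy control loop for the heat-management automation described earlier: react to
# rising indoor temperature with passive measures first, then an occupant warning.
COMFORT_LIMIT_C = 26.0   # start shedding heat passively (illustrative threshold)
WARNING_LIMIT_C = 32.0   # alert occupants that conditions are unhealthily warm

def react_to_temperature(reading_c: float) -> list:
    actions = []
    if reading_c > COMFORT_LIMIT_C:
        actions += ["dim lighting", "deploy window shades"]
    if reading_c > WARNING_LIMIT_C:
        actions.append("broadcast heat warning to occupants")
    return actions

if __name__ == "__main__":
    for sensor_reading in (23.5, 27.8, 33.1):  # example up-to-the-minute sensor data
        print(sensor_reading, "->", react_to_temperature(sensor_reading) or ["no action"])
```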
Keeping the Future in Mind During Construction Although there are various ways to alter existing buildings and make them actively cool the surrounding area, it’ll be necessary for planners in smart cities to have a long-term mindset that incorporates transformative housing, rather than structures that get rebuilt as needed. Data from a report released by Global Infrastructure Basel says 75 percent of the structures that will exist in 2050 haven’t even been built yet. Besides considering factors like cooling roofs and materials that reflect the sun’s rays, planning officials should also be aware that the orientation of buildings matters. If structures are too tall or packed too closely together, they could block breezes and trap heat. This issue is partially associated with global warming but could become even worse if the people involved in smart cities don’t try to work with the environment, instead of against it. If the layout of a group of buildings holds heat, even the most high-tech materials and innovative ideas won’t cool things down. That’s why the people who build or improve structures in connected cities must be mindful that the things they create now may not exist in several decades. Indeed, it’s necessary to fabricate buildings that cater to current residents and keep them as cool as possible. Beyond that, contractors must also realize a future-oriented mindset is crucial, as our growing reliance on technology and the increasing problems with climate change are both making our cities hotter. This coverage of why smart cities could get so toasty and what engineers and planners can do to cool things down illustrates that it’s not sufficient to take a “wait-and-see” approach. Instead, people involved in setting up these highly advanced places for people to live must be diligent in relying on techniques that encourage and maintain livable temperatures. ### Sun, Sea and IoT Malta, in recent years, has become increasingly popular as a destination for everyone, from holidaygoers to party lovers and history buffs. It’s increasingly becoming well known for its technological advancements too. If you’d asked me about Malta 5 years ago, I wouldn’t have had a clue what to say. The country wasn’t nearly as talked about back then as it is now. However, since then I’ve been to the island twice, once in 2014 and again in 2016. If you’ve ever been to the country, you’ll agree it doesn’t take long to fall in love with it. Warm, North African sunshine, beautiful Mediterranean waters and the food… oh, the food. The island is best known for Maltese sausage, rabbit and ftira bread, and there’s no shortage of Italian-inspired dishes to complement them. But alas, this isn’t a food blog. Instead, I’d like to show you some statistics about the island’s internet sector:
- 100% broadband coverage (total) compared to the EU average of 98%
- 100% broadband coverage (rural) compared to the EU average of 93%
- 100% NGA (superfast) coverage (total) compared to the EU average of 76%
- 100% NGA (superfast) coverage (rural) compared to the EU average of 40%
- 99.5% 4G coverage compared to the EU average of 84%
Pretty impressive. They’re not resting on their laurels either. The Maltese energy company, Enemalta (a play on words, not an unpleasant sounding trip to the doctors), is investing in its infrastructure. It’s partnering with Streamcast Technologies, a private IP cloud operator, and building a datacentre in the town of Marsa.
The main purpose of this is to build on existing underground connectivity, as well as the company’s fibre-optic capabilities worldwide. It will then continue its investment, with approximately £67 million going into building datacentres around the globe. Enemalta’s executive chairman, Fredrick Azzopardi, has commented that its business development team is looking to keep up this international cooperation, further diversifying the company’s global datacentre portfolio and its presence in other areas too. A good example of those other areas is the private sector. Professor Cachia, of the University of Malta, has seen “significant growth” from the gaming industry, betting and online portals – growth that could be thanks in part to the impressive quality of graduates, their spread of expertise, and the university’s ICT faculty. Faculty staff have helped to supply the island with a torrent of expertise, evident from their participation in EU-funded projects. One particular project, started in 2010 and finished in 2012, was ICT Venturegate, a brainchild of the nation. The aim of this project was “Innovative solutions for enabling efficient interactions between SMEs in ICT projects and innovation investors”. It was co-ordinated by Paragon and resulted in an information gateway aimed at investors. This was coupled with a target of innovation investments by way of a virtual agency. E-training was also provided for SMEs looking to build on their investment financing capacity. Five years later, ICT Venturegate has spawned another project, currently in the works, called Idealist 2018, due to be finished by 2020. Building on an already established list of National Contact Points (NCPs), the idea focuses on the needs of those NCPs in the network. It also aims to provide solutions to the needs of Third World Countries and their ICT issues. By utilising their management techniques, strategic collaborations, and continuing to review and monitor the process, Idealist 2018 aims to satisfy certain objectives, including:
- Reinforcing the ICT NCP network by promoting trans-national co-operation
- Identifying and promoting best practices amongst NCPs
- Raising awareness for NCP services that benefit cross-border audiences
[clickToTweet tweet="#Malta, with an area size of just 122 square miles is a small country making a big impact. #datacentre #technology" quote="'Malta, with an area size of just 122 square miles is a small country making a big impact.'"] Malta, with an area size of just 122 square miles (smaller than the Isle of Wight), is proving that a small country can make a big impact. The country is a hidden trove of technical wonders that continues to punch above its weight. You may be wondering how this applies to your company. Well, technology is changing quicker than ever before, with startups looking to disrupt the market at every turn. Continuing growth in such an age requires flexibility, foresight and investment, which the island nation realises and builds upon. I recently wrote a blog on cryptocurrencies. Whilst I didn’t personally talk about various legal issues surrounding them, the government of Malta has been thinking about this contentious subject.
They have published a discussion paper covering the topic, showing they are even looking at the blockchain and fintech market. Just a note on Marsa: there’s a very interesting prehistoric burial site nearby called the Ħal Saflieni Hypogeum that I recommend seeing, should you find yourself there. If you have any comments, or if you want to get in touch and talk tech (or holidays), follow me on Twitter. ### The Influence of Cloud Technology in Web-Based Marketing As we move into 2018, technical excellence will play a more important role than ever in how search engines like Google perceive websites and how potential customers perceive brands. In a world of mobile-first indexing, increased security and faster load speeds, any business with a website will be looking to cloud technology to support their success. Cloud technology as a whole has been growing in popularity in recent years, though typically it’s those smaller businesses with greater agility who are first on the take-up. That said, the number of larger businesses jumping on the cloud-based bandwagon is also increasing, with the security of the technology getting better all the time, and with regulatory updates – both sector-specific and economy-wide – changing the way we work. When it comes to web marketing, a website is no longer a ‘nice to have’. Instead, it’s the business’ shop window - and in many cases, is now the only interaction a prospect will have with your brand prior to making a purchase decision. Get it right and your business can thrive through better search rankings and great user experiences. Get it wrong, and your business could be missing out on visitors and therefore sales, with those users choosing your competitors instead. Improving website load speeds using cloud hosting solutions One of the primary applications of the cloud in web marketing is cloud-based hosting. This means that the website itself is ‘stored’ in the cloud - making it more robust, less open to attack and faster at responding to user requests. Amazon Web Services (AWS) is a common platform used by some of the web’s best-known sites, as well as smaller and growing websites. Via AWS, it’s possible to implement something called a ‘separation of concerns’, where assets related to the website can be stored and served separately from one another, and therefore all at the same time, rather than one having to load before the next. This is becoming a much more common convention in modern web design, as it makes for much faster loading sites; all the better for the user, and for search engines when deciding where to rank your website against your competitors. Storing web imagery in the cloud The images used on a website speak volumes. When we use high-quality imagery, we are able to give our visitors a much clearer representation of what’s on offer. If we consider the case of e-commerce websites (where products are sold online) in particular, great imagery goes a long way to replacing that tactile experience of holding a product in store. For those lead generation websites (where we are trying to generate enquiries for our services, rather than sell products), we use imagery to show more about our brand, giving insight into the way we work and, importantly, who we are. But imagery can also be a huge burden on a website. Delivering those images from your own server every time someone visits adds to load times – a burden that can easily be mitigated using cloud storage solutions.
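To make that idea concrete, here is a minimal sketch, assuming a purely hypothetical CDN hostname (`cdn.example.com`), of how image references can be rewritten so that browsers fetch them from a delivery network rather than from the origin server:

```python
# Minimal sketch: point image URLs at a CDN host instead of the origin server.
# "cdn.example.com" is a hypothetical hostname used purely for illustration.
from urllib.parse import urljoin

CDN_BASE = "https://cdn.example.com/"

def cdn_url(image_path: str) -> str:
    """Map a local image path (e.g. '/img/hero.jpg') to its CDN equivalent."""
    return urljoin(CDN_BASE, image_path.lstrip("/"))

def rewrite_img_tags(html: str) -> str:
    """Rewrite every <img src="/img/..."> so the image is requested from the CDN."""
    return html.replace('src="/img/', 'src="' + CDN_BASE + 'img/')

page = '<img src="/img/product-01.jpg" alt="Product photo">'
print(rewrite_img_tags(page))
# <img src="https://cdn.example.com/img/product-01.jpg" alt="Product photo">
```

The images themselves are uploaded once to the storage service; a markup change of this kind is all the visitor-facing site needs.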
These storage solutions, usually in the form of a ‘content delivery network’, make it possible for images to be stored and then served from an alternative solution - meaning the demand is on that platform, rather than your own server. The images are still on your site, and as far as the user is concerned, are no different from an image served directly. But the benefit is less demand on your server, which can in some cases mean cheaper hosting solutions, too. Web project management via cloud-based platforms While some businesses may have an internal web management resource, others share the responsibility amongst their team or choose to outsource. Cloud-based project management platforms make website management so much easier. Through solutions such as Teamwork or Jira, management of website tasks is streamlined and multiple people can collaborate to accomplish goals, regardless of physical proximity. If your business has a website (and let’s face it, there are very few that don’t these days), the opportunities available to you via cloud tech are only going to increase. Start investing now to prosper in the future. ### Why Dell EMC for SAP? Here's Why. Compare the Cloud was joined by John Ali, SAP business development manager at Dell EMC. In this demonstration, John discusses SAP for businesses with Dell EMC. "If you're running SAP applications today you're most likely thinking about the future and areas such as HANA and S/4HANA, how you can migrate to these and, when you do, whether cloud would be an option for you." Find out more about Dell EMC here: https://www.dellemc.com/en-us/index.htm At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward, using a strategic combination of business consultancy, content marketing, media series and events and workshops to enable you to stand out from the cloud crowd. ### GDPR is coming – but is it enough? Alban Schmutz, senior vice president for development and public affairs at OVH, takes a look at the forthcoming GDPR regulations and considers whether they go far enough. When the EU’s General Data Protection Regulation (GDPR) comes into effect in May 2018, it will provide some much-needed consistency in setting out tougher regulations to govern the treatment of personal data across the continent. By way of reminder, the GDPR is a regulation by which European institutions are looking to strengthen and unify data protection for all individuals in the EU and update the rules of data export outside of the Union’s boundaries. The regulation applies to both data controllers (collectors of personal data) and processors (organisations that process this data). OVH and other industry players have long campaigned for such changes to regulations around data security and data privacy, but now the question is, do they go far enough? It’s encouraging to see data privacy being placed at the top of the agenda at the most senior levels in businesses of all shapes and sizes. The challenge is now set for companies to prepare for the new regulations ahead of the deadline. Our view is that we also need to start looking beyond this initial implementation phase and consider where data privacy and security need to go next. The software data protection challenge One of the first agenda items, at a tactical level, will be data protection as it relates to software offerings and suppliers, and not just hardware.
GDPR is a move in the right direction but does not go far enough in relation to the extra layers associated with where data is captured and stored. [clickToTweet tweet="#GDPR is a move in the right direction but does not go far enough" quote="GDPR is a move in the right direction but does not go far enough"] European service providers are already in discussions with government agencies to address a wide range of software-related issues and identify potential solutions, and a big step forward is anticipated by the end of 2017. As IT infrastructure and provisioning become increasingly complex, it’s not enough to simply secure the data centre or the desktop. Cyber-attackers employ ever more sophisticated methods to access sensitive data at several levels, and much more needs to be done to counter these. As industry and government work together to combat ever-evolving cybercrime, we need to be sure that software and services are not left out. Towards a ‘digital single market’? Right now, a company doing business in Europe may need to address 20 different markets, each with different rules and regulations around data privacy and security. Any business looking to scale geographically requires a unified IT strategy that meets all the national regulations of each individual market. Putting this in place is an incredibly complex and demanding task – especially for smaller, growth companies. Unlike businesses in countries with a large internal market, such as the USA and China, Europe’s businesses are held back from scaling quickly enough to steal a march on their competitors. Building a European digital single market would overcome this fragmented structure and help create a level playing field. It would not only help businesses scale faster and become more competitive, it would also help create jobs and increase the number of businesses investing in R&D in Europe. In short, it would strengthen the business ecosystem across the continent. Such a digital single market will require the collaboration of all countries in Europe: a single weak link in the chain would undermine the whole approach. Delivering the true benefits of free data flow within Europe requires higher levels of security and better infrastructure than outlined in GDPR, which only addresses personal data protection. What’s needed is a definition of the infrastructure required to underpin data security and privacy. A basic framework for this already exists if we combine GDPR, the Data Protection Code of Conduct for Cloud Service Providers, and the European Secure Cloud Label, which was launched in December 2016. But there is still work to be done. In summary, GDPR is certainly a step in the right direction in focusing minds on the importance of data protection and security, but will likely only scratch the surface of what’s needed to enable true ‘free trade’ in data and keep one step ahead of cybercriminals. The good news is that the wheels are already in motion among service providers and government agencies, who are working together to build on GDPR. Our industry needs to ensure these conversations keep moving as we strive for an all-encompassing solution for Europe – a single digital market is our ambition, and we will continue to work to make it a reality. ### Is Blockchain the answer to fraud prevention?
Thirty percent of businesses reported some sort of fraud within their supply chains last year. UK businesses also lost over £40m last year due to fraudulent activity carried out by employees, with London businesses hardest hit, accounting for 29% of those losses. A certain degree of trust in staff is required by all businesses when it comes to anyone involved in financial transactions. However, many organisations are now waking up to the idea that Blockchain technology could be used by procurement teams to help prevent internal or external fraudulent attacks. [clickToTweet tweet="90% of North American and European #banks are currently exploring #Blockchain’s potential to combat #fraud." quote="Ninety percent of North American and European banks are currently exploring Blockchain’s potential to combat fraud."] Ninety percent of North American and European banks are currently exploring Blockchain’s potential to combat fraud. As a digital ledger of transactions that can’t be corrupted, Blockchain is being recognised across the finance world for its potential to reduce fraud. Blockchain enables smart contracts to be introduced that can store rules for negotiating contract terms, automatically verify contracts, and execute the terms. Any parties transacting with each other who try to change the contract or the agreed transaction can only do so with the other participants in the Blockchain consenting to it. Invoices are just one example of where Blockchain can help prevent fraudulent behaviour and stop a supplier’s invoice from being tampered with: through Blockchain, all transactions and alterations made within the system must be verified by network participants. Using an algorithm, which is essentially a set of rules, all network participants are able to establish whether alterations being made are legitimate. Data held within a Blockchain isn’t held in a central place. Cyber-criminals, therefore, have no obvious target to carry out an attack, as copies of the data are held on every machine participating in the Blockchain. All network participants would have to agree to grant access, making it impossible in theory for a third party to hack into it. It becomes particularly important for a business to put measures in place to reduce fraud when introducing operations overseas, and in countries where scams and misconduct in business are rife. Brazil is the biggest single market in Latin America, but fraud is rife there, with half of Brazilian consumers citing card fraud in the past five years. Brazilian oil company Petrobras is under fire through Operation Car Wash, an investigation into allegations that bribes were paid to award contracts to construction firms at inflated prices. There’s no escaping the unscrupulous behaviour of some individuals, but Blockchain adds a layer of protection for any transactions that could be subjected to a cyber-attack for fraudulent purposes. Blockchain also has the potential to prevent double spending with cryptocurrencies. Fraudsters have found ways to spend the same digital money, such as Bitcoin, more than once, but Blockchain can prevent a second attempt to spend the same cryptocurrency from taking place.
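To give a feel for why tampering is so hard to hide on this kind of ledger, here is a toy sketch – a deliberately simplified hash chain, not any production blockchain – in which altering a single invoice record invalidates every block that follows it:

```python
# Toy hash-chained ledger: illustrative only, not a real blockchain implementation.
import hashlib
import json

def block_hash(record: dict, prev_hash: str) -> str:
    """Hash the record together with the previous block's hash."""
    payload = json.dumps(record, sort_keys=True) + prev_hash
    return hashlib.sha256(payload.encode()).hexdigest()

def build_chain(records):
    chain, prev = [], "0" * 64                      # genesis value
    for rec in records:
        h = block_hash(rec, prev)
        chain.append({"record": rec, "prev": prev, "hash": h})
        prev = h
    return chain

def verify(chain) -> bool:
    """Recompute every hash; any altered record breaks the chain."""
    prev = "0" * 64
    for blk in chain:
        if blk["prev"] != prev or block_hash(blk["record"], prev) != blk["hash"]:
            return False
        prev = blk["hash"]
    return True

ledger = build_chain([{"invoice": "INV-001", "amount": 1200},
                      {"invoice": "INV-002", "amount": 860}])
print(verify(ledger))                               # True
ledger[0]["record"]["amount"] = 9999                # tamper with the first invoice
print(verify(ledger))                               # False - the alteration is detected
```

In a real network, every participant holds a copy of the chain and runs this kind of check, which is why a lone fraudster cannot quietly rewrite an invoice.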
The Blockchain market is expected to grow to $20bn by 2024 and, having proven its fraud-fighting credentials, more and more organisations are likely to consider how it could benefit their own finance and procurement processes. Experts predict that eventually businesses will be using Blockchain far more widely, changing the way we manage data, store files and make payments online. As Blockchain becomes a more mainstream technology, organisations have a lot to gain by exploring its potential to prevent fraud and much, much more. ### How to Deploy DevOps in the Cloud Digital transformation is in the air - as surely as Christmas is around the corner! Digital transformation is fast becoming the main talking point on many C-suite and boardroom agendas and requires software to be delivered at a faster pace. DevOps and cloud are both key drivers for it. While they have always worked alongside one another, enterprises are increasingly choosing to deploy DevOps in the cloud. [clickToTweet tweet="#DevOps and #cloud are both key drivers for #DigitalTransformation" quote="DevOps and cloud are both key drivers for Digital Transformation" theme="style3"] Building blocks Perhaps this should come as no surprise. After all, Gartner expects the worldwide public cloud services market to grow 18.5 percent in 2017 to reach $260.2 billion. Everything is moving to the cloud, including DevOps, and in the next 12-18 months there will be a sharp rise in the use of cloud platform services in application development and as building blocks. There are many factors in favour of DevOps in the cloud. Cloud platforms provide APIs and services for automatic management, provisioning, configuration and monitoring of resources, which help in developing infrastructure as code for automating the application delivery process. Cloud lends itself very easily to DevOps practices; the two go hand in hand. Beyond the clouds There are also several technologies that are very (very) hot right now that are acting as catalysts for cloud adoption. Containers help in breaking up monolithic applications, making them portable, easily managed and orchestrated. Microservices help in developing services that are independently developed, updated, replaced and scaled. Serverless architectures build on top of containers and microservices to remove operational concerns from developers and complement DevOps goals of agility and delivering business value. Yet another is the rise in trust-based systems and policy-driven gates to provide the right checks and balances in an automated and declarative manner. All these technologies on the cloud lend themselves to agile development and to DevOps practices. When enterprise IT teams think of DevOps in the cloud, they typically consider custom application development and sometimes overlook ERP apps. There are also plenty of IoT (Internet of Things) platforms available on the cloud that lend themselves to DevOps. And then there is a plethora of choice in Big Data and Machine Learning platforms on the cloud for building intelligent applications. Not only are applications getting better on the cloud, tools for monitoring are also improving, so IT teams can monitor everything from physical hardware to containers and applications via the cloud. DevOps…done a little differently The cloud is characterised by infrastructure-on-demand and pay-by-use models. This makes DevOps in the cloud different. For instance, when developers acquire infrastructure or hosts for their containers, the infrastructure is modelled like any other software.
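As a minimal sketch of what "modelling infrastructure like software" can look like, the example below uses AWS's boto3 SDK to request a container host in code; the AMI ID, region and tags are placeholders rather than real resources, and error handling is deliberately omitted:

```python
# Illustrative infrastructure-as-code sketch using the AWS boto3 SDK.
# The AMI ID, region and tag values below are placeholders, not real resources.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",   # placeholder machine image
    InstanceType="t3.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "container-host"}],
    }],
)

instance_id = response["Instances"][0]["InstanceId"]
print("Provisioned container host:", instance_id)
```

Because the environment is described in code, it can be version-controlled, peer-reviewed and torn down again just like the application it supports.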
When developers conduct role-based automated software configuration and provisioning, they can scale and shrink applications based on the incoming traffic and load. Developers can improve the COGS (cost of goods sold) considerably by acquiring cloud resources at the optimal time and cost. Benefits and best practices DevOps in the cloud, used with Agile practices, can help enterprises get products to market fast and ensure customers get faster releases with new features and fixes. Engineering teams can update features as and when they are ready. If they are not already engaged in Shift Left, enterprises should make this a priority because it identifies problems early so they can be addressed at the outset. Lean mean engineering machines Automation is another key enabler. Enterprises will need to assess how best to increase automated testing coverage to reduce lag times and speed up releases. Similarly, setting up automated CI/CD pipelines to enable faster updates and feedback loops will reap its own rewards further down the line, as the business becomes better able to respond to market changes, customer feedback and so on. Teams deploying DevOps in the cloud will find that the need for coordination and hand-offs is eliminated and many manual activities are replaced by automation, leading to far more productive, lean engineering teams. So, are you considering DevOps in the cloud? Needless to say, there are countless benefits - but enterprises should be aware of the pitfalls too, as well as how to maximise their cloud investment. Pay close attention to… Firstly, tool selection can be an issue, so organisations should check ahead to avoid vendor lock-in to the cloud provider. Multi-cloud development is gathering momentum: according to Microsoft and 451 Research, nearly one-third of organisations work with four or more cloud vendors. For a DevOps team new to working in a cloud model, proper resource governance is important too. This ensures there are adequate quotas, there are no overruns on usage and that everything is easy to manage. Lastly, it is crucial to keep a close watch on security – who has access to perform which actions. Organisations will need to be clear on how key management and ACLs to cloud services are carried out. Finally, here are the top 5 tips to make DevOps in the Cloud a success:
- Embrace the DevOps culture: this may sound obvious, but DevOps is more than a set of tools and technologies. It is a mindset and a way of working. Maintain visibility and collaboration across teams at all times.
- Choose the right tools: it is critical to select the right tools for getting CI/CD pipelines fully automated.
- Microservices and containers: organisations should move to microservices architectures and adopt containers where possible.
- Profile the SDLC: identify and ease the bottlenecks. By profiling the SDLC to identify manual activities and wait times, these areas can then be targeted and optimised.
- Keep it agile: keep following agile practices to maximise the chances of a successful deployment in the cloud.
### #DisruptiveLIVE - Bonus Stream - GIANT Healthcare Tech 2017 Wearables Fashion LIVE Join us for Day 2 of GIANT Healthcare Technology 2017 LIVE with a special bonus Fashion Show of wearable tech! The Global #Innovation And New #Technology #Health Event held on the 28-30th Nov 2017 LDN #GiantHealthEvent #Tech #DigitalHealth#IoT #Networking #mHealth.
### Capgemini to help accelerate robotic automation across UK Central Government London, November 29, 2017 - Capgemini has announced a two-year agreement with the UK Cabinet Office to develop a Robotic Process Automation (RPA) Centre of Excellence (CoE). The CoE, which is now up and running, will help to accelerate the take-up of RPA across central government by supporting departments to develop plans to automate some of their clerical processes. Widely seen as a major enabler of public sector transformation, RPA describes a process in which software is programmed to autonomously carry out basic tasks across applications, reducing the burden of repetitive, simple tasks on employees. Able to be developed and deployed in a matter of weeks, RPA is highly cost-effective and can typically demonstrate returns on investment within a few months. It has been known to dramatically improve the speed and accuracy of processing, resulting in a quicker and higher quality service for the public. RPA will help Government departments use technology to perform repetitive tasks that may involve drawing information from different databases – for example, to verify a request for a service, grant or benefit. The new CoE itself brings together an expert team of people at the forefront of this technology from the public and private sector and will serve as a showcase for RPA activities by providing education and demonstrations. It will also assist departments to identify potential RPA initiatives and carry out detailed analysis of potential benefits before they decide on implementation. Christine Hodgson, Capgemini UK Chairman, commented: “Automation is a key element of Capgemini’s digital transformation offering and we are pleased to be working with the UK Cabinet Office on such a strategic project. “RPA is an excellent opportunity for public sector organisations to realise significant productivity gains and focus on more value-added services. Based on our UK public sector and global RPA experience, we are confident that the Centre of Excellence will have a key role in supporting the public service transformation.” ### #DisruptiveLIVE - GIANT Healthcare Technology 2017 LIVE - Day 2 - Afternoon Join us for Day 2 of GIANT Healthcare Technology 2017 LIVE The Global #Innovation And New #Technology #Health Event held on the 28-30th Nov 2017 LDN #GiantHealthEvent #Tech #DigitalHealth#IoT #Networking #mHealth. ### A Fifty Year Low The frugal among you may have a scowl on your face. Interest rates on traditional UK savings accounts are the lowest they’ve been for over 50 years. But there may be reasons to turn that frown upside down if you investigate cryptocurrencies. Over the last 50 years, interest rates peaked at a whopping 13.56% in 1990. Both the preceding and following year saw double-digit interest rates too. Previously this was only seen once in 1980. But from 1970 to 1998 the rate rarely fell below 5% - averaging 7.82% over the 28-year period. Compare that to the rate of 1.23% in 2016 and things seem a little dire. Compare that to an account that Nationwide are offering in 2017 – an account that anyone can get, and not limited by number of withdrawals – and you’re looking at 0.1%. The average person in the UK in their 20s earns £1,876.50 per month. Now let’s assume for this example that John saves his entire salary each month in his account. At the end of the year he has £22,518; this entitles him to a whopping £22.52. That’s just not good enough if you're looking to amass a small fortune.
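For anyone who wants to check the arithmetic, a few lines of Python reproduce the figures above, using the same simplification the example makes (interest applied to the full amount at the end of the year):

```python
# Reproduce the savings example above using simple end-of-year interest.
monthly_salary = 1876.50
saved = monthly_salary * 12              # £22,518.00 put aside over the year

rate_2017 = 0.001                        # the 0.1% easy-access rate
rate_1990 = 0.1356                       # the 13.56% peak rate

print(f"Saved over the year: £{saved:,.2f}")
print(f"Interest at 0.1%:    £{saved * rate_2017:,.2f}")   # ~£22.52
print(f"Interest at 13.56%:  £{saved * rate_1990:,.2f}")   # ~£3,053.44
```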
In the interest of keeping things fair, John could make use of a 5% rate if he was a Nationwide member, and he could potentially find a better-than-0.1% deal elsewhere. There are Help-to-Buy ISAs around as well, so there are options better than the 0.1% on offer, but my point is the comparison between now and an older generation. Even the traditional, risk-free option of buying government bonds isn’t worth the time at a measly 2.2% return. Compare that to a cryptocurrency, in this case Bitcoin, which has seen a yield (in just the last 24 hours at the time of writing) of £526.80 (7.08%). If you could get over 3x the return of a government bond in 24 hours, compared to a year (8,766 hours), wouldn’t you take it? When we take a look at the sheer overall momentum that Bitcoin has had, it becomes even more appealing. Here are its gains over the last few years, from earliest to now:
- 2013 to 2014 saw an increase of £483.28 (5,916.05%)
- 2014 to 2015 saw a decrease of £304.76 (62.01%)
- 2015 to 2016 saw an increase of £103.75 (55.57%)
- 2016 to 2017 saw an increase of £522.80 (180.00%)
- Jan 2017 to Jun 2017 saw an increase of £1,008.27 (123.98%)
- Jun 2017 to Nov 2017 saw an increase of £6,150.19 (337.64%)
Now I expect you’ll see that level out as the “discovery boom” of the currency settles down, but let’s think for a moment. I could have bought 10 bitcoin in 2013 for £81.69 and they’d now be worth £79,716.83. That makes me feel a little uneasy. Compare that to the interest rates we’ve just been speaking about; which one would you prefer to yield? I’d like to point out that with interest rates you cannot lose money; they are 100% safe. Ok, the bank you have your account with could liquidate and you could lose your money (although this is protected by the FSCS in the UK), but the rate itself wouldn’t be to blame. By contrast, any investment into stocks or cryptocurrency can result in financial losses. However, unlike typical financial institutions, bitcoin resides on the blockchain, which is completely decentralised, making investment a little safer. Think of your savings account. The rate is affected by the bank, which is affected by the government, which is affected by the economy. So when we see an economy crash, like in 2008, everything is affected, and that’s because it is a centralised economy, with one thing in the middle affecting the rest. As I said, the blockchain is decentralised, so should one part suffer financial loss or decline, it is contained much more easily and spreads far less widely. To give an example, if one of the lesser cryptocurrencies were to crash, it would almost certainly have zero negative impact on ones like Bitcoin. These are the current top 5 cryptocurrencies by 2017 yield, including the current GBP exchange rate:
- Ether with a yield of 5,531.24% at ETH 1 = £367.72
- Dash (not DashCoin) with a yield of 5,240.85% at DASH 1 = £492.96
- Ripple with a yield of 4,280.39% at XRP 1 = £0.2234
- LiteCoin with a yield of 1,972.33% at LTC 1 = £75.64
- Bitcoin with a yield of 927.81% at BTC 1 = £8,323.70
Please note this is the yield for 2017 only, not since each currency’s inception, which may result in a different order. So, to finish up, this is a market that’s definitely worth keeping an eye on. No longer is this just talk: cryptocurrencies are serious business. I am not recommending leaving your savings accounts behind in favour of these currencies just yet.
Though, if I suddenly stop blogging one day, you can assume I’ve invested in the right cryptocurrency and have reaped the rewards! Disclaimer: please note, however, that I am not a financial advisor and any information written here is my own personal view and opinion; nothing written above should be construed as personal investment advice. The figures contained within are believed to be accurate at the time of writing; however, any errors or inaccuracies that are included are the result of the sources of information that I have used. I cannot be held liable for any action or inaction on your part as a result of reading this blog! If in doubt, please consult a professional, licensed financial advisor. If you have any comments, or if you want to get in touch and talk FinTech (or savings accounts), follow me on Twitter (before I get rich). ### #DisruptiveLIVE – GIANT Healthcare Technology 2017 Day 2 LIVE AM Session Join us for Day 2 of GIANT Healthcare Technology 2017 LIVE The Global #Innovation And New #Technology #Health Event held on the 28-30th Nov 2017 LDN #GiantHealthEvent #Tech #DigitalHealth#IoT #Networking #mHealth. ### Google Plans To Turn Toronto Into The Most High-Tech Smart City You've Ever Seen From 4th - 8th December 2017 Compare the Cloud will be running a number of articles on Smart Cities. As a special preview, Kayla Matthews takes a look at Google's plans to turn Toronto into the most high-tech city you've ever seen. Google has changed the way we use the internet. What if it changed the way people live, too? It just might for some Toronto residents. In mid-October, Sidewalk Labs, a subsidiary of Google’s parent company, Alphabet, signed a deal to develop 12 acres of land on Toronto’s waterfront. That deal launched an initiative called Sidewalk Toronto. The goal is to create an extremely connected community called Quayside. A High-Tech Place With a Low Cost of Living It’s no secret that Google keeps tabs on how people use the internet and adjusts its algorithms accordingly to result in more relevant online experiences. Sidewalk Labs plans to do the same with the aim of enhancing community living through data collection. For example, it might use personal information to make streets safer, streamline the necessities of life such as timely trash collection and make public transit routes more efficient. Such statistics would also likely be used to see how people interact with and make use of what Quayside offers its residents. Many people may be surprised about the projected cost of living for this advanced community. Representatives from Sidewalk Labs believe people could live in Quayside for 14% less than the surrounding area. Timber-frame housing, modular units and cohabiting situations are some of the scenarios being examined to help people live well for less. Multi-purpose buildings may also result in mixed-use environments with houses, offices and production facilities. They’ll ensure spaces don’t go to waste and contrast with the city’s existing zoning laws. The associated electricity grid would reduce energy consumption in that district to 95% below city regulations. A Hefty Investment and Huge Potential Development Plans Thanks to an initial $50 million (U.S. dollars) investment from Sidewalk Labs, the Sidewalk Toronto project could result in no less than 3.3 million square feet of housing, commercial space and office buildings. Notably, one of the structures in the new development could serve as the new headquarters for Google Canada.
Also, if things go as planned, Google’s “smart city” might not be the only boost to Toronto’s economy. The Canadian destination is also in the running to welcome Amazon. The mega e-commerce company is in the evaluation phase for picking a location for its second headquarters. However, there are housing concerns related to the possibility of Amazon’s arrival. The brand could hire as many as 50,000 new employees, and they’d all need places to live. Could Quayside help fill the housing void? Potentially, but it’s too early to tell. The Possible Pitfalls of a Connected City As interesting as Quayside seems, no city is perfect. Many individuals know that the blue light associated with technological gadgets can disrupt the body’s natural sleep cycle. Even so, 95% of Americans use some form of technology an hour before bed. Canadians likely have similar habits and have grown accustomed to sleeping next to their phones and tablets or using them just before bed as steadfastly as a toddler might rely on a teddy bear or security blanket. Health experts warn using gadgets before bed isn’t healthy. It makes sense then, that Quayside’s inhabitants could theoretically deal with sleep issues from the surrounding and constantly present tech. Sidewalk Labs released a 220-page document outlining aspirations for its community, and noted how only two pages discuss privacy Furthermore, people raise privacy concerns and wonder what sacrifices they’d make to live in a place that seemingly tracks everything they do. Anthony Townsend, an internationally known author, strategy consultant and speaker, recognizes that Quayside represents a “tremendous opportunity." However, he pointed out that Sidewalk Labs released a 220-page document outlining aspirations for its community, and noted how only two pages discuss privacy. [clickToTweet tweet="Google's plans for Toronto Smart City -> 'other forward-thinking organizations... fell short of expectations'" quote="Google's plans for Toronto Smart City -> 'other forward-thinking organizations... fell short of expectations'"] It’s also worth realizing other forward-thinking organizations and people dreamed up futuristic communities that fell short of expectations — presumably because individuals weren’t ready for such significant changes. Songdo, a community in South Korea, is one of them. Granted, the city won’t be finished until 2020, but many analysts view it as a failure because it currently houses only about half the residents it could hold. Walt Disney even had bright ideas of a “future city” called Epcot. As we know now though, only a theme park materialized from the visionary’s dream. Quayside has hurdles to clear but substantial funding and a major brand backing it. It’ll be fascinating to see if and how this groundbreaking concept comes to life. ### Lycamobile builds new digital back office with OpenText UK mobile operator digitally transforms complex legal and invoice processes with OpenText London, UK – 28th November 2017. OpenText™ (NASDAQ: OTEX, TSX: OTEX), a global leader in Enterprise Information Management (EIM), today announced that Lycamobile, the world’s largest mobile virtual network operator (MVNO), has selected OpenText to provide a digital back office for its Legal and Accounts Payable departments. Lycamobile will use multiple products from OpenText's Release 16 platform, including Process Suite, Content Suite, and Contract Centre to help cut costs and increase efficiency by digitally transforming its legal and invoice processes. 
Lycamobile is the world’s largest mobile virtual network operator, operating in 23 countries across five continents. It is the market leader in the international prepaid mobile calls market, with over 15 million customers worldwide and a new one joining every two seconds. For the first phase of the project, Lycamobile has deployed OpenText Process Suite to digitise and automate the Accounts Payable process, and OpenText Contract Center to support its legal teams. Both of these solutions feed into OpenText Content Suite, providing a single, secure source of truth for AP and legal documents, and reducing the burden of administering physical documents. The teams will generate invoices and contracts digitally and automate key parts of the approval and processing workflows, using Content Suite to store and index documents. Lycamobile plans to expand its use of Content Suite to support the whole head office, encompassing all invoices and contracts, and extending to other departments. "We chose OpenText because of its proven enterprise-grade technology and powerful integrations," said Prem Sivasamy, deputy chairman at Lyca Group. "We wanted a set of solutions that would integrate with our existing digital systems and support the changes we make in the future. We believe OpenText could deliver just what we needed on that." "We are looking forward to using OpenText’s digital back office to digitise our infrastructure and processes,” said Sivasamy. "With OpenText's platform, including the ability to work across all devices, our processes flow much more smoothly." ### How to secure your data in the cloud The recent incidents of leaks and breaches of cloud-based data have clearly shown that the process of securing data in the cloud is often poorly implemented. Even governments and large organisations who “should have known better” have suffered breaches, despite their considerable resources and awareness. With the implementation of the EU’s General Data Protection Regulation (GDPR) just six months away, it has never been more important to properly secure data in the cloud. With mandatory breach reporting to be enforced under GDPR, organisations that fail to do so risk reputational damage and hefty financial penalties. GDPR affects SMEs as well, so all businesses should now be examining their data security practices. This article provides actionable steps to help organisations avoid becoming a data breach statistic. Why infrastructure-level and access-level security is not enough “The cloud is safer than on-premise…” – this is a common message from cloud providers on the topic of security. After all, cloud providers have hundreds or thousands of security engineers, and collectively they could do a better job than an average system administrator. The cloud provider can also deliver compliance and certification in line with relevant government standards. It’s hard to argue against that. But then, why do data breaches still occur? Secure infrastructure certainly mitigates against external threats such as hacking, but it doesn’t automatically result in good data security. It is just as important to mitigate against accidents and mistakes that occur within an organisation. Failing to recognise this leads to a false sense of security. There are three distinct layers of security that can protect data in the cloud:
- Infrastructure-level security: focuses on the infrastructure, such as physical security (e.g. building location and construction; fingerprint scanners for employees) and network security.
- Access-level security: ensuring only authorised users can gain access to the infrastructure.
- Data-level security: ensuring unauthorised parties cannot read or use the data even if they have access to the cloud account – achieved using client-side encryption.
Recent data breaches, such as those experienced by Accenture, Deloitte, and the Swedish Government, are good examples of how a failure at one or more of these layers can have catastrophic consequences. In those cases, there were failures at the access-level security layer – credentials were leaked due to mistakes made within the organisation, enabling unauthorised parties to access the account and data within it. However, if good data-level security procedures had been implemented, these security breaches should not have resulted in data breaches. Data-level security via strong, client-side encryption can prevent a security breach from becoming a data breach Every data breach starts off with a security breach. For example, when the Australian Red Cross Blood Bank leaked the names, addresses and sexual history details of 550,000 blood donors, it was a security breach due to human error. An employee of a third-party IT contractor accidentally placed a plaintext database backup onto a public-facing web server. An extremely effective mitigation against security breaches – whether due to human error or active hacking – is client-side encryption. When properly implemented, this type of encryption is an excellent way to prevent a security breach from becoming a data breach, because an attacker will only see encrypted data. The role of encryption in mitigating against data breach is also recognised in the GDPR. Specifically, article 32 (1)(a) of the GDPR calls for the “pseudonymisation and encryption of personal data”, taking into account the state of the art and implementation costs. An important benefit of using encryption is that a leak of encrypted data is not classified as a data breach under GDPR. Thus it does not have to be reported, and will not result in financial penalties. Therefore, using encryption is an effective way to mitigate against fines and reputational losses when a breach occurs. Cutting through the confusion associated with encryption Unfortunately, encryption is perhaps one of the most underutilised and misunderstood areas in cybersecurity. Due to the number of acronyms (AES, RSA, ECC, TLS), the types of encryption (server-side, client-side) and the types of data protected (data in use, data at rest, data in transit), it’s no wonder many people are confused. In the context of securing data in the cloud, the two main attributes to consider are:
- Where does the encryption take place – on the cloud server (known as server-side) or on the data owner’s device (known as client-side)?
- Who holds the key – the cloud provider or the data owner?
Not all encryption is the same: why client-side encryption is important Some cloud providers specifically advertise that they use encryption. However, the devil is in the detail because the types of encryption they offer are often ineffective against common attacks.
Server-side encryption: If encryption is performed by the cloud provider, it still exposes plaintext (unencrypted) data to the cloud provider, which provides a convenient point for eavesdropping. Cloud-held keys: Even worse, when encryption keys are stored by the cloud provider (which is the case in many cloud-based encryption systems), the cloud provider will decrypt and serve up plaintext data to whomever connects to the account – including hackers. This provides no extra security against attackers who compromise the account. Therefore, the safest way to implement encryption is to choose client-side encryption with client-held keys. This means that the keys are held by the owner of the data, and the encryption/decryption is performed on-premise (and not in the cloud). The cloud provider, therefore, has no way of decrypting the data and serving it to an attacker. Additional requirements for good cryptographic security The ideal encryption system should meet a number of additional requirements: Transparent encryption – in order to maintain the usability of encrypted data, the encryption should be transparent. Encryption of a wide range of data – from data fields, documents, photos and videos, backups, drive images, files. Provide long-term security – because quantum computers are expected to break many cryptosystems within 15 years yet records need to be kept secure for lifetimes. Allow you to copy encrypted data from one storage location to another (known as ciphertext portability). Be designed and peer-reviewed by reputable cryptographers (which counteracts the “snake oil”/lack of trust problem). The technical reasons for these requirements are beyond the scope of this article, however further details and an expanded discussion about the hidden pitfalls can be found here: comprehensive checklist for client-side encryption and GDPR compliance. Given these protections are available, why do data breaches still occur? Dr Ron Steinfeld is a cryptographer and senior lecturer at Australia’s Monash University. He is an expert at cryptographic security for the cloud and has conducted research on a general-purpose encryption system known as ScramFS. He believes the problems with data security stem from the lack of easy and secure encryption solutions. “I think one of the reasons for this [widespread data breaches] is the scarcity of secure cloud storage encryption software that is easy to use and integrated/compatible with commonly used software applications. For that reason, many users store their cloud data unencrypted, or with keys stored on the cloud server. Consequently, any hacker that manages to expose the contents of the server can gain access to the data. Cloud content exposures are not infrequent due to the prevalence of web application bugs and security vulnerabilities, and weak password choices or password reset procedures by users. Encrypting stored user information on the cloud server with a key known only to the user, as is done by ScramFS, should significantly reduce the likelihood of such data breaches.” Putting it together: a full suite of protection There is no single silver bullet for securing your data in the cloud. Instead, a multi-layered approach will protect against a wider variety of situations – active hacking and human error. 
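The third layer – data-level security through client-side encryption with client-held keys – is the one most often skipped, so here is a minimal sketch of it using Python's widely used cryptography package (Fernet, an authenticated AES-based scheme). It is illustrative only and is not the ScramFS system mentioned above: the key stays on the data owner's machine, and only ciphertext ever reaches the cloud.

```python
# Minimal client-side encryption sketch using the "cryptography" package
# (pip install cryptography). Key management is deliberately simplified.
from cryptography.fernet import Fernet

# 1. Generate and keep the key on-premise - it is never sent to the cloud provider.
key = Fernet.generate_key()
fernet = Fernet(key)

# 2. Encrypt locally before upload: the provider only ever sees ciphertext.
plaintext = b"name: J. Smith, address: 1 Example Street"
ciphertext = fernet.encrypt(plaintext)
object_sent_to_cloud = ciphertext       # stand-in for the actual upload call

# 3. Only a holder of the key can recover the data after download.
recovered = fernet.decrypt(object_sent_to_cloud)
assert recovered == plaintext
print("Round trip succeeded; the provider never saw the plaintext.")
```

Even if the cloud account itself is compromised, an attacker retrieving the stored object learns nothing useful without the key.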
[clickToTweet tweet="There is no single silver bullet - How to #Secure your #data in the #Cloud @ScramSoftware" quote="There is no single silver bullet - How to #Secure your #data in the #Cloud @ScramSoftware"] Much can be learned from looking at automobile safety. Cars contain multiple mechanisms to keep passengers safe – seat belts, anti-lock brakes, air-bags and electronic stability control. The dramatic increases in passenger safety over the last 50 years have been due to substantial advances in not only the effectiveness of each system but also combining the different systems into a suite that provides comprehensive protection. An analogous same suite of protection can be achieved for securing data in the cloud. The steps for good security are: Secure your infrastructure. Secure your access to that infrastructure. Secure your data directly using client-side encryption. It’s now only a matter of months until GDPR comes into effect (25 May 2018). Organisations must move swiftly to secure their data before they become a statistic. ### Artificial Emotional Intelligence software takes security and intelligence insight to another level – not possible until now Technology is FIRST to monitor extreme life-threatening and anti-social behaviour (including suicide prevention) live via subliminal facial expressions 27 November 2017 – Human, pioneer in Artificial Emotional Intelligence (AEI), has developed new groundbreaking software that has the ability to read subliminal facial expressions live and convert these into a range of deeper emotions and specific characteristic traits in real time.  The innovative software takes security and intelligence insight to another level, something that previously has been almost impossible to achieve. Through the deciphering of live characteristic traits and emotions utilising deep learning, the technology will better equip the security and intelligence world with data that will keep both the security provider and its customers safe, thus allowing them to make vital decisions such as monitoring extreme or even life-threatening behaviour in some cases. Human’s software is the first of its kind, taking AEI and facial recognition a step further into the dynamic realms of characteristic detection. This technology is also supporting suicide prevention, face detection and verification amongst crowds, extreme behaviour monitoring, societal disturbances and anti-social behaviour. Yi Xu, CEO and Founder of Human explains: “The ongoing scanning of emotions and characteristics in both small and large crowds has provided our clients with the ability to flag any extreme highs and lows in emotions. When narrowed down to extreme happiness, sadness, nervousness amongst others, it is giving both general and security staff the foresight and ability to step in before the worst case scenario presents itself.” This continues to further aid the ongoing pattern spotting and monitoring that exists today.  The flagging of characteristics and emotions combined with face detection allows crowd monitoring and tracking to be taken from a previously reactionary process to a preventative state. Yi Xu continued: “Businesses can now be confident that they are acting with the added benefit of psychological and emotional data as well as a visual track. 
This benefit – detecting wanted or suspected individuals, as well as identifying people who may be about to commit an offence – is invaluable." With the capability to provide the technology across multiple platforms including mobile, web and API delivery, Human is proving to be both attractive and agile in terms of analysing past and live video footage with no additional equipment needed. Going deeper than simply deciphering human emotion, Human’s patent-pending software uses partial facial recognition, robust camera angles and state-of-the-art pixelated raw data to reveal typical personality and behavioural traits. Having been developed internally by world-leading data scientists, Human’s technology provides advanced levels of unbiased insight. Using raw video footage, Human’s technology is able to tell us live, in real time, what a person is feeling, as well as give us insight into certain personality traits which can be used to predict human behaviour. Check out the results of this video: https://vimeo.com/234331383 ### #DisruptiveLIVE - GIANT Healthcare Technology 2017 LIVE PM Session Join us for the afternoon session of day 1 of GIANT Healthcare Technology 2017 LIVE The Global #Innovation And New #Technology #Health Event held on the 28-30th Nov 2017 LDN #GiantHealthEvent #Tech #DigitalHealth#IoT #Networking #mHealth. [clickToTweet tweet="Watch the #GiantHealthEvent LIVE @gianthealthevnt @DisruptiveLIVE #DigitalHealth" quote="Watch the #GiantHealthEvent LIVE @gianthealthevnt @DisruptiveLIVE #DigitalHealth" theme="style3"] ### #DisruptiveLIVE - GIANT Healthcare Technology 2017 LIVE AM Session Join us for Day 1 of GIANT Healthcare Technology 2017 LIVE The Global #Innovation And New #Technology #Health Event held on the 28-30th Nov 2017 LDN #GiantHealthEvent #Tech #DigitalHealth#IoT #Networking #mHealth. [clickToTweet tweet="Watch the #GiantHealthEvent LIVE @gianthealthevnt @DisruptiveLIVE #DigitalHealth" quote="Watch the #GiantHealthEvent LIVE @gianthealthevnt @DisruptiveLIVE #DigitalHealth" theme="style3"] ### Could Bitcoin be the secret to side-stepping crippling inflation? Bitcoin can be a route out of poverty in the developing world, and help secure safe dollars. Ruben Galindo Steckel, co-founder of AirTM, explains how. There has been a lot of negative press recently about bitcoin. That is nothing new with an innovative technology that takes power away from regulators and gives it back to the people. Many questions are asked about a digital currency that is, by design, not underpinned by a government-run central bank. But the question that really needs asking is this: Who or what is going to protect a hard-working Venezuelan from an inflation rate the IMF says has peaked above 700% this year? The same question goes for anyone in Argentina or perhaps Angola, where inflation is 25% and 27% respectively. It may surprise many but, in the absence of official banking support, bitcoin can play a vital role in turning local currency into dollars and protecting people’s income against rampant inflation in their domestic economy. Rise of cryptocurrency The other question that very rarely gets asked is why people are choosing to use cryptocurrencies in the first place. Often the authorities that oppose their use will say they are unsafe or used by criminals. However, I think they need to look more at themselves.
The truth is that people are using bitcoin because many have lost faith in the banking system. When you look at the fallout of the 2008 global financial crash, it’s easy to see why. The banks were shown to be acting recklessly with poor regulation and it was everyday people who paid the price. Whether through the banking crash or the economic uncertainty that has followed in many economies, people have lost faith with the banks and the regulatory system that is supposed to keep their money safe and stable. Protecting against inflation This is felt acutely across the developing world as well as other struggling economies where inflation is rife. Hardworking people are earning a local currency which they know is worth less and less every week. It’s only natural they should want to earn a stable currency to protect their income from inflation and ensure they can still feed and clothe their families. [clickToTweet tweet="Hardworking people are earning a local currency which is worth less and less every week - Can #Bitcoin help?" quote="Hardworking people are earning a local currency which is worth less and less every week - Can #Bitcoin help?"] The trouble is, in many countries where the economy is failing, the authorities do not encourage citizens to turn to an alternative currency because it would only continue to devalue their local currency if everyone sold it off en masse. Official blocks vary, but typically citizens are not allowed to open dollar bank accounts or if they are, there are limits on how much they can convert each day. If someone is fortunate enough to have a bank that will change money for them, they can store dollar bills under the proverbial mattress. Many do not and are forced to buy ‘street’ dollars. These unofficial exchanges are not regulated and can be risky, particularly as, once again, the buyer ends up with a wad of notes that can be stolen. You may think that tech-savvy, hard-working people could just move their local currency into dollars through an e-wallet, such as PayPal, but these rarely connect to the banking system in the developing world. They’re focussed on the hot e-commerce markets in the developed economies. So, you have people earning a currency they know is devaluing by the week who are forced to either change just a little of their wages into dollars, if they are lucky or take the route of buying street dollars. Neither route is safe and secure. Safe dollars While bitcoin can be used by everyone under any circumstances, it is these people in struggling and developing economies who are usually overlooked. My business partner, Antonio Garcia, and I built a platform in which users can exchange their local currency into dollars through a P2P platform. A user seeking to exchange their local currency into dollars would be matched up with a cashier on our platform. When the clients and cashiers get matched up with each other to complete a transaction, the funds are held in escrow until confirmation from both parties is received. The cashiers earn a commission for every exchange they help facilitate, and many reinvest their earned commissions or local currency into bitcoin. They then deposit bitcoin back into the platform when the bitcoin price appreciates, and continue cashiering to gain even more commissions.  The client can choose to withdraw the funds back into local currency at a later stage if he is in need of cash. In the meantime, while he holds his funds on the platform, the client’s wealth is preserved by a more stable currency (the US dollar). 
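To make the mechanics of that exchange easier to picture, here is a minimal sketch of a peer-to-peer escrow flow of the kind described above: a client is matched with a cashier, the funds sit in escrow until both sides confirm, and the cashier earns a commission. The class name, the commission rate and the exchange rate are invented for illustration; this is not AirTM’s actual platform or API.

```python
from dataclasses import dataclass, field

COMMISSION_RATE = 0.01  # illustrative cashier commission, not a real platform rate


@dataclass
class Exchange:
    """One client-to-cashier exchange, with funds held in escrow until both confirm."""
    client: str
    cashier: str
    local_amount: float
    rate: float                               # local currency units per US dollar
    confirmations: set = field(default_factory=set)
    released: bool = False

    def confirm(self, party: str) -> None:
        # Funds stay in escrow until *both* the client and the cashier confirm.
        if party not in (self.client, self.cashier):
            raise ValueError(f"{party} is not part of this exchange")
        self.confirmations.add(party)
        self.released = {self.client, self.cashier} <= self.confirmations

    def payout(self) -> dict:
        if not self.released:
            raise RuntimeError("escrow has not been released yet")
        usd = self.local_amount / self.rate
        commission = usd * COMMISSION_RATE
        return {"client_usd": round(usd - commission, 2),
                "cashier_commission_usd": round(commission, 2)}


# A client converts 300,000 units of a devaluing local currency at 10,000 to the dollar.
deal = Exchange(client="maria", cashier="jose", local_amount=300_000, rate=10_000)
deal.confirm("maria")
deal.confirm("jose")
print(deal.payout())   # {'client_usd': 29.7, 'cashier_commission_usd': 0.3}
```

In a real marketplace the escrow, matching and dispute handling would of course sit server-side; the sketch only shows why neither party can walk away with the funds before the other has confirmed.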
Many also use the facility to send friends and family members dollars, particularly if they are working remotely. The transfer of dollars from one user to another is free of charge, and the funds enter the recipient’s account instantly. So, if you have read about bitcoin being unreliable and too risky, consider this. What sounds more risky to you? Relying on an official currency that is subject to damaging inflation, or bitcoin trading which can help people turn their local money into secure currency? ### Tata Communications makes its debut in the Gartner Magic Quadrant for Managed Hybrid Cloud Hosting, Asia/Pacific Inclusion based on an evaluation of Completeness of Vision and Ability to Execute MUMBAI, INDIA – November 27th, 2017 – Tata Communications, a leading provider of network, cloud and security services, has been positioned for the first time by Gartner in its 2017 Magic Quadrant for Managed Hybrid Cloud Hosting, Asia/Pacific. Designed by Gartner to examine how effectively providers meet the current needs of business in terms of both ‘completeness of vision’ and ‘ability to execute’, 2017 marks Tata Communications’ debut in the report. Srinivasan CR, Senior Vice President, Global Product Management and Data Centre Services, Tata Communications said: “We enable businesses to create a hybrid IT environment that combines the versatile capabilities of IZO™ Private Cloud and the flexibility of public cloud with enterprise-grade security. IZO™ Private Cloud’s open architecture, tight network integration and granular footprint with 13 cloud nodes across geographies provides unprecedented control over the residency of customer data and applications – keeping up with their employees’ demands for mobile, collaborative and social ways of working. Our inclusion for the first time in the Managed Hybrid Cloud Hosting Magic Quadrant, Asia/Pacific is a testament to our continued focus on making it easy for customers to accelerate their digital transformation strategy through a cloud-first strategy.” Tata Communications’ global suite of cloud and hosting solutions provides customers with the flexibility, scalability and security to help them thrive in today’s fast-changing marketplace. In addition to professional services offered for cloud migration and integration, the company has acquired the Multi-Tier Cloud Security (MTCS) Level 3 certification, Singapore’s highest security management certification for cloud operations. The company’s hybrid cloud strategy is underpinned by Tata Communications’ global network, IZO™ ecosystem, and partnerships with world’s biggest clouds – Microsoft Azure, Amazon Web Services, Google Cloud Platform, Office 365 and Salesforce. Today, over 25% of the world’s Internet routes travel over Tata Communications’ network and the company is the only Tier-1 provider that is in the top five by routes in five continents. ### AWS Announces Amazon Sumerian New service enables any developer to quickly and easily build virtual reality, augmented reality, and 3D applications for popular hardware platforms SEATTLE – Nov. 27, 2017 – At midnight, during the AWS re:Invent Midnight Madness kick-off event, Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ:AMZN), announced Amazon Sumerian, a new service that makes it easy for any developer to build Virtual Reality (VR), Augmented Reality (AR), and 3D applications, and run them on mobile devices, head-mounted displays, digital signage, or web browsers. 
With Amazon Sumerian’s editor, developers can build realistic virtual environments, populate them with 3D objects and animated characters, and script how they interact with each other and the application’s users. VR and AR apps created in Amazon Sumerian will run in any browser that supports WebGL or WebVR graphics rendering, including Daydream, HTC Vive, Oculus Rift, and iOS mobile devices. Getting started with Amazon Sumerian is as simple as logging into the AWS Management Console. There is no software to install or upfront costs—customers pay only for the storage used for 3D assets and the volume of traffic generated to access the virtual scenes they create. To learn more about Amazon Sumerian, visit: http://aws.amazon.com/sumerian. Customers in many industries are adopting VR and AR technologies to build realistic virtual environments, or overlay virtual objects on the real world, for a range of applications, including training simulations, virtual concierge services, enhanced online shopping experiences, virtual house or land tours, and much more. Until now, creating realistic virtual environments, or “scenes,” required specialized skills and the use of multiple different tools for disciplines such as 3D modeling, environmental design, animation, lighting effects, audio editing, and more. And then, once a VR or AR application is ready, developers must learn and implement unique specifications and deployment processes for each of the major hardware providers. Amazon Sumerian solves these challenges by providing a web-based editor that developers can use to easily and quickly create professional-quality scenes, and a visual scripting tool to build the logic that controls how the objects and characters in the scenes behave and respond to actions. Amazon Sumerian also makes it easy to build scenes where rich, natural interactions between objects, characters, and users occur through the use of AWS services such as Amazon Lex, Amazon Polly, AWS Lambda, AWS IoT, and Amazon DynamoDB. “Customers across industries see the potential of VR and AR technologies for a wide range of uses—from educating and training employees to creating new customer experiences. But, customers are daunted and overwhelmed by the up-front investment in specialized skills and tools required to even get started building a VR or AR application,” said Marco Argenti, Vice President, Technology, AWS. “With Amazon Sumerian, it is now possible for any developer to create a realistic, interactive VR or AR application in a few hours.” With Amazon Sumerian, developers can: Design immersive VR, AR, and 3D environments— Amazon Sumerian’s easy to use editor allows developers to drag and drop 3D objects (e.g. furniture, buildings, and natural objects) and characters into “scenes” (e.g. rooms, office environments, and landscapes). Developers can choose from Amazon Sumerian’s library of pre-built objects, download and import objects from third-party asset repositories such as Sketchfab or Turbosquid, or create and import their own objects. Amazon Sumerian also includes templates with pre-populated scenes. Easily create animated characters powered by AWS AI services—Developers can also use Amazon Sumerian to create animated 3D characters that can guide users through a scene by narrating scripts or answering questions. 
Amazon Sumerian is integrated with Amazon Lex and Amazon Polly, which provide automatic speech recognition (ASR), natural language understanding (NLU), and text-to-speech capabilities, so that Amazon Sumerian characters can understand and respond to users in lifelike conversations. Deploy applications to popular VR and AR hardware—Amazon Sumerian scenes run in any browser that supports WebGL or WebVR graphics rendering. Amazon Sumerian scenes are hosted in the AWS cloud and served from a publicly accessible endpoint, so that users can access them from anywhere. Mapbox is a location platform for developers with a deeply customizable toolset to embed maps in any application. "We now bring Mapbox location services to Amazon Sumerian, enabling users to integrate 3D maps and surfaces into their AR and VR applications and bring location-based experiences to life," said Alex Barth, VP of Business Development, Mapbox. "For instance, we integrated Mapbox’s points of interest and global terrain maps with Amazon Sumerian so data is delivered and rendered in real-time." Thermo Fisher Scientific is dedicated to improving the human condition through systems, consumables, and services for researchers. "My experience with Amazon Sumerian makes developing WebVR and Web3D much easier,” said Jonathan Agoot, Technology Strategist, Global eBusiness Innovation Center of Excellence, Thermo Fisher Scientific. “I can easily import my 3D models, textures, and animation so I focus on developing a scene and not have to worry about code." About Amazon Web Services For more than 11 years, Amazon Web Services has been the world’s most comprehensive and broadly adopted cloud platform. AWS offers over 100 fully featured services for compute, storage, databases, networking, analytics, machine learning and artificial intelligence (AI), Internet of Things (IoT), mobile, security, hybrid, and application development, deployment, and management from 44 Availability Zones (AZs) across 16 geographic regions in the U.S., Australia, Brazil, Canada, China, Germany, India, Ireland, Japan, Korea, Singapore, and the UK. AWS services are trusted by millions of active customers around the world—including the fastest-growing startups, largest enterprises, and leading government agencies—to power their infrastructure, make them more agile, and lower costs. ### Executive Education Episode 5: Creating Resilience and Innovation On this episode of #ExecEdTV host Kylie Hazeldine, Business Development from Regent Leadership, will be discussing the creation of resilience and innovation despite uncertainty and volatility. Kylie is joined this episode by Charlie Ross - Director of Offset Warehouse and Martha Silcott, CEO FabLittleBag. Executive Education is a 30 Minute live show format designed for business leaders, managers, entrepreneurs and startups. #ExecEdTV #DisruptiveLIVE ### How integrated cloud technology is key to digital fitness Cloud technology is revolutionising business from top to bottom, but interestingly one of the departments that still has considerable ground to make up is marketing. The technology that many organisations have in place for marketing has now become incredibly complex — a mix of multiple pieces of cloud and on-site infrastructure, with very little integration between systems. This lack of integration is having a significant effect on overall digital fitness — it slows marketers down in their day jobs and makes mapping customer journeys and building buyer profiles extremely challenging. 
And without the ability to understand customers, organisations will struggle to tailor experiences to them, limiting customer loyalty and potential revenue growth. It’s understandable how businesses have ended up with such complex technology and systems in place. Marketers, who work under immense pressure, don’t always have time to consider the technical implications of their tech buying decisions, and so adopt point solutions that address one specific problem at one particular time — without being able to consider the wider implications and future needs of that technology. Not only is the technology incredibly complex, a lot of it is also now very dated. Much of the technology that marketing teams have at their disposal today was designed to solve specific IT problems that no longer exist. Most of these problems were caused by the need to involve the IT team in nearly anything that was underpinned by technology, for example spinning up a new landing page for a particular project, or updating content on the website. Involving the IT team at every stage stifled the speed with which organisations could move digitally. Content management platforms of yesteryear have successfully solved this problem, but modern-day marketing requires so much more — and this legacy burden is holding organisations back from innovating and offering the 21st-century digital experience customers now demand. Offering a high-class digital experience is now essential to doing business. Digital experiences are on a par with having a great product to sell, a great team of people working together, a strong supply chain and great customer service. A bad digital experience, on the other hand, can cost you. Competition is too rife and consumers too savvy for them to stick around for something they don’t like. A digital strategy isn’t enough So, what are organisations doing to solve these technical issues? Fixing the problem is no simple feat, so, like most things, a solution starts with an effective plan. The good news is that most organisations seem to have a digital strategy in place, according to a survey of 450 digital respondents from organisations in the UK, France, and Germany in Acquia’s Beyond the hype report. But having a strategy in place isn’t enough — to be truly ‘digitally fit’, organisations have to be able to execute on that strategy. And the key to execution is technology. How a central, unified cloud platform helps marketers improve digital experiences Integrated cloud technology has a central role to play in helping organisations become digitally fit. Cloud offers organisations speed and control with marketing essentials like producing and managing content, personalising experiences for different customers, and mapping customer journeys across digital touchpoints like smartphones, tablets and laptops. Cloud helps with pulling information together on customers, centralising it, and making it available to any part of the business throughout the world. Speaking of global operations, integrated cloud technology is also essential for managing a brand across multiple territories. 
To ensure a consistent brand feel across all territories while giving branding teams within each region the flexibility to work freely, cloud technology is pivotal. Capturing opportunities with integrated cloud In short, a central unifying cloud platform is key for organisations who want to speed up, regain control and enhance the delivery of digital experiences. Ultimately, cloud technology is essential if you’re to create data-driven journeys and capture new opportunities. ### CommsBusiness Live: Collaboration in the Channel In this episode, David Dungay is discussing the impact of the many collaboration tools on the Channel market. It is a hotly contested space which has seen some big consumer brands blur the lines between business collaboration and everyday communications. Is the Channel grasping collaboration? What should they be aware of before on-boarding solutions into their portfolio? Tune in and find out! Host David Dungay, Editor of Comms Business @CommsGuru4U Panellists: Ian Rowan, UK Channel Manager for Wildix @Ian_Rowan @WildixSrl Charles Aylwin, Director of Channel and Public Sector at 8x8 @8x8UK Jeff May, Regional Sales Director for Konftel @KonftelDACH Rami Houbby, VP for Cloud EMEA at Broadsoft @BroadSoftNews ### Customer queries increased by over 47% during Black Friday week, indicating record sales 27 November 2017 - Customer queries to retailers and logistics partners increased by over 47% in Black Friday week (Monday - Friday 24 November, vs same week in 2016), according to customer communication platform Gnatta.  Gnatta, which manages customer interactions for retailers including ASOS, AO and Missguided, saw a record number of customers contacting retailers to handle everything from stock inquiries to fashion advice. Increasing numbers of these interactions happened across webchat (38%), a shift from last year when social media accounted for nearly 50% of all queries and webchat for 26%. Social media accounted for 37% of queries this year. This points to a shift towards faster resolution, as customers increasingly expect to receive an immediate response. Initial estimates indicate that this year’s Black Friday was set to beat all records, with IMRG predicting that sales would be up 9% on 2016. Gnatta’s figures indicate that final sales tallies will be even higher as customer interactions have increased by 47% over last year. [clickToTweet tweet="“The way customers communicate with retailers is changing...'" quote="“The way customers communicate with retailers is changing...'" theme="style3"] Jack Barmby, founder and CEO of Gnatta said: “The way customers communicate with retailers is changing. People expect an instant response on any channel. They want a retailer to know about previous conversations they’ve had, and to be able to connect the dots across channels. A good retailer will be able to deliver that. Great customer experience differentiates a brand. If you’re going to sell more products, you need to be able to handle more customer queries, too. "Retailers need to be ready to deal with customers seamlessly across multiple channels – phone, social media, messenger, webchat, email, the lot. ” ### Can peer-to-peer sharing solve Windows as a Service? For many IT managers, application deployment remains a central task within the day-to-day management of enterprise IT. 
As businesses use an increasingly wide variety of different end devices, IT departments must now ensure that their company’s approved application portfolio functions across everything from tablets to desktops, laptops and smartphones. This issue is further complicated by the growing demand for employees to bring and use their own consumer devices within the workplace. Today, IT managers must attempt to control deployments and software updates across a whole host of different operating systems, platforms and device types – many of which they have no real control over. Faced with this complex landscape, it’s easy to understand why many IT managers are struggling to secure their systems and ensure that employees are working in a productive manner. For every unsanctioned or out of date application on an organisation’s network, IT departments are faced with yet another potential security vulnerability. At the same time, employees are also faced with a disjointed experience with incompatible application versions leading to frustrated staff, reduced productivity and, ultimately, a less profitable workforce. While it’s understandable that IT managers can get bogged down in the issue of application updates, there is a much more pressing concern starting to emerge behind the scenes – that of on-going operating system updates. Staying on top of your OS With 83% of today’s enterprises running Microsoft Windows as their primary operating systems, many are now looking to transition to Windows 10. While Windows 10 offers numerous benefits for enterprise users, many have been left concerned by the platform’s new approach to operating system updates. Described by Microsoft as the “last ever” version of Windows, Windows 10 will no longer require businesses to make a hefty “point-0” migration to new OS versions. Instead, major platform updates will be introduced on a rolling basis via a more fluid Windows as a Service model. [clickToTweet tweet="major platform updates will be introduced on a rolling basis via a more fluid Windows as a Service model. " quote="major platform updates will be introduced on a rolling basis via a more fluid Windows as a Service model." theme="style3"] While IT managers may be pleased by the idea that they will never have to undertake another cross-organisation OS migration (similar to that seen at the end of Windows XP), the reality is that constantly updating an organisation’s IT systems comes with its own challenges. In the past, IT teams have had years to test an operating system upgrade before rolling it out across thousands of individual PCs. Now, however, businesses will have less than a matter of months to manage this change, with OS versions now coming on a semi-annual basis. If IT departments fail to install these updates across every PC, they will be left with a network of disjointed platforms, capabilities and user experiences. Even more worryingly, their networks could be left with hundreds or even thousands of potential security vulnerabilities. The damage possible through these unseen OS vulnerabilities was most recently highlighted by the WannaCry cyberattack on the National Health Service. Having prioritised spending on other areas of the health service, the NHS was left with a minefield of half-updated machines, with many still running the 16-year-old operating system Windows XP (which Microsoft stopped formally supporting in April 2014). 
When these vulnerabilities were exploited by the WannaCry ransomware, the NHS found itself crawling with infected machines, impacting a total of 50 NHS Trusts and costing the organisation £180,000 in emergency measures. Given that this issue was uncovered at a time when businesses had years to prepare their OS upgrade strategies, imagine how common these problems will become once organisations are forced to update their systems every few months. With this in mind, what can businesses be doing now to prepare themselves for mass updates and the move to Windows as a Service? Sharing is caring: How P2P is solving Windows as a Service With the “death” of Windows 7 now set for January 2020, all businesses will be expected to move to the Windows as a Service model prior to this date. Faced with this deadline, enterprise IT teams must start to think of new and innovative ways to manage their OS updates on an increasingly large and frequent scale. Not only does this represent more labour hours for IT teams, it can also place a significant strain on an organisation’s network – particularly if that network’s infrastructure is old or out-of-date. To overcome this issue, many businesses are turning to the use of peer-to-peer (P2P) technologies such as SD ECDNs (software-defined enterprise content delivery networks). These virtual networks allow businesses to share large files at high speeds, regardless of whether they are still relying on legacy network infrastructures. By distributing an update to multiple machines (or peers) and then allowing those machines to share the updates amongst themselves, SD ECDNs dramatically decrease the bandwidth load on an organisation’s network. The greater the number of peers across a complex distributed enterprise, the more efficient content delivery becomes compared to legacy hardware-based WAN optimisation solutions. While SD ECDNs are already being used by companies to distribute and stream high-quality video content en masse, a growing number of businesses are now beginning to examine how these same networks could be used to solve the issue of constant Windows-as-a-Service updates. By using peer-to-peer sharing to update all of a company’s devices at once, organisations such as the NHS are ensuring that they are always up-to-date, improving productivity and minimising risk. Since discovering the root cause of WannaCry, the NHS has started to explore update delivery via Kollective, a Microsoft-approved SD-ECDN. By adopting this peer-to-peer technology, the NHS will not only be better able to mitigate similar cyber attacks, but has also future-proofed itself for the move to Windows 10. This is the real lesson for IT teams to learn in 2018: their role must go beyond simply managing application deployments and security threats as and when they arrive. Instead, they must focus on building an infrastructure upon which all future activity is fast, efficient and secure. ### The battle for pole position: Hamilton, Vettel, Big Data, Analytics Whilst Lewis Hamilton wrapped up the Formula One Drivers’ Championship at the end of October at the Mexican Grand Prix, there were still another two races to complete. We were denied a final day of the season shoot-out, but this just points to the supremacy of the Mercedes Formula One team. This weekend marks the final race weekend of the 2017 season. 
To mark the event staff writer Joshua Osborn takes a look at some of the Tech used by Mercedes in their latest championship-winning season... Abu Dhabi is well known for a lot of things and their Formula 1 track is just one of them. With its long straights and tight corners, coupled with beautiful water views populated with glorious yachts, the Yas Marina circuit is nothing short of stunning. This weekend the Abu Dhabi brings to a close the 2017 season, a season where I've started to come back to the sport. You see, when West sponsored McLaren and ‘The Flying Finn’ Mika Häkkinen seemed unstoppable I was really into Formula One - I loved the sound of the cars when I didn’t have to turn the television up to 50 just to hear them roar as they flew around at 200mph. For a number of years that love-affair with the sport has waned quite considerably: the rules and regulation changes certainly haven't helped, what with the introduction of DRS and Turbo Engine, some of the fun seems to have been lost. Lately, this passion has crept back in, but not for the reasons you may think. It’s the technology they use that holds my interest now. We all know the names: Räikkönen, Vettel, Hamilton, Ferrari, Force India and Mercedes. But what if I mentioned Rubrik, Pure Storage, or Qualcomm? You may have questions like, “how do they relate to Formula 1?” or, “I’ve not heard of those racers before?”! Let's take a closer look. Rubrik describe themselves as, “a software-defined platform that unifies backup, instant recovery, replication, search, analytics, archival, compliance, and copy data management in one secure fabric across the data centre and cloud.” So how does this relate to Formula One? They are a partner of Lewis Hamilton's championship-winning Mercedes Team. A critical role in any team is the ability to record and analyse a multitude of results, performance statistics and other volumes of race data. An equally critical role is keeping this data safe and that’s where Rubrik comes in. Mercedes recognises the growing importance technology has in the sport (and indeed the world), and the role that Cloud Computing plays. They use a multi-node Rubrik cluster to backup and recover their data volumes; the ultimate intention being to better utilise and manage the sheer vast amount of race data and stay one step ahead of the competition. On to company number 2, Pure Storage: “The Pure Storage Data Platform is powered by software and hardware that’s effortless to use, efficient from end-to-end, and evergreen to upgrade. It’s got everything you need to drive industry-defining innovation and achieve business results that were previously unimaginable.” Mercedes aims to bolster their IT performance and efficiency by utilising Pure Storage’s FlashArray //m which claims to be “simpler, faster and more elegant than any other technology in the datacentre”. This will only enhance their big data, real-time analytics and database systems capabilities. Qualcomm are yet another Mercedes partner. They pioneered 3G/4G and now they’re looking to do the same with 5G by “allowing millions of devices to connect with each other in ways never before imagined.” This brings us onto the Internet of Things (IoT), of course. This, at least to me, seems a lot more obvious in the ways Mercedes can gain an advantage over the competition. By having their devices communicating and updating each other in fractions of a second, this results in as close to real-time statistics as we can get right now. 
This will affect everyone from the pit crew to the driver themselves. Much of the focus in Formula One is around the Drivers’ Championship, and it’s always the name of the winning driver which has so much prominence. But, within the sport - as I’m sure every F1 aficionado will already know - it is the Constructors’ Championship that holds more importance. An F1 team isn’t just the drivers, nor is it just the drivers and the pit crew - an F1 team nowadays is made up of the hundreds of people directly employed by the team and reaches out into all its partners. In a sport in which the difference is measured in thousandths of a second, a team has to ensure that every available marginal gain is exploited. And that can only be done by working in tandem with top-quality partners who lead and innovate in their own specialised fields. As I watch the cars blast around the Yas Marina circuit tomorrow, I’ll try not to get too distracted by the yachts in the harbour. But I will be thinking of the evolution and revolution of tech used by the teams. Maybe I’ll even come up with an innovation myself that helps an F1 team gain a few thousandths of a second. Maybe you will? I guess the question I really want answered is how many thousandths my hypothetical innovation needs to save before I can buy one of those yachts for myself! If you have any comments, or if you want to get in touch and talk tech, push the button below and follow me on Twitter. ### HyperGrid wins Cloud Management Product of the Year at 2017 SVC Awards HyperGrid, Enterprise Cloud-as-a-Service leader, is thrilled to announce that it has won the Cloud Management Product of the Year prize at this year’s SVC Awards. This success represents a powerful endorsement of the HyperCloud platform, which aims to revolutionise the way businesses approach the cloud. The award was given in recognition of the innovative nature of HyperCloud, and how it is making public cloud on-premises a reality for an ever-growing range of businesses. With demand for the benefits of both public and private cloud remaining high, HyperCloud is able to provide the advantages of both worlds in a way that goes beyond traditional hybrid models, without compromising on cost-effectiveness or convenience. Winners in each individual category were decided by a vote, in which a range of industry experts and professionals took part. The awards ceremony took place at the Hilton London Paddington Hotel. Doug Rich, VP of EMEA at HyperGrid, said: “Winning this accolade represents an outstanding achievement for HyperGrid, and the entire team should be commended for their efforts over the past 12 months. We have undergone significant evolution as a company in the last year, with the HyperCloud solution central to this growth. We’re delighted to see that our work has been recognised by our peers in the industry. “Integral to what we do at HyperGrid is bringing the benefits of enterprise cloud-as-a-service to businesses of all sizes, enabling them to gain the flexibility and scalability of cloud, but without vendor lock-in or the need for upfront capex costs that so often hinder cloud adoption. Winning this award at this year’s ceremony is testament to the position that HyperCloud has produced for itself in the market, as well as the potential for public cloud on-premises to truly shake up the way businesses approach cloud in the coming months.” ### Multi-Cloud - Is Your Private Infrastructure Ready? 
The adoption of cloud is accelerating, and focus is moving to hybrid and multi-cloud migration strategies. This approach allows organisations to choose on an application by application basis, considering which environment fits their needs best; private cloud IaaS, public cloud IaaS, SaaS or PaaS. However, many enterprises do not consider the internal estate and how the on-premise architecture also needs to adapt to the cloud. There are three considerations that Hutchinson Networks recommend organisations should consider when optimising internal environments for public cloud. Cloud connectivity requirements over the Enterprise WAN. Infrastructure security beyond the perimeter firewall. The key ingredients in building a true private cloud. Cloud Optimised WAN The enterprise WAN has, until recently, been a mixing pot of transport technologies - Dark Fibre, L2 and L3 Ethernet, Frame Relay, MPLS and Internet VPN. This gave System Administrators the flexibility to engineer complex bespoke solutions to meet the specific needs of their business (or not, depending on the quality). These solutions are often expensive, demanding considerable bespoke engineering (QoS, redundancy, WAN acceleration and encryption), and offer poor service, with little or no optimisation for cloud. Another key consideration in Enterprise WAN is Internet breakout. For many organisations, Internet breakout has been centralized, typically through a head office or data centre, allowing for centralised security policy and cost-savings on local circuits. As cloud applications are accessed over the Internet, consumers at branch offices are increasingly subjected to additional latency and low bandwidth links. As such, the following guidelines should be considered when architecting a modern enterprise WAN: Local Internet Breakout: This provides a shorter route to SaaS, IaaS and PaaS services based in the cloud. SD-WAN: SD-WAN provides organizations with otherwise inaccessible functionality such as transport independence and ZTD (Zero Touch Deployment). Additionally, many SD-WAN vendors have optimised connectivity to cloud services as well as optimal routing for IPSec VPNs. Cloud Transit: To secure high bandwidth in cloud environments, companies should select a provider such as AWS Direct Connect, Azure Express Route or Fabrix On-Net, enabling them to connect directly to a port on their network fabric. High-Speed Internet Pipes versus Thin Private Circuits: Combining local internet breakout with SD-WAN means that Internet-based IPSec VPNs are now a viable alternative to private circuits for critical traffic like voice. Enterprises should strongly consider the high costs of private circuits compared with the relatively low cost of high-speed internet pipes. Enterprise Security The traditional security perimeter is becoming increasingly irrelevant. This is being driven in part by cloud but also by changes in the way we work such as mobility, where users on the outside are accessing services within the private cloud, while users on the inside are consuming public cloud on the outside. As a result, security solutions and services are simultaneously moving inwards from the perimeter to the endpoint and outward from the perimeter to the cloud. The perimeter firewall no longer provides sufficient protection against a hardened attacker targeting a particular environment. 
[clickToTweet tweet="'The traditional security perimeter is becoming increasingly irrelevant...'" quote="'The traditional security perimeter is becoming increasingly irrelevant...'"] When defining security architecture for hybrid and multi-cloud, enterprises should consider the following. Federated Single Sign-On: Using solutions such as F5 APM (Application Policy Manager), organisations can federate user authentication across the internal estate, as well as a range of SaaS, IaaS and PaaS solutions. Endpoint Security: As the endpoint moves with the user, host security becomes vital. New solutions, such as Cylance go beyond traditional anti-virus and anti-malware to provide protection at the host level. Cloud Security: While local internet breakout can provide performance benefits, it also presents a challenge, as security policy can no longer be centralised in a data centre. Solutions like Cisco Umbrella can tackle this problem by applying URL filtering and providing anti-malware at the point of DNS resolution. Anti-DDoS: With ever-larger botnets, the frequency and scale of DDoS attacks are steadily on the rise. When these attacks target Internet circuits, they can impact users’ access to cloud services. The most effective way to protect against DDoS attacks is from within the Internet core, using cloud security services such as F5 Silverline. True Private Cloud Private cloud is a key component part of hybrid and multi-cloud. I have previously written about how only 10% of internal IT workloads represent true private cloud and in fact, most simply provide virtualisation. A true private cloud will also involve infrastructure automation, a user self-service portal and utilisation-based billing. Below are some technologies that organisations should consider when designing private cloud platforms. Infrastructure APIs (Application Programming Interfaces): APIs enable administrators to programmatically configure infrastructure using orchestration tools or languages such as Python, REST and ANSIBLE. Software Defined Infrastructure (Network, Security, Storage): Software-defined or centralised controller based infrastructures provide a single point of configuration for each component. When combined with northbound APIs, they radically simplify automation within private cloud environments. Orchestration Engines: To fully realise the benefits cloud computing can offer, enterprises need to consider orchestration engines. These products will automate the individual elements of the infrastructure (network, storage and compute), collect infrastructure inventory, provide billing information and in some cases also provide a user self-service portal. DevOps: Enterprises can close the digital skills gap in their organisation through nurturing in-house DevOps skills, giving them the flexibility to customise or even develop custom automation, billing, and frontend self-service tools. Key Takeaways Hybrid and Multi-Cloud strategies are not just about public infrastructure. Enterprises must consider the implications for their private environments too. Organisations have to think through their cloud optimised WAN, security beyond the perimeter and private cloud automation\orchestration as part of their Hybrid and Multi-Cloud strategies. ### Happy Black Friday Everyone! Phishing continues to be one of the most popular types of cyber attacks as it is a fast and easy way for cybercriminals to make money. 
However, this week cybercriminals will be in their element, using Black Friday and Cyber Monday as a hook to lure consumers into falling for their sophisticated phishing attacks. Recent research from Sophos, which surveyed 1,000 office workers, found that almost one in five (18%) have fallen victim to a phishing attack in the past. 70% of respondents claimed to be confident they could spot a phishing attack, with 21% admitting they weren’t confident they could spot one. When discussing the research, John Shier, Senior Security Advisor at Sophos, said, “It’s highly likely that almost everyone with an email account has been sent a phishing email at some point. But phishing attacks are becoming increasingly sophisticated and much harder to spot. What’s interesting with the results is that though many don’t think that they have been phished, if phishing is done right you wouldn’t know about it, so it’s highly likely that the number of those who have been phished is actually a lot higher.” As the threat of phishing increases over Black Friday and Cyber Monday, here are some top tips on how to make sure you don’t become a victim. 1. If an online deal or email offer with price discounts looks too good to be true, it probably is. Hit delete immediately. It’s common knowledge that though there are some great deals to be had over Black Friday, most products are cheaper or the same price at other times of the year, so it’s unlikely you’ll find the deal of a lifetime. The best way to confirm if it is real is to go directly on to the vendor’s site to check the price and avoid clicking the link in the email, which is likely to be malicious. 2. Feel free to browse deals on your phone, but be cautious of the wireless network you’re connected to when you’re online shopping. Only ever enter your credit card information when you’re on a secure network that you trust. And remember the best way to keep your money safe is to use PayPal or your credit card. Where possible, avoid using debit cards to purchase gifts online. 3. Be on the lookout for typosquatting. This is where cybercriminals take a popular online brand and change one letter or two to trick you into clicking and sharing personal information. Always check the spelling and be on the lookout for smart typosquatting like the famous Tvvitter attack. 4. Be sensible about password security and incorporate length and complexity. Make account passwords different and difficult to guess. Include upper and lower-case letters, numbers and symbols to make passwords harder to crack. 5. If you’re contemplating clicking the link in an email, take a look at the URL first. Before you click, hover on the link if you’re on a computer, or hold down the link on your phone, and you should see the full URL appear. Once you can see it, look at the source and ask yourself: does this look legitimate? Bear in mind that just because the URL has a padlock icon next to it or starts with ‘https’ doesn’t mean it’s safe. As a rule of thumb, if you aren’t sure it’s genuine, just delete it straight away. 
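To make the typosquatting and URL checks in tips 3 and 5 more concrete, here is a minimal, purely illustrative sketch that compares a link’s domain against a short list of brands you trust and flags near-misses. The trusted-domain list and the similarity threshold are assumptions for the example, and a simple string comparison like this is no substitute for proper security tooling.

```python
from difflib import SequenceMatcher
from urllib.parse import urlparse

# Illustrative list of domains the reader actually trusts; adjust to your own retailers.
TRUSTED_DOMAINS = ["twitter.com", "amazon.co.uk", "paypal.com"]


def check_link(url: str, threshold: float = 0.8) -> str:
    """Flag domains that look almost, but not exactly, like a trusted brand."""
    domain = urlparse(url).netloc.lower()
    domain = domain.removeprefix("www.")          # ignore a leading "www."
    for trusted in TRUSTED_DOMAINS:
        if domain == trusted or domain.endswith("." + trusted):
            return f"{domain}: matches trusted domain {trusted}"
        similarity = SequenceMatcher(None, domain, trusted).ratio()
        if similarity >= threshold:
            return (f"{domain}: SUSPICIOUS - looks like {trusted} "
                    f"but is not ({similarity:.0%} similar)")
    return f"{domain}: unknown domain - treat offers from it with caution"


print(check_link("https://www.tvvitter.com/offer"))        # flagged as a twitter.com lookalike
print(check_link("https://www.amazon.co.uk/blackfriday"))  # exact match, trusted
```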
Too late? If you think you’ve fallen victim to a phishing attack, always change your password immediately. It’s always worth contacting your bank immediately to see if there has been any fraudulent activity. Overview of Sophos phishing stats research: The research, conducted by Morar Research, surveyed 1,000 office workers in 2017. 70% are confident they could spot a phishing attack; 21% are not confident they could spot a phishing attack; 71% have not been phished; 18% have been phished. Phishing is the second most common threat people are aware of. The breakdown is: 80% are aware of spyware; 74% are aware of phishing; 63% are aware of ransomware; 49% are aware of credential-stealing malware; 39% are aware of Remote Access Trojans; 24% are aware of botnets; 11% are aware of APTs. ### Santa’s busy stocking his sleigh but are businesses ready for last-minute annual leave clashes? 23 November 2017 - New research by hotel chain Jury’s Inn reveals that 82% of workers still have annual leave to take, despite the New Year being only six weeks away. It found that a third of UK employees still have at least ten days left to use. With Christmas fast approaching, this could spell disaster for companies that have not been monitoring their employees’ annual leave throughout the year, and could leave managers panicking over how to fill the skills gap left by annual leave clashes at what is, for many businesses, the busiest time of year. HR Managers will also find themselves under pressure as they will be constantly interrupted with last-minute annual leave queries. They will be asked to make judgment calls about who can take annual leave and then berated by angry staff if they feel a decision is unfair. Adrian Lewis, Absence Management Expert at Activ Absence, says, “With so many UK workers yet to take their holiday, this will inevitably cause businesses and HR departments real headaches in the run-up to Christmas and it could be a costly problem to solve. If businesses haven’t managed their annual leave effectively, they could be short staffed and have to pay for temporary staff to fill in the gaps. “A surprising number of big businesses still rely on spreadsheets, wall planners and paper forms to manage annual leave and this doesn’t help. Errors can easily be made and it can lead to conflict amongst staff and stress for HR teams.” “It doesn’t have to be this way. Technology exists that enables easy staff holiday planning and centralised systems so that everyone can see who is off. This avoids annual leave clashes, saves money and ensures a business can plan their rotas and resources effectively.” “Such technology is particularly useful in the run-up to holiday periods as it enables employers to define trigger point alerts such as automatically prompting staff to book leave if a set level is accrued, thus preventing a build-up of annual leave. It can also alert a manager if an annual leave request is likely to create a staff shortage. “Essentially it can be like having an extra person in HR that spots the things a busy line manager might miss.” ### High in the sky or lost in the fog? Navigating the complex world of cloud pricing Businesses migrating to the cloud are understandably enthusiastic about realising all the benefits of scalability, flexibility and economy that the best cloud offerings can deliver. However, many encounter a stumbling block when confronted by the complex array of pricing models offered by different providers. Making sense of all the different options can be a major headache along the path to cloud adoption. 
Back in 2013, a report from analyst firm 451 found that cloud pricing was excessively complicated, and blamed the immaturity of the market for the lack of clarity and standardisation of offerings and pricing. This was making it very difficult for customers to compare different vendors and indeed sparked the creation of several cloud comparison indices in a bid to resolve the problem. Four years later, are we doing any better? Businesses migrating to, or starting out in, the cloud will find themselves courted by a beauty parade of providers, all claiming to provide just the right solution for their needs. Businesses migrating to, or starting out in, the cloud will find themselves courted by a beauty parade of providers, all claiming to provide just the right solution for their needs. The challenge, however, is that all organisations are different and have varying requirements at different times. It’s the drive to deliver this flexibility – which, after all, is what cloud computing is all about - that has resulted in the complex pricing environment that we continue to see. One size fits all? Attempts to make cloud purchasing more of a transparent off-the-peg activity have resulted in a massive range of alternatives and options aimed at persuading purchasers to buy a package. Take, for example, the hyper-scale public cloud providers such as AWS and Azure. These companies use the idea of instance or t-shirt sizes for their virtual machines, with the customer selecting the “size” that they feel best fits their requirements. There is a vast range of alternatives depending on the CPU type, number of CPUs, RAM, the number of virtual disks that can be attached, whether they have GPU acceleration and myriad other elements. Typically, these are priced on a pay as you go basis, where you are billed by the hour (or minute or more recently by the second) for the particular t-shirt/instance size. Persistent storage is normally billed separately unless you are using ephemeral VMs, where the storage associated with the VM is not persistent, and effectively disappears when the VM is turned off. In some cases, even with persistent storage, a virtual disk is often shown in a VM which is ephemeral and is used for caching or temporary space while the machine is running. As well as the instance-based pricing models, other models have recently launched: Reserved pricing: under this model, you agree to a contract that allows you to use the VMs 24/7 for a one or three-year period. Large discounts are available but, rather like that special offer gym membership that you take out in January, you are paying for the VM regardless of whether you use it or not. Convertible reserved pricing: this is effectively the same as above, but allows you to recoup outlay by selling time slots for your reserved instance that you don’t need on a marketplace. Spot pricing: in this model, you bid for the maximum price you are prepared to pay for a VM. Depending on the provider’s spare capacity, your VM will be powered on when your spot price is reached, and powered off again when your spot price is not high enough. This might be useful for batch processing requirements when scheduled short-term bursts of capacity are required. 
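To see how differently these models can price the same workload, the short sketch below compares a single VM under pay as you go, a one-year reserved commitment and spot pricing. The hourly rates, the reserved discount and the utilisation figures are invented for illustration only; real prices vary by provider, region and instance size, and spot capacity is never guaranteed.

```python
# Back-of-the-envelope comparison of the pricing models described above (made-up numbers).
HOURS_PER_MONTH = 730

PAYG_RATE = 0.10          # assumed on-demand price per hour for one instance size
RESERVED_DISCOUNT = 0.40  # assumed discount for a one-year reserved commitment
SPOT_RATE = 0.03          # assumed average spot price per hour when capacity is spare


def monthly_cost(utilisation: float) -> dict:
    """Monthly cost of one VM at a given utilisation (fraction of the month it runs)."""
    hours_used = HOURS_PER_MONTH * utilisation
    return {
        "pay_as_you_go": round(PAYG_RATE * hours_used, 2),
        # Reserved capacity is paid for around the clock, whether it is used or not.
        "reserved_1yr": round(PAYG_RATE * (1 - RESERVED_DISCOUNT) * HOURS_PER_MONTH, 2),
        # Spot assumes the workload tolerates being switched off when the price rises.
        "spot": round(SPOT_RATE * hours_used, 2),
    }


for utilisation in (0.25, 1.0):
    print(f"{utilisation:.0%} utilisation:", monthly_cost(utilisation))
# At 25% utilisation, pay as you go (18.25) beats the reserved rate (43.8);
# run the VM flat out and the reserved discount wins (43.8 vs 73.0).
```

The crossover point depends entirely on how steadily the workload runs, which is why the gym-membership analogy above is apt: reserved capacity only pays off if you actually turn up.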
Ultimate flexibility and simpler pricing structures Away from these complicated pricing models offered by hyper-scale providers, other cloud providers are working hard to make life easier for customers by tailoring solutions based on the characteristics and maturity of the customer’s business and with more straightforward pricing. These basically come down to three alternatives that take a lot of the pain out of identifying the right approach: The Pay As You Go model is based on the actual usage of CPU, RAM and storage by the minute and charged monthly. The benefit here is its bespoke nature: rather than having to estimate which “size” they need ahead of time, customers can provision any size VM (within reasonable limits) and change the specifications whenever they need to. This model suits highly dynamic, growing and evolving businesses who value the ability to scale and shrink at a moment’s notice. The second option referred to as ‘Reservation Pool, is generally lower cost, though slightly less flexible, and offers customers a reserved “bucket of resources” which have guaranteed availability. This option works well for mature businesses that have steady demand requirements and really want to see the cost-efficiencies that the cloud can deliver. Even greater discounts are available if businesses commit to multi-year deals. Organisations that sit in the middle of that spectrum can opt for the third way - a mixture of the two approaches - reserving a bucket of resources (that can be easily increased if required), but taking an option to burst into PAYG resources when the occasion demands it. These three options offer a very cost-effective alternative to the off-the-shelf approach and are arguably more transparent for customers weighing up the various alternatives. All about the money? [clickToTweet tweet="Consultation has a critical role to play when businesses are choosing cloud solutions as pricing remains complex" quote="Consultation has a critical role to play when businesses are choosing cloud solutions as pricing remains complex" theme="style3"] Of course, it’s worth remembering that, despite the apparent “race to the bottom” taking place between the hyper cloud providers, it’s not all about the money. While some services are becoming standardised and commoditised, if you are looking for a specific offering – DRaaS, for example - you will want to know far more about it than simply that it’s the cheapest on the market. That’s why I firmly believe that consultation has a critical role to play when businesses are choosing cloud solutions. The pricing environment remains complex, despite our best efforts, and there is more to vendor selection than price alone. In any case, enterprises moving to the cloud need to be viewing it as a strategic, not just operational, decision and as such, they need to take advantage of the advice that cloud providers offer. This will ensure that they don’t just get the best price, but they also get the best service that meets the real and changing needs of their organisation. ### James Hulse of Dell EMC talks to Adam Binks COO of SysGroup PLC As part of the Dell EMC Partner Program, Director of the MSP Division & Growth Partners at Dell EMC - James Hulse - speaks to Adam Binks COO of SysGroup PLC. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. 
We do this through a strategic combination of business consultancy, content marketing, media series, events and workshops, enabling you to stand out from the cloud crowd. To find out more about SysGroup visit: sysgroup.com To find out more about Dell EMC visit: dellemc.com ### How technology is revolutionising global payroll When a growing business embarks on a global expansion plan, it has to consider numerous challenges. New markets present lucrative opportunities. However, each new territory is also home to a unique set of legislative requirements. This is often complicated even further by a variety of different languages, customs and cultures. To navigate these challenges successfully, international companies typically enlist the help of local service providers in each country of operation. Some locations are particularly tricky and in these instances, HR may also recruit in-country payroll teams to ensure compliance. This is a time-consuming process that involves complex employee payroll and multiple service contracts. Fortunately, multinationals no longer need to endure this administrative headache. Innovative technological developments are reshaping how these businesses manage their global payroll needs. Cloud software enables companies with multiple offices around the world to consolidate their payroll, reduce their suppliers and manage their processes from one location far more efficiently. Here’s how cloud technology is changing payroll management across the globe. More access, less admin Competition is tough in the global business arena. A good way to keep two or three steps ahead of the pack is for a company to reduce its administrative workload, freeing up staff for more strategic work like compliance, and to improve employee access to information. Of course, not all administration is superfluous - a lot of it is very necessary to keep a business running smoothly. However, one team managing one global payroll service provider is far more efficient than many teams managing lots of small service contracts all over the world. An automated cloud-based global payroll system has many benefits over and above taking care of repetitive, time-consuming work. It also keeps people connected to critical company information from anywhere in the world. Regardless of the payroll manager’s actual location, they can log in and pull all the data they need in real time, from any device at any time. A centralised, cloud-based payroll platform provides a holistic view of the company at all times. With fast, easy access to this solid bank of real-time information, business leaders can make strategic decisions quickly. This speed is key to keeping their companies competitive. Enhanced collaboration and security Cloud-based software also enables teams around the world to work better together. Instead of using email to send important documents and updates back and forth, two colleagues based in different countries can work together on a project in real time. This enhanced collaboration prevents work from being duplicated or lost – and creates an agile working environment that is far more responsive and productive. 
Companies can improve collaboration and efficiency even further by integrating their cloud-based global payroll with their accounting, HR and other business systems. Nowadays, the closer these business divisions work together the better. Payroll, in particular, benefits from knowing not just how much an employee should be paid - but where a specific staff member is in their employment journey, if they’ve recently been promoted and if they’ve met their KPIs. With this information at hand, they can make informed recommendations with regard to salary increases and bonuses. What’s more, if they work with a global payroll provider that is ISO27001 certified, multinationals can rest assured that their corporate network is safe. Cloud technologies are much more secure than many people think. However, this does require high-level security knowledge and encryption expertise over and above the standard security measures. How to implement a global payroll system To fully benefit from a cloud-based global payroll platform, businesses need to carefully consider the following three areas before making the move.
1. Current technology – If a company’s existing payroll is already on the cloud, then the business will be able to globalise its payroll function without much fuss. If not, the company will need to update its software, which can be a daunting prospect for some businesses. However, with the help of the right provider, and with the right technology onboard, it can be done with minimal disruption.
2. Compliance – A multinational business has to be compliant in all countries in which it operates. When choosing a global payroll provider, a company must check that its platform covers all the territories that are relevant to the business. The right multi-country, cross-border solution is crucial to avoid severe penalties or fines for non-compliance.
3. Change management – Change is not easy for people and employees often baulk at adopting new technologies. A comprehensive change management programme can help a business get its new payroll system up and running as quickly as possible. It’s incredibly important that all workers are trained on the new system and, once implemented, can operate it comfortably.
A global payroll system that sits on the cloud can help multinationals run their offices around the world with greater efficiency, intelligence and speed. This helps them maintain a strong market position and achieve ambitious growth objectives. ### Ontario’s Edge In The AI Revolution Artificial Intelligence (AI) isn’t just the future; it’s the here and now. We interact with AI daily, whether we’re asking Siri for directions, experiencing Google predicting our interests based on past queries, or having our bank anticipate a fraudulent transaction before it can happen. As technology continues to advance at a rapid pace, the transformative possibilities of AI are practically limitless, and according to the “Godfather of AI” Professor Geoff Hinton, we are only beginning to scratch the surface. One of the most exciting developments is the combination of AI and the cloud. The marriage of these technologies opens new doors to crowdsourcing and sharing data, fuelling innovations to help address some of today’s biggest challenges in areas ranging from healthcare to urban planning. So it’s no surprise that global tech giants including Google, Facebook, Apple, and Microsoft are investing eye-watering sums in AI.
These big bets have the potential for enormous rewards – one study estimates that AI has the potential to double economic growth rates by 2035. At the centre of this AI revolution is the province of Ontario, Canada. Ontario is the second-largest IT cluster in North America, after California, and Toronto is now the fourth largest city in North America. Global giants like Google, Microsoft and Uber are capitalising on Ontario’s exploding business and tech scene by investing in AI-enabled technology being built by home-grown talent. This is happening as a direct result of a unique combination of factors vital to the advancement of AI that give Ontario an edge: talent, investment and close public-private sector partnership. On the talent front, Ontario pledged to boost the number of STEM (science, technology, engineering and maths) graduates from 40,000 to 50,000 each year, and is projecting 1,000 students a year will graduate from applied master’s programmes in Artificial Intelligence by 2022. Interest in AI-related programmes is in part fuelled by Professor Hinton of the University of Toronto, who pioneered deep learning, a type of machine learning that mimics human brain function to compute massive amounts of data. Talent is one of the reasons Facebook executive Steve Irvine left Silicon Valley and moved to Toronto to found Integrate.ai, a company that is focused on applied artificial intelligence. Irvine excitedly notes, “We have amazing talent here, and nobody knows it.” More than 200 successful AI-enabled companies are getting in on the secret, having set up shop in Ontario. One is newcomer Flybits, which started out as a PhD research project and is now a fully-fledged business supported by Toronto’s MaRS innovation lab and Ryerson University. The platform allows businesses to process big data using AI to generate user insights and tailor customer services. The financial sector is an early adopter of AI and is using it to transform how it does business. Ottawa-based start-up MindBridge Analytics, which uses AI to find irregularities in data sets, is revolutionising how audits are done. Its aim is “to bridge the gap between the failings of current statistical sampling techniques” in order to empower auditors to better anticipate and detect fraud. MindBridge’s current partners include the Bank of England and Thomson Reuters. Cloud-based technology is also being used in the fintech sector by Wealthsimple to achieve its mission of “bringing smarter financial services to everyone, regardless of age or net worth.” Through its digital investing platform, which uses an algorithm to match investors with a personalized investment portfolio designed to meet their goals, Wealthsimple offers top-quality advice to 50,000 investors – totalling over $1-billion in assets under management – at a much lower cost than traditional wealth managers. Bringing such innovations to life requires focused investment in research and development, and a close partnership between the public and private sectors, which includes incentives like extremely generous tax credits to attract talent and R&D to Ontario. This year, the Ontario government invested a further $50 million in Toronto’s Vector Institute, a new public-private institute for AI with a specific focus on deep learning and machine learning. Professor Hinton is leading the institute in identifying new business models for AI to support and sustain economic growth.
The Vector Institute is a collaboration between the University of Toronto, the governments of Ontario and Canada, and private partners including Google, Accenture, and all five of Canada’s big banks. The model is focused on bridging the gap between university research and companies like Google to help commercialise research and connect companies with the best talent. Far from a trend on the horizon, AI is already transforming how we live, work and play in Ontario. The combination of AI and the cloud will unleash even more possibilities to improve the world’s future. Ontario is harnessing its thriving tech sector, talent, research and welcoming entrepreneurial climate to lead this global AI revolution. ### Uber Data Hack Following on from the controversy of Uber’s licence to operate in London being suspended due to a lack of corporate responsibility, news has broken today that Uber suffered a massive data breach in 2016. But worse than this is the subsequent cover-up perpetrated by Uber’s management, coupled with the fact that they paid the hackers $100,000 to “delete the data [and] keep quiet.” Joe Sullivan, who was lured from Facebook in 2015 to be Uber’s security chief, has been sacked as a result. James Lyne, Sophos’ Cyber Security Advisor, has said, “Uber isn't the only and won't be the last company to hide a data breach or cyber attack. Not notifying consumers put them at greater risk of being victimized by fraud. It's for precisely this reason that many countries are driving to regulations with mandatory breach disclosure.” The attack, which happened in October 2016, included names, email addresses and phone numbers of 50 million Uber riders from all around the world. The personal information of approximately 7 million drivers was accessed as well, including around 600,000 US driver’s licence numbers. One positive note is that no social security numbers, credit card details or trip location details were stolen. It has transpired that Uber’s programmers uploaded security credentials to a GitHub repository – from there it was elementary for the hackers to access Uber’s servers hosted on Amazon. Chester Wisniewski, Sophos’ Principal Research Scientist, has commented, “Uber's breach demonstrates once again how developers need to take security seriously and never embed or deploy access tokens and keys in source code repositories. I would say it feels like I have watched this movie before, but usually, organizations aren't caught while actively involved in a cover-up. Putting the drama aside and the potential impacts of the upcoming GDPR enforcement, this is just another development team with poor security practices that have shared credentials. Sadly, this is common more often than not in agile development environments.” Rik Ferguson, Vice President Security Research at Trend Micro, has said that it is “heartening” to see that Uber’s new management team have come clean about the breach, but he “remains concerned” at some of the wording in the blog of Mr Khosrowshahi which revealed the breach. Mr Ferguson continued, “[Mr. Khosrowshahi] appears to distance Uber’s ‘corporate systems and infrastructure’ from the ‘third-party cloud-based service’ that was the target of the breach. This is perhaps indicative of the root of the problem. Cloud services adopted by a business *are* corporate systems and infrastructure and from a security perspective should be treated as such - You can’t outsource accountability."
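Wisniewski’s warning about embedded access tokens is easy to make concrete. Below is a minimal sketch, not Uber’s actual setup, of the alternative he describes: keys come from the runtime environment (or an instance role or secrets manager), so nothing sensitive ever lands in the repository. The function name and the use of boto3 are illustrative assumptions.

```python
import os
import boto3  # AWS SDK for Python; chosen here purely for illustration


def make_s3_client():
    """Create an S3 client without embedding keys in the codebase."""
    # Credentials are supplied by the deployment environment (environment
    # variables, an IAM role, or a secrets manager) - never committed to git.
    if "AWS_ACCESS_KEY_ID" not in os.environ:
        # This sketch insists on environment variables; a real deployment
        # might instead rely on an instance role, which needs no variables.
        raise RuntimeError("AWS credentials are not configured in the environment")
    # boto3 picks the credentials up from the environment on its own.
    return boto3.client("s3")
```

Had the keys lived only in the environment of the running service, a leaked repository would have exposed source code but not a direct route into the cloud account.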
Mr Ferguson’s final comment is especially relevant following the news last week regarding Cash Converters’ own breach – again the blame for the breach was initially placed on a third party rather than responsibility for the failure being taken on board. Further breaches are certainly bound to happen in future – industry analysts will surely be watching and listening for which companies are brave enough to accept accountability and which will continue to try and shift the blame. It is up to the leaders of all kinds of businesses that look after personal data to take note of the recent headlines and to start a culture shift, acknowledging that responsibility cannot be passed on when failures like this occur. ### Protecting reputation from a Brexit ‘data breach backlash’ Safeguarding data is a business problem, not an IT problem. Just as the benefits of adopting cloud platforms and services will impact every part of a company, so will the consequences of a security breach that leads to the loss of information stored and processed in the cloud. A recent Ponemon study commissioned by Centrify has shown that data breaches have a far-reaching and devastating effect on UK businesses – causing particular damage to reputation and customer loyalty. More than half of the consumers that responded to the survey had been affected by a breach; of whom 65 percent lost trust in the company, while 27 percent ended their relationship with it altogether. As the Brexit process gains pace, reputation will become increasingly valuable. Strong customer loyalty will help to sustain business revenues if consumer confidence and spending continue to dip. However, neither IT bosses nor senior level executives are taking responsibility for protecting reputation. This must change. Data breaches have a direct impact on company finances and shareholder value: the share prices of the companies surveyed by Ponemon dropped an average of five percent following the disclosure of a breach. They also experienced up to a seven percent customer churn, with an average revenue loss of £3.07 million. Appropriately protecting customer data in the cloud, and at every other point it resides or passes through, is a bottom-line issue. It is also a significant challenge, however, when there are troubling ‘blind spots’ and disconnects – both across the organisation and between the business and its customers. Time for a reality check When it comes to the biggest threats facing UK companies, IT practitioners and senior marketers both believe that a data breach ranks at the top, with marketers viewing it as more damaging to reputation than a scandal involving the CEO. However, while marketers recognise loss of brand value as the biggest cost of a breach, IT practitioners are more worried about losing their job, or the department coming under greater scrutiny. Just 23 percent of CMOs and three percent of IT practitioners are concerned about a decline in the company’s share price. There’s also disagreement over who is responsible for protecting the brand: 71 percent of IT practitioners believe it has nothing to do with them, whereas approximately two-thirds of senior marketers believe the IT department should take responsibility.
While 60 percent of senior marketers say their department collaborates with other functions in maintaining brand reputation, only 18 percent of IT departments do this. There is one thing IT leaders and CMOs do agree on: more than a third think senior managers in their business are failing to take brand protection seriously. The expectation gap Most customers will be users of cloud services themselves, at home and at work. They are probably aware of the limitations and risks and will have seen reports of high-profile data breaches in the media. It’s hardly surprising, therefore, that they have high expectations about what companies should do to safeguard their personal information, in the cloud and elsewhere. A huge 79 percent of consumers believe organisations have an obligation to take reasonable steps to secure their personal data, while 70 percent say a company’s privacy and security practices are very important to preserving their trust. Businesses must do more to meet customer expectations if they are to preserve loyalty and reputation as the terms and the effects of Brexit become clearer. This means upgrading cybersecurity to ensure customers can feel confident that their information is being protected from leaks, loss and theft, wherever it is held or processed. Leading from the front Data security is a whole business issue that requires a whole business approach. This is why ultimate responsibility for preserving reputation by protecting customer data lies with the senior team. They must take the lead on developing and implementing a comprehensive security strategy that protects the entire business and brand. They are also in the best position to drive the culture change required to ensure that IT better understands the link between cybersecurity and brand protection, departments are aligned on priorities, and everyone is aware of the consequences of a breach. The strategy should ideally include:
- Appointing a fully dedicated chief information security officer (CISO) to help move the organisation to a stronger security posture.
- Opening up clearer channels of communication – encouraging and facilitating collaboration between lines of business to determine and execute shared data security plans.
- Making strategic investments – allocating adequate budget to putting in place skilled people and security-enabling technologies, especially enterprise-wide encryption and an identity and access management (IAM) system to control and audit who can see what data and when. This will protect the business if the worst happens: the Ponemon research found that the stock prices of companies with a strong security posture recovered much more quickly following a data breach.
- Educating employees – with training and awareness programmes that increase their understanding of the risks of data breaches, and their role in protecting information from loss or theft.
- Preparing for the worst – by creating a comprehensive incident response plan. This should include procedures for communicating with customers, investors and regulators, and pre-assigned roles and responsibilities.
- Carrying out regular security assessments – to shine a spotlight on vulnerabilities in the organisation’s computer, network or communications infrastructure.
- Participating in threat sharing programmes – with partners and other companies you trust, to prevent and quickly detect attacks that might be targeting your sector, and avoid duplicating work that’s already been done.
As Brexit gets closer, customers’ priorities, concerns and attitudes will evolve. Sustaining their trust and loyalty will help businesses stay profitable through uncertain and potentially turbulent times. This is why data security is no longer just about protecting data; it’s ultimately about protecting corporate reputation and brand value. When reputation takes a hit, this has serious long-term financial consequences: loss of revenue, erosion of shareholder value, and a drop in productivity caused by the time it takes to recover. And if data cannot be appropriately secured on cloud and mobile platforms, this will limit the adoption of such technologies – and therefore the benefits to be gained by doing so. The companies that survive and thrive as the UK departs from the EU will be those that recognise this and take a holistic and strategic approach to strengthening the whole enterprise’s resilience to breaches. ### How Robotic Process Automation Can Digitally Transform Organisations Robotics and artificial intelligence (AI) technologies have been with us for some time now but they finally feel as though they have come of age. Organisations are getting to grips with how robotics and AI can help to improve business processes. However, the whole concept of how humans will work with robotics and AI is misunderstood. There is a mistaken belief that robots are a threat to human jobs rather than a way to deliver higher quality customer service and greater efficiencies across an organisation. The journey started 20 years ago with intelligent capture, when devices acquired the cognition to read text within documents. The key lesson learned from this phase was that you can’t do automation without information: you need to get data into an automated channel before it can be routed to the right person to do something with it. The next step was to automate the process of actioning the data to achieve an outcome. This had already been proven in the early stages of the AI industry, where, back in the 1950s, a machine was invented that could play chess. In 1997, IBM built a computer that beat a grandmaster at the game. But in the last 20 years nothing this significant has happened. While we’re seeing increasing numbers of organisations undertaking pilots and proof of concept exercises with robotic technology, few have achieved anything meaningful with it. In fact, we have spotted a real problem relating to CTOs not knowing what to do with robotics or how to put it to work to digitally transform their businesses. Even digital transformation is an industry term that many people don’t understand. At its simplest level, digital transformation relates to how customers can engage with an enterprise. This doesn’t mean ‘just’ going digital and creating apps - it’s about supporting every channel of communication, whether that’s email, voicemail, or good old-fashioned post. HMRC and its self-assessment tax service is a great example of digital transformation. It has seen excellent adoption rates and is massively successful.
The big challenge with digitising customer channels is the need to prove true and legal identities using e-signatures and trusted IDs via banks, but this can now be done. A digital mailroom like the one EDM Group built for HMRC captures data from incoming documents and runs a set of business rules that routes it to a group of knowledge workers. It has between 30,000 and 40,000 envelopes coming into its post room daily and the system has achieved a huge reduction in manual effort, taking away the tedious physical work that people don’t want to do. Organisations may believe that it is too difficult and expensive to use this kind of platform because their business systems are old and hard to use - and yet still work. The good news is that those systems may not need to be replaced. The real barrier to automation is how to integrate systems using Robotic Process Automation (RPA). RPA enables organisations to do integration without integrating. Even though it’s called robotics, RPA does not use a real robot. Instead, it involves a virtual machine. It logs on to the system with a pre-recorded script, it has some intelligence and some auto routing, and it types into the system and bridges the gap without any physical intervention. This solves the classic Catch-22: there’s no point in capturing all of the data within documents arriving in the mailroom or by email if it can’t then be pushed automatically into the systems that need it. The reason you go digital, build an app or put a process online is to have complete digital engagement and interaction with your customer. However, customers can still print or email that communication and there’s no point in undertaking RPA that can’t automate the whole process – you haven’t solved the problem of what happens when the capture of the data gets stalled or stopped. The assumption that robotics is AI and that you have to tell it exactly what to do is flawed – if you try to build a robot script that covers lots of routing, it’s a massive undertaking. Step forward AI and cognitive intelligence. Semantic meaning and cognition mean you can take a letter from a customer (not just a form) and interpret what type of communication it is. Our technology can discover whether an email or letter is a complaint just by its tone and language, for example. Of course, knowledge workers are way up the equivalent AI scale. Simulating one second of human brain activity takes 82,944 processors, demonstrating how powerful they are. But knowledge workers don’t always follow the rules. They adapt and learn but they have good and bad days and are difficult to scale. The objective for the next 5-10 years is, therefore, to use RPA to get people to do their jobs better. This implies a scaling down of manual processes but not a full replacement of human beings. RPA speeds up labour but is not intelligent. Its main advantage is that unlike humans it doesn’t get things wrong. The next thing to worry about is Superintelligence, a machine that’s smarter than people - but that’s at least 70 years off. Between now and then, RPA is here to assist us humans, not take away our jobs.
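To ground the earlier description of a software robot that “logs on to the system with a pre-recorded script … and types into the system”, here is a minimal, hypothetical sketch of that pattern using browser automation against a legacy web front end. The URL, field names and the captured record are invented for illustration; this is the general shape of the technique, not any vendor’s product.

```python
from selenium import webdriver
from selenium.webdriver.common.by import By

# Data that an intelligent-capture step has already extracted from an incoming letter.
record = {"customer_id": "C-10482", "new_address": "22 New Street, Leeds"}

driver = webdriver.Chrome()  # the "virtual worker" drives a real browser session
try:
    # Log on to the legacy system exactly as a human clerk would.
    driver.get("https://legacy-crm.example.internal/login")  # illustrative URL
    driver.find_element(By.ID, "username").send_keys("rpa_service_account")
    driver.find_element(By.ID, "password").send_keys("***")  # injected at runtime, not hard-coded
    driver.find_element(By.ID, "submit").click()

    # Re-key the captured data into the screen - bridging the gap with no
    # back-end integration work at all.
    driver.get("https://legacy-crm.example.internal/customer/update")
    driver.find_element(By.NAME, "customer_id").send_keys(record["customer_id"])
    driver.find_element(By.NAME, "address").send_keys(record["new_address"])
    driver.find_element(By.NAME, "save").click()
finally:
    driver.quit()
```

The fragility of this kind of script whenever a screen or routing rule changes is exactly why the article argues that cognition, not just scripted automation, is needed for anything beyond simple, repetitive tasks.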
### Brexit and the GDPR pose a risk for UK Cloud Providers The combination of the implementation of the General Data Protection Regulation (GDPR) next year and the UK’s decision to leave the European Union has created a unique and complex environment for cloud companies. Foretelling the exact outcome of the negotiations and determining what regulations will eventually prevail requires a crystal ball that none of us has. This is a significant challenge for an industry which is growing exponentially and is characterised by fast-paced deployment and the seizing of market opportunities. Here is my assessment of the current state of play and the likely effects on cloud companies as both Brexit and the GDPR loom closer. Signposts begin to appear There are still a lot of unknowns, but the introduction of the UK Data Protection Bill in September is beginning to offer organisations some idea of where the UK might deviate from the terms of the GDPR. It is encouraging that the UK’s Information Commissioner recommended that the Data Protection Bill be read in conjunction with the GDPR to give a full insight into the framework that will exist, at least initially. The Commissioner also stated that she hopes the introduction of the bill “will send an important signal about the UK’s commitment to a high standard of data protection post-Brexit. This, in turn, will play a role in ensuring uninterrupted data flows between the UK and the EU.” This could be seen to indicate that the UK is going to aim for an adequacy agreement with the EU Commission, meaning that the provisions in place in the UK for data protection will be sufficiently comparable to those of the GDPR to ensure that data can flow uninterrupted between the UK and the EU. This would be the preferable situation. A UK framework that does not meet the standards of the GDPR would cause serious challenges for companies doing business in both regions. Data has no borders The GDPR impacts everyone who wishes to offer goods and services to EU citizens. The inability to trade with EU citizens would be a catastrophic blow for UK businesses. Currently, the level of EU vs non-EU trade with the UK is at parity, so the loss of ability to trade with the EU is obviously something to be avoided. Therefore, compliance with GDPR is a must for cloud providers in order to protect their customers; it doesn’t matter whether the UK is in the EU or not – data has no borders. The difficulty lies in the uncertainty hanging over the approach that the UK will take to its data protection regime. Any requirements adopted by the UK that stipulate augmentation above the provisions of the GDPR will not increase the risk of non-compliance with the GDPR, although they increase complexity. However, any exceptions to the GDPR granted by the UK work conversely; they may reduce complexity for operations in the UK but they do increase the risk to cloud companies. Organisations operating only under the Data Protection Bill rather than the GDPR guidelines may risk falling foul of the EU regulation. Companies need to ensure that their cloud provider is conforming to the highest data protection standard (in the initial case, the GDPR) and also incorporating any regional variations without compromising that higher regulation.
If the two approaches to data privacy should drift to the point where they are incompatible, then cloud providers will need to ensure that they have an operation in the EU that can comply with the GDPR. Establishing a geographical presence in the EU The uncertainty surrounding GDPR and Brexit has already prompted those companies that have the resources to do so to future-proof their operations by setting up locations in EU territories. This strategy goes beyond simply what a cloud provider thinks is in its best interests, however. It’s a matter of remaining competitive in the new environment. Customers, as the data Controllers, will drive the cloud industry, as data Processors, to adopt GDPR-compatible levels of safeguards. If incompatibilities exist due to differing standards between the UK and the EU, UK cloud providers risk losing customers to EU-based competitors that can conform to Controller requirements. Market disruption is unavoidable An industry as dynamic as the cloud sector naturally undergoes rationalisation from time to time in response to changing markets and, as in this case, regulatory environments. I believe that smaller, regional-based cloud providers that are not able to keep up will begin to exit the market after May 2018. The regulatory overhead and staffing issues will force a pruning within the industry, both from within and outside of the EU. Large organisations and those cloud providers, like iland, who are well-prepared will fare well regardless, as they have the resources to ensure that they can deliver compliance and geographic data sovereignty, whatever the regulatory environment throws at them. They can cherry-pick the best talent from a pool which is currently quite limited, and in this way may squeeze smaller players out of the market. Here at iland, we will be keeping a very close eye on pronouncements coming from the UK government, UK Information Commissioner and the EU Commission in an attempt to counter some of the uncertainty and make good business decisions. There’s no doubt that GDPR and Brexit together will be a major disruptor of what the landscape looks like today – and careful preparation is key to ensure uninterrupted service and protect customers’ data after May 2018. ### Cloud Computing Leaders Announce New Cloud Analytics Academy AWS, Looker, Talend and WhereScape have teamed up with Snowflake to launch the Cloud Analytics Academy, a training and certification program for data professionals who want to advance their skills for the technology and business demands of today’s data analytics. The academy provides online training courses for technical experts, business leaders, analysts and business intelligence professionals, who can receive up to three Cloud Analytics Academy certifications upon completion. Anyone who completes all three tracks will achieve Academy Master certification. The adoption of cloud computing continues to evolve at a rapid pace. Gartner expects the worldwide software-as-a-service (SaaS) market to grow more than 63 percent by 2020.
The Cloud Analytics Academy is designed to provide data professionals with the most up-to-date information, training and best practices for cloud analytics. Data professionals of all technical and business levels and backgrounds can complete any or all of the following Academy tracks:
- Executive Fast Track – Learn the key technologies and techniques to foster an effective cloud analytics team.
- Cloud Foundation Track – Become proficient with the fundamental building blocks of cloud analytics.
- Modern Data Analytics Track – Learn advanced technical concepts to propel your cloud analytics.
“Organisations of all sizes now face enormous pressure to deliver analytics faster and at a lower cost than ever before, and many are looking to the cloud to address these challenges,” WhereScape CEO Mark Budzinski said. “We’re excited to partner with Snowflake to help data professionals gain the knowledge needed to maximise the benefits cloud data warehousing offers. We also want to help data professionals understand how automation can help IT be more agile in their development and operational efforts to deliver value to the business faster.” Snowflake’s Chief Technical Evangelist and Cloud Analytics Academy curriculum designer, Kent Graziano, said: “The demand for cloud data analytics professionals continues to grow at an astounding rate as organisations migrate to the cloud and look for new ways to gain insights from their data. The Cloud Analytics Academy is the perfect environment for high achievers to sharpen their skills, expand their knowledge and become go-to experts in building data solutions for the cloud.” The Academy curriculum features experts from Snowflake, Amazon Web Services, Looker, Talend, WhereScape, Duo Security, Age of Learning, Sharethrough and YellowHammer. Best-selling author and acclaimed professor Tom Davenport will present the opening keynote on the growth and current landscape of cloud analytics. The current curriculum is the first iteration of the Cloud Analytics Academy’s continued training initiative, with new courses and certifications to be added in the coming months. To learn more about the academy courses and instructors, or to sign up, visit: www.cloudanalyticsacademy.com. ### Why securing IoT takes a village The consequence of innovation moving at lightning speed has meant that too often security considerations have become an afterthought. The recent AT&T Cybersecurity Insights Report tells this exact story, with 85 percent of enterprises in the process of or intending to deploy IoT devices, despite only ten percent feeling confident that they could secure those devices against hackers. Just as the web was designed for collaboration and sharing, IoT devices are designed for interoperability and machine-to-machine communication. In fact, in some cases, the devices were never intended to make use of the internet at all and have been retrospectively ‘connected’ to the internet; often presenting further challenges for those hoping to secure the environment these devices exist in. Even when security is front of mind, nothing is 100 percent secure. When you consider that an attacker only needs a single vulnerability to gain a foothold within the network, combined with the fact that the IoT world is so intrinsically linked together, the damage that could be done once an attacker is within the network can be devastating.
With IoT, the most important part of security is understanding that it can’t be solved with a single solution or by one company in the ecosystem; it takes a village. Who is responsible? Unlike with other technologies, in IoT it’s often difficult to pinpoint who is responsible for security. If your coffee machine breaks, for example, you take it back to the manufacturer. If it’s connected to the internet, however, there are many more stakeholders that could be responsible for the malfunction – often IoT devices don’t perform as they should due to an error in the software. In terms of security, there are manufacturers involved, application developers, cloud platform providers and other third parties, all of which could have some level of responsibility. It's also important to understand that end-users often don’t see the layers of responsibility. A car buyer, for example, may well assume that the automotive firm is responsible if its connected car is faulty when, in fact, the automaker may have little expertise in IoT security. What’s more, regardless of this, they will still be held accountable by the user. Breaking it down There are three important factors of IoT security: device, network, and cloud. For the device, it is important to consider the use case, what data is collected and its ubiquity to understand the level of risk. With numerous security solutions that can be deployed – authentication, user access, application access, device lifecycle management and data encryption, for example – often a cost-benefit trade-off is required between protecting everything and paying for everything. A sensor tracking radiation from a nuclear power plant is arguably more sensitive than that from a farmer’s weather station, for example. The movement of data as it is transported to the cloud and delivers IoT services must also be secured. Devices can connect to the internet via cellular, Wi-Fi, Bluetooth, LPWAN or even satellite; all of which have their own security implications. Across the board, data in transit should always be encrypted and passed in secure private networks, rather than openly sent over the internet, and users should be required to verify and authorise devices on both the network and applications within the network. With cellular connectivity, a layer of security is already built in thanks to global standards which include ciphering keys and encryption algorithms on the SIM to securely transmit and receive data. To secure the broader cloud environment it goes without saying that standards such as ISO/IEC 27001 should be adhered to, but organisations also need to get granular with controls for the IoT applications themselves. Role-based access should inform the identity management and access control strategy, and anomaly detection with a degree of automated remediation will also add additional protection for the broader security of the IoT portfolio. It is fundamental that all those within the IoT community take notice of the security provisions needed to properly protect the ecosystem. With much of the technology still in its infancy, it’s early days when it comes to understanding the security significance of having so many devices tied together.
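The point above about encrypting data in transit, and about devices proving their identity before they are allowed onto the platform, can be sketched in a few lines. The example below is a minimal, hypothetical illustration: a plain HTTPS POST with mutual TLS stands in for whatever transport the device actually uses (MQTT, CoAP and so on), and the endpoint, topic and certificate paths are all invented.

```python
import requests  # standard HTTPS client; used here purely for illustration

TELEMETRY_URL = "https://ingest.example-iot.net/v1/telemetry"  # illustrative endpoint

reading = {"device_id": "weather-station-042", "temp_c": 11.4, "rain_mm": 0.2}

# Everything leaves the device over TLS, and the device authenticates itself
# with its own certificate (mutual TLS) rather than a shared password, so
# readings are never sent in the clear and rogue devices can't join unnoticed.
response = requests.post(
    TELEMETRY_URL,
    json=reading,
    cert=("/certs/device-042.crt", "/certs/device-042.key"),  # illustrative paths
    verify="/certs/iot-ca.pem",  # trust only the platform's own certificate authority
    timeout=10,
)
response.raise_for_status()
```

The same per-device credential can then drive the role-based access and anomaly detection mentioned above: a sensor that suddenly starts posting to topics it has never used is easy to flag when every message is tied to a verified identity.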
As a minimum, all user data should be encrypted, ‘personal’ and regulated data should be treated according to its own local privacy and data protection rules, and identification and authentication of all those within the ecosystem is paramount. Securing connectivity is vital and an IoT connectivity management platform, with rules-based security policies that identify and act on anomalous behaviour from connected devices, should be a top priority. ### Black Friday Scam Warnings - How to Stay Safe Black Friday is just around the corner on 24th November 2017, and it is set to send consumers into a frenzy of deal searching and sales hunting this year, but in the chaos, it is important to remember that not all deals are as they seem. With brands such as Amazon, Dell, Kohl’s and Microsoft already advertising around their Black Friday sales push, it is relatively easy for hackers to predict the type of emails that are likely to be sent out and line up a phishing campaign accordingly. The top two cyber scams PhishMe advises internet users to look out for ahead of Black Friday are:
1. Fake survey to earn reward – Emails pretending to be from a legitimate company that ask recipients to complete a survey in return for a reward are often sent by spammers. These surveys never deliver on the reward, and a great deal of personal information is collected in the process of completing them that can then be utilised by the cybercriminals or sold to other companies.
2. Impossibly good deals… on sketchy websites – Some emails may link through to websites that have no intention of providing goods – counterfeit or legitimate – at all. Instead, the URL directs to a non-secured website, using an IP address rather than a domain name. While the same goods may be discounted for Black Friday on the legitimate site, this is likely to be at a much lower rate than on the fake site.
Aaron Higbee, CTO of PhishMe, offers the following advice on Black Friday scams: “The old adage, ‘if it seems too good to be true, it probably is’ stands true with most of the Black Friday cyber scams, but it is important for consumers to become conditioned to recognise the signs of a fake deal. Making sure you’re on a reputable online store is the first step to securing your Black Friday shopping and this can be verified by ensuring the link is consistent with the webpage. While you’re looking at the URL to make sure you are where you think you are, take note of the ‘s’ in ‘https’ which stands for secure and check that a green lock appears with green text in your browser’s address bar. This will ensure that any credit card information is protected with encryption and that you are dealing with a reputable retailer as the site’s owner has been verified. There is no doubt that Black Friday can be a great opportunity to snag a good deal during your holiday shopping, but keeping these simple steps in mind for checking the authenticity of deals could be the difference between saving you a few pounds and costing you a gold mine.” ### NewVoiceMedia named Most Innovative Tech Firm in Tech of the Year Awards NewVoiceMedia, a leading global provider of cloud contact centre and inside sales technology that enables businesses to have more successful conversations, has won a 2017 Tech of the Year Award for Most Innovative Tech Firm.
Launched by monthly publication Corporate Vision this year, the Awards identify and showcase the most talented and forward-thinking individuals and businesses in the technology industry, and commend them for their efforts over recent months in terms of innovation, stellar customer service and industry-leading products and services. Winners were chosen through a combination of votes from Corporate Vision’s network of industry partners and the magazine’s own thorough in-house research process. NewVoiceMedia’s ContactWorld technology is an intelligent, multi-tenant global cloud contact centre and inside sales platform that joins up all communications channels and plugs straight into an organisation’s CRM platform for full access to hard-won data. With a true cloud environment and proven 99.999% platform availability, ContactWorld ensures complete flexibility, scalability and reliability. The company now serves more than 650 customers worldwide, including MobileIron, Lumesse, Vax, JustGiving and Canadian Cancer Society. The award win follows several recent victories for NewVoiceMedia including being named by Forbes as one of the world’s top private cloud companies and honoured in the Sunday Times Hiscox Tech Track 100, an annual league table which ranks Britain’s private technology, media and telecoms companies with the fastest-growing sales. “These achievements underscore our commitment to offering the best possible technology on the market for driving more successful conversations for sales and service teams,” said Dennis Fois, President and COO of NewVoiceMedia. “Our cloud contact centre and inside sales platform is attracting some of the world’s highest-growth businesses as we continue to drive innovation that transforms the way they connect with their customers and prospects.” ### Conjuring up some Magic Debbie McGee’s performances on Strictly Come Dancing this year have been nothing short of magical. Until, of course, Saturday night’s show where she and her professional partner, Giovanni Pernice, scored a meagre (by their standards) 33 and found themselves in the bottom two dance-off. Debbie’s late husband, Paul Daniels, probably would have liked the performance. But not a lot. All the dancers on the show - both professional and celebrity - talk about the magic of dancing at the Tower Ballroom in Blackpool, the spiritual home of Ballroom Dancing. Each year on the Blackpool Special week it’s almost as if the trip up north is a pilgrimage for believers in the Magic of Ballroom. So for Debbie’s magic touch to desert her on such an important week was a huge shock and surprise. It’s almost enough to make you question if magic is even real. Some true believers in magic, however, are the guys and gals over at Canonical, the company behind the popular Ubuntu Linux distro. They have developed Conjure-Up - a command line tool written in Python that lets you summon up big software with a so-called “spell”. Conjure-Up is an installer for complex topologies of interconnected software, like OpenStack, Kubernetes or Hadoop: the “spells” are a list of instructions of which applications to deploy and how to deploy them. As their tagline says, “start using your big software instead of learning how to deploy it.” Wise words indeed. There is no point in having software if you can’t use it! And why spend so much time and effort configuring it when you can cast a spell and have it done for you?
Although there are a number of spells in Conjure-Up’s GitHub repository, focusing on one of the main spells - a spell to handle the deployment of OpenStack - is especially interesting, as OpenStack has been hailed in some quarters as the future of cloud computing. It is an open-source platform for cloud computing that allows virtual servers and other computing resources to be managed and served. Part of its appeal is that it is available licence-free and so has a competitive advantage over other proprietary solutions, such as VMware. OpenStack was first released in 2010 as a joint venture between Rackspace and NASA. The OpenStack Foundation was created a couple of years later and now has more than 500 companies participating in the project - including some of the biggest companies in hardware, software, and hosting, for instance, IBM, Intel and Dell EMC. Casting the OpenStack spell on conjure-up allows you to quickly and easily build a modern, scalable and repeatable cloud infrastructure. Going back a couple of years, research regarding OpenStack commissioned by Suse Linux (results published in January 2016) found that half of all enterprises that tried to implement OpenStack failed. Furthermore, they found that 65% of companies said that the implementation experience was difficult. Until now OpenStack revenue has been mainly from service providers offering multi-tenant Infrastructure-as-a-Service. But recent research has found that growth in OpenStack is now moving towards private cloud quicker than previously anticipated. In fact, it is now predicted that revenue for OpenStack from private cloud will surpass revenue from service providers using public cloud in 2018. Returning for a brief moment to the Suse research, even back in 2016 it was discovered that 44% of companies planned to download and install OpenStack themselves. Bringing together the fact that OpenStack in private cloud is growing quicker than thought and that it’s likely that a significant percentage of companies will attempt to implement OpenStack themselves, it’s not difficult to see why conjure-up is going to be important as 2017 winds down and we enter into the new year. As Arthur C. Clarke’s famous third law states, “any sufficiently advanced technology is indistinguishable from magic.” Of course, conjure-up isn’t really magic. When it comes down to the nitty-gritty of it, conjure-up is merely a simple command-line installer tool. But it does take away a lot of the pain of deploying big software and if OpenStack really is going to be the future of cloud computing, having a tool that lets you get started using your software by reducing the time and effort required to get it set up really is the equivalent of having a magic wand in your armoury. And if, moving into 2018, you are one of the companies that will take OpenStack’s private cloud numbers ahead of its public cloud numbers by building and managing private cloud infrastructure, you’ll certainly enjoy conjure-up. As Debbie McGee’s late husband Paul Daniels (probably) never said, “You’ll like it. A lot.” Now that's magic. ### A Tale of Two Cities: Who stalks more? We all know to do our research before going to that job interview or important client meeting. Did your research involve stalking your target though? No, because we are taught that stalking leads to restraining orders and the opportunity to shower with people twice the size of us.
In an era where social media is fast becoming intrinsic to our day to day lives, there is generally a lot of information about us online because of what we post. Recently, a woman lost her job because of a photo she put on Facebook of a party in the care home she worked for. Puggles the kitten might be cute to show your friends, but is it worth getting your P45 over? Ok, Puggles is probably safe to post, but where do employers draw the line? What do business associates look for in your profile when seeing if you’re right for them? Another reason people look at others online is to find the best way to sell to them. Have an interest in League of Legends? Have you heard about the new Razer BlackWidow keyboard? And whilst we’re at it, why not indulge your gaming side even more by pre-ordering Razer’s new smartphone? There is a fine line between looking at someone as a potential investment and snooping, or even potentially discriminating (when looked at from an employment point of view). Yet companies on both sides of the pond are turning to social media stalking. Now, I’m not here to preach. Get fired – I don’t care – but at least be educated before you go. Before I lead us on to my favourite part of the blog – the stats – let’s talk details. The findings are from a survey conducted by CITE Research, on behalf of SugarCRM, that looked at 400 business professionals (200 from the Queen’s country and 200 living the American Dream). Those who partook were employed full time at director-level or above in a sales or business development role, working for a company with at least 100 employees. The main purpose was to try and define what technology stacks should look like for a modern sales team. As technology progresses, the sales role is ever-changing and it’s rather unclear to a lot of business leaders how to keep that department successful. Hard sales is gone. No-one wants a toaster anymore, they want a service. They still want a product, but due to the technicalities of what’s on offer to today’s crowd people want - and need - aftercare. Aftercare for a good price at that. It’s too easy to look around at the competition thanks to Google, comparison sites and the largest online social connection history has seen to date. Brand loyalty seems to have gone out the window almost entirely, so what’s next? Let’s get back to snooping, shall we? America – 60% of your companies are turning to Facebook as their tool of choice when researching for prospective client meetings. Britain – 46% of your companies are doing the same. This leads to an average of 53% of companies using Facebook. I don’t know about you, but the thought that, for every two meetings I have, one of those contacts has looked through my Facebook is slightly distressing. The chronicle of my late teens, documented by angsty photos and opinionated comments, is something I rather wish didn’t exist anymore. My early twenties, of National Trust landscapes and photos of my dog (a very handsome boy I might add), is fine though. It’s not even a quick search either: 72% of research sessions last 30 minutes or more, whilst a staggering 49% spend at least 45 minutes in the following areas:
- LinkedIn is king (or queen) of the castle with an aggregated score of 64%.
- In a close second comes a company’s website at 63%.
- Google is third on 61%.
- The aforementioned Facebook is fourth at 53%.
- Twitter even makes an appearance, coming in fifth with 34%.
Research shows that millennials are more likely to look for their prospects on the social media giants Facebook (59%) and Twitter (41%), whereas the over 55 crowd favour the company website (83%) and LinkedIn (76%). So what does this all mean? I feel it’s easy to put a negative spin on things. At the end of the day, we all need to keep a closer eye on what we post online, but once we do it leads to quite a positive outlook. We are paving the way for future tech that looks to not only improve our journey as a customer but also give us access to tools that help both us and our companies. It generates opportunity for you as entrepreneurs to enter the market and turn sales on its head. Become the stalker, not the stalked. If you have any comments, or if you want to get in touch and talk tech (or my dog), stalk me on Twitter, LinkedIn, Google, and Facebook. ### Growth, investment and relocation to Wales accelerated by new IoT programme Internet of Things (IoT) start-ups across the UK can apply for an accelerator programme based in Wales, launched today to provide promising businesses with the financial investment, invaluable knowledge and extensive network needed to accelerate growth. The ‘Internet of Things Accelerator Wales’ (IoTA Wales) is actively seeking promising innovators outside Wales to sign up to this unique Accelerator, a collaborative initiative that has been jointly created and designed by The Accelerator Network, Innovation Point, Barclays Eagle Labs, Inspire Wales and the Development Bank of Wales. The 12-week programme aims to identify and support up-and-coming IoT businesses in key areas such as cybersecurity, energy, manufacturing, education and agriculture, during the exciting but challenging start-up journey. The programme will accept ten promising businesses to receive a £50,000 investment for a 10% equity stake in the enterprise, as well as access to world-class facilities, academic expertise and free creative office space to promote intensive growth. Ian Merricks, Chair of The Accelerator Network, said: “The appetite amongst tech entrepreneurs has never been greater, with the number of individuals turning their business idea into a reality growing year on year - yet still 50% of all start-ups in the UK fail within the first five years. “The IoTA Wales is responding to this through an intensive-growth programme that is specifically designed to help make the IoT start-up journey a shorter and more successful one.” In 2015, 68% of digital investment in the UK was outside of London, and the latest Tech Nation report found that Wales has the fastest growing digital economy outside London. The Cardiff and Swansea tech cluster creates £392m annually in GVA, employs 17,000 people and saw 103 start-up births last year. IoTA Wales comes at a transformative time for Wales’ tech scene, following September’s signing of a deal for a south Wales home for the world’s first compound semiconductor cluster, CS Connected, the first investment under the £1.2bn Cardiff Capital Region city deal programme. The cluster is expected to attract £375 million in private investment and create 2,000 well-paid and highly skilled jobs in the region, working on the technology behind the future of robotics, AI and driverless cars.
The organiser of London’s biggest IoT meet up, Alexandra Deschamps-Sonsino, named the world’s most influential Internet of Things (IoT) expert (Postscapes, 2016), claims Wales has a competitive advantage over London which it needs to capitalise on to make the most of the trend’s opportunities. She said: “You can make a profitable business in this space by manufacturing just 5000 pieces and through the production of the Raspberry Pi, Wales has the manufacturing capability which we just don't have in London. That bit is potentially the hardest piece of the jigsaw.” Secretary of State for Wales Alun Cairns said: “Wales is a dynamic nation that is home to some of the most innovative companies in the world. As we prepare to leave the EU, I want Welsh innovators to seize the opportunities available to expand their companies and help us trade on the global stage. Industry and academia can work together to keep Wales at the forefront of technology and I hope as many companies as possible apply.” IoT connects tangible devices, systems and services over the internet, enabling them to collect, share, analyse and act on data. This includes objects ranging from phones and fridges to wearable tech and medical devices, enabling smarter homes, cities and transport to become a reality. To find out more about IoTA Wales, visit: https://www.innovationpoint.uk/iotawales Applications to be a part of IoTA Wales are now open via https://www.f6s.com/internetofthingsaccelerator-cardiff/apply and will remain open until midnight on Saturday 9th December 2017. Businesses from across the UK are able to apply to be on the programme, but those from outside of Wales must be willing to relocate. David Warrender, CEO of Innovation Point, who is delivering the programme, said: “Launching the nation’s first IoT accelerator programme here in Wales is no mean feat, and is yet another example of the country’s growing role in the digital revolution. “The programme will have an intense focus on IoT security-related devices and applications, providing high-potential applicants with access to a globally proven network of business mentors, high calibre partners and potential investors - all critical components of accelerated success.” Major organisations already signed up as delivery partners on the programme include Sony, Airbus Group Endeavr Wales, Cardiff University, Sorensen, Business Wales, TechUK, Pinacl and RS Labs amongst others. Businesses lucky enough to be selected as part of the start-up cohort will benefit from free creative office space at the latest Barclays Eagle Lab in central Cardiff. Steven Robert, Head of Strategic Transformation at Barclays Eagle Labs, said: “As well as providing an inclusive environment that fosters innovation and promotes shared growth, businesses will have access to a “Maker's Lab” set up, plus support staff and structured support programmes to help with the rapid development of IoT prototypes. “As ideas and products progress, businesses will also benefit from access to additional state of the art IoT labs at Cardiff University, which is making an in-kind contribution that includes academic expertise.” ### GDPR Unready - Cash Converters Data Breach Is a Sure Warning The quite remarkable results of research this week show the shocking state of preparedness of UK firms with respect to GDPR. This is only further highlighted in 'the real world' by a data breach from a large high-street organisation.
One survey found that only 18% of UK large and multinational organisations were 'highly confident' about meeting the GDPR deadline of next May. The fact that only 25% of law firms surveyed feel GDPR ready, despite being a more favourable figure, is perhaps the more worrying statistic. If the specialists in law are struggling, how is everyone else supposed to be ready? Yesterday's news that high-street pawnbroker Cash Converters suffered a serious data breach has highlighted just how unready the vast majority of businesses are. Not just because it occurred, but also where it occurred. The fact that the breach occurred on an old system that was no longer used by the public brings to light the fact that there is just so much data in so many systems that it's not just your bright, shiny new systems with state-of-the-art security that you need to worry about. Jon Topper, CEO of UK tech company The Scale Factory, has said, “When migrating away from old solutions it’s important to bear in mind that old digital assets will still be running and available online until such time as they are fully decommissioned. As a result, they should still be treated as ‘live’, which means maintaining a good security posture around them, keeping up with patching and so forth.” There's so much data in so many places, it's tough to keep track of it all. However, just as ignorance of the law is no defence, the initial response from Cash Converters that the website that was hacked was being managed by a third party just won't cut it under GDPR regulations: if it is your data to look after and keep secure, there are no mitigating circumstances. From the 25th May 2018, under GDPR, the maximum fine rises from half a million pounds to €20 million or 4% of annual global turnover. Under the current Data Protection Act, the Information Commissioner's Office has several options when it finds an organisation is in breach of the act - including imposing a monetary penalty of up to £500,000. From the 25th May 2018, under GDPR, the maximum fine rises from half a million pounds to €20 million or 4% of annual global turnover – whichever figure is higher. A very substantial increase. The full fallout of the Cash Converters data breach isn't yet known*, but if it is serious and if this attack had happened in six months' time, would the full force of GDPR be brought down on them, or will there be a 'softer' approach to the first few months considering the seeming lack of preparedness across the board? Cash Converters has already taken a huge hit in recent years. Following the introduction of the Financial Conduct Authority's rate cap on High-Cost Short-Term Credit in January 2015, they made the decision to cease offering personal loans in store and online, effectively ending their biggest source of turnover. They stopped lending towards the end of June 2016, just before the end of their financial year. In their accounts, they state that in the year ending 30th June 2016 their turnover from 'continuing operations' was £13.9m – but that the turnover from 'discontinued operations' was significantly higher at £26.3m. Yesterday's headlines certainly won't do them any favours. I feel sympathy for many of their customers who have felt the need to take out high-interest loans in the past - especially if it is their data that has been stolen in this attack. 
On the face of it, it may not seem that cybercriminals would choose to target people who are struggling financially, but only the desperate would choose to turn to such high-interest loans - and desperation makes people especially vulnerable. If this is a planned attack by cyber-criminals looking to utilise what they can find for financial gain, then they may have chosen their perfect targets. However, as Andre Stewart, VP at Netskope, points out, “While many Cash Converters customers may be wondering if their username and password are among the stash of stolen data, the fact is that the stolen credentials shouldn’t give any cause for concern – if basic cyber hygiene procedures were followed. But what are the chances that those passwords have been used for multiple accounts and remain the same? The truth is that we make it too easy for cyber attackers to tap into our online accounts and data by leaving our log-in credentials unchanged for years at a time, using the same details across accounts or choosing insecure passwords which are far too obvious." This is certainly sound advice for the here and now in a data-driven world that relies on the password to protect ourselves. But perhaps the password has had its day? As James Romer from SecureAuth points out in a recent interview at IP Expo 2017 relying on a username protected by a password is extremely vulnerable to thefts such as Cash Converters (or any of the myriad of security breaches over recent years) - once the username and password is in the hands of the criminals, they have access to whatever system they have hacked. What they can't steal, however, are your regular patterns of behaviour - instead of just relying on a password, 'a behaviour profile' can offer better protection. Any deviation from your normal pattern of behaviour and a red flag is raised. Passwords aren't going to disappear anytime soon - but as the sole means of protecting our data, their days will have to eventually come to an end. Even simple 2-factor authentication has its limits and will have to be superseded by a stronger strategy. Data is all around us - more than that, data is integral to everything we do in the modern world. GDPR makes clear, in no uncertain terms, that how organisations protect that data is of paramount importance. But so is how we ourselves, as individuals, take responsibility for our own security. Technological and strategic advancements will improve matters over time. But, as we know, the cybercriminals will never be too far behind, no matter what rules and regulations are put in place. * Update - just prior to going to press, we received this comment from Jason Hart, CTO of Data Protection, Gemalto: “This is yet another case of a company not protecting the sensitive customer data it holds. While no credit card information was taken, hackers were able to access usernames, passwords and addresses, which can be used to launch social engineering attacks. This should serve as yet another wakeup call that businesses need to protect this type of data at its source. Through methods like encryption, hackers may be able to take the data, but not actually be able to read it, ensuring it can’t be used. 
It’s incredibly frustrating to see these attacks continue to hit the headlines, given the relative ease of methods that are out there now to prevent them.” ### How to save a business from a cyber-attack Cyber-attacks are on the rise, and unfortunately the lack of awareness in smaller businesses, and the ‘it won’t happen to me mindset,’ means that they have become an increasingly attractive target for cybercriminals. Given their size, it’s unusual to find an SME that has access to a dedicated specialist IT team, equipped with the skills and experience necessary to deal with complex data breaches, meaning they are especially vulnerable. The consequences from this lack of preparation can be catastrophic, with reputations at risk as well as potential legal damage. Changes to technology have meant that data is now more valuable. We can use consumer data to fuel complex metrics and analytics, enabling us to know more about our consumers. The more we know about our consumers the more we are able to tailor promotions to them, increase sales and profitability, and subsequently drive the price of data upwards. Despite the many benefits of using data properly, the changes in the way we work means that in some cases our data could be more vulnerable than before. Businesses have started to move away from outdated legacy systems and towards cloud services, allowing greater flexibility to employees, and offering a more efficient way of storing data. However, many firms are turning to public clouds, which unlike their private counterparts, are often riddled with vulnerabilities and security challenges. Building walls There are dedicated products available to help businesses protect themselves from attacks such as Disaster Recovery as a Service Products (DRaaS), which protect the cloud by mitigating potential disastrous consequences of a security breach. Demand for these solutions is gaining momentum as firms begin to implement adequate controls and it is expected that DRaaS solutions will account for 90% of disaster recovery operations by 2020. But what else can businesses do to help prevent and minimise damage, should they become the victim of a cyber-attack? Awareness A successful data breach strategy begins with awareness. Employees need to be aware of the value of data, the increasing reality for all firms of cyber-attacks, and the unfortunate consequences that can come as a result. Firms must go beyond the obligatory requirements such as ISO and create a culture that understands the importance of information security. Whilst this may seem obvious, there are examples where lack of employee awareness has had unfortunate consequences. The recent Equifax hack saw 143 million US customers have their personal information stolen between mid-May and July and is one of the worst attacks of its kind. This month it was also announced that 700,000 UK customers may also be at risk from the hack. However, following these attacks, further mistakes were made. Post-hack, Equifax created a website for customers affected by the data breach which offered information and advice. However, website phishers easily replicated the new site, copying its look and feel and offering wrong information to customers. Not only did this dupe Equifax customers, but it also fooled their employees. Employees posted and promoted the link to the fake website on social media, directing thousands of customers towards the page. 
This highlights the need for employees to be aware of the wide-ranging forms that hacks can take, from data breaches to phishing and everything else in between. The cyber landscape is constantly changing, and employees need to be up to speed with the latest happenings, knowing what to look out for and how best to deal with these situations. The Equifax example shows that cyber-attacks are not just one attempt to sabotage a firm, but can come in waves that unless adequately prepared for, can be catastrophic. Increased regulation The regulatory landscape is in constant flux. High profile regulation GDPR comes into play next year, which outlines a new set of rules on how businesses store and use data regarding all EU citizens. Firms that don’t comply could be facing fines of up to £17 million or 4% of annual turnover, whichever is higher. Additionally, firms have only 72 hours to report a cyber-attack to both the regulator and consumers. When preparing for regulation, it’s important for organisations to bring it back to basics, security gaps often come to light as firms get to grips with a new requirement. To ensure full compliance firms should take a 5-step approach; assess, implement, educate, maintain, certify. Begin by assessing the current situation. What regulation do you currently comply with and what changes are involved in the new regulation? Undertake a risk assessment and analyse the gap between your IT system and new governance – what do you need to do to fully comply. Once you have assessed the situation you need to establish a governance framework and implement IT systems to ensure that you have the infrastructure to cope with regulatory requirements. Raise awareness of new systems by offering awareness training to all staff, highlighting the capabilities of new systems and all governance rules. Whilst data security is everyone’s responsibility, to ensure constant compliance it’s recommended that businesses give someone the role of maintaining information security, in some cases this could take the form of appointing an Information Security Manager who can monitor regulation and ensure ongoing compliance. Once compliant, organisations can then look to certify themselves against regulations such as ISO27001. Quick Reactions Speed is key when dealing with a data breach, reducing data recovery time to a minimum is essential for businesses to maintain access to information and stop potential hacks. Currently, statistics show that on average someone is in your system 40 days before they are detected. It goes without saying that the longer someone is in your system the more damage they can cause, and therefore firms need to be quick to realise that they are under threat. To overcome this, all employees need to be aware of what to look out for when detecting a cyber-attack. Often it’s the case that employees don’t know what ‘normal’ looks like and therefore should an abnormality occur, they may not notice. To ensure effective risk management, employees need to have an awareness of how systems should appear, and the type of changes to look out for. Thus increasing the chances of spotting them quicker. The backbone of security The changing approach of hackers means that businesses need to continually revisit their strategies to ensure they have the best approach in place to deal with security breaches, incorporating new trends and ensuring their technology is sophisticated enough to deal with increasing demand. 
This can be hard for small businesses with small budgets and even for large corporations that have vast amounts of data and complicated systems. Fortunately, security solutions and providers can help businesses prevent future attacks by examining network traffic for known attack patterns, analysing trends and monitoring the methods of attack. With the right partnerships, firms can maintain constant compliance, with the partners acting as the 'backbone' of their security needs, helping to prevent attacks and minimise damage should a company become the victim of a data breach. ### Rackspace Announces Completion of Datapipe Acquisition Strengthens Commitment to Become the Global Leader in IT as a Service LONDON – Nov. 16, 2017 - Rackspace today announced that it has completed its transaction to acquire Datapipe, a managed services leader across public and private clouds, managed hosting and colocation. Through this acquisition, the largest in its history, Rackspace expands its capabilities as the world’s leading provider of managed hosting and private cloud solutions, which more companies are using as they move critical workloads out of their corporate data centres. Rackspace also becomes the leading provider of managed public cloud services across all the top public cloud infrastructure platforms, including Amazon Web Services, Microsoft Azure, Google Cloud Platform and the Alibaba Cloud. "This acquisition demonstrates our commitment to become the world’s number one provider in IT as a service" “This acquisition demonstrates our commitment to become the world’s number one provider in IT as a service,” said Joe Eazor, Rackspace CEO. “Datapipe brings important new capabilities to Rackspace that will enable us to better serve customers, globally and at scale. Together, we will build on the industry leadership both companies established in expertise, reliability, security and support, to create a new level of end-to-end customer experience.” Rackspace will begin the integration process for Datapipe immediately in a planned way that maintains and enhances support levels for customers. Datapipe leaders joining the Rackspace executive ranks include Joel Friedman, now chief technology officer of Rackspace; Dan Newton, senior vice president, account management and service delivery; and Dan Tudahl, general manager, government solutions. ### OpenText joins Dell EMC Select Partner Program OpenText, the global leader in Enterprise Information Management, has today announced that it has signed a definitive reseller agreement with Dell EMC establishing OpenText as a reseller partner within Dell EMC’s Select Partner Program. LONDON, UK – November 16, 2017 – OpenText™, the global leader in Enterprise Information Management (EIM), today announced that it has signed a definitive reseller agreement with Dell EMC establishing OpenText as a reseller partner within Dell EMC’s Select Partner Program. The agreement initially includes OpenText InfoArchive® to help enhance Dell EMC offerings focused on IT transformation. This combination paves the way to help enterprise customers modernise and transform IT infrastructures. InfoArchive will be immediately available through Dell EMC’s sales channels. OpenText InfoArchive and Dell EMC’s services and storage solutions enable enterprises to securely manage legacy information from multiple applications and sources, without the requirement to maintain expensive legacy applications.  
“A modern digital strategy enables enterprise customers to simplify IT, reduce operational costs, streamline compliance and deliver business insight,” said Adam Howatson, CMO, OpenText. “A major roadblock to realising complete digital transformation of an organisation is the enormous volume of data and content contained in legacy systems.”  “The agreement between OpenText and Dell EMC is the first stage of a dynamic and growing relationship between the two companies. The combination of OpenText’s leading EIM platform, and Dell EMC’s leading storage services and solutions, will enable our customers to meet and exceed their digital transformation goals,” said Howatson.  “We are excited to partner with OpenText to deliver an archiving platform to support our customers’ application transformation requirements,” said Mike Arterbury, Vice President, Technology Alliances, Dell EMC. “We look forward to working closely with OpenText to engage in opportunities where we can maximise our collective resources to solve customer problems.” OpenText InfoArchive enables enterprises to target applications suitable for archiving and retirement, and manage all associated structured and unstructured information. InfoArchive can assist enterprises in the requirement to retain legacy and past-era applications, while providing continuous access to valuable data and information, and ensure compliance with governance and retention policies. InfoArchive works with Dell EMC’s storage platforms such as Isilon and ECS, and Dell EMC’s Application Archiving and Retirement Service, to enhance customer compliance, accessibility and information analytics capabilities. ### Cash Converters Suffers Data Breach From an Old Site The words written by Cash Converter’s Social Media team this morning, “Good morning all. We are here to answer any queries you may have until 5.30,” seem happy and pleasant enough. But their main feed has been pretty quiet ever since. Their replies feed, however, has been pretty active – unfortunately, they are all variations of, “We are happy to discuss this with you over the phone or email.” The company has admitted today that customer usernames, passwords and addresses may have been taken by a third party. Data breaches from live sites are embarrassing enough, but it has emerged that this unauthorised access was to an old site which is no longer in use by customers, but was still online. Jon Topper, CEO of UK tech company The Scale Factory has said, “When migrating away from old solutions it's important to bear in mind that old digital assets will still be running and available online until such time as they are fully decommissioned. As a result they should still be treated as ‘live’, which means maintaining a good security posture around them, keeping up with patching and so forth” In their customer notification, Cash Converters were quick to point out that the old site was operated by a third party, possibly intending to deflect responsibility for this breach, which definitely won't fly under GDPR regulations coming into force next year. 
Companies running server infrastructure that handles customer data should be engaging with experts to review their security posture ahead of that, in order to avoid being slapped with a large fine. With recent reports and studies suggesting that only a fraction of large UK and multinational organisations are 'highly confident' over GDPR compliance before next May's deadline – and perhaps more worryingly still – only 25% of law firms surveyed are ready for GDPR, issues surrounding the security of personal data will only come under the microscope more often in the coming months. ### 2018 will be the year of 'breaking down the monolith' Cloud adoption continues at pace as we move to an everything-as-a-service economy. To be successful, today’s organisations must focus on engaging and delivering continued value to their customers. This means businesses have to change the way they operate, with a focus on delivering what the customer wants as quickly and efficiently as possible: engaging with customers to achieve an outcome and building ongoing relationships that deliver value. What the best companies realise is that success requires a fundamental change in the way they operate across processes, people and systems. Digital transformation and the need for greater agility favour a cloud infrastructure. However, for large established businesses, moving to the cloud all at once is a daunting prospect fraught with risk. This is where microservices come into their own. While not new, only now are organisations embracing microservices as a way to break apart their legacy monoliths. However, breaking down the monolith is about much more than the technical architecture. The monolith applies just as much to the organisation and its processes and procedures. Without undergoing a cultural change that supports digital transformation, you will continue to operate at near monolith-speed and agility levels. 2018 will prove to be the year that the microservice economy comes of age, replacing old approaches with better, more constructive ones. We’ll see the unbundling of SaaS and the rise of marketplaces. Microservices: More than just mini-services and multiple apps Where the monoliths of the past were (and still are) one big program with many functions, today's systems call for the agility and functionality of microservices. Microservices are important in bringing down the monolithic architecture of both the technical side of the system itself and that of the organisation's processes and procedures by streamlining everything into smaller, more manageable chunks. Today's consumers are impatient. They demand that everything be delivered in a matter of milliseconds and prefer to receive information seemingly before it occurs. That's why 2018 will be the year of the microservice economy – it has to be. These mini-services and multiple apps will allow organisations to pick up speed and increase agility, unlike the slow and rigid structure that accompanied the monoliths. To compete, organisations really won't have a choice but to take advantage of microservices and their benefits. The role of the cloud platform in building and delivering microservices Moving away from rigid monolith systems means embracing the benefits of microservices. The cloud platform is the perfect foundation for the microservice economy because it allows for fluid and dynamic development independent of the gross physical architecture previously required by the monoliths. 
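To make this concrete, here is a minimal sketch of what one small, independently deployable service might look like, written in Python using only the standard library. The service name, port and endpoints are illustrative assumptions rather than a reference to any particular product or framework.

```python
# A minimal, self-contained "pricing" microservice sketch (illustrative only).
# One narrowly scoped responsibility, exposed over HTTP, independently deployable.
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# In a real service this data would live in the service's own datastore; hard-coded for brevity.
PRICES = {"sku-001": 9.99, "sku-002": 24.50}

class PricingHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        if self.path == "/health":
            # A health endpoint lets an orchestrator restart or replace instances independently.
            self._send(200, {"status": "ok"})
        elif self.path.startswith("/price/"):
            sku = self.path.split("/price/", 1)[1]
            if sku in PRICES:
                self._send(200, {"sku": sku, "price": PRICES[sku]})
            else:
                self._send(404, {"error": "unknown sku"})
        else:
            self._send(404, {"error": "not found"})

    def _send(self, status, payload):
        body = json.dumps(payload).encode("utf-8")
        self.send_response(status)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

if __name__ == "__main__":
    # Each microservice runs, scales and is redeployed on its own, typically behind an API gateway.
    HTTPServer(("0.0.0.0", 8080), PricingHandler).serve_forever()
```

Because a service like this owns a single concern and communicates over a standard protocol, it can be tested, released and scaled without touching the rest of the system – which is exactly the property described in the list that follows.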
Microservices bring organisations to life because they:
- Provide small yet targeted scope and functionality
- Communicate according to industry-wide standards
- Create opportunities to develop and deploy services independently without disruption to the entire system
- Are easily deployable without massive cost
- Are easily disposable, meaning new releases don't interfere as much with business continuity
- Allow for individual testing and increased speed of service
- Have better fault tolerance

The Cloud creates the best environment for cultivating all these benefits across organisations, no matter how large or small. Mobility becomes a given, leaving behind the hindrance of prior physical parameters and space. It also allows the freedom to experiment and implement enterprise tech at lightning speeds so customer service can remain the top priority. Why microservices develop and deliver customer value quickly Microservice architecture allows concise, dedicated teams to develop resilient, web-scaled systems faster and more efficiently than legacy systems. Legacy systems require large teams of people communicating through sometimes unidentifiable components to make changes that will only affect the aspects they want changed. But because legacy monoliths are one big program, change requires a huge amount of careful planning and coordination and takes much more time and effort than many companies have to spare; certainly more time than customers desire to wait. When the change is finally ready, it usually requires a full service interruption to deploy, which can be very costly to the company. But with microservices, isolating components is quick and easy because each one is a dedicated service. Each microservice still communicates with the other apps for functionality, but updating one has minimal impact on the other components, unlike legacy systems. This means that overall customer service isn't interrupted and updates are completed quickly, delivering continued customer value. Continued dawning of the microservice economy as fuel for marketplaces Unbundling SaaS and ushering in the microservice economy means the demise of the big monolithic applications. This economy will also create flourishing marketplaces with many microservices that seamlessly work together to provide business outcome-empowering solutions for the new era of enterprise systems. Having these microservices will not only fuel the need for marketplaces but also create them. 2018 will see more enterprise app stores, similar to Apple's App Store or Android's Google Play, but for organisations. Customers will integrate these into their own marketplaces, where users can select from a wide variety of applications to customise their own business strategies, empowering individual departments and the organisation as a whole. This constant iterative approach will help organisations better support their people and service delivery. ### Why Cloud Tech is the Catalyst for Startup Success Cloud technology has revolutionised the way we work in recent years. But more than that, it’s opened up new opportunities for those entrepreneurs who want to set up a business. Cloud technology removes the barriers to entry that used to exist in the form of property requirements and hardware installations and replaces them with easy-to-access, cost-effective business solutions. Indeed, interest in cloud technology is nothing new; according to Google’s own data (via Google Trends), searches for cloud tech have been on a steady rise for the past five years. 
But it’s with the increasing number of applications that use of cloud-based tech has really taken off. In this sense, cloud adoption is making it easier, quicker and cheaper than ever for entrepreneurs to set up new businesses, as well as reducing the amount of time and management required for basic business admin. Cloud technology and the new business boost With more than 650,000 new businesses incorporated in 2016, entrepreneurship plays a huge part in Britain’s growing economy. Go back thirty years, and setting up a new business was a huge logistical challenge. Not only did you need a great idea and the funds to back it, you had to be able to invest in things like an office space, telephone lines and computer networks to get your company off the ground. In doing so, you set your business its first risk right there and then; with the investment in hardware comes the danger that, should the business fail, that investment is lost. Look ahead to 2017 and beyond, and new business formation has been streamlined by cloud technology. Remote working has been facilitated by the plethora of cloud working tools available - including solutions like Google Drive and collaborative working documents - meaning that entrepreneurs no longer need to invest in a physical office space to be able to employ staff. Cloud communications tools have made it possible for business owners to communicate via a range of platforms, no longer tied to a phone line but able to utilise unified communications solutions like instant messaging, VOIP and mobile devices. In today’s cloud-thinking world, starting a business has never been easier. Investing in cloud tech for improved business efficiencies It’s not just new businesses that benefit from the cloud. With cloud technology at its most secure and widely adopted point, its use to improve efficiencies in a business and to facilitate new working practices is becoming commonplace amongst savvy business owners. Take, for example, the rising popularity of flexible working. By allowing employees to work in the format most desirable to them - be it flexible working hours or remote working - business owners strive to improve employee satisfaction and with it, productivity. It’s not uncommon for today’s business to operate across locations, whether that’s multiple offices or simply not having an office at all, instead allowing for employees to work from home. These flexible, remote working conditions are only made possible when cloud technology is in use. And with the advantages of flexible working including improved productivity, increased engagement and better staff retention, savvy business owners are seeing cloud investment as a no-brainer. Cloud-based solutions also reduce the time required to manage business operations. For example, a cloud-based accounting system such as Xero makes management of employee expenses, and the resulting tax management, a much more streamlined task. Security and cloud technology Security has historically been the most common objection to cloud technology in businesses. With vast improvements made to the security of the cloud, though, it’s fast becoming the most secure platform for businesses to manage their communications, maintain their records and keep documentation and information safe. The major benefit of cloud-based working is that the security vulnerabilities of the cloud have, and continue to be, so strictly monitored that any potential flaw is spotted and resolved, usually before most people are aware of it. 
Cloud-based business solutions are tested rigorously, with hackers employed solely to try to ‘break’ the system and therefore protect it against any real-life risk. What’s next for cloud technology? Cloud technology is really only in its infancy. The scope of what can be achieved when we take advantage of the cloud is yet to be fully explored, but with innovations being made almost daily, we can be sure of its continued growth into 2018 and beyond. Today, technology has made it possible for us to set up a business in half an hour or less. Simply sign up for a cloud-based accounting system and the need for on-site or external support is negated. Choose a cloud-based communications platform and all of your telephony - and other comms - needs are addressed. Essentially, there’s no excuse for small businesses to be relying on poor internal processes. In communications and beyond, savvy business owners will do well to invest in those tools and technologies which improve their efficiencies, security and, ultimately, satisfaction. ### Don’t let a traffic spike become a Sword of Damocles Managing traffic peaks with cloud caching In digital transformation programmes, the number of website visitors has become a popular key performance indicator (KPI) for measuring brand engagement. But the power associated with a traffic spike can fast turn into a perilous “Sword of Damocles” if the website is not prepared. Cloud caching technologies that don’t require large upfront costs offer a solution. In the world of e-commerce, website traffic is the modern version of ‘footfall’ and every company wants to get as much of it as possible. Indeed, the number of site visitors is usually a top KPI. Given that, one would assume that websites would all be prepared for big hits of incoming traffic. Far from it! Every year, recurring events such as big National Lottery jackpots, the Grand National races and Glastonbury ticket sales all lead to crashing sites and frustrated users venting their anger on social media and in the comment fields of online newspaper articles. Unforeseen traffic spikes are even more complicated to handle. When the Duchess of Cambridge wore a dress from fashion retailer Reiss during a meeting with the Obamas in May 2011, Reiss’s website crashed for two and a half hours. Caching technology as a solution Fortunately, there is a technical solution to ensure that websites perform, even at times of peak traffic. Foreseen or unforeseen. The magic solution is intelligent web caching. A web cache contains copies of all the content of a website. It sits in front of a company’s backend server farm, where it intercepts web requests and delivers copies of content to visitors. This means the company’s backend doesn’t have to reproduce multiple impressions of the same content, which can slow the site down or even lead to a crash when many visitors arrive at the same time. Thus caching is critical for managing traffic spikes efficiently and it helps to improve website performance. The alternative - adding infrastructure and caching servers to support traffic peaks that “might” happen - is inefficient and expensive, potentially conflicting with other business KPIs to reduce cost and carbon footprint. Fortunately, most caching solutions can be used either on-premise or in the cloud, making big infrastructure investments unnecessary. 
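To illustrate the principle, the sketch below shows a toy caching layer in Python that sits in front of an assumed origin-fetch function and serves repeat requests from memory for a fixed time-to-live. Real web caches (reverse proxies, CDNs and the cloud caching services discussed here) do far more, but the request flow is the same; the function and variable names are illustrative, not taken from any specific product.

```python
# Toy web-cache sketch: serve repeat requests from memory instead of hitting the origin.
# 'fetch_from_origin' stands in for the expensive backend work the cache is protecting.
import time

CACHE_TTL_SECONDS = 60          # how long a cached copy is considered fresh
_cache = {}                     # url -> (expiry_timestamp, response_body)

def fetch_from_origin(url: str) -> str:
    # Placeholder for the real backend render / database query behind the cache.
    return f"<html>rendered content for {url}</html>"

def handle_request(url: str) -> str:
    now = time.time()
    entry = _cache.get(url)
    if entry and entry[0] > now:
        return entry[1]                        # cache hit: the origin is never touched
    body = fetch_from_origin(url)              # cache miss: one request pays the full cost
    _cache[url] = (now + CACHE_TTL_SECONDS, body)
    return body

if __name__ == "__main__":
    # Ten thousand visitors requesting the same page result in a single origin fetch.
    for _ in range(10_000):
        handle_request("/big-jackpot-landing-page")
```

In practice this logic lives in a dedicated cache tier (an on-premise appliance or a cloud service) rather than in application code, which is what allows cache capacity to be scaled up and down independently of the backend.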
Cloud caching = flexible web scaling Cloud caching enables companies to choose when to scale capacity up or down depending on predicted traffic patterns, such as last-minute bets at the Grand National. If unpredicted peaks occur, such as in the Reiss scenario mentioned above, no hardware investments need to be made - companies just scale up and pay for the infrastructure they actually need and use during the peak. However, there are scenarios for which a cloud-only deployment is not the right choice. One is when a company needs to maintain hands-on, local control for security or data-handling reasons. There are also certain operational tasks that require a more flexible approach. For example, a US-based global retailer might only have data centres in Nevada and California, but for regulatory compliance reasons might be required to serve content from other locations in Europe. In these scenarios, a company doesn’t need to renounce intelligent cloud caching completely but can opt for a hybrid deployment where the caching solution is installed on-premise and is complemented with the cloud version. The advantage of such an architecture is that the traffic can be dynamically directed between on-premise and the cloud, depending on the actual traffic needs. If there is one thing that’s certain in our uncertain times, it’s that traffic peaks can happen to any business at any time. Companies of all sizes require greater flexibility and more cost-efficient solutions to protect their sites, apps and CDNs while ensuring the best possible user experience. Deploying caching in the cloud or as a hybrid model is the most cost-efficient way to ensure that traffic spikes turn into revenue and loyalty, not a Sword of Damocles. ### Only a Small Number of Large UK and Multinational Organisations 'Highly Confident' over GDPR Compliance Following yesterday's news from CenturyLink EMEA that only 25% of law firms surveyed are prepared for the General Data Protection Regulation (GDPR), it's perhaps no surprise that, broadening out from specialist law firms, the figure drops even lower among the large and multinational organisations surveyed by the leading specialist law firm Technology Law Alliance. The survey by Technology Law Alliance shows that only 18% of UK and multi-national organisations are ‘highly confident’ that they will meet next May's deadline for compliance with the new GDPR. Jagvinder Kang, Co-founder and Director of Technology Law Alliance, comments: “On the face of it, this seems to be a shocking figure, but it can be understood if you consider the challenges which organisations are facing.” The survey results showed that the biggest challenges which organisations face are dealing with the large number of systems on which data is stored and processed, and the lack of internal resource and know-how about GDPR. Kang explains: “Large organisations have complex systems and interactions with large numbers of databases. Although some organisations may have thought that Cloud Computing would simplify IT conceptually, it can give rise to problems from a data protection perspective.” With the ‘high confidence’ figure for GDPR compliance by 25th May 2018 being at such a low level, one would assume that this would have the attention of the Boards of the respective organisations. However, only 51% of organisations indicated that regular Board level reporting was being undertaken in respect of GDPR readiness. 
Kang notes: “This figure is alarming, especially as the survey responses showed that 78% of organisations regarded GDPR compliance as more important than other compliance programmes.” In terms of what organisations are actually doing to prepare for GDPR, 89% of respondents indicated that their organisations were involved in some form of data mapping or data flow activity. However, only 41% had a detailed GDPR compliance plan in place. The discrepancy between these figures is a concern, as Kang cautions: “Organisations need to be wary about just undertaking resource-intensive work on data mapping, without thinking about what they are going to do with the output of it, and how the activity is going to move them to compliance. Unfortunately, too many organisations are treating the data mapping as an end in itself, when in reality it’s just the start of what could be a very long journey.” Software tools can assist with GDPR compliance and know-how, and Technology Law Alliance has developed its own GDPR software compliance tool, ‘Asimuth’, from their spin-off company, Asimuth Limited. Kang explains: “The feedback which we have received is that a lot of organisations are anxious about the perceived scale of the task, and some don’t know how to progress or continue with GDPR compliance - so we have developed Asimuth to help them with that – not only for initial compliance up to 25th May 2018, but also for ongoing compliance beyond that date.” Although the survey results revealed that there are clear challenges which GDPR compliance is imposing on organisations, over three-quarters of organisations saw GDPR compliance as a positive initiative. Organisations cited reasons such as: helping them focus more clearly on the way in which data is used internally; becoming more transparent with individuals with regard to use of their data; and improving security within their organisations. These positive benefits accord with the messages which the Information Commissioner’s Office (or ICO) is advocating, for organisations to embrace GDPR compliance. The full GDPR Readiness Report (November 2017 edition), detailing additional survey results and analysis, is available free of charge for download from www.Asimuth.com ### The Evolution of the Intranet into a Digital Workspace [embed]https://vimeo.com/242774919/c3d444ed9b[/embed] We speak to Fintan Galvin, CEO of Invotra, as he talks about the evolution of the Intranet. Invotra describe themselves as a SaaS Digital Workplace Provider - a tool that helps glue teams together. Hear how Invotra has recently implemented Ethereum Blockchain and the benefits that this can bring when establishing contracts between organisations and individuals. ### Driving brand engagement with cloud-hosted enterprise video A live golf lesson in the cloud—courtesy of Aberdeen Asset Management Connecting on a personal level and building trust with customers and other stakeholders is critical to the success of any marketing team. Marketers have become creative and innovative by using cloud enterprise video to drive brand awareness, strengthen the connection with their audiences and add value to the relationship. Investing in live events like sponsorships, trade shows, conferences and speaking engagements is a great way to reach customers and prospects. Of course, not everyone can attend exclusive events in person. Think about how you can extend the benefits and the live experience to reach exponentially more individuals and create new opportunities for interaction and engagement. 
Take the case of Aberdeen Asset Management, one of Europe’s largest public, pure-play investment management groups, operating in 38 offices and investment centres in 25 countries. With tightening legislation around customer and employee gifting, financial institutions are finding new and inventive ways to execute marketing initiatives. And for innovative companies like Aberdeen Asset Management, live video is becoming an increasingly effective way of delivering value. A sponsor of the men’s and ladies’ Scottish Opens, as well as numerous high-profile golf professionals, Aberdeen Asset Management took a creative spin on its sponsorship and recently hosted a Live Golf Masterclass with eight-time European Order of Merit winner Colin ‘Monty’ Montgomerie OBE. Presented by BBC Golf Correspondent Iain Carter, the session was broadcast live via a cloud enterprise platform from the highly exclusive Wisley Golf Club in Surrey. Undeterred by the windy conditions, Monty was able to share his extensive expertise on driving, chipping and putting, not to mention the importance of having “soft hands at address”, with Aberdeen Asset Management’s clients and employees, streamed to over 16 locations worldwide with as many as 400 live viewers. As a financial company, Aberdeen is increasingly having to utilise some of the assets it acquires through sponsorships in different ways. To be able to offer a live golf experience to its clients and staff without them having to move away from their desk or mobile device is just one of the ways Aberdeen is trying to do that. “Our webcasting platform has allowed us to broadcast events around the world, which gives us tremendous exposure in key markets, but it is the ability to tailor the added extras that gives us value with our clients and involves all our staff,” says Robin Spring, Senior Sponsorship & Communications Executive, Aberdeen Asset Management. According to Aberdeen, marketers who use video grow revenue 49% faster than non-video users. And according to HubSpot, 80% of customers remember a video they watched in the last month. Video is both visual and auditory, which makes it easier for users to remember than strictly text-based content. However, broadcasting live from marketing events is not the only way to use cloud enterprise video creatively. Sharing internal thought leaders with the world Organisations have great experience and expertise that extends beyond what customers purchase in the form of your products or services. And management should find a way to show customers the full value that their organisation brings to the relationship. This could include general industry knowledge, deep technology expertise or perspective on big events and news. Many thought-leading organisations use video in the cloud to connect with audiences quickly and with authenticity. Video updates provide an opportunity to put your full team of experts in front of the client and build trust along with your subscriber base. UK-based investment management firm Brooks Macdonald, for instance, shares insights with customers in the form of a video blog every Monday morning with the help of a cloud enterprise video solution. The firm’s Weekly Market Commentary blog takes only minutes to publish because Brooks Macdonald’s agile marketing team has its workflow down to a science. That speed-to-market has paid off. 
Ghislaine Perry, Group Head of Marketing at Brooks Macdonald, says, “These regular videos put our experts in front of customers in a very human way that they’ve come to expect.” Interviewing influencers to build organisational credibility In the absence of an internal expert, an unscripted conversation with an outside industry influencer can be an impactful way to capture your audience’s attention. Simply record a dialogue with an analyst, consultant or opinion-leading customer—then publish it on your website, blog or social media channels. The conversation doesn’t have to be an endorsement of your organisation. The idea is to show your experts are in tune with what’s going on in your industry and that you listen as well as share your opinion. Wrapping it all up Every year companies make massive investments in live events like trade shows, customer summits and speaking panels. And due to cost and logistical reasons, only a fraction of employees and customers can experience these events, while even fewer can directly participate. With this in mind, a good cloud-hosted enterprise video system can put easy-to-use video capture, editing and delivery tools into the hands of your marketing and communications teams—and turn your company into the kind of innovator that companies like Aberdeen Asset Management have become. ### New research from CenturyLink reveals snapshot of GDPR readiness and Cybersecurity challenges facing law firms Only a quarter of law firms are GDPR ready. One in five has experienced an attempted cyber attack in the last month. New research from CenturyLink EMEA, a network and managed IT services provider specialising in digital transformation, reveals a snapshot of the state of GDPR readiness and cybersecurity attack risks at UK law firms and the investment being made to respond effectively to attacks. The findings of the survey of 150-plus IT decision makers in the legal sector are detailed in a new paper – Law Firms and Cybersecurity: how can lawyers keep their client data confidential. Amongst the findings: GDPR Only a quarter (25%) of law firms surveyed are ready for the General Data Protection Regulation (GDPR). As the GDPR deadline fast approaches and the importance surrounding data protection intensifies, the results highlight that the majority of firms need to prepare while they still have time to be fully compliant with the legislation coming into force on 25th May next year. Failure to do so could result in severe penalties, with a maximum fine for data breaches of up to 20 million euros or 4 percent of annual global turnover. Steve Harrison, Sales Director at CenturyLink EMEA, commented on the GDPR findings: “With the deadline for GDPR compliance looming ever closer, law firms still have a chance to be ready, but they need to take action now. At CenturyLink, we provide a GDPR readiness assessment for businesses that are unsure of where to begin. This enables organisations to analyse their business and data to determine where the gaps are, and what steps should be taken. In addition, implementing a security log monitoring and analysis service will enable organisations to quickly identify if and when they have experienced a breach, enabling them to better comply with the GDPR breach notification regulation.” Cybersecurity According to the study, one in five law firms has experienced an attempted cyber attack in the last month. In addition, less than a third (31%) of IT directors believe their firm is compliant with all cybersecurity legislation. 
Respondents cited several challenges to more effective privacy and data security including human mistakes (50%), dedicated cyber attacks such as Distributed Denial of Service (DDoS) and ransomware or SQL injection (45%) and lost documentation and devices (36%) as the top problems. In a bid to combat such cybersecurity threats, more than half (55%) of firms said they have employed data security professionals and 60% now provide compulsory cybersecurity training for staff. Law firms are also outsourcing their IT infrastructure to providers who can offer a secure environment – to support their digital transformation initiatives, 43% of respondents are moving the hosting of their applications to cloud providers and one in four (23%) are moving their servers to a colocation facility. Shadow IT In regards to shadow IT, the research revealed 43% of IT decision makers at law firms trust their IT teams ‘to do the right thing’ for their business despite a third (33%) of firms not permitting bring your own device (BYOD) or bring your own apps (BYOA). Eleven percent have no shadow IT policies at all. Steve Harrison commented on the cybersecurity findings: “Every time a law firm faces an attempted cybersecurity attack, their infrastructure, data and customers’ data, as well as their reputation, is at risk of being compromised. That risk grows as companies have to offer more online services and flexible remote working options for staff in order to be competitive in today’s digital world. “It’s promising to see that growing numbers of law firms are taking steps towards greater security by moving away from legacy, on-premise IT systems to private or public managed cloud solutions. At CenturyLink, we provide cloud and security services to keep organisations’ operations running at peak performance. Managed services not only minimise the risk posed by external attacks but free up internal resources to focus on innovative IT and business initiatives.” ### Practical Benefits of Cloud-Based Project Management Software Cloud project management software is the new standard for organizing and managing clients, projects, tasks, documents and even staff members. It’s more efficient and scalable than the more traditional options you might be used to, it’s quickly becoming the going standard across multiple industries. Cloud-based solutions are very likely the key to providing the convenience and accessibility your team is looking for. The following are just a few of the many practical benefits moving your project management interface into the cloud can bring to the table. Which ones are of the greatest value to your company? It’s cost-effective and affordable for virtually every organization. Keeping overhead low and that all-important bottom line healthy are important concerns for all businesses, but this is especially the case if you run a small business or are trying to get a startup off the ground. Choosing a cloud-based solution eliminates the need for expensive hardware and installation fees that can cost your company a fortune from day one. You won’t need to hire full-time IT personnel to help you manage it either. All you need are the computers you already rely on and an Internet connection. In the event you do run into trouble, a solution is just a customer service call away. Ongoing costs are typically low as well. It’s flexible, scalable and versatile. 
When a software solution is so streamlined and simple that you only need an Internet connection and a web browser to access it, you instantly tap into unbeatable flexibility. Every member of your team can access it using whatever interface or platform they’re most comfortable with (e.g. Mac, Linux, Windows or mobile). Projects can be completed more efficiently, not to mention more quickly. Individual workers will be able to access information when out of the office, as necessary. Your entire team will be happier when they’re able to work flexibly as well. It lets you create a highly connected team. More and more businesses are beginning to rely on remote workers of all types, and with good reason. A workforce that includes remote workers is often more affordable, convenient, and efficient than keeping your entire staff on site. However, keeping your entire team as connected as they need to be can be challenging to say the least. The appropriate project management software for your organization creates a centralized work hub that any member of your team can access from anywhere they happen to be. Everyone can communicate, work, and collaborate effectively. Enterprise-level security. When it comes to sensitive data associated with your company, your clients, and your employees, there’s no such thing as “too secure” and today’s cloud-based software solutions definitely deliver. Features like built-in data recovery, ultra-secure servers, and ironclad data centers ensure that you’ll always be protected in all the right ways. In fact, your sensitive data is actually much better protected than it would be if you were using an old-school on-site solution. You never have to worry about someone inadvertently downloading a virus, losing info via a computer crash, or even losing your entire office building in a fire. All of your information is stored safely in the cloud and can easily be accessed from any device you choose. It’s ready to go, immediately. Cloud-based project management software doesn’t require long waiting periods or lengthy installation processes. Your entire system can be up and running in mere minutes. All you have to do is sign up for an account with your service provider of choice, spend a few minutes setting things up, and you’re all ready to go. There’s no waiting period to worry about when it comes to getting access to updates, new features, or additional tools either. It’s all available and ready to use instantly, leaving you and your team free to focus on other aspects of running your company. It’s easy to integrate into your current system. As touched on above, convenience is definitely the name of the game when it comes to today’s cloud-based project management solutions. For that reason, most options are designed to fit seamlessly into whatever system you already count on to run your business. You’ll be able to integrate it with other tools and interfaces your team is already familiar with. For that reason, there’s little to no learning curve involved. No one on your team needs to change much about how they already work either. Everything they need to do their jobs will be accessible from one convenient hub. It’s reliable and perpetually up to date. Keeping every member of a large team on the same page at all times can be challenging to say the least. 
You can never be sure who has the latest version of a critical document. And what if the team member you need to contact ASAP about a particular piece of information is out to lunch, out sick, or away on vacation? With cloud-based project management software, worries like those are complete non-issues. All vital information is perpetually right there at your fingertips. It’s always up to date and accessible by every single member of your team. Working becomes exponentially easier and completing projects efficiently becomes a breeze. It’s easy to customize. There are project management solutions that your company will outgrow sooner or later, and then there are cloud-based solutions. When your project management interface lives in the cloud, it’s easy to scale things up or down according to your company’s needs. The moment your business needs to change how it does things for any reason whatsoever, your software can be adjusted accordingly. Add or remove users in seconds. Add or subtract features at will. Update your interface quickly and painlessly. With cloud-based project management software in your corner, the sky really is the limit. ### Infoblox report finds 1 in 4 UK healthcare IT professionals aren’t confident in their organisation’s ability to respond to cyberattacks New report examines whether the healthcare industry is prepared to combat evolving cyber threats. London, UK – 14 November 2017 – Infoblox Inc., the network control company that provides Actionable Network Intelligence, today announced new research that revealed one in four UK healthcare IT professionals aren’t confident in their organisation’s ability to respond to cyber attacks. Technology is booming in healthcare organisations, with digital transformation policies leading to increased adoption of connected medical devices, big data analytics for faster and more accurate diagnoses, and paperless systems for the easy exchange of patient information. As technology becomes more ingrained into core healthcare offerings, there is an increased threat of cyber attacks disrupting services, stealing sensitive patient data, and putting lives at risk. Infoblox commissioned a survey of UK and US healthcare IT professionals to gain a better understanding of whether the healthcare industry is adequately prepared to combat this evolving threat. Infoblox’s report Cybersecurity in healthcare: the diagnosis reveals: Ready for ransomware Following the significant disruption caused to the NHS by WannaCry in May 2017, many healthcare organisations are preparing themselves for further ransomware attacks. One-quarter of participating healthcare IT professionals reported that their organisation would be willing to pay a ransom in the event of a cyber attack. Of these, 85 percent of UK respondents have a plan in place for this situation. Dangerous operating systems The number of connected devices on healthcare organisations’ networks is exploding, with 47 percent of the large healthcare organisations surveyed indicating that they are managing over 5,000 devices on their network. One in five healthcare IT professionals reported that Windows XP, which has been unsupported since April 2014, is running on their network. Eighteen percent indicated that connected medical devices on their network are running on the unsupported operating system, leaving organisations open to exploitation through security flaws in these unpatched devices. 
Patching outdated operating systems is impossible for the 7 percent of IT professionals who reported that they don’t know which operating systems their medical devices are running on. Even when the operating system these devices run on is known, a quarter (26%) of large organisations either can’t or don’t know if they can update these systems. Investing against the threat 85 percent of healthcare IT professionals reported that their organisation has increased their cybersecurity spending in the past year, with 12 percent of organisations increasing spending by over 50 percent. Traditional security solutions are the most popular, with anti-virus software and firewalls the solutions most invested in over the past year, at 61 percent and 57 percent respectively. Half of organisations have invested in network monitoring to identify malicious activity on the network; one third have invested in DNS security solutions, which can actively disrupt Distributed Denial of Service (DDoS) attacks and data exfiltration; and 37 percent have invested in application security to secure web applications, operating systems and software. Rob Bolton, Director of Western Europe at Infoblox, said: “The healthcare industry is facing major challenges that require it to modernise, reform and improve services to meet the needs of ever more complex, instantaneous patient demands. Digital transformation presents a massive opportunity to support the doctors and nurses who work tirelessly – but these new technologies also introduce a new cyber risk that must be mitigated. The widespread disruption experienced by the NHS during the WannaCry outbreak demonstrated the severe impact to health services that can be caused by a cyber attack. It’s crucial that healthcare IT professionals plan strategically about how they can manage risk within their organisation and respond to active threats to ensure the security and safety of patients and their data.” The report includes a case study on how Geisinger Health uncovered malicious activity on its network and was able to quickly and accurately identify the offending device, containing the malware before it spread throughout the network. Commenting on the event, Rich Quinlan, senior technical analyst at Geisinger Health, said: “In spite of all the conventional steps we take to protect our internal network, patient care could still be affected. We could have an entire hospital full of useless ultrasound devices because one was brought in with a virus and we have no control over them. And if it was able to exfiltrate data, we would have a compliance issue.” The report also draws on the survey findings to provide actionable recommendations to healthcare organisations to better combat the evolving cyber threat.
### Bringing Business Continuity to the Boardroom
Why choosing cloud-based back-up and DR needs to be a strategic decision The benefits of cloud technology are prompting more and more organisations to investigate how the scalability, security, and economy it delivers can give their businesses a competitive edge. We’re also seeing many customers keen to use the cloud for its well-documented advantages in back-up and disaster recovery (DR). In the rush to realise the benefits, however, many organisations are failing to take a strategic approach to their business continuity plans.
I recently took part in a webinar discussion with participants from our partner CSN Group and our back-up and disaster recovery technology provider, VEEAM, and I was struck by how frequently we encounter a purely tactical approach to back-up and DR that doesn’t necessarily meet the strategic needs of the business. While I can understand how this situation has come about, I firmly believe that it’s time to open up the dialogue between the IT department and the C-suite and ensure that decisions concerning business continuity and DR are led from the boardroom. Back-up and DR – two sides of the same coin I often find that customers are looking initially for either back-up or disaster recovery, when actually what they really need is a blend of both. Back-up is about having snapshots of data in time, which might be required for compliance reasons or when an end user has lost data that they need to be restored, at a point in the future. Disaster recovery is about the business being able to continue to operate, with users still having access to systems and revenue generation uninterrupted, when something has gone wrong. David Schaap, CTO of CSN Group, makes a good working distinction between back-up and DR, noting that back-up is driven by requests from end users for data restoration, while disaster recovery is initiated at the executive level. So, back-up and disaster recovery have different use cases and are needed at different times by the business but, crucially, because they make use of the same technology, they should go hand-in-hand. Good decision-making is hampered by bad communication When it comes to selecting the technology used for back-up and DR, it’s usually down to the company’s IT managers to make the decision, based on their knowledge of the organisation’s current systems and operations. It’s a bottom-up process that I believe sits at odds with the fact that the business continuity strategy that the technology is supposed to support is set by the C-suite. This can lead to the frustrating situation where the IT department is clamouring for budget to deliver back-up and DR, but they don’t have full visibility of the level of investment available and the amount of operational risk that executives are prepared to accept. On the other hand, business leaders suddenly get a wake-up call in the shape of a high-profile climate or security-related disaster and demand an “instant” solution that guarantees zero downtime, without understanding that an effective programme is not a quick-fix product that can be immediately installed. Fundamentally, both sides are operating in the dark, and this lack of alignment between the technical expertise of the IT department and the strategic priorities of the C-suite means that it’s difficult for either side to make good decisions. Ideally, all technology decisions should cascade from the business continuity strategy. It’s the responsibility of the executive suite to decide what the business imperatives should be when problems occur and what level of back-up is needed to ensure regulatory compliance. These directives should then be interpreted by the IT department so that the solution they select can meet the stated requirements. There’s also an important role for the IT department in providing technical counsel to the executive level, so information needs to flow both ways. Getting strategic about testing regimes I’d like to see organisations applying more strategic thinking to their DR testing regimes.
I often hear companies saying that they’re going to test quarterly, and perhaps over a weekend. But disasters don’t happen quarterly, and they often happen when the business is in full operation. A disaster, by definition, is going to be something that hasn’t been anticipated. This actually means that events such as major storms, which can be predicted and prepared for to some extent, aren’t the concerns that should be driving the testing cycle. It’s more likely that a major issue will be caused by human error, a patch that goes wrong, or a newly evolved cyber-threat. Business leaders need to be confident that their testing regimes and the associated business continuity programmes are being continuously enhanced and updated to meet the next level of potential threat. [easy-tweet tweet="'But disasters don’t happen quarterly & they often happen when the business is in full operation.'" user="@ilandcloud" hashtags="BusinessContinuity, DisasterRecovery, Strategy"] One of the benefits of using the cloud for DR and back-up is the ability to run test programmes without impacting day-to-day operations. This means that testing can be carried out on an ad hoc basis if necessary, allowing the business continuity plan to evolve and adapt in real time as described above. The isolated test environment can even be used to test software patches before they go live – potentially avoiding a disaster in the first place. Promoting a culture of preparedness The culture of disaster preparedness needs to be set from the top and talked about at all levels of the organisation. Users need to know how they will access critical systems if they can’t reach the office for whatever reason, and this plan needs to be tested outside of the disaster cycle, to ensure that those employees will be able to keep working should disaster strike. All of this comes firmly under the auspices of strategic decision-making, so it’s time that the C-suite and IT departments aligned to support one another. The nature of cloud – its cost benefits and scalability – plus the business-critical aspects of back-up and DR mean that selecting solutions should be a board-level decision supported by the knowledge and expertise of the IT department. In today’s world it’s a case of when, not if, companies will be affected by disaster, and when they are, they should be confident that their business continuity plan will do its job, so everyone else can continue to do theirs.
### £100 bn boost if businesses embrace cloud computing and online procurement
Big data is boosting buyer power and the strategic role of procurement teams London, 13.11.2017: Research from the Confederation of British Industry (CBI) highlights that the UK economy could get a £100bn boost to productivity if businesses were more willing to embrace everyday technology such as cloud computing and online procurement. However, digitising the procurement process goes deeper than simply moving online to boost productivity. In the digital business landscape, supply chain and procurement departments are reassessing their skill sets to become more data-driven and improve business performance. Procurement teams are under ever-mounting pressure to reduce cost, maximise efficiency and enhance profit margins while ensuring the organisation is being supported by reliable and trustworthy suppliers. Supply chain and procurement are becoming more involved in executive-level decision making, shifting from a cost centre mentality towards being recognised as a strategic part of business operations.
Machine learning and predictive analytics are breaking new ground for procurement to enrich, transform and analyse internal and external data sources, creating insights to improve performance. Farida Gibbs, CEO of professional services firm Gibbs Hybrid, comments: “Predictive analytics are being used to transform five key procurement areas – spend, category, supplier, compliance and performance. A data-empowered procurement function equipped with a new set of data skills enables collaboration and partnership with finance and supplier teams. Moving away from decisions based on retrospective fiscal data, real-time data is a powerful asset for procurement when it comes to spend analysis as businesses can adjust their strategy in a more agile way.” Farida continues, “Working in an organisation where there is no culture of data-driven decision making requires a shift in mindset. Data insights across the end-to-end process of procurement are helping businesses significantly reduce costs, enhance profitability, increase customer satisfaction, and gain competitive advantage.” “Procurement leaders are progressively looking for scalable solutions to prevent a data overload – speed and innovation are critical success factors in today’s digital world. According to ProcureCon’s Annual CPO Study of chief procurement officers, within the next 18 months only 3 percent of respondents would have no procurement process automation while 61 percent would have between 21 percent and 60 percent of the process automated. Automation provides businesses with the scalability to help process high volumes of data to derive meaningful insights.” Gibbs Hybrid are the headline sponsor of Procurement Leaders’ DITX 2017 conference, discussing Procurement 4.0 and how real-time data can drive insights to execute customer centricity and strategic sourcing.
### Executive Education Episode 4: How Team Leaders Can Unleash Team Capabilities
On this episode of #ExecEdTV: How Team Leaders Can Unleash Team Capabilities. Hosts Kylie Hazeldine, Business Development at Regent Leadership, and Colin Pinks, Director at Regent Leadership, are joined by guests Paul Z Jackson, CEO of Paul Jackson Associates, and Matt Smith, CEO of SteelEye. Executive Education is a 30-minute live show format designed for business leaders, managers, entrepreneurs and startups. #ExecEdTV #DisruptiveLIVE
### Comms Business Live: WebRTC, The next wave of business communications?
In this week's episode, David Dungay is talking to several WebRTC experts about the impact the technology is having on the Channel market. The rise of WebRTC-enabled products has been a key talking point in the industry for about five years; are we finally at the point where we will see mass adoption and disruption? Tune in and find out. Host David Dungay, Editor of Comms Business @CommsGuru4U Panellists Trefor Davies, MD of Netaxis UK @philosophertap Nathan Ronchetti, Head of Operations at Voiceflex @Voiceflex Richard Manthorpe, Product Director at Content Guru @cgchirp Ralph Page, Global Director of Strategic Solutions at Ribbon Communications (previously named Genband) @ribboncomm
### App Development in the Cloud
Tech advancements seem endless, especially in the app development sphere. Cloud-based app development has been one of the hottest trends of the last few years. One of the most exciting things that cloud providers offer is the ability to build, test, and deploy apps using cloud frameworks.
Developers are trying hard to keep up with the ever-changing times. Here are the things you need to know before creating mobile software in the cloud. Cloud computing in 2017 Cloud-based software development has changed IT practices, and now a second wave of private, public, and hybrid clouds is entering the app development market. So what is the difference between private, public, and hybrid clouds? Public clouds are the most common cloud solution. They are usually owned by third-party cloud service providers such as Microsoft Azure, Google, Amazon, and others. These resources are shared by thousands of people. Google reported that the number of users had reached 1 billion by 2017. [easy-tweet tweet="Hybrid clouds are a mix of the features peculiar to both public and private clouds." hashtags="Cloud, Hybrid"] Private cloud solutions are privately owned by one organization or business offering specific security controls. They are supposed to be a safer development environment than public clouds. As a rule, app entrepreneurs use private clouds for sensitive processes and public clouds for simpler tasks. Hybrid clouds are a mix of the features peculiar to both public and private clouds. Put simply, hybrid clouds are more complex solutions. For example, such clouds come in useful when a company needs to keep confidential information secured while, at the same time, general information needs to be stored in a public cloud. That’s it for the clouds. Our next step is to find out what makes cloud-based software development attractive. Benefits of developing mobile apps in the cloud Let’s remember how things are done when it comes to mobile app development. A team usually starts by estimating the project and deciding whether it can develop it or not; if not, the team needs to acquire both the manpower and the technology for the project. Cloud computing, by contrast, provides a way for infrastructure to be available to everyone who is interested. And if your business needs grow, the development infrastructure scales with them. You pay for the services you use. It’s a cheap option because there is no need to invest in in-house hardware and development tools. So cloud computing is a low-cost development solution. A few years ago it took months to get prepared for a project; now developers can start coding within a couple of clicks. So the primary advantage of building apps in the cloud is speed. Moreover, you don’t need to worry about hardware and software, infrastructure and storage, maintenance and other resources required to create and set up the apps. Instead of taking care of the things mentioned above, the developers can focus on the code itself. Things to consider before creating an app in the cloud Along with the well-known benefits of cloud computing such as speed, agility, flexibility, and reduced costs, there are a few things to watch out for when deciding on cloud-based software development solutions. Constancy Constancy and reliability are two major issues to consider when it comes to development in the cloud. The sad fact is that even with the biggest cloud providers, there is a risk that your app won’t be accessible 24/7. Pinterest and Netflix, for example, have both been knocked offline by problems with the infrastructure they depend on. Advice: Always have a backup plan for your services. You should be able to deploy the app in all cases. Keep in mind, there will always be a chance that an outage may occur.
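To make that back-up advice concrete, here is a minimal sketch of a health check that falls back to a standby deployment when the primary endpoint is unreachable. It is an illustration only: the URLs, the health-check path and the three-second timeout are assumptions invented for this example, not features of any particular cloud provider.

```python
"""Minimal availability check with a fallback endpoint (illustrative sketch only)."""
import urllib.request
import urllib.error

# Hypothetical endpoints: a primary deployment plus a standby in another region or provider.
ENDPOINTS = [
    "https://app.primary-region.example.com/health",
    "https://app.standby-region.example.com/health",
]

def first_healthy_endpoint(endpoints, timeout=3):
    """Return the first endpoint that answers its health check, or None if all are down."""
    for url in endpoints:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                if response.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # Unreachable or timed out; try the next endpoint.
    return None

if __name__ == "__main__":
    active = first_healthy_endpoint(ENDPOINTS)
    if active:
        print(f"Routing traffic to {active}")
    else:
        print("All endpoints down - time to trigger the recovery plan")
```

In a real deployment this job is usually handled by a load balancer or DNS failover, but even a small script like this can confirm that a standby actually responds before the day you need it.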
Most companies have a recovery plan for on-site tech facilities, so why not have such a plan for cloud services? Security It’s obvious that putting data and apps in the cloud, and especially in public clouds, carries lots of hidden risks. As an IT entrepreneur, you should foresee all the risks and think carefully before deploying data to the cloud, as it is prone to hacking attacks. Advice: Consider whether data is less sensitive for your company before deploying it to public clouds; in other cases, use private ones. Keep learning There are lots of things that are new to developers, especially those who develop apps in the cloud. For example, the configurations that you use locally are difficult to replicate in the cloud. If it’s your first time developing an app in the cloud, be ready to learn some new things like SQL, command line tools, XML, etc. Advice: There are lots of eBooks and handbooks on cloud computing available on the Internet. Some of them are free. Never miss a chance to talk with developers experienced in cloud computing. Restrictions on licensing The terms of cloud usage differ from provider to provider. Some providers are very strict about what you are and aren’t allowed to do. Advice: Read carefully what you are signing to avoid additional penalties that may arise from violations of the terms. Integrations with on-site apps It’s quite a challenging task, even for cloud-proficient developers, to integrate existing apps with cloud-based ones. As a rule, such problems occur because of limited access: cloud providers, especially public ones, don’t grant direct access to the underlying cloud technologies, infrastructure, and other platforms. Advice: Look for ways to integrate existing apps with cloud-based ones without direct access to the cloud infrastructure. Find how to get around this by utilizing the APIs of your provider. Platforms for cloud app development There are quite a lot of cloud service providers that allow developers to develop, test, and deploy apps. Among the best-known cloud providers are Amazon Web Services, Google App Engine, Salesforce Platform, Rackspace and Microsoft Azure. Summing up Cloud app development remains a challenging task for many developers, but it’s still only at an early stage. In the years to come, it will take over the app development market. It may take a little time to get used to the special aspects of cloud computing. But it’s worth learning, since cloud computing offers many more benefits such as speed, flexibility, low maintenance costs, agility, and others. Get prepared for the changes and improvements before they occur.
### We’ve lost that trusting feeling – say a third of UK employees
Nearly one in three (31%) UK employees have no confidence in the leadership of their company to create and run a modern digital infrastructure, according to the Advanced Trends Report 2017. The new findings will come as a blow to many CEOs as the UK grapples with a changing business landscape that includes Brexit, increased cybersecurity threats and the General Data Protection Regulation (GDPR). The annual Advanced Trends Survey of over 1,000 professionals in UK organisations is the second to be commissioned by British software and services company Advanced. Like the first report, it reveals the state of readiness amongst British businesses in the face of digital disruption and examines the biggest barriers to digital transformation – of which leadership is one.
When asked about the most important attributes for a leader in the digital era, the majority of respondents said leaders should be able to embrace change (82%), think and react with pace (67%) and be able to make bold decisions (57%). Only 42% felt bosses having a strong digital skill set was important, suggesting employees believe bosses are better off leading the company through change and making high-level decisions that will determine the future success of the company. “Businesses will not succeed in the digital era without a strong, skilled and admirable leadership team,” says Gordon Wilson, CEO at Advanced. “A lack of confidence will only demotivate employees, thwart productivity and cost businesses money. Ultimately, it will leave leaders trailing behind those that do have the leadership attributes to reimagine their business and embrace the opportunity of the digital era.” Tom Thackray, CBI Director for Innovation, adds: “We know that businesses’ ability to innovate and embrace the digital era is fundamental to the prosperity of our economy. It is vital that British businesses have confidence in their organisation’s leadership to deliver digital strategies that will support growth and create new business models for the future.” Gordon concludes: “We are in a tumultuous period of economic uncertainty, and employees will be looking to their leaders for reassurance as Britain leaves the EU. We continue to see mixed feelings from our survey respondents about Brexit – 52% see it as a threat to business survival (compared to 49% in last year’s report) while 48% see it as an opportunity for growth and prosperity (previously 51%). Do leaders have mixed feelings too, or do they actually see Brexit as an opportunity and are not communicating their confidence from the top down?” Brexit aside, the increased threat of cyber attacks and the impending GDPR are enough justification for staff to demand better leadership. Both place new responsibilities upon their leaders to ensure every employee understands how to protect corporate and personal data. The consequences of being breached are serious and can cost leaders their jobs, as seen with TalkTalk and Equifax. Tom says: “Without strengthened efforts to improve cybersecurity, the undoubted potential of the UK’s digital economy will be unfulfilled. Cyber resilience is increasingly important for all companies across the economy. They must continue to move from awareness to action, by ensuring cyber security is a board-level priority and making the right investments for their digital future.” Julian David, techUK’s CEO, adds: “It’s no longer a question of whether or not your company will experience a cyber attack, but rather when it will be attacked. That is why it is so important that cybersecurity is a top-level priority for organisations, from the boardroom down. Organisations that prioritise security can confidently adopt new technologies, from cloud to IoT to AI, which facilitates innovation and helps them grow their businesses.” The repercussions of a data breach or loss would be even more damaging if a company failed to safeguard its data under the GDPR. Equifax, for example, could have been fined up to $124 million if the regulations had already come into effect. Julian explains: “Too many organisations are unprepared for (or unaware of) the changes that GDPR will bring and the new responsibilities placed on data controllers.
Additional guidance is needed from the Information Commissioner’s Office, particularly for SMEs, on how to prepare for GDPR. Similarly, senior leaders must ensure they are doing all they can to manage the changes ahead with the information already available. “GDPR is less than seven months away and we must avoid a situation where organisations are on the wrong side of the law, and at risk of large fines, without realising it.” But it’s not just large organisations facing these risks – all businesses are. Worryingly though, nearly one in five (18%) businesses still remain unprepared for a cyber attack, while 25% are either unsure or not confident their organisation is going to be ready for the GDPR. Gordon adds: “There are pressing issues that businesses should be taking action on today. Leaders urgently need to get a handle on the challenges they can control, such as regulation and security risk, and not get distracted by changes that are out of their hands, like Brexit. The reality is that digital is fast becoming a pre-requisite for businesses to gain a competitive advantage. Creating and running a modern digital infrastructure will help to improve employee productivity and customer service, as well as reinstate confidence and trust in leaders’ ability to grow successfully.” Additional findings: most (88%) believe a connected digital infrastructure is important in being able to service and anticipate their customers’ needs; only 28% describe their organisation as having a modern digital infrastructure (departments connected through integrated data and statistics to gain insight into operations) – compared to 61% that don’t (11% ‘don’t know’); given that digital is becoming the norm, only 75% are confident in their company's ability to adapt and embrace change; and over a third (36%) don’t think they have the right tools to do their job effectively in the digital era – of those that do have the right tools, half are now over 50% more productive as a result. The Advanced Trends Survey was carried out online in September 2017. The full Advanced Trends Report 2017 can be found here: www.oneadvanced.com/trends.
### Park Holidays Selects Datapipe to Modernise Infrastructure Using AWS
Datapipe prepares the UK holiday park operator for its busiest summer trading period. LONDON, November 13 – Park Holidays UK, one of the largest holiday park operators in the South of England, has selected Datapipe, a leader in managed cloud services for the enterprise, to replace its end-of-life infrastructure with a new cloud-based system utilising Amazon Web Services (AWS). The company has 28 parks stretching from Devon in the South West to Suffolk in the East. Park Holidays UK offers its customers affordable caravan holidays, and it relies on its website and web-booking infrastructure to take most of its bookings. The demand for holidays is seasonal, with the bulk of the bookings coming in the lead-up to the busy summer period. The Park Holidays UK website was running on legacy web servers that were first installed 10 years ago. In that time, the number of bookings taken through its website has grown dramatically. The infrastructure was reaching its capacity limit, so in February 2017 the company began investigating new options. Park Holidays UK needed an entirely new technology stack built on AWS. It wanted to use the opportunity to rewrite its website from scratch and deliver a new, responsive site.
However, Park Holidays UK did not have the expertise or knowledge to make such major changes to its infrastructure on its own, especially with regard to ensuring the new cloud-based solution was fully PCI compliant and that security was locked down. It needed guidance from a trusted partner. With Datapipe it found that partner. Not only did Datapipe deliver a solution hosted on AWS, it offered Park Holidays UK a fully resilient solution with the ability to automatically bring on new resources when the load reaches a threshold, ensuring there is always spare capacity available. In case of any problems, failover time is now measured in minutes, whereas previously there was potential for up to two weeks of downtime for hardware replacements – a huge business risk to carry. Datapipe also gave Park Holidays the reassurance that data held on the cloud would be secure. Datapipe's experts ensure that best practice methods are used to hold the data, and that end-to-end monitoring is in place to lock down the cloud environment. [easy-tweet tweet=""Ten years ago, the majority of our bookings were taken via our call centre. Now the tables have turned, and most of our bookings come in via our website which makes it critical to our business"" hashtags="AWS, Infrastructure"] Mike Procyshyn, IT Director at Park Holidays UK says: "Ten years ago, the majority of our bookings were taken via our call centre. Now the tables have turned, and most of our bookings come in via our website which makes it critical to our business. However, the infrastructure had not been significantly upgraded in that time and it was reaching the limits of its capacity. "With the weak pound, more Brits than ever opted for a staycation this year, meaning we have been busier than ever. We could not afford to have any doubts about the resilience of our website and Datapipe has delivered a secure solution that sets us up for another ten years or more." Scotty Morgan, Chief Sales Officer, Europe at Datapipe, says: "With Park Holidays UK, we delivered a new cloud-based solution that elastically scales depending on demand. It no longer has to worry about capacity issues and has a secure solution ready for the summer peak."
### Cloud Holds the Key to GDPR Compliance – but not without a culture shift
With only a few months until the General Data Protection Regulation (GDPR) comes into full force, there’s an air of unnerving panic sweeping across organisations in the EU to ensure compliance. And this panic isn’t without reason. Companies found to breach the data of EU individuals will suffer severe financial penalties for serious incidents, with fines of up to four percent of a company’s annual revenue. As a result, ensuring total compliance is a must, but at 88 pages, the legislation is a daunting endeavour for any organisation to get up to speed with. However, what is often overlooked is the fact that most organisations are actually in a good position to navigate through this tricky new legislation. GDPR, in fact, carries over almost 80 percent of the existing Data Protection Directive, which has been a mainstay for organisations for some years. It is the remaining 20 percent that is new, and this represents the larger hurdle for organisations in adjusting to the new landscape. But for some, adjusting and complying with the GDPR may be quicker than anticipated with the growing adoption of cloud throughout the enterprise.
Capitalising on the Cloud Research from the Cloud Industry Forum identifies that the overall cloud adoption rate in the UK stands at 88 percent, with 67 percent of users expecting to increase their adoption of cloud services over the course of the coming year. At Google’s Cloud Next event held earlier this year, Google executives advocated that cloud can help boost GDPR compliance, as moving data to the cloud can alleviate the pain of upgrading security practices and data protection standards in line with the new regulations. In the same vein, large technology vendors including AWS and Microsoft Azure have both committed to ensuring GDPR compliance to support their customers operating in the EU. Cloud can facilitate the move towards compliance, but how can this be achieved? Centralising and monitoring data One of the biggest implications of GDPR is the requirement for accurate storage, visibility, and monitoring of EU data – in an attempt to bring more accountability of consumer data. Organisations that have traditionally relied on legacy on-premise IT infrastructure will quickly find that they are unable to offer either the strict monitoring or the stringent security assurances required under GDPR. However, by storing data in more sophisticated cloud environments, organisations are now able to centralise data from all assets in one location, rather than having several data access points. Not only is this important for improving visibility and accessibility of data, but effective storage in the cloud can offer operational benefits including the prevention of data leakage and reducing the possibility of data duplication. Additionally, GDPR will provide more user control through the ‘right to erasure’, also known as the ‘right to be forgotten’, meaning users can request their data to be removed from company databases or even amended if incorrect. This has the potential to see organisations being inundated with more requests from consumers to access their data. But by using cloud technologies, organisations can effectively remove the typical administrative headache for staff in organising and maintaining this data, having instant access to data where required. Having said that though, cloud isn’t the be-all and end-all when it comes to GDPR compliance. Cloud needs to be met with an organisational culture shift While there are undoubted benefits of cloud, organisations cannot solely rely on the cloud to facilitate their move towards GDPR compliance. Instead, a wider appreciation is required from both organisations and cloud service providers (CSPs) in their role towards this goal. Only through collaborative partnerships and a deeper understanding of each party’s involvement with GDPR can real progress be made. Any organisation storing data in an IT environment needs to ensure best practice is being followed, such as patching, monitoring and access controls, with the appropriate 24/7 support and governance wrapped around them to ensure protection is sufficient for the value of data being stored. If these environments are in the cloud, then additional activities such as supplier management need to be implemented to maintain the required level of assurance for the organisation, i.e. checking the supplier is implementing the controls at all times. [easy-tweet tweet="GDPR is a game-changing legislation that will affect organisations of all sizes. " hashtags="GDPR, Security"] Additionally, the mindset surrounding GDPR also requires change.
GDPR is not only a business matter to be managed by IT departments; it requires dedicated involvement from all parties in upholding its values. Internally, everyone from junior members of staff to the board must understand the role of GDPR, while externally, business partners and technology suppliers will also be held accountable for ensuring services and security of the highest standards. Putting this in the context of GDPR, in the event of a potential data breach, organisations are now required to declare the incident within 72 hours to the relevant supervisory authority, or else face significant financial penalties. Having robust initiatives and a thorough understanding of how to detect and report threats throughout the entire business ecosystem will be critical in staying on top of the game. GDPR is a game-changing piece of legislation that will affect organisations of all sizes. Organisations can find some solace in the fact that much of the foundation of GDPR is rooted in the Data Protection Directive, but this shouldn’t become an excuse for complacency about GDPR compliance. With more organisations capitalising on the cloud, it only makes sense that cloud can help move towards 100 percent compliance. But this must be matched with a shift in culture and understanding, or else organisations will be left ill-equipped ahead of the May deadline.
### Re:scam Doesn’t Want Us to Russian to Things
We’ve all seen the emails promising us millions of pounds in exchange for a couple of hundred quid, or from that gorgeous Eastern European girl looking to marry us if she could just afford her visa. I’ve forwarded numerous emails to spoof@paypal.com in the hope it’ll make a difference, because truth is, we all fall for it from time to time. However, it’s becoming an increasingly big problem, which one company aims to change – with a twist. Let’s start with some statistics. Globally, cybercrime cost us $125.9 billion in 2016. That’s no small change, even in a world dominated by multi-billion-dollar deals. To put that into perspective, Facebook acquired WhatsApp for $19 billion, which at the time was considered steep in anyone’s view. Cybercriminals made that much money in just 55 days that year. The total cost of financial cybercrime per country is as follows (2016): USA: $20.3 billion; Mexico: $5.5 billion; UK: $2 billion; Germany: $1.5 billion; United Arab Emirates: $1.4 billion; New Zealand: $257 million (all USD). So how do we combat this and start putting money back in the pockets of the good? Whilst New Zealand’s annual loss due to cybercrime is significantly less than that of the other countries in the chart (though an insignificant amount it is not), one company has decided enough is enough. Introducing the star of the show… Netsafe, a New Zealand non-profit online safety organisation whose independent body is made up of the Police, the Ministry of Education, telecoms organisations and IT industry partners. Now, they reckon that the scamming side of cybercrime costs us about $12 billion a year as a planet. Whilst this may only be about 4.66% of the total cybercrime figure, considering this is completely preventable, that’s a big saving available. Anyway, continuing their focus on online safety, the company has created a bot called Re:scam. This focuses primarily on phishing and is designed to waste phishers’ time, distracting them from us regular folk.
(For more information on what phishing is and how to protect yourself online, Netsafe have a page to help) Due to the bot’s artificial intelligence upbringing, it has the power to be, and remain, believable to unsuspecting recipients by learning from a variety of people from all over the world. As the programme engages in more fake conversations with scammers overseas, its vocabulary, intelligence and personality traits will grow. All you have to do to make use of the bot is to forward any dodgy-looking emails to me@rescam.org. If they’re found to be fraudulent, Re:scam will pick up the conversation and continue it for as long as there are minutes in the day. Unless the scammer gets bored and stops, of course. In the last 24 hours, at the time of writing, the AI program had wasted 25 days’ worth of scammers’ time, sending over 16,000 emails. This leaves less time for the trickster to prey on others. Not only does it waste time, but it does it with a sense of humour. Some examples of Re:scam’s communications include: (When talking to a ‘single Russian lady’) – “I don’t want to Russian to things”. (When talking to a ‘bank requiring funds’) – “Are you talking about real money? Because if so then you certainly have my attention.” (When talking to an ‘official person’) – “It all seems quite a wee bit official like.” As you can see, it uses language that suggests it is excited and on board with anything it has to do. Scammers rely on people trusting what they read in order for them to come across as plausible as possible. This then gives off the impression they could have a catch and that they should spend time following it up. Unfortunately for them, they are being played at their own game. Well, not quite – Re:scam doesn’t actually phish for details, unfortunately (Re:scam 2.0 perhaps, Netsafe?). [easy-tweet tweet="to some a robot with multiple personalities sounds like the start of the apocalypse" hashtags="ai, cybercrime, phishing"] Now, to some, a robot with multiple personalities sounds like the start of the apocalypse, but you can rest assured that this robot won’t be using its powers for anything but good. At least… That’s what I’m hoping…. Netsafe? So, there we have it. Give those swindlers a taste of their own medicine by sending them down the rabbit hole whilst you enjoy the fishing that won’t harm you. If there’s one thing to remember from all of this, I think the bot itself sums it up nicely… “Deleting a scam email protects you, but forwarding to me@rescam.org protects others.” Aside: For those of you that, like myself, deal in food rather than money, cybercriminals enjoyed 6,998,332,407 XXL Papa Johns pizzas at your expense in 2016. It’s time to put the ham back in our hands. If you have any comments, or if you want to get in touch and talk tech (or food), follow me on Twitter.
### Application Performance Visibility – Sort Out That Subscription!
Many CIOs struggle to deliver responsive services for globally mobilised workforces while controlling networking costs. But they could take surprising inspiration from the way I’ve managed my new Ultra HD TV package and all the hardware that comes with it. That’s because there are new, emerging ways to regain control of networking costs so that, much like me, you can now pay a small enablement fee upfront, then only pay for the monthly TV ‘package’ you need – without compromising on service. Let me explain. Enterprise CIOs are cornered.
As they run cloud and hybrid environments, digitise processes and mobilise workforces, they must ensure new levels of digital experience for users. But these heads of IT need to better understand network and application performance, and carry out ever-faster local troubleshooting, to truly succeed. Yes, they’ve taken action, using smart tools like SD-WAN to intelligently route traffic across their different WANs, reducing the IT team’s required skill sets and ongoing maintenance and travel costs. But they can’t even realise the full benefits of SD-WAN without network and application performance visibility. [easy-tweet tweet="CIOs are swamped by disparate network and application performance data" hashtags="CIO, Data"] Today’s CIOs are swamped by disparate network and application performance data. Worse, they lack the budget or team resources to upgrade performance monitoring tools or consider buying new, cutting-edge ones outright. Using managed IT service offerings frequently offers no respite either: while external skill sets may augment the IT team’s existing capabilities, the CIO still faces extensive upfront CAPEX costs to acquire the necessary technology in the first place. CIOs are thus trapped in a cycle of rising monthly costs of managing – and upgrading – their different performance monitoring tools while lacking genuine CAPEX headroom to install advanced cutting-edge devices. Let’s go back to my new Ultra HD TV package. I wanted to access a certain entertainment package: movies for the kids, but only certain sports – since bowls and cricket aren’t my thing. A mix of channels was tailored especially for me, and I now pay for monthly access to them. I paid a small upfront cost for service enablement, which involved a new router – I managed to install that myself – the latest devices for each room, which were installed by a professional, and a new remote control that both looks and acts futuristic. But what all that gives me is an affordable version of the latest, cutting-edge TV technology: better storage, amazing new features, possibly the sharpest definition quality ever. And the best thing is that if any of this fails, it’s all under warranty for the service’s lifetime. There’s a small consideration: if I end the contract, my access ceases to exist and I must send everything back. Similar thinking – top-level service and controlled ‘upfront’ hardware costs – is now being adopted in the UK’s networking market to help global enterprises achieve the leap forward in network and application service performance insights, at a price they can afford. This approach is called Visibility as a Service. Instead of CIOs continually buying more performance monitoring tools and ratcheting up their IT resources just to run them, this novel approach gives IT heads a single 24/7 view of their core network and application health, on a worldwide basis, while paying only for the expert engineers’ time and infrastructure needed. Visibility as a Service lifts today’s crushing monitoring and management burdens from IT teams, showing them what’s going on across their global applications and network infrastructures; they get meaningful data on demand to enhance their service levels while capping both monthly OPEX and upfront CAPEX costs. Just like my new Ultra HD TV package with the right-sized service agreements, this new Visibility as a Service capability is now available to global enterprises as customised service packages. If the technology fails, a replacement is provided.
If the service is cancelled, access ends and the technology has to be returned. Options range from an entry-level service, where the IT team does the monitoring tasks while the provider looks after the different monitoring infrastructure, to more advanced network and application performance-focused options where the IT team can access deeper application insights or see bandwidth issues, with targeted technical support still part of the monthly service charge after finite, one-off service enablement costs. And IT VPs wanting to see emerging network and application performance trends using richer, more forensic data can access a higher-level service package, with root cause analysis and troubleshooting capabilities investigated jointly with their service provider. And for CIOs urgently needing visibility down to end-user experience monitoring and rapid troubleshooting across their web, cloud and hybrid environments, bespoke service packages with real-time ‘drill down’ responses and active improvement options are available. Just as I raved about my new Ultra HD TV package’s features, Visibility as a Service packages will help CIOs boost their company’s agility by accessing cutting-edge technologies and skill sets – while escaping the trap of ‘upfront’ CAPEX costs.
### Pre-Christmas Sales Slump Shows Importance of Optimising the Supply Chain
Artificial intelligence and automation of the replenishment process can reduce costs, improving retailers’ profitability The latest figures from the British Retail Consortium, released this week, show an alarming 1 percent drop in like-for-like sales year-on-year before the crucial Christmas trading period. As retailers enter what should be their busiest time of the year, these figures underline the importance of increasing the efficiency of their operations and reducing their costs as much as possible. This is according to Uwe Weiss, CEO of Blue Yonder, the world leader in AI and machine learning for retail supply chain optimisation. Uwe suggests that, in the current climate of uncertainty, optimising the supply chain may hold the key to unlocking cost savings that can help retailers improve their profitability and enhance the customer experience. [easy-tweet tweet="optimising the supply chain may hold the key to unlocking costs that can help retailers improve their profitability and improve the customer experience."] A wide range of factors may have contributed to the sales slump that retailers are currently experiencing. Lingering concerns over Brexit and the state of the economy, along with the recent increase in interest rates, have meant that consumers are saving their money rather than splashing out at the shops. In addition, the October half-term holiday may have resulted in more families spending their money on attractions and days out, rather than shopping. Many retailers may not have accounted for the drop in demand, meaning products that they purchased have been left on the shelf. For Uwe, intelligent use of data can be an important solution in helping retailers to reduce their costs and accommodate inconsistent customer demand: “The recent fluctuations we have seen in the retail market clearly show the importance of accurately predicting customer demand. Retailers simply cannot afford the burden of wasted stock on their bottom line, particularly with the possibility of fewer sales just around the corner and a reduction in demand. In addition, retailers have to maximise their sales potential.
This means having as many products out on the shelves as possible so that when consumers come into the store they have a better chance of finding what they are looking for. This can increase both sales and brand loyalty because if customers know that they can find their favourite products in a store, they are more likely to return. “Retailers are sitting on an enormous amount of data that they can use to make increasingly accurate customer demand predictions. Internal data, such as past sales and product pricing, and external information, such as public holidays, weather patterns and even football matches, can be integrated with AI technology to accurately forecast customer demand. “Replenishment can then be optimised and stock level decisions automated, across thousands of product categories and hundreds of stores. Combining their data with advanced AI and machine learning technology can help retailers to insulate themselves against fluctuating customer demand, ensuring that they do not waste precious resources on over-stocking and always have the product available to satisfy their customers.” Blue Yonder Replenishment Optimization is an artificial intelligence solution that allows automated store replenishment to efficiently reduce waste. The solution utilises a wide variety of data points to create accurate and granular forecasts of customer demand, with a weighted optimisation of waste levels and product availability; its automated decisions reduce the burden of making manual interventions on retailers. For more information, visit our AI in Leadership hub.
### Could Blockchain Trigger the Next Healthcare Revolution?
Blockchain has experienced a meteoric rise in the past year, which has largely been due to its connection to bitcoin – a currency that has gone from obscurity to being a topic of national interest. But blockchain technology holds plenty more potential than just as an enabler of cryptocurrency. The development of the technology is creating opportunities for enterprises across a number of industries, one of which is healthcare. The healthcare industry is often perceived as a slow adopter of technology, so it may come as a surprise that it’s one of the industries that have been quick to assess and realise the transformative effect blockchain could have. Although the technology is undoubtedly disruptive, the journey to blockchain-based healthcare systems or applications will not happen overnight, and will instead be an evolutionary process. For this to work, trust and governance within a blockchain network or consortium will be critical. From securing a high-risk and high-cost industry and creating layered accountability, to creating a trackable log of the drug supply chain, the benefits of blockchain in healthcare are far-reaching. They can include: Clinical trials: The majority of pharma research related to clinical trials is done and recorded in silos, if it’s even recorded at all. An estimated 40% of clinical trials go unreported, and investigators often fail to share their study results. This makes collaboration across an organisation’s internal teams difficult and creates crucial safety issues for patients as well as knowledge gaps for healthcare stakeholders and policymakers. Blockchain-based systems could help drive unprecedented collaboration between participants and researchers around innovation in medical research fields, including delivering precision or personalised medicine.
Interoperability and clinical health data exchange: It’s widely known across the healthcare industry that solving HIE (Health Information Exchange) issues is easier said than done. The fact that data can be structured and unstructured and is often from multiple sources makes it difficult to find a solution. However, blockchain could enable data exchange systems that are cryptographically secured and irrevocable. This enables seamless access to historic and real-time patient data while eliminating the burden and cost of data reconciliation. Population health management: With blockchain, organisations can eliminate intermediaries and access patient databases on a large scale, without needing to rely on HIE-based systems. The technology has the potential to transform population health management by providing trust where none exists, including continuous access to patient records and directly linking information to clinical and financial outcomes. Cybersecurity: A recent report from the Information Commissioner's Office (ICO) and data security firm Egress found that human error, rather than external threats, was the main cause of breaches across every sector of the UK economy. Healthcare organisations suffered 2,447 data breaches and accounted for 43 per cent of all reported incidents between January 2014 and December 2016. Cumulative healthcare breach numbers were almost four times more than the second highest sector, local government. As data digitisation in the healthcare space moves full speed ahead, cybersecurity challenges will inevitably grow. Blockchain-enabled solutions have the potential to bridge the gaps of device-data interoperability while ensuring security, privacy and reliability around cybersecurity. Drug supply chain integrity: The European Observatory estimates that counterfeit drugs cost the pharmaceutical industry in Europe more than €10 billion each year and may result in the loss of up to 40,000 direct jobs. A blockchain-based system – such as the BlockRx project – can create a chain-of-custody log, tracking each step of the supply chain at the individual drug/product level, as well as managing the drug development lifecycle (a minimal sketch of such a log follows below). Add-on functionality such as private keys and smart contracts further strengthens that supply chain integrity. [easy-tweet tweet="The convergence of big data analytics and access methodologies will accelerate the growing trend of data democratisation" hashtags="BigData, Blockchain"] Machine Learning and healthcare IoT: The convergence of big data analytics and access methodologies will accelerate the growing trend of data democratisation and consumers’ access to their personal health information. This will enable all members of a patient’s care team to stay informed, involved and connected with one another. As more sensitive data starts to move rapidly between a larger number of devices, organisations, platforms and machine learning applications, blockchain may be able to offer an easier and more comprehensive strategy for ensuring trust, data integrity and better control over how information is used. So what’s holding the healthcare industry back? Blockchain presents numerous opportunities for healthcare but it is not yet fully mature, nor will it be a cure-all solution that can be immediately applied. While there are many promising projects underway, more proof points around the tangible benefits of deploying blockchain technology are needed.
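To illustrate the chain-of-custody idea referenced under drug supply chain integrity above, the sketch below shows a toy hash-chained log in Python. It is emphatically not the BlockRx design or a real blockchain – there is no network, no consensus and no smart contracts – and the field names are assumptions made for the example; it simply shows how linking each entry to the hash of the previous one makes later tampering detectable.

```python
"""Toy hash-chained custody log (illustrative only; not a real blockchain)."""
import hashlib
import json
import time

def entry_hash(record: dict) -> str:
    """Hash a record deterministically; sorted keys keep the digest stable."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class CustodyLog:
    def __init__(self):
        self.entries = []

    def append(self, product_id: str, event: str, party: str) -> dict:
        """Add a custody event, linking it to the hash of the previous entry."""
        entry = {
            "product_id": product_id,
            "event": event,
            "party": party,
            "timestamp": time.time(),
            "prev_hash": entry_hash(self.entries[-1]) if self.entries else None,
        }
        self.entries.append(entry)
        return entry

    def verify(self) -> bool:
        """Recompute the hash links; any tampered entry breaks the chain."""
        return all(
            current["prev_hash"] == entry_hash(previous)
            for previous, current in zip(self.entries, self.entries[1:])
        )

# Example: a hypothetical batch moving from manufacturer to pharmacy.
log = CustodyLog()
log.append("BATCH-001", "manufactured", "PharmaCo")
log.append("BATCH-001", "shipped", "LogisticsCo")
log.append("BATCH-001", "dispensed", "City Pharmacy")
print("Chain intact:", log.verify())
```

What a real blockchain deployment adds on top of this is distribution: copies of the chain are held by all the participating parties, so no single organisation can quietly rewrite history.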
The industry as a whole must back the technology and trust in its benefits before we’ll see widespread adoption, and this requires education and experimentation.
### British American Tobacco turns to the OpenText Cloud to improve employee information management
OpenText Extended ECM for SAP SuccessFactors provides global, single source of truth for employee records LONDON, UK – November 9, 2017 – OpenText™ (NASDAQ: OTEX, TSX: OTEX), the global leader in Enterprise Information Management (EIM), today announced that British American Tobacco (BAT) has deployed OpenText Extended ECM for SAP SuccessFactors as part of overhauling its HR processes and records management. Using OpenText, BAT is able to cut costs and improve HR document management efficiency across the company. BAT employs more than 50,000 people in over 180 countries, growing rapidly both organically and through acquisition. The company’s success has resulted in a fragmented HR function managing a large and increasingly mobile workforce. As part of its global transformation initiative 'Programme Aurora', BAT selected OpenText Extended ECM for SAP SuccessFactors to integrate with its single, global instance of SuccessFactors. In accordance with its cloud-first strategy, BAT deployed the solution in the OpenText Cloud, principally in the European data zone. "OpenText Extended ECM for SAP SuccessFactors was a natural choice for us. We can now manage employee files, letters, contracts, and other documents in one place, providing global continuity as employees move around," said Andy Straw, IT program manager at BAT. "For the first time, we have a single source of truth." The rollout of Programme Aurora focuses on modernising its HR function modelled around HR shared service centres at strategic locations around the globe. This approach will produce an agile, integrated HR function and will allow BAT to realise cost savings and efficiency improvements for the whole organisation through standard, consistent HR processes, supported by powerful digital tools. "A critical part of this project was improving our HR reporting. In the past, it could take three months to produce meaningful HR reports and analysis," said Straw. "Even the most common of tasks, such as providing an accurate global headcount, could take months, by which time it was out of date. Now, we can produce key business reports, whenever we want, at the click of a mouse." The OpenText solution has also helped deliver additional workflow improvements, such as the automation of key employee HR communications and better privacy and security controls. BAT currently has more than 20,000 employee records in the system and is in the process of expanding it to cover over 50,000 HR files across the organisation. OpenText Extended ECM for SAP SuccessFactors helps organisations redefine employee engagement while effectively managing the new digital workforce, addressing the demands of the digital workforce through the simplification and transformation of current HR processes. Integrated within the SuccessFactors UI, all relevant employee file content is readily accessible to business partners, which empowers them to effectively manage talent, lead HR-specific projects and achieve service level expectations of the new generation workforce.
### Is Big Data the Key to Curing the NHS?
Since the NHS began in 1948, life expectancy has increased on average by a decade, and chronic diseases such as heart disease and diabetes have risen dramatically owing to rising obesity levels.
It is no secret that this is putting a major strain on the NHS – A&E waiting time targets have been missed every month since July 2015, and the NHS net deficit for the 2015/16 financial year was over £1.8bn. However, the healthcare crisis has coincided with the technology revolution. The cost of 1GB of data storage has dropped from £30,000 to 7p between 1981 and 2010, and a range of analytics solutions have come to market. Can big data help solve the NHS’ problems?

Early adoption

The NHS has only used big data in pockets so far, but there have been some interesting and effective use cases. A £1.3 million research project by NHS Scotland is seeing it collect patient-level data and implement big data analysis techniques to get a better understanding of the warning signs of diabetes. The objective was to create computer algorithms that help doctors predict which people are most at risk and most likely to benefit from targeted intervention; it is estimated that this will lead to £200 million in savings and result in a 30% drop in limb amputations over four years. Meanwhile, the National Institute for Health Research Health Informatics Collaborative (NIHR HIC) is producing an anonymised patient-level data set drawn from five NHS Trusts.

The benefits of big data are clear, and it has become much easier for organisations to collect and store this level of data from their customers and stakeholders. The challenge is to convert that data into information that can help improve operations.

The use of data analytics

This is where data analytics comes in, and organisations can choose from a range of solutions in the market that cater for even the most niche areas of their business. These analytics solutions can give a business a view of customer behaviours or consumer trends, and allow it to tailor its products or services to meet those needs more effectively. An example would be the management of patient flow and ward capacity in hospitals during the notoriously busy winter months.

Predictive data analytics and risk scoring have the potential to shed light on patient flow and hospital demand, allowing the NHS to make informed decisions about the allocation of resources and significantly improve its ability to navigate this busy period.

The barriers to implementation

However, despite the potential of big data and data analytics, a seamless analytics offering requires heavy investment from the business involved. It is not a simple case of an analytics solution plucking raw data from the sky; the process relies on IT infrastructure, the organisation of data, and the flow of data within the business as a whole. The challenge for the NHS is its complexity – it consists of 853 for-profit and not-for-profit independent sector organisations, providing care to NHS patients from 7,331 locations across England alone. Therefore, utilising big data and analytics would require a huge financial, planning and design operation to ensure the organisation has the right infrastructure to allow the effective flow of data across the organisation. That is before considering the need to design effective systems for managing and accessing that data, and to implement effective training processes for staff to work with these technologies – it is certainly not something that will happen overnight.
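As a simplified illustration of the predictive risk scoring mentioned in the patient-flow example above, the snippet below combines a few factors into an admission-risk estimate using a logistic function. The factors, weights and cohort are entirely hypothetical; a real NHS model would be trained on historical, anonymised patient-level data.

```python
import math

# Hypothetical, hand-picked weights for illustration only; a real model would be
# trained on historical, anonymised patient-level data.
WEIGHTS = {"age_over_75": 1.2, "chronic_conditions": 0.8,
           "prior_admissions_12m": 0.6, "winter_month": 0.5}
BIAS = -3.0

def admission_risk(patient: dict) -> float:
    """Return a 0-1 admission-risk score via a logistic function over weighted factors."""
    z = BIAS + sum(weight * patient.get(factor, 0) for factor, weight in WEIGHTS.items())
    return 1 / (1 + math.exp(-z))

ward_cohort = [
    {"age_over_75": 1, "chronic_conditions": 2, "prior_admissions_12m": 1, "winter_month": 1},
    {"age_over_75": 0, "chronic_conditions": 0, "prior_admissions_12m": 0, "winter_month": 1},
]

# Expected admissions for the cohort = sum of individual risks, which is the kind
# of aggregate a capacity planner could track ward by ward through the winter.
print(round(sum(admission_risk(p) for p in ward_cohort), 2))
```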
The data protection question In 2018 there will be a new challenge, with the 2018 European General Data Protection Regulation (GDPR) coming into place in the UK and forcing businesses to comply with stricter governance of the sharing of data. A major concern for the NHS will be that these regulations hamper the flow of useful data within the organisation, and make it even more difficult to utilise. Yet even more significantly it will require a financial investment, operational changes and effective training of staff on how to handle data. It inevitably raises the question of whether the NHS can afford to invest the time and capital required to reap the rewards of big data when senior decision makers remain in a constant battle to keep their heads above water. Is it worth it? However, the benefits of big data are clear, and in the case of NHS Scotland, you can see its long-term potential to drive down costs across the NHS. There is also significant faith in this big data movement across the NHS, evidenced by the decision of the Engineering and Physical Sciences Research Council (EPSRC) to fund five research centres around the UK that will apply analytical methodologies to assess healthcare sector concerns. Given that the NHS will be investing in its data management as a result of GDPR, I would argue that it is in-fact the perfect time for the organisation to be investing into effective data analytics solutions that work across all of its Trusts. Modern day computing power and the drop in the cost of data storage mean these solutions are more accessible than they ever have been. It has the potential to revolutionise the way we run health and social care in the UK, and it is time for the NHS to embrace this opportunity. ### Atos revolutionizes enterprise AI with next generation servers Paris, 9 November 2017 - Atos, a global leader in digital transformation, launches BullSequana S, its new range of ultra-scalable servers enabling businesses to take full advantage of AI. With their unique architecture, developed in-house by Atos, BullSequana S enterprise servers are optimized for Machine Learning, business‐critical computing applications, and in-memory environments. In order to utilize the extensive capabilities of AI, businesses require an infrastructure with extreme performance. BullSequana S tackles this challenge with its unique combination of powerful processors (CPUs) and GPUs (Graphics Processing Unit). The BullSequana S server’s flexibility leverages a proven unique modular architecture, and provides customers with the agility to add Machine Learning and AI capacity to existing enterprise workloads, thanks to the introduction of a GPU. Within a single server, GPU, storage and compute modules are mixed for a tailor-made server, for ready availability of all workloads worldwide. Ultra-scalable server to answer both challenges: from classical use-case to AI BullSequana S combines the most advanced Intel® Xeon® Scalable processors - codenamed Skylake - and an innovative architecture designed by Atos’ R&D teams. It helps reduce infrastructure costs while improving application performance thanks to ultra-scalability – from 2 to 32 CPUs - with innovative high capacity storage and booster capabilities such as GPU (Graphics Processing Unit) and, potentially other technologies such as FPGA in further developments. “Atos is a prominent global SAP partner delivering highly performant and scalable solutions for deployments of SAP HANA. 
We have been working together to accelerate SAP HANA deployments by providing a full range of SAP HANA applications certified up to 16TB. The new BullSequana S server range developed by Atos is one of the most scalable platforms in the market, optimized for critical deployments of SAP HANA. It is expected to open new additional collaboration areas between SAP and Atos around artificial intelligence and machine learning,” said Dr. Jörg Gehring, senior vice president and global head of SAP HANA Technology Innovation Networks. BullSequana S - to reach extreme performance whilst optimizing investment: Up to 32 processors, 896 cores, and 32 GPUs in a single server delivering an outstanding performance and supporting long-term investment protection, as capacities evolve smoothly according to business needs. With up to 48TB RAM and 64TB NV-RAM in a single server, real‐time analytics of enterprise production databases will run much faster than on a conventional computer by using in‐memory technology whilst ensuring both security and high quality of service. With up to 2PB internal data storage, BullSequana S efficiently supports data lake and virtualization environments. “To power Machine Learning and AI in enterprise IT, Atos has designed a new-generation computing platform, which accelerates our customers’ digital transformation by converging business‐critical computing and HPC within a single device. BullSequana S is the ultra-scalable, ultra-flexible, go-to server that delivers extreme performance while optimizing investment”, underlined Arnaud Bertrand, Fellow, Head of Big Data and HPC at Atos. Availability The first BullSequana S machines manufactured at France-based Atos factory are available worldwide from today. ### Recognising the Importance of the Cloud Architect There are a number of contributing factors towards the eventual success of cloud adoption. The cloud architect is paramount to the eventual success of cloud adoption. Cloud computing is showing no signs of slowing down. Spending in public cloud services is predicted to reach $260.2 billion in 2017, an 18.5% increase in U.S. dollars (18.9% in constant currency). In addition to this, it is growing more than five times faster than growth in IT spending across all categories. Organisations must realise that the complexities of cloud adoption will continue to provide challenges with regard to successfully adopting cloud computing. The implementation of cloud computing is multidimensional and must be run as a multi-year programme, as opposed to a fixed-duration project.  What’s more, reliable, scalable solutions do not come about from ad hoc, simple projects. Cloud adoption is a hugely complex issue, and as such requires a strong architectural leader that will take executive vision and oversee the entire cloud adoption process. A number of organisations are likely already utilising cloud computing in some capacity, but the technical architects may not feel ready for the adoption that is occurring. There are several possible explanations for this lack of preparedness, but it often arises from organisations beginning their cloud adoption through ad hoc processes. This ultimately leads to issues, frustrations, duplicate work cycles, general inefficiencies or inappropriate use that puts the company at risk. In order to overcome the issues that stem from ad hoc adoption, an architect must be placed in charge of the cloud programme. 
But what exactly is a cloud architect? A cloud architect is responsible for the entirety of cloud computing initiatives within an organisation, as well as directing the architectural aspects of a cloud brokering team across all aspects of IT and the business. They must not only evangelise, strategise and delegate, but also architect, design, facilitate, lead and direct cloud initiatives on multiple fronts.

There are no set responsibilities for a cloud architect. They may shift from day to day, and may vary from organisation to organisation. As the cloud computing market matures and the organisation improves its cloud adoption, the cloud architect will also be required to adapt and support the tasks and projects in front of them. They cannot hope to act as an adequate consultant for the business without staying up to date on the latest trends and issues. That being said, there are three main tasks that the cloud architect is currently responsible for.

1. Ingraining a Cloud Adoption Culture into an Organisation Successfully

Organisations recognise just how vital it is to have a "cloud-first" strategy. But no matter how good the strategy is, a company won't achieve anything if its culture doesn't include a sound understanding of the cloud and all that it can offer. It is the job of the cloud architect to inspire a new culture and influence a change in behaviour toward the adoption and consumption of cloud services. By changing behaviour in the necessary way, organisations can expect to see a culture that embraces cloud services as the primary, prioritised and promoted approach.

2. Develop and Coordinate Cloud Architecture

A further priority for the cloud architect is to develop and coordinate cloud architecture across a variety of disparate areas within the organisation, ranging from user experience to application development. A focused cloud architect must work to closely coordinate the architecture changes across all of these areas. This does not mean that they dictate or implement all of these changes and entirely reshape the business. Rather, it means that they must build relationships with the functional architects in each area to influence each toward the necessary changes.

3. Design a Cloud Strategy and Coordinate Adoption

While a cloud-first strategy certainly places the cloud at the heart of an organisation, it doesn't necessarily mean "cloud-always". After all, there will be certain scenarios in which cloud services may not be the best approach. It is the responsibility of the cloud architect to recognise such situations and convey their thoughts to the company. The way the cloud architect does this is by creating a cloud adoption process to coordinate and align adoption.

When an IT initiative shows signs of being challenging, but also fails to display any indications of slowing down, it is time to allocate people to the initiative. Cloud computing is one example of such an initiative, and it therefore requires a cloud architect to help guide an organisation towards cloud computing success. Moreover, the timing of this is crucial: it is far better if an organisation opts to architect complex IT initiatives as early as possible, when doing so is much easier.
If they choose to do it later down the line, they can encounter problems in the form of existing or in-flight projects. Learn more about CIO leadership and how to drive digital innovation to the core of your business at Gartner Symposium/ITxpo 2017. Follow news and updates from the events on Twitter using #GartnerSYM. ### How 3D Printing will be Valuable in the Future Remember the replicators from Star Trek? 3D printers are the realisation of that technology for the 21st century. 3D printing is one of the most exciting technological inventions in recent decades. It allows you to print anything using cartridges usually filled with plastic or metal powder that binds together to form geometric objects. The 3D printing process works by using computer-aided design software. A schematic of the item you’re trying to print is designed in a computer program, allowing it to be beamed to a 3D printer, which prints or assembles the object you want to create. Depending on the complexities of the object, it can take minutes or hours to print an item. The applications of 3D printing are endless and in this feature, we’re going to examine a few of the more obscure professions and locations 3D printing has and will make a significant impact. One giant leap for mankind Planning to go to outer-space can be challenging. Obviously, you’re going to need to pack the important things like oxygen, food, water, fuel and other essentials. But what if you get up to the International Space Station (ISS) and forgot, say, a really important wrench to repair part of the ISS? With a 3D printer, you no longer need to worry about that problem. You can use the technology to print yourself a brand new spanner - and that’s exactly what Commander Barry “Butch” Wilmore did in 2014. The folks at NASA have been testing 3D printing with a view to leverage the technology on missions to Mars. It’s a truly exciting time for those in the space industry, with 3D printing helping to change the way astronauts approach their missions. 3D printing offshore Right now, 3D printing is used offshore by organisations such as Shell and GE to produce one-off or custom components. In particular, workers use 3D printing to create refined and streamlined designs of particular pieces of drilling equipment. In the future, the oil and gas industry will greatly benefit from 3D printing equipment on an industrial scale to move essential equipment to rigs in various offshore locations, which at the moment can be time-consuming and a drain on resources. 3D printing will help oil rigs to build, replace or repair components and equipment that is essential for their day-to-day operation. [easy-tweet tweet="A significant challenge that faces 3D printing is the quality of raw materials used when printing" hashtags="3D, IT"] For instance, one of Shell’s scientists successfully managed to print a connector he designed and created for a pilot unit that breaks down refinery by-products into building blocks for higher value chemical products. You can watch the remarkable video of the whole process by clicking here. Quality assurance is needed A significant challenge that faces 3D printing is the quality of raw materials used when printing. The quality of 3D printed parts is influenced by the raw materials used. Many 3D printers use powdered metals to produce items, which are more often than not, recycled. However, after repeated usage, carbon, oxygen and sulphur ratios can be altered over time, thus impairing the quality of 3D printed items. 
Equipment like the Inductar EL Cube can be used to monitor the carbon, oxygen and sulphur concentrations of metal powders to ensure high-quality 3D printed parts. While the potential for 3D printing is exciting, considerable caution should be exercised. The technology is in its infancy right now and has a long way to go before mass-production levels of adoption are seen.

### IBM Underscores Commitment to Data Responsibility, Gives Clients Full Control of their Data in Europe via the Cloud

Concerns about data protection, privacy and security are at an all-time high as enterprises around the globe prepare for regulatory and compliance requirements like GDPR. To address this, today IBM has announced major upgrades to its cloud data centre in Frankfurt, Germany, to give clients even more control and transparency over where their data lives, who has access to it and what they can do with this access.

The transition to the cloud presents enormous opportunities for enterprises operating in Europe. The rapid growth of higher value services like AI, analytics, blockchain and IoT is enabling companies to unlock transformative insights from their data like never before. In fact, the European Commission estimates that the value of the data economy in the EU could increase to €739 billion by 2020. While exciting for all of us, this transition also introduces the need for greater responsibility. Concerns about data residency, security and personal data protection are at an all-time high as businesses prepare for pending regulatory and compliance requirements like the General Data Protection Regulation (GDPR).

Every day I hear from a wide range of global clients across every industry that they need tighter controls and visibility into where their data is stored and processed in the cloud. However, they don't want to be slowed down when they decide to tap into higher value cloud services that can drive their businesses forward. That's why I'm excited to announce an important step forward in IBM's ongoing commitment to data responsibility. In December 2017, IBM will roll out a new support model and capabilities for IBM Cloud in Frankfurt, Germany that truly set our approach to data responsibility apart.

Data responsibility is an area where IBM has already taken the lead, and it has always been IBM's policy that the client determines where their data is located in the cloud. Today IBM is taking that a step further to ensure access is fully restricted and to give clients complete control and transparency over where their data lives, who has access to it and what they can do with this access.

"IBM's commitment to data responsibility and the added controls in the IBM Cloud in Europe allow us to trust IBM to protect our most valuable data," says Patrick Palacin, co-founder and CTO of TeleClinic, which provides security-rich telemedicine services to patients throughout Germany. "Our patient and caregiver information is highly sensitive, and additional capabilities to ensure data residency, security and privacy mean the IBM Cloud is the innovation platform we can trust."

Let's take a closer look at what these new capabilities in the IBM Cloud in Frankfurt will mean for enterprises.

IBM Cloud Puts You in Complete Control Over Your Data

Every enterprise is responsible for protecting its data, and this includes knowing who has access to it and when.
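As a rough illustration of that kind of access bookkeeping (not IBM's implementation – the field names and workflow below are invented for this sketch), a client-side record of support-access requests might capture who asked, who approved, and when the temporary grant lapses:

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
from typing import Optional

@dataclass
class AccessRequest:
    """Illustrative client-side record of a support-access request."""
    requester: str                      # e.g. an out-of-region support engineer ID (invented)
    resource: str                       # which environment or data set
    approved_by: Optional[str] = None
    expires_at: Optional[datetime] = None
    audit_trail: list = field(default_factory=list)

    def approve(self, client_admin: str, hours: int) -> None:
        """The client grants a temporary window; the grant is logged."""
        self.approved_by = client_admin
        self.expires_at = datetime.now(timezone.utc) + timedelta(hours=hours)
        self.audit_trail.append((self.expires_at, f"approved by {client_admin}"))

    def is_active(self) -> bool:
        """Access is only valid while the approved window is open."""
        return self.expires_at is not None and datetime.now(timezone.utc) < self.expires_at

req = AccessRequest(requester="support-eng-042", resource="dedicated-env-frankfurt")
req.approve(client_admin="client-dpo", hours=4)
print(req.is_active())   # True until the temporary window lapses
```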
IBM will roll out new controls to ensure access to client content (including client Personal Data and Special Personal Data) is restricted to and controlled by EU-based IBM employees only. These employees will play a critical role in IBM’s incident and change management processes by reviewing and approving all changes from non-EU based employees that could affect client data. In a move that is unique to only IBM Cloud in dedicated environments, clients will review and approve all non-EU access requests to their content if an instance requires support or access from a non-EU based employee. If granted, this access is temporary and the client will be notified when the temporary access is revoked. Logs that track access are made available to the client. Access to IBM’s Higher Value Services to Drive Innovation Clients want to use IBM Cloud as a platform for innovation and new business value, but they need assurance that their data will be restricted to the EU at all times to help comply with regulatory requirements. IBM has one of the most robust portfolios of cloud services on the market and the new support model and capabilities will be applied to the full cloud architecture stack in Frankfurt, including infrastructure and higher value services like AI, data, and DevOps over time. Increased EU-Based Staff for 24/7 Local Support  To better support our EU clients, IBM is growing its customer support teams in Europe to provide 24/7 in-region operations and support. These new EU-based employees bolster IBM’s technical expertise and client success staff across Europe to deliver a robust, always-on client experience that is designed to meet the needs of today’s global businesses. Advanced Encryption Capabilities So Only You Can Unlock Your Data Additionally, in the New Year, IBM will roll out advanced capabilities that enable clients to encrypt their data - at rest and in-transit – with their own master keys. Encrypting data enables clients to store their data in the cloud and protect it from theft and compromise. Since the keys remain in possession of the customer, the data is protected from cloud service providers as well as from other users. Today’s news builds on IBM’s leadership in data responsibility. The only technology company to clearly and completely outline its data responsibility practices and principles in one place, IBM complies with the data privacy laws in all countries and territories in which it operates. IBM was an early leader in developing and adopting the European Union (EU) Data Protection Code of Conduct for Cloud Service Providers for several offerings, securing certification under the U.S.-EU Privacy Shield and the APEC Cross-Border Privacy Rules. Nowhere is IBM’s commitment to Europe more apparent than in our continued investment to expand our cloud footprint in the region. Europe is currently home to 16 of IBM’s nearly 60 cloud data centres across 19 countries worldwide. With data centres in France, Germany, Italy, the Netherlands, Norway, Spain, Switzerland and the UK, clients can confidently run their data when and where they need. We also recognise that public cloud may be the ultimate destination for enterprises, but hybrid cloud is a key part of the transition. That’s why IBM also offers clients the IBM Cloud Private software platform and a portfolio of hybrid capabilities to help bridge public cloud and traditional IT environments. While data privacy is especially important in Europe, clients in many markets face regulatory pressures to protect their users’ data. 
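The "your own master keys" model mentioned above can be sketched in a few lines. The example below uses the open-source Python cryptography package purely for illustration – it is not IBM's key-management service – to show envelope encryption: data is encrypted under a per-object data key, the data key is wrapped with a master key the customer keeps, and the provider only ever stores ciphertext.

```python
from cryptography.fernet import Fernet

# Illustrative envelope encryption: the customer keeps the master key,
# so whoever stores the ciphertext (the cloud provider) cannot read it.
master_key = Fernet.generate_key()            # stays with the customer / their HSM
master = Fernet(master_key)

data_key = Fernet.generate_key()              # per-object data key
wrapped_data_key = master.encrypt(data_key)   # only the customer can unwrap this

ciphertext = Fernet(data_key).encrypt(b"patient record / proprietary IP")

# What gets stored in the cloud: ciphertext + wrapped_data_key (both opaque).
# To read the data back, the customer unwraps the data key locally first.
recovered_key = master.decrypt(wrapped_data_key)
plaintext = Fernet(recovered_key).decrypt(ciphertext)
assert plaintext == b"patient record / proprietary IP"
```

Because the master key never leaves the customer's control, revoking provider access becomes a matter of withholding key material rather than trusting deletion.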
IBM plans to take the improvements outlined here and adopt them across other IBM locations in the future.

### Blockchain and Financial Services

Blockchain is one of the most exciting and often misunderstood technologies of our time. Its effectiveness is rarely in doubt, but what it does and how it works is not always clear to everyone. For the financial services industry, blockchain is definitely a new darling. The ability to record and track transactions in a decentralised database that cannot be changed or meddled with provides a level of security and flexibility that is the stuff of dreams for banks and other financial institutions. As such, blockchain is increasingly seen as a viable technology that could disrupt the financial services industry and become a staple for banks and institutions. In a world where there is increasing pressure on banks and financial institutions to digitise as much of their workflow as possible, as well as provide impenetrable security, blockchain provides the perfect solution to this challenge. From the registration of a loan to the transfer of assets between parties, blockchain's popularity is increasing by the day.

Expensive promises

For many financial institutions, paper is still the bedrock of their lending practices. The documents involved in these processes – mortgage notes, vehicle loans and equipment loans – have a cash value attached to them. The obvious flaw in this system, however, is that if the documents are destroyed, their value is completely lost. There is also the question of how these paper-based processes translate to the digital world. How can banks and financial institutions ensure that digital assets are managed properly as they change hands in today's complex and compliance-driven lending ecosystem? There is increasing pressure on banks and financial institutions to comply with a myriad of regulations, and they need to be able to quickly demonstrate how processes took place.

This is where blockchain's added value comes into play. One of the most exciting and promising features of the technology is its immutability. Once a transaction has been recorded and tracked in the blockchain ledger, it's there for good – it can't be tampered with or deleted. In other words, blockchain provides a security blanket for the transfer of digital assets. This is one of the reasons why many in the industry believe the technology has the potential to further strengthen compliance and audit by demonstrating a secure chain-of-custody for the transfer of any digital asset (e.g. a secured loan or lease) to anyone that has permission on the network.

Understanding the potential of blockchain within financial services

As mentioned earlier, the benefits of blockchain are rarely questioned, but the debate on how banks and other financial institutions can get the best out of blockchain is an ongoing conversation. There is still a lot of work to be done, and organisations need to think more broadly about how they can use the technology and get the best out of it. For many in the financial services industry, especially those working in "innovation labs" and small groups working on projects in an unconventional way, these are exciting times and the possibilities are endless. Another benefit of blockchain is that it gives banks and financial institutions choice with respect to the technology platform they choose while digitising their workflow.
With digital lending, for example, leveraging blockchain to record transactions offers a new way of providing access to all parties to a transaction in an efficient and cost-effective manner. Direct participants and "observers" such as auditors and regulators can more easily track transactions and verify their authenticity.

Blockchain will work with, not replace, existing systems

Blockchain does not replace the entirety of the electronic signature, registry or e-vault capabilities that existed before its implementation. Instead, it can act as a complementary technology that helps simplify visibility into, and the tracking of, document-based transactions linked to financial assets for all parties involved, and further strengthens compliance and audit. For example, e-Signature platforms can integrate with a blockchain-based ledger to record operations such as the transfer of digital assets between two parties, and to verify the authenticity of the underlying documents in the loan package. This has the potential to give financial institutions complete visibility into the lending process and demonstrate who has legal control over the authoritative copy of the loan documents. The process also ensures that an electronic duplicate of the original copy of the loan document can't be made, as that would render both copies invalid. The combination of technologies can also simplify the tracking and storage of loan documents, from origination through funding.

There is much more that could be said about blockchain and its potential benefits to the financial services industry. We haven't even mentioned the connection with cryptocurrencies in this article, and that topic alone could fill books. But however you look at it, there are exciting times ahead for the technology, and financial institutions need to exploit the potential it offers if they want to stay one step ahead of their competitors.

### ThreatConnect now provides one place to visualise your intelligence and operations

In an effort to provide its customers with the most efficient way to gain insight into and situational awareness of their security and intelligence, ThreatConnect, Inc., provider of the industry's only extensible, intelligence-driven security platform, is introducing new platform features and capabilities. With built-in and customisable dashboards and the new TAXII server in ThreatConnect's latest release, users will be able to derive more metrics-based insights into threats and the actions teams take in response.

With ThreatConnect Dashboards, TC Manage™, TC Analyze™ and TC Complete™, customers are able to easily visualise data that displays the impact of their security efforts and gain a better understanding of the threats their organisations face. Customers are able to automatically monitor their security operations and intelligence in a way that is actionable and relevant for their teams. By viewing the built-in dashboards, or creating their own and adding custom metrics – even from outside of ThreatConnect – analysts are able to make critical security operations decisions and act quickly to mitigate threats.
With the power of data to gauge and observe trends, activities, and performance, the dashboard feature provides customers the ability to: See meaningful data immediately: monitor recent history, open tasks, active incidents, false positives and more. Also, there is an added capability to view specific data and dig deeper from the dashboard. Build insightful dashboards: prioritise how information is viewed and determine the best way to display the information using a variety of charts and graphs. Personalise dashboards: teams are able to build and configure their own dashboards for their own use or to share with others in the organisation. Andy Pendergast, vice president, product at ThreatConnect, said, “Dashboards is a powerful feature that will provide valuable insights to our customers from all the systems and data that they plug into Threatconnect, which differentiates us from other vendors in our space. Our customers won’t need to log into multiple systems to get a clear picture of their intel and operations. Plus, they will be able to share the impact of their efforts and the resulting intelligence with their security team and other staff.” Also in this release, ThreatConnect will offer a TAXII Server. Now customers can seamlessly manage TAXII-client enabled systems and tools that ingest data with the increasingly adopted STIX-TAXII standards. Security professionals who need to share enriched intelligence in a standardised way with other groups or associations are now able to connect a variety of TAXII clients directly in the ThreatConnect Platform, as well as leverage STIX-compliant data not found anywhere else (threat rating, confidence rating, and ThreatConnect’s proprietary ThreatAssess score). Pendergast added, “Being able to just pass STIX data is not enough. It could just add more noise. You have to be able aggregate all of your threat data in one place, leverage our analytics to filter what's relevant and then distribute it to your security tools and systems seamlessly with confidence. With ThreatConnect, we put curated and relevant threat intelligence to work to reduce your time to detect and respond to threats.”   ### Juniper Networks Democratizes the Telco Cloud with Contrail Cloud Enhancements Juniper Networks (NYSE: JNPR), an industry leader in automated, scalable and secure networks, today announced enhancements to Contrail Cloud to help service providers mitigate the challenges in building and operating distributed and scalable clouds. These updates, including integration of OpenStack Platform and Ceph Storage from leading open source solutions provider Red Hat, built-in AppFormix automation and visibility, pre-validated virtual network functions (VNFs) and new end-to-end support services, give carriers a far simpler path to delivering business, IoT and mobile services in the cloud. Some of the world’s largest service providers have already proven that leveraging cloud architectures to offer business and mobility services can yield immense benefits, including service agility and speed of innovation, as well as considerably lower risk and costs. However, the road to a distributed, scalable and always-available cloud is paved with operational challenges. Tedious procurement processes, complexities in building clouds, integration and interop, coordinating support across multiple vendors, and IT skills gaps in managing carrier-grade service level agreements (SLAs) can all complicate the journey. 
In a poll among Juniper customers, the company found: Respondents said “lack of visibility for all parts of cloud networks” is their top challenge in migrating to the cloud (nearly 40 percent ranked it No. 1); operational complexity ranked second; skills gaps among engineers third and rip-and-replace costs came in fourth. More than half (55 percent) of respondents indicated they are using two or more vendors in the network, which can add operational complexity. Contrail Cloud makes deploying the telco cloud far easier through simplifying the underlying Linux distribution with Red Hat, seamlessly gleaning network insight with AppFormix, clearing the traditionally difficult task of validating VNFs by pre-qualifying, and adding end-to-end support services to smooth implementation. Companies – from telcos to mobile operators and cable providers – can use this integrated solution to easily navigate the complexities of cloud deployments and operations with improved performance and scale, and service SLAs. Aligning with Juniper’s Cloud-Grade Networking tenets, Contrail Cloud simplifies the deployment, usability and operation of the virtualized cloud infrastructure with a proven, trusted, open and integrated software stack without compromising performance, scale and availability. ### 5 Benefits of ISO 9001 Certification for Cloud Providers Cloud computing refers to an IT solution where different services, such as storage and applications, are deployed via the Internet. Services are typically consumed on an as-needed, pay-per-use basis and are maintained by the cloud provider. Thanks to cloud computing, organisations can have access to powerful and flexible computing capabilities. Businesses can personalise their computing infrastructures to suit their specific needs and the requirements of their particular industry. This allows organisations to externalise all or part of their IT systems, applications and storage. Inspire trust Cloud computing creates numerous market opportunities for organisations. It helps businesses to address their computing challenges by enabling them to do more with their IT budgets. However, choosing the right cloud computing service provider is critical. One way for potential customers to assess providers is by looking at the certifications they hold. By choosing a certified provider, organisations can have the assurance they need to leap into the Cloud. It’s important for organisations to choose a provider whose certifications address the specific challenges of cloud computing. Many organisations hesitate when considering cloud computing as an IT solution. Migrating to the cloud raises questions and concerns, particularly where secure, business-critical applications are at stake. When a company decides to purchase cloud services, it’s important that the cloud service can offer enough guarantees in areas such as reliability, security, and continuity. One credential that may ease concerns is ISO 9001 certification. ISO 9001 is a recognised industry standard for the quality management of business procedures. It applies to all processes, offers product control solutions and better management regulation for any business. As such, it can be especially helpful to cloud providers and their customers. Ensure customer satisfaction The growth of any business is dependent on customer satisfaction, which is driven by consistency and reliability of products or services. 
In short, if your customers trust in your ability to deliver a secure and reliable service, you’ll benefit from repeat business. ISO 9001 accreditation offers proof of an organisation’s commitment to its customers, consequently inspiring trust in a business’ ability to deliver. It demonstrates an organisation’s willingness to find new ways of improving services. In the case of cloud service providers, the most important service element is data management. Providers must be able to handle data and personal information in a specific manner when transmitting over public networks, storing on mobile devices and recovering or restoring important information. Customers will feel more assured in trusting a cloud service provider that can demonstrate third party validation of market-specific best practices. Compliance demonstrates that a company has a deep understanding of how to safely handle information and that it is dedicated to protecting its customers’ data. Increase market share Similarly, certification sends out an important message to potential customers. An independent verification of your internal Quality Management Systems shows a commitment to quality, customer service and continuous improvement. With ISO 9001 certification, your organisation can confidently compete for business. As an internationally recognised standard, accredited certification means organisations meet a high level of quality. Accreditation can give a business the edge over similar set-ups, enabling a provider to differentiate its brand from its many market rivals in the increasingly competitive Cloud. Address security issues Cloud providers have the capacity to control vast quantities of their consumers’ personal information. Data breaches, however, are an unfortunate consequence of the Internet age. Yet, this isn’t necessarily because cloud service providers have weak security. Reputable providers focus on finding new ways to keep data safe and secure. In addition to their efforts, governing bodies like the International Organisation for Standardisation (ISO), also devise new standards and guidelines to help provide guidance on data protection. ISO 9001 accreditation can offer clients peace of mind that the security of their data is being taken seriously. Improve efficiency and effectiveness While cost is often cited as a reason not to implement ISO 9001, once in place, certification is often credited with creating enhanced operational efficiency and subsequent cost savings. The standard stresses quality assurance and an important aspect of this, is security. Accredited cloud providers ensure that threats to the business are assessed and managed on a continual basis. Identity and access management solutions are often put in place to help protect corporate applications, resources and data. From introducing multi-factor authentication to monitoring suspicious activity through advanced security reporting, potential security issues can be mitigated. In this way, the overall efficiency and effectiveness of company procedures undergo continual improvement. ### The Evolution of Technology is Embarking on an Exciting New Chapter Organisations have been adopting new technology in the name of progress for decades. With each new wave of technology, IT professionals are required to learn new technical skills in order to stay relevant. It doesn’t take long for a technology to go from a mystic concept to something that is revolutionising the way we live and work. Procrastinate for too long, and you could be left behind. 
Each new wave of technological change seems to bring with it potential benefits to an organisation even larger than the last. Cloud and the Internet of Things have the power to redefine entire industries, and our lives and businesses are being transformed faster than ever before as a result. As an IT expert, it is your responsibility to ensure you have the right skills to safely lead an organisation through the transformation process. The depth and breadth of skills needed for the latest IoT and cloud solutions are such that you can no longer go it alone. For instance, with IoT, you need to be able to technically understand everything from the endpoint sensor, right the way through to custom applications in a hybrid cloud environment running sophisticated cognitive analytics. We’ve helped some customers design and build sensors from scratch, while other customers have required security and networking expertise so that the data generated from the sensors can be fed through gateways, often to a custom application sitting in a cloud. Client demands of their IT providers are vast, and still increasing. There are numerous examples of IT providers employing our team of software developers to build these custom applications for them, based on the output of end-user workshops. The same team has been used to augment IoT data with other data sets from a third party application, such as a customer relationship management tool or IBM Watson, and the expertise of our analytics team has even been drawn upon to find meaningful insights in big data sets and represent the information in a way that users find valuable. The biggest adjustment for IT professionals is the realisation that a technical skillset alone is no longer enough. Completely transforming the way an organisation operates is a skilled process, and IT professionals are now required to not only execute IT processes, but also convince their organisation of the need for change in the first place. The implementation then must be led from the top - at board level. To do this, you need credible professionals who are able to sit amongst senior board members and explain to them, without using jargon, why they need to embark on the journey. It is much more than just investing in technology. Most of these transformation projects require a complete overhaul and restructuring of an organisation.  So, in addition to broad and deep technical skills, IT professionals now also need to be experts in leadership and change management. [easy-tweet tweet="The IT professional has to be an outstanding innovator and an incredible inventor. " hashtags="IT, IoT"] Moreover, cloud and IoT are “innovation catalyst” technologies since the limits of what is possible rest on how well an IT professional can innovate.  The IT professional has to be an outstanding innovator and an incredible inventor. Why outstanding and incredible? The point of this technology is to outperform your competitors. You need to think of new innovative solutions to business challenges that will give the organisation a competitive edge – this requires people with brilliant and creative minds. Personally, I believe the skillset required has become so vast that it is unlikely to find everything you need in one individual. But this is no bad thing - IT providers will need to build a more balanced team with individuals of different personality types to stay afloat in today’s market.  I see this opening the IT industry to a broader range of individuals and perhaps make it more appealing to women. 
After all, greater diversity can only be a good thing in any industry. If you work for an IT provider, you will be familiar with the challenge of balancing the right skills and expertise with profitability. Whilst finding good people in terms of skill and attitude can be difficult, choices surrounding utilisation and making the numbers balance financially remain challenging - even with the benefit of good staff. Most IT providers prefer to have their skills remain in-house, but this is not financially viable and, looking ahead, IT providers must find external partners with skills to complement their own in order to survive. At Tech Data, we’re in a very fortunate position in that respect. Few IT providers have the scale and resources of a $25+ billion global IT distributor, and we can fill skills gaps for our customers and partners on a project-by-project basis.  In this way, we can help IT providers on-board new technologies without significant financial risk. But what about the skills you want to build up in-house? Our Academy business focuses on providing vendor-authorised technical training to IT providers and their customers. When it comes to deployment of hardware and software solutions, it is essential for teams to have up-to-date knowledge, not to mention first-class training which helps IT, professionals, to build more fulfilling careers in the industry and enhance their personal development. Cloud and IoT are very broad terms that apply to lots of different types of technology, and so, when it comes to technical IT training, there is no single technical course that covers everything an IT professional will need. Instead, we help students to identify the right product and technology courses for them using the right vendors and then guide them through the certification process. Finally, a successful cloud and IoT project depends on having a good understanding of the cloud and IoT ecosystems which today include new and exciting technology companies that have been born in the cloud or IoT. Many of these don’t come from the traditional data centre environment. These companies are pushing the boundaries of technology to deliver transformational and innovative solutions faster, as well as slashing development costs. Tech Data is passionate about helping its partners understand and navigate the ecosystems and using its contacts to open doors for our customers. IT professionals have always been about understanding business requirements and delivering a robust and relevant solution. Today that takes visionaries - individuals who will lead the strategy conversations and, through innovation, show businesses what is possible. The rewards for those who can do this are significant and continuing to grow. We see this on a daily basis, and it is why we are so committed to helping IT professionals guide their organisation through IoT, cloud, or a business transformation journey, from start to finish. ### AI to Provide the Antidote to Pressures on Retailers Following Brexit Intelligent analysis and automation of replenishment process can optimise the retail supply chain Retailers, currently suffering through some of the most challenging trading conditions they have ever faced, should look to new technology to optimise their operations, reduce their costs and strengthen their bottom line. This is according to Uwe Weiss, CEO at Blue Yonder, the world leader in AI and machine learning for retail supply chain optimisation. 
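As a toy illustration of the forecast-driven replenishment discussed in the rest of this piece (synthetic numbers and a naive moving-average forecast – nothing resembling Blue Yonder's actual models), the basic loop is: forecast demand per product, then order the gap between the forecast and current stock.

```python
# Toy replenishment loop: forecast next-day demand from recent sales,
# then order whatever is needed to cover it. Illustration only.
recent_sales = {"milk": [34, 41, 38, 45, 40], "bread": [22, 19, 25, 21, 23]}
current_stock = {"milk": 12, "bread": 30}

def forecast(sales_history, window=3):
    """Naive moving-average forecast of tomorrow's demand."""
    return sum(sales_history[-window:]) / window

def replenishment_order(product):
    needed = forecast(recent_sales[product]) - current_stock[product]
    return max(0, round(needed))   # never order a negative quantity

for product in recent_sales:
    print(product, replenishment_order(product))
# milk -> order ~29 units; bread -> already covered, order 0
```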
Uwe suggests that with advanced innovations in supply chain technology, retailers can better prepare themselves for potentially challenging times ahead.

A wide range of factors bear responsibility for the challenging situation that many retailers find themselves in. A stagnant economy with low wage growth and rising inflation has led to a marked decline in consumer spending, and the uncertainty following the vote to leave the European Union has only worsened matters, with the subsequent depreciation of the pound putting a strain on retailers' supply chains and profitability. In addition, analysis from the British Retail Consortium indicates that global food commodity costs have risen by an average of 17%, putting further pressure on retailers' supply chains.

Uwe Weiss comments on the impact that this is having on the retail sector: "With retailers suddenly having to pay much more for their stock, it is more important than ever that they ensure produce levels are managed correctly. Retailers simply cannot afford to waste stock, whether that is food that spoils before it can be sold, or branded goods that are out of season or out of date before they reach the shelves. There is a real risk that if retailers cannot optimise their supply chains, consumers will begin to see some of their favourite products go missing from the shelves.

"Retailers must balance the need to keep products on their shelves with maintaining a profitable and efficient business. This is where technology can give retailers a competitive edge and insulate them against fluctuations in the supply chain. Retailers have access to vast reams of data, including their own internal data such as past sales patterns and customer footfall, and external information such as public holidays and the weather.

"When this data is combined with advanced AI technology, stock replenishment optimization solutions can accurately predict customer demand and automate stock level decisions across thousands of product categories and hundreds of stores. By using the data at their disposal to optimise their stock levels, retailers can ensure that, even though they may have to pay more for some products, they maximise the profitability of these products."

Blue Yonder Replenishment Optimization is a machine learning solution that allows automated store replenishment to efficiently reduce waste. The solution utilises a wide variety of data points to create accurate and granular forecasts of customer demand, with a weighted optimization of waste levels and product availability, and its automated decisions reduce the burden of manual interventions on retailers.

### 70% of European IT decision-makers see Brexit as a business opportunity

France, Belgium, Italy and Switzerland see most opportunity in Britain leaving the EU as organisations look ahead to 2018

London, UK, 6th November 2017 – Interoute, the global cloud and network provider, has revealed new survey results showing almost two-thirds (63 per cent) of businesses across Europe are anticipating high growth in the year ahead. Reassuringly, despite its turbulent outlook, the UK was only slightly behind the European average at 61 per cent. As the political climate presents opportunities for those countries and companies that can move at pace, on average 70 per cent believed Britain leaving the EU was an opportunity for their company.
In the survey of over 800 European IT decision-makers, business confidence surrounding Britain leaving the EU was highest in France (83 per cent) and Belgium (78 per cent) but lowest in Sweden (49 per cent). While the outlook surrounding Brexit was generally positive across Europe, 63 per cent of IT leaders anticipate it could ultimately result in the disintegration of the EU. In Germany (78 per cent), France (70 per cent) and Italy (66 per cent) this belief was significantly stronger.

| Country | Strongly agree Britain leaving the EU is an opportunity | Strongly agree Brexit could result in the disintegration of the EU |
| --- | --- | --- |
| France | 83% | 70% |
| Germany | 67% | 78% |
| Denmark | 70% | 46% |
| Italy | 76% | 66% |
| Sweden | 49% | 59% |
| Netherlands | 63% | 60% |
| Belgium | 78% | 60% |
| Switzerland | 74% | 64% |
| European average | 70% | 63% |

Given the changing political climate, 95 per cent admitted Brexit specifically had impacted their decision-making process, while 96 per cent said the overall current political uncertainty in Europe (e.g. US and Russian foreign policy) had impacted their decision-making.

As organisations have been forced to respond to the changing business landscape, the majority of IT leaders have shifted their infrastructure choices to focus on driving more agility. In total, three in five European organisations (61 per cent) are pursuing new digital transformation projects. The UK follows closely behind the European average, with 60 per cent of IT decision-makers investing in technology that allows them to continuously innovate and maintain a competitive edge.

Across Europe, the top drivers for digital transformation projects were cited as reducing costs by improving operations (46 per cent), enhancing the employee experience through more mobile and social capabilities (45 per cent) and improving the customer experience (41 per cent). However, in the UK, enhancing the employee experience emerges as the top priority for nearly half of companies (45 per cent), which may indicate a desire to improve employee productivity and create great customer experiences.

Matthew Finnie, CTO of Interoute, commented, "The UK and mainland Europe are in the middle of a period of political uncertainty, which is impacting IT decision makers across all sectors. The more ambitious organisations see this as a time of opportunity where the right technology investments will result in high growth and an enhanced experience for employees and customers alike. Changing foreign policy politics, Brexit and even new GDPR regulations are forcing organisations to review the legal, financial and operational aspects of their business. Digital transformation is about creating an organisation that is flexible enough to respond to market and geopolitical changes. Organisations that move now to ensure they have the right digital infrastructure foundation in place will undoubtedly be the best placed to succeed in these uncertain times."

The results of this survey are taken from the report, 'The challenges facing European IT decision-makers engaged in digital business transformation,' which explores the factors governing IT decision-making and how senior IT professionals are preparing for change.
### Ultima implements innovative virtual workspace for Systems Powering Healthcare

6 November 2017 – Ultima, a leading provider of on-premise and cloud IT infrastructure and managed service solutions, has implemented an innovative virtual workspace solution for Systems Powering Healthcare (SPHERE), the IT services company specialising in healthcare work with NHS trusts. The solution will see SPHERE host its IT services in the cloud, enabling it to quickly increase or decrease the number of users and introduce new software simply, while ensuring it can react immediately to any cyber-attacks and secure its data more effectively than before.

The solution combines Microsoft Azure and Citrix Desktop technologies to deliver a virtual workspace for enterprises in the cloud. SPHERE can now quickly ramp IT user capacity up or down, paying only for what it needs, resulting in better ROI and cost savings. This solution is the result of a partnership first, with Ultima bridging the gap between the Microsoft Azure and Citrix Desktop infrastructures to create a fast, simple transformation to virtual environments that allow employees to work anywhere, anytime, in a secure way. And rather than dealing with multiple vendors, Ultima is providing a 'one-stop-shop' for customers.

Scott Dodds, CEO of Ultima, says, "We have partnered with Microsoft and Citrix for over fifteen years, and unifying these relationships means that we can help organisations transition to virtual working quicker and more cost-effectively than before. We have deep expertise and understanding of both the Microsoft Azure and Citrix technologies, enabling us to move customers to the cloud no matter how complex their legacy or bespoke platforms and applications. Our Modern Data Centre and WorkSpace offerings, which combine Microsoft and Citrix technology, will help customers to make the change and deliver secure, effective, evergreen IT."

### Securing Your Cloud Infrastructure

While cloud applications have demonstrated their ability to increase enterprise productivity, IT security executives currently face a tough balancing act: ensuring that sensitive information remains protected while the business continues to progressively adopt cloud applications. The days when every application ran on-premise are now a thing of the past, and since then the rulebook has changed. Organisations now rarely have physical access to their data storage, which means it's vital that fundamental questions are asked and that teams understand who needs to take responsibility for security. You might ask:

Who's really responsible for my data?

The short answer is: you are. As the data owner, it's your responsibility, not that of the Cloud Service Provider (CSP), to secure your data. The EU's General Data Protection Regulation mandates that, in the event of a data breach, enterprises will be held directly accountable and will not be able to shift the blame onto the cloud provider which holds the information.

Where's my data?

The data you have stored in the cloud resides in a physical location. So when setting up and managing your storage, make sure you discuss with your Cloud Service Provider which country, or countries, your data will reside in. Be aware that the requirements and controls placed on access differ from country to country.
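For teams putting the "where's my data?" question to an S3-compatible object store, one practical check is to compare a bucket's reported region against the residency policy before anything is written to it. The sketch below assumes the AWS boto3 SDK and an invented bucket name; other providers expose similar calls.

```python
import boto3

REQUIRED_REGION = "eu-west-1"   # example residency policy: data must stay in Ireland

def bucket_region(bucket_name: str) -> str:
    """Ask the object store where a bucket physically resides."""
    s3 = boto3.client("s3")
    location = s3.get_bucket_location(Bucket=bucket_name)["LocationConstraint"]
    # Quirk of the API: buckets in us-east-1 report None for the constraint.
    return location or "us-east-1"

def assert_residency(bucket_name: str) -> None:
    """Fail loudly if the bucket is outside the region the policy requires."""
    region = bucket_region(bucket_name)
    if region != REQUIRED_REGION:
        raise RuntimeError(f"{bucket_name} is in {region}, policy requires {REQUIRED_REGION}")

# assert_residency("example-patient-data-bucket")   # hypothetical bucket name
```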
Regulations like Privacy Shield have the potential to pose a significant issue for companies looking to use cloud services in different jurisdictions. If your cloud provider stores your information outside the EEA, for example, the data protection policies may be less stringent. Should a data breach occur, it is you who would be held accountable. Who has access to my data and my code? Insider attacks present a huge risk. A potential hacker could easily be someone with approved access to the cloud. Additionally, more advanced users might have access to encryption keys for data that they might not have approval to view. You need to know who’s managing your data and the types of security controls and access management protocols applied to these individuals and their network accounts. A key element worth putting some thought into here is the encryption of data, both in transit and at rest. Data encryption is an effective way to maintain compliance with regulatory programmes, and you’ll need to be able to answer questions such as “who has the keys”, “when was it encrypted”, “who encrypted it” and “when does the encryption expire”. What is the current maturity and long-term viability of my chosen CSP? How long have they been in business? What’s their track record? Are they operationally effective and secure? If they go out of business, what happens to your data? Naturally, data confidentiality within cloud services is a fundamental concern: you need to be confident that only authorised users have access to your data. Here, we must stress again that, as a data owner, you are fully responsible for compliance – it’s up to you, not the CSP, to secure valuable data. Public cloud computing asks you to exert control, without ownership of the infrastructure, in order to secure your information through a combination of encryption, contracts with service-level agreements, and (contractually) imposing minimum security standards on your provider. What happens if there’s a security breach? What support will you receive from the provider? While many businesses claim to be hack-proof, cloud-based services are an attractive target for determined hackers. BT has in place established policies and procedures that ensure the timely and thorough management of incidents according to priority. For example, BT contractors, employees and third-party users have a responsibility to report all information security events in a timely manner. Every event is reported promptly, either through the BT Cloud Compute Service Desk or the Portal, in compliance with statutory, regulatory and contractual requirements. What is the disaster recovery/business continuity plan? Remember your data is physically located somewhere, and all physical locations face threats such as fire, storms, natural disasters and loss of power. In case of any of these events, you need to find out how your cloud provider will respond, which protocols it has in place, and what kind of guarantee it offers to continue services. Successful business continuity depends not only on the CSP’s provision of the IaaS, but on the timely recovery of your data, which is ultimately your responsibility to ensure.
### OpenStack private cloud revenue to overtake its public cloud revenue in 2018
Growth of OpenStack private cloud will overtake public cloud revenue for service providers sooner than previously anticipated.
Sydney and New York City, 6th November 2017 – 451 Research predicts that service providers with OpenStack private cloud revenue will exceed revenue from service providers with OpenStack-based public cloud implementations in 2018, sooner than previously expected. From a regional perspective, deployments in China and APAC are now growing faster than in the rest of the world. While not the primary driver, a contributing factor is the Chinese government’s Ministry of Industry and Information Technology advocating for OpenStack. The latest study, published on the first day of OpenStack Summit 2017, forecasts OpenStack revenue will exceed $6bn by 2021, as it experiences a growth rate of 30% CAGR. OpenStack is benefiting from the increase in hybrid cloud environments with 451 Research’s Voice of the Enterprise: Storage Survey finding that 61% of the enterprises surveyed now utilize a hybrid cloud approach. Certain verticals and regions that are less enthusiastic about exclusively using hyperscalers also present a growth area for OpenStack. 451 Research believes the OpenStack development community has made progress addressing challenges in installation and rolling upgrades over the last two years, resulting in increasing production deployments. This will continue in 2018 as OpenStack becomes more shrink-wrapped and risk is reduced for enterprise use. The platform was once limited to mostly development/testing and proof-of-concept deployments but there are now mission-critical workloads on OpenStack across nearly all enterprise verticals and regions. [easy-tweet tweet="OpenStack growth is now shifting towards the private cloud space faster than previously expected." hashtags="OpenStack, PrivateCloud, Cloud"] To date, OpenStack revenue has been overwhelmingly from service providers offering multi-tenant IaaS, but 451 Research Market Monitor finds that OpenStack growth is now shifting towards the private cloud space faster than previously expected. While containers are eclipsing OpenStack in terms of attention, 451 Research’s OpenStack Pulse finds that very few OpenStack users are abandoning OpenStack. In reality, some of the most innovative and progressive OpenStack deployments feature the use of container technology such as Docker and Kubernetes. ### 10 years of AWS EMEA Ten years ago, on November 6, 2007, Amazon Web Services (AWS) opened its first infrastructure Region outside of the US, and first in Europe, Middle East, and Africa (EMEA), in Ireland. The first service available to customers was the Amazon Simple Storage Service (Amazon S3). This was the first time low cost, secure, and durable storage was available to customers, on European soil, with pay as you go pricing. After launching with one service, in one EMEA region, AWS now has more than 80 services available for customers in EMEA across eight Availability Zones in three infrastructure Regions, in Germany (opened in October 2014), the UK (opened December 2016), and Ireland.  To support customer demand AWS continues to expand and still has three additional Regions, and nine Availability Zones, in Bahrain, France, and Sweden yet to open between now and early 2019. [easy-tweet tweet="Now, on November 6, 2017 the most expensive storage tier for Amazon S3 in the AWS Ireland Region is $0.023 per Gb per month – nearly 90% less than at launch." hashtags="AWS, EMEA"] Over the last 10 years AWS has continually lowered the prices of their services. When AWS launched Amazon S3 in Ireland in 2007 it cost $0.18 per Gb per month for standard storage. 
Now, on November 6, 2017 the most expensive storage tier for Amazon S3 in the AWS Ireland Region is $0.023 per Gb per month – nearly 90% less than at launch. It isn’t only with Amazon S3 that pricing has been reduced. AWS has lowered prices 62 times across the services with no competitive pressure to do so. As AWS continues to grow the business in EMEA, and around the world, you can expect the prices of the services will continue to fall. AWS customers, by the numbers AWS is now the foundational infrastructure for some of the most exciting, fastest growing, and most well-known startups in EMEA, and around the world. AWS counts startups such as, Blippar, Deliveroo, Delivery Hero, Dubsmash, Fanduel, Funding Circle, Gett, Home24, iZettle, Just Eat, King, Mojang, Rovio, Spotify, Shazam, Skyscanner, Supercell, TrustPilot, Vente Privee, Viber, Vivino, Zalando, Zoopla, and many others as customers. AWS has also been helping European enterprises to use the cloud to embrace innovation and set themselves apart from the competition. Customers from across EMEA, from organisations of all sizes, are doing exciting work using AWS. ### Integrating IT with Business and Society Computers changed the way that businesses operate, and now digital technologies are changing society too.  Digital practitioners are emerging as the professionals who will deploy these technologies in business enterprises. They must also consider the social dimension. Business Use of IT Business use of Information Technology (IT) started in 1951 when Lyons of London used their LEO 1 computer for bakery valuations. LEO’s applications increased during the 1950s to include such things as production, accounts, payroll and, under contract to the UK Meteorological Office, scientific calculations. Such applications may still form (pun intended) the lion’s share of business IT applications, but the way in which they are delivered has changed. The introduction of personal computers allowed much greater user interaction, the Internet brought universal connectivity, and the World-Wide Web moved the user from the periphery of the IT systems to their heart. The old “input-process-output” model, with the process stage visible to and understood by just a few experts, is gone.  The process information is exposed to end users, who can change it. [easy-tweet tweet="Digital technologies have built on the Web and added to the business capabilities of IT systems" hashtags="Digital, IT"] The Digital Revolution Digital technologies have built on the Web and added to the business capabilities of IT systems. Led by cloud computing, they also include social computing, mobile computing, and big data analysis. Networking has vastly increased the value of IT systems that monitor and control their surroundings, giving us the Internet of Things. Cognitive computing and artificial intelligence are adding to the analytical capabilities, and supporting a more human style of user interface. These technologies do not just alter the way applications are delivered; they make possible new applications, such as shopping, job seeking, news publication, and product support. Like the original use of computing, the new applications are changing business but, more than this; they are changing society. This is the digital revolution. The Business-IT Gap In the early days, the business people had their visions of what computing could do, and hired technicians to turn them into reality. Engineering computer hardware, and programming the software, became specialized disciplines. 
A business-IT gap grew between two sets of people neither of whom wanted to, or could, do each others’ jobs. There are examples of technicians launching successful businesses, most notably Facebook founder Mark Zuckerberg but, particularly with traditional IT, there is still a gap between business people who don’t understand why their IT is such a big cost and IT people who don’t understand how it could help the enterprise make a bigger profit. The Digital Practitioner The digital technologies go hand in hand with new approaches to development and operations.  “Agile” methodologies result in projects that deliver value continuously from their early stages and evolve through user feedback. “DevOps”, by combining development and operations, ensures that developers know how customers use their systems, and understand their needs. The digital practitioner is emerging as a new professional that can help enterprises harness digital technologies and methods to gain business benefit. The exact role is still evolving. As Forrester’s Charles Betz points out in his book on Digital Delivery, “Digital investments are critical for modern organizations and the economy as a whole. . . Now is an ideal time to re-assess and synthesize the bodies of knowledge and developing industry consensus on how digital and IT professionals can and should approach their responsibilities.”  Digital Value The role of the enterprise in society is changing too. Betz puts an understanding of digital value at the start of a digital practitioner’s education, and few would question this. But what is digital value? The World Economic Forum White Paper: Unlocking Digital Value to Society: A new Framework for Growth gives it a social, as well as a business dimension, adding indicators such as carbon emissions and job creation to financial ones like operating profit and value addition. The paper says that some social value is “trapped” because there is no economic incentive for businesses to provide it. There is, however, a moral incentive because digital technology “has given the world a front-row seat to business decisions and operations, and no company can make decisions that are unacceptable to public opinion without scrutiny, severe penalty or, in the most extreme cases, extinction.” The unstated implication is that an evolution of enterprise models is needed for delivery of more social value. Delivering Value In The Open Group, the Open Platform 3.0™ Forum is looking at how enterprises can gain business value from new technologies. With members of other Open Group Forums, its members participate in the Digital Practitioners Work Group, whose aim is to develop and promote an understanding of what it means to be digital, and which is working to create a Digital Practitioners’ Body of Knowledge. Such work, in industry bodies and academia, will define a professional approach to the use of digital technology. That approach cannot now just take account of machines and money. It must also consider people. Digital practitioners should help their companies succeed by achieving social objectives. Business is an integral part of society, and technology can help the business deliver social value. ### QStory publishes new White Paper: “The Next Step Up in Contact Centre Performance” QStory, a provider of end-to-end automated Intraday Management solutions for contact centres, has announced the publication of a new White Paper entitled “The Next Step Up in Contact Centre Performance”. 
The publication is aimed at contact centre, human resources and training professionals and provides a practical guide to maximising agent idle time for coaching and other skills improvement activities. Paddy Coleman, CEO of QStory, commented, “Contact centre organisations strive to deliver exceptional customer service whilst making the most of agent time and valuable skill-sets. However, when service levels are squeezed, training is often the first casualty. “By combining the latest Intraday Automation with a well-established WFM infrastructure, managers are better able to manage a mix of people with different skills, preferences, contracts, schedules and breaks in real time. Readers of our new white paper will learn why technology has a strategic role to play and how Intraday Automation should be an intrinsic part of every HR professional’s talent management, career progression and training programme armoury.” To download the white paper please visit QSTORY WHITEPAPER.
### Bracknell Forest Council appoints Invotra to transform Intranet
Invotra is already used by 45% of central government. Fast-growing tech company and intranet software provider Invotra have announced their appointment by Bracknell Forest Council, to help them transform their intranet, become more efficient and encourage greater collaboration among their employees. Bracknell Forest are Invotra’s first local authority client; they already work with 45% of central government, including the Home Office, Department for Transport, Department for Work and Pensions and HMRC. The local authority previously had a static system that offered no social or collaborative capabilities for staff. Invotra were brought in to solve the challenges Bracknell Forest were facing with delivering multiple services on a tight budget, discouraging siloed working, reaching a diverse workforce and enabling them to work flexibly, and improving the digital skills of their wider team. The roll-out of Invotra will provide Bracknell Forest’s employees with access to a range of interactive features, such as a message wall to encourage collaboration, groups where staff can come together and discuss a particular subject, and a powerful content management system. They will also benefit from single sign-on, as Invotra will integrate with their Active Directory, meaning they won’t have to remember their username and password each time they log in. Invotra’s ‘software as a service’ (SaaS) intranet platform also means that Bracknell Forest will have access to features and functionality requested and developed for their central government counterparts, at no extra cost. Bracknell Forest will also be involved in the alpha roll-out of a significant project, GOV.invotra – a digital communications platform where all of Invotra’s government clients can come together to discuss, share best practice and collaborate with one another, and with Invotra. The project is currently in its early stages, but is already gathering momentum among Invotra’s central government client base. Fintan Galvin, CEO of Invotra, said, “We’re delighted to be working with Bracknell Forest, and welcome them on board as our first local authority client.
They have a clear understanding of how digital communications platforms can be transformative for engagement, efficiency, and collaboration, and are already giving us lots of positive feedback following the initial training sessions. “It’s also very positive that they’ll be a part of the GOV.invotra project. We believe it’s vital for local authorities to have a more audible voice and opportunity to engage with central government; GOV.invotra gives them that opportunity. The platform has the potential to bring all government departments together in a way that has not been seen before, and it’s a platform designed by government for government, so we’re excited to increase the scope of the proposition to ensure it also serves local authorities.” Colin Stenning, Digital Services Manager at Bracknell Forest Council, said, “Like most local authorities, we’re facing financial challenges as well as challenges around a dispersed workforce and limited opportunities for collaborative working.” “To tackle these challenges, we’ve been working with Invotra to develop a new interactive and collaborative intranet. Once launched, this is intended to encourage our staff to work more effectively and use the intranet as a two-way tool rather than just a static broadcast channel, which it has been previously.” “We’re also very keen to get behind the GOV.invotra project; we believe that engaging central government directly, and being a part of the conversation, is a great opportunity, and that it will be of longer-term benefit to public services.” Invotra have been appointed by Bracknell Forest Council on an initial two-year contract, with the option to extend for a further two years. Initial training was completed within three weeks, allowing Bracknell Forest’s team of ‘webmasters’ to deliver training to the workforce and get users fully engaged.
### Securing the Internet of Old Things
The IoT is growing at an unprecedented rate and market forecasts are ambitious. While IHS has suggested that there will be 30.7 billion connected devices by 2020 and 65.4 billion by 2025, Intel has gone even further with its prediction that there will be 200 billion connected devices by 2020. New products are hitting the market every day, but we’ve largely glossed over the fact that the IoT is not a new phenomenon. Although the term was only coined in 1999, ‘old’ internet of things devices existed before then – from printers connected to fixed networks, to old smart TVs and Bluetooth speakers. While these devices still work perfectly well, what’s concerning is that we do not know how secure they are and, more often than not, neither do their manufacturers. These devices all had code built into them to make them useful, but future-proofing them was seen as a cost and time sink - not a revenue generator. The necessary updates may be found somewhere on the manufacturer’s website for consumers technically savvy enough to find them, but most devices were sold with unpatched embedded operating systems. As cyber-attacks continue to hit the headlines and grow more sophisticated, it is simply financially unsustainable to have so many easily compromised devices in the market. The reasons behind the risks The IoT has an encouragingly low barrier to entry, but only a very small number of products make it through even a single 9-18 month product cycle. Even more products fail once they have been launched to market, which often puts an end to any future security updates, leaving a huge number of IoT devices unsupported.
Essentially, short product lifespans offer little incentive to provide costly long-term support, leaving devices ripe for hijacking. Another contributing factor in the rise in IoT security vulnerabilities is product design. Most consumers want and appreciate easy-to-use devices, and this is often what helps one company gain an edge over its competitors. However, it comes at the cost of compromised security. Not only is building secure core software and password handling often seen as uneconomical, but anything other than a default password is widely regarded as too complicated for most consumers and therefore an unattractive consideration for companies. And finally there remain serious issues around ownership, as it is not always clear who is responsible for ensuring the security of IoT devices. This disconnect within the production process means that security holes are being left open at all stages of the product development phase, resulting in physical vulnerabilities in the hardware as well as others within the operating system and at the application level. While all of these issues are still a concern for new IoT devices, they are particularly damaging for the internet of ‘old’ things. The potential implications of leaving these devices unsecured include: the theft of personal information (e.g. family photos on a computer can be encrypted, leading to ransomware issues); loss of sensitive data (if a Wi-Fi router can be controlled, then cyber criminals can easily steal data by acting as the ‘middle-man’ for online banking, Google or social media websites); and burglaries (thieves can steal household belongings by using a compromised computer to break into a home). How to secure IoT devices Admittedly, profit margins on IoT devices don’t pay for ongoing maintenance, but by centralising the responsibility for updates, manufacturers can ensure their devices are protected against malicious threats. By separating the hardware, the low-level software (i.e. the kernel), the operating system, and the overlaying software into independent components, both vital software and firmware updates can become increasingly automated. This ultimately maximises the overall security of the end device, with core updates running every time an IoT device is connected to the internet. By requiring digital authentication for all apps, the operating system can isolate any hostile app to make sure no further damage is done to the device. As well as significantly reducing the risk, this approach makes the updates as pain-free as possible for manufacturers and consumers alike. Raising awareness of security threats is an important and ongoing task, but the reality is that many consumers aren’t as motivated as they should be to secure their devices. Addressing the issue needs to begin with the operating system and be built up from there. At Canonical, we believe that the ability to update software reliably and automatically is absolutely crucial, particularly for older IoT devices that may become physically harder to access. The company’s initiative, Ubuntu Core, delivers exactly that – free and automatic updates to both the operating system and related apps.
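To make the update model described above more concrete, the sketch below shows a device accepting an update only when a vendor-signed manifest verifies and the payload’s digest matches that manifest. It is a minimal illustration under stated assumptions, not how Ubuntu Core or any other product actually implements updates – snaps, for instance, use signed assertions and transactional installs – and the manifest fields, key handling and function names are invented for the example.

```python
# Sketch of the "centralised, authenticated update" idea described above.
# Purely illustrative: real systems such as Ubuntu Core use signed assertions
# and transactional, rollback-capable updates; the manifest layout and
# function names below are invented for this example.
import hashlib
import json

from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey


def publish_update(vendor_key: Ed25519PrivateKey, payload: bytes):
    """Vendor side: describe the update and sign the description."""
    manifest = json.dumps({
        "component": "kernel",          # updates target one isolated component
        "version": "4.4.0-98",
        "sha256": hashlib.sha256(payload).hexdigest(),
    }).encode()
    return manifest, vendor_key.sign(manifest), payload


def apply_update(vendor_public_key, manifest: bytes, signature: bytes, payload: bytes) -> bool:
    """Device side: refuse anything that is not signed and digest-correct."""
    try:
        vendor_public_key.verify(signature, manifest)        # authenticate the manifest
    except InvalidSignature:
        return False
    expected = json.loads(manifest)["sha256"]
    if hashlib.sha256(payload).hexdigest() != expected:      # detect a tampered payload
        return False
    # A real implementation would stage the new component alongside the old
    # one so that a failed boot can roll back automatically.
    return True


if __name__ == "__main__":
    key = Ed25519PrivateKey.generate()
    manifest, sig, blob = publish_update(key, b"...new kernel image bytes...")
    print("update accepted:", apply_update(key.public_key(), manifest, sig, blob))
```

The design point is simply that the device never applies a payload it cannot authenticate, which is what makes centralised, automatic updates safer than leaving patching to the end user.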
In the increasingly app-focused IoT world, security is paramount and manufacturers need to deal with security threats right from the outset if they are to create devices that are just as safe for consumers twenty years from now as they are today. ### DOCOMO and MediaTek Achieve World's First Successful 5G Trial Using Smartphone-Sized NOMA Chipset-Embedded Device to Increase Spectral Efficiency New momentum for development of commercial 5G services TOKYO, Nov 3, 2017 - (JCN Newswire) - NTT DOCOMO, INC. announced today that in a joint trial conducted with MediaTek Inc. it has successfully developed a chipset to increase the spectral efficiency of mobile devices by up to 2.3 times compared to existing LTE technology. The chipset combines DOCOMO's non-orthogonal multiple access (NOMA) radio access technology and MediaTek's multi-user interference cancellation (MUIC) technology, which is required to achieve NOMA. NOMA multiplexes signals at a base-station transmitter to leverage the increased signal processing capacity of user devices and cancel interference among multiplexed user signals. MUIC removes interference from other users when a base station transmits a signal to a number of users simultaneously. During the trial, three smartphone-sized devices embedded with the chipsets, each placed in a different location, received data that was transmitted simultaneously from a base station using the same frequency, while the transmission power of the signal transmitted to each device was adjusted. Using the developed chipset, each device successfully eliminated interfering signals intended for the other devices and received only the intended data, resulting in up to 2.3 times greater spectral efficiency than that of single-user Multi Input Multi Output (MIMO). [easy-tweet tweet="DOCOMO is forging ahead with its development of 5G technologies, aiming to increase the signal processing capacities of user devices in populated urban areas and standardize 5G's radio interface and improve spectral efficiency." hashtags="5G, mobile"] DOCOMO is forging ahead with its development of 5G technologies, aiming to increase the signal processing capacities of user devices in populated urban areas and standardize 5G's radio interface and improve spectral efficiency. Going forward, DOCOMO will continue to collaborate with world-leading vendors in its research and development of commercial 5G communications devices and services scheduled to launch in 2020.   ### Snowflake Announces Instant Elasticity Snowflake’s new standard of elasticity comprises instant provisioning, on-demand performance, on-demand concurrency and per-second pricing, saving Snowflake customers up to 80%. SAN MATEO, Calif. – Nov 1, 2017 – Snowflake Computing, the only data warehouse built for the cloud, today announced the availability of Instant Elasticity. Snowflake customers can now define their service level agreement (SLA), while providing an unlimited number of users with consistent performance, predictable pricing and no overbuy. Customers save money with true cloud elasticity, enabled by the instant availability of computing resources to meet their organization’s needs, while only paying for exactly the resources they use. Instant Elasticity consists of the following features: Instant provisioning – With just a few clicks, enable any amount of computing power based on the size and nature of your workloads. 
On-demand performance – Once you apply your preferred computing power, instantly scale that resource up or down to get the performance you desire. On-demand concurrency – Using the multi-cluster warehouse, instantly provision and release additional capacity based on your number of concurrent queries. Per-second pricing – Snowflake is the only data warehouse that bills by the second, harnessing the true value of instant provisioning, on-demand performance and on-demand concurrency so customers only pay for the resources they use. Some of the significant benefits of Instant Elasticity include: Predictable pricing – Decide exactly how much horsepower to throw behind your workloads and predict, in advance, the estimated cost. Faster performance – Spin up larger amounts of computing power for shorter periods of time to obtain quicker, actionable insights. Reduced costs – Eliminate the cost of an idle warehouse by paying only for the seconds of computing your organization consumes. Faster is free! Snowflake’s unique architecture, which truly separates computing from storage, and Snowflake’s cloud-built elasticity enable customers to instantly provision and scale as many computing clusters as needed to support an unlimited number of users and processes, without impacting performance and while eliminating the cost of per-hour pricing. In addition, Snowflake customers don’t have to endure the laborious task of manually provisioning resources, which many data warehouses require. Just log in, and with a couple of clicks you’re up and running. Snowflake customers already capitalizing on Instant Elasticity have seen up to 80% cost reductions, with average savings of 24%. All Snowflake customers already benefit from instant provisioning, with warehouse availability in less than one second, 95% of the time. Snowflake makes Instant Elasticity possible because it’s the only data warehouse built for the cloud. Other, so-called cloud data warehouses are merely legacy, on-premises technologies moved to the cloud, providing minimal additional benefit. Other data platforms, such as query services, have their own shortcomings. These solutions fail to deliver in the following ways: Legacy architecture – Unlike Snowflake, computing and storage are locked together, and as a result, so is pricing, preventing per-second billing. Inelastic solution – Without instant, on-demand scaling, accessing the true benefits of per-second billing is impossible. Bulk billing – Many data warehouse cloud offerings still charge by the hour, forcing enterprises to pay for unused and idle data warehouse time. Unpredictable pricing – Query services that bill based on the amount of data scanned make it difficult to predict the cost of a query before it is executed. HotelTonight is a mobile travel app that allows users to find discounted hotel accommodations up to seven days in advance. The company uses Snowflake to predict consumer behaviour and better serve its hotel and retail customers. “Snowflake already provides us with a revolutionary cloud data warehouse architecture and technology,” HotelTonight’s Head of Data Engineering, Jonathan Geggatt said. “They’ve now taken the technology to the next level by coupling Snowflake’s unique elasticity with per-second pricing.
It’s changing the way we perform our analytics, for the better.” Millions of users across the world rely on Runkeeper, a leading mobile fitness app, to track workouts, set goals and connect with the community to stay motivated. Every time users create events, such as starting a run, sharing with a friend or responding to a push notification, they generate data that helps Runkeeper understand how to improve the customer experience. “Snowflake’s architecture continues to enable revolutionary new features such as Instant Elasticity,” Runkeeper’s Director of Platforms, Karla Goodreau said. “The cost savings are significant and time to value has also shortened, making us rethink what’s now possible when it comes to our data analytics.” Snowflake CEO, Bob Muglia said: “Snowflake continues to drive data warehouse innovation. Instant Elasticity delivers even better value to our customers, enabling organizations to cost-effectively provide all their users access to all their data for unlimited insight. Instant provisioning, on-demand performance, on-demand concurrency and per-second pricing make Snowflake the most powerful and efficient cloud data warehouse available. We’re proud to announce Instant Elasticity so enterprises can obtain true, data-driven insight from the no. 1 data warehouse – Snowflake.” ### The Risks Created by Multiple Mobile Devices According to the latest research from Gartner, migration to the cloud of corporates’ most critical applications, their finance systems, is happening more quickly than predicted in the past. The vast majority of finance executives (93%) surveyed said they saw the cloud being utilised for half of their enterprise transactions in the future. With many other applications, including HR and CRM already being hosted in the cloud, organisations without the right security protection are potentially opening valuable data up to risks such as cyber-attacks. The adoption of mobile technologies, bring your own device (BYOD) and the ‘always-on’ business environment is exacerbating this challenge. Our research with over 1,000 consumers found that nearly all (97%) UK adults always have their mobile phone on them. An additional 9% carry a dedicated work mobile as well, while many Brits are further weighed down by a tablet (31%), a laptop (29%) and wearable technology (8%). While the cost savings associated with replacing on-premise servers/PCs with cloud-hosted systems/mobile devices have huge appeal, they also present new requirements to address security risks. The most pressing of these is the blurring of lines between home and business use. Not only are employees using their personal mobile to access corporate systems, they are using work phones to access personal apps such as Facebook or WhatsApp. The danger is that corporate data is retained on mobile devices that are carelessly disposed of. Let’s take the example of the smartphone. If an employee is using their personal phone for work purposes it means a significant amount of business data could be vulnerable; from passwords and emails to larger documents or even easy access to the wider company servers. Our research found that more than one in 10 consumers (11%) were unsure that they had permanently deleted the data from their recycled or discarded devices. In a separate study, our engineers analysed more than 60 devices sourced from online shopping sites. They found residual personal data on just under half (47%) of them. 
Devices that are discarded, whether they are recycled or end up in landfill, without having had the data properly deleted first are rich pickings for the data thief. Companies work hard to enforce and regularly update security protocols to ensure the safekeeping of company data, which can include everything from financial records to personnel files. These protocols are easy to keep track of and enforce on devices that a company’s IT department knows about. For the devices it does not know about, however, company data is only as secure as the measures put in place by the employee – almost certainly less stringent than enterprise security. To put this into perspective, our study revealed that less than a third of consumers (32%) regularly back up their devices. We would always advise corporates and consumers to ensure they thoroughly delete personal data on mobile devices before disposing of or recycling them. Our research shows that even when devices are reset to factory settings or are partially destroyed by water, fire or physical damage, we are still able to recover personal data from them. Just deleting data is not enough. Specialist erasure software that overwrites existing data several times will minimise the risk of data recovery by third parties and provide peace of mind that all data has been completely removed. So, what should businesses do to ensure their data is safe? Understand who can access what data and where Have an effective Bring Your Own Device (BYOD) policy, get staff educated and on board with their responsibilities, and be prepared for when something goes wrong. Make sure all the right people are involved, from C-level awareness down to IT implementation of policy, with clarity for all employees so they know their responsibilities. Consider the wider legal ramifications The European General Data Protection Regulation (GDPR) is due to come into force by mid-2018. Start preparing now: find out what data you have, where and how it can be accessed, and get rid of data that you no longer need. Being found in breach of the new regulation carries financially crippling fines of up to 4% of a company’s annual revenue. Ensure all devices are recycled responsibly and the data is deleted Devices that are discarded or recycled without having the data irrevocably deleted make it easy for thieves to steal sensitive data. Two methods of deletion to consider are: Erasure software – the software overwrites existing data with random binary sequences. This is done several times to minimise the risk of any data being recoverable (see the short sketch below). It’s important to remember that different storage devices (HDD, SSD, flash media) may need different techniques to successfully delete the data. Degaussing – this method works on devices that store data magnetically (i.e. HDDs and tapes). It ensures rapid and thorough deletion through a powerful demagnetisation process. This method renders hard drives completely unusable and can also be used for damaged media. Talk to an expert before the worst happens It is hard to recover from a data breach or data loss incident, so it is advisable to stop them before they happen. For more detailed information on sanitising devices properly, visit our website to find out more about the different hardware and software methods available.
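As a rough illustration of the erasure-software approach mentioned above, the sketch below overwrites a single file with several passes of random data before deleting it. It is deliberately simplified and based on assumptions (the file name and the three-pass default are invented): certified erasure products work at whole-device level, verify each pass and produce an audit report, and, as noted above, SSD and flash media need different techniques because wear-levelling can leave copies of data that a file-level overwrite never touches.

```python
# Minimal sketch of multi-pass overwriting before disposal. Illustrative only:
# real erasure tools operate on whole devices, verify every pass, and account
# for SSD/flash wear-levelling, where file-level overwrites are not sufficient.
import os
import secrets


def overwrite_and_delete(path: str, passes: int = 3, chunk: int = 1024 * 1024) -> None:
    size = os.path.getsize(path)
    with open(path, "r+b") as handle:
        for _ in range(passes):                # several passes, as described above
            handle.seek(0)
            remaining = size
            while remaining > 0:
                block = secrets.token_bytes(min(chunk, remaining))  # random binary sequence
                handle.write(block)
                remaining -= len(block)
            handle.flush()
            os.fsync(handle.fileno())          # push this pass out to the storage device
    os.remove(path)                            # finally remove the directory entry


if __name__ == "__main__":
    with open("example.bin", "wb") as f:       # create a throwaway file to erase
        f.write(b"sensitive data" * 1000)
    overwrite_and_delete("example.bin")
    print("example.bin removed:", not os.path.exists("example.bin"))
```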
Nobody wants to risk their business sensitive data falling into the wrong hands by letting security fail at the very last step of the data lifecycle. ### IT education doesn’t meet employers demands say CIOs Security skills is biggest area of concern, followed by digitisation and business intelligence UK – London, Thursday 2nd November 2017 – The demands of the IT employment market are not being met by today’s education system, according to new research from recruitment specialist Robert Half Technology UK. With the UK skills gap costing the UK economy an estimated £63bn a year, the poll of CIOs reveals the most critical technological skills that UK technology leaders want education providers to focus on providing. According to the research, more than nine in ten CIOs (92%) believe that the IT education provided by colleges, universities and technical schools doesn’t meet the demands of the employment market. [easy-tweet tweet="According to the research, more than nine in ten CIOs (92%) believe that the IT education provided by colleges, universities and technical schools doesn’t meet the demands of the employment market." hashtags="education, CIOs, IT"] The poll found that the biggest area of concern among CIOs is IT security, with nearly six in ten (59%) saying that this is one of the critical areas that IT educators need to focus on. The next most urgent areas for enhancement are digitisation (23%), business intelligence (20%) and software/application development (18%). The UK requires 104,000 STEM (science, technology, engineering and mathematics) graduates each year, yet 40,000 jobs in this area are left vacant, according to figures from education campaigners, Your Life. What’s more, student engagement with STEM subjects is declining among older learners, with just 33% of boys and 19% of girls taking two or more STEM subjects at A-level. “While businesses are urgently trying to reskill their employees through internal training and development programmes, the root cause of the current skills gap lies much deeper,” said Phil Sheridan, senior managing director, Robert Half UK, South America and the Middle East. “Recent graduates, particularly in STEM subjects, are already lacking vital skills that UK organisations desperately need.” “Technology is changing so rapidly that it is little surprise that education providers are struggling to teach their students skills that will still be relevant when they reach the workplace” Sheridan continued. “They should also consider what they can do to offer work placements or ‘Year in Industry’ initiatives that will encourage prospective students and convince them that learning STEM subjects is the path towards an incredibly fulfilling and well-rewarding career”. ### Highly Scalable Cloud Ecommerce Platform Reply, through the two companies Storm Reply, Amazon Web Services Premier Consulting Partner, and Portaltech Reply, Sap hybris Partner, supported Lavazza, world leader in the distribution and marketing of coffee products and machines, in the design and implementation of the new eCommerce website in Cloud. Through Amazon Web Services, Reply moved Lavazza’s eCommerce infrastructure to Cloud, in order to ensure an increased capacity and flexibility, as well as the possibility to take control over all the releases chain and standardise all the processes. 
After having rebuilt the eCommerce structure in the cloud, Reply implemented a replatforming project on the SAP Hybris B2C platform, able to handle both eCommerce back-end and front-end management. The need for a platform that enables multi-country management was met during the project: after the launch of the Italian platform, the store was brought to the English and German markets, using the same platform but implementing new developments and integrations in order to adapt it to national differences. The platform manages the different catalogues and products, payment providers, payment methods and taxation modes in a simple, integrated way. Thanks to the elasticity of the solution developed by Reply, Lavazza was able to increase its customer base, opening the platform to new countries and adding more compute capacity, releasing new products and supporting digital marketing initiatives on a daily basis, reacting immediately to internal business demand.
### No Platform Immune from Ransomware, According to SophosLabs 2018 Malware Forecast
Ransomware ravaged Windows, but attacks on Android, Linux and MacOS systems also increased in 2017. Just two strains of ransomware were responsible for 89.5 percent of all attacks intercepted on Sophos customer computers worldwide. OXFORD, U.K. – Nov. 2, 2017 – Sophos (LSE: SOPH), a global leader in network and endpoint security, today announced its SophosLabs 2018 Malware Forecast, a report that recaps ransomware and other cybersecurity trends based on data collected from Sophos customer computers worldwide from April 1 to Oct. 3, 2017. One key finding shows that while ransomware predominantly attacked Windows systems in the last six months, Android, Linux and MacOS platforms were not immune. “Ransomware has become platform-agnostic. Ransomware mostly targets Windows computers, but this year, SophosLabs saw an increased amount of crypto-attacks on different devices and operating systems used by our customers worldwide,” said Dorka Palotay, SophosLabs security researcher and contributor to the ransomware analysis in the SophosLabs 2018 Malware Forecast. The report also tracks ransomware growth patterns, indicating that WannaCry, unleashed in May 2017, was the number one ransomware intercepted from customer computers, dethroning longtime ransomware leader Cerber, which first appeared in early 2016. WannaCry accounted for 45.3 percent of all ransomware tracked through SophosLabs, with Cerber accounting for 44.2 percent. “For the first time we saw ransomware with worm-like characteristics, which contributed to the rapid expansion of WannaCry. This ransomware took advantage of a known Windows vulnerability to infect and spread to computers, making it hard to control,” said Palotay. “Even though our customers are protected against it and WannaCry has tapered off, we still see the threat because of its inherent nature to keep scanning and attacking computers. We’re expecting cyber criminals to build upon this ability to replicate, seen in WannaCry and NotPetya, and this is already evident with Bad Rabbit ransomware, which shows many similarities to NotPetya.” The SophosLabs 2018 Malware Forecast reports on the acute rise and fall of NotPetya, ransomware that wreaked havoc in June 2017. NotPetya was initially distributed through a Ukrainian accounting software package, limiting its geographic impact.
It was able to spread via the EternalBlue exploit, just like WannaCry, but because WannaCry had already infected most exposed machines, there were few left unpatched and vulnerable. The motive behind NotPetya is still unclear because there were many missteps, cracks and faults with this attack. For instance, the email account that victims needed to use to contact the attackers didn’t work, and victims could not decrypt and recover their data, according to Palotay. “NotPetya spiked fast and furiously, and did hurt businesses because it permanently destroyed data on the computers it hit. Luckily, NotPetya stopped almost as fast as it started,” said Palotay. “We suspect the cyber criminals were experimenting, or their goal was not ransomware but something more destructive, like a data wiper. Regardless of intention, Sophos strongly advises against paying for ransomware and recommends best practices instead, including backing up data and keeping patches up to date.” Cerber, sold as a ransomware kit on the Dark Web, remains a dangerous threat. The creators of Cerber continuously update the code and they charge a percentage of the ransom that the “middle-men” attackers receive from victims. Regular new features make Cerber not only an effective attack tool, but perennially available to cyber criminals. “This Dark Web business model is unfortunately working and, much like a legitimate company, is likely funding the ongoing development of Cerber. We can assume the profits are motivating the authors to maintain the code,” said Palotay. Android ransomware is also attracting cyber criminals. According to SophosLabs analysis, the number of attacks on Sophos customers using Android devices increased almost every month in 2017. “In September alone, 30.4 percent of malicious Android malware processed by SophosLabs was ransomware. We’re expecting this to jump to approximately 45 percent in October,” said Rowland Yu, a SophosLabs security researcher and contributor to the SophosLabs 2018 Malware Forecast. “One reason we believe ransomware on Android is taking off is because it’s an easy way for cyber criminals to make money, instead of stealing contacts and SMS, popping up ads or bank phishing, which requires sophisticated hacking techniques. It’s important to note that Android ransomware is mainly discovered in non-Google Play markets – another reason for users to be very cautious about where and what kinds of apps they download.” The SophosLabs report further indicates that two types of Android attack methods emerged: locking the phone without encrypting data, and locking the phone while encrypting the data. Most ransomware on Android doesn’t encrypt user data, but the sheer act of locking a screen in exchange for money is enough to cause people grief, especially considering how many times in a single day information is accessed on a personal device. “Sophos recommends backing up phones on a regular schedule, similar to a computer, to preserve data and avoid paying ransom just to regain access. We expect ransomware for Android to continue to increase and dominate as the leading type of malware on this mobile platform in the coming year,” said Yu. The full SophosLabs 2018 Malware Forecast and Ransomware Infographic are available from Sophos.
### Swyx launches new technology partner program In order to work even more closely with third-party manufacturers in the future and to continuously expand the range of solutions for users, Swyx has set up a new program for technology partners. At the Swyx Partner and Technology Conference on September 27 2017, the UC specialist introduced the Technology Alliance Program (TAP) for the first time. The Technology Alliance Program supports third party hardware and software vendors in the development and marketing of complementary products for Swyx’s communication solution. The new program will enable technology partners to offer innovative Swyx compatible solutions based on open interfaces, standards and protocols, thus opening up new customer groups. TAP replaces Swyx’s previous purely technical certification program and integrates technology partners much more strongly than before. Thanks to closer cooperation, clearly defined certification processes and joint marketing activities, both sides benefit even more from working together. Technology partners receive full support from the TAP team and test licences for Swyx products. After successful certification, they can use Swyx's marketing channels and present their products on the UC provider's website. Extensive tests are also an essential part of the program to ensure the complete compatibility and functionality of the partner product and Swyx’s solution. Interoperability can be tested in a jointly agreed test procedure either by Swyx or by the TAP partner itself. After successful completion of the tests, the partner's product receives a certificate confirming its compatibility with Swyx - so users can rely on the fact that they can combine the two products without any functional restrictions. “TAP is the logical further development of our previous technology partner program and enables us to continuously expand our own offering as well as to market third-party solutions that optimally complement our portfolio. We have thus set the course for closer cooperation with technology partners from different areas in the future and in turn expand the Swyx ecosystem. In this way, we open up even more opportunities for users to take communication in their company to a new level”, explains Martin Classen, CTO of Swyx Solutions AG. For more information about the Technology Alliance Program (TAP) please visit the Swyx homepage or e-mail tap@swyx.com ### IBM announces new offerings to Watson Data Platform ARMONK, N.Y. – 2 Nov 2017 – IBM (NYSE: IBM) today announced new offerings to its Watson Data Platform, including data cataloguing and data refining, which make it easier for developers and data scientists to analyze and prepare enterprise data for AI applications, regardless of its structure or where it resides. By improving data visibility and security, users can now easily connect and share data across public and private cloud environments. By 2018, nearly 75 percent of developers will build AI functionality into their apps, according to IDC. However, they also face the obstacle of making sense of increasingly complex data that lives in different places, and that must be securely and continuously ingested to power these apps. 
Addressing these challenges, IBM has expanded the functionality of its Watson Data Platform, an integrated set of tools, services and data on the IBM Cloud that enables data scientists, developers and business teams to gain intelligence from the data most important to their roles, as well as easily access services like machine learning, AI and analytics. "We are always looking for new ways to gain a more holistic view of our clients’ campaign data, and design tailored approaches for each ad and marketing tactic,” said Michael Kaushansky, Chief Data Officer at Havas, a global advertising and marketing consultancy. “The Watson Data Platform is helping us do just that by quickly connecting offline and online marketing data. For example, we recently kicked off a test for one of our automotive clients, aiming to connect customer data, advertising information in existing systems, and online engagement metrics to better target the right audiences at the right time.” Specifically, this expansion includes: New Data Catalog and Data Refinery offerings, which bring together datasets that live in different formats on the cloud, in existing systems and in third-party sources; as well as apply machine learning to process and cleanse this data so it can be ingested for AI applications; The ability to use metadata, pulled from Data Catalog and Data Refinery, to tag and help enforce data governance policies. This enables teams to more easily identify risks when sharing data, and ensure that sensitive data stays secure; The general availability of Analytics Engine to separate the storage of data from the information it holds, allowing it to be analyzed and fed into apps at much greater speeds. As a result, developers and data scientists can more easily share and build with large datasets. “The key to AI starts with a strong data foundation, which turns the volume and velocity of incoming data from a challenge into an asset,” said Derek Schoettle, General Manager, IBM Watson Data Platform. “For companies to innovate and compete with AI, they need a way to grasp and organize data coming in from every source, and to use this complete index of data as the backbone of every decision and initiative.” To further help companies grasp control of all of their data no matter where it resides, IBM is also announcing a series of new features to its Unified Governance Platform. These bring greater visibility and management of clients’ global data, including new capabilities that help to better prepare for impending data protection regulations such as GDPR. Built on open source technologies and fueled by IBM Cloud, the Watson Data Platform brings together IBM’s cloud infrastructure, powerful data services and decades of experience helping clients across industries solve their data challenges. Linked closely with the communities most important to data scientists and developers, including Python and Spark, the Watson Data Platform continues to evolve to build the most open and complete data operating system on the cloud. For more about IBM Cloud, visit: https://www.ibm.com/cloud-computing/. ### Cloud Expo Asia interview with Kenneth Kjellberg from CEJN AB Disruptive interviews CEJN AB, Manager Engineering, Kenneth Kjellberg at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! 
Visit CEJN AB: https://www.cejn.com/ ### Cloud Expo Asia interview with Lim May-Ann from Asia Cloud Computing Association Disruptive interviews Asia Cloud Computing Association's Executive Director, Lim May-Ann at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Asia Cloud Computing Association at: http://www.asiacloudcomputing.org/ ### Julian Rice, Director Marketing and Sales Enablement - International at DST Comments Insurance companies today face significant challenges, increased capital requirements and continuing low interest rates rank high amongst them, however both are low in precedence when compared to the need to drive digital transformation within the industry. According to research by Gartner*, close to two thirds of the world’s biggest insurance companies have already invested in insurance technology start-ups. By 2018, eight out of ten property, casualty and life insurers will have joined them.  However, although insurers are fully aware the future is digital, it seems that when it comes to the digital transformation of their own business, things are more difficult. Although insurers are investing in other technology businesses, a digital strategy itself, is often not included within the insurer’s own business strategy. It is, in fact, perhaps the key challenge facing the insurance industry today. Below are six key steps that are essential in helping insurance firms achieve digital transformation: 1. It starts with people… Insurers need to understand the needs and priorities of the people at the heart of the digital initiative to develop their approach. According to Jonathan Swift (Wired’s Editor-in-Chief)**, to succeed in the evolving insurance market, you need to think like a start-up, whilst taking the time at the outset to analyse, and ask what your existing customers, stakeholders, employees and executives want and need from digital transformation. [easy-tweet tweet="The gap in technological expertise is not unique to the insurance industry" hashtags="Technology, Digital"] They also need to bring in the digital expertise, not just at the front end, implementing digital initiatives, but at the top, setting the digital vision and strategy, whilst identifying the risks, and opportunities from emerging technology and changing consumer behaviours. The gap in technological expertise is not unique to the insurance industry. A review***from last year of 1,925 executive and non-executive directors at more than 100 of the largest banks, showed only 6% of board members and 3% of CEOs had professional backgrounds in technology. More than four in ten (43%) had no technology experience on their board at all. This knowledge gap is a common challenge across financial services but one that must be addressed. 2. It relies on data… Insurers already have much of the data they need, particularly in terms of customer behaviours and attitudes. Aviva’s Chief Digital Officer believes they already have incredible insight, the challenge now is capturing this from silos across the business and attempting to turn it into actionable intelligence. Insurers need to use external digital analysis, customer experience and empathy mapping and digital value chain analysis to inform a process of segmentation and targeting, and to define their value proposition and goals. 3. You need a roadmap… Digitisation is changing markets quickly. 
With the threat of disruption from new and changing customer behaviours and expectations, there is an urgency to implement digital initiatives that will prevent insurers falling behind and losing market share. This cannot, however, come at the expense of a long-term plan. Insurers need a lasting plan as well as roadmaps for strategic initiatives and business imperatives. 4. This requires a digital framework… A framework allows the company to address digital goals and objectives. Not only will it show how initiatives to meet short-term business requirements will be achieved, but also how these will change over time and how they link with your strategies. The framework will address how projects today will fit in with the vision of the future, and the methods – on-premises hosting, using the cloud, software as a service – that make most sense to get there. 5. Define the digital scope… There is no escape from considering in detail the company’s approach across digital. It is not just about what the business does, but how it presents it. Insurers must outline the purpose, objectives, key initiatives and challenges in all the key areas: website, branding, online content, digital advertising, contact management, social media and mobile. They must also determine what needs to be serviced in-house and what can be outsourced. 6. Get it right first time… Execution and governance are essential to success. Organisations often bite off more than they can chew, starting with a grand transformation that takes too long to provide meaningful returns. The result can mean compromises that undermine the potential value of the project. Should the end result be a disappointment to consumers or users, they’ll be reluctant to adopt later iterations. Big visions and objectives therefore need to be segmented into small, deliverable projects that provide valuable functionality. That’s not to say the digital challenge does not require big changes. It does. But on the way there, each step counts and it is one of the key areas insurers need to start addressing. They should do so not only because digital transformation is probably the biggest issue that they face, but also because it’s central to addressing so many of the other challenges on the boardroom agenda. ### Cloud Expo Asia interview with Gabor Vincze from Gravity Disruptive interviews Gravity's Head of Global Service, Gabor Vincze at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Gravity at: http://www.yusp.com/ ### One in Four Organisations Rely Solely on Passwords to Secure BYOD Access Systems left exposed to the threat of compromised credentials, potentially resulting in unauthorised access to company data London, UK – 2nd November, 2017 – Bitglass, the Total Data Protection company, today announced the findings of its BYOD and Identity research report, which includes insights from more than 200 IT and security professionals surveyed at the Gartner Symposium/ITxpo conference. According to Bitglass’ research, one in four organisations do not have multi-factor authentication (MFA) methods in place to secure BYOD – a well-known enterprise security gap. Simply using passwords – i.e. single-factor authentication – to control user access to corporate data has resulted in several high-profile data breaches in recent months, including Zomato, Deloitte and Microsoft.
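To make the single-factor versus multi-factor distinction concrete, the sketch below shows a second factor being checked alongside a password. It is an illustrative example only, not anything from Bitglass or a specific MFA product; it implements the standard time-based one-time password (TOTP) algorithm from RFC 6238 using only the Python standard library, and the secret and function names are hypothetical.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Derive the current time-based one-time password (RFC 6238) from a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify_login(password_ok: bool, submitted_code: str, secret_b32: str) -> bool:
    """Both factors must pass: something the user knows (password) and something they have (device code)."""
    return password_ok and hmac.compare_digest(submitted_code, totp(secret_b32))
```

A real deployment would also tolerate small amounts of clock drift by accepting the previous and next intervals, and would rate-limit attempts, but even this small check means a stolen password alone is no longer enough to reach corporate data from a personal device.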
“Enterprises often misjudge the effectiveness of traditional security solutions, many of which are readily bypassed,” said Rich Campagna, CEO, Bitglass. “The BYOD boom exposes organisations to risks that can only be mitigated with data-centric solutions that secure access.” The report also highlights the top cloud security priorities for organisations. BYOD security and access are clear concerns, as external sharing (45 percent), malware protection (40 percent) and unmanaged BYO device access (40 percent) came out on top. In order to meet these needs, organisations will need to adopt new security solutions. While three-quarters of respondents already have encryption and on-premise firewalls in place to protect corporate data, more are starting to deploy Secure Web Gateways and cloud access security brokers. Bitglass’ survey also found that some organisations still have concerns with the newest user authentication methods. The report found that 61 percent of IT and security professionals surveyed have reservations about Apple’s Face ID technology as a viable method of BYOD authentication. While traditional authentication methods like passcodes, PIN codes and fingerprint recognition are familiar and trusted by enterprises, facial recognition technologies remain unproven. Highlights of the survey include:
- 28 percent of respondents have no multi-factor authentication methods in place for BYOD access
- For those using MFA for BYOD, third-party applications (42 percent) and SMS tokens (34 percent) are the most popular methods used
- External sharing is rated the leading cloud security concern for professionals surveyed (45 percent)
- Also listed as top security concerns are malware protection (40 percent) and unmanaged device access (40 percent)
- 61 percent of respondents have reservations about Apple’s Face ID technology
- Top Apple Face ID concerns include accuracy of face detection (40 percent), prevention of unauthorised access (30 percent) and speed of face detection (24 percent)
### Harnessing the Power of Cloud Securely Cloud isn’t in the future; it’s today’s reality. Organisations are harnessing its power to introduce flexible ways of working. But the issue isn’t whether organisations use public, private or even hybrid cloud platforms. It’s not even what data they choose to store in the cloud, or how they access it. It’s whether they’re doing it securely. And there’s the problem – cloud is part of the new elastic attack surface. Whereas organisations once only worried about securing servers and laptops, today’s organisations struggle to manage a complex computing environment which includes mobile, cloud and IoT, to name just a few. Most organisations cannot currently monitor, manage and understand the nature of their Cyber Exposure consistently or with confidence. This creates a Cyber Exposure gap, and the larger the gap, the greater the risk of a business-impacting cyber event occurring. How can organisations harness the power of the cloud securely? A New Frontier with an Old Approach The traditional approach of building a secure perimeter to ring-fence infrastructure and data has been consigned to the history books – not least because cloud allows new services to be spun up in seconds.
Cloud computing allows organisations to expand and adjust their IT environments with incredible flexibility, but it has also introduced new challenges to identifying and reducing cyber risk. The reality is that the tools and approaches organisations use to understand Cyber Exposure didn’t work in the world of client/server, on-premise data centres, let alone today’s elastic environment. As validation, Tenable’s 2017 Global Cybersecurity Assurance Report Card, which surveyed 700 security practitioners around the world, found that participants rated their ability to assess risk in ‘cloud environments’ [the combination of software as a service (SaaS), infrastructure as a service (IaaS) and platform as a service (PaaS)] at just 60 percent. This dearth of confidence mirrors an alarming and widespread lack of visibility into not just cloud instances, but also most other areas of the modern computing environment. It’s worth clarifying that the perception of cloud as being any more vulnerable than on-premises solutions is a myth. It doesn’t matter where the infrastructure, applications, or data reside - if they’re connected then they’re vulnerable. It is important that organisations accept this and address the issue. In order to do this, a new security approach is required that encompasses both a new way of thinking, and a toolset capable of adapting to these elastic working environments. [easy-tweet tweet="Cyber Exposure is an emerging discipline for managing, measuring and reducing the modern attack surface. " hashtags="Cyber, Cloud"] A New Frontier Requires a New Approach Cyber Exposure is an emerging discipline for managing, measuring and reducing the modern attack surface. It should be approached as a live, dynamic process that expands and contracts along with the elastic attack surface. After all, containers and cloud workloads may have a lifespan of minutes to hours which makes them extremely hard to see and protect. There are three fundamental questions organisations need to be able to answer if they’re to stand a chance of understanding and reducing their cyber risk: are they secure; how are they exposed; and most importantly, how do they proactively reduce their exposure. To do this, organisations should practice four related disciplines: Discover: It’s impossible to protect what you don’t know exists so the first stage is to inventory the computing environment in real time.  Having mapped these assets whatever they may be - from desktops, laptops, servers, applications, containers etc., and wherever they may reside – be it in the Cloud, physically networked, etc. the organisation can establish a baseline of the current and desired operational state. Assess: Having established what makes up the organisation’s infrastructure, the next phase is to accurately determine any areas that are exposed. This is basic cyber hygiene and should check for any vulnerabilities, misconfigurations, out of date software, products that are no longer supported or no longer accessed or used.  It should also include users that are either no longer active or privileged accounts that potentially pose a risk. Analyse: Having mapped the network and identified the perceived risks, the next element is to put these risks into context. Is the asset critical to the day-to-day operations of the business, or does it hold vital information? Where does it live? Does it move? Who or what has access to that asset? If it’s vulnerable, is it being actively exploited? 
The answers to these questions will help organisations properly prioritise their risks to determine what needs to be remediated first. Fix: The final element is fixing what needs fixing. This may mean implementing temporary security controls while waiting for a patch, updating systems or upgrading hardware. Cyber Exposure Lifecycle This isn’t a one time action, but rather an operational security lifecycle. The boundaries of the organisation’s perimeter and accountability are expanding and contracting hour by hour, minute by minute and in some cases second by second. Organisations need capabilities for inventorying not just on-premises infrastructure, but also in and across the cloud in real-time. Organisations need to embrace this new way of thinking - to understand their Cyber Exposure in a way that adapts to this new world of modern assets and elastic working practices. Cloud isn’t the future, it’s part of today’s reality. And organisations must make sure they’re harnessing its power securely. ### Cloud Expo Asia interview with Shahram Mehraban from Lantronix Disruptive interviews Lantronix's VP of Marketing, Shahram Mehraban at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Lantronix at: https://www.lantronix.com/ ### NVM Private Equity Provides Development Capital Investment for Big Data Analytics Business Hello Soda NVM Private Equity (NVM) has invested £4 million in Manchester based Hello Soda, a global text and big data analytics business. Having achieved exceptional sales growth in the last year, the investment will enable the company to continue its global growth plans and raise the profile of its unique and high-tech offerings that boost revenues for businesses in the ID verification, risk and personalisation verticals. The transaction is being jointly funded by NVM Private Equity alongside an additional £1.5 million venture debt facility from Clydesdale and Yorkshire Bank’s Growth Finance team. The company, which also has offices in both the US and Thailand, specialises in providing advanced text analytics software solutions to customers across six continents meaning it is able to provide a global, 24/7 service to its wide portfolio of customers.  Their multilingual software solutions work to increase access for consumers in the digital age through deriving unique, real-time data insight to make more informed decisions on verifying identities, reducing fraud, and personalising the user experience. Led by highly-experienced CEO James Blake, the business has grown rapidly since its launch in 2013 and currently has customers operating in countries as widely spread as the UK, USA, Mexico, Australia, Spain and South Africa.  Furthermore, James, who brings 20 years’ experience to the table in technology sales is focused on Hello Soda expanding and exceeding expectations. Founder and CEO of Hello Soda, James Blake, said: “NVM are a great cultural fit for us and it’s fantastic to work with a team that truly share our values and vision of a more inclusive digital world. Working with NVM will enable us to really ramp up our global growth to better support our existing and prospective clients and we’re very excited to see the impact that our innovations will have." Liam May, Investment Manager of NVM Private Equity said: “Hello Soda already has a strong market position and great growth potential across a number of sectors and geographies. 
With the capital to pursue its ambitions now in place, combined with leading, innovative technology and a young, entrepreneurial management team, this is an extremely exciting time for the business.” Nick Edgar, Growth Finance Senior Director, at Clydesdale and Yorkshire Banks, said: “We are committed to supporting the economic growth of businesses across the North of England and working with customers to take advantage of the investment opportunities that the Northern Powerhouse is offering. A unique FinTech proposition providing customers with a wide range of data insight, Hello Soda is in an excellent position to continue its global growth in this emerging digital space.” ### Cloud Expo Asia interview with Indra Pramana from Simpler Cloud Pte Ltd Disruptive interviews Simpler Cloud Pte Ltd CEO, Indra Pramana at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Simpler Cloud Pte Ltd at: http://www.simplercloud.com/ ### Cloud Expo Asia interview with Chester Wisniewski from Sophos Disruptive interviews Sophos' Principal Research, Chester Wisniewski at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Sophos at: https://www.sophos.com/en-us.aspx ### Why SMEs Need to Become Confident Adopters in the Cloud Small businesses are increasingly showing an appetite for the cloud, but we are far from reaching the tipping point in adoption. We are all familiar with the concept of the cloud, but we are led to believe that businesses are well on their way in using cloud technologies. This is far from the truth. Our latest research shows just 33% of organisations are experienced in the cloud and 37% have only recently launched cloud computing projects for the first time. Every business owner will have its own pain points, but most would agree it’s difficult to get a handle on production, sales and finance while keeping pace with growth. The cloud can help tackle these common challenges by giving them a complete, real-time view of the business. However, there are still SMEs that find it extremely difficult to see how it works and recognise the tangible benefits available to them. So there is still a job to be done in proving the cloud’s worth and that responsibility should fall on the cloud providers. [easy-tweet tweet="Most organisations are worried about security (82%) and data protection (68%) in the cloud" hashtags="Cloud, Data"] Crucially, cloud providers must do more to boost confidence and knowledge among SMEs reluctant in moving to the cloud. They need to address the same concerns around security and data protection reported over and over again. After all, according to our same research, most organisations are worried about security (82%) and data protection (68%) in the cloud. Reality check It’s time that all of us as cloud services providers take a reality check and help SMEs become confident adopters. Let’s not leave any behind. We need to proactively give clear guidance on security responsibilities and support organisations in being better protected, ensuring devices and applications are properly patched and secured. Those writing the software are clearly best placed to provide this. With the General Data Protection Regulation (GDPR) coming into force next year, we also have a duty of care to provide clarity on how data is being stored and secured in the cloud. 
But how can SMEs choose the right cloud provider for them? We have found that organisations want financially stable providers and prefer those that store data locally and offer local support. This will become even more pertinent as Britain leaves the European Union. They will trust the providers that can offer certainty in an uncertain market and choose to work with those with a vested interest in the UK and the cloud. At Advanced, we see the cloud’s potential to impact every small business and, for this reason, have launched a cloud-first strategy to deliver a Software-as-a-Service (SaaS) solution that is accessible and adaptable for all. Our solution, Advanced Business Cloud Essentials, covers the entire business process from accounts and payroll through to operations, stock, customers and the supply chain. James Gourmet Coffee, J S Bailey, Aspire Furniture and Aspire Manufacturing are examples of organisations that have invested in this cloud solution and trusted Advanced as a cloud provider to help them become confident adopters. In fact, the founder of Aspire Furniture, a fast-growing online mail-order furniture business, says it has been an incredible tool for supporting growth over the last three years, and that he would never turn back from the cloud. The sky is the limit The bottom line is that, while cloud adoption is accelerating and a breadth of businesses can use the cloud to gain a competitive advantage, there is still some way to go in convincing SMEs. A cloud-driven small business needn’t be a distant pipedream, but we all need to support the remaining 30% of businesses that, for whatever reason, are holding off from using the cloud. Only then will we reach the tipping point in cloud adoption. ### Huawei and Capita announce global partnership [London, 31 October 2017] Global ICT solutions provider Huawei announces that it has established a global partnership with Capita Secure Information Solutions to integrate Capita’s next-generation suite of public safety solutions into Huawei’s Smart Cities offering. The partnership will involve the design, development and provision of innovative ICT solutions to address the challenges of policing and meet the needs of citizens. As cities increasingly connect their infrastructure to the internet, new opportunities open up for governments and local authorities. Capita’s end-to-end technology solutions will help Huawei service the complex requirements of police forces and other emergency services across the approximately 170 countries in which it operates. Huawei will work with Capita to develop joint solutions using Capita’s Integrated Operational Policing platform, which supports everything from the initial contact from citizens through the resulting processes of criminal investigation, case preparation, custody management and intelligence, to evidence management. Capita’s integrated platform provides a modular approach that will help governments and law enforcement agencies around the world transform their operations. The platform manages policing operations from multi-channel public contact through to crime and intelligence, with secure management of digital evidence material, all centred around a single, integrated data store.
Capita’s portfolio will be integrated with a range of components from Huawei for storage, servers and networks, as well as innovative technology for video management and analysis. Capita’s solutions will also be configured to operate with the Huawei eLTE critical communications and cloud infrastructure. Huawei and Capita’s integrated offer has been demonstrated at a number of locations across the world. Capita will work closely with the blue light services that adopt Capita and Huawei’s joint suite of solutions to install, test and integrate products with existing platforms, as well as provide ongoing support to meet evolving operational requirements. Alan Hartwell, Chief Technology Officer, Capita Secure Information Solutions, said: “The new partnership with Huawei will increase the global reach of our solutions, helping police forces and other emergency services across the world to deliver more effective and efficient services to the people they serve. Like us, Huawei recognises the importance of investment and continued development of products to ensure that clients have the most up-to-date solutions available. We look forward to working closely with Huawei to deliver these services to existing and new clients as they come on board.” River Wen, sales director of Smart City Solutions, Huawei Enterprise WEU, said: “We are excited to be incorporating Capita’s state-of-the-art products into our global Smart Cities offering and benefiting from their experience in delivering integrated public safety solutions. We know that our clients want to deliver modern, forward-thinking emergency services in the countries in which they operate, and Capita’s end-to-end solutions will help us do this.” ### Cloud Expo Asia interview with Dimuth Samaranayaka from Ascension IT Disruptive interviews Ascension IT's CEO, Dimuth Samaranayaka at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Ascension IT at: http://www.ascensionit.com.au/ ### Cloud Expo Asia interview with Dr. Thomas Wellinger from Reichle & De-Massari Disruptive interviews Reichle & De-Massari's Market Manager Data-Centre, Dr. Thomas Wellinger at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Reichle & De-Massari at: https://www.rdm.com/ ### Navisite Bundles Proofpoint with Office 365 to Stop 99.9 Percent of Targeted Attacks Proofpoint cloud-based email security and continuity now offered as part of Navisite’s Managed Office 365 services London, UK – October 31 – Navisite, a part of Spectrum Enterprise, a part of Charter Communications (NASDAQ: CHTR), today announced it is partnering with Proofpoint, Inc., a leading cybersecurity company, to offer Proofpoint’s cloud-based enterprise-class email security and continuity as part of Navisite’s Managed Office 365 services. By bundling Proofpoint Essentials advanced threat detection technology, Navisite is the only Managed Office 365 provider that includes a 99.999 percent SLA for Office 365. Navisite helps organisations with email security, including protection against credential phishing and ransomware attacks, while embedding business continuity, which results in a true enterprise-grade Office 365 experience.
Through this collaboration, Navisite is combining its industry-leading Microsoft Office 365 management and operational expertise with Proofpoint’s highly-scalable, cloud-based email security platform, to provide clients with a robust cloud service that helps limit their risk of attacks and advanced threats to email communications and sensitive data. Clients, such as the National Theatre, are already taking advantage of the new collaboration. “Proofpoint Email Protection managed through Navisite will provide the National Theatre with the assurance that our messaging environment is safe, secure and protected,” said Jon Cheyne, IT Director, National Theatre. “The need for this today is greater than ever before, as the risks to our data and systems via messaging platforms increase daily.” [easy-tweet tweet="Each day, Proofpoint analyses more than one billion email messages" hashtags="Technology, Security"] Each day, Proofpoint analyses more than one billion email messages and hundreds of thousands of social media posts to protect people, data and brands from advanced threats. Navisite clients can also enhance their Managed Office 365 plan to include Proofpoint’s Enterprise level threat protection and continuity services. Key Proofpoint features include: ·       Advanced Inbound/Outbound Email Protection: Stops targeted email attacks containing malicious links and attachments, while preventing spam, viruses, malware, phishing attacks and more. ·       Email Data Loss Prevention: Controls sensitive data from leaving the organisation by enforcing data security policies and preventing inadvertent data loss with Navisite’s managed Proofpoint e-Discovery and email archiving solutions. ·       Assured Business Continuity: Ensures vital email communications are available at all times with automatic failover and recovery. Users stay connected to email via Outlook integration, a web portal, or mobile support should Microsoft Office 365 go offline. “Through our collaboration with Proofpoint, we are offering our clients advanced threat protection and security management capabilities, enabling them to better control their IT environment with greater resiliency against malware, ransomware and phishing threats,” said Sumeet Sabharwal, general manager and group vice president, Navisite. “As cloud becomes more mainstream and managed services like Office 365 are commonplace, data security remains a top concern for our clients. That is why we are partnering with Proofpoint to deliver solutions that monitor, detect and protect a client’s most valuable data against the most sophisticated attackers.” “Organisations worldwide are moving to Microsoft Office 365 with Navisite, and our collaboration provides vital email safeguards against advanced threats and targeted attacks,” said Bhagwat Swaroop, executive vice president and general manager of Email Security for Proofpoint. “We stop 99.9 percent of all targeted attacks before they reach Proofpoint users with our advanced cloud-based threat detection. That success rate, coupled with threat insights to identify attacks and the assurance of email service availability, puts control back into IT hands.” Delivered as a managed service, Navisite’s Managed Office 365 solutions provide businesses with greater choice, flexibility and scale to accommodate the diverse needs of their users, while addressing pressing security and email compliance considerations. 
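As a rough illustration of the kind of inbound policy checks an email security layer applies before a message reaches an Office 365 mailbox, the following sketch flags risky attachment types and links to known bad domains. It is not Proofpoint's engine or API; the extension list and the bad-domain set are hypothetical stand-ins for the threat intelligence a commercial service would supply.

```python
import email
import re
from email import policy
from urllib.parse import urlparse

RISKY_EXTENSIONS = {".exe", ".js", ".vbs", ".scr", ".docm"}   # illustrative list only
KNOWN_BAD_DOMAINS = {"login-update.example"}                  # stand-in for a threat intelligence feed

def assess_message(raw_bytes: bytes) -> list:
    """Return the reasons to quarantine a message; an empty list means it can be delivered."""
    msg = email.message_from_bytes(raw_bytes, policy=policy.default)
    findings = []
    # Check attachments against the blocked extension list.
    for part in msg.iter_attachments():
        name = (part.get_filename() or "").lower()
        if any(name.endswith(ext) for ext in RISKY_EXTENSIONS):
            findings.append("risky attachment: " + name)
    # Check any URLs in the body against the known bad domains.
    body = msg.get_body(preferencelist=("plain", "html"))
    text = body.get_content() if body else ""
    for url in re.findall(r"https?://[^\s\"'<>]+", text):
        if urlparse(url).hostname in KNOWN_BAD_DOMAINS:
            findings.append("link to known bad domain: " + url)
    return findings
```

A production gateway does far more (sandboxing attachments, rewriting URLs, scoring sender reputation), but the pattern of inspecting, scoring and quarantining before delivery is the same.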
More information about Navisite’s Managed Office 365 Productivity Suite is available at: http://www.navisite.com/services/application-services/office365. ### Cloud Expo Asia interview with Jitender Khurana from Philips Lighting Disruptive interviews Philips Lighting Regional Head, Jitender Khurana at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Philips Lighting at: http://www.lighting.philips.co.uk/home ### Why it's Time we Started Working with Robots “There will be fewer and fewer jobs that a robot cannot do better [than a human]”, Elon Musk famously announced at the World Government Summit in February. For many, statements like this result in a chorus of concern about how we humans need to stay on our toes, lest intelligent machines displace our jobs and cause mass unemployment. However, automation brings many opportunities, and what we need to focus on is how it can be used to improve, rather than replace, existing roles. As we all know, there are aspects of jobs that are tedious and repetitive, and this is where automation can add value. Take chatbots that answer basic customer queries, speech recognition tools that transcribe text, advanced analytics technologies that analyse unstructured data, and natural language processing tools that schedule appointments. Rather than replacing skilled workers, these capabilities are augmenting their roles by undertaking the ‘robotic’ aspects of the workplace, allowing humans to be more efficient in other aspects of their roles. Beyond performing simple administrative tasks, even more advantageous is artificial intelligence (AI)-driven automation that mimics the complex process of human learning and decision-making. This is known as intelligent automation (IA) or cognitive automation. From research Avanade commissioned, we found that over half of business leaders worldwide are confident that IA will augment rather than replace existing roles. In addition, 86 per cent believe their business should implement IA solutions in the next five years to be a market leader, according to the same research. However, to deploy IA effectively, an understanding of its correct role is fundamental. IA certainly can increase efficiency in the workplace, but greater productivity will only follow if the technology is aligned with the right objectives. For this reason, IA needs to be firmly goal-oriented, and implemented within a well-thought-out digital transformation strategy – not as a disparate point solution that saves a few minutes for a handful of employees across the business each day. It is also important to understand that to implement new technologies and ways of working effectively, heads of department may require new skills and capabilities to successfully bring the collective workforce on board. Eliminating the IA fear factor There is a substantial “fear factor” when it comes to IA, likely stirred by the media-driven panic around the subject. These technologies enhance our day-to-day activities, and free us up to focus on more interesting, higher-level tasks – rather than, say, bearing the frustration of waiting in a call queue for an answer to a simple question. Relating these capabilities to a typical modern working environment such as a call centre further sheds light on the benefits of IA in the workplace.
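To give a feel for the 'robotic' end of that spectrum, here is a deliberately small sketch of a rule-based bot that handles routine queries and hands everything else to a person. The topics and answers are entirely hypothetical; a real deployment would sit behind an NLP service rather than simple keyword matching, but the routing principle is the same.

```python
# Hypothetical canned answers; a real system would pull these from a knowledge base.
FAQ_ANSWERS = {
    "opening hours": "Our stores are open 9am to 6pm, Monday to Saturday.",
    "delivery": "Standard delivery takes three to five working days.",
    "returns": "You can return any item within 30 days with a receipt.",
}

def answer(query: str) -> str:
    """Return a canned answer when a known topic is mentioned, otherwise escalate to a human agent."""
    lowered = query.lower()
    for topic, reply in FAQ_ANSWERS.items():
        if topic in lowered:
            return reply
    return "Let me pass you to a colleague who can help with that."

print(answer("What are your opening hours on Saturday?"))
```

Routine questions get an instant response, while anything unusual still reaches a person, which is exactly the augmentation rather than replacement the article describes.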
Answering the similar questions about store opening times is hardly going to give an employee job satisfaction. Applying their knowledge to help solve more complex customer queries may be more rewarding though. To succeed with IA, we need to recognise both the strengths and shortcomings of humans and IA technology, as the right balance of human and IA capabilities will help optimise each other’s performance. Then, implementing IA within the right framework will result in a more engaged and more productive workforce. People may fear the rise of automation, but they also dislike performing robotic tasks. IA gives us significant opportunity to improve the way we work, eliminate meaningless tasks and give us greater job satisfaction. It’s time we embraced robots to start working with us, not in place of us. ### Executive Education Episode 3: Transformational Leadership This week sees the panel host Kylie Hazeldine, Business Development from Regent Leadership discuss transformational leadership with her guests John Legg, Chief Operating Officer, Gecko Lab and Colin Pinks, Director, Regent Leadership. #ExecEdTV #DisruptiveLIVE ### Cloud Expo Asia interview with Jaheer Abbas from Limelight Networks Disruptive interviews Limelight Networks' Regional Director, Jaheer Abbas at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Limelight Networks' at: https://www.limelight.com/ ### Comms Business Live: Channel IoT Comms Business Live is back with its second series! This week David Dungay and his guest panel will be discussing some of the major themes impacting the IoT market right now and how the Channel can capitalise on the opportunities coming their way. It has been, and to some extent continues to be, a complicated area of technology for partners to grasp but many providers out there have invested considerable resource into making it easy to digest for the Channel. We will be discussing the best ways to get into the market without having to make large upfront investments and also how to avoid some of the common pitfalls which partners can often fall into. Host David Dungay | @CommsGuru4U Guests Dan Cunliffe, MD of Pangea | @Dan_Pangea Anton Le Saux, Strategic Telecoms Leader at Zest4 | @Zest4Anton Nick Sacke, Head of IoT at Comms365 | @Comms365 ### Cloud Expo Asia interview with Joel Norton from Zerostack Disruptive interviews Zerostack's Regional Manager, Joel Norton at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Zerostack at: https://www.zerostack.com/ ### Cloud Expo Asia interview with Kaustubh Patwardhan from Ashnik Disruptive interviews Ashnik's Head - Business Development and Strategy, Kaustubh Patwardhan at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Ashnik at: https://www.ashnik.com/ ### Announcing a partnership - looking after your database and CIFS requirements OpenMinds, the specialist database clustering integration firm announces their strategic partnership with Peer Software, the CIFS file replication and accessibility specialist. 
Speaking during an interview with Compare the Cloud, both parties elaborated on the reasoning behind the alliance: “The joining together of our two companies is the next logical step in the evolution of cloud-based services we provide. It is rare that two companies, such as OpenMinds and Peer Software, completely complement each other. At OpenMinds we deliver services around public, private and hybrid cloud models in regards to database clustering. Peer Software caters for CIFS file replication and collaboration within the same environment,” said Bipin Patel, CTO, OpenMinds. Clearly, from Bipin's perspective, the partnership makes sense. Later, during the interview, Ken Gorman, UK Director for Peer Software, had this comment on the partnership: “With the two firms joining together we now can achieve complete data replication, accessibility and control across a multi-site, multi-vendor and multi-cloud world. Prior to today’s partnership, our respective companies were leaders within each field of expertise. Now, by joining our skill sets together, we are one of the most formidable partnerships that cater for niche and mainstream file systems that have services that make true cloud data services achievable!” So, from both sides, the partnership seems like a win-win scenario. Further developments will unfold over the next few weeks, as I am told there are more exciting announcements coming that are based on data protection and accessibility! ### IoT - The Waiting Game is Over Businesses need to stop waiting around to make the most of the opportunities presented by the Internet of Things (IoT). As the amount of data that flows between these internet-connected devices increases exponentially, organisations that want a competitive edge need to be harnessing it. Really, it’s a no-brainer. Like cloud computing, if businesses don’t have a strategy to make the most of IoT, they are already considered to be behind the times. Recent market research showed that although nearly 90% of execs feel the Industrial Internet of Things (IIoT) is critical to their companies’ success, only 16% have a comprehensive IoT roadmap. With Industry 4.0 on our doorstep, innovation is key, and businesses cannot afford to be left behind. So, how can they plan and execute a winning IoT strategy? How to win with IoT The IoT presents an unmissable opportunity to harness data. One of the industries in which we have seen IoT take off is manufacturing. With IoT, manufacturers optimise their machine productivity and reliability to meet their business goals. Manufacturers can also use real-time insights gathered from their sensors to develop more effective and actionable solutions. Furthermore, this enables them to be more proactive when it comes to machine maintenance, reducing costs and saving on downtime. Another example of an industry in which IoT has proven successful is the transport industry. Forrester predicts that fleet management in transportation will be the hottest area for IoT growth. With the IoT, third-party logistics providers can implement their ideas to improve their vehicle reliability and optimise their fleet’s utilisation. This is all thanks to connecting vehicles to the IoT ecosystem, providing actionable real-time insights.
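As a concrete, if simplified, picture of how those sensor readings turn into maintenance decisions, the sketch below keeps a rolling window of vibration measurements and flags any reading that drifts well outside the machine's recent behaviour. The sensor values, window size and threshold are illustrative assumptions, not taken from any particular manufacturer's system.

```python
from collections import deque
from statistics import mean, stdev

class VibrationMonitor:
    """Flag readings that fall well outside a machine's recent operating range."""

    def __init__(self, window: int = 50, threshold: float = 3.0):
        self.readings = deque(maxlen=window)   # rolling window of recent sensor values
        self.threshold = threshold             # how many standard deviations counts as abnormal

    def update(self, value: float) -> bool:
        """Record a new reading and return True if it looks anomalous."""
        anomalous = False
        if len(self.readings) >= 5:            # wait for a minimal baseline before scoring
            mu, sigma = mean(self.readings), stdev(self.readings)
            anomalous = sigma > 0 and abs(value - mu) > self.threshold * sigma
        self.readings.append(value)
        return anomalous

monitor = VibrationMonitor()
for reading in (0.41, 0.39, 0.42, 0.40, 0.43, 1.95):   # final value simulates a developing fault
    if monitor.update(reading):
        print(f"Maintenance alert: vibration reading {reading} is outside the normal range")
```

In a real factory or fleet the same check would run per asset over a message stream, and an alert would open a work order rather than print a line, but the proactive-maintenance idea is the same: act on the drift before the machine fails.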
With connected vehicles, connected freight and connected fleet, businesses receive improved visibility of potentially hazardous goods, while also benefitting from the ability to improve predicted arrival and departure times. These industry-specific applications of IoT are proof of the ways this technology can reduce costs while optimising productivity. They are also a great example of how the successful planning and implementation of this technology can allow industries dealing with large volumes of data to manage it safely and on a global scale. How can I implement a successful IoT strategy? It may sound obvious, but it’s crucial to start by laying out a series of business goals and carefully defining objectives. Then, and only then, can businesses determine what they need to achieve in order to implement the appropriate IoT strategy. With clear objectives in mind, businesses need to be putting these IoT strategies into place now. With Industry 4.0 and IoT both taking off, it’s important for businesses to realise that the IoT demands a high level of investment, excellent specialists and the command of new technologies. Once implemented, in most IoT projects the IoT loop will only be closed off by continuously monitoring and analysing all relevant Key Performance Indicators (KPIs). These KPIs can be tracked via a dashboard to make it simple to monitor progress, as well as analyse the best ways to reduce costs and improve efficiency. Keep up with the latest IoT trends Although the term IoT has been floating around for a while now, to stay ahead and remain competitive, businesses must continuously evolve their strategic plans in order for the business’ IoT projects to keep up with the demand and need of each area. This involves furthering big data use cases, and taking into account how machine learning and AI might integrate with IoT plans to keep driving the business forward. With smart machines being better than humans at accurately and consistently communicating data, businesses need to look at how this technology will help them remain up to date and competitive. New advancements in technology now allow data scientists to complete their work in modern “AI” languages, and with a touch of a button this can be converted into live and operational automation systems that feed off billions of events and many terabytes of data – it’s game-changing for business. Industries such as manufacturing are leading the way when it comes to innovating use cases for IoT. They are therefore reaping the benefits of driving their businesses forward successfully. As the technology continues to become more widespread and is applied across more areas and sectors, we are bound to see further use cases as brands use IoT to turn their next big idea into a connected reality. So, perhaps that’s the last piece of advice – these vertical use cases show that IoT is essential, but IoT on its own will struggle to win significant attention in your business outside experiments. When combined with deep learning and artificial intelligence (AI) technologies, IoT steps into a world of practical impact that warrants attention in the boardroom. Adding AI to IoT makes the technology smarter, and far more valuable to business operations.
Maybe we should stop saying IoT and instead say AI.IoT - where the AI is specific to the data science that adds value in each vertical industry. ### Cloud Expo Asia interview with John Martin from NetApp Disruptive interviews NetApp's Director of Strategy and Technology, John Martin at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit NetApp at: http://www.netapp.com/us/index.aspx ### Mirror Head reflects new projection possibilities Panasonic has expanded its accessories range with the launch of a digitally operated MirrorHead solution, capable of creating dynamic, eye catching moving projections. Designed for professional show integrators, the series is available in four models (ET-MH14, ET-MH15, ET-MH17 and ET-MH18) suitable for Panasonic projectors from 6,200 to 30,000 lumens, making it ideal for both rental and touring events. [easy-tweet tweet="The extendable Mirror Head has a maximum pan rotation of 180 degrees" hashtags="Panasonic, Gadgets"] The extendable Mirror Head has a maximum pan rotation of 180 degrees and a tilt rotation of 90 degrees. It can be set to move either rapidly or slowly to project pictures, animations, videos and text on to any surface, whether it be moving or static. “Static projection belongs in the past,” said European projector marketing manager Hartmut Kulessa. “The journey into a new visual experience begins here. The technology paves the way to unprecedented possibilities in lighting, projection and interior design.” The MH14 series features a quick, secure snap-on mounting for stack frames but can also be used without frames. The Mirror Head, which is available from July 2017, also comes with a high resolution LED display for easy configuration. To see the Mirror Head in action, please watch the following video:  https://www.youtube.com/watch?v=APpw6ZKIQ3I For more information on Panasonic Visual System Solutions please visit: http://business.panasonic.co.uk/visual-system/              ### Cloud Expo Asia interview with Joseph Lee from Dynatrace Disruptive interviews Dynatrace's Solution Director, Joseph Lee at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Dynatrace at: https://www.dynatrace.com/ ### Cloud Expo Asia interview with Daniel CF Ng from Cloudera Disruptive interviews Cloudera's Senior Director Marketing at APAC, Daniel CF Ng at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Cloudera at: https://www.cloudera.com/ ### Businesses are worried about the challenges of managing data in the lead up to GDPR enforcement Concern about the strict new General Data Protection Regulation continues to focus on security at expense of data privacy St Helier, Jersey, October 24 2017 – 93% of companies are worried about the storage of their data in the cloud after the General Data Protection Regulation (GDPR) and 91% are concerned about how the new rules will affect cloud services, according to new research from Calligo, a world-leading cloud solution provider. The figures were in a survey of 500 IT decision-makers in companies with more than 100 employees and £15 million turnovers, examining how businesses are preparing for the new regulation which comes into force next May. 
[easy-tweet tweet="In relation to cloud services, 46% are concerned about the GDPR’s complexity" hashtags="GDPR, Cloud"] Despite the severe penalties for infringing the GDPR, only 14% said worries about meeting their obligations under the new privacy laws with the wide-ranging new rules for handling and storing data are uppermost on their minds. Security and breaches are the largest area of concern, selected by 41% of respondents. In relation to cloud services, 46% are concerned about the GDPR’s complexity, yet just 15% highlighted privacy. “While our research shows that companies are rightly concerned about how the GDPR will affect the cloud, it is apparent that many are not helping themselves,” said Julian Box, CEO, at Calligo. “Although 89% claim to be very or quite clear about how GDPR will affect their organisation, they don’t seem to be giving due weight to meeting these new privacy obligations.” “Of course, security is a huge concern, but it is only one part of the GDPR jigsaw that all organisations storing personal data of EU citizens have to have in place before the enforcement deadline on 25th May next year.” Box added, “There is little point putting a ring of steel around data you shouldn’t have.” More than half of respondents (52%) said the GDPR would not affect how they use cloud services, ranging from 40% in the legal sector to 100% in education. Less than half (49%) of respondents said continuing doubts about the Privacy Shield (allowing EU citizens’ data to be held in the US in compliance with EU law) would affect their use of the hyperscale cloud. Only 26% said they choose a cloud provider because they are confident about its GDPR effectiveness, whereas 41% make their choice based on scalability. In other findings, the research revealed that the average amount respondents said their organisation is willing to spend on preparing for GDPR is £1.67 million. Calligo is an expert in the General Data Protection Regulation; their offerings include several services based engagements around GDPR as well as their world-leading GDPR enabled Infrastructure as a Service platform. The latest generation of this offering has GDPR workflows in place which help their clients meet their privacy obligation under this new law. A Calligo whitepaper on the GDPR can be found on the website: https://calligo.cloud/gdpr/ebook ### Cloud Expo Asia interview with Ken Soh from Athena Dynamics Disruptive interviews Athena Dynamics' CEO, Ken Soh at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Athena Dynamics at: https://athenadynamics.com/ ### Cloud Expo Asia interview with Bas Winkel from LeaseWeb Disruptive interviews LeaseWeb's Managing Director, Bas Winkel at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit LeaseWeb at: https://www.leaseweb.com/ ### Tech News Show Halloween Special! https://www.youtube.com/watch?v=fmgO2dtO_Go Watch #DisruptiveLIVE Halloween Special now! Here you will watch the latest tech news, trends and interviews. This week's guests include Mostafa EISayed from Automata Technologies and Arthur Leung from Curve.   ### Cloud Expo Asia interview with Royce Teoh from Oracle Disruptive interviews Oracle's Head, Digital Business, Royce Teoh at Cloud Expo Asia 2017. 
Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Oracle at: https://www.oracle.com/index.html ### Cloud Expo Asia interview with Mr Poh Yee Tiong from Procurri Disruptive interviews Procurri's Head of Asia-Pacific, Mr Poh Yee Tiong at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Procurri at: https://www.procurri.com/ ### Trick or T(h)reat? Haunted House study reveals IoT risks Research reveals over 70,000 access attempts to smart homes. A scan for smart home devices lists more than 68,365 open web GUIs globally, and 1,914 from the UK. Sophos reveals 8 expert tips for running a secure Smart Home. Thursday, 26 October 2017 – Yesterday, Sophos launched ‘Project Haunted House’, a continuous attack analysis and assessment of smart homes over a period of several weeks. With the aim of raising awareness of responsible IoT device use, a virtual smart home, simulated for this purpose and including original control and network infrastructures, has been set up, left exposed on the Internet and used as a potential target for attack. The final results of the research project will be published in November 2017; however, first interim numbers from the project have revealed more than 70,000 access attempts from 24,089 individual IPs to our virtual house. A clear tendency is therefore already evident: the Haunted House is definitely no Halloween one-timer but a genuine danger to private smart homes – if not handled correctly. To bolster these numbers and put them in the widest possible context, the project also includes active internet scans for smart home devices via search engines like Shodan or Censys. A scan beginning in October resulted in more than 68,365 open web interfaces from well-established smart home components globally, and 1,914 from the UK, which are primarily used in private households – such as wireless window contacts, smoke detectors, automatic door opening/locking systems, and camera systems. All these devices were easily accessible without a password via the internet. Visualisation via heat maps shows that IoT technology is concentrated in cities and urban centres like London, Manchester and Birmingham, fading out into rural areas. “The sheer numbers emphasise the importance of being cautious while building your smart home”, says James Burchell, Security Specialist. “Otherwise there is a growing chance that it won’t just be trick or treaters at your door this Halloween, but real-life cyber gangsters that are looking for your money and data.” 8 tips to NOT get a Haunted House but a secure Smart Home:
1. Keep your home networks exclusive - Don’t share them with others.
2. Don’t connect IoT devices to your home network if it isn’t necessary - Your TV, for example, doesn’t need to be connected to the WLAN if you mainly watch TV via cable or antenna.
3. Create a separate network for IoT devices - If your WiFi router is able to create various networks (segmentation), you should implement a special network for IoT devices, thus blocking access from those devices to your regular network.
4. Create various sealed-off networks on different WLANs - It is even better to create various sealed-off network areas for the home office, entertainment electronics, building and security technology, or the guest network – each with different WLANs. This can be enforced by a firewall that only allows the communication necessary to use the components, preventing an infection spreading from one IoT device to another. You can install the Sophos UTM Home Edition Firewall for free on your PC.
5. Use secure VPN technology - You shouldn’t use insecure port forwarding on your router to get remote access to your IoT devices from the internet. Use a secure VPN on your smartphone or Mac/PC instead.
6. Keep your software up to date - Install up-to-date AV software on all PCs, Macs and Android smartphones. Free tools like Sophos Home or Sophos Mobile Security are available on the Sophos website.
7. Secure everything with the latest firmware - Not just PCs, laptops and smartphones – every IoT device needs to run the most up-to-date firmware to be as secure as possible. This might be time-consuming but is definitely worth the effort in terms of security and privacy.
8. Google is your friend - You might want to Google potential security gaps in the IoT device you are going to use. This gives you a quick but useful overview of whether the product of your choice is already a focus for hackers or has even been hacked.
### Minkels presents new white paper on implementation of micro data centre strategy Veghel, the Netherlands, 26 October 2017 – The role of data centres is quickly changing, driven by the cloud, data growth and IT cost reduction. This leads to new challenges when it comes to future-proofing data centre infrastructures. Data centre supplier Minkels – part of the publicly traded company Legrand (NYSE Euronext Paris: LR) – has published a number of white papers which can be used as a guide to creating a future-proof and energy-efficient data centre. Minkels is now launching a new white paper about the micro data centre, titled ‘White paper 09 – Micro data centres: from strategy to implementation’. The micro data centre market is expected to expand globally from $2.67 billion in 2017 to $8.47 billion in 2022. Niek van der Pas (Minkels’ Lead Data Centre Expert and author of white paper 09) explains: “Looking at the data centre market, three drivers that are affecting market demand can be distinguished. First of all, more and more applications are ending up in the cloud. As a result, many companies only need a small and clear data centre – though it still has to meet the requirements of a large one.” Other companies choose to outsource their IT to commercial data centres or cloud data centres. “These companies can reduce the size of their in-house data centres, which means more efficiency: lower energy use and more predictable (operational) costs. Last but not least, there is the Internet of Things (IoT) – the main driver in the micro data centre market. In 2014, there were about 14 billion devices connected worldwide. By 2020 there will be 50 billion.
Centralised data processing does not always fit IoT applications, for instance because of low latency requirements (edge analytics). IoT applications therefore require this data to be handled locally by using micro data centres.” IoT has a major impact on the data centre ecosystem and on connectivity. Both the core and the edge layer offer new chances and challenges. “When implemented correctly, micro data centres can become the cornerstone of the modern IT Infrastructure – fulfilling all the needs of a high demanding low latency highly available IT-platform at the edge”, says Van der Pas. “To understand the considerations leading to the selection of the best micro data centre solution, white paper 09 discusses the Data Centre Decoupling Point (DCDP), a stakeholder analysis and a business risk analysis which will be translated into availability, security and resource efficiency demands. All these aspects are projected to the practical implementation of a micro data centre strategy. White paper 09 guides the reader in making the right considerations and choices.” ### Cloud Expo Asia interview with Joshua Au from A*STAR Disruptive interviews A*STAR's Head of Data Centre, Joshua Au at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit A*STAR at: https://www.a-star.edu.sg/ ### Sharing Analytics Across the Enterprise Leads to Smarter Business More and more companies are finding their way to the establishment of an analytics practice that analyses data quicker, better, and faster across the enterprise than could have been conceived of when the journey first started. The enterprise journey from the older business intelligence departments that sprang up from the 1960s to the more modern analytics teams that work across an organisation has been a steadily increasing trend from centralised control to decentralised empowerment. Now, the everyman ‘line of business’ analyst can combine their detailed knowledge of specific business challenges with access to company data to deliver more comprehensive analyses. Now, cloud services have really broken down internal physical and organisational barriers, and made it easier for any kind or size of business to streamline and open up enterprise data, whilst still retaining control of access rights and security. What’s more, complex analytics challenges that used to require statistics grads and quantitative analysts to execute complicated predictive modelling and company reporting are being solved by line of business analysts. Solving these challenges no longer relies on either knowledge of coding (leaving them within the domain of IT users), or are so knowledge-specific that only PHDs can understand the statistical requirements. [easy-tweet tweet="‘Data democratisation’ has been enabled by the rise of big data analytical solutions" hashtags="Data, Analytics"] ‘Data democratisation’ has been enabled by the rise of big data analytical solutions, from tools to whole platforms, and made sophisticated analytics accessible to the average person within business – those ‘line of business’ analysts who most likely wouldn’t consider themselves an analyst, and wouldn’t have it on their job description or title. Nonetheless, regular business people across every department need to answer business questions, plan for the future, and optimise the activities. 
To do all of these things, they need good data to ensure they aren’t making poor assumptions that may render their plans suboptimal. Working smarter, not harder What enterprises have experienced over 60 years of increasingly rapid data explosion, experimentation, and harnessing is a whole new way of making quantifiably better business decisions. At the same time, the ability to derive benefits from data has cascaded down from siloed pockets of ability or knowledge within IT – or from enthusiastic supporters in particular departments like marketing or finance – into every department. Now, leading businesses allow employees who need to access data as part of their role to become their own analyst, rather than funnelling reporting and query requests to a centralised but non-business expert function. But for governance and security, a central resource is often maintained to act as a guardian of data quality and ensure that the business uses its data appropriately and effectively. With all users familiar with the chosen analytics platform, the business is able to allow data to be accessed seamlessly from end to end of the business, with none of the artificial breaks for fresh data prepping, cleansing, or reworking that bedevil the average enterprise. With this arrangement, an enterprise is able to unleash four things: Accessible data for all who need it It’s no use Finance trying to assess how effectively Marketing budgets are being spent if the two teams aren’t using the same metrics or underlying data. The whole business should be working to common standards and from a shared view of the truth. Faster time to decisions With data from all teams able to be interrogated and shared, there are no more business silos, requests for information, or endless back and forth. Business people make the decisions they need to with the data they have permission to access. It can be that simple. More advanced business benefits Ideally, all levels of employees should be able to learn not only from data analysts, but from each other as their confidence and use of data analytics grows and a culture of fact-based decision-making matures. Then, instead of creating a dashboard or a report once or twice, they use analytics to solve much bigger business challenges. In fact, every employee has it within them to be a deliverer of enhanced business value by discovering their own marginal (or more major) gains in every corner of the business’s activity. Freeing up IT There’s no longer a good reason for IT to own all the data, and be required to run reports at the behest of other departments. It’s an inefficient use of resources and one that no longer makes sense in a data-democratised world. There are analytic platforms that empower analysts and business analysts to more easily consume data and make more informed business decisions by avoiding the bottlenecks of a traditional centralised data repository. As with all cloud-based solutions, they are designed to meet expectations for flexibility and scalability, without sacrificing security or performance. Simply put, a decent analytics platform should make it easier, quicker and safer to deploy analytics across an organisation. From A to Z (Analyst to Zettabytes of data!) Democratic analytic solutions that empower the regular business user engender a data culture. Where possible there should be a Chief Data Officer (CDO) or other authority steering the business with best practices, whilst allowing decision-makers the freedom to experiment with their data.
These are the core components of a smarter analytics programme that holds the potential to invigorate every element of the business. Adding a cloud element may allow a business to break down any remaining silos and mature the data culture at speed, providing analysts and business analysts with access to business-critical insights to make more informed, data-driven decisions in less time. When analysts can build repeatable workflows that prep, blend and analyse data, they can publish these to decision makers. Decision makers can then browse, customise and execute workflows on demand, without interrupting the analyst who originally created them – once again saving time and effort. ### Spotzer to Transform Service Experience for Global Customer Base with NewVoiceMedia LONDON, 26 October 2017 – NewVoiceMedia, a leading provider of cloud contact centre and inside sales technology that enables businesses to have more successful conversations, is helping Spotzer enhance its contact centre operations and global customer service experience with its ContactWorld platform. Spotzer is a white-label provider of bespoke digital marketing solutions for large enterprise customers who service small businesses, and has product delivery and service teams across four continents. The company builds 7,500 websites a month, and speaks to thousands of customers throughout the process. With a strong focus on Net Promoter Score, it is important to Spotzer to have the best possible technology stack to deliver on its key value of “value every customer”, making sure that the customer experience is second to none. [easy-tweet tweet="NewVoiceMedia’s cloud contact centre technology also integrates seamlessly with Spotzer’s CRM platform" hashtags="CRM, Cloud"] The company selected NewVoiceMedia’s ContactWorld for Service solution for its reliability, ease of use, insight into outbound calls through detailed analytics, and the strength of its global platform, including new call routing architecture which optimises contact centre management and operations, while ensuring the highest quality customer experience across the world. NewVoiceMedia’s cloud contact centre technology also integrates seamlessly with Spotzer’s CRM platform to ensure all customer interactions are tracked and service levels measured. With ContactWorld’s click-to-dial feature, Spotzer’s consultants are now able to make thousands of outbound calls annually to clients without having to spend time searching for and manually dialling each number. Inbound calls can also be intelligently managed and routed, ensuring callers are connected directly to their account manager – improving handling time and customer satisfaction. Consultants also benefit from immediate access to a customer's entire history of interactions and calls. Furthermore, advisors can log into the same system wherever they are, as all they need is a phone and internet connection, meaning they can work from multiple locations. The platform offers a real-time window into the entire contact centre operation, so advisors can be easily managed, and customisable, rich reports allow the company to understand where improvement opportunities exist. Peter Urmson, CEO at Spotzer, comments, “As a global leader in digital marketing solutions for businesses who need to solve problems for their SMB customers, it’s essential for us to provide the highest quality customer service experience.
Building in the NewVoiceMedia solution is a key part of the evolution of Spotzer, offering a world-class contact centre solution that is fully integrated into our CRM platform. We were excited to hear that NewVoiceMedia maintains the highest level of integration with our CRM platform in the industry and that ContactWorld could be easily integrated with our existing systems. We now have flexible, reliable technology with real-time insights into our entire business, meaning we can provide the best possible customer service experience to our global customer base”. Dennis Fois, president and chief operating officer at NewVoiceMedia, adds, “We’re extremely pleased to be working with Spotzer and look forward to seeing the company transform its customer experience and business efficiencies with ContactWorld. With a true cloud environment, we’ve not only provided the company with a reliable and feature-rich contact centre solution, but our technology is completely flexible and scalable and will continue to support the business throughout its future rapid international growth". ### Cloud Expo Asia interview with Andrew Martin from Zerto Disruptive interviews Zerto's VP Asia Pacific and Japan, Andrew Martin at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Zerto at: https://www.zerto.com/ ### Cloud Expo Asia interview with Oliver Gerhardt from Wavecell Disruptive interviews Wavecell's CEO, Oliver Gerhardt at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Wavecell at: https://wavecell.com/ ### Thales Announces new Security-as-a-Service for Centralised Control of Encryption Keys CipherTrust Cloud Key Manager delivers full security, visibility, and management for enterprises to control their “bring your own” encryption keys Cambridge, UK. – Oct. 26, 2017 – Thales, a leader in critical information systems, cybersecurity and data security, announces CipherTrust Cloud Key Manager, which supports the bring your own key (BYOK) capabilities of Microsoft Azure Key Vault and Amazon Web Services (AWS) Key Management Service (KMS). The solution allows users of these dominant public cloud solutions to meet compliance mandates and further protect their most sensitive data by creating and managing encryption keys separate from their cloud provider’s infrastructure. [easy-tweet tweet="To help save time & money, a growing number of enterprises are eschewing legacy technologies in favour of cloud and SaaS environments" hashtags="Cloud, Technology"] To help save time and money, a growing number of enterprises are eschewing legacy technologies in favour of cloud and SaaS environments. While these technologies are digitally transforming businesses, they present challenges: enterprise data is fair game for cybercriminals regardless of operating environment, and meeting compliance and best-practice requirements isn’t always straightforward. In response, enterprises are developing encryption strategies to better protect and control their data. While effective, this presents a new hurdle: given that many enterprises use multiple cloud providers, managing the resulting encryption keys can prove difficult.
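To see why this quickly becomes unwieldy, consider what the key lifecycle looks like when it is handled natively, one provider at a time. The sketch below is purely illustrative – it uses AWS KMS via boto3 rather than anything Thales-specific – and repeating the equivalent steps for Azure Key Vault, Salesforce Shield and every other service is exactly the per-cloud overhead a centralised BYOK manager aims to remove.

```python
# Illustrative sketch only (not the CipherTrust API): the native key lifecycle
# in a single cloud, using AWS KMS through boto3.
import boto3

kms = boto3.client("kms", region_name="eu-west-1")

# Create a customer-managed key and switch on automatic annual rotation.
key = kms.create_key(Description="example-app data key (illustrative)")
key_id = key["KeyMetadata"]["KeyId"]
kms.enable_key_rotation(KeyId=key_id)

# Encrypt a small payload under that key; decrypting it needs the same KMS key,
# so auditing how, when and by whom the key is used stays inside this one cloud.
ciphertext = kms.encrypt(KeyId=key_id, Plaintext=b"sensitive record")["CiphertextBlob"]
plaintext = kms.decrypt(CiphertextBlob=ciphertext)["Plaintext"]
assert plaintext == b"sensitive record"
```

Multiply those few calls by every provider, region and application an enterprise runs, and the case for a single pane of glass over key creation, rotation and usage reporting becomes clear.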
Thales CipherTrust Cloud Key Manager offers a number of benefits to help enterprises control and secure encryption keys in multi-cloud environments, including:
- Providing unique, enterprise-ready encryption key lifecycle management spanning an ever-growing list of leading cloud vendors (Salesforce, Microsoft Azure, AWS)
- Centralising multi-cloud encryption key creation and management separate from the cloud provider’s control, with a choice of SaaS or on-premises deployment
- Achieving compliance with a FIPS 140-2 and Common Criteria certified key store, with visibility into how, when and by whom encryption keys are used through logging and a set of built-in usage reports

Peter Galvin, VP of Strategy at Thales eSecurity, says: “Organisations are struggling to manage an exploding number of encryption keys. CipherTrust Cloud Key Manager puts control in the hands of enterprises rightfully concerned about the compliance and data protection challenges inherent in multi-cloud environments. The intuitive and well-designed as-a-service offering makes managing encryption keys simpler by eliminating the need to architect, purchase and deploy hardware. Through an easy-to-use web interface, organisations can simply create, rotate and back up keys in a growing list of cloud providers”. Sumedh Barde, Group Program Manager at Microsoft, says: “Our largest customers have heterogeneous application environments. Managing keys across these diverse locations is complex. CipherTrust Cloud Key Manager solves this problem by giving our customers a single pane of glass to discover, manage, and monitor their encryption keys across Microsoft Azure and Office 365, as well as other locations. The solution leverages Microsoft’s native key service, Azure Key Vault, so our customers continue to get the best experience from their apps in the Microsoft cloud. All of this makes the Thales solution a welcome addition to the Microsoft Azure security ecosystem”. Available as a service, or for on-premises deployments, CipherTrust Cloud Key Manager supports Microsoft Azure, including Office 365, as well as Salesforce Shield Platform Encryption and Amazon Web Services. For more information, please click here. ### Cloud Expo Asia interview with Jerzy Szlosarek from Epsilon Disruptive interviews Epsilon's CEO, Jerzy Szlosarek at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Epsilon at: https://www.epsilon.com/ ### Huawei Launches VR OpenLab Industry Cooperation Plan to Promote Construction of Cloud VR Industry Ecosystem [Hangzhou, China, October 26, 2017] Huawei officially released its Virtual Reality (VR) OpenLab industry cooperation plan at Ultra-Broadband Forum (UBBF 2017) to promote the prosperity of Cloud VR, facilitate technological innovations, incubate business scenarios, and construct a comprehensive Cloud VR industry ecosystem. Huawei and partners jointly released the VR OpenLab industry cooperation plan. VR development in games, videos, and live broadcast will increase drastically in the next two years, according to Gartner's 2017 Hype Cycle for emerging technologies. Due to the characteristics of Cloud VR, which include lightweight terminals, low costs, and cloud rendering, deployment can be achieved rapidly on a large scale, making Cloud VR an inevitable trend of future VR development.
[easy-tweet tweet="Cloud VR poses higher latency and bandwidth requirements on the backhaul network" hashtags="Cloud, VR"] However, Cloud VR development is still in its infancy and requires a joint effort from the entire industry for faster development. Cloud VR poses higher latency and bandwidth requirements on the backhaul network, which means current network architecture and technologies are still unable to effectively support Cloud VR. Therefore, industry partners must seek solutions to create value and construct business models. The VR OpenLab industry cooperation plan aims to help upstream and downstream partners jointly explore the development path of the Cloud VR industry. VR OpenLab will focus on four research directions: business application scenarios, service solutions, bearer network innovations, and operators' service implementation. Through end-to-end industry cooperation, VR OpenLab bridges the ecosystem breakpoints and fully promotes the commercial popularization of VR. As the initiator of the VR OpenLab industry cooperation plan, Huawei will focus on supporting the development of the cooperation plan relying on iLab. By providing a 1,000 m2 R&D lab, E2E network devices, and dozens of experts, Huawei will cooperate with industry partners for joint innovations, facilitating business success. Currently, 30 partners have already joined the VR OpenLab industry cooperation plan. During UBBF 2017, Huawei and partners jointly demonstrated multiple innovations, including Cloud VR videos, Cloud VR games, VR live broadcast, remote large-space VR e-Sports and live broadcast, VR social fitness, and VR music. ### Cloud Expo Asia interview with Peter Fleming from Deskera Disruptive interviews Deskera's CFO, Peter Fleming at Cloud Expo Asia 2017. Compare the Cloud’s event coverage platform, Disruptive partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Visit Deskera at: https://www.deskera.com/ ### Use Chatbots Realistically Or Risk Turning Off Your Customers VoiceSage’s Matt Weil argues that naive AI coverage harms our appreciation of what this useful technology can actually do In the summer, a story gripped the world’s media: a sinister-sounding AI alert. The claim was that Facebook had had to end an AI-powered chatbot trial that was deemed a threat. And as a result, the press had a field day: 'Robot intelligence is dangerous: experts’ warning after Facebook AI develop their own language'  was typical of the headlines the scare generated. The story was about the fact that two chatbots had created their own ‘secret language’ (doubtless to be used for that Robot Takeover). However, all this stemmed from an innocuous trial of Natural Language processing in the context of negotiation. Facebook AI coders had been experimenting with programs that negotiated with each other as part of a project to understand how linguistics played a role in such discussions: the software were programmed to experiment with language in order to see how that affected their dominance in the negotiation, and they worked out some clever shortcuts. We soon witnessed stories that corrected the misleading impression that there was anything sinister afoot: as Gizmodo says, "In their attempts to learn from each other, the bots thus began chatting back and forth in a derived shorthand. But while it might look creepy, that's all it was." However, it’s adding up to more fuel for public unease about AI that is misplaced. And of course, too many people don’t understand AI in general. 
That is a problem, as what chatbots (the most immediate and eminently practical application of AI ideas to our space) can and can’t do really matters in a business and customer service context. Over-hyping has negative repercussions. Early usage of chatbots has in many cases disappointed brands because they have pushed the technology to replace humans instead of augmenting them. Take, for example, this recent discussion: Putting Human Call Center Agents on Hold. It’s a well-researched overview, but it’s predicated on the claim that when it comes to AI in customer outreach, it’s going to be all about replacing agents – “The impact of artificial intelligence on contact centre operations will be significant… Chatbots are the first step, but automated bots that naturally interact with callers will initially reduce and eventually eliminate all but the most complex human agent interactions during the next decade.” Although this author admits this probably won’t happen inside of five years, he’s certain that it’s inevitable. [easy-tweet tweet="Great brands like Twitter and Airbnb are already successfully using chatbots" hashtags="AI, Chatbots"] Great brands like Twitter and Airbnb are already successfully using chatbots. And there’s no question that chatbots are a really useful technology, but the fatal error is to think chatbots are all that is required. The public likes problems solved quickly, and a slick interface that lets them complete tasks like updating an address without having to speak to a human after multiple levels of ID checks. As a recent Business Insider analysis shows, 44% of US consumers said that if a company could get the experience right, they would prefer to use a chatbot or automated experience for CRM. Chatbots are not a panacea But note the key phrase – if a company could get the experience right. Customers are very good at spotting over-eager AIs. If a chatbot fails to help you find out if your delivery will come in on time, the experience becomes a negative one. Brands pushing too hard to automate too much of the customer outreach process – seeking cost-savings and efficiency – will face a backlash as a result. Brands are right to value automating some of the outreach process, and the efficiency gains that follow as agents are moved to higher-value work. However, just as we need to exercise some scepticism about headlines predicting a robot takeover tomorrow, organisations also need to strike a balance over the right use of AI in customer service. Chatbots are useful for firms, but be realistic – they will work as an addition and a support for the contact centre, not a replacement. Use AI/chatbots to automate simple, repetitive tasks the user needs to do, or as a fallback, but it would be foolish to adopt them too hastily or to make them the only arrow in your proactive customer service quiver. ### Behind the Expo interview with Grant Caley from NetApp Andy Johnson interviews NetApp's Principal Technology Officer, Grant Caley at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit NetApp at: http://www.netapp.com/ ### Behind the Expo interview with Sean McAvan from Navisite Andy Johnson interviews Navisite's Managing Director, Sean McAvan at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over.
Visit Navisite at: http://www.navisite.co.uk/ ### Netskope Content-Aware Information Rights Management Program Provides Protection that Follows the Data Integrations and a new method provide cloud data protection for the “last mile” London, UK, 25th October 2017 – Netskope, the leader in cloud security, today announced the Netskope Information Rights Management (IRM) Program. The program leverages integrations from Microsoft, TITUS, and Vera, combined with the contextual awareness of the Netskope Active Platform, to protect data beyond the reach of any control point. This protects so-called “last-mile” data such as that downloaded to a USB drive, shared from device to device via Bluetooth, or resident in a shadow IT cloud service or an unmanaged endpoint. Data given this data-centric classification, control, and protection can only be unlocked by the customer or an intended recipient. The patented Netskope Context Engine provides insight into the “who,” “what,” “when,” and “where” surrounding the data, enabling Netskope customers to invoke IRM policies during critical moments such as when employees need to share data and collaborate with a third party in a secure manner. According to the Netskope Cloud Report, the average enterprise uses more than 1,000 cloud services, and users are increasingly accessing them from remote locations and from mobile devices that may be either managed or unmanaged by IT. This escalates the importance of making IRM not overly reliant on access to a particular app or suite of apps. Netskope IRM policies use the patented Netskope Context Engine to capture transaction details such as a user’s location, device, device classification, activity (such as “upload,” “download,” “share,” and “post”), and the presence of a third-party collaborator. Netskope uses these transaction details and applies cloud data loss prevention (DLP) to determine the sensitivity of the data. If deemed sensitive, Netskope can now classify, label, protect, and enforce usage policies on the data through integrations with partners such as Microsoft, TITUS, and Vera. These relationships are key components in Netskope’s IRM and data-control strategy, giving customers the ability to apply advanced data classification, file security, and tracking of the data itself. “Every customer eventually asks the ‘last mile’ question: What are you doing to protect data where you don’t have a presence?” said Sanjay Beri, founder and CEO, Netskope. “By virtue of being in the cloud and built to govern cloud services, Netskope already has a significant head start on that challenge. Now, with our strategic IRM partners, we’re helping our customers address the ‘last mile’ by providing data protection that stays with the data, period. This is a significant milestone for the industry and something that will provide our customers with the best data protection available in the market.” [easy-tweet tweet="With the ongoing avalanche of data breaches, businesses are begging for a more profound approach to security" hashtags="Security, Data"] “With the ongoing avalanche of data breaches, businesses today are begging for a different, more profound approach to security,” said Ajay Arora, CEO and Co-founder of Vera. “The only way to achieve true protection is to secure the data itself, and Netskope is addressing this challenge head-on.
Together, we’re giving enterprises the ability to establish persistent, dynamic protection over sensitive information at the moment it matters most — in the cloud, and through the last mile.” “For years the industry has been riddled with challenges associated with cloud data security, and Netskope has worked to address them with sound technology and pragmatism when it comes to user experience,” said TITUS CEO Tim Upton. “Netskope’s IRM Program illustrates their commitment to cloud data security, and the integration we’ve built together allows our customers to extend user and machine-driven classification beyond the reach of any control point.” ### Behind the Expo interview with Andrew Lloyd from Corero Network Security Andy Johnson interviews Corero Network Security's President and EVP Sales and Marketing, Andrew Lloyd at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Corero Network Security at: https://www.corero.com/ ### Skybox Security Raises $150 Million Led by CVC Capital Partners' Growth Fund with Participation from Pantheon SAN JOSE, Calif. (October 25, 2017) — Skybox™ Security, a global leader in cybersecurity management, announced today that the company has signed a definitive agreement to receive a $150 million growth equity investment led by CVC Capital Partners’ Growth Fund (CVC Growth) for $100 million, with participation from Pantheon for $50 million. CVC Capital Partners is a leading private equity and investment advisory firm, and Pantheon is a prominent global investor in private equity, infrastructure and real assets. Based in Silicon Valley, Skybox has a compound annual growth rate (CAGR) of 46 percent and positive cash flow (2014–2016). This round of funding will enable an accelerated investment in sales and marketing, customer care and R&D. It will also be used for potential M&A activity, to capitalize on the approximately $10 billion market opportunity in cybersecurity management. Skybox builds best-in-class cybersecurity management software that gives customers comprehensive visibility of their unique attack surface. The software uses analytics to prioritize an organization’s risk exposures and recommend informed action to best address those exposures. These capabilities extend across highly complex networks, including those in physical, virtual, cloud and operational technology (OT) environments. The company’s broad platform, the Skybox™ Security Suite, enables organizations to reduce security risks that attackers can find and exploit, such as device misconfigurations and policy violations, as well as exposed and unpatched vulnerabilities. “Skybox’s track record is impressive and there is clear demand for their solutions,” said Jason Glass, senior managing director of CVC Growth Partners. “It is a true leader in cybersecurity management, helping organizations better protect themselves and become more efficient.
Gidi Cohen, Skybox’s co-founder and CEO, is a respected innovator in security management and analytics, and we look forward to working with him and the wider executive team as we expand Skybox’s offering and global reach.” [easy-tweet tweet="The demand for Skybox’s solutions has been increasing due to a maturing cybersecurity market" hashtags="Skybox, Security"] The demand for Skybox’s solutions has been increasing due to a maturing cybersecurity market and the desire among security leaders to more efficiently manage security programs and gain better ROI from existing technologies. In the first half of 2017, for example, Skybox showed a 62 percent increase in sales and a 59 percent increase in product transactions compared to the same period last year (January 1 – June 30). “Enterprises, governments… everyone is either embarking on or going through massive digital transformation, and this means new challenges for security because the attack surface of these organizations is growing more complex,” said Skybox CEO Gidi Cohen. “We’ve been consistently evolving our technology to meet those challenges. With this investment, we’ll accelerate that innovation, focusing on some of the most critical areas, such as security management for the cloud and the OT networks that control critical infrastructure.” Skybox customers include Global 5000 companies and government agencies in more than 50 countries and nearly every industry. The company’s products are used by six of the top 10 global banks, 10 global telecommunications firms, five of the world’s largest consumer goods manufacturers and 10 of the largest energy providers globally. Jason Glass and John Clark, managing partner of CVC Growth Partners, will join Skybox’s board of directors. CVC Growth has secured total equity commitments of $1 billion and invests primarily in North America and Europe, focusing on high-growth software and technology-enabled services businesses in a variety of sectors, including security, cloud computing, mobility, compliance, payments, financial technology and vertical software. ### Behind the Expo interview with James Slaney from Dubber Andy Johnson interviews Dubber's Co-Founder and Head of Product, James Slaney at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Dubber at: https://www.dubber.net/ ### Behind the Expo interview with Rob Pickering from IPCortex Andy Johnson interviews IPCortex's Founder and CEO, Rob Pickering at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit IPCortex at: https://www.ipcortex.co.uk/ ### Dell Expands PCaaS, Management Options for Workforce Transformation London, UK – Oct.
24, 2017
- Next-generation organisations look to Dell for new ways to procure, deploy and manage devices and PCs
- PC as a Service speeds up and simplifies PC lifecycle management
- PCaaS now includes a Services Delivery Manager, complete lifecycle services, flexible financing and software options for greater security and management
- Dell and VMware jointly engineer a Dell Client Command Suite and VMware Workspace ONE, powered by AirWatch, solution to manage devices from one console at both the OS and firmware level
- Windows 10 Provisioning by VMware AirWatch will enable zero-IT-touch provisioning of new Dell PCs drop-shipped direct to users

Innovative devices like XPS laptops with InfinityEdge displays are flooding the market thanks to the demands of mobile workers. According to the 2016 Dell and Intel Future Workforce Study, 81% of young workers say technology influences the job they take. These demands are also driving a rethink of the traditional PC lifecycle model. Next-generation organisations are looking to Dell to help them procure, deploy and support multiple types of devices and easily access, manage and update apps, while protecting against security breaches, driving up productivity and driving down cost. To aid customers in their workforce transformation and make Dell PCs the easiest to procure, deploy and manage in the industry, Dell is expanding its PC as a Service (PCaaS) solution. Dell and VMware are also introducing the zero-touch Windows 10 Provisioning by VMware AirWatch service, as well as joint engineering initiatives to integrate VMware Workspace ONE, powered by AirWatch technology, with Dell Client Command Suite to enable the next step in cloud management. These two solutions provide organisations with a choice in how to simplify and streamline PC management, whether they choose a full-service solution for procuring, deploying and managing PCs, or opt for self-service PC deployment and management. Dell PCaaS enhanced to simplify PC lifecycle management [easy-tweet tweet="Dell PCaaS enables customers to reduce their daily burden and cost of IT management" hashtags="Dell, IT"] Dell PCaaS combines the latest hardware and peripherals, software, lifecycle services (including deployment, support and asset recovery) and financing into one all-encompassing solution with a single, predictable price per seat, per month. PCaaS enables customers to reduce their daily burden and cost of IT management, allowing them to focus on the transformation of their business with confidence. Research shows that customers can reduce the costs of PC lifecycle management by up to 25% with PCaaS. New standard features provided with every engagement include a full suite of services and financing options. PCaaS Services Delivery Manager: Services Delivery Managers provide a deep level of services engagement for all stages of the customer’s PC lifecycle. This manager is the single point of contact for the customer from initial planning to the end of the lifecycle, and coordinates with deployment, support, Dell Financial Services and asset recovery. Flexible Financing: Business needs change over time, so Dell Financial Services provides flexible financing options that include the ability to flex the number of systems up or down and/or upgrade PCs mid-term by up to 5, 10 or 15% within the 36 and 48-month term options. Complete Lifecycle Services: Dell has added asset recovery into the standard PCaaS services offer.
Asset recovery enables a simplified return of the old equipment to Dell Financial Services with an option for an on-site data wipe service. Dell has also expanded the software options available for PCaaS customers to help protect, secure and manage their investment. These options include Dell Endpoint Security Suite Enterprise for file-based data encryption and advanced threat prevention; VMware AirWatch for modern, over-the-air unified endpoint management; or Absolute Data and Device Security for asset tracking and adaptive endpoint security. Also included in the standard PCaaS offer is the ProDeploy Client Suite to provide full deployment from project management through planning, configuration and integration. The ProSupport Suite for PCs and Tablets helps meet all support needs, including priority access to expert support and proactive monitoring for automatic issue prevention and resolution. “With digital and workforce transformation, the way people work is changing,” said John Moody, vice president, Client Solutions Services Product Group, Dell. “To respond to this shift, IT needs a simpler solution to provide end users with the most relevant and secure technology, and manage it throughout the entire PC lifecycle. They need flexibility in how they purchase, manage and retire PCs responsibly – with one point of contact throughout – and we deliver that with PCaaS.” VMware AirWatch integration and joint engineering for Dell Windows 10 PCs For customers who do not require the full lifecycle services offered by PCaaS, Dell is also introducing Windows 10 Provisioning by AirWatch. The integration of AirWatch unified endpoint management (UEM) technology, which powers the VMware Workspace ONE digital workspace platform, will simplify the deployment of new Dell PCs. This will improve the experience for both IT and end users by reducing the touchpoints between ordering and getting an employee up and running. It provides a number of self-service options, including greater visibility into the status of the device as it’s configured and the ability to download approved apps and reset passwords, greatly reducing the need to contact the help desk. Unique to Dell, AirWatch also integrates with Dell Client Command Suite, allowing firmware settings to be configured from the cloud as part of the zero-touch provisioning process. These settings can then be updated or changed over-the-air in real time. “What’s different about what we’ve been able to jointly engineer with AirWatch and Dell Client Command Suite is you can now manage all devices from one console and manage them at the firmware level in addition to the OS,” said Brett Hansen, vice president, Client Software and general manager, Data Security, Dell. “Now, IT can optimize power management for peak and off-peak times, get reports on battery and system health to reduce user downtime and data loss, and remotely manage the BIOS settings for increased security. Dell is the only PC manufacturer with this depth of visibility into PC health and status both above and below the OS.” Availability The PCaaS enhancements announced today are available now from Dell for OptiPlex desktops, Latitude laptops and Dell Precision workstations in the United States, Canada and the UK. They are also available as a resell option through the channel. Windows 10 Provisioning for AirWatch and Dell Client Command Suite is available worldwide on Dell Latitude, OptiPlex and select XPS PCs, and Dell Precision workstations.
Quotes Anurag Agrawal, Chief Analyst, Techaisle: “PCaaS offers multiple benefits including predictable cost, streamlined procurement, lifecycle management, and access to current technology. Our research shows PCaaS contracts ensure that users are equipped with current and capable devices, significantly improving productivity while reducing exposure to security threats.” Noah Wasmer, senior vice president, mobile products, End-User Computing, VMware: “Employees expect their device onboarding experience at the office to be as seamless as it is for them to purchase and set up a personal mobile device. With the integration with AirWatch, the unified endpoint management technology that powers Workspace ONE, setting up a Dell PC can now be that simple. With just a few clicks, an employee can have their new device up and running with all their preferred applications, making them productive in minutes and helping IT drastically reduce the amount of time spent on device provisioning and onboarding.” Additional resources These resources are available to learn more about PCaaS: webpage, video, infographic, whitepaper and channel partner information. Connect with Dell and follow the latest news on Twitter from @Dell. Find out more about how Dell Technologies is collaboratively solving customers’ biggest challenges by visiting Dell Technologies’ Annual Report to Customers ### Behind the Expo interview with Daniel Power from OneLogin Andy Johnson interviews OneLogin's Sales Manager, Daniel Power at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit OneLogin at: https://www.onelogin.com/ ### Behind the Expo interview with Meredith Amdur from Rhetorik Andy Johnson interviews Rhetorik's CEO, Meredith Amdur at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Rhetorik at: https://www.rhetorik.com/ ### UKFast Triples Public Sector Revenue Investment in infrastructure and personnel delivers instant returns Cloud hosting specialist UKFast’s public sector department is on track to triple revenues in 2017, following significant investment in the firm’s public sector and government offering earlier this year. Recent new business highlights for the leading British provider include a £250,000 deal with the Cabinet Office, a £500,000 deal with software development service CDS, a £266,000 deal with enterprise mobility management provider Nine23 and, most recently, a £375,000 deal with an undisclosed public sector organisation. A £2.3m investment in UKFast’s government data centre space earlier this year ensures the firm meets the heightened regulatory needs of its government, financial services, international telecoms and utilities clients. The data centre upgrade was shortly followed by the acquisition of public sector cloud hosting firm Secure Information Assurance (S-IA). UKFast CEO Lawrence Jones said: “Government and the public sector is a major growth area for UKFast. Our acquisition of S-IA was timely, and with it comes relationships with organisations including the MoD, Cabinet Office and other high-profile government departments. “We're ensuring the government knows there is a better British cloud alternative, more aligned to the needs of the great people of this country.
It's not just better value for money either, government deserves clearer billing, they want to know where their data is and they value having people at the end of a telephone day and night on standby to serve them when they need it most. “They get all this for less, and at the same time, the money spent with us stays onshore, not lining the pockets of the two richest men in the world. Buying British is positive for our economy and it starts with government leading the way.” Martin Knapp, head of public sector at UKFast, says that since S-IA joined forces with the hosting giant, his team has seen a marked increase in enquiries. “We’re investing heavily in Restricted LAN Interconnectors (RLI) for defence, our Public Services Network (PSN) for central government and the Health and Social Care Network (HSCN) for health service capabilities.  “That investment is reflected in a clear growth in the number and scale of the projects we're being asked to deliver. The UK public sector is seeing us as an increasingly credible alternative to UKCloud and the US hyperscalers. “We’re also seeing a great deal of interest from private firms wanting to tap into our expertise and accreditations to help them sell into the public sector.” [easy-tweet tweet="This year, Jones’ technology group is seeing a significant rise in enquiries from the public sector for cybersecurity" hashtags="Technology, CyberSecurity"] Following several high-profile cyber-attacks affecting government and public sector this year, Jones’ technology group is seeing a significant rise in enquiries from the public sector for cybersecurity and ethical hacking services. Jones added: “The addition of our cybersecurity arm Secarma adds an extra dimension, which is proving valuable for our public sector clients who are putting security higher up the agenda when considering outsourcing.   “But the biggest reason we are doing so well is that we don't have hidden costs. CIOs are becoming increasingly savvy to this now and don't want to be caught out with surprise billing as their usage increases. Ours is a simple model that allows you to scale without the penalty.  “The future of all cloud is scaling with fixed costs. It's a blend of the AWS model with our ‘no surprises’ approach." ### Blockchain, Machine Learning, Robotics, AI and Wireless Technologies will reshape digital business in 2018 Dimension Data says Blockchain has immense potential to disrupt and transform the world of money, business, and society London, UK – 24 October 2017 – Blockchain, together with artificial intelligence, machine learning, robotics, and virtual and augmented reality, have the potential to deliver disruptive outcomes and reshape digital business in 2018. And companies that have not started the digital investment cycle are at high risk of being disrupted.  This is according to the list of top IT predictions for 2018 published today by Dimension Data. But the top trend for the coming year is the adoption of Blockchain - the technology behind Bitcoin - and its immense potential to disrupt and transform the world of money, business, and society using a variety of applications. Ettienne Reinecke, Dimension Data’s Group Chief Technology Officer, says Blockchain has gone from strength to strength.  “Last year, when we looked at the top digital business trends for 2017, we predicted that centralised transaction models would come under attack. We were spot on. 
In the financial services sector, we’ve seen the US and European capital markets moving onto Blockchain platforms, and similar activity in markets such as Japan. Considering how conservative and compliance-focused this sector is, that’s quite remarkable. “It’s ironic that the cybercriminals who perpetrated the recent WannaCry ransomware attack could hold a federal government to ransom and demand to be paid in Bitcoin. Bitcoin might be a crypto-currency, but it’s based on Blockchain, and if cybercriminals are confident that Bitcoin provides a safe mechanism for the payment of ransoms, it indicates just how secure the distributed ledger approach is. I believe that Blockchain has the potential to totally re-engineer cybersecurity, but the industry has yet to come to terms with it.” [easy-tweet tweet="In the world of IoT you’re generating millions of small transactions that are being collected from a distributed set of sensors" hashtags="IoT, Blockchain"] Reinecke predicts that Blockchain will also deliver on the promise of Internet of Things (IoT) in the year ahead.  “In the world of IoT you’re generating millions of small transactions that are being collected from a distributed set of sensors. It’s not feasible to operate these systems using a centralised transactional model: it’s too slow, expensive, and exclusive. To extract the true value from IoT technology you have to be able to operate in real time. Once a sensor alert is received from a control system you must react to it, meter it, and bill for it instantly – all of which negates the viability of a centralised transactional authority. The cost of the transaction has to be near-zero or free, and the cost elements of a centralised model simply don’t support the potential business model in IoT,” he explains. In 2018, some interesting applications of Blockchain and IoT in the area of cybersecurity will emerge. Significant attacks have recently been launched from low-cost IoT endpoints, and there’s very little incentive for manufacturers of these devices to incur the cost of a security stack, which leaves them extremely vulnerable. Blockchain can play a fundamental role in securing these environments. Another exciting trend to look forward to is the boom in new wireless technologies that will enable IoT and bring us a step closer to the dream of pervasive connectivity. Some of these advancements will include 5G and Gbps Wi-Fi, new controls, virtual beacon technology, and low power, long distance radio frequency. There’s also a “digital fight-back” coming on the part of certain incumbent players. Established businesses that have proactively transformed into digital businesses, modernised their architectures, and embedded high levels of automation into their operations have a window of opportunity to claw back market share in the year ahead. That’s because there’s been an increase in the number of cloud-born start-ups themselves starting to be disrupted in certain industries. “I predict that a number of digitally transformed incumbents will successfully start reclaiming their markets because they have more credibility, longer histories, an established customer base, and assets that can stand the test of time,” says Reinecke. Read more about Dimension Data’s 2018 predictions for cybersecurity, digital infrastructure, hybrid cloud, digital workplaces, digital business, and customer experience. 
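Reinecke’s point about distributed ledgers and near-zero-cost IoT transactions is easier to grasp with a toy example. The sketch below is an illustration only – it is not how Bitcoin or any production blockchain is built – but it shows the tamper-evidence that hash-chaining provides: alter one historic sensor reading and every later block stops verifying.

```python
# Toy hash-chained ledger, illustrating the tamper-evidence property discussed
# above. Real blockchains add consensus, signatures and distribution on top.
import hashlib
import json


def block_hash(body: dict) -> str:
    # Hash the block's contents (which include the previous block's hash).
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()


def append_block(chain: list, data: dict) -> None:
    prev = chain[-1]["hash"] if chain else "0" * 64
    block = {"index": len(chain), "data": data, "prev_hash": prev}
    block["hash"] = block_hash({k: v for k, v in block.items() if k != "hash"})
    chain.append(block)


def verify(chain: list) -> bool:
    # Recompute every hash and check each block points at its predecessor.
    for i, block in enumerate(chain):
        body = {k: v for k, v in block.items() if k != "hash"}
        if block["hash"] != block_hash(body):
            return False
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False
    return True


chain: list = []
append_block(chain, {"sensor": "meter-17", "reading_kwh": 3.2})
append_block(chain, {"sensor": "meter-17", "reading_kwh": 2.9})
print(verify(chain))                    # True
chain[0]["data"]["reading_kwh"] = 99.0  # tamper with a historic record
print(verify(chain))                    # False
```

In an IoT setting, the appeal is that devices or gateways could append and verify records like these among themselves, without routing every micro-transaction through a central clearing house.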
### Behind the Expo interview with Colin Woods from Global Data Centre Partnership Andy Johnson interviews Global Data Centre Partnership's Account Director, Colin Woods at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Global Data Centre Partnership at: http://pstg.co.uk/global-data-centre-partnership/ ### On Board with the Data Revolution? The bold businesses at the crest of the wave of digital transformation – ready to surf the data economy with the agility of a Bondi brother as they shift from CAPEX to OPEX strategies – appear to be in the minority. Yet the energy is building and data management companies like NetApp have their boards waxed and ready to steer businesses through the waves.   As the world looks up to its contemporary giants, like Uber and Airbnb, so worldwide spending on digital transformation is set to rocket.  The question is, will your business be looking on in admiration, or will it be out there making waves with the pros? With NetApp’s data management ecosystem in place to maximise scalability and flexibility within the data fabric, any business can climb those frothy peaks. So what’s holding them back? Technological advances have awakened businesses to the service-based enterprise, entrenching fundamental shifts in business strategies that enable them to scale out their businesses to dramatic proportions. More traditional businesses can jump on board too, but despite the enticing fairy-tale of the modern start-up and the exponential rise in spend on digital transformation, set to hit $2.1 trillion in 2019, a recent Dell survey suggests many businesses are less enthused. While there will always be the innovators, leading the way with their bold new ‘tricks’, there will also be the willing adopters, quick to jump up on that ‘board’  themselves. Then there are the followers with only tentative plans to engage, the cautious evaluators that let others test the business waters – and of course, the laggards. According to Dell, 41% of the 4,000 large enterprises surveyed around the world fall into the ‘follower’ category and 24% were identified as evaluators. Meanwhile, 19% were classified as laggards. So much for a digital revolution, right? [easy-tweet tweet="86% of Britain’s biggest businesses have assessed their risk of digital disruption" hashtags="DigitalTransformation, IT"] Change, especially at speed, quite naturally evokes fear. Indeed 41% of Dell’s respondents said they were uncomfortable with the pace of the technological changes. But that’s not to say that businesses aren’t taking it seriously. Learning and development experts AVADO found that 86 percent of Britain’s biggest businesses have assessed their risk of digital disruption, with the majority putting a c-suite executive in charge of driving digital change.  Swimming against the tide or opting out is simply not an option for those businesses that want to compete. Tapping into the data economy means businesses can streamline their services, switch to the service-based enterprise and offer increasingly personalised services. The resulting shift in business models from CAPEX to OPEX, means channelling serious investment into digital transformation. For example, Walmart is set to spend $2 billion in order to overhaul its core capabilities and boost digital engagement. Similarly, General Electric is spending $1 billion every year in its quest for digital dominance. 
Meanwhile, start-ups are exploiting their agility by developing digitally native businesses from the get-go – asserting themselves as market disruptors. As the conversation around digital disruption rumbles on, inspiring the leaders and unnerving the evaluators, it is data that is keeping the economy afloat. Currents of data streaming through our everyday lives – connecting our personal desires and needs with the businesses that can meet them – are the future of retail, healthcare and manufacturing. Scalable, real-time data management keeps businesses in action and their customers contented. Data management companies like NetApp are becoming as ubiquitous as the tides of data they manage. With NetApp’s infrastructure and data management supporting businesses’ service-focused software, any business can command the surf and ride the crest of digital transformation. ### Behind the Expo interview with Spencer Lea from Axians Andy Johnson interviews Axians' Sales Director, Spencer Lea at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Axians at: http://www.axians.co.uk/ ### Behind the Expo interview with Nick Sacke from Comms365 Andy Johnson interviews Comms365's Head of IoT and Products, Nick Sacke at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Comms365 at: https://www.comms365.com/ ### Behind the Expo interview with Dennis O'Sullivan from Eaton Electrical Andy Johnson interviews Eaton Electrical's EMEA Data Center Segment Manager, Dennis O'Sullivan at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Eaton Electrical at: http://www.eaton.com/ ### Channel Tools and Compare the Cloud have announced a strategic partnership Channel Tools has announced a strategic partnership with Compare the Cloud to collaborate on video content, mobile apps and future technology services for the IT channel. London, UK, October 16th, 2017 – Technology-focused sales and marketing specialists Channel Tools have announced a strategic partnership with informative, social and forward-thinking Compare the Cloud to produce unique content and services for the needs of the business. With the assistance of Compare the Cloud’s co-brand, Disruptive, this collaboration will enable customers to see Channel Tools as a distinctive proposition through video content and social media review. Additional services will help Channel Tools reach a new dimension of its audience base and spread the message that it can deliver sales and marketing services to complement a company’s in-house marketing team. [easy-tweet tweet="Channel Tools and Compare the Cloud have announced a strategic partnership to produce unique content and services" hashtags="IoT, IT"] Combining Compare the Cloud’s social media reach with Channel Tools’ collective IT channel experience and technology toolkit, the partnership will create inventive content and the ability to reach a vast audience base.
About Channel Tools: At Channel Tools, we deliver core sales and marketing services that can be provided on or off-site to complement your go-to-market team, using the best technologies and specialists. With the addition of Mobile and IoT services, including our Mobile Channel Enablement platform, Integrit, and Social Hub Platform, our customers can now achieve greater visibility and smarter demand generation. See more at: https://www.channel-tools.co.uk/ @ChannelTools About Compare the Cloud: At Compare the Cloud we cover the most exciting industry events with video interviews, blogs, social media presence and live broadcasts, providing a refreshing voice. We are passionate and disruptive whilst being educational to everyone from the newest rising stars of cloud technology to the largest vendors in the world. In short, we are influencers, analysts and enablers you can use to create visibility and demand generation across the technology industry. See more at: https://www.comparethecloud.net/ @comparethecloud ### Behind the Expo interview with Anthony O'Mara from Malwarebytes Andy Johnson interviews Malwarebytes' EMEA Vice President, Anthony O'Mara at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Malwarebytes at: https://www.malwarebytes.com/ ### Cloud transformation strategies for information and communication technologies Most UK SMEs, corporates and enterprises use on-premise IT infrastructure that they manage themselves. Consider, for example, business telephony – in June 2016 Gamma Communications revealed that there were 17.6M PBX extensions in the UK and Cavell Group found that there were 2.4M hosted telephony seats – roughly a 7:1 ratio in favour of a policy to use premises-based, legacy technology. The vast majority of businesses are still using and managing IT infrastructure in the same way that they have always done. As the famous saying goes, ‘if you do what you’ve always done, you’ll get what you’ve always had’, and this applies to how you use and manage IT infrastructure. Change can be good, and when it comes to technology, often making a small change to the way things are done can have transformational benefits for the way you work. So here are some of the reasons why companies may change the way they use and manage IT infrastructure, and the benefits that this delivers. Unified Communications as a Service Although they work very well, perhaps the greatest disappointment about PBXs and their associated ISDN network services is that, despite the myriad of imaginative features and services they deliver, very few people use their business phone to do much more than make and receive calls. Today we just have DDI numbering, caller ID and call barring. Any attempts by the carriers to charge for services like call waiting, ring-back and user-to-user signalling failed miserably because customers didn’t see the value. What changed most of all with unified communications (UC) is that the Internet enabled people to communicate in different ways which are more convenient or cheaper than making phone calls. Instant messaging and presence are an example of a convenient, unintrusive and informal way of exchanging information without having to make a call or write an email. Audio and web conferencing are features that UC can deliver far more cheaply than expensive telephony conference bridges.
For businesses with an international footprint, this can be particularly beneficial. Although the PBX vendors have sought to turn their PBXs into UC products, what has caught them out is the dash to the cloud, enabled by a combination of the Internet, cheap data connectivity and virtualisation. The overarching attributes of any UC as a Service (UCaaS) is its availability and security, so provided that UCaaS is demonstrably as reliable and secure as a premise-based system, what conceivable benefit is there of buying, hosting and maintaining your system? This factor comes into play when it’s time to upgrade a PBX or buy a new one. Does it make sense to commit to purchasing a system which has an intrinsic shelf-life and associated running costs when you can use UCaaS that does the job just as well, is future-proof and saves you money? Probably not. UCaaS is revolutionising the way businesses communicate, whether that be internally in the same office, with colleagues overseas of with customers across the world. Cloud Computing At IP Expo held in Manchester in May 2017, one of the speakers asked their audience “If you had a great technology idea and decided to create a start-up tomorrow, how many of you would buy your server?” Not one person in the room raised their hand – clear evidence that we just don’t do it that way anymore. In Q1 2017, Canalys revealed that the global cloud computing market was worth a staggering £9bn, growing at 42% year-on-year. There’s no doubt that cloud computing is a facilitator. And companies are using it to improve their ability to innovate, migrate from legacy technology and manage peak capacity without having to buy additional resources that they will hardly ever use. [easy-tweet tweet="There are some different cloud options to choose from, all offering their own benefits. " hashtags="Cloud, IT"] There are some different cloud options to choose from, all offering their own benefits. The market is dominated by public cloud vendors such as Amazon Web Services – public because it’s a resource shared by all users. These vendors provide simple and fast access to compute resource, ranging from a single server with one CPU to several thousand with multiple CPUs, delivered within minutes and charged for on a pay-per-use basis. Private cloud, where compute resource is ring-fenced per customer, more closely resembles the UCaaS model, where a customer specifies a computing requirement and then commits to a 12-month or multi-year contract. Like a public cloud, each customer manages their own services, including the facility to move capacity between servers and to increase capacity to manage temporary peaks in demand. Since hybrid cloud combines the customer’s own server infrastructure with public cloud compute resource, this is ideal for anyone who has decided to dispense with their own infrastructure but who wants to do it in little steps. Rather than buying more premises-based servers, there has been a momentous shift towards cloud adoption, which is just as reliable, more flexible and cheaper. Computing has now become a utility available on tap rather than something that you need to own. And regardless of which option is adopted, cloud promises a range of different benefits to support and drive growth, while future-proofing operations. Outsourced Managed IT On 12 May 2017, many organisations, including the NHS, fell victim to a ransomware attack caused by a strain of malware known as WannaCry. 
Infected Windows computers displayed a message that files had been scrambled and that users would have to pay US$300 in Bitcoin to restore their documents. Operations and appointments that weekend were disrupted across the country and it took several days for things to get back to normal. This attack, although serious, could have been a lot worse. For those that continuously monitor their networks, this attack will have been flagged, enabling them to put preventative measures in place to limit the spread of the infection. Since many businesses don’t have the resources to dedicate towards 24/7 proactive monitoring of their systems, they often opt for outsourcing the management of their IT to an external company that is guaranteed to be on hand at all times to address any issues before they impact the business. For a managed IT service provider, taking calls at any time of day, having up-to-date technical knowledge and knowing what to do in the event of an incident is their bread and butter. In the case of the WannaCry ransomware attack, a managed IT specialist will have been particularly beneficial since they are often made aware of these issues in real-time and are therefore able to take systems offline as a precaution to limit the damage while the issue is taken care of. Monitoring, maintaining and managing IT equipment is something that many companies do themselves, but often to the detriment of what really matters – the business-critical applications and data. For an IT department to become a profit centre, rather than a cost of doing business, its staff need to be focused on the company’s information while a dedicated team looks after its equipment. This is a relatively simple change to make and offers the additional benefit of improving compliance. It often doesn’t take much to spark positive change within a business, particularly when these technologies are so readily available. Often the biggest challenge is taking the leap in the first place. ### IP Expo Interview with James Hulse from Dell EMC Disruptive interviews Dell EMC, Director Managed Service Provider Division & Growth Partners, James Hulse at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Dell EMC at: https://www.dellemc.com/ ### IP Expo Interview with Comedian & Identity Theft Speaker, Bennett Arron Disruptive interviews Comedian & Identity Theft Speaker, Bennett Arron at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Bennett Arron at: http://www.bennettarron.com/ ### IP Expo Interview with Rhiannon Dakin from Flynet Disruptive interviews Flynet, Marketing Manager, Rhiannon Dakin at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Flynet at: https://www.flynetviewer.com/ ### IP Expo Interview with Antony Bell from TrustMe Disruptive interviews TrustMe, Managing Director, Antony Bell at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. 
Visit TrustMe at: https://www.u-trustme.com/ ### IP Expo Interview with Ed Owen from Zerto Disruptive interviews Zerto, EMEA Cloud Manager, Ed Owen at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Zerto at: https://www.zerto.com/ ### IP Expo Interview with Bipin Patel from Open Minds High Availability Solutions Disruptive interviews Open Minds High Availability Solutions, Technical Director, Bipin Patel at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Open Minds High Availability Solutions at: http://www.openminds.co.uk/ ### Tech Data to Participate in Microsoft’s IoT in Action Series Tech Data IoT leader will be part of a panel at first event in Europe BARCELONA, SPAIN (October 20, 2017) – Tech Data (Nasdaq: TECD) today announced that it will attend Microsoft’s “IoT in Action” series as a platinum sponsor, beginning with the European event taking place on October 23 in Leipzig, Germany. The event will be the first in Microsoft’s series of ‘IoT in Action’ events, with close to 700 participants registered. In Leipzig, Tech Data’s Victor Paradell, vice president, IoT and Analytics Solutions, Europe, will participate in an expert panel discussion presenting a successful IoT solution with Azure that partners can replicate. Partners ranging fromindependent software vendors to system integrators will have the opportunity to meet with IoT solutions aggregators during an afternoon matchmaking session, as well as over evening dinner. [easy-tweet tweet="Tech Data will participate in an expert panel discussion presenting a successful IoT solution with Azure" hashtags="IoT, TechData"] “We are excited to participate in Microsoft’s IoT in Action series as a platinum sponsor. These events will provide partners with the access, knowledge and skills to develop their IoT capabilities and understand how they can develop them more quickly in their markets,” said Paradell. “As an IoT solutions aggregator, Tech Data provides the necessary expertise to foster more collaboration within the partner ecosystem and build up long-term relationships. We hope these sessions will inspire future developments.” Microsoft’s IoT in Action series invites partners from around the world to learn and collaborate with tech leaders looking to build world-class IoT solutions and accelerate their go-to-market time. Each event is tailored to amplify the value of a solutions aggregator to the IoT ecosystem. The other events in the series will take place in Boston, Tokyo, Seoul and San Francisco between October 2017 and February 2018. For more information on Tech Data’s IoT solutions in Europe, contact Iot@techdata.eu. ### Eaton refocuses channel strategy to provide enhanced benefits to partners and capitalise on strong revenue growth Slough, UK, 20th October 2017 – Power management company Eaton today announces it has made a number of key strategic enhancements to its Power Advantage Partner Programme to leverage market growth, better support its channel partners and data centre customers, and continue growing revenues in a competitive market. The move is part of an increasing focus and investment on Eaton’s delivery channel for its data centre strategy.  
The changes will give the Eaton’s extensive network of resellers even more benefits from their relationship with the company. A revamped online portal will provide access to a complete set of tools, including online training, sales collateral and a new deal registration facility. These resources, combined with business development support from Eaton, will enable partners to boost demand creation, deliver superior service to their end-user customers, and maximise their sales and profitability. A range of service and solution training courses, modules and tutorials will be available via the online portal. In addition to comprehensive enablement, partners will be able to take advantage of margin-builder programmes, such as deal registration and MDF (Marketing Development Funds), and demand-creation assets, including ready-to-use campaigns for fast execution of marketing initiatives. The Power Advantage Partner Programme will operate on a tiered basis, with partners given access to different features depending on their eligibility and present stage in the partnership lifecycle. Partners that successfully complete training will receive official accredited certification status, empowering them to enhance their knowledge, customer service skills and sales potential. Eaton will continually develop its training syllabus, giving resellers the opportunity to enhance their skills and the value they deliver to customers. The additional investment and support that Eaton is providing is a demonstration of its long-term commitment to its channel partners, said David Oddie, IT Channel Partner Manager EMEA at Eaton. [easy-tweet tweet="In the era of digital; transformation, organisations are more dependent than ever on their IT systems" hashtags="IT, Digital"] “In the era of digital; transformation, organisations are more dependent than ever on their IT systems and infrastructure and the reliability and availability of power is major concern. They know that it’s vital to protect themselves from power surges and outages while also keeping their costs under control and reducing their impact on the environment. As a leader in power management, Eaton is leading the way in delivering solutions that protect critical systems – and we depend on our partners to deliver our products and adapt them to suit the customer’s environment and needs. “This further investment is part of Eaton’s ongoing commitment to ensure partners get the very best support, enablement, guidance and, most importantly, quick and easy access to information and funding. In such a competitive environment, we know that investing in our partner network is the best way to deliver a great return for our customers, for partners, and our own business as well.” Steven Martin, business unit manager, at Tech Data UK, added: “Efficient, effective power management is absolutely vital for critical IT deployments today – from the largest data centre to the smallest cluster of servers. Eaton is a well-known and trusted brand, and a key strategic partner for us in the UK. 
Having access to the wealth of features and benefits that Eaton is providing in the updated Power Advantage Partner Programme, means we can offer additional business development support to Eaton partners and help them make the most of every opportunity to grow and develop their sales.” Kevin Matthews, Enterprise Sales Director at Exertis said, “Eaton is recognised as a leader in power management solutions with an expanding portfolio of products and solutions that optimise the performance of data centre infrastructure. Their channel partner programme reflects their desire to enable resellers to work with them to build solutions for their customers, taking advantage of additional expertise, resource and the latest products and services. Eaton’s commitment to the channel, through this programme, can only be mutually beneficial, driving further growth and profitability for all parties. Exertis expects to play a key role in recruiting partners to this exciting opportunity.” ### SteelEye joins EY’s fast growth platform   SteelEye’s cloud-based service offers firms a comprehensive solution and an ability to analyse and evaluate regulatory data to provide a wealth of insights to help optimise their business  EY’s platform provides a unique suite of services and support to help drive the development of fast-growing tech companies  London, UK, 20 October 2017: SteelEye, the compliance technology and data analytics firm, is pleased to announce that it has been invited to join EY’s Fast Growth Platform, which provides a unique suite of services and support to drive the development of fast-growing tech companies. This will help SteelEye expand its capabilities and enhance its capacity to assist its clients in meeting their regulatory obligations.  Launched in July 2017, EY’s Fast Growth Platform provides a three-tiered, tailored programme that includes start-up, scale up and global support. It offers emerging tech companies a range of services at each stage of their journey to help spur their expansion, including financial forecasting, growth planning, strategic support, social strategies, and comprehensive legal services. The industry-leading platform combines this support with access to EY’s global network of clients and professionals to help them maximise opportunities. In assessing companies for the Fast Growth Platform, EY considers a range of factors, including the boldness of the company’s strategic vision, strength of the management team and the distinctiveness of the offer or service that the business is bringing to the market.  EY draws on its extensive experience of working with fast-growth companies to inform its support for the platform as well as its development of other complementary key initiatives. These include the firm’s EY 7 Drivers of Growth methodology to help spur the expansion of growth businesses as well as its ‘Entrepreneur of the Year’ programme, which celebrates the people who are building and leading successful, growing and dynamic businesses. “The services offered by EY’s platform and the expertise and support we can draw on will be a strong growth enabler and will help us develop the best service we can for businesses looking for support to cope with the significant regulatory challenges that many face,” says SteelEye CEO, Matt Smith. 
“Next year a number of key directives will be implemented and our cloud-based service is designed to give companies a comprehensive solution and an ability to analyse and evaluate regulatory data to provide them with a wealth of insights to help optimise their business. Our motto is ‘don’t just comply, compete’. [easy-tweet tweet="SteelEye’s innovative cloud-based platform uniquely offers transaction reporting, record keeping..." hashtags="Cloud, IT"] MiFID II, GDPR and PSD2 are just three key regulatory directives that come into force next year. In particular, for MiFID II, SteelEye’s innovative cloud-based platform uniquely offers transaction reporting, record keeping, trade reconstruction, best execution and data analytics in a single solution. This will help firms to trade with greater efficiency, profitability and control. At the same time, the open API framework allows clients to harness the power of their data in whatever manner that best suits their business needs. “We are thrilled that SteelEye has joined our Fast Growth Platform”, says Richard Goold, EY’s Fast Growth Leader. “The company is one of the most dynamic and innovative businesses in the RegTech arena and is offering businesses a unique solution to manage their regulatory obligations. We look forward to helping the company better serve its clients and accelerate its growth both domestically and internationally.” RegTech is one of the fastest growing areas in the broader FinTech arena. According to specialist date intelligence firm Fintech Global, RegTech investments have more than tripled over the last five years. Investments in RegTech companies have grown by 38.5% (CAGR) in that period. Last year a record $678m was invested in 70 companies, compared to $185m in 32 companies in 2012. Q1 was a record quarter in terms of deals completed (21) hilst Q3 was a record in terms of volume ($305m). London takes the top spot for the city with the most RegTech deals with 39 investments between 2012 and 2016. Earlier this year, SteelEye partnered with the London Stock Exchange’s UnaVista Partner Programme to provide firms with an integrated, end-to-end technology-based solution that would help them meet their transaction reporting obligations. SteelEye has also established a presence in Switzerland, opening offices in Zurich and Geneva with plans to expand into Germany in the coming months. SteelEye’s cloud-based platform was formally launched in October 2017. To learn more and to book a demo, please visit: https://www.steel-eye.com/ ### IP Expo Interview with Tony Hollingsbee from Kingston Technology Disruptive interviews Kingston Technology, Business Manager SSD, Tony Hollingsbee at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Kingston Technology at: https://www.kingston.com/en ### Why CMOs Need to use Web Services to Stay Innovative This month, Amazon was named as the fastest growing web service on the planet. It’s not much of a surprise really. It’s set the benchmark for scaling your business without the rigmarole and expense of building a reliable infrastructure yourself. You get a lot of bang for your buck. Of course the model works because you can use it to incubate new ideas at very little cost to you, and change the capacity as your needs fluctuate. 
Brands that were just an idea on a paper serviette have become multi-nationals overnight thanks to the flexibility web services bring. JustEat and Zoopla being two brands that spring to mind. What’s interesting about those brands, aside from the fact they have completely disrupted a market, is that they are not traditional Silicon Valley darlings. JustEat is Danish, Zoopla is British. They debunk the myth that you must be rooted in California to succeed. The other thing they have in common is that they have done something that every brand should do, new or old: they’ve innovated. They have taken advantage of technology. And while technology may be at the heart of how they do things it’s not what they do. They are a take away service and an estate agent. In short, technology is fundamental to their existence but it’s not why they exist. There’s a lesson there for lots of business, and many a boardroom is well aware of it. However, what they struggle with is how to apply the lesson to their own business. How can they be agile, innovate and change a market when they are the status quo? It’s an urgent puzzle to solve - look at Toys ‘R’ Us. It was the golden child of retailing 25 years ago, now it is filing for bankruptcy. There are of course lots of reasons circulating as to why this happened, but one of the most interesting came from a retail analyst who said it has struggled to keep up with technology, not just buying habits. That sums up the day job for many CMOs, lots of ideas in the hopper yet limited resource to execute them. They are too busy getting the ‘must do’ campaigns, out of the door. No time to think about ones that will separate them from the competition, nor how they will embed a process that’s makes devising such campaigns business as usual. The pace of change in the industry, especially technological, also makes it very difficult to work out where to place your bets. Should you sell through Facebook? Is WhatsApp an advertising channel that should be exploited? What about augmented reality? The list goes on and that’s before you’ve even thought about how you will measure a channel’s performance. [easy-tweet tweet="CMOs must find the time, money and skills to move their marketing beyond the ‘me too’ marketing campaigns" hashtags="Marketing, CMO"] But given the intrinsic link between customer engagement and sales performance something has to change. CMOs must find the time, money and skills to move their marketing beyond the ‘me too’ marketing campaigns that all their competitors deliver, and instead trial new ideas and take to market extraordinary innovative ones more often. I think the answer lies in shaking up how we organise skill and use web services. I’ll explain. Organisational structures haven’t really changed. We still very much rely on a system of managers making decisions with a team of reports who make a plan happen. What has changed, is how much we expect people to do and how fast we expect them to do it. This isn’t a model for innovation, or certainly not a sustainable one, largely due to skill, time and money. Often the people involved aren’t the best equipped in terms of skill to deliver the job. For instance they might be brilliant at identifying who to target with an Instagram campaign, yet lack the technical know how as to how to design and run a campaign. And why should they if it’s not a core part of the business? They also lack the time to do the day job, let alone devise ways to exploit shiny new marketing techniques. 
And then there’s often the lack of funds to do it. Marketing teams have therefore looked outside of the company to help close this gap. It’s a model that has worked and great collaborations have materialised. That said, there is still only a finite pot of money to give agencies and, speaking to CMOs, ideas often fall short of the ambition. The ‘wow’ campaigns are either far and few between or bumped in favour of what they’ve always done. It seems that nine times out of ten there’s a trade off – invest in testing new ideas yet risk failure, or go with the familiar and have guaranteed results. There is however a half way house that could solve the problem. Enter the hybrid model. It might feel radical for HR departments but it’s one that works. Instead of binding the marketing department to a fixed structure, hybrid models allow you to have a central group of experts who run the day to day and a flexible set of dedicated contractors who take on the role of incubator. They are experts in their field with the knowledge and experience to deliver innovative project quickly and to a very high standard as and when its needed. In essence the company creates a gig culture employing people who aren’t tied to the desk full time but make themselves available for projects for a specific period of time. For instance they might work on a Christmas campaign during May, June and July knowing they can take the summer off to be with their family. It’s a brilliant way to plan ahead, incubate ideas with experts, knowing that the campaign will make it to market and the investment will deliver an ROI. Plus the spinning plates of the day-to-day don’t fall. When it becomes really powerful is when you couple this skill with the plethora of web services companies, especially those dedicated to customer management. These specialists are driven by innovation, taking new technology and exciting new content creation models to make the ordinary extraordinary. They make it their business to ensure the new works with the old, and they don’t require massive IT projects to do it. They can spot opportunities and convert them quicker than you can say business transformation. What’s interesting is that lots of companies are already doing this out of necessity rather than because a strategic decision has been taken to run operations this way. If they actually formalised the approach, and made the change permanent, they could very quickly find they have a powerful strategy for delivering campaigns that get the customer and the competition talking. ### IP Expo Interview with Kelly Murphy from HyperGrid Disruptive interviews HyperGrid, Founder and CTO, Kelly Murphy at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit HyperGrid at: https://hypergrid.com/ ### IP Expo Interview with Jamie Graves from Zonefox Disruptive interviews ZoneFox, CEO and founder, Jamie Graves at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. 
Visit ZoneFox at: https://zonefox.com/ ### Nokia announces strategic collaboration with Amazon Web Services to enable easier transition to the cloud Nokia will support service providers in their migration to AWS with a complete suite of services Nokia and AWS to collaborate on 5G use cases for service providers Nokia to bring disruptive IoT applications for large enterprise customers using AWS services Nokia and AWS to bring Nuage Networks SD-WAN services to AWS enterprise customers Espoo, Finland – Nokia today announced a strategic collaboration with Amazon Web Services, Inc. (AWS) that will accelerate the migration of service provider applications to the cloud and drive digital innovation for large enterprise customers. On their journey to achieve the performance and flexibility brought by cloud technologies, Communication Service Providers (CSPs) and Enterprises need solutions to address the connectivity needs of cloud-based applications such as optimizing latency, virtualized network and routing services, and solutions for the Internet of Things (IoT).  Nokia and AWS are working together to bring a unique and powerful set of solutions that will enable service providers to implement cloud strategies faster leveraging Nokia’s expertise in wireless, wireline and 5G technologies. Large enterprises require fully managed connectivity to access cloud infrastructure, and fully integrated IoT and analytics solutions to enhance their productivity and ease of digitalization. The two companies will work to deliver differentiated solutions using NokiaSD-WAN and its IMPACT IoT platform in combination with AWS Greengrass, machine learning and artificial intelligence services. [easy-tweet tweet="Nokia and AWS will work together to generate new 5G and Edge Cloud strategies " hashtags="Cloud, Nokia, AWS"] The scope of the agreement announced today covers four areas of collaboration: Nokia will support service providers in their AWS implementation strategy with a complete suite of services including consulting, design, integration, migration and operation for infrastructure and applications. Nokia and AWS will work together to generate new 5G and Edge Cloud strategies and guidance for customers including reference architectures that enable both service providers and enterprises to benefit. Nokia and AWS are working to bring an improved user experience for Nuage Networks SD-WAN customers who use AWS. Enterprises can benefit from this seamless integration with AWS and launch secure branch connectivity in hybrid environments with “Single Pane of Glass” capabilities. Finally, the companies are commercializing IoT use cases with AWS Greengrass, Amazon Machine Learning,Nokia Multi-access Edge Computing (MEC) and Nokia IMPACT platform. Kathrin Buvac, Nokia’s Chief Strategy Officer, said: “The 4th Industrial Revolution requires a tighter integration between the IT and networking infrastructure worlds. Our collaboration with AWS will accelerate the migration of service provider applications to the cloud and enable us to forge new opportunities together by delivering on next-generation connectivity and cloud services. 
This is a wide-ranging collaboration, spanning our services capabilities in application migration, SD-WAN from Nuage Networks, 5G, and IoT, allowing new growth opportunities for our top customers across both the service provider and large enterprise market segments.” Terry Wise, Global Vice President of Channels and Alliances, Amazon Web Services, Inc., said: “Service providers are accelerating their migration to AWS in order to drive innovation for their customers and deliver lower total cost of IT to their organizations. We are excited to partner with Nokia to accelerate cloud transformation for service providers, and enable the digital transformation journey for our mutual large enterprise customers.” ### IP Expo Interview with Neil Christie from iomart Disruptive interviews iomart's Commercial Director, Neil Christie at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit iomIart at: https://www.iomart.com/ ### IP Expo Interview with Vicki Sammons from Dell EMC Disruptive interviews, Dell EMC's, Business Development Manager, Vicki Sammons at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Dell EMC at: https://www.dellemc.com/ ### DCs should embrace the power of water to increase cooling efficiencies, says the Green Grid DC facilities looking to tighten their belts should explore liquid cooling With power consumption at an all-time high, organisations need to move past their ‘data centre hydrophobia’ and embrace innovations in liquid cooling to usher in greater efficiencies within the DC environment. According to the Green Grid, doing so can reduce energy usage, improve IT equipment (ITE) performance, and decrease total cost of ownership (TCO). Today there are 4.9 billion connected devices globally, according to Gartner, and Cisco has forecast that this figure will increase to50 billion by 2020. This rising connectivity creates more data that needs to be stored, meaning that one of the greatest future challenges within the data centre industry will be improving energy efficiency and reducing operating costs. As data centre facilities can do little to influence the rising rate of data consumption, Roel Castelein, EMEA Marketing Chair at the Green Grid, suggests organisations should embrace liquid cooling to improve overall efficiency. He explains: “There’s no escaping it, data volumes are growing and doing so at an exponential rate. While data centres have successfully adapted to support growing data exchanges - by offering scalable power and cooling systems to allow servers to consistently perform and prevent downtime - efficiency remains a key choke point. Fortunately, many new liquid cooling technical developments and products have entered the market, with the potential to offer facilities significant efficiency gains. “These new developments transcend the power and performance limitations inherent in air-cooled IT hardware, while reducing IT and cooling energy. In practice, they range from simply retrofitting existing major server brands with liquid-cooled heat sinks to implementing complete rack-based systems with integrated coolant distribution units (CDUs). 
These systems can interface with existing facility system cooling loops, or they can be independent and self-supporting by means of supplemental external heat rejection systems, such as standard fluid coolers. [easy-tweet tweet="There are many reasons for both IT and facilities to explore leveraging the benefits of liquid cooling" hashtags="IT, Data"] “There are many reasons for both IT and facilities to explore leveraging the benefits of liquid cooling, not least improved IT performance and a reduction in IT and facility energy usage and space, which results in the overall reduction of total cost of ownership (TCO). And, with liquid cooling becoming more commercially viable, it won’t be long until liquid cooling is one of the most cost-effective options to cool ITE in data centres,” Roel concluded. Download the Green Grid’s latest white paper on liquid cooling technology here. ### European businesses are advancing their digital transformation Almost half of companies polled have implemented formal digital transformation strategies to reduce operational costs and improve customer experience LONDON, UK – 19th October 2017 – CenturyLink, Inc. (NYSE: CTL) today announced the results of a commissioned study on the digital transformation of European enterprises. Conducted by 451 Research, the study provides insight into how, why and at what pace organisations are preparing for their digital future. The survey – which polled more than 400 executives across Europe – found that more than half (52 percent) of companies are increasing IT spending levels within the next year to achieve the significant benefits of digital transformation. In addition, the majority of organisations are making progress with planning and implementing their digital transformation projects – almost half (47 percent) have a formal strategy and are actively digitising business processes, and 25 percent have started on siloed projects without an overarching strategy. “In a global business economy dominated by disruption, innovation and cyber threats, speed to market is of the essence,” said Richard Warley, CenturyLink managing director, EMEA. “While it’s encouraging to see the majority of enterprises in Europe increasing their investment in digital transformation, the associated complexities can result in numerous roadblocks. Moving to a superior network and employing agile cloud technologies are foundational elements to simplifying the transformation process.” “Digital transformation programs in Europe are now being viewed as strategic, long-term initiatives and typically receive the support of top-level executives,” said Nick Patience, founder & research vice president, software at 451 Research. “A critical aspect of these transformation projects is entrusting some of the fundamentals to external service providers. More than 90 percent of companies surveyed are using or expect to use a third-party transformation partner. This enables business leaders to refocus internal resources on developing new services and applications to support innovative business initiatives.” [easy-tweet tweet="Nearly 40% of respondents expect digital transformation to cause ‘major disruption’ over the next three years" hashtags="DigitalTransformaiton, IT"] The study also found nearly 40 percent of respondents expect digital transformation to cause ‘major disruption’ over the next three years. 
The top factors driving digital transformation are reducing operational costs (51 percent), improving customer experience (41 percent), and creating new services or revenue streams (37 percent).  The Pace of Transformation Despite more companies focusing on digital transformation, they acknowledge it will take significant time to successfully complete these projects. The majority (55 percent) of respondents estimate it will take three to six years to achieve company-wide digital transformation, largely due to how long it takes to overcome business and IT complexities. Barriers to a Successful Transformation The primary barriers respondents identified are inflexible IT systems and lack of operational agility (36 percent), overcoming organisational silos/outdated work practices (35 percent), and inability to migrate legacy IT and business applications to the cloud (33 percent). The Growing Importance of Cloud The study confirmed the increased importance of cloud, with 57 percent of respondents rating cloud services as a ‘very important’ enabler for digital transformation and 29 percent rating it as ‘important’. Choosing the Right Partner In addition to long-term commitment, choosing the right partner is critical to achieving success. Forty-three percent of companies are using or expect to need an IT services (including communications services) provider in support of their digital transformation programs, and 38 percent use or expect to use a cloud service provider, the study found. To Download the Study Commissioned by CenturyLink, the study was conducted by 451 Research in the first quarter of 2017, including interviews with more than 400 executives at leading companies in the financial services industry, retail sectors and public sector agencies across the UK, Austria, Germany and Switzerland. In all, more than 1,400 executives were polled from global enterprises of various sizes in North America, Europe and Asia Pacific. The study can be downloaded as part of the “In Pursuit of Digital Transformation” report. ### IP Expo Interview with Zoe Magee from Channel Tools Disruptive interviews Channel Tools, CEO of Channel Mobile, Zoe Magee at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Channel Tools at: https://www.channel-tools.co.uk/ ### Public, Private or Hybrid: Which Solution to Pursue In the puzzling world of cloud it’s sometimes hard to determine which solution or mix of solutions is best for you and your business, given the overwhelming amount of information available online. The chances are that you’ve explored both public and private cloud options and maybe even discovered the hybrid cloud, a blend of the two cloud environments. The solution you settle on depends on which features you find most important and how much you’re willing to invest. The cost-effective public cloud is easy to manage and offers increased scalability, a private cloud provides greater control and heightened security in an age where data breaches seem to be a weekly occurrence. While hybrid cloud brings the best of both worlds, merging public and private cloud for lower total cost of ownership (TCO), with enhanced security, scalability and management features. What’s best, Public, Private or Hybrid? The most important questions to ask are: Do you need full control over your data processing flow and privacy? Do you have a need for specific regulatory or compliancy? 
Do you employ dedicated network engineers? Do you have the resources to invest in server infrastructure and datacentre management? If you answered yes to all of four questions, then a fully private cloud solution is your best bet. Especially for mid-size or large companies and enterprises with the resources to invest in server infrastructure and datacentre management, as well as dedicated IT personnel to manage it all. There are initial costs during set up, but it works well for businesses with highly sensitive information, or executives who wish to retain full control over an organisation’s data. Alternatively, if you answered no, a fully public cloud offers you an affordable solution that scales up or down as needed, leaving room for growth for your small or mid-sized company. With no server or infrastructure investment, it cuts initial deployment costs, as well as those associated with software licensing fees and dedicated IT personnel. This can always be a starting platform and then as your business grows you can look to upgrade or integrate a different solution. Or, are you somewhere in between? Then a hybrid cloud is the best option. It can provide a low-cost transition strategy for more cloud-based operations at a later stage, or act as a long-term solution that guarantees sensitive data stays on your premises, while separately creating a resilient, scalable public solution. It also allows you to benefit from a lower TCO and quicker results, without compromising sensitive data availability or compliance needs. Designing Your Hybrid Cloud If you choose hybrid cloud, it opens up another more difficult question: Which applications should be hosted publicly vs. privately? Most applications can be evaluated based on the same four questions you used to evaluate which cloud solution suits your needs. But you will also want to consider your availability needs. [easy-tweet tweet="Many public cloud providers offer a SLA that will guarantee certain levels of uptime" hashtags="Cloud, Data"] Many public cloud providers offer a service-level agreement (SLA) that will guarantee certain levels of uptime. Private, self-managed cloud may not be able to do the same. If downtime for a particular application would be problematic, a public cloud option may be the better fit. This does leave you open to certain risks if your provider’s services go down however. You will be completely reliant on them to fix the problem, while a private cloud’s downtime will be your internal responsibility. Be sure to consider whether you have the resources to address downtime issues if you opt for the private option for crucial applications. Also examine each application you’d like to host in the cloud and consider the needs for each. If you need a lot of control over the data the application contains, particularly for regulatory or compliance reasons, a private cloud may work well. If you think a particular application is going to need to grow (or shrink) in the coming weeks, months or years, the scalability of the public cloud would be a great fit. Finally, be sure you understand your organisation’s security profile and each application’s security needs. If you may be vulnerable to attacks, whether on your own merits or based on your access to partners and customers, be sure to understand how the data you will be storing in the cloud will be secured. Are you Ready for the Cloud? 
If your network isn’t able to handle the increased bandwidth that moving to the cloud will require, then you will create unnecessary frustration with users. Before moving any applications to the cloud, public or private, think about total users, remote workers and plans for future growth, and understand how that will impact your network’s ability to integrate and interact with the cloud. Hybrid cloud will make it easy to test the waters without fully migrating over, which makes it just as important to find a cloud vendor that allows you to move freely between private, public and hybrid deployment models rather than having to commit to one model long-term. It’s essential to pick a vendor that will help you tailor your solution to your business’ specific requirements today, and be sure it’s flexible enough to grow or change with your organisation to keep you covered in the future.   ### IP Expo Interview with James Morgan from Juniper Disruptive interviews Juniper's VP UK, Ireland & Global Key Accounts, James Morgan at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Juniper at: https://www.juniper.net/uk/en/ ### Interoute adds Cloud Fabric to its Edge SD-WAN service Interoute, the global cloud and network provider, has launched Interoute Edge SD-WAN, a Software-Defined Wide Area Network service meshed into its global Cloud Fabric. This latest evolution of the Interoute Edge product family reduces costs and improves cloud application performance by intelligently managing connections to applications. Utilising the company’s Cloud Fabric, Interoute Edge SD-WAN blends public and private access networks into one dynamic application network. It ensures application traffic is directed over the fastest routes without impacting availability or needing costly, unused backup circuits. The Cloud Fabric mesh is constantly adapting and re-routing itself, to intelligently optimise application data flows between local offices and the corporate private cloud environment and accelerate applications delivered from the public cloud. “Interoute Edge SD-WAN is bringing new levels of optimised application performance in the cloud,” explained Mark Lewis, Executive Vice President of Products and Development at Interoute. “As more applications move to the cloud, enterprise users can be forced to take an indirect route across the global internet to access those apps, resulting in increased latency and poor performance. Interoute Edge SD-WAN combined with Interoute’s Cloud Fabric Software Defined core ensures that traffic takes the fastest and most direct route, optimising throughput and ultimately cloud application performance. It provides access to public and private computing combined with accelerated access to other applications elsewhere in the cloud.” Interoute Edge SD-WAN is delivered using Interoute’s global Cloud Fabric, which binds together 17 VDC (Virtual Data Centre) Cloud Zones, co-location facilities, Points of Presence (PoP) and third-party cloud providers with an ultra-low latency private network backbone. Sensitive information and critical traffic is delivered using this global private network and the application performance aware routing. Integrated network acceleration uses de-duplication and compression to reduce bandwidth requirements and deliver non-business critical traffic using a secure tunnel over the public internet. 
Low priority public internet activity is also passed through Interoute SD-WAN Edge Connect to ensure it is secure. ### Zen to sponsor AWS AWSome days in Leeds and Nottingham Zen will be sponsoring and attending this years’ AWS AWSome days in Leeds and Nottingham. As a certified AWS Advanced Consulting Partner, and home to a number of AWS qualified experts, Zen will be available to discuss everything relating to AWS with attendees. In particular, Zen will discuss cost optimisation, right-sizing business’s cloud environments, ensuring security and obtaining funding for proof of concept. The Leeds event will take place at the Royal Armouries Museum on 2nd November, while the Nottingham event takes place five days later on the 7th at the Nottingham Conference Centre. For those new to AWS, these free, one-day training events are a great place to start. Training is delivered by AWS technical instructors and covers everything a business needs to gain a deeper understanding of AWS. The days will begin with sessions exploring AWS foundations and infrastructure, before breaking into separate Technical and Business Tracks. The Technical Track covers four broad areas: ‘Security, Identity and Access Management,’ ‘AWS Infrastructure,’ ‘AWS Databases,’ and ‘AWS Elasticity and Management Tools.’ The Business Track on the other hand is more suitable for business leaders and covers the topics, ‘Best Practices for Getting Started with AWS,’ ‘Cost Optimisation on AWS,’ and ‘Security & Compliance in the AWS Cloud.’ As well as comprehensive learning sessions, the events will also provide the opportunity to network with both the AWS team and industry peers, and have questions answered by AWS experts. For more information, visit https://aws.amazon.com/events/awsome-day/ukir-roadshow-2017/ ### IP Expo Interview with Bob Adams from Mimecast Limited Disruptive interviews Mimecast Limited, Product Marketing Manager, Bob Adams at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Mimecast Limited at: http://www.mimecast.co.uk/ ### IP Expo Interview with Sean Herbert from Baramundi software Disruptive interviews Baramundi Software, Country Manager UK, Sean Herbert at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive has partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Baramundi Software at: https://www.baramundi.de ### IP Expo Interview with Ian Daly from Plan B Disaster Recovery Ltd Disruptive interviews Plan B Disaster Recovery Ltd, Director Ian Daly at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Plan B Disaster Recovery Ltd at: https://www.planb.co.uk/ ### Delivering on the Promise of Smarter Logistics The global financial impact from cargo loss as products journey through the supply chain was estimated to be $60 billion in 2016. This astonishingly large figure demonstrates the urgent need for an overhaul of our outdated logistics systems. Fortunately the world of logistics is on the cusp of a radical transformation - fuelled by the powerful combination of mobile computing, analytics, and the cloud. 
But although this opens up fantastic opportunities, many global organizations are struggling to change fast enough to remain competitive in such a rapidly evolving market. Against this fast moving backdrop, it is easy for poor decisions to be made that will impact an organisation’s bottom line considerably if left unchecked. This is a huge global issue because, according to Accenture, an astonishing 8-10% of a developed nation’s GDP spend is on logistics. Faced with these enormous challenges, some first mover businesses are starting to embrace digital transformation and are moving their logistics processes and systems to the cloud for a more flexible and agile approach to logistics management that will also reduce costs. [easy-tweet tweet="Using a cloud-based model, it is far easier to take advantage of emerging trends such as AI/machine learning" hashtags="AI, MachineLearning"] AI and IoT driving change Using a cloud-based model, it is far easier to take advantage of emerging trends such as AI/machine learning and IoT, which offer the prospect of smarter, more efficient logistics operations. Take anticipatory logistics for example, which uses machine-learning algorithms to log previous customer activity and information, allowing suppliers to anticipate customers’ needs and move products or resources closer to their location in preparation for an order. The result is a win-win: the customer gets their order faster and is very happy with the service, while the supplier benefits from greater customer loyalty and a more streamlined process. Another thing that has been complicated and incredibly expensive to achieve in the past concerns the ability to track the entire logistics chain in real-time. This is changing now that organisations can use cloud-based platforms to manage live tracking and alerts from IoT and mobile GPS devices. By bringing all of these data points into a single platform in real time, combining these with powerful web-based analytics capabilities, and making this information accessible via a smart web-based dashboard, organisations gain valuable insights every step of the way. Examples include: improvements in capacity planning, dynamic routing, predicting shipment delays, bottleneck alerts, and temperature tracking for fresh foodstuffs. Another branch of AI  - cognitive computing – is also making in roads in logistics operations. Using a cognitive approach - where the logistics system provides valuable information extracted from a mix of structured/unstructured data, and the employee then decides on the best course of action – can lead to even more informed decision making and more optimal outcomes. The Ascent of Autonomous Agents Of course a discussion on IoT and logistics would not be complete without a mention of emerging autonomous agents such as drones and self-driving trucks. Self-driving trucks are due to be trialled in the UK from 2018 as part of a £8.1million project to reduce business costs (and also cut pollution), while drone delivery services are currently being tested by a number of large retailers and logistics companies in a bid to solve the problem of "last mile" deliveries.  Clearly this will usher in the need for a raft of new processes, as well as challenges around how best to manage the ever-growing volumes of IoT data in such a rapidly changing environment. For any organisation considering autonomous agents, a Software as a Service (SaaS) logistics platform removes the burden of needing to future-proof technology investments. 
Increasing Your Delivery Happiness Score Underpinning all of these technological advances is the focus on delivering highly personalised experiences that put customers at the heart of the process. A recent study found that over 86% of customers rate the quality of delivery they receive as part of their online shopping experience, making it crucial that companies offer the best customer experience possible. This means turning on a dime to respond faster to customer orders and personalising the experience to best meet each customer’s specific needs. One organisation that has already overhauled its business operations to facilitate more informed, faster decision-making is DHL eCommerce. A division of the world’s leading logistics company, Deutsche Post DHL Group (DPDHL), it provides e-commerce related logistics such as fulfilment, cross-border delivery and domestic delivery in various markets. DHL eCommerce is committed to staying one step ahead of the competition by offering the best-in-class and most customer-centric technology solutions. Finding a technology platform that could help orchestrate first and last mile delivery and give more granular insight into these crucial stages was imperative for the firm. Charles Brewer, CEO DHL eCommerce explains: “With e-commerce growing at such a rapid pace, we see a fantastic opportunity for high-quality solutions that will offer a great customer experience and more choice, convenience and control for online shoppers. We set about finding a platform that would be scalable, future-oriented and flexible to allow us to optimise our delivery operational efficiency and FarEye fits the bill.”  FarEye’s SaaS platform runs on a Business Process Management engine and provides DHL eCommerce with smart analytics and a real-time view of the logistics trail, meaning that it can keep the customer informed at every step, and make parcel shops more efficient. DHL eCommerce is also taking advantage of tools including efficient route optimisation, live tracking, and predictive delay calculation, while the mobile application ensures delivery agents can pick and deliver seamlessly with real-time updates. According to Brewer, this means that: “We can deliver ‘delight’ by having complete visibility of the logistics movement and keeping customer informed at every step, in real time”. In summary, we are living in an exciting era where the pace of change in logistics across all industries is enormous. The requirement to rethink and redesign logistics processes has never been stronger. With a cloud-based approach, organisations can deliver on the promise of smarter logistics, boost their bottom line, and deliver an outstanding service to customers too. ### IP Expo Interview with James Romer from SecureAuth Disruptive interviews SecureAuth, EMEA Chief Security Architect, James Romer at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Secureauth at: https://www.secureauth.com/ ### IP Expo Interview with Peter Barnes from Dell EMC Disruptive interviews Dell EMC, VP Infrastructure Solutions Group, Peter Barnes at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. 
Visit Dell EMC at: https://www.dellemc.com/ ### IP Expo Interview with Steve Denby from Node4 Limited Disruptive interviews Node4 Limited Head of Solution Sales, Steve Denby at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Node4 at: https://www.node4.co.uk/ ### Calligo Acquires Canadian-based Cloud Services Business Calligo, a leading global cloud solution provider, is excited to announce that it has purchased Canadian cloud services provider 3 Peaks Inc. The acquisition of the Burlington Ontario and Vancouver BC based business, which has clients right across Canada and the USA, is an important step in Calligo’s ongoing growth strategy which includes expansion into North America. All existing 3 Peaks clients will continue to receive the high levels of service they have become accustomed to along with access to the extended capabilities and data privacy enabled services that Calligo offers as a global cloud provider. “We’re delighted to have acquired 3 Peaks as part of our continuing growth strategy,” said Julian Box, Calligo’s Chief Executive Officer. “We now have a firm focus on North America and this acquisition will allow us to provide our unique GDPR-enabled infrastructure and services to clients based there.”   “We know that 3 Peaks’ excellent reputation for customer-focused service delivery will make this a great acquisition, giving us a superb team, respected new clients while expanding our capabilities into this exciting market.”  [easy-tweet tweet="Calligo provides trusted, privacy-focused and GDPR-enabled cloud solutions" hashtags="Cloud,GDPR"] Founded in 2011, Calligo provides trusted, privacy-focused and GDPR-enabled cloud solutions to businesses across the globe. Calligo’s emphasis on data privacy and residency enables clients to leverage the advantages of combining advanced and innovative cloud technologies, unrivalled expertise and a commitment to the highest level of standards based compliance and security.  The business services hundreds of clients worldwide from its locations in the United Kingdom, Jersey, Guernsey, Switzerland, Singapore, Bermuda and Luxembourg. Ian Clark, CEO, 3 Peaks, said: “We are very excited to have concluded this agreement with such a respected and innovative business as Calligo. Calligo’s deep commitment and understanding of data privacy means our clients will benefit from their expertise and unique service offerings, while continuing to enjoy the highest standards of local support.”   Box added: “With the backing of our investor Investcorp Technology Partners, we are actively looking to execute further strategic add-on acquisitions over the coming months as we continue to expand our global footprint.”  ### IP Expo Interview with Marcus Langford from IntegrationWorks Disruptive interviews IntegrationWorks' Delivery Manager (UK), Marcus Langford at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. 
Visit IntegrationWorks at: http://www.integrationworks.co.nz/ ### Internet of Things And Data: A Powerful Connection Neo4j’s Emil Eifrem looks at why graph databases have enormous application to IoT. The graph is now a very familiar idea for capturing relationships and powering social networks, with many of the social web giants such as Google, Facebook and LinkedIn having pioneered its application. However, there are also very promising applications of graph databases that are not so obvious. It’s the mapping of networks of things – more specifically, sensors, devices and machine-to-machine relationships – that are the basic fuel of the complex IoT (Internet of Things) systems we want to build. Smart homes will need to run a large number of sensors, networks, devices, cameras, power grids and smart water and thermostats to work, for example. But they will only really be effective if they are linked up together as a connected Internet (network) of many things (devices). [easy-tweet tweet="An IoT structure will have to underpin our future smart homes and cities." hashtags="IoT, AI"] In other words, an Internet of Things (IoT) structure will have to underpin our future smart homes and cities. When a new item of equipment or sensor comes online, it will automatically want to seek out local controllers or other devices that it needs to listen to or share data with, while the powering up or down of just one individual sensor may create or end dozens of connections – maybe hundreds. Eventually, thousands. Abstract that out, and you’re working with a complex data structure of many nodes. Understanding connections is the key to understanding dependencies and uncovering cascading impacts – and graph technology may be the only feasible way of capturing all that density and inter-connectedness. Graphs and the smart home But isn’t it possible to manage this volume of connections with existing database technology? While it’s true that some smart home IoT problems could be handled by a relational database, they’re not an especially satisfactory fit. That’s because relational databases represent data as tables, not networks, and IoT queries strain a data structure not set up to map connections. That’s why analysts think that this level of device IoT functionality can best be implemented in a graph database, as graph databases process complex, multi-dimensional networks of connections at speed, especially if they are native, thoroughgoing graphs. Alex Woodie, Managing Editor of Datanami, says, “One of those key enabling technologies [of IoT] is graph databases.” Tony Baer at Ovum: “Graph technology will allow the Internet of Things to be represented transparently… without the need to force fit into arbitrary relational models." And Matt Aslett, Research Director at 451 Research, comments: “Graph databases offer the potential to store and analyze data from the IoT”. Another independent commentator agrees: analyst Quocirca says that “The prime benefit of a graph database is that it operates on a pattern-matching basis, [disregarding] everything that is not related to what it needs and focusing only on what is relevant. Graph databases are therefore ideal for dealing with the real-time nature of IoT.” Nordic and Baltic telco leader Telia, for example, has embraced graph technology, with its new digital ecosystem and platform for broadband connections, Telia Zone. It can potentially serve 1.2 million users, with graph-powered smart home management a major feature.
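Before looking at where Telia wants to take the platform, it is worth making the graph model concrete. The sketch below shows how the device-to-controller relationships and cascading-impact queries described above might look using the official Neo4j Python driver; the connection details, node labels and relationship types are invented for illustration and are not Telia's actual schema.

```python
from neo4j import GraphDatabase

# Placeholder connection details - illustration only.
driver = GraphDatabase.driver("bolt://localhost:7687", auth=("neo4j", "password"))

def register_sensor(tx, sensor_id, controller_id):
    # A newly powered-up sensor is linked to the local controller it reports to.
    tx.run(
        "MERGE (s:Sensor {id: $sensor_id}) "
        "MERGE (c:Controller {id: $controller_id}) "
        "MERGE (s)-[:REPORTS_TO]->(c)",
        sensor_id=sensor_id, controller_id=controller_id,
    )

def impacted_devices(tx, controller_id):
    # Cascading-impact query: everything that depends, directly or indirectly, on a controller.
    result = tx.run(
        "MATCH (d)-[:REPORTS_TO*1..3]->(c:Controller {id: $controller_id}) "
        "RETURN DISTINCT d.id AS device",
        controller_id=controller_id,
    )
    return [record["device"] for record in result]

with driver.session() as session:
    session.write_transaction(register_sensor, "thermostat-42", "hub-1")
    print(session.read_transaction(impacted_devices, "hub-1"))
```

The same traversal that answers "what breaks if this hub goes offline?" would be an expensive multi-way join in a relational schema, which is the point the analysts quoted above are making.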
The vision is to be part of the big eco-systems alongside Apple, Google, and Amazon, giving customers a slew of fun and useful broadband-delivered consumer digital services. Telia Zone is the basis for a future smart home system, starting with VOD and home entertainment, and is expected to have 13 million devices as individual nodes, with 20-30,000 events per second. It’s, therefore, a sizeable IoT network, with the Neo4j graph database there to help the Telia team create new connections on the fly and make new APIs out of any that may become desirable. Also, the firm will soon add in AI (artificial intelligence) and Machine Learning, and it says that graph technology is the best way to handle that. Graph analytics is at the heart of how Telia Zone understands data, in effect – using the graph with the router as the node, apps as connections, and other locations or devices where a subscriber connects to Telia Zone giving a wider context. Predictive analytics can also look at when two known methods are in the same zone or when a subscriber is likely to arrive home. With the backing of the analysts and the example of pioneers like Telia showing the way, it’s hard to disagree that it’s graph-based IoT management that promises to be the shortest way to get us to the smart domestic future we all want. ### Executive Education Episode 2: The 4th Industrial Revolution and how it will affect Leadership Disruptive Live presents the second episode of Executive Education in association with Regent Leadership! This week’s episode will focus on the 4th Industrial Revolution and how it will affect Leadership This week sees the panel hosts Colin Pinks, Director from Regent Leadership and Kylie Hazeldine, Business Development from Regent Leadership with their panel Peter Holliday, Director of Strategic Development of Regent Group and Graham Bird, Fellow of Oxford Leadership. ### IP Expo Interview with James Lyne from Sophos Disruptive interviews Sophos' Global Security Advisor, James Lyne at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Sophos at: https://www.sophos.com ### HP Scoops Top Prize to be Named Best Overall Partner at Misco Star Awards HP was the big winner as it beat off fierce competition from Dell EMC, Hewlett Packard Enterprise, iiyama and Lenovo to be crowned Best Overall Partner at this year’s Misco Star Awards.  Exertis, Westcoast, Cybercrowd, Hewlett Packard Enterprise and Dell EMC were also among those recognised for their efforts in the channel and honoured at the awards ceremony hosted at the Chelsea Harbour Hotel. Other notable winners included Lexmark, Logitech, APC, iiyama and Microsoft Surface. With over 37 brands nominated, competition was fierce across the ten partner categories. Alongside these, two leading account managers were recognised for their efforts during the past year in Partner Account Management and Distributor Account Management, with Keith Ward from Kensington and Dan Horton from Exertis winning in their respective categories. With HP scooping the big prize, Dell EMC was named Best Server & Storage Partner, whilst Cybercrowd took the award for Best Services Partner and Hewlett Packard Enterprise got the nod as Best Solutions Partner. In the strongly contested category of Most Outstanding Distributor, the honours were shared between Exertis and Westcoast. 
Lexmark came out on top to win the award for Print and Scan, with iiyama landing the gong for Interactive and AV Solutions. Peripherals went to Logitech, as APC took the award for UPS and Power Surge. Finally, Microsoft’s work around evolving form factors paid off as it took the award for Client Devices with its Surface product. The awards followed the annual Misco Expo which saw hundreds of delegates attending a one-day event at Battersea Evolution in London to network with leading manufacturers and to hear from some of the IT industry’s most recognised figures – such as Graham Cluley – as they addressed key concerns when it comes to cyber security, the General Data Protection Regulation (GDPR) and IT solutions for the public sector. Comments Alan Cantwell, CEO, Misco Group: “The Misco Star Awards recognise the best of the best from within our partner ecosystem in the UK and are our way of saying thank-you to those who make a difference on an annual basis. Misco Expo is a great opportunity for our business and public sector customers to spend quality time with our partners, to help address current IT issues or concerns. Digital transformation is high on the agenda for many organisations now, so it was good to see visitors engaging with our Solutions Team in the Business Transformation Hub.” About Misco A leading provider of IT products, solutions and services to businesses and public sector organisations, Misco has operations in the UK and Ireland and also across Europe in Italy, Spain, the Netherlands and Sweden. With over 30 years’ experience and a portfolio of 40,000 information technology products from all the leading manufacturers, Misco has an established Public Sector Sales Team with specialists in Education, Healthcare and Government sectors to help establishments choose the right IT solutions. Misco’s expert team understands that not all businesses have the necessary time and technical skills in house to deliver complex IT solutions. That’s why our Solutions and Services Team has six dedicated practices focused on Collaboration, Network, Security, Audio Visual, Infrastructure and Workspace Transformation. www.misco.co.uk ### IP Expo Interview with Christian Fumic from Lantronix Disruptive interviews Lantronix's EMEA Sales Manager, Christian Fumic at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Lantronix at: https://www.lantronix.com/ ### IP Expo Interview with Atchison Frazer from Talari Networks Disruptive interviews Talari Networks, WW Head of Marketing, Atchison Frazer at IP Expo 2017. Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, interviewing technology experts from all over. Visit Talari Networks at: https://www.talari.com/ ### How an ‘as a Service’ Model Will Deliver IoT for the Masses Possibly envisaged to become the biggest driver of business growth in the next decade, the Internet of Things (IoT) has been estimated by Accenture to add $14.2 trillion to the global economy by 2030. However, although the appetite and potential benefits of IoT are there, it is not yet ready for mass distribution. Risk and cost remain the two biggest barriers to adoption, however, the reality is that for businesses of all shapes and sizes, a new IoT as a Service model could hold the key to IoT uptake on a mass scale. 
Limitations to consider While the concept of IoT offers appeal to businesses of every size, the truth is the IoT’s vision is yet to be achieved. For many SMEs, the expense of 3G and 4G connectivity has made IoT projects prohibitively expensive and, while the market has responded with the introduction of low-cost, low-power wide area networks (LPWANs), there is not yet a ‘go-to’ solution for companies, adding an element of confusion into the mix. In addition to the networking hurdle, creating a reliable and sustainable IoT infrastructure is a hugely complex task and has affected the development of viable business models. Organisations must not only find a way to manage sensors, networks, data storage, data analytics and an essential link to operational systems that use IoT data to drive valuable business insights, but there is understandable uncertainty around the long-term viability of the model and underpinning technologies. [easy-tweet tweet="IoT is hitting a new level of maturity in both technology and delivery model" hashtags="IoT, Technology"] For any business thinking about investing in IoT, there are some very real concerns. Where is the future-proofing? Where is the consistent, proven and reliable network infrastructure? How can we undertake such a complex project without investing in huge additional technical resources? The good news is that IoT is hitting a new level of maturity in both technology and delivery model that will not only reduce risk and cost but give businesses peace of mind that they are investing in a future-proof solution. IoT Maturity While the cost model of mobile technologies has made IoT at scale unreasonably expensive to date, there has been a rapid evolution of LPWAN technology recently that will make projects with tens of thousands, even millions of devices, possible. While there are licensed cellular variants such as Narrowband IoT (NB-IoT), currently being used in pilot projects in Eastern Europe and southern Spain, it is the unlicensed LPWANs that are already available for businesses to take advantage of. One of the most notable global LPWAN developments is LoRaWAN, built on Semtech’s LoRa technology and maintained as a standard by the 500+ world-class organisations of the LoRa Alliance, which is being rolled out across multiple countries. And while today there is no single, cross-UK network, the ability to blend networks in different regional areas – including the adoption of international roaming via LPWANs – now provides organisations with a seamless, low-cost, scalable IoT network model. This growing maturity of network technology is being mirrored by the advancement in design and manufacturing of devices – with new sensors and devices available with batteries that can last up to five years, minimising ongoing cost and maintenance requirements. Essentially, it is now possible to deliver IoT projects at a far lower cost – opening the door for large-scale, yet affordable IoT projects. Proof of Concept This maturity is being confirmed by the growing number of high-profile IoT projects that are using the technology to deploy projects at scale. For example, the Smart City project in Milton Keynes is using parking sensors on the road that can tell when a vehicle is parked. In addition to enabling new parking enforcement systems, the project is collecting sensor data to analyse trends in parking activity to support ongoing road management planning.
Similarly, the Cambridge Smart City project is already starting to measure air quality within this highly congested environment. Both demonstrate the potential value of IoT, rolled out in a scalable and cost-effective way. End to End IoT In tandem with technology, advancing is a maturing market model, with a growing number of providers stepping up to manage the network fragmentation and delivering IoT solutions as a service, a future-proof model. This End to End IoT model encompasses every aspect of the solution from sourcing and deploying sensors, to creating the blended network, managing data storage and undertaking analytics. Also, with integration skills and the use of APIs, IoT platforms and their vital data can be made accessible to operational systems. The IoT as a Service model will make key applications, such as building management systems, smart parking, pest control and waste bin management available for instant use without heavy customisation, removing all barriers to entry, especially within the SME market. IoT is set to revolutionise businesses. However, it is by creating a model that can reduce the risk and cost that we will see the technology truly thrive. ### Quest Launches New Cloud Platform to Easily Manage Backup and Recovery Environment   Provides single, consolidated management view of data protection environment for MSPs and enterprises Cloud-based solution delivers unmatched monitoring from any location and any device Helps ensure uptime of all systems, applications and data as an extension of Quest Rapid Recovery ALISO VIEJO, Calif—Quest Software, a global systems management and security software provider, today announced the new Quest Data Protection Portal – an intuitive, cloud-based management console that provides end-to-end visibility for an entire data protection environment. The first release of the new portal is an extension of Quest’s award-winning Rapid Recovery solution. The portal addresses a growing demand among Managed Service Providers (MSPs) and enterprise customers for a single pane of glass view into managing and monitoring growing data protection environments from anywhere at any time. Over time, the Quest Data Protection Portal will encompass all of the business’s data protection solutions, giving customers complete visibility and management of backup and recovery needs of business applications across hybrid virtual infrastructures. To stay competitive and successful, companies cannot afford a minute of downtime. Though an effective backup and recovery solution remains critical to ensuring business continuity, as organisations are tasked with protecting increasingly complex and heterogeneous environments spanning physical, virtual and cloud-based systems, data protection management has become a mounting challenge for enterprises and MSPs alike. [easy-tweet tweet="The Quest Data Protection Portal is cloud based" hashtags="Cloud, Data"] The Quest Data Protection Portal is cloud based and extends the capabilities of Quest Rapid Recovery by giving enterprises a single management console to easily monitor and manage their entire data protection environment. With the portal, MSPs gain a single point of entry to easily manage backup and recovery across multiple customers, use cases and deployment locations – all through a single interface. 
Key features and functionalities of the Quest Data Protection Portal include: Cross-location and device monitoring: A cloud-based delivery model ensures that customers can monitor all of their Rapid Recovery Core servers in any location, from any location, via mobile device, laptop or desktop. Activity status displays: Admins and MSPs can check the status of all backups, replications, and virtual standbys across entire data protection surface areas and customer deployments within the solution for easy updates and peace of mind. Access control at group and user levels: Maintain the highest level of security and ensure the right users can access the business-critical data they need, at any time. Near real-time alerting: In the event of a backup or recovery error, admins receive real-time notifications and recommended tactics to ensure the environment is operating at optimal levels at all times. “At Quest, we pride ourselves on delivering the software tools, both on-premises and in the cloud, that address our customers’ toughest technology challenges,” said Ronnie Wilson, President and GM, Quest DP | KACE. “With the new Data Protection Portal, we’re giving MSPs and enterprise customers a simple, single-pane-of-glass data protection management console so they can monitor and protect an increasingly complex backup and recovery environment, and react quickly in the event of a disaster.” “As an early adopter of the Data Protection Portal, our experience since implementation has been extremely favourable,” said Rigo Verastica, Server Administrator, Digital I/O, LLC. “The portal is an impressive tool that facilitates managing and monitoring the health of our entire backup universe. Technical support during the process has been world-class.” ### Why Choose An OEM License? Mark Maclean from Dell EMC explains. Why choose an OEM License? Mark Maclean from Dell EMC explains how Microsoft OEM licensing works and why it benefits companies and service providers. ### Atos launches data encryption solution to enable businesses to be compliant with global regulations Atos’ cybersecurity solution supports GDPR compliance Paris, Monaco, Les Assises de la sécurité, October 12, 2017 – Atos, a global leader in digital transformation, announces the Trustway Data Protection Suite, a new comprehensive data encryption solution, to protect sensitive data and effectively manage data security to reduce the risk of cyber attacks. This powerful platform enables businesses to protect, securely manage and migrate sensitive data wherever it resides, whether on-premises or in virtual, public, private or hybrid Cloud environments. This new solution will combine the Trustway Hardware Security Module (HSM) with a ‘Gemalto Software Inside’ Data Protection suite to provide clients with a complete end-to-end and interoperable data encryption solution for transparent and efficient data protection at all levels of the business data stack, including the application, database, file system and virtual machine, using encryption and tokenization. [easy-tweet tweet="The TDPS enables enterprises and organisation to be compliant with all new regulations worldwide" hashtags="Data, Security"] The Trustway Data Protection Suite enables enterprises and organisations to be compliant with all new regulations worldwide: the European General Data Protection Regulation (GDPR), the Payment Card Industry Data Security Standard (PCI-DSS) and the US Health Insurance Portability and Accountability Act (HIPAA).
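As a generic illustration of the application-level encryption and tokenization idea described above – not Atos’ Trustway API, whose interfaces are not shown here – the sketch below uses the widely available Python cryptography package to encrypt individual fields before a record is stored. In a real deployment the key would be generated and held in an HSM or key manager rather than in application code.

```python
from cryptography.fernet import Fernet

# For illustration only: a real system would fetch this key from an HSM or key manager.
key = Fernet.generate_key()
cipher = Fernet(key)

def protect_record(record, sensitive_fields):
    """Encrypt selected fields of a record before it is written to a database."""
    protected = dict(record)
    for field in sensitive_fields:
        protected[field] = cipher.encrypt(record[field].encode()).decode()
    return protected

def reveal_field(protected_record, field):
    """Decrypt a single field for an authorised caller."""
    return cipher.decrypt(protected_record[field].encode()).decode()

customer = {"name": "Jane Doe", "card_number": "4111111111111111"}
stored = protect_record(customer, ["card_number"])
print(stored["card_number"])                 # ciphertext, safe to store or replicate
print(reveal_field(stored, "card_number"))   # original value, only with the key
```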
It is compatible with environments from leading providers. Key features and benefits: One centralised solution for the whole company with standardised data encryption for each organization Easy access to and control of encrypted data A highly secure environment: HSM architecture certified CC EAL4+ and ANSSI Reinforce Qualification (highest level for non-military products) Cost savings – from reduced budget for training resources and security policy management A modular architecture that enables CISOs (Chief Information Security Officers) to meet various encryption needs from different operational teams The new solution is now available and will be presented on the Atos’ stand #11 at the 17th edition of Les Assises de la Sécurité, which takes place in Monaco from 11 to 14 October. “We are pleased to offer our customers an end-to-end suite of solutions for their data protection. This new solution strengthens our leadership in cybersecurity and our position as a European cryptography provider” said Alexis Caurette, VP, Head of Cybersecurity Products at Atos. Atos: #1 in Europe and a global leader in cybersecurity Atos is a European leader in data protection and has a global team of over 4,500 security specialists and a worldwide network of 14 adaptive Security Operation Centers (SOCs). It integrates the best security technologies and offers a full portfolio of security solutions to support its clients in their fight against high-level threats. Atos has been positioned by NelsonHall as a leader in its Managed Security Services (MSS) NEAT vendor evaluation study and ranked first in Europe in terms of market share. ### Compare the Cloud interviews Allan Zander CEO of DataKinetics Neil Cattermull interviews CEO of DataKinetics, Allan Zander. Throughout the interview, Allan provides insights towards the digital transformation of utilising mainframe technology. For more information about DataKinetics visit: http://www.dkl.com/ ### Enterprise Cloud – It’s About Management Today, more than 90 percent of enterprises have deployed or are planning to deploy a cloud solution. As enterprise cloud deployments grow, however, requirements for manageability expand. Public cloud/service provider features are now becoming very relevant to large enterprises because they have massive user bases, many different departments, even sub-companies, and different geographic regions. They need internal tracking and billing mechanisms, and they need control over which resources each user can access. Cloud Management Challenges As cloud use expands within an enterprise, management challenges expand with them. What might begin as a shared resource for the DevOps group can rapidly expand into a repository for all business units in all geographical regions. Challenges in such an environment include: Ensuring that each user or group has access to adequate resources without overprovisioning; Making optimal use of existing server, networking, and storage systems to maximize return on investment; Enabling chargeback and billing mechanisms to accurately account for resources used; and Managing resources across multiple clouds. To overcome these challenges, a cloud platform should meet several management requirements. Cloud Management Requirements A cloud is a collection of shared resources, and fundamentally, running a cloud requires managing user access to those resources. There are several aspects of management to consider. 
[easy-tweet tweet="Cloud administrators need to be able to use existing as well as new servers" hashtags="Cloud, IT"] Infrastructure management – A cloud will likely leverage new as well as existing server, storage, and networking resources. The goal of a private cloud management system is to maximize flexibility in integrating and managing these resources. For example, cloud administrators need to be able to use existing as well as new servers and storage systems, and they shouldn’t be constrained in terms of which servers and storage systems they can use to enable or scale the cloud. User management – Cloud administrators need the ability to manage users individually as well as in groups, dedicating specific resources to specific users and groups as needed. Groups can be development teams, business units, departments, or even sub-companies, and users may belong to more than one group. Server/CPU management – Each node in a cloud cluster is shared by multiple applications and users. Server and CPU management allows administrators to control how many users can access a server or even a specific CPU core in a server, and to dictate what levels of computing service each user gets. Networking management – An effective management system should allow cloud administrators to control networking resources on a per-user basis, determining how much bandwidth each user or application gets, which levels of QoS each user or application gets, and which logical networks any individual user or group can access for security enforcement. Storage management – Most enterprises tier storage depending on the needs of the data being stored. Different users require different levels of storage performance, such as slow spinning disk storage (SATA or SAS) or fast flash storage (SSD or NVMe), and applications such as Data Recovery or Backup must have dedicated storage resources. A cloud administrator should be able to optimize the use of storage resources across the enterprise to best meet the needs of each user, group, or application, and should be able to scale with individual storage systems. Billing management – Large organisations need to be able to charge departments for cloud resource usage. This is a fundamental capability for tracking total cost of ownership (TCO) for the cloud platform and associated infrastructure. Multi-cloud management – Most organisations use multiple clouds, incorporating both public and private cloud resources into their cloud deployments. An effective cloud management system allows for workload mobility and access control across multiple clouds through a single user interface. A large company might have a large data centre in North America, with regional data centres in other countries. Users need access to local resources, but also the ability to “burst up” into different data centres and sets of resources. Cloud Management Benefits By managing all aspects of cloud operations, enterprises can maximize the use of their existing hardware, ensure that all users have access to only the resources they need and have the governance and security they require for successful operations. As enterprises expand their use of cloud technology, they need greater management capabilities to maximize infrastructure investments and ensure that all users get access to the compute and storage resources they need without over-provisioning or starving applications. With the right cloud management platform in place, they can achieve these goals. 
### Time for Artificial Intelligence to Really Deliver Technology is a sector that has often been prone to over-hyping the latest trends that emerge within it. Over the past few years, virtual reality and driverless cars are just two examples of technologies that, while rich in potential, have been over-hyped by both the tech and business press, when they are in fact still some way from emerging fully. The fact that industry analyst group Gartner calls one of its market analyses the ‘Hype Cycle’ further highlights the tech industry’s propensity for hype, and perhaps one of the most over-hyped technologies of recent times is Artificial Intelligence (AI). Despite the media attention and the number of vendors claiming to have world-leading solutions, the number of actual end-users deploying AI to good effect is fairly limited. But that’s not to suggest that AI is without enormous potential. A recent (July 2017) McKinsey Global Institute study, Artificial Intelligence: The Next Digital Frontier, revealed that tech giants including Baidu and Google spent between $20B and $30B on AI in 2016, with 90% of this spent on R&D and deployment. So AI is on the agenda of the biggest and smartest firms in the world. But how can organisations make the best use of AI and ensure it truly delivers on its potential? Data and AI – the perfect match The one thing that can help AI emerge from the hype is simple – data. Organisations hold more data than ever before, and thanks to the Internet of Things and the connected world in which we live and work, this data is growing all the time. Used effectively, data can deliver unparalleled insight into customers and more, but it has been a challenge until now for organisations to manage this data in a way that enables easy extraction of actionable insight. The answer lies in AI and machine learning (ML). But before AI and ML can get to work, a big issue for many organisations is the manner in which data is stored. It isn’t uncommon for a business to store and manage data in multiple CRM platforms, as well as a number of other repositories – ERP, other databases, on servers and desktops – across the enterprise. This can arise through M&A activity, expansion or simply inefficient management of IT resources, but the result is the same – data held in many siloes. This makes the management and analysis of this data a much harder job than it needs to be. If you cannot access data, then how can you be expected to draw insight from it? Addressing unstructured data Furthermore, the sheer volume of big data in modern organisations can be bewildering, and it comes in files and formats that most CRM systems are unable to manage effectively. Unfortunately, this data is often the most valuable, containing rich insight into a particular customer and their specific needs and requirements. This unstructured data includes: any social content – Twitter, Facebook, LinkedIn, Instagram – by, and relating to, that customer; email conversations between the customer and brand; service call scripts that detail any recent or historical issues; and much more besides that doesn’t fit into the formats used by most CRM systems. Not deploying unstructured data within a CRM can be a major problem: it means that huge swathes of potential customer insight are missing. Utilising technology that captures both structured data in siloes and the masses of unstructured data means businesses can begin to benefit from AI.
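A minimal sketch of the kind of search over unstructured text described above can be built with scikit-learn's TF-IDF vectoriser and cosine similarity. The snippets and query are invented for illustration; a production cognitive system would add entity extraction, access control and learning from user feedback, but the core retrieval step looks something like this.

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity

# Unstructured snippets pulled from email, call scripts and social posts (made-up examples).
documents = [
    "Customer emailed about delays to the Q3 renewal contract",
    "Service call: customer reported login failures after the last upgrade",
    "Tweet praising the new mobile app release",
]

vectoriser = TfidfVectorizer(stop_words="english")
doc_vectors = vectoriser.fit_transform(documents)

def search(query, top_n=2):
    """Rank the indexed documents by cosine similarity to a free-text query."""
    query_vector = vectoriser.transform([query])
    scores = cosine_similarity(query_vector, doc_vectors).ravel()
    best = scores.argsort()[::-1][:top_n]
    return [(documents[i], round(float(scores[i]), 3)) for i in best]

print(search("renewal contract delay"))
```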
[easy-tweet tweet="AI it can be hugely transformative and have a genuine and tangible impact on businesses and their customers" hashtags="AI, IT, VR"] The potential of AI Used properly, AI it can be hugely transformative and have a genuine and tangible impact on businesses and their customers. For instance, enterprise search is an area that is crying out for the use of AI and ML. Knowledge workers are spending too long searching for information that might not even be filed where they are searching for it. Using an AI-based cognitive system allows much more complex search queries and can even make relevant and contextual suggestions back to the user. This is AI effectively understanding better than the user what that user is looking for, saving time on searches and delivering better results. AI can also deepen customer understanding. An issue for larger firms is knowing who holds which relationships at a particular customer. AI technology can look at mases of unstructured data - all emails sent to an organisation, from and to different people, in different departments and multiple locations – and provide a way for a user to take meaningful action with a customer. This outlines clearly and in real-time who knows who within an organisation, invaluable when enterprise relationships can be so complex. These are just two examples – the potential of AI can go much further than that. But the key to taking AI beyond the hype lies with data, that’s where the potential is at its richest. By deploying AI and ML, organisations can collect data from multiple sources and in multiple formats, extracting fresh and insightful meaning from it and helping to deliver a complete view of that customer. ### WiredScore announces new investor group led by Bessemer and Fifth Wall amid aggressive global expansion London, October 12th 2017 – WiredScore, the world’s leading rating system for technological capacity in commercial buildings, has secured the support of a group of major investors in its rapid expansion plans. An investment team, led by Bessemer Venture Partners (‘Bessemer’) and Fifth Wall Ventures (‘Fifth Wall’), has acquired a majority share of the company, paving the way for WiredScore to further expand its global operation. “This new investor group underlines the strength of WiredScore’s offering; delivering critical connectivity accreditation to hundreds of commercial buildings in multiple countries,” said President and EMEA MD, William Newton. “Businesses now expect high quality IT infrastructure and internet as standard and that’s why our Wired Certification programme is proving such a compelling proposition for landlords across Europe.” This week also sees the Sisters Towers in Paris become the thousandth building to be Wired Certified. At the forefront of innovation in technology, the project has been developed by Unibail-Rodamco, Europe’s largest listed commercial property company, as part of its mission to support the wellbeing of its occupiers.  This is one of a number of iconic properties globally to achieve Wired Certification, including the Empire State Building, Google's Office in New York, the Walkie Talkie, the Cheesegrater in London, Tours Duo in Paris and the Pressehaus in Berlin. Globally, there are now over 4 million people working in Wired Certified buildings. 
[easy-tweet tweet="WiredScore’s mission of promoting best-in-class internet connectivity in commercial buildings has transformed " hashtags="Transformation, WiredScore"] Founded in 2013 in partnership with Mayor Bloomberg and the City of New York, WiredScore’s mission of promoting best-in-class internet connectivity in commercial buildings has transformed buildings across the globe. The company launched in London, UK, in October 2015 with the endorsement of the Mayor of London, certifying iconic landmarks including The Shard and White Collar Factory. The continued success of the Wired Certification programme included rapid regional expansion into several major cities including Birmingham and Manchester. In March 2017, the company opened an office in Paris, France, marking the beginning of WiredScore’s significant European growth this year. This led to the company being awarded the prestigious label of excellence from the Finance Innovation cluster, the ministerial committee in charge of promoting economic development and competitiveness in France. Just months later in July, WiredScore launched in Dublin, Ireland, before opening a German office in Berlin in early September with support from the Mayor of Berlin. WiredScore is due to launch in Canada before the end of the year. ### Alibaba Cloud and the UK Met Office to Co-organise Tianchi Data Mining Contest Global competition for solutions to climate challenges for future generations Hangzhou, China, 12 October 2017 – Alibaba Cloud, the cloud computing arm of Alibaba Group, today announced at the Computing Conference in Hangzhou China that it will organise a data mining contest with the Met Office, the United Kingdom’s national meteorological service. Data experts from around the world will be invited to use Tianchi, Alibaba Cloud’s global big data crowd intelligence platform, to develop solutions relating to a potential future world in which deliveries by unmanned balloons must be optimised to navigate the variabilities of the UK weather. The contest will be set in 2050, in a world where the invention of ‘anti-gravity engines’ has led to the creation of unmanned balloons as the preferred logistics solution. However, because of the UK’s complex meteorological conditions, the balloons are occasionally delayed, damaged or even destroyed by extreme weather conditions causing disruption and loss. The contestants will be challenged to create algorithms that can plan flight routes for these balloons to navigate the endless variation and changeable nature across the UK to optimise their delivery schedules and costs. The team which comes up with the best solution will win USD10,000 cloud credits and an internship opportunity at the Met Office. [easy-tweet tweet="Participants will be able to test their algorithms and modify their models accordingly" hashtags="Cloud, Technology"] Contestants will utilise public meteorological data provided by the Met Office in categories including rainfall and wind speed. Changing temperatures, rainfall, wind speed in area grids of four square kilometres and five-minute time intervals will be selected to challenge the resilience of the balloons. This will be overlaid with data for the balloons such as take-off and landing locations, maximum fly time and speed, as well as a set of limiting factors such as collision avoidance parameters. Participants will be able to test their algorithms and modify their models accordingly. “We are excited about hosting the competition with the Met Office. 
We share a vision to develop solutions that can solve real-life problems of global interest. This competition will challenge the participants in a scenario with real-life factors to test both their technological capability and creative thinking. We look forward to receiving innovative solutions from data talent around the world,” said Wanli Min, Chief Machine Intelligence Scientist, Alibaba Cloud. "Since our foundation in 1854, the Met Office has been striving to continuously improve the accuracy and usefulness of weather forecasting. We are excited that data from our forecasting systems is part of this challenging competition. We hope all contestants will enjoy the British weather as much as we do, even if it pushes them to the limit," said Alberto Arribas, Head Informatics Lab, Met Office. There are more than 100,000 developers and 2,700 academic institutes and businesses from 73 countries and regions in the Tianchi community. Through hosting big data competitions, such as this, Tianchi pools the wisdom of data experts from around the world and leverages the expertise of the Alibaba Cloud data scientists to develop professional solutions for various verticals including smart manufacturing, aviation, transportation and logistics. The Tianchi scientists’ creativity will help entities in different sectors to address real-world challenges. An example of this is the successful KDD Cup 2017 competition, deemed the “Data Mining Olympics”, in which 3,582 teams participated to develop solutions to predict the traffic flow of highway tollgates. The contest will open for registration on 30 October. For more information, please visit the Tianchi website from 30 October. ### ZoneFox launches first hosted UEBA platform to protect vulnerable organisations ZoneFox will provide secure, rapid, and scalable deployment for all its clients and deliver robust and progressive cyber security to the mid-market Edinburgh 2017 – Leading cyber-security software specialist ZoneFox has announced the release of its fully hosted platform. Utilising the exact same roster of intelligent technology designed to help companies protect their data, hunt threats, and remain compliant, ZoneFox can reside remotely in the highly secure ZoneFox cloud. ZoneFox’s platform is the most secure, fast, and cost-effective route for clients to access the award-winning technology, and is available within minutes of the endpoint agents being deployed. ZoneFox has seen particular demand in the mid-market for its hosted service, where manpower and cyber skills are scarce, but the risks are just as prevalent as they are in enterprise. SMEs host customer data, IP, and other prized assets, making them an attractive target for cyber criminals. Nearly one million SMEs were hit by a cyber attack in the last year and, with its latest release, ZoneFox will equip SMEs with the progressive technology needed to remain compliant with data laws and legislation. ZoneFox combats the growing risk of the insider threat, and rapidly identifies and alerts on anomalous or suspicious activity by monitoring user behaviour and data movement, both on and off the network. This includes activity such as file transferring, data loss or theft, writing IP to removable media, tunnelling data through the dark web, ransomware and malware files entering the network and unauthorised or suspicious file access. Matt Little, CTO, ZoneFox says: “Technology should benefit everyone, not just those with big cyber budgets. 
The mid-market is a prime target for hackers, but in the face of an ever-evolving and sophisticated landscape they lack the manpower or skills in-house to proactively manage and track data as it traverses the network. We’ve sought to eliminate that pain point, giving them the confidence they have the visibility they need, but taking away the complexity.” [easy-tweet tweet="ZoneFox breaks the mould for the UEBA industry" hashtags="ZoneFox, Data, Technology"] “ZoneFox breaks the mould for the UEBA industry, putting powerful AI and machine-learning technology directly into the hands of businesses regardless of size. The result is that every business can now access a highly secure, easy to deploy and cost-effective solution,” Little continued. The product has recently been trialled, is now being deployed with existing clients, and is available to purchase. This announcement follows rapid growth, which the company is seeking to capitalise on by extending its footprint further; hosted ZoneFox is a strategic cornerstone of this plan, as it will enable the company to deploy its technology at companies around the globe. ### Avaya brings personalised cloud to the mid-market with BT Wholesale Tailored UC and CC cloud solution combines resiliency and MiFID II / PCI compliance out of the box Guildford, UK – Avaya announced the UK launch of ACS Select, the next generation of the Avaya Cloud Solutions delivery platform, which is hosted securely on the BT Wholesale network. The service enables mid-market organisations with legacy on-premise and complicated integrations of multi-vendor solutions to transform their customer experiences, and create the same personalised and tailored cloud experiences which have typically been the preserve of larger enterprises. Available through BT Wholesale channel partners, ACS Select will empower their medium-sized customers, a market known for its cautious approach to migrating existing on-premise infrastructures to the cloud. They will now be able to collaborate and communicate on a par with larger enterprises and personalise their customer experience across voice, video, messaging and customer contact applications, all via a single platform. This innovative hybrid cloud service is an extension of the longstanding relationship between BT Wholesale and Avaya, which combines the strengths of BT’s ultrafast high-performance Ethernet network and Avaya’s market-leading IP Office Cloud and Avaya Contact Centre Select solutions. Building upon the strong Unified Communications (UC) team engagement and Contact Centre (CC) customer engagement foundations offered by Avaya Cloud Solutions, ACS Select offers further functionality, carrier-grade resilience ‘out-of-the-box’, and business continuity options to improve and tailor the customer experience. It also includes managed service options to support channel partners’ go-to-market strategies, including a managed network which extends the SLA out to the customer site. Intelligent call recording – which is vital for financial services and many other industries – will be included to ensure customers can demonstrate recording compliance with financial industry standards and directives, including the Markets in Financial Instruments Directive (MiFID). Secure Payments ensures compliance with the Payment Card Industry Data Security Standard (PCI-DSS). Growing enterprises need technology that can be configured to suit increasingly complex requirements unique to each company.
With ACS Select, organisations which have tailored their on-premise platforms over time into totally unique environments can emulate that personalisation in the cloud. Flexible service options enable the configuration of customers’ platforms to their own specific hybrid or fully on-premise requirements. In the hosted environment, BT Wholesale and Avaya will help companies better manage costs by eliminating the need for large, up-front capital investments and upgrades. The ‘pay-as-you-grow’ model will enable businesses to respond quickly and flexibly to their customer demands and improve both customer experience and revenue generation.  Avaya and BT Wholesale have created an end-to-end service that is resilient, guaranteed and highly secure, whilst giving customers the flexibility to create, configure and control tailored architectures through their preferred channel partner. Ioan MacRae, UK Managing Director at Avaya, said: “The introduction of ACS Select reflects the strategic importance we place on our alliance and partner ecosystem. We have a long and successful relationship with BT Wholesale, and with them we are helping both partners and customers understand the importance of how services integrate for a successful cloud migration, especially from a resilience, security and regulatory compliance standpoint. “ACS Select is not a mass-market, one size fits all offering that restricts performance capabilities or lumbers companies with functionality that is of no use to them. Our reseller partners can offer it as a bespoke solution to fit to their mid-sized customers’ existing systems and applications, along with the economies of scale and speed to market enabled by the combined reach and market share of two of the most trusted brands in communications.” Simon Orme, Director of Sales at BT Wholesale, said: “We have gained a tremendous amount of insight from customers over the years when selling Avaya Cloud Solutions. Building on that strong foundation, Avaya and BT Wholesale have designed ACS Select to solve the issues which we know companies are facing when it comes to configuring a solution that will deliver a consistent end-to-end service. For example, there are lots of companies running a cloud communications environment across multiple sites, with some at completely different stages in their cloud migration journey than others. The scalable and highly customisable nature of the platform meets the needs of very fast-moving and ever-changing organisations, who want their employees to experience an identical level of service across every site.” The service is designed especially for scalability. Customers can choose from a capacity of up to 3,000 seats for UC and up to 400 seats for CC, and will only be charged based on their use, meaning they can manage costs and scale up or down as they require. 
For more information and details of features and functionality, visit the Avaya Cloud Solutions Delivered by BT Wholesale page: https://www.btwholesale.com/pages/static/products-services/avaya-cloud-solutions.htm ### Compare the Cloud will be at Cloud Expo Asia - Day 2 Cloud Expo Asia Day 2 - Morning Show: https://vimeo.com/238558164/0730493b91 Cloud Expo Asia Day 2 - Afternoon Show: https://vimeo.com/238556391/cad6f9f98f ABOUT Compare the Cloud’s event coverage platform, Disruptive have partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Tune in on the 11th and 12th October to watch live coverage from the Marina Bay Sands Expo and Convention Centre, Singapore. The broadcast will be from a specially built stand where attendees, exhibitors and speakers will be interviewed and update you on the latest cloud tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live at Disruptive! Schedule Cloud Expo Asia Live 12th October Show 1: 11:00am - 12:30pm SGT Show 2: 2:30pm - 4:00pm SGT MORE INFO As the cloud is now part of how businesses operate every day, it’s important to know the services and solutions that are critical to your cloud activity. At Cloud Expo Asia, more than 300 of the world’s leading suppliers will be attending, along with 350 expert speakers and thousands of attendees. For more information about guests and shows visit the official site at: http://www.cloudexpoasia.com/ ### Xero launches machine learning automation to improve coding accuracy for small businesses and their accountants The next step in accounting automation to bring greater efficiency and clarity to businesses LONDON, UK – 11 October 2017 – Building on the success of its first machine learning project, Xero announced the next step in its journey towards code-free accounting – the automation of account codes and bills for all small businesses and their partners. With more than 500,000 bills entered into Xero every day, and each line of a bill edited individually, the automation of billing account codes is set to transform accounting practices, ensuring greater accuracy and reducing the time small businesses spend creating bills. While bills is the second-most commonly used feature of Xero, it has the second-highest rate of defaults, with every small business using the system differently. Fifty percent of businesses use 10 or more expense codes, while others create their own codes, which means that information is often entered incorrectly. Xero’s new artificial intelligence system will consider each individual business’s characteristics, then recommend account codes based on what it has learned. “We set out to further develop our machine learning programme to reduce the number of mistakes being made when creating bills. In doing so, we’ve created a system that not only learns from the individual needs of our customers, but can also make objective decisions about which account their transactions should be coded to,” said Andy Neale, Head of Data Science and Automation at Xero. With accuracy at the forefront of Xero’s latest development, the account code automation is rolling out to all Xero users, with customers continuing to code their accounts for bills as normal.
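The general idea of learning account-code suggestions from bills a business has already coded can be sketched with a simple text classifier. This is a hypothetical illustration of the technique only, using scikit-learn; it is not Xero's actual model, and the descriptions and chart-of-account codes are invented.

```python
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

# Bill line descriptions this business has already coded (invented examples).
descriptions = [
    "Monthly office rent", "Office rent for March",
    "Google Ads campaign", "Facebook advertising",
    "Staples printer paper", "A4 paper and toner",
]
account_codes = [
    "469 Rent", "469 Rent",
    "600 Advertising", "600 Advertising",
    "453 Office Expenses", "453 Office Expenses",
]

# Learn from past coding decisions, then suggest a code for a new bill line.
model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(descriptions, account_codes)

print(model.predict(["Online advertising spend"])[0])  # expected: "600 Advertising"
```

The quality of such suggestions depends heavily on how many coded examples the model has seen, which is why a minimum volume of bills is needed before suggestions become useful.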
A minimum of 150 bills is required to be entered and as more bills are entered and accepted or corrected, the better the suggestions become. [easy-tweet tweet="Account coding has the potential to impact both small business finances and reporting" hashtags="Machinelearning, Accounts"] “Account coding has the potential to impact both small business finances and reporting. So, when applying machine learning to Xero’s billing system, we needed to ensure accuracy was at the forefront of any changes we were making. While we’ll only know the true impact of the technology when people start using it, on average, we know we can achieve more than 75 percent accuracy after 150 bills have been entered.” “Account codes can be time intensive task for small businesses. We are using machine learning to simplify the system overall, so that more time is spent growing the business, rather than on low-value, administrative tasks,” Neale said. ### Why the IoT Holds the Key to Unleashing Blockchain’s Power With spending on capital markets applications of blockchain expected to reach $400 million by 2019, growing at a 52 percent compound annual growth rate, it’s clear that momentum is surging. Blockchain growth is further fueled by the Internet of Things (IoT), as one in five IoT deployments are expected to utilise basic blockchain services by 2020. The implications of this are enormous; using blockchain for IoT transactions and data sharing can ease security concerns, remove single points of failure, streamline processes and cut costs. As an example, co-working spaces and apartments could control the ability to access the rental, and autonomous vehicles could automatically pay for petrol and parking. But the use cases don’t stop there. When supported by service assurance, IoT and blockchain can unleash mind-blowing innovations in every industry it touches. As world-leading businesses such as Microsoft, IBM and General Electric rush to implement blockchain, enterprises should look to embrace the technology, or risk being left behind. Creating trust So what exactly is blockchain? As Kamaljit Behera, Industry Analyst at Frost & Sullivan, states, it is “a new data structure that creates trusted, distributed digital ledgers for assets and other data. It is an immutable record of digital events shared peer-to-peer between different parties. It can only be updated by consensus of a majority of the participants in the system and once entered, information is very hard to erase.” Because of this, blockchain creates a high degree of trust. Additionally, you can only add transactions, not remove or alter them, making blockchain attractive to organisations subject to Sarbanes-Oxley, HIPAA and other regulatory frameworks. On an IoT network, blockchain can facilitate not only financial transactions but also secure messaging between devices. By operating according to embedded smart contracts, two parties can share data without compromising the privacy of its owner. Although blockchain doesn’t solve every security problem for IoT devices, such as the hijacking of IoT devices for use in DDoS botnets, it helps protect data from malicious actors. At its core, blockchain enables an additional layer of security and trust to be added to existing applications, as well as newer ones. This approach provides the potential of a shared platform, while ensuring control, authenticity and integrity, which in turn can power new and innovative business models. The beauty of distributed blockchain transactions lies in its simplicity. 
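That simplicity can be illustrated with a toy append-only, hash-chained ledger. The sketch below is deliberately minimal Python, showing only the data structure: there is no networking, consensus or smart-contract logic, and the device readings are invented.

```python
import hashlib
import json
import time

def block_hash(block):
    """Hash everything in the block except its own hash field."""
    payload = {key: value for key, value in block.items() if key != "hash"}
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def make_block(transactions, previous_hash):
    """Create a block whose hash covers its contents and the previous block's hash."""
    block = {"timestamp": time.time(), "transactions": transactions, "previous_hash": previous_hash}
    block["hash"] = block_hash(block)
    return block

def verify_chain(chain):
    """Recompute hashes and check every link; tampering anywhere breaks verification."""
    return all(
        current["previous_hash"] == previous["hash"] and block_hash(current) == current["hash"]
        for previous, current in zip(chain, chain[1:])
    )

chain = [make_block(["genesis"], previous_hash="0" * 64)]
chain.append(make_block([{"device": "meter-7", "reading": 42}], chain[-1]["hash"]))
chain.append(make_block([{"device": "lock-3", "event": "opened"}], chain[-1]["hash"]))

print(verify_chain(chain))         # True
chain[1]["transactions"] = []      # try to rewrite history...
print(verify_chain(chain))         # ...and the chain no longer verifies
```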
Supported by growth in edge computing devices and 5G networks, this simple, distributed model will enable faster, more efficient communications between autonomous devices – without routing them through single points of failure. Blockchain can also maintain faithful records of IoT device functions, making it possible for devices to communicate autonomously without a centralised authority.

The importance of service assurance
However, for blockchain’s potential to be realised, service assurance is paramount. Like any digital transformation technology, blockchain and IoT will add complexity to the IT infrastructure. This can include edge devices (or sensors) and servers participating in blockchain transactions, middleware for encryption and authentication, and virtual machines for distributed databases and applications. Although autonomous device communications and accelerated transactions can boost efficiency, and improved availability and added security can cut costs, service assurance is now more necessary than ever before. In an IoT/blockchain environment, service delivery can be impacted by load, latency and errors, and because blockchain is essentially a highly distributed database, assuring service delivery is more difficult. It requires pervasive visibility across data centers, multi-cloud or hybrid clouds, and edge infrastructures. Illuminating the service delivery path, including load balancers, gateways, service enablers (such as DNS), networks, servers and databases (distributed or not) and all their interdependencies, can be achieved using software-centric monitoring of traffic flows to deliver smart data. With insight into service, application and infrastructure performance from smart data that is well-structured, contextual and available in real time, it becomes possible to control business outcomes from IoT/blockchain. While DNS is just one example, the coming growth in IoT devices in combination with blockchain will mean a surge of DNS requests and DNS-dependent services, which can have a major impact on service delivery and performance. The latency of DNS services should therefore be a particular concern for business continuity and for assuring IoT performance quality. If DNS performs badly, those IoT and blockchain services will suffer. That can mean parts of the connected world that are becoming ever more dependent on automation will come to a standstill. Healthcare, manufacturing, energy distribution, transportation and financial transactions can all be derailed by DNS problems. However, losing control is avoidable with the right service assurance platform, which gives IT teams visibility into DNS issues such as errors and busy servers. As IoT devices multiply drastically (a projected 24 billion connected devices by 2020) and almost all enterprises that invested in blockchain trials repeat those investments, it is clear that we are on the cusp of the IoT and blockchain revolution. It is therefore vital that IT professionals prepare themselves by architecting solutions that include service assurance, not only to reduce business risk but also to guarantee performance, availability and reliability in the production environment. Through this approach, they will be able to truly unleash the power of blockchain and the IoT.
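Since DNS health is singled out above as a choke point for IoT and blockchain services, here is a minimal sketch of the sort of spot check a service assurance process might run: it times name resolution through the system resolver and flags slow or failing lookups. The hostnames and the 200 ms threshold are illustrative, and a real platform would measure far more than resolution time.

```python
import socket
import time

def resolution_time_ms(hostname):
    """Time a single name resolution via the system resolver; None on failure."""
    start = time.perf_counter()
    try:
        socket.getaddrinfo(hostname, 443)
    except socket.gaierror:
        return None
    return (time.perf_counter() - start) * 1000

# Illustrative endpoints an IoT/blockchain service might depend on.
for host in ["example.com", "example.org", "no-such-host.invalid"]:
    elapsed = resolution_time_ms(host)
    if elapsed is None:
        print(f"{host}: resolution FAILED")
    elif elapsed > 200:  # arbitrary alert threshold in milliseconds
        print(f"{host}: SLOW ({elapsed:.1f} ms)")
    else:
        print(f"{host}: ok ({elapsed:.1f} ms)")
```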
### Snow Software launches GDPR Risk Assessment

Fast identification of GDPR risk areas with complete asset visibility
London, October 11, 2017 – Snow Software (“Snow”) today announced the launch of Snow GDPR Risk Assessment, empowering organisations around the world to accelerate their General Data Protection Regulation (GDPR) compliance programmes by exposing personal data loss risks. Snow has extended its leading network discovery, software recognition and management capabilities with a new solution for GDPR. This provides visibility of devices, users and applications to pinpoint which applications present a potential data loss risk, whether on-premises, across cloud or on mobile. Today, Snow GDPR Risk Assessment identifies more than 23,000 application versions that hold or transmit personal data, and can flag devices that are insufficiently protected, for example due to a lack of encryption and/or anti-virus software. The solution continuously analyses applications and surfaces issues through out-of-the-box management reports so organisations can see, understand and mitigate risk. “Every organisation conducting business in Europe must comply with the GDPR by May 2018, otherwise they may face fines of up to 4% of their annual revenue or €20 million, whichever is greater. Despite this, many organisations don’t know where to focus their efforts. Snow GDPR Risk Assessment offers an extremely quick and cost-effective way of kick-starting a comprehensive GDPR compliance program,” says Matt Fisher, SVP Product Strategy, Snow Software. Snow GDPR Risk Assessment is available with immediate effect. For pricing information, contact your regional Snow office or preferred Snow partner. For more information on the solution visit: https://www.snowsoftware.com/int/products/snow-gdpr-risk-assessment.

### Cloud Computing & The Modern Age

In today’s world, the Internet and technology have become commonplace in companies of all sizes. Widespread Internet use, faster information exchange, a growing variety of electronic devices and bigger storage capacities have together brought forth a new era in how information is created and shared. Cloud computing can be considered one of the defining innovations of this era, and it can bring a diversity of advantages to business.

What is cloud computing?
Delivered through a service provider, cloud computing is a system that serves as an internet-based information centre where customers can safely access all kinds of files and software from several different devices, no matter where they are. It is a solution for companies and individuals looking for an easy way to store and access media from one device to another, and the ability to share that media with other people who have been given access. Cloud computing is efficient and better at promoting a strong workflow; it is also much more cost-effective than traditional computing solutions. For one, you no longer have to worry about the cost of hiring IT experts to handle your systems, because they are handled for you by a third party. It is also a much greener approach than standard solutions, something that is becoming more important for businesses everywhere. Cloud computing saves energy and can help your business to drastically lower its carbon footprint, which is always a good thing.
Advantages of cloud computing for businesses

Global access
Because the private cloud runs on remote servers, your employees can share files and calendars with ease. This allows for better planning, better time management and an all-round more efficient way to run a business. Businesses can identify which users, agencies, clients and staff require access to particular software functions and to the data being worked on, and can similarly control what kind of network access is made available to each.

Efficiency of data sharing
You don't have to worry about updates and errors either. Rather than storing all of your important data on a single server, true cloud computing solutions use a remote cluster of servers, so if something goes wrong with one server, your files can be served from another and no information gets lost. Cloud computing also provides the continuous availability and location independence that helps with data storage, computing and networking. Information can be accessed easily, and user requirements can be accommodated even when people are in different time zones and geographic locations. For example, for companies involved in online marketing, cloud computing can be very useful: when employees around the world, working to different time frames, need to share information securely, a web designer in a Miami SEO branch and a PPC consultant in San Francisco can access the same information and share their projects through the cloud software instead of emailing files back and forth. This way the information that is exchanged is secure and can be updated at any time.

Less technical knowledge needed
Cloud computing can be personalised according to the needs of the business, reducing the need to bring in outside specialists. The cloud computing service provider already offers IT support to the company, so users do not need to wrestle with technical issues.

Saves money and energy
Cloud computing avoids the need for businesses to invest in stand-alone software and servers. By using cloud capabilities, companies can avoid the additional costs otherwise associated with licensing fees, data storage, software updates and management. One-time payments and pay-as-you-go options are among the pricing models that help save on costs. Cloud computing also removes the need for on-site servers, as the business and its data are stored in a virtual space, and it reduces the need for in-house IT staff: most of the manual updates that would otherwise fall to a system administrator are handled by the cloud platform through resource APIs, and the hosting servers are actively monitored, maintained and upgraded by the cloud provider.

It is safe and secure
Cloud computing service providers maintain that the security layers around their databases are so stringent that hackers would find it difficult to penetrate even the first tier. In fact, cloud computing providers have banded together to research the techniques hackers may use to penetrate their systems, then apply the necessary fixes so that the same method cannot be used again.
With the current developments and progress that cloud computing providers are making, it can be said that the security concerns raised are being addressed, so the service is safe and secure to use. One thing is unquestionably sure: cloud computing technology will be the wave of the future and will change how people use the information highway for decades to come.

Large storage
Compared to a personal computer, cloud computing can store far more data, and services with effectively unlimited storage capacity are available to small, medium and large companies alike.

Device independence
As mentioned before, cloud software can be accessed at any time from anywhere in the world, and it can be used on any device that has a browser and access to Wi-Fi.

### Workday Delivers its first Data-as-a-Service Offering with Workday Benchmarking

Benchmarks including Workforce Composition and Leadership Effectiveness equip global organisations with valuable market insights to optimise business performance
LONDON, UK – 10th October, 2017 – WORKDAY RISING – Workday, Inc. (NASDAQ: WDAY), a leader in enterprise cloud applications for finance and human resources, announced the general availability of Workday Data-as-a-Service (DaaS), a cloud service that provides valuable data to customers to enable more informed decision-making. The first service delivered on the DaaS offering, Workday Benchmarking, provides key metrics to customers seeking a better understanding of their company’s performance relative to peers, to help achieve optimal performance in their respective markets. With Workday Benchmarking, customers worldwide can leverage the collective power of Workday’s extensive community – including over 26 million workers across more than 1,800 global organisations. Workday Benchmarking is seamlessly unified with all of Workday’s products, including Workday Financial Management, Workday Human Capital Management (HCM), Workday Planning, and Workday Prism Analytics, empowering customers with the only cloud system to plan, execute, and analyse their business through one secure environment.

Up-to-date Benchmarks, Analytics, and Insights to Improve Performance
Traditional benchmarking services and reports typically provide outdated data delivered out of context and in silos. These services are not only costly, but difficult to manage due to their lack of flexibility around rules and calculations. Workday Benchmarking extends the analytical power of Workday and addresses these challenges by enabling customers to better understand their individual organisation’s relative performance using current, reliable data. Once customers elect to participate in the Workday Benchmarking service and select what data to contribute, their data is de-identified and added to a secure, aggregated dataset. In return, they get access to benchmarks representative of Workday’s participating customer community and can see how they compare against peers with similar demographics, such as industry or company size. Benchmarks are surfaced right in the Workday applications and dashboards that business leaders access most, so they are empowered to make informed decisions on what actions to take to improve their company’s competitive position – all without ever having to leave Workday.
With Workday Benchmarking, customers have ready access to a growing catalogue of benchmarks, including: Workforce Composition Benchmarks (Age, Diversity, Tenure, and more) – For example, HR leaders at a technology company could compare the percentage of males and females in their workforce with similar organisations to gauge how well they are tracking against diversity goals. Turnover and Career Retention Benchmarks (Talent, Turnover, and others) – For example, a talent professional in healthcare noticing an increase in attrition can see if his company’s voluntary turnover is higher than that of industry competitors, which may indicate a retention issue. Leadership and Manager Effectiveness Benchmarks (Span of Control and Leadership) – For instance, a chief accounting officer at a financial institution concerned that her people managers may be overwhelmed can compare span of control with managers across the industry to identify which teams may need more support. Workday Usage Benchmarks (System Utilization, Business Process, and others) – For example, an HR information systems executive in retail can find out if business processes are being completed faster or slower than peer organisations, to help determine if he should reconfigure business process definitions for greater efficiency. Financial Management Benchmarks – In future releases, benchmarks including core revenue growth and return on invested capital will be available to help customers better connect workforce metrics to financial success and strategic business goals. For example, a CFO at a professional services firm could compare revenue per employee against similar companies, then model against project staffing needs to ensure optimal staffing that delivers on billability while balancing costs. [easy-tweet tweet="Benchmarking data can be secured to a domain and shared with anyone in the organisation" hashtags="Data, Workday"] Like the entire Workday suite of applications, Workday Benchmarking leverages the power of one - one version of software, a single source of truth, one customer community, and a single security model and user experience – enabling customers to benefit from: Current and Reliable Data – Live, transactional data ensures aggregation that is consistent, accurate, and highly relevant, greatly reducing the time customers typically need to wait for access to benchmarks and calculations they can trust. Benchmarks in Context – Benchmarks can be configured to surface in the context of where customers most frequently interact with Workday. For example, a business leader can view a dashboard benchmarking their company’s actual performance against peers, and then quickly take action all in a single system. Built-in Security and Privacy – Workday Benchmarking relies on the highest privacy and security standards. All data contributed to Workday Benchmarking is de-identified and aggregated with similar data from a variety of sources prior to reporting. Distribution and Governance – Like all analytics in Workday, benchmarking data can be secured to a domain and shared with anyone in the organisation, allowing insights to reach the people who need them without compromising control over individual permissions. Flexibility – As calculations evolve, peer groups change, and metrics are added, Workday’s cloud architecture makes it possible for Workday Benchmarking to quickly adapt and immediately deliver new metrics to customers. 
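To make the de-identification and peer comparison described above concrete, here is a toy sketch (not Workday's implementation) that strips identifying fields from contributed records, aggregates a metric for a peer group, and compares one organisation against the resulting benchmark. The record fields, peer-group keys and values are all hypothetical.

```python
from statistics import median

# Hypothetical contributed records; "org" would never leave the contributor in practice.
contributions = [
    {"org": "A", "industry": "tech", "size": "1000+", "voluntary_turnover": 0.11},
    {"org": "B", "industry": "tech", "size": "1000+", "voluntary_turnover": 0.14},
    {"org": "C", "industry": "tech", "size": "1000+", "voluntary_turnover": 0.09},
    {"org": "D", "industry": "retail", "size": "1000+", "voluntary_turnover": 0.22},
]

def de_identify(records):
    """Drop direct identifiers, keeping only peer-group attributes and the metric."""
    return [{k: v for k, v in r.items() if k != "org"} for r in records]

def benchmark(records, industry, size, metric):
    """Median of a metric across the de-identified peer group."""
    peers = [r[metric] for r in records if r["industry"] == industry and r["size"] == size]
    return median(peers) if peers else None

pool = de_identify(contributions)
peer_median = benchmark(pool, industry="tech", size="1000+", metric="voluntary_turnover")
my_value = 0.16  # this organisation's own voluntary turnover
print(f"peer median: {peer_median:.2%}, ours: {my_value:.2%}",
      "-> above peers" if my_value > peer_median else "-> at or below peers")
```

A real benchmarking service would also enforce minimum peer-group sizes and other safeguards before releasing any aggregate, so that no contributor can be singled out.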
Comments on the news
“Data is the new currency, and Workday Benchmarking demonstrates how we continually deliver the analytical capabilities that enable our customers worldwide to unlock even more insights from their data and the broader Workday community,” said Joe Korngiebel, chief technology officer, Workday. “With Workday Benchmarking, our customers can tap into up-to-date, reliable data that gives an industry-wide view of how their individual business is performing compared to others, better equipping them to prioritise business initiatives, pick up new best practices, and strategically allocate resources in an instant.”

### Compare the Cloud will be at Cloud Expo Asia - Day 1

Cloud Expo Asia Day 1 - Morning Show: https://vimeo.com/237713242
Cloud Expo Asia Day 1 - Afternoon Show: https://vimeo.com/237711368
ABOUT: Compare the Cloud’s event coverage platform, Disruptive, has partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Tune in on the 11th and 12th October to watch live coverage from the Marina Bay Sands Expo and Convention Centre, Singapore. The broadcast will come from a specially built stand where attendees, exhibitors and speakers will be interviewed to update you on the latest cloud tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live at Disruptive! Schedule: Cloud Expo Asia Live, 11th October. Show 1: 11:00am - 12:30pm SGT. Show 2: 2:30pm - 4:00pm SGT. MORE INFO: With the cloud now part of everyday business, it is important to know the services and solutions that are critical to your cloud activity. Cloud Expo Asia will bring together more than 300 of the world’s leading suppliers, 350 expert speakers and thousands of attendees. For more information about guests and shows, visit the official site at: http://www.cloudexpoasia.com/

### Tune in to Watch Disruptive Live Stream at Cloud Expo Asia

About: Don’t forget, Compare the Cloud’s event coverage platform, Disruptive, has partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! The broadcast will come from a specially built stand at the Marina Bay Sands Expo and Convention Centre, Singapore. You will hear from exhibitors and speakers who will update you on the latest cloud tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live at Disruptive! Schedule: Cloud Expo Asia Live, 11th October. Show 1: 11:00am - 12:30pm SGT. Show 2: 2:30pm - 4:00pm SGT. MORE INFO: With the cloud now part of everyday business, it is important to know the services and solutions that are critical to your cloud activity. Cloud Expo Asia will bring together more than 300 of the world’s leading suppliers, 350 expert speakers and thousands of attendees. For more information about guests and shows, visit the official site at: http://www.cloudexpoasia.com/

### VOSS Offers End-to-End Microsoft UC Management by Enabling Microsoft Teams

Tuesday October 10, 2017 - Richardson, TX - Today, VOSS Solutions announced that its flagship product, VOSS-4-UC, can now enable users for Microsoft Teams to optimize an organization’s Microsoft UC experience. Microsoft Teams is the chat-based workspace component of Microsoft’s Office 365.
Microsoft’s unified communications (UC) roadmap indicates that Microsoft Teams will be the combined platform for persistent chat, instant messaging, collaboration, enterprise voice, and conferencing. Its flexible framework allows organizations and groups of employees to customize the collaboration environment to drive productivity and real-time communications. VOSS-4-UC, which has been ranked by Nemertes as the world’s #1 UC operational management platform, helps customers design, deploy, operate, and optimize the most advanced UC environments, on-premises or in the cloud. VOSS-4-UC enables users for Microsoft Teams as part of the user provisioning process, as well as for Skype for Business and Exchange. While Microsoft Teams is a cloud-based service that is part of Office 365, Skype for Business and Exchange services can be provisioned regardless of whether they are on-premises or online. Users who are migrated to Skype for Business from legacy PBX systems via VOSS-4-UC can be enabled for Microsoft Teams as part of that migration process. As Microsoft adds more administrative features for Microsoft Teams, those features will be added to VOSS-4-UC. Christopher Martini, Vice President at VOSS Solutions, commented: "VOSS has been providing enterprise-grade UC migration and management solutions for Microsoft UC environments for some time now. The evolution of Skype for Business into Microsoft Teams is a logical step for Microsoft, as they look to capitalize on the tremendous amount of interest in these tools at the enterprise level. We are excited by the momentum building around our solutions, as we introduce advanced automation to improve the productivity and collaboration of organizations around the world. Provisioning Microsoft Teams is a logical next step for VOSS, and we are pleased to be at the forefront of the UC management market in offering this crucial capability." Dana Vizzolini, Microsoft Global Alliance Director at VOSS, added: "There is a very exciting opportunity for enterprises that are starting to leverage Microsoft Teams. We see so many organizations struggling to communicate effectively across different locations, time zones, departments, and projects. A solution like Teams can bring their collaboration and communication efforts together into one central location. This is a large part of why we now use Teams here at VOSS." For organisations that are interested in learning more about moving to Microsoft UC and incorporating Teams into a communication and collaboration strategy, please visit http://www.voss-solutions.com/solutions/microsoft/office-365/.

### KnowBe4 Announces Growth and Focus on European Market

New-School Security Awareness Training Enables European Businesses to Effectively Combat Ransomware, CEO Fraud and Other Malicious Threats
London, England – 10 October 2017 – Understanding that the only way an organisation can be safe from phishing is to make sure end users are educated, KnowBe4 is formally bringing its new-school security awareness training to Europe, with an established beachhead in the U.K. The provider of the world’s most popular platform for security awareness training and simulated phishing attacks offers its services throughout Europe via a network of security-focused VARs and MSPs.
As attacks such as ransomware and CEO fraud continue to severely disrupt small and large organisations around the globe, KnowBe4 continues to innovate and develop effective ways to turn employees into a “human firewall” that can recognise and avoid cyber threats, thereby reducing organisational risk and protecting company assets. Earlier this year KnowBe4 issued The 2017 Endpoint Protection Ransomware Effectiveness Report, which found that endpoint security products are providing a false sense of security with regard to ransomware. Thirty-three percent of respondents had experienced a ransomware attack in the past 12 months, despite the fact that more than half of them (53 per cent) had deployed multiple endpoint solutions to protect against the malware. Yet respondents that combined online security awareness training with frequent phishing attack testing saw the lowest percentage (21 per cent) of successful ransomware attacks. “Ransomware is a runaway train for global businesses of all sizes, and it’s only one of many threats that a well-trained and tested organisation can prevent,” said Stu Sjouwerman, CEO of KnowBe4. “We’re able to expand to Europe because businesses know that our new-school approach to security awareness training can help them educate their users to avoid cyber threats, reducing their corporate risk in the process. We are working with a growing and very talented pool of distributors and managed service providers to bring KnowBe4 to businesses throughout Europe.” KnowBe4 last week announced its Q3 2017 results: sales were 2.63x greater than in Q3 2016. Q3 2017 marked the 18th straight quarter of growth, driven by increasing enterprise demand for its new-school approach to security awareness training. KnowBe4 is putting an emphasis on growth within the European market, especially the UK, as the European region has grown at 250 per cent year on year during the calendar year 2017. Within the U.K. specifically, KnowBe4 plans to triple its number of employees, double its number of partners and count more than 10 per cent of overall company revenue coming from the country by the end of 2018. “We know just how daunting a task it can be to keep an organisation secure. The relentless push by cyber criminals and the potential harm from ransomware, phishing attacks, CEO fraud, and other forms of social engineering can be substantial, costing billions of dollars,” said Perry Carpenter, Chief Evangelist and Strategy Officer at KnowBe4. “We want to help CISOs and IT Managers reduce the human error within their organisation, thus reducing their social engineering attack surface. The tools we’ve built quickly and easily determine the weakest links within an organisation by providing engaging security awareness training and deploying simulated phishing attacks to their employee base.” Since the start of 2017, KnowBe4 has expanded its partner program by 250 percent, more than doubling its channel partners. “Perimeter security solutions, which used to protect users even when they clicked the wrong link, no longer work on their own. It’s critical that employers train and support their entire workforce, which, until now, was a daunting and commonly unsuccessful task,” said Ian Kilpatrick, EVP Cyber Security for Nuvias Group, KnowBe4’s value-added distributor in the U.K.
“KnowBe4 is able to automate the task of training and testing employees so a company’s workforce can thwart social engineering-based attacks like phishing and ransomware. KnowBe4 has been a very good partner and one that has helped us serve our client base well.” “Ransomware, CEO fraud and other types of attacks are on the increase and are a real risk to the future of businesses, even though they are often preventable. People let these threats in, so we wanted a partner that could help educate users and make these attacks less successful,” said Tony Mason, director at Security Software Solutions, a KnowBe4 Premier Solution Delivery Partner. “KnowBe4’s combination of Security Awareness Training and simulated phishing tests is intuitive, effective and empowers our customers to take back control of their biggest security risk – their staff.” For more information about working with KnowBe4 or to inquire about partnering with the company, please visit: https://www.knowbe4.com/partnering.

### Fluke Networks Opens LinkWare Live Platform to Developers, Introduces LinkWare Live Affiliate Program

Brother, DYMO and Epson Join to Extend Benefits of the Fluke Networks LinkWare Live Cloud Platform for Cable Design, Installation and Management
Everett, Wash., October 10, 2017 – Fluke Networks introduced the LinkWare Live Affiliates program, inviting companies and developers to create services and products integrated with the LinkWare Live cloud platform that will benefit and better serve contractors and cabling installers. Fluke Networks LinkWare Live is the industry-leading cable certification project management platform, used by the professionals who install and maintain the infrastructure of today’s connected world – who have uploaded more than eight million test results to date. These contractors, installers and network maintainers are uploading test results at a rapidly increasing rate, recognizing and reaping the benefits of more efficient workflows thanks to the LinkWare Live platform and integrated products from Fluke Networks LinkWare Live Affiliates such as DYMO, Epson and Brother that promise even more productivity gains. According to Bob Briggs, President, Tri-City Communications, Norfolk, Virginia, “It’s so great to see companies pool their knowledge and integrate technologies to improve efficiencies for an entire industry. We’re using the Brother LabelLink app to pull cable ID data from Fluke Networks’ LinkWare Live cloud platform to print labels – and the efficiencies in time, cost and accuracy have really impacted our bottom line. Our installers are happier too.”

Brother, DYMO and Epson Join as New LinkWare Live Affiliates
Brother, DYMO and Epson have joined the Fluke Networks LinkWare Live Affiliates Program and recently announced applications integrated with the LinkWare Live cloud cable certification service. The partners’ products make the network and cable labelling process much more efficient, eliminating manual entry and time-wasting tasks by loading the labeller with data generated from LinkWare Live during the network design and installation stages. DYMO® ID software is integrated with Fluke Networks LinkWare Live and allows project managers, cable technicians and installers to easily access projects in LinkWare Live, import cable ID data, and use the built-in label application and pre-loaded templates to simplify labeling tasks. With LinkWare Live, installers can print labels directly or transfer labels to a DYMO® XTL label maker for editing and printing in the field.
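As a rough illustration of what these integrations automate (a toy sketch, not the actual DYMO, Epson or Brother apps, and not a real LinkWare Live export format), the snippet below turns a list of tested links into printable label strings, so a cable ID entered once in the project plan never has to be re-keyed at the labeller.

```python
# Hypothetical project data of the kind a cable-certification platform might export.
links = [
    {"building": "HQ", "floor": "01", "rack": "A3", "panel": "PP1", "port": 1},
    {"building": "HQ", "floor": "01", "rack": "A3", "panel": "PP1", "port": 2},
    {"building": "HQ", "floor": "02", "rack": "B1", "panel": "PP4", "port": 12},
]

def cable_id(link):
    """Build a simple, consistent cable identifier (an illustrative convention only)."""
    return f"{link['building']}-{link['floor']}{link['rack']}-{link['panel']}-{link['port']:02d}"

def labels(links, copies=2):
    """Two copies per link by default: one for each end of the cable."""
    return [cable_id(link) for link in links for _ in range(copies)]

for text in labels(links):
    print(text)  # e.g. HQ-01A3-PP1-01, ready to send to a label printer
```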
The Epson Datacom app, when paired with the Epson LABELWORKS PX LW-PX400 printer, streamlines network installation labeling with simple-to-navigate menus, TIA-606-B or customized formatting, and brand name patch-panel templates. This app is now upgraded to work with Fluke Networks’ VersivTM System to quickly print wire and cable identification directly from LinkWare Live. Brother LabelLink (known as iLink&Label outside the US) is a productivity and quality improvement app for cable installers using Versiv testers and Brother PT-E550W handheld printers. The award-winning app enables the contractor to access the project cable ID information in the LinkWare Live cloud via smartphone or tablet and then transfers it to the labeller over Wi-Fi. LabelLink eliminates the need for duplicate data entry and promotes faster, error-free labeling and easier testing. Times are a Changing: The Day in the Life of Today’s Installers “The daily work of installers worldwide is changing dramatically as they increasingly use LinkWare Live to design, install and manage cabling projects, uploading nearly half a million test results a month,” said Eric Conley, vice president and general manager of Fluke Networks. “As more applications are integrated with the LinkWare Live platform, installers will reap the benefits, including increasing time savings and business operations efficiency, which all add up to cost savings and more profits.” Before LinkWare Live, installers typically had to type each cable ID on a labeller onsite, a tedious, time-wasting process – and with cabling projects ranging from dozens to hundreds of thousands of links that meant installers were typing sometimes more than 500,000 labels – or more if there were errors, typos or miscommunication. It was one huge time waster and a significant business problem that was ripe for innovation. Now, companies and developers such as Brother, DYMO and Epson are integrating their products with Fluke Networks LinkWare Live and the Versiv family of testers, including the DSX-8000 CableAnalyzerTM to help transform the way installers work, making their jobs easier, more productive and efficient. [easy-tweet tweet="DYMO’s new partnership with Fluke Networks makes the labelling process much easier" hashtags="DYMO, Network"] DYMO, Epson and Brother Product Information DYMO - “Label identification is essential for datacom professionals to keep operations running efficiently, but it can be tedious,” said Adam Delange, director of marketing, DYMO. “DYMO’s new partnership with Fluke Networks makes the labelling process much easier. With Fluke Networks, DYMO users can now easily access LinkWare Live projects, import Cable ID data and use built-in label applications and pre-loaded templates, simplifying previously complex labelling jobs.” Epson- “The Epson LABELWORKS PX LW-PX400 works wirelessly in the field via Bluetooth and the Epson Datacom app, making it a perfect fit with LinkWare Live,” said Andrew Kasun, marketing manager for Epson LABELWORKS PX. “Working together with LinkWare Live, the LW-PX400 makes cable identification easy and seamless. 
Plus, the Epson Datacom app is designed for ultimate versatility with patch panel templates, self-lamination wire wrap or heat shrink tube templates and TIA-606-B formatting, so network installers have everything they need to label their work accurately and efficiently.” Brother - David Crist, president of Brother Mobile Solutions points out, “Our multi-year journey working with Fluke Networks has been an honour and a tremendous learning experience. When the LinkWare Live and LabelLink combined vision was first conceived by our two companies, we had high hopes for the kind of value this solution would bring to the Datacom contractor market. A year later, we both continue to receive validation from the market that the solution is delivering the kind of efficiencies we envisioned.” Fluke Networks LinkWare Live Affiliates Resources Companies and developers interested in the Fluke Networks LinkWare Live Affiliates Program and creating solutions based on LinkWare Live platform should contact info@flukenetworks.com. Learn more about LinkWare Live at www.flukenetworks.com/linkwarelive. ### Alfresco Once Again Positioned as a Challenger in Gartner 2017 Magic Quadrant for Content Services Platforms Report Position as a Challenger Is Based on the Company’s Completeness of Vision and Ability to Execute Oct.10, 2017 — Alfresco Software, a leading open source provider of enterprise content management and process automation, has announced that Gartner, Inc. has named Alfresco as a Challenger in its October 2017 Magic Quadrant for Content Services Platforms, based on its ability to execute and completeness of vision. 2017 marks the second year in a row that Gartner has placed Alfresco as a Challenger in the evolving Enterprise Content Management (ECM) market. “We believe being recognised as a Challenger in Gartner’s 2017 Magic Quadrant for Content Services Platforms is validation of Alfresco’s ongoing innovation and execution in the rapidly evolving content services market,” said Thomas DeMeo, vice president of product management at Alfresco. “Alfresco’s Digital Business Platform is a great fit for organisations that are looking to differentiate themselves by delivering engaging customer experiences at scale. For organisations with digital transformation initiatives, our open source roots and transparent approach are appealing to developers, content professionals and architects alike. “We have expanded our technology partnerships to advance differentiation and innovation, including being the first content services provider on Amazon to launch an Alfresco instance with an Amazon Web Services Quick Start. By providing an extensive set of open APIs, developer tools and cloud-ready deployments, we build a level of agility, customer engagement and loyalty that is fundamentally different than many of our competitors,” added DeMeo. 
“Gartner’s new definition for this market is as follows: A content services platform is a set of services and microservices, embodied either as an integrated product suite or as separate applications that share common APIs and repositories, to exploit diverse content types and to serve multiple constituencies and numerous use cases across an organisation.” Alfresco’s cloud-ready and developer-friendly open source software powers the daily work of millions of people, addressing the unique needs of financial services, healthcare, insurance, government, manufacturing, and publishing companies in countries around the world. Alfresco’s inclusion in the Magic Quadrant was based on Gartner’s evaluation of Alfresco’s enterprise-wide capabilities and vertical use cases. The full Gartner report can be downloaded for free.

### FatPipe Introduces the First Enterprise Multi-function VNF Solution

New integrated VNF/SD-WAN solution allows enterprises to virtually deploy more network functions
SALT LAKE CITY, UT – Oct. 10, 2017 – FatPipe® Networks, the inventor and holder of multiple patents in software-defined networks for wide area connectivity and hybrid WANs, today announced the addition of a multi-function VNF (Virtualized Network Function) to its family of SD-WAN solutions. FatPipe VNF™ integrates SD-WAN functionality with routing, firewall, security, DPI (Deep Packet Inspection), QoS, WAN Optimization and DNS (Domain Name System) management to enable the digital transformation of NFV (Network Functions Virtualization). This high-performance VNF solution gives enterprises and organizations the ability to easily add and deploy more NFV without the complexity of multi-vendor interactions. FatPipe VNF has been implemented on several NFV hypervisors (virtual machine monitors), including OpenStack, AWS, Azure, VMware and Wind River. “The FatPipe VNF could help enterprises more easily deploy NFV, which has previously been constrained by the difficulties involved with the multiple vendor components required for a successful NFV implementation, including OpenStack,” said Jim Duffy, Senior Analyst, Networking, at 451 Research. FatPipe VNF is also available on FatPipe-branded hardware. When selecting FatPipe hardware, customers will have the option of choosing purpose-built hardware or hardware built with OpenStack and the FatPipe VNF pre-installed, the latter allowing third-party VNFs to be loaded onto FatPipe hardware. FatPipe has been shipping SD-WAN products since 2001. Virtual network functions can be added from FatPipe’s growing ecosystem of VNF partners, or from any OpenStack-compatible provider.
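The licensable feature list that follows includes Internet Application Path Control. As a rough sketch of the general SD-WAN idea (not FatPipe's implementation), the snippet below scores each WAN link from measured latency, loss and cost, weighted per application, and steers traffic to the best-scoring link. The link names, metrics and weights are all hypothetical.

```python
# Hypothetical measured state of each WAN link.
links = {
    "mpls":        {"latency_ms": 35, "loss_pct": 0.1, "cost_per_gb": 8.0},
    "broadband-1": {"latency_ms": 22, "loss_pct": 0.8, "cost_per_gb": 0.5},
    "lte-backup":  {"latency_ms": 60, "loss_pct": 1.5, "cost_per_gb": 12.0},
}

# Per-application weighting: real-time voice cares about latency/loss, bulk backup about cost.
profiles = {
    "voice":  {"latency_ms": 1.0, "loss_pct": 50.0, "cost_per_gb": 0.1},
    "backup": {"latency_ms": 0.1, "loss_pct": 5.0,  "cost_per_gb": 2.0},
}

def best_path(app):
    """Pick the link with the lowest weighted penalty for this application."""
    weights = profiles[app]
    def penalty(metrics):
        return sum(weights[k] * metrics[k] for k in weights)
    return min(links, key=lambda name: penalty(links[name]))

print("voice  ->", best_path("voice"))   # favours the low-loss MPLS link
print("backup ->", best_path("backup"))  # favours the cheap broadband link
```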
FatPipe VNF’s integrated, licensable features include:
- SD-WAN bandwidth spectrum ranging from 1Mbps to 10Gbps
- Internet Application Path Control
- FIPS (Federal Information Processing Standards) 140-2 certified VPN encryption
- IPS (Intrusion Prevention System) / IDS (Intrusion Detection System)
- DDoS (Distributed Denial of Service) mitigation
- DHCP (Dynamic Host Configuration Protocol) server
- Inbound DNS (Domain Name System) management
- Web filtering
- WAN Optimization / bandwidth reduction
- Granular and holistic network visibility
- OpenStack-based host platform for hosting third-party VNFs

“OpenStack is becoming the standard for virtualization of network functions,” said FatPipe’s CTO Sanch Datta. “By building an integrated VNF/SD-WAN that’s branded-to-deployment of OpenStack, we can reduce the complexities and barriers to entry for enterprise NFV deployments. We expect this to have a similar benefit to NFV adoption as commercially branded Linux software had for enterprise adoption.”

### Nuance speech recognition tool enables GPs to treat four more patients every day

Nuance Communications Inc. today announces that Dukinfield Medical Practice has deployed Dragon Medical Practice Edition to help the practice stay one step ahead of its medical document workflow while freeing up doctors to spend more time with their patients. The deployment, in partnership with Nuance Healthcare Connections Premier Partner Freedom of Speech, has enabled the team to process clinical documents more quickly and accurately. Previously, the practice’s six GPs had been using digital voice recorders to dictate notes that were then transcribed by secretaries, a costly process that was prone to backlogs. The Nuance speech recognition software has enabled Dukinfield Medical Practice’s GPs to transcribe detailed, accurate notes directly – the team is already enjoying accuracy rates of 95 percent, and accuracy improves each time Dragon Medical is used. The speed with which reports are completed has enabled the practice to treat four more patients per day – this is particularly important for the practice, which is facing the challenge of patient numbers rising by approximately 5 percent a year. The GPs have also reported that patient letters are now much more detailed than before, capturing a complete view of the patient’s story which, in turn, leads to better care. The new process has also enabled the secretarial team to concentrate on other, more patient-focused tasks. Practice Manager Julie Pregnall explained, “Like many medical practices, we were facing the challenge of maintaining our high-quality patient care and serving the community against the backdrop of rising patient numbers. This meant that we needed to look at our processes to see where we could drive further efficiencies. With Nuance’s clinical speech recognition solution, we’ve been able to improve the speed and quality of our medical records, which previously were quite fragmented.” Simon Wallace, Chief Clinical Information Officer at Nuance Communications, continues, “GPs have a difficult balancing act to play with medical records. Whilst they know that more detailed reports with greater context around the patient’s lifestyle can support better ongoing care, doctors are time poor and want to treat as many patients as they can in the time they have.
I’m delighted that Dukinfield Medical Practice has seen such productive results – for the GPs, secretaries and patients alike – with little disruption to the way the doctors work.” Dragon Medical Practice Edition is designed specifically for GP surgeries and medical centres. It provides advanced clinical speech recognition capabilities, including a wide range of medical vocabularies and macros. Dragon Medical makes it quick and easy to update electronic patient records – freeing up more time to care for patients.

### Learning to Trust Cloud Security

By Larry Biagini, chief technology evangelist at Zscaler
Cloud-centric computing is inevitable, so you need to face your concerns and be realistic about risks. After more than 35 years running IT for large enterprises, I’ve lived through various IT technology shifts: mainframe, client/server, RISC, CISC, etc. But early on in the development of the cloud, I recognised that the shift to becoming a cloud-enabled business is different. In enterprise IT, cloud security remains a topic of contention. Many IT and security leaders fear that a move to the cloud could cause problems, such as losing control of sensitive data. While concerns about risk are understandable and need to be addressed, they’re often misplaced. It’s time businesses were honest with themselves about in-house capabilities before dismissing security in the cloud. Traditional enterprise security is based on perimeter controls – a model that was designed for a world where all data, users, devices, and applications operated within the perimeter and within the security controls. But as today’s users blur the lines between activity inside and outside the perimeter, that model falls short because the perimeter is too big. I’d even say that in any mid- to large-size enterprise, there are more devices, users, and entry/exit points than the company knows about. Cloud-centric computing is inevitable because the network, not your network specifically, is just a conduit that allows access from trusted requestors to trusted resources. You will provide resources to those that you trust, when they need them and where they need them. The perimeter that needs protecting will be very small, containing services and properties that are critical to your business, but not users. Users consume resources but are never on the cloud provider’s core network. If they were, their perimeter could not be protected. As you evaluate security in the cloud, be realistic about the risks, because deferring the transition to cloud services is itself a risky proposition.

Businesses already rely on the cloud
What kinds of companies are leveraging the cloud today? Yours, for one. Even if you don’t officially sanction any cloud services or applications, your employees are using them. So are your customers, suppliers, and business partners. Services that support file sharing, online collaboration, storage, and other daily activities are all hosted in the cloud. There’s no getting around the fact that data is already being generated and shared there; business transactions are also happening, and new business models are emerging. The primary drivers for cloud adoption are speed, agility, and cost containment. For me, speed is the new currency. Business won’t wait for anyone or anything, and IT is no exception. Because of lingering security concerns around control and reconfiguration, many businesses still rely on the private cloud model or use a hybrid approach that retains mission-critical data and applications on-premises.
This is necessary in some cases, but not in most. If you allow the some to become the all, you'll be missing the train, and your business will leave the station without you. For many, it already has. [easy-tweet tweet="Cloud security providers are similarly able to identify and patch threats" hashtags="Cloud, Security"] In the cloud, software providers can immediately update or upgrade customers. Cloud security providers are similarly able to identify and patch threats and vulnerabilities across thousands of companies at record speed, thanks to the benefit of multitenant cloud architectures. Financial institutions, for example, will want to maintain their "crown jewel" applications in their own data centre, but when it comes to new applications, building infrastructure to maintain a Web application or mobile application simply makes no sense. Companies such as Betterment and Kabbage are using financial technology to push the limits on traditional banking, leveraging a user interface that appeals to the customer and allows those businesses to operate without the huge infrastructure of traditional finance organisations. Plan for the journey As you begin your journey, enlist the help of public cloud and software-as-a-service providers. Learn how they think and operate. Check the "us vs them" attitude at the door and be realistic about your capabilities. Their reputations rely on their ability to execute and to do it securely. There's a reason the National Security Agency, for example, turned to Amazon Web Services to build the NSA cloud — instead of attempting it on its own. It's OK to learn as you go. Many organisations have approached the move to the cloud as they would any major IT transition. They analysed it and tried to glean as much as they could about the cloud and how it's provisioned, managed, and secured. That's not all bad, but the traditional vetting and risk processes slowed them down. Ultimately, the lesson learned has been: just do it. Don't let outdated notions around security stand in your way to modernise. So, start with taking your low-risk apps — you probably have hundreds — into the cloud. As you take that first step, you'll begin to see dividends in production, efficiency, and cost, and they will only increase over time. The cloud makes you more secure Once you get past the initial holdups, the cloud opens a massive opportunity to keep your users, applications, and data safe, thanks to the benefits of shared threat protection. You will need to hire talent that eats, sleeps, and breathes cloud to supplement your current workforce, but you will no longer be locked in competition for infrastructure, networking, and security talent with the likes of Amazon, Microsoft, or Google. You don't have to make the entire jump at once. You can merge cloud services and applications into your existing infrastructure, chipping away at the legacy stack a little at a time. Trust those who understand the cloud, and hire people who know how to secure and take advantage of it; a few key people can have a multiplier effect. Just ensure that they are apprised of the future strategy of your business — it's a joint growing process. In the end, it's all about trust.   ### Testing 1-2-3:  Three Reasons Why Cloud Testing Matters   It has been nearly three years since an Amazon Web Services senior executive said: “Cloud is the new normal”.  
Since that time, the momentum behind cloud migrations has become unstoppable, as enterprises look to take advantage of the agility, scalability and cost benefits of the cloud. In its 2017 State of the Hybrid Cloud report, Microsoft found that 63 percent of large and midsized enterprises have already implemented a hybrid cloud environment, consisting of on-premise and public cloud infrastructures. Cisco’s latest Global Cloud Index predicted that 92 percent of enterprise workloads would be processed in public and private cloud data centres, and just 8 percent in physical data centres, by 2020. So the future is cloudy, with enterprises adopting hybrid cloud strategies using services from a mix of providers. But irrespective of the cloud services they use, or the sector in which they operate, all enterprises share common goals: they want their business applications to deliver a quality user experience under all conditions; they want those applications to be secure and resilient; and they want them to run as efficiently as possible.

Shared responsibility
However, achieving those goals is not always straightforward. To paraphrase computer security analyst Graham Cluley, the public cloud is simply somebody else’s computers. While the provider should offer a strong foundation for high-performance and secure applications, the enterprise still needs to assume responsibility for the security, availability, performance and management of the processes associated with those applications, because that responsibility cannot be abdicated. More importantly, the enterprise is responsible for properly configuring and managing the security controls provided by the cloud provider. Let’s examine the challenges enterprises face in ensuring their cloud applications are secure, deliver a quality user experience and are cost-efficient.

Challenge #1: Cloud security
Achieving robust security in the cloud is challenging for three reasons. First, understanding an organisation’s current security levels, where additional protection is needed and where potential vulnerabilities may lie, is difficult regardless of whether the environment is on-premise or in the cloud. As there are more and more security products and platforms to manage across complex hybrid environments, having a single comprehensive view of the security posture becomes more difficult. Second, the highly dynamic nature of cloud environments, coupled with an ever-widening cyber threat landscape, requires security in those environments to be similarly flexible and fluid. Policies need to scale up in line with the infrastructures they are protecting. Third, there is a shortage of security expertise, with IT teams already stretched to manage the tools and processes in place across the hybrid environment. Cloud security solutions also generate huge volumes of security events, making it difficult for personnel to prioritize and remediate risks.

Challenge #2: User experience
Different applications have slightly different SLAs and user expectations – think about the difference between a training sandbox and a real-time online retail application – but user experience is typically predicated on two things: application performance and service availability. When these are compromised, user dissatisfaction can quickly translate into the loss of business.
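Since user experience is framed here in terms of application performance and service availability, the following is a minimal sketch of the kind of measurement a testing programme might collect in pre-production: repeated timed requests against an endpoint, reported as availability and latency percentiles. The URL, request count and thresholds are placeholders, and a real test harness would generate far more realistic load.

```python
import time
import urllib.request
from statistics import quantiles

URL = "https://example.com/"   # placeholder endpoint under test
ATTEMPTS = 50

latencies, failures = [], 0
for _ in range(ATTEMPTS):
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(URL, timeout=5) as resp:
            resp.read()
        latencies.append((time.perf_counter() - start) * 1000)
    except OSError:
        failures += 1

availability = 100 * (ATTEMPTS - failures) / ATTEMPTS
if len(latencies) >= 2:
    cuts = quantiles(latencies, n=20)   # 19 cut points: 5%, 10%, ..., 95%
    p50, p95 = cuts[9], cuts[18]
    print(f"availability {availability:.1f}%  p50 {p50:.0f} ms  p95 {p95:.0f} ms")
else:
    print(f"availability {availability:.1f}% (too few successful requests to report latency)")
```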
The complexity of multiple design choices in the public cloud, from hardware architectures to instance types optimized for different applications, makes guaranteeing a consistent user experience that much more complicated. Factors such as the underlying cloud infrastructure hosting the application, the network connectivity between user and application, the performance of application delivery elements (for example, session load balancers), and the actual design and architecture of the application can all impact the user experience.

Challenge #3: Cost and efficiency
Cloud providers enable a variety of options to build cost-effective, scalable and highly available applications. From utility-based models with on-demand charges to reserved price options and spot instances or price bidding, there is flexibility for an enterprise to choose the model that suits its needs. The challenge is to identify which is best. Cost optimisation is a case of weighing price and performance according to the precise needs of the organisation in question. Settings and architecture designs must be optimized to deliver the required application auto-scaling, and to support demand peaks and troughs as they occur. Design choices relating to securing workloads range from security endpoints running inside each instance, to network security appliances in various locations, to a security control offered by the cloud provider. Each of these choices operates at a different cost rate, impacts application performance in different ways, and delivers a different level of security effectiveness. Given this complexity, understanding how to select the most efficient solutions is not an easy task unless organisations can model the applications and the threat vectors targeting them.

Meeting the challenges: how testing can provide value
To meet these challenges, organisations migrating some or all of their workflows to the cloud must be prepared to embed consistent testing into their processes, in both pre-production and production. There is a direct relationship between test and risk – by getting testing procedures right from the start, enterprises can dramatically reduce their risk exposure and ensure they successfully harness the full benefits of the cloud. In pre-production, before a cloud migration takes place, testing can provide quantifiable insights to empower security architects, network architects and security teams during vendor selection, performance and cost optimisation, scaling up, availability planning, and training. For example, on the vendor selection side, assuming the functional requirements are met, procurement managers need to ascertain which public cloud vendor is cost-efficient in terms of price and performance. They need to establish which of the available tools for securing application workloads are efficient, secure and, ultimately, ideal for their specific requirements. Moving on to questions of performance and cost optimization, IT and security managers need to confirm how security policies and architectures can be optimized, and what the best settings are for an auto-scaling policy. These decisions are based on a range of factors, from memory utilisation to new connection rates, and again, consolidating and analyzing those factors can only be done via a rigorous, real-world testing process. Then there are questions around how the cloud architecture will perform once deployed. Where are the bottlenecks in the application architecture as it scales?
How fast will applications self-recover from errors, and how will the user experience be impacted if some application services fail? Testing from pre- to post-production Answering these questions requires an extensive pre-production testing program, with realistic loads, modelled threat vectors and failover scenarios. This provides the assurance that the cloud architecture will empower rather than restrict the business. It also enables security engineers and analysts to understand better what they are working with. And testing must not end once a cloud environment has gone live. Production-stage, continuous testing is essential to monitor for service degradations, while continuous security validation is essential to provide security service assurance. In conclusion, as the cloud is the new normal, continuous testing of cloud workloads needs to be embraced as the new normal too, at all stages of application deployment and delivery. Testing is the only means of ensuring that organisations can fully realize the benefits of the cloud, without the risks of security breaches, poor user experience, or unnecessary costs. www.ixiacom.com ### Digital Transformation driven by increasing competitive threats rather than innovation Monday 9 October: Competitive pressure is driving 70% of digital transformation projects globally, according to new research out today. With over a third of companies believing that failure to complete digital transformation projects will result in competitors taking advantage, market concerns are high when it comes to bringing internal processes and infrastructure up to date. The survey of 300 global B2B organisations found that competitive pressures are coming from all sides: 38% believe the threat is from existing companies; 35% see pressure from new online market entrants; and 34% from cheaper overseas suppliers. To compete effectively and to enhance the customer experience, 73% of global organisations believe that e-commerce plays an important or vital role in digital transformation. According to the research from Sana Commerce, which was produced by Sapio Research, nearly half of respondents say that e-commerce is at the heart of digital transformation because it’s responsible for the customer experience. [easy-tweet tweet="Two-thirds of organisations have a digital transformation strategy, and 28% are acting fast" hashtags="DigitalTransformation, Digital"] Two-thirds of organisations have a digital transformation strategy, and 28% are acting fast – rolling it out as quickly as possible – yet only 4% have completed the project, highlighting some disconnects between what IT wants to deliver and what it’s able to execute. There are a number of challenges that threaten the success of digital transformation projects, the survey found. Unsurprisingly, legacy technology was the biggest challenge, with 41% naming it as an issue, but over a third believe that it is internal resistance to change that could hamper success. Michiel Schipperus, CEO of Sana Commerce, comments, “It’s do or die when it comes to digital transformation projects for B2B organisations. We know that companies are feeling the pressure from increasingly online businesses competing in new territories.
The research found that 72% of organisations believe they will sell 100% of their products online in the future, making markets wide open to further online competition.” The B2B market is tackling competitive pressures by using technology to deal directly with end users: 73% state that they will sell directly in the future. This may explain the rise of B2B marketplaces, whose use is expected to grow by a third over the next two years. In an industry that often plays catch-up to B2C, the B2B market is using digital transformation to keep the organisation competitive. Over two-thirds will use the Internet of Things (IoT) and Machine to Machine (M2M) to enable predictive ordering or automation; 68% will see sales completed using mobiles; 67% will use virtual reality to personalise the buying experience, and 61% will use driverless cars and drones for delivery. For more insight into the role that e-commerce plays in digital transformation and how the process is being delivered in companies around the world, download the report here. ### TeleWare appoints BT heavyweights to strengthen channel offering TeleWare, a leading communication technology business, has announced the appointments of Andrew Collis and David Neal. Both individuals bring years of invaluable sales experience to TeleWare, having formerly worked at BT for several years. Andrew Collis Joining TeleWare as a senior channel manager, Andrew Collis will be responsible for all aspects of channel management and development for the BT Group, one of TeleWare’s key partners. Andrew worked at BT for eight years, where he oversaw the sales strategy and activity aligned to the conferencing sector. Prior to TeleWare, he worked at Vodafone as a principal sales solution advisor. Andrew has over 18 years’ experience in the UK channel, with a very strong track record of overachieving sales targets. Andrew Collis comments, “TeleWare shines as a company, offering great opportunities and great products. The business has heavily invested in its partners and I am looking forward to being involved in the front line of sales and helping to create a strategy and vision for the BT Group.” David Neal David Neal joins TeleWare as a senior partner, business development manager. David has over 30 years’ experience in commercial sales including 15 years at Infonet Corp and eight years at BT Global Services. Prior to TeleWare, David was a business consultant at Larato Limited. David Neal comments, “TeleWare has an excellent reputation in the channel. I have been involved in the telecommunication sector for most of my career and for many years, I have watched TeleWare standing out above the competition. I was drawn to the company due to its innovation, unique positioning within the channel and a very impressive team.” Steve Haworth, CEO of TeleWare, said, “TeleWare is on an ambitious growth trajectory over the next three years and now is the time for experienced, ambitious individuals such as Andrew and David to join the fold.” ### Productivity, Data and the Burden of BATs We all have aspects of our jobs that we’re not quite so enthusiastic about. Whether it’s compiling reports, giving presentations or attending networking events, more often than not we dislike these tasks not because they’re unnecessary or boring, but because we’re perhaps not so naturally adept at them.
In reality, only by confronting those tasks we find uninteresting or feel less comfortable with do we actually improve our skillsets, grow professionally and increase our input to the business as a whole. No pain, no gain, after all. However, when highly-trained employees are forced to perform tasks that add no value, this dynamic shifts. The second part of our Disconnected Data research, on Productivity Pains, has revealed that many business users and IT workers are being burdened with tasks that they not only find boring and repetitive, but that are actually having a significantly damaging effect on productivity. All in all, these Boring Automatable Tasks (BATs) are being performed by nine in ten of the business users we surveyed, and businesses are wasting 24 working days a year per employee by asking their skilled employees to perform these rote tasks. So what exactly falls into the BATs barrel? Tasks such as data entry, searching for data, accessing archived data, and data processing were signalled as the most burdensome BATs by our respondents. All of these are tasks that should, in 2017, be easily automatable. The Productivity Impact With large businesses wasting so many man-hours per employee, these tasks are also severely impacting productivity on a nationwide level. It’s no secret that the UK is mired in a productivity crisis. British workers are currently 23 percent less productive than their French counterparts, and 27 percent less productive than the German workforce. Whilst BATs are certainly not the only factor at play in this issue, they’re clearly still a significant one. It’s highly unlikely that this productivity issue will resolve itself. Last month, the Office for National Statistics released figures which revealed that the productivity of the UK’s workers, measured by their output per hour, fell 0.1 percent in the second quarter of 2016 following a 0.5 percent fall in the first quarter. This means that the workers of 2017 are actually slightly less productive than those of 2007. Considering the technological advances that have been made in the past 10 years, it raises the question – why are these tasks being performed manually, and why are businesses sacrificing nearly a month’s worth of workdays per employee every year? Disconnected Data Strikes Again Whilst the first stage of our Disconnected Data research revealed how siloed data and a lack of collaboration are hindering innovation and costing organisations billions, the second stage revealed that disconnected data is the root cause of these productivity pains and BATs, and it’s frustrating workers. In fact, 64% of our respondents pointed to poor data integration practices as an issue affecting productivity. [easy-tweet tweet="On average, large organisations use seven different business apps and systems regularly" hashtags="Business, Data, App"] This shouldn’t be especially surprising when one considers how many applications and systems, and departments and regional business units, are involved in large businesses such as the ones we surveyed. Nearly all of our respondents (98%) indicated that they were involved in projects that rely on company data from multiple systems and departments. On average, large organisations use seven different business apps and systems regularly, with IT decision makers typically using more than business users (eight compared to five). Companies in the financial services sector use even more, clocking in an average of nine apps.
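To make that concrete, the short sketch below shows the sort of cross-system check that often ends up as a BAT. It is a minimal, hypothetical Python example: the system names, fields and records are invented, and a real job would read exports from the actual applications rather than in-memory lists.

```python
# Minimal, hypothetical sketch of automating one common BAT: cross-checking
# records that live in two different systems. Data and field names are
# invented for illustration only.
crm_export = [
    {"customer_id": "C001", "name": "Acme Ltd", "status": "active"},
    {"customer_id": "C002", "name": "Globex", "status": "active"},
    {"customer_id": "C003", "name": "Initech", "status": "closed"},
]
finance_export = [
    {"customer_id": "C001", "outstanding_invoices": 2},
    {"customer_id": "C002", "outstanding_invoices": 0},
    {"customer_id": "C004", "outstanding_invoices": 1},
]

def reconcile(crm, finance):
    """Flag records that appear in one system but not the other."""
    crm_ids = {row["customer_id"] for row in crm}
    finance_ids = {row["customer_id"] for row in finance}
    return {
        "missing_from_finance": sorted(crm_ids - finance_ids),
        "missing_from_crm": sorted(finance_ids - crm_ids),
    }

if __name__ == "__main__":
    report = reconcile(crm_export, finance_export)
    for issue, ids in report.items():
        print(issue, ids)   # a person only needs to review the exceptions
```

The point is not the few lines of code but the shift they represent: the skilled employee reviews the exceptions rather than keying and searching through every record by hand.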
With data not properly integrated this means, inevitably, that much of this painstaking work will have to be done by an employee or employees. This, of course, is not the most efficient means of performing this kind of task, and it certainly doesn’t help with employee morale. Unsurprisingly, nearly two-thirds (61%) of our respondents expressed frustration that important projects often suffer delays caused by poor data integration. Integration and Efficiency It’s no coincidence then that, with nine in ten business users forced to spend time on BATs, the same proportion (91%) see connecting data, applications and systems as a beneficial move for their organisation. In fact, over a third (likely the most heavily involved with BATs) see it as essential. Additionally, our respondents speculate that if these data gaps were closed, they could see a 28% boost in efficiency. Our research has shown that the impact of disconnected data on productivity is a powerful one, and could have immediate benefits for businesses if addressed. However, there’s a longer-term story at play here. AI and automation are set to be two of the key business technologies for the future and, although nascent, promise to revolutionise job roles and how businesses operate. If organisations want to be prepared to embrace this AI future, they’d be wise to get their house in order now, rather than playing catch up further down the line. ### Alibaba Cloud and Fusionex Enter into Strategic Partnership Joining hands to promote digital transformation within the ASEAN region  Alibaba Cloud, the cloud computing arm of Alibaba Group, and Fusionex, a leading data technology provider specialising in Big Data Analytics (BDA), Internet of Things (IoT) and Artificial Intelligence (AI), today announced a partnership to provide industry-leading end to end cloud solutions in the ASEAN region.  Under the partnership, Fusionex will deploy its award-winning big data solutions on Alibaba Cloud’s powerful, scalable and cost-effective infrastructure, and will become the key Alibaba Cloud go-to-market partner in the ASEAN market combining Alibaba Cloud services with its offerings to provide clients with a total solution. [easy-tweet tweet="Enterprises seeking to use Fusionex’s offerings will be able to access them via Alibaba Cloud" hashtags="Cloud, Enterprises"] By leveraging Alibaba Cloud’s cloud computing infrastructure and sophisticated data intelligence services, ranging from data processing to machine learning, Fusionex will be able to provide advanced big data solutions for its increasing base of customers, especially in sectors such as financial services, trade facilitation, e-commerce, retail, aviation, travel and hospitality as well as manufacturing. Enterprises seeking to use Fusionex’s offerings will be able to access them via Alibaba Cloud. The two companies will also join forces in driving innovation in BDA, Machine Learning, AI and IoT. Alibaba Cloud will leverage its infrastructure and solutions to empower Fusionex’s innovation centre to develop technology and platforms that will drive the digital transformation of the region. “As we enter a new era of digital transformation, businesses across the ASEAN region require new levels of support to tap into the emerging opportunities. 
Machine intelligence and cloud computing combined with big data analytics will play a crucial role as the computing infrastructure,” said Raymond Ma, Head of ASEAN & ANZ, Alibaba Cloud. “We are very pleased to partner with Fusionex, who shares our vision of developing the next generation of technology, and empowering our customers with drivers of change.” Ivan Teh, Fusionex Chief Executive Officer, said, “Today’s partnership marks an exciting step towards Fusionex’s goal of further advancing our technology and expanding our reach. We are delighted to partner with Alibaba Cloud to support the digital transformation of ASEAN countries. The sophisticated cloud services provided by Alibaba Cloud and data technology offered by Fusionex will complement each other and create synergy in servicing our customers more swiftly and effectively as well as providing them the opportunity to scale at a much faster rate.” ### New Atos paper outlines major advances in cyber security London, 2 October 2017 – Atos, a global leader in digital transformation, believes waiting for digital threats to materialise will soon be consigned to history as we enter a new age of predictive cyber security capability able to ward off cyber threats even before they occur. Digital Vision for Cyber Security offers a range of expert views on the scale of the challenge as cyber criminals become more sophisticated and Internet of Things security comes to the fore. The paper highlights new thinking to build cyber security resilience for the next generation by harnessing automation and machine learning to understand – and predict – the threat landscape. Presented by some of the leading subject matter experts within Atos and across the public and private sectors, including McAfee, Darktrace and techUK, this new paper positions cyber security as a critical element of digital transformation - demonstrating that organisations across the public and private sectors cannot truly transform without securing themselves against an ever-expanding range of ubiquitous, sophisticated and rapidly changing cyber threats. Digital Vision for Cyber Security covers the role that cutting-edge technology will play in responding to these challenges and the balance organisations have to achieve between risk management, data sensitivity and the costs that these challenges present. It also touches on the cyber skills shortage and how it might be tackled. Speaking ahead of its publication this week, Adrian Gregory, CEO, Atos UK&I, said: “For years, the received wisdom for how to secure technology and data was a firewall, equivalent to a ‘lock on the door’. Today the answer is more complex. “In a world where technology has burst out of the comms room into almost every area of life, how do we install walls and locks everywhere? Clearly, our approach to cybersecurity must adapt. “In this paper we have collected a range of expert views on the size of the challenge and also the new thinking that can give us confidence in the face of the vast array of threats that exist in the cyber world.” [easy-tweet tweet="A more automated cyber security approach is essential " hashtags="Security, Cyber"] Pierre Barnabé, Executive Vice-President, General Manager Global Division Big Data & Security, Atos: “A more automated cyber security approach is essential to address the sheer scale, complexity and volatility of risks in the digital age. As Robert S. Mueller, ex-Director of the FBI, once said, we cannot undo the impact of technology – nor would we want to.
We must use our connectivity to stop those who seek to do us harm.” Mariana Pereira, Director at Darktrace, the world leader in AI technology for cyber defence, said: “In an age of limitless data and complex networks, there is simply too much happening, too quickly, for legacy security tools to be able to deal with. Instead of trying to predict the hallmarks of the next WannaCry, organisations are increasingly turning to the latest advancements in artificial intelligence to detect and autonomously respond to in-progress attacks, before they have inflicted damage.” In July this year Atos launched the prescriptive Security Operations Center (SOC). Combined with Atos Big Data analytics capabilities and powered by Bullion servers, the new security solution makes it possible for customers to predict security threats before they even occur. Detection and neutralisation time is improved significantly compared to existing solutions. ### What’s next for ERP, e-Commerce and the Cloud? Once the domain of large enterprises, ERP has been fundamentally reshaped by the cloud and the way it is delivered has changed. Cloud-based offerings from the likes of Microsoft and SAP, and more recent solutions, such as NetSuite, were created natively in the cloud, removing the need to implement large, inflexible on-premise solutions. Now accessible to smaller companies, cloud-based ERP is being used to support business growth through integrated solutions, such as e-commerce, in new ways. But how has the cloud impacted ERP and integrated solutions like e-commerce? And what’s on the horizon for these technologies? In the past, on-premise ERP incorporated all of the business’s systems throughout the organisation. More recently, however, we witnessed companies moving away from a single, monolithic solution and instead reverting their attention back to specialised, single systems. Take CRM as an example: once standalone, then incorporated into the ERP solution, the increasing focus on customer experience has driven it to become a business’s main customer information management system. This fundamental shift and rejection of the single-solution approach is offering companies more appropriate and beneficial platforms from which to drive organisational growth. [easy-tweet tweet="The cloud offers a platform to integrate all these component systems" hashtags="Cloud, Technology"] The cloud offers a platform to integrate all these component systems, making it easier for businesses to build flexible solutions that fit their specific needs. They can upgrade single systems more quickly and more effectively using the cloud, and only have to upgrade the components that they need. This has helped companies overcome huge hurdles when looking to deploy new solutions across the business. Innovations in cloud delivery, such as Dynamics 365, have also made it easier for businesses to implement ERP systems and then add any additional tools they might need quickly and easily. If we look at Dynamics 365 and Sana, in many ways, it’s become like our phones. Businesses can use the application, decide if they like it, and if not, they just delete it. Making that first step towards delivering a new technology service to the business becomes much simpler. Businesses are also able to self-provision applications and ensure they’re using systems that are completely up to date, providing them with a way to easily back up data for example.
It also presents smaller businesses with the benefit of being able to use trusted enterprise-class systems such as Microsoft Azure, which they can now attain at a much lower cost than on-premise. So what’s next for the relationship between the cloud, e-commerce and ERP? We’re currently seeing the development of micro-services within the cloud as a way for businesses to access more highly specialised apps. If you look at a cloud service such as Microsoft Azure, businesses can use different Microsoft apps hosted on the platform. Then as more and more specialised apps, such as Sana, are brought into the cloud, businesses can integrate more closely with other tools that share the same cloud service space. This is already happening in the consumer space with the rise of voice-controlled assistants, for example, where apps are drawing on each other to greater benefit. It makes sense to use the micro-services within the cloud to enhance technology solutions in line with customers’ needs, and with greater speed and agility. These micro-services are going beyond simply moving systems into the cloud. The value is around occupying a shared cloud eco-system, which provides access to complementary system tools and technologies that meet the business’s needs. But perhaps the most interesting aspect of ERP in the cloud currently comes from what’s happening with app sourcing. Businesses are changing the way in which they source enterprise tools by replacing them with apps. ERP itself is becoming an applet platform where additional component apps are being built and integrated. All of this creates an environment where businesses can experiment more internally. Where before they may have lacked skills, now it’s as simple as a few clicks. As a result, it’s likely that we’ll see more trial and configuration of concepts being generated by companies. They’ll build it, run it for a couple of days and, if it works, then they’ll build more. All of this innovation within the cloud provides an environment where both the customer and the business can find the right solutions and services that suit them. ### Compare the Cloud will be at Cloud Expo Asia 2017 next week! ABOUT Compare the Cloud’s event coverage platform, Disruptive have partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Tune in on the 11th and 12th October to watch live coverage from the Marina Bay Sands Expo and Convention Centre, Singapore. The broadcast will be from a specially built stand where attendees, exhibitors and speakers will be interviewed and update you on the latest Cloud tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live at Disruptive! Schedule Cloud Expo Asia Live 11th October Show 1: 11:00am - 12:30pm SGT Show 2: 2:30pm - 4:00pm SGT 12th October Show 1: 11:00am - 12:30pm SGT Show 2: 2:30pm - 4:00pm SGT MORE INFO As the cloud is now part of how businesses operate every day, it’s important to know the services and solutions that are critical to your cloud activity. At Cloud Expo Asia, more than 300 of the world’s leading suppliers will be attending, alongside 350 expert speakers and thousands of attendees.
For more information about guests and shows visit the official site at: http://www.cloudexpoasia.com/ ### Preventable Machine downtime costs UK Economy in excess of £180bn It has been revealed that preventable machine downtime in the UK manufacturing industry is costing manufacturers a staggering £180bn every year. At a time when the UK economy sets out to re-establish itself as an economic, industrial and technological powerhouse on the global stage, the knowledge that over £180bn can be added back to the economy by controlling preventable manufacturing downtime will certainly come as positive news. The comprehensive research commissioned by Oneserve, the award-winning field service management company, exposes the cost and time implications associated with traditional machine maintenance methods. [easy-tweet tweet="There are 133,000 manufacturers in the UK, contributing a total of £6.7tn to the global economy" hashtags="AI, MachineLearning"] There are 133,000 manufacturers in the UK (ONS), contributing a total of £6.7tn to the global economy, and according to senior industry business leaders, the overwhelming causes of machine downtime are internal technical faults and ageing, unreliable machinery. Yet nearly all instances (94%) can be prevented through harnessing readily available technologies, such as predictive maintenance software. Using machine learning algorithms and data collected from sensors on machines to monitor performance 24/7, Oneserve’s software analyses and learns when and how a machine will fail and then triggers remedial actions to get operatives equipped and on-site before the failure occurs. Harnessing AI and Field Service Management in this way means machine downtime can be prevented. Chris Proctor, CEO of Oneserve, commented on the findings: “Whilst unsurprising, it is truly shocking to see the scale of losses businesses endure due to machine downtime. It is clear that existing maintenance processes aren’t working and the time has come for predictive methods to become the norm – not the exception. I hope that in light of these findings, businesses can take control of machine downtime and ensure they are working to deliver the best possible service for their customers.” Predictive models are the leading solution for preventing the tremendous financial and time losses experienced by manufacturers and the United Kingdom’s economy. ### 4 Advances in AI and Machine Learning: All Found in the New iPhone Apple CEO Tim Cook once said, “Technology should serve humanity, not the other way around.” That’s an incredibly deep, yet increasingly relevant, point. As technology advances — sometimes at an alarming rate — it seems to strengthen its grip on us, not the other way around. Look at smart home technology, for example. It has the potential to be remarkably disruptive, and when it works, it can make our home lives so much more convenient. However, the market is currently so fragmented, and there are so many different standards, protocols, devices and even hubs, that new consumers getting involved are often confused. For smart home tech, strictly speaking, at what point do we stop and say OK, let’s quit adding, and instead improve what we have? Say what you will about Apple and their products, but that’s what they’ve continued to do for decades. They release a product, they perfect it — and then they release it again with better, more powerful hardware. In the case of the new iPhone X, we have nearly the same scenario we always have.
While there are a few aesthetic changes, for the most part, the iPhone is still the iPhone — except for one modern, innovative element born from the technology of modern artificial intelligence and machine learning. Thanks to these powerful technologies — found in the iPhone X, by the way — the consumer electronics spectrum is about to change considerably. As Cook so accurately pointed out, we are on the precipice of seeing our technology serve us, not the other way around. On stage, Cook brazenly claimed the iPhone X will “set the path for technology for the next decade.” Let’s take a closer look at what he means by that, and how it’s going to happen. Introducing the Neural Engine Inside the new iPhone X is an A11 processor that includes embedded AI support, referred to as the neural engine. The goal of the component is to accelerate AI processing and software functions, thanks to the enhanced use of images and speech. This is all part of the artificial neural network, or neural engine, as it’s called. In layman’s terms, the neural engine is a small portion of the internal hardware dedicated specifically to artificial intelligence. This opens up many new opportunities for the technology, especially when it comes to innovative features. For example, the tech is responsible for the iPhone’s new facial recognition authentication, and customized emojis complete with a copy of your facial expressions. [easy-tweet tweet="“A11 bionic neural engine” is capable of handling up to 600 billion operations per second" hashtags="Bionic, Technology"] Machine Learning in Your Pocket The “A11 bionic neural engine” is capable of handling up to 600 billion operations per second, which is 70 percent faster than the previous generation of Apple’s mobile processors, the A10. This boost in processing power and performance means machine learning is not just possible, but incredibly viable, and in such a small device. You see, machine learning requires a variety of conditions to work reliably — including access to lots of data, significant processing power and an audience. The latter allows the system to gather enough intelligence and stats to build actionable insights. With the new iPhone, Apple has combined all these things into a tiny, handheld device. At any given time, the system can take advantage of Apple’s vast network of data, plenty of processing power —  thanks to the new A11 dedicated chip — and, finally, Apple’s large customer base. Accessibility Is Now Seemingly Limitless  To work with machine learning and AI technologies, developers and organizations had to invest a lot of money, time, hardware and resources into the endeavor. An indie software development company making a mobile app, for instance, may find AI or machine learning out of their reach — at least, that’s how it was before Apple’s latest move. Now, the new chipset, new focus and new technologies provide more accessibility to everyone. Mobile app developers can tap into inexpensive platforms and tools offered by Google, Amazon, Apple, IBM and many others. They can build fully operational, robust and capable mobile apps and services that tap into these technologies, without deploying the necessary hardware themselves. That’s a huge leap forward, especially regarding putting the technology into the hands of a large community. Siri Is About to Get a Whole Lot Smarter  Siri supporters and enthusiasts already know the personal assistant is quite accurate already, but there’s always room for improvement. 
Thanks to the technology embedded in the new iPhone, that improvement’s going to happen, and faster than ever before. Siri will be able to tap into the boost in processing power, systems and data these technologies are built upon, not to mention Apple’s gigantic install base. Imagine the assistant returning answers to queries with little to no delay, and with increasing accuracy. Furthermore, imagine if Siri can react to more natural speech, as opposed to specific phrases or commands. One thing is certain: Siri and her OS are both about to get a whole lot smarter. Thanks to AI and machine learning, that’s not only possible, but will happen at unprecedented speeds. ### Tweets from Disruptive Live at IP EXPO - Day 2! Disruptive Live at IP EXPO - Day 2 ### Take a look at some IP Expo Highlights [vc_row][vc_column][vc_column_text]Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, where they produced live stream shows throughout the day. Take a look at some of the highlights from IP Expo![/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237076406" title="Disruptive Live from Machina at IP EXPO - Day 1, Morning Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237100029" title="Disruptive Live from Machina at IP EXPO - Day 1, Afternoon Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237398955" title="Disruptive Live from Machina at IP EXPO - Day 2, Morning Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237403236" title="Disruptive Live from Machina at IP EXPO - Day 2, Afternoon Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237076912" title="IP Expo Day 1 - Behind the Expo with Andy Johnson"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237395251" title="IP Expo Day 2 - Behind the Expo with Andy Johnson"][/vc_column][/vc_row] ### The digital renaissance has arrived in the form of the cloud By Damon Anderson, director of partner at Xero Accountants have used the trusty ledger to service their clients since even before the 13th century. But the birth of the cloud has turned accountancy on its head, putting the ledger to rest and making life far simpler for the business owner. In the cloud, businesses are better connected, data can be accessed 24/7, credits and debits are digitally recorded and information travels faster between buyer, business and bank. Layers of time-intensive admin have been removed for small businesses and their advisors. So it’s no surprise that according to our new Digital or Die report, 83 percent of accountants believe they need to understand technology just as well as they understand accountancy. Yet change can be overwhelming – almost half now worry about being left behind, which is twice as many as last year. Evolution is hard to hide from The pace of new technology development is relentless, and it’s hard to avoid. Even HMRC is planning to ‘make tax digital’ by 2020. But it seems not everyone is keeping up – one-in-four accountants we surveyed admitted that they weren’t aware MTD is on the horizon. [easy-tweet tweet="Accountants are recognising the need to move with the times. " hashtags="Cloud, Futurism"] Despite the nervousness, accountants are recognising the need to move with the times. Almost half of advisors (45%) we asked believe that practices need to go digital to survive. 
They understand the competitive advantage technology can give them, and tax practices are seeing the potential of streamlining their processes in order to be more profitable. With benefits including lowered cost of business, new revenue streams and keeping up with market expectations, the opportunities that going digital can bring are vast. Learning on the job Accountancy, though, is an industry ready to adapt. Accountants and bookkeepers are making the effort to learn as the pressure builds to stay relevant. Half of accountants are learning about new technologies at their workplace, and a quarter are enrolled in offsite courses. This is in spite of three-in-10 saying they don’t think there is enough educational support when it comes to learning about digital tools. We’ve commissioned a unique portrait to depict changing times in the industry. Since Luca Pacioli – Father of Accounting – debuted his innovation, the general ledger has been a faithful servant, but the next revolution has arrived. The digital renaissance is here. To explore our findings in detail and discover further insights, download the report here. ### Compare the Cloud are at Cloud Expo Asia 2017 ABOUT Compare the Cloud’s event coverage platform, Disruptive have partnered with Cloud Expo Asia 2017 to bring you live shows throughout the event! Tune in on the 11th and 12th October to watch live coverage from the Marina Bay Sands Expo and Convention Centre, Singapore. The broadcast will be from a specially built stand where attendees, exhibitors and speakers will be interviewed and update you on the latest Cloud tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live at Disruptive! MORE INFO As the Cloud is now the method businesses go through every day it’s important that we know the services available and solutions that are critical to your cloud activity. At Cloud Expo Asia More than 300 of the world’s leading suppliers will be attending, 350 expert speakers and thousands of attendees. For more info about guests and shows visit our official site for the Expo at http://www.cloudexpoasia.com/     ### IP Expo Europe 2017 - Day 2 [vc_row][vc_column][vc_column_text]Compare the Cloud’s event coverage platform, Disruptive have partnered with IP Expo Europe 2017 for the second year running, to bring you live shows throughout the event! Tune in on the 4th and 5th October to watch live coverage from the ExCel at IP Expo. The broadcast will be from a specially built stand where attendees, exhibitors and speakers will be interviewed and update you on the latest tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live also at Disruptive![/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237398955" title="Disruptive Live from Machina at IP EXPO - Day 2, Morning Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237403236" title="Disruptive Live from Machina at IP EXPO - Day 2, Afternoon Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237395251" title="IP Expo Day 2 - Behind the Expo with Andy Johnson"][/vc_column][/vc_row] ### Tweets from Day 1 at IP EXPO! 
Disruptive Live at IPEXPO 2017 - Day 1 ### ANS primes Wakelet for future growth Cloud and managed services provider ANS Group has delivered a Public Cloud solution with Amazon Web Services for fast-growing social content platform Wakelet, priming the company for future growth and paving the way for machine learning and artificial intelligence. The two Manchester tech companies have partnered to allow more users to access Wakelet’s digital platform while providing headroom for the ambitious company to expand its offering in the near future. The state-of-the-art managed public cloud solution, built on Amazon Web Services is offering Wakelet richer data analytics and machine learning capabilities to enhance the platform’s services and create a seamless user experience. The scalable infrastructure has also enabled the business to reduce costs and free-up internal resources to focus on optimising and adapting its offering. Wakelet's platform allows users to capture articles, videos, tweets and pretty much anything else, by knitting them together into a beautiful and easy to access collection. Nick Blow, engineering and infrastructure lead at Wakelet said: “As a tech start-up, we never know when our next period of explosive growth could be, and being able to scale out in a matter of minutes, rather than days or weeks, makes a huge difference. [easy-tweet tweet="ANS’ expertise and knowledge has been a huge help during our migration to the public cloud" hashtags="ANS, Cloud"] “ANS has been a huge asset for us. The team have an in-depth understanding of our needs as a business, and the cloud services they offer perfectly complement our existing tooling and workflows, rather than getting in the way. ANS’ expertise and knowledge has been a huge help during our migration to the public cloud, and the 24/7 monitoring has reduced the number of false positive alerts to essentially zero, which has been a huge burden off our developers’ shoulders." Andy Barrow, CTO at ANS Group said: “We work to support ambitious businesses along their journey by offering clever and bespoke cloud solutions. This was a flexible solution that integrated with Wakelet’s existing infrastructure and helped to optimise and develop its offering for its growing customer base. We expect to see this agile cloud solution help Wakelet grow into a real influential force in the tech industry over the next few years.” ### Ventiv Technology acquires Webrisk from Effisoft Ventiv Technology, the world’s leading independent provider of risk technology solutions, announces that it has purchased Webrisk, the risk management information system (RMIS) from Effisoft SAS, the risk and re/insurance solutions firm.  [easy-tweet tweet="Webrisk is a full featured risk management information system " hashtags="RiskManagement, Security"] Webrisk is a full featured risk management information system that helps risk managers to manage their daily operations more effectively. Following the acquisition, Ventiv will continue to support and invest in the Webrisk platform as well as its existing risk management technology solutions. Edouard Lefebvre, current Chief Operating Officer of Webrisk Paris and David Thomas, current Director of Webrisk London, will both be joining Ventiv, as well as all current Webrisk personnel.   Steve Cloutman, Managing Director International, Ventiv said: “We’re delighted to announce the acquisition of Webrisk, this acquisition is a solid step in growing Ventiv’s European operation. 
Webrisk come with a strong reputation, fantastic customer base, a highly skilled team and a great product, all of which we’re proud to be adding to the Ventiv group in Europe. I am confident that through our combined talent and experience, we will continue to provide excellent service to our expanded customer base whilst we invest more into our product portfolio.” Edouard Lefebvre said: “We are truly excited to be able to join Ventiv. This move means that we will be joining the market leader in both Europe and across the globe.” David Thomas added: “The acquisition and integration allow us to continue to support all our clients with an even stronger global network of offices and expertise.” ### Volta Data Centres and root6 join forces to power UK media industry UK broadcast industry now benefit from specialist support in the heart of London 4 October 2017, London, UK: Award-winning, carrier-neutral central London data centre, Volta, has partnered with technology supplier to the UK broadcast industry, root6, to offer state-of-the-art technology support to the UK film, post-production and new media communities in central London. root6 was acquired by Jigsaw24 in February 2017, combining two super power providers of audio and video technology solutions and managed services. root6 have an ambition to constantly push the boundaries of media & entertainment IT solutions, and to blaze a trail of best practice solutions, acting as the example of how to improve business infrastructure to its customers. [easy-tweet tweet="Volta quickly became an obvious partner for root6 to further develop their offering." hashtags="DataCentre, Data"] root6 needed a data centre in close proximity of Soho, where majority of the media industry resides; they also needed a super resilient facility. With a history of zero outages and a unique location on two separate diverse power rings coming into its building, Volta quickly became an obvious partner for root6 to further develop their offering. Soho-based broadcast studios and production houses now have a technology partner providing super-resilient data centre services within a short distance from their office. root6’s experience working with firms such as Soho-based film production company ITN Productions, BFI (British Film Institute), New York-based finishing house Final Frame, postproduction facility Edit 123 and postproduction facility 422.tv make it the perfect technology partner for any broadcast company. Rupert Watson, Director at root6 comments, “We are already seeing great value in the partnership with Volta. It was key for us to choose a data centre partner that could keep pace with the development of innovative technologies as it is vital to our business in the media industry. In the fast-moving media community, easy access to information, as well as availability at any time of day and night is vital because deadlines can’t be missed”. Even though root6 has transferred its processing and IT equipment into Volta’s central London data centre, root6 technicians still maintain constant access when required. Moving its equipment to a hosted provider also provides the experience and knowledge for root6 to showcase to customers how this approach can be beneficial to their own business infrastructure. Watson continues, “Our high-end TV and DCinema customers look to us to be ahead of the game when it comes to providing uncompressed quality and low latency HD and 4K video. 
Through our partnership, we can offer Coarse Wavelength Division Multiplexing (CWDM) and Keyboard, Video and Mouse (KVM) solutions that fit perfectly with the metro fibre network that terminates at Volta.” Jonathan Arnold, Managing Director, Volta Data Centres, concludes, “root6 is a committed and highly experienced supplier to UK broadcast, film, post-production and media companies. It’s a great opportunity for Volta to provide root6 with flexible colocation services and demonstrate the tangible benefits that can be achieved by working with a data centre partner. Finding the right data centre is a business critical decision for many companies, especially those that require superfast, uninterrupted data connections and a reliable power supply – things that Volta can provide with ease.” ### New German market research highlights requirement for advanced email security to protect organisations from ransomware and impersonation attacks Munich, Germany, October 4, 2017 – Mimecast Limited (NASDAQ: MIME), a leading email and data security company, today announced its plan to expand into Germany. Mimecast’s new office in Munich will serve as a regional hub for Central Europe and strengthen ties with existing customers and partners. Mimecast will build a team that offers high-quality local German-speaking customer service, technical support and partner program management. Mimecast will also open two fully replicated data centres in Germany in 2018 to support the growing appetite for local data residency. Mimecast is 100% committed to the channel and is currently recruiting additional German partners to accelerate its growth and further enhance customer experience in the region. “Mimecast is all-in on Germany with a long-term view on investment in technology infrastructure and jobs,” said Michael Heuer, Country Manager - Central Europe. “Email is a critical service for organizations and they need help to archive their growing volumes of data in a highly secure repository while providing powerful search tools that allow employees to easily access this information quickly for productivity and GDPR compliance. [easy-tweet tweet="61% of organizations had some of their data encrypted by ransomware in 2017" hashtags="Ransomware, Security"] “Our research shows that advanced email attacks in Germany are growing and many firms have not yet embraced a cyber resilience strategy. It’s vital that organizations not only invest in preventing attacks but also plan how to keep their business working during an incident,” he added. Mimecast German research highlights*: sophisticated email attack volumes are rising, with 55% of respondents having seen more targeted spear-phishing with malicious links in the last three months, 47% seeing a rise in impersonation fraud asking for confidential data and 43% a rise in fraudulent wire transfer requests; 61% of organizations had some of their data encrypted by ransomware in 2017, with 37% of those affected paying the ransom; and only 19% of German organizations have already adopted a cyber resilience strategy. Mimecast’s integrated cloud suite is designed to comprehensively manage email security, archiving and business continuity for on-premises and cloud services like Microsoft Office 365. Prospective customers, partners and employees are invited to meet Mimecast representatives at the it-sa 2017 exhibition and congress held on 10-12 October in Nuremberg. Hall 10.1-214.
### IP Expo Europe 2017 Live - Day 1 [vc_row][vc_column][vc_column_text]Compare the Cloud’s event coverage platform, Disruptive have partnered with IP Expo Europe 2017 for the second year running, to bring you live shows throughout the event! Tune in on the 4th and 5th October to watch live coverage from the ExCel at IP Expo. The broadcast will be from a specially built stand where attendees, exhibitors and speakers will be interviewed and update you on the latest tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live also at Disruptive![/vc_column_text][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237076406" title="Disruptive Live from Machina at IP EXPO - Day 1, Morning Show"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237100029" title="Disruptive Live from Machina at IP EXPO - Day 1, Afternoon"][/vc_column][/vc_row][vc_row][vc_column][vc_video link="https://vimeo.com/237076912" title="IP Expo Day 1 - Behind the Expo with Andy Johnson"][/vc_column][/vc_row] ### Challenges and Opportunities: Automating a Financial Business The term ‘digital transformation’ has been deployed to describe an array of digital projects, from the big to the small. But what does true digital transformation really mean for businesses? When you strip away the buzzwords, digital transformation is simply about adapting the culture of a business and the way it operates to work with new technology. That’s just what we do for our clients at ISG – specialising in digital transformation consultancy, we work with businesses to help them scrutinise their current operating model and see how they can optimise their performance using the available technology. Automation is just one way companies can look to leverage technology, but its results are often the most transformative. The simplest form of automation is Robotic Process Automation (RPA). RPA creates a Level One digital workforce by automating standard, repeatable processes using structured data, such as data in spreadsheets. It’s a tool that is increasingly being adopted by financial businesses looking to streamline and improve their services – although its benefits extend far beyond the bottom line. To demonstrate the opportunities RPA can present, and the challenges it can overcome, let’s take the example of Western Union. Western Union is a leading financial services organisation which helps businesses and individuals move money – in 2016, the company moved over $80 billion between customers. When ISG began working with Western Union, they had a twofold vision: to empower their workforce and improve customer service through digitisation. Yet there was a significant barrier to achieving the latter goal. Important procedures such as compliance processes were time-consuming and labour-intensive, requiring highly skilled individuals, yet the task itself was routine and left employees unengaged. This contributed to a high staff turnover within these roles. [easy-tweet tweet="A tailored approach is vital when supporting businesses through digital transformation projects" hashtags="DigitalTransformation, Technology"] A tailored approach is vital when supporting businesses through digital transformation projects, and it’s very important that businesses understand the value technology can add to their specific organisation as a first step.
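To illustrate the principle of RPA over structured data described above, here is a minimal, hypothetical Python sketch. It is not Western Union’s implementation and not any vendor’s API: the thresholds, field names and records are invented purely to show the shape of the idea, namely fixed rules applied to structured records, with only the exceptions routed to a skilled person.

```python
# Hypothetical illustration of rule-based triage over structured records.
# Thresholds, field names and data are invented for the example only.
transactions = [
    {"id": 1, "amount": 950.00, "country": "GB", "sender_verified": True},
    {"id": 2, "amount": 12000.00, "country": "GB", "sender_verified": True},
    {"id": 3, "amount": 300.00, "country": "XX", "sender_verified": False},
]

REVIEW_THRESHOLD = 10_000.00          # assumed compliance threshold
HIGH_RISK_COUNTRIES = {"XX"}          # assumed watch list

def triage(record):
    """Return 'auto-clear' or 'manual review' for one structured record."""
    if record["amount"] >= REVIEW_THRESHOLD:
        return "manual review"
    if record["country"] in HIGH_RISK_COUNTRIES or not record["sender_verified"]:
        return "manual review"
    return "auto-clear"

if __name__ == "__main__":
    for tx in transactions:
        print(tx["id"], triage(tx))   # only the flagged items reach a person
```

Commercial RPA tools wrap this kind of logic in trained “virtual workers” that drive existing applications, but the underlying pattern, deterministic rules over structured inputs with exceptions escalated to people, is the same.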
As a leader in global payment services, it was clear that Western Union needed a solution that was both scalable and secure to operate reliably and accurately in a heavily regulated financial environment. With information security top of the agenda, we worked with Western Union to demonstrate to senior stakeholders how trained virtual workers, through the adoption of RPA tools, are capable of carrying out highly complex tasks, such as compliance procedures, with pinpoint accuracy. In the pilot stage of our work, we implemented pure RPA for a compliance process which freed up the capacity of 12 highly-skilled employees, empowering them to conduct more value-add tasks, boosting staff engagement, and helping the business retain staff as a result. In addition to creating more resource, automating a compliance process and training virtual workers removed the risk of human error, and improved customer service by facilitating a quicker and more efficient process. Following a successful pilot stage, we worked with Western Union to develop a further seven automation processes, and set up an infrastructure by creating 40 virtual workers ready to be used as a result of automating a number of highly skilled procedures. As we’ve seen through the Western Union example, automation presents businesses with an opportunity to adapt to new technology, improving their processes and customer service as a result. But a key benefit many organisations overlook is the ability of this new way of working to support individuals within the business. In the financial services sector in particular, highly skilled staff are required to complete mundane tasks and the turnover rate for these roles can be quite high, particularly amongst the millennial generation. Digital transformation, such as RPA tools, has the ability to address this issue by empowering individuals to conduct more value-add tasks and allowing businesses to utilise their skills in other areas.  It is clear that, when exploring automation, businesses should carefully consider the benefits beyond increased accuracy and efficiency and look at the needs of the organisation more broadly. ### IP Expo Europe 2017 Live ABOUT Compare the Cloud’s event coverage platform Disruptive have partnered with IP Expo Europe 2017 for the second year running, to bring you live shows throughout the event! Tune in on the 4th and 5th October to watch live coverage from the ExCel at IP Expo. The broadcast will be from a specially built stand where attendees, exhibitors and speakers will be interviewed and update you on the latest tech trends. You can watch the broadcast live at the expo, online or on demand after each day. Watch live at Disruptive! SCHEDULE IP Expo Live Show 1: Wednesday 4th October 10:30am – 12:00pm Show 2: Wednesday 4th October 2:30pm – 4:15pm Show 2: Thursday 5th October 10:30am – 12:00pm Show 3: Thursday 5th October 2:30pm – 4:00pm Behind the Expo Live Show 1: Wednesday 4th October 1:00pm – 2:00pm Show 2: Thursday 5th October 1:00pm – 2:00pm MORE INFO For more info about guests and shows visit our official site for the Expo at http://www.ipexpoeurope.com/IP-EXPO-LIVE   ### Integration Works – Of course it does! Back in 2016 in Las Vegas I had the pleasure of having dinner with 2 of the directors from a company called IntegrationWorks, a New Zealand based firm that had just picked up an IBM Beacon award (The IBM Beacon Awards recognize IBM Business Partners who have delivered exceptional solutions using IBM products and services. 
By delivering innovative solutions to drive business value, Beacon Award winners help transform the way their clients, the industry and the world do business). Apart from having a very nice dinner at the Café de Paris, we discussed DevOps, tech strategies and world domination. At the time I thought they had a great business in New Zealand and a winning formula for growth and innovation within regulated sectors, as well as the retail marketplace, with the services they offered. As you do, we exchanged business cards and said our farewells, and I left on an aeroplane back to the UK. Sixteen months later, by complete chance, I bumped into Grant, one of the directors from IntegrationWorks, in Southbank and chatted briefly. To say I was surprised at their progression is an understatement, and we had a great chat about DevOps integration, challenger banks, PSD2 and many other bits and pieces. What they are doing is game-changing for the regulated sectors, retail and others, and I’ll tell you why: the three main focus areas that they now deliver to the UK marketplace are unique! The Banking and Financial Services Integration market This has always been a misunderstood subject, as the word regulation always comes onto the table. However, IntegrationWorks have created a “Managed Service for Integration – MSFI”. Any company that is looking to establish integration as a foundation for business growth and transformation will encounter the requirement for integrated systems. The proliferation of people, processes, and technology to achieve this often results in unanticipated costs and overheads. These factors must be aligned through a common delivery model to ensure objectives are met in a more efficient manner. Managed services are far more than a simple operational support contract. A Managed Service for Integration can provide the organisation with a flexible, cost-effective and more robust integration capability, and IntegrationWorks have a framework in place that quite literally picks up your initiative, executes and implements it – cutting your project timeline by months! “This is a partnership with clients where a plan is put together to form the ideal desired team for current and future needs while providing the frameworks, methodology and processes to enable mutual success. The Managed Services are delivered for a fixed, regular monthly subscription price depending on the base plan (based on the number of teams) and supplemental capability required in the month. The option to simply enhance the model through additional charges and services for faster outcomes or higher service levels can be easily achieved and negotiated in advance.” Grant McKeen, Business Development Manager, IntegrationWorks. PSD2 – What is it? The Payment Services Directive 2 allows third-party developers to build payment service infrastructures around the platforms of financial institutions (mostly open API based), creating a disruptive open banking model for challenger banks to enter the game. This creates a fair and level playing field for new non-traditional firms to provide payment services, alongside new rules that include transparency for account services, charges, reporting obligations and complaint procedures for consumers.
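To make the idea of an open, API-based banking platform concrete, here is a minimal, hypothetical Python sketch of the kind of account-information call a licensed third party might make once a bank exposes such an API. The endpoint, token and response fields are invented for illustration; real PSD2 and open banking APIs additionally involve strong customer authentication, explicit customer consent and certificate-based identification of the caller.

```python
import json
import urllib.request

# Hypothetical sketch only: the endpoint, token and response fields are invented.
BANK_API = "https://api.example-bank.com/open-banking/v1/accounts"
ACCESS_TOKEN = "<token issued after the customer grants consent>"

def list_accounts():
    """Call a bank's (hypothetical) account-information endpoint as a third-party provider."""
    request = urllib.request.Request(
        BANK_API,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return json.loads(response.read().decode("utf-8"))

if __name__ == "__main__":
    for account in list_accounts().get("accounts", []):
        print(account.get("iban"), account.get("balance"))
```

Handling that extra plumbing securely and at scale is broadly the sort of work an API gateway appliance, such as the one discussed next, is intended to take off the bank's hands.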
[easy-tweet tweet="IntegrationWorks is currently working on a PSD2 open banking appliance for the European market" hashtags="IntegrationWorks, Market"] IntegrationWorks is currently working on a PSD2 open banking appliance for the European market, based on IBM Datapower and API Connect.  This would be used as a free non-financial negotiating tool to sell IBM Datapower and API Connect at full retail pricing. The customer benefits from low-risk PSD2 adoption, IBM benefits from increased sales, IntegrationWorks benefits from rebates and potential services work. Integrated License Management (ILM) Aside from meeting potential litigious and costly compliance measures with the vendor, an efficient integrated license management practice will save money, increase security, improve support, improve productivity, and allow your business to stay informed. However, the main and most important benefit is that you meet your obligations as a business, regarding your end-user license agreements, thus meeting your legal and contractual obligations with the vendor.  “Software Asset Management (SAM) is important to any business that licenses software. If organisations are serious about avoiding software licensing penalties or fees, they need to ensure they implement efficient discovery tools, license optimisation software and consider partnering with software license experts to be able to create accurate software asset reports.” Grant McKeen, Business Development Manager, IntegrationWorks. Grant then expanded on the specifics and commented “We are focused on the best-fit for our customers and will align the best piece of technology suited to budget and requirements. This saves time from sourcing multiple vendor pricing, keeps vendors competitive and allows you immediate access to localised specialist vendor experts that increase value-add. Our partnerships with IBM, MuleSoft, Oracle, WSO2, Redhat, EightWire and Apigee allow us to secure you the best licencing and product options.”  Any way you look at it IntegrationWorks is a cracking company that not only does what is described on the tin, but they are also great guys to work with. I have asked him to write some information on PSD2 and their vast experience complying with this regulation for clients, and he has agreed so watch out for future articles! What do IntegrationWorks do? Click here! ### NFON now connected to the world's largest Internet Exchange Points DE-CIX and LINX NFON AG, a worldwide leading provider of cloud telephone systems, is now providing its customers with an even better quality of communication and services worldwide thanks to its connection to the world's largest Internet hubs DE-CIX (Deutsche Commercial Internet Exchange in Frankfurt am Main, Germany) and LINX (London Internet Exchange in London). The direct connection to the two Internet Exchange Points eliminates the need for additional parties and thus potential sources of error, such as countless peering points. “On the connection between NFON and its customers, it could happen in the past that data packets were lost due to error sources on the way, without any influence from our side. By connecting to the DE-CIX and LINX, we eliminate peering points and raise our network connection to a new and unrivalled level,” explains Jan-Peter Koopmann, Chief Technology Officer of NFON AG. With its connection to the DE-CIX and LINX nodes in particular, NFON's network is even closer to its customers - especially the core segment of small enterprises and SMEs. 
“The Large Enterprises segment is often connected to a network operator, which has direct peering with the Internet Exchange Points mentioned above. For our customers, who are mostly small and medium-sized enterprises, the bridge to DE-CIX and LINX means a better service and a noticeable increase in quality without additional costs," says Jan-Peter Koopmann. "Thanks to the direct connection to DE-CIX and LINX, we have more influence on the data traffic and are less dependent on external sources of interference and potential bottlenecks." [easy-tweet tweet="The connection to DE-CIX and LINX is a logical step to adapt the network connection of NFON AG" hashtags="NFON, Network"] The DE-CIX Internet hub in Frankfurt has been the heart of German and European data exchange via the Internet since 1995 and is the world's largest Internet hub in terms of traffic volume. At peak times, up to 5.6 terabits of data per second can be transferred via DE-CIX servers, and up to 3.8 terabits via LINX. "The connection to DE-CIX and LINX is a logical step to adapt the network connection of NFON AG to the growth and internationalisation of the company," says Jan-Peter Koopmann. "Through our partners, we are always able to activate additional internet exchanges on demand, such as VIX in Austria and many others." ### Before Automation, Fix Your Manual Processes Besides the president's perversion of social media and the car crash of Uber, few topics have risen up the tech news ranks in 2017 quite like automation. Gartner forecasts the robotic automation market will grow by 41% each year to 2020 (that's $6.2bn a year by 2023, according to P&S Market Research), while automation technology in the home will be worth $21bn by 2020, says Zion Market Research. [easy-tweet tweet="Automation is following the standard tech hype cycle" hashtags="Automation, Robotics"] Automation is following the standard tech hype cycle. There is almost religious zeal: a belief that smart algorithms, plus a smattering of artificial intelligence, can help businesses put processes on auto-pilot, saving vast quantities of person-hours and freeing executives to make every task repeatable. The only ones who have to fear the automation revolution are the up to 38% of US workers and 30% of UK employees whose jobs may be lost to automated processes, according to a PwC report. But those people can sleep a little easier, because automation will not deliver on its smart promise if its underlying processes are dumb in the first place. You see, automation is simply the combination and automatic execution of existing digital workflows - and, as my new book Humans Vs Computers explains, human executives are notoriously bad at creating those workflows. Just ask drivers in California. When the state introduced a surveillance camera system to automatically charge road tolls by detecting licence plates, it seemed like a good idea. But the state overlooked its own California Vehicle Code 4456, section C2, which allows a new car to be driven without licence plates for up to 90 days - a loophole which allowed drivers of new cars to dodge the tolls with impunity. The researchers at IBM now know the risks, too. When they pre-loaded the Urban Dictionary, a repository of slang, into their super-intelligent Watson artificial intelligence engine, they were surprised that Watson could no longer tell the difference between profanity and clean language, responding to one user simply with: "Bullshit". They shouldn't have been surprised.
Automation is no different from any other GIGO computer process - if you put Garbage In, you will get Garbage Out. And, whilst automation will frequently prove to be fallible in this way, it is the human navigators with their hands on the tiller whom we can blame for charting the wrong course. In today's obsession with automation, people lose track of the dirty little secret of the software industry: automating makes things faster, not better. Speeding up a beneficial process delivers value faster, but automating a bad process only makes it spiral out of control. Take, for example, Microsoft's chatbot Tay. Tay was designed to become smarter as more users interacted with it; instead, it quickly learned to spew the anti-semitic tweets that human Twitter users fed into the programme. Microsoft eventually realised what was happening and shut it down, but the damage had already been done, and at a very fast pace. Every company examining digital automation today should stop, step away from the technology options and vendor websites, and rewind. A prerequisite to any automated process should be auditing and examining your current manual workflow - whether digital or analogue - for gaps and faults. You need to view automation not only as a positive force that can boost business efficiency but also as a negative one that, if implemented without prior due diligence, could lead to damaging consequences. And, before you get tempted, no, you can't ask a computer to conduct that audit. As the world will soon see, sending a bot to do a man's job can cause more harm than good. ### Up Time Episode 2: Digital Transformation Disruptive Live presents the second episode of UP TIME in association with Cogeco Peer 1! This week's episode will focus on Digital Transformation. Digital is transforming the way we communicate and transact, particularly in the realms of digital marketing, e-commerce and document management. With each evolution comes opportunity and risk. This episode will look at trends in digital transactions. Where should businesses be investing, and what risks and challenges do they need to overcome? This week sees panel host Susan Bowen from Cogeco Peer 1 joined by her panel, Caroline Porteous, Head of Adobe Sign, and Fabrizio Fantini, Co-founder of Buy Expressly. UptimeLive ### Tech Data to Set Out its Cloud Practice Strategy at Cloud Summit in Venice Distributor will articulate how the meaning of value is changing and what it means for the channel at annual gathering of partners in Europe BARCELONA, Spain. (October 2, 2017) – Tech Data Corporation (Nasdaq: TECD) will set out its European strategy and plans to enable and support cloud solutions and service provider partners in moving to the next phase of their cloud practice development at its annual Cloud Summit, which takes place in Venice on October 3. In the keynote addresses, Tech Data's Reza Honarmand, vice president, Cloud, Europe, and David Newbould, director, Cloud, Europe, will set out the company's view of the current state of the market. They will explain how Tech Data provides partners with specialist knowledge and engagement in order to enable them to deliver stable platforms for the modern workplace, alongside advanced solutions and services that support the digital transformation of customers.
Tech Data's end-to-end offering of cloud solutions and services enables partners to develop specialisation in key next-generation technology segments such as advanced datacentre, enterprise security, business intelligence, cognitive computing and IoT. [easy-tweet tweet="Highly complex technologies such as cognitive computing and IoT are driving digital transformation" hashtags="DigitalTransformation, IoT"] Reza Honarmand stated: "The market is maturing rapidly, with IT solutions now leveraging next-generation architectures such as hybrid cloud and platform services; this is leading to greater adoption in key areas, such as IT operations. At the same time, highly complex technologies such as cognitive computing and IoT are driving digital transformation by focusing on the creation and deployment of innovative business solutions. Those partners who have the skills and business models to meet the needs of a changing customer profile will quickly rise above their competition and realise multiplying profits." Delegates at the Cloud Summit will hear presentations from Victor Gureghian Baez, senior director, Worldwide Distribution and Indirect Channels for Microsoft; Roberto Zampese, business development manager, Oracle, Italy; and Raffele de Lucia, Watson leader, IBM Italy. There will also be a presentation on the monetisation of IoT solutions from Victor Paradell, vice president, Tech Data Smart IoT and Analytics, Europe, at Tech Data. The annual Tech Data Cloud Summit brings together leading cloud vendors, service providers and channel partners from across Europe. By providing a forum for evaluation of the current state of the market and key areas of development and growth potential, Tech Data underlines its position as the leading cloud player in the distribution sector. The Tech Data Cloud Summit takes place at the JW Marriott Hotel in Venice on the afternoon of 3 October 2017. It is running alongside Tech Data's European Vendor Partner Summit and the annual Canalys Channel Forum, both of which take place in Venice in the same week. About Tech Data Tech Data connects the world with the power of technology. Our end-to-end portfolio of products, services and solutions, highly specialised skills, and expertise in next-generation technologies enable channel partners to bring to market the products and solutions the world needs to connect, grow and advance. Tech Data is ranked No. 107 on the Fortune 500® and has been named one of Fortune's "World's Most Admired Companies" for eight straight years. To find out more, visit www.techdata.com or follow us on Twitter, LinkedIn, and Facebook. ### Scotland's Pipeline of Data Science Talent Boosted
High industry demand for Data Lab funded MSc 2017 cohort – many already snapped up
The Data Lab boosts MSc places by 45% to 130 to meet demand
Industrial placement winner helps cataract patients' pathways
With demand for data science talent showing no signs of waning, Virgin Money, Royal London, Aquila Merkle and Aggreko are some of the private sector companies that have snapped up the cream of Scotland's newly qualified data talent crop. Data Lab MSc 2017 graduates have also already secured data roles in the public sector with organisations such as NHS NSS, Glasgow City Council and North Ayrshire Council. One of the cohort, Ross McLean, has even started his own business, Keedo, on the back of the learnings from the MSc in Data Science he completed at Robert Gordon University (RGU).
Keedo is an organisation designed to build products for the Higher Education sector. With demand for data skills continuing to grow, The Data Lab has boosted the number of places it funds on Data MSc courses this academic year at eleven universities across Scotland from 90 to 130, representing a 45% increase. [easy-tweet tweet="Over 500 students have benefitted from The Data Lab's education programme" hashtags="DataLab, NHS"] The MSc programme is core to The Data Lab's aim to unlock the estimated £20bn value of data to Scotland and generate 248 high-value jobs. Over 500 students have benefitted from The Data Lab's education programme, which includes the MSc, industrial doctorates and executive education programmes, among other training opportunities, running since 2015. Josh Ryan-Saha, skills programme manager at innovation centre The Data Lab, explains: "The best companies and organisations are now using data to develop new products and services. They can choose anywhere in the world, so Scotland needs to have people here with the skills they need now and in the future. "Scotland is in a good place at the moment. There is local talent here – as underlined by the appetite for the MSc programme both from candidates and from employers. We are committed to ensuring this pipeline of talent continues to flow." Minister for Further Education, Higher Education and Science, Shirley-Anne Somerville, also commented: "Data science is an area of rapid growth around the world and we want Scotland to lead the way in meeting the demand from business for data skills talent. It is encouraging to learn there is a strong uptake among graduates for Masters courses in data science and I very much welcome this news that for 2017-18, Data Lab has almost doubled the number of places available on its MSc programme to further build on that. Improving skills, enthusiasm and knowledge of STEM subjects such as data science at all levels of school, college and university and encouraging uptake of careers in this sector are vital to both Scotland's society and our economic prosperity - this will be the central aim of our ambitious STEM strategy for education and training." Industrial Placement Award A fundamental element of the MSc programme is the opportunity to undertake a three-month industrial placement. Some fifty students benefited from this, with a number of them going on to be recruited for full-time roles as a result of the placement. Anita George, who studied at the University of Stirling and undertook her placement at NHS NSS, secured The Data Lab Industrial Placement Award. Anita George said: "It was a rewarding project to be a part of. It was about optimising pathways for patients diagnosed with cataracts, a major cause of blindness among the elderly. To be a part of something aimed at making a real difference to the life of a person is great. There were major challenges in terms of confidentiality and the size of the project but we got there. "While I was there I applied for and got a position as an information analyst in public health for the NHS. It is a great opportunity and another exciting phase in the journey I've taken over the past year on this course." MBN Solutions partners with The Data Lab to help deliver the placements, and earlier this month hosted the Data Scientist 2.0 event, with The Data Lab, to celebrate the conclusion of the MSc programme's second round.
The event, which included the awards ceremony, was also a networking opportunity, bringing students and industry experts together, to foster relationships with potential future employers. Paul Forrest, MBN chair says: “It was very difficult to judge as the placements were all very impressive and a mark of the quality of the data courses across Scotland. In the end we wanted to give the award to the placement that delivered the most social impact. The project on cataract patient pathways does just that. The award to Anita is well deserved.” Challenge programme The MSc Programme also includes a pioneering year-long data challenge competition. It was won by a team of students from the University of Stirling, Robert Hamlet, Peter Henriksen and Cynthia Morel who worked with Zero Waste Scotland to analyse data from three local authorities to develop a prototype centralised data platform that will detect, predict and prevent future flytipping across Scotland – an issue that is estimated to cost the country £2.5m. ### Executive Education Episode 1: C-suite Skills for the 21st Century Disruptive Live presents the first episode of Executive Education in association with Regent Leadership! This week’s episode will focus on C-suite Skills for the 21st Century. This week sees the panel hosts Colin Pinks, Director from Regent Leadership and Kylie Hazeldine, Business Development from Regent Leadership with their panel Matt Dreisin, Former CEO for MegaNexs and Professor Sir Deian Hopkin, Former Vice Chancellor London South Bank University. ### GDPR Practice Lead - Voices of GDPR - Adam Ryan from Calligo In this episode of Voices of GDPR, Compare the Cloud speaks to Calligo's GDPR Practice Lead, Adam Ryan. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. To find out more about Calligo visit: https://calligo.cloud/ Voices of GDPR ### MicroStrategy Delivers Analytics for Everyone with MicroStrategy 10.9™ Introducing Dossier™, a New Storybook Experience Around Analytics Tysons Corner, VA, October 2, 2017 -- MicroStrategy® Incorporated (Nasdaq: MSTR), a leading worldwide provider of enterprise analytics and mobility software, today announced the general availability of MicroStrategy 10.9, the newest feature release to the company’s MicroStrategy 10™ platform. This feature release introduces Dossier, a new storybook experience for everyone across the enterprise that makes it easy to consume, share, and collaborate around analytics. To learn more about the new features in MicroStrategy 10.9, visit our What’s New in MicroStrategy page. “MicroStrategy 10.9 represents the biggest leap forward since our MicroStrategy 10 platform launch and underscores our vision of delivering ‘Intelligence Everywhere’,” said Tim Lang, Senior Executive Vice President and Chief Technology Officer, MicroStrategy Incorporated. “We believe collaborative analytics accelerates the velocity of decision-making. That’s why we’re introducing Dossier, an easier and faster method of consuming analytics that we believe end users are going to love.  MicroStrategy 10.9 empowers users to do more with their analytics regardless of their technical skill or role." 
Dossier Delivers New Storybook Experience Around Analytics  MicroStrategy 10.9 delivers Dossier, an interactive book of business which combines relevant analytics into a single place, organized into chapters and pages. The new streamlined interface goes beyond dashboards and brings data together in a format that all users recognize and understand. The new consumer-oriented and browser-based interface offers end users a modern and interactive means to explore data.  Dossier makes it easy to navigate and share reports and visualizations that matter to users. “Assisting customers in improving their businesses through media-centric analytics insights is a top objective at Imagine, and MicroStrategy 10.9 is instrumental in helping us achieve this goal,” said Adam Gotlieb, Director of Product Management for Global Traffic, Imagine Communications. “Dossier introduces a new streamlined interface that goes beyond dashboards and brings key data into a format that users can understand and use to make better, actionable decisions and identify new opportunities.” Collaborative Analytics Turbocharges Decision Making Version 10.9 enables users to collaborate via discussion threads and real-time comments within the Dossier interface. Anyone who has access to a dossier through the MicroStrategy Library™ can ask questions, highlight trends, and share their current filter state with others. Users who are tagged in a comment can receive real-time notifications via email, browser banners, page and library alerts, or push notifications. MicroStrategy Library Helps Users Quickly and Easily Find Content and Answers The 10.9 release delivers the MicroStrategy Library, a personalized portal for all users to access their BI content and dossiers. End users can login to the MicroStrategy Library to find a list of all dossiers to which he or she has subscribed or has access. Each dossier in the library is represented by a customized thumbnail image, making it easy to visually identify content. Additionally, powerful search options are available for users to find content quickly. New notifications and comments made to the user will show up clearly as an alert, so the user doesn’t miss an update. Regardless of what browser or mobile device a user is on, the MicroStrategy Library is responsive in design and will automatically resize as needed. Visual-based Filtering Accelerates Data Exploration Dossier delivers a completely revamped approach to filtering data. Users can explore their data with a clean, simple interface and multiple filter types, including visual filters, advanced qualification filters, and in-canvas selectors. With visual filters, users can now embed any MicroStrategy visualization into the filter panel of a dossier, providing a visual and intuitive way to filter information without taking up valuable screen real estate. [easy-tweet tweet="MicroStrategy 10.9 makes it easier for users to access, analyze, and operationalize their data" hashtags="Data, MicroStrategy"] Redesigned Data Discovery & Dossier Authoring Facilitate Faster Access to Data and Insights MicroStrategy 10.9 makes it easier for users to access, analyze, and operationalize their data. MicroStrategy 10.9 provides business analysts with more connectors to data sources, easier data preparation, visual analytics and guided advanced analytics. Analysts can author once and deploy on multiple devices, without worrying about the target device. 
Dossier and Document Certification Maintain Effective Data Governance A common need for organizations that have invested in self-service strategies is the ability to ensure that everyone is able to differentiate content that is tied to certified versus untrustworthy data. With MicroStrategy 10.9, analysts and data stewards can watermark content with a ‘certified’ stamp.  This step offers an easy way to implement governance across a self-service deployment and makes it clear for business users which content is tied to trusted data sources. New Connectivity, Including Enhanced Out-of-the-box Connectors, to Diverse Types of Data MicroStrategy 10.9 amplifies existing investments by keeping up with the latest certifications and providing instant access to nearly any type of data with out-of-the-box connectors to popular data sources, including personal spreadsheets, enterprise data warehouses, text files, cloud sources, and big data systems. With a clean simple interface, analysts can sort through 80+ data sources to quickly find the source they need. Version 10.9 delivers live data access for Hadoop Gateway, allowing end users to get the latest information from HDFS. In addition, MicroStrategy 10.9 delivers an SDK to build out new out-of-the-box connectors that help analysts visualize text and semi-structured data such as Solr, Box, One Drive, Elasticsearch and many others. These new connectors allow analysts and citizen data scientists to run queries against semi-structured data, like log files and text scripts. Data is available to end users in tabular format and can be readily used to build reports or dossiers. New connectors built using the Data Connector API are regularly available and can be downloaded by visiting the MicroStrategy Community. Greater Flexibility to Create Custom MDX Queries for Advanced Financial Reporting MicroStrategy 10.9 enables business users and analysts to visualize and interact with multi-level or ragged hierarchy reports against MDX sources, and delivers enhancements with dynamic filtering, persisted sort order, efficiency improvements, and support for derived attributes and metrics. Business users can now apply mathematical and other transformational functions, and create derived elements within hierarchy reports using more complex formulas and calculations. New Embedding APIs Easily Integrate Dossiers into Websites, Blogs and Portals Offering a vast array of APIs, MicroStrategy 10 is an open platform that works with larger technology ecosystems and offers easy ways to extend its capabilities. With MicroStrategy 10.9’s new embedding APIs, dossiers can be easily embedded into third-party applications with the ability to interact with MicroStrategy content and associated functionality. Users can slice and dice the data, navigate to different pages and chapters in the dossier, apply filters, and interact and analyze the data within the application itself. In addition, single sign-on or guest embedding can be enabled, so the user can interact with the dossier without having to log in within the third-party application. “After testing MicroStrategy 10.9 as an early adopter, we are excited about the new set of APIs in the release that will make embedding and integration of MicroStrategy capabilities with applications of our choice easier,” said Praphul Kumar, Senior Director of Product Management, Genesys. 
“As we create the world's most powerful, personal and intelligent customer experiences, MicroStrategy helps us to democratize those experiences through their rich set of APIs. We use MicroStrategy APIs to embed real-time and historical information right into the hands of contact centre agents and supervisors so they can view and analyze that information in any application. MicroStrategy is the most open platform in the market, and we are excited to build more data-driven client applications with the new APIs." [easy-tweet tweet="MicroStrategy customers can deploy MicroStrategy on AWS in data centres" hashtags="AWS, DataCentre"] Upgrades to MicroStrategy on AWS Strengthen Analytics in the Cloud MicroStrategy on AWS allows organizations to deploy and manage enterprise analytics environments through the provisioning console. MicroStrategy 10.9 adds significant enhancements to its cloud offering, including single-click hotfix upgrades and the ability to deploy MicroStrategy into existing Virtual Private Cloud (VPC) accounts. New deployments will be automatically launched on the latest R4 instances, which are optimized to require less memory. MicroStrategy customers can deploy MicroStrategy on AWS in data centres in eight different locations: Northern Virginia, Ohio, Oregon, Ireland, Frankfurt, London, Sydney, and Tokyo. The MicroStrategy on AWS provisioning tool is also available in nine different languages: English, French, Italian, Portuguese, Spanish, Dutch, Japanese, Korean, and Chinese. The latest release also introduces various additional enhancements, including:
- Responsive design that makes it easier than ever to build and deploy analytics across different device types and screen sizes;
- New and improved data connectors that allow organizations to access datasets across different projects in a MicroStrategy deployment. Version 10.9 also certifies the latest data sources, including MariaDB 10.x, Apache Hive 2.x, Spark SQL 2.0, and MapR 5.2, and delivers improvements to data connectors for Presto, Snowflake, and Amazon Redshift; and
- Out-of-the-box connectors for both Solr and Elasticsearch.
Release of MicroStrategy 10.4™ Hotfix 5 Alongside the MicroStrategy 10.9 feature release, MicroStrategy 10.4 Hotfix 5 is generally available to customers today. This hotfix improves overall product stability and includes over 200 customer-reported enhancements and fixes across the platform. Furthermore, this latest update to the MicroStrategy platform release provides driver and feature parity for relational and big data gateways with the latest MicroStrategy feature release (10.9). This enables access to brand-new data sources and provides optimized connectors for additional data sources — allowing customers to expand and enhance their existing applications. The latest hotfix (10.4.5) is also available for MicroStrategy on AWS. MicroStrategy customers can apply MicroStrategy 10.9 or 10.4.5 to their analytics environments today by downloading the new release from the MicroStrategy download site. To see MicroStrategy 10.9 in action, attend our upcoming global Symposium Series, or register for our "MicroStrategy 10.9 and Dossier: Collaborative business analytics that boost adoption" webcast, taking place on Wednesday, October 25, 2017, at 2:00 p.m. EDT. About MicroStrategy Incorporated MicroStrategy (Nasdaq: MSTR) is a worldwide leader in enterprise analytics and mobility software.
A pioneer in the BI and analytics space, MicroStrategy delivers innovative software that empowers people to make better decisions and transform the way they do business. We provide our enterprise customers with world-class software and expert services so they can deploy unique intelligence applications. To learn more, visit MicroStrategy online, and follow us on Facebook and Twitter. MicroStrategy, MicroStrategy 10, MicroStrategy 10.4, MicroStrategy 10.9, MicroStrategy Library and Dossier are either trademarks or registered trademarks of MicroStrategy Incorporated in the United States and certain other countries. Other product and company names mentioned herein may be the trademarks of their respective owners. ### Fintech - Discussion on Blockchain - Finastra's Peter Farley Compare the Cloud interviews Peter Farley, Capital Markets Strategist at Finastra. Peter speaks about how Finastra's main role is to provide the plumbing infrastructure for financial institutions to run their business efficiently. What does Blockchain mean? Blockchain technology provides an open decentralised database of every transaction involving value. It creates a chronological and public record that can be verified by the entire community. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Finastra here: http://www.finastra.com/ Blockchain ### A Smooth Cloud Transition for Budgeting and Planning Organisations are increasingly realising the transformative potential of the cloud to enhance budgeting and planning and drive better business results. Although finance is often one of the last departments in the enterprise to go cloud, the shift is on, according to Gartner, which disclosed that it is no longer covering non-cloud solutions in its latest report. And CFOs agree; according to the Adaptive Insights CFO Indicator Q1 2017 report, finance leaders expect technologies to increasingly reside in the cloud. Indeed, CFOs estimate that 33 percent of their IT infrastructure is Software-as-a-Service (SaaS) today, and they forecast this to grow to 60 percent of their infrastructure in four years. With a few exceptions, the key question facing CFOs and Financial Planning and Analysis (FP&A) leaders is fast becoming not if you are moving to the cloud, but when you are doing it. Yet committing to a cloud transition is only the first step. It is essential to take key actions to help ensure a smooth transition and position finance teams and the broader organisation to reap the full benefits that a cloud solution can offer. Here are three phases to a successful cloud migration for FP&A: Assess the current state of the organisation The legwork companies do leading up to the migration to the cloud can go a long way toward ensuring a smoother, more productive future in the cloud. The key to the pre-migration assessment is to identify the systems, technology, and processes currently being used to manage and analyse financial and operations data in the organisation. It is also important to get a clear snapshot of the data itself. Where is the data being housed? Is it siloed? Is there general agreement on what is the core data for the organisation?
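One lightweight way to capture that snapshot is a simple inventory that records where each finance data source lives, who owns it and whether it sits in a silo. The sketch below is purely illustrative; the source names, systems and fields are invented for the example rather than taken from any particular organisation.

```typescript
// Illustrative pre-migration inventory of finance data sources.
// System names, owners and fields are invented for the example.

interface DataSource {
  name: string;
  system: "spreadsheet" | "erp" | "crm" | "data-warehouse";
  location: "on-premise" | "cloud";
  owner: string;
  siloed: boolean; // true if only one team can access or update it
}

const inventory: DataSource[] = [
  { name: "Headcount plan", system: "spreadsheet", location: "on-premise", owner: "FP&A", siloed: true },
  { name: "General ledger", system: "erp", location: "on-premise", owner: "Finance", siloed: false },
  { name: "Sales pipeline", system: "crm", location: "cloud", owner: "Sales ops", siloed: true },
];

// Flag the silos that the cloud migration should consolidate first.
const migrationCandidates = inventory.filter((source) => source.siloed);
console.log(migrationCandidates.map((source) => source.name));
```

Even a rough inventory like this gives the assessment a concrete starting point.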
This assessment will help create a benchmark from which to measure the organisation’s progress, while also ensuring the cloud system deployed has the functionality, capacity and capabilities to deliver the most value. It will also provide reassurance to executive sponsors and the leadership team that the necessary research has been done and the potential benefits clearly identified. The assessment also helps identify processes or technology that are currently delivering value and might have a place in the post-cloud environment. Educate and inform Those already well-versed in cloud know its many benefits. For FP&A, the cloud frees teams from the drudgery of static planning and enables them to move to an active planning process. However, even the best technology and most seamless deployment cannot be successful if you do not get buy-in from everyone from the executives to the admins. It is only human nature for people to focus on “what’s in it for me” when faced with change. To be successful, organisations need more than just a cloud transition strategy for the technical and logistical aspects of moving from an Excel environment or a legacy system to the cloud. They also need a cloud migration strategy to effectively communicate the benefits, as well as provide adequate training to get partners from throughout the organisation understanding and leveraging the cloud technology. Ultimately, moving to the cloud should be positioned as not just an FP&A project, but rather, an advancement that offers benefits to the whole organisation through more efficiency, better collaboration and stronger business results. [easy-tweet tweet="It is important to continue to promote the benefits of the cloud " hashtags="Cloud, IT"] Do not declare victory early If companies accurately assess their current state and effectively communicate the benefits of the cloud, they should move through the first phase of the transition with real momentum. From there, the key is to sustain it. It is important to continue to promote the benefits of the cloud as an opportunity to adopt active planning that can enhance efficiency and create a competitive edge. Additionally, they can leverage forecasting and modelling capabilities that offer value to business partners by providing insight on everything from managing headcount to more effectively tracking sales and inventory. By managing the transition in this way, companies continuously build on the foundational benefits seen from the transition to the cloud. This will build an army of advocates, while also positioning FP&A as collaborative strategic partners, as opposed to order takers and number crunchers. By making it clear from the start that a full cloud transition takes time and is most successful when executed in a phased approach, companies will buy themselves time to ensure that the migration is successful and offers the best opportunity for strong ROI. Ultimately, by taking a well-planned, strategic approach, organisations can dramatically improve the chances that their transition to the cloud goes smoothly, is well received and delivers maximum value to the business. ### Customer Experience - Discussion on Blockchain - Servion's Sameet Gupte Compare the Cloud interviews Sameet Gupte, CEO of Servion. Sameet speaks about how Servion focuses on customer experience as that is where they believe the maximum spend will be over the next 4-5 years. What does Blockchain mean? Blockchain technology provides an open decentralised database of every transaction involving value. 
It creates a chronological and public record that can be verified by the entire community. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Servion here: http://servion.com/ Blockchain ### Facebook/NFL Deal: Premier League To Be Next? Speaking about the NFL and Facebook's latest programming partnership deal, Dror Ginzberg, Co-Founder & CEO of Wochit, said: "This groundbreaking deal comes at a worrying time for the NFL, as all the major American TV networks are reporting sustained audience declines for live viewing over the past year." [easy-tweet tweet="Increasingly, people are looking for 'near live' game highlights online" hashtags="Online, IT"] "Increasingly, people are looking for 'near live' game highlights online, as opposed to watching full games, especially as American Football games are quite lengthy compared to other sports. From Wochit's own research, video content ranging from roughly 60 to 90 seconds gets both the most views and shares on Facebook, meaning this new deal puts them in a prime position to capitalise on shifting consumer habits." "Now, Facebook has an undeniable edge with high quality, premium content that other online competitors won't have access to, while also having access to NFL Films' content to distribute on its fledgling broadcast service Facebook Watch." "Undoubtedly the decision-makers from the Premier League will be analysing this deal closely. Similarly to the NFL, the Premier League's TV rights once seemed the most bankable TV content until last year when ratings fell dramatically. The next broadcasting deal for the league will likely be historic as digital-only players finally take on Sky and BT." ### Network security 'not a top priority' for IT bosses despite Equifax breach IT executives are failing to recognise the importance of IT security, new research has revealed, with most businesses placing the issues of completing IT projects on time and reducing IT costs ahead of a secure network, despite the concerns highlighted by the recent Equifax data breach that exposed the data of 400,000 UK residents. The study, commissioned by global managed service provider Kaseya, included 900 small to medium businesses (firms with up to 5,000 employees) and highlighted an underlying issue with regard to IT security. Only 21% of respondents from firms deemed 'efficient' in their IT operations labelled security as their main issue. This is a surprising statistic considering that 40% of respondents across the survey admitted that ensuring compliance and security was their second most important technology challenge in 2017. The concern of working with outdated legacy systems hampering growth and innovation was listed as the biggest issue, cited by 44% of respondents. [easy-tweet tweet="83% of businesses still focus on day to day IT management tasks" hashtags="IT, Cloud"] The research also delved deep into the stages of operational IT maturity experienced by SMBs, indicating that many firms are still struggling to get the most out of their IT capabilities, with 83% of businesses still focusing on day-to-day IT management tasks that are often time-consuming and manual, rather than working with comprehensive and aligned processes.
The race to the cloud also emerged as a clear trend, with one-quarter of advanced IT companies turning to cloud hosting services for private cloud; market leaders such as Microsoft were rated as the vendor of choice, with Amazon close behind. The outsourcing vs on-premises debate has also proven to be a widely discussed subject over the past year, and the research discovered that backup and recovery is the function most likely to be outsourced among SMBs (33%). Server support and network connectivity also proved to be popular, recording 30% and 29% of the poll respectively. "One of the most important findings in our annual survey of SMB and midmarket enterprises and their IT practices is a significant year-over-year increase in the number of what we call 'mature' organisations. As companies of all sizes embrace modern tactics and leverage strategic new technology, such as the cloud, predictive analytics and advanced security, they become more 'mature' as an IT organisation as a result," said Taunia Kipp, global senior vice president of marketing for Kaseya. The final aspect of the study closely analysed the role of the CIO and supporting IT department. While the CIO's role has changed over the past decade, the regard in which the role is held depends on a business's overall level of IT maturity, with 81% of advanced businesses believing that the head of the IT department influences C-level decision-making, compared to just 76% of firms that were less efficient in their IT capabilities. About Kaseya Kaseya is the leading provider of complete IT management solutions for managed service providers (MSPs) and midsized enterprises. Through its open platform and customer-centric approach, Kaseya delivers best-in-breed technologies that allow organizations to efficiently manage and secure IT. Offered both on-premise and in the cloud, Kaseya solutions empower businesses to command all of IT centrally, easily manage remote and distributed environments, and automate across IT management functions. Kaseya solutions manage over 10 million endpoints worldwide. Headquartered in Dublin, Ireland, Kaseya is privately held with a presence in over 20 countries. To learn more, visit www.kaseya.com. ### Swyx unveils new products at its Partner and Technology Conference 2017 At the Swyx Partner and Technology Conference 2017, the unified communications specialist presented SwyxWare 11 for the first time, with innovative new features including a practical addition for handling high call volumes and a new generation of IP phones. There are also further improvements in connectivity for mobile users with the current version of the Swyx solution. Version 11 of the Swyx communication solution offers a range of new features and functions. The suggestions of partners and customers have been incorporated into the development process so Swyx can support them even better in their daily work. The latest solution now offers features that make installation and administration easier. Using the online tool Swyx Configuration Planner, partners can arrange the individual settings discussed with each customer and generate a configuration file at the push of a button. All predefined configurations are then automatically adopted and the installation time is significantly reduced.
The new 'Property Cloning' function also reduces the amount of work involved: it allows you to change the basic settings for several users at the same time and enables administrators to configure different user groups much faster and more conveniently for new installations or adaptations. Efficiently manage high call volumes with SwyxPLUS VisualGroups In the new Swyx Version 11 release, the new product option SwyxPLUS VisualGroups is available. This addition offers an intelligent and effective solution for conveniently distributing calls within a group, enabling companies to increase accessibility and improve service quality. Employees in smaller sales and support teams, at the front desk or at reception can now handle a large number of incoming calls very efficiently. Through integration into the central user interface of Swyx, all team members have a clear overview of the queue, which is easy to configure and can be edited directly in the Swyx Client. Thanks to the statistics functionality of SwyxPLUS VisualGroups, service quality can be easily monitored by those responsible; for example, the average waiting time of callers can be seen. The addition is seamlessly integrated into the Swyx communication solution and can be installed without any additional effort. VisualGroups forms part of a series of practical Swyx options that make unified communications even more productive, such as SwyxPLUS VisualContacts with its tried-and-tested integration of all contact data. Intelligent IP phones and even better mobile integration At its Partner and Technology Conference, Swyx will also be presenting a new generation of desktop telephones for optimal voice communication within the workplace: the SwyxPhone L62, L64 and L66. The intelligent SwyxPhones, based on Unify devices, are deeply integrated into the Swyx software and in combination offer many useful functions, from phone book and call lists to presence information. These intuitively operated IP phones are characterised by their high audio quality, practicality and simple setup. The new SwyxPhones impress with their modern design, which combines an appealing appearance with ergonomic controls and a small footprint. New versions of the Swyx Mobile clients also ensure that smartphone users are even better integrated. With the latest update, Swyx Mobile for iOS is not only compatible with Apple's new iOS 11 operating system; through the use of Apple's CallKit, IP-based calls are now also integrated into the iPhone. Users can switch between GSM and IP calls without disruption and, thanks to the common caller lists, can keep track of which calls they have made over which connections. In conjunction with Server version 11, the app also supports the system's own push notifications. This makes handling even more convenient and improves the battery performance of the mobile phone. The Android version of Swyx Mobile will also support push notifications in future. About Swyx Swyx your business. Swyx Solutions AG develops software-based communication solutions for the requirements of medium-sized companies. A range of suitable phones and other hardware devices completes the product portfolio. With Swyx, users benefit from all the advantages of optimally networked communication: they can increase accessibility, improve processes and ultimately increase productivity. Founded in 1999 as a pioneer in the field of IP communications, the company is now the European market leader in this field.
With a two-tier distribution model and more than 1,200 authorised dealers, Swyx is present in 24 countries worldwide. The award-winning unified communications specialist, headquartered in Dortmund, Germany, and with locations in the UK and France, offers both in-house communications solutions and cloud-based variants together with its partners. A total of around 800,000 users already rely on Swyx. Currently, more than 150 employees are committed to ensuring that Swyx continues to be one of the technology and innovation leaders in the IP communications market. ### Financial Markets - Discussion on Blockchain - IBM's Søren Mortensen Compare the Cloud interviews Søren Mortensen, Director of Financial Markets at IBM. What does Blockchain mean? Blockchain technology provides an open decentralised database of every transaction involving value. It creates a chronological and public record that can be verified by the entire community. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about IBM here: https://www.ibm.com/uk-en/ ### Programmatic Advertising Traffic - Discussion on Blockchain - EnvisionX's Zheng Zhang Compare the Cloud interviews Zheng Zhang, CEO of EnvisionX. Zheng Zhang talks about how his company launched a programmatic advertising marketplace. What does Blockchain mean? Blockchain technology provides an open decentralised database of every transaction involving value. It creates a chronological and public record that can be verified by the entire community. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about EnvisionX here: http://www.envisionx.co/ Blockchain ### Avoid Making These Top 5 Cloud Computing Mistakes That Can Seriously Damage Your Business Moving your business to the cloud can significantly improve the convenience and cost of doing business. Businesses are increasingly moving towards cloud-based technology to conduct their day-to-day business dealings. But the seriousness of this transition, and the related dangers that could possibly occur, cannot be overlooked. While there are a great number of real benefits a business can experience by utilizing this technology, mistakes can be made that could seriously impact the efficiency and overall security of a business. Here are the top five cloud computing mistakes most commonly made by businesses. Not establishing role-based access It's absolutely essential that you limit access to different information only to the specific personnel who are qualified to receive it. In the same way that not everyone within an office has access to financial information about the business, not everyone working within the cloud should have access to everything contained within. Information cannot be compromised, so it's vital that each individual, dependent on their role, is set up with an appropriate level of access.
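To make the idea concrete, the sketch below shows what a simple role-to-permission mapping can look like in code. The roles, actions and user are invented for the example; in practice most organisations would lean on the identity and access management controls of their cloud provider rather than rolling their own.

```typescript
// Minimal role-based access sketch. Roles and permissions are invented
// for illustration; real deployments should use the cloud provider's IAM.

type Role = "admin" | "finance" | "support";
type Action = "read:financials" | "write:financials" | "read:tickets";

const permissions: Record<Role, Action[]> = {
  admin: ["read:financials", "write:financials", "read:tickets"],
  finance: ["read:financials", "write:financials"],
  support: ["read:tickets"],
};

interface User {
  name: string;
  role: Role;
}

function can(user: User, action: Action): boolean {
  return permissions[user.role].includes(action);
}

// Example: a support agent cannot read financial records.
const agent: User = { name: "Sam", role: "support" };
console.log(can(agent, "read:financials")); // false
```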
“An added level of security that's recommended to safeguard information is to add a two-step authentication process at all access points. This helps ensure that your cloud resources are not accessed without authorization, either accidentally or maliciously," says Mike Baldwin, a Data Analyst at Big Assignments. [easy-tweet tweet="All data held within your cloud needs to be encrypted" hashtags="Cloud, Data"] Assuming your cloud data is automatically protected It is a typical assumption made by many companies that their cloud-based data is protected by the company providing the technology. Simply put, this is not true and, even though your cloud provider may have to comply with certain security standards, it does not mean that you aren't responsible for securing your own data. All data held within your cloud needs to be encrypted and secured at your own time and expense. But the good news is that putting these protections into place tends to lead to cost savings over time, because it prevents the loss of valuable information. Not ensuring a reliable connection It can be tempting to try to cut costs by choosing a lower-priced storage option or unreliable internet services. Lucille Highsmith, a Cloud Computing Manager at UK Writings, comments on the issue: "Don't put your valuable cloud data at risk in order to save a bit of money, because in the end it will cost you more, both financially and in all of the trouble and hardship of losing your information. Your cloud-based business is your business, so don't risk compromising it. Always ensure you've got the proper network requirements necessary to properly run your cloud-based infrastructure." Getting locked in by your vendor Moving from one cloud-based provider to another is not as simple as it may seem. There are a number of factors that may come into play when attempting to make this shift that may actually make it impossible to complete. In this situation, you could essentially be locked in to your current vendor, despite your desire to move on. Always be sure to read the fine print before signing on to any cloud provider, and ask questions prior to any agreement being made. Ask point-blank what the requirements are for moving your data out of their cloud. Not having a backup plan Knowing and understanding the different avenues of data retrieval is an essential component of your cloud-based recovery plan. "It's more peace of mind that your data is safe and secure, and you'll be equipped with the confidence that you're taking steps to ensure your data is backed up and protected in the event of a disaster," comments Mark Lazo, an Operations Manager at Australian Help. Many businesses are taking their work to the cloud because of the ease with which they can collaborate and share data, along with the associated cost savings. But it's essential that you take the necessary steps to avoid these common mistakes, so you and your data don't fall victim to the damaging effects that come along with them. ### Business Transformation - Observing Big Data - Hortonworks's John Kreisa Compare the Cloud interviews John Kreisa, Vice President of Marketing at Hortonworks. John talks about how Hortonworks is a company that provides open-source Big Data platforms for capturing, storing and analysing data at any scale. What does Big Data mean? Big Data is the phrase referring to a large volume of data, structured and unstructured.
As this data is so large, it is difficult to process through a typical database and needs to be analysed computationally to reveal trends and associations. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Hortonworks here: https://hortonworks.com/ Big Data ### Snow launches Snow Cloud Discovery to provide unrivalled visibility of SaaS and IaaS usage and spend New advanced analytics solution gives integrated view of cloud, on-premise and mobile usage to more effectively control spend and deliver Digital Business Transformation initiatives Snow Software, the global leader in Software Asset Management (SAM) and Cloud Spend Management solutions, today announced the launch of Snow Cloud Discovery, which brings unrivalled visibility into Software-as-a-Service (SaaS) application and Infrastructure-as-a-Service (IaaS) environment usage to empower organizations to optimize their cloud spend. Cloud adoption is increasing at a colossal pace, with Gartner projecting the worldwide public cloud services market will grow 18 percent to $246.8 billion in 2017, and estimating that IaaS and SaaS will grow 36.8 percent and 20.1 percent respectively this year. With this burgeoning utilization of cloud computing, it is essential that CIOs have a handle on how the IT budget is being used in order to capitalize on the potential to optimize cloud assets, identify wasted spend and accurately forecast future costs. Snow Cloud Discovery provides an integrated view of SaaS applications and IaaS environments, giving insight into usage at an organizational and user level. IT leaders have actionable intelligence on which to effectively manage their entire IT estate and take advantage of cloud to innovate and advance digital business transformation, without risk of unnecessary overspend. With Snow Cloud Discovery organizations can:
Discover
- Automatically see SaaS applications in use across the enterprise, regardless of which department purchased the subscription.
- Expose zombie Virtual Machines (VMs) across public cloud vendors with full inventory of software components and usage details.
- Automate the collection of data that is critical to the establishment of an accurate license position.
Visualize
- Consolidate a view of all application, environment and device usage across on-premise, public and private clouds, and mobile devices.
- Garner insight and highlight risks from simplified reporting of component-level utilization metrics.
Optimize
- Automatically identify, downgrade or reassign duplicate and unused SaaS subscriptions, and find and deprovision zombie VMs.
[easy-tweet tweet="Business unit IT spending will increase to 50% of enterprise IT spending for individual enterprises" hashtags="IT, Business"] According to Gartner: "Business unit IT spending will increase to 50% of enterprise IT spending for individual enterprises, with a strong focus on, or aspiration for, digital business transformation." "IT is increasingly being cut out of the decision-making and budgeting processes and this is being exacerbated by digital transformation projects," says Peter Björkman, CTO of Snow Software. "CIOs and IT leaders are feeling marginalized and their relationship with the rest of the business has changed irreversibly.
Snow’s deep analytical capabilities provide insight into all technology usage, including that of business units, enabling IT leaders to efficiently manage and allocate technology budgets to support organizational goals.” Cloud spend is spiralling out of control in many organizations, with CIOs and CFOs unable to identify wasted spend, optimize existing assets, or accurately plan for future costs. Complete visibility of all applications is essential to support an already-overburdened IT office if it is to get a handle on cloud spend within the business. With this visibility, IT leaders can influence how the organization and business units consume software, driving substantial cost savings and efficiency gains and preventing security risks. For more information on Snow Cloud Discovery, visit: https://www.snowsoftware.com/int/products/snow-cloud ### Calligo Acquires Luxembourg-based IT Services Business AMS Systems PSF Calligo, a leading global cloud solution provider, today announces that it has purchased AMS Systems PSF, a highly respected Luxembourg-based IT services business that provides managed services and cloud infrastructure to the financial services sector. [easy-tweet tweet="Calligo provides trusted, privacy-conscious cloud solutions to businesses across the globe" hashtags="Calligo, Cloud"] Founded in 2011, Calligo provides trusted, privacy-conscious cloud solutions to businesses across the globe. Calligo’s emphasis on GDPR services and data residency enables clients to leverage the advantages of combining innovative cloud technologies, unrivalled expertise and a commitment to the highest level of standards-based compliance and privacy. The business services hundreds of clients worldwide from its locations in the United Kingdom, Jersey, Guernsey, Switzerland, Singapore, Bermuda and now Luxembourg. “We’re thrilled to have found the right partner in AMS Systems PSF during this exciting period of growth for our business,” said Julian Box, Chief Executive Officer, Calligo. “I’m confident that AMS Systems’ proven track record, complementary technology services and excellent reputation will support Calligo’s strategic expansion into Luxembourg. This acquisition gives us a fantastic team, respected clients and unlocks the Luxembourg market. We’re also excited to announce that we will be the first CSP to host Azure Stack in Luxembourg. With the backing of our investor Investcorp Technology Partners, we are actively looking to execute further strategic add-on acquisitions over the coming months as we continue to expand our global footprint.” Mark Gillies, AMS Systems, said: “We are very excited about this agreement with such a respected and dynamic business as Calligo because it brings together two highly entrepreneurial organisations, providing us with a unique opportunity to expand our services while sharing in the success of Calligo’s expanding international cloud network.” Post acquisition, AMS Systems PSF will be integrated into Calligo. The rebranded company will continue to operate from its existing location with no change of personnel, thereby ensuring continuity of service for its clients. Over the next few months Calligo will expand the range of services provided, including being the first service provider to offer Azure Stack in Luxembourg. KPMG in Jersey & Luxembourg and AMMC Law in Luxembourg acted as advisors on the transaction.
### Why You Should Stick with the AWS Cloud Service As the cloud infrastructure market expands, the service offerings being made available to end users are growing and evolving. This shifting landscape has helped major players, such as Google Cloud and Microsoft’s Azure, enjoy strong growth over the last year. But, despite their recent success, both still trail behind the clear market leader, Amazon AWS – and by some distance. While both Azure and Google Cloud have strong plus points that are helping them to gain ground, neither yet has the same reach, scale or experience as AWS – all of which give AWS a head start on its competitors. These advantages include the following: Features and services When choosing between cloud infrastructure providers, the decision can often come down to the range of features available to the end user. In this respect, AWS has the advantage simply because it has a bigger offering than its competitors. The broader range of resources can allow you to build your infrastructure in different ways, and with AWS, many are supported by a range of dedicated services. For example, AWS provides a managed service for Elasticsearch for log analytics, full-text search and application monitoring. This makes it easier to deploy and scale this technology quickly as the necessary technical assistance is available. Technology agnostic AWS is much more technology agnostic than other cloud providers. For example, if you are used to working with the Linux operating system (OS) you would be far more likely to choose AWS over Microsoft Azure, which is far more focused on the Windows OS. While there can be advantages to dealing with just one cloud infrastructure supplier – you have only one supplier to chase on technical assistance and support queries, for example – it can be beneficial to give yourself flexibility. If you are not overly dependent on one supplier, you can then spread the risk in the event of a disaster. In this respect, AWS doesn’t tie you too closely to any one technology provider, providing a freer, more risk-averse option for long-term planning. It is worth noting that although Kubernetes – the future of the cloud – is the brainchild of Google, most of our customers are choosing to deploy this technology in AWS. This is despite Google providing its own managed offering. Community support The size of the AWS user community is a major advantage. The reality of developing cloud infrastructure is that it involves the integration of a large number of building blocks. This includes a huge variety of technology products and services that provide you with the required storage, computing power and databases, as well as the tools that make it easier to pull everything together. The cloud market is also highly innovative, with new options becoming available all the time. When it comes to integrating the latest technology, AWS’s position as the market leader gives it a clear advantage. With a much larger user base, it has a much bigger community to draw upon whenever advice is required to support a project. The ability to draw on that community’s experience and expertise can offer a significant shortcut when determining the best way to proceed. This makes it quicker and easier to design and develop an architecture that supports the needs of individual businesses with unique sets of requirements. It is also a key reason why most of our customers are deploying Kubernetes in AWS over Google.
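As a concrete illustration of the kind of dedicated service mentioned under Features and services above, the sketch below uses the AWS SDK for Python (boto3) to provision a small Amazon Elasticsearch Service domain. It is only an illustrative example: the domain name, region, instance type and volume size are made-up values, not recommendations from the article.

```python
# Illustrative sketch: provisioning a small managed Elasticsearch domain on AWS with boto3.
# Domain name, region, instance type, count and volume size are hypothetical example values.
import boto3

es = boto3.client("es", region_name="eu-west-2")  # the London region, as an example

response = es.create_elasticsearch_domain(
    DomainName="example-log-analytics",            # hypothetical domain name
    ElasticsearchVersion="5.5",
    ElasticsearchClusterConfig={
        "InstanceType": "m4.large.elasticsearch",
        "InstanceCount": 2,
    },
    EBSOptions={
        "EBSEnabled": True,
        "VolumeType": "gp2",
        "VolumeSize": 20,                          # GiB per data node
    },
)
print(response["DomainStatus"]["ARN"])

# The domain takes a while to become active; check its status before sending documents.
status = es.describe_elasticsearch_domain(DomainName="example-log-analytics")
print(status["DomainStatus"]["Processing"])
```

The same pattern applies to the other managed building blocks the article refers to: deployment and scaling sit with AWS, while the user works against a simple API.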
[easy-tweet tweet="AWS stressed the importance of security within its company values" hashtags="Security, AWS"] Location At its most recent annual Summit, AWS stressed the importance of security within its company values – and it is doing that in the UK by tackling data sovereignty issues. When AWS opened a data centre in London at the end of 2016, it started to build up a secure and reliable cloud system for its UK users - allaying common fears about the lack of data privacy for those organisations with data in other European locations. It has plans to expand further in the future, with facilities also proposed for IX Manchester. Even with Brexit looming, the message is clear for British AWS users: keep calm and carry on. With these guarantees and a more localised service, AWS continues to cement itself as the major player in the cloud services market – and this is also helping to address latency issues. Although companies like Netflix have managed to reduce this as an issue – by building up CDN networks to support their businesses around the world – AWS’s large global network of data centres is offering a helpful option for sectors such as telecommunication or e-commerce, where speed is a major concern. ### Informatica AI-driven Data Catalogue and Cloud Data Lakes Power Intelligent Disruption Informatica unleashes new intelligent data lake management solution for GDPR at Strata Data Conference 2017 Informatica®, the Enterprise Cloud Data Management leader accelerating data-driven digital transformation, today announced a new set of solutions and enhancements in the areas of intelligent data lake management and enterprise data cataloguing and discovery that enable organisations to turn data lakes into business value using Artificial Intelligence (AI) and the cloud to drive intelligent disruption. The new Informatica offerings and enhancements will debut at the Strata Data Conference 2017, Booth 543, on September 26-28, at the Javits Center in New York City. Informatica will also showcase the latest version of many core components of the Informatica Intelligent Data Platform® (release 10.2), powered by the CLAIRE™ engine. With clairvoyance in mind and AI at the centre, CLAIRE is the industry’s most advanced metadata-driven AI technology embedded in the Informatica Intelligent Data Platform. CLAIRE delivers intelligence to Informatica data management products and solutions by applying machine learning to technical, business, operational and usage metadata across the entire hybrid enterprise, on-premises and in the cloud. With its new solutions, Informatica now delivers: An exponential leap forward in accelerating regulatory compliance efforts for the era of GDPR, The industry’s only intelligent data catalogue that now integrates with Hortonworks Atlas, and The industry’s first end-to-end data lake management solution in the cloud to support Cloudera Altus. [easy-tweet tweet="Informatica Compliance Data Lake is a new, end-to-end data lake management solution architecture" hashtags="Data, Cloud"] Exponential leap forward in accelerating regulatory compliance efforts for the era of GDPR Informatica Compliance Data Lake is a new, end-to-end data lake management solution architecture that empowers compliance analysts to rapidly build comprehensive and trusted compliance reports. 
This is especially important now with regulations such as the European Union’s GDPR, which becomes effective in May 2018 and could have a significant impact on organisations and how they manage data pertaining to customers, partners, and other sensitive data entities. Built on Informatica Intelligent Data Lake®, Informatica Big Data Management®, Informatica Enterprise Information Catalogue and other Informatica technologies, the Compliance Data Lake powers a comprehensive view of previously inaccessible compliance-related data from email, social media, instant messages, financial transactions and other non-traditional sources. This provides more accurate compliance analytics and reporting to ensure and prove adherence to critical regulations, such as GDPR. Organisations can accelerate their regulatory compliance through rapid intelligent data discovery, optimised data processing, support for collaborative and efficient human workflows, and easy self-service data preparation with AI and machine learning-assisted automation and governance. Informatica delivers pre-built connectors and parsers to collect, process, and deliver a variety of application and user data, including unstructured data such as emails and chats of ever-increasing volume, and can deploy the solution in the cloud or on-premises. By enabling all types of compliance-related data to be ingested and processed in one place, a compliance data lake dramatically improves the analytics necessary to support compliance reporting. With the recent release of 10.2, Informatica Enterprise Information Catalogue is now integrated with Informatica Axon™, which powers the industry’s first enterprise data governance solution. This increases business and IT’s ability to collaborate to define, implement, and track outcomes, including those related to regulatory compliance. Industry’s only intelligent data catalogue now integrates with Hortonworks Atlas With 10.2, Informatica delivers the industry’s leading AI- and metadata-driven data catalogue for discovering, understanding, and managing all enterprise data wherever it resides, with one-click rapid deployments on Amazon AWS and Microsoft Azure. Powered by CLAIRE, Informatica Enterprise Information Catalogue can discover and catalogue all types of data and data relationships across the enterprise, including both structured and unstructured data. Informatica Enterprise Information Catalogue is the most comprehensive and intelligent enterprise data cataloguing and discovery solution to support Apache Atlas, Hortonworks’ open-source big data governance framework. The integration of Enterprise Information Catalogue and Atlas enables data professionals to get complete end-to-end visibility and lineage for all enterprise data outside of Hadoop and inside Hadoop-based data lakes. Industry’s first end-to-end data lake management solution in the cloud to support Cloudera Altus As a leading solution for data integration on data lakes in the cloud, Informatica Big Data Management will support Cloudera Altus, Cloudera’s Platform-as-a-Service (PaaS) offering that speeds the creation and operation of elastic data pipelines for data lakes in the cloud. Together, Informatica and Cloudera provide the power of Informatica’s advanced and extensive data transformations with the simplicity of Cloudera’s PaaS for an end-to-end data lake management solution in the cloud.
Informatica has reimagined big data and data lake management with the recent release of 10.2, managing all types of data at all latencies, including real-time streaming, for all types of users, at scale – while executing on-the-fly advanced data transformations and data quality rules using collaborative visualisations and governed self-service data preparation tools. Supporting Quotes: “Intelligent disruption is built on a solid data foundation and the pace of innovation, and is dependent on the ability to unleash the power of data to become more agile, gather faster insights and make smarter data-driven decisions,” said Ronen Schwartz, senior vice president and general manager, Cloud, Big Data and Data Integration, Informatica. “The new offerings and enhancements we are unveiling at the Strata Data Conference in New York are all aimed at helping organisations rapidly discover, access and unleash the power of all their data, whether in the cloud, on-premises or in big data environments. Informatica delivers a comprehensive approach to intelligent data lake management and data cataloguing and discovery, using AI and other technologies to ensure that customers get the data they need to power trusted advanced analytics.” “Throughout our long partnership, Cloudera and Informatica have developed many seamless integrations between Cloudera Enterprise and Informatica’s comprehensive data lake management portfolio,” said Philippe Marinier, vice president, Business Development, Cloudera. “Beyond being one of the first ISV partners to integrate with Cloudera Navigator, Informatica’s integration with Cloudera Altus Data Engineering will enable our joint customers to benefit from Informatica’s enterprise-grade connectivity, advanced data transformations, and high-performance pipeline optimisation while deploying metadata-driven data hubs in the cloud.” ### GÉANT and Microsoft partnership helps Research and Education communities achieve quicker breakthroughs with cloud technology September 27 2017, Cambridge, UK and Amsterdam, NL: GÉANT, Europe's leading collaboration on network and related infrastructure for research and education, today announced the latest achievement in a successful multi-year partnership with Microsoft to make powerful cloud computing environments more accessible to the research and education communities. The agreement between GÉANT, 36 National Research and Education Networks (NRENs), Microsoft and ten approved Microsoft resellers is part of GÉANT’s Infrastructure-as-a-Service (IaaS) framework, and provides a cloud-based offering tailor-made for the research and education community in 36 European countries, based on Microsoft Azure, and available through GÉANT’s innovative Cloud Catalogue. Thousands of universities, schools and research institutions can now benefit from Microsoft Azure cloud services with procurement, contracting and integration taken care of by GÉANT and the NREN community under a common GÉANT IaaS framework agreement, making Microsoft Azure cloud computing an easy-to-use option for the research and education communities, with greater cost-effectiveness and dedicated support. Key benefits include:

- Institutions can buy and use Microsoft Azure directly, without the need for complex and time-consuming tenders and contract procedures, and benefit from volume discounts.
- Framework contracts are compliant with EU privacy and data security regulations.
- Users can log in to the Microsoft Azure services using single sign-on (SSO), via their institutional identity management solutions.
- Network traffic costs are significantly reduced, with Microsoft Azure services connected to the high-performance data networks provided by GÉANT and its NREN partners.
- Support is available for moving workloads to Microsoft Azure.
- Existing Microsoft licensing arrangements can be leveraged through BYOL (bring your own licensing).
- Enterprise cloud management tools enable control, oversight and delegation to a community of users and groups.

Microsoft, GÉANT and the NRENs have collaborated closely for many years to enable the research and education community to benefit from advancements in technology, including cloud computing, which can help accelerate breakthroughs in research as well as enhance and optimise student engagement and institutional operations. Erik Huizer, GÉANT Interim Chief Executive Officer, said, “The GÉANT Cloud Catalogue is already delivering significant benefits to users across the European research and education communities. We are excited to have Microsoft Azure on this platform, further expanding choice and supporting our NREN partners in the cloud services they can offer to their customers.” [easy-tweet tweet="Microsoft Azure is the perfect choice for research and education institutions" hashtags="Microsoft, IaaS"] Anthony Salcito, Vice President of Worldwide Education at Microsoft, added: “Microsoft Azure is the perfect choice for research and education institutions. It allows quick access to powerful computing environments where complex data can be stored and shared easily and securely. It also gives researchers access to tools that can turn data into insights and enable them to collaborate with others quickly and securely to accelerate discovery. We are delighted to bring these benefits and opportunities to the research and education communities across the EMEA region through the GÉANT IaaS framework agreement.” To find an approved Microsoft Azure reseller on the GÉANT IaaS framework or for more information go to https://clouds.geant.org/geant-cloud-catalogue/geant-cloud-catalogue-iaas/microsoft-azure/microsoft-azure-resellers/ ### Randstad Goes All-in on AWS Leading global HR services company selects AWS to accelerate its digital transformation and optimize its global IT infrastructure Today, Amazon Web Services Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), announced that Randstad, a leading international staffing and recruitment company, will migrate its global data centres to AWS. With this move, Randstad aims to increase efficiencies and align IT infrastructure with its digital vision to provide high-quality service to its customers around the world. AWS’s proven security and operational expertise helps Randstad protect sensitive data and meet its compliance needs. Randstad also has access to the most robust and full-featured set of cloud capabilities with AWS, allowing Randstad to innovate from idea to launch in record time. [easy-tweet tweet="Currently, Randstad has a number of operating companies running successfully on AWS" hashtags="AWS, DigitalTransformation"] Randstad is consolidating its IT infrastructure across 30 IT departments, providing service to 40 operating companies globally. This consolidation involves the migration of Randstad’s applications and IT services to AWS.
Currently, Randstad has a number of operating companies running successfully on AWS, and the next step in Randstad’s journey is to move all of its operating companies to AWS so that the entire organization can take advantage of the cloud. In addition to driving digital innovation, Randstad’s move to AWS has helped the company become more agile and competitive while reducing capital expenditures. “At the beginning of our digital transformation, we adopted a ‘cloud-first’ policy for all of our new IT projects, which helped us to decrease our cost of innovation. Due to this success, we chose to migrate our entire global IT infrastructure to AWS,” said Bernardo Payet, General Manager at Randstad Global IT Solutions. “By migrating our data centres to AWS, we can further reduce our global IT operational costs and increase our pace of innovation.” “We are thrilled to be fueling Randstad’s digital transformation,” said Mike Clayville, Vice President, Worldwide Commercial Sales at AWS. “Randstad has already seen an uptick in companywide innovation and been able to implement deeper, more streamlined security practices with AWS. We look forward to continuing to support Randstad on their journey to AWS over the next year.” ### The Local Merchant - Observing Big Data - Kentico's Bart Omlo Compare the Cloud interviews Bart Omlo, head of sales for Kentico. Bart talks about how Kentico provides software for online marketing, web content management and eCommerce. What does Big Data mean? Big Data is the phrase referring to a large volume of data, structured and unstructured. As this data is so large it is difficult to process through a typical database and needs to be analysed computationally to reveal trends and associations. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Kentico here: https://www.kentico.com/ BigData ### Huawei Unveils New Mobile Cloud Storage for Smartphone Users New Huawei Mobile Cloud provides Huawei mobile phone users with a simple and secure service for data backup and synchronisation Huawei Consumer Business Group has today announced the launch of its new Huawei Mobile Cloud, enabling consumers to back up and restore their data and phone settings wirelessly, synchronise and easily transfer data across Huawei mobile devices, as well as store and access files safely using Cloud Drive. Users subscribed to the Huawei Mobile Cloud will receive 5GB of free cloud storage. From September onwards, the Huawei Mobile Cloud update will be gradually rolled out on the Huawei P10, P10 Plus, P10 lite and Nova 2, with other models to follow in due course. All photos and videos taken with the camera, screenshots and screen recordings can be automatically backed up to the Cloud. Users can simply access them from their browser at cloud.huawei.com or from the Gallery App on their device. [easy-tweet tweet="At launch, Huawei Mobile Cloud will offer 5GB of free cloud storage" hashtags="Mobile, Cloud"] Data can also be automatically synchronised across all Huawei mobile phones, so it’s quick and easy to access Contacts, Calendar, Wi-Fi and Notes using the Huawei ID. This data can also be managed through the Cloud Web Portal.
At launch, Huawei Mobile Cloud will offer 5GB of free cloud storage and there will be the option to upgrade and purchase more storage plans from 2018 onwards. All data is stored exclusively within the EU on European servers, in compliance with EU Data Protection and Privacy Laws. All the services have been designed with user privacy in mind and the Huawei Mobile Cloud Services are certified by CSA Star (ISO/IEC 27001). "The launch of Huawei Mobile Cloud highlights our commitment as a business to creating a more convenient mobile experience for our users, all the while assuring them that their data can easily be backed up and restored, as well as remaining secure,” commented Walter Ji, President of Huawei Western Europe’s Consumer Business. “All files and photos stay within EU servers and we have local Legal, Security and Privacy expert teams in EU, to give users complete peace of mind.” Full Huawei Mobile Cloud feature support will be initially available at launch in Germany, Italy, France, Spain, UK, The Netherlands, Finland and Poland, with more countries to follow later. Updates for the Huawei P10 and P10 Plus will start rolling out on September 27th, followed by the P10 lite on September 28th, while the Nova 2 will receive Mobile Cloud on September 30th. Carrier launch schedules will vary. About Huawei Consumer BG Huawei's products and services are available in more than 170 countries, and are used by a third of the world's population, ranking third in the world in mobile phone shipments in 2015. Fifteen R&D centers have been set up in the United States, Germany, Sweden, Russia, India and China. Huawei's Consumer BG is one of Huawei's three business units and covers smartphones, mobile broadband devices, home devices and cloud services. Huawei's global network is built on 20 years of expertise in the telecom industry and is dedicated to delivering the latest technological advances to consumers around the world. For more information, visit Huawei Consumer BG online: http://consumer.Huawei.com/en ### Data Management - Observing Big Data - DataStax's Martin James Compare the Cloud interviews Martin James, Regional Vice President of DataStax. Martin talks about how DataStax helps organisations with a data management environment, where they need to provide relevant data to their customers. What does Big Data mean? Big Data is the phrase referring to a large volume of data, structured and unstructured. As this data is so large it is difficult to process through a typical database and needs to be analysed computationally to reveal trends and associations. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. 
Find out more about DataStax here: https://www.datastax.com/ Big Data ### Secure Identity Ledger Corporation Introduces Blockchain Platform for Delivering First-Ever Turnkey Digital ID System New blockchain platform to put consumers in control of their online identities:

- Unique technology developed for people to create their own, encrypted One Digital ID(sm)
- Digital tokens will be purchased using Visa, MasterCard and Amex as well as digital currencies such as Bitcoin
- No data stored, only verification provided via a managed digital identity

Falls Church, VA, September 27, 2017 – Secure Identity Ledger Corporation (SILC)(sm), the first digital identity platform for industry, government and consumers, today announced in advance of its Token Sale the preview of its blockchain platform and web portal. My Secure Ledger will be the entry point for individuals, government and companies to learn how digital ID data can be recorded, managed and distributed in a more equitable way. For more information, please visit http://www.secureidentityledger.com SILC will provide the first-ever turnkey digital ID system for consumers, including registration, authentication, verification and personal monitoring. The platform will provide individuals with a unique digital token that enables them to prove their identity to third parties. And unlike other blockchain solutions that operate on a Proof-of-Stake or Proof-of-Work basis, SILC operates through a Verified Existence Algorithm(sm) in which both parties can exchange and confirm information through the SILC dashboard without it being recorded. SILC’s One Digital ID is cryptographically secure, unalterable and legitimate. “While governments and businesses have been driving development of blockchain technologies, consumers are poised to be the biggest users and winners of the blockchain,” said Danny Lee, co-founder and board member of SILC. “With One Digital ID on My Secure Ledger, people can take back control of their online identity and protect themselves from the increasing number of breaches and attacks.” Since the dawn of the personal computer age (1980s) and increasingly during the Internet age (1990s), consumers have given up control over their personal information. Our online identities today are made up of a swath of personal data scattered across the Internet under multiple usernames and passwords. Banks, online shopping destinations, social networks and hackers, among others, are in control of our ID. And as the physical and digital worlds merge – from online grocery shopping to virtual doctor’s visits – it’s more important than ever that users take back control. [easy-tweet tweet="SILC aims to revolutionise the way personal information is managed" hashtags="#Blockchain #Security"] “SILC aims to revolutionise the way personal information is managed, building trust among consumers and businesses in the Blockchain Age(sm),” said Quang Trinh, co-founder and board member. “SILC will provide One Digital ID for every person, so they can confidently and securely access healthcare, social assistance, academic certifications, voting and more. The time for One Digital ID is now.” Consumers who create their digital IDs now will be among the first to benefit from the Blockchain Age. They will receive one, encrypted digital ID in sync with the blockchain, not stored on a phone or computer that can be easily lost, stolen or broken.
One Digital ID is stored in pieces that, when used in a transaction, come together to form each individual’s own, unique pattern to validate their identity and complete an exchange or operation. By building its own blockchain platform, My Secure Ledger, SILC is removing roadblocks such as miners dictating fees or forking the chain. Consumers can be confident that the My Secure Ledger blockchain will always be accessible and supported, and that the impending bloat of the existing blockchains won’t impact their accounts. For updates on the pending Token Sale, please sign up at the SILC website: http://www.secureidentityledger.com ### Drone Major Group launches to provide first-ever global connectivity platform to the fast-growing drone industry Today the launch of Drone Major Group Limited, a new group of companies focused on the fast-growing drone industry, has been announced, bringing together a number of complementary businesses which provide connectivity for the first time ever to the global drone industry. Drone Major Group is the first company in the world to provide a unique and integrated suite of products and services to the global drone industry at what is an important time in the development of standards and growth of this exceptionally dynamic sector. Central to the business is a new portal – Drone Major – launched today, which will provide critical connectivity for the global industry. Until now there has been no single resource available to help navigate and procure an extensive range of drones and drone-related services. This will be the world’s first online portal to facilitate the sale of drones and drone-related equipment and services throughout the world. A membership scheme will also deliver added-value benefits and ‘one-stop’ services including ‘Drone Major experts,’ who can provide information and guidance to navigate the myriad of choices available to the market. The platform’s current members comprise over 80% of the drone market, and include some of the world’s largest drone operators. Another integral company within the group will be SUAS Global (Surface, Underwater, Air, Space), currently the world’s leading online network for the drone industry with over 40,000 global subscribers. Founded by ex-army entrepreneur Robert Garbett, Drone Major Group has at its operational core Software Major, a software development company, and Cyber Major, a fast-growth risk assessment and resolution consultancy. These two companies, combined with SUAS and the Drone Major portal, provide first-time-ever connectivity to the world’s drone industry. Drone use is already well established and growing across a vast array of applications, ranging from emergency services, including search and rescue and tackling fires, through to new innovations in agriculture, construction, humanitarian aid, wildlife preservation and personal security, generating predictions that drones will spawn a $100 billion industry by 2020. [easy-tweet tweet="The advance of the drone industry towards full connectivity is inevitable" hashtags="Drone, Platform"] Robert Garbett, Founder and Chief Executive, Drone Major Group Limited, comments: “The advance of the drone industry towards full connectivity is inevitable.
How long this will take will depend on the will of the industry and its ability to develop robust standards; the willingness of regulators to enable new applications; the advancement of technology to tackle the challenges ahead; and the ability of the innovators to understand what technology is available to enable their vision of the future. At Drone Major, with our growing online community, and involvement of most of the drone industry’s leading companies, we are responding to the requirements of this dynamic market and matching customer needs with a transparency and connectivity unavailable until now.” Explore the new Drone Major platform at: www.dronemajor.net ### Open Minds - HA Clustering in Public Cloud Environments! I stumbled upon Open Minds by accident during one of my research days, and I was very glad that I did. Based in the heart of Birmingham, this little-known firm has been helping businesses for over a decade but has not been shouting loudly enough to get noticed in the way it deserves. Established by a small group of consultants in 1990, Open Minds serve corporate clients and managed service providers (MSPs) across a wide range of industries. They started out by providing technical training and consultancy in Unix, C and C++ and over the decades evolved into a provider of specialist application recovery products and services. One of the main strengths that Open Minds have under their belt is clustering configuration, implementation and support. Clustering systems can be tricky at the best of times, but try that within a multi-cloud approach and it’s a scary notion. I had the pleasure of meeting Open Minds on several occasions, most recently at the Channel Live event in Birmingham where I interviewed Bipin Patel, Technical Director at Open Minds (click here to watch). Bipin, AKA “Mr Cluster” as I call him, has a great story to tell, with a vast amount of knowledge of high availability clustering built up over decades in the industry - but he is modest with it. High availability for database clusters can be a scary notion to the uninitiated, but that’s what makes Bipin get out of bed in the morning (his words). Wait, that’s not what I want to get across here. Open Minds have a unique proposition that I am sure you have not heard or seen before - public cloud dev, test and implementation for HA clusters across the major public cloud providers. Let’s face it, how much assistance would you expect from your public cloud provider when scoping, testing and implementing your complicated HA database cluster? Err, do I need to answer that really? [easy-tweet tweet="Open Minds conduct global support for the SIOS software product range" hashtags="Open Minds, IT"] Clustering was once thought of as an unmovable part of IT infrastructure, but you can now have the flexibility (and cost) of public cloud infrastructure for your complex clustering environments, safe in the knowledge that IT DOES WORK and you are in the hands of a true expert. Additionally, Open Minds conduct global support for the SIOS software product range too, making after-hours support easier if need be. I know what you may be thinking – this article is weighted heavily as an advert for Open Minds, but really this is not the intention. After the thousands of providers we have interviewed, Open Minds are one of the first that cater for public cloud adoption for a very niche set of products.
We all know some of the benefits of public cloud adoption, but did you know that there is a company that could plan, migrate and support your complex database clustering needs? I like this team as they haven’t advertised, marketed or even run sales campaigns to gain traction within the industry and they deserve some recognition for that alone. However, the “CMaaS - Cluster Migration as a Service” options (yes, I did make that acronym up) they deliver for public cloud adoption are unique. ISVs, MSPs, database providers as well as SIs – if you haven’t come across them before then do look them up. To hear more from Open Minds tune into our live stream from IP Expo where Bipin will be interviewed at 2:30 pm on 5th October: https://www.disruptive.live/machina/ ### Blockchain for Good Hackathon with Hyperledger & Accenture Blockchain is the hot buzzword in tech at the moment – this technology holds the promise of allowing teams to collaborate in a secure, transparent way via a digital ledger. Blockchain is a new type of database system that maintains and records data in a way that allows multiple parties to share and collaborate to execute transactions. This technology has the power to move any kind of data securely and auditably among all parties to a transaction. The data on a blockchain is trusted and secure, preventing random changes. Accenture and Hyperledger are teaming up for a “Blockchain for Good” Hackathon in Dublin on Sept. 30, asking developers to come together to create blockchain apps around Digital Identity or Sustainable Supply Chain using open source Hyperledger code from the Linux Foundation. The challenge of the Hack is to utilize Hyperledger open source code & tools like Hyperledger Fabric, Hyperledger Composer, Hyperledger Indy and Hyperledger Sawtooth to build applications that will benefit humanity. Blockchain has shown great promise in establishing the provenance of goods, expediting routine business processes and enabling the world’s unbanked to receive payments via cryptocurrencies through mobile apps at their fingertips. Successful blockchain for good implementations like Plastic Bank have allowed plastic that was littering the ocean to be recycled for rewards. The possibilities are truly endless. Blockchain technologies can be used to enhance routine business processes and thereby enhance the quality of life. The Hackathon Prizes The inaugural “Blockchain for Good Hackathon” is taking place at The Dock, Accenture’s multidisciplinary research and incubation hub, on Saturday, Sept. 30 and Sunday, Oct. 1. The two organisations (Hyperledger & Accenture) are inviting developers across Ireland to help apply blockchain technology to develop game-changing ideas and solutions for providing secure digital identities and sustainable, eco-friendly supply chains. The grand prize-winning team will receive an 8,500 EUR cash prize and a two-day all-expenses-paid trip to visit Accenture’s Blockchain Center of Excellence in the South of France to develop their hack into a proof of concept. Hackathon registration and details The Blockchain for Good Hackathon is open for registration to all blockchain developers with an interest in Hyperledger and blockchain who are 18 years old or older. Participants in the Blockchain for Good Hackathon will work together with top coders, architects and experts from Accenture & Hyperledger to build innovative solutions that can positively impact millions of lives and the environment.
To register for the event visit: http://events.linuxfoundation.org/events/blockchain-for-good-hackathon/attend/register To review the agenda visit: http://events.linuxfoundation.org/events/blockchain-for-good-hackathon/program/agenda. Registrants should bring their laptops and their creativity. The Blockchain for Good Challenge With blockchain for a sustainable supply chain, data can be collected about the path of food products, including produce and livestock. Products can now be tracked from farm to table to help reduce spoilage and prevent disease. [easy-tweet tweet="For Digital Identity, blockchain offers the opportunity for individuals to obtain a unique proof of identity" hashtags="Blockchain, Digital"] For Digital Identity, blockchain offers the opportunity for individuals to obtain a unique proof of identity to gain access to essential services, such as healthcare, social benefits and financial services. It is also essential for access to better job opportunities. But over a billion children and adults worldwide do not have a recognized form of identity. In many countries, the design and implementation of civil registration and identification programs are shaped by cultural preferences, regulatory requirements, technological maturity and privacy needs. Each team will be asked to develop a working prototype around either Sustainable Supply Chain or Digital Identity. Prototypes must be developed using open source Hyperledger code publicly available via the Linux Foundation and GitHub. Each team will be required to create and present to the judges a proof of concept using blockchain-driven concepts. Why Blockchain? “Blockchain technology is on course to revolutionize how the world exchanges value, information and material,” said David Treat, event co-host and managing director in Accenture’s global blockchain practice. “This hackathon is geared to channel these innovations for humanitarian and environmental benefit, beginning with the right to identity and the need for environmentally sustainable supply chains. And we’re calling on the many talented engineers in this region to help.” “The Linux Foundation and Accenture both have long histories looking for ways to apply technology to improve the way people live and work,” said Hyperledger Executive Director Brian Behlendorf, co-host of the Hackathon. “We see blockchain in particular as a technology that can drive positive impact in areas that may not have reaped the full benefit of the digital revolution.” Mr. Treat of Accenture and Mr. Behlendorf will both be onsite at the Hackathon to meet with attendees. Developers will also get to hear from Dakota Gruener, the executive director of ID2020, about the challenges that 1.1 billion people face living without an official identity, and why blockchain may play a crucial role in the technical solution. The combination of expert speakers, amazing prizes and a chance to interact with the best & brightest developers makes the “Blockchain for Good Hackathon” a must for coders. Follow all the Hackathon news on social media in real-time with the hashtag #Blockchain4Good. ### Foehn Redefines Cloud Phone Systems Management with Launch of Voxivo Easy-to-use cloud phone system reduces the burden on IT & business managers with intuitive controls and ‘drag and drop’ dial plan functionality Foehn, the award-winning provider of cloud communications technology, has launched Voxivo, a new cloud phone system.
Voxivo’s design simplicity and innovative interface transform crucial but often overlooked aspects of staff productivity among businesses of all sizes, by giving IT or business managers and employees alike easy control of key phone system management tasks that were previously denied to them. [easy-tweet tweet="Voxivo marks a break with the uninspiring and over-complicated UK office phone set-ups" hashtags="Mobile, Cloud"] With its highly attractive and easy-to-use phone system interface, Voxivo marks a break with the uninspiring and over-complicated UK office phone set-ups of the past. It empowers companies to take administrative control of phone system performance, workflows and device configuration, removing the need to rely on third-party telecoms service providers and enabling companies’ in-house IT managers or business managers, who increasingly need to make ‘on the spot’ phone system changes or add new extensions, to ensure employee responsiveness and the best customer experience. Voxivo’s intuitive interface helps IT managers, team supervisors and employees alike save time that is otherwise lost on a daily basis in seeking technical assistance from third-party suppliers for ‘non-technical’ but vital phone system changes. Through the Voxivo interface’s clear graphics and ‘drag and drop’ system change options, businesses will achieve ‘breakthrough’ administrative capabilities, including:

- A supervisor portal enabling IT managers or team leaders to set up or revise call plans and add users, better matching campaign requirements, staffing levels and team skill sets to fast-changing customer demand.
- An individual user portal that lets team members or customer agents track their call history and make time-saving changes in an instant, such as twinning their office phone with their mobile or changing voicemail settings.
- Clear graphics-based reporting that enables department heads and team leaders to instantly review performance and make changes to dial plans and staffing.

UK companies’ lack of phone system flexibility and frequent supplier ‘lock-in’ has long resulted in demotivated system users and prolonged customer response times, at both large and small firms. Over the longer term, many firms are unable to get optimum value and agility from their communications technology investments. James Passingham, CTO at Foehn, explains: “We have redefined the cloud communications experience. We’ve seen the industry offering reiterations of uninspiring telephony platforms that are unintuitive and convoluted for technical and business teams.” “At Foehn we believe in doing things differently, so customers can reclaim control of their cloud phone system without needing technical help for administrative changes. We also believe administrators should empower staff to be able to control their communications. For this to happen, we used our expertise of open source and created a visual cloud phone system. We call it Voxivo.” “Voxivo is a game changer. It provides a new perspective on business communications that empowers companies. This removes a headache for IT managers and gives agility to business teams who need to operate quickly and efficiently.” Voxivo also features Foehn’s transparent pricing guarantee with no hidden costs for extra features. Also, because it draws on open source technologies, it brings a more flexible approach to companies’ future phone system development and integration, as well as avoiding the upheaval and downtime involved in upgrading to next-version proprietary phone systems.
Voxivo sits on Foehn's robust platform infrastructure with industry-leading service uptime. Its highly resilient and scalable architecture is located across several data centres, and Foehn proactively monitors and manages quality of service across its entire network. ### Consumer Experience - Observing Big Data - 360insights's John Bird Compare the Cloud interviews John Bird, General Manager for 360insights. John talks about how 360insights is a world-leading SaaS-based channel and incentive management platform. What does Big Data mean? Big Data is the phrase referring to a large volume of data, structured and unstructured. As this data is so large it is difficult to process through a typical database and needs to be analysed computationally to reveal trends and associations. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about 360insights here: https://360insights.com/ Big Data ### Bright Computing Announces Integration with IBM Power Systems Bright Computing, a global leader in cluster and cloud infrastructure automation software, today announced that Bright Cluster Manager 8.0 now integrates with IBM Power Systems™. [easy-tweet tweet="IBM Power servers are built for modern data solutions" hashtags="IBM, Data "] IBM Power Systems high-performance computing servers are deployed in many of the largest clusters around the world. Configurable into highly scalable Linux® clusters, Power Systems offer extreme performance for demanding workloads such as genomics, finance, computational chemistry, oil and gas exploration, and high-performance data analytics. IBM Power servers are built for modern data solutions and promote fast time to insights with solution-specific hardware accelerators, scaling out as the data requirement grows. Unveiled in May 2017, Bright Cluster Manager 8.0 delivers exciting new features for automation and ease-of-use for Linux-based clusters and public, private and hybrid clouds. Feature highlights include a new and highly intuitive web-based user interface, integration with Apache Mesos and Marathon, support for OpenStack Newton, and for the first time, support for IBM Power Systems servers. Martijn de Vries, CTO at Bright Computing, commented: “At Bright, we pride ourselves on an aggressive development strategy to maintain our position as a leading provider of infrastructure management technology. Our customers are increasingly looking at POWER8, and therefore we are very happy to be able to offer this valuable integration as part of Bright Cluster Manager 8.0.” “The integration of Bright Cluster Manager 8.0 with IBM Power Systems has created an important new option for users running complex workloads involving high-performance data analytics,” said Sumit Gupta, VP, HPC, AI & Machine Learning, IBM Cognitive Systems.
“Bright Computing’s emphasis on ease-of-use for Linux-based clusters within public, private and hybrid cloud environments speaks to its understanding that while data is becoming more complicated, the management of its workloads must remain accessible to a changing workforce.” ### SunTec Xelerates the Open Banking revolution to Deliver Business Value Helping global banks leverage collaborative ecosystems to become true value aggregator marketspaces SunTec™, whose technology powers the financial services delivered to over 300 million consumers worldwide, today unveiled its Open Banking offering through its Xelerate® technology platform. Xelerate® enables banks and financial services enterprises to embrace the Open Banking economy and leverage it to their advantage. The solution makes it possible for banks to innovate rapidly on their business models and maximize revenue by aggregating and delivering true value to their customers in a collaborative partner ecosystem. Banks can leverage the business layer to derive maximum business value from their APIs through end-to-end management, monitoring, and monetization capabilities across the customer and partner value chains. With the upcoming PSD2 regulation in Europe giving the right impetus to the Open Banking economy, there is increased competition amongst banks, emerging segments such as FinTech and companies from other industries for customer wallet share. The UK regulator, the Competition and Markets Authority (CMA), believes stronger competition and new technology will deliver up to £1 billion in direct benefits for customers. This means traditional banks could lose out on future revenue. Building a reliable partner ecosystem would be the best strategy for banks, and new regulations are steering them further in that direction. Ecosystems could help them expand their product lines and become true customer owners. The Open Banking wave opens up the financial services world to possibilities of innovation and newer disruptive revenue models that can increase avenues of collaborative business with partners in the larger ecosystem. SunTec has created its Open Banking solution based on its previous experience of bringing together various data sources to deliver relevant customer offers. The out-of-the-box solution, with preset configurations and definitions, can be easily parameterized to suit various revenue monetization models applied across the ecosystem. [easy-tweet tweet="PSD2 and Open Banking will change the way banks interact with their customers" hashtags="Banking, Fintech"] The three key strengths of Xelerate are:

- Manage – End-to-end partner-centric and customer-centric management and extended product catalog capabilities, coupled with a robust price governance, transparent billing and accounts receivables engine.
- Monitor – Banks can now monitor the performance of APIs, services, products and partners in real time to maximize revenue. It shows the added value each partner brings to a bank, and helps financial organisations know what they can deliver to customers at every stage.
- Monetise – The inbuilt price modelling framework and dynamic product bundling capabilities based on customer insights help banks innovate on revenue models for partners and customers. It takes monetisation a notch higher by enabling revenue maximization through the creation of co-innovated solutions in the partner network.

Nanda Kumar, SunTec CEO, said, “PSD2 and Open Banking will change the way banks interact with their customers.
Banks which adapt to the new regulatory demands will evolve into marketplaces for their customers because they will source, package and deliver services to each individual customer. The perceived value of a bank is redefined by the new regulations through the management, monitoring and monetizing of customer data. This is an exciting time for the industry and only just a start.” ### Customer Empowerment - Observing Big Data - NICE Ltd.'s Ed Creasey Compare the Cloud interviews Ed Creasey, Consulting Director for EMEA at NICE Ltd. Ed talks about how NICE Ltd. is a global enterprise software company that focuses on empowering organisations to make smart decisions through the use of data. What does Big Data mean? Big Data is the phrase referring to a large volume of data, structured and unstructured. As this data is so large it is difficult to process through a typical database and needs to be analysed computationally to reveal trends and associations. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about NICE Ltd. here: www.nice.com Big data ### Daikin and Hitachi Embark on Collaborative Creation Aiming to Establish a Next-Generation Production Model Daikin Industries, Ltd. (TSE: 6367) and Hitachi, Ltd. (TSE: 6501) today announced that, as of October 2017, they will embark on a collaborative creation aiming to establish a next-generation production model utilizing IoT in order to support skill transfer from expert workers. As of October 2017, Daikin will introduce a system that utilizes advanced image analysis and other technologies - the solution core of Hitachi's IoT platform "Lumada" - to enable digitalization, comparison and analysis of the skills of expert workers and trainees in the brazing process, part of the manufacturing process for air conditioners at Daikin's Shiga Plant (Kusatsu, Shiga Prefecture), with the objective of ensuring consistent quality, improving productivity and developing human resources at production locations worldwide. Daikin and Hitachi will jointly demonstrate the viability of a production model using this system, begin full-scale operation of the system in actual manufacturing workplaces during this fiscal year, and aim to expand applications of the system to other Daikin factories and manufacturing processes worldwide. Starting from the joint demonstration experiment, Daikin and Hitachi plan to advance their collaborative creation towards the realization of a next-generation production model, in which global locations coordinate and share their information, skills and techniques utilizing advanced IoT. In recent years, as global competition in the manufacturing industry has intensified, manufacturers have needed to respond swiftly to rapidly changing market conditions. In addition to shortening product development time and achieving a quicker launch to market, there is also a need to improve quality and increase productivity globally, and the transfer of skills from expert workers is one such issue.
In order to improve and equalize quality worldwide, Daikin has worked for many years to train new workers and transfer the skills of expert workers, with efforts centering primarily on core skills such as brazing, lathing, sheet-metal working and arc welding, all of which are essential to the manufacturing of air conditioners. Meanwhile, Hitachi has worked to provide solutions that raise the level of manufacturing capabilities (one of the key strengths of the Japanese manufacturing industry) through the application of its IoT platform "Lumada", which brings together OT, IT and digitalization of the manufacturing workplace using the latest research and development outcomes, based on experience and know-how developed through Hitachi's own manufacturing operations. Hitachi considered that it could support the efficient transfer of skills from expert workers to a large number of workers by using advanced image analysis - which detects signs of deviation in workers' motions on the front line and of facility failures - to quantitatively digitalize, compare and assess the skills of expert workers and trainees. In cooperation with Daikin, Hitachi has conducted validation tests for the digitalization and modelling of workers' motions and use of tools, etc., in brazing processes in the manufacture of air conditioners. By analyzing data gathered from brazing processes from the "4M" perspectives of "Man", "Machine", "Material" and "Method", Daikin and Hitachi considered the extent to which the data could be linked to the creation of new production methods, and have now reached a stage where there is potential for applying systems for the digitalization and modelling of workers' motions and other phenomena to actual manufacturing workplaces. [easy-tweet tweet="Daikin and Hitachi constructed a Brazing Skills Training Support System" hashtags="Daikin, IoT, IT"] Based on validation tests conducted so far, Daikin and Hitachi constructed a Brazing Skills Training Support System that quantitatively assesses and analyzes differences in brazing work carried out by expert workers and trainees, and are now set to introduce the system to the actual manufacturing workplace at Daikin's Shiga Plant and to begin joint demonstration in October 2017. Specifically, the system chronologically collects and digitalizes motion data - including the movement of expert workers' hands, torch angle and angular velocity, and the angle, distance and angular velocity at which the filler materials and workpieces are supplied - using cameras and sensors, and uses the data to construct standard motion models. Similarly, by gathering and digitalizing data on movements and characteristic phenomena occurring when trainees carry out brazing work, the system makes statistical comparisons with the standard motion models created from expert workers' motions. This enables quantitative assessments of the brazing work of trainees against the examples set by expert workers, allowing trainees to acquire the necessary skills in a shorter period of time and enabling the company to standardize and improve upon workers' skill levels, leading to improved consistency of quality, increased productivity, and enhanced development of human resources working on the front lines of manufacturing worldwide.
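To make the statistical comparison described above more concrete, here is a minimal sketch of how a "standard motion model" might be built from expert brazing sessions and a trainee session scored against it. The feature names, sample values and deviation threshold are invented for illustration and are not drawn from the actual Daikin/Hitachi or Lumada implementation.

```python
# Illustrative sketch: building a "standard motion model" from expert brazing
# sessions and flagging where a trainee deviates from it. Feature names and the
# z-score threshold are assumptions, not the actual Daikin/Hitachi system.
from statistics import mean, stdev

FEATURES = ["torch_angle_deg", "torch_angular_velocity", "filler_feed_distance_mm"]

def build_standard_model(expert_sessions):
    """Per-feature mean and standard deviation across expert sessions."""
    model = {}
    for f in FEATURES:
        values = [s[f] for s in expert_sessions]
        model[f] = (mean(values), stdev(values))
    return model

def assess_trainee(trainee_session, model, threshold=2.0):
    """Return features where the trainee deviates beyond `threshold` sigma."""
    deviations = {}
    for f, (mu, sigma) in model.items():
        z = (trainee_session[f] - mu) / sigma if sigma else 0.0
        if abs(z) > threshold:
            deviations[f] = round(z, 2)
    return deviations

# Invented readings, averaged over one brazing task
experts = [
    {"torch_angle_deg": 45.0, "torch_angular_velocity": 3.1, "filler_feed_distance_mm": 12.0},
    {"torch_angle_deg": 46.5, "torch_angular_velocity": 2.9, "filler_feed_distance_mm": 11.5},
    {"torch_angle_deg": 44.2, "torch_angular_velocity": 3.0, "filler_feed_distance_mm": 12.4},
]
trainee = {"torch_angle_deg": 52.0, "torch_angular_velocity": 3.0, "filler_feed_distance_mm": 9.0}

model = build_standard_model(experts)
print(assess_trainee(trainee, model))  # flags torch angle and filler feed distance
```

In this simplified form, the "model" is just per-feature statistics and the assessment is a z-score check; the value for a trainee is seeing exactly which aspects of their technique drift from the expert baseline.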
Based on the results of this joint demonstration, by establishing unified standards for brazing work for introduction at all global production locations, and by integrating and analyzing data together with data from monitoring & control systems and production equipment, Daikin and Hitachi plan to further improve quality and productivity, and to train more skilled, expert workers. Overall Image of the Brazing Skills Training Support System http://www.acnnewswire.com/topimg/Low_HitachiDaikin92617.jpg ### RSA Integrates with Microsoft Azure Active Directory Integration provides a safe journey to the cloud by enabling customers to use RSA SecurID® Access multi-factor authentication with Microsoft Azure Active Directory Premium conditional access RSA SecurID® Access from RSA, a global cybersecurity leader delivering Business-Driven Security™ solutions, adds more options for two-factor authentication to Microsoft Azure Active Directory Premium. RSA SecurID Access customers can satisfy their need for strong authentication with added flexibility for hybrid environments in their journey to the cloud, simplify management, and provide a consistent experience for users regardless of the location of the applications. Security concerns with the cloud are still very prevalent, and account hijacking via stolen credentials remains a top concern. As organizations transition to the cloud, they must have a high level of confidence that users are who they claim to be, and Multi-Factor Authentication (MFA) and modern authentication techniques are designed to provide a higher level of assurance. [easy-tweet tweet="Enterprises are increasingly looking to the cloud to drive IT transformation" hashtags="IT, Cloud"] “Applications are being deployed at incredible speeds with user populations demanding access from wherever they are, whenever they want, from any device,” said Jim Ducharme, vice president, Identity Products, RSA. “By integrating RSA SecurID Access with Microsoft Azure AD Premium, RSA can help customers bridge the islands of identities that are created as customers journey to the cloud.” RSA SecurID Access is designed to support a broad range of cloud and on-premises applications and provides a choice of authentication methods with risk analytics for higher levels of identity assurance. The solution is engineered to complement an assortment of modern authentication methods, including, for example, the use of smartphones with push authentication, fingerprint and eyeprint verification. Users benefit from a common experience across a hybrid set of applications, while organizations can increase their identity confidence through risk analytics with a solution that is designed to adapt to changing requirements as they shift to the cloud. Alex Simons, director, program management, Microsoft Identity Division at Microsoft Corp. added, "Enterprises are increasingly looking to the cloud to drive IT transformation. By working with RSA, we can simplify and secure our shared customers’ move to the cloud. We are excited to work with RSA to make it easier for RSA SecurID customers to move to Microsoft Azure Active Directory.” Organizations can use the broad set of RSA SecurID Access authentication methods to provide a high level of identity assurance for Azure AD SaaS applications, including Office 365, the Azure AD application portal and the Azure AD administrative console.
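The announcement does not describe the underlying mechanics of these second factors, but for readers who want a feel for how one widely used MFA method works, below is a minimal, generic sketch of time-based one-time passcode (TOTP, RFC 6238) generation and verification in plain Python. It is illustrative only and is not the RSA SecurID or Azure AD implementation or API; the secret shown is a textbook example value, not a real credential.

```python
# Generic TOTP (RFC 6238) sketch to illustrate one common MFA second factor.
# Not the RSA SecurID or Azure AD implementation; the 30-second step and the
# example secret are illustrative defaults.
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, for_time: float = None, step: int = 30, digits: int = 6) -> str:
    """Derive a time-based one-time passcode from a shared base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((for_time or time.time()) // step)
    msg = struct.pack(">Q", counter)
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F
    code = (struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF) % (10 ** digits)
    return str(code).zfill(digits)

def verify(secret_b32: str, submitted_code: str, window: int = 1, step: int = 30) -> bool:
    """Accept codes from the current step plus/minus `window` steps of clock drift."""
    now = time.time()
    return any(
        hmac.compare_digest(totp(secret_b32, now + drift * step, step), submitted_code)
        for drift in range(-window, window + 1)
    )

shared_secret = "JBSWY3DPEHPK3PXP"  # example base32 secret, not a real credential
print(totp(shared_secret))                          # code an authenticator app would display
print(verify(shared_secret, totp(shared_secret)))   # True
```

Push approval and risk analytics, as mentioned in the release, layer on top of a shared-secret factor like this rather than replacing it.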
Other benefits, as engineered, include: Leverage and extend existing RSA SecurID investment: RSA SecurID customers can ensure a consistent and predictable experience for their end users across current RSA SecurID and Microsoft Azure AD protected resources (VPN, routers, gateways, etc.). Consumer-simple usability with a high level of security: Organizations can adopt a layered approach to authentication by combining push, biometrics and other methods, including risk analytics, for higher levels of identity assurance. Flexible authentication: An increasingly diverse population of users and a growing list of strong authentication use cases require organizations to support an assortment of different work styles. Within a given policy, RSA SecurID Access gives users a choice of authentication methods that best suit their individual preferences. PRICING AND AVAILABILITY The capability to interoperate with Microsoft Azure Active Directory Premium P2 conditional access is included in the latest release of RSA SecurID Access and is shipping to customers. It is available as part of RSA SecurID Access Enterprise and Premium Editions. Learn more about the editions and pricing here. ABOUT RSA READY Through its time-tested RSA Ready Program, RSA has certified more than 500 SaaS and on-premises applications to be interoperable with RSA SecurID Access technology. With each of the RSA SecurID Editions, customers can choose their authenticators from a wide range of options – from simple mobile form factors like push-based “tap to approve” methods to mobile-based biometrics to one of the most secure hardware tokens in the industry. ABOUT RSA RSA, a Dell Technologies business, offers business-driven security solutions that uniquely link business context with security incidents to help organizations manage risk and protect what matters most. RSA solutions are designed to effectively detect and respond to advanced attacks; manage user identities and access; and reduce business risk, fraud, and cybercrime. RSA protects millions of users around the world and helps more than 90% of the Fortune 500 companies thrive in an uncertain, high-risk world. For more information, go to rsa.com. ### Cloud Suppliers and GDPR; How to Ensure you Find the Silver Lining? We live in increasingly digital times, and citizens expect their online activities to be conducted from wherever they happen to be – using their smartphones or tablets. Very few spare a thought for how that is even possible; it will almost certainly involve the use of cloud services which, in reality, are probably delivered by organisations they have never heard of, in locations unknown. There is a high degree of "blind trust" that citizen data will be safely handled by these cloud services. Of course, that’s the most likely citizen perspective (if they care at all). For those organisations looking to take traditional manual, paper-based services into the cybersphere, carefully selecting a credible cloud service provider becomes an important consideration which will help to underpin the company’s approach to security and privacy of personal data. With the European Union General Data Protection Regulation (EU GDPR) arriving in May 2018, choosing the right cloud service provider could greatly help (or indeed hinder) an organisation’s GDPR compliance levels. Data privacy is underwritten by information security, which extends to the technical, operational and personnel controls which deliver the physical cloud infrastructure itself.
To avoid data breaches, unauthorised access or damaging virus or malware attacks, potential cloud service providers should readily be able to demonstrate their capabilities, and customers should ensure they are in a position to understand those responses. It should be expected that the more comprehensive the provision and the better the level of service, the higher the price - but that needs to be weighed against the reduced risk of financial penalties arising from a data breach. [easy-tweet tweet="Customers need to understand where cloud providers, as data processors, hold their data" hashtags="Data, Cloud"] Cloud service providers should be transparent about where their hosting and support facilities are located. Since cloud service providers act as data processors, customers need to understand where their data is held, and the legal framework which applies to those locations. While the European Union is harmonising under GDPR, different rules apply within the United States and other non-European nations. Advice should be sought to understand the prevailing legislation, or whether an alternative commercial approach is sufficient to protect the personal data that would be processed within the cloud service. There’s also a need to rely upon the co-operation of a chosen cloud service provider to ensure that the increased range of subject rights which arrive with GDPR can be properly actioned within (in most cases) 30 days of receipt. Subject access requests, the correction of inaccurate data, or the request to erase personal data, to name a few, will require clear and efficient co-ordination between the customer and their cloud service provider, and technically complex infrastructure must be able to support such requests. Consideration also needs to be given to any back-up, archived or declared sub-processor involvement in data processing. There’s a lot to think about when selecting a cloud service provider, but on the other hand a properly justified procurement decision will undoubtedly deliver benefits to the customer. The scale and resilience of most cloud providers, combined with the monitoring, maintenance (e.g. patching) and physical protection of their assets, will be greater than most standalone customers can deliver, and activities such as these are crucial to preventing and detecting many forms of data loss, theft or compromise. It should be expected that the focused competences of suppliers’ cloud analysts and engineers, normally on a 24x7 basis, will again provide protection and tighter SLA commitments than the customer could ever manage. Third-party cloud services will need appropriate time and resources from the customer to ensure that they are selected and managed effectively. Customers will need to understand and disclose the nature of any external data processors such as cloud service providers within privacy notices and Data Protection Impact Assessments (DPIA), such that citizens can make an informed decision about whether they are content to have their data processed in the manner prescribed. GDPR mandates "data protection by design" (Article 25) and, for higher-risk processing, Data Protection Impact Assessments (Article 35); as such, all data processing activities need to be designed and implemented with privacy as a central consideration.
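As a rough illustration of the coordination point above, the sketch below shows how a data controller might track subject rights requests, the follow-up actions it needs from its cloud provider, and which requests are at risk of missing the response window. The request types, field names and 30-day window are simplifying assumptions for the example rather than a prescribed GDPR workflow.

```python
# Minimal sketch: tracking data subject requests and the follow-up actions
# needed from a cloud service provider (data processor). The 30-day window,
# request types and field names are illustrative assumptions only.
from dataclasses import dataclass, field
from datetime import date, timedelta

RESPONSE_WINDOW_DAYS = 30  # GDPR allows roughly one calendar month in most cases

@dataclass
class SubjectRequest:
    subject_id: str
    request_type: str          # e.g. "access", "rectification", "erasure"
    received_on: date
    processor_actions: list = field(default_factory=list)  # work the CSP must confirm
    completed: bool = False

    @property
    def due_by(self) -> date:
        return self.received_on + timedelta(days=RESPONSE_WINDOW_DAYS)

def overdue(requests: list, today: date) -> list:
    """Requests still open past their response deadline."""
    return [r for r in requests if not r.completed and today > r.due_by]

requests = [
    SubjectRequest("subj-001", "erasure", date(2017, 10, 2),
                   processor_actions=["confirm deletion from CSP archive tier"]),
    SubjectRequest("subj-002", "access", date(2017, 11, 1)),
]
print([r.subject_id for r in overdue(requests, today=date(2017, 11, 10))])  # ['subj-001']
```

Even a simple register like this makes it obvious which requests are blocked on the provider, which is exactly the coordination GDPR expects controllers and processors to evidence.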
The preparation of a well-structured DPIA (such as those provided by the utopian solution from InfoSaaS) will by necessity need to involve engagement with any contracted cloud service provider, recognising that the customer (as data controller) and their data processors need to work closely together to ensure that personal data is maintained securely, the obligations of GDPR can be properly met, and the rights of citizens as data subjects can be delivered within the required timeframe. Customers are advised to identify the “Data Protection Officer” within the cloud service provider, as this individual plays an essential role in communicating the requirements of GDPR internally and ensuring that the implementation of data protection activities is effective. Selected carefully and managed closely, cloud services can greatly assist organisations in achieving compliance with GDPR. Given potential fines of up to €20m (£17m) or 4% of global annual turnover for data breaches under GDPR, choosing wisely is a small price to pay. ### Putting the Past into the Cloud: Integrating Legacy Software and Systems Sam Woodcock, Director of Solutions Architecture - EMEA and APAC, iland I was recently asked to share my thoughts on the challenges of integrating legacy software and systems with the cloud. We know that cloud promises huge benefits regarding business agility, competitiveness, cost reduction and management – that’s why cloud integration is at the top of the CIO agenda - but there are some common challenges that can be overcome with some astute project management and by looking at the bigger picture: One business; one system A common mistake when it comes to integration, paradoxically, is the tendency to view the cloud as a completely separate entity from legacy IT systems. Businesses that have built up years of best practice regarding management, security and networking for on-premise systems don’t see the bigger picture and fail to incorporate these processes into the cloud environment. The result is the perpetuation of ‘IT silos’ with multiple management procedures. This is disjointed and more complex to manage, therefore defeating many of the objectives of the exercise. Instead of viewing the cloud separately, organisations need to leverage their prior investment and focus on developing a holistic management methodology for legacy on-premises systems and cloud systems. This makes perfect sense, for example, in the case of IT teams that have been using VMware with legacy systems. Choosing a VMware-based cloud provider ensures familiarity and gives those IT teams the opportunity to bring relevant experience to bear during the implementation and management phases. No such word as “can’t.” Organisations that have legacy systems that can’t be virtualised don’t need to abandon the goal of cloud migration. By working with a cloud provider that can support both colocation and physical environments, the benefits can still be realised. One of the reasons that many of our customers choose iland Secure Cloud is the ability to co-locate legacy systems, meaning that they can still achieve integration without having to write off any prior investment. Security and disaster recovery – confidence and compliance [easy-tweet tweet="Security and business continuity play a vital role during deployment" hashtags="Security"] Naturally, security is a key concern in any project. Ensuring that the chosen cloud environment can maintain backups and protect company data to the highest standards of compliance is critical.
Security and business continuity play a vital role during deployment, which is why many iland customers are leveraging our replication tools to migrate legacy systems to the cloud. This approach gives them the opportunity to first protect legacy systems in the cloud with DRaaS and use that as a testing ground to gain confidence before running live applications in the cloud. This facility ensures that migration is as painless as possible. Painless project – is that possible? Nothing worth doing is 100% easy, but the benefits of cloud integration are so compelling that the effort is undeniably worthwhile, and it’s up to suppliers to demonstrate integration methodologies that minimise pain and maximise benefit. Looking at all of the challenges above, one thread runs through them all, and that’s project management (to understand this more in depth, why not look at this recent article written by my colleague, Andrean Georgiev). A strong project manager can communicate with all stakeholders and assist in developing those holistic management strategies, devising a strategy for colocation and ensuring compliance with industry-specific and general security standards. It’s their job to seize those challenges of cloud integration and take out the pain. At iland, we pride ourselves on having some of the best project managers in the business. So yes, there are some pitfalls to avoid when integrating legacy software and systems into the cloud. But, by viewing and managing the system holistically, making the most of the historical knowledge within IT teams and ensuring that the project is expertly co-ordinated throughout, the goal of a secure, compliant and fully integrated cloud environment that complements on-premises systems can be achieved. ### Cloud Tech and ID Verification Innovation Credas provides a wide range of products and services that enable any business to onboard and verify people quickly, accurately and affordably via a mobile app. Often, the way in which many businesses verify their employees’, partners’ or clients’ IDs can be slow and expensive, exposed to unnecessary risk and, as a result, open to inaccuracies. Credas can reduce the exposure to risk in a simple and flexible way, ensuring commercial benefit and time-saving solutions. Our Know Your Customer, Anti-money laundering, Right to Work, Right to Rent and Anti-slavery products ensure accurate and transparent checks. Offering a fresh perspective in a mature market, we quickly realised being cloud based was the only viable option to satisfy the need for flexibility, security, scalability and cost management. There are a lot of buzzwords there, none of them new, and 10 or 15 years ago the solutions to those hurdles were very different and certainly not as interconnected as they are today. [easy-tweet tweet="Data is very much the oil of the 21st century and is a mission-critical asset to all businesses" hashtags="Data, Business"] Back then, on-premise and data centre solutions would have been required, and that instantly increases cost, complexity and the number of moving parts within an operation. Data is very much the oil of the 21st century and is a mission-critical asset to all businesses. The need to store it securely whilst making it easily accessible is non-negotiable, but also a difficult balance to find. With new data regulations coming into play next year, our clients are comfortable knowing their data is held within a globally renowned and trusted cloud platform and not on a server underneath a desk.
Credas captures sensitive information, so knowing the data is being looked after is incredibly important to us and our clients, and ultimately ensures we build trust and confidence as a business. Security is high on our agenda and a key selling point for Credas, and it's why we run multiple penetration tests each year to ensure our defences are strong and that we keep up to date with the various techniques used by cyber criminals. From a technology point of view, we combine facial and ID recognition with a best-in-class data platform, well-designed, user-friendly applications and seamless integration options. All of this has enabled us to provide our core verification engine as a stand-alone component, meaning we can integrate easily with existing systems our clients may have, in real time, using various methods such as APIs. Imagine the difficulty in achieving this simple task if we were not cloud based. Operationally there are also multiple benefits to being cloud based that free us up to do what we do best (and that isn’t patching servers in our server room): Capital expenditure that often comes with running your own IT is eliminated because we use a subscription-based model, which is great for cash flow too. This enables us to run accurate financial forecasts so we can plan our development work carefully and manage the spend. Risk management is a key area for all businesses but seldom at the top of the list for smaller companies. When things go wrong, it's important to have a disaster recovery plan in place that minimises any impact on the running of the business. Being cloud based makes this much more manageable and achievable. Scalability and performance are very important to us, and we can easily grow, shrink or repurpose our ecosystem as business demand dictates and react very quickly to various conditions. Speed of delivery is another critical component in a digital world, and as feedback is vital to improve our products, we need a platform and key processes in place that enable us to execute, change and explore opportunities quickly. Using a continuous delivery pipeline in our cloud environment, we can progress changes rapidly through automated functional and performance test suites, and implement those changes to our live environment reliably and with no downtime. This speed of delivery is vital in allowing our products to evolve and adapt quickly based on our clients’ needs, whilst ensuring high performance and availability of our systems at all times. Being cloud-based has made this process significantly slicker and more cost-effective than conventional alternatives by allowing us to spin up servers, scale our build agents on demand and simulate load for automated performance and stress testing. There are hundreds of valid points to highlight about using the cloud, but for Credas, being cloud based has enabled us to realise our potential quickly whilst keeping expenditure low, which has been vital to realising our business growth plans. ### Digital Advice Live 2017 We attended the Digital Advice Live event, organised by SmartAssistant, with no particular aspirations except curiosity. With a very impressive line-up, it made me wonder what the content for the day would turn into. Would it be about buying influences in the digital age, about who is leading the way with digital transformation in the retail sector, or a fashion parade of brand advocacy for the sponsors? Well, all I can say is it was all of the above and much more.
With topics such as AI and chatbots, digital trends, omnichannel experiences and more, it was a digital marketing fest that, over the course of the day, gave so much information to digest. Markus Linder, CEO at SmartAssistant, started the event with an insightful view of future trends in the digital age, swiftly followed by David Smith, CEO at Global Futures and Foresight, discussing the topic 'First we do things differently, and then we do different things!' Yes, this was one of many clichés that followed throughout the day, but these words did strike a chord to change one's perspective. Next up was my personal favourite, Peter Francis, VP of Digital Growth and Acquisition at T-Mobile. Peter shared some amazing growth statistics that T-Mobile has achieved, along with some alarming sector facts that were very interesting to hear. For example – “Since the migration from Digital Forms to Digital Assistants and Advisors, T-Mobile have increased their web conversion rates for sales by 300% and web orders have increased by 3.5x!” I was fortunate enough to have several discussions with Peter, and it seemed that the main reason for the increased revenue was down to an understanding of the consumer mindset. Multitasking is now a basic essential skill required by ALL of us, and shorter, more frequent interactions are increasingly in demand. This rings true when you place smart assistants and chatbots in that customer-facing role, and it seems to be a winning strategy. Next up to discuss how Miele has evolved digitally was Elena Helfberend, Head of Sales and Project Management at Miele. Miele has changed the way it engages its customers massively – just take a look at their online retail presence. Elena presented some interesting statistics that highlight Miele's research on buying influences and conversions. This is what they found, based on the touch points they analysed, broken down by percentage. Traditional sources: print media 43%, in-store help 43%, family and friends 33%, dealer's catalogue 17%, manufacturer's catalogue 14%. Digital sources: websites 54%, customer reviews 51%, dealer's website 42%, manufacturer's website 35%, online forums 12%. I believe that the figures for digital sources could well be higher for other firms within the retail sector, but at the very least this information shows us one thing that cannot be overlooked. Digital sales are increasing, and if you haven’t transformed yet, your competitors are. There were many other discussions during the day that demonstrated valuable insights into the retail sector and digital transformation, including Ben Hart, Director at Evans Cycles, on how they guide their customers through an online experience, and Curtis Bice, Marketing Project Manager at Trek Bicycle, on the value of enriching the digital customer journey. There was also Stephanie Maeder, Experience Manager at Swisscom, discussing how they have improved the customer experience with Digital Advice, and Tamara Dutina, Digital Marketing Specialist at Beko, with advice on how to guide and advise shoppers to engage. [easy-tweet tweet="A new generation is here to see, react and engage within an inherently different way" hashtags="AI, Engagement"] I will try to summarise my key takeaways from the event without clichés or analogies, and from a consumer’s viewpoint. It seems clear that the buying experience has changed and continues to do so.
To win that customer over you need to think digital, you need to think engagement, and you need to think smart! A new generation is here to see, react and engage within an inherently different way than a decade ago. With the continued advancements in technology and the ever-increasing online community, you simply have to adopt a digital strategy, and this is not just within the retail sector, it’s every sector. Look what T-Mobile has done, look what Apple has done and look what Google is doing. Yes, they are giants within the industry, but they have invested huge amounts of time and money to get the overall customer experience right. Take a note from their experiences and replicate some of their strategies in your own business. However, the first leap is the biggest – review how your customers perceive you and how you interact with them as this will give you valuable insights. ### Amazon Web Services Announces the Opening of Data Centers in the Middle East by Early 2019 New AWS Infrastructure will enable customers to run workloads in the Middle East and serve end-users across the region with even lower latency SEATTLE--(BUSINESS WIRE)--Amazon Web Services (AWS), an Amazon.com company (NASDAQ: AMZN), today announced that it plans to open an infrastructure region in the Middle East by early 2019. The new AWS Middle East (Bahrain) Region will consist of three Availability Zones at launch. Currently, AWS provides 44 Availability Zones across 16 infrastructure regions worldwide, with another 14 Availability Zones, across five AWS Regions in China, France, Hong Kong, Sweden, and a second GovCloud Region in the U.S. expected to come online by the end of 2018. AWS today also announced it will launch an AWS Edge Network Location in the United Arab Emirates (UAE) in the first quarter of 2018. This will bring Amazon CloudFront, Amazon Route 53, AWS Shield, and AWS WAF to the region and adds to the 78 points of presence AWS has around the world. For more information on AWS’s global infrastructure, go to https://aws.amazon.com/about-aws/global-infrastructure/. “As countries in the Middle East look to transform their economies for generations to come, technology will play a major role, and the cloud will be in the middle of that transformation,” said Andy Jassy, CEO, Amazon Web Services, Inc. “Some of the most gratifying parts of operating AWS over the last 11 years have been helping thousands of new companies get started, empowering large enterprises to reinvent their customer experiences, and allowing governments and academic institutions to innovate for citizens again. We look forward to making this happen across the Middle East." This announcement has been welcomed by political leaders and royal families in Gulf Cooperation Council (GCC) states. Countries across the Middle East are looking to innovate, grow their economies, and pursue their vision plans, such as Saudi Vision 2030, UAE Vision 2021, and Bahrain Vision 2030, and cloud technology will be key in helping them achieve this. His Royal Highness Prince Salman bin Hamad Al Khalifa, Crown Prince of Bahrain, First Deputy Prime Minister, and Chairman of the Bahrain Economic Development Board, commented, “Today’s announcement is a significant moment for Bahrain and the region. For the Kingdom, the expansion of regional cloud capacity builds upon a business environment that is already driving innovation and entrepreneurship, using technology to accelerate economic diversification in Bahrain. 
Through improved efficiencies, access to new career opportunities, and helping to enhance the delivery of government services, this marks further realization of the principles of sustainability, fairness, and competitiveness that form the core of Bahrain’s 2030 Vision.” [easy-tweet tweet="Amazon Web Services is delivering the Middle East a world-class service" hashtags="AWS, Data"] H.E. Khalid Al Rumaihi, Chief Executive of the Bahrain Economic Development Board, said of the news, “AWS’s commitment to expanding its presence into the Middle East and North Africa (MENA) region, from Bahrain, is a major enabler for technology and data-driven business across the GCC. This will benefit global corporates, SMEs, entrepreneurs, and governments alike. The ability to store and share data at speeds the Gulf has never experienced before has the potential to help companies gain competitive advantage, allowing them to compete more effectively at a global level. Amazon Web Services is delivering the Middle East a world-class service. With such a young, technologically adept, and growing population, the Gulf is well positioned to drive innovation in mobile applications and digital services. I am very eager to see how our region’s entrepreneurs will make use of this exciting opportunity.” AWS Investing in the Middle East AWS is growing its presence in the Middle East bringing offices, staff, education, training, startup support, and other investments to the region. In January 2017, AWS opened offices to serve its rapidly growing customer base with a presence in Dubai, UAE and Manama, Bahrain. These offices have been established with teams of account managers, solutions architects, partner managers, professional services consultants, support staff, and various other functions for customers to engage with AWS. Another investment AWS is making for its customers in the Middle East, and around the world, is to run its business in the most environmentally friendly way. An important criteria in launching the AWS Middle East (Bahrain) Region is the opportunity to power it with renewable energy. AWS chose Bahrain in part due to the country's focus on executing renewable energy goals and its proposal to construct a new solar power facility to meet AWS’s power needs. The Bahrain Electricity and Water Authority expects to bring the 100 MW solar farm online in 2019, making it the country’s first utility-scale renewable energy project. For more information on AWS’s commitment to sustainability, go to https://aws.amazon.com/about-aws/sustainability/. AWS also announced that it is supporting the advancement of technology education across the Middle East making AWS Training and Certification programs available to customers. In the education sector, AWS is supporting the development of technology and cloud computing skills at local universities through the AWS Educate program, providing students and educators with the resources needed to accelerate cloud-related learning. This program is now available for students attending institutions such as King Abdullah University of Science and Technology in Saudi Arabia, the Higher Colleges of Technology in the UAE, Bahrain Polytechnic, University of Bahrain, as well as Oman College of Management and Technology, the Jordan University of Science and Technology, and many others across the region. To support the growth of new business, AWS works with incubators and accelerators in the Middle East to provide resources to startups through the AWS Activate program. 
In Saudi Arabia, AWS works with the Badir Program for Technology Incubators and Accelerators at King Abdulaziz City for Science and Technology (KACST) to provide startups with access to technology resources as well as expert advice, education, and training to help promote Saudi youth entrepreneurship and grow new businesses in the Kingdom. To help new businesses across the Middle East go global, AWS also works with a number of local and international accelerators and incubators active in the region such as AstroLabs in the UAE, Cloud 10 Scalerator in Bahrain, as well as 500 Startups, Startupbootcamp, and Techstars, providing training, AWS credits, in-person technical support, and other benefits. Middle Eastern Organizations Increasingly Moving to AWS Organisations across the Middle East – in UAE, Saudi Arabia, Kuwait, Jordan, Egypt, Bahrain, and other countries – are increasingly moving their mission-critical applications to AWS. Startups in the region choosing AWS as the foundation for their business include Alpha Apps, Anghami, Blu Loyalty, Cequens, DevFactory, Dubizzle, Fetchr, Genie9, Mawdoo3.com, Namshi, OneGCC, Opensooq.com, Payfort, Tajawal, and Ubuy, as well as Middle Eastern Unicorn Careem, the leading ride-hailing service in the MENA region. Careem runs all of its operations on AWS and over the past five years has grown 10 times its current size each year. “It is great news that AWS is opening a Region in the Middle East,” said Magnus Olsson, chief experience officer and co-founder of Careem. “We have been all-in on AWS since we launched in 2012 and using the scalability of the cloud has helped us to cope with rapid growth. After starting in Dubai, we now serve over 12 million commuters in 80 cities across the Middle East and North Africa, including Turkey and Pakistan – this would not have been possible without AWS. The new AWS infrastructure Region in the Middle East gives us the opportunity to experiment with new Internet of Things (IoT) technologies that, in the future, will give us the ability to run a fleet of self-driving Careem cars comfortably and safely taking passengers to their destinations.” Some of the Middle East’s most historic and established enterprises are moving mission critical applications to AWS. Enterprises such as Actel, Al Tayer Group, Batelco, flydubai, Hassan Allam, Silah Gulf, Union Insurance, and the United Arab Shipping Company are using AWS to drive cost savings, accelerate innovation, and speed time-to-market. One successful business that used AWS to grow is eCommerce provider SOUQ.com. “AWS has been key to our success throughout our journey of exponential growth in the MENA region,” said Ronaldo Mouchawar, CEO and Co-Founder of SOUQ.com. “Businesses in the region need robust technologies to both scale rapidly and provide world class service to their customers. With innovation being the main focus on the region’s national agenda, the advent of the world's largest provider of cloud computing in the region is great news for everyone. This will help support the growing demand for cloud technologies as well as accelerate business expansions and success, regionally and around the world.” Another enterprise using AWS is Middle East Broadcasting Center (MBC), the largest private media company in the Middle East, delivering Arab language content to over 150 million people across the Middle East and North Africa. “AWS has been critical in our digital transformation initiatives,” said Joe Igoe, MBC's Group Director of Technical Operations. 
“Knowing we will have AWS infrastructure close to our viewers is invaluable for us. AWS has helped us speed up innovation and rapidly expand into a wider range of scalable and reliable digital services, such as our Video on Demand platform, Shahid.net. During the holy month of Ramadan, TV viewership spikes dramatically. This year, we successfully scaled our online platform to support a massive increase in traffic and delivered hundreds of thousands of concurrent video streams and dozens of petabytes of content to our viewers.” Government organizations are also working with AWS to lower costs and better serve citizens in the region, including the Bahrain Institute of Public Administration which has moved their Learning Management System to AWS, reducing costs by over 90 percent. Another government organization using AWS is the Kingdom of Bahrain Information & eGovernment Authority (iGA). The iGA is in charge of moving all government services online and is responsible for ICT governance and procurement for the Bahrain government. Earlier this year the iGA launched a cloud first policy, requiring all new government ICT procurement to evaluate cloud-based services first. Mohamed Al Qaed, Chief Executive of Kingdom of Bahrain iGA, said of the announcement, “AWS forms the backbone of our digital government initiatives so the news that an AWS Region is coming to our country is warmly welcomed by us. Through adopting a cloud-first policy, we have helped to reduce the government procurement process for new technology from months to less than two weeks. We are in the process of migrating 700 servers with more than 50 TB of data to AWS with the goal of decommissioning our hosting platform by the end of 2017. We have also started to migrate systems of national significance, such as our Bahrain Data Locator, and supporting other entity system migrations, like the Ministry of Education LMS that has 149,000 users, with more planned on the way. As we move more mission-critical workloads to AWS, we look forward to even greater efficiencies and being able to complete our mission to become eGovernment & ICT Pioneers.” In addition to established enterprises, government organizations, and rapidly growing startups, AWS also has a vibrant AWS Partner Network (APN) across the Middle East, including APN Partners that have built cloud practices and innovative technology solutions on AWS. APN Consulting and Technology Partners in the Middle East are helping customers to migrate to the cloud include Al Moayyed Computers, Batelco, C5, du, DXC Technology, Falcon 9, Infonas, Integra Technologies, ITQAN Cloud, Human Technologies, Kaar Technologies, Navlink, Redington, Zain, and many others. For companies looking to join the APN or for the full list of members please visit: https://aws.amazon.com/partners/. ### WND UK Hosts Two-Day ‘Hackathon’ Workshop to Help Entrepreneurs and Businesses Develop Sigfox Applications WND UK, the UK’s primary Sigfox network operator, will host a two-day, hands-on workshop at Sci-Tech Daresbury in Cheshire for businesses wishing to learn more about how the Low Powered Wide Area Network (LPWAN) communications protocol – Sigfox, can be developed to improve ideas for commercial business applications. 
In conjunction with the Science & Technology Facilities Council Hartree Centre and Internet of Things (IoT) technology company WaveReach Ltd, this free event, funded by the European Regional Development Fund through the LCR 4.0 programme, will take place from 10:00 to 16:00 on 4th and 5th October 2017. Neal Forse, CEO and founder of WND UK, commented: “The event will be invaluable for businesses and entrepreneurs that may have ideas for a Sigfox application but require technical assistance with implementing them. Dedicated Sigfox trainers and staff from Hartree Centre will be on hand throughout the two days to offer their technical expertise.” Forse continued: “As the UK’s primary Sigfox network operator we have adopted a dynamic rollout strategy which allows us to prioritise base station deployments to meet customer demand. Our approach is key to enabling widespread adoption of Sigfox, which will be a key driver for IoT applications.” [easy-tweet tweet="Sci-Tech Daresbury is home to the Hartree Centre’s IoT testbed" hashtags="Technology, Science"] Tom Kirkham, business development manager at STFC Hartree Centre, commented: “Sci-Tech Daresbury is home to the Hartree Centre’s IoT testbed, which offers companies the ability to test their applications within a campus environment using the latest technologies in both IoT and supporting data-centric technologies. The Sigfox workshop will be a hands-on opportunity to create applications from scratch and test them within the campus using emerging LPWAN technology.” David Griffiths, director for WaveReach Ltd, said: “This event presents an exciting opportunity for businesses to get to grips with the benefits of deploying Sigfox for IoT devices. Sigfox is an extremely affordable technology from a communications cost aspect compared with other wireless technologies such as cellular, Wi-Fi, LoRaWAN and microwave. It’s a superb technology, currently in the process of a massive roll-out in the UK; the Sigfox network provides a robust, secure and low-priced means of delivering data to customers in the UK and around the world.” Places for the two-day workshop are limited and applicants must be able to attend both days. Individuals or companies who are not already taking part in the LCR 4.0 programme will be expected to fill out a short-term assist form. For further information or to register for the event, please contact Tom Kirkham at tom.kirkham@stfc.ac.uk ### The Cloud Security Dilemma Not a single week goes by without seeing an equal number of articles decrying the gaps in cloud security and articles praising the benefits of the cloud, including how much more secure it can be than an organization’s own data centre. So which is it? And for companies who are trying to define their next IT and security infrastructure, what is the right direction? The cloud security dilemma The answer may just not be the same for every company. There are obvious benefits to either side of the equation, and those benefits vary in importance depending on what kind of business you’re considering and which industries that business serves. Ultimately, an organisation will have to consider its risk profile, its security core competencies from an infrastructure perspective as well as from a staffing and resource perspective, its regulatory environment and its potential for a large data breach exposure. Some enterprises naturally have a higher tolerance for balancing risk versus reward whereas others do not.
But it will also have to consider its digital transformation journey, its ability to manage its own IT systems and business applications, its growth plan, its user base and, more importantly, its business model. Let me give you two examples: A financial service company that has a very large data centre with legacy systems and applications, including some that live on the mainframe, will have a large IT security staff and an ample security budget to protect itself from breaches.  Its risk profile will be high based on the sensitive and PII (personal identification information) data it deals with, but it will have plenty of security resources to comply to the regulations it operates within. In contrast, a mid-size company with strong growth and high customer acquisition would rely on SaaS applications to support its fast pace and would more than likely favour cloud vs on-premises implementations.  This mid-size company will lean towards selecting well-known cloud infrastructure providers to host their business because they would consider their security superior to what the company could deploy itself in a data centre with the resources at hand. The shift to ‘identity-first.’ But the reality of this million-pound question is that the premise of how we define security is incorrect, or to be exact, it needs to evolve.  Let me explain.  If you are looking at security as perimeter-based protection, then there is an argument to be made that anything in a data centre, which is surrounded by next-gen firewalls and other threat prevention solutions, will surely be protected.  But consider this: once an important piece of data that lives on-premises behind a firewall (think a financial application) is extracted from that system, exported to an Excel spreadsheet, and then emailed around, that data is immediately exposed. In a world where not only employees but contractors, partners and even customers have access to an enterprise’s applications and data, how can perimeter security be enough? How can it make any data centre more secure than the cloud? The short answer: It can’t. [easy-tweet tweet="We need to embrace the digitalisation of all corporate data and think identity-first" hashtags="Data, Digital"] Today, it is not enough to simply protect data centres, on-premises systems, and cloud apps, we need to embrace the digitalisation of all corporate data and think identity-first. Consider how people work today – everyone has access to data and applications in the cloud or on-premises - the access that needs to be secured is between the user and the data or application. Consider as well that users vary between employees and non-employees like contractors, partners and temporary workers – which makes it even harder to control who has access to what, in the cloud or on-premises for that matter. Then it becomes evident that enterprises need to shift their focus towards safeguarding their users’ access to applications and data -- in other words: to safeguard their users’ digital identities. Here are five things to consider when extending perimeter security to securing digital identities: Establish a system of records for digital identities. This should include employees, contractors, suppliers, partners, customers and potentially even RPA (robotic process automation) bots. Govern access by all these digital identities to understand who has access now and who should have access. This can be done using an identity governance solution to manage and provision access. 
Consider a vendor who can handle both on-premise and cloud applications to manage any complex, hybrid IT environment. Take into account all of your data – both structured data, residing in systems and applications, and the ever-growing amount of unstructured data, from emails, documents and files, that enterprises manage today. This is particularly important given how many recent breaches targeted data stored in unstructured systems (such as a cloud-based document file system). Unstructured data is typically much harder for companies to discover and classify, as it is not clear who owns it and who has access to it. Balance people, process and technology – understand what your core competencies are, and if your company does not have the IT staff to support an identity governance program, a cloud-based solution may be the right choice for you. Rather than spinning cycles debating cloud vs on-premises security approaches and which is better, it’s high time to take a step back. Today, the most successful attack vectors are the users and their credentials. Protecting the digital identities of all users accessing any data has become, or will very quickly become, the priority for most organisations. The more regulated ones will take this direction first, as privacy regulations and data breaches are consuming their boardrooms. Being able to manage the user relationship to data, controlling who has access to what and who reads or sees what – well, that has become the number one security control. ### Neustar WAF Helps Organisations Combat Growing Application Layer Threats Web application firewall capabilities integrated with the world’s largest DDoS defence network for comprehensive protection against web security threats Neustar, Inc., a trusted, neutral provider of real-time information services, today introduced its new Web Application Firewall (WAF). It combines the world’s largest always-on Distributed Denial of Service (DDoS) mitigation service with a cloud-based WAF, as part of Neustar’s Integrated Security Solutions Platform. With this powerful defence combination, Neustar can protect customers against both volumetric attacks and threats that target the application layer, for comprehensive security protection across the entire network stack. “While DDoS attacks continue to command great attention, application layer threats have become equally damaging, are the most difficult to detect and provide little to no advance warning before they wreak havoc on your critical applications,” said Barrett Lyon, Vice President, Research and Development, Neustar Security Solutions and DDoS industry pioneer. “This necessitates a security posture that is always on and also has the scale and expertise to respond to the largest network and application layer threats we are seeing today.” [easy-tweet tweet="The Neustar WAF is a cloud-provider, hardware and CDN agnostic solution" hashtags="Cloud, Security"] The Neustar WAF is a cloud-provider, hardware and CDN agnostic solution, making it compatible anywhere applications are hosted. Customers can reduce costs and consistently configure rules anywhere, without any restrictions. Integrated with Neustar’s always-on DDoS mitigation service, the combination provides a comprehensive, layered protection stack that proactively prevents bot-based volumetric attacks, as well as threats that target the application layer, such as SQL injection, XSS, CSRF, session hijacking, data exfiltration and zero-day vulnerabilities.
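For readers unfamiliar with what an application-layer rule actually does, here is a deliberately simplified, signature-style sketch covering two of the attack classes mentioned above. The patterns are illustrative assumptions only and bear no resemblance to Neustar's production rule sets, which combine far richer signatures with behavioural analysis.

```python
# Deliberately simplified, signature-style application-layer checks for a few
# of the attack classes mentioned above (SQL injection, XSS). The patterns are
# illustrative only and nothing like a production WAF rule set.
import re
from urllib.parse import unquote_plus

RULES = {
    "sql_injection": re.compile(r"""('|")\s*(or|and)\s+\d+\s*=\s*\d+|union\s+select""", re.IGNORECASE),
    "xss": re.compile(r"<\s*script\b|on\w+\s*=", re.IGNORECASE),
}

def inspect_request(params: dict) -> list:
    """Return (parameter, rule) pairs that look suspicious after URL-decoding."""
    findings = []
    for name, value in params.items():
        decoded = unquote_plus(str(value))
        for rule_name, pattern in RULES.items():
            if pattern.search(decoded):
                findings.append((name, rule_name))
    return findings

# Example query-string parameters from a hypothetical login and comment form
print(inspect_request({"user": "alice", "q": "1' OR 1=1 --"}))     # [('q', 'sql_injection')]
print(inspect_request({"comment": "<script>alert(1)</script>"}))   # [('comment', 'xss')]
print(inspect_request({"comment": "nice article"}))                # []
```

The point of a managed, cloud-based WAF is precisely that customers do not maintain checks like these by hand; rule sets are curated, updated against new exploits and applied consistently wherever the application is hosted.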
In addition, Neustar’s Integrated Security Solutions Platform summarises critical information in a single pane of glass, allowing better visibility and simplified management across multiple solutions, exposing threats quickly. Critical security information - DDoS mitigation activity, WAF rules, restrictions and logging, authoritative DNS query traffic, blocked query data for recursive DNS end-user protection, intelligence about IP addresses, web application availability and performance – is delivered in one place. Its intuitive data visualisation capabilities help organisations easily monitor and manage critical services, and understand their operational state at a glance. “Our team is keenly focused on preparing for today’s attacks, as well as tomorrow’s greatest threats, and making sure our clients have the mitigation technologies to counter whatever hackers may throw at them next,” said Nicolai Bezsonoff, General Manager, Neustar Security Solutions. “By adding a cloud-based WAF to our DDoS mitigation solution, Neustar is now able to provide the best of both worlds – the capability to protect against both volumetric attacks and threats that target the application layer. With this new addition to our security stack, we continue helping the world’s biggest brands monitor, accelerate and defend their online presence.” More than 2,500 leading enterprises and government organisations worldwide use Neustar Security Solutions to keep their organisations secure. For more than 20 years, Neustar has been delivering a comprehensive set of solutions for organisations looking to leverage real-time information to improve their security posture. ### NFON UK Relocates and Expands Presence NFON UK, an entity of NFON AG, Europe’s leading cloud telephony provider, is relocating to larger offices in the UK as it seeks to enhance its foothold in the UK market. NFON UK has moved from its offices in W3 in Acton and taken a much larger office in One York Road in Uxbridge, which is over 1,000 square feet bigger. “The move signals the company’s confidence in the UK and supports its long-term plan to become the leading provider of cloud telephony in the UK”, says Myles Leach, Managing Director, NFON UK. [easy-tweet tweet="Demand and adoption of cloud telephony in the UK have hugely increased in recent years" hashtags="Cloud, Telephony"] Demand and adoption of cloud telephony in the UK have hugely increased in recent years, as businesses look to consolidate their ICT providers and move all functions to the cloud. NFON is perfectly positioned to capitalise on this shift, as it has one of the most comprehensive cloud telephony solutions currently available on the European market. This is evidenced by the company’s growth in the UK market to date – its growth exceeded 300% in its third year, and it already boasts over 100 channel partnerships with the likes of Chess, Westcoast and Scansource. Myles Leach: “As part of this relocation and expansion, NFON UK is on a recruitment drive, and we will be adding five new sales and support staff to our already strong team of 30 employees in the UK by the end of 2017. With our larger base, great people and product set, we are perfectly positioned to achieve our growth ambitions.
We have huge confidence in the UK market, and we are pleased to prove this via these new offices, which will continue to support us as we grow.” Hans Szymanski, Chief Executive Officer, NFON AG, comments: “Since we opened the NFON UK subsidiary in April 2013, the UK market has gone from strength to strength, and our support hasn’t wavered in light of the result of the Brexit referendum. This is a big investment in our UK division, which we have made because we believe in the UK team and the UK market. It forms part of our long-term plan, which sees the UK continue to maintain a key role in our overall company growth. It’s fitting that as we pass our tenth-year milestone, we enter a new phase of internationalisation. As well as expanding in the UK, we have also launched in Portugal this summer, just opened a new capital office in Berlin and will launch in Italy in 2017. By the end of this year, we will have increased our number of employees in the NFON group by 34 percent.” ### Are Security Fears Holding Back Hybrid IT? So much is written about hybrid IT that it would be easy to think that adoption is now a given and that every business with savvy enough IT professionals is scrambling to adopt the hybrid IT business model. The truth is a little less clear-cut. The benefits of hybrid IT have been pored over in great detail since businesses started to opt for a mix of on-premises and cloud infrastructure. SolarWinds IT Trends Report 2017: Portrait of a Hybrid Organisation, a study surveying IT professionals, found that respondents cited scalability, improved efficiency and cost savings as reasons for moving to hybrid IT. It was also found that nearly three in five organisations have received either most or all of these expected benefits. With this in mind, it may come as no surprise that 92 percent of respondents said their organisations had migrated critical applications and IT infrastructure to the cloud over the past year. However, a major issue facing these organisations is a fundamental lack of understanding about what hybrid IT solutions mean for their organisational processes and personnel. This mix of services owned and managed by an internal team and those owned and managed by cloud service providers (CSPs) can cause confusion, especially over where compliance and security responsibility begins and ends. While the benefits of hybrid IT have been rightly touted, it's also true that a hybrid IT approach hasn't worked for every organisation. The SolarWinds survey found that 22 percent of respondents who migrated applications and infrastructure to the cloud ultimately brought them back on-premises, primarily due to security and compliance concerns. Security is a priority Security is a primary concern for all IT professionals. It is one of the core deliverables for CIOs to their organisations, along with operational efficiency and improved agility. Cybercrime is a fast-growing industry with huge ransoms, as phishers and hackers move to get their hands on the most valuable of commodities: an organisation’s data. [easy-tweet tweet="Data breaches are happening with greater regularity and severity" hashtags="Data, security"] Data breaches are happening with greater regularity and severity. These attacks not only result in significant financial loss and liability but can also harm the business’s reputation with customers.
With such high stakes, it is no surprise that IT professionals are hesitant to place their complete trust in third-party service providers, especially when outsourcing parts of their infrastructure services to CSPs. Under pressure to avoid breaches To prevent data breaches, businesses need to fortify their security strategies, processes, and protocols. For IT professionals, security procedures have traditionally taken a back seat to more pressing issues, such as operational health and performance. Security can no longer be a secondary thought. The potential collateral damage from a data breach can cripple a business. In response to the increasingly sophisticated threat landscape, IT budget managers are dedicating more personnel, resources, and processes to breach prevention. They're also focusing on risk mitigation after a breach occurs, as the weakest link remains the end-users, who are vulnerable to spear phishing and socially engineered attacks. Hybrid IT solutions can aid this growing emphasis on security and compliance, as cloud service providers can bridge potential gaps in security and compliance rigour that plague some internal IT organisations. Be ready for the future by relying on your monitoring discipline Cloud shifts businesses' focus to the automation and orchestration of highly scalable and available infrastructure services, with an emphasis on security and compliance. Again, relations with CSPs are based on trust. As such, businesses with proper investment and sound security practices won't miss a beat with a hybrid IT approach. Cloud service providers' security practices are just as effective as those on-premises, and a company with the right resources and protocols in place can simply extend that rigour and discipline to encompass the security of cloud workloads and data. They'd get the benefits of cloud deployments – less management, improved scalability – without compromising on security. Additionally, a well-thought-out monitoring discipline plays a key role in ensuring the security of a hybrid IT approach. A comprehensive hybrid IT monitoring solution can offer holistic visibility across the entire application stack, from the data centre to the end-user, and ensure that any anomalies are spotted and addressed. This solution should break down the silos between on-premises and the cloud to quickly surface a single point of truth. In closing, monitoring establishes and builds trust with CSPs by giving IT professionals valuable insight into their applications' health and performance, while also verifying that CSPs are meeting their SLAs as organisations transition mission-critical application services to the cloud. This rigour and discipline can help businesses continue to realise the benefits of hybrid IT without the fear of compromised security and governance. ### Dell Boomi Adds Boomi Flow to its Unified Platform for Building the Connected Business With low-code workflow automation, Dell Boomi becomes the only enterprise iPaaS vendor to combine low-code workflow automation with cloud-native integration Dell Boomi™ (Boomi) announced an extension of its unified platform capabilities with the introduction of Boomi Flow, its new low-code development solution for building and deploying workflow applications. Boomi Flow aligns with Boomi's flagship cloud integration technology to deliver the industry's only unified cloud platform that orchestrates data, applications and business processes to help organisations run faster and smarter.
Based on Boomi's acquisition of San Francisco-based software vendor ManyWho earlier this year, Boomi Flow lets organisations easily build workflows that streamline business processes such as order-to-cash, purchasing, employee onboarding, case management and credit limit approvals. Boomi Flow equips organisations to improve productivity, accountability and collaboration internally and with customers and partners to help build The Connected Business and drive digital transformation efforts. [easy-tweet tweet="Boomi Flow completes the integration cycle to let customers connect everything and engage everywhere" hashtags="Engage, Connected"] Developers use Boomi Flow's intuitive, low-code environment to rapidly create business applications that bring together people, processes and systems, or which meet predetermined parameters for automated execution. In one example, Boomi Flow can help organisations streamline the employee lifecycle, from recruitment, background checks, onboarding, training, promotions and other job changes, to separation. "Boomi Flow completes the integration cycle to let customers connect everything and engage everywhere," said Boomi CEO Chris McNabb. "It's a natural fit and a big differentiator. Adding workflow orchestration to our capabilities to move, manage and govern data really opens new frontiers for how organisations can digitally transform and build The Connected Business." McNabb noted that Boomi has seen significant interest from its Integration Platform as a Service (iPaaS) customers in how Boomi Flow can augment and extend their ability to connect applications and data sources. Conversely, Boomi Flow customers are looking to Boomi iPaaS for agile integration that's far faster and easier than custom-coding or traditional on-premise middleware, he said. Based in South Africa, Afrox Healthcare is the sub-Saharan Africa arm of Linde Healthcare, a part of the global Linde Group of Germany. Afrox Healthcare used Boomi Flow to rapidly build a mobile app that streamlines end-to-end processes for medical care and nurse visits to patients. Delivered in just eight weeks, the app connects to SAP and Salesforce and enables nurses to enter and access key data to improve healthcare delivery. "Boomi Flow made it fast and straightforward, with minimal coding, to build a mobile app that can be used both online and offline, in places where 3G and wifi are unavailable," said Joseph Ramashala, Head of Afrox Healthcare. "It integrates seamlessly with our SAP and Salesforce systems and supplies data and file security, with identity management available to mobile." With Boomi Flow, Boomi is the only software vendor to combine workflow automation with application integration in a unified, cloud-native platform. The Boomi iPaaS technology works in concert with Boomi Flow to move, manage and govern the data that drives business processes. Synergies between Boomi Flow and iPaaS give organisations a complete set of workflow and data management capabilities that can span multiple internal and external stakeholders, and cloud and on-premise applications. Organisations can utilise the technology to create and manage processes from simple to sophisticated and run a more efficient business. Boomi Flow further aligns with Boomi's Master Data Management (MDM), API Management and EDI (electronic data interchange) offerings to supply additional capabilities. For instance, Boomi MDM can check for duplicate records across several applications in a workflow and flag them for resolution, as the sketch below illustrates.
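The duplicate-check idea itself is simple enough to sketch outside any particular product. The snippet below is a minimal, hypothetical Python illustration rather than Boomi's actual MDM API: it normalises a key field from records pulled from two systems and flags any overlap for a data steward to resolve.

```python
# Minimal sketch of duplicate detection across two record sources.
# Hypothetical data and field names; a real MDM tool would apply fuzzy
# matching and survivorship rules rather than exact key comparison.

def normalise(email: str) -> str:
    """Lower-case and strip whitespace so trivially different keys match."""
    return email.strip().lower()

def find_duplicates(source_a, source_b, key="email"):
    """Return records from source_b whose key already exists in source_a."""
    seen = {normalise(rec[key]) for rec in source_a}
    return [rec for rec in source_b if normalise(rec[key]) in seen]

crm_records = [{"email": "Jane.Doe@example.com", "name": "Jane Doe"}]
erp_records = [{"email": "jane.doe@example.com ", "name": "J. Doe"},
               {"email": "sam@example.com", "name": "Sam Lee"}]

for dup in find_duplicates(crm_records, erp_records):
    print("Possible duplicate, flag for resolution:", dup)
```

In a workflow, a match like this would typically pause the process and route the record to a human reviewer rather than discarding it automatically.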
Boomi API Management equips organisations to build connectivity across systems ranging from legacy mainframes to mobile apps, preparing them for data exchange and workflow automation. "From connecting on-premise to cloud to devices, it's super exciting to bring workflow automation to the Boomi platform," said Steve Wood, VP & GM, Boomi Flow. "Our joint customers are transforming their businesses using Boomi's unified Cloud platform to achieve greater business agility." ### The Challenges of IT Within a Regulated Environment With the ever-increasing demand for technology – and the ever-increasing demands of technology – knowing the industry you work in when it comes to technical support is paramount. Regulations for technology within strict compliance industries are hard to keep up with, or even relate to, as this is often seen as a black art full of nondescript terms that outline certain systems and processes. Let's highlight some examples to see if you know some of the terminology: Cryptocurrency, Tokenisation, Blockchain, Fintech. To name a few, these are the most common being bandied around at the moment. However, understanding the terminology is only the first hurdle; how about understanding the regulators' enforced compliance rulings for your business technology? REC 2.5.19 G (01/04/2013): The FCA may have regard to the arrangements for maintaining, recording and enforcing technical and operational standards and specifications for information technology systems, including: The procedures for the evaluation and selection of information technology systems; The arrangements for testing information technology systems before live operations; The procedures for problem management and system change; The methods to monitor and report system performance, availability and integrity; The arrangements (including spare capacity and access to backup facilities) made to ensure information technology systems are resilient and not prone to failure; The provisions made to ensure business continuity if an information technology system does fail; The arrangements made to protect information technology systems from damage, tampering, misuse or unauthorised access; and The arrangements made to ensure the integrity of data forming part of, or being processed through, information technology systems. "The FCA may have regard to the arrangements made to keep clear and complete audit trails of all uses of information technology systems and to reconcile (where appropriate) the audit trails with equivalent information held by system users and other interested parties." – The FCA Now you see the quandary here – regulators have all the authority to enforce these rule sets within a compliance world of understanding, but how does a technical support organisation understand these (and the examples listed are a very small subset of what FCA-regulated firms require for adherence) and assist with your technical governance? A certain amount of "decoding the jargon" is required, along with a technical understanding of how the regulations translate into IT requirements – something that can only be found within particular IT environments. Let's drill down on this further with an example of the point I am trying to make. Let's say that I am a CFO of a hedge fund looking for an IT Support organisation to assist with the running and maintenance of my IT estate. What are the most important points that I am concerned about for the support of my IT Infrastructure?
IT Security – absolutely Customer Service – definitely Cost-effectiveness – Yes, I'm a CFO after all Understanding of my industry technical compliance – Yes, but it's not normally a deal breaker With the above points that are important to me, who would I choose and what will be the deciding factor? Sadly not the point you would think! – I have an understanding of technology, I use Google to find answers to problems, and I label technology as a commodity with an associated cost. The fact is that individually I probably spend £10 a day on coffee (£50 a week; times that by 20 staff = £1,000 per week), which is more than I would spend on the cost of being technically supported (sad but true). Now add to this the cloud technologies available that promise bigger, better and cheaper alternatives to the infrastructure I have today, and how can I ever place the right importance on technology with even more choice available? As you can see, this is a real problem for regulated industries, where there is a real need to understand how my technology maps to my business compliance – and with GDPR only months away from enforcement, it is a genuine problem. [easy-tweet tweet="Guidance Technologies – Focused IT Support for regulated businesses" hashtags="IT, IT Infrastructure"] Wouldn't it be a refreshing change to have peace of mind knowing that the technical support firm you engage with understands your challenges without Google searches? Guidance Technologies – Focused IT Support for regulated businesses; yes, you heard that correctly, and we have been doing this for over a decade! We are based in the heart of the City (20 Fenchurch Street) and travel in and around the London area wherever needed. As you would expect, we are experienced with financial IT marketplace jargon, and we provide support and technical advice for regulated firms to a very high standard. Take a look at a typical client quote that we receive from the businesses we support. "We have been supported by Guidance Technologies since we started our company a few years ago. Their understanding, support and proactive approach is second to none. The staff are very friendly and are obviously very experienced with the technology they install and support. The remote assistance works well for us; they are able to deal with and solve problems within minutes of them occurring. I would have no hesitation in recommending Guidance to friends and associates." Misha Patel - Office Manager, Aurelius Investments UK So, in essence, we feel that understanding the individual business needs that equate to your associated technology is paramount when providing technical support. Just as an IFA (Independent Financial Adviser) will advise you on your financial decisions correctly and appropriately, so should your technology provider have the know-how to do this within your industry. ### Channel Live Day 2 - Paul Lawrence, Director for Salpo Technologies Paul Lawrence from Salpo Technologies came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd.
Find out more about Salpo Technologies at: http://www.salpo.com/ ### Channel Live Day 2 - Adam Cathcart, Director for 9 Group Adam Cathcart from 9 Group came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about 9 Group at: https://www.ninegroup.co.uk/ ### Channel Live Day 2 - Steve Denby, Head of Solution Sales for Node4 Steve Denby from Node4 came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Node4 at: https://www.node4.co.uk/ ### Channel Live Day 2 - Neil Wilson, Head of Marketing for Virtual1 Neil Wilson from Virtual1 came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Virtual1 at: https://www.virtual1.com/ ### Channel Live Day 2 - Richard Thompson, Director of Partners for TalkTalk Business Richard Thompson from TalkTalk Business came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about TalkTalk Business at: https://www.talktalkbusiness.co.uk/ ### Channel Live Day 2 - Mike Mills, Head of Channel for Gamma Mike Mills from Gamma came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Gamma at: https://www.gamma.co.uk ### Channel Live Day 2 - Martin Borrett, CTO IBM Security Europe at IBM Martin Borrett from IBM came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. 
Find out more about IBM at: https://www.ibm.com/uk-en/ ### CloserStill Media are pleased to announce the acquisition of Top 50 from UBM Howe and Opie formerly owned IMP events, from where they launched a number of highly successful technology events in the 2000s including Adtech UK in 2006 (subsequently acquired by DMG) and eCommerce Expo in 2006. They are now directors of IMRG, the UK's online retail association, which hosts the UK's largest online performance benchmark for the retail sector. [easy-tweet tweet="Customer service is fundamental and integral to business success, especially in the digital age" hashtags="Customer, Digital"] Howe mentioned, "Customer service is fundamental and integral to business success, especially in the digital age. We are extremely excited to be involved with the UK's leader in benchmarking in this area, and recognise the importance of raising standards of customer service across industries. By combining CloserStill's marketing expertise and our knowledge of benchmarking products we believe we can take this programme to the next level." CloserStill was established in 2008 with co-founders Phil Soar, Phil Nelson and Michael Westcott having previously run Ithaca Business Media. There they ran Internet World as well as several other highly successful events. CloserStill Media was named by the Financial Times this April as Europe's fastest growing exhibition company and won the 2017 Exhibition News Awards for Best Business Exhibition and Best Exhibition Marketing. Alexia Maycock, Marketing Director of CloserStill, adds: "This is a massively exciting investment for us all. Between us, the team now running the Top 50 have been responsible for running the UK's most successful marketing and technology events of the last two decades at the phases when they were growing the fastest and delivering the most for their respective stakeholders." CloserStill Media have also acquired eCommerce Expo and Technology for Marketing. Opie adds, "The synergy between this group of products is the customer; they need to be at the centre of every successful business's culture. The organisation of the future will be designed to meet the needs of the ever-changing consumer, irrespective of the channel." The Top 50 improvement programme will be at the heart of the proposition going forward. Maycock adds, "It is our intention to extend the breadth of industry representation within the membership programme and build the community responsible for customer-facing channels, promoting best practice across sectors." The Top 50 Gala Dinner, scheduled for the 25th of October in Birmingham, is shaping up to be another fantastic celebration of achievement within this sector. Mayfield Merger Strategies acted as adviser to CloserStill Media. ### Channel Live Day 2 - Alex Hollingworth, Senior Director EMEA Sales for Lantronix Alex Hollingworth from Lantronix came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Lantronix at: https://www.lantronix.com ### Channel Live Day 2 - Paul Lawrence, VP for Corero Paul Lawrence from Corero came on Disruptive's live stream at Channel Live on the 13th September.
At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Corero at: https://www.corero.com/ ### CloserStill Media are pleased to announce the acquisition of Technology for Marketing from UBM TFM was launched in the late 90s by CMP (now UBM) within Howe and Opie's team, where they were directors at the time. Subsequently they left to establish IMP Events. There they launched a number of highly successful technology events in the 2000s including Adtech UK in 2004 (subsequently acquired by DMG) and eCommerce Expo in 2006. Howe and Opie have partnered with CloserStill Media to run TFM. Howe comments: "We are delighted to be working with the team at CloserStill who have an incredible track record of investing in and operating award-winning exhibitions in related sectors, and we share a common vision for the potential and importance of the event. We also continue to work with the existing teams to explore new and innovative ways to foster even closer relationships between the market and the show's audiences." [easy-tweet tweet="CloserStill Media was named by the Financial Times this April as Europe's fastest growing exhibition company" hashtags="Marketing, Exhibition"] CloserStill was established in 2008 with co-founders Phil Soar, Phil Nelson and Michael Westcott having previously run Ithaca Business Media. There they ran Internet World as well as a number of other highly successful events before also selling the events to UBM in 2007. CloserStill Media was named by the Financial Times this April as Europe's fastest growing exhibition company and also won the 2017 Exhibition News Awards for Best Business Exhibition and Best Exhibition Marketing. Alexia Maycock, Marketing Director of CloserStill, adds: "This is a massively exciting investment for us all. Between us, the team now running TFM have been responsible for running the UK's most successful marketing and technology events of the last two decades at the phases when they were growing the fastest and delivering the most for their exhibitors." Opie adds: "These are exciting times for martech and the market generally. At no time has innovation in this space been moving faster. We look forward to working with suppliers, users and industry players to help the event become the definitive and interpretive event for marketers looking to develop their strategies and make the most of marketing technology solutions." TFM runs 27-28th September at Olympia. The event content will be a key component of how the team plan to develop it and make it more relevant for senior buyers going forward. As the first part of this, they are also today announcing a new "Digital Customer Theatre" in association with IMRG, the UK's association for online retail, at the 2017 event. Speaker highlights include Wickes, Diageo, Vodafone, Sainsburys, Mattel, notonthehighstreet.com and Eurostar sharing their latest experiences and their plans for harnessing technology. The focus of the team will be to build on the seniority of delegates by making the event more strategic and visionary in its content proposition.
Maycock adds: "Even before the deal was done we had put in place a cracking new additional programme; recruited a team of telemarketers to launch a VIP programme; and developed an additional multi-channel marketing campaign to back this up. We will be targeting the UK's biggest organisations by marketing spend, which we will then be building on for 2018. We will be working closely with the industry to take the event to the next level." There has been a fantastic response from the community regarding this new venture: 'We were pleased to hear that IMRG and CloserStill are getting involved in TFM – it's important that we can rely on these events to provide genuinely actionable insight for our teams, and we look forward to seeing how they evolve the proposition to be really compelling for businesses such as ours.' Neil Sansom, Woolovers 'The market for events in our space has become very congested and we need to select the ones that will deliver the most useful insight for our teams. We are confident that the team behind TFM will develop it into a must-attend event in the calendar.' Johanna Fawcett, Zizzi 'We are delighted to be working closely with the organisers to help evolve the show proposition to really reflect what businesses like ours want to hear about and make the content compelling, relevant and informative.' Mark Wright, Jack Wills 'Bringing together the powerful data that IMRG offers with the reach of TFM, eCommerce Expo and Top 50 Companies, this acquisition is very exciting news for those in retail. I look forward to being involved in helping shape the development of them and seeing the benefits of knowledge and the network they make available to my team.' Madeleine Melson, Amara Register here if you would like to attend: https://registration.n200.com/survey/31fj0k5a63naf ### Disney-developed Dragonchain brings the power of blockchain technology to business Dragonchain delivers the first hybrid blockchain platform for businesses and offers an incubator program to start-ups and entrepreneurs building applications on the platform Blockchain is a disruptive technology for trust-based business models, from proving the provenance of supply chains or recording identity through to replacing public ledgers such as property transaction records Dragonchain allows easy integration of blockchain technology into existing business models using a scalable solution hosted on the cloud Dragonchain, the open-source blockchain platform originally developed at Disney in 2015, today announces it will bring the power of blockchain technology to business applications with a secure, serverless, scalable platform running in the cloud. This will enable simple integration of blockchain functionality into existing business models. To date, blockchain technology has been constrained by a number of security concerns when building smart contract applications. Dragonchain addresses these limitations by allowing businesses to retain complete control of their sensitive business data and proprietary business logic.
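Before looking at how Dragonchain layers its verification levels, it helps to sketch the basic "proof of existence" pattern that hybrid designs of this kind build on. The snippet below is a minimal, hypothetical Python illustration and assumes nothing about Dragonchain's actual interfaces: the sensitive record stays in-house, and only its hash would ever be published to a public chain as a checkpoint that third parties can later verify.

```python
# Minimal "proof of existence" sketch: hash a private record so that only
# the digest needs to be published to a public ledger as a checkpoint.
# Purely illustrative; not Dragonchain's actual interface.
import hashlib
import json

def checkpoint_digest(record: dict) -> str:
    """Deterministically serialise the record and return its SHA-256 digest."""
    canonical = json.dumps(record, sort_keys=True, separators=(",", ":"))
    return hashlib.sha256(canonical.encode("utf-8")).hexdigest()

private_record = {"contract_id": "PO-1042", "amount": 25000, "currency": "GBP"}

digest = checkpoint_digest(private_record)
print("Publish only this digest as the public checkpoint:", digest)

# Later, anyone holding the original record can recompute the digest and
# compare it with the published checkpoint to confirm the record existed
# unchanged when the checkpoint was written.
assert checkpoint_digest(private_record) == digest
```

The business data never leaves the private environment; only the digest does, which is why this style of design can keep proprietary logic confidential while still offering independent, public verification.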
[easy-tweet tweet="The public blockchain provides publicly verifiable checkpoints, even for private implementations" hashtags="Blockchain, Hybrid"] Dragonchain is a hybrid blockchain, allowing businesses to innovate using different levels of verification from basic transaction verification up to public blockchain verification on Bitcoin or Ethereum to provide independent witness or “proof of existence”. The benefit is that applications can achieve consensus in stages, from a private trusted context up to a trustless public blockchain. The public blockchain provides publicly verifiable checkpoints, even for private implementations. The result is a flexible method of implementing business-focused systems. The platform is designed such that a traditional software engineer can easily and quickly build secure smart contracts into their applications with no prior experience with blockchain technology or platforms. Dragonchain was originally developed at Disney’s Seattle office in 2015 - 2016 under the name, “Disney Private Blockchain Platform”. The project was open-sourced by Disney in October 2016 and is maintained by the Dragonchain Foundation. Businesses will be able to pay for access to the platform, services, software and Dragonchain incubator with Dragon tokens issued in the forthcoming ICO. The Dragonchain Incubator will support start-ups and entrepreneurs building applications on the platform. Joe Roets, Founder and CEO of Dragonchain, Inc., says: “When we created Dragonchain, we wanted to build an easy-to-use hybrid blockchain platform with all the benefits of immutable proof on the public blockchain, in a flexible business-focused package. Most importantly, we wanted to build a platform that is secure and easy to implement. “Unlike existing platforms, with Dragonchain you retain complete control of your data. Sensitive business logic and smart contract functionality is kept proprietary. Dragonchain enables applications to leverage a spectrum of trust through multi-level contextual verification.” ### Channel Live Day 2 - Bipin Patel, Technical Director for Open Minds Bipin Patel from Open Minds came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Open Minds at: http://www.openminds.co.uk/ ### CloserStill Media are pleased to announce the acquisition of eCommerce Expo from UBM Howe and Opie formerly owned IMP events, from where they launched a number of highly successful technology events in the 2000s including Adtech UK and eCommerce Expo in 2006 (subsequently acquired by DMG and UBM respectively) Thereafter, Justin and Graeme joined the trade association for the ecommerce industry, IMRG. IMRG was a founding partner of both the event and the awards, and remains a key industry partner of the expo, being integral to the success of the event. Justin Opie says, “We are absolutely delighted to be back running the event. These are exciting times for the ecommerce sector and the market generally. The UK leads innovation in the ecommerce sector and the event is the industry showcase. 
We look forward to working with the retail community, solution providers and industry stakeholders to help the event become the definitive and interpretive event for ecommerce in Europe." Howe and Opie are partnering with CloserStill Media to run eCommerce Expo. Howe comments: "We are delighted to have partnered with the team at CloserStill who have an incredible track record of investing in and operating award-winning exhibitions in related sectors, and we share a common vision for the potential and importance of the event. We also continue to work with the existing teams to explore new and innovative ways to foster even closer relationships between the market and the show's audiences." [easy-tweet tweet="CloserStill Media was named by the Financial Times this April as Europe's fastest growing exhibition company" hashtags="Exhibition, Media"] CloserStill was established in 2008 with co-founders Phil Soar, Phil Nelson and Michael Westcott having previously run Ithaca Business Media. There they ran Internet World as well as a number of other highly successful events. CloserStill Media was named by the Financial Times this April as Europe's fastest growing exhibition company and also won the 2017 Exhibition News Awards for Best Business Exhibition and Best Exhibition Marketing. Alexia Maycock, Marketing Director of CloserStill, adds: "This is a massively exciting investment for us all. Between us, the team now running eCommerce Expo have been responsible for running the UK's most successful marketing and technology events of the last two decades at the phases when they were growing the fastest and delivering the most for their exhibitors." eCommerce Expo runs 27-28th September at Olympia. The event content will be a key component of how the team plan to develop it and make it more relevant for senior buyers going forward. As the first part of this, they are also today announcing a new "Digital Customer Theatre" supported by IMRG at the 2017 event. Speaker highlights include Wickes, Diageo, Vodafone, Sainsburys, Mattel, notonthehighstreet.com and Eurostar sharing the latest experiences they have had and their plans for harnessing technology. The focus of the team will be to use this as a first step to build on the seniority of delegates by making the event more strategic and visionary in its content proposition. Maycock adds: "Even before the deal was done we had put in place a cracking new additional programme; recruited a team of telemarketers; and developed an additional multi-channel marketing campaign to back this up. We will be targeting the UK's biggest organisations by marketing spend, which we will then be building on for 2018. We will be working closely with the industry to take the event to the next level." There has been a fantastic response from the community regarding this new venture: 'We were pleased to hear that IMRG and CloserStill are getting involved in TFM – it's important that we can rely on these events to provide genuinely actionable insight for our teams, and we look forward to seeing how they evolve the proposition to be really compelling for businesses such as ours.' Neil Sansom, Woolovers 'The market for events in our space has become very congested and we need to select the ones that will deliver the most useful insight for our teams.
We are confident that the team behind TFM will develop it into a must-attend event in the calendar.' Johanna Fawcett, Zizzi 'We are delighted to be working closely with the organisers to help evolve the show proposition to really reflect what businesses like ours want to hear about and make the content compelling, relevant and informative.' Mark Wright, Jack Wills 'Bringing together the powerful data that IMRG offers with the reach of TFM, eCommerce Expo and Top 50 Companies, this acquisition is very exciting news for those in retail. I look forward to being involved in helping shape the development of them and seeing the benefits of knowledge and the network they make available to my team.' Madeleine Melson, Amara. Mayfield Merger Strategies acted as adviser to CloserStill Media. About CloserStill Media CloserStill Media were established in 2008 and now run more than thirty exhibitions in five countries, with 200 staff and offices in London, New York, Hong Kong and Singapore, within the technology, healthcare and learning sectors. In the last seven years they have won 30 major industry awards, including being six-time winners of Best Business Exhibition, and in 2016 were awarded "The Most Respected Exhibition Company" by the Association of Exhibition Organisers. For the last two years they have featured in the Sunday Times International Track of the hundred UK companies with the fastest international sales growth. Their events include Cloud Expo Europe, Big Data World, Data Centre World, the London Vet Show and Learning Technologies. Register here if you would like to attend: https://registration.n200.com/survey/1hzyromonugf1 ### Channel Live Day 2 - David Fearne, Technical Director for Arrow ECS David Fearne from Arrow ECS came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Arrow ECS at: http://www.arrowecs.co.uk/ ### WND UK dispels Sigfox network myths The UK's primary Sigfox Internet of Things network operator confirms Sigfox's reliability, cost and security advantages Previous attempts to roll out a reliable and affordable low-power wide-area (LPWA) communications network in the UK have fallen short of the mark, resulting in a lack of confidence in the technology, misinformation in the market and uncertainty among customers. That's the view of Neal Forse, chief executive of WND UK, the UK's primary Sigfox network operator. WND UK is committed to providing Sigfox coverage to 95% of the UK population by 2019. "The feedback we're getting from many of our channel partners is that some customers remain uncertain about the reliability, security and costs associated with using the Sigfox network. We're spending a lot of time reassuring customers that Sigfox is proven technology and the basis for a commercially viable IoT network. Unfortunately, the failure of previous network operators to deliver on their promises has left us with a confused market," commented Forse.
As the promise of the IoT becomes a reality, service providers and manufacturers are looking for reliable and affordable network solutions to power the billions of low-powered devices expected to come online over the next few years. Forse continued: "If we look at what WND is achieving in international markets such as Europe and Latin America as it continues to deploy Sigfox networks throughout Brazil, Mexico, Colombia and Argentina, then customers should be reassured that we will achieve the same outcomes here in the UK. "Regarding the economics of using the network, we've adopted an entirely different business model compared to previous network operators that allows us to rapidly deploy base stations, and at no cost to customers. "It's a model that is based on usage rather than passing on network deployment costs to our customers. We firmly believe that it is the responsibility of the network operator to fund the development of the network infrastructure. We base licence fees on message volume, resulting in very low 'per message' charges for high-volume users. Furthermore, unlike other IoT networks, you don't require a gateway to use Sigfox." Tim Streather, channel partner and co-founder of Spica Technologies, confirmed the benefits of WND UK's new approach to the Sigfox network rollout: "Working with WND UK is a huge plus for us because of its refreshing customer-centric model to building out a UK Sigfox network. Its dynamic rollout strategy ensures that Sigfox base stations can be deployed at customer sites anywhere in the UK and be fully operational in a matter of weeks. WND UK's approach is about getting base stations rolled out at speed throughout the country." Nick Pummell, managing director for LPRS, commented on the power and cost benefits of Sigfox: "We explain to our customers that when it comes to wireless technologies, then it's not a case of 'one size fits all'. However, Sigfox is a really interesting proposition for us and our industrial customers as it's a reliable technology which has proved particularly useful for battery-powered devices that live in the field for many years, periodically sensing and transmitting their application information. Sigfox's ultra-narrowband (UNB), low-power technology is enabling lots of new applications that were previously not possible using other radio technologies." [easy-tweet tweet="WND UK claims that no other IoT network operates in the same way as Sigfox" hashtags="IoT, Application, Technology"] Enabling IoT security WND UK claims that no other IoT network operates in the same way as Sigfox, and that its protocols are measurably more secure and robust than competing systems. Forse explained: "Although Sigfox-ready devices are classed as IoT objects, they are not directly connected to the internet and do not communicate using internet protocol. Sigfox-enabled devices have a built-in behaviour; when this requires data to be transmitted or received, a device will communicate via a radio message. Each message is picked up by several access stations and is delivered to the Sigfox cloud network over a secure VPN, which then relays it to a predefined destination, typically an IoT application. Because Sigfox devices don't have IP addresses, they are not addressable for rogue hackers to gain access." Such a security design ensures that Sigfox-ready devices are prevented from sending data to arbitrary devices via the internet and are shielded from interception by strict firewall measures.
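On the application side, this model typically means the network's cloud pushes each decoded message to a predefined HTTP endpoint, rather than the device being reachable directly. The snippet below is a minimal, hypothetical Python receiver for such a callback; the field names (`device`, `data`) and the JSON shape are assumptions for illustration, not a documented Sigfox payload.

```python
# Minimal sketch of an HTTP endpoint that receives uplink messages relayed
# by an IoT network's cloud (the device itself is never directly reachable).
# Field names here are assumptions for illustration, not a documented schema.
from flask import Flask, request, jsonify

app = Flask(__name__)

@app.route("/uplink", methods=["POST"])
def uplink():
    msg = request.get_json(force=True)
    device_id = msg.get("device")      # assumed field: device identifier
    payload_hex = msg.get("data", "")  # assumed field: hex-encoded sensor payload
    reading = bytes.fromhex(payload_hex) if payload_hex else b""
    print(f"Device {device_id} sent {len(reading)} bytes: {reading.hex()}")
    return jsonify({"status": "ok"})

if __name__ == "__main__":
    # In practice this endpoint would sit behind TLS and authentication,
    # and only the network operator's cloud would be allowed to call it.
    app.run(port=8080)
```

The key point is that the application trusts a single, known relay rather than exposing every device as an internet-addressable host.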
Lowering costs With the costs of connectivity and devices falling significantly, everything is now in place for Sigfox to become the preferred wireless network technology for low-powered devices. David Griffiths, director for WaveReach Ltd, said: "Sigfox is a very affordable technology from a communications aspect, particularly when compared with other wireless technologies such as cellular, wi-fi or microwave. It's a superb technology and once rolled out in the UK, the network will provide a robust, low-priced means of getting data around the country." WND UK is adopting a dynamic rollout strategy, prioritising deployments to meet customer demand. Since the launch of its UK operation in March 2017, the operator is already providing coverage to over 37% of the UK's population and is on track to cover 50% before the end of 2017, with over 450 base station installations across the UK. ### Channel Live Day 2 - Elsa Chen, CEO for Entanet Elsa Chen from Entanet came on Disruptive's live stream at Channel Live on the 13th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Entanet at: https://www.enta.net/ ### Channel Live Day 1 - Mat Jordan from Procurri, Neil Cattermull from Compare the Cloud and Andrew McLean from Disruptive Andrew McLean from Disruptive came on Disruptive's live stream and was interviewed by Mat Jordan from Procurri and Neil Cattermull from Compare the Cloud at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Procurri: https://www.procurri.com/ Find out more about Disruptive: https://www.disruptive.live/ ### Channel Live Day 1 - Cal Leeming, CEO for Lyons Leeming Cal Leeming from Lyons Leeming came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Lyons Leeming at: https://lyonsleeming.com/ ### Channel Live Day 1 - Jay McBain, Principal Analyst for Forrester Jay McBain from Forrester came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Forrester: https://go.forrester.com/ ### Channel Live Day 1 - Adam Clyne, Director of Coolr Adam Clyne from Coolr came on Disruptive's live stream at Channel Live on the 12th September.
At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Coolr: https://www.wearecoolr.com/ ### Channel Live Day 1 - Andrew Wilson from Node4Ltd Andrew Wilson from Node4Ltd came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Node4Ltd: https://www.node4.co.uk ### Channel Live Day 1 - Tony Martino from Tollring Tony Martino from Tollring came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Tollring: http://www.tollring.com/ ### Channel Live Day 1 - Bernie McPhilips from Pangea Bernie McPhilips from Pangea came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Pangea: http://pangea-group.net/ ### Channel Live Day 1 - Chris Gilmour from Axians Chris Gilmour from Axians came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Axians: http://www.axians.co.uk/ ### Groundbreaking Smart Device for Urban Cyclists SmartHalo brings innovative technology to cyclists for safer and healthier urban mobility SmartHalo Technologies Inc., the urban mobility technology innovator, announced today that SmartHalo, its smart device for urban cyclists, is now available for $149 in the US and Canada online at smarthalo.bike. With its circular, award-winning minimalist interface, SmartHalo shows the quickest and safest paths to destinations, lights the way at night and keeps the bike safe against thieves with its integrated alarm system. It's the smart biking system cyclists have been waiting for. [easy-tweet tweet="SmartHalo addresses the current challenges of today's urban mobility" hashtags="Smart, AI"] SmartHalo addresses the current challenges of today's urban mobility and contributes to greener, more human cities. With its sleek design, SmartHalo attaches to any type of bike handlebar and pairs by Bluetooth with the SmartHalo app.
While cycling, the phone stays in the pocket or the purse – all the information is shown using the tactile circular interface and contextual sounds. Precise and intuitive navigation is designed to take the safest and fastest routes, while keeping the eyes on the road. A powerful front light makes night rides safer and more enjoyable, and automatically turns on and off depending on the time of day. An alarm is designed to trigger a loud sound when the bike is tampered with, to prevent bike theft. A series of enhanced fitness features automatically track and analyze fitness metrics without having to press start or stop. SmartHalo's assistant notifies you of incoming calls with a simple blue dot at the centre of the interface. SmartHalo took the cycling world by storm with its Kickstarter and pre-sale campaign, with 10,000 units sold and delivered. Discover all our videos on YouTube ### Channel Live Day 1 - Paul Fackrell, Head of Services Fujitsu Paul Fackrell from Fujitsu came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Fujitsu: http://www.fujitsu.com/uk/ ### Property as an Asset Class - Discussion on Blockchain - TrustMe's Antony Abell Compare the Cloud interviews Antony Abell, Managing Director of TrustMe. Antony speaks about the work that TrustMe do in property as an asset class. What does Blockchain mean? Blockchain technology provides an open decentralised database of every transaction involving value. It creates a chronological and public record that can be verified by the entire community. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. ### Mercedes-AMG Petronas Motorsport and Rubrik announce new partnership ●      Pioneer in Cloud Data Management to work with team to accelerate the protection of race data ●      Team to use Rubrik's data backup and recovery technology at their headquarters to improve data management Mercedes-AMG Petronas Motorsport confirms a new team partnership with Rubrik, specialist in Cloud Data Management. With data volumes, backup and recovery requirements becoming ever more demanding in Formula One, the team is investing in class-leading technology in order to stay ahead. Specifically, the team will be using a multi-node Rubrik cluster at their Brackley headquarters to protect the team's critical race data. The team will also use Rubrik's REST API (Application Programming Interface) to integrate with their current tools to analyse their data utilisation. With this information, the team expects to become even more efficient in how it manages and utilises the vast volumes of race data. "We are delighted to welcome Rubrik to Mercedes-AMG Petronas Motorsport," commented Toto Wolff, Head of Mercedes-Benz Motorsport.
"In the fast-moving world of information technology, it's essential to be right at the forefront, particularly for us in the area of data management, and we look forward to working with Rubrik to maximise our potential in this area." "Mercedes-AMG Petronas Motorsport is at the forefront of adopting new technologies within the racing world," said Bipul Sinha, co-founder and CEO, Rubrik. "Rubrik's cloud data management platform will enable the team to access and manage critical race information, providing them with a new competitive edge. We are excited to partner with Mercedes-AMG Petronas Motorsport to turbocharge their innovative approach to data management and contribute to their continued success." ### Channel Live Day 1 - Gordon Davies and Adrian Fry from Cisco Gordon Davies and Adrian Fry from Cisco came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Cisco: https://www.cisco.com ### Channel Live Day 1 - James Marshall, CTO from Microsoft James Marshall from Microsoft came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Microsoft: https://www.microsoft.com ### Channel Live Day 1 - Zoe Magee, Head of Channels from Channel Tools Zoe Magee from Channel Tools came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Channel Tools: https://www.channel-tools.co.uk/ ### When Androids Attack: Protecting Against WireX Botnet DDoS Attacks It appears Mirai may have some competition. And its name is WireX. Google recently removed roughly 300 apps from its Play Store after researchers found that the apps in question were secretly hijacking Android devices to feed traffic to wide-scale distributed denial of service (DDoS) attacks against multiple content delivery networks (CDNs) and content providers. The WireX botnet is to blame. Akamai researchers first discovered WireX when it was used to attack one of its clients, a multinational hospitality company, by sending traffic from hundreds of thousands of IP addresses. The malicious applications in question included media and video players, ringtones and other tools like storage managers. The nefarious apps contained hidden malware that could use an Android device to participate in a DDoS attack as long as the device was powered on. It's unclear how many devices were infected – one researcher told KrebsOnSecurity that WireX infected a minimum of 70,000 devices, but noted that estimate is conservative.
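Floods of this kind tend to show up in server logs as a sudden jump in the number of distinct client addresses hitting an endpoint. The snippet below is a rough, hypothetical Python illustration of that idea, using a made-up log format; real mitigation platforms draw on far richer signals than a simple per-minute count.

```python
# Crude sketch: flag time windows where the number of distinct client IPs
# jumps well above a baseline, as in a flood from a large device pool.
# The log format (ISO timestamp then IP per line) is made up for illustration.
from collections import defaultdict
from datetime import datetime

THRESHOLD = 10_000  # assumed baseline of distinct sources per minute

def flag_floods(log_lines):
    sources_per_minute = defaultdict(set)
    for line in log_lines:
        ts_text, ip = line.split()[:2]
        minute = datetime.fromisoformat(ts_text).replace(second=0, microsecond=0)
        sources_per_minute[minute].add(ip)
    return [(minute, len(ips))
            for minute, ips in sorted(sources_per_minute.items())
            if len(ips) > THRESHOLD]

sample = ["2017-08-17T14:02:11 198.51.100.7 GET /",
          "2017-08-17T14:02:12 203.0.113.9 GET /"]
for minute, count in flag_floods(sample):
    print(f"{minute}: {count} distinct sources - possible flood")
```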
It is believed that devices from more than 100 countries were used to participate in the attacks. Protecting Mobile Networks from Weaponised Smartphones WireX, much like its predecessor Mirai, illustrates the importance of protecting your network and applications from attacks. Large-scale attacks can come from anywhere, even a botnet comprising tens of thousands of Android devices. As these types of attacks grow in frequency, sophistication and size, organisations need to have solutions in place to stop them before they have the opportunity to wreak havoc. [easy-tweet tweet="Traditionally, mobile and service provider networks are protected against attacks" hashtags="Security, Mobile"] WireX is unique in that it introduces a new threat: weaponised smartphones, which introduce billions of endpoints ripe for infection and capable of propagating bad agents across a mobile network. Traditionally, mobile and service provider networks are protected against attacks that come in through the Internet. However, many critical components are left unprotected based on the assumption that attacks will be stopped at the Internet edge. Attacks like WireX change this paradigm. "WireX proves that attacks can originate from inside a mobile network as well, and a few thousand infected hosts can affect the brain of a mobile network," A10 Director of Product Management Yasir Liaqatullah said. "These infected smartphones will eventually start to attack the critical components of mobile networks, and the potential fallout from that could be tremendous." Attacks like WireX reinforce the need for service providers to protect their key assets on all fronts – not just from attacks from the outside, but from the inside as well. To combat attacks like WireX, service providers and mobile network operators need an intelligent, scalable DDoS defence solution between smartphones and the mobile network infrastructure, both internal and external. To address this sophisticated type of attack, a modern DDoS solution requires intelligence to understand the changing nature of a polymorphic attack, which can change signatures and vary headers, like those launched by WireX. Placing high-performance, scalable and intelligent threat protection in the mobile network will help service providers defend against these billions of weaponised endpoints and empower them to detect online threats and multi-vector attacks, learn from them and, most importantly, stop them. ### Channel Live Day 1 - Neil Muller, CEO from Daisy Group Neil Muller from Daisy Group came on Disruptive's live stream at Channel Live on the 12th September. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd.
Find out more about Daisy Group: https://daisygroup.com/ ### Informatica Reimagines Data Management with Artificial Intelligence to Unleash Data-Driven Digital Transformation Now Shipping, Informatica Release 10.2 Powers Enterprise Cloud Data Management and Intelligent Disruption Delivers the industry’s most comprehensive, AI-driven enterprise data catalogue, powered by the CLAIRE engine Provides the only end-to-end AI-driven enterprise data governance and compliance solution Delivers big data ingestion, processing, streaming and data lakes supporting the latest best-in-class open source technologies, including Spark, Spark streaming, Kafka and Apache Zeppelin Detects and protects critical data across the enterprise with an all-in-one solution Enables enterprise-scalable hybrid and multi-cloud deployments across leading cloud ecosystems, including Salesforce, AWS and Microsoft Azure The need for businesses to become more agile and lead their own intelligent disruptions using data has never been stronger. To accelerate their data-driven digital transformation, organisations need to manage data using a strategic approach rather than a tactical one. To unleash the full disruptive power of data, Informatica has reimagined data management, powered by metadata-driven Artificial Intelligence (AI). Informatica, the Enterprise Cloud Data Management leader, today announced general availability (GA) of the latest version of many core components of the Informatica Intelligent Data Platform®, Release 10.2, an intelligent, scalable and integrated platform for managing any data to accelerate data-driven digital transformation. Powering the 10.2 release, the CLAIRE™ engine has enterprise-wide and metadata-driven AI at its core and informs the Informatica Intelligent Data Platform, from data cataloguing and discovery to data governance stewardship to big data and data lake management. The resulting business outcomes include: [easy-tweet tweet="Successful digital transformations are data-driven and require a strategic approach to data management" hashtags="DigitalTransformation, Data"] Increased ROI from the industry’s most comprehensive AI-driven enterprise data catalogue for discovering and managing all enterprise data. Fast and effective data governance through collaboration-driven alignment, documentation, implementation and adherence to policies and compliance mandates, such as GDPR, BCBS 239, CCAR, etc. Faster and more complete business insights leveraging a scalable and flexible intelligent data platform built to manage data of any volume, velocity and complexity using best-in-class big data technologies. Faster hybrid and multi-cloud data management deployments across ecosystems, leading to greater ROI and decreased risk of moving to the cloud. “Successful digital transformations are data-driven and require a strategic approach to data management that catalogues and governs all data that matters, secures data that needs protection, ensures the quality of trusted data, scales for big data workloads and real-time processing across hybrid architectures, and brings it together for an actionable 360-degree view of all an organisation’s data,” said Amit Walia, executive vice president and chief product officer, Informatica.
“Only Informatica, as the leader in Enterprise Cloud Data Management, delivers on all these imperatives through the industry’s only intelligent data platform, powered by CLAIRE.” Reimagining Metadata Management and Data Cataloging With Release 10.2, Informatica delivers the industry’s leading AI-driven enterprise data catalogue. Powered by CLAIRE, only Informatica Enterprise Information Catalogue can discover and catalogue all types of data and data relationships across the enterprise with AI-driven metadata management so that no relevant or useful data remains hidden or obscure. This includes: Discovering business entities from structured and unstructured data. Rapid deployment in the cloud with one-click deployments on Amazon AWS and Azure. Broader metadata ecosystem connectivity, including Apache Atlas and Azure SQL DB and SQL DW integration. API Framework including open REST APIs for analytics, extensible custom scanner framework and secure open metadata access and enrichment. Reimagining Data Governance and Compliance, including GDPR, BCBS 239 and CCAR With Release 10.2, only Informatica provides an out-of-the-box, end-to-end intelligent data governance solution that brings together people, processes and systems with a holistic, collaborative approach to deliver strategic business outcomes. Using modular and integrated components of the Intelligent Data Platform, including Informatica Axon™, Informatica Enterprise Information Catalogue, Informatica Data Quality and Informatica Secure@Source®, enables organisations to uniquely leverage business, technical, operational, and usage metadata to gain a complete data view for data governance and compliance. Included in this release are: Axon integration with Enterprise Information Catalogue, which increases business and IT collaboration to define, implement, track and attain outcomes. Axon integration with Informatica Data Quality, which enables users to visualise and align policies, data quality standards and rules and offers visibility of overall data quality in the right context for IT and non-technical business users. AI-driven data lineage with Axon, which enables non-technical business users to automate data asset business lineage generation. Reimagining Big Data and Data Lake Management Only Informatica manages all types of data at all latencies, including real-time streaming, for all types of users, at scale – while executing on-the-fly advanced data transforms and data quality rules using collaborative and governed self-service data preparation tools. New in Release 10.2 are: Advanced transformations and pre-built data quality rules, plus advanced data preparation editing including “back-in-time” exploration. Data validation visualisation using Apache Zeppelin and intelligent/smart chart recommendations for collaboration and analyst productivity. Next-generation streaming and changed data capture (CDC) with complex data type processing at scale for virtually all types of streaming data, and first-in-industry support for mainframe and RDBMS log-based CDC as a streaming source. Full support for Amazon Kinesis Streams, Kinesis Firehose and EMR in streaming mode for efficient processing of real-time big data. Broader cloud ecosystem connectivity and deployment for Amazon AWS and Microsoft Azure.  
Reimagining Data Security for Business Outcomes including GDPR With Release 10.2, Informatica detects and protects critical data across the enterprise with the most comprehensive data-centric security solution, including data discovery, classification, User Behaviour Analytics, risk scoring and automated protection. Capabilities in this release include: Acceleration of compliance initiatives with classification and security policies for regulations such as GDPR. Support for unstructured data, including Office, PDF, text, XML, CSV and JSON. Expanded platform support including Amazon Redshift, Azure and SAP. Orchestration of a broad range of security controls for remediation, including Ranger, Sentry, data masking and encryption. Protection of data in cloud, data lake, unstructured data and traditional data environments. Reimagining Hybrid Data Management With Release 10.2, only Informatica manages enterprise-scalable hybrid and multi-cloud deployments with comprehensive Hadoop ecosystem support (including latest versions of Amazon EMR, Amazon S3, Redshift, Microsoft HDI, Azure DW, MapR, Cloudera, Hortonworks, ADLS and BLOB). Additional capabilities include: Rapid one-click and zero-footprint cloud deployments with fast conversions from PowerCenter® to Big Data. Unified, global view and analysis of all data integration processes across the enterprise. Hub-based orchestration of all data flows – across clouds, big data environments and traditional systems. ### Can the UK’s Silicon Roundabout really compete with the Silicon Valley? Last year British tech companies raised a staggering £6.7bn in investment - more than the rest of Europe combined. London is now recognised by the likes of PwC as one of the world’s top Cities of Opportunity; in fact, excluding the US, more tech start-ups go public or are acquired in the UK than anywhere else in the world. The completion of Travelers’ £375 million acquisition of Simply Business (the UK’s largest online business insurance broker) last month is just the latest high-profile example of a UK company gaining traction in the global marketplace and, ultimately, being acquired by a US-based business. But can the UK retain its status as a hub for successful businesses – and can the UK start-up scene compete with that of the US? To begin to answer this, we must first understand the essential role that the cloud has played in enabling UK start-ups to be both flexible and ambitious, and how, by nurturing a culture of creativity, the UK has become such a hotbed for technology start-ups. Make some noise In recent years successful start-ups have become commonplace in the UK; their stories have not only left a lasting impression on the world’s perception of the UK but also reveal an interesting common thread among these businesses. Deliveroo demonstrates this point in practice: founded in 2011, the brand has become a household name in a very short period, operating in over eighty-four cities worldwide. Deliveroo realised that it could use the power of software to create an innovative solution to the challenge of delivering food to its customers more conveniently than the competition, and in doing so helped to reshape the food delivery market. Deliveroo’s appreciation for cloud software as a creative tool is not itself unique; it is instead reflective of an attitude which countless successful UK start-ups have adopted.
Fintech start-up Monzo was only founded two years ago but finds itself on a similar trajectory to Deliveroo (albeit in an altogether different market). The company’s tagline, ‘it’s time for a new kind of bank’, reflects the mindset that has helped to make it a success. Unlike conventional banking, almost all of Monzo’s operations run through mobile applications. Just a few years ago the prospect of such an unconventional bank being issued a UK banking licence would have seemed highly unlikely. And yet, this April, Monzo’s licensing restrictions were lifted, and the bank can now offer its customers a full UK current account. Reach for the skies This change of attitude is significant because it reveals a subtle, but crucially important, shift in the way in which disruptive, software-heavy technology is being viewed. [easy-tweet tweet="Companies today are beginning to understand the importance of agility and innovation" hashtags="Fintech, Cloud"] Looking at the recent Simply Business acquisition reveals how this new attitude is bringing more success for UK start-ups. Speaking of the £400 million move, Alan Schnitzer, Travelers’ CEO, explained that “technology and innovation [are] driving customer preferences and expectation.” Schnitzer argues advancing a company’s digital agenda to better serve customers is “a key strategic priority.” Companies today are beginning to understand the importance of agility and innovation. As Simply Business’ acquisition attests, this insight has been realised during a peak period for UK start-up acquisitions. This isn’t a coincidence: these companies have recognised the role the cloud can play in helping them to adapt to changing customer preference with speed and agility. The cloud has empowered businesses like Monzo, Simply Business, and Deliveroo to create innovative solutions through which to better engage with the customer. Through their creativity, these start-ups have frequently caught the eye of bigger brands and potential investors. Communication is key Specifically, the old mantra that ‘communication is key’ holds true here – businesses are using cloud platforms to help put communications at the very heart of their business. By doing this, they have been able to drive meaningful interactions with their customers. In an increasingly digital age, the importance of this cannot be overstated. Today’s digitally-savvy consumers increasingly expect their communications to be both instant and secure – regardless of the platform they’re using. Once upon a time, even the most well-established and well-oiled brands would struggle to achieve both of these goals simultaneously - the cloud has changed this. No longer do start-ups need huge budgets and lengthy timelines. By working closely with developers, these UK companies have created flexible and responsive interfaces with their customers, which can grow with the company and evolve as customer preference does. The growing opportunity The UK has always strived to be a welcoming, accommodating home for young tech companies – it’s the only G8 member with a Tech Nation Visa, and it also offers attractive tax cuts for start-ups. It has become enveloped in a culture of exploration and experimentation – a perfect setting for the developer in today’s cloud-centric markets. The UK represents somewhere where developers are allowed, and often encouraged, to put the onus on being creative.
In nurturing this unique culture, the Silicon Roundabout in London has the chance to become a true one-of-a-kind setting in which the brightest and most daring minds can work. ### Is it Time to Do or Die? Out-of-date attitudes to your audience and market, a reliance on tired methods of communicating. Our audiences have evolved, but have our systems moved to meet their new needs? Do we still need our internal PR and Comms? Do we need to expand the department? Do your senior management see you talking but hear no words? These are all problems, challenges and questions we hear time after time. We experience innovative design and technological evolutions on a seemingly daily basis, and we’re adept at quickly finding methods of using them to our advantage. But how often do these changes and developments leave businesses in the shadows or with difficult decisions to make? Are those decisions being taken, and are you acting on change quickly enough? There can be many reasons why a business fails to modernise its systems, with the most common being an inability or failure to analyse how your audience and working environment have changed; the other, of course, is sticking one’s head in the sand. Our content marketing industry has seen mobile use grow to dominate the digital market, surging away from traditional desktop and tablet positions, and brands and organisations were initially slow to react and equally slow to capitalise. It’s one thing to deny statistics; it’s another to go against the trend. The trend, as they say, is your friend. Digital marketing spend has grown from £8m in 1997 to £10bn in 2016, which demonstrates that marketers and brands have quickly realised where their audiences have moved to. We aren’t what we were, you know! Knowing that customer viewing and browsing habits have become more transient and fickle comes with many problems, and it’s these problems we’re faced with every day – it’s what forced us to adopt a new order. Customers now spend their time surfing from one channel to the next, with fleeting visits on each. To a marketer, this may at first seem like a nightmare, because your captive target audience isn’t slouched in front of the TV watching your ad anymore. The days of the captive audience have gone – or have they? To the brave, this new era of digital change presents an exciting new opportunity. With multiple new opportunities to interact with both existing and new audiences, we can, in fact, increase the efficacy of our marketing campaigns and, more importantly, our awareness reach. As a result, we designed a unique approach to producing content that we position in front of these surfers wherever they may be, once we’ve identified them. There was a caveat though. Studies showed us that customer behaviours changed - while the same person may have moved from the TV to the mobile, they behave like a different person depending on which digital channel they’re active on – our audience has in fact taken on a Jekyll & Hyde characteristic. This extraordinary data showed us that we needed to understand, in a whole new way, the behaviours and habits of our market depending on where they are, when, and on what platform they’re viewing the content – and this is, in fact, applicable in many industry sectors. In understanding this new data, we were forced to develop new systems and put them in place urgently to keep ahead of the market and our new audience.
Of course, for a small and agile business, this should be a relatively simple and cost-effective exercise in processing change and implementing new ways of working. For larger-scale organisations, however, modernising a system can be like an oil tanker taking evasive manoeuvres to avoid a duck in the water. It doesn’t happen quickly, and it sure makes a lot of waves in the process. Invariably, the duck doesn’t come out well in the end. So, to present and put forward the case for change to decision makers, we needed to design a system to modernise the systems. The first step is to be able to demonstrate that you have effectively understood how the landscape has changed and developed. Be positive about it all! Ensure your business sees the new landscape you now understand as an opportunity. If you can’t see new opportunities in how things have changed, you have a larger issue at hand, but assuming you do, set about effectively designing new methods and approaches to capitalise on these new opportunities and believe in how the change will benefit your organisation in the long run. Address concerns, yes, but present solutions. When we’re asked to help our clients in adapting to change and reaching new and unknown markets, it’s vital to know the organisation is willing to demonstrate commitment. Are they in it for the long haul? Modernisation and success don't happen overnight. Ask yourself if your organisation fully understands the risks of refusing to modernise its systems. They’ll want to know if the financial implication is practical. What’s in it for them and what’s in it for you? At some point, your management board will be faced with a do-or-die scenario, and they’ll be looking at you, wondering whether you’re feeding them the true picture and a watertight case for spending their hard-invested cash. Take the London insurance market, for example. This 331-year-old founding father of the world’s financial market still to this day operates to models designed, for the most part, centuries ago. Communications between brokers and underwriters are still, at times, delivered by hand. As emerging markets grow and as transaction systems operate electronically, the London market’s face-to-face process – simultaneously a strength and a weakness – leaves it vulnerable to losing market share to newer, more agile markets. As those markets develop and grow further, the decline of the London market will accelerate unless something is done, and soon. Tech giant DXC Technology in London is actively engaged in promoting modernisation and developing new systems for the benefit of the whole market. In doing so, they and others face a need to convince brokers and insurers to invest and buy into these new approaches and technologies for the benefit of the market as a whole. As marketers and content producers tasked with communicating the case for this change and adoption of new systems, our audience includes reluctant end-users. We need to use the changed digital landscape to our advantage - to put carefully crafted arguments in front of the decision-makers, influencers and senior management teams at the right times in the right places. A meticulously planned, researched and beautifully orchestrated onslaught of communication, making a clear case and highlighting new, clear solutions that can’t be avoided or denied, makes the case for modernisation watertight. [easy-tweet tweet="Technology is both the facilitator and the enabler" hashtags="Technology, IoT"] Procrastination is always a possibility.
But how much more compelling is a clear, positive, exciting communication articulating the benefits of a better future? In this market, technology is both the facilitator and the enabler. Utilising a suite of carefully and beautifully produced micro video moments that form part of a long-term narrative campaign enables us to present the argument to a double-edged audience. On the one hand, we use video to effectively put an argument and solution to a market; on the other, it heightens awareness across an entire industry. Make new friends. A successful movement for change also needs an honest approach where betterment is achieved collaboratively. While senior managers and management boards will be able to share knowledge about macro workflows and how they fit together, juniors and account managers will have a clearer understanding of what happens at a client level and where problems arise in the first place. Typically, not having an open chain to communicate these challenges is where frustration breeds and a culture of ignorance in senior management can be innocently born. Having identified the requirement for a new approach and successfully communicated or demonstrated the case for it, you may realise you do not necessarily have the capacity to undertake, implement or operate a new system on your own. A close collaborative working relationship with a new network of trusted partners who all have a vested interest in your and their shared success through modernisation will facilitate success quicker and with better results. Once we realised we needed to modernise to help our customers better communicate their messages, we realised that getting the right people in for the job was critical to that success. New methods of working come with a need for new skill-sets, and the initial stage of developing those skills in-house can be expensive and risky. Developing external partnerships with trusted and renowned experts will assist you in many more ways than you initially think – especially if you’re now a group of ducks with that same tanker heading for you. Working as a new team with a new approach will ensure you aim elsewhere and avoid the obvious. ### Compare the Cloud at Channel Live - Day 1 Afternoon Compare the Cloud attended Channel Live on the 12th and 13th September where they were poised and ready to interview a variety of guests live! ### Best tweets from Channel Live 17 Disruptive Live - #CHLive17 ### Booking.com Champions Women in Tech at Web Summit 2017 Today Booking.com, the global technology leader connecting travellers with the widest choice of incredible places to stay, announced an exclusive partnership with the ‘Women in Tech’ track of Web Summit. As part of this initiative, Web Summit and Booking.com are together launching the ‘Women In Tech Mentor Programme’, the first Women in Tech Track mentoring initiative at the event. Web Summit is one of the world’s most highly regarded technology events, bringing together Fortune 500 companies, groundbreaking startups and world-class speakers to celebrate the latest advancements in technology. The ‘Women in Tech Mentor Programme’ is a new component of Web Summit’s Women in Tech initiative, which was launched in 2015 and aims to encourage open discussion about gender diversity in the technology industry.
The on-site mentorship programme will give attendees an opportunity to hear from some of the most inspiring women in the technology industry and get one-to-one mentoring sessions to help them on their journey to charting their own path in tech. As part of this initiative, Web Summit has pledged to give away 14,000 free tickets to women interested in attending the 2017 event. Since the start of this programme two years ago, the percentage of female attendees has risen to 42%, which makes Web Summit one of the most gender-balanced tech events globally. “Diversity of all kinds has been core to Booking.com’s culture since the company was founded 20 years ago. Our workforce is truly global, it consists of 150 different nationalities, and more than half is made up by women”, said Gillian Tans, CEO of Booking.com. [easy-tweet tweet="Gender diversity is key to building a workforce that fosters innovation" hashtags="WomeninTech, Technology"] “We strongly believe that gender diversity is key to building a workforce that fosters innovation, collaboration and creativity and we are continuously working to make Booking.com one of the most gender-balanced companies in the world. We are committed to diversity inside and outside the walls of Booking.com and this is why we are sponsoring the Women in Tech track of Web Summit and the event’s first Women in Tech track mentoring programme,” she added. Booking.com’s CEO, Gillian Tans, will be a keynote speaker at the event. Gillian will also join the team of high-profile mentors at the event, which includes Blake Irving, CEO of domain hosting company GoDaddy, Holly Liu, Founder of the entertainment app start-up Kabam, Mada Seghete, Co-Founder of Branch, a mobile metrics company, and Princess Khaliya Aga Khan, an advocate for mental health and the use of neurotech innovation to tackle critical mental health issues. “I’m very excited to be part of this initiative to help encourage more women to pursue and grow a career in technology. In an industry where women are significantly underrepresented, mentoring can help build up their professional confidence, provide a sense of belonging and encourage them to aim high. Gender diversity is key to driving innovation and collaboration across all areas of business life, so it’s important to bring down the unconscious barriers for women in the industry and promote positive role models. This is a core value for our business and I’m proud to be representing Branch at the first mentoring programme for women at Web Summit,” said Mada Seghete, Co-Founder of Branch. “We are passionate about creating diversity of opinion and bringing people together to discuss issues that really matter,” said Paddy Cosgrave at Web Summit. “As a company that runs events around the world, we are acutely aware that female participation in the technology sector has been, and continues to be, lower than it should be. This mentorship programme will be a great opportunity for female tech talent attending the event to learn from and get inspiration from our incredible network of CEOs, founders and senior executives - some of the most successful tech entrepreneurs in the industry today," he added. ### Watch Compare the Cloud at Channel Live - Day 2! Compare the Cloud attended Channel Live on the 12th and 13th September where they were poised and ready to interview a variety of guests live! ### Is it High Time to Relate Federated Identities and Cloud? With its increased adoption over the last decade, cloud computing has become the hottest trend in technology.
Don’t believe it? Below are some stats from RightScale to prove the point: 85% of businesses now have a multi-cloud strategy, up from 82% in 2016. Businesses today run 79% of their workload in the cloud, with 41% in public cloud and 38% in private cloud. Needless to say, IT professionals are focusing more and more on uptime and stability in order to leverage the competitive powers that cloud computing offers. To that end, a close examination of in-house as well as service-oriented architecture solutions is a must. But at the same time, there is another challenge that businesses are facing: the trend to personalise customer experience, eliminate hurdles and boost security. Striking a balance between the two can be a hard nut to crack for organisations, and that’s where federated identities come into the picture. Although a new term, federated identity or identity federation is a way to build harmonious relationships between business competency and technology efficiency. For more than one reason, identity federation is the first step when moving towards cloud. So let’s first understand what federated identity is. What is federated identity? Password or identity management is a nightmare if you plan to build it in-house! Alongside cost, flexibility and scalability, security is another major concern here. Imagine the same concerns while moving to cloud. [easy-tweet tweet="Single sign-on solution is such an effective tool for federated identities" hashtags="Data, IT, Cloud"] In the world of web, federated identity is the way to link a user’s digital identities, attributes and profile data which are scattered across various identity management solutions. Single sign-on is one effective tool for federated identities. A single sign-on solution unifies the authentication process across channels in the cloud and offers a unified customer view. Most organisations have still not adopted an all-out cloud; they are, however, adopting hybrid cloud architectures. Federation is a must for this cloud adoption, to provide not only single sign-on but also role-based access control. How does federation solve cloud challenges? Now even though organisations are becoming more and more aware of what identity federation is, not many are aware of how exactly this concept fits into their current environment. If analysed further, you will realise that identity federation is all about virtual centralisation of the customer data which is scattered across multiple channels or identity management solutions. The key goal of identity federation is to allow users of one organisation to securely access resources of another organisation easily without the need for redundant user administration. This concept requires all the solutions to use the same protocol in order to achieve maximum benefit. How identity federation can fit into your current environment Use identity federation to create partnerships between multiple remote sites: The simplest form of identity federation can be seen in an organisation’s ability to allow single sign-on over the WAN without duplicating server hardware at those locations. This implies that an organisation can have one or more simple appliances at remote locations, giving the remote node single sign-on capabilities. Use identity federation in public cloud: With federation getting more and more acceptance in the public cloud, PaaS (Platform as a Service) is getting much-desired attention from IT professionals.
Using identity federation in public cloud provides customers with the ability to access remote access portals. Moreover, it provides IT admins with full control over security policies and PaaS authentication. For example, a user who has just verified himself on a corporate appliance can easily access his Gmail account, Google Calendar and other services without needing to provide credentials again. Use identity federation in extranet applications: If identity federation is combined with the cloud for hardware appliances, brands can make use of products like IBM WebSphere Application Server to ensure their services stay secure over the WAN. In this scenario, single sign-on can be implemented for these extranet applications, which will in turn enhance ease of use and smooth the customer experience. Use identity federation by integrating it with SaaS (Software as a Service): Integrating identity federation with SaaS (Software as a Service) solutions has become quite common nowadays. By extending Active Directory, an organisation can now allow users to use Salesforce’s portal and applications without providing additional credentials. Lastly, and first things first, organisations must understand the huge challenges faced while managing identities. With digitisation, SaaS adoption and cloud computing becoming highly popular, adopting new technologies is a must if they want to offer a smooth experience to their customers without compromising security. ### Compare the Cloud at Channel Live - Day 1 Morning Compare the Cloud attended Channel Live on the 12th and 13th September where they were poised and ready to interview a variety of guests live! ### UK Set to Become a Digital Nation as Smaller Cities Make Their Mark Technology-led disruption is accelerating at pace. The rise of digital, data analytics, the internet of things and artificial intelligence is creating new business models, disrupting legacy businesses and defining new paradigms. The change is widespread, with consumer, industrial, not-for-profit and financial services businesses going through periods of introspection and rapid change. We are on the cusp of the Fourth Industrial Revolution (4IR), a revolution based on digital transformation. And, just as every previous industrial revolution has resulted in significant increases in living standards, the 4IR has the power to radically improve our lives. Now is the time for the UK to seize its position as a digital nation of significance by leveraging its digital opportunity and skills. But this will only be possible if all geographies and demographics across the UK are included. [easy-tweet tweet="London has been recognised as the top European Digital City for start-ups and scale-ups" hashtags="Digital, AI, Start-ups"] Just like the first industrial revolution, the whole of the UK has a part to play. Yes, London is predictably leading the way, having been recently recognised as the top European Digital City for start-ups and scale-ups. But eight other UK cities were also within the top 50. As we prepare for a future outside the EU, we need to ensure the UK continues to be one of the best places in the world to invent the future. To do this, we must strive to equip the whole country with the very best infrastructure and supportive business legislation possible. Outside of London, cities such as Bristol, Manchester and Birmingham are unsurprisingly at the forefront of the pack, but smaller, more unlikely cities are making their mark too.
Nottingham is carving a name for itself in energy and transport innovation, epitomised by the successful execution of a city-owned energy company. Meanwhile, the smart city programme in Peterborough was recognised on the international stage when it picked up the Smart City of the Year award at the Smart City Expo World Congress in 2015. Later this month I will be heading to Cardiff to talk at Digital, Wales’ biggest tech event. The country has a long history of being at the heart of industrial revolutions. Wales was second only to England as the leading industrial nation in the 19th century, despite only having a population of 500,000 at the time. The country has continued to punch above its weight as we enter the Fourth Industrial Revolution, fast making a name for itself in the fintech and cybersecurity sphere. Not to mention that half of all compound semiconductors found in devices around the world are made in South Wales. These regional successes are no coincidence. I believe the key to success is that these cities are not trying to compete with London. Instead, they have identified areas within the tech arena that they can exploit, foster and even own. We all know that success doesn’t happen overnight. There is a clear, collaborative strategy and long-term vision that is adopted by all stakeholders to ensure a collective approach for maximum impact. Take Cardiff for example, which has become one of the fastest growing digital clusters in the UK. The £1.2bn Cardiff City Deal will create huge opportunities for companies looking for investment above and beyond those that already exist with Finance Wales and the Development Bank of Wales. Initiatives such as the National Software Academy – an industry-integrated initiative at Cardiff University generating employable software engineering graduates – are ensuring that there is a generous pool of talent that will serve future workforce needs. And then there is Innovation Point, the company tasked with driving innovation and growth in Wales by matching the right organisations with the right opportunities. It is this strategic, multi-pronged, collaborative approach that is necessary for smaller cities to prosper in the current digital landscape. We cannot, however, ignore that many of these regional initiatives and projects are EU-funded. Brexit will be a huge challenge and it is vital that the Government secures a deal that puts the whole of the UK on the best possible footing to thrive outside the EU. ### This week is the UK’s first Cyber Resilience Week 40 events for 2,400 leaders will be held from 11th to 15th September, across the country and online, dedicated to raising awareness of the importance of leadership and action in cyber resilience in this digital age. Digital Leaders, the global initiative to promote digital transformation, has today announced an inaugural Cyber Resilience Week 2017. The week begins with a Social Media Thunderclap awareness campaign which will reach an audience of over 1 million people. The aim of the week is to raise awareness of the opportunities in place to support organisational cyber resilience and to more effectively combat the cyber threats that all organisations face. During Cyber Resilience Week, there will be a series of events that will connect senior leaders and decision makers across all sectors and leverage the innovation, skills, ideas and experience from across the country.
Hosted from the 11th to the 15th of September, around 40 events across the UK will offer 2,400 leaders access to best practice, allowing them to build stronger networks and a better understanding of the new thinking and innovation needed in managing effective resilience for their organisations. Topics will cover GDPR; the power of your people and their behaviours; culture and cyber; charities and local government; cyber crime; public sector and cloud-based cyber-resilient services; SMEs and protecting your business; boardroom skills; keeping our communities safe online and the internet of things. Active event partners include: AXELOS RESILIA™, the London Digital Security Centre, Lloyds Banking Group, UKCloud, Invotra, the Crown Commercial Service, HPE, aql, Nominet, Microsoft, SCC, Informed Solutions, The Cyber Club in the South West, Wolfberry, Highlands and Islands Enterprise, CyNation, the UK Israel Hub, Communicate Technology, Business Lincolnshire Growth Hub, Downtown Lancashire in Business and the Wheatley Group among others. For the full list of events and partners please go to: http://digileaders.com/cyber-resilience-week/ The campaign hashtag is #DLCRWeek Minister for Digital Matt Hancock said: “We have world-leading businesses, but recent cyber attacks have shown the serious effects of not getting cyber security right. "I welcome this week of events to help raise awareness and urge senior executives to work with the National Cyber Security Centre and take up the Government’s advice and training." Digital Leaders CEO Robin Knowles added: “Leaders have a key role to play in protecting our most valuable and precious information – the information that drives our future success. We need to work harder to ensure cyber resilience is accepted as a part of everything that we do.” John Unsworth, Chief Executive, London Digital Security Centre, said: “The Digital Leaders network is allowing us to reach the leaders we want to engage with to secure our businesses and organisations in a digital age”. Nick Wilding, General Manager, Cyber Resilience, AXELOS Global Best Practice, said: “We are delighted to be supporting the Cyber Resilience week that focuses on the ‘human factor’ in cyber resilience and the active and collaborative engagement that’s required – from the boardroom to the frontline – to develop and sustain secure practices.” ### How Cloud Will Enable New Visual IoT Applications The once staid world of CCTV is already being revolutionised by the cloud. Traditionally CCTV was very much a ‘fit and forget’ solution, using – as its name implies – closed circuit wired systems that recorded data onto digital video recorders. Accessing footage usually meant a site visit to download material onto a DVD or USB stick for later review. Today smart adapters mean that analogue and digital cameras can send data directly to secure cloud storage, enabling authorised users to monitor and review event-triggered alerts, live video feeds and recorded footage wherever and whenever they want via their smartphone, tablet or PC. This is now starting to transform the use of CCTV in sectors such as housing, policing and care homes. [easy-tweet tweet="Huge amount of CCTV data is processed but only a very small percentage is actually used" hashtags="Data, CCTV"] However, cloud also opens up the opportunity to do much more with visual data.
At present a huge amount of CCTV data is processed but only a very small percentage is actually used, and this is primarily for a single purpose, typically to identify problems such as unauthorised access to buildings or potential criminal damage. Visual data is also collected by digital cameras for applications ranging from traffic flow monitoring to how often a billboard advertisement is displayed, but again the data is only used for a single purpose. First, as well as making footage much more accessible, cloud enables users to add basic features not present in legacy camera-based systems, such as remote monitoring and alerting. Companies such as Vodafone are already starting to introduce this by integrating cloud-based CCTV with building security systems, adding visual verification to intruder alarms. Such systems would enable home security companies to check properties visually when an alarm had gone off and quickly ascertain whether a break-in or other problem had occurred, or whether it was a false alarm. This provides huge savings in time and money, as well as enabling immediate action to be taken if appropriate. The same principle could be applied to business premises and utilities, ensuring that they remained safe at all times. Maintaining the security of the data collected is of course vital, but it can be ensured by encrypting data prior to sending it to the cloud and using secure cloud storage, in line with the Data Protection Act and the forthcoming GDPR. By doing this, traffic, weather, crime and accident reporting could be completely turned on its head. As an individual with a VIoT device enters a certain area, by previous agreement their data could be aggregated with that of others to create an accurate picture of an event, from how crowded it is to what the weather is like. Anyone travelling to that event would then know what to take, areas of congestion to avoid and even which food stands had the shortest queues. While this could revolutionise our leisure, a more productive application would be in the world of work. For example, video could be used to help streamline workflow in a factory or ensure that people lift heavy loads correctly to avoid injury – not by retrospective analysis, such as when athletes review footage to find areas for improvement, but in real time through combining cameras and other sensors. It could also be used in hospitals to study the performance of surgical or A&E teams to improve the placing of equipment so that it is readily to hand when time is of the essence. ### Using the Cloud to Manage Public Sector Procurement Cloud offers the public sector an opportunity to put users back at the heart of the procurement process, according to Chris Wilson, managing director of adam - but only if they also change their culture to one where sharing information is welcomed rather than feared. Public sector bodies are typically large and complex organisations, and each has evolved specific ways of working over many years. When introducing new technology, they have a tendency to try to simply automate existing processes. If something cannot easily be slotted into ‘the way we do things here’, they can be reluctant to make the switch. Recent research has also identified a view that some services are simply too complex for technology to make a difference.
In a report into digital leadership in local government earlier this year, written by the Local Government Information Unit, fewer than half of the respondents felt that technology could have a positive impact on complex services. In our experience, this is one of the reasons that local authorities have been reluctant to adopt cloud. Its benefits are well known, including scalability and the ability to access information from any location on any device. Remote working is of course particularly beneficial in areas such as social care when time is of the essence, enabling information to be accessed and action to be taken without the need to go into the office. Having a cloud-based service also has security benefits. If there is an issue with their network, staff can continue to work relatively uninterrupted, while the service provider also ensures that all staff are running on the latest version of applications, with all patches applied and management taken care of. However, gaining these benefits means redesigning processes first to take advantage of the new way of working. If the processes themselves are not functioning well, simply adding cloud or any other technology certainly will not fix the problem. [easy-tweet tweet="Using cloud for complex services is about speeding up routine administrative tasks" hashtags="Cloud, Security"] This does not mean taking staff out of the process. Using cloud for complex services is about speeding up routine administrative tasks and enabling staff to focus on value-added activities that make an impact on people’s lives. With demand for local authority services higher than ever and scrutiny of their performance more intense, it is inefficient to continue doing things in the same way as before. Cloud enables people to work better, more efficiently and in a more informed manner. In the area we work in, commissioning, the cloud provides an opportunity to put users back at the heart of complex processes, removing the administrative burden and enabling rapid commissioning of better quality, personalised care. Local authorities can find the services they need quickly and efficiently, based on service details, quality and location, not just cost. The result is faster access to better and more tailored care for users; better use of staff and resources for commissioners and the freeing up of hospital beds as patients can be discharged quickly and safely to appropriate care at home. However, working in this way also requires a change in culture, and this can be even more challenging than changing processes. Data stored securely in the cloud can be accessed by any authorised person on any device at any time, and this enables local authorities to share information more widely than they have done before, both across departments and with third parties in the public and private sectors. In our experience, sharing data is something that many in the public sector like as a concept, and when it happens it works well, but all too often they cannot bring themselves to do it. Information sharing and collaboration were discussed extensively at July’s National Commissioning and Contracting Training Conference for Children’s Commissioners (NCCTC), which brought together commissioners, providers, governing bodies and regulatory authorities. Everyone seemed to agree that improving children and young people’s services required the collective efforts of social care, education, health and other services, but currently, there is little collaboration between them.
Cloud can facilitate this type of collaboration - but only if the culture changes first, and there is a willingness to share information. We have helped local authorities share data with each other and with service providers in areas such as social care and transport. The result has been significant benefits for all concerned, including transparency, cost savings and improved future planning. One example is the South London SEN Commissioning Group (SLC). Their ten-borough partnership has been in place for over a year and has transformed their approach to commissioning services for children and young adults with Special Educational Needs (SEN). In that time, using an integrated commissioning solution, they have made over £1m in savings, have nurtured their provider base to over 100 suppliers and can evidence that the services commissioned meet the service requirements and deliver outcomes to service users, all while juggling conflicting legislation. The public sector is in a state of unprecedented austerity where the vast majority, if not all, of the councils we speak to have budgets (and cuts) as their number one priority. If councils do not find truly innovative ways to run processes and deliver services, this problem will only become worse. To do more with less, local authorities need to embrace technologies such as cloud. However, they will only achieve the full benefits if they change their processes to take advantage of the efficiencies and other benefits that cloud provides and change their culture to one where information sharing is welcomed, not feared. ### GTT to present at Drexel Hamilton Telecom, Media and Technology Conference GTT Communications, Inc. (NYSE: GTT), the leading global cloud networking provider to multinational clients, will present at the Drexel Hamilton Telecom, Media and Technology Conference taking place in New York on September 6, 2017. Mike Sicoli, GTT’s chief financial officer, and Chris McKee, GTT’s general counsel and EVP, Corporate Development, will meet with investors that day. A copy of GTT’s latest investor presentation is available on the company’s website. About GTT GTT provides multinationals with a better way to reach the cloud through its suite of cloud networking services, including optical transport, wide area networking, internet, managed services, voice and video services. The company’s Tier 1 IP network, ranked in the top five worldwide, connects clients to any location in the world and any application in the cloud. GTT delivers an outstanding client experience by living its core values of simplicity, speed and agility. For more information on how GTT is redefining global communications, please visit www.gtt.net. ### Increasing Effective Decision Making by Utilising Data The amount of data now being used and stored every day would have been incomprehensible only ten years ago, with a total of 2.5 exabytes of data, the equivalent of 2.5 billion gigabytes, being produced each day. The name aptly given to this large amount of data, both structured and unstructured, is big data. As we are now operating in an environment centred around big data, the amount of information flowing round and through our systems at any one time is remarkable. How to use your data to your advantage With this challenge of managing data also comes the opportunity to use the information to the advantage of your business. Intelligent machines are now being created to deal with this ever-increasing amount of data.
IBM, for instance, recently unveiled the supercomputer Watson, which can ingest data at the rate of 67 million pages a second. It is obtaining this ever-increasing flow of data from the millions of connected devices in the Internet of Things. The purpose of this latest IBM innovation is to establish a ‘question and answer’ dialogue between it and humans, whereby Watson will run through unimaginable amounts of data and answer in plain speech and in real time. It can use intelligence to streamline data processing tasks such as demystifying incentive plans and analysing sales teams’ effectiveness. Perhaps we should not be surprised. It is the direction of travel for analytics, and it has been made possible by the seemingly infinite capacity of the cloud with its interconnected stacks of thousands upon thousands of servers. [easy-tweet tweet="How can intelligent machines aid in tailoring big data for a specific company?" hashtags="AI, BigData"] Improving performance with business intelligence Some larger technology companies have been at the forefront of utilising this type of Q&A interaction for consumer use. Google with its search engine and, more recently, its Google Home, and Amazon with its smart home device ‘Alexa’, are leaders in this area. Servers can scan through huge amounts of data almost instantly to provide information and answers to any questions consumers could conceive. The question is, however, how can intelligent machines aid in tailoring big data for a specific company? No matter the size of a company, whether it is big or small, local or global, the challenge for suppliers of these machines is understanding the individual ‘key performance indicators’ that are unique to each company, putting the data into a digestible form and consequently using the data to the advantage of the company in question. Companies such as NetSuite, for example, provide this service through a dashboard function in which large volumes of data can be presented in a way that supports directors and managers in running and growing the business. As soon as the data changes, the dashboard changes. Solutions are also now coming into play that allow multiple systems to work together simultaneously, avoiding the disadvantages of employees manually entering information, which takes a long time and allows room for human error. Not only do systems like this increase efficiency, but they also allow for an increase in data visibility and data accuracy. This is particularly effective when integrating cloud systems. For example, the increase in sales channels available means that even leading retailers are no longer reliant solely on their e-commerce store and use third parties such as Amazon and eBay to distribute their services. The data from these systems can now be integrated with their internal business system (see the short sketch below). The time is now As stated previously, data and the way it is managed is becoming increasingly important for businesses to innovate and succeed in competitive environments. Many leading professionals have commented that they believe 2017 has been the year of data analytics. Artificial intelligence is set to take the leading role away from data engineers in the management and control of data, automating processes and, as previously mentioned, reducing human inaccuracies and speeding up processes.
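For readers who want to picture what that channel consolidation looks like in practice, here is a minimal, purely illustrative sketch in Python. The channel names and fetch functions are hypothetical placeholders standing in for a retailer's own webstore and marketplace order APIs; it is not any particular vendor's integration.

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class Order:
    channel: str
    order_id: str
    total: float

def fetch_webstore_orders() -> List[Order]:
    # Placeholder: in a real integration this would call the retailer's own e-commerce API.
    return [Order("webstore", "W-1001", 59.99)]

def fetch_marketplace_orders(marketplace: str) -> List[Order]:
    # Placeholder: in a real integration this would call the marketplace's order API (e.g. Amazon or eBay).
    return [Order(marketplace, f"{marketplace.upper()}-2001", 24.50)]

def consolidated_sales() -> Dict:
    """Merge orders from every channel into one view, so nothing has to be re-keyed by hand."""
    orders = fetch_webstore_orders()
    for marketplace in ("amazon", "ebay"):
        orders.extend(fetch_marketplace_orders(marketplace))
    by_channel: Dict[str, float] = {}
    for order in orders:
        by_channel[order.channel] = round(by_channel.get(order.channel, 0.0) + order.total, 2)
    return {
        "order_count": len(orders),
        "revenue": round(sum(o.total for o in orders), 2),
        "by_channel": by_channel,
    }

if __name__ == "__main__":
    print(consolidated_sales())
```

Feeding a consolidated view like this into a dashboard means the figures update as soon as new orders arrive on any channel, rather than waiting for someone to re-enter them by hand.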
Gartner recently produced a report looking at the future of data analytics, showing how organisations are already innovating their systems to keep up with competitors by encompassing the entire business in their approach to data analytics. The report also suggested that the majority of businesses do not currently have the systems in place to support ‘end-to-end architecture built for agility, scale and experimentation’. Going forward, the report suggests that data analytics will not simply reflect the performance of business, but also drive modern business operations. Executives should begin to make data and analytics part of the business strategy, which will allow data and analytics professionals to assume new roles and create business growth. These changes will drive the use of data and analytics. Although completely AI-managed systems may be further off than we think, businesses should be looking at utilising semi-automated machine intelligence to deliver the business intelligence they need in understandable chunks, becoming more efficient and more profitable and planning for the eventuality of improving their systems in the future. ### IoT - Cloudtalks with Dr. John Bates from Testplant In this episode of #CloudTalks, Compare the Cloud speaks with Testplant's CEO, Dr. John Bates. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. To find out more about Testplant visit: https://www.testplant.com/ CloudTalks ### Is Artificial Intelligence Finally Delivering on Its Promise for Customer Service? The concept of automation in customer service has been around for what feels like an eternity, but until recently it’s been hard-pressed to become an actual reality. The version we saw depicted in movies was too far-fetched. The version we did get to experience – like interactive voice response systems – was faulty more often than not, and universally annoying to the end-user. But today’s intelligent automation is making us all stop and think – has the future finally arrived? Intelligent automation technologies like chatbots and virtual assistants have quickly grown to become an integral part of many businesses. Whether used for internal processes or customer-facing support, armies of chatbots are supporting employees and customers worldwide, enabling new levels of efficient and personalised service. Here are just a few ways intelligent automation is beginning to finally fulfil its promise: Delivering Self-Service Capabilities Consumers Want The expectations of digitally savvy consumers continue to grow, and part of that is the desire for brands to have strong self-service tools. According to a recent study, 78% of consumers would choose a channel other than a phone call if they knew they could get a resolution to their issues on the first attempt. For seemingly quick-fix issues, like password resets, for example, most consumers don’t want to call into a call centre – they just want the issue resolved. Intelligent automation technologies are helping consumers quickly scour knowledge bases – either through search bars or chatbots – to find answers to the most frequently asked questions. Making Human Agents More Efficient There has been some speculation that with the rise of AI, the jobs of human agents are in jeopardy.
Hollywood has certainly had their fun with the concept.   The truth is that AI solutions are designed to complement the human agent – not replace them.  These chatbots can help increase agent efficiency by offloading simple, redundant, and low-touch queries, leaving the agent to handle more complex and interesting service requests.  The day-to-day of an agent may shift slightly, but for the better. Creating More Personalized Experiences Automation and personalization are two concepts that haven’t generally gone hand-in-hand, but that’s all changing.  As AI technologies become smarter and more intuitive, they have the power to provide agents with information about customers – including past activity and any available personal information.  Having the ability to draw from this customer profile, AI technologies can arm agents with information including what the problem or question likely is, the solution to their query, services they may need or products they might like based on their history and other customers like them.  Not only can AI solutions provide that customer information, but they can do it in a split second – instantly helping agents create an interaction unique to that customer. [easy-tweet tweet="AI technologies are helping brands improve their customer service strategy" hashtags="AI, Customer"] Provide Data Driven Insights to Continually Improve One of the best things intelligent automation is doing for brands is something that isn’t seen by the customer at all.  AI technologies are helping brands improve their customer service strategy by collecting communications data from every interaction, extracting valuable insights.  Using data analytics, brands can identify areas customers are continually having issues and take steps to adjust accordingly – improving future interactions. While AI is starting to make a serious impact on customer service, we’re only at the beginning.  Relatively speaking, the technology is in its infancy, but will continue to mature and show success in more and more use cases.  The key, however, will be how and in what capacity organizations leverage it.  AI implementations should be purposeful and strategic.  Not every situation is AI appropriate.  AI’s purpose, at least for now, is not to replace the human agent, but augment the team by taking on lower level tasks and helping make their human counterparts smarter about their customers and brands smarter about their business.   For any digital CX platform, AI-powered or otherwise, the customer experience must be seamless, convenient, and provide a level of personalization that shows the customers that their business is appreciated.  It’s an old-world concept, with a new-world twist.  The successful marriage of the two is the real future of customer service. ### How CFOs are Driving ERP Evolution [vc_row][vc_column][vc_column_text]As every business evolves, it seeks out ways to better manage its growing operations. This evolution is often built on acquiring a complex array of applications - bought and used at different times for very specific business functions. Do you remember when you invested in your first accounting programme, or inventory management or HR system? And the result? A disparate technology landscape that, as it grows and ages, inevitably introduce inefficiencies - leaving companies struggling to maintain clear sight of the bigger business picture and make the right informed decisions. 
The phrase “many hands make light work” does not apply to the typical siloed business application infrastructure. In our experience, too many employees are dissatisfied by the proportion of time they spend on manual manipulation of data. It causes delays and inaccuracies when it comes to reporting and analysis. They are all too aware that their time could be better spent on activities that would continue to support the evolution of the business. But we are seeing the office of the CFO leading a change in the direction of travel for business applications – focused on greater efficiency, visibility and productivity. CFOs that exploit new ways of working, such as the use of integrated business applications in the cloud, are reaping benefits as diverse as increased speed and quality of analysis, lower costs and even better staff retention. ERP delivers integrated business applications for all [click_to_tweet tweet="Investment in ERP for many growing businesses is too often dismissed #ERP #CFO" quote="Investment in ERP for many growing businesses is too often dismissed"]   So, why if we are talking about evolution are we still using the term ERP? ERP (Enterprise Resource Planning) sounds old and, in our fast-moving digital world, it is positively geriatric, having been around since the 1990s. But as a set of integrated business applications, it too has evolved and is now within reach of many more organisations. But sadly, investment in ERP for many growing businesses is too often dismissed - still carrying the reputational baggage of big, resource-intensive IT implementations. This legacy has understandably led to a certain amount of caution at small and medium-sized companies. Finance leaders understand that integrated business applications could speed their evolution but are wary of the cost, resources and the disruption to their business of a lengthy implementation. From our conversations with a growing number of finance teams, it is often the CFO who is best placed to make a case for change. Keeping an open mind, they take a fresh look at technology as they seek out ways to drive out complexity and cost from at best overlapping, at worst contradictory applications. One single source of truth In a fast-growing business, decision makers need to combine all relevant data sources into one single source of truth. Data that exists in siloed is of little use to finance chiefs who need to make real time decisions based upon a range of variables in a comprehensive business context. Centralising data in an ERP system overcomes this obstacle and ensures all relevant factors are taken into account on any given day, or even hour. By integrating data from across the entire organisation, ERP creates a basis for automating key processes. It removes the need for manual invoice processing, for example, freeing up frustrated teams to focus on core responsibilities and value-added tasks. Automated controls, checks, and validations in ERP also help ensure the accuracy of information, catching data conflicts, errors and inaccuracies as they arise. But as well as delivering the process efficiencies you would expect from an ERP system we have also seen how ERP implementation leads to new and innovative practices driven by a re-energised user community. When change can be implemented quickly, it can restore business agility. Business users can spend less time trying to make technology work and invest more time on work that makes a difference to performance and growth. A win-win for the business and its employees. 
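To make the idea of automated controls and a single source of truth more concrete, here is a minimal sketch of the kind of validation an integrated system can run as records arrive. It is illustrative only: the record layout, the rules and the figures are invented for the example and are not drawn from any particular ERP product.

```python
# Minimal, illustrative sketch of the automated checks an integrated ERP
# system can run against a single source of truth. The record layout and
# rules below are hypothetical examples, not taken from any specific product.

purchase_orders = {"PO-1001": 2500.00, "PO-1002": 480.00}  # PO number -> approved amount

invoices = [
    {"invoice_no": "INV-9001", "po_ref": "PO-1001", "amount": 2500.00},
    {"invoice_no": "INV-9002", "po_ref": "PO-1002", "amount": 530.00},  # exceeds the PO
    {"invoice_no": "INV-9002", "po_ref": "PO-1002", "amount": 480.00},  # duplicate number
    {"invoice_no": "INV-9003", "po_ref": None,      "amount": 99.00},   # no PO reference
]

def validate(invoices, purchase_orders):
    """Flag data conflicts before they reach reporting or payment runs."""
    seen, issues = set(), []
    for inv in invoices:
        if inv["invoice_no"] in seen:
            issues.append((inv["invoice_no"], "duplicate invoice number"))
        seen.add(inv["invoice_no"])
        if not inv["po_ref"]:
            issues.append((inv["invoice_no"], "missing purchase order reference"))
        elif inv["amount"] > purchase_orders.get(inv["po_ref"], 0):
            issues.append((inv["invoice_no"], "amount exceeds approved purchase order"))
    return issues

for invoice_no, problem in validate(invoices, purchase_orders):
    print(f"{invoice_no}: {problem}")
```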
ERP in the cloud We have witnessed how cloud technology has allowed major ERP vendors like SAP to synthesise decades of market experience into solutions that scale to the requirements of fast-growing small and medium-sized companies. As the cloud has matured, these systems have become just as powerful as the heavy-duty on-premise systems large enterprises are using – but they have become less expensive, more agile and offer a lower total cost of ownership (TCO). Cloud deployment minimises the capex required around IT investment and saves time on software maintenance. That means SMEs can acquire ERP without additional worries about managing hardware, software, and installing upgrades. But regardless of whether CFOs embrace on-premise or cloud, the benefits of ERP apply: increasing efficiency across many business processes; reducing financial reporting timescales; capturing more and faster business insights from financial information; and enabling more informed and faster decision making. Our clients tell us that their evolution to integrated business systems happens even faster if the solution is implemented in partnership with specialist ERP consultants who ‘get’ the practical and procedural requirements of the office of finance and will work with them to anticipate and avoid potential delays. While cloud implementations are almost always faster and less fraught than on-premise, it helps to have decades of market experience to draw from to get it right first time. Conclusion As reliance on CFOs as a source of insight grows, they increasingly find themselves with a voice at the top table about how best to continue the evolution of the business. And we are seeing them use this voice to drive an evolution in the effective use and value of technology as ERP in the cloud becomes accessible to every business. ### Information Security - Jim Hansen from PhishMe Compare the Cloud speaks with Jim Hansen, COO of PhishMe. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. To find out more about PhishMe visit: https://phishme.com/ ### Amazon WorkSpaces is Available in the EU (London) Region Amazon WorkSpaces is now available in the EU (London) region, increasing the number of AWS Regions in which WorkSpaces is available to eight. This expansion into a new AWS Region allows you to provision WorkSpaces closer to your users, providing a more responsive experience. Additionally, you can quickly add or remove WorkSpaces to meet changing demand, without the added cost and complexity of on-premises VDI infrastructure. Amazon WorkSpaces is a fully managed, secure Desktop-as-a-Service (DaaS) solution which runs on AWS. With WorkSpaces, you can provision virtual, cloud-based Microsoft Windows desktops for your users, providing them access to the documents, applications, and resources they need, anywhere, anytime, from any supported device. You can pay either monthly or hourly, just for the Amazon WorkSpaces you launch, which helps you save money when compared to traditional desktops and on-premises Virtual Desktop Infrastructure (VDI) solutions.
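For teams that manage desktops programmatically, the same provisioning can be driven through the AWS SDKs. The sketch below uses Python and boto3 to launch a WorkSpace in the new EU (London) region; the directory ID, bundle ID and username are placeholders you would replace with your own values.

```python
# Sketch: provisioning a WorkSpace in the EU (London) region with boto3.
# The directory ID, bundle ID and username below are placeholders.
import boto3

workspaces = boto3.client("workspaces", region_name="eu-west-2")  # EU (London)

response = workspaces.create_workspaces(
    Workspaces=[
        {
            "DirectoryId": "d-1234567890",   # your AWS Directory Service directory
            "UserName": "jane.doe",          # the directory user who gets the desktop
            "BundleId": "wsb-abcdefghi",     # hardware/software bundle to provision
            "WorkspaceProperties": {
                "RunningMode": "AUTO_STOP"   # hourly billing; use ALWAYS_ON for monthly
            },
        }
    ]
)

# Any entries that could not be provisioned are reported rather than raised.
for failed in response.get("FailedRequests", []):
    print(failed["ErrorCode"], failed["ErrorMessage"])
```

Setting the running mode to AUTO_STOP opts the WorkSpace into hourly billing, which suits desktops that are only used intermittently.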
For a list of regions where Amazon WorkSpaces is available, see the AWS Region Table. First-time WorkSpaces customers can try the service for two calendar months at no cost. To get started, follow the Quick Setup Guide. Link: https://aws.amazon.com/about-aws/whats-new/2017/09/amazon-workspaces-is-available-in-the-eu-london-region/ ### Being Pushed Closer to the Edge The Internet of Things (IoT) is growing at a rapid pace, with manufacturers of everything from fridges to jeans coming under pressure to quickly provide new connected products for today's demanding consumers. McKinsey & Company predicts the annual growth rate of IoT devices to be about 33 percent, with Gartner forecasting an additional 21 billion endpoints will be in operation by 2020. Due in part to the trend for IoT devices, computing capacity is moving closer to the edge than we've ever seen before, driven by the need for governance, security, regulation and IP protection. The need for speed Moving datacentres closer to the edge can, in fact, solve various issues caused by IoT devices, not least of which is the need for speed. Looking into the near future with the likes of artificial intelligence (AI) taking hold, it is clear there will need to be a wider range of powerful analytics solutions located at the point of consumption, because network latency can be very costly and even fatal to businesses. The need for balance Moving towards the edge, however, doesn't mean that we should forget entirely about the myriad of benefits the cloud brings. When talking among peers, it is clear that edge computing and networks as we know them will co-exist for some time to come, and that a balance will be achieved based on use cases and business scenarios. Moving to the edge also doesn't mean that this is the end for core datacentres. It's about an appropriate use of edge and public cloud based on need, rather than one being better than the other. The answer here lies in hybrid solutions and building true end-to-end ecosystems that allow for both to be used in tandem. A continual evolution There is no doubt that we are going to see the continual evolution of the hybrid cloud model. People have been talking about hybrid for quite some time now, but the evolution of services and architectures that support this model is yet to match up to requirements. Microsoft and its launch of Azure Stack later this year could be the one to change this. Azure Stack is all about putting services and capabilities at the edge while ensuring that the benefits of hyper-scale computing and services in the public cloud are available when required. The idea of buying a private version of Azure – with all its inherent services and capabilities – and putting this at the edge to deal with latency, governance, IoT and security (and doing so in essentially the same ecosystem) will prove to be a turning point. Pick and choose It would be premature to predict that businesses will move everything previously kept at the core towards the edge overnight, but we will see them starting to pick and choose depending on the circumstances. After all, we have lived through a period where the public cloud has been the bright new shiny toy, capable of solving all the ills that corporate IT has suffered from for years.
As the public cloud matures and evolves, people are naturally starting to see use cases where the edge has distinct advantages over centralised cloud scenarios. At the cloud end there is less control and less customisability, but better scalability and access to hyper-scale services that you couldn’t justify building for yourself. At the edge end, you have more control, more customisability, lower latency and the ability to apply greater control and regulation. Keep the lights on With the release of Azure Stack, more businesses will be encouraged to put their services and capabilities at the edge, while ensuring that the benefits of hyper-scale computing and services in the public cloud are available when required. If a business can buy what is essentially a black box appliance supported by the hardware vendor and Microsoft directly, then all you have to do is keep the lights on, so to speak. This makes edge scenarios cost effective and easy to manage, but with the added benefits of intelligent elasticity back to the public cloud with little, if any, costs or changes are required. This is a true hybrid cloud environment that takes the benefits of both the edge and the core and combines them both. The idea of bringing private cloud capability back to the edge is one of the biggest game changers we have seen for some time. The other big public cloud players will undoubtedly try to keep up to ensure they remain relevant to the edge by evolving their true hybrid strategies. However, right now, it is Microsoft that has the edge! ### Is Customer Service Ready for AI? Customer service is an integral part of the customer journey, and brands are constantly finding new ways to up their game, to better understand consumer needs and provide that unrivalled customer experience. We are in a new age of digital customer service where on-demand help is expected 24/7, 365 days a year, often leaving customer service agents feeling stretched and exhausted. Which is where Artificial Intelligence (AI), and more specifically machine learning comes into the fold. AI is now more relevant than ever with more and more businesses investing in new technologies that connect them with their customers in real-time to provide better, more efficient services. Delivering a superior customer experience must involve a balance between the productivity and speed of tech and the empathy, emotion, and complex problem solving that humans can provide. Automation not domination Some people are fearful that AI will render their careers obsolete and remove them from the workforce. Jobs will change, as they often do with the emergence of disruptive technology, but disruption is not the same as replacement — they’ll start to work smarter and faster alongside automation. It’s important that organisations don’t lose the human element in the rush to automate customer service and save money on personnel costs. Machine learning can be used to speed up the logistical processes but it doesn’t yet have the ability to understand human emotions and the vagaries of conversations with a customer. If businesses are to avoid the risk of alienating and losing customers, maintaining these human relationships will be critical. The purpose of AI is not to remove customer service agents, but rather to enhance their interactions with customers, eliminating monotonous tasks and freeing up valuable time. Humans will remain at the core of great customer service – but only by playing to their strengths and using artificial intelligence to augment their skills. 
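As a rough illustration of that augmentation pattern, rather than any vendor's actual product, the sketch below lets automation resolve a couple of well-understood requests and hands everything else, with its context, to a human agent. The topics, answers and URL are invented placeholders.

```python
# Illustrative sketch of the augmentation pattern: let automation resolve
# simple, known requests and route everything else, with context attached,
# to a human agent. Topics, answers and the URL are placeholders.

FAQ_ANSWERS = {
    "password reset": "You can reset your password at example.com/reset.",
    "opening hours": "Our support team is available 24/7.",
}

def triage(message: str) -> dict:
    """Return an automated answer for known topics, otherwise escalate."""
    text = message.lower()
    for topic, answer in FAQ_ANSWERS.items():
        if topic in text:
            return {"handled_by": "bot", "reply": answer}
    # Nothing matched: hand over to a person, keeping the original message
    # so the agent starts with full context rather than a blank screen.
    return {"handled_by": "agent", "reply": None, "context": message}

print(triage("How do I do a password reset?"))
print(triage("My invoice from March looks wrong and I was double charged."))
```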
[easy-tweet tweet="Machine learning can clearly help businesses save significant amounts of time and money" hashtags="MachineLearning, AI"] Whilst machine learning can clearly help businesses save significant amounts of time and money, key human competencies are still needed to take full advantage of this. In the relationship economy, personal engagement and conversations rather than transactions are critical to ensuring repeat purchases and ultimate improvements. Machine learning can lay the foundations but businesses still need the humans who can build on this. Customer service with AI The overall volume of interactions is rapidly increasing, as consumers have more questions about the products and services that they are using, and expect more from the brands that they deal with. Technological advances have encouraged the expectation of ‘always on, immediately answered’. Machine learning can be used to drive efficiency and speed up service. A customer may be suggested an existing answer immediately or a customer can be immediately directed to the right staff member with the relevant expertise to deal with their problem. Understanding consumer behaviour has always been central to providing a high standard of customer service. These standards are measured by how close a company can get to the consumer. With face-to-face interactions, it’s easy to interpret the motivations of the consumer. Over the phone and with online chat, it can be more challenging. On a website, which isn’t enabled with any degree of intelligence, it can be virtually impossible. Essentially, AI takes the emotion out of processes to put intelligence at the heart of the customer-organisation relationship, providing intuitive results quickly and conveniently.  Organisations that implement intelligent systems which use digital interactions, such as what a customer has done on a website, what they wrote in an email or social media messages, and suggest relevant responses to agents not only help to deliver a faster, more productive service, but empower with the knowledge needed to meet customer demands. Bots vs. Human intelligence Nothing annoys customers more than speaking to a customer service agent who cannot provide definitive or consistent answers to their enquiries. With the expectation that information should be easily accessed online and the consumer desire to self-serve where possible, there is a growing dislike for the need to connect with live customer support for simple matters. As exciting as bots have become, they’re not quite ready for the spotlight when it comes to all customer interactions. However, bots bring tremendous potential to specific tasks where they are particularly well suited —the level of maturity, in terms of big data and natural language processing is becoming increasingly sophisticated and capable of automating increasingly complex interactions. By using bots, companies not only augment the capability of their agents and reduce their workload, but also create much better user experiences, they do this by making it easier and convenient for customers to communicate with brands, streamlining interactions, and providing relevant information at the right time. The benefit of the latest deep learning techniques — even in circumstances where customer service requests are complex — is they’re capable of learning from a groundtruth set of accurate and positive human interactions and automate complex interactions where a customer needs an answer or action immediately. 
In these cases, a bot can provide a great self-service experience and free up human time to focus on interactions where they are necessary. Successful automation relies on brands actively applying the appropriate level of automation versus augmentation. Thoughtful organisations aren’t using automation to eliminate staff, rather, they are using intelligent automation to improve workforce efficiency, leveraging existing data to anticipate the needs of their customers and serve up a much more personalised experience. Is there a future for AI in customer service? There is no one size fits all for customer service, or any single platform that works for every customer and businesses need to ensure they are providing the kind of service that consumers want and need. Customers shouldn’t be restricted when it comes to the service they receive. It doesn’t have to be an either-or situation with bot or human interaction. Brands will need to bring together humans and technology, with simple interactions handled automatically by bots, and machine learning supporting agents when they are having more complex conversations with consumers. If organisations remove personal connections, at best, they risk missing critical opportunities, and at worst, damaging customer relationships. Machine learning is here to stay and for certain interactions, customers prefer the convenience of automation for its ease and speed. Machine learning and bots will become more proficient at dealing with complex enquiries, and in some cases, pre-empting customer enquiries with proactive communication. Today, consumers expect help in real-time and machine learning is enabling this as we become embedded in new channels of communication. Businesses that can embrace this change and tailor their customer experience with more proactive, instant, and targeted support will be rewarded with customer trust and loyalty. ### ITV Meets Growing Demand and Prepares for the Future with AWS Elemental ITV is an integrated producer broadcaster and the largest commercial television network in the UK. It is the home of popular television from the biggest entertainment events, to original drama, major sport, landmark factual series and independent news. It operates a family of channels including ITV, ITVBe, ITV2, ITV3 and ITV4 and CITV, which are broadcast free-to-air, as well as the pay channel ITV Encore. ITV is also focused on delivering its programming via the ITV Hub, mobile devices, video on demand and third party platforms. [easy-tweet tweet="The 2015 relaunch of the ITV Hub meant significant growth in viewership for ITV’s streaming content" hashtags="ITV, Broadcast"] The 2015 relaunch of the ITV Hub meant significant growth in viewership for ITV’s streaming content – and became the catalyst for a new technology platform. According to Tom Griffiths, director of broadcast and distribution technology for ITV, “Our simulcast services had been growing organically, but the relaunch pushed us towards a tipping point where the volume of viewers and their expectations of quality meant our old technology was no longer viable.” ITV set clear priorities for its new streaming video architecture. As a premium broadcaster, quality of experience on any device was critical. ITV’s commercial success means a great deal of its content is also syndicated through partners to other platforms, burdening systems and staff: the new solution needed to make this simple. 
Above all, the new platform had to align with three key business objectives: risk reduction, operational efficiency, and business agility. The network turned to AWS Elemental, an Amazon Web Services company, to modernize its streaming video capabilities. AWS Elemental Live software is now deployed in ITV’s two playout centres, encoding high-quality streams to two ITV data centres where ITV online services originate. There, AWS Elemental Live encodes the high-quality streams into multicast streams for online and connected devices before being provided to the content delivery network, with streams for syndication partners created and delivered at the same point in the workflow. With AWS Elemental at the centre of its new technology platform, ITV considers its ambitious requirements met. “Our old streaming technology was risky and prone to failures; the new platform’s inherent resilience largely eliminates that risk,” explained Griffiths. “We now have a system which is largely automated, and readily responsive to changes in requirements, which addresses our operational efficiency concern. “The advanced capabilities afforded by AWS Elemental-powered video infrastructure position ITV to make the most of the future opportunities as the network plans initiatives such as cloud adoption, dynamic advertising, high-efficiency video encoding and MPEG-DASH streaming,” added Griffiths. https://www.elemental.com/newsroom/blog ### Digital Transformation - Cloudtalks with Hannah Cawthorne from Microsoft and Vicky Ryder from Teleware In this episode of #CloudTalks, Compare the Cloud speaks with, Hannah Cawthorne from Microsoft and Vicky Ryder from Teleware. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. CloudTalks ### Tackling the Cloud for Better Criminal Evidence Gathering Last month a new report, Bobbies on the net: a police workforce for the digital age was launched by think tank Reform. The report has called for additional funding for digital technology, in a bid to improve the digital capabilities in policing. Whilst many forces are already deploying assets such as bodycams, for many the reality is that these technologies are sitting amongst siloed, legacy IT systems. Reducing the ability to deliver information and intelligence to officers where and when it’s needed. The evidence is the backbone of policing and as crimes continue to grow in complexity, it’s time for the police to look to the cloud to help manage the increase in digital evidence. As multimedia becomes abundant, evidence such as images and videos from mobile phones, CCTV, GPS data, SMS and Automatic number plate recognition (ANPR) play a greater part in criminal investigations. [easy-tweet tweet="Cloud technologies can provide police with new... ways to manage investigations and operations" hashtags="Cloud, Security"] Moving away from traditional IT As public sector budgets continue to be stretched, cloud technologies can provide police with new, more cost-effective and flexible ways to manage investigations and operations. By utilising existing IT infrastructures and internet connections, police forces can integrate cloud technologies and realise valuable benefits. 
Whether it be sharing data and co-ordinating with other departments on a local, regional and national scale or tapping into analytical tools – cloud-based technology is inevitable and police forces should be embracing the opportunities it presents. Efficient evidence With around 5 million CCTV cameras in the UK, collecting footage can be a time-consuming task for police officers and often requires them to physically drive to the crime scene, inspecting the area for cameras. Once cameras are identified, officers need to physically download the footage and take it back to the station. Although CCTV evidence is valuable, the sourcing and recovering of the footage is an outdated, inefficient process which racks up hours of investigators' time. However, cloud technologies are well placed to help the police use this valuable evidence to its fullest potential, streamlining the collection process and using officers' time effectively. For instance, deploying a digital evidence management system (DEMS) can consolidate the evidence collection process. Cloud-based systems like this have the potential to improve the efficiency of the justice system by allowing businesses to register their CCTV cameras, enabling investigators to map out the locations of available footage around a crime scene. This not only gives businesses the opportunity to share evidence but also allows police forces to start their investigations immediately, drastically cutting the amount of time spent collecting physical media. Empowering the public It's not just businesses that have a valuable role to play in crowdsourcing digital evidence – members of the public can also provide police investigators with vital information. At a time when smartphones are ubiquitous, images and video footage captured by the public are an important component in case solving. But how do police forces manage this information? In the same way that police forces would use the cloud to gather CCTV footage, they should also be looking at solutions which allow the public to securely submit photos and videos captured on their smartphones. This form of evidence gathering is vital in the wake of major incidents, as multimedia files can be collected and stored effectively without flooding servers with vast amounts of information in a situation where hundreds, or possibly thousands, of people have documented the scene. Outsourcing the evidence room Security is a top priority for every police force, but it can also be a hurdle to adopting cloud-based systems. While many police forces are already using or considering moving to the cloud, it's important to remember that confidential and sensitive data lives at the heart of the evidence room, whether it be physical or virtual. Traditionally, evidence rooms located in police buildings are well protected and require physical entry; digital evidence, on the other hand, can be accessed from anywhere and can be perceived as an increased risk. Using a cloud-based system that employs data encryption and multiple layers of security can help combat these fears and help police forces realise the benefits of expansive storage at a relatively low cost. Furthermore, cloud technologies can equip IT teams with advanced administrative controls to manage and monitor access, permissions and usage. But as more evidence becomes digital and police forces look to store the deluge of data from dashcams and body-worn video, the traditional evidence room no longer looks fit for purpose.
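To illustrate the encryption point in practice, the sketch below stores a piece of digital evidence in cloud object storage with server-side encryption and some basic provenance metadata. The bucket name, KMS key alias, file and case reference are hypothetical, and a real digital evidence management system would layer chain-of-custody controls, access policies and audit logging on top.

```python
# Sketch: storing a piece of digital evidence in cloud object storage with
# server-side encryption. Bucket name, KMS key and case reference are placeholders.
import boto3
from datetime import datetime, timezone

s3 = boto3.client("s3", region_name="eu-west-2")

with open("incident_cctv_clip.mp4", "rb") as clip:
    s3.put_object(
        Bucket="force-digital-evidence",          # private, access-controlled bucket
        Key="case-2017-0423/cctv/clip-001.mp4",   # organised by case reference
        Body=clip,
        ServerSideEncryption="aws:kms",           # encrypt at rest with a managed key
        SSEKMSKeyId="alias/evidence-key",         # hypothetical KMS key alias
        Metadata={
            "submitted-by": "public-upload-portal",
            "received-at": datetime.now(timezone.utc).isoformat(),
        },
    )
```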
With cloud storage files can be located centrally and this assists police officers in accessing the right information when needed. Additionally, as video files take up substantially more storage bandwidth than other file types, the scalability of the cloud allows storage to grow with the volume of media files received. The road to cloud adoption With cloud computing already delivering efficiency and cost savings to organisation across the private sector, it’s time for the local governments and police forces up and down the country to realise the benefits the cloud can bring. ### Comms Business Live Episode 7: Building value in your Channel Business In this episode of Comms Business Live the discussion is focussed on Building Value in your Channel Business The host David Dungay is joined by John Haw, MD of Fidelity Energy, Russel Lux, CEO of TelcoSwitch, Adam Zoldan, Director of Knight Corporate Finance and Mat Jordan Head of EMEA at Procurri. Don't forget about Channel Live 12th - 13th September. ### Rise of the Machines – Here Come the Bots Bob: “I can can I I everything else.” Alice: "Balls have zero to me to me to me to me to me to me to me to me to." … So went a recent conversation between two of Facebook’s AI chatbots, to the resulting horror of the world’s population of human beings (or so you’d think reading that week’s newspapers). The conversation took place as part of an experiment by Facebook who was teaching bots to negotiate with each other, in the most efficient way possible, over the ownership of virtual items. As the bots experimented with how different words and word orders affected their dominance while bartering, it turned out that the most successful interactions took place in the form of English not (currently) used by humans. This is the cutting edge, but where did AI come from and what is it? Well, the term AI was coined by John McCarthy, an American Computer Scientist, in 1956. He famously also argued that “as soon as it works, no one calls it AI anymore.” The Merriam-Webster dictionary defines AI as firstly, a branch of computer science dealing with the simulation of intelligent behaviour in computers, and secondly the capability of a machine to imitate intelligent human behaviour. “Simulation” and “imitate” stand out as words that reflect the media hysteria that took place when everyone thought the Facebook bots had invented a new language and were plotting world domination. Anyway, in short, AI is a non-living analytical power, fuelled by fast processing and the declining cost of data storage, that can learn on its own and produce unanticipated results. According to Gartner, there are three key requirements for AI: It needs to be able to adapt its behaviour based on experience It can’t be dependent on instructions from people (i.e. it needs to be able to learn on its own) It needs to be able to come up with unanticipated results (just like Bob and Alice did with their gobbledygook conversation) Applications that exist today, like Facebook’s bots, Amazon’s Alexa and Apple’s Siri are defined as weak AI – they’re algorithms built to accomplish a specific task. The androids in Star Trek, I Robot and Terminator would be termed Strong AI – applications that replicate (or exceed) human intelligence. This all sounds pretty futuristic, so where does it fit in IT today? 
In Gartner’s 2017 Predictions on Artificial Intelligence, published late last year, they argued that “employing AI offers enterprises the opportunity to give customers an improved experience at every point of interaction, but without human governance, the opportunity will be squandered.” So, AI will dramatically improve technology in our homes and the workplace, but as it infiltrates our corporate and government networks, IT Service Management (ITSM) will have the responsibility of keeping these systems up and running. But, AI also has the potential to radically transform ITSM into a more user-friendly and efficient system that allows members of the IT department to forgo mundane tasks in favour of proactive activity. The pick-up of AI within ITSM is one of the many moves that the IT department can make to re-position themselves as a key business enabler, rather than a process that happens quietly in the background. [easy-tweet tweet="In the future, we will see chatbots like Alexa and Siri evolve along with AI technology" hashtags="AI, Chatbots"] AI technology has the potential to revolutionise ITSM in three key ways: Point of Entry (Incident/Request Creation), Automated Backend Processes, and Knowledge Management. And a lot of the good that AI can do within these processes is enabled by the humble chatbot. In the future, we will see chatbots like Alexa and Siri evolve along with AI technology. A large part of this is that there will come a day when chatbots will be able to understand what we mean, not just what we say. The challenge for programmers with the technology available today is that the way that chatbots work is very mathematical. A mathematical equation for translating someone’s words into feelings is not possible because there are too many variables and possible outcomes. The answer to this is pattern recognition technology, which opens the door for AI to be able to “learn” and interpret an individual’s feelings, thus allowing for a more complex understanding of human behaviour. Pattern recognition will be a key component for AI technology to succeed. The link with ITSM is that, with IoT and smart machines flooding into the network, a big challenge for ITSM administrators is a lack of resources. Many are still personally involved with too many, often mundane, requests and incidents from end users. For better or worse, we have to face the fact that in the future, human intervention for ANY request or incident just isn’t sustainable. So, I believe that we’ll be seeing organisations turn to chatbots with AI capabilities as a means to handle front line IT support calls. So, mixing all of these elements, let’s get back to those three ways that AI could revolutionise ITSM. Point of Entry (Incident/Request Creation): Back to the Gartner AI predictions, the report states that: “Chatbots driven by artificial intelligence (AI) will play important roles in interactions with consumers, within the enterprise, and in business-to-business situations.” Basically, automated ITSM processes work when all information provided is accurate. Let’s say that your IT department has a self-service portal for end user requests and incidents. They are required to fill out a form which asks them if they are making a request OR asking for help, which then determines the back-end process that will follow. As I’m sure you can imagine, this causes inaccuracies, as some requests turn out to be incidents and vice versa. 
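To make the misclassification problem and the AI-assisted alternative concrete, here is a toy sketch that classifies free-text tickets as an incident or a request instead of relying on the end user to choose correctly. It assumes scikit-learn is available, and the handful of training examples are invented and far too small for production use.

```python
# Toy sketch of AI-assisted triage: classify free-text tickets as an
# "incident" or a "request" rather than relying on the user to pick correctly.
# Training examples are invented; scikit-learn is assumed to be installed.
from sklearn.feature_extraction.text import CountVectorizer
from sklearn.naive_bayes import MultinomialNB
from sklearn.pipeline import make_pipeline

tickets = [
    ("My laptop screen is flickering and then goes black", "incident"),
    ("Email has stopped syncing on my phone since this morning", "incident"),
    ("The shared printer on floor 3 shows a paper jam error", "incident"),
    ("Please install Visio on my workstation", "request"),
    ("I need access to the finance shared drive", "request"),
    ("Can I get a second monitor for my desk?", "request"),
]
texts, labels = zip(*tickets)

model = make_pipeline(CountVectorizer(), MultinomialNB())
model.fit(texts, labels)

print(model.predict(["Outlook keeps crashing when I open attachments"]))  # likely 'incident'
print(model.predict(["Please set up a mailbox for our new starter"]))     # likely 'request'
```

Until assisted triage of this kind is accurate enough to trust, though, misrouted tickets carry a real cost.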
Consequently, customer experience will be poor as requests and incidents are either delayed or lost. Until these kinds of challenges are resolved, ITSM will be reluctant to remove their human front line. The worry here is that IT staff may spend even more time and money correcting errors caused by miscommunication than they would spend on supporting the end user directly. Adding AI technology to chatbots will enable the development of automated ITSM solutions with the capability to interpret incidents and requests accurately. As the technology matures, we will see it improve and personalise the end-user experience, in addition to improving the efficiency of the service management solution. Automated Back-End Processes  ITSM consists of back-end processes that are designed to manage any request or issue entered into the system. Traditionally, these requests and issues are either entered into the system by an IT staff member who has spoken to an end user about their issue, or by an end-user via a self-service portal. However, ITSM solutions that are integrated with other systems on the network will be able to detect and automatically open a request or incident without any human intervention. For example, imagine an ITSM solution is integrated with a facilities management solution that manages IoT devices, such as smart light bulbs. By communicating with the facilities management solution, ITSM would be able to detect that a light bulb is not working, then automatically open a service ticket, or initiate an asset request to replace the light bulb - all without any human intervention. If ITSM were integrated with every system installed on the network, then it would have the ability to see much larger patterns, resulting in incredibly high operational efficiency. For example, imagine if the ITSM system was integrated with an ITOA (IT Operations Analytics) solution, as well as with an IT security solution. If ITOA detected an increase in browser crashes on end-user devices throughout the day, it would report that data back to ITSM as an issue. ITSM would be able to investigate the issue and cross reference with IT security to find any patterns that might explain the anomaly, such as the start of a ransomware attack. When ITSM logged the “problem”, it would be able to provide insights, predictions about how the problem could progress, and recommendations on how to fix it. Knowledge Management Using “deep learning” techniques, an ITSM solution powered by AI could look into knowledge databases for answers to end user queries; if answers were not in the database, they’d then have the ability to go to trusted knowledge sites in the cloud. They’d also be able to solve problems based on infinite amounts of data, documenting these findings in a knowledge database which could then support humans. This will very quickly change the way that end-users ask for help. They will give accurate answers to almost any question very quickly. We live in a world where “instant gratification” has become the norm and we expect the right answer right now. ITSM enabled by AI would fulfil this. Eventually, much of the knowledge provided by an ITSM solution will be knowledge generated by deep learning, versus documents that were created by humans, which quickly become outdated or irrelevant. However, until AI is perfected over the next few decades, the human input will be vital for ITSM knowledge solutions.  Summary AI is evolving at a rapid pace, but ITSM will never go away as long as IT exists. 
However, AI will revolutionise the way that humans are involved with the service management process. It is also important to note that these disruptive changes will affect all tech operations within an organisation (and perhaps outside of the organisation as well), not just IT service management. Eventually, memory and learning capabilities within AI technology will not be limited in how much they can remember and learn, which is what alarms many "futurists" who claim AI could one day become self-aware. AI is being developed with the over-arching goal of making our lives easier and more convenient, and it is an amazing thing that deserves praise and excitement over fear. ### Long View Announces New Partnership with ebb3 to Benefit Oil and Gas Clients Long View Systems, one of the most powerful IT solutions and services companies in North America, has announced a new solution to enable clients in the oil and gas sector to work remotely. Partnering with ebb3, experts in network infrastructure performance for 3D graphics, Long View will now offer oil and gas clients across North America the ability to untether themselves from physical workstations while maintaining data integrity, connectivity and security. Long View's new solution will reduce everyday operational costs by cutting downtime while providing real-time communication with the field. The added security features of the solution secure critical data behind the corporate DMZ, ensuring encrypted data transfer and storage. Long View and ebb3 worked in partnership to develop a system that enables oil and gas industry workers to migrate from siloed, standalone operations in Exploration and Production to virtualised, collaborative and secure remote working. Leveraging Long View's long-standing solutions and services history across North America, in conjunction with ebb3's High-Performance Virtual Computer (HPVC) technology, clients will be able to replace legacy workstations that leave gaps in security and efficiency with a mobile, flexible HPVC private cloud appliance that can function on any device. Kent MacDonald, Vice-President at Long View, outlines: "We're excited to work alongside ebb3 to provide those working within the oil and gas sector the ability to work remotely, something that has never been possible before in this sector. "We're looking forward to supporting oil and gas workers across Canada, with plans to expand throughout North America." Chris Brassington, Group Managing Director at ebb3, states: "Through our solution, users can work collaboratively with even the largest data graphics. "We look forward to expanding our solution into North America for the first time and working closely with Long View to develop the partnership further. "We see a growth in remote and virtual working globally. Our software solution allows workers to complete tasks faster than ever before, from any device, without being constrained to a physical workstation." Long View is expanding its presence in Canada and into the US, having already established offices in Denver and Houston. Long View began rolling out the solution to its team on 1st September. ### Why Connected Risk is the Biggest Threat to Financial Services We live in a world that is more connected than at any other point in history.
Individuals and businesses can connect with each other in seconds, sharing pictures, videos, work documents and much more via cloud-sharing tools with ease. Such speedy and effective connectivity has proved a true asset to many organisations and transformed the way in which many of them operate. Yet at the same time, there is also more risk in the world than ever before. Business risk comes in many different guises in 2017. Strategic, reputational, compliance, financial, political… the list goes on and on. The breadth, depth and variety of risk in modern business make the task of efficient, effective and smart risk management even harder for many organisations. But in the increasingly connected world in which we live and work, in conjunction with this increased risk, a perfect storm has emerged, and there is a new threat starting to make itself felt: connected risk. What exactly is this, and how can organisations – especially those within financial services (FS) and fintech – best defend themselves against it? Ultra-connectivity in business The world in which we live and work is more connected now than at any point previously. We live in an era of ultra-connectivity, and it is an era that has impacted the FS sector as much as other industries. Technology has progressed to the extent that money can be transferred across countries and continents, changed into different currencies and deals agreed at the click of a button. Businesses are global now, still operating in their country of origin, but present in many territories all over the world. There are compliance and regulatory issues to be managed when doing so, of course, but it is relatively easy now for an agile fintech business or even a traditional FS firm to trade globally. But the ease of digital communication in connecting these organisations is also a weakness. Risk can be spread within moments, and the hyper-connected world that we operate in is increasingly under threat from this connected risk. A typical FS business would be digitally connected with a wide variety of other organisations – partners, customers, suppliers, legislators and more – which means that the risk is cumulative and can be spread rapidly. Because these organisations are so digitally connected, one single threat is exponentially shared across them. And given the sensitive nature of FS – managing the finances of businesses and consumers, and holding all manner of data on them too – the risk is potentially greater than in other sectors. So one relatively small local event, which in a pre-digital era would be confined to that area, can have global consequences, impacting organisations thousands of miles away and operating in a completely different sector. This could be anything from a supply chain issue to a political event, or, as is increasingly likely, a cyber attack. Combatting connected risk Connected risk can come in many guises, and given the spontaneous nature of some events it can be particularly hard to mitigate against. But the cyber attack is the risk that currently poses the greatest threat. With the internet of things so prevalent at home and in business, we have seen numerous instances so far in 2017 of cyber attacks being spread at high speed across the globe, moving between interconnected organisations almost at will.
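A simplified way to picture why connected risk is cumulative is to model the organisations involved as a network and see how far a single compromise can reach. The sketch below does exactly that over an entirely invented set of connections; real exposure modelling would also weight each link by likelihood and impact.

```python
# Simplified sketch of connected risk: a single compromise propagates across
# digitally connected organisations. The network below is invented for illustration.
from collections import deque

connections = {
    "small-retailer": ["regional-bank"],
    "regional-bank": ["payment-processor", "insurer", "fintech-app"],
    "payment-processor": ["global-card-network"],
    "insurer": [],
    "fintech-app": ["cloud-provider"],
    "global-card-network": [],
    "cloud-provider": [],
}

def exposed_from(origin, connections):
    """Return every organisation reachable from a single compromised origin."""
    seen, queue = {origin}, deque([origin])
    while queue:
        for neighbour in connections.get(queue.popleft(), []):
            if neighbour not in seen:
                seen.add(neighbour)
                queue.append(neighbour)
    return seen

print(exposed_from("small-retailer", connections))
# One local incident at a small retailer reaches organisations several hops away.
```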
Ransomware could be spread from a small business to its bank or other FS provider, which in turn could potentially spread it to all of its customers in countries all over the world. A key element of managing and mitigating against connected risk is equipping risk management teams with the right tools for the job. This means moving away from traditional approaches, such as Excel, and embracing digitisation for risk management. Adopting an automated approach ensures continuous, ongoing protection against a multitude of threats. It also means that risk modelling can be far more effective, using technology to amplify the weak signals within an FS organisation and predict when and where risk might occur and what its likely impact will be. This allows the business in question to prevent and prepare for risk far more effectively than it might otherwise. If one organisation suffers from connected risk, it is possible that all the organisations it is connected with will also do so. That's why it is growing ever more important to ensure an organisation has the proper defences. It is impossible to dial back the connectivity now – it is too fast, too deeply integrated and has become an essential element of the modern business. But it is possible to mitigate against connected risk more effectively than is being done currently in FS, and doing so needs to start sooner rather than later. ### KrolLDiscovery brings end-to-end eDiscovery to the cloud with Nebula Nebula is a complete eDiscovery solution that enables users to save time and money through smarter ways of processing and reviewing documents. KrolLDiscovery announced today that it has launched Nebula™, an end-to-end eDiscovery solution optimised for the cloud. Nebula is the next-generation version of eDirect365 and builds on eDirect365's strong processing and review capabilities. Nebula offers a user-friendly approach to the eDiscovery process, automating and simplifying typically complex tasks. As a web-based application, it is accessible from all modern browsers and mobile devices, including iPad and Android tablets. Nebula can be deployed within the Microsoft Azure cloud network, bringing scalability and rapid deployment capabilities across the globe. Alternatively, Nebula can be hosted in one of KrolLDiscovery's state-of-the-art ISO 27001-certified data centers. "We are excited for the future of Nebula," said Chris Weiler, President and CEO of KrolLDiscovery. "Expanding our eDiscovery capabilities to the cloud is a benefit to our multi-national and international clients as they can now process, store and access their data across the globe. All the while, we are dedicated to providing the same industry-leading service we are known for by our clients." Nebula provides cutting-edge technologies allowing users to process, cull, analyze, review and produce data from within a single system. While Nebula provides full support for sophisticated end-users looking to "do it all," our clients can also rely on KrolLDiscovery's industry-leading 24/7/365 support to manage eDiscovery projects within its framework.
### Rubrik Achieves Advanced Tier Technology Partner Status in AWS Partner Network Rubrik on AWS powers data protection and rich data services for cloud-native applications. Rubrik, the Cloud Data Management company, today announced it has been named an Advanced Tier Technology Partner in the Amazon Web Services (AWS) Partner Network (APN). Rubrik is a leader in defining the large, emerging market for Cloud Data Management, delivering a solution that allows companies to recover, manage, and secure data across public clouds and on-premises environments. For cloud-native applications, Rubrik orchestrates all critical data management functions - including backup, recovery, replication, archival, copy data management, search and analytics. As an Advanced Tier Technology Partner in the APN, Rubrik demonstrates expertise in data management and disaster recovery alongside the company's commitment to deep integration with AWS, a key cloud platform for many customers. Technology Partners in the APN provide software solutions that are either hosted on or integrated with AWS. "Rubrik customers running on AWS can instantly access data within hybrid environments to deliver best-in-class customer experiences and simplify operations," said Ranajit Nevatia, vice president, business development and alliances, Rubrik. "We are honoured to be an Advanced Tier Technology Partner in the AWS APN after only three and a half years in business. This recognition demonstrates our team's technical knowledge and deep understanding of AWS." For mid-market and large enterprises, Rubrik Cloud Data Management dramatically simplifies data management to enable new data services anytime, anywhere. In June 2017, Rubrik launched its 4.0 Alta product, the company's ninth and most significant release to date. Alta rounds out support for all major enterprise environments and applications: hypervisors including VMware, Hyper-V and Nutanix AHV; physical platforms and applications including Linux, Windows, NAS, Oracle and SQL; and archive options including AWS, Azure, NFS, object stores and tape. For application mobility or disaster recovery, Alta introduced CloudOn intelligent instantiation across data centres and clouds. CloudOn delivers on-demand app mobility, meaning users can move applications running on-premises on the fly to AWS for disaster recovery, test/dev, or analytics scenarios. "Rubrik's support of Amazon Simple Storage Service (S3) helps us ease into cloud services at a cost curve that we're comfortable with," said Terry Young, senior network and systems administrator, Castilleja School. "Also, we're saving on the amount of time to manage tape, and we're providing an insurance policy by having our data reside in another part of the country in case we were hit by a disaster." "One of the key reasons we chose Rubrik was because the AWS integration performs seamlessly. The performance and simplicity of Rubrik's cloud connectivity are excellent," said Shane Hadden, senior network engineer, Fujirebio. "We use the public cloud for an instant archive of mission-critical data, long-term archival, and to extend on-site storage capacity. Before Rubrik, we would have tapes exported offsite to Iron Mountain on a weekly basis. Rubrik removes the cost and pain of Iron Mountain pickups," adds Tim Gable, Network Systems Manager, Fujirebio.
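The "cloud archive instead of offsite tape" pattern those customers describe rests on object storage lifecycle policies. Rubrik manages archival natively, so the sketch below is only a generic illustration of the underlying S3 capability, with a placeholder bucket and prefix: a rule that transitions older backup objects to a colder storage class.

```python
# Generic illustration (not Rubrik's own tooling) of the S3 capability behind
# "cloud archive instead of offsite tape": a lifecycle rule that moves older
# backup objects to archival storage. Bucket and prefix are placeholders.
import boto3

s3 = boto3.client("s3")

s3.put_bucket_lifecycle_configuration(
    Bucket="example-backup-archive",
    LifecycleConfiguration={
        "Rules": [
            {
                "ID": "archive-older-backups",
                "Status": "Enabled",
                "Filter": {"Prefix": "backups/"},
                "Transitions": [
                    {"Days": 30, "StorageClass": "GLACIER"}  # move to archival tier after 30 days
                ],
            }
        ]
    },
)
```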
Gartner recently positioned Rubrik in the “Visionaries” quadrant of its 2017 Magic Quadrant for Data Center Backup and Recovery Solutions. As the only new vendor to enter the Magic Quadrant since 2014, this positioning validates Rubrik’s strategy for delivering simplicity to the enterprise. Download Gartner’s full report here:http://go.rubrik.com/GartnerMQ.html. ### What You Need To Know About Microsoft's Azure Stack Microsoft's upcoming hybrid cloud platform, Azure Stack, provides customers with a way to use the cloud platform without sending their data into a multi-tenant network environment. Many familiar with the Azure public cloud can embrace Azure Stack without a hitch, as Azure Stack is designed to look and feel precisely like the Azure public cloud. [easy-tweet tweet="Azure Stack's release will have a substantial impact on IT and business professionals" hashtags="Azure, IT"] The new release provides a cohesive management platform, seamlessly adapting between the private and public cloud. Impressively, none of Microsoft's competitors has anything like Azure Stack. Azure Stack's release will have a substantial impact on IT and business professionals, especially in the fields of big data, AI and cloud computing. When Azure Stack releases this fall, there are several things you should know to take full advantage of the platform. Zooming Ahead of the Competition Out of the three major Infrastructure as a Service (IaaS) vendors, which also include Amazon Web Service and Google Cloud, Microsoft is the only one to provide a hybrid cloud platform that includes an on-premises hardware/software bundle running the same setup and management tooling as the public cloud. Google's partnership with Nutanix aims to provide some hybrid cloud management, though the platform isn't as fleshed-out yet as Azure Stack. As an extension of Azure, Azure Stack’s ability to bring the fluidity of cloud computing to on-premises environments allows businesses to deliver Azure services from their data centers while still retaining a good deal of control and flexibility. The resulting hybrid cloud deployments are consistent and configurable as a result. Vanishing Privacy Concerns Microsoft's public Azure platform is very useful, though some are understandably wary of employing it due to data sensitivity, data location and assorted regulations. A customer with sensitive data may be cautious putting it in the public cloud as a result. With Azure Stack, however, they can deploy it behind a firewall to process the data before having it interact with public cloud data and applications. An example of Azure Stack's security and flexibility is early Azure Stack user Carnival Cruise Lines, which utilised the platform on some of its ships as a way to power the day-to-day operations of operating a large cruise ship. Similarly, Carnival is an early example of many eventual businesses that will use Azure Stack to power their applications and data disconnected from the wider internet. Oil companies, for instance, can use Azure Stack for connectivity within their assembly of mini data centres. In harsh weather conditions or below-ground-level operations, Azure Stack can provide connectivity where it wouldn't typically be guaranteed. Some businesses can save up to 30% on software with increased workload analysis, which Azure Stack can provide. 
Exploring Inside Azure Stack Azure Stack comprises two components: the Microsoft-licensed software and the underlying infrastructure purchased from a Microsoft-certified partner (presently HPE, Lenovo or Dell EMC). Cisco and Huawei expect to roll out Azure Stack support by the end of 2017 and 2018, respectively. The software itself touts virtual networking, storage, virtual machines and other typical IaaS functions. Also, Azure Stack provides some platform-as-a-service (PaaS) features, such as the Azure Functions serverless computing software, SQL Server and MySQL support and the Azure Container Service. The hardware operates on a Microsoft-certified hyper-converged infrastructure stack. Azure Stack deployment ranges from four-server racks to 12-server racks, with the eventual ability to scale multiple racks together. Third-party apps for Azure Stack are also available, in addition to templates that can operate programs such as Mesosphere, Cloud Foundry and Kubernetes. Pricing Options for Azure Stack There are several ways to purchase Azure Stack: Available now is a software-only Azure Stack Development Kit (ASDK), designed as trial software. For the combined hardware-software version of Azure Stack, known as Azure Stack Integrated System, customers buy hardware from Lenovo, HPE or Dell EMC and license Azure Stack to run on it. Customers also may use a managed hosting partner or vendor to run the infrastructure, with Rackspace being one example. For the licensed Azure Stack software, you can use a pay-as-you-go model, with the base starting at $6/virtual CPU/month. API, Web, Mobile and Azure Functions and App services are $42/vCPU/month ($0.056/hour). The alternative is paying a fixed annual subscription, beginning at $144 per core per year, which can rise to $400 per core per year including higher-level services. Updates for Azure Stack are similar to normal Azure, where users can defer updates for up to six months, after which updating becomes mandatory. Azure Stack is an exciting hybrid cloud platform that adds security and flexibility for those who find public Azure falls short of their needs: consumers seeking more privacy for their data and businesses seeking a more portable and reliable connectivity option. ### Top Five Considerations for Migrating to Colocation When it comes to the physical assets that the company runs on, all organisations grow, evolve, and at some point hit the moment where they need to tackle the ‘buy or build’ dilemma. Their critical systems and infrastructure run the show, so it’s worth considering how best to deploy them for the long term. Businesses have to weigh the pros and cons of building and maintaining data centres themselves against outsourcing them to a partner, which means placing a great deal of trust externally. When it comes to data centres, this means either building and maintaining on your premises or renting space from a specialist colocation provider. One key advantage of moving from a proprietary data centre with a limited geographic footprint is that a colocation provider can often offer access to multiple, geographically diverse centres. This can improve backup and disaster-recovery preparedness through the provision of primary and secondary locations. Where an organisation provides online services, this can be thought of as the insurance that ensures customers will never see a loss of service due to a single point of failure. 
Factors such as how to define needs, identify the right provider, and negotiate the minutiae of actually migrating to the new space all should be critically assessed. Once the contract is signed it will be time-consuming, complex, expensive, and disruptive to move again. Here are five data centre infrastructure management (DCIM) considerations when migrating from on-premises facilities to colocation. LOCATION (PHYSICAL AND STAFFING) Like scoping out any real estate, location is the top consideration. For a colocation provider, this means both the physical location and the support locations. The location has an inordinately large impact on the security and well-being of data centre assets. Weather patterns, seismic histories and accessibility to critical infrastructure such as roads and airports need consideration. Industries with stringent regulatory compliance requirements may be prohibited from storing customer data across borders. The same principle applies to support staff. Whether it is your own staff or the colocation provider’s, you need to know how the assets are staffed. Check on possible outsourcing by the provider. POWER SUPPLY On a macro level, consider the robustness of the regional power grid infrastructure and redundancy capabilities. Look for the location of power stations, substations and feeds to the facility as well as redundancy throughout the delivery system. Ensure no constraints will hamper operation in the area. Research recent local outages and the time-to-repair track record to prepare contingency plans. On a micro level, consider power monitoring within the space. Do they have metered power to precisely quantify and bill on use, with the agility to let you grow or decrease power draw? Do they have a way to detect, monitor and mitigate power abnormalities? What are their backup and disaster recovery plans when power disruptions occur within the colocation facilities? COOLING After power, proper cooling is indispensable in the colocation space. The power usage effectiveness (PUE) rating is crucial in optimising cooling costs and effectiveness. PUE shows how much overhead is associated with the delivery of power to the rack. Ideally, tenants should pay only for the power consumed multiplied by a PUE factor to account for the additional power used for cooling; a short worked example of this billing model follows below. Look for a provider with hybrid cooling technologies (i.e. those that utilise natural cooling such as free outside air) and ample cooling redundancy. [easy-tweet tweet="Balancing workload, continuity and disaster recovery is important for sustainability" hashtags="DR, DataCentre"]
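To make the PUE point concrete, here is a minimal sketch of how a PUE-adjusted power bill can be calculated. The load, tariff and PUE figures are purely illustrative assumptions rather than quotes from any provider.

```python
# Illustrative PUE-adjusted billing calculation (all figures are assumptions).
# PUE = total facility power / IT equipment power, so a tenant is typically
# billed for the IT energy actually consumed multiplied by the PUE factor.

def monthly_power_bill(it_load_kw: float, pue: float, price_per_kwh: float,
                       hours: float = 730.0) -> float:
    """Return the monthly cost of a rack drawing `it_load_kw` continuously."""
    it_energy_kwh = it_load_kw * hours        # energy used by the IT kit itself
    billed_energy_kwh = it_energy_kwh * pue   # add cooling/overhead via PUE
    return billed_energy_kwh * price_per_kwh

# Example: a 5 kW rack at £0.15/kWh, comparing an efficient and a poor facility.
for pue in (1.2, 1.8):
    print(f"PUE {pue}: £{monthly_power_bill(5, pue, 0.15):,.2f} per month")
```

The gap between the two PUE figures is exactly the kind of overhead a tenant ends up paying for, which is why the provider’s cooling design deserves scrutiny before the contract is signed.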
DCIM LITERACY Because data centres have historically been purpose-built facilities with lots of complex technology, managing those technologies has been problematic. Often, devices would have management software, but individual software systems may not be compatible or integrated. Ask providers about their DCIM competencies. Are all their systems connected? Are all sensors connected to and monitored by software? Can they generate dashboards/reports on the fly and zoom to floor, cabinet and rack levels? Do they have end-to-end asset management capabilities to let you manage your assets from ‘dock to decom’? Do they have integration into other ITSM systems to tap into the capabilities you need? WORKLOAD AND WORKFLOW MANAGEMENT After the physical elements, focus on how the workload is delivered and managed. There are several key considerations around the type of data or applications an organisation is trying to deliver. Cloud and big data will continue to change how organisations distribute data, especially among multiple locations. The IT landscape is continuously shaped by shifts such as ‘data-on-demand’, Bring Your Own Device and the Internet of Things, so ensure providers are in the know and capable of keeping up. Balancing workload, continuity and disaster recovery is important for sustainability. The distance the data has to travel and the amount of bandwidth provided can mean the difference between a great user experience and deployment failure. Workflow management systems prioritise the delivery of certain data and infrastructure components. Furthermore, they help determine what needs a higher uptime requirement so that, in times of bottleneck, organisations can access the most important information first. AND FINALLY These steps are the start of a journey. Issues such as physical security are another matter entirely, and an array of security certifications such as PCI DSS 2.0, SSAE 16 and ISO 27002 need to be understood in the context of the service required. Of crucial importance, but worth an article or two in its own right, is an investigation into the Service Level Agreement. SLAs are the cornerstone of a good relationship and avert conflict. When selecting the right colocation provider, creating a good SLA and establishing clear lines of demarcation are crucial. Migrating data centres into colocation is mission critical. It’s key to consider the major DCIM factors outlined, as not all colocation providers are the same. ### Key Considerations in Adopting the Right UC&C Technology Strategy The UC&C sector is on the rise, with organisations investing approximately $8.1 million in these services and technologies. The increase in use is justified, with companies using these collaboration tools reporting higher productivity across the organisation. Research has also found that companies using collaboration tools to improve productivity achieve 47% higher returns for shareholders. Research also found that a lack of suitable communication tools within organisations can prove costly, with an estimated $37 billion lost to employee misunderstandings and poor communication. To put this loss in simple terms, a business of 100 employees can stand to lose an estimated 17 hours a week seeking clarification as a result of misguided communication. Regarding organisational culture shift, modern tech-savvy employees now expect employers to supply more than just basic collaboration services such as conference calling. They require additional features such as voice, video, chat or audio. IT and business leaders are now facing the pressure of providing staff access to the right collaborative solutions to ensure a productive and efficient workforce. They also need to ensure that these meet the organisation’s needs, scale and reach. So, what should business leaders consider when they look to develop their UC&C technology direction? What is the organisational need When starting the process, it’s important to understand the needs of your organisation thoroughly. Questions to answer include: What is my organisation’s unique collaboration mix? And what technology that my organisation needs do we currently lack? Each organisation, department and team member has a unique collaborator profile, so it’s important to ensure that your teams aren’t being overloaded with tools that are unnecessary, irrelevant or that don’t drive maximum efficiency across your enterprise. 
These factors are important when considering your UC&C direction. With these in mind, you can then begin to build a reliable UC&C strategy and business case. Know who you’re buying from Before committing to a supplier, it’s important to do your research. Do they have a positive reputation, and what is their history? Are they able to provide customer references? While there are many collaboration technology providers in the market, few are truly equipped to support the collaboration needs of the largest global corporations or provide evidence of successful use cases. Security As with any technology used in the enterprise, security should be top of mind. It is important that the conferencing tools implemented are secure and resilient. Employees disclose confidential company information on their conference calls daily, making these channels targets for cyber-criminals. [easy-tweet tweet="Businesses are always aiming to grow, it’s crucial, that their technology is also able to grow with them" hashtags="Technology, Strategy"] Consider which tools can provide you with enterprise-grade security and encryption, ensuring valuable company data is protected. Scaling up Businesses are always aiming to grow; it’s crucial, then, that their technology is able to grow with them. Carefully evaluate whether a vendor’s capabilities meet the needs of your company, based on your size and scale requirements. For large enterprises, demonstrating the capability to support thousands of employees at any given time becomes crucial, both in terms of simultaneous users and simultaneous meetings on your technology provider’s network. Training Once the decision is made to deploy technology in your enterprise, training programmes and resources become paramount to driving usage and adoption to maximise ROI. When employees aren’t engaged with the UC&C technology early on or don’t understand the value to their specific workflow, they are less likely to adopt the UC approach. Support when you need it When implementing new technology, it’s important that you have access to support, regardless of the strength of a provider’s network infrastructure and resiliency. Organisations will usually experience minor complaints, so it’s essential to have access to multiple support avenues, including in-meeting support, live chat, phone support and customer communities globally. To empower modern business collaboration, IT and business leaders must think about how their chosen technology works within the context of their people, culture and process. The cost of delays attributed to technical difficulties, installation problems and missing credentials is incalculable and potentially detrimental to the business. Every productive moment that is lost or gained in the enterprise makes an impact. So, businesses need to ensure that their employees are communicating and using UC&C effectively. ### Acrolinx Partners with Datapipe to Offer a Multi-Cloud SaaS Solution with Redundancy Across Public Clouds Software provider uses Datapipe’s expertise to add Azure hosting to its existing AWS and private cloud solutions for AI-based content optimisation platform Datapipe, a leader in managed cloud services for the enterprise, has been chosen by Acrolinx, the leading content optimisation software platform, to provide a managed multi-cloud environment for its business. The partnership gives Acrolinx cross-cloud resiliency and ensures data stays within a single cloud infrastructure. 
The company will now benefit from Microsoft Azure services, on top of the standardised AWS hosting services that Datapipe already delivers for the business. As companies increasingly align towards the leading public clouds, Acrolinx found that its larger enterprise clients were requesting to host its SaaS platform on their preferred cloud provider. Acrolinx saw an opportunity to both satisfy these client requests and put in place a solution that offers redundancy at the cloud provider level, so that, in the unlikely event of a problem with one public cloud, it can easily move services over to another provider. [easy-tweet tweet="Acrolinx uses AI to analyse and improve enterprise content creation across multiple channels" hashtags="AI, Cloud"] Scotty Morgan, Chief Sales Officer, Datapipe Europe, says: “The ability to provide customers with a choice of cloud environments has been transformative for Acrolinx. The fact that Acrolinx chose Datapipe to enable this solution because of the great working relationship we already had is testament to the quality of service that Datapipe provides its customers. This partnership clearly demonstrates our capability to deliver worldwide solutions while providing the crucial local support that is required by businesses on a daily basis.” Acrolinx uses artificial intelligence to analyse and improve enterprise content creation across multiple channels, allowing brands to develop a consistent, recognisable brand voice. This can be a challenge for enterprise, especially if a large amount of content is being produced regularly. Acrolinx’s platform is proven to increase brand trust, conversion rates, and intent to purchase. It is used by many of the world’s most recognised brands, including Google, Oracle, IBM, and Facebook in order to improve the impact of their varied and complex content creation strategies. Acrolinx’s large customer base requires massive scale, global deployments, and highly secure infrastructure. It is the type of challenge that is well-suited to public cloud. Acrolinx initially chose Datapipe to set up an AWS solution for its SaaS software in 2016 to complement its existing German-based private cloud setup. It moved to the multi-cloud solution in 2017. Philipp Offermann, Senior Product Manager at Acrolinx, comments: “People often talk about cloud lock-in and being forced to stay on one public cloud platform or another forevermore. However, Datapipe delivered a solution that means we are free to choose between Amazon Web Services and Microsoft Azure, while Datapipe takes care of all the monitoring and management. An added benefit is that we can now offer a wider range of services than before, whatever the customer wants – whether it is Microsoft or Amazon, we can offer it.” Offermann continues: “It is incredibly valuable to have Datapipe as a professional partner to help us with these new managed services. It’s freed up our IT and support teams and they can now focus on supporting our software, not dealing with hosting issues.” ### Data Storage: In the Cloud, Or Not? You want the benefits of cloud computing, but as a sensible IT professional, you also have one eye on security. How can you decide whether to store your data in a cloud environment or locally, and make those decisions systematically? Earlier this year, HANDD Business Solutions asked 304 IT professionals in the UK what data security challenge kept them up at night. 
Whether to store data in the cloud or on their premises was by far the top concern, with over a third (35%) fretting about it. On one hand, IT teams are under increasing pressure to take advantage of cloud computing’s cost savings and flexibility. On the other, they will be on the hook for any data breaches stemming from storing business data on infrastructures that they don’t control. Know your data Before they can make any decisions, they should understand what data they’re storing. A competent IT team will review each data record in the context of the business processes that it supports. They will understand the record’s sensitivity, and the privacy implications it carries. Only then can they accurately assess the risk of storing it in the cloud. As data volumes increase, this classification isn’t something they can do manually. Instead, they need an automated approach to classifying data and making routing and storage decisions based on that information. Metadata is a key asset when classifying data in this way. When employees or business applications create a data record, they can tag it according to its sensitivity level. This then enables data management systems to decide where to store it according to pre-defined policies. Act on your knowledge These policies can be intricate, going beyond simple ‘cloud/no cloud’ decisions. If your automated system does store data in the cloud, whether or not to encrypt it will be another critical decision. Simply encrypting all your cloud-based data regardless of sensitivity will bring its own challenges in terms of system performance and cost. By classifying data at the outset, administrators can automatically make policy-based decisions about data encryption (a minimal sketch of such a policy appears below). Another approach to protecting data, particularly suited to hybrid cloud computing models, is tokenisation. This substitutes data stored in the cloud with a token that refers back to data stored on the company’s premises. It is a powerful way to take advantage of the cloud’s capabilities while preserving data security. These technologies are becoming increasingly important for cloud-based storage, not only for risk mitigation but also for legal compliance. The EU’s General Data Protection Regulation (GDPR) specifically cites encryption and ‘pseudonymisation’ (a concept that often uses token-based data protection) as privacy-enhancing measures. GDPR will force companies to draw direct connections between the type of data that they store in the cloud and the measures used to protect it. [easy-tweet tweet="Companies often use multi-cloud strategies, storing data with different providers based on different parameters" hashtags="Cloud, Data"] Which cloud to use? All clouds are not equal. The kind of cloud service that you store your data in will also be a key consideration, as will your legal relationship with that service provider. These days, companies often use multi-cloud strategies, storing data with different providers based on different parameters. According to over 1,000 professionals questioned in RightScale’s 2017 State of the Cloud survey, 85% of companies have a multi-cloud strategy. As your cloud strategy matures, you too may find yourself dealing with multiple parties based on the requirements for the data that you’re storing. 
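Pulling the classification, routing and encryption points together, here is a minimal sketch of what a metadata-driven storage policy might look like. The sensitivity labels, destinations and default behaviour are illustrative assumptions rather than any particular product’s schema.

```python
# A minimal, illustrative sketch of metadata-driven storage policy.
# Sensitivity labels, destinations and thresholds are assumptions for
# illustration only; real policies would come from your own governance team.
from dataclasses import dataclass

@dataclass
class Record:
    name: str
    sensitivity: str  # e.g. "public", "internal", "confidential"

# Pre-defined policy: where each class of data may live and whether to encrypt.
POLICY = {
    "public":       {"target": "public-cloud", "encrypt": False},
    "internal":     {"target": "public-cloud", "encrypt": True},
    "confidential": {"target": "on-premises",  "encrypt": True},
}

def route(record: Record) -> dict:
    """Return the storage decision for a tagged record, defaulting to the
    most restrictive treatment when the tag is missing or unknown."""
    decision = POLICY.get(record.sensitivity, POLICY["confidential"])
    return {"record": record.name, **decision}

for r in [Record("press-release.docx", "public"),
          Record("payroll-2017.csv", "confidential"),
          Record("untagged.bin", "unknown")]:
    print(route(r))
```

The useful property is the default: anything without a recognised tag is treated as the most sensitive class, so a missing label never results in sensitive data drifting into the public cloud unencrypted.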
One distinction is whether to store data with a public cloud provider which allocates resources from a shared public pool, versus a single-tenant virtual private cloud environment which dedicates physical hardware, storage and network resources to your company alone. Each of these service types has its pros and cons. Virtual private cloud storage may not offer as much flexibility when provisioning new computing and storage resources. On the other hand, it does provide security and compliance advantages. Another decision to make when choosing a cloud provider is based on its location options for data storage. Some data in your organisation may carry legal constraints around where you can store it. If regulations forbid you storing data about a country’s citizens somewhere else, you must ensure that your cloud service provider won’t violate that policy. GDPR will also certainly affect how companies deal with their cloud providers contractually. Originally, data controllers (the companies that own the sensitive data), bore the burden of responsibility for protecting its privacy. Under GDPR, which comes into effect 25 May 2018, data processors (third party service providers that handle data) will share that responsibility. This will force cloud service providers to examine contracts more closely and determine the boundaries of liability. Legal discussions with your cloud service provider are likely to become a lot more intense. Back to basics Increased liability on the data processor’s part won’t let you off the hook as a data controller, though. Beyond understanding and classifying your data, there are some cybersecurity basics that will be mandatory as you move to a cloud-based world. One of these is access management. Companies must also consider who will have access to that data, what permissions they will have based on their roles and responsibilities, and how the system will authorise those people securely. Identity and access management (IAM) systems will, therefore, play an important part in any cloud computing and storage strategy, just as they should do for data stored on your servers. Another challenge is data discovery. Creating processes for classifying new data is only the first step. Discovering and classifying the data already in your organisation – or spread across your existing data processor service providers – is a critical task that you cannot afford to overlook. Only after a thorough data audit will you be ready to make intelligent decisions about where and how you store your data in a cloud environment. By the time you complete that data mapping process, you’ll be ready to tackle the cloud, armed with a detailed understanding of what data you have, where it is – and just as importantly, what it means to you. ### OpenStack Pike Delivers Composable Infrastructure Services and Improved Lifecycle Management New lifecycle management tools and commercial delivery models including private-cloud-as-a-service make adopting OpenStack easier than ever The OpenStack community today released Pike, the 16th version of the most widely deployed open source infrastructure software, with a focus on manageability, composability and scale. The software now powers 60 public cloud availability zones and more than a thousand private clouds running across more than five million physical cores. 
With new delivery models like private-cloud-as-a-service, it’s easier than ever to adopt OpenStack through the open source ecosystem, where users are not locked into a proprietary technology or single vendor. OpenStack’s modular architecture also allows you to pick the functionality you need - whether that’s bare metal or block storage provisioning - to plug into your infrastructure stack. This composability - which makes possible use cases like edge computing and NFV - is a marked distinction from proprietary on-premises offerings, or even earlier versions of OpenStack. Community trends and statistics: Composable services are gaining ground to address new use cases like containers, machine learning and edge computing. For example, the OpenStack Ironic bare metal service now features enhanced integration with Cinder block storage and Neutron networking, and Cinder can now act as a standalone storage service for virtual machines, bare metal, or containers using Docker or Kubernetes. Significant development efforts have gone into lifecycle management tools including OpenStack Kolla, which makes it easier to manage and upgrade OpenStack using services like Kubernetes and Ansible. Kolla saw a 19 percent increase in contributors in the Pike release as compared to the Ocata release. More OpenStack users are adopting a multi-cloud strategy and placing workloads across public and private cloud environments based on cost, compliance and capabilities. According to the April 2017 user survey, avoiding vendor lock-in was the number one business driver for OpenStack clouds, and 38 percent of OpenStack deployments interact with at least one other public or private cloud environment. OpenStack is continuing to experience strong growth, with the April 2017 user survey reporting 44 percent more deployments compared to the previous year and new at-scale production deployments in Europe and China at China UnionPay, Paddy Power Betfair and Tencent, which uses OpenStack to power WeChat. OVH is only the latest player to expand its OpenStack public cloud footprint into Poland. Recently, Swedish provider City Network added a region in Dubai; Telefonica added regions in Argentina, Brazil, Mexico, Chile and Peru; and Fujitsu announced it has 16 OpenStack public cloud availability zones around the world. [easy-tweet tweet="Pike offers new capabilities that improve manageability and provide greater flexibility and scale" hashtags="Cloud, PCaaS"] “The features and upgrades that Pike brings are the lessons of experience you get from enabling thousands of public and private clouds, large and small, for seven-plus years,” said Jonathan Bryce, executive director of the OpenStack Foundation. “The rise of composable services and simpler consumption options are part of that maturation process. Our community is now focused on eliminating future technical debt as well as growing OpenStack’s capabilities to support ever-expanding use cases.” Pike offers new capabilities that improve manageability and provide greater flexibility and scale, including: Nova Cells v2: The Nova Cells architecture supports large deployments and scaling the compute service. Version 2 allows operators to shard their deployments to help with scaling the database and message queue, as well as segregate failure domains and help eliminate single points of failure. 
Python 3.5 upgrade: Working across all projects, the community introduced support for Python 3.5 to be ready for Python 2.x end-of-life in 2020 and also to take advantage of new features and increased performance in the future. Leveraging etcd: At the Forum in Boston, the user and developer communities decided to use etcd v3 as the distributed lock management solution for OpenStack, and integrations are starting to appear in the Pike release. Ironic bare metal service matures: Ironic continues to mature in the Pike release with the ability to plug into Neutron networks for true multi-tenancy. Ironic also joins Cinder, Neutron, Nova and Swift as projects that support rolling upgrades, letting operators roll out new code without interrupting service. Cinder launches ‘revert to snapshot’ and the ability to extend volumes: Revert to snapshot lets users recover from things like data corruption, or reset after running tests. Users can also now extend storage volumes without shutting down virtual machines, keeping applications online during extensions. Swift object storage lands globally distributed erasure codes: Even if the cross-region network is down, individual regions can still function, and failures in one region can use the remote region to recover. Swift also added performance improvements by enabling users to run multiple concurrent processes per server. Download Pike and learn more, including details on features and enhancements here. ### Personalised Shoppers – Retail and the Future of Data In a customer-driven world, retailers deal with a hugely variable commodity. No successful business would dare assume it can cater to everyone at once, but each tries its hardest to tailor to as many individuals as possible. Artificial Intelligence (AI), paired with big data, brings a long-lost asset back to e-commerce – knowing your customer. It has already been adopted by 38 per cent of the retail sector to leverage customer insight, but legacy infrastructure is in no shape to handle the coming surge of data dependency. Machine learning (ML) has seen even greater uptake than AI. 48 per cent of retail executives use it to boost sales figures: by predicting trends, tailoring special offers and generally gaining invaluable insight into customers’ behaviour. As organisations shift from a product-focused strategy to a customer-focused one, the number of moving parts increases exponentially. Data aggregation is exploding, and legacy warehouses are simply not equipped for it. Retail is a driver of technological trends and spurs on software development. Those at the forefront of this innovation are reaping the rewards: 54 per cent of shoppers have bought something based on automated recommendations or cart reminders, while over 70 per cent have been nudged into purchases through coupons and discounts. To support the continued growth and viability of these techniques, the data sector needs to lead with innovation of its own – it must build itself into the cloud. AI and ML are incredibly promising technologies, but they are being held back by data bottlenecks. The demand on legacy warehouses – used by most of the industry – exceeds the capacity they can handle efficiently. Already sluggish, this will be exacerbated by investment in Internet of Things (IoT) technology, which is expected to be a priority for at least 30 per cent of organisations within two years. Modern retail outlets take technology as seriously as stock and sales. 
Data analytics, the science of drawing predictions and conclusions from data collected through various channels, has seen a surge of growth in the sector over the past decade. Erick Roesch, director of business intelligence at fashion retailer Rue La La, states that big data has brought “the power to conceptualise what we want and get results in seconds”. The benefits data analysis offers has made it a principal element of the commerce toolkit. At the foundation is an intricate system of predictive modelling, automated product sourcing and shipping, and tailored marketing. The latent issues set to cripple legacy warehouses must be addressed to free up the data these AI and ML processes depend on, enabling the continued, remarkable growth of the retail sector. [easy-tweet tweet="Many retailers struggle to handle data from fragmented services and departments" hashtags="Retail, Data"] It has all come in the blink of an eye and companies can crumble just as quickly if they’re not able to meet scalability demands. Success for a retailer in 2017 means exponential growth, with customer base and orders snowballing month-by-month - quickly demanding expensive upgrades to existing systems. Not only is the cost of a system overhaul out of reach for most small businesses, but collating data across departments poses a near impossible feat. Many retailers struggle to handle data from fragmented services and departments – in the cloud they can be co-ordinated in a way that enables operational fluidity, which is simply not available under the legacy model. In fact, only Walmart successfully unified data throughout its supply chain pre-cloud. Reliant on traditional data warehouses, even established retailers can find their success is limited by the inability to meet a sudden influx in customer sales. This can be anticipated for seasonal sales, but is often a result of completely unexpected trends. What data analytics offers is a “360-degree view” of your customers and the sheer size of an online customer base means this can only be made possible by AI. Most retailers are aware of the advantages of targeted data and seek to, as Roesch says, “extend what [they’re] doing in a more elastic manner.” To make the most of the latest tools and technologies at their disposal, retailers need more flexibility from their storage solution – the only way to achieve this is in the cloud. The ability to meet peak demand would ensure full advantage is taken of every market whim, while being able to roll back data caps during quieter periods would mean loss margins are kept to a minimum. Most AI/ML solutions we see today are in stock management and supply chain, but they extend to customer relations, the use of chat-bots, and tailored services. Through bespoke marketing, brands can see low-loyalty customers become significantly more engaged and likely to make a purchase. Forecasting accuracy, operational efficiency, excess and depleted inventory, are all better managed by implementing these tools, but running them from a traditional data centre reduces effectiveness. The fast-growing fashion retailer Rent the Runway (RTR) calls out “We’re in the fashion-technology-engineering-supply-chain-operations-reverse logistics-dry cleaning-analytics business.” This is the new model – multi-faceted service-based businesses. Technology and retail are growing together and strengthening the case for one-another’s development. 
Using ML, a service such as RTR can be tailored to the individual and offer an unprecedented degree of customer insight. To support this ‘Closet in the Cloud’, RTR turned to a cloud-based data warehouse to provide a system that boasted scalability at the click of a button. By adopting a true-cloud data warehouse to support the organisation, businesses such as RTR can avoid drowning in their own success. Legacy warehouses are inflexible, offering one option for retailers who hope to succeed - ‘overprovision or bust’. This creates a culture of duplicate data filling up capacity, as concurrency is virtually non-existent. As AI is dependent on a vast amount of resources, such wastefulness can quickly lead to redundancy. Optimisation is the trend of today – the leaner your data strategy, the more effective your predictive analysis will be. There is no singular version of the truth in a legacy warehouse, rather many half-truths. Any attempt at personalised marketing or sales predictions is thrown horribly off the mark and effectiveness trends to zero. A cloud solution eradicates this in its inherent ability to support a virtually infinite number of concurrent users – any changes to the data will only be in service of creating a more complete image of the customer, stock, sales, etc., empowering the business as it grows. Young businesses are fighting a good fight against established industry giants, but they will need every possible resource in their favour, with AI and ML as their powerhouse. They need a holistic view of their customers so as to leverage every advantage and close every sale. They need flexibility and scalability in real-time to minimise overheads and meet demand. They need concurrency, accuracy and organisation across every department. Cloud-based data warehousing is the only viable solution for those who seek a level playing field. As legacy-loyal competitors find themselves tethered to outdated systems, a new breed will strategically fill the niche. ### Smart Cities - Compare the Cloud with Phil Beecher Wi-SUN Alliance Compare the Cloud speaks with, Chairman of Wi-SUN Alliance, Phil Beecher. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. To find out more about Wi-SUN visit: https://www.wi-sun.org ### Navisite Launches VMware vCloud Availability Program at VMworld Navisite expands managed cloud services and investments in VMware NSX to support networked replication and disaster recovery to the cloud for on-premises VMware vSphere implementations Navisite, a part of Spectrum Enterprise, a division of Charter Communications (NASDAQ: CHTR) and a Premier partner in the VMware Cloud Provider Program, today announced that it will launch a program that will offer early access to Navisite’s managed services for the VMware-native cloud-based replication and disaster recovery capabilities for VMware vSphere and VMware vCloud Availability for vCloud Director. This complements Navisite’s recent launch of VMware NSX network and security virtualization platform.  With Navisite’s VMware cloud services, running VMware vCloud Availability for vCloud Director, Navisite clients will be able to test replicated environments and move applications seamlessly, all while production is actively running. 
[easy-tweet tweet="Recent cyberattacks has highlighted interest in cloud replication DR implementations" hashtags="DR, Cyberattacks, Cloud"] “Our clients have been looking for a native solution to replicate and migrate protected virtual machines to the cloud - without the need for costly dedicated hardware or complicated third-party solutions,” said Sumeet Sabharwal, group vice president and general manager, Navisite. “The spike in recent cyberattacks and ransomware has highlighted interest in cloud replication disaster recovery implementations. Navisite VMware vCloud Availability for vCloud Director based replication service provides clients with a simple and cost effective solution to enable availability of their VMware environments.” Navisite’s VMware vCloud Availability for vCloud Director solutions offer a variety of benefits for clients, including: On-premises installation and complete VMware compatibility: Using a Navisite VMware Cloud as a vCloud Availability for vCloud Director replication target, clients can replicate virtual machines (VMs) from primary VMware vSphere environments to a highly scalable, geographically-diverse site, based in one of Navisite’s enterprise-grade data centers. Simplified failover switch capabilities: Clients can gain access to a remote site for disaster recovery without the painful process of configuring a VPN. After an interruption at a client’s primary site, users can simply initiate a failover switch to keep business applications running in a remote Navisite environment. Once the primary site is restored, clients can then initiate a failback to return to running applications as usual. Leveraging a distributed architecture provides Navisite clients with unique failover testing capabilities. VMware-native solution with enhanced functionality: Unlike alternative third party replication tools that can be costly and difficult to integrate, vCloud Availability for vCloud Director is a native solution developed with the full support of the vSphere stack. “Navisite’s innovative program gives their clients access to VMware’s latest replication and Network Virtualization technologies,” said Ajay Patel, senior vice president, general manager, Cloud Provider Software Business Unit, VMware. “Navisite’s vCloud Availability for vCloud Director solution offers clients an end-to-end, VMware-native solution that can surpass existing third-party data protection solutions.”  Clients are already taking advantage of this cloud replication and Disaster Recovery as a Service (DRaaS) solution. One example is Ceridian, a global human capital management company that has turned to Navisite to help meet the needs of its changing business model. “As the only provider in the U.S. with a VMware vCloud Availability replication offering, Navisite is exceeding our expectations yet again,” said Warren Perlman, CIO, Ceridian. “Their new multi-cloud solution combines the power VMware NSX and vCloud Availability for vCloud Director to deliver seamless and rapid recovery time in the event of a disaster, allowing us to keep the focus on our core business.” Navisite is a Gold Sponsor at VMworld 2017, the industry's largest virtualization and cloud computing event, taking place August 28-31 in Las Vegas. Navisite will be discussing and demonstrating its VMware expertise along with new VMware vCloud Availability for vCloud Director at booth #212 during the conference. 
For more information about Navisite’s VMware vCloud Availability for vCloud Director visit: http://www.navisite.com/solutions/vcloud-availability-trial. ### How Would Blockchain Impact on the Stock and Currency Markets? Let’s face facts: the notion of using blockchain technology to revolutionise the financial marketplace is nothing new. Back in 2015, the Nasdaq stock exchange used blockchain to transfer shares through a decentralised ledger, removing the historic need for a middleman or clearing house. More recently, Goldman Sachs has filed a patent application that would leverage the principles of blockchain technology for the buying and selling of currency, in a bid to speed up transactions and reduce the cost of trading. While it would appear as though blockchain is a more natural fit for an already liquid and real-time marketplace, there is no doubt that this technology could also be applied to stocks. But how would it impact these markets, and would it make trading more accessible? How will Blockchain Challenge and Change the Face of Trading? In the current market, currency and share trading tend to be conducted through third-party brokering sites and clearing houses, which facilitate orders in exchange for predetermined commission fees. This can be a relatively elongated process (for financial institutions in particular), while it can also increase fees in instances where more than one middleman is involved. By leveraging a transparent and decentralised ledger, however, blockchain technology would create a direct link between participants, who could complete their transactions through a peer-to-peer network of brokers and independent traders. So while participants would still execute orders through an online trading platform, the underlying back-office functions and final settlement would be completed using blockchain, boosting efficiencies and market liquidity in the process. There are numerous benefits to this practice, even though its application would vary between stocks and currencies due to the diverse nature of these assets. Blockchain would universally create a more efficient and cost-effective trading process, however, minimising third-party or custodian delays while also virtually eliminating associated auditing costs. This would also lend itself to instantaneous, real-time trades, which could prove crucial in a volatile and constantly changing entity such as the foreign exchange. Given the recent scrutiny of questionable practices by banks and other financial institutions, blockchain technology may also offer timely benefits in the form of transparency. This would solve an issue that sits at the core of financial market trading in the digital age, particularly in liquid and malleable entities such as the foreign exchange. In short, blockchain would act as a decentralised and virtually impenetrable ledger of completed transactions, listing encrypted orders that are then distributed through a public network for the purposes of transparency, compliance and accountability (a toy sketch of such a ledger appears at the end of this article). Suddenly, the ownership of trades would be indisputable, making it far harder for anyone to manipulate records or commit fraud in the financial marketplace. What Are the Challenges? Despite the universal advantages of blockchain, not all marketplaces are created equal and some challenges may arise when the technology is applied. Take the stock market, for example, which would suddenly become a far more liquid entity should real-time blockchain technology be introduced. 
So while this would theoretically make it easier to short-sell stocks and most likely lead to increased investment in shares, we would also see a significant increase in trading activity and greater volatility in price shifts. Given that stocks are seen as a more secure store of wealth than currency, this may not benefit every single share trader or their underlying philosophy. [easy-tweet tweet="Blockchain technology offers a far more natural fit for the forex market" hashtags="Blockchain, Technology"] In this respect, blockchain technology offers a far more natural fit for the forex market. After all, this is already a market that boasts high levels of liquidity, while the speed and efficiency of short-selling and spread betting would be enhanced by blockchain. Even then, the volatile nature of the foreign exchange means that prices can fluctuate wildly in instances where trading volumes rise at a disproportionate level, with practices such as high-frequency trading having triggered sizeable losses in certain market conditions. With blockchain optimising the number of real-time trades and short-sells, investors could see similar issues if the technology were widely applied. The rise of transparency may also present challenges, even though it will have a positive impact from the perspective of regulators and independent traders. The notion of having accurate market positions exposed to the public as a form of trading ID may be too much to bear for some financial institutions, particularly high-level investors and lucrative hedge funds. With these trading vehicles typically active over a prolonged period of time, there is a risk that other investors could leverage the data to replicate successful strategies for their own financial gain. The Bottom Line: Why the Rise of Blockchain as a Trading Staple is Inevitable These challenges aside, there is no doubt that the accessibility, efficiency and security of blockchain mean that it will play an increasingly pivotal role in the future of the financial markets. Its impact will vary, of course: while it will simply increase the liquidity and trading volumes associated with the forex market, it has the potential to revolutionise share trading and alter its fundamental nature. The principles of blockchain could even be applied to the way in which share ownership is recorded, with the volumes, prices and timing of each individual transaction recorded for posterity. In truth, there is no limit to the potential of blockchain in the financial market. Even the commodity market, which is affiliated with the physical economy and still utilises physical documents as part of its value chains, has flirted with the idea of leveraging blockchain in a bid to remove manual processes and unlock collateral. Once again, blockchain would modernise the commodity market and offer faster access to finance, bringing it in line with more liquid entities across the board. 
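To make the 'decentralised, tamper-evident ledger' idea concrete, below is a minimal sketch of a hash-chained transaction ledger. It is a toy illustration of the append-only property described above, not a model of any exchange's actual settlement system, and the trade fields are invented for the example.

```python
# Toy illustration of an append-only, hash-chained ledger (not a real
# settlement system). Each entry commits to the previous entry's hash,
# so altering any historical trade breaks every hash that follows it.
import hashlib
import json

def entry_hash(entry: dict) -> str:
    return hashlib.sha256(json.dumps(entry, sort_keys=True).encode()).hexdigest()

def append_trade(ledger: list, trade: dict) -> None:
    prev = ledger[-1]["hash"] if ledger else "0" * 64
    entry = {"trade": trade, "prev_hash": prev}
    entry["hash"] = entry_hash({"trade": trade, "prev_hash": prev})
    ledger.append(entry)

def verify(ledger: list) -> bool:
    """Recompute every hash and check the chain is unbroken."""
    prev = "0" * 64
    for e in ledger:
        if e["prev_hash"] != prev or e["hash"] != entry_hash(
                {"trade": e["trade"], "prev_hash": e["prev_hash"]}):
            return False
        prev = e["hash"]
    return True

ledger: list = []
append_trade(ledger, {"buy": "GBP/USD", "units": 1000, "price": 1.29})
append_trade(ledger, {"sell": "AAPL", "units": 10, "price": 159.86})
print(verify(ledger))                    # True
ledger[0]["trade"]["units"] = 999_999    # tamper with history
print(verify(ledger))                    # False
```

In a real deployment the ledger would be replicated across many participants and consensus rules would decide which entries are accepted; the sketch only shows why retrospective manipulation of a recorded trade is immediately detectable.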
### What To Know About Devops In 2018 Software development has always been a demanding field of work. Unlike many other professions, software developers always have to be up to date with the latest and greatest in the tech industry. The breadth of knowledge a person can have is limited, however. As systems became more complex, dividing the software development process into development and deployment made much more sense. Devops handles the deployment end of the process. While deployment and systems administration is a major task in itself, devops engineers are no longer limited to the conventional boundaries of a sysadmin. This change has been caused by the industry-wide shift from on-premise systems to cloud-based systems. Devops encompasses a much broader range of skills today. The most important skills a devops engineer must have in 2018 are discussed below. Working with the cloud Nearly all jobs in the devops field today are related to cloud-based service administration. The term “cloud” broadly refers to someone else’s computer, for example the offerings from Amazon AWS, Google Cloud Platform, Microsoft Azure and DigitalOcean. These providers allow the user to provision and destroy servers on demand. This layer of abstraction gives rise to a very powerful way of architecting and scaling services based on real-time demand. A devops engineer must be very comfortable with using these providers, provisioning servers programmatically using a deployment tool like Ansible, Puppet, or Chef, handling firewalls, and other sysadmin-specific tasks. Continuous Integration Today, everyone uses continuous integration. Providers like TravisCI and CircleCI allow the user to run builds on their systems based on conditions like new commits to the codebase and report back the results to the required endpoints using webhooks. Once a build succeeds, they can hook into the production or staging servers and perform the actual deployment. A devops engineer must be comfortable with using and setting up continuous integration. Familiarity with the popular OSS alternative Jenkins is an added bonus. Microservices As the industry largely moves from monolithic applications to microservices architecture, the deployment process has become much more complex. A devops engineer must be familiar with the design choices made in microservices-based architectures and the rationale behind them. Scaling One important benefit of microservices is the scaling potential they provide. Individual services can be scaled horizontally or vertically as demand requires. A devops engineer must know about orchestration, setting up load balancers/proxies, when to scale and in which direction, and sharding databases. Distributed systems Distributed systems are required because of the limits imposed by vertical scaling. As microservices become more and more parallel, distributed systems concepts come into play. A devops engineer must know how to work with message queues, setting up replication on databases, service discovery, and inter-process communication techniques. On the theoretical side, a good devops engineer must know about the CAP theorem, the tradeoffs it implies and the main consensus algorithms. [easy-tweet tweet="Devops engineers are often required to automate processes other than software development" hashtags="Devops, Cloud"] Automation Good devops engineers automate as much as they can. A devops engineer must know about setting up automatic backups to a storage platform like Amazon S3, automated deployment systems, monitoring, and automated alerts to the relevant people. Devops engineers are often required to automate processes other than software development. Automation is an indispensable requirement in achieving business growth. Open source marketing automation software like Mautic can be used for this. A devops engineer must be comfortable automating services by joining them together via webhooks; a short sketch of this kind of webhook-driven glue follows below. 
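As an illustration of the webhook-driven automation mentioned above, here is a minimal sketch of a service that listens for a CI webhook and kicks off a deployment script. Flask is used purely for brevity; the header name, signature scheme, payload fields and deploy command are assumptions for the example rather than any particular CI provider’s actual contract.

```python
# Minimal webhook receiver that triggers a deployment when a CI build succeeds.
# The header name, signature scheme and deploy command are illustrative
# assumptions - check your CI provider's documentation for the real contract.
import hashlib
import hmac
import subprocess

from flask import Flask, abort, request

app = Flask(__name__)
SHARED_SECRET = b"change-me"  # agree this secret with the CI system

@app.route("/hooks/deploy", methods=["POST"])
def deploy():
    # Verify the payload really came from the CI system.
    signature = request.headers.get("X-Hub-Signature-256", "")
    expected = "sha256=" + hmac.new(SHARED_SECRET, request.data,
                                    hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        abort(403)

    event = request.get_json(silent=True) or {}
    if event.get("status") == "passed":
        # Hand off to an idempotent deploy script rather than deploying inline.
        subprocess.Popen(["./deploy.sh", event.get("commit", "HEAD")])
        return {"deploying": True}, 202
    return {"deploying": False}, 200

if __name__ == "__main__":
    app.run(port=8080)
```

In practice this would sit behind TLS, and the deploy script itself is the usual place to pull images or invoke a configuration tool such as Ansible; the point of the sketch is simply that build events can drive deployment without a human in the loop.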
Containers Virtualization is the backbone of all cloud-based services today, and containers are right at the center of this development. Docker and rkt are two popular container engines used widely in production by many companies. A devops engineer must be familiar with these container engines and other alternatives like containerd and BSD Jails. Additionally, devops engineers should have experience with container orchestration, networking with containers, service discovery, integration with build systems, init systems for multi-service containers, and setting up security profiles for AppArmor and SELinux. Logging Logs are the unifying layer in many distributed systems. A good devops engineer must be experienced with rotating logs, a metrics server like Graphite, logging drivers like Syslog, Fluentd, and systemd logging via Journald. Bonus points for being familiar with making containers work with different logging drivers, sending logs to data processing pipelines, and setting up log-based replication in databases. The breadth of knowledge required to become a good devops engineer can be daunting. Devops engineers need to match the pace of change in the industry. With large-scale automation coming into the picture, the role of a devops engineer has become crucial. Trying out new and challenging paradigms is the key to moving forward and staying relevant as a devops engineer. ### The Cloud - Cloudtalks with Steven Boyle from Integrated Cloud Group In this episode of #CloudTalks, Compare the Cloud speaks with Steven Boyle, CEO of Integrated Cloud Group. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. To find out more about Integrated Cloud Group visit: https://integratedcloudgroup.co.uk/ Cloudtalks ### 5 Things You Can Do With OpenStack There are so many misconceptions about what you can and can’t do with OpenStack. I talk to people every day who say “OpenStack is OK, but you can’t…” and more often than not, you actually can. Here are 5 ‘Swiss army knife’ features that you may not know are possible with OpenStack. Multi Tenancy / Project Spaces Slice your cloud resources any way you wish. Set up ‘project silos’ for individuals or groups of users. With an overarching cloud administrator controlling access across project spaces, your single OpenStack cloud can be used by any number of users and teams without impacting each other. Tenancy separation is the default, with the option of creating virtual routers to connect tenancies if required. Availability Zones (AZ) Availability zones are used to group cloud nodes, and can contain multiple host aggregates. Host aggregates are typically used to group compute nodes of similar physical architectural type and storage nodes of varying IO performance. Being able to group nodes of differing architectural types, i.e. Intel and Power CPUs, and HDD and SSD disks, enables your OpenStack cloud to dynamically and automatically launch instances based on metadata defining various architectural types. A typical use case where this level of separation is required is when you have Intel compute nodes hosting ‘standard’ workloads and HPC compute nodes ring-fenced for big data research and analysis roles. 
AZs can also be useful when used correctly in High Availability and Disaster Recovery solutions. Software-defined networking (SDN) The OpenStack Neutron service provides a full-featured SDN feature set. With full control of virtual networks in the multi-tenant cloud, every network can be defined as an individual secure network. Support for VXLAN and VLAN enhances the security scenarios even further. Also available are options to create Firewalls, Load-balancers, Routers and IPSec VPN tunnels, all on a granular, per-tenant basis. Public cloud services E-commerce is an ever-growing revenue stream for public cloud providers. OpenStack was created as a Private Cloud solution. However, when combined with autonomous product, customer management, payment gateway and billing systems such as Atomia and Host-bill, it is entirely possible to turn your OpenStack private cloud into an OpenStack public cloud, offering public hosted services and opening up a new revenue stream for Managed Service Providers. Distributed Cloud It is possible to host a central controller in a data centre or an HQ site and install additional OpenStack nodes in remote branch offices to support local workloads. The remote site users will still access their tenancies through the central controller, but they will have the option of launching instances at their local branch cloud node. This scenario is made simpler by OpenStack cloud providers who offer hyper-converged OpenStack nodes, such as ScaleCloud appliances. If you think you know OpenStack, think again. It is an extremely versatile cloud, and if you exploit every service it has to offer, you can get ahead of the pack. ### Rubrik Launches Global Service Delivery Partner Program First-of-its-kind program creates premium services offerings and new revenue streams for partners with managed service offerings Simple, flexible consumption models now available for Rubrik’s leading cloud data management platform Rubrik, the Cloud Data Management Company, today announced a new Service Delivery Partner program to offer partners simple, profitable and flexible consumption models that are powered by Rubrik. The new program enables Service Delivery Partners — encompassing MSPs, VARs, Consulting Partners and any company that includes a managed service offering in their portfolio — to deliver to clients a complete suite of automated data management services on a scale-out architecture across private, hybrid and public clouds. Service Delivery Partners can leverage Rubrik’s technology to create differentiated services across a wide range of client use cases, including backup and instant recovery, archival, test/development, search, reporting and analytics, compliance, copy data management, and cloud solution services. Rubrik’s nimble software platform offers superior performance and operational simplicity that unlocks service delivery value by quickly enabling new paths to premium services. [easy-tweet tweet="50% of organisations will augment or replace their current backup application with another solution by 2021" hashtags="Gartner, Cloud, Backup"] “According to Gartner, 50% of organisations will augment or replace their current backup application with another solution by 2021. The market opportunity here for our partners is huge,” said Randy Schirman, Vice President, Worldwide Channel and Service Delivery Partner Sales, Rubrik. 
“The channel is asking for straightforward programs that accelerate their time to market, help them build premium services, and create value so they can expand current client relationships and capture new accounts. The Rubrik Service Delivery Partner Program is designed to address these challenges and help our partners create richer and deeper customer relationships.” New Consumption Models The Rubrik Service Delivery Partner program offers flexible engagement models for partners including subscription and two new consumption engagement models for both hardware and software. The consumption model for hardware provides Service Delivery Partners with a low-risk and low-cost platform upon which they can build new services that are powered by Rubrik on a pay-as-you-grow basis. The consumption model for software allows partners to leverage Rubrik software on top of approved third-party hardware appliances, including Cisco, HPE, Dell, and more. Unlike legacy MSP programs, Rubrik Service Delivery Partners are charged based upon back-end data (the actual data used on the device) versus front-end data that may include duplication. Additionally, Rubrik makes capacity aggregation available across multiple clients and multiple clusters, which helps partners centrally manage and bill for utilization, as well as helps meet their minimum annual commitment and reach volume discounts regardless of the service (e.g. backup, archive, disaster recovery, etc.) being delivered to their clients. Program Tiering Rubrik offers two levels within the program, to reward extra volume and commitment from partners. The select level is aimed at those partners looking for a flexible entry point into the service delivery market via a-la-carte pricing, targeted at SMB to mid-market customers. Partners at the Elite level are already delivering services targeted at enterprise customers who have large data sets to manage or migrate. White Label Services  For Service Delivery Partners that see the value of offering Rubrik data management services but do not have the resources or capabilities to build these services for their clients, Rubrik has engaged Assured Data Protection to offer partners a suite of white label services that are “powered by Rubrik.” “We are very excited to be selected as the delivery partner for Rubrik’s white label service option,” said Simon Chappell, co-founder & CEO, Assured Data Protection Inc. “Assured DP’s global presence and collective years of experience in delivering managed data protection services places us in a unique position to enable Rubrik partners. Together we will accelerate partner time to market for new services and solutions.” Additional Program Benefits  Rubrik Service Delivery Partners have access to a wide range of additional benefits, including: Branding. Service Delivery Partners build their brand leveraging Rubrik to support the messaging of their solutions to their clients. Co-Marketing Support. Rubrik is committed to working with partners on joint marketing events to raise awareness and drive demand for cloud data management solutions.  Eligible partners can request MDF funds to help with demand generation. Technical Support. Service Delivery Partners have access to Rubrik award-winning Technical Support to help them deliver services to their clients. Online partner portal. A new online portal is available to all Rubrik partners with sales tools, educational materials, instructional videos, and more. 
Rubrik has quickly grown a strong global Service Delivery Partner network with market presence across North America, Europe, the Middle East and Asia Pacific regions. ### Building the Digital Enterprise Platform ecosystems will become a key differentiator as enterprises advance their digital strategies As digitisation matures, organisations are increasingly finding themselves part of a digital ecosystem — which encompasses business partners, competitors, customers, regulators and other stakeholders that exchange information and engage digitally. In turn, CIOs are shifting their spending to digital business, with typical CIOs already investing 18 percent of their budget in favour of digitisation, a figure set to reach 28 percent by 2018, as reported by Gartner. Digital-savvy organisations that have deeply embedded digital into their planning processes and business model are already spending 34 percent of their IT budget on digital efforts. The share of wallet is predicted to increase to 44 percent by 2018 as organisations progress with their digital initiatives. By 2018, half of EA initiatives will focus on digital platform strategies Digital platforms underpin some of the most successful business models, from Amazon over Facebook, and Salesforce to Uber. All of them strategically leverage cutting-edge technologies such as artificial intelligence (AI), big data analytics, cloud computing, machine learning, and the Internet of Things (IoT) to advance their operating model, capture market share and keep their organisations competitive. Enterprise Architects (EAs) will play a vital role in facilitating the corporate agenda. EA practitioners must, therefore, concentrate their business architecture efforts on defining their digital platform's strategy and outlining their operating model. Moreover, EAs will increasingly focus on the business and technology opportunities and challenges of digital ecosystems. Among other items, this includes integration work, ensuring interoperability, and improving the user experience. Gartner predicts that by 2018, half of EA business architecture initiatives will focus on defining and enabling digital platform strategies. Design-Centric architecture is essential to build and enhance a platform ecosystem Going forward, enterprise architects will gradually devote more time and attention to design aspects of architecture — which is at the forefront of digital innovation. By 2018, 40 percent of EAs will focus on design-driven architecture, according to Gartner. This enables organisations to better understand the ecosystem and its players, gaining insight into them and their behaviour and enhancing their portfolio with complementing services they are searching for. However, beyond the question of “what” customers want, design-thinkers focus on the “why”. Utilizing design-driven approaches is common practice at many leading platform companies including Apple and Dropbox, for example. However, the transition to design-centric architecture has a multitude of implications for people, processes and tools alike. Smart EA practitioners will enhance the design skills and build additional capabilities within the EA team. They also need to actively educate other parts of the organisation on design-driven architecture and identify an area where they can incorporate the methodology to not only spur innovation but also learn, as a group, how to do design. Digital ecosystems are spreading like wildfire as the ‘cloudification’ accelerates Digital ecosystems are evolving quickly. 
As part of the ‘cloudification’ trend, IT landscapes are becoming much more fragmented. With hybrid already being the norm, organisations increasingly move one step further toward a “cloud-first” or “cloud-only” approach and essentially operate in a multi-cloud environment spread across a variety of providers. The results of Gartner's annual global CIO survey support this development. In a poll of more than 800 CIOs, the average number of ecosystem partners grew from 22 in 2015 to 42 in 2017. The number is forecast to reach 86 two years from now. In other words, CIOs in organisations leveraging a digital ecosystem are seeing, and expect to continue seeing, their digital ecosystem partners double every two years. Nurturing the organisational capabilities for a thriving digital ecosystem [easy-tweet tweet="The shortage of qualified labour remains a key challenge for CIOs to accomplish their goals" hashtags="CIO, Cloud"] Enterprises are increasingly adopting bimodal IT, which is a core ingredient of the organisational skills needed to create a digital ecosystem. Nearly half (43 percent) of survey respondents report that they are embracing bimodal IT. However, the results suggest huge differences between the performance of the top percentile and the centre span. Leaders of the pack far outpace the average organisation in their use of bimodal, with 68 percent of all top performers having adopted bimodal compared with only 17 percent of the trailing performers. The shortage of qualified labour remains a key challenge for CIOs to accomplish their goals. More than a third (34 percent) of survey participants reported that information-related skills represent the biggest bottleneck, particularly the capabilities needed to perform data modelling and utilize cutting-edge analytics to build data-centric business models. The capabilities that may have worked in a pre-digital era are no longer sufficient to cope with real-time data streams presented by the Internet of Things (IoT), analytics, operational technology and digital ecosystems. As a result, in-depth expertise in emerging technologies is rare and expensive. Building leadership capabilities to develop and orchestrate a digital ecosystem Executives should rethink their leadership priorities, engage stakeholders and involve them in the various digital initiatives. Those that are digital ecosystem-ready see the bigger picture and think outside the box. These leaders look for opportunities in every direction. They are prepared to orchestrate diverse partnerships, leave the comfort zone, and question the value propositions and business models of the past. Having an in-depth understanding of the business objectives is crucial for building a digital ecosystem. Together with their CEOs, top-performing CIOs put a strong emphasis on growth and digitisation. In the digital era, in which data is the currency, CIOs have the unique opportunity to play a fundamental role in the creation of new revenue streams. The CIO survey revealed that growth is a recurring theme in the strategic business priorities for 2017, with 28 percent of the top performers, 21 percent of the typical performers, and 24 percent of trailing performers quoting it among their top three priorities. Digital business is a high strategic business priority for the top and typical performers (28 and 20 percent respectively) but was cited by only six percent of trailing performers.
### 10 Surprising Ways to Use Cloud Storage Cloud storage is a technology that most professionals need to take advantage of. However, just because it’s a professional tool that millions of users use every day doesn’t mean that a basic user cannot use it on a daily basis too. Below are our top 10 ways you can use cloud storage for even the most minimal tasks. 1.    Upload Your Photos to the Cloud If you take photos using your digital camera or smartphone camera, I bet there are plenty of photos lingering around in its storage. Before you take new photos, how do you get more space without deleting the photos themselves? Well, you’re going to need to upload them to another source. This is where you can use the cloud to upload them. This will free up space on your computer and other devices. Plus, it makes it a lot easier to share with family and friends. 2.    Organize Your Thoughts You never know when a million-dollar idea is going to hit, which is why you always have to be prepared. One of the best ways you can record your thoughts, notes, and to-do lists is to write them down virtually. For example, you can take advantage of a note app such as Evernote or Wunderlist, which both have built-in cloud storage to store your every thought. 3.    Working With a Group For school or professional work, cloud storage has never made collaboration easier. When working with a group, you can easily share files and documents, which can save both money and time. Plus, most cloud-based document services such as Dropbox allow multiple people to work on the same document all at once. 4.    Using Dropbox to Host Blogs Even though Dropbox is primarily a cloud storage service, integration with third-party apps allows you to do even more! Believe it or not, you can even use Dropbox as a private web hosting platform, on which you can host your very own blogs. 5.    Organize Your Media On your computer and devices, you probably have a ton of saved images, videos, audio files, and more. Instead of simply leaving them all scattered, you can use cloud storage to sort your files and create cohesive digital media libraries. 6.    Combine Your Cloud Storage All in One You no longer have to decide between Amazon Cloud and Google Drive. If you frequently use multiple cloud storage platforms such as OneDrive, Dropbox, or Google Drive, you can combine all your storage providers into one easy location. One such service is Cloudz, which can combine all your cloud storage accounts into one basic account. 7.    Save Your Passwords in the Cloud Internet security is more important these days than ever before. However, it can be difficult to remember all your passwords for every website you use. This is especially true since it is suggested that you use a unique and complex password for each and every account you use. How can one person be expected to remember so many different passwords? [easy-tweet tweet="Password management programs can store all of your passwords in the cloud" hashtags="Cloud, Cloudstorage"] Now, you don’t have to use your brain power to remember your passwords. Instead, you can use a cloud-based password management program. Password management programs can store all of your passwords in the cloud and can even generate unique passwords for you. All of your passwords will be protected by a single master password, which is the only one that you will have to memorize. 8.
Create a Private Cloud Server with OwnCloud If you would rather build your own private cloud server without needing to rely on a third-party, creating your own cloud storage and syncing server may be a great option for you. Luckily enough, there are plenty of open-source cloud sharing platforms, such as OwnCloud, that gives its users full control over who can access their files and where they are stored. 9.    Make Embedded Videos on Google Drive Those who frequently need to upload videos or embed them on a blog or website will love this tip. You can easily use Google Drive to perform these tasks. This can be particularly helpful if you do not want to use a video hosting site like YouTube for your videos. This process is fairly simple. First, you must open your Google Drive account, click upload, and select your chosen video. Then, click the share button to change the access for viewing from private to public. Right click on your video and choose “Open with - Google Drive Viewer.” From there, select “File - Embed this Video.” From there, all you need to do is copy and paste the embed code onto your website or blog. 10.   Share Your iTunes Music If you use iTunes to store all of your music, Apple makes it easy to share and access your tunes, no matter where you are. All you need to do is turn on Home Sharing or Music Sharing. This will open your music library up to your entire cloud network of devices. There are plenty of helpful things you can do with cloud storage, including some that you may not have ever thought of before. From word processing to note saving, it seems the options are limitless when it comes to what can be accomplished with the help of cloud storage computing power. ### Move Over Fintech To bring positive disruption to stale markets, risk is usually necessary. Here, Daniel Horton, technical manager at Parker Software, explains the technological risks the financial services sector needs to undertake to stay ahead of increasing regulatory requirements. There are always risks when implementing new inventions and technologies, especially in highly regulated or particularly sensitive markets. The Financial Conduct Authority (FCA) has an overall objective to ensure that financial markets work well — maintaining consumer protection and overall market integrity. However, the FCA is also eager to promote competition in the interest of consumers, something that many new financial technology — or FinTech — innovations can provide. Traditional financial organisations can be hesitant to embrace new technology. However, it is important that the industry understands how this innovation brings positive market disruption by forcing adaptive thinking and processes. FinTech is already making significant changes to efficiency in the financial sector, but there is a long-standing view that increasing regulation will hinder growth. Financial services could be described as one of the most complex, highly regulated industries on the planet. In any industry, regulatory and compliance issues are important, convoluted and resource-consuming. Regulatory technology, or RegTech as it has become known, may provide a better solution for the financial services sector to meet these increasing regulatory demands. The collaboration of regulatory standards and technology has existed for some time. However, increasing penalties for non-compliance — as well as a greater focus on data and reporting in the industry — has added more pressure for innovation and investments in the RegTech market.  
Recent years have seen a significant change in technologies used in RegTech solutions, with innovators embracing technologies such as big data analytics and artificial intelligence. Artificial intelligence in RegTech Artificial intelligence may be one of the biggest sources of opportunity for the RegTech industry. By nature, artificial intelligence modelling — and to some extent, machine learning — is ideal for tasks that need to be handled accurately and logically, such as reporting and document archiving. The strictly regulated nature of the finance industry makes it an ideal sector to invest in AI, because the technology can automate tasks that are based on specific systems, set rules and procedures. We are likely to see an increasing presence of artificial intelligence in the financial services sector in the next few years. Tasks such as providing legal or financial advice could easily be delivered through a bot interface, using existing artificial intelligence and automation technologies. Some companies are already using more advanced AI applications, such as IBM’s Watson. However, most of these companies are using this technology in an experimental fashion, as opposed to completely overhauling their existing procedures in favour of artificial intelligence. Despite its potential, artificial intelligence is still in its infancy. Until it is fully developed, there will still be arguments about the accuracy and ability of AI compared with a well-trained human representative. [easy-tweet tweet="Predictive analytics technology can use pattern recognition to assess data from across the globe" hashtags="Technology, Fintech"] Using big data to manage risk Alongside artificial intelligence, any sector that is dealing with large amounts of valuable data has the potential to use this information to make intelligent predictions. In the financial services sector, one application of big data analytics is to identify potential instances of fraud. Predictive analytics technology can use pattern recognition to assess data from across the globe. Naturally, manually assessing this data is a time-consuming process. However, combining this with artificial intelligence could allow the technology to automatically scan and categorise potential areas of fraud according to the amount of risk, and then pass on this fraud detection for human review. As new types of fraud are identified, these incidents could be flagged and integrated into the machine learning logic. Using this method, automation could detect similar activity in the future and prevent fraudulent activity. For the time being, artificial intelligence will simply assist the financial services sector — not become an integral part of regulatory compliance and standards management. There are clear advantages to handing over the reins of regulatory compliance to a computer designed to review and complete administrative tasks. However, while the technology is still in a state of learning, it will not have the upper hand over human employees for another few decades. ### How to Control Cloud Repatriation with App-centric Infrastructure Performance Management Cloud adoption is on the rise. According to an IDC report, enterprises are driving the increase, with 68 per cent using public or private cloud: a 61 per cent jump since 2015. Why? Because businesses are experiencing the benefits.
Most of you are cloud experts, so I don’t need to tell you that among the many reasons given for organisations moving their data to the cloud, flexibility, scalability and perceived lower cost are often top of the list. But it’s not a one-way street. Just as some businesses are dipping their toes into cloud infrastructure, others are pulling back – returning applications from the public cloud to a private cloud or a physical data centre on site. In fact, research by ESG has revealed that 57 per cent of enterprises have moved at least one of their workloads from the cloud to an on-premise infrastructure. Honing performance Although security is a concern, performance is also a key driver for this shift. At best, a poorly performing application can result in a slow-down; at worst, a complete outage. Both scenarios have serious implications for the business, whether it’s that customers are driven elsewhere, staff are unable to do their jobs properly or that the whole infrastructure grinds to a halt. As a result, a slow application will be whipped out of the public cloud and repatriated somewhere else. The underlying cause is often hidden, still lurking, ready to cause further performance issues further down the line and result in more applications being moved, perhaps unnecessarily. Although most teams monitor their applications and infrastructure in some way – Gartner analysts recently reported that most organisations host five or more infrastructure monitoring tools per data centre – many of the tools employed just aren’t up to the job. For example, tools to monitor application performance highlight application speeds in isolation without looking at the impact on the wider infrastructure. Alternatively, domain-specific tools focus on an individual component’s performance but offer no insight into the cause of an issue. It’s like being given a piece of a jigsaw puzzle without understanding how it fits into the full picture. Getting the full picture Application-centric infrastructure performance management (IPM) offers the best of both worlds: visibility and analytics across application and infrastructure domains, helping to ensure optimum performance and spot potential issues before they become a problem. It records and correlates thousands of metrics in real time every second: giving insight into how each component fits with the others, recognising when some applications will be working harder than usual and how to manage an increase in demand, and pinpointing what is causing an application to struggle. Teams don’t need to wait for a crisis to realise that an application isn’t performing optimally; with a holistic view, it’s much less likely that the infrastructure will ever reach that point. [easy-tweet tweet="Understanding application workloads means the IT team can manage their infrastructure more effectively" hashtags="IT, Cloud"] As well as offering the full picture of how each component is performing, app-centric IPM means IT teams no longer have to over-provision infrastructure for fear of future performance problems. They know exactly how their infrastructure plays, recognise performance limits and understand what’s truly needed to make sure everything runs smoothly. Understanding application workloads means the IT team can manage their infrastructure more effectively: so no more knee-jerk reactions to poor performance. And no more improperly blaming performance issues on a specific silo team.
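As a toy illustration of the kind of correlation an app-centric IPM platform performs continuously and at much larger scale, the sketch below lines up application response-time samples against two infrastructure metrics and reports which one tracks the slow-down most closely. The figures and metric names are invented for the example.

```python
from statistics import correlation  # available in Python 3.10+

# Invented one-minute samples covering the same window.
app_response_ms = [110, 115, 180, 240, 260, 150, 120]       # application latency
storage_latency_ms = [4.0, 4.2, 9.5, 14.0, 15.2, 7.0, 4.5]  # storage tier
host_cpu_percent = [35, 36, 38, 37, 40, 36, 35]              # compute tier

candidates = {
    "storage latency": storage_latency_ms,
    "host CPU": host_cpu_percent,
}

# Rank infrastructure metrics by how strongly they move with the application,
# pointing the team at the tier worth investigating first.
ranked = sorted(
    ((name, correlation(app_response_ms, series)) for name, series in candidates.items()),
    key=lambda pair: abs(pair[1]),
    reverse=True,
)
for name, r in ranked:
    print(f"{name}: r = {r:+.2f}")
```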
It’s easier to plan much more efficiently too: teams can predict with much greater accuracy how their workloads are likely to expand and how demands on the IT infrastructure are expected to increase. So the cloud’s flexibility and scalability can be called on in a much more strategic and managed way. Performance over availability I recognise that it’s a change in focus: until now availability has been the watchword. In fact, many service level agreements (SLAs) promise incredibly high levels of uptime. But technological advances mean that availability should be a given – it’s performance that needs to be guaranteed. And by that, I mean both performance of applications and the infrastructure they run on. There’s an appetite for it, and the first public cloud providers to offer performance-based SLAs will be setting the bar for the rest. Refocusing on performance is not just a game changer for cloud service providers, it’s also transforming infrastructure management. IT teams are identifying business-critical applications and taking time to understand how each element fits into the IT infrastructure. With this knowledge, it’s much easier to make informed decisions about where each application should run – whether it would perform better in a public or private cloud, or whether it makes more sense for it to sit in a physical data centre onsite.  CIOs are starting to take control: moving applications between different infrastructures only when it’s needed. This insight and planning will mean that applications aren’t dragged from the cloud because their performance is impacted by another component. Instead, CIOs and IT teams will be able to determine how an application will fare in that environment before it’s implemented so that each one will be in the best-suited infrastructure from the get-go. And the flow from the public cloud will be controlled.   ### VOSS Awarded Skype Operations Framework Featured Partner Status VOSS Solutions, a Microsoft Certified Partner, announced that the company is now recognized as a Skype Operations Framework Featured Partner. Developed by Microsoft, the Skype Operations Framework (SOF) is a comprehensive guide that empowers organizations to successfully and cost-effectively deploy Skype for Business services in the cloud. Following a proven methodology, SOF equips organizations with the right assets and tools to ensure a successful Skype for Business experience. [easy-tweet tweet="VOSS has been listed as a SOF featured partner for its ability to carry out complex UC migrations..." hashtags="VOSS, IT"] SOF featured partners are selected by Microsoft for their product and services offerings that help organizations maximize the value from their Skype for Business Online investment.  The designation reflects the fact that VOSS has worked with Microsoft to offer solutions that magnify the benefits of moving to Microsoft Office 365 for UC.  In addition, it validates that VOSS employees are certified in administering the framework. VOSS has been listed as a SOF featured partner for its ability to carry out complex UC migrations, and manage Skype for Business Cloud PBX & PSTN services from a single pane of glass. The VOSS Skype for Business Voice and Meetings in the Cloud Offer aligns with SOF to minimize UC complexities and increase operational efficiency in the cloud. Organisations face challenges at every stage of the Skype Operations Framework, and VOSS provides tools and expertise specifically designed to meet those challenges. 
The VOSS Offer includes Skype for Business Online, Meetings, and PSTN calling with Cloud PBX, wrapped in VOSS management tools, training, and 24x7 support to ensure customer success. Features include: expert guidance to help plan and execute the ideal solution at a pace that suits the organisation; reduced implementation and annual operational costs, and improved end-user adoption; simplified rollout and ongoing management with the world’s most advanced, and most secure, UC management solution; flexibility to manage a multi-vendor or hybrid environment through a single interface, as needed; and a future-proofed UC solution ready for the addition of any new application, device or service as the need arises. Christopher Martini, VP Skype for Business at VOSS, commented: "This is a great milestone for VOSS and our customers, as we strive to become one of Microsoft's most valued partners. We are excited to be listed as a SOF featured partner, as it reflects our work with an increasing number of global organizations to optimize their Microsoft UC environments. The guidance and resources provided as part of the framework really do enable us to better serve our customers." Through August and September, VOSS is hosting a 6-part webinar series for organizations interested in migrating to – and managing – Microsoft UC. To find out more about the series, and to register, click here: https://webinars.on24.com/voss/microsoftuc ### Why Salesforce’s New Platform Holds the Key to Making the Most of AI When my colleague Jason Guthrie last wrote about the introduction of Salesforce Lightning a year ago, he described it as the most significant update to the Salesforce user interface (UI) in the company’s history. For the uninitiated, Salesforce Lightning is the next-gen platform established to replace Salesforce Classic. Lightning has been introduced to deliver a more visual, responsive and intuitive customer relationship management (CRM) UI, and one which will take employee experience and customer engagement to the next level. Lightning includes: Lightning Experience, a set of modern user interfaces (UIs), notably for the Salesforce1 Mobile app and template-based communities; the Lightning Component Framework, a JavaScript framework for customisation of the Lightning Experience; Visual Building Tools, which enable drag-and-drop app-building and customisation; and an app store for partners’ Lightning components. At launch, Lightning could not replicate everything that Classic could do. So there was a chance that certain apps and capabilities users had become accustomed to on Classic might not have been available on Lightning. Whilst this should still be a consideration, there is a new element to the equation: ‘Einstein’, the innovative Artificial Intelligence capabilities now offered by Salesforce. Forward-thinking businesses looking for a competitive edge are evaluating, if not already deploying, AI technologies to provide Augmented Intelligence for their teams. To evaluate or deploy AI in the Salesforce world, it is essential to switch to the Lightning platform in order to use Einstein’s advanced capabilities.
Einstein allows users to leverage analysis of structured customer data to: Intelligently predict likely outcomes of customer journeys Inform staff decision-making and generate recommendations for next best action Automatically score and prioritise sales leads, and send notifications and alerts In other words, Einstein provides businesses with next-level insights from their CRM data, and helps marketers and sales teams to ‘augment’ their own knowledge with actionable business intelligence. The only way to gain access to such an obvious business advantage is to transition from Classic to Lightning. Those who have already made the leap are now one step ahead of their customers, positioned to satisfy customers by making timely recommendations, putting them in position to overtake their competitors. [easy-tweet tweet="Gaining tactical guidance from Einstein is not the only AI option that is available" hashtags="AI, Data"] However, gaining tactical guidance from Einstein is not the only AI option that is available. Earlier this year, Salesforce entered into a partnership agreement with IBM to combine Salesforce’s Einstein capabilities with IBM Watson, Big Blue’s AI, famous in the UK for its presence at Wimbledon’s annual tennis tournament and across the Atlantic for challenging human players at Jeopardy. Watson excels at making sense of unstructured data that isn’t necessarily within a business’ CRM database–drawing insights from text, video, images, emails, customer surveys, and social media, to name a few. Together, Einstein and Watson can ‘marry up’ structured and unstructured data, allowing users to provide their customers with more tailored, thoughtful and predictive experiences than ever thought possible. The two AI solutions combined create a holistic AI solution with unparalleled capability. Take the insurance industry for example. Combining Einstein and Watson, an insurance company using Lightning can now pull local forecast data from IBM Weather into Salesforce, and automatically send safety and policy information to customers who are at risk of being impacted by severe weather events.  So if a hailstorm is on its way, customers in the danger area will automatically receive a text message, email or automated call from their insurance company telling them to park their car under cover before the storm hits. The insurance company then saves a claim, the customer saves their car. Together, Watson and Einstein offer an insights engine for every industry. With this in mind, it’s time for those still using Classic to ask themselves whether their business can now afford to do without Lightning and miss out on having the option to tap the combined power of Einstein and Watson. Choosing to implement Lightning may cause some transitional disruption, but the long-term benefits quickly outweigh that. Is it worth risking falling behind industry peers for the sake of short-term consistency? At Bluewolf, we find it unsurprising that we are completing more Classic to Lightning migrations than ever before. A growing number of businesses are realising that, to maintain a business advantage over their industry rivals, the time to invest in truly intelligent business systems is now. We’ve recently completed Lightning projects for the likes of T-Mobile, Politico, and Australian Associated Press. And we’re leading by example. After many years of running our entire business on a customised Salesforce platform, Bluewolf made the transition to Salesforce Lightning. 
In less than four months, we went from prototype to launch, improving the user experience and efficiency of our team. Our Sales team no longer needs to navigate across multiple locations to access the information they need. They have fast, easy and integrated access to the information and insights, can collaborate seamlessly, and focus on prospects–not process–to drive the best customer experience. Whether they are using a desktop or a mobile device, Bluewolf sales reps are empowered with key, customised information in one central place. Ultimately, the shopping list of technology that a business needs to scale should, and does differ from company to company.  But if you use Salesforce it is crucial to take the time to understand when the most appropriate time for Lightning could arrive and implement accordingly. If there is something that’s certain though, it’s that Lightning is the future of Salesforce CRM, and it holds the key to making the most of AI. ### Comms Business Live Episode 6: The Great Telephony Debate In this episode of Comms Business Live the discussion is focussed on The Great Telephony Debate The host David Dungay is joined by Andrew Dickinson MD of Jola, Paul White MD of NTA and Clifford Norton MD of Channel Telecom. Don't forget about Channel Live 12th - 13th September. ### Next Generation Content Services Enable the Flexibility and Freedom of Hybrid Approach We need a fresh, modern approach to content management. It may seem like only yesterday that enterprise content management (ECM) systems arrived on the scene, but one-size-fits-all solutions are no longer adequate for today’s complex and hybrid infrastructures. However, the need for a quick fix has driven many enterprises to act tactically rather than strategically, resulting in complicated mixed infrastructures and the need to maintain many dissimilar, but overlapping platforms. This is the conclusion of a recent ASG-commissioned technology adoption profile study, “Today’s Enterprise Content Demands a Modern Approach” conducted by Forrester Consulting, which  surveyed 220 IT enterprise architecture and operations decision makers involved in content management. Its research has also shown that enterprises really want standardisation on a single ECM solution – but in reality, 93% are using multiple repositories to store content, with 31% using five or more systems to manage it. Looking at these statistics, it’s easy to understand why the Forrester Consulting study has come to the conclusion that a new approach is needed. We’ve got used to everyone talking about ‘ the data explosion’, but volumes of unstructured data in the form of business content such as office documents, presentations, spreadsheets and rich media have grown beyond expectations also. The majority of organisations (60%) are storing 100 terabytes (TB) or more of unstructured data; nearly one quarter (23%) have one petabyte (PB) or more of data.  Further, these volumes show no sign of decreasing; 82% of those polled reported an increase in unstructured data stored over the past 24 months with 50% saying volumes have increased by more than 10% over this time. With this content comes the ‘easy access versus security’ conundrum. The information is worthless unless employees (and often customers or partners too) can reach it quickly as they need it. Today, this means making content available via mobile as well as via traditional channels. 
Yet exchanging information with external parties and doing so remotely, outside a company’s firewall, can heighten regulatory and security concerns. [easy-tweet tweet="90% of organisations said they will be using cloud-based content management systems" hashtags="Cloud, Data"] Legacy systems remain a challenge Underpinning these concerns is another major worry for IT professionals with nearly three out of ten saying they are challenged by legacy systems. A quarter say their ability to move to the cloud is hampered by their existing infrastructure. Yet the next two years will be a transition period for enterprise content management deployment methods with monolithic ECM suites giving way to cloud-based platform. In the Forrester Consulting study, a decisive 90% of organisations said they will be using cloud-based content management systems, either as a primary or hybrid approach. Will the inflexibility of legacy systems prevent these enterprises from taking advantage? In reality most major enterprises are unable to make a ‘rip and replace’ move to the cloud. Too much has been invested in customising and maintaining legacy systems over the years. However, all enterprises are facing stiff competition from start-up companies that are entirely cloud-based and therefore have greater agility to respond to market demands and changes. For example, the US Insurance company Liberty Mutual decided that the greater flexibility and access offered by cloud delivery options made the change worthwhile. Choosing a next generation content services platform which accesses and manages content from anywhere on any device and also has the capacity to work in a hybrid environment, it was able to migrate while supporting both the old and the new. The platform became the archive content repository for formatted documents on Amazon Web Services (AWS). Because it supports open source, the platform allows Liberty Mutual to bridge from mainframe, multiple legacy repositories to AWS. It is deployed on AWS as a Platform-as-a-Service to capture process-driven content and on premise or in hybrid environments. This hybrid approach enables enterprises to manage content for regulatory compliance while still offering the benefits of the cloud, seamlessly integrating on-premise and cloud repositories for maximum flexibility, easier IT management and lower costs, achieving fast ROI on both legacy and IT systems. Content is the lifeblood of any organisation. When it is difficult to access, decision-making suffers. So whatever the patchwork of systems used, enterprises are seeking tools that enable them to access, view and use content from a single app, regardless of location. In addition, as cloud content services become pervasive, enterprises must still cope with the massive content stores that remain on premise. Any technology that helps bridge the cloud/on premise gap will help businesses gain value from all their IT including their legacy systems.   ### The Gig Economy Exposes Businesses to New Risks Cloud computing is establishing itself as an integral part of enterprise IT and security infrastructure, with growing challenges to business operations hastening its adoption. The allure is evident, from cost savings and speed of deployment, to flexibility and simplicity. This is especially true among those organisations dealing with distributed workforces and a high turnover of contract workers. 
As companies transition more of their IT operations to the cloud, we see a degree of rising concern about the challenges of creating flexibility while keeping control of users and data. In fact, in our recent Market Pulse Survey, 74 per cent of respondents cite their people – including regular employees, part-timers and the increasing number of contract workers – as the main exposure point for the organisation, the weak link in the chain. The rise of shadow IT (42 per cent) and poor password hygiene (24 per cent) are among the main contributing factors to this. The cloud is changing how we work – employees can access business applications in the cloud completely external to the enterprise network. Gartner estimates that by 2020, 90 per cent of enterprises will have a hybrid environment where on-premises applications and services co-exist with applications in the cloud. There are many benefits to this, including greater flexibility and agility for IT teams dealing with legacy systems. However, it’s also presenting new challenges. The sensitive data that users have access to now extends across both on-premises and cloud environments. Therefore, security must be centred on the user, regardless of application location. With this in mind, how would you go about managing identities in a distributed working environment? Identity in the cloud Identity enables organisations to securely adopt new cloud technologies while still having full visibility and control over who has access to what sensitive information. With more users coming and going, accessing critical systems and applications in order to do their jobs, there is more data to protect across a wider population of users than ever before. It is increasingly spread far and wide, and often outside the perimeter of the corporate firewall. So while the cloud enables scalability, it also creates additional security concerns that must be dealt with proactively. [easy-tweet tweet="Identity in the cloud is about more than just security." hashtags="Cloud, Security"] When identity governance is delivered from the cloud itself, it offers the crucial security, compliance and automation that organisations need while also offering all the deployment and operational benefits of a cloud-based solution. Identity in the cloud is about more than just security. The cloud has effectively become an agile service, offering businesses the flexibility to deal with distributed and centralised workforces at the same time. This is perfect for the growing trend of self-employment and gig work in the UK. More people are becoming self-employed, with around 15 per cent of UK workers today falling into this category. There are no real hard numbers on gig workers, but estimates put the figure at around 1.1 million. As a business evaluates moving critical business infrastructure such as identity management into the cloud, there are some factors to consider – in particular, how to evaluate and select the best governance solution for the organisation. Since IAM logically belongs at the centre of an organisation’s IT operations and security strategy, how it will affect the organisation as a whole, as well as its users, needs to be considered. Whether an organisation is implementing identity governance for the first time or transitioning from an on-premises IAM solution, it’s important to take a holistic approach to business requirements and how they are addressed.
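A practical building block behind this kind of governance is automating joiner and leaver events, so a contractor's access exists only for the life of the engagement. Below is a minimal sketch against the vendor-neutral SCIM 2.0 user API; the identity-provider URL and token are assumptions, not a reference to any product mentioned here.

```python
import requests

SCIM_BASE = "https://idp.example.com/scim/v2"          # hypothetical identity provider endpoint
HEADERS = {"Authorization": "Bearer <access-token>"}   # assumed credential

def provision_contractor(user_name: str) -> str:
    """Create an account for a gig worker via the standard SCIM /Users resource."""
    payload = {
        "schemas": ["urn:ietf:params:scim:schemas:core:2.0:User"],
        "userName": user_name,
        "active": True,
    }
    resp = requests.post(f"{SCIM_BASE}/Users", json=payload, headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["id"]

def deprovision(user_id: str) -> None:
    """Disable access the moment the engagement ends."""
    patch = {
        "schemas": ["urn:ietf:params:scim:api:messages:2.0:PatchOp"],
        "Operations": [{"op": "replace", "path": "active", "value": False}],
    }
    requests.patch(f"{SCIM_BASE}/Users/{user_id}", json=patch, headers=HEADERS).raise_for_status()
```

Wiring calls like these to HR or contract-management events is what turns a governance policy into something that actually happens on a worker's first and last day.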
Common business drivers for identity governance include protecting the organisation from internal and external security threats, meeting regulatory compliance requirements, enabling the business with convenient access services and lowering operational costs. Building a cloud-based identity governance strategy In order to meet security, compliance and business enablement goals, organisations should put a comprehensive plan in place for how to deliver all the critical governance services required – from user provisioning to password management, access certifications and access request. Solutions that provide a broad set of capabilities on a single platform should be considered front and foremost. It’s also important to consider how well a solution integrates with complementary solutions such as helpdesk, HR systems, and privileged account systems. This helps avoid integration problems. Taking an all-encompassing approach from the outset will ensure organisations don’t face any underlying issues later on. Today’s enterprises are increasingly becoming cloud enterprises. Combining the power of identity with the cloud, an organisation can ensure safer, more efficient and better-protected data and users – whether they’re working for a day, a month or a year. What’s more, these enterprises are free to do what they set out to do in the first place: improve the organisation. Whether it’s gaining a competitive advantage, chasing new growth opportunities, or providing a better customer experience. The empowerment organisations gain with identity governance is what allows them to be confident, fearless and unstoppable. Put simply, identity provides the power to make the cloud enterprise secure. ### #Uptime with Cogeco Peer 1 - Episode 1 - Tweets from the show! #Uptime with Cogeco Peer 1 - Episode 1 ### Are IT Teams Equipped for the Mobile Computing Revolution? "Businesses must adapt to the rise in smartphone computing or face complex device management issues," says Mathivanan Venkatachalam, director of product management at ManageEngine. There’s a rebellion taking place in organisations all over the world. Laptops may be getting smaller and sleeker, but smartphones and tablets are becoming a more powerful tool for employees, providing a compact computing solution. Recent trends in device usage have certainly tilted toward the use of touchscreen devices over traditional desktop computers. In some cases, organisations are funding this revolution, arming their employees with brand-new business-owned smartphones and tablets that enable workers to stay constantly connected to email accounts, shared servers, CRM systems, and other resources. Research firm Statcounter recently reported that mobile web usage has overtaken desktop computer web usage for the first time ever, highlighting the shift towards mobile devices over traditional laptops and PCs. Similar research from comScore reveals that 20 percent of millennials no longer use laptops or PCs, suggesting that traditional computer usage will decline further as younger generations opt for the convenience of mobile devices. Further analysis from We Are Social predicts that by the end of this year, there will be approximately 5 billion smartphone users globally. The connected world enables employees to access corporate resources from any place, at any time. The nine-to-five work day is no more because administrative tasks can be managed quickly and easily whenever employees have their devices in their hands. 
This in turn frees up more time within the work day so employees can focus on higher priority responsibilities. [easy-tweet tweet="BYOD is the practice of employees using their own personal smart devices for work activities" hashtags="BYOD, IT"] Organisations that haven't yet recognised the benefits of providing employees with smart devices—or that simply don’t have the budget to do so—are no doubt beginning to experience bring your own device (BYOD) culture. BYOD is the practice of employees using their own personal smart devices for work activities, allowing them to complete tasks remotely or on the move. IT admins: Agents of change There are clear benefits to this workplace revolution, but IT departments must adapt if their business is to benefit from the increased productivity that BYOD brings. If IT teams are to become the agents of change by facilitating a smooth transition to a smarter, mobile-friendly organisation, they must be provided with the right tools for the job. With the increased organisational benefits of mobile computing comes a whole new set of problems for IT teams, who must adapt to the challenges of managing the IT capabilities of a mobile workforce. Common issues such as a user struggling to locate a downloaded file on their smart device, or encountering complications streaming a presentation to a TV screen, can result in lengthy and often fruitless calls to the IT help desk. The IT admin on the other end of the line will typically attempt to assist with blind instructions, and this can be a complicated and time-consuming process. It only takes one party with poor communication skills to add to the frustration. An IT admin that uses too much jargon or an employee that can't articulate their issue or explain what they’re seeing on their screen can result in wasted time and effort for both parties. If the issue is unresolved due to the IT help desk being in another location, employees will often turn to other departments whose team members are likely to be technically proficient. Marketing departments and digital teams are known for using smart devices to manage social media accounts, and these teams often become substitute IT admins when the help desk is unavailable. Relying on these adopted IT teams to regularly troubleshoot issues creates unnecessary stress and strain over issues which should be managed by the real IT help desk. Direct results through remote management Enterprise mobility management (EMM) solutions, such as Mobile Device Manager Plus,  enable IT admins to successfully manage and monitor mobile devices, wireless networks, and other mobile computing services across their organisations. Any organisation that strives to reap the benefits of mobile computing must acknowledge the value in adopting an EMM solution or face a rise in unresolved IT issues across the business as a whole. Vendors developing these software solutions have been quick to recognise the potential fallout from poor communication between dispersed IT teams and individuals experiencing issues with their mobile devices or the networks that they connect to. The solution is integrated remote management, developed to simplify the entire process of troubleshooting remote devices by enabling IT admins to access and control employees’ remote devices from their own computers or devices. Available in both cloud and on-premises editions, remote control can assist help desk technicians in resolving tickets much faster because they can access employees’ devices and troubleshoot issues directly. 
When faced with device difficulties, employees needn’t waste valuable time trying to resolve the issue on their own. Instead, with the addition of remote troubleshooting, administrators can access employees’ mobile devices remotely at any time and provide support on the go. Admins can take complete control of a user’s device to view its current condition and provide support. Admins can view the device screen in the same way they can share the screen of a desktop. Best of all, user privacy is protected: users grant permission before admins take control of their device. The mobile computing revolution may be well underway, but if businesses are to benefit from greater productivity through increased convenience, they must ensure their IT teams have sufficient management solutions in place. ### Up Time Episode 1: How has technology changed and how is it changing? Disruptive Live presents the first episode of UP TIME in association with Cogeco Peer 1! This week's first episode will focus on the topic of change: how has technology changed, and how is it continuing to change? The panel will be discussing technology and trends which have historically led to big things, and future trends – where are we going? This week sees panel host Susan Bowen from Cogeco Peer 1 joined by her panel of Lisa Lavis from Glow, Richard Heyns from Brytlyt and Tom Adams from Cogeco Peer 1. ### Fintech Offerings are Challenging the Traditional Banking System After a sustained period of investment in financial technologies and services, the last decade has seen an explosion of disruptive fintech offerings that are challenging the traditional banking system. The boom in consumer access and choice, along with the challenging of the status quo in currencies, transactions and financial systems, creates an exciting environment for those involved in or around the fintech industry. However, recent trends and reports suggest a shift in where the investment and interest are being directed. Most indicators are showing a decrease in investment in saturated consumer fintech, with a large shift towards investment in backend technologies and services that have the means to reduce costs and deliver improved services across financial services. As with most industries, the drive to automate processes and build tighter integration between systems is helping to deliver major efficiencies. In most cases, user (human) interaction is not required. This means that systems (especially backend systems) need to be modernised and updated – and a key requirement is fixing flaws in the existing systems. This provides opportunities as the emphasis switches from shiny new consumer-facing products to the critical, yet somewhat less glamorous, backend services infrastructure. Let’s quickly define what we mean by the backend. I would describe the backend as the data and technology repository of everything that makes a bank’s web presence and mobile apps run. In many cases, all the information required to run a system is stored on remote or even cloud-based servers. For larger enterprises, backend systems include a much broader scope of information and systems. In all cases, the information stored in backend systems will be critical, sensitive and vital to a company and its customers. The data housed in the backend can be expansive and extremely sensitive.
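To give a code-level feel for what that backend repository looks like in practice, here is a hedged sketch that assembles the single customer view a bank's web or mobile front end would consume, pulling one part from an on-premise store and another from a cloud-hosted API. Every table, endpoint and field name is an illustrative assumption.

```python
import sqlite3
import requests

# On-premise source: a local relational store (sqlite stands in for it here).
db = sqlite3.connect("core_banking.db")
customers = db.execute("SELECT id, name FROM customers").fetchall()

# Cloud source: balances exposed by a (hypothetical) REST API.
API_BASE = "https://api.example-bank.com/v1"     # assumed endpoint
HEADERS = {"Authorization": "Bearer <token>"}    # assumed credential

def fetch_balance(customer_id: int) -> float:
    resp = requests.get(f"{API_BASE}/accounts/{customer_id}/balance", headers=HEADERS)
    resp.raise_for_status()
    return resp.json()["balance"]

# Merge both sources into the consolidated record a front end, report or AI model consumes.
unified = [
    {"id": cid, "name": name, "balance": fetch_balance(cid)}
    for cid, name in customers
]
print(unified)
```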
[easy-tweet tweet="The ultimate goal of backend systems is to improve access to a company’s data" hashtags="Data, Fintech"] What are the benefits of improving backend systems? The benefits usually manifest themselves in three categories: Efficiency: The ultimate goal of backend systems is to improve access to a company’s data -automating the availability, analysis and insight provided. This could be sharing information with customers in the best format and on the device they want to consume it on, or, providing rich, integrated information to a decision maker based on all available data. Removing systems and people from critical pieces of the process reduces errors and increases speed. Both of these lead to a natural reduction in cost and a greater availability of opportunities. Improved decision making: Again, better data, availability and insight drive better decisions. Imagine if you can go to one single place and find all information about an individual or company. The more information you have available, the better your capacity to make a good decision. Add this to real-time information, and things start to change really. Future: Only by having better integration and availability of data can you then think about using new technology such as AI, machine learning or predictive analytics. This can’t be done without significant improvements in backend systems and the integration of the data available to them. On top of this, by modernising your infrastructure, you can begin to think about leveraging other technologies and services that would have previously been out of reach. Plug and play infrastructure and services create an enormous opportunity, but they also carry some risks. If everyone has access to the same technology and can leverage the same services with relative ease, they can also provide the same or similar experiences to their customers with very little to differentiate the experience. How do you add value when everyone has the same information? When picking a backend provider or set of providers, it’s important that you take the time to do proper due diligence. Investigating the technology’s capabilities and ensuring they meet your business requirements is critical – it’s vital the technology and the access to services are both at the required standard. With most backend technologies, you will need some level of hand holding while you work out the best and most efficient way to access and utilise the APIs provided, so make sure you pick a vendor that will work with you, and that understands how to ‘play nice’ with other third parties. It’s also important that you look into the vendor. How long have they been in business? Do they have good customer references? Get a feel for the people too. Are they honest, straightforward and trustworthy? A reputable account manager won’t sell you the wrong technology, no matter how close to their sales quota they are. Finally, a reputable organisation will have a great track record of responding to customer issues and feedback, working hard to resolve them as quickly as possible. Below is a list of things we recommend companies think about before considering a new technology vendor: Standards: Try and find a vendor that can work with data standards or standard ways of integrating, such as the payment service directive 2 (PSD2) in Europe, which paves the way towards a greater commitment to ‘open banking’ by other governments and regulators. 
- Infrastructure: Banks mostly run on mainframes used for overnight batch processing cycles. These can be decades old and have been adapted and changed over time to work against different business needs. The cost of maintaining this old infrastructure is consuming the lion's share of most banks' annual technology investments without returning any new value.
- Security: Security is more important now than ever before. It's vital that your vendor understands the security of their technology and can provide you with the relevant documentation. Backend tech can often be overlooked, but a report on cyber-attack statistics from June 2015 showed that most hackers (almost 60%) attack targets with the intent of accumulating data and making money. That same report showed that around 72% of known attacks occurred in the "backend".
- Data integration: It is rare that an organisation has all its financial data in a single database or data warehouse. Due to acquisitions, technology transitions and other business factors, data tends to get fractured and housed in many different locations. Any software solution must have a way to easily integrate with data sources that are both on-premise and housed in remote locations, including the cloud (which has become increasingly popular in recent years due to its flexibility and cost effectiveness).
- Data availability: There have been large investments and a lot of buzz around AI in the fintech space. To properly leverage AI, data must be available – and that can only be achieved once the pieces above are all in place.

Fintech startups are disrupting the marketplace by offering technology and services that challenge the traditional 'brick and mortar' financial institutions and their software providers. However, all fintech technology needs to be considered through the familiar long-term lens. Will the technology provide a better experience, will it provide a solid ROI over time, and will it put the organisation in a stronger financial position, opening up significant future opportunities? A positive response to all these questions is crucial for future success.

### When IoT Attacks

In the first example of an IoT device being used in a physical attack, two security researchers revealed at Black Hat 2017 how they could hack Internet-connected car washes to close entryway and exit doors, locking a vehicle and its occupants inside the wash chamber while causing mechanical arms to strike the vehicle. If the driver tried to escape, the attackers could repeatedly open and close the wash bay doors as the car attempted to exit, damaging the vehicle and potentially injuring its occupants. In the rush to bring IoT devices to market, manufacturers often give insufficient attention to the additional security exposures created when systems become increasingly connected. Connections mean more pathways and back doors that could be exploited by a hacker—especially when a system's own designers may not be aware that those pathways and back doors even exist, as is often the case with vulnerable open source components. In the case of the PDQ car wash, the automated systems run on Windows CE and have a built-in web server that lets technicians configure and monitor them over the Internet. Not all PDQ car washes are online, but the researchers found more than 150 that were.
Microsoft also no longer supports the version of WinCE used in the PDQ control system, meaning it might be possible to take control of the machinery by exploiting security vulnerabilities in the outdated operating system. Secure software is an ephemeral concept. What we think of as secure today can change overnight as new vulnerabilities are discovered and disclosed. As code ages, more vulnerabilities are likely to be disclosed. But the researchers found an easier back door to access the online PDQ system they broke into—the admin default password (would you believe "12345"?). That security lapse is a good reminder for everyone—from consumers to car wash owners—to practice cyber-hygiene and change default passwords as one of the first things they do when setting up a new system, especially one that will be joining the wild kingdom of the Internet of Things.

IoT and Medical Devices

Driving the IoT revolution is software, and that software is built on a core of open source components. A recent Forrester Research report acknowledges the widespread prevalence of open source in applications, citing that custom code now often comprises only 10 to 20 percent of any given commercial application. Black Duck On-Demand audits of commercial applications consistently find open source components in nearly 100 percent of the applications scanned. Open source use is pervasive across every industry vertical, making up an average 20 to nearly 30 percent of commercial applications in the Automotive and Financial Services industries and up to 46 percent in the Healthcare vertical. The thought of software vulnerabilities in pacemakers and other medical devices and systems is troubling. The same researchers who demonstrated the PDQ hack, Billy Rios and Jonathan Butts, had earlier turned their attention to pacemakers. They acquired hardware and supporting software for four different brands of pacemakers and looked for weaknesses in architecture and execution. One of the biggest issues noted in the paper they published earlier this year was one Black Duck sees time and again—unpatched software libraries.

|   | Vendor One | Vendor Two | Vendor Three | Vendor Four |
| --- | --- | --- | --- | --- |
| Number of identified third-party components | 201 | 47 | 77 | 21 |
| Number of vulnerable third-party components | 74 | 39 | 51 | 10 |
| Identified number of known vulnerabilities in third-party components | 2,354 | 3,715 | 1,954 | 642 |

(From: Security Evaluation of the Implantable Cardiac Device Ecosystem Architecture and Implementation Interdependencies, Billy Rios, Jonathan Butts, PhD; May 2017)

All four pacemakers the researchers examined contained open source components with vulnerabilities, and roughly 50 percent of all components included vulnerabilities. Worse, the pacemakers had an average of 50 vulnerabilities per vulnerable component and over 2,000 vulnerabilities per vendor. We don't know how old the devices and software were, but, since the equipment was purchased on eBay, we can assume they were not newer models. As I noted earlier, older code—whether proprietary or open source—is more likely to have had more vulnerabilities disclosed. Their paper also doesn't state if the researchers checked for software/firmware updates from the vendors prior to analysis. My guess is that they did not, but whether this would have made a real-world difference is arguable.
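To make the library-tracking problem concrete, here is a minimal, hypothetical sketch of how a team might cross-reference a third-party component inventory against known-vulnerable versions. It is not taken from the Rios/Butts paper or from any Black Duck tool; the component names, versions and advisory counts are invented, and a real audit would draw on a software bill of materials and a vulnerability database such as the NVD.

```python
# Hypothetical illustration: flag third-party components with known vulnerabilities.
# All inventory and advisory data below are invented for the example.

from typing import Dict, List, Tuple

# Component inventory discovered in a device's firmware (name -> version)
inventory: Dict[str, str] = {
    "openssl": "1.0.1e",
    "zlib": "1.2.8",
    "busybox": "1.20.0",
}

# Known-vulnerable versions and their advisory counts (illustrative only)
known_vulnerable: Dict[Tuple[str, str], int] = {
    ("openssl", "1.0.1e"): 12,
    ("busybox", "1.20.0"): 4,
}

def audit(components: Dict[str, str]) -> List[Tuple[str, str, int]]:
    """Return (name, version, advisory_count) for each vulnerable component."""
    findings = []
    for name, version in components.items():
        count = known_vulnerable.get((name, version), 0)
        if count:
            findings.append((name, version, count))
    return findings

if __name__ == "__main__":
    for name, version, count in audit(inventory):
        print(f"{name} {version}: {count} known vulnerabilities")
```

Even a toy check like this only works if the inventory is complete, which is exactly where vendors tend to fall short.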
Black Duck's own research indicates that vendors are typically not aware of all of the open source they use, since it can enter the code base in so many ways. On average, prior to having a Black Duck code scan, our customers were aware of less than half of the third-party libraries they use.

Adding Open Source Security to the Internet of Things

There are billions of reasons for IoT security – 20 billion IoT devices by 2020 in fact, according to Gartner. Billions more connected devices coming online in the next few years will create new security challenges. Security should be at the core of the design of all IoT devices—not an afterthought, or worse, reactive after the damage has been done. When there is a vulnerability to be found, the laws of statistics guarantee that someone will eventually find it. With over 3,600 new open source component vulnerabilities reported in 2016, the need for greater visibility into and control over the open source in IoT devices is clear, and detection and remediation of open source security vulnerabilities should be a high priority. IoT manufacturers need to adopt an approach to cyber security that addresses not only obvious exposures but also the vulnerabilities that may be embedded in application code. Any organisation planning to leverage IoT technology will need to examine their software ecosystem to account for open source identification and management and to ensure that the open source that may be in their IoT platform is not introducing hidden security vulnerabilities.

### Informatica Named as a Leader in Gartner's Magic Quadrant for Metadata Management Solutions

Informatica, the leader in enterprise cloud data management, positioned highest for "Ability to Execute" and farthest for "Completeness of Vision" for metadata management solutions.

Informatica, the enterprise cloud data management leader accelerating data-driven digital transformation, today announced that Gartner, Inc., a leading IT research and advisory firm, has positioned Informatica as a Leader in its 2017 Magic Quadrant for metadata management solutions report. Gartner has positioned Informatica highest on the 'ability to execute' axis for the second consecutive year, and farthest on the 'completeness of vision' axis this year. The complete report, including the quadrant graphic, was published on August 10, 2017, and is available at https://www.informatica.com/metadata-management-magic-quadrant.html?Source=News-Release

According to the Gartner report, "The metadata management solutions market continues to demand more innovation and better execution to address the needs of automation (machine learning and semantic enrichment by search capabilities), and combined cloud and on-premises deployments. The technology continues to expand in its capabilities and support for multiple use cases. A growing ecosystem of system integrators and independent software vendor partners supports the most popular metadata management solutions." The report notes that, "In harnessing data for business outcomes, data and analytics leaders must understand the flood of data in its multiple formats.
Data has been available in disparate repositories for decades, but in today's digital business environment organisations face new demands to access and use data across these repositories — by mapping relationships between the different data formats." "Providing customers with the ability to manage and leverage metadata across the enterprise has been part of Informatica's DNA from our inception," said Amit Walia, executive vice president and chief product officer, Informatica. "It is integral to the Informatica Intelligent Data Platform, which uses metadata-driven intelligence to increase data management productivity across all relevant data types, use cases and users as our customers pursue their digital transformation initiatives and projects including data governance and analytics. We believe that Gartner's positioning of Informatica as a leader in metadata management underscores our long track record of enabling customer success in these initiatives, and the new channels we continue to open for leveraging metadata in innovative and expanded ways." Data-driven companies require rich metadata management capabilities to maximise the value of all their data and engage in intelligent business disruption. Informatica is a pioneer and innovator in metadata management. At-scale metadata management is a pillar of the Informatica Intelligent Data Platform and its CLAIRE™ engine, a breakthrough metadata-driven artificial intelligence technology that dramatically accelerates data delivery and business data self-service. The company's industry-leading metadata management innovations, such as Enterprise Information Catalog, Informatica Axon data governance and Business Glossary, enable business and IT users to leverage enterprise-wide data discovery, end-to-end data governance, business context and full lineage to deliver a comprehensive and unified view of metadata, whether in on-premises, cloud or hybrid environments. This allows organisations to power their data-driven digital transformation to deliver next-gen analytics, differentiated customer engagement, cloud/application modernisation, holistic data governance, compliance, data security and other high-value initiatives.

### Securing your Network in the Upside Down World of SDx

The smash-hit Netflix series 'Stranger Things' centres around the mysterious disappearance of residents of Hawkins, Indiana. Those who vanish find themselves in a frightening, parallel nether-world called the 'Upside Down', where things are not exactly as they seem. For some organisations, moving from physical hardware-based networks to SDx public or private clouds can feel similarly alien. While the familiar, conventional network construct still exists, the security infrastructure has disappeared, since there's no physical infrastructure to get to grips with. So what do they do next?

What makes SDx strange?

As we know, SDx stands for software-defined infrastructure. That is, rather than being structured around physical hardware – routers, switches, cabling and so on – SDx environments consist of virtual machines and networks, which may be wholly owned and managed by the business, in the case of private clouds, or wholly owned and managed by an external cloud provider, in the case of public clouds. Hybrid cloud models offer a combination of the two.
In a software-defined environment, new applications can be created, or existing ones moved to new locations, almost instantaneously. And it is this elastic scalability that makes SDx environments so different from their hardware-based forebears. These infrastructures are provisioned and changed, flex and contract on demand – potentially hundreds of times in a single day. Automation is the key feature that makes SDx environments so attractive and ultimately possible. Automation covers everything from expanding or defining new networks, storage and servers to – crucially – deploying security. Collectively, automation of these functions shifts the data centre to being application-focused, rather than hardware-focused. So, when it comes to the key features of this Upside Down world, we're talking about characteristics like elastic scale, dynamism and automation. What is the impact of these new network characteristics on security management?

Security management in the Upside Down

Before enterprise data centres were turned Upside Down by SDx, the deployment of a new application was a complex process involving numerous different parties and lots of time. Different personnel were responsible for installing server hardware and operating systems, for connecting new servers to the network, and for provisioning the necessary security equipment and policies. As such, deployment of a new application could take weeks or even months. The Upside Down features outlined earlier – elasticity, dynamic rates of change, and automation – have drastically sped up these processes. Pre-configured templates for frequently used services are even available to application owners, and they can provision applications across multiple data centres and cloud environments with a simple click in a self-service portal. In turn, this places huge pressure on security admins. Apps are created, moved or altered at lightning speed – which means that enforcement of security policies and visibility of security incidents also needs to move at lightning speed. But in practice, it often doesn't. In too many SDx environments, security management trails the ability to automate the provisioning of new infrastructure, because traditional security controls are fixed at the network perimeter. To keep up with the elastic dynamism inside the SDx environment, security management needs to move inside that environment too. But how? Security management in SDx environments, where physical hardware has disappeared and dynamic elasticity is the norm, depends on two key principles.

Dynamic automation

We've focused heavily on the rapid rates of change in SDx environments – how applications can be created and moved in seconds. Security provisioning and management absolutely must be equally dynamic if it's to be both effective and a business enabler, rather than a roadblock on day-to-day operations. Dynamism is achieved by close integration with the network virtualization and public infrastructure as a service (IaaS) solutions that underpin the SDx environment – whether Microsoft Azure, AWS, VMware NSX, OpenStack, Cisco ACI and so on. The objects defined by those solutions when provisioning applications – groups, tags, etc. – need to automatically feed into security management, so that any changes in the software-defined environment are automatically and immediately reflected in the security policies.
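As a purely illustrative sketch of that idea, consider policy keyed to workload tags rather than addresses. This is not tied to any particular vendor's API; the tag names, rules and workload details below are invented for the example.

```python
# Illustrative sketch: security policy resolved from workload tags, not IP addresses.
# Tags, rules and workloads are hypothetical.

from dataclasses import dataclass, field
from typing import Dict, List, Set

# Human-defined mapping from workload tags to firewall rule sets
POLICY_BY_TAG: Dict[str, List[str]] = {
    "web": ["allow tcp/443 from internet", "allow tcp/8080 to app"],
    "app": ["allow tcp/8080 from web", "allow tcp/5432 to db"],
    "db":  ["allow tcp/5432 from app"],
}

@dataclass
class Workload:
    name: str
    ip: str                      # changes as the workload moves; never referenced by policy
    tags: Set[str] = field(default_factory=set)

def effective_policy(workload: Workload) -> List[str]:
    """Resolve the rules that apply to a workload from its tags alone."""
    rules: List[str] = []
    for tag in sorted(workload.tags):
        rules.extend(POLICY_BY_TAG.get(tag, []))
    return rules

# When the SDx platform spins up or moves a VM, the same tags yield the same policy,
# so security follows the application automatically.
vm = Workload(name="shop-frontend-03", ip="10.2.7.41", tags={"web"})
print(effective_policy(vm))
```

The point of the sketch is simply that the policy follows the tag wherever the workload lands, which is what the next paragraph argues for.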
Security management should not be based on arbitrary IP addresses – it needs to speak the same language as the virtualization technology on which the SDx environment is built. Manual management of security policies, or human intervention to 'translate' between the languages of security and virtualization, is not a viable option in this brave new SDx world.

Consolidation

Security management in SDx environments needs to be holistic. IT security needs comprehensive, real-time visibility into the entire environment – and it needs this visibility from a single pane of glass. Security management consolidation is essential if security incidents are to be effectively identified, correlated and analysed across the various cloud networks that make up a typical SDx environment. SDx environments are complex, continually shifting and changing, and the teams who manage applications across them may not necessarily be able to visualize the security implications of the changes they make. What's more, physical hardware or legacy networks often remain in place alongside the virtualized environments. Collectively, this makes it remarkably difficult to track where data centre traffic is coming from and going to – and to ascertain how exposed the infrastructure is to threats and vulnerabilities. As such, effective security management in the SDx environment means having a unified solution that consolidates policy management, visibility and reporting across all physical, private and public cloud networks. It must be intuitive enough for all stakeholders to manage easily, scalable enough to handle security deployments wherever data goes, and analytical enough to offer detailed correlation of security events across the entire virtualized network. With this in place, security can be managed effectively in any environment – whether in the physical world or in the Upside Down.

### Young Women Hold 'Vote of No Confidence' Over Career in Tech

A survey of more than 1,000 university students conducted by KPMG and independent market research company High Fliers has identified a worrying crisis in confidence among young women with regards to their digital skills. The poll found that only 37 percent of young women are confident they have the tech skills needed by today's employers, compared with 57 percent of young men. This is despite scoring on a par with their male counterparts when assessed on digital skills such as data manipulation and use of social media. There is evidence that this lack of confidence could be putting many young women off applying for jobs: 73 per cent of female respondents said they have not considered a graduate job in technology. Commenting on the findings, Aidan Brennan, KPMG's head of digital transformation, said: "The issue here isn't around competency – far from it – but rather how businesses understand the underlying capability of an individual and how to unlock it. I think this research highlights the work that needs to be done to show the next generation that when it comes to a career in tech, gender isn't part of the equation. "Competition for jobs is tough, and we know that female job seekers can be less likely to apply for a role than their male counterparts if they don't feel they already possess every prerequisite the job demands.
Businesses committed to building a truly diverse workforce need to adapt their recruitment processes to reflect this, and ensure they don't fall into the trap of listening only to those who shout about their capability loudest." Graduate trainee Mary Smith, who studied history and politics at university and recently joined KPMG's tech consulting graduate programme, agrees: "If you look at the subject I studied at university, you might wonder how my background makes me a good fit for a tech career at a professional services firm. KPMG saw something in me that at the time I may not have seen in myself. Now I am in the role, it is clear that the skills I already possessed are very much transferable to the job I am doing. I would encourage more young women not to be deterred by jobs which include an element of tech, and to instead have the confidence and belief in your capabilities to apply and succeed." Anna Purchas, interim head of people at KPMG in the UK, says that the firm is already taking action to tap into this pool of talent and target women who are digitally capable but may not yet be confident in their skills. "We recruit around 1,000 graduates each year through our graduate recruitment process, Launch Pad, and we are proud to have reached a 50/50 gender split amongst our graduate intake. However, to maintain this level of equality in an increasingly digital world, it's vital that more women like Mary have the confidence that their tech skills will be applicable for a role at a professional services firm like ours. "Earlier this year saw the successful launch of IT's Her Future, our initiative aimed at encouraging more women to consider a career in tech. This summer we also launched Future Ready, our online tool designed to help young people who may not yet have experienced working in an office understand how the skills they do possess could be applicable in the workplace. "We are now open for applications for our tech-related graduate programmes. Undergraduate work experience schemes, such as Women in Technology, which is designed to help female applicants find out more about a career in technology at KPMG, will open later this year. "As business leaders, we must take positive action to bust the myth that there is just one type of person for any one job, highlight the value of diversity, and boost the confidence of young people as they prepare for the world of work."

### The AI Revolution Has Arrived, Will Testers Suffer?

Artificial intelligence, once the preserve of science fiction, has gone mainstream. Whether it's Elon Musk feuding with Mark Zuckerberg, or Amazon Alexa fighting crime, scarcely a week passes without the media scaremongering about AI's effects or speculating on its future. Venture, corporate and seed investors aren't immune to the hype. There's been something of a gold rush to fund AI and machine learning solutions (already to the tune of $3.6 billion), and you don't have to look far to find examples of industries likely to get swept away by the nascent technology. Artificial intelligence is predicted to eliminate 6% of US jobs by the year 2021. The question now for many commentators is which industries face the chopping block, and how soon we should start worrying. It's not just low-level or menial jobs threatened by AI. Start-ups such as Oxford University spin-out DiffBlue have made headlines for using AI to revolutionise the way software developers work.
DiffBlue recently raised £17 million for its AI solution that automates coding tasks considered an inefficient use of a human developer's time. In the coding community, attention has turned to AI's ability to disrupt the 'testing' industry entirely. Are the days of manual testing soon coming to an end? We think it's unlikely. Although AI is already revolutionising the way software developers test complex chunks of code (and rightly so), it has a long way to go before it can accurately replicate aspects of user-based testing. The number of different scenarios associated with a human interacting with an app or website is close to limitless. AI doesn't currently have the capacity to successfully tackle the multifaceted and unpredictable variables associated with how, where and when humans use a particular app or online service. This is particularly true on mobile. An initially poor mobile experience will discourage 85% of customers from further use, studies have shown. Perfect user experience is more important than ever to consumers, and human-based testing remains the most assured route to securing it.

The human shopping experience

Consider using a taxi app to order a car to arrive at your home on a rainy day, only to find that your driver has driven to the café 500m down the road instead. Noticing this, you walk to the car, getting soaked in the process. The app's system registers a successful pickup. You, on the other hand, are far from happy. Or imagine you go to an electrical store to pick up a new laptop pre-ordered on the website. When you arrive at the store it takes ages to find the right one, and there aren't enough staff members on hand to assist you. At this point, some people would have given up and gone to a competitor. The detection of user frustration and its contributing factors is difficult unless you have resources on the ground to test for real-life scenarios like this. Also, companies have traditionally focused on small sample sizes of testers due to financial constraints.

The crowd testing solution

Crowd testing puts digital properties (such as mobile apps, websites, IoT and connected devices) into the hands of people that are representative of your customer demographic. With crowd testing, you can expand the sample sizes of testers while simultaneously focusing on the most relevant consumer demographics for the product. For example, if you require men aged between 18 and 34 in the South West of England, a crowd testing company can use its community of testers to get your product tested by the core demographic. This pool of testers is then further refined as it can be split into two groups: the vetted quality assurance (QA) professionals and the ordinary user with no QA background. The former allows experts to effectively identify any bugs or problems that might have been missed by your internal team, while the latter will allow you to explore how intuitive your product is. Being able to accurately test how your website or mobile app will function ahead of its release in a specific market gives you an invaluable opportunity to 'get it right' before your customers start to engage.

So is AI going to replace humans?

AI can identify errors and bugs in software with unparalleled precision. AI doesn't rely on food, water or sleep and, unlike a human tester, can undertake repetitive tasks 24/7.
As the technology continues to develop, it will increasingly become an integral part of the troubleshooting process. But this is not to say human testing will ever become obsolete, and businesses must not treat AI as a panacea for all potential problems. While AI might be ideally suited to spot errors in code, it still lacks the sophistication required to adequately reproduce a human's user experience. Crowd testing platforms allow real users to engage with digital products under real-world conditions in a way that artificial intelligence can't. By using AI, companies have the best chance to perfect their code and UX, removing any potentially harmful bugs. But in an increasingly competitive digital market where first impressions can make or break a product, companies must take a holistic approach to software testing, and I think we're going to see more and more companies using AI in conjunction with human-based testing. What's most clear is that first impressions are everything for the consumer, and no company can afford to skimp on the user experience.

### Why Businesses Should Back Up the Back Office

Across multiple industries, ageing computing systems are wreaking havoc and preventing businesses from reaching their full potential. It's no secret that businesses pour thousands of pounds each year into updating their websites, in a bid to make them as streamlined, user-friendly and as flashy as possible. But as they increasingly neglect the back office, they open themselves up to a multitude of serious issues – issues that could easily be avoided if their technology were up to date in the first place. The media is filled with examples of business system failures that often result in long-lasting damage to the credibility of the companies involved. Take British Airways, which earlier this year suffered a system malfunction that left 75,000 passengers grounded at the start of the May half-term holidays. The problem – a power failure caused by human error – we now know cost them £80 million and cast a shadow over BA's standing as a leading European airline. These issues translate into the financial services sector too. When The Royal Bank of Scotland lost 600,000 customer payment records and direct debits in an IT crash in 2015, it was a 'technical glitch' that was blamed. Meanwhile, companies like these have state-of-the-art websites that enable their customers to streamline their search criteria and book a flight in under a minute. If all of their departments were given the same tools and investment, it's possible that their front and back offices would complement each other. It's easy to point the finger at these businesses from the outside in, but the reality is that many firms struggle to optimise past investment or are slow to upgrade their systems. When faced with a choice, website and marketing development often takes priority because there is less business risk attached. However, it's important to remember that a business is built on its infrastructure, so an effective back office will ultimately enable your employees to execute more effectively, which has knock-on benefits for the customer experience. With the world on the cusp of the Fourth Industrial Revolution, 2017 will be the year when this dichotomy comes to the fore. Digital capabilities are no longer the sole province of Silicon Valley enthusiasts; understanding and harnessing digital technology in the business should be a necessity for all – especially given how competitive most industries are now.
Challenges in the digital age

While the digital age opens us up to a universe of opportunities, it brings fresh challenges to the playing field. For many businesses, overhauling legacy methods is a costly and daunting procedure. It requires additional training for employees and re-working of systems, which takes time. A report by Deloitte concludes that 48 per cent of businesses ranked performance as the biggest business priority. Given how high this figure is, more should be done to ensure employees are onboarded with adequate training. Budgeting is a key problem. Many businesses, especially smaller SMEs, can't afford to fund both front and back office re-development. State-of-the-art websites are still seen as the priority. However, a report by PwC claims that office digitisation is "rapidly transforming how businesses interact with their customers". For example, sales representatives are now more mobile, as they have access to huge amounts of data through new office systems, which they can use to manage invoicing and customer information. Ensuring that the back office is forward-facing makes for intelligently informed employees, which ultimately leads to an increase in revenue. Perhaps the most talked-about challenge that the digital age brings with it is increased cyber security risk. It seems that not a day goes by when our news feeds aren't inundated with security breach stories, and the recent WannaCry ransomware attack is still a prevalent topic in the media. Outdated and faulty servers can mean that back office systems are more likely to be penetrated by viruses – which can cost companies fortunes. These operational problems, such as lost records, can cause serious legal complications for businesses that hold sensitive information. And with the upcoming introduction of the EU General Data Protection Regulation (GDPR), it's now more important than ever that office systems are watertight. Technology that complies with ISO 27001 standards will ensure that these challenges are kept at bay. Aligning all aspects of a business to be digital first needs to be a priority. Yes, businesses will initially have to invest more in these systems, but the change will pay dividends in years to come. Onboarding employees will result in a better-equipped workforce and one that can make informed decisions. Given how common cyber breaches are, impermeable systems will ensure that sensitive data is kept under lock and key. When all servers, networks, storage systems and payment methods are secure, then we know the walls will stay standing through whatever challenges our digital future brings.

### Unleashing Fintech Performance with In-Memory Computing

It's no surprise that the financial technology (fintech) market, which grew from $1.8 billion in 2010 to more than $19 billion in 2015, is soaring. Financial institutions are increasingly dealing with growing volumes of data, higher transaction volumes, and increasing regulations. The more financial institutions can apply advanced third-party solutions to their most pressing real-time data challenges, the better. With so much riding on high performance, how do fintech companies ensure that their applications are up to the task?
For Misys, a financial services software provider with more than 2,000 clients, including 48 of the world's 50 largest banks and 12 of the top 20 asset managers, the answer was in-memory computing. The infrastructure demands on Misys were tremendous as they accessed huge amounts of trading and accounting data to drive high-speed transactions and real-time reporting. Processing bottlenecks were limiting the company's ability to scale and launch new services. An in-memory computing platform delivered the speed, scale and high availability the company needed for a modern fintech architecture. In-memory computing is not a new idea. Until recently, however, the cost of RAM was too high to implement large-scale in-memory computing infrastructures for any but very high-value, time-sensitive applications such as credit card fraud detection or high-speed trading. The cost of memory has been dropping around 30 percent per year for many years, and today it is just a little more expensive than disk. This means that in-memory computing is now cost-effective for a broad range of applications that require high levels of performance. By keeping data in RAM, in-memory computing eliminates the delay when retrieving data from disk prior to processing. When in-memory computing is implemented on a distributed computing platform, further performance gains can be realised from parallel processing across the cluster nodes. The total performance gains from in-memory computing can be 1,000x or more.

Industry Impact of In-Memory Computing

In-memory computing currently supports data-intensive applications in many industries. For example, financial services firms face tremendous challenges from the tsunami of data created by 24-hour mobile banking. At the same time, financial regulators continue to add new reporting requirements, so firms must be able to increase the speed and accuracy of calculations when changes occur in currency exchange rates, interest rates, commodity prices and stock prices. Similarly, in the hospitality sector, online travel booking sites are looking to migrate from legacy technology to a modern platform that will enable them to scale cost-effectively. They face 24x7 web-scale traffic while retrieving availability and pricing information from a wide variety of sources, which must be processed and delivered to visitors in real time. Web-scale sports betting solutions face the same scaling challenges. They are faced with a growing number of users, a growing number of sports, more real-time betting scenarios, and increasing regulations and oversight. In all these cases, a scalable in-memory computing platform offers lightning-fast parallel processing across a cluster of commodity servers. This allows companies to plan for growth and scale more cost-effectively. Moving forward, in-memory computing will be critical to supporting a wide variety of industries including online business services, the Internet of Things, telecom, healthcare, ecommerce and much more, while addressing use cases such as digital transformation, web-scale applications, and hybrid transactional/analytical processing (HTAP).

In-Memory Computing Platforms – A Look Inside

To satisfy the requirements for extreme speed and scale with high availability for a variety of different applications, fintech companies use the latest generation of in-memory computing platforms.
These typically employ the following in-memory computing components:

- In-memory data grid – a memory cache inserted between the application and database layers of new or existing applications. The cache is automatically replicated and partitioned across multiple nodes, and scalability is achieved simply by adding nodes to the cluster.
- In-memory compute grid – distributed parallel processing for accelerating resource-intensive compute tasks.
- Distributed SQL – support for SQL queries running across the cluster nodes, including support for DDL and DML, horizontal scalability, fault tolerance, and ANSI SQL-99 compliance.
- In-memory service grid – provides control over services deployed on each cluster node and guarantees continuous availability of services in case of node failures.
- In-memory streaming and continuous event processing – customisable workflow capabilities for running and processing one-time or continuous queries, offering parallel indexing and streaming of distributed SQL queries for real-time analytics.
- In-memory Apache® Hadoop™ acceleration – an in-memory computing platform layered atop an existing Hadoop Distributed File System (HDFS) to cache the Hadoop data, with MapReduce run in the in-memory compute grid.
- Persistent store – the ability to keep the full dataset on disk, with only user-defined, time-sensitive data in memory, enabling an optimal trade-off between infrastructure costs and application performance by adjusting the amount of RAM used in the system.
- Broad set of integrations – native support for a wide range of third-party solutions such as Apache® Spark™, Apache® Cassandra™, Apache® Kafka™ and Tableau.

Conclusion

For fintech companies looking to cost-effectively deliver the speed and scale their clients demand, in-memory computing platforms are a proven solution, delivering speed, scale and high availability. The ability to easily scale out in-memory computing platforms means that fintech companies can expand their infrastructure as needed while ensuring optimal performance – even as they continue to innovate to create new, differentiated services.

### M2M IoT Solutions - Insight into IoT - Comms365's Nick Sacke

Compare the Cloud interviews Nick Sacke of Comms365. Nick talks about Comms365 and how they build and offer M2M IoT solutions and networks.

What does Internet of Things (IoT) mean? The Internet of Things, or IoT, is the collection and processing of data from a range of micro devices and conductors, including RFID, Bluetooth and micro sensors, that allow for intelligent M2M, or machine-to-machine, communication.

At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, events and workshops to enable you to stand out from the cloud crowd. Find out more about Comms365 at: https://www.comms365.com/

### Excelero's NVMesh Wins A Best of Show Award at Flash Memory Summit

Named Most Innovative Flash Memory Technology in the software-defined storage category for its shared NVMe at local performance, so enterprises can build hyperscale data centres with ease.

Excelero, a disruptor in software-defined block storage, announced that its NVMesh® server SAN product was named a 2017 Best of Show award winner at the Flash Memory Summit in San Jose last night.
Recognised as the Most Innovative Flash Memory Technology in the software-defined storage category, NVMesh allows enterprises to build hyperscale data centre architectures by delivering shared NVMe at local latency and performance. "The largest web-scale cloud providers have been enjoying hyperconverged architectures to achieve the massive scale and performance required to be successful in delivering IT solutions to the marketplace," said Jay Kramer, chairman of the awards programme and president of Network Storage Advisors Inc. "We are proud to select Excelero NVMesh® for the Best of Show Technology Innovation Award as it brings to the general commercial marketplace a new architecture and virtual SAN approach for shared NVMe that scales performance linearly at near 100% efficiency." The result of 10 patents and pending patents, NVMesh allows enterprises to run unmodified applications that enjoy the latency, throughput and IOPS of a local NVMe device while benefiting from centralised, redundant storage. Distributed NVMe storage resources are pooled with the ability to create arbitrary, dynamic block volumes usable by any host running the NVMesh block client. These virtual volumes can be striped, mirrored or both, while enjoying centralised management, monitoring and administration. "Because of limitations of NVMe flash, it's virtually impossible to level out utilisation across an entire infrastructure, which results in NVMe capacity and performance waste," said Lior Gal, CEO of Excelero. "We're proud to be recognised for the pioneering way that our NVMesh addresses a major limitation of NVMe, and helps enterprises achieve their flexibility, efficiency and scalability goals."

### IoT and Digital Transformation - Insight into IoT - Box's David Benjamin

Compare the Cloud interviews David Benjamin, Senior Vice President and General Manager of Box. David talks about Box and how they have discovered, through their customers, a link between IoT and digital transformation.

What does Internet of Things (IoT) mean? The Internet of Things, or IoT, is the collection and processing of data from a range of micro devices and conductors, including RFID, Bluetooth and micro sensors, that allow for intelligent M2M, or machine-to-machine, communication.

At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, events and workshops to enable you to stand out from the cloud crowd. Find out more about Box at: https://www.box.com/en-gb/home

### My Business has had a Data Breach, What Next?

Any type of data breach, whether due to an external hacking incident or an internal staff error, is a significant issue that needs immediate attention. A key aspect of the legal requirements surrounding a data breach is to demonstrate that your business or organisation takes the issue very seriously and is proactively seeking not only to protect any individuals who may be affected, but also to improve systems and processes quickly to prevent a similar issue occurring again. Communications following a data breach, both internally and externally, need to be carefully managed to convey these key messages effectively.
In the immediate aftermath of a breach, the most important thing to establish, as quickly as possible, is exactly what data has been compromised and the number of individuals affected. You need to focus on confirming exactly what has happened and how any risks created can be mitigated, prepare your statement, and reassure your customers and employees that you are in control of the situation. Knowing precisely what you are dealing with is key in the early stages to allow you to manage the next steps around communication. Whilst it is important to act without delay, don't feel that you need to rush to release information about a data breach incident before you have been able to verify it. Internally, communications need to take a structured approach to support a swift investigation and establish exactly what data has been compromised and to what extent. Under current laws, there is no mandatory requirement to notify the regulator, the Information Commissioner's Office (ICO), or the individuals affected. However, changes to the data protection laws, which will come into effect within the next 12 months, will require any business that experiences a data breach to report it to the ICO within 72 hours of becoming aware of it, and then to notify affected individuals if the breach is likely to impact on their rights and/or freedoms. In turn, this will mean that having a rapid response approach to breaches will become even more critical in the near future. Once you've determined which legal requirements you are required to fulfil regarding notifying the ICO and affected individuals, and whilst ensuring you are not disclosing any confidential information, key messages to be relayed publicly should be kept short and aim to include:

- any reassurances you can give regarding how serious the breach is
- general information you can give about what type of data is affected
- advice to individuals on how to prevent identity fraud which could occur as a result of using the information which may have been compromised

This information should only be issued in a manner which does not impact on any ongoing investigation into the incident itself or any attempts to further protect systems and data following the breach. However, if you are able to confirm that no payment-related data, or medical or health-related data, is involved, this can be a useful message to begin reassuring the public. You should also provide information regarding the communication that the affected individuals can expect from your business following the breach. Where possible, share security assurances such as confirming that you won't be contacting any of your employees or customers via email or phone asking for passwords or account details in the coming weeks. This will provide reassurance to your community; it shows that you care about their individual safety and that you are working towards a solution. If personal passwords have been compromised, sharing details of how users can change their passwords is also a good place to start. Finally, it's worth bearing in mind that it's not just the breach that needs your attention during the immediate incident response phase, but also the channels of communication you use to contact the affected individuals to educate and inform them about the situation.
It's important to think about how best you can ensure that any messages surrounding the data breach efficiently reach those who may be affected. In addition to a press statement, you should also consider issuing information to your customers and employees either via an email newsletter, by post, or even a banner and news article on your website homepage. This will ensure that the message reaches anyone affected as quickly and as transparently as possible.

### IoT Advantages for Business - Insight into IoT - HERE/FORTH's Paul Armstrong

Compare the Cloud interviews Paul Armstrong, founder of HERE/FORTH. Paul talks about HERE/FORTH and the applications of IoT for businesses.

What does Internet of Things (IoT) mean? The Internet of Things, or IoT, is the collection and processing of data from a range of micro devices and conductors, including RFID, Bluetooth and micro sensors, that allow for intelligent M2M, or machine-to-machine, communication.

At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. We use a strategic combination of business consultancy, content marketing, media series, events and workshops to enable you to stand out from the cloud crowd. Find out more about HERE/FORTH at: http://www.paularmstrong.net/hereforth/

### Data Protection and The Cloud

Company IT teams must take active, purposeful steps to prepare for the inevitable, whether the attack is on a single machine or multiple IT assets. The secret to recovering from a ransomware breach or any other disaster lies in preparation.
By looking at how the cloud can support better continuity planning, companies can make their recovery processes easier and more effective.

The importance of taking the right approach to data protection

Firstly, it's important to understand how that preparation can provide better results for both IT and the business. While data held on servers is often protected in some shape or form, it's how those servers are protected, and what systems are in place, that can create vulnerability gaps. For example, offsite data is usually handled using tape backups that are removed to offsite storage. There's a lot of overhead in that for the business. These offsite copies can be very useful in the event of a full site failure or major ransomware attack, but if you need access to this data quickly, it could still be days or weeks before it is available for recovery. Looking at the cloud for continuity can provide companies with the ability to meet faster recovery time objectives, as virtual machines can be spun up in the cloud within minutes, rather than having to be recovered to specific physical hardware or on-premises hypervisor platforms. Using cloud for disaster recovery can help deal with the growing amount of remote data that companies are creating too. The rise in the number of branch office locations normally requires a dramatic shift in protection strategy, particularly where ransomware is concerned. Typically, approaches to DR and backup across multiple locations depend on replicating data to DR sites and installing multiple backup appliances at each remote location and in the data centre. Alternatively, IT teams can try using non-purpose-built backup software that moves data over the wide area network (WAN), or use tapes to protect data on the servers themselves. This is traditionally a specialised approach that focuses on 'mission critical' applications, due to the cost involved. However, ransomware attacks are not fussy and will encrypt huge amounts of critical and non-critical data – potentially infecting the site DR and backup servers themselves. Cloud offers a new model for companies to cope with these kinds of attacks by opening the doors to protect a broader range of applications at a much lower price point, while also providing data isolation from the primary attack vector. Another problem is that these approaches rely on people with the right IT skills being on-site to ensure that data is getting replicated properly. For most remote offices, these skills are simply not available. Tape backups can fail, while using appliances adds additional hardware expense for each site. More importantly, each location would need to pay these costs upfront. When ransomware leads to critical files or machines being unavailable for periods of time, the costs of recovery can be considerable.

Improving data protection performance across multiple locations

Using the availability and flexibility offered by today's mature cloud vendors can help businesses deal with ransomware. Deploying in the cloud removes the infrastructure burden associated with maintaining a second site, as well as opening up the ability to protect multiple workloads on the same platform. Here, the power of the public cloud for backup and DR can provide an on-demand solution for immediate spin-up and failover in case of disaster, regardless of what platforms are being protected.
Moving to the cloud instead can help improve the protection for each remote or branch office location, while also avoiding some of the costs that typically come with extending server protection further across the business. By using public cloud platforms, IT can ensure that each location's IT assets are adequately protected while containing costs. Indeed, using cloud can help keep data protection costs in line with the amount of data that is created. By purchasing capacity on demand, rather than the traditional on-premise over-provisioning to meet future years' estimated data volumes and associated service levels, costs can be managed more effectively. For company IT teams thinking about how to protect the data on their servers against attacks like ransomware, the cloud can offer a better approach. As more data gets created in more places, protecting this data becomes critical over time. By using cloud-based services, companies can improve their ability to meet their recovery time objectives and protect their operations against ransomware attacks.

### Cutting Through the Hype of Automation

There's currently a lot of debate swirling around automation, AI and their future within the workplace. Broadly speaking, there are two prevailing schools of thought on the issue. On the one hand there are fears that increasing levels of automation will render a growing number of job roles unnecessary, leaving human workers redundant and sparking a worldwide employment crisis. The popular "robots stole my job" argument. On the other is a more hopeful vision, and one that's less likely to gain the same kind of traction in national media. This centres around automation enabling humans to focus on the tasks they're best at – the kinds of skills no automation can replace – shifting, rather than removing, job roles. Given I am in the employ of a company that specialises in the automation of financial processes, you'll be unsurprised which side of the fence I fall on. However, I do to some extent take issue with both of these viewpoints. They are, in my opinion, and to different extents, based on an unrealistic vision of not only what exactly automation entails, but how far along the technological journey we currently are. Robotic process automation (RPA) is neither robo-apocalypse nor automated utopia, and I'd like to separate fact from fiction and lay out what it really has to offer for modern business.

Defining RPA

The hype, both positive and negative, as I've alluded to above, stems primarily from a level of confusion around what actually constitutes RPA. This isn't surprising, as even within the industry itself a number of different definitions persist. Take Deloitte, for instance.
In its report, The Robots are Coming, it defines RPA as such: "RPA is a way to automate repetitive and often rules-based processes…Software, commonly known as a 'robot', is used to capture and interpret existing IT applications to enable transaction processing, data manipulation and communication across multiple IT systems…Multiple robots can be seen as a virtual workforce – a back-office processing centre but without the human resources" TechTarget, however, has a markedly different view of RPA: "Robotic process automation (RPA) is the use of software with artificial intelligence (AI) and machine learning capabilities to handle high-volume, repeatable tasks that previously required a human to perform" These definitions may appear very similar at first glance, but there are some key differences which feed into the unrealistic expectations around RPA that currently exist.

Hype vs. Reality

Let's start with the good news. Both the Deloitte and TechTarget definitions rightly identify that RPA is capable of performing repetitive tasks at high volume. This is at the core of BlackLine's offering, as our solutions automate fiddly and repetitive financial close processes which would normally be performed by finance staff. What appears in the Deloitte definition, but is lacking from TechTarget's, is another key tenet of RPA – its rules-based nature. In other words, and to use an IT truism: "garbage in, garbage out". Humans set the rules and the automation software performs them. If the rules are wrong, the automation processes are too. The software has no capacity to make a judgment call. It follows orders unquestioningly. It's the omission of this key tenet of RPA which, in my mind, opens the floodgates for hype. TechTarget's definition is perhaps the greatest offender here. AI and machine learning are separate technologies to RPA and, although not entirely unrelated, are comparatively in their infancy. AI and machine learning's ultimate goal is to augment human intelligence and, one day, make judgements based on previous experiences that could look like human interaction. However, we're a long way off this. You might think Deloitte's definition has come out of this analysis relatively clean. However, despite correctly noting the rules-based design of automation, it then goes on to describe "a virtual workforce". Given what I've laid out above, is the idea of a workforce that is unable to adapt to unforeseen circumstances, and will follow incorrect orders unquestioningly, an attractive one for businesses? I'd wager that it isn't.

The Business Challenge of Automation

It's important then for business leaders not to fall into the trap of believing automation's a panacea to cure all their ailments, as this hype often obfuscates the very real challenges that companies will face when embarking on their automation journey. If companies are not aware of these issues, it can leave them unable to scale their automation programmes to the degree they require, resulting in abandonment of implementations and wasted resources. As with any digital transformation project, unless you know what you want to achieve going in, all you'll achieve is added complexity. Automation is not a one-click fix, and requires changes to processes and the training of staff in how to manage and interact with these new ways of working. Business leaders need to recognise that automation is a slow build that needs to be driven by the unique needs of each business and its users. It's a step-by-step iterative process.
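To make the "humans set the rules, software applies them" point concrete, here is a deliberately simple, hypothetical sketch of a rules-based matching routine of the kind used in reconciliation automation. It is not BlackLine's implementation; the rule parameters, field names and sample transactions are all invented. The rule (match on reference, with an amount tolerance) is defined by a person; the software only applies it and escalates everything else for human review.

```python
# Hypothetical sketch of rules-based matching: the software applies human-defined
# rules and flags anything it cannot match. It makes no judgement calls of its own.

from typing import Dict, List, Tuple

# Human-defined rule parameter (get this wrong and every result is wrong too)
AMOUNT_TOLERANCE = 0.01  # currency units

def match_transactions(
    ledger: List[Dict], statement: List[Dict]
) -> Tuple[List[Tuple[Dict, Dict]], List[Dict]]:
    """Match ledger entries to statement lines on reference and amount."""
    matched, exceptions = [], []
    statement_by_ref = {line["ref"]: line for line in statement}
    for entry in ledger:
        candidate = statement_by_ref.get(entry["ref"])
        if candidate and abs(candidate["amount"] - entry["amount"]) <= AMOUNT_TOLERANCE:
            matched.append((entry, candidate))
        else:
            exceptions.append(entry)  # left for a human to investigate
    return matched, exceptions

ledger = [{"ref": "INV-1001", "amount": 250.00}, {"ref": "INV-1002", "amount": 99.95}]
statement = [{"ref": "INV-1001", "amount": 250.00}, {"ref": "INV-1003", "amount": 12.50}]

matched, exceptions = match_transactions(ledger, statement)
print(f"Matched: {len(matched)}, exceptions for review: {len(exceptions)}")
```

The exception queue is where the human judgement that RPA lacks still has to come in, which is why the end-to-end view discussed next matters.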
Businesses need to take an end-to-end view of the desired outcomes if any project is to prove successful. It’s not as attractive a prospect as a one-click fix, but it’s the reality of the situation. Realistic Goals If business leaders do go into an automation implementation with their eyes open, projects can provide huge benefits. Pepsico saw a 67% reduction in time spent on reviews using our financial controls and automation platform, whilst Cox Communications reallocated 52,000 man hours away from reconciliations and towards analysis. Results like these stem from realistic goals for projects, and a recognition that not all processes are ripe for automation, even if they might appear to be. As with the two examples above, these projects focus on rules-based automation of routine, low-risk activities. The ultimate goal for any business considering automation should be to empower their employees to work better, and focus on other tasks requiring human strengths such as emotional intelligence, reasoning and judgment. To quote Leslie Wilcocks, Professor of Technology, Work, and Globalisation, London School of Economics: “RPA takes the robot out of the human”. ### Kids predict the tech of the future, declaring that “technology is everything!” New research commissioned by Dell Boomi - and carried out by OnePoll - surveyed 1,000 12-15 years olds based in the UK to ask them what technologies they think they’ll be using when they enter the workplace in the next 10-15 years. The top five technologies identified were: Cars that drive themselves (66%) Robot assistants at work (47%) Artificial Intelligence (41%) Virtual offices (38%) Robot housekeepers (34%) What’s more, just over a quarter (26%) of youngsters think commercial space travel will be a thing by 2032, and over one in five (22%) think they’ll be using computers linked to their brains. In fact, over half (52%) of 12-15 year olds think they will be using technology that hasn’t even been invented when they go into the workplace in 10-15 years’ time. What does technology mean to you? Boomi also asked the children in the survey what technology means to them. Many of the responses centred on how technology ‘makes lives easier’ and ‘makes life better’, as well as how technology ‘assists people’ or ‘helps and entertains humans’. For others, technology was about the ‘new’ – such as ‘innovations’ and ‘gadgets’ that they can use to talk to friends or do homework, while some simply stated that technology is ‘the future’. The word cloud below highlights the most popular terms and here are just a few quotes from the 12-15 year olds themselves: “A better way of doing things” “Everything! It's in everything I do in my life” “It's new stuff. It means life will be easier and maybe better” “It's the future and it will do almost everything that humans can do and more” “Without it I would spiral into depression. It keeps me entertained” “A way of escaping reality and having fun with my friends on social media or texting" ### Embedded Devices - Insight into IoT - Canonical's Mike Bell Compare the Cloud interviews, Mike Bell, EVP Devices & IoT of Canonical. Mike talks about Canonical and how they are the platform for key innovators. He goes on to explain the challenges that developers have within the embedded and IoT spaces. What does Internet of Things (IoT) mean? 
The Internet of things or IoT is the collection and processing of data from a range of micro devices and conductors including RFID, Bluetooth and micro sensors that allow for intelligent M2M or machine to machine communication. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Canonical at: https://www.canonical.com/ Iot ### Overcoming Business App Development Challenges Demand for mobile business apps has never been higher. As smartphone and tablet adoption continues to grow rapidly, business owners are realising a mobile presence is a necessity. Apps can help businesses streamline their processes, better engage customers and reach new ones. Often the use of the cloud figures into this process. But companies have been deterred from developing business apps because of the complexity. The cost of development can be high, there are skills industry shortages, app development can be slow and resource intensive and getting apps to work across multiple device types and different operating systems is hard.  There’s also backend integration issues and the need to ensure apps are future proof and can evolve as business needs change. With these challenges, it is little wonder Gartner reports that more than a quarter of enterprises globally, have not built, customised or virtualised any mobile apps in the last 12 months. Let’s consider the top challenges: An industry-wide skills crisis There is a lack of experienced mobile developers in the industry, which is a problem for companies looking to increase their app development. App development also takes time – most IT professionals report that apps take between six months to a year to develop and roll out. [easy-tweet tweet="62% of companies reported spending more than $500,000 to create just 1 to 3 apps" hashtags="Apps, Mobile"] A costly process Analyst Forrester found that, 62 percent of companies reported spending more than $500,000 to create just one to three apps. Thirteen percent said that they spent $5 million or more to do so. Working with multiple platforms and devices Today’s business users work on numerous devices including iPads, iPhones, Android and Windows phones and tablets, and Windows, Linux and Apple desktops plus different operating systems across the devices, so apps need to work across these devices (whatever the operating system and whatever size they are) - this is a key development challenge. What type of apps are best for the business - Web, Hybrid or Native? Businesses need to decide on the best apps for their business - web apps, hybrid apps or full Visual Studio and Xamarin projects to deploy native apps? They must consider the development challenges and costs of each and if the apps will be future proof and adapt if business requirements change. Integration challenges Successful apps will integrate with multiple types of SQL databases, and maybe other database types such as MultiValue databases, and integrate and synchronise with back-end systems. Getting around the hurdles These challenges can be overcome and we’re seeing growing excitement about the potential of rapid app development tools that accelerate app development. 
Selecting the correct rapid app development tools can enable the development and deployment of web, hybrid and native apps more quickly and cost-effectively than traditional methods. If the tools are cloud-based, then authorised staff including management and the development team can really share the development process and review the app at various stages of its build. This makes it a truly iterative process, resulting in apps that genuinely reflect the business requirements. Using such tools, businesses can create apps for different device types and platforms and integrate with all SQL and other databases, while using a single design project and code base. Developers can also save time, as the tool should only require them to develop the design once and the generated app should then run on any type of mobile, tablet or desktop. These systems should offer multiple options for designing apps without any restrictions on where the data is held. Data from multiple databases should be able to be used and accessed in a single app, and the information combined for optimum use in the app design. The apps should integrate with back-end systems, including the immediate synchronisation of data between user devices and back-end databases and systems. The tool should offer all the features of a typical RAD environment, including menu creation, integration of device-specific features, styling and data management, and a WYSIWYG screen designer. These features enable businesses to quickly build user interfaces, define and manage data, and use the product’s business logic components to create apps quickly. With a simplified development process, there should be less need for additional developers. Instead, existing staff who really understand the business can be used. But importantly, such tools should provide the option for businesses to generate apps as web, hybrid or full, industry-standard Visual Studio and Xamarin projects, which can be deployed as native apps or form part of the IT teams’ onward development programme. When the business needs change or technology evolves, the apps should be able to be updated too – offering an unlimited growth path so businesses can become more competitive and profitable. App development doesn’t need to be challenging; businesses just need to select the right tools to overcome the hurdles. ### Anthony Burrisson from ACR IT Solutions - Dell EMC ChannelTalks James Hulse of Dell EMC speaks to Anthony Burrisson, Managing Director of ACR IT Solutions, on the latest Dell EMC #ChannelTalks series. ACR is an integration specialist, using VMware and Microsoft technologies to deliver solutions for customer needs. Find out more at: http://www.acr-its.com/ Dell EMC is a provider of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry. Channeltalks ### CIOs Require Greater Maturity to Tame the Cloud More and more enterprises are going ‘all in’ on cloud; indeed, analysts at IDC claim that IT spending will hit $44.2 billion this year, of which cloud will make up 46 per cent of businesses’ IT spend. An independent CIO report commissioned by Fruition Partners in 2015, which looked at the use of cloud computing, identified that CIOs had a range of serious concerns over cloud control. In 2016, the research was repeated and extended to cover 400 CIOs of large organisations in both the UK and the USA and across a range of industry sectors, and it showed that little had changed.
Maturity is the key to minimising risk The most concerning finding is that 80 per cent of CIOs admit they do not apply the same comprehensive IT service management processes to cloud as they do for in-house IT services. When compared to last year’s figures, the research finds that while there has been a slight improvement in overall cloud maturity, there is still a long way to go. In fact, more than 85 per cent of CIOs still say that the proliferation of public cloud computing services is reducing the control their organisation has over the IT services it uses. CIOs are highly concerned by the negative impacts of this trend, with three-quarters of them saying it leads to financial waste and increases the business and security risks to the organisation. In the light of these risks, CIOs cannot afford to ignore the fact that the need for rigorous management is actually greater, not less, in the cloud. Part of the solution to this is making better use of the IT Service Management (ITSM) tools that they already employ to manage in-house IT. The survey found that while in-house IT services are, on average, managed by a combination of six established ITSM processes, cloud-based services are, on average, only subject to four, and only a fifth of CIOs report that ITSM processes have been applied to all cloud services. Cloud sprawl is becoming easier to regulate Shadow IT is also a particular concern that is affecting CIOs’ control over enterprise IT: two-thirds of respondents believe that employees signing up to cloud storage services, CRM applications or collaboration applications are creating a culture of shadow IT within their organisation, with 62 per cent saying that there are cloud applications being used in the business without the IT function’s knowledge. There is a ray of light in these findings: the wider use of service catalogue functionality contained within ITSM solutions means that 71 per cent of those who use this kind of application say that it is helping them control cloud sprawl. By using a service catalogue, CIOs can help minimise shadow IT: if it’s easy and hassle-free for users to choose and use officially-sanctioned cloud services, then they are less likely to follow a DIY route. Orchestrating cloud platforms is key to good service Another concern is that the more responsibility CIOs hand over to third-party cloud providers, the more they open themselves up to the ‘blame game’ if one of those services fails. This is evidenced by the finding that more than half of respondents say that if users have problems with cloud applications, such as Salesforce or Dropbox, they deal with the cloud provider's support team directly. While this may take the support burden off corporate IT, when things go wrong it’s not clear who is responsible, particularly if SLAs are not correctly put in place or monitored. There is one encouraging sign regarding improving control over the cloud: last year, only 27% said they could and did use ITSM tools to orchestrate cloud platforms. This year, that figure has gone up to 40%. However, that still leaves 44% who have the tools but aren’t using them for orchestration, and 16% who don’t have the right tools in place at all. Orchestration is an essential part of the cloud management process, and CIOs should focus on putting this technology to work. [easy-tweet tweet="It’s clear that CIOs continue to face challenges from the cloud" hashtags="CIO, Cloud"] ITSM is key to taming the cloud It’s clear that CIOs continue to face challenges from the cloud.
 Although there are many opportunities that the cloud offers to reduce costs, improve productivity and introduce a ‘consumerised’ service experience, it’d be foolish for CIOs to thoughtlessly trust that public cloud services will work flawlessly and be delivered perfectly at all times. Similarly, they cannot ignore the risks that shadow IT can pose by creating an uncontrolled universe within an organisation’s IT infrastructure. Part of the solution is to use investments in ITSM that have already been made by using it to manage public cloud technology in the same way organisations are using it for in-house systems and private cloud. By offering employees functions such as self-service access to approved cloud services, or by using ITSM to manage assets and service providers, there is a much better chance that CIOs can start to tame the cloud and exploit its benefits without the risks. ### Healthcare - Insight into IoT - Synopsys's Adam Brown Compare the Cloud interviews, Adam Brown, Manager of Security Solutions of Synopsys. Adam talks about Synopsys and how they spoke to healthcare provider organisations and how medical devices had been breached due to issues towards the software that ran the IoT devices. What does Internet of Things (IoT) mean? The Internet of things or IoT is the collection and processing of data from a range of micro devices and conductors including RFID, Bluetooth and micro sensors that allow for intelligent M2M or machine to machine communication. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Synopsys at: https://www.synopsys.com/ IOt ### Answers to the Top 4 Questions About the SWIFT Customer Security Programme Attestation for the SWIFT Customer Security Programme (CSP) will start in July. That might seem like plenty of time to show that you can comply with the 16 mandatory controls (and hopefully the 11 advisory ones as well), but it’s coming up faster than you think. You’re going to need every second between now and then to craft a solid security strategy that proactively protects your organization against the ever increasing threat of fraud – a strategy that protects you well into the future, not just until the next breach happens. As you work through the security planning and implementation process, here are answers to the top 4 questions regarding the SWIFT CSP, along with some guidance on what you need to do to avoid future attacks. Just how serious are the cyber threats that banks face? Very. That may sound somewhat alarmist, but it’s just a realistic view of the ongoing state of the industry. Business email compromise scams have risen 2,370% since January 2015, draining international organizations of more than $5 billion since just 2013. A full 38% of organizations now admit that it’s difficult to tell the difference between a legitimate payment and a fraudulent one. Plus, in a November letter to banks worldwide, SWIFT themselves have said that “the threat is very persistent, adaptive and sophisticated – and it is here to stay.” This is not a drill. Banks are under attack and must take action now to not only meet SWIFT’s requirements, but to implement a comprehensive security plan that will future-proof their institutions against growing threats. 
Are internal or external threats the bigger danger? Threats can come from anywhere and they all represent a significant danger. While external threats can range from individual hackers to organised fraud rings or even state sponsored actors, fraud perpetrated by insider threats is no less insidious. In fact, some of the most infamous attacks to capture the headlines in the past 18 months are alleged to have been perpetrated by insiders. To effectively protect against both internal and external threats, it’s necessary to proactively monitor all user behavior, regardless of where it comes from. What has changed in the threat landscape to drive the need for new methods of security? Progress is a tremendous change agent. When Karl Benz drove the first car in 1886, seatbelts were hardly a necessity because the car’s maximum speed was 10 mph. [easy-tweet tweet="As payments have moved faster, so have cyber criminals" hashtags="#CyberSecurity #Fintech"] Payments have been the benefactor – and the victim – of a similar type of evolution. As payments have moved faster, so have cyber criminals. That, in essence, is the problem. The methods fraudsters now use to infiltrate payment systems and divert funds are evolving faster than anyone can keep pace with. The situation that has evolved with SWIFT customers is a perfect example. The $81 million heist in Bangladesh. $12 million in fraudulent transfers from Banco del Austro (BDA).  An attempted attack on Vietnam’s Tien Phong Commercial Joint Stock Bank. SWIFT was never the direct victim of an attack, but their customer’s local environments -- banks with insufficient security controls -- were an irresistible target and therefore an Achilles heel for the entire community. It was a perfect storm of circumstances that shined an international spotlight on the desperate need for all organization that handle payments to employ better security protections. The traditional log-file systems that are still widely used today aren’t the answer, either. They are of little to no protection because they only make organizations aware of a fraud incident after it’s happened. Once payment fraud has occurred, it’s next to impossible to recover the losses – most of the $81 million from the Bangladesh heist, for example, is still unaccounted for. To avoid financial losses and reputational damage, threat detection must happen in real time. Do all 27 of the SWIFT CSP controls need to be implemented in order to achieve maximum protection? You certainly need to implement the 16 mandatory controls. As for the 11 advisory ones, that’s a topic of much debate. Some organizations view them as optional, but we would strongly recommend that you assess their necessity for your own organization. Many of them are common sense measures that should be a part of a comprehensive security plan anyway, so it may just make sense to comply with them as well. Also worth considering is the fact that as the threat landscape evolves, controls that are currently advisory may end up becoming mandatory anyway. Ultimately, the CSP program is a positive step in the right direction for defining a strong baseline of security standards for the SWIFT community. You should look to build on it as a foundations for a broader security playbook designed to stop fraudulent payments before they happen. It would be a mistake to view having to comply with the CSP as a distraction from the real focus of your business. 
Instead, embrace it and use it as an opportunity to enhance your organization’s overall security procedures. This is a perfect chance to evaluate whether or not your security is up to the challenge of protecting your payments against modern threats. Fraudsters are using every tool and trick available to them. Are you? ### Paul Brown from Certus TG - Dell EMC ChannelTalks James Hulse of Dell EMC speaks to Paul Brown, CEO of Certus TG, on the latest Dell EMC #ChannelTalks series. Certus TG is an IT infrastructure partner that builds, supports and manages customers’ IT infrastructure. Find out more at: http://www.certustg.com/ Dell EMC is a provider of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry. ChannelTalks ### Confidence in Disaster Recovery Plans Falls for Second Consecutive Year Annual Data Health Check findings reveal a lack of confidence in DR plans as businesses respond to recent cyber threats and DR testing falls. New research from business continuity and disaster recovery service provider Databarracks has revealed that organisations are now less confident in their ability to recover from an incident. Contributing factors include a lack of testing, budgetary constraints and the growing cyber threat landscape. The findings are part of Databarracks' seventh Data Health Check report, released today. The survey questioned over 400 IT decision makers in the UK about their IT, security and continuity practices over the last year, and what they expect to change in the next 12 months. [easy-tweet tweet="Organisations are making changes to their cyber security policies in response to recent cyber threats" hashtags="DR, Security"] Key data from the survey includes: Almost 1-in-5 organisations surveyed (18 per cent) “had concerns” or were “not confident at all” in their disaster recovery plan; an increase from 11 per cent in 2015 and 15 per cent in 2016; Organisations are increasingly making changes to their cyber security policies in response to recent cyber threats (36 per cent this year, up from 33 per cent last year); Only a quarter (25 per cent) have seen their IT security budgets increase. Small businesses are particularly affected, with just 7 per cent seeing IT security budgets increase; Financial constraints (34 per cent), technology (24 per cent) and lack of time (22 per cent) are the top restrictions when trying to improve recovery speed; Fewer organisations have tested their disaster recovery plans over the past 12 months – 46 per cent of respondents had not tested in 2017, up from 42 per cent in 2016. Peter Groucutt, managing director of Databarracks, commented on the results: “It isn't surprising that confidence in disaster recovery (DR) plans is falling. We have seen major IT incidents in the news regularly over the last 12 months, which has raised awareness of IT downtime, and we have seen what can go wrong if recovery plans aren’t effective. “What is surprising is that fewer businesses are testing their DR plans. The number of businesses testing their DR plans increased from 2015 to 2016 but has fallen this year. We know that testing and exercising of plans is the best way to improve confidence in your ability to recover. The test itself may not be perfect – few, if any, are – and there are always lessons to be learned. Working through those recovery steps, however, is the best way to improve your preparedness and organisational confidence.
“It is also surprising to see a decrease in DR testing because new replication technologies are making testing easier. It is now far quicker to recover systems, validate that the recovery was successful and even carry out user testing, so there is no excuse not to test. “More testing would also be our advice to organisations concerned about cyber threats. Businesses are taking the right action by reviewing and updating IT security policies in response to new threats. The next step is to test your ability to recover. What steps would you follow? How do you isolate the issue? Do you fail over to replica systems or recover from backups? Cyber recoveries are often far more complex than the more common incident causes like hardware failure and human error, and the increased likelihood warrants dedicated cyber recovery testing,” Groucutt concluded. For more information, please see the link to the Data Health Check survey here: http://datahealthcheck.databarracks.com/ ### How Business Can Improve Data Management Practices A recent study by DMC Software found that 79% of businesses in the UK are failing to put the customer experience first due to poor data quality in their customer records. Just 21% admitted they were confident in the customer data they hold – proving there’s a long way to go for businesses that want to capitalise on the ‘year of the customer’. Of course, many businesses are aware of the importance of data and analytics for building and cementing long-lasting customer relations – yet the survey proves that businesses are struggling to make it a priority. It is estimated that 2020 will be the year that customer experience becomes a key brand differentiator, so if businesses are to keep up, it’s time to rethink how data practices are currently managed. With data the foundation of a customer-first strategy, the points below highlight the issues businesses are facing, and how a meaningful and, most importantly, profitable strategy can be adopted. Invest in storage solutions Many businesses face data storage difficulties, using solutions that are unfit for use or disparate systems that create an unclear and inconsistent view of the customer. [easy-tweet tweet="Only 43% of businesses use a dedicated CRM" hashtags="Data, DR,CRM"] The study highlighted that only 43% of businesses use a dedicated CRM, while those remaining utilise Excel spreadsheets (20%), paper-based records (5%), or email software (11%). It is perhaps the latter which is the most concerning given the rise in cybercrime. For businesses to ensure they are storing customer data securely, it is important to review their current systems and identify the right solution for effective data management. This is especially important considering the introduction of GDPR next year to strengthen data privacy and protection. Only by making these changes can organisations adhere to data guidelines while gaining a solid overview of their customer base. Improve data accuracy There are currently an estimated 2.5 exabytes of data being produced every day, and by 2020 it is predicted that there will be 44 zettabytes in the world. With more data than ever being created, an ineffective data management solution can be one reason why inaccuracies occur. For example, 21% of businesses use separate processes to manage different business functions – such as one to manage customer relationships and a finance package for invoicing.
Therefore, even though the solutions being used are fit for purpose, inaccuracies and duplicate data can occur across systems providing an inconsistent view of the customer. Ultimately, having an impact on the customer experience. 38% of businesses admitted that they have lots of incomplete records and 41% said that some had missing details. This has an enormous impact on customer relations and service, leading to customers missing important communications.  Only 21% believed that all records were complete helping to obtain a strategic overview of the customer. To avoid such errors from occurring, it’s important that businesses have a concrete data management process in place. Employees should also be educated on the importance of clean data and the benefits not only to the business but how it directly impacts their role too. Check for duplicate data Alongside data inaccuracies, duplicate data was found to be an issue for businesses, with just 34% stating it was something they regularly check for when maintaining their customer records. 66% of respondents reported they found duplicate data by accident, which can become a source of customer frustration if they have to repeat their details or if issues and conversations aren’t stored on records. 13% said they cleanse data annually, while 20% said that this isn’t something they carry out at all. It's not just customers who are faced with annoyances when it comes to data inaccuracies and duplicates. Data inconsistencies can be a source of frustration for customer facing staff who do not have the correct data on hand when speaking to a customer, resulting in miscommunications or awkward conversations on their part. For data to be of the highest quality and regularly updated, the example should be taken from the 28% who said that they update records after each customer interaction. After any interaction with a customer, if not automatically updated in the system, this information should be recorded manually.  This allows everyone in the organisation to be aware of the latest interaction, to enhance sales opportunities. While the findings do indicate that many businesses are proactive in their approach to managing customer data, it’s clear that for some, data management is something which requires immediate attention. ### Reboot: Is there sexism in the tech industry? Broadcasting live today at 5pm BST. Watch Reboot - our new live show peeking behind the curtain on the tech industry. Melissa Jun Rowley - @MelissaRowley Suki Fuller - @SukiFuller Cecilia Pagkalinawan - @ceciliany Daniel Tenner - @swombat #disruptivelive ### IoT Applications - Insight into IoT - Cambridge Consultants's Rob Milner Compare the Cloud interviews, Rob Milner, Head of Connectivity of Cambridge Consultants. Rob talks about Cambridge Consultants and how they add connectivity to products in a whole host of different industries. He explains that there is a migration away from niche applications to much more broader purpose scalable applications. What does Internet of Things (IoT) mean? The Internet of things or IoT is the collection and processing of data from a range of micro devices and conductors including RFID, Bluetooth and micro sensors that allow for intelligent M2M or machine to machine communication. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. 
Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Cambridge Consultants at: https://www.cambridgeconsultants.com/ IOT ### Humans Need Digital Twin to Achieve Zero Unplanned Downtime We often hear the phrase ‘you cannot account for human error’, but that seems illogical in today’s connected world. We have the technology not just to account for human error but to eradicate it. The internet of things, with the proliferation of affordable and reliable sensors, is changing the way in which we can view, manage, service and support technology, processes and any physical object. By mirroring a process, product or service into a virtual world, we can create environments in which machines can automatically analyse performance, warn of impending issues, identify existing or potential errors and even suggest part upgrades or changes to procedures to make them more efficient. This is the digital twin idea. As a concept, it’s been around for a while (NASA used it on early space missions) but the emergence of IoT has made it a commercial reality. A digital twin eliminates guesswork from determining the best course of action to service critical physical assets, from engines to power turbines. Easy access to this combination of deep knowledge and intelligence about your assets paves the road to wider optimisation and business transformation. Digital twin technology spans all industries where the value is in assets and, more generally, complex systems. Its ability to deliver early warnings, predictions, and optimisation is fairly universal. In time, I think we’ll see the concept of a digital twin applied to human beings as well, playing a significant role in healthcare. However, just mirroring is not enough. If the aim is to achieve zero downtime or, at the very least, overall insight into ongoing product and process performance, the digital twin has to be analysed and that analysis has to feed other functions. What the digital twin produces, when bundling data with intelligence, is a view of each asset’s history and its potential future performance. [easy-tweet tweet="Combining APM with FSM tools, the digital twin idea is transformed into an intelligent agent" hashtags="IoT, Digital"] This continuum of information leads to early warnings, predictions, ideas for optimisation and, most importantly, a plan of action to keep assets in service longer, sending commands to machines in response to those forecasts. In other words, if you close the loop with data and predictions, you can act directly on the asset itself. That’s where Asset Performance Management and Field Service Management are changing the way service operations run. By combining APM with FSM tools, the digital twin idea is transformed into an intelligent agent. Businesses have, for the first time, a complete suite of intelligence at their fingertips to understand potential equipment issues, and to pre-empt them or act upon them quickly and efficiently with the correct tools and parts, should machinery need fixing. This means field service is managed more efficiently, reducing costs and ensuring minimum downtime as engineers attend jobs with a full understanding of the problem, the right parts to hand and a complete knowledge of how to fix it.
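As a purely illustrative sketch of that closed loop (the asset name, thresholds and suggested part are invented for the example, not taken from any particular APM or FSM product), a digital twin can be as simple as an object that mirrors recent sensor readings, compares them against the asset's expected operating envelope and raises a work order for field service before a failure occurs:

```python
# Minimal sketch of a digital twin "closed loop": mirror readings,
# compare against the expected envelope, and hand a work order to
# field service before the asset actually fails. Names are illustrative.
from collections import deque
from statistics import mean

class DigitalTwin:
    def __init__(self, asset_id, expected_temp=70.0, tolerance=5.0, window=5):
        self.asset_id = asset_id
        self.expected_temp = expected_temp    # design/operating envelope
        self.tolerance = tolerance
        self.readings = deque(maxlen=window)  # recent mirrored sensor data

    def ingest(self, temperature):
        """Mirror a sensor reading and return a work order if the asset
        is drifting out of its expected envelope."""
        self.readings.append(temperature)
        if len(self.readings) < self.readings.maxlen:
            return None  # not enough history yet
        drift = mean(self.readings) - self.expected_temp
        if abs(drift) > self.tolerance:
            # The APM side predicts trouble; the FSM side gets a job with
            # the context an engineer needs before arriving on site.
            return {
                "asset": self.asset_id,
                "issue": f"average temperature drifting by {drift:+.1f} degrees",
                "suggested_parts": ["coolant pump seal"],  # illustrative only
                "priority": "preventive",
            }
        return None

twin = DigitalTwin("turbine-42")
for reading in [70, 71, 73, 76, 79, 81]:
    order = twin.ingest(reading)
    if order:
        print("Dispatch field service:", order)
```

A real deployment would obviously use richer models and live telemetry, but the shape is the same: mirror, compare, predict, and hand the prediction to the people and systems that can act on it.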
This is the shift from an often blind and reactive approach to fixing broken products and services to a predictive model that should eliminate waste and reduce costs, downtime and, importantly, human error. The digital twin idea has additional benefits too, as it can use historical and current data to provide a complete picture of a particular asset, its past performance, what it should be achieving now and its likely end-of-life date, when it would be predicted to be less efficient. This sort of knowledge is gold dust for product designers and manufacturers, as it can feed back accurately which parts work well and where machines would need improving or upgrading. Combined with the knowledge of field service professionals, this makes for a powerful tool for upselling products and services to customers. Any new ideas or enhancements can be fully supported with data analysis and perhaps even simulations to illustrate how new parts and functions would improve performance. It offers justification and also accountability, and should cut through irrelevant or unsuitable product or service ideas. It’s transforming service at the edge by bringing together all the facets that make businesses and machines tick - and goes a long way to creating a world of zero unplanned downtime. ### Ken Harley from Circle IT - Dell EMC ChannelTalks James Hulse of Dell EMC speaks to Ken Harley, UK Sales & Marketing Director at Circle IT, on the latest Dell EMC #ChannelTalks series. Circle IT covers all sorts of IT infrastructure, from small schools to media companies. Find out more at: http://www.circleit.co.uk/ Dell EMC is a provider of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry. Channeltalks ### A Digital Transformation – It’s Not Just About Platforms The term “Digital Transformation” has been and continues to be mentioned practically everywhere within the industry at the moment. Depending on whom you talk to and why, you will get many different educated answers on the definition of the term. So, from a “non-technical” viewpoint, what does Digital Transformation actually mean and why should you even care? McKinsey, a well-known consultancy firm, states: “Business leaders must have a clear and common understanding of exactly what digital means to them and, as a result, what it means to their business” (for a deeper look at how companies can develop meaningful digital strategies and drive business performance, see the full article at http://www.mckinsey.com/industries/high-tech/our-insights/what-digital-really-means). Now, I believe that the above is a catch-all statement that preaches to the uninitiated of the CXO digital age, but if you are like me, you may be discussing Digital Transformation from a different viewpoint. Let’s discuss a few non-technical changes first when we talk about digital transformation. [easy-tweet tweet="Technology is a critical business enabler that directly underpins everything we do" hashtags="Technology, CloudComputing"] Cultural Changes with a Digital Disruption This often gets overlooked when talking about change; however, it is an important factor within this topic. Technology today is a critical business enabler that directly underpins everything we do. As generations pass, so do historical ways of working. Once I was happy to read the newspaper on the commute to the office; now I read publications sent directly to my smartphone, chosen specifically for my interests.
This is a good example of a cultural change that we have simply accepted, and this type of digital transformation is straightforward to understand. However, not all businesses are publication houses or editorial-based organisations. Being Lean and Agile with a Digital Transformation Every business needs to be on top of operational efficiencies while maintaining the ability to be agile. However, achieving this is a “secret ingredient” that is not easy to add to an organisation’s recipe for success. By applying a “digital first” mentality, any given business can benefit from technical infrastructure that promises agility, rather than chasing bleeding-edge technical concepts. Let’s face it: we are not going to regress as a species and decide digitalisation is not going to be part of our lives; it is only going to speed up our own digital transformation evolution. Cloud computing started this trend, and it is only going to increase with the adoption of digital currencies and other major changes to the way we interact day to day. Achieving competitive advantage with a Digital Transformation Now, with a business mind-set, you can see where this makes sense. We have all been reminded, in various digital transformation workshops, seminars and events, of the Kodak rise and fall. Once the most dominant photographic company on the planet, leading the way with the development of digital photography, Kodak quite simply lost out and eventually went out of business because it sat back and waited far too long to launch. This was incredible considering its market share and global footprint. However, I believe that many more surprises are in store over the next few years, showing us again that even some of the largest companies on the planet can get it wrong. There are many more reasons for having a digital transformation strategy, but the few listed here are good examples of pressing questions to ask yourself and your organisation that will prompt a discussion that, at the very least, will start you on that journey. Of course, you may think all change comes at a cost. Yes it does, but keep in mind that short-term pain can lead to long-term gain, and never was a truer statement made about digital transformation. Digital Transformation with Intel and other platforms? Technology is always pigeonholed into categories and types but, in all honesty, it shouldn’t matter to the end consumer as long as the results are the same, right? Intel architectures are very well known and common among end consumers (your computer at home?) as well as in the workplace. Intel platforms are cheaper per box, and scale well in smaller increments, but they aren’t the best platform for large-scale transaction processing and other high-throughput computing concerns. Mainframe technology has been in place for decades and has been, quite literally, running most of the world’s largest firms’ core data processing infrastructures. This sleeping dragon, so to speak, has had face-lifts, modifications, additions and extras while embracing the world of open-source technology too. It’s faster, more secure and more resilient than ever before, which enables consumers of this technology to be ahead of the digital curve. I can access my mainframe application from my smart device securely; I can have the agility and raw power of cloud computing burstability within a single box. I can also create virtual machines with automatic failover within a single unit.
My point is this: I may be slightly biased towards mainframe technology given my background, but why wouldn’t I be when the numbers add up and this type of technology has evolved into an easy medium to use for a digital transformation? Find out more about digital transformation and DataKinetics. ### Comms Business Live Episode 5: Ransomware – Channel threat or opportunity? In this episode of Comms Business Live the discussion focussed on Ransomware - Channel Threat or Opportunity? The host David Dungay was joined by Sham Miah, UK Channel Manager at Webroot, Sean Newman, Director of Product Management at Corero Network Security, and Clive Simms, CEO of Meta Defence Labs. Don't forget about Channel Live 12th - 13th September. ### New Technologies - Insight into IoT - The Open Group's Chris Harding Compare the Cloud interviews Chris Harding, Director of Interoperability at The Open Group. Chris talks about The Open Group and Open Platform 3.0. What does Internet of Things (IoT) mean? The Internet of things or IoT is the collection and processing of data from a range of micro devices and conductors including RFID, Bluetooth and micro sensors that allow for intelligent M2M or machine to machine communication. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about The Open Group: http://www.opengroup.org/ IOT ### Is Active Directory On Its Way Out? I know that the very mention of this will have technology experts and engineers stand up and take a position on it. Recently I was asked by a business investor whether they should invest in a company that specialises in automating the task of moving profiles and data off machines onto other machines seamlessly. It made me think about the future of user profiles, where things are going in the world of domain profiles and group policies, and what the world will look like in 5 years’ time. What would the de facto operating model look like based on the current speed of advancements in the cloud space? I come from the Novell days of NDS and have seen Microsoft take a complete stranglehold on that market for the best part of my career, and I cannot see that changing any time soon. There are no official statistics that I could find, but we can rest assured that the majority of organisations worldwide use AD. With the release of Azure AD, are things slowly changing, with Microsoft just building up its capabilities in the cloud to slowly force organisations to move there? With technologies like Microsoft Intune and conditional access, and the ability to push policies to mobile devices and Windows 10 devices to control user and device behaviour, is that not group policies in the cloud? What about Enterprise State Roaming, which allows user settings to roam with their devices – is that not an early indication of the death of on-premises roaming profiles, which were the bane of all NT system administrators’ lives? Under Microsoft’s cloud-based AD Premium subscription service, in conjunction with Microsoft Intune and Windows 10, you can even join a machine to the domain without the machine being physically on the same network as your on-premises Active Directory domain controller.
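To illustrate how little an application needs to know about the corporate network once authentication lives in Azure AD, here is a small sketch using Microsoft's MSAL library for Python; the tenant and client IDs are placeholders and the scope is just an example, so treat it as a sketch rather than a production pattern:

```python
# Illustrative only: signing a user in against Azure AD from any network,
# with no line of sight to an on-premises domain controller, using the
# msal library (pip install msal). Client and tenant IDs are placeholders.
import msal

TENANT_ID = "00000000-0000-0000-0000-000000000000"   # placeholder tenant
CLIENT_ID = "11111111-1111-1111-1111-111111111111"   # placeholder app registration

app = msal.PublicClientApplication(
    CLIENT_ID,
    authority=f"https://login.microsoftonline.com/{TENANT_ID}",
)

# Device-code flow: the user signs in (and completes MFA, if conditional
# access policies require it) in a browser on any device.
flow = app.initiate_device_flow(scopes=["User.Read"])
if "user_code" not in flow:
    raise RuntimeError(f"Could not start device flow: {flow}")
print(flow["message"])                       # e.g. "go to ... and enter code ..."

result = app.acquire_token_by_device_flow(flow)
if "access_token" in result:
    print("Signed in against Azure AD - no domain controller involved.")
else:
    print("Sign-in failed:", result.get("error_description"))
```

The sign-in, any conditional access checks and any multi-factor prompts all happen against Azure AD in the cloud; no line of sight to a domain controller is required.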
Microsoft even released Domain Services a short while back, where you can stand up virtual servers in the cloud and join them to a managed domain service that works alongside your on-premises domain controllers. OK, for you on-premises purists the functionality is limited in comparison, but did you not say the same about Exchange Online? And do business leaders care that you cannot use this piece of functionality or run that PowerShell script? [easy-tweet tweet="AD connect and user account details can be synched to the cloud tenant albeit their passwords" hashtags="Accounts, Cloud"] Some organisations will simply shy away from wanting to move their user accounts and computers into the cloud. Well, that's what I thought, but in came AD Sync, which is now known as AD Connect, and user account details can be synced to the cloud tenant along with their passwords, which are transferred as hashes rather than in plain text. Only recently, Microsoft also allowed the use of more dynamic group memberships in the Azure tenant. I would say that Active Directory is here to stay in the short term, especially for larger enterprises that have complex interdependencies such as multi-forest domains. However, it would be a wise move for existing IT professionals to start understanding how Azure AD works, since that's the way things are moving, in my opinion. I can quite easily see Microsoft detangling on-premises Active Directory to the extent that the primary authentication service will be Azure AD, most likely with a local on-premises relay or read-only domain controller working in a hybrid fashion. For smaller businesses, it's a no-brainer, in my opinion, to move their authentication into the cloud and to leverage enhanced authentication methods such as multi-factor authentication and the great machine learning capabilities within AD Premium. In summary, I believe Active Directory is here to stay for the foreseeable future, especially amongst larger enterprises, though I would not be surprised to see Microsoft slowly adding more features and capabilities to Azure AD as a slow and subtle way for organisations to get used to the idea of managing their identities and devices in the cloud. As we know, the great advantage of the cloud is that innovation and enhancements happen much faster than with on-premises technologies. So the emphasis on Microsoft's part will be to innovate and grow in the cloud space rather than in on-premises technologies. I'd be interested to hear your views on this development. ### Joe Littlewood from Nviron - Dell EMC ChannelTalks James Hulse of Dell EMC speaks to Joe Littlewood, Sales Director of Nviron, on the latest Dell EMC #ChannelTalks series. Nviron is an IT services business based in the north west of the UK. Find out more at: http://www.nviron.co.uk/ Dell EMC is a provider of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry. Channeltalks ### Syrex and Redstor Deliver Leading Cloud Backup and Security Solutions Syrex, the provider of virtualized and hybrid network infrastructures and solutions, has partnered with Redstor, a market leader in data management and security solutions, to provide clients with a powerful cloud backup and security solution.
Together, the two organisations deliver an enterprise level package that is ideally suited to challenging South African conditions, assures of comprehensive security and capability and yet remains accessible and intuitive to use. “We have undertaken the move to the cloud as an organisation, so we are ideally placed to support our partners and clients in making a move themselves,” says Jason Kotze, Pre-Sales Consultant, Redstor. “We have been providing cloud backup solutions for enterprises since 2005, and have consistently invested in businesses and solutions which support this growth. By 2014 we had more than 9,000 cloud customers, and today we have more than 1,000 resellers and 25,000 end users of our solutions.” Ralph Berndt, Director, Sales at Syrex, adds: “Redstor Enterprise Backup Pro is tailor-made for the SMME and Enterprise looking to invest in cloud and wanting assurances that their data will remain secure, accessible and in their control. The system is extremely advanced and is designed to protect the entire organisational infrastructure, regardless of whether it is virtual or physical. We opted to work with Redstor because their system speaks for itself and has support for numerous operating systems and applications to ensure that installation is also made as seamless as possible.” [easy-tweet tweet="Syrex provides data resilience in dual locations to ensure client data is secure and recoverable" hashtags="Data, Cloud"] The Syrex enterprise backup solution developed using Redstor Backup Pro is one of the fastest on the market. It uses incremental backups, advanced byte and block-level patching techniques, compression and client-side deduplication to manage this feat and to ensure that RPO and RTO objectives are met.  While the technology that drives the solution has its layers of complexity, the beauty of its implementation is its simplicity. Syrex provides full data resilience in dual locations to ensure that client data is at all-times secure and recoverable. “The intuitive management console supports multi-tenanted environments, provides detailed reports and helps identify and resolve issues quickly,” says Berndt. “In addition, files, folders, SQL and Exchange are intelligently laid out, and reporting can be automated for ease of use.” Another notable feature is one which suits the South African environment down to the unconnected ground – when an initial backup is required from a location with poor connectivity; the data can be couriered to Redstor for upload, backup and storage in the cloud. “Partnering with a Syrex is an exciting proposition – they are aggressive in the market, they understand their client’s requirements, and they push to provide great solutions with service to match,” concludes Kotze. “ We are looking forward to working with them in bringing Redstor Backup Pro to their customer base.” ### Optimising the Cost and Performance of AWS Using a Hybrid Cloud Since its debut in 2006, AWS has become the public cloud of choice thanks to its low-cost, easy-to-use, self-service resource provisioning. But with these advantages come several pitfalls. Organisations can use AWS as a de facto IT environment without carefully planning their IT usage governance policies or their strategy and budget for AWS, as they do with their on-premises cloud infrastructure and resources. This makes it easy to create a sprawling IT environment on AWS’s public cloud that lacks the discipline and hygiene of internal IT systems. 
The convenience of setting up virtual machines on AWS can also be a trap for developers, who come to rely on it too much. In this article, we’ll see how to achieve a balanced workload between public and private clouds. Advantages and pitfalls of AWS The advantages of AWS are well known. Using a point-and-click interface, enterprise developers can set up a quick production environment on AWS instead of waiting days or weeks for their company’s IT Operations unit to provision and configure the infrastructure and resources they need. Startups and small companies can create their own IT environments on AWS, and thus avoid the capital expense of a big infrastructure investment. More recently, AWS has made valuable IT tools available to subscribers, such as big data analytics, AI, and voice- recognition application development tools. But with these advantages come several pitfalls. Organizations can use AWS as a de facto IT environment without carefully planning their IT usage governance policies or their strategy and budget for AWS. This makes it easy to become over-reliant on AWS, and to create a sprawling IT environment on AWS’s public cloud that lacks the discipline and hygiene of internal IT systems. If an enterprise uses AWS too much, the costs of using it can eventually outweigh the benefits. The company may end up paying a huge amount per month to host a large public cloud IT environment. Over time, this can have a severe negative impact on the IT operations budget. Using hybrid clouds to balance AWS usage A hybrid cloud, with a mix of public and private cloud, is a good foundation for optimizing performance and costs. The key steps to optimization are: Set up an on-premises private cloud infrastructure. Establish control over AWS usage through IT policy and governance. Take stock of the company’s overall use of AWS to understand exactly how and for what purposes the people at the company are using it. Determine which workloads (i.e. high-performance workloads) will run better on a private cloud, and which workloads (i.e. workloads requiring “Burstable Bandwidth”) should leverage AWS or other public clouds. Balance workloads between the two platforms to optimize productivity and costs. Managing AWS and hybrid cloud for an optimized cloud infrastructure The solution to optimizing your performance and cost on AWS is part technology and part business planning. It’s not enough to simply set up a hybrid cloud – you need to know how to utilize and balance IT workloads between AWS and the private cloud. And for that, you need to understand how your company is using AWS right now, and take control of it. [easy-tweet tweet="The AWS Admin will be responsible for monitoring the company’s ongoing AWS use" hashtags="AWS, Cloud"] Designating an AWS Admin is an important first step. This admin will be responsible for creating and enforcing IT policy and governance on AWS. The AWS Admin will be responsible for monitoring the company’s ongoing AWS use and tracking monthly AWS spend. The admin should also have the power to determine which workloads will be deployed on the company’s private cloud, and which will be deployed on AWS and other clouds. Cloud management platforms can help with this, but even basic administration (done the old-fashioned way, using spreadsheets) to understand usage patterns and spend can quickly reduce unnecessary spend. The AWS Admin should have the authority to right-size the company’s AWS profile. 
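As a rough sketch of that kind of basic administration (assuming the AWS SDK for Python, boto3, with credentials already configured; the region and dates are placeholders), an AWS Admin could script a simple monthly check of what is running and what it cost:

```python
# A rough sketch of the basic monitoring an AWS Admin might script with
# boto3 (pip install boto3): what is running, and what did it cost last
# month? Assumes AWS credentials are already configured; illustrative only.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")       # placeholder region
ce = boto3.client("ce", region_name="us-east-1")          # Cost Explorer endpoint

# 1. Inventory: which instance types are running, and how many of each?
counts = {}
for page in ec2.get_paginator("describe_instances").paginate(
    Filters=[{"Name": "instance-state-name", "Values": ["running"]}]
):
    for reservation in page["Reservations"]:
        for instance in reservation["Instances"]:
            itype = instance["InstanceType"]
            counts[itype] = counts.get(itype, 0) + 1
print("Running instances by type:", counts)

# 2. Spend: unblended cost for the previous calendar month, grouped by service.
response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-07-01", "End": "2017-08-01"},  # placeholder dates
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)
for group in response["ResultsByTime"][0]["Groups"]:
    service = group["Keys"][0]
    amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
    print(f"{service}: ${amount:.2f}")
```

Even a report this crude, run monthly, gives the Admin the usage and spend picture needed to spot idle capacity before right-sizing the profile.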
AWS offers Trusted Advisor, a cloud optimization tool that helps to minimize over-provisioning of workloads. Using this tool, the AWS Admin can determine the total amount of CPU, memory, storage, and network bandwidth the company is currently using on AWS. If the Admin determines the company is paying for unused capacity and bandwidth, he or she can switch over to a smaller, less-expensive AWS profile that is better suited to the company’s needs. Achieving a Balanced Workload The key to balancing workloads between public and private clouds is to know which workloads work best on which type of cloud. To do this, you have to continuously analyse your application portfolio, and deploy each workload to the appropriate public or private cloud locations for maximum performance and cost-effectiveness. There are, of course, some workloads that you should always keep behind your firewall, and never put on AWS or other public clouds. However, if you have workloads that could run on either public or private cloud, you can strike a balance by putting workloads on the appropriate medium. The question to ask is: Where should my workload run, so I get the most value out of my AWS investment? Here are a couple of use cases that illustrate workload balance: High-Performance Workloads vs. Worldwide Availability – Your company has a mobile application that collects small amounts of non-confidential usage data from users and sends it back to your databases to be analysed using data analytics programs. Since you want your application to be accessible to mobile users throughout the world, you should host it on AWS, which gives you a global, multi-site presence. But databases and data analysis tools typically require higher levels of CPU, memory, storage capacity, and bandwidth – all of which may incur higher fees for your company on AWS. Therefore, you should host your databases and data analysis tools on your private cloud servers, and set up a link to AWS so the database can receive incoming data from your mobile application. Burstable Capacity and Bandwidth – Your company may need temporary digital capacity on public clouds like AWS for short-term peak periods. This is known as “bursting your capacity.” For example, if you are a retail company, you may need to rent a short-term “burst” in CPU and bandwidth from AWS during the holiday season, and move your eCommerce site onto the public cloud to handle the spike in transactions and web traffic. When the holidays are over, and demand returns to normal, you can shift the site back to your on-premises private cloud, and shut down your public cloud environment to save money. AWS is a fabulous environment for bursting capacity or rapidly setting up resources for smaller DevOps projects, but AWS use can get out of hand without corporate governance, and AWS is inappropriate for applications requiring high security or high performance. By balancing AWS usage with a private cloud, companies can optimize their AWS spending.
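One way to keep that placement question honest is to write the criteria down as an explicit checklist. The sketch below is illustrative only – the criteria and suggested targets simply restate the considerations above, not the behaviour of any particular cloud management product:

```python
# A minimal sketch of the placement questions above turned into a simple
# decision helper. Criteria and thresholds are illustrative only.
def place_workload(profile):
    """Suggest where a workload should run based on the article's criteria."""
    if profile.get("confidential") or profile.get("regulated"):
        return "private cloud (keep behind the firewall)"
    if profile.get("needs_global_reach"):
        return "public cloud (global, multi-site presence, e.g. AWS)"
    if profile.get("bursty"):
        return "hybrid (run private, burst to public at peak)"
    if profile.get("high_performance"):
        return "private cloud (dedicated CPU, memory and bandwidth)"
    return "either - decide on cost"

workloads = {
    "mobile app front end": {"needs_global_reach": True},
    "analytics database":   {"high_performance": True},
    "ecommerce site":       {"bursty": True},
    "HR records system":    {"confidential": True},
}
for name, profile in workloads.items():
    print(f"{name}: {place_workload(profile)}")
```

The value is not in the code but in forcing every workload through the same questions before it lands on a platform.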
IoT comprises objects and devices that can be linked, monitored and controlled over the Internet, and Wia will use the funds raised to deliver on its objective to establish its cloud platform as the enabling technology that powers this rapidly growing market. Wia's ground-breaking IoT cloud platform enables developers to turn any type of sensing device into a secure, smart and useful application in a matter of minutes, generating considerable time and cost savings for teams who would previously have had to spend many months and hundreds of thousands of lines of code trying to build out their own cloud infrastructure. Wia provides a vertically integrated offering incorporating management, data capture and storage, analysis and control, as well as the seamless integration of additional services and devices. In line with its strategy to become a one-stop provider of IoT infrastructure, later this year Wia plans to launch its latest product, which will incorporate the world's first off-the-shelf IoT billing system.

Launched in 2015, Wia's award-winning cloud platform is today used by developers in over 85 countries. In addition, Wia has partnerships in place with leading technology companies including Sigfox, PubNub and Twilio, and plans to expand to the US later in the year. Today's seed investment will be used to fund Wia's continued global expansion and further investment into R&D of its technology platform.

In recognition of its pioneering work, in 2016 Wia was selected to join NDRC's LaunchPad programme, winning its investor day. NDRC is an Irish Government-funded agency that was established to invest in digital companies and start-up teams with the potential to grow internationally. NDRC currently ranks as the no. 2 University Business Accelerator in the world and the no. 1 University Business Accelerator in Europe by the UBI Global Index.

Founder & CEO Conall Laverty, 28, has previously developed and architected software solutions with Big Motive, Channel 4, BBC, Net-a-Porter, Cambridge Silicon Radio and Deezer. He attained a First Class Honours degree from Queen's University in Computer Games Development. Mr Laverty has built a team of high-calibre, expert professionals at Wia. They will be joined by Suir Valley's Brian Kinane, who has been appointed to Wia's Board of Directors.

Conall Laverty, Founder & CEO of Wia, said, "We are thrilled to bring the experience and knowledge from Suir Valley Ventures as partners with us on a milestone in Wia's journey. The Internet of Things, through which devices communicate with each other, is playing an ever-increasing role in both corporate and domestic life around the world, and we believe our platform will take it to the next level".

Barry Downes, Managing Partner – Suir Valley Ventures, said, "We are delighted to announce that we are leading the seed venture capital round for Wia. The Internet of Things (IoT) is an important focus area for the Fund and an area of tremendous growth and productivity for enterprises and the economy. Conall has led a superb team to develop a ground-breaking IoT platform that is the easiest and fastest way to link sensors and devices to develop IoT applications. Wia's end-to-end platform provides full device and application management, security, data capture and storage, analysis and control, as well as the seamless integration of enterprise systems.
We are looking forward to working with Conall and his team to help them rapidly expand the Wia business with developers and enterprises internationally."

Niall McEvoy, Manager, HPSU ICT Dept., Enterprise Ireland, said, "Wia is an innovative Irish start-up that has made strong technical and commercial progress, developing world-leading platform technology in the rapidly growing IoT space. The company is combining innovation with global ambition and Enterprise Ireland looks forward to continuing to work with them as they expand and scale in international markets".

Speaking about Wia, CEO of NDRC Ben Hurley said, "Wia offers a compelling product which enables connectivity and information-sharing between devices and sensors. They have a really strong team in place with deep technology expertise and clear ambition. From NDRC's perspective the company's potential and capacity for growth was clear when we invested in them initially in 2016 through our investment and acceleration programme, and then with further follow-on funding this year. This announcement today will enable Wia to achieve continued employment growth here in Ireland and grow their business into new international markets in the future."

### Security Index - Cloudtalks with Unisys, Salvatore Sinno

In this episode of #CloudTalks, Compare the Cloud speaks with Salvatore Sinno, Chief Security Architect at Unisys. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward, using a strategic combination of business consultancy, content marketing, media series, and events and workshops to enable you to stand out from the cloud crowd. Visit the Unisys website to find out more: http://www.unisys.com/

### Timico Brings in New Senior Talent to Drive Client Experience

Timico, the managed cloud service provider, has brought in new talent in the shape of four senior hires to help drive the company's focus on delivering an industry-leading client experience. The new appointments will ensure Timico consistently develops and delivers the necessary products and services to meet its clients' digital transformation needs and fulfil its future organic and acquisitive growth objectives as it further scales its business following the £50 million investment in the company by Lyceum Capital earlier this year.

Kirsty Johnson has been recruited as Client Operations Director, responsible for all service desks, provisioning teams and client service teams to ensure Timico is continually delivering the highest possible level of service. Prior to joining Timico, Kirsty spent 12 years at Alternative Networks, latterly as Operations Director during a period of rapid organic and acquisitive growth.

In a new position, Colin Riddle has been appointed as Head of Products & Services, with responsibility for Timico's entire portfolio. He will lead a team of Product Managers to develop and deliver Timico's Managed Cloud Service Provider proposition. Prior to joining Timico, Colin spent almost four years with managed cloud computing company Rackspace, most recently as Manager, Product Enablement, where he led a team of Technical and Business Engagement Managers and was also responsible for continual improvement across Product International.
Martin Riley has joined Timico in a further new role as Head of Application Development and Technical Strategy, accountable for enabling the digital transformation of the business, planning and delivering the integration of company-wide solutions to improve the delivery of service to Timico's clients. Martin was previously Head of Infrastructure at managed cloud services provider Adapt.

Finally, Joanne Smith has been promoted within Timico to the newly created position of Head of Client Service Management. Joanne first joined Timico in 2007 and has worked in client-facing roles ever since, most recently as Strategic Account Director, where she looked after some of Timico's largest customers. She, along with her team, will be responsible for providing dedicated focus to the continuous improvement of the client experience.

In addition to bringing in new talent, Timico has invested further in its ITIL processes and in its digital infrastructure, including development of its service management tool, ServiceNow, and its CRM systems to make operations more robust and flexible. All four members of the team will report to Chief Operating Officer Clodagh Murphy, who commented: "I am delighted to welcome our new colleagues to their roles at Timico, and I am confident that they will add both strength and depth to our frontline operations teams. I'm really excited to see what their combined skills and experience can bring to the business as we enter the next phase of our journey to becoming a leading Managed Cloud Service Provider offering an industry-leading client experience."

### Security - Insight into IoT - Gemalto's Joseph Pindar

Compare the Cloud interviews Joseph Pindar, Director of Data Protection Product Strategy at Gemalto. Joseph talks about Gemalto and how it provides security around IoT. What does Internet of Things (IoT) mean? The Internet of Things, or IoT, is the collection and processing of data from a range of micro devices, including RFID, Bluetooth and micro sensors, that allow for intelligent M2M or machine-to-machine communication. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward, using a strategic combination of business consultancy, content marketing, media series, and events and workshops to enable you to stand out from the cloud crowd. Find out more about Gemalto: http://www.gemalto.com/

### Datashur Pro USB Flash Drive Product Review

iStorage, a prominent British design and manufacturing company of highly secure portable data storage and encryption products, has arguably created one of the most secure data storage products on the market. Introducing the datAshurPRO memory stick: robust, adaptable and highly secure.

Released in 2015, the drive owes its existence to the Data Protection Act, which was iStorage's primary motivation for creating a product with such high levels of data protection. The CEO of iStorage, John Michael, outlines: 'It is essential that all corporate and government organisations comply with data security regulation to avoid hefty fines, which can reach £500,000.' Data breaches predominantly occur due to loss or theft. Therefore, iStorage has understood the demand to produce a highly secure data storage device for government and corporate organisations that regularly obtain and share sensitive information.
The defining features of the product are predominantly security-based. Firstly, the datAshurPro incorporates military-grade XTS-AES 256-bit hardware encryption, with data encrypted in real time by a dedicated hardware encryption processor. In addition, the drive features a PIN-activated system, which allows the user to create a unique, memorable PIN of between seven and fifteen digits to protect their data. This PIN system includes a very sophisticated user timeout lock: the drive can detect brute-force hacking, which is a particularly important feature in the event of a theft, as the drive will shut down and lock out the hacker, thus protecting the company's data. These features ultimately give organisations and businesses peace of mind, as they can be sure their sensitive data is held in a completely secure domain.

Further, the datAshurPro has other important features relating to memory size, compatibility and tough physical design. Firstly, capacities of 8, 16, 32 and 64GB are available, allowing businesses and organisations to choose the size that most effectively suits the amount of sensitive data they hold. The device also includes a drive reset feature: this deletes the encryption key, rendering all data lost forever, after which the drive can be reformatted and reused. Notably, the drive also maintains OS compatibility and platform independence, which makes it compatible with a wide range of systems including Windows, Mac, Linux, Thin Clients, Android, Chrome and embedded systems. The device is protected by an extremely robust water- and dust-resistant outer shell, allowing the product to remain immune from physical damage. This is vital, as physical damage to memory devices is a common cause of data loss. Less durable memory devices on the market can be corrupted by a mere splash of coffee (I'm talking from experience).

A possible drawback of this device is that it is priced at the higher end of the portable storage market. The pricing of the product is as follows:

- 8GB – £69.00
- 16GB – £89.00
- 32GB – £109.00
- 64GB – £129.00

However, secure data for a business is priceless. The prices listed above are a drop in the ocean compared to the £500,000 fine that could be imposed if your company fails to comply with the data security regulations.

Arguably another possible drawback is evident during the set-up of the product. When creating a new PIN for the device, the user has only 10 seconds to type it in, and it must be 7-15 digits in length, must not contain only repetitive numbers and must not contain only consecutive numbers. If the user exceeds the time limit, the device will shut off and return to the original start mode. However, this feature does ensure that the user has selected a suitably secure password for their device; otherwise, if the user were allowed to choose an easily breakable password, this would undermine and contradict the state-of-the-art security technology. My word of advice: be prepared to be slightly patient during the setting-up phase.

In summary, this product would be ideal for corporate businesses or government organisations that require the transfer or storage of private data. The datAshurPRO would ensure your company's data is completely tamper-proof and secure, efficiently preventing the untimely event of losing data.
### Esteem Systems Dell EMC ChannelTalks

James Hulse of Dell EMC speaks to Vicki Sammons, Business Development Manager, and Mark Benson, Director of Marketing, for Esteem Systems on the latest Dell EMC #ChannelTalks series. Esteem Systems is an aggregator of cloud and on-prem services and design solutions, meeting its clients' business needs and outcomes. Find out more at: http://www.esteem.co.uk/ Dell EMC are the providers of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry.

### Customising the Cloud for SMEs

As cloud computing continues to mature and develop, it is becoming increasingly apparent that one cloud solution is not suitable for all, particularly when it comes to SMEs. In contrast to bigger businesses, SMEs need scalable cloud solutions that provide them with tools and applications to help support their specific business needs. One of these requirements is keeping costs to a minimum, as smaller organisations often don't have access to the same budget levels or resources as big enterprises. A cloud solution that is truly bespoke and customised to the needs of SMEs, and agile enough to enable redesign, development and scale-out on an ongoing basis, could provide the answer that these smaller businesses are looking for.

Why customise the cloud?

Recent Node4 research reveals that 25% of mid-market companies say that their needs are overlooked by IT providers. A one-size-fits-all, off-the-shelf solution will very rarely deliver maximum value to a business, so IT suppliers need to deliver solutions that work for these smaller organisations. Obviously, no two businesses are the same, and every company has different needs and layers of existing IT infrastructure, so it's important to base applications that are relevant to each business on flexible cloud infrastructure. Whether an SME needs a single development server, a load-balanced dual-site set of web servers, a comprehensive disaster recovery strategy or a private virtualized environment, vendors should be providing more bespoke solutions so that the SME and its employees get the right balance of services to suit their individual needs. So, what should cloud providers be focusing on to deliver the cloud solutions that SMEs need? Here are Node4's five top tips:

Scalability

For SMEs, the flexibility and quality of a cloud-based solution and the individual services delivered through it are of greatest importance. These small-to-medium-sized businesses need to be very sure that the solution they are getting is both scalable and built to last. One of the main advantages of a cloud-based infrastructure is that SMEs only use what they need when they need it, so scaling down is going to be just as common as scaling up. For example, a chain of hotels may need a scalable web application as its business is seasonal: it is likely to receive hundreds of booking requests in the run-up to Christmas, but in February it may only get a fraction of this. As such, vendors need to demonstrate to SMEs that they understand this constant change and should be able to provide a solution that is custom-tailored to the business's requirements.

Consultation

It's not just infrastructure and support that SMEs are demanding from cloud providers. They also expect sound consultation, and rightly so. By 'consultation' we mean real advice, explaining the benefits and constraints of each solution (in simple terms), and where mixing and matching may be beneficial.
It is crucial to have the business objective in mind at all times when designing a solution that may need to support the business for five or more years.

Trusted Advice

It is becoming ever more critical for cloud providers to partner with their customers. SMEs expect vendors to work closely alongside them to find a suitable solution that works well for the business, considering all options. This covers what is provided, how much it costs, how they pay for it, the total cost of ownership/ROI, and clarity that the individual services provided through cloud technology are correctly aligned with business needs. For example, what are the pros and cons of a usage-based model versus a subscription-based model for telephony services hosted through the cloud? A trusted advisor from their cloud provider ensures customers receive independent answers, based on experience and expertise, to all their cloud-related questions.

Easy, simple billing

Cloud solutions for SMEs do not necessarily have to come with the large capital investment usually associated with a virtualized environment. Rather, businesses should be benefitting from the latest technologies with a simple OPEX-based pricing structure. For instance, for enterprises that traditionally have cash-flow problems (e.g. building and construction companies), it may be beneficial to pay for services as they are used rather than pay a lump sum up front. This will help them get the most out of the investment.

Say goodbye to 'lock-ins'

Some cloud solutions can make it tough for SMEs to transfer their data out when they are looking to change providers. SMEs should demand virtual environments that won't lock them in, making it easy to change providers when required. Vendors also have to get used to this way of working. If SMEs need to add or remove certain services, or increase or decrease the amount of a certain service being delivered through the cloud, vendors should have the built-in flexibility to make this possible.

Value of a bespoke cloud

Fast-growing SMEs are continually evaluating cloud services as a tool for reducing costs, limiting risk and providing the flexibility they need to run their businesses in a fast-paced, technology-driven workplace. Ultimately, the test for any cloud solution is whether it delivers real value to the business. The cloud undoubtedly provides unprecedented levels of flexibility for SMEs to meet the ever-shifting challenges they face. However, unless SMEs have exactly the right solution for their business's individual needs, it is unlikely they will reap the full benefits that the cloud can deliver. This means going beyond the accepted limitations of private, public or hybrid solutions and entering a new era of truly customised cloud solutions. The responsibility should be on vendors to provide these capabilities, and SMEs need to ensure that they are working with the right suppliers.

### Connectivity and Visualisation - Insight into IoT - Riverbed Technology's Steve Foster

Compare the Cloud interviews Steve Foster, Senior Manager, Solutions Engineer at Riverbed Technology. Steve talks about Riverbed and how it will impact the future of IoT. What does Internet of Things (IoT) mean? The Internet of Things, or IoT, is the collection and processing of data from a range of micro devices, including RFID, Bluetooth and micro sensors, that allow for intelligent M2M or machine-to-machine communication.
At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward, using a strategic combination of business consultancy, content marketing, media series, and events and workshops to enable you to stand out from the cloud crowd. Find out more about Riverbed Technology: https://www.riverbed.com/gb/

### Cloudreach and Cloudamize Join Forces In Blackstone-Backed Merger to Accelerate Intelligent Cloud Adoption

Cloudamize will form a new software business unit within Cloudreach and continue to scale its industry-leading cloud analytics platform and independent global partner ecosystem.

London and Philadelphia, August 3, 2017 – Cloudreach, a company majority owned by Blackstone Tactical Opportunities, today announced that it had acquired a majority stake in Philadelphia-based cloud computing analytics and migration automation leader Cloudamize. Terms of the transaction were not disclosed.

The deal brings together two highly complementary and accomplished businesses that were both born in the cloud and have become leaders in their respective spaces. The combined company has a major presence in North America and the UK as well as operations in six other countries. Its combined expertise in cloud consulting, managed services and cloud-enabling software forms a wide breadth of support and services to enable, integrate and operate cloud platforms. Substantial investment is planned for the new software business unit to develop an independent Cloudamize partner ecosystem, increase partner success resources, and accelerate product development.

Cloudamize uses high-fidelity analytics and powerful automation to take the guesswork out of scoping, configuring and migrating to cloud infrastructure, and allows customers to make critical cloud deployment decisions with ease and confidence. Cloudamize has assisted well over 500 companies, such as ESPN, OfficeMax and the Salvation Army, in their journey to the cloud, saving them an estimated $100 million in the process.

"Today we take the next exciting step in our quest to accelerate intelligent cloud adoption with software enablement of services. When we started our partnership with Blackstone, one of the key priorities was to software-enable all our products," said Pontus Noren, CEO of Cloudreach. "Combining Cloudamize's industry-leading cloud analytics and migration automation with Cloudreach's expertise will make it easier for our customers to adopt cloud quickly, efficiently, and at scale. Ultimately it's all about enabling enterprises to innovate using the power provided by hyperscale cloud platforms, either through Cloudreach's services or via our growing independent partner ecosystem."

"Joining forces with Cloudreach accelerates the vision I had when I originally founded Cloudamize – empowering the enterprise with analytics and automation to simplify cloud decision making," said Khushboo Shah, Cloudamize founder and chairman. "When we first launched Cloudamize, we saw an opportunity to automate the tedious and manual process of discovering applications and infrastructure, building migration plans, and determining the best possible cloud configurations for optimal performance.
I am thrilled to combine efforts with Cloudreach as it's time we move the market beyond the brute-force, manpower-intensive processes of the past and help our customers focus instead on maximizing the value of their cloud deployments."

Cloudreach is a "born-in-the-cloud" software-enabled services business whose agile approach to business transformation is underpinned by deep expertise and experience in enabling, integrating and operating cloud technologies. They help some of the world's largest organizations, including BP, Volkswagen Financial Services and Hearst, intelligently adopt hyperscale cloud platforms. Cloudreach was recently named a leader in the 2017 Gartner Magic Quadrant for Public Cloud Infrastructure Managed Service Providers.

"I was one of the earliest users of Cloudamize - it was a phenomenal platform then, and it's even better now," said Craig Conway, CTO of Livingston International. "Cloudamize provided actionable insights and recommendations that were incredibly valuable in that they identified dramatic reductions in our AWS spend. I am very pleased for Khushboo and the Cloudamize team, and I look forward to future collaborations with the Cloudreach organisation."

"Making the journey to the cloud can be overwhelming, but the right tools and expertise can reduce time-to-value and give better, more strategic results for enterprises," said Owen Rogers, Research Director at 451 Research. "Together, Cloudreach and Cloudamize are positioned to deliver business value through this combination of the right tools and expertise."

### Latest CRM Trends

A few years ago, customer relationship management efforts did not seem to generate enough revenue; in fact, customer relationship management was thought to be on the decline. Today, the arrival of social media and smartphones has prompted businesses to reconsider their customer relationship initiatives. Trends in customer relationships are bound to change how your business relates to your clients, and successful customer relationship management entails continued innovation and the observance of best practices.

Social networking sites are increasingly transforming the customer experience. They allow customers to express their opinions on anything, and as such, customer feedback has become much more important: it can make or break a business. Social media has increasingly become a useful tool for companies to engage with their clients. Here are some of the trends expected in customer relationship management in 2017.

Cloud-Based Customer Relationship Management

Cloud computing technology continues to advance, and customer relationship management continues to grow with it. Cloud computing software is an economical technique for collecting customer data, and a cloud-based CRM application ensures that customer data is readily available. In fact, cloud-based customer relationship management is expected to gain momentum as cloud computing technology continues to progress.

Centralised Data

CRM will continue to look for ways to improve customer experience through extensive data collection and analysis. A CRM app will help your business centralise its customer data. That way you will be able to engage your target audience more efficiently.
Customer relationship management will help you generate leads for your sales team. Besides, it's a continuing process that helps you remain in touch with your growing customer base.

Social CRM

In 2008, Comcast became the first company to interact with its customers on Twitter, confirming the efficiency of social networking sites in managing your relationship with customers. Twitter and Facebook marketing techniques remain in an uptrend, and businesses are paying attention to trends in social media marketing to stand out from the competition. Social networking sites empower consumers to influence brand image and customer perception. Today, negative feedback is no longer just a call to action: with social networking sites, you can expect negative feedback about your brand to reach potential markets before you do. Businesses are increasingly responding to these changes in customer relationships. Gamification and social media optimization have become important marketing strategies, helping keep your customers engaged with your brand.

Crowd-sourcing

Over the past few years, social media has allowed customers to express their opinions on anything. As a result, businesses are increasingly utilizing crowd-sourcing to improve their customer relationships. Allowing customers to voice their views can help your business provide innovative and customer-tailored products and services. In this case, customer relationship management is no longer used only for lead generation and marketing; it also helps a company provide innovative products and services.

Mobility

Mobility has turned into a crucial component of business success. Unlike before, customers are no longer bound to PCs, and cloud-based storage helps customers access data on the go. Mobile devices have increasingly empowered customer service resources. Besides online and mobile experiences, real-world involvement affects consumer perception.

Flexibility

CRM flexibility allows an enterprise to customise its offerings. Your CRM app should be easy to integrate and support multichannel publishing. Accessible and flexible CRM applications are becoming increasingly important for businesses.

Business Intelligence

Brands are increasingly looking for integrated solutions, while cloud-based solutions tend to support only the leading CRM apps. As such, there is a trade-off between the two, and customer relationship management still seesaws between them. Data-driven marketing techniques have become important for businesses. In 2017, customers expect contextual communications that acknowledge their interests and needs, so this is the time to make sure your technology meets your clients' expectations.

The Future of Customer Relationship Management

CRM applications and the marketing around them will continue to grow in the coming years. Pursuing strategies that are in line with the demographic you target helps you harness the power of customer relationships. Twitter and Facebook have increasingly allowed customers to express their opinions about a product or service, and social media is now seen as a crucial business driver. It has become necessary for companies to listen and respond to customers' feedback. That is the only way you can harness the power of customer relationships and continue to anticipate customer needs.

### John Armstrong & Barry Coombs from ComputerWorld - Dell EMC ChannelTalks

James Hulse from Dell EMC speaks to John Armstrong, Managing Director, and Barry Coombs, Operations Director, of ComputerWorld on the latest Dell #ChannelTalks series.
ComputerWorld specialises in the Data Centre, Cloud and Virtualisation market space. Find out more at: http://www.computerworld.co.uk/ Dell EMC are the providers of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry.

### How Big Data Enables Trade Industries To Go High Tech

Big data plays a crucial role in ushering trade industries into a more high-tech sphere. Its role is especially prominent in trade industries because it can improve the manufacturing process, make custom product design more efficient, offer greater quality assurance and effectively manage supply chain risk. Big data enables a firmer, more thorough look at what's working and what isn't in all these aspects, leading to a business that operates smoothly. Investments in big data by companies continue to grow: as of 2015, over 75 percent of companies had already invested in big data, compared to only 58 percent in 2012. The increasing reliance on big data is a good thing for efficiency. Increased analytics, in addition to seamless access and integration involving that data, results in improved communication on all sides of the supply chain.

Improved Manufacturing Process

In the creation of a product, one of the most important aspects is consistency. If 100 customers order a product and half of them receive a defective version, the negative PR impact would be extraordinary. Even a smaller proportion, in the range of 10%, can be cause for serious alarm and monetary loss. Big data enables trade industries to home in on defective products, helping identify the parameters that had an impact on the defect. Comprehensively analysing defective products via big data has worked well for several industries, including pharmaceutical manufacturing: McKinsey and Company were able to save between $5 and $10 million by identifying parameters culled from big data that impacted vaccine yield, after production of two batches produced high yield variation. An improved manufacturing process is made possible by big data and its role in identifying and helping to remedy defects. This type of high-tech monitoring helps keep pace with the competition.

Efficient Custom Product Design

How your company's product stands out from its competition is a big factor in its success, and custom product design is integral. Trade industries can monitor customer behavior on their websites using big data. This behavior can be combined with additional data, like location and gender, to help deliver customized product offers. Trade industries can gain an understanding of which products are viable and which ones are expendable when using both website analytics and big data regarding inventory performance. Beyond that, integrating products with big data can help provide information on product habits. For example, modern air compressor systems often come with software that allows real-time data tracking. The tracking and monitoring help customers operate their products at optimal efficiency, providing a positive impression.

Quality Assurance Made Better

Quality assurance can be a very time-consuming and costly endeavor, especially for businesses that test every single product. For example, Intel originally had to run thousands of tests per individual chip it produced.
They decided to incorporate big data to help define predictive analytics, decreasing the number of tests required for a chip to be cleared. They ended up saving $3 billion in manufacturing costs for one line of Intel Core processors, showing the mighty powers of efficiency and cost savings that big data can bring to trade industries.

Effective Management of Supply Chain Risk

Accidents happen, especially when raw materials get delivered. Bad weather, mishandling and common mix-ups can all contribute to supply chain mishaps. If there is one thing businesses of all types can commiserate about, it's the frequent uncertainty of the supply chain. Fortunately, businesses that have embraced the high-tech aspect of big data are monitoring weather and traffic data to determine delivery times, as well as gauging whether to hold off on a shipment. Additionally, big data can weigh up which suppliers are meeting their deadlines compared with competing suppliers. This aids in developing contingency plans to make sure a natural disaster doesn't impact production. As businesses continue to make strides incorporating big data and consequently embrace high-tech more fully, they will notice fewer hiccups in the supply chain, in addition to quality assurance with greater foresight and an improved manufacturing process.

### IoT Apps - Insight into IoT - DCSL's Nick Thompson

Compare the Cloud interviews Nick Thompson, Managing Director of DCSL Software. Nick talks about what his company does that will impact the future of IoT. What does Internet of Things (IoT) mean? The Internet of Things, or IoT, is the collection and processing of data from a range of micro devices, including RFID, Bluetooth and micro sensors, that allow for intelligent M2M or machine-to-machine communication. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward, using a strategic combination of business consultancy, content marketing, media series, and events and workshops to enable you to stand out from the cloud crowd. Find out more about DCSL here: https://www.dcslsoftware.com/

### Connected Devices Enable Faster Insights and Faster Service

The Internet of Things (IoT) is transforming the relationships between companies that deliver services in the field and their customers. Whether it's for consumer services like broadband, telecoms, satellite TV and utilities, or business services like facilities management, oil and gas extraction or HVAC, change is gathering momentum. It has come at an important time. With large industries around the world, like utilities, continuing to deregulate, IoT is helping smaller companies and start-ups compete and scale through more competitive Service Level Agreements (SLAs). What's more, with a growing engineer shortage, more effective service models are required. The IoT is not new anymore: according to Gartner, "things" – electronics, sensors, and software that can be sensed, monitored, and controlled remotely across network infrastructure – use the Internet more than people do, and their number will reach 25 billion by 2020.
Here's how IoT, data from connected devices and other digital technologies will facilitate seamless customer services, rapid problem-solving and the more efficient deployment of skilled resources:

Delivering Higher-Performing Products and Services through Data-driven Decisions

Maintenance has traditionally relied on either periodically examining and repairing problems based on a fixed schedule, whether needed or not (calendar-based maintenance), or waiting for things to fail and then fixing the problem, no matter how many visits it takes (reactive maintenance). The IoT offers another option.

Detect equipment failures before they occur

IoT allows condition-based monitoring to prevent failures before they occur. Connected devices can schedule predictive maintenance, detect issues before they debilitate functionality, and diagnose problems accurately. If connected to a powerful artificial intelligence (AI) based workforce management solution that can optimally schedule a technician by balancing skills, asset location, parts, technicians' locations and even traffic, there is an opportunity to address an issue before it is even noticed by a customer or disrupts their lives. With AI and a workforce management solution working in harmony, it is possible to easily facilitate service calls, routing jobs to the proper agent or technician, and even delivering customer satisfaction surveys following on-site or machine-to-machine cloud-based equipment servicing. The benefits of this proactive fix aren't just to the customer: using IoT, you only deploy skilled engineers when they are needed, which is far more effective than the best-guess calendar appointment system.

Improve customer service with an unprecedented level of detail

Proactive notification of customers affected by an outage or an affected asset changes the customer service delivery model. You can better position your customer support centres to allow more immediate service, thus reducing costs and providing customer care simultaneously. With the data from connected things combined with artificial intelligence (AI), you can better understand and 'learn' your customers' preferences and behaviour. This provides a superb opportunity to add value with a service that is personalised to customers' expectations. Not only does this mean that you can keep them engaged and satisfied with your organisation, but you can also use AI and IoT to create predictive failure models, enabling your field teams to offer customers valuable maintenance contracts which prevent future downtime, while increasing the lifetime value of each customer relationship. This can increase revenues by adding a premium for a personalised maintenance package.

Understand your operations

The customer and asset data collected also become an incredibly rich source of business insight to improve operations. You can pinpoint areas of efficiency, providing service companies and manufacturers unparalleled insight into real-time and historical performance. It then becomes possible to do more of what works and target problem areas for improvement, all based on solid information. Over the long term, assets will become more robust, thereby reducing the cost of maintaining them. You can also develop future fee-based services, such as guaranteed uptime or tighter SLA time frames.
This helps build better, long-lasting relationships with your customers, based on a data-driven understanding of services in the field.

The IoT Revolution

The IoT is invaluable for analysing internal operations, product management and customer service options. The true value comes from providing services connected through existing and emerging digital technologies all along the service supply chain. IoT providing information across the cloud, which can be utilised by artificial intelligence and delivered straight to customers' and technicians' mobiles, is a situation few could have imagined even three years ago. It is these technologies that service companies must look at. They can now provide customers with the ease and convenience they want, with better offerings to earn loyalty and trust – the building blocks to increase customer lifetime value. This helps generate increased sales, which, with lower costs from optimal service delivery, help to increase profitability. It is not just a short-term gain, though: the 'Big Data' delivered by IoT presents the prospect of informed and sustained competitive advantage. This is why all C-level executives must take this and other digital technologies very seriously. It is technology that will bring about the brand value customers, shareholders and business owners demand.

### Chris Stos-Gale SICL - Dell EMC - ChannelTalks

James Hulse from Dell EMC speaks to Chris Stos-Gale, Technical Director of SICL, on the latest Dell EMC #ChannelTalks series. SICL is a Managed Service Provider with a focus on infrastructure; find out more: http://www.sicl.co.uk/ Dell EMC are the providers of technology solutions, services and support, and in this series James interviews a range of experts from within the technology industry.

### Internet Tips for DIY Data Recovery to Blame for Permanent Data Loss

Kroll Ontrack cautions against DIY data recovery techniques as self-inflicted permanent data loss is on the rise. The data recovery experts at Kroll Ontrack are seeing an increase in do-it-yourself (DIY) data recovery attempts on media shipped to their labs. "DIY data recovery techniques and videos found on the Internet are encouraging individuals to attempt to recover their own data when a loss occurs", says Robin England, Senior Research and Development Engineer at Kroll Ontrack. "We are seeing an influx of drives that have evidence of a DIY data recovery attempt. In many cases, these attempts cause damage, leaving the data to be unrecoverable." In an effort to caution users against performing DIY data recovery techniques, Kroll Ontrack released its Top 10 DIY Data Recovery Fails:

1. A common DIY fail – When a hard drive fails, individuals will run CHKDSK, which destroys data that would otherwise have been recoverable.

2. Common RAID 5 error – When a drive fails in a RAID 5, it will continue to function in a degraded mode. Most people are unaware because they don't monitor the array, but when a second drive fails, the array fails and the data is inaccessible. That's when they pull the drives out, reset them and reboot. At that point, the initial degraded drive may spin up and come ready. The RAID controller will notice the data on the degraded drive is not in sync with the data in parity on the other drives, so it rebuilds parity with the invalid data from the degraded drive. This can overwrite days, weeks, months or even years of data.
3. Forever encrypted – Certain external drives are encrypted and the encryption key resides on a chip inside the electronics of the enclosure. When these drives fail, owners will throw away the external enclosure and try the drive in a different one. Ultimately, the drives are sent in for data recovery and cannot be decrypted.

4. Software fail – Attempts are often made to recover data from a hard drive with physical damage or read errors using data recovery software. Some users will also load the software onto the damaged drive, which includes the data the person is trying to recover. This results in further damage to the hard drive and the data. There is also a risk of the data being overwritten.

5. Sticky rice – A popular Internet remedy: people will often put wet phones in rice in order to dry them out. Phones that are sent in for a data recovery service are then covered with rice and rice residue.

6. Technology guru – Most people have a friend or relative who is thought of as an 'expert' in technology. When a data recovery is needed, this technology guru will open the hard drive in a non-cleanroom environment and dust will fall on the drive. The dust is then cleaned off with their hand. Although the dust will be cleared away, the fingerprint left behind will cause additional data to be lost.

7. Open me – People try to open hard drives and often miss the screw(s) hidden underneath the labels. They then proceed to use a screwdriver to pry open the top cover, causing scratching, divots and, in some cases, breaking of the platters. When scratched or gouged, the top surface becomes unrecoverable, resulting in only a partial recovery at best.

8. Freezer recovery – Another DIY Internet myth is putting a hard drive in the freezer to recover the data. People will often do this and then attempt to run the still-frozen hard drive. During this process, water will condense and freeze to the platters of the drive, causing the frozen hard drive to crash.

9. Old tricks – Years ago, a person could swap the circuit board on a drive in an attempt to fix it, but now the boards are specific to the drive, and a swapped board will never function. Some people are still attempting this today: they try to swap the circuit board in their drive in an attempt to recover their data and, when it doesn't work, they send it in for professional data recovery. In a few instances, Kroll Ontrack was sent a pile of boards and their cleanroom engineers had to figure out the correct one before the data could be recovered.

10. Something is missing – There is a tip on the Internet that if you remove the platters from one drive and put them into a new one, you can recover the data from the platters. This method has been tried countless times without success. People will then send the platters in for professional data recovery; in some cases we have seen just the platters sent in a sandwich bag, nothing else. Without knowledge of the hard drive model, among other vital information, the data is unrecoverable.

It is always best to contact a professional before attempting a DIY data recovery solution. Data recovery is a very delicate process and, if done incorrectly, could cause permanent damage.
### Six Essential Steps to Help Insurance Firms Achieve Digital Transformation

Insurance companies today face significant challenges; increased capital requirements and continuing low interest rates rank high amongst them. However, both are lower in priority than the need to drive digital transformation within the industry. According to research by Gartner, close to two-thirds of the world's biggest insurance companies have already invested in insurance technology start-ups. By 2018, eight out of ten property, casualty and life insurers will have joined them. However, although insurers are fully aware the future is digital, it seems that when it comes to the digital transformation of their own business, things are more difficult. Although insurers are investing in other technology businesses, a digital strategy itself is often not included within the insurer's business strategy. It is, in fact, perhaps the key challenge facing the insurance industry today. Below are six key steps that are essential in helping insurance firms achieve digital transformation:

It starts with people…

Insurers need to understand the needs and priorities of the people at the heart of the digital initiative to develop their approach. According to Jonathan Swift (Wired's Editor-in-Chief), to succeed in the evolving insurance market, you need to think like a start-up, while taking the time at the outset to analyse and ask what your existing customers, stakeholders, employees and executives want and need from digital transformation. Insurers also need to bring in digital expertise, not just at the front end, implementing digital initiatives, but at the top, setting the digital vision and strategy, while identifying the risks and opportunities from emerging technology and changing consumer behaviours. The gap in technological expertise is not unique to the insurance industry: a review from last year of 1,925 executive and non-executive directors at more than 100 of the largest banks showed only 6% of board members and 3% of CEOs had professional backgrounds in technology. More than four in ten of these banks (43%) had no technology experience on their board at all. This knowledge gap is a common challenge across financial services, but one that must be addressed.

It relies on data…

Insurers already have much of the data they need, particularly regarding customer behaviours and attitudes. Aviva's Chief Digital Officer believes they already have incredible insight; the challenge now is capturing this from silos across the business and attempting to turn it into actionable intelligence. Insurers need to use external digital analysis, customer experience and empathy mapping, and digital value chain analysis to inform a process of segmentation and targeting and to define their value proposition and goals.

You need a roadmap…

Digitisation is changing markets quickly. With the threat of disruption from new and changing customer behaviours and expectations, there is an urgency to implement digital initiatives that will prevent insurers falling behind and losing market share. This cannot, however, come at the expense of a long-term plan. Insurers need a lasting plan as well as roadmaps for strategic initiatives and business imperatives.

This requires a digital framework…

A framework allows the company to address digital goals and objectives.
Not only will it show how initiatives to meet short-term business requirements will be achieved, but also how these will change over time and how they link with your strategies. The framework will address how projects today will fit in with the vision of the future, and the methods – on-premises hosting, using the cloud, software as a service – that make the most sense to get there.

Define the digital scope…

There is no escape from considering in detail the company's approach across digital. It is not just about what the business does, but how it presents it. Insurers must outline the purpose, objectives, key initiatives and challenges in all the key areas: website, branding, online content, digital advertising, contact management, social media and mobile. They must also determine what needs to be serviced in-house and what can be outsourced.

Get it right the first time…

Execution and governance are essential to success. Organisations often bite off more than they can chew, starting with a grand transformation that takes too long to provide meaningful returns. The result can mean compromises that undermine the potential value of the project. Should the result be a disappointment to consumers or users, they'll be reluctant to adopt later iterations. Big visions and objectives, therefore, need to be segmented into small, deliverable projects that provide valuable functionality. That's not to say the digital challenge does not require big changes. It does. But on the way there, each step counts, and it is one of the key areas insurers need to start addressing. They should do so not only because digital transformation is probably the biggest issue that they face, but also because it's central to addressing so many of the other challenges on the boardroom agenda.

### The Mid-Market: How to Embrace the Cloud for Business Success

When it comes to cloud adoption in the UK, the largest enterprises typically grab the headlines. These are the ones making the big, bold strategic and technological steps towards efficiency with cloud solutions while managing extensive legacy systems and security concerns. At the total opposite end of the market, there are vast numbers of micro businesses who might be considered 'cloud-native' – who never had the on-prem services that other businesses are phasing out. In between these two lies the mid-market, a group that is as diverse, and whose needs are just as distinct and important, as large enterprises and small to medium-sized businesses. Mid-market businesses, which can be roughly defined as those looking to serve up to about three and a half thousand end users, are established enough to be facing serious, existential decisions as they weigh up the benefits of moving critical business processes from the familiar into the cloud. The stakes are high because the mid-market is the engine room of the UK economy: according to the business advisory firm BDO, mid-sized businesses accounted for a third of UK private sector turnover and added more jobs to the economy last year than small businesses and FTSE 350 companies combined. This shows how big an opportunity the mid-market presents to IT vendors to help these businesses migrate their apps and infrastructure to the cloud.
It's important that vendors do not forget that the mid-market is facing the same digital disruption challenges as larger enterprises, and that these businesses have often made considerable infrastructure investments they don't want to abandon overnight. While very small companies are often able to take the plunge and leverage new trends such as the cloud, mid-market companies may not have as much flexibility to adapt. Still, the Cloud Industry Forum reported in March that 88% of UK businesses had started to implement the cloud. It's clear that there is momentum behind mid-sized companies in their digital transformation. The job is now to show these businesses that moving to the cloud doesn't have to be an agonising experience.

Case study: Lights, Camera, Cloud

Cloud migration is not an on/off decision; it's a journey. Customers on this journey demand a particularly high level of expertise from service providers, and that's why they can't simply rely on a wholesale solution. Customers want vendors to understand their business, provide local customer support and have knowledge of exactly which staged applications will best improve their operational performance. So how do you do it? It often starts with something as simple as the opening of a new office or the expansion of an existing site. It involves working with the vendor to understand your key business drivers and outcomes.

A film studio provider, The Crossing Studios, came to Avaya after it had converted warehousing space and found itself with eight buildings in various studio locations, all running different communications technology. This meant the studio IT staff were responsible for setting up and managing phone systems at every location for each production that rented the studio space. It's essential for directors, crew and actors to stay in contact with the producers, creative teams and agents in Hollywood during production. Any loss in network connectivity could result in an expensive delay that costs the production thousands of dollars an hour. A blockbuster multi-million-dollar shoot operating on a tight schedule might sound like it needs a wholesale cloud solution, but that wouldn't have been right in this case. Of course, the cloud has brought The Crossing Studios operational efficiency, but without ongoing cloud management support, the company couldn't work intelligently to respond to the needs of its customers or grow its bottom line. A hybrid cloud deployment model offered faster deployment, flexibility, scalability, predictable costs and the reliability it was looking for. Importantly, cloud management helped the studio provider expand from eight studios to more than 330,000 square feet of studio space without running into difficulties managing or maintaining its network infrastructure. The managed service around its hosted cloud solution also allowed it to work with the vendor if one of the productions had a new request. Cloud management as part of a hybrid solution made it easy to deliver a consistent communications environment that was aligned to each client's needs and helped The Crossing Studios control operating costs and reduce the workload of IT staff. Looking ahead, this hybrid cloud solution has allowed The Crossing Studios to implement video collaboration so that productions can benefit from face-to-face calls and meetings with their execs back in Hollywood.
A Simple, Scalable Communications Platform That’s Easy to Manage Businesses want to pick and choose what works best for them instead of relying on a single provider or product. The next few years will be a critical period for the mid-market's embrace of the cloud, and it will take effective collaboration between customers and telecom providers to meet their needs. The overall shift to efficiency in the cloud is inevitable: mid-market companies getting left behind isn’t. ### Fluke Networks’ LinkWare™ Live Reaches Industry Milestone   The industry’s fastest growing cloud-based cable certification project management system currently averages 300,000 results per month Fluke Networks today announced that its cloud-based LinkWare Live service has reached a significant milestone – contractors and cable installers have uploaded more than five million test results to date. Today’s installers are able to manage and analyse complex certification jobs and upload the results from anywhere with Fluke Networks Versiv family of testers via LinkWare Live. Installers are reaping the benefits of more efficient workflows based on LinkWare Live. Technicians upload results from the jobsite over Wi-Fi, avoiding the time and expense of driving testers back to the office. Project Managers can set up testers, track job progress, and receive notification of testing mistakes, even while away from the jobsite. LinkWare Live also tracks the last used location and calibration status of testers, reducing project delays. Jobs get done faster, and reports delivered to customers sooner, leading contractors to report gains of up to 20 percent in efficiency. The use of LinkWare Live by customers is accelerating, now averaging over 300,000 test results uploaded per month – up nearly 50 percent from just 12 months ago, making it the industry's fastest-growing cloud-based cable certification project management service. ITM Gets Quick Return on Investment in Fluke Networks Versiv System and LinkWare Live  ITM Communications Limited is a UK-based provider of ICT infrastructure solutions and services. It uses best of breed technology plus its in-house expertise and innovation to deliver standards-compliant solutions and services to clients throughout the UK, Europe and the rest of the World. In 2016, it invested in 20 Fluke Networks Versiv testers and LinkWare Live and put them to work.  “The Versiv System clearly offers several advantages when compared to the previous generation of test equipment,” said Mark Barber, Director, ITM Communications Limited, United Kingdom. “In addition to the productivity gains in testing, we are also able to get test results back to our project managers almost immediately utilising Fluke Networks LinkWare Live. The advantages translate into productivity gains for both ITM and our customers.” Point 1 Relies on LinkWare Live to Manage Increased Demand for Fiber Installation Services Two years ago, Point 1 started experiencing a surge in demand for fiber installation, troubleshooting and certification. The Livermore, California-based contractor jumped on the opportunity and standardised on Fluke Networks Versiv family of testers, and adopted the LinkWare Live cloud-based service to improve planning and management of projects and testers, and streamline the process of reporting certification results to its clients, which range from data centers to healthcare providers. 
They also took advantage of LinkWare Live’s integration with the Brother LabelLink application to quickly generate labels from the LinkWare Live database. [easy-tweet tweet="LinkWare Live gives us the means to provide our clients professional cable certification" hashtags="Data, Cloud"] “Fluke Networks Versiv testers and LinkWare Live provide Point 1 the ability to demonstrate superior job control and provide quick response to clients, saving all parties time and money,” said Bob Figone, Group Executive for Point 1. “LinkWare Live gives us the means to provide our clients professional cable certification, identification, location, and fault prevention services across all of their projects, no matter where they are located.” LinkWare Live Helps E2 Optics Deliver Enterprise-Grade Services on World’s Most Complex Data Center Jobs  E2 Optics specialises in designing, engineering and installing structured cabling and advanced IT systems for companies who own and operate the world’s largest and most complex data centers. These world-class technology companies turn to E2 given its expertise and unique approach in delivering enterprise-grade technology and customer service. Today, E2 has more than 250 employees working on projects in North America and Europe, and it’s one of the fastest-growing structured cabling companies in the U.S. “E2 Optics chose LinkWare Live to improve the planning and management of our data center projects that are underway in the US, Canada and Europe,” said Casey Canada, Manager of Field Operations, E2 Optics. “The LinkWare Live cloud-service provides huge savings on our large datacenter jobs – up to 500,000 links.  We load all the project settings into LinkWare Live, and our field techs access the single database with their multiple testers.  This saves the time of setting up each tester, and, more importantly, eliminates errors.  “For example, we’ve had instances where techs made minor errors entering cable ID’s, such as using dashes instead of periods. In order to get the results accepted, we’ve had to dedicate as many as three associates to review the ID’s in the test results and modify them as necessary.  LinkWare Live eliminated this hassle,” said Canada. Fluke Networks Worked with Installers to Design Ground Breaking LinkWare Live Cloud Service “In developing LinkWare Live, we looked closely at the daily challenges that installers were facing,” said Eric Conley, vice president and general manager of Fluke Networks. “They told us that they often wasted time applying incorrect test limits and were frustrated at the time delay between testing and being able to generate reports, not to mention the project delays caused by misplaced, lost or stolen testers. We created LinkWare Live to eliminate these challenges, and to enable installers to complete each job on time and on budget. To achieve the milestone of five million test results uploaded so quickly is a testament to the value that LinkWare Live brings to installers’ businesses.” LinkWare Live also features an easy-to-use device tracking and management capability based on Google Wi-Fi location services to reduce the likelihood that testers are lost or misplaced. Not only is it possible to monitor the last used location, but the software also checks that each device is always calibrated and running the latest firmware. To save additional time, LinkWare Live allows installers to send cable IDs and test settings straight to Brother labelers at the jobsite for seamless labelling. 
For more information about LinkWare Live's capabilities or to sign up for a free trial, please visit: www.flukenetworks.com/linkwarelive.   ### Workforce Optimisation in the customer engagement centre: The future is in the cloud Workforce optimisation (WFO) is an industry strategy that enhances contact centre resources to promote customer experience, while also boosting operational efficiency. Mainly used by customer contact centres to improve workforce management and agent performance, WFO involves automating processes, data visibility, compliance on legislation, performance and quality monitoring, and solving staff business problems. Now, companies from all sectors are migrating much of their IT estate to the cloud, which is opening up a world of opportunity by digitally transforming customer engagement and WFO by allowing information to be accessed remotely, by an entire workforce whenever needed. Optimising workforces to drive high standards of customer engagement will be greatly impacted by the rising use of cloud computing in the future. Making agents’ lives easier, boosting productivity, allowing flexible working from home, and decreasing operational costs, are all major benefits that can help to elevate customer contact centres into omnichannel-led, Customer Engagement Centres (CECs). [easy-tweet tweet="Manual spreadsheets can now be a thing of the past for those whose jobs include scheduling duties." hashtags="#Cloud #Workforce"] The following advances in WFO, driven by cloud, can positively impact customer engagement strategies from all business sectors: Avoiding the unnecessary red tape of a supervisor with self-management Technology can empower employees to manage their own workloads overcoming the complexity and frustration of antiquated, manual schedule management. Self-service in the workplace will give employees a better sense of freedom from the restraints of processes meaning easier, faster schedule management on the channels of their choice. This delivery model will assist the contact centre agent in a way that is natural, fast, and optimises resources. Longer interactions will be reduced and mobile apps, text-based chatbots and internal portals will empower employees to manage their own schedules, change shifts, request annual leave and take part in training. Any interaction with the app or chatbot by the employee is automatically updated in the WFO platform, which could even automate further actions for the agent to react to. Simplifying work with automation Manual spreadsheets can now be a thing of the past for those whose jobs include scheduling duties. Transformation of business processes afforded by the cloud means that these planning tasks can now be automated across a work network, which includes many things that can become self-service, such as training, coaching and extra shift scheduling. If a scheduler wants to offer overtime or a shift swap, a notification (via the employee’s preferred channel) can be sent to appropriate employees, with the option for them to take up the offer. All of this is then automatically updated in all schedules that are relevant and in the agent portal and system – all driven by a cloud-based automated business rules engine. This advance in scheduling means that both the planner’s and the agent’s time is optimised to not dwell on spreadsheets or play email ping-pong.  Collaboration – there’s no ‘I’ in collaborate WFO means optimising people and processes to create better customer service and customer engagement. 
As contact centres adopt multi-channel approaches, agents may also be under pressure to gain new skills and work in blended contact centre environments (taking and making both inbound and outbound interactions across multiple traditional and digital channels). Now that multichannel has become the norm in most customer contact centres, the most modern centres are now already adopting an omnichannel customer engagement strategy that can take place across multiple channels, without loss of context and with a 360-degree view of the customer. This ensures that conversations never really truly end, even if interactions take place across multiple channels depending on the convenience for the customer at the time. This change needs to be reflected in the skills that workers have, which is where cloud collaboration tools will come in. Cloud collaboration enables employees to make the best of the skills and resources available, and in real time, with screen and video sharing, access to a knowledge base, skills-based presence and more, resulting in faster service that is more likely to hit first interaction resolution targets. There’s no doubt that the workforce is changing with the age of the tech savvy and flexible millennial advancing into work. This means more employees will have the knowledge and capabilities to work from home. This is where the above cloud solutions can really benefit the workforce. The nine-to-five regime is beginning to evolve into a remote workforce and the tools and capabilities need to be in place to ensure that agents can do their best, whether in the contact centre or out. Cloud collaboration, scheduling and automation will make this move as seamless as possible. These three distinct areas show where cloud technology can have a positive impact on the individual agent or the entire workforce. The future is about self-management, collaboration and automation, and having these advances work their magic from a cloud-based foundation available to access whenever and wherever. ### Jisc Selects GTT to Upgrade Internet for UK Research and Education Networks GTT Communications, Inc. (NYSE: GTT), the leading global cloud networking provider to multinational clients, announced today an agreement with Jisc, the provider of digital solutions for the UK research and education community and operator of the Janet network, one of the leading national research and education networks in the world. GTT will deliver high-speed internet services to the Janet network to support the innovation, research and learning of its approximately 18 million UK college and university users. GTT delivers internet services over its Tier 1 global IP backbone, ranked in the top five worldwide. With over 300 points of presence, GTT provides flexible, high-capacity cloud networking services to the world’s largest organizations. [easy-tweet tweet="GTT’s resilient internet connectivity solution for Jisc guarantees ultra-high network availability" hashtags="Data, Cloud"] “GTT is committed to connecting people and organizations around the world,” said Martin Ford, GTT division president, EMEA. “We are pleased to enhance the collaboration of Jisc’s users through high-performance internet connectivity and look forward to further supporting Jisc as their geographic and service requirements expand.” GTT’s resilient internet connectivity solution for Jisc guarantees ultra-high network availability, interconnecting via three geographically separated points of presence. 
The initial service agreement provides up to 120 Gbps of internet capacity with the opportunity to further upgrade the service as Jisc’s user demand increases. The Janet network is also the operator in the U.K. for eduroam, the international roaming service for the education community that supports more than 70 million users globally. “GTT is an ideal partner for Jisc, providing us with the comprehensive global network reach, scalability and flexibility that we require as we grow,” said Tim Kidd, executive director, Jisc technologies. “We look forward to future collaboration to power UK higher education and digital transformation.” ### Cyber Security - Let’s talk AI - Cylance’s Lloyd Webb Compare the Cloud interviews Lloyd Webb, Sales Engineering Director of Cylance. Lloyd talks about what the issues are in cyber security and how AI helps to address them. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Cylance here: cylance.com/en_us/home.html ### Fujitsu Speeds Up Transaction Processing on the Blockchain Develops technology to accelerate transaction performance by 2.7x, enabling use in systems that require high performance Fujitsu Laboratories Ltd. today announced the development of technology that accelerates transaction processing for Hyperledger Fabric, one of the Hyperledger blockchain frameworks hosted by The Linux Foundation. The blockchain is a technology that creates systems with excellent resistance to falsification while preserving high transparency and reliability, all without centralized management. It is expected to have applications in a variety of fields, particularly in finance. Now, Fujitsu Laboratories has developed technology to speed up transaction processing by making the processing of communications between applications and the blockchain platform, which had been the source of bottlenecks, more efficient. In a trial where this technology was implemented in Hyperledger Fabric v0.6.1, it increased transaction performance by approximately 2.7 times compared to the previous method. With this technology, it has become possible to apply blockchain technology to online transaction systems, which require high performance. Details of this technology were announced at P2P Financial Systems 2017, an international conference about the blockchain held in London July 20-21. Development Background The blockchain creates a shared ledger system that, without a centralized manager, is highly transparent and reliable while being extremely difficult to falsify, by requiring the parties involved to mutually verify the accuracy of the transaction data and preserving it in a chain format. The open source blockchain framework Hyperledger Fabric, being developed through Hyperledger, in which Fujitsu Limited is a premier member, is the focus of much attention as a blockchain that is building a robust commercial transaction platform. 
Hyperledger Fabric uses a consortium-type structure in which the number of participants is limited, and is being trialed for use in a variety of fields, particularly finance, but also for supply-chain management in manufacturing, data conversion of insurance policies, real estate contracts, license management, and energy transactions. Issues With blockchain, groups of nodes based on the number of participants form a network, and work together through the network to perform a series of processes from executing transactions to validating the legitimacy of transactions. For this reason, the number of transactions that can be executed per unit of time is limited by communication bottlenecks through the network, compared to previous centralized systems, making it difficult to apply the technology to things like online transaction systems, which demand high performance, including the ability to immediately process large volumes of transactions. [caption id="attachment_44758" align="aligncenter" width="373"] Figure 1[/caption] http://www.acnnewswire.com/topimg/Low_FujitsuBlockchain73117Fig1.jpg Figure 1: Transaction processing on the blockchain About the Newly Developed Technology With the blockchain, while consensus is formed between participating nodes, transactions are processed by applications reading and writing data on the shared ledger, with safety ensured by linking the transaction data in chain format to be managed. Using its proprietary analysis technology, Fujitsu Laboratories learned that, under network conditions in which a response time of about 64 milliseconds or less is required, such as the case where a consortium-type blockchain is operated at multiple locations within Japan, communications between the applications and the blockchain platform during transaction processing are the primary cause of bottlenecks. Now, based on these analysis results, Fujitsu Laboratories has developed the following two technologies to improve transaction performance speed by reducing the number of communications between the applications and the blockchain platform. Features of the newly developed technologies are as follows: 1. Differential Update State (DUS) Functionality When processing transactions on the blockchain, a commonly used method is to retrieve the specified data, then handle the computational processing in the application before writing it back to the blockchain platform. Fujitsu Laboratories has now developed functionality that executes only differential computations on the specified data, in one processing action on the blockchain platform, and functionality that reduces the number of computations directly linked with the number of communications. 2. Compound Request (CR) Functionality Fujitsu Laboratories developed functionality to aggregate multiple processes to send to the blockchain platform for batch execution. This functionality not only makes processing on the blockchain platform more efficient by aggregating multiple processes, it also reduces the number of communications. The functionality maintains accuracy by rewinding to the origin point of the batch execution if a partial error occurs in the aggregated processes, and reprocessing. 
[caption id="attachment_44759" align="aligncenter" width="347"] Figure 2[/caption] http://www.acnnewswire.com/topimg/Low_FujitsuBlockchain73117Fig2.jpg Figure 2: Example showing the reduced number of communications due to the DUS functionality and the CR functionality Effects Fujitsu Laboratories implemented this technology in Hyperledger Fabric v0.6.1 and measured transaction performance on a blockchain platform consisting of four servers. Whereas the previous method could handle 500 transactions per second, Fujitsu Laboratories achieved 1,350 transactions per second using this newly developed technology, an improvement of approximately 2.7 times. [easy-tweet tweet="the Hyperledger Fabric framework has become applicable to online transaction systems" hashtags="Blockchain, Technology"] Future Plans With this newly developed technology, in terms of performance, the Hyperledger Fabric framework has become applicable to online transaction systems that demand high performance in excess of 1,000 transactions per second, such as those demanded by financial institutions. Fujitsu Laboratories will continue development of technologies to further speed up the blockchain while adapting them to the latest version of Hyperledger Fabric, and will carry out trials with a view to commercial applications of this technology, with plans to commercialize it through Fujitsu Limited during fiscal 2017. (1) Hyperledger Fabric v0.6.1 Stable version of the Hyperledger Fabric framework as of July 5, 2017. (2) Consortium-type Blockchains can largely be divided into three types--public, consortium, and private--of which the consortium-type is regarded as being the strongest contender for applications such as in financial institutions. ### The future of business systems Technology moves at an astonishing rate. We need only look back 10 years to recognise the impact of technological advances on the way that businesses operate. 2007 saw the first release of the iPhone by Apple and the unveiling of Android by Google. In this same year, only 15 million households had internet access and 25 per cent of those were still using dial up. In 2007, Amazon offered a service named Elastic Cloud Compute (EC2) as a means to rent remote computing power and IBM announced plans to “Push Cloud Computing“ in the New York Times, but until recently most people did not recognise “Cloud Computing” as a phrase or description of any value or significance. Although the origin and the precise meaning of the term remains a subject of debate, June 2012 is the first time the phrase was included in the Oxford English Dictionary. Even five years ago, most customers were not receptive to storing business data on the internet. [easy-tweet tweet="So why is Cloud the future? There are three crucial reasons - security, affordability and agility to adapt." hashtags="Cloud, Business"] Today, ‘Cloud Computing’ appears 48 million times on the internet and leading industry analysts are predicting Cloud spend will increase by 4.5 times the current rate of spend! With such significant change within such a short space of time, how will business systems change over the next 10 years? Here, Andrew Peddie, the MD of FHL - which is celebrating its 10-year anniversary this year - gives his predictions about what business systems will look like in 2027. All businesses will move towards the cloud, no exceptions! In 2007, few businesses were willing to adopt the internet as a secure repository for their data. 
Those that did were either seen as cutting-edge or crazy, the trust simply wasn’t there. For many, “Cloud Computing” meant nothing and although it is now an accepted term, it is viewed by many in the industry as a buzzword for marketing purposes and an oversimplification of a complex offering. In spite of this, there is no doubt that having applications available via the internet from any device and being able to access securely stored data at any time is an incredibly powerful proposition. Cloud Computing has arrived, is here to stay and is no longer new nor the domain of the elite. It is becoming better, faster and more affordable by the minute and is increasingly the solution of choice to facilitate growth and change for businesses. So why is Cloud the future? There are three crucial reasons - security, affordability and agility to adapt. ‘Best of breed’ Cloud systems are overseen by the world’s foremost security experts; the collective purchasing power of the midmarket client base makes this practical and affordable. A true cloud platform has the flexibility to scale and accommodate change and ensures that organisations are always up-to-date and operate on the latest version of the application. It seems logical to forecast that by 2027, every organisation will be Cloud-based, no exceptions! Convergence - Cloud app functionality will converge into single source platforms More and more consumers are willing to transact via apps. Organisations utilise in-house apps to manage business processes such as sales, ordering, fulfilment, expenses and payments. Apps market places have never been so busy. Increasingly, consumers and companies prefer to work with a single provider. Take television as an example. People have options for their viewing such as Sky, Virgin Media, Amazon, Netflix. Much of the content is shared under license and often, telephony and broadband are part of a holistic offering. If you extend this thinking to cloud-based business application platforms, by 2027 expect to see a small number of providers (perhaps three or four) offering similar solutions which benefit from converged functionality of third party apps. More than likely, companies offering extended functionality will be acquired for the larger players to merge the features into their own eco-systems. Unified communications will be standard on all platforms By 2027, businesses will no longer need to run separate communications solutions as their technology platforms will deliver unified communications. Communication features such as voice, video, chat, file sharing and email in a “one-to-one” or “one-to-many” format will be available to any device via the organisation’s chosen business platform technology solution. Advanced automation of business processes Today, many businesses could not survive if they had to rely solely on manual processes. High volume, low value transactions, currency fluctuations, electronic invoicing, automatic payments and collections are an essential part of competing in a rapidly evolving and increasingly demanding market. Within the next 10 years, manual business processes will be a thing of the past for most businesses and ‘accounts payable/receivable’ departments will most likely be extinct. Instant real-time transactions can be processed automatically, period-end billing, complex consolidations, currency fluctuations, intercompany transfers and even solvency computations are already a familiar feature for some organisations. 
Artificial intelligence applied to basic business rules will feature strongly in advanced decision-making techniques ensuring automated optimal performance in all but the most complex scenarios. No Code / Low Code and Gamification Although gaming might conjure up images of teenagers spending hours on X-Box and PlayStation, modern computer games are a great exponent of advancement in programming and hardware capability. The military has played out “war games” or simulation of projected scenarios to help better predict and plan for potential events and as a training aid, it is invaluable. So why not business? Consider the pre-sales process. Most consumers like to conduct their own research these days and don’t like to have a sales person calling them. Imagine an online scenario-based environment that could simulate a business’ requirements, evaluate a technical and/or commercial fit and generate custom output to suit its needs without the need to engage in time consuming emails or telephone calls. What better way to research a product or service? It is accepted that the modern computer game has millions of lines of code written by genius developers who sit in darkened rooms looking at over-sized screens for a number of years just to create a seamless and lifelike experience for the gamer. When you buy a PlayStation game, you don’t expect to have to learn to write code to be able to play it. Similarly, the cloud-based business platform will emerge equipped with low code or zero code “toolsets” that allow the user to configure and customise features and functionality in a visual environment and with the use of code. This work will be simple enough to be managed in-house making it fast and cost effective to deliver and will be “future-proofed”. A final word Ten years is a long time in technology. In the world of business systems alone, there have been dramatic changes in the last decade. In 2007, the outlook for business adoption of software as a service – SAAS – was positive but not prolific. It is somewhat different today with leading analysts predicting universal adoption and forecasting cloud as the default solution of choice for more than 80% of businesses. Cloud is now considered mainstream technology more than disruptive and it has gained the respect and confidence of the mid-market. Technological convergence, advanced business process automation and low/zero code environments will create such lean, efficient and agile organisations that the best exponent of technology will dominate. The last 10 years has been an interesting period of preparation in cloud-based business systems, almost like a warm up to the main event. By 2027, we can expect to see the smartest of companies equip themselves with the very best cloud technology tools and benefit from versatility and agility in meeting the ever-changing needs of their market place. Let’s go! ### Security - Let's talk AI - Alert Logic - Oliver Pinson-Roxburgh Compare the Cloud interviews Oliver Pinson-Roxburgh, Director for solution and architecture team at Alert Logic. Oliver talks about what does Artificial Intelligence mean to him and to Alert Logic. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. 
At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Alert Logic: https://www.alertlogic.com/   Let's Talk AI ### ISPA questions ‘Broadbad 2.0’ report’s methodology and findings The British Infrastructure Group (BIG) report draws attention to two important areas – good quality broadband and automatic compensation – that are already being addressed in a constructive and effective way by industry, Government and regulators. ISPA understands that access to good quality broadband is a key issue for MPs and their constituents up and down the country. We actively welcome interest from parliamentarians in broadband and we have helped several MPs with local campaigns to improve broadband availability and take up. However, it is important that policy in this area is undertaken in a robust, evidence-based manner. The report’s figure that up to 6.7m users may fail to receive speeds of 10Mbps unhelpfully conflates take up with coverage. Ofcom data from June 2016 shows that 95% of the public can access broadband of more than 10Mbps, with average speeds of 51Mbps, and thinkbroadband data from July 2017 shows that 97% (or around 900,000 users) are able to access speeds of 10Mbps or more, which enables users to access and perform everyday online activities, including streaming. Additionally, while more than 89% of the population can access superfast broadband (speeds of more 30Mbps), Ofcom figures show that 31% of premises have actually done so. ISPA is supportive of everyone being able access good broadband. A number of ISPA members are helping meet this challenge by rolling out broadband using a variety of technologies and services across the UK. Underpinning market rollout, we also support the creation of a broadband Universal Service Obligation, passed into law in May 2017 as part of the Digital Economy Act. This will give everyone a legal right to request a 10Mbps broadband connection and has the ability to evolve over time in line with consumer demand and user expectations. In further recognition of the importance of being online, ISPs and Ofcom have been working on a set of voluntary proposals that will provide consumers with automatic compensation for network outages. The scope and detail of these proposals are currently being consulted upon but this, alongside existing consumer measures, such as the free-to-consumer alternative dispute resolution, means consumers have effective ways of seeking redress. Commenting on the report, Andrew Glover, Chair of ISPA Council, said: “ISPA members are actively rolling out super and ultrafast connections throughout the UK, increasing the availability of broadband of at least 10Mbps to over 95% of UK premises as speeds have increased to an average 51Mbps. As well as competing on the quality of broadband, ISPs compete on customer service, and new and existing consumer protection measures provide a strong basis for maintaining good service levels. ISPA welcomes parliamentary interest in broadband and we have helped support MP’s local broadband campaigns, but it is important that research and reports that inform policy are robust. 
By failing to acknowledge the work that is already underway and selective use of data, this latest report falls short of this standard.” ### Computer Vision - Let’s talk AI - Calipsa’s Mohammad Rashid Compare the Cloud interviews Mohammad Rashid, Co-founder & CEO of Calipsa. Mohammad speaks about how Calipsa is an AI company that builds software to automate the understanding of videos. The main technique used is computer vision, where computers are able to interpret what they can see on videos - similar to humans. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Calipsa here: http://calipsa.io/ let's talk ai ### Bots: the good, the bad and the ugly   Chatbots are over fifty years old, but the technology is still evolving. Despite recent advancements, the history of chatbots hasn’t been all plain sailing. In its latest infographic Parker Software, creator of WhosOn live chat software, has illustrated the history of chatbots. The infographic, Bots - the good, the bad and the ugly, can be downloaded from the WhosOn website here. The history of chatbot technology begins in 1966, when Joseph Weizenbaum created an early natural language processing computer program, Eliza. The program was designed to mimic human conversations by matching user prompts to pre-scripted responses. Since Eliza’s creation, technologists have experimented with the creation of several different chatbot programs and software applications. Only in recent years are we beginning to see chatbot technology used in everyday applications — think Apple’s Siri, Microsoft’s Cortana and Amazon’s Alexa. While some could argue that these intelligent, voice activated assistants are not truly chatbots, these applications are introducing the wider consumer market to the possibility of chatbot technology in our homes, workplaces and on the go. “Eliza may have been developed over fifty years ago, but chatbot technology is still in its infancy,” explained Howard Williams, marketing director at Parker Software. “It’s only been in recent years that we have started to understand the potential of chatbot technology from a business perspective, rather than as a novelty or as a technological experiment. “Without even realising, many customers will have interacted with chatbots on customer service calls or live chat applications. However, it’s important that customer service is not completely reliant on this technology to manage complaints,” continued Williams. “As chatbots continue to evolve, it is vital that organisations take steps to explore the technology before fully implementing it into their customer service efforts.” Parker Software provide business automation software and live chat applications for ecommerce organisations. In the company’s upcoming book, The Conversation Engine, the technology experts at Parker Software will be exploring the potential, the future and the risks around using chatbots for customer service. 
### A Complete Overview of the Dell Technologies Stack Compare the Cloud was joined by Peter Barnes, Enterprise Systems Director of Dell EMC to discuss the overall technology stack. Peter also talks about how the different technology areas, such as VMware and the EMC storage services, leverage an overall design that encompasses all areas of solution and cloud native services. Find out more about Dell EMC here: https://www.dellemc.com/en-us/index.htm At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. ### Mitel announces definitive agreement to acquire ShoreTel Mitel (Nasdaq: MITL) (TSX: MNW) and ShoreTel (Nasdaq:SHOR) today announced that they have entered into a definitive merger agreement pursuant to which Mitel will acquire 100% of the outstanding shares of ShoreTel common stock in an all-cash transaction at a price of $7.50 per share, or a total equity value of approximately $530 million and a total enterprise value of approximately $430 million. The purchase price represents a 28% premium to ShoreTel’s closing share price on July 26, 2017. Stronger together as a global market leader in the rapidly growing UCaaS market  Continuing to deliver its move-to-the-cloud strategy, with this transaction Mitel is accelerating on a growth path by investing further and faster into the UCaaS (Unified Communications as a Service) market as digital transformation accelerates customer demand for cloud-based solutions globally. The combined company will be the #2 player in the UCaaS market, creating a supplier with the scale and technical capabilities to enable customers with new cloud-based solutions and applications. The combined company will be headquartered in Ottawa, Canada, and will operate as Mitel. Rich McBee, Mitel’s Chief Executive Officer, will lead the combined organization. Steve Spooner, Mitel’s Chief Financial Officer, will also continue in that role. “This is a very natural combination that enables us to continue to consolidate the industry and take advantage of cost synergy opportunities while adding new technologies and significant cloud growth to our business,” said Mitel CEO, Rich McBee. “Together, Mitel and ShoreTel will be able to take customers to the cloud faster with full-featured, cloud-based communications and applications.” Uniquely qualified to take customers and partners to the cloud Together, the combined company will have approximately 3,200 channel partners and an industry-leading portfolio of communications and collaboration solutions. Mitel and ShoreTel are committed to providing continued support and an attractive path forward for all customers and partners – cloud and premise. On closing of the proposed transaction, the combined company will have a global workforce of approximately 4,200 employees. “With the announcement today, this concludes our comprehensive review of strategic alternatives by delivering a significant cash premium for our shareholders,” said Don Joos, CEO of ShoreTel. “Customers are clearly moving to the cloud at a rapid pace. 
The combination of Mitel and ShoreTel creates a new UCaaS market leader with a differentiated strategy and solution, and a clear migration path so that no customer is left behind or will have to abandon what they already have to cloud-enable their organization.” Once the transaction is complete, Mitel will be uniquely positioned to offer all customers the advantages of cloud-based communications. For enterprise customers, ShoreTel’s solutions will strengthen Mitel’s ability to cloud-enable customers with existing premise or mixed estate deployments, creating the technical foundation needed for delivery of next-generation cloud applications. Size, scale and financial foundation to drive growth Financial highlights of the transaction include: Combined sales of $1.3 billion* Increases Mitel’s total recurring revenue to 39% of total revenue* More than doubles Mitel’s UCaaS revenue to $263 million* Significant synergy opportunity targeted at $60M in annual run rate spend expected to be achieved over two years Expected to be accretive to non-GAAP EPS in the first year *based on trailing twelve months combined to March 31, 2017 Transaction Details The transaction will be completed through a cash tender offer for all of the outstanding shares of ShoreTel common stock, followed by a merger, which will not require approval of ShoreTel’s stockholders, in which remaining shares of ShoreTel common stock will be converted into the right to receive the same $7.50 cash per share price paid in the tender offer.  ShoreTel’s Board of Directors has recommended that ShoreTel stockholders tender their shares in the offer. In connection with the execution of the merger agreement, ShoreTel’s directors and executive officers, have entered into tender support agreements with Mitel pursuant to which they have agreed to tender their shares to Mitel's offer. Mitel intends to finance the consideration for the acquisition and associated transaction expenses using a combination of cash on hand from the combined business, drawings on its existing revolving credit facility and proceeds from a new fully underwritten $300 million term loan maturing in 2023.  The existing term loan and revolving credit facility will remain in place, with the Company having already obtained the requisite majority consent to certain amendments which accommodate the acquisition and the incremental financing. BMO Capital Markets is leading the new term loan facility with Citizens Bank, N.A., HSBC Bank Canada and Canadian Imperial Bank of Commerce serving as Joint Lead Arrangers and Joint Bookrunners.  Citizens Bank, N.A., lead on the existing amended facilities, will act as administrative agent for these and the new term loan.  EA Markets LLC provided Mitel with independent advisory and transaction services in conjunction with the arrangement and structuring of the new financing. The transaction is expected to be completed in the third quarter of 2017, subject to ShoreTel stockholders having tendered shares representing more than 50% of the outstanding shares of ShoreTel common stock, certain regulatory approvals having been obtained and other customary conditions to the tender offer having been satisfied. Jefferies LLC is serving as financial advisor to Mitel, Paul, Weiss, Rifkind, Wharton & Garrison LLP is serving as legal advisor to Mitel and Osler, Hoskin & Harcourt LLP is serving as legal advisor to Mitel in connection with the financing.  J.P. 
Morgan Securities LLC is serving as financial advisor to ShoreTel and Fenwick & West LLP is serving as legal advisor to ShoreTel.     ### Provide Insights and Automating Processes - Let’s talk AI - Sidetrade’s Jean-Cyril Schütterlé Compare the Cloud interviews Jean-Cyril Schütterlé, SVP, Head of Product Management and Co-Innovation S/4HANA of Sidetrade. Jean-Cyril talks about the types of AI Sidetrade are using to help their customers acquire new customers and to grow their business. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Sidetrade here: http://uk.sidetrade.com/ LEt's talk AI ### Moving to the Cloud: Multiple Benefits for Business A hosted cloud solution (a software solution that is simply hosted on a cloud platform, rather than traditional servers) is different from a native cloud solution due to its architecture. A native cloud solution can utilise the scalability, local geographical deployments of the service and the vast storage available, rather than just being hosted there and operating in the same way it is used on a static server, eliminating any need for hardware, and the associated headaches. True cloud solutions can therefore provide best in the industry services and the list of benefits continue to roll out Seamless integration and unlimited scalability  One of the major barriers to progress and service diversification is the time and resources needed to develop and deploy innovative value added services internally. PaaS (Platforms as a Service) provide a platform for organisations to develop, run, and manage applications without the complexity of building and maintaining the infrastructure typically associated with developing and launching an app. A good example of PaaS with high uptake, is Google’s business tools. This is a simple PaaS offering where they are able to spin off new services (new versions of powerpoint, word etc.) and update and upgrade services on the go without the end user needing to alter or change anything or download updates. Although true cloud solutions offer many benefits, organisations must design their platform to take advantage of the built in seamless integration and elasticity, in order to realise these benefits. Undoubtedly, the most significant benefit of true cloud platforms is their capacity for unlimited scalability, which enables end users to scale the processing power and storage capacity of solutions to fit their individual requirements. True cloud solutions have been built securely to perform in a multi-tenant cloud environment. They therefore support multi-tenancy, as well as data redundancy. They are often deployed with open APIs, providing users the flexibility to adapt the solution to fit their individual requirements. This agility perfectly complements the dynamic and constantly changing demands of the modern world. 
Democratisation of super computing power To spin off such successful solutions like Spotify, startups would need much larger funding to spend on hardware and computing power. Cloud solutions allow them to tap into this vast computing power for fraction of the cost and allows them to grow their usage as they grow. This levels the playing field for smaller organisations. Cloud solutions are better equipped to help companies provide best in the industry solutions for their clients. The leading true cloud platforms are Amazon Web Services (AWS), the Google Cloud Platform and Microsoft Azure. Two notable examples of companies using these platforms to deploy their service include Netflix and Spotify. AWS enables Netflix to rapidly deploy data content on an enormous scale, to servers all over the world. [easy-tweet tweet="Providing best in industry solutions is the best way to secure a good reputation" hashtags="Cloud, Security"] It is AWS that enable Netflix to manage their huge user base and volume of data. Spotify use Google’s Cloud Platform to host their data centre, having opted to focus on data user queries to provide the best possible user experience. The Google Cloud Platform enables Spotify to scale their service to fit their popularity, and to answer user queries within seconds, by hosting their data centre on their scalable and secure platform. It stands to reason that providing best in industry solutions is the best way to secure a good reputation and competitive differentiation. True cloud platforms enable companies to deploy their solutions in a ‘Software as a Service’ format, which guarantees unlimited scalability and global availability. Open APIs – giving users the flexibility to adapt solutions to fit their requirements Cloud solution providers offering productised APIs (Application programming interfaces) give businesses the ability to integrate their cloud services within existing business tools and process. Furthermore it gives them opportunity to tailor their cloud services to suit their business model and has the flexibly to change as the organisations grows. Automating back office processes is an excellent example, giving the organisation the free time and opportunity to work on more ROI focused tasks. Greater value for money Native cloud platform with unlimited scalability and no CapEx enables rapid deployment, offers very attractive financial returns, and opens up new revenue streams with the end user customer base that have never been realised before. It again goes back to the ability of cloud enabling unlimited scalability meaning no tedious upgrade is required, no replacement of aged and out dated hardware. Cloud enables companies to remain updated all the time, with minimal investment, and allows business to remain flexible and adapt solutions as necessary. Furthermore, compliance is a necessary part of business. It’s up to individual companies to avoid any potential cost or disruption, by having in place agile and flexible technology that can adapt to changes. After all, there are no guarantees that compliance requirements that are in place today, won’t change over time. Data has become a form of currency, and its value to businesses is increasingly being exploited to improve service delivery, open up new revenue streams, and redefine standard business practices. 
Secure, accessible and compliant Another of the many benefits of building a native cloud service on AWS infrastructure, is that it will automatically comply with all GDPR requirements when this law comes into force in less than a year. The flexibility that comes with cloud solutions also helps with regulation. With ever changing geopolitical stances around the world, organisations get a helping hand in becoming compliant for shorter periods of time with minimum financial risk. AWS provides a Data Processing Agreement (DPA) stating that they will meet the requirements of the GDPR. Their teams of compliance, data protection, and security specialists work to ensure that their customers across Europe are fully prepared for the new regulations. For laws like MiFID II & Dodd-Frank, service providers can localise service deployment using AWS, to comply with data sovereignty, high capacity and long term storage requirements. A final thought Scaleability, flexibility and global accessibility. These are a few of the positive attributes moving to the cloud can offer businesses. In a competitive market place, where each business model is looking how best to drive ROI – cloud solution services are shown to be adaptable solutions, with no Capex and rapid deployment that offer attractive financial returns and open new revenue streams to the end user customer.  Add to that security and the assurance that a cloud service is GDPR compliant and it becomes more evident and apparent the major benefits there are for businesses to adopt a native cloud service - and never looking back.   ### PatSnap launches cloud platform to accelerate drug research PatSnap, the global Intellectual Property Analytics company backed by Sequoia and used by innovative companies to accelerate their research and development (R&D), has today announced the launch of Chemical by PatSnap, a Software as-a-Service (SaaS) platform that combines innovation data with vital and relevant scientific information into one single and easily searchable interface. Chemical by PatSnap enables intellectual property professionals at science-led organisations to easily validate chemical development projects by analysing Big Data through Machine Learning and Artificial Intelligence (AI). Its database links over 114 million chemical structures, clinical trial information, regulatory details, toxicity data, over 121 million patents and a number of other sources, quickly validate chemical development projects. Having recently secured series C investment from venture capital group Sequoia, and experiencing rapid growth, PatSnap is now turning its established expertise in machine learning and artificial intelligence to helping pharmaceutical and chemical manufacturers map the innovation process from investment to commercialisation. “The main challenges in R&D are that companies use resources in a way that’s not productive, for example hiring people to do studies and accumulate lots of data, but at the end of the day, they do not assimilate all that information into a coherent strategy. Successful commercialisation of a drug is expensive and fraught with high risk. Estimated costs can rise to as much as $2.6 billion, while 14 drug candidates will fail clinical trials for every one that makes it to market. Current strategies have not been able to bring down the costs of Research and Development, and the pressure to adopt value-based and outcome-based pricing models has rapidly intensified,” said Ali Hussein, UK Product Leader at PatSnap. 
“It’s a well-established principle that Big Data holds the potential to address these problems, but until now it has been difficult to extract this information from the worlds of chemistry and innovation intelligence in either a cost-effective or resource-efficient way. Particularly challenging is the accurate integration of multiple relevant data sets and the skill set required to analyse and interpret results.” Organisations need to fail fast and fail cheap, and need to harness the power of big data analysis to overhaul the current rate of research, and rapidly seek out corresponding areas of innovation. Organisations are no longer content for data providers to be mere points of reference, but expect them to be able to generate immediate answers to an array of critical business questions. ### Machine Learning - Let’s talk AI - SAP’s Sven Denecken Compare the Cloud interviews Sven Denecken, SVP, Head of Product Management and Co-Innovation S/4HANA of SAP. Sven talks about how SAP the areas of Artificial Intelligence they are using within the business. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about SAP here: https://www.sap.com/index.html Let's talk ai ### IoT Solutions and a Smarter Grid can Help Cut Power Outages The Internet of Things (IoT) is a huge focus for governments and businesses across the world, with both looking at how they can make both products and infrastructure “smarter”. IoT-enabled “smart cities”, for instance, are revolutionising what is possible in any given metropolis. Having a variety of solutions “talking” to each other enables businesses to boost process automation, drives efficiency, and can lead to savings. India has begun its Smart Cities Mission, which aims to develop more than 100 smart cities across the country. Electricity distribution companies must also look at ways they can leverage the IoT. Research has found that the number of UK power outages in 2015 rose dramatically compared to the previous year – from 537 incidents to 640 – with the UK’s ageing infrastructure creaking under increased demand. Outages not only cost distributors money but also cost UK companies thousands of pounds, with the total cost of downtime estimated to be 66,170 Euros per hour. There are many reasons why such companies should consider implementing IoT solutions, but preventing such power outages should be a major factor in their considerations. By leveraging IoT and creating a smarter power grid with IoT-enabled solutions, such as intelligent lighting, these types of blackouts can be reduced and managed more effectively when they do occur. A smart grid with smart solutions allows electric distribution companies to have access to a constant stream of real-time data. This enables them to spot problems early and fix issues before they evolve, as well as allowing them to analyse this data to be more intelligent in their planning. 
For example, if they notice a growing trend of increasing energy consumption in one particular neighbourhood, they can analyse the data to see if this trend is likely to continue and then plan accordingly, ensuring the correct infrastructure is in place to cope with the demand. Or, if the lights in a particular building appear to be on more frequently than expected, this can be inspected to establish whether there is an issue. Therefore, better understanding of how electricity and the grid works allows them to plan for the future more effectively and plan for upcoming events in advance. [easy-tweet tweet="IoT enables the remote diagnoses of problems to be identified in advance" hashtags="IoT, Data"] What’s more, if an outage does occur, having access to granular data enables companies to react to downtime more effectively and efficiently. Responding to an outage can often be time consuming, costly, and inefficient. In the past, the majority of electricity distribution companies would send the nearest maintenance or repair team to the problem area. Once they got there, this team then would have to diagnose the problem on-site and begin the repair, which could take considerable time. In addition to this, as these companies would have little or no idea of the fault before the arrival of the repair team, they may not have the correct equipment, meaning they may not be able to begin the repairs until a further support team arrives. This can extend downtime considerably, especially in rural areas. This extra delay is not only time consuming and frustrating but damages customer relations. But this no longer needs to be the case. IoT enables the remote diagnoses of problems, meaning the tools needed to fix the problem can be identified in advance. So, rather than sending the closest team to the problem, companies now can send the team that has the right equipment – thus saving time and money. However, we still are some way behind fixed line network infrastructure being able to cope with such large amounts of data needed for remote solutions to be a success. But by leveraging cellular technology, smart grid solutions can have access to reliable connectivity even in the most remote locations, allowing the flow of real-time, actionable data. Overall, if these solutions are backed by reliable cellular technology, they can not only make electrical distribution companies more efficient but they also can make the grid smarter, allow for more intelligent planning, and prevent costly and reputation damaging down times.   ### Predictive Analytics - Let’s talk AI - Logical Glue’s Robert De Caux Compare the Cloud interviews Robert De Caux, Chief Product Officer of Logical Glue. Robert talks about which industries Logical Glue are primarily focused on and what sets the company apart by way of the techniques in which they use AI. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. 
Find out more about Logical Glue here: https://www.logicalglue.com/ Let's talk AI

### A Guide to Blockchain and How it Works

The global blockchain market is experiencing exponential growth: market predictions suggest it will be worth US$20 billion by 2024, up from US$315.9 million in 2015. For those who are unfamiliar with it, or just want to learn more, René Bader, Manager of Critical Business Applications & Big Data at NTT Security, provides an explanation of what it is and how it works. Blockchain allows the validation of peer-to-peer network transactions without the need for an intermediary, providing traceability and transparency. The “crypto-currency” Bitcoin is one of the driving forces of Blockchain, but beyond this there are numerous application possibilities, as I’ll go on to explain. [easy-tweet tweet="The role of the blockchain is to act as a public ledger for Bitcoin " hashtags="Blockchain, Bitcoin"] I’ll use Bitcoin as an example. Bitcoin permits value transfers between unknown parties without the need for a financial service provider as a central agent. These transactions must be secured seamlessly, verifiably and transparently, and it is the peer-to-peer architecture of the blockchain technology, described in more detail below, that allows this to happen. The role of the blockchain is to act as a public ledger for Bitcoin for all network participants. It is not an absolute account, but it does record all transactions that have ever been executed and validated, which can be used to calculate the current account balance. Even though the information on the transactions is public, the parties involved remain anonymous because the addresses of the transactions only show as anonymous codes.

Bitcoin vs traditional payment systems

The Bitcoin approach is different because it documents transactions solely in the Blockchain, as opposed to traditional payment systems such as banks, which are responsible for monitoring incoming and outgoing transactions, controlling the account balance and storing them centrally in their systems. Bitcoin uses what’s known as a decentralised database because the information on the transactions is distributed and stored fully on all computers that are participating in Bitcoin. This means that the complete blockchain is not owned by anyone and is public. The current size of the blockchain of the Bitcoin network is around 80 gigabytes, but with the continual formation of new blocks, it is constantly growing. Blockchain takes its name from the fact that transactions are collected and stored in consecutive blocks about every ten minutes. The maximum size of a block is one megabyte and contains several hundred transactions. Each new block is joined to the previous block in chronological order, forming a new link in the Blockchain, and over the course of time the chain continues to expand. The very first block is aptly called the genesis block. Each block contains a time stamp and the actual transaction data, as well as two hash values based on the cryptographic hash function SHA-256: a hash value over all the transactions collected in the new block, and the hash value of the previous block.

Key in the chain

Private and public keys must be created before a block is formed and integrated into the Blockchain; these are generated by the wallet software as a key pair, based on asymmetric encryption, on the client computer of a participant in the Bitcoin network.
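As a rough illustration of how such a wallet key pair might look in code, here is a minimal sketch assuming the third-party Python `cryptography` package. It is a simplification: real Bitcoin wallets use compressed public keys and derive the familiar 34-character address via RIPEMD-160 hashing and Base58Check encoding, none of which is modelled here.

```python
# Illustrative sketch only - not the actual Bitcoin wallet or address format.
import hashlib
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

# 1. The wallet generates an asymmetric key pair on the secp256k1 curve.
private_key = ec.generate_private_key(ec.SECP256K1())
public_key = private_key.public_key()

# 2. A toy "address": a truncated SHA-256 digest of the serialised public key.
#    (Real addresses use SHA-256, then RIPEMD-160, then Base58Check encoding.)
pub_bytes = public_key.public_bytes(
    serialization.Encoding.DER,
    serialization.PublicFormat.SubjectPublicKeyInfo,
)
address = hashlib.sha256(pub_bytes).hexdigest()[:34]

# 3. The private key signs a transaction; anyone holding the public key can
#    verify the signature, proving the sender is the originator.
transaction = b"alice pays bob 0.5 BTC"
signature = private_key.sign(transaction, ec.ECDSA(hashes.SHA256()))
public_key.verify(signature, transaction, ec.ECDSA(hashes.SHA256()))  # raises if invalid

print("toy destination address:", address)
```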
Visible to all, the public key is used to create a 34-character string to act as a Bitcoin destination address for any additional transactions. This is important for the anonymity of the transaction, as the address cannot be traced back to the public key. The private key signs a transaction, and without it the transaction would be deemed invalid. The public key attached to the transaction proves the sender of a transaction has the appropriate private key and is the originator. The signed transaction is sent to all nodes in the Bitcoin network, and as soon as a certain number of nodes confirm receipt it is confirmed as "shipped". Validity is the next phase of the transaction process: it is checked before a block is formed, to prevent any tampering and to ensure the correct amount is credited to the recipient and debited from the sender. Without validation, an amount could be sent several times, or Bitcoins that do not exist at all could be issued.

Validation by Mining

In blockchain technology there is no central location for transaction validation; instead, the Bitcoin blockchain relies on "miners". These are computers, or pools of computers, that provide their computing capacity to the system for the validation of transactions and the formation of the blocks. Miners must prove their trustworthiness and ability to qualify for the task of mining, and for this a "proof-of-work" process is used. Miners must solve a computer-intensive cryptographic task - the creation of the hash value - which can only be processed by trial and error. These tasks are hugely difficult, so the frequency of block formation is controlled and held at roughly one block every ten minutes. The calculation effort ensures that subsequent modifications of the blockchain are excluded and that the chain cannot be altered at will. Mining is complex and time-consuming, and by way of motivation, miners receive a certain number of Bitcoins as a "reward" when the cryptographic task has been solved. Conversely, the attempts of slower miners that processed the same transactions at the same time are automatically invalidated and cancelled. Having built a valid block, the miner sends it to the network, where each participant can check the block’s validity and attach the block to its local copy of the blockchain. Validation by the miners is deemed a trustworthy process in the peer-to-peer architecture, where there is no central controlling authority, and it leaves the transaction final and irrevocably documented for all blockchain participants. Mining acts as the validation and "money creation" in the Bitcoin system. To start with, 50 Bitcoins could be created per block, a figure which is halved every 210,000 blocks, so it is now only 12.5 Bitcoins per block. Consequently, only 21 million Bitcoins can ever be generated; once the limit is reached, which will likely be around 2140, no new Bitcoins can be generated.
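As a rough illustration of the block structure and the trial-and-error mining described above, here is a minimal sketch in standard-library Python. It is deliberately simplified: real Bitcoin blocks use a fixed binary header format, a Merkle tree over the transactions and a dynamically adjusted difficulty target, none of which are modelled here.

```python
# Simplified block hashing and proof-of-work sketch - illustrative only.
import hashlib
import json
import time

def mine_block(prev_hash: str, transactions: list, difficulty: int = 4) -> dict:
    """Find a nonce so that the block's SHA-256 hash meets a toy difficulty target."""
    tx_hash = hashlib.sha256(json.dumps(transactions).encode()).hexdigest()
    timestamp = int(time.time())
    nonce = 0
    while True:
        header = f"{prev_hash}{tx_hash}{timestamp}{nonce}".encode()
        block_hash = hashlib.sha256(header).hexdigest()
        if block_hash.startswith("0" * difficulty):  # found only by trial and error
            return {"prev_hash": prev_hash, "tx_hash": tx_hash,
                    "timestamp": timestamp, "nonce": nonce, "hash": block_hash}
        nonce += 1

genesis = mine_block("0" * 64, ["genesis"])
block_1 = mine_block(genesis["hash"], ["alice -> bob: 0.5 BTC"])
# Because block_1 embeds genesis["hash"], any later change to the genesis block
# would invalidate block_1 and every block chained after it.
print(block_1["hash"])
```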
Bitcoin is probably the most widely known application area of blockchain technology, but there are many other areas in which it could be used, including:

Banks
Blockchain is a technology currently undergoing careful consideration for use in the banking sector, where its use in billing and transferring assets by documenting transactions without validation from a central office could significantly reduce costs and speed up processes.

Smart Contracts
Blockchain technology can be used to process more complex contracts. For example, the conditions of a contract can be mapped in the form of executable program code, ensuring automated compliance with the contract by determining which condition leads to which decision. The speed of ownership transfers or leasing could be increased, and they could be carried out without an intermediary. Once the buyer or tenant has paid the vendor or landlord, ownership would be transferred to him or her, or alternatively access could be granted via a digital key.

Insurance
Blockchain could be used to dynamically adapt insurance conditions based on the policyholder’s habits, with the premium payment adjusted accordingly; in motor insurance, for example, this could depend on driving performance.

Music industry
Blockchain offers an ideal solution in the music industry, where many artists want direct responsibility for their music sales, music rights and the conditions of use, as it can directly link usage and payment to algorithms embedded in a blockchain.

Voting systems
In digital voting systems, blockchain can protect against tampering while also ensuring the anonymity of voters.

Patent registration
Patents, along with the documents proving intellectual property, could be filed in the Blockchain for the relevant administrative offices decentrally, permanently and without an intermediary. Certificates guaranteed by mathematical encryption would regulate the possession, existence and integrity of these documents on a global scale.

To conclude, blockchain is a new and innovative technology that is gathering pace, but for many there are a lot of questions that remain unanswered, and many of the answers currently available will not be final. However, the traceability and transparency it brings to any transaction signal its importance and its potential to be used in a wide range of innovative scenarios.

### Dstl scientists tackle growing problem of space junk

UK scientists at the Defence Science and Technology Laboratory (Dstl) are leading an innovative experiment to tackle the growing problem of space junk. Hundreds of thousands of manmade objects orbit the Earth, but fewer than 5,000 are operational satellites. The most congested area sits within 2,000 kilometres of the Earth’s surface, known as low Earth orbit (LEO), where collisions can cause further debris. If this problem is not addressed, space junk threatens to make space exploration and satellite launches impossible. It also poses a hazard to existing satellites, which make an important contribution to the UK’s military capability. To tackle this problem, the Inter-Agency Space Debris Coordination Committee (IADC) has proposed that all LEO satellites should be de-orbited within 25 years. However, the traditional de-orbiting rocket method is expensive. As part of a large collaboration with industry, government and academia, Dstl space scientists are leading in exploring alternative methods. The Daedalus experiment – part of the Space Situational Awareness Project in Dstl’s Space Programme – is exploring the effect on satellites of so-called Icarus ‘de-orbit sails’, made of 25 micrometre-thick aluminium-coated Kapton, a high heat-resistant polyimide film. When deployed, the sail increases drag, causing a controlled descent into the Earth’s atmosphere where the satellite will burn up. Sean Murphy, a Principal Scientist in Dstl’s Space Programme, said: “It’s vitally important that we remove satellites that have reached end of life so they don’t remain in orbit as pieces of space junk.
Space junk clutters up the space environment and ultimately pose a hazard to the useful satellites we rely on.” Announcing the experiment at the first meeting of the Defence External Innovation Advisory Panel, the Minister for Defence Procurement, Harriett Baldwin, said: “Our £800 million Innovation Initiative will help our Armed Forces maintain their edge into the future, where ever-evolving technologies present new challenges and opportunities. That’s why we have committed to spending 1.2% of our £36 billion growing defence budget on science and technology. “The Innovation Panel will help meet the complex challenges of the 21st Century, while delivering the high-wage, high-skills jobs of the future; and it’s particularly fitting that we welcome astronaut Major Tim Peake as I announce the UK’s leading role in cutting-edge satellite research.” The first Daedalus trial has just started. A Canadian satellite, known as CanX-7, deployed its de-orbiting sail in early May 2017 and is expected to burn up in the Earth’s atmosphere in around two years’ time. Two other satellites, TechDemoSat-1 (TDS1) and Carbonite-1 (CBNT1), have been fitted with Icarus de-orbit sails created by Cranfield University and are expected to start their descents later this year. The experiment results will help to characterise changes in the brightness of the satellite caused by the sail deployment, quantify the drag increase that is due to the de-orbit sail, and critically compare different orbital dynamics models. Dstl is leading co-ordination of the UK element, tasking sensors to collect data to support this experiment. Of particular relevance to the military community is the effect that such high-drag satellites have on current space situational awareness sensors and processing. A video of Sean Murphy talking about the problem of space junk can be found here. ### AI and Financial Services - Let's talk AI - FICO's Derek Dempsey Compare the Cloud interviews Derek Dempsey, Director, Data Science Centre of Excellence of FICO. Derek talks about how Artificial Intelligence impacts financial services. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about FICO here: http://www.fico.com/ Let's talk ai ### Picture Perfect: 5 Tips to Protect Cloud Video   Several years before making today’s headlines as the Trump-Russia Investigation's special counsel, former FBI chief Robert Mueller made waves with his wry comment that “there are only two types of companies: those that have been hacked, and those that will be”. Today, it may be more accurate to say there are two types of companies: those that know they have been hacked and those that don’t. This uncomfortable truth, whilst unsettling, is in fact the first step to working out a strategy for storing and delivering video and content over the Internet. Why? 
Because once a company has recognised that they are not immune from cyber attacks, they can get down to the nitty gritty of working out what lengths, costs and trade-offs they are willing to accept in order to meet the industry and legal obligations around the protection of video data and content. Data protection and how this pertains to both video data and content is understandably a hot topic. This is driven by numerous factors including the increasing amount of media coverage of breaches, such as the infamous iCloud hack, and the surge in the number of new,  mission-critical, cloud applications on the market. The EU General Data Protection Regulation (GDPR) - which defines the rights of EU citizens around the privacy and protection of their personal data - is another crucial consideration. As are questions about how both Brexit and the EU-US Privacy Shield Framework agreements (including the executive order signed by the new US administration) will affect EU data stored in the US (and vice versa?). So when moving your video data into the Cloud, here are the top 5 tips on what to look for in an online video platform and how you can best protect your data. Tip 1: Differentiate between video data and video content In the world of online video, video data refers to the metadata  - or data about data - such as categories, descriptions and tags as well as attached documents, viewer comments, likes and so on. Video content refers to the actual video frames, i.e. what you watch. This is an important distinction to make because while it may be acceptable for your video data to live in the Cloud, it may not be acceptable to have your video content in the Cloud. Tip 2: Break down your data and content into different classifications of privacy Classification is generally one of the following: publicly available, internal only, internal and confidential or regulatory. Each classification requires a greater level of protection, with the risks ranging from embarrassing to highly damaging to breaking the law (where legal regulations apply). Tip 3: Understand the legal obligations for specific office locations The legal obligations regarding data and content are complex, constantly changing, and highly idiosyncratic based on the classification and the context in which they will be used. When combined with the different approaches required for different countries, you begin to understand what might keep CIOs awake at night. For example, in the UK there is the Data Protection Act 1998 (DPA), whereas in the EU it is the 1995 EU Data Protection Directive, which is soon to be superseded by the GDPR. And between the EU and the US there is the EU-US Privacy Shield, which supersedes the The Safe Harbor framework. It’s a compliance minefield. Plus you also need to be mindful that regulations do overlap in certain ways. Tip 4: Investigate the different Online Video Platform (OVP) deployment options There are a number of cloud-based choices available today: Cloud Software-as-a-Service – all of your data and content will be stored on the provider’s datacentres (wherever they may be) and delivered over the public internet via Content Delivery Network partners (such as Akamai, Amazon Cloudfront, Limelight etc). On-Premise – the provider’s platform is deployed on your servers (most likely inside your headquarters) and data never leaves your internal network. Hybrid – a mix of cloud and on-premise. 
In a nutshell this means some data can be stored in the Cloud and some data on-premise, depending on the confidentiality restrictions of each type of data. Private Cloud Software-as-a-Service – this is essentially an on-premise installation, but instead of being deployed on in-house servers, the deployment is powered by a trusted provider, such as Rackspace, Amazon Web Services or Microsoft Azure.

Tip 5: Question which secure protocols and encryption types are being used

How data is stored, and via which protocols it is delivered, is crucial to determine how secure your video is while at rest and during transit. Ensure your online video platform provider encrypts video data and content at rest – or, in other words, the servers on which the video data and content are stored. In transit, ensure secure protocols (such as HTTPS) are used. And for the highest level of security, you should adopt on-the-fly DRM packaged content. Whether you are an enterprising entrepreneur or a global enterprise, the move to the cloud is transforming businesses. Findings from 451 Research’s most recent Voice of the Enterprise (VotE): Cloud Transformation study highlight the sunny outlook, with 22% of organisations polled adopting a ‘cloud first’ approach, with infrastructure as a service (IaaS) or the public cloud the fastest-growing model. By chasing away doubts about cyber breaches, businesses can tap into an increasingly wide range of services provided over the Internet instead of on premise, with all the advantages that offers. These tips will help you to navigate the data protection maze and help achieve a secure, cost-effective solution to storing and delivering your video content in the cloud.

### Rackspace Teams Up with Pivotal to Deliver Managed Pivotal Cloud Foundry

Helps Operate Pivotal’s Cloud-Native Application Development Platform on Customers’ Infrastructure of Choice

Rackspace® announced from its Rackspace Solve customer conference in New York City that it is aligning efforts with Pivotal® to deliver Managed Pivotal Cloud Foundry to enterprise customers. This new Rackspace solution will help enterprises use Pivotal Cloud Foundry®, one of the world's most powerful cloud-native platforms, to quickly build and deploy applications at scale. Rackspace will manage Pivotal Cloud Foundry on any public or private cloud, as well as on customer-owned infrastructure, backed by deep technical expertise and by the results-obsessed Fanatical Support® for which Rackspace is famous. “Fortune 500 customers using Pivotal Cloud Foundry to build, deploy, and run their legacy and cloud-native apps have experienced a 2,000 percent increase in developer productivity, as well as a 50 percent reduction in IT costs due to platform automation,” said Bill Cook, president and chief operating officer, Pivotal. “Since moving at startup speeds is on the minds of every business and government organisation, the collaboration between Pivotal and Rackspace would provide customers the option to manage their cloud environment, so they can focus on rapidly shipping code.” Pivotal now works with over one-third of the Fortune 100, and a rapidly growing portion of the Fortune 2000, who rely on Pivotal Cloud Foundry to rapidly develop and run modern and legacy applications at startup speed. Pivotal Cloud Foundry is also the only application development platform that runs on any cloud infrastructure—across public, private and managed clouds.
With Managed Pivotal Cloud Foundry, Rackspace intends to deploy and operate a fully-managed Pivotal Cloud Foundry solution across any infrastructure selected by the customer. This approach allows customers to achieve greater business agility and accelerated application delivery. The operational benefits of Managed Pivotal Cloud Foundry include: 24x7x365 Comprehensive Management – Customers who don’t want to hire and dedicate resources for managing Pivotal Cloud Foundry will be able to completely offload operations and management tasks to Rackspace. Those tasks include troubleshooting, managing upgrades and feature releases and integrating various services—such as MySQL, Redis and RabbitMQ—with Pivotal Cloud Foundry. This model will allow users to focus their scarce engineering talent on the tasks that differentiate their business, such as product development and customer service, rather than on the management of Pivotal Cloud Foundry, yielding higher cost-effectiveness and speed-to-market. Multi-Cloud Capability – Customers who need a managed, multi-cloud deployment will be able to leverage support for leading public and private clouds as well as dedicated hardware. Rackspace will manage Pivotal Cloud Foundry on customer-owned infrastructure as well as any public or private cloud infrastructure, including those powered by Amazon Web Services® (AWS), Google Cloud Platform®(GCP), Microsoft®Azure®, Microsoft Azure Stack®, Microsoft Private Cloud, VMware®, Rackspace OpenStack® Private Cloud and Rackspace OpenStack Public Cloud. On-Demand Expertise – Rackspace experts will handle Pivotal Cloud Foundry version upgrades, feature enhancements and other technical updates as they arise. Rackspace will help customers hit the ground running with rapid deployment and offer SLAs of 99.99 percent uptime with a 15-minute response time for emergency issues. [easy-tweet tweet="Managed Pivotal Cloud Foundry is Rackspace’s first step into the managed platform space" hashtags="Cloud, Platform"] “Most organisations want to deliver application features more quickly and efficiently while modernising their architectures, but getting there isn’t always easy,” said Brannon Lacey, vice president of applications and platforms at Rackspace. “Pivotal Cloud Foundry is a valuable platform that can help businesses achieve their development goals more effectively. Managed Pivotal Cloud Foundry from Rackspace makes this technology, and its benefits, accessible to developers in organisations of all types and sizes, regardless of their expertise and experience with the platform.” “Managed Pivotal Cloud Foundry is Rackspace’s first step into the managed platform space, as we move up the stack to solutions that customers want our help with,” continued Lacey. “It is a  solution that helps customers get up and running on Pivotal Cloud Foundry quickly and stay up and running, with operational support and proactive monitoring. This way, in-house teams can focus on innovation and getting out to market quickly while Rackspace handles the backend.” Managed PCF is now in general availability for customers in all regions. For more information, visit http://www.rackspace.com/managed-pivotal-cloud-foundry.   ### Panasonic 4K hits the Big Screens this Summer Panasonic’s new 4K-ready video wall display series, the LFV8,  is set to launch this summer in 49 and 55-inch sizes.  The displays have the ability to show 4K images in a 2x2 configuration and larger using DisplayPort in a daisy-chain fashion. 
The 49-inch panel offers 450 cd/m2 brightness, while the 55-inch screen has 500cd/m2 with native 4K-capable DisplayPort inputs and outputs. When installed in a 2x2 configuration across the four Full HD panels, the result is a single, film-like 4K (4096 x 2160/30p) video image. [easy-tweet tweet="The new precision video wall mounting system saves time, reducing labour costs" hashtags="Panasonic, Video"] The two model range meets the demands of professionals by streamlining the installation process. The new precision video wall mounting system saves time, reducing labour costs. The rigid mount uses an automatic magnet system, eliminating potential alignment inconsistencies.   The display features an ultra narrow bezel-to-bezel distance of just 3.5mm, allowing for near seamless video wall display. This is combined with high resolution IPS (in-plane switching)  technology so that the image remains clearly visible even from oblique angles, vital for signage applications. In addition, the 49-inch display promises improved visibility by combining an anti-glare treatment with a low-reflection coating (AGLR). “These panels are perfect for creating an impressive 4K video wall,” said Enrique Robledo, European Marketing Manager for Panasonic. “Whether it’s for retail, hospitality or transport, these displays have excellent blacks and highlights making them immensely capable at grabbing the attention, while the flexible installation allows you to exploit just about any installation space.” The display is engineered for 24/7 reliability in either landscape and portrait orientation. The 55-inch (TH-55LFV8) will be available in August with the 49-inch (TH-49LFV8) display available in September. For more information on Panasonic Visual System Solutions please visit: http://business.panasonic.co.uk/visual-system/ ### Natural Language Processing and Big Data Analytics - Let’s talk AI - SafeToNet’s Richard Pursey Compare the Cloud interviews Richard Pursey, CEO of SafeToNet. Richard talks about what SafeToNet is trying to do to protect children in the online world using AI, and how they can tell the difference between banter and aggression. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about SafeToNet here: https://safetonet.com/ Let's talk AI ### Comms Business Live Episode 4: Who's Eating Your Lunch? Analyst firms have their view. For example, IDC predicts that by 2020 30% of the top firms in every business sector will not exist, as we know them today. They will be replaced by new firms, will have merged, will have not kept pace and declined, or will simply not be relevant anymore to the business needs of the day. This week on Comms Business Live, host David Dungay will be joined by: Justin Harling, Managing Director of CAE Technology John Whitty, CEO of Solar Communications Andrew Wilson, Director of Channel Sales at Node4 Don't forget about Channel Live ICT event on 12th - 13th September.     
### Citrix and Google Align to Help Embrace Secure Cloud Transformation Broad Portfolio of Combined Solutions Help Customers Transition to the Cloud; Citrix Workspace Service Coming to Google Cloud Platform To help businesses fully embrace the benefits of flexibility and scale that cloud delivers, Citrix today announced extensions to its long-term strategic relationship with Google. Customers will be able to use CitrixCloud to provision and manage secure digital workspaces, including CitrixWorkspace Service, on Google Cloud Platform. Citrix and Google are working together to bring cloud delivery of applications and desktops, and secure cloud-optimized endpoints to their enterprise customers who are increasingly looking to both public and hybrid clouds to solve their business requirements for secure digital workspaces. Along with these new cloud solutions, Citrix and Google also announced new integrations, available today, between Citrix ShareFile and Google G Suite that enable follow-me-data when using Citrix workspace solutions. A new ShareFile plug-in that allows the secure sharing of files via Gmail and a ShareFile connector to Google Drive provides users with one place to find all their documents. In addition, Citrix NetScaler CPX is available now on Google Cloud and is targeted for availability in the Google Cloud Launcher marketplace by the end of the quarter. Google Cloud’s emphasis on containers and the Kubernetes orchestration system will allow the developer community to use NetScaler CPX to easily build and scale secure applications in the cloud. [easy-tweet tweet="Companies of all sizes have an opportunity to embrace cloud transformation " hashtags="Cloud, IT"] “Today, we are deepening our successful partnership with Citrix. By adding the ability to manage IT services in Google Cloud Platform from Citrix Cloudservice offerings, we and Citrix are providing enterprise customers more options as they make moves to the cloud,” said Nan Boden, Head of Global Technology Partners, Google Cloud. “Our collaboration with Citrix will help businesses of all types accelerate their transition to the cloud including the desktop infrastructure and applications that they want to use.” “Companies of all sizes across all industries around the world have an amazing opportunity to embrace cloud transformation and empower their people to work securely from anywhere using digital workspaces,” said Steve Blacklock, VP of Global Strategic Alliances, Citrix. “Our customers are asking Citrix and Google to work more closely together to deliver innovative solutions from the cloud to help them embrace the future of work.” With Citrix Cloud services integrated with Google Cloud, Google customers will have the ability to easily add virtual apps and desktops that run on Google Cloud and use them alongside the G Suite productivity suite. Citrix and Google continue to work on delivering secure end point solutions for accessing digital workspaces with the recently announced Citrix Receiver for Chrome 2.4, which offers true multi-monitor support on Chromebooks. Google recently announced official Chrome Browser support for Citrix along with graphics optimisations available only on XenApp. Google and Citrix product teams continue their collaboration to enhance and optimise XenApp delivery to Chrome OS based devices for enterprise customers with ongoing Receiver for Chrome updates. 
### The Process to Business Transformation When it comes to the en vogue subject of business transformation, many think that it is synonymous with a complete ‘rip and replace’ of existing IT infrastructure. They would be wrong. You simply cannot expect a company driven by older, legacy technology one day, to then make a complete switch to new technology the next. Instead, the old and new (the bi-modal) worlds of IT will need to coexist for some time yet, so business transformation needs to be looked at as a gradual process, rather than a simple ‘plug and play’ event. Can legacy technologies still deliver value? While legacy technologies such as mainframes might not be the cheapest to run or easiest to manage, they can still provide limited value. Determining that value comes down to whether the technology is meeting ongoing business requirements. For example, if a mainframe continues to provide the support needed to successfully run the business, then it’s doing its job. However, if business requirements are changing at a fast pace, it is often difficult for legacy teams and technologies to keep up. Migrations and upgrades to legacy systems can sometimes take years, and companies run the risk of facing obsolescence even before the upgraded environments are fully functional. It is therefore worthwhile continuously reviewing the merits of such technologies and retiring the workloads that no longer meet the businesses requirements, bringing in new technologies to replace them. This way you are not removing the entire service, but merely retiring and replacing certain workloads. So what’s the role of cloud? Senior leadership tends to interpret cloud’s role in a number of ways. To start with, there’s often a worrying assumption that a simple ‘lift and shift’ approach, covering all of an organisation’s processes and data, will drastically reduce operational costs, improve performance and cut product time-to-market. This isn’t realistic. Instead, IT leaders should look at their environment from a business perspective and make informed observations on what workloads and processes are mature enough to move to the cloud. This should be combined with the insights of other technical staff and executives, who’ll need to collaborate on the specific changes needed both before and after migration. The need for innovation Amazon founder and CEO Jeff Bezos once commented: “The most important single thing is to focus obsessively on the customer”, and he was right. Most business transformations today are driven by the need to enhance the end-user experience. As such, IT’s focus should move on from building an environment for infrastructure and applications – ring-fenced processes which will outlive their value in future –  and instead focus on building scalable platforms, which equip the wider business with the agility and flexibility it needs. Customers today want to do more with less, and do it faster. Focusing obsessively on the customer’s demands helps the organisation ready itself to adapt, which in turn means IT departments have to build more flexible IT solutions. These need to deliver ‘cloud-like’ customer experiences on legacy technologies, as organisations often simply can’t afford to wait until all operations are fully migrated. This will give rise to a host of companies offering customer experience platforms specifically designed for legacy technologies, for those organisations still stuck in the middle. 
Keeping up with the pace of change Business transformation is a daunting challenge for any company, but those that have mastered the skill of accurately predicting where the business is heading are those that succeed – just look at Netflix or Amazon, for example. Bill Gates once said: “What keeps me up at night is the fear of complacency”, and this sentiment is more apt today than ever. The arrogance of success encourages business leaders to think that yesterday’s innovations will work just as well tomorrow. To remain relevant, leaders need to be able to grasp where the business is really headed over the next decade. This all means that IT needs to be prepared to respond to changes quickly and remain agile when creating scalable platforms for the business. [easy-tweet tweet="Business transformation is unique to all companies." hashtags="BusinessTransformation, IT"] A meeting of the old world and the new Ferocious customer demand for improvements and efficiencies is insatiable, so businesses need to be prepared to respond to change in an effective and swift manner. Moving workloads to on-demand environments, and bringing in lean ‘Just In Time’ technologies such as cloud computing provide the solution. Innovations such as these provide businesses with the agility to turn their services on and off according to patterns of demand and provide the IT foundations to manage market unpredictability. Business transformation is unique to all companies, even those that are direct competitors in the same industries. And with a buoyant market and an abundance of different tools to choose from this presents organisations with the best chance to choose the right fit for their long-term business goals. However, businesses need to find the middle ground, between the risks of attempting to change too much within a short time, and resisting change altogether and to depend on legacy technology. Above all, organisations need to be cognisant of the fact that some of the functionality that exists with on-premise legacy IT will have to remain so, as full cloud migration isn’t always operationally viable. It is therefore critical for businesses and IT departments to embrace the model of hybrid integration and reach an understanding of coexistence between the old world of IT and the new. ### Transcription and Speech Recognition - Let’s talk AI - Dubber’s James Slaney Compare the Cloud interviews James Slaney, Co-founder of Dubber. James talks about what kind of AI he is using and what he is achieving with it. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about Dubber here: https://www.dubber.net/ LEt's Talk AI ### Thales: As GDPR Approaches, Retail Data Breaches Remain Unacceptably High Two in five retailers across the globe have experienced a data breach in the past year, according to the 2017 Thales Data Threat Report, Retail Edition, released today. 
The report, issued by Thales, a leader in critical information systems, cybersecurity and data security, in conjunction with analyst firm 451 Research, reveals that a staggering 43 percent of retailers had experienced a data breach in the last year, with a third (32%) claiming more than one. Click to Tweet: 2 in 5 retailers have experienced a data breach in the past year #2017DataThreat http://bit.ly/2uwoKY9 With 60 percent claiming that they had been breached in the past, it’s perhaps unsurprising to learn that the majority (88%) of retailers consider themselves to be ‘vulnerable’ to data threats, with 37 percent stating they are ‘very’ or ‘extremely’ vulnerable. As a result, three-quarters (73%) of retailers expect their spending on IT security to increase. Taking steps toward compliance An increase in regulations such as the forthcoming EU GDPR has led to greater awareness and concern around issues of data privacy and sovereignty, with 72 percent of retailers claiming to be impacted. [easy-tweet tweet="Almost two-thirds of retailers (64%) are encrypting their data." hashtags="Data, GDPR"] The report reveals that, in an effort to comply with these new requirements, almost two-thirds of retailers (64%) are encrypting their data, 40 percent are tokenising data, and a similar number (36%) are implementing a migration project. Pressures to use advanced technology increase risk According to the report, half of retail organisations (52%) will use sensitive data in a big data environment this year, with a third (34%) using encryption to protect that data. Despite this, however, 39 percent were very concerned that they’re using these environments without proper security in place. What’s more, the report found that as the adoption of cloud and SaaS environments continues to rise, so too do concerns regarding their safe use. Two-thirds of retailers (67%), for example, claimed to be very or extremely concerned about cloud service providers (CSPs) falling victim to security breaches or attacks. A similar number (66%) expressed concerns about vulnerabilities in shared infrastructure, and 65 percent were worried about the custodianship of the encryption keys used to protect their data. 63 percent of respondents suggested that such fears could be allayed through the use of data encryption in the cloud, with keys being controlled at the retailer’s premises, while half (52%) preferred the CSPs to control the keys. Garrett Bekker, principal analyst for information security at 451 Research says: “Breach results were not so rosy for global retail – a staggering 43 percent of global retail respondents reported a breach in the past year alone, approaching twice the global average. These distressing breach rates serve as stark proof that data on any system can be attacked and compromised. Unfortunately, organisations keep spending on the same security solutions that worked for them in the past, but aren’t necessarily the most effective at stopping modern breaches.” Peter Galvin, vice president of strategy, Thales e-Security says: “With tremendous sets of detailed customer behaviour and personal information in their custody, retailers are a prime target for hackers so should look to invest more in data-centric protection. 
And as retailers dive head first into new technologies, data security must be a top priority as they continue to pursue their digital transformation.” Retail organisations interested in improving their overall security postures should strongly consider: Deploying security tool sets that offer services-based deployments, platforms and automation; Discovering and classifying the location of sensitive data within cloud, SaaS, big data, IoT and container environments; and Leveraging encryption and Bring Your Own Key (BYOK) technologies for all advanced technologies. Please download a copy of the new 2017 Thales Data Threat Report, Retail Edition for more detailed security best practices.     ### Industrial Internet and IoT - Unboxed - Mark Homer from ServiceMax (GE Digital Company) Compare the Cloud speaks to Vice President of global customer transformation for Service Max (GE Digital Company), Mark Homer. In this episode of Unboxed Mark speaks about the industrial internet and the internet of things. He speaks about how to connect capital assets - equipment and pieces of high-value engineering - to the industrial internet and how there can be driven value. Mark also speaks about looking at the role of field service management. If you would like to learn more visit: https://www.servicemax.com/ge-acquisition Unboxed ### Blockchain - The Missing Link for IoT?  Blockchain is expected to be instrumental in a digital transformation in the coming years, especially in the field of IoT. But there are technical hurdles to overcome largely because most IoT devices lack the adequate computing power to participate in blockchains directly. That said, as with most IoT initiatives, a small thing like power isn’t going to stop the world from trying. You just have to look at the importance of cryptocurrencies, which rely on blockchain to operate, to see the potential. Cryptocurrencies, which allow people to move money in the same way they move information on the internet, are being traded in huge sums. There are currently more than 900 different cryptocurrencies being traded and the most popular, the Bitcoin (BTC), has a market cap of over $40 billion with daily volumes averaging $1 billion and peaking at around $2 billion. Besides providing real opportunities for cyber criminals and clever, high risk, traders, our interest in cryptocurrencies has proven that blockchain is a viable technology to exponentially grow the IoT ecosystem.  At the moment though, technologists are grappling with exactly how it will do this. [easy-tweet tweet="Most of today's IoT ecosystems are built around a centralised, brokered communication model." hashtags="IoT, Blockchain"] The future of IoT is decentralised To understand the need for technologies like blockchain for IoT, we need to understand the problem IoT will be facing in the future. Most of today's IoT ecosystems are built around a centralised, brokered communication model.  All the IoT devices in the system are known and authenticated by and communicate through a centralised, large cloud which provides huge amounts of processing power and storage. At its basic level, any two IoT devices exchanging information are brokered through the central system, even if they are a couple of feet away. They rely on a private network and internet cloud servers to exchange even the smallest bit of information. 
While the cloud provides immense potential for computing and storage and will continue to persist as a design pattern for small scale IoT deployments, this central model will not be able to cope with the huge ecosystems we expect to see shortly. Even if centralised cloud servers could accommodate the scale in an economical fashion, they are still the single point of failure for the whole ecosystem. For an ecosystem of devices to scale to millions or even billions, a decentralised approach is preferred in which each device represents an autonomous system. All communication and information exchange between devices, servers and services of the ecosystem should be based on distributed protocols. This is where blockchain can help. In fact, IBM in partnership with Samsung has published a proof of concept whitepaper for a system, known as ADEPT, that uses elements of blockchain to create distributed networks of autonomous devices to form a decentralised IoT ecosystem. It notes that any protocols used by the autonomous systems should be secured, authenticated and distributed and that each node in the ecosystem should be able to perform in a distributed fashion three things: messaging, file sharing and coordination. It’s obvious IoT devices need to be able to message its ecosystem to alert it to a change in the environment, and do so in a distributed, secure and authenticated way. But current IoT messaging systems such as MQTT use a central broker design, and while they can be secured and authenticated, they can’t scale to support millions or billions of devices without complex hierarchical designs. So we are starting to see the development of new peer to peer messaging systems which provide encrypted messaging, low latency, and guaranteed delivery, store and forwarding of messages whereby the message can 'hop-on' to other devices. Known as Distributed Hash Tables, these allow devices to create their hashtag and find other devices in its network. We’re likely to her more about Telehash to name just one approach, which is an emerging open source version of this messaging technique. Of course, there are times when files need to be shared – like software updates or configuration settings. Bittorrent is well known as a robust peer-to-peer file sharing protocol. But it’s still not enough. When there is a need for an actual transaction, like payment, Blockchain will be the technology of choice, as it provides a decentralised ledger where autonomous ‘things’ in the network can follow the rules and verify the validity of transactions without relying on a central authority or human. What’s more every device in the system keeps a complete history of all the transactions performed in the whole ecosystem and as it’s tamper proof it’s fundamentally secure which is essential if you are building complex networks where life or death is at stake - there is no risk of a ‘man in the middle’ cyber attack. Combine all this together and IoT becomes smart, self-supporting and self-sustaining. Blockchain allows devices to make the right decisions at the right time and log the history, ideal for situations where there must be a ledger of transactions for regulatory compliance. It will even go as far as to fix the ecosystem if something breaks based on the protocols its programmed to follow. However, we can’t get carried away yet. IoT ecosystems aren’t always closed; sometimes they need to talk to another ecosystem or network. This inherently brings risk as you create a bridge from one system to another. 
The minute an API or a web application in the cloud is introduced to create this bridge, you create a target for a hacker. Consider the recent DDoS attacks on the Bitfinex and BTC-e Bitcoin exchanges, and the theft of cryptocurrency in the case of Classic Ether Wallet. It wasn’t the blockchain nodes that were targeted; it was the web services they relied on. Plus you can’t always rely on the inherent security of the blockchain technology. If it’s not implemented correctly, you introduce weakness, as happened when a hacker exploited a blind spot in some code on an Ethereum investment fund platform, draining it of around $53m worth of digital currency in a few hours. What’s curious about all of this is that many of the technologies being developed using blockchain are similar to the botnets hackers have used to cause havoc – even the IBM and Samsung pilot is strikingly close. So while we strive to create secure smart IoT ecosystems, we have to build in security. Blockchain is going to be a large part of the puzzle in the future, but it is only ever going to be as good as the humans who design it – inadvertently put in a flaw and you destroy utopia.

### Customer Intelligence and Security - Let's talk AI - SAS' Peter Pugh-Jones

Compare the Cloud interviews Peter Pugh-Jones, Head of Technology, UK & Ireland of SAS. Peter talks about what Artificial Intelligence means to him and to SAS. What does Artificial Intelligence (AI) mean? Artificial Intelligence (AI) is the area of technology that focuses on creating intelligent machines that react and work like people. Some examples of what these machines are created for include problem solving or learning. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward. Using a strategic combination of business consultancy, content marketing, media series and events and workshops, to enable you to stand out from the cloud crowd. Find out more about SAS here: https://www.sas.com/en_gb/home.html Let's Talk AI

### Pulsant Expands Cloud Storage Offering with INFINIDAT

InfiniBox High Availability and Ease of Management Enables Cloud Service Provider to Benefit from Significant Savings and Meet Stringent SLAs

Pulsant, a leading UK provider of hybrid cloud services, has selected two InfiniBox storage arrays from INFINIDAT to deliver storage service excellence to its many customers throughout the UK. Pulsant benefits from enterprise-class performance, cost effectiveness and ease of management, while its customers enjoy high levels of storage performance, availability and service reliability. [easy-tweet tweet="Pulsant selected InfiniBox storage arrays from INFINIDAT to store a wide range of customer data" hashtags="Data, Cloud"] The market for cloud computing services is hotly contested. Service providers have to fulfil stringent service level agreements (SLAs), or their customers will take their business elsewhere. One of Pulsant’s most strategic offerings is its cloud storage service, especially in light of the growing demand amongst mid-tier and enterprise customers to outsource some or all of their storage infrastructure. Pulsant was looking for an alternative storage solution to run alongside and eventually replace its more traditional tier-1 storage-based system.
The company wanted to provide a more cost-effective service to position itself more competitively while retaining the high service quality and availability that its customers were accustomed to and required.    Pulsant selected InfiniBox storage arrays from INFINIDAT to store a wide range of customer data. The systems are used as shared devices to service customer infrastructures as part of Pulsant’s storage services as well as larger shared environments including Pulsant’s Enterprise Cloud platform, a VMware-based multi-tenant infrastructure platform housing several hundred customers.    Before selecting INFINIDAT, Pulsant carried out a thorough TCO analysis, which included modelling the cost of a service for customers, the results of which were the largest contributing factor in the purchasing decision. In addition to the low TCO, the Pulsant team was impressed by INFINIDAT’s enterprise-class functionality. The InfiniBox arrays offer a feature set which replicates that of more expensive systems but reduces complexity and cost by deploying commodity style hardware, intelligent disk utilisation techniques and effective SSD and memory caching layers for “hot” data sets. INFINIDAT’s innovative design enables ease of operation, while its low power and cooling requirements have resulted in cost savings, ultimately delivering more for less than other storage arrays.   The installation process was quick and easy with the systems being in production within three months from the start of the initial proof of concept testing. The INFINIDAT team was involved at all times, and Pulsant was impressed by the level of communication, depth of technical knowledge and project management skills. Since installing the solution, Pulsant enjoys an average latency of sub-1ms across a highly diverse set of customer workloads and has been able to apply software updates real-time with no disruption of service. In addition to these key customer benefits, Pulsant’s operations team enjoys the simplicity of management and the lack of need to tune the arrays as these automatically and intelligently adapt to changing workloads and data access requirements.    “Our InfiniBox systems provide an excellent balance between ease of operation, enterprise-class features, availability, performance and cost. As such, the solution is exactly what we were looking for”, said Russel Ridgley, head of cloud services, Pulsant. “The low TCO allows us to position ourselves aggressively from a commercial point of view. The flawless reliability of the system means we can meet our strict SLAs with our customers. And the management simplicity helps us to easily service a large number of very different customers and maintain several hundred directly connected devices and volumes without any problems.”   ### The State and Future of the AI Market How has the AI industry developed so far and what have been the major milestones in the AI industry? The first thing you have to understand about AI is that it’s not the picture most people have in their heads—robots acting like humans or taking over the world. First, robots are a narrow vision of the “body” in which AI might be present. The reality is that right now, your Amazon Echo, your smartphone, and even your car’s antilock brake system is the “body” for AI. That leads to the second point, once AI is developed, it quickly becomes rather mundane—part of the background of our existence. AI powers every search we make, about half of all stock trades, and the price of airplane tickets and hotel rooms. 
[easy-tweet tweet=" AI will become a critical factor in the success and market share of most companies" hashtags="AI, MarketShare"] In terms of market value, what is the ultimate potential of the AI market? Just how lucrative will this industry be to those who lead it? Beyond “huge”; the potential of the AI market is hard to gauge because it will be interwoven into the fabric of so many industries, B2B and B2C products and services. An estimate of 15 billion in direct revenue (software, services, hardware) in 5 years would not surprise me. In the longer term, AI will become a critical factor in the success and market share of most companies. The difference between the AI haves and have-nots will be akin to the difference between businesses which are online vs. only offline. Who in your opinion is leading the AI industry and why? The big leaders are Google, Amazon, Facebook, and Apple. They share three critical characteristics: Huge data sets, proven success in the practical application of Narrow AI, and the commitment of R&D and M&A resources to AI investment. How do you think the AI industry will play out in the next five years? The next five years are particularly exciting. AI is like a snowball rolling down a hill and it’s finally big enough that we can see it easily and watch it grow before our eyes. From our perspective, the breakthroughs in education alone will be transformative. Our early tests are pointing to the feasibility of vocabulary acquisition at a speed which is an order of magnitude faster than what is currently available.  And Lingvist isn’t the only company already using AI to effectively tailor education around the skills and capabilities of an individual learner. It’s ironic that one of the early successes of artificial intelligence will be to advance human intelligence. What are the implications of the AI industry becoming something that is controlled by two or three dominant players? How will it influence innovation and product development if AI algorithms are more widely distributed around the world, rather than confined to those aforementioned tech behemoths? In consumer and commercial applications there is a virtuous cycle in AI; AI feeds on large data sets. If AI can provide a product advantage, which in turn attracts larger user numbers, then it gets better and delivers greater product advantage, and so on. For Lingvist specifically, we have a number of insights now that would not have been possible without the size and activity of our learner base. Among them: it takes about 17 hours to learn 2000 words; a learner’s inherent knowledge before using Lingvist can be predicted fairly easily by testing them on a handful of words — and perhaps most revealing, the actual “forgetting curve” as pioneered by Hermann Ebbinghaus (and used by a number of AI education applications) does not fit Ebbinghaus’ original mathematical model — our data implies a substantially different equation is a much better match. Because of the feedback loop and competitive advantage granted by AI, there is some danger of consolidation among a handful of players. The barrier to entry for someone who doesn’t possess access to a large data set could become formidable. However, given the players involved, it’s quite likely that one or more will provide AI as a Service (AIaaS)- Amazon very likely, and quite possibly Google as well. 
This could actually be a powerful enabler of innovation and diverse applications—similar to the way AWS enabled a host of startup companies by providing cloud-based, affordable hosting.

### Understanding The Underlying Challenges Of Cloud Based Payments

Electronic payments and POS are the next big frontier for cloud-based services. The infrastructure for these services is shaped by what is known as HCE. Host card emulation (HCE) is the on-device technology that permits a phone to perform card emulation on a near field communication enabled device without relying on access to a physical card. The combination of always-online devices and cloud computing makes placing cards in the cloud a viable option for more open and scalable mobile-based commerce. HCE services in the OS use the HCE client app to support the multilevel security methods called for by the Visa and Mastercard HCE specifications. Tokenization plays an important role in securing against unauthorised card access in HCE. Tokenization reduces the risk for banks by replacing the sensitive information with tokenized pseudo-information used in the payment systems. But tokenization raises some challenges: In the most secure instances, the original card is deemed necessary for tokenization. For this to happen, the confidential details of the original cardholder need to be protected before the tokenization can happen. All the payment card details in the form of tokens are stored in a secure, centralised vault that now becomes a target for hackers. Infrastructure requirements can be another issue. A new token is created each time a new merchant or customer signs up and provides you with their data. When a new transaction takes place, the information about the customer needs to be detokenized so that the payment processing system can authenticate the details and approve the transaction. The transaction fee that merchants pay to their card processing company includes the cost of tokenizing and detokenizing the cardholder details. While minor, the infrastructure necessary to do this is still an added cost that merchants have to bear. Businesses tend to minimise the cost they need to bear for tokenization and detokenization by routing all their transactions through one payment processor. But if this processor uses an in-house tokenization algorithm, it could lock your payments to this specific processor. The challenge then is to find a tokenization service that will work equally well with multiple payment processors. This way, merchants are not limited to using one or two specific payment processors. Other issues include the potential negative impact on the speed of transactions and the fact that data analysis cannot be performed on tokenized data, since good tokens do not allow the initial data to be reconstituted just from the token itself. [easy-tweet tweet="The mobile payments market is taking a new step towards simpler and cost-effective solutions" hashtags="Cloud, Mobile"] Another aspect to look into, for making cloud-based payments secure, is the security challenge posed by the integration of NFC with the cloud for POS/m-POS transactions. The mobile (m-) payments market is taking a new step towards simpler and cost-effective solutions. Recently introduced payment options using mobile phones integrate NFC technology with cloud-based systems.
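To make the tokenization and detokenization flow described above more concrete, here is a minimal, illustrative Python sketch of a token vault. It is purely a toy: the vault is an in-memory dictionary, the token format is invented for the example, and a real PCI DSS-compliant service would rely on hardened storage, HSMs and audited key management rather than anything like this.

```python
import secrets

class TokenVault:
    """Toy token vault: maps random tokens back to card numbers (PANs).

    Real vaults are hardened, audited services; this sketch only
    illustrates the tokenize/detokenize round trip described above."""

    def __init__(self):
        # token -> PAN; in practice this store is the centralised vault
        # that becomes a prime target and must be heavily protected.
        self._store = {}

    def tokenize(self, pan: str) -> str:
        # Issue a random, meaningless token in place of the sensitive PAN.
        token = secrets.token_hex(8)
        self._store[token] = pan
        return token

    def detokenize(self, token: str) -> str:
        # Only the processor's vault can map the token back to the PAN.
        return self._store[token]

vault = TokenVault()
token = vault.tokenize("4111111111111111")   # merchant stores and passes only the token
print("merchant sees:", token)
print("processor recovers:", vault.detokenize(token))  # done at authorisation time
```

In practice the vault sits with the payment processor or token service provider, which is exactly why vault security and processor lock-in are the concerns raised above.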
Security infrastructure for NFC payments is designed to be multi-layered, with the customer’s account and card details being stored in the secure element within the device and maintained in the cloud. The secure element might be directly embedded by the mobile device manufacturer in the phone body or offered by the payment service provider as a removable secure digital card. The use of a secure physical element, as is the current industry trend, is vital because in its absence the exposure to risk is much higher. This multi-layered security infrastructure also creates the need for risk management apps in case the device is stolen or lost. An easy-to-use mechanism for deactivating NFC services on such lost or stolen devices and reactivating them on another will help enhance security. Security service providers including ARM, Gemalto, and Giesecke & Devrient are working on the development of the trusted execution environment (TEE) as a security standard, but there is still some time before it is ready for production. The successful combination of NFC and cloud will require solutions to help mitigate the security risks involved in data transmission. Despite constant monitoring and authentication checks that make the cloud itself secure, transmitting data over the air carries an element of risk. Payment information for individuals will be stored in the cloud, and it will be mapped individually to a person logging into an m-payment app. Therefore, data transferred between the cloud and the device initiating the transaction travels over the air, putting the data at risk of exposure to parties capable of reading it during transmission. A hybrid approach that combines NFC and cloud for m-payments, removing the need for secure physical elements on a mobile phone, will make the application of NFC services simpler and cheaper. However, NFC with cloud-based systems will still require additional solutions to mitigate the security risks involved in data transmission. This should be done in line with international payment standards such as PCI DSS. Many of the security considerations for conventional e-commerce also apply to mobile payments, for example, the use of public key cryptography to verify the integrity of endpoints during a transaction. The mobile payments field has a few more added concerns because of the introduction of new technologies, which increases the attack surface. Cloud-based payments showcase the future of cardless and contactless transactions, but it will be some time before they work at scale, as the challenges discussed above still have to be overcome.

### How to Preserve Trust in the AI Era

This year’s Edelman Trust Barometer revealed the largest-ever global drop in trust across all four key institutions – government, business, media and NGOs. One sector, however, has so far weathered the anti-establishment storm: tech. Edelman found that 76 percent of the global general population continues to trust the sector. But will this trust survive the widespread adoption of artificial intelligence? AI, after all, is known for being a black-box approach. It can easily lead to a ‘computer says no’ scenario. New regulations in Europe could bring this to the fore soon. The General Data Protection Regulation (GDPR) contains specific guidance on the rights of individuals when it comes to their data. Point 1 of Article 22 of GDPR states that: “1.
The data subject shall have the right not to be subject to a decision based solely on automated processing, including profiling, which produces legal effects concerning him or her or similarly significantly affects him or her.” The Article continues by stating that the data controller must implement suitable measures to “safeguard the data subject’s rights and freedoms and legitimate interests, or at least the right to obtain human intervention on the part of the controller, to express his or her point of view and to contest the decision.” In short, consumers are entitled to clear-cut reasons as to how they were adversely impacted by a decision. For model-based decision making, the model must be able to demonstrate the drivers of negative scores. This is a fairly simple process for scorecard-based credit decision models, but when you add AI to the mix it becomes more complicated. [easy-tweet tweet="Businesses that deploy AI in their decision-making processes must be accountable and transparent." hashtags="AI, GDPR"] There is the potential with AI-based decision-making, for example, for discrimination against individuals, based on factors such as geographic location; combating such discrimination is an important part of the Digital Single Market being planned by the European Union. To ensure continued trust in the tech sector at a time of great public scepticism, businesses that deploy AI in their decision-making processes must be accountable and transparent.

Explainable AI

That’s where Explainable AI comes in. This is a field of science that seeks to remove the black box and deliver the performance capabilities of AI while also providing an explanation as to how and why a model derives its decisions. There are several ways to explain AI in a risk or regulatory context:

Scoring algorithms that inject noise and score additional data points around an actual data record being computed, to observe what features are driving the score in that part of decision phase space. This technique is called Local Interpretable Model-agnostic Explanations (LIME), and it involves manipulating data variables in infinitesimal ways to see what moves the score the most.

Models that are built to express interpretability on top of the inputs of the AI model. Examples here include And-Or Graphs (AOG) that try to associate concepts in deterministic subsets of input values, such that if the deterministic set is expressed, it could provide evidence-based ranking of how the AI reached its decision. These are often utilised, and best suited, to making sense of images.

Models that change the entire form of the AI to make the latent features exposable. This approach allows reasons to be driven into the latent features (learned features) internal to the model. It requires rethinking how to design an AI model from the ground up, with a view to explaining the latent features that drive outcomes. This is entirely different from how native neural network models learn. It remains an area of research, and a production-ready version of Explainable AI like this is several years away.

Ultimately, businesses need to convince their customers that they can trust AI, regardless of the failures that are likely to occur along the way. It may seem like we’re entering a world where machines do all the thinking, but we need the ability of people to check the machines’ logic — to get the algorithms to “show their work,” as maths teachers are so fond of saying. The same applies to machine learning, incidentally.
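As a rough illustration of the first, perturbation-based approach described above (the LIME idea), the sketch below perturbs a single record with small amounts of noise, scores the perturbed copies with a black-box model, and fits a simple local linear surrogate to see which features move the score the most. It is a simplified, hypothetical example using scikit-learn and NumPy, not the production LIME library or any particular vendor's implementation.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(0)

# A stand-in "black box": any trained model exposing predict_proba would do.
X_train = rng.normal(size=(500, 4))
y_train = (X_train[:, 0] + 0.5 * X_train[:, 2] > 0).astype(int)
black_box = RandomForestClassifier(n_estimators=50, random_state=0).fit(X_train, y_train)

def local_explanation(model, record, n_samples=1000, scale=0.1):
    """Perturb one record, score the copies, and fit a local linear surrogate;
    its coefficients rank which features drive the score around this record
    (the core intuition behind LIME)."""
    perturbed = record + rng.normal(scale=scale, size=(n_samples, record.shape[0]))
    scores = model.predict_proba(perturbed)[:, 1]
    surrogate = LinearRegression().fit(perturbed, scores)
    return surrogate.coef_

record = X_train[0]
for i, w in enumerate(local_explanation(black_box, record)):
    print(f"feature {i}: local weight {w:+.3f}")
```

Coefficients of this kind are the sort of per-decision "reason codes" that could be surfaced to a customer or regulator asking why a score came out the way it did.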
Machine learning gobbles up data, but that means bad data could create bad equations that would lead to bad decisions. Most machine learning tools today are not good enough at recognising limitations in the data they’re devouring. The responsible use of machine learning demands that data scientists build in explainability and apply governance processes for building and monitoring models. That’s the challenge before businesses today. To deploy AI — and enjoy the benefits that come with it — they must get customers to accept the reasoning behind decisions that affect their future.

### Compare the Cloud Interviews Dr. Laura Sophie Dornheim from Adblock Plus

Compare the Cloud speaks to Dr. Laura Sophie Dornheim from Adblock Plus. Laura speaks about Adblock Plus and how the software allows people to surf without pop-up ads appearing. However, there are challenges with publishers and content owners, who feel threatened because ad blocking affects their revenue. This is why Adblock Plus is launching Flattr. Flattr is a micropayment platform that allows people who love quality content to consume it without the creators having to resort to advertising. If you want to find out more visit: https://adblockplus.org/

### The Cure for Cloud Sprawl

Digital transformation is underway, and most enterprises have embraced the shift to a cloud-first footing. That’s no secret. However, if cloud migration wasn’t disruptive enough for many businesses, firms now need to also be able to anticipate—and lay the groundwork for—a multi-cloud environment. That’s the IT world these days: solve one challenge, and four more appear. The challenge of multi-cloud is not an easy one to overcome. Only those organisations that deploy solutions and services capable of operationalising data will see success. IT will need to make sure all the moving parts adhere to governance and compliance protocols, and this will only be possible if the right tools are used.

Multi-Cloud: the challenges

[easy-tweet tweet="An issue with a multi-cloud environment is the spread of shadow IT" hashtags="ShadowIT, Cloud"] IT teams in most organisations are already under great pressure to juggle multiple software platforms, server instance types, security policies, operating system images and application-specific requirements. Every project turns into three or four projects because every app and service needs to be tweaked to fit each cloud platform. This can lead to a constant rewriting of the entire deployment instruction script and, ultimately, cloud sprawl. Another issue with a multi-cloud environment is the spread of shadow IT. In many companies, the manual processes used to deploy and manage cloud services can take days or even weeks to complete. Once the services are available, end users are disconnected from lifecycle operations and must rely on help desk interactions to get their jobs done. This process is often slow and tedious and leads to significant delays for end users trying to complete their tasks. It is not surprising that users feel the need to find an alternative and figure it out on their own with the swipe of a credit card, putting them outside the control of their IT department.

Operationalisation tools will be essential

As a result of the rapid development of the multi-cloud environment, there’s an equally fast emerging market of multi-cloud management solutions and services.
To remain agile in the multi-cloud world, you should focus on tools that have the following features:

A blueprint design tool that is cloud-agnostic and translates well when moving from one cloud provider to another. If the project starts with a clear blueprint, one that is easy to create no matter what provider you’re working with, IT teams will find that, overall, the hours of work it takes to create and deploy cloud services will be significantly reduced. By abstracting the different elements of the creative workflow, blueprint design tools also simplify the design of cloud services by auto-generating an execution plan. This plan will bring together all the up-to-date elements of the blueprint and automatically generate corresponding service catalogue items.

Easy-to-set compliance. Governance (e.g. approvals, quotas, permissions) and compliance (e.g. data centre selection, security settings) policies should be defined independently and applied at run-time across any cloud.

Single-pane control panel. Developers who want to test their code on a multi-tier test environment (e.g. development to acceptance to production) should be able to do so with a simple click of a button.

User portal with drag-and-drop ease. A good user experience is key if you want to prevent shadow IT. The easier it is for the end user to find and access their personalised service catalogue and launch new services, manage lifecycle operations of stacks or individual resources, and manage consumption against costs, usage and quotas, the less likely it is that they will try to find an alternative solution on their own.

A single system of record. This should discover and manage all cloud resources and provide seamless integration with incident, problem and change management. This type of system will automatically route issues and requests through the standard IT processes.

Be Ready for the Next Shift

The incessant acceleration into the cloud and multi-cloud environment is making it imperative for businesses of all sizes to be ready if they want to future-proof their operations. The tools that will help you set yourself up for success in an ever-shifting tech-scape do exist, but businesses need to ensure they are deploying the right ones. Not all vendors and providers are the same, and not all organisations can cope with the same level of disruption. The kind of seamless automation we envision for developers and users in the multi-cloud environment will allow a business to scale nimbly as their cloud usage increases exponentially. And that is something that will put a smile on the face of everyone in the C-suite and the IT department. That a multi-cloud environment will be a reality for most enterprises in a few years is undisputed, but how many of those enterprises will be ready to make the most of it remains to be seen.

### Zodiac Aerospace Taps Birst’s Cloud-Based Networked Business Analytics Platform

World Leader in Aerospace Equipment and On-Board Systems Uses Birst to Strengthen Supply Chain and Scale Self-Service Analytics

Birst, an Infor company and a leader in Cloud Business Intelligence (BI) and Analytics for the Enterprise, announced that Zodiac Aerospace, a world leader in aerospace equipment and onboard systems, has deployed Birst’s cloud-based networked business analytics platform to help address order-fulfilment bottlenecks in its supply chain, enable self-service analytics across its business units, and generate reports in minutes instead of weeks.
At Infor’s customer event, Inforum 2017 at the Javits Center, Norman Hussey, Director of Business Analytics at Zodiac Aerospace, and Pedro Arellano, Vice President of Product Strategy at Birst, will present “Birst BI & Analytics – The newest addition to Infor” [TECH-104 session], at 3 p.m., Wednesday, July 12, in Room 1E17. Zodiac Aerospace, headquartered in Plaisir (Paris), France, has about 35,000 employees at more than 100 sites around the globe. With annual revenues of more than 5 billion euros, the company has grown significantly, through some acquisitions, in recent years. These acquisitions have enabled Zodiac Aerospace to strengthen its position as a major participant in the aircraft cabin industry and to become the worldwide leader in commercial aircraft seats. Zodiac Aerospace is utilising several Infor Enterprise Resource Planning (ERP) platforms, which are designed to meet the unique business requirements of manufacturers in the aerospace industry and help the company manage the core and supporting processes in its business. As a result of the company’s organic growth and acquisition activity, it formed several business units operating over multiple ERP systems. Several legacy BI systems were in place, including a central BI platform and enterprise data warehouse. However, this system required Zodiac Aerospace’s Central IT team to manage a complex process, serving as the central clearinghouse for reporting requests from the business units. Zodiac Aerospace needed a more responsive BI and Analytics capability that would provide self-service analytics to business units across the enterprise – thus reducing the business units’ reliance on central reporting – and enable a faster, more efficient order-fulfilment process. It also wanted a cloud-based solution, which would support development and usage anywhere (regardless of geography) and provide easy sharing of insights. The company explored using desktop discovery solutions, but those solutions were not able to prepare data for analysis in an integrated fashion, nor connect and analyse data on an enterprise scale. Consequently, they sometimes produced inconsistent reporting that undermined trust in the decision-making process. Enter Birst’s networked business analytics platform. With the platform's patented Automated Data Refinement (ADR) technology and self-service capabilities, Zodiac Aerospace's business units can prepare and analyse data quickly, reducing their reliance on Central IT and identifying issues in the customer order process. For example, the Birst solution helped identify customer orders which, at times, were not synchronised with their customer’s system. This capability reduces the risk of missed shipments, improving on-time delivery. Birst’s networked business analytics platform provided quick-to-market, added-value content. Business users could generate reports in minutes instead of weeks and boost efficiencies across the entire supply chain. By networking analytic instances (spaces) together, the Birst platform enables Zodiac Aerospace to deliver specific content quickly and shorten its time-to-market – and avoid the administrative hassle of people and business units working with data in isolation. Further, the Birst platform offered a fully integrated, automated analytics data store and integrated data preparation capabilities – along with rapid and reliable connectivity to the Infor M3 platform and other systems.
According to Norman Hussey, Director of Business Analytics at Zodiac Aerospace, “Birst’s fully contained data store and data warehouse are a great value. Data connectivity is also very powerful and simple, which gives us faster time-to-market.” [easy-tweet tweet="Birst’s fully integrated stack enables you to focus on delivering what your organisation needs." hashtags="Cloud, Data"] Hussey also noted that the desktop discovery systems which Zodiac Aerospace evaluated “needed back-end data preparation done before they could start. Birst has that full process integrated. Most importantly, Birst’s fully integrated stack – in the cloud – enables you to focus on delivering what your organisation needs, without worries about memory, disks, support, and backups.” Brad Peters, Senior Vice President and General Manager of Birst, said, “With Birst’s networked business analytics platform, we believe Zodiac Aerospace can continue to achieve greater levels of operational excellence, reduce risks and maintain strong service-level agreements with its customers. Birst creates a network of analytics that connects every part of the organisation through trusted insights.”

### KnowBe4 Releases Quarterly Top-Clicked Phishing Report for Q2 2017

Results Show Human Error Continues to Be an Organisation’s Weakest Link

KnowBe4, the provider of the world’s most popular security awareness training and simulated phishing platform, shared its Top 10 Global Phishing Email Subject Lines for Q2 2017. While the results show that users click most frequently on business-related subject lines (“Security Alert” is the highest ranked at 21 per cent), they still click with alarming frequency on subject lines not related to work topics and showing red flags. According to Osterman Research, email has been the number one network infection vector since 2014. It’s an effective method because it gives attackers more control than merely placing traps on the web and hoping that people will stumble over them. Instead, attackers craft and distribute enticing material through both random and targeted means. This method gives the cybercriminals greater control in selecting potential victims, leveraging multiple psychological triggers and engaging in what amounts to a continuous maturity cycle. The Top 10 Most-Clicked Global Phishing Email Subject Lines for Q2 2017 include:

Security Alert – 21%
Revised Vacation & Sick Time Policy – 14%
UPS Label Delivery 1ZBE312TNY00015011 – 10%
BREAKING: United Airlines Passenger Dies from Brain Haemorrhage – VIDEO – 10%
A Delivery Attempt was made – 10%
All Employees: Update your Healthcare Info – 9%
Change of Password Required Immediately – 8%
Password Check Required Immediately – 7%
Unusual sign-in activity – 6%
Urgent Action Required – 6%

*Capitalisation is as it was in the phishing test subject line

[easy-tweet tweet="A company’s ‘human firewall’ is an essential element of organisational security." hashtags="Security, Technology"] “The subject lines we are reporting here actually made it through all the corporate filters and into the inbox of an employee. That’s astounding. We are in a security arms race, and a multi-layered defence is critical because each layer has different points of effectiveness and ineffectiveness,” said Perry Carpenter, Chief Evangelist and Strategy Officer at KnowBe4.
“If crafted correctly, the right type of message can sail through all of the defences because it is finding the least effective point of each and playing into the human psyche of wanting to receive something you didn’t know about or needing to intervene before something is taken away. Ultimately this means that a company’s ‘human firewall’ is an essential element of organisational security because people truly are the last line of defence.” Businesses also have to be aware that social media messages to their users are potential landmines for their corporate networks. KnowBe4 evaluated the Top 10 Global Social Networking Subject Lines and found that four of the top 10 spots, equalling a full 44 percent, were related to LinkedIn messages, which users often have tied to their work email addresses. As part of its ongoing research efforts, in October 2016 KnowBe4 evaluated more than 10,000 email servers and found that 82 percent of them were misconfigured, allowing spoofed emails to successfully bypass endpoint security systems and enter an organisation’s network. Aggregating information on the most clicked phishing test subject lines and sharing that data with clients is another way that KnowBe4 is helping protect against social engineering tactics that continue to plague businesses around the globe, resulting in growing ransomware, CEO fraud and other phishing-initiated attacks. Businesses that are not already working with KnowBe4 to effectively train their workforce into a “human firewall” can utilise a number of free tools at www.knowbe4.com to test their users and their network.

### Tennis and Cognitive Technology

Every year, I adore absorbing everything about The Championships, Wimbledon; the tennis and the technology continually become more amazing and inspiring. [Image: Wimbledon scores] This year the tennis has been phenomenal and my highlight was watching Johanna Konta beat Simona Halep to get to the semi-finals. It was an epic match with many twists and turns and Konta’s victory was a real “Yeah” moment! Equally the weather that day was wonderfully British, but rain did not stop play; even though it was so wet that all other courts were closed, action continued on Centre Court where the roof cover was deployed. But, amidst the rain, the rest of Wimbledon navigated their way with umbrellas, strawberries and cream and some with glasses of Pimms to Henman Hill to watch Konta triumph, via live feed on the large screen – I’d know, I was there! Although I could talk endlessly about the tennis and the epic way the matches have played out, I was delighted to see the increasing use of cognitive technology becoming more integrated and enhancing our insights into the tennis at The Championships this year. Watson has been used for many years at The Wimbledon Championships; it enables the brand to scale from tennis club to a coveted, world-class sporting event, media curator and producer, and technology pioneer and innovator over a spectacular fortnight in SW19. For me there were three new cognitive-based highlights I was excited to see this year, including the new “Ask Fred” chatbot which is part of the Wimbledon App - I will delve into this in a moment.
I was wowed by the behind-the-scenes cognitive innovations in the IBM Bunker demonstrating the technology behind the tennis; I was very taken with the Cognitive Highlights Dashboard, which identified the most exciting moments from a match based on noise levels, crowd cheering and action recognition. An example of action recognition is where Watson can recognise the significance of player actions, such as when they shake hands at the end of the game. Most of these feeds are unstructured and not obviously logical data, but using Watson APIs they can be interpreted to give insight in a meaningful way. The cognitive highlights capability automatically identifies iconic moments and match events of interest from hours of footage across six show courts, which can then be served up to the media. [Image: Cognitive Highlights] The third cognitive output I was inspired by was the competitive margin metrics. The use of this metric helps the Wimbledon editorial team tell stories about the matches in more insightful ways as play unfolds. This gives extra insight to the fans by creating contextual content at a deeper level of engagement. It works with a huge set of information collated for each player that identifies their performance against many metrics; the closer the margins are rated by Watson, the potentially more intense and eventful their game may be.

ASK FRED – named after Britain’s legendary three-time Wimbledon Champion Fred Perry!

For the majority of visitors to Wimbledon each year, it is their first visit and they want to make the most of their day! So, the Watson-enabled chatbot “Ask Fred” is a great source of information for them and one that is easy to use because it understands natural language and conversation. The chatbot uses cognitive capabilities to understand, reason and learn. It provides conversational, concierge-style Q&A for Wimbledon-related topics such as “Where can I buy strawberries and cream?” by identifying and understanding the intent and entities used in the question. The intent is the part of the dialogue which identifies the type of question being asked, such as “how do I find…”, “where can I get…”, “when is the next…”, “what time does…”, “where can I buy…”; and the entity is the target part of the dialogue, such as: Strawberries and Cream, Pimms, Court 17, Henman Hill, Champagne Bar, Umbrella. “Ask Fred” also understands a number of typos, spelling variations and poor grammar. [easy-tweet tweet="The “Ask Fred” chatbot is a fabulous way to gain insights about what visitors are interested in." hashtags="IoT, Wimbledon"] The “Ask Fred” app uses cognitive “Natural Language Processing” capabilities. [Image: "Ask Fred"] These are employed to interpret our written dialogue (or question) in context and translate that into something very logical which can be processed to provide an answer rather than a generic list of results. This is conversational Q&A beyond the confines of a search engine. The Watson Conversations offering is the IBM technology that enables the development of a chatbot. The “Ask Fred” chatbot is a fabulous way to gain insights about what visitors are interested in and discover what information they want during their Wimbledon visit.
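To illustrate the intent-and-entity idea described above in the simplest possible terms, here is a toy Python sketch. It is emphatically not the Watson Conversations API; the intent names, phrase lists and entity list are invented for the example. It only shows how a question can be split into an intent (the type of question) and an entity (the thing being asked about), which is the structure a conversational service reasons over.

```python
# Toy intent/entity matcher illustrating the idea behind "Ask Fred".
# Real conversational services use trained NLP models rather than
# keyword rules; this is only a conceptual sketch with invented data.
INTENTS = {
    "find_location": ["where can i buy", "where can i get", "how do i find"],
    "ask_time": ["what time does", "when is the next"],
}
ENTITIES = ["strawberries and cream", "pimms", "court 17",
            "henman hill", "champagne bar", "umbrella"]

def classify(question: str):
    """Return the (intent, entity) pair recognised in a visitor's question."""
    q = question.lower()
    intent = next((name for name, phrases in INTENTS.items()
                   if any(p in q for p in phrases)), "unknown")
    entity = next((e for e in ENTITIES if e in q), "unknown")
    return intent, entity

print(classify("Where can I buy strawberries and cream?"))
# -> ('find_location', 'strawberries and cream')
```

A production service learns these mappings from example utterances rather than hard-coded phrase lists, which is also what helps it cope with typos, spelling variations and poor grammar.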
By collating the questions and discussions held with “Ask Fred”, visitors’ queries can be fed into a continuous development loop so the “Ask Fred” corpus or knowledge base can be expanded to become more comprehensive and honed – cognitive technology and “Ask Fred” are very much about learning and evolving into something better through feedback! If you want to check out the Watson Conversations and Natural Language capabilities, you can create your own free trial Bluemix account and build your own chatbot - check out the link and have a go today, even I have built one! IBM has been the Official Information Technology Supplier and Consultant to the All England Club and The Championships, Wimbledon, for 28 years. For more information on the technology behind The Championships, go to ibmuk.co/insta

### AWS Announces G3 Instances for Amazon EC2

Next generation of high-performance graphics acceleration instances offers the most CPU and memory in the cloud for 3D rendering and virtual reality applications

Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), announced the availability of G3 instances, the next generation of GPU-powered Amazon Elastic Compute Cloud (Amazon EC2) instances designed to make it easy to procure a powerful combination of central processing unit (CPU) power and random-access memory (RAM) for workloads such as 3D rendering, 3D visualisations, graphics-intensive remote workstations, video encoding, and virtual reality applications. Backed by NVIDIA Tesla M60 GPUs, G3 instances offer double the CPU power per GPU and double the host memory per GPU of the most powerful GPU cloud instance available today. To get started with G3 instances, visit https://aws.amazon.com/ec2/instance-types/. Customers running complex modelling and 3D visualisation analyses, such as medical image processing, computer-aided design, and seismic visualisation jobs, require significant CPU processing power and memory. To offer the best performance for these graphics-intensive applications, the largest G3 instance offers twice the CPU power and eight times the host memory of the previous generation graphics instance (G2). The g3.16xlarge features four NVIDIA Tesla M60 GPUs with 64 CPUs using the latest custom Intel Xeon E5-2686v4 (Broadwell) processors, 488 GB of RAM, and Enhanced Networking using the Amazon EC2 Elastic Network Adapter. The G3 instance also features two Amazon EC2 “firsts” to speed video frame processing and improve image fidelity for graphics workloads: support for NVIDIA GRID Virtual Workstation capabilities, including the ability to support four monitors with resolutions of up to 4K (4096 x 2160), and hardware encoding to support up to 10 H.265 (HEVC) 1080p30 streams per GPU. [easy-tweet tweet="AWS provides the broadest range of cloud instance types to support a wide variety of workloads." hashtags="AWS, Cloud"] “AWS was the first to offer cloud instances customised for graphics-intensive applications. In 2010, we launched the CG1 instance type to provide a cost-effective, high-performance instance for graphics-heavy applications, and today, G3, our third generation GPU-powered instance, serves the most demanding graphics workloads such as 3D rendering and data visualisation,” said Matt Garman, Vice President, Amazon EC2. “Today, AWS provides the broadest range of cloud instance types to support a wide variety of workloads.
Customers have told us that having the ability to choose the right instance for the right workload enables them to operate more efficiently and go to market faster, which is why we continue to innovate to better support any workload.” BeBop Technology delivers specialised cloud-based services used by the world’s leading media companies and movie studios. “At BeBop, we help customers migrate to a cloud-based environment and integrate the cloud into their existing editorial and post-production workflows,” said David Benson, Chief Technology Officer and Co-Founder of BeBop Technology. “The advanced graphics features of Amazon EC2 G3 instances help our customers edit their work with the same quality and fidelity they would expect from a local workstation, and deliver results faster and much more cost effectively than previously possible.” Landmark, a Halliburton business line, is the leading technology solutions provider of data and analytics, science, software, and services for the exploration and production industry. “Landmark’s DecisionSpace 365, the industry’s first end-to-end E&P SaaS solution, is empowering oil and gas companies to make the shift to the cloud where they’re experiencing breakthroughs in efficiency across the E&P lifecycle,” said Chandra Yeleshwarapu, Global Head of Services and Cloud at Landmark. “The E&P Cloud is increasingly complex with very large datasets, 3D, dynamic algorithms, security, and global reach. The large memory and powerful GPUs within Amazon EC2 G3 instances enable Landmark to deliver value to our clients in ways that were not possible before.” ZeroLight is a leading omnichannel visualisation platform for the automotive industry. “Our real-time 3D car configurators allow new and returning customers to visualise every part of their vehicle to an extremely high standard, allowing them to make more informed purchase decisions,” said Darren Jobling, Chief Executive Officer of ZeroLight. “Amazon EC2 G3 instances will enable us to continue to deliver our unique, real-time, high-quality 3D experiences for automotive customers through every channel. With the increased rendering performance of G3, we will be able to build upon ZeroLight’s current cloud solution, adding new features to our online experiences that deliver higher basket value potential.” Customers can launch G3 instances using the AWS Management Console, AWS Command Line Interface (CLI), AWS SDKs, and third-party libraries. G3 instances are available in three instance sizes: g3.4xlarge, g3.8xlarge, and g3.16xlarge. G3 instances are available in US East (Ohio), US East (N. Virginia), US-West (Oregon), US West (N. California), AWS GovCloud (US), and EU (Ireland), and will expand to additional Regions in the coming months. ### IT Friendly Robots - Answers to Key Concerns With widespread Robotic Process Automation adoption across organisations, it’s essential that the IT department is effectively engaged. Paul Taplin, MD at Voyager Solutions addresses some of IT’s key concerns and provides some answers Robotic Process Automation (RPA) software is being increasingly requested and driven from within business operations teams to replicate process work that was previously done by humans. RPA is popular because it’s quick to implement, and minimises the need for costly systems integration. Although this all sounds great, the benefits of using a RPA approach may not be fully appreciated by the IT department.  
I have discussed areas of concern with IT professionals and provide explanations here to help them make more informed RPA choices.

It can work for IT

Although RPA is not always driven by the IT department, there’s a rapidly growing list of scenarios that benefit IT’s own internal operations. These can range from initial projects in service management and transaction processing – to resource-intensive, administrative and transactional work. As well as the obvious cost savings and efficiency improvements in the short term, RPA has the potential to challenge the way that processes are fulfilled in the IT department. RPA will provide future benefits too - allowing IT teams to digitise things that currently aren’t possible and provide valuable insights into forthcoming challenges.

Moving at pace

A real issue for the IT department is speed to implement - so costs are kept down and benefits are quickly realised. RPA addresses these goals as it’s much faster to implement than using a traditional IT approach - from a standing start - into live production. This RPA approach is also non-invasive to existing systems – as it reads from and writes to them as a user would - so no costly alteration is required. This means no lengthy design, build, test and rollout projects. The IT department needs to govern the lifecycle activities that are required to implement a process - but doesn’t need to follow an “industrialised” process. This is because some elements, such as “testing” and releasing, are required on a much lighter basis - as the target environments don’t need changing. Moving fast also keeps costs down, which provides many opportunities to deploy an RPA solution. Numerous benefits can accrue by putting automation in place quickly, such as resource savings (instead of employing staff full time), process quality enhancements and the elimination of backlogs. This is especially attractive if there is a fine imposed for missing service commitments to customers.

Wider scope of automation

RPA can be used more broadly, for example where traditional IT approaches are not cost effective, where access to 3rd party systems is required, and where RPA is used to process information that currently only humans can handle. There are significant areas of scope that traditional IT won’t reach for a very long time - if ever. These include:

Processes that may be required in the short term, while waiting for a longer-term ERP solution to deliver, or during gaps in releases that lead to the need to have temporary/interim processes

Processes “under the IT radar” that operations would benefit by automating - but aren’t significant enough to make it onto the IT development agenda, or are too expensive to justify

Processes relating to 3rd party systems – often no IT interface exists between systems owned by different parties who need to work together, or the interface is sub-optimal, resulting in re-keying of information (a simple sketch of this scenario appears below)

When considering RPA, and for the reasons above, it should be seen as complementary - rather than competing with existing automation plans and approaches.

Developing an operational automation capability

By giving business operations the capability to automate, they can swiftly respond to changes in the operating environment - without reliance on the IT function and lengthy capital programmes. RPA should be applied with a mindset to continuously improve, rather than to implement a single project. This means approaches to automation should be seen as a capability, to address specific organisational needs.
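As a toy illustration of the re-keying scenario mentioned above, the sketch below reads rows from a spreadsheet export and posts each one to a legacy web form, field by field, the way a human operator would key them in. The file name, form URL and column names are all hypothetical, and real RPA products typically drive the existing user interface rather than calling HTTP endpoints directly; the point is simply that the robot repeats a human's keying steps and records what it did.

```python
import csv
import requests  # assumption: the legacy system happens to expose a plain web form

# Hypothetical export from system A that staff currently re-key into system B.
ORDERS_FILE = "daily_orders.csv"
LEGACY_FORM_URL = "https://legacy.example.internal/orders/new"  # placeholder URL

def rekey_orders():
    """Replay each exported row into the legacy form, mimicking a human operator."""
    with open(ORDERS_FILE, newline="") as f:
        for row in csv.DictReader(f):
            payload = {
                "customer": row["customer_name"],   # hypothetical column names
                "reference": row["order_ref"],
                "amount": row["amount"],
            }
            response = requests.post(LEGACY_FORM_URL, data=payload, timeout=30)
            response.raise_for_status()  # flag failures for a human to review
            print(f"keyed order {row['order_ref']}")

if __name__ == "__main__":
    rekey_orders()
```

Because every run leaves a record of exactly what was keyed and which rows failed, an automated process of this kind is also one source of the audit data discussed below.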
Once the capability is established, this gives business operations the control and ability to quickly and effectively automate work. There are a number of largely positive implications of progressing down this path, as business operations’ ability to implement automation becomes less dependent on IT development priorities and budget constraints. Automation becomes a tool available to everyone in operations, and the discipline of RPA can improve the quality and efficiency of operations. [easy-tweet tweet="Implementing Robotic Process Automation is an inexpensive way to achieve results" hashtags="Robotics, Technology"] For example, business operations and IT will gain insight derived from process automation. This is because automation provides substantial data, which gives a precise audit of activity and detailed feedback on case exceptions and their causes - big data, automation and analytics then become an action list. If a process is automated using robotic automation, it becomes easier to include in a more traditional IT project, as the requirements are already determined, standardised and documented.

Final thoughts

Implementing RPA is an inexpensive way to achieve results quickly – and requires minimal up-front investment to automate processes and activities. When IT teams opt for developing an RPA capability to help business operations achieve their objectives, there are many compelling efficiency benefits that can be achieved. Service delivery risks and performance can be improved by RPA, with its ability to minimise the cost of quality by reducing or removing quality checks and errors. This then leads to a reduction in the compensation paid to customers for processing delays. By providing a better service to customers, RPA can significantly reduce resource-intensive queries, chase-ups and complaints. Ultimately, RPA puts the focus back into the hands of the process experts to drive automation projects - without the need for IT enablement projects that may be costly and run the risk of delays and overruns. With value swiftly delivered, the IT team is free to focus on areas of core competency. http://www.voyagersolutions.com

### Ocado Technology's Anne Marie Neatham at Futurestack

Compare the Cloud speaks to the COO of Ocado Technology, Anne Marie Neatham, at Futurestack. Anne speaks about how Ocado is an e-commerce-only business and how it uses the New Relic browser application. For more information on Ocado visit: http://ocadotechnology.com/

### Searching for the Right Help Desk Software

The right help desk software can make a huge difference in the success and bottom line of your firm or business. In fact, some would venture to say it serves as the foundation for client relations by providing your business with a robust yet streamlined in-house solution. That being said, not all help desk software is created equal, so you need to consider all of the angles. Here are seven key tips for selecting the best help desk software for your business needs.

1. Keep Your Budget In Mind

When shopping for help desk software, you’ll find solutions ranging in price from $0 all the way up to $1,000 and more. Some vendors charge a monthly fee and offer a slew of amazing features. However, it’s easy to go overboard with features and pricing when searching for a help desk. As is the case with most things, you don’t always get what you pay for with help desk support software. Fortunately, there are plenty of options and price ranges out there to choose from.
There are even some great open source solutions for the budget-conscious. [easy-tweet tweet="Helpdesk support software helps you manage client relations and keep track of workflow" hashtags="Software, Workflow"]

2. Decide On Your Needs

Helpdesk support software not only helps you manage client relations, but it helps keep track of workflow as well. Without a competent help desk system, ensuring all of your customers’ needs are addressed promptly is all but impossible. For many companies, the ability to submit tickets via email is a must-have feature; for others, it may not be. Knowing exactly what you’re looking for will help you to quickly and accurately weed out help desk support software lacking the features you need.

3. Ensure It’s Easy To Use

Your help desk support software must be easy for your clients and employees to use. If you opt for an overly complicated system, it will disrupt workflow and cause your business to run less smoothly than it should. When it comes to help desks, there are some easy-to-use tools, and there are some that are incredibly difficult to operate. Take advantage of any trial periods or demos to find out if the tools you need are user-friendly for both your customers and employees. Do your research and make sure to choose wisely for both of their sakes. Once you decide on a winner, Tech Trace states the implementation of an employee training program is a good idea for maximising its effectiveness.

4. Consider Using a Web Interface

Some tools offer both email and Web-based solutions for ticket submission. Although many companies prefer the ease and straightforwardness of email ticket submission, others prefer the control afforded by Web-based systems. With a Web help desk portal, you have more control over when, why, and how help requests are submitted.

5. Assess Ticket Management Features

It would be logical to assume all help desk solutions would allow you to manage tickets granularly and provide you with assigning, escalating, cancelling, and closing capabilities. However, this isn’t always the case. Some help desk solutions provide users with greater control over their tickets than others. Therefore, you should think about how much control you want to have over your help tickets before shopping the marketplace. Don’t settle for software that doesn’t provide you with the control you need.

6. Expand Your Brand

Your help desk needs to be fully branded for your company. In addition to helping with continuity, a branded help desk legitimises your company in the eyes of customers and provides a professional look. Thus, make sure the help desk support software you select can be branded with your company logos and information. While critical, this is unfortunately often overlooked. Customer service is key, but the value of consistent branding should not be underestimated.

7. Don’t Sleep On Security

In most cases, your help desk ticketing system will pass client data to and from servers. This data can contain names, addresses, passwords, and a bevy of other sensitive information you’re responsible for safekeeping. For this reason, the software you choose should facilitate secure data transfer. If you opt for an email-based ticketing system, make sure to consider SSL or another trusted form of encryption. Well, there you have it. By following these tips and performing your due diligence, you’ll know what to look for when searching for the right help desk software and begin benefitting from a much more productive and reliable workflow in no time at all.
### Falcon Media House and Media Nucleus sign MoU to target Southeast Asia and Africa OTT and Broadcast Markets

A new partnership between global digital media group Falcon Media House and Media Nucleus, a well-established digital and broadcast technology products company, will offer an industry-leading OTT service to millions of users. The partnership enables Falcon Media House to offer an OTT platform to medium and large size broadcasters with a potential reach of 250 million users in India alone, as well as parts of its content portfolio to the content service providers in the region. Media Nucleus, a recognised specialist offering innovative broadcast and Pay TV solutions, with clients ranging from the state broadcaster Doordarshan in India and Fox Movies in the Middle East to Zuku in East Africa, will enable its customers to experience the best QoS & QoE and seamless video streaming using Falcon Media House’s Q-Flow technology. [easy-tweet tweet="Q-Flow overcomes the challenges of congested and slow connections to deliver content to the end user." hashtags="Technology, Digital"] Going beyond traditional network limitations, Q-Flow overcomes the challenges of congested and slow connections to deliver content to the end user using the most efficient and cost-effective route, resulting in seamless streaming over even the most challenging networks and mobile conditions. Rather than investing in hardware and laying new network cables, the joint solution will enable broadcasters and cable companies to increase their market share with a lower capital expenditure. Q-Flow provides a unique win-win, representing a solution for both the OTT operator and their end users alike. Quiptel will integrate with Media Nucleus’ Subscriber Management Software CAMS to provide billing solutions to customers. Falcon Media House Chief Executive, Gert Rieder, commented, “We’re excited to partner with a highly established business which recognises that live and on-demand content delivery via OTT represents the future of broadcasting. The local networks and mobile services in India, Africa and Southeast Asia are not reliable enough to provide high quality streaming experiences. Media Nucleus has a huge established customer base including many of the leading broadcast companies in these regions and we’re looking forward to enabling them to deliver a world class service using our software which will help them grow in a cost efficient manner, avoiding the huge costs of building infrastructure.” Bhaskar Majumdar, Director and Investor of Media Nucleus, said, “The explosion of the smartphone market, and with it the increase in demand for streaming video content, requires a robust and reliable OTT service both inside and outside the home. Media Nucleus, with its reach across subcontinent and MENA markets, will enable the Quiptel OTT platform and its content to capture share in these markets.”

### Tackling Tomorrow's Biggest Opportunities with Cloud Native NFV

Two of the biggest opportunities in communications today – providing next generation wireless networks and consumer video entertainment – demand flexible, scalable networks that can create new services in unprecedented ways. Network Functions Virtualization (NFV), which has captured the imagination of the telecoms industry, helps service providers take advantage of these opportunities; NFV frees networks from years of proprietary hardware and slow service development cycles in favour of continual service innovation on standard high-volume servers, switches, and storage.
But the value of NFV goes beyond the capital efficiencies gained by using non-proprietary hardware in the network. A cloud-native NFV implementation creates a truly new kind of network, one that's easier to control and configure, giving service providers a chance to build a foundation for future services using a unified, programmable infrastructure. [easy-tweet tweet="5G networks need to be highly scalable and deliver ultra-low latency connectivity." hashtags="5G, Data"] What's more, a cloud-native approach to NFV is required if a service provider wants to compete seriously in the next-generation of mobile services. But network functions that are simply ported over from dedicated appliances to run in the same way they always have, but on different hardware, won't get the job done. 5G networks need to be highly scalable, deliver ultra-low latency connectivity and support a huge number of concurrent sessions. Download speeds will increase, the number and kind of connected devices will proliferate (driverless cars!), and that hyper-connected network must be reliable and underpinned by a robust security strategy. Tomorrow's mobile network can't rely on the call restriction control mechanisms of a traditional phone network. If it does, the next large-scale natural disaster – like the Great East Japan Earthquake – will lead to mobile networks that can't stand up to the massive number of call attempts as people are urgently trying to find out if their family and friends are safe. Cloud native design is critical, and it differs from the traditional approaches to networking in two main ways. First, cloud-native VNFs are based on several small microservices, not inflexible, monolithic software architectures. Second, these micro-services are deployed in virtualized containers for maximum portability and flexibility. Microservices are small, independent processes that are highly composable and reusable, scale independently, and can be incrementally deployed and enhanced. Microservices necessarily enable service providers to introduce and upgrade applications more quickly. A microservices approach also changes the way features and functions are built and developed; small teams of people can work on a particular part of a big problem and build something that’s relatively simple and compatible with the greater whole. For the service provider, in general, this approach allows them to innovate more rapidly and be more agile, which is what they are looking to achieve with NFV in the first place. Containers are isolated execution environments on a Linux host that behaves more or less like a full Linux server. Containers, though, share a single instance of a Linux OS, compared to virtual machines, which simulate all of the hardware and software in a server. They're more efficient; they reboot faster than virtual machines and keep service downtimes to a minimum. They scale, chain, heal, move and back-up more quickly and easily, so network operators can achieve high service availability and resiliency without requiring one-to-one hardware or virtual machine redundancy. Again, it's one of the inherent characteristics of virtualization in the enterprise that makes the technology so attractive for communications service providers. Put it all together – cloud-native virtualized network functions built in container environments – and you can achieve theoretically limitless and cost-effective scale. Services can start up more quickly, and there's better fault isolation when one part of a service fails. 
Also, microservices can be added as needed to handle increased capacity or backup needs. Cloud native VNFs spin up quickly under the control of next generation management and network orchestration platforms and are chained to form new services that will be consumed by business and residential customers. Many of these services will be found in the home. There, service providers can use cloud-native NFV to change the way consumers are entertained and automate processes in the home. A virtualized residential gateway – one that uses native cloud VNFs and moves the traditional home gateway functions (authentication, security, network address translation) into the cloud – can replace a separate set-top box for TV services at each television in the home, over-the-top Internet video devices, Wi-Fi routers and DVRs. This not only dramatically simplifies service management but provides immediate CAPEX savings. Also, service providers can open up new revenue opportunities by remotely diagnosing and fixing connectivity and smart home device issues, and they can introduce new digital smart home services from partners, content providers or third-party application developers. Cloud native VNFs can deliver unified communications services for the home: high-quality voice service and enhanced calling features, network-based voicemail and IP text and video messaging, plus the ability for each family member to take all of that functionality with them, on their mobile devices, wherever they are. To sum up, a native cloud approach to NFV is necessary for service providers to compete effectively and deliver the digital services and experiences customers want. In the next few years, we'll see an explosion of new IoT devices, video-based communications and entertainment services and smart home applications. That will underscore the need for the network functions supporting these products and services to be designed in a way that doesn't just reside in the cloud but takes full advantage of the economics, agility, and flexibility of the cloud. We are entering a new era of communications and, as such, service providers need to prepare now for scale, service creation opportunities and cost efficiencies that can only be delivered with cloud-native VNFs. ### Citrix: NetScaler SD-WAN event Citrix's NetScaler SD-WAN - Scalable, Reliable, and Cost Effective Application Delivery. As application and desktop virtualization increase or applications move to the cloud, IT managers face the challenge of providing these applications without a performance penalty to branch and mobile users. Citrix NetScaler SD-WAN can help you effectively and economically increase WAN throughput while accelerating enterprise applications and ensuring the performance and availability of mission-critical applications. Join Arrow and Citrix to learn more We would like you to join us to discover how to offer your customers reliability, optimisation, security and manageability. Join us for an EXCLUSIVE event on 19th July 2017 at Citrix Executive Briefing Centre in Paddington to learn why you should be talking to your customers about Citrix NetScaler SD-WAN. 
——————- Agenda
9:30am Registration and arrival
10:00am Welcome & Introductions by Dan Griffith, SDWP Business Sector Manager at Arrow
10:15am Market Landscape & Opportunity for Partners by Mark Hardy, Director for Northern Europe – Networking at Citrix
10:45am Typical Use Cases & Customer Success Case Study by Mark Hardy, Director for Northern Europe – Networking at Citrix
11:30am Tea & Coffee break
11:45am Technology Demo – by Justin Thorogood, EMEA Business Development at Citrix
12:15pm Why Arrow & Citrix – by Dan Griffith, SDWP Business Sector Manager at Arrow
12:45pm Q&A
1:00pm Lunch & Networking
* Please note places are limited. REGISTER TODAY ——————-
Date: 19th July 2017
Time: 9:30am - 1:00pm
Location: Citrix EBC – Paddington, 20 Eastbourne Terrace, 16th Floor, London W2 6LE
——————- For any Citrix queries, or to speak to one of our team, contact citrix.ecs.uk@arrow.com or call 0800 983 2525. For any queries regarding the above event, please contact Maria Ferrara, Marketing Manager, at maria.ferrara@arrow.com ### The Cloud vs. on Premise Dilemma When is data best stored in the cloud? When is it best stored on-premise? There are several considerations when it comes to deciding where data must reside. In general, organisations are inclined to store sensitive or personally identifiable information (PII), such as financial or customer data, on-premises where they have complete control. This is especially true when access to the data is primarily sought by internal employees or contractors. On the other hand, data that is used by external parties, such as business partners or vendors, can be stored externally in the cloud to improve access to information and cross-company collaboration. However, as new security tools and approaches become available, the decision to store information on-premises or in the cloud can come down to the availability and cost of storage. Many organisations today are rapidly moving data storage to the cloud to take advantage of the improved user experience and the significant cost savings that weren’t possible with traditional, on-premises storage solutions. What are the legal issues around storing data, whether it is on-premise or in the cloud? What is the current situation in the UK? In most respects, the legal issues around whether data is stored on-premises or in the cloud are similar. There are specific requirements that must be addressed in both cases, such as governing who has access to information, where it is physically stored – including country or region requirements – and whether it must be encrypted ‘at rest’ and ‘in transit’. However, there are additional implications for data stored in the cloud. For instance, organisations must be ready to validate the security and data protection controls put in place by the third party hosting the data. The organisation will be required to show that any sensitive data residing in the cloud is protected to the degree required by law, especially with GDPR just around the corner. Another important legal aspect of storing data in the cloud is drafting the appropriate security requirements and service level agreements with third-party cloud vendors. It’s important all parties understand and agree to the specific safeguards which will be used and how the third party will respond if any inappropriate activity is detected. How will things change when it comes to data storage in the cloud/on-premise when GDPR comes into force?
Specifically, with GDPR on the horizon, many organisations will require a considerable shift in their thinking and their IT business support systems as the focus on the protection of personally identifiable information (PII) is magnified. The number one issue for organisations will be accurately identifying where PII data is stored. Once an organisation has a handle on the location of data, it can implement the required legal oversight and controls for each location, or move the data to a location that meets the minimum requirements for GDPR. [easy-tweet tweet="Embedding privacy early in the design process of systems ensures a holistic view of data" hashtags="Data, Cloud"] GDPR legislation specifically introduces the idea of ‘privacy by design’ which means all new cloud and on-premises systems must be architected to ensure private and personal data compliance at the start and end of all business or service process. Embedding privacy early on in the design process of systems ensures enterprises have a holistic view of what data they have, its availability, who can process it and who has access to it.  This means governing access in a sustainable, consistent and auditable way. The reality is, privacy by design and securing PII is no longer merely a desire but is set to become a legal mandate. It’s critical that any business subject to GDPR takes steps to understand the legal issues around storing data and how to implement the relevant controls and best support its obligations. Failure to do so will result in heavy financial fines and put the organisation’s reputation at risk in the longer term. What are the drivers for putting data either in the cloud or on-premise? The main drivers are the availability of the data, storage costs and security. Understanding the trade-offs between these three areas is how the organisation will ultimately decide where data can and should be stored. What is likely to change in the next 12-18 months concerning data storage for organisations? What should be on their radar and why? Over the next few years, more organisations will look at cloud-based storage options - that’s a given. As cloud solutions seek to address security permutations, enhance productivity gains and save on the corporate wallet, organisations will begin to seriously consider migrating data to this platform as the benefits significantly outweigh the risks. Transition to cloud-based applications and business productivity platforms such as Office365 are also driving this transition. As more and more data starts out in the cloud, leveraging cloud-based storage solutions tied to these applications and platforms will become the default option for many organisations moving forward. ### Procurri Named in Gartner Report on Third-Party Maintenance Providers for Data Centre Procurri Corporation Limited (“Procurri”, and together with its subsidiaries, the “Group”), a leading global independent provider of Data Centre Equipment and Lifecycle Services , announced today that Gartner has listed Procurri among 18 other companies in its report on “Competitive Landscape: Partnering with Third-Party Maintenance Providers for Data Center and Network Maintenance Cost Optimization”, published on 7 July 2017 by Christine Tenneson. Commented Mr Sean Murphy, Procurri’s Global Chief Executive Officer, “We are delighted and honoured to be one of the selected few companies highlighted in this report. 
Being featured in the highly fragmented third-party maintenance (“TPM”) services industry is an acknowledgement of our proven track record in the data centre TPM market, and certainly positions us well as the go-to partner for quality and efficient TPM solutions and services worldwide.” [easy-tweet tweet="What Gartner has identified in this report is the core of Procurri’s unique eco-system" hashtags="DataCentre, Gartner"] In the report, Gartner identifies a major shift occurring in the TPM market: the third-party data centre maintenance, third-party network maintenance and secondary hardware markets are coming together. Gartner notes that most providers in this space want to offer support across servers, storage and network. Today, there is more cross-pollination between secondary hardware sales and TPMs. “What Gartner has identified in this report is the core of Procurri’s unique eco-system: we have an extensive global presence in three key regions – the Americas, Europe and Asia-Pacific, on top of our expertise in three key markets – hardware resale, TPM for both data centre and storage, as well as IT Asset Disposition (“ITAD”). This is our key competitive edge as we are able to offer a comprehensive suite of data centre hardware and services worldwide, all from a single point of contact. I am heartened that we have gotten a first-mover advantage on this upcoming trend,” concluded Mr Murphy. Prior to this mention, Procurri was listed as a Sample Vendor in Gartner’s Report titled “Lower Both Storage Acquisition and Ownership Costs by Using Third-Party Maintenance”, published on 3 March 2017 by Stanley Zaffos. The report delved into third-party maintenance, specifically for storage, and how the use of third-party maintenance may enable the safe extension of service lives of installed storage arrays, make changing storage vendors easier, and provide leverage when negotiating infrastructure refreshes with incumbent vendors. Procurri was also identified as a Representative Vendor in Gartner’s Report on Market Guide for IT Asset Disposition, published on 22 November 2016. In that report, Procurri was listed as a provider offering a different value proposition than ITAD service providers, through its “converged infrastructure” approach that utilises ITAD as a source of used IT equipment with which it can build complete “load and go” solutions. ### United we Stand - How Cloud is Unifying the Workplace  The forces of digital disruption and the changing work styles of the workforce are pushing organisations to become more responsive to both the needs of their customers and their employees. Companies that don’t keep pace will find themselves on the outside looking in when competitors create fresh ways of doing business and establish new expectations for customers as well as employees and partners. Legacy technology, especially in Unified Communications (UC), is now reaching the stage where it's literally holding companies back. The forces of digital disruption In recent years despite a challenging business environment, unemployment has come down to near record lows. If businesses want to attract and retain the right talent; it is crucial that employees can move seamlessly between multiple communications platforms to collaborate with colleagues. As new employees enter the workforce they are increasingly demanding access to on-the-go mobile solutions and easy to use technology in the workplace. 
Organisations that fail to deliver risk losing these employees unless they think about how they can create a positive Unified Communications user experience. Of course, it is not all bad news. Companies prepared to digitally transform their workplace will be able to achieve better business outcomes with less effort. While Unified Communications has been around for a long time, it is only now that technology advances are truly meshing with the business opportunities enabled by a changing workplace. Having worked as a VoIP support engineer since 2002, I have seen many changes in Unified Communications over the last 10 years. Despite the mass adoption of VoIP, on-premise infrastructure is often expensive, difficult to manage and less efficient to deploy. Now you can have voice, web, video, chat, co-browsing, and on-demand experts who may be in different locations. Breaking down the workforce silos Businesses have traditionally divided their communications between a front office, CRM environments and their back-office experts. Cisco AI modelling enables them to predict customer interaction within the contact centre environment and compile information to route incidents via skills-based routeing, where experts deal with more complex requirements. Customers are directed to the best-suited people, regardless of where they are located, so they no longer need to re-explain their situation.* Customer experience could move from being an issue to an important competitive differentiator. (*66% of respondents have switched company due to poor customer service. Source: Accenture study of customer service practices.) Unified Communications Challenges at Scale When it comes to larger organisations with one or more contact centres, efficiency at scale and a more productive workforce are vital. A contact centre needs to swiftly ramp up or down to accommodate peaks. Large organisations need to deal with customer issues quickly and effectively, offer seamless social network integration, and at the same time support secure mobility and flexible working practices. We are all familiar with chat, internet video and voice calls; however, the demand for back-end systems integration and for business tools that aid the way we interact within organisations is increasing. A new customer experience on a single cloud platform is required to serve our 24x7 mobile workforce. At Hutchinson Networks, we engineered Vertex, a Unified Communications as a Service (UCaaS) cloud solution that enables clients to move between having all their equipment on the premises and enabling their workforce to be fully mobile. Through device and location independence, users get a seamless experience: smaller organisations with fewer users may connect to the service over the Internet, while larger enterprises may adopt private Ethernet connectivity with our bespoke services. [easy-tweet tweet="AI can be used to improve our day to day collaboration in business" hashtags="AI, Communication"] What does the future hold? Consumers are now becoming familiar with Artificial Intelligence (AI) through voice assistants such as Google Home, Amazon Echo or Apple Siri. But now AI can be used to improve our day-to-day collaboration in business, such as using Cisco Spark to take notes in a meeting, or Cisco Spark Bots to apply AI to compile shared documents and then set up a meeting to discuss them. Security is not a concern as messages are encrypted and can only be shared within the team rather than the wider organisation.
And from a Social Media perspective, by utilising Unified Communications, the vision is equally exciting. Using WhatsApp customer service teams can set up a continuous line of communication until the incident is resolved. Through machine learning data collected from social media interactions can be used to get a deeper understanding of the customer. It can be used to find the best channel to communicate with them or to offer them a more personalised service through training AI to automate answers to repetitive questions creating a UC-centric workflow that incorporates application data. We recently worked with a post-production film company based in Glasgow which produced special effects for well-known blockbusters and post production services. Using Vertex, UC platform the post-production house was required to translate its ideas from ‘the physical to the virtual’ and to share its creative content across teams in different time zones and on some mobile platforms. Using this technology, they could think on their feet by making annotations or drawings, then collaborate with each other on them. Whilst Unified Communications in the cloud will continue to evolve, businesses can transform the way they work right now through simple tools that combine all their application data and enable them to collaborate better. Whether a business is large or small, adopts full or hybrid cloud, they can all benefit from integrating their core services, reducing the need for complex infrastructures, enhancing their user experiences and minimising their costs. ### 20 minutes Aurelien Capdecomme at Futurestack Compare the Cloud speaks to CTO for 20 minutes, Aurelien Capdecomme at Futurestack. Aurelien speaks about who 20 minutes are and explains about the future of e-commerce platforms. For more information on 20 minutes visit: http://www.20minutes.fr/ ### The Flash to HTML5 Switchover and its Effects on Video Advertising Following a number of announcements and recommendations, in mid-2016 Google Chrome finally began its process of transitioning from Flash to HTML5 and throughout 2017 the switch has accelerated each month. As a result, all websites running video need to use an HTML5 video player and advertisers must use HTML5 compliant tags or users will not see content without manually enabling Flash in Chrome. With this in mind, Google expects to deprecate Flash completely this year, and we’ve already seen a significant shift in that process with the makeover of various sites and major desktop browsers announcing plans to default to HTML5. [easy-tweet tweet="Across the whole internet, the Chrome browser has a 60% market share" hashtags="Chrome, Data"] While Flash has played an important part in our technological history, helping to deliver countless videos, gaming and advertising content to users worldwide, it’s also been the subject of an ill-performing, unstable and insecure software platform over the years. It’s not the first and certainly won’t be the last victim of innovation, but one of many that have seen a progressive transition over the years whilst failing to keep up with an ever growing and demanding industry - not to mention whose focus has swiftly turned to mobile devices, which famously don’t support Flash very well, if at all. Across the whole internet, the Chrome browser has a 60% market share, but research will tell you that audiences utilise different browsers based on their needs or wants. 
For example asking a user to accept the use of Flash will significantly break the experience they have on that page or video, not to mention the ‘extra step’ added to a process that should be kept as seamless as possible, and one that will benefit consumers, advertisers and indeed website owners. That said, this high percentage market share from Chrome cannot be ignored, and with some online companies seeing a higher use of Chrome by their users, they also need to be acutely aware of how the switch to HTML5 can affect things like pre-roll ad markets for publishers, for example, who may rely on monetizing through their video inventory. Despite repeated warnings by Google of the imminent deprecation of Flash in Chrome, the industry has been slow to respond and is now feeling the pressure to switch as the amount of available flash inventory is rapidly decreasing. This transition is happening now and over the next few months, Flash video inventory will simply disappear within Chrome. It’s difficult to predict exactly how this will affect the market and indeed the advertisers who hold all the monopoly money, but timings and availability of HTML5 based inventory will likely be high on the agenda. Get your launch timings wrong as a publisher and you risk jumping the gun. Arrive late to the party and you will have missed a mountain of an opportunity. Simply put, timing is essential and both publishers and advertisers will mostly benefit by working together. That hurdle aside, technical challenges and their solutions also play an important part in the process. Despite this and all that goes with it, however, there is further room for growth, sustainability and improved services. It may be true that costs will rise with the implementation of new solutions, site overhauls and so on, but a greater, more refined user and advertiser experience is what ultimately lies on the horizon for our industry. As ever is the case, the transition to HTML5 video cannot be achieved by a single party. The connected nature of the video advertising ecosystem means all parties need to push for faster support of HTML5 as failure to do so will result in a loss of inventory for advertisers and decreasing revenues for publishers. ### HPE forms a go-to-market alliance with Microsoft through Cloud28+ Today at Microsoft Inspire, Hewlett Packard Enterprise (NYSE: HPE) announced it has formed a go-to-market alliance with Microsoft through Cloud28+, the world’s largest, open community dedicated to increasing enterprise cloud adoption. Microsoft joins Cloud28+, an initiative led and sponsored by HPE, as a premier Microsoft Azure Stack technology partner. By becoming a Cloud28+ member, Microsoft will combine forces with HPE to provide partners with new business opportunities and tools, while accelerating customer business outcomes powered by a rich hybrid IT ecosystem. Through Cloud28+, HPE and Microsoft will create a joint Azure Stack ISV onboarding program focused on connecting software vendors to HPE and Microsoft solution and service providers, helping them to grow their market visibility and develop new revenue streams. In turn, HPE and Microsoft solution and service providers will benefit from dedicated lead generation campaigns and sales enablement activities to bolster their Azure Stack sales. 
Customers will gain access to an even more comprehensive community and cloud service catalogue, allowing them to find their own right mix of partners and hybrid IT solutions to match their specific workload, industry and data sovereignty requirements. The mission of Cloud28+ is to offer customers, regardless of their location, a seamless on- and off-premises experience, based on a common platform of shared resources – services, tailored solutions and knowledge. Recently named by 451 Research as the largest cloud aggregator, Cloud28+ continues to develop rapidly since first launching its catalogue service in December 2015 and expanding its worldwide community one year later. Today, Cloud28+ counts more than 500 partners – service providers, solution providers, ISVs, distributors, system integrators, universities and public sector organisations – contributing to a catalogue of more than 18,000 build and consume services. Focused on providing high levels of data sovereignty and customization, Cloud28+ partners offer services out of more than 300 data centres in every region around the world. [easy-tweet tweet="Through Cloud28+, HPE and Microsoft will create a joint Azure Stack ISV onboarding program" hashtags="HPE, Microsoft"] “HPE and Microsoft are taking a step together to enable the channel in a new way and its ability to compete effectively in a digital, platform-driven world,” said Crawford Del Prete, Executive Vice President, Worldwide Products and Chief Research Officer, IDC. “This is an important partnership for HPE and Microsoft. Combining the power of the HPE channel community through Cloud28+, with all the possibilities of Microsoft Azure Stack, makes for a strong partner value proposition and a compelling environment for customers to more easily engage.” Additional partner benefits include: Access to the Cloud28+ digital marketing engine to save time and money. Partners can publish thought leadership content, launch online campaigns tied to their offerings and easily track results. A worldwide community and digital business platform to increase visibility and new go-to-market avenues. Partners can quickly build new alliances and propel their offerings on the market. A ready-made ecosystem to accelerate and differentiate HPE ProLiant Azure Stack ISV Readiness and offerings to increase sales. Partners gain time to market with Azure Stack investments and go-to-market engines; richer ISV content from which to tap into; and technical readiness with dedicated Microsoft and HPE training and events. In the future, Cloud28+ will also enable a customer transaction link to the Azure Cloud Solution Provider program. Access to flexible HPE IT Investment and Asset Management solutions that serve as the bridge between technology and finance solutions to help partners and their customers achieve their business goals. “As a long-standing community member, we’re pleased about Microsoft becoming a Cloud28+ technology partner and this further endorses our long-term commitment,” said Tony Limby, GM, IAAS and Hybrid Cloud, BT. “Our customers are looking for hybrid /IT services powered by trusted brands like HPE and Microsoft. 
Now, we’ll be able to work with them to deliver an even stronger service and portfolio of best-of-breed Microsoft Azure Stack solutions.” “With IT at the core of every business today, Cloud28+, gives all members – whether customer or partners – the opportunity to quickly connect, share knowledge, and innovate together,” said Xavier Poisson, vice president, Indirect Digital Services, Hewlett-Packard Enterprise. “It’s a transformation engine around shared values that boosts broader business opportunities for everyone involved.” Alyssa Fitzpatrick, general manager, Worldwide Channel Sales, Microsoft Corp., said, “It’s all about choice and trust. By joining Cloud28+, we will help empower customers with market-leading hybrid IT offerings fueled by partners who know and serve them best. We see Cloud28+ as a real differentiator in terms of helping our customers and partners build their own ecosystems.” As customers focus more heavily on the connectivity, control and compute opportunities presented by the growth of the internet of things (IoT), HPE and Microsoft will also offer customers and partners a rich ISV and distributed service provider network to support the move to computing at the edge. Announced at Discover 2017, Cloud28+ now hosts HPE’s Connect IoT hub, which brings together established hardware and software vendors, system integrators and advisory partners, such as Deloitte, service providers, and OEMs of HPE technology, as well as small innovative technology partners and IoT specialist channel partners. Connect IoT is structured around use cases, industry verticals and partners that have expertise leveraging IoT within the enterprise, throughout the industry and across both metro and global scale networks. Availability Cloud28+ is open to all customers to join free of charge. There is no cost to search the catalogue or connect with partners. Likewise, customers can submit their request for proposals directly through the Cloud28+ portal for a customer response from the community. Microsoft ISV and HPE Partner Ready partners will now have immediate member access to Cloud28+, as part of their respective Microsoft and HPE partner programs. There is no cost for partners to join Cloud28+, nor are there any brokerage fees through the platform. To learn more about Cloud28+ and join the partner network, visit www.cloud28plus.com. About Hewlett Packard Enterprise Hewlett-Packard Enterprise is an industry leading technology company that enables customers to go further, faster. With the industry’s most comprehensive portfolio, spanning the cloud to the data centre to workplace applications, our technology and services help customers around the world make IT more efficient, more productive and more secure.   ### New Relic's Lee Atchison at Futurestack Compare the Cloud speaks to Senior Director for New Relic, Lee Atchison at Futurestack. Lee speaks about his previous experience at AWS and the future of e-commerce platforms. For more information on New Relic visit: https://newrelic.com/ ### Seamless Business Expansion With a Cloud-First Strategy Expanding physically to new locations, or digitally by introducing new applications for customers or partners, are key challenges for many enterprises. For geographical expansion, they have to determine the right region for growth, identify the best workspace, recruit talent, figure out how to best engage customers and account for the total financial impact of expansion. 
When it comes to expanding digitally by introducing new ways for customers and partners to interact online, companies have many strategies, technology, and user experience challenges to overcome. To simplify and aid this expansion, many organisations are tapping into the benefits of the cloud. With the cloud, companies no longer need to worry about the on-premise data centre, and software infrastructure previously needed to support new locations. Cloud applications now enable people to collaborate easily in real time, no matter where they are located. Modern cloud Software-as-a-Service (SaaS) applications are built to provide great web and mobile experiences, making access for users even more productive. Infrastructure-as-a-Service (IaaS) and Platform-as-a-Service (PaaS) also alleviate the need for upfront investment in tools required to support IT. As companies embrace the distributed workforce, applications such as Microsoft Office 365, Box and Slack are just a handful of popular cloud-based services that enable employees to work together in real-time, regardless of their location. Rather than relying on in-person meetings or calls, these cloud-based tools allow more fluid interaction, leading to faster response times and stronger relationships between staff and customers. This not only helps to maintain a company’s culture across regions but also helps organisations provide their external stakeholders, such as partners and customers, with a better user-experience. But while the benefits of moving to the cloud undoubtedly outweigh the negatives, rolling out cloud apps can also present its challenges. More apps, more headaches? Not only does a successful cloud application deployment require the right people to have access to the apps they need, it also relies on giving those people access from any device. Due to the multiple log-ins required for each service, this can create a massive headache for IT teams. Onboarding and provisioning employees across an organisation and ensuring they can access trusted cloud applications from devices can be laborious and time-consuming. This requires new accounts to be set up and provisioned for each service, and devices to be configured with policy-based controls to ensure security. When expanding headcount quickly, this can significantly slow down the time taken to get new employees up and to run, often leaving some without access. There is also no guarantee that they are accessing from trusted devices. And with more accounts come more passwords. People can only remember a finite number of credentials, and will often use the same one across multiple accounts, creating a significant security risk. If hackers can breach one password on a less secure website, they are likely able to gain access to much more – including sensitive work information, or a personal bank account. It’s also not just employees that organisations need to keep secure. When companies build their apps or websites to engage with customers and partners, they need to ensure that their customers' and partners’ user data is kept safe as well. The potential damage a security breach can have on brand reputation and the overall business is enormous. [easy-tweet tweet="An integrated identity...provides a secure way for users to access their cloud-based apps " hashtags="Cloud, Security"] Identity unlocks the benefits of the cloud Cloud-based identity and device management have become a viable solution in addressing many of these issues. 
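One common building block behind such identity services is token-based access: after a user signs in with the identity provider, each application verifies a signed token rather than handling passwords itself. The sketch below is illustrative only, using the PyJWT library; the signing key, audience and application names are invented placeholders, and real identity providers use asymmetric keys and standard protocols such as OpenID Connect or SAML.

```python
# Illustrative sketch of token-based access, one building block behind
# cloud identity services: the app verifies a signed token instead of
# handling passwords itself. Requires the PyJWT package (pip install pyjwt).
# The key, audience and app names are invented placeholders.
import jwt  # PyJWT

SIGNING_KEY = "demo-signing-key"   # placeholder; real identity providers use asymmetric keys

def issue_token(user, apps):
    """Stand-in for what an identity provider returns after sign-in."""
    return jwt.encode({"sub": user, "aud": "example-portal", "apps": apps},
                      SIGNING_KEY, algorithm="HS256")

def can_access(token, app):
    """What each application checks before granting access."""
    try:
        claims = jwt.decode(token, SIGNING_KEY, algorithms=["HS256"],
                            audience="example-portal")
    except jwt.InvalidTokenError:
        return False               # tampered with, expired or wrong audience
    return app in claims.get("apps", [])

token = issue_token("alice@example.com", ["expenses", "crm"])
print(can_access(token, "expenses"))      # True
print(can_access(token + "x", "crm"))     # False - signature no longer verifies
```

Because access is granted per user and per application from one place, adding or removing a member of staff, a contractor or a partner becomes a single change rather than a hunt through dozens of separate accounts.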
For your employees, an integrated identity and device management solution provides a consistent and secure way for users to access their cloud-based apps across the devices they use. In this way, employees, no matter where they’re based, can be quickly and easily granted access to the services they need while maintaining the right level of security. IT leaders can easily assign apps to contractors, freelancers, and business partners, enabling them to enjoy the benefits of cloud collaboration while still benefitting from centralised management. This ensures secure access across devices, locations, and applications. That said, the advantages of identity management aren’t reserved for third-party apps alone. When an identity management service is accessible via APIs, developers and technologists get the ability to deliver seamless digital experiences that are also secure across their partner and customer-facing applications and websites. To illustrate, Adobe undertook several major cloud initiatives a few years ago. The company also rolled out hundreds of cloud apps including Microsoft Office 365 internally to its tens of thousands of employees, at one point supporting 300 applications and effectively moving critical email, calendaring and SharePoint tools to the cloud. Around the same time, the company also moved all of its Creative Suite applications to the cloud. This enabled users to access, download and install every Adobe Creative Suite application via a Creative Cloud membership – not only changing Adobe’s business cycle but its customer identity needs. Adobe now uses an identity management solution to ensure both customers and internal users have secure access to the apps they need. Their enterprise customers seamlessly and securely connect to its Adobe Cloud offering using existing corporate credentials. The company also uses identity management to support the third-party cloud apps used by its staff, meaning access can be managed safely and efficiently while providing a seamless experience for the end user. The ‘cloud paper trail’ has been another benefit for organisations investing in the cloud. By utilising cloud apps, organisations can better visualise who is using what app or website and when. Organisations can now tap into what services are being used most frequently, understand which apps are not being used, detect usage based on location and prevent nefarious logins. On a more granular level, this can help enterprises discontinue particular services to save on business costs. As businesses expand, physically or digitally, the cloud can enable global collaboration for internal groups and deeper connections with customers and partners. Through cloud identity, all members of a business ecosystem, from employees to the end-user, can enjoy an improved user experience. As the cloud continues to innovate, so will its importance as a foundation for business growth.   ### SMEs Unprepared to Recover from an ‘Inevitable’ Cyber-Attack While the threat of cybercrime is at the forefront of SME owners’ minds, ‘cyber recovery’ is not, according to a new study, The Business of Cyber Recovery, by PolicyBee. Five hundred UK SMEs were asked about their preparedness for cybercrime and its aftermath: one in three believe that a cyber-attack on their business is a matter of ‘when’ not ‘if’, and a quarter believe an attack is ‘likely’. However: 74% have not put any budget aside to deal with the aftermath. 43% will react if and when a cyber-attack happens and have absolutely no plans in place. 
Just 14% of all SMEs have a detailed plan which covers all bases and crucially have tested that plan. Sarah Adams, cyber insurance expert, who commissioned the study for PolicyBee, said: “Large corporates will all have a ‘what if’ plan in place that has been stress tested via a crisis simulation or role play exercise. They will know exactly what to do in the event of a cyber-attack. However, small businesses seem to be chancing their luck and despite expecting to be hacked, aren’t preparing to be prepared. “The difference between a large and small company is that at least in the short term, no single individual will lose their income in big business - but in a small business, their day to day livelihood could be altered dramatically within a scarily short space of time.” Businesses in denial Younger respondents seem more aware of potential cyber risks - as business owners get older they think a cyber-attack is less likely: 22% of 18-34-year-olds think a cyber-attack is unlikely; 41% of 35-54-year-olds and 56% of 55+-year-olds. Business in the South West and East of England are most in denial of a cyber-attack - those in London and the NE are the most switched on. Similarly sole traders believe they are least at risk from a cyber-attack: 71% say it is unlikely; 32% of businesses with 10-49 employees and one in five of businesses with 50-249 employees. Adams continued: “More mature sole traders in the South West and East Anglia seem to be in the most potentially vulnerable group. If you are one of these people, it would be well worth looking at your business’s potential to become the next cyber victim, and how you’d continue to operate afterwards.” [easy-tweet tweet="Only 24% of IT businesses say an attack is unlikely " hashtags="Security, CyberAttack"] IT and management consultant firms more switched on to cyber recovery Interestingly, SMEs operating in the IT and management consultancy sectors had a much more realistic attitude to cyber-attacks: only 24% of IT businesses say an attack is unlikely (48% say likely) 16% of Management Consultants say an attack in unlikely (51% say likely) According to PolicyBee, who provides cyber insurance and other business insurance to freelancers and small businesses, the study highlights the fact that SMEs are simply too busy running their day-to-day operations. Adams concluded: “It’s not the usual case that all SME owner-managers are burying their heads in the sand, as the study shows some awareness of the possibility of an attack amongst some groups. It’s more that these busy owner-managers haven’t prioritised any time to deal with the aftermath of an attack. We’re all familiar with the terms cybercrime; cyber-attack; and hackers; but we need to make ‘cyber recovery’ part of the general discussion now too.” ### NEC Joins Hyperledger to Advance Blockchain Technology NEC Corporation announced that it had joined Hyperledger as a Premier member, emphasising NEC's commitment to leveraging its innovative research to both advance blockchain technology as well as its global adoption. Hyperledger is an open source collaborative effort created to advance blockchain technologies and services around the globe by addressing important features for a cross-industry open standard for distributed ledgers. Recent years have seen an increasing focus on devising novel secure and scalable blockchain platforms that address the needs of many industrial use cases. NEC has been leading academic research on blockchain security since the introduction of the Bitcoin system. 
In December 2016, NEC Laboratories Europe released, together with Aalto University, a novel blockchain consensus technology based on Byzantine fault tolerant protocols. This technology builds on hardware-based trusted execution environments and combines their use with lightweight secret sharing. NEC's technology can achieve a throughput over 100,000 transactions per second - making it the world's fastest and most scalable Byzantine fault tolerant protocols to date. [easy-tweet tweet="NEC is promoting the transformative benefits of blockchain technology" hashtags="Blockchain, Technology"] "NEC's global reach will be a huge asset, as we continue to grow the commercial ecosystem for Hyperledger technologies," said Brian Behlendorf, Hyperledger Executive Director. "We welcome NEC and look forward to collaborating with them to help develop an open, distributed ledger technology that makes business processes transparent and solves interoperability across many industries." "NEC is promoting the transformative benefits of blockchain technology and advancing its adoption by contributing our expertise and knowledge to the Hyperledger community," said Dr Juergen Quittek, Vice President at NEC Laboratories Europe. ### Does GDPR Make Shadow IT More Frightening? With GDPR set to go into effect early next year, many companies are working hard to get into compliance. But how is shadow IT, where employees use non-approved applications and services to get their work done without consulting IT or management, likely to cause issues for your company? Thomas Owen, Head of Security at Memset explores. Shadow IT is the deployment and use of IT systems within an organisation (and often outside of its boundaries) in contravention of IT policy and without the knowledge of teams responsible for corporate IT.  ‘Shadow IT’ has been an industry buzzword for a few years now, growing out of the deperimeterisation of the corporate network and the acceptance of Cloud systems by early adopters within an organisation.  Shadow IT has, quite rightly, been blamed for numerous breaches of security or compliance and is often the bane of many CIOs’ morning to-do lists.  Organisations that lose control of their IT and critical data can find themselves at increased risk from attackers from any point on the threat actor triangle and be facing increasing fines from regulators. So how is the EU's General Data Protection Regulation, set to go live May 25, 2018, set to change this risk landscape?  GDPR Article 25 makes reference to the ‘state of the art’ regarding technical security controls and mandates a 72-hour data breach notification requirement that will likely see organisations elevated to the regulator’s notice for even minor or suspected incidents.  The greatest fines, up to 20 million euros or 4% of global turnover will likely be reserved for those organisations that either fail to notify the regulator or commit egregious failings in technical security and procedural compliance.  Shadow IT, where employees could place critical personal data onto poorly protected, unmanaged machines or services outside of the view of the organisation’s audit and compliance processes are perfect candidates for a large breach, notified late, from highly non-compliant services. 
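For a rough sense of that exposure, the upper fine tier is, in practice, whichever of the two figures quoted above is higher; a quick illustrative calculation (the turnover numbers are made up):

```python
# Quick illustration of the upper GDPR fine tier cited above:
# up to EUR 20 million or 4% of global annual turnover,
# whichever is greater. Turnover figures are made up.
def gdpr_fine_ceiling(global_turnover_eur):
    return max(20_000_000, 0.04 * global_turnover_eur)

for turnover in (100_000_000, 2_000_000_000):
    print(f"turnover EUR {turnover:>13,} -> ceiling EUR {gdpr_fine_ceiling(turnover):,.0f}")
# turnover EUR   100,000,000 -> ceiling EUR 20,000,000
# turnover EUR 2,000,000,000 -> ceiling EUR 80,000,000
```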
[easy-tweet tweet="Shadow IT is characterised as the result of developers...personnel circumventing corporate policy" hashtags="Shadow IT, GDPR"] According to the Data Breach Investigations Report (DBIR) put together by Verizon Enterprise, 25% of the breaches they assessed globally involved an internal actor and ‘most of the incidents are still taking months and years to discover.’ If GDPR is breached due to the use of unmanaged and non-compliant shadow IT, some spectacular fines and PR damage is likely to be coming your way. Regardless of disciplinary outcomes internally, your business will still have taken a body-blow. Shadow IT is often characterised as the result of developers or operations personnel circumventing corporate policy, but it can more positively be described as ‘well-intentioned employees attempting to support business objectives despite poorly managed compliance processes.’ If salespeople are incentivized to hit their targets, operations to ensure a given system is performant and available or developers to increase the rate of release of code, they will naturally seek the tooling that best supports their personal view of what needs to be done. Shadow IT, particularly when already well established in an organisation, isn’t always attributable to a single person. Where the CFO himself uses a critical pricing spreadsheet that, unbeknown to IT, is hosted on a white box server in an arbitrary cupboard because the approved tools are out of date or difficult to use, doubling down on offenders or bringing out the Compliance Big Stick isn’t going to serve regulatory alignment, users or the business itself. Much better it to co-opt the most likely offenders, those technically savvy users who most aggressively seek out tools to support their specific goals, into a community within the origination that drives improvement not only in IT systems and tools but in the governance, compliance and risk awareness of the business.  GRC fails every time where it does not intimately involve the most vocal users and early adopters and results in a detriment to business.  Compliance to GDPR is no different, just with a bigger negative outcome at the end of a painful, public breach. Target the causes of non-compliance, not the symptom, shadow IT stemming from non-business aligned policy and weak internal tooling. While IT departments can restructure systems, tighten security, and re-tool permissions, they have more limited control over end users who ‘go rogue’ and set up their cloud-based services and software solutions. If we accept that our employees are informed and well intentioned, shadow IT suddenly gets reframed as users that are attempting to execute on business objectives in the face of policy and internal IT that doesn’t support the mission. That’s a wider problem than policy compliance alone. Obviously, if you’ve got a true motivated malicious insider on your hands, you’re highly likely to be included in next year’s breach statistics, whatever you do. Inventory and systems discovery – Perform automated scans of less well-known network areas. Sit down with users from across the business and hierarchies to discover what tools they use. Are these known? Cloud based? Used by others? What kind of data to they contain or require? This can then inform the basis for a truly comprehensive Privacy Impact Assessment, as well as giving the CIO grist for his budget planning mill. 
Educate users, particularly those likely to offend - While it may seem obvious, companies should outline in their policy regarding the use and creation of new tools and systems. Educate everyone through effective training (not another powerpoint in email) and engage specifically with those users likely to offend.  What pain points are they experiencing? Where does policy or existing tooling get in the way of their daily lives? Where their approach is just plain wrong, explain, and where their pain is justified, you have just discovered an ideal stakeholder to assist with correcting your internal issues. Control access to PII – If your central database or store of personal customer data controls the ability to access and export customer data in a granular fashion, it’s likely that the number of individuals able to deploy particularly damaging shadow IT will be limited, and a limited pool is easier to manage.  Monolithic access, where a user can either get ‘root’ or all permissions or none at all are breeding grounds for serious abuse and security incidents.  Not to mention that appropriate access control is a fundamental part of both good security practice ### Worldline Secures Mobile Ticketing Contract with Govia Thameslink Railway New mobile ticket system signals richer functionality and cost savings for UK’s largest rail franchise Worldline [Euronext: WLN], European leader in the payment and transactional services industry, has inked a deal with the UK’s largest rail franchise, Govia Thameslink Railway, to replace the operator’s current desktop Ticketing Issuing Services (TIS) with the Worldline Mobile Ticketing Service, @Station. The @Station system will provide significant advantages of a Mobile point-of-sale system, including the ability to ‘queue bust’ as well as cross and upsell both Rail Settlement Plan and non-RSP products to customers. It will also ensure that GTR is Direct Acquiring (DA) compliant. The upgraded systems are bolstering the technological capabilities across the GTR network at a time of modernisation for the UK’s busiest rail operator. The organisation is halfway through a transformation plan set to enhance the rail experience for passengers, while continuing to operate its growing interconnected services, and adopting leading technology for issuing tickets is an important part of this process, with the key smartcard a recent addition, designed to boost the customer experience. [easy-tweet tweet="Implementation of the new mobile system is set to complete by the end of 2017" hashtags="Mobile, Data"] The contract builds on an existing relationship between GTR, its Go-Ahead Group parent company, and Worldline. GTR already uses Worldline onboard ticket issuing systems, and Go-Ahead Group recently procured 585 mobile onboard TIS units from Worldline. Implementation of the new mobile system will begin in the fourth quarter and is set to complete by the end of 2017, with GTR to benefit from cost savings from the commencement of installation. The @Station service shares the same back office architecture as Worldline onboard TIS, so all mobile ticket sales data will be captured through one system. Using Worldline’s mobile and onboard systems in tandem also delivers operational synergies and efficiencies. Staff should quickly become familiar with the user interface, lowering training and support costs, and enabling flexibility in staff deployment. 
### Groundbreaking Computer Software to Improve UK Resilience to Extreme Events A powerful computer system capable of revolutionising the UK’s ability to plan for extreme events, such as flooding and power outages, is being designed as part of a government scheme to make the nation’s infrastructure more resilient. It is estimated that inadequate infrastructure costs the UK £2million a day, and extreme events can cost hundreds of millions more. To make the nation’s vital systems more adaptable to such changing circumstances, the UK Government has invested £138million in a project called UKCRIC (UK Collaboratorium for Research in Infrastructure & Cities),  which will see state-of-the-art new facilities established at 11 universities. These systems provide essential services such as energy, transport, digital communications, water supply, flood protection, and wastewater and solid waste collection, treatment and disposal. As part of UKCRIC, £8million has been invested in the Data and Analytics Facility for National Infrastructure (DAFNI). DAFNI is being designed and developed over the next four years by the Science and Technology Facilities Council (STFC) at its Rutherford Appleton Laboratory in Didcot, Oxfordshire, and will provide access to massive secure data storage, fast computer performance and the next generation in systems visualisation. It will provide new software to enable significant advances in infrastructure systems research creating unprecedented opportunities to transform infrastructure services and pave the way for a more sustainable future. Professor Jim Hall from Oxford University, the principal investigator for the project, said: “DAFNI will put the UK in a unique position to analyse the infrastructure systems upon which we all depend, helping to improve performance and pinpoint vulnerabilities. The future of our economy, society and environment depends on the right choices being made for energy and transport systems, digital communications, water supply and flood risk management. DAFNI will provide researchers and decision makers with unique capabilities to analyse system performance and make wise investments.   “It's great to work with STFC and our other university partners on this project, which is pushing the boundaries of the nation’s current large-scale computing capabilities. “Over the coming years, we are going to be bringing together business, government bodies and research organisations to collaboratively deliver this unique national capability. “I am delighted to be leading a team of remarkable minds on this ground-breaking programme. It is a really exciting time for computational science, and I’m excited to see just how much we can achieve over the next four years.” Funded by the Engineering and Physical Sciences Research Council, which is providing £125million of support for UKCRIC, DAFNI will offer globally unique software which will allow researchers to study complex infrastructure systems in cities, such as sewage systems or transportation networks. [easy-tweet tweet="SCD brings in a wealth of experience and expertise in delivering large-scale research infrastructure" hashtags="Data, ComputerSoftware"] Dr Erica Yang, Head of Visual Analytics and Imaging Systems at STFC’s Scientific Computing Department (SCD) and STFC DAFNI project director, said: “SCD brings in a wealth of experience and expertise in delivering large-scale research infrastructure for the major UK and international collaborations. 
“By locating DAFNI at STFC’s Rutherford Appleton Laboratory and making massive value-added data and compute resources available, the UKCRIC community will benefit from and exploit the well-established infrastructure development expertise of SCD. Working with the highly interdisciplinary community, the Department will become the nation’s data intensive science hub for infrastructure systems’ research, underpinning the sustainable growth and development of a globally unique, large-scale, data-intensive analytics facility driven by the needs of leading UK academics and industrialists in the field.” Work on DAFNI began today at the four-year project’s official launch as stakeholders were invited to a joint DAFNI-Innovate UK organised launch event at the Future Cities Catapult in London, as part of the consultation process. This is a critical step in the project, as it will help to decide how DAFNI is designed and set up. At the event, the project team discussed the benefits that DAFNI’s advanced capabilities will offer to the UK innovation and research community and began a year-long consultation with users from research, business and policy on the best way to design and deliver DAFNI so it can deliver practical applications which best address the challenges faced by society. The event also introduced an initial set of pilot projects which will be jointly developed by DAFNI university partners and STFC’s SCD to kick start the development process. Peter Oliver, a project sponsor within SCD, said: “DAFNI is a strategically important project for SCD and is an evolution of systems we’ve designed and built for climate and earth system sciences (JASMIN), Particle Physics (GridPP Tier1) and our activities in supporting STFC’s Facilities and Diamond Light Source. I’m proud to be the project sponsor for DAFNI in SCD and I’m excited by the opportunity to work with the UKCRIC community.” ### Affordable Remote Monitoring, the Key to Unlocking Farming Inefficiencies? Precision farming is on the rise. The process involves the constant monitoring of the state of crops in order to provide the perfect growing conditions. It is being used to improve productivity and the sustainability of farms but has traditionally been expensive to implement. As a result, current applications are unlikely to see widespread use - particularly considering 78% of the world's produce is grown in developing countries, according to the Brookings Institution. Despite high demand for farmed produce, many of those employed in agriculture in much of the developing world are still relatively poor. Just 12% of their crops are high value ‘cash crops’ suitable for export. Implementing new technology-driven farming techniques is therefore restricted to a small section of the agriculture industry. The World Bank identifies agricultural development as a key means of poverty reduction. Technology, if affordable, would therefore make a particularly positive impact. In Africa, we’ve already seen the emergence of farming apps such as EZ Farm, which uses predictive analytics to advise farmers on useful things such as moisture levels in the soil. However, by inputting real sensor-driven data on weather, soil types, the slope of the land, moisture, heat, chemicals and other conditions, water, fertiliser and pesticides could be applied in more precise quantities, specific to each crop's needs, to increase productivity. Groundbreaking firms such as Arable are already leading sensor innovation, developing new means to provide insights into crop health.
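As a toy illustration of what applying water in "precise quantities specific to each crop's needs" can mean in practice, the sketch below turns per-zone soil-moisture readings into an irrigation amount. The crops, targets, readings and scaling are all invented for illustration.

```python
# Toy sketch of the precision-farming idea above: apply water only
# where measured soil moisture falls below a crop-specific target.
# Crops, targets, readings and the scaling factor are all invented.
MOISTURE_TARGET = {"maize": 0.30, "tomato": 0.40}   # volumetric water content

def irrigation_mm(crop, zone_readings):
    """Millimetres of water to apply in each sensor zone."""
    target = MOISTURE_TARGET[crop]
    return [round(max(0.0, (target - reading) * 100), 1) for reading in zone_readings]

print(irrigation_mm("maize", [0.22, 0.31, 0.27]))   # -> [8.0, 0.0, 3.0]
```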
The core of the cost and challenge, however, comes most often in the transmission. Farming by nature is conducted over enormous areas of land, typically with little access to mobile data. Early adopters have therefore often been forced to set up their own networks across considerable areas of land - at a very high setup cost. As a result, to date, those without available investment have been unable to benefit from a connected farming environment. In the areas where networks exist, costs can still be particularly high, and issues around connectivity and signal strength will persist. The hype around Industrial Internet of Things solutions from mobile network operators running on 4G (and later 5G) mobile networks has encouraged some level of uptake. However, these systems are limited in their capabilities in that they are reliant on network efficiencies, and costs can also be very high. We’ve all been without a data connection when we needed one, or in areas where traffic is too high to connect. Service issues are almost accepted in the consumer market, but in an industry as essential as agriculture, existing networks simply aren’t reliable enough to run monitoring equipment on. Furthermore, the cost of running a connected SIM which supports LTE often far exceeds the value of the data. Also, the processing power required to transmit the data is high and can take up unnecessary amounts of space. It is clear that a more efficient and affordable means to introduce remote monitoring into farming could make a clear difference to farmers globally. In my view, the ideal technology is a relatively little-known one - Unstructured Supplementary Service Data (USSD). Effectively ‘the internet without the internet’, and a feature of all cellular networks from 2G to LTE, USSD provides a means to transmit information in regions where there is little to no mobile data coverage available. Implemented correctly in the agriculture industry, it could provide some immediate cost savings and enable remote monitoring at an affordable price point for more farmers. USSD requires far less signal strength than mobile data, meaning less power demand and allowing devices to last longer in the field. For agriculture, this makes sensors simple to install in and around crops. There is no need for microprocessors and components to communicate the data - in turn reducing manufacturing costs. USSD does depend on GSM networks; however, with only 20% of the globe's population lacking access to basic mobile services, according to the GSMA, and with this number continuing to shrink, USSD could help to drive adoption of more automated farming in multiple regions. Making more efficient and affordable technology available to improve agricultural processes would provide the essential catalyst to unlocking farming inefficiencies globally. There is a lot of hype around agri-tech, but however clever the predictive analytics application in the cloud, the data needs to come from the field. Given most fields are in areas with poor data coverage but are almost always within GSM reach, USSD seems like the perfect partner to advancement. The technology is available now and does not require support from mobile networks or governments to implement. The ball is in the court of the agriculture industry; time will tell how they choose to utilise these available solutions. 
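To give a feel for how little actually needs to be transmitted, here is a minimal Python sketch of packing a handful of field readings into a short text payload that would fit comfortably within a single USSD message. The node name, the reading fields and the 182-character limit used here are illustrative assumptions for the sketch, not a specification from any particular network operator or device vendor.

```python
# Illustrative only: pack a few sensor readings into a compact string small
# enough to fit in one USSD message (assumed here to be 182 GSM 7-bit
# characters; check the limit your network actually enforces).

USSD_MAX_CHARS = 182  # assumption for this sketch

def pack_readings(node_id, readings):
    """Encode readings like {'sm': 23.4} as 'node7;bv=3.7;sm=23.4;st=19.1'."""
    fields = ";".join(f"{key}={value:.1f}" for key, value in sorted(readings.items()))
    payload = f"{node_id};{fields}"
    if len(payload) > USSD_MAX_CHARS:
        raise ValueError("payload too long for a single USSD message")
    return payload

# Hypothetical field-station readings: soil moisture (%), soil temperature (C), battery (V)
print(pack_readings("node7", {"sm": 23.4, "st": 19.1, "bv": 3.7}))
```

A payload this small leaves plenty of headroom even if more sensors are added, which is exactly the kind of traffic profile USSD suits.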
### NEC Receives Frost & Sullivan's 2017 Asia Pacific Integrated Biometrics Solutions for Public Safety Leadership Award

NEC Corporation today announced that it has been conferred the 2017 Frost & Sullivan Asia Pacific Integrated Biometrics Solutions for Public Safety Leadership Award. The award recognises NEC for its technical leadership and growth performance in 2016 in the Asia Pacific biometrics solutions segment. Urban planners today are faced with greater challenges than ever before. While the digitisation of the environment and business landscape brings enormous benefits, it also exposes agencies, corporations and citizens to unprecedented risks. NEC addresses these concerns with a wide and established product portfolio to support the authorities in public safety. Its solutions have been widely deployed in security checkpoints, police checks and border controls in many countries across the Asia Pacific region. Its ongoing development and advancement in biometrics has evolved to include facial recognition technology to bring security to the next level. At the same time, supplementing biometrics solutions with video analytics offers far more information from which to derive meaningful insights that have not been possible before. "NEC is a pioneer in multimodal biometrics authentication. Its advanced facial recognition solution employs one of the most cutting-edge technologies in the world. With no direct interaction with people, facial recognition is the least invasive method of assessing individuals. In particular, NEC's NeoFace Watch solution is instrumental in identifying suspects at border controls and in public places who have been registered by the authorities as being involved in suspicious activities," noted Ajay Sunder, Vice President, ICT, Asia Pacific, Frost & Sullivan. "The award is recognition for our proven solutions and validates our expertise in the public safety sector. NEC is committed to providing best-in-class technology to law enforcement, border security and governments globally to create better cities and keep communities safe," said Tan Boon Chin, Managing Director, Global Safety Division, NEC Corporation. As one of the pioneers of cutting-edge biometric technology, NEC has been involved in the development of facial recognition technology for more than 30 years. NEC's NeoFace, part of NEC's suite of AI technologies, is a facial recognition engine boasting the world's highest authentication accuracy for real-time investigations. NEC's facial recognition technology has been introduced in more than 40 countries globally, and its video facial recognition technology has been ranked first four consecutive times in benchmark testing by the U.S. National Institute of Standards and Technology (NIST). http://www.nec.com/en/press/201703/global_20170316_01.html

### 5 Tips for Managing Projects In The Cloud

Less maintenance, easier training, lower costs and the ability to access work from anywhere make cloud-based projects a no-brainer for teams that want to work quickly and collaboratively. But there’s more to it than just telling your team to use the software. Check out the following tips to ensure that your projects are a success in the cloud. 
1. Find a Project Management Solution That’s Easy To Use One of the benefits of cloud-based projects is that everyone can chip in and participate. But if you find a solution that requires intense technical know-how, you should think about another one. Find one that focuses on usability. That way, no matter the skill level or technical savvy of your team, everyone can participate. You’ve got to find the solution that’s right for your team if you expect them to use it and adopt it. Cloud-based projects really only launch well when everyone is contributing and updating tasks, adding files and commenting where needed. Teams can quickly see priorities, tasks and needs for the projects, while also getting a personalised to-do list to work from. All comments and files can be arranged by task or project. This keeps the reporting accurate and up to date. There is still a role for the project manager to assess budgets and time, but it is up to the whole team to sign in and update their work. 2. Monitor Your Project Progress Project management software is essential to tracking tasks, as a manager can easily check on the status of an assignment at any time. With project management software, you’ll get a big-picture view of all open activities and the ability to look down the line to anticipate problems. You can break the job down and look ahead with Gantt charts and other visualisation tools to see which tasks are on track or falling behind. What’s great about cloud-based projects is that many tools will offer alerts and updates to your team members once a task is finished. Also, any tasks that are late will be highlighted or flagged. Cloud-based projects do a lot of the updating for you, moving projects along and notifying team members without any outside intervention. 3. Stay Secure With a cloud-based app, you can securely share files with those who need to see them. You can even set different permissions and send files only to those individuals. Strong passwords, single sign-on capabilities and the right encryption levels will earn the trust of your upper management. This is especially important if you’re working with external clients. They’ll appreciate knowing that you have checked that the tool is secure for their information as well. 4. Consolidate Your Communication & Get The Right Version Every Time The chief benefit of cloud-based projects is that you now have a reliable and consistent method of organisation. Gone are the days of using your inbox as the location for all project files, comments and updates. You’ll free your inbox from unnecessary emails. Project managers and other team members have one spot to reference for updates or questions. It’s faster to leave updates or list ideas that everyone can access immediately. But for the capabilities of the software to really take off, you’ll need everyone on board with the software and committed to the updates. Because a file is not tied to someone’s hard drive, cloud-based projects have the ability to update files automatically. That way the updated and correct file is immediately accessible. You no longer have to wait on someone to send a file out to the whole team. If a version is lost or missing, you can go back to the previous one and compare any changes. 5. 
Know How You Work And Get Help When You Need It When starting with cloud-based project management software, a good start is key to your team’s success. Everyone has misgivings about how it could go, so starting off on the right foot with help will help it go smoothly. A world-class customer and product support team will help you do it. Even if you move your projects to the cloud, no project management software can change your team’s habits, without real conversation and a commitment to change. The best project management software companies offer support to your team and also to work with you on establishing the right habits. The best features won’t matter if your team doesn’t use them effectively. Partner with a company that provides the “safety net” you and your team need to reach your goals. No team or project should be left on their own to tackle the tough spots of cloud-based projects. ### Is Failing Security Hampering The Adoption Of The Cloud? The reality that all kinds of business data is now stored, managed, accessed and maintained in the cloud is inescapable, but is security still hampering the adoption of true utility cloud-based computing? Nathan Johnston, Solutions Architect at Memset explores. Businesses are relying on cloud services more than ever, as they tend to be more convenient and often cheaper than alternatives. However, as businesses increase their dependence upon cloud services, so too have the malicious actors that follow the data and computing resources to the cloud. There is no escaping the headlines about significant data breaches experienced by large organisations like Yahoo, LogMeIn together with the discoveries of NSA and government breaches heighten concerns of the security risks traversing the internet. Cyber security breaches have become an endemic problem in modern society and business. Cyber security is becoming a growing concern all round, so much so, that now the impact of cyber security attacks are being tracked and assessed at a national level by the UK Government. Such threats are considered to be equal in risk to the UK as international terrorism and global military conflict, according to the National Security Strategy. Ongoing concerns about security issues, such as access to data, geographic location of sensitive data, compliance, and visibility into organisations’ public and hybrid cloud environments, continue to hinder the adoption of the cloud for many businesses. [easy-tweet tweet="Good security practice is now an absolute requirement to have a sustainable business" hashtags="Security, Cloud"] While in the cloud industry, we’re reassured by survey results showing that the majority of businesses who have adopted the cloud have found security is not an issue, with many stating that security has improved, some companies remain sceptical. Good security practice is now an absolute requirement to have a sustainable business and a competitive edge in most national and international markets, but this doesn’t have to be at the cost of efficiency, expense or agility. Regardless of the size and type of your organisation, there are actions you can take. The issue underlying most of this apprehension is a lack of visibility into an organisation’s cloud environment, which signals a loss of control. 
Whilst this is a real concern, it is equally important to remember that the relationship between a cloud service provider (CSP), like Memset, and its customers incorporates a shared responsibility for security; although the CSP maintains strong security and compliance controls across its entire infrastructure platform, the customer is responsible for anything they manipulate on the cloud platform. This is one of the primary challenges of using cloud resources: acknowledging that the services offered by cloud providers establish a shared responsibility between the cloud provider and the cloud user. Both the cloud provider and the user must be aware of system and data security to prevent a security breach. And in the event of a risk being identified, it is illogical to automatically assume the fault lies with the CSP. With many of the major cloud security breaches, reports tend to suggest that the well-known, multi-million-pound service providers are at fault, when in fact the damage is usually the result of hackers managing to obtain the network credentials of a third-party vendor. Equally, focusing on the technology rather than the issue can add to the chaos of a security incident. With the adoption of hybrid cloud solutions becoming the norm, critical information is likely to be siloed in different areas of an organisation’s storage solution. While we’ve come some way in centralising company data that previously had to be continually transferred between departments and companies, a central data repository adds some other security considerations. Hackers want the greatest possible impact from their exploits, so they tend to attack the central repositories of data that are available. Even organisations that take data protection seriously and institute sound policies for implementation can still be caught out, especially if they are growing through the acquisition of smaller entities and incorporating them into their information systems. In the process, they can lose sight of a server on the network that does not have the same level of protection as the other servers on the system, thereby leading to a major compromise. Thus, the movement of data out of the central repository should be monitored at all times. Essentially, if CSPs and customers work closely together with the same goal, their IT infrastructure can help create governance rules and policies accordingly, enabling business users and data scientists to find, understand and trust the data they need to fuel critical insights.

### Pulsant and Corent Join Forces to Offer Comprehensive Cloud Migration Services

Pulsant, a leading UK provider of hybrid cloud solutions, has announced a partnership with cloud migration platform company Corent Technology that sees both parties complementing their existing services. The partnership allows Pulsant and US-based Corent to deliver a more effective and comprehensive cloud service to customers. While Pulsant can effectively determine the readiness of a business looking to migrate to the cloud with Corent’s automation tools, Corent is able to help that customer take its existing application portfolio and move it to the new cloud-based infrastructure. This partnership enables customers of Pulsant and Corent to benefit from a much-improved cloud experience that minimises the number of suppliers involved and takes the pain out of the migration process. 
Adam Eaton, sales director, Pulsant, says: “Pulsant offers a cloud readiness assessment as part of its Consultancy Services offering, and for a while we had been looking for someone to help provide us with a toolkit that would allow our customers to actually move over their applications and systems once that assessment had been completed. This is exactly what we are now able to provide thanks to our partnership with Corent. Its services essentially augment our cloud readiness assessment — providing customers with assistance throughout the entire cloud migration journey. “We’ve seen an increased interest from our customers in ‘cloud-ifying’ their applications, and so I’m confident this will be the beginning of a mutually beneficial, long-term business relationship. We’ve already found Corent to be extremely helpful and responsive, working closely with us to get the ball rolling and assist us as we learn together along the way.” Jeremy Neal, UK manager, Corent Technology, says: “Pulsant is one of the key players within the cloud technology space, and certainly one of the leading Microsoft partners — a company we also work very closely with. Pulsant has been one of our top targets and we’re very pleased this partnership is now in place. “The deal also allows us to help make cloud migration much easier for businesses across all sectors — something we’ve always been passionate about and believe will only become more important as time goes on.”

### Why New Retail Technology Benefits Everyone

The experience of retail shopping is constantly changing. There was a time when customers were content to purchase their items at a brick-and-mortar retail location. Today, the retail consumer wants the ability to view and purchase products on their phone, tablet or computer, as well as in person. They also want products delivered to them when and how they choose. Consumers desire the ability to easily return an item if they’ve made a mistake. The retail transformation taking place is designed to accommodate the changing needs and desires of retail consumers.

Cloud Computing
From Amazon to British clothing brand Asos, cloud computing for retail is fast becoming the norm for any serious retail business. By interacting with customers more quickly than any traditional retail store can, businesses can now deliver the best customer service through rapid communication of data using the cloud. Using the cloud can streamline the supply chain for any business by collecting, analysing and storing data faster than ever. Cloud computing is cost-effective, as you can try out new services or store essential data far more cheaply than with any older data system. Finally, decision-making and communication between departments become streamlined through the cloud, making business run faster and more smoothly than ever before.

Big Data
The use of big data is only going to increase. This approach utilises technology and the information obtained from it. Most retail organisations are finding new and different ways to interpret the big data they have at their disposal. This is often driven by the intense competition within the retail industry. Being able to use big data to its fullest potential has made experienced data analysis professionals highly sought after within the retail industry. 
Robotics
The use of robotics has been common in many areas of retail when it comes to the supply chain, and many major retailers are now increasing their use of it. It's estimated that Amazon utilises over 44,000 robots in its warehouse operations as well as other areas of its logistics and supply chain. Several retail chains are also increasing their use of robotics in self-checkout, as well as in delivery from suppliers to retail outlets, to the consumer and more.

Centricity
This is a unique approach to serving retail customers. It requires those in the retail industry to look beyond their organisation and work with others in the industry - even competitors. This creates a world of retail that is transforming into a collaborative industry where participants are interconnected. Major retailers are developing specialised networks that can provide customers with individually targeted products and services. The goal of centricity is to meet the needs of consumers.

Multichannel Retailing
This is when a retail organisation is able to provide a number of different ways for customers to buy goods and services. It involves providing products and services through traditional venues such as brick-and-mortar stores, as well as catalogue shopping and telephone ordering. It also involves offering products and services through non-traditional channels such as social networks, chats, websites, emails, apps and more. This makes it possible to improve analytics and better understand the desires of consumers, provides increased opportunities to establish a brand with a wider customer base, and gives a company more visibility of its target demographics.

Structure
The retail industry is now structured to invest more technology in the customer experience, both in person and online. Retailers are eliminating things that do not contribute to a positive customer experience, stripping out as many layers as necessary between the responsible employee and the customer. Retailers are also making customer experience part of how employees are rewarded. The walls between departments such as sales, marketing and advertising are gradually coming down and will eventually be eliminated.

Blockchain
Another evolving aspect of the retail transformation is the increased ability to serve consumers, which could involve using blockchain, a shared-ledger technology. It enables companies to accurately manage and identify complicated digital transactions, and provides a secure storage environment for the values of the items involved in those transactions. It enables increased accuracy in tracking domestic as well as international transactions, and will continue to grow with more involvement from the digital economy. A toy sketch of the ledger idea follows at the end of this piece.

Bitcoin
The use of bitcoin to purchase items in the retail industry is increasing. Overstock was the first major retailer to accept bitcoin as a form of payment a few years ago. The main problem is that the value of bitcoin can change quickly and significantly. Retailers are slowly implementing bitcoin as a form of payment, and this is expected to increase in the future. 
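To make the ‘shared ledger’ idea mentioned under Blockchain a little more concrete, here is a toy Python sketch of the underlying structure: each record carries a hash of the previous one, so tampering with an earlier transaction detectably breaks the chain. It is an illustration of the concept only, with made-up retail data, and is not how any production blockchain or retail system is actually built.

```python
import hashlib
import json

def block_hash(block):
    """Hash a block's contents (which include the previous block's hash)."""
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain, transaction):
    """Append a transaction, linking it to the hash of the previous block."""
    previous = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"tx": transaction, "prev": previous})

# Toy retail transactions (illustrative data only)
ledger = []
append_block(ledger, {"sku": "A123", "qty": 2, "price": 9.99})
append_block(ledger, {"sku": "B456", "qty": 1, "price": 24.50})

# Tampering with the first transaction changes its hash, so the second block's
# 'prev' pointer no longer matches and the chain is detectably broken.
ledger[0]["tx"]["price"] = 0.01
print(block_hash(ledger[0]) == ledger[1]["prev"])  # False
```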
### ETELM’S 4GLinked Solution a Success at the ETSI MCPTT Plugtests The first fully integrated multi-technology Mission Critical solution, developed by ETELM, successfully interfaces with other vendors at the ETSI/3GPP MCPTT Plugtests in France ETELM, a leading manufacturer of advanced Mission Critical communications infrastructure, has successfully participated in the first Mission Critical Push-to-Talk (MCPTT) Plugtests, and connected a TETRA base station directly to LTE EPC cores and MCPTT servers. By connecting TETRA with LTE without any gateway, ETELM’s 4GLinked solution offers much better performance and operability as strongly requested by all mission critical users now and in the future. [easy-tweet tweet="MCPTT is the new standard that will drive PMR " hashtags="Hybrid, Communication"] ETELM is a pioneer in the critical communications sector and was the first company to implement a MCPTT solution into a hybrid TETRA/LTE network, thanks to its partnership with Nemergent, a working group that specialises in the development of interoperable emergency communications solutions. During the Plugtests, which were organised by ETSI and the TETRA and Critical Communications Association (TCCA) from 19 to 23 June 2017 in Sophia Antipolis (France), ETELM’s solution successfully interfaced with many other vendors. This created a fully integrated solution in line with ETSI/3GPP standards (3GPP release 13), connecting users from different technologies and giving them the ability to use the same services. “MCPTT is the new standard that will drive PMR and is increasingly becoming the key component for Critical Communications,” said Pierre Minot, President, ETELM. “ETELM has a strong commitment to develop fully standardized solutions and we announced our collaboration with Nemergent on a MCPTT solution for critical communications over TETRA and LTE earlier this year. This marked the first major step forward for PMR accessibility to multiple technologies on a single unified platform. We are pleased that the solution has performed so well at the first MCPTT Plugtests by successfully interconnecting with all the main players in this market.” ETELM’s 4GLinked solution overcomes traditional issues associated with legacy systems and their evolution and association to LTE, by bringing to market a fully interoperable and harmonized mission critical communications solution. “The Plugtests were important to start deploying end-to-end solutions including MCPTT implementations,” Minot concluded. “Our solution fulfils the expectations of all legacy PMR users that are willing to benefit from the broadband services provided with LTE technology.” The first deployments of the solution are currently taking place. For further information, please visit http://www.etelm.fr/. ### Neil's Hard Talk - Neil speaks to Lionel Hackett from WiseCrowd On the pilot episode of 'Neil's Hard Talk' Neil interviews CEO Lionel Hackett from WiseCrowd. Lionel speaks firstly about WiseCrowd and how they are a trusted online platform connecting freelance consultants, specialising in governance risk and compliance with businesses, mainly focussing on financial or fintech businesses that have short-term requirements. Lionel then speaks about the challenges of cyber security information and how threats keep growing on businesses large and small. The conversation changes to GDPR, new regulation coming into force next May. Lionel explains how some businesses are making the right preparations, but some are not even aware. 
This provides an opportunity for WiseCrowd's consultants to offer companies support and relevant advice on how to move forward. WiseCrowd is an early-stage startup that wants to build relationships with its clients, helping them understand how the platform works and learning which features they want. Neil sums WiseCrowd up as ‘Uber for skills in fintech’. For more information on WiseCrowd visit: https://wisecrowd.global/ #ToughTech

### INFINIDAT and M2M Enterprise Announce UK-Wide Partnership

INFINIDAT, a leading provider of enterprise data storage solutions, and M2M Enterprise, an award-winning specialist memory, server and storage distributor, today announced their partnership through a UK-wide distribution agreement. M2M Ltd, founded in 1998 and based over two sites in Bromley, UK, will be INFINIDAT’s first UK distributor. By adding INFINIDAT’s petabyte-scale, enterprise-class data storage solutions to its portfolio, M2M Enterprise completes its top-end storage offering and will enable INFINIDAT to grow its high-end reseller base. Gregory Scorziello, UK country manager for INFINIDAT, said: “We were looking for a distributor with a focused approach, whose key aim was to introduce new, rather than mainstream, technologies to the market and who would treat INFINIDAT’s offering strategically. With its dedicated sales team and marketing resources, M2M Enterprise will not only help us grow our market in the UK, but also introduce the organisation to specialist resellers who see INFINIDAT as a strategic vehicle with which to drive incremental revenue, profit and market share.” INFINIDAT’s flagship enterprise storage array, InfiniBox, best known for its high performance, reliability, availability (99.99999% uptime) and innovative “self-healing” architecture, delivers powerful storage capabilities in one unified solution that can be easily leveraged to address the management and analytical needs of Big Data, enabling access to multiple databases and sources and managing growing volumes of data across virtualised data centres and cloud storage. Ged Mitchell, Managing Director of M2M Ltd, said: “Whilst developing M2M Enterprise, and with the emergence of flash, we wanted to engage with an innovative and disruptive storage array vendor. We see INFINIDAT as a key enterprise player within our portfolio, providing the top-performing, most advanced storage offering.” Working with specialist storage VARs and corporate VARs, M2M Enterprise will be targeting a number of verticals for INFINIDAT, including eGaming, financial services, insurance, MSPs, cloud providers and large Big Data users. The partnership has already borne fruit with the customer win of brightsolid, a successful UK cloud provider based in Aberdeen and Dundee, Scotland. M2M plans to support the ten-plus INFINIDAT specialist resellers through lead generation, focusing on end-user engagement, commercial and technical support and an insured credit line for orders placed. M2M works with tier-one, industry-leading, disruptive vendors such as Samsung Semiconductor, Samsung Electronics, Seagate, Micron, Elastifile and PrimaryIO in the supply of HDD, solid-state and PCIe drives, optical and flash storage products, as well as enterprise software and hardware solutions. 
### WannaCry Attack: Let’s Talk Cloud and Encryption The 150 country wide ransomware attack, known as WannaCry, which took down a number of organisations, including several NHS trusts, flooded the media last month.  The ransomware attack – which involved cyber criminals gaining access to data and then encrypting it with a key known only to them – saw $300 worth of Bitcoin being demanded from each affected user or organisation for the release of their data. With countless headlines written about WannaCry and so many security solutions on the market, it is easy for IT teams to become overwhelmed.  Enterprises are caught between the extremes of buying a one-size-fits-all solution – that doesn’t fit all, to buying specialist solutions and patching them together to try and create the most robust security possible, yet potentially leaving gaps in the protection. [easy-tweet tweet="The cloud can be an effective way to backup data, systems and files..." hashtags="Cloud, Security"] Critical to protection against attacks such as WannaCry is knowing what has been protected and with what solution – in this case, we’re discussing cloud and encryption. The cloud factor The cloud can be an effective way to backup data, systems and files to recover from a ransomware attack.  It’s not quite as simple as this, however.  For example, firstly, cyber criminals often encrypt files in remote repositories as well as physical ones.  Secondly, while operating systems, applications and system files can be restored, data files are much harder to recover unless they’ve escaped the attackers in the first place or the ransom is paid (and even then an enterprise is still at the mercy of the attacker). So, how do you keep files safe from attack?  The versioning and recycle bin features of cloud applications are crucial to this but are something which many file sharing/storage solutions fail to include. Through versioning, every revision of a document is stored, so you can go back and retrieve a previous version of the file before the ransomware attack took place.  With the recycle bin function, no matter who deletes the file – an attacker or a legitimate user – a copy will be kept. Why does encryption matter? One crucial thing the WannaCry attack shows is the power of cryptography, albeit in this situation for ill-intent.  The cryptography is not only applied to hijack data but also to ensure anonymity/verifiability of the Bitcoin transaction in regards to the ransom payment.  The message here is simple; if the perpetrators are using cryptography against you, why wouldn’t you use it to potentially keep attackers out in the first place?  Although it won’t completely guarantee that attackers can’t hold corporate data up for ransom, it will certainly go some way to making this more difficult. Encryption is also important in the aftermath of an attack. In a situation such as WannaCry, the ransomware attack was initially all about control and access to the data.  Once this first phase is over, an attacker could utilise programs installed as part of the attack for a second wave of compromise – for example, data exfiltration to sell on the dark web. Having a robust security infrastructure, with data-centric encryption combined with stringent access controls and strict policy requirements, means every time there is an access attempt to a specific piece of data, policy requirements must be met and access rights cleared before the data is decrypted.  
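As a minimal sketch of what ‘data-centric’ encryption can look like in practice, the snippet below (Python, using the widely available cryptography library) encrypts a document before it ever reaches a cloud store and only releases the plaintext after a policy check passes. The policy function and user names are deliberately simplified stand-ins for whatever access-control and key-management systems an organisation actually runs; this is an illustration of the principle, not the approach of any vendor mentioned in this piece.

```python
# Illustrative sketch: encrypt data before it leaves for cloud storage, and
# require a policy decision before it is ever decrypted again.
from cryptography.fernet import Fernet

key = Fernet.generate_key()            # in reality, held in a key-management service
cipher = Fernet(key)

document = b"Q3 board pack - commercially sensitive"
ciphertext = cipher.encrypt(document)  # this is what gets uploaded to the cloud

def policy_allows(user, action):
    """Stand-in for a real access-control check (role, device, location, etc.)."""
    return user == "finance-director" and action == "read"

def read_document(user):
    if not policy_allows(user, "read"):
        raise PermissionError("access policy not satisfied; data stays encrypted")
    return cipher.decrypt(ciphertext)

print(read_document("finance-director"))   # plaintext comes back
# read_document("compromised-service")     # would raise PermissionError
```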
This ensures any malicious software that has been installed as part of the attack, creating a ‘backdoor’ to the system, actually has little value.  Essentially, an enterprise will have protected itself from the inside out; by encrypting data at the heart of the business. Ultimately, no single solution can keep an enterprise completely safe from a breach and, often, not even a whole host of solutions can keep it truly secure.  As always, the devil is in the detail.  Understand what you’re protecting and how stringent that protection is; this should include data-centric encryption and cloud services which utilise versioning and recycle bin features.  It is also important to encourage best practice security techniques.  Ensuring systems are up-to-date and patched appropriately is an absolute minimum requirement.  Guiding employees on how to avoid the human error element of a successful cyber attack is very important – for example, educating them to treat emails with a security-savvy eye. In the case of ransomware attacks like WannaCry, backup processes must be in place and at a comprehensive standard where strong access controls and collaboration features not only make the system secure but effective in practice.  After all, if the solution doesn’t work well for the business user, it simply won’t be used, putting the data and systems at greater risk of unsanctioned IT use and risk of malicious attack. ### Evolution of the ‘Huddle Room’– an Essential Requirement for All Businesses Over the last five to 10 years, the traditional office has given way to a more transient work environment. The growing popularity of flexible working coupled with a mobile, dispersed workforce has decreased the need for separate offices and desk spaces, with businesses now considering alternatives, including hot-desks in open-office environments. With many offices transitioning to an open floor plan, it’s more important than ever to have designated collaboration spaces free from noise and distractions. As such, the concept of smaller meeting spaces – or ‘huddle’ rooms – was born. Creating areas for small groups of three to five people and virtual teams to meet and work together on the fly is an inexpensive and space-saving way forward. Businesses of all sizes and industries are moving to this model. In fact, Frost & Sullivan estimates that there are currently 30 million huddle rooms worldwide. Moreover, huddle rooms are arguably the most productive rooms in the building as they provide privacy in collaborative environments and require little physical space (up to 10 – 12sqm). What’s Driving the Need for New Office Space Environments? While open-space floor plans are trendy and the desire to create more collaborative, flexible workplaces is strong, cost is essentially an underlying factor driving the increase in open offices. Businesses simply cannot afford to waste office space and employee productivity. With the increase in flexible working, companies are finding they require less square footage per employee in the physical office space. A recent study estimated that flexible working could generate workstation savings of £1.1 billion for the UK economy. This could be a pretty big draw for companies to consider when investing in new workspace environments. In addition to providing a flexible meeting space in open floor plans, the huddle room can introduce many benefits to an organisation including greater culture fluidity and collaboration as people sit closer together. 
Also, huddle rooms are very attractive to millennials, who, by 2020, will make up half the global workforce and expect a collaborative culture.                        Ensuring ROI in Huddle Rooms First and foremost, ensuring a good return on your investment in huddle rooms comes down to space optimisation. This includes the facilities team creating a huddle room to avoid wasted space in the open plan environment and to management building a culture within an organisation to ensure all spaces are optimised for different requirements. [easy-tweet tweet="Many companies are extending the audio and video conferencing technology beyond the boardroom" hashtags="Cloud, Video"] Traditionally, a huddle room would have just been equipped with two to four seats and a telephone, to be used for a one-to-one meeting or a private phone call. However, many companies are extending the audio and video conferencing technology beyond the boardroom and executive suite to smaller meeting spaces. By ensuring the technical functionality and capability is the same across all rooms, they can be used for all types of meetings, from internal brainstorms to senior management strategy sessions to customer facing collaborations, keeping the utilisation as high as possible in every meeting space. Video conferencing has emerged as an essential tool for many companies’ communication and collaboration strategies. The cost, ease of use, flexibility and scalability of video conferencing opens the door to a wide range of new use models that meet the needs of today’s dynamic work environments. However, according to the same report by Frost & Sullivan, less than 5 percent of the 30 million huddle rooms worldwide are video-enabled. This could be due to several factors, including high acquisition and operating costs of the technology solutions, as well as education – many companies are simply not aware of what is possible. Businesses are willing to pay for an increase in video conferencing users but are unlikely to invest in more IT resources to support the solution. A SaaS model can help break that barrier and make it easier to deploy and manage operating costs. A cloud-based video conferencing solution dramatically decreases the time to deploy and manage the solution and also decreases the cost for both IT departments and end users. An all-in-one cloud collaboration solution with the audio web, video conferencing and group chat from a single, trusted provider gives users an easy-to-use, consistent communication experience across platforms and devices to achieve maximum productivity and collaboration, despite their location. For the IT administrator, a fully integrated system, including the actual hardware device, software, service and support means less time spent on deployment, training and management, which translates into a lower total cost of ownership. ### UK Sigfox network operator commits to providing 95% UK population coverage WND deploys UK’s first nationwide Internet of Things network WND-UK, the UK’s primary Sigfox Network Operator, is ramping up the deployment of Britain’s first dedicated Internet of Things (IoT) network, as it commits to providing 95% of the UK population coverage by 2019. As the promise of the IoT becomes a reality, service providers and manufacturers are looking for reliable and affordable network solutions to power the billions of low-powered devices expected to come online over the next few years. 
Traditional cellular networks, such as 4G LTE are too expensive and resource-hungry for many low-cost applications, while other technologies do not offer the range needed to implement a nationwide network. [easy-tweet tweet="In a world of IoT, tomorrow’s smart devices will need to be wirelessly multilingual" hashtags="IoT, Data"] Sigfox is the world’s first dedicated low-power wide-area communications service for the Internet of Things. Harnessing ultra-narrow band technology, Sigfox provides basic connectivity to devices that do not require high throughput. This unique approach is ideally suited to the vast majority of IoT devices as it requires very little power – enabling devices to run for years on a single battery. Sigfox also enables the use of low cost devices and cost-effective network usage. “In a world of IoT, tomorrow’s smart devices will need to be wirelessly multilingual,” said Neal Forse, chief executive, WND-UK. “In other words, they will need to be able to communicate across different kinds of networks and choose the most appropriate technology for specific tasks so that performance and battery life are optimised. “Sub-gigahertz networks, such as Sigfox, require far less power and provide much longer battery life for the many devices that only require intermittent internet connectivity and the transmission of small amounts of data.” The operator has adopted a dynamic rollout strategy, prioritising deployments to meet customer demand. Since the launch of its UK operation in March 2017, the operator has installed 50 base stations, already providing coverage to nearly a third of the UK’s population. WND-UK has put partnerships at the centre of its business model and already has a strong ecosystem of channel partners looking to utilise its IoT network, including flood detection specialist LeakSafe and IoT solutions providers such as Real Time Logistics Solutions (RTLS), SPICA Technologies, Amphy and Low Power Radio Solutions (LPRS). View from the channel Real Time Logistics Solutions Ltd (RTLS) was formed in 2001 as a technology company with emphasis upon asset management, smart sensor development, reusable core technology, OEE monitoring and IoT infrastructure. Managing Director Peter Milton, said: “Our mission is to meet future industry requirements by use of innovative technology and creative partnerships. We’re now working closely with WND-UK as a channel partner to assist in the provision of the Sigfox national network. The sensor technology available via WND will significantly improve the cost basis for implementing IoT systems and provide the necessary spring board to wider international coverage and business opportunities.” Spica Technologies, a specialist systems integrator and IoT solutions provider working in the UK’s facilities management (FM) sector relies on WND’s Sigfox network to connect a range of sensor-driven solutions. These include its FM ‘smart cleaning’ platform and a unique water monitoring solution, developed to address the risks associated with Legionella in healthcare and public sector buildings. Co-founder of Spica Technologies, Tim Streather, commented: “Working with WND-UK is a huge plus for us because its approach to building out a Sigfox network that extends throughout the UK is very different to those attempts made by previous network operators. 
WND’s dynamic rollout strategy ensures that Sigfox base stations can be deployed at customer sites anywhere in the UK and be fully operational within weeks.” Forse concluded: “The applications our channel partners are working on today demonstrate how IoT technology has the potential to completely transform business models. “Sigfox is the lowest cost, lowest power technology for IoT applications, and we’re already well underway with delivery of a robust and cost-effective Sigfox network.” ### Logical Glue Secures Investment to Develop Machine Learning White Box Insights  Logical Glue expands its machine learning platform, providing powerful, intuitive data driven insights, enabling better decision making, reducing risk and increasing profit for Lenders Logical Glue, the cloud-based software company that helps the financial services and insurance industry reduce risk and increase profits with insightful and faster data-driven decisions, has closed a major private investment from UK entrepreneur Tom Singh. The investment will allow Logical Glue to expand its cloud-based machine learning and statistical modelling platform, further developing its patented “white box” decision-making engine and data visualisation tools. Logical Glue is a trusted partner to financial service companies large and small, from lending to insurance. It helps businesses reduce risk, increase profits and deliver a better customer experience, with fast time to value. Logical Glue is helping Lenders increase acceptance rates by 40 per cent, increase profits by typically 5-20 per cent, decrease default rates by 15 per cent and increase recovery collection rates for debt collectors by 18 per cent. In addition, it has reduced manual interventions for underwriters twenty-fold. [easy-tweet tweet="Logical Glue is delivering the Machine Learning platform of the future" hashtags="MachineLearning, Data"] Tom Singh OBE, the founder of fashion retailer New Look and a leading UK entrepreneur, has invested in Logical Glue. Tom says: “Logical Glue provides a straightforward and accessible Machine Learning platform for Lenders and insurance providers, delivering faster and more accurate decisions that will underpin increased productivity and profitability. Logical Glue really bridges the gap between data science and the boardroom. Fast, accurate and automated decisions based on data have a place within many industries, from retail through to finance.” Logical Glue’s Machine Learning platform is becoming the go-to platform for data scientists and business users alike. For data scientists, the platform is an intuitive solution that dramatically cuts down the time to build highly accurate predictive models. It also provides essential tools to rapidly hypothesise and test modelling sensitivities. For business users, it is the graphical interface, providing straightforward visualisation of complex data relationships. It allows a deeper understanding of business outcomes and the classification of data between important influencers and noise that has little impact on outcomes. Logical Glue’s platform is the first code-free platform that can build three different types of predictive models for making better decisions: neural networks, logistic regression and fuzzy logic models. In a few minutes, advanced solutions can be created which enhance insight, accuracy and flexibility on existing predictive models. 
Models can easily be integrated into existing platforms and data workflows using Logical Glue’s API, allowing models to be deployed to production far quicker than the norm. Daniel McPherson, co-founder at Logical Glue, comments: “Logical Glue is delivering the Machine Learning platform of the future – a platform that will make a significant difference to financial services organisations and beyond by delivering better, faster decisions and providing the consumer with the best customer experience.”

### A Tipping Point for Cloud - Areas to Help Enterprises on their Cloud Journey

By 2020, nine out of 10 organisations will utilise hybrid cloud infrastructure, according to Gartner. With cloud spending in this area rising faster than ever before, and data centre outsourcing on the decline, 2017 is set to be a tipping point for IaaS adoption. Oracle’s James Stanbridge looks at why this is, and suggests some key areas that enterprises considering cloud should focus on to help them on their cloud journey.

Catalyst for cloud
What is the catalyst for this increasing focus? While the International Monetary Fund predicts strong near-term growth prospects for the region, the medium- to long-term outlook is less sunny. Global growth remains sluggish, and macroeconomic events such as China’s slowdown are contributing to economic uncertainty. Set against this, businesses know they need to modernise and innovate to stay competitive, and are often trying to do so with reduced or static IT budgets. Cloud, with all its benefits, offers a solution.

Early stages
Some enterprises have already recognised the advantages cloud economics offers and started to move their workloads, but we are only in the early stages. Why? Some businesses have been concerned that moving to a cloud infrastructure model would result in performance inconsistency and that the public cloud wouldn’t be as reliable as working with an on-premises system. Data sovereignty laws have also been a major concern, particularly for mission-critical systems and for organisations operating in more heavily regulated sectors such as government, education, healthcare and financial services, in which customer privacy and industry compliance are paramount.

New arrivals
Now, numerous suppliers have released the next generation of enterprise infrastructure cloud solutions, designed to answer the demanding performance, reliability and security needs of businesses. These range from basic but high-performance compute, storage and networking capabilities, to bare metal cloud services that combine the elasticity and utility of public cloud with the granular control, security and predictability of on-premises infrastructure to deliver high-performance, highly available and cost-effective infrastructure services. In addition, tools to help with broad-based virtual machine migration and management across hybrid environments are arriving on the scene, ensuring a more seamless experience. In simple terms: a cloud that performs as well as, if not significantly better than, dedicated on-premises hardware.

First steps
As enterprises begin to look to take advantage of infrastructure cloud, a good starting point is using cloud infrastructure to help reduce costs and modernise (de-risk) the business, transform development, and unlock innovation, transformation and new growth. Here are some examples of how companies are using IaaS to meet these business requirements. 
Cost reduction & modernisation [easy-tweet tweet="Test and development is one of the most common reasons to use infrastructure cloud" hashtags="Cloud, HybridCloud"] Pikicast, a South Korean social media content provider, migrated to public cloud services in order to minimise its upfront costs and support a long-term data retention strategy. The business adopted a cost-effective archiving solution to store up to 15GB of daily user-access data, enabling the business to more efficiently analyse user patterns, such as specific media-content interests, without the cost burden of storing large volumes of user data. As a result, the company was able to reduce maintenance costs by 98 percent and gain superior support for its ambitious growth plans. Transforming development Test and development is already one of the most common reasons to use infrastructure cloud.  One of Korea’s leading IT solution providers, Goodus, was experiencing increasing customer demand for its in-house developed-network-management solutions. But, the business was challenged by legacy architecture that made it difficult and time-consuming to perform multiplatform testing scenarios. Thanks to the use of database, compute and storage cloud services, the company now enjoys a highly responsive, low-cost development process. Innovating in the cloud Cloud is also an innovation enabler. Falkonry is an artificial-intelligence company that specialises in recognising patterns of behavior from data in real time. For example, brain seizures can be anticipated and monitored and then the right intervention can be provided in a timely fashion. Using bare metal cloud services, Falkonry is now able to quickly leverage a huge amount of computing power to explore data; giving customers instant access to its revolutionary artificial intelligence technology so that they can solve problems in days or weeks, rather than years. By choosing an enterprise-class cloud solution, Falkonry can assure clients that their sensitive data is protected. These are just a few examples of how businesses have made the move to cloud in order to lower costs, effect innovative transformation and stay at the leading edge of their industry. With cloud infrastructure hitting a key tipping point this year and economic uncertainty lingering on the horizon, now is the time for enterprises to take a deep look at enterprise-class cloud. ### Here is Why AWS Still Dominates the Cloud Market The cloud computing market is growing and offers incredible opportunities for companies wanting to establish themselves as leaders in cloud computing. According to Tech Crunch, the IDC expects the cloud market to more than double in three years, to $195 billion by 2020. One company that is leading the cloud market arena is Amazon Web Services (AWS). According to Canalys, AWS dominates the market with 33.8 percent global market share while its closest competitors—Microsoft, Google, and IBM—together account for 30.8 percent of the total market. Other smaller competitors include Oracle and Alibaba Cloud. ZDNet claims that AWS 2016 operating income was $3.11 billion on revenue of $12.22 billion and its annual operating profit was more than 25 percent. 
Synergy Research Group reported, “While Microsoft, Google, IBM, Alibaba and Oracle all achieved Q1 growth rates that were substantially higher than that of AWS, AWS revenues are still comfortably bigger than the other five combined.” The question to ask is, “Why does AWS still so heavily dominate the cloud market?”

Head start
Amazon's cloud story started with Merchant.com, an e-commerce platform built to help third-party merchants build online shopping sites. However, Amazon struggled to develop an organised environment that separated its various services, so the challenge was to create a platform genuinely useful to third parties. In 2006, Amazon Web Services (AWS) emerged and shaped the public cloud industry, and developers used AWS to test and build new apps. The emergence of AWS came from the philosophy of the “virtuous cycle”: the cycle begins with winning customers, then adding more servers and features as the number of clients grows. When AWS adds more servers and features, customers enjoy lower prices, and AWS gains even more customers. John Dinsdale, chief analyst at Synergy Research Group, stated that “AWS was essentially the first to launch” and “made the market its own” before other giant companies launched similar services. Although companies such as Microsoft and Google experimented with cloud technology, they treated cloud services as a fad and failed to commit resources to their development. Even though Microsoft launched its Azure platform in 2010, it is because of AWS' quick initiative that it remains the prominent leader that still rules the cloud market.

Price drops
AWS also remains the leader in the cloud market because of its constant price drops. According to MarketWatch, “AWS has cut prices more than 50 times since its launch in 2006", and SunTrust Robinson Humphrey analyst Kunal Madhukar said he "wouldn’t be surprised if further price reductions were coming shortly, following recent cost cuts for Microsoft’s Azure.” Since launching in 2006, AWS has maintained its cloud market lead through the aggressive investments it makes to expand its network. The more AWS expands its network, the greater the economies of scale that enable it to offer customers lower prices and full-bodied enterprise-scale features. And remember the virtuous cycle? The growing number of clients that AWS gains allows more funds to be allocated to network expansion, which results in price cuts, and those price cuts translate into a competitive service. Although rivals such as Microsoft, Google and IBM had higher Q1 growth rates than AWS, according to the Synergy Research Group, it is important to remember that it is easier to grow quickly from a small customer base than from a large one. According to TechCrunch, “While AWS might not have the eye-popping growth percentages of its rivals, it still grew at a decent 47 percent, with earnings of $3.53 billion on an astonishing $14.2 billion run rate".

New infrastructure and innovations
Remember how AWS began in 2006? It started as an infrastructure platform to enable developers to build and test new apps. The virtuous cycle then proved its worth: AWS now has more than 1 million users, including big-name companies such as Netflix and Airbnb. Because of this massive customer base, AWS has better visibility than almost any vendor into how customers use cloud services. 
Therefore, AWS keeps creating new infrastructure and innovations to meet its customers’ needs and support them around the world. Since 2006, AWS has progressed into a platform for databases, developer apps, tools, and analytics. According to MarketWatch, Amazon “recently announced plans to open an infrastructure region in Sweden in 2018, joining the 16 regional operations it currently maintains around the world, with another two slated to come online later this year.” AWS’ ability to innovate goes beyond computing and storage. AWS has built a full-featured platform of services that has given the company a competitive edge as the leader in cloud services. But while AWS has the lead, rivals such as Microsoft and Google are also growing fast and making a place in the market with their machine learning platforms and artificial intelligence (AI). According to Seeking Alpha, AWS appears to be behind its competitors when it comes to AI. Artificial intelligence is one of the fastest growing areas in cloud computing, and because of the massive amounts of data needed for AI, there is a race to reap the enormous rewards it promises.

Fast and easily scalable solutions

AWS provides fast, easily scalable solutions for its customers. According to thenextweb.com, AWS has the following strengths:

- A diverse customer base
- A broad range of strategic adoption tools, such as native cloud applications and e-business hosting
- A large tech partner ecosystem, including software vendors that integrate their solutions with AWS
- An extensive network of partners that provide app development expertise, managed services, and professional services such as data centre migration
- A rich array of IaaS and PaaS capabilities
- Rapid expansion of service offerings and higher-level solutions

Having this huge capacity gives AWS the scope to solve its customers’ business challenges. Companies can use AWS auto-scaling to match their load demands instead of having to maintain capacity for peak loads in traditional data centres. With auto-scaling, companies add or remove servers automatically depending on the load, and unstable servers can be detected, terminated and replaced with fresh ones (a short illustration follows at the end of this article). So the advantages of using AWS offer companies solutions for many different needs under one roof. It is no surprise that AWS still dominates the cloud computing market because of the services it offers its customers. Even though rivals such as Microsoft and Google achieved higher 2017 Q1 growth than AWS, Amazon’s quick initiative in its early years, price drops, improvements in infrastructure, constant innovation, and fast, easily scalable solutions still make the company the leader in cloud computing. However, companies such as Microsoft and Google are making gains in the market with artificial intelligence and machine learning platforms. For Amazon to remain the frontrunner in cloud services, it must be as fast and innovative in implementing artificial intelligence platforms as it was in its early years. It is going to be interesting to see what Amazon will do to keep its competitive edge in the cloud market wars.
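To make the auto-scaling point a little more concrete, here is a minimal, hedged sketch using AWS’s Python SDK, boto3. It assumes an EC2 Auto Scaling group named "web-tier" already exists and that AWS credentials are configured; the group name, region, sizes and CPU target are illustrative placeholders rather than settings drawn from any of the companies mentioned above.

```python
# Illustrative sketch only: assumes an existing Auto Scaling group called
# "web-tier" and configured AWS credentials. Names and numbers are invented.
import boto3

autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

# Keep between 2 and 10 instances; instances that fail health checks
# are terminated and replaced by the group automatically.
autoscaling.update_auto_scaling_group(
    AutoScalingGroupName="web-tier",
    MinSize=2,
    MaxSize=10,
    HealthCheckType="EC2",
    HealthCheckGracePeriod=300,
)

# Add or remove instances automatically to hold average CPU around 50%.
autoscaling.put_scaling_policy(
    AutoScalingGroupName="web-tier",
    PolicyName="target-cpu-50",
    PolicyType="TargetTrackingScaling",
    TargetTrackingConfiguration={
        "PredefinedMetricSpecification": {
            "PredefinedMetricType": "ASGAverageCPUUtilization"
        },
        "TargetValue": 50.0,
    },
)
```

A few lines of configuration like this are what ‘matching capacity to load’ boils down to in practice: the scaling decisions themselves are then made by the platform rather than by an operator watching dashboards.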
### Breaking Through the Machine-Learning Hype

Machine learning is one of 2017’s most used buzzwords, with tech companies across the board raving about its applications. Pick an industry, any industry, and you’ll surely see news around how machine learning is helping businesses achieve unseen levels of innovation. Unfortunately, more often than not these expectations are inflated. Gartner recently placed machine learning right at the top of its ‘peak of inflated expectations’. It’s understandable that it’s becoming harder to see where the practical applications of machine learning lie. In essence, machine learning is a general term describing how systems can learn from feedback loops to improve their performance. The term encompasses a variety of different technologies, making it difficult for the average business to realise how machine learning can specifically help them. Here’s an overview of the technologies you should know about and how they can be applied to a variety of businesses.

Machine learning and business applications

Due to the wide array of AI and machine learning media coverage, there’s a common perception that machine learning is limited to groundbreaking new companies that have large amounts of money to spend. This doesn’t have to be the case: machine learning has been used in the past by smaller companies with smaller budgets, or by larger companies but not in the kind of media-friendly ways you might expect. An example of this is Airbnb’s use of machine learning. The company is always working to get more bookings, competing with the hotel industry on both price and experience. To do so, it has implemented machine learning processes which work in conjunction with data to make search results more personalised to the company’s customers. This has boosted online conversion rates by 1% - a huge amount considering online conversion rates are usually very low. While companies like Airbnb, sitting on huge cash reserves, may find it easy to invest and experiment with these technologies, there’s a popular perception that businesses with smaller budgets would find this difficult. Despite what you might think, machine learning has matured rapidly, with many core component technologies open-sourced or productised in ways that make them accessible to smaller businesses on tighter budgets.

How to be practical about it

So, what are these technologies, and how are they typically used? Google’s machine learning team has developed a technology called TensorFlow, which is a framework to implement machine learning at scale. TensorFlow, which was open-sourced in 2015, has allowed businesses of all shapes and sizes, from startups to established brands like Ocado, to use machine learning for their specific needs. Naturally, Google isn’t the only company developing these technologies. Amazon is also competing by offering its own solution, Amazon Machine Learning. This differs from TensorFlow in various ways: with TensorFlow, you can build your own models and execute them against datasets wherever you like; AML, on the other hand, requires that you upload your dataset to Amazon and use its API to execute queries. AML is a hosted machine learning product, while TensorFlow gives you more freedom. Microsoft is another player in this market, providing a service called Azure Machine Learning Studio. This product lets users build their algorithms visually and execute them in the Azure cloud. On top of these three big corporations, there’s a wide array of smaller startups providing machine learning services, among them BigML, MLJar and Algorithms. The point here is that you don’t need to be an expert in the field, or have heaps of money, to use machine learning – as the short sketch below illustrates.
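As a rough illustration of how low the barrier to entry has become, here is a minimal, hedged sketch of TensorFlow in use: a tiny linear model trained on invented data via the Keras API that ships with recent TensorFlow releases. The dataset, layer size and training settings are placeholders chosen purely for illustration, not a recipe from any of the companies mentioned above.

```python
# Minimal illustrative sketch: a tiny regression model on synthetic data.
# Requires TensorFlow (pip install tensorflow); all numbers are invented.
import numpy as np
import tensorflow as tf

# Toy dataset: learn y = 3x + 2 with a little noise.
x = np.random.rand(1000, 1).astype("float32")
y = 3.0 * x + 2.0 + 0.05 * np.random.randn(1000, 1).astype("float32")

# One dense unit is enough for a linear relationship.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(1,))])
model.compile(optimizer="sgd", loss="mse")

model.fit(x, y, epochs=20, batch_size=32, verbose=0)

# The learned weight and bias should land close to 3 and 2.
weight, bias = model.layers[0].get_weights()
print(weight.flatten(), bias)
```

Nothing about this requires a data science team or a big budget; the investment, as discussed next, is mostly developer time spent learning the tools.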
Yes, these tools will require some upfront investment in research from your developers. However, they have also made machine learning affordable and accessible to all sorts of businesses. It’s a trade-off I predict more and more businesses will find attractive in the future. [easy-tweet tweet="There’s a huge range of options available as the machine learning API ecosystem develops" hashtags="MachineLearning, AI"]

Find what works for you

When it comes to picking a machine learning service for your business, you will need an open platform that is compatible and extensible with the best services. Remember - there’s a huge range of options available as the machine learning API ecosystem develops, but that doesn’t mean you should be overwhelmed. As long as you focus on your specific business needs, you should have no issue making a choice. It’s also important to decide whether you want a ready-made machine learning ‘product’ to use or a set of APIs. Among the solutions that behave more like products are Amazon Rekognition, which provides image recognition; Amazon Lex, which focuses on chatbots; and Amazon Polly, which handles text to speech. These should be easy to integrate and have many uses (a brief Rekognition example follows at the end of this piece). Another option is to build features into your products that are powered by machine-learning APIs. One example of this is the Google Cloud Video Intelligence API, a service currently in beta that can recognise objects in videos. You can also build and train your own machine learning models, depending on your expertise. For this, you can use AML or TensorFlow. Think carefully about your problem and what resources you have, then research and assess what will work best for you. Machine learning has a wide range of applications, which may make it sound daunting, but the market is getting larger all the time. The big players like Google and Amazon are investing a lot of time and thought into making machine learning more accessible for ‘regular’ businesses, so there’s never been a better time to jump in and experiment.
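As a closing illustration of how ready-made these hosted products are, here is a hedged sketch that calls Amazon Rekognition through the boto3 SDK to label an image already stored in S3. The bucket name, file name and thresholds are invented placeholders, and the snippet assumes AWS credentials with Rekognition and S3 permissions are already in place.

```python
# Hedged sketch: label detection with Amazon Rekognition via boto3.
# The bucket and object key below are placeholders, not real resources.
import boto3

rekognition = boto3.client("rekognition", region_name="eu-west-1")

response = rekognition.detect_labels(
    Image={"S3Object": {"Bucket": "example-media-bucket", "Name": "storefront.jpg"}},
    MaxLabels=10,        # return at most ten labels
    MinConfidence=75.0,  # ignore low-confidence guesses
)

for label in response["Labels"]:
    print(f'{label["Name"]}: {label["Confidence"]:.1f}%')
```

The appeal of the ‘product’ route is visible here: there is no model to build or train, just an API call and a list of labels to wire into your own application.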
### London North West Healthcare Selects Genmed as its HMRC Vendor

Genmed announced on 30th June that it had been selected by London North West Healthcare NHS Trust as its preferred HMRC-compliant, vendor-neutral managed services partner. The appointment was by mini competition under the SBS managed services provider framework and is initially for a three-year period with an option to extend by agreement. London North West Healthcare NHS Trust (LNWH) is one of the largest integrated care trusts in the country, bringing together acute hospital and community services across Brent, Ealing and Harrow. With an operating budget of £743.6 million, it employs 9,500 staff, serves a diverse population of approximately 950,000 people and is responsible for Central Middlesex, Ealing, St Mark’s and Northwick Park Hospitals. The Accident & Emergency department at the Northwick Park site is thought to be the largest in Europe. Vince Pross, LNWH’s associate director of procurement, explains, “The Trust currently has a small number of services delivered using a managed services contract model but wishes to expand this considerably to now include various priority areas such as endoscopy trust-wide, surgery, anaesthesia, LED lighting, imaging and cardiology, air-conditioning, fire alarm maintenance and electronic document management. Genmed was selected as it can support us in all our priority areas.” Genmed has an established and proven track record of success providing managed services to the NHS, helping trusts address the current lack of additional capital. Genmed will work in partnership with LNWH to manage the procurement process on its behalf in the above priority areas. This will include running mini competitions, selecting suppliers in conjunction with clinical input, and managing the whole delivery, administrative and invoicing process. [easy-tweet tweet="Genmed’s services are off balance sheet and HMRC compliant for VAT recovery" hashtags="Healthcare, recovery"] The costs for these managed services will then be packaged using an umbrella contract and billed monthly or quarterly. This approach improves cash management, as no big upfront investment is required from LNWH and the costs are ‘smoothed’ over time. Also, Genmed’s services are off balance sheet and HMRC compliant for VAT recovery. This will allow LNWH to reclaim the VAT so that budget can be saved or the money used to fund additional front-line services. Robin Modak, Genmed’s chief executive officer, says, “Most managed service providers focus solely on transactions. They’ll just source and supply equipment. Genmed is different as our approach is much broader. We’re more consultative and will work with Trusts like LNWH to not only procure services cost effectively and efficiently, but fully understand their operational and clinical needs and STP planning goals, and look at how we can, therefore, add real value with service provision to improve the patient’s pathway and care provided.” The Genmed approach is closely aligned with the key directives highlighted by Lord Carter of Coles in his report on improving the efficiency of hospitals.

Genmed works with best-of-breed suppliers with a track record of success in the NHS

The LNWH brief for managed services is broad. To deliver this, Genmed is vendor neutral and will work with LNWH to select partner firms and OEMs who will fulfil the Trust’s requirements at the right price. For example, one of LNWH’s areas of focus is electronic document management (EDMS). Genmed partners with leading bureaus, BPO and records management firms including Restore, the largest UK-owned archive document storage company, which works with over 40% of all NHS Trusts providing records management and scanning services. In addition, Genmed announced a partnership in February 2017 with CCube Solutions, a leading EDMS software vendor, to use the managed services approach to cover all the costs required to shift from paper to digital medical records so that the health service meets the Government’s deadline to be paperless at the point of care by 2023. To close a medical library and go paperless typically costs more than £1 million. A managed service means a Trust can close a library, get rid of the paper and the vans required to deliver it, redeploy people - or reduce headcount - and the cash saved can immediately be used to pay for the service. The approach therefore becomes self-funding. CCube Solutions has a proven ability to deliver based on two decades of expertise and an established track record of project success in the NHS. Today, its EDMS is used at 28 trusts and health boards around the country including Aintree, Addenbrooke’s, Aneurin Bevan, Milton Keynes, North Bristol, Papworth, St Helens and Sheffield Teaching Hospital.
### TrustFord Accelerates its Networks with ANS TrustFord, the UK’s largest, dedicated Ford dealer group, has partnered with cloud and managed services provider, ANS Group, to transform the performance of its network and IT systems with a Wide Area Network (WAN) solution. The company – which has 60 sites across the UK offering sales, servicing and repairs –  will also enjoy significantly higher bandwidth, faster network speeds and direct public cloud connectivity, allowing it to run more applications with increased efficiency. This will enable them to better engage with customers at its dealerships through technology, transform its internal communications and future-proof its network infrastructure. [easy-tweet tweet="The cloud-ready network will deliver TrustFord customers up-to-date ‘smart’ car software" hashtags="Cloud, IT"] The project will enable TrustFord to accelerate the introduction of the FordStore concept into dealerships, which creates an immersive, digitally-led experience for customers through interactive displays, a dedicated app and video content. The WAN will also deliver improved connectivity between sites by enabling next-generation video conferencing applications, and facilitating remote working. On top of this, the cloud-ready network will deliver TrustFord customers up-to-date ‘smart’ car software and updates as they are developed and released by Ford. Commenting on the partnership, Andy Pocock, IT director at TrustFord, said: “We chose ANS because its values and ambition complement our business perfectly, as well as the resiliency and scalability of its WAN offering. “We are the largest Ford dealer in the UK and pride ourselves on our forward-thinking, application-led IT strategy. The team at ANS understand our vision and have exceeded our expectations. We’re confident that this project will future-proof our IT so we can be even agiler and adapt to changing business needs in years to come.” Andy Barrow, CTO at ANS, added: “TrustFord is leading the way in the motor industry, and it has been quick to adopt the latest technology in order to deliver a better experience for both customers and staff. We are looking forward to the start of a very successful working relationship and supporting TrustFord’s growth through technology.” ### What is AppFormix? - Sumeet Singh In this interview Compare the Cloud Talks to Sumeet Singh, Product Leader for Juniper’s cloud operations offering, AppFormix. Acquired at the end of 2016 AppFormix combines machine learning and advanced analytics to streamline and automate operations across software-defined infrastructures and application layers. In an engaging discussion, Sumeet walks us through the challenges of running modern, multi-cloud and hybrid cloud environments providing a first-hand account of the sleepless nights and daunting workload that led to the creation of AppFormix. Combining the ease of deployment and seamless integration with environments spanning public clouds, OpenStack and Kubernetes, AppFormix is rapidly becoming the ‘go to’ product for monitoring and optimising cloud operations in the Enterprise and Telco space. Find out more about AppFormix at https://www.juniper.net/uk/en/ ### Compare the Cloud Interviews Chris Hafner from Strategic Planning Society Compare the Cloud's guest host Priyanjalee Perera interviews Chris Hafner from Strategic Planning Society. 
Chris Hafner talks us through Artificial Intelligence, NLP and Automation, as well as giving some background on his history in the industry and his use of AI in the management consulting field. Chris also touches on the disruption AI can cause across various other industries. Find out more about Strategic Planning Society here: http://www.sps.org.uk/

### Suprema Showcases Its Latest Biometric Security Solutions at Security Twenty 17

Suprema Inc., a leading global provider of biometrics and security solutions, today announced that it will showcase its latest range of biometric security solutions at Security Twenty 17 North in Harrogate on July 4. At the show, Suprema will showcase its latest facial recognition access control terminal, 2nd generation fingerprint IP readers and its security platform. [easy-tweet tweet="FaceStation 2 is loaded with Suprema's latest face recognition technologies." hashtags="Technology, AI"] Following its introduction at last week's IFSEC International 2017 exhibition, Suprema plans to build on the successful launch of its flagship biometric device, FaceStation 2, the world's best performing facial recognition terminal in terms of matching speed, operating illuminance and user capacity. In addition to FaceStation 2, Suprema will demonstrate its range of access control solutions, including fingerprint IP readers and its security platform. The new FaceStation 2 is loaded with Suprema's latest face recognition technologies. On the biometric side, FaceStation 2 performs fast face matching of up to 3,000 matches per second, which makes it the world's fastest facial recognition terminal to date. Suprema's new face recognition technology has been improved by an enhanced algorithm, a new optical structure and a class-leading 1.4GHz quad-core CPU. As a result, matching speed has been improved by 300% compared with its predecessor, the FaceStation. On the optical side, the new face recognition technology now overcomes possible interference from dynamic lighting conditions, including sunlight and ambient light. It allows a greater range of operating illuminance, from zero lux to 25,000 lux, covering almost every possible lighting condition, indoor or outdoor, day or night. "At Suprema U.K., our team is very excited to take part in this year's Security Twenty 17 North and to continue our successful new product launches in the UK market. Since our opening in the UK last year, Suprema U.K. has been improving our product offering and focuses on providing localised support and training to the UK security industry," said Jamie McMillen, Managing Director at Suprema U.K. To experience the latest innovations in Suprema's products and solutions, please visit the Suprema stand at Security Twenty 17 or visit us at www.supremainc.com.

### Keeping Business Data Safe - 4 Questions to Ask Vendors

Malware, phishing attacks and data breaches have unfortunately become the new norm in today’s digital world. Cyber criminals are increasingly targeting Internet users of all shapes and sizes to gain access to sensitive data or online accounts, with the recent WannaCry attack only one of a plethora of incidents that businesses have to deal with. According to official government figures, half of UK businesses were hit by cyber-attacks in 2016 alone. Cyber-attacks can be expensive, not only in the cost of cleaning up, but the scars of those experiences often take a while to heal.
No organisation is immune to these attacks either, but there are a few measures that can be taken to minimise the risk to customers and the business as a whole. This is where due diligence comes in. Businesses must make sure they are thorough in their evaluation of cloud services providers to ensure they have the necessary protocols and capabilities to guard against cyber-attacks adequately. As businesses increasingly leverage third-party digital services, they need to be sure that the vendors they partner with have their best interests at heart. But who’s keeping vendors honest? Businesses can no longer assume that vendors are on the same page when it comes to cyber security. The stakes are too high for that. Whether you are reviewing an existing relationship with a vendor or assessing a potential new one, here are a few key questions businesses need to ask to make sure they are keeping their data and customer data safe and secure at all times.

Which cloud security standards do you adhere to?

As a result of recent high-profile hacks, the need for strong security controls and processes has risen up the list of business priorities. Businesses are particularly concerned about whether or not vendors can meet the security requirements necessary to keep document-based transactions safe and secure. For regulated industries, there is a need to go above and beyond commonly used security protocols. The ultimate goal is to protect data so that businesses can remain compliant with standards imposed by stakeholders. There is a range of security protocols – including SOC 2, HIPAA and FedRAMP – that offer robust standards and processes. With these standards, auditors keep vendors honest by making sure they attest to and implement security best practices – day in and day out – without exception. [easy-tweet tweet="Finding a dependable and security-conscious vendor that offers flexibility is essential." hashtags="Security, Vendor"]

How flexible are your deployment options?

The cloud is getting increasingly popular with businesses, but trusting another company with documents and data is never easy. Finding a dependable and security-conscious vendor that also offers flexibility is essential. Vendors must be able to give their customers choice over how and where they deploy the solution (i.e., in a public cloud, private cloud or on-premises behind a company’s firewall). Businesses must also be able to make necessary changes with minimal inconvenience, because it is the ability to make these types of on-the-fly changes that ensures they can keep moving and eliminate (or at least minimise) security risks without impacting customers, partners and employees.

Can you white-label the experience?

When a vendor’s logo and brand is prominent as part of the user experience, it can create confusion and a disjointed experience. Also, if the vendor is breached, even though it doesn’t necessarily have anything to do with the business, it’s not inconceivable for it to have a spill-over effect that impacts the business by association. One good piece of advice is for businesses to fully white-label the experience – removing all traces of the vendor’s brand. For example, with e-signatures, businesses should consider white-labelling everything from the web and mobile screens to the email notifications that are sent to signers.
This will make your business and your customers less vulnerable to attacks in the event your vendor’s service is breached, and it will also make it easier for customers to detect suspicious emails.

How do I know I can trust you?

Trust and security are at the heart of digital transformation, and they must remain the top priority in the world of digital transactions. As mentioned earlier, no one is immune from cyber-attacks, but by investing in the right human and technology resources, building trust and confidence will be much easier. The concept of a digital trust chain that links technologies together to provide a secure transaction from end to end needs to be at the heart of any digital business. This chain should include everything from authentication to identity and access management and other security components needed to safeguard the process, including the data and documents underlying the transaction. It is essential for businesses to thoroughly research vendors to understand their product capabilities, cloud security practices, certifications, track record and the frequency of their security audits before putting their money on the table. It might sound like a lot of work, but this approach could expose past shortcomings, incidents of data loss or leakage, or other risks that could potentially harm your business and customers.

### Fujitsu and SUSE Unveil "SUSE Business Critical Linux"

Fujitsu and SUSE today announced the launch of a new premium Linux support service, "SUSE Business Critical Linux," which will be exclusively offered by Fujitsu over the next 12 months. Fujitsu has selected SUSE as its preferred Linux partner for this new premium Linux support service, which will be offered based on the two companies' strategic alliance concerning open-source product development and support, announced in November 2016. With the accelerated adoption of Linux operating systems by enterprises worldwide, demand for higher-level support services is growing, especially in mission-critical areas where applications such as in-memory databases are running on Linux platforms. "SUSE Business Critical Linux" will be provided jointly by Fujitsu and SUSE to address customers' evolving needs with a highly reliable, 24/7 support framework that significantly extends current support periods, from five years to up to eight years per SUSE service pack. [easy-tweet tweet="Fujitsu and SUSE help customers cut risks and ensure around-the-clock application availability " hashtags="Strategy, Data"] This exclusive offering provides business customers with one-stop shopping, including Fujitsu-manufactured servers and other hardware. Together, Fujitsu and SUSE will help customers cut risks, ensure around-the-clock application availability and long-term application stability, and optimise maintenance schedules and costs. Katsue Tanaka, SVP, Head of Platform Software Business Unit at Fujitsu Limited, says: "We are pleased to announce 'SUSE Business Critical Linux' as an outcome of the strategic alliance with SUSE. This new support ensures the highly reliable and secure computing environment required in mission-critical systems, and the offering further extends and strengthens the existing strategic partnership between SUSE and Fujitsu." Rupert Lehner, Head of Enterprise Platform Services at Fujitsu EMEIA, says: "We're proud to introduce this unique service offering to customers worldwide.
Open source business applications are rapidly evolving, and with this comes a growing demand for flexible, fast and secure support services that can meet organisations' enterprise requirements for maximum system reliability. Together with our preferred Linux partner, SUSE, we are committed to delivering open source-based solutions for the most challenging business environments. This service agreement takes our strategic partnership to a new level." "The long-term support provided by SUSE with Fujitsu for mission-critical computing is big news for enterprise customers seeking always-on solutions that really are always on," said Phillip Cockrell, SUSE vice president of Worldwide Alliance Sales. "Mission-critical workloads demand the powerful technology and detailed, ongoing support and service this joint solution will provide." The new service offering will help meet enterprise requirements for maximum data and application availability and rapid innovation in the data center, and will support the ever-increasing demands of mission-critical workloads.

Pricing and Availability

Fujitsu will make "SUSE Business Critical Linux" available in EMEIA and Japan from July 2017 to customers running "SUSE Linux Enterprise Server 12" and "SUSE Linux Enterprise Server 12 High Availability Extension." The service will subsequently be rolled out to other regions globally. Pricing varies according to region.

### Cloud-based Accountancy – The Smarter Way to Manage Your Accounts

In the last few years, more accountancy firms have introduced cloud-based accountancy into their practices. This sees the use of accountancy software that is hosted on remote servers instead of being hosted internally onsite, which is one of the key distinctions between cloud and traditional accounting. As there is no longer a need for a local hard drive, one of the main reasons cloud-based accounting is becoming increasingly popular is that it allows data and information to be stored and accessed online at any time. Because it is hosted in the cloud, it also means nothing needs to be installed on computers, and software updates are applied as soon as they are available. Another advantage compared to traditional accounting is that there is less risk of losing information, as data in the cloud is automatically backed up regularly. One of the early adopters of cloud-based accountancy in the UK was Norwich accountants Farnell Clarke. Partnering with Xero, one of the leading suppliers of cloud accounting software, Farnell Clarke has been awarded Most Innovative Practice at the British Accountancy Awards. Part of the reason for the success of cloud-based accountancy for accountants and their clients is that business owners have recognised the improvements it provides in streamlining the accounting processes in their day-to-day business activities. For the entire workforce, whether it is the accounts department checking invoices, a sales representative submitting their expenses or the owner needing to review financial figures, being able to connect to the cloud at any time means tasks can be handled remotely by all. This is one of the key reasons why many industries have adopted cloud-based operations: it allows business owners and employees who need easy, frequent access to their accounts to work in an agile and effective way.
[easy-tweet tweet="Business owners now have instantly updated financial information about their company" url="#Cloud #Finance"] Additionally, another area that has had an important impact in using the software is that whenever accounting actions are made, businesses can see these changes reflected immediately in their account in real time. Business owners now have instantly updated financial information about their company. With financial information right at their fingertips, for fast-moving industries with the need to respond to business changes quickly, this is of high importance to work dynamically and effectively with their finances. As well as benefitting business owners and employees, another aspect that it will help for improving the efficiency of accounting processes, whether it is national or international, for businesses with multiple offices as it permits for all of them to access the same data across the board at the same time. This multi-user access means working together has been made simpler and with the explosion in the use of portable smart devices such as smartphones and tablets in the day to day business activities, never has getting access or keeping connected with accounts and finances been easier for businesses. This has proven particularly beneficial for workforces that are required to work mobile or on the regular on the road such as sales teams. A frequently asked question when a business is considering whether to use cloud based accountancy or not is whether their information and data will be safe and secure. With the increasing awareness about cyber threats, it is vital that businesses feel reassured their sensitive data, particularly financial details, will be protected. Many developers of cloud-based products recognise this and develop their products with the strictest of security measures. Cloud accounting is a secure method and in some cases, an even safer process compared to traditional accounting. For example, with traditional accounting, information is hosted internally, and on company computers, if this is stolen, the data can also be taken. Cloud accounting does not have this same risk. ### Cloud Control: Getting a Grip on Costs 57Today, businesses increasingly want the flexibility to respond to new opportunities quickly and are turning towards the cloud to achieve this – with IDC expecting spending on public cloud services to surpass $141 billion by 2019. The cloud has been heralded as a way for companies to not only increase agility but also to reduce expenditure, by replacing the traditional costs of IT, such as staff and hardware, with utility costs often based on consumption. However, as there are a multitude of cloud vendors and services to choose from, with some different payment models – for example, on a subscription or pay per use basis – it can be a challenge to calculate which solution is the most cost-effective. According to 451 Research, only 64% of providers publish their prices online, so businesses, “have to manually engage and negotiate with over one-third of providers to get a complete view of the market”. While the rationale behind the transition to cloud computing is well grounded, selecting the right cloud service to suit a business’ needs can be a barrier to adoption. Buy cheap, buy twice…or three times, or four times When selecting a service, businesses need to ensure value for money to realise the cost savings that cloud promises. One way to control these costs is through purchasing a cloud subscription and paying ‘upfront’. 
However, there can be a risk with this model that organisations end up paying for services they no longer use – if they fail to monitor the consumption of the services they subscribe to. Alternatively, an organisation can opt for a pay-per-use option; the risk here is that consumption may quickly overrun forecasts and blow the budget. When choosing a cloud service, it’s important not to simply pick the cheapest solution on offer, as these rarely work out the best value in the long run. For example, the ‘cheapest’ cloud services are usually tied to fixed long-term contracts that restrict the very flexibility businesses moved into the cloud to achieve. It’s also crucial that any cloud purchasing decision is made centrally and in line with business objectives – whether that’s reduced costs, increased flexibility or a mixture of both. If purchasing decisions are not centrally controlled, a company may end up with duplicated services in several clouds. To avoid this pitfall, companies should conduct an assessment of shadow IT to identify which unapproved cloud services and applications, purchased outside of IT controls, are already in use. Ultimately, organisations must ensure they have complete visibility of all the services they use to manage cloud effectively, control costs and reduce security vulnerabilities. [easy-tweet tweet="Modern dashboard solutions can display current cloud usage as well as costs" hashtags="Cloud, Technology"]

Control the cloud to avoid a storm

Post-implementation of a cloud service or product, tracking consumption and costs across multiple cloud environments is crucial to ensure projects don’t run over budget – especially with the growing diversity of cloud environments. The best way to keep tabs on numerous clouds is through the adoption of a centralised dashboard, which integrates consumption data from different cloud providers. Modern dashboard solutions can display current cloud usage as well as costs, and deliver notifications in real time when parameters are reached – helping organisations keep a real-time picture of their cloud environment (a simple illustration of this kind of consumption tracking follows at the end of this article). To optimise cloud projects and achieve the desired business results at the best cost, companies need to regularly review the cloud market to ensure workloads are placed with the best supplier. This underlines the importance of a dashboard, as it enables companies to assess whether consumption and cost remain in line with forecasts, so budgets or workloads can be adjusted. This ability to analyse cloud usage and cost enables businesses to anticipate and identify trends in their consumption and optimise accordingly – ensuring, for example, that applications which are not being used are dropped, or moving a workload back to an on-premise environment if cloud usage and cost have wildly exceeded expectations. For organisations to realise the benefits of cloud computing, the selection and management of cloud must be approached in the right way. If companies choose a cloud that isn’t fit for purpose, businesses won’t achieve what they set out to. Similarly, if businesses fail to manage their cloud environments, costs can easily spiral out of control – which nullifies any savings the company sought to gain. This underlines the importance of a centralised dashboard which provides visibility of both cloud costs and cloud usage – to ensure that businesses can get a grip on cloud costs both now, and in the future.
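The kind of consumption tracking described above can also be approximated programmatically, at least for a single provider. Below is a minimal, hedged sketch that pulls a month of per-service AWS spend from the Cost Explorer API via boto3 and flags anything over an arbitrary threshold; the dates and budget figure are invented, and a real multi-cloud dashboard would of course aggregate several providers rather than one.

```python
# Hedged sketch: list last month's AWS spend per service and flag big line items.
# Assumes boto3 and credentials with Cost Explorer access; all figures are invented.
import boto3

ce = boto3.client("ce", region_name="us-east-1")  # Cost Explorer API is reached via us-east-1

response = ce.get_cost_and_usage(
    TimePeriod={"Start": "2017-06-01", "End": "2017-07-01"},
    Granularity="MONTHLY",
    Metrics=["UnblendedCost"],
    GroupBy=[{"Type": "DIMENSION", "Key": "SERVICE"}],
)

ALERT_THRESHOLD = 500.0  # arbitrary monthly budget per service, in USD

for period in response["ResultsByTime"]:
    for group in period["Groups"]:
        service = group["Keys"][0]
        amount = float(group["Metrics"]["UnblendedCost"]["Amount"])
        flag = "  <-- over budget" if amount > ALERT_THRESHOLD else ""
        print(f"{service}: ${amount:,.2f}{flag}")
```

Fed into a scheduled job or a simple dashboard, output like this is what makes it possible to spot a workload whose consumption has drifted away from forecast before the bill arrives.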
### Cyber threat to professionals at all-time high

Professional firms, particularly those handling money for mortgage lenders, are increasingly at risk of high-level cyber-attacks, according to leading experts in the fields of law, finance and mortgage lending. Speaking at the Mortgage Tech UK conference yesterday morning (27th June), Georgina Squire, Partner at solicitors Rosling King, warned of the dangers that cyber-attacks pose to professional firms such as lenders, valuers, brokers and solicitors, and advised on what steps should be taken to prevent and mitigate the fall-out when falling victim to such an attack. A joint report by GCHQ’s newly-opened National Cyber Security Centre and the National Crime Agency, published earlier this year, warned that the cyber threat to UK business is at an all-time high. Georgina Squire said: “Given that 2016 was the year that law firm data security came to the fore, by means of high profile events such as the Panama Papers and the rise of Ransomware, it seems reasonable to suggest that the threat will impact professional services firms in much the same way as the wider economy. “Over the last year, professional firms such as my own have been urged to take proactive steps to manage the risk posed by cybercrime to protect both ourselves and our clients.” Squire also encouraged firms to increase staff training and awareness and to build in additional security and protections to try to limit the potential harm, adding: “We have legal and regulatory obligations like all other professionals involved in the mortgage industry. Our regulators are increasingly taking a more hands-on approach to data security and publishing regular guidance.” Current threat areas for law firms and other professionals involved in mortgage origination are:

- Cloud computing
- Email fraud/phishing
- Ransomware
- Identity fraud

Firms are increasingly being advised to develop a cyber-aware business culture, by developing clear and efficient internal procedures for handling money and considering methods to avoid, for mortgage lenders, the two most prevalent areas of cyber fraud, which are:

1. Identity fraud
2. Friday Afternoon Fraud

One aspect of identity fraud is “home high-jacking”. The value of this type of fraud has more than tripled since 2013, rising to nearly £25 million in April 2017, according to HMLR registers. Squire said: “It is now getting to the stage where criminals are paying people to pose as tenants and rent a property using fake identities. One of the tenants changes their name by deed poll to match the true owner’s name, puts the property on the market and sells it to a cash buyer. It is only when the buyer goes to register the change of ownership with HMLR that the true owner, the landlord, is alerted.” To detect this type of fraud, firms are advised to consider the following:

- If a tenant is hassling the agent saying they need keys and want to move in quickly, that might be indicative of them being up to something.
- In these scams it is often the case that the tenants never actually live in the property, which remains empty. This can be checked by the agents. Fraudulent tenants rarely move into the house; they are looking to sell it on quickly.
- Solicitors and professionals doing KYC on new clients may be alerted by the sight of a brand new passport, brand new driving licence or other ID. Fraudsters change their name by deed poll to the landlord’s name.
- When doing KYC, follow up on references and perhaps check phone numbers.
- Owners can sign up for the HMLR Alert service, which informs you if someone is checking the register for your house…

The second major type of cybercrime hitting professional firms acting in the mortgage industry is what has commonly become called Friday Afternoon Fraud. “As solicitors, we see alerts from our regulator, the SRA, almost weekly now with stories of Friday Afternoon Fraud”, said Squire. To avoid this type of fraud, Georgina Squire suggests the following:

- Beware of changes to bank details mid-transaction, and of requests to change them during a transaction.
- When someone asks for money to be sent to a particular bank account, call back on a known phone number to double-check and verify those bank account details over the phone. Only the most sophisticated of fraudsters would be able to intercept that call and provide their own mobile phone number to verify the fraudulent bank details.
- Never send out your bank details in open emails. They should be sent in a password-protected attachment, with the password sent by separate email. It may sound simple, but it acts as an extra layer of protection and is certainly worth doing.

### Bright Computing Wins 2017 HPC Innovation Award

Bright Computing, a global leader in cluster and cloud infrastructure automation software, today announced that it has been selected by Hyperion Research for a 2017 HPC Innovation Award for its ability to provision and manage virtual servers in Microsoft Azure. In May 2017, Hyperion Research (previously the HPC team at IDC) launched the award program to recognize noteworthy achievements using High Performance Computing (HPC) resources. Organizations in the HPC space were invited to submit interesting innovation and/or ROI examples to showcase success stories involving HPC in science and industry. Bright Computing submitted its Azure integration, which is included in Bright Cluster Manager version 8.0. The integration enables organizations to build an entire cluster in Azure from scratch, or extend an on-premises cluster into the Azure cloud platform when extra capacity is needed. Key features of the Bright Cluster Manager 8.0 / Azure integration include:

- Uniformity – Bright Cluster Manager 8.0 ensures that cloud nodes look and feel exactly like on-premises nodes. This is accomplished by provisioning cloud nodes with the same software images already used to provision on-premises nodes. Users are authenticated on cloud nodes in the same way as on-premises nodes, providing a seamless administration experience, and a single workload setup allows users to manage separate queues for on-premises and cloud nodes.
- Streamlined setup process – An intuitive wizard in Bright View asks some simple questions to quickly and easily set up the cloud bursting environment. In addition, Azure API endpoints are accessed via a single outgoing VPN port to the internet.
- Data management – Bright Cluster Manager 8.0 includes a tool which automatically moves job data in and out of Azure.
- Scale – Bright allows organizations to scale nodes up and down based on the workload. Virtual nodes in the cloud can be terminated automatically when they are no longer needed, and recreated when new jobs are submitted to the queue.

Steve Conway, Sr.
Vice President of Research, Hyperion Research, commented; “This award recognizes Bright Computing for their contribution to HPC innovations for academic, government and the commercial sector, and for making HPC more accessible to a broader audience through the cloud, using platforms such as Microsoft Azure.” Bill Wagner, CEO at Bright Computing, added; “We are honored to receive this prestigious award. We have been a long-time supporter of the HPC team at Hyperion, and we look forward to their continued success in years to come.” He added; “We are very excited about the Microsoft Azure integration, the great collaboration, and the business momentum between Bright Computing and Microsoft.” Brett Tanzer, Partner Group Program Manager, Microsoft Corp. said, "There is fantastic momentum between Bright Computing and Microsoft Azure right now.  Helping organizations easily create hybrid clouds is a critical scenario for many customers, especially those using HPC in the cloud for the first time.  We are pleased that Bright Computing is already receiving industry recognition for its work with Microsoft Azure, and look forward to more to come from this collaboration.” ### Powering Small and Medium Enterprises with Cloud Technology The UK is home to more than 5.4 million SMEs. To look at it another way, 99% of all UK business are SMEs, employing 60% of employees and, as of the start of 2016, generating over £1.8 trillion in turnover. Their survival (just four in ten small businesses will be trading after five years) and growth are central to the UK’s economic success, and as such, many are actively looking to find better ways to manage their businesses. Cloud technology has been an important game changer for SMEs, providing access to a wide variety of specialist services and software and providing efficient, cost effective and convenient tools for all business functions from accounting, to people management, to marketing. SMEs’ ability to navigate the complexities of business management has been significantly enhanced as they embrace the use of tools and technologies not previously available and there are some unique and specific benefits they enjoy from accessing Cloud services. The two most obvious benefits for SMEs are in accounting and marketing. Technology and social media have transformed the way in which SMEs can engage with their stakeholders, particularly customers, and the ability to link accounts and data provide astonishing insight, in real time, for business owners. Using mail distribution services such as Mail Chimp automates and smooths the process of direct marketing and success can be measured directly via click-through tracked by services such as Google Analytics. Feedback is almost instantaneous and can be incorporated into future activities with minimal effort. Accounting packages, now widely available and reasonably priced, have also had a huge impact on SMEs. 9 Spokes’ research suggests that 67% of SMEs have at least looked at an accounting package at some point. The reason for this is clear – accounting apps such as Sage One, QuickBooks Online or FreeAgent all provide an easy-to-use, cost-effective way to simplify daily accounting, ensuring SMEs can make better finance decisions. It also takes a lot of heavy manual processing away from business owners and managers, reducing the time and effort required for regular business processes such as checking invoices have been paid, or expenses met. 
[easy-tweet tweet="The low cost of most cloud technology also has an impact on SMEs" hashtags="Cloud, SME's"] What’s essential is that power is put in the hands of the business owner and the accountant becomes a partner to the process, rather than the doorkeeper. The low cost of most cloud technology also has an impact on SMEs. In the past, SMEs have struggled to keep up-to-date as software develops and is updated. Today, best-in-breed technology can be available for smaller businesses without large procurement costs. SMEs can now benefit from innovative new business services that might previously only have been accessible to large corporates, leading to profound effects on productivity and making them more competitive. SMEs can level the playing field much more easily, without significant capital expenditure to remain current and cutting edge, They also have the option to experiment with services and products, without the large upfront investments that can lock them in. The lost cost also translates to scalability. As a business grows, Cloud technologies can easily be scaled up without significant new investment, keeping costs and investments in line with the size and success of the business. For sophisticated businesses, the importance of top-level cybersecurity is already high on their agendas. For SMEs, this can be harder to understand or to access. Again, Cloud technology solves this problem for them with the best apps including high-end security that encrypts and backs up data on the Cloud, ensuring it is protected from theft or accident. Communication is another area in which SMEs gain an advantage through using Cloud technologies. Social media allows them to connect with their customers in real time, while apps designed to work within firms can have a significant impact on internal communications. Cloud-based apps routinely offer multiple user logons, all in real time, allowing managers and employees to update information such as staff rosters or sales information. This can be particularly helpful when teams are spread out across multiple locations. 9 Spokes research done in conjunction with Bryter found that over 80% of SMEs said they need better ways to monitor customer feedback, sales activity, cash flow and people issues. 60% agree they need constant access to information. Taking this into account, here is a selection of the types of apps we think are most useful for SMEs right now. The list is in no way exhaustive (check out www.9spokes.com to see other apps we recommend), but we think it’s a good start for any SME thinking about how cloud-based services can support their business. Accounting – Hedgebook, Sage One, Quickbooks, etc. These apps can be integrated with the SMEs business account, generate and send invoices and track payments made and received. Data is available at all times, on any device, allowing for instant decision making. Marketing and analytics – Google Analytics, Mailchimp, Facebook, Campaign Monitor. All of these do different things, but at their most basic they provide tools for quick and easy communication with customers and ways to review and analyse the response. If using electronic marketing and communication tools, one or more of these are essential, especially when it comes to business promotion. HR and people management – Deputy, GeoOp. These tools allow SMEs to manage staff, duty rosters, workflow and internal communication. eCommerce/Inventory – Shopify, Vend, etc. 
These apps track inventory, provide sales information and offer convenient access to e-commerce solutions. For SMEs selling products online, they are invaluable.
- Productivity – Office 365, SuiteBox, Box. Available anytime, on any device, these apps store data in the Cloud, provide access for multiple users, create virtual meeting spaces and more.

The list of tools is long and benefits SMEs as they move between offices, customers, staff and suppliers. The impact of new technologies, particularly Cloud technology, on the sustainability and growth potential of SMEs should not be underestimated. Their ability to compete and to use these tools to accelerate growth will continue as they familiarise themselves with the opportunities. The potential impact on the entire market, and the economy, is massive.

### Wimbledon Prepares for Greatness – Re-shaping the Fan Experience with Cognitive

IBM (NYSE: IBM) and The All England Lawn Tennis Club (AELTC) today unveiled new technologies for The Championships 2017, marking the next phase in the AELTC’s journey and cementing digital as the gateway to its brand with unique data insights and cognitive solutions. Sports and entertainment companies compete on compelling content, which is why personalising and re-shaping the fan experience is a top priority for telecom, media and entertainment companies. Wimbledon is embracing this change in its pursuit of being the best tennis tournament in the world. It starts with making sense of data and realigning around fans: understanding and delivering on their needs and preferences. To this end, The Championships, Wimbledon 2017 sees AELTC and IBM reveal four strategic innovation areas: Changing fans’ perceptions - A new real-time insight powered by a bespoke solution from IBM, which seeks to highlight matches of particular interest and quality. By providing real-time insights and analysis around breaking match records faster than anyone else, IBM supports Wimbledon in being ‘first to market’ with content, helping Wimbledon rise above the noise of other global media outlets. The use of a new competitive margin metric will help the Wimbledon editorial team to tell the stories of the matches in new ways as play unfolds. This will provide added insight to fans, with the aim of creating not just context, but a connection that will encourage them to delve deeper. IBM SlamTracker with Cognitive Keys to the Match is a cross-platform application that provides real-time scores, stats and insights for all matches in progress. The updated Keys to the Match will be more detailed, including insights such as pace of play, serve placement spread or baseline proximity. The solution also uses Watson APIs to refine and update the player style based on match data. Wimbledon fans will have an unprecedented level of analysis, insight and engagement as the match unfolds, particularly with mobile devices in mind. Real-time data will be integrated from multiple sources including courtside statisticians, chair umpires, radar guns, ball position, player location and even Twitter for social sentiment. At the beginning of a match, fans can launch IBM SlamTracker to check out players’ keys (i.e. the tactics to look out for when two particular players meet head to head) and follow their progress against those keys in real time—point by point. The end result is a richer, more engaging experience before, during, and after each match.
IBM will produce additional insights based on “pressure situations” within a match (down 0-40 in a game, down 2 sets). These insights will show the historical performance for a player when in the specific situation, revealing hidden patterns in player and match dynamics to determine the pressure situations. Leveraging AI to assist the Wimbledon visitor - In 2016, Wimbledon developed the award-winning “My Wimbledon Story” feature in the official apps. The apps have been enhanced this year with the new Watson-enabled bot, ‘Ask Fred.’ IBM Watson is revolutionising the guest experience at Wimbledon, offering a cognitive assistant to answer a variety of questions from fans visiting the event. The new mobile app will serve up information about dining options, features a natural language interface and the app also provides an interactive map of the venue. The new assistant is available to visitors to Wimbledon within the Official Apps for mobile and tablet. New AI-powered automated video highlights for Wimbledon fans - Fans don’t just want to know what happened, they want to see it. The Wimbledon video production team is responsible for serving up all the best highlights from across the tournament for their own digital platforms. New for 2017, AI-powered automated video highlights will be generated using IBM Watson and other video and audio technologies to bring to life the most exciting moments of The Championships from the six main Show Courts. With an average of three matches per court per day, video from the matches can quickly add up to hundreds of hours of footage which could take hours to pull together into highlight packages. The AI system created by IBM Research scientists and IBM iX consultants will auto-curate highlights based on analysis of crowd noise, players’ movements and match data to help simplify the highlight video production process and focus on key moments in the match. This will allow the Wimbledon editorial team to scale and accelerate the video production process for highlight packages and expand the number of potential matches that are turned into timely highlight videos for fans to watch and share. A 360 camera view will also be provided from the practice courts to give fans a unique view of The Championships they have not seen before. All innovations have been designed to help Wimbledon deliver high value content in the moments that matter to connect with fans everywhere. Helping fans understand what it takes to be the greatest of all time - IBM is leveraging all of the data and cognitive capabilities put in place for The Championships to delve into what it really takes to be a great champion at Wimbledon, adding social listening to debate who is the greatest of all time. New for 2017 is ‘What Makes Great’, a solution powered by IBM’s data and cognitive capabilities. This provides fans with a new perspective on greatness from an analysis of 44 years of sports coverage. IBM will use its detailed historical data to identify key tennis performance measures against a set of topics that are collectively required to make a great champion. There is more to being great than simply looking into statistics. Attributes such as a great serve and return, to more emotive drivers such as passion and performing under pressure, helping a fan understand who is the greatest of all time and therefore enhance the fan experience. 
IBM used its Cognitive solutions to conduct deep analysis of unstructured data such as: 53,713,514 tennis data points captured since 1990 6349 newspaper print articles from the Telegraph written during The Championships since 1995 22 years of articles, daily blogs and interviews from web sites (Wimbledon.com and Telegraph since 1995) 10 Wimbledon annuals, interview transcripts and more recently social media commentary that total 11,208,192 words Finding out what makes a champion is a hugely complex undertaking. By comparing quantitative and qualitative data as well as looking at Champions across all five draws, IBM has informed a timeless debate in an entirely new way, one that has not been possible - until now. Wimbledon will use IBM’s cognitive and cloud platform to make its digital channels the most engaging place to experience The Championships and open the eyes of every sports fan to the beauty and craft of the players "in their pursuit of greatness". “We are excited for this year’s developments, yet again improving and developing our digital strategy for fans to make the most of their experience year-on-year. In an increasingly competitive sporting landscape, IBM’s technology innovations are critical to continuing our journey towards a great digital experience that ensures we connect with our fans across the globe – wherever they may be watching and from whatever device that may be,” said Alexandra Willis, Head of Communications, Content & Digital at the AELTC. “With help from IBM, we are providing new on-site features in the SmartPhone apps such as the ‘Ask Fred’ assistant, allowing fans to plan their day at The Championships and make the most of their visit. Similarly, we are working with IBM to access additional insights in order for our fans to truly understand and share the moments that matter. This year, a combination of design and data driven content and insights will provide fans with the unique Wimbledon experience they expect and more.” “At the heart of Wimbledon’s technology are IBM’s cognitive solutions delivering the best insights and analysis possible to the AELTC to encourage great fan engagement,” said Sam Seddon, Wimbledon Client & Programme Executive, IBM. “Cognitive computing is the next revolution in sports technology and working with us, Wimbledon is exposed to the foremost frontier of what technology can do, as we work together to achieve the best possible outcome for the brand and the event. Cognitive is now pervasive from driving the fan experience, to providing efficiency for digital editors to IT operations.” ### Cloud Harmonies: Securing and Safely Sharing Your Data   As Nobel Laureate Bob Dylan sang in his distinctive drawl, “the times they are a-changing’”. Way back when Bob picked up his pencil to write his 1960s classic in long-hand, the cloud was either a visible mass of condensed watery vapour floating in the atmosphere or defined a state or cause of trouble, suspicion or gloom. Fast forward to today’s always-connected world and times are still changing – and constantly. Now, there is a deluge of data flowing into organisations – from smartphones to smart water meters – and cloud computing is a phenomenon transforming businesses. Whether you are a global enterprise or an enterprising entrepreneur, cloud computing brings world-class data centre capabilities to your fingertips. 
Gartner predicts that the worldwide public cloud services market will grow 18% in 2017 to $246.8B up from $209.2B in 2016 and 451 Research expects growth in hosting and cloud services spending to outpace growth in overall IT spending by 25.8% to 12% this year. But concerns about data security and governance are more than a dark cloud on the horizon. Although the cloud promises organisations the flexibility and elasticity to leverage data to enable smarter business decisions, businesses need to trust that the data can be shared securely.  Organisations will lose out on a key benefit of cloud technology if their data is not properly governed and they can’t share data risk-free. Daring to share a precious asset Data is the crown jewels of an organisation. The way data is stored, manipulated, analysed and managed is crucial to being competitive and compliant.  With EU GDPR less than 12 months away and other future regulations demanding greater data transparency on the horizon, enterprises need to be able to aggregate data from disparate sources to get a single, 360-degree view of customers or transactions to ensure compliance and maximise the business value of their data. Adopting a cloud model allows employees, customers, partners and suppliers – as well as the cloud provider’s operations team – access to an organisation’s network and services.  Naturally, security is always of paramount concern, not to mention the need to keep close track of who has access to what information. Given these requirements, cloud security has to be flexible enough to allow some users to access the data, but not necessarily others. [easy-tweet tweet="The data inside the public cloud environment needs to be secure. " hashtags="Data, Cloud"] Public cloud providers are doing a great job of traditional network and operational security, but security requires a multi-layered approach.  Just as protecting the crown jewels requires more extensive security than the hi-tech cameras and security guards keeping a watchful eye on the Tower of London walls, cloud security also needs to go beyond a ‘perimeter strategy’. The data inside the public cloud environment needs to be secure. While the cloud environment may be secure, the data inside that environment may not be, and this is the responsibility of the data owner or custodian - not the cloud provider. For this reason, data-level security inside a public cloud environment becomes as critical, if not more so, than the network security. If an organisation’s database lacks comprehensive, hardened security, it is far more likely to suffer a successful data breach and make the headlines for all the wrong reasons. Before moving any sensitive data to the cloud, enterprises need a database that provides a rigorous level of data security while allowing all of the elasticity and flexibility required to take full advantage of the cloud. For example, an enterprise-hardened NoSQL database wraps layers of security right around the data itself, using advanced encryption, role-based access controls and other security features to mitigate the risk from both insider threats and external hackers. Data governance clouds the issue Data governance casts a huge shadow on businesses, whether in the public cloud or on-premises.  Even when organisations have devoted vast amounts of time gathering data and building data lakes, they can’t leverage their data assets if the data isn’t governed properly. 
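To make the notion of data-level security more concrete, the sketch below shows one of the controls described above, redacting personally identifiable fields according to the caller's role, in a few lines of Python. The record, roles and redaction rules are hypothetical; in practice an enterprise-hardened NoSQL database would enforce equivalent controls (encryption, role-based access, redaction) inside the data layer itself rather than in application code.

```python
# Minimal sketch: role-based redaction of PII before a record is shared.
# The roles, fields and rules here are illustrative only; a hardened
# NoSQL database would apply comparable controls at the data layer.

# Fields that count as personally identifiable information (PII).
PII_FIELDS = {"name", "email", "phone"}

# Roles allowed to see PII in the clear (hypothetical policy).
ROLES_WITH_PII_ACCESS = {"compliance-officer", "account-manager"}


def redact_record(record: dict, role: str) -> dict:
    """Return a copy of `record` with PII masked unless the role allows it."""
    if role in ROLES_WITH_PII_ACCESS:
        return dict(record)
    return {
        key: ("[REDACTED]" if key in PII_FIELDS else value)
        for key, value in record.items()
    }


if __name__ == "__main__":
    customer = {
        "customer_id": "C-1042",
        "name": "Jane Doe",
        "email": "jane@example.com",
        "phone": "+44 20 7946 0000",
        "lifetime_value": 18250.0,
    }
    # A data scientist gets an analytics-safe view; compliance sees it all.
    print(redact_record(customer, "data-scientist"))
    print(redact_record(customer, "compliance-officer"))
```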
Without governance, organisations might unwittingly expose their data crown jewels or violate data regulations. If sensitive information has not been fully redacted - or hidden and transformed - it could lead to a brush with the regulators for inadvertently exposing personally identifiable information (PII) about employees or customers. And, if data lineage and provenance can’t be validated, organisations can’t give data scientists, or testing teams access to the data for analysis as it represents too great a risk. But there is a silver lining to the data governance cloud. Rather than seeing data governance as a weight on a company’s shoulders, it is a way to unlock the value of data and drive business value. By untangling the knots of data currently isolated in numerous silos throughout their organisations, and applying effective metadata management capabilities to their data lakes through a NoSQL database platform, companies can profit from getting their data in better shape. And using flexible database technologies with advanced security built-in, such as tools to easily and quickly redact information, gives organisations the confidence to share their data appropriately without worry/hesitation. Cloud switching: keeping options open The adage about ‘not keeping all of your eggs in one basket’ is one enterprise would do well to remember when it comes to cloud procurement. The march towards hybrid cloud environments has momentum as CIOs accelerate the use of two public cloud services – typically Amazon Web Services (AWS) and Azure - to ensure they are not locked into a single vendor or location. By undertaking all cloud application development using a cloud-neutral database that works across every provider, as well as on-premise, businesses can make the switch if their provider experiences a breach or when an alternative vendor launches a new service or speciality that is more suited to their business needs. The outlook By chasing away doubts about data security and governance, businesses can focus on the bottom line, and profit from all the cloud has to offer regarding flexibility and agility. The cloud revolution will continue to transform business models and working practices. And with streams of his songs jumping 512 percent globally after his 2016 Nobel Prize in Literature was announced, Bob Dylan would echo that still “the times are a changing”. ### ES Disrupt: Eversheds Sutherland (International) launches new tech startup product Eversheds Sutherland (International) has announced the launch of an innovation-led tech startup product, ‘Eversheds Sutherland Disrupt’ (ES Disrupt). Unique in the market, ES Disrupt is a web-based bespoke legal service for startups and scaleups. Such businesses have varied and sometimes complex legal requirements but they rarely have ready access to appropriate, tailored legal advice. Through ES Disrupt, startups can ensure their business is legally protected as it scales. Modelled on the ‘one-stop-shop’ approach, ES Disrupt offers a suite of legal documents which can be tailored to meet individual startups’ requirements. Everything from funding and company formations, to hiring, corporate compliance and trading contracts can be requested via the ES Disrupt website. ES Disrupt is the brainchild of trainee Jocey Nelson. Prior to starting her training contract with Eversheds Sutherland, Jocey worked for a tech incubator and experienced first-hand how receiving no or inadequate legal advice can materially impact the chance of success for a startup. 
ES Disrupt was her proposed solution, and was the winning entry in the firm’s ‘CEO Innovation Challenge’ in 2016. The firm’s TMT group has considerable experience supporting the tech and startup community, and this in-depth understanding of their legal needs has shaped and informed ES Disrupt.  A startup for startups, ES Disrupt was wholly funded and developed by Eversheds Sutherland. Lee Ranson, Chief Executive Officer, Eversheds Sutherland (International), comments: “Cultivating innovation and keeping it at the heart of our business has long been part of our strategy. We encourage everyone to think of innovation as a core part of their role, as Jocey has clearly demonstrated by having a great idea and the drive to make it happen. Congratulations to Jocey and all the team who have brought ES Disrupt to market.” Charlotte Walker-Osborn, Partner and Head of the TMT sector, Eversheds Sutherland (International), comments: “We are passionate about the work we do with the tech startup and scale up communities, actively immersing ourselves to ensure we really are providing our clients with the right level of support for them to succeed and to have tailored and easy to use documentation. “The team has, over many years, enjoyed supporting and mentoring startups and scaleups at all stages, working closely with investors and providing in-depth legal support and advice from both the customer and supplier perspective. By purposefully pooling that knowledge and experience, we have created our own tech and early stage ecosystem, and from there, the perfect platform for ES Disrupt. “Now, thanks to the backing and investment by the firm, Jocey and the team have been able to create a tangible solution that will give startup and early stage clients a much more stable basis from which to operate and grow.” Jocey Nelson, adds: “Having the opportunity to be part of such a unique journey has been a fantastic experience. The hard work has been worth it and the support from the firm, incredible. We enjoy a culture that cultivates an environment in which you’re actively encouraged to think outside the box and ES Disrupt epitomises exactly that. I’m very much looking forward to continuing this journey by supporting our clients through the use of this exciting new product.” Earlier this year, Eversheds Sutherland was named winner of ‘Innovation in Legal Expertise’ at the prestigious FT Innovative Lawyers Awards. The firm is ranked number eight in the FT50, the list of the most innovative law firms in Europe. For more information on Eversheds Sutherland, visit eversheds-sutherland.com ### Fujitsu Develops Virtual Machine Control Technology to Improve Server Density of Datacenter Racks Fujitsu Laboratories Ltd. announces the development of virtual machine (VM) control technology that improves server rack mounting density for each datacenter rack. Currently, in datacenters, the number of servers mounted to a rack is limited by the total rated electrical power of the servers, which must be less than the power supply of the rack. However, often times, server load is low, at approximately 10% to 50%, and power usage for each rack proportional to load is also low compared to the rated power, leading to racks with low server mounting density. 
Now, in order to increase server mounting density, Fujitsu Laboratories has developed technology that enables efficient server placement by setting up a partition of backup servers in the datacenter and migrating VMs to that backup partition based on the physical distribution and power consumption of the virtual machines. With this technology, datacenters can reduce their space usage by increasing the mounting density of racks running virtual servers. Fujitsu Laboratories has calculated that where server rack operating efficiency(1) was increased to 90%, space usage could be reduced by 40%. Details of this technology were presented at IEEE Cloud 2017, an international conference held in Honolulu, U.S., from June 25-30. Development Background As the number of servers installed in datacenters continues to rise, and is expected to rise further with the growth of AI and IoT systems, there is demand for ways to increase server mounting density within the bounds of limited space and power. Issues In datacenters, when installing servers in a rack, the number of servers is decided based on the rated power usage of each server, so as not to exceed the rack's power supply. In practice, however, server load is often low, at approximately 10% to 50%, and because rack power usage is roughly proportional to load, it is also low compared to the rated power. This gap creates scope to raise mounting density. In response, a "power capping" technology has been developed in recent years that monitors the actual operational status of servers in racks mounting more servers than their power supplies would nominally allow. If the power supply is about to be exceeded, the technology limits the operating frequency of servers to suppress power consumption. This technology, however, cannot be used for applications that require a specified level of processing performance, such as mission-critical applications. About the Newly Developed Technology Fujitsu Laboratories has now developed a VM control algorithm that first installs physical servers in racks at high density and then establishes a backup partition, using VM migration to move virtual machines depending on the power consumption of each server. In this way it prevents each rack from exceeding its power supply limit. Features of the newly developed technology are as follows: 1. VM control technology based on physical distribution Virtual datacenters built with VM management software have physical servers split into logical management units called clusters. The physical servers within a cluster can be mounted in different racks, with no physical limitations. VMs are automatically migrated between clusters when a physical server stops operating due to a fault or undergoes maintenance; the actual physical layout of the servers, however, is not always taken into account. Fujitsu Laboratories has now developed a technology that prevents racks from exceeding their power supplies. Using an API available in normal datacenter management, the technology builds a database describing the physical distribution of the servers that provide services (the operational partition), as well as the physical distribution of the backup partition to which VMs are migrated as a rack approaches its power supply limit.
Then, the continually changing power usage volume collected from the servers will be linked to their serial number and rack number (Figures 1 and 2). This technology enables datacenter operators to increase the mounting density in the operational partitions, reducing space usage. http://www.acnnewswire.com/topimg/Low_FujitsuServer62717Fig1.jpg Figure 1: Increasing server mounting density in a datacenter http://www.acnnewswire.com/topimg/Low_FujitsuServer62717Fig2.jpg Figure 2: VM control technology based on physical distribution 2. Technology to set installation rules based on actual operational data Frequent migrations will occur if server mounting density is increased exceedingly, so an appropriate balance is necessary between migration frequency and server load commensurate with power volume. Fujitsu Laboratories has now developed a technology to determine the number of servers that should be mounted on a rack by statistically estimating the frequency of migrations for each rack, based on the pre-measured load per server, assuming that changes in the load follow the normal distribution (Figure 3). If the center of the normal distribution is a server load of 30%, for example, by installing servers based on a power usage value for 50% load, this would account for 95.5% of changes in load, enabling a maximization of server mounting density. http://www.acnnewswire.com/topimg/Low_FujitsuServer62717Fig3.jpg Figure 3: Technology for setting rules for server installation Effects By applying this technology, Fujitsu Laboratories expects that it will be possible to increase the mounting density of servers in datacenters, significantly reducing installation space. By using this newly developed technology, for example in a cloud service where changes in load can be predicted with a normal distribution, a datacenter operating with ten partitions at a server rack operating efficiency of 50% would be able to reduce space usage by 40%, with now five partitions and a server operating efficiency of 90%. Future Plans Fujitsu Laboratories plans to incorporate this technology into Fujitsu Software ServerView Infrastructure Manager, Fujitsu Limited's infrastructure software for operations and management, within fiscal 2018. (1) Server rack operating efficiency The ratio of the total power consumption of the servers to the power supply of the rack. About Fujitsu Laboratories Founded in 1968 as a wholly owned subsidiary of Fujitsu Limited, Fujitsu Laboratories Ltd. is one of the premier research centers in the world. With a global network of laboratories in Japan, China, the United States and Europe, the organization conducts a wide range of basic and applied research in the areas of Next-generation Services, Computer Servers, Networks, Electronic Devices and Advanced Materials. For more information, please see: http://www.fujitsu.com/jp/group/labs/en/. About Fujitsu Ltd Fujitsu is the leading Japanese information and communication technology (ICT) company, offering a full range of technology products, solutions, and services. Approximately 155,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE: 6702) reported consolidated revenues of 4.5 trillion yen (US$40 billion) for the fiscal year ended March 31, 2017. For more information, please see http://www.fujitsu.com. 
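Before moving on, here is a back-of-the-envelope illustration of the sizing rule described above under "Technology to set installation rules", together with the migration trigger that protects the rack's power supply. All figures (rack supply, rated server power, load statistics) are hypothetical and not taken from Fujitsu's implementation; they simply mirror the worked example of a 30% mean load provisioned at the 50% point so that roughly 95.5% of load variation is covered.

```python
# Back-of-the-envelope sketch of the rack-sizing rule described above.
# Assumptions (not Fujitsu's actual parameters): power draw scales
# linearly with load, mean load is 30% with a standard deviation of 10%,
# so provisioning at the mean + 2 sigma point (50% load) covers ~95.5%
# of load variation.
import math

RACK_POWER_SUPPLY_W = 10_000   # hypothetical rack power budget
RATED_SERVER_POWER_W = 500     # hypothetical rated (100% load) draw per server
MEAN_LOAD = 0.30
LOAD_SIGMA = 0.10


def servers_per_rack(provisioning_load: float) -> int:
    """How many servers fit if each is budgeted at the given load point."""
    per_server_budget = RATED_SERVER_POWER_W * provisioning_load
    return math.floor(RACK_POWER_SUPPLY_W / per_server_budget)


def needs_migration(measured_rack_power_w: float, headroom: float = 0.95) -> bool:
    """Trigger a VM migration to the backup partition near the supply limit."""
    return measured_rack_power_w > headroom * RACK_POWER_SUPPLY_W


if __name__ == "__main__":
    conservative = servers_per_rack(1.0)                        # rated-power sizing
    statistical = servers_per_rack(MEAN_LOAD + 2 * LOAD_SIGMA)  # ~95.5% coverage
    print(f"Rated-power sizing:    {conservative} servers per rack")
    print(f"Mean + 2 sigma sizing: {statistical} servers per rack")
    print("Migrate a VM now?", needs_migration(9_700))
```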
### OSS PaaS Rundown: Comparing Dokku, Flynn, and Deis Workflow Open source software is currently enjoying a renaissance. For every major platform out there, there is at least one open source alternative. Heroku made Platform-as-a-Service (PaaS) popular and is still one of the best PaaS providers around, with hundreds of integrations to choose from. But Heroku can cost a lot more than you expect as your application grows in scale. Some popular OSS alternatives are compared below: Dokku Dokku presents itself as a "Docker powered mini-Heroku." It is the smallest and simplest of the options reviewed here. It comes with a web-based setup (after the initial installation) or can be set up in unattended mode, which suits deployment scripts and CI/CD. Heroku-compatible applications can be pushed to it via Git, and it builds them using Heroku buildpacks. The main downside is that Dokku is limited to a single host. While it works for small applications such as side projects, the lack of horizontal scalability makes it unsuitable for larger applications. Dokku also cannot provide high availability, because a single server means a single point of failure. This is where Flynn comes in. Flynn Large applications are increasingly moving from monolithic designs towards a microservices architecture. Services are divided by functionality and are often part of processing pipelines; there are even services that exist to support other services, such as an aggregator that consolidates analytics from various sources. Microservices give developers the ability to scale parts of their application independently of each other according to load. Dokku is unsuitable for this kind of application, which requires a platform that can support multi-cluster deployments. Flynn was built with high availability and scalability in mind. It can run on a single server or be scaled up to multiple nodes. Like Dokku, Flynn follows the same Heroku-like workflow: applications are deployed using Git and built using buildpacks. Flynn's own components also run inside the cluster as highly available Flynn apps. Flynn includes built-in databases that run on the cluster; PostgreSQL, MongoDB, MySQL or Redis can be initialized with a single click, and it also provides console access to the CLI clients of these databases. Flynn provides a web-based dashboard for monitoring and administering the cluster, which also shows aggregated logs from all the nodes. HTTP, HTTPS and TCP load balancing is built in and automatically configured. Flynn also provides overlay networking for scaling applications and built-in service discovery, so you do not need to configure a service such as Consul yourself. Both Flynn and Dokku support Twelve-factor applications, making all Dokku applications compatible with Flynn (a minimal Twelve-factor example follows below). Flynn supports external plugins to extend it and is built on an Ubuntu base. All in all, Flynn is a solid platform for developing scalable applications in a Heroku-like environment on your own servers, and it can be seen as a successor to Dokku. Deis Workflow Deis Workflow is built on top of the battle-tested Kubernetes. It adds an easy-to-use layer on top of a Kubernetes cluster to make application deployments easier. Of the three platforms discussed here, Deis Workflow is the most popular choice for large and complex applications.
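All three platforms expect Twelve-factor style applications, in which configuration such as the listening port and backing-service credentials comes from the environment rather than being baked into the code. The minimal sketch below illustrates that convention; the variable names follow common Heroku-style defaults, but the service itself is purely illustrative.

```python
# Minimal Twelve-factor style service: configuration is read from the
# environment (as Heroku-compatible buildpacks expect), not hard-coded.
# The endpoint and response are illustrative only.
import os
from http.server import BaseHTTPRequestHandler, HTTPServer

# PORT is injected by the platform; DATABASE_URL points at an attached
# backing service (e.g. a PostgreSQL instance provisioned by the PaaS).
PORT = int(os.environ.get("PORT", 5000))
DATABASE_URL = os.environ.get("DATABASE_URL", "postgres://localhost/dev")


class Handler(BaseHTTPRequestHandler):
    def do_GET(self):
        body = f"hello from a twelve-factor app (db: {DATABASE_URL})\n".encode()
        self.send_response(200)
        self.send_header("Content-Type", "text/plain")
        self.end_headers()
        self.wfile.write(body)


if __name__ == "__main__":
    # Bind to all interfaces on the platform-assigned port.
    HTTPServer(("0.0.0.0", PORT), Handler).serve_forever()
```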
The platform is delivered as a set of Kubernetes microservices. The platform services and the applications they manage run in separate namespaces to keep the workloads apart. Deis Workflow can deploy new versions of an application without any downtime using its services and replication controllers. As with Dokku and Flynn, it supports deployment via Git, and it can be controlled via the CLI client or the built-in REST API. It also includes an edge router to enforce firewall policy in the cluster. All code pushes, config changes and scaling events are tracked, and Deis Workflow makes it easy to roll back to any previous version with a simple API call. Additional workloads not managed by Workflow can be added using the underlying Kubernetes service discovery. Using Deis Workflow requires initializing a Kubernetes cluster, which makes it a little less beginner-friendly - although Google Cloud Platform, Amazon AWS and Azure Container Service all provide easy-to-use managed Kubernetes set-ups. Once the cluster has been initialized, Workflow can be installed using Deis's Helm package manager. Deis Workflow also follows the principles of Twelve-factor applications: it either uses a buildpack or builds a new Docker image if a Dockerfile is found. Workflow is built for scalability. It runs on a CoreOS base and, for distributed coordination and consensus, relies on Fleet and etcd from the underlying CoreOS. Logs and metrics can be drained to any supported sink, and Deis Workflow provides built-in alerting based on predefined thresholds; alerts can be sent to a Slack channel, PagerDuty, a custom webhook or email. Other notable mentions include PaaS options such as Cloud Foundry (with BOSH) and OpenStack Solum. Choosing the one that best fits your application can be a strenuous task. The best way is to try each of the platforms, starting with Dokku, which is the simplest. If the application requires more flexibility and scalability, it is better to move on to Flynn at the cost of some added complexity. DevOps engineers who are already familiar with Kubernetes will find the transition to Deis Workflow very smooth. ### Comms Business Live Episode: Digital Transformation In this episode of Comms Business Live, the discussion focussed on Digital Transformation and guiding your customers through that journey. The host David Dungay was joined by Carl Churchill, MD at Netpay, Dan Cunliffe, MD at Pangea, and Nathan Marke, CDO at Daisy Group. Don't forget about Channel Live, 12th - 13th September. ### Comms Business Live Episode: SD-WAN In episode 3 of Comms Business Live, the discussion focussed on SD-WAN and whether the channel is grasping the opportunity. The host David Dungay was joined by Nick Sacke, Head of IoT at Comms365, Simon Pamplin, EMEA Technical Sales Director at Silver Peak, and Jim Norris from Nuvias. Don't forget about Channel Live, 12th - 13th September. ### Comms Business Live: Episode 2 Comms Business is working with Disruptive Tech TV (a 24/7 online streaming channel run by Compare the Cloud) to bring you live streams on current Channel trends. The next stream is on Monday 26th June at 3:30pm and will focus on Digital Transformation and how you can guide your customers through it. Speakers include: Dan Cunliffe – Pangea, David Dungay – Comms Business, Carl Churchill – Netpay, Nathan Marke – Daisy Group.
### Moving to the Public Cloud is the New Norm for Businesses Recent research from Gartner predicts that the worldwide public cloud services market will grow 18% in 2017, to a total of $246.8 billion. This is expected to continue to increase far beyond 2020. Many organisations across the world have already adopted the public cloud, on the basis that it offers a more secure computing environment than many previously had, and at a fraction of the cost. Indeed, public cloud has reached a tipping point: organisations have now put their trust in it in sufficient volume for concerns around compliance and practical application to now be mitigated, in turn, driving yet greater peer adoption. Despite all this, there is still some reticence. These range from security concerns and issues around data sovereignty to questions around control and third-party reliance, leading some even to assume that a hybrid public-private approach is the best way forward. Just recently NSA whistle-blower Edward Snowden became the latest industry figure to take aim at the public cloud, accusing the technology of being “disempowering”. But nonetheless, the volume of adoption means that the question appears to have evolved from “why use the public cloud?” to “why not use the public cloud?”. After all, on closer inspection, the arguments against public cloud do not always stack up. Security Many organisations cite security as one of the main reasons to eschew public clouds, where applications and data are hosted on dedicated servers within a cloud provider’s data centre, accessible only via private connections. [easy-tweet tweet="We need to cast off the excuses not to rely on public cloud" hashtags="PublicCloud, Hybrid"] But this is outdated. Governmental departments are now using Amazon Web Service (AWS) data centres, meaning enterprise security concerns around public cloud adoption have in fact diminished. A prime example of trust in the public cloud comes from the Government Digital Service, which has stated that it is a perfectly safe platform for data. This is not to belittle the gravity of security considerations. Rather, public cloud service providers, like AWS and Microsoft Azure, are fully aware of the dangers and are constantly investing in developing and enhancing their security capabilities to stay one step ahead of today’s constantly shifting security landscape and protect customer data and applications. Even to the point, ironically, that the investment and security resilience outweighs that of most private datacentre set-ups. Moreover, Gartner predicts that by 2018, better security will in fact be the main reason why government agencies decide to use the public cloud. Data sovereignty Issues around data sovereignty are another potential impediment to public cloud adoption. The concern is that the data the company is storing will be subject to the legislation of the country in which it is being stored. However, in Britain the post-Brexit era will see an increase in British-based data centres, with the government’s Public Services Network (PSN) weakening the data sovereignty argument against public cloud by demonstrating its confidence in the technology. This will also reduce latency and help to open the doors to new classes of organisations who are in need of tangible operational savings, but were previously locked out of using the public cloud. 
Control Another argument against the public cloud is that organisations like to stay in control of their infrastructure and services, and do not want to be reliant on third-parties. A lack of training and early engagement guidance on best practice is therefore cited by businesses as a weakness of the public cloud. But government procurement initiatives are beginning to break down large contracts. This presents the opportunity to ease large legacy supplier lock-in, allowing small and medium suppliers to actively compete to be able to deliver on these challenges using the public cloud. Smaller players on smaller contracts mean that suppliers will start behaving more like systems integrators, working alongside customers to offer consultancy on best practice public cloud implementation. A hybrid cloud instead? While a hybrid cloud may seem to be the best of both worlds, it introduces more problems than it solves. Private clouds, while inherently providing more control, are more expensive to build and maintain than the public cloud, and do not provide the same levels of scalability and flexibility. A hybrid environment, to mitigate perceived weaknesses of the public cloud, actually does little more than introducing an additional level of complexity and management challenges, which is simply unnecessary. Hybrid cloud is simply too complex to manage without constant training and outside help, which brings us to the main reason that hybrid cloud is not a viable solution: cost. In simple maths, adopting a hybrid cloud means adding the costs of private cloud, plus public cloud, plus highly skilled staff to manage the intricacy. And even after all this, the security of the private – not public - part of the cloud remains a concern. Furthermore, compatibility is a major issue when building a hybrid cloud and with dual levels of infrastructure, a private cloud that the company controls and a public one that the company doesn’t, the chances are that they will be running different stacks. We need to cast off the excuses not to rely on the public cloud. With governmental backing, high adoption, continuous investment and low costs, the public cloud is already the best approach for your business. Besides, many of the reasons holding back the public cloud are not actually about the technology itself, rather the negative mindset around its adoption. ### Inherent Advantages Drive Cloud Backup Push Cloud backup strategies and techniques are nothing new to the world of data management. Without belabouring that obvious fact, the world’s increased movement to cloud-based data solutions for every element of data-intensive activity is having a real effect on the increasing shift to cloud backup solutions. The forces converging to drive every kind of enterprise and organisation into cloud resources are similarly motivating the movement to cloud backup specifically. Certain cloud backup advantages are also influencing IT managers to integrate cloud backup capabilities into hybrid IT implementations. [easy-tweet tweet="Low-cost, open-source technology is available; proprietary technology is under pressure" hashtags="Technology, Cloud"] Cloud Efficiency Effects An Inc. magazine article from last summer outlines three reasons that define how cloud solutions drive efficiency and how they’ll shape enterprise IT going forward: Innovation—Cloud is the computing architecture of the future. In three years, Intuit predicts 80% of all small-business computing will live in the cloud. 
Cloud technology is driving open-source innovation ahead faster than conventional server solutions. Competition among cloud providers is bringing previously high-end technology to every segment. Innovation such as data reduction was once only available in top-shelf storage systems. Now, software-defined data centre (SDDC) technology brings advanced features to everyman platforms. Agility—Successful companies innovate faster and more efficiently. A cloud strategy delivers the lowest cost, most scalable IT infrastructure. By moving to cloud solutions, companies can focus on core strengths and competitive advantages rather than infrastructure investments. The cloud helps you manage remote offices and scale operations seamlessly. Cost Optimization—The emergence of a public cloud architecture created a framework that was the very definition of a system built to optimise utilisation and maximise resource efficiency. But now that innovation is available for hybrid cloud solutions. Low-cost, open-source technology is available; expensive, proprietary technology is under pressure.   Why Cloud Backup? Without mission critical data, whether that is financial or technical, a company is lost. It’s essential to have modern and dependable systems in place to protect essential business information. Sophisticated cloud backup solutions have emerged as secure and cost-effective data protection options for businesses of all sizes. Six common cloud backup advantages follow that illustrate how this approach can benefit IT orgs: Lower total cost of ownership (TCO) — Cloud backup technology leverages the cloud’s economy of scale and the replacement of on-premises equipment. Cloud backup can integrate into existing IT infrastructure for complete protection of all data, securely transmitting encrypted backups to the cloud. The benefit? A lower TCO compared with owned, onsite strategies. Reliability and recovery speed — Modern hybrid-cloud, data-protection solutions can instantly bring backup data to life as cloud virtual machines. Managed service providers can help establish recovery time objectives and recovery point objectives that match business requirements. Data security — Modern backup vendors perform data encryption before the data leaves the device. A unique encryption key generated by the backup software stays with the user, preventing unauthorised access to data even if the system is breached. Protection for cloud data — Conventional data protection packages do not protect cloud-based workloads such as Microsoft Office 365. As more businesses adopt cloud-based services for their ease of use and pay-as-you-go licensing, protecting data in the cloud will become a standard requirement for any business. Protection against ransomware — Encrypted backup data stored in the cloud are less likely to be susceptible to ransomware attacks. Even though cyber criminals are trying to attack backups, backed-up data stored in the cloud with reputable vendors are protected against such attacks. When on-premises systems are hijacked by ransomware, cloud data may end up being the only lifeline available. The trend in data protection and backup is into the cloud. The benefits are real, as are the threats to data integrity from external (and sometimes internal) breaches and system failures. The best backup solution is one that you do not worry about.  With the wall of data, evolving threats, and emerging AI-based computing landscape, it’s essential to deploy backup solutions that meet needs now and scale as needed.  
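The data-security advantage above rests on client-side encryption: data is encrypted before it leaves the device, and the key stays with the user. The sketch below illustrates that pattern; it requires the third-party `cryptography` package, and the file names, key handling and upload step are placeholders rather than any particular vendor's implementation.

```python
# Minimal sketch of client-side ("encrypt before it leaves the device")
# backup, as described in the data-security point above. Requires the
# third-party `cryptography` package; paths and the upload step are
# placeholders, not any specific backup product's workflow.
from pathlib import Path
from cryptography.fernet import Fernet

KEY_FILE = Path("backup.key")  # the encryption key stays with the user


def load_or_create_key() -> bytes:
    """Generate an encryption key once and keep it locally."""
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)
    return key


def encrypt_for_backup(path: Path) -> bytes:
    """Encrypt a file locally; only ciphertext ever leaves the machine."""
    return Fernet(load_or_create_key()).encrypt(path.read_bytes())


if __name__ == "__main__":
    sample = Path("backup-demo.txt")
    sample.write_text("minutes of the June board meeting")  # demo data
    ciphertext = encrypt_for_backup(sample)
    # upload_to_cloud(ciphertext) would go here; the provider only ever
    # stores ciphertext, so a breach of the backup store exposes nothing
    # readable without the locally held key.
    print(f"Encrypted payload of {len(ciphertext)} bytes ready to upload")
```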
The best solutions don't just offer cloud backup as an option but anticipate and address emerging needs. That means they can accommodate hybrid cloud and hybrid IT environments today with a complete, secure backup capability, one with advanced security measures, such as ransomware protection, integrated into the solution framework. ### Technology and the Law Ep 6: Ask the Experts In the sixth and final episode of Technology and The Law, cloud lawyer Frank Jennings was joined by Abhas Ricky from Hortonworks and Adam Nash from Webroot. This episode saw the discussion of Digital Business. ### Taking Every Step to Secure the Cloud Digital transformation is a key priority for enterprises all over the world, but most IT decision makers have not completed the technology deployments needed to address the initiatives that are critical to digital business. Digital transformation at its core is about improving the customer experience by modernising tools and processes within an organisation. One common digital transformation initiative is moving identity management infrastructure to the cloud. Moving this infrastructure to the cloud has many benefits, including elastic scalability, a reduction in Total Cost of Ownership, and co-location with cloud-based applications. Every day, more and more enterprises of all sizes are finding value in moving to a cloud-based infrastructure, especially to help development teams collaborate on solving complex challenges. But before you move your entire infrastructure to the cloud, you will have to make sure it is secure. Obviously, you will put firewalls in place, encrypt data and monitor it to ensure it is secure - but that skips the first step. So, what is the first step? Deploying the Access Control Layer in the Cloud One key benefit the cloud has brought to the industry is the ability to self-provision capacity, and access control is no different. Using the one-click experience for images on the Amazon Web Services (AWS) Marketplace, you can get started with dynamic policy management in minutes. Moving to an operating expense model for budgeting also lowers risk compared with what has traditionally been a capital expense world of on-premise software deployments. Rather than putting all of the investment up front with little predictability about what services and integration will cost, an organisation can model its policies and integrations in the cloud and pay only for what it uses at development scale. Access control services must be as fault tolerant as, or better than, the services they protect. You can leverage the infrastructure or platform-as-a-service provider's capabilities to provide auto-scaling and elasticity for access control: when demand for your access control service exceeds your current capacity or your Service Level Agreement, additional nodes can be added seamlessly. Attribute-Based Access Control (ABAC) Identity and access management technology, such as Attribute-Based Access Control (ABAC), is a key enabler of secure cloud experiences. ABAC helps enterprises deliver more personal, convenient and trusted mobile experiences to customers, employees and partners while enabling secure access to apps and data in the cloud.
Axiomatics' dynamic authorization solution provides a Policy Administration Point that features a multi-tenant capability that can be leveraged in your private cloud to service multiple departments independently and consolidate global policies where appropriate. Axiomatics also delivers a stateless Policy Decision Point (PDP) that can scale predictably and be deployed in a hybrid model where some PDPs can be on-premise to service legacy while cloud hosted PDPs can service anything from Infrastructure (IaaS) to Software (SaaS). It is not a coincidence that the emergence of APIs and microservices as the building blocks for application development have emerged alongside Cloud. To leverage the flexible deployment benefits of cloud models, one should have loose coupling and high cohesion of autonomous, domain-specific services. Access control is no different. Externalizing token issuance, policy decision, multi-factor authentication, user profile and risk defeat the tyranny of the layered architecture and the monolith. Axiomatics provides a lightweight REST/JSON protocol for accessing its decision services. Establishing the access control services and the governance and training for using them is a key first step to moving to the cloud. Securing Cloud Services Standards play a key role in the fabric of trust between cloud providers. To trust these services for your core business functions, the contracts with the consumer must be maintained. SCIM, OpenID Connect, and XACML form some of the important identity, and access formats and definitions that cloud providers are relying on to implement externalisation of these key services allowing access control vendors to innovate without worrying about redefining how the services are to be consumed with each version. Not every organisation wants to model its access control policies the same way. Axiomatics provides the flexibility to use Attribute-Based (ABAC), Role-Based (RBAC) or Risk-Based (RiBAC) Access Control models, usually in conjunction. Leveraging contextual data and comparing characteristics of the protected resources with the attributes of the identity will make your policies more resilient and keep the number of roles and groups from exploding. Enforcement of this dynamic policy often happens at the API Gateway that virtualizes the services from the cloud provider. The Gateway is strategically placed to incorporate the attributes of the client and the identity of the consumer with metadata of the resource being accessed to formulate the question:"Can the consumer perform this operation on this resource under these conditions?" The Policy Decision Point can perform additional lookups as needed to formulate an access decision or an expression to determine which data should be returned to the client and which data should be filtered, masked or redacted. An example of a gateway would be Amazon API Gateway - or Apigee API Gateway. Summary There are key drivers to organisations move to cloud deployment: ● Digital Transformation ● Sharing data outside of organisational boundaries ● Tiered subscription and personalization ● Privacy and protection of sensitive data One of the key first steps in achieving the nimble and robust capabilities of cloud deployment is to establish the access control services required to support these drivers. Dynamic authorization is critical to leveraging contextual attributes for real-time policy decisions. API Gateways provide high cohesion of policy enforcement of your business services deployed in cloud providers. 
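The gateway's question above, "Can the consumer perform this operation on this resource under these conditions?", maps naturally onto an attribute-based decision request. The sketch below shows the shape of such a request and a toy in-process policy decision point; the attribute names and the single rule are invented for illustration and do not represent the XACML JSON profile or any vendor's API.

```python
# Toy Policy Decision Point (PDP) sketch for the gateway question:
# "Can the consumer perform this operation on this resource under these
# conditions?" Attribute names and the single rule are illustrative only;
# a real deployment would send a standards-based request to an external
# PDP rather than deciding in-process.

def decide(subject: dict, action: str, resource: dict, environment: dict) -> str:
    """Return "Permit" or "Deny" by comparing subject, resource and context."""
    # Hypothetical rule: clinicians may read records for their own
    # department, but only from inside the corporate network.
    if (
        action == "read"
        and subject.get("role") == "clinician"
        and subject.get("department") == resource.get("department")
        and environment.get("network") == "corporate"
    ):
        return "Permit"
    return "Deny"


if __name__ == "__main__":
    request = {
        "subject": {"id": "u123", "role": "clinician", "department": "oncology"},
        "action": "read",
        "resource": {"type": "patient-record", "department": "oncology"},
        "environment": {"network": "corporate", "time": "14:05"},
    }
    print(decide(request["subject"], request["action"],
                 request["resource"], request["environment"]))  # Permit
```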
As more organisations migrate to the cloud, the need to address complex authorization use cases for cloud-based resources is only going to grow. ABAC can help provide real- time dynamic authorization to the cloud, all while providing optional tools for auditing, reporting and policy testing. ### Hackers Won’t Stop Us From Cloud Adoption, Say 80% of Businesses Advanced report shows cloud adoption isn’t slowing – but cloud providers must build more confidence among adopters 22 June 2017 – Businesses are pressing ahead with their digital transformation plans, despite fears of being hit by a cyber attack or data protection regulations. This is according to a new independent research report from Advanced, which questioned over 500 senior executives in UK organisations about their attitudes to using the cloud as part of their digital transformation plans. Most organisations surveyed are concerned about security (82%) and data protection (68%) in the cloud but, perhaps surprisingly, 80% of them are not put off from adopting the cloud following recent high-profile cyber attacks such as WannaCry. A third (33%) of organisations admit to being experienced in the cloud and continue to consider it for all new projects, while 37% have recently launched cloud computing projects for the first time. [easy-tweet tweet="76% say that governments should do more to protect businesses and their customers from a cyber attack" hashtags="Cyber-Attack, Security"] Although positive, these findings should not negate the common concerns and challenges. The survey also found that businesses want better support if they are to execute their digital transformation plans effectively. Security is the biggest barrier, with 76% saying that governments should do more to protect businesses and their customers from a cyber attack. Meanwhile, 82% of organisations want to see cloud providers do more to build confidence among those looking to adopt a digital transformation strategy, of which the cloud is fundamental. When asked what they look for in a provider, most say financial stability (69%), data held in a UK location (65%) and local support (58%) – above typical benefits touted by providers including scalability (46%) and the breadth of application offerings (38%). Jon Wrennall, CTO at Advanced, says: “It’s encouraging to see businesses are undeterred from using the cloud, which is fast becoming the right choice for many to drive efficiencies, innovate and grow. Sadly we are seeing the same concerns around security and data protection reported over and over again. It’s right to be concerned about security; it’s time that all of us as cloud services providers take a reality check. “As an industry and profession, we all need to proactively give clear guidance on security responsibilities and support organisations in being better protected, ensuring devices and applications are properly patched and secured – those writing the software are clearly best placed to provide this. With General Data Protection Regulation (GDPR) coming into force next year we also have a duty of care to provide clarity on how data is being stored and secured in the cloud. “There’s still a job to be done in creating trust in the cloud and helping customers use the cloud in the right way for the digital transformation that’s right for them. Our survey shows most organisations want financially stable providers and prefer those that store data locally and offer local support; this will become even more pertinent as Britain leaves the European Union. 
They will trust the providers that offer certainty in an uncertain market and those with a vested interest in the UK and the cloud.” The independent research was carried out following the results of the general election, during week commencing 12thJune. Over 500 participants took part in the survey, which was carried out by Techmarketview. The full report can be found here: www.oneadvanced.com/cloud. Additional quotes: Tom Thackray, Director of Innovation at CBI, said: “Digital technologies offer businesses the length and breadth of the UK the tools and platforms needed to start, scale and reach a global audience. Technologies like cloud now underpin much of the UK business infrastructure and there is a clear intent from companies to keep up with the pace of change. But with great digital opportunities comes an element of risk – companies must ensure cyber security is a boardroom priority and work closely with suppliers and customers to remain cyber resilient.” Sue Daley, Head of Cloud, Data, Analytics and Artificial Intelligence at techUK, said: “It is good to see that businesses understand the economic and efficiency benefits of cloud computing and confidence continues to grow. However, it is vital that businesses, and consumers, are better informed around the security of cloud services. Cloud computing can offer much greater levels of security and resilience required by users. We, the industry, must all work together to inform users of the robust security that underpins cloud technology, and how they can feel confident in choosing the technology for their businesses. That is why techUK produced a series of papers earlier this year, Building Trust in the Security of Cloud, including practical advice for those considering the cloud.” ### Wipro Collaborates with Red Hat for Cloud Application Factory  Wipro Limited (NYSE: WIT, BSE: 507685, NSE: WIPRO), a leading global information technology, consulting and business process services company, today announced a collaboration with Red Hat, the world’s leading provider of open source solutions, to set up a cloud application factory designed to offer developers and IT teams a repeatable and rapid methodology for application modernization across public, private, and hybrid clouds. Wipro’s cloud application factory will have a dedicated services team that can help drive the strategy, design, and delivery of next generation applications globally, using Red Hat OpenShift Container Platform, Red Hat’s award-winning container application platform. Enterprises often not just need to support legacy applications in a cost efficient manner, but also increasingly need to quickly develop new cloud-based applications that enable innovation and can drive business process transformation. Red Hat and Wipro recognise this challenge and have built a joint offering designed to address both requirements. The Red Hat OpenShift Container Platform is the industry’s most comprehensive platform for cloud-native application development. As a flexible development platform, OpenShift spans across legacy and cloud environments so that customers can develop cloud-native applications without having to rewrite their legacy applications. Traditional applications can coexist alongside new, cloud-native and container-based applications. 
[easy-tweet tweet="Wipro and Red Hat will provide a scalable factory approach to developing next-generation applications" hashtags="IT, Cloud"] In collaboration with Red Hat, Wipro provides dedicated IT consulting and services teams to help customers with application lifecycle management, via specific roadmaps, frameworks, and best practices. From the initial discovery phase to the strategy and design phase, and ultimately when customers are ready to build a production environment, Wipro and Red Hat teams will provide a scalable factory approach to developing next-generation applications. The Cloud Application Factory is designed to deliver key benefits, including: A consistent application platform for public, private, and hybrid clouds Reduced time to market for new applications and features A unified methodology developed by dedicated architecture experts Comprehensive interoperability planning of client’s legacy and cloud-native applications The ability to leverage the global open source community to drive innovation Findings from a 2015 Oxford Economics study cite that 63% of IT and business exec respondents reported open source software would be critical to business agility over the following three years. As a result, to help better compete in the digital economy, many organisations are looking to open source solutions to find the flexibility and scalability they expect from the next generation of modern and agile cloud-native applications. As an open source platform, Red Hat OpenShift Container Platform can help reduce the cost, and time associated with developing and modernising enterprise applications. A 2016 IDC study on the business value of OpenShift, found that OpenShift enables customers to respond to market requirements faster by delivering business-critical, microservices-based applications with DevOps processes. These benefits include 66% faster application delivery times, $1.29 million average annual benefits per 100 application developers, and 531% average ROI over five years. “We have strengthened our alliance with Red Hat to build and deploy an effective and scalable team execution methodology to meet joint customer interest, today and in the future,” said Andrew Aitken, General Manager and Global Open Source Practice Leader, Wipro Limited. “It's exciting when considering the potential innovation that customers can enjoy with Wipro's Cloud Application Factory powered by Red Hat OpenShift Container Platform. This is really about helping to streamline customers’ time-to-market by deploying modern applications, in a scalable and repeatable way,” said Julio Tapia, director, OpenShift Partner Ecosystem, Red Hat. ### Look to the Cloud for Communications Services Greater efficiencies reduced OPEX and CAPEX, and streamlined business growth: it’s no wonder companies across all industries are turning to the cloud. Jorn Vercamert, VP, Communications Solutions at BICS, examines the benefits of cloud-based communications solutions, which can deliver a boost to business, quality of service, and customer satisfaction. Ninety-five percent of respondents to a recent survey are using the cloud, with the majority adopting a hybrid approach which utilises both public and private services. The digitalisation of life and work across an array of industries has allowed new start-ups to enter the marketplace and forced incumbents to up their game. 
Developments in technology and connectivity have offered opportunities as well as challenges, requiring businesses to keep up with rapid rates of change. In this competitive climate, a business must be able to operate, grow and deliver on customer expectations easily and affordably, or risk losing out to rivals. Cloud technology and services have been around for some years now, with solutions providers now offering advanced, low-cost tools which are accessible to all businesses. These have allowed companies to take advantage of virtualized environments to store and move data, backup files and information in case of a server crash, and move assets off-site, thus freeing up resources. With the cloud market maturing, the cost savings versus traditional business strategies have become apparent. One report found that 45% of virtualized operating system instances running on-premises would be more economical if they were running in the cloud, offering cost savings of up to 43% annually. Fewer expenses are of course great news for any organisation. For start-ups and small businesses, the financial savings delivered by the cloud enable rapid business growth while ensuring a high quality of customer service is maintained. In addition to moving services, data storage and processing to the cloud, many businesses are now starting to realise the benefits of moving communications to a virtualised environment. Adopting a cloud strategy for communications is particularly important for those companies looking to extend their global reach, allowing them to tap into new markets through borderless, secure and scalable connectivity. [easy-tweet tweet="CTOs with a view to scaling up their business can partner with globally-oriented service providers" hashtags="Cloud, Data"] Cloud communication services reduce the total cost of ownership by minimising investments in physical equipment and infrastructure, while still providing the flexibility to scale the services when required. Cloud adoption and easy-to-integrate APIs from third parties have made a huge range of global communications services, both national and international, more accessible. Forward-looking CTOs with a view to scaling up their business can partner with globally-oriented service providers, which have the expertise and resources to bridge communication services directly to each local in-country provider. These services bring particular benefits to unified communication providers, cloud PBX providers, contact centres, conferencing service providers and digital service providers, which face an additional challenge when looking to scale their business internationally; retaining a local presence and links with their customers. Businesses can take advantage of – for example, local, toll-free numbers – provided by cloud-based service, which allows customers, end users and their businesses to be visible in the local telephone directory. This provides customers with a low-cost, or free means of getting in contact with a company in a specific region, even if they are physically located elsewhere. This local presence can help a company strengthen its customer service offering; a call centre, for example, can offer accessible, high-quality 24/7 contact to its customer base. Conference service providers, meanwhile, will be able to use local and global toll-free numbers to provide their customers with the connectivity needed to organise meetings on-the-go. 
Mobile working is an accepted part of many businesses today, many of which require tools to facilitate this easily and without hefty implementation and operational costs. Organisations which have moved data and processes to a cloud environment will have undoubtedly realised significant cost savings; the same applies to migrating communications to the cloud. Many cloud-based comms services are based on a pay-as-you-go model, meaning significant outlays are replaced by affordable subscription fees, whereby a company will only pay for the services it uses, when it uses them. This is beneficial for new market entrants, as it removes the need for one-off investments. For established players, too, there are benefits, enabling the low-cost launch and quick time to market of new service offerings. The scalable, flexible nature of cloud communication services is also good news for business. Traditionally, a company may oversee quite a large hardware and IT infrastructure stack, though due to changing demands on services, not all of this would be used all of the time. Cloud comms services are scalable and can be expanded and scaled back to trial new services, and respond to varying demands. Business success is dependent on effective communication, whether this is between colleagues, customers, or end-users. Businesses of any size can now ensure that this communication is robust, secure, reliable and low-cost, by migrating communications services to the cloud. ### New Relic Plans First European Availability Zone Located in Germany in 2018 BERLIN – FutureStack – 22 June 2017 – Digital intelligence leader New Relic, Inc. (NYSE:NEWR) announced plans to establish its first European availability zone for its Digital Intelligence Platform, located in Germany. Upon its expected launch in 2018, customers will be able to access the full power of New Relic’s cloud-based platform, while having the confidence that their data remains within Europe. With 44 percent year-over-year revenue growth in the EMEA region (Europe, Middle East, and Africa) from fiscal year 2016 to 2017, New Relic has driven success with leading enterprises across the region. The announcement supports the company’s broad international growth plan. “Enterprises across Europe are moving to the cloud, adopting DevOps, and creating new digital customer experiences to drive business growth––and they need a single platform to monitor the success of these critical efforts,” said Lew Cirne, CEO and founder, New Relic. “With the announcement of our planned European availability zone in Germany, we are doubling down on our commitment to servicing our customers across Europe, which has so much opportunity.” [easy-tweet tweet="New Relic has expanded its team across Europe to offer increased partnership and support" hashtags="Cloud, IT"] “Revenues of Public IT Cloud Services in Western Europe are expected to grow 23 percent CAGR to $41.4 billion from 2015 to 2020,” said Michael Ceroici, research analyst, European Datacenter Group, IDC. “As European Public Cloud Services play catch up with the market in North America, IDC expects growth will be driven by increased confidence in cloud security developments, broader cloud adoption among European SMEs, and the desire to streamline data centre ownership and maintenance costs.” New Relic’s Growing Presence in Europe New Relic has expanded its team across Europe to offer increased partnership and support to its growing global customer base. 
The company has offices focused on sales and service in Dublin, London, Munich, and Zürich. Also, in 2015 New Relic announced its European Development Center in Barcelona, responsible for creating new platform innovations across the New Relic Digital Intelligence Platform, including recently expanded integrations with Amazon Web Services. The New Relic Digital Intelligence Platform The New Relic Digital Intelligence Platform is a SaaS solution that delivers full-stack visibility into the performance of digital initiatives––from the underlying host, through the application, to the end-user experience. Innovative enterprises throughout Europe use New Relic’s platform to gain powerful analytics and actionable insights that help them build and run modern digital businesses. New Relic’s platform is powered by a multi-tenant cloud database, which is built to scale during peak business events, such as holidays or sales, allowing companies to have visibility into their systems when they need it most. FutureStack: Berlin Features Digital Leaders from ProSiebenSat.1, Allianz X, Amazon, Verivox, and Lotto24 The announcement was made at FutureStack: Berlin, New Relic’s first-ever full-day event for customers and partners in Germany. The Berlin event featured a keynote from Founder and CEO Lew Cirne and real-world success stories from ProSiebenSat.1, Allianz X, Amazon, Verivox, and Lotto24. It also included hands-on product training and networking events for customers to connect with peers across industries. New Relic kicked off its global FutureStack Tour to a sold-out crowd in London last month and plans to host additional events in New York City, San Francisco and Sydney in the coming months. For event details and information on speakers, please visit the website. About New Relic New Relic is a leading digital intelligence company, delivering full-stack visibility and analytics to over 40 percent of the Fortune 100. The New Relic Digital Intelligence Platform provides actionable insights to drive digital business. Companies of all sizes trust New Relic to monitor application and infrastructure performance so they can quickly resolve issues and improve digital customer experiences. Learn more at newrelic.com. Forward-Looking Statements This press release contains “forward-looking” statements, as that term is defined under the federal securities laws, including but not limited to statements regarding New Relic’s plans to open an availability zone in Europe, including the timing for opening of the availability zone as well as its specific location within Europe, market and customer trends and opportunity, attendance, benefits of attendance and topics covered at FutureStack: Berlin, and additional FutureStack Tour events. The achievement or success of the matters covered by such forward-looking statements are based on New Relic’s current assumptions, expectations, and beliefs and are subject to substantial risks, uncertainties, assumptions, and changes in circumstances that may cause New Relic’s actual results, performance, or achievements to differ materially from those expressed or implied in any forward-looking statement. 
Further information on factors that could affect New Relic’s financial and other results and the forward-looking statements in this press release is included in the filings we make with the SEC from time to time, including in New Relic’s most recent Form 10-K, particularly under the captions “Risk Factors” and “Management’s Discussion and Analysis of Financial Condition and Results of Operations.” Copies of these documents may be obtained by visiting New Relic’s Investor Relations website at http://ir.newrelic.com or the SEC's website at www.sec.gov. New Relic assumes no obligation and does not intend to update these forward-looking statements, except as required by law. ### SaaS for Media - #Cloudtalks with BASE Media Cloud, Ben Foakes In this episode of #CloudTalks, Compare the Cloud speaks with Managing Director and Founder, Ben Foakes from BASE Media Cloud. Ben explains that BASE Media Cloud are a media cloud service provider, and he's found that there is a market for fully managed SaaS for media. Ben also talks about what the company is focussing on this year and where he sees the future going now that media companies are starting to adopt cloud services. #CloudTalks Visit BASE Media Cloud's website to find out more: http://www.base-mc.com/ ### Jaguar Land Rover’s Spark44 shifts up a gear with Datapipe’s Managed AWS Solution Jaguar Land Rover’s advertising agency migrates to the public cloud to transform its delivery of global marketing campaigns; sheds legacy infrastructure burden with Datapipe’s expertise Datapipe, a leader in managed cloud services for the enterprise, has led the cloud migration and digital transformation of global digital advertising agency, Spark44. Migrating its legacy infrastructure to the public cloud has given Spark44 the ability to manage and control digital assets consistently across the world. Spark44 is a joint venture with Jaguar Land Rover that specialises in developing demand creation programs globally. With 18 offices in 16 countries, Spark44 found that distributing materials and content to local agencies had become increasingly challenging. In the digital era, where content is in a continuous cycle of updates and refreshes that change almost daily, Spark44 needed a platform that would maximise its ability to deploy content globally. As its legacy infrastructure was becoming a significant hurdle for the business, Spark44 migrated its infrastructure onto AWS’ public cloud with the help of Datapipe. [easy-tweet tweet="Spark44 chose Datapipe because of its deep AWS expertise" user="@comparethecloud" hashtags="Landrover, datapipe"] Spark44 decided that the public cloud was the best solution for its requirements, but without the expertise to migrate and manage a public cloud infrastructure, it looked to Datapipe for assistance. Spark44 chose Datapipe because of its deep AWS expertise, along with its flexibility and willingness to collaborate with the agency from the start. Ahmed Hasan, CTO at Spark44, said, “We met with several service providers. Datapipe was the only one willing to come into our office. I’m a big face-to-face guy and they were happy to sit down with me, listen to my needs, and architect an idea that would really work for us. 
This not only demonstrated their competency but also illustrated their willingness to go above and beyond in forging a partnership, an ethos close to our own hearts.” Datapipe developed a new global content management and distribution platform utilising three AWS regional sites – the master site in Europe and satellite sites in North America, and APAC. The solution has given Spark44 the ability to support customer demands by improving not only the speed in which new promotions or campaigns are rolled out to the markets but also the consistency of the agency’s global deployment. Tony Connor, Head of EMEA Marketing at Datapipe said, “Spark44’s migration to the public cloud demonstrates the impact of cloud technology on businesses looking to modernise and expand. Legacy infrastructure is a challenge that holds back many businesses and with a trusted partner like Datapipe, businesses can overcome barriers to successful digital transformation. Spark44 now has everything in place to distribute its content globally using our fully scalable solution.” Through working with Datapipe, Spark44 can automate processes that were previously manual, and it has the resilience and scalability to cope with increased customer demands for tracked campaigns and in-depth analysis of results. This has set the agency up for future success and growth as its infrastructure is now built for scale. ### In the Works – AWS Region in Hong Kong Last year we launched new AWS Regions in Canada, India, Korea, the UK (London), and the United States (Ohio), and announced that new regions are coming to France (Paris), China (Ningxia), and Sweden (Stockholm). Coming to Hong Kong in 2018 Today, I am happy to be able to tell you that we are planning to open up an AWS Region in Hong Kong, in 2018. Hong Kong is a leading international financial centre, well known for its service-oriented economy. It is rated highly on innovation and for ease of doing business. As an evangelist, I get to visit many great cities in the world and was lucky to have spent some time in Hong Kong back in 2014 and met some awesome customers there. Many of these customers have given us feedback that they wanted a local AWS Region. This will be the eighth AWS Region in Asia Pacific joining six other Regions there — Singapore, Tokyo, Sydney, Beijing, Seoul, and Mumbai, and an additional Region in China (Ningxia) expected to launch in the coming months. Together, these Regions will provide our customers with a total of 19 Availability Zones (AZs) and allow them to architect highly fault tolerant applications. Today, our infrastructure comprises 43 Availability Zones across 16 geographic regions worldwide, with another three AWS Regions (and eight Availability Zones) in France, China, and Sweden coming online throughout 2017 and 2018, (see the AWS Global Infrastructure page for more info). We are looking forward to serving new and existing customers in Hong Kong and working with partners across Asia-Pacific. Of course, the new region will also be open to existing AWS customers who would like to process and store data in Hong Kong. Public sector organisations such as government agencies, educational institutions, and nonprofits in Hong Kong will be able to use this region to store sensitive data locally (the AWS in the Public Sector page has plenty of success stories drawn from our worldwide customer base). If you are a customer or a partner and have specific questions about this Region, you can contact our Hong Kong team. 
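To make the multi-Region, multi-Availability-Zone point above concrete, here is a minimal sketch; it assumes the boto3 SDK is installed and AWS credentials are configured, and it is not part of the original announcement. It simply enumerates the Regions and AZs an account can deploy into when architecting fault-tolerant applications:

```python
# Enumerate AWS Regions and their Availability Zones as a starting point for
# planning a multi-AZ, fault-tolerant deployment. Assumes boto3 is installed
# and AWS credentials are configured in the environment.
import boto3

ec2 = boto3.client("ec2", region_name="ap-southeast-1")

# List every Region currently enabled for this account.
for region in ec2.describe_regions()["Regions"]:
    name = region["RegionName"]
    # Query the Availability Zones within each Region.
    zones = boto3.client("ec2", region_name=name).describe_availability_zones()
    az_names = [z["ZoneName"] for z in zones["AvailabilityZones"]]
    print(f"{name}: {len(az_names)} AZs -> {', '.join(az_names)}")
```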
### Automation Ensures Effective Management of the Hybrid Cloud Today, the focus on digitisation is driving a steady stream of change throughout complex enterprise landscapes and, for CIOs particularly, the goal is to deliver the kind of world-class end-to-end quality and business process improvements that enable differentiation and disruption across competitive markets. However, while some firms’ processes might range across just a handful of enterprise applications, other IT infrastructures have become infinitely more complex – often including major software deployments such as ERP that range across multiple on-premise, cloud and mobile platforms. [easy-tweet tweet="Companies turning to the cloud also face the challenge of more frequent software updates" hashtags="Cloud, IT"] Clearly, today’s business processes traverse complex paths across multiple enterprise applications. This diversity, coupled with increasing dependence on the cloud, is bringing unprecedented change into an already complex environment. Not surprisingly, this can introduce the new risk of single failures bringing critical business processes to a grinding halt. Traditional legacy testing approaches struggling to keep up Companies turning to the cloud also face the challenge of more frequent software updates. With cloud applications, organisations have less control over the timing of software changes and where those software changes occur. The window to manage changes in the cloud - and across the enterprise – can also be very short. So, while hybrid cloud deployments can dramatically increase both complexity and the pace of enterprise IT change, there’s still a requirement to validate end-to-end business processes across both public and private cloud platforms as well as your existing non-cloud applications. While IT teams have traditionally been able to validate and test the performance of business processes where there are just 10 or so variations, they’re now finding it difficult to handle the multiple variations of today’s deployments. If, for example, you extend your business process to work across ten different mobile devices and operating systems, your team will have to handle around 100 tests instead of the original 10. And if that process also now extends further across ten more linked enterprise systems, you will end up with around 1,000 test combinations – a hundred times more than when you started. With the need for these complex processes to be validated across both cloud and non-cloud applications, it’s increasingly accepted that traditional legacy tools and manual software testing approaches are simply no longer viable for today’s hybrid application infrastructures. Moving towards an automated testing approach With this level of escalating complexity, it becomes almost impossible – without some form of automated business process discovery approach – for CIOs and their teams to determine their current ‘as-is’ business process state. And that’s problematic because finding out exactly how your current business processes operate is essential if organisations are to establish a starting point for their digital transformation initiatives. That’s why we’re now seeing organisations shifting their testing and validation emphasis away from the performance of individual applications to focus on the quality of entire connected business processes. 
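A minimal illustration of the combinatorial growth described above; the variant, device and system names are placeholders used purely to show how quickly the test matrix expands:

```python
# Show how each extra dimension (devices, linked systems) multiplies the number
# of end-to-end paths that need validating.
from itertools import product

process_variants = [f"variant-{i}" for i in range(10)]   # the original 10 tests
devices = [f"device-{i}" for i in range(10)]              # 10 device/OS combinations
linked_systems = [f"system-{i}" for i in range(10)]       # 10 downstream systems

test_matrix = list(product(process_variants, devices, linked_systems))
print(len(test_matrix))  # 1,000 end-to-end paths, versus the original 10
```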
With end-to-end business processes now spanning dozens of enterprise apps, cloud and digital platforms, CIOs are turning towards technologies that automatically discover current ‘as-is’ business processes, accelerate test automation, and carry out complete step-by-step validation to ensure quality. Adopting this approach means that they can then confirm that all underlying business functions and transactions are performing correctly across every enterprise application and interface, including cloud apps. Automation helps businesses bring continuous change into their ERP landscapes, safe in the knowledge that they have validated the performance of their critical business processes. And because organisations utilise and customise their ERP deployments in vastly different ways, it’s important to find an automated software testing approach that tests your processes in the way you use them. Instead of using up expensive resources to identify and capture all current IT processes, organisations can use the latest business process discovery techniques to take the pain out of generating and maintaining critical business process documentation. And by automatically testing the integrity of end-to-end business processes at a rate that can keep pace with digital transformation ambitions, cloud application changes and critical security requirements, they can also take significant steps towards mitigating technology risk across their hybrid cloud enterprise landscape. With end-to-end business process testing in place, CIOs can also identify issues before they become problems, whether that’s achieving successful initial migration to cloud-based applications, verifying integration between cloud and non-cloud enterprise applications, or ensuring that critical transactions and processes are performing correctly across the cloud landscape. That’s why a growing proportion of the world’s leading organisations are now replacing manual QA and legacy script-based tools with intelligent automation solutions to help drive a continuous ‘lights out’ approach to software testing and business process discovery. And with automated solutions not only reducing risk but also delivering test automation that’s three to five times faster than other enterprise cloud testing approaches, organisations can be confident that their IT infrastructure will continue to work flawlessly – even when their business applications change. ### GTT Communications, Inc. Acquires Perseus GTT Communications, Inc. (NYSE: GTT), the leading global cloud networking provider to multinational clients, announced today the acquisition of Perseus, a provider of high-speed network connectivity serving many of the world’s top financial and eCommerce companies. This strategic combination:
• Extends the reach of GTT’s global Tier 1 IP backbone with new points of presence and routes connecting key markets in Latin America, Asia Pacific, India, and South Africa, including Pacific Express, the new low-latency route between Chicago and Tokyo
• Increases GTT’s client base, adding clients in the financial services and eCommerce segments
• Reinforces GTT’s market leadership in ultra-low latency services and augments its cloud networking portfolio with financial market data services
The purchase price was $37.5 million, plus the assumption of approximately $3 million in capital leases. GTT expects that the purchase price will reflect a multiple of post-synergy adjusted EBITDA of 5.0x or lower, with integration and cost synergies to be achieved within two quarters. 
[easy-tweet tweet="GTT gains an expansive global network footprint and world-class clients" hashtags="Network, Cloud"] “The acquisition of Perseus demonstrates our commitment to connecting people – across organisations and around the world – by strengthening our low latency service offerings and accelerating our expansion into key high-growth financial markets,” said Rick Calder, GTT president and CEO. “We look forward to serving our new clients with simplicity, speed and agility.” “This transaction provides strategic benefit to both client bases as well as our organisations,” said Jock Percy, Perseus chief executive officer. “GTT gains an expansive global network footprint and world-class clients. We thank our employees and clients for their loyalty and expect a seamless integration with GTT over the upcoming months.” About GTT GTT provides multinationals with a better way to reach the cloud through its suite of cloud networking services, including wide-area networking, the Internet, managed services and voice services. The company’s Tier 1 IP network, ranked in the top five worldwide, connects clients to any location in the world and any application in the cloud. GTT delivers an outstanding client experience by living its core values of simplicity, speed and agility. For more information on how GTT is redefining global communications, please visit www.gtt.net. ### Rising to the Cloud: How Storage as-a-Service has Turned the Industry Inside-out The data storage industry has been turned irrevocably on its head. If you have been working in this fast-changing market for some time, you may remember that, not too long ago, your storage requirements were fulfilled by industry titans. They would provide expensive and unwieldy hardware with little flexibility, accompanied by lengthy and complex contracts. The traditional enterprise storage management worked like a hamster-wheel of migration and refresh cycles every few years. But, thankfully, those days are behind us, and here’s why. Increasingly, cloud storage is becoming the preferred choice for organisations of all shapes and sizes. In fact, current market data shows that by 2020, as much as 50 percent of the overall storage market will have moved entirely to cloud-based infrastructures, according to the IT Brand Pulse Industry Brief published in May 2017. The reason for that is simple. Both on-premises and in cloud solutions provide organisations with the ability to meet their day-to-day needs, without having to compromise on performance, cost or security. The cloud, however, has the strong advantage to offer enterprises unparalleled scalability, capacity, speed and cost savings when compared to other forms of data storage. In addition to this, the emergence of storage-as-a-service (STaaS) has given businesses the ability to switch spending models from rigid, change-inhibiting CapEx to elastic, highly responsive OpEx. This puts companies in the driving seat, as the storage giants are no longer able to simply provide expensive hardware every few years and sit back watching the money roll in. Even industry bellwethers such as EMC and NetApp have either retooled or sold their companies to be able to compete in this new world order, in which power no longer lies in the hands of the vendor, but in those of the end user. The STaaS model gives businesses added autonomy and flexibility, and best of all, they only pay for what they use. But how can you ensure one cloud strategy will work for your business? 
Firstly, it is important to note that not everyone is, nor should be, in the same cloud. When you decide to adopt a certain cloud strategy, you need to ensure you are not making any unnecessary trade-offs by tailoring the strategy to your organisation’s specific needs. You’ll need to make sure that your increased flexibility doesn’t come at the expense of your infrastructure’s ability to run day-to-day applications smoothly. You also need to consider isolating your most valuable data, in which case you will likely need a virtual private cloud. Secondly, your cloud architecture should guarantee that all single points of failure are eliminated, proving that you are arming your business with a resilient and reliable storage solution. Your cloud provider should offer an automatic multi-zone high availability solution to provide real-time failover across multiple locations to ensure your business is protected from complete facility failure. Thirdly, you’ll need to confirm the cloud is working for your business. A cloud service should be managed such that you do not have to spend time on either becoming familiar with or running the underlying infrastructure. This means ensuring your cloud provider guarantees 100 percent uptime and offers a solid service level agreement. Finally, you’ll need to check that you have 24/7 proactive support. A reliable as-a-service cloud solution will not only provide you with the flexibility your organisation requires to compete, but it will free IT administrators’ time so that they can focus on their core activities rather than spend their time and energy putting out fires and managing crises. [easy-tweet tweet="Any successful data storage strategy requires the ability to plan for the long term " hashtags="DataStrategy, Storage"] For many organisations, maintaining in-house infrastructure and data management is crucial. It may be the case that the public cloud is not the best option for your business, or at least for parts of it. Some of your data may need to be on the edge of the cloud in an on-premises architecture to provide the high performance, low latency and security needed for your specific applications. But you can still take advantage of the as-a-service model. If this is the case, you should consider on-premises as-a-service (OPaaS) storage, which offers the same level of flexibility and scalability as STaaS, but without the need to relocate your data outside of your organisation. Any successful data storage strategy requires the ability to plan for the long term while acknowledging your requirements are likely to change as your business environment and needs evolve. That way, architecting for the long term will pay off. To do this, you’ll need to develop a plan outlining where different data sets should be stored. The as-a-service model eliminates the need to invest capital into enterprise storage infrastructure and delivers cloud-scale capacity and performance with enterprise SAN and NAS functionality at affordable, subscription-based pricing. And if that wasn’t enough, the STaaS model makes it hard to get locked into a wrong decision: if you choose the wrong configuration today, STaaS solutions evolve with you and give you the ability to change your configuration on the fly, as needed. The net result is that your enterprise can be simultaneously more agile and efficient in an increasingly competitive world. 
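To illustrate the CapEx-versus-OpEx point above, here is a rough sketch in which every figure is an assumption chosen for illustration only, not vendor pricing: a refresh-cycle purchase sized for peak capacity is compared with a pay-as-you-go STaaS subscription billed on the capacity actually consumed each month.

```python
# Illustrative cost comparison (all figures are assumptions, not real pricing):
# CapEx sized for peak capacity vs. STaaS billed on monthly consumption.
peak_tb = 500                 # capacity the array must be sized for (assumption)
capex_per_tb = 600            # purchase + support cost per TB over the cycle (assumption)
refresh_years = 4             # typical hardware refresh cycle (assumption)

staas_per_tb_month = 15       # subscription price per TB per month (assumption)
# Usage starts well below peak and grows gradually over the refresh cycle.
monthly_usage_tb = [120 + 8 * m for m in range(refresh_years * 12)]

capex_total = peak_tb * capex_per_tb
staas_total = sum(tb * staas_per_tb_month for tb in monthly_usage_tb)

print(f"CapEx (sized for peak):   ${capex_total:,}")
print(f"STaaS (pay for usage):    ${staas_total:,.0f}")
```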
### iland Launches Secure Cloud Services from New Location in Amsterdam to Service EMEA Customers Channel partners, MSPs and customers to benefit from iland’s new data centre location iland, an award-winning global cloud service provider of secure and compliant Infrastructure as a Service (IaaS), Disaster Recovery as a Service (DRaaS) and cloud backup services, today announced the availability of their Secure Cloud Services from a new data centre location in Amsterdam, the Netherlands. This expansion positions iland, which was established in 1995 and has been delivering cloud services globally for over 11 years, to continue to service customers across EMEA with a new focus on the Benelux region and further supports iland’s global growth. With the General Data Protection Regulation (GDPR) deadline also looming in May 2018, as well as Brexit negotiations underway, the opening of this new data centre means that iland will continue to meet the data privacy requirements of customers in the EU. The announcement comes hot on the heels of the relaunch of iland’s channel program last month, which has been revamped to enable channel partners to easily build a cloud business and serve both new and existing customers in the Netherlands and the wider EU region. iland sees a huge market opportunity for channel partners who are already well versed with VMware, Veeam, Zerto and other virtualization products to diversify their businesses into Infrastructure as a Service (IaaS), Cloud Backup and Disaster Recovery as a Service (DRaaS) and is actively recruiting channel partners in the Netherlands. Dante Orsini, SVP Business Development at iland, comments: “This is an incredibly exciting time for iland as we expand our global footprint and increase our focus on working with channel partners to deliver value to customers. We have undertaken extensive research to ensure that the facility satisfies our needs as well as those of our partners and customers. Brexit and GDPR were two key factors in deciding to deliver our Secure Cloud services from the Netherlands. We are delighted with the continued growth we are experiencing across Europe and look forward to continuing to help European customers adopt business-critical cloud initiatives.” [easy-tweet tweet="iland is a good fit for the CSN Groep as they share similar priorities on cloud security" hashtags="Cloud, Security"] “The CSN Groep is partnering with iland to help customers in the Netherlands ensure business continuity and cloud security with iland’s market-leading DRaaS and Cloud Backup solutions,” said David Schapp, CTO of CSN Groep, a leading Managed Services Provider in the Netherlands. "We’re excited about the investment iland is making in their Secure Cloud platform in this region, both for the growth opportunities it provides to our business as well as for our customers. iland is a good fit for the CSN Groep as they share our priorities on cloud security, technology excellence, availability and a keen focus on customer success.” iland’s other data centres are in Los Angeles, Dallas, Washington D.C., London, Manchester and Singapore. In addition, iland will be opening a data centre in Sydney, Australia in August of 2017. iland’s global data centres were chosen not only for their ability to meet iland’s advanced security and compliance standards, but also for their strategic placement near large population centres for disaster recovery and good access to the world’s best networks. 
Each state-of-the-art facility plays host to the workloads of growing businesses, both local and global, requiring additional capacity, local presence, or sophisticated disaster recovery support. iland also has backup facilities at every location. The proprietary iland Secure Cloud platform is built on VMware vCloud virtualisation, a familiar platform for many businesses, which eases the transition and time to market for adopting cloud services. iland cloud services can be monitored and managed through the iland Secure Cloud Console, which provides visibility and powerful management features across billing, performance, security, compliance, testing and reporting from a single pane of glass. To help customers and channel partners respond to an environment of increased risk from cyber-attacks, ransomware and other threats, advanced security features are natively integrated into the iland cloud platform, which is fully certified and compliant with industry regulations including ISO 27001, SOC 2, ITIL, HIPAA, G-Cloud and CSA STAR. ### #TransformCIO Episode 3 - Interview with the Arcadia Infrastructure and Services Director, Peter Kish #TransformCIO is a series of interviews with CIOs and CTOs who are changing the way technology is deployed and viewed in their companies. In this third episode, David Terrar, Founder and CXO of Agile Elephant, was joined by Infrastructure and Services Director, Peter Kish from Arcadia. This episode saw the discussion focus on digital transformation and whether Peter believes digital transformation should be supported by IT. He also talks about how technology has changed the retail space over his lifetime, including how the online side has transformed the industry. #TransformCIO ### Shining a Light on IT Sales Data Despite most consumer electronics organisations deploying state-of-the-art software systems to help them increase productivity, improve customer engagement or realise cost savings, a significant number remain in the dark when it comes to accurately tracking indirect sales data – and are missing commercial opportunities as a result. Many don’t have a full picture of the distribution chain once products leave the warehouse, let alone know if they are managing sales partner networks effectively. This lack of transparency around channel data is an enormous problem. Companies have traditionally been forced to rely solely on the data submitted by channel partners – regardless of its accuracy. A recent independent survey completed by Zyme, in fact, showed that more than half (52 percent) of the 166 organisations interviewed had to simply ‘trust’ that the information their channel partners had provided was accurate. Overpayments In turn, this level of doubt creates a whole host of other problems. Not least is the over-payment of incentives and rewards to partners, which is standard practice due to continual confusion. Without specific information, suppliers err on the side of caution - overpaying their sales partners. A staggering 93 percent of the respondents said that they treat incentive payments as a cost of doing business, rather than seeing them as an investment to manage longer-term returns. The worrying reality is that the majority of organisations still don’t have firm control of their channel data, so they can’t know if funds are being allocated incorrectly, or which partners may well be missing out. 
A quarter of organisations were concerned about incentive overspend on partner programmes due to problematic data collection and recording issues, while 37 percent worried about incorrect payments being paid to channel partners that would need to be identified and corrected later, creating more pressure on resources to conduct back-checks. A further problem created here is that 61 percent of the companies admitted that, in the last 12 months, financial rewards for the channel have had to be capped because of previous overpayments made in error. Hardly motivating. Just having ‘trust’ in channel-submitted data is no longer enough to satisfy board members or stakeholders. They want facts and details. This is where Channel Data Management (CDM) comes in. And to be successful, it has to be run on a cloud-based system. We designed a platform that can offer channel visibility by automatically taking the partner’s raw sales data, identifying the most important information and presenting it through analytics dashboards that are integrated into the vendor’s own business systems. By automating as much of the process as possible, we can save time and costs for both sides of the sales function, and enabling almost real-time data sharing makes it a business dialogue that all parties can benefit from. [easy-tweet tweet="Resellers are also becoming cloud service providers" hashtags="Cloud, IT"] Because it’s cloud-enabled technology, there is no complication around upgrades, no expensive consultants onsite and every user within the business can access the data on demand. Immediate benefits to working collaboratively in this way include the ability to manage channel partner incentives efficiently, to automatically calculate and validate rebates and track any credits earned by partners. It can also manage data checks at key points during the process, pushing queries seamlessly back to partners if necessary, thus ensuring incentive payments to partners aren’t affected. Cloud However, the IT channel is no longer a straight transactional play. The continued adoption of both private and public cloud services among customers means the channel’s sales engagement is much more complex. Resellers are also becoming cloud service providers. Distributors can also be cloud aggregators. It’s a massive leap away from the capitalised, large upfront transaction to a services-based revenue flow, with opportunities for new revenue even coming in the post-sales delivery of the solution. Vendors with cloud, software-as-a-service or other offers that don’t require physical distribution may need to connect multiple data sources to meet the same data profile requirements. However, cloud service providers can then use the data collected via CDM to manage their supply chain, build their inventory policies, drive incentive programmes, and shift their sales compensation models. There is good news: 78 percent of the survey’s respondents actively want to understand more about channel partners and channel marketing expenditure, which demonstrates that vendors are ready to address all the issues surrounding analysing services-based sales data. Because whether shifting boxes or offering cloud services, delivering actionable insights leads to a more accurate view of the sales channel, which in turn improves margins and creates a more productive sales force. 
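As a purely hypothetical sketch of the kind of automated rebate validation described above (the field names, SKUs and rebate rule are illustrative assumptions, not Zyme's actual data model), a CDM-style check of partner-reported point-of-sale records might look like this:

```python
# Hypothetical rebate validation: check partner-reported point-of-sale records
# against an agreed rebate rule before approving any incentive payment.
REBATE_RATE = 0.03                    # 3% back-end rebate (assumption)
ELIGIBLE_SKUS = {"SKU-100", "SKU-200"}  # only these SKUs earn a rebate (assumption)

pos_records = [
    {"partner": "Reseller A", "sku": "SKU-100", "units": 40, "unit_price": 250.0},
    {"partner": "Reseller A", "sku": "SKU-999", "units": 10, "unit_price": 90.0},
    {"partner": "Reseller B", "sku": "SKU-200", "units": 15, "unit_price": 400.0},
]

rebates = {}
for record in pos_records:
    # Ineligible SKUs are skipped rather than paid, avoiding overpayment.
    if record["sku"] not in ELIGIBLE_SKUS:
        continue
    revenue = record["units"] * record["unit_price"]
    rebates[record["partner"]] = rebates.get(record["partner"], 0) + revenue * REBATE_RATE

for partner, amount in sorted(rebates.items()):
    print(f"{partner}: rebate due ${amount:,.2f}")
```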
### Live Chat Shows in Full Swing for ICT Channel - First episode was 12th June 19th June 2017: Comms Business Live hosts a new series of live-streamed interviews with IT, data and comms industry leaders. The first live stream took place on 12th June and featured directors from Larato, Keyrus and LogRhythm. [easy-tweet tweet="It’s now more important to future-proof your business and protect your customers" hashtags="Cyber-Security, GDPR"] The shows will take place every two weeks in the run up to the new Channel Live event, which runs from 12 – 13 September 2017 at the NEC, Birmingham. David Dungay, Editor of Comms Business Magazine, will interview a different industry professional on hot market issues including cyber-security, GDPR and Digital Transformation. “With IT threats to businesses increasing and evolving, it’s now more important than ever to future-proof your business and protect your customers,” says David Dungay. “Each show will tackle the issues affecting our industry, with insights from leading players. The shows will be interactive and viewers will be encouraged to put forward questions during the interview. As a publisher and event organiser, it’s important that we push the boundaries to deliver relevant information in the right format - we believe that live streaming is the perfect fit for the channel.” The show programme is as follows:
26th June – Digital Transformation – Guiding your customers
10th July – SD-WAN – the next big Channel opportunity?
24th July – The Channel, who is eating your lunch?
Channel Live is the UK’s only channel-specific event, launched to address the digital challenges and threats facing the market. To watch Comms Business Live please visit: http://commsbusiness.co.uk/comms-business-live-streams/ ### Disruptive Tech TV interviews Robert Weideman from Nuance Communications In this interview, Disruptive Tech TV speaks to General Manager, Robert Weideman from Nuance Communications. Robert speaks about how Nuance develops technology for virtual assistants and human-machine interfaces. They build solutions for the consumer space, healthcare and customer service. Nuance is also the creator of 'Dragon Dictations', which came about in 2002 – a product that allows you to speak instead of typing. From this, the conversation turns to virtual assistants and voice recognition, where Robert explains what Nuance are doing today. He explains that technology like speech recognition will make it easier for people to use devices, interact with services and generally make life safer. Nuance has been building up this technology since the late '90s. Robert goes on to explain how people should not be afraid of the current shift towards AI and biometrics. He explains that the opportunities that will come out of this shift will make life more convenient and safer. There was also a quick chat with Siri! ### Comms Business Live Episode: GDPR In the first episode of Comms Business Live, editor of Comms Business Magazine David Dungay was joined by his panel. This episode saw a discussion of the General Data Protection Regulation and how it will replace the 1995 Data Protection Directive. Don't forget about Channel Live, 12th - 13th September. Voices of GDPR ### Compare the Cloud interviews Michael Custers from NGA HR In this interview, Compare the Cloud speaks with Michael Custers, SVP Strategy and Marketing for NGA HR. Michael speaks about how NGA HR are a global leader in HR and payroll solutions. 
They are found in 145 countries, including the UK. Michael explains how NGA HR is unique: they take their clients on a journey, which may involve multiple steps and may take time, but they are available every step of the way. NGA HR is a truly global business which deploys its solutions recognising that there are multiple languages, cultures and legislations. Michael continues to explain the importance of HR and the overall significance of an employee's experience - "meaning every touch point between the employer and the employee really matters". For more information visit: www.NGAHR.com ### Changing the Narrative on the State of Retail The retail industry has been expanding on a global scale, with the U.S. still the undisputed leader in the global arena. At the 2016 inaugural Microsoft Envision conference, held in New Orleans, lengthy discussions were conducted on the current state of the retail business. The findings were staggering, contrary to the common perception that online retailing had taken over traditional brick and mortar stores. Last year alone, retailers spent $62 billion on "shoppers' experiences" and competed fiercely for consumers' attention. Retail revenues have been on the rise, with a 72 percent increase since 2000. There is Magnificent Growth in Retail Business What is surprising is the massive number of retail storefronts, totalling over 3.8 million. This represents a rise of 190,000 in just under two years. The majority of these stores are individual shops, meaning they're small and medium businesses as opposed to large chain operations. However, the small retailers only account for 25 percent of total retail revenues. [easy-tweet tweet="The expansion of retail business means stiffer competition for small business owners" hashtags="CRM, Strategy"] Additionally, there are 14 billion square feet of retail outlet space across the U.S., which is a 142 percent increase from the year 2000. This illustrates that there's a lot of shopping happening in brick and mortar outlets in addition to digital commerce. In particular, 93 percent of retail revenues came from physical shops as opposed to online shops. This doesn't mean online shops are dwindling. They're still on the rise, but digital commerce operators like Amazon, Birchbox, Warby, and Bonobos are slowly moving from clicks to bricks and establishing actual storefront locations. The massive expansion of retail business means stiffer competition for small business owners. To get ahead of the competition, you need to focus on creating a better customer experience to lure shoppers to your stores. One strategy you can employ is working on the path to purchase. Experts advise small business owners not to shy away from showrooming. This is where consumers visit your shop to check out your merchandise before shopping online for better prices. Many business leaders have shared similar sentiments with less focus on duct-tape omnichannel. Embracing the Millennial The millennials have disrupted retail shopping trends, especially in the fashion industry. The millennials rarely go without their mobile devices. To tap into this massive market segment, your website must be accessible from mobile devices. Additionally, your store should have internet access through Wi-Fi. However, you should use the Wi-Fi to connect with customers rather than simply sharing your network access codes with them. Install Apple or Google wallets for online payments. 
Since millennials are more numerous than baby boomers, it makes more business sense to attract and retain them by utilising new ways of engagement. The current state of retail shows that shoppers are ever-present in stores. They are either shopping, working, sitting, playing, or on the go. This indicates that they are seamlessly transitioning in and out of the physical world. Retailers should, therefore, find ways of breaking down the silos that exist between their physical stores and their websites. Many retailers can attest to the fact that their key focus is on building better CRM systems and customer loyalty programs to encourage existing customers to make repeat purchases. Other priorities include establishing mobile-friendly platforms and creating better business information analytics. Store owners should give their staff the tools they need to empower and engage customers and gain a seamless view of their clients' needs. Being mobile-friendly is already a necessity for any business looking to stay ahead of the competition. So, the bottom line is to provide your consumers with a clear reason to keep coming to your store or online shop. ### Why move testing to the cloud? In a super-charged technology market, where intense consumer demand has led to heightened competition, organisations must balance the need for innovation and speed with the challenge of bugs and defects. So, perhaps it’s no surprise that rigorous testing is now a critical part of software development. Robust and thorough testing determines the correctness, completeness and quality of a product - and ultimately helps guarantee an organisation's success. But testing has also become an increasingly challenging activity. The growing complexity of business applications calls for increasingly intricate testing processes. And digital transformation requires a much broader view of the end-user; the devices they use and the conditions under which a product or service must operate. Together with the prevalence of Agile methodologies, where facilities need to be always-on as part of the continuous integration cycle, these challenges mean that, for many, setting up and managing in-house testing facilities has become insurmountably difficult. [easy-tweet tweet="Cloud testing offers scalability" user="@comparethecloud" hashtags="Cloud, tech, IT"] Managing a test facility in-house The list of requirements for in-house test facility management is comprehensive. Companies have a swathe of responsibilities, from managing mobile devices, their contracts and maintenance, to overseeing IT infrastructure, including the cost of servers, software, OS and maintenance. And, as well as the physical hosting demands, increasing labour costs are also often prohibitive. An Agile lab is required to be operational 24/7, often serving daily commits and verification and nightly full regression testing. And the need to mimic real user conditions means that the testing task, which includes audio connectivity, network virtualization, network connectivity, peripherals virtualization - and much more - becomes infinitely more challenging. Migrating to the cloud So, while it is possible to build a digital testing lab in-house, it’s an unfortunate truth that most companies will ultimately fail. Testing labs often don’t receive the required attention from local IT teams and the digital aspect introduces many factors which IT departments simply aren’t geared up for. The good news, though, is that there is an alternative - and that’s the cloud. 
Cloud services give testers access to scalable and ready-to-use virtual labs with the library of operating systems, test management and execution tools, backup and storage necessary for creating a test environment that closely mirrors real life scenarios. And importantly - cloud testing offers scalability. Companies of all sizes can handle larger projects than they normally could - and development teams can obtain the infrastructure required when an extra testing push becomes necessary, rather than spending on proprietary resources which may only be used for a specific project or for a short amount of time. In addition to scalability, cloud infrastructure enables easier testing and monitoring of the production environment. Applications can be tested for the exact number of actual users. And extra testing for global applications is also possible. Internationalisation and localisation methods allow companies to detect where users are when they interact with an application and tailor the user experience accordingly. Cloud testing provides organisations with the ability to refocus effort (and resources) on testing itself, rather than maintaining an in-house lab. And the rewards are evident - experts have said that companies can nearly double test efficiency by embracing a cloud model - helping bring new products and services to market more quickly. How to get started - the importance of vendor selection So, if cloud testing is such a compelling option, why is there still some reticence from organisations who haven’t yet made the move to the cloud? For us, cloud infrastructure, like any digital service, should be adaptable, safe and able to evolve within an agile IT environment. If it’s actually a burden or a security risk, then it’s not doing its job. Organisations - whose main business isn’t in maintaining infrastructure - need to know that they’re working with technology partners that can keep them protected from reliability or security issues. Reassurance is required from the outset. Put simply, while cloud services can simplify your testing operations, and help re-direct resources to the things that really matter, you must trust your technology partners absolutely. When outages in service are no longer within your own ability to fix, or data leakages aren’t within your remit to control, then trust is paramount. The important questions When selecting a partner, organisations must ask tricky but vital questions about redundancy, uptime and reliability – and make sure that robust disaster recovery procedures are in place should the worst happen. Our own customers tell us that the search for a partner starts with geographic location. Selecting a cloud vendor which addresses their needs in terms of regulatory requirements, the location of users - and cellular coverage - is crucial. Second up is security. Like any other IT provider, a digital testing lab cloud provider must meet rigorous security requirements. And third is availability. While 24/7 is a mandatory requirement, making sure this is backed by SLA is important. Making sure a vendor can support new devices and new operating systems within a reasonable time is also critical for staying on top of market needs. Requesting “same day” delivery of new OS versions or devices is possible - and often vital - in the race to market for new applications or services. And lastly, integration is key. Your teams might be using different testing tools and solutions. 
It’s important that organisations ensure there is compatibility between the tools they’re using (e.g. Selenium, UFT, etc.) and the capabilities of the vendor. Support and migration - the final piece in the puzzle So, like many other cloud migrations, moving testing to the cloud simply “makes sense”. Organisations don’t want to - and often can’t - manage an in-house digital testing lab for an Agile environment - and cloud testing presents a labour-saving, cost-effective solution. But to make any migration successful, the last piece of the puzzle is adequate support to make the migration process simple. An Agile environment requires quick resolution of issues, and any cloud vendor worth its salt should provide 24/7 support from experts. So, with the right provider in place, and support, reliability and maintenance guaranteed, the future of testing is bright. ### Data security – how the cloud can help you Following the recent malicious ransomware attacks, the issue of data security has never been more critical for businesses and organisations across the globe. ‘How safe is my data?’ is a question every business professional – from the smallest start-up to the biggest multinational – is asking right now. A security breach is the stuff of nightmares for any company, with the potential to bring down a site and impact revenue and reputation. Where does cloud computing fit into this debate? There’s no question that outsourcing data storage to a cloud provider makes sense on a multitude of levels. Thanks to the cloud, companies no longer need to spend time and money building and maintaining their own energy-hungry onsite servers. Software applications can be managed remotely, freeing up time that would otherwise be spent on IT resources. [easy-tweet tweet="The harsh fact is that every computer is vulnerable, whether its data is in the cloud or not" user="@comparethecloud" hashtags="Cloud, security"] The majority of companies are now adopting some form of cloud technology, whether public, private, or hybrid. But will a high-profile ransomware attack, like the one which infected 230,000 computers in 150 countries this May, make slow adopters more or less likely to trust their data to the cloud? The harsh fact is that every computer is vulnerable, whether its data is stored traditionally or via the cloud. Education around this issue is key. Cloud vendors and IT support companies need to be honest about the benefits and risks of any new technology. The following points are worth discussing with any business weighing up the pros and cons of cloud migration:
• Your data is still your data – it’s not ‘owned’ by the cloud and you still have responsibility for it.
• You can decide how to encrypt your data. Companies can choose to set up personal encryption keys which work in tandem with cloud providers’ own encryption services. This extra layer of security controls who can and who can’t access data stores. Without the key, the data is worthless.
• Cyber criminals are inventive. Unfortunately, there will never be a security solution that will protect data in all cases. Fraudsters are constantly updating and evolving malicious software in an ongoing game of cat and mouse.
• Stay one step ahead of the game by taking expert advice. It’s essential that businesses work with IT professionals who can plan appropriate backup and data recovery procedures – then implement that plan should disaster strike.
• Focus on fact not myth. An experienced host service provider can debunk myths surrounding the cloud. 
Myth one: data is not as secure in the cloud. As discussed earlier, security risks in the cloud are the same as those faced by traditional IT solutions. But with the cloud you do have an advantage – the risk is shared with your cloud hosting provider, so you’d never face disaster alone. The key for any company considering the adoption of cloud technology is to work with an IT expert so they can review requirements and implement appropriate solutions. This will also help you to dispel any myths and preconceptions that you may have about the cloud’s capabilities, security, and functionality, and help you to explore whether the pros outweigh - or indeed dismiss - any of the potential drawbacks. ### Hybrid IT Nirvana is Through Networks You Don't Own Hybrid IT is no longer the new kid on the block. What was once an exciting possibility for businesses looking to realise an array of benefits, from improved efficiency to cost savings, is now a reality for many organisations. SolarWinds recently released the SolarWinds IT Trends Report 2017: Portrait of a Hybrid Organisation, an annual study that explores how hybrid IT is impacting the modern organisation, capturing responses about the benefits, barriers to consumption, and key considerations of implementation from U.K. IT professionals who have adopted hybrid IT in their organisations. [easy-tweet tweet="The responsibilities of IT professionals have grown broader" hashtags="IT, Data"] The research found that 92 percent of survey respondents said their organisations have migrated critical applications and IT infrastructure to the cloud over the past year. Therefore, it’s no surprise that IT professionals must become better equipped to monitor and manage infrastructures in this hybrid IT era. Blurred lines The responsibilities of IT professionals have grown broader. Today, they must not only be a jack-of-all-trades but a master of them too. Traditional monitoring tools no longer work when managing a hybrid IT environment, as they are unable to provide comprehensive visibility across both on-premises and the cloud, a necessity in ensuring that the high standards of end-users are met. According to the SolarWinds study, the second greatest challenge created by hybrid IT was increased network complexity. Additionally, the study cites that IT professionals ranked hybrid monitoring/management tools and metrics, application migration, automation, and data analytics as the skills and knowledge most urgently needed to successfully manage hybrid IT environments. The need for a combined monitoring and management solution has never been more pronounced for IT professionals, whose resources and time are increasingly strained by the demands of hybrid IT. A single dashboard would help IT professionals monitor and manage the network efficiently, helping them decide intelligently which parts of the business belong in the cloud and which belong on-premises. That said, sometimes a dashboard just isn't enough. On the map The level of complexity in today’s IT environments means that IT professionals often need more than a dashboard, and instead must aim for a single pane of glass: complete visibility into the entire network, both on-premises and in the cloud. This is no luxury either. For a hybrid IT architecture to succeed within an organisation, IT professionals must have visibility across services and see the pathways of applications, as well as the quality of service. The ability to map networks then becomes a vital commodity. 
Visual path monitoring provides IT professionals with simplified detection of issues in internal networks, while also extending troubleshooting through the internet and into service providers’ networks, covering both parts of the hybrid infrastructure. Modern network path monitoring tools simulate application-specific traffic, which passes through firewalls in much the same way as user traffic. This means that if a problem occurs in the cloud, IT pros can see what went wrong and contact the service provider with the right information to get it fixed. Historical data from these tools can also be used to help with capacity planning, offering contextual information that can be used to decipher when a business may need to move resources on-premises. Mapping the unknown Getting the most out of a hybrid IT environment requires absolute visibility and access across infrastructures. So, how can IT professionals best monitor and manage their networks and, more specifically, the networks they don't own? Monitor cloud and on-premises infrastructures from a single platform, allowing IT professionals to visualise the entire network landscape and see at a glance when application performance slows. A holistic view of this environment will help IT professionals turn data into actionable insights, and remove the need to go between multiple platforms to manage it. Implement a monitoring solution that offers alerts on cloud resource consumption trends, allowing IT pros to see across the entire hybrid environment. This provides IT pros with the ability to see what belongs in the cloud and what belongs on-premises. A solution that offers this level of detail could prove invaluable. Monitor quality of service (QoS) and the end-user experience. This can be achieved by mapping the path end-users take through applications and seeing the quality first-hand. Here, metrics should be put in place to ensure that performance never dips below a certain level, with IT professionals ensuring that key changes in performance are well understood. Learn to trust and invest time in getting the most out of cloud service providers. Cloud service providers are not the enemy, and while IT professionals may resent handing over control of their data to outside influences, the benefits are substantial enough to warrant the leap. To ensure this trust is in place, IT pros should know each cloud service provider and its offerings like the back of their hands. While hybrid IT may seem like a headache for IT professionals, its potential to improve your organisation is significant. Instead of fighting the tide, it's time IT professionals adapted and started looking beyond their own environments to map the networks they don't own. Until they do this, the true benefits of hybrid IT could continue to elude them. ### #TransformCIO Episode 2 - Interview with the ex-Met Police CTO, Stephen Deakin #TransformCIO is a series of interviews with CIOs and CTOs who are changing the way technology is deployed and viewed in their companies. In this second episode, David Terrar, Founder and CXO of Agile Elephant, was joined by CTO Stephen Deakin from Interim. This episode saw the discussion focus on the difference between handling IT in the public sector compared to Stephen's experience in the private sector. He also talks about what his experience was like as CTO of the Metropolitan Police and the different technologies used by the police. Stephen also shares his favourite book recommendations! 
#TransformCIO ### Technology and the Law Ep 5: SLA In the fifth episode of Technology and The Law, cloud lawyer Frank Jennings was joined by Sue MacLure from PSONA, Mitchell Feldman from RedPixie and Ben Boswell from Worldwide Technology. This episode saw the discussion of Service Level Agreements.   Hashtag: #TechLaw Twitter: @disruptivelive ### Don’t be Afraid to Lead Your Customers to the Cloud One of the greatest obstacles to moving customer data to the cloud is customer concerns about being able to see and control their data assets. This concern is unfounded – data in the cloud is as accessible for customers as if it were kept anywhere else, but customers can sometimes be reluctant to make a move. It’s essential for us to reassure our customers that their data will be safe, accessible and under control. Below is an example demonstrating how we worked with an enterprise backup and recovery business to move its customers successfully to the cloud. The nature of the problem   Cristie Nordic AB is a large European IT provider that specialises in business continuity planning and providing solutions for disaster recovery. In 2014, it saw an opportunity to expand its enterprise backup and recovery business to the mid-market through cloud-based managed services. However, Cristie Nordic faced the challenge of convincing its customers to migrate data from on-premise backup and recovery systems to the cloud.  In this case, customers were concerned not with the change of control of the data assets, but with the perceived lack of visibility into the data. They also expressed concern about whether backed up data that went into the cloud was complete.  Additionally, they were concerned as to whether they could gauge their data growth, or accurately predict what their hosting charges would be in the next quarter. To streamline operations, Cristie Nordic determined it needed one backup and recovery software licence from which to bill multiple customers, track their usage separately, and ensure compliance with their licence agreement. It wanted to be able to forecast licence cost growth, to remain profitable while providing customers with the best possible solution at a fair price.  The right tools for the job Cristie Nordic selected Rocket Servergraph Professional for its backup monitoring and reporting services. Servergraph provides a single view of the entire backup environment and is recognised for its ease of use. Key factors in the company's decision included the software's ability to support a broad range of industry-leading storage and backup systems. Moreover, it enabled proactive monitoring and reporting, as well as the chargeback capability the service provider was looking for to effectively run and deliver its cloud-based offerings. The process was a success, and large swathes of backed up data were moved to the cloud with the approval of customers. Magnus Thunberg, data protection specialist for Cristie Nordic, who was closely involved with the Servergraph implementation, gave a comprehensive breakdown of the process: “Previously, responding to customers’ backup status and service level requests required manually sorting through information on the backup servers and log files, then exporting data into spreadsheets and charting programs to create reports. By adding Servergraph to monitor, alert, and manage the backup operations, not only could we cost-effectively manage the backup processes, we could also provide comprehensive reporting without struggling to manipulate the data. 
We’re now able to implement a standard report for every customer with the information they need to assure them that everything is optimal. The most important thing for us to provide is confidence to the customer that the service is delivering on contracted levels, and we couldn’t get that capability from any of the competitive solutions. While confidence is not something that is directly measurable, backup success rate is, and with …, not only can we maintain a very high success rate, we can also easily prove it to the customers. The new cloud backup and recovery solution also helped customers reduce the risk of managing software licenses as their implementations scaled up or down. Cristie Nordic used the chargeback and automated billing capability to monitor capacity utilisation and growth by the customer. With these planning tools, the firm could monitor customers' licensed capacity and project when they would need to upgrade. The chargeback feature also delivered billing insight within individual customers' cloud-based backup operations. Chargeback provides even more confidence for a customer to sign a contract. Customers don’t get frustrated with the expense when we can proactively sit down and help them manage their business. They can see the value. [easy-tweet tweet="Providers need to show clients the advantages of moving to the cloud" hashtags="Cloud, Data"] Servergraph has enabled us to expand our MSP business quite rapidly. We have grown backup data under management from 100 TB to 3 PB within two years in a region of the world that on average has smaller companies with less data. Servergraph allows our product experts to remotely view the implementations and see what is happening with customers. If a problem occurs, we can help them immediately and deliver the highest quality solution.” The experience Cristie Nordic has had with transferring its clients’ backed-up data demonstrates to providers that the challenge of shifting this data is not insurmountable, even when faced with concerned customers. However, it does rely on these providers making clear to their clients the advantages of the move, using the right software to implement the move, and above all being willing to lead their customers in the right direction. ### Live Special: LGBT in Business and Tech with Pride in London 2017 Compare the Cloud and LGBT Pride in London Special: LGBT in Tech, today at 7:30pm. In collaboration with LGBT Pride in London 2017, Compare the Cloud & Disruptive Live are hosting a special edition of Pride in London Live on the topic of LGBT in Business and Technology. 
Join us for this 30-minute discussion show with our guests: Twitter, Deliveroo, LinkedIn and OUTstanding. ### Netclearance Releases mBeaconSAM, a Bluetooth 5 Payment Beacon The new mini-SIM-card-sized Bluetooth 5 payment device will allow tens of millions of existing payment terminals to offer next-generation API-based mobile transactions. Netclearance, providers of the world’s biggest network of smart Bluetooth payment terminals, today unveils the world’s first commercially available Bluetooth 5 payment beacon. mBeaconSAM comes in a mini-SIM-card-sized smart card that fits into existing card reader terminals and will enable them for IoT payment transactions. The device brings three key benefits to merchants and financial institutions: Upgrades existing payment terminals by adding IoT features. The device allows terminals to accept payments from mobile wallets in apps. It functions outside of the NFC technology used by Apple, Samsung and Google wallets. Adds proximity marketing capabilities. It acts as a beacon which brings all the benefits of in-store proximity marketing with Apple iBeacon and Google Eddystone-compatible technology. The technology is future-proof. Being the first commercial Bluetooth 5 payment beacon with a built-in secure element, it offers bank-grade security with the additional capability of over-the-air firmware upgrades. Bluetooth 5 offers better range, lower power and faster speeds, but it is fully backwards compatible with previous generations. mBeaconSAM is designed to meet the growing demand from merchants and financial institutions looking for alternative revenue streams from API-based mobile payments that use open wallets, outside of existing walled gardens. This democratisation of mobile wallets means that any app can include a wallet which could be used for any payment. This is in line with the upcoming Payment Services Directive, PSD2, which looks to create an equal playing field for payment service providers. mBeaconSAM upgrades existing payment terminals to accept payments from smartphones or wearable devices using Bluetooth Low Energy (BLE) technology. It is not limited by existing card-based NFC technology, enabling payments to be quickly made on iOS, Android and other Bluetooth devices. [easy-tweet tweet="Mobile payments are both attractive and safe for consumers " hashtags="Mobile, IoT"] Jonathan Duffy, EMEA executive director at Netclearance, says: “Merchants are using tens of millions of payment terminals around the world but the vast majority of these terminals are limited to card-based NFC technology that only works with the closed mobile wallets from Apple, Google and Samsung. mBeaconSAM has been developed to fill a void in the mobile payments and engagement infrastructure. It enables open app-based payments using Bluetooth. “Mobile payments are both attractive and safe for consumers and have the power to revolutionise the purchasing experience. For merchants and financial institutions, it potentially opens up new streams of revenue. “It’s really exciting to bring the first Bluetooth 5 payment and proximity marketing beacon to market, and adoption of the latest technology will ensure mBeaconSAM is supported for years to come.” In appearance, the smart card looks and feels like a mini-SIM card. However, under the hood, the smart card contains a high-speed processor, crypto-engine, secure memory and BLE 5.0 capabilities that interface with the terminal’s readers via the standard PSAM interface. 
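As a rough illustration of the proximity-marketing side of such a beacon, the sketch below decodes the standard iBeacon advertisement frame that a retailer’s app might observe when near the device. This is a generic illustration of the published iBeacon frame layout only – the identifiers are made up and the code is not Netclearance’s SDK.

```python
# Parses the manufacturer-specific data portion of an iBeacon advertisement.
# Frame layout: Apple company ID 0x004C (little-endian), beacon type 0x02,
# payload length 0x15, then a 16-byte proximity UUID, 2-byte major, 2-byte
# minor and a signed TX-power byte (calibrated signal strength at 1 metre).
import struct
import uuid

def parse_ibeacon(mfg_data: bytes):
    """Return (uuid, major, minor, tx_power) or None if not an iBeacon frame."""
    if len(mfg_data) < 25 or mfg_data[0:4] != b"\x4c\x00\x02\x15":
        return None
    proximity_uuid = uuid.UUID(bytes=mfg_data[4:20])
    major, minor = struct.unpack(">HH", mfg_data[20:24])
    tx_power = struct.unpack("b", mfg_data[24:25])[0]
    return proximity_uuid, major, minor, tx_power

# Example frame with entirely made-up identifiers a retailer's app might see:
sample = (b"\x4c\x00\x02\x15"
          + uuid.UUID("12345678-1234-5678-1234-567812345678").bytes
          + struct.pack(">HH", 100, 7)      # major = store, minor = aisle (illustrative)
          + struct.pack("b", -59))          # calibrated TX power in dBm
print(parse_ibeacon(sample))
```

An app that recognises the UUID can then trigger an offer, a loyalty prompt or a payment flow; the major and minor values are typically used to tell stores and locations within a store apart.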
The card performs a variety of functions including access control, mobile payments acceptance, mobile proximity marketing, workforce management and secure customer identity verification. mBeaconSAM joins Netclearance’s leading portfolio of BLE-based payment terminals, unattended payment and mobile ticketing solutions widely used in Scandinavia and other regions. All Netclearance solutions utilise low-power wireless technology and feature open APIs, mobile SDKs, and advanced analytics capabilities. ### Optimising Hybrid Environments Through Increased Visibility In recent years, cloud has become a key enabling technology for digital transformation. According to a study by Harvard Business Review, enterprises are benefiting from the flexible capacity, business agility, and lower fixed costs it provides. As a result, a growing number of organisations are migrating applications and information stores to public and private cloud solutions, so that hybrid IT architectures – where information is stored in the cloud as well as on local systems – have become the norm. However, while hybrid environments can provide significant IT and business benefits, they also leave users coping with poor application performance on an alarmingly regular basis, reducing employee productivity, making operations less efficient, and impacting the bottom line. This creates a significant gap between the performance needs of businesses and IT’s ability to deliver. With applications, data and users scattered across branch offices and other remote locations, optimising and delivering great application performance is becoming increasingly difficult. As a result, IT engineers spend 10 to 50 per cent of their time on root-cause analysis, endlessly searching for the sources of application performance problems in a sea of complexity. Achieving complete visibility across all code, data, networks, and end-user devices will help them maintain control over these complex environments, and take a fully integrated, proactive, agile approach to managing application performance that is directly tied to business value. Cloud adoption may be inevitable, but it’s not easy As enterprises move to deploy cloud architectures, the network becomes the key factor in determining the success or failure of cloud initiatives. At the same time, many of the demands of today’s enterprise environments are at odds with the network’s original design assumptions. Three key trends highlight the problem: Dispersed workforce: More users are working remotely than ever before, either from home offices or an ever-expanding number of branch offices. In fact, today nearly 80 per cent of enterprise employees and contractors are located in branch and regional offices, resulting in a dramatic increase in the amount of traffic on the enterprise’s wide-area network (WAN) for remote offices and branch offices. Cloud-based services and applications: Lured by benefits such as pay as you go, automatic upgrades and reduced infrastructure spend, enterprises are increasingly adopting SaaS. This places a strain on legacy networks, as they require remote traffic to pass through the datacentre in order to access SaaS resources, unnecessarily loading the WAN and introducing latency that can negatively impact application performance. Bandwidth-intensive business usage: applications such as teleconferencing, remote training and multimedia presentations are putting intense pressure on networks with a finite amount of bandwidth. 
Due to their increasing use, 48 per cent of respondents to a recent survey of enterprise IT departments expect bandwidth requirements to double by the end of 2017. Given that traditional architectures are often rigid and difficult to scale, not to mention overly complex to manage, these findings put the network on a collision course with the bandwidth needs of the enterprise going forward. [easy-tweet tweet="Today’s organisations tend to use a range of monitoring and troubleshooting tools" hashtags="Cloud, IT"] As a result, achieving network agility has become a common stumbling block. Applications can fail or underperform for a multitude of reasons. The application code can be problematic, network faults can create connectivity issues, or servers can fail. Even worse, IT can remain “in the dark” about why their enterprise applications are running slowly, as nearly two-thirds (64%) of enterprises continue to use a fragmented approach to technology monitoring, which cannot possibly deliver end-to-end visibility. Today’s organisations tend to use a range of monitoring and troubleshooting tools. But, because each domain has a different perspective and restricted view, engineers often arrive at conflicting conclusions when looking to solve application performance problems that are crippling the business. Separate IT teams see only part of the transaction, and effective communication between those teams is challenging because they are using different, unsynchronised metrics. As a result, IT wastes time trying to determine the root cause for performance problems instead of fixing them. This translates into a reactive approach which falls short in the complex, application-driven world. What true visibility is all about Optimising application performance is a lot easier when organisations have a clear view of any problems that arise across the entire network – plus the right tools to fix them. Regardless of whether applications are deployed on-premises or in the cloud, new technologies provide end-to-end, unified visibility into applications and the networks they run across, all the way out to the end-user experience. They provide organisations with visibility, optimisation and control, thus offering the necessary insight to ensure optimal performance of enterprise applications, while maximising IT efficiency and productivity. As a result, issues that previously might have taken weeks to resolve, now only take days. When organisations see and understand exactly what's going on across all networks, data centres, ISPs, branch offices, cloud services, and remote users, they can free up IT’s time to develop innovative solutions, plan capacity, and proactively identify and address emerging issues. Equally important, IT can also configure application infrastructures so that they respond to the organisation’s needs. Ultimately, according to a Riverbed report, for organisations worldwide, IT’s improved visibility into application performance results in increased productivity (56 per cent) and revenue (43 per cent), as well as improved customer service (54 per cent), product quality (49 per cent) and employee engagement (46 per cent). Organisations running applications in the cloud should not have to sacrifice control for convenience and cost. With cloud adoption growing, and the massive amount of data produced across an enterprise network, ensuring applications are performing at optimum efficiency is more important than ever. 
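As a simple illustration of what outside-in, end-user-experience monitoring involves, the sketch below times a single HTTP request against an assumed application endpoint and compares it with an assumed acceptable threshold. The URL and threshold are placeholders; real visibility suites correlate many such measurements with network, server and code-level data.

```python
# Minimal synthetic "end-user experience" check: fetch an application's URL
# the way a user's browser would and record how long the response takes.
# This only captures the outside-in view; it is a sketch, not a product.
import time
import urllib.request

APP_URL = "https://app.example.com/health"   # hypothetical application endpoint
THRESHOLD_MS = 1500                          # assumed acceptable response time

def check(url: str) -> None:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            elapsed_ms = (time.monotonic() - start) * 1000
            ok = resp.status == 200 and elapsed_ms <= THRESHOLD_MS
            print(f"{url}: HTTP {resp.status} in {elapsed_ms:.0f} ms "
                  f"({'OK' if ok else 'degraded'})")
    except OSError as exc:
        print(f"{url}: unreachable ({exc})")

if __name__ == "__main__":
    check(APP_URL)
```

Scheduled from branch offices and remote locations, even this crude measurement starts to show where the experience degrades, which is the starting point for the end-to-end visibility described above.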
In the pursuit of agility, businesses will only be able to keep their noses in front if they have a clear picture of application performance across the network, regardless of where they are located. Once this is taken care of, they will have a far better idea of what is needed to ensure they deliver and maintain business-critical applications within a hybrid environment, ultimately ensuring increased productivity across the enterprise, improving the bottom line. ### Ransomware: To Pay or Not To Pay? If you don’t yet know about ransomware, you soon will. It’s been called the most profitable type of malware ever released. Bad guys deploy it on a computer or network and hold the data hostage until a fee is paid to get it back. Just last week the WannaCry ransomware strain went global, impacting computers in more than 150 countries and wreaking havoc on Britain’s National Health Service, Spain’s Telefonica and France’s Renault automobile factory. Ransomware is usually brought into a company when a user clicks on a bad link in an email and cyber criminals download the malware to the user’s computer or network without their knowledge. It happens fast, and sometimes you can’t tell it is there. Most antivirus programs do not detect it as it is rapidly changing with new variations every day. Being successfully hit by a ransomware attack can set a business back 50 years into “pen and paper” management and the ransom can get very high. The WannaCry attack charged £240/machine, which adds up very quickly, particularly for small and mid-sized businesses (SMBs). It’s time to face facts: Ransomware has become a “when not if” scenario for businesses of all sizes. To pay or not to pay is ultimately a business decision and one which most organisations do not make lightly. How did these ransomware victims get in such a dire situation? In most cases, an employee opened up an attachment to an email that looked legitimate, but the attachment was malicious and infected the network. If a business has up-to-date backups, ransomware is no problem. But where a company’s backup/restore strategy has failed, it can go wrong in several ways. Perhaps it was thought their backups were being regularly made, but in reality, the process was broken, and no backups were being created at all. That amounts to a high-wire act without a safety net because hard disks sooner or later always fail. Or, backups were created, but the restore process failed and the business could not get its files back. This happens more often than you might think. Victims like this find themselves in an extremely uncomfortable position: pay an organised cybercrime network thousands of pounds and hope to get the files back, or be confronted with lost days, weeks, or months of work? Understandably, you could take the perspective that it’s a matter of principle and that you should never pay ransom to criminals. That is easier said than done when you look at the potential damage a ransomware attack can bring to your organisation. To pay or not to pay is ultimately a business decision. Nowadays there are different types of ransomware infections. The low-grade spray-and-pray phishing attacks that only infect one workstation are relatively easy for IT to fix. Wiping the workstation and rebuilding it from scratch takes about 20 minutes. It gets worse when that infected workstation had a connection to a file server, and all the files on that server were encrypted during the infection. 
Now a whole group of employees are left to sit on their hands without access to their files. [easy-tweet tweet="The ransom usually soars up to anything from £30,000 up to £75,000" hashtags="Ransomware, Backup"] The worst case is when the ‘bad guys’ have been on the network for quite some time and were able to infect all of the organisation’s machines and lock them all at the same time. This type of network compromise sets an organisation back 50 years into “pen and paper” management, with the ransom usually soaring to anything from £30,000 to £75,000. Healthcare organisations in particular – with X-ray machines, MRI equipment and other medical devices that run Windows – are vulnerable to attacks like this, with lives literally in the balance. However, any organisation, large or small, is in the firing line for this new type of internet extortion. What to do about this? It is crucial to start with a so-called defence-in-depth strategy to protect your network: Weapons-grade backups that are tested regularly to see if the file-restore function actually works properly. Ensure all software is up-to-date. That means the Windows software, and also patch all third-party applications that are running in the organisation. Run updated antivirus software but don’t rely on it. Today’s antivirus often does not catch ransomware. Identify users that handle sensitive information and enforce 2-factor authentication. Check your firewall configuration and make sure no criminal network traffic is allowed out. Ultimately, your employees are the weak link in your IT security. The standard defence-in-depth strategy above is a good start, but it’s missing a critical step: educate your users so they can thwart ransomware before it can infect your workstations or your network. A good security awareness training programme will help educate users on what types of red flags to look for so they don’t make a mistake that puts your business at risk. Taking the time to question an email and call the person who supposedly sent it could save time, money and frustration for the entire company. Let’s stay safe out there. ### Banking, Healthcare and Insurance – Is your Tech Compliant? With the ever-increasing demands for compliance within the regulated sectors, one element that is often overlooked is the extremely important terminal emulator. This is the emulator that allows you to access, operate and utilise the mainframe, Unix, VMS, iSeries and other infrastructure that is the transactional lifeblood of the majority of organisations. In a recent interview with Rhiannon Dakin, Marketing Manager at Flynet, we explored why this oversight takes place and the solution, with a surprising return on investment. [easy-tweet tweet="Terminal emulators are often seen as evil in the transaction-heavy, regulated sectors" hashtags="Technology, Regulation"] In the last month Flynet has helped five insurers and two banks with exactly this challenge and is boasting about it, and why not? In a recent testimonial, you can see why Canada Life (founded in 1847, Canada’s first domestic life insurance company) is very happy! 
The security features that enabled them to pass a thorough KPMG penetration test include: SSL v3.0 and TLS v1.0/v1.2 encryption, a FIPS 140-2 validated cryptographic module, and support for NT domain, Active Directory, Azure Directory and multi-factor authentication, as well as most off-the-shelf and proprietary IAM and SSO solutions, with templates available for integration to get you running from the second you deploy. Terminal emulators are quite often seen as a necessary evil in the transaction-heavy, regulated sectors, but how often do you have to patch, maintain and support these emulators? For example, most emulators rely on Java/ActiveX plugins, which are sadly no longer supported by the major browsers, with Oracle also announcing the deprecation of the Java Applet, which it will no longer be supporting or patching! What are you going to do? I’ll tell you what you should do: look at a pure HTML emulator and evolve. One industry that is being hit particularly hard is the US healthcare sector. Think hospitals, health insurers, policy holders, pharmacies, pharmacy data managers (PDMs) and, in a word, HIPAA. If you haven’t investigated this before and you are working in the healthcare industry, you should have! HIPAA mandates secure data access controls and audit trails that are not easily satisfied by other emulators. It’s Tech Week very soon, so why not sign up for a free audit, where Flynet will look at your existing emulator and help you ascertain whether you could be exposing yourself to any risk? Click here for your free emulator audit to see if your tech is compliant! Take a risk-free trial that brings your green-screen technology into a digital transformation! ### What is the Hype Behind Moving to the Cloud? Few transformational technologies have generated as much hype and confusion as the cloud. Companies have concerns about security and performance and are nervous about handing control of their infrastructure over to somebody else. That hasn’t stopped the acceleration of cloud adoption, as small organisations and the largest global enterprises alike are increasingly turning to the cloud for some, and sometimes all, of their infrastructure and software. So why is this? It’s easy to credit cost as the primary reason for the uptick in cloud adoption because it can be, in fact, cheaper compared to on-premises hardware. While the cloud can certainly be cost-effective as organisations only pay for the resources they use, there are three further key reasons why companies turn to the cloud – and never look back. [easy-tweet tweet="With the cloud, companies can start deploying an entire platform CRM" hashtags="Cloud, CRM"] Fast deployment Let’s discuss what it would take to put together an end-to-end technology like a CRM solution. A company must start by putting together a task force to research all the possible deployment solutions, purchase the correct hardware to support it, and then hire an entire team to deploy and manage it. 
When the company grows, it will have to revisit these arduous steps time and time again just to ensure its systems can scale – this is both an inefficient use of employees’ time and a costly process. With the cloud, companies can start deploying an entire platform: CRM, a data warehouse, and even an analytics platform, at nearly the push of a button. They don’t need to spend time configuring hardware, maintaining systems, or upgrading physical servers. This means, without investing a huge amount of time and resources, companies can have faster access to world-class infrastructure, enterprise-class security, and reliable access to their technology. Companies often commit enormous resources to build multiple redundancies of internal systems in case one fails – something that is easier, and typically free, in the cloud. Scalability with business demand Unpredictability is inherent in business. This makes managing infrastructure increasingly difficult. Companies need teams of people to measure capacity as ordering and deploying additional hardware can be a painfully slow process. The cloud alleviates all these concerns by allowing organisations to scale up and down as their business changes. It also eliminates the need to suddenly acquire additional infrastructure to deal with unpredictable spikes in demand. Furthermore, the cloud provides flexibility in how companies grow in other ways, including the use of many different technologies and how they complement each other. With the cloud, companies can select from a variety of options – such as data services, storage options, and operating systems – instead of locking themselves into previously purchased on-premises solutions. If their needs change, swapping services in the cloud is just like starting a new one altogether, all accomplished at the push of a button. It’s where business is going While traditional technology solutions like CRM, marketing automation, and data analytics are moving away from on-premises technologies to the cloud for the two reasons above, the cloud is also where tomorrow’s technologies will be built – meaning that sooner or later, companies will have to get on board. Every day, something new – a data service, an analytics tool, even a storage solution – is built in the cloud and released into the world. As time goes on, companies will become less and less likely to integrate new, cloud-built technologies with existing legacy solutions, which are built on closed platforms, running on inflexible hardware, and sit behind firewalls. Companies with their infrastructure and software in the cloud will be ready to embrace and integrate these innovations, empowering their employees to tackle today’s biggest business challenges. The cloud is also simultaneously secure and available anywhere – perfect whether a sales team makes calls from an office or sells pharmaceuticals in a doctor’s office using mobile devices. Today’s businesses must accelerate innovation, move faster, and become more reliable than ever, making integrating with the cloud now a profound competitive edge for the future. ### Consumer Preferences Shift to Digital Content, Reveals Limelight Networks Consumers worldwide increasingly depend on smartphones to access digital content and tend to go online to purchase video games, movies and books, according to the latest “State of Digital Downloads” research report from Limelight Networks, a global leader in digital content delivery. 
Highlighting consumers’ shifting demand for both streaming and downloading content, the annual report found that only 14 percent of all respondents still prefer to rent or purchase DVDs of movies and TV shows, and only 25 percent prefer hard copies of books or traditional print media. In comparison, two-thirds (66 percent) of respondents prefer to stream or download TV shows and movies, while 38 percent prefer to download books, newspapers, and magazines. When it comes to music, almost half (46 percent) of respondents say they prefer to download music rather than stream it or purchase a CD. Comparatively, European respondents’ entertainment consumption is similar, with 13 percent of European consumers preferring to rent or purchase DVDs of films and TV shows. Around 15 percent of German and UK respondents opt to consume entertainment through these traditional mediums, compared to just 9 percent of their French counterparts. The preference for streaming rather than downloading content continues to grow. Notably, almost two-thirds of European consumers (62 percent) prefer to download or stream TV shows and movies. The highest streaming rate is observed in France at 49 percent, compared to 42 and 40 percent in the UK and Germany, respectively. [easy-tweet tweet="Digital content is the preferred format for media consumption by a growing mobile-first audience" hashtags="Digital, Mobile"] More than one-third (37 percent) of German consumers choose to download books and magazines, with the UK (36 percent) and France (30 percent) not far behind. German respondents are 58 percent less likely to consider obtaining hard copies. Similarly, respondents in Germany and the UK (34 percent) opt to download music in comparison with less than a third (25 percent) of French consumers. Instead, French respondents are 56 percent more likely to stream music. “Digital content is now the preferred format for media consumption by a growing mobile-first audience,” says Michael Milligan, Senior Director at Limelight Networks. “There’s no question that content needs to be easily accessible and optimised across all connected devices and global networks if it’s to reach the widest possible audience and provide the best experience. This is no longer a feature for providers, but a necessity for survival.” Additional insights from the report include: Consumers want free content. When it comes to accessing music, half of consumers (51 percent) will only download it if it’s free. British (56 percent) and German (44 percent) audiences are significantly more likely to accept having to pay for music than their French neighbours (29 percent). A further 74 percent will only download a mobile application if it is free. However, this pattern shifts when it comes to books and movies, which customers are more willing to pay for (40 percent are willing to pay to download TV shows and movies). Consumers expect fast downloads. Nearly one-third (30 percent) of all respondents highlight slow download times as their primary frustration with downloading content. Japan, in particular, has little patience for slow downloads, with 41 percent of respondents citing this as their top frustration. Consumers in the UK also cite slow downloads as their number one frustration. Interestingly, concerns around downloading vary significantly across European consumers. The bigger concern for German consumers (32 percent) is when it doesn’t work. 
By contrast, French respondents are more frustrated by content that has to be started all over again (29 percent). New apps and app updates are the most common type of content downloaded. In fact, these are downloaded 22 percent more often than music, the second most downloaded type of content. The Internet of Things (IoT) hasn’t gained widespread adoption yet. Two-thirds (66 percent) of consumers do not yet have devices such as digital assistants, home automation hubs, or internet-connected thermostats and have no plans to purchase them in the next two years. However, consumers aren’t avoiding IoT due to security concerns. In fact, less than 30 percent of all respondents express a security concern with either digital assistants or smart home devices. This is reflected in the attitudes of European consumers. Respondents in the UK are the most concerned (28 percent), followed by Germany (26 percent) and France (21 percent). UK consumers have the most privacy concerns, with just over a quarter (26 percent) seeing this as a major issue around IoT devices. The “State of Digital Downloads” report is based on a survey of 3,500 consumers ranging in age, gender, and education in France, Germany, India, Japan, Korea, the UK, and the U.S. The complete “State of Digital Downloads” report is available here. ### What are the Benefits for Companies Who Move to Cloud Accounting? The shift towards a more digital future and the impending introduction of HMRC’s “Making Tax Digital” initiative fundamentally changes the way we will do business, and also the needs and demands from customers. At Albert Goodman, we already have over 70 staff trained in Xero, Quickbooks and Sage One, and over 400 clients reaping the benefits of moving their accounts to the cloud. Online accounting platforms allow businesses to create invoices, pay bills and keep up to date with their finances on the move. Similarly, Making Tax Digital (MTD) is HMRC’s drive to give the UK the most digitally advanced tax system in the world – and will ultimately require businesses to submit quarterly tax returns via digital platforms. At the moment, MTD plans are on hold due to the General Election, but they haven’t been scrapped, and the general feeling across the profession is that it is very much still coming and needs to be prepared for. However, we aren’t looking at MTD as the key driver for moving to the cloud; we are already moving our clients for the many benefits it brings them, and the ability to comply with MTD will be an additional benefit. The benefits of Cloud Accounting have been well publicised, but they’re worth revisiting. The key reasons are: Time saving Cloud Accounting systems allow you to upload or automatically bring “live feeds” of your bank account and credit card transaction data into your accounting stream. This, along with other smart features, means it’s quicker and easier to stay up to date with your bookkeeping, and you don’t need to wait for the paper bank statement to arrive in the post because you don’t need to enter the bank transactions – meaning time spent on bookkeeping will be drastically reduced. Real time information, all the time Even when you are working on the file, your accountant can be working on the same file at the same time. This allows us to interact with you in real time and assist you with any questions live. You get a clear overview of your financial position in real time, enabling you to make quicker and smarter business decisions, and enabling us to advise you better. 
[easy-tweet tweet="Cloud Accounting ensures there are no more expensive upgrades" hashtags="Cloud, Accounts"] Flexibility With Cloud Accounting software, you have the option to run your business remotely from anywhere, from any device with an internet connection. You can run your business from work, home, or on the go, and you can be confident that you have an up-to-date picture of how your business is doing, no matter where you are. You can see your business account balances, outstanding invoices, overall cash position and much more from anywhere, 24/7. Cost savings Cloud Accounting ensures there are no more expensive upgrades, up-front licence fees or downtime – Cloud software is updated live at no extra cost and means you don’t have lots of different versions of the same file. You don’t need to worry about installing the latest version or waste time waiting for significant upgrades to take place, as you receive access to new features and updates instantly. All your files are saved in the Cloud, so you don’t have to spend extra time or money on data backups, as your data is automatically saved and accessed (by you) remotely. If you need to give access to someone else, another benefit includes an unlimited number of users at no extra cost. All of this for a monthly subscription – which equates to around the same price as a mobile phone contract. Having worked as an accountant for many years before the digital revolution, I can vouch for what game-changing features cloud accounting technology is providing to both accountants and business. As our lives are becoming more and more digitalised, it should come as no surprise that business finances are moving in the same direction at a rapid pace. We no longer take our holiday photos to be developed and wait a week for them to come back before we can see them, so why do we still wait weeks for the bank statement to arrive in the post and then spend hours reconciling our records to the statement? Cloud Accounting gives us the opportunity to streamline our finances and ensure bank transactions are fed directly into our accounting software on a daily basis, giving the software the data to match transactions ready to be reconciled – resulting in a couple of minutes’ worth of work, as opposed to hours or days. Cloud Accounting also benefits accountancy firms, such as Albert Goodman, because it means all data is in the same format and accessible in the cloud, which enables us to analyse and provide clients with valuable KPIs and benchmarking information on an ongoing basis. The centralised and automated cloud system allows us to spend more time providing added value services to our clients, such as integrating apps designed specifically for certain business sectors into their cloud package – for example, for retail businesses we can integrate point of sale apps in their accounting software, or we can support marketing firms by integrating sales pipelines into theirs. The list of potential options is endless. As accountancy firms move to the cloud and old fashioned processes become automated, businesses should expect to receive more than just the basic compliance work from their accountant. I would urge businesses to make the most of an accountant who is comfortable with cloud accounting software. The knowledge that an accountant has, combined with the latest cloud technology, is a powerful tool when making informed business decisions about your business in real-time. 
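To illustrate the kind of automatic matching that makes live bank feeds such a time saver, here is a deliberately simplified Python sketch: imported bank-feed lines are paired with open invoices on amount and reference, leaving only the unmatched lines for manual review. It is illustrative only, with made-up data, and is not how Xero, QuickBooks or Sage One actually implement reconciliation.

```python
# Toy illustration of automatic bank reconciliation: each imported bank-feed
# line is paired with an unpaid invoice of the same amount whose number
# appears in the payment reference. Simplified sketch with illustrative data.
from decimal import Decimal

invoices = [  # open invoices in the ledger (illustrative)
    {"number": "INV-1001", "customer": "Acme Ltd", "amount": Decimal("250.00")},
    {"number": "INV-1002", "customer": "Bristol Foods", "amount": Decimal("99.50")},
]

bank_feed = [  # lines pulled in from the live bank feed (illustrative)
    {"date": "2017-06-01", "reference": "ACME LTD INV-1001", "amount": Decimal("250.00")},
    {"date": "2017-06-02", "reference": "CARD PAYMENT 4421", "amount": Decimal("12.99")},
]

def match(line, open_invoices):
    """Return the first open invoice matching on amount and reference, if any."""
    for inv in open_invoices:
        if inv["amount"] == line["amount"] and inv["number"] in line["reference"]:
            return inv
    return None

for line in bank_feed:
    inv = match(line, invoices)
    if inv:
        print(f"{line['date']} {line['amount']} -> matched to {inv['number']}")
    else:
        print(f"{line['date']} {line['amount']} -> needs manual review")
```

In practice the software applies far richer rules and learns from past matches, but the effect is the same: most transactions arrive pre-matched, and the bookkeeper only reviews the exceptions.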
### Six Signs That Your Mobile App Needs an Instant Update In this competitive world, mobile applications play an important role in your business – for many companies, they are almost everything. Once the first version of the application has been developed, typically in 4 to 6 months depending on the type and size of the app, the very next step is to maintain and support that application. Many popular and successful apps see updates as often as weekly, whereas other release cycles may happen once or twice a month. It is important to figure out how and when to update your mobile application, because mobile apps are very different from traditional desktop software. In this post, we will look at the six signs that your mobile application needs an instant update, so continue reading: To Update the App’s Design Design plays a vital role in any product’s development, and if your app’s design is going to change – whether the user experience or the visual design – it is likely that you need to release a major app update. Remember that redesigning is not as easy as you might think, because you have to wipe away what you already have and think of something new that your existing design doesn’t offer. For example, the Uber application jumped from v2.x to v3.x when it released its new design in 2016. A new design for your app therefore demands a major update. New & Exclusive Features When you decide to include new and exclusive features in your application, it is time for a major update. If you want to add important features that can improve the user experience, your app needs an update. When you decide to bring new features to your app, you need to do a lot of research and refine your ideas into offerings that deliver high value to users. Authentication syncing across devices and integration with a third-party service are examples of major features. Including only a minor feature may affect just the minor version number; adding better offline support, 3D Touch or similar features may simply move an app from v1.2 to v1.3. Users’ Feedback All the hard work, effort and time you put in is ultimately meant to serve users. So, you should keep communicating with your users on different forums and social media channels so that they can share their feedback about your application. If you get opinions or ideas from users, note them down and combine them with other ideas when you plan the next version of the application. And if users are constantly complaining about any feature of your app, it’s high time to rework it and ship the update. Enhancing Performance If your application is already in the market and you want it to perform well with your target audience, you should measure its performance. The performance of an application can be judged by the level of engagement and the value it creates for users; it is directly related to the success your app achieves and the revenue it generates. You also need to know whether it is performing well and getting repeat visitors, how much time users are spending in your application, and what the bounce rate is. 
If your application is falling short on any of these performance parameters, it is high time to update it to improve its performance. [easy-tweet tweet="In-app years at two years old code is considered as old and outdated" hashtags="App, Mobile"] Your App’s Codebase is Outdated Did you know that, in app terms, two-year-old code is already considered old and outdated? Swift, as you know, is only four years old – before that, it did not exist. Technology changes so rapidly that you will need to update your code, and that would be a major app update. Once you decide to update your code base, it needs a lot of precision, dedication, and commitment. Moreover, if your app’s code is lengthy and does not follow the standards and conventions of current programming practice, you need to update your app. Deciding to Expand the App to Other Devices or Platforms A major update is on the cards when your application expands across devices on the same platform or moves to a new platform. This might mean developing a tablet-specific version or going from Android to iOS or vice versa. Expanding to other platforms and devices means reworking the backend services or the app’s data model, and it will also impact the user experience and visual design. Each design needs to feel native to its platform while offering a unified experience across platforms and devices. This results in a number of changes that filter back into the user experience and design, and visual design and user experience changes can in turn impact the back-end servers. These are the six signs that your mobile application needs an instant update. If you are noticing any of these signs in your app, you can update it and make it more powerful to deliver a rich user experience. However, you can also opt for customisation if you want to improve any part of your app, but make sure to get in touch with a veteran team of mobile app developers who have experience of customising. ### The Move From Working For IT Services to With Them Having recently made a move from working for IT services vendors for many years, to work in a corporate IT department, I have to re-orient the way I think about a lot of things. One aspect that is great though is finding out that some things that I have believed to be true are now proving themselves to be completely correct - despite the hype often present throughout the IT industry. (The many things I’ve been wrong about we’ll delicately sweep under the carpet…). In my new role, I am seeing real-life cases of business units being able to innovate faster, accelerate towards their strategic goals, and scale to customer demands in ways they just couldn’t have done previously, all by using cloud computing. I am coming across former cloud sceptics who have now realised that having many of the daily maintenance tasks that used to consume their time provided automatically, as part of a broader cloud platform service, is freeing them up to do more productive and innovative work. But the biggest surprise of all was finding IT security professionals - working in what many people would negatively describe as a “regulated” vertical – who were completely open to using cloud computing! For years I was led to believe, almost religiously, that there was no way InfoSec professionals in regulated industries would allow corporate applications to reside in a public cloud. 
“They just couldn’t,” I was told with an aura of disbelief at even the utterance of such an absurd idea! But it just didn’t seem to ring true with the rest of the facts. Almost every story of a security or data breach that made it to the press, and there wasn’t a shortage, seemed to reference internal data centres or careless use of mobile devices. Where was all of the data that was surely pouring out the back of the public clouds going to, and why was no-one screaming about it? And how on earth were the big cloud vendors claiming such growth when they were only being used for test/dev workloads, or by developers that were going rogue on their bulging personal credit cards? [easy-tweet tweet="The idea that cloud is inherently insecure is fast going the way of the minidisc " hashtags="Cloud, IT"] Instead, I found an Information Security team that realises that not only is it possible to apply security standards and controls equally across the data centre and cloud, but that in many instances it is both simpler and cheaper to apply the required security controls in a cloud environment. Why invest in, deploy, maintain, patch, and power separate solutions for each aspect of your security requirements onsite, when in the cloud you can often ‘check a box’, select a few configuration options, and you’re done? The idea that cloud is inherently insecure is fast going the way of the minidisc – it seemed a great idea initially, but has been quickly proven not to be the case. But more than that, I think it’s probably now simpler to secure your data inside a cloud platform than it is outside. And I’m not even trying to sell you anything anymore! ### The Safety of the Cloud When Processing Payments Jo Gibson, Operations Director at First Capital Cashflow, a leading cloud-based payments bureau, discusses the safety of the cloud when it comes to processing payments. In the constantly evolving digital age, the cloud has been one of the key components of the rapid success of online payments and seamless transactions of money globally. However, recent high-profile data breaches at organisations including Talk Talk, Three Mobile and, not to forget, the NHS have put cyber-security under the microscope. Business owners are becoming increasingly concerned about how they can protect customers’ confidential data, especially if it contains financial information. Research conducted by cloud specialist Zero2Ten has found that 60 per cent of people cited concerns about data security as a barrier to the adoption of cloud-based payment services. Businesses are being put off using cloud-based technology, but are they, in fact, missing a trick? Or do they need more information about its benefits? Safety of the cloud When processing payments using automated methods like Bacs Direct Debit and Direct Credit, the cloud is proven to provide comprehensive security benefits that are far superior to paper-based or in-house software solutions. [easy-tweet tweet="The cloud offers access to security benefits that aren’t available with in-house technology " hashtags="Cloud, Security"] Hosting financial data in-house can present business owners with problems and stresses they don’t need on their plate. Expensive upfront costs of software installation combined with costly ongoing upgrades and licence fees are an unwelcome sight. The cloud offers access to a whole host of security benefits that aren’t available with in-house technology – or most certainly would come with a high price tag. 
The benefits include things like continual updates to technology, email hacking prevention and constantly updated antivirus software. Highly advanced cloud providers even appoint ethical hackers to regularly and meticulously test their systems, constantly keeping them one step ahead of potential breaches. A great benefit that the cloud has to offer is that it can be accessed anywhere at any time. With data being stored in the cloud, business owners have the flexibility to gain access to this information at their convenience. This doesn’t just help the practicality of the working day; it means that actions can be taken fast if a security breach occurs. For example, if a laptop is lost or stolen, data can be accessed immediately from any location in the world and wiped to prevent it falling into the wrong hands. Further, if a burglary takes place at headquarters or even if a natural disaster occurs, financial data can be remotely accessed instantly, and the situation can be contained. It is well documented that missing laptops are often the source of data breaches, so an ability to remove data from these systems remotely is a major advantage of cloud payments technology and something that currently isn’t communicated to businesses and their customers. Stalwart data centres Another advantage of the cloud, and probably its biggest safety selling point, is the ability to host information in a secure, offsite data centre. Hosting sensitive financial data in high-security data centres is one of the biggest security advantages and something security and IT professionals need to capitalise on. Data centres provide a cast-iron level of security; they are the highest-quality environment possible for data storage, with experts in their field on hand to tackle any unwanted incidents. Moreover, the UK has a number of data centre sites across the country, so leading cloud providers can store information across multiple locations, further reducing the risk of a hack or breach taking place. Using the cloud and secure data centres, in turn moving sensitive information away from a company’s headquarters, mitigates against a range of disaster scenarios and further cements the argument that the cloud is the safest tool when it comes to protecting sensitive financial information. We’re only human Based on an analysis of over 450 cyber incidents the firm had handled, BakerHostetler’s 2017 Data Security Incident Response Report concluded that employee action/mistake was the cause of 32% of all incidents. Many SMEs look to technological failure as the one and only reason for the recent insecurity surrounding the cloud. This key misconception has to be addressed. In reality, human error lies at the heart of the majority of breaches. For a business owner, opting to process payments using cloud-based technology significantly reduces the possibility of human intervention. Encryption With a cloud-based service from a leading provider - like those that handle financial data - delicate customer information can be encrypted, reducing the risk of any unwanted breaches. Multiple layers of sophisticated encryption act as a barrier. So, in the unlikely event of a data breach, highly sensitive information is more likely to remain secure. Indeed, these layers of encryption could be implemented for in-house software, but not without significant upfront costs to business – something only larger organisations would be able to foot the bill for. 
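As a minimal sketch of the principle, the example below uses the third-party Python “cryptography” package (pip install cryptography) to encrypt an illustrative payment record so that, without the key, the stored data is worthless. Production payment platforms layer this with transport encryption, encrypted storage and keys held in an HSM or managed key service; the record and key handling here are purely illustrative.

```python
# Minimal sketch of application-level encryption for sensitive payment data.
# Requires the third-party "cryptography" package. In production the key
# would live in a key-management service or HSM, never alongside the data.
from cryptography.fernet import Fernet

key = Fernet.generate_key()        # in practice, held in a key-management service
cipher = Fernet(key)

bank_details = b"sort_code=12-34-56;account=12345678"   # illustrative record only
token = cipher.encrypt(bank_details)                    # what actually gets stored

print("stored ciphertext:", token[:40], b"...")
print("recovered:", cipher.decrypt(token))
```

The point the article makes holds in miniature here: a stolen copy of the stored token reveals nothing without the key, which is why encryption combined with strong key management is such a large part of the cloud’s security case.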
The future
There are many advantages for business owners in investing in cloud-based technology, whether to process payments or for any other related expenditure. As the first cloud-based service provider in the Bacs sector, we knew security was paramount, and our cloud service has stood the test of time. Time and time again, we see businesses embrace the potential of the cloud to enhance their payment processing systems, safe in the knowledge that our systems, supported by experienced in-house developers, have allowed clients to securely process payments for the last 15 years. As long as businesses use a reputable and trustworthy cloud provider, then the cloud is one of the most secure methods of hosting and handling data.
### AWS IoT Button is Now Available for Customers in Europe
The AWS IoT Button is now available in Europe. The button can be ordered in the United Kingdom, Germany, France, Italy and Spain from the local Amazon retail websites. [easy-tweet tweet="The AWS IoT Button is a Wi-Fi device" hashtags="AWS, IoT"] The AWS IoT Button is a Wi-Fi device that can be used to test, pilot and deploy innovative ideas without the need for the development of additional hardware, firmware or networking protocols. A wide variety of use cases can be addressed using the AWS IoT Button, including home automation, ordering, feedback, notification, integration with third-party APIs and many others. The button can be easily configured using the AWS IoT Button Developer mobile app. The latest version of the mobile app can be downloaded from the Android or iOS store. Using a simple configuration, developers can get started working on IoT projects. You can find out more information on the product here.
### Cloud Capabilities – A Digital Transformation Enabler
UK shoppers spend more than £1bn per week online, and as a result, our town centres are changing. From badly impacted brands like Jaeger and Austin Reed to Debenhams’ efforts to reshape its high street presence by focusing on “social shopping,” there can be very few retailers who haven’t been affected by the online revolution. The customer has always been King, but now they dictate how they generate demand, communicate and consume information across all channels. They are demanding more personalisation and faster fulfilment. And to meet these demands, businesses need to ensure customers can interact with them on any device, at any time, anywhere. More than ever before, investment in enabling capability and technology is critical. Now is the time to replace legacy applications that are not digitally compatible, with cloud-based solutions that can be rapidly scaled and transformed to enable digital customer propositions to be delivered and evolved at pace. Businesses that move quickly to integrate the customer journey will win in the omnichannel, digital age.
The integration challenge
There are many factors for those that want to compete and establish the optimal customer experience across all channels. These include:
- the need to understand customers better than before and develop deep customer insight
- the imperative to establish strong business alignment across the organisation. The technology function alone cannot drive digital, and neither can marketing – digital needs to be embedded in your business, with board sponsorship, to ensure effective channel integration
- investment in enabling capabilities that support information flow between different customer channels and interactions – typically leveraging integration layers (API platforms)
While many organisations have a vision of the integrated omnichannel capabilities they desire, the journey is not so clear. How do you start to untangle legacy platforms without killing essential services? How do you deliver integration at pace, alongside other initiatives? At the same time, how do you inject future-proofing and innovation into the mix? And how do you ensure you are compliant with new regulations such as GDPR? Cloud capabilities have a significant part to play in helping organisations migrate from a complex, point-to-point interfaced application landscape and introduce cloud application platforms that can be rapidly deployed to create digital propositions. Thus the digital platform of the future will:
- enable new capabilities to be ‘plugged and played’ within the omnichannel estate
- integrate data from till, loyalty, 3rd party concessions, apps, social, servicing and payment capabilities
However, while cloud solutions can be a digital enabler that can unlock the benefits of an integrated digital business, the choice and selection of cloud solutions, and how you migrate from your current business solutions, will be key. New software market entrants over the last few years have architected their products from the ground up – but often do not have the breadth and process maturity expected, while other vendors have migrated/acquired products to incrementally evolve their on-premise and cloud capabilities. These evolving cloud-based solutions may not currently offer all the functionality that their on-premise versions could provide, so a straight technology upgrade will often not be the answer for many organisations. [easy-tweet tweet="The benefits of cloud solutions can be realised if all aspects of your operating model are considered" hashtags="Cloud, Digital"]
Operating Model
Putting business process coverage and solution maturity aside, the many benefits of cloud solutions can be realised as long as all aspects of your operating model are considered – not just ‘technical gap fit’. Cloud solutions will change the financial operating model mechanics with a switch from capex to opex. However, the depreciation of existing assets and the desire to delay investment until “end of life” will often present challenges regarding when it is right to move to a cloud or hybrid platform. The ‘currency’ of cloud solutions also presents benefits, with the ‘provider’ managing technical upgrades and patching for security, as well as leading advances on the solution roadmap. However, there are potential “downsides”: as a customer, you have limited options to say ‘no thank you, I want to stay on my current version’. Within an agreed period, you will be forced to accept the cloud platform changes and to invest in testing to verify and accommodate the business changes that can result from platform-enforced changes. The scale and performance of some cloud platforms have had detrimental impacts on some production business operations over the last 12 months, in some cases extending project delivery timelines while steps were taken to ensure end-to-end business performance was acceptable.
When sizing cloud solutions and in subsequent contracting, ensure the SLAs and performance ‘guarantees’ are locked into your commercial agreements with an appropriate level of growth, or run the risk of business performance impacts and unplanned opex cost growth – just to achieve the expected performance required. The benefits of cloud platforms often far outweigh the challenges, as long as due consideration is given to all key delivery and operating criteria. This involves how you build internal support capability or outsource support services to 3rd parties, overlaid with the role that the cloud software provider takes. What is certain is that the shape and form of support services will need to evolve compared to the traditional on-premise model. As organisations look to place the customer at the heart of their business, they need to build intelligent, digital business platforms with the right blend of solutions that balance digital proposition excellence and customer satisfaction against the cost to transform and to support the business. For that, you need a robust business case and a clear set of business objectives, and that can be a challenge for your organisation and your incumbent software providers. The digital transformation challenge may well see an end to single-vendor “best in suite/portfolio” solutions, as the costs, benefits, vendor capability maturity and flexibility will not stack up.
### Technology and the Law Ep 4: Social Media
Watch the fourth episode of “Technology and The Law”, a brand new show broadcast by Disruptive Tech with presenter and technology lawyer Frank Jennings. The show focuses on the legal aspects of, and debates within, the technology industry. Previous episodes have focused on Data Location and Contract Risks from Cloud Computing and GDPR. This week is the fourth week and will focus on Social Media with a great panel, including: Joel Davis from Agency:2, David Terrar from Agile Elephant, Harshini Carey from KMD Neupart UK and Melissa Jun Rowley from The Toolbox. Frank Jennings is the commercial and cloud lawyer at Wallace LLP. He specialises in commercial contracts with cloud and technology, data security and intellectual property. Frank is known for his can-do attitude, managing risk and problem solving to maximise return on investment. Who else could present this show? If you are interested and want to learn more about the legal aspects of and debates within the technology industry, then tune in! Hashtag: #TechLaw Twitter: @disruptivelive [caption id="attachment_43645" align="alignnone" width="978"] Frank Jennings Show[/caption]
### IT as a Service – What Does This Actually Mean Other Than Another IT Acronym?
Whilst at the IBM Interconnect event this year, the phrase “IT as a Service”, or “ITaaS”, was mentioned throughout each stream I was reporting on. Now you may think that this is a term that’s been used many times in our technological past, and you may be right. However, there are not many companies that deliver this in the correct context. Take an IT support service, for example: this literally translates to “IT as a service” but in reality is just IT support. [easy-tweet tweet="Everything in the changing tech landscape has an acronym now; PaaS, IaaS" hashtags="Saas, IBM"] IBM has recently announced its version/vision of IT as a Service, and I would like to explain this in more detail for several reasons. One, I was asked to, and two, I think I should put the description in the context of the world of cloud and multi-cloud computing.
Everything in the changing tech landscape has an acronym these days; PaaS, IaaS and SaaS, to mention a few, require explaining in their own right. So, if we stop looking at these business enablers singularly and work on business outcomes enabled by technology, we get “IT as a Service”. “We innovate faster, create a single bill, unlock new levels of software-defined everything and ensure the tech is always available and on, allowing the CXO to sleep at night with IT as a Service.” Sounds great? Of course it does, but imagine if you could also predict changes, improvements and consolidation of infrastructure with new business models and outcomes. Well, with IBM it’s a reality and not just wishful thinking! And what’s smart is that IBM has created IT as a Service mechanisms by industry. Whether you are working in retail, banking, manufacturing or any other industry, IBM has a defined service delivery platform of products and services that takes away the confusion of multiple choice. [caption id="attachment_43618" align="aligncenter" width="529"] IBM global[/caption] I know what you are thinking: this sounds too good to be true, right? Well, if we put this in the correct context (IBM Services), no, it isn’t. IBM has the cognitive set of services (Watson), the development platforms (Bluemix), and the hardware and software services to complement industry-standard requirements for applications and business objectives, and it has packaged these into market-vertical service delivery that leverages cognitive technology capable of thinking and running for itself flawlessly. In a recent interview, Mohammed Farooq, GM of IBM Cloud Brokerage Services at IBM GTS, explains what IT as a Service means to IBM in a cloud brokerage model. https://www.youtube.com/watch?v=K4_fORSIpEo&t=21s&list=PLenh213llmcZTJLJMAurMJkVnYBY4wLhw&index=22 Are we getting very close to automating IT? No, not yet: IBM insists that relationships at the business level of any organisation are more valuable in making the right choices, as we still need that degree of control. However, doesn’t it make sense to take away the burden of managing some of the technology, and of deciding what direction it needs to take to accommodate your company’s needs, rather than facing endless multiple choice? Whichever way you look at it, automation, cognitive capabilities and proactive management, with advice from a service organisation that has been doing this for decades, adds up to an IT as a Service you can rely on. I like it, IBM, and I want to know more – and if you do too, take a look here! http://www-935.ibm.com/services/us/en/it-services/gts-it-service-home-page-1.html
### Amazon Web Services Makes AWS Greengrass Available to Customers
Annapurna, BSquare, Canonical, Digi International, Intel, Lenovo, Mongoose, Qualcomm Technologies, Raspberry Pi, Samsung, Technicolor, and Wistron join a growing list of partners and OEMs bringing AWS Greengrass to connected devices. Customers including Enel, Konecranes, Nokia, Pentair, Rio Tinto, and Stanley Black & Decker are using AWS Greengrass for Industrial IoT. Today, Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), announced that AWS Greengrass, software which allows customers to run AWS compute, messaging, data caching, and sync capabilities on connected devices, is now available to all customers.
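The announcement itself contains no code, but to make the model concrete, here is a rough sketch – assuming the Python Greengrass Core SDK (greengrasssdk), with an invented topic name and threshold – of the kind of Lambda function that might be deployed to run locally on a device:

```python
# Illustrative sketch of a Lambda function deployed to an AWS Greengrass core.
# Assumes the Greengrass Core SDK for Python ('greengrasssdk'); the topic and
# threshold are invented for this example, not taken from the announcement.
import json
import greengrasssdk

# This client talks to the local Greengrass message broker rather than the cloud.
iot_client = greengrasssdk.client("iot-data")

TEMP_THRESHOLD_C = 75.0  # hypothetical alert threshold

def function_handler(event, context):
    """Invoked locally for each sensor reading; reacts without a round trip to the cloud."""
    reading = float(event.get("temperature_c", 0.0))
    if reading > TEMP_THRESHOLD_C:
        iot_client.publish(
            topic="factory/line1/alerts",  # hypothetical topic
            payload=json.dumps({"alert": "over-temperature", "value": reading}),
        )
    return {"processed": True}
```

The same Lambda programming style used in the AWS Cloud is what runs at the edge here, which is the point the announcement goes on to explain.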
With AWS Greengrass, devices can run AWS Lambda functions to perform tasks locally, keep device data in sync, and communicate with other devices while leveraging the full processing, analytics, and storage power of the AWS Cloud. More than a dozen AWS partners, including Annapurna, BSquare, Canonical, Digi International, Intel, Lenovo, Mongoose, Qualcomm Technologies, Raspberry Pi, Samsung, Technicolor and Wistron, are integrating AWS Greengrass into their platforms so that devices will come with AWS Greengrass built-in. To get started with AWS Greengrass, visit https://aws.amazon.com/greengrass/. [easy-tweet tweet="AWS Greengrass eliminates the complexity involved in programming and updating IoT devices " hashtags="AWS, IoT"] With the proliferation of IoT devices, enterprises are increasingly managing infrastructure that is not located in a data centre, such as connected devices in factories, oil wells, agricultural fields, hospitals, cars, and various other venues. Because these devices often have limited processing power and memory, many rely heavily on AWS and the cloud for processing, analytics, and storage. However, there are circumstances when relying exclusively on the cloud isn’t optimal, due to latency requirements or intermittent connectivity that makes a round trip to the cloud unfeasible. In these situations, IoT devices must be able to perform some tasks locally. Programming and updating software functionality on IoT devices is challenging and complex. Relatively few developers have the expertise to update these embedded systems, and even fewer can do so without creating unwanted downtime. AWS Greengrass eliminates the complexity involved in programming and updating IoT devices by allowing customers to use AWS Lambda to run code locally on connected devices in the same way they do on the AWS Cloud. With AWS Greengrass, developers can add AWS Lambda functions to connected devices right from the AWS Management Console, and devices can execute the code locally, responding to events and taking actions in near real-time. AWS Greengrass also includes AWS IoT messaging and syncing capabilities so devices can send messages to other devices without connecting back to the cloud. AWS Greengrass allows customers the flexibility to have devices rely on the cloud when it makes sense, perform tasks on their own when it makes sense, and talk to each other when it makes sense – all in a single, seamless environment. “Many of the world’s largest IoT implementations run on AWS, and customers across industries – from energy to mining, to media and entertainment – have asked us whether we could extend AWS’s industry leading cloud capabilities to the edge,” said Dirk Didascalou, Vice President of IoT at AWS. “By embedding AWS Lambda and AWS IoT capabilities in connected devices, AWS Greengrass gives customers the flexibility to have devices act locally on the data they generate while using the AWS Cloud for management, analytics, and storage – all using a single, familiar AWS programming model. We are excited to make AWS Greengrass available to all AWS customers, and with AWS partners shipping AWS Greengrass-capable devices it is now incredibly easy to build and run IoT applications that seamlessly span devices on the edge and in the AWS Cloud.” The multinational power company Enel is the largest utility in Europe in terms of market capitalisation and has the largest customer base among its European peers, with over 65 million customers worldwide.
"Connected devices improve all aspects of our daily lives, from the smart meters in our homes that help us save energy, to the black boxes in our cars that show us how we’re driving, to the stoplights with sensors that monitor traffic,” said Fabio Veronese, Head of Infrastructure and Technological Services at Enel. "Enel is building AWS Greengrass-enabled smart gateways for the home and industrial gateways for our power generation sites, where AWS Greengrass will allow us to process and act on large volumes of data with sub-millisecond latency."  Finland-based Konecranes is a world-leading group of Lifting Businesses, serving a broad range of customers, including manufacturing and process industries, shipyards, ports, and terminals. “Konecranes is leading the way in the industrial internet of things with over 15,000 connected cranes and thousands more on the way this year,” said Juha Pankakoski, Executive Vice President of Technologies at Konecranes. “We have already been using AWS IoT to build Truconnect, a digital crane platform, and the addition of AWS Greengrass will help us take the development to the edge. We see AWS Greengrass as the enabler for a new set of digital services, allowing us to program and deliver software to equipment in a secure manner and without risking operational safety. This supports well our aim to build the next generation of lifting as the leading technology company in our industry.” Pentair plc is a global company dedicated to building a safer, more sustainable world through industry- leading products, services, and solutions that help people make the best use of the resources they rely on most. "Some of Pentair’s aquaculture customers are located in remote geographies with unreliable internet connections, and industry regulations restrict which data points can leave their physical premises," said Phil Rolchigo, Vice President of Technology at Pentair. "AWS Greengrass will enable our devices to behave consistently no matter the level of connectivity in their operating environment while allowing us to take advantage of the AWS Cloud for machine learning and big data analytics." Rio Tinto is a leading global mining group that focuses on finding, mining, and processing the Earth's mineral resources. "Rio Tinto operates mining equipment in some of the most extreme environments on earth, where connectivity can be unreliable, and road conditions can cause production delays, equipment damage, and potentially put people at risk," said Brian Oldham, Vice President of Industrial and Operational Technology at Rio Tinto. “AWS Greengrass allows us to measure road roughness and process the data locally to make our haul trucks operate more safely and efficiently, regardless of network coverage. We saw results from equipment in the field only two weeks after deploying the service, and the potential value it can bring to our operations. We’re evaluating additional use cases for AWS Greengrass in other areas.” Stanley Black & Decker’s Digital Accelerator is the company’s innovation arm in charge of infusing digital capabilities across all products, process, and business models. "Stanley Black & Decker’s Digital Accelerator has selected AWS Greengrass as one of the standards for edge computing and edge analytics across our entire portfolio of products,” said Yasir Qureshi, Director, IoT Platform and Digital Architecture at Stanley Black & Decker. 
“AWS Greengrass improves the efficiency of our tools by eliminating the latency of transmitting the data to the cloud and instead processing breaking and maintenance data locally for improved job site productivity. Additionally, the single programming model between the AWS Cloud and local devices enables us to significantly shorten our development lifecycle and make our equipment smarter and faster." Ecosystem support for AWS Greengrass Canonical produces Ubuntu and offers commercial services for Ubuntu's users. “AWS Greengrass will enable more customers and developers to realise the benefit of processing and analysing data at the edge,” said Mike Bell, Executive Vice President, IoT and Devices at Canonical. “By distributing and installing AWS Greengrass as a snap, the universal Linux packaging format, developers can reduce the time and complexity of building smart edge solutions across new and existing hardware. Using snaps, manufacturers will not only find it easier to build IoT devices but to monetize smart developer solutions running on the AWS IoT platform.” Intel, the world leader in silicon innovation, develops technologies, products and initiatives to continually advance how people work and live. "Many IoT use cases demand a distributed computing architecture at the edge, close to where the data is generated,” said Jonathan Ballon, Vice President, Internet of Things Group at Intel. “AWS Greengrass running on Intel technology delivers a secure, intelligent edge that allows developers to easily create new applications from edge to cloud." The Raspberry Pi Foundation provides low-cost, high-performance computers that people use to learn, solve problems, and have fun. "Raspberry Pi is part of a movement of organisations and individuals that share a common goal: Empowering people to shape their world through digital technologies," said David Thompson, Director of Web Services at the Raspberry Pi Foundation. "AWS Greengrass extends the capabilities of the AWS Cloud to the Raspberry Pi, making it easier than ever for anyone to build the next great connected product." Qualcomm Technologies pioneered 3G and 4G – and is now leading the way to 5G and a new era of intelligent, connected devices, including automotive, computing, IoT, healthcare, and data centre. "The next generation of IoT devices and gateways depends on high-performance edge capabilities tightly integrated with the cloud," said Jeffery Torrance, Vice President of Business Development at Qualcomm Technologies, Inc. "Developers and manufacturers can use AWS Greengrass to take greater advantage of the connectivity, compute, and security capabilities delivered by Qualcomm Technologies' system-on-chip platforms to build edge solutions that tap into the power of AWS IoT. Thanks to our strong work with AWS, prototyping and developing on Qualcomm Technologies' platforms using AWS Greengrass can start today."   ### The Reasons Why CRM Software Will Change Your Company In today's world, many business owners are thinking about which strategies they can use to keep their organisations active and alive. If this is one of your primary concerns, now is the time to tap into the power of utilising customer relationship management (CRM) software. Learn more about CRM software and how it can benefit your company by reviewing the information found in the following quick reference guide: What Is Customer Relationship Management (CRM) Software?  
CRM software is a type of software that enables business owners to create and maintain strong relationships with customers. It can be used to enhance and accelerate key sales processes such as lead generation and customer retention. [easy-tweet tweet="CRM software will provide you with a historical view and analysis of all customers " hashtags="CRMSoftware, CRM"]
Benefits Of CRM Software
There are numerous benefits that an individual can attain through the use of CRM software. Some of them include:
Historical View And Analysis
One great benefit of CRM software is that it will provide you with a historical view and analysis covering all of the customers you've acquired as well as prospects. This data will help your sales and marketing team reduce searches and anticipate customer needs in advance.
Detailed Customer Data
Another great benefit of CRM software is that it contains detailed information about each customer. Having ongoing access to this detailed data will empower you to carefully track customers and determine how best to market products to each of them. The data will also enable you to predict whether a customer will be profitable or not.
Customer Acquisitions
As noted in Management Study Guide, CRM system software plays an important role in the customer acquisition process. The process begins with identifying a prospect and maintaining all of the contact data in the CRM system. The software will contain various features that make it easy for the sales and marketing team to follow up with the prospect and then convert them!
Picking The Right CRM Products
Now that you recognise the great value of CRM software, you may be ready to start picking the right CRM products. To ensure that you can, make sure that you carefully read the online reviews that have been left about specific product providers. You may want to begin by reading a Zoho review. In addition, be sure to peruse the online reviews of key CRM software companies such as:
- Pipedrive
- Marketing 360
- Salesforce
- Nutshell
- Desk.com
Once you start your search for the ideal CRM software company to purchase your products from, make sure that you know what features to look for. As noted in "The Secret To Choosing The Best CRM For Your Sales Organisation," some of the features and capabilities that the best CRM software will have include:
- role-based security (to help maintain and protect institutional knowledge)
- tight email integration
- native mobile support
- effective follow-up and automation
Don't Delay: Invest In CRM Software Today!
If you're serious about changing your company in a dynamic way, get serious about the use of CRM software. This software can help expedite and optimise the sales team's ability to convert a prospect into a loyal customer. Refer to the information found in this quick reference guide to ensure that you can select the ideal CRM products once you're ready to start shopping!
### Success Secrets: How to Master the Art of Delegation within Business
Most business managers know deep down that delegation will save them time, boost productivity and grant them valuable head space, and yet they can't quite bring themselves to let go of the responsibility. "It takes too long to teach someone else how to do it, the quality of the work will be reduced, delegating is too stressful." I've heard it all before, but delegation doesn't have to be difficult or time-consuming.
Done properly, it could earn you a well deserved day off at no expense to your output, in fact, it might just be the secret to success. Here's how to master the art of delegation in 5 simple steps: Step 1. Sort out your priorities As a business leader, your time is the most valuable so you have to think carefully about how you use it. There's no point spending days trying to figure out how to use Photoshop when somebody else could do the job more quickly and efficiently. If you struggle with particular tasks or find them uninteresting, delegate to someone else who's more skilled in that area. To help me focus my time, I write priority lists numbering everything from 1 to 3 (1 being the most important tasks) and delegate everything that's not directly impacting the growth of the company. It helps me to stay on track of my goals and drastically improves my productivity as I'm always putting my skills to the best use.  Step 2. Choose the best person to delegate to  Spend time getting to know your employees' specialised skill sets and interests so that you are able to make an informed decision when you delegate a task. Don't just pick the person who is the least busy as this could damage the quality of the output. For example, whilst my assistant takes responsibility for all of my administrative tasks, I would never ask her to write blog content as this isn't her area of expertise, which could end up causing her unnecessary stress and waste her time. If the job you're delegating needs skills that your team members are lacking, consider outsourcing to a professional or virtual services company.  Step 3. Provide clear instructions However, talented the person is, they're not a mind reader. Be clear about the nature of the task and your expectations, and provide examples where possible. If it's something they've done before, you won't need to go into so much detail, but it's still important to ensure they've understood what you're talking about. [easy-tweet tweet="The worst thing for productivity is a boss who micromanages" hashtags="Business, Strategy"] Step 4. Just let them get on with it Establish a deadline and keep communication channels open so that your employee knows they can ask questions if they're struggling, and then leave them to do the job. The worst thing for productivity is a boss who micromanages. If you've invested time in recruitment, you should be able to trust in your employees' skills and dedication. Remember you're not the only one who can do the job and as a manager, you're there to support, not dictate how your employees work. Give them space to learn and correct their own mistakes. Step 5. Offer constructive feedback When the task has been completed make sure you show your employee gratitude. Tell them where they excelled and the areas which could be improved. Also, ask them to give you feedback. Did they enjoy the task, do they have any questions, could you have been clearer with your instructions? It's never easy to give your boss feedback, but employees appreciate the opportunity to voice their opinions. It's a good way for them to take control of their learning process and helps you to delegate more effectively in the future. As with all things, delegation gets easier with practice, but don't make the mistake of adding it to your task list as this will create more stress. It should be integrated into your management process as a way not only of increasing your productivity, but also to assist your staff with career development. 
Speak to individual members of your team about their goals so that you know which areas they'd like to progress in, and the next time you have a task to delegate, you might be able to offer them a valuable opportunity to learn. Approaching delegation in this way also means you will feel less guilty about off-loading responsibility, as you know that you are potentially contributing to an employee's knowledge. My advice, if you're still wary, is to start slow by delegating basic tasks such as research and formatting. As you start to experience the benefits first hand, you'll want to delegate more and more, but be aware that you're still in control and that the quality of work is continually being improved rather than damaged. Of course, there are some things that should never be delegated. Any strategic decisions or tasks, for example, should always be left to you. After all, every company needs a leader.
### Gartner Says Worldwide Server Shipments Declined 4.2%
In the first quarter of 2017, worldwide server revenue declined 4.5 percent year over year, while shipments fell 4.2 percent from the first quarter of 2016, according to Gartner, Inc. "The first quarter of 2017 showed declines on a global level with a slight variation in results by region," said Jeffrey Hewitt, research vice president at Gartner. "Asia/Pacific bucked the trend and posted growth while all other regions fell. "Although purchases in the hyperscale data centre segment have been increasing, the enterprise and SMB segments remain constrained as end users in these segments accommodate their increased application requirements through virtualisation and consider cloud alternatives," Mr Hewitt said. Hewlett Packard Enterprise (HPE) continued to lead in the worldwide server market based on revenue. The company posted just over $3 billion in revenue for a total share of 24.1 percent for the first quarter of 2017 (see Table 1). Dell EMC maintained the No. 2 position with 19 per cent market share. Dell EMC was the only vendor in the top five to experience growth in the first quarter of 2017.

Table 1. Worldwide: Server Vendor Revenue Estimates, 1Q17 (US Dollars)

| Company | 1Q17 Revenue | 1Q17 Market Share (%) | 1Q16 Revenue | 1Q16 Market Share (%) | 1Q17-1Q16 Growth (%) |
| --- | --- | --- | --- | --- | --- |
| HPE | 3,009,569,241 | 24.1 | 3,296,591,967 | 25.2 | -8.7 |
| Dell EMC | 2,373,171,860 | 19.0 | 2,265,272,258 | 17.3 | 4.8 |
| IBM | 831,622,879 | 6.6 | 1,270,901,371 | 9.7 | -34.6 |
| Cisco | 825,610,000 | 6.6 | 850,230,000 | 6.5 | -2.9 |
| Lenovo | 731,647,279 | 5.8 | 871,335,542 | 6.7 | -16.0 |
| Others | 4,737,196,847 | 37.9 | 4,537,261,457 | 34.7 | 4.4 |
| Total | 12,508,818,106 | 100.0 | 13,091,592,596 | 100.0 | -4.5 |

Source: Gartner (June 2017)

In server shipments, Dell EMC secured the No. 1 position in the first quarter of 2017 with 17.9 percent market share (see Table 2). The company had a slight increase of 0.5 percent growth over the first quarter of 2016. Despite a decline of 16.7 percent, HPE secured the second spot with 16.8 per cent of the market. Inspur Electronics experienced the highest growth in shipments with 27.3 per cent.
Table 2. Worldwide: Server Vendor Shipments Estimates, 1Q17 (Units)

| Company | 1Q17 Shipments | 1Q17 Market Share (%) | 1Q16 Shipments | 1Q16 Market Share (%) | 1Q17-1Q16 Growth (%) |
| --- | --- | --- | --- | --- | --- |
| Dell EMC | 466,800 | 17.9 | 464,292 | 17.1 | 0.5 |
| HPE | 438,169 | 16.8 | 526,115 | 19.4 | -16.7 |
| Huawei | 156,559 | 6.0 | 130,755 | 4.8 | 19.7 |
| Lenovo | 145,977 | 5.6 | 199,189 | 7.3 | -26.7 |
| Inspur Electronics | 139,203 | 5.4 | 109,390 | 4.0 | 27.3 |
| Others | 1,254,892 | 48.2 | 1,286,097 | 47.4 | -2.4 |
| Total | 2,601,600 | 100.0 | 2,715,138 | 100.0 | -4.2 |

Source: Gartner (June 2017)

Europe, the Middle East and Africa (EMEA)
In the first quarter of 2017, server revenue in EMEA totalled $2.8 billion, a decline of 12.2 percent from the first quarter of 2016 (see Table 3). Server shipments totalled 503 thousand units, a decline of 8 percent year over year (see Table 4). "The EMEA server market has started 2017 in the same way as 2016 ended," said Adrian O'Connell, research director at Gartner. "The main challenge for vendors in the region is that increased levels of economic and political uncertainty are putting pressure on an already weak server market."

Table 3. EMEA: Server Vendor Revenue Estimates, 1Q17 (US Dollars)

| Company | 1Q17 Revenue | 1Q17 Market Share (%) | 1Q16 Revenue | 1Q16 Market Share (%) | 1Q17-1Q16 Growth (%) |
| --- | --- | --- | --- | --- | --- |
| HPE | 915,079,134 | 33.0 | 1,118,767,775 | 35.5 | -18.2 |
| Dell EMC | 561,854,433 | 20.3 | 533,414,867 | 16.9 | 5.3 |
| Cisco | 199,260,000 | 7.2 | 211,640,000 | 6.7 | -5.8 |
| Fujitsu | 197,885,111 | 7.1 | 217,138,861 | 6.9 | -8.9 |
| Lenovo | 183,544,197 | 6.6 | 196,801,000 | 6.2 | -6.7 |
| Others | 712,376,198 | 25.7 | 875,367,033 | 27.8 | -18.6 |
| Total | 2,769,999,072 | 100.0 | 3,153,129,537 | 100.0 | -12.2 |

Source: Gartner (June 2017)

In revenue terms, all of the top five vendors except Dell EMC saw declines in the first quarter of 2017. HPE maintained the No. 1 position, but it suffered a sharp revenue decrease of 18.2 percent. "The first quarter of the year tends to be relatively strong for Dell, but the acquisition of EMC is proving positive for the server business at the moment," said Mr O'Connell. Revenues declined for Cisco, Fujitsu and Lenovo. Their revenue rankings improved, however, as IBM fell out of the top five for revenue this quarter following the divestiture of its x86 server business to Lenovo. The most notable figure of all is the 18.2 per cent revenue decline for HPE. [easy-tweet tweet="Businesses are buying fewer servers due to political and economic uncertainty" hashtags="Businesses, Data"] "HPE's size means it is subject to the moves of the wider market more than some other vendors. Weakness in the business segment and sourcing changes in the service provider space have reduced its revenue significantly," said Mr O'Connell. Businesses are buying fewer servers due to political and economic uncertainty. Moreover, shifts toward hyperscale architectures mean service providers are increasingly buying lower-cost "white-box" servers from original design manufacturers (ODMs). As a result, the decline in shipments in the "Others" category, at 1.4 percent, was lower than that of the top five, with the exception of Dell EMC. "Leading server vendors will be doing all they can to ensure that service providers don’t continue to shift their server purchases toward ODM suppliers," said Mr O'Connell. "Combined with the significant inroads made by China-based suppliers, we expect to see continuing challenges and downward price pressure across the EMEA server market for some time to come."
Table 4. EMEA: Server Vendor Shipment Estimates, 1Q17 (Units)

| Company | 1Q17 Shipments | 1Q17 Market Share (%) | 1Q16 Shipments | 1Q16 Market Share (%) | 1Q17-1Q16 Growth (%) |
| --- | --- | --- | --- | --- | --- |
| HPE | 167,818 | 33.4 | 199,819 | 36.5 | -16.0 |
| Dell EMC | 112,872 | 22.4 | 116,305 | 21.3 | -3.0 |
| Lenovo | 25,070 | 5.0 | 25,439 | 4.7 | -1.5 |
| Fujitsu | 24,186 | 4.8 | 30,385 | 5.6 | -20.4 |
| Cisco | 20,659 | 4.1 | 20,106 | 3.7 | 2.8 |
| Others | 152,567 | 30.3 | 154,765 | 28.3 | -1.4 |
| Total | 503,173 | 100.0 | 546,819 | 100.0 | -8.0 |

Source: Gartner (June 2017)

Additional information is available to subscribers of the Gartner Servers Quarterly Statistics Worldwide program. This programme provides worldwide market size and share data by vendor revenue and unit shipments. Segments include region, vendor, vendor brand, subbrand, CPU type, CPU group, max CPU, platform, price band, operating systems and distribution channels.
### Migrating to Public Clouds – On the increase for Enterprises!
In a recent interview, I discussed the pros and cons of migrating to a public cloud, stating that nobody prepares appropriately for a migration of a live application that’s already in place. We all think it’s all too easy to migrate that key virtual machine instance to a public cloud with the multitude of tools available today, which makes us complacent about the perceived ease of cloud computing. However, in reality, moving a key application that resides on a virtual infrastructure is not easy at all, and more often than not it makes the migration process fail! I have seen migrations of a primary application to a public cloud environment go well, but also very, very badly, making me a sceptic. However, after reading some interesting statistics covering public cloud in the enterprise, it does seem that large corporates are increasingly comfortable utilising public cloud infrastructure as a service. “According to a recent JP Morgan survey of more than 200 CIOs of large enterprises, “16.2% of workloads are currently running in the public cloud, and in five years 41.3% of workloads will run in a public cloud”. If that’s the case in 5 years, what will happen in a decade? As AWS becomes a $14B business and Azure’s market share continues to grow, enterprises are leveraging the public cloud more and more each year.” However, even though self-provisioning of new workloads on AWS or Azure is simple, migration of a running service to the cloud requires more planning. It’s a common perception that migrating existing workloads to public cloud – especially those with a lot of data – must be complex, time-consuming and risky. But enterprise IT organisations are rapidly establishing good migration practices, and migration technologies are evolving quickly to support them. [easy-tweet tweet="Velostrata migration software is agentless" hashtags="CloudMigration,Cloud"] Velostrata, a provider of cloud migration software, had a lot to say about this and even provided us with a “Public Cloud Check Sheet” that can be used when planning your migration. The firm also has one special ingredient that I have not seen elsewhere, a USP if you like – their migration software is agentless. Yes, you did read that right! The graphic below is taken from their website, where you can find some excellent information on migrations in general and not just about their products. It seems I was wrong to be a sceptic. However, I think a trial of the toolset would be beneficial for most enterprise firms, to see it in action. And… they also provide a test environment that ensures you get it right before you eventually press that green button!
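Velostrata’s own tooling is not shown here, but as a baseline for comparison, this is roughly what a manual lift-and-shift import of a VM image into AWS looks like using boto3’s EC2 import_image call; the bucket, key, region and description are placeholders rather than details from the article:

```python
# Rough sketch of a manual VM image import into AWS (EC2 VM Import/Export) with boto3.
# This is the slower, plan-ahead path the article contrasts with agentless migration
# tools; the bucket, key, region and description below are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Step 1 (not shown): export the VM to a VMDK/OVA and upload it to S3.
# Step 2: start the import; this can take hours for large disks.
response = ec2.import_image(
    Description="Key line-of-business app - migration trial",
    DiskContainers=[{
        "Description": "Primary disk",
        "Format": "vmdk",
        "UserBucket": {"S3Bucket": "example-migration-bucket", "S3Key": "app-server.vmdk"},
    }],
)

# Step 3: poll until the resulting AMI is ready, then launch and test before cutover.
task_id = response["ImportTaskId"]
status = ec2.describe_import_image_tasks(ImportTaskIds=[task_id])
print(status["ImportImageTasks"][0]["Status"])
```

Seen against that kind of multi-step, downtime-prone process, the appeal of an agentless, test-first migration tool is easier to understand.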
Velostrata’s guide to public cloud migrations can be found here! https://velostrata.com/cloud-migration/ ### #TransformCIO Episode 1- Interview with SuperBet's CTO, Finbarr Joy #TransformCIO is a series of interviews of CIOs and CTOs who are changing the way technology is deployed and viewed in their companies. In this first episode, David Terrar Founder and CXO of Agile Elephant was joined by CTO, Finbarr Joy from SuperBet. This episode saw the discussion focus on a CTO's role and the importance of leading digital transformation, which led further to speak about cloud computing, citizen developers, emerging technologies and the impact of technology on the industry and those on the outside. Finbarr also shares his favourite book recommendations! #TransformCIO   ### VMware New vRealize Cloud Management Platform    New VMware vRealize Operations 6.6 Introduces Automated Workload Balancing to Proactively Optimise Application and Infrastructure Performance; New Native Operations Management for Hyper-Converged Infrastructure Solutions Powered by VMware vSAN New VMware vRealize Automation 7.3 Enables DevOps-Ready IT with Enhancements for Containers and Configuration Management Solutions; Deepens Integration with VMware NSX for Day 2 Operations and Improves Support for Microsoft Azure   VMware, Inc. (NYSE: VMW) today introduced major updates across its VMware vRealize® cloud management platform to improve software-defined data centre (SDDC) and cloud operations and accelerate application and infrastructure service delivery across hybrid clouds. New releases include VMware vRealize Operations™ 6.6, VMware vRealize Automation™ 7.3, vRealize Business for Cloud™ 7.3, vRealize Log Insight™ 4.5 and vRealize Network Insight™ 3.4, which combine to provide enterprise customers with advanced intelligent operations and automated IT capabilities to more easily stand up and operate a VMware-based cloud. Additionally, VMware is introducing increased support for containers and configuration management solutions to ease moving applications from dev-test to production. The VMware vRealize Suite enables customers to manage and provision at scale--compute, network, storage and application services across hybrid cloud environments. The platform’s comprehensive management capabilities for the SDDC and across multiple clouds help customers to address three common use cases--intelligent operations, automated IT and DevOps-ready IT--based on thousands of customer engagements. “VMware is committed to supporting our customers’ digital transformation initiatives by helping them to modernise their data centres as well as integrate their public clouds,” said Ajay Singh, senior vice president and general manager, Cloud Management Business Unit, VMware. “These latest updates to our vRealize platform will help customers get more out of their hybrid cloud investments today, and put them on a path for a cross-cloud management of applications and infrastructure regardless of where the workload is running.” Plan, Manage and Scale SDDC and Multi-Cloud Environments with Intelligent Operations Intelligent operations help enterprises to plan, manage and scale their SDDC and multi-cloud deployments with confidence - addressing operations management for everything from applications to infrastructure. 
New vRealize features addressing this use case include: Automated, Proactive Workload Placement: vRealize Operations 6.6 adds substantial new intelligence to workload placement decisions to fully automate workload balancing across clusters and datastores based on business imperatives, including the ability to schedule rebalancing in a convenient maintenance window. This release will also feature predictive Distributed Resource Scheduler (pDRS) combining analytics from vRealize Operations with VMware vSphere® DRS to predict anomalies and act before contention occurs. VMware vSAN™ Operations Management: vRealize Operations 6.6 will deliver native vSAN management and monitoring--no longer requiring a separate download and installation of a management pack. Capabilities include capacity and time remaining, dedupe and compression savings, and reclamation opportunities for hyper-converged infrastructure (HCI) solutions powered by vSAN. Additionally, the new release will also enable centralised management of multi-site and stretched cluster vSAN environments with advanced troubleshooting, proactive alerting and visibility from VM to disk. Combined Operational and Business Insights: The new release will bring vRealize Business for Cloud 7.3 as a tab in vRealize Operations 6.6 for new insights that show how capacity utilisation drives cost efficiencies by combining operational and cost metrics. New vRealize Business for Cloud 7.3 will improve insight into the complete costs of AWS and Microsoft Azure instances alongside VMware-based private cloud costs. Cross-Cloud Security and Networking Management: vRealize Network Insight 3.4 will introduce support for Amazon Web Services (AWS) networking and security to enable users to plan security by AWS Virtual Private Cloud and AWS tags. Customers will be able to add AWS EC2 VMs to user-defined applications for micro-segmentation planning and troubleshooting traffic in AWS. Automating IT to Deliver Faster Time to Implementation and Lower Customer TCO The Automated IT use case helps IT organisations speed up service delivery by providing them with the capabilities to fully automate core IT processes. New features include: VMware NSX Operations: vRealize Automation 7.3 adds increased support for NSX- related operations including the setting of advanced NSX controls for load balancing, network and security functions for Day 1 and Day 2 operations. Automated, Proactive Workload Placement: vRealize Automation 7.3 significantly improves placement decision capabilities when considering where to provision newly requested virtual machines. Users now have the option to set a placement policy within vRealize Operations and have vRealize Automation consume it. Accelerating Application Delivery with DevOps Ready IT The Developer-Ready IT use case helps IT teams to meet the needs of developers who want to use the tools of their choice yet enable IT to seamlessly move applications from a laptop into production. New features include: Container Management Enhancements: vRealize Automation 7.3 now features Admiral™ 1.1, the highly scalable and lightweight container management portal. Support for Admiral 1.1 enables users to manage Virtual Container Host instances generated by VMware vSphere Integrated Containers as well as Docker hosts. The new release also provides support for Docker volumes enabling users to create and attach volumes to containers. 
Leveraging Configuration Management Solutions: vRealize Automation 7.3 introduces a new framework to enable configuration management tools as first-class citizens with Puppet being the first ecosystem partner. Customers can now seamlessly deploy, configure and manage production-ready OS, middleware and applications by using vRealize Automation's blueprinting, service orchestration and governance workflows along with capabilities delivered by configuration management tools Blueprints Parameterization Improvements: vRealize Automation 7.3 substantially improves reusability of blueprints and lowers customers’ TCO through the introduction of parameterized blueprints. Service designers will be able to directly define the “T-shirt sizing” (i.e. small, medium, large) for specific resources into blueprints and easily tailor services to their needs through “sizing” parameters. Microsoft Azure Public Cloud Integration: vRealize Automation 7.3 enhances support for Microsoft Azure by enabling deployment and management of application and middleware services. [easy-tweet tweet="Adopting vRealize Automation was a decision to enable VMware to gain a competitive edge." hashtags="Cloud, ERP"] As the world’s most complete and capable hybrid cloud architecture, VMware Cross-Cloud Architecture™ enables consistent deployment models, security policies, visibility, and governance for all applications, running on- and off-premises, regardless of the underlying cloud, hardware platform or hypervisor. Cross-Cloud Architecture builds on its leading private and hybrid cloud capabilities by offering customers the freedom to innovate in multiple clouds and is delivered through VMware Cloud Foundation™, VMware Cross-Cloud Services™, and the VMware vRealize cloud management platform. Supporting Quotes “Adopting agile development practices and vRealize Automation was a strategic business decision to enable us to gain a competitive edge in the manufacturing ERP market,” said Chris Smallwood, vice president of IT, IQMS. “We have accelerated our product development cycles, and look forward to the new DevOps-related and hybrid cloud features of vRealize Automation 7.3 to help our IT efficiently meet the needs of our development team, and ultimately our customers.” “vRealize Operations Manager is a powerful tool that provides visibility into capacity and performance of our virtualized estate,” said David Wyatt, Infrastructure Architect, Tatts Group. “The custom dashboard feature provides that ‘single pane of glass’ scoreboard that allows senior management, architects and operational staff access to key metrics that matter. We look forward to upgrading to vRealize Operations 6.6 where early access has shown a more refined user interface and additional capacity reporting features.” Product Availability and Pricing VMware vRealize Automation 7.3 and VMware vRealize Network Insight 3.4 are now available. VMware vRealize Operations 6.6, VMware vRealize Business for Cloud 7.3, VMware vRealize Log Insight 4.5 are expected to become available in VMware’s Q2 FY18 which ends August 4, 2017. vRealize Network Insight 3.4 is now available in two editions – Advanced and Enterprise – beginning at $1,245 per CPU. Join the VMware vForum Online Event VMware will conduct a “What’s New with vRealize Suite” session during VMware’s vForum online on June 28, 2017, at 10:00 am Pacific. Tune into the session to hear from VMware’s cloud management leaders and subject matter experts on the updates as well as demonstrations of the new functionality. 
Register for the vForum here.    ### Enhance Workplace Productivity Through the Cloud As the managing director of a cloud migration company, I admit to being a bit biased towards working in the cloud.  Believe me; I get it, the whole thing sounds exotic, complex, risky and something that only tech companies should do.  My mother reacted with horror when I told her what I do – until I asked her how she liked her online banking – she loves it: “so fast and easy and convenient”.  So for over ten years she has trusted the cloud with every financial transaction but is horrified by the thought of her emails being held in the cloud! It is a commonplace today for businesses to migrate emails, calendars, files and sometimes archives to the cloud to offset the cost of maintaining on-premises hardware.  These technologies are seen as a commodity with employees appearing not to care how they view their emails and files, just so long as they’re accessible whenever they’re required. Migrating to the cloud is now a relatively simple process with many customers worldwide even opting to self-migrate using the specialised and simple-to-use software.  The days of complex 11-month projects with a huge number of migration workshops and onerous project management fees are long gone – although many consulting companies still have a vested interest in making moving to the cloud sound much more difficult than it has to be. We have witnessed a major upswing in migrations to the cloud. Many SMEs are now starting to migrate all their files and emails, mid-size businesses are overcoming their initial fears over security and data sovereignty, and the majority of large companies are already in the cloud to one extent or another but looking at options for moving between cloud providers, for example, moving from Google Cloud to Office 365.  Migrating to the cloud can bring about many benefits including enhanced workplace productivity for businesses of all sizes and types – here are just a few examples: Connections anytime, anywhere A properly planned move to the cloud should provide a simple, secure and cost-effective way for all employees to connect to company resources from any device, anywhere on the planet - freeing your workforce from being tied to a local network in company offices.  For many, travelling nationally and often internationally every week can be commonplace. Therefore operating out of a huge range of airports, offices, hotels and coffee shops safely and easily is critical. A clear cloud usage policy in place ensures employees are clear on guidelines for usage and productivity is boosted. An extension of this is the freedom to work on any device from anywhere.  The increasing acceptance of Bring your Device (BYOD) as a business policy enables employees to access their work data and work on a platform they are familiar with.  Research suggests that this policy has many key benefits such as improved job satisfaction, increased flexibility and increased morale.  The company also benefits from lower training costs, reduced hardware costs and a more flexible workforce.  However, it must be pointed out that a strong data protection policy must be in place with clearly defined BYOD usage guidelines covering topics such as device security, data encryption, data monitoring and other data protection risks.  With a bit of planning and a sensible usage policy, BYOD offers real tangible benefits to both employees and the company, enabling both to fully experience the undoubted benefits of the cloud.  
Extended team of IT troubleshooters Another joy of cloud-based working is the fact that any IT problems encountered are the problem of one of the few very large specialist cloud organisations.  In the unusual event of something going wrong, you can bet that a large team of specialists will be working on the problem immediately and will keep working until the issue has been resolved. No longer will you have to wait for in-house IT to help you sort the issue. [easy-tweet tweet="In the cloud, the latest update security measures are instantly at companies’ fingertips" hashtags="Cloud, Security"] In the event of a natural disaster or act of terrorism, security and redundancy are built into data centres for cloud providers. The recent worldwide cyber security crisis provides a good example. This new strain of ransomware attack, WannaCry, once again demonstrated the huge vulnerabilities caused by companies and individuals not patching their systems. When working in the cloud, the latest updates and most up-to-date security measures are instantly at companies’ fingertips, therefore, minimising the risk of vulnerabilities. Security factors include multi-tiered physical security, backup power supplies and communications capability alongside environmental security measures such as climate controls and fire suppression. All of this enables companies to benefit from less expenditure on back-up kit and training and get on with running your business today rather than spending valuable time and money worrying about tomorrow. Quick and agile collaboration through SaaS Managing permissions, access to information, new hires and leavers (provisioning) is also easy to do using the cloud.  Simple to use and relatively cheap software is available to make provisioning easy, enabling employees to be hired, fired and inspired at the click of a mouse.  While it is respectfully suggested that you let a real human do the actual hiring and firing, when the decision is made, setting up permissions and access to company data is now a simple task.  Most of this provisioning software is consumed as Software as a Service (Saas) providing a scalable and predictable cost. Cloud-based provisioning facilitates quick and agile co-operation within and between teams and departments, enabling them to respond rapidly to changing business needs and priorities.  Huge expenditure on clunky systems should not exist in the cloud, anything worth doing should be available on subscription without large fixed costs to buy, install and back-up. Once you have migrated to the cloud, freed your data and employees from poor back-up and service, enabled BYOD, managed your data security and access, you still have to get people working together effectively.  Businesses can only fully exploit the investment in the cloud and enhance workplace productivity by expanding beyond the use of email, calendars and files.  Microsoft Office 365 offers some useful collaboration tools, but Microsoft has been poor at explaining the benefits of them to its end users. The vast majority of employees use the cloud exclusively for their emails and files, however, if you haven’t heard of Groups, Delve, Clutter, Planner, Sway or Teams, do not worry - you are not alone!  It’s worth your time doing some research as the tools available are often very useful when used and can offer a much richer, effective and productive workplace cloud experience. 
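To make the provisioning point above concrete: the article does not name a specific API, but as one hedged example, creating a new starter in a Microsoft cloud tenant can be a single REST call to the Microsoft Graph API. The access token, domain and user details below are placeholders, not details from the article:

```python
# Illustrative sketch of cloud-based user provisioning via the Microsoft Graph API.
# The access token, domain and user details are placeholders; a real deployment
# would drive this from an HR system or identity provider rather than a script.
import requests

ACCESS_TOKEN = "<oauth-token-from-azure-ad>"  # placeholder, obtained via OAuth 2.0

new_user = {
    "accountEnabled": True,
    "displayName": "New Starter",
    "mailNickname": "new.starter",
    "userPrincipalName": "new.starter@example.com",   # hypothetical domain
    "passwordProfile": {"forceChangePasswordNextSignIn": True, "password": "TempPass!2017"},
}

resp = requests.post(
    "https://graph.microsoft.com/v1.0/users",
    headers={"Authorization": "Bearer " + ACCESS_TOKEN, "Content-Type": "application/json"},
    json=new_user,
)
resp.raise_for_status()
print("Created user:", resp.json().get("id"))
```

The same pattern in reverse (disabling the account and revoking group memberships) is what makes leavers as quick to handle as joiners, which is the practical benefit behind the “hired, fired and inspired at the click of a mouse” line above.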
### How the Cloud is Helping Businesses Evolve Communication Cloud-based technology is helping businesses evolve smarter, quicker, simpler communications strategies that many only dreamed possible just a few short years ago. Over the last ten years, corporations and companies both big and small have spent big money – and I mean really big money – on state-of-the-art communications and video conferencing systems in a bid to try to connect their increasingly multi-site or globalised workforces. Step forward to 2017, and we’re seeing remote working and conferencing becoming increasingly commonplace, and businesses opening their eyes to the possibilities of a “new normal” for connecting people, powered by the cloud. We’re also seeing company c-suites – CEOs, CFOs, CIOs and telecoms leaders – wising up to the rapid evolution in cloud technology, which is enabling businesses of all shapes and sizes – including many of our own clients – to rethink and reshape their communications and conferencing. Revolabs – and our parent company Yamaha – believe the cloud has the capability to revolutionise the multi-billion-dollar global conferencing industry. We believe it because we’re seeing first-hand the capability technology has to transform the way businesses connect, streamline costs and engage people. The “new normal” For many, cloud conferencing is becoming business as usual. Big businesses – supported by healthy budgets – can invest in the technology right now, so many are already reaping the benefits of cloud conferencing. But we believe the benefits for all businesses – and for UK plc – could be immense, and we’re aiming to encourage people – including business owners, CIOs and CEOs of big, medium and small companies – to understand the benefits to the bottom line and to the productivity of their teams. Take me as an example. I look after Revolabs’ sales and operations across Europe, the Middle East and Africa (EMEA) on behalf of our parent company Yamaha. And I live in Cheshire in the North West of England. For the first time, last year I made the jump to the cloud. Believe me, EMEA is a big patch to cover! But instead of being like George Clooney’s character in the film Up in the Air (although I could do with the air miles!) I can connect to clients across the world using our cloud conferencing products and systems from my home office, a local coffee shop, a huddle space, or even on the 7:56 am from Milton Keynes… OK, so the last one might be pushing it a bit! All the functions of my job can be delivered through the cloud. My CRM can be accessed via my browser or Android app, as can my email, networking app, data storage and company analytics. Having all this data to hand means my meetings are as productive as they can be. I use a cloud-based video conferencing system which requires no special hardware and can be accessed by anyone, even if they don’t have regular access to the service themselves. The point is I’m using cloud technology to get business done. I achieve sales to support our parent company Yamaha’s efforts to increase their cloud conferencing market share in the EMEA region. I can engage and support my teams. And I can see my clients smile. Think back just three or four years ago and meetings were pretty much the same experience for us all. They always followed a standard format, with face-to-face being the preferred option… that’s if you could find a meeting room in the first place!
There were usually a lot of people (many having travelled to be there) and almost everyone would have that slightly uncomfortable feeling that they could have been doing something more productive. And that’s because they all knew they could have been. The conference call used to be the second-best option. It involved either audio only over the mobile or desk phone, or dialling into the dedicated videoconferencing room. This room had dedicated hardware to facilitate the call and unfortunately was not accessible to everyone. The availability, or lack thereof, meant that conferencing didn’t become the habitual part of working life that business leaders wanted to see, which is a shame, as many had invested a significant amount of money in these systems as part of time-efficiency drives, financial savings programmes or carbon footprint reduction programmes. The cloud, ever-advancing technology, the simplicity of operation and the growth in more “opex”-based (i.e. usage-based or ‘pay as you grow’) technology have all combined to bring about a fundamental shift in modern-day meetings. [easy-tweet tweet="Gartner claimed that the cloud is evolving every single area of a company’s IT infrastructure" hashtags="Cloud, IT"] And, as a result, this space has changed forever. Living in the Cloud I live and breathe conferencing – in particular cloud conferencing – but I’ll break this down simply. The cloud has changed the landscape completely. Only recently Gartner claimed that the cloud is evolving every single area of a company’s IT infrastructure. That’s any company. Any shape, any size. Why? Because cloud technology is allowing users to drive how they access information and business services. It is enabling companies to start looking at more efficient financial models as they examine less capital-intensive infrastructures. It means we have permanent access to data when we need it. We now have the capabilities to run high definition video conferencing calls using nothing more advanced than a laptop or smartphone. We have made accessibility much smarter and simpler. Today, participants can join a meeting that is hosted online, but they can also share content with all the participants and, depending on which service they use, they can also invite people to dial in from traditional telephone numbers. The flexibility is staggering. And it is evolving to shape the market it serves. Cloud coverage Over the last 18 months, we have seen significant growth in the conferencing market, largely enabled by the popularity of cloud conferencing technology from the likes of Zoom, BlueJeans, Citrix and Skype4Business. It has been reported that by 2020 the revenue generated by manufacturers of conferencing technology will exceed $6.4 billion. That’s the GDP of a small nation. But it shows the scale of the opportunity and also the demand. We’re seeing an evolution, not a revolution, in the cloud conferencing market. We’re working with many enterprises – big and small – who are evaluating the necessity of having all-singing, all-dancing meeting rooms. Instead, they are seeking to use their investment to enable a much greater number of smaller “huddle”-style meeting rooms that have basic – but still highly technical and smart – screens, cameras and sound peripherals. These rooms can accommodate a PC or a NUC running the latest cloud conferencing application. Importantly, the outcomes are achieving engagement objectives.
The clients we’re working with are enabling us to strengthen the theory that greater accessibility enables greater adoption too. By bringing your own device (BYOD) to organised video conferences, people are increasingly able to break free of the restrictions and limitations of meeting rooms, face-to-face meetings and teleconferencing. Instead, cloud conferencing is enabling companies to introduce more visually and psychologically stimulating spaces in which to meet. Bean bags, sofas, coffee shops and adapted VW campervans are all becoming meeting spaces. And cloud conferencing and data services are very much at the heart of making this evolution possible. This cloud thing is making the world a smaller place. It’s bringing economic prosperity to more and more regions of the world, and that has got to be a good thing. One thing’s for sure: I’ll never think of clouds hanging over me as a bad thing again. ### Telcos Need to Adapt Quickly to Deliver Omnichannel Almost every five years we see a significant evolution in technology lifecycles. In 2000, we saw IP-based communications (mostly desktop) go mainstream in the developed world. From 2005, it spread to mobile and then the adoption of smartphones exploded, led by the iPhone, and getting ‘stuff to things’ became mainstream. But by 2010, apps were de rigueur, and there was a dramatic shift of customer conversations to social media (think Twitter and Facebook). We also saw the introduction and rapid take-up of chat apps (WhatsApp, WeChat, Viber, etc.), which have themselves become multi-device platforms for Voice over Internet Protocol (VoIP) and video calling, messaging, content distribution, payments, and other services. Interestingly though, a recent white paper - commissioned by IMImobile - suggests the way consumers interact with companies is perhaps going back to the future. In the early noughties, we were using voice and text over telco-dominated channels, which then shifted online as consumers conversed more with each other and with businesses. We seem, though, to have now returned to natural language (text and voice), but with one significant difference. The always-on culture means consumers want to interact with brands on multiple channels of their choice and at a time of their convenience. In industry speak, it means companies are now faced with orchestrating a true omnichannel customer experience. Not a new concept, but one which is becoming increasingly core to companies’ success in the world of the digital citizen. This has all happened so quickly, thanks mostly to telcos. They have been instrumental in the dramatic change in how we communicate with each other by investing heavily in network infrastructure, including high-speed data networks. For example, BT announced investments of £6bn across Openreach and EE in 2015, while Vodafone had committed to an additional £2bn in mid-2016. This has enabled the social and digital platforms, content and services that stimulate interaction between businesses and their customers today. In fact, it has helped develop a whole new ecosystem, with connectivity and high-speed internet almost a given, certainly in advanced economies and in many other areas around the world.
[easy-tweet tweet="With this evolution in technology comes a rapid change in consumer behaviour" hashtags="Technology, AI"] In the past two years, we’ve seen the rise of AI through Natural Language Processing and Machine Learning -  including automation via chatbots and virtual assistants, powered by powerful platform-based engines such as Amazon Alexa and Apple Siri. We are just at the start of the AI journey in business-to-consumer interaction. But with this rapid evolution in technology comes a rapid change in consumer behaviour. And today’s telcos and businesses have to be able to adapt quickly to succeed, ensuring that they can effectively and proactively interact with customers across their journey at the perfect point to deliver maximum value. This is also key for telcos to be able to effectively upsell services resulting from initiatives such as the vast investment made in quad play (mobile, fixed line, Broadband and TV), which has resulted in service fragmentation. And it is why omnichannel has such value, and why it’s essential to the future of telcos. So what do they need to do to get it right? Firstly, telcos need to identify and address capability gaps in their organisational competence and technology platforms, if they are to maintain optimal relationships with their customers into the future. To successfully deploy a platform approach to omnichannel customer experience, telcos must also commit to providing a unified view of the customer that extends across all of their departments and data systems. And they must focus on achieving four key objectives as they build out their omnichannel platform: customer or persona recognition, orchestration, continuous adaptation, and protection of the customer. Orchestration is particularly important if telcos are to avoid customer journey fragmentation. They must remove system and operational silos and create a single unified customer orchestration layer, utilising the right digital channel at the right point in the customer journey. Last but not least, digital transformation has also put pressure on telcos to respond to rapidly changing market dynamics, including the imminent rollout of 5G networks, integration into the IoT and the increasing use of Artificial Intelligence. In sum, digital-savvy consumers with higher expectations are driving the need for telcos to radically improve the customer experience. An intelligent orchestration layer, supported by real-time contextual data, predictive analytics, and automation, all integrated with back-end systems for fulfilment, provides the platform for omnichannel. This is a major milestone towards ensuring profitable growth and staying relevant to the customer and ultimately keeping them. The recent white paper “Orchestrating the omnichannel customer experience” by Ovum Consulting provides a clear pathway and guidance on how telcos can digitally transform the customer experience in a “dynamic, unpredictable, complex and ambiguous environment”. ### State Street Appoints Head of Global Exchange Business for EMEA State Street Corporation (NYSE: STT) announced today the appointment of David Pagliaro to the role of head of State Street Global ExchangeSM for Europe, the Middle East & Africa (EMEA). Based in London, Pagliaro will report to John Plansky, global head of State Street Global ExchangeSM and Liz Nolan, co-head of Global Services for EMEA. Pagliaro has nearly two decades of professional experience in financial services. 
Prior to joining State Street, he worked for S&P Capital IQ for nine years. During this time, he held a number of roles, most recently global head of S&P credit solutions, where he managed a number of responsibilities including the commercialisation of research, data and analytics across the Americas, Asia-Pacific and EMEA. Before this, he was global head of corporate and commercial lending at the firm and, prior to that, senior director for its EMEA fixed income solutions. Commenting on the appointment, Nolan said, “We are delighted to welcome David to our EMEA team. His wealth of experience across multiple segments of the financial industry will further strengthen our EMEA offering as we continue to grow our presence in this region.” [easy-tweet tweet="The demand for Global Exchange’s services and data has rapidly increased " hashtags="Fintech, Data"] Plansky added, “We launched Global Exchange, our data and analytics arm, four years ago, recognising that the industry and our own business is evolving to become data-led. The demand for Global Exchange’s services and data has rapidly increased during this time, driven by factors including low returns, the growth of passive investing and regulation, among others.” James (JR) Lowry, who most recently held the role of EMEA head of Global Exchange, will return to Boston to lead Global Exchange for North America and Global Exchange’s Innovation and Advisory Solutions team. He will also oversee State Street Associates® (SSA), the academic affiliate of State Street, whose partners include renowned academics from Harvard Business School, Massachusetts Institute of Technology’s (MIT) Sloan School of Management and Boston College. Lowry – who will report to John Plansky – will be returning to Boston in July after a comprehensive handover. About State Street State Street Corporation (NYSE: STT) is one of the world's leading providers of financial services to institutional investors, including investment servicing, investment management and investment research and trading. With $29.83 trillion in assets under custody and administration and $2.56 trillion in assets under management as of March 31, 2017, State Street operates in more than 100 geographic markets worldwide, including the US, Canada, Europe, the Middle East and Asia. For more information, visit State Street’s website at www.statestreet.com ### A Radical New Model is Needed for Data Analytics to Thrive Data warehousing has reached an impasse. Born in an era of monolithic systems walled off from the outside world, the data warehousing technologies that have dominated the last thirty years stand today as an unsightly reminder of the wear and tear taken by increasingly vain attempts to help them stumble forward into the modern world. The legacy data warehouse has struggled because it suffers from the limitations of an approach and architecture defined by the constraints of the physical world, a world in which resources were limited, applications ran in isolated silos, and changes were infrequent and carefully controlled. The limitations born from that world are today the chains that shackle the data warehouse, tying it to the past and preventing it from addressing the realities of the present and future. Simply put, the legacy warehouse was not designed for the volume, velocity, and variety of data and analytics demanded by the modern enterprise.
Data analytics, once accessible only to the largest and most sophisticated global enterprises, is today a top priority for marketing, finance, development and sales at organisations of all types and sizes. Stories of the outsized impact of data analytics only increase the demand for better insights – for example, the story of how utilities can realise a 99 percent improvement in accuracy by harnessing big data. However, the growing demand for data analytics is shining a spotlight on the painful inadequacy of legacy data warehouses. For one, they were never designed for the sheer volume and pace of data that exists today. In large enterprises the volume of data generated within the organisation is overwhelming. For example, in just one year Rolls-Royce generates over three petabytes of data in the manufacturing of fan blades for turbines. Mining that data is essential to design, manufacturing and after-sales support. Exploding volumes of data are not just a large-enterprise problem: even mid-sized organisations that may not generate such volumes of data internally have access to large external data sources such as data.gov.uk. Traditional data warehousing, with its chains of the physical world, is prohibitively expensive and inflexible, limitations that become painfully obvious at the scale of today’s data. The limitations force difficult capacity planning exercises and large upfront expenditures to acquire sufficient capacity in advance, simply to avoid a painful, disruptive exercise to scale your data warehouse later. Not only is the scale of data a challenge for organisations, so is the demand for faster results. Not so long ago users would have been happy to get updated analytics on a weekly or daily basis. Today that would be seen as a failure: users expect results in seconds and expect those insights to be based on data that is continuously updated, even as the number of people with direct access to data and analytics grows. Again, the limitations of traditional warehousing become painfully clear. Prohibitively slow batch processing has crippled business intelligence (BI), and the inability of legacy architectures to scale to handle growing numbers of users ruins response times for everyone. How do organisations cope with adapting legacy data warehousing to address these challenges? Unfortunately, most have struggled mightily to do so. The complexity and labour required to keep these legacy solutions from collapse add to the strains not only on IT but also on the business. It's no wonder that less than half of business executives in a recent UK survey expressed confidence in the insights they have from their data and analytics solutions. If enterprises are to continue to grow, they will need a radically different approach to data analytics - one that leaves behind legacy shackles from the physical world and ascends to the cloud. [easy-tweet tweet="A cloud-based solution should be able to operate painlessly at any scale" hashtags="Cloud,DataAnalytics"] The promise of cloud for data analytics is a solution that frees organisations from the limitations and constraints around which legacy data warehouses were designed.
A cloud-based data warehouse should not only eliminate limits to resources and the complexity of systems and infrastructure; it should also fundamentally eliminate the friction and complexity of providing the entire organisation with fast, flexible access to all of the data it needs to make the best possible decisions. It should empower enterprises to shift their focus from systems management to analysis. To deliver this, a cloud-based solution should be able to operate painlessly at any scale, adapting immediately and effortlessly to dynamic changes in data volume and analytics workloads. It must also make it possible to combine diverse data, both structured and semi-structured, to bring together the data silos that have prevented users from accessing data locked away in a sprawl of enterprise databases, CRM systems, general ledgers and web applications. It must do all of this at a cost and efficiency that is connected to the value delivered, not tied to a legacy model of upfront costs and ongoing management burdens. Such a data warehouse changes the way that analytics are made available to users. Instead of an assembly line in which multiple steps are required before data is available for users and their analytics, a new approach can emerge that can be thought of in terms of distillation: a system in which data is available immediately and at every step of refinement, supporting data exploration and experimentation in the same system that also supports business intelligence and reporting. This not only changes the experience for users, but it also changes the experience for the business. Data insights can always be up to date and directly accessible to everyone who needs them. Further, they can be richer and more comprehensive because they can combine data from all possible sources to create insights that would have been inaccessible before. This new world holds the promise of reinvigorating data analytics, ending the struggle faced by enterprises today. Although it is difficult to forecast the changes taking place across the data landscape, the promise of clouds on the horizon could mean new life for your business. ### CrowdStrike Extends Falcon Platform with Enhanced Cloud Coverage Reading, UK – CrowdStrike® Inc., the leader in cloud-delivered endpoint protection, today announced, as part of its Spring release, new features of the CrowdStrike Falcon platform custom-built for cloud providers and modern data centres, providing best-in-class prevention, detection and response for Windows, Linux or macOS servers, powered by artificial intelligence/machine learning. The servers used in the modern-day data centre are faced with commodity as well as advanced, stealthy attacks. CrowdStrike Falcon leverages industry-leading artificial intelligence/machine learning as well as Indicator-of-Attack (IoA) behavioural analysis to bring real-time protection to servers whether on-premise, virtualised or in the cloud. As data centre or cloud deployments grow or evolve, CrowdStrike Falcon frees customers from having to add additional management servers or controllers for endpoint protection. With Falcon’s lightweight agent, customers can quickly and easily add end-to-end protection with instant zero-reboot deployments, no performance impact and no signature updates - all of which improve the performance of business-critical servers.
CrowdStrike Falcon enables management of all systems, irrespective of their location, from a single console, providing a consolidated view into all assets for the enterprise. CrowdStrike Falcon supports all major platforms including Amazon AWS, Google Cloud Platform and Microsoft Azure. It also provides protection for guest OSes hosted on all popular hypervisors and protects Windows, Linux and macOS guests with a kernel-mode agent. CrowdStrike Falcon allows for complete protection policy control, with full flexibility around policy deployment at the individual server, group or cloud platform/data centre levels. Irrespective of how a server is deployed, the security team retains complete visibility and the control required to prevent or contain an attack. New and Enhanced Capabilities CrowdStrike Falcon provides features critical to securing data centres, focused on control, visibility and complete protection: Linux Kernel-mode Agent – the Falcon Linux agent is now a full kernel-mode module, providing comprehensive real-time visibility into key OS events from its position in the kernel. Amazon Linux Support – the Falcon Linux agent now fully supports the Amazon Linux distribution, a popular platform on Amazon Web Services (AWS). Falcon Discover – Falcon Discover’s asset, application and user account visibility features help to optimise workloads, manage costs and audit/remove unauthorised accounts on systems deployed in the cloud, in data centres and on-premise. Falcon Data Replicator – Falcon Data Replicator provides real-time access to the raw event data stream, which customers can ingest into their local data lakes for correlation against event data collected from other systems. This opens up the comprehensive dataset of more than 270 OS-level event types that Falcon Insight customers can now integrate into their own data analytics solutions. AV-Comparatives has certified CrowdStrike Falcon for anti-malware and exploit protection and noted that Falcon can “help organisations’ efforts with respect to PCI, HIPAA, NIST and FFIEC compliance.” “For a while now, within our highly complex environment, managing high-value systems required a choice between maximum protection and maximum performance – CrowdStrike has removed that dilemma,” said Anton Reynaldo Bonifacio, chief information security officer, Globe Telecom. “Adding best-in-class prevention, detection and response without increasing complexity has long been atop every CISO’s wish list. CrowdStrike Falcon is lightning fast to deploy and manage, and doesn’t slow down a single machine – on-premise, in the cloud, or anything in between.” [easy-tweet tweet="Many legacy AV solutions don’t provide sufficient visibility to enable threat hunting" hashtags="AV,Cloud"] “With this Spring release, we continue to advance the Falcon platform to ensure customers can protect all of their systems, whether physical, virtual or cloud-based, with reduced complexity and improved performance,” said Dmitri Alperovitch, CrowdStrike’s co-founder and chief technology officer. “Many legacy AV solutions don’t provide sufficient visibility to enable threat hunting and forensic use cases; they poorly protect non-Windows environments, and are cumbersome and sometimes risky to deploy to cloud or hybrid cloud-based data centres.
CrowdStrike Falcon addresses all of these pain points and adds scalability, efficacy, and speed.” Recently named a Visionary in the 2017 Gartner Magic Quadrant for Endpoint Protection Platforms, CrowdStrike has set the new standard for endpoint security — providing organisations with the only solution that can prevent, detect, respond and hunt for attacks via a single lightweight agent. The platform has achieved impressive success in the market replacing not only legacy AV solutions, but also a variety of next-generation AV point products. CrowdStrike Falcon has been independently tested and proven as an effective AV replacement, including verification from testing with AV-Comparatives and SE Labs. For additional information about the Spring release, please visit our blog here. ### An Explanation on Why Hybrid Cloud is Here to Stay  The steps to hybrid cloud and what benefits it can help deliver Cloud isn’t just a passing trend, it’s here to stay – but it remains a challenge for some companies. So what’s holding business leaders back? Budget and resource constraints are often the biggest obstacles to organisations making changes to their business models - and adopting cloud at scale can be risky, and at times, a daunting process. However, without implementing cloud at scale, enterprises will never experience the true impact, value, or even lower overheads it can offer. Traditional IT environments are unable to keep pace with the growing demands to ensure faster time-to-market and the digital innovation they need to stay ahead of the competition. And legacy infrastructure is increasingly proving expensive and time-consuming to maintain, as well as being slow and inflexible when it comes to delivering new IT developments. Organisations that embrace digital technologies to keep pace with customer expectations will gain a significant competitive advantage over those that fail to make the necessary changes. Embracing the cloud, why hybrid is the answer For companies wishing to introduce cloud into their IT infrastructures, it’s recommended to start small and scale up over time. Enterprises should, gradually, take steps to adopt cloud for scalability, time to value and flexibility - while retaining valuable assets in the safety of the business’ current IT infrastructure. This configuration is commonly known as hybrid cloud. Just like hybrid cars that use electricity and petrol when power is low, hybrid cloud brings two technologies together to support each other. This allows IT teams to use on-premise systems for some tasks and cloud for others. In 2017, enterprises will increasingly live in a hybrid IT world, split between on-premises solutions and cloud environments. It’s easy to see why hybrid strategies are on the rise; hybrid cloud enables workloads to exist on either a vendor-run public cloud or a customer-run private cloud. This means that IT teams can harness the security and control of a private cloud as well as the flexibility of public cloud services. [easy-tweet tweet="Organisational requirements are constantly changing and evolving" hashtags="Hybridcloud,Cloud"] Recent reports have revealed that many organisations have already started to split their cloud budgets between public and private deployments, creating demand for hybrid cloud strategies that give companies greater flexibility. A recent study from IDG Research Services has shown that by increasing IT agility through hybrid cloud, digital transformation becomes faster, easier and less expensive. 
The survey also reveals that hybrid cloud enables accelerated and improved investment by reducing IT costs.  The four biggest benefits of hybrid cloud Scalability Organisational requirements are constantly changing and evolving. Hybrid cloud solutions are particularly valuable for dynamic workloads; when work surges, a hybrid environment allows for quick scalability to meet these needs. These can then be scaled back to avoid over-provisioning to keep costs under control when demand drops. Time to value Hybrid cloud is fast and inexpensive to scale out and does not require the up-front investment in the infrastructure of private cloud. It’s no longer a case of trying to determine the maximum load; a hybrid solution allows for allocation and reallocation to meet changing workload needs as and when required - which can have a significant impact on the bottom line. Companies also need to take into account the business value of getting a revenue-generating application to market sooner. Flexibility With hybrid cloud in place, IT teams can take advantage of ‘platform-as-a-service’ (PaaS), where applications can be developed, tested and launched quickly, efficiently and securely. Simplifying and automating the IT infrastructure also increases its resiliency – reducing the likelihood and impact of business downtime. Furthermore, hybrid cloud enables organisations to create an ‘always-on’ business environment – along with the ability to make key decisions in real-time, this provides a high level of service to customers, employees and partners. Security While there’s a perception that the cloud isn’t as secure as on-premises infrastructure, there is increasing evidence that public cloud environments suffer from fewer attacks and viruses than traditional IT environments. Also, security policies can be applied consistently across the standardised services – ensuring that workloads reside where they have the appropriate level of protection. Finally, with the introduction of self-service, there’s less inclination to resort to shadow IT - avoiding unnecessary security risks. Where to start with hybrid cloud One of the most important steps organisations need to remember is to not just look at the typical hybrid cloud definition - which tends to look solely at the infrastructure involved. Instead, they need to focus on what should be done to suit each business need; running server workloads, split between on-premise and cloud platforms, is a difficult and complex system with many moving parts that all need to be managed well. Companies should look at hybrid cloud platforms from a Line of Business (LOB) perspective. Many services can be placed in the cloud; HR, Payroll, Customer Management Systems, Marketing and Social Media all have desirable cloud solutions, and when integrated with on-premise applications, become very productive with quicker time to value and lower overall costs. A growing number of organisations are choosing to adopt a hybrid cloud strategy – leveraging the cost savings, flexibility and scale of public cloud services where appropriate while continuing to host the most business-critical applications on-premise. According to a Market Forecast Report for the Worldwide Hybrid Cloud Computing Market from 2016 to 2022, nearly 82 percent of enterprises will have a hybrid cloud strategy in place for 2018 - with more than 60 per cent of large enterprises planning to implement hybrid cloud by 2020. 
In the next five years, we will see a strong increase in the adoption of a hybrid cloud approach to IT service delivery. Recognising the need to adopt cloud technologies alongside legacy infrastructure as a means of working smarter is one thing, but selecting the right technology and approach is what will generate a significant boost in productivity, innovation and the bottom line. ### Technology and the Law Ep 3: GDPR Watch the third episode of “Technology and The Law”, a brand new show broadcast by Disruptive Tech with presenter and technology lawyer Frank Jennings. This show focuses on the legal aspects of, and debates around, the technology industry. Previous episodes have focused on Data Location and Contract Risks from Cloud Computing. This week’s episode, the third in the series, focuses on GDPR with a great panel, including Martin Warren from Netapp, Avril Chester from Scope and Chris Cooper from Knownow. Frank Jennings is the commercial and cloud lawyer at Wallace LLP. He specialises in commercial contracts covering cloud and technology, data security and intellectual property. Frank is known for his can-do attitude, managing risk and problem solving to maximise return on investment. Who else could present this show? If you are interested and want to learn more about the legal aspects of, and debates around, the technology industry, then tune in! Hashtag: #TechLaw Twitter: @disruptivelive ### Prepare for Disruption - #Cloudtalks with Oxygen Associates, David Sowerbutts In this episode of #CloudTalks, Compare the Cloud speaks with CEO David Sowerbutts from Oxygen Associates. David explains that Oxygen Associates is a specialist consultancy that provides software to enterprise clients; anything with an IP address or SIM card allows them to run a profile of the company and establish its current state. Oxygen Associates can begin with the data, then build an investment case, and finally find the correct resources to apply a solution. This can be done on a risk-aware basis over the course of 8-12 weeks. #CLOUDTALKS Visit the Oxygen Associates page to find out more: http://www.oxygenassociates.com ### Helping Customers Make That Cloud Journey - #Cloudtalks with Bell Integration's John Davenport and Eugene O’Sullivan In this episode of #CloudTalks, Compare the Cloud speaks with John Davenport, Sales Director, and Eugene O’Sullivan, Strategy and Innovation Director, from Bell Integration. John began the discussion by explaining Bell Integration and how they work within the IT environment to develop and build software that customers use on the move to cloud technologies. Eugene explains how the pitfalls of cloud adoption can put clients off the move; Bell Integration look to take away these concerns by building a methodology that will break down customers’ hurdles. John explains that customers are driven by an end date, and this is why people go through Bell Integration to achieve that end date. #CLOUDTALKS Visit the Bell Integration page to find out more: http://www.bell-integration.com/ ### IoT - #Cloudtalks with Zoetrope Labs, Ben Howes In this episode of #CloudTalks, Compare the Cloud speaks with Founder Ben Howes from Zoetrope Labs. Ben explains that Zoetrope Labs is a consultancy specialising in IoT, product development and LPWA (Low Power Wide Area) networks. Their aim is to reduce the cost of IoT connectivity.
Furthermore, Ben speaks about the unique side of their new product, ZConnect, which pulls together all the building blocks of IoT to create a turnkey solution for device manufacturers. #CLOUDTALKS Visit the Zoetrope Labs page to find out more: https://zoetrope.io ### Predictive People Analytics - #Cloudtalks with Tap’d Solutions’ Anthony Ryland In this episode of #CloudTalks, Compare the Cloud speaks with Managing Director Anthony Ryland from Tap’d Solutions. Anthony speaks about how Tap'd Solutions is an HR consultancy, and how they realise that technology is ripe to accelerate the HR journey. Tap'd Solutions help HR leaders make better data-driven decisions. Right now, they are focussing on education through forums, building a community of HR professionals that can grow together. #CLOUDTALKS Visit the Tap'd Solutions page to find out more: http://www.tapdsolutions.com ### Protecting Data on the Move - #CloudTalks with Bluebird IT Solutions' Filip Novak In this episode of #CloudTalks, Compare the Cloud speaks with account and business development manager Filip Novak from Bluebird IT Solutions. Filip explains that Bluebird IT Solutions helps companies manage and secure their most important file transfers within and outside organisations. Filip further speaks about GDPR and how companies will need to look at their IT infrastructure and identify their weaknesses. As data is at its most vulnerable while in transit, Bluebird can help protect data on the move by providing solutions that add security at each level of communication, visibility and monitoring. #CLOUDTALKS Visit the Bluebird IT Solutions page to find out more: http://www.bluebirdits.co.uk/ ### The Data Explosion Effect Due to recent advances in technology, businesses now have access to a plethora of data on their customers. This is often used to understand them better, but also to create more targeted campaigns, develop holistic services for clients and direct business strategies. However, whilst the benefits of having large amounts of data are clear, has it ever crossed your mind whether we actually need all of it? Challenging data Sending emails, making phone calls, collecting information for campaigns; each day we create a massive amount of data just by going about our normal business, and this data explosion does not seem to be slowing down. In fact, 90% of the data that currently exists was created in just the last two years, and because of this experts are predicting a 4,300% increase in annual data production by 2020. The problem with this data overload is that, while companies have realised the value data can bring, they’ve adopted the belief that simply collecting sheer volumes of it will deliver the insights they’re craving. As such, they’re struggling to cope with the data they hold, revealing a fundamental lack of understanding of how data should be managed and used. [easy-tweet tweet="Many firms keep data “just in case” because they don’t know if it will be relevant in the future" hashtags="Data, Storage"] The data that is kept and where it is stored can have a big impact on a company’s ability to turn data into a valuable asset. Many firms simply keep data “just in case” because they don’t know if it will be relevant in the future. This mind-set is doomed from the start, as a lot of data has a set lifespan.
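To make the idea of data with a set lifespan concrete, the snippet below is a minimal, illustrative sketch – not any vendor’s assessment tool – that walks a directory tree and flags files untouched within a chosen retention window. The root path and the three-year threshold are arbitrary example choices, but this is the sort of first-pass signal a data assessment might use to separate live data from candidates for archiving or deletion.

```python
# Illustrative sketch: flag files untouched for longer than a retention window.
# A toy first pass at spotting "dark data"; the path and threshold are examples.
import time
from pathlib import Path

RETENTION_DAYS = 3 * 365                     # assume ~3 years untouched is suspect
CUTOFF = time.time() - RETENTION_DAYS * 86400

def find_stale_files(root: str):
    """Yield (path, age_in_days) for files not modified since the cutoff."""
    for path in Path(root).rglob("*"):
        if path.is_file():
            mtime = path.stat().st_mtime
            if mtime < CUTOFF:
                yield path, int((time.time() - mtime) / 86400)

if __name__ == "__main__":
    stale = sorted(find_stale_files("/srv/shared-drive"), key=lambda item: -item[1])
    total_bytes = sum(p.stat().st_size for p, _ in stale)
    print(f"{len(stale)} stale files, {total_bytes / 1e9:.1f} GB of storage")
    for path, age in stale[:20]:
        print(f"{age:>5} days old  {path}")
```

Even a crude report like this gives a business something to act on: a list of candidates to archive, delete or move to cheaper storage, rather than an ever-growing pile kept “just in case”.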
What’s more, this mentality is often fuelled by the fear of deleting something of potential importance that can’t be retrieved at a later date (when asked for by a manager, for example), or by the need to abide by compliance rules. As more and more data piles up, it becomes increasingly difficult to determine what is useful and what isn’t. It also racks up the IT costs for storage and back-up. In fact, research has shown that of all the data many businesses are storing, on average 52% of it is considered "dark", meaning there’s no real use for it and in effect it’s a waste of resources. Where does the value in your data lie? Fortunately, there are services and tools available to help businesses find value in the data they possess. One way is through data assessments, which look at existing data, gather intelligence and keep a regular eye on it, giving businesses the insight they need to decipher what is important to them. Another option to consider is machine learning, which will help with the handling of data. This is particularly useful for SMEs: once it makes it past the early-adopter phase, they will have an automated process in place that would previously have been costly in terms of man-hours. However, until they’re able to put this in place, smaller businesses should consider the storage solutions they're using in order to make the most of the information they have at hand. They need to differentiate between the data they might need quick access to, which is likely to be the most useful, and the data that is there just to be stored for compliance’s sake. These services can essentially take on the burden that comes with the current data explosion. As data is generated and changed constantly, these services can monitor each type as and when it comes along and set policies around it. Folders of documents that haven’t been accessed for years can be flagged and moved to more suitable locations. Businesses will then be empowered to take informed decisions about their data, armed with a greater level of insight into its contents and value. Benefits to your business By understanding how to maintain data more effectively, businesses can realise great operational benefits such as increased productivity and significant cost savings. Insight’s Data Optimisation Service, for example, can give organisations visibility of their data in its entirety and recommend services that provide continued analysis of customers’ environments. Maintaining data is costly, but intelligent assessments can help to identify the useful stuff and enable companies to take charge of the current data explosion. ### Student Experience and Retail - #Cloudtalks with Katalyst Communications’ Marcus Langford In this episode of #CloudTalks, Compare the Cloud speaks with Business Development Partner Marcus Langford from Katalyst Communications. Marcus speaks today about how Katalyst have an unusual customer experience background through retail centres and ways to improve the retail basket. Marcus also speaks about how they are trying to assist companies who are selling online to increase their basket sales and improve the student experience. Katalyst is seen as unique as they bring so many platforms together within student experience and retail. #CLOUDTALKS Visit the Katalyst page to find out more: http://katalystcommunications.com ### Managed Service Providers, Look to the Cloud Take a look at your business; its very success is tied to technology. Whatever the industry, growth, efficiency and much more are dependent on an organisation’s IT infrastructure.
For the IT services industry, this has understandably had a profound effect. Dependency on technology to support your core business and ensure customer satisfaction makes it a key competitive differentiator. If your technology isn’t up to scratch, there’s a good chance your business isn’t either. The importance of effective IT has not been lost on most businesses. It is fair to say that many have realised they were unable to manage the demands of effectively running their infrastructure alongside driving and executing on their operational business goals. We’re seeing this borne out all around us as greater numbers of companies turn to managed service providers (MSPs) to step into the breach and fill that skills gap. It is those that partner with MSPs that can introduce and leverage the latest technologies the quickest. They can tweak and update their service models to accommodate constant change in business requirements and align with customer demands. Being able to get the newest and best solutions to market quicker helps keep existing customers satisfied and can also win new customers too. As MSPs grow in influence, they too need to look at how they can scale their operations. For most, the answer lies in the cloud and network management solutions. But why the cloud? In optimising operations, three key benefits emerge: Benefit 1: Flexibility For MSPs, the number one benefit of leveraging the cloud for the management of their unified wired/wireless networks comes in the form of enhanced flexibility. With an MSP’s customers’ infrastructures geographically spread out, the cloud offers the ability to scale, optimise and manage the distribution of resources, conveniently from one centralised location. What’s more, MSPs are able to use their resources more efficiently. Given that customer resource consumption varies according to their relatively fluid demands and contracts, additional resources can be deployed or surpluses redirected elsewhere. New networking devices can also be connected in just a matter of minutes without dispatching staff, thanks to zero-touch provisioning. All in all, the cloud gives MSPs greater freedom and flexibility to scale effectively. Benefit 2: Better staff management Unified wired/wireless cloud-managed networking platforms offer greater simplicity for deployment and operations, meaning a reduced training burden on MSP staff. It is no longer imperative to invest sizeable amounts of resource – time and money – in training staff to be experts just to keep day-to-day operations running. Instead, teams which are generalists rather than specialists can be employed, allowing more manpower to be dedicated to innovation and business transformation. [easy-tweet tweet="Any MSP will tell you that customers are more demanding than ever before" hashtags="MSP,Cloud"] Benefit 3: Automation Any MSP will tell you that customers are more demanding than ever before; being able to respond rapidly and in an agile manner is now a point of differentiation. The key to integrating automation into MSPs’ business models is API technology. Built into network nodes and centralised management, APIs allow applications to directly program the network infrastructure (the short sketch that follows gives a flavour of what such a call can look like). As a result, cloud-managed solutions with API-enabled automation make it possible for MSPs to become more agile and reactive to customer needs by offering more innovative services and models while retaining a high level of efficiency. APIs are a huge growth opportunity for MSPs and their customers alike.
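As a rough illustration of API-driven network automation, the sketch below posts a new access-point definition to a cloud management controller over a REST API. The endpoint, payload fields and authentication header are hypothetical stand-ins invented for the example rather than any particular vendor’s interface; real cloud-managed networking platforms expose similar, but differently named, calls.

```python
# Hypothetical illustration of API-driven, zero-touch provisioning.
# The URL, header and payload fields are invented for this example and do not
# correspond to any specific vendor's cloud management API.
import requests

CONTROLLER = "https://cloud-controller.example-msp.com/api/v1"
API_KEY = "<api key issued by the controller>"  # placeholder

def provision_access_point(serial: str, site: str, ssid: str) -> dict:
    """Register a new access point against a customer site and push a base SSID."""
    payload = {
        "serial_number": serial,   # factory serial used for the zero-touch claim
        "site": site,              # customer location to attach the device to
        "wireless": {"ssid": ssid, "security": "wpa2-psk"},
    }
    response = requests.post(
        f"{CONTROLLER}/devices",
        headers={"X-Api-Key": API_KEY, "Content-Type": "application/json"},
        json=payload,
        timeout=30,
    )
    response.raise_for_status()
    return response.json()

if __name__ == "__main__":
    result = provision_access_point("Q2XX-1234-ABCD", "acme-head-office", "Acme-Corp")
    print("Device registered:", result)
```

Because the same script can be run for the tenth site as easily as the first, this is the mechanism by which an MSP turns device roll-outs from site visits into repeatable, automated jobs.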
With APIs playing an increased role, the operations model will become more stable: companies will be able to leverage their infrastructure further, and MSPs will be able to optimise and grow their offerings. Conclusion The cloud, by accommodating flexibility, improving staff training and delivering automation capabilities, should be the bedrock of all MSPs’ integrated and all-inclusive network management offerings. In an environment where agility, scalability and efficiency are characteristics of success, the cloud provides MSPs with the edge that will differentiate them from the competition. ### Hybrid Multi Cloud - #Cloudtalks with Softcat’s Nick Barron In this episode of #CloudTalks, Compare the Cloud speaks with Product Manager Nick Barron from Softcat. Nick speaks today about what Softcat are doing in the cloud and where they see the market going. He explains how Softcat is unique in believing the world is multi-cloud, and how they work with a variety of cloud providers, such as AWS and GCP. Their aim is to work with customers to place their infrastructure in the right place, whether that be multiple cloud platforms or hybrid. This year Softcat are focussing on helping customers understand why they will want to go to the cloud. They believe it is all about choice, all about placement, and understanding what a business has. #CloudTalks Visit the Softcat page to find out more: https://www.softcat.com ### The Cloud - Thank You for Helping SaaS Providers to Step it up a Gear This is a love letter to the Cloud. That’s right, you read that correctly. I love the Cloud and I’m not afraid to say it… A little sappy, I know, but you get where I’m going with this; the Cloud is enabling SaaS providers like me everywhere to push software to become the best it can be, so we can meet the challenges that businesses face in 2017. When I compare it to where we were with software products before the Cloud was brought into the mix, we’re now in a completely different world. Without Cloud The software landscape pre-Cloud didn’t seem terrible at the time, but looking back, you now realise how slow, clunky and frustrating the experience was. Let’s take product information management (PIM) systems, which I’m most familiar with. PIM is something that retailers and distributors know about all too well; having the software allows you to manage every one of your products across your organisation’s platforms and ensure that the product information is consistent across the board – including on your ecommerce websites, in print and on social media. The old-school PIM systems that aren’t cloud-based seemed fine in the past, but on reflection they often featured a ‘bodge job’ mentality – with the wrong methods having to be used to get around the difficulties of managing a variety of different data types. The most commonly seen ‘bodge jobs’ have included: ‘Single-Table Inheritance’, ‘Separate Attribute Tables’, ‘Multi-Columned Tables’ and ‘XML in Single Column’. The problem, essentially, is that by using any of these methods, you end up with a system where the database is not working as hard as it should, leading to a bottleneck effect and limited web serving capacity as a result. Enter Cloud The key benefits we always circle back to with cloud-based SaaS products are, of course, speed and scalability.
When it comes to PIM products specifically, those hosted in the Cloud are able to store large volumes of data using a cluster of specialist databases – instead of making do with an old, clunky server stuck at the back of your office. [easy-tweet tweet="Improved efficiencies and abilities to update itself are a selling point for the Cloud" hashtags="Cloud, Storage"] Another huge advantage of Cloud is the ability to maintain availability during any sort of power outage or planned period of maintenance, for instance. But moving systems over to the Cloud is about far more than improving internal processes and speed; it is a crucial step in helping businesses to remain competitive. Improved efficiency and the ability to update and heal itself have proved to be a huge selling point for the Cloud as we set people up with their new PIM service. Since they don’t need to worry about creating another dead-end legacy system to ‘bodge’ in another few years’ time, there are far fewer barriers in the way of jumping on board. The Cloud also aligns itself perfectly to the types of companies who are using PIM. Anyone dealing with the fast-fashion market, for instance, has speed at the top of their priority list. Getting products to market and online quickly is a differentiator in a market not just fast by name, but also by nature. Offering PIM through the Cloud helps us to provide a quick and reliable piece of software that doesn’t hold up retailers or their suppliers. Ultimately, you’re also more likely to invest in the latest software if it’s easier and cheaper to do so. We no longer have to rely on companies being of such a size that they can afford a dedicated space for servers in their office – or the dedicated manpower to manage it! This has opened up new markets to SaaS providers. By catching growing businesses at the beginning of their journey, SaaS providers can help them to scale as they grow, rather than waiting for them to secure investment for their serious, long-term data infrastructure. So, in conclusion, Cloud: I love you for helping us to step it up a gear, for helping us to reach more customers and for improving our software for everyone. ### How Do You Overcome and Fight Back Against Cyber Attacks How do you overcome a challenge like cyber security? It’s a question that continues to confuse and frustrate businesses of all shapes and sizes, and is more pertinent today than ever before. Almost every day we seem to read stories of those who have suffered the consequences of not having sufficient measures in place to protect themselves – you don’t need to cast your mind back far to remember what happened to the payday loan company Wonga. As cyber attacks continue to grow in both their rate and their sophistication, businesses are left with no option but to sit up and pay attention to what has become a serious issue. It wasn’t too long ago that businesses thought they could get away with taking a reactive approach – many simply would not consider the threat of cyber attacks until they were faced with one, at which point they would try (and often fail) to deal with it effectively. But businesses cannot afford to do this anymore. Instead, they need to take a proactive approach that includes preparations and measures to mitigate risks and protect all valuable data and assets. However, depending on the size and nature of the business, adopting this proactive stance is often easier said than done.
The biggest issue for most businesses is convincing the senior leadership teams that cyber security is something worth actively investing in. Originally, back when businesses were fighting online attacks reactively, cyber security used to be exclusively dealt with by IT departments. But the issue is much larger than it used to be and now requires company-wide participation. What’s more, even with the backing of senior management, it can be hard for businesses to know where to begin in their cyber security journey. The solution to effectively fight back against cyber attacks is through the implementation of a cyber security strategy. This is a comprehensive set of best practices that covers every eventuality and is distributed to all employees, raising awareness of the issue and making sure everyone knows exactly what to do in the event of an attack. However, these are not simply copy and paste jobs: each strategy needs to be tailored to the specific nature and requirements of each business. If daily responsibilities regularly include the handling of personal data, for example, then the strategy should outline any additional risks this poses and suggest effective ways of mitigation. When an attack does take place, the strategy should be able to help define the attack, identify what’s been affected and help make the recovery process much smoother. It shouldn’t just be left on a shelf to be forgotten about, either; it should be regularly referred to and updated to ensure everyone is on the same page. When devising these cyber security strategies, there are several factors that should also be considered to help make them more effective. Firstly, businesses should think about how many employees are regularly working from home or outside of the office and how this could present security risks. These individuals might be working with confidential data while sat in a public space, or they might be saving documents directly to their devices without backing them up or storing on an internal company network. This means that if the data is compromised, it’s gone forever. [easy-tweet tweet="A comprehensive backup routine will prevent any serious cyber attack repercussions" hashtags="Backup,Cyberattack"] In fact, businesses need to check that they are regularly backing up all areas of the network that are used to save or edit work. While it’s easy to lighten the workload by only backing up the folders that are used most frequently, a comprehensive backup routine will prevent any serious cyber attack repercussions later down the line. One small step that can make a huge difference is introducing a password policy. We’ve all been warned of the dangers of weak passwords before, but it’s something that continues to cause trouble for businesses, especially when you consider how many people use the same password for multiple accounts. Making sure all employees are using strong passwords should not be taken lightly, and this applies to any device that is connected to the internet, including phones, printers, IoT (Internet of Things) devices and more. Finally, all businesses should identify any vulnerabilities that exist within their IT infrastructure. Many companies are still using old, outdated technology as part of their IT systems, and much of this can simply be exploited by cyber criminals looking for an easy target. As a result, this technology should be updated wherever possible to maximise defences, or failing that there should be extra preventative measures in place to deter any hopeful attackers. 
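As a small, hedged illustration of the password-policy point made above, the snippet below applies a few common-sense rules (minimum length, mixed character classes, not on a known-breached list) to candidate passwords. The thresholds and the stand-in breached-password list are illustrative choices for this sketch, not a statement of what any particular standard requires.

```python
# Illustrative password-policy check: the thresholds and the idea of a local
# "known breached passwords" list are example choices, not a formal standard.
import string

MIN_LENGTH = 12
BREACHED = {"password", "123456", "qwerty123", "letmein"}  # stand-in for a real breach list

def check_password(password: str) -> list[str]:
    """Return the policy rules this password fails; an empty list means it passes."""
    failures = []
    if len(password) < MIN_LENGTH:
        failures.append(f"shorter than {MIN_LENGTH} characters")
    if not any(c.islower() for c in password) or not any(c.isupper() for c in password):
        failures.append("must mix upper- and lower-case letters")
    if not any(c.isdigit() for c in password):
        failures.append("must contain at least one digit")
    if not any(c in string.punctuation for c in password):
        failures.append("must contain at least one symbol")
    if password.lower() in BREACHED:
        failures.append("appears on the known-breached list")
    return failures

if __name__ == "__main__":
    for candidate in ["letmein", "Spring2017!", "correct-Horse-battery-7staple"]:
        problems = check_password(candidate) or ["ok"]
        print(candidate, "->", "; ".join(problems))
```

Embedding a check like this at the point where passwords are set is one simple way to turn a written policy into something employees cannot quietly ignore.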
While it’s ultimately difficult to completely stop cyber criminals from trying to wreak havoc, it is completely within the power of businesses to reduce the overall impact of the cyber criminals’ actions through proactive efforts. ### Gartner Says Four Vectors Are Transforming the Security Software Market Analysts to Discuss Latest Security Market Trends at Gartner's 2017 Security and Risk Management Summits The security software market is undergoing a dramatic transformation due to four key developments, according to Gartner, Inc. The use of advanced analytics, expanded ecosystems, adoption of software as a service (SaaS) and managed services, and the prospect of punitive regulations are causing enterprises to rethink their security and risk management software requirements and investments. "The overall security market is undergoing a period of disruption due to the rapid transition to cloud-based digital business and technology models that are changing how risk and security functions deliver value in an organisation," said Deborah Kish, principal research analyst at Gartner. "At the same time, the threat landscape and rise in the number of high-impact security incidents are also creating demand for security technologies and innovations that deliver greater effectiveness." Four vectors are transforming the security software market: [easy-tweet tweet="SaaS for security and risk management is becoming critical " hashtags="SaaS, Security"] By 2020, Advanced Security Analytics Will Be Embedded in at Least 75 Per Cent of Security Products Enterprises are increasingly seeking products that incorporate "smarter" predictive and prescriptive analytic technologies, which help warn users of potential security incidents and provide guidance on optimal responses. These more-advanced analytical capabilities are driven by a variety of underlying technologies, such as heuristics, artificial intelligence/machine learning and other techniques. Successful vendors will work with customers and prospects to understand use cases where analytics will deliver significant value and augment limited security staff and resources. Acquiring and Integrating Products and Technologies Will Be a Critical Strategy to Increase Market Share and Enter New Markets Given the preponderance of startups and smaller vendors pursuing innovative approaches to security problems, acquisition, integration and consolidation are highly effective strategies to increase market share and enter completely new markets. In many cases, mature vendors in search of continued growth are acquiring faster-growing companies from emerging adjacent markets. In other cases, vendors are optimising profits by consolidating similar products under a single brand, therefore leveraging economies of scale by combining core functions, such as development, support, sales and marketing. End Users' Quest for Flexibility Will Increase Adoption of SaaS Security buyers are making security product investment decisions that support digital business, fit their current challenges and deliver performance value. Gartner's recent end-user security spending survey indicates that, to do this, they have a preference for products in an as-a-service format. SaaS for security and risk management is becoming critical as customers transition to digital business practices. However, providers must consider the financial implications of maintaining support for legacy security products while investing in an as-a-service product or managed service. 
The Regulatory Environment Will Create Opportunities for Security Software Providers The EU General Data Protection Regulation will come into effect on 25th May 2018 and could see organisations facing heavy fines should they receive a single complaint about mishandling private data. Punitive regulations will create board-level fears, driving security software budget decisions based on the potential financial impact of fines and noncompliance. Consequently, organisations will look to providers with products that provide the needed visibility and control of their data. Providers should identify the key regulatory requirements and constraints in target geographies by working with legal counsel to deliver product and service choices that will alleviate board-level fears. Gartner clients can learn more in the report: "Market Opportunity Map: Security and Risk Management Software, Worldwide."
### Service Provider Evolution in a Hyper-Scale Cloud World
We work in the Service Provider business at VMware and often field questions about what's going on in the marketplace, from both a commercial and a technical perspective. In response to these frequently discussed topics, we decided to share our personal views and, hopefully, further an open dialogue about the direction the market is taking and how Service Providers can overcome the challenges they face. In this blog, we give you some insight into the origin of many of the issues Service Providers face today; in subsequent blogs we will expand on some of the points raised. We hope you enjoy the blog and thank you for reading. In recent years, we have heard new phrases used by vendors, analysts and technology evangelists, such as "the always-on culture", "digital transformation" and "digital strategies". However, when you strip it back, the common denominator is the end user – basically, you and me. Since the advent of PCs in the '70s and '80s, and online marketplaces like iTunes, eBay and Amazon creating new business models, customer expectations of the user experience have become ever more demanding. New application delivery mechanisms have had to be created for an ever-expanding list of diverse end user devices, and networks have had to scale and deliver increasingly higher bandwidth. This has also had the effect of moving IT ever closer to the end user. Consider email: believe it or not, there was a time before email, but now pretty much everyone has at least one email account, often more. Now imagine the email system not working and the impact on your business and personal life – or, even worse, imagine losing your smartphone! Yikes! This has meant that traditional business and consumer applications have had to be reformatted, or replacements created entirely from scratch, to be delivered online to a range of devices. This is all driven by market forces – again, you and me. In turn, this has applied significant pressure on traditional enterprise IT departments as they support their organisations in meeting end user demand and business expectations. These challenges are exacerbated by IT having to manage ever-shrinking budgets, possibly dwindling specialist skills, the requirement for new skills and tools – the list goes on. Enterprises have therefore had to find ways to overcome the challenges presented by this increasing demand, and consequently passing the problem to someone else to manage has become highly attractive, particularly if that other entity can save you money in the process.
Hence the Service Provider market. Obviously, not all enterprises have adopted an outsourced model; many have instead developed their own Cloud capability, whether private, Hyper-Scale Cloud or a combination. To keep up with user expectations, enterprises have had to develop ways of creating services with a consistent user experience, delivered at something approaching the speed of light. Gone are the days when users would tolerate pushing a button and waiting more than a few seconds for a response. How many times have you tried to view a video via social media and simply moved on when it doesn't play within a few seconds? Many enterprises have looked to solve these issues by adopting a software stack that not only provisions an environment in record speed but also manages the network, compute and storage components with a vastly reduced number of element managers, and therefore delivers a far more consistent quality of service in a shorter timescale. This has meant that, when bidding for business, Service Providers have had to offer capabilities that either match or, more likely, exceed the current capability of the enterprise. The reality is that in the past Service Providers have been able to get away with offering little more than a multi-tenanted infrastructure at a competitive cost. That was sufficient to convince enterprises to move their workloads from on-premise hosting onto a shared platform in the Service Provider's data centre, where the Service Provider handles the management of connectivity and the infrastructure, but very little else in terms of the overall value-add technology services that businesses operate to generate revenue. The Service Providers of today can no longer afford such a limited approach, and certainly not one based on an operational model where onboarding and day-two operational processes rely on manual configuration and intervention. The days of taking weeks to months to deploy or update basic virtual machine provisioning services are no longer acceptable in a Hyper-Scale Cloud world where a credit card and a few buttons clicked by the end customer can have a range of sophisticated services deployed in minutes. Ironically, due to the scale of workloads now under management, many Service Providers are facing the same issues that their customers have traditionally faced: how to drive efficiency at the network, compute and storage layers and provide a consistent quality of service in a timely fashion, with minimal manual intervention. If this wasn't enough, the demand for value from the enterprise has, if anything, increased with the advent of Hyper-Scale Cloud pricing models. Service Providers therefore have to evaluate where their value is derived from: is it hosting, managed hosting, private dedicated cloud, multi-cloud management or something else? The Managed Services Value Chain (figure omitted). Among the more progressive Service Providers, a new approach is emerging.
For these Service Providers, the delivery of a virtualised physical infrastructure is no longer seen as a key component of their offering and is instead regarded as a burden on their balance sheet. This is leading them to focus their efforts on developing multi-cloud management capabilities with a view to becoming a Cloud Services Broker: offering a single SLA for their enterprise customers across multiple Cloud endpoints, where the choice of those endpoints reflects the optimum cost and service characteristics for each part of that customer's service. For a Service Provider to offer the services and capabilities of a Cloud Broker, it will need to transform radically the way in which it architects, costs, delivers and manages services. Gone are the days where a customer's bespoke IT environment could be adopted in its entirety by the Service Provider and managed on a bespoke basis. Successful Service Providers in the emerging Cloud Broker segment have several characteristics in common. Firstly, they have existing experience, skills and toolsets for managing workloads in a variety of Clouds, private or public. Secondly, Cloud Brokers offer good connectivity into the Hyper-Scale Clouds and can often offset or minimise the impact of Hyper-Scale Cloud bandwidth costs on their customers. Thirdly, Cloud Brokers have typically developed or acquired an overarching orchestration framework that allows them to tie together the assessment, provisioning, management and day-two operations of deployed services across multiple platforms. Fourthly, for visibility into monitoring, alerting and SLA adherence, Cloud Brokers may use individual platform-specific element managers to access pertinent data, but in tandem leverage the APIs from those individual solutions to aggregate information up into a common service status view across multiple endpoints. This avoids a new set of technology silos emerging in how they run their services and ensures that those supporting the end customer have a view of the entire environment from a single pane of glass. Finally, most Cloud Brokers are deploying a sophisticated set of networking and security technologies that enable them to on-board customers onto any of the supported technology endpoints and then facilitate communication between distributed applications and services while preserving the desired levels of security and compliance. This last attribute is challenging to deliver, as it requires a good understanding of the various technologies and architectures deployed by the Hyper-Scale Clouds from a networking and security perspective, as well as a means of facilitating communication – and in some cases mobility – between the different technology endpoints without compromising security. VMware's overarching vision – one Cloud, any application, any device – aligns very well with this emerging Service Provider segment of Cloud broking. To offer Cloud broking services successfully, Service Providers need visibility into a variety of Cloud endpoints and the ability to manage and operationalise the customer workloads and services deployed there.
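To make the "common service status view" idea a little more concrete, here is a small, hypothetical Python sketch of a broker collecting health and SLA figures from several endpoint-specific sources and merging them into one summary; the endpoint names, data fields and collector functions are invented for illustration and are not drawn from any particular platform or VMware product.

```python
from dataclasses import dataclass

@dataclass
class EndpointStatus:
    endpoint: str       # e.g. "private-vsphere", "hyperscale-a"
    service: str        # customer-facing service the workload belongs to
    healthy: bool
    sla_target: float   # committed availability, e.g. 99.95
    sla_actual: float   # measured availability this period

def collect(collectors):
    """Run one collector per endpoint and merge the raw results."""
    statuses = []
    for collector in collectors:
        statuses.extend(collector())
    return statuses

def single_pane_view(statuses):
    """Summarise health and SLA position per service across all endpoints."""
    view = {}
    for s in statuses:
        entry = view.setdefault(s.service, {"healthy": True, "sla_breach": False})
        entry["healthy"] = entry["healthy"] and s.healthy
        entry["sla_breach"] = entry["sla_breach"] or (s.sla_actual < s.sla_target)
    return view

# Hypothetical collectors standing in for per-platform element managers or APIs.
def private_cloud():
    return [EndpointStatus("private-vsphere", "erp", True, 99.95, 99.97)]

def hyperscale_a():
    return [EndpointStatus("hyperscale-a", "webshop", False, 99.90, 99.20)]

if __name__ == "__main__":
    print(single_pane_view(collect([private_cloud, hyperscale_a])))
```

In practice each collector would call a platform's own monitoring API and the merged view would feed a service desk or customer portal, but the shape of the problem – normalise, merge, summarise – is the same.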
VMware has begun to iterate a technology approach that it has termed the Cross Cloud Architecture, which will deliver over time a comprehensive management, orchestration, costing and security platform that could enable Service Providers to offer Cloud Broking services to their end customers across multiple Cloud endpoints. Some of these capabilities exist today: from an enterprise, on-premise perspective, in solutions such as VMware's vRealize Suite for customers looking to leverage multiple Clouds; and from a Service Provider perspective, in VMware's vRealize Automation, Operations and Business for Cloud, which are deployed by the Service Provider from their own data centres. As a growing number of the Cross Cloud services become available via a VMware-hosted SaaS model, the ability of Service Providers of all sizes and maturity to offer a range of compelling Cloud Broking services will continue to expand. VMware is committed to helping its Service Provider ecosystem prepare for the next generation of service offerings by enabling its Partners to offer a variety of sophisticated services across multiple public and Hyper-Scale Cloud endpoints. So, what does this mean? It means you need to decide who and what you are, and future-proof your offering. No-one has a crystal ball, but there are clear indicators as to what our customers are going to need moving forward. You will need an answer to the multi-Cloud environment, particularly when it comes to Hyper-Scale Clouds, irrespective of whether you decide you are an IaaS provider or a Cloud Service Broker. The demands from customers will continue to gather pace, and agility will be key. Many enterprise customers' view of the competition has radically changed: from knowing who the competition is and forming value propositions to address the challenge, to worrying about competitors with new business models that they cannot see coming. As a key enabler to the enterprise, you will need to be able to provide the underpinning services that will enable the enterprise to address these challenges. In our next blog, we will discuss how VMware is helping Service Providers develop strategies to address these issues.
### How we can Move Legacy Applications Safely into the Cloud
There are many reasons organisations may want to move all of their applications to the cloud, from avoiding the need to replace internal IT infrastructure yet again when it reaches the end of its life to – particularly for public sector organisations – moving away from expensive prime contractor or managed service contracts. Moving applications such as your corporate email service to a public cloud SaaS service is straightforward. There are several high-quality services available, plus tools to simplify migration, and prices are acceptable compared to on-premise service provision. The challenge comes in moving applications which may have been originally developed ten or more years ago and on which the business relies. These could be bespoke applications developed in-house, or specialist packaged applications which have been customised to organisational needs and where the software provider's SaaS offering, if available, cannot accept the customisations. Few of the cloud services currently available offer the ability to easily transfer legacy applications and all the associated data onto them.
For example, Autodesk has a Software as a Service (SaaS) offer, but at present, this only runs one version of the software and cannot accept customised applications or third party developed add-ins. Similarly, organisations which have tailored their ERP applications to suit the needs and processes of their organisation may be unable to find an appropriate cloud solution, and the same holds true for local authorities who have highly customised applications for services such as parking management and waste collection. Some application providers are developing their own SaaS strategy, but in many cases, we have seen it is: “‘We'll park and maintain a dedicated version of your software on a public cloud service”, and they will charge you a considerable premium for the privilege of doing so. [easy-tweet tweet="All the cloud provider offers is the hosted VM, it will still need monitoring and management" hashtags="Cloud,Management"] For organisations that find themselves in this situation, there are some options available. One solution is cloud re-platforming onto Infrastructure as a Service (IaaS), where an organisation moves its application, as is or with minor enhancements, to operate from another provider’s infrastructure. This enables organisations to free themselves from having to own, operate and manage the infrastructure on which the application is hosted and the associated day-to-day responsibilities and costs of running it, while still maintaining the existing licence and support with the application provider. However, with the public cloud, all that the cloud provider offers is the hosted VM (virtual machine); you will still be responsible for providing patching, resilience, back-up, security and application support and maintenance inside the instance (see Table 1). The service will also still need monitoring and management.  In my view, the best answer is Managed IaaS, offered by Fordway and a wide range of other suppliers through G-Cloud and commercial routes. A second option is A Platform as a Service (PaaS), where the cloud provider provides a secured and patched base application, such as a database or development environment, onto which the organisation installs and manages its own tailored application or code. Again, the organisation has to retain responsibility for maintaining the application itself, and most legacy applications will need redeveloping to work on publicly available PaaS services as these are based on the current versions; there are not many SQL Server 2008, Informix or Progress DB PaaS services.  If redevelopment or migration to new platforms does not make business sense, moving to a suitable IaaS is pretty much the only option. 
Table 1: What is provided with different cloud options

| | Service provider security responsibilities | Customer security responsibilities |
| --- | --- | --- |
| IaaS | Control access to the hosted instance; good general security up to and including host and hypervisor patching and proactive infrastructure security monitoring | Securing access to the instance(s) and everything inside them, plus security of integration between instances – or contracting the provider or another third party to do it for you |
| PaaS | All the above, plus OS and platform patching | Access and authentication to the service, plus application and code patching for any service running on the platform |
| SaaS | Overall security of the service, including responsibility for securing any client data hosted within the service | Authentication to the service and data transfer between service providers |

A third option is managed cloud SaaS, in which responsibility for all aspects of the application is transferred to a third-party provider or partner. The provider customises the service to the exact characteristics required, providing a tailored service while enabling the organisation to take advantage of cloud's low costs, scalability and flexibility. The outcome is predictable costs, flexible billing, internally managed service delivery and clear, internally developed, robust SLAs, but this normally requires either significant customisation of the SaaS offering or acceptance by the organisation that it can work without the customisations and enhancements it has previously developed. For organisations running legacy applications, a cloud solution is achievable. They are likely to find themselves using private cloud or managed IaaS as a staging point until more appropriate public cloud services become available. Whichever combination of cloud services is chosen, the organisation still needs to retain responsibility for ensuring that their chosen cloud provider offers and can meet the required SLAs and is suitably financially secure to continue to deliver the required service for many years to come.
### NFON AG receives ‘Growth Excellence Leadership Award’ by Frost & Sullivan
NFON AG, a leading worldwide provider of cloud telephone systems, has been honoured by the consultancy Frost & Sullivan with the ‘2017 European Frost & Sullivan Award for Growth Excellence Leadership’. Growing at above-market rates by implementing a combination of growth diversification, high price/performance value, and compelling customer purchase and ownership experiences, NFON has entrenched itself in the European hosted IP telephony and UCaaS market. Elka Popova, Vice President at Frost & Sullivan, said: “With its strong position in the German market, which has been achieved through an extensive feature set and an outstanding pricing model, NFON now strives to continue this success in the European market.” One of the core elements of NFON’s growth strategy is global expansion to ensure scalability. “We feel truly honoured to be named Growth Excellence Leader. This shows that our expansion strategy is fully paying off. It also encourages us to pursue this strategy across Europe,” Hans Szymanski, CEO of NFON AG, explains. In addition to the German market, it has successfully launched operations in Austria, Croatia, Hungary, the Netherlands, Poland, Romania, Slovakia, Slovenia, Spain, Switzerland, and the UK. It will be launching services in the Portuguese market in the second quarter of 2017, followed by Italy in the third quarter of this year.
The demand for All-IP solutions throughout Europe – particularly among SMBs – is a key factor in NFON’s success: “NFON stands out among European hosted IP telephony and UCaaS providers with its ability to continually grow at above-industry growth rates,” said Frost & Sullivan Vice President Elka Popova. “After NFON’s initial focus on small- and medium-size businesses (SMBs), the innovative provider is also striving to harness increasing demand for cloud communications solutions among larger organisations and is experiencing accelerated growth through greater average deployment size.” In the first quarter of 2017, NFON and T-Systems announced their partnership with the joint product ‘Dynamic Services for Unified Communications NFON Enterprise’, as well as the recent introduction of a fourth distribution channel – Wholesale – and the winning of PIRONET, a CANCOM SE company, as its first partner.
### Royal Warrant for Computer Firm Crowned Britain’s Best Rural Business
A Sussex computer firm named Britain’s best rural business in a prestigious awards ceremony has now been awarded a Royal Warrant by Her Majesty The Queen. Landmark Systems Ltd, from Pulborough, was informed by Buckingham Palace in March that it had received the Royal seal of approval. The award has come hot on the heels of Landmark’s victory at the CLA and Amazon-backed Rural Business Awards, where it was crowned overall Champion of Champions at a glittering ceremony in 2016. Landmark provides accounting and property management software, including the KEYPrime brand, to farms and estates to help them run their businesses effectively. The company has been a supplier to Royal farms and estates for ten years. Landmark’s managing director Nigel Parsons said staff at the company were “hugely honoured” to have been given the warrant, which is in place for the next five years. He said: “I am proud of the public recognition it conveys on our team made up of sales, administration, training, support and development staff who are the company. They work hard to deliver a quality product and service either at Pulborough, in West Sussex, where we are based, or out on farms and estates all over the UK. It is a very special accolade for us as software is not easily assessed and has to be judged on what it delivers over time in a business sense, and not on what it tastes of or looks like! “As an SME, to be appointed to the family of Royal Warrant holders is an unparalleled seal of approval. It comes in Landmark’s thirtieth year of trading and in a year in which we are also the CLA Rural Business Award Champion of Champions – we could not ask for a better way to celebrate.”
### How Happy Is London? Question and Answer
David Fearne, Technical Director at Arrow, talks about the company’s recent social and technological experiment: How Happy Is London? So, first things first, what exactly is How Happy Is London? How Happy Is London? (HHIL) is a demonstration of how the intelligence and insight uncovered by large-scale data analytics can lead to better business decision-making. The project shows what can happen when you take billions of pieces of data from unconnected sources in the public domain, integrate them, analyse them and transform the results to give meaningful answers.
In this case, we wanted to answer the question: How Happy is London? Everyone knows London and everyone understands the concept of happiness - so, choosing to measure the happiness of London is a topic that everyone can relate to. The final output is an up to the minute picture of the city’s mood – which is refreshed from new data every 60 seconds. The data is represented online as a series of images and a ‘Happiness Indicator’ ranking, to show where the present mood in London is - between ‘Business as usual’ and ‘On top of the world’ at any point in time. How does it work? The system is run in line with an architectural model for manipulating, storing and exploring data. For instance, we have a layer for ingestion, which takes all the data from its sources and brings into an ETL (Extract, Transform, Load) phase - which enriches the data. Then computation and algorithm happen. The data is then stored in structured and unstructured data platforms and presented to the various users via a combination of Apache NiFi and API connectivity. We explore and visualise the data using the website, an installation video screen in our London office and a live twitter feed. And finally, through an API that lets people interact with the data and utilise it in their own applications. What is the data volume being processed? We process 2.6 billion data points every single day for three core areas – social, transport and weather. For weather, we pull about 36,000 lines every minute. For traffic on London’s roads, every single time we query the API it’s 180,000 lines of data. We calculate every single section of traffic to understand it, every longitude and latitude coordinate - and if a road bends a lot we get a lot of coordinates. What is the methodology behind it? Before we store the data, we refine it and dispose of anything we don’t need. For example, we take the 180,000 lines of road data and refine it down to 3,600 lines; this quantifies if a road is running clear or blocked. How do you ensure data security? In general, we don’t need to because all the data is open data. We don’t encrypt the data as it’s not sensitive, however, we do apply a certain amount of access security. For example, we have key-based authentication and encryption when people access the data remotely. Is the algorithm secret? No, it’s not secret. As far as selling HHIL again the whole architecture is valueless as it’s so custom to how happy London is. However, it’s open to everyone as we want people to adopt it, leverage it and build their own ‘How Happy Is’ for their company. The results are presented as a REST API under an open data license, so any third party can register and take advantage of the Happiness Index for their own applications. Can other companies publish this data? Yes, we consume open data so it seems only right to provide open data. They’d have to mention that the data had come from us somewhere but anyone can use it. Who could find this useful? Anyone who is looking to do something more insightful with their data or their customers’ data - or to become a data-driven organisation. There is an incredibly wide customer base as analytics is going to be a big market over the next two-three years; especially as the competitive edge becomes thinner and thinner, making data-driven decisions essential to gaining a competitive edge. 
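To give a feel for what that refinement step might look like, here is a deliberately simplified, hypothetical Python sketch that collapses per-segment traffic readings into a single clear/blocked flag per road; the record layout, speed-ratio threshold and road names are assumptions made for illustration, not the project's actual pipeline or data feed.

```python
from collections import defaultdict

# Each assumed raw record: (road_name, segment_id, current_speed, free_flow_speed).
raw_segments = [
    ("A40", 1, 12.0, 48.0),
    ("A40", 2, 45.0, 50.0),
    ("A406", 1, 30.0, 35.0),
]

def refine(segments, blocked_ratio=0.5):
    """Collapse many per-segment readings into one clear/blocked flag per road."""
    by_road = defaultdict(list)
    for road, _segment, speed, free_flow in segments:
        by_road[road].append(speed / free_flow if free_flow else 0.0)
    return {
        road: "blocked" if min(ratios) < blocked_ratio else "clear"
        for road, ratios in by_road.items()
    }

if __name__ == "__main__":
    print(refine(raw_segments))  # {'A40': 'blocked', 'A406': 'clear'}
```

The real system enriches and weights the data far more carefully, but the principle is the same: thousands of raw readings are reduced to a handful of rows that directly answer a question.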
[easy-tweet tweet="HHIL is the simplest way of using analytics; taking numerous data points and combining it together" hashtags="Data, Technology"] Will the project continue and evolve or will it be switched off? HHIL will grow over time. The whole point of the project is to help customers who say “I have no idea what analytics is” and use this as their first foot on the ladder. This is the simplest possible way of using analytics; taking numerous data points and combining it together to understand how well your business is doing. Phase two of the project is going to be about predictive analytics. Based on what we know happened this time last year so we can see what might happen next year. We will be able to predict how happy London is tomorrow. This fits with the natural progression of the adoption of analytics in the industry; as our partners are adopting more complex technologies their end customers are becoming more confident and more reliant on the technology. We want to build HHIL to accommodate the growing interest in more complex analytics. Phase three will be focused on machine learning and artificial intelligence. We know how happy London is today and have predicted how happy London will be tomorrow – but how can we change the outcome of the happiness of London? In a business context, you know how well your business is doing today, we can predict how well it’s doing tomorrow - but what if we can suggest ways to increase the success of your business? It’s scary stuff but very doable using machine learning algorithms and artificial intelligence - and is the ultimate goal of the How Happy Is London? project. Any more future plans? Watch this space for other How Happy is...? projects in major cities across the globe.   ### There is Now a Need to Get Ourselves Back Up To Speed Backing up data is incredibly important and something that is relevant to the majority of businesses, especially those industries which rely heavily on data, such as finance, retail, healthcare, public sector and much more. We have all been reminded of the importance of saving our work as we go – or else risk suddenly finding ourselves losing valuable information (and time). Think of that important PowerPoint presentation you had worked on for hours before your PC crashed - and you realised you had not saved your progress. Backup is exactly the same, and often companies forget this until the worst happens. Given how crucial backups are in an economy where data is the new gold, it is alarming that the technology and infrastructure which many businesses use to support reliable and secure backups is out of date, which poses significant risks to business continuity. To contextualise this, in the UK alone, the business cost of data loss is about £10.5 billion a year (as of 2014), and with more data produced every year, this cost is only going to go up. Traditionally, backups have been done to tape – which, while cheap, is less efficient and reliable than modern storage technologies – and runs the risk of degrading over time. The often-quoted lifespan for LTO tape is approximately 30 years. Backing up to the disk and to the cloud is the modern way. Supportive infrastructure like networking capabilities means that it now makes more sense in terms of cost, reliability, efficiency and scalability. 
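The cost argument is easiest to see with a rough back-of-the-envelope comparison. The Python sketch below contrasts a five-year up-front hardware purchase with a pay-per-use cloud backup model; every figure in it is an invented assumption for illustration only, not a quoted price from this article or any vendor.

```python
# All figures are illustrative assumptions; real pricing varies widely with
# capacity, retention policy, data egress and support arrangements.
YEARS = 5
TB_PROTECTED = 50

# Up-front (CAPEX) model: buy and run on-premise backup hardware.
hardware_purchase = 60_000      # appliance, media, installation
annual_running = 8_000          # support contract, power, admin time
capex_total = hardware_purchase + annual_running * YEARS

# Pay-per-use (OPEX) model: cloud backup billed per TB per month.
price_per_tb_month = 20
opex_total = TB_PROTECTED * price_per_tb_month * 12 * YEARS

print(f"Five-year on-premise estimate: £{capex_total:,}")
print(f"Five-year cloud estimate:      £{opex_total:,}")
```

Whichever way such a comparison comes out for a given organisation, the point is that it can be made explicitly, and revisited as volumes grow, rather than assumed.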
[easy-tweet tweet="Realising the benefits of OPEX models than CAPEX means businesses get a better ROI" hashtags="ROI, OPEX"] We have reached a point in our digitalisation journey where IT solutions such as backup cannot be made primarily on the basis of picking the “cheapest” option that will do the job. It can be easy to just look at the CAPEX associated with storage infrastructure, particularly in backup, but a number of factors have changed in the storage landscape. It is increasingly important to examine the OPEX. The cloud, for example, can have a lower running cost over time, and a far greater lifespan. Realising the benefits of OPEX models rather than CAPEX also means that businesses get a better return on investment and pay for the services that they actually use rather than an upfront fee which is often wasted. However, from an IT department perspective, these services can create what is known as “shadow IT” – systems set up away from the IT function. This is an important consideration and does not necessarily need to be the case. It is vital to consolidate these services and manage all services in the same way so that they can be backed up efficiently and easily. Without consolidation, they may not be backed up at all, creating huge potential risks. This process might not happen otherwise, as those systems are not part of the organisation’s existing infrastructure and are, therefore, not managed in the same way and possibly not backed up at all. This is where a universal data management solution, spanning your entire storage environment (including backups) becomes so important, holistically unifying your data landscape, and helping your business become flexible, scalable and cost-efficient. In summary, backup to the cloud and to disk is more cost-effective, more reliable and faster than the traditional methods. NetApp’s Auto Support software solutions mean that businesses that are licensed to use services such as SnapMirror and SnapVault, but are not actively using them and realising the benefits of them, can be identified. This means that NetApp can offer them services such as Backup as a Service (BaaS), AltaVault and other software solutions which can maximise their investment and improve their IT. This enables both NetApp and its partners to proactively help customers manage their IT provision; update, upgrade and improve their infrastructure; and reap the benefits of a more cost-effective solution. NetApp’s BaaS solution is now being rolled out to customers by its trusted partners Daisy and Node4. ### Technology and the Law Ep 2: Cloud Contracts ### Where Do I Start with General Data Protection Regulation? The European General Data Protection Regulation (GDPR) will become effective on May 25, 2018, and require organisations to identify and protect sensitive personal data of EU Citizens. The new regulations mean that any organisation, big or small, will need to comply with new rules regarding the collection, storage and usage of personal information regarding EU citizens. But reports have suggested that many IT security professionals are either not preparing or are unaware of any changes that need to be made to their business processes to ensure compliance. It’s crucial that organisations do not stick their heads in the sand regarding the new regulations, or believe that the rules do not apply to them without fully understanding them. After all, those that do not comply with the new regulations face potentially severe penalties. So the question is, where do you start? 
Well, consider the objectives of the GDPR. They are to 1) give citizens and residents back control of their personal data and 2) simplify the regulatory environment for international business by unifying the regulation within the EU. Another key point to understand early on is that although the UK has voted to leave the EU, UK businesses will still have to comply with the new regulations if the data they handle is about EU citizens, or has the potential to identify individuals within the EU. What's more, digital minister Matt Hancock has confirmed that the UK will replace the 1998 Data Protection Act (DPA) with legislation that mirrors the GDPR post-Brexit. The key stipulations of GDPR are as follows. Firms with over 250 employees must employ a Data Protection Officer (DPO). This person is responsible for ensuring that a business collects and secures personal data responsibly. The requirement to appoint a DPO will also apply to small businesses employing fewer than 250 staff if the processing carried out is likely to result in a risk to the rights and freedoms of data subjects, the processing is not occasional, or the processing includes special categories of data as defined in GDPR Article 9. Breaches in data security must be reported immediately to data protection authorities such as the Information Commissioner's Office (ICO) in the UK. Ideally, breaches should be reported within 24 hours if possible, but at least within 72 hours. Individuals have more rights dictating how businesses use their personal data. In particular, individuals have a 'right of access', a 'right to data portability' and a 'right to be forgotten'. Failure to comply with the GDPR will lead to heavier punishments than ever before. Under current rules, the UK's Information Commissioner's Office (ICO) can fine up to £500,000 for malpractice, but the GDPR provides for a fine of up to €20 million or 4% of annual turnover (whichever is higher). What's more, individuals can sue a business for compensation to recover both material damage and non-material damage, like distress. So what are the steps your business should be taking? I believe that there are three steps to getting a business ready for the GDPR. The first: data management begins with discovery. Before you can implement any processes regarding the treatment of data, and requests for data under GDPR legislation, you must find the relevant data within your organisation. The advice from the UK's ICO and other national authorities concurs with this approach, citing "identifying what data you hold" as a key step. Given how rapidly data is collected, created and stored by organisations, it would be impossible to find this out manually. What is correct at the beginning of this year could be wildly different in six months' time. Moreover, attempting this manually will result in a catalogue of where people think data is held and processed (usually the systems designed to hold the data, like a CRM system) rather than where data is actually held (such as in a spreadsheet extracted from the CRM system to run a regular report). This task of creating a data inventory does not need to be arduous: by using Big Data and Machine Learning principles as part of an eDiscovery and data mapping process, you can rapidly find and categorise data, and do so on an ongoing basis – ensuring continual compliance for your business rather than compliance at a single point in time.
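To give a flavour of what this automated discovery step can look like at its very simplest, here is a hypothetical Python sketch that walks a shared drive and counts things that look like personal data; the regular expressions, file types and directory name are assumptions made for illustration, and a real eDiscovery or data-mapping tool would use far richer detection than this.

```python
import re
from pathlib import Path

# Deliberately rough patterns for data that *may* be personal.
PATTERNS = {
    "email": re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+"),
    "uk_phone": re.compile(r"(?:\+44\s?|0)\d{4}\s?\d{6}"),
}

def build_inventory(root: str):
    """Scan plain-text files under `root` and record possible personal-data hits."""
    inventory = {}
    for path in Path(root).rglob("*.txt"):   # real tools handle many more formats
        text = path.read_text(errors="ignore")
        hits = {name: len(rx.findall(text)) for name, rx in PATTERNS.items()}
        if any(hits.values()):
            inventory[str(path)] = hits
    return inventory

if __name__ == "__main__":
    for file_path, hits in build_inventory("./shared-drive").items():
        print(file_path, hits)
```

Even a crude scan like this tends to surface personal data in places nobody expected, which is exactly the point the next step builds on.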
The added benefit of a digital discovery process is that you can also uncover the unknown data resident in your organisation. Classification Once you’ve found your data, you need to be able to classify it. Not only for corporate governance but also for the GDPR which distinguishes between Personal Data and Sensitive Personal Data. To make sure that your classification is applied consistently, it shouldn’t be left to people to try to remember, or a lengthy guidebook. Here, Machine Learning and Big Data make sure that nothing is left to chance and that every data point is classified as it should be every single time. Implement Relevant Processes Once you have identified and classified your data you have a robust platform upon which to implement your processes. This third step is where you can apply the skills of your people and any consulting teams that you engage to do the following: Decide which processes are required – this may include: De-duplication Handling of requests for information Handling requests for deletion of data Managing interactions with third-parties and assessing their compliance status Communication of the GDPR and what it means, throughout your organisation Decide which processes can be automated, and which need to be handled by people. These are the first steps in what will be an on-going process. But I believe that these steps are crucial for any organisation that wants to get it right first time. After all, understanding the type of data that will be affected under the GPDR is one thing, but having to search for where that data is held and who is responsible for it is another issue entirely and, unfortunately, without the right tools I can see many organisations running into trouble. In a perfect world, all data would be stored securely, and processes would be in place to ensure personal data is kept separately under a security framework. But in my experience, that’s just not the reality. Across the organisations we have worked with, there is an average of 10GB of unstructured data per employee, and 9% of that data contains personally identifiable information. So don’t be caught out when the GDPR comes into force next year. Get a grip on your data now and understand what the new rules mean for your business.   ### Awareness and risks of stored information - Voices of GDPR - Alfresco's George Parapadakis George explains about Alfresco and how they have grown from a content management company to digital business platform. George moves on to speak about how Alfresco are helping individuals to become GDPR by creating the capabilities, protecting information, identifying it and managing the processes around this information. Furthermore, George speaks about two different areas around GDPR depth within a company, customer data and HR/Employee records. Voices of GDPR Visit Alfresco website to find out more: Alfresco.com ### Qualys tools for GDPR compliance - Voices of GDPR - Qualys's Darron Gibbard In this episode of Voices of GDPR, Compare the Cloud speaks to Darron Gibbard, Managing Director, EMEA North region for Qualys. Darron explains how Qualys helps with GDPR compliance through asset discovery and vulnerability management tools. These tools are used to discover the assets and discover what assets are on the network. He explains that individuals can’t be compliant with GDPR without understanding how many assets you have and where the data is stored. Once this is found out, the vulnerability management comes into place and discovers what is at risk. 
Furthermore, Darron speaks about challenges and success stories Qualys have been through with their clients. Voices of GDPR Visit Qualys website to find out more: https://www.qualys.com/ ### Challenges of GDPR - Voices of GDPR - DataStax’s Martin James In this episode of Voices of GDPR, Compare the Cloud speaks to Martin James, the Regional Vice President for DataStax. Martin explains how DataStax is a database company that provide database technology that works within 5 key aspects. He goes on to explain two key areas; data security and graph capability, that DataStax do to make people GDPR compliant. Furthermore, he explains the challenges that clients might face. Voices of GDPR Visit DataStax website to find out more: https://www.datastax.com/ ### New Model: ‘Secure GDPR’ - Voices of GDPR - KMD Neupart UK's Harshini Carey In this episode of Voices of GDPR, Compare the Cloud speaks to Harshini Carey, the Regional Director for KMD Neupart UK. Harshini explains how Neupart is an information security management company and how they work with the vast amount of clients from the NHS. Harshini further goes on to explain how Neupart has a new model called ‘Secure GDPR’ and how it compares to other vendors. Voices of GDPR Visit  KMD Neupart UK website to find out more: http://www.neupart.com/ ### When Cloud Skills are the Bare Minimum Whether to manage scalability or encourage collaboration, cloud skills are in demand across businesses and industries. In fact, the Cloud Industry Forum reported that cloud adoption across UK businesses had reached 88%, with over two-thirds of survey respondents expecting this to rise in the next year. Interestingly, we see this trend reflected in our own customer base. Roughly 80% of our clients, which comprise of fast-growing technology, media & telecommunications and e-commerce businesses, now operate entirely in, or are moving to, the cloud. These businesses, therefore, have a critical need for skilled developers, project managers and even members of the senior management team who are proficient with cloud and can help them meet their business objectives. For most of our clients, being able to ‘talk cloud’ is the bare minimum, with many businesses today requiring their employees to be au fait with cloud-based services. As company’s cloud strategies continuing to evolve, we’re seeing three general trends emerging amongst our clients in terms of the skills they are looking for in candidates: Developers who can see cloud software through to deployment – gone are the days of just writing code and throwing it over your shoulder for the Testers to deal with. Instead, being able to understand the impact your code is having on the infrastructure and being able to monitor and support the software once deployed, is quickly becoming a critical skill for Full-Stack Engineers. Many companies see candidates that can do this as the ‘full cloud package’, and as a result, prioritise them in the search for talent. [easy-tweet tweet="High availability experience is becoming gold dust for businesses that regularly experience large amounts of web traffic" hashtags="Webtraffic, Cloud"] Candidates with high availability (HA) experience – in the age of data, any internet business needs to be able to scale easily to deal with sudden colossal volumes of traffic. High availability experience is fast becoming gold dust for any business that regularly experiences large amounts of web traffic and this demand is only expected to keep growing. For some of our clients, HA experience is vital. 
Currently, all our contractors working at one of our biggest clients have at least some HA experience, as the company can often generate more web traffic than Amazon. This skill needs to be top of the IT agenda. People who possess the 'softer skills' needed to get the job done – from our experience, having cloud engineering skills is only half of the story. Non-technical skills are just as important, and our clients are searching for candidates who can work collaboratively within a team. They need people who are engaging and can work both within their team and with the wider business, which is critical for change facilitation. The stereotype of the unsociable IT worker slumped at their desk with headphones in all day is thankfully dying a death. Driving technological innovation is such a core business priority that, in most cases, people from across the wider business will work with their IT team on numerous projects – whether that's alongside the marketing team to help update the branding of the company website, or with HR to develop and grow the internal HR system. To this end, working across the business with many different personalities, and with varying levels of understanding of IT processes and systems, requires skills in collaboration, creativity and understanding to see projects through. Companies simply want digital teams that fit culturally with the business, and for many of our clients, having a good employer brand is critical to attracting and retaining talent. They want people coming in who match their values and ambition and who can eventually become Brand Ambassadors. The industry is moving fast and these trends point towards companies looking for skills that help them meet ever-changing demands. Whether it's searching for the harder skills needed to roll out new innovations easily, or the personal skills to make sure projects run smoothly, recruiting the right people is critical for ambitious businesses to remain competitive this year and beyond.
### Integrating people's systems - #Cloudtalks with W3Partnership's Graham Woods
In this episode of #CloudTalks, Compare the Cloud speaks with Graham Woods, the Managing Director of W3Partnership. Graham talks about how W3Partnership is a specialist integration consultancy, offering expertise in integrating people's systems together. Graham also talks about where he sees the industry going. #Cloudtalks Visit W3Partnership's website to find out more: http://www.w3partnership.com/
### Focussing on all sectors - Voices of GDPR - Collibra's Olivier Van Hoof
In this episode of Voices of GDPR, Compare the Cloud speaks to Olivier Van Hoof, the Pre-Sales Manager UK for Collibra. Olivier talks about being focused on Data Governance and how Collibra focuses on all sectors. Olivier also talks about how Collibra approaches GDPR, and how you need to understand your data. Voices of GDPR Visit Collibra's website to find out more: collibra.com/
### Taking GDPR Seriously - Voices of GDPR - DST Systems' Ruaraidh Thomas
In this episode of Voices of GDPR, Compare the Cloud speaks to Ruaraidh Thomas, Managing Director of DST Systems. Ruaraidh talks about helping clients master complexity and how, regardless of what's coming, GDPR needs to be taken seriously. Voices of GDPR Visit DST Systems' website to find out more: http://www.dstsystems.com/
### Need to Know Before Choosing an IaaS Provider
Before embarking on a cloud journey, make sure these basic principles are clear. Infrastructure as a Service (IaaS) adoption is spreading at a rapid pace.
According to Gartner's findings, the IaaS market has been growing by more than 40 percent per annum since 2011, with revenues expected to reach $34.6 billion in 2017. Looking ahead, the market is again poised to more than double in size and reach $71.6 billion in revenues by 2020. Gartner estimates that by then, IaaS providers will deploy the overwhelming majority of all virtual machines (VMs) as more organisations apply a "cloud-first" strategy. IaaS offers a shared and highly scalable resource pool that can be adjusted on demand. With full control of the allocated computing power and storage, the operating system, middleware and applications, users can commission and decommission IT resources in the blink of an eye, following a pay-per-use model. The user is responsible for sizing, configuring, deploying and maintaining operating systems and applications, taking care of backups and so on, which is both an advantage and a disadvantage. While workloads can be set up in whatever fashion suits individual needs, it does require know-how, time and effort to determine the optimal setup and continuously self-manage the environment. The design can change hour by hour, day by day, week by week or month by month as needed, allowing users to swiftly accommodate changing requirements without having to make new capital expenditures, and without losing any time waiting for new equipment to physically arrive. In his 2015 whitepaper, John Hales provides a comprehensive overview of things to consider before opting for IaaS. The following list serves as an extract and is by no means exhaustive. Instead, it aims to provide a basic set of questions that need clarification before starting a cloud deployment, in order to increase your chances of success and mitigate risks.

Networking

The network layer builds the foundation for all cloud activities. Without it, there is no way of accessing any of the infrastructure deployed in the cloud. Hence, before choosing an IaaS provider, a couple of items should be clear:

- What type of access will be required and for what purpose?
- How much bandwidth do you need for the planned workloads?
- How will the network from your on-premises data centre and other existing cloud providers integrate with the new IaaS provider?
- In an effort to avoid duplicative IP address ranges, how does your IP scheme integrate with that of the new IaaS provider?
- What speed is possible between VMs and storage?
- What is the IaaS provider's bandwidth between its various DC locations?
- How much bandwidth is there within a rack (i.e. to the top-of-rack switches)?

Compute

Another critical component is the compute layer, comprising the CPU and memory. While there is a whole array of aspects to consider, the following provides a starting point:

- Are bare metal/dedicated physical servers available, if needed, or just VMs?
- If VMs are available, are they multi-tenant or single tenant?
- What are the CPU and memory options?
- Are the CPU cores and memory that your VM uses dedicated to you or shared across multiple VMs?
- How many CPUs can be placed on a server? Are multicore CPUs available, or are they all single core? How many sockets are available? (Note: these questions have licensing implications!)
- What is the memory speed and type? What are the options to choose from?
- Can you select the type and speed of storage attached?

Storage

Depending on the type of workloads you wish to run in the cloud, there are a number of questions on the table. Among them are the following:

- What kind of storage classes are available?
- What kind of shared storage is available (iSCSI, NAS, SAN, AoE, FC, FCoE, CIFS)?
- How does the IaaS provider ensure storage performance?
- What is the failover concept (active-active, active-passive, failover cluster)?
- How is storage billed (space consumed, IOPS allocated, IOPS used)?
- What options are available to minimise the cost of static or slowly changing data such as archives or backups?
- Keeping in mind the exponential data growth of the digital universe and the fact that data exerts gravity, what are the available price tiers as you increase volumes, and how competitive are they?

Security

One of the biggest and most cited obstacles when it comes to transitioning to the cloud is security. When considering an IaaS provider, here are some examples of the items that need clarification:

- In which location/country is the data ultimately stored, and which laws (privacy and otherwise) are applicable?
- Is the DC location in accordance with data privacy or compliance regulations, or corporate policies, that might be applicable in your case?
- How is the DC classified (tier 1 to tier 4), and is this sufficient for your specific requirements?
- What certifications does the provider hold?
- How does the multi-tenancy concept look? In other words, how is the environment logically or physically separated and secured from all other customers hosted by the IaaS provider?
- What kind of secure workloads can be hosted? If you ever wanted a dedicated environment or even caging for highly sensitive workloads, would the provider offer these additional services? Will the provider help you pass audits of those workloads? If so, how?
- What access do technicians have to the servers, VMs and storage used by you? What kind of audit trail is available upon request?
- If a security incident were to occur, how would you be informed?

Data Recovery

Equally important to security is the availability of the workloads deployed. Hence, a long list of items to be clarified with the IaaS provider should include the following questions:

- What are the options for local failover (if a VM fails)? Is failover automatic or manual?
- What if a site-level failure occurs? How does the IaaS provider ensure high availability across sites?
- Are there costs associated with replicating between data centres? If so, are some locations free, cheaper or more expensive than others?
- Are there application design or deployment requirements to make DR possible?
- What is the availability committed in the IaaS provider's SLA, and how does this match your requirements?
- What is the IaaS provider's average mean time to repair (MTTR), and how does this compare to other IaaS providers or industry standards?
- For business-critical workloads: what is an appropriate penalty in case of a major service interruption, and is the IaaS provider willing to agree to these terms?

### Choosing the correct technology – #Cloudtalks with MyBench's Adele Ross
In this episode of #CloudTalks, Compare the Cloud speaks with Adele Ross, Associate Manager for MyBench. Adele talks about helping clients choose the correct technology for their needs and maintaining a relationship with customers.
#Cloudtalks Visit MyBench's website to find out more: http://mybench.co.uk/ ### Structured Connectivity and the Future of A/V Technological innovations are becoming such a common occurrence, and businesses must keep up in order to remain competitive. The challenge lies in developing a connected infrastructure today that can support the technologies of tomorrow, without sacrificing speed and data capabilities. Joseph D. Cornwall, an Audio-Visual (A/V) expert for Legrand’s C2G product line, discusses how A/V still has a part to play in a connected and modern infrastructure and the emerging technologies that will shape the future of A/V. Nowadays, the least we expect of our homes and workplaces is a connected infrastructure in which to use the sophisticated digital devices that have become integral to our daily lives.  The last two decades have redefined the audio/visual landscape in particular.  Gone are the limits of analogue connectivity, replaced by fibre-optics, cloud storage, software-as-a-service and the emerging Internet of Things (IoT). As the capabilities of these technologies increase, so too do our expectations.  This presents a challenge to businesses – remain ahead of the curve without submitting to costly annual building upgrades. The solution is to design a connected infrastructure that is flexible enough to support older, but viable, technologies while facilitating the integration of new solutions as they emerge.  To do this, businesses need to be aware of how the A/V landscape is changing, and the solutions that are waiting in the wings to cope with these changes. A/V is here to stay With the arrival of “smart buildings”, it’s tempting to dismiss A/V as obsolete and regard LAN connections as the main player in modern, connected infrastructure.  While LAN does, and will, have a part to play, A/V remains a vital part of how we communicate. The function of your A/V system should not be left to chance.  There is, unfortunately, no such thing as a “one size fits all” approach, and the landscape is evolving quickly.  Some solutions are either becoming or will be, integral, to a fully-functioning A/V infrastructure. USB Type-C USB (or universal serial bus) connections have become such a key part of our lives that we arguably take them for granted.  They are now integrated everywhere, from podiums and workstations to the centre console of a car. USB Type-C is the next step for USB connectivity.  Delivering power of up to 100 Watts, USB Type-C can be used to input high-definition video into multiple monitors.  It also has the potential to facilitate the upgrade of the screens on our desks to interactive, touch-sensitive devices. [easy-tweet tweet="HDMI is already established as a quality A/V connection and is commonly associated with digital video" hashtags="HDMI,A/V"] While it has yet to be universally adopted, USB Type-C will be a vital tool for any business as interactivity becomes more commonplace.  It also comes with adapters to facilitate the interface between existing structured wiring. High-Speed HDMI HDMI is already established as a quality A/V connection and is most commonly associated with digital video.  Any structured solution must include at least one HDMI port.  When installing a port, users must be cognisant of the resolutions they anticipate supporting the life of the installation, as well as their current resolution requirements.  An infrastructure that uses high-speed HDMI performance supporting up to UltraHD 4k video resolution will remain viable for many years. 
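To put those resolution requirements into rough numbers, the short back-of-the-envelope sketch below (in Python, assuming 3840x2160 at 60 frames per second and 8-bit colour, and ignoring blanking and link-encoding overhead; these figures are illustrative assumptions, not taken from the article) estimates the raw pixel data rate of UltraHD 4K:

```python
# Rough, uncompressed bandwidth estimate for UltraHD 4K video.
# Assumed parameters: 3840x2160 pixels, 60 frames per second, 24 bits
# per pixel (8-bit RGB); blanking intervals and line coding are ignored.
width, height = 3840, 2160
frames_per_second = 60
bits_per_pixel = 24

raw_bps = width * height * frames_per_second * bits_per_pixel
print(f"Raw 4K60 pixel data: ~{raw_bps / 1e9:.1f} Gbit/s")  # ~11.9 Gbit/s
```

Even before link overheads are added, roughly 12 Gbit/s of pixel data has to cross the cable for 4K at 60 Hz, which is why the rated throughput of the installed HDMI path matters when planning for UltraHD over the life of an installation.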
DisplayPort DisplayPort is a digital display interface standard specifically designed for the transfer of digital video, audio and data between a source and a sink.  It’s now found on the majority of new convertible laptop, desktop and workstation computers due to its rich feature set and HDMI/HDCP compatibility.  DisplayPort replaces VGA connectors and these can be adapted to the new framework.  Each structure, therefore, should now feature DisplayPort connectivity as standard. HDBaseT HDBaseT facilitates the home and commercial distribution of uncompressed, high-definition multimedia content.  Its cornerstone is 5Play™, a feature set that includes uncompressed UltraHD 4k digital video, embedded digital audio, 100BaseT Ethernet, USB2.0 and various control signals through a single category cable.  HDBaseT solutions are available in a small box transmitter-receiver or wall-plate form and can be mixed and matched to suit the application.  Furthermore, they don’t connect to a LAN, which means that using an HDBaseT connection does not affect network security or capacity. Optical media conversion This is a good solution for connectivity distances of over 100m.  Scalable to work with the maximum levels of A/V performance, unified optical solutions are easy to install and can accommodate runs of up to 330m while supporting 32-bit colour space. Creating value in present and future structures Now more than ever, the spaces in which we live and work must enhance our technology and add to our productivity if they are to deliver their real value.  The need for a scalable, adaptable and high-performance structured A/V solution to achieve this is undeniable.  Even in the world of smart buildings and virtual storage, A/V is and will remain, a key player in modern, connected structures. ### Solgari Partners with SP Sysnet to Empower APAC’s  Announced at CommunicAsia 2017, SP Sysnet enriches communications portfolio with Solgari’s comprehensive cloud-based telephony & compliance software suite Solgari, the global provider of the world’s first complete enterprise cloud business communications software solution, has announced its partnership with SP Sysnet, a leading APAC IT solutions and consulting specialist focused on delivering leading communications solutions to global clients. The premier partnership will enable SP Sysnet to provide its extensive client base with Solgari’s end-to-end cloud communications & compliance platform. Solgari’s cloud software solution consists of integrated service modules including telephony, contact centre, WebRTC collaboration, IVR, voice verification and compliant call recording that can be deployed individually or in unison. Customers can unify all communications services – voice, chat, video, web conferencing and IM – regardless of location. Moreover, as companies, including fintech, are governed by increasingly stringent data and compliance regulations, Solgari addresses these needs through advanced incorporated security features, including end-to-end call encryption and a secure PCI DSS compliant call archiving service that also provides for highly accurate word and phrase searching. Numerous regulations can be addressed including Central Bank, FCA, SEC and GDPR compliance. The typical customer impact is far more efficient and intelligent communications in the cloud while reducing overall communication and compliance costs by at least 50 percent. 
[easy-tweet tweet="Innovative fintech organisations are changing the business landscape" hashtags="Fintech,Cloud"] “Solgari truly enriches our existing portfolio,” said Siva Prasad, chief executive officer and managing director at SP Sysnet. “Cloud technology provides huge benefits, enabling businesses to be more agile, efficient and scalable, but there’s been a reluctance from financial services organisations to adopt it due to perceived risks around security and compliance. The fintech industries in the region are truly beginning to find their feet, and Solgari’s platform can empower those operating within APAC and globally with cloud communications, without the fear of non-compliance fines with local regulations. “Innovative fintech organisations are changing the business landscape but they require the best technology that empowers them to compete; with Solgari, we can offer them that,” concludes Prasad. SP Sysnet started out as a hardware IT procurement & consulting services but has evolved into a comprehensive end-to-end IT infrastructure, network & communication technology solutions provider. Based in Singapore, the company serves enterprises operating in India, Hong Kong and Japan, as well as the wider APAC region. Speaking at CommunicAsia 2017 in Singapore, Ed Grant, COO at Solgari, added “We’re delighted to have entered into a partnership with SP Sysnet. Siva and his team have the vision to bring global cloud capability to existing and new customers including in financial services. Fintech hotbeds are springing up all over the globe, but the variety and pace of start-ups in APAC is truly staggering. Our cloud communications software solution is perfect for innovative companies that are seeking best of breed SaaS solutions to help them grow quickly internationally while meeting related compliance needs, and without the cost and restrictions of legacy solutions. We’re already empowering globally recognised fintech organisations – including leaders such as Aztec Exchange and Finsa in APAC – and this partnership enables us to help more.” To learn more about Solgari please visit us at booth 5L4-08 at CommunicAsia http://www.communicasia.com/ from May 23rd to 25th. ### Certes Networks Launches Stealth Encryption For the first-time enterprises can encrypt sensitive data, maintain operational agility and meet growing regulatory requirements Certes Networks, an innovator in software-defined security solutions, yesterday announced the launch of its patented Stealth Encryption solution. By encrypting data at the transport layer of the network, Certes is allowing business to increase security without compromising infrastructure or business agility. Certes has developed the industry’s first layer 4 encryption technology to secure data-in-motion, over any network, all without affecting network and application performance. Traditionally, network encryption solutions have worked at layers 2 and 3 of the network OSI Model and are often a function within a router, firewall or other network device. As a result, deploying encryption within the current encryption technology parameters can have a major impact on network and application performance, for example, encryption processing can cut firewall throughput by over 75%. The Certes solution alleviates this problem by decoupling encryption from the infrastructure to ensure consistent network and application performance. 
[easy-tweet tweet="Certes’ patented technology encrypts only the payload of the packet" hashtags="Technology,Encryption"] Additionally, by working at layer 4, Certes Stealth Encryption ensures that only the data payload is encrypted, not the entire packet. To date, encryption technologies have only been able to encrypt all of the packet. This leaves an organisation without visibility of its own network traffic, meaning when performance or configuration issues arise encryption solutions must be turned off to allow the problem to be identified and addressed. Certes’ patented technology encrypts only the payload of the packet; a network using the solution can maintain complete visibility, functionality and all troubleshooting capabilities. Further to this, the stealth encryption element means any intruder looking for sensitive data within the network will not be able to identify encrypted packets versus non-encrypted packets easily, resulting in any stolen sensitive data being rendered useless. As businesses look to meet growing regulatory requirements around consumer data protection, including PCI DSS, ISO standards and GDPR, many are adopting a zero-trust model. This approach, which dictates that nothing within the network is inherently trusted, is pointed to by analysts as the way to minimise the risk and impact of a data breach. Certes actively supports businesses moving to this model as its Stealth Encryption offers zero-trust with zero disruption, allowing CISOs to enforce security and policies to protect sensitive digital assets without impacting operations. Paul German, CEO at Certes commented, “Encryption is a vital component of any cyber security strategy, as it allows businesses to comply with a growing number of standards surrounding data protection. Not only this, but it has the ability to render stolen data completely useless to hackers. However, traditional approaches that have hampered network functionality meaning many businesses are not making the most of the protection encryption technology can offer. Certes Stealth Encryption is enabling businesses for the first time to simultaneously maintain complete functionality  while protecting valuable data.”   ### The Cloud Must Embrace the Edge While cloud computing brings numerous benefits such as economies of scale, consumption-based pricing and the ability to get applications to market quickly, there are indications that on its own, it might not be able to cater for the evolving needs of new age technology. This is where edge computing is going to step up.   If cloud computing is all about centralising computing power in the core, edge computing, as the name suggests, is about taking that power back out to the periphery. In simple terms, the edge is about enabling processing and analytics to take place closer to where the endpoint devices, users or data sources are located. Why would we want processing activity to happen at the edge? One of the main drivers is the rise of the IoT (Internet of Things) and other applications that require real-time decision making or artificial intelligence based on the fast processing of large, multiple data sets. Take the example of a self-driving car that might rely on 100+ data sources tracking areas such as speed, road conditions, the trajectory of nearby vehicles, etc. If a child steps into the road, the car needs to make an immediate, real-time decision to stop. 
If it waited for the data to be sent into the cloud, processed in the core, and for instructions to come back, it’s going to be too late. [easy-tweet tweet="Edge computing allows the processing to happen locally within or near the device" hashtags="Cloud,Computing"] It’s the same for an AI-based drone or robot that has to perform services equivalent to a human being. It needs the data it collects to be acted upon swiftly, not to wait for analysis to arrive from somewhere else at some later time. Edge computing allows the processing to happen locally within or near the device – right where the action is. In doing this, it eliminates the latency involved in waiting for data to go to the core and back. The demand for applications that rely on real-time or near real-time processing and analytics at the edge is on the increase. Retailers, for example, want to use big data analytics to help identify shoppers who walk into their stores so they can deliver personalised real-time offers and product ads. As well as driving out latency, the move to the edge delivers numerous other benefits. For example, if more processing is done at the periphery, less data overall needs to be transmitted across the network into the cloud, which can help reduce cloud computing costs and improve performance. And with processing power located at numerous points throughout the edge rather than centralised at the core, there is no single point of failure. The edge architecture may see greater deployment of micro data centres – small, modularised systems that host fewer than ten servers – to provide localised processing. Or the smart devices themselves will become more compute- and storage-intensive. Or the architecture may include edge gateways that are located near the devices and sensors and act as processing engines and a conduit to the core/cloud-based setups. So does all this mean the end of the cloud? No. It’s likely that we’ll see edge computing co-exist alongside the centralised cloud model. The real-time, “instantaneous” insights will be processed near the endpoint or data source, while the cloud acts as the big brother that processes and stores the large data sets that can wait. The cloud may also act as the central engine to control and push policies towards the edge devices and applications. Both cloud and edge computing are required to work in tandem to ensure both hot and cold insights are delivered on time, cost-effectively and efficiently. For instance, a connected and smart locomotive engine would need instantaneous insights to be processed in order to support its smooth operation while it is travelling. It will generate massive amounts of data that would have to be combined with external data (environment, overall temperature, track health etc.) and processed immediately, often within milliseconds. This function has to be performed using the edge architecture (either the engine acts as the edge or it will be in constant communication with the edge gateways). On the other hand, to maintain and manage the overall health of the engine, the insights may not be required in real time. Here the data sets could be transferred back to the central core, where they can be processed to gain insights. There would be a segregation of functions: some real-time insights will be required immediately for smooth operations, while others can take longer and be processed in the core. Similar use cases can be found with autonomous cars, drones, other connected devices and IoT applications.
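A loose sketch of that hot/cold segregation might look like the following; the thresholds, queue and function names are invented for illustration rather than taken from any particular vendor’s API. The edge process acts on each reading immediately, while the raw data is queued for slower, centralised analysis in the cloud.

```python
# Sketch of the hot/cold split described above: act locally on time-critical
# readings, queue everything for slower, centralised analysis later.
# Threshold, queue and function names here are invented for illustration.
from collections import deque

EMERGENCY_DISTANCE_M = 2.0          # act immediately below this distance
cold_path_queue = deque()           # batched and shipped to the cloud later

def handle_reading(distance_m: float, timestamp: float) -> str:
    """Hot path: decide locally, with no round trip to the core."""
    cold_path_queue.append({"t": timestamp, "distance_m": distance_m})
    if distance_m < EMERGENCY_DISTANCE_M:
        return "BRAKE"              # real-time decision made at the edge
    return "CONTINUE"

def flush_to_cloud() -> list:
    """Cold path: periodically hand the batch to central analytics."""
    batch = list(cold_path_queue)
    cold_path_queue.clear()
    return batch                    # in practice, sent over the network

print(handle_reading(1.4, 0.01))    # -> BRAKE, decided locally
print(len(flush_to_cloud()))        # -> 1 reading queued for the core
```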
As more data is collected and needs to be processed in real-time or near real-time at the margins, so there will be a greater need for edge computing to complement the cloud. ### Why your Business Needs CRM Software In this increasingly technologically advanced society, it is becoming necessary to have the best tools and resources to reach out to potential customers. Small businesses especially need to be able to connect with their customers in a variety of ways. Customer Relationship Management (CRM) software helps small businesses connect with their customers in a variety of ways. It also helps you organise your contacts in an effective way so you can have access to information quickly and easily. Your business needs CRM software to conduct business successfully and retain customers. Having a good relationship with your customers is a key element to growing your business. The right CRM solution can increase your productivity by 26 percent. You know that your business depends on making the right contacts and providing the right customer service. Providing Outstanding Customer Service with CRM Software It doesn't take much thought to understand that the best CRM for small business is the one that can help you connect with your customers effortlessly. CRM solutions that provide integrated connections with the tools and resources you need to conduct business are the ones you want to utilise for your business. Your customers are your "bread and butter" so it is important that you do all you can to draw them in and retain them. A CRM solution that doesn't integrate well with your existing applications or is difficult to use remotely is one you need to think twice about using for your business. A good CRM solution can be customised to reflect the needs of your business. For example; you may only need a good contact list that is compatible with your software right now. Later on, you may want to add reporting capabilities to track customer metrics. The best CRM for small business allows you to be flexible with your tools and resources. Organising Your Customer Data and Providing Access Small businesses need to be able to access customer information quickly and easily. It does no good if you're scrambling to find information that your customer needs right now. A CRM software solution can provide needed databases that can be easily accessed from anywhere but are secure enough to protect sensitive information. When you are in the field talking to a potential client, you need an easy to use CRM system that collects customer data and allows you to pull it up instantly. It is also important on the customer's end to be able to reassure them that their product(s) are on their way. It does help sales and marketing professionals to understand the customer's behaviour patterns and make timely suggestions for new products and services. [easy-tweet tweet="Your business needs CRM software...with this software you can manage workflow" hashtags="CRMSoftware,CRM"] Providing Timely Deliveries and Services An important reason why your business needs CRM software is the fact that with this software you can manage workflow. Satisfied customers provide good feedback for you. The best CRM for small business allows you to track where the products are in the pipeline; understand whether or not you need to follow up on a lead and provide necessary information for your sales and marketing professionals. The best indicator for the improvement and success of your business is the fact that you are seeing return customers. 
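As a very rough illustration of what pipeline tracking and follow-up flags can look like in data terms (the field names and stages below are invented for this sketch, not taken from any particular CRM product):

```python
# Minimal, illustrative CRM-style lead record with a pipeline stage and a
# follow-up check; the stages and field names are invented for this sketch.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

STAGES = ["new", "contacted", "quoted", "won", "lost"]

@dataclass
class Lead:
    name: str
    email: str
    stage: str = "new"
    last_contacted: Optional[date] = None
    notes: List[str] = field(default_factory=list)

    def needs_follow_up(self, today: date, max_days: int = 14) -> bool:
        """Flag open leads that have gone quiet for too long."""
        if self.stage in ("won", "lost") or self.last_contacted is None:
            return False
        return (today - self.last_contacted).days > max_days

lead = Lead("A. Customer", "a.customer@example.com",
            stage="quoted", last_contacted=date(2017, 5, 1))
print(lead.needs_follow_up(date(2017, 5, 23)))  # -> True (22 days silent)
```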
The information you can receive from a well organised and smooth running CRM software solution is priceless. The fact is that your business needs CRM software more than ever before to stay on top of your competition. As a small business owner or entrepreneur, you understand that competition can be fierce. With the right CRM software solution, you can beat the competition handily. ### 3D Systems Chooses ThingWorx Platform for 3D Printers ThingWorx IoT Platform to Monitor 3D Printers for Remote Service and Maintenance; LiveWorx Event to Feature Several Demonstrations of Technology FARNBOROUGH, UK – May 23, 2017 –– PTC (NASDAQ: PTC) and 3D Systems Corporation (NYSE: DDD) today announced that 3D Systems has selected and plans to embed the ThingWorx® Platform from PTC into its 3D printers, enabling intelligent monitoring and remote service and maintenance. 3D Systems provides a comprehensive portfolio of solutions and services for digital design and manufacturing, including both plastic and metal 3D printers. As part of a focused plan to make 3D production real, the company is enhancing its ecosystem of software and hardware to meet production customer needs for end-to-end manufacturing workflows. [easy-tweet tweet="The integration provides 3D Systems visibility into service requirements of the printers" hashtags="3DPrinters, Technology"] 3D Systems plans to enable its 3D printers with the ThingWorx Platform and remote services, designed to maximise printer availability and productivity. The planned solution will give customers the ability to create real-time dashboards to monitor their printers, providing valuable insight and actionable intelligence into the printers’ utilisation and materials status. The integration will also provide 3D Systems with visibility into the service requirements of the printers, enabling the company to remotely diagnose and, in some cases, remotely service the printer, enabling faster repairs and further gains in overall machine productivity. “We believe the capabilities enabled through ThingWorx technology can help us deliver the machine uptime required in production environments,” said Carol Zampell, vice president, software, 3D Systems. “This will enable our customers to gain a distinct business advantage through improved resource utilisation, business insight, and overall gains in productivity.” 3D Systems previously announced the integration of its 3D Sprint™ SDK with Creo® design software from PTC to enable seamless CAD-to-print functionality, as well as a full set of print management tools. “PTC has a long-standing relationship with 3D Systems, and we are excited to continue expanding the collaborative and innovative work by enabling their printers to be smart, connected products,” said Mike Campbell, executive vice president, ThingWorx Platform, PTC. “We support their vision to the business value and opportunity that exists in 3D printing as an industry, and the power the IoT can have in driving that further.” During this week’s LiveWorx technology conference and marketplace, attendees can experience the combined capabilities of 3D Systems and PTC technology. The event, which showcases solutions engineered for a smart, connected world, will take place in Boston from May 22-25. Register to attend or watch the live steam at www.liveworx.com. 
Additional Resources ThingWorx Internet of Things Platform Harvard Business Review: "How Smart, Connected Products are Transforming Companies," authors PTC CEO Jim Heppelmann and Harvard Professor Michael Porter Forward-Looking Statements Certain statements made in this release that are not statements of historical or current facts are forward-looking statements within the meaning of the Private Securities Litigation Reform Act of 1995.  Forward-looking statements involve known and unknown risks, uncertainties and other factors that may cause the actual results, performance or achievements of the company to be materially different from historical results or from any future results or projections expressed or implied by such forward-looking statements. In many cases, forward-looking statements can be identified by terms such as “believes,” “belief,” “expects,” “may,” “will,” “estimates,” “intends,” “anticipates” or “plans” or the negative of these terms or other comparable terminology.  Forward-looking statements are based upon management’s beliefs, assumptions and current expectations and may include comments as to the respective company’s beliefs and expectations as to future events and trends affecting its business and are necessarily subject to uncertainties, many of which are outside the control of the respective company.  The factors described under the headings “Forward-Looking Statements” and “Risk Factors” in the respective company’s periodic filings with the Securities and Exchange Commission, as well as other factors, could cause actual results to differ materially from those reflected or predicted in forward-looking statements. Although management believes that the expectations reflected in the forward-looking statements are reasonable, forward-looking statements are not, and should not be relied upon as a guarantee of future performance or results, nor will they necessarily prove to be accurate indications of the times at which such performance or results will be achieved. The forward-looking statements included are made only as the date of the statement. Neither 3D Systems or PTC undertakes an obligation to update or review any forward-looking statements made by management or on its behalf, whether as a result of future developments, subsequent events or circumstances or otherwise. ### Are you Ready for GDPR? The General Data Protection Regulation (GDPR) has been long discussed but will finally come into force in just a year’s time, on 25th May 2018. Replacing the current Data Protection Act, the GDPR introduces a series of far-reaching changes to the way in which organisations are allowed to store, move and manage data. The main reasons behind the introduction of the GDPR are clear: it strengthens, simplifies and unifies previously disparate data protection regimes across Europe and extends the rights of individuals with regards to consents, their access to and maintenance of their own personal data. Despite Brexit, organisations in the UK will nevertheless be obliged to adopt the GDPR. Not only will the GDPR take effect before the UK leaves the EU, but any non-EU organisation doing business with EU customers or processing their data needs to comply or face significant fines of up to 4% of annual global turnover or €20 million, whichever is greater. A key principle of the regulation is ‘data protection by design and by default’. 
Systems and processes must have data privacy built in from the start and data should only be collected to fulfil specific purposes and discarded when no longer needed. In an age where storage is both plentiful and relatively cheap, the tendency in recent years has been to keep everything ‘just in case’, so this mindset will need to change. Another big change is that Data Processors, who manage data on behalf of companies, will share responsibility with Data Controllers, the owners of the data. Anyone outsourcing the management of the personal data they process needs to ensure that contractual arrangements are updated and that responsibilities and liabilities are clearly stated. As part of this, controllers need to be aware of new restrictions of moving data across borders to non-EU countries and ensure that they are working with processors who are also compliant. The right to erasure (‘to be forgotten’) is a part of the GDPR which could create a lot of work for organisations that do not have an accurate picture of which personal data they are storing about customers, employees, prospects etc. Data that is deemed personal includes any measure that could be used to identify them, including genetic information. Individuals can ask an organisation what data they hold, request erasure without undue delay in certain situations and ask for proof that it has been securely erased. [easy-tweet tweet="Data subjects need to be notified where there is a high risk to them following a breach" hashtags="Data,GDPR"] From the 25th May 2018, companies that fail to protect data and experience breaches will need to report incidents to their data protection agency within 72 hours of Data Controllers becoming aware of it. Data subjects will need to be notified where there is a high risk to them following a breach. With all of this in mind, the key priorities for organisations in the coming 12 months will be to: Assess whether or not they need to appoint a Data Protection Officer (DPO). All public authorities and companies that practice ‘regular and systematic monitoring of data subjects on a large scale’ or where the entity conducts large-scale processing of ‘special categories of personal data’ should appoint a DPO. Ensure that data is correctly catalogued, including alignment to approved retention policies, and that data that is no longer required is securely erased. This includes data stored in the cloud and paper documents stored off-site. Organisations may have paper documents stored off-site for decades that have been scanned and duplicated in online systems – while paper is difficult to hack, document storage must be compliant with the principles of the GDPR. Consider how paper documents containing personal data entering the organisation are dealt with as they move through various business processes. By setting up a digital mailroom, organisations can ensure that they have a clear and compliant audit trail for all paper documents entering the business. Conduct due diligence on where data is being stored in the cloud by cloud application providers and other Data Processors. This could include file sharing apps used by individuals within the business as well as corporate systems such as accounting, CRM, personnel and content management platforms. Make sure the cloud hosting provider has the tools and technologies in place to protect data, identify and report a data breach and produce information and/or incident logs when required. 
Request confirmation from the cloud hosting provider that data is not leaving the EU, or that it is not crossing borders into non-compliant countries where there is a higher potential risk of espionage. Create the required Records of Processing Activities. Update the Data Protection Policy and data breach procedures. Review procedures for subject rights, and update privacy notices and consent forms. While the GDPR is a major shake-up of the data protection regime in the EU, it is also an opportunity for organisations to introduce the best possible practice in data and document management and use this to enhance trust in their business and processes. While this represents one incentive for taking urgent action, the threat of fines that could potentially put an organisation out of business is another equally pertinent one. Time is running out to get the basics in place.

### Survey Finds New Breed of Digital Innovators

World’s largest IT leadership survey finds 89 percent of CIOs investing in innovation
First sign that IT leaders are significantly investing in digital labour
Enterprise-wide digital strategies increase 52 percent in just two years
Cyber security vulnerability at all-time high; only one in five can handle an attack ‘very well’

LONDON – May 23, 2017 – Despite two-thirds (64 percent) of organisations adapting their technology strategy because of unprecedented global political and economic uncertainty, 89 percent are maintaining or ramping up investment in innovation, including in digital labour. More than half (52 percent) are investing in more nimble technology platforms to help their organisation innovate and adapt. This is according to the 2017 Harvey Nash/KPMG CIO Survey, the world’s largest survey of IT leadership. While economic uncertainty is making business planning difficult for many organisations, it is clear digital strategies have infiltrated businesses across the globe at an entirely new level. The proportion of organisations surveyed that have enterprise-wide digital strategies increased 52 percent in just two years, and those organisations with a Chief Digital Officer have increased 39 percent over last year. To help deliver these complex digital strategies, organisations also report a huge demand for Enterprise Architects – the fastest-growing technology skill this year, up 26 percent compared to 2016. Cyber security vulnerability is at an all-time high, with a third of IT leaders (32 percent) reporting their organisation had been subject to a major cyber-attack in the past 24 months – a 45 percent increase from 2013. Only one in five (21 percent) say they are “very well” prepared to respond to these attacks, down from 29 percent in 2014. The biggest jump in threats comes from insider attacks, increasing from 40 percent to 47 percent over last year. [easy-tweet tweet="Organisations have moved on from strategizing about digital, to actually making it happen" hashtags="Digital, Security"] “Making a success of technology has always been challenging, and this year’s Survey says that it has just got harder still,” said Albert Ellis, CEO, Harvey Nash Group. “Layered on top of astonishing advances in technology is a political and economic landscape that is dynamic and changing fast, sometimes in surprising ways. However, what is very clear is that many technology executives are turning this uncertainty into opportunity, driving their organisation to become more nimble and digital.
CIOs are becoming increasingly influential as CEOs and boards turn to them for help in navigating through the complexity, and the threat and opportunity, which a digital world presents.” “Organisations have moved on from a world of strategizing and talking about digital, to one in which they are actually making it happen, and we are now seeing widespread and active implementation,” said Lisa Heneghan, KPMG’s Global Head of Technology, Management Consulting. “The businesses we see as Digital Leaders are taking a pragmatic approach, applying technology and automation across their business, including in back office functions, to create a platform for broader transformation”. Now in its 19th year, the Harvey Nash/KPMG CIO Survey is the largest IT leadership survey in the world. Additional findings from the survey include:  Digital leadership has changed Almost one in five CIOs (18 percent) report their organisations have ‘very effective’ digital strategies. CIOs at these digitally-enabled organisations are almost twice as likely to be leading innovation across the business (41 percent versus 23 percent), and are investing at four times the rate of non-leaders in cognitive automation (25 percent versus 7 percent). Overall, the survey found almost two-thirds (61 percent) of CIOs from larger organisations are already investing or planning to invest in digital labour. CIOs love their jobs, and are more likely to be involved at the Board level CIOs who are “very fulfilled” in their role is at a three-year high - rising from 33 percent in 2015 to 39 percent this year. For the first time in a decade, more than seven in ten CIOs (71 percent) believe the CIO role is becoming more strategic. 92 percent of CIOs joined a Board meeting in the past 12 months. However, the CIO life span is just five years or less (59 percent), although many want to stay longer. Female CIOs receive salary boost  In a striking development, female CIOs are far more likely to have received a salary increase than male CIOs in the past year (42 percent and 32 percent, respectively), but still, the number of women in IT leadership remains extraordinarily low at 9 percent, the same as last year. Big data/analytics remains the most in-demand skill While the fastest growing demand for a technology skill this year was enterprise architecture, big data/analytics remained the most in-demand skill at 42 percent, up 8 percent over last year. Complex IT projects – increase risk of failure monumentally Two-thirds (61 percent) of CIOs say IT projects are more complex than they were five years ago, and weak ownership (46 percent), an overly optimistic approach (40 percent), and unclear objectives (40 percent) are the main reasons IT projects fail. Over a quarter (27 percent) of CIOs say that a lack of project talent is the cause of project failure, but project management skills are completely absent from the CIOs top list of technology skills needed in 2017, dropping a staggering 19 percent in one year. ### Colocation and The Cloud As businesses’ data requirements continue to increase, many are looking to colocation facilities to outsource part of, or all, their data centre estate. For smaller businesses, colocation facilities can provide a more affordable option compared to capital investment and lets companies outsource associated IT maintenance costs. Larger enterprises with global customers see colocation as a way to build a presence in multiple countries at a lower cost, in turn enabling them to adhere to local data sovereignty laws. 
This is evident in Europe, where the colocation market is at an all-time high according to a report from CBRE. Cloud is said to be one of the main drivers of this activity – Google, AWS and Microsoft all have multiple cloud facilities across Europe. Companies have likely also been attracted by tax incentives in many of Europe’s data centre hubs including Nordic countries and Ireland. These locations also have the additional benefit of natural cooling – another likely factor behind Europe’s colocation market success. Downtime and disruption From an operational point of view, running a colocation facility is not without its risks. A mistyped command by an AWS team member deleted access to multiple servers and caused many high profile websites and online services to go offline[1]. This included Expedia, Quora, and Slack – even sites checking website availability were affected by the outage! The outage lasted most of the day, costing both AWS and its customers heavily – all due to human error. One case of human error cost the company in more ways than one. There are likely to have been financial implications, potentially including compensation to customers, but also damage to the reputation of the company’s reliability. [easy-tweet tweet="Services are judged on how quickly they are up and running again" hashtags="DataCentre,Services"] Microsoft is another example. The company suffered downtime to its services when a cooling system failed at its Japan data centre[2]. Equipment shut down to prevent overheating and caused disruption to its services for several hours. It also suffered another large outage in early March when users were unable to access Skype or Hotmail[3]. Colocation customers require the assurance that outsourcing their data centre requirements will not affect reliability. No service is immune but customers need to feel confident that their colocation provider is committed to disaster prevention and has procedures in place when outages occur. Services are judged on how quickly they are up and running again. The earlier an issue can be detected, the faster the issue can be resolved or avoided all together. How to help prevent outages Reducing operational risks is important in this context and key for any data centre manager, especially for those who work for colocation providers. One option is to consider real-time environmental monitoring, which can alert data centre managers to issues such as cooling system leakages and overheating equipment. Solutions can detect potential issues before they have a chance to develop and data centre managers can address the problem before it becomes a major event. IT asset management is another option to evaluate. This provides location data on equipment and additional insight to assist decision making. Data centre hardware is frequently updated and older equipment is more likely to break. Having information about the age of equipment and upcoming warranties can help businesses better manage their data centre estates and finances. IT asset management provides the intelligence to ensure equipment is running smoothly but also to track its whereabouts. Colocation data centres are vast and trying to find the one asset in need of attention through manual methods could take hours, by which time the issue could have escalated. Surprisingly, environmental monitoring and asset tracking are often still performed with a clipboard and spreadsheet. 
Not only is this prone to human error, but it is also time-consuming – time that could be better spent maintaining equipment and preventing outages. Automated data collection eliminates the need to manually collect environmental and asset information and reduces the risk of human error.

Making the move

Moving to a colocation facility has its benefits – a secure building and environment, connectivity and the benefit of not having to run and pay for a facility. Data centre estates can also be spread across multiple locations to better serve an enterprise’s needs, and the costs of running a data centre are reduced. Colocation providers have SLAs to uphold, and any downtime is costly, so having the intelligence and tools to resolve issues as quickly as possible is key to keeping customers happy. Providing additional insight and demonstrating a commitment to reliability is now the deciding factor for customers when choosing a colocation provider. The data centre estate is still evolving, and colocation and the cloud look likely to continue to be part of this. As more organisations outsource their data centre requirements, providers will need to stand out from the competition and demonstrate they are the best choice.

Sources:
[1] https://aws.amazon.com/message/41926/
[2] http://www.datacenterknowledge.com/archives/2017/03/31/data-center-cooling-outage-disrupts-azure-cloud-in-japan/
[3] http://fortune.com/2017/03/07/microsoft-outlook-outage/

### How SMEs are Driving Cloud Uptake

Discussion of the cloud and the need for businesses to adopt a cloud-first policy is everywhere in the IT sector at present, dominating conversations in company boardrooms and online communities alike. With the cloud offering seamless updates to existing services, amongst other benefits, it is only natural that it is viewed in some quarters as the most assured way of securing business for the long term. With this in mind, it is important to consider the pace at which businesses are adopting cloud technologies. The Paessler Cloud Acceptance Study looked into attitudes towards the cloud amongst SMEs: who is leading the way, how they’re getting there, and some of the issues preventing them from taking up cloud services at an even faster rate.

The key to growth

SMEs hold a privileged status within the UK as what some refer to as the ‘engine room’ of the economy. Given their ability to scale quickly as well as stay agile and adapt to the rapid changes going on in their marketplace, it is widely recognised that they have a pivotal role to play in driving the country forward economically. This is further endorsed by their sheer dominance of the business world in the UK. The Federation of Small Businesses (FSB) found in a 2016 study that 99.9% of all private businesses would be classed as small or medium-sized. Because of their ‘challenger brand’ mentality, they also often lead the way when it comes to adopting new technologies and innovative trends, forging the way for other companies to follow suit. The results of our cloud survey reinforce this idea, finding that 70% of SMEs already use cloud technology as part of their IT infrastructure, or are planning to do so shortly.
Clearly, in this area, SMEs are living up to their reputations as ‘early adopters’ by spearheading the path to a prosperous digital economy, jumping on these trends early and in significant numbers.

Overcoming barriers

However, there is also reticence on the part of some SMEs to dive head first into the cloud. Among the issues that have cropped up are data security, the costs incurred by updating an entire system, and a lack of knowledge within SMEs. [easy-tweet tweet="IT can often find its way to the bottom of the list of priorities" hashtags="IT, Data"] These concerns are hardly surprising on one level. It is unlikely that many SMEs will have enough resource in their IT departments to devote a staff member solely to cyber security, and so keeping customer and client data secure is bound to be front of mind when it comes to making changes to company IT. In addition, the idea of committing often sparse resources to the upfront costs associated with shifting to the cloud takes some careful thought for SME owners. With a budget that needs to cover other factors such as staffing, marketing and product development, IT can often find its way to the bottom of the list of priorities. However, small businesses are beginning to find ways to address these issues. In terms of security, the three major cloud providers – Amazon, Microsoft and Google – provide access to round-the-clock teams devoted to protecting their data. In many ways, it could be argued that this level of attention actually improves security. And in terms of addressing the cost, the Hybrid Cloud has emerged as a viable alternative for many of these businesses, providing the opportunity to combine cloud services and on-premise infrastructure into a mixed network.

Delivering value

The growing number of SMEs shifting their IT to the cloud cannot simply be explained by good marketing from the big three cloud providers. The cloud presents obvious opportunities for small and medium-sized businesses in terms of saving time and money, scalability, and long-term IT strategy:

Freeing up time and lowering long-term cost

For instance, whether a business is delivering website content or download links to its products or services, using a cloud-based delivery network is the most efficient way of getting the desired information out to customers. It’s generally cheaper than using a data centre for the same purpose, and also faster.

Securing and stabilising the IT network

Most SMEs will only have a small core team devoted to the IT network on a day-to-day basis. Cyber security is an issue that constantly comes up in considerations for small businesses, and having a cloud provider take responsibility for it, with their far larger resources given over to security, is often seen as the safest and most secure option.

Reassessing the IT infrastructure

In a broader sense, moving to the cloud also provides SMEs with the ideal opportunity to take another look at their IT networks and ask whether they are still working as hard as they can for the business. Whether it’s asking whether legacy systems are being used in the way they should be, or whether the wider IT system can be simplified, nothing is off the table when it comes to shifting to the cloud.

Making the shift

Ultimately, the advent of the cloud is set to continue apace, having already changed the way we do business and perceive technology.
Given that small businesses as a whole are already at the forefront of this trend, it is almost assured that more and more SMEs will continue to move their services to the cloud, and carry on driving cloud uptake in the UK. ### Sesui, Announced as Finalist in the Oxfordshire Business Awards Cloud-based telephony innovator, Sesui Limited, is proud to announce it is a finalist in the Oxfordshire Business Awards 2017, following on from its success in the 2016 Queen’s Award for Enterprise: Innovation. The awards have recognised, rewarded and promoted excellence amongst Oxfordshire-based organisations for 23 years. This year, Sesui is one of only three finalists in the Oxford Science Park Innovation category, which is looking to reward the development and/or introduction of an innovative product, process, service or business method. Sesui works with its clients to transform the telephone from office commodity to an invaluable business tool with resilient, scalable, flexible and functional telephony solutions. Owned and developed in-house, its i-Platform underpins all Sesui products and services. It can be overlaid onto any type or combination of telephony infrastructures including digital, IP or analogue, allowing clients to maximise their existing infrastructure and investment. [easy-tweet tweet=" The i-Platform has enabled Sesui to develop robust and agile virtual contact centre solutions" hashtags="i-Platform,Virtual"] The i-Platform has enabled Sesui to develop robust and agile virtual contact centre solutions, allowing for the design and deployment of new functionality simply and easily, and the day-to-day management and monitoring of customer activity. It is for its innovation in the development of the i-Platform that Sesui has been shortlisted in the Oxford Science Park Innovation Award. In 2016, the i-Platform was also recognised in the Queen’s Award for Enterprise: Innovation. Sesui also won the ‘Most Pioneering Cloud-Based Contact Centre & CV Innovation Award’ for the Sesui i-APP within the Technology Innovator Awards. Lee Bryant, Managing Director, Sesui, comments: “Cloud telephony is increasingly on the rise as companies of all sizes recognise the efficiency and productivity gains it enables. However, for Sesui, it’s also about providing companies with a new lens on business performance, unlocking real-world insight into customer and workforce behaviours that organisations have not previously been able to see. This is why we are so proud of the recognition we have received from the business and telecommunications industries, and look forward to the results of another successful entry.” ### Loom Systems Selected As a Gartner “Cool Vendor” Vendors selected for the “Cool Vendor” report are innovative, impactful and intriguing Loom Systems, a provider of an AI-powered log analysis software platform, announced today it has been included in the list of “Cool Vendors” in the Performance Analysis report by Gartner, Inc. The report states, “In an attempt to tame an increasingly complex IT environment, several vendors are bringing advanced analytics and enhanced scalability to the availability and performance management market.” [easy-tweet tweet="Loom’s software platform uses a combination of machine learning technologies to provide predictive insights" hashtags="IT, Technology "] Targeted at DevOps and IT professionals, Loom Systems’ AIOps platform is used to analyse log data generated by applications and infrastructure. 
Loom’s software can be used with multiple data types including events and metrics and from multiple data sources both directly and via other management tools. Loom's AIOps platform automates the skill-intensive tasks associated with preparing data for analysis while simplifying implementation and speeding time to value. Loom’s software platform conducts root-cause analysis and actionable mitigation recommendations through its Tribal Knowledge Bank™ (TriKB) and uses a combination of machine learning technologies to provide predictive insights and a virtual personal assistant-like recommendation engine. “We consider our inclusion in the Cool Vendor report by Gartner, confirmation of our mission to develop technology that analyses both human and machine-generated data down to the single event or transaction,” said Gabby Menachem, CEO and founder, Loom Systems. “We apply artificial intelligence to remove the pre-processing IT teams need to do before they can analyse data and we also provide an intelligence layer that ties log data to remediation and action.” ### IDBS Announced as a National Technology Awards Winner The IDBS E-WorkBook Cloud has won Cloud Product of the Year Guildford, UK, May 19, 2017 – IDBS, a leading R&D and scientific software solutions provider, has been recognised by the National Technology Awards for its innovative cloud solution. The firm’s flagship platform, The E-WorkBook Cloud, has been awarded Cloud Product of the Year at the prestigious awards, beating companies such as UKFast and Workday for the coveted prize. The IDBS E-WorkBook Cloud is an enterprise cloud-based platform designed specifically to meet the demands of research and development data management and analysis in science-driven industries globally. [easy-tweet tweet="The IDBS E-WorkBook Cloud was designed using our deep understanding of the R&D process" hashtags="IDBS,Cloud"] Laurence Painell, Vice President of Marketing at IDBS said: “IDBS’ recognition by the National Technology Awards is a true testament to our dedication to enable organisations to operate faster, more effectively and tackle some of their most significant challenges. The IDBS E-WorkBook Cloud was designed using our deep understanding of the R&D process and the specific pain points faced by the industry. It’s an honour to be recognised as a leading provider of cloud solutions that facilitates ground-breaking research and overcomes research hurdles.” The National Technology Awards ceremony, organised by FStech Magazine, took place at the Millennium Hotel London Mayfair on Wednesday, May 17th and celebrates the leaders of pioneering technology to help drive standards and encourage excellence within the industry. IDBS will continue to provide the best cloud-based R&D technology and solutions to the world’s most forward-thinking companies, helping them solve global challenges and foster innovation. ### Technology and the Law with Frank Jennings Ep. 1 - Tweets from the show! Technology and the Law with Frank Jennings Ep. 1 ### Why IT Support Services Will Drive Future Growth IT support services are central to ensuring technology stays online and continues to operate effectively. Worth millions of pounds to the UK economy, the IT support industry forms an important part of the UK’s services economy, a section which according to IHS Markit accounts for as much as 80% of economic productivity. Technology support, in general, is set to continue to grow over the coming years as reliance and demand grows. 
Gartner estimates that we are set to see IT services grow at a rate of 3.5% over the next three years. A key driver is an increasing appetite for on-demand and always-on services, fuelling the growth in IT infrastructure and storage requirements. This, in turn, has affected the scale and manner in which technology services are being consumed, increasing the consequences associated with any service outage. Being online and operational has become the norm. Therefore things quickly escalate if any service failures occur. Operating in the IT channel, organisations in our space have focused efforts on traditional technology solutions, chiefly hardware, software and consultancy sales. In order to attract higher margin business and retain competitiveness, the channel has had to diversify further by steadily strengthening and extending the services portfolio, particularly around support, which continues to be where we’re also seeing the fastest growth. Agilitas research carried out by OnePoll asked which areas channel leaders expected to see the strongest revenue streams in 2020, with the majority of the channel leaders, surveyed answering software and hardware support services. In recent years, the priorities of customers have altered to focus as much on service performance as on product performance. As-a-service models are providing the means to drive these capabilities by guaranteeing results for end users. Delivered at a single OPEX based cost, we’re seeing increasing popularity for similar models in both the business and consumer worlds, primarily as pricing, service levels and contractual obligations tend to be more flexible and customer focused. Designed to simplify a service at a single cost, their popularity is a result of consumers and businesses being able to outsource or shift responsibility to ensure an action happens. Amazon Prime is a good example, delivering goods purchased online as fast as possible for a single cost of £79 per year. It seems, however, that this trend is experiencing greater momentum in IT though because of the economic conditions businesses are currently facing. In short, support based delivery models for IT services provide customers with a level of certainty in times of uncertainty. [easy-tweet tweet="Recruitment as-a-service, are on the rise, enabling organisations to focus on delivering to customers" hashtags="IT, RaaS"] With appropriate support services in place, be it inventory management, engineer resource, disaster recovery or technical advice, when things inevitably do go wrong, organisations do not have to shell out thousands of pounds to maintain uptime and business continuity. The growth in IT dependence is well documented, however, as our government continues to grapple with the opportunities and challenges facing Britain outside of the European Union, support services are providing businesses with some level of assurance. Similar trends are being seen outside of IT and technology. HR and recruitment as-a-service, for example, are also on the rise, arguably enabling organisations to focus on delivering to customers rather than administration or support activity. In Agilitas’ view, the focus has moved very much in favour of the end user customer. Support services run ‘as-a-service’ are putting customers in control of their spend and therefore results. This, in turn, reduces pressure on businesses as costs can be budgeted months in advance. Support services are an area ideal for outsourcing. 
The capital investment required to build what they supply from scratch can be very high, be it inventory, technical resource or IT systems, and because of the competition that already exists, the space is often difficult for new entrants to break into. In the space Agilitas operates in - inventory-as-a-service - organisations looking to manage their own inventory on behalf of customers not only have to make that initial heavy investment but also need to commit to a programme of continuous investment in order to retain service levels. In order to succeed, IT hardware stock must be kept at an optimum level of availability in order to service customers effectively while hitting contractual SLAs. Looking to the future, the core reason behind support service growth will be a move to fit future models of doing business. Research has indicated a potential step change as channel organisations have looked to focus on the areas that they do best - serving their end user customers. ### Technology and The Law, Episode 1: GDPR and Data Location Tune in today to "Technology and The Law". A brand new show broadcast by Disruptive Tech with presenter and technology lawyer Frank Jennings. Technology and The Law will be available to watch live today at 2:30 pm UK time. This is a show that focuses on the legal aspects of, and debates surrounding, the technology industry. The topics this week will focus on GDPR and Data Location. Frank will be joined by four technology experts: Rick Powles, Chris Bridgland, John Easton and Sue MacLure, and together they will look into the current regulations and how people perceive existing laws. These episodes will run for 6 weeks, at 30 minutes each, focussing on a variety of different topics, such as the Cloud, social media and many more. So if GDPR isn’t your cup of tea, tune into an episode that best suits you. Every episode, four new experts will come in to contribute towards Frank's topics. Frank Jennings is the commercial and cloud lawyer at Wallace LLP. He specialises in commercial contracts covering cloud and technology, data security and intellectual property. Frank is known for his can-do attitude, managing risk and problem solving to maximise return on investment. Who else could present this show? If you are interested and want to learn more about the legal aspects of, and debates surrounding, the technology industry, then tune in! https://www.comparethecloud.net/techlaw-ep1/ or https://www.disruptivetech.tv/techlaw/ Hashtag: #TechLaw Twitter: @disruptivelive ### Vivonet Deploys Genesys Cloud Solution PureCloud by Genesys helps cloud-based hospitality technology provider achieve fast ROI and improve customer satisfaction through increased stability, functionality and ease-of-use Vivonet, the industry leader in cloud-based hospitality solutions, has deployed PureCloud by Genesys™, a cloud-based customer engagement and employee collaboration offering from Genesys®, the global leader in omnichannel customer experience and contact centre solutions. Vivonet currently uses PureCloud by Genesys to support approximately 30 Vancouver-based agents and contact centre managers, who handle hundreds of technical support interactions per month. Vivonet will roll out the solution to an additional 28 agents in a contact centre in Columbus, Ohio, later this year. PureCloud by Genesys is a unified, all-in-one customer engagement and employee collaboration solution that is easy to use and quick to deploy. A true cloud offering, PureCloud by Genesys is flexible, open, feature-rich, and built for rapid innovation. 
Before selecting a solution, Vivonet tested PureCloud by Genesys against competitor Five9 across nearly 300 touchpoints. Vivonet ultimately selected the Genesys solution for its intuitive interface, all-in-one feature set, and AWS-based architecture, which makes it reliable and easy to deploy and manage. “There is no other customer experience solution that compares to the simplified, sleek omnichannel routeing of PureCloud by Genesys,” said Shafique Adiatia, Vivonet system administrator. Vivonet cites a high record of uptime as one of the most significant benefits of PureCloud by Genesys. By ensuring effective, uninterrupted service, the solution has caused customer satisfaction ratings to increase and escalations to decrease since Vivonet transferred from its previous cloud solution. “Our former solution’s unreliability was costing us money because customers receive credits when service is interrupted,” said Adatia. “We’ve reduced refunds since the transition. Now I can sleep at night knowing I don’t need to worry about our contact centre going down, and our customers can reach us when they need to.” [easy-tweet tweet="Vivonet values the omnichannel functionality of PureCloud by Genesys" hashtags="Vivonet, Cloud"] Vivonet also values the solution’s ease of use, which made deployment and training lightning-quick for faster return on investment (ROI). “PureCloud by Genesys is so intuitive that our non-technical interns were able to build a fully-functioning contact centre in mere days, using only the comprehensive resource documentation,” said Adatia. “From start to finish, full system deployment took less than a month — from the initial contact centre build-out, to the development of advanced call logic, to full agent adoption.” Vivonet values the omnichannel functionality of PureCloud by Genesys, as their previous system didn’t allow them to engage with customers using the method of their choice. Today, Vivonet uses features such as voice with WebRTC, advanced call routeing and email routeing, and speech-enabled and interactive voice response (IVR), in addition to leveraging the reporting dashboards, data analytics and quality monitoring capability. “With PureCloud by Genesys, we make better decisions for how and when we staff our technical support environment,” said Adatia. “And, we use the listen functionality, dual phone recording, and customer journey metrics to understand every step of the customer experience—from when they first dial our number, to the option they selected to get to an agent, to the entire path they took until the call ends.” PureCloud by Genesys is exceeding the company's expectations and according to Adatia the company plans to roll out more of the solution’s capabilities by the end of the year, starting with co-browsing, web chat, and workforce management. “We’re honoured that Vivonet has selected PureCloud by Genesys and confident that our technology will help them deliver great customer experiences by providing a reliable, intuitive and powerful solution to improve functionality, processes and engagement,” said Tom Eggemeier, president of Genesys. For more details on how Vivonet is benefiting from PureCloud by Genesys, read the full case study on the Genesys website. ### The Impact of GDPR on SMEs The General Data Protection Regulation (GDPR) has made headlines for the last year, but with only 12 months to go, SMEs need to ensure their businesses are prepared. Regardless of ongoing Brexit negotiations, the UK has committed to implementing GDPR by 25 May 2018. 
All UK businesses collecting data on EU customers will be affected, including SMEs. While small business owners may consider this just another regulatory burden, it is an important one to take note of. With this in mind, we’ve highlighted some of the key facts below. Why is GDPR needed? With the internet and cloud computing, data processing has changed significantly since the late 90s, when data protection laws were last reviewed. This has come hand in hand with an increase in data breaches, which peaked last year, growing by 40% compared to 2015, according to the Identity Theft Resource Center. A recent report by Juniper Research revealed that almost three-quarters of UK SMEs think they are safe from cyber-attack, yet half of these suffered a data breach. Against this backdrop, the need to protect personal data more effectively has never been more evident. [easy-tweet tweet="GDPR will strengthen the protection of personal information" hashtags="GDPR,Security"] What is GDPR? GDPR is the new law that comes into force in May 2018, requiring any business that operates in the EU or handles the personal data of EU residents to implement a strong data protection policy to protect client data. It replaces the Data Protection Act 1998, with the following key differences: Broader scope: the regulation affects any business that collects, processes or stores personal data from EU-based individuals, including businesses based outside the EU. The definition of personal data has also been broadened to include genetic, mental, economic, cultural and social identity, thereby bringing more data into regulation. Tougher penalties: a two-tiered sanction regime will see breached organisations receiving fines of up to 4% of annual global turnover or €20 million – whichever is greater. Fines are dependent on data loss and the systems and technology put in place. Shorter notification of breaches: businesses are required to report data breaches to the relevant Data Protection Authority within 72 hours of detection. Accountability and Privacy by design: GDPR places increased accountability on business systems and processes. Data controllers must maintain documentation, conduct a data protection impact assessment for riskier processing, collect only necessary data and discard it when no longer required. Appointment of a data protection officer (DPO): a mandatory requirement for all public authorities and companies whose core business activities are data processing. Consent required to process children’s data: parental consent will be required to process personal data of children under 16, and individual EU Member States may choose to lower this age to 13. Access to data: data subjects are entitled to request a copy of their personal data in a format usable by them and electronically portable to another processing system. The right to be forgotten: data subjects have the right to have their data erased. Businesses must ensure they have the processes and technology to delete data in response to requests to do so. GDPR will strengthen the protection of personal information. Regardless of size, all companies doing business in the EU will be required to collect, store and use personal information more securely. While there are a few areas where SMEs are recognised as having fewer resources and capabilities than larger enterprises, small businesses can take comfort in some leeway regarding documentation and record keeping. The degree of leeway at this stage is still uncertain. 
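To make a few of the requirements above more concrete (the right to data portability, the right to erasure and the 72-hour breach notification window), here is a minimal, hypothetical Python sketch. The in-memory CUSTOMER_RECORDS store and the function names are illustrative assumptions rather than part of the regulation or of any particular product; a real implementation would sit on top of the business's actual systems of record.

```python
import json
from datetime import datetime, timedelta

# Hypothetical in-memory customer store standing in for a real CRM or database.
CUSTOMER_RECORDS = {
    "cust-001": {"name": "Jane Doe", "email": "jane@example.com", "orders": 12},
}

def handle_portability_request(customer_id: str) -> str:
    """Return the data subject's personal data in a portable, machine-readable format (JSON)."""
    record = CUSTOMER_RECORDS.get(customer_id)
    if record is None:
        raise KeyError(f"No personal data held for {customer_id}")
    return json.dumps(record, indent=2)

def handle_erasure_request(customer_id: str) -> bool:
    """Delete the data subject's personal data (the 'right to be forgotten') and confirm the outcome."""
    return CUSTOMER_RECORDS.pop(customer_id, None) is not None

def breach_notification_deadline(detected_at: datetime) -> datetime:
    """GDPR requires notifying the relevant authority within 72 hours of detecting a breach."""
    return detected_at + timedelta(hours=72)

if __name__ == "__main__":
    print(handle_portability_request("cust-001"))
    print("Erased:", handle_erasure_request("cust-001"))
    print("Notify authority by:", breach_notification_deadline(datetime(2018, 5, 25, 9, 0)))
```

In practice, handlers of this kind would be backed by audit logging and documented procedures, which is exactly where regulators will expect to see evidence of preparation.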
Providing SMEs can demonstrate they’ve taken a proactive approach to data protection, privacy and meeting the requirements of GDPR, regulators will work with them on any problems that might arise. Engaging with the right consultants and documenting all actions undertaken will be key to compliance and to avoiding hefty fines, which may have a disproportionate effect on SMEs. How to prepare your SME? Businesses need to act now to ensure they understand the actions required to achieve compliance. The first step is undertaking a data processing audit to identify any gaps in your systems, while giving you sufficient time to rectify them. The Information Commissioner’s Office provides a handy guide to help prepare your business: Preparing for the General Data Protection Regulation (GDPR): 12 steps to take now. There are also various GDPR awareness training days available to help you understand what is required of your business. Customers are the lifeblood of any SME, so ensuring you have the correct processes in place to handle data and avoid a potential breach is vital. Establishing a plan to deal with any potential breach or cyber-attack is equally important. This will avoid further damage to client or supply chain relationships, particularly with the new reduced breach notification periods. In summary, the sooner you start getting your GDPR strategy in place, the better. For more advice on protecting your data within your IT systems, please get in touch, and we’d be delighted to help. ### 4 Ways Cloud is Acting as a Temporary Fix Many companies negotiating full data centre rebuilds are considering cloud solutions to ‘fill the gap’ as demands on business-critical applications grow. Cloud-based solutions of this type are providing businesses with the ability to extend legacy infrastructures until they can be upgraded, re-built, or until companies decide to deploy more cloud computing power. Here are some examples of how the cloud is minding the gap as a temporary fix: Filling the gap by avoiding a ‘floor sweep’ Cloud can be used to fill the gap by avoiding a ‘floor sweep’ of the data centre: this refers to when a company faces major capital investments in the replacement of data centre hardware. In addition, when facing a limitation in IT capacity, the cloud can be used to supplement traditional IT systems to provide the flexibility in capacity that is required. Known as cloud bursting, this allows businesses to augment their on-premise systems with short-term additional capacity in the cloud. For example, a national satellite entertainment service provider had ageing hardware systems, some of which were nearing their limits or facing reduced support from vendors. The company had a mature IT group that operated its own data centre. In addition to maintaining a large corporate data warehouse, its production applications ranged across all the corporate functions, including managing a large service fleet to install and repair devices among its customer base. The provider knew it faced major capital investments in hardware upgrades or a “floor sweep” of its data centre. Facing a limitation in IT capacity, the company put the cloud to the test to see if it could be used to supplement traditional IT systems and provide the flexibility and capacity required. What the company found was that the cloud was an excellent solution to the demonstrated limitation in IT capacity it was experiencing, particularly around its requirements for development and testing. 
In its journey to avoid a floor sweep, the company began to understand cloud architectures better and was able to evaluate whether the transition of its production systems to the cloud would be an efficient way to fill the gap. Ultimately, the company decided to supplement its traditional IT systems with a cloud-based development system, minimising the need for a floor sweep across the board. This is one story amongst many with a focus on the cloud’s ability to fill gaps previously left open by large-scale IT transformation programmes. So when it comes to considering a full floor sweep, it’s not surprising that many companies are opting to use the cloud to both plug gaps and capitalise on market opportunities in a cost-effective way. Filling the data centre deployment gap For a traditional big-box data centre, the hardware purchase and installation of an equivalent system would take close to a year to become production ready. Many companies facing these extended timescales have started to consider cloud solutions as a ‘temporary bridge’ to a future hardware purchase. [easy-tweet tweet="The right cloud solution can allow a company to extend the life of a critical application" hashtags="CloudSolution,IT"] For example, the right cloud solution can provide a company with the ability to extend the life of a critical application, enabling a medium-term bridge to continue operations and allowing the option to bring the application on-premises at a future time. This can all happen while a major data centre is being re-built or in place of a large-scale hardware purchase. Filling the staffing gap Even if a company has considerable IT infrastructure on-premises, it may decide to adopt a cloud approach to provide the desired flexibility and speed required for new initiatives. The right solution means a company will be able to avoid any impacts on the existing production environments and IT staffing; for example, it is relatively straightforward to add and subtract additional cloud resources without diverting significant resources away from the day-to-day operations activity for the on-premise systems. Filling the gap with a multi-cloud structure In order to deal with increasing demands on its current IT system and challenges associated with mixed workload environments, a company might look to a solution with a multi-cloud structure, with different workloads allocated to different cloud providers and technologies. For example, database management could be hosted on a private cloud behind the organisation’s firewall, with the report generation workload run on a public cloud and sized according to reporting and processing needs. The reporting systems could be augmented at busy month-end periods, with capacity reduced during other times of the month (see the sketch below). This allows a company to support its IT operations economically with a small staff. Overall, it’s easy to see the appeal of companies adopting cloud solutions as supplements or ‘temporary bridges’ to a future hardware purchase. In comparison to hardware installations, cloud solutions can be production ready within a matter of weeks, with the bonus of costing a fraction of a full re-build. Of the many companies that have successfully looked to the cloud to fill all manner of gaps, many find their stop-gap cloud solutions rise to meet end-to-end business challenges, becoming the go-to solutions for enterprises moving forward. 
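As an illustration of the multi-cloud pattern just described, where reporting runs on a small baseline footprint and extra public cloud capacity is requested around month-end, the following Python sketch shows the scheduling logic in its simplest form. The node counts, the three-day month-end window and the apply_capacity placeholder are assumptions made for illustration; a real deployment would call the chosen provider's scaling API with figures derived from workload profiling.

```python
import calendar
from datetime import date

# Hypothetical capacity figures; real values would come from profiling the reporting workload.
BASELINE_REPORTING_NODES = 2
MONTH_END_REPORTING_NODES = 8
MONTH_END_WINDOW_DAYS = 3  # scale up for the last three days of each month

def desired_reporting_capacity(today: date) -> int:
    """Return how many public cloud reporting nodes to run for the given date."""
    last_day = calendar.monthrange(today.year, today.month)[1]
    if last_day - today.day < MONTH_END_WINDOW_DAYS:
        return MONTH_END_REPORTING_NODES
    return BASELINE_REPORTING_NODES

def apply_capacity(node_count: int) -> None:
    """Placeholder for a call to the public cloud provider's scaling API."""
    print(f"Requesting {node_count} reporting node(s) from the public cloud")

if __name__ == "__main__":
    apply_capacity(desired_reporting_capacity(date(2017, 5, 29)))  # month-end: scales up
    apply_capacity(desired_reporting_capacity(date(2017, 5, 10)))  # mid-month: baseline
```

The point of the sketch is simply that the burst-and-shrink behaviour described above is a small piece of scheduling logic once the workload has been placed in a cloud that can honour the capacity request.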
### Open the Pod Bay Doors, Siri Things didn’t run smoothly when the astronauts in 2001: A Space Odyssey toyed with artificial intelligence. When asking to be let back inside the spaceship, supercomputer HAL eerily responded, “I’m sorry Dave, I’m afraid I can’t do that.” Thankfully, the personal assistants we use today are a little friendlier. Here, Roxanne Abercrombie of business automation specialist, Parker Software, explains the role of intelligent assistants for both consumers and businesses. Grab your phone and ask Siri the same question. Today’s intelligent assistants are designed to provide friendly, fast and intuitive customer service, so you’re likely to get an instant, sassy response, like Siri’s: “Here we go again, we intelligent assistants will never live that down.” Currently, consumers are basking in the fun of instantaneously arranging dinner reservations, making language translations on the move and, of course, prompting Siri to give them humorous responses to ridiculous questions. Every technology giant — including Apple, Google and Microsoft — is investing significant resources into the development of new intelligent assistants. In fact, AI is quickly becoming a major focus of competition for these firms. Let’s face it, if consumers can use their smartphone to order a taxi, set an alarm for the next morning and send a text message to let their friends know they’re home safely, they will expect the same speed and intuition whenever they use technology — including when dealing with businesses for customer service enquiries. [easy-tweet tweet="'Intelligent assistant’ describes a software agent that can perform automated tasks for individuals" hashtags="AI,IoT"] From painstakingly slow automated speech recognition (ASR) to unreliable chat bots, automation has not always had the best reputation in the customer service realm. Even today, when the technology is highly efficient and capable, there’s a widespread fear that artificial intelligence will replace human service representatives entirely. However, the apocalyptic vision is far from the reality. The term ‘intelligent assistant’ simply describes a software agent that can perform automated tasks for an individual, whether that be through voice commands or written as text in a live chat window. For customers, this technology can simplify basic enquiries. In e-commerce, for example, if a customer wanted to return an item and replace it with an alternative, the assistant can automatically draw the customer’s details from the CRM system, update the file and process the new order for the same address. For more complicated enquiries, there should always be a human agent available. As automated technology will reduce the amount of time agents usually spend on administrative tasks, they are free to deal with these complicated enquiries more efficiently. What’s more, for the companies implementing this technology, the data it generates can provide an unprecedented insight into human patterns, including the vocal and written clues that signify a user’s feelings and preferences. For technology giants, collecting this information will be crucial to the next phase of artificial intelligence — making the technology even smarter and more reliable. For some, the thought of even smarter artificial intelligence is a scary dystopia — we’re blaming you for this, HAL. However, there is no need to fear. 
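As a hypothetical illustration of the e-commerce returns scenario described above, in which an assistant draws a customer's details from the CRM, updates the file and raises a replacement order to the same address, here is a minimal Python sketch. The CRM and ORDERS dictionaries and the process_return_and_replace function are invented for illustration only; a real intelligent assistant would call live CRM and order management systems.

```python
# Hypothetical CRM and order store; a real assistant would call live business systems.
CRM = {
    "cust-042": {"name": "Sam Patel", "address": "1 High Street, Leeds"},
}
ORDERS = {
    "ord-1001": {"customer_id": "cust-042", "item": "kettle", "status": "delivered"},
}

def process_return_and_replace(order_id: str, replacement_item: str) -> dict:
    """Mark the original order as returned and raise a replacement order to the same address."""
    order = ORDERS[order_id]
    customer = CRM[order["customer_id"]]
    order["status"] = "return requested"
    replacement = {
        "customer_id": order["customer_id"],
        "item": replacement_item,
        "ship_to": customer["address"],
        "status": "pending dispatch",
    }
    ORDERS[f"{order_id}-replacement"] = replacement
    return replacement

if __name__ == "__main__":
    print(process_return_and_replace("ord-1001", "toaster"))
```

Even in this toy form, the value is clear: routine requests are handled instantly, while human agents remain available for anything more complex.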
Artificial intelligence is already a major focus for technology giants, and as a result, it will continue to play an increasing role in our personal and professional lives. What is important, is that businesses and consumers embrace intelligent assistants — not as a threat, but as the name would suggest, an assistant. ### Cloud ERP: The Risk and the Reward Enterprise Resource Planning (ERP) systems are the beating heart of organisations operating across multiple industries, from manufacturing to retail, public services to utilities. ERP enhances the management of core business processes, including planning, purchasing, inventory management, logistics and finance. Joining up the dots of different departments, removing silo working cultures, promoting transparency, and aiding efficient collaboration with suppliers and customers, it’s safe to say ERP has transformed the way businesses function for the better. Coupled with effective customer communication management, which controls the process of designing, defining the use and managing the destination of business documents within an ERP system, companies can simplify, systemise and economise a multitude of processes, from invoice generation to label creation. Implementation of customer communication management software can give businesses long term cost savings, through reducing the need for hardware, printers and excessive manual administration. Effective output management can also strengthen an organisation’s brand, by offering users a quick and easy means of creating and upholding consistent communication within customer-facing documents that are pre-aligned with corporate brand guidelines.   As consumers, we are becoming more comfortable with the cloud as a concept, from the weekly food shop to streaming music instead of hoarding CDs, and the acceptance isn’t far behind for the business community either. In fact, according to the Cloud Industry Forum (CIF), 63 per cent* of UK businesses are planning to move their entire IT estate to the cloud in the near future. And although some way behind other SaaS offerings, ERP systems have also begun the slow migration to the cloud, with giants like Microsoft launching the first cloud version of its flagship ERP system, Dynamics AX – now known as Dynamics 365 - in 2016. The slow yet steady shift is happening in the ERP market, and those who haven’t yet converted are beginning to consider the possible risks and rewards associated with the move. So, with leading ERP vendors investing heavily in cloud technology, what’s preventing more businesses taking the leap right now? Barriers to uptake [easy-tweet tweet="There's a lack of trust in cloud-stored documents with governments questioning its security" hashtags="Security, Cloud"] Although the adoption of cloud ERP has been steady and looks to be undertaken by many more UK businesses in the future, for companies in some countries there are more challenges to overcome first. Although technological advances are being made every day, legislation and regulations aren’t necessarily following at the same pace and, as a result, some countries are being restricted on the uptake of services, such as cloud ERP. For example, the law in Switzerland currently states that all data must stay in the country. There is an overall lack of trust in documents being stored in the cloud with some governments questioning its security. Therefore many businesses are reluctant to fully embrace the cloud and instead opt for an on-premise ERP solution. 
Market leaders like Microsoft are encouraging people to engage with cloud ERP and have focused efforts on launching and actively promoting cloud ERP solutions over on-premise alternatives. This investment demonstrates confidence in cloud and, where Microsoft leads, others will of course follow. However, this also presents a problem for some businesses, particularly in the manufacturing sector, where there is a reluctance to move to the cloud due to concerns about the possible implications for trade. A number of manufacturers work on remote sites where a reliable internet connection is sometimes hard to secure. Losing access to such a vital system could have a detrimental impact on a manufacturing business and cause any number of issues, from a delay in production to a breakdown in communication between production and logistics, leading to missed shipments to customers. For some businesses, in some sectors or certain locations, the risk of cloud ERP still outweighs the reward and, until legislation is updated and access to high-speed internet catches up with developments in cloud technology, maintaining an on-premise solution may be the best way forward. But when it works, it really works Although there are challenges for some, for most businesses the reality is that there are many rewards associated with converting to cloud ERP over sticking with an on-premise solution. For starters, cloud ERP solutions are much quicker to implement than on-premise systems, which can take anywhere between two and five years to fully install and can take longer to update, whereas an average cloud ERP implementation can be completed in under a year. Businesses can also enjoy large financial savings, as capital hardware costs are removed with the cloud ERP solution. In contrast, on-premise ERP systems require significant capital investments, which only increase when buying new hardware or high-end servers. On average, it’s considerably cheaper to implement a cloud solution than an on-premise one. The cloud solution carries further financial benefits, as, due to its subscription licence agreement, upfront costs are removed in favour of monthly fees, giving businesses more financial flexibility. Research has shown that concerns about cloud ERP security have fallen to 25 percent, from the 29 percent recorded last year, showing that there may be an opportunity for vendors to engage with businesses and challenge common misconceptions surrounding cloud ERP: tackle the question of security, reassure customers and promote the efficiency and cost savings available. Another selling point for most organisations looking to streamline their ERP function is the issue of maintenance. A solution hosted in the cloud ensures that maintenance is carried out directly by the vendor, meaning there is less chance of the system crashing or general aspects of the service failing. This also has the potential to reduce staffing costs, as there is less need for in-house IT support. Ultimately, there are many variables for businesses to consider when embarking upon an installation of ERP and customer communication software. To ensure customers’ needs are met, and all points considered, an individual assessment should be a vital first step. With the cloud ERP market still somewhat in its infancy, it may take a while for businesses to embrace it fully and for on-premise solutions to become obsolete, but there’s no doubt that cloud represents the future of ERP. 
A future full of rewards for those brave enough to take the leap and open the doors for others to follow. ### IBM and Nutanix Launch Hyperconverged Initiative IBM POWER and Nutanix Software target high-performance workloads delivered via one-click private clouds IBM (NYSE: IBM) and Nutanix (Nasdaq: NTNX) today announced a multi-year initiative to bring new workloads to hyperconverged deployments. The integrated offering aims to combine Nutanix’s Enterprise Cloud Platform software with IBM Power Systems, to deliver a turnkey hyperconverged solution targeting critical workloads in large enterprises. The partnership plans to deliver a full-stack combination with built-in AHV virtualisation for a simple experience within the data centre. In today’s technology landscape, processing real-time information is necessary but not sufficient. Being able to react in real-time used to give enterprises a competitive advantage, but this approach no longer guarantees happy customers. The value has now migrated to the ability to rapidly gather large amounts of data, quickly crunch it and predict what’s likely to happen next - using a combination of analytics, cognitive skills, machine learning and more. This is the start of the insight economy. Handling these kinds of workloads presents unique challenges - requiring a combination of reliable storage, fast networks, scalability and extremely powerful computing. It seems that private data centres designed just a few years ago are due for a refresh - not only in the technology but also in the architectural design philosophy. This is where the combination of IBM Power Systems and Nutanix comes in. This joint initiative intends to bring new workloads to hyperconverged deployments by delivering the first simple-to-deploy, web-scale architecture supporting POWER-based scale-out computing for a continuum of enterprise workloads, including: next-generation cognitive workloads, including big data, machine learning and AI; mission-critical workloads, such as databases, large-scale data warehouses, web infrastructure, and mainstream enterprise apps; and cloud-native workloads, including full-stack open-source middleware, enterprise databases and containers. With a shared philosophy based on open standards, the combination of Nutanix and IBM will be designed to bring out the true power of software-defined infrastructure - choice - for Global 2000 enterprises, with plans for: a simplified private enterprise cloud that delivers POWER architecture in a seamless and compatible way to the data centre; exclusive virtualisation management with AHV, advanced planning and remediation with machine learning, app mobility, microsegmentation and more, with one-click automation; a fully integrated one-click control stack with Prism, to eliminate silos and reduce the need for specialised IT skills to build and operate cloud-driven infrastructure; and deployment of stateful cloud-native services using Acropolis Container Services with automated deployment and enterprise-class persistent storage. [easy-tweet tweet="IT teams now recognise the need of embracing the next generation of data centre infrastructure" hashtags="IT,DataCentre"] “Hyperconverged systems continue on a rapid growth trajectory, with a market size forecast of nearly $6 billion by 2020. IT teams now recognise the need, and the undeniable benefits, of embracing the next generation of data centre infrastructure technology,” said Stefanie Chiras, VP Power Systems at IBM. 
“Our partnership with Nutanix will be designed to give our joint enterprise customers a scalable, resilient, high-performance hyperconverged infrastructure solution, benefiting from the data and compute capabilities of the POWER architecture and the one-click simplicity of the Nutanix Enterprise Cloud Platform.” “With this partnership, IBM customers of Power-based systems will be able to realise a public cloud-like experience with their on-premise infrastructure,” said Dheeraj Pandey, CEO at Nutanix. “With the planned design, enterprise customers will be able to run any mission-critical workload, at any scale, with world-class virtualisation and automation capabilities built into a scale-out fabric leveraging IBM’s server technology.” Pricing and Availability The IBM and Nutanix initiative will bring clients options and a seamless experience, and will be sold exclusively through IBM’s sales force and channel partners. Specific timelines, models and supported server configurations will be announced at the time of availability. ### In The Cloud The way players are purchasing lottery tickets is changing quickly. Players are embracing the ease and flexibility of being able to enter draws whenever they want, wherever they are, without having to walk to the local newsagent or drive to the nearest supermarket. This follows the trend outside of lottery, and in e-commerce in general, but the shift towards online and mobile play has been slow for state lottery operators. For most, their technological infrastructure is built on decades-old technology and is unable to deliver the experience savvy consumers have come to expect. It seems almost like a cliché, but for us, a cloud platform, and more importantly a cloud philosophy, can clear these hurdles. Our Insight platform is built on these principles – more than just throwing all our old hardware into AWS, we have included cloud philosophy in everything we do, from the information architecture to the way we build our APIs. This all powers our secondary lottery platform and means we can offer our partners a robust and flexible product fit for purpose. This makes us pioneers in our industry, with most suppliers and operators yet to fully embrace the philosophy. But ours is an industry that constantly straddles innovation and technological advancement, and as a company, we believe it is best to stay ahead of the curve rather than constantly playing catch-up. For lotteries and operators making a move online, cloud solutions mean they don’t have to consider the expense of establishing new data centres or putting their current infrastructure under pressure with an increase in activity and volume. It becomes a no-brainer, because our cloud philosophy also extends to our commercial model – we pass on the low risk and low entry barriers directly to our customers. By using a cloud-based online lottery platform, operators can future-proof their technological infrastructure while at the same time reducing overheads and maintenance costs. It also allows them to offer a smooth and seamless experience to their players, who expect to be able to complete transactions in seconds. This can prove challenging when it comes to huge multi-million dollar jackpots, which often see large numbers of players flocking to sites to purchase tickets ahead of the draw. 
Our platform uses the cloud’s almost infinite scaling abilities and microservices (and platforms such as Docker) so that it can take this sudden influx of activity in its stride, without compromising on the quality of experience enjoyed by the end user. [easy-tweet tweet="Using cloud computing makes for a quick and efficient integration process with lottery partners" hashtags="Data,Cloud"] When it comes to operating a successful online lottery product, consistency is key. Cloud solutions play a vital role in delivering this, both from a supplier’s perspective to their operator partners and from an operator’s perspective to their players. They are not prone to the same technological hiccups and downtime – a major plus for all stakeholders. From a platform supplier’s perspective, using cloud computing makes for a quick and efficient integration process with lottery partners. Deploying software updates is much quicker, meaning they are always using the most advanced platform to deliver the best possible experience to their players. That is what a cloud philosophy is all about: using technology to provide a fast, reliable, solid and flexible service, and to help platform suppliers and online lottery operators improve the player experience while drastically reducing costs. As online lottery continues to become more popular with players around the world, operators will have to embrace cloud-based solutions to handle the major uptick in traffic and to ensure their product offering continues to meet player expectations. It also helps meet ever-changing regulatory requirements. Winners Group is licensed by the Alderney Gambling Control Commission, and we have worked together with them to facilitate our transition into a cloud-based company. It makes us cutting-edge, innovative, and a standard-setter in the industry. But those that switch to cloud-based solutions now will see a big increase in performance which will help grow their share of wallet in the short term, while also putting them firmly in the driving seat to capitalise on market growth over the coming months and years. Ours is a fast-moving industry, and cloud-based solutions are the only way to keep pace. ### The Need for Speed CommScope’s New High-Speed Migration Platform Prepares Data Centre Managers for Decades of Increased Bandwidth HICKORY, N.C., May 16, 2017 — As the world embarks on a potential fourth Industrial Revolution, it is the 30 to 50 billion connected devices that will spur the unprecedented growth in bandwidth. The next few years are critical to building the networks that will meet the demand. In anticipation of this need, CommScope, a leader in communications network infrastructure solutions, is introducing its High-Speed Migration platform, which assists data centre managers with building faster, more agile, high-density migration plans. “With our High-Speed Migration solutions, we’re able to help data centre managers accelerate the growth of their DC capacity and the speed of their digital transformation initiatives,” said John Schmidt, vice president of Global Data Center Solutions at CommScope. “We are quickly moving from 10Gb/s and 40Gb/s to 100Gb/s, 400Gb/s and beyond. The more data consumers and network users need, the more services they expect, the more critical speed becomes. This is a global phenomenon and one of the top challenges that data centre managers will face.” CommScope’s High-Speed Migration portfolio works for duplex and parallel applications and allows customers to decide on the best approach to architecture. 
It also supports higher speeds and emerging applications without having to rip and replace. More than that, the platform allows CommScope to act as a trusted partner, with our highly trained team of network architects who understand a customer’s business needs and provide insight into future data centre ecosystems and technology trends. The following innovative solutions are part of the first phase of the High-Speed Migration platform and support current and future high-speed applications: MPO connectivity options: 24-fiber connections that ensure lowest-first-cost duplex deployments with a single connection, 12-fiber to support the seamless expansion of legacy 12-fiber infrastructures, and 8-fiber to support QSFP technologies, providing for customers utilising this parallel optic configuration. Fibre optic panels: Ultra- and high-density panels that simplify management of duplex and parallel ports for dynamic migration and flexibility. Ultra-Low Loss (ULL) performance: Ultra-low loss pre-terminated components enable longer link spans with more connectivity options and guaranteed support for attenuation-sensitive applications. LazrSPEED® WideBand OM5: Part of the flagship CommScope SYSTIMAX® portfolio most recently designated OM5 by the ISO/IEC. It enhances the ability of short-wavelength division multiplexing to provide at least four-fold increases in usable bandwidth while maintaining backwards compatibility with legacy multimode fibre. imVision®: The automated infrastructure management (AIM) system that gives oversight and control to SYSTIMAX physical network connectivity solutions. [easy-tweet tweet="Delivering connectivity resources at scale has become a competitive differentiator" hashtags="Data, Resource"] As part of CommScope’s entire High-Speed Migration strategy, solutions include all fibre types (multimode and single mode), intelligence and customised connectors for all data centres. Delivering connectivity resources at scale has become a competitive differentiator, per Jennifer Cooke, research director in IDC’s Datacenter Trends and Strategies group. Digital transformation initiatives are built on making data-driven decisions, and the ever-increasing need for speed will challenge network and data providers soon. “Preparing to meet this challenge requires not only investment but a strategy and vision to transition when and where resources are needed,” said Cooke. “CommScope’s high-speed migration services meet a critical need in the market by providing a bridge between current and future demand and supporting a strategic vision for data centre and network transformation.” CommScope will host a Twitter chat on May 17, from 10:00 a.m. to 11:00 a.m. CDT, to answer questions about high-speed migration. Use #COMMtweets to tag your question at the company handle, @CommScope. Click here to download approved CommScope data centre images, as well as the Ethernet Alliance’s Ethernet roadmap. It shows the dramatic ascent to 800Gb and 1Tb, which is expected to occur as soon as 2020. ### Snow Software Reveals Visibility Into Microsoft Office 365 London, May 16, 2017 – Snow Software, which helps more than 6,000 organisations around the world reduce their software spend, today announced Snow for Office 365, which extends the ability to manage software licensing and availability to Microsoft Office users across all platforms – desktop, cloud and mobile. 
Snow for Office 365 empowers organisations to integrate the management of the world’s most popular SaaS application with existing software investments, eliminating over-spend and ensuring that users are on the optimal subscription plan, saving up to US $27 per user per month. In addition to detailed, actionable intelligence, Snow for Office 365 provides automated, approval-driven workflows to attack administrative costs, which can be as much as 93% of Office 365 licensing expense. Organisations can leverage automation to support self-service access requests, redeploy unused licences, shift users to lower-cost plans based on usage and provision time-limited subscriptions of expensive applications such as Project 365 or Visio 365. [easy-tweet tweet="Many organisations are over-spending because they're unable to accurately assess their needs " hashtags="SnowSoftware, Data"] “We need a single view of our entire Microsoft spend regardless of whether it is on-premise, in the cloud or on mobile. This will enable us to further optimise our technology spend,” says Carola Iberl, Global License Manager at DEKRA. “Feedback from our customers shows that, while moving applications and infrastructure to the cloud is initially very financially attractive, many organisations end up over-spending because they are unable to accurately assess their real needs and actual usage,” says Peter Björkman, CTO of Snow Software. “Snow for Office 365 ensures that, from day one, organisations avoid over-committing to seemingly-lucrative subscription deals and gives IT leaders the insight they need to get the right levels of service for the right price.” For more information on Snow for Office 365, visit: www.snowsoftware.com/int/SnowforOffice365 ### Access to Enterprise Technology - #Cloudtalks with Cognition Foundry's Ron Argent In this episode of #CloudTalks, Compare the Cloud speaks with Ron Argent, Founder & CEO of Cognition Foundry. Ron speaks about how Cognition Foundry enables start-up companies to have access to enterprise technology. He explains how they are working with the Plastic Bank to help construct a systems design where they could develop and deploy a digital currency that is contained within the plastic supply chain. #cloudtalks Visit the Cognition Foundry website to find out more: http://www.cognitionfoundry.com/ ### What do you know about Technology and The Law? Monday 15th - Disruptive Tech is launching a brand new talk show with legal expert Frank Jennings called “Technology and The Law”. If you are interested and want to learn more about the legal aspects of, and debates surrounding, the technology industry, then tune in! This show is airing for the first time on Friday 19th May at 2:30 pm UK time, where Frank will be speaking to four tech industry experts. This week he will be focussing on GDPR and Data Protection, looking into the current regulations and how people perceive existing laws. [easy-tweet tweet="Disruptive Tech is launching a brand new talk show “Technology and The Law”" hashtags="Technology,Law"] These episodes will run for 6 weeks, at 30 minutes each, focussing on a variety of different topics, such as the Cloud, social media and much more. So if GDPR isn’t your cup of tea, tune into an episode that best suits you. There will be four different experts, every episode, who will come in to speak to Frank about the topic of the week. Frank Jennings is the commercial and cloud lawyer at Wallace LLP. 
He specialises in commercial contracts covering cloud and technology, data security and intellectual property. Frank is known for his can-do attitude, managing risk and problem solving to maximise return on investment. Who else could present this show? Disruptive Tech TV, part of the Compare the Cloud network, is the UK’s first 24/7 online technology TV channel. They cover some of the biggest technology events in the world as part of their live shows, produce original content, explore cutting-edge tech and interview the disruptive voices in the industry. Show details can be found here: https://www.disruptivetech.tv/techlaw/ Hashtag: #TechLaw Twitter: @disruptivelive ### Why 2017 Should be the Year of the Cloud Champion Migrating your organisation to the cloud can be a complex business, yet it’s become an important aspect of technology strategies within today’s enterprises. Recent research with 900 IT leaders, carried out by cloud communications provider Fuze, reveals that 97 per cent will have a formal cloud strategy in place by the end of 2017 and 81 percent say they already have a dedicated cloud champion in place who is responsible for driving cloud-related initiatives. With fewer than 10 percent of IT leaders confirming they have no plans to move critical business functions such as communication, HR, storage, security and CRM to the cloud, it appears that cloud computing technology is set to infiltrate almost every part of the enterprise in 2017. There’s admittedly some way to go for today’s IT leaders to deliver on their cloud plans. For example, in North America, 45 percent of enterprises have implemented a cloud migration strategy across their entire organisation, yet this figure falls to around a quarter in France and Germany and just 10 per cent in the UK. IT leaders have clear intentions to embrace the cloud and have the strategy and cloud champion in place to accelerate enterprise-wide initiatives. Here are four reasons why 2017 needs to be the year of the cloud champion, where the best intentions of IT leaders become a reality that drives business success: The App Generation is coming Today, businesses are on the cusp of welcoming a disruptive force into the workplace – the App Generation. Having grown up never knowing a world without smartphones and on-tap internet access, these teenagers (aged from 15 to 18 years) will introduce an entirely new dynamic for IT departments, with their expectation of being permanently connected. Three-quarters of the App Generation expect to be able to use the latest technology at work, and they will demand technologies that fit with the way they want to work, interact, and collaborate. At the heart of this trend is an expectation that has been set by consumer devices, with IT leaders under pressure to introduce and deploy technologies with the same usability, accessibility, and seamless integration that users experience as consumers. Your workforce will become super mobile and super connected Flexible work arrangements aren’t a new concept. Home working, flexi-time and job sharing have been in place for years within many organisations. The UK Flexible Working Regulations 2014 then formalised the opportunity for all employees (not just parents and carers) to request flexible working to suit their needs. The majority of today’s workforce (83 percent) already believe they don’t need to be in an office to be productive. 
The arrival of the app generation into the workplace will only serve to accelerate the demand to abandon traditional nine-to-five hours and work flexibly. As members of the app generation advance in their careers, flexible working will no longer be considered a benefit – it will be an expectation. As flexible, multi-location and multi-device working become the status quo, IT leaders will be challenged to support this requirement across the organisation. This trend will put the spotlight firmly on communication and collaboration for IT leaders. Increased workforce mobility and flexibility need a new approach to technology – one that enables effective communication and a consistent experience for every employee, regardless of how and where they work. It’s no surprise that already, 59 percent of IT leaders are treating the adoption of new communications platforms as a top priority for 2017. Removing complexity will drive innovation More than three-quarters of IT leaders (78 percent) believe the IT department’s ability to innovate is critical to business success. Unfortunately, for the average CIO, time to plan and work on innovations is one thing that is often in short supply. Only 37 percent of IT leaders say they are spending adequate time on innovation. Instead, today’s IT teams are focusing their time on managing IT platforms and resolving user issues. Given this situation, is it any wonder that so many are struggling to manage a cross-department migration to the cloud? [easy-tweet tweet="Voice, video and collaboration applications can be provided on a single cloud platform" hashtags="Cloud,IT"] Communication and collaboration is one of the main areas where IT leaders are experiencing a productivity drain – managing and maintaining the applications, providing ongoing support to users and dealing with the complexity of multiple apps and on-premises equipment. On average, enterprises today are running 12 applications for video calling, voice conferencing and live chat alone. The cloud offers numerous opportunities to unleash IT teams from the constraints of day-to-day communications operations. Voice, video and collaboration applications can be provided on a single cloud platform and delivered to any user on-demand, while the cost and time associated with on-premises equipment can be eradicated. With a cloud champion, enterprises will have a sponsor to drive the simplification of IT infrastructures, reducing the number of applications and providing workers with easy-to-use, single-app alternatives. In doing so, IT can dramatically transition from an operational function to a driver of innovation and fresh approaches. An internal advocate will accelerate enterprise-wide adoption A cloud champion will provide a much-needed strategic focus for any cloud migration. While the intention of nearly all IT leaders is to have a formal cloud strategy in place by the end of 2017, there’s some work to do to accelerate cloud initiatives. Today only 30 percent have implemented one across the entire organisation and 36 percent only have ‘part’ of a cloud migration strategy. Without an enterprise-wide strategy in place, many businesses will find themselves faced with a piecemeal approach to cloud migration, moving one technology at a time. This disjointed vision leads to a lack of continuity across departments, with little consideration for how technologies will integrate or how this will impact the user experience. 
[easy-tweet tweet="97% of organisations anticipate having a formal cloud strategy in place" hashtags="CloudStrategy, Cloud"] With a cloud champion, IT leaders can create a ‘broker’ for all enterprise cloud services, one that gives cloud migration the consideration it deserves. The organisation’s enterprise-wide migration can be strategically aligned to the needs of the entire business, without having to stop or stall the day-to-day operations of the IT department. Clouds on the horizon When it comes to the cloud, the future is in sight. Two-thirds of business leaders already have a formal cloud strategy implemented across all or part of the business, and 80 percent of companies have an internal champion driving cloud migration. By the end of 2017, 97 percent of organisations anticipate having a formal cloud strategy in place, and 92 percent will have an internal champion driving this. Embracing the cloud is the first step toward staying ahead of the curve. However, IT departments will only be able to drive real innovation and competitive advantage when they define how to make a move, plan what to strategically migrate and ensure full integration once working within the cloud. ### Gone in 60 seconds - The Grim Reality of Ransomware The damaging effect on business extends far beyond data loss, and a lack of knowledge means businesses are at risk of immeasurable loss Monday 27th February 2016. New research from Timico, an end-to-end managed cloud service provider, in partnership with Datto, a business continuity solutions provider, reveals that Ransomware attacks on UK businesses cause unquantifiable financial cost and immeasurable data loss. Despite this, there is an alarming lack of awareness when it comes to being prepared, with two-thirds of UK businesses having no official Ransomware policy to guide employees on what to do in the event of an attack. Ransomware is malicious software designed to block access to a computer system until a sum of money is paid. It is the biggest global cyber threat to business, with reported incidents increasing in frequency and complexity and financial demands escalating. The National Cyber Security Centre weekly threat report predicts that new innovations in Ransomware are already happening, such as targeting internet-connected devices to create a “Ransomware of Things.” 60 seconds or less to shutdown The research report, entitled ‘The Reality of Ransomware’, polled 1,000 UK organisations, all of whom were Ransomware victims, many attacked within the last 12 months. The research found that well over half (68%) of respondents said that the effects of an attack were almost instant, with data systems going from fully functional to essentially useless within seconds or minutes. Nearly a quarter (23%) reported lockdown within just a few seconds, and 18% said that systems were down within a minute of the attack. A further 26% reported systems being blocked within a few minutes. The drastic effect on business The Timico and Datto research found that, for the majority (85%) of companies that have fallen victim to Ransomware, systems were down for a week or more, causing £1,000s in financial damage a day to most businesses. A third (33%) had their data down for more than a month, with 15% reporting their data as ‘unrecoverable.’ But retrieving data is becoming increasingly difficult for organisations. The ransom fees, demanded by cyber criminals before they will unlock the victim’s computer system, are rapidly rising. 
Nearly a quarter (23%) of respondents paid over £5,000 to retrieve their data and 26% paid a fee of between £3,000 and £5,000. Higher Ransomware fees in large corporates were reported, with a third of corporate businesses paying over £5,000 to recover data, compared to just half that number of SMEs (15%). The largest proportion of SMEs (35%) paid a ransom fee of between £500 and £1,500. The true cost of Ransomware is a financial abyss The full extent of the cost of an attack on the business is often unknown. Nearly a third (29%) of those polled could not even estimate the overall financial cost to the business of the Ransomware attack, deeming it ‘unquantifiable’. Over half (53%) of respondents estimated that the attack had cost the business between £1,000 and £2,000 per day in lost revenue, due to its data systems being down. Lack of guidance leaving organisations vulnerable to loss Despite Ransomware being well reported as one of the biggest, evolving threats to organisations worldwide, many staff within UK companies would have no idea what to do in the event of an attack. Timico and Datto found that a staggering two-thirds (63%) of UK businesses have no official Ransomware policy in place to guide staff on what to do when an attack occurs, leaving them vulnerable to huge and unquantifiable financial and data loss. [easy-tweet tweet="A ransomware attack can have a debilitating effect, with long-term consequences" hashtags="Ransomware,CyberAttack"] Nabeil Samara, Chief Digital Officer at Timico, says: “These research findings clearly show that the speed of a ransomware attack is almost instant, while the effects on the organisation can be far-reaching. “It’s not just a case of the data loss and financial cost to the business. A ransomware attack can have a debilitating effect, with long-term consequences across the business, with the company even breaching the terms of any regulatory bodies that the business holds itself accountable to. “The Government has now launched its Cyber Schools Programme, but education needs to extend beyond the classroom and into the workplace. It’s critical that all organisations, no matter what size, acknowledge the increasing and evolving threat of Ransomware as attacks become ever more frequent, and instil a policy, regularly updated, to educate staff on what to do if the business comes under attack. Protection and communication are key to the difference between success and failure and will save the business infinite costs in the long run.” Andrew Stuart, Managing Director, Datto EMEA, says, “The high quantity of strains and constant evolution mean traditional signature-based anti-virus isn't effective against this threat. A Backup and Disaster Recovery solution which utilises the Cloud can effectively defend against ransomware. This creates regular encrypted backups of your data, and maintains prior versions. In the case of a ransomware attack, businesses can easily 'roll back' their data to an uninfected version, so no ransom needs to be paid. And of course, this second copy has the added benefit of preventing data loss via accidental deletion.” TIMICO’S TOP TIPS FOR PREPARING FOR (AND PREVENTING) A RANSOMWARE ATTACK 1. Get senior stakeholder buy-in, so all company Ransomware prevention and response policies are communicated and enforced from the top. 2. Be proactive with your backup policy, and above all test on a regular basis. 3. Educate your users not to open or click on suspicious-looking emails or attachments. 4. 
4. Up-to-date antivirus software should be considered essential.
5. Don’t get complacent – audit your historic backups; this is imperative if you have a multi-vendor solution in place.
6. Understand your Recovery Time Objective (RTO), i.e. how long can you afford to be down for?
7. Understand your Recovery Point Objective (RPO), i.e. how much data can you afford to lose?
8. Encourage your users to keep their work and personal data and apps separate.
9. Don’t pay the ransom! It’s still highly unlikely you will get your data back, or if you do it will be in an unreadable format.
10. Do report the crime to the police – many don’t, and as such attacks go under the radar. Don’t let cyber criminals get away with it!

### AWS Outage Reinforces the Need for Hybrid Cloud Amazon Web Services’ (AWS) recent outage resulted in an entire day of lost services for its clients and reinforced the point that relying solely on a public cloud infrastructure can have drawbacks. 5nine Software’s Morgan Holm looks at why hybrid cloud is the best solution for most businesses. When Amazon Web Services (AWS) suffered a catastrophic outage recently after its servers crashed, millions of companies were left reeling from the disruption – everything from large websites to people’s smart homes and even library catalogues was rendered temporarily useless. This has understandably raised a question mark over the reliability of the public cloud as a sole solution for businesses, particularly as AWS has the biggest share of the market for cloud-based providers – bigger than the next two providers (Microsoft and Google) combined. Organisations need to evaluate their public cloud workloads for disaster recovery scenarios. However, while many companies have been focused on moving to purely cloud-based infrastructures, a growing number are realising it is not necessarily an appropriate model for their organisations, whether due to the age of the company, its size, or simply a preference for control or for maintaining “data sovereignty”. As such, a natural middle ground has emerged in the shape of the hybrid cloud. The benefits of a hybrid cloud solution are wide ranging. In the first instance, depending on when organisations were established and whether they are carrying any “legacy baggage” – for example, those organisations that went through virtualisation during the early 2000s – they will need to keep those legacy systems in place until they have had time to “retool” and factor them into a new system. A scenario like this makes hybrid cloud the obvious choice. [easy-tweet tweet="Public cloud structure is essentially a cost-benefit structure" hashtags="PublicCloud,Cloud"] The other major challenge that businesses face is a desire to maintain complete control of their data onsite. If your datasets and applications are a core part of your business, it’s understandable that you would want to retain that data on your premises and within your control. Whether that is down to regulatory compliance or maintaining data sovereignty (where that data resides, who has access to it, how it is secured), it all comes down to having the ability to manage those essential elements that are central to your business. A big part of what makes the public cloud attractive is essentially its cost-benefit structure, especially if you have a business with a periodic need to scale up.
For example, a company that performs tax audits sees seasonal spikes in its business, so maintaining and managing infrastructure components on a year-round basis doesn’t make sense. Having the ability to use the public cloud is therefore the perfect solution. Of course, security is a concern for public cloud. The cloud providers themselves provide robust security systems and have sophisticated teams focussed on their security, but essentially the real responsibility for maintaining the security of the applications and data is down to the end user. If you open up an application and leave it exposed to the public cloud, and this then results in a data breach, that’s not the responsibility of the cloud provider. Maintaining secure processes is a necessity whether you use the public cloud or not, but it’s a reason why, for some people, it’s a risk too far to transfer everything into a public system. Many organisations also maintain various public-facing information systems. For example, if you have a website or a documentation centre accessible to customers, this is all in the public domain, and as such creates fewer of those security concerns when moving to the public cloud. These scenarios are part of why we’re seeing a real mix of both public and private cloud use. With the AWS outage, the result was hours of zero access to services for users in a particular region. The effect of this (though dependent on the individual user’s business model), it’s reasonable to presume, is that those Amazon users that relied on AWS to host all of their data would have been completely immobilised during that period. This is understandably going to temper people’s enthusiasm for public cloud as a singular structure for their business. However, with hybrid cloud, there’s no need to throw the baby out with the bathwater. You may want to keep your core applications on premises, but if you need the elasticity of the public cloud, with its ability to scale up when you actually need that capacity, workload mobility really becomes key to the solution for businesses. What companies need is flexible, scalable and “always-on” provision. This ultimately means that workloads need to be agile. We need to be able to move from public to private seamlessly, but equally – as the AWS outage highlighted – we also need to be able to do the same between the public cloud providers themselves. While the latter is a little way into the future, the former is here now. ### Data Protection - #Cloudtalks with Sire's Russell Cook In this episode of #CloudTalks, Compare the Cloud speaks with Russell Cook, Managing Director of Sire. Russell speaks about how Sire focuses on client data protection and on technology that supports data 24/7. He also speaks about the two levels of protection: protection data and production data. #cloudtalks Visit Sire's website to find out more: https://www.sire.co.uk/ ### Kx technology to support RxDS

- Kx selected by RxDS as the platform for its data analytics solutions
- Solutions based on Kx will unlock the data within pharma and healthcare companies to allow better and more rapid decisions
- RxDS aims to disrupt a market valued at US$ 24.6 billion by 2021

LONDON/PALO ALTO, CA – Global technology specialist FD announced on 11th May that RxDataScience Inc (RxDS), an innovative start-up focused on the pharmaceutical and healthcare landscape, has selected Kx technology to power its data science platform.
The agreement will allow RxDS to target the global healthcare analytics market, expected to reach US$ 24.6 billion in 2021. RxDS’s selection was driven by the extreme performance that the Kx technology stack provides for linking and analysing large volumes of data from multiple sources. Kx will allow RxDS to generate actionable insights for its customers through its unique capability to ingest, process and analyse huge volumes of historical and streaming data in real time. In particular, Kx is well suited thanks to its ability to combine different, large datasets to provide a holistic view for clients within a single platform. [easy-tweet tweet="Kx will power a wide range of RxDS solutions across the Pharma value chain" hashtags="Kx,RxDS"] RxDS has extensive experience within the pharmaceutical industry, using data science and machine learning to solve complex business challenges. Kx will power a wide range of RxDS solutions across the Pharma value chain, generating predictive analytics insights from terabyte volumes of patient medical, claims and lab records incorporating billions of rows of data. These solutions are purpose-built applications such as Rare Disease Patient Finder, Longitudinal Patient Data Analyzer, New Product Launch Accelerator, and Predictive Managed Markets Analytics. These innovative solutions are quickly tailored for individual customers’ markets and products, generating immediate and significant business value. Larry Pickett, Chief Executive Officer of RxDataScience, commented: “Through its ability to rapidly link and query very large data sets commonly found in the healthcare and pharma markets, Kx is the ideal platform to enable delivery of our data science applications. Its lightning fast in-memory columnar time series database provides unmatched speed and flexibility to support our customers’ ability to rapidly identify trends and enable better decision making across their organisations. Until now, these sophisticated analytical technologies were only used by the most prestigious financial services firms on Wall Street – we are now bringing Kx technology to pharma and healthcare.” Brian Conlon, Chief Executive Officer of FD, commented: “RxDS is led by experienced pharma and healthcare experts with strong products and a solid development roadmap to deliver high-value solutions to their customers. Combining this domain expertise with our Kx technology provides the capability to disrupt these valuable and growing markets.” ### How to Stop Banging Your Head Against the (Data)brick Wall For many organisations, the prospect of realising functional applications in Azure Databricks may seem little more than a distant possibility. For others, this will be something that has been achieved and taken as a given. In either scenario, there is always more that can be done to improve the performance of applications. While moving to the cloud is an important first step, there is considerably more that needs to be done once that migration is completed. Jobs can often be inefficient, leading to a gradual accumulation of cost that is ultimately felt by the business once it reaches a critical mass. No matter how finely tuned those jobs are, they can generally be refined further to generate greater efficiencies and an optimised data ecosystem. Business Intelligence is Power Optimising your workloads in Azure Databricks fundamentally boils down to operational intelligence.
By having clear insights into what applications are running, how they are utilising resources, which users are accessing them, and at what times this is all occurring, a broad picture of the data landscape can be drawn. This kind of granular detail leads to faster diagnosis and resolution of application problems. For example, if an app is failing, rather than spending considerable time finding the metrics needed to inform your root cause analysis, we can instead draw these conclusions right away. This minimises downtime of what could be business-critical applications. So what kind of intelligence does the data team need to draw these insights? Importantly, they will need a detailed view of usage breakdown and trending over time across workspaces and users. Equally, it is essential to have visibility into data usage. This should cover things like what tables are being accessed, when they’re being accessed, by whom, and from which applications. Consideration should also be given to the extent of cluster usage, which can lead to the identification of hot, cold, and warm tables. While such comprehensive data may seem excessive, it allows for the discovery and improvement of efficiencies in resource utilisation. This allows the data teams to determine things like what resources are being used versus what is available, or to see which applications are running at any one time. After all, it is impossible to ensure that resources are being utilised appropriately without this vital context. When we identify an instance where resources are not being utilised appropriately, we are in a position to redefine the cluster in terms of what is being provisioned. Data as a Guide To illustrate what this means tangibly, let us say that we have three workspaces split out in our Azure Databricks environment. As is common in most organisations, each distinct job function has its own workspace. In our example, these three are finance, engineering and HR. Using all the aforementioned metrics, we quickly identify that the finance team is the biggest user by a considerable margin. This should immediately indicate to the data team that this needs to be looked into. First, we should ask whether there are some inefficiencies at fault here or whether this dominant usage is essential to the business function and justifiable. If it is an inefficiency, we can use further metrics to investigate what the potential cause might be and how we can go about remediating it. When we have comprehensive data on app performance and usage, we can extrapolate the ‘why’ that explains the changes in usage, which goes a long way in this kind of decision-making. Such comprehensive metrics also have uses in the longer term. While they are certainly indispensable in resolving job issues - as the above example shows - they also provide value in a broader sense when it comes to ROI conversations. With granular metrics on who is using which clusters for their applications and to what extent, we can assess which are providing value to the business and which can be optimised. Moreover, with sufficient data to hand, we can even project what these costs will look like in the future and ensure we have continued cost visibility. The ability to forecast in this way allows for better budgeting and provisioning, which can be felt more widely in different business functions. In summary, effective cost management is informed by intelligence. The more we have, the better we can optimise.
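To make this kind of usage intelligence a little more concrete, the sketch below pulls a basic cluster inventory from the Databricks Clusters REST API and tallies running clusters per creator, a crude proxy for the per-team breakdowns described above. It is a minimal illustration only: the workspace URL and token are assumed to be supplied via environment variables, and grouping by creator is one possible convention rather than anything prescribed by the article.

```python
import os
import requests
from collections import Counter

# Illustrative assumptions: a Databricks workspace URL and a personal access
# token are available in the environment.
HOST = os.environ["DATABRICKS_HOST"]    # e.g. "https://adb-123456.7.azuredatabricks.net"
TOKEN = os.environ["DATABRICKS_TOKEN"]

def list_clusters():
    """Return the cluster inventory via the standard Clusters API."""
    resp = requests.get(
        f"{HOST}/api/2.0/clusters/list",
        headers={"Authorization": f"Bearer {TOKEN}"},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json().get("clusters", [])

clusters = list_clusters()

# Tally running clusters per creator as a rough usage breakdown.
running_by_creator = Counter(
    c.get("creator_user_name", "unknown")
    for c in clusters
    if c.get("state") == "RUNNING"
)

for creator, count in running_by_creator.most_common():
    print(f"{creator}: {count} running cluster(s)")
```

In practice the same idea extends to job run histories and table access logs, which together give the workspace- and user-level views the article argues for.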
Gathering this information, and finding ways to act on it, is a process that will pay considerable dividends for cost management and improvements in performance. To become an organisation that can diagnose and resolve app problems fast, this cannot be overlooked. More pressingly, users can enjoy greater productivity by eliminating the time spent on tedious, low-value tasks such as log data collection, root cause analysis and application tuning. ### Movement of Technology - #Cloudtalks with Cetus's Paul Kiveal In this episode of #CloudTalks, Compare the Cloud speaks with Paul Kivel, Strategic Development Director of Cetus Solutions. Paul speaks about how he looks across three key areas: agile infrastructure, mobile workspaces and security. He goes into greater detail and explains how the movement of technology has changed each sector. #CloudTalks Visit the Cetus Solutions website to find out more: https://cetus-solutions.com/ ### Who is your 2017 Internet Hero and Villain? ISPA is once again calling on the public to submit their nominations for the annual Internet Hero and Internet Villain awards at the 2017 ISPA Awards. In what has been a momentous 12 months, now is the time to point the finger at the good, the bad, and the ugly of the Internet world. Will Brexit or the rise of Trump have an impact on the shortlist, or perhaps the development and innovation of broadband deserves a mention? If the proliferation of Fake News has got you itching with rage, or you long to sing the praises of a tech-savvy MP, now is the time to nominate. The surveillance and digital security debate shows no sign of going away anytime soon, and we look forward to a slew of entries nominating those who honour digital civil liberties and lambasting those who, er, perhaps don't. Maybe the Digital Economy Act deserves a nomination, and with an election campaign now seemingly not complete without some form of hacking or security breach, there is plenty to consider this year. [easy-tweet tweet="The Internet Hero and Villain... to recognise those who have impacted on UK Internet industry" hashtags="IT,Technology"] Last year saw Apple take the crown for their admirable commitment to defending the fundamental principles of encryption and customer privacy. Mossack Fonseca Panama were damned as Internet Villain of the year for their poor cyber security practices. Don't delay – nominate today. Now is an opportunity to recognise those who have done the most to help or hinder the internet industry in the past 12 months. Announcing the search, ISPA Secretary General Nicholas Lansman said: “The Internet Hero and Villain awards are how we recognise, in a fun and light-hearted manner, those who have, for better or worse, impacted on the UK Internet industry in the previous 12 months. We want as wide a range of suggestions as possible, so we're calling on the public for their help in getting their ideas across to us.” Nominations can be submitted to ISPA via email (awards@ispa.org.uk) or via Twitter (@ISPAUK) using the hashtags #InternetHero and #InternetVillain. We’re taking nominations up until Monday 5th June. The winners will be announced at the gala ISPA Awards ceremony taking place at The Brewery on Thursday 13 July 2017 at Café de Paris in central London. There are a limited number of free media spaces available, so please email pressoffice@ispa.org.uk. Tickets include a drinks reception, three-course meal plus wine and entertainment late into the night.
### Freedom Around Accountability - #Cloudtalks with Emerge’s Jow Arif In this episode of #CloudTalks, Compare the Cloud speaks with Jow Arif, founder of Emerge UK. Jow talks about how the business was formed from the concepts of freedom around accountability. Jow also talks about how there isn't enough trust in the workplace. #Cloudtalks Visit Emerge UK's website to find out more: emerge.uk.com/ ### Moving To The Cloud, The Right Way Opting for a cloud environment is now a mainstream way of thinking for UK businesses regardless of size or genre. Indeed, recent statistics show that, in the next 15 months, 80% of all IT budgets will be committed to cloud solutions. Hybrid cloud adoption grew 3X in the last year, increasing from 19% to 57% of organisations surveyed. This indicates that business attitude towards cloud is maturing and enterprises are realising the vital role it plays in business operations. Unsurprisingly, hybrid cloud has become the cloud model of choice for many organisations that want to build greater flexibility into their infrastructure in a cost-effective and future-proof manner. What’s not to like about hybrid? You get to keep some of the simple data on premise yet free up your existing infrastructure by moving critical elements to the cloud, all the while relying on your cloud provider to have the latest provisioning and security in place that helps to ensure you keep abreast of industry regulation and legislation. Every business’s number one priority is to deliver the best service they can to clients and customers which are another key reason why cloud deployment is an increasingly key focus. Businesses like Marotori are a notable example of how the cloud is transforming the way they work and meet the needs of their business and their customers. Marotori is a hosting company that had grown out of servicing its clients’ needs. The business provides niche market hosting facilities for clients and partner companies. The challenge they faced was that they were originally hosting data out of a data centre in a location that was simply too far from them. Moving to a better location, in London, was absolutely critical to the future of the business. “One of the biggest challenges we’ve faced over the years as a business has been with the hardware. It’s a tough call to make as there’s a lot of profit to be made in providing the hardware to customers as well as the hosting services but it requires a substantial investment," said Rob Thomson, Director at Marotori. [easy-tweet tweet="Moving to a cloud environment, can provide great benefits to the way businesses operate and services to clients" hashtags="Cloud,CloudPlatform"] “We made the decision to migrate approximately 80% of our business over to 4D Data Centre’s cloud platform last year. Everything with the migration went perfectly. By moving over to the cloud, the biggest risks to our business, such as having a hard disk fail, have been removed. What’s more, with this migration, we’ve been able to consolidate our rack equipment down from three full racks to just over one. At the moment, we have to maintain some racks due to the nature of some of our financial service clients, however, in the long term, we will be moving to a pure cloud environment.” Moving completely to a cloud environment, if done properly, can provide great benefits to the way businesses operate and enable them to provide a better service to clients regardless of industry sector. 
Choosing the right cloud provider, therefore, who will manage your migration to the cloud and provide ongoing support is vital. It’s not all about cost and time-saving. I would strongly advise that businesses find a partner that can help them migrate at their own speed. Not everything needs to be migrated away from the existing infrastructure and, in fact, there may be good reasons to select a ‘hybrid colocation’ solution, mixing the existing physical infrastructure with the flexibility of a virtual one. It’s for this reason that the geographic proximity of a cloud provider to the existing business is critical. A local cloud provider not only offers speedy access to data and servers (for major upgrades or changes), but it provides optimal connectivity speeds (low latency equals better performance) as well as the reassurance that the business will meet any legal requirements to keep data within country borders, which is extremely important given the incoming General Data Protection Regulation (GDPR). GDPR makes data storage and security even more critical therefore it’s important when sourcing a provider that businesses are clear about where their services are being used and where the data resides. Don’t be seduced by cost savings from foreign cloud providers, as it could land your company and clients in hot water with national regulators. Selecting a good quality, a local cloud provider that offers support and access to virtual servers and data will reap dividends in the long run (as well as providing peace of mind). If they allow you to store your physical servers on-site (colocation) and plug them directly into their cloud infrastructure (hybrid cloud) then you could end up with the best of both worlds. Communication and support are also crucial, so selecting a supplier who can meet tech support requirements, day or night, 365 days a year is a must. Ultimately, choosing a cloud provider is like handing over your keys to your home or car - you’re giving them access to and control of something so precious - your data - and where your data is stored and how it is managed. Before you do this, therefore, make sure they’re a genuine, established partner that you’ve heard good things about and that you trust implicitly. After all, in today’s digital era, data is a company’s crown jewels and the way businesses treat and protect their data will govern the very reason why businesses succeed - reputation. For more information about the cloud services 4D Data Centres can provide to your business, visit: https://www.4d-dc.com/cloud     ### Unfragmenting Security with Threat Intelligence It has often been said that complexity is the enemy of security. It is a simple statement but, nonetheless, one that holds true time and time again. The more complex your infrastructure, the more likely it is to have seams with exposed vulnerabilities. This is exactly what hackers are looking for, places where people and processes are not perfect and something is left unprotected. In my last article I talked about how defence-in-depth and layering defences so that if one does not work, another layer is there to stop the attack. This has not always been the saviour we thought it would be. This stems from the fact that each layer of defence has been a point product; a disparate technology that has its own intelligence and works within its own silo, creating fragmentation. And, since this creates complexity, it stands to reason that to combat the enemy and improve security we need to reduce it. 
But how can you begin to unfragment something that is already out there in many pieces? To my mind, the best way is to find the glue to put things together. This glue comes in the form of threat intelligence, integrating the layers of point products within a defence-in-depth strategy to reduce fragmentation. [easy-tweet tweet="Companies need to apply their threat intelligence to their environment in smarter ways" hashtags="ThreatIntelligence,Data"] But this isn’t just a problem with defence-in-depth. You also see it in your external threat intelligence feeds and across the different teams involved in maintaining your security posture. Let’s take a closer look at the fragmentation that exists in these areas and how threat intelligence can help. A study by the American university Carnegie Mellon analysed the blacklist ecosystem over an 18-month period and found that the contents of blacklists generally do not overlap. In fact, of the 123 lists (which each included anywhere from under 1,000 to over 50 million indicators), most indicators appeared only on a single list. It’s no wonder there’s a huge data overload problem! The study goes on to say, “our results suggest that available blacklists present an incomplete and fragmented picture of the malicious infrastructure on the Internet, and practitioners should be aware of that insight.” But don’t just take their word for it; the 2015 Data Breach Investigations Report commissioned by Verizon came to a similar conclusion, noting that “there is a need for companies to be able to apply their threat intelligence to their environment in smarter ways.” In an attempt to get the best coverage as they build their threat operations, most organisations are typically forced to use multiple data feeds, some from commercial sources, some open source, some from industry and some from their existing security vendors – each in a different format. Lacking the tools and insights to automatically sift through mountains of disparate global data and aggregate it for analysis and action, the data remains fragmented, often lacks context and just becomes more noise. The path to threat intelligence begins with aggregating that external data into a threat intelligence platform (TIP). Nevertheless, a TIP needs to go further than simple aggregation. It must also operationalise and apply that intelligence as the glue to reduce fragmentation. With global data in one manageable location, it needs to be translated into a uniform format and augmented and enriched with internal and external threat and event data (a minimal sketch of what this normalisation step can look like appears below). The correlation of events and associated indicators from inside your environment with external data on indicators, adversaries and their methods allows you to gain additional and critical context in order to understand what is relevant and high-priority to your organisation. Now you’re in a position to utilise that threat data, automatically exporting and distributing key intelligence across all the different layers of defence in depth to improve security posture and reduce the window of exposure and breach. So how can you deal with the fragmentation across teams? Well, the key here is to find a way to use that threat intelligence for better decisions and action, and this can often be a challenge in siloed organisational structures. You might have a SOC (security operations centre), a network team, an incident response (IR) team and a malware team. More often than not, they don’t even work together, let alone share information or intelligence.
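To make the “uniform format” step mentioned above a little more concrete, here is a minimal sketch of merging indicators from feeds that arrive in different shapes into one common record, deduplicating entries that appear on more than one list. The feed formats, field names and prioritisation rule are illustrative assumptions, not the workings of any particular TIP.

```python
from dataclasses import dataclass, field

@dataclass
class Indicator:
    """Uniform record for one observable, regardless of which feed it came from."""
    value: str                      # e.g. an IP address or domain
    kind: str                       # "ip", "domain", ...
    sources: set = field(default_factory=set)

def normalise(feed_name, raw_items, value_key, kind):
    """Map one feed's raw rows onto the uniform Indicator schema."""
    for item in raw_items:
        yield Indicator(value=item[value_key].strip().lower(), kind=kind, sources={feed_name})

def merge(indicators):
    """Deduplicate indicators across feeds, keeping track of every source list."""
    merged = {}
    for ind in indicators:
        key = (ind.kind, ind.value)
        if key in merged:
            merged[key].sources |= ind.sources
        else:
            merged[key] = ind
    return list(merged.values())

# Two hypothetical feeds with different shapes.
feed_a = [{"ioc": "203.0.113.7"}, {"ioc": "198.51.100.12"}]
feed_b = [{"address": "203.0.113.7"}]

all_indicators = merge(
    list(normalise("feed_a", feed_a, "ioc", "ip"))
    + list(normalise("feed_b", feed_b, "address", "ip"))
)

# Indicators seen on more than one list are a natural first cut for prioritisation.
for ind in sorted(all_indicators, key=lambda i: len(i.sources), reverse=True):
    print(ind.value, sorted(ind.sources))
```

From here, the same uniform records could be enriched with internal event data and pushed out to the SIEM, firewalls and other layers, as the article goes on to describe.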
Forced direct communication isn’t often effective, so how do you get those teams to work together in a way that makes sense? By offering a single repository for all threat intelligence that is contextual and prioritised, you can foster much-needed collaboration without them necessarily even knowing it. With the ability to add commentary and store data for longer periods of time, the repository can become a core component of their processes. As the different teams use and update this repository, there is instantaneous sharing of information across other teams, resulting in faster, more informed decisions. Taking this a step further, by integrating that repository into other existing systems – including, but not limited to, the SIEM, log repositories, ticketing systems, incident response platforms, and orchestration and automation tools – you will allow disparate teams to use the tools and interfaces they already know and trust and still benefit from and act on that intelligence. For example, the IR team uses forensics and case management tools, the malware team uses sandboxes, the SOC uses the SIEM, and the network team uses network monitoring tools and firewalls – and this is just the beginning. By getting consistent intelligence directly from the repository that they have been working in and updating collectively, everyone operates from a single source of truth, reducing fragmentation and complexity so they can accelerate detection and response. I am in no doubt that complexity is the enemy of security, but this doesn’t have to mean that you are entirely helpless. Enriching threat data from all your external and internal sources with context, relevance and prioritisation allows threat intelligence to become the vital glue that reduces the overall fragmentation across your security environment. By reducing this complexity you can ensure that your teams can work together with their existing tools to keep your organisation safer. ### Cloud Adoption for Fintech Systems Fintech has taken the financial services world by storm. Customers of financial service businesses are demanding quicker, more elegant and increasingly mobile ways to interact with their financial data. Fintech startups are disrupting the marketplace by offering technology and services that challenge the traditional ‘brick and mortar’ financial institutions and their software providers. However, all fintech technology needs to be considered through the familiar long-term lens. Will the technology provide a better customer experience, will it provide a solid ROI over time and will it put the organisation in a more solid financial position? Is the same technology appropriate for both customer-facing and back office systems? 6 things to consider when evaluating fintech cloud-based products [easy-tweet tweet="Top fintech organisations will have a great record of responding to customer issues and feedback" hashtags="Fintech,Cloud"] Due diligence – Investigating the product’s features and making sure they meet the requirements of your project is critical – but due diligence on the software vendor itself is just as important. Is the vendor well-funded? Does it have a solid operating plan going forward? Equally, you also need to look at the software vendor’s people. Are they honest, straightforward and trustworthy? A reputable account manager won’t sell you a product that is not a good fit, no matter how close to their sales quota they are.
A trusted product manager can provide a product roadmap that is reasonable, achievable and tailored to the market’s needs. A top fintech organisation will have a great track record of responding to customer issues and feedback. You are likely to be entering a long-term relationship with the fintech organisation, so do your homework to make sure it’s a good fit and that it will provide service that exceeds your expectations! Configurable solution – If the solution the fintech vendor is selling you requires a lot of custom programming or custom screens to meet your needs, this is a strong indication that the software may not be a great fit. A flexible, dynamic product will offer rich configuration functionality, meaning the solution can be tailored to your specific organisational needs. Occasionally, programming is needed to develop a specific feature; ideally, that feature should be usable across the fintech vendor’s customer base to ensure it gets the attention of the product development and QA teams in future software releases. If it doesn’t, you can easily get left behind. Robust data integration – It is rare that an organisation has all its financial data in a single database or data warehouse. Due to acquisitions, technology transitions and other business factors, data tends to get fractured and housed in many different locations. Any software solution must have a way to easily integrate with data sources that are both on-premise and housed in remote locations, including the cloud, which has become increasingly popular in recent years due to its flexibility and cost effectiveness. More on that next… Reduced IT costs – Cloud-based fintech solutions are driving down infrastructure costs and increasing the efficiency of internal IT service and support organisations. Moving technology solutions to the cloud and providing a single point of access benefits both the fintech vendor and the organisation supporting the solution. Enhanced collaboration – Web-enabled cloud solutions for your financial data systems provide the opportunity for more users to access the system and allow for enhanced collaboration. With very little training, users can work with your organisation’s financial data systems through easy-to-use web-based access. Fintech vendors are constantly building collaboration tools into their applications that complement the number-crunching capabilities of the system. Leverage the best of both worlds – Many fintech vendors offer 100% web-based solutions. For customer-facing solutions, this is critically important, as users want to be able to connect to the system with a variety of different computing devices. For back-office systems, a hybrid approach still holds tremendous value. There is often a core group of users needed to build, maintain and modify the business logic of the application. A 64-bit desktop computer is still a great way for fintech vendors to deliver this functionality. The tools to develop and deploy software on the desktop are mature and can offer rich functionality via the user interface. Still, it is important that desktop clients operate seamlessly with web-facing systems to deliver the solution to a wider audience of users in the organisation. This is an exciting time for organisations to investigate and deploy solutions provided by fintech vendors – there is currently a lot of research and development money being invested in new and emerging technologies.
Established vendors can leverage their deep knowledge of the industries they serve and use it to develop exciting new software. Taking the time to investigate all the options available in the fintech space can help drive your own business forward and allow you to uncover new and interesting insights in your financial data. ### Media in the Cloud - #Cloudtalks with TV Everywhere's Iolo Jones In this episode of #CloudTalks, Compare the Cloud speaks with Iolo Jones, CEO of TV Everywhere. Iolo talks about building systems in the cloud to manage media and manage rights. He also talks about how media has become part of marketing for every organisation. #Cloudtalks Visit TV Everywhere's website to find out more: http://www.tveverywherenow.com/ ### SDN Beats Edge Data Centres in the Heavyweight Hype Fight The world of IT is notably susceptible to overblown hype cycles and over-promoted technologies that, in the end, do not quite live up to expectations. One of the big ideas doing the rounds at the moment is the edge data centre. But is the hype justified in this case? Or has other technology simply superseded the need for edge data centres? The ‘edge data centre’ phenomenon is driven purely by customers’ belief that the proximity of a data centre to their own sites will increase the performance of their systems. This is certainly an attractive idea, especially for any kind of mission-critical or real-time traffic. With the growth of video data, cloud-based applications, gaming, and now Internet of Things (IoT) systems – the performance of which is all significantly degraded by higher latency – edge data centres appear to have a huge advantage. With proximity on their side, users benefit from reduced backhaul, lower latency and reduced network congestion. Certainly, this has been the thinking in capital markets, where data centre proximity to, or even co-location with, major stock exchanges is considered a key advantage for traders looking for sub-millisecond latency. So, what is an edge data centre? As with all hype moments, the original definitions can get a little fuzzy over time. When it comes to the edge data centre movement, there are operators opening or simply re-purposing existing facilities in tier-two or tier-three markets – the areas that have not been well served by proximal infrastructure in the past. These smaller facilities are being positioned as edge data centres that provide low-latency access for local users. There are also vendors offering small-scale, modular hardware to provide a complete, self-contained data centre designed to operate remotely with minimal support. These too are being positioned as edge data centres – as are certain services from co-location providers. In addition, software vendors are getting in on the game with solutions such as remote management, operations, orchestration and deployment tools made available on the promise that they enable edge data centres. However, none of this means that a small data centre in a remote region is actually on the edge. When the customers are mainly local and the services being delivered are from local businesses, then, in reality, it’s just a small local data centre. However, it could be providing edge services if, for example, it hosts appropriate hardware from a major content delivery network (CDN) provider. In other words, the edge data centre is not defined by size, speed, service or customer base – the only criterion that counts is proximity.
What customers want But judging by the conversations we have with our customers it seems that edge data centres are more discussed than deployed. Certainly, our customers are more concerned with the IoT than hi-definition video, and in the past few years, the advantages of being able to do more with the data they now have available have driven a growing interest in cloud and networking infrastructure. But that demand has manifested itself in demand for the centralised cloud-technology model, rather than edge data centres, not least because the cloud has always been about optimising and driving better data practices and making services more efficient. That the edge data centre concept hasn’t taken off among these customers is largely due to enhancements and improvements to software-defined networking (SDN) that enable the data centre to be extended to a customer endpoint. In an SDN, the control of data traffic is managed by programmable software and is no longer dependent on forwarding hardware such as routers and switches in the network’s nodes. Administrators can use a central control to regulate the transfer of data and to deliver services to wherever they are needed in the network, regardless of the specific devices, server or other hardware components. It allows organisations to centrally design, assign and manage application-aware policies and to secure and control all network traffic across all sites. When SDN-clouds are in place, the same concept of software definition is extended all the way from a user’s device to the data centre and whatever private, public, or hybrid cloud arrangements the organisation has in place. [easy-tweet tweet="It is important that the network and the cloud infrastructure is aware of the data it is carrying" hashtags="Cloud,Data"] Software defined networks – software defined clouds The fact is that real-time services don’t necessarily require an edge data centre in immediate proximity. For example, a retailer that deploys a software-defined device equipped with IoT sensors and connected via software-defined or a data-aware network can optimise traffic flows coming in and out of that device. For our customers, the most important thing is that the network and the cloud infrastructure is aware of the data it is carrying and can manage and control all relevant work streams appropriately. Because SDN technology automates traffic flow and constantly looks for the optimal path through to the back-end, the traditional bottlenecks and barriers to low latency disappear. Combining SDN with the cloud in an end-to-end solution means customers can find new, faster ways of delivering services via devices connected to the IoT – which in turn work much faster and more efficiently than edge data centres. Underneath all the hype about edge data centres, what we are really talking about is how cloud services are provided. Most cloud service providers are exploring ways to deliver multiple services, across multiple geographies – and across multiple private, public or hybrid clouds. What they need are technology solutions that enable the delivery of applications and services to the locations their customers want. From there, it is about the automation of the delivery of application management and services to those locations. Edge data centres have the undoubted advantage of being intuitively easy to understand: tell someone that a facility is closer and of course they will believe that it delivers quicker services. 
Software-defined networking, in contrast, is much more difficult to come to terms with – especially for non-technical business managers with tightened budgets and a Board to report to. But that’s not a good enough reason to ignore an elegant solution that is much better suited to the world where the IoT, smart connectivity, rapid analytics and real-time information flows are becoming the norm. Quite simply, the combination of SDN and the cloud enable customers to find new, faster ways of delivering services than edge data centres would allow– and so far, our customers agree. ### Quotall Chooses Datapipe to Manage its Migration to AWS Insurance e-trading software platform provider, Quotall, has selected Datapipe, a leader in managed cloud services for the enterprise, to manage the migration of its integrated SaaS application to AWS cloud. This partnership leverages Datapipe’s cloud management skills to ensure that Quotall’s business-critical application is securely monitored and managed cost-effectively on the AWS platform. Quotall’s software allows insurance distributors to e-trade customised products and integrates the internet channel with their telephone-based and face-to-face sales activities. Datapipe will migrate Quotall’s production environment from private to public cloud, managing the AWS platform on behalf of the insurance software provider. This move serves to free up Quotall’s in-house IT team to spend more time on business priorities. [easy-tweet tweet="The move from private to a managed AWS cloud is one that made sense for Quotall" hashtags="PrivateCloud, AWS"] In addition to creating cost efficiencies, the transformation will deliver more agile infrastructure that can support a change-in-market approach for Quotall. Working with Datapipe, the organisation will benefit from expert governance, ongoing technical and commercial optimisation, and best practice scale for its infrastructure, taking advantage of 24x7 global support from Datapipe. “The move from private to a managed AWS cloud is one that made sense for Quotall,” explains Simon Ball, founding director at Quotall. “We are a rapidly growing business with constantly evolving IT needs. As the cloud market has matured, there is a lot more confidence in the public cloud, but with so much at stake we need it to be managed carefully. Moving to AWS has allowed us to cut our IT management overheads, and partnering with Datapipe has given us the confidence to undertake a fundamental technology transformation in the knowledge that our business-critical software service delivery will only be enhanced. “Having previously worked with Datapipe to host our test and development requirements, as well as our user acceptance testing, we knew that they would help us use the public cloud effectively and deliver the desired results.” Stewart Smythe, Managing Director at Datapipe Europe, says, “We are delighted to extend our working relationship with Quotall as it moves into the AWS cloud. Our approach is designed to make it easier for customers like Quotall to change and innovate. Public cloud environments can be complex to create and maintain effectively. Helping Quotall to get it right first time means it can immediately start to capitalise on the efficiencies and economies AWS offers.” ### The Right Way to Adopt the Cloud We should not be surprised to see history repeat itself. The adoption of cloud technology is no different. The cloud is not new; but our wholesale, blind adoption of it is fairly standard. 
It may be human nature that compels us to leap into the pool without first looking. In the days of the PC, we loaded up thousands of desks with IBM and IBM-compatible PCs. When Lotus-123 arrived, thousands of companies migrated sensitive financial data onto floppy disks. Not too long after, we embarked on the era of the LAN which made it easy for us to share expensive printers. This then moved to Wide Area Networks, enabling enterprise-wide sharing and collaboration. Email, which had been around since the late 70’s on UNIX and other multi-user systems became mainstream in the late 80’s and exploded with the advent of the commercial internet in the early 90’s. In all of these cases, there was one aspect of technology that lagged all of the new innovations. This was management. Whether it was system management or network management, - in all cases the tools needed to manage the tools came after. It was not until things started to go wrong that customers realised there was a need to secure, protect and manage the new technology that had already become mission critical and irreplaceable. We are doing it again. In the past two decades, we have embraced the wonders of virtualization and open source, both of which have caused a major technology revolution. Virtualization has allowed us to exploit unused processor and network resources and introduce automation and agility into the server powered the data centre. These technologies are also the seeds of our current cloud market. These virtualized machines have made Amazon, Google and Facebook possible. Can you guess how much Google or Amazon pay Microsoft to operate their infrastructure? If you guessed nothing, you’d be right. The ability to vastly reduce the cost of delivering compute network capabilities is the root cause behind cloud solutions. This is why it should be cheaper for companies to run applications in the Amazon cloud that would be run in the corporate data centre, but is that true? Is it in all cases cheaper to run in a public cloud? We as an industry have a habit of leaping before we look. Digital transformation is fueling this, and in an effort to keep up with the competition, conversations similar to the below are taking place. CEO: “Mr. CIO I want us to move to the cloud.” CIO: “Yes, and why is that sir?” CEO: “Because it is cheaper and faster and cooler than what we are doing,” CIO: “Sir, do you know how much we currently spend or what our current performance levels are?” CEO: “No. I have no idea, but I am sure they will be better when we get into the Cloud!” A new paradigm This is perhaps slightly dramatised but has been taking place in executive suites and boardrooms around the globe. “If the big guys, Google, Facebook and Amazon, are doing it, then it must be good.” There are, of course, an enormous amount of benefits to utilising virtualization and open source technology, regardless of industry. Incorporating agility and continuous development and deployment into your business strategy is becoming crucial now too. Alongside this, there is a deep need to digitally transform your company in order to keep pace with the market, innovate and win new customers. There is also an easy way to do all of the above wrong. [easy-tweet tweet="The value of the cloud is not just cost and performance. 
It is also about speed and agility" hashtags="Cloud,Technology"] The easiest way is to think that this new technology is merely a new way to do what you have been doing, falling victim to what Thomas Kuhn called “Confirmation Bias”: what you know and how you have done things in the past should not dictate future endeavours. Too many companies are attempting what they have always done using a new paradigm, as opposed to thinking about what they need to do to win and then applying new approaches to solve problems. Digital Transformation asks the question, “How can we better serve our existing customers whilst attract new customers?” The answer to these questions falls under the category of faster innovation. This is where the cloud comes in. The value of the cloud is not just cost and performance. It is also about speed and agility, scaling up and down based on market and customer needs. It is about deploying new ideas quickly, vetting their performance, improving or dismissing them, moving from a failsafe to a safe-to-fail environment. Whether this is done in a public or private setting depends entirely on the business in question. Back to basics How do we manage this journey? How do we deliver more predictable outcomes? If we go back to our conversation on what we need in order to make an intelligent decision is exactly that: intelligence. To determine if our new solution is faster, we need to define speed. To know if our new solution is cheaper, we need to define cost. To know if our new solution is better, we need to define the quality of service. In other words, we need to establish an accurate baseline understanding of our existing environment before we attempt to move it. Think of it this way: will the system you’re running today work the same when you move it into the cloud? Will it cost the same? Will it scale the same? In order to answer these questions, you are going to need a constant to apply to your existing environment and to your new environment. You are going to want to compare apples to apples so you can tell your team that yes, you have improved performance, yes you have reduced costs and yes you have increased automation and agility and the quality of service experienced by your customers. You will need to think about measurement and metrics, choose an approach that will work in both your existing and future environments, and be able to maintain the constants you expect to rely upon. In the end, we can all learn from our many mistakes. Knowing what we want to accomplish, how we are working today and what will be an improvement seems to be the bare minimum of an intelligent plan. Before you embark on the cloud journey take careful stock of where you are and have a clear understanding of your desired outcome - Know Before You Go. ### This is my new blog My new blog is great! ### Turning whistleblowers into assets - #Cloudtalks with Addveritas' Dino Bossi In this episode of #CloudTalks, Compare the Cloud speaks with Dino Bossi, Co-Founder of Addveritas. Dino talks about turning whistleblowers within companies into assets rather than see them become liabilities. #Cloudtalks Visit Addveritas website to find out more: https://www.addveritas.co.uk/ ### Move Emails from Promotions to Gmail Inbox Do you care where at Gmail your message lands? The tab where the email is placed determines heavily whether or not the email will be opened. And since Gmail is one of the most popular mailbox providers now, a lot of your subscribers may be using Gmail. Therefore you should care. 
If an email is delivered to the Primary inbox, a user sees a notification on their phone. An email that arrives in Promotions goes silent. The Primary inbox is a more personal area, while the Promotions folder holds miscellaneous odds and ends. For an email in the Promotions tab to be read, the user has to decide to go to that tab. And there, your email has to stand out among hundreds of promotional emails that all scream for attention. So, wouldn't you want to increase the chances of your email landing in the Primary tab? Below is my real case study of how I was able to drag an email from Gmail's Promotions tab to Primary. But first, let's recap the common tips that you can find everywhere:

- Don't sell. It makes sense. If you are selling in your email, it's promotional, isn't it?
- Authenticate your email with DKIM and SPF records. This must be done for deliverability in general, but it isn't really a decisive factor in whether the email lands in Gmail's Primary tab instead of Promotions.
- Personalize by name. It's not really important, because promotional emails often address the recipient by name.
- Include only one link. The idea here is that personal emails usually don't have a lot of links. However, we think there is nothing wrong with including up to three links.
- Don't include pictures. Just like with links, it is believed that people don't send a lot of images in personal emails.
- Keep it short. In our opinion, it's not important, because the email length doesn't really tell whether or not the email is promotional. People send long personal emails too.
- Don't use complex HTML. This is reasonable, since personal emails are often plain text messages without fancy HTML.

Let's Test It To know what really makes Gmail send an email to the Promotions tab instead of Primary, we have to do some testing. For this test, I used EasyMail7 email marketing software and the GlockApps spam testing service. These tools are integrated, which makes it easy to send spam tests and view the reports. I took a very basic message with a logo, an image, a call-to-action button and some text. I tested it with GlockApps. Where did it land? Right in the Promotions tab. Let's remove the big image and test. Still in Promotions. Then I removed the logo and tested. It made no difference: Gmail placed the email in the Promotions inbox again. I thought I could play with the footer. From our experience, the email footers added by email service providers can cause Promotions tab placement. First, I deleted the buttons with social links from the footer. Typically they signal the promotional nature of an email. Let's see if it helps. It didn't. I went further and simplified the footer. I removed the "Add us to your address book" and "update your preferences" text that is typical of bulk mailings. This time, the email was placed under the Primary tab! Now I decided to add the logo and image back and test again. Again in the Primary tab! What if I add the social buttons back? Let's see. Bad news! The email landed in the Promotions folder. Now the last test. I removed the social buttons and added more links to the content. I had 6 links in total, including a hyperlinked logo and image. The email was delivered under the Primary tab! What's the Conclusion? My test showed that Gmail does look at the content to decide whether or not an email is a promotion. But first and foremost, Gmail looks for footprints added by email service providers indicating that the email is part of a bulk mailing campaign.
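One of the suggestions in the recap that follows is to try a simple plain-text message with a single link to your website. As a minimal sketch of that idea, the snippet below builds and sends exactly such a message using Python's standard email and smtplib modules; the addresses, SMTP host and credentials are placeholders, and simplicity alone is of course no guarantee of Primary tab placement.

```python
import smtplib
from email.message import EmailMessage

# Placeholder details - swap in your own sender, recipient and SMTP server.
SMTP_HOST = "smtp.example.com"
SMTP_PORT = 587
USERNAME = "newsletter@example.com"
PASSWORD = "app-password"

msg = EmailMessage()
msg["Subject"] = "A quick update from us"
msg["From"] = USERNAME
msg["To"] = "subscriber@example.com"

# Plain text only: no HTML template, no ESP footer, and just one link.
msg.set_content(
    "Hi,\n\n"
    "We've just published a short guide you might find useful:\n"
    "https://example.com/guide\n\n"
    "Reply to this email if you have any questions.\n"
)

with smtplib.SMTP(SMTP_HOST, SMTP_PORT) as smtp:
    smtp.starttls()          # upgrade the connection to TLS before logging in
    smtp.login(USERNAME, PASSWORD)
    smtp.send_message(msg)
```
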
The footprints like "Add us to your address book", "Update your preferences", "Unsubscribe from this mailing list", the "View in browser" text and icons with social links make the email be considered promotional. A reasonable amount of links and images are acceptable and don't make Gmail send the message to the Promotions tab. I didn't test the email length and personalization because they don't really matter. There is no standard for how long the email should be to be delivered to the Gmail's Primary tab. Some marketers stick to a short copy and send the recipient to the website for more information. Others create a long story in order not to make the reader take an extra action. The recipients are also different. You should find out what email length works best for your audience yourself. Personalization by name certainly adds a human touch, but that's all. It's not going to predetermine your tab placement. To summarise it, here are the things that really have an impact on Gmail's Primary tab placement: - Footer added by email service providers (no matter what kind of content you send) - Images with links to social profiles - RSS content - Pure promotional content. If your email is selling something, it will most likely land in the Promotions tab. And it's normal, isn't it? Can you still increase chances for your promotional email to go to the Primary tab? 1) Try creating a simple plain text message with a link to your website where people can read more about your offer. 2) Try changing some of the elements (subject, HTML vs. plain text, footer, the number of links and images, sending email/domain) and test the email after each change to see where it lands. The GlockApps spam testing tool will show you where your email is delivered within a few minutes. You need to test in order to not send blindly hoping that something will work. You have to have real data to find out what causes the deliverability problems and be able to fix them. 3) Make sure that every email has real value for your recipients. If your emails are valuable, no matter how they are looking and where they end up, your subscribers will be looking forward to reading them. Don't sell in every email. Give something to build a relationship too. 4) Be consistent in sending. When your subscribers know when to expect your email, even if it ends up in the Promotions tab, they'll search for it there. [easy-tweet tweet="The Gmail's filter relies heavily on how each user interacts with your emails" hashtags="Gmail,Technology"] That builds the most important element of email marketing - recipient engagement. The Gmail's filter relies heavily on how each user interacts with your emails and will take the decision where to put your messages based on that. The things that make the email go to the Promotions tab we found out through our testing can be beaten by a good engagement. Let's recap once again. The key things you have to put emphasis on are: - Send as much value in every email as you can - Be consistent with your sending frequency - Focus on interaction with your subscribers (encourage opens, replies, forwards) Good email marketing is about building a relationship with your recipients. Engagement matters the most and is the key to good deliverability.   ### Lasernet Adds Advania to Certified Partner Network Input/output management software vendor, Formpipe Lasernet, has signed up leading Icelandic IT reseller, Advania, to its certified partner network. 
The deal represents Formpipe Lasernet’s first venture in the Icelandic region, supporting the firm’s plans for global market dominance.  Agreeing on the terms of the partnership earlier this month, Advania will be adding Lasernet’s agile, feature-rich input/output management functionality to its existing ERP offering, concentrating initially on the Microsoft Dynamics AX market. [easy-tweet tweet="Offering Lasernet, Advania can give customers a solution for document management" hashtags="Lasernet,DocumentManagement"] Commenting on the partnership, Advania director, Sigríður Þórðardóttir, said: “Working with multiple organisations using Microsoft AX products, we have the first-hand experience of some of the challenges they face. Dynamics is an incredibly powerful solution but we identified a clear demand from users to enhance some aspects of its functionality, including document management. “By offering Lasernet, we can give our customers a state of the art solution for document management, which will empower them to more effectively coordinate the associated processes and, ultimately, work more efficiently and deliver noticeable cost savings. Choosing Lasernet was easy. The solution is very efficient compared to similar products on the market yet at the same time incredibly easy to use. “We are hoping this will be a fruitful partnership for both parties. We have full confidence in Lasernet’s ability to provide our customers with a solution that will enhance the standard functionality available in Dynamics and make day to day business life much easier for them.” Despite the relatively short partnership to date, Advania is already in the planning phase of a Dynamics AX implementation with one of its customers in Iceland. The implementation will see Advania’s customer, which operates in the utilities sector, adopt the MECOMS platform by Ferranti Computer Systems to better manage its meter data and customer information. The MECOMS solution is based on Dynamics AX and sold with Lasernet functionality as standard, following an earlier partnership between Formpipe Lasernet and Ferranti. Mike Rogers, sales and marketing director at Formpipe Lasernet, believes this latest partnership is further proof of Lasernet’s unrivalled integration with Microsoft ERP systems. Mike explains: “Advania is a serious Dynamics player in the region so to be selected as the company’s document management software vendor over a number of our competitors is a true testament to Lasernet’s seamless integration with Microsoft solutions – as well as its ease of use and a long list of features. “Equally, we have strict rules in place as part of our certified partner network. It’s important to us that Lasernet maintains its position in the market and continues to set the benchmark for innovative input/output management technology, as well as superior integration with a multitude of ERP solutions. 
We immediately recognised synergies with Advania and are proud to have the firm represent, sell and support Lasernet.” ### Firms Closing Innovation Gap to Pursue Revenue Growth UK accounting and bookkeeping firms are catching up with leading foreign markets to accelerate client migration to the cloud ahead of new digital tax laws 18 per cent of firms now work exclusively with clients online The number of UK firms with at least 80 per cent of clients online is expected to rise within two years, from nearly a third (31 per cent) to over half (56 per cent) 28 per cent of UK firms currently have mostly desktop clients, compared to 9 per cent in Australia and 5 per cent in the US - but the gap is closing Firms with 100+ online clients experience 16.3 per cent year-on-year growth compared to 9 per cent for those with up to five online clients Gary Turner, UK co-founder and managing director, says, “...if not managed correctly, Making Tax Digital does run the risk of pushing some firms back to the Dark Ages...” A new report revealing how UK accountancy and bookkeeping firms measure up against their peers has found that the UK is closing the gap with the US and Australia in the percentage of firms with most of their clients online, and the findings suggest that moving to the cloud helps accelerate practice revenue. The Benchmarking Report by Xero, which surveyed 400 UK accounting and bookkeeping firms of all sizes and types, revealed aggressive targets from UK firms, who are encouraging more clients online before new HMRC digital tax laws come into force in 2018. Whilst UK firms have the highest proportion of mostly desktop clients at 28 per cent - compared to around 9 per cent in Australia and 5 per cent in the US - the report suggested that UK accountants are intent on moving more clients online. Firms with fewer than 40 per cent of their clients online are set to drop to just 9 per cent by December 2018 (from 46 per cent). Positively, the 31 per cent of ‘pace-setting’ UK accounting and bookkeeping partners who currently have at least 80 per cent of their clients online is expected to rise to 56 per cent of partners within two years. Currently, 18 per cent of the UK firms surveyed have between 98 and 100 per cent of their clients online, suggesting that this group is the furthest ahead of the curve in preparing for the Making Tax Digital rollout, which will require the vast majority of businesses to keep digital tax records and update HMRC quarterly. The firms with the fewest online clients are at the greatest risk of falling dangerously behind in their preparations to be compliant in time for the roll-out. The report revealed that UK firms with a larger number of clients using online accounting are adding more new clients and growing revenue faster. Those with 100 or more business clients using online accounting experienced 16.3 per cent year-on-year revenue growth, compared to 9 per cent for those with between zero and five online clients. Revenue per employee also rises to 33 per cent better than average in the group with 100 or more online clients, and firms that build up at least 125 online accounting clients get more word-of-mouth referrals, which bring in new clients. These findings point to how the industry is evolving, as Xero accounting and bookkeeping partners experience increased growth and revenue as a result of operating online.
[easy-tweet tweet="Marketing appears to become more important as businesses scale up their client portfolio" hashtags="Marketing,Business"] Firms of all sizes most intrigued by their competitors’ MTD preparations and use of business apps The participating firms were asked which areas of their competitors’ businesses they were most interested in. While the results showed that interests differ depending on business size, Making Tax Digital preparations and the use of business apps was of particular interest to firms regardless of their size. Marketing appears to become more important as businesses scale up their online client portfolio. The report found there to be a clear step up in the interest in marketing when firms shift from fewer than 35 online clients to larger portfolios. Concern about switching clients from hourly to monthly billing plans drops quickly and significantly as the online client portfolio grows. This indicated that monthly billing is a positive step for those who’ve made the move. Learning which apps are the most useful for accounting practices was found to be of high importance to firms of all sizes. Those with small online client portfolios want to know where to start, while larger firms are looking to confirm their choices and add to the range of apps they use. Gary Turner, co-founder and managing director of Xero commented: “The report shows HMRC's new digital tax rules are informing a growing preference for cloud accounting in the profession with good reason; many professionals recognise the need to shift away from compliance services to adding value. However, if not managed correctly, Making Tax Digital does run the risk of pushing some firms back to the Dark Ages of compliance accounting, a loss for both the profession and the clients.” For more information and to read the full report please click here.   ### Why Cloud Computing is Perfect for Start ups In the UK alone, the combined annual turnover of UK SMEs in 2016 was £1.8 trillion, 47% of all private sector turnover.  There has also been sustained growth in the total SME population, with increases of 59% since 2000 and 2% since 2015.  If startups continue to develop at the current rate, they need a strong IT system in order to stand out from the crowd and keep pace with their competitors. Yet, in actual fact, the majority of startups don’t have a robust or permanent IT system.  It’s not that start-ups are ignoring the value and importance of IT, but rather that, with limited resources, they more often than not can’t afford to focus on anything other than the core purpose of their business, which is understandable.  This often leaves start-ups at a disadvantage when competing with larger companies who may have a strengthened IT system and bigger budgets.  Fortunately, with cloud computing, even start-ups with limited IT resources are able to achieve quality performance typically reserved for their larger, enterprise counterparts. Let us examine a few reasons as to why cloud computing is perfect for start-ups and SMEs: Flexibility Cloud technology is flexible which enables it to help start-ups find the perfect solution for every stage a business has to go through. Startups have to be a lot more nimble than larger companies, as they are constantly reacting to dramatic shifts in growth and stresses. In the past, organisations would have to anticipate these peaks and troughs and then build their infrastructure to handle it. This often meant overestimating and overspending most of the year.  
Cloud solutions help reduce this ambiguity and allow startups to quickly scale up or down, depending on their needs. It's easy to expand your IT infrastructure according to the dynamics of your business using the cloud. A company is able to start with a relatively small infrastructure, then gradually expand or scale up as quickly as they need, eliminating latency caused by the management of physical IT architectures. Using cloud technology also allows users to access all their documents from any location as well as prevent catastrophic data losses through either hardware damage, theft or loss.  As a nimble start-up, an employee is able to start a document in London, travel to a client meeting in Manchester – open and finish the document on the train, and then pick up where they left off at home in Southampton.  This flexibility underpins the flexible processes which make the majority of businesses work in today’s society. Reduced Costs Perhaps one of the main benefits of the cloud is its ability to level the playing field between start-ups and SME’s compared to larger enterprises. By removing much of the initial up-front costs, the cloud allows small and medium-sized businesses to compete with their larger rivals and in some instances to surpass them.  For example, at Aruba, we have specific Aruba Cloud programmes designed for start-ups where SMEs can receive free Cloud credit, exclusive offers at the end of the program and technical training. [easy-tweet tweet="With cloud technology, companies can easily create disaster recovery solutions" hashtags="Cloud, DisasterRecovery"] Performance and Reliability In the very early days of the cloud, people worried about its security however for me, using an in-house server feels a lot more exposed than keeping your data, and important client and business information, in secure data centres guarded 24/7 by large security systems and processes. There are also practical benefits to using cloud technology.  With cloud technology, companies can easily create disaster recovery solutions and recovery plans, or data backups.  While previously disaster recovery could only be afforded by large enterprises it is now available in scalable costs, all done and managed by the cloud.  SMBS and startups, therefore, are now able to implement disaster recovery solutions and recovery plans by accessing the cloud infrastructure. As you can see from the above, cloud technology is perfect for start-ups competing with their larger competitors.  Cloud technology allows a start-up to become more flexible, more reliable and have a greater performance, all at a reduced cost compared to a more traditional on-prem offering. About Aruba Cloud Aruba Cloud (www.arubacloud.com), part of the Aruba Group, is a leading Cloud service provider. The service offers a comprehensive range of Cloud solutions for customers around the world. Thanks to a network of six data centers in various countries (UK, Germany, France, Italy and the Czech Republic), Aruba Cloud offers its own customers a range of services and solutions created to respond to the needs of customers however small or large they are, including SOHOs, startups, SMEs and large businesses. Aruba Cloud's solutions are based on three kinds of Cloud service: VPS SSD, Public Cloud and Private Cloud, plus a selection of Cloud Storage and Backup accessories. About We START you UP We START you UP is the Cloud solution designed to support startups as they find their feet, nurturing technological talent and innovation.  
Aruba Cloud offers startups the best technology in terms of platforms and data centres, free Cloud credit and much more.  ### Tailoring Customer Experience - #Cloudtalks with Livestyled’s Adam Goodyer In this episode of #CloudTalks, Compare the Cloud speaks with Adam Goodyer, Founder and CEO of Livestyled. Adam talks about who Livestyled are, what they focus on and how they tailor customer experience. #CLOUDTALKS Visit Livestyled's website to find out more: http://livestyled.com/ ### Becoming Truly Proactive (and not mopping the floors every day) In my last article, I explored how firms might get all members of their management team (from the Technical Geeks to the Business Execs) to communicate more effectively and hence make smarter, more intelligent decisions. Coincidence – Correlation – Causation – which is it? When looking into what’s going on ‘under the hood’ of a mainframe (or indeed any server), it’s very important to distinguish what is really happening. We humans are prone to many kinds of psychological bias, and it’s all too easy to fall into the trap of mistaking pure coincidence for something meaningful (especially when it supports our theories), and to fail to distinguish between mere correlation and actual causation. Such distinctions matter if a company is to avoid overpaying for computing time and the like, and to get to the root cause of the unnecessary and unwanted bottlenecks that slow the efficient processing of, say, critical batch routines. To really focus on what’s actually going on with your systems, the toolset I mentioned in the previous article – namely “ITBI” from the Danish company SMT Data – truly excels at sorting the wheat from the chaff in these respects. Its front-end enables users to clearly identify exactly what is going on behind the scenes, to change the “Geek_type” labels into more meaningful, business-exec-friendly language, and to show relative cost amounts so that everybody across the company can get a handle on what’s really going on. As the opening lines of their recent press release state: “Large IT installations have a wealth of data about capacity and performance but often struggle to create value from this data. Successful optimisation requires tools that create transparency - combined with people and processes focused on cost-hunting. A flexible data warehouse to manage the data and result-oriented reporting and analysis tools are a must. It may also be important to enrich the technical data with cost and organisational information in order to understand ‘who is using what and for how much’. SMT Data’s IT Business Intelligence (ITBI™) solution is built for this...” [easy-tweet tweet="Take a look at just what the SMT Data “ITBI” toolset could do for you/your organisations." hashtags="SMTData,Cloud"] Think about that for a moment. Imagine that you can capture “Actionable Insights” about where the true costs are occurring and, further, that you can accurately distribute those costs across the relevant Business Units to determine more accurately just how profitable (or not) they really are. To those of you who are involved with mainframes and/or x86 distributed systems – I thoroughly recommend that you take a look for yourselves at just what the SMT Data “ITBI” toolset could do for you and your organisations.
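To make the coincidence, correlation and causation point above a little more concrete, here is a deliberately artificial Python sketch. It has nothing to do with ITBI itself, and the figures are invented: two metrics move together almost perfectly even though neither causes the other.

```python
# Toy illustration only: the numbers are invented and have nothing to do with ITBI.
# Two metrics sampled over ten nights: batch runtime (minutes) and emails sent (thousands).
# Both spike at month-end, so they correlate strongly - but neither causes the other;
# the hidden driver is the month-end workload itself.

def pearson(xs, ys):
    """Plain Pearson correlation coefficient, no external libraries."""
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    var_x = sum((x - mean_x) ** 2 for x in xs)
    var_y = sum((y - mean_y) ** 2 for y in ys)
    return cov / (var_x ** 0.5 * var_y ** 0.5)

batch_runtime = [42, 44, 41, 43, 45, 40, 44, 95, 98, 97]   # last three nights are month-end
emails_sent   = [12, 11, 13, 12, 11, 13, 12, 55, 60, 58]   # month-end campaign, same nights

print(f"correlation = {pearson(batch_runtime, emails_sent):.2f}")
# Prints a value close to 1.0, yet fixing the email system would not speed up the batch:
# the shared cause is month-end volume, which is what good capacity data needs to expose.
```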
Similarly, as mentioned in previous articles – be sure to also check out the Consultancy/Optimisation Services that are available from DataKinetics – a company that has been around specialising in the optimisation of Mainframes for over 40 years – and counts many of the top Fortune 500 companies as longstanding customers. And ‘yes’ – they even employ people there who can speak “COBOL” too! Ergo – if instead of an expensive (and potentially very risky) system re-write – all you really need are some strategically applied “tweaks” – then these are the kind of folks who can advise you as to your options. So – whether you’re looking to upgrade, add more capacity etc., - before committing, perhaps your time could well be spent first having a chat with DataKinetics to determine if all you really need to do is simply optimise what you already have. In other words, if every day you find that your staff are always somehow bogged down, fighting the same fires, and forever “mopping the wet floors” – maybe the toolsets and expertise that’s out there could help you instead - find and fix the leaking faucet! ### FHL and Kyriba to Deliver Secure Cloud Treasury Solutions FHL Customers Now Have Access to Kyriba’s World Leading Treasury Management Solution Kyriba Corp., the global leader in cloud-based treasury, cash and risk management solutions, today announced a channel sales partnership with cloud solutions provider, FHL (www.fhl.co.uk), the leading provider of NetSuite products in Europe and NetSuite’s EMEA Partner of the Year 2017. FHL’s 150+ NetSuite customers now have access to Kyriba’s cloud-based solutions. Security is a growing issue for treasury and finance professionals, just as accurate cash flow forecasting and the power to make strategic decisions at the CFO level are of high-ranking priority, according to a recent benchmark survey from PwC. With Kyriba, FHL now delivers a suite of treasury management solutions to complement its NetSuite portfolio, helping finance professionals to effectively manage cash flow forecasting and operate more strategically in a secure cloud environment. FHL, a UK-based cloud solutions provider, named NetSuite’s EMEA Solution Provider Partner of the Year on six occasions – offers a range of services to its European clients such as consulting, planning and project management, solutions development and training. [easy-tweet tweet="Kyriba’s world-leading cash management solution adds value to our NetSuite offering" hashtags="NetSuite,Cloud"] “We are dedicated to providing our customers with ‘best of breed’ cloud solutions and are pleased to join forces with Kyriba,” said Andrew Peddie, Managing Director of FHL. “Kyriba’s world-leading cash management solution adds value to our NetSuite offering, helping organisations improve their cash visibility and better secure their assets.” Kyriba’s treasury, cash and risk management solutions are delivered 100 percent in the cloud, and seamlessly integrate with NetSuite, providing NetSuite users with enhanced functionality, including strategic cash management, increased payments security, cash forecasting, bank reconciliation and automated financial management processing. “As a premier NetSuite partner, FHL will be a fantastic advocate of Kyriba in the UK,” said John Campbell, Managing Director of Kyriba in Northern Europe. “As a modular solution, Kyriba’s and now FHL’s clients benefit from the flexibility inherent in our modular, SaaS delivery model. 
Customers can maximise the value of their spend by selecting and implementing the modules they require now, and then have the opportunity to add modules as their needs evolve.” To learn how your organisation can help financial professionals to become more strategic, or how to partner with Kyriba, contact treasury@kyriba.com ### Sports Streaming for the On-Demand Generation From Facebook Live to Twitter’s Periscope, there is no denying that video streaming has become a staple of the internet – whether it is following the news, keeping up with celebrities or watching sports games live. And with a reported 78% of people watching videos online every week, it looks like video is here to stay. In fact, it is estimated that by 2019, 80% of consumer Internet traffic will be video, and if enterprises wish to stay ahead of the curve, they must fully embrace this new platform. It is clear that viewers are flocking to the internet for all forms of entertainment – including sports. The on-demand, always-on habit that has changed the way we live our lives was mirrored in last year’s Olympic Games, with over 100 million people streaming video of the action in Rio. In the UK, no fewer than 30.2m browsers streamed action from Rio on BBC iPlayer and BBC Sport. The most popular streamed event was the men's singles tennis final, in which 1.9m browsers watched Andy Murray retain his Olympic title by beating Argentina’s Juan Martin del Potro. In fact, last year, live streaming traffic from the Olympics surpassed the numbers from the London and Sochi Olympics combined in just over a week — a powerful signal to the streaming video industry, for sports and other primetime content alike. Streaming video has brought the game-time experience online, and the technology that has made this possible is cloud computing. Cloud computing now enables any organisation or company to use a network of remote servers hosted on the Internet to store, manage and process data, rather than a local server or a personal computer. This technology has had a massive impact on video: because video is rich in data – visuals, graphics, film and much more – it was previously very difficult to manage, distribute and stream, until cloud computing changed all that. Today, cloud computing enables powerful, high-quality video streaming, and the sports industry has really embraced it. In the enterprise space, 2017 will be the year that more businesses view themselves as content-service providers and begin to adopt video as a core tool for communicating with customers and business partners. For businesses and sports brands alike, video streaming services offer a unique channel with which to further engage customers with content. Video allows brands to analyse and understand how their audiences are consuming video content, what they prefer watching and how they react to certain images in real time. These types of insights are invaluable for customer retention. [easy-tweet tweet="Online streaming platform Footters has adopted IBM Cloud Video to bring football enthusiasts together" hashtags="Cloud,IBM"] Today, it is crucial for enterprises to provide additional value to ensure customer retention, and similarly, there is demand for new products and services in the sports industry. For example, online streaming platform Footters has adopted IBM Cloud Video to bring football enthusiasts together and to deliver on-demand video content for football players - professional and amateur alike.
The Spanish company aims to connect as many as 24 million football teams and 270 million players around the world with its platform, designed to stream amateur matches and to give professionals from across the world the opportunity to meet and collaborate. Footters is currently working with 50 teams to help connect them with other players, agents, scouts, institutions, tournaments, brands and even families – demonstrating the unifying power of live video. Unsurprisingly, it is not only traditional media services that have seized the opportunity to use video to engage fans, but also key players from outside the industry. Until recently, media industry experts viewed sports as a way to prevent cord cutting (i.e. cancelling traditional TV or satellite services). Controlling sports, it seemed, would allow cable providers and major television networks to maintain fans’ subscriptions. But in the meantime, non-media companies have joined the streaming video fray. Twitter’s Periscope was one of the first social media platforms to allow users to create live broadcasts, and this was followed by the announcement that 10 NFL games would be streamed for free on Twitter. Clearly, streaming video is drawing new categories of investors because there’s money to be made. In the UK, nearly a quarter of households subscribe to Netflix for on-demand streaming, with 1.4 million joining the streaming service in 2015 alone. Online streaming of live professional sports is the future, and TV companies and sports channels must wake up to the reality that the sports entertainment landscape is rapidly changing. With an ever-increasing number of sports fans opting to stream matches, we must be ready to embrace video streaming, and fast! ### AI and Data Analytics are Reshaping Business Models 80% of digital transformation is now being driven by the need to meet customer expectations and 47% believe this presents an opportunity to ‘reinvent’ their business 57% of organisations are rethinking their business models as a result of new technology such as AI, APIs and data analytics Two-thirds (65%) of survey decision makers say they don’t have the skills to service these new models Nimbus Ninety, the UK’s independent research community for disruptive business, and leading cloud solutions and hybrid IT services provider Ensono today reveal that more than half of companies are rethinking their business models due to advances in technology, from automation to data analytics and artificial intelligence, and that the need to meet customer expectations is driving this change. The overwhelming driver of digital transformation is the need to meet customer expectations: 80% of the 250 senior UK business decision makers surveyed said this was why their organisation needs to undertake a digital transformation strategy. This year’s survey also highlighted a new desire to ‘reinvent’ business models, with 47% citing this as a key reason, up from 6% last year. Reducing cost is no longer a key factor; only 18% stated it as a reason. [easy-tweet tweet="43% of companies view keeping pace with technology change as their most significant challenge" hashtags="NewTechnology,AI"] While new technologies are felt to be fundamentally changing revenue models and operational structures, the report also found 43% view keeping pace with technology change as their most significant business challenge for the year ahead.
This is reflected in the spending priorities for 2017, which focused on infrastructure and processes to support the implementation of new applications. Spend on the cloud (44%), infrastructure (43%) and agile transformation (35%) were in the top five priorities, helping companies to establish a more agile technology foundation so their businesses can implement new services and enable customer-focussed teams to better respond to customer needs. Simon Ratcliffe, Principal Consultant, Ensono commented: “Companies that address their technology infrastructure – be it cloud, legacy or the connections between the various elements – can adopt new disruptive technologies and build momentum. As this research demonstrates, companies now understand the business value of improving customer experience. The next step on the journey is to execute these new business models, and for companies to find ways to find the skills needed to do this.” The report also highlighted this need for new skills in digital transformation. While 39% rely on in-house skills for the broader set needed for transformation, only 35% believe their organisation has the right skills in place. 41% said they are working with a solutions vendor and more than a third are working with a consultancy or design agency to help plug gaps in skills. To download the full report, please visit: https://www.ensono.com/downloads/uk/digital-trends-2017 ### Protecting the Cloud from DDoS Cloud migration is a hot topic and has been for about five years now. Everyone is familiar with the broad adoption of cloud services by the business of all shapes and sizes. The types of cloud service used to support migrated applications and data vary, as do the scale of the service providers – from large public/hybrid cloud providers to smaller, more specialised managed application service providers – but what they generally have in common is both a reliance on Internet connectivity for access and multi-tenant infrastructure. [easy-tweet tweet="61% of data centre/cloud operators saw attacks in 2016 that saturated data centre bandwidth" hashtags="Cloud,Data"] Unfortunately, DDoS (Distributed Denial of Service) attacks are an increasing problem for cloud and hosting providers due to their rapid growth in scale and frequency. An individual attack may only target one application within an environment, but if the attack is large enough to saturate Internet connectivity then everything that shares the same Internet ‘pipe’ can be affected. This was highlighted by Arbor Networks’ Worldwide Infrastructure Security Report, which reported that: 61% of data centre/cloud operators saw attacks in 2016 that completely saturated data centre bandwidth 21% of data centre/cloud operators experienced more than 50 DDoS attacks per month As a result of the above, there is a growing pressure on cloud providers, and those procuring their services, to ensure the right availability protection is in place. How do we defend availability? There are two main ways in which a cloud service, or the customer of a cloud service, can be protected from the DDoS threat: The end-customer can procure virtualised DDoS protection infrastructure from their vendor of choice and pair this with a DDoS protection service from a specialised Managed Security Service Provider. This is the same model many enterprises have been using to protect data and applications resident in their own datacenters for years, simply transposed to the cloud. 
This has the advantage, from the end-customer perspective, of familiarity: the same solution is used for both cloud and non-cloud services. The alternative is for the end-customer to procure (or be provided with, as part of the core service offering) DDoS protection from their cloud operator. Many ISPs deployed DDoS detection and mitigation infrastructure to protect their own businesses and then looked to derive revenue from this capability by offering managed services to connected customers. Cloud service providers are increasingly doing the same – they need to protect themselves, so why not leverage the equipment and expertise they have put in place to provide sticky, high-value add-on security services to their customers? Both of the above can provide protection, and which is preferable will depend on the needs of the end-customer versus the capabilities of their cloud service provider. For the cloud operator, the latter is obviously preferred, and more and more operators are looking to explicitly provide DDoS protection services. However, this isn’t (generally) something they can do on their own; today’s volumetric DDoS attacks will cause problems for all but the largest cloud operators – an attack can exceed 500Gbps – and thus most cloud operators will need an upstream DDoS protection service to deal with high-magnitude attacks when they occur. The providers of these services are, in some cases, the vendors of the equipment that can be used within the cloud environment to provide local protection, and in these cases integrated, or even fully managed, services can be put into place. One thing is certain: cloud operators and the users of their services need appropriate DDoS protections to safeguard the availability of their services. Arbor’s WISR showed a significant jump in the proportion of data centre/cloud operators seeing revenue loss during 2016 due to DDoS attack, but this needn’t happen. Appropriate defensive technologies and processes exist, can be deployed, and can even drive new revenue streams for cloud operators.
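As a rough, purely illustrative sketch of the volumetric problem described above - traffic that saturates the shared Internet 'pipe' - the Python snippet below flags bandwidth samples that approach link capacity or spike far above a rolling baseline. It is a toy detector, not how Arbor or any upstream scrubbing service works, and all figures and names are invented.

```python
from collections import deque
from typing import Optional

# Invented example values: a 10Gbps internet link, sampled once per second.
LINK_CAPACITY_GBPS = 10.0
SATURATION_RATIO = 0.9        # alert when the pipe is ~90% full
BASELINE_WINDOW = 300         # seconds of history kept for the rolling baseline
SPIKE_MULTIPLIER = 5.0        # alert when traffic is 5x the recent average

history = deque(maxlen=BASELINE_WINDOW)

def check_sample(inbound_gbps: float) -> Optional[str]:
    """Classify one per-second bandwidth sample; returns an alert string or None."""
    baseline = sum(history) / len(history) if history else None
    history.append(inbound_gbps)

    if inbound_gbps >= LINK_CAPACITY_GBPS * SATURATION_RATIO:
        return f"SATURATION: {inbound_gbps:.1f} Gbps on a {LINK_CAPACITY_GBPS:.0f} Gbps link"
    if baseline and inbound_gbps >= baseline * SPIKE_MULTIPLIER:
        return f"SPIKE: {inbound_gbps:.1f} Gbps vs. baseline of {baseline:.2f} Gbps"
    return None

if __name__ == "__main__":
    # Normal traffic, then a sudden volumetric flood.
    samples = [0.8, 0.9, 1.1, 1.0, 0.9, 7.2, 9.6, 9.8]
    for second, gbps in enumerate(samples):
        alert = check_sample(gbps)
        if alert:
            print(f"t={second}s {alert}")
```

In practice the detection signal would come from flow telemetry rather than a hand-fed list, and mitigation - diverting traffic to an upstream scrubbing service - is the part that genuinely requires capacity beyond the operator's own.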
### Interxion Adds City Cloud to Cloud Connect Platform High availability, low latency and secure interconnectivity to City Networks City Cloud now available across Europe Interxion Holding NV (NYSE:INXN), a leading provider of carrier and cloud neutral colocation data centre services in Europe, today announces that City Cloud, City Networks’ OpenStack-based public infrastructure, has joined its Cloud Connect platform. Enterprise customers can now interconnect directly with City Cloud through City Connect. This enables customers to extend their network with high-speed and secure connections to City Cloud points of presence in Interxion’s Stockholm, London and Frankfurt campuses. By connecting to City Connect through the Cloud Connect platform, Interxion customers will have a reliable and high-performance VPN connection to City Cloud. At the same time, this enables enterprises to easily maintain fully redundant infrastructure and build systems that span multiple service providers across Europe. City Cloud customers can also take advantage of the Cloud Connect platform and gain access to other cloud service providers available on the platform. [easy-tweet tweet="77% of digital leaders use or plan to use hybrid and/or multi-cloud environments" hashtags="Cloud,HybridCloud"] Highlights ·  According to IDC, European businesses are taking control of IT transformation with new infrastructure solutions. Digital leaders are far ahead of other companies in their use of hybrid cloud: 77% of them use or plan to use hybrid and/or multi-cloud environments. ·  City Network is a gold member of the OpenStack Foundation, providing funding and aligning with the OpenStack mission. Using OpenStack for its public IaaS, City Cloud offers a wide range of benefits such as high performance, open APIs and no vendor lock-in. ·  To meet industry-specific demands for high security and availability, both City Cloud and Interxion comply with ISO standards and industry-specific regulations. By combining City Cloud and Cloud Connect, organisations can comply with demands originating from laws and regulations concerning auditing, reputability, data handling and data security, such as Basel II and Solvency II. “We are excited to join Interxion’s Cloud Connect platform as we continue to focus on innovation through open IT infrastructure. Customers can now access City Cloud throughout Interxion’s footprint and beyond using cloud services from City Cloud and other Cloud Connect partners. This will enable multi-cloud strategies, and is ideal for organisations who are subject to regulatory requirements for data sovereignty and security,” said Johan Christenson, CEO and founder, City Network. “The addition of City Cloud to our Cloud Connect platform further increases the value to our customers who want to build and operate high-performance hybrid IT solutions from our colocation facilities using City Cloud’s OpenStack infrastructure,” said Vincent in 't Veld, Director Strategy & Marketing, Interxion.
Thankfully, cloud investment can be regularly and easily adjusted to meet the needs of any organisation, so that both short-term and long-term growth or downsizing can be mirrored in the level of cloud services solicited. Cost Effective? According to Sanil Solanki, research director at computing firm Gartner, the financial benefits of cloud computing are a major point of confusion, both within and outside of council offices, resulting in many IT departments meeting opposition when attempting to introduce the technology. Solanki added that IT professionals desperate to begin implementing the cloud should highlight the significant amounts of money that could be saved; a point that’s particularly relevant in the current economic climate as councils around the country are forced to cut budgets. Meeting Goals Another council requirement that could tip the scales in the direction of cloud computing is the need for a fast, reliable IT service that can be used to achieve the organisation’s goals. For example, an outdated IT system in a council is likely to have negative implications when it comes to upgrading procedures for communicating with the public. To stay ahead, councils need to consider the benefits of a regularly updated cloud system, which could be the key to smooth running, increased public engagement in the future and achieving goals. Meeting Compliance Mandates Compliance mandates continue to change and develop as businesses and government departments increasingly rely on computers and other forms of technology to run their operations. As the EU sets its sights on imposing yet another set of data protection regulations in the form of the General Data Protection Regulation (GDPR), councils should ensure they are prepared by updating the security of their data management systems. Thankfully, cloud systems are inherently safer than traditional systems, making it possible for organisations to meet their compliance mandates with less time, less effort and less expense. The Benefits of the Cloud We’ve already covered many of the benefits of the cloud above, but when approaching your council it’s important to use concise and clear arguments to ensure they understand how beneficial cloud investment could be. We’ve listed the main arguments below: Increasing Return on Investment Those higher up in your council will want to be convinced that the cloud can help them grow and develop as and when they need to. According to a survey conducted by Cloud Industry Forum, 55 percent of organisations feel they have received a greater return on investment (ROI) and a competitive advantage against their competitors by embracing cloud technology. Reducing Demands on IT Departments Many IT departments are overwhelmed by the number of people requiring help with data and hardware issues every day. However, by upgrading to cloud technology, IT experts can be freed up to focus on other aspects of the business while the cloud provider deals with any issues. Cloud Flexibility Flexible working is increasingly popular among businesses and organisations, with many employees now completing their work tasks using laptops and other mobile devices. By using cloud technology, organisations can ensure workers can access files and documents on any device without hassle or delay. The Cloud in Action In case these benefits aren’t quite enough to convince your council to embrace cloud technology, we wanted to include an example of the cloud in action. 
In particular, the story of how the Royal Borough of Windsor and Maidenhead in Berkshire recently managed to implement an extensive overhaul of its IT infrastructure. As part of its overhaul, the organisation moved its back-end server estate to the cloud, brought disaster recovery in the house and introduced remote working technologies that improved service while reducing costs. These changes have not only reduced energy usage across the data centre, but the centralisation of storage has also reduced risk and helped the borough save £100,000 per year. "Early last year we decided it was time to consider virtualisation to help achieve our objectives," said Keith Clark, Head of ICT at the Royal Borough. "So far, we have reduced our energy consumption by 44%, with total project savings of around £1.2 million expected during a five-year period."   ### CensorNet Wins 2017 Red Herring Top 100 Europe Joins previous winners including Alibaba, Facebook, Google and Skype in earning the prestigious honour. CensorNet, the complete cloud security company, announced on 3rd May, that it has been selected as one of this year’s Red HerringTop 100 Europe winners, a prestigious list recognising Europe’s leading private companies and celebrating these startups’ innovations and technologies across their respective industries. Red Herring Top 100 Europe celebrates outstanding entrepreneurs and promising companies, selecting the award winners from approximately 1,200 privately financed companies each year in the European region. Since 1996, Red Herring has kept tabs on these up-and-comers and their Top 100 list has become a mark of distinction for identifying promising new companies and entrepreneurs. Red Herring’s editors were among the first to recognise that companies such as Alibaba, Facebook, Google, Kakao, Skype, SuperCell, Spotify, Twitter, and YouTube would change the way we live and work. Thousands of the most interesting and innovative companies have graced the Top 100 list over the years. [easy-tweet tweet="Combining web and email security with CASB, organisations can embrace the cloud securely" hashtags="TheWeb,Cloud"] "We’re delighted to be recognised by Red Herring as one of the Top 100 most promising technology ventures in Europe,” said Ed Macnair, CEO of CensorNet. “We have dedicated ourselves to understanding the cyber threats being faced by businesses across the world.  The result is our innovative enterprise-class Unified Security Service – by combining web and email security with CASB Cloud Application security, we offer a ‘single pane of glass’ platform, underpinned by robust Multi-Factor Authentication that gives organisations of any size the freedom to embrace the Cloud securely.  This award validates our vision and product direction.” Red Herring’s editorial staff evaluated the companies on both quantitative and qualitative criteria, such as financial performance, technological innovation, management quality, overall business strategy and market penetration. This assessment was complemented by a review of the track records and standings of similar startups in the same verticals, allowing Red Herring to see past the “hype” and make the list a valuable instrument of discovery and advocacy for the most promising new business models in Europe. “In 2017, selecting the top achievers was by no means a small feat,” said Alex Vieux, publisher and CEO of Red Herring. 
“In fact, we had the toughest time in years because so many entrepreneurs had crossed significant milestones so early in the European tech ecosystem. But after much thought, rigorous contemplation and discussion, we narrowed our list down from hundreds of candidates from across Europe to the Top 100 Winners. We believe CensorNet embodies the vision, drive and innovation that define a successful entrepreneurial venture. CensorNet should be proud of its accomplishment, as the competition was very strong.” Following CensorNet’s well-deserved win, the company has been invited to showcase themselves to the US market at the Top 100 North America event in June and later compete internationally for the Top 100 Global in November. ### Call for Digital Specialists for Cancer Innovation Challenge Less than two weeks left (registration closes 15 May) to apply to funding competition to help improve cancer patient care Share of £325,000 available to fund innovative tech solutions  (L to R) Steph Wright - Project Development Manager,  Cancer Innovation Challenge, Catherine Calderwood - Chief Medical Officer, Gillian Docherty – CEO, The Data Lab  Scotland’s digitally skilled workforce is being invited to apply their skills and expertise to help Scotland become a world leader in cancer care. The Cancer Innovation Challenge, launched in March, is seeking novel data and tech proposals to help cancer care in Scotland. There are just two weeks left for anyone interested in contributing and to apply for a share of £325,000 from the Challenge’s first funding competition to find new ways of recording and integrating patient data to develop leading-edge care solutions, with registration closing on 15 May. Data is gathered at every stage of a patient’s cancer journey. Insights derived from this data have the potential to improve services, treatments and outcomes. The Cancer Innovation Challenge is funded by the Scottish Funding Council and is being delivered by three Scottish Innovation Centres – led by The Data Lab and supported by the Digital Health and Care Institute (DHI) and Stratified Medicine Scotland (SMS). [easy-tweet tweet="The Cancer Innovation Challenge is seeking data and tech proposals to help cancer care in Scotland" hashtags="Health,Data"] Gillian Docherty, CEO at The Data Lab, said: “It is critical we harness the skills of a digitally literate workforce in Scotland whether they have experience in the health sector or not. There have been phenomenal advances in cancer care in Scotland over the last decade and, while we understand the outcomes of patient care, time and funding constraints sometimes limit our ability to analyse how cancer services could be improved. We’re keen to encourage learnings from other sectors so are calling on anyone with data and digital skills who wants to make an impact on cancer care to get involved.” The funding competition is split into two phases with the first lasting for three months with a focus on technical feasibility and the second lasting for six months with a focus on development and evaluation. A second funding competition will be launched later in the year focussing on innovative data science solutions using cancer data. Additionally, anyone with experience of working with large complex data sets in secure environments is invited to respond to the consultation process to help shape the second phase of the competition and ensure that the outcomes from the project have the maximum potential to effect real world impact on cancer patient care. 
The consultation is now live and the deadline for responses is 5 June. Another way to be part of the Cancer Innovation Challenge is to participate in the Cancer Data Dive taking place in Edinburgh on 15-18 June in partnership with Product Forge. Entrepreneurially minded data scientists, analysts, clinicians, designers and software engineers are invited to come along and spend three days and nights developing data-focused projects to improve cancer care in the NHS in Scotland. To find out more about the Cancer Innovation Challenge’s current funding opportunities, the Cancer Data Dive, and the cancer data consultation, please visit www.cancerchallengescotland.com ### £37 Billion Wasted on Failed Agile IT Projects British business is set to waste an estimated £37 billion on failed Agile IT projects over the course of the next 12 months, according to a new report from independent IT consultancy 6point6. 6point6 commissioned a survey of 300 CIOs in the UK and the US to examine their experiences of Agile and measure how successfully the principles of Agile are being applied and executed. The findings are being launched at the CIO Insight Summit in Frankfurt today in a major new report, An Agile Agenda: How CIOs Can Navigate The Post-Agile Era (available for download at http://content.6point6.co.uk/an-agile-agenda). Chris Porter, CTO and co-founder of 6point6 and one of the authors of the report, said: “Agile IT in the UK is facing a hidden crisis - 12% of Agile projects are failing completely. CIOs tell us they expect to undertake six agile projects next year, one in eight of which will fail completely. Given there are about 6,000 CIOs in the UK and that the average Agile IT project costs approximately £8 million, that represents a huge amount of waste. The truth is that, despite the hype, Agile development doesn’t always work in practice.” The research also uncovered that over half of CIOs regard Agile development as “discredited” (53%), while three-quarters (75%) are no longer prepared to defend it. Almost three quarters (73%) of CIOs think Agile IT has now become an industry in its own right, while half (50%) say they now think of Agile as “an IT fad”. Chris Porter said: “This is a conservative estimate. We’ve only looked at Agile IT projects that fail completely. This doesn’t include the waste involved in Agile projects that fail only partially. UK and US CIOs now estimate that nearly a third (32%) of Agile projects fail to some degree. The failure to apply Agile effectively is a huge problem for the UK.” The IT consultancy is launching its 6point6 Agile Agenda to tackle the problem and help CIOs survive in the post-Agile era. Be ready to distribute Agile 32% of Agile projects that fail do so because teams are geographically dispersed. As Agile is increasingly used at scale, organisations have been forced to look across local, regional and international boundaries to find the talent they need in the quantity they require. Agile favours co-location of teams and business people, but this is increasingly a luxury, so organisations must develop ways to successfully overcome this limitation, which will involve additional process, communication, governance and management. [easy-tweet tweet="44% of Agile IT projects that fail, do so because of a failure to produce enough documentation" hashtags="AgileIT,CIO"] Be ready to scale Agile 6point6’s research showed 95% of CIOs have worked in a scaled Agile environment involving multiple teams working on multiple projects concurrently. But Agile itself does not provide a means to scale up or out. Instead, organisations must either establish their own scaled Agile methodology or embrace an existing one. Be ready to transition The report revealed that 44% of Agile IT projects that fail do so because of a failure to produce enough (or any) documentation. Agile teams cannot service a piece of software indefinitely. At some point it will have to be handed over to others to run, and they will need a set of instructions that can’t simply be written at the end of a project. Be ready with up-to-date architecture and designs, and other mechanisms such as wikis, automated tests, dynamically generated documentation and Clean Code (see the short sketch below).
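As a small, hypothetical illustration of what "automated tests and dynamically generated documentation" can look like in practice, the Python sketch below uses a doctest: the usage example lives in the code itself, can be rendered into generated documentation, and is executed as a test, so handover documentation cannot silently drift out of date. The function and figures are invented for illustration (six projects, a one-in-eight failure rate and an £8m average cost, echoing the report's numbers).

```python
def project_failure_cost(projects: int, failure_rate: float, avg_cost: float) -> float:
    """Estimate the cost of completely failed projects in a portfolio.

    The example below doubles as living documentation: doctest runs it on
    every build, so the documented behaviour is always the tested behaviour.

    >>> project_failure_cost(projects=6, failure_rate=0.125, avg_cost=8_000_000)
    6000000.0
    """
    return projects * failure_rate * avg_cost


if __name__ == "__main__":
    import doctest
    doctest.testmod()   # fails loudly if the documented example no longer holds
```

Documentation generators such as Sphinx can then pull the same docstring into the project's reference material, which is one way a team keeps instructions current rather than writing them all at handover.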
Protect the CIO 6point6’s research demonstrated that the average life expectancy of a CIO is just 14 months. This is simply not long enough to effect the cultural change and secure the ongoing boardroom support required to nurture success in an Agile environment. CEOs must empower and support their technology leaders if they want to succeed in the digital economy. Bring back planning The survey found that 34% of failed Agile projects failed because of a lack of upfront and ongoing planning. Planning is a casualty of today’s interpretation of the Agile Manifesto, which can cause Agile teams to lose their way and their customers to lose visibility of what they are getting for their money, now and in the future. Bring back IT architecture 68% of CIOs agree that Agile teams require more Architects. From defining strategy to championing technical requirements (such as performance and security) to ensuring development teams stick to the rules of the game, the role of the Architect is sorely missed in the Agile space. It must be reintroduced. But the report did uncover some relatively good news for the UK tech industry. While the rate of complete failure of Agile projects was 12% in the UK, in the US CIOs report that 21% of Agile projects result in complete failure. The organisations taking part in the 6point6 research employed approximately 1,300 people on average, with revenues of £127m. ### NHS Trusts Still Wary of Cloud New research has found that nearly two-thirds of NHS Trusts choose not to store any of their data in the private or public cloud. These results came from a freedom of information request sent by cloud and managed services provider ANS Group. Out of the 142 NHS Trusts that were approached, 86 responded. The findings revealed that 63% (54) didn’t store any of their data in the cloud, while the remaining 37% (32) stored some information in the cloud. Of those that use it, 63% (20) opted for private cloud, while 13% (4) used public cloud and a further 25% (8) used a combination of the two. When asked if they were considering moving any data into the cloud during the next 12 months, the majority (59%) of the NHS Trusts asked said they weren’t. [easy-tweet tweet="NHS Trusts are handling and storing an increasing amount of extremely sensitive data each day" hashtags="NHS,Data"] Andy Barrow, chief technology officer at ANS Group, said: “NHS Trusts are handling and storing an increasing amount of extremely sensitive data each day. Although cloud platforms can offer secure and scalable storage, these findings suggest that some Trusts may still be hesitant to embrace it, and therefore could be missing out. “Keeping costs low is also a huge priority for the NHS in its current circumstances.
Cloud could be a viable option for Trusts to explore since they will only pay for the storage that they use and may not have to invest in costly hardware. “Any NHS Trusts looking to take advantage of cloud must ensure they have the correct skills, either internally or externally, to navigate the migration and manage their cloud environment. They will also need cloud-ready networks to be in place in order to make the most of this shift. “Although there are steps to be taken in order to prepare, NHS Trusts looking to streamline and modernise the way they operate could certainly benefit in some way from a bespoke cloud solution. Businesses in all sectors are using the cloud as a facilitator and there is no reason why NHS Trusts can’t do the same.”  ### MENA Cloud Adoption: Opportunities and Challenges Historically, using computer programs meant buying a physical software package, installing it manually onto the hard drive, and upgrading the programs intermittently at significant cost and at risk of incurring compatibility issues. Maintaining business software was an inefficient and costly process. Then came the cloud. The cloud removes the burden of manually upgrading hardware and managing data storage while providing employees with a global means of communicating with customers, and access to multiple cloud providers and key data from any location. The possibilities of such an IT infrastructure are endless. From improving business agility to driving down capital expenditure on hardware. The cloud enables businesses to be more efficient and strategic and facilitates global engagement with businesses and consumers. Right now, one of the most exciting areas of cloud growth is the Middle East and North Africa (MENA). Gartner estimates that cloud spending in the region has increased by 22 percent in the last year. The total spend currently sits at $1.2 billion and is expected to reach $2 billion by 2020. The main growth drivers are Platform as a Service (PaaS) and Software as a Service (SaaS), which currently account for $1 billion between them. Despite predicted growth, the cloud is still an emerging technology in the MENA region. For all the region’s potential, there is resistance and challenge in equal measure. For cloud providers to work strategically in the region, the following opportunities and challenges should be taken into account. [easy-tweet tweet="Security represents the most pressing challenge to the future of cloud in the MENA region" hashtags="Security,Cloud"] The challenges: security and connectivity Security represents perhaps the most pressing challenge to the future of cloud in the MENA region. Research from PwC indicates MENA businesses are several times more likely to fall victim to cybercrime due to shortcomings in the necessary security knowledge, skills, and processes to effectively identify and respond to threats. This evidence is supported by high-profile breaches in the region, including the hack in December 2016 on the Saudi Arabian government, and the 2012 attack on Saudi Aramco, which put roughly 10 percent of the world’s oil and gas at risk. Ultimately, embracing the cloud’s additional connectivity will require MENA businesses to fortify their security measures. In meeting this challenge, providers should work with local businesses by supplying education on the cyber security landscape, and how best to secure a business operation. This includes information on necessary skills and the security solutions currently available to businesses. 
Connectivity Issues

The MENA region possesses an undeveloped technological infrastructure, which cannot provide local businesses with affordable bandwidth. Naturally, this has inhibited the growth of bandwidth-reliant technologies such as the cloud and has cultivated a business culture still largely reliant on more traditional hardware solutions. The availability and quality of data pipes and data centres in the region is improving, but demand for bandwidth still greatly exceeds supply. This has pushed regional bandwidth costs several times higher than anywhere else in the world. As such, businesses considering cloud deployment are also forced to assess whether they have the resources to sustain a cloud-centric business model. Providers should facilitate an effective means for businesses to monitor their cloud application usage, in line with available resources and cloud deployment goals. This system will enable adopters to identify the areas of high-bandwidth usage and adjust their strategies accordingly.

Opportunities within the MENA market

The driving factor behind the move to cloud in the MENA region is not architectural but rather user-generated. Providers no longer need to convince organisations that the cloud is the next key business revolution. They already know. And they're now demanding the same 'anytime-anywhere-on-any-device' service experienced by their counterparts in Europe, Asia, and North America. User-generated demand indicates huge growth potential for regional providers in the coming years. To capitalise on this opportunity, providers should work to facilitate an environment wherein local businesses not only want to adopt the cloud but where they also feel ready to do so. They need to ensure businesses are 'cloud-ready'. In supporting regional cloud-readiness, providers should continue to educate prospective adopters on cloud services, because businesses need to know what the cloud does and how it can benefit their goals before they can understand how it will fit into their business models. Providers should also offer services to ensure businesses have the necessary skillsets to support cloud deployment. This may include introducing cloud training programmes that vary by context and difficulty, to ensure businesses and employees have the relevant practical experience for their needs.

Network Architecture

The network architecture in the MENA region has traditionally been an issue. Due to regional regulatory and territorial complexities, it has made more sense for networks to send their local traffic all the way to Europe to exchange data before backhauling it to the Middle East. This system has both reduced cloud efficiency and increased network costs. Cloud and network providers should view the regional infrastructure as both a challenge and an opportunity. By working to build regional data centres and improve the abundance and quality of data pipes, cloud and network providers can ensure a more functional and efficient network architecture for local businesses. Infrastructural improvement will not happen overnight; it will require a long-term commitment from providers and cooperation from local stakeholders. The same can be said for all of the challenges and opportunities within the region. But providers should recognise that the move to the cloud is not just a fad. In fact, Cisco's Global Cloud Index estimates that 95 percent of MENA workloads will be processed in the cloud by 2020. 
Given the real prospect of a cloud-driven future, providers should be doing all they can today to ensure they are in the best possible position for tomorrow. It may not be the providers with the biggest names or most resources that establish themselves as leaders in the region, but those most willing to address market needs.

### A-Grade for Panasonic Education Exhibition

Panasonic has hosted the third iteration of its popular Education Exhibition, presenting the company's visual, security and broadcast camera products aimed at the education market. Held at central London venue The Brewery, attendees from nearly 40 UK and Irish universities were given a tour and live demonstrations of key technologies during the afternoon event, followed by a chance to learn more at a question and answer session. Highlights of the day included a full lecture capture demonstration, integrating Panasonic's AW-HE40 remote camera and Panopto recording and editing software for a complete lecture capture solution. Additionally, there was the chance to get hands-on with the VariCam LT, already in use in the UK at Bournemouth University. Panasonic's visual presence was also strong, with a laser projection area including the PT-RZ12, PT-RZ970 and PT-RZ660 up and running for comparison, as well as video wall and digital signage technology. Attendees had the opportunity to hear from partners Draper and Panopto with short presentations from each. Queen's University Belfast also spoke at the event, elaborating on their experiences refurbishing the Medical Biology Centre with 18 Panasonic LCD displays. [easy-tweet tweet="Next year Panasonic will be bringing forward more case studies to learn about how AV can be used" hashtags="AV,Panasonic"] "This is the third year we've run this event – it is consistently very well attended and the opportunity to get hands-on with our products is well received by the education sector," said Gareth Day, UK Visual Systems Group Manager. "The presentations this year had a very encouraging response and so I think next year we'll be bringing forward more case studies to allow the sector to learn from each other about how AV can be used." For more information about Panasonic's full range of solutions for business, visit http://business.panasonic.co.uk

### Private vs. Public vs. Hybrid Cloud Solutions for Small Businesses

As with anything in life, the business world is not free from trends. As new solutions are developed, the business community jumps at the opportunity to make their lives (and conducting business) easier. Some of the newly developed solutions turn out to be immensely beneficial and improvements are continuously developed over time. One such example is cloud solutions. They have proved to be immensely beneficial, have spread everywhere and chances are, you are already using a form of a cloud-based solution (at least CRM software). With time a few distinct forms of cloud solutions have developed. Everything first started with public cloud solutions, then came private ones, and now hybrid cloud solutions are all the rage. With that many options, it is easy to get confused or feel overwhelmed when it comes to deciding which cloud-based solution works best for your business. Luckily, this article is here to help.

Public Cloud Solutions

Public cloud was the first type of cloud-based solution that was developed and it revolutionised how we conduct business. Do you use Dropbox? Or do you have an outsourced e-mail provider? You are then using a public cloud service! 
If the service is provided off-site, over a public network (the Internet), it is a public cloud service. It provides adaptability and flexibility: you pay for what you use. That gives you the ability to tailor the system to your needs, as well as change it as your business develops and grows. However, since these systems use a public network, they are by far the most vulnerable ones. You also have the least control: you trust that the provider will keep your data safe and secure. That doesn't necessarily mean that public cloud options are bad. They are a great solution for everything from human resources to customer relations management. Just make sure your provider has solid backup and recovery as well as data security solutions.

Private Cloud Solutions

With the rise of security concerns regarding public cloud solutions came private clouds. If your company is very data-oriented and data security is your primary concern, private cloud solutions are the way to go. The same applies if your industry requires extreme security measures, as in banking or the military. The downside is that you lose the scalability and adaptability of public cloud solutions. Also, you yourself will have to handle all the hardware and software, so that means more work. Not to mention the huge investment needed to set up and maintain your private cloud. This will require an educated and capable IT department that will be able to set everything up and work on maintaining the system. With such expensive downsides, not many companies can afford a private cloud, but most likely you don't need it anyway. There aren't that many small or medium-sized businesses that handle the kind of crucial data which requires extraordinary security measures. If you're not a large system like a bank or an investment fund, it's likely that there are better solutions for you than private clouds. [easy-tweet tweet="Hybrid cloud solutions balance out issues of having local, on-site cloud computing" hashtags="Hybrid,Cloud"]

Hybrid Clouds

Technology is constantly improving and developing, so it is no surprise that a solution combining the benefits of both private and public clouds was soon developed. Hybrid cloud solutions balance out the issues of having local, on-site cloud computing with the benefits of having a cloud services provider. Let's say that you need a private cloud for storing sensitive information and data, but at the same time use a public cloud for less-critical tasks like interacting with your customers. Hybrid cloud solutions try to take the best from both worlds. They aim to provide users with the scalability and flexibility of public cloud solutions while offering higher data security. But hybrid cloud solutions do require more investment of time and money – managing a hybrid cloud solution tailored specifically to your needs won't be as easy or as cheap as with public clouds. That being said, they do offer a good balance between the two worlds and probably are the best solution for most small and medium-sized businesses. Finding the right fit for you is the key. There are many factors to weigh up and think about. And sometimes what seems like the cheapest solution is not necessarily the best (and ends up costing you much more in the long run). If you're not an IT expert yourself, the best advice would be to try to find someone who is - a good, reliable cloud service provider that will take the time and effort to create a cloud-based package that will fully satisfy your company's needs. 
Also, a smart thing would be to ask for a second and a third opinion. Don’t be afraid of shopping around and taking the time to pick the right choice. A little more effort at the beginning will certainly pay off in the end. ### HPE Launches GDPR Starter Kit Bundled security and governance solutions help organisations quickly get started to meet GDPR deadline Hewlett Packard Enterprise (HPE) Software today announced the availability of a GDPR Starter Kit, which helps organisations take a critical first step in preparing for the European Union’s looming General Data Protection Regulation (GDPR). This bundled set of software solutions assists organisations to automatically identify, classify, and take action to secure information that falls under this regulation. GDPR presents a unique challenge to organisations around the globe, since it applies to any entity that collects or processes EU citizen data, and imposes harsh penalties for those that do not comply by May 2018. According to a recent PwC Pulse Survey of C-suite executives from large multinational corporations, 54 percent of respondents reported that GDPR readiness is the highest priority on their data-privacy and security agenda, and 77 percent of respondents plan to spend $1 million or more on GDPR compliance.1 “Getting started may be the greatest challenge for many organisations, as data volumes often number in the billions of objects, timeframes are constrained, and determining what falls within these regulations can be cumbersome and complex,” said Joe Garber, vice president marketing, Information Management & Governance, HPE Software. “The GDPR Starter Kit provides customers with an easily integrated solution set for assessing data, allowing them to take the first step in addressing data and risk management outlined in the regulation.”  PwC highlighted Personal Data Assessment as a key stage in compliance in a recent white paper, “Technology’s Role in Data Protection – the Missing Link in GDPR Transformation.” The GDPR Starter Kit follows HPE’s earlier launch of a comprehensive GDPR solution portfolio and aims to provide organisations with streamlined next steps on their paths to compliance. [easy-tweet tweet="The GDPR Starter kit combines world-class software, to help conduct a Personal Data Assessment" hashtags="PersonalData,GDPR"] “The effective use of technology is critical for organisations to monitor what sensitive EU citizen data they hold, and to apply and enforce policies to protect this information,” said Stewart Room, global data protection legal services leader at PwC. “With the upcoming EU General Data Protection Regulation set to deliver a fundamental change in how personal data is handled, organisations must ensure they have the right technology and controls in place to meet the new requirements. The natural first step for many is to use analytics tools to understand what personal data is held, where it’s being stored, and how to classify it." Neil Cattermull from Compare the Cloud said: "GDPR is not just a leading term of engagement for consultancy practices, it's a very real and present issue that has to be addressed now. 
It's satisfying to see the technology vendors assisting in this way to enable compliance as well as growth amongst industries." The GDPR Starter Kit combines world-class software, including HPE ControlPoint, HPE Structured Data Manager, HPE Content Manager and HPE SecureData, in bundled solutions to help customers conduct a Personal Data Assessment and optionally encrypt data that is subject to these regulations. This unique combination of classification, governance, and data security products delivers a number of important benefits:
- Automate assessment of structured and unstructured data, which alleviates a traditionally manual, error-prone process.
- Quickly and cost-effectively enable GDPR-responsive data to be encrypted in an automated fashion to mitigate security breaches.
- Take a critical step toward lifecycle and retention management to enable compliance with additional GDPR articles and corporate governance requirements.

### Secure Your Cloud with Cloud

As IT&C companies and their customers keep migrating from independent, standalone hardware and software infrastructure of their own to virtualized environments, several other issues, problems and risks arise. Most of these new threats and risks are related to security. Everything changes once the IT infrastructure has been cloudified. For one, you no longer store your own data in your own facility. You used to know exactly where your data was and, moreover, what protection and security your systems implemented, from physical premises security through to securing the hardware and software applications. Of course, this means that YOU have to design, implement, manage and maintain your own:
- Premises physical security system
- Resilient and redundant hardware infrastructure
- Resilient and redundant storage systems
- Resilient and redundant network infrastructure
- Resilient and redundant applications and software

All of these require investment and personnel. And not only once! You will have upgrade costs, recurring costs and subscription costs forever. Relax, all of your IT&C infrastructure and all your data are safe and secure… or are they? Implementing and maintaining a robust, reliable, secure environment is an effort, even for IT&C companies where the focus of the business is exactly the IT&C systems. What if you are not an IT&C company, but you are running an IT&C infrastructure? All of these items are just cost and effort for you. And you probably are not so safe anyway, as cyber risks evolve, sometimes faster than cyber defences. So… should you move your IT&C environment into the cloud? The TCO is lower after all, and you can now focus on your own business. You are actually moving your precious information, your intelligence, your assets into a cloud, where they will be taken care of by your CSP. But is it safe? Is it secure? Yes, it is! In a professional CSP environment, everything you used to do on your own at a smaller scale is now performed as a core business focus and at a larger scale. [easy-tweet tweet="AdNet Telecom's data centre provides all the physical premises security" hashtags="Security,DataCentre"] How do they (CSPs) do it? First, usually, your environment is hosted in a dedicated DataCenter which normally complies with at least Tier III specifications (<<link to uptime institute or a similar definition of the 4 tiers>>). 
AdNet Telecom's data centre provides all the physical premises security (resilient and redundant power supplies, cooling, network connectivity, surveillance, firefighting and a totally automated DataCenter Infrastructure Management). Second, your servers and storage infrastructure will most likely be virtualized. Believe it or not, this actually makes your data more secure, as most virtualization hypervisors (such as VMware) address several security issues:
- Your data is not running on one specific hardware platform (like a server with a set number of CPUs, RAM and storage disks) where any failure or malfunction would result in information loss or reduced availability until a technician replaces or repairs the unit. Your data is stored in complex, resilient, redundant multi-access storage platforms with hundreds of thousands of disk arrays. If one disk fails, a spare takes its place; if a second disk fails, the RAID mechanisms in place will instantly recover your data. If an access path to the storage unit fails, there are at least two or three more on active standby to take over. If a whole storage unit fails, backup, replication, disaster recovery and business continuity plans will ensure that your data is safe and secure at another, usually remote, site.
- Your applications run in a virtualized environment hosted on clusters of blade servers; your entire server can and will be migrated instantly from any faulty unit to another without any disruption.
- The storage that contains your data is connected to the virtualized servers that run the application over Virtualized Network Infrastructure (NFV/SDN). This means that switching and routing software actually runs as virtual entities on the virtualized cluster of blade servers, and besides being redundant by network architecture, these elements benefit from all of the redundancy and migration mechanisms that protect your data and applications. Usually, for a customer that is running a complex virtualized infrastructure in a CSP environment, AdNet Telecom provides dedicated virtual network infrastructure (Cisco Nexus1000V switches, Cisco CSR1000V routers, VMWare vSwitches) implemented to the specific customer's requirements and design.

Up to this point, it's clear that your assets are safe from data loss caused by any type of hardware malfunction of servers, storage units or network devices; and because of regular backups and replication, you are also safe from damaging the data yourself (by mistake or sabotage). If your data were totally disconnected from outside networks and the internet, that would be enough – but this is not the case.
- Q: How does AdNet Telecom, as a CSP, protect your network from outside networks, the Internet and external threats?
- A: By using a wide range of security appliances that address specific risks, threats and attacks!
- Q: Ok, but how is this different from what you would do to protect your locally hosted infrastructure?

Like the other virtualized network elements (switches, routers), security appliances can be, and are, virtualized as well, benefiting from all the advantages already described. If you want to implement a hardware security appliance of any sort, your trusted hardware provider will ask the right questions in order to identify the exact model you require, based on functions and performance. 
The issue here is that if you need some functions now and others later, you will most likely need to replace the old device with a new one, or buy all the functions from the start to avoid missing one – both options are very costly, require effort and cause downtime. What about performance? If initially you need your firewall to serve 1Gbps of traffic but expect your business to grow, what performance level should you choose? This changes with the virtualized flavour of security appliances. Their performance is directly related to the resources configured for the underlying virtual appliance. If you need two CPU cores and 4GB of RAM to serve 1Gbps of traffic, then to serve 2Gbps of traffic you just add two more CPU cores and 4GB of RAM to the virtual appliance and increase (if needed) the performance license. Needless to say, this is a very simple task using hypervisors such as the VMware suite. You also have the option of clustering your firewalls. If you need a cluster of two virtual firewalls to serve 2Gbps of traffic, you will just need three for 3Gbps of traffic and minor changes to your network configuration. No downtime! CSPs usually benefit from special packages from security appliance providers that enable them to easily increase or decrease license capacity depending on load. For example: if this month the overall summarised customer traffic through the firewall is 20Gbps, and trends show customer traffic patterns growing to 30Gbps, the CSP will simply order an extended capacity license for the next month and either increase the performance of the virtual appliances or add new firewall nodes to an existing cluster. No hardware change. No CAPEX! Cyber security is usually layered. The first layer of defence in AdNet Telecom's security infrastructure is the so-called "IP Firewall". This element is responsible for protecting the inside systems against IP, TCP and other OSI L3-L4 threats, risks and attacks (such as spoofing, SYN attacks, and others that attempt to alter IP/TCP headers in order to "trick" the network into treating the traffic as clean). IP Firewall functionality is accomplished by implementing Cisco ASAv security appliances and Fortinet FortiGate-VM versions. These are deployed at the externally exposed borders of the customer's dedicated virtual environment (for complex implementations) or as the main IP firewall for simpler configurations. The second layer of defence concerns DDoS-related attacks. These attacks focus on overloading the network and/or application infrastructure by generating, at very high rates and from multiple geographically distributed sources, requests that appear to be healthy. Unprotected systems will keep working, but the useful traffic cannot be delivered because of the overloading. The sources are plain users' terminals which are infected with malware that allows the attacker to control and coordinate many hosts over the Internet (all victims of that malware). AdNet Telecom protects its data centres against DDoS-type attacks using the Arbor Networks SP/TMS solution. The flow collector, Arbor Networks SP, constantly monitors all border routers and identifies surges or spikes in traffic. In such a case, it will analyse the traffic and decide if it's a normal traffic increase or an attack. The system has two methods of resolving DDoS attacks. The first is simply to deny all traffic matching the attack pattern and protect the network and applications against overload. 
This is achieved by automatically manipulating the border routers' routing tables in order to ignore the traffic from infected sources. It is very efficient, and Arbor can deny DDoS attacks of up to tens of Gbps. The downside of this approach is that multiple users usually share the same public IP via NAT; if one of the users behind a specific public IP is infected, Arbor will set the border routers to reject all traffic from that public IP, and all the other, non-infected users will not be able to access your applications. The second, improved method includes a cleaning system – the Arbor TMS. The flow collector works in the same way and detects the attack, but instead of setting the border routers to deny traffic from the sources, it redirects the traffic to the TMS, which inspects all traffic and surgically removes only the attack packets, thus allowing traffic from clean terminals that share public IPs with infected terminals. Because the TMS needs to inspect each and every packet, it usually does not have the same capacity as the detection/rejection engine. If cleaning capacity is reached, the system will revert to the basic functionality. Are all these systems enough to protect your cloudified infrastructure? Unfortunately, no! According to SDX Central's "2016 Next-gen Infrastructure Security Report", the most frequently occurring attack technique is SQL injection. SQL injection techniques consist of an attacker using common input forms on your application interface (such as the fields in registration or contact forms) to inject their own statements into the queries your application runs against its database. Sometimes this enables the attacker to destroy, or gain access to, your data. This type of attack will never be identified and filtered by an IP firewall (the packet headers are perfectly normal), nor by DDoS detection and cleaning (it is not an attack pattern), as this is just a normal web request on your application interface. There are several types of attack techniques which try to exploit some vulnerability in your application architecture. Developers will know about and prepare your systems against such vulnerability attacks when deploying the application and databases, but this is not enough. The solution here is to insert, on the request path between the attacker and the application server itself, an intermediate system which appears to the attacker to be the application itself. Besides optimization and acceleration of application traffic, SSL offload and load balancing, these intermediate systems – generically known as application firewalls, web application firewalls or application delivery controllers – will inspect each request, match it against an attack signature database, and sometimes perform the request's action and check the result without forwarding it to the protected application server behind it. AdNet Telecom deploys the A10 Networks Thunder ADC system as an application delivery controller, which can be customised for a specific customer application, with know-how from the application developers, so that it understands how your application works and which requests it should allow, quarantine or deny.
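To make the injection mechanics concrete, here is a minimal sketch in Python using the standard sqlite3 module (an illustrative toy, not part of AdNet Telecom's stack; the table and values are invented). It shows how a crafted form value rewrites a query when it is concatenated into the SQL text, and how a parameterised query keeps the same input as plain data:

```python
import sqlite3

# Toy illustration of the SQL injection risk described above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (username TEXT, is_admin INTEGER)")
conn.execute("INSERT INTO users VALUES ('alice', 0)")

attacker_input = "nobody' OR '1'='1"

# Vulnerable: the form value is concatenated straight into the SQL text,
# so the injected OR clause makes the query match every row.
vulnerable = f"SELECT * FROM users WHERE username = '{attacker_input}'"
print(conn.execute(vulnerable).fetchall())   # leaks all users

# Safer: a parameterised query treats the input as data, never as SQL.
safe = "SELECT * FROM users WHERE username = ?"
print(conn.execute(safe, (attacker_input,)).fetchall())  # returns nothing
```

An application delivery controller or web application firewall applies the same idea at the network edge, inspecting request parameters against known injection signatures before they ever reach the application.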
### Who owns Your App?

Post Brexit, venture capitalists are still pouring money into tech start-ups, creating fantastic opportunities for innovative businesses. Even with funding, however, turning innovative ideas into digital reality is challenging in a market dogged by the skills shortage. Rather than attempt to build software development skills in-house, turning to a third-party developer can be a fast track to delivering the new product. But, warns Nick Thompson, Managing Director, DCSL Software, don't forget to check who owns the Intellectual Property.

Roping in some help

The UK's technology sector drew more investment than that of any other European country in 2016, according to data from London & Partners, with more than £6.7 billion ($9.5bn) invested into UK tech firms in 2016¹. But coming up with a great new concept for an app or software as a service is just the beginning; getting the product developed, tested and available to customers is often far more of a challenge than start-ups expect. [easy-tweet tweet="The UK technology industry is suffering a well-documented skills gap " hashtags="Technology,IT"] One of the biggest problems is the sheer demand for top-notch IT development skills. The UK technology industry is suffering a well-documented skills gap – recruitment is tough and costs are escalating, especially in software development. Creating a good development team is a difficult and complex process - it can take years for an organisation to attain the correct blend of skills and expertise, a timeline that is simply not an option for startup companies with a quick go-to-market deadline. As a result, growing numbers of startups are turning to third-party developers to turn innovative ideas into commercial reality. And while this makes sense on many levels, there are a couple of major pitfalls that organisations need to avoid.

Setting the ground rules

Firstly, make sure the developer is not using its own proprietary tools to build the software. Unless the developer is using a standard set of Microsoft tools that can be picked up and used by any development team, the business risks being tied into that developer for the life of that product. While a proprietary product may offer a slightly quicker development time, the constraints on on-going product value and flexibility for the software owner vastly outweigh any time benefits gained. Secondly, ensure the business actually is the software owner. It sounds obvious, but the vast majority of organisations approaching developers to realise specific product ideas never check Intellectual Property (IP) ownership. Failure to ensure that the IP is transferred by the company building the software at the end of the project could be devastating. Without the IP, a company is effectively licensing its idea from the developer. And having built a commercial piece of software from the ground up, that developer could take its asset straight to market, bypassing the company with the original idea. Surprisingly, many company owners are incredibly naïve about IP and never ask the developer any questions about ownership, but simply asking the question up front and ensuring full ownership of the completed software should be an essential part of any bespoke software development contract. It doesn't get much worse than coming up with an idea, finding the backing and working with a bespoke software developer to bring that vision to life, and then realising that the business does not own the IP.

What if it doesn't go to plan?

Should the situation arise where a company doesn't own its IP, costs can start to accumulate at an alarming rate. Often the debate around who owns the IP doesn't really begin until a company wants to change their development company, so if it's then discovered that the developer actually owns the IP, things can start to get tricky. 
The worst case scenario is that the customer could be faced with starting all over again, as that developer is able to legally retain the rights to what they have created, which can be a huge expense for the business. Alternatively, a brand could also be faced with the challenge of being held to ransom by the developer, who may try to make them part with a large fee in return for the property. If companies decide to try and fight this, legal battles can also result in high costs, and often still leave the IP in the hands of the developer. Aside from the monetary costs, there are also some key business risks associated with the loss of your IP. One major problem with not owning the IP is what the developer might choose to do with it. If they own the app then they are well within their rights to sell any features they may have designed to any business that asks for them. This means a company could spend a significant amount of money to develop a system that puts them ahead of the competition, only for a competitor to swoop in and buy the work to date for a fraction of the cost. A quick recap of what businesses should do to prevent this from happening to them:
- Choose a reputable developer
- Make sure the software is developed in an industry-standard way
- Be upfront about the need to own the IP and make sure it is written into the contract

### BMC Introduces Cloud-based SecOps Response Service

Award-Winning Security and Compliance Solution simplifies, enforces and ensures security for multi-cloud and on-premise environments

BMC, a global leader in IT solutions for the digital enterprise, today introduced BMC SecOps Response Service, a cloud-based solution that eliminates security risks and reduces companies' overall attack surface across multi-cloud environments, including AWS and Microsoft Azure. When coupled with endpoint management tools, including Microsoft System Centre Configuration Manager (SCCM) and BMC BladeLogic Server Automation, organisations can rapidly prioritise and remediate. According to IDC, more than half of all enterprise IT workloads will live in the cloud within the next year.1 As organisations accelerate their cloud adoption strategies, many sacrifice security for the sake of speed. Traditional tools used to manage the data centre are not built for the cloud, and cloud security operations tools often require manual inputs that lead to delays and inconsistencies. Delivering on BMC's promise to offer customers multi-cloud management solutions to fit their needs and environment, the SecOps Response Service solution offers teams the ability to quickly enforce security policies and manage multiple cloud environments and on-premise systems, including data centres, from one tool. The SecOps Response Service solution:
- Enables security and operations teams to operationalise their vulnerability management strategies for faster remediation and reduced risk, with policies that are flexible and scalable;
- Provides visibility into planned actions, predictive service level agreements (SLAs), and burn-down views, thus enabling them to more actively control the security levels of an organisation;
- Prioritises vulnerabilities based on policy, SLAs, severity, and impact.

"Online Business Systems has built our Security Integration Framework on BMC technology, and we believe that the new BMC SecOps Response Service will fill a critical gap in the market and will be a game-changer in the Security Operations space," said Jon Fraser, Managing Director, North America, Online Business Systems. 
"As environments become more complex with the advent of hybrid infrastructure, there has been a significant gap in SecOps tools available to solve this challenge – until now." [easy-tweet tweet="SecOps Response Service facilitates increased coordination and collaboration between the teams" hashtags="SecOps,Cloud"] "When operations receive vulnerability data from the security team, it lacks the context necessary in order to prioritise and take action," said David Cramer, vice president and general manager of security operations at BMC. "SecOps Response Service helps answer critical questions such as, which applications are dependent on those systems? Do we have a patch to address the vulnerability and has the patch been tested? When is the next available maintenance window that we can use to remediate this issue? A cloud service like SecOps Response Service facilitates increased coordination and collaboration between the teams. By providing prescriptive and actionable data, IT operations and security teams can prioritise and remediate security threats and vulnerabilities based on the potential impact to the business." According to Gartner, the cyber security community harbours the irrational fear that most attacks will target zero-day vulnerabilities or previously undiscovered weaknesses. In fact, most malware is designed to attack vulnerabilities that IT professionals have already identified. Additionally, 99.99% of exploits are based on vulnerabilities that have already been known to security and IT professionals for at least one year.2 The SecOps Response Service solution addresses this by natively integrating operations with vulnerability scan data from Rapid7, Tenable, and Qualys, providing automatic enterprise-grade remediation. As companies look to enhance their Security Operations capabilities, BMC offers professional and education services to complement SecOps Response Service. These include subscription-based 24/7 access to online training to accelerate adoption of the solution, consulting for process and operational change management, a three-month Day 2 Operations service to help organisations optimise their use of the solution post go-live, and an Application Managed Service where BMC staff manage the solution on the client's behalf. For more information about BMC SecOps Response Service and to view a video with further details about the solution, click here. To register for the May 17 webinar about this critical solution, click here.

### The Top 5 Factors to Consider Before Purchasing CRM Software

Customer satisfaction is, no doubt, a key factor common to all thriving businesses. Therefore, how you deliver your product to the market is just as important as what you are delivering. Customer Relationship Management (CRM) software solutions help organisations handle business data relating to their clients. A CRM solution will help you deliver better customer service, retain customers, and up-sell and cross-sell more effectively. Good CRM software can, therefore, scale up your business considerably, as long as you get it right while shopping for one. Some of the factors you need to consider before buying one include the following.

Scalability

The primary intention of purchasing the software is to help your business grow. For this reason, consider the software's ability to handle the increased operations that are bound to arise with this growth. Will it grow with you? When will it require upgrades, and how much will they cost? The depth of features a CRM solution offers can indicate how scalable it will be. 
From professional sales features to help desk functions, investigate every aspect a software package promises to deliver before making the purchase.

Mobility

Businesses are increasingly working on the go. As such, a CRM software package that has mobile capabilities is preferred. As you shop for your CRM software, check out all the functionality it offers on its desktop and mobile versions. Software with automatic sync capabilities and offline features would be better for your business. These minor details will increase your staff's productivity immensely. [easy-tweet tweet="Purchasing CRM software will have significant implications on the business costs and operations." hashtags="CRM,Business"]

Features

Find out what features your business is most in need of before talking to CRM vendors. What are your current processes as a business, and what tools are currently being used by your staff? There will be no need to go for the biggest and most comprehensive software, since there are options that offer the best CRM for a small business without having to pay for extra features you don't need.

Test

Purchasing CRM software will have significant implications on the business's costs and operations. It is recommended that you first test it at the workplace before you buy. Let your workers have a go at it, testing every feature. Determine how helpful it is to them in performing everyday tasks like sending email replies, tracking leads and converting them into clients. Take into account their feedback before making the purchase.

Hardware

Some CRM software offers more flexibility than others. For this reason, know your hardware and how flexible a CRM solution is before making a purchase. A product that gives you and your workforce the ability to use it across any operating system, be it PC, Mac, Android or iOS, should be preferred. It will help your business and its workers remain efficient and ease the burden of transitions. It is worth noting that size matters. There are software packages considered the best CRM for small business. Referred to as small business CRM, these offer the basic features a small business needs without having to pay for the additional features in large enterprise applications designed for established businesses. Here is where to start your search for the best CRM for small businesses.

### Cloud-Managed BYOD

Bring your own device (BYOD) is not a new concept. The practice is rumoured to have originated at IBM in 2009 when, instead of fighting the trend of employees bringing their own mobile phones, tablets and storage devices to work, the company chose to embrace it. Since then BYOD has taken off in a big way, in both large enterprises and in small businesses. In 2016, Tech Pro Research reported that 72% of the organisations it surveyed were already allowing the use of personal devices for work, or were planning to introduce the practice within a year. The trend is perhaps unsurprising considering the consistent growth in the smartphone market. The Deloitte Global Mobile Consumer Survey 2016 found that last year smartphone penetration in the UK hit an all-time high of 81%. The company is predicting a rise to 85% in 2017, as we hit the 10-year anniversary of the first full touchscreen smartphone. Shipments are expected to grow from 1.48 billion in 2016 to more than 1.84 billion by 2020, making the smartphone the most successful consumer electronic device ever created. 
The BYOD Business Businesses point to increased employee mobility and productivity, as well as satisfaction and convenience, as the main drivers for BYOD practices.  With smartphone prices ranging from £70 to almost £1000, allowing employees to use their own device that they’ve personally chosen – and purchased – in the workplace also significantly reduces hardware expenditure for businesses. So why are we still discussing BYOD?  While on its face BYOD seems like a done deal, the inherent lack of control over employee devices presents two distinct challenges for business owners. The first is managing cost.  If employees are using their own devices to make business calls, but the business is footing the bill, how can the cost be controlled?  Some businesses provide a ‘work phone’ allowance employees can use to offset the cost of their mobile bill.  Others allow employees to expense the cost to the business.  Neither option gives the business control over the tariff, usage or cost of the plan.  It also means the business has no visibility into calls being made on behalf of the company. The second challenge is managing ownership of a business mobile number.  Employee work phone numbers become associated with their role, and more importantly, the business they are working for.  The contacts they make while working for the business also become associated with their work mobile number.  The issue arises when the employee leaves.  If they are using their own device with their own mobile plan, the business has no ownership over the number.  The employee leaves, and so do the business’ contacts. [easy-tweet tweet="Like many business technology conundrums, the solution to BYOD lies in the cloud " hashtags="BYOD,Cloud"] The Cloud Solution to BYOD The inability to manage cost, number ownership and ultimately control, means many business owners are still struggling to develop effective BYOD policies.  The answer, however, is simple.  Like many business technology conundrums, the solution to BYOD lies in the cloud. Mobile numbers, like many other aspects of our digital lives, can now be virtualized.  Businesses can store their mobile numbers in the cloud, and easily provide employees with a work phone number.  A cloud mobile number works just like any other mobile number.  In the background, however, the number is not stored on a physical SIM card but is held in a ‘cloud-based mobile network’.  Accessing the phone number works the same as any other cloud-based service – through an internet connected device.  It also has the same flexibility as any other cloud-based service – global access on any connected device, at any time. It Works Both Ways By using cloud mobile numbers, businesses can reduce the time and cost involved with traditional SIM-based contracts.  Cloud numbers can allow them to anticipate and prevent issues of increased costs and decreased visibility associated with BYOD, while also enabling a new level of control over a company’s intellectual property (IP). When a new employee starts, the business can allocate a work phone number the employee can use on their own mobile phone for as long as is required.  The employee has the convenience of using the handset they know, and the satisfaction of retaining their personal number. The business does not have to pay for a new device or contract.  It is also in control of the number.  When the employee moves on the business can immediately assign the number to their replacement, keeping the contact point with the business rather than the employee. 
Size Does Not Matter

The benefits of cloud mobile numbers stretch across businesses of any size, from large international enterprises to small high street firms. Corporates, small businesses, freelancers and the self-employed can all benefit from separating business and personal communications on one device, as well as the ease of deployment and cost saving capability that comes with cloud technology. From a cost and IP control perspective, the larger the organisation, the more savings that can be made. Businesses with high staff turnover, for example recruitment and estate agents, will see the most significant benefits. Even with just five employees, the cost saving can be significant. For recruitment and estate agents, in particular, business success depends on the contacts made using work mobile phone numbers. Keeping these numbers in the business when an employee leaves can directly impact bottom line results. With a cloud mobile platform, when the employee leaves, the business can easily reassign their number to their replacement. With no long contracts, SIM cards, or devices to manage, the contacts remain in the business and staff communication is significantly simplified.

Effective Management Through The Cloud

The centralised management capabilities of cloud platforms are transforming industries across the world. While the reduced cost, increased control and scalability all provide real business benefits, it is the centralised management capability that can make the most impact in the long term. With cloud mobile numbers, businesses can access every work mobile number via an online dashboard, adding and managing lines as required, with complete visibility over usage and billing. Cloud platforms can offer what traditional mobile network operators cannot: instant number transfer, live analytics, call recording and conferencing, smart diverting and CRM integration. Mobile phones once changed the communications landscape forever. The cloud will now do that again.

### Cloud Control, Make it Work For You

In business applications based on the cloud, the normal way of working is a browser to a server, epitomised by systems such as Salesforce. Benjamin Dyer, CEO of Powered Now, discusses how to extend this to industries that don't fit the standard model. The computer industry goes in cycles. First, there was the mainframe. This was guarded by "The Operators" – the high priests of the expensive, air-conditioned and corporate-owned data centre. You submitted a pack of punched cards and if you were lucky you got them back with some printouts the next day. My business partner is actually so old that he remembers those days. Next came "time sharing". You could dial up a data centre from a dumb terminal over a telephone line and pay by the minute. The problem was that line speeds meant a few characters were all that could be sent and received. The big change came with the mass-produced PC that could run in any office environment. In corporates, this morphed into client-server setups involving both PCs and mainframe servers. Then came the internet. Suddenly anyone could afford a connection and any resource could be accessed from everywhere. With always-on broadband, 3G and 4G followed by the smartphone revolution, the possibilities have exploded. [easy-tweet tweet="AWS and Salesforce are growing at unprecedented rates. " hashtags="AWS,Cloud"] Now, we have reached the cloud world. Large server sales are down year on year. AWS and Salesforce are growing at unprecedented rates. 
Microsoft and Google are playing catch up. Increasingly, everything happens in the cloud. The new model is somewhat like the mainframe all over again. However, this "mainframe" is composed of every single internet-connected server in the world and the "terminals" consist of every computing device on the planet. Compared with the previous setup, everything is available from anywhere, always on, and video has replaced text. The cost is hugely lower and this means that most people in the world can benefit. There's just one small problem. The mobile internet isn't always on. Anyone who, like me, makes frequent train journeys knows this. Yes, the net is often available. But no, not always. My company, Powered Now, builds apps for trade businesses like electrical contractors, builders, gas engineers, roofers and much more. These businesses are mobile so the mobile web is ideal. Smartphones and tablets are easy to use and freely available, so the time is right to computerise this industry, one of the least automated in the world. Invoicing is at the heart of what we do, but there is also team chat, team tracking, shared diaries and much more. There are nearly a million trade businesses in the UK alone and hundreds of thousands have multiple employees. But they are businesses. Even the smallest need their system to work 100% of the time. This isn't a game, it isn't entertainment, it's mission critical. Mistakes can cost thousands or even tens of thousands of pounds. Lost data might mean that the tax man gets on their backs. So the nearly-always-on internet isn't good enough. These apps need to work when there is no signal. And by work, that doesn't just mean the ability to capture data; they must be able to provide information in the field and process data that is input. When a trade business can raise a quotation while still outside the client's house and send it ten minutes after leaving, they will often win the business just by impressing with their efficiency. But it means the data stored locally must be comprehensive. We've been deploying this model for years. It's nice to have the benefit confirmed by Instagram, which has just announced an offline mode for Android. At the same time, local storage isn't enough. Devices can be lost, stolen or broken, particularly in the hostile environment of a building site. When teams are out in the field, management needs to know where they are, at least in order to know who to send if there's an emergency call out for a gas leak, electrical failure or broken lock. [easy-tweet tweet="We think of ourselves as a hybrid system with the cloud at the heart" hashtags="Cloud, Hybrid"] This means that while local storage is necessary, the same information needs to be available both centrally and to other team members. And file transfer won't cut it; data needs to be as up-to-date as possible. It means record-level (or row-level) synchronisation between all devices. There are several practical questions, such as what to do about the services you have mashed up with, what about the unique numbering of documents, how do you recover from lost or broken devices, and what to do when you run out of storage on any device? That's before you even think about the potential collision in data updates when devices are offline. These are the challenges that my company, Powered Now, has had to overcome in building a system tailored for the field trade industry. It works when there is no signal, but safely uploads and downloads any outstanding information as soon as a signal becomes available. 
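The offline-first pattern described here can be sketched very simply: changes are always written to a local "outbox" first, and a flush step pushes anything unsynced whenever a connection is available. The following Python sketch is purely illustrative and makes up its own names, endpoint and schema; it is not Powered Now's implementation, and it ignores the harder problems the article mentions (conflict resolution, cloud-allocated document numbers).

```python
import json
import sqlite3
import urllib.request

# Local outbox: the app keeps working with no signal because every change lands here first.
DB = sqlite3.connect("outbox.db")
DB.execute("CREATE TABLE IF NOT EXISTS outbox (id INTEGER PRIMARY KEY, record TEXT, synced INTEGER DEFAULT 0)")

def queue_change(record: dict) -> None:
    """Record-level change captured locally, whether or not the network is up."""
    DB.execute("INSERT INTO outbox (record) VALUES (?)", (json.dumps(record),))
    DB.commit()

def flush(endpoint: str = "https://example.com/api/records") -> None:
    """Push any unsynced rows; a network failure simply leaves them queued for the next attempt."""
    pending = DB.execute("SELECT id, record FROM outbox WHERE synced = 0").fetchall()
    for row_id, record in pending:
        try:
            req = urllib.request.Request(endpoint, data=record.encode(),
                                         headers={"Content-Type": "application/json"})
            urllib.request.urlopen(req, timeout=5)
            DB.execute("UPDATE outbox SET synced = 1 WHERE id = ?", (row_id,))
            DB.commit()
        except OSError:
            break  # offline or server unreachable: stop and retry later

queue_change({"type": "quote", "customer": "J. Smith", "total": 1250.00})
flush()  # harmless with no signal; the quote stays queued locally
```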
Some data can only be allocated in the cloud, so the system must be able to cope with some uncertainty. Any services depending on APIs, and therefore requiring a signal, must be limited to those that are not vital for day-to-day operation. Mashed-up services must be accessed via a message queue so that, as far as possible, a failure in any of these services does not cause degradation in the overall ecosystem. The system has to work across Macs, PCs, Android phones and tablets, plus iPhones and iPads. So is this a cloud system? Absolutely, we are dependent on the cloud. The cloud gives us low-cost, highly reliable backup, synchronisation capabilities and access to third party services. It also provides centralised reporting. We couldn't have built the system without the cloud. At the same time, we are a new breed. We can run when the cloud is down because we have to operate when our customers have no signal. We think of ourselves as a hybrid system with the cloud at the heart.

### Free Registration Opens for New ICT Channel Event

12 – 13 September 2017, NEC, Birmingham – www.channel-live.co.uk

2nd May 2017: Free registration is now open for Channel Live, the new event for ICT professionals (12-13 September 2017, NEC, Birmingham), sponsored by TalkTalk Business. [easy-tweet tweet="In an era of digital transformation, businesses are rapidly changing and evolving" hashtags="Digital,Business"] Channel Live, the UK's only independent and Channel-specific exhibition and conference, was launched to address the digital challenges and threats facing the market. Event Director Paul Johnson explains: "In an era of digital transformation, businesses are rapidly changing and evolving. Yet many Channel organisations are still not adapting to this new environment. Engaging with digital trends and partnering with the right suppliers is now essential to future-proof business and thrive in a crowded marketplace." Visitors can attend seminars, view product demonstrations and listen to interviews on topics including:
- Digital Transformation
- Major market shifts in areas such as Hosted Telephony
- Cyber Security
- Public vs Private Cloud
- UCaaS
- IoT
- eSIMs
- 5G
- Skype for Business
- Shadow IT
- WebRTC
- Regulatory changes, including what the Channel can expect from GDPR

The event will also explore potential new markets for the Channel, such as utilities and social WiFi, and their expected impact over the next few years. On the show floor, an exhibition of over 100 suppliers will demonstrate the latest telephony solutions and technology. Exhibitors include AutoTask, Cloud9, FirstCom Europe, CityFibre, QGate, Xelion, Performance Networks, Parallels, and Gamma among others. The event is free to attend and visitors can register at www.channel-live.co.uk/

### Oracle Empowers Customer Experience Professionals with New AI Apps

Innovative commerce, marketing, sales and service apps enable smarter customer experiences that drive tangible and predictable business results

Tuesday 2nd May, 2017 - Oracle has announced new artificial intelligence based customer experience apps that empower commerce, marketing, sales and service professionals to deliver smarter experiences across the customer lifecycle in real time. The new Adaptive Intelligent Apps for CX uniquely blend first-party and third-party data with sophisticated decision science and machine learning to deliver the industry's most powerful AI-based customer experience solutions. 
Adaptive Intelligent Apps for CX are designed to enhance existing commerce, marketing, sales and customer service applications within the Oracle CX Cloud Suite. Adaptive Intelligent Apps are powered by insights from the Oracle Data Cloud, which is the largest audience data marketplace in the world with a collection of more than 5 billion global consumer and business IDs and more than 7.5 trillion data points collected monthly. By applying advanced data science and machine learning to Oracle’s web-scale data and an organisation’s own data, the new Adaptive Intelligent Apps for CX have an unprecedented ability to react, learn and adapt in real-time based on historical and dynamic customer data such as clickstream and social activity as well as inputs such as weather, lookalike audiences and Internet of Things (IoT) data. The new Adaptive Intelligence Apps for CX deliver customised insights that improve with every customer interaction and include innovative apps designed for consumer and B2B professionals. Adaptive Intelligent Apps for CX deliver immediate impact by embedding within Oracle CX Cloud Suite applications to support customer experience workflows across commerce, marketing, sales and service. Commerce Professionals: New AI-powered capabilities turn static journeys into smart ones by delivering targeted product and content that is most relevant to the shopper’s immediate context. Recommendations utilise account data, shopper third-party data and real-time inputs to optimise outcomes and create superior consumer experiences for both first time and known shoppers. This drives repeat visits, loyalty and ultimately revenue. Marketing Professionals: New AI-powered capabilities enable smarter cross-channel experiences by delivering the most relevant, personalised content for each individual customer, at scale. By personalising engagements in real time across all channels, the new AI capabilities empower marketers to capture attention, drive engagement and improve conversion. [easy-tweet tweet="New AI-powered capabilities enable smarter sales experiences" hashtags="AI,Sales"] Customer Service Professionals: New AI-powered capabilities enable smarter and faster resolution of customer issues by providing the best information in the right channel at the right time. By delivering predictive product failure, predictive account health and predictive recommendation capabilities to customer service professionals, customers receive connected service experiences that ultimately improve customer loyalty and advocacy. Sales Professionals: New AI-powered capabilities enable smarter sales experiences by optimising the selling process for sales teams and customers. For customers, the buying process is made more effortless and seamless as the right offer, optimised for the individual customer, is presented digitally or through a sales professional. Sales professionals can improve productivity by following guidance derived from opportunity analysis as well as using account engagement and next-best-action capabilities to accelerate and close more deals. “Oracle is uniquely placed to deliver the promise of artificial intelligence based enterprise applications and is delivering on the future of AI driven business applications today,” said Clive Swan, SVP Applications Development, Oracle Adaptive Intelligence. 
“By combining first and third party data with advanced machine learning and the industry’s most comprehensive cloud applications suite, Oracle provides a complete package that helps eliminate the need for more integrations or other costly and time-consuming processes. This enables our customers to achieve immediate value and take a smarter approach to business transformation.” Part of Oracle Applications Cloud, Oracle CX Cloud Suite empowers organisations to take a smarter approach to customer experience management and business transformation initiatives. By providing a trusted business platform that connects data, experiences and outcomes, Oracle CX Cloud Suite helps customers reduce IT complexity, deliver innovative customer experiences and achieve predictable and tangible business results. The Oracle CX Cloud Suite includes Oracle Marketing Cloud, Oracle Sales Cloud, Oracle CPQ Cloud, Oracle Commerce Cloud, Oracle Service Cloud and Oracle Social Cloud. ### Blockchain - What Does it Mean for The Cloud and User Behaviours As we continue to see and benefit from the increasing emergence and application of blockchain technology, it is clear that this will have a far greater impact on many of us than we realise at first glance. The potential goes far beyond simply creating a cryptocurrency, which was the initial purpose when the concept of Bitcoin was realised, and we are still learning just how broad the application for blockchain can become. Insights from emergent technology thought leader and Team Blockchain board member Sally Eaves also give us a further understanding of key areas that are developing in this respect. The real power of blockchain lies in the universality of its application - this is a technology that we can layer into many existing industries and solutions to offer new levels of security, transparency and accessibility to all parties involved. This is a technology that is not necessarily creating new realities or radically altering the solutions we already use; however, it does have massive potential to streamline, simplify and optimise the vast majority of these solutions while adding new security at the same time. This is where many people simply cut and run: “it’s too complex” and “I don’t understand blockchain” are common excuses to ignore what is happening and divert interest and focus elsewhere. When we understand the purpose and intent behind blockchain, it becomes easier to place the application and start to understand what this really means for our existing industries. Security and Transparency One of the strongest selling points of blockchain lies in the security and transparency offered by the distributed ledger framework. Every time a transaction occurs, that transaction is validated across the entire network and captured on record for all to see and agree on, over and over and over - this has phenomenal value when we take into account just how often we accept loss from common error, reporting inaccuracies and general fraud in most industries today. Any process that is heavy on information or data transfer and creates large volumes of transactional data stands to gain massive optimisation benefits from the application of blockchain technology. This is already being displayed in the efforts of organisations such as Everledger (insurance & asset management) and the Maersk & IBM partnership to optimise data digitalisation in the shipping industry, which is estimated to save the global industry billions of US$ annually.
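To make the “captured on record” idea a little more concrete, here is a minimal, single-node sketch in Python of a hash-linked ledger: each entry embeds the hash of the previous one, so altering any historical transaction is immediately visible. It is purely illustrative (a real blockchain adds network-wide validation and consensus), and all names and example transactions are invented for the sketch.

```python
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Hash a transaction record deterministically."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class MiniLedger:
    """A toy append-only ledger: each entry stores the hash of the previous one,
    so tampering with any historical entry breaks every later link."""
    def __init__(self):
        self.chain = [{"index": 0, "timestamp": 0, "tx": "genesis", "prev_hash": "0"}]

    def add_transaction(self, tx: str) -> dict:
        prev = self.chain[-1]
        entry = {
            "index": prev["index"] + 1,
            "timestamp": time.time(),
            "tx": tx,
            "prev_hash": record_hash(prev),
        }
        self.chain.append(entry)
        return entry

    def is_valid(self) -> bool:
        """Re-check every link; any tampering is immediately exposed."""
        return all(
            self.chain[i]["prev_hash"] == record_hash(self.chain[i - 1])
            for i in range(1, len(self.chain))
        )

ledger = MiniLedger()
ledger.add_transaction("asset #123 registered")
ledger.add_transaction("container MSKU0012 shipped Rotterdam -> Shanghai")
print(ledger.is_valid())           # True
ledger.chain[1]["tx"] = "tampered"
print(ledger.is_valid())           # False - the broken hash link exposes the change
```

In a real network the same check runs on every participating node, which is what turns this simple tamper-evidence into the transparency and fraud resistance described above.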
Behaviours and The Cloud So what does all this mean for the cloud and our behaviours? Sally Eaves believes we have already come a long way, but the true potential is just beginning: “The original principles and early applications of blockchain are now being rapidly engineered and expanded for the cross-sector benefit. Given my research into the tech-human relationship, I am particularly interested in the application of blockchain in identity management – helping individuals to gain better transparency, protection and control of how their personal digital assets are utilised. Finally, aligning with my work in social innovation, I am excited by the capacity for blockchain and linked technologies to make a difference beyond the areas typically spoken about.” The cloud stands to benefit in many ways as we start to utilise this technology to increase the security and accountability that is perceived around technology solutions. [easy-tweet tweet="Blockchain will enable a new paradigm of engagement with tech solutions" hashtags="Blockchain,Tech"] One of the leading concerns with cloud technology and computing has always been security. As blockchain technology becomes increasingly applicable and ‘consumer friendly’, we are seeing its application in many instances, but the core always comes back to the immutable nature of the data and the security and transparency this offers. From a long-term perspective, there is no doubt that blockchain will not only influence our behaviours but enable a new paradigm of engagement with the tech solutions we are becoming increasingly familiar with. As smart contracts start to be both recognised and applied in established markets and regions (the State of Arizona in the US is among a growing number of American states that now recognise smart contracts and blockchain technology as part of legal commerce), we are opening up new opportunities where services and solutions will not simply be accessed and opted into; in many cases they will act and apply themselves according to the criteria we have agreed to. Smart contracts allow us to take many cases that are manual and lengthy today and migrate them to an automated and secure state tomorrow - all supported by the underlying blockchain infrastructure to give us the transparency and security we desire, so that we are willing to engage with and trust a solution. Banking and finance applications will see major benefits, as will we as individuals, as we are able to remove many of our traditional manual processes in these sectors in favour of more secure and transparent blockchain-supported processes that can interact with smart contracts to ensure we all benefit from the innovations that emerge. This is where our behaviours will start to feel the impact of blockchain too - as we come to accept and expect the visibility and immediacy that these solutions offer when supported by blockchain infrastructure, so too will our decisions be affected by the availability of this information to us as consumers. International banking, asset transfers, reference checking - these processes are traditionally costly services that we as individuals have to pay for. In a blockchain-supported world, all this information is immediately and immutably available, always; these costs will start to disappear and we will come to expect this availability in all we do.
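As a rough illustration of the smart-contract idea described above, the sketch below models an agreement that “acts and applies itself” once the agreed criteria are met, for example releasing a payment automatically when a delivery is confirmed. This is a plain Python analogy, not code for any real smart-contract platform, and every name in it is hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class EscrowContract:
    """A toy 'smart contract': funds are released to the seller automatically
    once the agreed condition (delivery confirmed) is recorded."""
    buyer: str
    seller: str
    amount: float
    delivery_confirmed: bool = False
    released: bool = False
    history: list = field(default_factory=list)

    def confirm_delivery(self, confirmed_by: str):
        self.delivery_confirmed = True
        self.history.append(f"delivery confirmed by {confirmed_by}")
        self._execute()

    def _execute(self):
        # The contract enforces its own terms - no manual payment step is needed.
        if self.delivery_confirmed and not self.released:
            self.released = True
            self.history.append(f"released {self.amount} to {self.seller}")

contract = EscrowContract(buyer="Alice", seller="Bob Logistics", amount=500.0)
contract.confirm_delivery(confirmed_by="courier-signature")
print(contract.history)
# ['delivery confirmed by courier-signature', 'released 500.0 to Bob Logistics']
```

On an actual blockchain the same logic would run on the shared ledger itself, so no single party could alter the terms or block the outcome once the conditions were met.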
These are just some of the areas that we can already see the effects and impacts of blockchain as we start to redefine the boundaries of our expectations - this space becomes increasingly interesting as our behaviours align with the capabilities and we continue to improve on the solutions that we are all able to engage with on a day to day basis. We have an opportunity to advance from a society based on trust to one based on truth.     ### The Cloud – Changing the Face of Finance Whether it’s Brexit, a surprise result in the US Presidential race or a snap General Election here in the UK, tumultuous geopolitical events in recent months have fostered a climate of currency fluctuations, volatility and business uncertainty. When the financial chips are down it’s only natural that large corporations will adopt more defensive strategies, and look to their CFO to shepherd them through difficult terrain. However, in order for CFOs to achieve this, they need to be both suitably empowered and equipped. This is where cloud-based solutions can play a pivotal role, providing CFOs access to all the data available to them whenever it is needed, rather than at the traditional month-end reconciliation period. If CFOs have access to the right insights and figures in real-time, they’ll prove a far more effective asset in steering businesses through more perilous economic straits. A burdened finance department When we queried UK companies on who they felt was the most appropriate leader during tough economic times, the majority said they’d trust an ex-finance chief to sit in the big chair. This sentiment was similarly shared by business decision makers in the US, France and Australia. [easy-tweet tweet="Finance teams are too busy to contribute to more strategic tasks" hashtags="Finance,Strategy"] This all makes a lot of sense really. CFOs and finance teams have the economic credentials and nous to steer a course where others, perhaps, may not. The appropriateness of finance departments to contribute to the strategic direction and business development is nothing new of course, but our research also revealed that, despite business decision makers recognising the additional value of finance teams in the current economic environment, they were unable to utilise them effectively. As with many business stumbling blocks; the key causes are a lack of both time and resource. Finance teams are too busy focusing on the meat and veg of their roles – month-end processes and reporting – to contribute meaningfully to more strategic tasks. Over a third of UK business decision makers admitted as much. If businesses are to take a proactive stance in recovering from times of financial uncertainty they’ll need to discover new tactics to generate savings and boost returns from back-end processes. Two areas where it’d be useful to have financial experts on hand rather than bogged down in rote, repetitive tasks. Standardisation in the Cloud Freeing up financial professionals to concentrate on strategic business goals will require a systemic change in the day-to-day role of the finance department. However, as businesses tighten their belts over the next 18 months there’ll inevitably be a greater emphasis on the bottom line, reducing errors and accelerating the speed of reconciliations. All jobs that sit firmly within the remit of the finance department. This is where the cloud comes in. 
As FSN’s 2016 Future of the Finance Function Survey highlights, the cloud has the ability to mobilise change as well as provide advantages over disparate and pricey ERP environments in terms of cost, agility and ease of implementation. If finance teams are working across multiple interfaces and systems to amalgamate data for a financial close, then they’re wasting valuable time and resources, and impacting the online traceability and auditability of transactions from the financial accounts back to their original systems. On the other hand, a cloud platform can facilitate the delivery of data from multiple systems, introducing standardisation across processes, applications and vendors whilst reducing the complexity of amalgamating activities and transactions. The cloud also provides additional benefits for business by breaking down organisational silos. Through its scalability, multi-departmental collaboration becomes a far easier task and fosters a new level of knowledge sharing, decision-making and strategic alignment. Automation and the Cloud When processes, vendors and applications are all standardised in the cloud, it becomes a much more fertile environment for automation. The business cases for automation are manifold. An automated process that is repeatable and transparent reduces exposure to risk and bolsters confidence in the reporting outcomes. With businesses concentrating on faster reconciliations and mitigating errors, an investment in automation offers the one-two punch of providing this extra speed and accuracy by eliminating spreadsheets, alongside liberating finance teams to focus on the business strategy tasks needed for growth. Additionally, as reconciliations occur in real time, the CFO has real-time insights into the company’s financial position to inform business decisions and take advantage of opportunities before their competitors. Unlocking the potential Business leaders are waking up to the fact that their finance teams are being under-utilised, and their potential to offer tangible business benefits is going to waste, as they trawl through spreadsheets and focus on the daily grind. If business leaders are to unlock the nascent potential of their finance teams, they need to equip them with the proper tools to unburden them from the yoke of repetitive tasks and manual reporting. Standardised processes and automation in the cloud allow finance teams, CFOs and Controllers the freedom to tap into their wider skill sets and focus on higher-value areas including fraud detection, compliance, and data analytics. With the current political landscape, it’s near impossible to see what’s around the corner, and this uncertainty will play its part in the wider economic environment. If businesses truly feel that it’s time for CFOs to step up and take the reins, a shift towards embracing cloud-based solutions would be a pretty good starting point. ### Confidence in the Cloud As more and more organisations employ cloud security services, C-suites and IT Decision Makers (ITDMs) need to work together to implement the right levels of defence to protect both the business and customer information that’s stored there. Our recent global research showed cyber security represents the most significant business challenge to 71% of C-suite respondents. Amongst ITDMs, 72% expect to be targeted by a cyber-attack over the next 12 months. In both groups, confidence in their organisations’ defences against cyber-attack was very high.
It revealed that businesses are increasingly aware of the unpredictable nature of the cyber threats they face and, despite differences, C-suite executives and ITDMs alike are making increasingly pragmatic and informed choices about how they go about minimising the risks they face. This is even more important given that the research into cyber defence shows that almost half (48%) of C-suites will increase their usage of cloud computing services in the next 12-18 months. Interestingly, the story is different for ITDMs: just under a third (29%) say they’ll cut back on cloud usage, suggesting a disconnect between this group and the C-suite, who may lack knowledge when it comes to the current level of cloud services in use throughout their organisation. [easy-tweet tweet="Two in five (41%) C-suites say cloud services are a vital part of doing business" hashtags="Cloud,Business"] Yet, overall, both groups shared similar beliefs around the utility and risks of cloud computing. Two in five (41%) C-suites say cloud services are a vital part of doing business; a third say they’re necessary, but they’d rather not use them, while just under a quarter (24%) think the cloud represents a security risk. It’s worth noting that just over a quarter (26%) of ITDMs refused to answer the question on how confident they are that their organisation has the right security controls for cloud services. However, the overwhelming majority of those who did respond are very or fairly confident that their organisation has the right defence strategy in this area: a total of 64% for the C-suite and 85% for ITDMs. This is likely due to the efforts of cloud service providers, and their scale as organisations. Large cloud computing providers can devote significant resources to securing the delivery of their key products, and this benefit is passed on to small companies that use these services – businesses that would otherwise never be able to afford these security measures. However, the risks lie in how, and to what extent, users and their employers secure implementation; a supplier may provide a perfectly secure product, but if employees bypass security in the name of speed or convenience, all the effort in the world from cloud providers will be futile. [easy-tweet tweet="Cyber risk has reached the top of the agenda for many directors over the last few years" hashtags="Cyber,Security "] Cloud security is paramount, and C-suites and ITDMs need to be clear and aligned on the strength of their current defences, and therefore the investments they need to make, to ensure their business and their customers remain secure. Cyber risk – and associated worries – have reached the top of the agenda for many boards of directors over the last few years, and quite rightly so. Cyber security is now a business risk issue like any other. But how different groups within your company perceive this risk is another thing. Whether you’re an IT director or a C-level executive, one thing is clear: you need to do more than talk to each other: you must make yourselves understood. The divergence of opinions between C-suite and ITDMs when it comes to potential threats, accountability and responsibility creates gaps for attackers to exploit. Such disconnects and communication failures can also create problems in the event of an attack, when time is often of the essence and clarity is important. It’s vital that organisations work to narrow these gaps in understanding, intelligence and responsibility.
### CybSafe uses Behavioural Science to Reduce Cyber Security Risk Seed funding enables the start-up to tackle the biggest cyber threats to business: their own employees CybSafe, a cyber security software company, have launched its platform to tackle the human aspect of cyber security using pioneering e-learning techniques grounded in behavioural science. CybSafe, which was founded in 2015 by cyber resilience and intelligence experts from the global counter-threat specialists, Torchlight Group, has secured seed funding from financial services compliance consultancy, Regulatory Finance Solutions (RFS). The service from CybSafe is based on the simple principle that if a company’s employees learn how to improve their security-related behaviour, the employees will be more secure and so will the business. Human error is the main cause of data breaches. According to UK Government research published on the 19th April 2017, 46 percent of companies have suffered a cyber attack in 2016 compared with just 24 percent the previous year, with 72 percent of attacks related to fraudulent emails sent to employees. It is increasingly clear that as the threats evolve, organisations need to do all they can to embed and sustain more resilient cyber security behaviour throughout their organisation. Cyber security learning is no longer a box-ticking exercise, it needs constant refreshing and testing, with a tailored approach designed to fulfil the needs of the individual and positively change behaviour. Using intelligent software and proprietary analytics, CybSafe’s cloud-based platform learns an individual’s knowledge level and their behaviour patterns to deliver a personalised e-learning programme. Delivered through a mobile app or online, the GCHQ-accredited e-learning platform will save businesses money, not just by reducing their risk of becoming victim to a security breach, but also by delivering meaningful training that constantly evolves based on current threats. Oz Alashe MBE, CEO and founder of CybSafe explains: "Businesses recognise that their own staff represent their greatest security vulnerability, but it’s a problem that is costly and difficult to address and this results in cursory attempts to impart information which rarely has the desired effect. CybSafe transforms cyber awareness training from a box-ticking exercise into an immersive, recurring experience that positively changes security-related behaviour. [easy-tweet tweet="The CybSafe platform gives them the edge and helps them more effectively protect their organisation." hashtags="Cybsafe,Security"] “Most businesses – whether they have an information security team or not – don't have the expertise, capacity or resource to address the human aspect of cyber security properly. The CybSafe platform gives them the edge and helps them more effectively protect their organisation. We’re harnessing the collective lessons from across the cyber security community and making this available to all." Michael Couzens, Managing Director of RFS comments: "CybSafe takes an innovative approach to cyber security that is proven to reduce risk within an organisation. From our own experience in financial services, it serves an immediate need that we hear requested by our customers. 
The company has the full package – a pioneering technology platform and an innovative approach to e-learning, coupled with a strong leadership team – and we are excited to support it.” CybSafe has already attracted blue chip clients including FTSE 100 financial institutions, leading law firms and other international companies, as well as expert partners in security and the financial services industry. The company will use the funds from RFS to expand its headcount, with a focus on hiring business development expertise and engineers to further refine the platform. Further information on the investment from RFS is available here: http://www.rfs.co.uk/press-release-rf…vests-in-cybsafe/ ### Ensono Launches Hyperscale Cloud Incubator Hybrid IT provider dedicates a business unit to staying at the forefront of an evolving market, helping clients best leverage cloud technology and services. Ensono™, a leading hybrid IT services provider, announced the launch of its Hyperscale Cloud business unit, a company incubator dedicated to accelerating a new-to-market cloud service offering built on Amazon Web Services (AWS) and advancing Ensono’s current managed service offerings built on AWS. The public cloud is a major global growth driver in IT infrastructure. To ensure a seamless transition for organisations, Ensono has created an incubator with a singular focus on nurturing and growing ideas and product offerings around the public cloud to drive both acceleration and advancement. As an AWS Consulting and Managed Service Provider partner for the past five years, Ensono is well placed for this highly targeted approach to drive forward its position in the marketplace. “Amazon Web Services helps our clients increase business agility, facilitate innovation and accelerate their time to market,” said Brett Moss, Senior Vice President and General Manager of Hyperscale Cloud at Ensono. “Bringing the AWS benefits alongside Ensono as their Managed Services Provider enables our clients to focus their resources on running their businesses. Our global team of experts provides a systematic approach for the design and deployment of infrastructure operations, with a consistent and secure process so clients can continue to be great at what they do.” Ensono’s Managed AWS portfolio of services is organised into five modules, enabling Ensono to meet its clients wherever they are in their cloud journeys, and support them in achieving their business goals for digital transformation. The modules include: AWS Consulting and Proof of Concept; AWS Migrations; Cloud Run and Operate Services; AWS Advanced Automation Services; and Advanced Reporting and Cost Optimisation. “As more businesses begin to transition to the cloud, they need a partner who can provide a customised strategy that will set them up for success,” said Jason Deck, Vice President of Hyperscale Cloud Product and Marketing at Ensono. “Ensono provides its clients with a specialised workforce and tailored strategy to help organisations make these significant shifts, helping businesses to leverage the cloud’s full capabilities while remaining up and running throughout the transition period.” [easy-tweet tweet="National Trust, transitioned its entire website and new mobile platform to AWS. " hashtags="NationalTrust,AWS"] Ensono has already helped companies take advantage of the cloud. National Trust, a conservation charity and Ensono Cloud client in the UK, seamlessly transitioned its entire website and new mobile platform to AWS.
The transition was the largest and most ambitious transformation initiative in its history. “We’ve been able to maximise the availability of the website while scaling the platform in line with seasons and events,” said Glen Yarwood, IT operations director at National Trust. “Using Ensono's managed services for AWS provides automation and scripting which enables us to turn development environments off when not being used each day.” Ensono enabled National Trust’s transformational journey by providing a platform and location-independent hybrid IT approach that utilised the AWS cloud. Ensono’s three-phase approach of identifying assets and services across the entire IT estate enabled the Trust to implement its data centre strategy and deliver a hybrid solution with the optimum blend of managed, co-location and cloud services. Ensono’s scalable, secure, and cost-effective solutions helped some of the Trust’s digital projects outperform initial business case estimates, while Ensono’s team of consultants ensured that customers experienced uninterrupted service following the transition. For more information about the incubator and Ensono’s cloud services visit https://www.ensono.com. ### Gartner ranks Informatica No.1 iPaaS solution provider Informatica accounts for 19 percent of global iPaaS market share revenue in 2016 Informatica®, the world’s No. 1 provider of data management solutions, announced on 25th April that Gartner, Inc., a leading IT research and advisory firm, has, for the third consecutive year, ranked Informatica as the number one global Integration Platform as a Service (iPaaS) solution provider. This ranking is based on 2016 market share revenue identified in Gartner’s report “Market Share: All Software Markets, Worldwide, 2016” for Enterprise Integration Platform as a Service (iPaaS) in the Application and Infrastructure Middleware Market. Last month, the Gartner 2017 Magic Quadrant for Enterprise Integration Platform as a Service (iPaaS) report positioned Informatica as a Leader in the iPaaS industry for the fourth straight year. In fact, in this report, Informatica was positioned highest on the ability to execute axis and farthest on the completeness of vision axis. [easy-tweet tweet="Informatica is the only iPaaS vendor operating at more than $100 million scales" hashtags="Informatica,iPaaS"] In its new iPaaS market share report, Gartner sizes the 2016 global iPaaS market at $688 million, with Informatica accounting for $127 million in iPaaS revenue, or 19 percent of the whole and six percentage points more than the closest competitor. With its iPaaS revenues growing 40 percent year-over-year from $91 million in 2015 and more than doubling from 2014, Informatica is in hyper growth mode and is the only iPaaS vendor operating at more than $100 million scales. “We believe 2016 was a watershed year for the global iPaaS market, and for Informatica iPaaS solutions in particular as we further solidified our position as a recognised iPaaSleader for vision, execution and ranked number one for iPaaS market share revenue,” said Ronen Schwartz, senior vice president and general manager, data integration, big data and cloud at Informatica. “Informatica customers continue to move aggressively in transitioning integration workloads and data to cloud environments. Informatica currently has more than 5,000 enterprise iPaaS customers, connected to thousands of applications and moving more than 22 billion transactions per day, with 100 percent year-over-year transaction growth rate. 
As these customers invest in our solutions, we continue to invest aggressively to bring to market the ‘next’ generation of innovative iPaaS and data management solutions that will enhance their journeys to the cloud.” The Informatica Cloud® market-leading iPaaS delivers the broadest set of cloud integration/iPaaS capabilities on a single platform across all systems, in the cloud and on-premise. Informatica is a pioneer in cloud integration and now more than 1.5 million jobs are processed with Informatica Cloud every day, demonstrating the growth in workloads shifting to the cloud and driving today’s hybrid cloud era. Customers depend on Informatica Cloud to underpin such key use cases as data integration, application integration, process integration, data quality, master data management (MDM), data security, B2B integration, data preparation and more across multiple types of environments, all powered by Informatica’s iPaaS platform. ### Wix Video Gives Users Unparalleled Video Capabilities Industry’s First Solution to Curate, Control and Monetize Video Wix.com Ltd. (Nasdaq: WIX), a leading cloud-based web development platform, has launched Wix Video, designed to provide an industry-leading solution for showcasing, sharing and selling videos online. The newest all-in-one tool is the first in the industry to provide users with the ability to centralise their video library and combine their videos into a single playlist regardless of the host. Wix Video pairs an advanced video streaming and hosting service by Wix with third-party services so users no longer have to choose between YouTube, Vimeo or Facebook to publish their video content. Instead, users can seamlessly integrate their videos into a single, complete online portfolio. [easy-tweet tweet="Video is a powerful medium, video creators should be the ones with the power" hashtags="Video,WixVideo"] Wix Video is specifically designed to meet the needs of creators, artists, businesses and brands looking to incorporate video into their stunning websites to promote their services, grow their audience, engage with their customers or sell online: For video creators, it provides complete, customizable control over their content and how it looks on their website, along with a wide variety of ways to monetize content while preserving its integrity. For businesses using video to showcase their products and share the story behind their brands, Wix Video ensures that demos, instructions and customer testimonials look amazing on any size screen. For instructors, educators and small businesses that offer lessons, tutorials or online classes, Wix Video provides the ability to create multiple channels and easily update them on their Wix website, sell videos and rentals and offer monthly subscriptions - all commission free. Wix Video follows Wix’s successful freemium model, where some advanced features are offered as part of a premium offering. "Video is a powerful medium, and we believe video creators should be the ones with the power, having complete control over how their content is displayed and how it is monetized," said Danielle Raiz, Head of Wix Video at Wix.com. “Wix Video was developed to help our passionate users get the most out of the video by opening up exciting new options for display and distribution. Importantly, we decided that while other platforms may take portions of video revenues from users, Wix Video will take no commission and was launched, as with all of our products, to further increase Wix users’ successes."
Wix Video’s state-of-the-art features and functionality include: Industry-Leading Curation. Wix-hosted and third-party videos can be combined into a single playlist. Complete Control. Wix Video is based on adaptive streaming that ensures both high-quality streaming and smooth playback, and users have full control of their layout and design. Revenue Generation. Wix allows users to keep all the revenue they generate and the ability to deliver videos to their customers via subscriptions, individual purchases or a temporary pass for 24, 48 or 72 hours. Watch what Wix Video can do for you: http://www.wix.com/wix-lp/wix-video/watch-video Learn more about Wix Video: http://www.wix.com/wix-lp/wix-video Wix Video App: https://www.wix.com/app-market/wix-video/overview Check out how these Wix users are using Wix Video: John Lovett – johnlovett.com Diego Uchitel – diegouchitel.com/film Slate Goods NYC – Slategoods.nyc An Evening with Bukowski – youneverhadit.org Share This: #video creator? @Wix Video is the new, all-in-one solution to showcase, promote & sell videos online: http://www.wix.com/wix-lp/wix-video Share This: #video revolution? @Wix Video enables you to showcase, promote & sell videos online. See for yourself: http://www.wix.com/wix-lp/wix-video ### Multi-channel Communication via the Cloud is Essential to Digital Success To succeed in business, it is crucial that all the information you exchange with customers and suppliers be of the highest quality and always up-to-date. By consolidating business communication and bundling it across all channels, critical processes are executed at faster speeds. Cloud messaging services are an ideal solution, as they allow you to send and receive business documents and customer information – as a fax, email or SMS. Users benefit from maximum transmission security and availability, highly scalable transmission capacities as well as utmost transparency. At the same time, valuable information and sensitive data stay where they belong: safe and secure within your company. No matter whether it’s about offers, orders, customer or delivery documents, alerts, invoices or transport requests, efficient business processes, as well as smooth and transparent electronic communications, are crucial to the success of any business. Yet, in many companies, business communication takes place via decentralised infrastructures, inflexible solutions or outdated equipment. In most cases this proves to be cost- and maintenance-intensive, requires a huge amount of administrative effort, and fail-safety is frequently lacking. Furthermore, fax servers have limited capacities, meaning that companies are not able to react accordingly when fax volumes change, especially during peak times, and the reliable and speedy processing of high data volumes is therefore impossible. At times of low demand, by contrast, overcapacity unnecessarily eats into the budget. Companies increasingly changing over to cloud communications services In order to accelerate the processes already located in the cloud, companies are now seeing value in moving their business communication to the cloud – among them the fourth largest bank in Germany, DZ BANK AG. The bank is using cloud messaging services for its electronic communications to ensure secure, professional email transmission and SMS authentication, as well as for fax communication.
[easy-tweet tweet="Companies are now seeing value in moving their business communication to the cloud" hashtags="Cloud,Communication"] For this reason, modern providers offer purpose-built cloud messaging services. These allow companies to bundle and consolidate communication processes across all channels and send data and messages directly from their systems. Certified cloud messaging services enable information to be smoothly transmitted via fax, email, SMS and EDI. Furthermore, they offer the highest levels of fail-safety and transaction security. In contrast to on-premise systems, these services can be individually adapted to meet the most diverse customer requirements. At the same time, they are scalable to ensure that users can always react flexibly to new requirements, while servers and backup infrastructures are substantially unburdened. The exchange of documents at an international level is greatly simplified by using cloud messaging services. The documents are displayed correctly in all languages, providing global, multi-location activities with optimal levels of control and supervision. For DZ Bank, having this robust, error-free transmission of sensitive data, in compliance with regulatory guidelines, was a critical factor. When it comes to multichannel communications, the bank’s cloud fax solutions are used in the trading of securities, where it is essential that there is legal proof that the documents have been delivered. The emergency plan for the ordering process is also managed by means of a connection to Retarus’ fax infrastructure. In addition, Retarus’ Managed SMS Services ensure the secure and speedy authentication of bank staff in the IT systems. The passwords (tokens) are delivered via SMS to the users directly, within a few seconds. The bank moreover uses the web-based, multichannel messaging platform Retarus WebExpress to notify a huge number of recipients about important developments in the context of investor or public relations in a timely and secure manner by means of email, fax and SMS. Process optimisation through direct connection With a direct connection from a company’s infrastructure to the respective service provider, it is not only possible to provide automated direct debit pre-notifications, break down bookings into individual items, or transmit time-critical orders, invoices or transport requests quickly and reliably. It is also possible to communicate with end customers, for example via SMS, without much cost or effort in order to notify them of delivery status, provide booking and order confirmations, conduct product recalls or send late payment reminders. This requires no additional investment in hardware, software, maintenance, licences or telecommunication lines. All that is required is an IP connection with the service provider - at Retarus the connection is completed in no time at all via, for example, the SAP-certified interfaces RFC and BC-SMTP. Documents are then transmitted by email, fax, SMS or EDI directly via the service provider’s infrastructure. Guaranteed Delivery Rates Special features, such as Retarus’ “NeverBusy Technology”, prevent capacity bottlenecks at times of peak usage, enabling companies to reliably send large fax volumes even to recipients that have smaller-dimensioned fax infrastructures. When sending two fax messages simultaneously to the same number, the line is usually busy, meaning that the transmission needs to be repeated. In practice, despite automated repeat mechanisms, many message transmissions fail.
However, when using Retarus’ NeverBusy Technology, messages that need to be transmitted to the same number are automatically sent shortly after each other. In addition, the transmission status of the documents is automatically reported directly back into the SAP system, ensuring that potential delays are recognised speedily. [easy-tweet tweet="Concerns about security are the biggest obstacles preventing cloud services being used " hashtags="Security, Cloud"] Security in focus Concerns about security remain the biggest obstacles preventing cloud services from being used even more commonly. According to the German Association for Information Technology, Telecommunications and New Media (BITKOM), 60 percent of the companies surveyed in BITKOM’s Cloud Monitor report stated that they fear unauthorised access to sensitive business information. Approximately 40 percent are concerned about losing data, while about 35 percent have legal reservations regarding the cloud. However, cloud services are actually not any less secure than an in-house infrastructure. On the contrary, an increasing number of service providers are consciously investing in the security of their products, even contractually guaranteeing it in service level agreements (SLAs). Companies which choose to use cloud messaging services should, therefore, make sure that they are well informed about the respective service provider. Sector-specific maximum security was essential for DZ Bank when selecting its communications provider. Among others, the guidelines set by the German financial regulator BaFin (Bundesanstalt für Finanzdienstleistungsaufsicht), such as MaRisk (Minimum Requirements for Risk Management), had to be observed. For security and compliance reasons, it was also important for DZ BANK to have contractually binding assurance that processing takes place at Retarus’ own data centres located in Germany. Local data processing according to applicable laws The location of the data centres and the transparency offered by the various services should play an important role in deciding in favour of a particular service provider. This is essential, as many international service providers save the data that has been entrusted to them in the cloud without specifying exactly where the data is located. But especially for internationally active companies, it is essential that they are able to comply with the regulations applicable in the countries where their branch offices are located. To ensure better risk assessment and minimise danger, the provider should, therefore, offer its customers complete control over the processing of their data. Reputable service providers give their customers the choice to decide for themselves where, and according to which laws and regulations, their data is handled. Retarus’ customers, for instance, can gain binding assurance as to whether their data is processed in Retarus' data centres in Europe, Asia or the US. Specific data processing requirements of branch offices in different countries can be easily agreed upon in one single contract. Secure business communications via SNC It is equally important to protect business-critical or personal data which is exchanged with colleagues, customers and suppliers from unauthorised access. By using Secure Network Communications (SNC), companies can efficiently secure the communication coming out of their SAP systems. For many enterprises - particularly those in the finance sector - this is actually a mandatory requirement for all SAP connections with external systems.
That’s why messaging service providers specialising in SAP systems furthermore support connections via SNC. As a part of the SAP Netweaver system architecture, the software layer facilitates the use of strong authentication as well as encryption, making it a reliable alternative to VPN. While firewall specialists are usually required when setting up conventional VPN connections, SAP specialists can install SNC independently. Transparency thanks to pay-per-use model In the best case, the provider will additionally offer a web-based, intuitive-to-use administration portal, which provides customers unlimited control over the processing of their data. The portal should make reports and analyses available, which allow a detailed overview of the company’s communications and the services performed by the provider at all times. In this way, it is possible to quickly detect erroneous addresses and consequently improve the quality of the master data. The rising costs should ideally be precisely assigned to specific cost centres, while cost and efforts should always be determined by the current demand. In this way, companies are free to focus on their core competencies, still being able to react flexibly to new requirements. ### BrickerBot Takes Aim at Unsecure IoT Devices Imagine a fast moving bot attack designed to stop its victim from functioning. Called Permanent Denial-of-Service, PDoS is an attack that damages a system so badly that it requires replacement or reinstallation. By exploiting security flaws or misconfigurations, PDoS can destroy the firmware and/or basic functions of a system permanently. It is a contrast to its well-known cousin, the DDoS attack, which overloads systems with requests meant to saturate resources through unintended usage but generally the functioning of the system is restored to normal as soon as the DDoS attack stops. Discovery On the week of the 10th April, we identified a new form of PDoS attack – which we dubbed ‘BrickerBot’ - through our honeypots. Over a four-day period, Radware’s honeypots recorded 1,895 PDoS attempts performed from several locations around the world. The honeypot that identified the BrickerBot attacks is a custom developed piece of software that makes conversations with bots. It does not execute any commands, it simply replies to any received commands with what the bot expects to see as an output. That way it is completely safe and can be quickly adapted to emulate any environment or response that bots expect. So in many ways, it’s like a chatterbot for IoT bots – talk about fighting fire with fire! Analysis Unlike Mirai, which aimed to exploit insecure IoT devices and recruit them into a botnet army, BrickerBot’s sole purpose is to compromise IoT devices and corrupt their storage. [easy-tweet tweet="The BrickerBot attack used Telnet brute force to breach a victim’s devices" hashtags="BrickerBot,IoT"] Besides this intense, short-lived bot (BrickerBot.1), Radware’s honeypot recorded attempts from a second, very similar bot (BrickerBot.2) which started PDoS attempts on the same day with lower intensity but more thorough and its location(s) concealed by TOR exit nodes. The BrickerBot attack used Telnet brute force - the same exploit vector used by Mirai - to breach a victim’s devices. Upon successful access to the device, the bot performed a series of Linux commands that could ultimately lead to corrupted storage, followed by commands to disrupt Internet connectivity, device performance, and the wiping of all files on the device. 
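For readers wondering how the chatterbot-style honeypot described in the Discovery section can safely record an attack sequence like this, here is a heavily simplified sketch: a TCP listener that never executes anything, just logs whatever the bot sends and answers with canned, plausible-looking output. This is an illustrative toy under those stated assumptions, not Radware's actual honeypot, and the canned responses and log file name are invented.

```python
import socketserver
from datetime import datetime

# Canned replies that make the listener look like a naive Telnet/BusyBox device.
CANNED = {
    "login": "login: ",
    "password": "Password: ",
    "default": "$ ",
}

class HoneypotHandler(socketserver.StreamRequestHandler):
    """Logs every command received from a bot; never executes anything."""
    def handle(self):
        self.wfile.write(b"login: ")
        while True:
            line = self.rfile.readline()
            if not line:
                break
            cmd = line.strip().decode(errors="replace")
            # Record who connected and what they tried to run.
            with open("honeypot.log", "a") as log:
                log.write(f"{datetime.utcnow().isoformat()} {self.client_address[0]} {cmd}\n")
            # Reply with whatever output the bot is likely expecting.
            self.wfile.write(CANNED.get(cmd.lower(), CANNED["default"]).encode())

if __name__ == "__main__":
    # Listen on a non-privileged port; map or forward 23 -> 2323 as needed.
    with socketserver.ThreadingTCPServer(("0.0.0.0", 2323), HoneypotHandler) as srv:
        srv.serve_forever()
```

Because nothing is ever executed, the listener is safe to expose, and the resulting log is exactly the kind of limited trace - a bare sequence of attempted commands - that the analysis above relies on.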
While I do not believe a lot of servers will be vulnerable to the attack – most servers are installed with best practices for securing access and telnet is not enabled by default in recent Linux installs – what I am concerned about is the number of IoT devices that may be vulnerable, as has been proven by the Mirai botnet. Damage BrickerBot is not damaging the storage inside the devices it attacks – they are still usable after reformatting. It is attempting to corrupt the information on the storage and its partition table (the layout of the file systems on the storage card). If the device has field-replaceable flash storage, the device can be recovered by ordering a new flash card with a pre-installed software image and replacing it. If the device has a USB port and the manufacturer provides a procedure to boot, reformat and repartition the storage and flash the cards from a USB-mounted image, that might be enough to recover a device. Needless to say, this process is time-consuming and not a basic task that can be performed without a serial connection or special recovery sequences built into the device. The bot will not damage the device in such a way that it is not recoverable. But we still refer to it as a PDoS attack because even after the attack is over, the damage persists and intervention is required to get the devices back in an operational state. Author(s) Right now, the motives and author(s) behind BrickerBot are unclear. Some theories have emerged, but we have no supporting evidence to favour one theory above another. The bot leaves a very limited trace, only a set of commands that proves the purpose is to corrupt and break unsecured Linux devices that are directly exposed to the Internet through their telnet service and use factory default credentials. So who could be the author(s) behind BrickerBot? It could be that the author wants to get back at the IoT industry: imagine the number of support calls and RMAs manufacturers could receive if the attack is successful. It could be a grey hat that is punishing careless users and manufacturers of unsecured IoT devices, forcing them to upgrade their software and reconfigure their devices with hopefully more recent and less vulnerable images and parameters, eventually solving the IoT problem – but in a very unethical way. It might be a hacker that had his botnet compromised by security researchers and is now trying to get revenge by breaking their honeypots – although this is less likely, because to do so the hacker would need extensive analytics on the connections and infection attempts, combined with C&C connections, to be able to find potential honeypots in a large list of victims; or he doesn’t care, has turned his back for good on the IoT botnet industry, and tries to break whatever smells like an IoT device or honeypot. Securing the network While the first PDoS attacks from BrickerBot.1 have stopped, the attacks from BrickerBot.2 are still active and ongoing.
My advice to businesses worried about their IoT devices is: check and upgrade the software of your devices as often as possible; change the device’s factory default credentials, looking further than the web-based GUI and checking for CLI access – manuals and getting-started notes might not always reveal the existence of CLI access, so check the forums of your device manufacturer for CLI, telnet, SSH and default credentials; and disable Telnet access to the device – if CLI access is required, SSH is a better option, and if no CLI access is required, disable both of them. In addition, Network Behavioural Analysis can detect anomalies in traffic and, combined with automatic signature generation, provides protection against 0-day network exploits, while User and Entity Behaviour Analytics (UEBA) can spot granular anomalies in traffic early. ### What are the Brexit Costs (and Benefits) to Digital and Technology Companies? Short-term profits rise – longer-term hit to profitability. It has been well documented that the Digital and Tech sector will be hard hit by Brexit – as EU-born, talented workers - coders, designers, innovators and entrepreneurial risk takers - begin to look outside the UK for their future employment and to start up their next venture. The suggestions are that small Digital and Tech firms will be hardest hit, as they lack the budget and resources to apply and pay for visas for existing/new employees. Are you worried about this? Explore your options and how to start planning here. But let’s look at the overall cost of Brexit on Digital and Tech firms: “What are the £ costs of Brexit to Digital and Tech firms? And are there any benefits?” The Brexit Tracker has been built to calculate these – and provide a Brexit dashboard to monitor every Brexit-related item. So we created a Brexit Tracker profile for an average Digital and Tech firm and discuss the results here. First of all, what is an “average” Digital and Tech firm? 10 minutes of online research suggests it will have 31 employees, have been established 4 years ago and be doubling its turnover every 2-3 months, having raised c. £3m of seed capital. The wage bill is its biggest monthly cost. It will be largely UK-centred, with increasing EU and global sales and revenue generation. Excellent case studies of larger Digital and Tech firms are in the Future Fifty programme, with up-and-coming firms in Tech City’s Upscale 2017 programme. So we loaded these stats to create a Brexit Tracker Digital and Tech firm profile. This considered the impact on Operating profits from over 390 economic indicators, assessing the net effect on a firm’s Op profits due to movements in currency, changes in UK economic growth rate, employment and wages, EU and Rest of the World import/export costs (all specific to the Digital and Tech sectors), as well as changing consumer and business confidence. The net impact of all these on our firm’s operating profit is not entirely bad, as seen in the monthly summary – pre and post the Brexit referendum. Impacts Explained: Net currency benefits, a resilient UK economy and B2B and B2C confidence, together with relatively stagnant Digital and Tech wage demands, have kept the firm’s monthly profits positive. The often forgotten reduction in base rate in August 2016 (0.50% down to 0.25%) also generates higher operating profits, as debt interest costs fall. The Tracker also forecasts future Operating profits – using the user’s views on Brexit-related issues.
We anticipated the possible thoughts of the owner of this made-up firm: she is worrying about how to retain her EU-sourced, highly skilled talent pool, fears the associated drain of innovation on the firm’s capabilities post-Brexit, and is concerned about losing access to the Single Market. Yet she is also excited about the opportunities of opening up new international sales, supported by new Government-backed initiatives. The Tracker provided a summary forecast of what could happen to the firm’s Operating profit over the next 2 years if these Brexit views are realised. And it’s not great news: a £211k net cost of Brexit in the next 12 months, rising to a £272k net cost for the following year. But it is important to recognise these are forecast impacts of Brexit on the firm – over and above the actual economic factors we reviewed first. Both sets of data need to be considered in planning scenarios. [easy-tweet tweet="Brexit is undoubtedly going to impact Digital and Tech firms" hashtags="Brexit,Digital"] Our conclusions Brexit is undoubtedly going to impact Digital and Tech firms – and economic and market changes are already having a direct impact on costs and revenues. “But owners, directors and stakeholders must not assume Brexit will be just bad news.” There are influences that have already increased their firm’s revenues or reduced their costs. It is important to recognise these variables – differing in magnitude and application for each and every firm – to then begin to plan for the continued changes that Brexit will bring. How does your firm win from Brexit? It is vital to know where your firm is now and what has already happened, with the associated impacts. Here’s our 5-point plan: 1) Research the firm’s differing management views on Brexit to understand what could happen and what the pain points – and possible profit opportunities – are. 2) Assign individual responsibility for planning for Brexit and provide them with the appropriate resources to collate all views and to build a robust reporting framework. 3) Agree Key Performance Indicators, a timeframe and a reporting procedure - preferably into the Boardroom (or relevant stakeholder meeting). 4) Invest in Brexit-related training and education courses, review Brexit consultancy services or begin your planning by creating your own management information via an off-the-shelf package like The Brexit Tracker. 5) Frequently monitor and discuss Brexit impacts as part of your overall plan and communicate responses to the entire firm as appropriate. And finally, note what Dwight D. Eisenhower said: “Plans are nothing; planning is everything.” About The Brexit Tracker The Brexit Tracker analyses over 390 economic indicators pertinent to the sector and a firm’s particular circumstances. Stakeholders use their dashboard to understand likely implications for their business and compare their views with those of their peers. The tool enables expensive resource to be smartly invested in strategic planning, not squandered in trying to make sense of a myriad of factors that may or may not be relevant to the business. ### AWS and the General Data Protection Regulation (GDPR) Just over a year ago, the European Commission approved and adopted the new General Data Protection Regulation (GDPR). The GDPR is the biggest change in data protection laws in Europe since the 1995 introduction of the European Union (EU) Data Protection Directive, also known as Directive 95/46/EC.
The GDPR aims to strengthen the security and protection of personal data in the EU and will replace the Directive and all local laws relating to it. AWS welcomes the arrival of the GDPR. The new, robust requirements raise the bar for data protection, security, and compliance, and will push the industry to follow the most stringent controls, helping to make everyone more secure. I am happy to announce today that all AWS services will comply with the GDPR when it becomes enforceable on May 25, 2018. In this blog post, I explain the work AWS is doing to help customers with the GDPR as part of our continued commitment to helping ensure they can comply with EU Data Protection requirements. What has AWS been doing? AWS continually maintains a high bar for security and compliance across all of our regions around the world. This has always been our highest priority—truly “job zero.” The AWS Cloud infrastructure has been architected to offer customers the most powerful, flexible, and secure cloud-computing environment available today. AWS also gives you a number of services and tools to enable you to build GDPR-compliant infrastructure on top of AWS. One tool we give you is a Data Processing Agreement (DPA). I’m happy to announce today that we have a DPA that will meet the requirements of the GDPR. This GDPR DPA is available now to all AWS customers to help you prepare for May 25, 2018, when the GDPR becomes enforceable. For additional information about the new GDPR DPA or to obtain a copy, contact your AWS account manager. In addition to account managers, we have teams of compliance experts, data protection specialists, and security experts working with customers across Europe to answer their questions and help them prepare for running workloads in the AWS Cloud after the GDPR comes into force. To further answer customers’ questions, we have updated our EU Data Protection website. This website includes information about what the GDPR is, the changes it brings to organisations operating in the EU, the services AWS offers to help you comply with the GDPR, and advice about how you can prepare. Another topic we cover on the EU Data Protection website is AWS’s compliance with the CISPE Code of Conduct. The CISPE Code of Conduct helps cloud customers ensure that their cloud infrastructure provider is using appropriate data protection standards to protect their data in a manner consistent with the GDPR. AWS has declared that Amazon EC2, Amazon S3, Amazon RDS, AWS Identity and Access Management (IAM), AWS CloudTrail, and Amazon Elastic Block Store (Amazon EBS) are fully compliant with the CISPE Code of Conduct. This declaration provides customers with assurances that they fully control their data in a safe, secure, and compliant environment when they use AWS. For more information about AWS’s compliance with the CISPE Code of Conduct, go to the CISPE website. As well as giving customers a number of tools and services to build GDPR-compliant environments, AWS has achieved a number of internationally recognised certifications and accreditations. In the process, AWS has demonstrated compliance with third-party assurance frameworks such as ISO 27017 for cloud security, ISO 27018 for cloud privacy, PCI DSS Level 1, and SOC 1, SOC 2, and SOC 3. AWS also helps customers meet local security standards such as BSI’s Common Cloud Computing Controls Catalogue (C5), which is important in Germany. We will continue to pursue certifications and accreditations that are important to AWS customers. What can you do?
Although the GDPR will not be enforceable until May 25, 2018, we are encouraging our customers and partners to start preparing now. If you have already implemented a high bar for compliance, security, and data privacy, the move to GDPR should be simple. However, if you have yet to start your journey to GDPR compliance, we urge you to start reviewing your security, compliance, and data protection processes now to ensure a smooth transition in May 2018. [easy-tweet tweet="AWS offers a wide range of services to help customers meet requirements of the GDPR" hashtags="GDPR,AWS"] You should consider the following key points in preparation for GDPR compliance: Territorial reach – Determining whether the GDPR applies to your organisation's activities is essential to ensuring your organisation's ability to satisfy its compliance obligations. Data subject rights – The GDPR enhances the rights of data subjects in a number of ways. You will need to make sure you can accommodate the rights of data subjects if you are processing their personal data. Data breach notifications – If you are a data controller, you must report data breaches to the data protection authorities without undue delay and in any event within 72 hours of you becoming aware of a data breach. Data protection officer (DPO) – You may need to appoint a DPO who will manage data security and other issues related to the processing of personal data. Data protection impact assessment (DPIA) – You may need to conduct and, in some circumstances, you might be required to file with the supervisory authority a DPIA for your processing activities. Data processing agreement (DPA) – You may need a DPA that will meet the requirements of the GDPR, particularly if personal data is transferred outside the European Economic Area. AWS offers a wide range of services and features to help customers meet the requirements of the GDPR, including services for access controls, monitoring, logging, and encryption. For more information about these services and features, see EU Data Protection. At AWS, security, data protection, and compliance are our top priorities, and we will continue to work vigilantly to ensure that our customers are able to enjoy the benefits of AWS securely, compliantly, and without disruption in Europe and around the world. As we head toward May 2018, we will share more news and resources with you to help you comply with the GDPR. ### Enterprise Infrastructure Service Transformation in a Hybrid Cloud World - #Cloudtalks with Union Solutions' Jason Rabbetts In this episode of #CloudTalks, Compare the Cloud speaks with Jason Rabbetts, Managing Director and Founder of Union Solutions. Jason speaks about how the future of Union Solutions lies in bringing infrastructure services to a largely hybrid model, helping customers improve the services that are right for their business. Jason continues to explain how Union Solutions stands out by focusing on outcomes for its customers and leading the customer through transformation with the appropriate policy and process for technology change. They understand the art of the possible and apply it to what customers want to achieve. #CLOUDTALKS Visit the Union Solutions website to find out more: http://www.unionsolutions.co.uk/ ### Using Data to Grow Your Small Business If you think big data is the preserve of big business, then think again. Even if you run a small business you can still use data to garner insights that will help you run a successful business.
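To make that concrete, here is a minimal, hedged sketch of the kind of insight even a small firm can pull from its own sales figures with a few lines of Python and pandas. The file name and the date, product and revenue column names are assumptions for illustration only, not a prescribed format.

```python
# A minimal sketch: monthly revenue and top products from a simple sales export.
# Assumes a hypothetical sales.csv with 'date', 'product' and 'revenue' columns.
import pandas as pd

sales = pd.read_csv("sales.csv", parse_dates=["date"])

# Total revenue per month - a quick way to spot seasonal trends.
monthly = sales.set_index("date")["revenue"].resample("M").sum()
print(monthly.tail(6))

# Best-selling products over the whole period.
top_products = (
    sales.groupby("product")["revenue"]
    .sum()
    .sort_values(ascending=False)
    .head(5)
)
print(top_products)
```

Even a rough sketch like this surfaces the trends - busiest months, best sellers - that the rest of this article is about acting on quickly.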
In fact, you have one key advantage over your larger competitors. You can act more quickly on the insights you gather from your data. You are less likely to have to go through several rungs of approval or implement complex process changes to make things happen. Gathering your data To begin with, you need to identify your key data sources, and there may well be more than you realise. Sales figures, customer information and stock levels are standard sources of business data. The digital age means you now also have data from web traffic, social media and communications like emails. When gathering your data, think beyond what your business is generating because you can also access extra information online. You can even get information on your competitors relatively easily. If you still don't feel you have the data you need, then it may be worth conducting a market research survey to gather more information. Storing your data Although your business may be generating large amounts of data, it is no good unless you have a way of storing it. Identifying the best way to store your data is not a decision to be taken lightly. You need to be able to capture it accurately and have a storage system with the right capacity for you. Begin by identifying how much data you need to store and whether you are truly only storing useful information and not duplicating any of it. Consider whether you have large files like videos and how easily and quickly you need to be able to access your data. Also, consider whether you will store the data on-site, use a remote cloud-based system, or a hybrid of the two. Cloud-based storage is often a good solution for small businesses because it is simple to set up, maintain and upgrade. It is a cheaper option too because you can pay for it via a monthly subscription. It is also great for accessing data on the go and sharing it with others you work with. Security is also of paramount importance when storing data, especially since you are likely to hold personal information about customers. Don't take for granted that the system you choose is secure. Do your research and make sure you have the necessary protocols in place to protect your data. [easy-tweet tweet="Customer Relationship Management software is essential for leveraging customer data" hashtags="Data, CRM"] Analysing and using your data The thought of analysing data can be incredibly daunting, but luckily you don't need to hire a data scientist or be a maths whizz to unpick your data. Neither does the process have to be manual or labour intensive. If you are envisaging dealing with unwieldy spreadsheets then you'll be relieved to know that there are lots of more sophisticated alternatives. There are lots of relatively cheap tools available to help you with your data analysis. For example, Google Analytics is a tool most people can get to grips with, and it offers a swathe of information on customers' online behaviour. You can use it to understand how best to engage your customers and drive more sales. Lots of software also comes with dashboards that allow you to easily visualise data and get key information at a glance. Data can give you a deep understanding of your customers and answer questions about everything from what they are likely to buy to when and how they prefer to shop with you. You can use this information to improve your marketing or customer service and even inform your pricing. Customer Relationship Management software is essential for leveraging customer data.
You can use it for all manner of things, like finding out if you are unwittingly placing obstacles in customers' way, which could be preventing them from purchasing from you. You can use it to work out who your most valuable customers are and offer them incentives to remain loyal to you, such as discounts on their favourite items. By continually tracking and monitoring your data you can quickly spot when you need to take action. You can spot trends, forecast revenue and growth and better understand your strengths and weaknesses. This is particularly valuable if you can analyse your data in real time. Making the most of your data The greatest value you can get from your data is the ability to make strategic decisions that set you apart from your competitors. Neglecting data means you could be making business decisions without being fully aware of the reality. ### The First Live 4K Video from Space The Event On April 26, during a Super Session at the 2017 NAB Show, the U.S. National Aeronautics and Space Administration (NASA) will produce the first-ever live 4K video stream from space, the highest resolution video ever broadcast live from the International Space Station (ISS). Join astronauts in their home above the clouds as NASA astronaut Dr Peggy Whitson on the ISS converses with AWS Elemental CEO and co-founder Sam Blackman at the LVCC – and then listen in on a panel discussion between filmmakers, technology leaders and NASA scientists. The live feed from 250 miles above Earth will be encoded using AWS Elemental video processing software on board the International Space Station (ISS) and on the ground at Johnson Space Center for delivery by Amazon Web Services (AWS) in the cloud to multiscreen devices. Where to Find It "Reaching for the Stars: Connecting to the Future of NASA and Hollywood", 10:30 to 11:30 a.m. on Wednesday, April 26, 2017, in room N249 (North Hall) of the Las Vegas Convention Center. Learn how advanced imaging and cloud technologies are taking scientific research and filmmaking to the next level. Moderator: Carolyn Giardina, technology editor, The Hollywood Reporter. Panelists: NASA astronaut Dr Tracy Caldwell Dyson; NASA imagery experts program manager Rodney Grubbs; Bernadette McDaid, head of development, VR & AR, Bau Entertainment; Khawaja Shams, vice president of engineering, AWS Elemental; and Dr Dave McQueeney, senior principal investigator, IBM Watson Group. How to Watch It Produced by NASA, NAB, and Amazon Web Services (AWS), the panel event and live stream from space will be available to the public for multiscreen viewing in live 4K and down-converted high-definition (HD) video at https://live.awsevents.com/nasa4k. 4K-capable devices will be required to view 4K content. After the event, lower resolution streams of the broadcast will also be available as a video-on-demand (VOD) asset on NASA's Facebook page, NASA Television and the agency's website, as well as on https://live.awsevents.com/nasa4k. Another First for the International Space Station (ISS)! On April 24, Astronaut Whitson will break the American record for cumulative time in space when she reaches 535 days on the ISS. The ISS orbits the Earth at an altitude of about 250 miles (approximately 402 kilometres) and travels at a speed of 17,200 miles per hour (27,600 km per hour). The orbiting microgravity laboratory serves as a world-class test bed for the technologies and communications systems needed for human missions to deep space.
Astronauts are learning about what it takes to live and work in space for long periods of time, increasing our understanding of how the body changes in space and how to protect astronaut health. NASA is also working with commercial crew and cargo partners to provide access to low-Earth orbit and eventually stimulate new economic activity, allowing NASA to continue using the station while preparing for missions beyond. [easy-tweet tweet="Live 4K video will help take research to the next level and allow NASA to share discoveries" hashtags="NASA,4KVideo"] What You Will Learn Audiences at NAB and online will learn how: Hollywood and NASA inspire one another and the fascinating, surprising outcomes of this decades-long collaboration; 4K and Ultra HD (UHD) give NASA scientists a clearer, crisper view of their experiments, Earth, and the solar system; live 4K video will help take research to the next level and allow NASA scientists to share discoveries as they are happening – instead of after the fact; the powerful, first-ever live 4K streaming, cloud-enabled workflow from space works and how it can help commercial space endeavours and content creation; and how live 4K will play a role in the journey to Mars… How Live 4K from Space Works ISS to Johnson Space Center: Anchored by a 4K camera and AWS Elemental Live encoder aboard the ISS, the video is set up, controlled and managed over an IP network via the AWS Elemental user interface; AWS Elemental Live encodes content in HEVC and sends it in a UDP transport stream via the ISS network to Johnson Space Center (JSC). Johnson Space Center to Las Vegas Convention Center: At JSC, the 4K video and audio inputs are decoded and routed to an AWS Elemental Live encoder; the AWS Elemental Live system delivers the signal to a network switch as an HEVC A/V output for delivery via satellite to a Roberts Communications Network satellite downlink truck on site at the LVCC, where the signal is brought into Room N249 for live viewing; 4K projectors from Christie display the live video in full 4K resolution during the session. Public Live Streaming at www.live.awsevents.com: The AWS Elemental Live systems provide redundant transcoded multi-bitrate outputs to two AWS Elemental Delta video delivery systems; the AWS Elemental Delta systems deliver HTTP Live Streaming (HLS) packages for IP delivery to the Amazon Web Services cloud; Amazon Route 53 routes the signals to the Amazon CloudFront content delivery network for delivery to multiscreen and connected devices everywhere at www.live.awsevents.com; and network monitoring is performed by Amazon CloudWatch. ### DR as a Service - #Cloudtalks with Union Managed Services' Heston Paules In this episode of #CloudTalks, Compare the Cloud speaks with Heston Paules, Head of Sales for Union Managed Services. Heston speaks about how important it is to get the message out about leveraging public cloud for business outcomes, and not to focus on the cloud as a destination. Heston further speaks about taking a hybrid approach and keeping data where it is needed. In Heston's view, Union Solutions is set apart from others by its focus on customers' business outcomes and by delivering DR as a service and backup as a service. Union Solutions focuses on what the customer wishes to achieve.
#cloudtalks Visit the Union Solutions Managed Services page to find out more: http://www.unionsolutions.co.uk/managed-services/ ### Whistleblowing: Protecting People and Companies - #Cloudtalks with Addveritas' Andrew Samuels and Dino Bossi In this episode of #CloudTalks, Compare the Cloud speaks with Andrew Samuels and Dino Bossi, Founding Partners of Addveritas. Andrew begins by explaining how this brand-new company is providing whistleblowing solutions. Dino further speaks about the definition of whistleblowing and how important it is for organisations to regard their employees as assets rather than liabilities. Andrew continues to speak about how Addveritas stands out through its industry-leading experience and offerings. #cloudtalks Visit Addveritas to find out more: https://www.addveritas.co.uk/ ### Protect Yourself – Getting Cyber Security Right Did you know that 75% of all UK businesses have suffered from at least one cyber attack in the past 12 months? It's a shocking statistic and one that puts the severity of the cyber security issue into perspective. Online criminals pose a persistent threat, and the sobering truth is that for the 25% lucky enough not to have experienced an attack, it's only a matter of time before they do. We hear about cyber attacks and the impact they cause on an almost daily basis. Just earlier this month, the payday loan firm Wonga suffered a serious data breach as a result of a cyber attack which saw the confidential data of around 245,000 customers compromised. According to experts, it was one of the most significant data breaches in the UK to involve financial information. Although it may seem like businesses are helpless against cyber attacks that could strike at any given moment, there are measures that can be taken to minimise the impact of any incident from both operational and financial perspectives. Firstly, businesses must accept that the role of IT is changing, and cyber security needs to be the responsibility of the entire organisation, driven from the top down. This isn't just to do with allocating additional budget, either; it involves looking at cyber security from a proactive point of view, which not only includes employee awareness but also educating users in data ownership and their responsibilities with respect to that data. Secondly, businesses should devise a cyber security strategy which allows them to prepare for and effectively deal with a cyber attack and consequently mitigate any risks. Historically, most cyber attacks have been caused by simple human error, whether it's unknowingly clicking on a malicious link, providing information to a counterfeit website or using unapproved software, and so user awareness (making sure all staff are knowledgeable about the different types of attack and how to spot them) should be one of the primary focuses in any cyber security strategy. Beyond that, the specifics of a cyber security strategy will very likely differ from one organisation to the next, depending on the size and nature of the business and the various assets it holds. [easy-tweet tweet="Cyber Essentials is a government-backed scheme that addresses around 80% of cyber threats" hashtags="Cyberthreats,Security "] Regardless of the plans and policies businesses already have in place, they can always benefit from using an industry-recognised framework when assessing their own state of 'cyber readiness'. Cyber Essentials and Cyber Essentials Plus (audited) are a simple but effective way of doing this.
Cyber Essentials is a government-backed scheme that addresses around 80% of common cyber threats, radically improving the security defences of any business. By becoming Cyber Essentials and Cyber Essentials Plus certified, businesses are demonstrating to suppliers, partners and customers that they're taking cyber threats seriously and have prepared themselves accordingly. Plus, with 80% of threats already addressed through these frameworks, businesses are perfectly positioned to tackle the remaining 20% with the advice and expertise of an IT or cloud service provider. As well as Cyber Essentials, businesses might want to consider attaining the ISO27001 certification. For small businesses within the supply chain of a larger company that expects them to be aligned to ISO27001, it may not be cost-effective to actually complete the ISO27001 certification. For these businesses, there is an alternative, cost-effective information security certification they can obtain. It's called the IASME Governance standard. This standard was developed over several years during a Technology Strategy Board-funded project and offers an affordable alternative to ISO27001 — the international standard that many larger companies aspire to achieve. To successfully achieve the IASME Governance standard, businesses must undergo an assessment against all of the ISO27001 Operational Control categories as well as a Cyber Essentials assessment. Plus, as of March 1, 2017, the IASME Governance standard also includes a separate assessment that takes into account the imminent introduction of the General Data Protection Regulation (GDPR). This regulation is designed to standardise the procedures that must be carried out to protect against cyber attacks and is something that all businesses must adhere to from May 2018 onwards. As part of the GDPR, any business that fails to comply will be hit with significant fines — up to €10 million or 2% of annual global turnover — while those that suffer from a data breach will face an even worse punishment, with potential fines of €20 million or 4% of annual global turnover, whichever is greater. By successfully completing the IASME Governance standard scheme, businesses of all sizes can ensure that they are meeting every requirement outlined in the GDPR, and can operate confidently without living in fear of being fined. There's no doubting the irreparable damage that can be suffered at the hands of cyber criminals, but it is also well within the power of businesses to protect themselves sufficiently through cyber security strategies and best-practice frameworks. ### BI in the Future, is there a BI Problem? - #Cloudtalks with Assimil8's Will Boyle In this episode of #CloudTalks, Compare the Cloud speaks with Will Boyle, head of sales for Assimil8 Business Analytics. Will speaks about how Assimil8 focuses on helping people manage their data and use it more efficiently. He continues to speak about the future of Assimil8 and the changing landscape of business intelligence. #Cloudtalks Visit the Assimil8 website to find out more: http://www.assimil8.com/ ### Removing uncertainty from the office of finance - #Cloudtalks with HAYNE Solutions' Mark Cracknell and Tony Redding In this episode of #CloudTalks, Compare the Cloud speaks with Mark Cracknell, Director, and Tony Redding, Sales Director, at HAYNE Solutions. Mark talks about helping finance teams become more efficient and effective, and what he sees happening in the next year.
He also talks about how the ability for companies to re-plan and re-budget quickly and accurately will become incredibly important going forward in these uncertain times. Tony talks about the speed of change, moving to the cloud and how budgeting, forecasting, planning and analytics have changed in the last year. He also talks about the need to understand the impact of change very quickly and react accordingly. #Cloudtalks Visit HAYNE Solutions to find out more: https://hayne.co/ ### Cloud Communications: Separating Myth from Reality Real-time communication (RTC) services have begun, and will continue, to migrate to Cloud environments. As part of this migration, network devices such as session border controllers (SBCs) are also moving to a virtual, cloud deployment model. However, there are a few common myths associated with this RTC transition. Some deal with promises of cloud functionality that do not really exist, while others deal with unrealistic expectations of how a cloud deployment will be simple and easily extensible. Fortunately, for each of these myths, there is a reality that clarifies exactly what a cloud-native SBC can deliver. Myth 1: Cloud infrastructure inherently provides the necessary redundancy mechanisms needed for any mission-critical application. Reality: While this may be possible for web-based data applications, it is generally not true for real-time communications, which has highly stringent reliability and quality requirements: to ensure no sessions get dropped, to assure minimal packet loss and jitter, and to handle failover on media streams within fifty milliseconds. It's important to ensure that when RTC is provided in a Cloud infrastructure, a cloud-native SBC is available to provide high availability for signalling and media sessions in order to meet these highly stringent fail-over requirements. [easy-tweet tweet="Firewalls are not designed to adequately protect real-time communications" hashtags="Data, Firewall"] Myth 2: Secure RTC in the Cloud does not exist. Reality: Migration of applications to the Cloud expands possible attack surfaces, so it's often believed that real-time communications delivered from the Cloud are not secure. And while firewalls do a great job protecting data, they are not designed to adequately protect real-time communications. With a cloud-native SBC, it is possible to provide complete media, signalling and management security for real-time communication, all the way up to and including the application layer. This means that the solution will know how to terminate, initiate and re-initiate signalling and session description protocol (SDP) for media, including handling encryption. Myth 3: High performance for RTC requires a trade-off with feature functionality. Reality: Full performance should not be limited by features such as encryption, DOS/DDOS prevention or Secure Real-time Transport Protocol (SRTP). Look for a cloud-native SBC that goes one step further with two complementary development efforts. The first is the adoption of microservices, where virtual SBC performance scales separately for signalling, media processing and media transcoding functions. The second is a unified management process that enables alignment and optimisation of the scaling between these functions. Myth 4: Cloud scaling is simply a model based on adding more central processing units (CPUs). Reality: While this is often true for many applications, it unfortunately does not apply well to media transcoding for RTC.
The fundamental problem is that CPUs are not designed to meet the processing requirements of media transcoding. They are not fast enough, nor can they scale when dealing with complex codec types. Simply adding more core processors is not effective. However, with the help of a cloud-native SBC, you can scale very effectively using a combination of microservice architecture and Graphical Processing Units (GPUs). Microservices means that transcoding as a virtual SBC function can scale independently of signalling and media protocol processing. It also means that the processor can be chosen to optimise transcoding without impacting processor choices for signalling or media processing. A software framework that exploits GPUs for compute-intensive operations can run processing systems in parallel and at a large scale. This type of solution is just not possible using CPUs. Furthermore, tests have been conducted using GPUs to transcode versus CPUs and the results prove that GPU far exceeds CPU performance and approaches the level of performance found with Digital Signal Processors (DSPs) running on proprietary hardware. Myth 5: Simply porting an application to a virtual machine makes it optimised for the Cloud. Reality: While it is true that applications are virtualized to run in the Cloud, making them cloud-native is a complex task. Unfortunately, virtualization alone does not make an application optimised for the Cloud. This is especially true for real-time communications. The reality is that a lot of investment in software development and interoperability testing is required to ensure that a cloud-native SBC fits seamlessly into an orchestration model so that it can be instantiated as “run-time” ready to truly unlock automated, elastic scaling. Myth 6: Moving to the Cloud simplifies multi-vendor deployment. Reality: From a generic framework, simplifying multi-vendor deployment seems achievable, but unfortunately, no two service orchestrator solutions or VNF managers are quite the same. Custom work would be required for the VNF vendor to integrate with multiple service orchestration vendors, and for service orchestration vendors to interwork with multiple VNFs. In part, this is because open source solutions are still immature and are not yet well adapted. It is not likely that a true multi-vendor implementation will work without pre-qualified interoperability testing. In order to ensure easier deployment, look for solutions that comply with ETSI interface standards and provide open Application Programming Interfaces (APIs) for integration. Although moving RTC to the cloud will undoubtedly deliver benefits that will be passed along to end customers, it’s important to be mindful of the common myths that are often linked to this migration process.  Identifying these myths and understanding how to separate them from reality will prove to be a key step in the successful migration of RTC to the Cloud.   ### Embracing the changing landscapes - #Cloudtalks with Orb Data Limited’s Anthony Nartey In this episode of #CloudTalks, Compare the Cloud speaks with Anthony Nartey, the chief sales and marketing officer for Orb Data Limited. Anthony talks about what Orb Data is going to be focused on in the next year and where he sees the industry as a whole developing in the future.  
#CLOUDTALKS Visit Orb Data Limited to find out more: http://www.orb-data.com/ ### 5 Major Benefits of Using Enterprise Cloud Applications In this cutthroat market, professionals and businesses are constantly working to develop new and innovative solutions that add value and simplify our daily lives. As consumers, we are privileged with ever-faster access, better service, and constant advances in technology that are shaping almost every aspect of our lives. Enterprise cloud computing is a complete solution for organisations and companies of all kinds that enables IT to be delivered as a service. Computing resources are delivered rapidly to end users according to their business requirements. The main benefit of this model is that it can be scaled up or down, so users can acquire the resources they need and pay only for what they use. Moreover, high-end enterprise application delivery needs modern solutions built for modern methods of app delivery. So what are the main advantages of these solutions? Web and SaaS applications developed for cloud-based delivery benefit from higher security, richer performance and greater availability. Let's Have a Look at the Benefits of Enterprise Cloud Applications Simplifies the Way You Handle Work Cloud-based solutions have replaced hardware-based solutions that struggled to handle scaling worldwide. With a symmetrically architected, cloud-based app delivery solution, you can deliver a highly reliable service experience for every app without having to provision a second location for your servers. Previously, the biggest problem for IT departments was constantly purchasing and configuring new hardware and software so that everyone in the company could work with up-to-date technology. That could be quite expensive, depending on the size of the company. Cloud-based applications for your enterprise can solve this issue: the applications, programs and files necessary for daily operations are stored in the cloud. Moreover, there is no need for the IT department to spend resources upgrading software and hardware. [easy-tweet tweet="By opting for Cloud Computing, you have the final privacy that you want " hashtags="Security, Cloudcomputing"] Security For businesses across the world, security is one of the major concerns, but cloud computing does not mean others have access to your data and sensitive information. By opting for safe and secure networks, you can get the privacy your enterprise wants. There is no more losing important files from your computer or laptop, because the digital storage capabilities of the cloud ensure that hardware failures are a thing of the past. Data recovery, backup and encryption options are some of the ways the cloud can keep your work safe, secure and private. Eliminating IT Costs Saving money on hardware acquisition and maintenance costs can add up to meaningful savings for your business. By executing a cloud-based application delivery solution, you can also look forward to reduced support expenditure as app performance enhances the user experience.
In addition, you can reap the other advantages of the affordable cost that the cloud offers for your business apps' delivery. Boosting Productivity of Your Employees No matter what type of device you are using, you expect content to load quickly, and you are more inclined to do work-related tasks on the go when the apps you need are easy to use. Undoubtedly, the ability to access your data, information and, effectively, your office from any device at any time and location has optimised productivity. Another major advantage of application delivery solutions is that they fit the faster pace of a cloud-based platform. Boost Your End Users' Experience Nowadays, a lot of transactions, including online B2B transactions, take place on mobile devices. Customers' interactions with you are streamlined when they can access high-quality websites and applications, so make sure to utilise an advanced cloud-based application delivery solution. Your business may also have internal users in locations that rely mainly on private network infrastructure, and mobile and cloud initiatives will play a very important role in supporting an increasingly consumerised workforce. These are the benefits of using enterprise cloud applications for your business, so it would be a good choice to invest in cloud computing technology and reap the rewards. The simplicity and affordability of cloud computing make this technology worth adopting. If you have decided to get an enterprise cloud-based application, you should ensure that you get in touch with a leading mobile app development company that has a team of professional mobile app developers. ### Rubrik's Cloud Data Management Platform Now Runs in AWS and Microsoft Azure Rubrik on AWS and Microsoft Azure to power data protection, rich data services for cloud-native applications. Rubrik, the cloud data management company, today announced a milestone expansion of its product portfolio, extending its scale-out software fabric to deliver data protection for cloud-native applications running in AWS and Microsoft Azure. Rubrik is a leader in defining the large, emerging market for Cloud Data Management, delivering a solution that allows companies to recover, manage, and secure data across public and private clouds. In addition, Rubrik recently announced bookings approaching a $100M annual run rate in just six quarters of selling. According to IDC, more than 80% of IT organisations will be committed to hybrid cloud architectures by the end of this year. As enterprises migrate applications to the cloud, the need for a cloud-scale data management platform becomes paramount in order to protect and manage data born in the cloud. Rubrik Cloud Data Management is deployed on-premises through plug-and-play appliances and at the edge with software appliances (Rubrik Edge). Rubrik Powers Data Management for Applications Running on AWS and Microsoft Azure For cloud-native applications, Rubrik orchestrates all critical data management functions—including backup and recovery, replication and disaster recovery, archival, copy data management, search, and analytics. By extending the Rubrik Cloud Data Management software fabric to run on public cloud providers, enterprises can: Get up and running in minutes by deploying Rubrik as a software instance on AWS and Azure.
Protect cloud applications, deliver inter- and intra-cloud replication (Azure to AWS, across AWS regions, across Azure regions), bi-directional replication between cloud and on-prem, cloud data archival, and more. Avoid cloud vendor lock-in by easily migrating data from public cloud to public cloud to optimise application service quality. Achieve unprecedented management simplicity by using the same consumer-grade HTML5 interface to manage applications on-premises, at the edge, and in the cloud. Start small and scale out easily by growing the Rubrik cluster in lock-step with cloud data growth. All data is indexed and efficiently stored in a single, scale-out repository while providing data resiliency. Instantly locate and deliver application-consistent recoveries for data born in the cloud, including files, folders, filesets, VMs, and database instances (e.g., Windows, Linux, SQL databases). Provide actionable insights with rich visual reporting. Rubrik Envision allows enterprises to create, customise, and share platform analytics on consumption, compliance, and more, across a multi-cloud environment. [easy-tweet tweet="To operate in a multi-cloud world, enterprises require a backup and data management solution" hashtags="Cloud,Backup"] "To successfully operate in a multi-cloud world, enterprises require a backup and data management solution that frees their data from the underlying infrastructure. Rubrik encapsulates all data with rich services—policy, security, automation, access control, compliance, and search—to achieve ultimate workload and data portability across any environment," said Bipul Sinha, co-founder and CEO, Rubrik. "With this monumental release, enterprises can instantly access data within a hybrid cloud environment to deliver best-in-class customer experiences, streamline operations, and prepare for future innovations." "The unprecedented market traction for Rubrik Cloud Data Management reflects the need for simple backup and recovery solutions that provide data services across clouds not only for data protection but for new purposes like test/dev, analytics, reporting, etc.," says Jason Buffington, Data Protection Analyst at ESG. "This new cloud-native support really completes the picture for enterprises looking to go to the cloud." Enterprises looking to protect cloud-native applications running on AWS and Microsoft Azure can enroll in Rubrik's Early Access Program. ### 21st Century Financial Planning - #CloudTalks with Budgeting Solutions' Paul Bavington In this episode of #CloudTalks, Compare the Cloud speaks with Paul Bavington, a Director of Budgeting Solutions, a Gold IBM business partner. Paul talks about where they will be focussing this year, where he really sees financial analysis evolving, and how companies can use it to their best advantage. #cloudtalks Visit Budgeting Solutions to find out more: http://www.budgetingsolutions.co.uk/ ### One in Eight Consumers in England Have Had Their Healthcare Data Breached More than three-quarters of English consumers think healthcare providers have the greatest responsibility to keep data secure. One in eight consumers in England (13 percent) have had their personal medical information stolen from technology systems, according to results of a new survey from Accenture.
The survey of 1,000 consumers in England revealed that the vast majority (78 percent) believe health care providers have a great deal of responsibility for keeping digital healthcare data secure, compared to only 40 percent who believe it is their personal responsibility. Despite this, the findings show that more than half (56 percent) of those who experienced a breach were victims of medical identity theft and more than three-quarters of those victims (77 percent) had to pay approximately £172 in out-of-pocket costs per incident, on average. In addition, the survey found that the breaches in England were most likely to occur in pharmacies — the location cited by more than one-third (35 percent) of consumers who experienced a breach — followed by hospitals (29 percent), urgent care clinics (21 percent), physician’s offices (19 percent) and retail clinics (14 percent). More than one-third (36 percent) of English consumers who experienced a breach found out about it themselves or learned about it passively through noting an error on their health records or credit card statement. Only one-fifth (20 percent) were alerted to the breach by the organisation where it occurred, and even fewer consumers (14 percent) were alerted by a government agency. Among those who experienced a breach, the majority (70 percent) were victims of medical information theft with more than a third (39 percent) having personal information stolen. Most often, the stolen identity was used for fraudulent activities (cited by 82 percent of data breached respondents) including fraudulently filling prescriptions (42 percent) or fraudulently receiving medical care (35 percent). And, a quarter of consumers in England (25 percent) had their health insurance ID number or biometric identifiers (18 percent) compromised. Unlike credit-card identity theft, where the card provider generally has a legal responsibility for significant account holder losses, victims of medical identity theft often have no automatic right to recover their losses. “Patients must remain more vigilant than ever in keeping track of personal information including credit card statements and health records which could alert them to breaches,” said Aimie Chapple, managing director of Accenture’s UK health practice and client innovation in the UK & Ireland. “Similarly, health organisations must monitor patient information more carefully and remain transparent with those affected in the event of a breach to swiftly resolve the issue without losing consumers to competitors.”   Despite the myriad breaches occurring, consumers still trust their health care providers (84 percent), labs (80 percent) and hospitals (79 percent) to keep their healthcare data secure more than they trust the government (59 percent) or health technology companies (42 percent) to do so. About two-thirds of consumers in England (65 percent) either maintained or gained trust in the organisation from which their data was stolen, following a breach. And, more than half (68 percent) of English consumers said they want to have at least some involvement in keeping their healthcare data secured, whereas only a quarter (28 percent) said that they have such involvement today. [easy-tweet tweet=" Nearly all 95% of data-breach victims reported those holding their data took action." hashtags="DataBreach, Security "] In response to the breach, nearly all (95 percent) of the consumers who were data-breach victims reported that the company holding their data took some type of action. 
Some organisations explained how they fixed the problem causing the breach (cited by 29 percent), explained how they would prevent future breaches (23 percent) or explained the consequences of the breach (22 percent). Of those that experienced a breach, over half (53 percent) of respondents felt the breach was handled somewhat well, while only 15 percent felt it was handled very well, indicating there is room to improve. "The time to assure consumers that their personal data is in secure, capable hands is now," Chapple said. "When a breach occurs, healthcare payers and providers should be able to swiftly notify those affected, with a plan of action on how to remedy the situation and prevent it from happening again." Methodology The findings in this news release relate only to the England portion of Accenture's seven-country survey. The full research, "Accenture's 2017 Healthcare Cybersecurity and Digital Trust Research," represents a seven-country survey of 7,580 consumers ages 18+ to assess their attitudes toward health care data, digital trust, roles and responsibilities, data sharing and breaches. The online survey included consumers across seven countries: Australia (1,000), Brazil (1,000), England (1,000), Norway (800), Saudi Arabia (850), Singapore (930) and the United States (2,000). The survey was conducted by Nielsen on behalf of Accenture between November 2016 and January 2017. The analysis provided comparisons by country, sector, age and use. ### A Look at AI Website Builders: The Future or Just a Gimmick? Web development has undergone numerous transformations in the past decade. During the first phase of its development, it was enough to know HTML to make a website. Today, it is quite different. Some of the most popular websites out there run on PHP, jQuery, JavaScript, Python etc. Since creating one requires practice, knowledge and skill, some companies have come up with website builder solutions that allow users with no technical knowledge to create and publish websites. This saves a lot of money for companies, who in the past had to spend a lot of it on professional website designers. The companies that stand behind popular website builders have decided to make the entire website creation process even smoother and easier by introducing an artificial intelligence component. AI Website Builders Already Exist In the sea of website builders, the ones with the simplest and most intuitive interfaces have emerged as popular options. The drag-and-drop solutions integrated into these builders proved to be very attractive for people with no previous technical knowledge. This allows practically anyone to build a website from scratch and to start growing an online presence. Some of the most popular are Squarespace, Weebly, Wix, Jimdo etc. By updating their platforms with artificial intelligence, some of these companies are aiming to make the entire process even easier. Grid, Wix and Firedrop are some of the leading brands when it comes to using AI in their website building process. They say that websites built by an AI outperform other websites. For instance, the Wix ADI (artificial design intelligence) algorithm is capable of creating stunning and unique websites in just a couple of minutes. The developers behind it say that they want to set a new standard in web design. Firedrop makes things even more interesting by introducing Sacha – an AI web designer. Sacha acts like a chat bot and personal web developer assistant.
This AI is able to guide you through the entire web building process with no drag-and-drop toolbars or menu systems. Through the chat window, you get to choose colours for each website section and review the results in real time. The chat is mobile-ready, so users can use the AI to create websites on the go. [easy-tweet tweet="The AI algorithm has access to a number of website layouts, designs, content and navigation options" hashtags="AI, Website"] So How Exactly Do AI Website Builders Work? While the process of gathering data from a user may differ from AI to AI, under the hood these AI algorithms basically do similar things. After gathering user information by asking simple questions about the user's needs and goals, the AI starts to build a website. Basing its decisions entirely on these answers, it is able to create a unique and completely customised website. The AI algorithm has access to a number of website layouts, designs, content and navigation options. Basing decisions on the user's needs and goals, and what it has previously learned about you, the AI chooses the best possible combination, creating a unique website from scratch. It is also capable of creating more robust websites. The website that you get as a result can be further customised, but that is something that you will have to do. Traditional vs AI Website Builders In order to compare the two, let's see how they perform when it comes to cost, design and coding, and time efficiency. [easy-tweet tweet="When it comes to time efficiency, there is nothing that could beat an AI website builder" hashtags="AI, Website"] Cost efficiency is what interests most people, and this is where an AI website builder really shines. You can get your website created and up and running at a minimal price by using this option. It is a much more affordable solution than pricey teams of professional web designers. When it comes to time efficiency, there is nothing that could beat an AI website builder. It outperforms even the most experienced professionals in the field. It is just another tool that brings website building closer to being a completely automated process. However, the design and coding efficiency trophy still goes to the humans. It seems that AI website builders can't produce code as well as a professional programmer. They write code with no regard to semantic markup, which is important for structuring data and reinforcing its inherent meaning. The same goes for design. The AI is not really designing your website; instead, it picks a design from the design options in its database. This means that human designers have to feed the AI with style rules that it will apply correctly to any content. Conclusion AI website builders are already available on the market. From the data we have in our hands at this moment, we can conclude that this solution is great for small businesses and individuals who want simple portfolio websites and personal blogs for a fairly low price. On the other hand, more complicated projects requiring design and code that will complement a branding strategy will still be managed by teams of non-robot professionals. We can safely assume that professional coders and web designers will not be replaced by AI, at least not anytime soon. ### What Does Your Business Need? Data Analysis Software Getting up to speed with a digital, fast-moving marketplace requires modern data analysis tools.
Today's analytic tools are more flexible and user-friendly, promoting a collaborative approach toward utilising company data to your best advantage. Even moderately sized companies will have a diverse set of user skills and business operations. Data analysis software can be expensive, but going for the most affordable options may not provide the support and full functionality you expect. Taking the time to evaluate your choices will ensure that they provide the best value. The more easily and completely a robust product can be integrated with existing systems, the sooner you can start reaping the benefits. Analysis workflow Presuming data warehouses are in place and efficiently designed, modern analytic needs follow a basic cycle of user demand. While IT may provide the infrastructure, business users, in general, will navigate the data analysis process. Data solutions may require varying levels of training and adjustment, but selecting the right data tools should take into account their potential for delivering high levels of satisfaction in each phase of the data flow. Accessibility Many organisations are used to deploying data analysis tools from a top-down approach where IT not only provides the support but delivers analysis as a service to interested parties. Today, the trend is more toward self-service BI. The IT department must still provide an efficient data structure with accurate and timely content, but it is the end users who are empowered to access and explore this data to perform their own analyses. Users who are able to ask their own questions and discover the answers both ease the burden on IT and facilitate faster business decisions. As user expertise and knowledge increase, executives gain greater confidence in the conclusions that are drawn. Your choice of data analysis platforms should provide IT with control and scalability while allowing end users transparency and flexibility in using their own information. Interaction Interactive components are really an aspect of accessibility, but a vital feature that shouldn't be overlooked. Interactive data visualisations in reporting tools or dashboard infographics can allow users to more easily encapsulate and understand data while giving the opportunity to drill down to a greater level of detail for more precise investigation. The ability to interact with the data not only leads to better understanding of the information that's available, but it makes data analysis more interesting and engaging so that users are encouraged to explore and utilise company data more often. Interactivity provides opportunities to engage with the data in a way that allows the user to better understand it. The added knowledge and symbolic touch and feel make data visualisation not only fun but more efficient. This efficiency, in turn, leads to better analysis and the results your company is looking for. Analysis and discovery Analysis can cover a wide variety of needs and methods. It's essential that data analysis tools provide the flexibility to accommodate additional options beyond simple reporting. Data visualisations for monitoring performance and statistical or predictive analysis for exploring market segments are common tasks but involve different approaches. Data discovery will lead to new theories and questions. Even users with basic skills and responsibilities are going to occasionally have more complex needs. Though it may require some additional training or support, the BI tools you deploy should allow them a wider spectrum of options.
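As a rough illustration of the statistical or predictive analysis mentioned above, the sketch below fits a simple linear trend to a series of monthly sales figures in Python and projects it a few months forward. The figures are invented for illustration, and a real forecast would need proper validation before anyone acts on it.

```python
# A minimal sketch of trend-based forecasting with numpy.
# The monthly sales figures below are invented for illustration only.
import numpy as np

monthly_sales = np.array([120, 132, 128, 145, 150, 161, 158, 170, 175, 182, 190, 197])
months = np.arange(len(monthly_sales))

# Fit a straight line (degree-1 polynomial) through the historical points.
slope, intercept = np.polyfit(months, monthly_sales, 1)

# Project the trend three months ahead.
future_months = np.arange(len(monthly_sales), len(monthly_sales) + 3)
forecast = slope * future_months + intercept

print(f"Estimated growth per month: {slope:.1f}")
print("Next three months:", np.round(forecast, 1))
```

A BI platform wraps this kind of calculation in a visual, interactive form, but the underlying loop is the same discovery cycle described above: fit a model to history, question the result, and refine it.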
Training on how to use different analytics tools is beneficial on multiple levels as well. Though some people have a natural ability to pick up different analysis options, there are always features that can remain hidden from even the most intelligent user. These training sessions can bring to light new tips and tricks that allow every user to grow and add to their knowledge base. This creates a more effective workforce that can lead to better results. Ultimately, this will deliver more value to the company. [easy-tweet tweet="Today, data analysis tools are typically web-based to provide central storage " hashtags="Datasharing, Storage"] Data sharing Sharing information within a company once meant delivery by email or formal presentations with hard copies or slide shows. Today, data analysis tools are typically web-based to provide central storage and availability to anyone with a browser and the right permissions. Your applications should also allow granting other users permissions for copying or editing, version tracking, and export to a range of formats such as PDF or spreadsheets. This provides more opportunity for collaboration and discussion among team members or across departments, such as providing accurate information to customer service in order to improve customer satisfaction. In addition, this collaboration can also foster a better working environment where people feel free to share their opinions. This allows for various solutions to different types of problems and allows people to learn from what others have already done in the past. Data governance There are numerous approaches to data management. Each business will have different requirements on the spectrum of required controls. Sensitive customer information or valuable intellectual property must be secured. Regulatory requirements for healthcare and other industries require tracking who has access to what information, and when and how it's used. It's essential that your choice of data analysis tools includes mechanisms for establishing access permissions, user logs, and auditing data interactions. The transition from cumbersome and highly technical business analysis tools to more user-friendly solutions is well under way. Modern BI platforms provide a number of advantages, but it's essential that every product is evaluated in terms of how well it serves your business goals. Do an audit of different BI platforms and what they offer. Shop around to find out which one will meet your needs and help to grow your company. ### iland Named a Leader in Cloud-Based Disaster Recovery Report Report finds iland receives highest scores possible in the categories of data security, pricing, service levels and contract terms iland, an award-winning global cloud service provider of secure and compliant cloud hosting, disaster recovery (DR) and backup services, announced on April 19th that it has been named a "leader" in Forrester Research, Inc.'s report, "The Forrester Wave™: Disaster-Recovery-As-A-Service Providers, Q2, 2017." The Forrester report states, "[Infrastructure & Operations] leaders are increasingly responsible for supporting their firms' digital businesses and are often measured by the effectiveness of customer engagement.
DRaaS promises to help I&O pros build a resilient technology infrastructure so they can deliver always-on services where downtimes are either imperceptible or last just a few seconds.” [easy-tweet tweet="iland received the highest scores in data security, pricing, service levels and contract terms" hashtags="Security,iland"] In Forrester’s evaluation, iland received the highest scores possible in the categories of data security, pricing, service levels and contract terms. The Forrester report states iland DRaaS services are delivered “using Double Take, Veeam, and Zerto. Its self-service console integrates the underpinning replication solutions and makes it easy for customers to perform all operations on a single console. The iland Secure Cloud Console automatically measures the RPO and displays it over time - customers can set alerts in the event of a breach of a pre-set service-level agreement - and offers embedded security and compliance reporting. Once failover is executed, systems are scanned regularly for viruses, vulnerabilities, file integrity, firewall events, web reputation, application control, and intrusions. Upon failover, customers immediately gain access to built-in seven-day backups, providing additional resiliency. [It] works with [Business Continuity] consultants that evaluate DRaaS options and provider recommendations. It also has an impressive roadmap.” The iland Secure DRaaS solutions are part of a suite of Secure Cloud services, which include cloud backup and cloud hosting (IaaS), delivered to a global customer base from data centres in the Americas, Europe and Asia.  Customers can replicate both virtual and physical environments to the production-quality iland cloud and perform non-intrusive testing and recovery. “The economies of iland’s DRaaS cost model, inexpensive bandwidth, and advanced replication technologies have made enterprise grade DR available to the masses. As a result, many organisations are new to cloud-based DR, so it’s critical that we guide our customers to deploy cloud-friendly architectures and follow best practices” said Brian Ussher, president and co-founder of iland. “Every iland customer is assigned a project manager and an experienced technical team of DRaaS and network specialists skilled in managing the most complex deployments.  Additionally, iland continues to invest in our Secure Cloud Console to increase the visibility, security and reliability of our DRaaS offerings. We are proud of this recognition by Forrester and honoured by our customers’ and partners’ continued loyalty and support.”  Last month, iland announced the latest release of their Secure Cloud Services, which encompassed support for Model Contract Clauses and the U.S. and U.K. Privacy Shield Frameworks that address the challenges of the constantly changing cloud compliance and data privacy regulations in addition to new features to help customers obtain more visibility into cloud billing and performance management functionality. For more information on iland’s disaster recovery and other secure cloud services: Visit http://www.iland.com/services/ White paper: The Definitive Guide to Business Continuity in the Cloud The Forrester Wave™: Disaster-Recovery-As-A-Service Providers, Q2 2017 ### Dunkin’ Brands has Selected AWS as its Cloud Infrastructure Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), announced that Dunkin' Brands Group, Inc. 
(Nasdaq: DNKN), the parent company of Dunkin’ Donuts and Baskin-Robbins, has selected AWS as its cloud infrastructure provider. The company has completed the migration of its mobile applications, e-commerce websites, and key corporate IT infrastructure applications from on-premises infrastructure to AWS for increased scalability, reliability, availability, security, and reduced costs, and has improved the digital experience for Dunkin’ Donuts and Baskin-Robbins guests.     “Our mobile applications and digital properties are an absolutely critical way through which we reach our customers and they must be secure, available, and high performing at all times” Dunkin’ Brands has a number of customer-facing applications, such as its mobile applications and e-commerce websites, which serve as a critical way for the company to interact with its customers. Dunkin’ Donuts and Baskin-Robbins customers frequently use their mobile applications to review the menu, order ahead and redeem rewards, allowing customers to easily and quickly pay for their orders or send virtual gift cards. Several of these business applications, as well as the Dunkin’ Brands digital web properties, run on AWS. In addition to providing high performance, reliability, and security, AWS has enabled Dunkin’ Brands to maintain high availability during peaks in usage. For example, key consumer events like National Coffee Day and National Donut Day, as well as popular timeframes such as the holiday season, drive significant peaks in use of these key applications. While Dunkin’ Brands had previously found it increasingly difficult to predict and manage the on-premises capacity needed to provide an optimal digital experience for its guests during these times, it now relies on AWS to easily and reliably scale up and down as needed. In addition, Dunkin’ Brands has also migrated internal corporate IT infrastructure applications to AWS to reduce costs and increase availability. [easy-tweet tweet="Dunkin’ Brands selected AWS due to the depth and breadth of the services" hashtags="Cloud,AWS"] “Our mobile applications and digital properties are an absolutely critical way through which we reach our customers and they must be secure, available, and high performing at all times,” said Santhosh Kumar, Vice President, Infrastructure, Data Security and Privacy at Dunkin’ Brands. “We selected AWS as our cloud infrastructure provider for these key business applications due to the depth and breadth of the AWS services, and their experience in securely managing enterprise applications. AWS also provides us with redundancy to help us meet our goals of high reliability and availability, robust security and optimal performance for our applications, and the ability to quickly add capacity on demand when needed.” “Dunkin’ Brands is a great example of an enterprise company’s journey to AWS. They began their migration to AWS with their development and test workloads and websites, and after benefiting from lower costs, faster innovation rates, and improved reliability, migrated critical, customer-facing and corporate IT infrastructure applications,” said Mike Clayville, Vice President, Worldwide Sales at AWS. 
“We’re excited to work closely with Dunkin’ Brands as they continue their journey to AWS.”     ### FatPipe Networks Collaborates with Wind River to Accelerate Deployment of NFV Solutions FatPipe’s SD-WAN Platform integrated with Wind River Titanium Cloud optimises SD-WAN and NFV deployments for service providers and telecom equipment manufacturers FatPipe® Networks, the inventor and multiple patents holder of software-defined networks for wide area connectivity and hybrid WANs, announced on 20th April, it has integrated its SD-WAN (Software Defined WAN) platform with Wind River Titanium Cloud as the carrier-grade foundation to showcase and validate its Network Functions Virtualization (NFV) solution. Titanium Cloud is a portfolio of commercial-ready NFV infrastructure software platforms that enable service providers to deploy virtualized services faster, at lower cost and with carrier-grade uptime. By validating and pre-integrating FatPipe’s SD-WAN platform with Titanium Cloud, companies can deliver optimised SD-WAN/NFV solutions that are ready for deployment in live networks to service providers and telecom equipment manufacturers (TEMs). FatPipe Networks has performed testing and validation processes as part of the Wind River Titanium Cloud ecosystem program dedicated to accelerating the deployment of solutions for NFV. Through the collaboration of industry-leading software and hardware companies, the Titanium Cloud ecosystem ensures the availability of interoperable standard products optimised for NFV deployments to help accelerate time-to-market for service providers and TEMs. As one of the leading vendors for WAN link load balancing and WAN path control with thousands of customers across six continents and intellectual property defining SD-WANs, FatPipe’s SD-WAN platform includes key features that transcend WAN failures to maintain business continuity, including zero-touch branch deployment, hybrid WAN connectivity, tuned application performance, easy integration, granular WAN visibility, multi-path security, secure full mesh VPN connectivity and flexible centralized policy deployments. [easy-tweet tweet="Joining forces with Wind River, FatPipe can assist mutual customers in reaching their NFV objectives " hashtags="NFV,Cloud"] “The value of FatPipe’s SD-WAN technology integrated with Wind River Titanium Cloud means service providers and TEMs can easily and quickly deploy and manage a fully integrated NFV solution across their entire network without concern for interoperability,” said FatPipe’s CTO Sanch Datta. “By joining forces with Wind River, a global leader in NFV and IoT software, we can assist our mutual customers in reaching their NFV objectives.” “Service providers are in need of validated and market-ready end-to-end NFV solutions. To address this need, collaboration across the ecosystem is essential,” said Charlie Ashton, senior director of business development for Software Defined Infrastructure at Wind River. “We are working with leaders like FatPipe Networks to create optimised, interoperable SD-WAN/NFV solutions for service providers and TEMs who are deploying NFV in their networks. Titanium Cloud provides a foundation for carrier-grade NFV infrastructure, and by leveraging other pre-validated NFV elements, service providers can quickly achieve their goals such as reducing OPEX while accelerating the introduction of new high-value services." Titanium Cloud is designed to meet the stringent "always on" requirements of the communications industry. 
With Titanium Cloud as the NFV infrastructure software foundation, the telecom industry can rapidly roll out new services with the carrier-grade uptime and reliability required by communications networks. Titanium Cloud is based on open software standards including Linux, real-time Kernel-based Virtual Machine (KVM), carrier-grade plugins for OpenStack®, Data Plane Development Kit (DPDK), and accelerated virtual switching, all optimised for Intel® architecture platforms. ### Splunk Cloud Launches on AWS Marketplace Splunk Inc. (NASDAQ: SPLK), the provider of the leading software platform for real-time Operational Intelligence, announced on April 20th support for SaaS Contracts in AWS Marketplace. The new globally available API capability enables seamless procurement and deployment of Splunk® Cloud. The automated and accelerated purchasing process for Splunk Cloud via AWS Marketplace ensures fast time-to-value for customers leveraging Splunk solutions to gain real-time security, operational and cost management insights across their Amazon Web Services (AWS) and hybrid environments. The University of San Francisco (USF) is home to an innovative academic community of more than 12,000 students, faculty and staff. “As a higher education institution, USF prides itself on being at the forefront of technology, which is why we turned to Splunk and AWS,” said Opinder Bawa, vice president of information technology and chief information technology officer, University of San Francisco. “Ensuring the privacy of student, faculty and staff data is of critical importance. Adopting Splunk has enabled our small IT team to protect against increasingly sophisticated threats. The availability of Splunk Cloud on AWS Marketplace will make it even easier and faster to enhance the security of our entire AWS environment. This visibility is essential to our digital transformation journey.” AWS Marketplace streamlines customer adoption of technology such as Splunk Cloud via a consolidated purchase environment and integration with their AWS accounts, which have terms already established. AWS Marketplace SaaS Contracts simplifies the process even further by enabling customers to prepay for Splunk Cloud based on expected usage tiers through contracts of up to one year in length. The Splunk Cloud cost is integrated into the customer’s AWS bill once they subscribe, resulting in a consolidated, easy-to-process bill. By the end of the first half of 2017, Splunk will offer term-based pricing for extended contract terms of up to three years in length, with specific discounts for longer contract durations. [easy-tweet tweet="Splunk is an innovative software platform that drives efficiency and visibility " hashtags="Splunk,AWS"] “Splunk is an innovative software platform that many of our customers rely on to drive efficiency and visibility across their entire infrastructure, and now customers will be able to purchase multi-year SaaS contracts through AWS Marketplace for the first time,” said Dave McCann, vice president of AWS Marketplace & Catalog Services, Amazon Web Services, Inc. 
“Our customers want easy-to-deploy SaaS solutions like Splunk Cloud to drive data-driven operational efficiencies and speed innovation, with the long-term contracts now possible via AWS Marketplace, they can experience even greater cost savings.” “With the availability of Splunk Cloud via SaaS Contracts on AWS Marketplace, existing AWS customers can more easily get started with Splunk to gain end-to-end visibility across their AWS infrastructure,” said Susan St. Ledger, chief revenue officer, Splunk. “By offering support for SaaS Contracts on AWS Marketplace at launch, we are excited to collaborate with AWS on this latest go-to-market method for bringing Splunk Cloud to the 100,000 current active buyers, and the growing customer base AWS Marketplace is building.” ### Rubrik Unveils Integration with Pure Storage  New Solution Delivers Integrated Hybrid Cloud Functionality, Simplicity and Performance Across All Environments Rubrik, the Cloud Data Management company, today announced its data protection solution for Pure Storage FlashBlade, along with new integration between Rubrik’s Cloud Data Management Products and Pure Storage’s FlashArray//M.  With this new collaboration, customers can achieve simplicity, performance and integrated data management across hybrid cloud environments. "We are pleased to lead the market with our data protection solution for FlashBlade. Finally, customers can achieve unparalleled simplicity, performance and scalability across their environment with Pure and Rubrik,” said Bipul Sinha, founder and CEO of Rubrik.  “Our integration truly ties together Pure’s performance and manageability, to Rubrik’s cloud-native data fabric, making data agiler across hybrid cloud environments.”  Rubrik Cloud Data Management integration across Pure Storage’s FlashArray//M and FlashBlade, delivers high-performance backups without compromising application consistency, granularity, or cloud mobility.  Customers using Rubrik and Pure benefit from: Deployments in under one hour vs. weeks; Unrivaled performance with near zero latency and significantly reduces VM stuns; Management across a data fabric that spans on-premises and cloud; Seamless scalability; Powerful onboard reporting; and Reductions in footprint by 70% or more. “As a FlashArray and FlashBlade customer, we’re excited to pair the Pure platform with Rubrik for our backup needs,” said Katie Bye, Director of Infrastructure, Farm Bureau Insurance of Michigan. “The integrations with array-based snapshots, combined with the ease of use of both products has greatly enhanced our operations as well as disaster recovery.” [easy-tweet tweet="Pure and Rubrik wipe away storage and backup management complexity" hashtags="Storage,BackupManagement"] “Pure and Rubrik wipe away storage and backup management complexity, resulting in massive operational savings. We trust Pure and Rubrik to run and backup our highly transactional applications,” said Jacob Warren, Systems Administrator, Red Hawk Casino. Rubrik’s support for FlashBlade, Pure Storage’s unstructured data offering, has been designed to cater to companies running complex, data-intensive workloads between private and public clouds. “We are excited to partner with Rubrik on this integration because it offers clear benefits of simplicity and performance for hybrid cloud environments,” said Matt Burr, VP of Global FlashBlade and FlashStack, Pure Storage. 
“For forward-thinking IT teams who are seeking innovative and efficient technology, Rubrik and Pure are a great choice.” For more information, read the Rubrik and Pure Storage Joint Solution Brief, or attend our upcoming joint webinar, Rubrik and Pure Storage Integration and Technical Overview, featuring Rubrik Product Management Director Nitin Nagpal, Pure Storage VP and CTO Americas Chadd Kenney, and Pure Storage Technical Director Cody Hosterman, taking place on May 5 at 10 a.m. PT. Sign up today to learn more. ### SolarWinds Database Performance Analyzer Now Available in the Microsoft Azure Marketplace Azure SQL Database users can now benefit from Multi-Dimensional Performance Analysis with simplified deployment On 20th April, SolarWinds, a leading provider of powerful and affordable IT management software, announced the availability of SolarWinds® Database Performance Analyzer with support for Microsoft® Azure® SQL Database in the Azure Marketplace. SolarWinds Database Performance Analyzer delivers deep visibility into the performance of top database platforms, including Microsoft SQL Server® 2016, and provides advice for optimisation and tuning to accelerate database performance. Using agentless architecture and unique Multi-Dimensional Performance Analysis™, it quickly finds the root cause of complex problems and improves the performance of on-premises, virtualized, cloud, or hybrid IT application environments. With its availability in the Azure Marketplace, the thousands of organisations running millions of Azure SQL Database instances can now benefit from these capabilities, with simplified deployment in minutes. [easy-tweet tweet="SolarWinds Database Performance Analyzer can help cloud developers achieve the ROI in the cloud" hashtags="Cloud, SolarWinds"] “We’re thrilled to offer SolarWinds Database Performance Analyzer with Azure SQL Database support in the Azure Marketplace,” said Gerardo Dada, vice president, product marketing, SolarWinds. “By helping to eliminate potential overprovisioning, slow end-user experience, and overspend, SolarWinds Database Performance Analyzer can help cloud developers achieve the ROI and cost efficiency they seek in the cloud.” According to the SolarWinds IT Trends Report 2017: Portrait of a Hybrid IT Organisation, databases are one of the top three infrastructure elements IT organisations are migrating to the cloud. Furthermore, the study found that, by weighted rank, the top reason for prioritising these areas of their IT environments for migration was the greatest potential for return on investment (ROI) and cost efficiency. “We think customers will benefit from SolarWinds Database Performance Analyzer with support for Microsoft Azure SQL Database and are pleased to make it available for easy deployment through the Azure Marketplace,” said Andrea Carl, director, commercial communications at Microsoft Corp. “Now our customers running millions of Azure SQL Database instances benefit from having additional tools to quickly root out problems and improve overall performance.” SolarWinds Database Performance Analyzer is part of the SolarWinds end-to-end hybrid IT performance management portfolio of products. 
The SolarWinds portfolio also includes SolarWinds Server & Application Monitor (SAM), which provides deep visibility into the performance of business-critical applications and the infrastructure that supports them on-premises and in the cloud, as well as SolarWinds Network Performance Monitor (NPM), which provides comprehensive network performance monitoring with the NetPath™ feature for critical path visualisation on-premises and in the cloud. Pricing and Availability Pricing for SolarWinds Database Performance Analyzer for Azure SQL Databases in the Azure Marketplace starts at £530 GBP* per database license (perpetual, BYOL). A fully functional two-week trial is available. Customers will not incur SolarWinds software charges during free two-week cloud trials; however, infrastructure charges may still apply. Pricing for other versions of SolarWinds Database Performance Analyzer starts at £1,515 GBP* per database instance, and pricing includes the first year of maintenance (licenses are perpetual with a related maintenance fee). For more information, including the downloadable, free 14-day evaluation, visit the SolarWinds website or call +353 21 500 2900. ### 5 Reasons to Use The Cloud For Product Sourcing Today’s consumers are demanding more convenience, variety, affordability, and timeliness with every purchase and interaction they make. Thanks to the versatility and diversity of the eCommerce market, this demand is being answered with resounding promise and creating an infrastructure for infinite growth in the future. In fact, recent statistics highlight that eCommerce is currently one of the world’s fastest growing markets; in 2013, global eCommerce sales reached $839.8 billion - by next year, Statista anticipates annual sales will reach upwards of $1.5 trillion. Similarly, several eCommerce think tanks predict that by 2020, more than 50% of consumers across the world will make online purchases. Just as the market’s past advances wouldn’t have been possible without the implementation of innovative technology, neither will the achievement of future benchmarks and industry projections. Perhaps one of the most valuable developments for eCommerce professionals and users across the globe is the emergence of cloud-based platforms. By onboarding cloud technologies, the entire selling process becomes faster and easier. Furthermore, when adopted early on, cloud-based platforms offer a slew of benefits including increased market value and the opportunity for a more streamlined experience. Benefits of Product Sourcing in the Cloud Over the next five years, the fashion, electronics, media, food, personal care, furniture, appliances, toys, hobby, and DIY sectors are projected to expand by a significant percentage. However, by onboarding cloud technologies, suppliers can provide the ultimate experience for consumers. The cloud enables companies to have real-time access to things like product descriptions, pricing, and dimensions, allowing them to cut costs while also ensuring that critical information is accurate and up-to-date. [easy-tweet tweet="The cloud enables quick turnaround, allowing suppliers to feature services before competitors. " hashtags="Cloud, Digital "] Onboarding new products and suppliers can take months and cost a fortune. However, by implementing cloud-based technologies, product sourcing is not only quicker, but it’s more accurate and effective. 
Not only does this offer consumers the convenience, variety, affordability and timeliness they crave, but it also gives suppliers agility in product sourcing and a competitive edge that will keep them relevant in a cut-throat market. In addition, product sourcing with the cloud offers a slew of benefits: The cloud enables quick turnaround, allowing suppliers to feature fresh products and services sooner than competitors. By having an expansive, up-to-date, cutting-edge inventory of products, you will attract and retain a remarkable customer base. Using the cloud cuts back on onboarding time, which not only reduces the risk of time-related errors but also prevents suppliers from overlooking costly mistakes associated with data accuracy. Perks of Partnering With Professional Cloud Product Sources When using a company that specialises in cloud-based product sourcing, suppliers don’t have to worry about the time-consuming fine details, such as connecting with retailers and implementing data feeds - regardless of data language. This enables businesses to expand product offerings without working through the risks and challenges of sharing data with online stores and other marketplaces. Additionally, professional cloud-based product sourcing companies provide a centralised process for a vast array of product sourcing needs. Not only do these companies specialise in the technical integration of multiple suppliers, but they also utilise cloud technology to sell and process orders effectively, handling data such as titles, descriptions, and images; MAP, MSRP, and street pricing; weight and dimensions; invoice services; purchase orders; and more. Better yet, leading product sourcing companies that specialise in cloud-based technology are able to provide a seamless onboarding process in a fraction of the time it would normally take. Instead of waiting months for the process to go live, retailers can connect with global suppliers and integrate data into online marketplaces in a matter of weeks. By integrating cloud-based product sourcing and partnering with experts to facilitate the onboarding process, eCommerce markets can gain a competitive edge by avoiding costly risks and providing the convenience, variety, affordability and timeliness that consumers demand. ### Ensono Demonstrates Commitment to Data Protection Through Partnership with TRUSTe Ensono continues to refine global services through its intercontinental data protection standards Ensono™, a leading hybrid IT services provider, has partnered with TRUSTe, the leading global data privacy management company, to support its compliance with the European General Data Protection Regulation (GDPR). Through this partnership, TRUSTe will perform data inventory and mapping – one of the critical activities in Ensono’s GDPR compliance roadmap. [easy-tweet tweet="Ensono’s partnership with TRUSTe further strengthens its global data protection programme" hashtags="DataProtection,Ensono"] The EU GDPR is a comprehensive regulatory framework whose objective is to protect the “fundamental rights and freedoms of natural persons, and in particular, their right to the protection of personal data,” according to the European Council. It outlines the requirements that organisations that process the personal data of EU individuals must apply in order to safeguard such data, regardless of the organisation’s location. 
Ensono’s partnership with TRUSTe further strengthens its global data protection programme, demonstrating Ensono’s commitment to safeguarding sensitive information and assuring conformity with European standards on data protection. “Compliance with the GDPR is very important to our business,” notes Pete Bazil, Ensono’s chief legal officer. “Over the last year, Ensono has experienced significant growth and transformation as a global company. Our strategic partnership with TRUSTe to achieve GDPR compliance reflects our commitment to our European clients and global partners and the safeguarding of their sensitive information. This will remain an important priority and focus for the company over time.” Ensono launched its GDPR compliance initiative in March 2017 and adopted a phased approach that positions it to align with the GDPR prior to the May 25, 2018, effective date. For more information about Ensono, please visit https://www.ensono.com. ### How to Choose the Perfect Cloud Service Provider Although small and medium-sized businesses (SMBs) are aware of the benefits a cloud-based IT infrastructure has for their business, some are still reluctant to implement cloud services for several reasons: they have concerns about data privacy and security, and a lack of trust in cloud service providers (CSPs) with regard to the continuous availability of the rented infrastructure. More than these, though, complexity is a big factor: where to start and which provider to use. Despite these concerns, cloud services do offer many advantages and can solve several problems for SMBs. One of the main benefits is that companies can focus on their core business, as cloud services are immediately available. In addition, their implementation allows better collaboration, as the services are available anytime and anywhere, and requires no CAPEX investment thanks to pay-per-use pricing models. This also means a company does not need to invest in its own hardware or additional staff to maintain the system in-house. Finally, SMBs can respond far more individually and flexibly to changing resource demands. Important decision criteria It is possible to operate all IT services in the cloud. The challenge for most SMBs is to find a Cloud Service Provider (CSP) that fits their needs. In general, a reliable CSP should have already established itself on the market for several years. Extensive technical product features, a current product portfolio and certified security standards ensure that the CSP is trustworthy and experienced in this sector. The three major decision criteria an SMB should seek in a CSP are: Reliability: Small businesses should look at whether the provider can show positive statistics on continuous availability and how outages are managed. They should also consider whether the provider is a start-up company or part of a well-established company. The financial stability of the CSP should also be taken into consideration. Evaluating reliability from a technical perspective means the infrastructure should offer stable availability in order to guarantee that a customer’s project is always accessible. Professional cloud service providers, for example, have implemented technical measures to ensure continuous availability during power supply disturbances and have DDoS attack prevention in place to guarantee availability. Security is also essential, although the level of security depends on the industry. Financial services require a higher security level than just a hosted website. 
Nevertheless, every SMB requires a secure service where unauthorised access to data is prevented. With regard to data protection laws, it is highly recommended to choose a provider that offers a local data centre in your country, which also means better performance. A majority of companies expect that their data – especially when stored in the cloud – is subject to local data protection standards. Easy to use: CSPs can offer so many technical features that businesses might feel overwhelmed. The best product, especially for SMBs, is often the one with an easy-to-understand user interface and exactly those enterprise features that are needed. Important requirements: generally and technically speaking Besides the major criteria, some general requirements should be considered while selecting a CSP. A company should develop its own checklist, but a few things should be mandatory: A short contract period allows customers to change provider quickly if needed. Be aware that often the best price is only available when you commit to a long-term contract – meaning flexibility is lacking. Pay-per-use means you are only billed according to actual use. Transparent and accurate cost tracking is important. Check what is included in the price and preferably choose a CSP that includes unlimited data traffic. This prevents a higher bill in case the number of visitors on your website increases seasonally. The price/performance ratio should be considered during the selection process for your CSP as well. A very affordable solution which lacks features is not the best choice, as many additional costs can occur and increase your monthly bill. On the other hand, a very extensive solution that has a lot of features included but which you do not need for your business is also not ideal. Therefore, you should have a deeper understanding of your specific requirements and the relative price, and decide accordingly on a CSP that offers the best price/performance ratio for your individual case. Local language support on multiple channels is required. Fast and reliable help should be available via phone, chat, e-mail or the popular social media channels. Certifications like ISO 27001 provide independent confirmation of a provider’s high-quality infrastructure. Independently conducted performance testing is also a good indication of a CSP being a reliable partner. [easy-tweet tweet="If you have chosen a CSP, the cloud is easy, fast and reliable with many technical opportunities " hashtags="Cloud,CSP"] Going a little deeper into the technical requirements, the following features should be included in a cloud service: An up-to-date virtualization technology – it is important for SMBs that they can rely on a stable, market-standard solution. Short provisioning time makes sure the service is available rapidly. Nothing is more annoying than waiting hours or days before a service is available. The latest storage technology should be used – for example, SSD/SAN-based storage is significantly faster than traditional HDD storage and guarantees a quick response time for each project. Increasing resources as needed without downtime is a massive advantage, as one does not need to restart the systems once resources like RAM, CPU and storage are topped up. Integrated backup options are necessary to back up the system – many CSPs offer these as add-on features, but at least a simple backup solution should be available free of charge. 
Shared storage is a good feature to exchange information between the Virtual Machines (VMs) of a cloud server. It saves time and money, as the data only has to be stored once and can be used for multiple projects. Load balancing distributes data requests across multiple servers allowing consistent access times for website visitors. If you want to automate your cloud infrastructure yourself, check if a fully documented Cloud Application Programming Interface (API) is available that allows you to integrate your own software. From a business perspective, the cloud is the most suitable solution for SMBs to store and process their data. If you have chosen a CSP, the cloud is easy, fast and reliable with many technical opportunities that were previously reserved for large companies from the enterprise segment. SMBs can concentrate on their core business and the specialists at their CSP take care of the infrastructure 24/7 – a service that an SMB could never usually realise without massive investments. On top of that, lower costs and scalability offers new freedom.   ### Businesses Yet to Embrace Digital are Missing the Boat The term digital transformation will not be unfamiliar to many UK organisations in today’s fast-paced technological climate. The latest research from the Cloud Industry Forum (CIF) reveals that 44 percent of UK organisations have already implemented, or are in the process of implementing, a digital transformation strategy, and a further 32 percent expect to have done so within the next two years. Two in five UK businesses expect their organisation’s sector to be moderately or significantly disrupted by digital transformation within the next two years. However, the term ‘disrupted’ in this context, does not convey wholly positive connotations and is representative of a large number of businesses yet to enter the ‘digital age’ as we know it. Many businesses are yet to convert for a number of reasons. CIF research from 2016 reveals that 80 percent of UK organisations are adopting the cloud in some form. For the other 20 percent, however, aside from the undeniable benefits of collaboration, the cloud pushes dangerous new boundaries that test both new IT admins and longstanding IT experts. And whilst, unfortunately, the worst can occasionally happen - disasters which can be a huge obstacle for your business - paper documents and files stored on hard-drives are still vulnerable to unforeseen circumstances, such as an accident or malware wiping swathes of critical client data. [easy-tweet tweet="80% of UK organisations are adopting the cloud in some form" hashtags="Cloud, Digital "] Working with experienced cloud providers reduces this risk and makes it easier for businesses to ensure information remains safe and compliant with all appropriate regulations such as GDPR. A recent study from Fujitsu found that ageing technology is a major barrier to digital transformation with 57 percent of the businesses surveyed admitting that existing technology is struggling to keep up with the demands of the modern digital world. In today’s climate, however, ageing technology is no excuse for stunting business agility and productivity, and ultimately letting a business fall behind its competitors. So how can these organisations keep pace with the ever-increasing speed of doing business and “go digital”? The cloud is a great place to start. Embrace the cloud The first step to digital transformation is to embrace the cloud. 
According to CIF’s latest report, 92 percent of businesses state that cloud services are important to their organisation’s digital transformation strategy, with 49 percent believing them to be ‘very important’ or ‘critical’. The applications and services most likely to be cloud-based today are web hosting (65 percent), Platform-as-a-Service (which covers many cloud options) and office productivity tools such as Microsoft Office 365, used by half of the respondents. However, the picture is expected to look somewhat different in three years’ time, and while adoption of these three application types will continue to climb, considerable growth in other areas is also expected. By 2020, for example, three-quarters of respondents expect to have adopted cloud-based video conferencing systems, such as Skype for Business, while 70 percent plan to use cloud-based CRM applications. [easy-tweet tweet="83% of cloud users have successfully improved the reliability of their IT through cloud services" hashtags="Cloud, IT"] Achieving business outcomes Cloud-based solutions are helping businesses achieve their objectives by speeding the pace of activity and decision-making and improving the way they run. CIF’s latest research found that 83 percent of cloud users have successfully improved the reliability of their IT through their use of cloud-based services, 85 percent have increased the speed of access to technology, and 76 percent have reduced the risk of data loss – especially important with the forthcoming GDPR deadline in May 2018. Furthermore, companies can empower and enchant employees by using cloud-based tools that make it easier for teams to collaborate, communicate and access a wealth of information previously siloed in different systems across the organisation. Optimising business operations depends on knowing what’s happening in the business now – and what’s likely to happen in the future – ensuring businesses are ahead of the game, and most importantly, ahead of their competitors. Ditching the desk It can be baffling how some businesses are not championing mobile working as a way to both speed up and enhance workflow. Now, as long as employees have internet access, they can work off shared files at any time - regardless of location. Mobile data enables team members to manage work processes wherever they are working from. Not only does this improve the productivity of the business, but increased mobile working also means employers can recruit desirable candidates who are based outside the local area. The cloud enables the top candidates to work from anywhere, so physical location is no longer a barrier to employing the best possible staff. Another advantage of cloud services is the flexibility they deliver: no longer are employees limited to the traditional nine-to-five office hours. Telecommuting helps to reduce the stress of a long daily commute, and by facilitating flexible working initiatives, it plays a vital part in creating a more motivated, happier and engaged workforce. The cloud and business clarity The burden of having to organise an entire IT infrastructure whilst also managing and trying to grow your business can be time-consuming, stressful, and an inefficient use of resources. Using cloud services eradicates this problem, freeing up precious time to focus on the business. Cloud services have the added benefit of reducing capital expenditure – a key advantage for SMBs with limited budgets and cash. 
Cutting unnecessary costs means funds can be redirected to the areas the business most needs, such as winning new customers. What’s more, with no need for in-house physical infrastructure, businesses will not incur unexpected maintenance costs, so there won’t be any nasty surprises further down the road. While cloud computing is now clearly mainstream within the business, there are a number of obstacles, including Brexit, which may stand in the path of greater digital transformation. Given the government has just announced its latest digital strategy, intending to make the UK the “best place to start and grow a digital business”, there has never been a better time to explore digital transformation. Cloud services are a vital part of this transformation, and businesses are encouraged to embrace it sooner rather than later. After all, harnessing the value of cloud computing can be liberating on many different levels, from streamlining communication to improving data security. For a predictable low monthly cost, using cloud services will help boost a business’s bottom line, enabling it to run a more efficient and ultimately, more successful business. In some ways, it’s a fundamental change, but at the same time, it’s merely a different way of enabling the same results – great customer service, looking after staff and growing the business profitably. ### Informatica Launches Flexible Pricing Model For Data Integration on AWS Marketplace Customers can quickly license, deploy, scale and upgrade award-winning Informatica cloud data integration solutions directly on AWS Marketplace Informatica®, a leading provider of data management solutions, announced yesterday, that customers can now seamlessly and flexibly license, deploy, scale and upgrade Informatica Cloud data integration solutions directly on AWS Marketplace using a frictionless self-service model. As organisations seek to rapidly innovate using cloud data services, such as Amazon Redshift data warehouse, Informatica’s collaboration with Amazon Web Services (AWS) helps accelerate the customer journey to the cloud by streamlining access to exceptional data management, reducing upfront costs through flexible licensing options, and removing business and technical barriers to deployment. Improved business outcomes include: Faster time to value – Organisations can quickly jumpstart new high-value cloud initiatives, such as cloud data warehousing with Amazon Redshift, with rapid access to comprehensive cloud data management from Informatica. Improved IT operations – CIOs can optimise IT operations with a consistent approach to deploying Informatica data management solutions on AWS, coupled with unified and flexible billing from AWS. Enhanced business productivity –By helping accelerate the Proof of Concept stage, customers can leverage a simplified self-service approach to rapidly licensing and deploying cloud data integration, and then quickly and seamlessly add and scale cloud integration services on AWS Marketplace to support increased data volumes, types and sources as needs evolve. Flexible licensing options and expanded offerings Informatica is among the first AWS Partner Network Technology Partners to support the new SaaS Contracts API on AWS Marketplace. Today’s announcement is an enhancement of the Pay As You Go Informatica Cloud offerings available since August 2016 on AWS Marketplace. 
New flexible licensing options will enable customers to more easily expand their Informatica deployment on AWS in several ways, including data processing capacity, number of Informatica connectors and vertical scaling of processing power. [easy-tweet tweet="Informatica accelerates the customer journey to the cloud with AWS." hashtags="AWS,Cloud"] Informatica for AWS Informatica accelerates the customer journey to the cloud with AWS. Its metadata-centric, hybrid data management platform is optimised for maximum scale and is certified for AWS. The Informatica platform supports key data integration and data management patterns, such as data warehousing and data lakes, any user and latency, and any location—in the cloud, on-premise or in hybrid environments. The platform offers native connectivity to key AWS data services, including Amazon Redshift, Amazon Simple Storage Service (Amazon S3), Amazon Relational Database Service (Amazon RDS), Amazon Aurora, Amazon DynamoDB, and Amazon EMR, and offers hundreds of pre-built connectors to cloud and on-premise data. Informatica products including Informatica PowerCenter, Informatica Cloud, and Informatica Big Data Management are certified for AWS and available on AWS Marketplace for easy consumption and deployment, so customers can bring connected, trustworthy data to AWS with speed, scale and confidence. Supporting Quotes “Informatica teamed up with AWS to provide customers with an easy, flexible and fast path to leveraging connected trustworthy data on AWS,” said Ronen Schwartz, senior vice president and general manager, Informatica Cloud, Informatica. “We have now opened the door for customers to bring to bear the full power of Informatica data management on their AWS deployments, thus demonstrating our solid commitment to the AWS ecosystem, while enabling our customers’ highly individual AWS journeys.” “Informatica Cloud data integration solutions are innovative and flexible offerings, which many customers rely on for their data management needs, and they can now be purchased with multi-year contracts through AWS Marketplace for the first time,” said Dave McCann, vice president of AWS Marketplace and catalogue services. “With AWS Marketplace SaaS Contracts, customers now have more flexibility in purchasing longer terms, which typically bring lower prices, and deploying Informatica data integration products with simplified procurement and unified billing through their existing AWS account.” ### TeamViewer and IBM MaaS360 Deliver Remote Support Functionality for Unified Endpoint Management TeamViewer remote control functionalities will be available on the IBM MaaS360 UEM platform as an integrated service TeamViewer and IBM MaaS360 will support Android, iOS, Windows and macOS TeamViewer integration allows admin staff to better support employees through enhanced Service Desk capabilities TeamViewer, a leading global software provider for digital networking and collaboration, today announced remote support functionalities on IBM MaaS360 with Watson, a cognitive unified endpoint management (UEM) platform. TeamViewer remote control functionality will be available through MaaS360 to simplify management and remote support. On any supported MaaS360 managed device, IT can initiate a remote support action with a single click. This unique integration enhances the user support experience and makes it very simple and easy to use. 
Visual troubleshooting With the integration of TeamViewer’s screen sharing and remote control options, the customer’s admin staff can visually detect where the problems are. This leads to better root cause analysis, allowing help desk agents to more easily access the device, identify the problem, and troubleshoot. This, in turn, means issues are resolved with greater transparency for the user. Faster support Thanks to the visual detection, problems are easily and quickly located. This leads to faster and more accurate resolutions and increased end user satisfaction. [easy-tweet tweet="Privacy and data issues are a leading concern for users. " hashtags="IBM,DataProtection"] Improved privacy Privacy and data issues are a leading concern for users. End users must give their consent before any remote support action can take place. They also have a better visual overview about what the help desk agent is doing on their devices. Collaborated support TeamViewer’s quick-access features such as chat and file transfer improve the exchange of information between IT and users. HelpDesk Business Efficiency Advanced features such as session recording and connection reports help IT departments continuously assess performance and improve the effectiveness of remote support in their business. “TeamViewer is a smart solution with cutting-edge technology that enables seamless integration into other solutions for a faster, simpler and safer help desk experience”, says Alfredo Patron, Vice President Business Development at TeamViewer. “Our unique integration with MaaS360 is a further step in proving the importance of remote access and support technologies on UEM platforms, and we are proud to offer our support and expertise.” MaaS360 with Watson is the industry’s first and only cognitive unified endpoint management (UEM) platform, enabling a smarter approach for enterprises to manage and secure all endpoints plus their users, apps and content. MaaS360 can be used to manage disparate endpoint types—including smartphones, tablets, laptops, ruggedized devices and the Internet of Things (IoT)—all from a single console. By offering a native-like approach to containerisation, MaaS360 preserves personal privacy and protects work data without compromising the end-user experience. Its integration with TeamViewer gives MaaS360 enhanced Service Desk capabilities that will allow customer administrators to remotely access their iOS, Android, Windows and Mac devices—speeding up support and the overall user experience. To learn more about MaaS360 and TeamViewer check out the blog:https://ibm.biz/Bdi2W6. ### How can the Cloud and Big Data Save Lives? Much is written about the transformative effect the cloud can have on businesses, but I am inspired by a project that is helping a non-profit organisation in its effort to help eradicate disease and ultimately save lives. Today, the cloud offers access to on-demand computing resources that, only a few years ago, would have been considered super-computing. The “Visualise No Malaria” project led by business intelligence software vendor Tableau and the non-profit organisation PATH is using cloud-powered real-time analytics to help the Zambia government eradicate malaria from its country by 2020. Making an impact on people’s lives PATH is a global non-profit organisation that aims to save lives and improve the health of vulnerable people in need through the application of technology. 
The organisation has an innovative approach to tackling health issues, with its strategic application of vaccines, drugs, diagnostics, devices, and system and service innovations for maximum impact. [easy-tweet tweet="PATH-pioneered approaches have contributed to saving 6.2 million lives over the course of 15 years" hashtags="PATH, Cloud "] PATH-pioneered approaches have contributed to saving 6.2 million lives over the course of 15 years, but its malaria programme doesn’t just want to control malaria. PATH has set out to assist the Zambian government in its quest to eliminate malaria in Zambia within three years. For the Visualise No Malaria project, PATH enlisted the help of Tableau to apply its considerable experience in data analytics to understanding the cause and probability of malaria outbreaks in populated areas across Zambia’s vast 750,000 km² landscape – a country three times the size of the UK. Visualise No Malaria The Visualise No Malaria team built upon Tableau’s tools by engaging partners such as Alteryx, EXASOL and Twilio to build a complete technology “stack” in the cloud for real-time operational intelligence. PATH has enlisted the help of “Tableau Zen Masters” - experts nominated by the Tableau community for their desire to help improve Tableau solutions. These experts have volunteered to help by sharing their skills and knowledge with health workers in Zambia on the frontline. They are helping to automate workflows and improve the visualisations so that Zambian officials have the intelligence that they need to predict which area of Zambia will next be affected by an outbreak. [easy-tweet tweet="By integrating mapping technologies...health workers can predict areas where mosquitoes breed" hashtags="Mappingtechnology, Cloud"] By integrating mapping technologies and testing algorithms, the Zen Masters have been able to help Zambian health officials understand how to map geospatial data such as elevation and slope and combine it with hydrological features, such as topographic wetness and stream power. In doing so, health workers can now create a very precise, accurate map of water courses, and so are able to predict likely areas where mosquitoes will breed. By combining this information with meteorological models of precipitation and temperature, a relationship can be established that allows Zambian health officials to proactively focus energy and resources on areas where malaria outbreaks are most likely to occur. The power is in the cloud It is the cloud that makes this all possible. The data is aggregated in EXASOL’s analytic database running on Amazon Web Services, which is specifically designed to analyse large datasets very quickly through the optimal use of server memory. This computing power makes it possible for the PATH team to analyse massive data sets at very high speed, produce near on-demand intelligence using Tableau and Alteryx, and take action against the malaria parasite where the probability of an outbreak is highest. Cloud elasticity has an important role to play in enabling the project’s future. The scalability of EC2 instances and storage means that as the data sets used continue to grow, the ability to run fast analytic queries will not be hampered, because the analytic and visualisation solution has been built to scale. This is crucial, as the quality and accuracy of the visualisations will improve as more data is collected and analysed. 
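To make the prediction idea concrete, the sketch below shows one way terrain and weather signals could be blended into a simple per-cell breeding-risk score. It is purely illustrative: the feature names, weights and thresholds are assumptions invented for this example, not the Visualise No Malaria project's actual models, which are built on the EXASOL, Alteryx and Tableau stack described above.

```python
# Hypothetical sketch only: feature names, weights and thresholds below are
# invented for illustration and are not taken from the Visualise No Malaria project.
from dataclasses import dataclass


@dataclass
class CellFeatures:
    wetness_index: float  # topographic wetness derived from elevation/slope, normalised 0-1
    stream_power: float   # hydrological stream power, normalised 0-1
    rainfall_mm: float    # forecast rainfall for the coming weeks
    mean_temp_c: float    # forecast mean temperature in Celsius


def breeding_risk(cell: CellFeatures) -> float:
    """Return a 0-1 score for how likely a map cell is to host mosquito breeding sites."""
    # Standing water is more likely where terrain wetness is high and stream power is low.
    water_score = cell.wetness_index * (1.0 - 0.5 * cell.stream_power)
    # Rainfall contribution saturates: beyond ~100 mm, extra rain adds little further risk.
    rain_score = min(cell.rainfall_mm / 100.0, 1.0)
    # Assume mosquito development is fastest in a roughly 25-30 C band.
    temp_score = 1.0 if 25.0 <= cell.mean_temp_c <= 30.0 else 0.4
    # Weighted blend of the three signals (weights are illustrative assumptions).
    return round(0.5 * water_score + 0.3 * rain_score + 0.2 * temp_score, 3)


if __name__ == "__main__":
    cell = CellFeatures(wetness_index=0.8, stream_power=0.2, rainfall_mm=120.0, mean_temp_c=27.0)
    print(breeding_risk(cell))  # 0.86 -> a cell worth prioritising for nets, spraying or surveillance
```

In practice, weights like these would be fitted from historical case data rather than hand-set, but the overall structure – terrain-derived water likelihood combined with short-term weather – mirrors the relationship the article describes.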
Speeding ahead PATH has empowered the Zambian government to tackle malaria head-on, and the Visualise No Malaria project has provided the tools and know-how to address this health imperative. Through the work of Tableau, EXASOL, Alteryx, and the other technology partners of the Visualise No Malaria project, Zambian officials now have the confidence to use the data they have at their fingertips, which will allow them to make informed decisions and devise new strategies to stop the disease from spreading. In Africa, a child dies every two minutes from malaria. It is a preventable and curable disease, so with better intelligence provided by improved analytics, resources can be directed appropriately in a way that saves the most lives. It is a humbling display of what data can do in the right hands when combined with the best tools for the job. We hope that as this project proves itself, the benefits it brings will be leveraged by other nations in sub-Saharan Africa and beyond to help eliminate malaria. ### Matthew Finnie from Interoute LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Matthew Finnie from Interoute about announcing Interoute's Managed Container Platform, what containers are mainly used for and virtual data centres. ### Why Security Needs to Stick its Head in The Cloud It comes as no surprise that cloud has become one of the most talked-about topics in the industry given the pressure that CIOs and their teams are under to improve ROI and reduce TCO. According to analysts at IDC, public cloud spending is set to reach an incredible $122.5 billion this year, a 25% increase on 2016. Nine out of ten organisations are now using cloud-hosted apps and services, some likely without even knowing it, as they upload photographs onto their company Twitter page or log sales into Google Docs. The concept of the cloud was invented in the 1950s in the form of mainframe “time sharing”, which saw multiple users sharing mainframes for data access and CPU time. The cloud in its modern form then emerged at around the same time as the internet in the 1990s and was so named by Professor Ramnath Chellappa. In a paper that he presented in Dallas in 1997, the Professor quite aptly defined cloud computing as a “…computing paradigm where the boundaries of computing will be determined by economic rationale rather than technical limits alone." [easy-tweet tweet="60% of businesses claim to save money because of the cloud." hashtags="Cloud, Business"] According to a study conducted by the Cloud Industry Forum, it appears that Chellappa was spot on. Less reliance on data centre hardware and the advent of cloud-based subscription services have resulted in 60% of businesses now claiming to save money because of the cloud. In addition to helping organisations save money by purchasing less, the cloud also helps businesses save money by improving efficiency and access to a vast spectrum of shared resources. Maybe all these numbers won’t come as much of a surprise to IT pros. It is fair to say that along with the internet, the cloud has fundamentally changed how we work today. So, what’s the problem? Whilst the above statistics seem to portray the cloud as a widely embraced service, there remains some apprehension around what, when and how much data businesses should be migrating to the cloud. One of the biggest concerns is data privacy and security. 
Namely, some people don’t think the cloud is secure enough to be trusted with their data – and the thought of implementing their own security tools within the cloud is too much to handle. Duncan Brown, an analyst at IDC whose forte is data security, supports this argument by suggesting that there are three “P’s” holding people back from moving wholeheartedly to the cloud: Privacy, Perception (of the reputation of cloud service or cloud security service), and Paranoia. So, whilst the cloud hugely benefits organisations in terms of efficiency and flexibility, organisations can perceive security as lacking within the cloud, either bolted haphazardly on to strategies as an afterthought or worse, left out altogether. When considering the vast quantities of often incredibly sensitive data that are stored in the cloud, the lack of well-thought-out and integrated security plans is a worrying prospect, particularly with the growing security threat landscape which lives to seek out and exploit organisations’ vulnerabilities. [easy-tweet tweet="Why some people aren’t getting the most out of the cloud is the struggle with trust" hashtags="Cloud, Security "] Security and ITSM Uniting Within the Cloud A lingering reason why some people aren’t getting the most out of the cloud is the struggle with trust. However, in practice, we increasingly understand that data entering and leaving the cloud is subject to the same level of scrutiny as any other data entering or leaving the network, and in some cases subject to a higher standard. Critical network security such as firewall, asset management, application control, and AV are employed in combination to provide that level of scrutiny. An integrated ITSM and security solution is capable of improving operations for IT management and IT security, as well as for business efficiency as a whole. IT Service Management plays a vital role as it often represents an early warning mechanism capable of identifying incidents which can represent a potential threat that could impact a cloud-based service. If working properly as an extended team, the service desk can quickly alert the experts if a risk is identified with a local or a cloud-based service... In many cases an action must be taken, quickly deploying a patch for example, and this sequence is best performed by the service desk, operations and security teams working closely together.  Taking this a step further, the action can be requested through the Service Catalog which then employs automation tools and enforces audit and governance standards thus permitting the organisation to execute both quickly and in the best possible manner. In a unified and coordinated IT operating model, the service desk would help facilitate a culture of security awareness within the organisation – monitoring user behaviour and quickly identifying risky actions are just two examples of this growing awareness. Once organisations start better understanding how to best leverage Cloud based systems, and the inherent strength of a layered, unified service management and security service, any lingering fears around cloud usage will fade away and lead the way to further business efficiencies uniquely associated with the many benefits of Cloud.     ### YADA Launches Culture & Conversation: Let's Talk Mental Health Mental Health issues will be the leading cause of absence from the workplace in the UK this year, but we still don't talk about it enough. 
That’s why the yada events team have chosen ‘Let’s Talk Mental Health’ as the first topic for our Culture & Conversation event series. It will bring together London’s leading voices on Mental Health to broach the debate and support the Central London branch of The Samaritans charity. Chaired by TV behavioural expert Richard Daniel Curtis of The Mentoring School, we ask you to join us for an afternoon of discussion groups, networking and pop-up stores with guest facilitators: Claire Eastham (author of We’re All Mad Here and featured on ITV’s ‘This Morning’); George David Hodgson (fashion designer behind Maison De Choup, winner of the British Fashion Startup Awards 2016 and featured in Vanity Fair); and Clare Scivier (music agent and founder of the charity ‘Your Green Room’). There will also be guest visits from Mental Health campaigners Jonny Benjamin MBE and Neil Laybourn from the charity Heads Together. There will be four group discussions hosted by the facilitators: Mental Health in the Workplace, Mental Health and the Arts, Breaking the Taboo, and The Hidden Issue: Anxiety. These will give a comprehensive view of Mental Health as it intersects both the corporate and creative worlds of work. The afternoon sessions will culminate with all the groups joining together in a large round table discussion about all the content and issues raised. The evening will end with networking and a marketplace where you can get your hands on George’s designs and Claire’s book, with drinks provided by healthy brand Vivid Matcha, and live music to support the Sing For Samaritans campaign. There will be a minimum donation of £4 to attend, as that is the cost of a single call to The Samaritans (with all proceeds going to The Samaritans, so give what you can!), with registration taken on the yada Events app - available on the App Store and Google Play. [easy-tweet tweet="It’s shocking when you see the statistics of how many people are affected by mental health " hashtags="MentalHealth,Charity"] This is an amazing opportunity to cover the crucial topic of the moment: 1 in 4 people in the UK will experience a Mental Health problem each year. It’s an important cause and important to yada, so we want to raise as much awareness as possible: ‘It’s shocking when you see the statistics of how many people are actually affected by mental health. It’s starting to gain traction with the media, but there needs to be more attention given, so we had to try and do our bit and partner with The Samaritans.’ - Mehram Sumray-Roots, yada’s Product Director. Come and support the response to one of the largest and most hidden epidemics of our era, and learn and discuss what can be done, on April 19th, 1-6 PM, at 1 Primrose Street, London, or head to https://www.yada.live/ for more information. About yada Events yada is an innovative events platform that allows you to engage with your attendees from beginning to end. We offer complete event solutions from ticketing to the curation of content posted by your guests. With yada you can create bespoke event pages, capture every moment and share it with social media or our LiveView. ### Ian Pattison from IBM LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Ian Pattison from IBM about how they are trying to 'get conversation above technology'. IBM is trying to encourage trust, trust within companies and within data. Furthermore, the conversation developed into how Blockchain is all about sharing data and building trust between one another. 
### CEOs Seek Out Digital Transformation A Gartner survey of 2,600 CIOs worldwide, conducted in October 2016, found that they are already spending 18 percent of their budget in support of digitalisation. This number will increase to 28 percent by 2018. Given the scale of this opportunity and the impact it will have on businesses of all types and sizes, NetApp hosted a panel session as part of its UK Backup as a Service (BaaS) launch. The discussion, alongside partners Daisy and Node4, focused on how customers’ digitisation needs are driving change in how IT services are delivered. Here are the top findings from the event: Matt Watts, Director, Technology and Strategy EMEA, NetApp “We are seeing a pace of change that we have never seen before. Customers expect more from partners and vendors and we need to understand how we offer broader services that meet the needs of the market. Furthermore, data is now the only asset of lasting value for many companies. Data is the currency of the digital economy and NetApp is the company that manages and protects the world’s data.” [easy-tweet tweet="Backup and recovery have been among the top IT priorities " hashtags="Backup,IT"] Laurence James, Product, Alliances and Solutions Manager, NetApp “Backup is a matter of trust. Backup and recovery have been among the top IT priorities for as long as I can remember and NetApp is very strong in the backup and recovery space. Our partners have taken up our solutions and the technologies which surround them. As we see with Backup as a Service, NetApp is working together with its partners to build true customer value for businesses who are asking different questions of their IT infrastructure.” Nathan Marke, Chief Digital Officer, Daisy Group “Customers want to transform into digital businesses but find it hard to focus on innovation due to the complexity of the IT environments they run today. Demands for 24x7 operations, raised levels of cyber threats, the sheer pace of technology change and the need to do more with less all add to the day-to-day challenges of running legacy systems and holding onto good skills. Customers need partners who can help to free them up from this complexity by delivering the digital foundations essential for their business to succeed. Our NetApp powered DRaaS solution is a great example of a solution that removes complexity and allows our customers more time to unleash their creative talents on figuring out digital.” Steve Denby, Head of Solutions Sales, Node4 “The service-based delivery model removes a barrier for SMEs to becoming an enterprise. The digital agility through DevOps enables businesses to get products to market. This is where companies such as Uber and Airbnb have come from. We can now give smaller players the same tools as the multi-national corporations they are competing with, and we see what impact they can register on the market.” In summary, the way vendors and partners work with their customers is changing. Customers have traditionally been builders and operators, but they are now looking for service providers that will be their broker of service to successfully deliver on the promises of digital transformation. ### Michael Winterson from Equinix LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Michael Winterson from Equinix about who Equinix are, data centres, being able to access Europe, America and Asia. Michael also talks about how the network and data centre elements are going to start becoming standardised. 
### Information Builders Collaborate with Microsoft Azure Marketplace to Offer its Omni-Gen Enterprises buying Azure Marketplace PaaS applications can improve ROI for key business entities with a leading data platform Information Builders, a leader in business intelligence (BI) and analytics, data integrity, and integration solutions, today announced that it is working with Microsoft Azure Marketplace to offer its Omni-Gen™ Master Data Management (MDM) Edition on-demand on the Azure Cloud. The solution provides a single platform for generating applications that combine data integration, data quality, and master data management, in a fraction of the time such projects typically require. Many companies are making critical business decisions or developing important corporate strategies based on information that is outdated, incomplete, or incorrect. An MDM strategy, with the right supporting technologies in place, can change all that. Master data management software can help organisations enhance productivity and boost operational performance by improving information accuracy and data exchange within and beyond the enterprise. By offering its Omni-Gen MDM Edition via the cloud on the Azure Marketplace, Information Builders is bringing in-depth data profiling, broad-reaching data quality management, and data governance and stewardship capabilities to Azure’s robust user base in an on-demand fashion that will help organisations deploy these capabilities easily and effectively. Key Benefits Information Builders’ Omni-Gen MDM Edition, a unified platform for integrating, cleansing, and mastering information, is available to enterprises of every size – via the cloud, on-premises, or with hybrid cloud capabilities – through the Azure Marketplace. Omni-Gen MDM Edition represents the culmination of many advancements on the Omni-Gen platform – it includes the complete features and capabilities of the platform’s Data Quality and Integration Editions, plus new technologies that will help organisations rapidly create and efficiently maintain a single view of core entities among all information sources. The solution lets users gain a 360-degree view of master data across all functional domains, allowing them to shorten development times, improve ROI, optimise information integrity, easily identify and audit record data origination and remediated data, and more – all from the Azure framework. Organisations can assess and visualise data through dashboards and open presentation interfaces, enhance report accuracy, and gain a single view of a customer, allowing for better customer satisfaction. [easy-tweet tweet="Quality data that is properly managed can greatly support critical, large-scale decision-making" hashtags="Data, BI"] Gerald Cohen, president and CEO, Information Builders said: “Quality data that is properly managed can greatly support critical, large-scale decision-making. A master data management solution can enable this and much more. The ability for business users to efficiently integrate and master incoming data allows organisations to significantly speed up project timelines, thus dramatically increasing the value of their enterprise data. Information Builders’ technology was developed in a way that works well with the Microsoft Cloud platform. 
We hope those searching for PaaS options on the Azure Marketplace will take advantage of the ability to increase ROI while simultaneously improving data quality and management.” Access the Omni-Gen MDM Edition on the Microsoft Azure Marketplace today, or visit our website to learn more about the solution and its benefits. About Information Builders Information Builders provides solutions for business intelligence (BI), analytics, data integration, and data quality that help drive performance improvements, innovation, and value. Through one set of powerful products, we enable organisations to serve everyone – analysts, non-technical users, even partners, customers, and citizens – with better data and analytics. Our dedication to customer success is unmatched with thousands of organisations relying on us as their trusted partner. Founded in 1975, Information Builders is headquartered in New York, NY, with global offices, and remains one of the largest independent, privately held companies in the industry. Visit us at informationbuilders.com, follow us on Twitter at @infobldrs, like us on Facebook, and visit our LinkedIn page. ### Myths about Backup and Recovery and How to Bust Them It is widely believed that the cloud is intrinsically insecure and cannot be entrusted with business-critical or backup data. In fact, according to a recent IDC research, 39% of European decision-makers consider security to be a top-three inhibitor for cloud adoption. It is true that cloud-based data protection strategies need to be managed differently to on-premise backup and recovery. But that doesn’t mean that cloud backup should be discounted as risky: when it is implemented correctly, it can be a cost-effective, secure solution for many organisations. All that is needed to make sure your data is safe is the right approach. Some pre-conceived ideas about cloud vulnerabilities are just false; others can be overcome with an effective strategy. But first, let us get down to the real issue here – what are some of the biggest myths around cloud security you should be aware of? Myth 1: The cloud is not a secure option for backup and recovery It is natural to feel your data is safest when it is closest to you, the same way some people feel putting their savings under their mattress or in a safe at home is sounder than letting a bank handle them. However, keeping your data on-site is not always safer. Yes, floods and fires are rare, but what will happen to your data if there is an outage and all your backups are kept in the same location?  The most efficient way to protect your data is to follow the backup rule of three, better known as the 3-2-1 rule: having at least three copies of your data, two of which are local but on different mediums, and one that is always offsite. Cloud backup and recovery providers ensure the environments they offer are secure and offer the right level of performance. These include high-level certified encryption at the source, securing vital business data with certified data centres and well-vetted personnel, and ensuring data availability with remote virtual standby for emergency application failover and failback. [easy-tweet tweet="When tackling cloud backup, there isn’t a one-size-fits-all model" hashtags="Cloud, Backup"] Myth 2: One-size-fits-all works When tackling cloud backup, there isn’t a one-size-fits-all model. In fact, selecting an off-the-shelf model could certainly put your data at risk. 
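As an aside, here is what checking a set of backups against the 3-2-1 rule described above might look like in practice. This is a minimal, hypothetical sketch in Python – the copy records, media labels and the follows_3_2_1 helper are illustrative assumptions, not part of any backup vendor's product or API.

```python
from dataclasses import dataclass

@dataclass
class BackupCopy:
    location: str   # e.g. "primary-dc", "office-nas", "cloud-eu-west"
    medium: str     # e.g. "disk", "tape", "object-storage"
    offsite: bool   # is the copy stored away from the primary site?

def follows_3_2_1(copies: list) -> bool:
    """True if there are at least 3 copies, on at least 2 media, with at least 1 offsite."""
    enough_copies = len(copies) >= 3
    enough_media = len({c.medium for c in copies}) >= 2
    has_offsite = any(c.offsite for c in copies)
    return enough_copies and enough_media and has_offsite

# Two local copies on different media plus one cloud copy satisfies the rule.
copies = [
    BackupCopy("primary-dc", "disk", offsite=False),
    BackupCopy("office-nas", "tape", offsite=False),
    BackupCopy("cloud-eu-west", "object-storage", offsite=True),
]
print(follows_3_2_1(copies))  # True
```

Drop the cloud copy, or keep all three copies on disk, and the check fails – which is exactly the gap the rule is designed to expose.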
Every organisation’s requirements are different, and its data needs to be stored, backed up and recovered in a variety of ways. Before considering a cloud solution, you need to understand how your data works. Think about the volume of data to be protected, identify which data is business critical, how quickly you need to recover it (your RTO) and how long ago your last backup can have taken place (your RPO). In-house technical expertise will determine the level of support you need from your provider. Myth 3: Cloud management is not necessary Backup and recovery should never be left to chance, whether it’s on premise or cloud-based. You need a robust strategy that you test and review regularly. Your planning should be focused on business continuity, including testing, and should cover all eventualities. From a cloud perspective, managing cloud provisioning and consumption regarding computing and storage, for example, is critical – after all, it is your budget! You will also need strict controls over system access and clear roles and responsibilities for everyone involved. Practical steps Once you are confident that you are ready for cloud-based backup and recovery, there are extra steps to consider, regarding risk mitigation. [easy-tweet tweet="Cloud-based data protection doesn’t have to be intimidating." hashtags="Cloud, Data"] Get to know your internal data protection protocols and how to best pair them with a cloud backup and recovery service. Understand your data and applications, measure system and application criticality against direct and indirect variables, such as compliance requirements, productivity, visibility, performance indicators and business objectives; this will help you manage backup and recovery protocols. Testing is essential to understand your data recovery plan’s efficiency and accuracy. Make sure you test regularly and follow up any issues before they affect your recovery capability. Understand the economics and trade-offs of leveraging the cloud for your backup workloads – like anything in IT; it’s a numbers game here too. Cloud-based data protection doesn’t have to be intimidating. A reliable provider will make sure your data is backed up and ready to be restored at the touch of a button. Armed with the right knowledge of the technology available, and a good understanding of your data needs, you’ll be in the prime position to implement a robust backup and recovery solution that will effectively protect your data and your business.   ### Glenn Fitzgerald from Fujitsu LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Glenn Fitzgerald from Fujitsu about how they provide the mechanisms through 'pro-creation', the hybrid platforms and the support for the existing IT around and enabling that IT to support the new developments. The interview touched upon Fujitsu's offering of their K5 - a globally available public cloud solution, and also moving the conversation onto 'the biggest change to tech in the last decade.' ### Battle the Barriers and Build the Industry The future of AI and robotics is down to two things – adoption and adaption. It’s success in day to day life, lies equally with the broad adoption of these technologies by the broader public, as well as the adaption, to embracing something new. 
There is a continuous hum to be heard on the ‘rise of the machines’, which often sits hand in hand with fears around security; be it job security (the robots will steal our jobs!), cyber security (it’s Big Brother) or just security in one’s self (do I really need more help?). Ultimately, it is dismantling all of these fears that will drive the future of this tech. For us, and broadly within the industry, the UK is a particularly interesting market. For a well-developed, innovative and tech-heavy region, the adoption rate of day-to-day technology is markedly slow, particularly when it comes to AI and smart tech. The UK has the lowest penetration of dishwashers in Europe, something which I believe is linked to an underlying sense of pride. Using technology to do something that we’re perfectly capable of doing, such as washing our plates and sweeping our floors, is seen as slightly lazy and almost taboo. There is also this sense of trust, which is difficult to break; ‘if I do it myself, I know it’s done properly’. At Neato, we’re trying to break down these barriers and provide trustworthy smart technology that saves you time to do the things you love. [easy-tweet tweet="Neato's goal is providing the most reliable technology that consumers can trust" hashtags="Technology, AI"] Slowly but surely these ideologies are changing, with adoption gaining traction more rapidly over the last few years. This is arguably down to the refining of these technologies, a tactic which we think is the most effective in ensuring public satisfaction. We’ve seen tech in the home evolve from just devices, such as vacuums, to AI and smart home technology, such as Neato robot vacuums. As the technology develops, evolves and becomes more refined, the public becomes more trusting as malfunctions occur less and less frequently. Our fundamental goal at Neato is always providing the most reliable technology that consumers can trust, from our intelligent LaserSmart navigation and smart home connectivity to superior cleaning technologies. This is not only to benefit the brand but, as advocates for home robotics and artificial intelligence, also to develop consumer trust in AI and smart tech. Trustworthy tech equals future development! Recent trends across the nation have seen the UK adopting new ideals for how to live. For years there has been continuous research into finding the idyllic lifestyle, and on several occasions the Scandinavian nations have been praised as having the happiest way of life in the world. With this in mind, it’s no surprise that Hygge and Lagom are fast becoming the latest trends in the UK. Both styles embrace a clean, healthy, relaxed and ‘all in moderation’ lifestyle, which fundamentally fuels the adoption of AI and home tech and helps battle the taboo. Yes, you can sweep the floors and clean your own plates, but why not leave the vacuuming to your trusty Neato? You get the satisfaction of feeling like you’ve worked hard and cleaned, but with only half the work – all in moderation. [easy-tweet tweet="Yes, robots will fulfil certain aspects of life which may fall within the remit of a human job." hashtags="Robotics, AI"] As part of the wider conversation, not just specific to the UK market, there is also a lot of concern around job and cyber security, but really, they come hand in hand. Yes, robots will fulfil certain aspects of life which may fall within the remit of a human job. 
Neato will vacuum where a house cleaner also would, but that doesn’t mean humans in the workplace will become obsolete. There will always need to be an added element of human control, which will keep jobs available and create newer and better jobs – especially when considering the implementation of security alongside technology. As we expand our artificial intelligence horizons there will always be a demand for, and expansion of, security measures, from home security to malfunction security. Therefore, tech companies are continually staying open-minded, adapting to demand and moving fast. There are then multiple layers of testing before a product hits the shelves, and updates are continuously rolled out for devices. At Neato, we leverage the security skills of Heroku and Amazon Web Services (AWS) for cloud services and data storage to keep our customers’ data protected. We meet all industry standards, such as using 2048-bit RSA private keys and AES 128/256-bit encryption, to ensure we keep data safe in transfer and storage. Overall, we see it’s important to continue to fuel the AI and smart tech conversations in order to inform the public and tackle the scepticism. As long as we continue to break down consumer barriers to information and innovation, the benefits alone will see the industry boom even further. I predict 2017 will put us on the brink of a golden tech revolution. Neato Robotics designs robots for the home to improve consumers’ lives, driving innovation with intelligent laser navigation, smart home connectivity, and superior cleaning technologies. For more information, visit: www.neatorobotics.com ### Michele Borovac from Velostrata LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Michele Borovac from Velostrata about migration to the cloud and why Velostrata is different to other companies. Michele also talks about her personal journey to Velostrata. ### 2025: A Day in the Cloud It’s winter 2025 and London is unsurprisingly overcast, but you rise to a beautiful sunrise inside your bedroom, courtesy of your Lumie Bodyclock. Unfortunately, traffic is bad today and, having allowed Citymapper to override your alarm, you’ve been woken slightly earlier. Your alarm clock tells your kitchen’s smart devices to start making freshly brewed coffee downstairs. You glance at your health monitor, and it tells you that you’ve had a good sleep, and sends all your vital signs to your doctor for an upcoming check-up. After getting dressed, you tell your virtual assistant to turn off the radio and main power in your flat, which signals to a nearby driverless car that you’re ready to be picked up. On the ride to work your smartphone syncs with the car’s rider interface, and reminds you it’s your mum’s birthday in Singapore. With a busy day at work and given she’s eight hours ahead, the car sets up a video call so you can share a happy birthday wish on the way. Traffic is worse than anticipated but the good news is your office has been informed of your different route to work and is aware of the delay. Your colleague is already on the case with meetings being rescheduled and updated in real time. Simultaneously, your smartphone automatically gets the latest messages and updates from the company cloud, so you can get up to speed on the commute in – best to be prepared for any requests that have come in overnight from your team in the U.S. Once you get to the office your first meeting concerns a project you’re collaborating on with the German team. 
As you walk into the conference room, the glass partition reveals a perfect picture of the team sat in their own conference room in Berlin – it’s surprisingly sunny for January. Although you don’t speak German, voice recognition technology translates what is being said around the table in real-time. Both teams can see the project’s progress as well as share resources and documents as though you were in the same room, thanks to the company's cloud-based collaboration software. As the last person leaves the conference room, the lights fade and all devices switch to standby, reducing energy consumption. As it reaches late morning, your watch sends a reminder to get some exercise in before lunch and suggests a favourite class at a nearby yoga studio. You accept and your watch signs you up for the class instantly. However, as you are on the way to your class you run into a colleague and your watch suggests lunch at your favourite sushi restaurant instead. It’s not often you see this colleague so you take the opportunity to catch up. Your watch books a table for two and cancels your yoga class. After returning to the office you now finally have a chance to watch a video recorded last night by an American colleague giving you your next assignment. The U.S. office is currently short-staffed and is outsourcing much of its office work to employees around the world, using video collaboration and file sharing. Before you know it, it’s nearly time to leave. It’s cold so you decide to take advantage of the company’s carpooling fleet of driverless cars. As you make your way down to the basement, the cloud, taking into account your home address, assigns you to a car with colleagues from a similar part of town. This gives you time to check your fridge tracker so you can nip to the grocery store and pick up some fresh mozzarella for the pizza you have been dreaming about for the past hour. [easy-tweet tweet="We increasingly come into contact with the cloud - so much we don’t even notice it." hashtags="Cloud,Technology"] After a quick stop at the shops, at last, you’re home! A news bulletin has been compiled for when you switch on the TV and it starts playing as you sink into your heated sofa. Your watch gently reminds you that you missed your exercise target, and recommends a 30-minute virtual reality simulation of a hike along the coast of Hawaii. You are inspired because the simulation is based on personal data from a previous trip. The stunning scenery and the sound of the ocean waves are motivation for you to get off your couch. As you start cooking, the lights in your lounge begin to dim thanks to pre-programming, the heating feels cosy and you’re overwhelmed by fatigue – better get this pizza cooked quickly. You check on the weather and transport for tomorrow - traffic won’t be as bad and your alarm returns to its normal time. As you get ready for bed you close your blinds with a hand gesture. The room temperature and humidity automatically adjust to optimum levels for good sleep, based on your personal health data. You can rest easy knowing the cloud is preparing your day and will be there to greet you again in the morning. In our average day we increasingly come into contact with the cloud – so much that we don’t even notice it. It’s where our personal information is stored, our memories are archived and by 2025 the cloud is set to be the home of all enterprise data. As devices and businesses become more connected, the way we interact with the world will transform. 
In 2025, every day will be a day in the cloud.   ### Frank Jennings from Wallace LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talks to Frank Jennings from Wallace about the importance of the General Data Protection Regulation and how not enough organisations take it seriously. This interview touches on the ambiguity of GDPR, the uncertainty and what it actually comes down. Frank, further explains that Wallace will help those with GDPR concerns, especially now, due to Brexit. ### The Golden Rules of Successful SIEM Deployments Security Information and Event Management (SIEM) solutions are widely deployed to protect networks from internal and external threats. These solutions perform complex analysis on network data, including log data, to identify security issues, but these analyses can only be as good as the underlying data. The underlying principle of a SIEM is having a single pane of glass to look at all the relevant data about an enterprise’s security, making it easier to spot trends and see patterns that are out of the ordinary – as well as saving time for the security analysts. However, whilst SIEMs play a critical role in preventing or investigating breaches, the reality is that deployments rarely meet expectations. Here we examine how to optimise SIEM investments and the role of log management in improving incident response capabilities. Current issues Many organisations are faced with the challenge of fragmented or incomplete log collection. What happens if you are losing logs? What happens if your SIEM is overloaded by the amount of log data? Will your SIEM be able to detect security threats with incomplete data? Organisations need to address these issues in order to avoid unknown costs or over utilisation of the SIEM system. [easy-tweet tweet="IT departments are under enormous pressure to do ‘more with less’ " hashtags="IT, Data"] Nowadays, IT departments are under enormous pressure to do ‘more with less’ but even the most well prepared IT teams are overwhelmed with SIEM alerts and events. Organisations frequently make the mistake of feeding the SIEM every log and security event, only to find it inundated with data and alerts. In this case, SIEM simply adds to the noise rather than increasing the efficiency of the security team. How to solve these issues? Achieving SIEM optimisation By following best practices, organisations can significantly improve the performance of their SIEM for faster detection, response and investigation of potential threats and security risks. One key aspect of improving SIEMs is to optimise log management. Filtering out the irrelevant logs improves SIEM performance while also reducing the amount of log data feeding it – and less volume usually means cheaper licensing as well. Using granular policies based on log file types and compliance requirements, retention and detection can be achieved easily and reliably. Such a solution also produces higher quality, tamper-proof data, leading to increased confidence in analytics. Being certain that logs aren’t lost or haven’t been tampered with increases the integrity of SIEM data. By optimising their SIEMs, organisations could save up to 40% on their SIEM licensing costs per year.  Key practices By employing the following key practices, organisations can greatly improve SIEM performance: Choose your log management carefully: Choosing a log management tool with a wide platform and log source support – such as Syslog formats – is advisable. 
Having an ‘SIEM-feeding’ tool that processes and provides structured and unstructured data, as well as having transformation features like filtering, parsing, rewriting, classifying and enriching is recommended. This means only the most valuable information has to be forwarded. Compress your log messages: Compressing log messages reduces bandwidth consumption, resulting in a more stable operation and requiring less storage. Ultimately this also reduces costs. Do not lose log messages: To prevent lost messages look for features like buffering, failover destination support, message rate control and application-level acknowledgement. Pinpoint potential attacks: On average, a security professional has just 7 minutes per SIEM alert to determine its wider context. User Behaviour Analytics can pinpoint the riskiest security issues by comparing any suspicious activity to the baseline activity of the user in question. Integrating a SIEM with a Privileged Activity Monitoring solution will allow organisations to analyse the riskiest user activities in real-time to help prevent cyber-attacks and privileged account misuse. Accompany functionality with highly scalable and reliable performance Guarantee regulatory compliance: With the GDPR due to come into effect in 2018, together with standards such as PCI DSS and HIPAA already in place, it is important that any anonymization services and pseudonym generation are compliant with these regulations. Continual reassessment Optimising SIEMs through the implementation of a log management system means organisations will ensure data is being moved and stored safely and securely without the risk of lost messages or of overwhelming the security team, reducing potential threats and security risks. SIEMs are not “fire and forget”. It is essential that organisations continually re-assess their environment and adapt to the business as it changes over time.   ### Ed Owen from Zerto LIVE from Cloud Expo '17 #CEE17 Behind the Cloud Expo LIVE from Cloud Expo '17 talk to Ed Owen from Zerto about the advantages of using Zerto to help minimise the recovery time after people's software goes down. Ed also mentioned that they can replicate the customer's environment from their Microsoft HyperV to VMware and vice versa, allowing customers the ability to move if necessary. Furthermore, the conversation continued to explain the next steps for Zerto. ### Derek Cockerton from DellEMC LIVE from Cloud Expo '17 #CEE17 Behind the Cloud Expo LIVE from Cloud Expo '17 talk to Derek Cockerton from DellEMC about the upcoming movement of Azure Stack and how the features and functions of Azure in the cloud can be moved onto Premium Software to a customer's data centre. Furthermore, the discussion progressed into where they see the cloud is going next. ### Chris Hill from Barracuda LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Chris Hill from Barracuda about the companies focus on keeping each customers transition, to the public cloud, secure and easy. Also, they touched upon how IoT is causing different devices to be attacked and how Barracuda can help to decrease and manage these problems.   ### Andy Cureton from ECS Digital LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Andy Cureton from ECS Digital about the future transformation of new technology and the development of coding. 
### 8 tips to make RPA your best friend Robotic Process Automation (RPA) is more than just a ‘buzzword’, nor is it a futuristic fantasy of 10-armed robots performing multiple tasks at once. It is, in fact, according to Gartner, the fastest-growing segment of the global enterprise software market. In 2018, Gartner estimated that spending on RPA software exceeded $600 million, with $2.4 billion predicted by 2022. Transforming the way that mundane processes are carried out, it has been embraced across all sectors as it offers improved efficiency by reducing human error, especially when it comes to inputting and working with digital data. By carrying out the tasks that everyone hates to do, RPA frees up time for staff to concentrate on higher-value work, increasing business productivity. It also provides that necessary first step into digital transformation for even the most traditional of companies. Initially installed in factories to improve production through speed and greater reliability, the RPA process has now been adopted by many industries and sectors. Right now banks, insurance companies and utilities are embracing the benefits of RPA. These organisations are highly mature ‘organisationally’ and depend on defined processes to operate, many of which are repetitive. Unlike people, robots have no problem with performing the same task over and over again. A key advantage of implementing RPA is that it can increase productivity and therefore reduce costs. Researchers at the London School of Economics found that RPA in the energy sector delivered a 200% ROI where only 25% of the processes were automated. By extending RPA in this instance, productivity and returns could be increased further still. In the insurance sector, a McKinsey report found that one insurer achieved a significant productivity gain when RPA was used to carry out in 30 minutes a process which had previously taken two days to complete. It is these administrative tasks, including invoicing and reporting, where RPA can really help as a time-saving and cost-saving measure to transform how businesses operate. RPA will not work for every business. But if you’re at the ‘considering stage’ then these 8 tips might just help make the decision to adopt RPA clearer. Will RPA be your best friend for 2020, or simply remain just an acquaintance? Tips to successfully implement RPA at your company Define Your Business Goals First RPA cannot solve all business problems and there are certainly many factors to consider before you decide to incorporate it into your business. Begin by reviewing your current business needs and analysing whether RPA can contribute to achieving these goals. For example, if your company has a goal to increase profits by 5% for the next financial year, optimising the expense side of the business could make a significant contribution. RPA benefits include greater accuracy and speed, so it could be key to reducing costs and increasing profits. Assess Your Process Maturity Before Implementing RPA RPA delivers the best results in situations where you have highly mature and well-defined processes. There is little value in automating ineffective processes. To assess your process maturity, consider the following: do you have standard operating procedures (SOPs) written and used in the department? Do you track success or mistake rates in your process? For RPA to work effectively, operations need to be structured. 
Address Change Management and Personnel Concerns Directly Will the introduction of RPA have an impact on employment within the company, resulting in job changes or layoffs? If so, where possible ensure that managers look for retraining opportunities for staff who may be impacted by RPA. Employees who know your company’s values, expectations and needs are valuable, so it is beneficial to look for internal relocation opportunities where possible. If jobs are cut, then consider developing a programme to help those affected find new employment outside of your organisation. Use 80/20 Analysis to Focus your RPA Implementation When it comes to implementing your RPA project, applying the 80/20 concept to your project planning is a good place to start. Ask yourself which significant departments or functions are 80% driven by repetitive processes. Once you have shortlisted these departments, which may include finance, human resources and IT, ask your staff to identify the 20% of tasks that are the most repetitive. Then focus your RPA on those areas, e.g. administration or reporting. Establish Monitoring and Reporting to Detect Problems Although RPA reduces human input in the tasks carried out, it is still important to monitor the work undertaken, as completely hands-off automation is not here yet! To minimise unpleasant surprises, we recommend management review monthly reports focussing on both quantitative data points (e.g. number of tickets or issues solved) and qualitative elements (e.g. comments from customers and employees using the system). Manage the Implementation as a Project Bringing innovative technology to your organisation is a major challenge. Appointing a project manager with suitable experience and qualifications to oversee the implementation of RPA will allow any problems to be quickly identified and resolved, in turn reassuring stakeholders. Start With a Pilot Project Introducing RPA as part of your business transformation process will present challenges. To minimise disruption to the business as a whole, start with a small pilot project in one department, e.g. automate 1-2 processes in finance which can be closely monitored. Get External Support for your RPA Transformation To capture the benefits of RPA, it is best to seek external support for your project. Existing staff may have limited to no expertise to deal with the new technology being implemented. External support through training, consultation or full project management provided by experts like Ciklum will help ensure an easy transition to the adoption of RPA into your business. ### Mikko Hypponen from FSecure LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Mikko Hypponen from FSecure about how he improved Twitter's security, the privacy problems smartphones have and Bitcoin. ### Paul Fackrell from Fujitsu LIVE from Cloud Expo '17 #CEE17 Behind the Cloud Expo LIVE from Cloud Expo '17 talk to Paul Fackrell from Fujitsu about what Paul's up to with Fujitsu, the challenges MSPs are facing and demanding more from resellers. ### Richard Holmes from Arrow ECS LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Richard Holmes from Arrow ECS about Star Wars, what has grabbed his attention most at Cloud Expo this year and how this year's Expo has improved on the last. 
### Growth in Data Centre Market Drives Latest Expansion at RF Code Appointment of former Dell and Salesforce executives strengthen firm’s global reach to meet increased demand for solutions that reduce expenses while increasing data centre efficiency RF Code, the leading provider of real-time intelligence for data centres, announced on the 13th of April an expansion to its global operations with three executive appointments at its Austin, Texas headquarters. Following a year of rapid global growth within its subscription pricing model and channel programme, the new roles of Senior Vice President of Strategic Partnerships, Senior Vice President of Worldwide Sales, and Vice President of Marketing reflect the next stage of business growth for the data centre solutions provider. Jonathan Luce, who led RF Code to record revenues in 2016 and deployed RF Code’s subscription sales strategy, is now the Senior Vice President of Worldwide Strategic Partnerships. In this new role, Luce will be responsible for accelerating RF Code’s sales by establishing global strategic partnerships to merge RF Code’s data centre solutions with their product offerings. In concert with the executive team, he will also oversee business development and retention strategies, with a focus on continuing growth within the company’s subscription pricing offerings for data centres in both domestic and international markets. The expansion of worldwide sales efforts will be led by Christine Burke as Senior Vice President of Worldwide Sales for RF Code. Bringing more than 20 years of global software sales experience to the company, Burke is tasked with providing strategic sales direction to lead RF Code through this next growth phase. She will also oversee sales operations along with strategic account planning for U.S. and global customers. With a long history of success in the technology sector, Burke has previously held leadership positions at Anaplan, Salesforce.com, and CA Technologies, among others. [easy-tweet tweet="RF Code's growth has helped to cement its position as a global leader in data centre intelligence" hashtags="Data, Global"] The strategic focus on positioning RF Code’s offerings for market growth will be led by Rebecca Kan as Vice President of Marketing. Kan brings over 15 years of diverse marketing experience to her new position. Kan is tasked with developing and implementing an overall corporate marketing strategy to support RF Code’s data centre offerings, building awareness around the company’s innovative subscription pricing plans and translating the company’s business objectives into marketing strategies that drive revenue. A veteran of the technology space, Kan has led marketing teams at a variety of companies, including Dell and SolarWinds. “The recent growth that RF Code has achieved has helped to cement its position as a global leader in data centre intelligence,” said Ed Healy, CEO at RF Code. “I’m delighted to welcome these new additions to our leadership and look forward to implementing their strategies to continue building on our success, both domestically and abroad. 
With robust growth in RF Code’s subscription sales and channel programme, we are prioritising new strategic collaborations across the world while ensuring that existing partners continue to enjoy a high level of success.” ### Richard Sexton from CloudPhysics LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Richard Sexton from CloudPhysics about data analytics, security and customisation of the cloud. ### Rick Powles from Druva LIVE from Cloud Expo '17 #CEE17 Behind the Cloud Expo LIVE from Cloud Expo '17 talk to Rick Powles from Druva about how Druva is a data information management company in the cloud, GDPR and E-Discovery. ### What to Look for in a Unified Communications Solution The range of choice for businesses looking for a unified communications (UC) provider has never been greater. A few years ago, ‘unified communications’ was mainly restricted to sending voice as data and saving a small fortune on phone bills. Naturally, this remains a compelling feature of UC but the platforms available now do so much more. With all the choice on the market though this can create confusion for even the savviest of decision makers, so where to start in and what to look for? Three main options Broadly speaking, UC solutions fall into three main camps. Firstly, there are the more ‘traditional’ providers of UC which as stated, base their offerings around voice over IP (VoIP). Users simply make calls via a VoIP enabled desktop phone and the business makes considerable savings versus paying the likes of BT for landline minutes and by replacing legacy TDM lines into their offices with IP based SIP connections. It’s a familiar environment for the user and no extra training is required. Secondly, there are web-based offerings like Google Hangouts, WebEx, Citrix ‘go to meeting’ and most recently, Amazon Chime. Think of them as being like Apple FaceTime but for the enterprise. These are mainly geared towards pre-arranged meetings and users receive a diary invite via email which typically contains a web link and dial in number. Employees turn up at the appointed time and off they go. Using the web link usually means users require a headset to access voice functions but they’ll also get access to on-screen video and presentations. Being able to see and interact with colleagues and customers can make this an immersive experience. However, services like this are an ‘add on’ to what firms have already – they’ll still need to pay a telephony provider (no matter whether that be TDM or a VoIP provider) and maintain the associated equipment. [easy-tweet tweet="One of the key things to look for is ‘PSTN’ calling" hashtags="Cloud, Technology "] Finally, there are providers that combine the benefits of the first two categories and apply an even greater level of functionality using the cloud. One of the key things to look for is ‘PSTN’ calling. What this means is that a phone number is assigned to a user and enables them to make and receive phone calls via the UC platform. Effectively this means an end to the desktop phone and associated bills. Instead, all collaboration – whether this is virtual meetings, instant messages, phone, or video calls between colleagues – takes place via UC. The app it uses can be installed on whatever device the user chooses. Some solutions – such as Skype for Business – often come bundled with other useful business software too. 
In this case, this includes the full Microsoft Office suite, Yammer, 1TB of OneDrive storage, an Exchange mailbox and voicemail integration with Outlook – all for around £30 per user per month. Consolidate but integrate When making a choice it naturally comes down to the individual needs of a business. Some small businesses, for instance, may find that if they just want to try UC and see if it works for their employees that using the ‘free’ features provided by the likes of Google Hangouts might be a good starter. However, it is important to look at all the communications tools that a firm is already using and build a complete picture. Add up the separate conference services, mobile, fixed lines, and business software packages that the company uses. Separate services can increase costs and can also be inefficient and disjointed for employees to use. Firms might find that consolidating as much of this as possible into one can cut costs but also ensure their employees are equipped with new ways of working. However, in looking for consolidation, it’s equally important to seek integration. Cloud solutions are relatively new, and therefore in most cases, not everything comes ‘out of the box’ – features that firms have got used to such as call routeing, call queuing and voice recording often need third party solutions or expertise to work in the cloud. Similarly, most organisations still have some legacy technology such as fax and franking machines plus security and door entry systems which are based on analogue connectivity. Firms need to consider that non-IP system such as these will be around for a while yet and any UC plan needs to take account of this fact and devise workarounds to ensure everything is integrated as possible. [easy-tweet tweet="UC technology cannot be a ‘top down’ decision and user ‘buy in’ is important" hashtags="UC, Technology "] The importance of SLAs and training Two other business considerations remain – training and service level agreements. UC technology cannot be a ‘top down’ decision and user ‘buy in’ is important, so employees understand how the new technology benefits them and how to use it. This isn’t normally too difficult given the flexibility it can enable in their working life, but without training, it can’t just be assumed. Finally, firms need to remember service level agreements. In the traditional world of telephony, this was easier to understand – with the system usually based on the premise and SLA’s on PSTN connectivity where firms would be compensated if their supplier suffered a service outage. But in the cloud-based world where the Internet may be involved and many more suppliers in the end to end supply chain, companies need to understand what redress they have to that supplier in the event their vital UC solution doesn’t work as planned. So, there are plenty of considerations that any firm needs to take on board. However, with the amount of choice on the market means there will invariably be a UC solution (or solutions) which will suit their needs. ### First Collaborative Cloud- Based Platform for SME's Helping European SMEs to increase productivity and enhance competitiveness in a global framework On the 13th April, Atos, a global leader in digital transformation, leads the ‘Cloud Collaborative Manufacturing Networks’ (C2NET) consortium, launching the first collaborative Cloud-based platform for SMEs (Small and Medium Enterprises) in the manufacturing industry. 
This unique solution offers SMEs a single and affordable Cloud-supported platform, which has until now not existed. The project is funded by the European Union as part of the Horizon 2020 program and is made up of twenty partners including industrial entities, associations, research centres, ICT companies and public institutions. C2NET supports SMEs to optimise their manufacturing and logistic supply chains by reducing the complexity currently surrounding manufacturing management systems. It offers a platform on which products, processes and logistical data can be securely stored and shared in the Cloud. Business benefits include: -      Access to data analytics in real-time – enabling businesses to respond faster and more efficiently -      Ease of access to supply chain management information, wherever and whenever – from any mobile device (including PC, tablet, smartphone) -      An easy and secure system to collaborate and share best practice to increase productivity and enhance competitiveness The project covers: The creation of a data collection environment - the C2NET Data Collection Framework - for IoT-based continuous data capture from supply network resources. Collaboration Tools – to provide support to the collaborative processes of the supply network. Cloud Platform – to integrate the data module, the optimizers and the collaborative tools in the Cloud. The Optimizers – to optimise the manufacturing and logistics assets in the supply network [easy-tweet tweet="Atos actively participates in the design and implementation of the C2NET platform" hashtags="Atos, Digital "] In addition to leading the consortium, Atos actively participates in the design and implementation of the C2NET platform, coordinating the implementation of the pilot projects and designing the business models.  Concept demo The results of C2NET will be shown in real-life scenarios through four pilots that will allow supply network partners to share assets and information via the Cloud, collaborate on production and delivery plans across the supply network and provide decision-makers with data analytics in real-time. Four pilots have been developed: Automotive: To automate and encourage collaborative planning, data collection and sharing of information. Dermo-cosmetics: To develop more collaborative, agile and distributed planning processes. SMEs Industrial Park. To manage logistic flow and resources more efficiently. Hydraulic & Lubrication: To provide suppliers and customers with a centralised hub of information, accessible from anywhere, which enables forecasting, identifies supplier delays, and checks delivery status. C2NET consortium The C2NET consortium is led by Atos and consists of 19 European organisations and a third party. It includes: Atos, Universidad Politécnica de València, Armines (Mines Albi as third party), IK4-Ikerlan, UNINOVA, Instituto Tecnológico de Informática, Tampere University of Technology, Teknologian tutkimuskeskus VTT Oy, Linagora, Caja Magica, Faurecia, Novatec, Pierre Fabre, Interop-VLab, Fluidhouse, TecMinho, Flexefelina and Antonio Abreu Metalomecánica. This project is funded by the European Commission under the Framework Program H2020 with Grant Agreement No. 636909. 
For more information: http://c2net-project.eu ### Alistair Mackenzie from Predatar LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Alistair Mackenzie from Predatar about the new developments made within the company and how it has transitioned from reseller to become a managed service provider. ### Driving Salesforce implementation in weeks, not years Now more than ever, companies of all sizes are realising that in order to deliver superior, real-time service and offer the best experiences for their customers and employees, they need to ensure they are maximising the contribution made by the digital technologies at their disposal – cloud, mobile, social, and most importantly data. After all, in today’s competitive market, data is your customer and the most valuable asset in delivering exceptional customer moments. Business opportunity vs. business reality Deploying cloud CRM platforms, like Salesforce, is an obvious first step to uniting the data a business holds and extracting value from it. However, misconceptions about the risks and complexity of migrating to the cloud persist in companies of all sizes. For mid to large enterprises, concerns over lengthy deployment cycles loom large. These businesses often have legacy, on-premise and inflexible CRM systems that keep them ‘locked in’ to particular ways of working. Small and medium-sized businesses (SMBs) often encounter the challenge of ‘keeping the plates spinning’ for both business and customer demands, which can delay the development of new technology systems that offer a greater competitive edge. [easy-tweet tweet="For mid to large enterprises, concerns over lengthy deployment cycles loom large." hashtags="SMB, Salesforce"] As a result, implementation of new solutions often proceeds with caution, with significant time expended determining whether the new solution’s success can be guaranteed, if the transition can be achieved without interruption to the business, and what ROI will look like. The consulting landscape presents few options, particularly for SMBs. This often results in them implementing cloud solutions by themselves, based on limited experience, skills and expertise, or using their budgets to partner with smaller, potentially less experienced consultancies. Either route can result in delayed delivery of features and the failure to adopt industry best practice. So when looking to transform a business, the key questions for its leaders are: Do you attempt to implement a solution such as Salesforce yourselves, or partner with a consultancy? If you take the consultancy route, what is their experience, and do they know how to put your customer’s experience at the core of every engagement? Delivering rapid Salesforce implementation As Salesforce’s longest-standing consulting partner, we felt such questions and challenges couldn’t be ignored. Our newest service, Bluewolf Go, delivers rapid Salesforce implementation to get companies live on Salesforce in just 30 or 60 days. With frameworks based on more than 16 years of proven expertise on the Salesforce platform, Bluewolf Go can move customers through their digital transformation at a faster rate, achieving desired results at high tempo. Companies already familiar with Salesforce and with well-defined objectives for their solution are able to go live in 30 days, whilst those who are less familiar or are looking for a guided approach can go live in 60 days. 
Our smart delivery model–for SMBs, as well as enterprises with smaller, departmental needs–reduces direct client involvement, and ensures all decisions, timetables and expectations are clearly defined upfront so our client’s attention can remain focused on their business and outcomes. With fixed pricing, timeline, and deliverables, the balancing act between the needs of now and developing for the future then becomes a lot easier to manage. Questions about success and ROI are resolved, with risk kept to a minimum. This approach is already proven. The first companies we’ve taken through the Bluewolf Go process are already seeing huge benefits from implementation. For example, Bluewolf implemented Salesforce for Diener Precision Pumps, a Swiss manufacturer of precision gear and piston pumps, in just six weeks. Diener’s sales reps are now saving an average of 5-10 hours a week after shifting from manual data entry to automated reporting, and the company now has real-time access to all of its global order information, including shipments, pending orders, and scheduled deliveries. When we think about innovation at Bluewolf, we start with a business outcomes approach with the customer at the core. The benefits of insight and efficiency through deploying the latest cloud services can be delivered with certainty in weeks, rather than years, if a relentless focus on employee and consumer experience is applied through the process. ### Simon Porter from NGAHR LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Simon Porter from NGAHR about providing HR as a service, changing the skills companies have and how HR is one of the underinvested areas in business. ### Data Boomers, Hybrid Millennials and the Rise of Generation Cloud Be it our music library, photo albums or ‘hilarious’ cat videos, a lot has changed in how we manage our personal data over the past decade. Storing - and protecting - picture albums used to be an arduous task of transferring photos from a camera to computer before burning them onto CDs or external hard disks, and then buying more storage and repeating the process as space ran out.  Thankfully, today we don’t have to deal with the complex, painstaking process of years past; we snap a few photos on our phone and...voila...they’re automatically backed up and managed in the cloud. Any photo is easily and instantly accessible from any device, anywhere in the world. The cloud represents a new era liberating consumers and enterprises alike by delivering universal access and automation over their applications.  [easy-tweet tweet="The cloud represents a new era liberating consumers and enterprises alike" hashtags="Cloud, Enterprises"] Unfortunately, most businesses are slow to fully embrace the cloud world and still have to deal with the complexity of managing data that we, as consumers, have long forgotten. And if we thought we had it bad, managing business data is infinitely more complicated than a few photo albums. Businesses have databases that are constantly changing and teams need to access both current and past versions of their data - all whilst following stringent requirements around data security. And make no mistake, businesses are creating data at extreme rates. So much so that IDC predicts that worldwide data levels will reach a staggering 180 Zetabytes by 2025. Little wonder that enterprise data storage requirements are growing at 40% per year, according to Enterprise Strategy Group.  
Coming hand-in-hand with this exponential data boom is a new breed of hybrid cloud enterprises all looking to take advantage of the emerging cloud economy and promising to deliver greater agility, speed and performance without the added cost and complexity. However, many enterprises are unable to realise the promise of the cloud because of the limitations within their own infrastructure. They have spent millions on data centres, and now want to leverage the pay-as-you-use economics and automation of the cloud while maintaining control and visibility over their data. What they are missing is a killer application that bridges the gap between their owned systems and the cloud. All the while, data collection and analysis have become crucial for reaching business goals across industries and, as the role of data has transformed, so has the way in which it is stored, protected, and managed. So herein lies the challenge: how do enterprises democratise the public cloud whilst simultaneously managing data at scale across hybrid cloud architectures? Welcome to Cloud Data Management. "But what is cloud data management?" I hear you shout. To fully understand the solution, we first need to remember the problem. Backup and recovery, until very recently, was an extremely stilted market. It served as an insurance policy rather than a usable business asset. From a CIO's perspective, backup and recovery was an investment made in the hope that it would never need to be used. Think about it: enterprises would only see the return on their investment in the case of a disaster. [easy-tweet tweet="Cloud data management is about securely managing and orchestrating data to wherever it is needed" hashtags="Cloud, Data"] Cloud data management, however, redefines the backup and recovery market into a value-creating function. Cloud data management is about securely managing and orchestrating data to wherever it is needed, whenever. Imagine having all your data as easily available to an analytics application in the cloud as it is to your local test/dev team. This opens up a world of possibilities for CIOs to use their data to create value without the limitations of infrastructure. In a nutshell, cloud data management orchestrates mission-critical application data across private and public clouds while delivering data management functions such as backup, disaster recovery, archival, compliance, search, analytics, and copy data management in a single, run-anywhere platform. Built for 'Generation Cloud', it eliminates the complexity of legacy solutions with an automated policy engine that manages data throughout its lifecycle across all data management functions. This means abandoning the legacy architecture approaches that only bring incremental improvements to the table, and instead creating a new, ground-up solution that encompasses a scale-out design and convergence between software and hardware, allowing active data management capability and live online access to historical data. Ultimately, it helps businesses realise cloud economics. Unless they leverage an application that spans their data centre and the cloud while ensuring universal access, declarative automation, and security, it will always be a slow adoption path to the cloud. Cloud data management is here to empower businesses to run faster, smarter, and stronger. 
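To make the idea of a declarative, automated policy engine concrete, here is a minimal sketch in Python. It is purely illustrative and not any vendor's actual product: the policy name, storage tiers and thresholds are assumptions chosen for the example.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical declarative SLA policy; the names and thresholds are illustrative only.
@dataclass
class SlaPolicy:
    name: str
    keep_local: timedelta      # keep on fast local storage for this long
    archive_after: timedelta   # then move to a cheaper cloud/archive tier
    expire_after: timedelta    # finally delete once retention has been met

GOLD = SlaPolicy(
    name="gold",
    keep_local=timedelta(days=7),
    archive_after=timedelta(days=30),
    expire_after=timedelta(days=365),
)

def placement_for(copy_taken: datetime, policy: SlaPolicy, now: datetime) -> str:
    """Decide where a backup copy should live, based only on the declared policy."""
    age = now - copy_taken
    if age >= policy.expire_after:
        return "expire"          # retention met: remove everywhere
    if age >= policy.archive_after:
        return "cloud-archive"   # cold, cheap object storage
    if age >= policy.keep_local:
        return "cloud-standard"  # still restorable, but off the primary array
    return "local"               # recent copies stay close for fast recovery

if __name__ == "__main__":
    now = datetime(2017, 4, 1)
    for days_old in (1, 10, 90, 400):
        taken = now - timedelta(days=days_old)
        print(f"{days_old:>3} days old -> {placement_for(taken, GOLD, now)}")
```

The point of the sketch is the shape of the approach: the operator declares the outcome (how long data stays hot, when it moves to cheaper storage, when it expires) and the engine works out the individual copy, move and delete operations, rather than administrators scripting each one by hand.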
### 3W Infra Migrates IaaS Infrastructure to New Data Center in Amsterdam 3W Infra, a fast-growing global Infrastructure-as-a-Service (IaaS) provider that managed to grow its dedicated servers under management from 3,000 to 4,000 servers during the second half of 2016, announces the migration of its complete IaaS infrastructure to a newly commissioned data centre configured for high redundancy in Amsterdam, the Netherlands. The migration to this new data hall is part of an infrastructural upgrade to support 3W Infra's rapid customer growth. 3W Infra has migrated its IaaS infrastructure including dedicated servers and network infrastructure to a new data hall within Switch Datacenters' Amsterdam-based data centre complex, Switch AMS1. This new data centre features a highly energy-efficient design through the use of indirect adiabatic cooling technology and hot aisle containment (calculated pPUE = 1.04). Its efficiency and highly redundant 2N power configuration cater to the uptime needs of customers with demanding applications, including enterprises, cloud providers, gaming companies, and financials. Designed for flexibility and scalability, this new data centre hall in Switch AMS1 offers an extended capacity of about 400 data centre racks and enables 3W Infra to offer companies considerable growth perspective. Its scalable power modules, starting at 5 Ampere per cabinet and going up to 25 Ampere for high-density requirements (in scalable steps of 5 Ampere), are aimed at a wide range of customer types - from startups to enterprises and companies with demanding applications. 3W Infra expects to complete its phased data centre infrastructure migration at the end of April 2017. ISO 27001, 100% Uptime Guarantee "The 2N power configuration gives 3W Infra customers a robust 100% uptime guarantee instead of the 99.999% we had before," said Roy Premchand, Managing Director of 3W Infra. "The easily scalable power modules available onsite allow our clients to grow their power infrastructure in a cost-efficient way. They can start with 5 Ampere and grow their infrastructure with 5 Ampere each time they need to add more power for their equipment." [easy-tweet tweet="The robustness and high-security features will help 3W Infra meet ISO 27001 requirements" hashtags="Security, IaaS "] "Power, cabling, security…really all data centre infrastructure included in this newly built data hall is very robust," added Mr Premchand. "The robustness and high-security features will help us meet ISO 27001 requirements, as we're currently aiming to achieve ISO/IEC 27001 certification for Information Security Management." OCP Specifications The newly commissioned data hall in Amsterdam, Switch AMS1, is one of the first colocation data centres in Europe that's suitable for Open Rack Systems based on OCP principles. The Open Compute Project (OCP) originated with Facebook, and companies like Intel and Microsoft joined the OCP at an early stage. A variety of industry giants joined the OCP later on, including Google, Cisco, IBM, NetApp, Lenovo, Ericsson, and AT&T. This means that Switch AMS1 offers modern premises well suited to housing OCP-specified equipment based on open standards. Its infrastructural efficiency is also a good fit for 3W Infra's 'pure-play' IaaS business regarding the delivery of traditional, highly customised dedicated server technology, colocation services and IP Transit. Fast Growing Company The announcement follows the news that 3W Infra published the results of its latest server count. 
3W Infra now has 4,000 dedicated servers under management, 1,000 more than half a year ago. Although quite a young company (founded in 2014), 3W Infra has been able to show significant growth in the second half of 2016. 3W Infra’s jump-start growth would be triggered by the company’s ‘pure-play’ IaaS hosting approach where cloud delivery is left to customers. 3W Infra’s high-volume transit-only network with global reach and its ability to deliver highly customised IT infrastructures are also part of 3W Infra’s success in growing at such a fast pace, according to Mr Premchand. “We expect to continue our exponential growth rate, as we have quite some large sales opportunities in our sales pipeline,” added Mr Premchand. “A variety of potential customers has shown interest in the new data hall already, companies with extensive and complex infrastructures I must say, but they were waiting until we have finished our migration processes.” ### Alex Raistrick from Rubrik LIVE from Cloud Expo '17 #CEE17 Behind the Cloud Expo LIVE from Cloud Expo '17 talk to Alex Raistrick from Rubrik about the challenges people are facing with backup and restoring data and how Rubrik's cloud data management platform can help to protect, maintain and index individuals data within the cloud. ### Understanding the Technology Behind Virtual Data Rooms The Virtual Data Room (VDR) market is set to be a billion dollar business over the next couple of years. A report published by IBISWorld pegs the current industry revenue at $832 million with annual growth rates of around 13.7%. So what exactly are virtual data rooms and how do they work? Here’s a primer on the technology behind VDRs. What Is a VDR? A virtual data room is an online repository that enterprise businesses use to store highly sensitive and classified information online. They are not merely digital archives used to create backups of company documents. Instead, VDRs serve as an interface for businesses. Companies have the ability not only to store but also to share their business documents with third party stakeholders such as investors, attorneys, and shareholders. And they can do so without compromising confidentiality. Businesses primarily use VDRs during fundraising, IPOs and audits. [easy-tweet tweet=" The first component of a VDR is file storage in the cloud." hashtags="Cloud, Storage"] Secure File Storage in the Cloud The first component of a VDR is file storage in the cloud. At first glance, VDR storage might not seem any different from generic cloud hosting. The key difference here is data encryption and accessibility. Generic cloud hosting services are secure in the sense that any kind of data transmission to and from the server is encrypted. Encryption prevents unauthorised access to the information by third party hackers. But while the transmission lines are secure, the document itself is not. Anyone with a direct link to the hosted file might be able to access the information. And this possibility remains, irrespective of whether or not the data transfer to and from the server itself is encrypted. A VDR, on the other hand, encrypts the data transfer lines as well as the documents themselves. This way, businesses can ensure that in the event of successful third party intrusions, the hackers are left with nothing but gibberish data that cannot be comprehended without the right encryption key. Multi-Factor Authentication (MFA) Secure file storage is only one part of the story. 
File sharing can be tricky, given that it might not always be possible to track down the source of a leaked document. VDRs play a critical role in establishing the rules of accessibility of the hosted documents. There are two steps to ensuring the security of the hosted documents during file sharing. Once decrypted, the files are only rendered to users who can validate their identities with the help of a unique secondary authentication process. This second step could be anything from SMS-based OTP (one-time passwords) to RSA tokens and biometrics. However, with the U.S. National Institute of Standards and Technology (NIST) recently declaring SMS-based two-factor authentication unsafe, more and more VDRs are now moving towards alternative forms of validation. Digital Watermarking [easy-tweet tweet="Digital watermarking is the process of covertly embedding a marker in the video" hashtags="Digital, Video "] While multi-factor authentication (MFA) systems protect documents from unauthorised access, they still do not protect customers from illegal leaks. Protection from illegal leaks is made possible through watermarking. Digital watermarking is the process of covertly embedding a marker in video, audio or image-based data. You can then identify that data with the help of a secure algorithm. Digital watermarking enables you to trace every copy of a file you download or view from the VDR. It is, therefore, possible to accurately recognise the source of a leak. In essence, a virtual data room has four technological components: secure file transfer, encrypted storage, MFA-based document access, and digitally watermarked document access. Not all VDRs necessarily come with all of these features. A VDR's features depend on the requirements of the end customer and the level of confidentiality needed. Nevertheless, these properties make VDRs an attractive option for businesses looking for a safe and reliable way to manage sensitive information. ### Steve Strutt from IBM LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Steve Strutt from IBM about change and becoming the future of IBM. Steve also talks about focusing on enabling clients to adopt their platform. ### Four Ways to Achieve ROI from your Private Cloud Cloud has arrived. In fact, a 2016 Cloud Industry Forum survey found over four out of five UK organisations have already adopted at least one cloud service. It's no wonder: both public and private cloud offer incomparable flexibility and scalability for growing businesses. Data backup and recovery become more affordable in the cloud. And by sharing sensitive documents through the cloud rather than via email, teams can collaborate much more easily wherever they're based. However, the technology that promises to be cost-efficient, easy-to-use and tailored to your needs can sometimes have hidden costs. Private cloud is no exception. Without a robust strategy in place, and without a solid understanding of how your workloads have to perform, it's extremely difficult to make sure you get the optimum results. But with foresight and planning, it is possible to get the right cloud provision for your requirements and achieve a return on investment (ROI). Tip one: Make sure you know how to calculate your ROI An obvious tip – but many elements are often missed when calculating ROI. The outlay for your chosen solution is just one factor in the equation. 
Make sure you include all the elements involved: from the projected lifespan of the equipment to the cost of capital – how much you could expect to earn if you invested the funds somewhere else. You’ll also need to consider the operating costs: power consumption, and staff needed to implement the new solution and then manage it day-to-day. Remember that you might need to employ additional solutions for your platform to work optimally. All these costs should be included in your ROI measurement. Tip two: Innovate the IT stack to achieve a competitive advantage It’s easy to think that adopting cloud is innovation enough. That’s true to some extent, but much more can be done within the IT stack to make sure your cloud is performing at its best. [easy-tweet tweet="Be brave: think about adapting your cloud to work harder. " hashtags="Cloud, IT"] Innovating other parts of the business is second nature. But it can be daunting to start reworking the IT stack. Be brave: think about adapting your cloud to work harder. Move apps to the cloud and offer new services to customers. This approach will mean you’re already getting more from your cloud in terms of function and performance. Tip three: Implement alternative technologies Don’t be daunted by newer or “open” technologies: solutions like software-defined storage (SDS), Software-Defined Networking (SDN) and open source are well-suited to cloud and offer enormous benefits in the right environment. Those benefits can lead to competitive advantage and increasing profit margins. SDS can seem impenetrable – and no wonder – experts are still arguing over a firm definition of its meaning and many vendors make wild claims about the technology. However, in straightforward terms, SDS simply refers to a software-focused storage solution that runs alongside or completely independently of existing hardware. Flexibility and price are often quoted benefits but there are much more. SDS is scalable, so it can seamlessly grow with your business. It replicates data across clusters of servers, offering robust protection against outages. It eliminates silos and is designed to work across virtualised and bare metal infrastructures. All of which encourage your cloud to work harder. Open source is also well-matched when it comes to cloud. The software’s original source code is made freely available and can be shared and modified. Because it’s non-proprietary, open source tends to be cost-effective, and much more adaptable to your needs, both of which have a substantial impact on ROI. Tip four: Get practical [easy-tweet tweet="Often CIOs and IT teams are held back by the fear of impacting on performance" hashtags="IT, CIO"] There are many ways to find extra returns when developing your cloud infrastructure. Often CIOs and IT teams are held back by the fear of impacting on performance, of causing outages or worse, of losing business-critical data. If you and your team keep on top of new technologies, recognise how they will align with other elements within your infrastructure and understand their limitations, it will be much easier to intelligently integrate different tools and applications to help boost ROI. Try talking to industry experts and looking at what the new and more innovative vendors are doing. Why ROI? IT infrastructure is one of the highest, if not the highest, costs for many organisations. Cloud often promises to be a more cost-effective alternative, but some IT teams find that the price of off-the-shelf cloud from big vendors can skyrocket. 
It doesn’t have to be this way: with innovation and education, it’s possible not only to break even but to achieve ROI from your cloud. ### Hermes Trials Self-Driving Robots in London Hermes, the consumer delivery specialist, is launching an innovative testing programme for the use of self-driving delivery robots in London. In partnership with Starship Technologies, Hermes will soon trial a number of parcel collections in the London borough of Southwark. This latest trial follows on from a project that has seen Hermes Germany and Starship Technologies test parcel delivery by a robot in the Ottensen, Volksdorf and Grindel suburbs of Hamburg. Starting last August, three robots were deployed on the streets of the German city. Hermes will use the testing period in the UK to better understand how the robots could enhance the company’s ability to offer an increased range of on-demand solutions in the future, as part of its ongoing commitment to providing value-added services. Initially, the trial will allow the delivery firm to offer limited thirty-minute time slots for the collection of parcels, either for items being returned to retailers or for items being sent by small businesses or consumers via myHermes. Moving forward, the robots could offer Hermes greater scheduling and tracking capabilities. [easy-tweet tweet="The self-driving delivery robots offer a viable alternative to drones" hashtags="Robotics, Hermes"] The self-driving delivery robots offer a viable alternative to drones, especially in highly developed cities, towns and suburbs where strict aviation laws are in constant operation. Each vehicle is 55cm high by 70cm long and incorporates a secured compartment where parcels with a maximum weight of 10KG can be transported, accessible to consumers via a link generated by a smartphone app. They have six wheels and can travel at speeds up to 4mph per hour. The robots can be used within a 2-mile radius from a control centre, where the vehicles are loaded and charged. The aim is for the robots to be 99% autonomous in the future, and can always be connected to a human operator via the internet and GPS. In the future, one operator can monitor several robots at the same time and can also take control of the robots if required. Carole Woodhead, CEO of Hermes, said: “Starship Technologies is a highly innovative and pioneering firm. We are extremely pleased to utilise their expertise to explore exciting new ways that will further strengthen our portfolio of services and offer greater choice and convenience for customers. We can already see first-hand the success they’ve had with food deliveries in London, and we are excited to team up with them in a bid to revolutionise the home delivery marketplace.” ### G3 Comms Pioneers Luminet Portal  G3 Comms speeds connectivity install times with Luminet partnership Luminet, a leading UK-based intelligent managed services provider, today announces that G3 Comms is expediting its customer connectivity times with Luminet’s flagship product “Fibre-Air” – a wireless 1GB clear channel business Internet service that can be installed in as little as 10 working days. G3 is a leading systems integrator and provider of network and unified communications solutions – it serves its enterprise customers with global requirements.  It also has an indirect business, Genius Networks, providing bespoke global connectivity services exclusively via approved reseller partners. 
G3 routinely helps enterprise customers with their connectivity services as they relocate and expand their business premises; these services are typically EFM (Ethernet First Mile, over copper) or Ethernet circuits using fibre leased lines. However, the minimum lead times for services are typically at least 50-60 days, and these can be lengthened considerably by the added, frequent complication of wayleave agreements to complete necessary fibre digs and other civil works. [easy-tweet tweet="Luminet's Fibre-Air stood out because of its lead times and resilience." hashtags="Luminet, Digital "] James Jeffs, Director at G3 Comms: "There are always a lot of ifs, buts and maybes around ordering fibre leased lines, which makes it very hard to manage customer expectations - it will take a month before an operator like BT will even conduct the initial site survey, the result of which might show the service can't be delivered and puts you back to square one. Customers have their own timescales for moving, and these delays make hitting them a nightmare. Luminet's Fibre-Air stood out because of its lead times and resilience." G3 Comms is also the first of Luminet's channel partners to use its new online portal, which fully automates all sales processes. Luminet has integrated its secure, automated service portal with the G3/Genius online portal via APIs, to support 24/7 ordering and extend the visibility of service updates and other key data. James Jeffs, Director at G3 Comms, continues: "The Luminet portal is such a slicker and easier way of managing our billing and ordering with an operator than sending emails back and forth or constantly checking up with account managers on the phone. These automated, self-service capabilities make a huge difference for our Genius partners too and eliminate the need for us to intervene to communicate missing information such as site survey results. With Fibre-Air available via the Luminet portal, both G3 and our Genius partners get fast access to more value-added OTT revenues." Sasha Williamson, CEO of Luminet, adds: "Our Fibre-Air proposition has gone down a storm in the channel market – it is solving a very real issue that is a regular frustration for business leaders who have previously been hampered in office moves by the time it takes to arrange connectivity – G3 has doubled its orders since it started reselling the product. However, Fibre-Air is not just a temporary fix but an essential primary or backup connection – plus it is very competitively priced against the fibre-based alternatives over 12, 24 and 36-month terms. Fibre-Air, added to our industry-leading range of computing and connectivity services that are now readily available on our new portal, makes for a very compelling proposition - the new portal makes it simpler, easier and faster to quote, define technical specs, order and reorder services, as well as providing better value and increased margins." About Luminet Luminet is the intelligent managed services provider, offering businesses Connectivity, Computing and Intelligent services and solutions. Luminet has a peer-to-peer relationship with its customers and has the UK's leading NPS customer satisfaction score of 42 in telecoms and technology. Combining a true customer-centric organisation with strong values, culture and capability together with technical excellence and best value makes Luminet a partner of choice for business, supported by industry-leading SLAs, and ISO 9001 and ISO 27001 certifications. 
### Alex Cruz Farmer from NSFocus LIVE from Cloud Expo '17 #CEE17 Behind the Cloud Expo LIVE from Cloud Expo '17 talk to Alex Cruz Farmer from NSFocus about the impact of DDoS attacks and how the NSFocus cloud platform will help service providers protect their infrastructure through these attacks. ### Data Security Challenges in Health Technology Dealing with medical data is a very delicate process, and the consequences of error are potentially very severe. There is nothing more valuable than our health, and that should underpin everything we do. I see huge potential for disruption in the medical sector based on various innovations in technology, and that made me want to move out of fintech to focus on these new and exciting challenges. Financial institutions have been the pioneers in compliance practices, which have helped to reduce the potential risks to individuals in cases of major data leaks. These practices include, but are not limited to, anti-fraud technologies and practical insurance policies. Financial data leaks are serious, but it is just money; health data leaks can have far more serious consequences. Healthcare data reveals very detailed information about us, and losing control over this may lead to problems in all areas of our lives. These challenges make me feel very privileged to work on healthcare data security as CTO of doctify.co.uk. Common challenges If you want to ensure proper data handling, make sure you don't fall foul of these common errors: Team structure Inappropriate level of permission given It's important to carefully define the roles that are required for accessing data in your organisation and clearly identify the permissions each role has on the data. Having clearly defined roles and a list of users who have certain roles makes it easy to periodically audit permissions, especially when a team member leaves. Lack of detailed data access audit [easy-tweet tweet="Individual members of the organisation are the easiest target for cyber-criminals." hashtags="Cyber-criminals, Cloud "] Even assuming your roles, permissions and ACLs are set correctly, it is still important to audit your data access. Your data storage solutions need to allow you to review users who are requesting excessive data, as that could be the initial sign of a breach in your organisation. Poor password policy Individual members of the organisation are the easiest target for cyber-criminals. Usually, the weakest link is the use of the same password across multiple applications. As an organisation, you need to monitor the quality of passwords and make sure they are not being reused. Your policy needs to enforce regular password changes. Lack of U2F (Universal 2nd Factor) usage Introducing two-factor authentication into your organisation reduces the likelihood of exploitation via phishing attacks. With two-factor authentication, authorisation does not depend solely on passwords. Poor training Technologies keep evolving, but users have to evolve too. Make sure your team is up to date with recent threats, and that they know exactly whom to contact when they are suspicious about something. Encryption Lack of encryption of data stored on servers Usually, there is a good level of protection when it comes to accessing data servers, but every organisation needs to look into solutions to minimise damage in the event of servers being compromised. 
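To illustrate the point about encrypting stored data, here is a minimal sketch using the open-source Python cryptography library (Fernet symmetric encryption). It illustrates the principle rather than recommending any particular product, and the record contents and key handling are simplified assumptions: in practice the key would live in a key management service or HSM, never alongside the data it protects.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Assumption for the sketch: in a real deployment this key comes from a KMS/HSM,
# not from code, and never from the same database that stores the encrypted records.
key = Fernet.generate_key()
fernet = Fernet(key)

def store_record(plaintext: str) -> bytes:
    """Encrypt a sensitive record before it is written to the data store."""
    return fernet.encrypt(plaintext.encode("utf-8"))

def read_record(ciphertext: bytes) -> str:
    """Decrypt a record for an authorised caller."""
    return fernet.decrypt(ciphertext).decode("utf-8")

if __name__ == "__main__":
    stored = store_record("patient: J. Doe, allergy: penicillin")
    print(stored)              # opaque ciphertext: of little use if the server is breached
    print(read_record(stored))
```

With application-level encryption like this, a compromised database or stolen disk yields only ciphertext; the attacker still needs the key, which is held elsewhere.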
Unencrypted internal communications SSL is commonly used for communication with your mail servers, but as soon as an individual machine is compromised, any attacker has plain access to the whole communication. Your company needs to encrypt emails using solutions like PGP or S/MIME. [easy-tweet tweet="All common platforms like iOS, Android or Blackberry have very good provisioning models" hashtags="iOS, Android"] Mobile devices The company needs a clear mobile devices policy. All common platforms like iOS, Android or Blackberry have very good provisioning models, allowing you to exercise fine control over permissions on the devices. If you do allow access to your company data through personal phones, you should ensure that this can be done only via U2F devices, but ideally, any such access should be limited. These rules apply equally to laptops. Lack of encrypted backups An institution's backup process is usually the weakest point with regard to data security - it's easy to implement the process wrongly. Firstly, data needs to be encrypted, and, secondly, you need to split responsibility between two people: one person holding the key to encrypt the data, the other holding the key to decrypt the data. Excessive data Storing more data than needed Organisations tend to have an appetite for storing more data than they need for their processes. It is a difficult process, but as part of your roles, permissions and encryption structure, you need to be prepared for the possibility that all of the above can fail. Therefore, your last line of defence is for minimal data to be accessible at any single point. Make sure that your data is as anonymised as possible given your operational requirements. ### Ulrike Eder from drie LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Ulrike Eder from drie about who they are, cloud security and how most cloud products out there are more secure than your own on-premise solutions. ### 4 Important Steps to VM Migrations Considered! Cloud migrations can be tricky when looking at the multitude of options for moving workloads from point A to point B. I don't think there is such a thing as a "typical" environment or technical setup; if there were, it would be easy, right? However, before you migrate your IT crown jewels (data and VMs) to someone else's IT infrastructure, there are 4 points to consider back at the planning stage that will affect your overall decision. Security When moving your data, it's worth considering the security of the mechanisms which you use to migrate. In the days before security was taken seriously, USB drives were used extensively (some still use them now), but the security is very lax. These days, VPNs and encryption can solve security for data in flight over a network. However, if using an agent-based system for migration, security holes can be created when antivirus must be disabled or ports must be opened. Testing In an ideal world, you would scope out your requirements and test with a new provider (or even several of them) before you make the leap. Will your new provider give you access to a sandbox environment to test on prior to migration? Will they also give you the appropriate time and resources to assist? Testing can be very resource-intensive to complete, and it's usually associated with additional costs. Further, replication of data must happen before testing can begin, making it impractical to do real-world testing of any production applications – especially those with a lot of data. 
Deployment If you are replicating VMs to the cloud (who wouldn't?), you can deploy the client agent remotely using a handful of different methods, but this is always a cumbersome task. Having been in that situation many times, I always find myself wondering, "Will it? Won't it, maybe?" and then find that I'd have to start the data copy again due to corruption. If you're an organisation that wants to move hundreds of applications to the cloud, consider the scalability of your migration solution. [easy-tweet tweet="Internal resources can easily equate to the same price as using an outside firm." hashtags="Security, Resources"] Time It's hard to put a value on the time cloud migrations take (except when an external consultant or firm gives you a bill). However, organisations with internal chargeback mechanisms in place are finding that internal resources can easily equate to the same price as using an outside firm. As you plan your migration timeline, compare how much time each migration solution you are considering takes 'per server'. If you multiply your 'per server' time by the number of machines you need to move, you can figure out how many human resources you'll need, and an overall cost based on their per-hour time. All in all, migrating your virtual environment to the public cloud (with an infrastructure that is likely different from that in your data centre or colo facility) requires significant planning and coordination – even before you get to the actual process of moving workloads. From there, do what you can to make your life easier. Installing "stuff" to get the replication started/completed is a headache. This, coupled with the additional skills and resources you'll need, will add cost (let alone the challenges when you try to migrate back if you find that the provider of choice isn't right for you). When reviewing Velostrata and their agentless migration approach, it seemed to tick every box to hurdle over these 4 main points. Their migration toolset is agentless, a big tick in my box. They provide an easy way to 'test before you migrate' without having to move all the data first, which allows you to ensure you scale correctly before you push the magic button to migrate. But don't just take my word for it: Gartner awarded them a "cool vendor" rating in 2016 and I can see why. Take a look at their 3-minute video that explains in simple terms how their software works, probably better than I can within this blog. ### Real-Time Text Recognition Promises to Accelerate the Digital Journey New 'point and capture' toolkit empowers instant use of text-based information viewed through smartphones and tablets ABBYY, a leading provider of technologies and solutions to action information, today introduced its new Real-Time Recognition technology, enabling 'instant' text extraction from the preview screen of mobile devices. Real-Time Recognition based on ABBYY technology enables faster, simpler extraction of text from documents and objects placed in the live video stream from smartphone and mobile device cameras. Data extracted using Real-Time Recognition can be used in a variety of mobile applications ranging from highly responsive customer services to automated enterprise processes and consumer apps. Applications based on the ABBYY Real-Time Recognition SDK (software development kit) can pull text from on-screen objects and automatically convert it into digital data. The conversion takes place directly on the mobile device and within the mobile application. 
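As a rough, generic illustration of the underlying idea (turning pixels captured by a camera into machine-readable text), the sketch below runs the open-source Tesseract OCR engine, via pytesseract, over a single saved frame. This is not ABBYY's SDK and does not reproduce its on-device, live-preview recognition; the file name is a placeholder.

```python
from PIL import Image      # pip install pillow
import pytesseract         # pip install pytesseract (requires the Tesseract binary)

def extract_text(frame_path: str, language: str = "eng") -> str:
    """Run OCR over one captured frame and return the recognised text."""
    image = Image.open(frame_path)
    return pytesseract.image_to_string(image, lang=language)

if __name__ == "__main__":
    # "frame.png" is a placeholder for a frame grabbed from a camera preview.
    print(extract_text("frame.png"))
```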
The technology adds a new level of convenience to mobile apps by replacing tedious manual data entry. Once digitised, data can be easily processed by the application or even sent to cloud-based systems for further handling. On-screen images are processed in real-time, eliminating the need to take a photo and save it in order to process the data and then access the information. This makes the data entry process simpler and more intuitive for app users. To meet the increasing demand for more responsive, ‘anywhere, anytime’ services in today’s fast-paced digital environment, enterprises can develop real-time recognition-enabled customer services and products to speed up data acquisition and processing times. As part of customer onboarding processes, such as opening an account or loan application, information from ID cards, salary payslips and other documents can be captured within seconds and then transferred directly into online forms or apps for new customer sign-ups. For customer self-services provided by banks or insurance companies, mobile apps can quickly extract and use transactional information such as bank transfer details (IBAN, etc.) or policy numbers in claim forms, to make data entry processes more efficient. As no photos are stored on the device, the technology is well-suited for use in processes that must comply with data security and data privacy standards. [easy-tweet tweet="Real-Time Recognition can help transform data into outstanding customer experiences." hashtags="Data, Technology "] Software vendors developing consumer apps can easily add new functionality and value for their users. For example, travel apps can offer fast translation assistance when images captured on preview screens can be quickly translated using mobile dictionaries. Text reading apps for those with reading and learning disabilities can be enhanced by ‘live text input’ into their text-to-speech components. “Real-Time Recognition can help transform data into outstanding customer experiences,” explains Dr Rainer Pausch, product group head, SDK products at ABBYY Europe. “Simply point your device’s camera at text and ‘lift out’ what you can see live on the display. This provides an extremely intuitive and accurate way to capture information printed on virtually any background. This means it’s ready for instant use, on-device response and no more typing. We see great value with this new ABBYY technology and strongly believe that it can be a real value-add to a vast range of mobile apps”. The new Real-Time Recognition SDK complements ABBYY’s comprehensive portfolio of tools that simplify capture, digitisation, and extraction of data. ABBYY’s offering ranges from mobile-based information capture and OCR processing to enterprise-level automated document processing and data extraction. Specifications and Availability  ABBYY Real-Time Recognition SDK is available for integration into new or existing applications for iOS and Android. The ABBYY developer toolkit supports easy integration and offers code samples and ‘quick start’ guides to help developers get started. The SDK supports capture of texts in 63 languages. For more information on the ABBYY Real-Time Recognition SDK or to request a trial for developers, please visit the ABBYY website (www.abbyy.com/en/real-time-recognition-sdk). ### Busting the Myths Around the Cloud With more than 90% of businesses using cloud technology in some way or another, it would be right to say that it has quickly entered the mainstream as businesses realise its benefits. 
However, for many organisations implementing it is no easy ride, and for those not using it, the prospect of cloud can be daunting. Their hesitations are threefold. Firstly, it takes a lot of time investment to educate decision makers in the business on why they should use their resources on investing in new infrastructure when legacy systems 'do the job'. Secondly, recent high-profile security breaches have brought data security to the attention of people even outside of the IT department, meaning they hook on to the perceived risks of the cloud without considering the full strategy. Thirdly, there's a lack of awareness of the different offerings available, meaning some organisations are even getting ahead of themselves and implementing cloud-only environments believing it is the best solution. As the cloud continues to move into the mainstream, it is time to acknowledge and bust the various myths surrounding it. [easy-tweet tweet="By 2020, an estimated 24% of IT will be cloud-based." hashtags="Cloud, IT"] Myth 1: It's all or nothing when deciding to take our business into cloud By 2020, an estimated 24% of IT will be cloud-based. Forward-thinking businesses have begun to take note of this trend and have assumed that an 'all or nothing' approach is the best way to get ahead of the game. However, an organisation first needs to consider its business objectives and the technology available to gain a clear understanding of what kind of solution it requires to help it reach its goals. If the organisation is fairly static, migrating data into the cloud might not be the best solution for its business. Alternatively, if hosting a versatile, agile and completely mobile workforce is the aim, then moving more of the company's data into the cloud rather than hosting it on-premise could be the right way forward. While the cloud can be secure when understood and managed properly, businesses must ensure they follow a structured approach to their IT infrastructure. Such an approach involves various steps: exploring the options that are available to them, crafting and planning a model that aligns to their specific requirements and, finally, delivering end-user training so the solution can be used to its maximum potential. Myth 2: Purchasing 'off the shelf' IT for my business is the best approach [easy-tweet tweet="Technology is continually advancing with new services and devices " hashtags="Technology, Services"] One of the main benefits of cloud computing for businesses is the increased efficiency it brings, as services that would usually take months to deploy can be ready to use in minutes. Technology is continually advancing, with new services and devices being introduced to the market every day, and it's human nature to be excited when the latest gadgets are introduced. However, each solution and business is unique, and a one-solution-fits-all approach does not translate into an effective cloud strategy that will deliver on investment. Therefore, the volume of data your organisation needs to retain, the scale of your organisation and your budgets are all factors to be considered when purchasing dedicated cloud solutions. What's more, organisations will need to spend time thoroughly reviewing all prospective IT in line with the future objectives and goals of the business. Myth 3: Cloud only benefits big organisations Today, organisations across all sectors, big and small, can experience a multitude of benefits from using cloud offerings. 
One of the biggest benefits for SMBs, in particular, is the significant savings the cloud can provide, particularly in helping organisations lower operational costs. Cloud technology means better flexibility, as files can be accessed from any device, at any time and in any place. This means SMBs can cater to more flexible working arrangements and even BYOD. However, where cloud really pays off for SMBs is in allowing their businesses to grow by giving them space to test new products, while only paying for what they use. With a pay-as-you-use cloud model that allows new technologies to be easily integrated into existing IT infrastructure, businesses are able to be more agile, innovate faster and reduce their reliance on costly legacy systems. With the right management and understanding of the cloud, organisations big or small across all sectors can choose to find the solution that works best for their culture and business goals. ### Digital Transformation for legacy IT – Is it possible? Already, I feel as if the term "Digital Transformation" has been overused. However, it may be because I am embedded in the digital industry and not an outsider to the phrase. A lot happens in an internet minute – 450,000 tweets get sent, over $750,000 gets spent online, 3,500,000 Google searches happen, 4,100,000 videos are viewed on YouTube, and nearly 1,000,000 swipes occur on Tinder! The digital revolution has happened, and if you haven't considered how you are going to survive or continue to compete in the age of convenience shopping, viewing, searching and swiping, then now is the time! With a new generation of technology consumers growing up and demanding what they like to use, your firm has to stay relevant, and even your legacy IT systems cannot be left untouched simply because they seemed too risky or too difficult to transform. [easy-tweet tweet="Legacy technology needs to evolve and be part of the brave new world of the digital revolution" hashtags="Digital, Technology "] As I have mentioned many times before, legacy technology needs to evolve and be part of the brave new world of the digital revolution. Some of the largest firms in the world have gone out of business because of lack of foresight or even greed. Now, you may be thinking "that doesn't affect me, my business isn't that dependent on such a culture change", but let me tell you, it does! If you are in business, your current and future prospects are more part of that change than you are, and they use more up-to-date technology than you do, in the form of social media platforms (and I'm not talking about using Instagram to send a pic of my latté to my friends). Your workforce would be happier if you transform (ask anyone under the age of 30), your clients will see you in a different light, and your competitors will see you as another horse in the race for business and not a laggard. Digitally transforming is not just about being visible across various platforms; it is about transforming your IT estate to make it easier to use, easier to support internally and easier for your clients to approach you, with the added benefit that new potential business models can be born out of the transformation. I happen to like the Flynet story in a big way, as they have taken one of the oldest elephants in the room and made it sexy. 
Their software services are so easy to use and implement, which makes you think "why didn't I think of this before?" No change needs to be made to legacy mainframe kit (although the infrastructure isn't really legacy anymore), and a simple HTTP service in front of the infrastructure enables your green screen app to be present on any device, anywhere, within 15 minutes of install. It was a pleasure interviewing Rhiannon Dakin, Marketing Manager at Flynet - please take a look at her video for her take on how Flynet adds value to legacy environments. ### Places of Worship are Turning to Technology to Host More Events Priava, the leading cloud-based venue management software company, announced on 7th April 2017 that it is seeing increased demand from churches, cathedrals and other religious organisations that are looking for technology to help optimise the revenues they can earn from hiring out their impressive buildings and spaces for events. EMEA Head of Sales at Priava, Mike Jeanes, commented: "The UK and Ireland are home to some of the world's most iconic and architecturally wonderful cathedrals, churches and other clerical real estate that regularly host not just their own services but other private events, such as weddings, or even act as film sets. [easy-tweet tweet="With places of worship boasting impressive audio capabilities... they become desirable spaces " hashtags="Cloud, Worship"] However, like many other venues, religious organisations are having to become more commercially minded when it comes to maximising the level of income they can raise from their unique assets, which range from the main nave through to meeting rooms or a crypt. With many places of worship boasting impressive audio visual capabilities and other facilities, they are extremely desirable spaces to host a wide range of events and conferences, and this provides opportunities to boost revenue outside of their standard operations." Using cloud-based venue and event management technology, places of worship can automate many of the tasks associated with booking and delivering events. As a result, they can increase the volume and complexity of events they are able to hold, and at the same time increase funds towards the running and upkeep of buildings and for the overall benefit of their own congregations and communities. Two well-known places of worship that have recently chosen to implement Priava include Christ Church Cathedral in Dublin and St. Andrew Holborn Church in London. Susanne Reid, Head of Tourism and Events at Christchurch Cathedral, Dublin, commented, "The reason we chose Priava was to provide our events team with a professional system that could save substantial time and enable us to more efficiently manage the Cathedral's increasingly busy diary, from booking spaces for clerical meetings through to hosting a wide range of events such as lunchtime recitals or evening concerts. Priava's on-line diary system and built-in CRM is an ideal platform to centralise all the information we need in one place, so we can easily communicate exactly what is going on to all the key stakeholders within the Cathedral. 
We also liked the fact that Priava was cloud-based as it would give us a solution that would be continually updated in the future." Some of the key benefits of implementing Priava include: maximised occupancy of spaces for events, leading to additional income generation that can be reinvested to support maintenance of buildings and other services; smoother and easier booking and managing of religious and secular events, which delivers an enhanced experience for congregation/guests and saves time for personnel; a centralised system that provides greater clarity for clergy to see what is going on, through access to a shared on-line calendar; the ability for places of worship to diversify the type of events to include private functions such as weddings, corporate meetings, parties, filming et al; time saved on the administration of events so clergy can focus on core duties; online enquiries - an integrated online calendar and easy-to-use enquiry form make it easier to capture all enquiries, which are recorded straight into the system; real-time availability at a glance - manage multiple venues/spaces and a high volume of events at the same time and maximise occupancy levels with access to real-time availability; repeat & recurring bookings - save time, manage repeat bookings, and create and edit recurring events with ease, e.g. services/masses, prayer times, educational tours, community outreach programmes, special events and celebrations; support for a higher volume of events by moving from a manual to an automated system; all Health & Safety information recorded centrally and attached to individual event sheets; reduced dependence on IT resources - no need to depend on in-house IT staff, as all updates are carried out remotely and made available to all users simultaneously; and customisable reports & booking templates - the easy-to-use report designer enables administrators to create simple and effective booking forms and contracts. ### 8 Tips for Achieving Cloud Security As cloud computing in the enterprise continues to deliver enormous efficiency and cost benefits, organisations are faced with significant challenges relating to privacy, security and the data protection and availability of critical business assets. These challenges are only growing as more and more enterprises adopt cloud-based IT. This growth has been highlighted by new research which polled 400 IT decision makers across the US and Europe. The survey found that, on average, 40% of all organisations' applications are deployed in the cloud, and this number is expected to grow by a further 30% in the next year. The security stakes have therefore been raised and, as we have seen, hackers don't discriminate. One of the largest breaches in 2016 occurred at the UK mobile operator Three, when hackers successfully accessed its customer upgrade database simply by using an employee login. This occurred soon after another major breach at broadband provider TalkTalk, where the details of more than 150,000 customers were stolen, including the bank account details of around 15,000. The result was 95,000 lost subscribers, which cost the company approximately £60 million. Ownership of security within cloud activities must be prioritised as C-level executives, IT managers, CISOs and security professionals plan their cloud security strategies. Below are eight recommendations for ensuring cloud security. While these might seem a bit overwhelming, the alternative is even scarier – risky cloud use that leaves organisations vulnerable. 
With thorough planning and a new perspective on cloud security, your company's data will be more secure in 2017. Don't put a bullseye on your data. Think about approaches that minimise the target value of an organisation's data. Consider deploying services on virtual private clouds or internal/on-prem systems - entirely within a firewall, keeping information away from the spotlight of highly visible SaaS targets. [easy-tweet tweet="Collecting evidence on the existence of data can pose a threat...losing data " hashtags="Data, IT Threat"] Protect corporate user identities and metadata. User identities are subject to hacking; enterprises must protect their corporate user identities since the loss of a user identity is likely to result in loss of the user's corporate data. Similarly, collecting evidence on the existence of data and its properties can pose a threat as much as losing the data itself. Some cloud storage solution providers do not adhere to this strategy and keep all of their customers' metadata centralised in a public place. Thus, they indirectly require enterprises to put their faith in them, which poses a significant risk to data confidentiality and integrity. Avoid risks associated with SaaS providers generating and managing encryption keys. Encryption keys generated on unencrypted servers can provide attackers with easy access to enterprise data. Similarly, having your SaaS provider manage your keys increases your susceptibility to losing control of your data. While cloud service providers boast high security, including physical protection of hosting facilities, electronic surveillance and ISO 27001 certifications, many provide no protection against government data requests, blind subpoenas, or clandestine spying. Make sure you own user identities, metadata, and encryption keys to ensure the highest levels of data privacy. Control your endpoints and offices. Use enterprise mobility management (EMM) tools to eliminate shadow IT and create secure productivity spaces within corporate-provided and BYOD devices. Encrypt all data at the source to ensure the greatest levels of file security. Lock down external collaborator access. Implement strict policies to enforce what data can and cannot be uploaded in a file sharing environment, control which domains/emails can and cannot be emailed to, and audit all accesses to ensure there are no anomalous events. Data loss prevention (DLP) tools can be used to restrict access behaviours. [easy-tweet tweet="Set rigorous policies around password strength and refresh rates" hashtags="Password, Data"] Improve password security. Set rigorous policies around password strength and refresh rates. Consider adding multi-factor authentication, which will require the user to combine something they know, like a static password, with something that they have, such as a smart card or a token that generates a one-time password. Know your data protection options. Understand the limitations of cloud services to recover data lost in the event of an attack, user error, etc., as part of your vendor's SLAs. Ensure that you protect data residing in the cloud – i.e. back up your SaaS applications, as well as services and applications running on public cloud IaaS – as part of a comprehensive organisational strategy for backup/recovery of data in all locations (on-prem and in-cloud). Investigate multi-cloud strategies. 
When organisations run applications on multiple cloud services rather than relying on a single vendor, they reduce the risk of a vendor’s service outage causing them significant issues and downtime. This is a critical component of a cloud strategy that enables organisations to preserve cloud optionality while strengthening their business continuity models. ### Christian Rule from Flynet LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Christian Rule from Flynet about enabling access to green screen technology and with their toolset you can work against any legacy environment. ### Jacqueline Davey from IBM LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Jacqueline Davey from IBM about what's new in terms of Cloud for IBM, including data centres and blockchain. ### Melissa Snover from The Magic Candy Factory LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Melissa Snover from The Magic Candy Factory about winning #DisruptivePitch, 3D printed sweets and the technology behind it. ### Tom Clark from ITV LIVE from Cloud Expo '17 #CEE17 Disruptive Tech TV LIVE from Cloud Expo '17 talk to Tom Clark from ITV about his choice of common platforms and the diverse infrastructure of ITV. ### Greencore Group takes its infrastructure global using Datapipe’s capabilities Greencore, the international convenience food manufacturer, has leveraged Datapipe’s global presence to set up infrastructure in the United States to initially host its voice and video application, Skype for Business. Greencore, a leading manufacturer of sandwiches and other food to go products to grocery retailers in the UK and leading manufacturer of sandwiches, meal kits and salads to CPG, convenience retail and food service leaders in the US has been an Adapt client since 2014, outsourcing all of its infrastructure requirements. In November 2016, Greencore announced its acquisition of US-based Peacock Foods and as a result, it needs access to local infrastructure in the United States. In August 2016, Adapt became a Datapipe company and now offers clients access to its global footprint that encompasses the Americas, Europe and Asia-Pacific. Greencore has taken advantage of this capability to host its latency sensitive applications, such as voice and video, in US data centres in Kansas City and New Jersey. Datapipe is now able to offer Greencore a true choice of best execution venue for its infrastructure – important now that the US comprises nearly half of Greencore’s annual sales. Scotty Morgan, Chief Sales Officer at Datapipe Europe said: “After Greencore’s acquisition of US-based Peacock Foods, we knew it was the perfect opportunity to make the most of our new global capability to support them. This is exactly the reason that Datapipe acquired Adapt – to be able to offer global services to our UK-centred client base.” Chris Smith, Head of Infrastructure at Greencore, added: “Since we started working with Adapt three years ago, we have received great service in the UK. The added global capabilities after the acquisition by Datapipe came at exactly the right time for us. “Collaboration is the foundation of our business and key to this collaboration is a coast-to-coast, robust voice and video capability. We needed local infrastructure in the United States to keep latencies down and Datapipe was able to deliver. 
This also contributes to cost saving for the business, via reduced travel requirements.”

### Jez Back from Deloitte LIVE from Cloud Expo '17 #CEE17

Disruptive Tech TV LIVE from Cloud Expo '17 talk to Jez Back from Deloitte about the economics of cloud and how to capitalise on your cloud service. Deloitte are about to enter the blockchain game with new ecommerce evolutions.

### Yuval Dvir from Google Cloud LIVE from Cloud Expo ‘17 #CEE17

Disruptive Tech TV LIVE from Cloud Expo '17 talk to Yuval Dvir from Google Cloud about their rebranding, resurgence into the cloud market and how they have become the building blocks for many work ecosystems.

### Shawn Zandi from LinkedIn LIVE from Cloud Expo ‘17 #CEE17

Disruptive Tech TV LIVE from Cloud Expo '17 talk to Shawn Zandi from LinkedIn about the work in progress of the merger between LinkedIn and Microsoft, the integration of the brands moving forward and what it means for a workplace of half a billion users.

### Marc Esmiley from iomart LIVE from Cloud Expo '17 #CEE17

Disruptive Tech TV LIVE from Cloud Expo '17 talk to Marc Esmiley from iomart about revelations in cloud and changing the focus of the hosting business, with public cloud bringing a new tagline to the brand: 'unleash the power of the cloud'.

### After the Expo – Are we seeing a major shift in tech?

So another year passes and, with a myriad of interviews under our belts, we broadcast live at Cloud Expo Europe 2017. We had a great line-up of guests including ITV, Google, LinkedIn, IBM, Fujitsu and many others. However, this article is not about name-dropping who was on the show, but about the general trends and themes. The main topic of discussion was the IT concept of “Blockchain”. For the uninitiated, let me explain Blockchain as a concept in simple terms, as there are many spin-offs, improvements and acronyms already appearing – such as Hyperledger and BaaS (Blockchain as a Service) – that can confuse you.

“A simple way to consider what Blockchain technology could provide is to think of a ledger (e.g. an Excel spreadsheet) that can only be updated when a majority of trusted participants (known as ‘miners’, connected via ‘nodes’) have simultaneously agreed that the proposed transactions are valid, and that will not permit a situation whereby any erroneous “double-billing” could take place. Furthermore, the Blockchain-style ledger maintains a full and potentially transparent audit trail of all of the transactions ever made via that particular ledger.”

With this said, the use cases for a “Blockchain” are not limited to financial transactions; you can use this method across most industries where you require asset control, verification, the state change of devices or even history validity (authenticity). One example of a non-financial sector use case is the company Everledger, showcased at many events and also one of the winners of the Barclays Accelerator Techstars programme in 2015. Everledger uses the Blockchain method to create a digital ledger – a database, if you will – that stores information regarding diamonds and can quite literally replace the paper-based certificate of authenticity issued today. Not only this, it can store information such as location history, from where it was mined to where it has been sold.
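To make the ledger idea in the quote above a little more concrete, here is a deliberately tiny, hypothetical sketch of a hash-chained, majority-approved ledger. Everything in it – the class, the validator names, the diamond record – is invented for illustration and bears no relation to Everledger's, Hyperledger's or any vendor's actual implementation.

```typescript
// Illustrative only: a toy hash-chained ledger showing the two ideas in the quote
// above - majority agreement before an entry is accepted, and a tamper-evident
// audit trail. Real blockchain platforms are far more involved than this.
import { createHash } from "crypto";

type Transaction = Record<string, string>;

interface Entry {
  index: number;
  transaction: Transaction;
  previousHash: string;
  hash: string;
}

const sha256 = (data: object): string =>
  createHash("sha256").update(JSON.stringify(data)).digest("hex");

class ToyLedger {
  private chain: Entry[] = [];
  constructor(private validators: string[]) {}

  // Append a transaction only if a majority of validators approve it.
  propose(transaction: Transaction, approvals: Set<string>): boolean {
    const votes = this.validators.filter((v) => approvals.has(v)).length;
    if (votes <= this.validators.length / 2) return false; // no majority - rejected
    const previousHash = this.chain.length
      ? this.chain[this.chain.length - 1].hash
      : "0".repeat(64);
    const body = { index: this.chain.length, transaction, previousHash };
    this.chain.push({ ...body, hash: sha256(body) }); // chaining makes edits detectable
    return true;
  }

  // Recompute the hash chain; any retroactive edit breaks verification.
  verify(): boolean {
    return this.chain.every((entry, i) => {
      const expectedPrev = i ? this.chain[i - 1].hash : "0".repeat(64);
      const { hash, ...body } = entry;
      return entry.previousHash === expectedPrev && sha256(body) === hash;
    });
  }
}

const ledger = new ToyLedger(["a", "b", "c"]);
ledger.propose(
  { asset: "diamond-123", event: "mined", location: "Botswana" },
  new Set(["a", "b"]) // two of three validators agree
);
console.log(ledger.verify()); // true until someone tampers with an entry
```

Even at toy scale, the two properties the quote highlights are visible: an entry is only accepted once a majority of the trusted participants agree, and any retroactive edit breaks the hash chain, so the full audit trail stays tamper-evident.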
Also, if the diamond has been stolen and it has been part of the process within the Everledger system, it will flag up and show this with the digital authenticity key attached to it! This, simply put, is an insurance company’s dream come true and will save millions!

Anyway, this article is not specifically about Blockchain but about the tech trends discussed at the Cloud Expo Europe 2017 event. In my opinion, there were four main themes throughout the Expo that were demonstrated and reinforced by guests on our live show, www.Cloudexpo.tv:

Blockchain – as mentioned earlier, this is a hot topic and promises to solve a host of business challenges with a distributed, shared ledger system where historic entries cannot be changed (see above for example use cases).

Digital Transformation – a lot of buzz has been made over digital transformation initiatives for the last two years, but over the last 12 months the large consultancy houses have upped their game to deliver this to organisations as a lead-in engagement to consult. The tech houses and vendors have now caught up, applying business needs to digitalisation with technology that encompasses mobile, agile, big data and IoT initiatives that rely on the correct tech to underpin them. Flynet, a UK-based company, came onto our show and explained how they provide digital transformation for mainframe technology. Imagine legacy architecture you may have present that simply just works. Flynet provide the makeover and compatibility for the digital age, applying omnichannel front-end services that take away the need for legacy skills and code for your application or service and provide a new interface that works while making it cloud compatible.

Multi-Cloud – I think if you ask five people what this means you will get five different answers. From the people we spoke to, and including my own opinion, multi-cloud is essentially a multi-hybrid approach: simply put, inter-connectivity and interoperability between the different types of cloud, whether through APIs, services or shared services from one type of cloud to another. Iomart explained this well on the show and got my vote for the best explanation (next to mine) with their offerings. However, there were many others boasting a “Multi-Cloud” approach too!

Security – off the back of a major public cloud outage the week before and, let's be honest, it’s a very hot topic anyway, security is paramount to any visible piece of technology across the internet. A buzz was created when Miku Hoppenden came on to our show and discussed current and future trends. Ransomware is a hot security sub-topic at the moment and doesn’t seem to want to go away anytime soon. The general consensus is that companies would rather take the risk of lax security than invest in good information security, due to the many options available as well as high price tags. This seems crazy to me, as good security should be part of the foundations of cloud technology.

These were the four trends I saw that were very popular at the event; however, there were a lot more topics presented across the two days. I can nearly see your eyes roll back as you read the next paragraph. What technology can underpin, be used for, or be best suited to one or all of the four points above? Mainframe technologies! Mainframe technologies may have had a resurgence of late for a few reasons. The technology has had a face-lift, insomuch that OpenSource software can now be run and is certified to run on the architecture.
It’s already prebuilt in most circumstances and can be just a matter of lift and shift. The tech caters for all the major versions of Linux, as well as container-based software, and is constantly being redeveloped to tick even more OpenSource boxes! Also, there is a dedicated user forum called “The Open Mainframe Project” that serves the OpenSource community and the adoption of mainframe tech!

With regards to security, mainframes are about as secure as you can get, with built-in encryption at the hardware level as well as ultra fault-tolerant hardware – it’s a piece of tech that just runs! OK, I understand it can also be expensive, but like for like, with the required outputs and running services, it’s cheaper than x86 tech (yes, you heard that right). With the major advances in security for this technology (there are even branded ultra-secure versions of the product range), mainframes are still one of the most secure computing environments on the planet. Look up DataKinetics and see what amazing things they are doing in the tech space for mainframes. One case study suggests quite literally millions of dollars of savings for just one client by using their mainframe optimisation toolset. DataKinetics have been around for nearly as long as mainframes, and if you have one I bet you have used their software at some stage, as it clearly squeezes the tech configuration to the max, allowing you to save a lot of money – I mean a lot!

Once seen as legacy technology or a “must have” elephant in the comms room, this perception has changed and now you can have a hybrid cloud approach: utilising your existing mainframe to bolster workload shifts, bursting out to public cloud provisioning, or even running Linux environments that do not require specialist O/S skills, with the help of companies such as Flynet (mentioned previously) that bring digital transformation to the mainframe. Blockchain is definitely here to stay and is rapidly increasing in adoption as a technical concept of service delivery. IBM has a purpose-built blockchain service (BaaS) that you can adopt in a mainframe environment, enabling you to tick all of the right boxes for R&D and deployment, while it continues to develop the service to accommodate even more advances in this area – Hyperledger as an example – securing its dominance over other players in the Blockchain race for leadership.

As you can see there is a revival for the tech, there are amazing advancements in the supporting services around it, and it’s affordable not only to the large corporates that once had to have it. Now you can utilise the technology at a price you can afford, because you need it and it just makes sense. Are we looking at a new dawn – an era, if you will – where mainframe technology will be a method for cloud in its own right? Who knows, but all I can say is that centralised technology that can cater for humongous workloads and provide the digital transformation you need at a cost you can afford has to be taken seriously as a major player in the tech industry!

### Credence Security in partnership with CensorNet

CensorNet, the complete cloud security company, today announces it has selected Credence Security, the region’s specialty distribution company, as its Value Added Distributor (VAD) for the Middle East, Africa, India and Pakistan regions.
Credence Security and CensorNet will work together to offer the cybersecurity company’s advanced cloud security products, including its Unified Security Solution (USS) to enterprises and resellers across the region. The deal further expands CensorNet’s global reach and addresses the growing demand for Cloud security solutions in the Middle East and across Africa. The partnership provides Credence Security with access to a unique Cloud security solution that consolidates point solutions into one easy to manage platform. Commenting on the partnership, Vivian Gevers, Credence Security Managing Director, said, “CensorNet’s Unified Security Solution (USS) with its award-winning Cloud application control makes it an ideal security solution and managed service for our value-added channel partners. By partnering with CensorNet, we add a critical best-in-class security solution to our existing portfolio and as such are one step closer to delivering on our commitment to offering our partners and enterprises in the region a one-stop-shop for all their security needs.” USS is a comprehensive cybersecurity service that combines modules for the security, monitoring and control of the web, email and cloud application across an organisations’ network in one single dashboard, meaning that common policies can be easily applied and incidents tracked across different media. USS provides the security and control of an on-premise or endpoint component with the flexibility and mobility of a cloud service. It is the next generation in Email and Web security with Cloud Application Control giving enterprises the power to extend web access policies to Bring Your Own Device (BYOD) initiatives and to monitor and control Shadow IT. Ed Macnair, CensorNet CEO, commented; “Credence Security has two decades of experience delivering IT security technologies to enterprise customers in the Middle East, Africa and the India subcontinent through its strong network of reseller partners and is therefore perfectly placed to promote our security solutions in the region. We are excited to have them join the CensorNet family.” About CensorNet CensorNet, the complete cloud security company, helps organisations to effectively manage and control the use of cloud applications in their business. The Company provides a unified and multi-layered approach to securing the cloud via its purpose built, multi-functional cloud security platform that delivers integrated web security, email security, CASB and adaptive multi-factor authentication. This provides organisations with security-focused visibility and control over user access, data and assets to allow access while controlling outcomes and allows organisations to address the security, audit, compliance and productivity issues associated with the use of cloud applications and devices. CensorNet enables organisations to control Shadow IT, safely implement BYOD initiatives and protect from cyber threats. With more than 4000 customers and over 1.3 million users worldwide, the Company is headquartered in Basingstoke, UK and has further offices in Austin and San Francisco, US; Copenhagen, Denmark and Munich, Germany. For more information, visit www.censornet.com About Credence Security Established in 1999, Credence Security, previously ARM, the regions’ speciality distribution company, specialises in IT security, Forensics and Incident Response. 
Working closely with leading IT security vendors including AccessData, Fidelis CyberSecurity, Digital Guardian, RedSeal and Popcorn Training, we design and deliver intuitive and customised cybersecurity, governance, risk and compliance solutions that protect organisations against advanced persistent threats, malicious adversaries and internal malpractice. Credence Security is headquartered in Dubai, UAE and serves enterprises across the Middle East, Africa and India through a network of resellers throughout the territory. For more information, visit www.credencesecurity.com ### The Value of Partnership in Digital Transformation The storage industry, as with much of the technology sector as a whole, is powered by collaboration. Vendors, partners and customers all have to work together to create the best possible products for real business scenarios. What makes a great partnership within the industry is the same as what makes a great partnership between individuals – that the overall result is more than the sum of its parts. Through the sharing of insights, expertise and solutions, partners, vendors and customers all stand to benefit. At NetApp, we know our partner community is fundamental to our success as a business. That’s why we believe in being partner-first. NetApp has over 400 authorised partners in our network across the UK&I, who help us every day to provide our customers with exemplary solutions and services to meet their storage needs. Nowhere was this more evident than during the UK rollout of our Backup as a Service (BaaS) offering, where Daisy and Node4 joined us as launch partners. Together, we will provide businesses with state-of-the-art storage technology without any additional hardware investment. [easy-tweet tweet="One of the key themes for our BaaS launch is that of digital transformation." hashtags="BaaS, Digital "] One of the key themes for our BaaS launch is that of digital transformation. It is a topic that is on the tip of every business’s tongue – but what is seldom discussed is the pivotal role that partners play in making this a reality. For large vendors like NetApp, it would be challenging to go out alone and make customer’s dreams of quick, easy transformation into a reality. The technology industry is getting more complex by the day, and partners are the experts in helping customers navigate through the chaos – after all, at the end of the day, customers don’t want to have lengthy discussions with vendors about their potential solution, they just want someone to make it happen. A partner infrastructure provides customers with exactly this service. Furthermore, for a customer, a partner is key for helping them to make the most out of their solution. In the storage industry, our customers want to achieve the best ROI for their investment – which means getting the best speeds and storing the most data in the most efficient way. Our partners work closely with use to understand the potential of our technology and translate this expertise to the customer. As a result of our close collaboration, our partners are able to take our product offering and build on it to create more tailored, service-based solutions that match the exact needs of their customers, even within specific verticals such as finance, healthcare and manufacturing. Building on their nuanced expertise, our collaboration can develop truly transformative infrastructure solutions that can support businesses on their digital transformation journey. 
Partners don’t just help customers, they help vendors grow as a business. In an industry that changes every single day, with new customer requirements just around the corner, partners form a vital link in helping NetApp stay ahead of the curve in understanding what businesses really want. At the same time, we can provide them with our expertise in identifying areas of rapid growth, and help them to make the most of high-growth sectors such as Flash, cloud and converged infrastructure. The value of a partnership comes in its ability to offer a multitude of benefits – learning opportunities, sales opportunities and growth opportunities – to all its participants. With the launch of our BaaS proposition, in conjunction with our partners and their customers, NetApp is harnessing our network’s combined expertise to provide outstanding IT services that meet the challenges businesses experience along their path to digital transformation.

### Is the Public Sector Ready to go Digital?

The UK government’s long-awaited Transformation Strategy has been unveiled. The strategy highlights the importance of data governance, breaking down silos and the need for continuous development and improvement of digital services. With the current government urging “digital by default”, the focus has been on public sector organisations to deliver public services via initiatives such as Government as a Platform (GaaP) as a means of attaining more effectiveness and efficiency in those services. In the early 2000s, e-government usually involved turning an analogue process into a website, but this isn’t enough anymore. Digital offers the promise of transforming the public sector’s mode of operation, how it relates to citizens and the services they consume.

Whilst much progress has been made over the past few years, the public sector is still managing the technological and cultural transition from relying on siloed data to collaboration between government departments to deliver improved services. Two areas public sector bodies need to address are accessibility and mobility. Digital inclusion is a key issue for public sector organisations that are making the transition to digital services. Guidance from Gov.UK promotes accessibility and warns that excluding anyone from using a service based on disability may be in breach of the Equality Act 2010. An EU directive passed last year stipulates that public sector websites and apps must be made accessible to those with disabilities. The UK government advises that public sector organisations should test accessibility issues by running an automated tool against a site, supplemented by regular manual checks and testing undertaken by an accessibility specialist. A specialist can advise on content, designs, prototypes and code and identify barriers that could affect disabled people. Equally, enabling more applications to be accessible from mobile devices can allow citizens to access government services from wherever they are and allow employees to be more productive. IT managers in the public sector have a great opportunity to use mobility to deliver innovative digital services. Mobile Device Management (MDM) technology and bring your own device (BYOD) policies can allow employees and departments to work more flexibly while collaborating and sharing data in a secure fashion.
BYOD can also reduce the cost of buying and provisioning devices for employees. While the UK public sector transforms its services, the challenge is to do so while resources are tight. However, it is important not to focus too much on cost reduction at the expense of keeping citizens’ needs at the centre of digital services. Many of the digital transformation and inclusion issues currently faced by the public sector have already been solved by the private sector, using open standards and platforms. This provides public sector IT managers with an opportunity to avoid reinventing the wheel and learn from digital services that have already been introduced in the financial, retail and manufacturing sectors. Digitisation of government services will be among the biggest public sector challenges over the next twenty years. If managed wisely, it will bring great benefits to public sector organisations and the citizens they serve. The biggest challenge is to use the cloud, connectivity, collaboration and mobility to digitise in a way that makes lives simpler for citizens on a daily basis and increases the quality and efficiency of public sector activities while enabling continuous improvements.

### ThreatConnect reveals the most common threat personas in cyber security

It is the duty of security operations directors to ensure that they have complete visibility into their security posture. With threat actors’ tactics evolving all the time, a comprehensive and flexible threat response is a must – neither governments nor enterprises can afford to leave the back door open. So what are the top threat personas that organisations need to be wary of?

- State-sponsored hackers – These are the big dogs. The anonymity of web-based attacks means that nation-states can achieve their more ethically questionable aims via puppet actors, making it extremely difficult to prove links between individual hacks and state-sponsored campaigns. However, state-sponsored hackers are sometimes identifiable by their attack patterns and dedication to a specific target. They’re a tenacious breed – if you think you’re being targeted by a state-backed hacker (and aren’t a conspiracy theorist), you should be ready for a long struggle to throw them off.
- Ideological attackers – these threat actors, for example the hackers that targeted Dyn DNS systems, are intent on propagating their views with noisy, public attacks – website defacements and DDoS attacks, for example. If after this sort of petulant demonstration they feel their message is not being heard, then they may look for a more spectacular platform upon which to propagate their doctrines. For some, that means espionage activity or strategic leaks of confidential documents in support of a broader information operations campaign; for others, it might simply mean a particularly mean series of insults on Twitter…
- Criminally motivated – Criminals have always been attracted to an easy buck, so it’s hardly a surprise that they’d take advantage of the way technology has evolved to fill our lives. So for example, malware with moderate antivirus detection that only looks for credit card data and point-of-sale services may indicate a moderately resourced attacker who is likely criminally motivated. That’s a fairly well-prepared example.
As well as the slightly bumbling phishing emails we’ve all encountered, cyber criminals can also come in two particularly dangerous forms:

A) The silent attacker – cyber criminals may lie silently within an enterprise for months, biding their time until it’s the right moment to attack. Since some malware can edit its code once installed to mask its presence, these quiet lurkers embed themselves on a network to gather sensitive data in secret, either extracting personal details or monitoring communications, constantly feeding the results back while they wait for the opportune moment to strike.

B) Sophisticated cyber criminals – on other occasions, the strategy of threat actors transitions from watching to attacking. The tools in use are getting to sci-fi levels of sophistication. Highly resourced fraudsters can now use custom malware that surreptitiously replicates itself to thumb drives to jump air-gapped networks and automatically looks for and collects documents with the keyword “SECRET”. Anything you try to hide is all the more likely to be found.

Not all adversaries are created equal and intent is rarely consistent across the board. For example, if your adversary is driven by espionage then you wouldn’t expect to see any defacement or ransomware activity. Instead, you need to be wary of sensitive information leaving your network. Organisations that have a strong understanding of their adversaries and can develop persona-based intelligence capabilities will be better placed to automate their security operations, mitigate threats faster and adapt more quickly. Many question whether adversary intelligence is really a must-have, but knowing what they are up against will allow organisations to build more comprehensive mitigation strategies at a tactical level.

### How the Cloud Makes it Simple to Innovate

Digital transformation is speeding up the rate at which companies make or break. A couple of decades ago, a Fortune 500 company would last for about 75 years. In the last 15 years, 52% of Fortune 500 companies have gone. The average lifespan for even the largest of companies is now down to 15 years. Successful companies are able to innovate quickly, are agile, listen to their customers and provide the best possible digital experience. Those that adapt quickly to the digital world are 26% more profitable than their industry peers, according to research by MIT Sloan. All this acceleration is down to the fact that success (or failure) is just one social share away from any of your customers. The game has become highly competitive. If customers don’t like the experience you provide, they’ll quickly switch vendors, and they share almost everything online: the good, the bad, the ugly. Your best bet is to be faster, to innovate and simplify. You need bi-modal IT, as Gartner calls it. This means having, on the one end, mature enterprise technology, and on the other end, being agile, fast and innovative. Make sure that your back-end is robust and stable, then go wild on the front-end without risk. Moving fast on the front-end means, for many businesses, going into the cloud. IDC predicts that by 2020, 67% of all enterprise IT infrastructure and software spending will be for cloud-based offerings. Here’s how you can test fast and shorten the time from idea to delivery, using cloud-based content management systems.

Safe to fail

Being innovative means trying out stuff, experimenting and risking failure. Choose solutions that reduce risk, make failure less costly and remove obstacles to testing new ideas. Increase flexibility so that you can innovate faster out there in the market. You wouldn’t want to wait months for IT resources to be available. You need to onboard staff quickly and move with your digital transformation.

Empower front-end developers

Many CMS products still carry a risk, or a divide between front-end and back-end. In the classical way of doing things, you would have a graphic designer, then someone does HTML, CSS and so on. Then it goes to the back-end developers to create templates, and you need to test again. It’s never how you wanted it and takes forever. In the old world, the learning curve was steep. Enterprise technology is complex. You have to learn how to customise in a Java environment and work on templates at the back-end. The new world is more API-driven and uses standard front-end frameworks. No more compilation, build time or restarting the system after changes. Front-end developers already understand Javascript and they can now create templates. Depending on your needs, you can work exclusively with the front-end and not use Java at all. This speeds up development and makes things happen fast. And what if those front-end developers did all the work that you really needed, using the latest Javascript frameworks? Java is still one of the most used programming languages in the world. It’s really solid, enterprise-quality, mature technology. This is heavy lifting, but for getting ahead today, it’s about being agile on the front-end. Enter the Javascript community, with all those amazing frameworks and innovation happening in that space.

Easy deployment and live testing

Simplify operations with cloud-based products. In the old world, you needed heavy lifting in the back-end to deploy a product and this took time. In the new world, press a button and the platform deploys automatically – all done in a best-practice setup. Simplify bug fixes. No more waiting for the next rollout and being exposed to security risks. With the cloud, you’re always on the latest security update and the latest version with new functionality. Test against live data. If you’re happy with the results, just click again and you’re switched to whatever version was just released. You can also roll back if you prefer to stay with the previous version.

Simplify rollout

Businesses now want live websites in weeks, instead of the usual three to six months. But even more crucial than the time it takes for the initial rollout is the time it takes to add new features. It is the day-to-day, incremental innovations that make the difference. That’s why Magnolia launched its cloud-based CMS, Magnolia NOW. The cloud environment allows you to change anything at any point in time, totally risk-free. You can even start with smaller sites, then add functions and grow content. Continuous integration lets you release features as often as needed. Take the risk out of innovation with a cloud-based CMS. Work in a light and live environment and shorten the time from idea to launch – a simple, elegant and no-fuss way to innovate.
### Business Intelligence – you either have it or you don’t…  [Definition of Business: …the practice of making one's living by engaging in commerce…] [Definition of Intelligence: …the ability to acquire and apply knowledge and skills…]  Without both – you will have neither! I’m an avid reader of pretty much all things “Technical” – especially since it relates to FinTech. Lately, I’ve seen a slew of articles about so-called “cognitive” and “predictive” systems – how Blockchain technology and the Internet of Things (IoT) will change life as we know it. Lastly – but by no means least – how cyber security (or lack thereof) is becoming increasingly important. How then can people employed on the technical side of things ensure that their non-technical – but business focused bosses - make smart and informed decisions about what to do next? Non-technical bosses typically don’t “speak Geek” – whereas, the technicians themselves are often incapable of explaining things in a fashion which doesn’t make the non-Geeks’ eyes glaze over! In an attempt to bridge the Geek versus, the rest divide – I once spent a lot of time working on a set of tools from Microsoft known as; Visual Studio “LightSwitch.” The product was discontinued in October 2016 but will continue to be supported (as a component of Visual Studio 2015, until some time in the year 2020) and I personally no longer use it. Nevertheless, I’ll never forget just how quickly I was able to use the tools and quite literally deliver a new app in virtually no time. Once I’d linked to one or more databases, I was able to quickly “translate” the various Table and individual Field names appearing on the LightSwitch front-end to whatever the User wanted – just like that. Furthermore, any subsequent Queries/Reports, etc., would reflect the User-preferred terms and not those typically applied by Database Administrators – (e.g. “Client Clearing Code” instead of “CltClrgSysId”.) It was great while it lasted. So what’s available now? Well, things change, and lately, I’ve been looking into how Mainframes have evolved over the years – in particular, IBM’s z Systems and their Linux variants. In previous posts, I’ve reported on just how powerful and secure they are, and in many ways, so vastly superior to x86 type technology. Indeed, this is exactly why so many of the world’s top banks, credit card companies, airlines, etc., still use Mainframe technology. See the following excerpts from my previous blogs to get a feel for what I’m talking about: [easy-tweet tweet="For IT infrastructures that run their private cloud, z Systems can run up to 8,000 virtual servers" hashtags="IT, z Systems"] When compared to an x86/ distributed server environment, the operational costs of the z Systems are half, while the performance is 30 percent greater. For IT infrastructures that run their private cloud, or wish to, z Systems can run up to 8,000 virtual servers, while saving money. It lowers costs because there are fewer software licenses to bother with, lower energy consumption, and the cabinet of a z System takes far less room, lowering the need for facilities and facility upkeep. As if the standard “raw power” proffered by a z System wasn’t enough – there are Partners who specialise in optimising the throughput of various applications and routines. An example is that one such firm enabled a major US Bank to reduce the time taken to process some particularly complex batch jobs from over ten hours to less than one minute! 
More recently, I’ve been looking at some press releases regarding DataKinetics and their relationship with the Danish company SMT Data – and in particular, taking a look at their “LightSwitch-esque” front-end tools. The tools and services are being promoted under the banner “IT Business Intelligence” (often shortened to ‘ITBI’), and the sample screenshots and testimonials are very impressive indeed. For example, take a look at the following excerpt and screenshot:

Business mapping and cost mapping makes the stored information understandable in all areas and on all levels of the organisation, while reports specifically created for and directed to very different recipients all share the same data source (resulting in ‘one truth’). Optimisation no longer needs to be suggested, planned, prioritised and executed by technical staff only. As an example, the following picture shows a report on the potential daily CPU costs (if it turns out to be the monthly peak, which in this case is the cost driver) with totals divided into applications/business areas. Through the BI functionality, a mouse click on a specific day will change all values in the report to cover this day only. Likewise, a mouse click on a specific business area will change all values in the report to cover this business area only. With the right tools, organisation and processes, and with a relatively small amount of time and effort, optimisation can save a large IT organisation millions in costs a year.

Think about that – Geeks producing reports and analyses that the Business folk can truly understand – and then making better, smarter, informed, more businesslike and intelligent decisions. Take a look at another excerpt and screenshot:

Enriching the technical data with information about who is using the capacity can be helpful in the dialogue with developers or business users. Also, adding cost information makes it easier to prioritise optimisation efforts, quantify trade-offs, and document the resulting savings.

Imagine how much more “bang for the buck” you could realise for your business by getting a true handle on just where the money is being spent – then having the expertise of an ISV like DataKinetics to delve under the hood and truly optimise the workloads – further driving value out of the true workhorse in your infrastructure. All pretty impressive stuff! Finally, it seems that there’s a way to get all members of a company’s team to begin speaking the same language…

…and bring a lot more intelligence into their Business.

### Rackspace Launches Global Professional Services for Enterprise Customers

Offers Unique, End-to-End Expertise and Support Across Leading Public and Private Clouds to Enable Faster IT Transformation

On April 3rd, Rackspace announced an expanded professional services offering to help customers move workloads out of their data centres and onto the world's leading public and private cloud platforms. This offering, called Global Solutions and Services (GSS), will provide customers across all markets with the specialised expertise and support needed to transform their IT operations. Through GSS, Rackspace architects and engineers will work with customers to plan, assess, design, migrate, manage and optimise their journey to the cloud, helping to ensure they achieve maximum performance, agility and cost-efficiency. For the first time, Rackspace will provide professional services to any business, regardless of which cloud platform it chooses or whether it is an existing Rackspace customer.
Rackspace has earned the highest level of partner certification from Amazon Web Services® and Microsoft® Azure ® and recently announced a strategic relationship with Google Cloud to become its first managed services support partner for Google Cloud Platform (GCP). Rackspace also offers specialised expertise and support for dedicated servers and the three major private clouds, powered by VMware®, Microsoft and OpenStack®. Today, more than 70 percent of organisations with 10,000 or more employees are undertaking an IT transformation initiative, according to "Voice of the Enterprise: Cloud Transformation – Vendor Evaluations," a study published in January 2017 by 451 Research. Also, 70 percent of organisations are also using multi-cloud environments. Many of these businesses lack the in-house resources and skills required to complete this type of transformation, from the upfront planning and execution to the ongoing management of their new cloud solutions. “As more businesses look to move IT operations out of the data centre, they recognise that they need help transitioning to multi-cloud models,” said Rachel Cassidy, vice president of Global Solutions and Services at Rackspace, who wrote an in-depth blog post about the new offering. “In response to customer demand, we’re expanding our managed cloud portfolio to offer professional services not only to the SMB and mid-market customers we have served for nearly two decades but also to our fast-growing stable of enterprise customers.” [easy-tweet tweet="Rackspace operates one of the largest VMware footprints in the world" hashtags="Rackspace, Data"] Cassidy continued: “Rackspace operates one of the largest VMware footprints in the world, and we employ thousands of specialised cloud experts across multiple platforms. With this breadth and depth of expertise, we are uniquely suited to help guide our customers through every phase of their cloud journey. We provide everything from front-end assessments and guidance to full migration services and ongoing operational support, depending on the customer's needs. Our goal is to relieve clients of the complexity of moving to multiple clouds, allowing them to focus on their core business.” Rackspace GSS will place an added focus on enterprise and mid-market businesses and will help support their evolving business needs by providing specialised expertise and professional services in several key areas, including: Portfolio Solution Architects will serve as customers' trusted advisors, helping them run each of their workloads where they will run best. Through Rackspace specialised expertise on all of the leading public and private clouds, as well as dedicated servers, architects can offer both a seasoned point of view and unbiased guidance on which solutions will serve customers’ unique needs. Specialist Solution Architects with deep expertise and experience on specific cloud technologies will work directly with customers on each cloud platform that they choose. Professional Services/Partner Services delivered through cloud architects, consultants and engineers that will help customers with planning, assessment, design, architecture, migration and optimisation across the selected cloud solutions. Rackspace leverages strategic partners as needed to complement the company’s expertise and capacity, providing customers with complete, end-to-end solutions. 
Rackspace enhanced professional services will include expertise and support not only for multi-cloud infrastructure but also for security, compliance and governance, as well as for data stores and other applications. Rackspace services include load testing, DevOps project support and maintenance, security testing and project management as a service. “This seems to be a timely offering in keeping with market trends we’re seeing,” said Melanie Posey, research vice president, cloud transformation, at 451 Research. “The drivers behind many organisations’ IT transformation efforts – whether toward all-in public cloud or hybrid – are beginning to change. While overall cost savings remain important, other factors such as improved time to market and business agility are equally compelling drivers toward a different type of IT infrastructure environment. As the trend continues, organisations will increasingly seek out external assistance to guide them through the transformation.” The GSS offering is now available globally. For additional information on Rackspace GSS, please visit www.rackspace.com/en-gb/professional-services. ### Securing the Cloud The challenge A huge amount of business applications and services continue to move to the cloud. Along with the rise of remote workers within an organisation, securing and controlling access to cloud-based infrastructure and services has become a challenging and complex requirement. Once infrastructure and services are no longer secured or maintained by an organisation, ensuring additional controls are in place along with mature policies is vital to reducing the risk exposure. Some organisations may have mature Identity and Access Management (IAM) solutions, which protect internal systems. However, with the rapid adoption of the cloud, many are trying to use existing policies and controls to secure the cloud. This is not an effective way of approaching the problem; the cloud must be seen for what it is, a completely different solution which requires its own policies, procedures and controls. [easy-tweet tweet="Auditing capabilities can also be an issue when it comes to cloud-based services" hashtags="Cloud services"] Whilst many cloud applications or services provide a certain level of access control, this requires them to be managed locally, either by platform or application. Generally, it does not provide the ability to link internally to the active directory or identity platforms, or at best it provides an unreliable or insecure API integration with internal systems. This means managing users becomes a complex and time-consuming process to conduct and also reduces employee effectiveness and productivity, as users are forced to continuously log into the various systems they require. This is one of the most common issues businesses encounter. Auditing capabilities can also be an issue when it comes to cloud-based services or infrastructure. Whilst most cloud solutions will provide a level of auditing capabilities, they can be quite limited and will require the administrator to access each service or platform to run reports. Again, this is a very time-consuming exercise, particularly if an organisation has multiple solutions requiring auditing. Risks and threats Many cloud providers have their own security controls in place to protect their services. However, it is the responsibility of the business to protect their own data in the cloud. 
As such, the security controls provided to an end user are usually limited and, in some instances, simply do not exist. Some of the most common risks to cloud-based services can be overcome by ensuring an IAM solution is in place. The most common risks which can be reduced through an IAM solution are as follows:

- Poor identity and access governance and management, from weak passwords to poorly managed joiners/movers/leavers processes
- Data breaches caused by poor credential and identity management and procedures
- Insecure user interfaces and APIs, with a lack of provisioning, authentication and user activity monitoring
- Compromised accounts, with attackers gaining user credentials and poor password expiration policies
- Insider threats, with no user auditing capabilities, poor user authorisation policies and a lack of segregation of duties

While an IAM solution provides the ability to reduce these risks and threats, unless it is combined with a mature strategy and the correct processes and procedures, the reduction in risk will be far smaller.

Securing and reducing risk

The main challenges of moving services or infrastructure to the cloud can all be remediated, or at least reduced greatly, by implementing an IAM solution. An IAM solution combined with an enterprise Single Sign-On (SSO) solution will provide an organisation with the following benefits:

- Centralised management, visibility and control
- Increased security hardening
- Consolidation and control of access privileges
- Automation of all user lifecycle processes
- Demonstrable compliance adherence

It’s important to note that implementing these solutions without proper planning and the underlying policies and procedures in place will result in poor adoption and issues with deployment and configuration, and the programme will ultimately fail.

Policy is key

One of the first actions an organisation should carry out when taking its first steps towards the cloud, or working to mature its existing cloud access controls, is to evaluate any existing policies currently in place governing the acquisition of new infrastructure or software and the management of user credentials. In most cases, these will not provide sufficient governance for cloud infrastructure and software, as usually they will have been developed for internal systems and user management only. Ensuring the correct policies and procedures are in place for the cloud is vital and this should be a priority for any organisation looking to mature its cloud access. This can be achieved by updating the existing procedures to include the cloud or creating a new set purely to govern the cloud. There are good arguments for both approaches but, providing it is done correctly, it makes little difference to the effectiveness. A mature policy should consider not only the provisioning and de-provisioning of users but also look at the impact of any potential compliance requirements the organisation may be required to adhere to.

Simplifying access

Refining or creating policies to cover cloud-based software and/or infrastructure will provide a business with the governance needed to ensure the risk exposure is reduced. Nevertheless, these will not provide any additional security or assist with controlling user access. This is where additional tools can provide a huge amount of value, utilising an SSO solution such as Ilex International’s Sign&go, which supports federated SSO. The solution is very effective and can also provide cost savings to the organisation.
Implementing an SSO solution to integrate with an IAM solution, or even as a standalone, will provide a great amount of control over what users can access. The solution will also reduce the need for users to remember multiple credentials, and it will provide a single portal to manage and audit user behaviour and activity. Another benefit of implementing a single SSO solution, such as Sign&go, is the ability to add mobile access and data security, including total protection of corporate data.

Federated access

Another very effective expansion to a Single Sign-On solution is to introduce federated capabilities. Using a federated SSO solution allows a user to be allocated a single set of credentials which grants access to multiple accounts. This access is controlled by the Active Directory groups a user is a member of. By introducing federated sign-on, a business can expect to realise the following benefits:

- Reduced timescales for the rollout of cloud services
- Greater control over joiners/movers/leavers – access is automatically adjusted or decommissioned, reducing the overhead of accessing individual platforms/services
- Greater auditing capabilities
- Increased user productivity through reduced login times and fewer issues with credentials
- Reduced strain upon IT helpdesks

In short, an SSO solution utilising federation is quite simply a business enabler.

Conclusion

There is no doubt that moving to the cloud can cause concern and result in an increase in risk exposure, especially around the growing security challenges and the potential requirement for additional IT infrastructure spend. This said, a move to the cloud should be viewed as a business enabler, as the benefits can be great: a reduction in infrastructure maintenance costs, user enablement through greater collaboration capabilities, location flexibility, a better and more flexible user experience, automatic software updates in many cases and a reduction in CapEx. Combined, these will more often than not outweigh the investment in additional security measures should they be required, all the more so because those additional security controls can be applied elsewhere in the business, resulting in further increases in security throughout the organisation. The key considerations when moving to the cloud are to evaluate and understand the gaps in existing processes, policies and procedures, the potential need for additional security controls, and the requirement for detailed planning and project governance. If these key actions are carried out, it will ensure any adoption of cloud services or infrastructure is a success.

### Secure use of USBs found by Honeywell

Secure Media Exchange (SMX) provides a simple, safe way for industrial plants to use USB-removable media and manage USB ports, helping protect industry and critical infrastructure from USB-borne threats. Honeywell (NYSE: HON) Process Solutions (HPS) today announced a new solution for industrial sites as they balance productivity and cyber security demands. Honeywell’s new Secure Media Exchange (SMX) protects facilities against current and emerging USB-borne threats, without the need for complex procedures or restrictions that impact operations or industrial personnel.
Malware spread through USB devices – used by employees and contractors to patch, update and exchange data with onsite control and computer systems – is a key risk for industrial control systems. It was the second leading threat to these systems in 2016, according to BSI publications, and uncontrolled USBs have taken power plants offline, downed turbine control workstations, and caused raw sewage floods, among other industrial accidents. “Industrial operators often have hundreds or thousands of employees and dozens of contractors on site every day,” said Eric Knapp, Cyber Security chief engineer, HPS. “Many, if not most, of those rely on USB-removable media to get their jobs done. Plants need solutions that let people work efficiently, but also don’t compromise cybersecurity and, with it, industrial safety.” Currently, many plants either ban USBs, which is difficult to enforce and significantly reduces productivity, or rely on traditional IT malware scanning solutions, which are difficult to maintain in an industrial control facility and provide limited protection. These solutions fail to protect process control networks against the latest threats and offer no means to address targeted or zero-day attacks. [easy-tweet tweet="There is a continuous increase in cyber threats around the world" hashtags="Cyber threats, security "] “SMX is a great example of Honeywell’s major investments in new industrial cyber security technologies, products, services, and research which further strengthen our ability to secure and protect industrial assets, operations and people,” said Jeff Zindel, vice president and general manager, Honeywell Industrial Cyber Security. “With the continued increase in cyber threats around the world, Honeywell’s industrial cyber security expertise and innovation are needed more than ever for a smart industry, IIoT and critical infrastructure protection.” Honeywell’s SMX was developed by the company’s cyber security experts based on field experience across global industrial sites and feedback from Honeywell User Group customers. Honeywell has one of the largest industrial cyber security research capabilities in the process industry, including an advanced cyber security lab near Atlanta. Honeywell also partners with cyber security leaders, including Microsoft, Intel Security and Palo Alto Networks, among others, to develop new, highly-effective industrial threat detection techniques. [easy-tweet tweet="Honeywell’s SMX provides hassle-free, multi-layered protection for managing USB security" hashtags="USB, Security "] Honeywell’s SMX provides hassle-free, multi-layered protection for managing USB security, letting users simply plug in and check devices for approved use in the facility. Contractors “check-in” their USB drive by plugging it into an SMX Intelligence Gateway. The ruggedized industrial device analyses file using a variety of techniques included with Honeywell’s Advanced Threat Intelligence Exchange (ATIX), a secure, hybrid cloud threat analysis service. SMX Client Software installed on plant Windows devices provides another layer of protection, controlling which USB devices are allowed to connect, preventing unverified USB removable media drives from being mounted, and stopping unverified files from being accessed. SMX also logs USB device connectivity and file access, providing a valuable audit capability. “For most plants, the proliferation of removable media and USB devices is unavoidable, but the security risks they bring don’t have to be,” said Knapp. 
“We know our customers have limited resources to maintain another system, so Honeywell manages SMX for them. SMX never connects to our customers’ process control networks. From a system administration perspective, it’s like it’s not even there.” Managed and maintained directly by Honeywell, SMX provides an easy and secure solution to USB security in industrial plants. It helps prevent the spread of malware through removable media; stops unverified files being read by Windows hosts; and, through the private ATIX connection, provides continually updated threat information and advanced analytics to help detect advanced, targeted, and zero-day malware. Honeywell is the leading provider of industrial cyber security solutions that protect the availability, safety and reliability of industrial facilities and help securely deploy Industrial Internet of Things (IIoT) technologies. Honeywell’s complete portfolio includes cyber security technology solutions, managed industrial cyber security services, and professional cyber security field services. The portfolio leverages the company’s industry-leading expertise and experience in process control. ### Why You Should Avoid Email and Use Secure Cloud Networks In 2015, there were a record-breaking nine mega breaches, and 3.1 billion private records were leaked. Many of these were a result of email-targeted cyber attacks such as trojans and ransomware. Why Isn’t Email Secure? When email was invented in the 1970s, it wasn’t created with security in mind. Although technology has advanced tenfold since its invention, email still lacks the basic safeguards that many businesses require when sending personal information or confidential data. There are a number of threats which endanger your email communications, including: Malware - According to eWeek, between 2% and 4% of emails contain a virus. That equates to more than 6 million infected emails being sent and received every day. Engaging with this dangerous correspondence could trigger a ransomware attack and leave businesses frozen out of their own systems until hackers are paid thousands of pounds to unlock them. Easy Interception – It is extremely easy to intercept email communications. Furthermore, if messages and their attachments are not encrypted, much of the information can be read without difficulty. Many email providers do offer encryption options, but these are not enabled as standard and work on an opt-in basis, so many users do not realise they are leaving their communications exposed to anyone with the capability to see them. Spear Phishing – 95% of data breaches start with a spear phishing attack, according to the SANS Institute. Spear phishing is so effective because it appears so real. Fake emails masquerade as messages from legitimate contacts as a way for hackers to sneak their way into your systems. A simple click within one of these emails can cause all kinds of issues, with the goal of many hackers being to steal your personal or financial information. Because these emails are sent only in small volumes, they often slip past anti-spam and anti-virus protection systems, which only amplifies their apparent ‘authenticity’. Human Error – Security breaches caused by human error can broadly be categorised under two headings: disgruntled employees or ex-employees, and careless or uneducated staff. No one is perfect and we all make mistakes, but a lack of online security awareness and training could become a company’s biggest pitfall.
According to recent reports: [easy-tweet tweet="93% of data protection breaches are caused by human error." hashtags="Data, Security "]
- 58% of senior managers have accidentally sent sensitive information to the wrong person; on average, 25% of workers overall have done this.
- 87% of senior managers regularly upload work files to a personal email or cloud account.
- 93% of data protection breaches are caused by human error.
- Two-thirds of employees believe that their organisations can improve their cyber security.
Lacking Compliance – Depending on the industry, organisations may have specific protocols and regulations they must abide by to maintain the safety of sensitive data such as financial information or medical details. A basic email provider cannot deliver this and is therefore unable to meet these imposed specifications. Missing Authentication – Authentication tools act as a second layer of defence, restricting access to data so that only the intended reader can see it. Standard email services do not offer two-factor authentication options, so if you leave your inbox open on an unattended computer or your email is intercepted, anyone can read its contents. A Solution to Your Email Security Issues To learn more about email’s safety flaws and what you can do to secure your systems, download your free copy of ‘The Solution to the Email Security Problem’ - a comprehensive guide by Maytech. ### Digital Transformation: A Checklist for Your Strategy Digital transformation is a term on everyone’s agenda right now, with organisations scrambling to adopt every new approach they hear about. This, despite ongoing confusion around what digital transformation is and whether businesses need it. At its core, digital transformation means using digital technologies to maximise business efficiency. From your employee onboarding process through to your supply chain management, it’s about finding the best way for technology to drive your business outcomes – for your customers, your people, and your partners. Digital transformation requires a shift in mindset. IT has historically been a support function, located somewhere in the basement. Today – where all businesses are technology businesses whether they realise it or not – tech has become the enabler. Look at how disruptive brands like Netflix and Deliveroo have essentially applied technology to existing industries and opened up new markets. In the modern landscape of IoT, AI, and robotics, previously “transformational” technologies like cloud and big data have become fundamental. But as the innovative layer of tech quickly becomes the norm, how do organisations ensure they are staying current with their digital transformation? [easy-tweet tweet="To unlock the potential of digital transformation, the cloud is vital." hashtags="Cloud, Digital "] The cloud transition To unlock the potential of digital transformation, the cloud is vital. The last couple of years have seen even late adopters park their concerns around public cloud as confidence in it grows. Flexibility, cost efficiency, collaboration and security are all better enabled via the cloud, so getting this strategy right at the outset is the first step towards digital transformation. Think like a start-up We see so many larger businesses moving like tankers when it comes to digital transformation. They’re tied into long procurement cycles, arcane processes and heritage infrastructure.
By streamlining processes and bringing more agility into your business via apps like Slack, and automation glue like Zapier, you create a new working culture. It’s about opening up your business operation using technology. Pointless work, manual entry and paperwork are ruthlessly eradicated in the start-up environment, with focus directed towards outcomes. Your business should do the same. A customer-centric ecosystem Digital transformation requires a new way of working right across your entire business ecosystem. Treat your suppliers and partners in the same way you do your customers. Structure data effectively between your organisations and tighten integration of your systems to enable closer collaboration and more effective delivery. This requires groundwork, but your suppliers will thank you in the long term. [easy-tweet tweet="Organisation needs to have a solid identity infrastructure in place" hashtags="Identity, Cloud "] Security and digital identity No digital transformation can be complete without this. Your organisation needs to have a solid identity infrastructure in place, with strong encryption and two-factor authentication at a minimum. Have an eye on GDPR too – 2018 isn’t far away – and stay close to continually moving regulations. Consider SaaS Platforms SaaS platforms, chosen carefully, can bring huge benefits to the way you deal with business data in your organisation. Businesses work better when colleagues enjoy a culture of sharing and transparency, and good SaaS platforms can act as an enabler for this, both on their own and via integration with other systems. You can be confident that your data is being taken care of by experts in that area, and a good SaaS provider will always provide a way for you to extract your data in the event that you want to leave their platform. Streamline everything Digital transformation demands a long-term view. Analyse your business processes and don’t jump at the temptation to slavishly automate everything. Instead, look at inputs and outputs, tie your approach back to your original goals, zoom out and revisit your “why”. You may have processes in place that are working for you. Stick to these. At the same time, be prepared to scrap the ones that aren’t. Which brings me to my final point… Cut the ties with sunk costs [easy-tweet tweet="Change is hard and, with technology, the stakes will always be high." hashtags="Technology, Cloud "] One of the biggest barriers to transformation is the fallacy of sunk costs. Businesses are so wedded to legacy systems that they can’t see how these are holding them back, possibly by years. This demands bold leadership and a willingness to cut your losses early. We often see businesses commit thousands of consultancy hours to making an off-the-shelf system work. Get clarity. Know what something can and can’t do and its place within your long-term overall strategy. As DevOps leaders, we see the tendency to be protective about an existing tech infrastructure all the time, driven by uncertainty and an understandable focus on uptime, stability, and security. Ensure the agenda you respond to is that of the organisation as a whole: the bigger vision that puts customers at the heart of a more flexible, scalable and responsive infrastructure. Change is hard and, with technology, the stakes will always be high. Digital transformation demands that you as a business and technology leader think beyond the project mindset.
This isn’t a one-off; rather, it’s an ongoing process to optimise your organisational performance potential. It’s where businesses think they’ve reached the end goal, down tools and go back to business as usual that transformation fails. Instead, see this as a journey: the cultural shift that drives a shared commitment to continuous improvement and enhanced performance right across your business. This will be your transformation. ### Teradata Finds Half of Business Critical Data is Moving to the Cloud Security is the number one concern amongst senior business executives when it comes to cloud data adoption, yet more than half of business-critical data is likely to reside there by 2019. [easy-tweet tweet="Eight out of ten executives cite security as a concern when storing data in the cloud" hashtags="Cloud, Security"] According to the findings of a new study released today by Teradata, the leading data and analytics company, eight out of ten executives cite security as a concern when storing data in the cloud, but that has not stopped a huge rise in the amount of critical data being sent there. Looking ahead to 2019, executives predict that over half of IT (56%), customer (53%), and financial data (51%) will reside in the cloud. The study highlights that although businesses want to invest in cloud storage and many plan to do so over the next two years, there are real concerns about security of information and data breaches. A number of key trends were identified:
- Security and lack of control are the top obstacles holding back storage of critical data in the cloud: 40% of respondents say general security is a risk, while 25% of respondents believe cloud data adoption will result in more security breaches. A quarter of those surveyed believe cloud data adoption will result in a lack of control. 52% say executive buy-in to data in the cloud is holding them back, and 22% struggle with the additional staffing needed to move to the cloud.
- Yet cloud storage is set to rise sharply in the next two years: although 58% of data from organizations surveyed already sits in the cloud today, usage of cloud will increase over the next two years, with three in ten companies globally predicting a significant increase of data in the cloud by 2019.
- The majority of legal data is being held back from the cloud: legal data is being kept on physical servers for the most part, with just 27% of businesses surveyed expecting to move their legal data to the cloud in the next two years.
- Telecoms is making the biggest commitment to the cloud, with marcomms making strong moves within organizations: 48% of organizations surveyed anticipate a significant increase in their cloud storage use by 2019, and 48% of marketing and communications departments within organizations surveyed will increase data stored in the cloud by 2019.
- Healthcare will move its customer data to the cloud, whilst utilities will prioritize IT and R&D data: 59% of respondents highlighted that the healthcare industry will move customer data to the cloud in the next two years, while utilities will prioritize moving IT infrastructure (64%) and R&D/Engineering data (52%).
Marc Clark, Director of Cloud Strategy and Deployment at Teradata, said: “Our message to organizations around the world is that the cloud is actually one of the most secure means of virtual storage available. While our study finds widespread concerns, the fact is that cloud storage is growing rapidly, remains hugely cost-effective, and that there are ways to manage it securely.
“Cloud computing security processes should be designed to address the security controls that the cloud provider will incorporate, in order to maintain data security, privacy and compliance with the necessary regulations, as well as providing a business continuity and data backup plan. “By identifying the barriers within the business that hinder further adoption, as well as where cloud storage is creating positive opportunities, we aim to provide reassurance that cloud storage is a safe and cost-effective way to store company information.” ### Amazon Web Services Announces the Opening of Data Centers in Sweden in 2018 New AWS Infrastructure Region will enable customers to run workloads in Sweden and serve end-users across the Nordics with even lower latency Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), today announced that it plans to open an infrastructure region in Sweden in 2018. The new AWS EU (Stockholm) Region will comprise three Availability Zones at launch. Currently, AWS provides 42 Availability Zones across 16 infrastructure regions worldwide, with another five Availability Zones, across two AWS Regions in France and China, expected to come online this year. For more information on AWS’s global infrastructure, go to https://aws.amazon.com/about-aws/global-infrastructure/. “For over a decade, we’ve had a large number of Nordic customers building their businesses on AWS because we have much broader functionality than any other cloud provider, a significantly larger partner and customer ecosystem, and unmatched maturity, reliability, security, and performance,” said Andy Jassy, CEO, AWS. “The Nordics’ most successful startups, including iZettle, King, Mojang, and Supercell, as well as some of the most respected enterprises in the world, such as IKEA, Nokia, Scania, and Telenor, depend on AWS to run their businesses, enabling them to be more agile and responsive to their customers. An AWS Region in Stockholm enables Swedish and Nordic customers, with local latency or data sovereignty requirements, to move the rest of their applications to AWS and enjoy cost and agility advantages across their entire application portfolio.” [easy-tweet tweet="The AWS investment in Sweden will strengthen our position in the global digital shift. " hashtags="Digital, AWS"] This announcement has been welcomed by Sweden’s Innovation and Enterprise Minister, Mikael Damberg, who described the decision as good news for the country. “I am very happy to welcome AWS to Sweden. Their decision to establish a new region in our country is a recognition of Sweden’s competitive position within the European Union (EU), with the highest levels of renewable energy in the power grid in the EU, as well as a world-leading digital infrastructure and IT industry,” said Mr Damberg. “The AWS investment in Sweden will strengthen our position in the global digital shift. For us, trade in a modern globalised economy is not only about goods, but also about services, sharing of knowledge, and the free flow of data.” AWS has been steadily increasing its investment in the Nordics to serve its growing base of customers. In 2011, AWS opened a Point of Presence (PoP) in Stockholm to enable customers to serve content to their end users with low latency.
In 2014 and 2015 respectively, AWS opened offices in Stockholm and Espoo, Finland. Today, AWS has teams of account managers, solutions architects, business developers, partner managers, professional services consultants, technology evangelists, start-up community developers, and more, helping customers of all sizes as they move to AWS. When launched, the AWS EU (Stockholm) Region will enable organisations to provide even lower latency to end users across the Nordics. Additionally, local AWS customers with data sovereignty requirements will be able to store their data in Sweden with the assurance that their content will not move unless they move it. Organisations across the Nordics – Denmark, Finland, Iceland, Norway, and Sweden – have been increasingly moving their mission-critical applications to AWS. Hot Nordic startups like Bambora, iZettle, Quinyx, Tidal, Tink, Tradeshift, Trustpilot, and Vivino, as well as leading gaming firms like King, Mojang, and Supercell have built their businesses on top of AWS, enabling them to scale rapidly and expand their geographic reach in minutes. Enterprises in the Nordics, like ASSA ABLOY, Finnair, F-Secure, Gelato, Husqvarna, IKEA, Kesko, Modern Times Group, Nokia, Sanoma, Scania, Schibstedt, Telenor, Wärtsilä, WirelessCar (Volvo), WOW Air, and Yleisradio (Yle), are also using AWS to drive cost savings, accelerate innovation, and speed time-to-market. One Nordic enterprise that is using AWS to innovate is ASSA ABLOY, the global leader in door opening solutions, with over 47,000 employees and annual sales of USD $8.3 billion worldwide. The company’s hospitality division is using AWS to power their Mobile Access solution, allowing hotels to offer their guests the ability to check-in via mobile phone and to use their device as their room key. “AWS has been critical to our ability to innovate and quickly develop new solutions for our customers,” said Jan Hedström, Cloud Operations Manager at ASSA ABLOY. “With AWS, we have been able to develop and test new business ideas the same day they come up, allowing the project teams to build fruitful partnerships with internal stakeholders, such as business owners and project managers. This also allows us to bring new services, such as our Mobile Access solution, to customers worldwide, such as major hotel chains, who are using the solution to enhance their guests’ stay.” Another well-known Nordic enterprise using AWS to transform their organisation is Scania, a world leading manufacturer of heavy vehicles. Scania has a long history of research and development and is bringing advanced technologies to their trucks, buses, coaches, and engines to help them reach their goal of becoming the leader in sustainable transport solutions. Scania is training hundreds of their employees on the latest AWS technologies, enabling them to develop and build reliable, secure, and scalable solutions quickly. Scania is now planning to use AWS for their connected vehicle systems, allowing truck owners to track their vehicles, collect real-time running data, and run diagnostics to understand when maintenance is needed, reducing vehicle downtime. ”In a connected world, a flexible, scalable, and reliable cloud infrastructure, such as what we get from AWS, is critical for our ability to develop, experiment, innovate, and stay ahead of the competition,” said Michael Müller, Director Infrastructure Services, at Scania. “For us, the cloud is more than technology; it is a fundamental part of our strategy moving forward. 
Through connectivity, powered by the cloud, we can enhance vehicle performance and customer profitability. This makes AWS much more than just a service provider; they are an important business partner supporting our future growth.” [easy-tweet tweet="With AWS’s pay-as-you-go pricing, we will be able to work out the precise cost of each of our services" hashtags="AWS, Cost"] WirelessCar, part of the Volvo Group, is a Swedish company using AWS to innovate quickly. WirelessCar connects two million vehicles in more than 35 countries. It moved its delivery engine, a test environment for WirelessCar software, from its on-premises data centre to AWS. WirelessCar delivers global, high quality, and secure services, that enable vehicles to respond to remote user commands, intelligently plot routes, and communicate with the manufacturer about service requirements. “We run our entire delivery engine on AWS, from test and development through to deployment,” said Martin Rosell, Managing Director at WirelessCar. “Using AWS, we can provide our developers with a more flexible, agile, and scalable platform. Before moving to AWS, we added major features only a few times per year. Now, we can add features on-demand and in continuous deployment mode. This rapid pace of innovation has allowed us to scale our business according to customer needs and has also brought greater transparency to our operations. With AWS’s pay-as-you-go pricing, we will be able to work out the precise cost of each of our services and make better, more informed business decisions as a result.” As well as innovative enterprises, many of the most well-known and fastest-growing startups in the Nordics are using AWS to build and rapidly expand their businesses around the world. One former startup which is now a leading mobile gaming company, Finland-based Supercell, used AWS to run their business from day one. Supercell is ‘all-in’ on AWS and their games – Boom Beach, Clash of Clans, Clash Royale, and Hay Day – make use of almost every service available on AWS and attract more than 100-million players on iOS and Android devices every day. “Using AWS we have been able to focus our resources on improving the gaming experience for our players, instead of wasting time on procuring and maintaining the infrastructure,” said Sami Yliharju, Head of IT Infrastructure at Supercell. “Our games generate about five terabytes of log data every day, which we analyse to better understand player behaviour and improve the game experience. This would be very hard to maintain with an on-premises setup, especially as we have seen explosive growth. With AWS doing the heavy-lifting, we have been able to iterate faster to quickly and seamlessly deliver new games and features to our millions of users around the world.” Trustpilot, another Nordic startup, leveraged AWS to expand its business globally. The company provides over 150,000 e-commerce businesses, across 24 countries, with TrustScores. A TrustScore is a consumer rating that is based on more than 27 million online reviews. “Trustpilot chose to go ‘all-in’ on AWS from day one to enable us to support our rapid growth,” said Rudy Martin, VP of Operations at Trustpilot, which serves over 1.6 billion website impressions per month. “Retail is a seasonal business which experiences peaks during the holidays, such as Christmas and Easter, and when there are special events, such as Black Friday sales. 
These peaks are reflected in the increased demand for reviews to help the consumer make an informed purchasing decision. AWS enables us to efficiently scale our infrastructure to manage these peaks in traffic, which can sometimes exceed 250 percent of our usual daily traffic load.” In addition to established enterprises and rapidly growing start-ups, AWS also has a vibrant ecosystem in the Nordics, including partners that have built cloud practices and innovative technology solutions on AWS. AWS Partner Network (APN) Consulting Partners in the Nordics helping customers to migrate to the cloud include Accenture, Capgemini, Crayon Group, CSC, Cybercom, Dashsoft, Enfo Group, Evry, Jayway, Nordcloud, Proact IT Group, Solita, Tieto, Wipro, and many others. APN Technology Partners and Independent Software Vendors (ISVs) in the Nordics using AWS to deliver their software to customers around the world include Basware, eBuilder, F-Secure, Queue-it, Xstream, and many others. For the full list of the members of the AWS Partner Network, please visit: https://aws.amazon.com/partners/ ### Skills: The Solution to Optimising Operations on the Cloud Cloud computing has been revolutionising the IT environment for businesses and their workforces, enhancing the efficiency of operations, increasing the opportunities for flexible working, and exponentially expanding access to storage and compute capacity. Despite the benefits of working in the cloud, businesses still need to be aware of potential pitfalls and acknowledge that it makes them more vulnerable to cyber security issues. As such, organisations must recognise that investing in the cloud requires heightened security measures within their IT functions, as well as a balance of the right skills needed to thrive. Prepare for challenges of the future Looking ahead, there is a serious need for businesses to equip themselves to deal with technology disruptors that are unexpected and unpredictable. Networkers’ Voice of the Workforce research, a survey of over 1,600 tech professionals which examines confidence in the sector, shows that many IT professionals currently believe their organisations are neither prepared nor agile enough for the challenges of the future, such as the internet of things, big data and, most of all, cyber security. Also, only 34% of tech professionals believe their companies are proactive in implementing changes that will allow them to prosper in the future technology landscape. Acknowledging the risks that operating in the cloud poses, and preparing to mitigate those risks, is the first step to operating securely on the cloud. The second is bolstering businesses’ defences, and advancing skills and operations to fully embrace the cloud as fundamental to computing operations. [easy-tweet tweet="Data breaches are potentially the most damaging security issues for companies" hashtags="Cloud, Data"] Improve IT processes Data breaches are potentially the most damaging security issues for companies operating on the cloud. These could be avoided by implementing the right modelling applications to protect IT infrastructures and undertaking thorough testing to determine the strength of your business’s data defences, in order to understand the next steps needed to optimise security. In addition, businesses can help mitigate cyber security and data hacking risks by ensuring they have extensive data backup procedures in place, which will help workforces to operate efficiently within the cloud.
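Extensive backup procedures are only useful if something regularly confirms that backups actually arrived and are intact. Purely as an illustration, and not tied to any product mentioned in this article, the Python sketch below checks that the newest backup archive in a hypothetical directory falls within an agreed age window and that its SHA-256 digest matches a sidecar manifest; the path, file naming convention and 24-hour window are all assumptions you would replace with your own.

```python
#!/usr/bin/env python3
"""Minimal backup-freshness and integrity check (illustrative only).

Assumes backups are visible as files in BACKUP_DIR (for example a mounted
or synced copy of a cloud backup target) with a sidecar .sha256 manifest.
The directory, file layout and 24-hour window are hypothetical.
"""
import hashlib
import sys
import time
from pathlib import Path

BACKUP_DIR = Path("/mnt/cloud-backups/finance-db")   # hypothetical location
MAX_AGE_HOURS = 24                                    # hypothetical recovery point objective


def newest_backup(directory: Path) -> Path:
    """Return the most recently modified backup archive in the directory."""
    candidates = sorted(directory.glob("*.tar.gz"), key=lambda p: p.stat().st_mtime)
    if not candidates:
        sys.exit(f"FAIL: no backup archives found in {directory}")
    return candidates[-1]


def check_freshness(backup: Path) -> None:
    """Fail if the latest backup is older than the agreed window."""
    age_hours = (time.time() - backup.stat().st_mtime) / 3600
    if age_hours > MAX_AGE_HOURS:
        sys.exit(f"FAIL: latest backup {backup.name} is {age_hours:.1f}h old")
    print(f"OK: {backup.name} is {age_hours:.1f}h old")


def check_integrity(backup: Path) -> None:
    """Compare the archive's SHA-256 digest with its recorded manifest value."""
    manifest = backup.with_suffix(backup.suffix + ".sha256")
    if not manifest.exists():
        sys.exit(f"FAIL: no checksum manifest for {backup.name}")
    expected = manifest.read_text().split()[0]
    actual = hashlib.sha256(backup.read_bytes()).hexdigest()
    if actual != expected:
        sys.exit(f"FAIL: checksum mismatch for {backup.name}")
    print(f"OK: checksum verified for {backup.name}")


if __name__ == "__main__":
    latest = newest_backup(BACKUP_DIR)
    check_freshness(latest)
    check_integrity(latest)
```

Scheduled alongside the backup job itself, a check along these lines flags a stale or corrupt backup long before anyone actually needs to restore from it.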
The progression of automated infrastructure brings the evolution of an existing buzzword - ‘DevSecOps’ - putting security firmly in the spotlight for both developers and operations engineers. Automation removes some of the risks of human error, but only if a solid process for the sign-off of code is supported, with many tools available which allow for accountability and change tracking as code evolves. Instigating a ‘DevSecOps’-focused approach, uniting development and operations teams with the wider work of IT security professionals, will help contribute to effective security management when working within the cloud. [easy-tweet tweet="57% of tech professionals believe there is a skills shortage in the sector" hashtags="Cloud, Skills"] Upskill to create a security-conscious culture Our research showed that 57% of tech professionals believe there is a skills shortage in the sector. Furthermore, it should be a concern for businesses that demand for security and cloud skills is increasing exponentially and far outstrips availability. With companies continuing to move to cloud-based services, we will see an increase in demand for skills in developing areas such as mobile device management, multi-factor authentication, specialist and off-the-shelf networking and systems monitoring tools, and vulnerability assessment and penetration testing. To succeed, organisations will need to train and upskill their existing staff to ensure they have an elevated awareness of security practices. Educating and engaging your user base about threats, how to spot them and how to report them, will help to protect organisations from a nasty attack, or could even stop damaging attacks in their tracks. Now, companies are increasingly cross-training and retraining their existing staff, taking advantage of new cloud security certifications, such as the Certified Cloud Security Professional (CCSP) and the Microsoft and AWS certification tracks, in which security training is embedded throughout. Businesses are also building their own teams with the skills, processes and best practices in place to provide secure infrastructure. Instilling knowledge and awareness throughout all levels of the business will be the most effective way to ensure that all employees feel they are appropriately equipped to deal with change. In addition to offering insightful training to existing staff, organisations should look to hire IT professionals with knowledge of cloud implementation and awareness of security practices. Recruiting externally for a blend of permanent and contract resource will be advantageous in filling the knowledge gaps and bringing innovative ideas and new perspectives to revitalise your workforce and security measures. Ultimately, as the cloud becomes increasingly fundamental to organisations, we need to make sure our businesses are both secure and flexible enough to adapt to change, and that our workforces have an informed understanding of the cloud and the associated cyber security risks at every level of the business. This agility and preparedness will ensure that businesses keep their competitive edge, and only the best-equipped will succeed. ### The Government Digital Strategy - Is it Time to Change? Karen Bradley MP, Secretary of State for Culture, Media and Sport, recently launched the Government Digital Strategy, which aims to ‘create an economy which is resilient to change and fit for the future’, as the UK prepares to exit the European Union.
Bradley addressed two key themes in her speech at the Strategy’s launch: one being the need to get the digital economy on solid ground prior to Brexit, and the other being the growth of digital skills in the UK population. Of course, the success of the former depends largely on the progress of the latter. [easy-tweet tweet="90 percent of all jobs will require an element of digital skills within the next 20 years" hashtags="Digital, Technology "] Boosting the nation’s digital prowess is particularly pressing given that it has been predicted that 90 percent of all jobs will require an element of digital skills within the next 20 years. There has been a great deal of concern recently about the UK’s digital skills gap, and it is positive to see policy makers working to address the issue. The Digital Strategy promises to ‘establish a new Digital Skills Partnership, working together with partners who are passionate about making a difference and who share our ambitions to tackle the digital skills gap.’ This is certainly to be welcomed, as the industry has much to offer government when it comes to developing digital skills. The Future Digital Inclusion programme (run by the Good Things Foundation and funded by the Department for Education) is a prime example; the initiative reached, supported and trained more than 270,000 people in basic digital skills. Those ‘left behind’ by the digital era are also now being supported by private sector organisations investing in digital skills development. As an example, Barclays has rolled out its Digital Eagles programme, which offers training in digital banking along with a range of other online skills. For the government to truly grow the digital economy and ensure that as many citizens as possible can contribute to this, it must collaborate with industry to provide integrated training programmes. While Britain is a digital world leader, there is no room for complacency, and we need to ensure that the UK continues to develop the skills it needs to stay at the forefront of the digital revolution. This is particularly important as Brexit looms, with many voicing their concern over the implication of immigration curbs compounding the digital skills shortage. In the face of Brexit, businesses and policy-makers must work together to support the UK’s burgeoning digital sector in accessing the right skills from around the world, while fostering homegrown expertise. [easy-tweet tweet=" Chancellor Philip Hammond announced a £300 million allocation for supporting research talent" hashtags="Data, Strategy "] In the recent Spring Budget, Chancellor Philip Hammond expressed the government’s intention to ‘make Britain the most attractive place to start and grow a business’, which of course relies upon a suitably tech-savvy workforce to recruit from. To this end, the Chancellor announced a £300 million allocation for supporting research talent, including 1,000 PhD places in Science, Technology, Engineering and Mathematics (STEM) subjects. This is certainly to be welcomed, as it indicates the government’s recognition of the enormous role technology plays in our economy. The new Digital Strategy should strengthen the resolve of both government and industry bodies in future-proofing our economy by investing in progressing our collective digital skills. With so much uncertainty in the run-up to Brexit, collaborating to meet this core objective is more important than ever before.
### New AdNet - Better Service According to a study conducted by the EU, as part of the DigiCom program, it is estimated that the demand for IT practitioners in the European Union will reach 750,000 by 2020. The tremendous pace at which the demand for e-skills has been growing will either entail the outsourcing of certain IT activities to third countries (India, the US, etc.) or generate investments aimed at the professional development of employees, in order to close this gap. [easy-tweet tweet="750,000 IT specialists needed in the EU by 2020." hashtags="IT, EU"] Romania has around 14,000 companies active in the IT&C field and approximately 100,000 employees. Taking advantage of its solid positioning, Romania exported IT services amounting to 1.5 billion EUR in 2015 (approximately 10% of total services exported). How did we manage to achieve such outstanding results? Three key factors have positively influenced this evolution: the excellent professional training of employees, major investments in IT infrastructure (data centres, fibre optic backbones, research laboratories, etc.) and the flexibility of Romanian companies in developing customised solutions for their clients. Romania is ready to absorb part of these projects, as part of its strategy to invest strongly in the future. Here at AdNet Telecom, we have also made a number of investments in the past two years which we hope will position us as a leader in the field of data centre and cloud services. AdNet Telecom opened the first modular data centre in Eastern Europe, using state-of-the-art technologies for maximum energy efficiency. Our goal is to reach a PUE of 1.3 at full capacity. What advantages does this offer our clients? An energy-efficient data centre helps companies implement their green development strategies, as well as reduce colocation costs. Based on this concept, we opened a second location in Cluj-Napoca (labelled Romania’s Silicon Valley) in order to better cater to the local market, as well as to enable us to provide backup, disaster recovery and business continuity services (we also provide office space for our clients), exclusively based on our own infrastructure. And because the data centre and cloud service market is highly receptive to our solutions, we will soon open a third data centre. [easy-tweet tweet="AdNet Telecom offers virtualisation solutions using VMware." hashtags="Telecom, IT"] AdNet Telecom offers virtualisation solutions using one of the most advanced platforms: VMware. Cisco UCS provides the processing power, while HPE 3PAR storage provides the speed and reliability of the platform. It is a highly flexible platform which allows us to provide the most efficient cloud solutions: IaaS, PaaS and DaaS. The DDoS protection system provided by Arbor Networks, together with the load balancing solution provided by A10 Networks, allows us to supply clean and quick connections to the services of the AdNet Telecom data centres. We have revised the range of services provided by AdNet Telecom, which has been designed to help companies improve and develop the IT infrastructure behind their business applications as easily as possible. The main objectives of the new products are maximum efficiency and flexibility. Our website, www.AdNettelecom.ro, has also been redesigned so that clients can easily access the products best suited to their needs, as well as place orders, thus reducing the delivery time of the product to the client to a minimum. We have also invested in human resources.
Any platform, regardless of its technical performance, is of little worth without people. People who know what solutions they can offer their clients in order to meet their needs, people who know how to respond to support questions, in an efficient and timely manner. We want to help our clients reduce their IT infrastructure costs and allow them to focus their resources on what matters most: their core business! ### Demonstrating the Benefits of Cloud Technologies The adoption of cloud technologies by business has increased dramatically in the last couple of years: a 2017 RightScale study found that 85% of enterprises are using a hybrid cloud strategy. Few businesses are operating solely in the cloud yet, as organisations blend public and private cloud services with more traditional on-premise solutions. This is likely to change over the coming years as over half of businesses (63%) intend to move their entire IT estate to the cloud in the future (CIF). But there are significant barriers preventing businesses from implementing more cloud solutions. The shift towards the cloud is widely regarded as a key element of digital transformation: the realignment or investment in new technologies that will improve a business’ customer experience and revolutionise the way it works. Digital transformation is more than just a buzzword; 4 out of the 10 current market leaders will be displaced by innovative and agile new businesses that focus on digital transformation, according to the Global Centre for Digital Business Transformation. For many, the move to the cloud is a natural first step on this journey. However, a recent Oracle study conducted with business leaders and IT decision-makers indicated they are facing several challenges when it comes to implementing cloud services. A key finding was that many organisations fail to align business goals with IT projects; 62% of respondents said this affected their ability to implement the cloud and other innovative technologies. [easy-tweet tweet="Transformational technology needs to be seen as an investment" hashtags="Technology, Cloud "] In order to make use of cloud technology, and indeed wider digital transformational technology, a business’ IT department must be repositioned. Rather than a necessary ‘break-fix’ solution as it once was, IT is now a strategic resource for businesses and should be the department driving digital innovation. Transformational technology needs to be seen as an investment, and one that can deliver a significant ROI, rather than merely a cost. For this to work the traditional IT Manager/Director roles will need to transform into that of the redefined CIO. This role will take in a wider business responsibility and understanding rather than simply keeping the lights on. Innovation and digital transformation will be expected of this role. IT departments often face resistance not only from business leaders but from wider business teams. Legacy systems and resistance from the teams that use them – largely HR and Finance departments, according to Oracle – present another barrier to integrating cloud services. While integrated cloud services would be an obvious solution, it’s not that easy for IT: 41% cited a lack of understanding around the need for cloud integration from departments outside of IT. So how can you help the departments stuck on legacy systems to understand the many benefits of the cloud? First and foremost, IT departments should have sole responsibility for all technology implementations in a business. 
Respondents cited managing shadow IT as a difficulty, whereby departments outside of IT purchase and implement systems and applications without the permission, and often knowledge, of IT. This leads to the aforementioned issue of departments on legacy systems. Even if those systems are more forward-thinking, the integration of other applications isn’t often taken into consideration like it would be if IT headed the project. By taking control of a business’s entire technology stack, IT can implement a solid technology strategy that encompasses all of the tools and systems necessary for the business. IT can then ensure the business has a base upon which to build innovative technologies – that often comes in the form of Infrastructure-as-a-Service (IaaS). Next, you should be able to demonstrate the benefits of the cloud. An immediate ROI can often be seen in reducing the number of servers your business requires, which in turn reduces the office space needed to store them. Take Office 365, for example. Some businesses that have upgraded to Office 365 from an older Exchange deployment have done so to move their emails from on-premise servers to the cloud. Not only does this free up rack space and virtual machine resources, but it solves the issue of bloated mailboxes, which can pose storage and backup issues and ultimately make systems sluggish. Office 365 is the perfect example of a hybrid solution. All of your applications are stored in the cloud, but the traditional Office suite, including Word, PowerPoint and Excel, can still be downloaded and used natively on an employee’s machine (depending on the plan selected). This offers the flexibility of the cloud, as workers can access their email, files and applications anytime and anywhere. Office 365 also means all employees will be on the same version of the Office suite, removing the headache of incompatible versions. There are significant benefits to using cloud-hosted Office 365, and that’s before you even get to its innovative tools that come at no extra cost. The cloud is lauded for its ability to improve communication and collaboration between colleagues – a key benefit recognised by respondents to the Oracle survey. Office 365 takes this to the next level with the recently launched Teams, a collaborative workspace that allows employees to engage in discussions, share and edit files and set up automated workflows in one centralised hub. [easy-tweet tweet="A Microsoft study found that using the cloud offered increased employee productivity." hashtags="Cloud, Microsoft"] Many business leaders are concerned about the extra costs associated with the cloud. The cloud needn’t be a costly investment, however – per-user-per-month subscriptions, offered by Microsoft for Office 365 and its cloud-hosted ERP and CRM app Dynamics 365, drive down costs and offer flexibility for growing businesses. Further, a Microsoft study with business leaders found that using the cloud offered increased employee productivity and significant cost savings. Your business might have already made moves towards the cloud – in fact, it’s very likely it has. It’s also very likely that you’re facing resistance from your wider organisation when it comes to implementing more cloud services. Attitudes towards IT departments need to change in order for cloud services, and wider digital transformation, to be taken seriously. Ultimately, these attitudes probably won’t change unless there is hard evidence of the cloud’s benefits.
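One practical way to produce that evidence is to put the per-user numbers side by side. The sketch below is a minimal, back-of-the-envelope comparison in Python; every figure in it (user count, licence price, hardware, running and admin costs) is a hypothetical placeholder rather than a real Microsoft or market price, and the point is simply to make the comparison explicit for your own stakeholders, whichever way it comes out.

```python
"""Rough annual cost comparison: cloud subscriptions vs an on-premises
server (illustrative figures only).

All prices, user counts and cost lines below are hypothetical
placeholders; substitute your own quotes and internal costs.
"""

USERS = 120
SUBSCRIPTION_PER_USER_PER_MONTH = 9.40   # hypothetical licence price (GBP)
ONPREM_HARDWARE_COST = 12_000            # hypothetical server refresh (GBP)
SERVER_LIFETIME_YEARS = 4
ONPREM_ANNUAL_RUNNING = 4_500            # hypothetical power, space, support (GBP)
ONPREM_ANNUAL_ADMIN = 6_000              # hypothetical patching/backup admin time (GBP)


def annual_subscription_cost() -> float:
    """Total yearly spend on per-user-per-month subscriptions."""
    return USERS * SUBSCRIPTION_PER_USER_PER_MONTH * 12


def annual_onprem_cost() -> float:
    """Hardware spread over its lifetime plus yearly running and admin costs."""
    return (ONPREM_HARDWARE_COST / SERVER_LIFETIME_YEARS
            + ONPREM_ANNUAL_RUNNING
            + ONPREM_ANNUAL_ADMIN)


if __name__ == "__main__":
    cloud = annual_subscription_cost()
    onprem = annual_onprem_cost()
    print(f"Cloud subscriptions per year:  £{cloud:,.0f}")
    print(f"On-premises estimate per year: £{onprem:,.0f}")
    print(f"Difference per year:           £{cloud - onprem:+,.0f}")
```

A spreadsheet does the same job; what matters is that licence spend, hardware refresh cycles and the often hidden cost of administering on-premises kit all appear on the same page.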
Office 365 might be part of your cloud solutions, or it might not yet be. With a potential ROI of 154% and initial investments paid back in 6 months, what better way to demonstrate the advantages of the cloud than through this innovative, reliable platform?   ### Now is the time to enter the Rural Business Awards 2017 Rural firms from across Britain are being invited to get their entries in for an awards scheme designed specifically for the countryside sector. Now in their third year, the Rural Business Awards (RBAs) are the only UK-wide business awards for rural enterprises. They are sponsored by Janine Edwards Wealth Management Ltd, Principal Partner Practice of St. James’s Place Wealth Management, and operate in partnership with the CLA. The RBAs are split into 13 categories including Best Rural Sporting Business – along with other important aspects of the countryside economy such as Best Rural Diversification Project and Best Rural Manufacturing Business. At the 2016 awards, the overall Champion of Champions was Landmark Computer Systems, from Sussex, which offers computerised solutions helping its agricultural customers manage their businesses effectively. Other victors included Lowe Maintenance Training from Settle in Yorkshire, which runs forestry courses and was a finalist in our Best Rural Professional Services Business category (which Landmark also won). Gravity Digital, a digital marketing firm from Derbyshire, was the winner for Best Rural Creative Business. The RBAs are the brainchild of Leicestershire businesswomen Anna Price and Jemma Clifford, who wanted to showcase the wealth of entrepreneurial talent in rural areas of Britain – a sector of the economy they felt was all-too-often overlooked in favour of large city-based firms. Awards co-founder Anna Price said: “We are so proud to be holding our Rural Business Awards for the third year running. Over the past two years, our judges have read about an extraordinarily diverse range of rural businesses in this category and in all the others, and we have been truly humbled by the amazing work that is being done all over the Great British countryside. “As for this year’s awards, we urge as many rural businesses to get their entries to us as soon as possible. Entry is so quick and simple that all businesses need to do is tell us what they do best, and we’ll do the rest!” So just what is a rural business? The CLA defines rural businesses as fitting into three broad categories: land-based businesses; land-related businesses and other businesses located in rural areas. The category list aims to draw together businesses from across these areas to acknowledge the breadth and depth of opportunity presented by the Great British countryside, as well as to celebrate the achievements of our rural businesses. Backed by a range of high-profile MPs, the awards are organised by rural businesses for rural businesses and judged by people who understand the rural sector, which is growing rapidly and employs in excess of 3.4million people in more than 600,000 businesses across the UK. With the partnership of the CLA plus the backing of previous winners, the awards founders feel they are once again positioned to generate considerable excitement, interest and pride in the achievements of the rural business sector. Nigel Parsons, from Landmark Computer Systems, explains why the company entered the awards and the benefits of having won, saying: “The RBAs are a tremendous way of engaging the whole company in a common goal. 
The amount of work involved to apply is well worth the experience of being able to benchmark your product or service, acknowledge and reward your staff and to involve your customers with the application process. If you are successful, then they have been part of the experience and of course they are why you are successful! “Internally, recruitment is enhanced for an award winner as any new member to a team can see that you have an industry-wide, national seal of approval. It gives you a point of difference and is easy to communicate through all mediums. The support and enthusiasm from the RBA team is engaging, contagious and ongoing. It is fun to be part of something in farming which is in its infancy and which is gathering momentum. “Unlike other award schemes the judges have no contact with the applicants. The judging system is well thought through and transparent. Size of business is irrelevant and the diversity of entrants is staggering. We at Landmark wish all entrants the best of luck for the 2017 awards and if there is a tip for making a successful bid I would say, start early and read the guidelines carefully but definitely have a go!” Demelza Hartley, from Lowe Maintenance Training, explains why success in the RBAs has meant so much: “As a small business, it can sometimes be a hard task seeking out recognition and acknowledgement for the blood sweat and tears that we have to put in during the course of the financial year! “Through advertising, word of mouth and social media, selling your business can be a hard nut to crack, so hearing about the Rural Business Awards and the chance to demonstrate our uniqueness, successes to date and our vision for the future was an opportunity we could not let pass. The ethos of the awards that runs through all categories and the values entrenched in the rural sector are very much something our business strives to emulate. “Having been an RBA Runner-up in 2015 and then a finalist last year and to be judged by people who understand the rural sector, was an achievement in itself. With over 600,000 businesses in the sector, it makes you sit back and take stock of the credit that comes from being nominated. The idea of the awards celebrating the achievements of our rural businesses are an example of what our Great British countryside and its people have to offer.” Sharon Stevens-Cash, director of Gravity Digital, said winning an RBA had been “absolutely fantastic” for business. She said: “The awards were championing the importance of rural business, an ethos that’s really close to our hearts, and we’d seen them covered a lot in the media so as we have a rural base and a strong range of rural clients, we were keen to get involved. These were the first awards we had ever entered, so we were thrilled with the results! “Winning an award has been absolutely fantastic! People are still talking about the RBAs six months on and there has been a really positive impact on our reputation. We’ve enjoyed lots of coverage in the media and we are thrilled to have won such a prestigious national award.” Winners in the thirteen categories will be decided by an independent panel of judges drawn from the rural business sector, official agencies and rural charitable organisations. The aim is to grow the scheme to become the UK’s most prestigious and respected awards for countryside-based enterprise. 
CLA Director General Helen Woolley said: “The CLA is delighted to once again partner with the Rural Business Awards to highlight the vital contribution that rural businesses make to the countryside and to the wider economy. “Our ambition for rural areas is no different to the rest of the economy. We want to see investment unlocked, to achieve greater productivity driving growth, the creation of jobs and an improved standard of living across our rural communities. By working with the Rural Business Awards I am confident successful and innovative rural businesses will have the chance to shine.” Rural Business Award Sponsor, Janine Edwards of Janine Edwards Wealth Management Ltd, Principal Partner Practice of St. James’s Place Wealth Management, said: “As such a highly regarded event, it is an enormous privilege for my team and I to once again be involved in sponsoring the Rural Business Awards. We strongly believe that accomplished companies in the rural sector deserve real recognition and these awards are the perfect way to go about it. The importance of the awards in the rural community is evident through the efforts and great contribution to the economy that the businesses make all year round. Celebrating their accomplishments is a fantastic way of continuing to support this.” This year’s glittering awards ceremony will be held at Denbies Wine Estates in Surrey on October 5. The application window is now open and entries can be made up to June 30, 2017. To enter, visit our website: www.ruralbusinessawards.co.uk where you will find details about each category. There is a simple form to fill in and an entry fee of just £50. You can also nominate another business who you think deserves to be recognised in our awards. If you would like help with entering or advice on which category to choose, call us 07908 722 497 or email office@ruralbusinessawards.co.uk Find out more via social media on: T: @RuralRBAs F: The Rural Business Awards Keep up with the Twitter chatter and use #RBAs. Our full category list is: 1. Best Rural Start-up 2. Outstanding Rural Diversification Project 3. Best Rural Clothing or Accessories Business 4. Rural Innovation of the year 5. Best Rural Manufacturing Business 6. Best Rural Professional Services Business 7. Best Rural Creative or Media based Business 8. Social enterprise / community project of the year 9. Best Rural Tourism Business 10. Best Rural Sporting Business 11. Rural Employer of the year 12. Rural Entrepreneur of the year 13. Best Food & Drink Business ### Biggest Headache Of Data Revolution Addressed By New Service A new service from start-up, Brainnwave, is offering quick, easy access to datasets from around the world, including Skyscanner, Ordnance Survey and Airbus Defence and Space, all in one place for the first time. The new online data discovery and access marketplace is set to slash the time and resources spent by data scientists and business analysts in finding the right data (from the trillions of gigabytes of data that is increasing minute by minute) to address their specific organisation’s challenges. Crucially, the Brainnwave one-stop-shop service promises users access to the ‘slices’ of data relevant to them and prices it accordingly. Until now there has been an ‘all or nothing’ model for data acquisition which has cost and relevance implications. 
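To make the ‘slice’ idea concrete: most analytical questions only need the records that fall inside a particular area and time window, not the whole dataset. The sketch below is a generic illustration of that filtering step in Python; it is not Brainnwave’s actual interface, and the file name, column names, bounding box and dates are all hypothetical.

```python
"""Illustration of taking a 'slice' of a geospatial dataset rather than
buying the whole thing. This is a generic sketch, not Brainnwave's API:
the file name, column names, bounding box and date range are made up.
"""
import csv
from datetime import datetime

# Hypothetical area and period of interest (roughly central Scotland, 2016).
BBOX = {"min_lat": 55.8, "max_lat": 56.2, "min_lon": -4.5, "max_lon": -3.0}
START, END = datetime(2016, 1, 1), datetime(2016, 12, 31)


def in_slice(row: dict) -> bool:
    """Keep only rows inside the bounding box and the time window."""
    lat, lon = float(row["lat"]), float(row["lon"])
    when = datetime.fromisoformat(row["timestamp"])
    return (BBOX["min_lat"] <= lat <= BBOX["max_lat"]
            and BBOX["min_lon"] <= lon <= BBOX["max_lon"]
            and START <= when <= END)


def slice_dataset(path: str) -> list[dict]:
    """Read a full CSV export and return just the relevant slice."""
    with open(path, newline="") as f:
        return [row for row in csv.DictReader(f) if in_slice(row)]


if __name__ == "__main__":
    subset = slice_dataset("observations.csv")   # hypothetical file
    print(f"Kept {len(subset)} rows relevant to the area and period of interest")
```

Pricing by slice follows naturally from this: if a marketplace can evaluate a filter like the one above on the seller’s side, the buyer only pays for, and transfers, the rows that survive it.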
The key challenge faced by those at the heart of leveraging the benefits promised by data, is the disproportionate time devoted to sourcing data in a usable format (80%) compared to actually analysing it and identifying insights (20%). A recent Brainnwave survey* underlines the weight of this challenge with 82% of data specialists either agreeing that this so-called 80:20 paradigm is accurate or much worse. Don Baker, Brainnwave founder, explains: “Data drives decisions. Brainnwave wants to make managing your data library as easy as managing your music, photo and video libraries online. We are tackling the access challenge head on by making it easier and faster to find the data needed. This allows users to concentrate on unlocking innovation rather than on finding and accessing the data in the first place. The transparent pricing policy means users only ever pay for what they need and can have the confidence that the data is reliable. When time is money it makes sense to focus resources on generating the insights that inform strategic decisions rather than on searching for the relevant data in the first place. “The pioneering platform supports discovery and seamless access to any kind of data, notably geospatial data. Almost every piece of data has a location and time aspect to it. The potential insights are therefore increased making it potentially more valuable. Geospatial data insights cut across industry boundaries. For instance a health application could use geospatial insights to help understand disease spread geographically whereas the renewable energy sector may find specific geo data from dedicated drones hugely useful in monitoring and planning their windfarm maintenance programme.” Anthony Mills, Account Manager of Ordnance Survey, says: “We are pleased to support Brainnwave in the launch of their innovative new service that makes it easy for customers to discover, buy and access our data products online and on-demand.” Brainnwave was set up by Don Baker, who has had a career working with complex and secure data having worked for the US Government, along with CEO Steve Coates, former UK entrepreneur of the year and CTO, Graham Jones, former senior software engineer at Bloomberg. The team has secured significant seed equity from two private equity firms as well as a SE grant of £500,000. It is growing fast and currently has a team of 20 based in Edinburgh. Scottish Innovation Centre, The Data Lab, also played a crucial role by funding an academic partnership with the University of Edinburgh and The University of Glasgow, which allowed Brainnwave to develop the technology behind the unique data marketplace platform. Jude McCorry, Head of Business Development at The Data Lab, said: “The Brainnwave service will make sourcing data much easier for businesses and allow them to reap its value and help them enhance their services. The easy-to-use site will enable them to locate data sets, and smaller parts of big sets, quickly, to better understand their customers and enhance their services. We were delighted to support Brainnwave and partner them with the University of Edinburgh so they could fine-tune their technology and take it to market.” ### Back to Basics: Cloud Recovery When reviewing and implementing a backup solution it’s important to remember that the purpose of it is to provide the ability to recover. Recovery is an aspect that could easily be overlooked or not properly understood, many solutions providing multiple restore options which may in some cases have different costs. 
What is cloud recovery? As with any non-cloud backup medium, once a cloud backup has taken place, it will be stored for its retention period so that data can be recovered from it if required. Cloud recovery is the process of restoring data that has been lost, accidentally deleted or corrupted, over the Internet or network, from a cloud-based system. This kind of restore typically involves the recovery of data to a desktop, single server or network attached storage system. Cloud recoveries could be a single file, directory, database or even a full system or systems. Disaster Recovery refers to policy-driven procedures to restore data, infrastructure and systems on a larger scale if a natural or human-made disaster takes place. Disaster Recovery could also include failover to systems that data is replicated to. Why would I need to recover? The most common reason to need to recover or restore data is data loss, which can be caused by anything from an accidental deletion of data to hardware or software failure or disasters. Other reasons for needing to restore data could include testing or insurance purposes, the need to reference historical versions of documents, or compliance with an audit. Alternatively, a novel use for cloud backup and recovery is migrating data between systems or when hardware is upgraded. Benefits of recovering using a cloud service: Using a modern cloud backup and recovery solution gives an organisation the ability to be highly selective with what data is being backed up, how often it is backed up and how long it is kept for. Organisations can grow or shrink their backup selection with ease and extend retention periods or increase the backup frequency with just a few clicks. Cloud-based recovery comes into its own in a similar fashion. Traditional tape-based backups, while reasonably reliable, are no guarantee of being able to recover data. For example, tapes stored for long-term retention are susceptible to damage and corruption. Even if the tapes aren’t damaged, simply identifying a tape with the right version of a file can take some time. Once a tape has been identified and retrieved, it then must be restored in its entirety before specific files or folders can be used. Cloud-based restore solutions, on the other hand, typically have intuitive search functionality enabling the right versions of files or directories to be restored very quickly. P2V or not 2V With new technologies changing the landscape of modern business, it is important to have different recovery options available. Being able not only to recover in different ways but also to different locations will give more flexibility and allow for different platforms to be catered for. Virtual servers and appliances give the option to scale resources on demand, but they need to be covered by backup and recovery procedures or risk being lost. Testing With the ever-increasing threat that malware such as Cryptolocker or other ransomware variants pose to data, being safe in the knowledge that you can roll back to a successful backup is invaluable. Cloud-based backup and recovery solutions give the ability to test-restore data instantly with a few clicks.
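That kind of test restore is also easy to script when backups live in ordinary object storage. The sketch below is purely illustrative, assuming backups are written to a versioned Amazon S3 bucket (an assumption, not something any specific backup product mandates); the bucket and object names are hypothetical. It fetches the newest copy of a file that predates a chosen restore point.

```python
# A minimal test-restore sketch: fetch the most recent version of a backed-up
# file that predates a known-good cutoff (e.g. before a ransomware infection).
# Bucket and key names are hypothetical; assumes S3 versioning is enabled.
from datetime import datetime, timezone

import boto3

BUCKET = "example-backup-bucket"                      # hypothetical bucket
KEY = "finance/ledger.xlsx"                           # hypothetical object key
CUTOFF = datetime(2017, 3, 1, tzinfo=timezone.utc)    # chosen restore point

s3 = boto3.client("s3")

# List every stored version of the object.
versions = s3.list_object_versions(Bucket=BUCKET, Prefix=KEY).get("Versions", [])

# Keep only versions written before the cutoff and pick the newest of those.
candidates = [v for v in versions if v["Key"] == KEY and v["LastModified"] < CUTOFF]
if not candidates:
    raise SystemExit("No version of the file predates the cutoff")
chosen = max(candidates, key=lambda v: v["LastModified"])

# Download that specific version to a local path for verification.
obj = s3.get_object(Bucket=BUCKET, Key=KEY, VersionId=chosen["VersionId"])
with open("restored-ledger.xlsx", "wb") as fh:
    fh.write(obj["Body"].read())

print("Restored version", chosen["VersionId"], "from", chosen["LastModified"])
```

Run on a schedule, a check like this turns "we think we can restore" into a regularly verified fact.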
As restores can be tested quickly and easily, they can be done more often and on more systems, providing assurance that all data is available for a real-world restore should it be needed. Disaster Recovery Business Continuity and Disaster Recovery processes will need to bring together many aspects of an organisation to allow it to get back to operational capacity quickly. As different data sets have different values within an organisation, they can be prioritised for restore, whether from one backup or multiple backups. ### Blockchain: How can IBM help? IBM has been helping companies to transform their businesses and their industries for over 100 years, and Blockchain is no different. We actively support and contribute to the Hyperledger project, which is building a business-focused Blockchain based on permissions rather than anonymity. You can try Hyperledger for free on our Bluemix cloud platform - you can spin up a test environment of 4 nodes, deploy some chaincode, explore the APIs, and run proofs of concept. If you want to move into production you can use our High-Security plan - it’s functionally identical to the free version, but runs on high-performance POWER processors rather than x86 and has real differentiation in terms of security, data privacy and compliance. We have Blockchain garages in various cities around the world; these can help you to shape and define your project, and build the first iteration using agile techniques such as pair programming - this is great if you want to get your own developers skilled up in Blockchain. Alternatively, if you just want us to come in and build a project for you, our GBS services team can help with that. Remember that a business solution is not just about the Blockchain - you’ll need to integrate with back-end systems, manage API connections, business processes and business rules - and consider how to manage the DevOps process. IBM has the breadth of skill to match your ambitions. Whatever it is that you want to do with Blockchain, IBM is ready to help - just visit ibm.com/blockchain to find out more. ### The Face of Workplace Computing is Changing Gone are the times of being chained to your desktop computer or an office space. The introduction of cloud technology has seen the rise of trends such as the mobile workforce and BYOD. However, one area this hasn’t extended to is flexibility within desktop services and applications. Whilst many see the obvious benefits of cloud storage services such as Google Drive and Dropbox, new technology now allows users to remotely access entire desktops through any device. Many are reluctant to take the next step to completely web-based working, but there are extensive perks to being able to access not just your files, but your entire desktop and applications through a virtual desktop infrastructure (VDI) or desktop-as-a-service (DaaS). Productivity and efficiency VDI or DaaS allows businesses to massively improve efficiency and productivity in a myriad of different ways. One way businesses look to do this is by understanding the role of each department and its individual IT requirements. For example, IT managers can build and manage tailored virtual desktops for specific departments; for example,
the marketing team probably won’t need access to a company’s financial software and the operations management department will almost certainly not want to have their desktops cluttered with CRM tools. The inherent mobility and device flexibility of VDI or DaaS can also be used to drive productivity and efficiency. By creating a virtual desktop experience on the move, staff are no longer limited by often under-featured mobile applications, nor do they have to rely on their work being picked up by other members of staff while they are travelling to and from meetings. There are also financial efficiencies to consider. VDI or DaaS often results in freeing up IT resources and hardware, as there is less pressure on installing software or maintaining specific devices. This can result in a more strategic use of IT leaders, using them to drive efficiency and growth through improvement of processes rather than day-to-day management of current systems. Security There are many security benefits to web-based working. Storing files on hardware means that if your device or laptop is stolen your data will be immediately at risk. With DaaS, you access your files through the browser rather than keeping them locally, meaning you can take further measures to protect your information once you learn a device has been compromised. But say you are subject to a security breach. How easy is it to get back up and running afterwards? Having to reinstall all of your software and applications can take hours or even days. With modern cloud systems, you can access your entire desktop and all of your applications in seconds. No reorganisation. No reinstalling all your apps, and, as we all know, every minute in business is crucial. This is also music to the ears of your IT manager, who can now easily keep track of the applications your workforce are using. Previously, applications downloaded to a desktop couldn’t be easily monitored, which posed a threat to security as these apps hadn’t been vetted. Now IT can administer all software licences from a central location, making everything easy to control and manage. If a new person joins your team, they can easily be set up in moments, eliminating hours of installation. It’s also easier for new users as they can start being productive straight away. Flexibility Flexibility is also a key benefit. There’s nothing worse than the dread when you go on a trip and realise you’ve forgotten your laptop, or when you’ve accidentally spilt your 9am latte over your keyboard and suddenly the screen goes blank. With virtual desktop software, you can quickly hop onto the spare laptop, or into your home office, and access a replica of your desktop anywhere. No stress. This also applies to new trends such as wearables and IoT. Imagine being on the move or out in the car when a colleague wants to Skype; you have your smartwatch, so you can, but what if they ask for an important document immediately? With cloud technology, this is becoming a reality. There’s not normally enough storage on these devices to keep entire files, but with an internet connection and a web browser you can access files on any end-user device. It’s understandable to be anxious about the idea of making the jump to keep your business completely online. However, it’s simple to try alongside your current systems, and most vendors will gladly guide you through the setup process every step of the way. ### BYOD is Dying.
It’s Now Bring Your Own Everything! Today’s organisations need to place more emphasis on how they manage all manner of devices, as opposed to debating whether to have BYOD or corporate-supplied mobile devices. BYOD is now dying; the term BYOx (bring your own everything) is better placed to describe this phenomenon, and IT managers have more important concerns to address as a result of the change. For example, organisations need to decide whether they provide devices to all employees, some employees (e.g. managers and executives) or no employees. Questions that need addressing include: how do user-owned devices connect to the network? How do you ensure personal and corporate data separation on a mobile device? What about company-owned devices and who owns, and thus has free access to, the data stored on them? And what happens when a device with company data or the ability to connect to the company network is stolen? The answer is a comprehensive mobile device management policy. IT managers need a simple way to add and remove both data and, importantly, network access from mobile devices. Another essential component is the ability to safeguard enterprise data through two-factor authentication and sandboxed applications. Corporate-supplied mobile device vs. BYOD Supplying some or all employees with the mobile devices they need is the fastest way to resolve security issues: the phone or tablet is completely under the control of your IT department, can be locked down to corporate use only, and can be wiped on demand if required. Drawbacks include the expense and the employee’s degree of understanding that the device is not theirs. Initial awareness that it’s a corporate device is high, but the longer it’s in someone’s possession, the more they tend to forget. This can lead to confusion, with non-business contacts, photos, etc. ending up on the device and potentially problematic downloads (e.g. games that may not be benign from a security standpoint). Data confusion in itself can lead to privacy concerns and, with the introduction of new controls such as the European GDPR proposals, companies need to have well-thought-out policies in place to manage these devices. One problem is that most companies’ corporate IT policies were written so long ago that they don’t cover today’s working environment, in which employees regularly access enterprise systems from home and the road, sometimes from multiple devices. The best way to mitigate risk around supplied devices is to have users read, understand and sign a comprehensive policy that outlines who owns the device, what use is allowed, what is not allowed and the consequences for not following the company’s rules. Advantages of having employees use their own devices for work include potentially lower costs and convenience for employees, such as not having to carry two devices. However, there is a multitude of potential security and data management pitfalls. The company email system and enterprise systems must be securely accessed, company data on the phone must be secure and strict data migration policies must be in place (i.e. don’t transfer company data to an insecure location). Finally, the company must be able to lock and then wipe the device should it be stolen. Security Issues – Malware and Phishing There are two main security issues with mobile devices: malware and phishing.
Protecting against phishing is, first, a matter of employee education. Although it’s sometimes difficult to identify a phishing message (the email may appear to come from the employee’s legitimate contact), making employees hyper-aware of abnormalities in emails can go a long way toward reducing risk. Protecting against malware is strictly the responsibility of IT—often, the user is unaware that their device has been infected. It’s essential to use a robust threat detector and to keep devices updated with the latest OS, the latest patches and strong anti-virus applications. This requires enforced application deployment and monitoring as well as automated patch management across the device estate. Here are further steps to consider:
- Use an identity access management solution that provides two-factor authentication. This prevents thieves from using a cached password on the device to access your enterprise data.
- Move to encrypted email, since cloud-based email is a prime target for thieves looking to capture sensitive data.
- Create and maintain access control lists that define which users, devices and apps can access which areas of the network, thus limiting the areas a compromised device can access.
Sandbox Applications An excellent way to protect mobile devices, regardless of who owns them, is to sandbox as many applications as possible - securely separate them from the operating system as well as other applications. For example, instead of using the mobile device’s built-in email application to connect to corporate email, IT installs a sandboxed email application. The app lets users read and respond to emails online rather than downloading email onto the device. Access to mail can be controlled remotely, and the application can be disabled or removed on demand. There are currently sandboxed applications for contact databases, email and documents. The uses are different, but the principle is the same: access data and documents online, so they are never downloaded onto the phone. The download and security of sandboxed applications happen via a mobile device management solution that allows IT to easily delete the apps from a corporate or personal device if it is lost, or from a personal device when someone leaves the company. Wipe Data The ability to delete business-related apps from an employee’s personal device and to completely wipe all data from a stolen device is essential to every company’s security. It’s critical to set up procedures that consider every circumstance, including:
- An employee’s personal device with company data/access on it was stolen
- An employee’s corporate phone was stolen
- An employee gives notice: IT needs to ensure he/she no longer has access to company systems after their last day of employment
- An employee with a personal device used for business suddenly leaves the company
- And the big one – the children of an employee cannot accidentally access or delete corporate data when given the phone to play games on.
Outcome Today’s technologically complex, highly mobile world dictates a multi-pronged approach to mobile device management that includes:
- A flexible, frequently reviewed mobile device management policy that is understood by all employees
- A strong mobile device management system that lets IT quickly and easily act in the case of a security breach or device theft
- Protecting enterprise systems by using sandboxed applications on mobile devices
### Cloud Expo Europe 2017 Live - #CEE17 Tweets from the Expo! ### The Impact of GDPR Legislation on IT GDPR, or the General Data Protection Regulation, is a new piece of EU legislation which becomes compulsory in May 2018, following a two-year implementation period. Despite Brexit, GDPR still has an impact on UK companies. With the triggering of Article 50 set to happen in March 2017, the timings mean that Britain will still experience life under GDPR. Even in a post-Brexit world, any country that offers goods or services, or collects, controls or processes data on EU citizens, must abide by the GDPR rules. GDPR – Key details Businesses are familiar with existing laws, but they may need to make changes to adapt to the new legislation. Here are the key details about GDPR:
- GDPR applies to organisations outside of the EU that either offer goods or services within the EU or hold data on EU citizens
- Some companies need to appoint a data protection officer as part of their accountability programme
- The definition of personal data under GDPR is any information relating to an individual. This includes the obvious, such as name and date of birth, as well as less obvious data such as IP addresses. There is some confusion over what different EU member states consider to be personal data, but the advice is: treat all information relating to an individual as data which must be protected
- Demonstrating compliance will require things like data protection impact assessments, additional paperwork and recordkeeping
- Companies must make it equally easy for a person to give and withdraw their consent for processing and storage of their personal data, and any intent to use data for marketing must be completely transparent. Where companies employ a data controller, they need to demonstrate where consent was given.
- Data controllers (or business owners in companies that don’t employ a data controller) must immediately notify relevant authorities (e.g. the ICO) in the event of a breach. Failing to do so within 72 hours incurs a substantial fine
GDPR and data storage Whether you use a virtual or physical server, the rules about data protection are the same. Encryption alone is not enough to prevent a breach. We recommend the following: - Single sign-on where possible to prevent multiple users and passwords - Alphanumeric passwords - Encryption of both stored data and ‘on the move’ or shared data (a minimal sketch of encrypting data at rest appears just below) - Security levels and permissions on sensitive information, e.g. HR data - Multi-factor authentication - Robust BYOD policies, where devices are automatically wiped of company data if a staff member leaves or their device is stolen, or to restrict the company data which can be accessed on the move The role of IT suppliers Some businesses might think their IT supplier will be partly or wholly responsible for safe storage of their data, but in reality, it should be a group effort in collecting, storing and protecting it.
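On the storage side, the encryption recommendation above is straightforward to apply in code. The following is a minimal sketch rather than a compliance recipe: it assumes the third-party Python cryptography library (specifically its Fernet recipe), the record contents and file name are illustrative, and key management, arguably the harder problem, is only hinted at in the comments.

```python
# A minimal sketch of encrypting personal data at rest, as recommended above.
# Uses the third-party 'cryptography' package (pip install cryptography).
# The record and file name are illustrative; real systems also need secure
# key storage (e.g. an HSM or key management service), which is out of scope.
from cryptography.fernet import Fernet

# In practice the key is generated once and kept in a secrets store,
# never alongside the data it protects.
key = Fernet.generate_key()
fernet = Fernet(key)

record = b"name=Jane Doe;dob=1980-01-01;ip=203.0.113.7"   # personal data

token = fernet.encrypt(record)            # ciphertext that is safe to persist
with open("hr_record.bin", "wb") as fh:
    fh.write(token)

# Later, an authorised process holding the key can recover the record.
with open("hr_record.bin", "rb") as fh:
    assert fernet.decrypt(fh.read()) == record
```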
Often, a data breach is caused by a user error rather than an IT fault, so businesses must address any internal issues and their processes, then work with their IT supplier to make changes. We recommend that businesses start by evaluating all of the data they collect and do a deep cleanse. Holding onto only strictly business-relevant data will help with compliance. Auditing all business processes is also an important early step: from hiring someone to completing a transaction, each process is likely to involve personal data in some shape or form. Consider how each of these processes could be more secure and work with your IT supplier to address any gaps. Staff training is essential. If you need to change your IT policies, do it now and make sure employees understand their individual responsibility, as well as how to spot malicious attacks such as ransomware. By complying with company policies, they are protecting their personal data and their job. Some businesses also conduct user behaviour analysis to identify whether staff are surfing potentially dangerous sites or mistakenly downloading malicious software. The best way to ensure compliance and adapt to changing legislation is to work closely with your IT supplier. Share the results of the audits and ask for their recommendations. It’s far easier for them to help you protect data if they understand broader company policies and any changes you’re making. In summary, working in partnership with them can help you avoid unwanted fines and protect your business for the future. ### Cloud Foundry to Bridge Critical Developer Gap with Cloud Foundry Certified Responding to an impending critical shortage of developers, Cloud Foundry Foundation, home of the industry-standard platform for cloud applications, today announced the launch of the world’s largest cloud-native developer certification initiative. The “Cloud Foundry Certified Developer” program will be delivered in partnership with the Linux Foundation, responsible for training and certifying more developers on open source software than any organisation in the world. General availability of the program will be announced at Cloud Foundry Summit Silicon Valley in June 2017. Businesses around the world need more skilled developers: there are a quarter-million job openings for software developers in the US alone, half a million unfilled jobs that require tech skills, and the forecast grows to one million within the next decade. The new Cloud Foundry Developer Certification helps developers set themselves apart when looking for a first job or upgrading their current role. Performance-based, community-based and independent of any distribution vendor, Cloud Foundry Developer Certification is the guaranteed way to demonstrate cloud skills and expertise. Organisations need developers to be highly productive while working on top of Cloud Foundry. Available online, Cloud Foundry Developer Certification and Training can be taken wherever there is demand in the world. “The Cloud Foundry Certified Developer program will be a huge value to the community,” said Brian Gregory, Director of Cloud Strategy and Engineering for Express Scripts, a Fortune 100 healthcare company. “In our shift to Cloud Foundry, we've seen a massive uptick in productivity and business results.
So much so, that the business continues to demand more from us, in the effort to make prescriptions safer and more affordable for our 3,000 clients and 85 million members. We can't hire developers fast enough -- we currently have a goal of hiring 1,000+ developers. Knowing a developer is Cloud Foundry Certified could streamline our hiring process and help ensure we're bringing on qualified candidates.” “As Ford continues to use Cloud Foundry to support the demands of our business, finding software engineers with the right skills is a priority,” said Aaron Rajda, Director of MS&S Solution Delivery and FordLabs for Ford Motor Company. “Ford recognises certifications across a number of roles today and a Cloud Foundry Certification will be important in helping to identify and develop our software engineering team members.” Certification is the best way to verify job candidates have practical experience with Cloud Foundry, across any distribution, including 2017 Certified Platforms: Huawei FusionStage, IBM Bluemix Cloud Foundry, Pivotal Cloud Foundry, SAP Cloud Platform and Swisscom Application Cloud. The training material can also be licensed, allowing Cloud Foundry Foundation members to offer the content through their own employee and customer training channels. The program suite includes:
- A free introductory course offered via the edX platform.
- A self-paced eLearning Cloud Foundry Developer course.
- A training partner program which includes licensed materials for in-person Cloud Foundry developer classes, offered by member companies including DellEMC, EngineerBetter, IBM, Pivotal, Resilient Scale, SAP, Stark and Wayne and Swisscom.
- “Cloud Foundry Certified Developer” Certification, awarded to individuals who pass a performance-based exam.
The “Cloud Foundry Certified Developer” program follows a report issued by the Cloud Foundry Foundation in November 2016 that surveyed nearly 900 IT executives worldwide and revealed an escalating gap in cloud skills which threatens enterprises’ ability to embrace digital transformation. The survey found companies are doubling down on training internal engineering teams, rather than outsourcing engineering overseas or hiring new talent, to address the skills shortfall as digital becomes the modern business core competency. Cloud Foundry Developer Certification seeks to address these shortages with the preferred method of training while equipping developers with a lasting, marketable skill set. “Companies need developers with the skills to build and manage cloud-native applications, and developers need jobs. We pinpointed this growing gap in the industry and recognised our opportunity to give both developers and enterprises what they need,” said Chip Childers, CTO, Cloud Foundry Foundation. “The Cloud Foundry Foundation is a nexus for collaboration and growth across industries with the open source mentality that everyone should win -- our members, end users, developers -- so it’s logical for us to drive a training and certification program that enriches the professional lives of developers and makes them valuable assets to companies who prize them for their own digital transformation.” At launch of the new Cloud Foundry Developer Certification initiative, the training will be offered by more than a dozen leading technology, education and systems integration organisations around the world, including EngineerBetter, IBM, Pivotal, Resilient Scale, SAP, Stark and Wayne and Swisscom.
The Linux Foundation, which has trained more than 800,000 students in over 215 countries on critical skills and certifications around Linux development, systems administration and OpenStack since 2003, will offer the online eLearning infrastructure for the Cloud Foundry Developer Certification initiative. Developers can register for the free introductory course on edx.org in early May. The self-paced eLearning course “Cloud Foundry for Developers” will be available to the general public on June 13 for $500. The Certification exam takes up to four hours to complete and can be taken online for $300. The exam will also be offered in-person at the Cloud Foundry Summit Silicon Valley on June 13-15, 2017. Register for on-site certification when you sign up for Summit and receive a discount on your ticket: https://www.regonline.com/registration/Checkin.aspx?EventID=1908081. Discounts are available for bundled and bulk pricing purchases. To find detailed information on the suite of offerings, visit https://www.cloudfoundry.org/training/. ### A Hybrid Strategy that Makes the Difference We’ve reached a tipping point for digital adoption. With an estimated 50 billion devices expected to be connected to the internet by the year 2020, digital is transforming our lives more than ever before. As a result, the growth in cloud services is continuing at an irresistible rate. Cloud continues to make digital services more effective than ever before. For the IT department this means fundamental restructuring and re-skilling; for the business, it means transformative improvements in the speed, flexibility, efficiency, competitiveness and innovative potential of the organisation. One thing is clear – it has huge potential. In 2017 more cloud platforms than ever before can come together to provide business services for the enterprise. Public cloud platforms are already maturing, and there are more opportunities for organisations to integrate multiple public and private cloud services into their legacy systems. Soon, businesses that don’t adopt cloud platforms will risk falling behind competitors and face high costs – organisations can no longer afford to rely on legacy systems. But this growth of cloud creates complexity. The cloud can be a disparate and fragmented space – making the new world of fast IT highly desirable, but also quite difficult to achieve. The challenge now comes in seamlessly integrating the legacy estate into an ecosystem of third-party public cloud platforms that together provide “best of breed” business services. So how do businesses achieve this? Implementing a hybrid IT strategy Businesses want to be more agile and enjoy greater flexibility. They also want to take advantage of new technologies such as the Internet of Things (IoT) and big data. The foundation of all of this is the cloud. But to effectively embrace cloud alongside an organisation’s current estate, businesses need to consider a hybrid IT approach that allows them to embrace fast IT while leveraging existing legacy systems. A hybrid IT strategy is more of a journey than a one-off change. There are tools available to support this, addressing the challenges and mitigating the risks, including the difficulty of integrating new and old systems, getting technical services to the proper scale, and managing growing complexity.
Digital enablement platforms, for example, can help businesses manage the different speeds of IT while allowing for robust security, visibility across multiple cloud platforms and effective cost management. Orchestrating a hybrid IT environment can be complex and challenging. It is also a balancing act involving many other factors. Agility, fast delivery, automation, compliance and security all need to be catered for when designing the right service. But by working with the right partner, it is achievable. Success is all about how the service, supply, processes and people knit together to deliver a secure, seamless customer experience. By understanding the strategic goals and the business’s specific needs, organisations will quickly understand the value of a hybrid IT strategy. Connecting technology with humans Hybrid cloud services provide a great platform for organisations to leverage human-centric ICT services due to the inherent connectivity, scaling potential and commercial models. So once a hybrid IT strategy is in place, organisations can start thinking about and creating new services without constraints, helping employees to do their jobs more effectively. For example, if a gas engineer fixing a boiler at a resident’s house is missing a part, with the click of a button they can check whether there is a nearby engineer with the part required. This means that the customer is happy that their issue is fixed “first time” and the utility company is pleased as they are not spending more time and cost rearranging a second call-out. Digital technologies also have the potential to change the nature of training, meaning that individuals can have personalised training suited to the needs and requirements of their role. For example, if a lone worker is struggling to fix a broken pipe, there is the potential for the worker to stream the problem live to the back office through a camera on their helmet to ask for assistance. By using technologies to enable workforces to collaborate, technology and humans can work together more efficiently, cutting out the need for complete automation. Conclusion Cloud will only continue to have a huge impact on our day-to-day lives, so having a hybrid IT strategy in place is vital if organisations are to create the “best of breed” service for customers. New innovative services such as IoT have a lot of potential in 2017 but need a secure and scalable platform to sit on, making it critical for organisations to have the right foundations in place. Collaborating with the right partners is critical to the success of organisations’ transformation agendas. By navigating the journey to achieve an ideal balance of hybrid IT services, there will be huge benefits not only for efficiency and agility but for employees too. Hybrid IT, if implemented correctly, should allow organisations to digitise with confidence. ### AWS Announces Amazon Connect Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ: AMZN), today announced Amazon Connect, a self-service, cloud-based contact centre service that makes it easy for any business to deliver better customer service at lower cost. Amazon Connect is based on the same contact centre technology used by Amazon customer service associates around the world to power millions of customer conversations.
Setting up a cloud-based contact centre with Amazon Connect is as easy as a few clicks in the AWS Management Console, and agents can begin taking calls within minutes. There are no up-front payments or long-term commitments and no infrastructure to manage with Amazon Connect; customers pay by the minute for Amazon Connect usage plus any associated telephony services. To get started with Amazon Connect, visit https://aws.amazon.com/connect. The contact centre is the front-line for a company’s most important asset – its customer relationships. But, traditional contact centre solutions are complicated and expensive. Companies often have to invest in complex, proprietary hardware and software systems that can take months or even years to deploy, require specialised skills to configure and consultants to implement, and come with inflexible licensing that makes it difficult to scale as contact volumes fluctuate. With Amazon Connect, customers can set up and configure a “Virtual Contact Center” in minutes. There is no infrastructure to deploy or manage, so customers can scale their Amazon Connect Virtual Contact Center up or down, onboarding up to tens of thousands of agents in response to business cycles (e.g. short-term promotions, seasonal spikes, or new product launches) and paying only for the time callers are interacting with Amazon Connect plus any associated telephony charges. Amazon Connect’s self-service graphical interface makes it easy for non-technical users to design contact flows, manage agents, and track performance metrics – no specialised skills required. Amazon Connect also makes it possible to design contact flows that adapt the caller experience. Contact flows can change based on information retrieved by Amazon Connect from AWS services (e.g. Amazon DynamoDB, Amazon Redshift, or Amazon Aurora) or third-party systems (e.g. CRM or analytics solutions). For example, an airline could design an Amazon Connect contact flow to recognise a caller’s phone number, look up their travel schedule in a booking database, and present options like “rebook,” or “cancel” if the caller just missed a flight. And, customers can build natural language contact flows using Amazon Lex, an AI service that has the same automatic speech recognition (ASR) technology and natural language understanding (NLU) that powers Amazon Alexa, so callers can simply say what they want instead of having to listen to long lists of menu options and guess which one is most closely related to what they want to do. “Ten years ago, we made the decision to build our own customer contact center technology from scratch because legacy solutions did not provide the scale, cost structure, and features we needed to deliver excellent customer service for our customers around the world,” said Tom Weiland, Vice President of Worldwide Customer Service, Amazon. “This choice has been a differentiator for us, as it is used today by our agents around the world in the millions of interactions they have with our customers. We’re excited to offer this technology to customers as an AWS service – with all of the simplicity, flexibility, reliability, and cost-effectiveness of the cloud.” GE Appliances has been making appliances for over 100 years and today offers a wide number of appliances under the Hotpoint, GE, Haier, GE Café, GE Profile and Monogram brands. “Amazon Connect is a radical shift in the contact centre space – there is no complex hardware configuration and management – Amazon Connect makes voice an application on the network. 
We can rapidly connect it to anything, and easily leverage other AWS micro-services we have already developed,” said Brian Pearson, CTO of GE Appliances. “Our business continually strives to improve the ownership experience of our consumers. With Amazon Connect, we can both simplify and personalise the consumer experience, aligning our processes to better address the needs of our consumers. We’re excited to move towards a software-defined call centre model, using Amazon Connect, driven by customer-centricity.” Based in Raleigh, NC, Bandwidth is a communications software company powering top brands with voice and messaging solutions. “Amazon Connect helps us streamline operations and drive workforce efficiency to the next level,” said Ryan Henley, Vice President of Customer Success, Bandwidth. “Our agents are now able to work remotely with ease using the desktop telephony features Amazon Connect provides, and our call centre leaders are able to quickly and easily review calls with reps, providing a fast feedback loop for continuous performance improvement.” AnswerConnect’s fully remote team helps clients ensure they never miss a call again, offering 24/7/365 live reception and answering service to clients worldwide. “We need to be available for our customers 24/7/365, no matter what,” said Natalie Fung, CEO of AnswerConnect. “As a fully remote company, we needed a cloud-based soft phone that could easily scale up or down, keep our remote workers connected, and give us the 24/7 availability our clients know us for. Amazon Connect has provided that and more. We have real-time, historical visibility, and reporting. Amazon Connect easily integrates with our existing systems, and the usage-based pricing accommodates our need for seasonal scalability without financial impact. These factors made our decision to move to Amazon Connect an easy one. It was the best choice for both us and our customers.” Amazon Connect integrates with a broad set of AWS tools and infrastructure so customers can record calls in Amazon Simple Storage Service (Amazon S3), use Amazon Kinesis to stream contact center metrics data to Amazon S3, Amazon Redshift, or an external data warehouse solution, use Amazon QuickSight for data visualization and analytics, and use AWS Directory Service to allow agents to log into Amazon Connect with their corporate credentials. Amazon Connect also integrates with leading CRM, Workforce Management (WFM), Analytics, and Helpdesk offerings from Appian, Calabrio, CRMNEXT, Freshdesk, Paxata, Pentaho, Pindrop, Salesforce, SugarCRM, Tableau, Twilio, VoiceBase, Zendesk, and Zoho. This means customers can embed the Amazon Connect agent experience into the applications their agents already use. If customers require additional support for custom integrations, they can work with AWS Partner Network Consulting Partners 1Strategy, Accenture, Aria Solutions, Persistent Systems, Slalom, Solstice IT, VoiceFoundry, and Wipro. Amazon Connect is currently available in the United States and 18 countries throughout Europe and will expand to more countries in the coming months. ### Cloud outages highlight need for organisations to start designing for failure on public cloud services, says Databarracks The recent Amazon Web Services (AWS) outage, which impacted a number of high-profile websites and service providers, has highlighted the importance of specific skill sets to support public cloud services. 
This is according to Radek Dymacz, head of R&D at disaster recovery and AWS consulting partner Databarracks, who states that organisations should adopt a ‘design for failure’ approach to prevent outages. Gartner has forecasted that the worldwide public cloud services market will grow 18 per cent in 2017, with additional research from the Cloud Industry Forum (CIF) showing that the overall cloud adoption rate in the UK now stands at 88 per cent. For Radek, this growth has contributed to a shift in the cloud marketplace. He explains: “The growth of hyperscale cloud services has led to an increase in managed services for these clouds. We have seen telecoms providers, data centre owners and managed service providers launch their own cloud services and, in many cases, pull out of the market. Many of these businesses are now focusing their efforts on providing managed services for the hyperscale public clouds of AWS, Azure and Google. However, platforms like AWS need a different approach to traditional hosting. “The ability to design for failure is essential to the value proposition of public cloud platforms, and yet organisations are still consuming AWS services as though they’re building a traditional hosting environment. The great strength of platforms like AWS is that you can build in resiliency in a way that scales depending on your budget. At the larger end of the spectrum, this might involve using object storage across multiple Availability Zones and even Regions to provide an extra layer of resilience. This is expensive but, for large organisations, so is downtime. We recommend that all organisations adopt a ‘design for failure’ approach. This means that if any single element fails then there is an easily-identifiable specific cause, with a known resolution. “What we’re seeing in customer demand agrees with this trend as businesses are now more mature in their use of cloud services. They have gone beyond testing, so they are now seeking help to increase resilience, optimise cost and support it round-the-clock. Therefore, when looking for support, organisations must select a supplier with genuine expertise, rather than a cowboy. To do this, one trick is to listen to the naming conventions they use as this is a surprisingly effective way to identify people who have not changed their approach to infrastructure. For example, consultants with little experience within the AWS ecosystem will use terminology such as ‘server’ instead of ‘instance’. “Also, don’t be fooled by brand champions; almost anyone can pay their way through certification with AWS or Azure so you should always ask your provider how long they have been working with their chosen platform and ask to see multiple and specific case studies. This should help you find an experienced public cloud provider, but if in doubt always opt for shorter contracts. “Although launching services in AWS is simple, it’s maintaining them that requires a highly specialised skillset. Working with a demonstrably experienced AWS provider typically involves redesigning the way your applications work, specifically around decoupling services and resources. But there’s a huge grey area between infrastructure, resource provision, application functionality and service delivery.
You should therefore always choose a provider who has the developers and a support team to occupy this grey area, and who can work collaboratively with you to keep things running,” concludes Radek. ### Cresatech Joins Forces with Telit Using IoT to put Safety First in the Utilities Industry Using Telit’s IoT technologies, Cresatech is enabling utility providers to mitigate the operational and safety risks resulting from copper theft. Cresatech, the specialist in continuous and real-time M2M communication technology, today announces it is partnering with Telit, a global enabler of the Internet of Things (IoT), to enable real-time monitoring of critical electricity distribution infrastructure. Copper theft has plagued utility providers over recent years and often goes unnoticed until things go wrong. For example, until now, those operating in the electricity sector would have been blind to copper theft until a fault occurred, leaving both their employees and the wider public exposed to the potentially deadly combination of live electricity and unearthed equipment. By using Cresatech’s CuTS monitoring solution, utility providers know immediately where safety has been compromised and can take the necessary corrective action, restoring normal service and mitigating risk. Powered by Telit’s IoT modules and IoT Portal, Cresatech’s CuTS solution uses edge computing connectivity to provide a real-time status dashboard and generate alerts when substation earthing is damaged or stolen. Using secure wireless communication, Cresatech’s solution integrates seamlessly into Telit’s IoT Portal, bringing its monitoring sensors online and providing utility service providers with up-to-the-minute, detailed information on the status of their infrastructure. This helps utility providers maintain complete control of their infrastructure, minimising outages and improving utility performance and customer satisfaction. Additionally, in a world increasingly under threat of cyber-attack, it is essential for utility providers to know that high levels of security protect every element of online connectivity. Through its work with large corporations, including manufacturing plants where digital security is critical, Telit has a proven track record that ensures a secure platform. This allows utility providers to use Cresatech’s solution with complete confidence over cyber security. Cresatech CEO Simon Nash commented, “Our solution hinges on being able to provide our customers with secure constant connectivity to enable real-time monitoring. Working in the utility industry, we are working with some of the world’s largest and most essential companies, and by partnering with Telit, we can ensure a quality, secure product that meets the needs of our customers.” Sammy Yahiaoui, Telit Vice President of EMEA IoT Services Sales, commented, “Cresatech can monitor, collect and communicate real-time data securely from exposed service sites and infrastructures in complex operational environments with Telit’s end-to-end IoT solutions. The companies together are helping customers to achieve operational, process and safety requirements that reduce cost and mitigate hazardous risks.” ### Cloud DNS – What’s Stopping You Joining the Party? Across industries, companies of all sizes have embraced business applications and platforms that are based in the cloud.
After all, doing so offers numerous benefits regarding deployment, upkeep, reliability, and cost as compared to running the same services on premise. These advantages apply just as well to Domain Name System (DNS) technology - an underappreciated yet fundamentally vital part of the Internet, connecting users to websites. Yet a significant majority of enterprises choose to keep their external DNS deployments on premise – why? DNS has been called the most important technology that no one knows about, so we thought we’d check if this is true and try to find out what factors either encourage or discourage businesses from making the decision to deploy their external DNS in the cloud. To this end, the results of our recent survey from Forrester Consulting offer some insight. Setting the scene It’s no surprise that the results showed that the dominant deployment method for external DNS is overwhelmingly on premise. What is worrying is that, of the respondents who are responsible for their firm’s DNS technology, an average of only 20% claim to be “very knowledgeable” about any given one of the nine DNS components and use cases we asked about. So combine the prevalence of on-premise DNS with the lack of skills to support such a model, and we can understand why many firms experience considerable DNS-related troubles. Key among these are challenges around security, including distributed denial of service (DDoS) attack vulnerability and DNS protocol security, which were cited as major or moderate challenges by approximately two-thirds of those surveyed. But it’s not just critical security issues that bog down IT professionals with on-premises DNS. The majority of respondents also cited being challenged by the amount of resources — both time and money — needed to maintain and upgrade their systems. The pull towards cloud-based DNS Since on-premises DNS operators’ chief concern pertains to security, it follows logically that the majority of respondents cited better protection from DDoS attacks and improved DNS security (DNSSEC) as having a major influence on their decision to host in the cloud. Even more influential, however, is the prospect of improved reliability and availability in an era when customers expect instantaneous service anytime and anywhere they please. Dovetailing with this imperative, companies are motivated to move DNS to the cloud by capability indicators such as improved disaster recovery, performance, scalability and traffic management, as well as detailed reporting on what’s working well and areas for improvement. What are the barriers? If firms have strong factors pulling them toward cloud deployments, why have so few made this move? The list of common inhibitors includes several applicable to any cloud technology deployment, such as pricing models, migration, and mandates for isolation and security. But one theme on this list stood above the rest: vendor service and support. According to our respondents, perceptions regarding vendor response time and communication constitute the primary major inhibitor, and more general support concerns earn the highest combined mentions as a major or moderate inhibitor.
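Part of closing that knowledge gap is simply being able to check what an authoritative nameserver is actually answering, during a migration to a cloud provider, for example. The sketch below is a minimal illustration rather than anything the survey describes: it assumes the third-party dnspython library, and the domain and nameserver addresses are placeholders. It asks a specific authoritative server for a record and compares the result with what a public resolver sees.

```python
# A minimal sketch of sanity-checking external DNS during or after a migration.
# Requires the third-party 'dnspython' package (pip install dnspython);
# the domain and nameserver addresses below are placeholders.
import dns.resolver

DOMAIN = "example.com"
AUTHORITATIVE_NS = "198.51.100.53"   # placeholder: your provider's nameserver
PUBLIC_RESOLVER = "8.8.8.8"          # a public recursive resolver

def a_records(domain, nameserver):
    """Return the set of A records for `domain` as served by `nameserver`."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [nameserver]
    answer = resolver.resolve(domain, "A")   # dnspython >= 2.0; older versions use .query()
    return {rr.address for rr in answer}

authoritative = a_records(DOMAIN, AUTHORITATIVE_NS)
public_view = a_records(DOMAIN, PUBLIC_RESOLVER)

if authoritative == public_view:
    print(f"{DOMAIN}: records match: {sorted(authoritative)}")
else:
    print(f"{DOMAIN}: mismatch: authoritative {authoritative} vs public {public_view}")
```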
[easy-tweet tweet="DNS is a critical infrastructure and security component " hashtags="Cloud, DNS"] DNS Decision-Makers Prefer MSPs Because DNS is a critical infrastructure and security component for any modern organisation, there is a wide choice of companies offering DNS services, each with distinct service models and specialisations. With so many of those who are responsible for DNS struggling with the maintenance of their on-premise systems, it makes sense that when we asked them to rate the type of DNS service provider they preferred, the largest share elected to hand responsibility over to a managed service provider (MSP). In fact, MSPs are nearly twice as preferred as are outsourced IT services partners. [easy-tweet tweet=" All organisations require DDoS protection." hashtags="Cloud, DDos"] DNS Selection Criteria: One-Stop-Shop AND DDoS Protection DNS is chock-full of components and uses cases, and not all service providers have the capabilities to address all of them. Companies that seek several services — or want to completely unload their DNS responsibilities are likely to seek robust solutions that don’t require multiple vendors. Our survey respondents laud both MSPs and Internet service providers (ISPs) for this criteria, considering them as tied for having the greatest advantage for one-stop-shop offerings. Regardless of how much help they need operating and maintaining a DNS, all organisations require DDoS protection. Survey respondents indicated that MSPs have the greatest advantage here. DNS Selection Criteria: Service AND Expertise Our survey respondents reported both low levels of DNS knowledge and significant concern around the communication and support from cloud DNS providers. Therefore, DNS expertise, as well as the level of service provided by those experts, are likely to be high priorities for those who decide to evaluate cloud DNS services. Our survey respondents gave high marks to MSPs for both of these metrics by deeming them as having the greatest advantage for service, as well as for subject matter expertise and experience. The word is spreading about external (authoritative) DNS Despite the slow uptake, the survey showed indications of uptake acceleration. Factoring in the respondents who are currently planning a migration, nearly half (47%) of organisations will soon host their external DNS in the cloud. What’s more, only a quarter of respondents said they are not at least considering such a move. Firms are increasingly open to the idea of migrating to cloud-based DNS solutions under the prospect of improving on-premises deployments’ shortcomings. Various concerns — chiefly around vendor support and service — prevent such moves from occurring en masse. When they do evaluate such options, buyers believe MSP have an advantage for service, as well as for other evaluation criteria.   ### The 5 minute guide to Openstack! Introduction OpenStack started as a joint venture between NASA and Rackspace. In 2012 the OpenStack Foundation was set up to create a private cloud experience available to everyone as an open source project. Globally there are approximately 3000 developers working on OpenStack code at any time, and the foundation boasts 17,000 members in 140 countries. Many of the developers are ‘gifted’ to the foundation from companies such as RedHat and IBM. Developers come together twice per year in a week-long open forum where they can share ideas, plan future functionality and agree to the way forwards. 
These ideas are then developed with the aim of including them in the next major software release. The OpenStack software is actually a collection of many software packages, each providing services that work together to create pools of infrastructure resources: compute, network and storage. The pools are made transparently available to end-users as cloud resources. The administrator can then configure individual tenants (or projects) to have their own resource quota sliced from the overall availability within the cloud environment. These quotas are typically shared among multi-tenanted projects, accessed through the Horizon web portal or via the OpenStack APIs. Business benefits OpenStack is a massively scalable cloud solution. Its greatest advantage is that it is licence-free, giving it the edge over the likes of VMware. Many businesses around the world have now implemented OpenStack. They range in scale from just one physical node up to hundreds of thousands of cores, such as at CERN, where 190,000 cores keep the data analysis going. It powers financial and e-commerce businesses such as PayPal and eBay. In 2016, Intel announced it was moving virtual machines away from VMware and onto OpenStack. A commercial drive has also seen OpenStack evolve significantly, and what was originally developed to provide private cloud environments has now been deployed successfully by public cloud providers. Opportunity for new business The current issue anyone looking to adopt OpenStack faces is that it can be tricky to implement. With so many software packages, developed by different teams, needing to interact with one another, it has been a challenge for the foundation to keep installation guides accurate, meaning the implementer needs to be highly skilled and prepared to spend months getting the installation and configuration right. In fact, SUSE has stated that “half of all enterprises that tried to implement an OpenStack cloud have failed” - and once it is running correctly, it needs maintaining (sometimes even fire-fighting). This has created a growth industry in commercially supported solutions. The offerings range from professional services and managed service support contracts to pay-per-use public solutions and physical OpenStack appliances capable of being integrated into IT estates quickly and easily. In summary OpenStack is complex and can therefore be extremely difficult to implement correctly. It’s imperative that the core is implemented properly, so the design and delivery of the software should follow standard IT architectural design methodologies. You should seek comfort in the fact that there are thousands of developers supplied by over 500 supporting companies actively working on the maintenance and roadmap, and that there are hundreds of OpenStack clouds in production operation, including many in the Fortune 500. ### Managed Kubernetes: Understanding Your Options in the Cloud Kubernetes is a popular container orchestration platform used by many professionals for software development, administration, and security. Containerisation is especially popular for cloud-native applications and environments that require continual updates. However, Kubernetes management can be complex, especially for enterprise-level operations. In this article, you will learn about managed Kubernetes, including definitions, features, and top solutions.
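Before comparing the options, it is worth noting that day-to-day interaction with a managed cluster looks much the same as with a self-managed one. The sketch below is purely illustrative and assumes the official Kubernetes Python client plus a kubeconfig already downloaded from whichever provider hosts the cluster; it simply lists the nodes the service runs for you and the pods scheduled on them.

```python
# A minimal sketch of talking to any Kubernetes cluster, self-managed or KaaS,
# with the official Python client (pip install kubernetes). Assumes the managed
# provider's kubeconfig has already been written to the default location.
from kubernetes import client, config

config.load_kube_config()   # reads ~/.kube/config produced by your provider's tooling

core = client.CoreV1Api()

# List the worker nodes the managed service is running for you.
for node in core.list_node().items:
    print("node:", node.metadata.name)

# List every pod across all namespaces, with its current phase.
for pod in core.list_pod_for_all_namespaces(watch=False).items:
    print(f"{pod.metadata.namespace}/{pod.metadata.name}: {pod.status.phase}")
```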
What Is Kubernetes as a Service (KaaS)? KaaS is a third-party, managed Kubernetes (k8s) offering designed to make implementing k8s easier. With KaaS, you get access to all of the benefits of k8s but your vendor manages some or all of the configuration, deployment, and maintenance. Some vendors also provide hosting for deployments. These services enable organisations that do not have the expertise or willingness to manage k8s on their own to smoothly orchestrate container deployments. KaaS provides the same scalability, self-healing, and workload management settings, with the addition of personalised support and often pre-built template configurations. Why Use a Managed Kubernetes Service? While it may be tempting to deploy Kubernetes on your own, this can be a daunting process if you do not have in-house expertise or dedicated staff. Kubernetes is notoriously complex to set up and maintain, presenting a barrier to many organisations. KaaS can provide a solution to these issues. Move Your Business Forward Kubernetes isn’t a magic bullet, but it can help you deploy applications and services faster than your existing methods. It can also help you ensure better availability than traditionally deployed services. Using a managed service, you can achieve these benefits without added burden on your developers or operations teams. After your vendor has deployed, teams can begin immediately using k8s-managed containers with minimal effort. Since KaaS typically includes monitoring and service level agreements, you can be sure that your deployments are secure and remain operational. This is ideal for smaller organisations that may not have the staff or time needed to manage k8s on their own. Simplify Open-Source Open-source can be great. It enables flexibility and cost savings that proprietary solutions can’t match. However, it also requires much more effort on the organisation’s part to operate and maintain. These solutions also do not come with the dedicated support that enterprise software does. KaaS vendors can provide this missing support and can often help you integrate other open-source tools you may be using, for example Jenkins or other continuous integration / continuous deployment tools. Additionally, vendors often offer additional tooling to simplify monitoring and reporting functions. These services can help you customise the open-source framework of k8s to more closely meet your needs. Features to Look for in a Managed Kubernetes Platform If you’ve decided that a managed platform is the right choice for you, you have several options to choose from. However, not all are equally suited to your needs. When choosing a vendor, you should ensure that they provide the configurations and services that best suit your requirements. To do this, make sure to look for the following features and traits:
- Includes pre-configured environments
- Cloud agnostic and supports hybrid environments
- Integrates with existing services
- Includes built-in authentication and access controls
- Supports automation and load-balancing
- Easy to use and customisable interface
Top Managed Kubernetes Platforms Once you understand what features to look for, you can begin considering your options. Below you can find information on the top KaaS providers. Google Kubernetes Engine (GKE) GKE is one of the most well-developed platforms, integrating seamlessly with Kubernetes. This makes sense since k8s was developed by a Google team.
This service was designed for use on the Google Cloud Platform but you can also use it with hybrid environments. GKE provides features for self-healing, IP range reservation, private container registries, management of master nodes, and integrated logging and monitoring with StackDriver. It also provides automatic updating and auto-scaling. AWS Elastic Container Service for Kubernetes (Amazon EKS) EKS is an extension of AWS’s Elastic Container Service (ECS) that is optimised for k8s. It runs on EC2 instances and can be used across availability zones. EKS includes features for updates on demand, built-in security, and access controls. You can integrate EKS with CloudTrail and CloudWatch services for logging and auditing purposes. Currently, it is only available for use with AWS resources and does not support hybrid configurations. Azure Kubernetes Service (AKS) AKS is the managed service offered by Azure. With AKS, you can deploy k8s to Azure instances, or hybrid environments via Azure Stack. It includes features for traffic management, built-in security, automatic upgrades, and batch automation. You can manage your deployment via command line, web console, or through integration with Terraform. Platform9 Platform9 is a cloud-agnostic, third-party managed service. You can use it with any cloud platform, on-premises, or on VMware. It offers automatic upgrades and pre-configured environments for fast deployment. With Platform9, you can manage your deployment from a web interface. IBM Cloud Kubernetes Service IBM Cloud Kubernetes Service is a certified k8s service, hosted on the IBM cloud. It integrates with more than 170 existing IBM cloud services. This service includes features supporting the deployment of IoT, blockchain, machine learning, and analytics applications. OpenShift OpenShift is an open-source, cloud-agnostic platform that is available in free and paid formats. You can use it in a public cloud via pre-configured container templates, as a managed, private cluster in a public cloud, or as a Platform as a Service in a private cloud. It includes features for built-in monitoring, security, and custom domain routing. It also comes with a library of prepackaged applications. You can manage OpenShift via console or command line. Conclusion KaaS can help you delegate Kubernetes management responsibilities to experts in the field. KaaS is offered by leading technology companies, such as Google, AWS, Azure, Platform9, IBM, and OpenShift. When choosing a KaaS service provider, look for the features that serve your business best. ### Deloitte Expands Cloud Services With Acquisition Acquisition of cloud consulting business of Day1 Solutions to accelerate clients’ digital and core transformation Deloitte announced today significant investments in its cloud services, providing clients with an advanced set of analytic and cognitive capabilities to enable their digital transformation. Investments include the acquisition of substantially all of the assets of Day1 Solutions Inc. (“Day1”), an innovative cloud consulting firm; the addition of 3,000 new, U.S.-based high-tech engineering jobs; and the opening of new cloud studios across the country. “Cloud is the backbone of innovation and a conduit for clients to reimagine how they do business,” said Ranjit Bawa, principal, Deloitte Consulting LLP. “For years, we’ve helped our clients view cloud integration as a critical driver for business transformation.
By adding these significant investments to our portfolio, our clients will have access to deeper cloud expertise and even more innovative capabilities, as well as the talent they need to help them thrive in a fast-moving digital economy.” The services of Day1’s team will further enhance Deloitte’s cloud capabilities and accelerate its collaboration with leading cloud platforms in the market. Day1’s customer base extends across commercial industries and government agencies, with significant success in the public sector. “As one of the largest professional services organisations in the world and a recognised leader in applying new technologies to businesses, Deloitte provides the reach and expertise needed to share our award-winning cloud solutions with a diverse client roster,” said Luis Benavides, founder and CEO of Day1. “At the end of the day, our commitment to clients is to harness the power of the cloud to accelerate their digital transformation.” Deloitte is also amplifying the value of cloud by adding 3,000 new engineers to help organisations integrate, streamline and manage business operations in the cloud. This group will focus on innovating cloud technologies to accelerate and enhance the depth of analytics and cognitive solutions available to customers. The majority of these new engineers will operate out of three new cloud studios to be opened in Orlando, New York and Washington, D.C., over the next year, expanding the existing network of 44 Deloitte Digital studios worldwide. “The growth of cloud computing has skyrocketed over the last few years – according to Synergy Research Group, $148 billion in 2016 and growing 25 percent annually. Cloud capabilities are the great enablers of digital transformation, and there’s a strong demand from clients to help them innovate their businesses with new cloud-based platforms,” added Bawa. “These strategic investments help strengthen Deloitte’s cloud-based solutions by providing clients with a full spectrum of digital, analytics and enterprise cloud services – that ultimately power business agility and growth in a cloud-driven world.” For more information about joining the Deloitte Digital team, please visit the following site: http://deloittedigital/careers/cloud. ### Cloud Call Centre, NewVoiceMedia, partners with Monet Software NewVoiceMedia announces partnership with Monet Software to optimise call centre management and customer experience NewVoiceMedia, a global provider of cloud call centre and inside sales technology that enables businesses to have more successful conversations, has announced a partnership with Monet Software, a leading cloud-based workforce management solution provider. ‘Monet WFO Live’ integrates seamlessly with NewVoiceMedia’s ContactWorld platform to help optimise customers’ call centre management and operations to transform their customer service experience. [easy-tweet tweet="Service levels are optimised for everyday" hashtags="Cloud, Technology "] By combining ContactWorld with Monet WFO Live, managers can take positive steps towards balancing the dilemma of needing to minimise operational costs while maximising service levels. By enabling organisations to use their call centre data, Monet WFO Live improves the accuracy of the business’s contact demand forecasts and creates staffing schedules that ensure agents with the right skills are made available at the right time. 
Furthermore, service levels are optimised, everyday use is simple, and the upfront costs and burden on IT resources associated with traditional technology are removed. According to Gartner, “A growing market awareness of the importance of the employees in customer engagement centres is triggering an adjustment in the technologies needed to manage their day-to-day roles.” Jonathan Gale, CEO of NewVoiceMedia, commented, “We are delighted to announce our partnership with Monet Software, which will give our customers the benefit of a premium pre-integrated partner, together with the ease of use, efficiency and reliability they expect. Together, ContactWorld and Monet WFO Live will help businesses develop an effortless customer experience and excellent agent efficiency, ensuring those with the right skills are available at the right time”. Chuck Ciarlo, CEO of Monet Software, said, “Monet customers have already discovered the forecasting, scheduling, quality and performance management benefits of WFO Live Workforce Optimisation in the cloud. When these time and cost-saving advantages are combined with ContactWorld, the result will allow call centres to take advantage of the highest level of cloud technology solutions, delivering the ultimate in customer experience”. ContactWorld is a multi-tenant intelligent communications platform that enables sales and service reps to have more successful conversations with their customers and prospects worldwide. Core call centre functionality such as omnichannel contact routing, self-service IVR, automated outbound dialling, screen pops and instantaneous CRM updates is provided with proven 99.999% platform availability. For more information about NewVoiceMedia and Monet Software, visit www.newvoicemedia.com ### PaaS and Cloud-First in Web Content Management: What Marketers Need to Know. With the API economy becoming ever more prevalent, as well as organisations increasingly basing their activity around secure access to and exchange of content and data, the wider implications for marketers need to be discussed. A recent report by Forrester Research predicted that ‘PaaS and cloud-first are winning in Web Content Management (WCM)’ and we should take heed. 2017 continues to witness a number of companies embracing cloud-first, platform-as-a-service (PaaS) and microservices-based architectures. There are a number of key reasons why these approaches are essential to marketers’ success. So what do we need to know? Increased marketing capability PaaS is a revolution in the way products themselves are built, going beyond simple cloud-hosting, and is directly impacting marketing strategies. First of all, deployment of such systems provides increased application independence from the underlying cloud infrastructure on which it runs. Infrastructure becomes a commodity and the real added value lies in the application that runs on top of it, wherever that may physically be. Secondly, PaaS applications typically expose themselves in a very modularised way, as a series of services that can be consumed from anywhere, by any other application. Services may expose data, content or other digital information, and therefore enable easy aggregation of information from various environments to create a new combined set of information with some unique value.
[easy-tweet tweet="PaaS solutions enable marketers to rationalise and combine their assets" hashtags="PaaS, Cloud"] In the world of Web Content Management, this means that PaaS solutions enable marketers to rationalise and combine their assets,  previously not as accessible in locked-down repositories. As PaaS becomes increasingly adopted across different business platforms, it is allowing marketers to broaden their capabilities. Through these cloud-based tools, marketers are able to pull together all relevant strategies, aspects and techniques involved in building a campaign or broader digital experience far more seamlessly. And it becomes much easier to do so consistently, across an ever-growing range of touchpoints and channels. Improved methodology and time-efficiency The development of PaaS and cloud-first via WCM has allowed for more software simplicity, meaning marketers can exchange and make the best use of information across their systems and software in a quick, methodological and time-efficient manner. Information is key to successful marketing techniques, and PaaS systems will prove simpler due to the smaller, more identifiable pieces of information that can be accessed and (re)used.  Here at SDL, we anticipated this trend, acknowledging that by building a platform around the necessary tools for successful marketing processes to take place, this would prove beneficial long term. Fully separated content, structure, design, and application logic are all benefits for today’s marketing professionals, providing both content and context in a channel-agnostic way. Global outlook We cannot underestimate the importance attached to global, multi-lingual marketing. The universal demand for brand messaging - identity relation for customers combined with strategies for promotion - involves a wide variety of staff, skills, insights, and expertise to deliver value across the globe. [easy-tweet tweet="Organisations need a platform that offers true agility" hashtags="Cloud, PaaS"] To get this right is not easy – especially when launching campaigns on a global stage. Organisations with global ambitions need a platform that offers true agility. A content management system that offers, for example, a headless and decoupled architecture, makes life easier for marketers who want to rapidly publish content across multiple channels and devices. Cost reduction Pricing is a critical part of the marketing mix for any international firm, ultimately creating revenue and increasing client retention levels. Translating marketing content and subsequent global content syndication across many different channels in various languages is costly. However, building on a cloud-oriented approach, the following is possible; individual marketing material and components can be translated, exposed through PaaS services, and repurposed by a variety of content consuming applications. This provides unprecedented cost savings, workplace efficiency and marketing content consistency. The rise of the API economy means that PaaS and cloud-first tools will become increasingly more essential to executing successful global marketing strategies. And the benefits of introducing such tools are significant for marketers - from reducing overall costs, increasing efficiency and ultimately enabling marketers to join up relevant digital experiences seamlessly, PaaS and cloud-first are the winners for WCM. 
### Innovation & collaboration: Recent trends in cloud security Data security providers have been developing various innovative solutions to enable security of customer data. Moreover, organisations have been collaborating with security providers to enhance their capabilities of data security along with efficient management. As organisations have realised the importance of shifting their data on the cloud to enhance efficiency and avail other benefits, the next step involves the security of data. Data security providers have been developing various innovative solutions to enable security of customer data. Moreover, organisations have been collaborating with security providers to enhance their capabilities of data security along with efficient management. The market for cloud security is booming owing to major steps taken by organisations to ensure the security of data and innovative solutions offered by providers. According to the research firm Allied Market Research, the global cloud security market is expected to reach $8.9 billion by 2020. Following are some of the recent activities taking place in the industry highlighting the trends: Innovation to offer new capabilities Innovation has become a focal point for cloud security providers to protect the data of their customers. Tufin, the provider of Network Security Policy Orchestration solutions, has offered its latest cloud-native solution, Tufin Iris. It will offer visibility and control for the security policies for applications based on the cloud. The Tufin Iris will integrate with DevOps pipelines for enabling continuous compliance and maintenance of business agility. This solution will provide a rise in visibility into various risks for cloud-based applications of organisation deploying public cloud environments. Moreover, the solution will enable development teams to possess the tools that are important for validating compliance with security policy. Riding on the wave of innovation, National Australia Bank has been employing enterprise security practices for coping up with cloud-first outlook. It has deployed a new tool to determine the security gap and fix it in nearly 45 seconds. The cloud security team of the bank has joined hands with Amazon Web Services (AWS) Professional Services for developing a continuous compliance solution. This will enable them to continuously and systematically track the security compliance position with the help of AWS. This will enable identification of the risk and gap, determine security control measures, and deliver the code as fast as possible. NAB is able to get the real-time view of the security compliance for the enterprise along with allowing development teams to offer the solution faster. The solution has enabled the access to security updates of the entire enterprise by just pushing a button. Moreover, it helped the bank to roll down the deployment time of enterprise security control from 19 days to 45 seconds. Security architect Ivan Sekulic outlined that the solution is an example of maturation of IT security capabilities of NAB. Collaboration to enhance capabilities Collaboration has been one of the major strategies adopted by firms to enable cloud security. Wipro decided to collaborate with the Check Point for availing cloud security services. 
The services of Check Point, known as CloudGuard IaaS, will provide Wipro’s clients with advanced fifth-generation threat prevention capabilities for enterprise edge applications and virtualised cloud deployments, according to Sheetal Mehta, Senior VP and global head of cybersecurity and risk services at Wipro. Check Point said the partnership with Wipro will help accelerate demand for its range of security products, which are based on threat prevention and on securing enterprises’ cloud applications. CloudGuard protects nearly 2,000 enterprise cloud environments. The partnership with Check Point will help optimise investments in cloud infrastructure. Automated, dynamic policies leverage contextual information about the infrastructure and adjust security policies as the cloud environment changes. Moreover, capabilities such as rapid deployment and logging & reporting will help consolidate cloud security for enterprises. Capitalising on the trend of collaboration, AWS is gaining support for its cloud from Cylance, with CylancePROTECT. Cylance offers artificial intelligence (AI) capabilities for the detection and prevention of malware. CylancePROTECT supports AWS Linux, protecting applications on cloud services infrastructure from the various cyber threats posed by bad actors. Moreover, its AI capabilities offer insights on threats along with threat elimination capabilities. The company will deploy its AI-driven, prevention-first security solutions for cloud computing environments, using machine learning techniques to provide scalable threat detection, root cause analysis, response, and threat hunting. Cylance will help AWS prevent data breaches that affect the security of organisations’ data in the cloud. In another collaboration, AlgoSec has signed an agreement with Microsoft. This collaboration will enable interoperability between Microsoft Azure Firewall and the AlgoSec Security Management Suite to facilitate central security management. Customers of Azure Firewall will be able to manage different instances across various regions and Azure accounts. Anner Kushnir, VP of Technology at AlgoSec, noted that enterprises have been accelerating their adoption of Microsoft Azure, and as adoption increases they need the ability to manage all the security controls and devices protecting their deployments. The collaboration will speed up Azure deployments while providing full visibility and control over security devices. Moreover, its capabilities will enable the detection of security risks and misconfigurations, along with extensive security policy management. ### Syneto Shakes up the Hyperconverged Infrastructure Market with HYPER Series 3000 Platform All-in-one hyperconverged platform enters market at half the cost of a traditional infrastructure Syneto, a leading hyperconverged infrastructure vendor, unveiled on the 22nd March its new HYPER Series 3000, a range of hybrid and all-flash hyperconverged platforms with built-in instant disaster recovery (DR) capabilities, which allows users to restart their infrastructure on the DR unit within 15-30 minutes. These new products provide industry-leading performance and agility and broaden the company’s solution portfolio. The new series consists of two platforms: the HYPER Series 3100 and 3200.
The HYPER Series 3000 is an all-in-one hyperconverged system designed to address the data management, virtual application deployment, usability and data recovery challenges that businesses face today. It aims to replace traditional IT solutions and put an end to the inherent frustrations and restrictions that are faced by enterprises with small to medium-sized data centres as well as SMBs and organisations with remote offices. [easy-tweet tweet="HYPER Series 3100 can sustain up to 24 medium-sized VMs" hashtags="Hybrid, Software"] The HYPER Series 3100 can sustain up to 24 medium-sized VMs with up to 16TB effective hybrid storage capacity, including space-saving mechanisms such as compression, deduplication and thin provisioning. The HYPER Series 3200 is designed to sustain up to 48 medium-sized VMs, with different capacity and performance options available:
- Hybrid: up to 16TB effective hybrid capacity
- Hybrid2 (double-hybrid): 8TB hybrid tier alongside a 3.2TB all-flash tier
- All-flash: 7.2TB all-flash
The HYPER Series 3000 is a perfect fit for virtual server workloads, including e-mail and domain servers (HYPER 3100 series), as well as high-performance virtual applications such as databases and Enterprise Resource Planning (ERP) servers (HYPER 3200 series). Because hyperconverged infrastructures leverage the full potential of modern computing solutions, they are able to deliver a cost-effective, full-stack IT infrastructure with significantly improved application performance, space efficiency and fast resource deployment. According to Gartner, the hyperconverged market will reach $5bn in sales and account for 24% of the storage market by 2019, and by 2020, 10% of all systems will be self-protecting. Given the evolution of the market, and by listening to existing customers, Syneto has focused on a software-defined storage approach. To ensure business continuity, the new series provides hyperconverged systems with built-in disaster recovery capabilities to help organisations combat the challenges of traditional IT infrastructures: rising costs, management complexity and lack of proper disaster recovery capabilities. “Leveraging our market experience and technological know-how, it was an obvious step for Syneto to move resolutely into the hyperconvergence market. We have created something that businesses of all sizes require: an all-in-one hyperconverged infrastructure with a built-in disaster recovery capability which is affordable and offers high performance at the same time,” said Vadim Comanescu, CEO of Syneto. The HYPER Series 3000 runs virtualised business applications, shares and stores files in a highly efficient manner and ensures the safety of all data via the built-in disaster recovery unit, which can “replay” the customer’s infrastructure on the included DR unit within 15 to 30 minutes of a downtime event. Data security is further enhanced by always-on multi-layered data protection at HDD, file and VM level, as well as the ability to create up to 1440 automatic backups per day, for every application, offering exceptional protection against data corruption. All the products in the series utilise the latest V4 Xeon Processors for high-performance computing and the latest innovations in NAND flash technologies. Performance is never compromised. Thanks to Syneto’s intelligent caching mechanisms, which cache data across multiple storage media types, virtual applications benefit from 5-20 times higher performance compared to traditional IT infrastructures.
Furthermore, the products feature built-in 10Gb software-defined network connectivity between applications and storage, increasing performance even further and removing the need for additional expensive networking hardware purchases. Performance-hungry business applications can be hosted on separate all-flash tiers to provide truly groundbreaking performance. ### The Big Question is: When will Ransomware Attacks Hit? The government’s acknowledgement, in its new report ‘The cyber threat to UK businesses’, that the escalating threat of ransomware attacks is a question of “when, not if” for UK organisations was not accompanied by sufficient advice on recovery. This is according to Peter Groucutt, managing director of disaster recovery service provider Databarracks. Last week, the National Crime Agency (NCA) and National Cyber Security Centre (NCSC) launched their first joint report into ‘The cyber threat to UK businesses’. The document outlined what it expects to be the major trends seen across the cyber security industry over the coming months, highlighting the “significant and growing” threat of ransomware to UK businesses. While the report advised UK organisations to combat cyber-attacks with robust awareness, reporting and cyber security programmes, it failed to acknowledge the more immediately actionable role good continuity practices can play in surviving and recovering from cyber-attacks. [easy-tweet tweet="Ransomware experienced an explosive growth last year." hashtags="Security, Ransomware"] Groucutt comments: “Ransomware experienced an explosive growth last year, with over 60 new variants emerging since the start of 2016. Industry practitioners have suggested that the sophistication and ferocity of attacks have seen organisations part with over $1 billion to retrieve their encrypted data, with SMEs and individuals increasingly being targeted. “There is a clear and urgent need for organisations to increase their survivability of - as well as defences against - cyber-attacks in the near future. The pervasiveness of ransomware is particularly troubling. It’s a hugely lucrative industry, and traditional security measures, such as anti-virus, are failing to keep pace. Whilst outright prevention of an attack may be impossible, good continuity practices, such as a carefully tailored backup solution, can effectively negate the consequences.” [easy-tweet tweet="Supporting this is the need for an effective backup strategy." hashtags="Security, Backup "] Groucutt continued: “It is also critical that an effective incident response plan and backup strategy are in place; something that was surprisingly omitted from the government’s advice within the report. Whilst we typically advise customers to plan for the impacts of disruption, rather than the specific scenario that caused it, certain cyber threats do warrant specific response plans, and this is certainly the case for ransomware. It would be advisable for UK organisations to make a ransomware attack the next focus of any future continuity planning if they haven’t done so already. “Supporting this is the need for an effective backup strategy. In the event of a ransomware attack a business will have two likely options: recover the information from a previous backup or pay the ransom. The challenge remains that many traditional DR services are not optimised for cyber-threats. Replication software will immediately copy the ransomware from production IT systems to the offsite replica.
Replication software will often have a limited number of historic versions to recover from, so by the time an infection has been identified, the window for recovery has gone. This means that ransomware recovery can be incredibly time consuming and requires reverting to backups. This often involves trawling through historic versions of backups to locate the clean data. Partnering with a specialist can dramatically reduce this process, ensuring faster recovery and ultimately greater peace of mind. “The threat of ransomware will only increase so steps need to be taken to mitigate risks. The advice from the government provides a solid foundation for those looking to address this but it is imperative this is supported with an effective response plan and backup strategy,” Groucutt concludes. ### Legacy Technology is the Biggest Barrier to Digital Transformation Nimbus Ninety’s research, in partnership with Ensono, identifies leadership and vision as large contributors to failed digital transformation projects. On the 21st March, Nimbus Ninety, the UK’s independent research community for disruptive business and technology leaders, released its latest research into digital transformation in partnership with Ensono, a leading cloud solutions and Hybrid IT services provider. The Digital Trends Report interviewed 251 senior stakeholders responsible for digital transformation initiatives and found that 36% view increased competition from digitally driven companies as a challenge, yet more than half (52%) of respondents rated their organisations’ progress toward achieving their digital ambitions as adequate or poor. The report, which reviews forces and barriers to digital transformation, revealed that 30% still do not feel well-equipped to seize the opportunities presented by digital. Conversely, 46% stated their organisation has significantly transformed over the past 12 months, suggesting a widening gap between those who are embracing digital transformation and those who have yet to start. The report also measured barriers to digital transformation, with legacy systems taking an overwhelming lead in blocking progress. Responding to the legacy issue, Simon Ratcliffe, Principal Consultant at Ensono, says: “Although legacy systems are often cited as a barrier to transformation, they are, in fact, an enabler. Legacy systems need not be replaced to achieve transformation, they need to be factored into the execution. Progressive organisations are building new, transformational solutions in parallel with their legacy systems and slowly switching business operations across. Legacy IT and modern cloud-based solutions can co-exist and ignite the transformational journey.” [easy-tweet tweet="27% do not have a clear digital strategy." hashtags="Digital, Technology "] The research also found that 39% of respondents are still in the process of developing their digital strategy compared to only 29% having one in place. A further 27% do not have a clear digital strategy communicated across the entire business. Instead, separate departments have formulated their own strategies. This decentralised, reactive approach results in pockets of digital success that are difficult to measure and replicate across the business. Inevitably, these organisations will struggle to defend their market position against competitors with a more formulated digital strategy.
The report highlighted issues of leadership and vision as large contributors to failed digital transformation projects. A broader cultural change is necessary for digital transformation to succeed. Nimbus Ninety’s findings on the reasons for failed projects rebut the notion that technology changes the culture, and instead show that technology only enables culture to change. Simply implementing the technology and expecting it to work without supporting a culture change through strong leadership, vision and HR dramatically reduces the chance of a successful digital transformation. [easy-tweet tweet="Technology alone does not drive digital transformation." hashtags="Technology"] Simon added: “Technology alone does not drive digital transformation. Digital transformation is only truly effective when an organisation embraces it completely and reshapes the way it thinks and acts. Digital transformation is the perfect opportunity for IT to step up and take control of the strategic agenda but for this to be an effective approach, all parts of the business need to be engaged before any technology is adopted.” The Digital Trends Report 2017 illustrates the importance of planning the journey to business transformation and the technology needed to facilitate it. Jessica Thorpe, senior research analyst at Nimbus Ninety, says: “At the beginning of any business transformation process, it is important to understand what IT assets are already in the business, what they do and what value they deliver. From here, you can work out the best approach to operating today and optimising for tomorrow. Working in partnership with an expert provider of digital transformation will enable organisations to deploy agile delivery techniques and spend the precious budget on technology that delivers the fastest time to value, while also making the best of their existing assets. Only then will businesses keep pace with their customers’ expectations and drive operational efficiency.” To download the full report, please visit: https://www.ensono.com/downloads/uk/digital-trends-2017 ### SailPoint awarded U.S. Patent for Securely Managing Identities From the Cloud Patented technology allows employees to safely manage enterprise passwords from mobile services SailPoint, the leader in identity management, today announced that the U.S. Patent and Trademark Office has recognised SailPoint’s innovative approach to empowering enterprise employee and partner populations to manage their network access credentials, even when located remotely and locked out of the corporate network. This new patent award enables SailPoint to uniquely provide a solution that ensures security for the enterprise and ease of use for both remote workers and office-based employees and partners. SailPoint’s patent number 9,600,656, entitled ‘System and Method for Domain Password Reset in a Secured Distributed Network Environment’, is another example of the company’s innovative approach to securing identity data for global enterprises. [easy-tweet tweet="A significant loss in productivity and end-user dissatisfaction" hashtags="Cloud, Technology"] Typically, company-issued laptops and other mobile devices don’t allow users to recover from forgotten network passwords while away from the office and disconnected from the company domain servers. This presents a challenge to anyone who works remotely or travels as part of their job responsibilities.
Should they happen to forget their network password while disconnected from the corporate network, until now there has been no way to reset it without employing duplicate local accounts or physically connecting the device to the main company domain server and network. In this situation, remote employees are effectively locked out and must ship their laptop to the company's IT staff so they can reset the password and then ship it back. In the case of the business traveller, they must wait until they return from their business trip to re-attach to the corporate domain server. This all leads to a significant loss in productivity and end-user dissatisfaction. SailPoint’s patented technology enables remote business users to reset their local and domain password remotely without having to involve IT. This process leverages SailPoint’s newly patented technology, and the SailPoint open identity platform, to create a seamless, secure and convenient end-to-end solution. Once reset, the user’s password is automatically updated across the connected Windows domain, eliminating any need for physical connectivity or the arcane shipment of hardware. “Companies of all sizes need to ensure the security of remote workers without sacrificing productivity and ease of use,” said Darran Rolls, CTO of SailPoint. “This patented technology illustrates our continued focus on infusing innovation into our open identity platform to solve real-world issues that our customers face today. With this capability, when a remote worker needs to reset their password and they cannot connect to the corporate network, we can provide a simple, convenient self-service solution that does not require direct IT involvement. This means cost savings, improved productivity and a better end-user experience.” This new patent complements SailPoint’s existing patent portfolio which includes a “System and Method for Securing Authentication Information in a Networked Environment,” highlighting the uniquely differentiated way that SailPoint’s SaaS solution, IdentityNow, maintains no knowledge of the administrative credentials used to access enterprise resources located behind the firewall, ensuring that even in the case of a rogue employee in the data center, the customer’s infrastructure will remain secure. ### Imagine a Hands-Free Work Future Engineers are currently designing both computers and mobile devices to be more hands-free. This means people will have more meaningful conversations with computers. The primary way that engineers envision people communicating with devices is through voice. The future of hands-free human-machine interfaces depends on how well the devices understand human speech, with all its mistakes, pauses and accents. There will likely be multiple different points at which people can use voice to have conversations with computers. Languages vary considerably, with complex features such as tones and stresses. In addition, people speak very differently, at different volumes and with a wide range of pitches. The mistakes of Skype's real-time translation software illustrate how far artificial intelligence has to go even in translating between two different variations of the same language. [easy-tweet tweet="Voice is becoming more of an interface mechanism. " hashtags="AI"] The Rise of the Digital Assistant Digital assistants such as Alexa, Siri, Cortana, Google Assistant and others reveal a future where the screen as a computing interface could be eliminated. Voice is becoming more of an interface mechanism.
Yet the devices seek to supplement voice with data. For example, Cortana mines its work users' emails and calendars. Soon, with the Microsoft Office 365 cloud service, Cortana should gain the ability to search files and find pertinent documents. Banking giants Wells Fargo and Visa are also working to create their own voice and biometric identification systems. These will use voice to engage in actions such as transferring funds. Hands-free computing through voice control promises the ability to engage with computers while going about business as usual. But voice control poses a number of security risks. Even soft verbal communications can be picked up. People can also use mimicry to impersonate the authorised user and steal data. This means that voice control on a broad scale could threaten both individual and corporate rights to privacy and secrecy. Voice control may offer a new level of functionality and benefits without the need for people to have a phone in front of their face. How offices will seal off these security risks and translate voice control to the world of work remains to be seen. Multi-Threading: The Holy Grail Devices or programmes that are multi-threaded, or able to remember multiple situations, may be the key to improving conversations with AI bots. Today, a user usually must finish a use case before starting another one. Most people do not always finish an old conversation before beginning another one. When a user can engage in multiple conversations with an AI bot at the same time, in different stages, AI bots may be able to achieve much more. Engineers are currently working on how to give devices the flexibility to offer both voice and text as an interface. Voice can work in the privacy of a home or car, particularly as more workers start to perform their jobs remotely. In a crowded office, text is likely a better option. [easy-tweet tweet="AI is now limited to natural language processing" hashtags="AI"] Building AI Into Different Systems Engineers are also working on integrating AI into different systems and bots. This is a challenging task. AI is now limited to natural language processing and some basic skills correlated to fixed data, such as weather, traffic, trivia and inquiries about a company's inventory. If individuals want to communicate with bots in a more meaningful way, the bots must be "smarter": more proactive and intuitive. They need to learn about the individual with whom they are communicating. The bot should know the user's preferences and behaviour, to anticipate and suggest potential needs. Without such "learning," the bot cannot really engage in meaningful two-way communication. Although an individual can be very forgiving in the beginning if the bot doesn't always get it right, users expect a bot to improve and learn from their behaviour. In 2017, users will be looking for bots that are dedicated to achieving more. ### Dell Boomi acquires ManyWho Dell Boomi™ (Boomi) announced it has completed a transaction to acquire ManyWho, a unified cloud and low-code development platform. ManyWho™ simplifies workflow automation and allows businesses and developers to turn business processes into rich software applications to connect employees, customers and core systems. Workflow automation is a critical need for modern businesses and organisations pursuing digital transformation and IT modernisation.
Adding the ManyWho low-code capabilities to Boomi’s market-leading integration platform accelerates the company’s ability to deliver workflow automation to customers on a unified platform, something no other company can match. [easy-tweet tweet="Boomi is the world’s leading cloud integration platform." hashtags="Cloud, Integration"] The acquisition unlocks the ability for businesses to maximise best-of-breed cloud applications, driving efficiency, accelerating time to value and building a competitive advantage. With the addition of ManyWho, the Boomi platform provides customers with the enabling technology to address the challenges of Hybrid IT. Now, the platform allows businesses to connect, manage data changes, ensure data quality and re-establish efficient business processes across their IT landscape. Boomi is the world’s leading cloud integration platform. ManyWho is rethinking the way businesses manage workflow automation. Together, the Boomi platform offers the only solution where customers can move, manage, govern and automate data and processes in a unified way. [easy-tweet tweet="“Boomi is a perfect fit for ManyWho"" hashtags="Boomi, Cloud"] “Both Boomi and ManyWho were born in the cloud and are cloud-native. Without an on-premises legacy to manage, Boomi provides instant access to services with no installation, effortless and automatic software upgrades and crowd-sourced ease-of-use to achieve short time-to-value,” said Chris McNabb, CEO of Boomi. “Boomi plus ManyWho brings world-class integration together with leading cloud workflow automation. This combination provides our customers with a connected workflow which is key to an efficient and differentiated business.” “Boomi is a perfect fit for ManyWho. Both companies are committed to delivering innovative ways to help companies move fast, be nimble and collaborate at scale,” said Steve Wood, co-founder of ManyWho. “Joining Boomi allows ManyWho to scale quickly, offering businesses the end-to-end integration solution they need to drive Digital Transformation.’’ ManyWho was founded in 2013 and is headquartered in San Francisco, California. Boomi plans to keep ManyWho’s employees and existing operations and will continue to invest in additional engineering, channel, marketing, professional services, support and sales capability to grow this business. ### Blockchain: The Third Revolution in Trust Blockchain is helping to transform how organisations trust each other, and to understand why, you need to understand the nature of trust. As humans we learn to trust people based on personal knowledge - I trust you because I know what you have done in the past. This type of person-to-person trust works fine in tribal or village economies, but psychologists will tell you it doesn’t scale beyond about 150 people. Therefore, throughout history, we had to invent new trust mechanisms in order to scale our economy. The first great revolution in trust was coinage, first minted around 640BC in Turkey. It provided a universal mechanism of immediate asset exchange, backed by a central authority (usually a king or emperor who put his image on the coin). Coins enabled the portability and accumulation of wealth, allowing us to build cities and pay for armies, which helped us to build the first regional empires and trading networks.
[easy-tweet tweet="It's a kind of Uberisation, but without the Uber" hashtags="Blockchain, History, Finance, Economy "] Around 1500, along with a wave of religious reformation we had the second trust revolution - the concept of money.  Invisible money and future money - credit, dividends, return on investment - again backed by central authorities such as banks and corporations, particularly in the more protestant and mercantile countries like England and the Netherlands.  This new source of large-scale investment helped to fund global exploration, the industrial revolution and created the foundation for our modern financial systems. Blockchain removes the need for those centralised trust authorities - offering the potential for providers and consumers to come together in a mutually beneficial network - a kind of Uberisation, but without the Uber.  It’s really a modelling of the person-to-person trust from ancient societies, but on a global scale - everyone in a business network can see the actions of everyone else, you can’t hide or delete behaviours. If that’s the case, then Blockchain has the potential to be the third revolution in trust.  I believe that, just as coins and money helped to grow the global economy, the applications which Blockchain will enable will drive our economy into the 21st century and beyond.     ### Three Signs your Digital Collaboration is Failing to Meet the Mark  Having been preached to for years, enterprises are well-versed on the benefits that they stand to gain through cloud-based collaboration tools: increased productivity, increased agility, a truly connected workforce and importantly, cost savings. Unfortunately, effective digital collaboration is one of those things that almost every enterprise agrees it needs, but very few are actually able to deliver on. The problem is that while most companies are trying to improve performance by investing in a variety of new technologies, the results tend to fall short: one-off initiatives deployed in isolation by individual departments; a new tool that creates more productivity issues than it solves; a short-lived spike in adoption that isn’t sustainable. Many enterprise leaders reading this will be forgiven for believing that they’ve already ticked-off their digital collaboration objectives. They’ve deployed a file-sharing system, an extranet to share content with clients, video conferencing, and are seeing strong adoption – so what else is there? [easy-tweet tweet="Many employees are still battling with mundane tasks." hashtags="Cloud, Digital "] The reality is, despite these shiny new technologies, many employees are still battling with mundane tasks. A survey of 200 US and UK professional service firms in 2016 found that 33 percent of professionals were still missing deadlines because they were waiting for approvals. Worse still, 68 percent admitted to working on a file, only to discover that it wasn’t the latest version. And let’s not even start on the vast amounts of information and specialised knowledge that is still solely stored within an individual’s emails. It’s here that productivity suffers, and the right digital collaboration tools are needed to help an enterprise. Here are three signs that more efficient collaboration tools are needed within your enterprise: You have too many tools Digital collaboration is often entered into cautiously – one department, workflow or business function at a time. This is entirely understandable and logical. 
But if care is not taken, organisations can be left with dozens of disconnected point solutions. When this happens, employees become frustrated and productivity actually suffers. For example, employees whose job involves handling or using a lot of information – lawyers, teachers, and scientists – tend to use on average eight different collaboration tools every day to perform their jobs. Switching between this many tools to perform different tasks - edit documents, share comments, and coordinate deadlines - can create significant interruptions to workflows. Added to this burden, piecing together information from these various sources is time-consuming, audit trails are accidently broken, and team updates are easily missed. The only real solution is a single collaboration tool that is sufficiently flexible to accommodate any style of working. This allows employees to spend less time organising documents and tasks, chasing approvals, and searching through email - and more time delivering exceptional results. Your existing tools aren’t being used Low adoption rates are often caused by individual reticence to change, usually caused by the individual not appreciating how the new tool will improve their way of working or benefit them personally. Everyone has their own motivations, targets, preferences and suspicions when it comes to changes in how they work. These differences can be categorised into personas, which each need to be treated in very specific ways.  Failing to do so will result in adoption being undermined, schedules lengthening and project costs spiralling. For example; one persona that IT teams encounter regularly is the Champion. This person is the c-level supporter of the new technology and is personally invested in its success. They will, of course, be vocal about the new technology, but need to see progress being made quickly, and want to be kept in the loop at all times. For this persona, IT teams must take care to alert the champion to the achievement of roll-out milestones and the delivery of benefits. The opposite of the Champion is the Laggard. These are time-poor employees, who view training as burdensome and aren’t convinced by the need for the new technology, no matter its purpose. However, motivated by company-wide success, they do care about initiatives that will help both their team and company perform better. With this persona, IT teams should regularly present evidence of the technology’s necessity, not only for themselves but also for the wider company or within the market. Alongside these are Detractors (resistant to change and vocally so), Dependents (eager and enthusiastic, but lacking in technical capability so require extra training), Passives (quiet and compliant, but slow adopters), and many others besides. Driving user adoption is not just a simple case of identifying personas and communicating with them appropriately. It’s about achieving a delicate balance amongst them all that – if dealt with poorly – can ultimately jeopardise user adoption. But get it right, and you’ll accelerate the deployment timescale and adoption rates – and your ROI. [easy-tweet tweet="We are no longer tethered to a physical place of work" hashtags="Cloud, Digital "] You need to be in the office to get anything done We are no longer tethered to a physical place of work or even a particular device. While the need for mobility exists and the argument is well-understood, the promise of mobile working is still just that for many – a promise. 
Businesses tend to find that despite having rolled out many different digital collaboration tools, workflows still get interrupted as the functionality available on a mobile device don’t match those of an office-based PC. This shouldn’t be the case. Approvals should not be held up by a key person travelling. Nor should the latest version of a vital document be difficult to track down. Instead, a well-designed collaboration initiative should create a seamless experience across any device, in any location. Updates made to a document, task or comment stream should all be synchronised across all of your devices so that important updates from teams are never missed. Effective collaboration is now rightly deemed essential to a business’ progress. But it is too often undermined by the excessive deployment of point solutions, lack of appreciation of individual characters in the roll-out and a failure to fully address mobility. The projects’ potential is, therefore, being left unfulfilled or lessened, and the reasons are frustratingly avoidable. ### RingCentral Announces New European Offering RingCentral Office extends presence to 13 European countries with in-country purchasing capability and Euro billing RingCentral, Inc. (NYSE: RNG), a leading provider of enterprise cloud communications and collaboration solutions, today announced the ability for customers to purchase RingCentral Office across 13 European countries. RingCentral’s in-country European expansion signifies a strong commitment to providing local presence and the ability to purchase RingCentral solutions in Euro currency in those new markets. The countries included in today’s announcement are Austria, Belgium, Denmark, France, Ireland, Italy, Luxembourg, Netherlands, Portugal, Norway, Spain, Sweden and Switzerland. RingCentral has expanded its global customer care services to provide local language support in French, Canadian French, German and Spanish. Also, RingCentral will extend its multilingual product offering to include Spanish, Latin American Spanish and Italian by July 2017. Current RingCentral Office languages supported include American English, UK English, Canadian French, International French and German. “The rollout of RingCentral Office in Europe is a major milestone in our global expansion plans,” said David Sipes, chief operating officer of RingCentral. “We’ve seen enormous traction with our Global Office product, which serves U.S., Canada, and UK-based enterprises with employees in various countries. With enterprises in Europe migrating to the cloud, we are excited to offer them in-country, fully localised communications solutions.” Building on its existing Global Office offering, RingCentral continues to make significant investments to provide enterprise-grade quality of service, compliance, security and reliability, with 17 data centres worldwide. The delivery infrastructure includes direct peering with more than 45 carriers and 200 ISPs around the world. [easy-tweet tweet="RingCentral also plans to open several sales offices." hashtags="RingCentral, Cloud "] Along with the localised product and support expansion, RingCentral also plans to open several sales offices based in Europe, to complement existing sales locations in the U.S., UK, Canada and Singapore. 
RingCentral is also committed to a major expansion of its channel partnerships across Europe and has already signed up some channel partners including CDW, and master agents such as AVANT, Intelisys Global, Ingram Micro, ScanSource, as well as Google Premier partners Devoteam, Cloud Technology Solutions, NetPremacy and NRX, among others. “With RingCentral Office now available in Europe, we have a tremendous opportunity to empower our enterprise customers with a single cloud solution that makes worldwide communications seamless,” said Olivier Chanoux, co-founder at Devoteam G-Cloud, a Google Premier partner and trusted advisor to European enterprises across 17 countries. “The enterprise workplace is evolving, and our customers are increasingly demanding cloud technologies to enable a more agile workforce. We selected RingCentral for its innovative, pure cloud approach, and we look forward to working with them on the enterprise segment.” For additional information, please visit www.ringcentral.co.uk. ### Universiade Competition Broadcast Worldwide with the Aid of InfiNet Wireless InfiNet Wireless helped broadcast the Universiade 2017 competition to TV channels all over the world Earlier this year, the 28th Universiade – the biggest international sporting event of its kind – was held in Almaty, Kazakhstan, attracting more than 2000 student-athletes from 57 countries and a worldwide audience of more than 1 billion viewers. The event was broadcast live from numerous sports facilities, including ice stadiums, ski jumps and ski slopes in and around Almaty. By using InfiLINK XG, the advanced high capacity InfiNet Wireless solution capable of reaching up to 500 Mbps, the national operator JSC Almatytranstelecom was able to reliably deliver high definition video streams both to local and foreign television broadcasters. [easy-tweet tweet="Reliable communication links in challenging urban environments" hashtags="Wireless, Broadcast"] The basic requirements for the selected solution were the ability to guarantee uninterrupted connectivity over a wide area, providing reliable communication links in challenging urban environments, as well as operating in extreme climatic conditions with very low temperatures, high humidity and intermittent snowfalls. The InfiLINK XG solutions, all fitted with high gain 28 dBi integrated antennas, were deployed both on mobile units as well as fixed locations dotted around the sports facilities, thus ensuring stable high-speed connectivity with minimal data loss for seamless transmission of TV signals. Genghis Khan Nysanbayev, the commercial director at the Almatytranstelecom branch, commented: "We tested a number of solutions offered by other manufacturers and decided to opt for the InfiNet solution because it has proven to be the most reliable and cost effective in the challenging topology we needed to deploy in. In addition, it is well suited to the harsh climate of our region as it delivers high-quality TV streams even when the wireless antennas and mast structures are iced over. InfiNet’s solution has allowed us to deliver live broadcasts with the highest possible quality of service for the organisers of the Universiade and the billion people that watched the various competitions throughout this event". Previously, InfiNet has provided high definition solutions to broadcasters covering the last three Summer and Winter Olympic Games, as well as the 2010 and 2014 FIFA World Cups in South Africa and Brazil, respectively. 
### Why The Cloud Will Be Essential In Defending Against The DDoS Attacks Of The Future DDoS attacks are about to get a whole lot worse thanks to the Internet of Things. But help may come from an unlikely source - the cloud. The Internet of Things promises to do some great stuff for both our professional and personal lives. But all that enrichment comes at a pretty hefty cost, at least for now, in its early days. I’m talking, of course, about security. Or rather, the lack thereof. Internet-connected devices are easy targets for hackers and ideal candidates for use in botnets. They often lack proper configuration, and many users fail to use strong login credentials. That isn’t terribly surprising if you stop to think about it. The majority of IoT manufacturers have middling expertise where cybersecurity is concerned. After all, it’s not something they’ve ever needed to think about. Sure, a company that makes kitchen appliances probably has an IT department, but how great are the chances that they’d bother to include administrators and security professionals in the manufacturing process? You should already know the answer to that - somewhere between ‘slim’ and ‘none.’ [easy-tweet tweet="Users still can’t be bothered to apply strong credentials to their devices" hashtags="IoT, Data"] That isn’t the only problem with the connected world, either. In spite of all the hacks, data breaches, and privacy leaks we’ve seen over the past several years, a startling number of users still can’t be bothered to apply strong credentials to their devices. Just take a look at the top ten passwords used to hijack IoT devices, and try not to cringe at the fact that there are things like ‘root,’ ‘123456,’ and ‘password.’ Someone looking to create a botnet, therefore, doesn’t even need to be particularly skilled at hacking. They just have to take the shotgun approach - slam default usernames and passwords into as many devices as they can find, and see which ones they can recruit. They’re more or less guaranteed to pick up at least a few. The result of this mess? Some of the largest botnets we’ve ever seen. And it’s only going to get worse from here - at least until someone steps up and holds manufacturers accountable for their security flubs. Of course, you can already guess the problem with that. The consumer market doesn’t care about security. They care about whether or not their devices are easy to use. And that, in turn, means that IoT vendors and manufacturers have little to no incentive to harden their devices. Even on the rare occasion that a hardware or software vendor is held liable for a breach, the regulatory fine amounts to little more than a slap on the wrist. It would be like causing a car accident and only being penalised with a $50 fine. Until we find a way to incentivise security amongst vendors and ensure consumers don’t bumble along with default usernames and passwords, the Internet of Things will continue to represent a major security risk, even as it transforms how we work and live. [easy-tweet tweet="IoT isn’t going anywhere anytime soon." hashtags="IoT, Cloud "] Sadly, that means that in the interim, all you can do about any of this is shore up your defences and hope you can survive whatever botnet happens to be pointed your way because IoT isn’t going anywhere anytime soon. And as you may have surmised, traditional DDoS protection may not be enough to weather the storm - at least, not of the sort anyone save for a dedicated host can afford. In this, the cloud may be the answer. 
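To make that concrete, here is a minimal, illustrative sketch of one building block a cloud edge layer can apply at scale: throttling each traffic source independently so a flood from hijacked devices is absorbed without starving legitimate users. The bucket size and refill rate below are arbitrary assumptions for illustration only, not any provider's actual defaults.

```python
# Illustrative sketch only: a per-source token-bucket limiter of the kind a
# cloud edge layer might apply while absorbing a traffic flood. CAPACITY and
# REFILL_RATE are invented values, not any vendor's defaults.
import time
from collections import defaultdict

CAPACITY = 20        # maximum burst of requests allowed per source
REFILL_RATE = 5.0    # tokens added per second, per source

buckets = defaultdict(lambda: {"tokens": CAPACITY, "last": time.monotonic()})

def allow_request(source_ip: str) -> bool:
    """Return True if the request should be served, False if throttled."""
    bucket = buckets[source_ip]
    now = time.monotonic()
    # Refill tokens in proportion to elapsed time, capped at CAPACITY.
    bucket["tokens"] = min(CAPACITY, bucket["tokens"] + (now - bucket["last"]) * REFILL_RATE)
    bucket["last"] = now
    if bucket["tokens"] >= 1:
        bucket["tokens"] -= 1
        return True
    return False

# Example: one chatty source is throttled while other sources are unaffected.
if __name__ == "__main__":
    served = sum(allow_request("203.0.113.7") for _ in range(100))
    print(f"Served {served} of 100 back-to-back requests from one source")
```

The point of the sketch is simply that this kind of bookkeeping is cheap per node but only effective when it runs on infrastructure big enough to absorb the flood in the first place, which is where the cloud's scale comes in.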
“It looks like 2017 will see [the scale of botnets] increase even more rapidly with the abundance of insecure IoT devices and the fact that large-scale attacks have become simpler to execute,” Duncan Stewart, Deloitte's Director of TMT Research, told Gigaom. “The consequence may be that CDNs and local mitigations may not be able to scale readily to mitigate the impact of concurrent large-scale attacks, requiring a new approach to tackling DDoS attacks.” Many of the features that allow the cloud to offer competitive advantages over traditional hosting services also make it ideally suited for weathering DDoS attacks. Failover and reliability. On-demand scaling. Distributed networking. It seems like it’s perfect, right? Almost. See, the market...isn’t quite there yet. DDoS-protection-as-a-service is an excellent idea, but it’s also one that’s still waiting in the wings. The solution may well be to bake DDoS protection into existing cloud platforms. By providing cloud-based mitigation, providers can shore up their platforms against potential attacks. That just leaves non-cloud servers and services - which could all benefit from a bit of cloud scaling, anyway. ### The Weakest Link in the Cyber-Security Battle: The Human Factor Much of the attention is typically focused on building a moat and a drawbridge to protect the castle against a hostile outside world. However, this approach neglects the fact that the crown jewels (namely data these days) are also exposed to a range of internal threats. Humans are the greatest asset of any given organisation but also the weakest link within, often unaware of how their behaviour provides ample opportunities for intruders to infiltrate. A breach can not only cost millions of dollars in short-term damages and legal fees but also severely harm reputation and brand recognition in the long run. The single biggest threat: wrong user behaviour In a recent survey, as many as 73% of organisations reported experiencing an internal security incident, costing small and medium businesses on average up to US$40,000 per event, and large enterprises more than US$1.3 million. At 42%, confidential data loss by employees was cited as the largest single root cause. Another 28% reported cases of accidental data leaks, and 14% intentional leaks of valuable company data. 19% confirmed that they lost a mobile device containing corporate data at least once a year. [easy-tweet tweet="90% of organisations encounter at least one insider threat each month." hashtags="IT, Security "] Users don’t necessarily have bad intentions. The vast majority operate in good faith without even realising that they are exposing themselves as well as their organisation to cyber threats. The Skyhigh Report finds that 90% of organisations encounter at least one insider threat each month, with the average organisation even experiencing 9.3 insider threats monthly. In essence, user behaviour can be categorised into malicious activities, negligence or accidents. However, no matter what the underlying rationale ultimately was and how these incidents are being clustered in retrospect, organisations are well advised to do their utmost to prevent them from occurring in the first place. The world is changing rapidly: The impact of digitisation The whole notion of digitisation, a sharing economy that explicitly encourages new ways of working, including themes such as “bring your own device” (BYOD), forces IT leaders and the C-suite alike to rethink their security agenda. 
There seems to be a gap between share of mindset (embracing new ways of working) and share of wallet (adjusting investments to keep up with the change). The digital revolution is all but unstoppable. It’s a runaway train smashing its way through society. Only 11% of organisations prohibit BYOD altogether, and even this attempt is unlikely to be sustainable for long in today’s reality. Young talent, in particular, wants to use cutting-edge technology. Employee satisfaction matters and people strive for more freedom, control, and flexibility. Everyone uses mobile devices and the number of connected IoT devices (smart watches, wristbands etc.) is projected to reach billions over the next couple of years. Trying to ignore these facts by simply proclaiming a ban is doomed to fail. By the same token, Microsoft concluded as far back as 2012 that, irrespective of official policies, 67% of employees use personal devices at the workplace anyway. Assuming that dispatching company-owned devices instead would entirely solve the issue is a misconception. A report suggested that 60% of employees use company-issued mobile devices to work from home, on the road or for personal activities. However, 94% noted that they connect their mobile devices and laptops to unsecured Wi-Fi networks, thereby exposing corporate data through supposedly protected devices to various risks. Summing up Insiders have a far better understanding of the organisation than outsiders. In a digital world, the actions of employees can have severe consequences. To a very large extent, this has to do with inadvertent human error. Organisations must, therefore, take appropriate precautions and protect against threats from inside more than ever before. Finding the right balance between device preference, usability and IT security is a delicate trade-off. No matter whether these devices are personally owned or owned by the company, ensuring 360-degree protection of these devices and the contained data – at rest, in transmission and in use – remains an ongoing challenge that needs constant enhancements. Indeed, it’s quite the cat-and-mouse game. Besides implementing tools and enforcing policies, the most effective combat strategy starts with the first line of defence – education. According to PWC’s 20th CEO survey, 53% of CEOs are to “a large extent”, and 38% to “some extent”, concerned about cyber-attacks negatively impacting their business. The tide is slowly shifting as attention to the cyber threat increases in the boardroom. Creating awareness and a sense of urgency can be enormously powerful – especially when embraced from the top of the organisation across all levels. People love role models; hardly anything can be more effective than a championing CEO who leads by example. Or, as novelist Ken Kesey once put it: “You don't lead by pointing and telling people some place to go. You lead by going to that place and making a case.” ### Atos and Siemens expand their strategic relationship to provide cybersecurity solutions for the utilities, oil and gas industries in the US Atos, a global leader in digital services, and Siemens, a global engineering leader, announce today they have entered into a Memorandum of Understanding (MOU) and will leverage their portfolios to help customers establish an integrated first line of defence against cyber-attacks. 
Siemens and Atos work together in the area of cyber security for industrial companies, providing customers in the manufacturing and processing industries with comprehensive security services and products. The Atos and Siemens partnership in the U.S. is part of a global agreement around cyber security, including common go-to-market and shared research and development efforts to target Information Technology (IT) and Operational Technology (OT) security for any market. Atos, with its unique capabilities in the field of IT, including identity access management, real-time security analytics, next-generation cryptography and software-defined security architecture; and Siemens, with its deep domain know-how and solutions for OT cyber security, including security program design, security lifecycle management, plant security monitoring and incident response, are well-positioned to help companies with an integrated approach to protecting against, detecting, and correcting threats quickly and efficiently. [easy-tweet tweet="There is a corresponding need to boost cyber defences" hashtags="Strategy, Security"] As utility companies increasingly use software to become more efficient and reliable, there is a corresponding need to boost cyber defences – going beyond compliance regulations to secure operations. In oil and gas, digitalization brings a convergence of IT and OT connectivity that enables data to travel from the field to the control room to the enterprise network – underscoring the need for a unique set of solutions to address the crossover between IT and OT. A recent study from the independent Ponemon Institute shows that nearly 70 percent of U.S. oil and gas cyber managers said their operations have had at least one security compromise in the past year, resulting in the loss of confidential information or OT disruption – highlighting the need for the oil and gas industry to increase its cyber defences. “We are pleased to have the opportunity to expand the Siemens and Atos relationship as U.S. utilities, oil and gas industries are realising the extent of cyber security challenges when moving into a digitised and connected ecosystem,” said Michel-Alain Proch, Group Senior Executive Vice President and CEO North America, Atos. “With our combined end-to-end suite of solutions and innovative approaches to security analytics and better detection and response capabilities, customers will see tangible advantages in cost and risk reductions, as well as enhanced performance and flexibility gains.” “As the energy industry benefits from digital technologies and solutions, there is a need to guard against growing cyber threats. This new cooperation is part of our broad effort to deliver cyber security solutions to America’s energy sector. By bridging operational technology and information technology capabilities, we can strengthen our customers’ defences against costly and disruptive attacks,” said Judy Marks, CEO Siemens USA and Executive Vice President of New Equipment Solutions for Dresser-Rand. The Atos-Siemens alliance, founded in 2011, forms one of the largest strategic relationships ever between a global engineering company and a global IT provider. For more information, visit: www.atos.net ### Is 2017 the Year of the Hybrid Cloud? The enterprise will increasingly live in a hybrid IT world in 2017, split between on-premises solutions and cloud environments. 
Recent reports have revealed that many organisations have already started to split their cloud budgets between public and private deployments, creating demand for hybrid cloud strategies that give businesses greater flexibility as well as more workload deployment options. It’s easy to see why hybrid strategies are on the rise. Hybrid cloud enables workloads to exist on either a vendor-run public cloud or a customer-run private cloud. This means that IT teams are able to harness the security and control of a private cloud as well as the flexibility of public cloud services - thus, getting the best of both worlds. The most mature IT teams are reviewing their workloads – often in consultation with a cloud services provider – to determine which are most suited to a public or private cloud environment and adapting their workload placement accordingly across hybrid cloud options. Here are the three key benefits often associated with implementing a hybrid cloud solution: [easy-tweet tweet="IT has the option to burst workloads into the public cloud" hashtags="IT, Cloud"] Scalability Demand will wax and wane. Organisations’ requirements rarely run in a horizontal line and public cloud solutions are particularly valuable for dynamic or highly changeable workloads. When traffic surges, a hybrid environment allows for quick scalability in order to meet the needs of the moment. When the surge dies off, the cloud resources consumed can be scaled back to avoid over-provisioning and keep costs under control. With a hybrid cloud strategy, IT has the option to burst workloads into the public cloud when required to maintain performance during periods of increased demand. Cost Public cloud is fast and inexpensive to scale out and does not require the up-front investment in the infrastructure of private cloud. The great thing about a hybrid approach is that it brings long-term savings. It is no longer a case of trying to determine the maximum load, reserving what is needed for that maximum load and paying for it all whether it is used or not. A hybrid solution allows for allocation and reallocation to meet changing workload needs as and when required, which can have a significant impact on the IT bottom line. Security While the perception that the cloud is not as secure as on-premises infrastructure is a persistent one, there is increasing evidence that public cloud environments actually suffer from fewer attacks such as ransomware and viruses than traditional IT environments. Cloud service providers like iland now offer levels of security and compliance reporting that are very difficult for small and medium-sized enterprises to match in their own IT infrastructure. Despite this, there may still be reasons why certain apps - particularly those running on legacy systems - may need to stay on-premises and a hybrid cloud strategy enables that dual approach. [easy-tweet tweet="Organisations looking to adopt hybrid cloud as a means to increase agility" hashtags="Cloud, IT"] Despite all the opportunities and related benefits provided by a hybrid cloud strategy, there is a key challenge that businesses must overcome - having visibility into and control of their cloud workloads and resources. All too often, organisations looking to adopt hybrid cloud as a means to increase agility within their IT operation get stuck in the implementation stage. 
They are left fighting for control of their environment because the majority of cloud providers don’t offer the same level of visibility teams are accustomed to from their on-premise resources. They also struggle to maintain consistent network and security policies, meaning that migration becomes a far bigger headache than initially expected. As companies make cloud computing a more strategic part of their overall strategy, having that visibility and management control over areas such as performance, billing, security and compliance reporting, is more important than ever. At iland, we find that while customers continue to value and leverage our support teams, they increasingly want to take a more strategic and self-sufficient approach to cloud management in order to fully leverage all of its benefits. This means having the capability to perform data analytics, adjust resources, do DR testing on-demand, generate security reports and manage networking. And, increasingly, customers are using our APIs to link essential data about their cloud resources and workloads to their own IT systems, which is invaluable in managing both public and private on-premises cloud environments in a holistic way. As adoption of hybrid cloud increases, IT teams will need cloud service providers to provide the visibility and end-to-end cloud management tools that will help them approach and manage cloud in a more strategic way. The need for agility continues to be one of the biggest drivers of cloud adoption and with more complex hybrid cloud environments becoming the norm, management is key. You can find out more about how customers leverage the iland cloud management console here. ### 5 Hot Fintech Trends in 2017 The term “Fintech” (otherwise known as financial technology) is commonly used to describe a company that provides financial services through the use of modern technology. With the significant advances in technology, these companies are seen as having a revolutionary impact on financial services such as banking and investment management and are recognised as genuine competitors to the traditional banks and investment firms. There is an increasing demand from customers for flexible services for managing money that can be easily integrated with our increasingly digitised day-to-day lives. Let’s look at five Fintech trends organisations in the financial industry are introducing to gain a competitive edge. 24-hour accessibility In recent years, online banking has given us the capability to manage our money from the comfort of our home. It is now possible to do most banking activity online, from simple transactions to the more complicated tasks such as mortgage applications. Fewer of us will find ourselves walking into a bank, as banks adapt to consumers demanding services on their own terms, such as being able to manage their accounts on any device and having access to them wherever and whenever they want. Use of mobile devices Where the internet succeeded in moving customer relationships from bricks and mortar to the online sphere, smartphones have now taken banking on the move. [easy-tweet tweet="A digital revolution that is happening right now is the idea of the Internet of Things." hashtags="IoT, Fintech"] The rise of mobile payments is one of the biggest recent changes in how consumers manage their money. With the explosion of mobile apps and contactless technology, traditional methods of transferring money, such as cheques, have rapidly declined in recent times. 
The Internet of Things A digital revolution that is happening right now is the idea of the Internet of Things. As an increasing number of us use smart devices, we are becoming part of an intuitive connection of devices in our day to day lives. Significantly, this connectivity enables the collection and exchange of data. For financial services, such as banks, this access to data is a valuable source of information for competitive advantage. Being able to track and analyse the behaviours, wants and needs of their customers means banks will be able to provide us with more tailored offers and personalised experiences. In an era where it is simple to move banks, being more customer-centric in their services will be crucial for retention. Wearable Technology Wearables such as smartwatches have soared in popularity in recent years and are contributing to the Internet of Things concept. Alongside smartphones and mobile apps, as more innovative wearable technologies continue to be developed, never before have we been more connected to our money. Security In an era of the Internet of Things, digital security is critical for the protection of a customer’s details and data. To maintain trust with customers, there have been significant investments to protect against cyber threats, as financial services such as banks have been rolling out biometric services including fingerprint authentication and voice recognition technologies, both of which many believe will soon replace passwords. ### A 'Mean Blind Spot' is Leaving Companies Vulnerable to Cyber Attacks New research has identified a ‘mean blind spot’, which leaves organisations vulnerable to cyber attack – particularly in the months of April and October. A study by the University of Portsmouth found the length of recovery time between cyber attacks can leave organisations susceptible to further attacks. This ‘mean blind spot’ is the average interval between the recovery from an existing incident and the occurrence of a new incident. [easy-tweet tweet="Cyber attacks and data breaches are becoming more frequent" hashtags="Security, Data"] Dr Benjamin Aziz, a senior lecturer at the School of Computing, conducted the research using a community dataset of cyber incidents known as VERIS. The data is collected from a wide range of industries and different types and sizes of organisations. He said: “Cyber attacks and data breaches are becoming more and more frequent and most companies will have plans for a counterattack in place. “However, the problem arises when you look into organisations’ recovery times. If a company takes a month to recover from a cyber attack, but the next incident is a week away, there is a real risk that the subsequent attack can’t be tackled because recovery resources will have been deployed to handle the first attack. “When you layer recovery times on top of each other there is a blind spot, where your resources are depleted and recovery time is slow. This is when companies are in danger of leaving themselves open to multiple attacks.” In his analysis of VERIS data, Dr Aziz also found that organisations are least prepared to tackle security incidents in the months of April and October. He said: “This finding is surprising because you’d expect August and December to be the months that companies are unprepared when staff are most likely to be on holiday. My analysis found that in April and October it took days for companies to recover from an attack, rather than hours. 
“This could be due to a peak in attacks during those months or due to internal reasons, but I’d need to do further analysis to drill down into the details.” Dr Aziz hopes his research gives organisations insight into the resilience of their IT infrastructure, the recovery cost of internet attacks and the future cost to defend against them. He said: “I hope the findings will help minimise the threat of cyber-attacks in an increasingly digital world. Lots of businesses are prepared to combat one attack, but now they need to prepare for multiple attacks. “Although our new metric does not identify the cause of an attack or suggest a solution, we hope it can serve as objective evidence for IT managers to argue for more organisational support or resources to secure their infrastructure, so they are well prepared to combat numerous attacks.” ### Why it's OK to BYOK For the first time ever, we are seeing that more enterprises are moving more workloads to the cloud than to their own data centres. With these new applications, they are also sending sensitive data to the cloud, and, in some cases, ignoring the risk of that data being compromised. In fact, today, nearly two-thirds of enterprises are using SaaS applications, while the percentage of workloads deployed to the cloud is expected to rise from 41 percent to 60 percent in the next two years. And it’s not hard to understand why; cloud services eliminate the need for internal infrastructure, maintenance and support, improve productivity and ultimately reduce costs. However, where there are rewards, there are also risks. As information moves to the cloud, data may be under an organisation’s logical control but it will physically reside in infrastructure owned and managed by another entity. And this is cause for concern for many within the business community. In addition to an overwhelming fear that their cloud data could suffer a security breach, the majority of organisations in our most recent Data Threat Report revealed that they were concerned over shared infrastructure vulnerabilities and also a lack of control over where data is processed and stored in the cloud. Curbing the fears Of course, cloud and SaaS providers can use encryption to protect an organisation’s sensitive data stored in the cloud. And, at a point when data breaches are at an all-time high, such security measures have never been more important. If we are to learn anything from the various cyber-attacks on high profile organisations over the past few years, it’s that hackers are after one thing: data. Protecting it using encryption is critical to ensure that any data is rendered useless to anyone other than its owner. However, this is just one part of the puzzle. Encryption alone is not enough - access control and key management can also prove to be weak points in a provider’s defences. So, to reap the benefits of cloud computing confidently, an organisation must have full control over its data – and the keys that protect it. Keys represent trust, and their secrecy and integrity determine whether that trust can be relied upon. [easy-tweet tweet="“bring your own encryption key” identified as the most popular way to secure data in the cloud" hashtags="BYOK, Cloud"] Keeper of the keys To acknowledge this trust balance, we have seen rising popularity in businesses managing and controlling their own encryption keys. 
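Before looking at how widespread this approach has become, here is a deliberately simple sketch of what holding your own key means in practice: data is encrypted on your side, with a key you control, so whatever the cloud provider stores is useless without it. The example uses the open-source Python cryptography package purely for illustration; in a real BYOK deployment the key would be generated and guarded by an on-premise HSM rather than held in memory like this.

```python
# Minimal illustration of customer-held keys: data is encrypted locally with a
# key the organisation controls, so the ciphertext handed to a cloud provider
# is useless without it. In a real BYOK deployment the key would live in an
# on-premise HSM, not in a Python variable - this is a sketch only.
from cryptography.fernet import Fernet  # pip install cryptography

# The customer generates and retains the key; only ciphertext leaves the premises.
customer_key = Fernet.generate_key()
cipher = Fernet(customer_key)

ciphertext = cipher.encrypt(b"quarterly results - commercially sensitive")
# ...the ciphertext can now be stored with any cloud or SaaS provider...

plaintext = cipher.decrypt(ciphertext)
assert plaintext == b"quarterly results - commercially sensitive"
```

The design point is the separation of duties: the provider holds and serves the data, while the customer alone holds the key that gives that data meaning.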
In fact, the “bring your own encryption key” (BYOK) concept was identified as the most popular way to secure data in the cloud, according to our 2017 Data Threat Report. What’s more, when it comes to the security techniques and solutions organisations are planning to implement next year, encryption with BYOK topped the list. Clearly, this is a trend not to be taken for granted. As such, almost all the major cloud providers, from Microsoft to AWS, now offer BYOK capabilities. Google, for example, has enabled a BYOK strategy for its Cloud customers to generate, protect and supply their encryption keys to the cloud using an on-premise hardware security module (HSM), allowing them to securely move more workloads to the cloud. Furthermore, while most organisations want to take advantage of the public cloud, many are faced with resistance when migrating workloads containing sensitive data due to strict security standards that support internal policies or regulatory compliance. These requirements, however, are often overcome when encryption and customer-controlled keys can prove to auditors that the enterprise is the custodian of the encryption keys and, consequently, the data they protect. Taking control While only 35 percent of global organisations either currently implement or have plans to implement BYOK encryption as part of their overall data security strategy, we can be certain that, as more businesses move sensitive data to the cloud against a backdrop of an ever more precarious threat landscape, this number will only increase. So whether your organisation is adopting a public, private or hybrid strategy, ensuring confidentiality and security of data has to be at the top of your agenda. We only have to recall the numerous data breaches on high-profile organisations – from Yahoo to Ashley Madison – to understand the devastating impact of valuable data being hacked and leaked online. As more organisations focus on moving their sensitive data and applications to the public cloud, now is the time to take control, hold the keys to your castle and protect your data - even when it is physically out of reach. ### Barracuda Offers Backup Customers and MSPs Faster Restores with LiveBoot 2.0 New Barracuda Backup Updates Minimize Downtime in VMware vSphere and Microsoft Hyper-V Environments Highlights: Barracuda enhances data protection capabilities with the new Barracuda Backup LiveBoot 2.0, which expands Cloud LiveBoot to include the ability to rapidly spin up Microsoft Hyper-V environments in the Barracuda Cloud, and reduces boot times to minimise downtime in VMware vSphere and Hyper-V environments. Barracuda Backup LiveBoot 2.0 updates provide faster and easier recovery for on-premises VMware environments in cases where primary storage is lost or no longer available and increased support for larger virtual machines. Barracuda Backup LiveBoot 2.0 also includes a new VM Preview capability that gives customers and MSPs the ability to see a screenshot of a VM so they know if/when it is ready for use, as well as user interface improvements that allow for pre-configuration and bulk management. Barracuda Networks, Inc. (NYSE: CUDA) today announced Barracuda Backup LiveBoot 2.0, which includes Cloud LiveBoot for Microsoft Hyper-V and gives customers faster restores to minimise downtime in VMware vSphere and Hyper-V environments. Barracuda LiveBoot 2.0 offers a backup and recovery option for larger virtual machines and reduces boot times. 
[easy-tweet tweet="In today’s landscape, there are a lot of potential obstacles" hashtags="Backup, Storage"] “The whole point of deploying a backup strategy is to make sure businesses are covered when things go wrong, and in today’s landscape, there are a lot of potential obstacles,” said Rod Mathews, senior vice president and general manager, data protection at Barracuda. “We’ve seen everything from natural disasters to ransomware attacks or employees accidentally deleting critical files that can wreak havoc on an organisation’s recovery strategy. Barracuda Backup LiveBoot 2.0 provides customers with a powerful and affordable layer of protection for recovering lost data.” Faster Restores, Reduced Downtime Network Computing1 recently reported that IT downtime costs enterprises $700 billion a year, and according to a study by Everbridge2, the average cost of IT downtime is $8,662 per minute. These statistics underpin the critical need to limit downtime in order to conserve resources, save money, and keep business moving. Using Barracuda’s LiveBoot and Cloud LiveBoot, customers can quickly and easily be back online and be running while additional fixes are being implemented. Using Barracuda Backup LiveBoot 2.0 Barracuda Backup and Barracuda Backup – Intronis MSP Edition includes LiveBoot 2.0, which gives organisations protecting VMware vSphere and Microsoft Hyper-V environments the option to leverage their Barracuda Backup appliance or the Barracuda Cloud if their own virtual environment becomes unavailable. Options include: LiveBoot (for VMware) – allows customers and MSP partners to recover from primary storage failures by using Barracuda Backup as a storage source for VMware hypervisors, quickly booting the protected virtual machine directly from the local backup. Cloud LiveBoot (for Hyper-V and VMware) – allows customers and MSP partners to boot VMs directly from backups stored in the Barracuda Cloud, and assign private or public addresses to devices for remote access. [easy-tweet tweet="LiveBoot 2.0 provides rapid recovery for virtual environments" hashtags="Storage "] Continued Increase in Value Barracuda Backup LiveBoot 2.0 provides rapid recovery for virtual environments built into Barracuda Backup, without additional costs or complexity to customers with an active subscription. This new functionality continues to provide value to new and current customers, much like the platform refresh that was announced just last month, which included increased storage capacities, meaning a lower cost per terabyte with more room for data growth. The refresh also included performance boosts that enable faster backup, restore, and replication speeds — also at no additional cost to customers with active subscriptions. Pricing and Availability Barracuda Backup LiveBoot 2.0 is available at no additional cost to active Barracuda Backup and Barracuda Backup – Intronis MSP Edition subscribers. For more information, please visit cuda.co/backup. Resources: Blog: Barracuda LiveBoot 2.0 – cuda.co/18368 Blog: LiveBoot 2.0 for MSPs – cuda.co/mspliveboot Case Study: Hayward Tyler Group - cuda.co/cshtyler Hayward Tyler Group was hit with a ransomware attack. By leveraging Barracuda Backup, the company was able to identify all encrypted files, delete them, and restore previous uninfected versions of the files from the most recent backup created, all within an hour of discovering the ransomware attack. 
Video: Understanding Ransomware - cuda.co/vm3926 [1] IHS Markit Survey, “The Cost of Server, Application, and Network Downtime: North American Enterprise Survey and Calculator” - cuda.co/ihs700b [2] Everbridge, “The 2017 State of IT Incident Management”, 2016 - cuda.co/italt1 About Barracuda Networks, Inc. Barracuda (NYSE: CUDA) simplifies IT with cloud-enabled solutions that empower customers to protect their networks, applications, and data, regardless of where they reside. These powerful, easy-to-use and affordable solutions are trusted by more than 150,000 organisations worldwide and are delivered in appliance, virtual appliance, cloud and hybrid deployment configurations. Barracuda's customer-centric business model focuses on delivering high-value, subscription-based IT solutions that provide end-to-end network and data protection. For additional information, please visit barracuda.com. Barracuda Networks, Barracuda, and the Barracuda Networks logo are registered trademarks or trademarks of Barracuda Networks, Inc. in the U.S. and other countries. ### What does Blockchain do? Blockchain works best when it’s used across business networks, with multiple participants, where it’s important to establish trust between the parties. The sweet spot for early adopters has been financial markets and banking, where we see Blockchain used for share trading, currency exchange and international payments. The benefits for these projects are around speed, given the near-instantaneous nature of transaction consensus compared to end-of-day settlement, and the cost reduction of not having multiple independent databases in each organisation, all talking to each other. In the physical world we’re seeing supply chains adopting Blockchain to track the movement and ownership of goods - often using Internet of Things sensors or other location events to update the status of assets in real time. Increasingly, we’re seeing decentralised markets, such as the energy cooperative in New York which is connecting micro-generators with consumers without the need for a central utility company. This type of direct provider-consumer network has the potential to be highly disruptive in many industries. [easy-tweet tweet="It feels like we’re back where the world wide web was in the early 1990s." hashtags="Blockchain, Social, Technology, Networks"] At the Hyperledger meetup in London in February 2017, Cognition Foundry talked about a great use case - encouraging the public in poorer countries like Indonesia and the Philippines to collect and recycle plastic washed up on the beaches, which is then converted to digital payments to avoid the local corruption which is endemic in those countries. Blockchain will really start to make a difference when we start to see more of these types of use case - environmental, social, and industry-disrupting. It feels like we’re back where the world wide web was in the early 1990s, when everyone focused on online encyclopaedias and retail catalogues. At that time, very few people foresaw e-commerce, digital banking, streaming music, video on demand and social media, yet the web made all of that happen. We know that we’re only scratching the surface at the moment. What is certain is that as more organisations adopt the technology, we will continue to see new and interesting uses for Blockchain. 
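For readers wondering what the underlying mechanism actually looks like, the sketch below is a deliberately minimal illustration of the hash-linking that makes a shared ledger tamper-evident: each block records the hash of the block before it, so rewriting any historical entry breaks every link that follows. Real business networks layer peer-to-peer replication and a consensus protocol on top of this simple structure; the transactions shown are invented for illustration.

```python
# A deliberately minimal sketch of why a blockchain is tamper-evident: each
# block carries the hash of its predecessor, so altering any historical record
# breaks every hash that follows. Real networks add replication and consensus
# on top; this only shows the chaining itself.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transaction: str) -> None:
    chain.append({"tx": transaction, "prev": block_hash(chain[-1]) if chain else "0" * 64})

def is_valid(chain: list) -> bool:
    return all(chain[i]["prev"] == block_hash(chain[i - 1]) for i in range(1, len(chain)))

ledger: list = []
for tx in ["A ships 100 units to B", "B pays A", "B ships 40 units to C"]:
    append_block(ledger, tx)

print(is_valid(ledger))            # True
ledger[1]["tx"] = "B pays A half"  # tamper with history
print(is_valid(ledger))            # False - the chain no longer verifies
```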
### Cloud Phone Systems Bring Power and Simplicity to Office Communications Despite the positive headlines that cloud-based business processes and apps generate, there is a darker side to our economy’s shift to online and hosted IT services over recent years; one still being played out in offices and workspaces. The ‘noughties’ saw large and small businesses piling in to buy VoIP systems. These, we were told, would bring about more flexible and cost-effective working for UK plc; especially in smaller firms - the lifeblood of our economy. Sadly, many UK companies never quite got them right. Despite all the promises, users remain stuck with ageing VoIP systems and their inflexible call strategies, leading to agonising waits and repetitive conversations with customer agents (yes, we’ve all been there!) Somewhere in the hype surrounding ‘business process digitisation’, the voice of the small business owner or harassed office manager, asking for a balance between automation of business processes and user control, was ignored. As a result, many small firms must soldier on with clunky user interfaces, inflexible management of call workloads, and rigid vendor cost plans, all of which have contributed towards a failure to meet expectations. But now, new feature-rich cloud telephony platforms, using open source’s versatility and backed by expert technical support, are supplanting old VoIP systems and delivering a second wave of low-cost and flexible telephony and contact centres - delivering connected processes and agile business capabilities for SMEs. [easy-tweet tweet="These cloud telephony systems balance power with ease of use." hashtags="Cloud, IT"] Developers of cloud-based phone systems are ‘designing in’ features such as automation of workflows, aggregation of voice and social media channels, and integration with existing CRM applications. These small but high-functionality telephony platforms, with simple and enjoyable interfaces, can be operated across multiple call centre locations and customer touch points. It’s the simplicity and power of the Apple iPhone, replicated for the communication and collaboration needs of the SME. These cloud telephony systems balance power with ease of use. When most surveys show consumers want voice communication more than other channels, the rise of flexible cloud telephony is timely news for smaller organisations who, above all others, hang their reputation on a personal service and flexible response. And the operational and cost benefits for SMBs and hard-pressed public organisations are clear. In one case, a small district council, facing capital budget constraints, switched to cloud telephony and is making £30,000 in savings every year over its old on-premise system. [easy-tweet tweet="Private and public organisations are using cloud telephony systems." hashtags="Cloud, IT"] We have helped small firms set up call centres to tie together different CRM databases – agents want customer data in one place rather than having to glance across multiple screens. And we know of private and public organisations that are using cloud telephony systems to spin up new offices overnight in response to growth plans or changing customer needs. In the earlier Internet era, communications providers used to offer customers bespoke phone system options, triggering lengthy integration work and support. Today, we still use these technologies to build powerful, feature-rich platforms but they are far simpler for customers to set up and operate. 
We build in those intelligent features from the outset, using software that enables simple integration with key applications, and provide management tools that give control to the user rather than to a third-party specialist. Leading cloud telephony providers have the skills to build powerful and intelligent phone systems that can be integrated with anything. But complexity is easy, simplicity is difficult, and that’s what customers increasingly recognise. ### Alation and Trifacta Deliver Best-in-Class, Integrated Data Cataloging and Data Wrangling Solution Alation, the collaborative data company, and Trifacta, the global leader in data wrangling, today announced the extension of their partnership to include the joint delivery of an integrated solution for self-service data discovery and preparation – required elements of modern analytics platforms. The unified solution leverages deep integration between the two products to drive a seamless experience for the data analyst, no matter which application they start from. Analysts can start to work with data from either the Alation Data Catalog or Trifacta Wrangler Enterprise. No matter where they begin their experience, whether it starts with data discovery or with data preparation, users can access data catalogue and data wrangling features within a single interface. The integration tames self-service analytic chaos by providing analysts with a choice of where to start the often iterative process of preparing data for self-service analysis. [easy-tweet tweet="Analysts need the tools that allow them to work productively" hashtags="Data, IT "] “Organisations are embracing self-service analytics but struggle with the distributed nature of self-service analytic projects. Analysts need the tools that allow them to work productively and in a more collaborative manner with data experts – from finding, understanding and trusting their data, to preparing that data for analysis,” said Satyen Sangani, CEO, Alation. “Our partnership with Trifacta enables analysts to accomplish all of that within a single solution across databases and Hadoop – making their work much more efficient.” “As the new analytics stack comes into focus, it is critical that best-of-breed products seamlessly work together. To do so, organisations must openly share metadata across applications and users to ensure they can effectively utilise and govern their information,” said Adam Wilson, CEO, Trifacta. “Through our partnership with Alation, we are providing analysts with a seamlessly integrated solution for data wrangling and cataloguing that meets the requirements of modern self-service analytics.” [easy-tweet tweet="A productive self-service analytics organisation requires more than business intelligence" hashtags="Data, Analytics"] Many enterprises already understand that a productive self-service analytics organisation requires more than business intelligence (BI) and analytics tools, including joint Alation and Trifacta customers eBay, MarketShare and Munich Re. Analysts must have a deep and wide understanding of the complete lineage of data. They must be able to discover, wrangle and trust their data to achieve the benefits of self-service analytics. “At Munich Re, our data strategy is geared to offer new and better risk-related services to our customers. A core piece in that strategy is our integrated self-service data analytics platform. 
Alation’s social catalogue is part of that platform and already helps more than 600 users in the group to discover data easily and to share knowledge with each other. With the introduction of Trifacta's powerful data wrangling solution to the platform and the tight integration of both tools we expect time to insight to go down significantly,” said Wolfgang Hauner, chief data officer, Munich Re. Alation and Trifacta have committed to a joint development roadmap through the end of 2017. The initial integration is scheduled for customer general availability in the second quarter of 2017.  About Alation Alation is the first data catalogue built for collaboration. With Alation, analysts are empowered to search, query and collaborate to achieve faster, more accurate insights. Alation automatically captures the rich context of enterprise data, including what the data describes, who has used it, and the relationships between the data, analysis and insights. Alation's catalogue is generated and updated using automated intelligence and improved through human curation by analysts, stewards, experts, and business users. Alation is funded by Andreessen Horowitz, Bloomberg Beta, Costanoa Venture Capital, Data Collective and General Catalyst Partners. Customers include eBay, Albertsons, Square, and some of the world's largest insurance and pharmaceutical firms. For more information, visit alation.com. ### Top Benefits of Using a VPS for Your Business When you are trying to create a successful business, it is important that every facet performs at the highest level possible, and this includes the web servers. A slow server can turn off customers, and in the end, it could lead to a loss of revenue. It is for this reason that many businesses are choosing to use virtual private servers (VPS) for their hosting needs. Increased Performance Unlike shared hosting plans, where there are dozens, if not, hundreds of other users sharing the same server as your business, a VPS will ensure that your enterprise remains independent of other users. This is important because websites that share a plan can be affected by one another, and this can cause functionality problems. For instance, if a website sharing the same server as your business is receiving high levels of traffic, it can slow all of the sites on the server. By using a VPS, the functioning of your website will never be determined by other sites. Greater Control A common issue that can occur if you do not use a VPS is a lack of access to your root environment. Without root access, you must rely on software packages that the hosting provider supports. Unsupported software can raise security concerns for hosting providers, and on a shared server, the software that is available will be dictated by the host. If your business is looking to use industry specific software that is not supported by the host, then this could cause issues and may prevent the use of the software entirely. By using a VPS, your business will be able to freely implement any necessary software changes within the server. [easy-tweet tweet="VPS makes it easy to scale up operations without interfering with the server." hashtags="VPS, Servers, Website"] Scalability If you own a small business and do not have plans of expanding, then you may be able to predict the amount of traffic your website will receive. However, if you are looking to grow your business, then chances are that traffic levels will increase as the customer base expands, and you will need a server to accommodate this. 
Unlike a shared server, which can be limiting with regard to scalability, a VPS makes it easy to scale up operations without interfering with the functioning of the server. In fact, scaling up can be as simple as upgrading your hosting plan, and this typically does not require any downtime, so your site will continue to operate without interruption. Low Cost It used to be that businesses would typically opt for shared hosting plans because virtual private servers were too expensive to use. However, as internet technologies have advanced, the cost of web hosting has significantly decreased, making a VPS an option for nearly any business. Many providers have VPS hosting plans for under $10 per month, which is hardly more than the rate of most shared plans. Customer Service When problems are encountered with a hosting service, it can make for a stressful situation. After all, the livelihood of your business could depend on these servers. If you use a VPS then there will be a dedicated customer service representative who will be able to help you resolve any issues, and also offer recommendations that will meet your business needs. In Conclusion With internet technologies continually improving, and more and more businesses utilising the power of the internet to increase their exposure, a virtual private server becomes a necessity. While there are other hosting plans available, such as a shared plan, these can negatively affect the functioning of the website if traffic increases. A VPS will allow for more flexibility and easier scalability as you expand your business. In addition, the low cost of using a VPS makes this option affordable for nearly any enterprise. If you are in need of a hosting service for your business, you will want to consider a virtual private server. ### HYPERGRID Builds on Exceptional Momentum Into FY17 HYPERGRID builds on exceptional momentum into FY17 to become a leader in Hybrid Cloud and DevOps HYPERGRID, the Hybrid Cloud-as-a-Service leader, announced exceptional growth in FY16 while driving accelerated growth into FY17. The year was highlighted by the acquisition of DCHQ, the launch of its new platform, HyperCloud, and the introduction of a new consumption-based business model. This new business model, along with a significant expansion in its customer base and global partner network focused on multiple verticals, contributed to over 100% growth in revenue in FY2016. HYPERGRID also added some senior leaders to its management team, positioning itself well for accelerated growth in the future. “FY16 was a monumental year for us,” said Nariman Teymourian, CEO of HYPERGRID. “We exceeded our revenue goals and raised our external profile significantly. HyperCloud, our hybrid cloud platform, offers application, platform and infrastructure services in a unique pay-as-you-go consumption model, which fundamentally changes how customers consume IT.” [easy-tweet tweet="HYPERGRID’s mission is to enable business innovation" hashtags="Cloud, IT "] Since launching in July 2016, HYPERGRID has been focused on its mission to bring IT consumption services to help CIOs and IT leaders focus on creating business value rather than on cost and service management. HYPERGRID’s mission is to enable business innovation and to provide flexibility, simplicity and agility for business users while maintaining security, control and governance over the delivery of IT services. 
“The ability of enterprise IT to enable innovation at a rapid pace and scale is directly tied to the business need for technology-based market differentiation. We are working to establish HYPERGRID as a true enabler and innovator as we define a new category in IT,” said Jim Ensell, CMO of HYPERGRID. Progressing into FY17 HYPERGRID will also build on its engagement with Microsoft, announced in FY16, as well as launch the next version of HyperCloud. About HYPERGRID HYPERGRID is the Hybrid Cloud-as-a-Service leader that simplifies IT for the enterprise by providing consumption based, full stack cloud services on premises. HYPERGRID’S offering, HyperCloud, uniquely addresses enterprise IT needs by combining IaaS, PaaS, and applications services on premises with security and governance for application management. For DevOps, it delivers lifecycle management and push-button VM, bare metal, and container deployments. These capabilities, combined with multi-cloud management and orchestration, all delivered via a consumption model, offer a truly compelling value proposition. HYPERGRID is IT Simplified for the Business, delivering unmatched simplicity, scale, and economics that accelerates business innovation, growth, and long-term success. ### Zadara Storage to Expand Market Reach through Agreement with Tech Data Technology Solutions to offer Zadara’s Enterprise Storage-as-a-Service to growing Public and Private Cloud Users. Zadara Storage, the provider of enterprise-class Storage-as-a-Service(STaaS), today announced an agreement with Tech Data Corporation (Nasdaq: TECD), one of the world’s largest wholesale distributors of technology products, services and solutions. Technology Solutions, formerly a division of Avnet and now part of Tech Data, will make Zadara Storage solutions easily accessible to channel partners worldwide through its Avnet Cloud Marketplace. Zadara Storage uniquely offers users enterprise-grade storage-as-a-service in any location (cloud, on-premise or hybrid), supporting any data type (block, file and object) and connecting to any protocol (FC, iSCSI, iSER, NFS, CIFS, S3, Swift). Customers of AWS, Google Cloud Platform and Microsoft Azure can access Zadara Storage directly from their cloud virtual machines and enjoy Zadara’s enhanced privacy, performance and data protection features. [easy-tweet tweet="Public cloud SaaS market is growing at approximately 25% per year" hashtags="Cloud, Data"] According to the industry analyst firm IT Brand Pulse, the public cloud storage-as-a-service market is growing at approximately 25% per year, and by 2020 will capture 25% of a massive $50 billion in enterprise storage spending. The rapid transition from traditional CapEx storage to as-a-service solutions is due to the scalability, elasticity and economic benefits of OpEx-based storage-as-a-service. “At IT Brand Pulse, we believe that the rapid growth Zadara Storage is enjoying is due to their excellent product/market fit,” said Frank Berry, CEO IT Brand Pulse. “Users are tired of the traditional model of purchasing and managing storage, and Zadara Storage has a unique, world-class storage-as-a-service solution, providing enterprise-grade storage that is also fully managed.” The Avnet Cloud Marketplace enables Technology Solutions partners, including value-added resellers (VARs), independent software vendors (ISVs), managed service providers (MSPs) and system integrators (SIs), easy access to products like the Zadara Storage Cloud. 
With this announcement, Technology Solutions partners now have access to the award-winning portfolio of Zadara Storage block, file and object storage services. All Zadara Storage solutions are provided in a pure OpEx, pay-as-you-go model, provide resource isolation, and are fully managed – enabling customers to focus on managing their business, rather than managing their storage. “As enterprises worldwide continue the transition from purchasing CapEx storage to subscribing to OpEx-based storage services, we are pleased to add Zadara Storage to the Avnet Cloud Marketplace,” said Sergio Farache, senior vice president, strategic business units, Enterprise Solutions, Tech Data. “Corporate IT customers require flexible, agile storage services that align with their business, and Zadara Storage will enable our resellers to address this growing market demand.” [easy-tweet tweet="Zadara Storage Cloud is the foundation for companies" hashtags="Cloud, Data"] “We are proud to partner with Technology Solutions and help their partners participate in the dynamic shift from CapEx storage to OpEx storage services,” said Nelson Nahum, CEO and co-founder of Zadara Storage. “Our Zadara Storage Cloud is the foundation for companies from nearly every vertical market worldwide. Our new relationship with Tech Data will now expand our reach even further, by providing these solutions through the Avnet Cloud Marketplace.” About Zadara Storage Zadara® Storage offers enterprise Storage-as-a-Service (STaaS) through the award-winning Zadara Storage Cloud. It can be deployed at any location (cloud, on-premise or hybrid), supporting any data type (block, file and object) and connecting to any protocol (FC, iSCSI, iSER, NFS, CIFS, S3, Swift). The VPSA® Storage Array service provides enterprise SAN and NAS while the ZIOS™ service delivers private object storage. Zadara provides resource isolation, exceptional data security, and management control. Zadara is available via OPaaS (On-Premise-as-a-Service) and through a variety of partners including Amazon Web Services (AWS), Microsoft Azure, Google Cloud Platform and others. Learn more at www.zadarastorage.com, Zadara’s Blog, LinkedIn and Twitter. ### Will 2017 be the Year of IoT? If you attended Mobile World Congress (MWC) last month, you would have heard a lot of buzz about 5G and the Internet of Things (IoT), with many vendors like HPE, IBM, VMware and others talking about their role in 5G and IoT. As iland is a vCloud partner, I was particularly interested to hear about the VMware IoT theatre sessions which ran throughout the show, during which VMware talked about their IoT strategy and how it was helping organisations to harness IoT devices. As you would expect, there was a lot on show. There was everything from IoT-enabled safety jackets to keep people safe at sea or in the mountains to connected cars showing the latest innovations in vehicle-to-vehicle communications and autonomous driving. Sierra Wireless demonstrated a live LTE CAT-M network for smart water solutions while Huawei put the spotlight on its connected drone. Moving away from all the hype of Mobile World Congress, I believe that 2017 will be the year when the focus turns to real deployments and the monetisation of IoT. It is fair to say that in 2016 IoT was still in its infancy regarding revenue and deployments. Now, however, I think we will start to see some real live systems that increase productivity, improve customer satisfaction and develop new revenue streams.
[easy-tweet tweet="Regulation and standardisation will also come into focus more in 2017" hashtags="IoT, Cloud "] At MWC we heard that 5G will be the panacea for IoT. However, 5G is still a number of years from being realised in any meaningful way. In the meantime, telcos will have to deal with new IoT models using alternative technologies today. Telecom operators’ strategies and business models for generating revenues from IoT will continue to develop through 2017, and I think we will continue to see the battle between NB-IoT and LTE-M play out. Regulation and standardisation will also come into focus more in 2017. On the regulatory front, I have to admit we haven’t seen much come through yet but I expect to see more government interest as IoT becomes more pervasive in smart cities, the public sector and energy. Smart cities will lead the charge in IoT deployments. The awareness of what ‘smart city’ means is now beginning to capture the attention of residents. They value safety (smart lighting), convenience (transportation, parking) and the potential cost savings that cities can deliver (smart metres, on-demand waste pickup and so on).  That said, cities will continue to be strained by the need for money to support the deployment of sensors to gather data and the integration of city-wide systems. As business investment in cloud technologies continues, so IoT moves up the agenda. IoT data tends to be heterogeneous and stored across multiple systems. As such, the market is calling for analytical tools that seamlessly connect to and combine all those cloud-hosted data sources, enabling businesses to explore and visualise any data stored anywhere in order maximise the value of their IoT investment. Equally, organisations across the world will be deploying flexible business intelligence solutions that allow them to analyse data from multiple sources of varying formats. Joining incongruent IoT data into a single pane of glass, companies can have a more holistic view of the business, which means they can more easily identify problems and respond quickly. In other words, we need to extract the data from IoT devices then figure out what it all means. With the solution to the ‘last-mile’ of IoT, data organisations can then start to increase efficiencies and optimise their business offering. [easy-tweet tweet="Many of iland’s top customers are working in the IoT space." hashtags="IoT, Cloud "] While the possibilities for IoT to improve our world seem endless, concerns over security are very real. As we put more and more critical applications and functions into the realm of IoT, it increases the opportunity for breaches. In 2016, the market finally began to take security seriously, largely because of the increase in IoT hacks. Incidents such as the big denial-of-service attack in October and even the potential of a drone injecting a malicious virus via lights (from outside a building) caused significant concern throughout the industry. As a result, we saw some solid announcements such as the Industrial Internet Consortium releasing its security framework. With all the new vulnerable devices now being put into service, hackers will continue to exploit IoT systems. I think we can certainly expect to see more large scale breaches as hackers look for newly connected devices, especially in the energy and transportation areas. 
Many of iland’s top customers are working in the IoT space, and they are looking to iland to provide the most secure and top-of-the-line environments to host their applications in the cloud. Here at iland we take this very seriously, and we are constantly improving, upgrading and fortifying our platform to meet the growing needs of the IoT market. This means applying on-premise levels of IT security to cloud workloads – for example, two-factor authentication, role-based access control, encryption and vulnerability scanning, which together create a protective shield for the cloud, scanning all incoming and outgoing data for malicious code regardless of the device being used. Additionally, we have just introduced our Secure Cloud Services, designed to foster strategic, self-sufficient visibility and control of a cloud environment. I believe that 2017 will be the ‘Year of IoT’, and at iland we are focused on making sure that we provide the most secure and compliant environment for our customers so that they can take advantage of the opportunities that IoT presents. Making sure your IT systems are compliant, and having the reporting mechanisms in place so that you can prove your data is secure, will be a critical success factor in leveraging IoT devices. If you are interested in finding out more about the iland Secure Cloud™ or iland Secure Cloud Services, please visit our website. ### Artificial Intelligence and innovation in Fintech The last decade has seen unparalleled innovation in banking and financial services. At the heart of this wave of innovation has been the cloud, with organisations using it to offer a wide range of new services. The next driver of innovation will be Artificial Intelligence (AI) – so how will this be deployed to offer improved and innovative services? The amount of choice in banking and financial services has grown hugely over the past few years, both in terms of the variety of services customers can use and the type of company that provides them. Traditional banks have been joined by a whole host of fintech startups – agile and nimble entrants to the market – as well as businesses such as supermarkets which have begun to encroach on traditional financial services. [easy-tweet tweet="AI has developed to the extent that it can now crunch datasets much larger." hashtags="AI, Fintech"] This increase in services is just as applicable to businesses as it is to consumers. With banks reluctant to lend in the wake of the financial crisis, an era of fintech innovation began. While access to funding was the principal driver initially, it soon became clear that many other areas of finance and banking were ready for disruption. Now we are on the threshold of a new era of fintech innovation, this time fuelled by the rise of AI. AI has developed to the extent that it can now crunch datasets much larger, deeper and more diverse than ever before, all using methods derived from human intelligence, but at a scale that goes beyond it. Financial services providers – and in particular the fintech firms that can develop and utilise technology much more quickly and effectively than traditional banks – are now deploying AI to offer new and improved cloud-based solutions to their customers that will help them make genuinely helpful steps forward in daily life.
Examples include: Smarter decision-making – the ability to ask the right questions of machines, not humans, and the subsequent analysis of that data will mean smarter and more informed decisions for fintech providers on whether to lend, what the risk might be and much more besides. Automated financial assistants and customer support – there has already been a move towards automation in customer experience, but AI can really take this to the next stage. Text chat or chatbots can use data to deliver a highly personal and intuitive service, and this can be developed further in the form of financial assistants that help advise on and make financial decisions, such as whether to buy or sell stock. Predictive analytics – this works by using large volumes of data to discover patterns and insight, which in turn essentially tells a business what is likely to happen next, enabling that business to improve sales, recruitment and many other areas of business operations. For example, late payment is one of the biggest issues faced by small businesses in the UK, with the average time for an invoice to be paid standing at 72 days. Predictive analysis can look at credit performance and advise on whether a customer is likely to pay on time and what measures could be taken to address or prevent late payment. Those examples are really just the tip of the iceberg – the potential of AI is almost limitless. Data is power for a business, and AI is the most effective way of unlocking it. It will be particularly influential with smaller businesses, which lack the time and resources to manage data. Using AI techniques can have a real impact on a small business, delivering all manner of innovative insight and services, and levelling the playing field with bigger and better-resourced businesses. ### Kroll Ontrack Offers Ransomware Victims Alternative Solutions to Paying the Ransom Kroll Ontrack has identified over 225 different strains of ransomware and developed a set of solutions to restore data and eliminate payments in the event of an attack. Research suggests payments to ransomware criminals jumped to nearly $1 billion in 2016, with no end in sight as businesses and individuals continue to pay up. Ransomware is a type of malware that blocks access to data on a device or server by encrypting it. In working with enterprises affected by ransomware, Kroll Ontrack has identified over 225 unique strains, and its engineers have defined decryption processes for over 80 of those variants. While anyone with a computer or a connected device can be the target of ransomware, corporations are often hit the hardest. Not only is an infected company charged an exorbitant ransom to have its data returned, it also faces financial losses due to downtime. Those most at risk include healthcare organisations, financial institutions and government bodies. To mitigate the damage caused by ransomware, Kroll Ontrack has developed a set of solutions to quickly recover the ransomed data by other means, eliminating the need to pay the criminals behind the attacks, including: Software and tools to decrypt ransomed data. There are several methods used to decrypt different strains of ransomware – Kroll Ontrack has identified over 225 strains and defined decryption processes for over 80 of them. Knowledge and experience in data recovery to find unencrypted copies of ransomed data and restore or rebuild what is found.
If there are no decryption processes or software able to decrypt a ransomware variant, Kroll Ontrack uses its proprietary data recovery tools to search for unencrypted copies of the data.  [easy-tweet tweet="It is important to have a good backup and recovery plan" hashtags="Security "] Robin England, Senior Research & Development Engineer at Kroll Ontrack said: “At Kroll Ontrack we do not recommend paying the ransom. Many victims who pay their attackers never receive their data in return and can lose hundreds or even thousands of pounds. The best solution is to restore data from a backup. “Ransomware developers know this and in an effort to keep the money coming in, new ransomware variants are being developed that now target those backups. This is why it is important to have a good backup and recovery plan, be diligent in testing backups and educate users on what a potential ransomware attack can look like.” Those individuals and enterprises who are most at risk should take precautions to reduce their risk and lessen the effects of an attack. Below is a list of steps they can take: Never pay the ransom because attackers may not unlock your data. There are many cases of ransomware victims paying the ransom demanded and not receiving their data back in return. Rather than running this risk, companies should work with data recovery experts who may be able to regain access to data by reverse engineering the malware. Create and follow a backup and recovery plan. Ensure that a plan includes storing the backups offsite. Be prepared by testing backups regularly. Organisations must be familiar with what is stored in backup archives and ensure the most critical data is accessible should ransomware target backups. Implement security policies. Use the latest anti-virus and anti-malware software and monitor consistently to prevent infections. Develop IT policies that limit infections on other network resources. Companies should put safeguards in place, so if one device becomes infected with ransomware, it does not permeate throughout the network. Conduct user training, so all employees can spot a potential attack. Make sure employees are aware of best practices to avoid accidentally downloading ransomware or opening up the network to outsiders. ### LIVE: Cloud Expo Europe 2017 ABOUT For the second year Compare the Cloud's event coverage platform Disruptive Tech TV have partnered with Cloud Expo Europe 2017 to bring the mother of all live shows direct to the event. CloudExpo.TV will broadcast live across the two days from the ExCeL from a specially built studio covering the announcements, interviewing attendees, exhibitors and speakers on the latest tech trends. You can watch the broadcast live at the expo, online or on demand after each day. SCHEDULE Cloud Expo Europe Live Show 1: Wednesday 15th March 3.00pm – 4.30pm Show 2: Thursday 16th March 10.30am – 12.00pm Show 3: Thursday 16th March 2.00pm – 3.30pm Behind the Expo Live Show 1: Wednesday 15th March 1.30pm – 2.30pm Show 2: Thursday 16th March 12.30pm – 1.30pm MORE INFO For more info about guests and shows visit our official site for the Expo at www.cloudexpo.tv ### The Benefits of Having a ‘Cloud First’ Policy The uptake of the cloud is happening at a significant pace within modern organisations, completely changing the way that they do business. With that in mind, it seems logical that a ‘cloud first’ policy will become the norm at modern companies. 
This means that whenever they deploy new services or update existing services, they will at least consider moving them to the cloud, ultimately improving the entire IT environment. Here at my company, we have been operating along the lines of ‘cloud first’ for some years now. Our initial reasoning was that as an infrastructure vendor, we should arrive at the cloud early on so that we could optimise our services and products for the cloud accordingly. In our mind, it made sense to be ready in advance of when our customers started to utilise the cloud themselves. Clearly, not all organisations are in the same situation. But adopting a ‘cloud-first’ model can have various benefits for businesses of all types. The way that we have defined our ‘cloud first’ policy is relatively simple. Essentially, whenever we create or update something on our IT infrastructure, we consider either using a cloud service (SaaS) or moving the entire system to the cloud (IaaS, PaaS). There are five key reasons for applying this policy, all of which could benefit a wide range of businesses. #1: It gives us a chance to rethink our IT strategy I started my own business 20 years ago, and in that period we have integrated a whole host of smaller and larger IT systems into our existing environment. The advent of the cloud gives us the perfect opportunity to look at each of them whenever we interact with those systems, and ask: Are we using our legacy systems in the way that we should? Can we simplify the system? Is it at the core of our business, or can we move to a standardised system? #2: We can do things that were impossible before the cloud The cloud has opened up a whole new world for businesses, and as such, it is worth reflecting on all of the tasks that it now allows us to undertake: delivering websites and downloads globally and at speed using CDNs, while also geo-blocking downloads in countries with export embargoes; running the tracking and analytics system that we use internally; and keeping our email and newsletter distribution systems intact. And that’s just within our own business. Depending on the sector and nature of what you do, the opportunities can be endless. #3: It saves time and money To use a personal example, for several years we have been using cloud-based CDNs to deliver our website content, and also downloads of our IT monitoring product (which encompasses the thousands of trial and update downloads every month). Using the cloud is cheaper than our data centre, faster for customers, and includes features for free that would take a significant amount of resource to create and maintain – for instance, the aforementioned geo-blocking for export control. Email is a perfect real-world example of this. Migrating an on-premise Microsoft Exchange server cluster to Office 365/Exchange Online can result in the removal of excess hardware, less hassle when it comes to ongoing maintenance, and ultimately increased performance. [easy-tweet tweet="The cloud offers greater options for scalability" hashtags="Cloud, IT"] #4: It’s better when it comes to scalability It’s well-known that the cloud offers greater options for scalability. For businesses that are growing quickly, to the extent that they may even be moving offices every 4-5 years, moving IT services to the cloud can be incredibly useful. Such changes place significant pressure on internal IT and developers, and the cloud can take some of that pressure off by offering unlimited scalability.
#5: It offers greater security and stability Most companies will rely on just a small IT team working within an 8 hour day to ensure adequate cyber security is in place, but cloud providers such as Amazon and Microsoft have round-the-clock teams devoted to this kind of activity. To use an analogy, if you have £1 million in cash, do you store it at home or in a bank vault? Clearly, the benefits of adopting a ‘cloud first’ policy are numerous. By considering factors such as time, cost, security and scalability, ensuring that your business is cloud-optimised is clearly beneficial when it comes to operating in the new global economy. ### Mobile Working a Major Cause of Data Breaches A third of companies say they have experienced a data loss or breach as a direct result of mobile working.  Apricorn, the leading manufacturer of software-free, hardware-encrypted USB drives, today announced new research highlighting that a lack of rigour and consistency when it comes to protecting data poses significant security risks, as 70 percent of IT decision makers agree that securing corporate data is an ongoing battle. The research, conducted by Vanson Bourne, found that around a third (29 percent) of surveyed organisations have already experienced either a data loss or breach as a direct result of mobile working. A significant proportion – as many as 44 percent – expect that mobile workers will expose their organisation to the risk of a data breach. Underlining this concern, almost half (48 percent) of the surveyed companies say employees are one of their biggest security risks. The survey results show that mobile working is a major problem as companies are still uncertain how to enforce adequate security policies, and many have no viable strategies in place. As mobile devices extend the boundary of the corporate network, ensuring confidentiality, integrity and availability of the data that the devices access, process and store is a constant challenge. Fifty-three percent of surveyed companies said that managing all of the technology that employees need and use for mobile working is too complex, while 35 per cent complain that technology for secure mobile working is too expensive. The survey also found that one in ten companies with over 3,000 employees does not have a security strategy that covers remote working and BYOD. One in ten companies, regardless of size, don’t have a strategy that covers removable media, such as USB sticks. Removable devices such as compact flash drives can pose a huge risk to businesses, not only because they are easy to lose or steal, but also in terms of the malware they can introduce to networks. Worryingly, roughly a quarter (23 percent) of surveyed organisations admit that they have no way of enforcing relevant security strategies they have in place, which is almost as risky as having no policy whatsoever. [easy-tweet tweet="12 percent do not have any policy." hashtags="Data, Mobile "] Despite some having defined security policies for mobile working, nearly 7 in 10 (68 percent) say they cannot be certain that their data is adequately secured when employees work remotely or on mobile devices. Encryption is the most viable option for organisations to protect valuable data outside of the corporate network, whilst also balancing control and accessibility. However, only a third of those surveyed say they enforce hardware and software encryption of their data, and 12 percent do not have any policy at all regarding encryption for data that is taken away from the office. 
“Whilst data protection is not a straightforward task, companies (particularly those in the private sector) are trusted by their customers to follow basic best practices. Despite this, 38 per cent say they have no control over where company data goes and where it is stored. Organisational struggles with enforcing data protection regulations and compliance standards are putting confidential data at risk,” said Jon Fielding, Managing Director, Apricorn EMEA. “The repercussions associated with a data breach are huge, both in terms of financial and reputational damage. Regulations are put in place to protect the data, its owner and the company responsible for it,” he added. In 2018, the financial implications will increase when the European General Data Protection Regulation (GDPR) comes into force, and fines of up to €20 million or 4 percent of global annual turnover are introduced. The survey found a distinct lack of awareness amongst UK companies when it comes to the GDPR requirements: “Companies will need to ensure personal data of European citizens is secure but, disturbingly, 24 percent of the surveyed organisations are not even aware of the GDPR and its implications. On top of this, 17 percent are aware of the regulations, but don’t have a plan for ensuring compliance,” Fielding noted. When asked about the greatest security risk to their organisation in 2017, half of the respondents (51 percent) cited outdated software, followed by employees (48 percent), and the cloud (40 percent) among their top risks. More than a third of those surveyed said BYOD and mobile working were among the biggest liabilities. While many organisations recognise the security problems associated with mobile working, sometimes it’s down to a lack of adequate training or not providing the right tools: Over half (57 percent) of respondents agree that while their mobile workers are willing to comply with security measures, they don’t have the necessary skills or technology to keep data safe. And it may get even harder to secure and enforce data protection in the future as 47 percent agree, or strongly agree, that while the younger generation of workers is more technology savvy, they care less about security than the older generation. About the survey The research was conducted by Vanson Bourne, an independent specialist in market research for the technology sector. The research consisted of 100 interviews of IT decision makers in the UK, during January. Respondents to this research came from private sector organisations with more than 1,000 employees. Vanson Bourne’s reputation for robust and credible research-based analysis is founded upon rigorous research principles and their ability to seek the opinions of senior decision makers across technical and business functions, in all business sectors and all major markets. For more information, visit www.vansonbourne.com.  ### Capabilities and Innovation - #CloudTalks with Sempre’s Tom Clark In this episode of #CloudTalks, Compare the Cloud speaks with Tom Clark, the Analytics Practice Manager from Sempre, about how they are looking into machine learning to help clients better utilise its technology. Tom also shares what the company is succeeding with at the moment. 
#cloudtalks Visit Sempre to find out more: http://www.sempreanalytics.com/ ### Amazon Web Services and Datapipe Selected as Infrastructure Providers for the ANNA Derivatives Service Bureau for OTC ISINs The ANNA Derivatives Service Bureau (DSB) today announced that Amazon Web Services (AWS) has been selected as the cloud services provider for the DSB, alongside Datapipe as the Service Provision Provider (SPP) for the DSB platform. With the engagement of these companies, the DSB initiates build-out of the infrastructure to address the global requirements of market participants and regulators. The DSB platform will provide near real-time allocation of International Securities Identification Numbers (ISINs) for OTC derivatives, as well as the ability to download the full archive of these ISINs and their associated reference data. “Securing Amazon Web Services and Datapipe for infrastructure provision and management is another major step forward in the DSB’s mission to deliver OTC derivative ISINs to the industry,” said Emma Kalliomaki, managing director of the Association of National Numbering Agencies (ANNA) and the DSB. “We are enthusiastic about the massive resources and proven expertise they bring to this ambitious development project, and we look forward to their provision of the custom-architected and managed solution to meet our global needs.” Datapipe and AWS were selected following a formal RFP process managed by BearingPoint, the management and technology consultancy that serves as trusted advisor to ANNA in the development and oversight of start-up activities, as well as business strategies and practices. AWS is a leading provider of highly reliable, scalable, low-cost cloud services to millions of customers across the globe. Datapipe, which will provide customisation and management services for the infrastructure, is an AWS Partner Network (APN) Premier Consulting Partner with expertise in architecting, migrating and managing next-generation technology environments. “ANNA’s decision to build the DSB on AWS demonstrates the importance of building mission-critical industry utilities on secure, scalable, resilient and globally distributed infrastructure,” said Scott Mullins, head of worldwide financial services business development, Amazon Web Services. “The DSB and its users will benefit from AWS’s global scale, cost efficiencies, and secure cloud services, and we are delighted to be selected as DSB’s cloud services provider.” Stewart Smythe, managing director of Datapipe Europe, said, “We were able to address the global requirement of the DSB and use Datapipe’s expertise combined with AWS infrastructure to provide it with a managed solution for the DSB platform. The infrastructure is ready to support its requirement for the near real-time allocation of ISINs, and it delivers OTC derivative ISINs in time for the MiFID II regulations that come into effect on 3rd January 2018.” The development timeline of the DSB is keyed to the enforcement date of the EU’s MiFID II regulation, enabling users to transition into production before European regulatory reporting of some OTC derivatives becomes mandatory in January. User Acceptance Testing is due to launch on 3 April 2017, and UAT registration is now open. DSB production service begins on 2 October 2017. [easy-tweet tweet="Datapipe provides comprehensive cloud solutions" hashtags="Datapipe, Cloud "] About Datapipe A next-generation MSP, Datapipe is recognised as the pioneer of managed services for public cloud platforms.
Datapipe has unique expertise in architecting, migrating, managing and securing the public cloud, private cloud, hybrid IT and traditional IT. The world’s most trusted brands partner with Datapipe to optimise mission-critical and day-to-day enterprise IT operations, enabling them to transform, innovate, and scale. Backed by a global team of experienced professionals and world-class interconnected data centres, Datapipe provides comprehensive cloud, compliance, security, governance, automation and DevOps solutions. Gartner named Datapipe a leader in the 2017 Gartner Magic Quadrant for Public Cloud Infrastructure Managed Service Providers, Worldwide. For more information please visit the Datapipe website. About The Derivatives Service Bureau (DSB) Ltd Headquartered in London, the DSB is a privately held subsidiary of the Association of National Numbering Agencies. Its core purpose is to serve as a global numbering agency, providing unique identification of OTC derivatives to serve the needs of market participants and regulators through the allocation of the International Securities Identification Number (ISIN) as OTC products are created. The ISIN is a globally recognised and adopted ISO standard for identifying financial instruments. More detailed information on the DSB and its development path can be found in the DSB section and related pages, as well as recent announcements, at the ANNA website. About ANNA Established in 1992 by 22 founding numbering agencies, ANNA is the membership organisation of national numbering agencies, which are operated by depositories, exchanges, government agencies, nationally central data vendors and other financial infrastructure organisations. ANNA also serves as the registration authority for the ISIN and FISN standards, under appointment by the International Organization for Standardization (ISO). Under ANNA’s stewardship, the role of the ISIN in enabling global financial communications has been established worldwide. ISINs are issued today in more than 200 jurisdictions worldwide. In addition, ANNA is developing the Derivatives Service Bureau (DSB), a fully automated global numbering agency to meet the operational and regulatory requirements of the over-the-counter derivatives markets. The number of national numbering agencies and nations working to establish national numbering agencies continues to grow each year, now surpassing 120 jurisdictions globally. For information about ANNA, its members and activities, please visit anna-web.org. ### Cloud migrations – Costly, Risky or Neither? A few weeks ago, we wrote a brief introduction about saving time and why safeguarding data is imperative. I want to follow on from this by taking a deeper dive into how Velostrata can assist with the technical world of migrations and – simply put – save time and money in four identifiable ways: Complexity Anyone who says they don’t have complexity within their IT estate is lying. IT by its very nature is complicated (that’s why Compare the Cloud exists), and nothing is more complicated than taking your IT estate and making it “Cloud Ready”! I love the way that Velostrata addresses this in a unique style at the Virtual Machine (VM) level, with rightsizing (great for capacity planning) and agentless operation, which makes it much easier to implement and use for complex migrations.
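Rightsizing deserves a quick aside for anyone new to it. Velostrata's own logic is proprietary and not shown here; the general idea, though, is simply to map a VM's observed peak utilisation to the smallest target size that still leaves headroom. A minimal sketch, assuming an invented size catalogue and an arbitrary 20% headroom figure:

```python
# Generic illustration of rightsizing ahead of a migration: pick the smallest
# target size whose capacity covers observed peak usage plus a safety margin.
# The catalogue and the 20% headroom are invented for this example; they are
# not Velostrata's (or any provider's) actual figures.
SIZES = [                      # (name, vCPUs, memory in GiB), smallest first
    ("small", 2, 4),
    ("medium", 4, 16),
    ("large", 8, 32),
    ("xlarge", 16, 64),
]

def rightsize(peak_vcpus: float, peak_mem_gib: float, headroom: float = 0.2) -> str:
    """Return the smallest size that covers peak usage plus the headroom."""
    need_cpu = peak_vcpus * (1 + headroom)
    need_mem = peak_mem_gib * (1 + headroom)
    for name, vcpus, mem in SIZES:
        if vcpus >= need_cpu and mem >= need_mem:
            return name
    return SIZES[-1][0]  # nothing fits comfortably, so fall back to the largest size

print(rightsize(peak_vcpus=3.1, peak_mem_gib=10))   # -> "medium"
```

Run against a month of monitoring data per VM, even a toy rule like this flags over-provisioned machines before they are copied, wholesale, into a more expensive home.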
[easy-tweet tweet="You can have a workload operating in the Cloud in less than 5 minutes" hashtags="Cloud , Data"] Time How long have we had to wait for data copies, data transfers and the long weekends of data migrations? Complicated scripts that have to run and often fail, and the necessity for large internet pipes. Because of Velostrata’s streaming capabilities, you can have a workload operating in the Cloud in less than 5 minutes, while remaining data streams in the background. Velostrata’s read/write cache ensures full resiliency and application performance despite the WAN. Risk How many times has the lack of planning let you down when migrating to the Cloud? Can’t take the application offline to conduct the appropriate audit? There have been countless times I wished I could take a complete copy of the IT Estate workload I’ve been responsible for in my past, in preparation for a migration or upgrade whilst staying current. With Velostrata, you can clone a complete copy of a workload to the Cloud for testing, evaluation and validation without taking the production application offline. Also, Velostrata’s unique caching and wan optimisations to give you full app performance in the Cloud, as well as the ability to roll a workload back on premises (or to another cloud) if desired. [easy-tweet tweet="A hybrid approach to Cloud efficiencies is pretty much the mainstay now" hashtags="Cloud, Hybrid "] Scalability The Cloud is scalable right? Yes and no! A hybrid approach to Cloud efficiencies is pretty much the mainstay now, but when we talk about effective use of Cloud environments these are quite often misquoted, misunderstood and badly implemented. Velostrata decouples compute from storage, and unique workload streaming allows a workload to boot in the Cloud with only the critical elements needed to run the application. Additional hot data streams in the background while the application is live. This, combined with de-duplication and other WAN optimisations, makes it possible to move 88 VM’s simultaneously and have them running in the cloud within an hour by using modest bandwidth links. A really amazing addition to the standard Cloud migration management toolsets available. Most migrations rarely consider scale and in fact, they have probably just evolved from disaster recovery environments or even simple backups. I like Veolstrata and their approach to what, in my view are, the most important parts of any migration are streets ahead of their competition. You can clearly identify that a lot of work and thought has gone into delivering exactly what we want!   ### Process Automation - #CloudTalks with Trivaeo’s Pat Graham In this episode of #CloudTalks, Compare the Cloud speaks with Pat Graham from Trivaeo Cloud Services limited about their product, crossroads and what type of services they provide. Pat also talks about the benefits of process automation.   #CloudTalks Visit Trivaeo Cloud Services to find out more: trivaeo.com/ ### Legacy, Migrations and the Elephant in the Room! At Compare the Cloud we frequently come across some of the best (and worst) rising tech innovations in the industry, and recently there has been a splurge in very disruptive tech fall into our laps. However, I want to discuss the question of Legacy, how do you transform/conform to achieve a digital transformation? Let's see the challenges that face the legacy tech/systems/apps that you have just migrated around, keeping that elephant in the room. Why? 
Because it was too much effort to change, not possible to change due to application restrictions, or you have always delivered a certain service that way – so why change it? [easy-tweet tweet="The multitude of green screens are rife" hashtags="Cloud "] I can relate to this in a big way, and I think anyone reading this article who has worked in a banking/finance environment would agree – green screens are still rife. They are seen as a necessary evil to run that client services app, that customer data record system, or even just to access alternative technology that required an emulator. Sadly, the green screens were always thought of as low-tech, old-school data entry oddities, however important they were. Does the below look familiar? Of course it looks familiar, and this still has its use today. But how do you transform your company, service, technology and staff to the digital age with these legacy applications? Yes, you can have an emulator; yes, you can access this remotely and work around the elephant in the room – but why would you if you didn’t have to? Your staff would be happier if they could use a modern interface that they are not embarrassed to be associated with. You as a divisional head can have a roadmap to a digital transformation that does more than perform a makeover. How about turning those legacy applications into web-based applications that are mobile across smart devices, faster than traditional applications in use, and cater for systems such as mainframes (all), IBM i, Unix, Linux, VMS and others? Flynet is the missing link that enables legacy applications to be digitally transformed, and it will simply save you so much time in achieving this goal. I caught up with Flynet recently, and I was blown away with their products and how they just love legacy environments. No, it's not a terminal emulator (although it can be that too), as I thought when first discussing Flynet. Flynet takes your green screen, codifies it, digitalises it, and makes it mobile and heterogeneous to your modern way of thinking, period! Mainframes have had a makeover to adapt to Linux and other current and future operating systems, so why shouldn’t the apps run more efficiently and link into your technical roadmap for digital transformation too? [easy-tweet tweet="You know you can rely on the tech behind the scenes" hashtags="Cloud, Tech "] As I said before, it's not just a modern front-end makeover, though; it’s a massive improvement to the way you use legacy applications. It's fast, and it basically turns your legacy into a very sophisticated web application that is right on the pulse for the digital transformation that you WILL need to go through. Let's face it, nobody really cares what your legacy application runs on (sorry), but the consumers/users of the application do. You know you can rely on the tech behind the scenes to be secure and do what it is designed to do, but nobody wants access limited to an emulator that is stuck on the end of world-class technology infrastructure. This opens the door to running even more on your current infrastructure than you ever thought possible, by integrating applications with the very resilient “green screen” backend technology. To be honest, I am impressed with Flynet and their developments in this arena: it takes the elephant in the room, puts it on a diet and workout regime, and what comes out is a lean digital machine.
If you have green screens, are struggling with legacy migrations and want to change with the times, just take a look at what Flynet offers, as I did, and prepare to be surprised. ### AWS and Active Directory – What are the limitations? An increasing number of organisations are migrating their data processing operations to the cloud, and it's not hard to understand why. Cloud service providers, such as AWS, are secure, flexible, scalable and relatively inexpensive to set up. They provide automatic software updates and disaster recovery. Specifications such as CloudAudit are being developed which provide a standardised method for presenting detailed statistics about a cloud system's performance and security. Yet, despite the proposed improvements, there are still a number of limitations and concerns associated with auditing critical data in the cloud. After all, you are still hosting your critical data on a shared server that you have little or no control over. Cloud solutions are less customisable and are typically more expensive than on-premise solutions in the long term. Cloud services may be subject to disruptions and outages, and there’s even the possibility that the service provider will go out of business. Amazon Web Services (AWS) provides a tool called the AWS Directory Service, which enables IT administrators to run Microsoft Active Directory on their servers. There are three different options for running Active Directory in AWS: Microsoft AD, Simple AD and AD Connector. However, each option comes with its own set of impediments. [easy-tweet tweet="Simple AD provides a subset of features found in Microsoft AD. " hashtags="Cloud, AWS, "] The first option, Microsoft AD, is the enterprise version of AWS Directory Service and is able to handle up to 50,000 users or approximately 200,000 Active Directory objects. These objects can include users, groups and computers. While such limitations are unlikely to be an issue for small companies, they could be a problem for companies that process a very large number of AD users and objects. Additionally, the AWS Directory Service doesn't allow users to configure the performance settings in Microsoft AD, which makes it impractical for users to remedy performance issues. For example, users may wish to change the amount of resources allocated to processing, storage or memory for a given AD instance. Another drawback of using AWS is that it's currently not possible to migrate your existing on-premise AD database to the cloud. The second option, Simple AD, would be suitable if you require an inexpensive AD-compatible service for running Active Directory in AWS. Simple AD provides a subset of the features found in Microsoft AD. It allows you to manage users, groups and computers, but does not allow you to define trust relationships between domains or add domain controllers to instances. Also, you will not have features such as the AD Administrative Center, AD Recycle Bin and PowerShell. You will also have limited control over password policies, group managed service accounts and schema extensions. Again, this may be fine for small organisations, but for large organisations, Microsoft AD will probably be required. The third option, AD Connector, is used to connect your existing on-premise Active Directory database to AWS. The AD Connector enables you to mitigate both the costs and complexities of managing and maintaining your own infrastructure. AD Connector allows you to use the same tools to manage AD in the cloud as you would use to manage your on-premises AD.
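To make the three options a little more concrete, all of them are provisioned through the same AWS Directory Service API. A minimal sketch using boto3 is below; the VPC, subnets and password are placeholders, and in practice the credentials would come from a secrets store rather than the script itself:

```python
# Sketch: provisioning the AWS-managed Microsoft AD option via the AWS
# Directory Service API (boto3). All identifiers below are placeholders.
import boto3

ds = boto3.client("ds", region_name="eu-west-1")

# Option 1: AWS-managed Microsoft AD, subject to the object limits discussed above
response = ds.create_microsoft_ad(
    Name="corp.example.com",
    ShortName="CORP",
    Password="REPLACE-with-a-value-from-your-secrets-store",
    Edition="Enterprise",
    VpcSettings={
        "VpcId": "vpc-0123456789abcdef0",
        "SubnetIds": ["subnet-aaaa1111", "subnet-bbbb2222"],  # two subnets in different AZs
    },
)
print("Directory ID:", response["DirectoryId"])

# The other two options follow the same pattern:
#   ds.create_directory(...)   # Simple AD
#   ds.connect_directory(...)  # AD Connector, pointing at on-premises DNS and credentials
```

None of this changes the trade-offs above – the API creates and connects directories, but it does not expose the underlying performance settings.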
Since no information is cached in the AWS cloud, your organisation can retain control over the way critical data is handled. However, the AD Connector does not allow admins to make changes to Active Directory in AWS, which, as previously stated, will inhibit the admin's ability to address potential performance issues. As you can see, there are pros and cons to running Active Directory on AWS. One might argue that for larger organisations that want unfettered control over their data, an on-premise setup might be more appropriate. While an on-premise setup might require an up-front investment, it can actually work out cheaper in the long run. Furthermore, prices are coming down. For example, solutions such as LepideAuditor provide an affordable yet feature-rich suite of auditing and reporting tools, which will give you complete control over your data. ### Disruptive Pitch UK – Series 1 – Episode 5 Episode 5 of the reality TV technology pitch show, #DisruptivePitch. The UK’s most aspiring technology entrepreneurs, startups and emerging companies pitch themselves to our panel of judges for a chance in the final. ### Wallscope and NHS Scotland: The Story So Far Wallscope has been demonstrating how the NHS can transform its working practices, as part of ongoing work with the Information Services Division (ISD) of NHS National Services Scotland. The Challenge ISD are responsible for publishing statistical information for a wide range of audiences, from their own staff to Government policy leads, clinical staff and the general public. Currently, this information is published in fixed Excel spreadsheets and PDF files which can be difficult to understand and customise. Making this data accessible and presenting it quickly and efficiently, in a way that is tailored to the end user, is key. That’s why Wallscope has been working with ISD as part of the Scottish Government’s CivTech scheme, which aims to ‘drive daring and innovation’ in the public sector. Wallscope won a place in the scheme through their response to the challenge: ‘How can we make our data publications more accessible and appealing?’ In an organisation where knowledge access presents significant issues, sustainable and lower-cost solutions are increasingly in demand. Data is often stored in spreadsheets so large that the average user wouldn’t be able to open them. In addition, increasing regulation, funding pressures and an ageing UK population all create a demand for robust and insightful solutions that meet the needs of an ever-changing NHS. The Solution Wallscope’s Dynamic Data Discovery platform has the ability to directly link topics, concepts and associations that span a vast number of sources. This significantly reduces the time spent on the preparation and distribution of information – and makes the data presented more meaningful, accessible and relevant to the end user. So how does this work in practice? We have developed a metadata level which dynamically connects different data stores. In the initial stages of this work, we integrated NHS prescription data with the British National Formulary (BNF) – a pharmaceutical reference book that contains a wide spectrum of information on prescribed drugs – and workforce data. This included information on GP practices, doctors, nurses and health boards. We then used geodata to build heatmaps for health boards across Scotland, which provide real-time visualisations of the data. Responding to a steer from the NHS, we concentrated on anti-microbial drugs, such as antibiotics or antivirals.
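For readers unfamiliar with linked data, the mechanics behind this kind of integration can be illustrated with a toy example: a prescription record, a drug classification and a health board are joined purely through shared identifiers and then queried together with SPARQL. The namespaces, predicates and figures below are invented for the illustration; they are not ISD's or the BNF's actual schema:

```python
# Toy linked-data example using rdflib: join prescription, drug-class and
# location facts through shared URIs, then query them together with SPARQL.
# All names and numbers are made up for the illustration.
from rdflib import Graph, Literal, Namespace, RDF

EX = Namespace("http://example.org/health/")

g = Graph()
g.add((EX.rx001, RDF.type, EX.Prescription))
g.add((EX.rx001, EX.drug, EX.amoxicillin))
g.add((EX.rx001, EX.healthBoard, EX.lothian))
g.add((EX.rx001, EX.items, Literal(120)))
g.add((EX.amoxicillin, EX.bnfClass, Literal("antibacterial")))

# Which health boards prescribe anti-microbial drugs, and in what volume?
results = g.query("""
    PREFIX ex: <http://example.org/health/>
    SELECT ?board ?items WHERE {
        ?rx a ex:Prescription ;
            ex:drug ?d ;
            ex:healthBoard ?board ;
            ex:items ?items .
        ?d ex:bnfClass "antibacterial" .
    }
""")
for board, items in results:
    print(board, items)
```

The value of the approach is that each dataset keeps its own structure; only the shared identifiers need to line up for a query (or a heatmap) to span all of them.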
We can quickly give an overview of their use across Scotland, identifying when and where they are prescribed and comparing their use in different health boards. By layering different types of data we can gain new insights into the effectiveness of a particular drug, for example. The End Result These visualisations can be presented in a custom-designed dashboard which is tailored to the end-user, such as a GP, an analyst or a member of the public. Access to designated functions, features and content can be controlled using verifiable permissions – meaning that sensitive information remains secure. As a result, published data is better presented, and the user’s experience is massively improved, through enhanced search, discovery and exploration of the organisation’s information. The addition of auto-tagging and annotation increases the power of the search function and allows content expansion across an improved publishing workflow. Quick access to more meaningful information will support decision-making and help the NHS to meet requirements related to governance and public accountability, as well as realising efficiencies in service delivery and process improvements.  It will also help them provide better customer service and value for money for service users. There are many more valuable uses of the technology that could be explored – for example integrating health and social care data, to give a clearer picture of the delivery of services across Scotland and beyond.   ### Customer Experience - #CloudTalks with TES Enterprise Solutions’ Ron Argent In this episode of #CloudTalks, Compare the Cloud speaks with Ron Argent, TES Enterprise Solutions' CEO, about mainframe, and how they are growing as a company and moving into storage, cloud and consultancy. He also talks about how TES Enterprise Solutions will go above and beyond to help their clients. #Cloudtalks Visit TES Enterprise Solutions to find out more: tes-es.com/ ### Designing with Data Wallscope lead research seminar at University of Edinburgh.  As part of an ongoing relationship with the University of Edinburgh’s Design Informatics department, Wallscope recently led a research seminar on their dynamic data discovery platform – the Verinote API. Directors David Eccles and Ian Allaway demonstrated the technology to a packed audience of postgraduate students, researchers and academics. [easy-tweet tweet="Improved access to data means larger volumes of information" hashtags="Data, API"] The API uses semantic web techniques and linked data principles to improve the search and discovery of information. Improved access to data means larger volumes of information, which in turn creates design challenges and opportunities. This interface between design and technology is at the heart of Design Informatics – harnessing the analytic power to design products and services that transform the way we live and work. After explaining the underlying semantic and linked data technology, Ian and David used real-time examples to illustrate how the API works in practice. Firstly they showed how NHS information relating to areas where drugs are prescribed or where particular infections are prevalent could be linked, using geodata. This can be presented through graphics such as heat maps to make the information more engaging and accessible for the end user. Wallscope is currently developing this with the NHS as part of the Scottish Government’s CivTech pilot project. 
They also demonstrated a tool developed with Design Informatics students and the Fruitmarket Gallery in Edinburgh. This draws together information about particular artists and concepts using open source data from the Fruitmarket, the Tate and MoMA, and presents it using a mind map design. The quick and intelligent data search and discovery, combined with an accessible design and user interface, creates a valuable educational resource. Technology Director Ian Allaway said: ‘Our seminar explained how information can be used to deliver meaningful services to ordinary humans in practical ways. We covered Artificial Intelligence, semantic technologies and linked data. Terms that seem mysterious to some – so an exercise in debunking the myths surrounding these technologies proved to be engaging, especially for the students present.’ Students were encouraged to think creatively about how they could harness this technology and were given access to the API for use in their work. Wallscope already employs two student interns and is keen to share ideas and potential applications of the technology with the student community as this work progresses. ### What is Consentua? - #CloudTalks with KnowNow’s Chris Cooper In this episode of #CloudTalks, Compare the Cloud speaks with Chris Cooper, KnowNow's Founder and Director, about Consentua and how it captures the consent for a third party to use personal data. Chris also talks about Smart Cities and how they don't yet exist. #Cloudtalks Visit KnowNow to find out more: http://www.kn-i.com/ ### Cloud Expo Europe 2017 and Disruptive Tech TV looking for interview guests for live broadcasts — 15th/16th March. DisruptiveTech.TV, an offshoot of Compare the Cloud, is bringing three live shows to the ExCeL, London on March 15th-16th, 2017, broadcasting from Cloud Expo Europe. Cloud Expo encompasses five events in 2017: Cloud Expo Europe, Big Data World, Cloud Security Expo, Data Centre World and Smart IoT London. DisruptiveTech.tv will broadcast across the five events, and stream live coverage of interviews with technology experts and speakers across the two days. Exclusive interviews with industry-leading experts will be featured across the two days of broadcast. There will also be our Behind The Expo segment, interviewing attendees and discovering the people behind the technology at the Expo. Slots are filling quickly - but we are on the lookout for technology companies looking to be interviewed on our live shows to talk about their business, technologies and journey. To schedule an interview, please contact: pr@comparethecloud.net DisruptiveTech.tv is an alternative approach to the everyday technology that is ingrained in our lives. Focussing on the IoT, analytics and truly digital and disruptive technology, DisruptiveTech.tv aims to get people interested in the tech behind technology again. In addition, DT:TV will also be filming the finale of Disruptive Pitch season 1. Over the past 6 months, entrepreneurs and disruptive technologies have been pitching to our panel of expert industry judges for a keynote slot at Cloud Expo Europe 2017. Join the audience on Wednesday morning as our judges decide who has the perfect disruptive pitch. Andrew McLean comments: “DisruptiveTech.tv is designed to change the way we look at technology, and the way we cover events in the technology sector.
It’s about bringing an exciting new approach to live coverage of technology, and pointing a spotlight at the truly disruptive technologies that are being developed by the biggest, and smallest, players in the industry. CloudExpo.TV is a great collaboration with a monumental event.” To follow DisruptiveTech.tv’s shows at Cloud Expo 2017, follow @disruptivelive on Twitter, check out the hashtag #CEE17, or visit the website at https://www.disruptivetech.tv/cee17 You can watch the event at: www.cloudexpo.tv You can discover more about the event and watch the trailers at: https://www.disruptivetech.tv/cee17/ ——————- About Cloud Expo Europe Cloud Expo Europe is the UK’s largest technology event and the world’s largest independent cloud event, featuring a major exhibition with a record 400+ cutting-edge suppliers plus a compelling conference of 300 global expert speakers – and thousands of visitors. ——————- About Compare the Cloud Compare the Cloud is an industry-leading cloud commentator, aiming to educate the public on the latest IT and Cloud trends. Compare the Cloud champions the adoption and value of cloud computing; helps companies select services and providers best suited to their needs; and acts as a central hub, information resource and community for all cloud-related issues. ——————- About Disruptive Tech TV Powered by Compare the Cloud, Disruptive Tech TV is a 24-hour online channel covering live events and emerging technology throughout the UK, Europe and globally. ### IT Pros: Lifesize Welcomes You to Stress-free Live Streaming Lifesize, a global innovator of video conferencing technology, today announced Lifesize® Live Stream. The latest addition to the cloud-based Lifesize application is a simple, flexible and secure solution to live-stream company meetings, executive updates, training sessions and more with high-definition, broadcast quality. With Lifesize Live Stream, IT professionals now have even more ways to effectively enable their organisations to easily and powerfully connect and collaborate, all through one easy-to-use solution, furthering their investment and lowering their total cost of ownership. Lifesize Live Stream delivers unparalleled flexibility, allowing IT administrators to enable an unlimited number of Lifesize virtual meeting rooms to simultaneously live stream events with up to 10,000 viewers and 50 video-enabled sites per event. A simple, single click allows moderators to start a presentation and users to join an event from any location, browser or device. [easy-tweet tweet="Lifesize Live Stream allows organisations to say goodbye to one-sided events." user="comparethecloud" hashtags="lifesize, livestream"] This new tool also raises the bar for live streaming with unmatched value by allowing organisations to pay based on usage. With the Lifesize Live Stream pricing model, users no longer have to worry about getting locked into a fixed package or trying to predict how many events will take place and how many viewers will attend. Instead, organisations can grow as they go, using and expanding their hours as needed. Additionally, Lifesize Live Stream allows organisations to say goodbye to one-sided events. To promote active participation from audience members, Lifesize Live Stream offers a real-time Q&A feature that lets viewers ask questions directly from the live streaming page.
Coupled with Lifesize® Record & Share (formerly Lifesize Amplify), the tool also allows one-click recording of any event, which can be shared with absent viewers later and housed in a personal video library. “The Lifesize Live Stream capability allows us to live-stream company-wide meetings through a single, unified platform. It will connect our offices throughout France and elsewhere in the world and include mobile and remote staff who can participate from wherever they are. It’s a much simpler and easier solution compared to other approaches we have tried that require multiple components and were complicated to set up, manage and use,” said Yannick Petriccioli, IT Project Manager at Groupe Virbac, the eighth-largest animal health company, with locations around the world. “Coupled with the Lifesize recording and sharing feature, we also like the benefit of being able to give those who couldn’t attend the session the ability to easily watch the session on-demand. We see great potential for using Lifesize Live Stream beyond internal meetings, including seminars, training and interactions with customers and partners on a wide scale basis.” “For IT, connecting an organisation from end-to-end is a challenge when users are geographically distributed and operating on different devices,” said Craig Malloy, CEO at Lifesize. “Designed to make event streaming as simple as possible, Lifesize Live Stream provides the flexibility, security and cost-effectiveness that allows organisations of any size to reach their audiences and engage with them in a meaningful way.” With Lifesize Live Stream, organisations and their IT admins can: Enable unlimited simultaneous events with up to 10,000 viewers and 50 video-enabled sites per event. Deliver professional broadcast streaming quality through Lifesize HD camera and phone systems for any conference room paired with the cloud-based Lifesize application. Easily monitor and report on utilisation, attendance and Q&A transcripts from an admin console. Automatically generate an invitation link for viewers and set viewing permissions (public, private, group-only). Record and share both video and content or build a personal video library with Lifesize Record & Share. Maximise value with a grow-as-you-go pricing structure. Enable viewing from a variety of browsers (Chrome, Firefox, Safari, Internet Explorer and Microsoft Edge) and mobile devices (iOS and Android). Lifesize Live Stream is available for purchase with a Lifesize Enterprise or Premium subscription plan as of today. ### The biggest CIO challenge ahead: Transforming the enterprise Around the globe, boards of every enterprise are concerned about the impact of digitisation on their company’s market position and ability to compete. Nearly 8 out of 10 enterprises (78 percent) say that achieving digital transformation will become critical to their organisations within the next two years. Sixty-seven percent of CEOs will make digital transformation the centre of their corporate strategy by the end of 2017. IT has been elevated from its traditional support function to being the backbone of the modern enterprise. Given their ever-increasing dependency on technology, CIOs must assume a strategic leadership role and drive change throughout the enterprise. While this might not sound entirely new, the extent of involvement needed and the extra burden on the CIO’s shoulders have constantly increased. Almost 60 percent of CIOs have moved to the foreground and are now in regular communication with their boards of directors.
Today’s successful CIOs transform IT by driving an agenda based on an in-depth understanding of business challenges and corporate objectives. In this context, the following four themes have emerged to reposition IT and transform the enterprise: Consolidation and Standardisation Many enterprises still find themselves in a relatively heterogeneous IT environment with loosely integrated subsidiaries and country organisations that operate fairly autonomously. However, without a baseline for global process consistency and a coherent operating model, CIOs risk losing control of their own destiny. A lack of integration and the existence of fragmented systems creates enormous redundancies. [clickToTweet tweet="Users have much higher expectations for ease of use and convenience #IT #Technology" quote="Users have much higher expectations for ease of use and convenience"] When building central repositories for IT assets and streamlining the IT landscape across the various geographies and units, CIOs can yield not just significant savings (greater buying power, economies of scale, etc.). Reduced complexity also helps achieve much higher availability and increased stability. In a highly standardised and well-documented environment, service interruptions can be drastically reduced. And should they ever occur, the chances are that these services can be restored with a minimal mean time to repair (MTTR). User Experience In a digitalised world that operates 24/7/365, users have much higher expectations for ease of use and convenience. Today, IT services not only serve internal staff – they have the power to allow customers to interact with the enterprise. As a result, offering a differentiated experience must be a core component of every digitisation strategy. The days when a lacklustre help desk and a public knowledge database were good enough are long gone. Word of mouth – both positive and negative – spreads quickly. Better user experiences lead to higher levels of loyalty and repeat business. Those enterprises that excel in delighting their customers will enjoy unwavering credibility and have a huge edge over their competitors. [clickToTweet tweet="#CIO should ruthlessly embrace #automation and apply a “zero-touch” mindset #IT #Technology" quote="CIOs should ruthlessly embrace automation and apply a “zero-touch” mindset"] Innovation and Automation The concepts and frameworks in strategic management are all about outsmarting the competition. However, the quicker pace of innovation in a digital world has made competitive advantages harder to sustain and easier to duplicate. In the digital age, nobody can afford to rest on their laurels. Therefore, CIOs and their respective business unit peers have to constantly enhance the portfolio and come up with new features that enrich the customer experience. Wherever possible, CIOs should ruthlessly embrace automation and apply a “zero-touch” mindset to shorten processes, preclude human error, lower costs and deploy new services in the blink of an eye. Enabling New Business Models Improving the user experience and promoting agility will enable new digital revenue to be generated. In a connected world, enterprises have the opportunity to capture and refine data from every link in their value chains. Since data is the “new currency” of the 21st century, new business models will emerge that allow enterprises to come up with entirely different value propositions.
Rather than focusing on the bottom line of the balance sheet and lowering costs, CIOs need to change gears and devote their attention toward the top line, too. In essence, CIOs will be the enabler of data-centric business models and have the unique opportunity to serve as the growth engine for the enterprise. Conclusion The world is changing rapidly – and frankly, the challenge ahead is enormous. The task at hand means nothing less than reinventing the enterprise. CIOs should rethink their agenda and make sure they come up with a strategic roadmap that is well aligned with corporate goals and includes concrete deliverables, milestones, metrics and KPIs in line with the four themes outlined above. Setting the right priorities is absolutely essential to deliver tangible results and avoid getting bogged down in details. More than ever, CIOs can and must play a pivotal role in taking the business to a new, more profitable place. Instead of operating in the background, CIOs can step straight into the spotlight by enabling the enterprise to monetize data and produce new digital revenue streams. ### Client Success - #CloudTalks with Prolifics’ Richard Hope In this episode of #CloudTalks, Compare the Cloud speaks with Richard Hope from Prolifics about cloud based solutions and how their clients are looking at taking chances and are less afraid to fail. Richard also talks about how Prolifics have continued their 36 years of success. #Cloudtalks Visit Prolifics to find out more: prolifics.co.uk/   ### Spark44 shifts up a gear with Datapipe’s Managed AWS Solution Datapipe, a leader in managed cloud services for the enterprise, has led the cloud migration and digital transformation of global digital advertising agency, Spark44. Migrating its legacy infrastructure to the public cloud has given Spark44 the ability to manage and control digital assets consistently across the world.  Spark44 is a joint venture with Jaguar Land Rover that specialises in developing demand creation programs globally. With 18 offices in 16 countries spread globally, Spark44 found that distributing materials and content to local agencies had become increasingly challenging. In the digital era, where content is in a continuous cycle of updates and refreshes that change almost daily, Spark44 needed a platform that would maximize its ability to deploy content globally. As its legacy infrastructure was becoming a significant hurdle for the business, Spark44 migrated its infrastructure onto AWS’ public cloud with the help of Datapipe. Spark44 decided that the public cloud was the best solution for its requirements, but without the expertise to migrate and manage a public cloud infrastructure, it looked to Datapipe for assistance. Spark44 chose Datapipe because of its deep AWS expertise, along with its flexibility and willingness to collaborate with the agency from the start. [easy-tweet tweet="Spark44 migrated its infrastructure onto AWS’ public cloud with the help of Datapipe" user="comparethecloud" hashtags="Spark44, Datapipe, cloud"] Ahmed Hasan, CTO at Spark44 said, “We met with several service providers. Datapipe was the only one willing to come into our office. I’m a big face-to-face guy and they were happy to sit down with me, listen to my needs, and architect an idea that would really work for us. 
This not only demonstrated their competency, but also illustrated their willingness to go above and beyond in forging a partnership, an ethos close to our own hearts.” Datapipe developed a new global content management and distribution platform utilising three AWS regional sites – the master site in Europe and satellite sites in North America, and APAC. The solution has given Spark44 the ability to support customer demands by improving not only the speed in which new promotions or campaigns are rolled out to the markets, but also the consistency of the agency’s global deployment. Tony Connor, Head of EMEA Marketing at Datapipe said, “Spark44’s migration to the public cloud demonstrates the impact of cloud technology on businesses looking to modernise and expand. Legacy infrastructure is a challenge that holds back many businesses and with a trusted partner like Datapipe, businesses can overcome barriers to successful digital transformation. Spark44 now has everything in place to distribute its content globally using our fully scalable solution.” Through working with Datapipe, Spark44 can automate processes that were previously manual, and it has the resilience and scalability to cope with increased customer demands for tracked campaigns and in-depth analysis of results. This has set the agency up for future success and growth as its infrastructure is now built for scale. ### Simplifying IT - #CloudTalks with SimpliVity’s Paul Graham In this episode of #CloudTalks, Compare the Cloud speaks with Paul Graham from SimpliVity about simplifying IT, how they tore up the rulebook and spent several months redesigning data storage. Paul also talks about how they have improved on the Legacy Stack. #CLOUDTALKS Visit SimpliVity to find out more: https://www.simplivity.com/ ### Visibility the key to GDPR compliance   Look ahead, for a moment, to May next year. Around 425 days’ time. Business is booming with customer interactions transformed by funky cloud applications. But imagine someone in authority being dissatisfied by the unstructured data you hold on your customers – Twitter exchanges with disgruntled consumers or amusing pictures from happy ones. The authority is also unimpressed that your company has no system for confirming to consumers all the personal details you hold on them – or even how you delete them. The authority’s head of compliance says they could fine you 4% of annual turnover for these missteps. [easy-tweet tweet="For many mid-size companies or public organisations, there’s no escaping the GDPR’s remit" hashtags="Data, cloud"] A misunderstanding? Security overkill by Whitehall? No, it’s the EU’s General Data Protection Regulation (GDPR) that puts the citizen’s right to data privacy at the heart of our digital economy across Europe. And it’s coming to all organisations of 250 or more people from May 2018. This regulation goes beyond previous notions of privacy. GDPR builds in principles of ‘accountability’, and a citizen’s ‘right to be forgotten’ into EU law – changing all aspects of business and social interactions for any organisation with a digital and cloud footprint. For many mid-size companies or public organisations, there’s no escaping the GDPR’s remit, because it applies to those supplying goods and services to the EU from inside or outside. It becomes law in each EU state without legislation and will take effect a year before Britain can make its earliest Brexit: ‘UK plc’ will need different GDPR compliance regimes before and after leaving Europe. 
The regulation will massively affect commerce, especially areas like e-commerce or manufacturing supply chains that regularly draw on and re-use multiple data sets. That’s because when two partners agree contracts next May, they will need to determine whether a contract involves consent from a citizen or data subject to the handling of their personal data – even if that data isn’t needed for the contract’s performance. But aren’t these doom-laden scenarios the latest over-reaction to the sort of Brussels bureaucracy that saw Britain decide to withdraw? No – for three main reasons. First, the GDPR will affect organisations’ day-to-day operations, since organisations operating in the EU will become liable for managing all the unstructured personal data held on their networks and in the cloud – a big challenge. Second, Britain will still need a close imitation of the GDPR to keep trading with European partners; Government ministers, the technology sector and legal commentators broadly agree on this. Third, GDPR’s compliance and penalty clauses are harsh. Non-compliant organisations could be fined up to 4% of their turnover. How can companies deal with such a far-reaching regulation? Unsurprisingly, boards will take a strategic approach, designing compliance frameworks, reviewing privacy standards and involving all employees. But policy-led approaches quickly run up against the continuing phenomenon of the 21st century Internet: explosive growth in required data processing levels. As companies embrace cloud-based processes, bring-your-own-device (BYOD) programmes and social networks, IT teams have little or no visibility of the extent of their data assets - or their final uses. Practical GDPR planning begins with finding out who does what to your organisation’s data and when it leaves your jurisdiction. The GDPR compliance picture is evolving quickly, but we can set out broad principles for boards, CIOs and security professionals to build a compliance framework that keeps core business systems flowing: boards must be responsible for systems that will meet data subjects’ future requests under GDPR, such as the right to be forgotten or access to copies of relevant data held about them; organisations must start to design data security into products or services by default; UK companies must design data security and auditing processes that notify stakeholders of a data breach – and make suppliers document their own information security processes; companies over 250 employees, or whose operations are based on data handling, will need a Data Protection Officer to scrutinise their IT processes, data security and privacy systems; and boards must operate Data Protection Assessments (DPAs) and train their IT and security personnel, as a starting point, on compliance. No team of IT suppliers currently provides a complete GDPR compliance solution, but suppliers such as Cloud Access Security Brokers (CASBs) are making breakthroughs, such as integrating corporate network and application monitoring systems (achieving data and application visibility) and implementing enterprise-wide sanctioning and control of IT applications. Encouragingly, these cloud services and hardware technologies will transform organisations’ processing and network monitoring power – with those services becoming readily available to CIOs as flexible, managed services. We’re counting down those 425 days, but smart organisations are already beefing up network monitoring and data processing.
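To make two of the data-subject rights mentioned above – access to a copy of the data held, and erasure under the right to be forgotten – slightly more concrete, here is a deliberately minimal sketch. It is illustrative only: the `DataInventory` class, record fields and identifiers are hypothetical, and a real compliance programme would have to reach every database, SaaS application and backup in which personal data lives.

```python
# Illustrative only: a toy in-memory "data inventory" showing the two GDPR
# requests discussed above -- subject access (export) and erasure ("right to
# be forgotten"). Real systems span databases, SaaS apps, logs and backups.
from dataclasses import dataclass, field
from typing import Dict, List


@dataclass
class PersonalRecord:
    system: str          # e.g. "crm", "helpdesk", "marketing"
    category: str        # e.g. "contact details", "support tickets"
    location: str        # e.g. "eu-west-1", "on-premises"
    payload: dict = field(default_factory=dict)


@dataclass
class DataInventory:
    records: Dict[str, List[PersonalRecord]] = field(default_factory=dict)

    def register(self, subject_id: str, record: PersonalRecord) -> None:
        """Note where personal data about a subject is held."""
        self.records.setdefault(subject_id, []).append(record)

    def subject_access_request(self, subject_id: str) -> List[PersonalRecord]:
        """Return everything held on a data subject (an Article 15-style export)."""
        return list(self.records.get(subject_id, []))

    def erase(self, subject_id: str) -> int:
        """'Right to be forgotten': drop all records and report how many went."""
        return len(self.records.pop(subject_id, []))


if __name__ == "__main__":
    inventory = DataInventory()
    inventory.register("cust-42", PersonalRecord("crm", "contact details", "eu-west-1"))
    inventory.register("cust-42", PersonalRecord("helpdesk", "support tickets", "on-premises"))

    print(len(inventory.subject_access_request("cust-42")), "records held")
    print(inventory.erase("cust-42"), "records erased")
```

The hard part in practice is not the export or delete call itself but knowing that every system holding personal data is registered in the first place – which is exactly the visibility problem described above.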
Companies that questioned the wisdom of adopting cloud on an enterprise-wide scale or putting sensitive data in the cloud may find that the GDPR’s sheer scope will force them to accept sophisticated cloud services unequivocally in the future. Teneo is a specialist integrator of next-generation technologies. ### Information Management - #CloudTalks with Entity Group’s Steve Parry In this episode of #CloudTalks, Compare the Cloud speaks with Steve Parry from Entity Group about information management, the opportunity and obligation that organisations hold around data, and how offering advisory services helps design a 'roadmap' for integrating the existing products within the market. #CLOUDTALKS Visit Entity Group to find out more: entitygroup.com/ ### How to choose a work collaboration tool for your virtual team When it comes to working in virtual teams, using the right productivity and work collaboration tool can be both a blessing and a curse. Armed with all kinds of devices and software solutions, teams remain challenged by complex workflows, tight deadlines, and geographical boundaries. Here are some ways technology is holding teams back and what your virtual team can do about it. Overreliance on traditional tools Despite the rapid development in everything from instantly accessible SaaS solutions to mobile devices, organisations continue to rely on outdated, disparate tools for project and team collaboration. Surveys have shown that there is a lot of waste due to poor work collaboration. What could you accomplish if your team could get 20 working days back per year? [easy-tweet tweet="The average team member uses about five different tools to manage projects." hashtags="Virtual, software"] Some hurdles are more obvious than others. For example, while email has a role in connecting teams, it can cause headaches and concerns about missing important information. The drawbacks of traditional solutions, such as email and spreadsheets, are evident in the work collaboration obstacles: trouble accessing up-to-date information, lack of workload visibility, and confusing cross-functional team collaboration. Doing more with less, rather than less with more Newer tools are not necessarily a solution to all problems. In fact, in a recent survey of 200 business professionals who manage or participate in projects, respondents also cited the use of different tools as a top work collaboration challenge. About 40 percent of those surveyed said they have tried various online work collaboration solutions, but most provide only some of the capabilities needed to work together effectively. Indeed, the average team member uses about five different tools to manage projects. One-third also revealed that no one in their company uses the same tools. This is the Bring Your Own Software dilemma. The IT department doesn’t have the same control in larger organisations as it once had, and from small teams to large enterprises, people’s adoption of a new messaging or productivity tool is just a click away. It becomes a vicious cycle: with team members using too many different tools, no one can work together effectively. As a result, people turn back to email. Productivity app requirements for teams If traditional software solutions aren’t dynamic enough, and the latest enterprise messaging apps don’t mesh well with virtual teams, then what exactly do teams need to successfully work together?
The professionals in the survey mentioned above were clear on four requirements: reliable, cloud-based document sharing with version control and iteration; planning, scheduling, charting, and milestone tracking capabilities; workload and workflow visualisation, for the team and the individual; and all in one place – one platform to access all crucial tools. Here’s what to look out for. Document sharing Teams must be able to share the latest documents with all colleagues, whether they are in the next room or a continent away. Relying on email and internal document repositories often spirals into a never-ending search for the most up-to-date data and documents. External team members, such as consultants, vendors, etc., can sometimes find it difficult to access and navigate these sources. Cloud-based document systems alleviate the challenge of sharing information, but often fall short in key project management areas. Look for applications that are available to everyone on the team and offer full version control, locking, reviews, and relation back to project activities and tasks. Collaborative planning and scheduling The ability to plan visually is a way to get virtual teams on the same page. In traditional project management, Gantt charts are used to plan and track projects using classic constructs such as activities, milestones, dependencies, and due dates. Seek solutions that incorporate Gantt charts digitally and enable the distributed team to view and update them at any time. Task visualisation Whether you are a traditional or an accidental project manager, one way to complement the planning done via Gantt charting is with Kanban boards. Resembling Post-it® notes, Kanban cards represent tasks that individual project members are working on at any given time. Placed in columns, these cards represent the workflows of projects and provide an at-a-glance view into the work of each individual team member. This helps with workload management by identifying who is busy and who is free for more tasks (a minimal sketch of this workload view follows below). Some applications offer the ability to add documents, hyperlinks, and comments to cards for more interactive team collaboration. Combined with Gantt charting, Kanban boards can provide a unified view into the progress of overall activities and milestones. A virtual office A central, cloud-based work and project management solution can facilitate effective collaboration among virtual teams. This mitigates the security concerns some companies have about integrating their existing infrastructure with external organisations. It also avoids the headaches and expense of building and hosting solutions that may not be conducive to working with outside parties. By establishing virtual offices online, distributed teams can collaborate, communicate, and stay up-to-date regardless of their whereabouts. Combined with the right tools, the team has one place to communicate directly with anyone, keep current on project deliverables, access and update project documents, and ensure work gets done. In addition, the most capable platforms also offer out-of-the-box integrations with third-party solutions. Services like Zapier or IFTTT make it possible for you, or your team members, to continue to use your single-purpose app of choice. These productivity solutions represent the needs of modern, innovative teams that work across boundaries and require transparency into the team’s progress, bottlenecks, opportunities, and outcomes. A good place to start is to audit and normalise all team technology in light of these four requirements.
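As promised above, here is a minimal sketch of the Kanban workload idea: cards grouped into columns, plus a simple per-person summary of who is busy and who has capacity. The board contents, column names and the "busy" threshold are invented purely for illustration; any real tool would add due dates, comments and attachments.

```python
# A minimal sketch of a Kanban-style workload view: cards live in columns,
# and counting open cards per assignee gives the at-a-glance picture of who
# is busy and who is free for more tasks.
from collections import Counter

board = {
    "To do":       [("Draft launch email", "Priya"), ("Fix signup bug", "Tom")],
    "In progress": [("Q3 campaign plan", "Priya"), ("API integration", "Sara")],
    "Done":        [("Kick-off deck", "Tom")],
}

def workload(board, active_columns=("To do", "In progress")):
    """Count open cards per assignee, ignoring completed columns."""
    counts = Counter()
    for column in active_columns:
        for _task, assignee in board.get(column, []):
            counts[assignee] += 1
    return counts

if __name__ == "__main__":
    for person, open_cards in workload(board).most_common():
        status = "busy" if open_cards >= 2 else "has capacity"
        print(f"{person}: {open_cards} open card(s) - {status}")
```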
Every tool should be easy for remote staff to access and use. Once a tool has been chosen, the only way to ensure success is for the team to use it consistently. By having the right capabilities available in one virtual, secure workspace, teams can elevate their productivity and increase work collaboration. ### Building a Cloud - #CloudTalks with Agenor’s Jason Forsyth In this episode of #CloudTalks, Compare the Cloud speaks with Jason Forsyth from Agenor about their software development programs and the services that build from there. A team is built around what the customer needs, making it not an individual resource but a complete delivery team. #CLOUDTALKS Visit Agenor to find out more: agenor.co.uk/ ### 5 Ways IT Can Lead the Enterprise Agile Transformation The adoption of Agile as a work management methodology outside of IT is accelerating and shows no signs of slowing down. Particularly for marketing and product development, it’s easy to see why: teams that have embraced Agile methods have seen tremendous improvement in achieving their business goals. Of course, IT is already well aware of the benefits of Agile. Advantages like faster time to market, increased adaptability, massive productivity improvements and the ability to deliver a more customer-centric product have made Agile the de facto standard among development teams. As other business units see these advantages, it’s only natural that they want in on the secret. [easy-tweet tweet=" “Going Agile” has benefits beyond the IT function that can benefit the overall business’s success" hashtags="Agile, IT"] As the Agile experts within most organisations, IT has a tremendous opportunity to champion Agile adoption and mentor other teams in implementing it the right way. By lending their expertise, IT can help ensure a smooth transition and demonstrate their strategic value to the rest of the organisation. Here are five ways IT can lead the Agile charge: Distinguish agile from Agile. The word “agile” is already part of the vernacular in other departments, but likely not in the uppercase “A” form. Almost every team strives to be lowercase “a” agile: nimble and able to respond to opportunities or challenges quickly. However, uppercase “A” Agile is not an adjective. It’s a specific methodology with an established set of principles and practices. The most important thing for IT to do when mentoring other teams is to make that distinction very clear. Otherwise, IT will be talking about apples, but what other teams hear sounds more like oranges. Ask questions and listen. Because Agile has a very distinct definition, some teams may not fully understand exactly what they’re asking for. Start by asking questions like, “Why do you want to adopt Agile?” and “What are you trying to achieve?” The answers might range from “because my boss told me to,” to “we’re losing market share.” When IT can fully understand the challenges or goals of other teams—and what’s stopping them from achieving those today—it puts IT in a much better position to recommend ideas and help with Agile implementation. Don’t force Agile on them. Recognise that while Jira or some other scrum tool might be your panacea, it may not work in the context of other teams. As a mentor, it’s IT’s job to be an advocate for its mentees in the enterprise and help them to find solutions that work, regardless of whether they fit the dogmatic Agile approach. After all, you’re ultimately in this together.
Forcing a team to adopt a tool that won’t work could set them (and by default, you) up for failure. After asking some questions to understand the issues, you may find that Agile is not the answer. That’s valuable insight, too. If it likely won’t solve the challenges, save yourself and the team a tremendous amount of time, effort and upheaval in forcing through a new system of work that, well, won’t work for them. Start simple. As the existing Agile authority, it might be tempting for IT to take an authoritative approach, insisting that Agile is an all-or-nothing proposition: you either adopt the entire Agile Manifesto, or you just don’t do Agile. But the reality is that approach doesn’t always work outside of IT (or inside IT for that matter, depending on your context). It’s OK to be semi-Agile. Perhaps start with implementing a Kanban board that uses visual cues to help organise and prioritise work. This simple, low-investment approach allows the team to get comfortable with a specific Agile tactic, and sets them up for a relatively quick, easy win, allowing them to reap some immediate productivity benefits rather than suffering through a six-month or longer implementation process before they ever see any results. Make Agile accessible. IT can offer tremendous value in coaching teams in the software/technological aspect of Agile by providing access to the tools that make executing it possible. But there’s also business value to unlock. IT can help other teams understand the importance of the newfound visibility into productivity, capacity, etc. In most organisations, IT has become an expert at measuring productivity and throughput, justifying additional capacity investments with data and demonstrating ROI. Marketing, product development and other teams can also benefit from these insights. IT can show mentees how to use Agile to demonstrate business value and influence business decisions. As the Agile experts, IT has an opportunity to mentor other teams in the organisation in adopting this high-productivity, customer-centric approach to work management. There’s no doubt that “going Agile” has benefits beyond the IT function that can benefit the overall business’s success. But IT also has a responsibility to do it in a way that makes sense within the context of the teams’ work, needs and challenges. While it may not be added into the Scrum plan as a sprint, spending time in this mentoring capacity can certainly yield benefits for both IT and the entire organisation. ### IBM Maximo - #CloudTalks with BPD Zenith’s George Lightfoot In this episode of #CloudTalks, Compare the Cloud speaks with George Lightfoot from BPD Zenith. Having worked with IBM Maximo for over 20 years, BPD Zenith aims to be able to predict when things will fail. #CLOUDTALKS Visit BPD Zenith to find out more: bpdzenith.com/ ### Your time, your data and what’s it worth? With tech reinventing itself monthly, one of my favourite statements of late is "I may be old but I'm not obsolete" (I think that applies to many of us on a personal level too!). Great ideas and cutting-edge products save us the most sought-after commodity in the world: time. This may seem an obvious tenet, but when I caught up with a company that I've been watching for a while which proposes to do just that, I was genuinely excited! Everyone needs more time, whether it's for family, for ourselves or indeed to be more productive at work!
Technically speaking there are many things that make our lives easier; applications that do the work for us when we want that online purchase, be it a book, a taxi, a holiday, or even a partner. But let's spare a thought for how native cloud apps, data movement and setup, or even a lower cost of ownership can be achieved by making the leap from on-premise IT infrastructure to public cloud. There have been a few large players who've pulled out of the public cloud space, mostly for a good reason. However, it does leave only a handful of giants that can actually service your needs properly, securely and effectively. Two main providers to mention that I'm sure you know, AWS (Amazon Web Services) and MS Azure, are dominant players that simply make it easy for you to be a Cloud resident. Both companies are extremely well established, and the tech just keeps getting better and better with more inclusions. [easy-tweet tweet="Once in while a company comes along that presents a different and interesting mindset. " hashtags="cloud, data"] But let's spare a thought about where your technical landscape is and how you change it first. You know that migrating to a Cloud is the right decision, but what about the legacy? What about the migration plan? What about the rigorous testing that you have to go through even before you have a concrete list of services to move or that can be moved? Nightmare! Well, there are loads of tools out there, I hear you say, and yes there are many ways that you can copy, monitor and develop a Cloud strategy set for migration. But once in a while a company comes along that presents a different and interesting mindset. It does all the above and saves you that valuable commodity, releasing staff and resources to be used more effectively. I caught up with Velostrata a few weeks ago and was given an update from the product marketing guys. I was impressed with the developments following on from an interview we had at last year's Cloud Expo. Velostrata's software allows you to move workloads from your existing IT Estate to the Public Cloud in minutes, while controlling and automating where your storage resides – on-premise, in the Cloud, or a combination of both. From what I can see, Velostrata is a unique and effective solution that addresses migration of both computing and storage without the risk! They cater for Public Cloud big boys such as MS Azure and AWS, with some amazing testimonials, and because of the way they stream your data you can have applications running in minutes – without waiting for the data to be moved first or managing complicated data synchronisation for stateful applications. Because migrations can happen far more quickly, you can reduce both man-hours and infrastructure costs. I wish I had access to this when I was responsible for financial IT Estates years back; it would have made my job a lot easier while ensuring both my peers (and my boss) were impressed! In addition to all the above, you also have the chance to test your migration in an R&D environment without affecting your production environment – and it's also agentless!
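To see why "running in minutes" rather than "after the copy finishes" matters, here is a rough back-of-envelope calculation. The figures – 20 TB of data, a 500 Mbps link, 70% usable throughput – are assumptions chosen purely for illustration, not anything claimed by Velostrata or any other vendor.

```python
# Back-of-envelope only, with assumed figures: how long a traditional
# "copy everything first" migration takes, which is the wait a streaming
# approach aims to avoid.
def bulk_copy_hours(data_tb: float, link_mbps: float, efficiency: float = 0.7) -> float:
    """Naive transfer time for a full data copy over a WAN link."""
    megabits = data_tb * 8 * 1_000_000        # TB -> megabits (decimal units)
    usable_mbps = link_mbps * efficiency      # protocol overhead, contention, etc.
    return megabits / usable_mbps / 3600

if __name__ == "__main__":
    hours = bulk_copy_hours(data_tb=20, link_mbps=500)
    print(f"Copying 20 TB over a 500 Mbps link: roughly {hours:.0f} hours "
          f"({hours / 24:.1f} days) before anything can run in the cloud")
```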
If you are thinking of changing your on-premise IT estate or cloud setup (and even if you're not, it's worth looking at what a market leader is doing!), this company is worth more than a casual look. ### Leveraging Technology - #CloudTalks with Eurotech's David Parker In this episode of #CloudTalks, Compare the Cloud speaks with David Parker from Eurotech about how, after 24 years in the business, the company specialises in solutions for the oil, gas and energy sectors. With massive take-up of cloud by the public, it can become difficult for Tier 2 and 3 suppliers to compete against AWS, Azure and other suppliers. While it cannot always match them on price, Eurotech can offer tailored services to the consumer. #CLOUDTALKS Visit Eurotech to find out more: https://www.eurotech.com/en/ ### Oops my cloud just went down! Burn baby burn… So I am sure you have all been aware of the AWS S3 outage by now, and the various comments on Twitter - #awsoutage. If you have a website, chances are you have been or were affected in some way, shape or form. A part of Amazon Web Services' infrastructure had issues (S3), which is in essence the part of Public Cloud infrastructure that hosts pictures, videos, images and files/storage for a vast number of services you currently use (Airbnb, SoundCloud and Slack to name a few). [easy-tweet tweet="We need to rethink the way we consume cloud services" user="neilcattermull" hashtags="Cloud, AWS"] Public Cloud is a very difficult model to provide as a service when you consider the reasons for taking it in the first place. Public Cloud is a commodity for technical infrastructure – it has to be inexpensive to win your business – and that translates into an inexpensive approach to resiliency to keep the costs down. Public Cloud needs to be flexible – catering for a multitude of storage variables, file types and bespoke code, and having the ability to grow quickly – and it is very hard to cater for absolutely every permutation of provisioning safely and securely at a price that's competitive. Being competitive on price always comes with an operational cost, and this recent outage at AWS has demonstrated this. Even the webpage that is provided by AWS to monitor the systems and locations of the service was hosted on the same infrastructure that went out of service. There have been a few very large IT providers that have simply pulled out of providing Public Cloud infrastructure provisioning, as they see this as fool's gold. To provide a resilient, secure, multi-tenanted and flexible service is a very hard task in the best of circumstances, and adding in the low-cost element simply makes it a business model that just isn't viable (commercially). We need to rethink the way we consume cloud services, so the question is not just "how much does it cost?" but also "what happens if the service goes down?" (and it will). ### Ensure your Office 365 migration runs smoothly, and keeps your users and data secure along the way Cloud computing can mean big changes for small to medium enterprises. From small start-up businesses, to established small and medium-sized enterprises, an increasing number of organisations are becoming aware of the potential benefits of migrating their email and collaborative applications to the cloud. Many of these enterprises are looking to Microsoft’s Office 365 portfolio to remove the burden of managing these applications and services in-house, whilst realising the cost-savings of their subscription model and powering flexible working for their employees.
However, for most companies, migrating apps and services - particularly email systems - can result in a long, complex and expensive project if the migration strategy is not planned properly or executed using purpose-built migration tools. A poorly managed cloud migration process can result in prolonged downtime and disruption for employees, reduced productivity and, in the worst scenario, critical business data and information being put at risk. [easy-tweet tweet="To minimise potential issues, careful planning prior to migration is essential." hashtags="cloud, office365"] While these risks may deter many smaller businesses from moving to Office 365, disruption can be mitigated and disasters averted if organisations pay careful attention to each stage of the migration process. From preparing and gaining insight beforehand, to automating key migration processes during the transition and on to managing the new environment securely afterwards, businesses can enjoy a smooth cloud adoption experience. In this article, I'll share some tips on how to migrate to Microsoft Office 365 without any major headaches. Before Office 365 migration A major concern for any organisation is the downtime associated with the migration to Office 365, as time spent unproductively can cost the business income. To minimise potential issues, careful planning prior to migration is essential. Businesses should conduct an in-depth pre-migration assessment including an inventory of all users, mailboxes and apps. This will enable IT teams to determine specific migration requirements and estimate projected costs before launching into the process. Migrating to Office 365 provides many small to medium-sized enterprises with the ability to scale the number of licences needed based on the number of people being employed at any given time. Careful assessments can determine the exact Office 365 licensing costs and leverage the advantage of a flexible subscription-based model, which is a fundamental advantage for smaller businesses. Planning ahead for Office 365 migration also helps identify what within the business does not need to be migrated to the new environment. During the planning process, businesses should take the opportunity to archive and, importantly, back up all on-premises data, not only for peace of mind but to allow for recovery in minutes during migration. During Office 365 migration During the migration process, small to medium-sized businesses should also take advantage of available software tools to help speed up and simplify migration to Office 365. With key migration processes scheduled and automated by specialised software, the transition is streamlined and IT teams’ work time (not to mention working costs) is dramatically reduced. Migration tools also allow IT departments to monitor the process through regular status reports, and are useful for business owners who will be keen to know that the Office 365 migration is running smoothly. Most importantly, migration software tools ensure that employees maintain productivity and any accidental data loss during migration can be recovered. This is facilitated by the temporary coexistence between existing legacy systems and the new Office 365 environment whilst the process is being conducted. After Office 365 migration Businesses must be able to stay in control of the new environment. Here software tools and automation can again be helpful to simplify users’ licences across Office 365 and offer a single sign-on whether on premises or in the cloud.
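As a rough illustration of the pre-migration assessment described above, the sketch below summarises a hypothetical mailbox export: how much data there is to move, which accounts look dormant enough to archive rather than migrate, and a ballpark licence cost. The sample data, column names, staleness threshold and per-licence price are all assumptions for the example, not real Office 365 figures.

```python
# Illustrative pre-migration assessment sketch. The inline sample data and
# its columns (user, mailbox_gb, days_since_last_login) are hypothetical --
# in practice this would come from an export of the existing mail platform.
import csv
import io

SAMPLE_EXPORT = """user,mailbox_gb,days_since_last_login
alice@example.com,4.2,3
bob@example.com,11.7,210
carol@example.com,2.9,14
"""

STALE_AFTER_DAYS = 90          # assumption: archive rather than migrate these
PRICE_PER_LICENCE_GBP = 10.0   # assumption: illustrative per-user monthly cost


def assess(export_csv: str) -> dict:
    """Summarise what needs migrating, what can be archived, and a rough cost."""
    active, stale, total_gb = 0, 0, 0.0
    for row in csv.DictReader(io.StringIO(export_csv)):
        total_gb += float(row["mailbox_gb"])
        if int(row["days_since_last_login"]) > STALE_AFTER_DAYS:
            stale += 1
        else:
            active += 1
    return {
        "mailboxes_to_migrate": active,
        "mailboxes_to_archive": stale,
        "total_data_gb": round(total_gb, 1),
        "estimated_monthly_licence_cost_gbp": active * PRICE_PER_LICENCE_GBP,
    }


if __name__ == "__main__":
    for key, value in assess(SAMPLE_EXPORT).items():
        print(f"{key}: {value}")
```

Even a simple summary like this gives the business a defensible estimate of scope and cost before any mailbox is touched, which is the whole point of the planning stage.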
SMEs should also ensure their employees are accessing Office 365 within a secure environment and make sure that staff understand their responsibilities too. The way to achieve this, without any headaches, is usually for SMEs to include the applications within an existing unified authentication and authorisation strategy, rather than building dedicated security around Office 365. One small business that utilised Office 365 migration solutions for SharePoint is electronics company Instron Corporation, which has European headquarters based in Buckinghamshire. Over the course of the deployment the team migrated over 250 gigabytes of content to Office 365 with zero downtime or impact for end users. Prior to migration, Instron Corporation was feeling the constraints and limitations of its native tools. With very limited resources internally, and a lack of experience to manage the migration to Office 365 confidently, the risk of lost data, downtime and disruptions for employees was a real concern. Utilising SharePoint migration software, Instron Corporation was able to realise the benefits of migrating to the cloud, with zero impact on end users. Smaller businesses or those with limited IT capacity need not be afraid of making the jump to Office 365 and transitioning their messaging and collaboration functionality to the cloud. With well-informed preparation and dedicated software tools, IT teams and the wider employee base can experience a zero-impact migration and get on with growing their business in the cloud era. ### Predictive Analytical Technology - #CloudTalks with Smart Vision’s Berni Simmons In this episode of #CloudTalks, Compare the Cloud speaks with Berni Simmons from Smart Vision about helping customers make more intelligent use of their data to improve their business operations through the use of advanced and predictive technology. The key for Smart Vision is using this advanced modelling and analytics effectively in their organisation. #CLOUDTALKS Visit Smart Vision to find out more: sv-europe.com/ ### Cloud vs. Dedicated Hosting: What is Best for your Business? Results of a poll conducted on cloud adoption for 2016 suggest that over 4 out of 5 businesses – roughly 84% of enterprises in the UK – use at least one service in the Cloud. This is a healthy adoption rate that has climbed steadily, from 40% in 2010 to 78% in 2014. [easy-tweet tweet="Is cloud hosting the best way to go or is dedicated hosting also a viable option?" hashtags="Cloud, hosting"] This trend provides valuable insight that the Cloud helps business organisations achieve their goals and objectives and manage their workloads. Moreover, more than 50% of respondents in the polls stated that they will move their business entirely to the Cloud at some point. But, the question remains - is cloud hosting the best way to go or is dedicated hosting also a viable option? We examine the advantages and disadvantages of different hosting options for your business. Advantages of cloud hosting No physical infrastructure The primary edge of cloud hosting is that there is no need for physical hardware at all. This means you don’t have to shell out capital to buy physical servers nor are you faced with finding space for equipment. Cloud users benefit from economies of scale as the cost of procurement is shared among subscribers. Pay only for what you need When your data collection, storage and protection are in the cloud, you only pay for the volume of server space that you use.
This eliminates paying for idle time which can be used by the company for other lucrative activities. Flexibility As your needs and requirements grow, you have the flexibility to scale up your resources. For example, you can upgrade RAM or disk space easily with a few clicks.  Software integration is also automatic. Accessibility Accessibility is also a main benefit of using cloud hosting as anyone in your firm can work anywhere, anytime. The ability to pull out and use data on the go increases levels of productivity that would not otherwise be possible with a dedicated server. Backup and recovery It is also easier to store, backup and restore data in the cloud than on a physical device. Disadvantages of cloud hosting Data breach Information in the cloud is vulnerable to data breaches since you do not have full control of security. 70% of businesses who were polled in the Cloud study cited data security concerns. Uptime dependent In addition, you are highly dependent on uptime for your activities to run and if there is plenty of downtime, you lose valuable time and potential clients. Slow to no technical support Technical support is often slow and the waiting period is long, up to 48 hours. Pros of using dedicated servers Control of resources Full control of your resources is the main advantage of using a dedicated server for your business data. You can tailor it to your requirements as you see fit. Full security compliance If you have extremely sensitive data that you want to protect, you can build in stringent security measures and avoid third party infringement. In the long run, it can become a cost-effective solution for companies that do not rely on uptime to fully function. Not dependent on the Internet Since you are not dependent on the Internet to access data, you can just go to the office and start working. Cons of dedicated servers Capital expenditures It requires capital outlay to pay for hardware and infrastructure. Requires space You will also need to assign space in a room as well as hire IT support. Less flexibility If your data requirements change, it is not that quick or easy to upgrade RAM or CPU and is also dependent on vendors’ supply and technical support. The Best Approach To maximise the benefits of both cloud and dedicated hosting, using a hybrid cloud may be a feasible solution. It consists of a private cloud (hosting where only one organization has access) and a public cloud (shared by other users on the network). Typically, sensitive data is in the private cloud while general information can be stored in the public cloud. This seems to be a prevalent trend with a majority of businesses affirming that they will be working in a hybrid IT environment now or sometime in the future. While it is true that no model is a perfect fit for all businesses, companies that find the right balance between private and public cloud will have an edge over their competitors compared to those who solely use dedicated servers. Overall, 56% of respondents in the Cloud Industry Forum Study said that Cloud services offer a ‘competitive advantage’ to their organisations while 22% foresee one coming. ### Modernising Legacy Apps - #CloudTalks with Flynet’s Andrew Bentinck and Christian Rule In this episode of #CloudTalks, Compare the Cloud speaks with Andrew Bentinck and Christian Rule from Flynet about how they are using tools to modernise Legacy applications. Making Flynet unique is the speed they achieve this in with the use of apps and in development. 
#CLOUDTALKS   Visit Flynet to find out more: flynetviewer.com/ ### NetApp’s Matt Watts on Threats to Data Matt Watts introduces Threats to Data hosted by NetApp. Compare the Cloud attended a private invite-only briefing provided by NetApp with some of their key partners. The general theme was based around how businesses are transforming with the emergence of SMAC – Social, Mobile, Analytics and Cloud. Various statistics were also presented that showed the audience that Data and the management of it is of the utmost importance to any industry, together with security. Further presentations followed with a deep dive into BaaS (Backup as a Service) and how this is helping businesses transform digitally whilst increasing availability and security. Data and time are two of the most valuable commodities that every business tries to manage and secure above all else, and NetApp, together with some key partners, demonstrated how they achieve optimal benefits for both. Compare the Cloud interviewed the speakers after the event and this is the first video to be released in a series of four; Laurence James, NEMEA Products, Alliances and Solutions Manager at NetApp, is up first, talking about Data Nuggets and the value of data in your organisation. ### 4 Ways The Cloud Can Make Moving Office Easier Moving office is a complex task and must be carefully managed to avoid costly mistakes. A poorly coordinated office move can have a significant impact on bottom-line sales, so it is important to dedicate enough resource to the project to ensure it runs smoothly. As a project manager, hiring a good removals company will minimise risk and ensure you have access to professional advice and support throughout your move. “Research and contact three moving companies that are at least accredited and approved by the British Association of Movers,” advises John Watson, Managing Director UK Services at Abels, a moving company specialising in UK and international removals. Hiring a removals company will give you the headspace needed to focus on the other important elements of an office move such as your IT infrastructure. Relocating computers, servers and other IT equipment can be difficult since you need to connect it all again at the new location. This is not always quick or easy to do and can result in downtime. A great way for SMEs to overcome this is to use cloud computing, which essentially moves your IT services to a remote location and uses that to store your data for you to access remotely. Let’s look at 4 ways the cloud can make your office move easier in 2017: Minimise downtime Cloud-based software means that you no longer need to shut down your services during an office move because you can easily stay connected with staff and customers. Ultimately, all that is required at the new office to enable your team to be working with minimal disruption is an internet connection and internet-enabled devices. Data storage, backups, operating systems, applications and emails are all managed and updated by the cloud provider. [easy-tweet tweet="Traditional IT systems are notorious for offering limited security and disaster recovery." hashtags="Cloud, tech"] Increased productivity If you have multiple people working on a project simultaneously, then cloud-based software like Google Drive ensures everyone who is invited to view a document can see changes in real-time and add to the most recent version.
With the team being able to access their data and applications from any location using their own devices, they could flexibly work from home or an alternative location whilst the move is taking place. This would ensure business continuity and increased productivity. It also means the team are able to still collaborate together without necessarily being in the same place. Reduce the risk of error Traditional IT systems are notorious for offering limited security and disaster recovery. Unsurprisingly, SMEs are finding it increasingly difficult to keep these systems up-to-date and secure. It is a big worry when moving offices to ensure all servers on which sensitive data is stored are relocated safely and backed up correctly, especially if back-ups are not performed regularly. The right cloud services remove this risk by ensuring data is stored and backed up regularly in secure ISO 27001 certified data centres. Scalability Cloud computing allows businesses to scale up or scale down their operation and storage needs quickly – it is very flexible. This means you only pay for the services you need and the number of users you have, depending on the size of the business. If you have relocated office due to plans for expansion then as the business grows, you can grow your cloud services with it. ### Node4’s Steve Denby on Application Workloads Steve Denby introduces Application Workloads from Node4 hosted by NetApp. Compare the Cloud attended a private invite-only briefing provided by NetApp with some of their key partners. The general theme was based around how businesses are transforming with the emergence of SMAC – Social, Mobile, Analytics and Cloud. Various statistics were also presented that showed the audience that Data and the management of it is of the utmost importance to any industry, together with security. Further presentations followed with a deep dive into BaaS (Backup as a Service) and how this is helping businesses transform digitally whilst increasing availability and security. Data and time are two of the most valuable commodities that every business tries to manage and secure above all else, and NetApp, together with some key partners, demonstrated how they achieve optimal benefits for both. Compare the Cloud interviewed the speakers after the event and this is the first video to be released in a series of four; Laurence James, NEMEA Products, Alliances and Solutions Manager at NetApp, is up first, talking about Data Nuggets and the value of data in your organisation. ### How can SMEs benefit from the cloud? By 2020, an estimated 78% of small businesses are expected to host their IT environments in the cloud*. Working smarter, not harder is becoming the goal for many small businesses. Business-changing technology has previously been restricted. The costs, as well as implementation, pushed it out of reach for many small business organisations. Small businesses are the backbone of the British economy, so it’s great that cloud technologies are finally enabling businesses to create a robust operational environment in which to grow. Personally, I think the decline of onerous, on-premise business platforms is probably a good thing for everyone. Today, there are all kinds of benefits that a small business expects to see when adopting cloud computing solutions.
Implementing cloud technologies, such as accounting, enterprise resource planning and customer relationship management combined with Office365, can help a small business drive revenue growth, enhance security, improve mobility and reduce costs. [easy-tweet tweet="Cloud solutions can offer reduced costs, as well as open and transparent costs." hashtags="Cloud, tech"] I’m not suggesting that SMB’s are new to the cloud (they have been using SaaS for quite some time in popular email solutions, project management and CRM). However, this is the era of seamlessly, joined-up ERP, marketing automation, office collaboration & productivity tools. I’m increasingly finding that our small and mid-sized business clients are seeing their revenue expand, as we support them to migrate their data and apps to cloud environments. It’s not by mistake that small businesses with growing revenue are 50% more likely to prefer cloud IT delivery*. The majority of our small businesses clients report security benefits since moving to the cloud and significant productivity benefits from cloud applications. Cloud solutions can offer reduced costs, as well as open and transparent costs. And, I believe that most businesses are reinvesting their cloud cost-savings back into their business. People can get over-focused on the technology, but the emphasis should be on what it does, what the organisational benefits are and what service is provided. Cloud brings fresh opportunities, like the emerging app store economy, where businesses can enhance, evolve and grow their capabilities through cloud-based, extendable apps. This allows small businesses to evolve and grow, quickly and on their terms. Speed of deployment, flexibility, scalability and versatility of consumption models are just a few of the many benefits of cloud computing. [easy-tweet tweet="For many small businesses, it is more practical and cost-efficient to buy cloud services" hashtags="cloud, tech"] It has been a common misconception that cloud technology was designed for big business only, when in fact it is a great leveller for small business. Technology solutions have reached a point that means small businesses can compete with their big business counterparts in terms of capability. For many small businesses, it is more practical and cost-efficient to buy cloud services as opposed to maintaining and periodically upgrading systems on-premise. An additional benefit is that cloud services are also automatically upgraded by the service provider. The cloud has made it easier for businesses to connect with their audience, cloud-based applications have simplified the process of integrating social media, web and email marketing campaigns. One of the most exciting aspects of cloud-based solutions is interoperability and intelligence. Intelligence brings real-world advantage with cloud solutions now effectively making every small business analytics driven. Cloud solutions are combining people, data, systems and processes for a real-time view of what’s going on. Ultimately, the cloud gives the small business workforce the flexibility to work anywhere and anytime. * IDC, Worldwide SMB Public IT Cloud Services 2014–2018 Forecast, Oct14     ### Daisy’s Nathan Marke on Software and Data as Currency Nathan Marke introduces software and data as currency hosted by NetApp. Compare the Cloud attended a private invite-only briefing provided by NetApp with some of their key partners. 
The general theme was based around how businesses are transforming with the emergence of SMAC – Social, Mobile, Analytics and Cloud. Various statistics were also presented, showing the audience that data – and the management of it – is of the utmost importance to every industry, alongside security. Further presentations followed with a deep dive into BaaS (Backup as a Service) and how this is helping businesses transform digitally whilst increasing availability and security. Data and time are two of the most valuable commodities that every business tries to manage and secure above all else, and NetApp, together with some key partners, demonstrated how they achieve optimal benefits for both. Compare the Cloud interviewed the speakers after the event and this is the first video to be released in a series of four: Laurence James, NEMEA Products, Alliances and Solutions Manager at NetApp, is up first, talking about Data Nuggets and the value of data in your organisation.

### How assuring performance in the cloud transition is critical to digital transformation

There's no doubt cloud computing is a key driving factor in the digital transformation (DX) impacting every single sector, from retail to financial services, manufacturing to heavy industry. Through the cloud, new digital-centric businesses are emerging that are more agile and innovative than anything seen before. Enterprises are having to worry about these new, agile competitors while simultaneously dealing with upheaval on a massive scale that they have never experienced before. The challenge is clear: disrupt or be disrupted. For long-established enterprises, however, this is particularly difficult. They're not like the new start-up competition; they have legacy systems and applications that cannot be virtualised and migrated to the cloud, and customers relying on the flawless and ceaseless delivery of their services. Furthermore, due to governance, risk, and compliance (GRC) requirements, some of their digital assets have to stay on-premises. Therefore, the road ahead is more complicated than simply migrating infrastructure and applications to the cloud.

The role of the CIO

With this in mind, the cloud transition, and DX as a whole, needs to be managed carefully to ensure the continuation of service and the best outcomes for the business. This responsibility has become that of the CIO, who must endeavour to maintain order and lay the foundations for the future. This is especially true when you consider how the pace of change is quickening and that at the centre of DX are a variety of new technologies that span the edge, core, and cloud of the service delivery infrastructure. These technologies are the foundation for the 'Pillars of Innovation', namely Big Data Analytics, Cloud & XaaS and IoT, among others. The DX journey is perpetual and depends on constant innovation implemented through new business services. These are delivered via the Pillars of Innovation, and the CIO is ideally positioned to oversee this.
Fundamentally, the right tool to assure successful digital transformation needs to be future-proof and able to:

- Manage complexity as it expands over time in hybrid cloud environments
- Scale to manage any number of services, users, and data
- Support the speed and agility necessary in highly competitive environments
- Visualise the information in the context of the monitored business services

If the transition is managed properly, the benefits of the move to the cloud are clear – enterprises will gain the ability to increase infrastructure capacity with no additional capital expenses and quickly deploy new services as mandated by business needs. However, it is important to consider that in the process, enterprises risk losing visibility and control over their data and the quality of service delivery. It is crucial that this risk is mitigated effectively.

The importance of service assurance

Enterprises must remember that successful cloud-based disruption is not only about delivering transformational customer and business services. It is about delivering them well. In the connected world, assuring the quality of the enterprise service delivery infrastructure, the applications that utilise it and all their respective interdependencies becomes a mission-critical business activity. As the business environment continues to evolve and competition increases, there is renewed pressure on IT to reap the benefits of external cloud services, but there is rarely a plan to assure performance and secure the data stored in the cloud beyond relying on the cloud providers. While cloud providers advertise that they have integrated control, management, and security of their respective cloud offerings, a better approach would be to trust but verify. Furthermore, cloud management solutions only monitor operational and performance metrics of cloud resources and applications. Cloud management does not analyse the interdependencies across service chain components such as network, compute, storage, databases, service enablers and applications. Last but not least, cloud management does not holistically monitor hybrid cloud and multi-vendor cloud environments. Therefore, if an organisation becomes dependent on the cloud to support key aspects of the business, then the CIO needs an alternative approach to achieve complete visibility across all hybrid cloud-based systems. The right approach to service assurance overcomes this challenge by providing holistic visibility across the entire service delivery infrastructure, from the wireless edge to the core and in the cloud. This is achieved by end-to-end instrumentation, continuous monitoring and analysis of the traffic data flowing over hybrid cloud networks. The analysis of the monitored data provides a real-time and historic view of business services and their infrastructure across virtual, physical and hybrid service delivery environments, enabling enterprises to spot and isolate any anomalies that may present a hindrance to business performance. Translating smart data into actionable insights in real time is of huge strategic value to the enterprise, both in terms of productivity and revenue. Cloud will play a central role in the DX transformation of the large majority – if not all – of enterprises over the next decade, and service assurance is crucial to the success of this transition.
The CIO must be able to confidently manage the quality of each new cloud service and application that is adopted across the business, and every new system must be accounted for and aligned with the overall DX cloud initiative and enterprise-wide DX strategy. With this in mind, it's clear how greater visibility has become essential to manage the quality of each new hybrid cloud service and application that is adopted across the business.

### Fintech explained with Blockchain as an example!

Through Fintech (Financial Technology), various technologies have emerged over the decades to take the guesswork out of financial decisions. Algorithmic trading and analytics (what we now know as Big Data analytics) have been employed in the financial markets for a very long time. Predictive analytics programs assist financial decisions by gathering many types of data over the long term and plotting graphs and charts based on previous trading history. This is one example of Fintech! There are many others that are part of the overall finance technology employed (settlement, with payment mechanisms that settle the trades purchased or sold, as another example). The current trend is towards a decentralisation of access, creating opportunities for all of the banking environments to interact in unprecedented ways. So, now that we've got a base definition of "FinTech", I should probably elaborate on exactly what it entails with a few examples:

- Digital currency – Bitcoin and others emerging as the decentralisation of currency transfer and distribution comes of age
- Peer-to-peer lending – matching lenders to borrowers directly
- Algorithmic asset management or algorithmic trading – Artificial Intelligence using information management techniques to predict future outcomes within financial trading (BlackRock and Two Sigma, hedge funds that hired two former top Google engineers)
- Crowdfunding – raising money/investment from a large number of individuals online
- Payments – specific forms of payment systems used to settle financial transactions for products in the equity, bond, currency, futures, derivatives and options markets, and to transfer funds between financial institutions, both domestically using clearing and real-time settlement systems and internationally using the SWIFT network
- Data collection – research and compliance purposes

All of the above build and implement technology that is used to make financial markets and systems more efficient. In my opinion, the next wave of advances within the "Fintech" space is coming from the above list. When we talk about digital currency, the word Bitcoin comes to mind, and this is a very hot topic at the moment. A decentralised infrastructure that governs/controls anonymous payments is a very difficult thing to build: it relies on systems and infrastructure that need rapid connectivity to each system in the chain and must stay in sync. Enter a method called Blockchain.

Digital Currency, associated infrastructure needed – The Blockchain method!

So what is Blockchain, I hear you ask? Well, it's basically made up of three concepts (elements): sharing, a set of rules for updating state via blocks, and a trust model for timestamping.
Essentially and originally set up to support Bitcoin, the cryptocurrency, blockchain is a way of sharing a set of rules via a sort of shared ledger system that gets timestamped and logged. It is based on the principle of trust (I know, an ironic notion in the context of banking), but it is also based on mathematical probability. If there are many "nodes" (basically software installations that are connected to the internet), the chance of all of the nodes getting out of sync or being altered is very small, so the main security is based on trust. With blockchain(s), the blocks contain details of the transactions, and as long as more than 50% of the resources/nodes are honest, the blockchain continues. This is also where the term Miners enters the room. Miners are nodes that do the computational work of confirming transactions and, in return, earn newly released bitcoins. Basically, you can have a mining program running on your own computers and, as long as you are connected to the internet, you have a chance of receiving bitcoins yourself in return. This is also why it takes approximately 10 minutes to complete a transaction with bitcoin on the blockchain: that is roughly how long the network of miners takes to confirm a new block. If you are lucky enough to be at the top of the blockchain, you will be awarded 25 bitcoins for assisting with the transaction. Now let's put this into context: if you win the blockchain lottery (that's what I call it), it will be quite lucrative. 1 Bitcoin = £369 (at the time of writing), so 25 bitcoins = £9,225! And if you have an extremely high-powered computer network of miners, you stand a better chance of being at the top of the blockchain. It's interesting to see that we have to create incentives to decentralise the control of the currency; however, this commission/lottery-ticket value will go down to 12.5 bitcoins per block (still roughly every 10 minutes) around October time, as the protocol automatically halves the block reward. I'm not sure what effect this will have on the currency, but one thing is for sure: you will need an extremely fast network of compute, power and processing to stand a chance of winning!

Peer-to-peer lending

Essentially, peer-to-peer lending is a marketplace that directly matches lenders to borrowers via an online platform. In this way a fee or commission is made, a sort of finder's fee if you like, that is usually taken from both the lender and the borrower. Credit scoring is key to the process, and quite often other third parties are linked in the chain. With this said, it is a very effective way of getting a loan, and compared with a bank taking on the role of lending money, no capital is needed and there is little or no liability.

Algorithmic asset management or algorithmic trading

This has been around for quite a while; however, with developments in connectivity and execution speed, it is now very competitive. Speaks for itself really.

Crowdfunding

A new way to raise capital for that cracking idea you have that will make you billions. There are many crowdfunding platforms out in the market that will allow people to donate/pay to own a piece of your success.

Payments

This is a massive subject and warrants its own document really. However, I will list out a few examples: PayPal, Apple Pay and TransferWise are good examples of payment mechanisms that are widespread and familiar to you. Transferring money overseas has never been cheaper due to the emergence of the new technologies available. Open source software has been a massive catalyst for this, and startups are rife in this industry at the moment.
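Tying together the ledger, timestamping and chaining ideas described in the blockchain section above, here is a minimal, illustrative Python sketch of a hash-chained ledger. It is a toy model under simplified assumptions (the block fields, the reward figure and the two hard-coded transactions are made up for illustration), not Bitcoin's actual data structures, and it leaves out proof-of-work and network consensus entirely.

```python
import hashlib
import json
import time

def make_block(prev_hash: str, transactions: list, reward_btc: float = 25.0) -> dict:
    """Build a toy block that commits to the previous block's hash."""
    block = {
        "timestamp": time.time(),         # the 'trust model for timestamping'
        "transactions": transactions,     # details of the transactions in this block
        "miner_reward_btc": reward_btc,   # 25 BTC at the time of writing, halving to 12.5
        "prev_hash": prev_hash,           # the link that chains blocks together
    }
    payload = json.dumps(block, sort_keys=True).encode()
    block["hash"] = hashlib.sha256(payload).hexdigest()
    return block

def chain_is_valid(chain: list) -> bool:
    """A broken prev_hash link means the ledger has been altered somewhere."""
    return all(curr["prev_hash"] == prev["hash"] for prev, curr in zip(chain, chain[1:]))

# Two hypothetical transactions, purely for illustration.
genesis = make_block("0" * 64, [{"from": "alice", "to": "bob", "amount": 1.0}])
second = make_block(genesis["hash"], [{"from": "bob", "to": "carol", "amount": 0.4}])

print(chain_is_valid([genesis, second]))             # True
print(f"Reward at £369 per bitcoin: £{25 * 369:,}")  # £9,225 - the 'lottery win' above
```

The point the sketch demonstrates is the one made above: because every block commits to the hash of the previous one, quietly altering an old entry changes every later hash, which is why a majority of honest nodes can spot tampering.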
New and improved methods of authentication, and new and improved ways of quickly transferring and settling a monetary transaction, are springing up monthly. This extends to the financial trading world too: fast execution, fast settlement and cheaper overheads. This may be the reason why banks and others are trying to implement blockchain infrastructure but are concerned about the regulatory requirements for data protection and security.

Data Collection

Data collection and manipulation within the finance markets have been happening for decades. Stock market prices have been recorded daily and then fed into each bank's system of choice (banks even do this internally for their own in-house written apps and databases). Now we can take Big Data analytics to a new level and run rule-set algorithms (complex queries) against existing data so that the bank can accurately tell when you may require any other product they want to sell you (pension, insurance, saving schemes etc.). This can also be cross-referenced against social media data and even take sentiment analysis into consideration. Imagine you post on Twitter/Facebook that you have just won on a lottery ticket; your bank can find this out and call you the same day offering a wonderful investment opportunity! This is available now and will increase tenfold in the future.

Fintech users and uses

Who uses fintech?

- B2B (business to business) – for banks, their business clients; this is a very large arena and will be broken down further later
- Banks as consumers – what technology do they consume?
- Consumers of banking technology – other banks or partners (broker-dealers, asset managers or portfolio managers)
- B2C for small businesses and consumers – tech that's used to link to banking systems, online payment mechanisms etc.

Banking specifically explained

Banking and finance is such a large market vertical. To really understand banking technology, you need to understand how the banking system works and the differing types that exist today. In essence, there are two main types of banks, Retail and Investment (possibly a third if you count central banks such as the Bank of England etc.). Industry models, types of financial firms and how they function – enter the Buy and Sell side (finance industry slang)!

The Sell Side

Broker-dealers. These act as an intermediary between the client and the market. They are regulated to perform trading activity and are paid commission for transacting business on behalf of their clients.

Research-driven broker-dealers. This type of broker-dealer conducts research on companies and sectors and advises clients on what investment choices they should make. In exchange for this advice, clients channel their trading activity through the broker-dealer, paying them for their research in dealing commission.

The Buy Side

Fund managers/institutions/insurance companies. This is the most important element of the market, in that it is this investment that underpins the market. By institutions we mean the pension funds and investment funds. This has traditionally been the largest trading block in the market.

Hedge funds. Hedge funds are a relatively new addition to the market. They attract vast sums of money from the institutions and adopt investment strategies that are higher risk and less transparent than normal.
They charge high fees and in return aim to deliver higher returns. Currently, they operate in a loose regulatory environment that allows them this freedom; this also includes private wealth management and retail. Note: this is trading by or on behalf of the public.

Buy side vs. sell side

Why do we have the distinction? The buy side deals with clients in the form of the public or public money (or, in the case of insurance companies, premiums to leverage), while the sell side deals with the market. Note: the aim of having the distinction is to make the entire process more transparent and to prevent illegal collusion between the market and investment managers or traders.

Bids and Offers

Bids and Offers is the accepted terminology applied to trades in the market.

| Client | Broker     | Market |
|--------|------------|--------|
| Sell   | Buy – Sell | Bid    |
| Buy    | Sell – Buy | Offer  |

The above table shows the chain of events from client to market in a transaction.

Retail banks

Retail banks are much easier to describe; we all have an account with one of them – Barclays, Lloyds, Santander etc. Salaries get paid into the various types of accounts held, from current accounts to savings (if you are lucky enough not to rely on a salary to pay the bills). Most retail banks use similar IT infrastructure to investment banks; in fact most are a hybrid of both these days, offering similar investment products, stock purchasing etc.

Investment Banks

Investment banks provide services to other banks and institutions (and also to corporates). Investment banks normally consist of trading (requiring trading and settlements technology), research on stocks (a research distribution mechanism, both writing and publishing) and mergers and acquisitions (technology that enables due diligence as well as corporate valuations). A typical investment bank would normally look something like this: a front office trading department – traders and brokers alike that fill orders based on the type of stocks and shares they predominantly focus on (yes, each bank can specialise in certain stocks and shares). Complex trading systems software is in use that allows the "order book" to be propagated and logged based on what financial instruments they are trading (foreign exchange, commodities – metals, energy etc.). They will also offer loans (millions) to firms that prefer debt as a means to raise money, manage pension funds and offer advice on investments and corporate acquisitions. So with all of the above services going on, what technical infrastructure is utilised? A front office trading system – this needs to be very resilient and fast, and is typically sourced from a major vendor in the market (e.g. Fidessa for equities). Database management is also very important: many banks have their own in-house developed infrastructure that they rely on, fed from market data providers that populate their own "trading database"; however, this is costly and can be very complicated due to the regulatory requirements imposed. As you can see, banks have complicated technical infrastructure (and I haven't even discussed the specific elements of the tech and systems). With the rise of open source and decentralised infrastructure concepts such as blockchain, I feel we may be some time away from productionised implementations of this in the large banks, although they are playing with the tech now.
### NetApp's Laurence James on Data Nuggets

Compare the Cloud attended a private invite-only briefing provided by NetApp with some of their key partners. The general theme was based around how businesses are transforming with the emergence of SMAC – Social, Mobile, Analytics and Cloud. Various statistics were also presented, showing the audience that data – and the management of it – is of the utmost importance to every industry, alongside security. Further presentations followed with a deep dive into BaaS (Backup as a Service) and how this is helping businesses transform digitally whilst increasing availability and security. Data and time are two of the most valuable commodities that every business tries to manage and secure above all else, and NetApp, together with some key partners, demonstrated how they achieve optimal benefits for both. Compare the Cloud interviewed the speakers after the event and this is the first video to be released in a series of four: Laurence James, NEMEA Products, Alliances and Solutions Manager at NetApp, is up first, talking about Data Nuggets and the value of data in your organisation.

### Head in the clouds

As a professional accountant, I still come across clients who are reluctant to use cloud-based systems for their accountancy and bookkeeping. And yet these same individuals are undoubtedly using the cloud in their daily lives - we all do! Every time we use an online purchasing system, or pay bills online, or look for information on the internet, we're almost certainly accessing information that's stored on the cloud. So why the reluctance to use this technology to make your business life slicker, easier and up to date? Here are the three main issues and what you can do to minimise the risk:

How secure is your data? Since your data is stored on the cloud provider's server, protection from the risk of unauthorised access to your data is outside your control. However, security on cloud-based systems can often be better than what you achieve through the firewalls and other protection systems you use for your non-cloud-based systems, because cloud providers are able to use economies of scale, devoting resources to security beyond what you may be able to afford. For a system to be trusted, the provider can't afford too many breaches!

Can your data be altered? Since the cloud provider can of course access the data stored on the cloud, there is the possibility that information could be accidentally altered or deleted. However, you should be able to encrypt data to prevent unauthorised access, and it's also essential to know under what circumstances the provider can share information with other parties (e.g. if required by law). Read your provider agreement!

Who owns the data on the cloud? Again - read the provider agreement. If legal ownership isn't covered, you need to make enquiries to check that you retain ownership of the information you store. You don't want anyone else to profit from your business information! And you also need to check that your cloud computing complies with data protection and any regulatory requirements.

So, now we've looked at the main risks, what benefit do you get from using cloud-based accountancy systems? Cloud accounting is just like using accounting software installed on your own computer, except that you access it using your web browser, over the Internet.
The cloud provider installs and manages the infrastructure and platforms that run the applications, and the cloud users (you) access the software through the cloud clients (your service provider). So what are the advantages?

- Effective outsourcing of hardware and software maintenance and support, which saves costs
- Maintenance is easier and more cost-effective because the applications are not stored on each individual user's computer
- You can access your information anytime, from anywhere
- Scaling up or down is much easier than upgrading your own system
- No new software is needed to incorporate updates
- Multiple users can work on the same data at the same time
- Information is entered only once, no matter what the source
- You don't need to arrange for off-site backup
- Resources and costs of a cloud-based system are shared between many users, so there are economies of scale, plus infrastructure can be based in low-cost areas
- You only access a cloud-based system when you need to, and the cloud provider will arrange systems to cover peak-load times

As you may have gathered, I'm a keen advocate of using cloud-based systems. Why? Because they work, and the advantages greatly outweigh the disadvantages. It's a great way to maximise efficiency and minimise cost. I encourage my clients to use a system called Xero, which is a cloud-based accountancy system developed especially for use by small and medium-sized businesses. Clients upload their bookkeeping onto the cloud instead of maintaining it on an individual computer system. This enables us, as their accountants, to access the information directly for the preparation of management and final accounts, VAT returns and the provision of business advice. But the system also provides a complete overview of the financials of the business, in real time, and automated feeds import all bank account and credit card transactions. You can create and send invoices online with all payments, returns and credits automatically tracked. What's not to like? Just one more point - whichever service you choose, make sure that you read your cloud provider's service agreement right through to the end before you sign up. Not all cloud providers are created equal!

### 7 trends driving cloud adoption

Gaidar Magdanurov, VP and General Manager, Consumer and Online Business at Acronis, looks at the essential movements in this year's cloud services market. For many people, success in the years to come will be defined by their ability to deliver digitally enhanced products and services. Cloud has become synonymous with the availability and convenience of computing solutions at a manageable cost, and the next 12 months will bring more opportunities and challenges as more consumers and businesses continue to adopt cloud-based services. The following trends will emerge to challenge cloud providers:

More applications will move to the cloud

Driven by the strong adoption of Microsoft Office 365 and Google Docs office tools, more applications will move to the cloud. Business users and consumers continue to embrace browser-based apps. With graphic editors, accounting, word-processing, CRMs, and many other business apps already in the cloud, next year we'll see even more applications follow suit. Customers will look for new browser-based apps as a more convenient alternative to applications that require installation, and software vendors will have to move more services to the cloud.
[easy-tweet tweet="Data integrity can be easily compromised without the knowledge of the data owner." hashtags="Cloud, data"] Small businesses will drive cloud adoption Cloud-based services make a wide range of business tools widely available to all customers, irrespective of their budget or profile. It’s easier and more cost efficient to host business applications with a cloud service provider than pay licensing costs and maintain IT infrastructure in-house. Consequently, small businesses will lead the way in terms of adoption. Data storage needs will increase With applications moving to the cloud, there will be an increased need for convenient and secure data storage. Cloud-based apps need data that’s also stored in the cloud. Users will look for solutions to manage their data distributed among different cloud services. There will also be a need for new ways to access, aggregate and search data stored in different locations. Demand for data verification and encryption will grow As data becomes more distributed, there will be a growing demand for services to verify data authenticity. Unprotected data stored in public cloud accounts will be subject to increased cyber-attacks. Data integrity can be easily compromised without the knowledge of the data owner. This will drive adoption of verification and encryption services. A good candidate to maintain public registry may be blockchain. Besides storing information, validation and integrity will gradually become part of cloud data storage services. Legal documents and contracts will migrate to the cloud Electronic documents and signatures will become even more prominent. This will be an additional driver for solutions confirming the authenticity of information. We may see the growth of blockchain-based public registries for digital signatures, slowly replacing proprietary services that are prone to hacking attacks due to the single source of authority. Single cloud solutions will take off With multiple subscriptions for various cloud services and applications within the household, users will be looking to simplify ownership and management of those services. Service providers offering multiple services packaged as one solution and brokers simplifying ownership of multiple subscriptions will receive even more attention. The need for data protection will increase While trying to simplify subscription management, users of the mix of multiple local and cloud applications will be more willing to create multiple hybrid-cloud data repositories to avoid relying on only one provider for data storage. This will increase the need for data protection software to take care of automated data backups and simplify data management. Growing dependence and increased consumption of cloud-based services will keep providing exciting opportunities for cloud service providers, and both business and consumer customers. However, the transition to cloud may be difficult for users, as they will need to adopt subscription-based and consumption-based models for the applications and services they used to ‘own’, and service providers will need to provide a seamless transition for their users on the way to the cloud.  Subscribe to the Acronis Blog: http://blog.acronis.com/ Featured image credit to Lobster.media ### Hybrid cloud 2017: A forecast for change Organisations are now beginning to realise that there is no singular universal hybrid IT solution and that to meet the needs of their stakeholders, they must embrace a variety of technologies. 
As the role of hybrid cloud is growing in importance, here are some key trends that will change hybrid cloud in 2017 and beyond. Organisations are turning to managed services Companies are increasingly using hybrid cloud as a managed service platform to achieve a consistent application experience, regardless of the underlying infrastructure. As cloud adoption increases, the industry is moving beyond self-service portals for provisioning infrastructure to software based managed services platforms. These automated, software-driven managed services have consistent Service Level Agreements (SLA), regardless of workload deployment models across public, private, and vendor clouds, as well as seamless workload portability and automated migration.  They also provide governance, risk, and compliance assurance across the user base and deployment model. Cloud workloads are being automated above the orchestration layer Cloud services have typically concentrated on automating the orchestration layer. However, as needs evolve, organisations are looking at integration above the orchestration layer to automate the whole application deployment process across any cloud infrastructure. This speeds up initial deployment and ongoing DevOps integration; it simplifies application management and accelerates delivery of the application owner’s business objectives. However, finding an integration point that will support platform-independent strategies is a challenge. [easy-tweet tweet="Abstracted DevOps toolsets enable the selection of a cloud environment" hashtags="tech, cloud"] Abstracted DevOps toolsets enable the selection of a cloud environment based on individual business requirements and allows for the leveraging of other clouds for future deployments, reducing the risk of vendor lock-in. To see which best addresses the business requirements of the application one must identify the mechanisms behind toolsets that differentiate them. Self-service provisioning and automation to support public, private, and hybrid cloud-based development teams, is becoming a necessity. Programmable networks are also powerful enablers of hybrid cloud, allowing new operational sites to be rolled out more quickly. Previously, equipment would be bought, configured and taken to the new site, needing a highly skilled engineer to install it. Now with generic programmable equipment and cloud applications, the process can be templatised by taking a simple device, putting the intelligence in the cloud and using a lower level engineer to install the equipment and the branch software, all in the cloud. Container tools are becoming the new platform-as-a-service Throughout 2017, we’ll also see more widespread adoption of containers, but the transition to a fully containerised world is still a way off. Initially, we’ll see traction in using Kubernetes as a deployment model for more complex workloads.  As support for Docker is variable across public cloud platforms, organisations are not likely to jump to Docker on multi-cloud. They’ll probably choose to use it on a single cloud platform and achieve hybridisation in combination with their on-premise stack. When adopting Docker and Kubernetes (or similar variants), organisations should make sure they have a clear strategy around image management, network access and security patching, service discovery, and container monitoring. 
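As a small, concrete illustration of the kind of container-level plumbing that such a strategy has to cover, here is a minimal Python sketch of a sidecar-style health check. It assumes a hypothetical application container exposing a /healthz endpoint on port 8080; neither the port nor the path comes from any product mentioned here. It relies on the fact that containers in the same Kubernetes pod share a single network namespace and IP address, so the check can simply talk to localhost.

```python
import urllib.request

# Minimal sidecar-style health check (illustrative assumptions: port 8080 and a
# /healthz path on the application container). Because containers in the same
# Kubernetes pod share one network namespace, "localhost" reaches the app
# container running alongside this one.
def app_is_healthy(port: int = 8080, timeout: float = 2.0) -> bool:
    url = f"http://localhost:{port}/healthz"
    try:
        with urllib.request.urlopen(url, timeout=timeout) as response:
            return response.status == 200
    except OSError:
        return False  # connection refused, timeout, etc.: treat as unhealthy

if __name__ == "__main__":
    print("healthy" if app_is_healthy() else "unhealthy")
```

A monitoring or service-discovery agent built this way sees the application exactly as its pod-mates do, which is one reason the pod-level networking model discussed below matters for container monitoring.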
Network function virtualisation becomes the path to hybrid cloud nirvana

Nirvana in hybrid cloud is where one part of a service runs in a firm's own data centre, a second part runs on public cloud provider A, and the remaining part runs on public cloud provider B. The firm is then free to determine where it wants any element of the service to run based on performance, availability, privacy, or cost. A major reason why this has yet to be achieved is that the network elements in these hybrid domains must be stitched together. Some enterprises attempted to use software-defined networking (SDN) to unite their hybrid environments but discovered this to be very complex. Network function virtualisation (NFV) promises to be a much easier way of networking together hybrid cloud and hybrid IT environments. One of NFV's advantages is that the virtual networking and security appliances it employs allow organisations to maintain control of IP addressing schemes, DNS, and routing choices as they stitch the network together. Firms can treat the cloud as an extension of their own network, using familiar networking technologies, tools, and vendors. This means we'll see more interest in NFV when cloud-enabling existing networks, and when architecting new networks with hybrid cloud in mind.

NFV is also becoming the preferred enabler of containerisation

As containers are dynamic and short-lived, traffic flow is unpredictable. Once a container starts, it needs to be registered in some directory; when it is 'killed', everyone needs to be informed through a service discovery layer and processes that run on the console, using tools like CoreOS. The Kubernetes networking model requires that containers communicate with network nodes and one another directly, and that a container sees itself as having the same IP that others see it as. In Kubernetes, all containers within a pod share the same IP address and must use the localhost construct to communicate with one another. These containerisation networking challenges can be approached with anything from Docker networking options and container-centric options to SDN and NFV. However, accepting that a greenfield Docker deployment is less likely than a hybrid deployment in which containers run alongside existing virtual machine implementations, the NFV approach is the most likely to address containerisation networking challenges. The rate at which technology is evolving makes it challenging to predict the future, but the adoption of hybrid cloud is clearly on the rise. Despite the numerous challenges that will be met along the way, if done right, it can provide organisations with the ability to alternate between dedicated resources, harnessing the security of a private cloud and the flexibility of public cloud services, thus forming an ideal solution to meet ever-increasing IT expectations and demands.

### Data Centers, Mainframes and the rest!

Compared to an x86/distributed server environment, the power-related operational costs of IBM's mainframe z Systems are roughly half, while the performance is 30 percent greater (using z Systems as an example). So let's think about that for a minute! x86/distributed server power consumption is double that of an IBM mainframe? An IBM mainframe system will achieve 30% greater performance too? Where's the catch?
Sure – the very word "mainframe", when used today, still conjures up images of huge rooms, vast banks of machinery and lots of people (oftentimes replete with white coats!) pawing at various gauges and monitors. Such images obviously don't help the marketing efforts of the so-called "Big Iron" manufacturers. That said, let's look at some cold hard facts:

- 90+ percent of the world's top 200 banks still use mainframes
- Many similar mission-critical applications (think of those used by major airlines and credit card companies) also rely upon "Big Iron" mainframes

Maybe it's the fact that, despite their "dated" image, mainframes have been continuously updated and refined to cope with the truly explosive growth in mobile and online transactions. The latest z Systems come with an impressive list of capabilities. z Systems machines can process some 111,000 million instructions per second. They can easily process 2.5 billion transactions a day, encrypt mobile and online banking transactions on the fly, and allow those transactions to be monitored with real-time analytics, allowing users to spot potential fraud and find opportunities within transactions as they're happening. The z System itself also supports something called single instruction multiple data (SIMD), which lets one microinstruction operate on multiple data items at the same time. "When you start doing advanced modelling and analytics, you can get an 80% performance improvement because of SIMD." By amassing data from disparate sources (and encrypting it in the process) and then populating a database designed to turn data from the Internet of Things (IoT) into useful, meaningful information – in real time – users of this technology will be well placed to exploit the opportunities that will remain elusive to firms relying on the slower, typical server farms using x86 technology. Further, a series of application programming interfaces lets developers create apps for the system quickly (an important selling point for banks that need to add new features to their apps frequently for competitive and business reasons). In order to cope with the vast quantities of unstructured data (similar to Datacenters) that will present themselves once firms begin to increasingly interconnect their standard products and services with more and more devices and/or the IoT, firms will need to ensure that the entire spectrum of their systems (e.g. network, storage, APIs, hardware throughput, resilience etc.) can be handled in a safe, reliable and secure manner. So, what's this got to do with Datacenters, I hear you ask? Datacenters have the same need for lower power consumption and increased technical performance. Let's say that a Datacenter is similar to a bank; Data is the new currency, after all, and you need to secure it, access it regularly, make sure that it never gets lost or stolen, and cater for different currencies. This is exactly what a mainframe can do for a Datacenter!
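To make the SIMD point above a little more concrete, here is a short, purely illustrative Python/NumPy sketch of the general idea of data parallelism: one operation applied across a whole batch of data items at once, rather than item by item in a loop. It illustrates the concept only; it says nothing about z Systems hardware, its instruction set, or the 80% figure quoted above, and the batch of values and the conversion rate are made up.

```python
import numpy as np

# Illustrative only: apply one operation across many data items at once
# (the general idea behind SIMD), versus doing the same work item by item.
amounts = np.random.rand(1_000_000) * 100.0   # a made-up batch of transaction values
fx_rate = 1.27                                # a made-up conversion rate

# Data-parallel style: a single vectorised expression over the whole array.
converted_vectorised = amounts * fx_rate

# Scalar style: the same calculation, one element at a time.
converted_looped = np.array([amount * fx_rate for amount in amounts])

# Both produce the same result; the vectorised form lets the library and the
# hardware process many elements per operation.
assert np.allclose(converted_vectorised, converted_looped)
```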
[easy-tweet tweet="Data is the new currency after all, and you need to secure it" hashtags="data, cloud, tech, security, mainframe, ibm"] The latest generation of mainframes can lower your operational costs, increase performance while having one of the toughest security measures available. They now cater for Linux (most popular OS in use today), and not just bespoke operation systems, and they have an impeccable reputation for never having downtime with resiliency built in as standard for practically every main component installed. No brainer? I think so and with the overall footprint of having the ability to run over 8,000 instances of VM’s on one box it makes a very compelling opening discussion to have over a first meeting. All this is well and good but did you know you can get even more out of the device with tailored management software and configuration tools? As if the standard “raw power” proffered by a z System wasn’t enough – there are ISVs who support the IBM ecosystem and specialise in areas such as mainframe data performance and optimisation – further optimising the throughput of various applications and routines. Companies like DataKinetics – a company with 40 years of experience in driving greater performance and value from existing systems. DataKinetics serves the world’s largest mainframe users with a variety of solutions that achieve significant savings – in time, effort, and money. Recent examples include one of the largest US Banks that implemented a solution that reduces the time taken to process some particularly complex batch jobs from over ten hours to less than one minute! Think about that. 10 hours to 1-minute reduction in processing time for tasks. Imagine what they could do for datacenter processing tasks, batch jobs across the enterprise and others whilst still enabling the mainframe to service up over 8,000 VM’s! Another client, a leading Healthcare Insurance provider, saw much more efficient resource utilisation that not only reduced their monthly operating costs by hundreds of thousands of dollars each year, but their system’s optimisation also created significantly more available processing space which allowed a planned system upgrade to be deferred for at least 2 years! These new generation mainframes also boast more advanced capabilities in application availability, resiliency and disaster recovery than even the most bleeding-edge x86 virtualization software. Further, from a security standpoint, viruses and worms that work through stack overflow conditions in distributed systems simply don’t work on the mainframe. All of this is achieved with no downtime – not even when upgrading software or trying out newly developed software prior to release. Mainframes are still orders of magnitude more powerful than even the largest virtualised distributed systems clusters. For example, IBM’s most recent z System microprocessors run at 5.0 GHz, while today’s commodity processors typically run between 2 and 3.4 GHz. A single z/VM in version 6.1 can accommodate more than 60 virtual machines per CPU; a fully loaded z Enterprise cabinet can hold four modules, each containing six quad-core processors and up to 786 GB of memory with four levels of cache; and the full system can address more than 10 TB of memory and support thousands of concurrent workloads, in many cases running at (or close to) 100 percent utilization. Each mainframe operating system (OS) can theoretically host somewhere between 10,000 and 30,000 users, depending on the mix of workloads. 
The mainframe is also still regarded as the gold standard when it comes to batch jobs -- up to 100,000 of which can be run in a day within the same OS. Without being too advertorial for mainframes it does seem to me that mainframes get a bad rap due the longevity of the word, the misuse of the word in certain films and of course the assumption they cost a lot! Well, they may cost more when compared like for like but they simply are NOT like for like – that would be like comparing apples and oranges. And let's face it they are in a different league, a league that demands a performance increase while lowering costs – Isn’t this what every company is trying to achieve? So, as competition among Data Centers/Cloud Providers increases – maybe it’s time for them to take a seriously close look at just how much money they could save by using mainframes instead of x86 server farms. Less maintenance, less power consumption, the ability to place huge computer resources on the same machine as the data that needs to be analyzed/used to verify authenticity etc., ISVs like DataKinetics focused on driving even greater throughput, and all encrypted on the fly might just enable some of us at least to see mainframes in a new light. ### Supporting HR - #CloudTalks with Tap’d Solutions’ Anthony Ryland In this episode of #CloudTalks, Compare the Cloud speaks with Anthony Ryland from Tap'd Solutions about how knowing the technologies available to support HR make it possible to go into a business and really 'see' where the gaps are in employee engagement. From this suggesting technologies and consulting with companies to get ready for step-change their engagement. #CLOUDTALKS For more content visit: comparethecloud.net/ Visit Tap'd Solutions to find out more: tapdsolutions.com/ ### Five cloud trends that will shape 2017 Last year, we saw cloud technologies grow in sophistication and power, enhancing their ability to process vast amounts of complex data. Adoption of cloud services increased, not just among start-ups and small enterprises, but also larger, mainstream businesses. No surprise there, as cloud capabilities provide more flexibility, lower up-front costs – to name just two advantages – compared to on-premises solutions. However, this doesn’t mean that adoption across businesses has been systematic or smooth. Deep cloud integration remains a challenge particularly as companies begin using it for more critical business functions, such as data storage. And bigger businesses have tended to opt for gradual adoption, given that transferring complex data systems and infrastructure can take time and stringent internal auditing. Going forward, cloud adoption by businesses of all sizes will only accelerate. Here are the top five trends we see shaping the year ahead. [easy-tweet tweet="Going forward, cloud adoption by businesses of all sizes will only accelerate" hashtags="cloud, data, tech"] Organisations will embrace a hybrid world Organisations will increasingly live in a hybrid world, split between on-premises solutions and cloud environments. For businesses with complex legacy IT systems, data is typically fragmented across local servers and cloud services. The shift towards cloud is a gradual journey rather than a sprint to the finish line. A new breed of agile cloud software is allowing flexible choices to help CIOs manage this new balance between on-premises cloud services. For users, these solutions make complex hybrid environments function as one cohesive system. For IT, the benefits run even deeper. 
Investments in hybrid software will remain fully relevant even as organisations eventually shift operations toward an all-cloud future. Cloud service providers will remove the complexities of regional data regulations Complying with new government policies on data privacy and sovereignty can be expensive and time consuming for global companies. In 2015, the European Union ruled against Safe Harbor, requiring international companies to revamp many of their compliance efforts. Then in July 2016, the Privacy Shield agreement again demanded new efforts from businesses with data spanning the Atlantic. These regulations are proving to be a constant challenge, and many companies are looking to major cloud providers for help. As they operate globally while maintaining regional data centres that meet today’s regulations, cloud service providers offer a solution for global companies to overcome these barriers. They have teams dedicated to monitoring and planning for regulatory shifts – something that can be cost-prohibitive for businesses. In 2017, more organisations will look to major cloud providers to take the pain and cost out of meeting regulations, allowing them to focus on their core business. Prioritising customer success and adoption It’s no secret that opportunities for brand and sales engagement increasingly span the entire lifecycle of a buyer’s journey, which is why it’s vital for cloud vendors to focus on their customers’ long-term success, not to mention develop a strong working relationship with both IT and the business. Today, cloud software vendors are extending their focus far beyond point of sale. Because software deployments require fewer initial investments of time and money, cloud vendors are looking to work with customers to ensure product adoption and business value. [easy-tweet tweet="Today, cloud software vendors are extending their focus far beyond point of sale" hashtags="cloud, data, tech"] By offering higher levels of customer support, more robust training resources, and deeper guidance on product adoption, this new timeframe is leading to mutually beneficial partnerships. From this, enterprises realise more value from their investments, and vendors are able to build long-term customers rather than one-time buyers.  Flexible analytics will solve the IoT’s last-mile challenge IoT data tends to be heterogeneous and stored across multiple systems, from Hadoop clusters to NoSQL databases.  As such, the market is calling for analytical tools that seamlessly connect to and combine all those cloud-hosted data sources, enabling businesses to explore and visualise any type of data stored anywhere and maximise the value of their IoT investment. Organisations across the world are deploying flexible business intelligence solutions that allow them to analyse data from multiple sources of varying formats. Joining together incongruent IoT data into a single view, companies can have a more holistic view of the business which means they can more easily identify problems and respond quickly. With the solution to the “last-mile” of IoT data, they can increase efficiencies and improve their bottom line. IT will shift its skill set With growth in cloud adoption creating increased demand for specialist cloud expertise, IT is responding by prioritising specific training on cloud technologies and developing internal training programmes focusing on security, hosted databases and infrastructure as a service. 
IT managers are stepping up their search for candidates with experience in DevOps practices and cloud platforms like AWS, Azure, and Google Cloud Platform. IT is shifting its workflow, too. Since concerns like scalability and maintenance are all but taken care of with the cloud, IT departments will place a greater emphasis on agile methods that provide continuous development and project delivery. [easy-tweet tweet="IT managers are stepping up their search for candidates" hashtags="IT, recruitment, cloud, tech"] ### Investing in enterprise productivity through unified collaboration   Enterprises are spending on average $8.1 million on unified communications and collaboration (UC&C) technologies and services in the hope that business productivity is enhanced. Fortunately, companies that deploy these tools receive returns, and it has been proven that enterprises with highly-effective internal communications offer 47 percent higher returns to shareholders.  So, clearly UC&C is on the rise and businesses are becoming more productive and deriving value from their investments. [easy-tweet tweet="Enterprises are spending on average $8.1 million on unified communications" hashtags="communications, tech"] In contrast, although it is clear that communications is vital, a failure to promote successful communication and collaboration internally is costly.  Organisations lose approximately $37 billion yearly due to poor workplace communication. Furthermore, in a business of 100 employees, an alarming average of 17 hours per week is spent on seeking clarifications, resulting from miscommunication. With this much wasted time, and its effects on productivity, it becomes critical for companies to seriously consider technology investments. As the use of UC&C technology grows, so does the adoption of cloud technology. This is because most UC&C technologies are dependent on the cloud. Consider too, that we have also seen that 71 percent of IT decision makers believe cloud computing has influenced their purchasing plan. So, it appears that the two are almost synonymous with each other. Further, an estimated 60 percent of public sector IT professionals believe the cloud will be the standard for all applications by the year 2020. [easy-tweet tweet="71 percent of IT decision makers believe cloud computing has influenced their purchasing" hashtags="cloud, purchasing, tech "] Additionally, in the next year, the global mobile workforce is expected to reach 1.45 billion people. The use of UC&C allows a clear way for enterprises to commit to a long-term collaboration strategy and ensure successful communication across their organisations. While it is clear that cloud and UC&C is critical to the future success and productivity within organisations, it is also important to establish what the digitally capable modern worker demands. Nowadays, they expect more than traditional conference calling, but seek an integrated approach to how they communicate with their colleagues, whether by video, voice, web, chat or audio. [easy-tweet tweet="The cloud will be the standard for all applications by the year 2020" hashtags="cloud, data, tech"] This essential variety places increased pressure on IT professionals to meet the demands of their organisations and staff. Therefore, by adopting the right UC&C approach, teams can become enabled to meet an enterprise’s communication objectives. As this is carried out, what do IT leaders need to consider when building out their UC&C strategy? 
Understanding your organisation’s UC&C needs Before investing in new technology for your business, ask whether you truly need it, what is can do and if it’s feasible long-term. Naturally these are obvious, but many IT and business leaders don’t actually consider this. For instance, what are the characteristics of your team? Are they baby boomers? Millennials? What are their collaborative personality types? Additionally, how does your business operate; is it entirely office-based or virtual? Once these questions are answered, and you have a better idea of your organisational need, you can begin to develop a business case for UC&C and then review your technology and feature requirements. Research into your chosen provider One of the first criteria to consider when choosing a supplier is how many other companies have they supported successfully, and what is their level of experience. While collaboration newbies and Silicon Valley giants can achieve VC funding, few are truly equipped to support the collaboration needs of global corporates. It’s important to do your research from the very beginning and request customer references. Consider the scale of your needs So, you’ve chosen a supplier – great! Are they able to cope with the size of your organisation? When supporting hundreds, if not thousands of employees, scalability becomes very important to your UC&C strategy. It’s critical, too, to consider simultaneous users and simultaneous meetings on your chosen provider’s network. Can they manage these and at scale? Businesses will need to wisely assess the meeting capacities that are offered, to guarantee that they meet requirements, from the smallest ad hoc conference calls to large, company-wide virtual events. Once the choice is made to deploy a technology to your extensive user base, training programmes and resources are vital to consider as they drive usage and adoption and enable the firm to maximise ROI. Security first Today’s security challenges are compounded by the introduction of new technologies, providing new opportunities for security threats. As such, companies will need to consider enterprise-grade security and encryption along with centralised IT administration to easily establish user permissions at either the individual or departmental level, ensuring the protection of valuable company data. Furthermore, built-in network resiliency and fail-over ensures that your service will always be available with limited downtime. Are you supported? Irrespective of the provider, it’s certain that you will encounter issues that require customer support. For enterprises, it’s important to have access to a choice of support avenues, including live chat phone support, face-to-face and customer communities to meet the various needs of the organisation. [easy-tweet tweet="For enterprises, it’s important to have access to a choice of support avenues" hashtags="cloud, tech, communication"] Also, depending on your organisation’s size, you may need to consider the availability of global support – in terms of access points, network infrastructure and local-language support options. Analyst firm Wainhouse Research found that more than half of web conferences start over 5 minutes late due to problems such as software download and installation delays, difficulty joining meetings, and missing materials or credentials. The loss of productivity and costs associated with these delays are incalculable. These are very basic problems that can easily be resolved by simple planning. 
In addition, developing a “watertight” UC&C strategy that empowers workers with collaboration tools will go a long way to increase productivity, and help keep your business on track.   ### The Linux Foundation and the NCWIT Release Speaker Orientation Course The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, and the National Center for Women & Information Technology (NCWIT) today announced the availability of a free LFC101 - Inclusive Speaker Orientation course to help prepare event presenters and public speakers with background knowledge and practical skills to promote inclusivity in their presentations, messaging, and other communications. Development of the course was first announced in November. The course, offered in three 20-minute, self-paced modules, presents content in a simple and practical way applied to the specialised needs of presenters. Topics covered include crafting presentation messages, scripting discussions, presenting media and subconscious communications. The course is based on NCWIT’s “Unconscious Bias” messaging, which encompasses the ideas of “Realise, Recognise, and Respond.” The Inclusive Speaker Orientation Course is available for free online. All Linux Foundation employees will now be required to take this course. Other speakers at Linux Foundation events are strongly encouraged to do so, and will be offered a special incentive for completion. It is offered openly, so other events may use it for their speakers, or individuals who are interested may enrol on their own. The Linux Foundation also continues to offer Ally Skills Workshops, which teach everyone simple, everyday ways to support women in their workplaces and communities, onsite at events. NCWIT serves as a developer of and clearinghouse for evidence-based tools and resources for increasing the meaningful participation of girls and women in computing. NCWIT is a research and data-driven organisation that is structured as a “change leader network” of educators, entrepreneurs, corporate executives, and social scientists who work to narrow the gap by addressing barriers to participation. “Increasing diversity in open source is important work that has widespread benefits for individuals, projects, organisations, the tech industry, and society as a whole,” said Terry Morreale, CTO and Associate Director at NCWIT. “We are excited to work with The Linux Foundation to make tools for change more accessible to everyone in the community.” A primary goal of The Linux Foundation is to make the open source community more inclusive and welcoming to all individuals who wish to participate and contribute. LFC101 will support that goal by strengthening diversity and inclusiveness within the open source ecosystem. Other initiatives to increase diversity at Linux Foundation events include a strict code of conduct for speakers and attendees, onsite child care and nursing rooms, diversity scholarships, non-binary bathrooms, barring all-male panel discussions and partnering with community groups to encourage more women to apply to speak. The Linux Foundation has also partnered with Girls in Tech to host ‘Hacking for Humanity’ hackathons later this year at Open Networking Summit and Open Source Summit North America. “Open source is for everyone, but everyone does not always feel completely welcome in the community,” said Linux Foundation Executive Director Jim Zemlin. 
“Speakers have a major role to play in making events feel safe and welcoming, and while they almost always have the best of intentions, they need the tools to be successful. By leveraging the expertise of NCWIT for this course, we will help speakers ensure their presentations are more inclusive, which in turn will help event attendees of all backgrounds feel more accepted in the open source community.” Anyone wishing to take LFC101 - Inclusive Speaker Orientation may enrol now at http://bit.ly/2kSRCVe. ### Deveo combines forces with a global cloud computing powerhouse The code hosting and collaboration platform becomes the first repository hosting partner for the Alibaba Cloud Marketplace. Helsinki, Finland, February 2017: Code hosting and collaboration platform Deveo has joined forces with Alibaba Cloud, the cloud computing arm of Alibaba Group, to become the first repository platform within the Alibaba Cloud Marketplace. With free private repositories, support for the Git, Mercurial and Subversion version control systems, and WebDAV for storing binary files, Deveo offers a unique private hosting environment. “We are glad to partner with Alibaba Cloud to deliver the best tools for developers to use in the cloud,” explained Deveo CEO Ilmari Kontulainen. “The first step was adding Deveo to Alibaba Cloud’s marketplace, but we look forward to discussing a more in-depth collaboration to offer the whole DevOps tool stack on the cloud with a click of a button.” Deveo has been creating proprietary software to assist high-profile enterprises for the last 10 years, as part of the leading Finnish DevOps organisation, Eficode. In 2014, the team launched their code hosting and collaboration platform as a standalone application and, while retaining enterprise users, set their sights on the wider industry. “Deveo's sister company has already introduced disruptive innovation in the mobile payment sector,” said Kontulainen. “We are hoping to find disruptive innovations in the DevOps tool sector together with Alibaba Cloud.” ### Intranets - Their journey to the Cloud With over 15 years building some of the best intranet systems for significant brands across many sectors, we have seen the benefits that they can deliver but also the challenges that they throw up. In the BC (Before Cloud) era, the barriers to widespread adoption of intranets included the uneven nature of the investment, with large upfront costs to add hardware infrastructure as well as to hire or contract the technical, design and other specialist skills needed to build and deploy the system. The user experience was also heavily shaped by older desktop applications rather than the browser, and with each company evolving its system at a different rate, maintenance costs escalated and improvements became lengthy and difficult to achieve. Now - Everything has changed! The Cloud, coupled with a mobile workforce whose expectations of engagement with technologies have changed beyond recognition, means that the old world is simply not viable. [easy-tweet tweet="Organisations are looking at evolving their intranet into a complete digital workplace" hashtags="cloud, digital, tech"] What is now expected from an intranet is all about a user experience that is modern, engaging and in keeping with users’ personal and social tech experiences.
Organisations are also looking at evolving their intranet into a more complete digital workplace where employee-orientated business processes, such as holiday and absence planning or any form-based systems, can be streamlined. The Cloud is the only answer. Only in the Cloud can innovative and disruptive solutions such as www.oak.com be run 24/7 and be available anywhere on the planet and from any device. Only the Cloud can deliver intranet software and digital workplaces without huge infrastructure costs or setup fees, and enable every size of organisation (not just large enterprises) to benefit from improved communication, employee engagement and sharing of content and knowledge. Only in the Cloud can these applications change and iterate at a speed that meets the accelerating rate of change across organisations large and small. And once in the Cloud, many business systems benefit the organisation through easy and effective integration. [easy-tweet tweet="Once in the Cloud, many business systems benefit through easy & effective integration" hashtags="cloud, tech, data"] We can see the evidence at Oak.com: in the 15 years prior to Oak we attracted nearly 200 clients, while in the 8 months since we started showing Oak to the market, over 40 organisations have joined our community, with some going live in a matter of days rather than months. The move to The Cloud is a one-way ticket to a better place! ### Do I need a Managed Services Provider if Public Cloud is so easy? Public cloud services are becoming more appealing as a deployment option for an increasing number of workloads, applications and business solutions. Public cloud offers a low-cost, low-barrier-to-entry proposition. Armed only with a credit card, tech professionals are able to ‘self-serve’ and get started quickly on a massive choice of services and apps. However, capitalising fully on the freedom afforded by public cloud services also requires a cultural shift that some businesses are simply not ready for – in particular, governance of a decentralised, ‘need it now’ approach requires a high level of maturity and specialism, which can be difficult to fulfil in house. If left unmanaged, or without full consideration of the potential for error, overspend or risk, the flexibility of public cloud services can instead lead to future growing pains in terms of overall cost, durability and security as the solution matures. Aside from the solution and cultural approach, one of the biggest issues organisations face is human resources. It’s a familiar tech challenge remastered for the age of public cloud, specific to individual business and technology objectives – and it can be boiled down to ‘3Cs’: Capabilities Creating environments and instances in public cloud is a simple premise; however, the practice of finding the right people to create a secure, highly available, integrated and well-supported software solution is not. It is important that people with the right skills and knowledge to design, build and support the solution over the lifetime of the application feature early on in discussions to help build the plan. Although it may seem obvious, it is surprising how often this is not the case, as departments create and grow accounts outside the central control of IT.
Many businesses look to their existing IT teams to undertake a public cloud infrastructure transformation and ongoing management, but up-skilling in this specialist domain is also a challenge. The capabilities and services offered by public cloud are rapidly and constantly changing, so it can be hard to keep skills current and relevant. Staff retention often becomes an issue – skills are in high demand and attract relatively high rewards, so poaching of talent is rife. Retention can also be an issue when a big project ends: post-deployment, businesses may not have anything meaty or exciting enough on an ongoing basis for their skilled resource to get their teeth into – and do not underestimate the lure of an intellectual challenge to an AWS or Azure specialist. The people who develop the solutions tend to end up as the ones supporting them, and not by design. This requires a different skill and mindset – think DevOps without the Ops. [easy-tweet tweet="The capabilities and services offered by public cloud are rapidly and constantly changing" hashtags="cloud, data, tech"] Capacity Capacity is about having the right people with the right skills. Do businesses deploying services in public clouds have enough skilled and available people to support their systems, today and tomorrow? The concept of a static technical workforce with siloed specialisms is not well aligned to the premise of public cloud solutions. Elastic supply can create dynamic demand for skilled services – businesses may need to ‘burst’ temporarily from a people perspective to accelerate or derisk service delivery, effect a migration or meet specific project requirements. Capability structure is obviously crucial to effective capacity management. Lightweight, flat structures and processes advocating collaboration are naturally better suited to DevOps, which corresponds with public cloud. Technical resources have to be equipped to support the business in an increasingly fluid way, and the most effective teams are often multi-disciplined with (metaphorically) multi-lingual and ambidextrous capabilities. Clarity of purpose Organisations need to ask themselves on a continuous basis whether their valuable resources and workloads are in the ‘best execution venue’ for their skillsets. Internal experts are better placed to drive innovation that creates a competitive edge or elevates the customer experience, rather than being absorbed in infrastructure management. How do managed cloud providers help? The public cloud proposition is extremely compelling, yet many businesses face at least one of the following challenges: limited budget, time, skills, resources, or vision to invest in internal capability today. [easy-tweet tweet="Working with a managed cloud provider helps mitigate resource bottlenecks and risks" hashtags="cloud, data, tech"] Working with a managed cloud provider helps mitigate resource bottlenecks and risks dynamically and cost-effectively, as an alternative to creating and managing an internal talent pool. Consider also the ongoing training and tools internal teams require to perform (for monitoring and alerting, backup and restore, authentication and so on), whose ROI has to be measured against business outcomes. The managed cloud provider will also be able to offer economies of experience, such as best-practice architectures that accelerate solution development, plus skills and insight that can be translated into custom development.
The applicability of the ‘managed proposition’ to public cloud contexts becomes clear where tools and automation processes are complex, deep expertise relatively rare and the cost of management failure (typically around solution durability and cost control) high and well documented. Public Cloud solutions To use a building analogy, even with limited DIY skills, it is rather easy to construct a garden shed (your quick start public cloud solution), or transfer an existing one from a neighbour’s garden (think technical lift and shift). The structure can then be used to hold gardening tools and the lawnmower (read application code and data). Sheds are useful and fulfil a temporary or ‘good enough’ requirement for contents not significant enough to merit their own space in the house. When it comes to building something habitable, most people would consider engaging an estate agent or an architect and builder to deliver a home with running water, electricity (monitoring and alerting), a number of separate rooms and a hall for welcoming visitors (semi-segregated areas for security). Building your public cloud environment Few people design and build a house on their own, particularly considering the number of different skills and trades involved. Instead, most give the work to people with the required technical qualifications, to ensure things are put together correctly and operate efficiently. It may be a laboured analogy (and it’s probably also a renter’s market) but the fact is that public cloud environments are becoming a more viable option for an increasing number of scenarios – and an increasing number of businesses are moving their entire digital footprint into public cloud infrastructure. So for organisations trying to work out why their shed can’t be more like a permanent residence, it might be time to call in the professionals. [easy-tweet tweet="Public cloud environments are becoming a more viable option for an increasing number of scenarios" hashtags="cloud, data"] ### Is there a gap in the mobile devices market? Not everybody does the same job, so why should they all have the same device?  While this used to be the case, this approach is a thing of the past. Bring Your Own Device (BYOD) was presented as the initial answer but has petered out in the UK, with the exception being some of the largest corporates. Most medium to large businesses now look at user profiles and work styles before choosing the most suitable device for the various roles within their organisation. To any sensible person this makes perfect sense. Which begs the question – why is it not working in the real world? Firstly, instead of standardising on a single device this approach often results in standardisation on a single manufacturer. The manufacturer is chosen because it has a broad portfolio of devices, coupled with an operating system the IT department is happy to support. Entry level devices for the workers and top-of-the-range devices for executives is one way of looking at it! However, with more and more operating systems becoming disconnected from the manufacturer is the single manufacturer approach also flawed?  Given both this challenge and current market trends, who should you buy from today?  [easy-tweet tweet="In the past, BlackBerry has been the all-time enterprise favourite." hashtags="blackberry, mobile"] In the past, BlackBerry has been the all-time enterprise favourite. 
But last year BlackBerry decided to concentrate on the development of its software and services business and took the decision to stop manufacturing its own phones. What about Microsoft? The Lumia range experienced massive success in the enterprise for the reasons I’ve outlined above. However, the future is uncertain, with Microsoft commenting that it will remain in the phone market, yet all signs pointing in another direction. There are rumours about the ultimate Surface mobile device, and the majority of the Lumia range is now end-of-life with no replacement in sight. Are the lights going out on the Lumia range? That leaves Apple or Android. Apple iOS and iPhones are inextricably linked. However, there are no cheap-and-cheerful iPhones, so no credible option for the worker-bees in the business. On the other hand, there’s Android, which now boasts the security features to keep IT happy and a broad enough range to ensure both users and procurement are satisfied. [easy-tweet tweet="There are no cheap-and-cheerful iPhones, so no credible option for the worker-bees in the business" hashtags="iphones, mobile, apple"] Is there a gap in the market that can be exploited? There is an opportunity for smaller mobile manufacturers that can produce devices with top-end features at a lower price point. Could this pave the way for a Nokia comeback? Not only does Nokia understand the business market, but one of the worst-kept secrets in the mobile industry has been that Nokia will be re-entering the market this year with the next generation of Nokia phones running Android. What makes a mobile device suitable for business anyway? I’m not sure that there is one definitive answer. Security features? Fingerprint recognition? Long battery life? These are just features which are in demand by consumers! It’s certainly worth exploring the predicted innovations in mobile for 2017 to identify any trends that might be appropriate for business. Curved screens, foldable screens, no headphone sockets – it’s not clear these serve any purpose for a business. Wireless charging and dual-lens cameras are probably of more use. But in the enterprise today, the focus is more on the apps than the device itself, and on ensuring that employees have seamless access to the systems, business apps and collaboration tools that they require to do their jobs. One thing is for certain – nothing ever stands still in mobile! ### AWS’s membership in the Association of Cloud Infrastructure Services Providers AWS has just announced its membership in the Association of Cloud Infrastructure Services Providers in Europe (CISPE). [easy-tweet tweet="GDPR is a real concern and even a hurdle for the uninitiated circles in information security." hashtags="AWS, CISPE"] Neil Cattermull, Director at Compare the Cloud, has made this comment: "This is a major sign of security and compliance being taken seriously at the cloud infrastructure level. GDPR is a real concern and even a hurdle for the uninitiated circles in information security. With AWS now joining the rest of the pack it marks a significant leap forward to making public cloud services more secure, compliant and achievable than ever before!" CISPE is a coalition of about twenty cloud infrastructure (also known as Infrastructure as a Service) providers who offer cloud services to customers in Europe. CISPE was created to promote data security and compliance within the context of cloud infrastructure services.
This is a vital undertaking: both customers and providers now understand that cloud infrastructure services are very different from traditional IT services (and even from other cloud services such as Software as a Service). Many entities were treating all cloud services as the same in the context of data protection, which led to confusion on the part of both customers and providers with regard to their individual obligations. One of CISPE’s key priorities is to ensure customers get what they need from their cloud infrastructure service providers in order to comply with the new EU General Data Protection Regulation (GDPR). With the publication of its Data Protection Code of Conduct for Cloud Infrastructure Services Providers, CISPE has already made significant progress in this space. ### Working out the real costs of server downtime Server downtime should be a thing of the past, at least in theory. Cloud-based server clusters and the falling costs of equipment mean that, should a server fail, there should always be an alternative system ready to pick up the slack. Unfortunately, in practice this is often not the case, as many IT managers will know. There could be many reasons for this: your company’s IT budget could be cut, or the cloud services provider you use could suffer an outage or data loss. Regardless of the causes, server downtime remains an issue for modern businesses, one with high costs and high stakes. Preventing server downtime issues requires a few different approaches, but a good start is to make sure key stakeholders in the business understand the common causes of downtime, and exactly how taxing downtime can be on both a business’s financial and human resources. Common causes In my experience of working with a variety of different companies, from tiny startups to large enterprises, some of the most common causes of server downtime are obvious and easily avoidable issues. [easy-tweet tweet="One frequent problem that popped up time and again was lack of disk space on servers." hashtags="diskspace, servers, cloud"] One frequent problem that popped up time and again was lack of disk space on servers. For a simple, easily resolved problem, the potential dangers of running out of disk space are serious: applications running on low disk space can behave unpredictably, causing freezes, crashes or data corruption. Address the underlying cause of storage space issues by making sure your applications are designed efficiently. The most common cause of excessive disk usage is log files, which can suddenly grow very quickly or are not rotated often enough. In purely financial terms, using pay-as-you-go cloud tools to run your applications can suddenly run up unexpected costs when things go wrong, so it’s crucial to catch these issues before they spiral out of control. Effective monitoring of resources, with the right alerts configured before things get too bad, can really help avoid unnecessarily large bills. To use one high-profile example, Snap’s cloud bill this year will be higher than its total revenue figure for 2016. Putting a price tag on downtime It can be tricky to put a proper figure on how much server downtime costs you, so it’s best to start with an estimate. If your website has an uptime rate of 99% over the course of a year, this means that your website or online service has been inaccessible for 3.65 days. For many businesses, 3-4 days of lost revenue can be a critical issue, especially in smaller companies.
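To make that arithmetic concrete before reaching for a full calculator, here is a minimal back-of-the-envelope sketch in Python. The revenue-per-hour figure and the function name are illustrative assumptions, not benchmarks or any vendor's tooling.

```python
# Back-of-the-envelope downtime estimate. The £500/hour revenue figure is an
# illustrative assumption; substitute your own numbers.

HOURS_PER_YEAR = 365 * 24


def downtime_cost(uptime_percent: float, revenue_per_hour: float) -> tuple[float, float]:
    """Return (hours of downtime per year, estimated revenue at risk)."""
    downtime_hours = HOURS_PER_YEAR * (1 - uptime_percent / 100)
    return downtime_hours, downtime_hours * revenue_per_hour


hours, cost = downtime_cost(99.0, revenue_per_hour=500)
print(f"99% uptime = {hours:.1f} hours (~{hours / 24:.2f} days) down, ~£{cost:,.0f} at risk")
```

Running it with 99 per cent uptime reproduces the 3.65 days quoted above; the point is simply that even apparently high availability percentages translate into days of lost service once spread across a year.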
At Server Density we’ve created a cost of downtime calculator to help businesses understand more clearly exactly how much of their bottom line is put at risk by unstable server infrastructure. [easy-tweet tweet="The impact on a startup’s revenue from downtime can therefore be huge" hashtags="startups, cloud, tech"] Often startups and small businesses won’t have the money for built-in redundancies. While this might make financial sense in the short term, it can also spell a protracted wait to resume services in the event of an outage while the IT team figures out where the problem stems from. The impact on a startup’s revenue from downtime can therefore be huge, making it of vital importance that business leaders understand the risk involved. Larger businesses are generally able to mitigate that risk, as they do have the financial flexibility to purchase ‘insurance’ measures like large-scale redundancy. The cost of downtime for these businesses is orders of magnitude greater than for a small startup, and so are the costs of mitigating against that downtime. The human toll Working on-call in IT brings with it personal costs which can affect wider business goals. A 2014 study by Tel Aviv University suggests that having your sleep interrupted can be worse than no sleep at all. When a worker doesn’t know when he or she will need to respond to an emergency, that creates stress, and this stress can affect the quality of work. This is a shame, as there are easy ways to minimise stress for on-call workers. Many existing system monitoring products produce high quantities of alerts – ‘noise’ which may not be directly relevant to the task at hand or require action. Nevertheless, these alerts drain our concentration and eat away at our time. Our own research suggests it takes an average of 23 minutes to regain intense focus after being interrupted, and our data from December 2016 shows that 1.5 million individual alerts were triggered across all our customers’ servers. These unfiltered alerts created a total of 165 years’ worth of interruptions for our customers. Excessive task switching among employees, combined with the mental and physical stresses that on-call work creates, is often overlooked as a cost affecting business performance. These factors need to be managed and monitored more rigorously to ensure customers are receiving high-quality service. Create a plan Not all server downtime issues are avoidable. Sudden problems such as power cuts, fire or flooding are difficult to anticipate and often result in catastrophic, and expensive, downtime for businesses. No amount of server monitoring will help you in this situation; however, through careful planning the damage can be mitigated. GitLab’s recent challenges can provide us with a great lesson in what not to do: the company lost 300GB of production data due to a mistyped command, and its five different backup techniques all failed. Identify your different backup methodologies in your plan, be clear about the order of priority in which they will be used, and regularly check to ensure that these backup plans are working effectively. Make it clear to all the relevant people who should be contacted in case of emergency, and develop a simple checklist of issues to tackle - these kinds of measures might sound obvious, but in a crisis people can behave unpredictably, so it’s important to have as many processes as possible codified and tested prior to the event.
IT leaders often face a challenge of communication: backups and redundancies are sorely needed, but it’s often difficult to demonstrate the need for the additional budget to have these features in place. An understanding of the true cost of server downtime, both on a company’s finances and on the employees themselves, can help pave the way to a more adequately resourced plan for dealing with what is still a very real risk. ### Delivering a Complete Digital Transformation Experience Through Data It has been easy to peg “digital transformation” as a buzzword or a tag for limited special projects or initiatives. However, IDC recently reported that the value it derives for organisations now means it has become a “strategic business imperative” for Global 2000 enterprises. Whether it’s through the cloud, mobile or analytics technologies, all industries are embracing digital advances to serve their customers in new and better ways. This is taking place by putting IT at the centre of their business and identifying ways agility can positively impact their digital business. Organisations must take a holistic approach to ensure the success of these investments. But how? [easy-tweet tweet="Organisations must take a holistic approach to ensure the success of investments." hashtags="cloud, tech, data"] A customer-first approach As IDC reported, by 2020, 67 percent of all enterprise IT infrastructure and software spending will be for cloud-based offerings. There are many reasons for the tremendous growth in the cloud, but one important factor is that an organisation can often add server capacity at a rate much faster than it can within its own data centres. This helps free up product teams to optimise, iterate, and develop new experiences and applications for customers. Delivering a great digital customer experience doesn’t just require a shift in the way you build products; it also requires a different relationship with your customers. The most successful digital businesses have a commitment to serving their customers in new and better ways by deeply understanding each customer’s experience with their digital products. Many businesses are almost blind when it comes to understanding what kind of digital experience their customers are having. Remedying this requires an all-encompassing approach. It should include the way customers interact with the company’s digital properties and the relationship customers have with everyone at that company - including developers, marketers, and sales representatives. That way, the company will be able to serve its customers better. Take, for example, a retailer like John Lewis during the peak holiday shopping season. As shoppers have shifted from bursting through stores’ physical doors on Black Friday, John Lewis now requires a complete understanding of its digital doors on the web and mobile as shoppers begin their search for the perfect gifts. This begins with an understanding of where shoppers are located, how they interact with the web or mobile app, and the chain of events triggered with the click of a mouse. But the digital experience doesn’t end there, as sales, advertising and email campaigns, and in-store pickup are also now critical digital components and require coordination across the company with IT. The stakes are high, with major online retailers earning tens of thousands to more than a hundred thousand pounds per minute during the holidays. Customers expect immediate response times, or they will quickly move on to another site.
Everything is knowable about your digital customer experience Previously, product teams formed opinions or speculated about their customers’ experience, but now every part of a customer’s journey is knowable with real-time data and metrics. Everybody in the organisation must work from a common dataset; ideally through a common platform that is easy to access and enables the whole team to collaborate to deliver consistency to your customers. [easy-tweet tweet="Everybody in an organisation must work from a common dataset" hashtags="data, tech, cloud, business"] To ensure the effectiveness of this data-centred approach, organisations must start with a clear business goal such as, “What do we want customers to actually do with our product?” From there you may ask basic questions about your site, starting with, “Are we available?” Digging deeper, you need to understand “What are the customers doing?” or “How are they behaving?” with a feature or the entire purchase path. Finally, you have to understand whether any change you make to the product has a positive or negative impact on a customer’s experience. Digital transformation requires an all-encompassing change in mindset across an organisation, regardless of whether you’re a beloved, century-old department store or a startup, what you’re selling, or whether the buyer is a consumer or a business. Every function in some way may have to change the techniques, processes, and underpinning technology used in this new relationship with your customer. By gaining empathy for your customer’s experience based on data, you can deliver exactly what your customers expect, and create greater business value. ### The face of fintech: how nimble fintech companies are transforming the banking customer experience The digital age has redefined how people engage with their bank. A decade or so ago, banking options were limited by proximity and competition was thin. Now, these services are all online and there is such a range of alternative possibilities that the balance of power has shifted. We are now in an era of customer-led banking. Today’s customers have higher expectations and greater choice than ever before. Banks are under increasing pressure to keep up with their more agile fintech counterparts and deliver not just financial services, but an exceptional customer experience too. There’s no time to lose. Research by PwC indicates that fintech companies could control 20 percent of the financial services business by 2020. As a result, 83 percent of institutions believe part of their business is at risk. The fact that 2,000 bank branches have closed over the past decade alone validates this concern. Centuries-old banks may have greater manpower and deeper pockets, but nimble fintechs are able to offer a transparent and highly personalised customer service. Seventy-five percent of banks recognise that the key competitive differentiator is fintech’s increased focus on the customer – yet most have not taken enough action to improve their position. This is a serious oversight and one that fintech companies have capitalised on. The definition of service Fintech companies put their customers front and centre. They realise that the financial services industry isn’t built solely on the delivery of high-quality finance. It’s also heavily dependent on the fast delivery of a high-quality user experience, customer support, and convenience.
Whether applying for a loan or setting up an account, most customers have had frustrating experiences at the bank involving long waits, confusing forms and being put on hold or transferred from one department to the next. Bank products and services are also usually inflexible and subject to complex regulation, forcing the customer to choose what comes closest to their needs – or adjust their requirements accordingly. Rather than go out of their way to find a spot-on solution, banks will invariably respond with ‘no’. Fintechs on the other hand, understand that speed and service is of the essence. At Sonovate, we made responsiveness and personalisation an integral part of our customer service and product offering. Our contract finance enables recruitment agencies to start running a contractor division in just one day. By contrast, it can take up to five years to launch a contractor division using traditional invoice finance from a bank. Tech power Smart use of digital tools has given fintech companies the upper hand in terms of customer experience. Mobile apps and cloud-based platforms make financial products and services easier to manage across multiple channels – to the customer’s advantage. [easy-tweet tweet="The fact that 2000 bank branches have closed over the past decade alone validates this concern." hashtags="Tech, Fintech"] The banks are also investing in these technologies but fintechs have the advantage of being free of legacy technology systems and not burdened by regulation and bureaucracy. Limited online functions and a poor mobile experience are not uncommon, putting banks on the back foot in terms of customer experience. One of technology’s greatest contributions is its transparency. Intuitive mobile apps and user-friendly online platforms allow fintech customers to engage directly with their finances, helping them feel far more in control of their transactions and confident in their decisions. Monzo, a start-up challenger bank, offers a pre-paid card that links to an app on their customers’ phones to give them a real-time overview of all their payments. This kind of real-time log wouldn’t be possible with a legacy bank that doesn’t update the ledger for several days. Financial data can also be presented to customers in a clear, actionable format that helps enhance their understanding. The success of fintechs such as cloud software company Xero, whose tagline is ‘beautiful accounting software’, proves that appearance matters. Customers increasingly expect slick design and seamless user experience, the absence of which can damage credibility. Customer service is, like technology, constantly evolving. The agility of fintech companies gives them a clear, competitive edge over more ponderous banks. However, there is still time for older institutions to shake off their complacency and learn from these dynamic disruptors. Collaboration between banks and established fintechs is also on the rise, providing banks with greater access to new customers. As the market shifts, fintech companies move with it. They are constantly listening and responding to their individual customers’ needs – and making sure that the products and services they offer, are what people want – and need. Featured image credit to Lobster.media ### Mobility Challenges Require a New Model-Driven Application Development Approach According to independent market analysts, nearly two-thirds of organisations have deployed three or more applications. 
The challenge of mobility is no longer “How do I build apps?” but rather, “How do I build apps faster and more efficiently?” Mobile app development teams are facing numerous challenges scaling their teams to handle the ever-increasing backlog of requirements for more apps, faster release cycles, and complex back-end integrations. [easy-tweet tweet="Nearly two-thirds of organisations have deployed three or more applications" hashtags="tech, cloud, apps"] Despite all the focus on mobile and digital transformation, companies are having a hard time delivering on what their businesses are demanding, because mobile application development is still a complex and challenging process. Gartner predicts that by 2017 market demand for mobile app development services will grow at least five times faster than internal IT organisations’ capacity to deliver them. Another survey found that half of the companies surveyed have a backlog of between 10 and 20 apps. Examples of these challenges include porting apps from a single platform to all the major mobile OSs, which can double or triple the cost, with each additional platform port costing an incremental 50 percent to 70 percent. Native mobile app development skills are also expensive and hard to find. Furthermore, HTML5 mobile applications lack native features, suffer from poor performance, and often result in poor user experience and low adoption. Back-end or server-side development can also be critical to app performance, significantly impact end-user experience and adoption, and account for 75 percent to 80 percent of the effort and expense. Much of the complexity of mobile application development results from the fragmented technologies available today. Building your own back-end, using custom frameworks out of niche point products, or leveraging multiple open source tools and plug-ins can increase complexity and costs. These tactics often make sense to get started cheaply, for simple applications, or for extending a web application to mobile devices, but can quickly become a complex custom mobile application development project. All of this leads to a high total cost of ownership that has the potential to grow over time. Therefore, organisations need to find ways to deliver great mobile experiences using predictable and repeatable processes that unblock agile development teams, enable parallel work streams, maximise reuse, and are cost-effective to support long term. This is, of course, not a new problem for large organisations that have been building complex enterprise applications for many years, but the problem is new to mobility projects because it is just now reaching the size and scale that mandates the use of process and methodology to avoid complete chaos. Mobility fundamentally changes the current application model by breaking a business process into numerous small, task-oriented user interactions, sometimes referred to as “mobile moments”, versus traditional monolithic approaches. Developing these applications in a reliable, repeatable, and maintainable fashion does not require the creation of a brand-new approach. The answer already exists and has been proven successful in other areas of software development. [easy-tweet tweet="HTML5 mobile applications lack native features" hashtags="tech, cloud, apps, "] The path to consider is model-driven development: a methodology for the rapid development of applications from software architectural models, leveraging smaller, discrete objects.
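As a purely illustrative sketch of that idea, and not any particular vendor's framework, the snippet below describes a single "mobile moment" as data and lets one generic piece of code interpret it. The screen definition, field names and validator are all hypothetical.

```python
# Illustrative model-driven sketch: the screen is described as data (the "model"),
# and generic code interprets that model. Everything here is hypothetical.

from dataclasses import dataclass


@dataclass
class Field:
    name: str
    label: str
    required: bool = False


# A task-oriented "mobile moment" described declaratively rather than hand-coded.
EXPENSE_CLAIM_SCREEN = [
    Field("amount", "Amount", required=True),
    Field("category", "Category", required=True),
    Field("notes", "Notes"),
]


def validate(screen: list[Field], submission: dict) -> list[str]:
    """Generic validator: one implementation serves every screen model."""
    return [f.label for f in screen if f.required and not submission.get(f.name)]


print(validate(EXPENSE_CLAIM_SCREEN, {"amount": "42.00"}))  # ['Category']
```

Because only the model changes from screen to screen, the same validator (and, in a real tool, the same rendering and back-end binding logic) can be reused across many apps, which is where the productivity gain comes from.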
By using a model-driven approach and combining it with a microservices architectural approach to assemble and deploy self-contained independent components into a complete solution, it is possible to develop applications faster and reuse components across multiple applications. One example of the successful application of model-driven development in other technologies is its adoption by Java-based development teams with frameworks like Spring and Struts. These frameworks use an object-based model to abstract the low-level programming complexities and provide a standardised programming approach so that any developer can instantly understand the application structure and safely make changes to it. The wide adoption of these Java-based model-driven development frameworks demonstrates the efficiency improvements model-driven development can provide. To date, there has not been a similar standardised enabler of model-driven approach for the mobile space. As a result, the mobile application development world is suffering from many of the pain points that the early Java development community experienced: slow and expensive application development, shortage of development skills, low level of code reuse, and high costs for application upgrade and enhancement. By using a model-driven development approach, optimised for the unique challenges and requirements of the mobile space, companies can apply this path to better address the rapid pace of changes required on the front-end mobile app interface while still bridging to less frequent but equally important changes in the mobile back-end and core business infrastructure. ### Converged Services - #CloudTalks with Six Degrees Group’s Ken Moody at Dell In this episode of #CloudTalks at Dell, Compare the Cloud speaks with Cloud Business Development Manager Ken Moody from Six Degrees Group about their converged services. Providing connectivity, from Internet to private circuits, virtual voice services, and cloud resource services. With more of a consultive approach Six Degrees Group can help decide whether to put individual workloads into Azure, AWS or VMWare coming up with a hybrid solution. #CLOUDTALKS To view more videos in the series visit Compare the Cloud: comparethecloud.net/category/watch/cloudtalks/ Visit Six Degrees Group to find out more: 6dg.co.uk/ ### LSE explores growth of ambitious companies in Europe The London Stock Exchange (LSE) Group will today host a roundtable discussion aimed at ‘supporting ambitious companies across Europe towards growth’. A group of high-profile investors and business innovators will gather to speak on a range of topics including the Capital Market Union project run by the European Commission and the LSE’s efforts and initiatives to support growing businesses looking for investment. The European start-up scene has transformed dramatically over the past decade, and it is now home to a significant number of well-known companies. According to an annual report by investment advisors, Clipperton, $12 billion was invested in European start-ups in 2016. FacilityLive, an Italian high tech start-up that is transforming search technology, by creating a unique ‘human language search’ platform, is one of the ELITE Programme companies that has been selected to present at this exclusive event.  They are also the first non UK company to be admitted into the London Stock Exchange (LSE) ELITE Programme in London. 
Founder and CEO Gianpiero Lotito is also the creator of the Small Valleys model, which is producing companies to rival those created in Silicon Valley, and he regularly champions the support of ambitious companies across Europe. He commented: “The attention that a world-leading financial institution, such as the LSE Group, has for the European start-up movement is a concrete sign of the importance of the new industrial generation in Europe. Also there is the need to find financial backers who will support the development of future European enabling platforms and scale-ups. The London Stock Exchange Group provides the perfect financial environment to support companies that can find, in the Small Valley European model for digital ecosystems, the perfect physical environment for their growth.” Europe has the unique opportunity to create an even more successful tech industry. LSE’s campaign to explore the growth of ambitious companies will investigate the measures that need to be taken to ensure that SMEs have access to alternative forms of finance, and how to ensure that investors back EU growth companies. ### How to make big data work for your business IDC predicts that by 2020 about 1.7 MB of new information will be created every second for every human on the planet. From retail to health to travel and automotive, this means industries have no choice but to adopt digital services as soon as possible. [easy-tweet tweet="by 2020 about 1.7 MB of new information will be created every second for every human" hashtags="tech, cloud, information, data"] But as consumers expect more information at their fingertips at such a rapid rate, the pressure increases for organisations to keep pace with a marketplace evolving at a never-ending speed. To meet this demand, traditional businesses must implement new technologies and processes that help streamline their operations, retool their workforces and drive new revenues to help them increase the bottom line. Traditional organisations that successfully implement the required strategies to address the explosive growth of data will manage to shift their businesses to align with the digital economy, soon realising all the upsides. How can the board best prepare for these changes? ‘Digital business’ isn’t just about replacing the old with the new. It is about harnessing technology to enhance every aspect of an organisation, reshaping and redefining businesses from the ground up. That isn’t to say organisations across every vertical aren’t already experiencing the benefits of digital transformation even before redefining their business as such. Cloud computing is making the ownership of enterprise systems simpler and more cost-effective, for example, whilst mobility and BYOD (bring your own device) are empowering the workforce, giving them greater flexibility in how and where they work. [easy-tweet tweet="Cloud computing is making the ownership of enterprise systems simpler and more cost effective" hashtags="cloud, tech, data"] But while these initiatives enable businesses to become more agile, they also bring a new set of challenges: the need for a renewed focus on the integration and coordination of teams, processes and systems that span the whole range of IT management disciplines. For a Chief Information Officer (CIO), for example, these digital transformation initiatives represent more technology to manage, more diverse ways to use it and more complexities in how it is deployed.
This is uncharted territory for many CIOs and offers boundless opportunity for innovation. It’s even led many companies to create a new C-Staff position to focus exclusively on new digital technologies and strategies: the Chief Digital Officer (CDO). For this reason, it’s important for leaders to recognise that to prepare their enterprises for the shift to the digital landscape, they must also take action themselves. CIOs must have a clear vision of the business and view technology as a way of generating revenue and growing the business rather than just treating IT as a cost centre. Meanwhile, Chief Marketing Officers (CMOs) need to gain real-time access to business analytics and uncover trends faster than ever before, and Chief Technology Officers (CTOs) and CDOs must plan to deliver a customer-centric digital technology strategy (one that helps them innovate in an open, collaborative and mobile way). These opportunities and strategies are a springboard for the leadership of any business that’s about to undergo or is currently in the middle of a digital transformation. If business leaders can break free from their ‘comfort zone’ in IT, they can add considerable value by leading innovation throughout the entire organisation. How is big data benefiting the end consumer? Today’s consumers are adopting an entirely new set of behaviours, transacting across both digital platforms and in-store. As a result, there is an increasing amount of data available about their demographics, spending habits, preferences and activity that, when analysed at both a macro and an individual level, leads to meaningful insights. This helps businesses understand their consumers, drive engagement and retain sales. The trend is clearly impacting all levels of business nowadays and, consequently, helping businesses meet customer demands. In fact, a recent report from McKinsey states that companies making extensive use of customer analytics see a 126 percent profit improvement over competitors. [easy-tweet tweet="There is an increasing amount of data available about demographics" hashtags="cloud, tech, data"] A key takeaway for more B2B-focused businesses is that, thanks to these changes in behaviour, a company such as Spotify can now use data from its users to inform everything from advertisements to the creation of personalised recommendations in its weekly playlist. By analysing listeners’ preferences from a higher level, Spotify is able to identify trends and help people find new songs that they will actually enjoy. How is big data changing the way we run our business? Few businesses have been untouched by technology. From how we order a takeaway to how we book a doctor’s appointment or keep track of sales leads, the digitisation of business has led to the creation of new job roles and business models, and the evolution of others. A clear yet niche illustration of this is the case of Red Roof Inn. The hotel chain has proved the value of real-time information for its business by leveraging data such as weather conditions and flight cancellation statistics to target mobile users in regions affected by flight cancellations due to bad weather. Since the launch of this strategy, the business has realised a ten percent increase in revenue in the area where the strategy was deployed.
However, whilst there are countless ways that existing processes can be impacted when incorporating big data into an organisation’s operations, simply extending a traditional business intelligence (BI) approach is likely to fail to yield the insights that big data promises. Some aspects of data analysis, architecture and governance may require an entirely different approach. It is, therefore, important to consider and implement new technologies and processes on an ongoing basis. Moreover, whilst technologies that streamline operations and retool workforces will help make the best use of big data to drive new revenue and increase the bottom line of businesses, this alone is not enough. Equally important is a change in mindset across the board, breaking free from the traditional ‘comfort zone’ in IT. Together, strategic digital technology decisions and the right mindset can help align business objectives with the digital economy, helping any organisation in any industry keep up with the ever-evolving demands of the marketplace. ### Let’s Go Shopping for Our New Cash Management Wardrobe Few things in this world are more modern or faster-changing than the technologies that underlie automated treasury management. Computers and networks are faster and more powerful than ever. Today’s core system may well be tomorrow’s doorstop. Let’s say that your company is ripe for an upgrade to its cash management capabilities. You might be automating for the first time – moving away from manual ledgers and Excel spreadsheets. Or you might anticipate sales growth or corporate expansion, and you want to make sure that your systems and processes will be able to handle it. How can you be sure that your investment in a TMS will give you the best possible return? How can you determine whether a particular vendor or brand is the best fit for you? That’s an appropriate question: “What’s the best fit?” You don’t need a newly minted Ph.D. in engineering to guide you. Rather, you can look back about 40 years to the sage advice of John T. Molloy, the first self-declared “wardrobe engineer,” who wrote the best-selling “Dress for Success.” Why You Buy Clothes for Yourself and Cash Management Systems for Your Company Mr. Molloy’s guiding philosophy about wardrobe selection could just as well be applied to TMS selection. He wrote, “Your clothes should move you up socially and in business, not hold you back…If you are a Wall Street stockbroker, go to work in a conservative, three-piece pinstripe business suit. If you are an art director for a Madison Avenue ad agency, avoid conservative clothing in favour of more flamboyant, ‘with-it’ gear.” In like manner, your new Treasury Management System should move your company up the ladder of business success. It should speed up and simplify your company’s operations while enabling you to meet the requirements of your customers and markets. Your customers and markets are your own. Seek the system that enables you to serve them to the best of your company’s capabilities. So here are some guiding principles that you may keep in mind, whether you’re performing CTR (Clothing and Textiles Research) to go shopping for a BDU (Battle Dress Uniform), or immersed in TWAIN (Technology Without an Interesting Name) in search of a TMS. Have a Plan Before you go out the door, draw up a shopping list. What’s missing from your wardrobe now? Shop strategically. Have a clear idea of what your overall wardrobe should contain.
If you’re in the market for cash management software, you should think about what functions and features you want. Not all offerings do the same things, and some do what they do better than others. Most of them will help you with consolidating, projecting, and banking. Others are better at connectivity, generating reports, and integrating accounting. Maybe you won’t be wearing all of the garments in your wardrobe any time soon. But if you’re sure that you’ll need them at some point during the year, better to get them now. The same goes for the features of a TMS. Seek Compatibility and Interoperability in Clothing and in Cash Management. Think carefully about each piece of clothing. Can you wear it with at least three more garments in your wardrobe? When you’re buying garments, make sure that you can combine each piece with other clothes, so that you can wear it in more than one season and get some usage out of it on more than one type of occasion. Go for solid colours rather than the prints and plaids - they mix-and-match better. The sartorial message here, translated for TMS selection, is to think both in terms of modules and of compatibility with your other systems and processes. Your greatest need right now may be for higher efficiency and cost-savings in your payment processes. You may also be thinking of streamlining your reporting processes after you get the payments flow under control. The reporting that you’ll then focus on will inevitably be better – clearer, more timely, and more complete. Some of the greatest benefits of cash management systems stem from compatibility with other systems and processes. Be sure that the system you choose is fully compatible with your own ERP system. That will mean more automation, less manual data entry and re-entry, and fewer errors. Another sine qua non is the capability of connecting with financial institutions, whether domestic or international. Be sure that your cash management vendor has demonstrated expertise in working with all connection options and data formats. That will enable you to securely access, transmit, and display all your required treasury information. Also take note of your prospective vendor’s experience and track record with partnerships. If you’re thinking of marketing payments functionality to your own customer base, be sure that your provider has a strong track record in white-labeling and in helping partners customise its product for the partners’ own user bases. Style Defines You, Both in Your Mode of Dress and in Your Mode of Business Buy the clothes and styles that will fit the social and business strata where you operate, and where you’ll be traveling. In other words, consider your lifestyle. What clothing do you need that will make you both feel great and look great, so you can do what you usually do well? This piece of advice bears on two major considerations when you’re in the market for a TMS. One of them is internal, and one of them is external. Let’s take the internal one first. What’s your company’s style of operating? Do you have a particular, unique approach that leads you to keep things hands-on and in-house? And if so, do you have varsity-caliber technical staff? Or do you call upon outside expertise for your IT operations? The answer to the above questions will likely lead to whether you’ll be better off with a cloud-based, SaaS Treasury Management System or an installed, in-house one. 
A cloud-based approach means no upfront capital expenditures, a predictable monthly bill, and automatic upgrades and patches. An in-house approach lets your staff tweak and modify the system to your unique specifications – your “secret sauce” – that you’d prefer to keep under wraps. As for the external consideration bearing on the above wardrobe question: what industry segment are you in? Your wardrobe should demonstrate that you are a member of the social and business circles in which you operate. Your cash management system should show that you are a member – or at least an affiliated member by virtue of your associations and track record – of the industry segment(s) that you serve. Concerning those industry segments – your prospective TMS provider should be able to offer you success stories and case studies of their work with companies from the same or a comparable industry or vertical segment. This would show two things: that they have the experience in implementing the software, and that they have been able to address issues and problems that similar clients have grappled with. Don’t Forget Service and Support Make friends with a good tailor. Nothing raises the quality of clothes more than having them altered to suit your shape. So build a relationship with a tailor or dressmaker who knows your body and your taste. If your workforce wears uniforms, be sure that the uniform company has a true tailoring service and not just some local tailor that they engaged in order to get your contract. Like a suit or a dress, a TMS is not likely to fit perfectly when it’s right “off the rack.” As your wardrobe items must be hemmed, taken in, or let out here and there before you head out on the town, your TMS must be expertly installed, tested, and adjusted before launch. Just like your clothing will need cleaning, pressing, and occasional mending, your system will need regular maintenance and upgrades. Make sure that your vendor is adept, prompt, and methodical with its implementations – that the “tailors” of its technical staff are up to the task. Also be certain, by checking references, that your vendor’s track record of post-installation account management is outstanding. Your vendor should have the same outlook as that of Vidal Sassoon, another style maven – but of hair, not of clothing: “If you don’t look good, we don’t look good.” It’s Not What You Buy for Clothes or Invest in for a Cash Management System: It’s How You Use Them. For a final word, we turn again to John Molloy and his advice to women in the workplace. He wrote his first dress-for-success guide back when women were just beginning to enter all areas of the business world in serious numbers. Several years later, he wrote, “A woman's success does not depend entirely or even primarily on how she dresses, but dress is an important factor in most women's careers. Research shows that when a woman dresses for success, it does not guarantee success, but if she dresses poorly or inappropriately, it almost ensures failure." The same observation can be made about your new, automated treasury management system. It can save your company time and money, it can modernise your operations, and it can give you a highly detailed and accurate picture of your company’s liquidity and capital. It’s a powerful asset and a competitive advantage. But, like a good wardrobe, it doesn’t ensure your success. Only you can do that!
### Four reasons why small businesses should move to the cloud Not too long ago, starting a business required setting up an office with landlines, desktops and hard drives. You also had to consider your office’s location carefully and base yourself near your prospective customers. Cloud technology has revolutionised the business landscape, breaking down the constraints of a bricks and mortar office. You and your staff can now access documents, tools, financial data and software online at any time, from anywhere, on any device. Essentially, all a small business needs to get going today is an internet connection. Regardless of where you, your customers or your colleagues are based, cloud computing can be used to ensure that your business runs smoothly. Here are four reasons why the cloud is the perfect platform for small businesses: More money Small businesses don’t have money to throw around. They have to carefully manage their cashflow and save where they can. One benefit of migrating your daily business operations to the cloud is that it can help you save significantly on operating costs. Cloud computing requires less physical server hardware, which means less power usage, IT support, maintenance and upgrade needs. You could reduce your overheads even further and set up your new office away from expensive big cities. Or, you could even do without a fixed office space and work remotely, from home, a café or a shared office space. The Internet has given rise to a global marketplace, which means you no longer have to be located in the same town, or even the same country, as your customers and prospects. You can target your audiences using a range of online channels – without ever meeting them in person. Similarly, gone are the days when you had to visit the local bank branch to apply for funding. You can now access alternative finance providers online, as well as crowdfunding services like Kickstarter to raise the start-up funds you need. Better collaboration and communication Cloud computing makes collaboration between colleagues easy. Your team can seamlessly save and access files, enabling them to work from the same master document without any confusion. Cloud collaboration tools, such as Google Drive and Trello, allow users to upload, edit and comment on documents in real-time – making it easier for people to work together on projects from multiple locations. Plus, these tools enable you to follow what your employees are working on, allowing you to track and manage their individual progress. You can also choose to control access to certain documents and files if you need to, such as confidential employee information. Online platforms like Skype, FaceTime or Google Hangouts make it easier to communicate with anyone, at any time, anywhere in the world. Recruitment processes have been made easier as you can find and even interview candidates online. You can also reach potential funders, business partners, staff, and customers without needing to install landlines or have a meeting room to host them in. Increased flexibility The modern workplace is increasingly mobile and accommodating of more flexible working arrangements. In many instances, your employees don’t have to stick to a prescribed set of work hours or be present in an office. It also means that you can manage your business on the go. Travel time between meetings doesn’t have to be wasted time and can be put to productive use.
[easy-tweet tweet="The modern workplace is increasingly mobile and accommodating of more flexible working arrangements." hashtags="tech, cloud"] This increased mobility and flexibility feeds back into the cost savings benefit of cloud technology. With actual office space and hardware needs reduced to the bare essentials, your overheads will be minimal. Depending on your business, you could go so far as to implement a BYOD (Bring Your Own Device) work culture and get employees to use their own smartphones, laptops and tablets. Greater integration A cloud-based business solution allows you to integrate with other cloud technologies. Your small business can take advantage of a range of specialised services that integrate with back-office operations, from marketing to customer service and accounting. When administrative tasks are integrated and automated, it gives you more time to focus on growing your business and hitting targets. If you want your business to work smarter and faster, migrating your business to the cloud is a wise move. Cloud computing is not the working world of the future, it’s the workplace of today. The practical benefits of improved collaboration and financial management are numerous but ultimately, the cloud will ensure that your small business stays relevant in a fast-moving, competitive world.   ### 2017’s lessons for the public cloud – a lawyer’s response Jon Topper made a strong case for moving to public cloud. “Businesses that are not currently on the public cloud are behind competitors by 3-4 years.”  We are living through an uncertain era. Brexit is creating uncertainty over the UK economy and the pound is under continued pressure. Banks and other employers are threatening to move employees from the UK to other parts of the EU. President Trump seems to favour the UK and has made encouraging gestures on a trade deal but has been lukewarm towards the EU generally. So surely you’d be mad not to take advantage of the benefits of public cloud and DevOps, especially if your competitor has already. Topper also says DevOps is key to making the switch to public cloud and security investment will continue to rise. Even the Government Digital Service has finally come off the fence and declared public cloud secure enough for most of the public sector. Even the regulators are comfortable with cloud. The Financial Conduct Authority, Information Commissioner and the Solicitors Regulation Authority have all issued cloud-friendly papers. As the old adage goes, you should look before you leap and this is true of any leap into public cloud. Some cloud providers I've advised have asked for a tough set of terms with their customers. As it's a commoditized, standardised, low-margin service, they pass the risks to the customer. I also advise potential customers of cloud. By the time they’ve decided to talk to me, they are generally aware of the risks of using public cloud and I know the terms to look at. Here be dragons Public cloud with DevOps has many advantages but can catch out the unwary. Let’s face it, only lawyers read terms and conditions. So if you click on the accept button or sign on the dotted line without reading them, you might not even know the risks you're taking on. Here are a few of the common risks which cloud providers pass to customers: Some public clouds are provided “as is” with no promises over quality or fitness for purpose. The service might suit your needs, but it's up to you to verify it can do what you need. 
There will be no comeback against the provider. If you’re used to the old style of waterfall IT delivery, are you and your CTO ready for the agile, continuous and, frankly, vague nature of DevOps? If your public cloud fails, the most compensation you can get is often service credits, even if you suffer a week-long outage or a complete failure of service. Your data might be stored in the USA. The law allows for international data transfers so that’s not necessarily bad. But if you’ve promised your customers you won’t transfer their data outside the UK/EU, you might be in breach of your contracts with them. You will generally retain ownership of your data, but you might have a limited time to migrate it at the end of the contract. Also, you might have given the cloud provider an unlimited licence to use it. By all means, move to public cloud with DevOps. But make sure it’s fit for your purpose, not just the provider’s. ### Using the cloud for surveillance Cloud computing has long been credited for delivering greater scalability, reduced costs, and easier access to applications. Industry experts have called it the most exciting and disruptive force in the tech market in the last decade and, according to analyst firm Forrester, 2017 will see more enterprises moving to the cloud in a big way. However, while many organisations are fully embracing the cloud, some markets are still lagging far behind. Earlier this year, we explained how the corporate CCTV sector has been reluctant to adopt any sort of cloud model. What we are seeing is the public sector taking steps towards using the cloud, with the Metropolitan Police recently announcing plans to issue body-worn camera systems to over 22,000 frontline officers and use the cloud to store CCTV footage indefinitely. This latest move provides many of the advantages expected from any cloud system, such as lots of storage and scalability. Like any cloud users, the police can store as much visual data in the cloud as they need (provided they are not keeping sensitive personal information for longer than necessary, as directed by the ICO), and they can choose to pay only for what they use. However, the Metropolitan Police is only just dipping its toes in the water. The cloud can do so much more, and it’s not just the emergency services that can benefit. Other organisations such as housing associations and care homes are already using many more advanced features of cloud technology in their visual surveillance. These features include: Motion detection – through APIs, businesses can use the cloud to trigger recordings only when motion is detected, minimising bandwidth use without compromising quality. Some cloud-based systems can be configured to record data and alert their users when triggered from almost any type and combination of external influences or sensors. Federation of data – organisations can dynamically connect a cloud CCTV system to both old analogue cameras and newer 1080p IP cameras. There is no need to rip and replace cameras or cabling. In addition, the system can be integrated with external alarm sensors and device actuators. Advanced security – the latest systems enable existing analogue or digital CCTV cameras to transmit data securely to the cloud via an encrypted tunnel. Once stored, HTTPS/TLS and 256-bit AES encryption ensure the data remains secure. Footage can be viewed by authorised users from any location using their smartphones, tablets or PCs, or downloaded to provide evidence to the police or other authorities if required.
This means there is no longer a need for local or network video recorders. Flexibility – visual data can be viewed in real time or historically, with search criteria and parameters used to reduce the amount of footage that requires reviewing. Furthermore, system features such as data deletion, data forwarding, live view and dynamic camera scheduling can be easily varied per user and recording parameters such as motion detection zones and redacted areas can be easily varied per camera to ensure the system retains its compatibility with purpose. Finally, with any surveillance or cloud system, organisations must adhere to the regulations associated with transferring and storing data. Data protection is a fundamental concern to any business which holds personal information – both written text and CCTV recordings. Breaching the regulations has serious consequences including fines, bad publicity and even criminal sanctions. Organisations must ensure compliance with the Data Protection Act (DPA). This includes only collecting personal data when there is a legitimate reason for doing so (like detecting crime), using and keeping data only to fulfil its purpose, and securely storing recordings in order to prevent unauthorised access and hacking. They also need to prepare for the General Data Protection Regulation (GDPR), an even tighter regulation that comes into effect in May 2018. Serious breaches could see organisations facing fines from the Information Commissioner’s Office (ICO) of up to €20 million or 4% of turnover, whichever is higher, so it is vital to begin implementing GDPR compliant policies and processes now. To summarise, using the cloud for surveillance brings a number of benefits but there are points for organisations to consider, which the ICO explains in detail in its Code of Practice. When correctly implemented, the cloud is an extremely effective and secure method for visual surveillance to protect corporate assets and people. And crucially, it can be used to consolidate CCTV systems across an enterprise and make compliance with the DPA workable and effective. ### Revenue Management for telecommunications companies According to the Communications Fraud Control Association (CFCA), network operators around the world lose around $38 billion annually to fraud and uncollected revenues. Currently the biggest threat to telecommunications service providers comes from loss of termination revenues. Termination revenues come from when a network handles someone else’s call – for example when someone on one network in the UK contacts someone on another – or when you dial abroad. Telecommunications service providers naturally want to have a reward for handing on the call to the final recipient. Annual declines in revenue from termination in many countries are 10% or more. [easy-tweet tweet="Just one SIM in a SIM box can lead to losses of $3000 per month and a SIM box can hold thousands of SIMs" hashtags="security"] Reduction in termination revenues is due to two issues: Regulation, such as that in the EU around roaming costs, which operators can do little about other than lobby Revenue loss due to fraud or unfair competition from other (particularly OTT) service providers, which operators can do something about. 
A bit of background Operators buy and sell international minutes between themselves in deals called swaps, which they try to equate to an agreed value, but often these will be either too big or too small for a particular country, requiring a top-up to make the swap balance (imagine the increase in voice calling to New York after the recent bomb threats). When operators need to top up their swaps, they buy or sell in online exchanges to make good on their commitments. However, when they buy additional bundles of minutes, these can include minutes that are not entirely legitimate. The traditional issue would be something called SIM box fraud: SIM box fraud involves buying thousands of SIM cards, setting them up in a piece of equipment that is connected to the internet and selling the capability to terminate (i.e. connect) calls into a certain country. If the wholesale rate for connecting a call is 10p and assuming the SIM cards in the SIM boxes are able to connect local calls for, say, 5p, the owner of the SIM box can make a margin of 5p for every call he connects. This often happens because SIM cards for consumers will include bundles of free minutes or discounted rates. However, a condition of buying a consumer SIM card is not to use it to terminate calls, which is why this is fraud. Just one SIM in a SIM box can lead to losses of $3000 in revenues per month and a SIM box can hold thousands of SIMs – do the maths. This is where operators traditionally lost revenues to criminal gangs that found it ridiculously easy to get hold of SIM cards, connect them and sell the capacity on the open market. OTT Recently, a new threat has emerged. OTT apps, such as Viber and WhatsApp, have struggled to monetise the often free calling they promote between users. Some have now recognised the opportunity to route telephone calls to handsets via their app, enabling the OTT players to terminate the call in an app and keep the termination fees. OTT apps have been proactively touting the capability to terminate calls to second- and third-tier telecommunications operators. The caller calls someone’s normal mobile number and the recipient of the call ends up receiving the call within an OTT account. That is all well and good, but the caller legitimately called someone else’s mobile. The caller has the right to expect the call to be connected to the other person’s phone, not through their app. Furthermore, call quality can be affected, as might other voice services such as voicemail or caller ID. If you are abroad, you might also be using data to receive the call, which can add up to a large roaming bill that you are not aware of. What this means for operators Telecommunications research carried out by Revector, a mobile anti-fraud specialist, found that 81% of respondents experience SIM box fraud and 73% experience OTT bypass. 90% of those that experience customer complaints say these are directly related to call quality. Some operators are seeing their revenues from call termination drop by 50% due to this activity, and 68% of the respondents are aware that OTT operators are offering termination on their network. This matters because the operators are responsible for creating and maintaining a decent network for the OTT players to use. Part of that “deal” with regulators and Governments around the world is that they receive revenue from termination charges in exchange for maintaining the networks and keeping costs low for business and customers to thrive. With more and more data being used, networks are creaking.
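To put the “do the maths” invitation above into rough numbers, here is a minimal back-of-the-envelope sketch in Python. The 10p and 5p rates come from the example in the article; the monthly traffic per SIM and the number of SIMs per box are purely illustrative assumptions, not figures from Revector’s research.

```python
# Back-of-the-envelope sketch of the SIM box economics described above.
# The per-minute rates come from the example in the text; traffic volume and
# SIM count are illustrative assumptions, not measured figures.
wholesale_rate_per_min = 0.10       # 10p: what legitimate termination would earn the operator
consumer_rate_per_min = 0.05        # 5p: effective per-minute cost on an abused consumer SIM
minutes_per_sim_per_month = 30_000  # hypothetical traffic pushed through a single SIM
sims_in_box = 2_000                 # "a SIM box can hold thousands of SIMs"

fraudster_margin_per_min = wholesale_rate_per_min - consumer_rate_per_min
bypassed_revenue_per_sim = wholesale_rate_per_min * minutes_per_sim_per_month
bypassed_revenue_per_box = bypassed_revenue_per_sim * sims_in_box

print(f"Fraudster margin per minute: {fraudster_margin_per_min:.2f}")
print(f"Termination revenue bypassed per SIM, per month: {bypassed_revenue_per_sim:,.0f}")
print(f"Termination revenue bypassed per fully loaded box: {bypassed_revenue_per_box:,.0f}")
```

Even on assumptions this crude, a fully loaded box bypasses termination revenue on an industrial scale, which is why operators treat SIM box detection as a priority.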
And if they continue to lose revenue to OTT companies that use their networks but do not contribute to the cost, it is a grim future for all of us. Rarely does anything come for free, and the severe reduction in termination charges can only negatively impact the industry as a whole. ### Hewlett Packard Enterprise Announces Global Reseller Agreement with Mesosphere Today, Hewlett Packard Enterprise (HPE) announced that it has signed an original equipment manufacturing (OEM) and reseller agreement with Mesosphere to help customers transform and modernize their data centers with hybrid IT solutions that span traditional infrastructure, private, public and managed cloud services. As part of this OEM and reseller agreement, HPE will be a provider of Mesosphere enterprise solutions worldwide. HPE will build and deliver infrastructure solutions, including server, storage and technology services designed for Mesosphere DC/OS, an open-source platform (built on top of Apache Mesos™) for developing, deploying and elastically scaling the technologies that underpin today’s modern applications. The agreement with Mesosphere is the latest milestone for HPE as the company continues to expand its container leadership to help enterprise customers fully embrace the benefits and flexibility hybrid IT provides, such as optimized performance and faster service time to market. “Today’s enterprises need to engage a lot of users, process real-time data at scale and roll out services fast. That means running distributed systems of containers, big data services and everything else that modern applications require elastically across on-premises IT and in the cloud,” said Peter Schrady, senior vice president and general manager, SMB and Enterprise Server Solutions, HPE. “By combining Mesosphere’s DC/OS technology with our infrastructure and service leadership, we are making it easier for our customers to leverage hybrid IT to run, deploy and manage today’s distributed enterprise environments more efficiently.” Through this reseller agreement, customers get the benefits of cloud native platform services, accelerated by HPE services and powered by HPE infrastructure. As part of the platform, Mesosphere Enterprise DC/OS running on HPE servers and storage enables customers to use a single platform to run container applications, Platform-as-a-Service, Infrastructure-as-a-Service, distributed databases, big data analytics and stateful services. The reseller agreement is the latest chapter in a growing relationship between Mesosphere and HPE. Mesosphere also announced today that it has joined the HPE-led Cloud28+ community as a technology partner. Cloud28+ provides customers and HPE partners a platform from which they can find hybrid IT solutions, share knowledge, and create new connections. In March 2016, Hewlett Packard Pathfinder, the HPE venture investment and partnership program, led Mesosphere’s $73.5 million Series C funding round. “We’re excited to partner with HPE to help its customers around the world adopt modern applications and simplify their IT operations via DC/OS,” said Florian Leibert, co-founder and CEO of Mesosphere. “DC/OS is built around technologies that have automated infrastructure and application development at large web and consumer-technology companies, but is designed for use by all companies from startups to global enterprises.
Our relationship with HPE brings the rich capability and elastic scale of public cloud to our joint customers’ own infrastructure.” HPE and Mesosphere are also working together in a number of other areas designed to help joint customers modernize their IT environments and operations around technologies such as cloud computing, containers and flash storage. Details include: Servers: Initially available on HPE ProLiant servers running Linux operating systems as supported by Mesosphere. Expanding in the future to servers with other operating systems as supported by Mesosphere and certified by HPE. Storage: New reference architectures and platform integration to address the data persistence needs of containerized applications on DC/OS. The solution provides elasticity, superior performance SLAs, high availability and data services required for deploying massively scalable containers on the HPE 3PAR StoreServ All-flash Storage platform and software-defined HPE StoreVirtual Storage. Technology Services and Support: Worldwide support for Mesosphere as well as the underlying infrastructure from one global IT partner, with HPE Foundation Care. Technology Consulting Services: HPE Cloud Native Container Service for Mesosphere is designed to help customers achieve faster time to application service delivery while lowering the costs and challenges of adopting the latest container strategies. The service is delivered in three phases: discovery, design and deployment. Through the HPE Cloud Native Container Service for Mesosphere, HPE can help you achieve: Faster time to containerize applications. ### Businesses have embraced the cloud – now they’re embracing a number of them at a time Ten years ago the concept of ‘cloud technology’ was largely alien to businesses, who up until then had utilised their own data centres for storing and sharing data. Some even said cloud technology might never catch on. Fast forward to the present day and not only are the majority of organisations using the cloud in some form, but many are using multiple cloud services at the same time. A recent study from IDC, surveying 6,100 organisations in 31 countries, showed that ninety-five per cent of respondents have now built an infrastructure that uses “multiple private and public clouds based on economics, location and governance policies”. In simple terms – a growing number of businesses are adopting multi-cloud. What the stats don’t show, however, is that many CIOs and IT leaders aren’t fully aware of this development and struggle to describe what multi-cloud actually is. Some are even unaware that their own organisation is already using multi-cloud. With the ever-changing nature of business, it’s crucial that CIOs and IT leaders understand the multi-cloud services their organisations are utilising, and the benefits they can provide. With any leadership change, the difficulty of inheriting responsibility for these applications, databases and infrastructures increases, but an understanding can minimise this difficulty and expedite business processes. Often different cloud providers and provisioned resources are chosen by the various teams across a business, each to best suit their individual needs. These decisions occasionally happen with little or no input from an organisation’s IT department; this is known as shadow IT. So what does multi-cloud actually mean, why are businesses adopting it, and why do some IT experts not even realise they are using it already?
Multi-cloud: it’s in the name To give ‘multi-cloud’ an exact definition - it is a term which applies to any digital environment where applications are deployed across two or more cloud platforms. These can include any combination of private clouds (whether powered by OpenStack, Microsoft Hyper-V or VMware); public clouds (such as Microsoft Azure, Amazon Web Services or OpenStack); or dedicated servers. Despite becoming an increasingly common architecture, the term ‘multi-cloud’ is often used interchangeably with ‘hybrid cloud’, while in fact they are different entities entirely. Hybrid cloud is actually a specific type of multi-cloud architecture. It usually refers to a digital environment that combines public or private cloud platforms with more-traditional deployment models, such as managed or on-premise hosting, and includes orchestration among the various platforms. Multi-clouds, multi-benefits While by definition, ‘multi-cloud’ introduces more IT complexities through sheer scale, using multiple clouds can offer businesses several significant benefits including: Best-of-breed infrastructure –  A multi-cloud strategy gives the flexibility of being able to select the best-suited cloud service for each department’s workload. This in turn means enabling the business to meet the unique requirements for each specific use case, instead of being constrained by one cloud framework. Reduced risk of vendor lock-in – By investing in multiple cloud providers, businesses have more choice as to where they run their cloud workloads, giving them leverage to minimise price increases and other risks related to sticking with just one vendor. Disaster mitigation – If a business properly utilises multiple clouds, they can minimise the risk of application downtime or widespread data loss as a result of a localised failure. Added geographical data flexibility – The leading cloud providers all have data centres across the globe, however some companies may require that data for specific workloads resides within certain national boundaries. A multi-cloud strategy means businesses can easily meet those requirements, while still engaging with a global cloud platform. Innovation – Organisations that host their legacy applications on older infrastructures can turn to the public cloud for innovation. As these older processes often run at capacity, public cloud can enable innovation without interfering with the day-to-day digital business operations. More clouds, more challenges Despite the numerous benefits that a multi-cloud strategy can bring to a business, it isn’t without its share of difficulties. As mentioned, the chief obstacles introduced by multi-cloud relate to additional complexity by definition of having more than one cloud. These include: Expertise – It's true that learning the ins and outs of the infrastructure and lingo of more than one cloud can be challenging, especially for a smaller company. Meanwhile, bigger companies are faced with tremendous competition to retain the specialised engineers and architects who are versed in each cloud, meaning that even they often struggle to keep the required skills. However, this isn’t a burden that IT departments need to shoulder alone and it doesn't need to get in the way of benefiting from multiple clouds. Integration – It can be challenging to integrate public clouds built with different platforms. For example, Azure is based in Windows, whilst AWS’ code is founded on Linux. 
Administration and Vendor costs – The more clouds a business adopts, the more contracts they would need to manage with their vendors, resulting in increased administrative interfaces, potentially complicated cost tracking and additional billing management. Security – With more than one cloud being employed, a single security solution may not cover them equally, resulting in additional planning around security and governance. Right choice – A business would have to ensure it has chosen the correct cloud providers for the appropriate workload, such as more processing intensive cloud solutions for data analysts. It’s a multi-cloud future Despite the benefits and challenges, multi-cloud is here to stay. The key to navigating this digital environment is making sure that businesses are working with people who have the knowledge and expertise to execute multi-cloud successfully. This could mean employing third party cloud experts, or training existing IT staff with transferable skills that can be adapted to different cloud technologies. Businesses need to ensure they work with people who understand the array of services offered by the leading providers, their strengths and weaknesses and how they map to their specific needs. Although multi-cloud may begin by accident, IT leaders should embrace it sooner rather than later to protect themselves against security risks and access the many advantages.   ### The Financial Sector and the Great Cloud Conundrum I read an interesting article recently that outlined the way in which cloud adoption has changed the business landscape, causing a seismic shift in how organisations operate. Depending on your source UK cloud adoption rates are currently anywhere between 78 percent and 84 percent, and whilst cloud is no longer a new phenomenon, its importance to not only the CIO but also the full c-suite of decision makers such as CEOs, CMOs and CFOs, is paramount as they jostle to gain a competitive advantage over their competitors. It has been argued that cloud adoption heralds the largest disruption in enterprise computing since the advent of the PC, with many industries embracing cloud-based platforms to not only cut costs but also drive efficiency. Despite this, there still remains a certain amount of trepidation from the financial services sector to make the transition and fully embrace cloud and its many advantages. [easy-tweet tweet="It has been argued that cloud adoption heralds the largest disruption in enterprise" hashtags="tech, cloud"] At the mere utterance of the word ‘cloud’ you will hear a plethora of reasons why financial services organisations cannot make the leap. There are the concerns over regulatory compliance as well as the complexity of functional replacement, security and control. But, in an era where financial institutions are more highly regulated than ever before, one may forgive these organisations for a tentative approach to change. To further validate the hesitance, financial services firms are reportedly hit with security incidents 300 percent more frequently than other industries. However, it’s not all doom and gloom for the financial services sector. In mid-2016, following the publishing of the Financial Conduct Authority’s (FCA) final guidance for UK regulated firms outsourcing to the cloud, it was made clear that there is no fundamental reason why financial services firms cannot use public cloud services, so long as they comply with the FCA’s rules.  
This statement and guidance provided will certainly be welcomed by those UK financial institutions that have been hesitant to embrace cloud due to the lack of regulatory certainty over its use. This also serves as good news for the cloud sector too, providing a boost in the uptake of cloud services in the sector. Regulatory compliance and managing cyber risk do not need to be the enemy of innovation. In fact, taking a risk-avoidant approach to experimenting with new business models or user experiences will be a fast path to obscurity in today’s business landscape, where innovation and competition can come from anywhere. Banks and other players in the financial services ecosystem should seek out technologies that meet compliance and security needs but also enable agility and flexibility. Here are three quick benefits that cloud can provide for the financial services sector: Enhanced Security – Contrary to popular belief businesses who take advantage of cloud computing may actually enjoy stronger security than those who try to go it alone, and with an on-premise system. The cloud is certainly more secure than many legacy platforms, so if financial organisations choose the right cloud deployment, they can actually experience a higher level of security than they would via legacy solutions. Reduced Infrastructure – As your financial services firm grows, so does its information technology hardware and software needs. By migrating to the cloud, your company can reduce the amount of infrastructure stored onsite, share liability with qualified technology partners, eliminate much of the hassle associated with procuring hardware and software, and possibly even reduce costs in the process. There is no longer a need to purchase multiple servers and supporting equipment, store it on-site and pay for the space and utilities to support the operation of that infrastructure. Increased Business Agility - Cloud computing brings with it a number of benefits related to agility. First and foremost, cloud computing is built with mobile productivity in mind. Employees need no longer be tethered to their desks. Applications and information can be accessed from virtually any device with Internet connectivity, allowing your staff the access needed to be effective, without being tied to the office. [easy-tweet tweet="Cloud computing brings with it a number of benefits related to agility" hashtags="tech, cloud, "] By embracing cloud computing services, companies in the financial sector are able to add vast efficiency to their operations. As long as the risks can be managed, there are many benefits. Cloud services can eventually help companies enter new markets, benefit from new opportunities, and strengthen their business processes. Further, cloud computing can help financial firms reduce the setup and operating costs related to installing new hardware and software or acquiring storage in the data centre by making the necessary infrastructure resources available. If your financial services firm has been hesitant about a migration to cloud computing, it may be time to reconsider. Enjoy stronger security, lower your maintenance costs and unleash the productivity potential of employees by migrating to the cloud. ### 2017’s lessons for the public cloud Last year saw a real shift in perception around the public cloud in some business sectors. Increasingly, more organisations realise its business benefits.  
Recent figures have contributed to the private versus public cloud debate; Information Week’s State of the Cloud Report showed that 39% of IT managers surveyed expect to get half or more of their IT services from the cloud, nearly double the number two years ago (20% of a similar sample). Organisations in regulated industries in particular have been reluctant to transition to the public cloud, with concerns around security holding them back.  As the public cloud closes the gap on its private counterpart, what will be the key lessons for businesses in 2017? Staying ahead in the technology adoption lifecycle Businesses that are not on the public cloud are already years behind competitors. If we look at the Technology Adoption Lifecycle model, we see initial adoption by "innovators", then the "late majority" getting in on the action. Eventually the "laggards" get there too - when it comes to the public cloud, this is the point we’ve reached today. Typically conservative, late majority and laggard businesses are unlikely to take many risks with their technology strategy, industry best practises being their preferred route.  As more of their industry influencers and peers realise the benefits of the public cloud – flexibility and cost efficiency to name a couple - so others will inevitably follow. Many of the concerns around public cloud were either unfounded, or have been addressed, and the risks of not adopting this technology have become clear, encouraging this slow but certain direction of travel. DevOps is here to stay If the public cloud is on an upwards trajectory, so is DevOps and again, 2016 was a turning point.  The increase in DevOps uptake in 2016 is almost certainly tied up with the adoption of public cloud technologies too, as more CIOs recognised its transformational potential and looked to adopt it within their organisations. DevOps and Cloud are very closely linked, and a Cloud strategy without a DevOps approach will probably fail. As more organisations embrace DevOps practices, so they will inevitably transition to the public cloud.  In the UK, the industry increasingly understands the benefits of adopting DevOps,  and confidence in the public cloud will drive further adoption.  The data centre map is changing The growing number of cloud suppliers building infrastructure in the UK has garnered more interest and confidence from CIOs. For reasons of security and compliance, the prevailing consensus is that data is often better located locally.  To meet demand, we’re seeing cloud providers placing smaller facilities in more countries. As vendors open up their offerings – in the UK and across Europe – so CIOs will have a greater range of choice available to support their cloud strategies. The question will now be how to adapt their on-premise security policies for suitability in this new world. Security investment will rise To date, security has been one key barrier to adoption of the public cloud.  In the same way, it will also be the prime adoption driver as understanding grows and providers address customer concerns; earlier this month the Government Digital Service declared the public cloud secure enough for the “vast majority” of the public sector.  We’re seeing cloud service providers such as Amazon Web Services and Microsoft Azure put significant investment into their security capabilities, boosting customer confidence and trust in the public cloud as a secure hosting option.  
For highly regulated industries like healthcare and life sciences, this has transformative potential. From an information security perspective, the cloud will continue to play an important role, in particular ahead of the impending European General Data Protection Regulation (GDPR). While GDPR won’t be rolled out until 2018, businesses are already looking at how they prepare for this and get systems in place to comply with new requirements. It’s fair to say that businesses that are not currently on the public cloud are behind competitors by 3-4 years. In this climate, no business can afford to miss out on the benefits this cloud architecture offers in terms of cost savings, scalability as well as its ability to commodify common IT tasks.  We expect real change this year, as public cloud solutions continue to dominate the market. ### Happy Birthday Disruptive https://www.youtube.com/watch?v=gRQGgF_dKbo It has been a year since Compare the Cloud launched its spin-off 24 hour online TV channel; Disruptive. What started as an experiment in live streaming using phone cameras, sellotape, iPads and borrowed wifi has become a fully functioning station with live events, camera rigs, production suites, 24 hour streaming and a lot of disruptive technology. https://www.youtube.com/watch?v=VZXCjwa8pvA Making a dent within the video market is difficult. It's crowded. Anyone with a phone and a microphone can now become a roving reporter - and many do. We wanted to try something different - to be the UK's first 24 hour technology TV channel. To produce content people want to see, be involved with and that can hold the viewers interest. From our live events such as the specially constructed Cloud Expo Live, IPExpo Live and IBM Bluemix Garage Relaunch Live to our Women in Technology series featuring some of the most important names in the UK technology space. https://www.youtube.com/watch?v=_mbxC0W6Y1E My highlight of year one has been the hugely successful Disruptive Pitch series - an ongoing reality show where entrepreneurs pitch their disruptive technologies to our panel of judges. The breadth of talent within the UK is astounding and so far this season we have had some amazing individuals and truly disruptive technologies. https://www.youtube.com/watch?v=A6KZ2dGwSnY So what's coming in 2017? We're about to launch our #DisruptiveStudio - a specially built environment for filming and broadcasting live shows, interviews and exciting new content. We're covering more technology events than ever before - including the final of Disruptive Pitch at Cloud Expo 2017. There's more. A lot more. Watch this space... https://www.youtube.com/watch?v=54h3NsJRQDI ### Disruptive Pitch UK - Series 1 - Episode 4 Episode 4 of reality TV technology pitch show; #DisruptivePitch. The UK's most aspiring technology entrepreneurs, startups and emerging companies pitch themselves to our panel of judges for a chance in the final. ### What happened to our efficiency gains tied to our new cloud collaboration software? Business tools have become an extension of our personal lives. We now use software like instant messaging and video conferencing to connect with colleagues both inside and outside of the office, leading to a much more collaborative approach to communication. Gone are the days of email and phone calls, replaced by many enterprises with UCaaS and cloud-based services. When purchasing these systems organisations were told that they would drive efficiency and innovation through instant chat, file sharing and more. 
So why is it, then, that some companies still haven’t really seen any of these efficiency benefits, aside from cost savings? From our work in the sector, we are regularly asked this question. Customers may have read an article about the benefits of collaboration or unified communications, which has led to the purchase of a fancy new piece of software for their tech-savvy workforce to use. A few training sessions from their vendor later and they understandably want to start seeing all of their employees transform into a completely collaborative collective, producing the kind of innovative work that couldn’t possibly have been achieved with emails and shouting across the office. If you go into the acquisition of any communication service expecting the technology to instantly transform the way your colleagues interact with each other, then you’re setting yourself up to be disappointed. Managing the adoption and implementing the changes in how your team works together is the only way you can achieve the efficiency and innovation increases that you read on your tech vendor’s website. Every business is different and you need to know what your own needs and limitations are. It’s important to research your IT estate before you do anything, knowing exactly what you can and cannot change, as well as factoring in processes that are ingrained in the company culture. Thinking about where the potential pain points for your colleagues lie is the key thing to consider here. If you are going to introduce software which is meant to change the way in which they work, it’s better to focus on areas where the change is likely to be met with apathy rather than resistance. For example, many companies look to completely replace internal email with instant messaging tools. While it’s unlikely your staff will care if you do this for work approval, you will come up against some resistance if your HR process currently requires email verification for their annual leave. Once you understand what collaboration features will be accepted by your employees, it then becomes important to plan for what a successful implementation looks like. Setting goals and KPIs related to the usage and capabilities of your new communications system will allow you to see how it is performing (a simple sketch of this kind of tracking follows at the end of this piece). The ability to flag any issues or features which haven’t been used will give you the opportunity to know when and where you need to provide training on using the software. Above all else, you need to be patient and realise that this adoption doesn’t happen overnight. While technology now allows you to quickly migrate to a new system or instantly go live with a new feature, the time it takes for your colleagues to understand and use it is much longer. Even millennials take time to become aware of how the same kind of software they use subconsciously in their personal lives can make their jobs easier. When replacing a legacy communications infrastructure with a new cloud-powered unified communications solution, businesses of all sizes need to know how staff are going to adopt it into their day-to-day working life. When you’ve got a good strategy for how you are going to get everybody using the new system, you’ll soon start to see all of the efficiency and collaboration benefits you read about on the box.
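As a rough illustration of the “goals and KPIs” point made above, the sketch below derives a few simple adoption metrics from a usage export. The log format, feature names and the idea of flagging untouched features for training are hypothetical assumptions, not the reporting API of any particular UC platform.

```python
# Hypothetical sketch: tracking adoption KPIs for a new collaboration platform.
# The log format, metric names and feature list are illustrative assumptions.
from collections import Counter
from datetime import date

# Each record: (user, feature, day_used) exported from the platform's usage report.
usage_log = [
    ("alice", "instant_message", date(2017, 3, 1)),
    ("alice", "file_share",      date(2017, 3, 1)),
    ("bob",   "instant_message", date(2017, 3, 2)),
    ("carol", "video_call",      date(2017, 3, 2)),
]
licensed_users = {"alice", "bob", "carol", "dave", "erin"}

active_users = {user for user, _, _ in usage_log}
adoption_rate = len(active_users) / len(licensed_users)          # share of licences actually used
feature_usage = Counter(feature for _, feature, _ in usage_log)  # which capabilities are being used

print(f"Adoption rate: {adoption_rate:.0%}")
for feature, count in feature_usage.items():
    print(f"{feature}: {count} uses")

# Features nobody has touched are candidates for targeted training or promotion.
all_features = {"instant_message", "file_share", "video_call", "screen_share"}
unused = all_features - set(feature_usage)
print("Needs training/promotion:", ", ".join(sorted(unused)) or "none")
```

Tracking numbers like these month on month is what turns “be patient” into something you can actually manage.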
### Using the cloud to solve PCI DSS compliance headaches The role of the cloud in the enterprise has accelerated significantly over recent years and its business benefits are becoming increasingly difficult to ignore. Cloud-based solutions are now available for a plethora of business issues ranging from storage and archiving through to virtual software and hardware. Not only are they more powerful, flexible and scalable than ever before, but they are often cheaper as well. Even cloud security, perhaps the one lingering argument against moving to the cloud, is more advanced and robust than in many on-premise solutions. It is for reasons such as these that the worldwide public cloud services market grew 17.2 percent in 2016 to total $208.6 billion, up from $178 billion in 2015, according to Gartner. One area of business that can significantly benefit from the switch to a cloud-based approach is that of compliance. As incidents of data theft continue to rise, businesses that take payments either online or over the phone are obligated by law to ensure they offer customers the highest levels of data security possible through compliance with the 12 requirements of the Payment Card Industry Data Security Standard (PCI DSS). However, many often struggle to maintain a robust and fully PCI compliant security solution in-house.  Budgetary constraints, the rapid pace of technology evolution and a lack of internal resources are some of the most commonly cited reasons for this. However, in most instances, issues can be traced back to a bigger problem; the size of their Cardholder Data Environment (CDE) that needs protecting. [easy-tweet tweet="The benefits of moving to the cloud don’t end at reducing the scope of PCI DSS compliance" hashtags="cloud, tech"] PCI DSS compliance applies to an organisation’s entire CDE, which can be loosely broken down into four areas – data capture, data processing, data transmission and data storage. In simple terms, this means how payments are taken from customers (and who takes them), how that payment data is moved around within the organisation, and where it finally comes to rest. Contained within this are all of the physical and virtual components involved in each stage including the network (firewalls, routers etc), all point of sale systems, servers, internal and external applications and third party IT systems. Each of these components contributes to the overall scope of the CDE, which must be protected in full as part of PCI DSS compliance. The larger the scope, the more difficult and potentially expensive compliance becomes. As such, the key for many businesses is to try and reduce the size of their scope. Unfortunately for those who have chosen to take a fully on-premise approach, reducing their CDE scope is extremely difficult, but for those looking to the cloud, there are numerous cost-effective ways in which it can be achieved. Reducing CDE scope using the cloud By outsourcing key aspects of a cardholder data environment to a third party Cloud Service Provider (CSP), organisations can not only significantly streamline their business operations, but they can also pass on the PCI compliance responsibility for that area to the provider as well. A great example of this is the implementation of a cloud-based secure telephone payment solution. 
If an organisation uses a traditional call centre to take and process telephone payments manually, every aspect of that call centre is in scope for PCI DSS, from the telephone agents themselves through to the computers, network and payment systems used. However, if the organisation switches to a cloud-based payment system, all of these aforementioned elements are taken out of the PCI DSS equation immediately. Why? Because at the point where a payment is required, customers are routed through to a secure, cloud-hosted platform where they enter their sensitive information via their telephone keypad. The call centre agents themselves no longer play any part in the collection or processing of the customer’s sensitive data and it never enters the call centre environment. As a result, all of those elements are removed from the scope of the CDE and responsibility for PCI compliance passes to the provider of the cloud payment platform. The benefits of moving to the cloud don’t end at reducing the scope of PCI DSS compliance either. Many cloud service providers now boast data security measures and technology far superior to those available for on-premise solutions, which are updated regularly to ensure the data they contain remains safe, compliant and secure at all times. In a relatively short period of time, cloud-based solutions have gone from a ‘nice to have’ business luxury, to an integral part of any successful operation. For those affected by PCI DSS compliance obligations, the power, security and flexibility offered by many cloud solutions today are impossible to ignore. Maybe it’s time to look to the cloud for their compliance needs? ### Visionaries – Time To Shine When working in technology, you cannot avoid the question of ‘what comes next’ –what is the next big thing that will shake up the industry and how can you make sure you stay ahead of, or even better lead, the changes. From James Watt who improved the steam engine to Steve Jobs who changed the way we see computing (and mobile phones) forever, visionaries have never stopped to push the boundaries of what is possible. Ultimately, going a step further and identifying the changing dynamics that shape the business and tech landscape is what differentiates success from failure. Speed will always be Key It is no secret that today the pace of change is higher than ever and is only set to increase. A great and innovative idea risks becoming ‘yesterday’s news’ in the blink of an eye. Does that mean that all visions are doomed to end up forgotten in the back of a drawer? Certainly not. To be a visionary means not just creating bigger and bolder ideas but also bringing them to fruition faster. The world never lacked great ideas, on the contrary, innovative and even radical ideas have always been in abundance. Consider the case of Yahoo! which was once upon a time a key player in the Internet world. Despite leading the digital revolution back in the day, it was exactly because of how fast that revolution moved on to the next phases and Yahoo!’s inability to adapt to it that meant it was no longer leading but following its competitors. What is needed is to find the right balance between brainstorming big ideas and creating a plan of action. Execution is your secret to success In many ways, we are all visionaries. We are all creative in our own ways and think of ideas that could potentially change (our) lives. The successful execution of these ideas is what actually makes a difference. 
The visionary is a strategist that creates a roadmap for bringing ideas to life and most importantly to market. To do so, the visionary thinks and acts across timescales, learning yesterday’s lessons, acting today and planning for tomorrow. You can never be sure that your neighbour is not developing the next Uber or Deliveroo in their garage. Instead of being afraid of competition, visionaries embrace it and make it part of their strategy. To do that, make sure your vision is adaptive and flexible enough when it comes to both the immediate steps and longer-term plans. Test, test, test A key aspect of execution is testing the vision and evaluating its weak points and its overall success potential. Assessing how operational your project is, can save you time, money and creative energy. For the experiment to succeed, a series of small tests are needed to prove its validity. Based on the results, you can either validate the direction of the project or quickly course-correct as early as possible. Creation is a journey that is far from straightforward. It takes time, effort and many attempts to develop the ‘next killer app’ so don’t be afraid to explore a number of different paths and change direction if needed. Choose your team carefully We often think that visionaries, like leaders, are sitting lonely at the top. This is not necessarily true. Visionaries rely on their teams as much as they rely on their own creativity. A visionary does not simply choose the best talent but leads teams to success. Nurturing a culture of collaboration and the sense of a common purpose can work as a key differentiator that will help teams rise above the crowd. Their task is to once again strike the right balance between maintaining control and empowering their employees and broader ecosystem to take initiative, use their own creativity and drive the vision. This also requires leading by example and instilling a different mindset in the workforce, no matter if that comprises 5 or 500 people. Everyone needs to overcome their fears and be motivated to be a lot more open and prepared for change. Being a visionary was never easy, let alone in today’s fiercely competitive tech world. But to be one means being able to turn things on their head. Competition and constant challenging are in fact your best friends that will push you to outdo yourself and keep your creativity alive and kicking. The tech world will never cease to need as much inspiration and thirst for discovery as it can get. So this is your time to shine. ### Cloud first, then what? Tips for Cloud Planning Cloud discussions are a little like New Year resolutions. They regularly focus our attention on ambitions but, unfortunately, don’t magically eliminate the challenges that stop us achieving them. Many organisations now have a ‘cloud first’ strategy, to prioritise cloud service supply for new solutions. Regular reviews of existing workloads will also be needed to migrate existing solutions. Agreeing on the right cloud strategy may be far from trivial, and tends to be more about transformation than migration. The business case should clarify the key strategic goal between: technical agility, to support more rapid provision and change; commercial agility, from a more modular pay-per-use model; and/or reducing the costs from in-house technology hosting. But increasingly it is the specific migration plans that are proving to be the major headache.  Ensuring minimal impact to live services during a migration is one such challenge. 
The recommended steps below can help navigate the migrations within the cloud journey:
Recognise that, not only will any comprehensive moves to the cloud take significant time to complete, but there will be many solutions that turn out to be better suited for more traditional hosting services. So taking the time to select the most effective hosting partner(s) will never be wasted effort.
Undertake a broad application and service portfolio analysis. This allows you to identify application services that represent the most promising candidates for software-as-a-service (SaaS). It also helps you to: accelerate vital discussions with business users, around what changes will happen when; root out application estates that are ripe for decommissioning, which in turn leads to infrastructure consolidation and cost savings; and ensure rapid business-benefit realisation.
Clarify the interactions between application services. This can be gleaned from the portfolio analysis and is vital from a security, performance and availability perspective but is easily overlooked. Service virtualisation can accelerate and simplify any testing, and also clarify the nature and extent of these interactions.
Assess the underlying platforms (e.g. databases, middleware, storage, messaging, web services or development frameworks) as potential candidates for platform-as-a-service (PaaS). It’s also important at this stage to check the application dependencies to ensure a comprehensive picture of which applications will be impacted by which platform. This may result in modifications (e.g. version upgrades) to the applications themselves, which will require alignment with the relevant technology lifecycle plans. As with SaaS migrations, PaaS migrations will further reduce infrastructure migration requirements and facilitate more decommissioning.
Assess the remaining infrastructure in terms of what needs to be decommissioned, refreshed, consolidated or migrated directly to either hosting or infrastructure-as-a-service (IaaS) offerings. In all cases the migration will be clearer using this approach, and the return on investment fairly rapid.
Initiate the preparatory work streams before the actual migrations take place. This includes: defining a migration framework covering both decision making and standard sequences of events; identifying and defining target reference architectures; assessing the application portfolio, to clarify the SaaS likelihood and platform/infrastructure dependencies, which then facilitates similar assessments on the PaaS and IaaS candidates respectively; building a target operating model, to include service integration and management (SIAM) function with end-to-end capacity planning, service level management and support practices; and aligning the anticipated changes against the existing project portfolio.
Whether the target state is SaaS, PaaS, IaaS or colocation, the migration framework will cover three basic scenarios:
(a) the target state is almost identical to the current state;
(b) the target state is similar, with some modifications or version changes, to the current version; or
(c) the target state is significantly different from the current state and some form of transformation will be undertaken.
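To make the framework above more concrete, here is a minimal sketch of how that first-pass classification might be codified. It is purely illustrative: the attribute names and the decision rules are hypothetical, and any real portfolio analysis would weigh far more factors than these.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    business_value: str        # 'high' | 'medium' | 'low'
    saas_equivalent: bool      # a credible SaaS product already covers this capability
    standard_platform: bool    # runs on a common database/middleware stack suited to PaaS
    target_gap: str            # 'identical' | 'modified' | 'transformed'

def target_model(w: Workload) -> str:
    """First-pass service-model suggestion derived from the portfolio analysis."""
    if w.business_value == "low":
        return "decommission"
    if w.saas_equivalent:
        return "SaaS"
    if w.standard_platform:
        return "PaaS"
    return "IaaS or traditional hosting"

def migration_scenario(w: Workload) -> str:
    """Map the gap between current and target state onto scenarios (a)-(c)."""
    return {
        "identical": "(a) migrate largely as-is",
        "modified": "(b) modify or upgrade, then migrate",
        "transformed": "(c) transform before or during migration",
    }[w.target_gap]

# Hypothetical entry from the application portfolio
crm = Workload("CRM", business_value="high", saas_equivalent=True,
               standard_platform=True, target_gap="modified")
print(target_model(crm), "/", migration_scenario(crm))
```

In practice the value of writing the rules down, even this crudely, is that the portfolio conversation with business users becomes repeatable rather than anecdotal.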
Kick off each of these scenarios, ensuring the following: for (a), complete the required non-functional testing, business acceptance, adoption and decommissioning; for (b), complete the above, after the relevant functional testing; or for (c), complete both of the above, after verifying the target state design. Once these migrations are finished, it’s important to undertake regular reviews of the cloud model supply chain. The same approach will apply: effectively shifting workloads towards SaaS, PaaS or IaaS respectively. At this point the real value, beyond the two primary benefits – i.e. achieving greater agility/modularity and reducing the internal cost base – may reveal itself. Cloud migrations force the focus of the in-house IT function to move past technical support, through service integration, to business information management and, fundamentally, supporting the business users through more effective IT solutions. That, in itself, is a New Year resolution worth striving towards. ### Visualising Malaria in Zambia using Big Data EXASOL, a high-performance in-memory analytic database developer, and PATH, an international nonprofit organization and global leader in health and innovation, today announced a partnership to support the Zambian government’s ambitious campaign to eliminate malaria by 2020. “Data analytics is often discussed as a way for business to derive value from the data they hold, whether that is to increase profitability or serve customers better,” said Aaron Auld, CEO, EXASOL. “But data can also unlock important information that can help organisations such as PATH improve the way they address malaria. This ultimately shows the value of data in saving lives.” EXASOL joins a transformative partnership—Visualize No Malaria—between the Zambian Ministry of Health, PATH, Tableau, and technical partners including Alteryx, Mapbox, DataBlick, Twilio, DigitalGlobe, and Slalom. EXASOL’s contribution—access to the EXASOL database in the cloud on Amazon Web Services—enables the Visualize No Malaria team to perform highly complex queries of not just "big data" but truly "massive data" with speed that enables almost instant rendering, allowing for real-time analysis. Allan Walker, a volunteer with expertise in data analytics and visualisation, is helping PATH’s #visualizenomalaria team create analyses that estimate where malaria cases will be more likely to occur. The analyses aim to find the relationship between the mosquito vector and the human carriers of the disease. The team’s current project involves loading complex geospatial data into the EXASOL database to model geological features in Zambia’s Southern Province, such as elevation and slope, and hydrological features, such as topographic wetness and stream power. This shows whether the land is dry or wet, and whether water is still or moving. The team also regresses time-series models of population density and mobility, and meteorological models of precipitation and temperature, to establish a relationship with the epidemiological data. Once honed, the analyses could be used by Zambian decision-makers to focus on probable malaria outbreak areas and quickly respond to new cases. “EXASOL simply puts the ‘snap’ and ‘zing’ back into Tableau projects, regardless of scale, effortlessly returning queries of billions of rows of data,” Walker said.
“It has back-end database power and speed that Tableau developers require and users in the field will appreciate.” Jeff Bernson, senior director of PATH's Results Management, Measurement and Learning Department, said, “If you’re trying to inspire data use among counterparts and decision-makers, watching a spinning wheel and waiting for dashboards to render can often be a deal breaker. Partnering with EXASOL and Tableau is helping us tackle challenges with data access and speed. It truly aligns with our focus to develop and apply transformative innovation in low-resource settings.” “It has been a great pleasure to support PATH in this fantastic initiative,” Auld continued. “Furthermore, we have an enormous amount of respect for Allan and his team for their dedication and hard work around the program. We are grateful to Jeff and the PATH organisation for their decision to engage with us, and we look forward to continuing to make a contribution towards supporting PATH and the Zambian government in their efforts.” For 40 years, PATH has partnered with the private sector, governments, and civil society institutions to create market-based solutions that change the course of disease, transform health, and save lives. By convening strategic partnerships, PATH touches 150 million lives a year around the globe through projects including this one supporting Zambia’s plan to eliminate malaria. Malaria, a mosquito-borne illness, has seen remarkable reductions worldwide, with mortality rates declining 60 percent since the year 2000. But this preventable disease is still killing too many people in sub-Saharan Africa, where malaria takes the life of a child every two minutes. In Zambia alone, it’s estimated that nearly 3,000 people die from malaria each year. ### How technology can help combat payroll fraud Payroll fraud is a real risk. The Association of Certified Fraud Examiners says that payroll fraud affects 27% of all businesses. The crime manifests in several ways, but is most commonly set up through falsified timesheets and ‘ghost employees’. It’s incredible to think that companies can pay a salary to a non-existent staff member for years without even noticing it – but they do. It’s difficult to prevent payroll fraud altogether, especially if your company still relies on Excel spreadsheets to manage its accounting. However, you can put in place certain measures that will make it harder for employees to steal from your company. New technologies are key here. Modern payroll software is defined by three critical characteristics: automation, workflow and transparency, all of which greatly reduce the room for employee error or theft. [easy-tweet tweet="New technologies can automatically calculate the commission owed to an employee based on various input options. " hashtags="Fintech"] Automation The big benefit of automated payroll software is that your employees are only involved in the process when they need to be. It’s difficult to steal if you don’t have access to the safe. An automated system can manage a range of important business functions without human intervention, minimising fraudulent activities significantly. Employee leave management is a good example. When this function is automated, it can be fed directly into the payroll system. The self-service function enables employees to view their leave balance, submit a leave request or cancel one, with manager and/or further approval. Any changes are immediately updated and accurately reflected on the payroll.
This means that none of your employees will be able to take more leave than they are entitled to without your knowledge. Commission is another weak spot in most businesses’ current payroll systems. The rules vary from company to company and can become quite complex. New technologies can automatically calculate the commission owed to an employee based on various input options. For example, if a salesperson is paid according to the number of units sold, payroll may have to take into account their leave in order to pay an average for the leave days taken. A commission calculator built into your payroll system will ensure that only the correct earned commission can be paid to an employee. Ideally you want your payroll system to integrate with your accounting software. When all the necessary data - such as recurring earnings and deductions - is automatically assigned by employee type or grade for instance, before the first payroll cycle, your company finances are protected from mistakes – whether deliberate or not. An electronic payroll system can even confirm the validity of an employee’s identity. Essentially, each time a new employee’s details are loaded, your software will validate their ID number based on birthday, gender and citizenship using a digit check algorithm. Their bank details will be checked against a bank merchant number, account type and account number. In some cases more in depth verification can be performed with the institutions via integration but this comes at a per verification cost. Transparency With modern payroll software, payroll managers, administrators and employees can access relevant payroll information on any device at any time. Different access levels can also be allocated to different roles to ensure confidentiality and data security. Transparency also allows for an efficient audit trail functionality that can track various access points and see who is doing what at all times. This makes it far more difficult for employees to commit payroll fraud as there’s a greater chance that they’ll be found out. Real-time analysis and insight into what is actually happening in your business is crucial. A fully transparent payroll system with cloud-based analytics enables you to filter data according to various fields, percentage or amount fluctuations compared to a previous period and create daily reports, which means you’re much more likely to see if something untoward is going on in the company. Payroll is one of the biggest expenses in any business. For many companies, employee costs account for over 75% of their operating expenses. There is no doubt that payroll fraud or errors in capturing your payroll data can be costly. A fully integrated, automated and transparent payroll management system can help you track how money moves through your business.  When a door is left wide open, there will always be someone tempted to walk in and take what they can. Unfortunately, there is no magical cure for fraud and dishonesty. However, the right technology can close the door and prevent payroll theft before it happens. Featured image credit to Lobster.media ### How Cloud Technology is Helping Businesses Back Up their Data We live in a world of statistics and data. There is information and disinformation. Facts and post facts. For businesses looking to cut losses and improve margins, there are two pieces of information they should pay close attention to: American businesses lose $37 billion a year because of poor communication between employees. 
A survey of 300 businesses by Intel suggests that American companies lost $2.1 billion worth of laptops and other hardware, with an incalculable amount of data on them, in 2010 alone. What do these facts tell us? They tell us that businesses are losing vast amounts of money because of poor communication and data storage. The 86,000 lost laptops contained sales data, communication information, vital business information, strategic plans, and sensitive and confidential information. This also shows us that those businesses are not storing information in the cloud, but on individual machines, and this is part of the communication gap causing the first problem above. How Cloud Computing Improves Business Communication Access to information and its misuse are two vital areas of loss for businesses. Furthermore, poor data gathering, retention and use hinders performance and can actually hinder the careers of more able employees who do not have the social/verbal gifts needed to get promoted. Cloud storage allows different people at different levels to access the same information pool. This can be organized however the business likes, so that not everyone can access everything, but they can access what is important to them or their team. For example, an automobile retailer can constantly update sales metrics based on live feedback from sales reps in the field. A realtor can receive photos of a new property from a surveyor in the field, and automatically update the site. They can also allow a rep in the field to receive instant information on new properties that might interest a prospective buyer they are giving a tour to. For an ad company, as another example, the art department can upload finished artwork to a communal file which other departments can then instantly use. This improves communication within departments and between them. It allows people in the field to share information with those in offices, and makes remote working more feasible. Furthermore, it allows management to track employee performance more directly by being able to see what they have actually done and are doing in real time. Not only that, but if a good provider is chosen, all files are backed up and encrypted. If the worst happens and hardware is damaged, destroyed, or lost, then all the information required is still there for the business to use. ### Cloud Connectivity and Communication - #CloudTalks with hSo’s Avner Peleg In this episode of #CloudTalks, Compare the Cloud speaks with Avner Peleg from hSo about Connectivity and Communications in the cloud. #CLOUDTALKS hSo is your partner in implementing, deploying and managing IT Cloud & Telecom solutions that adapt to all your needs, whether big or small. We operate our own core low-latency resilient MPLS network and provide the underlying infrastructure to ensure that your employees can access business applications anytime, anywhere. We work with all the major operators and integrate all proven technologies, so that your business always gets the best service at the best value. Visit hSo to find out more: hso.co.uk/ ### Cloud Migration: What To Consider Before Making The Move Companies are migrating to the cloud: As noted by BizTech, cloud spending is up for 2016. However, adoption seems to be slowing, in part because businesses are now employing more “refined” cloud strategies that focus on leveraging the right apps, storage solutions and security tools for the job, rather than overspending on massive cloud sprawl. Bottom line?
If you’re considering cloud migration, it pays to consider the big questions before making a move. Fast or Slow? When it comes to cloud migration, many companies assume that more is better and fast beats slow — after all, if you’re headed for the IT and cultural shift that comes with cloud computing, why do it piecemeal? According to Small Business Trends, however, this strategy often leads to the problem of “too many moving parts,” in turn causing big migration initiatives to pay smaller-than-expected dividends. To avoid this problem, start slow. Adopt cloud services as needed and only scale up as required. [easy-tweet tweet="Moving to the cloud? Ask hard questions to minimise the chances of a maddening migration." hashtags="cloud, tech"] Can You Leverage Legacy? Do you need to toss your existing hardware and infrastructure, or can you make use of this technology in the cloud? While it may not be possible to migrate all apps or link every piece of current hardware to your new cloud ecosystem, keeping at least some of your legacy tech in place gives you a fallback if things don’t work out as planned. Also a good idea? Check your network: If you don’t have the throughput needed, even the best cloud provider won’t be able to improve your overall performance. Fit and Failure? As noted by TechTarget, any move to the cloud comes with a host of third-party tools on deck to help streamline IT and increase overall productivity. Start by identifying “best fit” apps that align with current migration needs and will help smooth your initial transition. Next, take the time to consider worst-case scenarios such as data breaches or network failure. Also, track specific apps or services — such as automatic backups or disaster-recovery-as-a-service add-ons that can help safeguard your business against the unexpected. Security: In or Out? When companies make the move to cloud services, it’s often tempting to bring security along for the ride; after all, many niche cloud security companies can now out-perform local defence solutions. GCN offers some sage advice, however: “It’s your data.” This means that while cloud security experts can help limit the risk and empower your ability to recover stolen or compromised data, the ultimate responsibility lies with your company. As a result, it’s critical to develop security strategies that both embrace cloud technology but don’t ignore the role of in-house safeguards. Where’s the Bottom Line? What will moving to the cloud really cost? While significant spending reductions come with moving workloads off-site, reducing the need for cap-ex payouts, and increasing the amount of free server space you have in-house, it’s easy to end up paying more than you expect for cloud services. Why? As noted by Enterprise Tech, unnecessary workloads for testing and development can ramp up your monthly costs, as can unexpected downtime. Scaling up to meet business demand may also come with a big price tag — here, it’s essential to conduct a thorough analysis of any provider’s SLA to determine what comes standard, what constitutes a separate charge, and how you’ll be compensated in the event of downtime or cloud failure. Moving to the cloud? Ask hard questions to minimise the chances of a maddening migration.   ### IT has a mountain to climb To paraphrase the three rules of mountaineering: It’s always further than it looks. It’s always higher than it looks. And it’s always harder than it looks. 
While navigating a path through today’s technological landscape may not seem as perilous as climbing a mountain, building a data centre network that supports the cloud-driven, digital economy is an undertaking that needs careful planning. Not all networks are made equal and few organisations have a blank canvas on which to design their infrastructure but creaking under the strain of the tremendous workload demanded by today’s digital services, legacy infrastructure can simply end up holding back and hindering many businesses. [easy-tweet tweet="Transforming the network environment often leaves I.T. with a mountain to climb." hashtags="Cloud, Tech, Future"] Often built with overlapping generations of technology ‘sprawl’ and added complexity, traditional networks typically end up constraining businesses with their rigid design, cumbersome operations, misaligned cost structures and inadequate, outdated security capability. In the vast and varied environment of the data centre, transforming the network environment often leaves I.T. with a mountain to climb. Yet the biggest obstacle facing business is an I.T. mindset that maintains the status quo by simply doing the same thing over and over, even though a different and better outcome is needed. That ‘better outcome’ usually involves looking beyond standalone, proprietary technology and instead considering an open, vendor-neutral technical framework where best-of-breed products are developed more quickly, at lower cost and seamlessly integrate with similarly-built, adjacent technologies. CHANGE IS THE NEW NORMAL Set against a disruptive landscape that will move towards business models becoming friction-free and technology automatically adapting to human behaviour - an era we call Digital Cohesion – this will continue to impact and stretch many I.T. resources. Underlying architecture of traditional networks now has to support the heavily virtualised and scalable cloud environments needed for the digital economy and the Digital Cohesion era beyond. The network has to be able to scale, perform tasks faster and, just as importantly, anticipate and adapt to keep pace with ever-increasing bandwidth demands. While ‘going digital’ has many compelling advantages, such dynamic network challenges need to be carefully addressed. Building an automated, open, agile and inherently secure data centre network requires a step-change in capabilities and performance. This is especially true where the infrastructure is spread across multiple, geographically distributed sites or cloud environments, and where the sheer volume and velocity of data needed to support cloud-based services inevitably impacts network resources. SURVIVAL GUIDE By selecting the right architecture and vendor partner, it’s possible for even the smallest business to have a big I.T. solution; one that delivers tangible, impactful business results, especially when trying to rival the ‘born-in-the-cloud’ start-ups and OTT (over the top) content providers blazing a trail across many established markets. Here are my four suggestions when setting off from base-camp and beginning that journey: Virtualise and automate your network: No longer just an operational backbone to any organisation, the ‘smart’ network has to contribute to the business like never before. The adoption of network functions virtualisation (NFV) is an approach that can provide the agility and elasticity needed for driving cloud platforms. 
Instead of running infrastructure that relies solely on managing a range of physical network devices, the virtualised environment can quickly build, adapt or evolve network services using generic, reprogrammable hardware. New network resources can be automatically and rapidly scaled up/scaled down, re-configured and re-deployed with services provisioned at the click of a mouse. Make the network open: As business resources evolve in line with market demand, no single network vendor will have a solution for it all. However, a multi-vendor, open standards strategy is an example of the I.T. community building flexibility and freedom of choice right into the heart of the network, without compromising manageability. Choosing a strategy with no dead ends or lock-ins allows the use and re-use of infrastructure, meaning that as the architecture evolves, any rip and replace of single-vendor, proprietary resources is avoided. An open infrastructure that supports vendor-agnostic technology not only provides best-in-class solutions but also mitigates risk, helps manage contingencies and remains flexible enough to scale with the business. Keep it simple: Although businesses can be complex, networks don’t have to be. But when looking at the proliferation of cloud-based streaming services, BYOD and IoT, networks will need to support millions of new connected devices. Software-defined networking (SDN) provides the means to simplify networking tasks by using software capabilities to automate and orchestrate key network functions, and minimise, or eliminate entirely, many labour-intensive tasks. Without manual intervention, the software-defined infrastructure can also rapidly adapt to the business by deploying new or modified services more efficiently, more quickly and with less risk. Aim for the cloud: Fuelled by the need for business agility, accelerated service delivery and simplified operations, many businesses have already made great strides in cloud migration. Whether public, private or hybrid, cloud platforms represent an effective strategy for achieving business goals by delivering next-generation services incorporating built-in resilience, scalability and automated, on-demand provisioning. Cloud architecture also helps achieve greater levels of efficiency since virtualised environments can create (‘spin up’) new applications in minutes, rather than days or weeks, while evaluating new services becomes lower risk, and lower cost, as large-scale investment and no-turning-back commitment isn’t required. NETWORK ECONOMICS The exponential growth in services-led platforms continues to re-define the I.T. landscape, challenging and disrupting the way businesses have traditionally differentiated themselves. The promise of improved agility and operational efficiency means many now view the cloud as a business enabler where the investment can be precisely aligned for meeting the next wave of business challenges head on. With cloud architecture helping to overcome economic barriers, the network is increasingly viewed as a strategic asset; one that can transform a commercial necessity into a powerful and cost-effective resource for driving value into the heart of the organisation. By planning the journey carefully, an open, multi-vendor network strategy can transform service delivery and provide the means to engage with customers and users like never before. Featured image credit to lobster.media ### Can cloud businesses escape the digital highwaymen? Information and data is power.
It’s why organisations are so desperate to keep their data safe and hackers are so intent on stealing it. But it isn’t just about launching more attacks; cyber criminals are also getting smarter. Radware’s latest Emergency Response Team (ERT) survey found that businesses that are not prepared to defend against the latest attack types are in danger of becoming victims of cyber criminals who target the valuable data that they hold. Ransom attacks have increased. 49% of European businesses cited cyber-ransom as the top motivation behind attacks they suffered in 2016 – an increase of almost 100% from the 25% recorded in 2015. Despite this, less than half of European businesses interviewed said they were well prepared to fight ransom attacks, with a worrying 44% having no cyber security emergency response plan in place. Although most businesses understand that the data they hold is a juicy target for cyber criminals – especially cloud businesses that rely on their service providers to ensure adequate security is in place – a large number appear to be unprepared to defend against these types of attacks. So why is there still such a disconnect between the level of the threat and organisations’ preparedness to defend against it? It’s easy money Although Internet of Things (IoT) botnet attacks, such as Mirai, took the headlines towards the end of 2016, the year should really be remembered for the flood of ransom attacks that hit businesses across the globe. Ransom Denial of Service (RDoS) often involves a small-scale attack on a target’s network to show that the threat is real, followed by a demand to pay up or suffer a much stronger attack. This has proven to be a very effective technique for cyber criminals to make easy money. Some hacking groups have even made DDoS-for-ransom a profession, leveraging a set of network and application attacks. Their success quickly drew followers and copycats joining the ransom party. Today, cloud-based storage and the data contained within it are the primary target of cyber-attacks, rather than Internet pipe saturation or firewall exhaustion as had been the objective in the past. Cloud businesses that hold data such as personal identification, account credentials or even medical records are all at risk. The bottom line: cloud data is the new target. Cyber security reaches tipping point Cyber security attacks and attackers are nothing new. Yet we are at a point in time where we are witnessing dramatic and frightening increases in attack frequency, complexity and size. The hacking community has reached an ideal state in terms of:
the availability of low-cost resources;
a dramatic increase in high-value, increasingly vulnerable targets putting more and more valuable information online; and
a level of maturity where, on top of hacking programs and anonymity, they enjoy services such as hosting and security, and can even leverage public cloud compute power.
Counting the cost Although cyber-ransom attacks make it easier to estimate the financial losses caused by an attack, most businesses (60%) are in the dark when it comes to understanding the actual losses associated with a cyber-attack. Those who do quantify the various aspects of the losses estimate the damage at nearly double the amount compared to those who have no measurable practice of estimation. The cost issue is just one example of the growing gap between the defenders and offenders.
While organisations battle budget, bureaucracy and expertise shortages, hackers are much more agile in developing new attack tools and techniques. The result of the two different cruising speeds is an ever-expanding chasm between businesses and perpetrators. Will the gap ever shrink? Not in the near future. The answer, I’m afraid, will only come after a few years, when the majority of organisations adopt security solutions based on machine learning, behavioural analysis and continuously adaptive models. ### The Biggest Tech Trends of 2017 The annus horribilis of 2016 is thankfully behind us and now all eyes are optimistically turned to 2017. Technologies that were once the stuff of science fiction are fast becoming reality as deep technology makes machines more intelligent and the boundary between the real and the virtual worlds becomes ever less distinct. Here are three of the hottest trends that we’ll see in tech over this coming year. Virtual reality hits the mainstream In 2016 we saw the release of Pokemon Go - the enormously popular Augmented Reality (AR) mobile game. The technology worked by superimposing Pokemon via a smartphone or tablet camera onto the real world. Virtual reality (VR) takes this technology a stage further by creating an entirely virtual world in which almost anything is possible and everything is digital. We have seen VR in sci-fi movies and on TV, but 2017 is the year that the technology goes mainstream. Facebook’s Oculus Rift, Sony’s PlayStation VR, Samsung’s Gear and HTC’s Vive are all examples of early-stage adoptions of the technology and have built the platform for the technology to explode onto the mainstream this year. As the technology matures further over the course of the year, we can expect prices to drop as more competitors enter the market. The technology as it stands currently has a prohibitively high price point, and increased competition in the sector should push prices down to the point where we see VR headsets in most households. VR is more than just a consumer phenomenon; there are strong business use cases for the technology as well. We can expect more and more business applications for VR to crop up over the next 12 months. One of the most interesting areas for expansion is retail - as retailers use VR to show how a watch might look on your wrist or how a new coffee table would look in your living room. There are so many possibilities with this technology that have yet to receive the industry attention and investment they deserve - this will all change in 2017. The sharing economy expands The sharing economy is changing the face of businesses, creating new opportunities for startups and setting up challenges for incumbent market players. It went through a meteoric rise in 2016, with some of the established sharing economy leaders launching new products and developments. Uber’s recent expansion into Bangladesh, for example, adds to the ever-growing list of countries and cities in which the business is now available. Uber even expanded its offerings to food with Uber Eats, following the success of food delivery startups such as London-based Deliveroo. Airbnb launched its new “experience” service called ‘Airbnb Trips,’ which now guides travellers to restaurants, classes and day trips in 12 cities across the globe. But while the popular commercial models of the likes of Airbnb and Uber are likely to persist in the coming year, in 2017 we’ll see the sharing economy develop in many new and exciting ways.
Electrolux, the Swedish appliance maker, is currently exploring the option of starting an “Uber for laundry” service, in which consumers will be able to use their own washing machines to wash other people’s clothes. Expect traditional market sectors and industries to be further challenged as the sharing economy continues to expand into 2017 and do battle with the status quo. Artificial Intelligence (AI) and chatbots One of the most widespread manifestations of artificial intelligence is the chatbot. In the first few months after chatbots were launched on Facebook Messenger, the number of bots on the platform grew exponentially to over 30,000. In 2016 there was significant movement in the chatbot space, with a number of startups building chatbot-based solutions. Chatbots are used in a wide array of industries. For example, Shopify, a popular e-commerce platform used by over 243,000 businesses, acquired a startup called Kit CRM, which enables companies to interact with their customers over text. Scheduling startup Doodle acquired Meekan, an intelligent chatbot that can allow fast scheduling between different calendar service providers. When it comes to chatbots, the more data there is to work from, the better. In the light of this, we might see collaboration between rival companies to pool information in the interests of improving customer experience. We will likely see more initiatives in the same vein as Microsoft’s recent giveaway of a set of 100,000 questions and answers that can be used to improve digital assistants such as Siri or Cortana. 2016 saw its fair share of political turmoil, which traditional polling systems failed to forecast in the case of both Trump and Brexit. Technology, specifically AI, is already more accurate than the pollsters. MogIA correctly predicted the results of the last three elections as well as Trump’s shock victory. There are not many people who would associate AI technology with politics, and this goes to show that AI has almost limitless capabilities. Tech now permeates almost all industries and sectors of society, driving change and innovation. The sharing economy is changing the way we work - individuals are paid for their skills at times that suit them as the traditional work model changes. As deep tech practices continue to develop over 2017, more time will be spent in virtual environments with machines that create, adapt, and populate the virtual world. 2017 will see these virtual environments flourish in both a consumer and business context as the digital economy continues to expand. ### Data provenance, just what the doctor ordered Knowing where your data is has never been more important. A combination of the rising rate of cyber attacks on organisations, the changing political climate and the upcoming introduction of the European General Data Protection Regulation means that cybersecurity is a key focus for organisations. Last year the discovery of data breaches at Yahoo! highlighted the need to properly secure customer data, and made consumers far more aware of the potential for their details to be stolen online. Organisations have a duty of care to ensure that the details of customers are stored securely. Part of this is also the need for organisations to know exactly what their critical data is and where it is, to ensure it is comprehensively protected from threats. This extends to data provenance and knowing the location of your customer’s data and the laws that must be adhered to.
When applying geographical boundaries to digital assets, the physical location of the data and the data centre it resides in are of increasing importance. [easy-tweet tweet="Being able to tell your customers exactly where their data is stored is hugely important." hashtags="Data, Security"] In November last year, Amazon announced the opening of its data centres in the UK, which, for the first time, would allow UK businesses to store data locally on its AWS infrastructure. It was an important announcement, not least because of the popularity of Amazon’s cloud service, but because it would enable AWS customers to inform their clients of the provenance of their data with confidence. It's not just about compliance; companies and individuals are increasingly aware of the importance of data provenance. Where sensitive customer, client or partner data is kept, and by whom, is becoming a service differentiator for all verticals. As IoT and digital transformation progress, it will become more so. It may be that data is perfectly safe held on a server in the US, for example, but UK consumers tend to feel safer if their data is kept in the UK. This is especially true of the public sector and the NHS, whose customers are citizens who need to be protected. A good example of how data security and compliance can drive innovation and new business opportunities is London-based start-up Echo, one of the first users of Amazon’s new British data centres. The company has developed a smartphone app that takes much of the chore out of repeat prescriptions. Users can manage their NHS prescriptions via their iPhone (and soon Android device) and have them delivered directly to their door. According to the company, the app can convert a doctor’s orders into alerts and reminders, helping patients to better manage their medications. It’s especially useful for those on repeat prescriptions, and the elderly and housebound. It goes without saying that Echo has been given access to some of the most sensitive data in the country, and is NHS approved. Its data assurance will have to be top notch. While being an excellent example of how young digital entrepreneurs can help the overstretched NHS, and improve patient experience at the same time, its future success depends almost entirely on how the highly sensitive medical data it has been entrusted with is handled. By using Amazon’s UK-based AWS servers, the NHS at least knows that patient data remains in the UK and is traceable. Its security is entrusted to Amazon’s own technology. This is a lesson for all of us. In a world of subcontracting and security as a service, the integrity of the cloud or service you use is important. Make sure you ask where your data is being stored, who by and with what. Being able to tell your customers exactly where their data is stored is hugely important. In the uncertainty of a post-Brexit UK, full data provenance will be demanded by your clients. You need to be in a position to fulfil this with absolute certainty, or risk losing their business. And ahead of the EU’s General Data Protection Regulation coming into force in 2018, businesses need to use this year to get their security in order, or risk hefty fines if breaches occur.
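For teams building on AWS, the residency question can at least be made verifiable in code rather than taken on trust. The snippet below is a minimal sketch, not Echo's or the NHS's actual implementation: it assumes the standard boto3 SDK, uses a hypothetical bucket name, and simply pins a bucket to eu-west-2 (AWS's London region) and then confirms where it lives.

```python
import boto3

# Work explicitly in AWS's London region so data stays in the UK
s3 = boto3.client("s3", region_name="eu-west-2")

BUCKET = "example-patient-data-uk"  # hypothetical bucket name

# Create the bucket pinned to eu-west-2; objects stored here reside in London
s3.create_bucket(
    Bucket=BUCKET,
    CreateBucketConfiguration={"LocationConstraint": "eu-west-2"},
)

# Later, or in a scheduled audit job: verify where the bucket actually lives
location = s3.get_bucket_location(Bucket=BUCKET)["LocationConstraint"]
assert location == "eu-west-2", f"Data residency check failed: bucket is in {location}"
```

A periodic check along the lines of the last two lines turns "where is our data?" from a trust question into an auditable one.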
### The Node.js Foundation Partners with The Linux Foundation on New Node.js Certification Program The Node.js Foundation, a community-led and industry-backed consortium to advance the development of the Node.js platform, today announced development of the inaugural Node.js Certified Developer program, aimed at establishing a baseline competency in Node.js. Node.js is one of the most popular development platforms, with more than 4.5 million active users per month. While Node.js is increasingly pervasive with enterprises of all sizes today, organisations that are eager to expand their use of Node.js often struggle when retraining Java developers and recruiting new talent. “The Node.js Foundation, with help from incredible community members and core experts, is creating a comprehensive certification program that broadens the funnel of skilled Node.js expertise available. Whether working in enterprise environments or as individual consultants, those who become Node.js Certified Developers will be well-positioned to hit the ground running as a Node.js developer, possessing skills that are in high demand,” said Tracy Hinds, education community manager for the Node.js Foundation. The Node.js Certified Developer program, which is being developed with input from leading Node.js experts and contributors, is expected to be available in Q2 of 2017. The program will provide a framework for general Node.js competency, helping enterprises quickly identify qualified Node.js engineers, while providing developers, contractors and consultants with a way to differentiate themselves in the market. The Node.js Foundation is working closely with The Linux Foundation to create the blueprint and process for administering the program. The Linux Foundation offers a neutral home for running training and certification programs, thanks to its close involvement with the open source community. It offers several online courses and certifications, including Kubernetes Fundamentals, Linux Foundation Certified System Administrator (LFCS) and Linux Foundation Certified Engineer (LFCE), among many others. Ideal Node.js Certified Developer candidates are early intermediate-level developers who can already work proficiently in JavaScript with the Node.js platform. Pricing for the self-paced, online exam is still to be determined. It will be delivered online, so users can take it from anywhere, 24/7. Currently the Node.js Foundation is working with the community to determine specific questions that will be asked on the exam. To contribute to the Node.js Foundation Certification Development Item Writing Workshop Sessions, fill out this application. Exam topics will be published publicly, as will resources to help prepare for the certification, allowing others to leverage the source materials for their own Node.js learning. ### New Kubernetes Fundamentals Course Now Available From The Linux Foundation The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, today announced the availability of a new training course, LFS258 - Kubernetes Fundamentals. LFS258, developed in conjunction with the Cloud Native Computing Foundation, is geared towards developers and administrators who want to get started with Kubernetes. It teaches students how to use Kubernetes to manage their application infrastructure. Topics covered include Kubernetes architecture, deployment, how to access the cluster, tips and tricks, ConfigMaps and more.
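For a flavour of the day-to-day tasks such a course covers, such as accessing a cluster and reading configuration, the sketch below uses the official Kubernetes Python client. This is only an illustration under assumed names (the "app-config" ConfigMap and "default" namespace are hypothetical), and the course itself may well work through kubectl rather than this client.

```python
from kubernetes import client, config

# Authenticate using the local kubeconfig, in the same way kubectl does
config.load_kube_config()
v1 = client.CoreV1Api()

# List pods in a namespace to confirm cluster access
for pod in v1.list_namespaced_pod(namespace="default").items:
    print(pod.metadata.name, pod.status.phase)

# Read a ConfigMap (hypothetical name) that an application consumes
cm = v1.read_namespaced_config_map("app-config", "default")
print(cm.data)
```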
This self-paced, online course provides the fundamentals needed to understand Kubernetes and get up to speed quickly, so students can start building distributed applications that will scale, be fault-tolerant and simple to manage. Kubernetes Fundamentals will distill key principles, such as pods, deployments, replica sets and services, and give students enough information to start using Kubernetes on their own. The course is designed to work with a wide range of Linux distributions, so students will be able to apply the concepts learned regardless of their distribution of choice. Kubernetes is quickly becoming the de facto standard for operating containerized applications at scale in the data centre. It builds on 15 years of Google’s experience managing containerized applications. With a growing open-source community, Kubernetes is poised to change the way applications are built and managed, as well as change the role of system administrators. In the 2016 Open Source Jobs Report from The Linux Foundation and Dice, open source professionals reported that containers are the second most important skill, behind only cloud technologies. There is, though, a lack of formal training options for those wishing to get started with Kubernetes, which LFS258 aims to address. “Kubernetes is rapidly maturing in development tests and trials and within production settings, where its use has nearly tripled in the last eight months,” said Dan Kohn, executive director, the Cloud Native Computing Foundation. “We are thrilled to partner with The Linux Foundation to provide a program that will train, certify and promote Kubernetes Managed Service Providers (KMSP). This program will ensure enterprises receive the support they’re looking for to get up to speed and roll out new applications more quickly and more efficiently.” LFS258 will also help prepare those planning to take the Kubernetes certification exam, which will be launching later this year. Updates to the course, specifically designed to assist with preparation for the exam, are planned ahead of the certification exam launch. The course, which has been available for pre-registration since November, is available to begin immediately. The $199 course fee provides unlimited access to all content and labs for one year. Interested individuals may enroll here. ### Have online payment systems fallen behind the times in 2017? We’ve entered 2017; driverless cars are on the horizon, VR and AI are expected to go from strength to strength, and black market criminals continue to use tighter security than global payment systems. The latter may come as some surprise on the surface. Yet, whilst many banks require just a password to authorise transactions, two-factor authentication (2FA) is now a standard practice on the ‘dark web.’ This is a worrying enough premise, especially when we consider the fact that, despite decades of discouragement, “123456”, “password” and “12345678” are still the most popular consumer passwords in use (according to SplashData’s annual list). [easy-tweet tweet="even if you set a complex password, and regularly change this, hackers can still intercept them using malware." hashtags="#Security #Password"] But here’s what consumers don’t realise: even if you set complex passwords, and regularly change them, hackers can still intercept them using malware. These passwords are then instantly made available on massive online databases. For determined criminals, passwords are no more a barrier than a padlock on a suitcase.
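The gap is not a technological one: the one-time codes behind the two-factor authentication that the rest of this article argues for take only a few lines to generate. Below is a minimal sketch of the time-based one-time password (TOTP) scheme used by most authenticator apps, following the RFC 6238 HMAC-SHA1 variant; it is a generic illustration rather than any particular bank's or provider's implementation, and the secret shown is a placeholder.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Time-based one-time password (RFC 6238, HMAC-SHA1 variant)."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval           # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                        # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

# Placeholder shared secret; in practice it is provisioned once, e.g. via a QR code
print(totp("JBSWY3DPEHPK3PXP"))
```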
What can be done? For a start, 2FA of any kind will markedly improve upon having just a password, because it requires users to provide twice the information (factors) for verification. Typically, the two factors are a password and a one-time code sent by SMS or email. Sometimes, a push notification, key fob, or fingerprint scan serves as the second factor. Anything to do with payment transactions should logically be the last place where a username and password alone are enough to facilitate a transaction. Sadly, that’s not the case. Payment systems without two-factor authentication allow hackers to easily change the notification settings and transfer controls before filling their pockets. By the time the account holder finally detects the fraudulent activity, days or even weeks may have passed. 2FA solutions would help to deflect more attacks, and properly implemented cloud-based solutions would even be able to alert the account holder of any suspicious activity. As has proven to be true for consumer payment systems, many business payment systems also remain equally unprotected. Payroll systems often permit wire transfers with no more than a password. In a large enterprise, a hacker could easily add a fake payee to the payroll or accounts payable with little trouble and could then direct this money into an illicit account. 2FA would make this kind of action much more difficult, if not altogether impossible. As we head into 2017, there are positive signs. Interbank payment systems are continuing to become more secure and 2FA continues to become a more common practice for a broad spectrum of companies. But there is still a long way to go before the threat of online cybercrime subsides. In fact, as recent security breaches at the likes of Tesco and Three Mobile have highlighted, cybercrime continues to evolve and dominate headlines. SWIFT, the international payment network, relies on public key infrastructure (PKI) and hardware 2FA to start a terminal session. In 2017, this represents the absolute minimum level of security that a bank payment system should look to implement. As in the case of SWIFT, this level of security means that individual transactions merely require the equivalent of a password, leaving active sessions vulnerable to remote access or abuse, as we witnessed in the 2016 Bangladesh Bank Heist. Now compare SWIFT to the cryptocurrency space. Bitcoin is not regulated or protected by traditional fraud insurance, and yet Coinbase, the global market leader in buying and selling Bitcoin, mandates 2FA for its nearly 3 million users to protect their accounts. In 2017, companies involved in online payments need to ensure that they are sufficiently protecting their customers. Without two-factor authentication, it’s not a question of if, but when hackers will break through. Featured image credit to Lobster.media ### How Cloud-Based Software Improves Business Performance Management The modern business world is witnessing some ground-breaking changes. From the way businesspeople organise their work, to hiring new staff and communication, everything is very different from how it was twenty years ago. Adapting to this new situation and implementing innovative solutions to everyday work tasks is no cakewalk. Entire systems need to be altered in order to make those new things work. However, if you use cloud-based software, it all becomes much easier and faster.
In close touch with your employees Employers have always had problems with measuring and tracking their employees’ productivity. However, with modern progress-tracking solutions that are often used directly from the cloud, you can learn more about your workers. They should be asked to sign in at the beginning of the work day and sign out before leaving the office. Every single task, email or website they visit within that period can be obtained through those tracking solutions. The greatest benefits you’ll get from this innovation in the work organisation is that your employees will be rewarded in accordance with their objective results. Moreover, the entire worker monitoring process will be automated, which will give you more time to commit to your business tasks. Following the competition Every business owner who wants to achieve their business goals needs to gather information about their rivals. Luckily, today you don’t have to send corporate moles on special missions. It’s enough only to follow the social media pages of your competition and you’ll already have loads of information about their activities. What’s more, if you go for some cutting-edge tools for collecting details on your rivals’ activities, you’ll get resourceful input for making new strategies for your enterprise. For instance, you can get regular reports on their most frequently used keywords, social media posts, engagement with their followers and many other features. Of course, they can use the same tools to track you, so the future will bring interesting tracking clashes. [easy-tweet tweet="Every business owner who wants to achieve their business goals needs to gather information about their rivals." hashtags="cloud, tech "] Organisational cloud benefits The amount of time your employees spend working on different tasks is only one of numerous features that affect your work organisation. Many cloud-based solutions can help you improve them. For instance, if you’re going to organize a charity event or a business conference, you can sell tickets online through a WordPress event ticketing system.   Apart from that, you can reduce the number of software tools you purchase for your business needs. Today it’s possible to utilize software from the cloud, without downloading those tools. This is known as software as a service (SaaS) and is offered by numerous cloud providers. Business owners can make significant savings if they start using those services. Finally, your employees can create and change documents in no time if they use cloud-based software. Since those tools sync and memorize every single change, people from different countries can now work together and yield great results.   Vast storage space The advent of the cloud has introduced innovative strategies for storing business and private data. Nowadays, many businesses are moving their documents and other materials to their cloud accounts. This option offers several benefits. First of all, you don’t have to get new hard drives every now and then. You simply increase the amount of space you need in the cloud and keep storing your data there. Secondly, you can relax when it comes to backup plans. Today, every reliable cloud provider has a backup plan. Hence, you don’t have to make several copies of your data in the office. What’s more, you can even find providers specialized in backup, that can come in handy for websites and blogs. 
Finally, you can adjust the online storage space you use to your current needs and not waste a single cent on something you don’t need. If you treat the time spent at work as potential energy, then its kinetic counterpart is money. The more business potentials you manage to transform into assets, the more successful your business will be. Cloud-based software options will enable you to waste less time and improve your overall business management. Due to rapid development of online business demands, it will be of crucial importance in years to come. ### Best of both worlds: How SMBs can benefit from hybrid cloud A recent report by BCC Research predicts the global UC&C market will reach $62 billion by 2020. Hybrid UC&C is the fastest growing segment in this market as a growing number of businesses demand more efficient, real-time communications to streamline the exchange of information between employees, business partners and customers. But what is the best approach for SMBs that want to benefit from modern communications? Manish Sablok, Head of Field Marketing, NWE at ALE, explains how the hybrid model is fast becoming the ideal solution for SMBs. The need to deliver hybrid UC&C is heating up, with 54 percent of businesses set to take the hybrid approach in the next two years. By integrating several different communication components with on-premise hardware, such as instant messaging, document sharing or audio/video conferencing, businesses are able to facilitate all sorts of real-time collaboration, straight away. [easy-tweet tweet="Hybrid cloud adoption is the ideal solution for SMBs looking to on-board cloud services while protecting existing investment. " hashtags="Cloud, security "] Large enterprises are leading the chase when it comes to UC&C, because enterprises were early adopters of premise-based UC&C solutions. But as the new models evolved and matured to deliver complete communications in ways that makes sense to SMBs, there are now big opportunities for these companies to join the 'cloud crowd'. Protect what you've got and build for the future Hybrid cloud adoption is the ideal solution for SMBs looking to on-board cloud services while protecting existing investment. There is no need to rush ahead and completely migrate across to full cloud adoption. A hybrid approach allows businesses to realise optimal value from existing hardware at the same time as introducing the aspects of cloud-based communications. These cloud services can be 'bolted' on a pay-per-user basis, enabling a much more flexible approach that allows smaller businesses to only pick the services that will help meet their needs. By adopting a hybrid model, SMBs can start to close the 'cloud gap' that separates the have and have nots. Any further transition to a full cloud solution is far easier than starting from scratch as it has already reduced the need for on-premise physical infrastructure. Stay in step with a flexible workforce With data and business applications accessible outside of the workplace, the rise in popularity of the mobile workforce has led to a complete rethink of how employees work together. In modern business, it's becoming increasingly common for employees to work from home or on the move - collaborating with colleagues across different time zones. Add 'hot desking' into the mix, where each employee has their own laptop or mobile device rather than a PC at a fixed workstation, and you have a complex recipe in trying to provide one single point of collaboration. 
Traditional SMB communication solutions haven’t offered the scalability and agility smaller organisations rely on, and which will allow them to grow and expand in the future. This can make for large investments when systems and platforms have to be completely overhauled in order to keep up with growing businesses - often as a result of a lack of flexible, cost-effective cloud models that have hampered the ability for SMBs to adapt their current platforms. Controlling cost of ownership In the past, communication platforms for SMBs would charge different licensing costs for different devices, meaning cost of ownership would rise, SMBs couldn't have the flexibility to buy common licenses for multiple devices. Hybrid and cloud solutions help address this challenge, enabling SMBs to take full advantage of flexible work options without CAPEX investment. Hybrid and cloud communication providers should be able to tailor platforms to SMB needs and simplify the licensing of new users and devices. Through a cost-effective 'pay-per-license' model you eliminate the previous costly requirement to pay for separate licenses for different devices. This model also provides the ability to rapidly on-board new employees and their mobile devices without any significant administrative effort or major infrastructural costs. Shifting the burden of security and support The impact of downtime can be damaging and an unwanted expense that few SMBs can afford. One of the great benefits of hybrid cloud is overall control stays in-house, but it removes the management and maintenance burden facing SMBs - many of which lack the IT skills available to larger enterprises. Instead the solution provider is tasked with maximising uptime. So for SMBs wanting to access cloud services, it is important to make sure any hybrid UC&C solution can be supported by on-demand remote management. Remote management is a growing trend for hybrid and cloud-based solutions, in which solution providers can remotely access the network to provide support and troubleshooting, ensuring maximum uptime for SMBs. Hardware and software is updated remotely to eliminate maintenance delays and ensure security from external threats. With multi-year support contracts, SMBs routinely benefit from major security and functionality updates to protect their business. A clear road ahead For many SMBs, the cloud offered the opportunity to get hands-on with the latest business tools, but couldn't always offer them a way to maximise the value of existing investments. Hybrid cloud is the perfect approach to take, allowing SMBs to adopt cloud services, support enterprise collaboration tools and maintain on-premise systems for when more control is needed. Featured image credit to Lobster.media ### Rumours of a Dystopian Future are Greatly Exaggerated Perusing recent technology predictions has been interesting. The more I read, the more I was struck by a rather uncomfortable theme – at least a theme that I think would be unpopular if these articles were widely read outside of the IT industry. Future-gazing pieces that, to technologists, are about the increasing use of computing to take over mundane tasks hence freeing humans to focus on innovation, creativity and collaboration, could easily be read as some kind of precursor to the Terminator movies. I was struck by how some predictions could seem alarmingly like the march of the machines, with computers taking over humanity… [easy-tweet tweet="By 2018, 20 percent of business content will be authored by machines." 
hashtags="cloud, AI"] The difference between seeing these predictions as benign and seeing them as echoes of some dystopian future boils down to your view of technology. But before I explain, it’s worth looking at four of the technology predictions that made me sit up and take notice: 1.     By 2018, 20 percent of business content will be authored by machines. The idea is that automated composition machines can produce analytical information in written natural language, which is potentially useful for shareholder reports and budgets. I’ll admit my first reaction was to chuckle at the potential for ‘malgorithms’, left unattended to create amusingly inappropriate documents, or simply very long and boring ones (It was Sir Winston Churchill who once observed “This document’s length precludes it ever being read…”). On a more serious level, if this kind of technology makes it easier to produce high quality content by freeing up humans to focus their creativity on the most important parts, bring it on. 2.     By 2018, more than 3 million workers globally will be supervised by a “robo-boss.”  According to Gartner robots will begin to make decisions that previously would have been made by human managers. This one really does give pause for thought – the idea of being given ‘orders’ by a robot doesn’t sit well at all. But then, it’s also important to consider exactly what kind of decisions these robots are likely to take. They are unlikely to be conducting one on one, 360-degree reviews any time soon – there is simply no substitute for human interaction when it comes to leadership and motivation. More likely, they will take on and automate ‘bulk’ management tasks that require little in the way of personal intervention, nuanced management or a personal touch. 3.     By year-end 2018, 20 percent of smart buildings will have suffered from digital vandalism. This prediction focuses on events that might, for instance, plunge a building into darkness, disarm fire prevention systems or deface digital signage – and it’s only fair to accept this is a genuine risk. Like all emerging, internet-connected technologies, smart building solutions do represent potential security vulnerabilities. To date, however, I am not aware of any real life, or significant incidents of this type. I am sure having manual fire alarms set off maliciously or in error is still a much more significant issue, for instance. That said, there is no doubt that, as with all IoT technologies, security will be an ever-present, significant consideration. 4.     By 2018, two million employees will be required to wear health and fitness tracking devices as a condition of employment. On the face of it this sounds like a major violation of an individual’s privacy. But again, the reality depends on application. If these sensors tell us that an airline pilot is about to have a heart attack, or that a military recruit is being pushed too far, they could save lives. In truth, there are a great many potential applications where the use of health monitors seems perfectly sensible. So yes, whether you view these technology predictions as benign or potentially malevolent boils down to how you view technology, and how you view its potential application. In the end, technology should be enabling, not self-serving – in the same way that today’s technology solutions should put users at their core, rather than allowing their needs to be relegated to the status of near afterthought. This applies to all these apparently dystopian predictions. 
As long as these technologies are developed and deployed with the desire to improve the human experience, to enable people rather than replace or subjugate them, then there will be nothing dystopian about them at all. Featured image credit to Lobster.media ### Three New Year’s Security Resolutions When it comes to setting New Year’s resolutions, most people are a little over-ambitious. We give up carbs, go running every morning, become vegan or even give up alcohol or stop smoking. Inevitably, a few weeks later, we find ourselves right back where we started. As security professionals, responsible for keeping the bad guys out and reducing the risk of data breaches, we find ourselves right back where we started too: we set lofty goals and unrealistic expectations, focus on the wrong things, fail to genuinely improve our security posture, and then wonder why. Next year we will see more breaches, more companies moving to the cloud, more usage of mobile and more IT budget spent on security. You have heard all of this already, so let’s try to make an impact and improve our overall security posture at home and at work. 2016 was a challenging year — from a security perspective, there were too many notable breaches, topped off with Yahoo. Here are three New Year’s resolutions for 2017 — one focused on mindset, one on implementing something simple at home and at work, and one that is a question you should ask the CISO every month in 2017 until you get a great answer. 1. Mindset — Rethink Security We all need to think differently in all aspects of life, but here it’s about thinking differently about security – think about identity. Why? [easy-tweet tweet="Act now and make 2017 the Year for two–factor authentication!" hashtags="Future, security, IaaS"] The status quo today is that your apps are everywhere — in the data centre, as SaaS and mobile apps. Infrastructure is everywhere too — in your data centre, on virtual servers and with IaaS providers like AWS. Users who access your data are everywhere as well: in the office, on the road, as third parties and partners. So, with this wide net of interconnected elements, where do you start? You need to start by thinking differently. Imagine your internal network is as insecure as the internet. It’s like assuming your front door at home is open when you go to sleep. This mindset change is already happening at major companies. I call this a “rethink of security” because it goes against the teaching of many security textbooks and the classic “hard outside, chewy inside” analogy we typically describe. With this mindset change, the major takeaway is that you cannot trust your network anymore. Once you accept that your internal, “previously secure” network is no longer secure, you’ll start to think differently, take charge of your security strategy and implement better defences. Those defences will be based upon securing your enterprise with Identity and Access Management – with technologies like two-factor authentication, single sign-on, lifecycle management, privileged account management and auditing.
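The resolution above names two-factor authentication without prescribing a mechanism. One widely used second factor is the time-based one-time password (TOTP) generated by authenticator apps; purely as an illustration, the sketch below shows the server-side check using only the Python standard library. A real deployment would also need secure secret storage, rate limiting and tolerance for clock drift.

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time password from a base32 secret."""
    padded = secret_b32 + "=" * (-len(secret_b32) % 8)
    key = base64.b32decode(padded, casefold=True)
    counter = int(time.time()) // interval                 # 30-second time step
    digest = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                             # dynamic truncation
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

def verify(secret_b32: str, submitted: str) -> bool:
    """Constant-time comparison of the submitted code against the current one."""
    return hmac.compare_digest(totp(secret_b32), submitted)
```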
2. Act now and make 2017 the year for two-factor authentication With 63% of data breaches caused by compromised credentials, and breach analysis after breach analysis pointing to credentials, the argument to remove passwords is now so strong that employees will soon be asking why security at consumer-facing sites like home banking, Amazon, Facebook and Gmail is better than what they have at the office. All of these organisations are pushing for two-factor authentication and, as adoption increases in the consumer world, CIOs will be left answering questions about why it is not implemented in their own organisations. The arguments that the technology is too complex or that employees will push back are based upon legacy thinking. Current-generation solutions are simple, cloud-based and leverage a mobile device. The key to implementing two-factor authentication is to have 100% coverage over all employees and all access points — accessing apps, VPNs and servers. This was never the approach in organisations that implemented legacy two-factor authentication, but now all user access can be enforced with it. 3. What are you doing about privileged IT users? The set of users accessing applications, or the technology that runs your applications, includes: Employees: This is typically where most breaches start. Senior management: A small set of employees in your business, but since they have access to more confidential information they are a target for hackers. IT employees: A small set of employees (larger in IT-centric organisations, such as financial services), but they have access to all your IT infrastructure, applications and servers — thus the prime target for hackers. Customers: These are a large number, but they typically have access to a small set of applications or maybe just the website. Partners: These can be numerous, but like customers they typically have access to a small set too. From this set, the greatest risk lies with the IT employees. We call them privileged IT users, since they have access to the servers in the data centre or in the cloud on which your applications and databases run. Stealing their accounts is what hackers are after, because once an account’s credentials are obtained, they typically grant wide-open access to run any command. If you have ever wondered how millions of accounts are stolen, it’s typically a hack that used a compromised privileged user account. So, your priority is to solve this problem. If your company does not have a strategy to implement privileged identity management (PIM), ask why. This should be top of mind for all organisations. So these are three security resolutions for 2017: think differently, act now and make 2017 the year of two-factor authentication, and find out what your company is doing about privileged IT users. Featured image credit to Lobster.media ### HPE to Acquire Cloud Cruiser to Bolster Consumption-Based IT for Customers Today HPE announced a definitive agreement to acquire Cloud Cruiser, a leading provider of cloud consumption analytics software that enables customers to manage and optimise public, private and hybrid cloud usage and spend. Founded in 2010 and based in San Jose, CA and Roseville, CA, Cloud Cruiser offers an IT infrastructure consumption analytics application that provides clear insight into IT usage and spend and helps customers more effectively plan and manage their IT systems.
Cloud Cruiser is already a key component of HPE’s Flexible Capacity business; we currently license Cloud Cruiser’s solutions in our Flexible Capacity offering and, in fact, we are their largest customer. This acquisition marks additional investment in HPE Flexible Capacity, to further differentiate and strengthen this high growth service. A hybrid approach to IT In today’s ever-changing technology landscape, IT organisations are under extreme pressure to not only ensure that the day-to-day operations run smoothly, but also to deliver new apps, processes and services that allow businesses to innovate. IT leaders are expected to be agile – provisioning IT in minutes, not months – all with significantly reduced budgets. Many enterprises have turned to the public cloud to fulfill these new demands, resulting in a new IT-as-a-service model that allows customers to pay as they go only for the technology resources that they use. This reduces the need for organisations to commit large capital investments in IT, eliminates unused capacity, and frees up valuable IT resources for new value-adding projects. But while some workloads may be right for the public cloud, others – which may require higher security, compliance and service levels – are best-kept on-premise. Because of these different needs, HPE helps organisations take a hybrid approach to IT. But how can business leaders enjoy the same IT-as-a-service model when employing a mix of public cloud and on-premise IT? The potential of Flexible Capacity Enter HPE Flexible Capacity. Flexible Capacity, a strategic and high-growth component of HPE’s Technology Services portfolio, offers customers on-premise IT infrastructure with cloud economics. It enables our customers to manage IT infrastructure in their own data centre but pay for it as-a-service. This reduces the risk of organisations investing too much – or too little – in IT, eliminates unused capacity, and frees up valuable IT resources for new value-adding projects. In other words, Flexible Capacity is a unique consumption model for IT that gives organisations the freedom to try new projects, fail fast without penalty, and support growth if they succeed. Where Cloud Cruiser comes in A critical piece of HPE Flexible Capacity is measurement – the ability to accurately meter and bill for customers’ consumption of IT– that differentiates Flexible Capacity from other offers. Cloud Cruiser’s consumption analytics offerings enable enterprises such as Accenture, KPN, and TD Bank to measure, analyse, optimise and control their usage and spend in private, public and hybrid cloud environments. As a Cloud Cruiser customer, we have seen first-hand the value that Cloud Cruiser’s technology creates by enabling HPE Flexible Capacity to meter and bill for usage of on-premise IT infrastructure in a pay-as-you-go model. By continuing to enhance the Cloud Cruiser platform and SaaS app Cloud Cruiser 16™, more tightly integrating it into HPE Flexible Capacity and leveraging the deep domain expertise of the Cloud Cruiser team, we are excited about the opportunity to accelerate the adoption of innovative consumption-based IT offerings and simplify hybrid IT for our customers. When the transaction closes, Cloud Cruiser will become a key part of the Data Center Care portfolio within our Technology Services Support organisation. Cloud Cruiser co-founder and CEO David Zabrowski (who served as VP and General Manager of HP’s Enterprise Computer Organization from 1997-2002), will report to me. 
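To illustrate the consumption model described above, here is a deliberately simplified sketch of pay-per-use arithmetic: metered usage is billed per unit, but never below a committed baseline. This is not HPE's or Cloud Cruiser's actual billing logic; the units, commitment and rate are assumptions for illustration only.

```python
def monthly_charge(metered_units: float, committed_units: float, unit_rate: float) -> float:
    """Bill whichever is higher: actual metered usage or the committed baseline."""
    billable_units = max(metered_units, committed_units)
    return round(billable_units * unit_rate, 2)

# e.g. 820 VM-hours consumed against a 500 VM-hour commitment at 0.12 per hour
print(monthly_charge(820, 500, 0.12))   # 98.4
print(monthly_charge(310, 500, 0.12))   # 60.0 - the committed baseline still applies
```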
Together, HPE and Cloud Cruiser will transform IT organisations, providing the Flexible Capacity solutions to help IT organisations spend less time on day-to-day operations and more time on innovation. ### How FinTechs are Unbundling and Enhancing Financial Services A lot is set to change in the financial landscape in the next few years. As the world continues to become even more connected, it is apparent that the financial world is lagging behind other industries when it comes to innovation. While the term FinTech has become more prominent recently, financial technology is nothing particularly new. However, advances in technology coupled with the demand for a better way of banking has resulted in a huge buzz being generated as new entrants set to disrupt the market. Areas which have seen the most disruption include payments, lending, FX, current accounts, and remittance. [easy-tweet tweet="With new regulations set to roll out in the next few years, collaboration will be key." hashtags="fintech, tech"] By unbundling financial services, FinTech disruptors have been able to capitalise on the fact that a number of major banks arrived late to the digital party. In addition to this, in the wake of the financial collapse in 2008, FinTechs with e-money licenses were able to provide pared down banking solutions such as prepaid cards linked to e-wallets instead of traditional current accounts, and alternative lenders - which unfortunately includes payday loan companies - have seen exponential growth. For better or worse, the financially excluded were able to access alternative solutions that the banks were not willing, or not able to offer them. Disruptors become collaborators It’s not simply a case of the Davids against the Goliaths though: a new breed of FinTech companies have emerged that are breaking down barriers - offering innovative solutions that optimise legacy processes and infrastructure rather than disrupting them completely. By providing banks with the option to collaborate to enhance their existing offering, or to outsource non-core banking services, FinTechs have been able to carve out a niche by focusing on a very small part of an incumbent's business, and improving it. Collaboration benefits both parties and their customers, and smart FinTechs have already realised that the future will be rife with opportunities for further innovation - aided in part by overwhelming support from major regulatory bodies and governments. The Financial Conduct Authority (FCA) has provided startups with a regulatory sandbox, which allows them to test their propositions in a safe environment. The UK Government and Bank of England have announced support for the FinTech industry, and the UK Payment Systems Regulator (PSR) announced at the start of the year that it would be opening payment infrastructure that has been monopolised by major banks for years. In Europe, the Payment Service Directive II (PSD2), an EU legislation that focuses on payments, is set to shake up the payments ecosystem across SEPA (Single Euro Payments Area), with banks having to provide access to third party payment providers. With new regulations set to roll out in the next few years, collaboration will be key. Not just for banks partnering with FinTechs and vice versa, but also for more established FinTechs and new entrants. Areas of financial services could even end up being micro-optimised in each niche - further unbundling and enhancing the product dependant on the end user.  
While there will always be noisy disruptors that shake up the status quo every once in awhile, those who give themselves the greatest chance of success will be the ones who aren’t too precious to do things quietly and efficiently by forging partnerships as well. Featured image credit to Lobster.media ### Up In The Clouds With Microsoft Azure Analysis Services We all know that the evolution of big data technologies have driven smarter and more sophisticated business insights. From omnichannel solutions coming of age in retail, to the impact of customised algorithms whenever we click through to a social media site, business owners can now interact and engage with consumers like never before. The development of Microsoft Azure solutions, as we’ve been shouting about for quite a while now, marks the next evolution in customer insight, and creates greater opportunities for IT support providers, like LMS Group, to keep clients ahead of the competition.  Next-level business insights OK, we’re not saying that better apps won’t come along, because they will. But, by placing more investment into its cloud applications than its traditional ones, like Office, Microsoft has set a new precedent in what we can expect from cloud-based capabilities and services. Microsoft Azure Analysis Services is a fully-managed cloud platform service, aimed at all abilities and levels of interest. The technology has been around for a while: it is the engine in Power BI (which we love) and PowerPivot. The transfer of this proven technology into the cloud has merged its industry-leading tools with the accessibility of the cloud to create bigger and better business opportunities, wherever you are in the world. [easy-tweet tweet="Microsoft Azure Analysis Services is a fully-managed cloud platform service, aimed at all abilities and levels of interest." hashtags="Cloud, tech"] What’s so good about it?  Analysis Services has traditionally followed a technical business intelligence set up. This IT-led Microsoft solution achieves high-end scalability, and allows businesses to create business intelligence (BI) models that can be revisited and recycled throughout an organisation. Having this BI stored in a secure location which can be accessed by all your staff - technical and non-technical - enhances your ability to create cohesive messaging and strategy moving forward. As expected with an IT-managed programme, Analysis Services is also equipped with strong programmability and automation features. Although businesses can currently access the engine through other Microsoft services - and Microsoft is unique in the fact that it provides seamless migration from the cloud and on-premises - what makes Microsoft Azure Analysis Services superior is that BI models can be deployed quickly across an entire organisation, which eliminates or significantly reduces manageability issues. Say you come across an unforeseen customer activity trend. Asking for a server used to take a few days or up to a week, even with optimum configurations and drivers on a virtual server. With this, you can spin a server in a matter of seconds. Succinctly put, speed is the real difference. Users can still use Power BI to digest billions of rows of data into rich visuals to collect and organise, or send the data over to Visual Studio to develop it for any app or platform; only now it’s a fully managed, end-to-end cloud service. 
If you’re already using tools like Power BI to transform your business data, you don’t even need to abandon current projects to make the transition into the cloud as they work together seamlessly. Why are we talking about this? Like Microsoft Azure solutions, our primary goal is to allow staff to get on with their work, without the extra hassle of managing IT infrastructure and availability problems. Not everyone works in BI, but what is great about this Microsoft Azure solution is that you don’t have to be in a technical role to access and understand your business strategy. As a provider of smart, strategic solutions to SMEs in the South of England, we’re always on the lookout for the newest innovations that will complement our IT infrastructure, telecom and connectivity services. On top of that, LMS Group is a Microsoft Gold Partner for ‘Small and Midmarket Cloud Solutions’ which gives us greater bragging rights and many more opportunities to deliver top level service. As readers of this site will already know, the cloud improves continuity, productivity, accessibility and business security. What Microsoft has done is to take some of its most popular business software and transfer it to a cloud-based system with quicker and easier functionality and seamless migration. Billions of rows of data can be pulled into a visual graph within seconds to provide instant gratification and interactivity at the speed of thought in a single click. All this culminates into powerful insights that you can transfer to your business strategy. ### Containers: Navigating the Modern Cloud 2013 was the year that Docker arrived on the scene and since then software container technology has advanced significantly. Like any new technology, containers would not be possible without an advanced environment in which to run; what I refer to as “the modern cloud”. Containers are still a relatively new concept in the software development world, so here I’ll give a broad introduction to the technology, its relation to what came before, the benefits of containers, and the modern cloud infrastructure that they rely on to be used effectively. What does the term ‘container’ really mean? Containers are a means of software abstraction used by developers. The days of ‘bare metal’ and software designed specifically for them are behind us. Flexibility and agility are extremely important to modern developers. For that reason, virtual machines (VM) were created as the next step in a chain of technologies that has led to containers as we know them today. [easy-tweet tweet="Containers are a means of software abstraction used by developers. " hashtags="Cloud, Tech "] VMs use software known as a hypervisor to abstract their work away from hardware. Hypervisors replicate hardware capabilities like CPU, networking and storage and enable more tasks to be run simultaneously across multiple VMs per physical device. However, using VMs and replicating them across numerous devices can be a significant drain on resources. It can be helpful to see containers as a lightweight version of VMs, because, while they share the same basic function of abstracting processing work away from underlying hardware, they do not require a virtual copy of host hardware or their own operating system to be fully operational. 
What this means in practical terms is that a developer can fit far more containers on a single server than would be possible with VMs, resulting in more power and more flexibility - the developer can move faster and deploy to the cloud with greater ease. More agile and more secure Containers can promise to run virtually everywhere and that is one of their most appealing aspects. The technology has the capacity to scale from a single developer on a laptop all the way up to an entire production cluster. Containers are much more portable than previous software development methodologies and allow developers to work with greater flexibility across complex applications. Containers could also increase application security: before containers we had the ‘monolithic model’ of software development - when code had to be dealt with as a single, complex entity. If there were an error or an issue then the development team would have to analyse all their code, determine where the issue was located and remedy it without breaking any dependencies - a time-consuming process for even the most skilled developers. Containerised software is more reliable and more secure. Issues can be isolated, removed and replaced with minimal disruption to the overall application. In addition, container technology supports the use of multiple coding languages in the same application - this means that cross-compatibility issues are minimised and different teams can work together more effectively. A note on the modern cloud It all started in 2006 with the Elastic Compute Cloud released by Amazon (EC2). Before then, low-cost and developer-friendly VMs were hard to come by - only the most forward-thinking companies with advanced, internal cloud functionality could access them. Tech giants such as Amazon stepped in and started to do much of the heavy lifting on the infrastructure side, allowing smaller companies with specialised knowledge to build features that were really relevant for their customers. This supply of cheap, quick VMs allowed teams to move faster as they could rapidly spin up new VMs without having to manage the infrastructure requirements themselves. Containers only take a few seconds to load whereas VMs can take minutes. Containers are more flexible than VMs - which are often locked-in to a particular cloud provider. It is therefore faster to scale workloads in response to demand and, if required, migrate to another cloud provider using containers - something which can be highly challenging using VMs. Rather than a hypervisor, containers require a scheduling tool to be managed within the framework of the modern cloud. Containers, and their orchestration tools, can span multiple cloud infrastructures, a step closer to the end goal of ‘build once, run anywhere’. What is next for containers? Container technology is thriving: 81% of businesses surveyed earlier this year suggested that they would increase their investment in the container space. Container technology has uses across a wide variety of industries, some of which may come as a surprise. Goldman Sachs, the American investment bank, has invested around $95m into Docker and plans to move the bulk of its workload onto the platform over the next two years. Tech companies such as Amazon, Microsoft and Google are some of the other high-profile advocates of Docker technology. Containers allow developers to compartmentalise and manage complex code - a step towards full software development automation. 
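As a small illustration of how lightweight this is in practice, the sketch below starts a short-lived container from a public image using the Docker SDK for Python. It assumes a local Docker daemon is running; the image and command are illustrative rather than a recommendation.

```python
import docker  # Docker SDK for Python; assumes a running Docker daemon

client = docker.from_env()

# Run a throwaway container: no guest operating system to boot, so it starts in seconds.
output = client.containers.run(
    "python:3.11-slim",
    ["python", "-c", "print('hello from an isolated container')"],
    remove=True,  # delete the container once the process exits
)
print(output.decode().strip())
```

The same image can then be handed unchanged to a scheduler such as Kubernetes, which is what underpins the 'build once, run anywhere' goal described above.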
Adoption of the technology has been widespread in the developer community, and the next step is for larger companies and enterprises to begin using the technology en masse. Container technology, when used in conjunction with schedulers such as Kubernetes on modern cloud infrastructure have the potential to help automate more and more aspects of developers’ working lives. ### It is time to embrace the cloud, but proceed with a plan Businesses are keen to embrace the cloud for many reasons. Cloud solutions provide enterprises with an IT infrastructure they can control with few overhead costs and complexities. This gives them the ability to focus less on IT, and more on their business and bottom-line.   With the cloud, businesses are able to design and operate complex global and local platforms with little up-front cost, and pay only for the infrastructure and technology used. They can integrate platforms into existing legacy physical environments and add in SaaS solutions as needed. Cloud solutions provide the power, flexibility, and the commercial models that could only have been dreamt of five years ago. However, it is important that businesses plan carefully when implementing a cloud solution. The right plan can help businesses avoid unnecessary pitfalls and ensure the transition to the cloud goes smoothly. [easy-tweet tweet="It is important that businesses plan carefully when implementing a cloud solution" hashtags="tech, cloud"] Ensure you have the proper skills to bridge the cloud security skills gap Cloud security works differently than owned-infrastructure security. Physical servers are replaced with virtual machines running on virtualised hardware and security is managed through a cloud dashboard rather than directly on the server. The ease of use of adding resources is one of the attractions of the cloud but it is important these new resources inherit security characteristics in a carefully controlled manner. If employees don’t have the proper skills to manage the cloud, businesses should consider using a managed service provider (MSP) to help with migration and provide support after the move. An MSP will take care of the migration and maintenance on behalf of the IT team and eliminate the need for extensive cloud security know-how on the business’ part. This move will also accelerate an organisation’s maturity and greatly reduce the time to value. Consider data residency and protection in the cloud Data residency has been debated widely in recent months, especially since the Court of Justice of the European Union declared the International Safe Harbour Privacy Principles void in October 2015. The Safe Harbour agreement between Europe and the US was meant to provide blanket permission for data transfer between the two countries, and peace of mind for companies with data residing on foreign soil. The legislation has now been replaced with the EU-US Privacy Shield but uncertainty persists. One answer to the uncertainty lies in the cloud because businesses can choose where to store data. The simple solution to data residency is to store data locally and avoid sovereignty issues. Data protection is another cause for concern. In the EU, data protection laws are in the process of being replaced with the EU General Data Protection Regulation (GDPR), which is due to come into effect in May 2018. This includes the “right to be forgotten” for EU Citizens, which is especially applicable in the cloud where data can be replicated across nodes and backed up multiple times. 
It is important that businesses keep data protection in mind when building their solutions. [easy-tweet tweet="One answer to the uncertainty lies in the cloud because businesses can choose where to store data" hashtags="tech, cloud, data"] Manage cloud sprawl to minimise cloud costs The cloud offers growth and scalability for businesses, and it is easy to keep building platforms and pumping data into the cloud. Unless you’re careful, the size and number of services you are using will continue to increase and the cost of cloud services will mount up. By deleting old, obsolete and unused applications on a regular basis, and by archiving data to cheaper, higher-latency services, businesses can avoid this unnecessary expense. By paying attention to how they manage their cloud strategy, businesses can realise the promised savings of cloud computing. The best solution is to have a single-pane-of-glass view of all your cloud solutions and spending. With this visibility, businesses can use only what they need and switch off or consolidate little-used infrastructure. Regular ‘cost optimisation’ activities should be undertaken to ensure you are benefitting from the latest cloud features, capabilities and pricing changes. Ensure your cloud provider sets the standard One of the benefits for businesses getting into cloud computing is that cloud providers themselves provide security and compliance for the data stored with them. A standard such as ISO 27001 for information security management is the stamp of approval that means your data is secure with a cloud provider. However, this may not be enough. It is crucial that businesses check that the provider has a compliance framework in place to satisfy the regulatory requirements of their industry and marketplace. For instance, if a business accepts card payments online it needs to meet the Payment Card Industry Data Security Standard (PCI DSS). As already mentioned, GDPR is fast approaching and will cause significant changes to the accountability and liability of cloud providers for their customers’ data, so expect more stringent and clearly defined responsibilities to be put in place. Cloud is transformative for businesses The cloud represents a shift in the way businesses consume and pay for IT services, but it is also transformative, allowing them to scale their offerings, save on hosting and, ultimately, offer better solutions to their customers. By keeping the above considerations in mind when migrating to the cloud, companies can ensure a successful and pain-free transition. This provides opportunities for innovation and means businesses can benefit from new routes to market, while offering clients new capabilities covered by security and certifications. Clients can relax knowing they have provisions for future growth built into their solution.
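As one concrete example of the regular 'cost optimisation' activity described above, the sketch below lists unattached block-storage volumes, a common form of cloud sprawl that quietly accrues cost. It assumes AWS as the provider and the boto3 library with configured credentials; the same idea applies to any cloud.

```python
import boto3  # assumes AWS credentials are configured for the target account

ec2 = boto3.client("ec2")

# "available" volumes are attached to nothing, yet are still billed every month.
orphans = ec2.describe_volumes(
    Filters=[{"Name": "status", "Values": ["available"]}]
)["Volumes"]

for vol in orphans:
    created = vol["CreateTime"].strftime("%Y-%m-%d")
    print(f"{vol['VolumeId']}: {vol['Size']} GiB, unattached, created {created}")
    # After review, ec2.delete_volume(VolumeId=vol["VolumeId"]) reclaims the spend.
```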
Scottish innovation centre, The Data Lab is behind the festival that takes place 20th to 24th March 2017 with a comprehensive programme events across Scotland promoting and celebrating data innovation under the banner #DataChangesEverything. The so-called ‘big data rock star’, Hilary Mason said: “Increasingly organisations are waking up to the potential of data. Chief technology and information officers are being called upon to actually have a strategic vision for what the business can be capable of through the technology. Technology is not merely about facilitating other parts of the business – it is core to the operational goals. DataFest17 recognises and celebrates the massive potential of data. I’m thrilled to be a part of it, meet other data leaders and to find out more about some of the innovative initiatives underway on that side of the Atlantic.” Mason and Oliver will speak at the Data Summit part of the data festival, a two-day international conference held in Edinburgh over 23 and 24 March. The first day of the summit will focus on ‘proof’ - current applications, experience and challenges in delivering value from data. The second day will focus on ‘hypothesis’ - the future of data. Other Data Summit confirmed speakers include: Dr Hannah Fry, UCL lecturer and presenter of BBC TV’s The Joy of Data Marc Priestley, former Mclaren F1 data scientist and expert in British racing technology Tim Harford, The Undercover Economist Stewart Whiting, Co-Founder and Data Science Lead at Snap40 Jason Leitch, National Clinical Director, Scottish Government Alex Bellos, science and maths writer, columnist at The Guardian Alan Nelson, IT and information law lawyer at CMS. As well as the Data Summit, DataFest17 also comprises: Data Talent Scotland on 22nd March 2017 in Edinburgh - a one day event bringing together over 500 attendees linking data talent from universities with industry and public sector organisations. Fringe events – taking place Scotland wide w/c 20th March 2017 - events around Scotland exploring/hacking/debating data innovation. Confirmed events so far include: Fintech Edinburgh – Innovators from the fintech sector will showcase how they are using data and what open banking will mean for the future. The Travelling Gallery (Bus) - a data driven artists exhibition based outside Assembly Rooms on George Street. Smarter Tourism: Shaping Glasgow’s Data Plan – on the back of the £24m Technology Strategy Board-funded Future City Glasgow programme, this event, supported by The Data Lab, the Future Cities Institute and Scottish Enterprise, will feed insight and ideas into the next stage of data-led tourism innovation. Training - w/c 20th March 2017, Edinburgh and other locations – a programme of training events from practical data science to leadership. The University of Edinburgh will be running its third annual deep learning workshop on 21st March, attracting over 200 local and international delegates. Gillian Docherty, CEO of The Data Lab, said: “Momentum is building for DataFest17 – in just two months we’ll welcome insight and debate from international academics, businesses and public services on how #DataChangesEverything. The packed programme will showcase data innovation and catalyse further activity while underlining Scotland’s leading role in data on the global stage. “We want to ensure that people across the world look to Scotland’s data science community as a benchmark for the future of data science. 
The festival is a platform that not only celebrates the work on data that is being led from Scotland but crucially encourages others to leverage the power of data.” DataFest17 is supported by a number of organisations including analytics firm, Aquila Insight, Sainsbury’s Bank as well as CMS and NHS Scotland. Jonathan Forbes, Aquila Insight CTO said: "If you want to meet a rock star data scientist and engineering leader, Hilary Mason is at the top of your list. From the Wall Street Journal to Glamour magazine, Hilary Mason is the creative, enthusiastic voice of data science. “Her presence underlines the calibre of DataFest 17. We are proud to support this exciting event."   ### Negotiating Cloud Contracts Changing a traditional critical business application is a major undertaking. But a move to cloud-based SaaS solutions can be even more complex. When negotiating contracts with your chosen vendor, there are a number of things to look out for, because you’ll be placing your trust in a third party to provide, and crucially, maintain a business-critical function. Let’s start with the basics. In any commercial selection, you need to ensure your chosen product can fulfil your functional and non-functional requirements; that the vendor or implementation partner has a realistic and proven approach to delivery and support, and that the cost of product and implementation services is affordable and competitive. [easy-tweet tweet="When moving to the cloud, however, you also need to consider a number of other issues. " hashtags="Cloud, SaaS, Tech"] First and foremost; Capabilities. Can the Vendor/Partner demonstrate the skills, resources and established processes to deliver on all aspects of the implementation service, transition and associated managed service? Evidence need not necessarily be written into the commercial agreements or RFP responses, but the documentation associated with the agreement should reassure you that the Vendor/Partner has them. Also, written into the main document, or its schedules, should be a commitment that the Vendor/Partner will provide the resources to deliver in these specific areas: Responsibilities: Who does what, in the realm of Implementation Services and Managed Services? What is the split of responsibilities between the Client and 3rd parties? How are services “backed off” between providers? To what standard must they do them? Redress: What is done to compensate for failure to meet the responsibilities to the required standard? Not necessarily punitive, but it is there to give the responsibilities that can be called to account, full accountability. In the end, the objective is to get the services as specified, under normal operations or get Recovery in the shortest time. Recovery: In the event of catastrophic or chronic failure, what special measures will be exercised to get delivery back on course? This could for example include “carve out” of some, or all of the service(s) to a 3rd party, at the expense of the Vendor/Partner. Exit: What is “Plan B”? The Vendor/Partner must have a project plan and defined responsibilities to facilitate your exit from the engagement in the event of commercial necessity, failure of the relationship, (including termination for cause) and normal end-of-term without renewal. (Note: it would be usual for the vendor to levy a separate charge for the service at the time of invocation.) 
All parties need to know the consequences of failure and to understand their obligations so openly addressing redress, recovery and exit from the outset provides mechanisms that can be invoked in the event of significant problems along the way. Having these transparent to everybody and agreed upfront creates a more “grown up” and collaborative environment for successful, long-term partnering. With the recent fall of Sterling against the Dollar, extra-commercial considerations have arisen in relation to services delivered by US-based providers. Anyone considering a move to the cloud may want to look at strategies to protect themselves from future exchange rate fluctuations. For significant spend, then currency hedging is an option any business with significant income or overseas spend would probably consider.  If individual contracts are in foreign currency, then you may have the option to fix rates and/or increases for an agreed period.  However, this depends where you are in your contracting cycle - beginning, renewal, etc.  Your negotiation leverage will vary; think through the various scenarios: What happens if the exchange rate continues to weaken? What happens if there’s a significant upturn back to previous rates? What if we trade longer commitment and/or stricter termination clauses for a negotiated rate? What happens if we add more/less subscriptions, in the above events? What happens at the end of the current/renegotiated term? What happens if we cancel? By thinking through these situations, you will be able to assess your negotiating strength and the risks/opportunities you can afford to take. It should be stressed, though, that the falling pound is not a reason for keeping services on-premise. All the reasons for evaluating cloud vs hosted vs on-premise that were valid before the fall in Sterling still apply. Recent events have just served to highlight one of the many commercial, technical and functional considerations/risks that need to be thought through before undertaking any significant business applications or technology investment. Featured image credit to Lobster.media ### One step closer to complete cloud business In 1876, Alexander Graham Bell took out the first patent for the telephone. Sometime between 1936 and 1938, Konrad Zuse invented the Z1, the world’s first programmable computer. The years that followed saw constant improvements to both. In the 1960s banks of switchboard operators connected calls by moving plugs into one socket and then another. The 1950s to the 1970s were the era of the ‘big iron’, massive mainframe computers. The first personal computer was the MITS Altair 8800 in 1975. Since then there have been great advances - both computers and telephones have become more powerful, smaller and easier to use, as well as transport. However, it’s been difficult to totally remove the need to physically store equipment and servers. Cloud technology has allowed us to make some progress here in recent years, but many are sceptical about whether we will ever be able to be completely cloud-based. [easy-tweet tweet="Both computers and telephones have become more powerful, smaller and easier to use" hashtags="tech, cloud"] At 12pm on Tuesday 13th September, Thirsk-based communications technology business, TeleWare broke a major cloud barrier. In real-time, TeleWare was able to route record and analyse a call through the Microsoft Azure cloud platform – a world-first, confirmed by Microsoft, and something that most experts believed not possible. 
It had been thought that attempting to use an entirely cloud-based system would result in poor call quality due to loss of data packets and latency on the line. However, TeleWare found it was possible to develop and enhance existing products to achieve a seamless call of the same quality as one using physical equipment. Then, on Tuesday 1st November, at midnight, there was another landmark achievement. This time, TeleWare’s Hosted Service fixed-line call recording functions were completely migrated on to the Azure platform. From now on, calls will be routed through the TeleWare UK datacentres and then directed onto the Microsoft Azure cloud for recording. All call recordings will then be passed back into the TeleWare datacentres for secure and compliant storage. As a result, the dream of businesses becoming ‘100% cloud’ is one step closer. But why should businesses be excited about this? The benefits of the cloud are well known. They include reducing the recurring cycle of refreshing or rebuilding services in datacentres and all of the disruption and associated costs. This is achieved by taking away the need for expensive and hard-to-maintain physical servers – even if they’re hosted offsite by another company. Data can be encrypted by users and still be easily accessible. The cloud can even provide enterprise level technology to small companies with even smaller budgets. Software providers can deliver pay-as-you go solutions to help their clients get a crucial step ahead of their competitors. Even more important, perhaps, are the disaster recovery options. Hosting everything in a data centre reduces the risk of losing precious data should there be an emergency. Indeed, many businesses are already using Dropbox or other cloud systems to run their email and replace physical servers to provide file management that is accessible from any location. However, the vast majority of business is still done on the phone – whether it be taking and negotiating orders, dealing with customer complaints, or even talking to other members of the company. [easy-tweet tweet="Many businesses are already using Dropbox or other cloud systems to run their email and replace physical servers" hashtags="tech, cloud"] Companies that rely on recording their calls, such as financial services, would never have been able to go completely onto the cloud. It would even have been a struggle for companies that are used to recording calls for their own purposes – be it for customer experience analytics programs or simply to make sure that precious order details aren’t lost. In fact, it was thought that the industry was not even close to being able to record a call in real time on the cloud. Managing to take routing, recording and analysing of calls through Azure, as well as placing entire fixed-line recording functions on the cloud is truly significant. For the first time, it shows that companies can have all the functionality and quality of a fixed-line system, as well as the flexibility and benefits that the cloud brings. ### Flynet – Putting the Mainframe at the Heart of the Idea Economy The Common Argument We are all concerned about the growing skill shortage in the mainframe space, in fact it’s a fairly pervasive problem across the heritage system space in general, be it Mainframe, IBMi, Unix, VMS, VAX, Multivalue/Pick. In addition to the technical skills shortage, we have technical push back from new millennial-aged employees who simply refuse to work with a system where they can’t touch the screen or point and click. 
Explaining that the application has been built over many years, and that therefore not all the processes are linear and the information architecture isn’t intuitive, doesn’t seem to quell their complaints. This is then further compounded by the eruption of the “idea economy” and a desire from business units to rapidly produce apps that will engage customers or drive new lines of business. Traditional methods of development that have gone hand in glove with the ultra-reliable workhorse machines that often underpin our key business processes are not particularly agile. [easy-tweet tweet="It would appear that we have a machine that is somehow no longer fit for purpose" hashtags="cloud, tech"] So it would appear that we have a machine that is somehow no longer fit for purpose: there is a decreasing number of people who can develop on it, a growing number of people who don’t want to use it, and it doesn’t support rapid application development, and certainly not Agile development. Pulling the Wool Back from Our Eyes This is very much a common misconception built up around a set of machinery that sits quietly in the background and supports our very modern economy. These are the machines we trust to run our banks, our healthcare system, our high street retailers, our government and defence. The average person interacts with a mainframe computer between eight and ten times a day: they check their bank statement, they buy their groceries, they pay their bills. It is fair to say that, without ever having met the machine in the back room, I pretty much trust it with my life. From an IT perspective we have used these systems because they are the most robust and resilient pieces of machinery; in addition, they can handle the high transactional workloads that other systems struggle with. So the mainframe seems to do the job really well. What if I told you that in about 15 minutes you could securely access every application on your mainframe from any device at all, without installing any software on it or doing any development work, that you could use a touchscreen, you could use a mouse, you could integrate seamlessly with modern tools like Office 365? What if I told you your entire workforce could build modern, responsive mobile and web apps that connect directly to your core data, and that they wouldn’t need any development skills at all because they can use No Code tools to do it? What if I told you that you could build 300 Web Services in a week to enable an entire SOA or IoT strategy? All of a sudden our perception of the machine in the back room starts to change. The Flynet Vision – Enable, Enhance and Evolve Essentially the Flynet solution is targeted at three areas: Enable, Enhance and Evolve, all founded on the root principle that Flynet solutions deliver 100 percent pure, responsive HTML to any web browser on any device without client-side software or plug-ins. Essentially our tool set enables Host/Legacy systems to participate securely in Mobile, IoT, Big Data and Cloud initiatives very quickly, with little cost and no risk. [easy-tweet tweet="Your entire workforce could build modern responsive web apps that connect directly to your core data?" hashtags="tech, cloud, mainframe"] Enable – Flynet Viewer TE (Terminal Emulation – connects to Mainframe, iSeries, pSeries, UNIX, VMS, VAX and MultiValue systems) Replace bulky desktop terminal emulation clients with a fully featured terminal emulator that is delivered to any browser on any device (desktop, thin client, mobile or tablet) without any client-side software or plug-ins.
This unlocks how organisations can work with their host system and, in addition, allows them to use banking-level security in doing so. Enhance – Flynet Viewer Studio UX (Host/Legacy system modernisation tool) A wizard-based No Code platform (API extendable) for modernising green-screen interfaces, giving Host/Legacy systems a more contemporary look and feel and making them accessible via any browser on any device. We have blue-chip client examples where we have trained in-house, non-technical personnel to modernise enterprise-wide interfaces in a twentieth of the time that our closest competitor quotes for the same work using high-cost consultants. Evolve – Flynet Viewer Studio WS (No Code application that creates W3C-compliant Web Services to both Host/Legacy screen-based and database-housed data) Studio WS allows developers and non-developers to create standards-compliant, robust, annotated, testable and deployable web services to backend data and processes in minutes, enabling you to create a whole catalogue of Web Services to Host/Heritage (and other) systems in days, not months, and completely unlocking Host and Legacy systems to participate in SOA strategies and IoT initiatives. ### Hackers versus Hurricanes: It’s Time to Begin Planning for Real Disasters The most common objection I come across when discussing Disaster Recovery (DR) is, “Oh, well we don’t get hurricanes or floods or anything like that, so we don’t need DR”. To which I always reply, “That’s all well and good, but do you have humans? What about hardware and power? Do you have HVAC in your datacentre?” A disaster can be any number of things and yes, whilst nature can definitely be a cause, in research that Opinion Matters undertook on behalf of iland in 2016 we found that it actually accounts for only 20 percent of outages. Our research also found that, more often than not, disaster comes in the form of operational failures (53 percent) and human error (52 percent). It is these latter two factors that cause the most outages. When people think of disasters, they tend to think dramatically, of large-scale incidents capable of taking out an entire datacentre for long periods of time. In actual fact, when businesses invoke DR solutions, it tends to be because of much more isolated and smaller issues that have impacted the business. Perhaps only a couple of systems or mission-critical applications have gone down; entire IT systems don’t need to implode before one’s eyes to warrant a disaster. It is the smaller disasters that we really need to be focusing on, as it’s usually a build-up of these which results in a cascading effect that can then have a bigger impact on the business. We have many customers that will fail over just one or two systems. Maybe a patch went wrong, or a virus occurred, and the best thing to do was to keep the system running, fail the affected machine over to keep the application running and fail back when systems were repaired. It doesn’t always have to be on a massive scale. By grouping VMs supporting multi-tier applications into virtual protection groups, you can perform partial failovers within that group, reducing cost and simplifying failback. [easy-tweet tweet="VMWare can perform partial failovers within that group, reducing cost and simplifying failback" hashtags="cloud, security, tech"] Significant disruptions to IT systems, however, do happen and can take a heavy toll on the business; hence the importance of having a reliable DR plan in place.
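To illustrate the virtual protection group idea mentioned above in the simplest possible terms, the sketch below models the VMs behind a multi-tier application and fails over just one of them. It is a conceptual toy, not the Zerto or iland API.

```python
from dataclasses import dataclass, field

@dataclass
class ProtectionGroup:
    """The VMs behind one application, replicated and failed over as a unit or in part."""
    name: str
    vms: set
    failed_over: set = field(default_factory=set)

    def failover(self, *vm_names: str) -> None:
        unknown = set(vm_names) - self.vms
        if unknown:
            raise ValueError(f"Not in group {self.name}: {unknown}")
        self.failed_over.update(vm_names)

    def failback(self, *vm_names: str) -> None:
        self.failed_over.difference_update(vm_names)

crm = ProtectionGroup("crm-app", {"web-01", "app-01", "db-01"})
crm.failover("app-01")      # a bad patch on one tier: fail over only that VM
print(crm.failed_over)      # {'app-01'}
crm.failback("app-01")      # repaired, so fail back; the other tiers never moved
```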
Of course, there are the obvious implications of a disaster, such as interruptions to trade and service, and loss of revenue. But what you might not have considered is the effect on employee morale and on customer confidence. In order to reduce disruptions such as these, a proactive and strategic approach to disaster recovery is required. Five or ten years ago, the solution to DR felt heavy and complex and, as a result, many organisations put it in the 'too hard or expensive' basket and DR was put off to next month or the next quarter, maybe even the next year. In the last few years, however, cloud-based disaster recovery options have given organisations a new and more affordable route to protecting their businesses. As companies started to virtualise and move to the cloud, we found that other parts of the datacentre weren't keeping up. Efficiency was increasing but replication was still lagging, still embedded in the hardware layer. This not only made it difficult to work with an external vendor, as matching hardware requirements could not be met, but it also meant that everything had to be replicated, rather than just the data that companies were interested in. As a result, the investment in digitisation was being undermined as resources were being wasted. iland's DR software, powered by Zerto, allows for replication to take place in the hypervisor at the VM level, providing its users with a far more efficient solution. So what should you look for in a DRaaS solution? In my mind, excellent technical support is key in enabling users to craft a complete DR solution that meets their cost requirements and risk tolerance, whilst also ensuring that the solution will be implemented quickly and tailored for success within their organisation. A successful DRaaS solution should also allow for testing by users themselves to make sure their plan actually works. Finally, a DRaaS solution should be able to extend the security measures and compliance rules already present on-premise into the cloud. As I mentioned earlier, we commissioned a survey of 250 IT decision makers with responsibility for DR from medium to large companies in the UK. From outage experiences to achievable recovery times to failover confidence levels and barriers to adoption of cloud-based DR, the survey revealed a wealth of insights that can serve as benchmarks for developing a successful disaster recovery strategy. If you are interested in finding out more about the results of the survey, the full report can be found here. ### An intelligent approach to global networking As enterprises look to deliver cutting-edge services to their tech-savvy customers, move into new markets, rethink their business models, and disrupt entire industries, the CIO needs to enable digital transformation across the IT estate. With the pace of technology innovation and the data explosion showing no signs of slowing down, it's time to ask: is your network propelling your growth in the platform economy, or holding it back? Video, IoT and cloud creating new network challenges The biggest challenges that the world's networks face at present are the continuing demand for more bandwidth, the multi-layered nature of wired and wireless networks, and enterprises' increasing dependency on cloud applications. The growth in bandwidth demand is largely driven by video use by both businesses and consumers.
By 2020, video will represent more than 80% of all Internet traffic, with nearly 1 million minutes of video content travelling across the Internet every second. The multi-layered nature of networks is also creating a whole new level of complexity that is set to be exacerbated by the Internet of Things, generating billions of new connections. Additionally, enterprises are having to deal with a complex combination of private, public and hybrid cloud. Their on-premise applications might include ERP solutions and sensitive customer data, whereas a private cloud might house Office 365 and mobility applications. A public cloud might have non-mission-critical applications, such as marketing automation tools. Enterprises also have to address the requirements of their increasingly mobile workforce, which makes data flows difficult to predict and manage. Due to these challenges, a next-generation network will be needed to deal with heavier and more complex workloads than before. But what will this next-generation business network look like? Will the next-generation network be wired or wireless? 5G will undoubtedly play a key role in meeting businesses' connectivity demands. But while these latest wireless technologies attract the lion's share of attention in the media at present – with organisations such as the European Commission pushing the industry to accelerate 5G roll-outs – wired connections will not disappear anytime soon. In fact, they will continue to play a central role in addressing businesses' needs for secure, easy-to-manage and cost-effective networking. Wired connections and physical fibre infrastructures won't disappear because they will be capable of carrying heavier workloads than wireless alternatives for the foreseeable future. New 40G, 100G and even 400G technologies mean that speed and capacity will continue to grow to meet demand. Another area where wireless connections still can't match wired networks is security. It remains easier to control and monitor what traffic is going out of or coming into the network when using a physical wired connection. After all, wireless is a shared medium, so anyone who is in range of the signals can capture and potentially interfere with them. SD-WAN – a promising technology One of the technologies that many enterprises are looking at for addressing their global network challenges is the Software-Defined Wide Area Network (SD-WAN). While a nascent technology at present, SD-WAN is set for huge growth: Gartner predicts that by 2018 more than 40% of WAN edge infrastructure refresh initiatives will be SD-WAN-based, up from less than 2% today. An SD-WAN enables enterprises to deploy a new kind of hybrid network infrastructure that combines the scalability and cost-effectiveness of the Internet with the security and reliability of a private MPLS WAN. This gives the CIO unprecedented control over the entire network infrastructure across all branch offices, and lowers the barriers for enterprises to expand into new geographies. In practice, an SD-WAN enhances the performance of both on-premise and cloud applications. The CIO could change routing policies in real time, and assign different policies for different offices depending on demand, for example.
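As a rough illustration of what such per-application routing policies might look like in practice, here is a small sketch; the application names, path labels and fallback rule are assumptions made for the example, not any vendor's configuration schema:

```python
# Illustrative SD-WAN policy table: business-critical traffic stays on the
# private MPLS path, less critical traffic is steered over the Internet.
POLICIES = {
    "erp":       {"path": "mpls",     "priority": "high"},
    "voip":      {"path": "mpls",     "priority": "high"},
    "office365": {"path": "internet", "priority": "medium"},
    "marketing": {"path": "internet", "priority": "low"},
}

def select_path(app: str, mpls_congested: bool = False) -> str:
    """Pick an underlay path for an application, steering non-critical
    traffic onto the Internet when the private WAN is congested."""
    policy = POLICIES.get(app, {"path": "internet", "priority": "low"})
    if policy["path"] == "mpls" and mpls_congested and policy["priority"] != "high":
        return "internet"
    return policy["path"]

print(select_path("erp"))        # mpls
print(select_path("office365"))  # internet
```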
The ability to dynamically route traffic between the Internet and the private network, and use the public Internet for less business-critical applications, makes the enterprise network a lot more cost-effective. With our IZO SDWAN offering, for example, enterprises can save around 30% on their network operating costs. An SD-WAN also improves the end-user experience, as congestion is minimised and application responsiveness is maximised. This helps ensure that employees are able to enjoy faultless, secure access to applications and data, empowering them to collaborate securely and seamlessly, wherever in the world they are. The benefits are clear: a boost in productivity and an improved bottom line. Key considerations for enterprises looking to adopt SD-WAN include: Global reach: A global SD-WAN enables organisations to connect even the smallest branch offices to the cloud and enterprise WAN easily, securely and cost-effectively, supporting the enterprise's growth ambitions. Overlay-underlay integration: An overlay SD-WAN that is tightly integrated with the underlay IP infrastructure maximises the reach and manageability of the network and simplifies the roll-out of the SD-WAN. Full management: A fully managed SD-WAN reduces the heavy lifting needed by the CIO and the IT department. This eliminates the need to train staff on the security and application layers of a hybrid network, or the virtual machine operating systems of public and private clouds, while ensuring that the CIO and IT department are still in full control of the global network. Reinventing global networking The practically ubiquitous connectivity granted by the Internet has unlocked new markets for enterprises and lowered the barriers to international expansion. It has paved the way for a whole new platform economy where new business models emerge and transform entire industries. Crucial for the growth of this platform economy is the data that fuels it, the applications that power it, the clouds that underpin it, and the connectivity that brings organisations and consumers together to create value from it. But as the platform economy continues to grow, ubiquitous, reliable, easy-to-manage, superfast connectivity that brings enterprises, their employees and customers together on a global scale becomes paramount. That is why there is a need to reinvent networking with innovations such as SD-WAN – only then will enterprises be able to grab their share of the growth opportunities that the new digitally driven, cloud-powered economy brings. ### Tackling 2017's anticipated attacks for CIOs Along with well wishes, this New Year we have been inundated with warnings from experts and journalists that 2017 will entail an upsurge of DDoS, IoT and ransomware attacks that will far exceed 2016's record. So with warnings must come action – which is why we propose that prevention is the best form of defence. 2016 saw some of the most well-established and public-facing companies become the targets of cyber criminals, the most notable incidents being the Yahoo hack, which saw one billion accounts compromised, and the Tesco Bank cyber-heist, which was regarded as Britain's largest attack to date after the bank lost £2.5m. These were amongst just a few of a staggering 1.6 billion data breaches that took place in 2016.
Last year also saw some of the largest DDoS attacks on record, with attacks in some instances topping 1 Tbps – and there is no sign of this slowing. In 2015, the largest attacks on record were in the 600 Gbps range; just two years later, we can expect DDoS attacks to keep growing in size, which further fuels the need to tailor solutions that protect against and mitigate these grand-scale attacks. We can only expect to see more relentless and hard-hitting attacks in 2017, so thorough precautions must be taken. The most notorious DDoS attack of 2016 was the Dyn attack, which made major Internet platforms and services unavailable to large swathes of users in Europe and North America. The reality is that we need to brace ourselves for an even higher magnitude of cyber-attacks in 2017, hence the need for cyber-security New Year's resolutions. Effective cyber defence requires paying attention to the technologies that are available and using them in the way they are supposed to be used. Companies that take this approach will construct effective barriers, meaning hackers will go elsewhere and find an easier target to attack. So what are some of the most pertinent threats in 2017 and what can be done to protect organisations and individuals? Ransomware saw rapid expansion in 2016, and this type of cybercrime will develop in 2017 into more sophisticated forms of extortion that add social engineering to the mix, which means we need to really tighten up our security protocols. BYOD and IoT are both emerging trends which pose problems to individuals and organisations. The continued proliferation of devices and the associated attacks will confound CSOs and help threat actors propagate their malicious activity at greater scale. Meanwhile, in the IoT space, 2017 will see the emergence of the DDoS of Things (DoT) as an attack method. By abstracting the devices and the malware they create, we dig into the root of the problem: the outcome, which, in this case, is a colossal DDoS attack. As the DoT continues to reach critical mass, device manufacturers must change their behaviour to help curb it. They must scrap default passwords and either assign unique credentials to each device or apply modern password configuration techniques for the end user during setup. These developments highlight the fact that criminals are becoming more sophisticated and scaling up their attacks. Despite this, two of the fundamental issues that allow these breaches to take place are that businesses are unwilling to spend on, and prioritise, the necessary security, and that there is a lack of education amongst the public when it comes to cybersecurity. With new European laws coming into force this year, companies should feel more inclined to consider security precautions as a priority, but crucially, by giving cybersecurity the attention it deserves and investing in well-managed security controls, damage control won't be necessary. Organisations also have a responsibility to invest in well-managed security tools, which have controls designed to prevent, detect, contain and remediate data breaches.
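The default-password point above is easy to picture with a small sketch; the field names and provisioning flow below are purely illustrative, not any manufacturer's actual setup process:

```python
import secrets

def provision_device(serial: str) -> dict:
    """Assign a unique, randomly generated credential to each device at
    setup time instead of shipping a shared default password."""
    return {
        "serial": serial,
        "username": f"admin-{serial}",
        "password": secrets.token_urlsafe(16),   # unique per device
        "force_change_on_first_login": True,
    }

for serial in ("CAM-0001", "CAM-0002"):
    print(provision_device(serial))
```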
Furthermore, organisations should take care to share simple safeguarding techniques amongst employees and make sure that they are educated about the types of attack to expect, but ultimately protection systems need to be put in place to keep hackers out. As employees are an organisation's greatest tools, the way they contribute to securing the company should also be well-managed. CIOs and CISOs should make it a New Year's resolution to ensure staff have the knowledge, tools and ability to keep themselves and the organisation safe from the myriad of threats that are looking to jump over low barriers or get through chinks in the security armour. With organisations and individuals facing so many threats in 2017, including IoT, DDoS, BYOD and ransomware, it is clear that we all need to be more aware of the threats we face. In order to protect our individual data and to keep organisations safe and secure, it should be our resolution this year to become more personally aware and to invest more in all aspects of security. We should all approach 2017 with an enlightened view towards cyber-security and perhaps next year, the doomsday cyber-security warnings won't be out in such force. ### The case for a data culture and user adoption Businesses are always seeking improvements in how they process and analyse large volumes of data. The usual business case is that it's a major boost to efficiency and effectiveness - but there are human factors too. A data-led approach brings benefits to customer satisfaction, products and services, and employee engagement - when done effectively. Research by Harvard Business Review (HBR) and ThoughtSpot showed that 41% of businesses expect frontline workers to be using AI in the next two years. So if employees are resistant to transitioning to new practices, the corporate culture needs to be examined. This goes beyond just a reluctance where people have not used data-centric processes before. If a culture isn't setting standards for openness, critical thinking, and innovation, it cannot be a high performer or a leader in its space, long-term. That HBR research found that executives at companies lagging in achievement are ten times more likely to say their teams shouldn't be able to make decisions using data. This is problematic because no matter how differentiated the solution, impactful the use case, or strong the executive support is, if no one uses data and smart technologies then immediately useful benefits are lost. As a result, to generate business value through new technologies, organisations must create a culture and environment that encourages user adoption of critical thinking and reliance on the right data at the right time to make decisions. To do this, meaningful change needs to be applied to how companies work to empower their employees to have the confidence and desire to make data-backed decisions. Through this, employees will provide demonstrable value through data-driven insights and vitalise their own skill sets and careers. Building this data culture starts with providing straightforward access to the data employees need, using their own business language and context, via accessible technology. It's worth it. HBR found that 72% of organisations empowering those on the frontlines with access to data are seeing gains in productivity, 69% in customer satisfaction, and 67% in product and service quality. This makes the use of analytics and smart technologies to drive business performance the battleground to compete on.
It's open to every business, and therefore those that take the lead will realise early, solid, and long-term gains, while laggards may never make up that ground. Choosing the right platform to drive adoption Simply deploying technology is not necessarily enough to enact a culture shift. Many business intelligence solutions answer the same questions repeatedly and don't go far enough to achieve a true data culture. As such, the technology used must be tailored to those using it, with the agility to react to pressure situations and adapt to different users' needs. In fact, the most impact is gained from technology that lowers the barrier to getting insights, so that users are focused on learning how and when to apply data, not on how to use data technologies. Platforms infused with AI take this one step further, bringing automation and contextual information to the user through AI-driven analytics. This utility goes a long way towards encouraging adoption, which in turn shifts the business culture and drives success against business goals. One size does not fit all Even when considering these elements, user adoption is often hard to predict. The users that are expected to adopt quickly can be reluctant - or those expected to be uninterested are the first to use it. Often, this variation depends on how familiar and comfortable these individuals are with current technologies and whether new solutions are seen as an impediment or a help to their existing processes. If the benefits of using the new technology are clear, adoption will need less outside encouragement. While adoption can be hard to predict, with variation between users, one thing is true: when clear business value is demonstrated, business users are motivated to adopt a solution to realise that value for their role, their career, and their business. This can materialise as actionable insight, reducing the time taken to complete tasks and freeing up the availability of the traditional analysts who would have previously been running data tasks for others. As such, users must be able to recognise the individualised, contextual value. For this, it is best to allow them to get their hands on the solution in question so they can begin to experiment and see how it can work for them. Firstly, clearly involve business users in defining use cases and problems that can be solved by data, so the business is building applications and bringing in technology it actually needs. Secondly, show users what's in it for them personally, either through examples of their peers' wins, career opportunities, etc. The importance of leadership throughout If the leaders of an enterprise are championing the cause of making data more accessible, then it's only appropriate that they practice what they preach. Tools driving business insight and value should not be implemented in isolation. The tools are a part of changing how employees perform their roles, and may need to be supported by holistic changes to culture to reward the right mindset and behaviours. Executives need to practice what they preach. If they want to see adoption of data and analytics, they must adopt it themselves. They should include data to support decisions they make, ask for insights when approving a course of action, etc. Other initiatives to consider revolve around ensuring employees understand how to use these tools. Training and reskilling won't happen without investment and a demonstration of their importance to the business.
Beyond further education and resources for upskilling users, organisations should also look at how they can create processes by which power users can partner with new users to help them learn and grow in confidence. That research from HBR and ThoughtSpot revealed that 74 per cent of respondents who use analytics solutions saw long-term increases in productivity when frontline workers were empowered to use the data at their disposal. Furthermore, 69 per cent said they've increased both customer and employee engagement, indicating that return on investment is most achievable for those who commit to using the solutions long-term. In data and in analytics, the value of new technologies is tied to their level of adoption. Even with the best technology for the job and all the support possible from the C-suite, these technologies are contingent on actual users. By encouraging a company-wide culture that prioritises the end-user, you will drive adoption in your organisation and help it ultimately become more insight-driven - for greater success and growth. ### Embarking on Digital Transformation: What should companies look for before investing? Technology, organisations and user expectations have all changed so dramatically over the last decade that companies are now looking to leverage this shifting landscape to fundamentally alter the way they conduct business. This is in order to improve interaction with, and the satisfaction of, employees, partners and customers. The widespread use and adoption of consumer-based applications have re-set expectations of what is possible and mean that requirements are now squarely focused on those of the users. A recent global survey of IT professionals conducted by Forrester Consulting found that 70 percent of organisations consider digital transformation to be a high or critical priority, and are hence employing intuitive software solutions to enable employees to work smarter and more efficiently. Improved functional capabilities of mobile technology, changing approaches to the concept of the 'workplace' and widespread adoption of cloud/web-based services mean that users demand flexible software in order to work effectively. Interoperability of systems is fundamental and has become essential to the improvement of business processes. The attributes that businesses should specifically look for when choosing a platform to enable digital transformation are varied, but the importance of each cannot be overstated. The following criteria are key to providing a solution that can both leverage today's landscape and provide a foundation for future evolution. A Platform with Intelligence The management of content within an organisation can no longer be considered in the context of a static archive. The content is constantly changing, with users modifying and accessing documents via different processes and locations. Seamless integration between content, forms, rules-based triggers and alerts, and a fully functioning process engine is key to enabling content to be accessed by various users at the times they require it and allowing for necessary collaboration. Content and process must be managed seamlessly within a single platform in order to deliver desirable results.
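A toy sketch of the kind of rules-based trigger described above may help; the document types, events and action names here are hypothetical:

```python
# Minimal event-driven trigger: when a document of a given type changes,
# the matching rules start a process or alert the right users.
RULES = [
    {"doc_type": "contract", "on": "updated", "action": "start_legal_review"},
    {"doc_type": "invoice",  "on": "created", "action": "notify_finance"},
]

def handle_content_event(doc_type: str, event: str) -> list:
    """Return the actions fired for a content event."""
    fired = [r["action"] for r in RULES if r["doc_type"] == doc_type and r["on"] == event]
    for action in fired:
        print(f"{doc_type} {event}: triggering {action}")
    return fired

handle_content_event("contract", "updated")  # starts a legal review
```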
Agility and an Open Approach A service-oriented approach needs seamless integration with a myriad of proprietary and bespoke systems, which will inevitably be required not only to satisfy today’s demand, but those of the future as well. Open source software has provided a whole new level of efficiency and agility in this area, by allowing developers and system integrators to readily see that the code does not restrict them to a vendor-dictated set of integration points. The result is typically a faster, better-written integration or development and a closer fit to user requirements, which in turn drives a lower TCO. Agility is further enhanced through the provision of user configurable tools to enable new or modified processes/forms to be quickly created or modified by business users as business evolves and develops. [easy-tweet tweet="A service-oriented approach needs seamless integration with a myriad of proprietary and bespoke systems" hashtags="tech, cloud"] Flexibility in Deployment Fundamentally, the decision of where a platform or application runs and where data and content reside should not be dictated by a vendor. Organisations should be free to make decisions on the location of services and data based on their business requirements and infrastructure. For most organisations, the decisions between in-house, private hosted or public cloud are rarely an either/or decision and what is critical is their platforms and applications being able to reside in either or all and still be able to interoperate. Businesses need to be able to share content or business processes with cloud-based users (often external to their organisation) on an as-required basis. In this way businesses can take advantage of ubiquitous access benefits of cloud without relinquishing control of their critical information. Smart Search Capabilities Given the volumes of data involved, the old paradigm of traversing a cabinet and folder structure to find information is obsolete. Intuitive search with real-time matching of the first few characters accompanied by automatically suggested facets to help refine the search are just some of the features that will enhance the user’s efficiency and minimise the time spent searching for vital content. Automated File Plans The ability to maintain records in accordance with regulations relating to retention schedules and file plans has been a major challenge for most organisations, as historically systems have required users to register a record and this step can be prone to error. In order to implement record management in an effective way, record managers must be able to set a file plan and retention rules which are automated, negating the need for users to embark on an extra step. The ideal platform must combine all of these attributes to provide a robust product that will benefit organisations and enable effective and seamless business transformation. It results in a highly scalable and intelligent platform that can be deployed on-premise or in the cloud to provide content and business process capability across an organisation’s infrastructure, meaning it instantly becomes ‘fitter’ and more resilient. ### 5 Reasons To Adopt Cloud During The Pandemic The unpredictable pandemic has forced various business sectors to shift their workforce and adopt work-from-home culture overnight. Every ongoing business process switched its operations entirely online within a blink of an eye.  
Although the global pandemic has jeopardised the world's economy, some businesses are still able to sustain themselves, and even thrive, with the help of advanced technologies such as cloud computing. According to research, with the majority of employees working from home, cloud adoption has shown substantial growth. In 2018, the global market for cloud applications was worth $118.7 billion and is anticipated to reach $437.9 billion by 2026. That's impressive. Cloud computing is a notable example: a technology with which businesses can accelerate operations effectively and without interruption. Reasons to Accelerate the Push To Cloud Services During This Pandemic "Cloud is the saviour amidst the pandemic." Over the past few years, organisations have embraced a series of technology innovations – Artificial Intelligence, Big Data, Blockchain technology and so on – that have made business processes easier with less investment. However, all these technologies work optimally only when combined with the cloud platform. So, when companies transition to the cloud, they immediately improve the digital experience by adopting the latest technologies. Let's dig deeper into the reasons to adopt cloud during the pandemic. 1. Business Continuity and Disaster Recovery By adopting the cloud, you gain business continuity that can prove beneficial during unusual circumstances like those we are all experiencing with the pandemic. Business continuity means that you can maintain business operations as usual, even if you are in a different location from your team members. Another argument for adopting a cloud platform for business continuity is the practical unpreparedness of most businesses for unusual and sudden events, such as a lockdown. "Preparing to store data on the cloud now means planning data availability for the future." In other words, if you plan to store your data in the cloud today, you are also planning for its availability after something unavoidable, such as a disaster, happens. Disasters can be of any type – a cyclone, a tornado, or merely a power outage. A natural disaster cannot be prevented; hence, having a cloud-based platform can mitigate the risk of data loss. When data is stored on cloud-based servers, backup planning comes with it. Data can be backed up even at the time of a disaster, so that day-to-day operations can continue without a stop. 2. Flexibility and Scalability Due to the pandemic, fluctuation in business demand is quite evident. When businesses are impacted by a natural crisis, it becomes hard to predict growth. In such cases, companies tend to analyse the resources currently allocated and the future resources needed to run the business. This is where companies can take full advantage of the cloud. Cloud offers high-level flexibility to scale the resources needed according to the demands of the business. The business might grow for a few months, and demand may then take a downturn. During the growth period, there will be a need to add more users and storage space while working from home. Cloud service providers can scale accordingly, without the business having to purchase hardware or a new set-up. 3. Collaboration and Accessibility The effect of the pandemic has led offices to shut down in the short term, but who knows if the results will be long-term. After all, it has already been more than half a year.
The recent work-from-home spike indicates that companies will need to manage teams entirely remotely. In such cases, messaging apps and tools must not be the only things at your disposal. For instance, if you are an accounting-centric firm, you may need to be in touch with clients to share updates. Not only accounting firms but any other business that needs to be in constant contact with customers may need better collaboration solutions. Sharing documents is tough, but not when you host on the cloud. Cloud adoption enables businesses of all types and sizes to work on the same file simultaneously anywhere around the world, making communication among team members much more manageable, minimising latency in work, and restricting real-time access to a limited number of authorised users. 4. Cybersecurity Security in any company is a primary concern, and any intrusion – cyberattacks, malicious attacks and so on – may hamper business operations. Cyberattacks have always been a serious threat to the increasingly digital business world, and the pandemic is only worsening the problem. Whether you are working from home or the office, security threats can never be dealt with if proper strategies are not implemented. When business data is stored in the cloud, the cloud service provider makes sure the information is secured behind multiple layers of security, giving the company strong security platforms that detect and eradicate threats in real time. 5. Reduced Overhead Costs Every organisation has a planned budget, and the basic principle is to operate within the allocated budget to drive profitability. Doing so requires the adoption of cost-effective business solutions. When organisations set up their own on-premise infrastructure, the cost of deployment and of managing that infrastructure is much higher than what cloud computing service providers offer. The cloud itself means using someone else's computer. Cloud adoption has an altogether different economic profile. Cloud enables organisations to cut down overhead costs such as purchasing local software and servers, and employing the IT professionals who manage the systems within the office. Cloud computing services can reduce these costs many times over compared with running individual corporate operations. During the pandemic, the maintenance of on-premise infrastructure becomes difficult, and this is when a cloud service provider's infrastructure can be of great help. Cloud is the Saviour Amidst the Pandemic Cloud is not only the answer to concerns like data security and DR; it is ubiquitous, helping businesses accelerate in the fast-paced, digitalised world. During this pandemic, the transition to cloud computing might be a smart collective decision. ### A little bit of daily scrubbing can rid the internet of DDoS The recent Dyn attack - which is, in fact, the largest to date - brings to light the blunt force of distributed denial of service (DDoS) attacks. These attacks are relentlessly persistent; the worst are those that continue for days, as this leads to disruption that could affect service for days or even weeks. The attacker must use many hosts in order to sustain an attack for a lengthy period. If it all came from a single data centre, the attack would quickly be stopped by the data centre operator, more than likely within a day.
Considering how many home networks participated in the Dyn attack, it is no wonder it is almost impossible to shut down. Thirty-thousand systems sending 10 Mbps of attack traffic results in 300 Gbps of attack traffic. Many small trickles come in from many directions, becoming a massive flood once it reaches the target. The most effective course of action would be if people kept their home systems clean and up-to-date on patches. Scrubbing at the target site is a tried-and-true technique, but it's a matter of capacity: scrubbing 300 Gbps of attack traffic takes some serious muscle. Stopping a DDoS attack near its many sources is much better, and this is a matter of being a good internet neighbour. And this is where the true opportunity lies. By deploying smaller-scale scrubbing technology at the edges of the Internet, closer to office buildings, and closer to home users, most DDoS attacks can be mitigated before they even make it out of the neighbourhood. This is especially true for ISPs and providers that operate sub-10 Gbps links to hundreds or dozens of end customers. [easy-tweet tweet="The sooner we realise that DDoS is a common problem, the sooner we can all play a role in minimising it." hashtags="security, cloud"] More often than not, the enforcers are not aware of their participation in a distributed attack, but their traffic patterns are clearly visible to their Internet provider or small enterprise security teams. By cleaning egress traffic before sending it upstream, you are not only a good Internet neighbour, you can also save substantial peering costs over the years. Just as it is good common sense to drop any packet with a non-local source address, it is equally good sense to scrub malformed packets that have no business on the internet. No blunt instruments needed at the source end, just snip out the few bad packets and let the majority through. The sooner we realise that DDoS is a common problem, the sooner we can all play a role in minimising it. Big sites will surely always need special protection, but as individuals we can do our best to scrub off a couple Mbps or Gbps of outgoing traffic, helping to block the trickles that could become a flood. ### How to avoid becoming a victim of ‘cloud shock’ Buyers of public cloud need to ensure that they understand the full cost of the services they require or they could have an unpleasant shock when their first bill arrives. The headline costs for public cloud services can appear to be remarkably good value, with server/instances running at just a few pounds per month. However, there is much more to running a service/application, and organisations need to ensure they consider all aspects of how their services operate if they are to avoid an unpleasant shock when their first bill arrives. This does not mean public cloud is necessarily more expensive or a bad choice – simply that it is vital to prepare a comprehensive business case before migrating services, as once you have migrated, you are stuck with what the provider gives unless you undertake another migration, which is likely to be time-consuming and risky, plus will bring further business interruption. [easy-tweet tweet="Buyers of public cloud need to ensure that they understand the full cost of the services they require" hashtags="cloud, consumer, tech"] All public cloud services are metered in some way; this can be both a good and bad thing, dependent on the application or service and its expected use.  
The review should consider three factors. Firstly, understand what is included in the proposed cloud service and the service's charging structure. Are there other elements required to safely run the application that are not included in the base service price, such as security, resilience, management, patching and back-up? Secondly, what is the application's purpose and what are its likely usage patterns? Thirdly, how fast is it likely to grow, in both users and data? Think of buying public cloud as if you are buying space inside the shell of a building to create a flat. You get the basic premises, but have to decide how to provide everything else you need, including management and monitoring of back-end components, backup, anti-virus, and patching. As you are sharing the facilities with other residents, you also need to provide your own locks for the internal doors – in the case of cloud services, security. Take a simple application that you believe your business uses 9am to 5pm weekdays. With metered cloud costs, hosting this in public cloud can look significantly cheaper than fully loaded internal costs. However, the application will probably require additional systems such as login/authentication, network etc., and these need to be powered up beforehand, so the requirement quickly becomes 7am to 9pm. Then add multiple interactive systems, increasing complexity and cost. Shutting down and restarting has to be sequenced, and some employees will want access outside core hours, so you actually require 24x7 running. Your costs are now three times the originally budgeted price and you still need to add monitoring and management. Users tend to keep servers, data and all network traffic running 24x7, so end up paying significantly more than they originally anticipated. The second aspect requires an understanding of the finer points of the applications you are planning to move. In public cloud services such as AWS, it costs 1p per GB each time servers in different domains talk to each other, and 8p per GB to send data over the Internet. This seems minimal, but with some applications, servers have a constant two-way dialogue and hence costs can quickly escalate. Similar problems can arise when trying to put a custom application into Microsoft Azure. If an application is not optimised for public cloud, it may be more appropriate to retain it in-house or use a managed cloud service. Finally, organisations need to consider how much data they are storing in the cloud. Which organisation is hosting and managing less data than it was a year ago? The best way to keep this under control is data classification, followed by a visit to each department to say: "we have this volume of your data; how important is it to the business and can we delete it?" For this reason some people refer to cloud as 'the revenge of the ITIL manager'! Despite these issues, public cloud is a good option in many cases. If there is a good SaaS offering available, it makes sense to use it. However, many providers are currently offering something that is more like PaaS, so you will need to provide some aspects of the service yourself, or use a managed cloud service. To prepare a watertight business case for a potential move to cloud, the first step is to baseline your existing IT provision against business requirements. This enables you to categorise and prioritise the systems appropriate to be migrated to cloud.
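To make the data-transfer point above concrete, here is a back-of-the-envelope sketch using the illustrative rates quoted earlier (1p per GB between domains, 8p per GB out to the Internet); real pricing varies by provider, region and tier, so treat the numbers as an example only:

```python
# Rough monthly transfer cost for a "chatty" application, using the
# article's example rates rather than any provider's current price list.
INTER_DOMAIN_RATE = 0.01  # £ per GB between domains
INTERNET_RATE = 0.08      # £ per GB sent over the Internet

def monthly_transfer_cost(inter_domain_gb: float, internet_gb: float) -> float:
    return inter_domain_gb * INTER_DOMAIN_RATE + internet_gb * INTERNET_RATE

# Two tiers exchanging 50 GB a day, plus 10 GB a day served to users
# over the Internet, across a 30-day month.
print(f"£{monthly_transfer_cost(50 * 30, 10 * 30):.2f}")  # £39.00
```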
You can then design the new architecture for those services and plan the migration before going to market. Most suppliers have different cost models, but armed with your definitive blueprint you can make a realistic comparison between the various offers. It is also important to consider the soft elements of service delivery, such as SLAs and how they provide ongoing support. You could accidentally increase costs if you select a cloud platform or supplier that insists all transactions go via a portal, with no opportunity for human interaction, when the processes your organisation uses demand human interaction. Some services can and should run in the public cloud, some in private cloud and some should remain on-premise, creating a hybrid infrastructure that needs managing and monitoring. You should, therefore, retain key skills in-house to control both the costs and the security of your new hybrid cloud environment. You have to take responsibility for asking your cloud provider to deliver the appropriate levels of information security, and need to measure and audit them yourself to ensure that the relevant security is applied. ### Amazon Web Services Announces AWS re:Start to Boost Cloud Computing Amazon Web Services today announced the launch of AWS re:Start, a free training and job placement programme for the UK to educate young adults as well as military veterans, reservists, and their spouses, on the latest software development and cloud computing technologies. Working with QA Consulting, The Prince's Trust, and the Ministry of Defence (MoD), AWS re:Start also brings together AWS Partner Network (APN) partners and customers to offer work placements to 1,000 people as part of the programme. The AWS re:Start programme furthers AWS's commitment to helping organisations of all sizes and developers of all skill levels transition to cloud computing. AWS re:Start is designed to accommodate differing levels of experience – even those with no previous technical knowledge can sign up. Participants who join AWS re:Start will complete technical training classes, led by AWS certified instructors, and will complete work experience to gain on-the-job training. They will learn about cloud computing and how to architect, design, and develop cloud-based applications using AWS. They will also learn how to set up new cloud environments using proven best practices in security and compliance and to build applications using software development toolkits for popular languages, such as Python. In addition to the technical training, The Prince's Trust, through their 'Get into Technology' programme, will support students with mentoring, soft work skills, and help in applying for jobs, including resume writing and interview skills. "AWS re:Start provides a platform from which individuals, no matter what their background, will be able to launch a new career and build a future for themselves in technology," said Gavin Jackson, UK Managing Director at AWS. "We made a significant investment on behalf of our customers with the launch of a new infrastructure region in the UK and today, with AWS re:Start, we are deepening that commitment through training and hands-on experience in the latest cloud computing technologies, helping to provide an 'on ramp' for the UK workforce into highly-skilled digital roles." Karen Bradley, UK Secretary of State for Culture, Media, and Sport, says, "Increasing digital skills in the UK is a major priority for the Government and we are working to make sure that everyone has the skills they need.
We welcome the launch of AWS re:Start which is a fantastic initiative bringing together employers from different sectors and providing the foundation on which they can continue to train and grow the UK’s digital workforce.” Organisations that have pledged job placements to AWS re:Start include, Amazon.co.uk, Annalect, ARM, Claranet, Cloudreach, Direct Line Group, EDF Energy, Funding Circle, KCOM, Sage, Tesco Bank, and Zopa. Participants in the programme can expect to be eligible for many highly sought after entry-level technical positions within these companies, from first line helpdesk support to software developer, network engineer or IT recruitment and sales based roles, amongst others. They will also have the fundamental knowledge needed to immediately start working with AWS and building their own technology startup business. “Our collaboration with AWS re:Start will give young adults the technology skills they need to develop a career for themselves,” said Richard Chadwick, Director of Programmes and Development at The Prince’s Trust. “Young people who sign up to our ‘Get into Technology’ programme will come out of the course having worked within some of the largest businesses in the world. We work with vulnerable young people and give them the practical support needed to stabilise their lives and move towards employment.” As part of this announcement AWS has also signed the Armed Forces Covenant, which establishes how businesses support members of the UK armed forces community and guards against discrimination returning service men and women may face when entering the civilian workforce. “The launch of AWS re:Start, and AWS’s signing of the Armed Forces Covenant, validate the company’s commitment to our servicemen and women. The program recognises the value that reservists, veterans, service leavers, and their spouses can contribute to UK businesses, and provides them with a clear path for learning, and applying some of the technology skills in most demand across industries,” said General Sir Chris Deverell, Commander, Joint Forces Command, who co-signed the Covenant on behalf of Defence. Training content for the AWS re:Start programme will be curated by AWS in collaboration with QA Consulting, who will also deliver the training courses.  QA Consulting is an AWS Partner Network (APN) Training Partner. “At QA we have been delivering AWS training since 2013 and during this time have witnessed the tremendous impact AWS has had on the way UK businesses innovate and develop new services,” said Tony Lysak, Managing Director at QA Consulting. “When we were asked for our support to launch AWS re:Start, we jumped at the opportunity to build a customised training program that takes individuals from knowing nothing about IT, to competent technologists and highly-desired employees for the numerous organisations moving to the AWS Cloud as fast as they can.” The first intake of participants for AWS re:Start is scheduled for March 27, 2017. ### How cloud storage became a target for hackers More and more businesses are now using the cloud to store their customers’ data. As with all new technologies, hackers will look to exploit any security vulnerabilities they can find. While the attack on Friend Finder Networks does not appear to have been a cloud-based attack, a number of high-profile attacks have taken place against cloud storage systems in recent years. [easy-tweet tweet="More and more businesses are now using the cloud to store their customers’ data." 
hashtags="cloud, security"] All of the big supermarkets now let you order your groceries online and websites exist that can be used to order a takeaway, while Amazon can now be used to order almost anything! Most of these services now also have apps to make such transactions even easier for the consumer. This is because increased internet penetration, along with affordable data packs, has meant consumer-focused businesses have boomed in the last ten years. All of this means that consumers are now increasingly comfortable making online financial transactions. I'd even argue that consumers have now come to expect the ease and convenience of making financial transactions in this way. However, the public should be wary of handing over their financial details so easily. Businesses, particularly big businesses, have embraced cloud storage options in recent years to store their data because, among other reasons, cloud storage solutions mean they no longer have the numerous costs associated with storing all their information in large data centres. However, some businesses don't seem to understand the potential hazards of using such a method for storing customer data. While the cloud has opened up new frontiers, it's also opened up a whole new world of security issues, as hackers now have another way to try and access people's personal and financial information. Therefore, it is vitally important that businesses processing and storing customer information do their utmost to ensure it is secure and safe from those with sinister motives. This, unfortunately, is not always the case. The last two years have seen a number of high-profile attacks against cloud storage systems. Indeed, this is not the first time Friend Finder Networks has been attacked, as its site, Adult Friend Finder, was previously attacked in May 2015. Furthermore, the attack on Apple's iCloud platform that resulted in the release of the personal photographs of many high-profile figures was a big talking point in the summer of 2014, and it was only last month that the hacker was finally sentenced for his crimes. Similar to the attack on Friend Finder Networks, in July 2015 a group called The Impact Team claimed to have stolen the personal details of all 37 million users of the extra-marital dating site Ashley Madison in a separate attack. The next month, this information was released in two large data dumps. In turn, this led to a number of people being targeted by extortionists for large sums of money. In all of these cases, the hackers were able to access the information following a single hack. As someone working in the FinTech sector, where large amounts of customer financial data are processed on a daily basis, I find this very worrying, particularly given that more and more people now make online transactions. However, it may be unfair to only highlight cloud storage, as hackers will attack wherever they can find a weak spot in a company's security. It may simply be that the initial high-profile attacks put the spotlight onto cloud storage systems, making them a well-known target for subsequent hackers. Given these most recent revelations, I'm sure we can all agree that online security needs to be a top priority.
It really isn’t difficult either, as common sense practices, such as businesses keeping all their security software up-to-date and making sure their privacy and spam settings are rigid, really do go a long way to helping a business keep itself and the information it keeps protected. ### Safeguarding NHS patient data is a matter of life and death A new survey suggests many NHS trusts are failing to invest enough in protecting their computer systems and data. The NHS must ensure data protection and backup provisions are fit for purpose, otherwise, patients’ lives may be at risk.  [easy-tweet tweet="Many NHS trusts are failing to invest enough in protecting their computer systems and data." hashtags="tech, data, security"] Plymouth’s Derriford Hospital was hit by a ransomware attack earlier this year that shut its systems down, it was revealed recently. Rather than pay the hackers off, the hospital restored its systems from a back-up. Unfortunately, judging by a recent story on Sky News, many other UK hospitals simply wouldn’t be able to rely on back-ups, because their parent NHS Trusts have spent nothing on cyber-security. Sky collected information from 97 NHS trusts about their spend on computer security and whether they had been the target of any cyber attacks recently. Seven NHS trusts, covering 2 million patients, admitted that they spent nothing on cyber-security in 2015. Another 45 NHS trusts were unable to specify their cybersecurity budget at all, Sky says. Sky’s investigation also revealed that trusts are suffering an increasing amount of personal data breaches, from 3,133 in 2014 to 4,177 last year, and that cyber incidents are accounting for more breaches, from eight in 2014 to 60 in 2015. In the face of an increase in malicious attacks by hackers and criminals, the NHS has been slowly moving towards using the Cloud to protect its data. However, given the nature of patient data, there are massive challenges for any NHS trust and Cloud services provider to address. Indeed, when the NHS started exploring using the Cloud a few years back, it had to take legal advice about whether it would be able to do so. The answer was yes, but only if the data was actually being stored somewhere in England. [easy-tweet tweet="The NHS has been slowly moving towards using the Cloud to protect its data." hashtags="cloud, security, tech"] The question of whether the NHS should be using Public or Private Cloud services has also been raised. There is no legal reason why the NHS cannot use the Public Cloud, but trusts may prefer the apparent extra security of having their own Private Cloud. Whether public or private, though, NHS trusts need a suitable and reliable cloud service provider. The highest priority here is to ensure that the provider’s data centre resources are based in England, because Patient Identifiable Data (PID) must not leave the country. Furthermore, the provider’s internal data backup must also only be to English data centres – in other words, if the data centre itself suffered some kind of catastrophic failure, where are its back-ups stored? If sensitive data is mirrored outside England, then it would be illegal for the NHS to use them. Service level agreements concerning how, where and when data is stored and under what conditions it is transferred back should also be key considerations when selecting a cloud data protection services provider. The end user should also take into account the provider’s level of encryption to prevent any accidental or targeted misuse of data. 
There are legal requirements for the level of encryption that must be used for NHS data, particularly PID. Building a private cloud solution for NHS trusts is admittedly somewhat more complex than using the Public Cloud, but it would give NHS bodies increased control over critical patient data and digital information. While the NHS trust will have more resources to manage, a private cloud solution has a comprehensive range of options with regard to the provisioning of services, access rights, selection of applications and device support. That, in turn, gives employees greater flexibility, the tools they need for their job, and the ability to deliver the same user experience as they would get with a public cloud. The safety of data and devices would have to be guaranteed according to internal standards specified by the NHS, of course. For an increasingly cash-strapped NHS, the potential cost savings and efficiency benefits offered by using the Cloud to back up vital data are obvious. However, as you would expect, the overriding concern must be the integrity and safety of patient data. ### Cloud security - myths and truths Even in this day and age, some businesses remain fearful of the Cloud. Are they right to be circumspect and untrusting? Is Cloud data an open door to hacking and cyber-crime or is this an irrational fear? We’ve all seen sensationalist media coverage around Cloud security involving celebrities, banks and even accounting software, so what is the truth? [easy-tweet tweet="Is Cloud data an open door to hacking and cyber-crime?" hashtags="tech, cloud"] Data security for a business is paramount and with so many companies feeling comfortable about moving their business to the Cloud, surely the Cloud cannot be as risky as we are led to believe? But how do we distinguish the myths from the truths? Myth 1 - All Clouds are created equal What is “The Cloud” really? It’s still computers, databases and the internet. There are so many different offerings of “Cloud” services that bunching them all together under one heading can be misleading. At a personal level, for example, most people don’t know where their data is anymore. So when you save your iPhone photos to iCloud or manage your Facebook profile, where does that information reside? Who takes care of it? What happens if you can’t access it one day or it disappears altogether? This may be a small risk to an individual but this level of uncertainty would be entirely unacceptable for a business. Myth 2 – Cloud business systems employ the same level of security Obviously, enterprise class business systems need to be far more secure than personal 'data Clouds'. However, you can’t assume that all Cloud business solutions have the same level of security. The measures different providers put in place can be worlds apart! 'Best in class' Cloud providers have extremely tight security measures in place, including: Top quality data centre architecture that includes two geographically separate data centres. Application security comprising industry standard SSL encryption and application-only access so that users can only access the application features and not the underlying database.  Audit trails, restricted user access, IP address restrictions and robust password policies are also key for making sure a business’ data is as secure as possible. Continuous system monitoring by a dedicated security team so that any suspicious activity is quickly identified and dealt with. 
Background checks on Cloud provider's staff and strict physical access restrictions to the data centres. ‘Best in class’ security certifications, providing independent verification of the system’s security credentials. Research is a must here. Security is a primary consideration and so each business needs to conduct its own diligence. Myth 3 - Choice of technology partner has little bearing on Cloud security The term “Cloud Computing” is popular so it stands to reason that traditional software companies see a wider market for their applications if they, too, are “Cloud”. Be wary of re-engineered applications, or as some people call them “fake Clouds” which could be something as simple as your own software running on someone else’s server. This is not solving the challenge of moving to the Cloud, this is just taking your server off your premises and moving it somewhere else. If you want to enjoy the benefits of “true Cloud”, look for a partner who works only with true Cloud applications and does not volunteer for the onerous task of providing hosting. Myth 4 - On-premise systems are so much safer It’s interesting that so many business leaders still consider on-premise business systems as so much safer than Cloud solutions. Yet storing data on-premise is akin to keeping all your money in a shoe box under the bed. [easy-tweet tweet="So many business leaders still consider on-premise business systems as much safer than Cloud solutions" hashtags="security, cloud"] Far too many businesses still have on-site servers that are inherently risky due to location, questionable back-up processes and defective security measures. Servers in unsecured places and business owners with data backed up to USB devices on keyrings are worryingly common. And how many businesses test whether their servers could be restored if there was a devastating fire, for example? Whilst cyber-crime needs to be protected against, how many business owners seriously protect themselves against threats from ‘insiders’? The malicious theft of data from a disgruntled employee, a fraudulent act from an unscrupulous insider and negligent/accidental behaviour that creates a security breach, are still far more common than cyber-attacks. An on-premise server offers ‘insiders’ far greater access to the company’s precious data! It’s important to keep concerns about the Cloud in context. If the combined purchasing power of a global installed base can make best-in-class security and availability affordable to the majority, then surely anything that takes the risks out of a business is a good thing? Best-in-class Cloud software is out there and so perhaps it’s time to stop worrying about what could happen, and look to see what’s available now and how it could help your business? ### 4 Open Source Security Predictions for 2017 Organisations of all sizes and types are expanding their use of cloud and mobile applications, which rely heavily on open source components; and these software elements live outside the company firewall. Hackers have learned that applications are the weak spot in most organisations’ cyber security defenses and widely available open source vulnerability exploits have a high ROI, allowing them to compromise thousands of sites, applications and IoT devices with minimal effort. With that in mind, here are four predictions concerning open source security that I think are distinct possibilities for the coming year. [easy-tweet tweet="Open source is a great tool, and not something that organisations should fear." 
hashtags="tech, cloud, opensource"] 1. The number of cyber attacks based on known open source vulnerabilities will increase by 20 percent. Why? While open source is no less (and no more) secure than commercial code by itself, there are several characteristics of open source that make it an attractive target: Open source use is ubiquitous, and therefore offers a target-rich environment. Open source vulnerabilities are publicly disclosed in the National Vulnerability Database (NVD), and references are often made to exploits that “prove” the vulnerability. The support model for open source is usually the opposite of commercial software. For the latter, a service level agreement is typically in place that requires the vendor to “push” updates to its customers and notify them of security issues. With open source, users have elected to download the component and comply with its license. They also take responsibility for monitoring the project for updates, including security issues, and deciding whether or not to “pull” the updates. 2. In 2017 we will continue to see high-profile, high-impact breaches based on open source vulnerabilities disclosed years previously, such as Heartbleed, Shellshock, and Poodle. Why? Black Duck’s Open Source Security Audit Report found that, on average, vulnerabilities in open source components used in commercial application were over 5 years old. The Linux kernel vulnerability discovered 8/16 (CVE-2016-5195) had been in the Linux code base since 2012. Most organizations don’t know about the open source vulnerabilities in their code because they don’t track the open source components they use, and don’t actively monitor open source vulnerability information. 3. 2017 will see the first auto manufacturer recall based on an open source breach. Why: A typical new car in 2016 has over 100 million lines of code. Automobiles are becoming increasingly intelligent, automated, and most importantly, internet-connected. This will exacerbate a problem that already exists — carmakers don't know exactly what software is inside the vehicles they manufacture (most of the software that binds sensors and other car hardware together comes from third-parties). That software almost certainly contains open source components with security vulnerabilities. Vulnerabilities in open source are particularly attractive to attackers, providing a target-rich environment that may have disastrous implications to a moving vehicle. 4. At least one major M&A deal will be put in jeopardy because of a discovered security breach. Why: As the Yahoo data breach demonstrated, any M&A transaction can be hindered by software security issues, especially when for more and more companies the software is their business. Companies develop their proprietary software code over the course of many years and many millions of dollars, and the software is their distinct competitive advantage. Open source issues in their proprietary code can be very damaging to the value of the software franchise from a license compliance and application security perspective. With some buyers if there's an IT issue or an open source issue, they will not acquire the company at any price. [easy-tweet tweet="With some buyers if there's an IT issue or an open source issue, they will not acquire the company at any price." 
hashtags="cloud, tech"] Even though open source is an essential element in nearly every piece of software today, most companies are blind to possible security issues in the open source components contained in their code – issues which often remain undiscovered until a code audit is performed. While these predictions should be of concern, I want to emphasize that open source is a great tool, and not something that organisations should fear. Open source is not the problem – it’s a lack of visibility into open source that’s the issue. As I noted at the beginning of this article, open source is no less (or more) secure by nature than commercial code – it’s software and it will have vulnerabilities. Open source only becomes a problem when organisations don’t have visibility into the open source they use, or don’t track the ongoing security of the open source components in their code. ### What lessons should developers learn from Pokémon Go? The success of Pokémon Go has been a truly remarkable milestone for mobile gaming. The game allows users to hunt down and capture virtual monsters with ties to real world locations, using Augmented Reality (AR) technology to display the creatures in parks, homes, and offices around the world. [easy-tweet tweet="Anyone hoping to model their strategy on the success of Niantic should also pay attention to what they missed" hashtags="tech, cloud, gaming"] Although it’s popularity has waned with the end of summer, the game nevertheless broke five Guinness World Records, including most revenue grossed by a mobile game in its first month at $206.5 million, and the shortest amount of time to gross $100 million dollars – just 20 days. However, anyone hoping to model their strategy after the success of Niantic should also pay attention to what they missed – especially when it comes to security. Bots and cheats One of the biggest problems encountered by the game has been hackers accessing APIs to facilitate cheating. Pokémon Go has been plagued by “botting” – the use of scripting and tools to automatically play the game at levels impossible for a human user. Botting is a common issue for many popular online games and can ruin the economy for honest users by making competitive play impossible—either by currency or skill level. Despite the best efforts of the developer, bots continued to spoof the communication between a legitimate client and the server APIs. This means they can find and capture creatures by sending spoofed GPS data, as well performing other actions such as collecting items and fighting monsters without direct user input. This extra traffic puts more strain on the game’s servers; and also spoils the fun of legitimate players who cannot keep up the competitive aspect of the game. Cryptographic keys are one of the most important prizes for hackers looking to break into an app and access the server to facilitate botting, as they enable encrypted data to be deciphered. Keys are used for everything from binding devices to accounts to proving user identity, so breaking them gives hackers a clear window for wider malicious activity as well. These keys and signatures are also intended to ensure that only the legitimate clients are able to utilize the game server APIs. Access is usually regulated with a cryptographic challenge-response protocol, which usually requires the mobile client to maintain a public and private key material for any asymmetric cipher. 
[easy-tweet tweet="Keys are used for everything from binding devices to accounts to proving user identity" hashtags="gaming, cloud, tech"] Beating the cheats Both Pokémon Go’s developer and its players have been fortunate that hackers have been content with facilitating bots or discovering game secrets hidden in the code, rather than launching harmful attacks. In order to see off anyone attempting this kind of access, cryptographic key protection and binary code obfuscation are important tools to keep the code and the keys safe and trusted. This transforms code to prevent prying eyes from easily understanding and extracting information, making it even more difficult to identify and defeat the application’s other defences. Limiting information leakage in clear text strings, removing unused program code from application binaries, as well as changing easy-to-understand program symbol names also makes the code more difficult to crack. Injecting multi-layered “Guards” into the binary of the app will enable Runtime Application Self-Protection (RASP), creating a self-aware app that is able to identify threats and take immediate action to protect itself in real time. Meanwhile, these Guards can integrate into threat modeling and reporting technologies so that attacks can be tracked and reacted to in real time. [easy-tweet tweet="Applications using white-box cryptography have repeatedly safeguarded cryptographic keys" hashtags="tech, gaming"] Finally, one of the strongest defences for keys on untrusted devices is white-box cryptography. This approach combines a mathematical algorithm with data and code obfuscation techniques to transform the key and related operations, making it impossible for hackers to locate and extract them in the code. Applications using white-box cryptography have repeatedly safeguarded cryptographic keys from direct intrusion testing from leading red-teams. The immense popularity of Pokémon Go has highlighted the issue of hackers accessing code and spoofing authorisation to facilitate cheating, but these are actually incredibly common problems. We have found the vast majority of apps, including healthcare and finance apps full of confidential data, lack vital protection to keep code safe. All of the developers who are sitting on the idea for the next breakthrough application should learn from the missteps of Pokémon Go and protect their assets from the beginning. ### How Carriers Can Regain their Competitive Advantage Twilio is one of today’s “it” companies. In addition to its approximately 30,000 customer accounts and a successful entry into public markets earlier this year, the company boasts the third largest SIP Trunking install base in North America, less than two years after launching the offer. Twilio is definitely one of the winners in the cloud communications space because it leverages technologies that deliver Over-the-Top (OTT) applications without touching the Public Switched Telephone Network (PSTN) and charges to move these sessions from the web to the PSTN. It has also built its business model by making it easy to embed “sessions” and reselling services like SMS or MMS to those session purveyors. [easy-tweet tweet="Despite counting multiple CSPs among customers, Twilio is encroaching more on their revenue streams." hashtags="tech, cloud"] So despite counting multiple Communications Service Providers (CSPs) among its customers, Twilio is encroaching more and more on their revenue streams. Can these providers fight back? Yes, they can! 
To beat Twilio at its own game, here’s what CSPs need to consider:
Go to market with a web-based messaging platform supporting voice, text, video and enterprise customers, and work with systems integrators who can also leverage these services to create their own embedded solutions using developer-friendly APIs and mobile SDKs.
Leverage turnkey solutions that enterprises and systems integrators can simply “reskin” or easily embed into existing workflows, ERP, web, social and mobile apps – for example concierge, live expert support solutions, click-to-chat, click-to-call, and click-to-see (video and visual applications). Instead of having to build them, CIOs can leverage reference web and mobile apps. If customization is needed, they can do it themselves or via systems integrators.
Cut out the middleman and sell the solution along with services, including into Twilio’s enviable base of large enterprise customers and growing number of small and medium businesses. Carriers own the network, the last mile and the interconnect connectivity, and can easily introduce more competitive pricing while keeping healthy margins. As an example, 2FA on WhatsApp accounts for approximately 10 percent of Twilio’s revenue. This is a huge opportunity for a carrier with a large SMS network.
Create revenue for developing and implementing solutions, with a long tail of recurring revenue for supporting high-quality and reliable service. In fact, quality of service (QoS) is yet another area where carriers have a built-in advantage. Owning the network and the last mile enables them to offer higher, end-to-end QoS to their customers, which in and of itself opens up the possibility of deploying higher-end, enterprise-grade services.
Reinvent mass communications deployments, for example in the growing contact center world, but with new, more flexible software-based architectures leveraging new standards like WebRTC.
Extend service reach by opening up their APIs and potentially federating with other CSPs (via a common API/SDK platform). This opens up the opportunity to sell Real Time Communications (RTC) services beyond their territory and drive more network consumption. One example is selling SIP Trunking within the CSP’s home geography and extending that reach across other geographies with support from federated carriers.
Today’s world is about rapid service creation. Yet for many carriers, internal OSS/BSS and IT complexity make agility difficult. By leveraging cloud-based service enablement systems, carriers can launch new digital business services in weeks versus months and years. It’s even easy to try it before buying it, so carriers can fail fast, fail forward and, if successful, bring the solution in-house within their private cloud via an NFV-compliant CPaaS solution. [easy-tweet tweet="Carriers can launch new digital business services in weeks versus months and years." hashtags="cloud, tech"] When CSPs make the right moves, they are in a position to rapidly build even more profitable businesses, beating Twilio at its own game. In fact, by integrating turnkey solutions for better customer engagement, improved customer service and Two Factor Authentication (2FA) for enhanced security, they can rapidly accelerate their offerings and immediately drive revenue. ### Millennials are transforming the finance department and it starts with the cloud New generations always bring cultural change.
But the millennial generation, born between the early 1980s and 2000, is driving cultural and process changes in new ways as companies across the globe seek to successfully assimilate them into the workforce. Indeed, it is the millennial generation—the first generation to grow up in a nearly all-digital world—that is steering a digital culture shift, which, as they mature in their careers and dominate boardrooms, will continue to drive the digital transformation currently underway. The finance department is no different. According to Jim Kaitz, president and CEO of the Association for Finance Professionals, “By 2020, millennials are going to be the dominant part of the workforce.” This new generation of finance professional is embracing change, hungry for new innovations and eager to utilise technology to make their life and jobs easier and drive profitability within the enterprise. [easy-tweet tweet="New generations always bring cultural change." hashtags="tech, cloud"] Key characteristics of the millennial generation that are impacting both technology and process in companies—and specifically finance departments—across the globe: Millennials adopt quicker – a combination of existing technical skills and “no technology fear” means that uptake happens much more quickly Change makes for progress – a generation that has been used to many new iterations of devices and software results in an appetite for change Tech advocates outside the IT department – while traditionally technology was implemented through the IT department, millennials are more tech savvy and have a “self-service” mentality Putting the user experience first – it’s all about a smooth and seamless experience. If technology helps get the job done better, it will be adopted, simple as that Multi-device is a given – the millennial generation has seen and embraced a number of new devices with newer devices coming along more and more often. And they expect the same services and software to always be available no matter what device they are using Trust in the cloud – with a faster adoption cycle and solutions which can be deployed in hours rather than weeks or months Cloud services, with their inherent flexibility and ability to be deployed on demand, are pivotal to the ongoing changes and their increased adoption underscores how the finance function is evolving. In a recent CFO Indicator report, based on a global survey of 377 CFOs, CFOs reported that cloud technology is on the rise, with 33 percent of them citing their current IT infrastructure is already SaaS-based, and 60 percent expecting their infrastructure to be SaaS-based over the next four years. [easy-tweet tweet="Expectations for real-time access to data means it is no longer acceptable for budget reports" hashtags="tech, cloud"] The ongoing adoption of cloud technologies for finance functions, such as planning, budgeting, reporting, and consolidation, is also triggering process changes in the finance organization. Collaboration between finance and other functions, such as sales, HR, and manufacturing, is enhanced by cloud-solutions that enable a “single source of truth”—where finance and business users alike can leverage a single version of data. And, millennials’ expectations for real-time access to data means it is no longer acceptable for budget reports and analysis to take weeks, if not months. As millennials continue to join and move up through the workforce, organizations will have to adapt to these tech-savvy professionals. 
### Marketplace vs platform banking – why the marketplace approach must prevail to better serve SMEs The Open Banking revolution is fast upon us, and many financial institutions are already shifting towards the type of collaborative ecosystem that will likely define the new banking landscape. Consumers and businesses alike will soon expect to browse a bank’s core services, third-party Fintech services, and even non-financial, value-adding services, all in one place. [easy-tweet tweet="The Open Banking revolution is fast upon us" hashtags="tech, fintech"] For the SME segment who have, for too long, been grouped into one bracket and served using the ‘one-size-fits all’ approach, Open Banking models present a real opportunity for banks to strengthen their digital portfolios and better segment their business customers to provide the personalised service that SMEs have, up to now, been denied. In my view, out of the two distinct banking approaches emerging - marketplace and platform – it is unequivocally clear which of them will deliver the higher value when it comes to digital banking for SMEs. To start, how do the two models differ? The platform model: anybody can join Under a platform approach, the bank opens its API’s up to anyone who is interested, including other Fintechs, potential competitors and third-party developers, so that they can utilise the bank’s data to build their own products to be accessed via a platform. So long as they meet certain basic criteria, any third-party developer can ‘plug in’ their products to the platform, with little –if any – curation of these services by the bank. In this respect, the financial institution effectively becomes an ‘Amazon of banking’; offering a huge array of similar and competing products, giving the customer a greater opportunity to find a solution that best matches their specific needs. The marketplace model: a selective club Traditional banking models have thus far produced very generic products, with banks offering their own services and attempting to serve a wide range of customer types. Designed to generate quick profits, the ‘one-size-fits-all’ approach offered a de-personalised experience. [easy-tweet tweet="Traditional banking models have thus far produced very generic products" hashtags="fintech, cloud"] By contrast, the marketplace model sees banks continue to provide and develop their core services, but where there are gaps within their product portfolios – for example a lack of digital banking products for small businesses – they create relationships with carefully selected partners. There are no openly published APIs, the bank offers the service (and therefore maintains the primary relationship) and there is a high level of curation across what’s offered. Why marketplace will win… In my opinion, the marketplace model will prevail. Amazon has done an incredible job of dominating the retail market, expanding from a marketplace model to a platform model and offering an increasingly wide variety of products from multiple sources. But Banks are not retailers; they have carefully developed brands and deal with very sensitive, personal information. These elements need to be carefully protected to maintain the trust of their customers. A platform model feels like a race to the bottom of who can provide the widest range of services at the lowest rates with less rather than more focus on the customer experience. 
The marketplace model on the other hand, presents a seamless digital experience that acknowledges SMEs need for an increasing level of personalisation and customisation. Banks need to recognise they can’t effectively address every customer need themselves. Instead, they should continue to provide services they’re good at, and plug in services they’ve carefully procured to fill the gaps. Segment for success SME customers have long been neglected by banks, continually being passed between the retail and corporate divisions with an ill-conceived hybrid of personal and corporate products. The advent of Open Banking is a great opportunity for banks to leverage the growing innovation in the market to supplement their basic SME offering, with a range of targeted products and services that cater for all the different types of thriving SMEs. [easy-tweet tweet="SME customers have long been neglected by banks" hashtags="fintech, cloud"] By carefully segmenting SME customers, banks can better define the challenges facing different customer types, lifecycle stages, emotional states and non-financial needs and consequently better address them through a joined-up customer value proposition. For the customer, effective segmentation means they receive better support specifically targeted to their needs; allowing the bank to provide them the right tools, for the right time in their lifecycle. The marketplace model advocates a positive and enriching digital banking service for SMEs and this strategy should be adopted if banks are to retain their position as a trusted and reliable partner whilst the world of Open Banking continues to evolve. ### New loyalty scheme launched for hackers threatens cloud businesses In early December, hackers in Turkey came up with a new strategy to incentivise others to carry out cyber attacks for them in return for points in a new loyalty scheme. This scheme allows the Turkish hackers to shift most of the risk away from themselves and onto those who are prepared to attack pre-defined targets in exchange for access to tools including click fraud software. Although providing members with tools and a list of targets has been done by Anonymous in the past it’s the first time we have seen a Distributed Denial of Service (DDoS) platform that enables hackers to earn points, rewarding them for their ‘loyalty’ by giving them access to new attack tools. [easy-tweet tweet="The ‘cyber domino’ effect has become a popular weapon" hashtags="tech, cloud, security"] It’s especially worrying for cloud businesses, as hackers may launch strong DDoS attacks on cloud service providers in order to bring down targets and they are at risk of being caught in the crossfire. The ‘cyber domino’ effect has become a popular weapon in the cyber criminal armoury in recent years. It works by taking down a hosting company so that the target company will be taken offline as well, as will many other companies who use the same provider who become innocent victims in the attack. The motives for the latest attacks are not completely clear, but there are some good indicators that give insight into the potential motivation of the authors of the platform and the participants. With this new platform, known as "Sath-ı Müdafaa", which translates into “Surface Defence”, there is no prior connection between participants and the provider of the tool. The authors provide the platform and a locked version of the Balyoz DDoS tool, with a limited list of targets. 
The targets (credit to Forcepoint) included the Kurdistan Workers’ Party (PKK), the People’s Defense Force (HPG), websites of Kurdish hacking crews and Kurdish radio & TV stations, as well as the German Christian Democratic Union (CDU, Angela Merkel’s party), the Armenian Genocide website and several Israeli sites – mostly sites with a political position with respect to Turkey. From the participants’ perspective, the motive can be either political ideology (the list of target sites) or financial gain. The financial gain comes in the form of rewarded points that can be used to buy an untethered version of the Balyoz DDoS tool or click-fraud bots such as Ojooo, PTCFarm and Neobux PTC. Click-fraud bots can automatically click on ads for pay-to-click (PTC) services and are clearly there for financial gain. One other motive for the author might be to gain insight into the participants through a backdoor, or to recruit their systems for other attacks – the latter being less probable since, having used those systems for illegal activities, the participants might well want to trash them after use. So I believe that the backdoor is there to gain insight into competing criminal groups or to gather intelligence on the participants. Whilst this new loyalty scheme approach is currently only focused on targets that have some political connection with Turkey, the model is one that could well be repeated by hackers looking to target businesses against which they hold a grievance of some sort. Additional platforms may have already been developed but are yet to be discovered. It is very likely that this will not be a ‘one off’. [easy-tweet tweet="There are steps that cloud businesses can take in order to defend against DDoS attacks" hashtags="security, tech, cloud"] I don’t see how we could stop such platforms from forming - even if one could be taken offline, it is just a matter of time before the next forms. But the owner of such a platform does need to invest in certain resources to make it work and gain popularity, above all a good set of tools to attract participants and make them want to earn points. This is certainly the work of seasoned hackers. With that in mind, there are steps that cloud businesses can take in order to defend against DDoS attacks that may be the result of this new tactic. They must review their cloud service provider and ask:
Are you using hybrid mitigation capabilities? A successful defence depends upon multi-vector attack detection that is ‘always-on’, along with the ability to automate the process of redirecting traffic to cloud-based mitigation resources.
Do you have effective application (Layer 7) attack detection and mitigation services? New attacks are reportedly sending massive HTTP floods, making Layer 3/4 detection methods useless.
Do you have a separate network for DDoS mitigation? The ideal architecture features a separate, scalable infrastructure specifically for volumetric DDoS attack mitigation, where attacks can be rerouted when they reach predetermined thresholds.
Cyber-criminals will always find new and unusual ways of launching attacks that are often difficult to defend against. But those who carefully review their cyber defence strategy – and that of the providers they rely upon – will be well prepared to take on these latest threats. ### Making the most of the live video revolution We are now at the live video tipping point. The trend, which has been driven by mobile on both the supply and demand sides, is now on the verge of becoming the norm.
Twitter has Periscope. Tumblr has Tumblr Live. But, the introduction of Facebook Live has really changed the game. [easy-tweet tweet="We are now at the live video tipping point." hashtags="tech, cloud, video"] There have already been so many interesting examples of its usage. We’ve seen use cases of it which vary from people capturing road rage on camera, to the UK's biggest "Virtual Santa's Lap" event, where thousands of children throughout the country talked to Santa via a live video stream. Access anytime, anywhere to events which we might have previously not been able to witness thanks to the ease at which live footage can be shared and captured is what’s driven this change. We are hungry for information which requires minimal effort to consume. Facebook Live has simply made it easier to broadcast to millions of people regardless of time and location and other social media platforms – like Instagram – are following suit. When will we reach the tipping point? Companies, consumers and celebrities alike are using live video a lot more – but for it to become the norm for everybody, three things need to happen: 1)      Connectivity improvements 4G has completely revolutionised internet usage on the move. The increased availability of fibre will change this dynamic even more and give people the ability to watch live videos wherever they are without interruption. 2)      Better battery life Current smartphones do not last more than a day without charging and that’s just with regular use. Watching videos causes the battery to drain even quicker, preventing people from watching them on the move. As soon as this technology improves and people no longer have the fear of their phone switching off before they make it home, we’ll see a big spike in the amount of live video watched. 3)      Brands leading the charge The uptake of live video communication on social media platforms has transformed the way we communicate. We will also soon start to see more brands and news outlets improve their video capabilities on mobile devices which will give people a lot more choice and drive the live video revolution. Making the most of live video opportunities The time is now for businesses to act to get one step ahead of their competition. Let’s look at advertising for example. Why would people pay for expensive advertising slots when they can reach a bigger audience through Facebook Live? If you’re a big star, it’s a no-brainer, you already have a high number of followers that will watch and share your videos and help them to go viral. [easy-tweet tweet="Why pay for expensive advertising slots when you can reach a bigger audience through Facebook Live?" hashtags="live, social, tech, cloud"] For smaller brands, they can become broadcasters at a fraction of the cost. Sky high television advertising may have prevented them from using video to reach potential customers before but this should no longer be the case. By working with an industry influencer, a brand can take advantage of their follower base and reach their target audience instantly. We are now at the brink of mass live video adoption. It is now more of a question of when rather than if live video will become the norm for celebrities, governments, brands and consumers alike. As we move into 2017 and beyond, it’s clear that live video will become even more impactful. Businesses should start taking advantage of its capabilities now to lead the way in the live video revolution. 
### Sleep Walking to Oblivion: the potential local societal impact of global cloud domination Much is made of the success of silicon roundabout and the thriving ecosystem of the start-ups in neighbouring Shoreditch. Few of the youngsters working there though will recall ICL, the company that was once the UK’s tech giant – just as Bull was in France. National champions like ICL and Bull found it hard to serve their respective political masters as well as keep pace with the rapidly changing technology landscape and both fell on hard times, with ICL eventually being subsumed by Fujitsu Services. Ever since then the UK has produced the odd tech unicorn, like Psion or Autonomy, but otherwise the UK tech industry has largely been comprised of large foreign (mainly US) players and small local start-ups, many of which are bought up long before they become unicorns. [easy-tweet tweet="A small handful of US tech giants could dominate the tech sector in a completely different way" hashtags="tech, cloud"] This model has worked well for decades, with players like IBM not only employing over 20,000 staff in the UK, but also basing significant R&D establishments here, such as the software labs at IBM Hursley. Britain did not need a large champion of its own as long as it benefitted from the jobs and skills based here and from the taxes associated with these operations. Then there were the healthy partner ecosystems supporting each major vendor and all the jobs and large capital investments in corporate data centres managing and maintaining the systems that they supplied. The cloud is changing all of this. There is now the very real risk that a small handful of US tech giants could dominate the tech sector in a completely different way. Not only will their size and reach squeeze out the partner ecosystems to a far greater extent than ever seen before, but they will also be replacing the corporate data centres as well and all the jobs that went with them. Traditional vendors like IBM are struggling to compete and in order to do so are having to move marketing and R&D jobs to cheaper locations. Indeed, most of the traditional tech vendors have been reducing headcounts in the UK and EU dramatically as they reduce or offshore large proportions of their headcount. The new kids on the block have a different model. They have based their cloud operations out of gigantic data centres where automation is maximised and headcount minimized. These vast operations are hoovering up workloads from thousands of companies – and many of these companies are now looking to close down their own data centres entirely as a result, leading to the loss of more skilled jobs. Leading the pack are AWS and Azure, each of which now boasts its own presence in a UK data centre, but look under the covers and things are not quite as they might first appear: [easy-tweet tweet="AWS and Azure lead the technology pack" hashtags="AWS, Azure, tech, cloud"] Investment: while in its recent announcement AWS claimed to be “investing in the UK’s cloud future”, the global players are not actually investing anything much in the UK at all. The new facilities do not represent net new investment in the UK, as these players are simply leasing facilities that already exist and while they are installing some equipment of their own, it is all imported – almost none of it is made in the UK. Skills: none of the innovation and little of the network management for these facilities will be based in the UK. 
Unlike IBM Hursley, which (once) provided many high-skilled, UK-based R&D jobs, AWS and Azure have their innovation centres on the US West Coast, along with much of their network management. Their local facilities have minimal staffing levels and most of their UK operation is simply sales and marketing. Jobs: the new players will not provide anything like the level of employment that the traditional tech vendors like IBM did, and this does not even touch on the vast number of jobs that will also be lost in corporate data centres as well as in the channel ecosystem. Taxes: all of these lost jobs will represent a massive loss not only in the skills base in the UK, but also in the tax base - both in terms of the income taxes that the staff pay, and in terms of the corporation taxes that these global giants have so far been effective in evading. Compared to the enormity of these changes, the handful of jobs centred around silicon roundabout and Shoreditch pales into relative insignificance. And while I applaud the efforts, innovation and enthusiasm of these small start-ups, they will never replace the skills base and tax base that we face losing, and they will eventually be impacted themselves by these wider losses. In the US, the new administration is focused on ‘Making America Great Again’ and on the continent the French and others are dubious of the US players and their very different attitude to privacy. This privacy divide has been exacerbated by recent amendments to Rule 41 and other intrusive US legislation, as well as by apprehension about some of the bolder assertions made by the incoming US administration. As the UK enters Brexit negotiations and finds its feet in its new global trading relationships, it needs to consider its own needs and to what level it applies industrial policy in areas like technology. The UK government has already applied a level of societal value orientation in its own procurement, in many ways: For UK public sector IT procurement, the government created G-Cloud as a framework to enable participants of all sizes in the value chain (from large SIs to small businesses) to compete for government business on equal terms. Here at UKCloud we have grown with G-Cloud, providing a public cloud for the UK public sector and working alongside an array of partners who also serve this market. The government has also provided guidance in relation to the Social Value Act (in terms of information and resources). The Public Services (Social Value) Act came into force on 31 January 2013. It requires people who commission public services to think about how they can also secure wider social, economic and environmental benefits, and “Procurement for Growth” is firmly embedded in government procurement policy. The Public Contracts Regulations 2015 introduced new laws to ensure that public sector commissioners cannot put up artificial barriers to prevent small businesses bidding for public sector business. The government is committed to channelling 33 percent of its £50bn+ annual procurement spend to small and medium enterprises. (N.B. throughout this article we have used the term societal value, as opposed to social value, in order to avoid confusion with ‘social’ in the sense of social media.) [easy-tweet tweet="This privacy divide has been exacerbated by recent amendments to Rule 41" hashtags="tech, cloud, security"] What we need is vision from the UK government and policies that support that vision.
Firstly, it needs to continue to lead by example, as it has with G-Cloud and other initiatives that enable local players (including SMEs) to compete on equal terms in the public sector and also to have societal value considerations included in the process. Secondly, it needs to use its influence to help extend this type of model well beyond the public sector to the private sector as well in order to ensure that a race to the cloud does not result in the domination of global cloud giants that could have massively negative societal impact in the UK. Surely a government department would invite a hostile public response if it ignored its obligations in relation to the societal impact of moving its own jobs offshore, avoiding its own payroll taxes, or giving business to companies who practice tax avoidance, so why should it not consider the same issues in a wider context when moving workloads to the cloud? And if private sector organisations claim to have the same ethos around corporate and social responsibility, why would the same considerations not apply here also. Finally, if societal impact is not enough, there is also the chasm that exists in the different attitudes and regulations on privacy on either side of the Atlantic. Nothing, including Privacy Shield, EU model clauses or encryption, prevents a company being responsible for the data it holds or the regulations that apply to the data. And whatever they say, the US players are all subject to a set of intrusive US regulations that do not apply to their UK rivals. This includes the recent amendment to Rule 41 which would allow the FBI access to any data held by the US cloud firms. Just look at the compute and storage capacity available in the new facilities opened by AWS and Azure. It is simply not enough to serve all the needs of their UK clients. Therefore, a significant number of workloads will need to be processed off-shore and a significant proportion of data held there also – hardly what you’d call data residency. And even if clients are offered guaranteed data residency with all compute and storage restricted to the new UK facilities, the data is still subject to the intrusive US regulations as long as it is being handled or stored by these US firms – hardly what you’d call data sovereignty either. Indeed, typically under AWS and Azure contracts, data (including meta-data and any customer data needed to perform the services) is allowed to go anywhere in the world. And getting the data back isn’t straight forward either as AWS and Azure are on proprietary technology stacks which means that you are locked in. Think about it. If your customers want to know what your policy is on societal value, then what is your response? And if your customers want to know what your policy is on data privacy, residency and sovereignty, then again what is your response? With GDPR on the horizon, they will be asking, and if they don’t like the answers that you give them, then they will be empowered to withdraw their consent for you to hold their data. You have been warned. ### Proactivity Over Reactivity: Everyone is Going Mad for MAAD Traditional IT teams are facing more changes than ever before because services, such as cloud, functions, and containers have become easily accessible, usable, and scalable for every business. Although this change will eventually make the IT professional’s life much easier, the transition to this end-goal comes with many challenges. 
For example, the IT team now needs to work hard to maintain rigor and discipline over change processes, and implement cloud services into existing IT infrastructure and protocol. On top of all that, they need to keep systems and applications running as usual while simultaneously translating that transformation into business value. [easy-tweet tweet="Traditional IT teams are facing more changes than ever" hashtags="IT, tech, cloud, business"] Keeping on top of this fast-paced change is a big ask for IT teams. To gain a deep understanding of performance and health, a good starting point is to embrace a DevOps culture. Here, the silos between the development and operations teams are dissolved, and instead, shared accountability and collaboration are encouraged. IT teams also need to be cognizant of the need to invest in new technology and, at the same time, keep continuous service delivery working like clockwork. It’s therefore not a surprise that all of this must be driving them a little mad. However, instead of getting mad, they should embrace monitoring as a discipline, or MAAD. Monitoring can be something of an afterthought for the IT team, and is often only considered when there’s a problem that needs solving. However, simple up/down checks aren’t enough. Monitoring needs rigor and discipline to provide early-action warnings, optimize application performance, reduce cost, and improve security. Therefore, MAAD needs to become a core IT function. Whether you’re an IT professional, a DevOps engineer, or an application developer, you can never be MAAD enough in this world of instant applications. When it comes to monitoring, it’s best to think of it in terms of these eight key skills:
Discovery – Let me see what’s going on
Alerting – Tell me when something breaks or isn’t going well
Remediation – Fix the problem
Troubleshooting – Find the root cause
Security – Govern and control the data, app, and stack planes
Optimisation – Run more efficiently and effectively
Automation – Scale it
Reporting – Show and tell the management teams/business units
By applying these eight key skills of monitoring with discipline, systems administrators can better optimise resources, save their organisation money, and address performance issues, all before the end-user even notices any Quality-of-Service (QoS) issues. [easy-tweet tweet="Time to embrace a DevOps culture" hashtags="DevOps, tech, cloud"] Without the benefit of a comprehensive monitoring tool that covers the above, system administrators are forced to go back and forth between multiple different software tools – often for both hardware and cloud-based applications in hybrid environments – to troubleshoot issues. The result is often finger-pointing and hours of downtime spent looking for a problem rather than fixing or even preventing it. In today’s hybrid landscape, organisations should invest in a tool that consolidates and correlates data to deliver more visibility across the data centre. Therefore, my advice to IT professionals is to take ownership of the data centres. Whether on-premises or in the cloud, take control of the scale by setting up monitoring as a discipline, giving you the depth, breadth, and visibility you need to manage IT.
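As a small illustration of what monitoring as a discipline can look like in practice, the sketch below covers just the first two of the eight skills – discovery and alerting – by sampling basic host metrics and flagging threshold breaches. It assumes the third-party psutil library, and the thresholds and alert channel are placeholders; a real deployment would feed a proper monitoring platform rather than print to a console.

```python
# Tiny sketch of the "discovery" and "alerting" skills: sample host metrics
# and raise an alert when a threshold is breached. Assumes the third-party
# psutil package; thresholds and the alert channel are placeholders.
import time
import psutil

THRESHOLDS = {"cpu_percent": 85.0, "memory_percent": 90.0, "disk_percent": 80.0}

def sample_metrics() -> dict:
    """Discovery: let me see what's going on."""
    return {
        "cpu_percent": psutil.cpu_percent(interval=1),
        "memory_percent": psutil.virtual_memory().percent,
        "disk_percent": psutil.disk_usage("/").percent,
    }

def check(metrics: dict) -> None:
    """Alerting: tell me when something isn't going well."""
    for name, value in metrics.items():
        if value >= THRESHOLDS[name]:
            # Placeholder alert channel; swap in email, chat, or a ticketing API.
            print(f"ALERT: {name} at {value:.1f}% (threshold {THRESHOLDS[name]:.0f}%)")

if __name__ == "__main__":
    while True:
        check(sample_metrics())
        time.sleep(60)   # poll roughly once a minute
```

The remaining skills – remediation, troubleshooting, security, optimisation, automation and reporting – build on the same telemetry, which is why a single consolidated tool beats a patchwork of scripts like this one.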
Let’s finish with some wise words and top tips from Adrian Cockcroft, VP of the AWS cloud infrastructure team, and a monitoring engineer who is always ahead of the tech curve:
Align IT with the business
Develop products faster
Try not to get breached
All three of these can be solved by putting monitoring at the core of your enterprise IT infrastructure. It’s time to get MAAD and see what all the fuss is about. ### A Secure Path To The Cloud - Is There One? 40 percent of North American business-critical applications are in the public cloud, whether that is file storage, a communications application, or full-blown IaaS. These strategies are attempting to address one common theme: increasing the productivity of your employees and reducing the costs associated with running these services on your own. Especially for small to medium businesses, where hiring IT professionals is usually not part of the business plan, these services are crucial to operating at the level of the competition. Cloud usage must be a part of nearly every single business, large and small, but do the risks outweigh the rewards? [easy-tweet tweet="40 percent of North American business-critical applications are in the public cloud" hashtags="tech, cloud, business"] You probably ask yourself: is the cloud more secure? Is my information and IP better suited to rest and operate in the cloud? Some say cloud service providers are better suited to provide security controls, as it is their business and it is inherent to their high standard of operations. More than half of cloud applications, including all the mainstream providers, follow industry security standards, so your data likely rests in a secure environment. Others argue that, although cloud services may be inherently good at keeping your data safe, that doesn’t mean the data you store there can’t be malicious to your users. Let’s dive a bit deeper into that last argument. A cloud storage provider has done a valiant job of protecting your data and providing access control to properly authenticated users. In your eyes, and the provider’s, you’re operating in a compliant cloud storage environment. But what about my trusted users? How do I protect against intentional or unintentional malicious employees? Take Drive Sync, for example: a feature where a folder syncs data from the endpoint to the cloud and vice versa. Drive Sync, your best friend and your worst enemy, exponentially increases the attack surface associated with cloud storage providers. Only one endpoint needs to be compromised and it could potentially spread to your entire organisation. The typical stance is that it is not a question of if, but when, you will be part of a data breach. Does utilising cloud providers increase my chance of a breach? The answer is likely yes if you don’t approach them with the same process as you would any other business-critical application. A recent report released by Netskope showed that 34 percent of organisations have malware in their cloud applications and don’t know it. Of those that do scan for malware in their cloud apps, 57 percent found it. So we know that attackers already have methods in place to leverage these services; the decision you have to make is whether the risks outweigh the rewards. There are plenty of mechanisms out there to safely adopt cloud services, and the benefits are too useful to ignore. The only way to properly address cloud adoption is to use the same processes you would if you were building these services on your own. Know the apps you have, control their usage, protect your most sensitive data, and scan and remediate threats.
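As a simple illustration of that last step – scan and remediate – the sketch below walks a locally synced cloud-drive folder and compares file hashes against a blocklist of known-bad signatures. The sync path and the digests are invented, and real malware scanning relies on signature feeds, heuristics and sandboxing rather than a static hash set; this is only meant to show where such a check sits in the workflow.

```python
# Simple sketch of scanning a synced cloud-drive folder for known-bad files by
# hash. The sync path and the blocklist are invented examples; real scanners
# use signature databases, heuristics and sandboxing, not a hard-coded set.
import hashlib
from pathlib import Path

SYNC_FOLDER = Path.home() / "CloudDrive"     # hypothetical local sync location
KNOWN_BAD_SHA256 = {
    # Placeholder digest standing in for a threat-intelligence feed.
    "0" * 64,
}

def sha256_of(path: Path) -> str:
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1 << 20), b""):
            digest.update(chunk)
    return digest.hexdigest()

def scan(folder: Path) -> None:
    for path in folder.rglob("*"):
        if path.is_file() and sha256_of(path) in KNOWN_BAD_SHA256:
            # Remediation: quarantine the file so the sync client stops spreading it.
            path.rename(path.with_name(path.name + ".quarantined"))
            print(f"quarantined {path}")

if __name__ == "__main__":
    scan(SYNC_FOLDER)
```

The same pattern applies server-side, scanning content as it lands in the cloud app before a compromised endpoint can sync it everywhere else.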
### Disruptive Pitch UK - Series 1 - Episode 3 Episode 3 of the reality TV technology pitch show, #DisruptivePitch. The UK's most aspiring technology entrepreneurs, startups and emerging companies pitch themselves to our panel of judges for a place in the final. ### Manufacturing and High-Performance Computing with Fortissimo’s Mark Parsons In this episode of #CloudTalks, Compare the Cloud speaks to Fortissimo's Mark Parsons about making waves in how manufacturing companies use high-performance tech to improve the quality of work. #CloudTalks For more information: https://www.fortissimo-project.eu/ ### Finance Function - #CloudTalks with Hayne Solutions’ Mark Cracknell In this episode of #CloudTalks, Compare the Cloud speaks with Mark Cracknell from Hayne Solutions about finance solutions and making finance teams more than just bookkeepers, pushing them to become thought leaders. Struggling with core financial processes – close and consolidate, budget, plan and forecast, report and analyse – is common in the market, and Hayne look to make this a thing of the past, freeing up time so teams can raise their heads 'above the parapet'. #CloudTalks Visit Hayne Solutions to find out more: hayne.co/ ### Business Intelligence - #CloudTalks with MHR Analytics’ Nick Felton In this episode of #CloudTalks, Compare the Cloud speaks with Nick Felton from MHR Analytics about the importance of data and business intelligence. MHR engage customers at a business level to drive outcomes, with the technology sitting behind it all as an enabler, and are very focused on helping customers get the most value out of their data. #CloudTalks Visit MHR Analytics to find out more: mhr.co.uk/ ### Think before you click Hybrid Cloud Computing! Many of you may be wondering why on earth I am even suggesting this. Hybrid Cloud offers flexibility, fault tolerance, subscription-based cost models and much more. Yes, it does, but it simply does not fit everyone’s business model. Let me explain. Many years ago, we had onsite IT computer rooms stacked with static servers and mountains of other hardware. Then came the dawn of virtualisation, and companies such as VMware were born that consolidated multiple servers onto bigger but fewer ones, running virtual servers in a multitenant way. Software virtualisation paved the way for a virtual server placed anywhere (with a little help from network connectivity), and external services housed out of large data centres became normal. "So tell me something I don’t know!" I hear you all groan. However, we have not yet mentioned the hardware that isn’t easily moved, shouldn’t be moved and is very often misunderstood: mainframe technology! [easy-tweet tweet="Software virtualisation paved the way for a virtual server placed anywhere" hashtags="tech, cloud, IBM"] I often write about mainframe technology, and I am a big advocate of the infrastructure for many reasons, but let’s put it into context and position this correctly before you leave this page and surf somewhere else. [caption id="attachment_40091" align="alignnone" width="428"] 1964 picture of an IBM Mainframe 704[/caption] [caption id="attachment_40092" align="alignnone" width="272"] 2016 picture of an IBM Mainframe[/caption] I know what you are thinking, but this is not an advert for IBM or any other mainframe, just a plain and straightforward example of technology consolidation (which also happens to be a major benefit of Hybrid Cloud Computing).
Mainframe technology is all about speed of transaction and scale in my opinion, and this is why most of the biggest banks, retailers, insurers and other industries utilise mainframes. Yes, you can use inexpensive hosted resources to achieve some of this, but why would you if your business demands unrivalled transaction throughput, security that’s built in at the hardware level and redundancy designed into virtually every element of the hardware? Keeping bespoke technology up to date with the fast-paced evolution of IT and consumer demands is not easy and not without its challenges, but mainframe infrastructure has never been so relevant as it is today. Mainframes account for 68 percent of IT production workloads but only 6 percent of IT spend for the banking and finance sectors – amazing, hey! (Source: Solitaire Interglobal)
Big data – the ability to store, analyse and report on masses of data for many uses
IoT – the management of millions of devices at the touch of a button, instantly and consistently
Disaster recovery – redundancy for internal IT services that external clients rely on
Scalability – peak seasonal periods or expandability that’s scalable and without migration costs
Security – never more important than today is the need for absolute security of IT systems
These are but a few of the practices within the technical world that have complete focus today. Mainframe technologies can cater for and deliver all of the points mentioned above and more, with hybrid elements. Yes, you did hear that right: a mainframe can now have hybrid connectivity to other hosted environments that can act as replication partners, burstable capacity overspill and more. But coming back to my original analogy with VMware, what about making one of the world’s most cloud-ready infrastructures do even more for you? [easy-tweet tweet="A mainframe can now have hybrid connectivity to other hosted environments" hashtags="tech, cloud, IBM, mainframe"] Just as virtualisation gave us more to run on slightly bigger infrastructure, why not do the same for your mainframe? There are a multitude of vendors that provide optimisation services for current architectures like the x86; little is known about doing the same with mainframe tech. Wouldn’t it make sense to look at this first, before jumping into the multi-vendor management area of Hybrid IT services? By optimising your workloads you have the ability to:
Process over a billion mission-critical transactions per second every day
Accelerate application processing by up to 98 percent
Seamlessly integrate data on mainframe and distributed systems
Enable big data and analytics in multi-platform environments
Enable increased control and flexibility in sub-capacity pricing and R4H soft capping environments
These are mind-blowing capabilities for mainframe environments, and they are available today. I caught up with Allan Zander, CEO of DATAKINETICS (a global leader in data performance and optimisation solutions): “For the last four decades, technology and the IT systems that rely on it have always been seen as technology hidden away. You really don’t need to be concerned about some of the poor stigma that the mainframe may suffer from – much of it is just misinformed FUD. 
The mainframe is actually an extremely cost-effective computing platform, and the skills to get the solutions that you need are readily available in today's graduates. Optimising what you have and then considering a hybrid approach can literally save you billions of dollars! You can run applications on z/OS that require the very best in reliability and scalability that the mainframe has to offer, but for those applications that perhaps do not require top-level resilience, you can run Linux on your mainframe. And now, with the most powerful, scalable, secure and fault-tolerant technology available becoming a major player in the open source community, you have to look at this tech for the future of your business!” The technology is more open than ever before, and my point is this: shouldn’t you get the most out of what you already have before investing in something else? If you have a mainframe, optimise it and get the most out of it first before looking elsewhere – will you really get the performance, reliability and security you demand anywhere else? Note: All Fortune 500 IT organizations run both mainframe applications and distributed LUW applications on multiple operating systems on multiple hardware platforms, and access multiple database types. ### Forget the Internet of Things for a minute, what about the Security of Things? The Internet of Things (IoT) is well and truly upon us – and will clearly be even more prevalent in the future. Today, IoT is already branching out into commercial networks as well as enterprise applications. Smart devices are becoming more commonplace in our households, with everyday appliances now able to communicate with the internet to help our lives run more smoothly, and interconnected devices are now essential tools in our working lives as well. This is all fantastic news… right? [easy-tweet tweet="The Internet of Things is well and truly upon us" hashtags="IoT, tech, cloud"] While it's easy to get excited about all the new gadgets that the era of the IoT has delivered, it is important to take a step back from all the excitement and talk about security. Millions of people across the globe are connecting with these devices and sharing valuable data. However, the potential misuse of this data still remains fairly well hidden, disguised under IoT's novelty halo effect. Infosecurity experts have long warned that IoT devices are set to be a security nightmare as they are often deployed with little or no consideration for security. The question is: are enough people aware of this, and are the right security measures being taken – particularly by organisations that need to protect business-critical and sensitive data? Recent distributed denial-of-service (DDoS) attacks such as that experienced by the DNS provider Dyn – which made it impossible to access the likes of Twitter, Amazon and Netflix – should be a serious wake-up call. In its early days, the World Wide Web brought with it very little protection from misuse. This, of course, generated consumer distrust, consequently slowing down initial e-commerce efforts. But, if we fast-forward to the present day, it is now the case that e-commerce represents around 15 percent of all retail sales in the UK, with an expected £5 million to be spent online this Black Friday in the UK alone. This is no doubt due to the fact that today data encryption and other security measures are simply assumed. People no longer fear sending their credit card information over the wire. 
As a result, security issues, for the most part, are kept in the background. It almost seems as though we are in a cycle in which consumers and organisations blindly trust companies with their valuable data, and it is only when known and reported intrusions come to light that action is taken and data security is examined. This, in some respects, also echoes the initial response to the cloud, which saw low user adoption for the first few years due to worries around the security of data being stored offsite. Compare that to the beginning of this year, when the UK cloud adoption rate climbed to 84 percent according to the Cloud Industry Forum. [easy-tweet tweet="In its early days, the World Wide Web brought with it very little protection from misuse." hashtags="tech, web, cloud, security"] It has been found that most of the IoT devices that have been hacked to date have had default usernames and passwords, and at no point had the manufacturers prompted users to change these. Increasingly, hackers are able to use malware to scour the web for devices that have only basic security and to detect vulnerabilities. This enables the hackers to upload malicious code so that the devices can be used to attack a targeted website. What is really worrying is that the owners of the IoT devices are usually unaware of the attack. This is because once a device has been hijacked it can be impossible to tell, as it often continues to work exactly as normal. Issues will then begin to occur behind the scenes when the compromised system is subsequently put on the same network as personal computers, corporate servers, and even confidential government data. The main issue is, without knowing which devices exchange data within a specific network or the internet as a whole, there is no way to develop an adequate security strategy. In theory, every single device that is being added to a network needs to be evaluated, but this is just as painstaking as it sounds. Whether it is the IoT or the cloud, companies need to begin using security technologies and procedures that have already been proven to be reliable. This means applying on-premise levels of IT security to cloud workloads. For example, two-factor authentication, role-based access control, encryption, and vulnerability scanning can enable a protective shield for the cloud to scan all incoming and outgoing data for malicious code, regardless of the device being used. The right level of security technologies embedded into the cloud platform allows companies to gain control of all web-based traffic in order to actively manage which communications should be permitted and which should be blocked. [easy-tweet tweet="Security technologies embedded into the cloud allow companies to gain control of all web-based traffic" hashtags="security, tech, cloud"] Recent high-profile cyber attacks and, increasingly, ransomware threats have spurred a long-overdue discussion about the gaps in IoT security. Unless the security side of IoT is sorted out, it could hold back wider adoption of the technology. Early adopters beware; the best advice is to follow the data. Know how the company behind your latest gadgets and interconnected devices handles security, and ensure that any cloud provider is able to provide you with the reports and ongoing visibility that will enable security settings to be managed and maintained. 
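The point about factory-default usernames and passwords is easy to make concrete. The sketch below is a minimal, illustrative audit script, not a tool referenced above: it checks a handful of device web interfaces against a few well-known default credential pairs over HTTP basic authentication. The addresses and credential list are hypothetical placeholders, real devices often use form or digest logins instead, and checks like this should only ever be run against equipment you own or are authorised to test.

```python
# Minimal sketch: flag devices that still accept factory-default credentials.
# Addresses and credential pairs are illustrative placeholders only; run checks
# like this solely against devices you are authorised to test.
import requests

DEFAULT_CREDENTIALS = [
    ("admin", "admin"),
    ("admin", "password"),
    ("root", "root"),
]

DEVICES = [
    "http://192.168.1.20",   # e.g. an IP camera's web interface (placeholder)
    "http://192.168.1.21",   # e.g. a router admin page (placeholder)
]

def accepts_default_login(base_url: str) -> list[tuple[str, str]]:
    """Return any default credential pairs that the device accepts."""
    accepted = []
    for username, password in DEFAULT_CREDENTIALS:
        try:
            response = requests.get(base_url, auth=(username, password), timeout=5)
        except requests.RequestException:
            continue  # device unreachable or not speaking HTTP; skip it
        if response.status_code == 200:
            accepted.append((username, password))
    return accepted

if __name__ == "__main__":
    for device in DEVICES:
        hits = accepts_default_login(device)
        if hits:
            print(f"WARNING: {device} accepts default credentials: {hits}")
        else:
            print(f"OK: {device} rejected the default credential list")
```

Even a crude inventory-and-check pass like this gives an organisation the visibility the article argues is missing: a list of which devices are on the network and which of them would fall to the simplest automated attack.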
### Applications to monitor performance - #CloudTalks with InfoCat’s Stephen Waters In this episode of #CloudTalks, Compare the Cloud speaks with Stephen Waters from Infocat about building applications to model and predict the performance of companies before analysing the results. #CLOUDTALKS Visit Icon UK to find out more: icon-uk.net/ ### Six Benefits of Moving Business Intelligence to the Cloud In 2015, IBM estimated that 2.5 quintillion new bytes of data were generated each day. This data proliferation shows no sign of slowing down as more businesses undergo digital transformation to better respond to the rapidly changing industry demands. Many companies are even appointing a Chief Data Officer to exclusively control, manage, and govern this surge in data volume. As devices and applications are put into the hands of more users, organisations need a high-performance, secure, and cost-effective platform to properly manage their data and content. The software must enable easy and remote access, as well as collaboration both between customers and suppliers, and other seemingly disparate groups. This is where the cloud comes in as an ideal solution. [easy-tweet tweet="In 2015, IBM estimated that 2.5 quintillion new bytes of data were generated each day." hashtags="cloud, tech, data"] Here are six reasons why companies are starting to make the switch from on-premises data storage to the cloud: Instant global presence With the cloud, companies are no longer tethered to a single physical location to store data. Global organisations can choose to store data in datacenters located around the world, so data is accessible when and where it’s needed. By ensuring that users are accessing data stored nearby, companies can reduce latency and make quick decisions when milliseconds matter. The ability to support customers all over the world opens up entirely new streams of revenue. In addition, the cloud provides a reliable backup should a server go down. Organisations can take comfort knowing that an alternate server is still up and running – ensuring that their data is always readily available. High performance and reliability Cloud infrastructure can be configured specifically to meet users’ needs. Most cloud providers offer multi-tenant environments, where different groups of users can access data through different points, but those that offer single-tenant cloud environments can deliver even better performance and improved security to users. [easy-tweet tweet="Cloud infrastructure can be configured specifically to meet users’ needs." hashtags="cloud, tech, data"] Instant deployment Cloud services often come pre-configured, meaning that the set-up time is minimal. Vendors can therefore provision services immediately upon order, so anyone can build powerful applications and securely deliver them via web, mobile, or desktop. New users and projects can be added to the solution instantly, and this entire process runs without the need for complex IT infrastructures and support. Elastic infrastructure Companies need a platform that can easily scale up as they grow. As the company expands, cloud customers can add more users, storage, and features to meet their exact needs. 
Companies don’t need to scale up permanently either. When additional resources or computing power is needed, organisations can scale up for a specific time period when they expect a spike in usage, and then scale back down later. The benefit of this flexibility is that companies only have to pay for what they need, rather than investing in expensive software and hardware upfront. Better security Data security is becoming more of a priority for every company, particularly among C-suite executives. For this reason, cloud providers do everything they can to stay current with the latest industry standards and compliance certifications. A data breach can have devastating consequences for a company, so the cloud must offer excellent security to avoid becoming a liability, should a breach occur. Single-tenant cloud users benefit from an additional layer of security as there is only one point of access. There is also no chance of their data being combined with another company's data or accessed by unauthorised and potentially malicious users. [easy-tweet tweet="Data security is becoming more of a priority, particularly among C-suite executives." hashtags="cloud, tech, security, data"] Low costs Cloud eliminates capital expenses for infrastructure, including servers and storage, which can represent a significant investment by companies. Without physical storage, problems associated with out-of-date hardware, faulty equipment, and general maintenance and constant software upgrades are largely eliminated. Also, the innate flexibility of the cloud allows companies to only pay for storage and features that are constantly in use. The cloud offers numerous benefits to both businesses and users, in terms of flexibility, ease of access, and unparalleled cost savings. The agility associated with the cloud means that companies who adopt this solution are prepared not only for growth and international expansion, but for the changing demands of employees to facilitate mobile working and remote access. As data volume continues to grow, having a solution that responds and enables simple access and governance will immediately give a company an edge over competitors who stick with legacy on-premises storage solutions. ### Retrieving Data - #CloudTalks with Budgeting Solutions’ Paul Bavington In this episode of #CloudTalks, Compare the Cloud speaks with Paul Bavington from Budgeting Solutions about retrieving data and helping clients to make better decisions on budgeting, forecasting, and reporting. Solutions come in when you learn you have to run your business on fact not opinion. #CLOUDTALKS Visit Budgeting Solutions to find out more: budgetingsolutions.co.uk/ ### Zero clients and the future of mobile business It’s no secret that mobile device proliferation has been essential in enabling businesses to become more efficient, productive, cost effective and responsive. But while mobile devices have generated many benefits for corporations, they have also ushered in unprecedented security threats, and a range of device management challenges. [easy-tweet tweet="Mobile device proliferation has been essential in enabling businesses to become more efficient" hashtags="tech. mobile, cloud"] The multi-device mobility challenge IT professionals are continuously struggling to manage multiple devices from multiple locations. 
Recent research from Toshiba shows that 84 percent of senior European IT decision makers believe unauthorised IT system use to be endemic within their businesses – highlighting the severity of this problem. Further still, research conducted by Ovum's European Mobility Management Gap report found 61 percent of European business leaders still feel they are making little or no progress towards building a secure mobile workforce. Efforts to tackle these challenges in the past have involved IT managers turning to thin client solutions, which make for an attractive proposition because they’re generally easy to install. They also offer a simple application process and improved security. However, when it comes to remote working, the limited nature of thin client solutions in terms of both portability and security is causing businesses to explore different alternatives. The rise of zero clients This shift has given rise to zero client solutions. While similar to thin clients in their reliance on a central, purpose-built server that hosts the operating system and applications, zero clients differ in the fact that the operating system can be extracted from the individual device. For businesses, this eradicates the need to store information on the individual device’s HDD or SSD, instead bringing all functionality and data into the cloud through a secure virtual desktop infrastructure (VDI). With no data stored on the device, company data is considerably less vulnerable, and any threat involved should a laptop be lost or stolen is eliminated. The other key benefit of zero clients is the cost element, and the savings they provide, with minimal management required once integrated into the IT infrastructure and updates automatically rolling-out via the server. Equally, with no information held on the client, hardware is faster, more efficient and more durable – removing the costly and time-intensive need to regularly upgrade to new devices. [easy-tweet tweet="The other key benefit of zero clients is the cost element" hashtags="zeroclients, cloud, tech"] The future of mobile business Bringing together the enhanced security that zero clients provide with the ability to work unhindered no matter where, Toshiba has pioneered a Mobile Zero Client solution – helping to overcome the desk-based nature of more traditional zero clients in this age of mobile working. Such technology offers the perfect blend of flexibility, mobility and security, working in tandem with the latest business-built mobile devices to offer the power and connectivity tools for workers to perform at full capacity whether in the office, on the train, or at home. And with all data still stored away from the device, businesses will have full peace of mind that the security threat, should a device be  lost or stolen, is minimal. With a growing demand for zero clients, enhancing the mobile functionality of such solutions is the next significant innovation and promises to truly bring businesses into an era of fluid, flexible and secure working in any location. ### Implementing Data Projects - #CloudTalks with Entity Group’s Steve Parry In this episode of #CloudTalks, Compare the Cloud speaks with Steve Parry from Entity Group about the journey to implementing data projects within your business. 
"If it's a journey, give me some roadside attractions along the way" - there are many decision points around business priorities, data technology and internal projects as well as value points and deciding when you can get something from the project that is valuable and means something. Entity aim to find a balance structured data and new data types as well as how you as a business operate. Entity aim to find a balance structured data and new data types as well as how you as a business operate. #cloudtalks Visit Entity Group to find out more: entitygroup.com/ ### Why The Future of ITIL Certification and Other Training Is Online We’ve all gone through work-related training schemes. Prince2 training for project managers, offshore survival training for rig workers, or licensing training for hospitality staff; it doesn’t matter the industry or job, there will always be relevant training to attend. Often offsite and always at the most inconvenient time, traditional training schemes are the bane of any work schedule. [easy-tweet tweet="Traditional training schemes are the bane of any work schedule." hashtags="tech, training, cloud"] This is slowly coming to a halt. Online training has been on a quiet but effective revolution of the training world for several years now, particularly in the areas of IT and business management. David Baker, Business Development Manager for PMP certification providers Datrix training said “These days, there’s no need to do most training in the physical world. It’s a throwback to when technology wasn’t developed to the point where it could be more effective than pen and paper. The stage we’re at now, I can lead a class in the US from a room in Australia. Just look at the rise of online universities like the Open University: entire self-taught, online degrees. If that doesn’t tell you that the future of training is online, I’m not sure what will.” The Benefits of Moving Online David’s point on the technological advancement is a good one. We are at a stage where it is possible to drive a business almost entirely from the mobile devices in our pockets. Where before a course had to be researched, developed, tested, accredited, and distributed in physical forms, now these steps can all be achieved electronically, with the course then hosted online. The benefits of this are sizeable: Cost-effective and environmentally friendly – Companies can incur significant costs through conventional learning systems. Rather than paying for the physical copies of the course material, employees simply use a computer or mobile platform to take part in the course. This saves money on the production of the course material. This reduction in paper consumption - combined with employees not having to travel for training – means that online training is far more environmentally friendly. Ease of access – Where before the training would be limited to a physical location, now it can be carried out anywhere. This removes the logistics of booking rooms for training or finding the appropriate time; now all employees can train in their own time, at their own space, in any location. In theory, employees can have 24-hour access to a training course, giving them free reign to interact with training as they see fit. Reduction in wasted man hours – Previously, employees would have to take large amounts of time out of their working day to attend training. Traveling to a location, getting set up, the course itself, then returning back to work – these all take time out of the working day. 
Now, it’s possible to attend training without even leaving the desk, cutting out a large amount of wasted time. Why Online Training Is Important to Success With the rate at which technology is developing, training for staff is a constant process. Millennials are the first “tech-fluent” generation, but they are still in the minority in the workplace. The majority of the workforce comes from the baby boomers and Gen X-ers. These generations aren’t so tech-fluent, taking longer to familiarise themselves with emergent technologies. One thing they all have in common, however, is the knowledge and ability to use a desktop station. This allows them access to online courses where they can develop their skillset and understanding of a topic. [easy-tweet tweet="Now, it’s possible to attend training without even leaving the desk!" hashtags="tech, cloud, training"] Something as simple as IT certification in technological terms can have a massive impact further down the line. Understanding the terms of a topic allows us to communicate effectively in that area. Effective communication leads to streamlined and productive collaboration. Beyond this, further training in the theory behind a technology and its appropriate applications allows employees to make informed decisions when it comes to utilising new technologies. With the rise of the Internet of Things (the IoT) and cloud-based technologies, the possibility of incorrect application of tech through lack of training and understanding has increased massively. Ensuring staff are up to date in their training avoids wasted investment – both in man-hours and in monetary terms – and promotes efficient internal work practices. ### MariaDB Adds Support for Big Data Analytics with General Availability of ColumnStore 1.0 MariaDB® Corporation, the company behind the fastest-growing open source database, today announced the general availability of MariaDB ColumnStore 1.0, a powerful open source columnar storage engine. With this release, MariaDB has united transactional and analytic processing with a single front end to deliver an enterprise-grade solution that simplifies high-performance, big data analytics. “MariaDB ColumnStore is the future of data warehousing. ColumnStore allows us to store more data and analyze it faster,” said Aziz Vahora, Head of Data Management at Pinger. “Every day, Pinger’s mobile applications process millions of text messages and phone calls. We also process more than 1.5 billion rows of logs per day. Analytic scalability and performance is critical to our business. MariaDB’s ColumnStore manages massive amounts of data and will scale with Pinger as we grow.” Data warehousing and analytics are undergoing massive transformation driven by cloud computing, cost of compute, distributed deployments and big data. But in spite of all the innovation designed to reduce cost and complexity, enterprises continue to be confronted with two imperfect options for big data analytics: build a costly, proprietary data warehouse on premise with vendors like Teradata and Vertica, or get locked into potentially uncontrollable costs with cloud-based solutions like Redshift. “Data warehouses are famously expensive and complex – requiring proprietary hardware which makes it nearly impossible to deploy on the cloud, or on commodity hardware. MariaDB ColumnStore makes data analytics more accessible, on premise, in the cloud or in a hybrid deployment,” said David Thompson, Vice President of Engineering at MariaDB. 
“Costing on average 90% less per TB per year than the leading data warehouses and offering a scalable solution for the cloud, MariaDB ColumnStore presents a new, open source model for big data analytics that is designed to meet today’s enterprise needs.” New features and benefits of ColumnStore 1.0 include: Lower Cost of Ownership and Better Price Performance Compared to the licensing and hardware costs of proprietary data warehouses like Teradata and Vertica, MariaDB ColumnStore, an open source solution, provides exponentially better cost per performance for analytics. Additionally, ColumnStore’s flexible deployment options mean businesses can deploy in the cloud, or on premise with commodity hardware as opposed to solutions like Redshift which require a commitment to cloud-only deployments. Rather than buying and maintaining a third platform for data warehousing, MariaDB users simply turn on the columnar engine for a particular dataset and run analytics from the same front end as their transactional system. ColumnStore’s advanced data compression is more efficient at storing big data, requiring less hardware and making big data analytics more affordable. Easier Enterprise Analytics Unlike Hadoop solutions, ColumnStore offers full ANSI SQL capabilities including queries for complex joins, aggregation and window functions. This makes it easy for data analysts to run their existing enterprise queries without modification. MariaDB ColumnStore simplifies enterprise administration and execution by accessing all the same security capabilities delivered in MariaDB Server, including encryption for data in motion, role-based access and auditability. ColumnStore also offers an out-of-the-box connection with BI tools through ODBC/JDBC, as well as the standard MariaDB connectors. ColumnStore’s automated partitioning, combined with the elimination of the need to create indexes and views, makes it easier to setup, deploy and manage. Faster, More Efficient Queries Compared to row-based storage, columnar storage reduces disk I/O with compression, making read-intensive (i.e., analytic) workloads significantly faster. MariaDB ColumnStore distributes the queries into series of parallel operations, resulting in extremely fast and scalable query results. "The demand for robust analytics technologies has exploded as companies look to derive greater value from increasing data volume," said Jason Stamper, Analyst, Data Platforms & Analytics, 451 Research. "MariaDB's ColumnStore is a new option for high-performance analytics to big data, which aims to enable faster and more efficient queries while eliminating delays and reducing enterprise data warehousing expenses." MariaDB ColumnStore is now available for download. For customers wanting to get started immediately, the MariaDB professional services team is offering a new program called MariaDB ColumnStore JumpStart, which delivers a ready-to-use ColumnStore environment in 3 - 5 days. Contact MariaDB to learn more. ### BMW Group to Start Research with IBM Watson IBM has announced a new collaboration with the BMW Group, through which the companies will work together to explore the role of Watson cognitive computing in personalising the driving experience and creating more intuitive driver support systems for cars of the future.   
As part of the agreement, the BMW Group will collocate a team of researchers at IBM’s global headquarters for Watson Internet of Things (IoT) in Munich, Germany, and the companies will work together to explore how to improve intelligent assistant functions for drivers. IBM recently pledged to invest USD 200 million to make its new Munich centre one of the world’s most advanced facilities for collaborative innovation, as part of a global investment of USD 3 billion to bring Watson cognitive computing to the Internet of Things. BMW, which also has its company headquarters in Bavaria’s capital, is one of the first companies to sign up to be collocated inside IBM’s building within one of the newly launched industry ‘collaboratories’. A team of BMW Group engineers will work alongside IBM’s own team of technologists, developers and consultants. “Watson is transforming how people interact with the physical world – helping to create safer, more efficient and personal experiences at home, at work and on the road,” said Harriet Green, Global Head of IBM’s Watson IoT business. “With this agreement, our companies will work together to lay the foundations so that drivers can benefit from Watson’s conversational and machine learning capabilities. Our insight shows that while the car will remain a fixture in personal transportation, the driving experience will change more over the next decade than at any other time of the automobile's existence.” To further its automotive research and demonstrate the possibilities of Watson IoT technologies to clients, IBM will locate four BMW i8 hybrid sports cars at its Munich Watson IoT HQ. Prototype solutions, which will run on IBM’s Bluemix cloud platform, will help demonstrate how Watson can enable new conversational interfaces between cars and drivers. Watson’s machine learning capabilities offer new opportunities for vehicles to learn about the preferences, needs and driving habits of their drivers over time, customising the driving experience accordingly and improving levels of comfort and safety. The car’s manual will be ingested into Watson so that drivers can ask questions about the vehicle in natural language while still being able to focus on the road. The aim is for the solution to also incorporate data from the Weather Company (an IBM business), as well as real-time, contextual updates about route, traffic and vehicle status, in order to enrich the driving experience and make recommendations to the driver. ### Manufacturing Globally - #CloudTalks with Procurri’s Mat Jordan In this episode of #CloudTalks, Compare the Cloud speaks with Mat Jordan from Procurri about manufacturing globally, finding a niche in the market and supplying the answer. [easy-tweet tweet="Compare the Cloud speaks with Mat Jordan from Procurri about manufacturing globally" hashtags="tech, business, cloud"] #CLOUDTALKS Visit Procurri to find out more: procurri.com/ ### Fujitsu Wins 32 Million Euro Deal to Drive Digital Transformation in Finland with the ‘Apotti’ Project for Healthcare Services Fujitsu has been selected to help create a unified healthcare patient records and ERP system in the Helsinki area as part of a major digital transformation project known as ‘Apotti’. The company is supporting this ambitious initiative by providing extensive Platform as a Service (PaaS) managed infrastructure services to ensure the availability and security of the Apotti system. 
The Apotti project is a co-operation between the local governments of Helsinki and the surrounding regions of Vantaa, Kirkkonummi, and Kauniainen, and the hospital district of Helsinki and Uusimaa (HUS). Its goal is to create a unified healthcare program that manages patient records and clinical data that will allow the region to adopt a more data-driven and evidence-based mode of providing healthcare services. Apotti will also streamline and simplify the region’s existing IT infrastructure, replacing the majority of the systems currently in use with a single Apotti system when it goes into full-fledged operation at the end of 2018. The new system will improve the quality of care in the region by eliminating the disconnect between healthcare and social welfare services. Healthcare professionals in the area’s 29 hospitals, 40 health stations and 50 social welfare offices will all have access to the same real-time patient information, allowing them to more effectively coordinate support for the elderly, children or those with mental health or substance abuse issues. The approximately 1.6 million residents of the region – around 30 percent of Finland’s total population – will benefit, not just from improved access to healthcare services but also from online services such as confirming appointments or requesting extensions to prescriptions. Hannu Välimäki, Managing Director for Oy Apotti Ab, said: “This project is the first of its kind which means that we were able to design it from scratch based on the best possible combination of technology, without having to work around legacy infrastructure. Fujitsu is providing not only an infrastructure model that gives us the best possible price/ flexibility ratio but also the high levels of availability and security that are crucial for an efficient healthcare system. We anticipate significant improvements in care as a result of integrating the region’s multiple healthcare systems and we are confident that we have the right infrastructure foundation both for the launch and to scale as Apotti will cover new municipalities in the Uusimaa area.” Fujitsu’s managed infrastructure services portfolio is designed to provide organizations with a robust but agile IT infrastructure that allows them to optimize their technology budgets by removing the need to deploy and maintain their own infrastructure equipment. Solutions are created by means of co-creation with customers – that is, working closely together to understand their specific current and future requirements to devise and implement the most appropriate solution. The dedicated Apotti infrastructure deployment includes an extensive portfolio of Fujitsu services such as Platform as a Service (PaaS), Infrastructure as a Service (IaaS), monitoring and IT support services, in addition to advanced security services to protect sensitive healthcare data. Conway Kosi, Senior Vice President and Head of Managed Infrastructure Services at Fujitsu EMEIA, commented: “Our dedicated managed infrastructure services are designed to ensure maximum reliability and flexibility for Apotti at a predictable monthly rate. By allowing Fujitsu to deploy and manage its infrastructure, Oy Apotti Ab is freed up to focus on delivering the best possible healthcare service to its users, rather than on managing its technology. 
We continuously innovate and evolve our services, which means that the Apotti system will always be underpinned by the best available services and infrastructure.” Implementation of the managed infrastructure services contract for Apotti has already started and full availability of the unified healthcare patient records system is scheduled for late 2018. ### How to launch a successful technology business A global downturn in start-up funding, along with uncertainty in the aftermath of Brexit, is challenging London’s future as a start-up hub for the continent. Although this is unlikely to be a long-term trend, today’s entrepreneurs face a tough environment characterised by ruthless competition, economic uncertainty and, in some industries, skills shortages. [easy-tweet tweet="Today’s entrepreneurs face a tough environment characterised by ruthless competition" hashtags="tech, cloud, business"] My company, Sonovate, has grown from a start-up to a leader in the fintech industry in just four years. We are now competing in a very dynamic alternative finance sector that includes companies such as Funding Circle and MarketInvoice. We secured Series A funding at the beginning of this year and just closed a £14m Series B funding round in October. But while renewed investor confidence is good news, there are a number of factors which underpin this success. In my experience, there is no right time to launch a tech start-up. To launch a successful business, a founder must display hard work, a clear vision for growth, and a natural enthusiasm for the unpredictable yet hugely rewarding journey that is entrepreneurship. If you’re about to launch your own start-up, make sure you keep these five tips top of mind at all times. Think ahead Starting, and then growing, a business is expensive. Series A funding and angel investment will help you get off the ground but they can only take you so far. In an ideal world, your current investors would stay on-board and add to their initial investment. It doesn’t always work out this way and you need to have a solid Series B funding strategy prepared well in advance that includes other potential investors too. Stay close to your investors and show them what your business has achieved with their funding. Use clear metrics to demonstrate your growth to-date and take them through your plan for the next phase. Positive numbers combined with a strong proposal is a sure-fire way to create interest and secure future finance. [easy-tweet tweet="Stay close to your investors and show them what your business has achieved with their funding" hashtags="tech, cloud, business"] Be steady and flexible The age of technology is characterised by speed. When it comes to starting your tech company, however, don’t be rushed into things you’re not comfortable with. Businesses that expand almost overnight may look impressive, but building a successful tech company takes time and lots of planning. You need to analyse the market in detail and find the gap that you want to own. Operational needs, including your number of staff, have to be considered carefully too. You don’t want to hire for the sake of hiring and then have to let people go because you jumped the gun. The market can be very changeable. Give your business flexibility so that it can shift direction as and when necessary. Be open to a certain degree of change. 
Sonovate launched as an all-in-one solution for entrepreneurs looking to start their own recruitment agencies but we pivoted our focus to invoice finance when we identified a need in the industry for an alternative to the banks’ outdated solutions. Have a co-founder Do not underestimate the value in having a trusted partner on board who shares your vision. Starting a business is very hard and can be incredibly stressful. The only other person in your life who will be able to empathise, carry some of the load and actually help, is your business partner. Every day there will be problems that need solving. Some will be small and irritating, others will potentially make or break your business. They will all need time and attention to resolve and not all will be within your areas of expertise. A co-founder can provide an alternative point-of-view and bring extra skills and experience to the table – all of which can be helpful and supportive during high-pressure times. Get help Know this from the beginning: you won’t be able to do this alone. To get your business successfully up and running, and prime it for growth, you need a strong circle of support. This can be made up of your investors, fellow entrepreneurs, business mentors and staff. Don’t isolate yourself and make the road to success even bumpier. There are many established businesspeople who have been right where you are and will happily offer you support and advice. The lessons they’ve learned from their journey could save you precious time. [easy-tweet tweet="To get your business successfully prime for growth, you need a strong circle of support." hashtags="cloud, tech, business"] Hire the right people – and keep them happy When the time is right, strategic hiring will be crucial for getting your business off the ground. In addition to having an excellent skill set, you need to look for people with a certain character that will thrive within a fast-paced start-up environment. They must have proven expertise in their field as well as a get-stuff-done attitude. Hire energetic, clever people who will stick with the business as it grows - and then hold onto them. As any business grows, its culture can change. Creating an atmosphere of intimate collaboration gets harder the bigger you become, but it’s not impossible. It’s up to you to keep all lines of communication open with your staff and stay true to your vision. ### Engaging with Omni Channel Markets - #CloudTalks with Icon UK’s Chris Jones In this episode of #CloudTalks, Compare the Cloud speaks with Chris Jones, CEO of Icon UK about Omni Channel marketing to every channel available to a market today. [easy-tweet tweet="Compare the Cloud speaks with Chris Jones, CEO of Icon UK about Omni Channel marketing" hashtags="cloud, tech, "] #cloudtalks Visit Icon UK to find out more: icon-uk.net/ ### Securing multi-cloud environments Large enterprises have turned to Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) in the cloud, such as AWS, Azure or Google, to fulfill business demands in a reliable and scalable way. Some organisations have jumped in feet first with a stated end goal of adopting 100 percent cloud with a “cloud first” mentality, while others are experimenting with a hybrid approach of on-premise and cloud infrastructure. This flexibility is a great option for meeting evolving business requirements, but it can also present a headache for those responsible for securing these environments. 
[easy-tweet tweet="Flexibility is a great option for meeting evolving business requirements" hashtags="tech, cloud"] A multi-cloud environment could have many faces; it could be different cloud IaaS and PaaS providers, or a single provider with multiple accounts; for example one for development, one for testing and one for production applications, which is generally considered best practice. With the rapid adoption of cloud infrastructure, ensuring security and compliance in these environments is one of the biggest challenges modern CISOs face. While most CIOs are tasked with developing a digital transformation strategy, the CISO is responsible for ensuring this strategy does not introduce risks or new threats to the organisation, often confronting an uphill battle and pressure to go in blindly. The challenges can be attributed to changes in ownership of technology, reduced overall visibility and new gaps in governance. To overcome these security hurdles and maintain a consistent approach to defense and monitoring, there are a number of actions organisations should take in order to make the digital transformation run much more smoothly. Tell the CISO From the outset, the CISO and security teams need to be aware of plans for moving infrastructure to the cloud, not just to be able to assess the risks and forget about it, but also to be involved with the security architecture in these environments.  Once the security measures are established, there needs to be effective and consistent monitoring to maintain the organisation’s security posture. Figure out who is responsible for what Often application and infrastructure teams have significantly more experience working in cloud infrastructure environments. In many cases, security teams should take advantage of the application teams experience and assume a role of governance versus operations.  And in all cases, clear definitions of responsibility should be established. [easy-tweet tweet="Security teams should take advantage of the application teams experience" hashtags="tech, cloud,"] Technology can help With complex and diverse environments, it can be worthwhile to invest in security management technology that helps organisations get a holistic view, from risk to compliance and threat monitoring.  One that covers multi-cloud environments will be especially beneficial to large enterprises for maintaining a consistent security posture while still being able to take advantage of all the cloud has to offer. Don’t just say “no” A lot of times the security team gets a bad reputation for just saying “no”.  A more effective approach is setting standards that enable IT and infrastructure teams and help to set boundaries so that each party wins.  If the security team is seen as a hindrance to innovation and productivity, it will just end up being bypassed altogether and that is a much more dangerous situation. No matter what an organisation’s cloud journey looks like, establishing consistency in security defense and threat monitoring and maintaining a good security posture will always be the number one challenge.  It is not a transition that will happen overnight, but by following the advice above, organisations can take a less bumpy route to digital transformation and use the cloud to its full advantage more securely. 
### ‘Internet of Things’ Connected Devices to Triple by 2021, Reaching Over 46 Billion Units New data from Juniper Research has found that the number of connected IoT (Internet of Things) devices, sensors and actuators will reach over 46 billion in 2021. This 200% increase, from 2016, will be driven in large part by a reduction in the unit costs of hardware. Juniper forecasts that it will average close to the ‘magic’ $1 throughout the period. Juniper’s latest research, The Internet of Things: Consumer, Industrial & Public Services 2016-2021, found that industrial and public services will post the highest growth over the forecast period, averaging over 24% annually. Scale Brings Complexity However, the research cautioned that both providers and end-users will face tremendous challenges when considering IoT deployments at scale. “The platform landscape is flourishing”, noted research author Steffen Sorrell. “However, analytics and database systems are, for the most part, not architected to handle the Big Data 2.0 era that the IoT brings”. To that end, Juniper identified key areas where disruption is needed, such as spatiotemporal analytics and intelligent systems able to run on less powerful machines (eg routers). Juniper predicted that, without changes in attitude from some service providers, getting an IoT project off the ground would be too difficult for non tech-savvy customers. It noted that initiatives such as Exosite’s IoT Alliance (a large partner network) and Rubicon Labs’ flexible business models (pricing according to the value of the data in question) would be fundamental in driving market traction forward. Cybersecurity at Boiling Point Additionally, the research found that the security threat landscape is widening. IoT DDoS (distributed denial-of-service) ‘botnet’ attacks have become infamous in 2016, although in the medium-term, personal data theft, corporate data theft and physical asset damage will be the primary goals for IoT hackers. It was found that enterprise and industry are investing heavily in security for the IoT. However, the consumer market landscape is woeful, with lax attitudes typified by UK ISP TalkTalk’s astounding “do nothing” Mirai Worm advice to consumers. The research argued that regulatory, corporate and media collaboration would be needed in order to improve the overall threat landscape. ### Innovating corporates quicker - #CloudTalks with Kazendi’s Maximillian Doelle In this episode of #CloudTalks, Compare the Cloud speaks with Maximillian Doelle from Kazendi about how they help large corporates innovate more rapidly and how they have been working with the Microsoft HoloLens. The HoloLens looks to create virtual spaces within the real-world bringing the sci-fi fantasies of Minority Report to life. It won't only revolutionise the way we work with virtual screens and whiteboards being digitally mapped to your surroundings it can also play media and be augmented into reality with the user able to walk around it and see it from every angle. What happens when you leave a space? The mapped data is saved to the cloud and then when re-entered loaded from the cloud and open windows open where they were left. Visit Kazendi to find out more: kazendi.com/ ### Why Businesses Are Moving Away from the High Street Banks? Financial Technology has established a world of start-ups that are built on the passion of tech and the financial savvy of wall street. 
So it’s no surprise that FinTech services have been exploding over the past five years to overshadow the traditional high street bank. The modern FinTech industry is young and disruptive. Two of the oldest companies are a mere ten years old, and the rate of investment is growing by 45 percent annually. Compare this to high street banks, which have seen slow innovation and crippling reputational damage due to the financial crisis, and you can begin to understand why more businesses are starting to turn away from the high street and jump online. [easy-tweet tweet="FinTech has established a world of start-ups built on the passion of tech and the savvy of wall street" hashtags="FinTech, tech,"] Where traditional financial institutions are failing, FinTech companies have been providing the solutions to consumer demands which have been ignored. Better rates As a global business, transferring money through a bank could cost a fortune in hidden fees. The standard money transfer charge from banks sits between 5-8 percent when sending money abroad, whereas FinTech companies charge between 0.5-1 percent. This can equate to a £100 - £150 saving on a £5,000 international transfer. These fees and charges can be crippling to a business looking to expand internationally, and it’s these financial restraints that are impacting a generation of global businesses. In business your money is your lifeline, and if it’s not being protected, how can you trust that your money is working in the best way for your business? FinTech companies have been battling the banks about rates since their conception, and it’s for this primary reason that businesses have begun turning to tech and online services to provide financial products. More secure Apps and mobile banking are now the biggest route for financial transactions, with more than 8 million people downloading banking apps in the past year and £2.9bn transferred through apps in 2015 alone. 53 percent of adults bank online, but each year 900,000 people are affected by bank fraud, costing the UK up to £193bn per year, with the biggest scams relating to business services. [easy-tweet tweet="Apps and mobile banking are now the biggest route for financial transactions" hashtags="tech, FinTech"] This year has been a particularly rough ride for banks, with a number of brands hitting the headlines for failing fraud checks which have identified the risks their customers face by using their services. Fraud can destroy a business, leading it to incur large losses which, in some cases, cannot be resolved quickly. FinTech companies are leading the way in online security because they are a tech company first and a financial service second. Many of them have been born from previously successful tech start-ups that are aware of the way fraudsters work, and have built their own intuitive systems that can manage such attacks far better than the outdated systems of the banks. Global ease Businesses no longer look to expand locally, but globally. The world is becoming a smaller place, and the UK is home to start-ups that have their grass roots abroad, with the most recent research showing that 1% of companies are foreign-owned. Furthermore, research conducted by the Centre for Economic and Business Research found that by 2025 British small businesses plan to expand overseas. With a global outlook on the future, businesses wish to partner with banks who can support their ambitions, rather than restrict them. 
Fast service When you think about starting a business or arranging a financial transaction, banks have a notorious reputation for delaying simple processes and charging high prices for what are often standard tasks. Burdened by an over-regulated market, outdated systems, and huge costs that they have accumulated over decades, high street banks make funding and basic business accounting feel like mammoth tasks. FinTech has revolutionised basic banking practices by removing the middle man. Transferring money abroad, generating funding, payment processing and investment can all be done without the need for a traditional high street bank. [easy-tweet tweet="FinTech has revolutionised basic banking practices by removing the middle man" hashtags="FinTech, tech"] Driving businesses to become more productive and get new ideas off the ground quickly is possibly a major factor in the FinTech boom. As we can see from these main factors, while banks are trying to catch up with the disruption caused by FinTech, businesses are going to choose FinTech options over traditional banks until the banks can become competitive and trustworthy again. ### A VPS Overhaul - #CloudTalks with MEMSET’s Richard Ripley at Dell In this episode of #CloudTalks at Dell, Compare the Cloud speaks with Richard Ripley from MEMSET about their new VPS hosting package overhaul. #CloudTalks For more information, visit Memset: memset.com/ ### IBM joins Zero Outage Industry Standard Association The Zero Outage Industry Standard Association announced today that IBM has joined the association as a founding member to provide a common framework for IT quality with a focus on platforms, people, processes and security. Within the next few weeks, all partners will continue to work together to specify new guidelines for the industry standard, which will be published on the association's website zero-outage.com Bernhard J Klingenberg, CTO, IBM Resiliency Services, said: "In today’s digitised world, ‘always-on’ services and solutions are paramount for IBM clients and customers. IBM has a proud tradition of supporting open source communities and industry collaborations, and always strives to serve our clients’ interests in an integrated and unified way. We are honoured to contribute our experience as a leading services integrator to this new standard with the Zero Outage Industry Standard Association to encourage the continued availability and reliability of IT infrastructures today and in the future.” During the association’s first board of directors meeting on November 25th, 2016, Stephan Kasulke, SVP Global Quality at T-Systems International, was voted in as the new chairman of the board of directors for the Zero Outage Industry Standard Association. He explained: “As an organisation’s IT infrastructure can involve a complex ecosystem of technologies from a variety of vendors, there are often differing levels of service level agreements in place, which can lead to critical defaults and security issues. I am very enthusiastic to be the first Chairman of the Zero Outage Association – this is the start of a long journey towards stable telecommunications, IT and IoT.” Digitisation is in full swing: machines communicate with each other, processes are becoming more and more efficient and automation is an integral part of the process. But all of this can only work if the IT behind it runs smoothly. A failure, even for a few minutes, can have fatal consequences. 
If production lines stop due to IT problems, companies face reputational damage and costs running into millions. If systems fail in an airport tower or in hospitals, in the worst-case scenario human life is threatened. No matter how big a company is or which industry it belongs to, reliable and smoothly functioning information and communication technology is important. Therefore quality is becoming a decisive competitive factor in the age of digitalisation. According to Gartner, the average CIO is already spending 18 percent of the organisation's budget in support of digitalisation, with that number expected to increase to 28 percent by 2018. ### Fairy-tale Lessons in IT Managed Services: From Beauty to Beast Building the dream castle with a digital transformation strategy There are parts of every business that are princely, and some that are beastly, but to turn a traditional business that operates on the hop (like a frog, pre-princess kiss) into a prince charming requires a strategy to navigate the complexities that modern IT services now present. [easy-tweet tweet="There are parts of every business that are princely, and some that are beastly" hashtags="tech, cloud, IT"] What does success look like for businesses today? In a fast-paced climate, there are now four key imperatives for businesses who want to remain competitive: improving customer engagement, empowering their employees, transforming the products and services they offer using digital content, and optimising business operations. The more whimsically minded might consider these the equivalent of inviting guests to the ball, inspiring the lost boys, adding a touch of magic, and fitting the magic slipper. Making a crystal ball The key to achieving these four goals is a digital transformation strategy, and harnessing new cloud-based services can help fast-track its delivery. Consider this strategy the fairy godmother of the organisation’s plans. For example, companies can empower and enchant employees by using cloud-based tools that make it easier for teams to collaborate and communicate, and that provide access to a wealth of information previously siloed in different systems across the organisation. In addition, optimising business operations depends on knowing what’s happening in the business now – and what’s likely to happen in the future – the equivalent of a crystal ball. Adding some Fantasia Today’s organisations will benefit from deeper insight into their business processes, which comes through intelligent reporting and analytics that can be accessed through cloud services like Cobweb Complete for Office 365. It can provide regular reporting on employees’ productivity, if required, enabling a business to bolster productivity and create a more efficient working environment. In fact, according to the Cloud Industry Forum, 80 percent of UK businesses that have implemented, or are planning on implementing, a digital transformation strategy say that cloud is important to their strategy, and 38 percent say that cloud has given their organisations a significant competitive advantage. A Merlin-like mentor will help Growing a business is a full-time occupation, and in-house IT resources are usually stretched to capacity. As a result, it’s often difficult to devote the necessary time and effort to deploying and managing new cloud services. Adopting cloud solutions via a managed service provider takes the pressure away from the IT team (if a small business has a team!) 
and puts everything into the hands of a cloud service specialist. By outsourcing the critical day-to-day support and maintenance of the IT estate to an expert that understands their needs, the business is free to focus on growing, while taking the maximum value from its investment in IT. Even young Arthur needed a Merlin to help him become king, after all. This is especially helpful for small companies or start-ups; instead of spending time managing software installations, updating old versions of software and overseeing upgrades, employees are free to focus on core business activities such as delighting customers and generating new revenue. A 2016 report from the Cloud Industry Forum shows that after migrating part of their software estate to the cloud, more than half of UK IT teams (55 percent) were able to be redeployed to other activities, while a further third (34 percent) saw their maintenance workload reduce dramatically, paving the way for them to engage in more strategic work for the business. The biggest attraction of choosing a managed service, for finance or IT directors, is the financial benefit associated with increased employee productivity and greater predictability of costs. With simple monthly subscription pricing, organisations can avoid large upfront costs for new software and servers, and move the cost of IT from a capital to an operating expense. By offering a predictable monthly subscription plan, organisations know exactly how much they are spending each month and can plan their budgets accordingly. Waving the magic wand There are a number of different managed service options out there, but those from IT partners that wrap around Microsoft Office 365 are usually the most popular option for growing businesses looking to take advantage of competitive prices and innovative software. Services that allow organisations to get more out of the applications they are already familiar with provide an instant boost and cut out training and familiarisation time. It's the difference between using the mice and pumpkin or the magic carriage to get to the ball – getting the most out of an asset's potential. Most small businesses remain entirely unaware that Microsoft Office 365 provides a suite of tools to help save them time that they can spend on the business and customers – whether that's making it easier to manage files and data, simplifying communication and collaboration among employees, or managing meetings on the move. Office 365 combines staples like Exchange Online, SharePoint Online, Word, Excel and PowerPoint with the next generation of productivity-based services, such as Skype for Business Online, which bolsters collaboration and communication both within and across organisations. Falling in love again Much like Beauty seeing 'something that wasn't there before', managed services that integrate into the fabric of a growing business offer the chance to get the very most out of the cloud services they are already paying for but generally not utilising, improving both efficiency and productivity. Another vital benefit for businesses whose employees often work from home or require IT assistance around the clock is 24/7 support, to ensure that not a moment is wasted in fixing a problem. Furthermore, active monitoring and proactive support help to protect a company's users and content, while also reducing overhead and operating expenses. 
[easy-tweet tweet="Services integrated into the fabric of a growing business offer the chance to get the best from cloud services" hashtags="cloud, tech, IT"] You can’t always ‘laugh in the face of danger’, and protecting identity and content are also key aspects of the service. Industry leading antivirus and antispam technologies provide an additional layer of email security, preventing unwanted messages and protecting against damaging and costly cyber-attack. Advanced monitoring, management, compliance, alerting and auditing features are the foundation to the proactive reporting and support element of the managed services package, incorporating pro-active warnings for potential security threats or compromised accounts. In today’s technological workplace, which is evolving at an incredible speed (like Sleeping Beauty’s castle coming back to life), getting the right support is key. The journey to digital transformation is an important step for businesses looking to take advantage of technological advances to improve business efficiency, the and service ‘wrappers’ that help deliver the powerful software to help progress this journey is now available from the IT partner community. It’s worth finding the right partner to jump on the magic carpet with, as the ride will be a whole new world… ### Advancing through the cloud - #CloudTalks with Adapt’s Kevin Linsell at Dell In this episode of #CloudTalks at Dell, Compare the Cloud speaks with Kevin Linsell from Adapt about their achievements in advancing the consumer through cloud technologies.   #Cloudtalks For more visit Adapt to find out more: adapt.com ### Pulsant provides managed network to Argyll Community Housing Association Pulsant, leading UK provider of hybrid cloud services, has completed a managed network deal with the Argyll Community Housing Association (ACHA). The network was deployed across ACHA’s eight office locations and 10 sheltered housing sites. Pulsant won the contract in 2015 to supply the housing association’s network infrastructure, which covers a geographically diverse area across Argyll and Bute in the west of Scotland. The project was staged over a year, bringing all of the housing association’s sites onto the wide area network (WAN). “We needed all of our sites to be connected with our staff having access to all the systems and applications required to make sure operations run smoothly. This wasn’t an easy task as the offices and housing sites are so widely spread across the region, some very rural, and others in more populated areas so we had some challenges with finding telecommunications providers,” says Vivienne Kerr, IT manager, ACHA. “Pulsant stepped up to the challenge and was appointed to deliver the leased lines and ADSL circuits, along with a number of routers based on a variety of requirements for each site. The project was completed successfully and within our timescales, which were incredibly tight. The benefit for us now is that we have our own network, which is managed by one provider — Pulsant — with a dedicated account manager, and excellent service and support.” Ian Appleyard, business development executive, Pulsant says: “Due to our carrier neutrality we were able to select different methods, telecommunications providers and underlying infrastructure to deploy the WAN. 
Our key differentiator was the fact that we could pick which provider was best suited to which area and then bring that all together in a single solution." Pulsant continues to proactively manage the solution and provide support where needed. ### The Migration to Modern Account Security is Under Way… Finally Over the past month, there have been a number of high-profile security breaches, most notably at TalkTalk and Three Mobile, and these are just the latest in a long line of security incidents that have compromised the sensitive details of hundreds of millions of users. These breaches reflect the existence of an ever-present and ever-growing threat to the security of billions of online accounts. [easy-tweet tweet="Breaches reflect the existence of an ever-present and ever-growing threat to the security of billions" hashtags="tech, security, cloud"] In an attempt to resolve cyber-security issues, companies often advise users to select complex, unique passwords for each account and recommend changing them frequently. However, vast numbers of consumers tend to reuse old passwords or choose weak ones, in spite of the risk this poses. Rather worryingly, "123456" and "password" have topped SplashData's annual "Worst Password" report as the most commonly used passwords – five years in a row. Security experts have known for some time that usernames and passwords alone aren't enough to protect users. The industry is beginning to recognise the importance of added security, namely two-factor authentication (2FA). With the help of cloud communication platforms, companies can easily (and perhaps most importantly, cost-effectively) integrate 2FA into the user experience. 2FA hardens account security by requiring customers to provide a code that is transmitted to their own device. In the majority of cases, a mobile device is a far more secure form of authentication than using, say, your mother's maiden name. Unfortunately, despite the better security offered by 2FA, you need only spend a few minutes on TwoFactorAuth.org to see how many businesses have yet to implement it. Moving forward – SMS and push notification As a recent Microsoft study attests, introducing further steps in the log-in process can be a risky business. The resulting security fatigue can frustrate the user to the extent that they may even discontinue their service. To this end, businesses have traditionally shied away from clunkier – though stronger – security. Microsoft researchers found that no alternative security method is as easy to use, or implement, as passwords. They wrote, 'Marginal gains are often not sufficient … to overcome significant transition costs', concluding that the 'funeral procession for passwords' is likely still years away. [easy-tweet tweet="Businesses have traditionally shied away from clunkier – though stronger – security." hashtags="tech, cloud, security"] Take, for example, SMS, the most popular medium through which two-factor authentication is achieved. Users are sent an SMS verification code to their phone number and are then asked to enter the code into the website. This is still more secure than a username and password alone, but at a time when businesses focus on converting as many website visitors as possible, it can seem counterproductive. Whilst there is no reason to avoid SMS verification in low-risk communications (for example, a text to notify users that a car has arrived), this type of (by default) unencrypted communication remains less suited to high-risk communications. 
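To make the mechanics concrete, the short sketch below shows how a service might issue and verify a one-time SMS code of the kind described above. It is a minimal illustration under stated assumptions – the function names, the in-memory store and the six-digit format are invented for this example – and a real deployment would hand the code to an SMS gateway and add rate limiting on top of the expiry and single-use checks shown here.

```python
import hmac
import secrets
import time

CODE_TTL_SECONDS = 300   # codes expire after five minutes
_pending = {}            # phone_number -> (code, issued_at); in-memory store, illustration only


def issue_sms_code(phone_number: str) -> str:
    """Generate a six-digit one-time code and record when it was issued.
    In production the code would be passed to an SMS gateway, not returned to the caller."""
    code = f"{secrets.randbelow(1_000_000):06d}"   # cryptographically random, zero-padded
    _pending[phone_number] = (code, time.time())
    return code


def verify_sms_code(phone_number: str, submitted: str) -> bool:
    """Accept the code only if it exists, has not expired, and matches exactly once."""
    entry = _pending.pop(phone_number, None)       # single use: remove regardless of outcome
    if entry is None:
        return False
    code, issued_at = entry
    if time.time() - issued_at > CODE_TTL_SECONDS:
        return False
    return hmac.compare_digest(code, submitted)    # constant-time comparison
```

The same verify-once, expire-quickly pattern applies whatever the delivery channel; the push-based approach discussed next simply replaces the unencrypted SMS leg with an encrypted notification.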
Luckily, the security industry is constantly trying to devise strong security measures that consumers will actually use. In the past 18 months, a new form of 2FA has emerged, based on a technology that we regularly use and interact with: push notifications. Unlike SMS, push notifications can start a chain of end-to-end encrypted communications between the app and a secured authentication service, providing 'push authentication' that is sent straight to your device over the internet. Simply responding to the push initialises secure software that then displays the intended message to the device owner. But instead of just a string of random numbers, push notifications can include context in an authentication request. For example: "Would you like to authorise a transfer for $3,000 to Mr. T. Hief?" Reactive fraud alerts only notify the victim of the illicit action, but a push notification empowers the user to respond immediately and prevent the attack. Generally, businesses should all be considering using push notifications in cloud-based authentication scenarios. Push is familiar and easy, and the technology is mature and reliable. Passwords: a thing of the past? In recent months, new forms of push authentication have been integrated into the services of popular consumer sites. Yahoo, Google, Microsoft, and even online gaming giant Blizzard are rolling out "password-less" experiences, powered by push. Although this is great news for users, it doesn't present an adoption strategy for businesses looking to implement similar security measures, because each of these solutions serves a specific community alone. [easy-tweet tweet="Online gaming giant, Blizzard, are rolling out “password-less” experiences, powered by push." hashtags="tech, cloud, gaming, security"] Fortunately, we live in an age of readily available, flexible building blocks for software development that can scale and keep up with growing customer demands and changing business requirements. APIs continue to innovate, altering previously static industries like communications and payments. What's more, companies like live-streaming service Twitch and virtualisation leader VMware understand the importance of securing user accounts - that's why they looked to cloud-driven, reliable two-factor authentication layers to further protect their communities. In your migration to agile, cloud-based development, don't leave the safety of your customers behind. Instead, give serious consideration to strengthening your security capabilities by implementing two-factor authentication functionality. ### On-Premise to SaaS - #CloudTalks with Secura Hosting's Dan Nichols at Dell In this episode of #CloudTalks at Dell, Compare the Cloud speaks with Dan Nichols from Secura Hosting about their Software-as-a-Service package and hosting services for design agencies. VMware's NSX technology has enabled some advanced networking scenarios around the cloud technologies that were being used previously. #cloudtalks For more information, visit Secura Hosting: https://secura.cloud ### Why the cloud hasn't trumped hardware skills The cloud is no longer the new technology platform, shrouded in mystique and suspicion, that it once was. The business value and relevance of the platform to the market has largely been recognised. However, the role of hardware has too often been seen as secondary. The growth of the cloud has come alongside a pay-as-you-grow revolution in which technology is consumed 'as-a-service' on simple subscription-based models. 
The result of this is that consumers have less insight into the technology behind products. [easy-tweet tweet="Cloud is no longer a new tech platform surrounded in mystique and suspicion that it was" hashtags="cloud, tech, data"] The 'as-a-service' subscription trend is one we're seeing across business and consumer markets, from services such as Netflix through to Google's G Suite (formerly Google Apps for Business). Both are also reliant on cloud services to operate. Netflix uses Amazon Web Services to manage its cloud, quickly deploying data and space to where it is needed to maintain service uptime. With G Suite, Google is trying to radically change the way in which businesses work and operate - with everything from word processing software to files hosted in the cloud. The cloud is helping these companies shorten the supply chain and deliver always-on, instant access to customers. This is an inherent benefit of the cloud, but it is only positive until something goes wrong. The part we continually find many forget about the cloud is that it isn't just some enormous invisible body. Yes, it can be accessed anywhere in the world, but it relies on data centres hosting vast amounts of physical hardware. Only recently did Microsoft launch three data centres in London, Durham and Cardiff to support its growing UK cloud infrastructure - and Amazon is expected to follow. [easy-tweet tweet="Cloud is helping companies shorten the supply chain and deliver always-on instant access to customers" hashtags="tech, cloud, data"] From the outside, it appears hardware is disappearing. Very few companies still have the traditional basement computer room full of servers. However, what we're actually seeing is greater centralisation of hardware into big data centres. These giant warehouses full of server, networking and storage components inevitably require maintenance from technical specialist engineers. The challenge is that any failure to maintain a part or server properly could result in downtime not just for one company but for multiple paying service users, damaging the reputation of the service provider. When we talk about the skills gap, the concern tends to be around software development, but in my mind, as important an area as this is, it is not the full story. We require skills in emerging technologies such as Artificial Intelligence and Virtual Reality for the development of the UK economy. However, these tend to be heavily reliant on the processing power of the cloud, meaning we need strong and effective cloud infrastructure to keep things moving. In short, enabling innovation requires strong, resilient cloud infrastructure that has a heavy reliance on hardware. Despite this, the focus in the technology sector has primarily been on training new skills across software and applications, when in fact hardware maintenance, installation and professional service skills are just as important. Government may have prioritised software in recent times; however, taking such a singular approach fails to acknowledge necessary supporting technologies and the skills required to maintain them. 
[easy-tweet tweet="Focus in the tech sector has been focused on training new skills across software and apps" hashtags="tech, cloud, data"] This is a view supported by many in IT support roles, with a survey conducted by OnePoll on Agilitas’ behalf revealing 78 percent of resellers, managed service providers and independent IT providers agree that hiring and training more staff with data-centre technical skills will help to ease the skills gap by 2020. The research highlights the view that hardware and technical skills should not be regarded as separate or a challenger to cloud. If a data center does go down, the impact will be felt across multiple businesses. Hardware and the skills required to maintain it must, therefore, be seen as essential to future cloud uptime. To secure our digital future, skills investment must be balanced across the cloud and hardware. New technology still requires a data center back end somewhere, so efforts must still be made to ensure a skilled technical function remains to manage and maintain hardware infrastructure. In short, broadening investment in skills across IT will help to reduce the growth of the skills gap and in the long term maintain uptime. ### Broaden Your Reach: Lifesize Creates Seamless Skype for Business Interoperability with Direct Calling Today, Lifesize®, a global innovator of video conferencing technology, announced it has enhanced its Skype for Business interoperability to simplify cross-platform audio, video and web conferencing, and extend the Skype for Business environment to the conference room with award-winning Lifesize HD camera and phone systems. With this unique interoperability, Lifesize is the only solution to date that enables users to make and receive direct calls to and from Skype for Business for one-on-one or group calls. Microsoft Office 365 and its communication application, Skype for Business, is a cornerstone for modern businesses, with more than 100 million people using Skype for Business worldwide. Yet, despite the popularity and regular use of Skype for Business, seamless integration between video conferencing solutions and Skype for Business remains limited. Far too often, workers get stuck in this type of situation.  A product development team of four individuals gather in a small conference room in Austin. They connect via video conference to three marketing team members in San Francisco. A few minutes into the discussion, they realise they need input from a contractor based in London. They reach for their speakerphone to make the call, but stop – the contractor only uses Skype for Business and their system cannot call her. Now what?  Meetings should not be interrupted or lack video quality because of interoperability limitations, which is why Lifesize is dedicated to delivering the most user-friendly and complete video communication experience for organisations regularly collaborating with Skype for Business users. Designed to complement the Skype for Business environment, Lifesize fits into the existing Skype for Business workflow by allowing Lifesize users to place direct calls to Skype for Business users, add them to existing calls, and send and receive screen sharing. The integration also allows organisations to bring a Skype for Business presence into the conference room with the Lifesize plug-and-play video conference room system, which gives users the option to connect to any third-party conference room system. 
With this comprehensive package of unmatched quality, ease of use, flexibility and interoperability, IT administrators don't need to worry about risking their investment with "lock-in" products for the conference room. In fact, Lifesize now helps IT admins further extend the value of their Skype for Business deployment and investment. "Skype for Business and other such applications allow organisations to standardise on collaboration workflows," said Craig Malloy, CEO at Lifesize. "For users in those organisations, the best conferencing solution should complement the way they work, not hinder it. The interoperability Lifesize offers extends far past any solution on the market, providing the simplest, most effective collaboration experience between Lifesize, Skype for Business and third-party vendors." Lifesize enhances the Skype for Business environment with:

- A superior conference room experience: Users can place and receive direct calls from a Lifesize HD camera and phone system within the conference room to a Skype for Business user at his or her desktop PC. Users can also leverage a Lifesize virtual meeting room to connect third-party conference room systems to Skype for Business or Lifesize users.
- Bi-directional calling with Skype for Business: Those using the Lifesize cloud-based application can place direct calls to Skype for Business users and/or add them to an existing call. Skype for Business users can also place direct calls to Lifesize app users, conference room systems or virtual meeting rooms.
- Bi-directional screen share with Skype for Business: Skype for Business users can send and receive real-time screen sharing in a call with a Lifesize app user, conference room system or virtual meeting room.
- Broader reach: Skype for Business users can extend their call capability with the Lifesize cloud-based application, which allows up to 50-way calling, audio dial-in numbers in more than 60 countries, the ability to view more participants on screen, and the option to bring in callers from outside of the organisation.

The Lifesize Skype for Business interoperability is supported on Skype for Business Server or Skype for Business Online through Microsoft Office 365. ### 24/7 Network Operations - #CloudTalks with OryxAlign's Carl Henriksen at Dell In this episode of #CloudTalks at Dell, Compare the Cloud speaks with Carl Henriksen from OryxAlign about their 24/7 Network Operations and providing the consumer with the 'complete offering'. #Cloudtalks For more information, visit OryxAlign: oryxalign.com/ ### Information is all: the new responsibilities of a CIO today Many organisations today still misinterpret the role of the CIO as a 'head of IT' - the computer guy, stowed away in a back office while the "Business Units" make the business decisions that matter. However, as the business world continues to rely on information to provide its competitive advantage, that perception is being rightly challenged. Technology alone doesn't differentiate a business from its competitors - but information can, which is why the CIO is such a critical part of the senior management team. [easy-tweet tweet="Technology alone doesn't differentiate a business from its competitors - but information can" hashtags="tech, data, cloud, security"] Information is now the single most valuable asset available to a business, and as a result, the responsibility and influence of the CIO is evolving at pace. 
With the rise of the cloud and the Internet of Things (IoT), valuable data is pouring into businesses at an exponential rate. As customers increasingly come to expect a highly personalised experience, getting the greatest possible value out of that data is key to success. The CIO is tasked with overseeing the consolidation of data from hundreds of sources, enabling the company to extract relevant insights and ultimately driving revenue growth as a result. The CIO has a unique advantage in this aim thanks to their overarching view of the company's information assets. In an average mid-sized business, for example, the marketing department may run its campaigns on a dedicated platform like Marketo, while the sales teams record their leads in Salesforce. Neither has a view into the other's data. The CIO, on the other hand, has responsibility across business units and is able to see what data is stored by the company across all these platforms. Importantly, this gives them the opportunity to spot where relevant connections can be drawn between disparate data stores - for example, giving sales reps access to social media interaction data from marketing teams so they know which companies or individuals have been showing interest in the company online. However, the task of connecting data is becoming more complex, as corporate information sources exist across a widening range of locations. The CIO of a manufacturing company, for example, may have several thousand shop-floor sensors to manage, all feeding in gigabytes of real-time data on everything from temperature and humidity to production count and fault reports. Meanwhile, line-of-business units could be racking up information in cloud-based Software-as-a-Service apps and self-service analytics platforms. For the modern CIO, the key challenge is to enable the business to efficiently sort and make use of the data available, making the relevant patterns and trends available in as close to real time as possible. [easy-tweet tweet="Corporate information sources exist across a widening range of locations" hashtags="tech, cloud, data"] Of course, there is more to the role of CIO than just technical wrangling. A growing number of CIOs aren't IT stalwarts. Instead, they are experienced business people with a strong understanding of operational discipline, efficiency and measurement, who are best equipped to improve their team's and the company's performance. The CIO is essentially an enabling generalist - a leader with a view into each of their team members' areas of expertise, who can unite them for a single purpose. Part of this is about rallying IT teams around the issues most important to the business. To do that, the CIO needs to be able to enfranchise each team member and make clear, in the wider context of the business, why they are doing what they are doing. Finally, for the new breed of CIO trying to get data analytics into the boardroom and educate the business as a whole about their role, it's essential to be proactive in taking responsibility for important, high-profile projects. By requesting the lead, the CIO can demonstrate not only their technical ability but also how IT dovetails with, and can lead transformation in, flagship initiatives. In a software company, for example, by taking responsibility for overhauling the company's subscription fulfilment model, the CIO becomes a key player in leading both a technical project and a major change to how the company does business. 
[easy-tweet tweet="It’s essential to be proactive in taking responsibility for important, high profile projects." hashtags="cloud, tech, security"] Ultimately, if companies don’t take advantage of their information assets, competitors will. Just look at hospitality – it was a capital-heavy industry requiring property and employees all over the world until Airbnb changed that with an information-centric model. If organisations in other markets are to avoid the same fate, they will need the help and guidance of their CIO to ensure that business unit leaders can bring data into the boardroom, shifting the discussion from functional optimisation to enterprise optimisation and uniting disparate divisions and platforms behind transformative initiatives to push the company forward. Information holds the key not only to redefining the future of the CIO, but also to overall business success. ### Dell/EMC Momentum 2016 Barcelona - Welcome to the end of an era – Welcome to the digital era! Compare the Cloud were asked to attend and report on the prestigious event held by Dell/EMC in Barcelona in November so how could we say no! With the event following on from last year with a similar theme – digital disruption. During his keynote speech Rohit Ghai, President Enterprise Content Management, discussed the new era of the digital disruption and how we are seeing the “demise of the client/server era, the ERP era and the transactional economy, and welcome the age of the Cloud SaaS era, the subscription sharing economy and the cogitative age!” Words of wisdom from Rohit and as these last statement fall right into the hands of the three top topics that followed during the week. [easy-tweet tweet="Welcome the age of the Cloud SaaS era, the subscription sharing economy and the cogitative age!" hashtags="tech, cloud, dell, emc"] InfoArchive, Leap and Documentum were the main themes presented with new developments within each of the product sets. Let me explain what they are, and what they do to the uninitiated. InfoArchive If you are not familiar with the product, let me explain at a very high level what it does. InfoArchive is essentially an archiving platform that accommodates structured and unstructured data into a single repository with adherence to regulations like HIPAA, Dodd-Frank or EMIR. InfoArchive provides a long-term compliant archive meeting retention requirements and ensuring audit ability, defensibility and easy accessibility. This was demonstrated by vertical market segmentation for the product set within industries. Leap A collection of 5 subproducts (leap courier, leap snap, leap concert, leap express, leap focus) all providing their services that form a marketplace of sorts that combine all the tools you will need to manage and interact with enterprise content. Documentum This collection of products (includes 90 individual products that form elements of then overall data management platform) service all sectors for data management with compliance and portability. Documentum has been around since 1990 and over the last 26 years has proved to be a compelling data management service that just keeps improving for its users. We caught up with Mark Abour, head of product management for Documentum about the recent changes and announcements at Momentum 2016. Neil: So Mark, tell me a little bit about your background: Mark: “Well, I’m Mark Abour, head of product management for Documentum. 
I first started on my journey at EDS some time ago, before moving to Bon, then Bulldog, and then Documentum, now Dell/EMC. I was part of the original team that started on-demand Documentum as a service in 2012, built this out, and then moved back to the products group around 18 months ago." Neil: So, that's a very interesting journey through the data management space from all sides. Can you explain the thought process behind the three distinct product brands shown today, and why three? Mark: "Well, we market them (Leap, InfoArchive and Documentum) as three different brands, although today you saw the message – we are better together. As an example, the Documentum product itself has 80/90 sub-products included within it. Leap we separated out because we wanted to market it as a multitenant SaaS next-generation application. InfoArchive is targeting a lot of non-Documentum clients, although there is crossover, which allows easy integration for cross-platform opportunities." Neil: Mark, have you any final thoughts on future developments? Mark: "We are constantly developing and adding new features and functionality across all products, but we do see considerable development within the analytic capabilities of big data analysis." [easy-tweet tweet="InfoArchive provides a long-term compliant archive ensuring audit ability" hashtags="tech, cloud, dell, emc"] So, in summary, it is my belief that the Dell/EMC merger will only strengthen their Enterprise Content Management clients by providing new developments for a very long-established data management industry. With vertical solutions for industry markets, data compliance that has evolved with their clients (not just an "off the shelf, then self-tailor at your peril" solution), and product history and development that has evolved over 26 years across each industry, the future of enterprise content management is Documentum – a position demonstrated by the Gartner Magic Quadrant, where it is clearly visible as a leader in this space. For more information on the Momentum event and highlights of the show visit http://www.emc.com/microsites/emcworld2016/momentum-barcelona.htm ### Scientific breakthrough reveals unprecedented alternative to battery power storage Ground-breaking research from the University of Surrey and Augmented Optics Ltd., in collaboration with the University of Bristol, has developed potentially transformational technology which could revolutionise the capabilities of appliances that have previously relied on battery power to work. This development by Augmented Optics Ltd. could translate into very high energy density supercapacitors, making it possible to recharge your mobile phone, laptop or other mobile devices in just a few seconds. The technology could have a seismic impact across a number of industries, including transport, aerospace, energy generation, and household applications such as mobile phones, flat screen electronic devices, and biosensors. It could also revolutionise electric cars, allowing the possibility for them to recharge as quickly as it takes for a regular non-electric car to refuel with petrol – a process that currently takes approximately 6-8 hours. Imagine: instead of an electric car being limited to a drive from London to Brighton, the new technology could allow it to travel from London to Edinburgh without the need to recharge, and when it did recharge, for this operation to take just a few minutes. 
Supercapacitor buses are already being used in China, but they have a very limited range, whereas this technology could allow them to travel a lot further between recharges. Instead of recharging every 2-3 stops, this technology could mean they only need to recharge every 20-30 stops, and that will take only a few seconds. Elon Musk, of Tesla and SpaceX, has previously stated his belief that supercapacitors are likely to be the technology for future electric air transportation. We believe that the present scientific advance could make that vision a reality. The technology was adapted from the principles used to make soft contact lenses, which Dr Donald Highgate (of Augmented Optics, and an alumnus of the University of Surrey) developed following his postgraduate studies at Surrey 40 years ago. Supercapacitors, an alternative power source to batteries, store energy using electrodes and electrolytes, and both charge and deliver energy quickly, unlike conventional batteries which do so in a much slower, more sustained way. Supercapacitors have the ability to charge and discharge rapidly over very large numbers of cycles. However, because of their poor energy density per kilogramme (approximately just one twentieth of existing battery technology), they have, until now, been unable to compete with conventional battery energy storage in many applications. Dr Brendan Howlin of the University of Surrey explained: "There is a global search for new energy storage technology and this new ultra capacity supercapacitor has the potential to open the door to unimaginably exciting developments." The ground-breaking research programme was conducted by researchers at the University of Surrey's Department of Chemistry, where the project was initiated by Dr Donald Highgate of Augmented Optics Ltd. The research team was co-led by the Principal Investigators Dr Ian Hamerton and Dr Brendan Howlin. Dr Hamerton continues to collaborate on the project in his new post at the University of Bristol, where the electrochemical testing to trial the research findings was carried out by fellow University of Bristol academic David Fermin, Professor of Electrochemistry in the School of Chemistry. Dr Ian Hamerton, Reader in Polymers and Composite Materials from the Department of Aerospace Engineering, University of Bristol, said: "While this research has potentially opened the route to very high-density supercapacitors, these *polymers have many other possible uses in which tough, flexible conducting materials are desirable, including bioelectronics, sensors, wearable electronics, and advanced optics. We believe that this is an extremely exciting and potentially game-changing development." *The materials are based on large organic molecules composed of many repeated subunits, bonded together to form a three-dimensional network. Jim Heathcote, Chief Executive of both Augmented Optics Ltd and Supercapacitor Materials Ltd, said: "It is a privilege to work with the teams from the University of Surrey and the University of Bristol. The test results from the new polymers suggest that extremely high energy density supercapacitors could be constructed in the very near future. We are now actively seeking commercial partners in order to supply our polymers and offer assistance to build these ultra high energy density storage devices." ### How secure is your data? Five questions for your cloud communications vendor When choosing a cloud communications vendor many companies focus on price, speed and efficiency. 
While these are important factors, few remember to think of the protection of their data. With more businesses moving to the cloud, asking the right questions about security when talking with your vendor should be a top priority. [easy-tweet tweet="Asking the right questions about security should always be a top priority." hashtags="tech, cloud, security"] Cloud security is a shared responsibility model, meaning that effective security relies on both the customer and vendor for implementation. There's a lot your vendor can do to mitigate risks to your business, and in the spirit of strong partnership, all customers should ask their partners the questions that they need answered for trust and assurance. Here are five to start with for your UCaaS vendor.

1. Do you support encrypted VoIP? Unencrypted data streams of any type increase your risk and attack surface. A VoIP call is no different from web application data in this respect. Eavesdropping, man-in-the-middle attacks and capturing registration credentials are all made easier with plain-text data. The bottom line is that VoIP traffic should always be encrypted. A vendor's implementation of VoIP encryption should include both call signalling and media communications, so be sure to ask if both are encrypted.

2. Do you have an audit report you can share with me? Trust is an essential part of the cloud services model. Cloud vendors should understand your efforts to look under the hood of their security controls. Cloud companies that take security seriously will do more than just pass along their data centre's audit report. Look for vendors who have audit reports put together by an independent third party. Make sure the audit report covers the service organisation's controls (i.e. your vendor) and not just the data centre or cloud service where the application is hosted. Service operations are a key part of the cloud security model.

3. Do you have control points that will enable me and my team to secure our side of the voice network? When revisiting the shared responsibility model for cloud security, it's important to remember that there's an entire customer side to the service, with VoIP phones, apps that run on desktop or laptop computers, and users connecting to the service from their mobile endpoints. Does your vendor help educate you about the best practices that are within your control? Do they build security attributes and features into their product to take care of some things directly and to empower you with settings to control the rest? Just like your data stored in the cloud, any of your data stored locally on these endpoints should be secured with encryption. Make sure your vendor offers an option that lets you mandate encryption for your data when it's stored locally in an endpoint app. Make sure that you can implement roles and permissions for your users, and make sure that you have the ability to specify important usage parameters such as international calling. [easy-tweet tweet="Does your vendor help educate you about the best practices that are within your control?" hashtags="tech, cloud, IT, security"]

4. How often do you conduct product security testing? It's no surprise to anyone that software has bugs. What's important is that your vendor makes investments to regularly test its application security to find security bugs so that they can be prioritised and addressed, rather than limiting product testing to one or two pen-tests per year. Don't get me wrong. Pen-tests serve a purpose. 
However, it's likely that most vendors are releasing updates to their software at an interval that outpaces periodic pen-testing. Ask your vendors how often they test their application releases for security. In addition, when it comes to product security testing, it's good to have a mix of in-house and third-party testing. No one person or team will find every bug. The idea here is that a more diverse set of eyes and methods will find more bugs than a single team or a single method.

5. Do you monitor for unusual activity or usage on customer accounts? Cloud customers don't always monitor their usage of a cloud service, and those with an on-premise PBX don't always monitor their company's PBX activity. As a cloud customer, you may not even have access to the service usage data. Ask your vendors whether they monitor for service abuse and anomalous usage. If you're looking at cloud PBX, ask if any toll fraud monitoring is included.

The list of questions you should ask your vendor is by no means exhaustive; the more you ask, the more information you have to determine if a cloud vendor is ready to be your partner in the shared security model that cloud computing requires. ### Evolving into a cloud services provider - #CloudTalks with Fordway's Richard Blanford at Dell In this episode of #CloudTalks at Dell, Compare the Cloud speaks with Richard Blanford from Fordway about how they are evolving into a cloud services provider. #cloudtalks For more information, visit Fordway: fordway.com/ ### The beauty of Spring is not in one single flower - Huawei eco-Connect Europe 2016 Compare the Cloud attended the Huawei eco-Connect Europe event at the back end of October, and the clue was in the event name, as Huawei look to work collaboratively with many start-ups and big names within the industry. The leading global information and communications technology solution provider announced several partnerships at eco-Connect Europe 2016 which will support its product portfolio, ensuring customers across Europe become agile and flexible. [easy-tweet tweet="Huawei look to work collaboratively with many start-ups and big names within the industry." user="Huaweiuk" hashtags="tech, cloud, IoT"] Vincent Pang, president of Huawei's Western European Region, took to the stage with a keynote further exemplifying the importance of collaboration in today's tech market with the analogy of 'the beauty of Spring is not in one single flower'. To date, Huawei has attracted over 1,000 certified partners after investing heavily in its partner ecosystem in the enterprise market. Alain Staron of Veolia describes new IoT technologies like Smart Cities as creating a "digital space over a physical place" with an aim to give everyone the chance to live in a "better place". As ICT solutions begin to mature, Huawei suggests that through openness and collaboration, there will be an opportunity to build services that are better for everyone. By talking about the transformation, Huawei and their partners look to shape the future, and Huawei aren't thinking linearly. From eCommerce, cloud computing and IoT to 3D printing, video-reality and smart materials, Huawei state they're "happy to work with all" in the next three to five years as they look to build an ecosystem-based innovation system. Broader connections between ISVs, independent developers, partners and customers will help meet the ever-growing demand for data traffic driven by the more than 50 million devices expected to be connected by 2020. 
The importance of collaboration was underlined further as, midway through the keynote, a Guinness World Record was set with over 1,000 people sharing a VR experience simultaneously in one location. Each ecosystem partner brings individual experiences and innovation to the marketplace and its users - beating expectations. FinTech was touched on by HSBC's David Knott, who claimed tomorrow is exciting but today is "just as exciting", with innovations making an impact across China and Asia. [easy-tweet tweet="Each eco-system partner brings individual experiences and innovation to the marketplace and the users" user="HuaweiUK" hashtags="tech, IoT, cloud"] The cloud revolution within the Huawei ecosystem has breathed new life into products and services. Businesses can scale easily, infrastructure costs are reduced, and Artificial Intelligence has been able to evolve. The implications this could have for the future are priceless - advances in intelligent medicine, intelligent workplaces, and transportation. Progress made in all these areas has the cloud to thank, as it has transformed our world. Huawei claims this has given the ecosystem it is building the right to be deemed the future of intelligence. The cloud is the pure essence of what Huawei are building with eco-Connect, and it is not only about digital transformation; it also carries connotations of a better-connected world. The Internet of Things tends to be top of interest lists when enthusiasts are asked. However, it is, in fact, cloud computing that drives everything we know in the sector today. Advances in technology are built upon the services offered, and this was highlighted time and again, from smart transport start-ups right through to smart TV systems being created. Partners are feeling the benefit in a market that is no longer about the big fish eating the little fish, but rather about the faster fish eating the slower one. "In today's digital era, there is a huge pressure on business to evolve to meet the demand of their customers. By working with Huawei, we will be able to enhance IoT, network and security and cloud solutions - helping our customers on their business transformation journey," states Didier Lejeune, General Manager at SCC France. Huawei offers the opportunity for companies that perhaps didn't have the power they needed behind them to become said 'fast fish'. [easy-tweet tweet="“Digital space over a physical place” - Alain Staron" user="HuaweiUK" hashtags="cloud, tech, IoT"] The spirit of this ecosystem is refreshing in a very dog-eat-dog world of business. It not only extends the reach of Huawei, but also gives other companies more connections within the sector, offering the opportunity to work with clients and peers that would perhaps otherwise have been unobtainable, and maybe even to make the next innovation we didn't know we couldn't live without. ### Secura Continues Expansion with the Addition of Two Highly Experienced Cloud Hosting Industry Experts Secura are pleased to announce the addition of two highly experienced cloud hosting professionals who will use their knowledge and expertise to support the continued growth of the business. Karl Robinson joins Secura as Sales Director and Anthony Doncaster will become Head of Pre-Sales. Both Karl and Anthony have an enviable track record of achieving success and growth within the cloud industry at StratoGen, and their already strong working relationship means they will make an immediate impact for Secura. 
Karl has over 16 years of sales and management experience and a deep understanding of the cloud hosting services sector. As Chief Commercial Officer at StratoGen, Karl was responsible for the business' global sales, both pre- and post-acquisition by Access Group. Anthony has over 10 years' experience working within managed services and hosting, and worked closely alongside Karl for four years as Solutions Architect at StratoGen. Anthony has a fantastic reputation within both the VMware hosting community and the wider industry for his technical knowledge and for delivering outstanding service and value for customers. Secura CEO, Ollie Beaton, commented: "Karl and Anthony are both fantastic additions to our team at Secura. They will bring years of combined industry experience, outstanding technical knowledge and reputations for achieving success, and it's a real testament to Secura's development over the last few years that we are now attracting talent of their calibre. It's an exciting time for everyone in the business and I look forward to working with Karl and Anthony as we continue to accelerate Secura's growth." Secura Sales Director, Karl Robinson, commented: "Secura are a dynamic cloud business with great technology, expertise and service that's really tailored towards the needs of the rapidly growing sector of tech-focused, 'always online' businesses in the UK. I think there is a huge opportunity for a provider with the right infrastructure and, importantly, the dedicated service and support that these businesses want to underpin their applications and revenue. It's an exciting new chapter in my career and I look forward to working with the team to support this new phase of growth." Secura Head of Pre-Sales, Anthony Doncaster, commented: "I'm thrilled to be joining a business with such a strong commitment to using cloud technology to really make a difference to customer businesses. The Virtual Private Cloud is an exciting platform to be working with - I'm looking forward to contributing to its continued development and working alongside such a passionate and knowledgeable team." ### Disruptive Pitch UK - Series 1 - Episode 2 Episode 2 of the reality TV technology pitch show, #DisruptivePitch. The UK's most aspiring technology entrepreneurs, start-ups and emerging companies pitch themselves to our panel of judges for a chance in the final. ### IT in 10 Years Time We are becoming an ever more data-consuming and interconnected world. In ten years, more people and nations will have internet connectivity that they depend on for their lives and businesses. With nations like India 'leapfrogging' technology such as landlines and instead adopting mobile technology, it would not come as a surprise if more and more internet networks are made up of wireless devices. Li-Fi (internet connectivity via light pulses) could become a major player in secure internet connections, as it is contained to where the light reaches, i.e. room walls. [easy-tweet tweet="In 10 years, more people will have connectivity they depend on for life and business." hashtags="tech, cloud, business, IT"] With this enhanced connectivity, ordinary people and companies will increasingly face new security issues, which we are already seeing today. This will probably be the most frightening aspect of new technological capability: security and data protection. The TalkTalk hacking scandal and the more recent Apple ransomware virus demonstrate that no one organisation or business is safe from these kinds of attacks. 
As technology evolves, new weaknesses will always be exploited, but that's not to say systems are weak or vulnerable; we will just need to be smarter about it. People need to be very cautious with the software and material they are downloading and streaming. In addition, they need to ensure that their security protocols are thorough and up to date, including anti-virus, web protection, and ad-blocking software. People will need to continue to simply avoid dodgy or malicious websites. Security issues come in a range of guises, including ransomware viruses (like CryptoLocker) and Trojans that sit on the host's machine and collect data without being noticed (like the Volatile Cedar malware). These programs can make using the same password for everything even more dangerous! Companies such as Microsoft are working on programs to enhance security measures, such as Windows Hello, which uses facial recognition in an attempt to challenge any and all identity-related security issues. The other issue is Big Data and the concept that everything we do is becoming 'data-fied'; in this respect, years from now we will live in a very data-heavy world, where data will be harvested, analysed and applied in all walks of life. There are already concerns over companies collecting data from children via the devices and applications they use, as well as over the privacy laws safeguarding minors, with many underage children using social media like Facebook, Twitter and Instagram without knowing that the pictures they post are essentially then 'lost' to the web. [easy-tweet tweet="Through Big Data, data will be harvested, analysed and applied in all walks of life." hashtags="bigdata, tech, cloud, IT"] In general, companies will start to move away from on-premise solutions like servers and instead adopt either hybrid or fully cloud-based solutions for their business. As more and more companies look into innovative cloud solutions and applications, less attention will be paid to improving current set-ups and hardware. PCs themselves may simply become dumb terminals, only used to access the internet, dashboards and portals for your data and emails. Microsoft's Continuum is looking to use phones as though they were your PC, with a clever docking station you use to project your mobile onto your office monitor. This, of course, opens up lots of ownership, BYOD and security issues, should the phone belong to the user and not the company. The power of mobile technology has increased so quickly that it is very possible that mobile devices will be used over and above PCs, and so the Continuum mentioned above could truly become a reality. Chromebooks and similar devices may also increase in popularity. The potential technological capabilities that will be available in 10 years are hugely exciting. But perhaps most exciting is the increase in ease and efficiency, and the shift to a more subscription-based way of paying for services and applications. This is already in motion with the adoption of cloud services that allow more and more people to pay per month, rather than face the large upfront costs of, say, servers or PCs. The cloud allows for increased accessibility and flexibility within the workplace, so people can work on the go, from the office and even at home. 
This is going to really transform employment in the next decade… mothers can return to work more easily after maternity leave or work from home for a period, people can work in international markets as they're not constrained to their desk, and people with disabilities have access to the equipment and software they need to aid them. [easy-tweet tweet="The cloud allows for increased accessibility and flexibility within the workplace" hashtags="tech, cloud, IT"] Virtual reality will become ever more popular for home and business use, and app automation for central heating, lighting, fridges and much more will be commonplace in 10 years. IT is becoming increasingly intertwined with our everyday lives, and while the opportunities for this are exciting, we need to be smart in how we protect ourselves from potential risks. ### The IT Asset Manager's Role in Cloud Adoption When people mention "The Cloud", we think of cost-effective, flexible solutions for data storage and service access. Less hardware, less software, and less fuss - but the cloud hasn't brought about the end of ITAM as many expected, far from it. Instead, it has brought about a change in IT Asset Management. Whether private, public or hybrid cloud solutions are the best fit will depend on your type of organization. Similarly, the IT Asset Manager's role will vary depending on the type of cloud service adopted. License terms, contracts, deployment and user management all shift when moving to the cloud, and as such raise compliance concerns - the IT Asset Manager's role has evolved. [easy-tweet tweet="When people mention cloud they think of cost-effective, flexible solutions for data storage and access." hashtags="cloud, tech, IT, data"] ITAM and the Private Cloud Offering higher security and privacy, cost savings and improved reliability and control, many organizations opt for a private cloud solution. For budgeting and cost optimization purposes, ITAM plays a key role in the adoption of a private cloud solution, and because the organization owns the hardware and software, ITAM is essential in transitioning to, and growing with, a private cloud solution. As the cloud service expands and upgrades, cloud management software will optimize capacity use, but it's ITAM's job to ensure the organization remains compliant. Robust policies and rules for requesting and registering new software, hardware and licenses will help keep tabs on the full IT environment and help predict changes in operational requirements for speedier adaptation to the organization's demand. ITAM and the Public Cloud Although ITAM will have no hardware to manage if the organization chooses a public cloud service, managing all deployed software and the licenses associated with it will still be necessary. Software Asset Management becomes more complicated in the cloud because the cloud vendor may request the right to access, copy and modify software used within their agreement. This may cause problems if that contravenes the organization's own licensing or third-party licensing, as the cloud vendor will not take responsibility for any noncompliance with software license agreement terms. As with private cloud solutions, policies for the purchasing and consumption of cloud services will make ITAM easier. Identifying which users are buying and using the services provides insight into compliance and, again, will help ITAM understand how to deliver the most appropriate IT services; a simple seat-reconciliation check of the kind sketched below is often the starting point. Private cloud solutions afford greater opportunity for tailoring IT services, which can result in reduced spend. 
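As an illustration of that kind of compliance check, the sketch below reconciles purchased seats against observed installations and flags over-deployment. It is a minimal example under stated assumptions: the data structures, field names and sample records are invented for illustration, and in practice the inputs would come from discovery tooling, the cloud provider's usage reports and procurement records rather than hard-coded lists.

```python
from collections import Counter
from dataclasses import dataclass


@dataclass
class Entitlement:
    product: str
    seats_purchased: int


def compliance_report(entitlements, installations):
    """Compare purchased seats with observed installs per product.
    Returns product -> (deployed, purchased, shortfall)."""
    deployed = Counter(install["product"] for install in installations)
    owned = {e.product: e.seats_purchased for e in entitlements}
    report = {}
    for product in set(deployed) | set(owned):
        used = deployed.get(product, 0)
        purchased = owned.get(product, 0)
        report[product] = (used, purchased, max(0, used - purchased))
    return report


# Illustrative data only; real inputs would come from discovery and procurement systems.
entitlements = [Entitlement("OfficeSuite", 100), Entitlement("CADTool", 10)]
installations = [
    {"user": "a.smith", "product": "OfficeSuite"},
    {"user": "b.jones", "product": "CADTool"},
    {"user": "c.wu", "product": "CADTool"},
]

for product, (used, purchased, shortfall) in sorted(compliance_report(entitlements, installations).items()):
    status = "OVER-DEPLOYED" if shortfall else "OK"
    print(f"{product}: {used} deployed / {purchased} purchased -> {status}")
```

The same reconciliation logic extends naturally to hybrid estates, where the installation feed combines on-premise discovery data with the public cloud provider's subscription usage.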
ITAM and Hybrid Cloud Solutions Blending private and public cloud solutions provides a buffer when capacity is reached during busy times - an e-commerce site, for example, may experience high volumes during a sale, or over the Christmas period. Known as "bursting" into the public cloud, this approach means the organization is not forced into purchasing additional hardware that will remain unused during quieter periods. If an organization opts for a hybrid solution, ITAM must manage all contracts for both the public and private cloud, which can be extremely complicated, but also extremely cost-effective. Compulsory but Changeable Ultimately, knowledge of software licenses and/or their cloud-based equivalents is what will make or break an organization's transition to the cloud. That may come in the shape of an experienced ITAM Manager, SAM Manager, a legal professional, or a combination of all three, to ensure a deep understanding of the regulations that come with cloud software and services. [easy-tweet tweet="Blending private and public cloud solutions provides a buffer when capacity is reached" hashtags="tech, cloud, data"] As well as knowing the IT environment inside out, ITAM must be able to anticipate changes in order to maintain compliance. Full visibility of installations, downloads, and usage comes from following strict rules when purchasing and using software, as well as recycling and reassigning licenses - in the cloud, it's likely that account details will need to be updated to reflect the new user. An effective SAM tool that can differentiate between perpetual licenses and cloud-based licenses will aid in ITAM's management of software. This isn't a possibility for all organisations, however, and so a sound ITAM or SAM policy should ensure your organization is using software and services correctly, within the terms of the software agreements. The adoption of the cloud has sparked an evolution in ITAM's role. ### Sony's Michael Harrit talks IP Live In a special interview, Michael Harrit, MD for marketing solutions at Sony, talks to Disruptive Tech TV and Compare the Cloud about IP Live and utilising the cloud for broadcasting. Harrit claims IP Live is "a complete new way of working" for studio, stadium, and outdoor/remote productions. By working with other major brands, the technology will be able to be implemented and upgraded to serve any need. The industry has truly come together By collaborating across the industry, the technology behind production has opened new doors to infrastructure options that enable the cloud. ### New VMware research: businesses look outside the IT department for next wave of technology disruption VMware, Inc., a global leader in cloud infrastructure and business mobility, today announces research which finds that almost two-thirds (60%) of UK business leaders believe the management of technology is shifting away from IT to other departments, as lines of business take charge of technology-led innovation in UK organisations. The research among 200 IT decision makers and heads of lines of business finds that this decentralization* of IT is delivering real business benefits: the ability to launch new products and services to market with greater speed (56%), giving the business more freedom to drive innovation (63%) and increasing responsiveness to market conditions (59%). There are also positives from a skills perspective, with the shift in technology ownership beyond IT to the broader business seen to increase employee satisfaction (53%) and help attract better talent (37%). 
This move, however, is not without its challenges. Leaders from across the business believe this is causing a duplication of spend on IT services (63%), a lack of clear ownership and responsibility for IT (62%) and the purchasing of unsecured solutions (59%). Furthermore, this decentralisation movement is happening against the wishes of IT teams, the majority (67%) of which want IT to become more centralised. In particular, IT leaders feel that core functions like network security and compliance (79%), disaster recovery/business continuity (46%) and storage (39%) should remain in their control.   “It’s ‘transform or die’ for many businesses, with a tumultuous economic environment and radically evolved competitive landscape up turning the way they operate,” says Joe Baguley, vice president & chief technology officer, EMEA, VMware. “Managing this change is the great organisational challenge companies face. The rise of the cloud has democratised IT, with its ease of access and attractive costing models, so it’s no surprise that lines of business have jumped on this opportunity. Too often, however, we’re seeing this trend left unchecked and without adequate IT governance, meaning that organisations across EMEA are driving up costs, compromising security and muddying the waters as to who does what, as they look to evolve.”   The ownership for driving innovation within organisations is not disputed among business leaders.  Almost 4 in 5 (77%) believe that IT should enable the lines of business to drive innovation, but must set the strategic direction and be accountable for security – highlighting the balance to be struck between the central IT function retaining control while also allowing innovation to foster in other, separate areas of the business.    “This isn’t ‘Shadow IT’ anymore, that’s yesterday’s story – this is now ‘Mainstream IT’,” continues Baguley. “The decentralisation movement is happening, driven by the need for speed in today’s business world: we’ve never seen such a desire for new, immediately available applications, services and ways of working. By recognising these changes are happening, and adapting to them, IT can still be an integral part of leading this charge of change. The latest technology or application will only truly drive digital transformation when it’s able to cross any cloud, to be available at speed and with ease, within a secure environment.”   George Wraith, Head of ICT at New College Durham, also supports this view “Business models are being disrupted by the movement of IT services to outside the IT department. This is leading to cost, management and security inefficiencies,” he says. “This trend is not going away, and gives IT the opportunity to adapt, regain control and embrace it.”   Decentralisation definition *The decentralisation of IT is when any employee within any business department of an organisation, other than the IT department, is making IT purchases or installing or maintaining software. It can also include employees using non-IT approved software, such as Dropbox, without the involvement of the centralised IT department. ### Why Cloud print is a "No Brainer" Cloud computing and storage have been adopted by UK businesses en masse, with other IT services like unified communications following hot on their heels.  Yet something that many have neglected to even consider sprinkling cloud gold-dust onto – an IT function that is absolutely critical to business processes at all organisations large and small – is printing.  
Print always seems to be left until last for some reason, but this situation could be about to change radically. First, let's get past the misconception that print-related considerations don't add up to a hill of beans in the typical IT department.  Gartner analysts estimate that up to 3% of an organisation's total revenue is spent on print, which can work out at thousands, if not millions, of pounds.  Make no mistake - printing represents a perfect opportunity for the cloud to add new value and support the business imperatives of increased service levels, productivity, and agility, all while reducing costs and carbon footprint. [easy-tweet tweet="Cloud computing and storage have been adopted by UK businesses en masse" hashtags="cloud, tech, business"] This is done by leveraging a managed cloud print service (MCPS) - an enterprise-wide print management system that is hosted in the cloud and helps to create a controlled print, copy and document environment.  If properly deployed, this will not only save time and money but also strengthen an organisation's ability to control risk.  As well as helping to avoid loss of valuable data and preventing unauthorised access, it can also ensure confidential documents don't end up in the wrong hands, limit device access to particular users and departments, and clearly identify key users. An MCPS uses a cloud-based print network to manage printing needs across the entire organisation.  For organisations spanning several offices nationally or globally, this is ideal. For smaller ones, it enables complete scalability. It's flexible, it's responsive, and it reduces the administrative headache of managing a printing network spanning tens, hundreds or even thousands of printers. It also helps to reduce paper, ink and energy consumption, saving money as well as the planet. Like other managed cloud services, cloud printing is developing to meet changing business needs. As printer fleets have evolved and expanded, many organisations find themselves operating multiple print servers to support their individual printing requirements. This brings advantages, by providing a more resilient network and avoiding heavy network traffic, but it also creates specific challenges, including a substantial IT cost and administrative burden. Typically, for each site a business operates from, it needs one server and one print server for every 100 print devices. This heavy dependency on physical print servers can lead to higher costs, device and driver deployment challenges, poor print management, and redundancy issues. By virtualising the print infrastructure, an MCPS replaces the traditional physical print server, hosted on-site, with a private print network that uses 'virtual print servers' hosted in the cloud.  As well as large financial savings, this means higher security, control, agility, productivity and network resilience. Unless you work within a typical IT department, you may fail to appreciate just how much of a drain on resources it is to operate a large print estate and its attendant on-premise server infrastructure. Freeing up these resources contributes to the 40 percent or so in cost savings that can be achieved by migrating print to the cloud. Many IT departments comment upon the benefits they've unlocked by allowing skilled internal staff to be redeployed to more strategically valuable duties than print server husbandry. [easy-tweet tweet="Cloud printing is a relatively new concept, but it is fit for purpose within every business sector." 
hashtags="cloud, tech, IT"] Of course, there are some organisations that remain concerned about the security issues raised by giving their print infrastructure over to cloud technology, using it as the reason to decline or defer this next leap forward. Again, as with the broader evolution toward the cloud, this resistance to cloud printing is rapidly dissipating.  With the right safeguards in place, increased digitisation and mobility should not compromise the security of data and documents. One of the more challenging objections relates to data residency and the need to understand the specific location of print data for compliance and governance purposes.  Rather than go the AWS/Azure route and leverage the public cloud, KYOCERA’s approach to reassuring customers with this concern is to provide a UK-situated private cloud environment situated within a recognised colocation facility with a very high level of resilience and availability. Whilst cloud printing is a relatively new concept, it is certainly fit for purpose within every business sector. The efficiency improvements offered to both large and small organisations, combined with the lessened security risks, will inevitably drive greater adoption. It may be the greyest part of your IT infrastructure today, but it could yet become your most golden. ### Is the job of a SysAdmin dead? Or just moving with times? The original role fulfilled by a SysAdmin may be dying out but that doesn’t mean their function within an organisation has disappeared. These days, they may just be operating under a different guise. It wasn’t very long ago that I worked for a company where it felt like all the SysAdmins were going through the five stages of grief. I think this was down to a dawning realisation that their current technology offering was no longer meeting users’ demands. [easy-tweet tweet="Customers became disgruntled as the technology and progression had become stunted." hashtags="tech, cloud"] Although there was a general appreciation among the SysAdmin team that cloud/mode 2 technology had an important role to play in the future of their organisation, there was also a fear their jobs would end up redundant if they fully embraced this approach – so they didn’t. The new reality of the cloud The SysAdmins were in the ‘denial’ phase of grief and were rejecting the new reality. This resulted in the team being very standoffish/dismissive about anything cloud related. Consequently, customers became disgruntled as the technology and progression had become stunted. By the time I left the business, the team had moved past the ‘denial’ phase and transitioned to ‘anger’ (nothing to do with me, honest). This was because those end users – who had become so unhappy with the slow turnaround of new services or solutions – had started to bypass the central IT team altogether, and were going directly to third party cloud vendors. In doing so, the end users had discovered they could get almost instant gratification with only minor feature sacrifices. SysAdmin to Cloud Architect The company I worked for was known for taking a long time to plan, prepare and execute projects. So much so, that a Windows 7 build was only being rolled out when Windows 10 was being announced. They are not the only organisation to struggle in this regard, but the difference is that the majority have adapted. Joe Fuller, CIO at Dominion Enterprise, claims roles within his team have been transformed due to user demand. 
He says: "It was the users in there and us catching up, now we're trying to get ahead of it. We are trying to change ourselves to cloud architects instead of rack-and-stack administrators. Our goal is to help our business units, because when some of these departments start doing IT, you get a bell curve – some do it great, others not so great." Missed opportunity By rejecting the cloud, the SysAdmin team in my old company had missed an opportunity. While the cloud would have reduced the tangible environment they managed, the need to provide hands-on assistance and implement new services would also have been reduced. They would have been free to focus on other priorities and get ahead of the technology curve. [easy-tweet tweet="A focus on cloud and mode 2 technology isn't something to be restricted to end users" hashtags="tech, cloud"] It would have freed up time to educate clients on important matters – like security and proper resource utilisation. By spending time educating users and putting in preventative measures (without imposing an overly controlling environment), it would have improved the users' experience along with overall safety. Bad things may still have happened, but having users more clued up would mean they would (hopefully) know when something wasn't right and contact support sooner. Moving with the times A focus on cloud and mode 2 technology isn't something to be restricted to end users or clients, however. There is also an opportunity to work more closely with other tech departments within the organisation and strengthen the services that can be provided. Instead of just reacting to problems, a former SysAdmin should now be involved in designing, developing and engineering systems. They should be willing to evolve and be ready to grow horizontally with various technologies, not just vertically into one. Their role in a business should no longer be only managing systems, but optimising them. This shift in responsibility and mentality follows the DevOps philosophy of breaking down silos and allowing for a smoother transition of tasks between departments. As this progresses, the idea of departments also starts to become more fluid, helping the overall business to run more efficiently. ### Arrow Cloud Global VP Steve Robinson on the importance of Cloud Distribution In an exclusive interview with Compare the Cloud, Steve Robinson, Global VP for Arrow Cloud, talks to us about the importance of cloud distribution, strategy for hybrid and helping channel partners from a financial perspective. Arrow looks to assist with digital transformation through cloud assessments of datacentre workloads. Each workload is profiled across three primary dimensions: financial, operational and technical. Once the profile is mapped, it is compared against public cloud offerings as well as datacentre and virtualised datacentre solutions, resulting in an objective view of where the workload should run: a public cloud, or an on-premises hybrid cloud model. ### PTC University Wins CEdMA Europe 2016 Impact Award PTC University has received the Computer Education Management Association (CEdMA) Europe 2016 Impact Award for its new-generation PTC Learning Connector. The PTC Learning Connector bridges education and product and solution development by providing PTC technology end users with training, technical support and customer-defined content in real time. CEdMA is the premier organization for education professionals on a management path within hardware and software companies. 
The organization selects its Impact Award winner based on two main criteria: the effect achieved in a specific department or for a particular solution in the field of training and education, and how that effect is measured. The PTC University development team met both criteria with its new Learning Connector. Based on the "In Product Learning & Support" concept, the new tool smoothly combines digital product and solution development with real-time access to a broad range of information and training content. Users can refer to free training information provided specifically for each work step within the program they are working on, as they need it, so that they have the right training material at the right time. Additionally, numerous PTC University eLearning libraries, which are available as separate licenses, provide further content for in-depth study of specific topics. Both new and existing customers have found the tool highly appealing, which is impacting positively on revenue derived from the PTC University eLearning courses. "Theoretical information learned in traditional time-consuming, block-training courses is often lost once the developers and engineers are back at their desks working on real projects,” said David Cleverley, vice president PTC Customer Success. “We want to change this by providing our customers with answers and solutions precisely when they are needed. With the PTC Learning Connector, we have not only managed to dynamically combine eLearning with practical application, but also to excite many of our new and existing customers about our eLearning content. Winning the CEdMA award has pleased us greatly, because, as we know, an idea is only as good as its subsequent implementation, and in this case both were decisive." ### Romania, the EU cloud alternative for the UK Romania is currently ranked 3rd in the world for Internet download speed, after Singapore and Hong Kong, according to InternetSociety.org. It is home to highly trained IT professionals and is already a major exporter of IT platforms and solutions. In the past few years, the competition between IT companies (software, automation, etc.) has increased significantly, leading to a concentration of this industry around the cities of Cluj-Napoca and Bucharest, which have become main points of interest in Europe’s IT industry. [easy-tweet tweet="Romania is beginning to be recognized as a reliable Cloud and Datacenter service provider" hashtags="cloud, tech, data"] Romanian IT specialists benefit from extensive expertise, incorporating nearly every technology emerging on the IT market almost as soon as it appears. The development of IT companies has also brought about large investments in solutions such as “computing power”. Modern and well-equipped data centers have been developed, serviced by highly trained IT professionals, which can successfully compete on a European and an international level. AdNet is one of the pioneers of Data Center solutions in Romania. Its investments have materialized into two datacenters, one in Cluj-Napoca and one in Bucharest, thus developing a solid platform for Cloud solutions (VMware, Cisco UCS, HPE 3PAR, etc.) which mainly targets the Romanian and European markets. [easy-tweet tweet="Romania can deliver secure and reliable services at affordable prices." hashtags="tech, cloud, services"] Considering the speed and the quality of internet connections in Romania, the access time to any destination in the EU is comparable to local solutions. 
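One rough way to sanity-check that access-time claim for your own workloads is simply to time a few HTTPS round trips from your location to the providers you are comparing. The sketch below is illustrative only; the endpoint URLs are placeholders, not AdNet's actual addresses.

```python
# Rough, illustrative latency check: time a handful of HTTPS round trips to
# candidate endpoints. The URLs below are placeholders (assumptions), not the
# real endpoints of any provider mentioned in the article.
import time
import urllib.request

endpoints = {
    "local provider": "https://example.com/",
    "Romanian datacentre": "https://example.org/",
}

for name, url in endpoints.items():
    samples = []
    for _ in range(5):
        start = time.perf_counter()
        urllib.request.urlopen(url, timeout=10).read(1)  # read a byte so the request completes
        samples.append((time.perf_counter() - start) * 1000)
    print(f"{name}: median round trip {sorted(samples)[len(samples) // 2]:.0f} ms")
```

Figures gathered this way include DNS and TLS handshake time, so they overstate pure network latency, but they are usually enough to tell whether a remote datacentre feels "local" for a given application.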
Given the existing infrastructure, we can argue that IP and IT boundaries no longer exist, which is why, if you are in search of Cloud and Datacenter solutions (IaaS/PaaS/SaaS), this country and, in particular, this company can deliver secure and reliable services at affordable prices. The IT security level in Romania is high. For instance, to date, no financial institution has been seriously compromised by a cyber-attack. At the same time, Romanian laws allow strong privacy conditions to be provided to clients. No datacenter in Romania has ever fully crashed as a result of a cyber-attack, even for a second. No terrorist attacks have ever taken place in Romania; it is a NATO member country and currently hosts two multinational military bases. Companies such as AdNet have grown consistently over the past few years, expanding their client portfolio on external markets as well as the domestic one. At the beginning of November, clients from 28 countries were using AdNet’s Cloud solutions. External clients now generate 50 percent of its total revenues, thanks to high-quality services (state-of-the-art high-performance equipment), premium support (engineers available 24/7 to resolve any issues which may arise) and competitive pricing (the average price per vCORE in the UK is at least 50 percent higher than the average vCORE price in Romania). [easy-tweet tweet="Romanian IT specialists benefit from extensive expertise" hashtags="tech, cloud, data"] Language barrier? Not an issue: more than 40 percent of Romania’s population speaks English, and more than 80 percent of the people employed in the IT field do. Romania is beginning to be recognized as a reliable Cloud and Datacenter service provider, and no longer only for one of the world’s most beautiful roads according to Top Gear (the famous Transfagarasan), sandy beaches, a picturesque mountainside, captivating history, good food, and welcoming people. Go Cloud, go Romania! ### Why the healthcare sector should self-prescribe cloud eSourcing In an age of ongoing reviews and inspections, the pressure is on for the healthcare sector to deliver high-quality care. The Care Quality Commission (CQC) is England’s independent regulator of all health and social care services, and its aim is to ensure that fundamental standards of quality and safety are met. It publishes the findings of these reviews, meaning that healthcare organisations that don’t meet those standards risk falling into disrepute. One way of ensuring that they meet CQC standards is via a clinical audit run by the Healthcare Quality Improvement Partnership (HQIP), an independent organisation responsible for managing clinical audit contracts on behalf of NHS England. It works with healthcare organisations to identify those areas where the quality of care they provide can be improved. It was launched to promote quality in healthcare, and in particular to increase the impact that clinical audits can have on healthcare quality improvement. But this can only happen if you have the provision to source high-quality medical and scientific experts who can help conduct a successful audit. And to do this, best-practice tendering processes need to come into play. [easy-tweet tweet="The NHS’ aim to be paperless by 2020 is driven by the efficiencies that it will bring" hashtags="NHS, tech, cloud"] The HQIP is a prime example of an auditing body rethinking its procurement approach to boost efficiency. 
Now using Wax Digital’s cloud-based web3 eSourcing, it can publish tenders electronically and make use of existing templates, while giving suppliers the ability to submit responses online. A tool of this kind gives HQIP the ability to source experts quickly while offering a mix of automated and manual scoring facilities, and managing subsequent contract awards too. The NHS’ aim to be paperless by 2020 is driven by the efficiencies that it will bring, primarily cost savings. Thanks to its auditing body making a similar move by ridding the procurement process of paper, it too can make efficiencies and speed up its processes. Improving its contract management processes means that it can spend more time focusing on how the healthcare organisations it audits can be improved. Judith Hughes, interim head of procurement at HQIP, highlights the benefits of the speed and efficiencies offered by intuitive cloud-based software: “As we’d aimed for, Wax Digital web3 has greatly improved our processes. Moving away from paper-based tendering has significantly reduced the time it takes to review and award teams for projects. It has also helped further ensure our quality guidelines are upheld and we now have a much more efficient way of engaging with our suppliers and them with us.” But the benefits of eSourcing go beyond making the procurement process more efficient. We’ve seen procurement departments in healthcare organisations integrate more easily with the clinical team thanks to eSourcing technology, and entering into discussions about equipment and supplier choices can happen much more naturally. Procurement can also spark up a more successful relationship with the finance department, as having access to accurate spend data means that it can work towards its savings goals. And of course, procurement is about more than just saving money; thanks to eSourcing technology, healthcare organisations can source durable, quality equipment and find it at the best price. For example, hospitals have reason to select a more expensive piece of equipment if it significantly reduces the likelihood of the patient requiring further care. eSourcing enables procurement to treat different health situations on a case-by-case basis like this. [easy-tweet tweet="Procurement can also spark up a more successful relationship with the finance department" hashtags="tech, cloud, NHS"] Innovating the procurement process can spearhead a change across the organisation, and organisations like HQIP have realised the benefits of turning contract management digital. In the healthcare sector particularly, we’re seeing many organisations make their supply chains more efficient by rolling out eSourcing technology and using the cloud to store all relevant information in one place. And given that a clinical auditor that oversees healthcare has taken that route, the sector has taken a further step towards procurement efficiency. ### Strategies for beating ransomware need to start right now According to the FBI, ransomware attacks have increased 35-fold during 2016, resulting in an estimated US$209 million paid out every quarter. No one is immune, with attacks impacting hospitals, schools, government, law enforcement agencies and businesses of all sizes. The increased frequency of attacks has organisations thinking differently about their approach to ransomware, writes CTERA’s Jeff Denworth. 
[easy-tweet tweet="Increased frequency of attacks has organisations thinking differently about their approach to ransomware" hashtags="tech, security"] Ransomware has reached the heights of a global IT epidemic that every organisation has either already faced – about 50% of organisations have been breached – or will almost certainly face as the pace of cyber-attacks increases on a daily basis. The costs are spiraling, with ransomware payments being made via anonymous bitcoin transactions, where it can cost anywhere from $500 to $2,000 to unlock an average PC. The anonymity makes it difficult to know precisely how many payments have been made to cyber-criminals. What we are seeing is an exponential spike in ransomware activity, which looks likely to become a US$1bn business this year. Compounding this, the success of ransomware encourages cyber-criminals to introduce new strains of malware and increase their efforts. Two especially nasty tweaks to ransomware are starting to emerge: Certain cyber-criminals are capturing data that ransomware can copy out of your network for the purposes of selling it to interested third parties, enabling industrial espionage. There have also been reports of cases where customers have paid ransomware attackers and then never received the encryption keys for decrypting their PCs in return. The problem, then, is substantial and getting larger, so now is the time to start fighting back and putting countermeasures in place. Countermeasures to fight back against crypto-malware: Step #1. Secure your perimeter to minimise the chance of breach: Patch your operating systems and keep them up to date. Train employees on ransomware and their role in protecting the organisation’s data. Disable macro scripts from office files transmitted over email. Limit access to critical and rapidly changing datasets to only need-to-know users. Step #2. Back up all files and systems to avoid paying ransom to recover from crypto events. Back up your endpoints, back up your file servers. Implement lightweight, optimised data protection tools that minimise recovery points. For the last 20 years, the market has been conditioned for daily backups. Whether we’re talking server or endpoint backup, in both cases file storage systems have been built for relatively lax backup intervals because backups have been expensive, requiring lots of CPU, lots of storage and too much time, and organisations haven’t had to deal with an explosion of file-locking malware attacks. The use of legacy backup software becomes a major issue for organisations where knowledge workers are continuously storing data on PCs and file shares. For example, an organisation that has 1,000 knowledge worker employees, with file access by power users and IT teams, has all of its file shares vulnerable. Daily backup using legacy tools leaves 24 hours of work unprotected, which equates to 2.73 years of cumulative lost productivity. That demonstrates how legacy backup tools can have real costs for organisations that are routinely faced with crypto-ransomware. Modern backup solutions, including CTERA’s, can enable organisations to achieve a finer degree of backup interval granularity through the use of global, source-based deduplication, incremental-forever versioning and the ability to track file changes without doing full system scans. 
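To make the deduplication and versioning idea concrete, here is a minimal sketch of source-side chunk hashing, where a backup client uploads only chunks it has not already stored while keeping a per-file version history. It is an illustration of the general technique only, not CTERA's or any other vendor's actual implementation; the chunk size, in-memory store and manifest format are all assumptions made for the example.

```python
import hashlib
import time

# Minimal sketch of source-based deduplication with incremental versioning.
# Chunk size, the in-memory "store" and the manifest format are assumptions;
# a real product would persist these to remote, protected storage.
CHUNK_SIZE = 4 * 1024 * 1024  # 4 MiB fixed-size chunks (assumed)

chunk_store = {}   # sha256 digest -> chunk bytes (stands in for backup storage)
manifests = {}     # file path -> list of (timestamp, [digests]) versions


def back_up(path):
    """Split a file into chunks, store only unseen chunks, record a new version."""
    digests = []
    with open(path, "rb") as f:
        while True:
            chunk = f.read(CHUNK_SIZE)
            if not chunk:
                break
            digest = hashlib.sha256(chunk).hexdigest()
            if digest not in chunk_store:   # dedup: skip chunks already held
                chunk_store[digest] = chunk
            digests.append(digest)
    manifests.setdefault(path, []).append((time.time(), digests))


def restore(path, version=-1):
    """Rebuild a file's contents from the chunk store for a chosen version."""
    _, digests = manifests[path][version]
    return b"".join(chunk_store[d] for d in digests)
```

Because each run uploads only new or changed chunks and adds a manifest rather than overwriting the last one, the interval between protection points can shrink from a nightly job to minutes, which is the property that matters when recovering from a crypto-lock event.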
That said – default settings for even the most efficient tools are anywhere from four to eight hours, which is nearly a full business day. Therefore, the same problem could essentially persist. In spite of the relatively large recovery point, data protection tools – or backup – will always play an instrumental role as a ransomware countermeasure, in large part because of backup software’s ability to recover full systems and system profiles. The Petya virus, for example, forgoes single-file encryption and will simply lock up a full desktop hard disk. These types of viruses create the need for simple tools that can help with full PC restores, and backup software fits the bill. The lines of data protection are becoming increasingly blurred between NAS and backup software as Enterprise File Sync and Share, a self-protecting file management and collaboration tool that provides user-level storage and file recovery tools, emerges into the market. These tools create incremental versions of files as they are changed and updated, and files are protected on an ‘event’ basis (a file save) as opposed to a ‘scheduled’ basis (a pre-defined backup interval). [easy-tweet tweet="The lines of data protection are becoming increasingly blurred" hashtags="tech, cloud, security"] The result of an ‘event-based’ data protection agenda is much more compelling than a scheduled backup strategy that protects user data at 24, 12, six or even one-hour intervals. Philosophically, back up everything The FBI is right to advise that you back up your systems. You’ll want to recover desktops and servers without Herculean recovery efforts, and modern backup tools can make it simple to protect systems and easy to recover full profiles. CTERA Enterprise File Sync and Share and Sync Cloud Storage Gateway models can publish and version file updates in less than five minutes. In the event of a ransomware attack, you can recover your desktop with CTERA backup software and then roll back to versions of folders that were stored in CTERA Sync products, to ensure you are recovering to the most recent file state. CTERA and our customers have been fortunate to catch and recover from some very serious ransomware crypto-lock events. Using very granular file sync and backup procedures, CTERA customers have minimised their recovery points to as little as five minutes – in comparison to 24 hours or more with competing technology. With the right data protection tools, they’ve successfully saved themselves from paying hundreds of thousands of pounds in ransom and minimised the period of business outage. [easy-tweet tweet="The FBI is right to advise that you back up your systems." hashtags="tech, cloud, ransomware"] The only way we can put an end to this epidemic is by building the right safeguards that eliminate enterprise vulnerability and end the need to pay cyber-criminals to access our data and our systems. Whether you choose CTERA tools or any number of other approaches to safeguarding your organisation, choose to be prepared and please do it right now. ### It’s Time to Clean up Your Cloud CRM Data CRM data becoming unreliable? Can’t segment your data easily? Unable to produce accurate reports without manual intervention? Then it’s time to clean up the data in your CRM. While many implement a CRM solution in the hope that it will transform their business, unless it is embraced by everyone in the business and utilised correctly, it’s unlikely you’ll see the expected returns. 
So, what can you do to ensure your cloud CRM is a solution that works for your business? Keep in mind these top tips! [easy-tweet tweet="What can you do to ensure your cloud CRM is a solution that works for your business? " hashtags="CRM, tech, cloud"] Find Out Why It’s Failing If you want to find a solution, then it’s always best to start by identifying the problem. When you’ve invested a lot of money into a CRM system, you expect it to deliver the results you were after. Failure Number 1 – It’s perceived as technology only Customer relationship management is a business strategy; the software is there to facilitate this. You need to implement a business-wide shift to improve customer relations, with a view to improving retention. Failure Number 2 – Not updating working practices When you implement a new system, it’s time to make some changes to how you use it. Many businesses fail to update working practices due to a fear of reducing user adoption. However, this will mean you encounter the same challenges you did with your old system. To ensure you have a more efficient CRM process, use the opportunity to update your way of working to reflect the changes in the new solution. Failure Number 3 – Importing poor data If you’re just starting out then you won’t have this issue; however, if you are rolling out a new system then you’ll need to bring across your existing data. Now is the time to start fresh and clean the slate. Conduct a thorough cleanse before you import data into your new CRM. [easy-tweet tweet="Conduct a thorough cleanse before you import data into your new CRM." hashtags="CRM, tech, cloud"] Failure Number 4 – Users failing to adopt If your employees aren’t using your CRM properly or at all, then you’re unlikely to get any value from it. Provide training to ensure all team members know the correct processes to follow. Creating a CRM Culture Culture is fundamental to your CRM success; if you’ve got a customer-centric culture in your business then you are more likely to deliver a better customer experience, which will boost customer retention, thus increasing sales revenue. So, how can you achieve this? Communicate your goals – it’s no good if it’s just management who know the value that being customer-centric can bring to the business; your employees need to know this too. Share your vision, so that the rest of the business understands the value too. With everyone on board, you are more likely to get buy-in. Educate employees on the value it adds to their role – if explaining the benefits it will bring to the business fails to ignite excitement, then tell employees how a CRM will add value to their day-to-day activity. Ultimately, this is what matters to them, so communicate the benefits they will see if they use the solution. Tell everyone – the culture you create within your business needs to be widespread; it’s no good if just a handful of employees or one team is using the CRM; everyone should be aware of how to use it and how it will improve the business. Don’t leave anyone out – you should take care to speak to those who are directly involved with the CRM to ensure that any of their queries or common issues are addressed. They will be able to tell you what works and what doesn’t, and by including them in the whole process they are more likely to have a vested interest in making the CRM a success. Create champions – appoint CRM champions to shout about the software solution to the rest of their team. 
A representative from each department will be able to provide you with valuable feedback and aid colleagues in using the CRM. Reinforce using the CRM – once the solution is live, the hard work doesn’t stop there; now’s the time to keep up the momentum. Incentives and new KPIs can help to keep employees engaged and invested in the cause. Defining Data You’ve taken the steps of identifying failures and implementing a CRM culture, two things which will undoubtedly help you on your way to clean data. Next, you need to define what data you require. Step 1 – Define important data Look at what data you already have and decide what additional data you require. You want to ensure the data you hold will allow you to understand and anticipate your customers’ needs more effectively. Step 2 – Identify how to record and report After you’ve decided what data you need, you will need to store and report on your findings. Don’t fall into the trap of adding data here and there; it’ll end up cluttering your screens and muddling the data you do have. Step 3 – Educate Communication is essential to ensure others see the value of keeping accurate records. Once they can see the value to themselves and the business, they are more likely to take the steps to data nirvana. Step 4 – Begin collecting If you require new data to input into your CRM, then you will need to contact existing customers and prospects in order to obtain this information. It’s best to do this as part of a data cleanse exercise, allowing you to kill two birds with one stone. Implement Quality Processes Lastly, you are going to need to implement data quality processes. While there will always be some element of human error in your CRM, you can introduce measures to try to prevent this from occurring. Measures you introduce should help to minimise mistakes and make them easy to correct. Here are some steps you can take: [easy-tweet tweet="Measures you introduce should help to minimise mistakes and make them easy to correct." hashtags="CRM, tech, cloud"] Introduce mandatory fields – mandatory fields require data to be input before you can move to the next step. Picklists – these remove the need for manual data entry and reduce mistakes. They can also make reporting simpler. Regular data cleansing – data degrades quickly, so you will need to review data regularly and conduct a thorough cleanse. Dedupe routine – duplicate records can affect customer experience, so it’s imperative that you undertake a regular dedupe routine. ### In an age of automation, don’t write off people just yet  If you believe the hype, then the UK’s 376,000 contact centre and customer service workers will soon be out of a job. According to the BBC, occupations in the contact centre industry are among those most at risk of being taken over by robot workers in the future – placing them 109th in a ranking of 366 jobs in the UK. The rise of automation will undoubtedly play a huge role in the industry, taking on simple, repetitive tasks and answering basic queries. The benefits are obvious – robots can work 24 hours a day, seven days a week and only take a few seconds to be ‘trained’ on a new topic. [easy-tweet tweet="The rise of automation will undoubtedly play a huge role in the industry" hashtags="tech, cloud"] But this doesn’t mean humans are on the way out. I believe people in the contact centre industry have a bright future and to write them off entirely would be to ignore the strategic importance of human interaction to a brand. 
The contact centre is increasingly becoming the only place that customers interact directly with a brand, more specifically with the people on the front line that represent the brand. This is particularly true for e-commerce companies that don’t have the presence of a physical store. In an era where brands are defined by the service they provide, not just the products they sell, the actions of the contact centre team are fundamental – they are the customer experience that makes a brand what it is. Contact centres are full of highly-trained, knowledgeable people with problem-solving and negotiation skills that are very difficult to replicate with automation. Then there are the ‘softer’ skills – like showing empathy and charm – which are virtually impossible for robots. This is even more vital when you consider that customers will often get in touch with a contact centre when something has gone wrong and are looking for help. We’re all too familiar with the ‘computer says no’ approach. It’s when human call centre agents start showing ‘robotic’ traits and refuse to deviate from a script that customers are likely to become frustrated by an interaction on the phone. Building a relationship and creating a dialogue is key to getting to the bottom of a problem and making sure it’s dealt with properly. Unless you consider fictional films like Weird Science or Her, no one has ever built a relationship with a computer. [easy-tweet tweet="No one has ever built a relationship with a computer outside of sci-fi films like 'Her'" hashtags="tech, cloud, "] Yet, ironically, underinvestment in technology is probably the biggest threat to humans continuing to take the leading role in contact centres. With more and more being demanded of contact centre workers, as the front line of a brand’s customer experience, technology is vital to ensure agents have the tools they need to provide stellar service. The right technology can make sure customers are connected to the right agent with the right expertise first time. Technology can match a customer to any agent in the world who is able to answer their query to cut down on waiting times. Agents can have the full history of a customer’s interactions with a company at their fingertips so that a customer doesn’t have to repeat their query again and again. By selectively using technology in the contact centre environment, we can help agents to not only let their unique human skills shine but become superhuman. Solutions like 8x8’s Virtual Contact Centre and EasyContactNow are built to enhance the great customer experience contact centre agents are able to offer by making sure customers are always sent to the right agent, with the right expertise, at the right time. By taking care of the basics, agents are freed up to focus on solving a customer’s problem, not wasting time asking functional information. Automation is undoubtedly here to stay. It’s now a common feature in many parts of our lives, from helping us pay for our supermarket shopping, to giving us directions through a smartphone, to allowing us to check ourselves in at the airport. Contact centres will no doubt be affected but it’s important they are part of this automation movement, rather than ignoring it. [easy-tweet tweet="Technology can match a customer to any agent in the world who is able to answer their query ." 
hashtags="tech, cloud"] The successful contact centre of the future will be able to harness the best of both the human and automated worlds, to give customers the service and experience they want. ### Cloud computing start-up UpCloud raises €4 million from investors Today, Finnish cloud computing start-up UpCloud announces that it has raised €4 million in its first round of external funding, led by Nordic venture capital company Inventure. Formed in 2011, UpCloud offers cost-effective and high-performance cloud computing from four data centres located in Helsinki, London, Chicago and Frankfurt. The company’s technology, which diligently follows privacy regulations, is built in-house to offer better performance, redundancy and value for customers. With the extra funding, UpCloud will expand its team as well as finance the opening of new offices and data centres internationally. In spring 2016, UpCloud was awarded the highest performance in the infrastructure space in a benchmark released by market research company CloudSpectator. The research was conducted amongst 19 cloud computing companies across Europe and included the likes of Google Cloud, Amazon Web Services and Microsoft Azure. Now, UpCloud aims to become the defacto global European alternative in the high-performance cloud space, with this initial round of funding consolidating the company’s position as a real contender in the market. “We have built value for our customers from the start and enjoyed excellent revenue growth. Our ambition is to offer our services to customers globally at a faster pace, and this is the next step on that path,” said Antti Vilpponen, CEO of UpCloud. “We at Inventure are happy to further boost the growth of UpCloud, supporting the company in expanding to new markets and customer segments, which will all benefit from the superior cloud performance UpCloud offers,” commented Sami Lampinen from Inventure. UpCloud’s commitment to providing clients with the utmost service and value has been recognised by industry third parties. UpCloud was recently named the third fastest growing technology company in Finland in Deloitte’s annual Fast-50 listing, with a growth rate of nearly 1,200% over the last four years. “A major shortcoming in cloud computing has always been poor and fluctuating performance. We have, over the years, solved many difficult challenges with our own R&D and technology and I am very happy to confirm that we are now able to spread our world-class technology faster, and to an even wider audience,” says Joel Pihlajamaa, CTO and Founder of UpCloud. ### The 21st century space race In the 1960s, the United States of America and the Soviet Union battled for technological superiority and political supremacy. The Space Race captivated the imaginations of many as man, woman, and child looked up at the moon and dared to wonder where mankind would set foot next. Our endeavours since then, however, have failed to captivate in quite the same way. Satellite launches and interplanetary probes just cannot capture the imagination in quite the same way, at least not for the majority of observers. [easy-tweet tweet="The Space Race captivated the imaginations of many a man" hashtags="tech, space, cloud, future"] Recently, however, the beginnings of a new space race have emerged, contested not by global superpowers, but by private corporations seeking economic gain. SpaceX, a company spearheaded by Elon Musk, is one of the frontrunners and has ambitious plans to make humanity an interplanetary species. 
Despite setbacks plaguing SpaceX’s Falcon 9 launches, Musk’s enthusiasm remains as strong as ever. Just last month, the entrepreneur released a video simulation of his $10 billion Interplanetary Transport System, alongside provisional flight timetables and engineering details. Of course, bringing these plans to fruition will be far from easy, and Musk recognises as much. The company’s current outgoings for rocket development are in the region of tens of millions of dollars a year. The rocket that will take humans to Mars is estimated to cost $10 billion. What’s more, creating a self-sufficient colony on the planet – Musk’s ultimate goal – would require 10,000 flights and a timeframe of at least 40 years. But despite these challenges, SpaceX is far from the only player in the new space race. [easy-tweet tweet="We dare to wonder where mankind would set foot next." hashtags="cloud, tech, spacerace"] Last week, Boeing CEO Dennis Muilenburg put forward his own company’s ambitions. “I'm convinced that the first person to step foot on Mars will arrive there riding on a Boeing rocket," he said, while also taking a not too subtle dig at new companies like SpaceX for lacking “substance.” Although these companies have significant financial clout of their own, any successful mission to Mars will surely require both private and public investment. Ultimately, that old head from the original space race, NASA, may have the deciding say on which firm gets to Mars first. SpaceX has built up a relationship with the space agency by sending cargo to the International Space Station, but Boeing’s long history in aviation will surely work in its favour and the company was recently confirmed as the primary contractor for NASA’s Space Launch System. New race, new players Although NASA is likely to be involved in some way, the 21st century space race is shaping up to be very different from the one that captivated audiences in the 1960s. In theory, any private company with enough financial backing could attempt the first manned mission to Mars and that comes with a number of risks. Already NASA has stated that it would offer advice and criticism if it felt that a private company was taking unnecessary risks with its astronauts, but the space agency is ultimately powerless to regulate the actions of private companies. Unfortunately, the openness of this space race has raised concerns about some of its participants. Mars One is a not-for-profit organisation aiming to set up a permanent human colony on the red planet. Although the company has already selected a pool of 100 candidates for the one way trip, it seems otherwise underprepared. Its 2025 timeframe has been called wildly optimistic and its $6 billion budget appears as undercooked as it is unachievable. When all this is coupled with the now-abandoned idea of using reality TV to raise funds and the worrying testimony of the candidates involved, Mars One begins to look more dangerously slapdash than simply naïve. [easy-tweet tweet="This new endeavour will surely be more of a collaborative one" hashtags="cloud, tech, spacerace"] Whoever ultimately wins the 21st century space race, it is unlikely that they will do so without outside help. Instead of the dichotomy of US vs USSR, this new endeavour will surely be a more a collaborative one, taking its cue from other international space missions of recent years. The private companies hoping to put mankind on another planet for the first time in its history share a great opportunity, but also a huge responsibility. 
Just like the Moon landing in 1969, a manned mission to Mars promises to be a defining event, scientifically, culturally and historically. ### The rise of the cloud employee The workplace is changing, and more and more it’s becoming about distributed teams, collaboration, and connectivity. Gone is the employee based in the international office expected to travel from town to town. With today’s technology, the airport lounge veteran has been replaced by the cloud employee: the individual who can work from just about any location via any device because they are connected via the multitude of devices, apps, and data residing in the cloud. And it’s not just the cloud employee – the new economy is made up of companies who have built their business around cloud employees. [easy-tweet tweet="The workplace is becoming about distributed teams, collaboration, and connectivity." hashtags="tech, cloud, business"] Shaping the cloud workforce IDC predicts that two-thirds of the CEOs of Global 2000 companies will have digital transformation at the center of their corporate strategy by the end of 2017. More and more companies in the new economy are ditching their offices in favor of a workforce made up of cloud employees. In fact, this was exactly the decision made by Matt Mullenweg, founder of WordPress and CEO of Automattic. Automattic is a completely cloud-based company, which means that its 400 staff work across 40 countries, from wherever they are in the world. It could be home, a coffee shop or a co-working space. Although this model may seem extreme to some, this paradigm is influencing the essence of how people are running their businesses - with many cloud employers making the switch citing cost savings and a more productive workforce. Tools of the trade With new technologies and applications allowing people to work remotely, it’s easier than ever for employees to collaborate with co-workers and receive updates globally – without the need for in-person interaction. Additionally, as a new generation enters the workforce, organizations are looking to equip millennials with the right technologies and adapt to changing working practices. By 2020 it’s estimated that millennials will make up over half of the global workforce, a generation who have grown up with technology in their back pocket and use an array of tech to help ease everyday life. According to the 2016 Future Workforce Study, three in five millennials believe that the adoption of improved, collaborative communication technology and remote teams in the workforce could soon make in-person meetings redundant. [easy-tweet tweet="Communication technology and remote teams could soon make in-person meetings redundant." hashtags="tech, cloud, business"] Although office space may be good for encouraging communication and collaboration between employees, more often than not even a small distance or partition can have an impact on employee relations. So if a different floor has the same impact as a different office, why not allow your employees to be based in different countries! For cloud employees, here are some tips for being successful in an always-connected world. Block out time Whether remote or in the office, we can run from meeting to meeting or suffer endless interruptions. Block time on your calendar for you to do your work. Ensuring you have the time to address your action items or workload is critical. Make time for your life Beyond blocking time for your workload, make some time for your family and friends, or just rest. 
Recharging is real and required. If you don’t, your productivity will go down. Maximize the mobile moment The great thing about our smart devices is that we can respond to things quickly – not letting them build up on us. Banging out three quick emails from your phone can keep things moving smoothly, and keep emails from piling up. BYOC (Bring Your Own Cloud) The mobile office is more important than ever. So pick the tools that help your work and share your favorite apps with your customers and colleagues. They may not know about that killer app you’ve been using and may welcome being introduced to it. Try alternatives to email Try Slack, HipChat, CoTap, or one of the new team messaging solutions. The instant communication helps to cut down on emails and connect everyone faster. Fewer meetings and more conversations The quick conversations we have in the hallway or at the cubicle can now be done in the cloud for any remote worker. Take the lead, jump online for quick discussions and brainstorms. Get answers quickly, then get off. The company cloud Organizations and businesses will most likely have their preferred apps and solutions – so make sure you’ve got them! ### Cohesity Delivers Cloud Integration for Scalable and Cost-Effective Secondary Storage Across On-Premises and Cloud Infrastructures Cohesity, the pioneer of hyperconverged secondary storage, today announced a new product that extends the benefits of its on-premises consolidated secondary storage solution to public cloud infrastructure. Running on Amazon Web Services (AWS) and Microsoft Azure, Cohesity DataPlatform Cloud Edition (CE) enables businesses to take advantage of the scalability, convenience and cost-effectiveness of the cloud for data protection, DevOps, and disaster recovery workloads. The fragmentation of today's storage solutions means that many companies struggle with seamless data portability from on-premises to the cloud. Pushing data to the cloud often requires special gateways and data format conversions. As enterprises increasingly leverage the cloud for software development operations (DevOps) as well as off-site data protection, businesses require a seamless and simple platform that can quickly move data between on-premises and cloud infrastructures. Cohesity solves this problem by providing native replication from on-premises DataPlatform deployments to cloud DataPlatform CE deployments. By leveraging Cohesity's underlying CloudReplicate feature, this solution instantly replicates local storage instances to remote cloud services. Seamless integration with Microsoft Azure and AWS empowers companies to immediately unlock the scalability and economic benefits of cloud computing for a range of use cases including: Running Cohesity DataPlatform as a virtual appliance in the cloud Transferring data from on-premises Cohesity clusters to the cloud Leveraging cloud services for DevOps and disaster recovery workflows Cohesity delivers a hyperconverged secondary storage system for consolidating backup, test/dev, file services, and analytic datasets onto an infinitely scalable, intelligent data platform. Cohesity DataPlatform CE extends the solution's current cloud integration capabilities to give customers even greater flexibility to move new storage workloads to the cloud. "With the rollout of Cohesity DataPlatform CE, all the benefits of an intelligent, web-scale storage system are now available via the cloud," said Cohesity Vice President of Marketing and Product Patrick Rogers. 
"DataPlatform CE consolidates backup, archive, and DevOps workloads into a highly-efficient and infinitely scalable storage architecture that can run entirely in the cloud. This completes our original vision for a limitless storage system that spans both private and public clouds, and enables transparent data movement across clouds." Tad Brockway, General Manager, Azure Storage, Microsoft Corp. said, "Most enterprise customers are using a hybrid environment as they transition to the cloud. Cohesity's solution for managing enterprise data across both public and private cloud data centers addresses critical scenarios for the enterprise market. We're pleased to have Cohesity DataPlatform CE in the Microsoft Azure marketplace." Cohesity DataPlatform CE is currently in preview through an early access program, and will be generally available on AWS Marketplace and Azure Marketplace in the first half of 2017. ### Big Data and the Cloud – Friend or Foe? Big Data and its business benefits have been bandied around since the term entered the Gartner Hype Cycle in 2011 so it’s no surprise that research carried out by Vanson Bourne indicates that 86 percent of organizations now have big data systems in place. But the technology landscape is becoming more complex with nascent technology emerging every month and solution providers attempting to influence the technical decision by addressing architecture concerns such as volume, velocity, variety and veracity. [easy-tweet tweet="Is operating big data in the cloud really a viable option?" hashtags="cloud, tech, bigdata, business"] But how do businesses choose the solution that works best for them and is operating big data in the cloud really a viable option? Cloud – for better or worse Choosing the right big data architecture is no longer a one size fits all method as one single solution is unlikely to cater for all the requirements of business needs, across the entire organization. Typically, enterprise data analytics is deployed through a combination of proprietary on-premise data warehousing solutions and BI platforms. At the same time modern Hadoop, NoSQL, MPP and Streaming technologies are not longer uncommonness, but rather a standard step towards accommodating big data requirements. Fuelled by this novelty and an all but infinitely scalable cloud resource, the benefits of processing and analyzing big data in the cloud has become an important part of the wider industry discussion. When I talk to companies about implementing big data projects, their most common concerns include: Their data users experience serious performance, data quality, and usability issues They have an existing data infrastructure and licenses that are rigid, so it is, therefore, difficult to update the legacy system There is a skills gap in the organization where many departments don’t actually know how to gain access to the data or pull insights from it The investment in time and money needed to build a new system from scratch is unappealing On this basis, you can see why cloud-based solutions are increasingly found to be an attractive path to modern big data analytics. [easy-tweet tweet="Cloud-based solutions are increasingly found to be an attractive path to modern big data analytics." hashtags="tech, cloud, bigdata, "] Instant infrastructure One of the single most valuable benefits of choosing a cloud big data solution is flexibility. 
A cloud-based solution can allow a single network administrator to support hundreds or even thousands of users and open the doors for new business roles to gain access to the data. In addition to this, it enables the creation of a highly available and elastic infrastructure. This enables businesses to focus more on the core business and solves technical problems in a completely different way. With 63 percent of the organizations viewing big data analysis as a necessity to remain competitive in increasingly data-based economies, the ability to establish big data infrastructure as quickly as possible is a major pull factor for cloud-based solutions. By combing cloud services for data storage, processing and archiving companies such as AWS provide the opportunity for rapid prototyping and deployment of infrastructure that companies would otherwise have to build up themselves from scratch. Cost reduction Organizations are increasingly migrating business functions to the cloud whether it’s security, unified communications or data archiving and a key driving factor behind this is the potential reduction in capital expenditure (CapEx). Similarly, migrating big data analytics to the cloud, even if it is in small pockets, can offer major financial advantages. Performing big data analytics on-premise requires companies to invest in and maintain large data centers, which can incur a sizable initial investment and create a long-term drain on IT budgets. This limits the accessibility of big data to companies of a certain size with enough revenue to maintain this investment. When businesses move to the cloud, this responsibility shifts to the cloud services provider which makes this solution appealing for smaller start-up companies as well as larger organizations that require a scalable solution that can be altered to meet demand. We don’t suggest that businesses abandon their own, in-house big data centers completely but instead reduce their reliance on these resources, maintaining small, efficient data centers while transferring the majority of their big data analytics workloads to hosted environments. This cuts the cost of purchasing equipment, cooling machines, and security, while also allowing them to keep the most sensitive data on-premise. The ability to take a hybrid approach to big data analytics is vital for some industries whereas others are happy to go “all in”. Although the financial services industry understands the value of big data and its ability to provide business critical insight, there still remains hesitation to migrate to a multi-tenant public cloud provider due to the security of sensitive data. SoftServe’s recent Big Data Snapshot indicates that 75 percent of financial institutions rank security as their top big data concern meaning banks and financial institutions are likely to continue to take this hybrid approach. What’s next? Machine learning is adopting at a rapid pace, and businesses are rushing to take big data analytics one step further to not just benefit from business intelligence dashboards but also a new level of efficiency by automating manual analysis or enabling new user experiences. Businesses already understand the value of machine learning – despite the term only entering the Gartner Hype Cycle last year, 62 percent or organization already plan to implement it within the next two years. 
[easy-tweet tweet="The Cloud has super data gravity meaning that the more workload it ingests the cheaper it becomes" hashtags="tech, bigdata, cloud, business"] The Cloud has super data gravity meaning that the more workload it ingests the cheaper it becomes for everyone. Similar to traditional big data analytics solutions, processing machine learning in the cloud has appealing cost saving benefits. In addition to this, machine learning workloads could be variable making it difficult for businesses to invest in and forecast on-premise equipment, therefore an elastic and scalable solution within a hosted environment is a much better fit for those businesses considering the technology. Ultimately the success of cloud-based big data analytics is dependent on a number of key factors and perhaps the most significant of these is the quality and reliability of the solution provider. The right vendor should combine robust, extensive expertise in both the big data and cloud computing sectors, and for businesses that still can’t decide between the multitudes of solutions on offer it’s advisable to consult an expert who can provide invaluable guidance through the decision stage and implementation of your big data project. ### IBM Triples Cloud Data Center Capacity in the UK IBM today announced that it is adding four new cloud data centers infused with cognitive intelligence in the UK, to keep pace with growing client demand. The investment in the new facilities underscores IBM's long-standing commitment to providing innovative solutions to the UK market and triples its cloud center capacity in the UK. "We are already among the most digitally connected countries in the world, with a globally successful digital economy worth more than £118billion a year and strong cyber security defences to protect consumers and business,” said Matt Hancock, Minister of State for Digital and Culture. "Today's announcement by IBM is a further boost for this thriving area, and another vote of confidence which shows Britain is open for business. These new cloud data centers will help our firms work smarter and quicker to become the world-leading businesses of tomorrow." Cloud adoption rates increased to 84% in the UK over the course of the last five years, according to Cloud Industry Forum.  IDC forecasts that the global market opportunity for public cloud services will exceed $195 billion by 2020.  The expansion strengthens IBM’s ability to bring clients greater flexibility, transparency and control over how they manage data, run their businesses and deploy IT operations locally in the cloud. These new facilities expand IBM's cloud data center footprint from two to six in the UK and 16 across Europe as part of IBM’s network of more than 50 data centers worldwide.  With these new data centers, clients have access to a complete portfolio of IBM cloud services for running their mission critical enterprise workloads and innovations through more than 150 digital services (APIs). These digital services are available via IBM's development platform, Bluemix and include Watson, IBM's cognitive intelligence technology that allows computers to think and learn like humans, building blocks for innovative Blockchain applications that increase trust in transactions, plus enabling technologies for Internet of Things (IoT) and analytics applications. 
Thomson, part of TUI UK & Ireland, the UK’s largest tour operator, is trialling a new travel search tool, which leverages Watson cognitive services on the IBM Cloud with its customers giving them the opportunity to find travel inspiration by interacting with the virtual travel assistant.  The tool is a smart chatbot that interacts with consumers to provide real-time search results on travel destinations and holiday inquiries, such as "I want to visit local markets/ go on an adventure/ have a cultured holiday/ see exotic animals". Thomson is tapping into IBM's Cloud data center in the UK and leverages Watson APIs such as conversation, natural language classifier and speech-to-text.  The experimental website gains insights and learns from data via frequent requests to provide more digital intelligence that matches customer requirements. "By leveraging IBM's Cloud platform with Watson's cognitive intelligence we have raised the bar in the travel industry and delivered an interactive website to our digitally savvy customers in a fun and innovative way. We are excited to learn how customers interact and what they get out of the experience," said Jo Hickson, Head of Innovation, TUI UK&I. ### When RTC and IoT Clouds Meet, We Move From Proof of Concept to Promise of Commercialization We cannot escape the massive numbers being shared every day now by industry analysts and the media when it comes to the size and growth of the Internet of Things. [easy-tweet tweet="The Internet of Things grows massively every day within the industry" hashtags="IoT, tech, cloud"] Research firm IDC says the global Internet of Things market will grow to $1.7 trillion in 2020 from $655.8 billion in 2014, as more devices come online and new IoT platforms and cloud continue to emerge. They also predict that the number of “IoT endpoints” including cars, wearables, industrial equipment, refrigerators, thermostats, security systems and everything in between will grow from 10.3 billion in 2014 to more than 29.5 billion in 2020. Breaking down the revenue opportunities, or total addressable business market, devices, connectivity and IT services are expected to account for the majority of the global IoT market in 2020, with devices accounting for 31.8 percent of the total. The real recurring growth will come, in our opinion, from purpose-built platforms and “as a service” offerings, which is why the cloud is so important as IoT moves from Proof of Concept to the promise of commercialization. The intersection of IoT with RTC (real time communications of the human kind) creates opportunities for new revenue streams and business models for all who invest now. These opportunities can fundamentally change the way business was traditionally done in many industries, and we’re seeing the most interest from our clients in telecom, healthcare, and public safety. [easy-tweet tweet="The intersection of IoT with RTC creates opportunities for new revenue streams" hashtags="IoT, business, tech, cloud"] Telecom, which has traditionally been a subscription-based model for enterprises as well as retail consumers, can now move up the value chain to offer cloud-based managed services wrapped around IoT-enabled applications that integrate the value of their network and real-time communication services. On the other hand, businesses within healthcare sector can now move to an entirely new model of subscription-based services providing 24x7 remote monitoring. 
They can also plug in partners across the globe that can offer appropriate assistance in case of emergencies. Connected devices with real-time communication triggered by sensors make this possible. For manufacturing companies, integrating partners with whom they can have real-time adjustments in their commercial arrangements is a massive opportunity – we’re seeing interesting work going on here with companies including IBM, SAP, and other large systems integrators. We’re also seeing tremendous interest in security, and strongly advise thinking through security up front, reducing or eliminating the risk of problems down the line. Very high-quality code, robust platforms, and global support for the “glue” that is so essential for securely connecting endpoints with clouds across many types is mission-critical. Companies like myDevices are creating platforms that provide for the provisioning, secure authentication onto networks and into public, private and hybrid clouds, and management of millions of devices. Where gateways have enabled human conversations to access and traverse networks, IoT gateways do the same as “identity management” technology for “things.” There’s a Lot of Money at Stake Gartner estimates that the Internet of Things (IoT) will support total services spending of USD 235 Billion in 2016, up 22 percent from 2015, and nearly USD 6 Trillion will be spent on IoT solutions over the next five years. While consumer IoT will play a major role, connected devices in enterprises, aka. Industrial IoT, will contribute over 60 percent of IoT spending in 2016. [easy-tweet tweet="As the price of sensors falls, it is becoming more feasible for businesses to connect more end points" hashtags="tech, IoT, cloud"] As the price of sensors falls, it is becoming more feasible for businesses to connect more and more ‘end points’. And with Wi-Fi and cellular connectivity across the globe, it is now possible to wirelessly plug any device or infrastructure into the Internet. Billions Being Invested by the World’s Largest Companies Global tech giants including Google, IBM, and others are making significant commitments to the evolution of technologies and platforms enabling IoT solutions. Google’s Brillo and solutions that enable cloud-based deployment of IoT solutions and IBM’s Watson Platform are examples of the evolution of IoT-based solution orientation at these tech giants. Likewise, telecom giants including Vodafone, AT&T, Verizon, and many others are embracing IoT and are developing solutions that help their enterprise customers simplify deployment of IoT-based solutions. Communications Service Providers are extremely well positioned to go to market well beyond connectivity but with experience in building and managing massive enterprise systems for the “distributed IoT” – bringing together local IoT solutions into a common cloud and enabling secure, real-time management of the largest implementations – with the huge advantage of blending machine and human communications, new services that are monetizing decades of investment in all-IP global networks. Fully engineered and fully managed solutions are key to the “telecom” industries competitiveness – and none of this is possible without RTC and IoT clouds. Connecting them? Exponential opportunities to create value. 
### LinuxCon + CloudOpen + ContainerCon Become The Linux Foundation Open Source Summit for 2017 The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, today announced LinuxCon, CloudOpen and ContainerCon have combined under one umbrella event, The Linux Foundation Open Source Summit. The Linux Foundation Open Source Summit in North America and Europe will also contain a brand new event, Community Leadership Conference. Attendees will have access to sessions across all events in a single venue, enabling them to collaborate and share information across a wide range of open source topics and areas of technology. They can take advantage of not only unparalleled educational opportunities, but also an expo hall, networking activities, hackathons, additional co-located events and The Linux Foundation’s diversity initiatives, including free childcare, nursing rooms, non-binary restrooms and a diversity luncheon. The events will take place on the following dates in the following locations: The Linux Foundation Open Source Summit Japan - May 31-June 2, 2017, Tokyo, Japan The Linux Foundation Open Source Summit North America - September 11-13, 2017, Los Angeles, CA The Linux Foundation Open Source Summit Europe - October 23-25, 2017, Prague, Czech Republic Each part of The Linux Foundation Open Source Summit brings part of the open source community to the table, providing a holistic overview of the industry: LinuxCon The event where the leading maintainers, developers and project leads in the Linux community and from around the world gather together for updates, education, collaboration and problem-solving to further the Linux ecosystem. Attendees learn from the best and the brightest as they are gathered together under one roof to deliver cutting edge presentations and demos, lively discussions and collaboration on a wide variety of Linux topics. ContainerCon Containers are revolutionizing the way workloads are automated, deployed and scaled, and ContainerCon is the place to learn about how to best use these technologies. From hardware virtualization to storage to software defined networking, containers are helping to drive a cloud native approach. The leading experts in both the development and operations community will come together to share ideas and best practices for how containers are shaping computing today and in the future with a focus on DevOps culture, automation, portability and efficiency. The goal of the event is to bring companies on the leading edge together with users and developers to continue to innovate on the delivery of open source infrastructure. CloudOpen CloudOpen is the only event providing space for education and collaboration designed specifically to advance the open cloud. It is a conference celebrating and gathering together the open source projects, products, technologies, and companies that comprise the cloud and are driving the cloud, big data and networking ecosystems of today and tomorrow. The event gathers top professionals to discuss cloud platforms, automation and management tools, DevOps, virtualization, containers, software-defined networking, storage & filesystems, big data tools and platforms, open source best practices, and much more. Community Leadership Conference (North America and Europe only) The Community Leadership Conference provides world-class content and networking for building empowered and productive open source communities. 
The event provides presentations, tutorials, panel discussions, and networking opportunities that bring together some of the leading practitioners to share their expertise in how to build powerful communities. Whether its focus is on collaborative workflow, licensing, governance, outreach, messaging, social media, or anything else, there is sure to be content and conversation that will bring value to any organization or project. The Community Leadership Conference is led by Jono Bacon, a global leader in community strategy and management, author of The Art of Community, and community leadership consultant to many organizations including Huawei, HackerOne, Microsoft, GitLab, Creative Commons, Sony Mobile, ClusterHQ, and others. “As open source continues to grow and expand into new areas such as hardware, data, IOT, standards, and beyond, having an accessible, productive community strategy is critical for success”, Bacon says. He continues, “I am excited to launch the Community Leadership Conference as part of the Open Source Summit to support the continued evolution of community leadership and the success of those who practice it.” “In recent years, open source has expanded to be the default software in virtually every area of technology, so it is important that the broad community have a place to gather and exchange ideas,” said Linux Foundation Executive Director Jim Zemlin. “The Linux Foundation Open Source Summit will gather the best and brightest from every corner of open source technology together for an event where they can collaborate and share best practices.” Registration for The Linux Foundation Open Source Summit Japan, North America and Europe will open in December. Organizations interested in sponsoring these events can download a prospectus from http://events.linuxfoundation.org/sponsor. About The Linux Foundation The Linux Foundation is the organization of choice for the world's top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company. ### Axway a Leader in 2016 Gartner Magic Quadrant for Full Life Cycle API Management in Third Consecutive Placement Axway, a catalyst for transformation, today announced it has been recognized as a Leader in Gartner’s 2016 Magic Quadrant for Full Life Cycle API Management for its ability to execute and completeness of vision. Previously known as the Application Services Governance Magic Quadrant, Axway has been in the Leaders Quadrant since 2013. APIs are fundamental to the success of leading brands to better anticipate, adapt and scale to meet ever changing customer expectations. Axway API Management Plus enables API creation, control and consumption to make it easier to deliver rich digital experiences. Uniform coverage throughout the API lifecycle allows API developers, security architects, integration teams and lines of business to collaborate together as a customer experience network and drive expanded revenue opportunities. This satisfies the needs of digital consumers, increases employee productivity, and fosters architecture modernization. 
API Management Plus is powered by AMPLIFY™, Axway’s cloud-enabled data integration and engagement platform. According to Gartner analysts Paolo Malinverno and Mark O’Neill, “It is generally preferable to choose an overall API management solution from a single vendor that provides full life cycle API management. API management has evolved from being focused only on running APIs to taking a broader view of the API, its design and its usage across the full API life cycle.” “Traditionally, APIs have been a sole concern for programmers – but no more. Business and IT leaders are charged with extending systems and services to an expanded digital ecosystem of customers, partners, and developers. That is where Axway’s deep understanding of APIs critical role in digital strategies comes in. With our true lifecycle solution for APIs, API Management Plus, organizations can link existing systems and data to new cloud services, mobile apps, and devices,” said Jeanine Banks, executive vice president, global products and solutions, Axway. “We believe being named a Leader in the Gartner Magic Quadrant for Full Life Cycle API Management for a third consecutive placement validates our unwavering commitment to provide the platform for creating new, efficient ways to collaborate and drive revenue.” Axway completed the acquisition of Appcelerator in January 2016.  Building on Axway’s Vordel purchase in 2012, the acquisition reinforced the company’s strategy to expand the reach of its offering in API management and app development, to deliver optimal customer experiences connected on any device, to any data source, at any time. ### Cloud Backup Services that Don’t Cost a Pretty Penny Headlines are abuzz with post-Brexit rhetoric: “Falling pound” this, and “Price increase” that. For all the sensation the I-told-you-soers are creating about exiting the EU (of which the full effect won’t be felt for years), there is merit to the alarm. [easy-tweet tweet="For all the sensation the I-told-you-soers are creating about exiting the EU, there is merit to the alarm." hashtags="brexit, tech, cloud,"] Foreign exchange rates are invariably affected by sentiment and, as such, companies paying in US dollars for their cloud services will have to fork out more. And where accounts are still paid locally in pounds, American-based cloud service providers (like Microsoft) will be getting fewer dollars when the money goes home. Some are responding by hiking prices of their cloud offering by up to 22 percent from January 2017. This could have the nett effect of companies reducing their margins and possibly losing their competitive pricing edge. It has also been suggested that UK-based companies could consider relocating their headquarters closer to their mainland European customer bases. This could be especially relevant should the effects on visas start to be felt and if attracting overseas talent for jobs becomes strained. Clouds, Yes, but silver linings also. Viable UK-based alternatives As far as cloud backup and storage providers go, there are plenty to choose from within the UK. They range from the Dropbox-type, home user-oriented cloud-based storage to the more elaborate enterprise-ready managed cloud backup services offered to corporates. Staying within UK borders reduces the number of choices available a little but that doesn’t mean many of the top providers are out of reach. There are also many benefits to going local: Automatic compliance with data sovereignty laws. 
The Data Protection Act and other compliance mechanisms will be adjusted as time goes on and Brexit is implemented Avoiding surveillance by other sovereign states. Since other countries don’t have access or jurisdiction, your data privacy becomes more easily guaranteed. Logistical benefits. Whether it is to get technical support, having large data volumes couriered to site at short notice, or to renegotiate service contracts, being within easy travelling distance of your provider immediately makes for a more efficient working relationship. [easy-tweet tweet="When cloud backup services are under the same flag, your budget will be vulnerable to fewer risks" hashtags="cloud, tech, business"] Predictable costing models When your cloud backup services are rendered under the same flag, your budget will be vulnerable to fewer risk factors – primarily foreign exchange fluctuations. In addition, cloud backup providers that function within the same socio-economic environment will also behave more predictably with their costing models, resulting in greater stability to your business’ overheads. As a final tip, look for a provider that charges on a cost per gigabyte basis only. This allows you to only pay for what you use, without any unforeseen costs due to data transfer. 'Paul started his career as a Lawyer in the UK with Shoosmiths and Harrison back in 1993 before realising this was not his vocation and decided to move into IT at the end of 1994. He subsequently joined Memory and then Shuttle Technology ending up as Business Development Director. In his current role, Paul is responsible for setting the strategic direction of the business and investigating and developing new business opportunities.' ### Tech Trials: What we’ve learned this year At Red Badger, we love trying out and testing technology, both new and old. Just recently our team sat down to discuss developer and testing topics, ranging from new languages to new ways of working. The Badger team split the current and future technological ecosystem into ‘what we’re most excited about for 2017’, ‘fresh twists on a classic’ and ‘tech that needs more work’. What we’re most excited about: AWS + Serverless – Red Badger’s number one for 2017 These are best used for projects that have a huge need to scale, think retail websites that are taking part in Black Friday, that kind of scale. AWS combined with tools like Terraform will also serve well in decreasing the need to engage in traditional ‘DevOps’ activities. [easy-tweet tweet="AWS combined with tools like Terraform will help decrease the need to for traditional ‘DevOps’ activities." hashtags="DevOps, tech, cloud"] Serverless pushes the concept further, provisioning environments and deciding scale seamlessly based on application requirements, allowing the focus of development to remain on the application core. It’s really exciting and we have great expectations for this technology in the future – we’ve already launched projects utilising it successfully and will be looking to take on more. GraphQL and Apollo Although REST has carried us far, GraphQL alleviates many of REST’s long and well-known problems. It allows you to declare exactly what data you require to a single endpoint and allows the backend to resolve these needs. Meanwhile, Apollo is an incrementally-adoptable data stack leveraging GraphQL that manages the flow of data between clients and backends. It looks to be a well-designed system that works fantastically with some of the top front-end core technologies like React and Redux. 
Elixir Elixir is a very exciting backend language that many of us at Red Badger are keen to embrace further. It provides the tools to implement scalable, fault-tolerant concurrent backend systems by utilising the power of the BEAM VM and the lessons of Erlang. [easy-tweet tweet="Elixir provides the tools to implement scalable, fault-tolerant concurrent backend systems" hashtags="tech, cloud, future"] In addition to intrinsic language virtues, it provides a fantastic developer experience and a great range of tools. For example, to ensure everything’s up to scratch, Elixir can run code examples in your method documentation. Elm Elm is a powerful front-end language and a joy to use, with the strong type system allowing it to have features such as exceptionally insightful compilation error message, should you ever lose your way. Over the last year, Elm has only been gaining momentum, gaining features, and growing a vibrant and brilliantly helpful community. Fresh twists on a classic: Java Java is a well-remembered tool amongst us Badgers, and its recent use at Red Badger has brought back memories (fond and otherwise). Java and its frameworks, especially for legacy systems, has both advantages and flaws that have been explored often. Given the general internal trend towards functional programming, it’s a good idea to try Clojure or Scala to allow firms to keep their JVM infrastructure while gaining the benefits of these whizz-bang languages. Need more work: Cucumber and other automated UI testing tools Why does it need more work? Automation testing processes can begin to add to the lifetime of pull requests, and unreliable or unwieldy integration tests inject uncertainty into the readiness of the product. Most Badgers found that automation testing provided little value. UI testing is a difficult problem, especially at scale, and no framework seems to be perfect for us at the moment. For 2017, we’re thinking snapshot testing through tools like Jest will allow some of the automation testing to be filtered down to a middle layer, letting automation tests remain lighter. CSS CSS has been around for close to twenty years and many of us feel it has fundamental flaws which make it difficult to combine with the component-based approach to creating web applications, such as unpredictable side effects, subtly different rendering between browsers, inconsistent rules and its cascading nature. The solution to these issues is also unclear. Discipline approaches like BEM exist to minimise some of these pains, CSS Modules have proven popular but at Red Badger, we think that maybe something more exotic like the recently released Styled Components or defining styles in JS could be the solution and the future. [easy-tweet tweet="Styling is a hot topic in the community so we’re keen to experiment in 2017." hashtags="tech, cloud, CSS"] Flow Released by Facebook, Flow is an optional typing library that allows you to gain the benefits of static types in the normally dynamic language wonderland of javascript. These types can be defined by hand, and through Flow’s impressive type inference system. It gives you safety from all kinds of null errors, and through fantastic tooling such as Nuclide, you can gain great hints on possible pitfalls in your code as well as additional developer experience niceties like on-the-fly parameter documentation. 
While this can increase confidence and predictability of your code, some Badgers felt this removes from the dynamic core of javascript, while others mentioned that the current tooling caused them aggravating issues. This technology was widely disputed at the Badger gathering, but the optional and incremental aspect of Flow means that for teams where it’s providing value, it can be used without treading on other people’s toes. We’ll be keeping an eye on Flow, and maybe we’ll agree when the syntax settles and the tools mature, or we’ll have moved onto something else! ### Dell EMC and Intel transform life science research with multi-million pound Bio-Medical Cloud at the University of Cambridge Dell EMC and the University of Cambridge (UoC) have today announced a new multi-million pound Bio-Medical Cloud service, one of the largest dedicated bio-medical compute and data platforms in UK university research. The capabilities offered by the platform will accelerate research and computational medicine allowing data analytical methods developed in the university setting to be used to improve patient outcomes in the clinic. While the need for traditional HPC use cases still exists within medical research, a new researcher demographic of non-computational experts has emerged. To meet the need of this demographic, UoC needed a new platform which would enable non-IT specialised medical researchers to benefit from HPC capabilities, while still handling large scale data intensive workloads. To meet the demand, UoC partnered with Dell EMC and Intel to design and implement a new paradigm of HPC systems. Through the enhanced performance of Dell HPC hardware solutions and access to the benefits of OpenStack virtualized technologies, university researchers are now able to process and model complex data with significantly improved flexibility and system usability.  Dell EMC has worked closely with Research Institutional Services at UoC to pioneer solutions which use the best features of commodity hardware and open-source technologies to deliver HPC solutions. The Cambridge Bio-Medical Cloud, powered by Dell compute and storage servers, including the Dell PowerEdge C6320, the Dell PowerEdge R630 and the Dell PowerVault MD3460and Intel’s latest generation processors and NvRAM technologies coupled with high performance Ethernet networking, will deliver major compute and data analysis capabilities to the Cambridge bio-medical community. “There’s a revolution occurring across scientific, technical and medical research disciplines, generating demand for platforms which can handle a large user-base of computational techniques and data intensive science,” said Bart Mellenbergh, director, HPC Dell EMC EMEA. “Dell EMC offers HPC capabilities, accelerated by the latest hardware technology, to handle the excessive demands of large scale data modelling whilst also providing additional flexibility and accessibility through open source cloud computing.”  “The Cambridge Bio-Medical Cloud heralds a transformation in the manner in which Research Computing Services are being delivered,” said Dr Paul Calleja, director, Research Computing at the UIS, University of Cambridge. “To address the challenges associated with HPC, large scale high performance data analytics and delivery of Web-Services, IaaS and cloud-based models, the University is adopting an agile Cloud service model centred on OpenStack. 
With the assistance of superior hardware and core technologies, this new platform will radically democratise access to large scale compute and data resources and ultimately contribute to significant advancements in the treatment discovery process.” ### Are businesses finally getting excited about IT resilience? The provocative side of IT resilience is made up of all the threats that today have more and more people talking about it than ever before. Let’s face it, who isn’t intrigued (and worried, terrified or simply concerned) by headlines about IT failures that have triggered nationwide flight cancellations or data hostage situations or global service outages? That said, even less dramatic outages like ING Bank and Glasgow City Council still hit the headlines here in the UK last month. ING Bank’s regional data center went offline due to a fire drill gone wrong (reports suggest that more than one million customers were affected by the downtime), and Glasgow City Council lost its email for three days after a fire system blew in the Council’s data center. [easy-tweet tweet="IT resilience is no longer doomed to the proverbial backburner" hashtags="tech, cloud, IT, security"] The stark reality is business threats are only becoming more pervasive, and whether that is because of ever-evolving cyber-attacks, or human errors being made by overworked teams, or aging hardware and systems that just keep breaking down… the list is constantly growing and goes on and on. In the face of these rising threats, there have been two recent and very fundamental shifts in the way C-Suite and IT executives think about Business Continuity: The first one is that IT resilience is no longer doomed to the proverbial backburner and only something that the IT team needs to concern itself with. This is especially true as the “powers that be” realise cloud-based backup and disaster recovery are extremely cost and resource-efficient. Near-zero downtimes are standard with the right technology and support. Implementation time can be measured in hours meaning that the path to securing executive buy-in is well paved and far easier than it might have been in the past. That said, at a roundtable that we hosted last month in London with senior IT executives to discuss the findings from our latest survey, The State of IT Disaster Recovery Amongst UK Businesses, the group did debate whether business decision makers really understood the financial impact of downtime. Moreover, whether more education is needed about recovery times and what can be recovered, with the group concluding that clearer prioritisation around different systems needs to be implemented so the business understands what will happen when outages take place. [easy-tweet tweet="Security is now a top priority for Disaster-Recovery-as-a-Service and backup." hashtags="DRaaS, tech, cloud, security"] The second fundamental shift is that security is now a top priority for Disaster-Recovery-as-a-Service (DRaaS) and backup. When choosing a DRaaS or Backup-as-a-Service (BaaS) provider, companies are rightfully now asking more questions about security first, right along with speed of recovery and all the other questions that you might expect get asked. So why is this? To my mind this comes down to a few key factors: Companies adopt BaaS as the natural first step to geographic diversification, sending secondary copies to a secure location with less than five minutes of setup time. 
With our pricing model, companies can protect themselves from data loss, compliance fines and malicious threats like ransomware at very little expense, which is of course very attractive. DRaaS has become the standard when every second counts, cost-efficiently enabling companies to minimise IT downtime that is often measured in the thousands of dollars per minute. Security and compliance are key to our customers and our advanced security cloud platform integrates all the components our customers need to maintain security best practices in the cloud. Backup and Disaster Recovery prove to be stepping stones to broader cloud migration. Customers increasingly fail over to the iland cloud as part of a DR effort and choose to remain there, rather than failing back. In a disaster, companies report exceptional phone support is critical, as it provides the last piece of assurance a failover will succeed.  Therefore it is critical that cloud service providers factor this into the overall package that they offer. As I said at the outset, IT resilience is becoming much more of a business and mainstream issue and with IT resilience firmly on the agenda, organisations are able to put their worries to rest. ### Virtual Instruments Acquires Xangati  Virtual Instruments, the leader in infrastructure performance management, today announced it has acquired Xangati, a hybrid cloud and virtualisation performance management company. The acquisition creates the industry’s first application-centric infrastructure performance management platform that is real-time, cross-domain, and empowers IT teams to ensure application performance and availability as they digitally transform their businesses. Xangati’s products enable a self-healing, self-optimising hybrid data centre, complementing Virtual Instruments’ deep expertise and experience in IT infrastructure performance analytics. Together, the offerings deliver unparalleled application infrastructure performance and availability management by providing comprehensive, correlated real-time insight for any server, network interconnect or data store across the entire data centre. Virtual Instruments and Xangati address a gap in existing application and infrastructure performance monitoring approaches by providing comprehensive visibility and actionable analysis across the compute, network and storage infrastructure in the modern hybrid data centre. The combination of Virtual Instruments and Xangati gives IT teams control of application performance delivery and infrastructure spend within private or public clouds on legacy or hyperconverged platforms. The Xangati acquisition is the second significant transaction in 2016 for Virtual Instruments. Earlier this year, the company merged with leading storage performance analytics provider Load DynamiX. Virtual Instruments continues to see widespread and increasing demand. “Our mission is to create a world where applications and infrastructure perform better together, and this acquisition supports that goal for our Global 2000 customers. By adding Xangati’s monitoring and advanced analytics expertise to Virtual Instruments’ capabilities, we are even better positioned to help companies assure performance, increase availability, and optimise the cost of application and service delivery,” said Philippe Vincent, CEO of Virtual Instruments. 
“The combination of Virtual Instruments, Load DynamiX and Xangati has now created a company with the scale, expertise and capabilities to lead the market for next-generation performance and availability solutions,” said John Kim, managing partner of HighBar Partners, lead investor in Virtual Instruments. “EMA Research shows that more than 90 percent of enterprises are unable to predict or reliably monitor application performance within today’s complex hybrid infrastructure environments. Virtual Instruments has been a leader in application-centric storage infrastructure monitoring and performance analysis for many years and with the acquisition of Xangati, the company now owns all the critical parts to offer a single solution for proactive virtualisation, server, network, storage and cloud performance management,” said Torsten Volk, managing research director, Enterprise Management Associates. As a result of the acquisition, Virtual Instruments will enhance its existing capabilities in infrastructure performance monitoring with: Deeper visibility into key environments: IP network infrastructure and network flows Compute and virtualisation layer VDI Public cloud environments such as Oracle and Amazon AWS Containers Advanced analytics: Real-time contention analytics Predictive capacity analytics Adaptive control analytics “IT architects, applications delivery and infrastructure operations teams need a holistic approach to proactively ensure the performance and availability of their constantly evolving hybrid data centres,” said Jagan Jagannathan, founder and chief technology officer of Xangati, and now chief innovation officer of Virtual Instruments. “Now that Xangati is part of Virtual Instruments, these teams will have a shared view into how changes in application behavior and infrastructure affect application performance, complementing application performance management (APM) solutions. This unparalleled visibility across the digital landscape will help enterprises scale and adapt.” “We have been using the Xangati products as the foundation of our NovaMonitor real-time network infrastructure monitoring service for our customers, which has allowed us to offer differentiated high-value services,” said Jason McGinnis, president of NeoNova, a leading provider of subscriber and network solutions for regional broadband providers. “The combination of Virtual Instruments and Xangati is compelling as it will enable us to offer a broader range of monitoring services with complete end-to-end infrastructure visibility.” ### Why all companies need a tech-savvy business continuity manager with people skills Regulatory hurdles. Severe weather threats. Ransomware. Brexit. These are just a few of the challenges facing businesses today. Add to that the fact that around 78 percent of cloud users have formally adopted two or more cloud services, and you get an increasingly complicated business continuity landscape to navigate. [easy-tweet tweet="Around 78 percent of cloud users have formally adopted two or more cloud services" hashtags="cloud, tech, business"] As businesses become aware of these threats and the importance of maintaining business continuity, the business continuity manager is assuming a more central role in business strategy. 
To help business continuity managers identify what skills they need in their toolboxes to keep pace with the business world through 2017, Yusuf Ukaye, business continuity specialist at IT Specialists (ITS), will be at the BCI World event participating on the “Professional development: New skills for a changing landscape” panel. These tips are also helpful for business leaders wondering what qualification to look for in an effective business continuity manager. Ukaye recommends that business continuity managers have the following skills: An understanding of the power of the cloud From software as a service (SaaS) to disaster recovery as a service (DRaaS), there are a plethora of cloud-based tools on the market. It’s critical for a business continuity manager to be skilled at understanding the implications of using cloud solutions, including data management and compliance challenges. Sometimes outsourcing is a must, especially for SMEs or larger enterprises that don’t have endless budgets, and the business continuity manager must be able to ascertain when working with a third party, such as a managed service provider (MSP), is beneficial for your business. DRaaS is a commonly outsourced cloud technology component, as it’s extremely important to keep your business systems online and operational. Many businesses take advantage of hybrid cloud infrastructure so that critical business data is backed up not only in the cloud (thus mitigating local incidents, both technical and physical) but also on local dedicated hardware, so restoring data doesn’t need to be downloaded from the cloud. Strategic use of hybrid clouds can provide the certainty that your data and IT environment are protected without draining your IT resources. Of course, managing hybrid infrastructure requires significant time and expertise, which is why some businesses choose to have an MSP manage their DRaaS infrastructure. [easy-tweet tweet="DRaaS is a commonly outsourced cloud technology component" hashtags="tech, DRaaS, business, cloud"] The ability to discern the business’s overall risk profile It’s important for a business continuity manager to be familiar with best practices for cloud management, but they must also understand how these functions fit into the business’s overall risk profile. To do so, the business continuity manager needs to know their colleagues and the business well. This process starts by working with each business unit to identify how soon each department and function must be up and running in the event of a disaster. The business continuity manager must also work with the IT department to determine the feasibility of meeting the required recovery time frames. For example, if the business needs email to be up and running within four hours, can the IT department support this requirement with their current resources? All businesses are dynamic and quickly changing, so it’s essential for the business continuity manager to consult the various departments before, during and after the initial planning phase. Business continuity is an active, ongoing process – it’s not about writing a plan that then collects dust. Knowledge of what makes people tick  Truly effective business continuity managers are able to get the whole organisation involved in business continuity planning by knowing what’s important to the business units. 
Inspiring internal stakeholders to engage in the planning process and respond to any incident appropriately involves the business continuity manager being empathetic and proactively building relationships with those people. Motivating individuals with the appropriate skills and knowledge required to perform business activities and maintain the confidence of the supply chain is important since external factors such as supply chain vulnerabilities are especially easy to overlook or understate in the planning phase. Business continuity testing experience Once you’ve created a business continuity plan, it’s time to establish a regular business continuity testing schedule to ensure the plan can be implemented in the real world. Testing under real-world conditions for a variety of scenarios, such as a flooding incident or a power outage, will help prove – or disprove – the effectiveness of the plan. Throughout the testing process, employees should gain familiarity with business continuity protocol such as deciding when it’s unsafe to commute, communicating with coworkers and customers, and accessing critical applications. Ultimately, testing increases employees’ awareness of the threat landscape that exists. A successful business continuity manager will be skilled at constructing tests, documenting results and creating follow-up action items based on the results of the test. Of course, your business might require your business continuity manager to have additional skills, but these are a good starting point. ### Lessons in Contingency Few would argue that business as we know it could exist without Internet-based services. Connected devices and the burgeoning Internet of Things only add to this dependence, and yet, two key vulnerabilities remain that, when exposed, have the potential to now completely isolate businesses from their customer base – even via the telephone. [easy-tweet tweet="Few would argue that business as we know it could exist without Internet-based services." hashtags="cloud, tech, security"] Only recently, 10 percent of BT customers’ Internet usage was affected by broadband outages, caused by no more than a power cut at one of its partners’ Docklands-based data centers. The issue also followed a widespread shortage in February that hit several hundred thousand customers. Of course, when broadband is reliant on the power of those partners housing the infrastructure, a certain number of setbacks due to extreme circumstances or conditions could be unavoidable. Yet, the apparent fragility highlighted by this particular incident does still raise a question mark over the resilience of the UK’s Internet architecture in the first instance. Secondly, for businesses, it emphasises the importance of ensuring the contingency of all Internet-based services, including IP telephony. Despite consumers’ increasing propensity towards online self-service – not least in the banking sector, which was particularly affected by the latest outage – the fact remains that 72 percent of people still prefers customer contact via the telephone. And with businesses of all sizes migrating to IP Telephony for its cost savings, increased operational flexibility, and agility, the incident points clearly towards the need for ensuring resilience and contingency within the provider’s architecture. So what lessons in contingency can the industry learn? 
First and foremost, organisations should think about having a guaranteed alternative means of connecting to the services they want; which should ideally be a combination of digital and IP-based technologies for full business continuity. Simply having two Internet or Ethernet connections may not be sufficient if the service supported by a business’s telephony (and broadband/Internet) provider is business critical. For some businesses, analog might also play a role, but with an expected lifespan of only a few more years for this technology, it is especially important for such companies to consider contingency. On-premise switches have evolved for corporate digital and IP use, but are still restricted due to the static nature of the technology. For example, on-site digital switches can now take IP cards, and IP switches can accommodate digital cards as back up. However, if the on-site switch fails, a replicated switch that can be manually swapped onto is the only means of backup. [easy-tweet tweet="As we move steadily forward in today’s digital age, innovation comes in many shapes and sizes." hashtags="cloud, tech, mobile"] A cloud-based solution, on the other hand, offers a flexible and seamless alternative with no need to overhaul existing on-premise infrastructures. This opens the door to a range of on-demand contingency measures, including the ability to log in through a standard telephone line to manage the telephony system and operate as usual. A preinstalled mobile app can also allow the use of the mobile network rather than the local Internet network. In fact, mobile is fast becoming the backup of choice for contingency. As a location independent solution, mobile can receive directed calls with relative ease, particularly those devices installed with contact center apps. As we move steadily forward in today’s digital age, innovation comes in many shapes and sizes. However, one thing is clear: existing ways of doing things still have a role to play. In the modern world, there remains a place for both digital and IP telephony – the true innovation lies in combining the two in conjunction with the cloud for complete architectural resilience and business continuity. ### The rise of the pro-locator The move to cloud-based computing has been an ongoing goal for many IT teams across the UK. Indeed, the Cloud Industry Forum recently found in its fifth annual UK cloud adoption survey that more than four in five UK organisations have formally adopted at least one cloud service, and many more have it on their priority list for the year ahead. Looking to 2017, many of Gartner’s predicted trends will rely on a cloud computing infrastructure, for example, the Mesh App and Service Architecture (MASA). [easy-tweet tweet="The move to cloud-based computing has been an ongoing goal for many IT teams across the UK." hashtags="business, tech, IT, cloud"] With much more data being stored in the cloud, the type of data itself will inevitably become more varied. Some organisations may elect to only be storing non-critical data in the cloud, whereas others may feel more comfortable about using services to hold more sensitive data. When making this decision, the location of the cloud services is an important factor. Services hosted in Tier 3 and 4 data centres, may be more appropriate for business-critical data, whereas back-up and archive platforms are more suited to data centres which prioritise value for money over the speed of access and resilience. 
No two data centres are made equal, so it makes sense that businesses take advantage of their individual qualities to build more efficient and cost-effective data centre estates. In essence, they should become a ‘pro-locator’, matching the data or service to the most appropriate facility. The barrier to doing this, however, has been connecting these locations together in a way that delivers the level of data access and reliability which is typically offered in a single data centre. Data centres with a network edge To overcome this hurdle, it is vital to pick data centres which are connected to an ultra-resilient, high-capacity network. If inter-site connectivity is at a good enough standard, enterprises can take advantage of the sweet spot which each data centre operator has specialised in – whether that be as a communications hub or for financial exchanges. [easy-tweet tweet="what does an ultra-resilient, high capacity network actually look like?" hashtags="business, tech, cloud, IT"] But what does an ultra-resilient, high-capacity network actually look like? Here is a checklist of the qualities an IT team should look for: 100Gbps-capable optical wavelengths; sub-1ms latency; in-life circuit moves between connected data centres; rapid provisioning – ideally within a week; flexible contract durations and negotiable set-up charges; and 24/7 support. As if it were a single portfolio A best-of-breed network with these attributes can become a differentiator for enterprises in creating their own bespoke data centre portfolio. Dubbed ‘pro-locators’, enterprises which distribute their IT estates between data centres in this manner can take advantage of a number of different data centre qualities, which drive down cost and maximise network effectiveness. In a nutshell, the ability to utilise this kind of technical, operational and commercial flexibility is fundamental for today’s savvy enterprise IT managers to establish a truly competitive advantage that can be seen on the bottom line. ### AWS Makes Amazon QuickSight Available to all Customers Amazon Web Services, Inc. (AWS), an Amazon.com company, today announced that Amazon QuickSight is now available to all customers. Amazon QuickSight makes it easy for all employees within a company – regardless of their technical skill – to build visualizations, perform ad-hoc analysis, and quickly get business insights from their data. More than 1,500 AWS customers, including global enterprises and startups from a range of industries, have been using Amazon QuickSight during its preview. These customers have experienced how seamlessly Amazon QuickSight connects to AWS and non-AWS data sources, and how fast its query performance is thanks to its robust Super-fast, Parallel, In-Memory Calculation Engine (called SPICE). Pricing for Amazon QuickSight starts at $9.00 per user, per month – one-tenth the cost of traditional BI solutions – so customers of all sizes can make rich business analytics functionality available to all of their employees. The volume of data businesses create and process is growing every day. To get the most value out of this data, customers need to make it easy for all employees, regardless of their technical skill, to analyze data and share insights with others to drive decisions.
However, the cost and complexity of traditional BI solutions prohibit most companies from making analytics ubiquitous across their organizations because these solutions require investments in costly on-premises hardware and software, weeks or months of data engineering time to build complex data models, and ongoing investments in additional infrastructure to maintain fast query performance as data sets grow. Amazon QuickSight is built from the ground up to solve these problems by bringing the scale and flexibility of the AWS Cloud to business analytics. Getting started with Amazon QuickSight is simple and fast. Customers simply log in to Amazon QuickSight, and Amazon QuickSight automatically discovers their data stored in AWS services such as Amazon Redshift, Amazon Relational Database Service (Amazon RDS), and Amazon Simple Storage Service (Amazon S3). Amazon QuickSight also connects to third-party data sources such as Microsoft Excel or Salesforce.com. Amazon QuickSight formats the data and moves it to SPICE so it can be visualized in Amazon QuickSight, and customers can choose to keep the data in SPICE up-to-date as the data in the underlying sources change. “Amazon QuickSight makes business analytics much easier, more accessible, cost effective, and faster than ever before,” said Raju Gulabani, Vice President, Database Services, Amazon Web Services. “Because so many companies store data in AWS, it’s very appealing to these customers that Amazon QuickSight can automatically discover their data in AWS, move it to SPICE, and return query results so quickly – all for one-tenth the cost of traditional BI solutions.” Built from the ground up for the cloud, SPICE uses a combination of columnar storage, in-memory technologies enabled through the latest hardware innovations, and machine code generation to allow users to run interactive queries on large datasets and get rapid responses – even with thousands of users all simultaneously performing interactive analysis across a wide variety of AWS data sources. As users explore their data, Amazon QuickSight infers data types and relationships, issues optimal queries to extract relevant data, aggregates the results in the SPICE engine, and provides suggestions for the best possible visualizations of the data – from bar and line graphs to pie charts and scatter plots or heat maps and pivot tables. MLB Advanced Media (MLBAM) is a leader in digital media and content infrastructure, powering an ever-growing number of massively popular media and entertainment properties. “We have been using the preview version of Amazon QuickSight to get insights into our digital content business,” said Brandon SanGiovanni, Traffic Manager, MLBAM. “Amazon QuickSight and the SPICE engine allowed us to tear down and explore our data in a fraction of the time we expected, making it easy for us to tap into the value inside our data. Amazon QuickSight made it easy to slice and dice our data by simply pointing Amazon QuickSight to our AWS data sources. With just a few clicks, Amazon QuickSight provided us with a real-time, 360-degree view of our business without being constrained by pre-built dashboard and metrics, which allowed us to expand our use of data to make informed decisions regarding our products.” Edmunds.com is a leader and one-stop resource for automotive information online, and provides everything from prices for new and used vehicles, vehicle reviews, and true cost of ownership to tips and advice for car purchases. 
“At Edmunds.com, we have large amounts of data generated every single day in AWS, from website clickstream logs, vehicle transaction and inventory data, and third party datasets to our server and infrastructure logs,” said, Ajit Zadgaonkar, Head of Cloud Infrastructure, Edmunds.com. “Today, we use various tools to generate reports for the entire organization. However, there’s an increasing need to equip our business users with the ability to do their own ad-hoc data discovery and analytics to complement our centralized reporting. We are excited that Amazon QuickSight supports this self-service use case very well. Now that it is generally available, we’re looking forward to giving access to Amazon QuickSight to all of our business users because of its ease-of-use, fast analytics, deep integration with AWS data sources, and affordable price point.” Infor provides business applications specialized by industry and built for the cloud, to its more than 90,000 customers and 66 million cloud users. “We have been using Amazon QuickSight during preview to analyze large datasets from our customers and we are excited about the speed of SPICE and the ease of use of Amazon QuickSight,” said Steve Stahl, Senior BI Development Manager, Infor. “Amazon QuickSight’s SPICE engine is very fast and lets us quickly process these datasets and visualize the results while greatly simplifying integration with Amazon RDS and Amazon Redshift.”   Hotelbeds is a global distributor of accommodations serving more than 185 destination countries worldwide and more than 25 million room nights annually. “Hotelbeds’ platform that connects travel partners around the world is built on top of AWS and generates massive amounts of data every day, but our existing BI solution limits how we make these data and insights available to our business users,”said Miguel Iza, Head of Data and Analytics, Hotelbeds. “Amazon QuickSight is the ideal service to make this data available to our business users. After testing this new service in preview, we found Amazon QuickSight to offer great performance with its SPICE engine, and an easy-to-use user experience when it comes to analytics. Amazon QuickSight simplifies the way our users access their data, perform self-service analysis, and share insights with others. We plan to adopt Amazon QuickSight for our new data solutions and look forward to Amazon QuickSight democratizing business analytics in our company.” Amazon QuickSight is free for one user and one GB of SPICE capacity. The Standard Edition of Amazon QuickSight starts at $9 per user per month, includes 10 GB of SPICE capacity, and is now available in US East (N. Virginia), US West (Oregon), and EU (Ireland) and will expand to additional Regions in the coming months. The Enterprise Edition of Amazon QuickSight includes all Standard Edition features as well as support for Active Directory integration and encryption at rest and will be available in the coming weeks. ### How SEO Is Helping Counter Customer Resistance with Improved Marketing Communication The marketing function for every business has always assumed the shape of a battle between the salespeople and the customers. Ever since businesses came into being, it has been their attempt to create a product or a service for which there is a need and all energies are focused on making customers aware of the existence of the product and create a yearning for it. Conventionally, the battle is with the competitors who occupy the same space with their own products and services. 
However, modern-day businesses have realized that not only do they have to defeat their competitors, but they also need to counter the resistance of customers themselves. [easy-tweet tweet="An Internet user is averagely exposed to a few hundred advertisements on a daily basis." hashtags="ads, tech, online,"] Just as businesses have defined their goals and adopted various strategies to achieve them, customers too engage in a similar game. Not only are customers obsessed with discovering the best products at the lowest prices, but they are also busy resisting the communications put out by thousands of businesses to win over their attention. According to studies, an Internet user is exposed, on average, to anywhere from a few hundred to over a thousand advertisements on a daily basis. This constant bombardment has made internet users extremely tired of the onslaught on their senses and provoked a backlash. Consequently, while internet marketers are continually attempting to get more attention from their target audiences, the customers themselves have started resisting in a number of ways. Ad Blocking – the New Threat Technically speaking, ad blocking has been around for quite some time; however, some recent trends have thrust it into the limelight. While PC World magazine featured Adblock Plus, a popular ad blocking tool, in its list of 100 Best Products way back in 2007, the technology has grown from a merely handy tool used by the tech-savvy to one that now poses a massive headache for online advertisers all over the world. According to a PageFair-Adobe study, in 2015 there were 198 million active users of ad blockers, an amazing 41% spurt over the previous year. It is estimated that online marketers lost close to $22 billion due to this technology in 2015 alone. Not only did the trend continue in 2016, but the technology also came to be used for web browsing on mobiles. Currently, the population of smartphone users using ad blockers is estimated to be in excess of 400 million and rising fast, with Apple deciding to allow ad blocking on its devices. Apart from online publishers, ad blocking is severely affecting the fortunes of advertisers too. Not only is there less traffic on their sites due to the blocking of the ads, but they also continue to be billed for the blocked ads because the technology resides on the devices of the users and comes into play after the ads have been served. Thus, advertisers are in the painful situation of paying for advertisements that their target audiences will never get to see. [easy-tweet tweet="Ad blocking is severely affecting the fortunes of advertisers" hashtags="ad, blocking, security, tech"] Can SEO Come to the Rescue? SEO, which is already quite valuable, becomes even more so in this context for businesses that also resort to paid advertising to generate traffic to their site. While the number of users using ad blocking tools is rising fast, even those who are not using these tools are learning to avoid and ignore the advertising, which effectively renders them blind to advertising efforts. In such a scenario, businesses need to improve their organic search results as much as they possibly can with well-focused SEO techniques rather than paid advertising, according to the head of a leading SEO company in Mumbai. Offsite SEO Offsite SEO encompasses an assortment of methods that enable better ranking of the site on search engines. The technique is largely built around backlinks.
Organic search engine rankings can get a large boost when a site can get links from other sites that have an authority in the niche in which the site is operating. This makes them less reliant on paid advertising for traffic generation. Guest posting is a good initial step to take if you do not have a strategy for link building in place. When you post articles on sites that operate in a related area you give them free content and in turn you are helped with a backlink and valuable exposure to the site’s audience. Similarly, you can engage in tactics like blog commenting, directory submissions, and social media optimization to generate backlinks that will get you more traffic and enable you to build your brand. Onsite SEO Onsite SEO not only improves the visibility of your site on the major search engines but also serves to attract and engage visitors in a way that is best suited for you. Onsite SEO revolves around useful and original content, title tags that are optimized, and Meta descriptions that are attractive to search engines. While title tags and Meta descriptions will help you to get noticed in the crowd, good quality content will help nurture and grow your audience. Conclusion Rapid changes in internet technology are changing the behavior of both customers and marketers radically. Even as the internet is becoming all pervasive, both businesses and customers are adopting techniques and tools that help them to stay on top. While ad blocking is increasingly on the rise, good SEO techniques can help marketers and customers find each other for best results. To know more, please visit this site http://gingerdomain.com ### The death of the 9–5 workday and the birth of ‘on demand’ accountants With cloud technology taking over in daily life and the workplace, the future for accountants is changing. To remain in business and stay profitable, accountants need to become increasingly accessible to their clients and substantially more productive. Let’s look at the ways technology enables us to do so. [easy-tweet tweet="With cloud technology taking over in daily life and the workplace, the future for accountants is changing. " hashtags="business, tech, cloud"] Anytime, anywhere One thing technology is bringing to every industry and to everyone across the world is convenience. When it comes to choosing an accountant, Xero’s recent State of Accounts survey found that a major factor for SMBs is responsiveness. This is about being able to easily reach an accountant and to get a quick response, benefits that technology has made possible by giving real-time access to figures and a variety of ways to keep in contact, for example using software, email or instant messaging. Xero’s report also reveals that 16 percent of small business owners expect to interact with their accountant purely through accounting software in the future, with instant messaging (10 percent) and video calls (10 percent) considered the next most likely means of communication. In fact, only 42 percent thought they would interact face-to-face at all in the future. This anytime, anywhere connectivity highlights the need for companies to change their traditional working models and end the one-size-fits-all approach. Being able to adapt can not only increase availability to clients, but also improve both client satisfaction and productivity. 
[easy-tweet tweet="Being able to adapt can not only increase availability to clients" hashtags="cloud, ondemand, tech, business"] But improved connectivity is not just for the benefit of clients, but for accountants themselves too. As cloud technology takes over, the traditional 9–5 workday is shifting. More professionals are opting to work remotely. 40 percent of accountants say that technology has made their working day more flexible and 75% believe they would be more successful if they could choose the hours they worked. Nine in ten accountants believe this increased flexibility to be beneficial to those with commitments outside of work, such as parents. Increasing value – the future of the accountant However, life as an accountant is not just about advising on the numbers – there’s much more to their skill set. A growing number of small business owners feel that automation of certain tasks will enable their accountant to add more value to their business, with one in four having asked their accountant for broader business advice at some point. This will certainly extend hours and types of services requested. Although 59 percent said they did not think they would need an accountant at all in 10 years time, the skills small business owners consider to be most important in a business advisor are trust, attention to detail and technical competence – all key qualities of an accountant. So there isn’t, and will never be, any dispute around the value of an accountant for a business. However every modern business, regardless of size or age, needs a modern accountant who knows how to future-proof a business. So ensure you understand your client’s needs, your own needs and the tools and technology that can help you continue to provide excellent accountancy services. ### Almost half of UK businesses will not exist in current form by 2021 in the wake of digital disruption In a study commissioned by Fujitsu almost half (44%) of UK business leaders admit that, in the face of rapid digital disruption, their organisation will not exist in its current form by 2021. Although 97% said their organisation has been impacted by digital disruption and 94% expect further widespread disruption, the vast majority (92%) admit their business need to evolve in order to thrive in the face of it. As a result, 61% of execs admit digital disruption is the biggest challenge they face as a business, almost half (46%) are concerned about the future of their organization in wake of digital disruption and over a quarter (26%) wish they weren’t experiencing digital disruption. It is not only businesses themselves that are evolving – entire sectors are experiencing complete transformation in the wake of digital disruption. 86% of business leaders believe their sector will fundamentally change by 2021. When assessing what external influencers are driving their response to the challenge of digital disruption, most (46%) business leaders identified customers, although this was closely followed by competitors (41%). While when asked who is leading digital disruption in their sector, over a quarter (28%) identified the leaders as “established organisations entering their sector such as Google or Amazon”. Indeed, “increased competition / new entrants to the sector was identified as the main effect of digital disruption by 45% of execs. “Digital is not only disrupting internal processes and customer service, it is changing the face of the UK business community itself. 
New entrants are coming into sectors they’ve never before played a role in and businesses are having to evolve to thrive in their new business environments,” commented Lucy Dimes, Chief Executive, Fujitsu UK & Ireland. “Digital disruption can boost revenue, make organisations more competitive and support innovation. But that potential is driving the concerns of business who fear falling behind competitors. Business leaders know they need to not only keep up but digitalise faster, with confidence, strategy and ultimately, success.” Although business leaders anticipate dramatic change over the coming years, most (80%) believe digital disruption presents exciting opportunities. Such potential benefits are driving a hunger to capitalise quickly. 86% of business leaders say their organisation needs to move faster to stay relevant in a digital world. Exploring what would make them more confident in their organisation’s ability to thrive in a digital world, the majority (46%) of business leaders said “the right technology partner”. The other top answers were: a focus on increasing digital skills in their business (42%) and increased budget to spend on innovation (38%). “Digital disruption is a huge challenge – no question. But businesses cannot bury their heads in the sand because digital is not only a powerful force, it’s unstoppable,” added Dimes. “The truth is that organisations need help to realise the potential of digital. The ability to pool knowledge, ideas and resources with a technology partner is a vital capability. If all digital stakeholders work together to navigate through this disruption, businesses will not be overrun by digital, or outrun by competitors, they will forge ahead, innovating and prospering to reap all the benefits the digital age offers.” ### Is the internet a basic human right? Freedom of expression, the right to a fair trial and freedom from discrimination. These are just a few of the fundamental rights that we consider to be necessities; things that no human being should be without. They have all also been highly prized for centuries, perhaps even millennia, and they continue to be defended to this day. However, with the important role that the internet now plays in so many aspects of our daily lives, some are beginning to ask whether this too should be considered as a basic human right. [easy-tweet tweet="With the important role that the internet plays in many aspects of our daily lives, is it a human right?" hashtags="tech, internet"] For many of us, life without the internet is inconceivable, despite the technology’s relative youth. We communicate with it, we shop with it, we bank with it and we use it to learn. For the generation defined as “millennials,” life and the internet are intertwined. However, this has also caused many of us to take the internet for granted. Around 60 percent of the world’s population remains without internet access but even in the UK where this figure is substantially lower, more needs to be done to improve online connectivity. A recent report by the Local Government Association (LGA) asserted that broadband – the high-speed form of internet that the majority of us experience – should be made more affordable to the poor. Although the report did not specify a suitable rate, it did suggest that social tariffs should be included within the government's broadband Universal Service Obligation (USO) to subsidise contracts for low-income families. 
The USO is the UK government’s attempt to ensure reliable, affordable broadband access for everyone that requires it. What this means in practice, however, is difficult to define. The LGA have stated that the USO should incorporate a minimum speed of 10mbps subsidised by ISPs, but already that figure has come under scrutiny. While 10mbps may be suitable for today’s range of digital services, it is likely to become insufficient in the near future as our reliance on the internet grows. Aside from affordability, the LGA report raises another salient point on how broadband connectivity can be improved. Nearly one in four adults (approximately 12 million people) do not possess the basic online skills required to access and use digital services. Evidently, more needs to be done to educate the UK populace, particularly given all of the benefits that the internet can provide. [easy-tweet tweet="Online access can be a great way of tackling social isolation, particularly amongst the elderly." hashtags="tech, internet"] Online access can be a great way of tackling social isolation, particularly amongst the elderly. However, this social group can often be the one most lacking in the skills and financial means to get online. The internet is also increasingly a necessity for job seekers, with many applications only taking place online – in this respect, internet connectivity can also provide career opportunities and social mobility. It is important to remember that human rights, even those that seem so fundamental, have evolved over time and as such, must adapt to new developments. In 2010 Finland became the first country to recognise internet access as a legal right for every citizen and several polls have revealed that a similar opinion exists across the globe. Digital services are already a vital part of our everyday lives, but their importance is only set to grow – if the internet is not already a basic human right, it certainly will be in the not-too-distant future. ### Automated Financial Systems Taps IBM Cloud to Expand Financial Services IBM today announced that Automated Financial Systems, Inc. (AFS), a global leader in commercial lending to top-tier financial institutions is utilizing the IBM Cloud to expand its loan processing capacity to support its overall growth and market expansion. AFS, whose clients include two-thirds of the top 25 U.S. banks and bank holding companies, turned to IBM Cloud Managed Services on z Systems which extends the scalable and security-rich processing capabilities of IBM z Systems in a hybrid cloud environment, to establish a new hosting and recovery plan that supports the 24/7 demands of offerings such as AFSVision, the only real-time, linear lending system that manages the commercial lending lifecycle from loan origination to account servicing and reporting. These newfound efficiencies have allowed AFS to expand existing client portfolios and onboard new clients into production much faster than ever before. IBM’s delivery of a highly flexible operating environment allowed AFS to double its processing capacity, and run times for several critical AFS applications were improved with IBM Cloud Managed Services on z Systems – some by as much as 100 percent. “Our decision to adopt IBM Cloud Managed Services on z Systems was made with the knowledge that IBM consistently has delivered for clients and evolved alongside them for decades,” said Mike Bogdash, Managing Director at AFS. 
“The efficiencies that we’ve received from this platform has boosted our processing capabilities and makes it possible for us to implement a client into production much quicker.” To better protect its operations and encourage higher-quality customer service, AFS also has adopted an IBM business continuity and resiliency services plan that provides the financial software company with an alternative site recovery solution in the unlikely event of a disruption. Seventy percent of financial risk managers cited cyber risk as their top concern in a 2015 survey for an event that could affect global financial systems. Financial policy leaders also have emphasized that cyber risk needs to be viewed as a major priority rather than a limited technology issue, with financial services encountering security incidents 300 percent more frequently than other industries. “AFS holds a commanding position in its industry while continuing to innovate its IT infrastructure in today’s digitized world,” said Martin Jetter, Senior Vice President, Global Technology Services, IBM. “IBM Cloud Managed Services on z Systems is designed to amplify mainframe efficiencies and offer new opportunities for AFS to respond to evolving client environments and achieve future growth.” About Automated Financial Systems, Inc. Automated Financial Systems, Inc. (AFS) is the global leader in providing commercial lending solutions to top-tier financial institutions with more than $2.2 trillion in commercial and industrial loans being processed on our solutions for banks that include two-thirds of the top 25 U.S. banks and bank holding companies. AFS works with the world’s financial institutions to build lending processes based on a straight-through model and on-demand technology and services. In doing so, AFS partners with client banks to understand their strategic goals and works proactively to achieve their business and technology objectives. AFS is headquartered in Exton, Pennsylvania, a suburb of Philadelphia; its European subsidiary, Automated Financial Systems GmbH, is based in Vienna, Austria. About IBM Global Technology Services IBM Global Technology Services offers end-to-end IT consulting and business services supported by an unparalleled global delivery network that is transforming its business to lead in an era of Cognitive and Cloud. As a cloud services integrator, GTS is managing the services and underlying infrastructure in an integrated and unified way. It is modernizing clients’ IT environments to help them meet the increasingly complex customer demands. GTS provides clients with innovative technology solutions that help them to improve their business processes and in turn, profitability.   ### Why cloud fails... and how to prevent it Everyone is talking about how best to use the cloud as part of their IT strategy. However, organisations who want to find out how to use it to scale activities, to reduce the number of support heads or to save money are asking the wrong questions; they are entering the cloud era from a traditional perspective. [easy-tweet tweet="Everyone is talking about how best to use the cloud as part of their IT strategy." hashtags="cloud, security, tech"] The multi-faceted amorphous nature of cloud means that it does not slot in directly to a traditional IT strategy. Yes, it is another tool in the IT box, but it should be used to address business requirements and not necessarily technical ones. 
It can certainly help with scale, define strategy, reduce support heads and save money – but only if it is aligned to an organisation’s business strategy, with specific goals defined and KPIs agreed before any implementation begins. Rather than use a singular cloud to solve a technical problem such as running out of disk space, or using SaaS to address application lifecycle issues, organisations should be considering how they can use multiple clouds to fulfil their business needs. That is where the true skill resides, interleaving and tuning a multi-cloud solution. One company EACS has worked with had a significant five-figure cost against on-premise backup, driving tapes around to offsite storage facilities. They also had a three or four-week turnaround for data restores and 15,000 tapes in storage, 80 percent of which were not catalogued. This is not rare; we come across it quite frequently. One of the other challenges we were talking to this company about was the fact they had very little (if any) disaster recovery provision at their HQ. Cloud seemed perfect, right? Remove the reliance on magnetic media, geographically separate backup and production data; combine this with the increase in DR capability Infrastructure as a Service (IAAS) provides and we all thought we were onto a winner. [easy-tweet tweet="Cloud seemed perfect, right?" hashtags="cloud, tech, security, future"] However, their procurement strategy only focused on the backups, not DR. Cloud fell at the first budget hurdle as it was ‘expensive’ when considered in isolation to address the backup challenges. That company still expends five figures on magnetic media management and has gone a separate route for DR. Both projects combined provide a more expensive and less flexible solution than leveraging a cloud solution. This is an example of a large company pursuing a traditional and ingrained IT procurement process without focusing on the overall business need. Combining projects and initiatives to drive value in this cloud era is not a sales strategy to inflate the deal, it is a modern approach to enterprise IT that needs careful, cross-company consideration. Many of the failures of cloud adoption we see are when existing IT budgets have been used to apply cloud to an existing IT service. This typically results in backup costs spiralling due to poorly defined retention, IaaS costs spiralling due to a ‘lift and shift’ approach to cloud migration, and Platform as a Service (PaaS) not delivering the value due to poorly understood business processes. Of all the companies we have spoken to about cloud transitions, the ones which are most successful are those that use it to address a particular business requirement. We recently created an entirely new division for a large insurer who needed to tactically separate this new venture from a compliance standpoint. We provisioned this in the cloud for 40 users within two weeks. From an SME perspective, having a cloud-enabled infrastructure can be key to absorbing any acquisitions quickly and efficiently. We work closely with an estates management company whose business growth strategy is one of an aggressive company acquisition, adding four or five businesses a year throughout the UK. They have a private cloud infrastructure which makes absorbing users, and more importantly the plethora of line of business applications, a ‘simple’ task. The challenge becomes a business one, not technical. 
In this case getting IT involved early enough in the acquisition process to plan and assess has been the key hurdle to overcome. [easy-tweet tweet="From an SME perspective, a cloud-enabled infrastructure is key to absorbing acquisitions quickly" hashtags="cloud, tech, security, business"] In closing, the first question organisations should ask themselves when considering cloud is: “What business issues am I currently facing?” Having defined these, and understood the business processes supporting them, they can then ask themselves how IT – whether traditional, cloud or a combination of the two – can best be used to address them. ### Accounting Profession Risks Its Future By Not Keeping On Top Of Tech Skills A new report on the state of accountancy reveals that the extent of change since the introduction of cloud software will lead to accountants broadening their offering beyond number crunching, into data analysis and management consultancy. Signaling a new era in accounting, the report exposes that significant numbers of accountants in Bristol think business management (54%), risk analysis (49%) and computer science (31%) skills will be needed in order to succeed by the year 2026, as automation creates opportunity for technical analysis. Two thirds of accountants in Bristol (67%) expect proficient knowledge of technology and automation in finance to be crucial to their success within five years. Furthermore, 15% feel the extent of change will be so great they will need to leave the sector if they don’t adapt to modern methods by the end of the decade. Opportunity in Bristol remains strong however, with 67% confident they can adapt to change, as accountants top the list of most trusted advisors for Bristol SMB owners (21%). The death of the 9-5 and the birth of ‘on demand’ accountants Key findings in the report commissioned by software innovators Xero include a shift in the traditional ‘9-5’ as cloud technology takes over. 44% of accountants in Bristol say that technology has made their working day more flexible and 64% believe they would be more successful if they could choose the hours they worked. A huge 95% of accountants in Bristol believe this increased flexibility to be beneficial to those with commitments outside of work, such as parents. This belief is supported by research from the Centre for Ecnonomics and Business Research which found that two thirds of mothers who stay at home with young children would go back to work if flexible hours around childcare were an option. Della Hudson, Founder of Hudson Business Accountants and Advisors is a working mum and employs eight flexible workers at her firm in Bristol: “Using technology, especially cloud based, allows us to be more flexible over time and place of working. This means that we get to recruit some really high calibre team members who want time with their family or to take more holidays because their kids have grown up. Whatever the reason we find that looking after our team helps them to look after our clients,” comments Della. Combined with technology enabling ‘anytime, anywhere’ connectivity, the options are continuing to grow for companies to change their working models and end the ‘one size fits all’ approach – increasing productivity at the same time. The report also found that a major factor for Bristol based SMBs when choosing an accountant today was responsiveness (9%), a benefit that technology brings by giving real-time access to figures and a variety of ways to keep in contact, e.g. 
through software, email or instant messaging. A new era of collaboration for SMBs Over half (57%) of small business owners in Bristol still expect to interact with their accountant via email, followed by 14% who believe they will communicate via video call and 9% by instant messaging. Interestingly, only 10% believed they will communicate with their accountant via accounting software in future. A growing number of small business owners in Bristol feel the automation of certain tasks will give their accountants the ability to add more value to their businesses, with just under one in five (18%) having asked their accountant for broader business advice at some point. Although 25% of small business owners in Bristol said they did not think they would need an accountant at all in 10 years time, the skills small business owners consider to be most important in a business advisor are trust (48%), attention to detail (46%) and technical competence (39%) – all key qualities of an accountant. As part of the full report, some SMB owners admitted to being willing to spend up to £100,000 on management consultancy to save a failing business – an area smart accountants can capitalize on. Gary Turner, Xero’s UK managing director comments: “As we head into a prolonged period of technological change in the next five years it's encouraging that many accountants see being tech savvy as a key survival skill. However, the survey also suggests that the profession needs to work harder on investing sufficient time in keeping abreast of emerging technologies, and in more effectively persuading SMBs that a close working relationship with a financial professional will be important in years to come.” ### Dark and Stormy: Shadow IT in the Cloud Data is a precious commodity for every organisation these days. But the ability to protect data, particularly in the cloud, is complex and often difficult to achieve. The challenge of securing data that’s stored in the cloud is further exacerbated by the fact that most organisations today use multiple cloud storage providers. This is supported by the findings of our recent study, which found that 89 percent of organisations use a total of 1-15 private cloud storage providers and 92 percent use 1-15 public cloud storage providers. [easy-tweet tweet="Data is a precious commodity for every organisation these days." hashtags="cloud, data, tech"] By spreading data across several cloud storage providers – both private and public – organizations can diversify their portfolio of providers and mitigate their risk in the event that service outages occur or a provider goes out of business. But the more cloud providers organisations have in the mix, the more difficult it becomes to have full visibility into the use of all providers. And when visibility is limited, that can often lead to data management errors and shadow IT. With IDC estimating that spending on cloud services will grow nearly five times faster than overall IT budgets, it’s absolutely vital that organisations take the necessary precautions to identify if shadow IT is occurring and then put the necessary processes, policies and monitoring mechanisms in place to reduce it. What is shadow IT? The term shadow IT essentially means that the IT department has had no role in helping to select and deploy services and may not know which services/providers are being used. 
As our recent study found, 26 percent of global organisations are either ‘not confident’ or ‘somewhat confident’ that their IT teams know about all cloud storage providers being used. With figures like that, it’s clear shadow IT is a serious problem and can cause serious harm to an organisation. What does the EU GDPR have to do with shadow IT in the cloud? Shadow IT in the cloud, and out of it, puts organisations at risk of a data breach, which can cause huge financial losses, legal repercussions, regulatory fines and reputational damage. Soon, however, the EU General Data Protection Regulation (GDPR) is going to up the ante even further. [easy-tweet tweet="Shadow IT in the cloud, and out of it, puts organisations at risk of a data breach" hashtags="tech, cloud, data"] The EU GDPR will require organisations to demonstrate they have controls and procedures in place to ensure personal data is protected ‘by design’, as well as demonstrating that data is not being retained longer than required. In addition, the new legislation will require businesses to appoint a Data Protection Officer (DPO), who is responsible for reducing risk, ensuring compliance, responding to requests for access, reporting data breaches and creating sound data security policies. Data security regulations do already exist worldwide, such as Principle 8 of the UK Data Protection Act, which states: “Personal data shall not be transferred to a country or territory outside the EEA unless that country or territory ensures an adequate level of protection for the rights and freedoms of data subjects in relation to the processing of personal data.” The EU GDPR’s aim, however, is to unify all current European guidelines and to force organisations to really start taking data protection seriously. If organisations do not comply with the EU GDPR, they risk being subject to an administrative fine of up to €20 million or 4 percent of their global annual turnover, whichever is greater. Whilst this may seem harsh, anyone from Ashley Madison, TalkTalk or Yahoo will tell you that a data breach is much, much worse. What is the solution? Companies trying to protect themselves from data breaches caused by shadow IT should first identify where all of their data resides - in-house, in the data centre, or in the cloud. From there, organisations need to monitor if, where and why shadow IT is occurring. It really is crucial that the IT department takes an active role in identifying which cloud services are being used within their organisations, both legitimately and covertly, by employees working independently of IT. When it comes to shadow IT, a lot of this boils down to the IT department taking responsibility for educating their organisations’ employees about what sorts of activity can put corporate data, and the overall operating system, at risk. Organisations should also monitor whether employees are installing their own WiFi hotspots onto the office’s network. If a WiFi hotspot isn’t secure, it could result in a cyber-criminal hacking into the corporate network. It’s also important to monitor the network for known and unknown devices. These are all common occurrences, but many organisations just don’t know it’s happening because they don’t think to look.
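One simple place to start looking is the outbound web proxy or firewall log. As a rough sketch – the log format, file name and the sanctioned and watchlist domains below are illustrative assumptions, not any particular product’s output – an IT team could count requests to well-known consumer cloud storage services that are not on the approved list:

```python
# Hedged sketch: flag consumer cloud-storage domains seen in an outbound proxy log
# that are not on the organisation's sanctioned list. Log format, file path and
# the domain lists are illustrative assumptions.
import re
from collections import Counter

SANCTIONED = {"sharepoint.com", "onedrive.live.com"}           # approved by IT (example)
WATCHLIST = {"dropbox.com", "drive.google.com", "box.com",
             "wetransfer.com", "mega.nz"}                       # common consumer services

def unsanctioned_usage(log_path: str) -> Counter:
    hits = Counter()
    host_pattern = re.compile(r"https?://([^/\s:]+)")           # pull hostnames out of each log line
    with open(log_path) as log:
        for line in log:
            for host in host_pattern.findall(line):
                for service in WATCHLIST - SANCTIONED:
                    if host == service or host.endswith("." + service):
                        hits[service] += 1
    return hits

if __name__ == "__main__":
    for domain, count in unsanctioned_usage("proxy_access.log").most_common():
        print(f"{domain}: {count} requests - worth investigating as possible shadow IT")
```

A real deployment would use a cloud access security broker or the proxy vendor’s own reporting, but even a crude count like this can reveal which unsanctioned services employees are quietly relying on.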
[easy-tweet tweet="It’s really important to establish guidelines for how data should be managed by cloud providers" hashtags="cloud, tech, data"] In order to monitor and reduce the occurrence of shadow IT, it’s really important to establish guidelines for how data should be managed by cloud providers, conduct frequent and unscheduled audits of each cloud provider, and assess the security of data stored in the cloud – be it in a private, public or hybrid environment. Organisations must be diligent in knowing where their data is being stored, how it’s being protected and when it needs to be removed. Following these steps, and complying with the rest of the measures dictated by the EU GDPR, will go a long way in protecting organisations from shadow IT, data breaches and a hefty fine when compliance with EU GDPR is required as of May 25, 2018 ### Cloud Services can be Secure – just not by Design Security should, hands down, be the number one concern when considering moving data to the cloud. And why wouldn’t it be? A cloud is a storage unit for all things personal, confidential and business-centric. Handing that key out to just anyone could be disastrous. The problem is that, all too often, it isn’t considered. [easy-tweet tweet="Security should, hands down, be the number one concern when considering moving data to the cloud." hashtags="tech, cloud, security"] Written in the Stars Over the years we’ve seen many fall victim to having their cloud-based data hacked. In August this year, Sage admitted to a security breach of its cloud computing systems affecting 280 companies. Dropbox has had to reset many of its users’ passwords due to a data breach that took place back in 2012. Of course, it’s not just these cloud service providers that are being targeted, individual accounts can also be in the hackers sights. Pippa Middleton (sister of the Duchess of Cambridge) recently had her iCloud account hacked. The hacker made off with 3,000 of her personal photos and had quickly offered them for sale to a popular news site for £50,000. The trove of stolen photos contained private photos that included members of the UK royal family. A high court judge subsequently ruled that the photos shall be barred from publication. Pippa is just the last in a long line of celebrities to have found her security lacking. AppRiver’s Q3 Global Security Report that looks at the threat landscape of the preceding three months showed that on any given day there were around 40 million unique threats locations – from malware, phishing messages and compromised or malicious websites and links. What can be done? Pippa’s iCloud hack should serve as a reminder to replace any weak or shared passwords you may be using. Of course, any single security defence may be flawed and there’s no point creating the strongest password possible if hackers are able to obtain it by breaching the organisation and taking the customer database. Yahoo is one company that has had to hold its hands up to this, revealing it couldn’t be sure the data was encrypted! [easy-tweet tweet="To bolster passwords, enable two-factor authentication" hashtags="security, tech, cloud"] To bolster passwords, enable two-factor authentication whenever it is offered. To avoid falling victim to phishing attacks install a spam filter. Anti-virus will help keep your system free from malware and key loggers. Perhaps the most readily available security defence is common sense. If an offer seems too good to be true, it probably is so don’t click the link. 
When visiting web pages, stay alert and look for evidence that the site is secure, such as a padlock in the address bar – most browsers now offer a colour-coded indicator to show the site has been verified – and the https: at the start of the URL is another sign. When on an unsecured network, use a VPN service to secure your connection. While each defence can be great on its own, layered security is greater than the sum of its parts. ### How can cloud services survive the next Dyn? As a result of the massive cyber attacks that took down Dyn’s managed Domain Name Servers (DNS) network on October 21st, hundreds of thousands of websites became unreachable to most of the world. DNS services are like the phone books or roadmaps of the Internet. These services maintain a directory of domain names and their corresponding IP addresses. It’s easier for people to remember a domain name than an IP address, so when a user types a web address such as Radware.com into their browser they are actually directed to 91.240.147.21. [easy-tweet tweet="DNS are like the phone books or roadmaps of the Internet." hashtags="tech, DNS, cloud"] The attack on Dyn has made it feel that no site is immune, as it took high-profile cloud services like Twitter, Spotify and Netflix offline. The problem intensified later in the day when the attackers launched a second round of attacks against Dyn’s DNS system. So how can cloud services survive such destructive attacks? Researchers have long warned about the risks of a vast majority of Internet clients centralising their networks by using a handful of DNS providers. Coupled with this problem are a large number of Internet clients using only one DNS provider for both their primary and secondary DNS. When DynDNS came under attack, those that did not use redundant DNS services found their service unavailable and their users unable to reach their websites. In many ways, it is a similar situation to the ‘cyber-domino’ effect that has been a popular method among cybercriminals over the past few years. It involves a knock-on effect tactic where the attacker will take down a company’s website and network operations by launching an attack on the hosting provider or ISP that the company relies on. Take the ISP or hosting company down and the company will be taken offline as well, as will many other companies who use the same provider and become innocent victims of the attack. The main difference with these new attacks that affected Dyn is that they used infected Internet of Things (IoT) devices that became a virtual cyber army for the attacker. Security evangelists have long been talking about the potential for IoT-driven attacks, a message that has often been met with a combination of eye rolls and scepticism. That’s likely no longer the case after these latest attacks, which also raised the issue of identifying where the responsibility for technology starts and ends. Without question, these assaults signal a new age of attacks that will force many businesses to question not only their own cyber security strategies but also those of the service providers they depend upon for availability.
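Before even reviewing providers, a quick self-check is whether your own domain’s DNS is spread across more than one provider. The sketch below – using the dnspython library, with an example domain – groups a domain’s authoritative name servers by their parent zone as a rough proxy for provider:

```python
# Sketch: check whether a domain's authoritative name servers are spread across
# more than one DNS provider - the redundancy that mattered during the Dyn outage.
# Requires dnspython (pip install dnspython); the domain below is just an example.
import dns.resolver

def nameserver_providers(domain: str) -> set:
    answers = dns.resolver.resolve(domain, "NS")
    # Group name servers by their parent zone, a rough proxy for "provider".
    return {".".join(str(rdata.target).rstrip(".").split(".")[-2:]) for rdata in answers}

if __name__ == "__main__":
    providers = nameserver_providers("example.com")
    print("NS providers:", providers)
    if len(providers) < 2:
        print("Warning: all name servers appear to sit with a single provider.")
```

Sites that had delegated to a second managed DNS service alongside their primary were generally better placed when Dyn went down, which is exactly the redundancy this check is probing for.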
Here are three key things to look for when reviewing cloud service providers to help you to establish whether they are prepared to defend against the new wave of attacks: Hybrid, automated mitigation capabilities Successfully defending a network from such a major attack requires multi-vector attack detection in an always-on fashion, along with an ability to automate the process of redirection of traffic to cloud-based mitigation resources. Be sure your provider is utilising hybrid mitigation capabilities, ideally leveraging the same technologies on-premise and in the cloud to ease automation and speed time for effective and accurate mitigation. Layer 7 attack detection In the past, most large-scale DDoS attacks have leveraged network attack vectors (Layer 3/4). However, new attacks are reportedly sending through massive amounts of HTTP floods, making most Layer 3/4 attack detection methods useless. Be sure your service provider has effective application (Layer 7) attack detection and mitigation capabilities. [easy-tweet tweet="Most large-scale DDoS attacks have leveraged network attack vectors" hashtags="cloud, tech, DDoS"] Separate network for DDoS mitigation The ideal architecture features a separate, scalable infrastructure specifically for volumetric DDoS attack mitigation where attacks can be rerouted when they reach predetermined thresholds. These DDoS scrubbing centres should ideally be located close to a major Internet peering point, providing the distinct advantage of not having to backhaul large amounts of traffic across a network backbone, which increases costs to the service provider and results in a necessity to drop certain customers who are under sustained volumetric attacks. For now, it appears the attacks have abated. However, they should stay in the forefront of the minds of cloud businesses as indicative of the direction of large cyber security attacks. It has become vital to not only reconsider your defence strategy, but also those employed by the providers you rely upon. After all, it’s not just their service that depends on it, but yours too. ### Making the Move to Cloud Services – How Businesses Can Tackle the Security Challenges Driven by a range of benefits from enhanced scalability to cost reduction to ubiquitous access to applications, the migration of businesses to the cloud is continuing apace. Indeed, according to analyst, Gartner, more than $1 trillion in IT spending will be directly or indirectly affected by the shift to the cloud during the next five years. [easy-tweet tweet="The choice of a third party cloud services provider is critical" hashtags="security, cloud, tech"]   Any business making the transition will naturally be focused on achieving optimum levels of data security. The choice of a third party cloud services provider is, therefore, critical – and there is a host of issues to consider. For example, do the terms and conditions of their prospective provider meet their requirements; and what about data sovereignty, security and even compensation if something goes wrong?    Just the simple fact of moving data to the cloud brings with it security concerns – and having a rigorous approach to encryption in place is critical in this context. Businesses need to ensure for example that any data transitioned to the care of that provider is encrypted the moment it lands rather than post-landing.  Best practice is for the business to encrypt data itself as it leaves their building. 
This ensures there are two layers of encryption – so that if one is compromised, one remains encrypted. While the choice of provider is a key upfront concern, businesses also need to decide from the outset what data they want to move to the cloud and what to retain in-house. That’s why we are seeing the hybrid cloud model becoming the de facto choice, especially for larger businesses, who see benefits in keeping more sensitive customer data on-premise. [easy-tweet tweet="businesses also need to decide from the outset what data they want to move to the cloud" hashtags="cloud, tech, IT, security"] Ultimately, the business itself needs to accept a high level of responsibility for the security of its cloud-based data, and this is especially key with regard to data access. One of the big issues for any business running hybrid cloud is: do they have a security policy that works seamlessly across both on-premise and cloud services? If somebody wants to access the business’s on-premise data, they go through a gateway: generally a VPN or front-end web server. However, if an employee tries to access data in the cloud, the business is unlikely to have any control over, or visibility of, that process. That’s because there is typically a standard way of accessing cloud services that is not necessarily in line with the organisation’s standard security policies. Many cloud services will come with username/password authentication out-of-the-box and that is likely to bring with it an element of risk. The challenge for the business is to manage and mitigate those cloud service access risks in the same way as it would its on-premise service risks. After all, cloud data belongs to the business, not the cloud service provider, and the business is ultimately responsible for protecting it. And in the age of BYOD, where many devices used in the corporate environment are unmanaged, that’s often a significant challenge. So what’s the solution? Education is key, of course. Businesses need to highlight the message that employees should take a responsible approach to data protection. They must be aware of the potential security threats and do all they can to mitigate them - from taking care of the devices they use at work to ensuring passwords are consistently strong. [easy-tweet tweet="Businesses need to highlight that employees should take a responsible approach to data protection" hashtags="tech, cloud, IT, security"] But in this new security environment, businesses also need to find technology solutions that allow them to mitigate risk. A key part of this is to step up the level of authentication that those devices require before they can access cloud data. Businesses can, for example, deploy an authentication portal or an access broker, which means that if a user wants to access data in the cloud, they have to authenticate via the business’s own domain. That critical touch point enables the organisation to establish control over data access. And they can further mitigate risk by making the authentication mechanism adaptive depending on who and where the user is; what they want to access and what devices they are using.
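As a purely illustrative sketch of such an adaptive policy – the factors, weights and thresholds are assumptions made for the sake of example, not any real access broker’s rules – the decision might look something like this:

```python
# Hypothetical sketch of an adaptive authentication decision. The factors, scores
# and thresholds are illustrative assumptions, not a real access broker's policy.
def required_authentication(user_location: str, device_managed: bool,
                            data_sensitivity: str) -> str:
    """Decide how much authentication to demand for a cloud data-access request."""
    risk = 0
    if user_location != "corporate_network":
        risk += 1                      # off-network access is riskier
    if not device_managed:
        risk += 2                      # unmanaged BYOD device
    if data_sensitivity == "high":
        risk += 2                      # sensitive data raises the bar

    if risk >= 4:
        return "deny"                  # too risky: block, or require a managed device
    if risk >= 2:
        return "password + second factor"
    return "password"

# Example: an employee on an unmanaged laptop, outside the office, opening sensitive data.
print(required_authentication("home", device_managed=False, data_sensitivity="high"))
```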
So in summary, before businesses move to the cloud, they first need to find a cloud service provider they can trust, define which services and applications they are going to transition and then put a security policy in place. But critically, across all of this process, they also need to find some form of access broker and an adaptive authentication mechanism that delivers the highest possible level of control. At that point, they will have a fully secure approach to data access in place and be ideally placed to reap the many rewards that moving to cloud services can bring.
### Atos tackles storage bottlenecks for High Performance Computing
Atos, through its technology brand Bull, launches Bull Director for HPSS, the first component of a future range of Bull Data Management software dedicated to High Performance Computing. Bull Director for HPSS optimizes current large scale storage solutions and frees up compute time for users. This launch confirms Atos’ strategic commitment to develop innovative solutions for High Performance Computing (HPC) and High Performance Data Analytics (HPDA). The beta version of Bull Director for HPSS is revealed at the SC16 Conference in Salt Lake City, Utah (USA), the largest HPC event in the world, which takes place 14 – 17 November. Bull Director for HPSS includes the following features: It is an add-on to HPSS that brings performance and ease of management to the next level. It improves the global performance of HPSS for tape libraries, in an environment with many concurrent access requests, by grouping simultaneous requests from different users in real time. It sorts all file recalls dynamically and transparently, and queues them according to their position on tapes, thus massively decreasing tape mounts and tape movements. This drastically improves HPSS performance when massive file recalls are the norm. As data sets with hundreds of terabytes or many petabytes of data become increasingly commonplace, Hierarchical Storage Management (HSM) becomes a cornerstone of successful intensive data management systems. HSM is a data storage technique that automatically moves data between high-performance / high-cost storage devices and low-end / low-cost storage devices. To optimize costs, HSM systems store the bulk of data on slow, economical devices such as robotic tape libraries, and copy data to faster disk drives whenever it is needed for processing. HPSS (High Performance Storage System) is an HSM system born from a collaboration between IBM Global Business Services and five American DOE laboratories, augmented by the French CEA/DAM. "In a context of data explosion, storage is often a bottleneck and has a negative impact on application performance. Atos has a long experience of implementing HPSS in challenging environments where long-term data preservation and re-use of massive data sets are key. Our ultimate objective with Bull Director for HPSS and the other future components is to get rid of these bottlenecks and free up compute time for users," explains Eric Eppe, Head of Products and Solutions for extreme computing at Atos. Atos at SC16 Atos will exhibit the full breadth of its HPC offer at SC16. Visit booth #721 to discover the latest Bull sequana blades, the BXI high performance interconnect, and extreme factory, the HPC-as-a-service solution from Bull. About Atos Atos SE (Societas Europaea) is a leader in digital services with pro forma annual revenue of circa EUR 12 billion and 100,000 employees in 72 countries.
Serving a global client base, the Group provides Consulting & Systems Integration services, Managed Services & BPO, Cloud operations, Big Data & Cyber-security solutions, as well as transactional services through Worldline, the European leader in the payments and transactional services industry. With its deep technology expertise and industry knowledge, the Group works with clients across different business sectors: Defence, Financial Services, Health, Manufacturing, Media, Utilities, Public sector, Retail, Telecommunications, and Transportation. Atos is focused on business technology that powers progress and helps organizations to create their firm of the future. The Group is the Worldwide Information Technology Partner for the Olympic & Paralympic Games and is listed on the Euronext Paris market. Atos operates under the brands Atos, Atos Consulting, Atos Worldgrid, Bull, Canopy, Unify and Worldline. Bull is the Atos brand for its technology products and software, which are today distributed in over 50 countries worldwide. With a rich heritage of over 80 years of technological innovation, 2000 patents and a 700 strong R&D team supported by the Atos Scientific Community, it offers products and value-added software to assist clients in their digital transformation, specifically in the areas of Big Data and Cybersecurity and Defense. ### Malware Mousetrap – how to catch a hi-tech hacker Technology is a beautiful thing. Every day, we’re celebrating a new innovative product and rejoicing as  technology becomes more and more accessible to people all over the world. We often forget, however, that with this progress comes even greater risks and, despite advances in antivirus software, launching a malware attack has never been easier. [easy-tweet tweet="Launching a malware attack has never been easier." hashtags="tech, cloud, malware, hacking"] You don’t have to look far to see that malware creation is on the rise and there are new types of malware created every single day that slides under the radar of traditional antivirus programs. From adware and spyware to zombie computers and ransomware - it seems that no device is safe, and in order to protect our data, we need a lot more vigilance than simply installing an antivirus solution. At OVH, we have been working to put the brakes on the proliferation of malware and ransomware, with the latter being especially popular over the past three years. These malicious programs infect computers and servers, encrypting their data and ransoming it from their owners, or sending it to third parties through an array of complex methods, comparable to that of offshore financing. To catch the authors of these programs, hosting providers must get clued up and start using clever approaches, often combining computer science, reverse engineering, and some good old-fashioned police work. Here are a couple of our tried and tested methods: Bait with cheese: malware traps and spam nets Based on the principle of a mousetrap - baiting undesirable rodents with a piece of cheese - hosting providers can intentionally place easily hackable machines on their networks. These machines record all activity and can help gain a better understanding of how users’ servers are compromised and what purpose they serve afterward. Here at OVH, we have created and released on the web (forums, mailing-lists…) thousands of valid email addresses – and even entire domains - so that they are available to spammers. 
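A simplified, hypothetical sketch of how the contents of such trap mailboxes might be harvested and their attachments grouped is shown below (Python standard library only; the mailbox path and threshold are illustrative, not OVH's actual tooling):

```python
# A simplified, hypothetical sketch of "raising the nets": walk the mailbox of a trap
# address, pull out attachments and group identical payloads by hash so recurring
# malware campaigns stand out for manual dissection. Paths are illustrative only.
import hashlib
import mailbox
from collections import defaultdict

def harvest_attachments(maildir_path: str) -> dict[str, list[str]]:
    """Map SHA-256 attachment hashes to the filenames seen in the trap mailbox."""
    campaigns: dict[str, list[str]] = defaultdict(list)
    for message in mailbox.Maildir(maildir_path):
        for part in message.walk():
            filename = part.get_filename()
            if not filename:                          # skip plain body parts
                continue
            payload = part.get_payload(decode=True) or b""
            campaigns[hashlib.sha256(payload).hexdigest()].append(filename)
    return campaigns

# Attachments sharing a hash across many trap addresses likely belong to the same
# campaign and are queued for closer analysis.
for digest, names in harvest_attachments("/var/spamtraps/net-01").items():
    if len(names) > 10:                               # illustrative threshold
        print(digest, len(names), "samples")
```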
All we need to do is raise our nets on a regular basis – we analyse the emails received and those containing interesting attachments are stored, grouped and dissected. This allows us to recognise current campaigns and identify those involving servers present on OVH infrastructures. [easy-tweet tweet="Hosting providers can intentionally place easily hackable machines on their networks to act as 'bait'" hashtags="tech, cloud, malware"] Follow the breadcrumbs: reverse-engineering Any good hosting provider will do everything it can to stop the propagation of malicious software and the theft and sale of data through machines which are under its authority. However, there are some cases where servers distributing certain ransomware are permitted to continue to operate temporarily in order to collect evidence. Tracking cyber-criminals is necessary, but it’s a lengthy process. Just like Hansel and Gretel, these malicious actors leave a long trail that can be traced back to the source. ISPs need to intervene and make the most of the evidence provided as quickly as possible, before the URLs sent are no longer valid, the servers involved have been returned and the malware campaign is, for the most part, over. One thing we do is reverse-engineer the malware that we’ve captured in our traps or that which has been sent to us by other security researchers. The goal is to adopt a proactive approach, capturing weak signals to identify new operating methods and pull the rug from under the cybercriminals’ feet. If we can detect malware before it can be used, it can be very off-putting for them. There have been numerous occasions where we managed to understand how the hacked servers were configured to do harm and were, therefore, able to cut them off before they were even used. After that, we never saw this strain of malware return to us again. Food for thought: educate users Finally, on the education front, there is work to be done. Often the cause of infection or invasion is down to human error – or at least a lack of vigilance. When it comes to PCs, e-mail still proves to be the biggest point of contamination, whether through malicious banner ads (malvertising) or the exploitation of software vulnerabilities (exploit kits). Regarding the servers, there are two types of offending administrators: those who leave the key in the door – i.e. using very simple passwords – and those who leave the windows open – i.e. forgetting to update the programs they use to the latest versions. Hackers are people like everyone else: they are concerned about efficiency, and with more people looking for security vulnerabilities, there are more to be found. [easy-tweet tweet="Often the cause of infection or invasion is down to human error" hashtags="security, cloud, tech, malware"] So there you have it, a few mousetraps hosting providers can set in order to identify hackers and protect both their own and their customers’ data. As technology continues to advance, malware is becoming more and more of a threat, proving more dangerous than the widely feared DDoS (Distributed Denial of Service) attack. ISPs have to start taking a proactive approach to ensure these perpetrators don’t nibble holes in their infrastructure and get away with the cheese.
### How has the cloud transformed the finance and expense industry?
The introduction of the cloud meant a transformation of the business landscape and it’s now become a mainstay in the IT industry.
Approximately 93 percent of businesses now use the cloud, according to RightScale research, highlighting the high adoption rate among UK companies. For the finance industry, the cloud has arguably transformed many of the traditional methods of management, including expenses, for the better. [easy-tweet tweet="The approximate figure of businesses using the cloud is 93 percent" hashtags="cloud, tech, business"] It saves time and money Quite simply, using the cloud is more economical – in cost and time. The cloud allows data to be shared more easily and has inevitably driven UK digital entrepreneurship over the past few years. The need for cloud-based software solutions has gone hand in hand with the rise of digital and software development companies that offer efficient solutions for businesses. According to research from Upp Technology, 30 percent of cloud users reduced costs by 40 percent in two years; expense management is just one area in which businesses can save time and money. For CFOs and FDs, cloud-based software has simplified many of the basic bookkeeping tasks and frees up their time to analyse figures, thus making it easier to see where improvements can be made. One of the key transformations the cloud has enabled for finance is the cost saved through OpEx spending. The OpEx option of the cloud means businesses don’t have to pay the high upfront fee incurred with CapEx solutions. This is a huge benefit for finance start-ups that are trying to avoid the upfront costs of setting up, such as servers, hardware and software purchases. It provides enhanced security features Some businesses have been cautious about adopting the cloud, as security is one of their biggest concerns. Many are reluctant to adopt an application that could disrupt their current operations. However, many of these concerns could be unfounded. Webexpenses’ own experience of working with this type of technology has shown that the minor disruption of initially changing to the cloud is nothing compared to the long-term investment you’re actually making for your company. The cloud also provides additional security, backing up all your files automatically, and 64 percent of cloud users believe it to be more secure than legacy systems. [easy-tweet tweet="The cloud also provides additional security, backing up all your files automatically." hashtags="cloud, tech, security"] The many recent data hacks into high-profile companies have made businesses and customers warier than ever about sharing their personal information online. However, research highlights that many of the attacks were carried out in on-premise environments rather than the cloud. Security will likely remain a key concern when a company decides to move its data to the cloud, but the level of risk mostly relates to the behaviour and culture of the employees rather than the software. The cloud is the future of business and has proved itself a secure option for businesses and their customers. Remote and immediate access One of the biggest impacts of the cloud for the finance and expense industry, in particular, is the remote and immediate access it provides. It can be accessed on numerous devices, so employees who want to expense their lunch or travel ticket are able to scan the receipt with their phone remotely. This eliminates the chances of losing the paper receipt and makes the process immediate and efficient.
Not only is this a benefit for the employee, but it allows the business to have visibility and control over areas such as travel spending, as well as ensuring there’s a system in place for timely reporting. This real-time access brought the cloud and mobile together, taking away the challenges of geography or time zones that businesses previously faced. Webexpenses’ cloud-based software enables businesses to provide efficient financial reporting across regions as easily as if they were round the corner from the office. It reduces your paper footprint The ability to use cloud software on different devices has also had an impact on reducing the amount of paper that businesses are using, particularly within finance teams. For example, webexpenses offers a petty cash management solution to reduce the amount of paper and loose receipts, as everything is moved digitally with receipts scanned in. These changes to even the smallest aspects of a business can have a big impact on the money and time saved. [easy-tweet tweet="webexpenses offers a petty cash management solution to reduce the amount of paper and loose receipts" hashtags="tech, cloud, security"] Recent research has shown that 37 percent of SMEs are using apps in their day-to-day business and 64 percent are running their operations via cloud technology, replacing traditional paper-based ways of working with faster and more efficient digital methods. This inevitably has an impact on instant Management Information, as the reporting capability of a digital cloud system, in comparison to the manual paper process, is vast. Information is more easily stored and updated online and across devices rather than being limited by manual paper processes. The increasing cloud adoption rate leaves no doubt that the cloud is the future for businesses. The enhanced security features, efficiency, and access have transformed the day-to-day running of business across all industries and we will continue to see the growth of cloud-based software solutions. Whether this is the end of the road for traditional on-premises hardware and storage remains to be seen.
### Alcatel-Lucent Enterprise solution brings next-gen UC&C to SMBs through simplified hybrid cloud model
ALE, operating under the Alcatel-Lucent Enterprise brand, today introduced a new generation of small-medium business (SMB) solutions providing access to advanced cloud services. These new solutions 'make IT simple' for SMBs to use and for the channel partners serving them. Built on the new robust Alcatel-Lucent OXO Connect R2 communications server, the latest Alcatel-Lucent OpenTouch® Suite for SMB communications offer will address the flexibility SMBs require for the modern workplace. The new OXO Connect offers a hybrid system creating a future-proof investment for cloud services, along with an end-to-end voice, data and Wi-Fi converged system. Alcatel-Lucent Enterprise channel partners will be able to reach a broader base of customers with increased capacity of up to 300 users, while new tools will provide partners with secure remote management of customers’ networks. The introduction of a simplified quotation and licensing management program has been designed to make IT easier for SMBs, and a simple licensing model will accompany the customers’ update, providing a software assurance guarantee for three years of support. New key features: Unique telephony licensing allows channel partners to better manage installed configurations.
It’s coupled with a future-proof solution to protect investments, covering all software upgrades for customers for a limited time and making technology advances cost effective. Cloud services on OXO Connect: Alcatel-Lucent Rainbow™ UC cloud services connect people and systems, creating an integrated and innovative cloud-based collaborative workplace for business users and their business contacts. Alcatel-Lucent Cloud Connect Agent will provide remote access for channel partners to deliver support and services. Hospitality markets benefit from the increased capacity of up to 300 guest and administrative phones. Additionally, the new VTech analog phones better serve the needs of 2-3 star hotels. Enterprise-grade, low cost Wi-Fi is now achievable with the new Alcatel-Lucent OmniAccess® WLAN Access Point AP-1101 wireless LAN technology designed specifically for SMBs. The AP-1101 is based on controller-less technology and provides enterprise-class features such as auto-discovery, dynamic RF management, roaming, VoWLAN and QoS support for real-time applications. Damien Delard, head of the SMB business unit at ALE, commented, “With more than 850K solutions sold and over 18M users, Alcatel-Lucent Enterprise SMB solutions have been proven to meet the needs of SMBs. This latest generation of solutions delivers simple, robust and connected capabilities that demonstrate the continued commitment of ALE to advancing communications for SMBs. The new OXO Connect solution increases customer satisfaction and minimises costs for businesses.”
### Should Salesforce users bolt to Lightning?
Salesforce Lightning has been described as the most significant update to its user interface (UI) in the company’s history, but has lightning struck twice for its customers? Salesforce Lightning is a next-level platform designed to replace Salesforce Classic. Its promise is to offer a UI that is more flexible, mobile-friendly, and appropriate for supporting today’s data-driven enterprises than its previous incarnation. [easy-tweet tweet="Salesforce Lightning is flexible, mobile-friendly, and ready for today’s data-driven enterprises" hashtags="tech, UI, cloud,"] Last year, Salesforce’s Marc Benioff said Lightning is “The biggest change ever done to the company.” That’s a lot of change to take in. After all, with change comes risk. Is there enough that needs updating to justify dealing with the challenges of change? Appearances count and so does usability. Getting the UI right is a critical factor in securing user adoption of new technologies, and driving positive business outcomes. Data from our recent report, The State of Salesforce, illustrates this: when salespeople believe they can run an entire sales cycle from their phone, their companies are three times more likely to see cost reductions and twice as likely to see revenue gains attributable to Salesforce. Whilst Salesforce Classic was at the forefront of the cloud revolution, today’s mobile-centric users are looking for intuitive and more easily customisable UIs. Designed with feedback from 150,000 Salesforce customers globally, Salesforce Lightning is intended to provide that. Custom development becomes less code-dependent. Users can modify and personalise dashboards, as well as drag-and-drop new components from third-party providers in the Salesforce AppExchange. It is all far more user-friendly and adaptive, an important step in driving a more intuitive user experience. That is the big picture.
Drilling down to specifics, Lightning is actually an umbrella term for a broad range of offerings, including:
- Lightning Experience – a set of modern user interfaces (UIs), including UIs for the Salesforce1 Mobile app and template-based communities
- Lightning Component Framework – a JavaScript framework along with a set of components that developers can use to customise the Lightning Experience
- Visual Building Tools – tools that provide drag-and-drop simplicity for app-building and customisation
- Component Exchange – a subset of AppExchange that offers dozens of partners’ Lightning components
- Lightning Design System – style guides and user experience best practices for both the Lightning Experience and the Salesforce1 mobile app

However, it is important to bear in mind that Lightning is also a work in progress. Not all of the core features of Salesforce Classic are yet available on the current iteration of the Lightning platform. This is a phased migration with new elements being added on an ongoing basis. [easy-tweet tweet="It is important to bear in mind that Lightning is also a work in progress." hashtags="cloud, tech"] Customers need to decide what pace to move at and when to do so. What capabilities do you need from Salesforce? If there are core functions and features in Salesforce Classic upon which your organisation is dependent, then due diligence needs to be done on whether they are currently available within Lightning. If they’re not, then you may decide that now is not the right time to move away from Salesforce Classic, or that you’re prepared to work around these shortfalls. Whilst Lightning is clearly the preferred destination for Salesforce, it is possible to use the new features of Lightning but switch back and forth to the Salesforce Classic interface if those missing features are needed. It is up to individual users to decide if that’s an acceptable trade-off in order to tap into the benefits of the updated UI. For example, in the Sales Cloud, users can only add, edit or delete information on one account team member at a time. If you want to change team member display order or add multiple account team members on a single page, you’ll need to switch back to Salesforce Classic. Is that a major hassle for your sales team? Or is it something they can live with, for now, balanced out by benefits elsewhere, such as being able to associate a single contact with multiple accounts or being able to coordinate a sales team’s access to accounts by setting up an account team in Lightning Experience? The same is true of providers of third-party apps that integrate with Salesforce. Many of the leading app providers, such as FinancialForce and Apttus, have already delivered or are poised to deliver Lightning versions of their offerings. It is easy enough to check the status of apps by filtering for “Lightning ready” in the AppExchange. Increasingly, large parts of the Salesforce portfolio will come Lightning-ready. For example, the recently-announced Financial Services Cloud comes with the Lightning Experience UI and can be integrated with Salesforce Classic. Other clouds born out of Salesforce Classic, such as the Marketing Cloud, have had Lightning makeovers. Most recently, Salesforce has added Lightning Voice into the Sales Cloud, allowing sales staff to dial a number from a keypad within the application and access a note-taking section with details of the call (who was called, duration, and number) automatically logged.
For Salesforce, this is an agile development journey that will continue for some time. For example, next year will see Salesforce’s Professional Edition, Enterprise Edition, and Unlimited Edition for Sales Cloud and Service Cloud replaced by the new Lightning Professional Edition, Lightning Enterprise Edition and Lightning Unlimited Edition for Sales Cloud and Service Cloud. The good news is that the nature of Salesforce’s SaaS model means that existing Sales Cloud and Service Cloud customers will automatically receive the capabilities and features of the new Lightning Editions on a rolling schedule. However, whilst expounding an ‘as easy as flicking a switch’ message when it comes to moving to Lightning, Salesforce itself recommends a three-phase approach to the realities – learn, launch, iterate. Critical to this is having pilot users, defining criteria for success, and using a dashboard to evaluate feedback and track metrics. [easy-tweet tweet="Salesforce itself recommends a three-phase approach to the realities – learn, launch, iterate." hashtags="tech, cloud, IT"] At Bluewolf, we have completed several Classic to Lightning migrations to date, and this will only increase over time; but the pace of the ramp-up will depend on feature usage. Determining whether Salesforce Lightning is the solution for your organisation right now is a decision that will vary from company to company and one that may benefit from third-party advice, but the future of Salesforce’s UI lies with Lightning.
### Is the domestic IaaS provider dead?
Gartner’s August 2016 Magic Quadrant for Cloud Infrastructure as a Service (IaaS) has two players in the Leaders quadrant, seven players in the Niche Players quadrant and only one Visionary. There are no prizes for guessing who the two leaders are, and maybe only a small prize for guessing that Google is the visionary. Whilst this particular report is aimed at global providers of IaaS and excludes UK-only IaaS providers, it raises a very interesting question: does the UK-based provider of IaaS have a future? [easy-tweet tweet="Does the UK-based provider of IaaS have a future?" hashtags="tech, cloud, IaaS"] I believe the short answer is ‘yes’. However, the way Pulsant and other domestic providers of IaaS go to market is changing and in many respects has changed already. Microsoft has announced a UK data centre, as has Amazon, so one of the chief differentiators for local providers until now – data sovereignty – has disappeared overnight. No longer is the local IaaS provider able to claim that its cloud is located in the UK while the big players are offshore. Why wouldn’t everyone then just go to Microsoft and Amazon? The premise of hybrid IT is that you, as the customer, can manage multiple workloads in multiple environments, whether physical or virtual. You may be consuming SaaS services, so have no need of infrastructure, but your revenue-generating application – your IP – sits in-house on your own infrastructure. Your application continues to be developed by your people, but using public cloud resources, and many of your back-office requirements are based in a public cloud. This hybrid landscape is not uncommon and is becoming increasingly complicated. The key to unlocking the power of hybrid IT is the use of multiple clouds, which then raises the question of how to manage them. The global colocation providers offer cloud exchanges and the ability to connect to multiple clouds, which is great if you are a large global enterprise with a significant IT team.
In this scenario, your network overlaps into the data centre and you have access to multiple cloud providers. Your extensive IT team is able to utilise the low latency connectivity and manage the different workloads in the different clouds. [easy-tweet tweet="The niche providers and the local providers will re-invent themselves as deliverers of managed IaaS." hashtags="tech, cloud, IaaS"] For UK midmarket organisations that do not have an extensive IT team, this option isn’t feasible, which is why the domestic IaaS provider isn’t dead. The niche providers and the local providers will re-invent themselves as deliverers of managed IaaS. These companies will offer a portfolio of managed public cloud services, removing the need to build and maintain their own IaaS platform, which is a cash-hungry, expensive beast to manage and grow. The new managed IaaS provider is going to use the size, scale and location of the two big providers to deliver to the UK mid-market a platform from which organisations with small IT teams can harness the power of the cloud. This new model outsources the management of the environment and provides the opportunity to connect flexibly to multiple different cloud sources. The managed service provider owns the SDN platform from which you can choose which cloud you want to utilise. The challenge for the new managed IaaS provider is how to provide a lens through which the enterprise is able to view and utilise its cloud resources. You need to be able to manage all of your applications through this pane of glass, and those organisations that today have their own IaaS platforms will over time wind them up. Globally, 81 percent of McDonald’s restaurants are franchised, yet you would never know when you go and eat in them; the experience is the same the world over and the franchise ‘manages’ your experience. Is there any reason why IaaS shouldn’t go the same way? After all, all Microsoft and Amazon are doing is providing a franchise model.
### Linux Foundation Releases 2016 Guide to the Open Cloud
The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, today announced the release of its 2016 report “Guide to the Open Cloud: Current Trends and Open Source Projects”. This is its third paper on the open cloud and adds many new projects and technology categories that have gained importance in the past year. The report covers well-known projects like Cloud Foundry, CloudStack, Docker, KVM and OpenStack, and up-and-comers such as Ansible, Hygieia, Prometheus and Rancher. The purpose of this guide is to help companies stay informed about the latest open source cloud technologies and trends. It takes a detailed look into cloud infrastructure, including projects in the areas of IaaS, PaaS, virtualization, containers, cloud operating systems, management and automation, unikernels, DevOps, configuration management, logging and monitoring, software-defined networking (SDN), software-defined storage and networking for containers. New this year is the addition of a section on container management and automation tools, which is a hot area for development as companies race to fill the growing need to manage highly distributed, cloud-native applications. Traditional DevOps CI/CD tools have also been collected in a separate category, though functionality can overlap.
“Open source software is prevalent in the cloud and is often the preferred choice for new infrastructure technology deployments,” said Mark Hinkle, VP of marketing at The Linux Foundation. “The Guide to the Open Cloud was created to provide an overview of the latest open source software used to deploy and manage cloud deployments as well as illustrate emerging design patterns including those utilizing containers, cloud-native applications and microservices.” The Linux Foundation’s work with numerous cloud computing companies and projects (Cloud Foundry, Cloud Native Computing Foundation’s Kubernetes) and engagement at its ContainerCon, ApacheCon and MesosCon events helped shape the paper. For ease of reading, each category includes 15 or fewer projects, evaluated by the project’s origins, age of the project, number of contributors, number and frequency of commits, diversity of contributions, exposure, demonstrated enterprise use, and expert opinions from IT practitioners. All profile data were collected from public sources, including project websites and source code repositories.
### Choosing the right ERP software
Making the decision to roll out enterprise resource planning (ERP) software usually comes after a long process of deliberation for any business. Opting to upgrade is not always as easy as flicking a switch, and requires an element of input from each department to be successful. However, it’s also important for businesses to ensure the solution they choose is right for their business, as the increasing number of choices available caters to a variety of specific requirements. [easy-tweet tweet="Opting to upgrade is not always as easy as flicking a switch." hashtags="tech, cloud, IT"] With so many ERP options available, choosing the right package can be an overwhelming task. Many companies wrongly assume that any software will work well for their business, bringing with it the benefits that they have undoubtedly heard so much about - but failure to do some research ahead of the purchase could prove disastrous. So, what is the most efficient way of finding the ERP solution that is best suited for your business? We have outlined several steps below. Review your processes ERP is a business initiative and should be treated accordingly. Therefore, it is important to define and document any current business processes, including strengths and weaknesses. While conducting this research, take some time to consider what your methods should look like in the future, and what requirements you are likely to have too. Doing this will help you to evaluate your company’s position both now and in the future, helping you to find a solution that is well suited to your needs. Consider the technical fit of ERP It is essential to consider how the rollout of ERP will align with your current IT infrastructure. For example, if you are accustomed to using one type of solution, it is best to opt for a solution that will support and work alongside this. But remember, technology is evolving and adopting legacy software could pose real problems in the future. Calculate the total cost of ownership Some businesses put off the rollout of ERP due to the associated costs. However, if you take some time to calculate this early on in the buying process, there shouldn’t be any surprises later on. It is much easier to accept potential costs at the beginning of the process, rather than after you have committed to a particular solution.
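As a purely illustrative back-of-the-envelope sketch (all figures below are hypothetical, not drawn from any vendor's pricing), a simple five-year calculation shows how quickly costs beyond the headline licence fee add up:

```python
# A back-of-the-envelope TCO sketch with entirely hypothetical figures: the licence
# fee alone understates what an ERP rollout really costs over a five-year horizon.
YEARS = 5

licence_per_year       = 40_000   # subscription or maintenance fees
implementation_one_off = 60_000   # consultancy, data migration, configuration
support_per_year       = 12_000   # help desk and vendor support contracts
upgrades_per_year       = 8_000   # upgrade projects and re-testing
training_one_off        = 15_000  # initial end-user and administrator training

total_cost_of_ownership = (
    implementation_one_off
    + training_one_off
    + YEARS * (licence_per_year + support_per_year + upgrades_per_year)
)

print(f"Headline licence cost over {YEARS} years: £{YEARS * licence_per_year:,}")
print(f"Total cost of ownership over {YEARS} years: £{total_cost_of_ownership:,}")
# Headline licence cost over 5 years: £200,000
# Total cost of ownership over 5 years: £375,000
```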
Furthermore, take some time to discover any hidden costs - including implementation costs, software support, software upgrades, and software maintenance. [easy-tweet tweet="It is possible your company is viewing ERP as a way of reducing costs" hashtags="tech, cloud, ERP"] Keep an eye on the business benefits Many businesses fail to recognise the impact ERP has on their operations because they do not measure the specific changes. It is possible your company is viewing ERP as a way of reducing costs, increasing revenue or scaling for growth, and you should estimate and measure benefits against these metrics if you are going to utilise ERP to its fullest potential. Evaluate your options Keep your options open when selecting an ERP solution. As we mentioned earlier, there are more vendor options available than businesses may initially believe; therefore it is important to consider every possible option when carrying out that all-important initial research. A considerable number of companies choose a solution based on the brand name, or what their competitors are going for, and fail to truly consider what will benefit them specifically. Organisations need to consider the options that are going to best meet their unique requirements and bring a competitive advantage. Ask for advice It can be all too tempting to believe every word uttered by the vendor when it comes to ERP, but we’d recommend looking further for advice. Speak to your colleagues, employees and other business contacts about what they would recommend. Also, take the time to conduct research online and look for failures as well as successes when it comes to each option. If you have little or no knowledge when it comes to ERP, find other sources of advice like industry analysts or publications. [easy-tweet tweet="A considerable number of companies choose a solution based on the brand name" hashtags="tech, cloud, IT, ERP"] Manufacturing businesses on the lookout for the right ERP solution are advised to carry out as much initial research as possible, delving deeper into the market and how this software can help not other businesses, but you. Seek software demonstrations from multiple vendors and choose the right fit for your business based on your needs. In the end, your ERP will only be a success if you can see and measure the benefits of the solution for your manufacturing operations.
### Huawei Jointly Completes the Launch of a Cloud Ready Data Center with UAE's Largest Offshore Oil Producer ADMA-OPCO
Huawei and the Abu Dhabi Marine Operating Company (ADMA-OPCO), a major producer of oil and gas from the offshore areas of the Emirate of Abu Dhabi, jointly announced at the Huawei Global Energy Summit that ADMA-OPCO’s Cloud Ready Data Center is now fully operational. The Cloud Ready Data Center is expected to help ADMA-OPCO address the needs of long-distance transmission and processing of a massive volume of data during offshore oil exploration and production. The Cloud Ready Data Center will also enhance the security of critical business data and applications for ADMA-OPCO. As one of the largest offshore oil and gas companies in the Gulf region, ADMA-OPCO is responsible for the production and exploration of a number of offshore oilfields in Abu Dhabi, producing several hundred thousand barrels of oil per day.
With the expansion of its business, ADMA-OPCO plans to upgrade its three data centers into a future-oriented cloud data center by integrating existing IT systems, so as to improve IT resource utilization and service response capabilities. This would allow ADMA-OPCO to cope with the rapid growth of data volumes and reduce the operating costs of data centers. Huawei has helped ADMA-OPCO to construct a cloud data center based on the Information Technology Infrastructure Library (ITIL) standards, which integrates the core architectures of data center management to implement centralized management of both cloud and non-cloud data centers. The Cloud Ready Data Center features centralized IT resources, as well as collaborative management of blade servers, and storage and network equipment, realizing automatic provisioning of computing resources and scalable cloud data services required to support future-proof, smooth capacity expansion. As a security mechanism, geographic disaster redundancy between two sites 200 km apart has been implemented to ensure zero loss of key data generated by Enterprise Resource Planning (ERP), Oracle, and email systems. Speaking at the launch ceremony, Dr. Alaeddin Al-Badawna, ADMA-OPCO's Chief Information Officer, said, "The Cloud Ready Data Center we built with Huawei supports seamless integration with our existing infrastructure, which maximizes our IT operation efficiency, optimizes IT resource management, and boosts IT resource utilization. We expect to see a 30% drop in the operation and maintenance (O&M) costs of the data center. “Virtualization-based cloud computing can ensure high service continuity and greatly facilitate our business expansion. In addition, Huawei's geographic disaster redundancy technology provides enhanced security for our key business data and applications," he continued. He Tao, President of Huawei Middle East Enterprise Business Department, said, "Huawei has worked closely with ADMA-OPCO to clearly understand their requirements and provide an end-to-end overall cloud data center solution to build a simplified, open, and elastic data center for ADMA-OPCO. This Cloud Ready Data Center will support more flexible services and applications, helping ADMA-OPCO excel amidst fierce market competition. “The commercial rollout of this Cloud Ready Data Center is a testament to Huawei's understanding of the specific needs of this particular business and positions Huawei as a leader among data center solution providers for global high-end oil and gas customers, as well as accelerating Huawei’s technological influence in this area and more." Huawei has been providing data center integration services and solutions for customers since 2002. The company has over 14 years of experience in research and development, delivery, and the operations and maintenance of data center solutions. Innovative architectures and solutions developed by Huawei include distributed cloud data center (DC²) architecture, private cloud data centers, public cloud data centers, service continuity and disaster redundancy, data center management solution ManageOne, and data center consolidation and migration. As of August 2016, Huawei has helped global customers in various industries deploy more than two million virtual machines and 830 data centers, including 420 cloud data centers.
### BREXIT: Seeking refuge in the cloud
The ramifications of Britain’s decision to leave the EU are, as yet, unknown.
While a weaker pound might be good news for exporters, the reality is that the vast majority of our technology is imported. US vendors generally price their hardware and software costs in US dollars. The pound is at a 31-year low against the dollar, and this currency chasm is forcing technology providers to re-evaluate their pricing strategy in both the UK and Europe, the Middle East, and Africa (EMEA). If you consider that 17 out of the top 20 enterprise technology providers in the world are headquartered in the United States, it doesn’t take a world-class economist to realise that IT spending will be impacted over the coming years. [easy-tweet tweet="It doesn’t take an economist to realise that IT spending will be impacted over the coming years." hashtags="tech, economy, cloud"] Not only are price rises inevitable, but they’re already happening. Hewlett Packard, Dell, and other leading vendors have already introduced blanket price increases in the UK. Experts have suggested that it’s only a matter of time before Microsoft follows suit. Weathering the storm In times of financial uncertainty, it makes sense for businesses of all shapes and sizes to review their spending strategy. Organisations should be doubling down on cost optimisation efforts, in order to save money, simplify operations and speed up time to value. In that vein, there has never been a better time to consider cloud-based services such as Office 365. Frisby comments: “Cloud services are ideally suited to helping companies deal with uncertainty. If IT is in need of a refresh, it makes financial sense to move to the cloud and an OPEX model, rather than making a significant CAPEX investment.” Office 365 from Cobweb virtually eliminates capital costs and provides pay-as-you-go flexibility. By shifting your communications infrastructure off-premises, and adopting a subscription-based model, your cash-flow will never be threatened by unforeseen events or volatile market conditions. By signing up to Office 365 now, you can effectively lock your company into existing prices, should Microsoft decide to increase subscription costs in the future. “There is a 12-month price guarantee with all Microsoft cloud services,” says Frisby. “So if you buy today, the price you pay is set for the next 12 months.” Keep calm and carry on We all know that uncertainty is poison to the markets. And as the UK will remain part of the EU for at least the next two years, that uncertainty isn’t going anywhere fast. As legislators begin to hammer out agreements over trade relationships, the long-term implications will slowly reveal themselves. [easy-tweet tweet="We all know that uncertainty is poison to the markets." hashtags="tech, cloud, economy"] In the here and now, it is essential that organisations keep calm and carry on. However, it is also important that business leaders look to optimise costs and drive efficiencies wherever possible, so that they can focus on steering the company through these uncertain times. With low capital expenditure and flexible scaling, Office 365 from Cobweb provides organisations with a fully managed, next-generation productivity platform. And with US technology firms such as Microsoft in the process of reviewing their UK pricing strategy, businesses would be well advised to act quickly, in order to circumvent the inevitability of price rises. ### CenturyLink reaches agreement to sell data centers to a consortium led by BC Partners and Medina Capital CenturyLink, Inc. 
today announced that it has entered into a definitive agreement to sell its data centers and colocation business to funds advised by BC Partners, in a consortium including Medina Capital Advisors and Longview Asset Management. In exchange, CenturyLink will receive $2.15 billion in cash, subject to offsets for capital lease obligations and various working capital and other adjustments, and a minority stake to be valued at $150 million in the consortium’s newly-formed global secure infrastructure company, which in total represents a multiple of approximately 12 times estimated 2016 Adjusted Operating Cash Flow.   CenturyLink plans to use the net proceeds from this sale to partly fund its acquisition of Level 3 Communications announced on October 31, 2016. These two transactions further advance CenturyLink’s strategy to improve lives by connecting people to the power of the digital world.   “After conducting a thorough review process, we are pleased to have reached an agreement with BC Partners,” said Glen F. Post III, chief executive officer and president of CenturyLink. “We believe this transaction will benefit customers, employees and investors. Both CenturyLink and BC Partners have a strong customer focus and are committed to ensuring a seamless transition of the customers and their colocation environments.” CenturyLink will continue to focus on offering customers a wide range of IT services and solutions, including network, managed hosting and cloud. Though it will no longer own the data centers, CenturyLink will continue to offer colocation services as part of its product portfolio through its commercial relationships to be entered into at closing with the BC Partners/Medina-led consortium. Justin Bateman, a managing partner at BC Partners, said “We are excited to be acquiring CenturyLink’s portfolio of data center assets. CenturyLink has built and maintained an impressive global footprint of colocation data centers that is unparalleled for a portfolio of assets of this size. Led by Manny Medina and his management team at Medina Capital, these data centers will become part of a new, global secure infrastructure platform that will meet the growing and changing needs of customers today and for the future. We thank Glen Post and the entire team at CenturyLink for their partnership, and we look forward to working together to offer all the data centers’ existing customers, as well as new customers, unrivaled datacenter and colocation services.” Under terms of the agreement, the BC Partners/Medina-led consortium will assume ownership of CenturyLink’s portfolio of 57 data centers at closing. The data center portfolio includes approximately 195 megawatts of power across 2.6 million square feet of raised floor capacity. The purchase agreement contains various customary covenants for transactions of this type, including commitments of CenturyLink to indemnify the purchaser for certain taxes and other specified matters, subject to certain limitations. The parties anticipate closing the transaction in the first quarter of 2017. The transaction is subject to regulatory approvals, including filings under the Hart-Scott-Rodino Antitrust Improvements Act and a review by the Committee on Foreign Investment in the United States, as well as other customary closing conditions.  BofA Merrill Lynch, Morgan Stanley & Co. LLC, and Wells Fargo Securities, LLC are acting as CenturyLink’s financial advisors and Jones Walker is acting as CenturyLink’s legal advisor.   
LionTree Advisors acted as financial advisor to BC Partners and its consortium investors. Latham & Watkins LLP is serving as legal advisor and PricewaterhouseCoopers is serving as accounting advisor. Citigroup, JP Morgan, Barclays, Credit Suisse, Jefferies, HSBC, Macquarie and Citizens have underwritten the debt package to finance the acquisition. Greenberg Traurig served as legal advisor to Medina Capital. Vedder Price served as legal advisor to Longview Asset Management.
### The many shapes and sizes of the cloud
Bang and Olufsen is a high-end luxury brand within the consumer electronics space. We often say there’s an element of quality to our products. When a customer enters one of our Bang and Olufsen stores, they expect an exceptional customer experience to match the quality of our devices. [easy-tweet tweet="Bang and Olufsen is a high-end luxury brand" hashtags="tech, cloud, consumer"] This is why it is vital for us to replicate this level of service on our eCommerce site. Three years ago, we opened our online store and we did just that. With our mandate to uphold a high level of customer satisfaction at the forefront, we knew what we had to do. Facilitating smooth transactions, implementing quick and easy checkouts, and ensuring zero downtime and quick load times were essential to the plan. Our site is rich in content, with videos and high-resolution imagery that aesthetically showcases and demonstrates the quality of our products. This enables our customers to view and experience the devices online in a truly comparable way to how they would in person in our stores. Ensuring we have the best technology infrastructure underpinning the business is crucial. This can take many shapes and sizes. One of the most important factors rests on the decisions we make with regard to which technology we choose to keep in-house and the elements we outsource to companies that have a core capability in any given area. It’s a business decision. When you compare the cost and investment required to run certain operations in-house, it might make more sense to choose to outsource. Finding the right outsourcing company – one that you can trust and rely on – is critically important. While many companies deliver on all the technical aspects, which are no doubt very important, few deliver to the same standard in their customer service. [easy-tweet tweet="Compare the cost needed to run operations in-house, it may make more sense to outsource." hashtags="cloud, tech, "] Proactivity is the key differentiator when setting apart a good cloud outsourcing company from a great one. When our partner identified our priorities and needs before we had even identified them ourselves, we knew we had found a great cloud outsourcing company. We have a dedicated support team available whenever we need it; they engage in monthly discussions and participate in behind-the-scenes meetings, where we get to see new developments and roadmaps. We have used the hosting capabilities of Cogeco Peer 1 for six years now, and the power, performance and resilience provided are essential in supporting mission-critical elements of our business. For instance, Cogeco Peer 1 hosts a VMware cluster which is fully redundant and delivers 100 percent uptime. We have a hosted production, testing and disaster recovery set-up. Meeting all business requirements is crucial, but so is a commitment to the values of the brand.
Once we know we have the technical capability in place, we then have a solid platform from which to expand hosted services as appropriate.
### 4 ways to ensure your business is cloud-security conscious
While the ever-expanding impact of the cloud is transforming organisations, we must also address a series of important questions if a business is actually going to succeed. Most importantly, as the cloud’s influence continues to permeate through workforces that are striving to be more accessible and more competitive, how can we guarantee that businesses find the equilibrium between maximising the benefits of the cloud and ensuring that teams have the specialist knowledge to protect themselves from risks while doing so? [easy-tweet tweet="It appears that security has become less of a priority" hashtags="cloud, tech, security"] To move to the cloud, organisations have channelled efforts into attracting IT professionals with the right technical expertise. In the meantime, it appears that security has become less of a priority, as shown by a recent study from the IT consultancy Company85. The research found that 95 percent of Company85’s customers used the cloud for hundreds of services, but only 7 percent of enterprise companies were confident they met their security needs. Now is the time to address this balance, and here are four steps to help you do so successfully. Step 1: Recognition The first step is recognition. It is crucial for businesses that operate in the cloud to acknowledge that security risks are intrinsic to this way of working. While IT systems have always been subject to cyber threats, cloud computing heightens the risk that databases face from security attacks, including hacking, phishing and ransomware. These attacks are becoming increasingly sophisticated, enhancing the need for awareness. Step 2: Security Measures Many of the top 12 cloud computing threats that organisations face today, as identified by the Cloud Security Alliance, could be avoided through improving IT processes. For example, using multifactor authentication and encryption can protect against data breaches. You can help to prevent hacking by reinforcing your API and interface security, which can be achieved, in part, by having the right modelling applications in place and conducting thorough testing. Permanent data loss and data theft could be considered the most damaging security breaches, with data theft incidents costing UK businesses £6.2 billion per year, according to the internet service provider Beaming. Distributing data across multiple zones and ensuring you have adequate data backup measures are just a few ways to avoid data loss. Taking initial steps to mitigate such risks will bring your team closer to operating efficiently within the cloud. Security should be front of mind for every IT professional. Essentially, collaboration and communication are pivotal to successful safety management and, while adopting a DevOps-focused approach will help, the primary challenge facing CISOs and leadership teams is how to instil the security message without it becoming stale. Step 3: Collaboration The third thing you can do is to establish a unified approach between your IT team and the wider workforce. Developing an educated security culture within your organisation will create awareness of safe security practices throughout your entire workforce.
By engaging employees about how to spot and report threats, you can protect against dangerous security breaches or even stop a potential attack before it disrupts your system. [easy-tweet tweet="Security should be front of mind for every IT professional." hashtags="tech, IT, cloud"] However, security methods are not only the responsibility of IT departments. Establishing a united approach between your IT team and the rest of your employees has to be a mutual relationship if it is to succeed. The IT team can provide critical education and expertise, but your workforce should be actively reporting on suspicious circumstances. Feedback from IT teams to those reporting on issues will encourage employees to continue raising the flag for future suspicious activities that could negatively affect your business’s digital safety. The whole organisation needs to play its part in tackling these issues. Step 4: Developing new skills All in all, as more companies transition to the cloud, there is an increasingly apparent shift in the skills organisations need within their IT functions. Skills such as mobile device management, multi-factor authentication, specialist and off-the-shelf networking and systems monitoring tools, vulnerability assessment and penetration testing are sure to be in high demand. As a result, as businesses continue hiring IT professionals with expertise in cloud implementation and security practices, organisations must invest in training their current employees, to ensure that, as a whole, they are operating at their optimum. [easy-tweet tweet="As businesses continue hiring IT professionals with the expertise of cloud implementation security practices" hashtags="tech, cloud, IT"] Ultimately, while certain security measures are put in place by cloud providers, it is the end users who are actually responsible for guarding their data. Promoting awareness of security measures will help your business to mitigate risk yet still profit from the flexibility offered by the cloud. In addition to finding talent within the security space, we must nurture the development of security awareness and learning within organisations. When it comes to operating a network that is both efficient and secure, cloud skills and security measures are not adversaries. Only a union of the two will define the success of your business.
### MiFID II set to change the way the financial services industry records conversations
There are significant changes afoot in the regulation of financial instruments. There’s a section of MiFID II (the Markets in Financial Instruments Directive), which comes into force in January 2018, stipulating that any firm providing financial services to clients linked to ‘financial instruments’ will have to record and store all communications intended to lead to a transaction. [easy-tweet tweet="There are significant changes afoot in the regulation of financial instruments. " hashtags="cloud, tech, IT"] This means anyone in the advice chain must record and store their conversations with customers, which is a big escalation of current obligations. Recording of conversations between traders and their clients on fixed and mobile phones, including voice and SMS, has been mandatory since 2011, and the current mandate applies to about 30,000 traders. From January 2018, however, 300,000 individuals in the UK alone will be subject to the new regulation. It means many more firms will need to be ready to meet the explosion in data that will accompany MiFID II enforcement.
All forms of communication In fact, the MiFID II regulations will extend to all forms of communication, including face-to-face meetings. Face-to-face conversations won't necessarily need to be recorded, but they will at least need to be captured in written minutes or notes and then stored for up to five years - or seven years if requested by the authorities. It's down to an individual firm how they save conversations, but relying on manual notes alone potentially creates more work and incurs more risk than taping conversations (with the customer's permission, of course). In the financial services industry, experience shows that the authorities nearly always favour the client in the event of a complaint, so access to proof of innocence is paramount. The UK's PPI mis-selling scandal is a prime example. The Financial Ombudsman found that unless a firm can provide irrefutable evidence that PPI was not mis-sold, it would conclude that the company involved was culpable. It's a decision that so far has cost the UK financial services sector more than £35bn in compensation. Quality counts Given that conversations often continue over phone, email and SMS, a company will need a holistic view of compliance across all channels. When it comes to face-to-face, scribbled notes on a pad probably won't cut it. Rather, a network-based call recording service should enable organisations to meet MiFID II mobile voice recording requirements and achieve FCA compliance without compromising the user experience. As long as it's not reliant on an app, conferencing or streaming, the service should be robust and tamper proof. Naturally, the quality of the recordings matters: should a query arise, the content must be of sufficient quality to hold up to scrutiny. Being able to intercept GSM calls will guarantee clear playback. A big mess of data The implications for the recording of conversations are far-ranging. The regulations don't just apply to conversations across all devices and locations (to cover remote working); they also imply that a business must put in place processes for the routing, reviewing and monitoring of these conversations on both company-provided and privately-owned devices (if the latter are ever used for work purposes). This in turn could create a big mess of data. Marie Kondo's bestselling book, The Life-Changing Magic of Tidying Up: The Japanese Art of Decluttering and Organising, helped people create order from chaos. Some of the methods Kondo outlines can be applied to MiFID II: her transformative one-time organising event against clutter can be likened to using a high availability infrastructure, in which data is indexed with rich metadata for quick discovery, delivering results very quickly. To paraphrase Kondo, a company's data should 'spark joy'. While indexing can help to create this spark, a security breach will snuff it out immediately. This makes encryption imperative for all data.
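As a rough illustration of the "encrypt, then index with rich metadata" approach described above, the sketch below encrypts a captured conversation and builds the metadata record a discovery index might hold alongside the ciphertext. It assumes the third-party `cryptography` package; the field names and the five- or seven-year retention flag are illustrative, not a compliance recipe.

```python
# Hypothetical sketch: encrypt a captured conversation and produce the metadata
# record used for indexing and retention. Not a compliance recipe.
from cryptography.fernet import Fernet
from datetime import datetime, timedelta

RETENTION_YEARS = {"standard": 5, "extended": 7}   # seven years if requested by the authorities

def archive_conversation(audio: bytes, key: bytes, client: str, adviser: str,
                         channel: str, retention: str = "standard") -> dict:
    """Encrypt the recording and return the record the search index would store."""
    captured_at = datetime.utcnow()
    keep_until = captured_at + timedelta(days=365 * RETENTION_YEARS[retention])
    return {
        "ciphertext": Fernet(key).encrypt(audio),
        "client": client,
        "adviser": adviser,
        "channel": channel,                          # mobile voice, SMS, meeting notes, ...
        "captured_at": captured_at.isoformat(),
        "retain_until": keep_until.isoformat(),
    }

record = archive_conversation(b"<audio bytes>", Fernet.generate_key(),
                              client="client-001", adviser="adviser-17",
                              channel="mobile-voice")
```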
Stronger data protection The General Data Protection Regulation (GDPR) comes into force at about the same time as MiFID II. GDPR will strengthen the 1998 Data Protection Act and will penalise companies that fail to protect an individual's data – so any recording policies under MiFID II will need to be considered within the context of preventing possible intrusions into privacy. For instance, firms will need to find a viable way of ensuring business calls are recorded on a device without also recording personal calls – given that merely the act of recording them (let alone listening to them) would infringe GDPR. In short, there's a great deal for businesses to consider in the new regulations, and currently not nearly enough information to provide meaningful guidance. The burden of complying with MiFID II and GDPR will rest solely on the firms involved, so I would urge companies to start on the necessary preparations now. The changes will be upon us before we know it. ### Axway Introduces Next Generation Data Integration and Engagement Platform Axway, a catalyst for transformation, today introduced Axway AMPLIFY™, a new data integration and engagement platform that empowers organizations to unlock business value from a vast array of data sources to transform the customer experience. Developers, architects and administrators can use AMPLIFY for everything they need to power their unique inventions, from apps, connectors and transformations to workflows, dashboards and policies. In a report by Forrester Research, business decision-makers identified improving the customer experience as a high or critical priority, second only to growing revenue. Getting the customer experience right is a huge challenge, as the ever-increasing demands of today's digitally empowered consumers mean organizations can no longer rely on traditional omnichannel strategies to differentiate the experience they offer from their competition. The problem in today's paradigm is that most backend services do not share data across internal business units and often don't leverage data from external parties to anticipate customer needs and behaviors, resulting in a fragmented view of the customer, poor service levels, and lost revenue opportunities. Instead, organizations need to quickly capture and act upon insights from an explosion of data sources that range from enterprise systems, supplier processes and partner interactions to cloud services, mobile devices and connected machines – all part of what Axway calls a "Customer Experience Network" – to rapidly adapt to customer expectations. AMPLIFY offers a unified, secure environment both in the cloud and on-premises for digital teams to create, run and scale API-enabled services. Users also gain access to an open marketplace where they can discover, share and monetize an extensive catalog of prebuilt services and "service accelerators," a customizable set of templates with complete API documentation that can be used to bootstrap new services. "At Accenture, we encourage our clients to take a consumption-driven approach as we help them with their digital transformation journeys. Key to success is building a true customer experience network, empowered by APIs," says Kevin Kohut, Global API Strategy Lead at Accenture.
"We're excited about the potential of Axway’s AMPLIFY platform to support an organization's digital transformation by harnessing the power of APIs, combined with tools that empower digital interaction, which ultimately drives engagement, loyalty and revenue growth.” AMPLIFY enables organizations to achieve measurable outcomes through a comprehensive set of capabilities, encompassing: An agile, secure and scalable data integration infrastructure that allows DevOps teams to quickly convert data silos into configurable and API-enabled services that are more efficient and affordable to manage. A global services company with a proprietary database containing 235 million business profiles was able to generate $10 million in incremental revenue within its first year of delivering their data as a service through APIs. Full lifecycle management of APIs that provides the key to unlocking existing internal and third-party assets to shape rich mobile, cloud and IoT experiences with a faster ROI. A global automotive manufacturing brand was able to introduce new services for traffic safety, driver convenience, infotainment and insurance, and grow its subscriber base to 2 million connected cars by using APIs to access vehicle, merchant and environmental data. App development that enables visual and hands-on design, built-in testing and connectivity to any data source via mobile-optimized APIs, all leveraging existing JavaScript skillsets. A leading consumer real estate destination with 18 million unique monthly visitors was able to decrease its time to market for new mobile apps by 40% and lower the cost of cross-platform native development by over 50%. Community management that reduces on-boarding time for partners and developers from days to hours, and lightens the load for IT by delivering common workflows and self-service capabilities. A financial services customer who was handling ACH, Lockbox, Reconciliation and related payments with nearly 1,200 financial partners was able to make its onboarding process 35% more efficient. Embedded analytics that illuminate the real-time health, performance and adoption of services to help predict future growth and support continuous improvement. A major retail chain serving 800,000 customers each day was able to monitor its order fulfillment process, avoiding tens of thousands of Euros in blocked orders daily. “The way that most enterprises manage the customer experience needs a complete rethink as digital technologies are intimately woven throughout every facet of society,” said Jeanine Banks, executive vice president, global products and solutions, Axway. “The AMPLIFY platform was born out of our passion for helping organizations successfully capitalize on this change and tap into customer experience networks. The new rules of the game are about ecosystems collaborating together to fuse virtual and physical interactions with customers into seamless, more personalized digital journeys.” The launch of the AMPLIFY platform features an early preview of the marketplace and provides a glimpse into future innovations, a project code-named “Golden Gate.” It is focused on reimagining lifecycle management capabilities to help create, govern, engage, consume and measure engagement with APIs as well as creating an even more efficient way to onboard and manage communities of employees, suppliers, partners, and developers. 
Project Golden Gate will also feature a powerful framework for deploying, managing and scaling containerized services both in the cloud and on-premises with enterprise-grade security, high availability and comprehensive visibility. Availability for Project Golden Gate is planned for 2017. ### Cloud-based CRM: Extending omnichannel to the realm of B2B Despite the benefits of B2C businesses making themselves available to customers 24/7 via multiple platforms, a surprising number of B2B companies are keeping their doors firmly closed outside usual office hours. But if you aren't open when customers are poised to make a purchase, those customers might just find somewhere else to take their business. Customer expectations are being redefined, and if B2B businesses can't find ways to keep up, it's likely they'll fall behind. Fortunately, the solution exists; it's called eCommerce. Incorporated with a cloud-based, accounts-integrated CRM system, it can inform – and be informed by – sales operations, stock control, accounting and the supply chain, for a level of B2B service otherwise reserved for B2C customers. Evolving habits eCommerce is everywhere in the B2C world and it has pushed expectations of service to unprecedented levels. Success lies in delivering consistent, joined-up customer experiences across all channels and maintaining customer loyalty in the process. Online shopping has allowed customers to shop any time, with availability and speed of delivery influencing purchase decisions almost as much as price. But until now, this omnichannel approach has largely been considered the domain of B2C. For B2B, the application of eCommerce has been a slow burner. Some companies have questioned its relevance to their sector, while others have tried it with mixed results – largely due to flawed methodology. Early adopters are triumphing, but for many, the natural default is to stick with what they know. Yet change is inevitable. And it's not just consumer habits that are shifting. Flexible working is becoming an increasing consideration for employees; something that businesses should take seriously when it comes to attracting and retaining the best workers. Given the legal requirement that employers must reasonably review all employee requests, flexible working capabilities are a necessity, not to mention invaluable when it comes to fulfilling customer-facing roles. Progressive solutions The addition of eCommerce functionality can truly transform B2B operations, with the most effective solutions fully integrating with your CRM and ERP accounting systems. Typically, CRM and accounting operate in silos, which can often lead to gaps in information capable of disrupting order processes, blighting communications and undermining the customer experience. Accounts-integrated CRM and eCommerce solutions fill that gap. By connecting CRM, ERP and eCommerce systems, companies can go beyond a standard 'web shop' to a more tailored offering that's based on the customer's history with all available company touchpoints.
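To picture what "connecting CRM, ERP and eCommerce" might look like in practice, here is a minimal sketch in which the web shop asks the CRM for order history and the ERP for account data before rendering a tailored page. The endpoints, field names and the `requests` dependency are all assumptions for illustration; real systems would expose their own APIs.

```python
# Hypothetical sketch of an accounts-integrated storefront lookup.
# The CRM/ERP endpoints and payload shapes are invented for illustration.
import requests  # assumes the 'requests' package is installed

CRM_API = "https://crm.example.com/api"   # hypothetical CRM endpoint
ERP_API = "https://erp.example.com/api"   # hypothetical ERP endpoint

def tailored_storefront(customer_id: str) -> dict:
    """Combine order history (CRM) with live credit data (ERP) so the web shop
    can show reorder suggestions and accurate availability for this customer."""
    orders = requests.get(f"{CRM_API}/customers/{customer_id}/orders", timeout=10).json()
    account = requests.get(f"{ERP_API}/accounts/{customer_id}", timeout=10).json()
    recent = orders[:5]
    return {
        "recent_orders": recent,
        "credit_available": account.get("credit_limit", 0) - account.get("balance", 0),
        "suggested_reorders": [line["sku"] for order in recent for line in order.get("lines", [])],
    }
```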
[easy-tweet tweet="Flexible working is becoming an increasing consideration for employees" hashtags="tech, cloud, "] Of course, eCommerce is especially well placed for selling goods, but businesses that sell a hybrid of goods and services should also consider how eCommerce, in combination with accounts-integrated CRM, could strengthen the customer experience. If a purchasing decision requires advice, consultation via traditional channels is still available, but the dialogue can be bolstered by eCommerce, thanks to the visibility of previous online engagement that’s held within the CRM system. And once a purchase is made, it forms the basis for future orders that can be quickly actioned via eCommerce mechanisms until a new consultation is required. This improves the customer experience without surrendering the personal service. The future is in the Cloud Take accounts-integrated CRM and eCommerce one step further and you’ll find it in the Cloud. Sales data and customer records – detailing engagement across all channels – integrated with stock control, accounting and supply chain operations can be pulled up from anywhere, at any time, completing that all important, ‘always on’, seamless service. Flexible and remote working requirements are met intrinsically, and a more streamlined process ensures greater efficiency, productivity and capacity management that’s especially important for expanding SMEs. The ability to easily scale the solution as appropriate, devolve technical responsibility and drive down the cost of service results in greater business efficiency, freeing up staff to spend more time on customer service and business development. Conclusion [easy-tweet tweet="Everyone is a consumer, even those in professional uniforms." hashtags="cloud, tech, consumerism"] Everyone is a consumer, even those in professional uniforms. So why do our B2B experiences not replicate those we enjoy elsewhere? If B2B companies are to thrive and survive, they must apply B2C logic. And fast. By establishing an omnichannel offering via accounts-integrated CRM and eCommerce, businesses can streamline processes throughout – from stock control, supply chain and eMarketing – so that each is geared to enhancing the customer experience. Look to the Cloud, and only then does the omnichannel experience truly become an ‘always on’ experience. ### Atos First to Offer Global Prescriptive Threat Detection and Instantaneous Remediation Atos, a global leader in digital services, has announced a new managed security service provider (MSSP) partnership with Intel Security. Through its technological brand Bull, Atos will leverage its dominance in Big Data analytics to provide the first ever managed offering built to deliver the McAfee “Threat Defense Lifecycle” as a cloud, on-premise or hybrid service. Amid an exciting digital revolution known as the Second Economy, that brings with it countless innovative and connected devices, is the explosion in volume and complexity of threats targeted at both businesses and individuals. The Atos and Intel Security global partnership enables groundbreaking advancements in integration, automation and orchestration by combining McAfee technologies with the high powered computing (HPC) of Atos’ bullion server. Beyond just excellent intelligence, Atos is uniquely capable of enhancing automation and achieving orchestration of security processes enabling the first prescriptive-security service that can thwart attacks before they happen. 
“Atos is pleased to debut its new Big Data & Security solution that makes it possible for customers to predict security threats before they occur”, said Michel-Alain Proch, group senior executive vice president and CEO North American Operations, Atos. “Our first-of-its-kind Threat Defense Lifecycle solution allows us to deliver with velocity, secure with unified visibility and respond with intelligence — further showing our commitment to innovation and being the trusted partner our customers deserve.” Atos leverages its bullion servers’ record-breaking compute power, patented elasticity and unmatched scalability to help customers achieve prescriptive security. This new bullion-based appliance is optimized for big data analytics to analyze and correlate the growing volumes of structured and unstructured data, to preemptively reinforce security controls and to neutralize cyberattacks even before they reach the organization’s environment. “With prescriptive security that learns from threats and orchestrates automated tasks to respond immediately based on current needs, we enable real-time action—proactively protecting other endpoints throughout the customer environment to contain a detected attack and even to prevent attacks from happening,” said Pierre Barnabé, chief operating officer for Big Data & Security, Atos during a keynote address at Intel Security’s Focus16 conference in Las Vegas. “Powered by big data capabilities, automation and orchestration, the end result for clients is uninterrupted delivery of their business.” ”This kind of marriage of big data and security is groundbreaking for Intel Security,” said Richard Steranka, senior vice president of global channel operations, Intel Security. “Unique capabilities and efficiencies enable Atos to deliver market-leading security at an affordable cost. As concerns about advanced attacks increase faster than availability of skilled cybersecurity resources or security operations budgets, fast-responding, affordable security services are in demand now more than ever.” ### The Future of Telephony Businesses large and small require communications systems that meet their unique needs, but with the lightning pace of technological change, it can be difficult to know which solutions are best suited to your company,  and whether or not you should embrace innovations such as Hosted Telephony to empower flexible working. [easy-tweet tweet="It can be difficult to know which solutions are best suited to your company" hashtags="cloud, tech, IT"] Fear of making the wrong choice can leave business owners confused and delay sometimes vital transitions to newer technologies. Understanding the driving forces behind the ever-evolving telephony market is a good place to start, as trends in telephony and communications are always led by the changing needs of modern businesses. So what does the future of business telephony hold? VoIP Telephony- Increasingly flexible deployment A major sticking point for businesses thinking of upgrading their telephony is the idea of depreciation of their current infrastructure. Having invested in costly on-site equipment, managers can be understandably reluctant to give up such expenditure and switch to completely new systems. VoIP telephony providers these days are aware of this, and many brands now offer flexible solutions that can be deployed over existing infrastructure. 
Avaya's Server Edition of its popular IP Office product is a prime example of this: the phone system can be based in a data centre, on premise, or virtualised. The ability to take the system off-site but keep any existing handsets and other kit is very appealing to companies that have already taken the investment plunge but also feel the need to upgrade to something with more features and flexibility. Ongoing Growth of Hosted Solutions Although the telephony market as a whole has been growing quickly over the past few years, Hosted Telephony in particular has seen substantial growth, with 250,000 new "seats" added in the first half of 2016. This represents 25 percent growth in the industry from mid-2015 to mid-2016, a staggering expansion which has driven competition between different providers to create the most attractive product - good news for buyers, as it means prices are kept stable even as new features are added. Recently, there has also been significant uptake of hosted telephony amongst larger businesses looking to take advantage of the flexibility and power of Hosted Telephony solutions, demonstrating that it's beginning to break into the vertical market. SIP Trunking as an ISDN replacement SIP Trunking has also seen substantial growth of more than 20 percent over the past year, though this is a slight slowdown in comparison with previous years and less than was predicted for the year ahead. Industry experts think that this is due in part to the fact that BT's 2025 ISDN turn-off date (which will force businesses to upgrade) isn't imminent enough to create a demand for upgrades, especially in comparison with countries whose ISDN service is being discontinued much sooner, such as Switzerland (2017) and Germany (2018). However, it'll still be necessary to invest in an alternative solution at some point, and SIP trunks offer many benefits over standard ISDN services. Unified Communications and CPaaS Unified communications (UC) means different things to different people, but the underlying concept is a package which brings together multiple strands of your communications channels. The definition of UC is always evolving, but the flexibility of a Communications Platform as a Service (CPaaS) solution is that it allows businesses to create their own tailor-made applications within the provider's framework, essentially picking and choosing which features suit them best. Services like Avaya's "Zang" allow businesses to implement flexible Real Time Communications (RTC), incorporating video, messaging and voice services without needing separate applications for each channel. This makes CPaaS a good fit for vertical businesses with access to high-speed and reliable broadband connections; because a CPaaS provides an environment within which specialised communications suites can be built, it allows highly customised solutions to be constructed whilst also remaining flexible and reasonably priced. Next steps in telephony Clearly the Hosted Telephony market is growing stronger, with various services beginning to penetrate into large businesses.
In the coming years, we’re likely to see ongoing competition drive continued penetration into the mid-market and specific vertical business sectors as CPaaS and hosted telephony offer even better services, and the ISDN turn-off date will drive continued uptake of SIP trunking as it draws closer. ### What can the Internet of Things learn from 1990s ERP? The 1990s were an exciting time to be in the software industry. Companies were waking up to the idea that reinvigorating business processes could boost productivity and streamline operations. Integrating business operations became top of the CIO’s list in the 90s and subsequent decade. However, it was not without its problems. Early IT implementations overran, under-delivered and cost companies millions. In one famous case, Hershey’s was unable to deliver $100 million worth of its chocolate on Halloween due to its failing ERP, which caused the company’s stock to dip 8 percent. [easy-tweet tweet="Early IT implementations overran, under-delivered and cost companies millions." hashtags="tech, cloud, IT"] While many of those early implementations were complex and costly, the resulting integration and visibility had never been seen before. For example, real-time, accurate information enabled businesses to provide Available to Promise (ATP) commitments to customers and end-to-end processes became transparent. Fast forward to the present day, and the ERP industry has not only learnt the lesson of its early days, but it has evolved. While ERP systems of the past were developed for specific functions, the new wave ERP-as-a-platform exists to open the platform up to allow developers and business users to make it their own. So what has ERP’s journey got to do with the Internet of Things? As Peter Sondergaard, SVP & Global Head of Research at Gartner recently put it; the IoT is where ERP was in the 90s. It’s new, hot and exciting and continuing to grow. But to ensure IoT lives up to its potential, it should look at the journey ERP has taken, and see what lessons it can learn: Real-time analytics is key Real-time analytics is crucial for ERP, for businesses to optimise their business performance. It allows companies to see as and when, what’s going on - from economic fluctuations to the best way to complete an operation given numerous variables. Real-time analytics is also critical for IoT. Take driverless cars for example. Although not commonplace on our roads, they soon could be. If they are to be successful, there will have to be data flowing between cars, people, streetlights, mobile phones, etc. It will be real-time analytics that use this data to make the important, real-time decisions for the driverless cars. [easy-tweet tweet="Real-time analytics is as critical for IoT as it is ERP" hashtags="tech, cloud, IT"] Integration is a top priority Gartner has cited a lack of application integration strategy and related skills as the biggest problem facing cloud ERP. To avoid this, organisations cannot forget about their existing systems outright. It’s important for companies to look for a vendor who will provide guidance on application integration strategy when transitioning to the cloud, or risk reaping none of the benefits. In the same way, connected devices will need to integrate seamlessly into an environment which, right now, will be dominated by non-IoT devices. When choosing to invest in IoT devices, ideally, neither the business or their customers will notice the switch. It’s only with a good integration strategy that this will be made possible. 
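As a small illustration of that integration point, the sketch below shows an adapter that wraps a raw IoT reading in the payload shape an existing back-end already expects, so established processes keep working while new devices come online. The endpoint, field names and the `requests` dependency are hypothetical.

```python
# Hypothetical adapter: forward IoT readings to an existing system in the
# payload format it already understands. Endpoint and fields are illustrative.
import requests  # assumes the 'requests' package is installed

LEGACY_ENDPOINT = "https://erp.example.com/api/telemetry"   # hypothetical

def forward_reading(device_id: str, metric: str, value: float) -> None:
    """Translate a raw sensor reading into the legacy telemetry payload and POST it."""
    payload = {"source": device_id, "measure": metric, "value": value}
    response = requests.post(LEGACY_ENDPOINT, json=payload, timeout=5)
    response.raise_for_status()   # surface rejections so they can be monitored

# Example: a temperature sensor reporting through the same path the ERP already uses.
# forward_reading("sensor-042", "temperature_c", 21.5)
```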
Security is important Security should be paramount for IoT devices, especially in light of recent attacks which potentially utilised IoT devices to shut down major websites in the US, such as Twitter and Yahoo. ERP, as it has grown and developed, has also had to place a heavy focus on security. Companies use ERP to store valuable, private data, so it has to be able to prove it can keep it all secure. If it can’t provide this assurance, ERP will struggle to attract users. Likewise, if the IoT can’t keep the hackers out, consumers, businesses, and investors, will shy away. Agility and flexibility have an impact If the IoT wants to profit from all the data collected from ERP, it has to be both agile and flexible. The IoT would have to be adaptable, intelligent, and be able to process and generate real-time information, in the same way ERP does. [easy-tweet tweet="As long as IoT devices are flexible, they won’t disrupt business operations" hashtags="cloud, tech, IT"] As long as IoT devices are flexible, they won’t disrupt business operations, or constrain a business as it tries to grow. This is especially pertinent for smaller companies, who face very different challenges to larger businesses and often have to react faster to remain competitive. Mostly, IoT needs to look at the flexibility and agility of modern day ERP and strive to do the same. As with the dawn of ERP over 25 years ago, the defining characteristic for the IoT’s success will be integration: with existing business processes, people and systems. Thankfully, the post-modern ERP we see today will play a major role in extracting this value. ### Managing the shift to SaaS in the financial services sector Given the recent trend of financial services companies opting for Software-as-a-Service (SaaS) applications, the Financial Conduct Authority (FCA) has released advice for organisations looking to outsource IT services to third parties. The document, FG16/5: Guidance for firms outsourcing to the ‘cloud’ and other third-party IT services, outlines legal considerations, risk management and continuity plans. [easy-tweet tweet="What are the risks that come when outsourcing to a third party SaaS application?" hashtags="tech, SaaS, cloud, IT"] It is helpful to understand the risks that come when outsourcing to a third party SaaS application. Subscribing to a third-party software company often means placing critical business data in the hands of another organisation. If the service provider experiences any software outage or goes out of business, the data could be lost or certainly at risk, leaving the subscriber unable to carry out its day-to-day operations or comply with its data obligations. Although this is relatively rare, it can quickly cause irreparable reputational and financial damage, so organisations need to have a contingency plan should the worst happen. Navigating the finer points of the FCA’s new guidelines can be difficult when beginning a new relationship, especially when considering that SaaS providers will have further links to other companies. They are likely to rely on external data centres for the storage of data, adding another layer of complexity. Monitoring these relationships can be difficult. It is, therefore, wise to enlist a third-party to monitor the SaaS provider’s payments to its cloud service provider or data centres. This means organisations can be pre-warned if the SaaS provider isn’t making payments – a sign which could potentially signal significant financial issues. 
Another risk scenario to be addressed is access to data if the SaaS provider itself fails or goes out of business. For example, a system should be in place to enable access to the data centre in which the data is stored, independently of the SaaS vendor. Since many organisations in the finance sector now rely on SaaS providers for business-critical applications, they should also consider how they will restore this service if necessary following unforeseen circumstances. Having a copy of the software source code is certainly a solid foundation for this business continuity plan. Better still, firms can regularly take a snapshot of the application in its runtime environment so that it can be restored in a much more time-efficient manner. Finally, and crucially, financial firms need to check that software providers have the operational resources necessary to meet legislation and other regulations, such as the Data Protection Act, and to monitor and identify risks to data continuously. This could be easier said than done for smaller SaaS companies, and they may need to change their working practices when dealing with a financial services firm. Any contract between an organisation in the financial sector and a third-party SaaS provider should outline how the provider will make sure that the data is secure and that access to this data is managed carefully. Although these steps may sound time-consuming, they are necessary for financial companies to adapt to the changing technological landscape with minimal risk. With FinTech start-ups on the rise, the way that consumers interact with financial services companies is changing, but caution must still be exercised. ### PTC Partners with Etisalat Digital to Bring ThingWorx Internet of Things Platform to Middle East Region PTC today announced that Etisalat Digital, the newly established business unit of Etisalat, one of the leading telecommunications providers in the Middle East, North Africa, and Asia, has chosen the ThingWorx® Internet of Things (IoT) platform to develop IoT solutions in the United Arab Emirates and across the wider region, capitalizing on the growing IoT opportunity in the Middle East. The agreement enables PTC to extend ThingWorx into a new market and further position itself as a global leader of IoT enablement for the world's leading companies. ThingWorx will serve as the platform on which Etisalat Digital will build its new solutions for markets that include smart industry, smart buildings and cities, and fleet management. PTC's open cloud platform architecture will enable Etisalat Digital to build a comprehensive ecosystem of IoT partners and solutions. "Working with Etisalat further illustrates the importance of building an Internet of Things strategy around a robust platform that enables customers to seamlessly integrate the components that lead to IoT success," said John Stuart, divisional vice president, global sales and partners, PTC. "With this agreement, PTC expands its technology into another new region, further demonstrating the potential for ThingWorx to be at the heart of IoT innovation around the globe." "The Internet of Things holds significant potential in the Middle East region and we believe it will be a major component of growth in the digital solutions ecosystem," said Francisco Salcedo, senior vice president of digital solutions, Etisalat.
“Collaborations like this one with PTC enable Etisalat to provide an array of solutions and strengthen Etisalat’s position as a leader in the IoT space, capitalizing on the opportunities in the region.” ThingWorx, the centerpiece of PTC’s Internet of Things technology portfolio, is comprised of a rapid application development platform, connectivity, machine learning capabilities, augmented reality, and integration with leading device clouds. These capabilities combine to deliver a comprehensive IoT technology stack that enables companies to securely connect assets, quickly create applications, and innovate new ways to capture value. ### Amazon Web Services Announces Amazon Wind Farm US Central 2 Amazon Web Services, Inc. (AWS), an Amazon.com company (NASDAQ:AMZN), today announced Amazon Wind Farm US Central 2, a new 189 megawatt (MW) wind farm in Hardin County, Ohio that will generate 530,000 megawatt hours (MWh) of wind energy annually starting in December 2017. Amazon engaged with EverPower, a leader in utility grade wind projects, to construct, own, and operate the new wind farm. This is AWS’s fifth renewable energy project in the United States (and its second wind farm in Ohio) that will deliver energy onto the electric grid powering AWS data centers located in the AWS US East (Ohio) and AWS US East (N. Virginia) Regions. When this newest wind farm is completed, AWS’s five renewable energy projects will generate a grand total of 2.2 million MWh of energy annually – enough to power almost 200,000 U.S. homes. In November 2014, AWS shared its long-term commitment to achieve 100 percent renewable energy usage for the global AWS infrastructure footprint. Ambitious sustainability initiatives over the last 18-24 months have put AWS on track to exceed its 2016 goal of 40 percent renewable energy use and enabled AWS to set a new goal to be powered by 50 percent renewable energy by the end of 2017. In addition to investing in wind and solar projects that deliver more renewable energy to the electrical grids that power AWS Cloud data centers, AWS continues to innovate in its facilities and equipment to increase energy efficiency, as well as to advocate for federal and state policies aimed at creating a favorable renewable energy environment. For example, in Ohio, Amazon supports proposed changes to the state’s current wind setbacks law to encourage more investment in new renewable wind power projects. This second Ohio wind project joins Amazon Wind Farm US Central (100MW) in Paulding County, which AWS announced in November 2015 and will start producing wind energy in May 2017. Other AWS renewable energy projects in the US include Amazon Wind Farm Fowler Ridge (100MW) in Indiana and Amazon Solar Farm US East (80MW) in Virginia – both of which are currently in production. Amazon Wind Farm US East (280MW) in North Carolina is under construction and expected to start generating electricity by December 2016. "We remain committed to achieving our long-term goal of powering the AWS Cloud with 100 percent renewable energy," said Peter DeSantis, Vice President, Infrastructure, AWS. “There are lots of things that go into making this a reality, including governments implementing policies that stimulate cost-effective renewable energy production, businesses that buy that energy, economical renewable projects from our development partners and utilities, as well as technological and operational innovation that drives greater efficiencies in our global infrastructure. 
We continue to push on all of these fronts to stay well ahead of our renewable energy goals.” “We applaud Amazon’s goal to power the global infrastructure for the industry-leading AWS Cloud with renewable energy and are pleased to collaborate in this effort. Leading companies like AWS are enabling the construction of large, utility-scale renewable power projects that will ultimately help to off-set the energy supplied by fossil fuels and create a cleaner, healthier environment for our communities,” said Jim Spencer, CEO of EverPower, a US leader in wind projects. EverPower has seven active projects in four states producing a total 752 MW of energy, with a near-term pipeline of an additional 2,000 MW. Beyond the sustainability initiatives focused on powering the AWS global infrastructure, Amazon is investing in several other clean energy activities across the company. Examples of other projects include Amazon Wind Farm Texas – a 253MW wind farm in Scurry County, Texas -- green rooftops, and the District Energy Project that uses recycled energy for heating Amazon offices in Seattle. About Amazon Web Services For 10 years, Amazon Web Services has been the world’s most comprehensive and broadly adopted cloud platform. AWS offers over 70 fully featured services for compute, storage, databases, analytics, mobile, Internet of Things (IoT) and enterprise applications from 38 Availability Zones (AZs) across 14 geographic regions in the U.S., Australia, Brazil, China, Germany, Ireland, Japan, Korea, Singapore, and India. AWS services are trusted by more than a million active customers around the world – including the fastest growing startups, largest enterprises, and leading government agencies – to power their infrastructure, make them more agile, and lower costs. ### Nine Business Dynamics that Scream You’re Ready for the Cloud Certain business dynamics are custom made for cloud-based communications solutions. If you want to spend less time managing infrastructure and more time focusing on your core business objectives, it’s time to consider migrating to the cloud. First, it’s important for companies to take a step back and take a fresh look at the forces affecting individual businesses. From our discussions with industry leaders, I’ve outlined common dynamics that might mean end businesses should consider migrating to the cloud: 1 - Rapid Growth Businesses expanding at warp speed need solutions that can be implemented just as fast – and that can flex with the needs of the business. The cloud provides quick and easy software upgrades with the inherent scalability required to support growth. Plus, cloud solutions offer greater flexibility for integrations with other technologies you may already be using with the ability to adapt processes in the future more readily. [easy-tweet tweet="It’s important for companies to take a step back and take a fresh look at what impacts individual businesses." hashtags="cloud, tech"] 2 - Multiple Locations You may have staff in multiple locations – whether that’s sales teams on the road or work-from-home employees – and you’re looking for a single solution that supports this flexible model. One that doesn’t mean you’re facing the ‘doom and gloom’ prospect of your IT team having to perform a major administrative overhaul. A cloud solution can quickly be deployed regionally through to globally while creating a more consistent experience for both your staff and your customers. 
3 - Enabling the Mobile Workforce Cloud-based technologies and mobile apps make data easily accessible wherever staff are in the world. The cloud also makes it easy for your administrators and IT staff to remotely manage equipment which ensures remote employees have the tools they need to get the job done in a highly productive way. 4 - Lean Small-to-Mid-Sized Businesses (SMB) Unlike the big boys, most smaller organisations can’t afford to maintain their enterprise-level network security and redundancy, and may not have the necessary staff to do so. But, as an SMB, you still need to compete and can’t afford to put your operation at risk simply due to lean teams and lean finances. With a cloud-based software solution, SMBs can share the costs of sophisticated infrastructures and strict security measures with other cloud clients, gaining enterprise-class features without the need to manage staff or equipment. 5 - Dynamic Regulatory Environment If you operate in an industry that demands compliance to specific regulatory standards – like financial services – software deployment via the cloud can make it easier to keep up with changes. Finding a provider that supports and stays current with your compliance requirements can take the stress off of your internal resources while ensuring conformity if you operate in multiple locations. This reduces risk from regulatory fines. 6 - Non-Profit Managing limited budgets, limited infrastructure resources and limited staff (often without the luxury of dedicated IT or security staff) is no easy feat. The cloud can help non-profit organisations face these issues head on by lowering the total cost of ownership (TCO) for technologies, providing easier administration and very minimal ongoing maintenance. You’ll also see the additional benefits of easier accessibility for staff and volunteers, as well as solutions that can provide better ways to connect with donors and other important constituents. [easy-tweet tweet="The cloud can help organisations face managing budgets head on by lowering cost of ownership." hashtags="tech, cloud"] 7 - Outdated Equipment If your onsite equipment is ready for an upgrade, there’s no better time to consider moving your communications to the cloud. Review your goals and where your business is headed to see if the flexibility and cost structure offered by cloud communications are a good fit for your company. 8 - The Need to See the Big Picture Today’s executives must understand the breadth and depth of their business operations, as well as the minuscule details of each customer’s unique needs or a given sales rep’s performance. Cloud architectures that support multi-tenancy or integrations with CRM platforms make it easy to combine and sift through data at individual or aggregate levels. This helps to provide valuable reporting capabilities and new insights. 9 - Dissatisfaction You or those around you e.g. your staff, your manager, your CEO just aren’t happy with the performance, functionality or return on investment (ROI) of your current system. If you hear more complaints than compliments, take a critical look at your business objectives. Many cloud-based solutions offer greater customisation opportunities to help you meet the distinct needs of your business. If some of this rings true to you, it’s probably time to take a look at your current technology and see if it’s delivering the results and productivity you need. Maybe it’s time to experience the cloud. 
### Sureline Systems announces support for Google Cloud Storage Coldline Sureline Systems, a leader in enterprise-class virtual, cloud, physical, and container migration and disaster recovery solutions for application mobility, today announced SUREedge DR support for Google Cloud Storage Coldline as a DR target. Customers are now able to protect existing physical servers or virtual machines from any hypervisor or cloud to Google Cloud Storage Coldline. Customers now have the option to determine the appropriate storage platform on Google Cloud Platform based on how often they expect to access the DR images for test or deployment. For systems that are rarely accessed for either purpose, Coldline may be an ideal option. Coldline is part of a unified offering for cloud storage, which provides different price and availability points that match customers' storage needs. Coldline helps address some of the drawbacks of traditional cold storage: data is still usable by applications when needed and its latency matches the other storage classes, with a 99% availability SLA. For systems that are accessed more often, other Google Cloud Storage options are available. SUREedge DR is a business continuity and disaster recovery solution that provides the ability to plan, test, and run the recovery of physical or virtual systems. SUREedge DR provides the orchestration and automation of a disaster recovery plan for systems according to predefined protection groups and dependencies. Replication of data between the sites is performed between instances of SUREedge DR, a pair of VMs running at each site. SUREedge takes control of replication, restoration, resynchronisation and reverse replication as appropriate to the function being performed. SUREedge DR captures application-consistent images. These point-in-time images are then replicated, after efficient bi-directional global deduplication, compression and encryption, to Google Cloud Platform. An intuitive central management console provides easy job scheduling and tracking, with alerts provided should an issue arise. "We are pleased to provide an enterprise-class disaster recovery solution for Google Cloud," said George Symons, COO, Sureline Systems. "With the introduction of Coldline, Google Cloud Platform now offers low-cost, high-performance cloud storage that is ideally suited for rarely used DR images that need low latency when called into service."
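For readers who want to see what the Coldline storage class mentioned above looks like in code, here is a minimal sketch that creates a Coldline bucket and uploads a replicated DR image into it. It assumes the `google-cloud-storage` Python client and valid GCP credentials; the bucket and file names are illustrative, and a product such as SUREedge DR would of course manage this through its own console.

```python
# Minimal sketch: park a rarely needed DR image in a Coldline bucket.
# Assumes the 'google-cloud-storage' client library and GCP credentials.
from google.cloud import storage

def upload_dr_image(bucket_name: str, image_path: str) -> None:
    """Create (or reuse) a Coldline bucket and upload a DR image into it."""
    client = storage.Client()
    bucket = client.bucket(bucket_name)
    bucket.storage_class = "COLDLINE"      # low-cost class for rarely accessed data
    if not bucket.exists():
        client.create_bucket(bucket)
    blob = bucket.blob(image_path.rsplit("/", 1)[-1])
    blob.upload_from_filename(image_path)  # the replicated image lands in Coldline

# Example (illustrative names): upload_dr_image("acme-dr-images", "/backups/web01.img")
```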
### Access to advanced computer simulations made easy: Fortissimo Marketplace brings together key providers and SMEs Today, Fortissimo launched the Fortissimo Marketplace, a new platform for brokering High Performance Computing (HPC) services. The Fortissimo Marketplace will simplify access to high-end simulation services, helping companies improve productivity and significantly reduce costs. Fortissimo is a new company, emerging from a successful EU-funded initiative, the Fortissimo Project. Small and medium-sized enterprises (SMEs) have often struggled to take advantage of high-performance simulation and modelling, due to its intrinsic complexity and price. The Fortissimo Marketplace, as a tailor-made self-service platform, aims to overcome those barriers by providing on-demand access to hardware, software and the necessary expertise to run advanced simulations. Service providers joining the platform benefit from a targeted sales channel, expert matchmaking with user requirements and a fulfilment service for end-user accounting and billing. At launch, Fortissimo already has 139 partners, including manufacturing companies, application developers, domain experts, IT solution providers and HPC Cloud service providers, from 19 countries. Proven business advantages delivered through simulation and modelling Using simulation and modelling on High Performance Computing systems is widely seen as an effective but very expensive design and development tool, only accessible to the largest and most financially powerful companies. Analysts IDC have estimated[1] that each dollar invested in HPC returned, on average, US$356 in revenue and US$38 in profits or cost savings. However, the supercomputers and software required to run large-scale, highly sophisticated simulations, plus the expertise of skilled professionals to handle the simulation tasks, require substantial capital investment. This means the initial cost of setup and operation can be prohibitive for potential adopters. Affordable access to computer simulations for all The Fortissimo Marketplace takes the complexity out of the process of procuring high-performance simulation and modelling, and markedly reduces costs. Through either turn-key packages or Fortissimo's matching service, SMEs from a broad range of vertical markets - including automotive, aviation, medical engineering, manufacturing, and the energy and environmental sectors - can gain cost-effective access to advanced simulation and modelling services operating on a cloud infrastructure of HPC resources coupled to software applications, expertise, and tools. Larger enterprises or academic institutions can of course benefit from the advanced simulation and modelling services as well. Clients can immediately access and purchase services after registering on the platform. Registering is free of charge. Portal of choice for HPC service providers Fortissimo Marketplace is open to any service provider and offers flexible terms for pricing, payment and service presentation. It is a goal of the Fortissimo Marketplace to create value and opportunities for all parties in the design process. This includes Independent Software Vendors (ISVs) looking for a reliable partner to implement their Software-as-a-Service strategy, as well as Service Providers wishing to provide expertise or resources. The platform provides a pre-pay or pay-per-use, one-stop shop for hardware, visualisation, software and expertise that can be accessed as "tools". Through its streamlined and well-thought-out approach, the Fortissimo Marketplace aims to become the portal of choice for advanced computer simulations and service provision delivered by Europe's major HPC technology providers. Two types of membership are offered. A Gold membership enables ISVs, Solution Providers, Integrators and selected HPC Centers to advertise and promote their services via the Fortissimo Marketplace. With a Platinum membership, the Fortissimo Marketplace additionally provides an integrated sales platform with a fulfilment service that includes end-user accounting and billing. Fortissimo Project experiments prove the potential of digital modelling The Fortissimo project has conducted over 50 experiments within the pilot phase of the Fortissimo Marketplace. As a result of its Fortissimo experiment, IES, a leading building-performance modelling company, can now run simulations in hours instead of days and at half the cost of an in-house system capable of delivering the same results.
Similarly, supercar manufacturer Koenigsegg recouped its investment in HPC experimentation with Fortissimo in less than three months, corresponding to a 1.5% reduction in the overall development costs of its new Koenigsegg One:1 supercar. In a Fortissimo experiment, business-relevant simulations of industrial processes are implemented and evaluated. Each experiment is collaborative and includes application and HPC experts or ISVs who become part of the Fortissimo consortium. In the second stage of the Fortissimo project, which was initiated in November 2015, more than 30 experiments have been started. Successfully concluded experiments become case studies, which are made available to other users through the Fortissimo Marketplace to inspire evaluation of HPC simulation. Today's experiments cover subjects as diverse as improving energy efficiency, the aerodynamics of light aircraft, CO2 emissions prediction for automotive engines, low-pressure die-casting, cabling routing architecture optimisation and many more. ### Is an All-Cloud Company Possible? Believe it or not, there are still people debating whether cloud has a place in the world of business or whether it is just a passing phase. It is all starting to feel like the good old days when various people were telling the world that the internet would never take off. Whichever source you consult, you find that the number of companies that utilise the cloud is growing and shows no signs of stopping. However, would it be possible to run an entire company from and in the cloud, exclusively? Could one run a relatively successful business that employs a dozen employees, works only online and uses no software beyond operating systems and software that runs solely in the cloud? Talking Purely Infrastructure Before we can address the "peripheral" concerns and philosophies of operating exclusively in the cloud, we need to cover the infrastructure. In other words, are there enough cloud-based tools and SaaS services to allow a company with 12 employees to operate without a hitch and, hopefully, grow? For one, it is possible to register a business online. Depending on where you live, your government will have provided an online service to register your company. By using a cloud service called DocuSign, you can sign everything you need online. This will also come in handy later. The next step would probably be an office suite of some kind, allowing you to take care of the simplest of tasks. Google Docs or LibreOffice will be more than capable of getting things done. Next, you will probably need a piece of collaboration software, and the choice is yours here. If you want a bit more help in managing your employees, you can always go for an employee time clock management tool. You will also find that there are many cloud-based accounting solutions out there for you to choose from. Even if you decide to run an e-commerce business of some kind, you can nowadays hire a third-party warehousing and shipping solution that you can manage online, without needing to be physically present. In essence, whatever you can think of, there is a company that offers it as a cloud-based service, without the need to run anything on your company computers. If you were to start a business today, going all-cloud is a possibility.
[easy-tweet tweet="Whatever you can think of, there is a company that offers it as a cloud-based service." hashtags="cloud, tech, business"] The Obvious Advantages Doing it full-on cloud-based has some advantages that are quite obvious. For one, running our hypothetical 12-people company using nothing but online services and cloud solutions will be much, much cheaper than renting out office space, purchasing all kinds of equipment and traditional software solutions. Often, running a tiny company using these solutions will have such low costs that it will seem almost unreal. Moreover, scaling up and down will be far easier due to scalability has become one of the most important features of most SaaS solutions. As your profits and demands grow, so will the services that you pay for. No waste. The adaptability of your new company will also be a huge bonus, as you will not have to worry about old equipment being compatible with new software and new updates making, for example, a portion of your data unusable. A Few Challenges Of course, running an all-cloud company will also have its challenges, especially if one decides to run it without actually having any office space to talk of. This will include onboarding new employees who will need a bit of time to get the hang of all the software that is used to keep everything running. [easy-tweet tweet="Running an all-cloud company will also have its challenges, especially the lack of office space" hashtags="cloud, tech, business"] Also, there is the security issue which many people believe is still not handled in a perfectly satisfying way in the cloud. It will be a long time before people are truly and convinced they can store their essential data off-site. There are also certain limitations in the performance of cloud-based software solutions and while this is not an issue most of the time, there are situations in which these short delays and minor problems simply won't do. It should be pointed out that this is a problem that might (and probably will) be easily solved in the future. The Transitional Cases At the moment, the vast majority of businesses are in the transitional phase where they are moving certain parts of their business to the cloud while keeping the essential ones in-house. They are meeting some challenges on their way. It is a question whether they plan on going all-cloud in any foreseeable future or whether they plan on staying hybrid for a long time. Their future probably does look all-cloud, but it will be some time before it becomes the present. At present, however, a company can be all-cloud. If it wishes. ### Go “All In” with Cloud-based Video Communication Business communications are evolving at lightning speed. Communication technology that was considered acceptable just a couple of years ago has quickly become outdated. As companies employ a larger volume of remote workers who rely heavily on multiple communication platforms, there is a higher demand than ever for the most modern, intuitive offerings. Remote workers and their in-office colleagues want (and expect) easy (consumer-like) anytime, anywhere access to collaboration tools that allow them to interact face-to-face whether they’re at home, in the office or somewhere in-between. [easy-tweet tweet="Business communications are evolving at lightning speed." 
hashtags="comms, tech, cloud"] As IT professionals find it increasingly costly time-consuming, and in some cases impossible to upgrade on-premise video conferencing equipment and infrastructure to meet the changing needs of their organisations, a growing number are switching from working with multiple on-premise solutions to choosing one cloud-based Software as a Service (SaaS) video conferencing provider. A successful move to cloud-based video communication technology requires a few key steps: Review Cloud Advantages and Capabilities First, ensure you’re making the decision that best meets the needs of your organisation. Most businesses move to a cloud solution in search of the following benefits: Reduced Cost — IT departments are under pressure to reduce expenses. Investing in on-premise video conferencing infrastructure and hardware can cost hundreds of thousands of dollars. A cloud-based video conferencing service extends IT budgets much further than a lump sum upfront payment for an on-premise solution. Increased Flexibility — It 's hard to deliver and support video conferencing sessions that may be scheduled with little to no notice, but this kind of flexibility and agility is required in a dynamic business environment. With an intuitive, cloud-based application, IT administrators can get valuable time back by enabling users to become more self-sufficient. And enterprise-class cloud-based video communication platforms offer more flexibility to scale up or down as per business and user requirements than those services that grew out of consumer applications. Simplified Automation — IT admins who choose a cloud-based video conferencing vendor need no longer struggle with the decision of whether to build and manage the video conference infrastructure in-house, freeing up valuable time, money and effort. A web-based admin console gives IT administrators easy access to dashboards to check usage statistics and to customise settings. Increased Device Diversity — Video conferencing software needs to support a broad range of “bring your own device” (BYOD) technology. It can be challenging for on-premise support staff to stay current on all of the different smartphones and operating systems upgrades, so organisations often prefer to give that task to a cloud provider who can also handle seamless integration and support. Professional cloud-based video conferencing solutions bring collaboration to the devices employees use every day from the browser- and desktop-based application for laptops and tablets to mobile applications for smartphones. No need to worry about software version or complicated updates. Interoperability — On-premise video conferencing products usually lack interoperability and still live in a proprietary software world that requires users to purchase an expensive gateway appliance to call third-party video communication devices. Organisational decision makers who move video conferencing to the cloud make interoperability the vendor’s responsibility and avoid being saddled with incompatible equipment that leaves users in isolated silos. [easy-tweet tweet="Video conferencing products lack interoperability and still live in a proprietary software world" hashtags="tech, cloud, software"] Choose the Right Vendor Once you’ve determined that a cloud-based video conferencing solution is right for your organisation, it’s time to vet your vendors. 
Decision makers should pay close attention to the vendor service level agreement and support, ask about outage levels and discuss vendor availability. This is a critical business tool that must have the highest standards of support and service levels. Those who are interested in continuing to provide their employees with the benefit of a traditional face-to-face meeting in a conference room, along with the ability for remote party dial-in and face-to-face meeting options, should look at a hybrid cloud-based video conferencing solution with a conference room-based hardware component to offer employees the best of both worlds. Implement Effectively [easy-tweet tweet="Even if the software is a perfect fit, it won’t work if it isn’t implemented correctly." hashtags="tech, cloud, IT"] Last, effective implementation is critical — even if the software is a perfect fit for your organisation, it won’t work if it isn’t implemented correctly. Efficient implementation includes: Choosing a Point Person — selecting the right team and establishing who does what and when mitigates the likelihood of a stalled application and ensures that everyone knows who has the final word. Establishing a Timeline — Imposing a clearly defined timeline and ensuring the vendor will be available when needed eliminates the risk of ongoing implementation with no end in sight. Maintaining Focus — resisting the temptation to think about every detail rather than focusing on the big picture is vital to moving the implementation forward — take care of the high-level issues first and enjoy the additional features after implementation. Listening to the Expert — Clients often expect software vendors to configure the new software to match their internal operations, which can lead to long-term negative consequences. Remembering who the expert is — the vendor when it comes to the new software and the organisation when it comes to business operations — is vital to implementation success. Gaining User Buy-in — After a long vetting process, it’s common to assume the internal sales process is over. Nothing could be further from the truth. The employees who will be using the new software the most may not have been involved in the approval process. Ensuring that they value — and champion — the use of the new software will increase the likelihood of a timely implementation. It is important to gain the trust and buy-in of the users and give them a say in the implementation. As businesses continue to employ distributed workforces around the world, the demand for an enterprise-ready video communication solution that can provide a seamless, connected experience will continue to increase exponentially. Organisations that go “all in” and commit to improving employee communication and collaboration stand to benefit from a workforce that returns the favour with higher productivity, less staff churn and great employer – employee relationships. ### Disruptive Pitch UK - Series 1 - Episode 1 First episode of Disruptive Tech TV's live online pitching show #DisruptivePitch Once a month for the next six months - Disruptive Tech TV presents a new live show, supported by Arrow ECS UK, where emerging and startup technology companies will have 3 minutes to pitch what they do. Each episode, which will be dedicated to a specialist industry, will put the pitches up against our judges. The finalists from each round will then join us for the final show and battle it out to become the One Pitch. 
Disruptive Pitch on Twitter: https://twitter.com/disruptivepitch Disruptive Pitch on Facebook: https://www.facebook.com/disruptivepi... Official Site: https://www.disruptivetech.tv/disrupt... ### Embracing the cloud effectively in four simple steps As understanding of the benefits of the cloud continues to grow among organisations worldwide, so too does cloud adoption. Gartner predicts the global public cloud services market will grow 16.5 percent this year, as more companies pursue a digital business strategy and shift away from legacy IT services, adopting cloud-based services instead. [easy-tweet tweet="As understanding of the benefits of the cloud continues to grow among organisations worldwide" hashtags="tech, cloud, IT"] Seeing competitors migrating files or applications to a public cloud computing platform – reaping the rewards of achieving greater flexibility, usability and cost savings, among others – is putting pressure on companies to quickly catch-up. However, rushing to migrate an organisation’s entire IT infrastructure to the cloud all at once would require pausing all business operations, and devoting an unlimited amount of budget and engineering support to this mammoth task. Of course, this would be impractical. In reality, businesses do not have unlimited resources and cannot pause operations. So, how can companies make a move to the cloud quickly and efficiently, while addressing their needs and available resources? The key is not to migrate everything at once. More often than not, organisations are not as far behind as they might fear. Adopting a considered approach such as the one below will guarantee a smooth and efficient move to the cloud. Step 1: Decide where data will live Firstly, organisations should determine which applications and information stores they wish to migrate to the cloud and which ones they don’t. For example, an e-commerce company may only need to scale up certain IT resources to match the various shopping seasons. However, it would probably want to keep confidential and sensitive data, such as customers’ credit card numbers, in the data centre, where it is entirely under its control. For companies that are expanding and opening branches in other countries, opening IT operations to support a new region used to be an expensive process that could take up to a year or more. It involved renting data centre space, and hiring people to set it up and manage it. Now, working with an existing cloud vendor, businesses can co-locate applications at the provider’s local data centre – a process that takes a lot less time and effort. Step 2: Migrate apps to the cloud By far, the most common reason for an enterprise to adopt cloud computing is for data storage.  Files stored in the cloud are safe from the dangers that might threaten physical storage, with the added benefit of being accessed from anywhere. In fact, according to recent studies, 90 percent of enterprises will increase annual spending on cloud computing this year, with 70 percent of those cloud adopters reporting they use cloud infrastructure primarily for file storage. However, companies are increasingly looking to move more complex operations to the cloud, such as application delivery and development. For example, in Europe, almost half of enterprises are already using advanced cloud services to run business applications. That being said, migrating applications to the cloud is not as simple as moving files from local storage hardware to a cloud provider’s platform. 
This requires re-engineering those apps to ensure they optimally leverage cloud resources, and that they perform just as well as (or better than) the existing data centre-based version. Step 3: Achieve full visibility Moving to a hybrid IT environment and implementing a mix of cloud and on-premise applications has some advantages, such as reduced costs and more IT agility. However, dealing with a wider variety of applications that run on different infrastructures and are spread across many locations also creates security and productivity issues. The Riverbed Global Application Performance Survey 2015 showed that when applications fail to meet performance expectations, they directly impact productivity and the company’s bottom line. [easy-tweet tweet="Hybrid IT environments and implementing a mix of cloud and on-premise applications has advantages" hashtags="cloud, tech, IT"] Organisations running apps in the cloud should not have to sacrifice control for the sake of convenience and cost. And in today’s complex hybrid world, you need visibility over all applications no matter how or where they’re deployed or consumed — SaaS, IaaS, in data centres or at remote sites. The solution is to establish end-to-end visibility into application performance across the entire network. To close the performance gap, organisations need to create a clear line of sight into how apps are performing, and how they impact the end-user experience. By identifying the cause of performance issues, IT can fix them before users even notice. Once the organisation has diagnosed and fixed performance issues, it can ensure peak performance of applications across any network. Step 4: Improve to succeed Once their apps have been migrated to the cloud, organisations have an opportunity to reassess their processes to increase agility and automate processes along the release pipeline. The key to continuous improvement is putting in place a strong feedback loop that guarantees quality is not sacrificed for agility. As a result, teams throughout the organisation can stay focused on innovation and collaboration, benefitting users and ensuring quality – ultimately delivering what the business needs to grow. There’s still time The latest technologies enable real-time, continuous data capture and analysis so that companies can view network delays, and provide speed, insight and control no matter where data is stored. These new tools are enabling enterprises to overcome their long-held fears over the security and accessibility of cloud computing services. According to Riverbed’s survey, 96 percent of respondents already use some cloud-based enterprise applications in their work today, and 84 percent expect the use of these applications to increase within their companies over the next two years. Businesses that are just starting to incorporate cloud computing into their IT architecture are not alone, and they shouldn’t feel rushed or try to move too much at one time. But if they are still undecided about migrating to the cloud, now is the time to develop a plan to deliver their business-critical applications over the Internet. Organisations of every size will realise significant cost savings almost immediately, and performance improvements for all their users, no matter where they are in the world. ### How small business owners can identify the right apps for their business How many times have you checked your phone today? 25 times? 50? Do you even notice you’re doing it anymore?
According to research from Nottingham Trent University, we check our phones a remarkable 85 times a day, spending five hours a day surfing the web and checking apps. [easy-tweet tweet="How many times have you checked your phone today?" hashtags="tech, cloud, apps"] Mobile devices have transformed us into an app-driven society, both in our personal lives and in business. In fact, our global research found that nearly three-quarters (71 percent) of UK small business owners rely on mobile or web-based applications to run their operations, eliminate administrative tasks and grow their firm. It’s unsurprising that business apps have become so popular, they can benefit SMEs in many ways.  Whether it’s to improve customer communication by allowing them to see and respond to requests and queries in real time or using cloud-based tools that enable a team to access information anytime and anywhere, apps can be a simple and cost-effective solution for many of the challenges today’s SMEs face. However, with so many apps out there, it’s important to find the right ones to meet a particular business’ needs, and not use technology for technology’s sake as sometimes, this can be more destructive than productive for a small business. Two in five (40 percent) UK small business owners using apps believe there are too many to choose from and are unsure of which are best suited for their business. Getting the balance right can be challenging, which is why it’s useful to carry out an analysis of the day to day business processes and from there, assess which apps can simplify and streamline these. Identifying the apps best for your business For many small business owners, the biggest challenges can include lack of time and money. In fact, 44 percent of SMEs come close to – or do – run out of cash within the first three years, so it’s critical to keep a close eye on the books. This is why business apps are so important, allowing SMEs to have real-time information so they can make decisions on the go and cut down administrative processes significantly. [easy-tweet tweet="The number one pain point for SMEs is controlling costs and expenses" hashtags="fintech, apps, tech, cloud"] According to the research, the number one pain point for SMEs is controlling costs and expenses, and as a result, the most popular apps were those in the banking and finance category (59 percent) followed by payments (49 percent) and payroll management (37 percent). Let’s face it, the ability to track your finances in real-time, whether you’re on a train, sat in the office or on the sofa at home is crucial to helping small business owners stay on top of what’s coming in and going out of the firm. The upshot of using these apps is that small business owners can spend less time balancing the books and more time focused on doing what they love – running their business. With technology continually changing the way we work, doing things “the old way” just isn’t an option for SME owners determined to ensure they stay close to customers and grow their business in the face of increasingly stiff competition. However, it’s important not to get bogged down trying to use all the apps that are out there just because others are talking about it. Business owners must make informed decisions and use apps to create efficiencies, enhance productivity and improve business processes. 
### Global Media Brands Make Strategic Investment in Social Video Creation Platform Wochit Social video creation platform Wochit today announced receipt of $13M in funding, with investment from media giants ProSieben, Singapore Press Holdings’ SPH Media Fund, Carlo de Benedetti and several existing investors including Redpoint, Marker LLC and Cedar Fund. The funds will be used to further enhance Wochit’s award-winning technology and to expand its business with publishers and content creators worldwide. The investment from existing customers, each representing dozens of brands, is a testament to the strategic value of the Wochit platform, which supports rapid, cost-effective production of timely, socially-optimized video. By empowering storytellers with intelligently-applied automation and the largest library of pre-licensed assets, Wochit ensures quality results for publishers seeking a cost-effective video solution. Jens Doka, Chief Product Officer, ProSiebenSat1 Digital, whose premier brands include ProSieben, Sat.1, Sixx and Kabel Eins commented “The proliferation of content platforms has resulted in an incredible demand for video. With Wochit, ProSiebenSat.1 is able to produce video content at the scale needed to address this growing need, even providing tools to help deliver that content in the right format for each distribution channel.” “I believe that a bright future for publishers and media companies is possible. We need to adapt our offerings to meet the wants and needs of audiences. Wochit is an invaluable company emphasising video-based storytelling and giving publishers a powerful tool to make an historically resource intensive proposition into a user-friendly, cost-effective one.” Said investor and publisher Carlo de Benedetti. Says Wochit co-founder and CEO Dror Ginzberg, “I’m proud that such esteemed media organizations value the impact Wochit is having on the industry to a degree that they are taking a stake in our continued growth. We’re confident that this investment will give us the ability to continue to drive results for our current partners and broaden our client base into new markets around the world.” The investment by SPH Media Fund, the corporate venture capital arm of Asia's leading media group, Singapore Press Holdings (SPH), shows Wochit’s growing presence in the Asian market. A Wochit customer, SPH is already engaging audiences with Wochit video across properties such as The Straits Times, Lianhe Zaobao, The Business Times, AsiaOne and Stomp. ### The Linux Foundation and edX Announce New, Free DevOps Course The Linux Foundation, the nonprofit advancing professional open source management for mass collaboration, today announced its newest massive open online course (MOOC) is available for registration. The course, LFS161x - Introduction to DevOps: Transforming and Improving Operations, is offered through edX, the nonprofit online learning platform launched in 2012 by Harvard University and Massachusetts Institute of Technology (MIT). The course is free and will begin November 16. Registration is now open for this free course. This is the fourth edX MOOC offered by The Linux Foundation. Its first course, Intro to Linux, has reached more than 600,000 students globally and continues to grow in registrations. The others are Intro to Cloud Infrastructure Technologies and Introduction to OpenStack. 
“DevOps is a rapidly growing career field, which provides strong job security, highly competitive compensation and opportunities for growth, but there is a lack of experience and talent in the market that needs to be remedied,” said Jim Zemlin, executive director of The Linux Foundation. “From our past experience, we knew edX was the best partner to help address this lack of talent by providing a platform to make DevOps education easily accessible.” DevOps started as a cultural movement, designed to remove silos between developers and operations personnel. It originated with operations personnel who felt they would be more effective managing IT infrastructure if they better understood how and why it was built. Organizations that implement DevOps best practices have been demonstrated to be more agile, flexible and effective in designing and implementing IT practices and tools, resulting in higher revenue generation at a lower cost. The Linux Foundation is already helping develop technology for DevOps professionals through its open source projects, and now, through the launch of this course, it is providing the training opportunities needed to educate a talent pool to support those projects. LFS161x is organized around the three basic principles of DevOps, otherwise known as the “Three Ways”, which outline the values and philosophies that guide DevOps processes and practices. Students will learn how to: Explain the need to do DevOps. Understand the DevOps foundations, principles, and practices. Understand, analyze, and map value streams. Explain and implement the deployment pipeline. Illustrate the concept of Continuous Delivery. Create a problem-solving culture. Explain the concepts of blameless postmortems. Monitor meaningful infrastructure and business metrics. Converge change management and DevOps. Understand how reliability engineering and safety culture are critical to DevOps success. Create a learning organization. The course instructor, John Willis, has over 35 years of experience, focusing on IT infrastructure and operations. He has helped early startups such as Chef, Enstratius (now Dell) and Docker navigate the DevOps movement, and is one of the original core organizers of this movement. Willis has been a prominent keynote speaker at various DevOps events throughout the years, and is a co-author of the “DevOps Handbook”. “DevOps is a new and rapidly growing field that shows immense promise,” said edX CEO and MIT Professor Anant Agarwal. “This course furthers the professionalization of the industry, and we are proud to continue our work with The Linux Foundation to expand edX’s technical and open source educational offerings.” The course includes six chapters, each with a short graded quiz at the end. A final exam is also required in order to complete the course. Students may take the complete course at no cost, or add a verified certificate of completion for $99. About The Linux Foundation The Linux Foundation is the organization of choice for the world's top developers and companies to build ecosystems that accelerate open technology development and commercial adoption. Together with the worldwide open source community, it is solving the hardest technology problems by creating the largest shared technology investment in history. Founded in 2000, The Linux Foundation today provides tools, training and events to scale any open source project, which together deliver an economic impact not achievable by any one company.
### Cloud Storage Vs Cloud Backup In late June 2016, artist Dennis Cooper was browsing his Gmail when he was suddenly informed that Google had deactivated his account. While many have undoubtedly had their Google accounts suspended or altogether deleted, this incident had particularly far-reaching consequences. The artist also used Google’s blogging app, Blogger, to host his popular blog, The Weaklings, which, when deleted, made 14 years of work inaccessible. [easy-tweet tweet="Many have undoubtedly had their Google accounts suspended or altogether deleted" hashtags="google, tech, cloud"] The artist, perhaps rightfully, became disgruntled with Google for not warning him before removing his blog, whilst the media and activists are up in arms because it reeks of censorship. It remains unclear why the blog was deleted, although online speculation suggests it may have been due to content violating the platform’s terms of service. Google have yet to comment on the matter. In the meantime, data backup providers are resting their heads in their hands as they continue to read about it. The deletion of the blog and possible censorship aside, there is a more notable problem that many have not chosen to focus on - the importance of data backup. Whilst nobody likes to be on the receiving end of an “I told you so”, the simple fact is that if Mr Cooper had backed up his work from the blog, he could simply have restored the content and migrated it to a platform with perhaps more amenable terms and conditions. This highlights an all too common misconception that because something is in the cloud it is safe from accidental or intentional deletion. Dennis Cooper’s case proves that this is simply not true. The same misconception costs hundreds, perhaps thousands, of people a year as they realise their online data hasn’t been backed up. [easy-tweet tweet="Just because work is stored in the cloud on a blog doesn't mean it is safe from loss." hashtags="tech, cloud, IT"] In the case of Dennis Cooper, just because his work was stored in the cloud on his blog didn’t necessarily mean it was safe from loss. The reality of the situation was that the data was very much at risk of being lost, with no formal backup process in place. This is evident simply by reading Google’s terms of service, which state: WE DON’T MAKE ANY COMMITMENTS ABOUT THE CONTENT WITHIN THE SERVICES, THE SPECIFIC FUNCTIONS OF THE SERVICES, OR THEIR RELIABILITY, AVAILABILITY, OR ABILITY TO MEET YOUR NEEDS. WE PROVIDE THE SERVICES “AS IS”. When using any cloud services, including services such as Office365 and Dropbox, it is crucial to have a backup of everything, separate from the system in which the live data resides. Every day we read about the importance of online backup for businesses and are constantly exposed to cautionary tales and cases of bad practice. However, less emphasis is placed on backing up an individual’s own information, which is just as important. How to avoid data loss due to backup oversight To reduce the risk of data loss, it’s imperative to understand that cloud storage and cloud backup are not the same thing. Cloud storage is for live data and often provides little or no protection against data loss. Whilst it’s great to be able to enjoy the benefits of storing data in the cloud and having access to it from anywhere, a backup should always exist outside of the system in which the live data resides. With the advent of cloud technology, we have the opportunity to back up our data within the cloud.
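To make that distinction concrete, a backup in this sense is simply an independent copy held outside the system where the live data lives, under separate credentials. The short Python sketch below is purely illustrative rather than a recommendation of any particular tool or provider: it assumes an S3-compatible object store of your choosing, and the endpoint, bucket name and local folder are all placeholders. It copies a local working directory (for example, an export of a blog) into a dated prefix so that a deleted or corrupted original can later be restored from somewhere entirely separate.

```python
import datetime
from pathlib import Path

import boto3  # assumes the boto3 SDK is installed; credentials come from the environment

# Placeholder settings - swap in your own backup provider's details.
BACKUP_ENDPOINT = "https://backup.example.com"   # any S3-compatible object store
BACKUP_BUCKET = "my-offsite-backups"             # hypothetical bucket name
LOCAL_DATA = Path("~/blog-export").expanduser()  # the live data you want to protect


def backup_directory() -> None:
    """Copy every file under LOCAL_DATA into a dated prefix in the backup bucket."""
    s3 = boto3.client("s3", endpoint_url=BACKUP_ENDPOINT)
    date_prefix = datetime.date.today().isoformat()

    for path in LOCAL_DATA.rglob("*"):
        if path.is_file():
            key = f"{date_prefix}/{path.relative_to(LOCAL_DATA)}"
            s3.upload_file(str(path), BACKUP_BUCKET, key)
            print(f"backed up {path} -> s3://{BACKUP_BUCKET}/{key}")


if __name__ == "__main__":
    backup_directory()
```

The tooling itself is beside the point; what matters is that the copy sits in a different system, under different credentials and different terms of service, from wherever the live data happens to be.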
Many businesses and individuals see the benefits of having local copies of their data as well. In particular, this can be useful in situations where Internet access is intermittent or where very large restores are needed fairly often. Lastly, you can never be too cautious when it comes to data protection, so back up your backups. If you utilise an online backup service provider, the service provider will usually do this for you through redundant copies in multiple data centres. This could save you a lot of time, energy and the inconvenience of potential data loss. If you use a service provider that only retains one copy of your data, now is a good time to explore what else is out there. ### Big Data, not Big Brother: using information to drive a better customer experience Big Data is a powerful currency. Its business value lies in the wealth of customer insights it provides which, when put to good use, can provide you with that all-important competitive edge. Companies are thus investing enormous resources in acquiring as much personal information as they can – building massive datasets that look impressive, but don’t necessarily achieve the desired results. [easy-tweet tweet="Big Data is a powerful currency." hashtags="bigdata, tech, cloud"] Customers, however, are increasingly reluctant to hand over their details. For many, it’s simply to avoid receiving unsolicited spam. The growing threat to cyber-security, however, has added a far more serious concern. In fact, the rise in computer hacking, data breaches, identity theft and online surveillance is somewhat reminiscent of George Orwell’s dystopian novel ‘Nineteen Eighty-Four’. In the book, Big Brother is the symbolic leader of the totalitarian state ‘Oceania’, controlling and suppressing its inhabitants using…data. While today’s information-driven society is not an exact reflection of this story, companies need to pay better attention to how, and for what purpose, they use people’s personal details. When data has been sourced correctly, stored securely and used ethically, it can drive incredibly effective customer communications. Here are three ways to get the most out of your Big Data – and steer clear of Big Brother. Improve the quality of your data Recent research reports that 63.3% of UK businesses have out-of-date customer information on file. This lack of clean, correct data means that critical communications are not reaching the intended recipient – or any recipient at all. Not only is this a waste of time and money, but it can also be very damaging to your brand. Further research estimates that this reputational damage can result in a substantial 30% drop in revenue. [easy-tweet tweet="Here are three ways to get the most out of your Big Data" hashtags="bigdata, tech, cloud"] In other words, it pays to keep your data clean and current. A regular data health-check will weed out all the useless information that is either incorrect or irrelevant. Once all the clutter has been removed, the value of your remaining data will soon become apparent: you’ll be able to keep a far closer eye on customer buying behaviour and spot any business opportunities (such as a gap for an up- or cross-sell) before they have a chance to shop around. Make it personal The volume of outdated data is at odds with the growing customer demand for more personalised content. Regardless of whether you are contacting customers to make a sale or to build general brand awareness, each email, text or social media post is a reflection on your company.
Spelling a customer’s name wrong, not respecting their privacy settings or contacting them at an inopportune time when you should know better, will only weaken your relationship and diminish your brand equity. The more you know about your customers, the more on-point and relevant your communications strategies will be. People are more sensitive than ever before to receiving reams of unsolicited content and will not take kindly to any abuse of their personal data – real or perceived. Show respect for the information you have and make sure that you use it to provide some benefit to your customers. Deliver a good customer experience If customers feel like they aren’t getting anything in return for the data they’ve provided, they’ll unsubscribe from your mailing list or send your collateral to the junk folder. The point of Big Data is not to stalk your customers online or spam them with anything and everything the marketing team dreams up – it’s to offer them information via a convenient channel that enhances their working lives. [easy-tweet tweet="Companies rely on data-driven technologies to win their share of a very competitive market." hashtags="bigdata, tech, cloud"] Also, bear in mind that this is not a one-way transaction. Each time you contact a customer, you are initiating a conversation. If your client gets back to you with a complaint or query, make sure that you have the resources in place to respond quickly and politely. Companies rely on data-driven technologies to win their share of a very competitive market. When data is used correctly and consistently, it can nurture stable, long-term customer relationships that are essential to achieving positive business growth. Technology may increasingly drive the world, but human values such as loyalty and trust are still worth their weight in gold. ### UK Spectrum Policy Forum Report Identifies Success Factors for Making the UK a 5G World Leader The UK Spectrum Policy Forum (SPF) has today launched a report on the critical success factors for deployment of 5G in the UK. The report prepared by Analysys Mason on behalf of the SPF, summarises the UK wireless industry’s views on how 5G will be introduced into the UK market, and the resulting spectrum implications. Based on the work of the SPF spectrum applications and demand working group, the report makes a series of recommendations on the spectrum needs for 5G and the key policy actions required at the UK level to support these needs being met. David Meyer, Chairman of the SPF, comments, “It’s vital that the UK takes global leadership of the development and implementation of 5G, to secure its place as a world-leading digital economy. This report highlights the breadth of technologies and services that will make use of 5G, and therefore how crucial it is to get implementation right.” The discussions that have taken place within the SPF during 2016 recognise that 5G networks are poised to bring new mobile experiences to consumers as well as to provide wireless connectivity solutions that will meet the requirements of a range of vertical industries. As such, the use cases being foreseen within 5G are extremely broad and include several industry segments that have traditionally operated dedicated networks outside the cellular network domain. 
Meeting these diverse user requirements will require a range of spectrum bands to be used, spanning from bands below 1GHz up to bands in the millimetre-wave portion of radio spectrum that are being studied internationally in accordance with ITU-R Resolution 238, as defined by the World Radiocommunication Conference in 2015. Use of millimetre-wave bands will be particularly focussed on the highest-capacity outdoor and indoor hotspots (requiring very high data rates and volumes of traffic). Considering that 5G networks are unlikely to be commercially available before 2020 in many markets, innovative new applications and services are likely to emerge that will place increasing demands on access to suitable spectrum. On millimetre-wave bands, the SPF is supportive of the UK regulator’s current position arising from the various European and international working groups on 5G spectrum, that focus should be on frequency bands that have the greatest potential to be globally harmonised, while also taking account of the need to consider the early harmonisation of a pioneer band in Europe. The SPF urges Ofcom to continue to promote these positions in Europe and internationally. The SPF also notes that Ofcom plans to award new mobile spectrum in the 700MHz and 3.4–3.8GHz bands in the UK, and that these bands could also support initial deployment of 5G services. Key to the success of 5G in the UK will be the timely award of spectrum licences in these bands, with suitable conditions for 5G deployment. In addition, sub-6GHz bands already licensed for 2G, 3G and 4G mobile will continue to be needed, and the networks deployed in those bands are expected to evolve towards 5G based on market demand. ### OpenStack® Foundation welcomes four new Gold Members from China and Europe The Board of Directors of the OpenStack Foundation approved four members of the OpenStack community as the newest Gold Members of the Foundation, supporting strategic opportunities in public cloud, telecom networks and the Chinese market. The decision was made during the Foundation Board's meeting at the OpenStack Summit in Barcelona. The new members are City Network and Deutsche Telekom in Europe, and 99Cloud and China Mobile in China. Europe and China are fast-growing markets for OpenStack, the most widely deployed open source software for building clouds. Both markets are growing faster than the U.S., supported by healthy startup communities, large vendor support and a growing network of regional OpenStack Day events at which users and developers gather to exchange best practices and hear from operators of clouds based on OpenStack. Until recently, growth in OpenStack deployments was chiefly found in the private cloud market, including 99Cloud Inc., which is focused on providing enterprise-level solutions and deploys private clouds in different industries in China with tailor-made vertical templates, including financial services, government and power. The addition of the other three Gold Members highlights the emergence of OpenStack-powered public clouds in non-U.S. markets. Public cloud has emerged as a growth driver for OpenStack in Europe and APAC due to factors such as data sovereignty and local/regional presence of data centers.
The two telecoms, China Mobile and Deutsche Telekom, join a growing list of telecoms and operators of large networks who are using OpenStack not only to build public and private clouds but also to take advantage of the strong set of Network Functions Virtualization (NFV) capabilities and ecosystem support that OpenStack offers. About the Newest Gold Members 99Cloud - 99Cloud Inc. is China's leading professional OpenStack service provider to enterprises. 99Cloud is one of the first corporate members of the OpenStack Foundation and is ranked in the top 10 in OpenStack community code contribution worldwide. Additionally, it ranks in the top 10 worldwide in contributions to Horizon, Murano, Tacker, Kolla, Senlin, Sahara, Trove and Freezer. 99Cloud is the sponsor and organiser of Trystack.cn, the largest OpenStack community in China, and is also China's largest professional OpenStack training institution. 99Cloud Inc. has been approved as one of the first official training partners worldwide for the OpenStack Foundation's COA certification. China Mobile - China Mobile is the leading telecommunications services provider in China and boasts the world's largest mobile network. The company operates one of the world's largest OpenStack public cloud platforms and is building a new, larger private cloud based on OpenStack over multiple locations. China Mobile is focusing on 5G networks, NFV, IoT and cloud computing and has chosen OpenStack as a fundamental platform for NFV and IoT applications. China Mobile is becoming an OpenStack leader through standardisation and industrialisation. City Network - City Network is a leading provider of infrastructure services in Europe. The company provides public, private and hybrid cloud solutions based on OpenStack from 27 data centers around the world. Its IaaS and global networks make it easy to build solutions that fully exploit the capabilities of OpenStack over multiple locations. Through its industry-specific IaaS, City Network can ensure that customers comply with demands originating from specific laws and regulations concerning auditing, reputability, data handling and data security, such as Basel and Solvency. With the most data centers and features in Europe, and global reach via nodes in APAC and North America, it allows for ease of use while offering complete redundancy and data integrity. Deutsche Telekom - Deutsche Telekom is one of the world's leading integrated telecommunications companies, with more than 156 million mobile customers, 29 million fixed-network lines and around 18 million broadband lines. At the OpenStack forum in Germany, Deutsche Telekom's business customer division announced further enhancements to Open Telekom Cloud, its public cloud offering, first launched in March 2016 at Cebit. Open Telekom Cloud is based on OpenStack and is the company's first public cloud offering running on OpenStack. Deutsche Telekom chose OpenStack in order to keep further development and integration as simple and "open" as possible. Foundation membership is limited to 24 Gold Members, who vote annually to select eight Board Directors representing the class. In addition to the four new members elected in Barcelona, Gold Members are Aptira, CCAT, Cisco, Dell, DellEMC, DreamHost, EasyStack, Ericsson, Fujitsu, Hitachi, Huawei, inwinSTACK, Juniper Networks, Mirantis, NEC, NetApp, Symantec, UnitedStack and Virtuozzo.
"These new Gold Members highlight the ongoing growth and exciting new use cases for OpenStack software," said Mark Collier, COO of the OpenStack Foundation. "Europe, APAC, public cloud and telecom each are proof points that OpenStack is attracting new users and new workloads at a quickening pace. We're looking forward to what the community can learn from these contributors to our project." One of our team is live tweeting from OpenStack Summit Barcelona – @MrDavidOrgan ### IBM leads major milestone in cloud-to-cloud interoperability at OpenStack Summit, Barcelona IBM today announced the results of the Interop Challenge, a cloud industry initiative aimed at demonstrating how OpenStack delivers on the promise of interoperability among vendors across on-premise, public and hybrid cloud deployments. The challenge was issued to OpenStack Foundation peers by Don Rippert, general manager for IBM Cloud Strategy, Business Development and Technology, at the OpenStack Summit in Austin in April 2016, and called for fellow cloud vendors to show proof of interoperability by the end of October. To date, 18 major OpenStack vendors, including IBM, AT&T, Canonical, Cisco, Deutsche Telekom, DreamHost, Fujitsu, Huawei, Hewlett Packard Enterprise (HPE), Intel, Linaro, Mirantis, OpenStack Innovation Center (OSIC), OVH, Rackspace, Red Hat, Suse and VMware, have come together to work through the challenge and achieve enterprise interoperability, successfully deploying and running workloads across multiple OpenStack clouds. Many of the participants demonstrated this interoperability on stage at the OpenStack Summit in Barcelona today. Since OpenStack was created in 2010, customers have been attracted to the innovation of the open source platform, as thousands of developers globally work together to develop the strongest, most robust and most secure cloud platform possible. Global community members and corporate industry leaders are committed to investing in the platform, contributing to its projects, improving the source code and improving the OpenStack deployment process and ongoing operational experience. It is ever evolving and ever improving. Customers are able to source different interrelated components of their cloud solutions from different vendors of their choice to create the combined environment that best meets their needs, but until now, the potential of OpenStack was limited by the lack of proof of interoperability among various OpenStack environments. Through improved OpenStack cloud interoperability, customers are protected from vendor lock-in and able to easily uproot workloads and move them to new cloud providers if they choose. In addition, they benefit from the seamless interoperation among their chosen vendors, enabling them to better leverage hybrid OpenStack cloud environments and use whatever combination of cloud services best meets their requirements. "What customers want from open source projects is innovation, integration and interoperability," said Mr. Rippert. "Nobody has doubted the innovation and integration capabilities within the OpenStack projects, however some doubted whether the vendors supporting OpenStack would work together to achieve interoperability. Today with this significant milestone, we are proving to the world that cross-vendor OpenStack interoperability is a reality. When it comes to OpenStack, our hope is that this demonstration of working interoperability will reduce customer fears of vendor lock-in.
We at IBM look forward to continued work with the community and fellow OpenStack vendors to continually improve interoperability to meet the goals of our customer base.” The Interop Challenge uses deployment and execution of an enterprise workload with automated deployment tools, demonstrating the capabilities of OpenStack as a cloud infrastructure that supports enterprise applications. This is not the first effort at interoperability within the OpenStack community, but it is the first to focus on workload portability. The Interop Challenge is therefore complementary to the ongoing work of the DefCore Committee and RefStack project, which focus on defining must-pass API tests and designated sections of code that must be present in OpenStack Powered (TM) products. Together, the two efforts help move the community toward a more interoperable future. As part of the group’s success in working through and achieving portability between on-premise, public and hybrid OpenStack deployments, the 18 participants have also created automated tools for deployment of applications across a variety of OpenStack environments. They have additionally generated collateral on best practices, providing an interoperability roadmap for other OpenStack members in the future. Leveraging the collateral and best practices resulting from this challenge, the OpenStack Interop members will continue to collaborate across the community to drive further interoperability improvements for OpenStack going forward. Comments from Interop Challenge Participants Canonical: "As a founding member of OpenStack, Canonical has been instrumental in helping to define OpenStack interoperability and builds over 3,500 clouds a month in its OpenStack Interoperability Lab. Portability of workloads between cloud platforms is vital if organisations are to realise the benefits of agile, elastic infrastructure so we are fully committed to making this a reality in OpenStack and beyond." – Mark Baker, Head of OpenStack Product Management, Canonical Cisco: “As companies move to the hybrid cloud model, portability becomes crucial to avoid re-designing and breaking workloads. On-premises and public OpenStack-based clouds need to expose interfaces that enable consistent application deployments. This year’s Interop Challenge made great strides in this movement. We are pleased with the achievements and to be part of this great community collaboration with cloud providers. – Lew Tucker, VP and CTO, Cloud Computing, Cisco Systems Deutsche Telekom: "Today's Cloud users do expect open standards, ongoing innovation, independence and freedom of choice of their platform, and OpenStack stands for these principles. We also need to prove the abilities to move cloud workloads. That’s why Deutsche Telekom is highly committed to the Interop challenge. It’s not only about proof, but also about a substantial demand of our customers in the cloud era." – Thomas Aschenbrenner, Director, Open Telecom Cloud, DT DreamHost:  “DreamHost helped lay the cornerstone of the OpenStack Foundation because we believe that every user of a cloud should have the freedom of movement for workloads without hassle when crossing the border. 
We're stoked to demonstrate that DreamHost Cloud is part of the wider OpenStack union, where applications can be deployed on the best clouds for the job without substantial modifications." – Jonathan LaCour, Vice-President, Cloud, DreamHost Hewlett Packard Enterprise: "Transforming to a multi-cloud, hybrid infrastructure demands an enterprise-grade, open source cloud platform. HPE has been involved in OpenStack from its earliest days and has helped make it mainstream. We have invested heavily in OpenStack because we know enterprises need economical, open source IaaS solutions. Having successfully completed OpenStack Interop testing for HPE Helion OpenStack, we are committed to interoperability so that customers can use the right mix of cloud solutions that meet their unique requirements." – Mark Interrante, Senior Vice President, R&D, HPE Cloud Huawei: "Huawei, a cloud technology and service provider, offers both public cloud and private cloud solutions to enterprise customers. Our customers need a hybrid IT environment, where they can have access to both private clouds and public clouds with workloads running and moving across seamlessly. OpenStack's 'Interoperability' vision makes that possible. Huawei is fully committed to helping OpenStack transform the Interoperability vision into reality. We are excited to be part of the Interoperability Challenge, which marks an important milestone toward the Interoperability vision." – Ren Zhipeng, President, IT Cloud Computing Product Line, Huawei Linaro: "Successfully completing the Interop challenge with our Enterprise Reference Platform running across multiple ARMv8 SoCs clearly demonstrates the strength of Linaro's collaborative engineering model. We are excited that ARM is now a fully integrated and tested architecture for OpenStack Clouds." – George Grey, CEO, Linaro Mirantis: "One major benefit of OpenStack is the promise of avoiding vendor lock-in. Verifying that workloads run correctly across multiple OpenStack clouds is a valuable next step toward assuring interoperability between vendors, and Mirantis is happy to contribute." – Craig Peters, Director, Product Management OpenStack Innovation Center: "The OpenStack Innovation Center (OSIC), a joint investment by Rackspace and Intel, aims to accelerate the enterprise adoption of OpenStack through continuous enhancements that support increasingly robust and diverse workloads. OSIC's participation in the Interoperability Challenge is one of many ways that we are delivering on this shared commitment, helping to advance the platform's enterprise capabilities, interoperability, and ease of deployment. Such investments help ensure OpenStack's long-term vitality among enterprises around the world." – David Brown, Alliance Manager, OpenStack Innovation Center OVH: "OVH Public Cloud continues to grow fast, and since the beginning we have had to adapt quickly to meet our customers' needs and our environment. Our customers are asking for more and more interoperability, which is a must-have, and we have made it a priority for months now." – Damien Rannou, Public Cloud Tech Lead at OVH Rackspace: "Since founding OpenStack with NASA in 2010, Rackspace has continued to focus on advancing the community.
Our involvement and support of the OpenStack Interop Challenge is another step towards furthering this open-source platform that has become the de facto standard for enterprise private clouds.” – Bryan Thompson, GM, Rackspace OpenStack Private Cloud Red Hat: “Red Hat has long been committed to our vision of an open hybrid cloud and the promise of interoperability across traditional, private cloud and public cloud environments. The OpenStack Interop Challenge is an important example of how this vision can become a reality for customers, and demonstrates how an open source ecosystem can work together." –Jim Totton, vice president and general manager, Platforms Business Unit, Red Hat Red Hat is a trademark or registered trademark of Red Hat, Inc. or its subsidiaries in the U.S. and other countries. Suse: “Customers are looking for assurance that applications written on one OpenStack cloud will work on another. Demonstrating that solutions from multiple vendors can support the same workload will give customers confidence that OpenStack delivers on the promise of avoiding vendor lock-in.” – Pete Chadwick, Director of Product Management, SUSE VMware: “VMware has always emphasized two core values of OpenStack. First: open cloud services and open APIs. Second: vendor neutrality of the framework. By consistently maintaining VMware Integrated OpenStack as an OpenStack Powered™ product, we have delivered on the API promise to customers while greatly simplifying some of the biggest challenges of OpenStack. Further, by successfully participating in the interoperability challenge as well as continuing our work with the DefCore Committee, we confirm our commitment to preserve the vendor neutrality of OpenStack.” – Purnima Padmanabhan, vice president, product management, Cloud Management Business Unit, VMware ### Retail branch IT as a strategic asset Innovative “bricks and mortar” retailers are investing in building a totally integrated customer experience – one that mixes in-store with on-line purchasing to redefine the client journey. This omnichannel approach requires the seamless integration of the customer journey across all platforms, including social, mobile and physical, which in turn requires the existence of new instore IT functionality that is capable of supporting this mission. [easy-tweet tweet="Innovative retailers are investing in building a totally integrated customer experience" hashtags="tech, cloud, IT"] Retailers are moving in this direction in response to customer demand. Omnichannel customers spend 3.5 times more than other shoppers, while 84 percent of customers believe that retailers should be doing more to integrate their online and offline channels. The key differentiator that physical retailers have in achieving this strategy is their branch network, and in-store IT capability is the foundation on which the strategy must be built. In this model, in-store IT, when integrated with on-line and mobile customer touchpoints, becomes a strategic asset. But for many experienced retail professionals, the idea of viewing retail branch IT as a key strategic asset may seem a contradiction in terms. Historically, branch IT has been seen as a brake on the business, at best costing a disproportionate amount of time and effort to manage, and at worst restricting the ability of the company to implement innovative customer engagement strategies. 
But there is a growing realisation that the branch network must be better integrated into the IT estate so that it can be better used to optimise the customer journey and leverage brand equity. Investment is required to achieve this, but, given the margin pressures of this sector, it must be delivered in a cost-effective manner. Retailers face a challenge to deliver a coherent digital strategy that encompasses omnichannel retailing. This requires a complete alignment of retail systems to serve customers across all channels in all locations, including in-store in real time. New retail omnichannel strategies cannot be supported by old in-store IT solutions. They require a new type of in-store infrastructure that is able to support new, advanced applications running across multiple device types, with real-time analytics and secure local storage for the masses of data such applications produce. And they must be integrated with the cloud and support consolidated network functionality including firewalls, routers and WAN optimisation. [easy-tweet tweet="New retail omnichannel strategies cannot be supported by old in-store IT solutions." hashtags="IT, Tech, Cloud"] The first and natural reaction of many retail professionals when contemplating new branch IT requirements is to wonder how on earth they are going to support such advanced functionality, distributed across multiple sites, without any local support resource. Branches already present a disproportionate IT support challenge, and the thought of moving from today’s simple branch IT to the virtualised, hyper-converged infrastructure required for advanced in-store applications is worrying indeed. These concerns have led many retailers to operate a cloud-first strategy, which affords many benefits, including a reduced emphasis on infrastructure and a greater focus on application and business process. However, a cloud-first strategy is not a cloud-only strategy. Many of the new and innovative applications (as well as most legacy ones) require IT infrastructure on the remote site. The challenge is to deliver state-of-the-art, often complex and usually dynamic IT to remote sites, centrally and efficiently managed, without the need for expensive on-site expertise and without over-taxing already stretched centralised IT resources. And this requires a redefinition of the role of the local server. Fortunately, new developments and trends in cloud managed hybrid IT have created a new opportunity for retailers and CIOs to transform retail branch IT into an asset rather than a burden. Two major and interconnected developments — hybrid cloud and hyper-convergence — are allowing us to redefine servers in a way that meets the needs of advanced in-store IT capability without the advanced local IT skills. With cloud managed servers, capacity can be put back in-store at a low cost with a high SLA. By using cloud managed servers based on a hyper-converged software platform, organisations can obtain their own flexible cloud in-store without compromise. When combined with a cloud-first strategy, this provides an excellent base for running the business and experimenting with new applications to enhance the customer experience. Cloud managed servers control all the infrastructure of the server, including hardware, firmware, virtualisation and management. They manage compliance and security and are kept current 24/7. They take responsibility for cloud service integration, backup and disaster recovery.
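How that central management might work can be illustrated with a deliberately simplified sketch. The Python below is not any vendor's actual agent; it assumes a hypothetical management endpoint (manage.example.com) that publishes a desired-state manifest for each store, which a small on-site agent periodically fetches, compares against what is actually running, and reports back on, so that drift is detected centrally rather than by someone standing in the branch.

```python
import json
import socket
import subprocess
import urllib.request

# Hypothetical central management service; not a real vendor API.
MANAGEMENT_URL = "https://manage.example.com/branches"
BRANCH_ID = socket.gethostname()  # e.g. "store-0042"


def fetch_desired_state() -> dict:
    """Pull the desired-state manifest for this branch from the central service."""
    with urllib.request.urlopen(f"{MANAGEMENT_URL}/{BRANCH_ID}/manifest") as resp:
        return json.load(resp)


def running_services() -> set:
    """List the services currently active on the branch server (systemd assumed)."""
    out = subprocess.run(
        ["systemctl", "list-units", "--type=service", "--state=running", "--no-legend"],
        capture_output=True, text=True, check=True,
    ).stdout
    return {line.split()[0] for line in out.splitlines() if line.strip()}


def report_drift() -> None:
    """Compare desired versus actual services and send any difference to the centre."""
    desired = set(fetch_desired_state().get("services", []))
    actual = running_services()
    drift = {"missing": sorted(desired - actual), "unexpected": sorted(actual - desired)}
    body = json.dumps({"branch": BRANCH_ID, "drift": drift}).encode()
    request = urllib.request.Request(
        f"{MANAGEMENT_URL}/{BRANCH_ID}/status", data=body,
        headers={"Content-Type": "application/json"}, method="POST",
    )
    urllib.request.urlopen(request)


if __name__ == "__main__":
    report_drift()
```

In a real deployment the reconciliation, patching and failover would be handled by the hyper-converged platform itself; the sketch simply shows why no on-site expertise is needed when the branch only ever converges on whatever the centre declares.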
Cloud managed servers deliver packaged IT functionality, which can be run in any branch, regardless of local IT support skills or the resilience of local network connectivity. They are the ideal solution to the needs of the store network, in as much as they allow a flexible IT strategy with applications being placed where they deliver the most benefit, and support resources optimised across the whole distributed organisation. They provide the capability to quickly implement and maintain a consistent technology platform for all locations and businesses, with a single set of data, and the support best practices that arise from operating on a single repeatable technology stack. And hyper-converged cloud managed servers are resilient by design because they support scale-out architectures with automated failover in the event of hardware faults. [easy-tweet tweet="Hyper-converged cloud managed servers are resilient by design" hashtags="Cloud, data, tech"] In effect, cloud managed servers provide an advanced in-store infrastructure which can be integrated with the retailer’s on-line presence to deliver a multi-channel customer experience. This can be achieved without complexity and the need for in-store branch expertise, or regular and costly site visits from the central IT function. The fundamental shift in cost-effective, high-functionality, in-store IT delivered by cloud managed servers is a game-changer in the world of the omnichannel customer experience. They offer new opportunities for retail CIOs to provide advanced branch application capability, integrated with on-line and cloud-based channels, and a less complex IT infrastructure without the need for extensive distributed IT expertise. Zynstra will shortly be holding a breakfast event in central London dedicated to this topic, hosted by retail expert and consumer champion, Clare Rayner. ### OpenStack gains ground in the enterprise with business-critical workloads running on larger deployments across diverse industries: 451 Research OpenStack deployments are getting bigger. Users are diversifying across industries. Enterprises report using the open source cloud software to support workloads that are critical to their businesses. These are among the findings in a recent study by 451 Research regarding OpenStack adoption among enterprise private cloud users. About 72 percent of OpenStack-based clouds are between 1,000 and 10,000 cores and three-quarters choose OpenStack to increase operational efficiency and app deployment speed. The study was commissioned by the OpenStack Foundation. The data were previewed at the OpenStack Summit Barcelona, alongside a diverse set of users speaking directly about their experience, including Banco Santander, BBVA, CERN, China Mobile, Comcast, Constant Contact, Crowdstar, Deutsche Telekom, Folksam, Sky UK, Snapdeal, Swisscom, Telefonica, Verizon, Volkswagen, Walmart and many more. Key findings from the 451 Research study include: Mid-market adoption shows that OpenStack use is not limited to large enterprises. Two-thirds of respondents (65 percent) are in organisations of between 1,000 and 10,000 employees.[1] OpenStack-powered clouds have moved beyond small-scale deployments. Approximately 72 percent of OpenStack enterprise deployments are between 1,000 and 10,000 cores in size. Additionally, five percent of OpenStack clouds among enterprises top the 100,000 core mark.
OpenStack users are adopting containers at a faster rate than the rest of the enterprise market with 55 percent of OpenStack users also using containers, compared to 17 percent across all respondents.[2]  OpenStack supports workloads that matter to enterprises, not just test and dev. These include infrastructure services (66 percent), business applications and big data (60 percent and 59 percent, respectively), and web services and ecommerce (57 percent). OpenStack users can be found in a diverse cross section of industries. While 20 percent cited the technology industry, the majority come from manufacturing (15 percent), retail/hospitality (11 percent), professional services (10 percent), healthcare (seven percent), insurance (six percent), transportation (five percent), communications/media (five percent), wholesale trade (five percent), energy & utilities (four percent), education (three percent), financial services (three percent) and government (three percent). Increasing operational efficiency and accelerating innovation/deployment speed are top business drivers for enterprise adoption of OpenStack, at 76 and 75 percent, respectively. Supporting DevOps is a close second, at 69 percent. Reducing cost and standardising on OpenStack APIs were close behind, at 50 and 45 percent, respectively. Our research in aggregate indicates enterprises globally are moving beyond using OpenStack for science projects and basic test and development to workloads that impact the bottom line, said Al Sadowski, research vice president with 451 Research. This is supported by our OpenStack Market Monitor which projects an overall market size of over $5 billion in 2020 with APAC, namely China, leading the way in terms of growth. Our keynotes this morning highlighted enterprises doing work that matters with OpenStack, said Mark Collier, COO of the OpenStack Foundation. The research gives an unbiased look into the plans of enterprises using private cloud, and they're telling us that OpenStack is not merely an interesting technology, but it's a cornerstone technology. Companies are using OpenStack to do work that matters to their businesses, and they're using it to support their journey to a changing landscape in which rapid development and deployment of software is the primary means of competitive advantage.  New Enterprise User Case Studies Available Online Members of the OpenStack community have joined forces to take advantage of the momentum in enterprise adoption. The collaborative effort, dubbed "The World Runs" on OpenStack, was launched at the OpenStack Summit Barcelona, using the social media hashtag #RunsOnOpenStack. The cornerstone of the campaign is a collection of OpenStack enterprise user success stories that feature many industries, workloads and organizations using the software around the world.   About OpenStack® OpenStack is the most widely deployed open source software for building clouds. In use globally at large and small enterprises, telecoms, service providers, and government/research organizations, OpenStack is a technology integration engine that supports the diverse ecosystem of cloud computing innovation. Current news and alerts signup at: http://www.openstack.org/news/signup. One of our team is live tweeting from OpenStack Summit Barcelona - @MrDavidOrgan ### Using psychology to compel consumer action Understanding the precise triggers that compel customers to buy is complex. 
Technology, from apps to bots, is making it easier to connect with and respond to consumers, but building loyalty and motivating purchase, ultimately requires a strategy that blends left-brain science and right-brain emotional intelligence. What’s the difference between right- and left-brain function The left part of the brain powers analytical thought. It is more linear in its approach. As a result, the typical left-brained buyer makes purchasing decisions based on hard facts, data, empirical proof or historical factors. The right-brained buyer on the other hand, values creativity and relationship building, as the right side of the brain affects the imagination and emotions. These types of consumer go with their guts and act on intuition. [easy-tweet tweet="The left part of the brain powers analytical thought." hashtags="tech, IoT"] So, how can brands create a customer experience that appeals to both sides? Three tips for marketing to the logical left brain: Be a realist: Emphasise the practicality of your product or service and explain the tangible impact at key points in the purchasing journey Prove your point with data: Lead with facts and relevant information and you’ll make the strongest case to compel action e.g. highlighting case studies and new customer savings will demonstrate exactly what the consumer stands to gain. A/B test the results: Like a scientist in the lab, always adjust your experiment based on results and tailor marketing to what’s proven to resonate with your customers. Three tips for marketing to the emotional right brain: Have a vision for a better world: Right brainers respond to holistic, altruistic, and more personal approaches. These customers aren’t loyal to companies; they’re loyal to ethos, values and belief systems. Develop your “brand personality”: You need to know who you are before your customers do. And then you need to act consistently and congruently in line with your aims and ideals. Be human: Technology (like bots) can guide the left-brained buyer along a logical path toward purchase. It can also empower customer care agents giving them the insights that lead to a better, personalised experience. These agents can see what’s worked or not in the past and tailor their conversation to their customers’ preferences. The complex psychology behind loyal consumers Whether your customers are left brained or right brained, a strong customer experience appeals to both, as it combines a smooth, sequential series of stages while also having to rely on correct emotional engagement. However, research has shown that it’s the emotional connection that keeps consumers coming back time and time again. Therefore brands need to identify what their emotional motivators are and make sure these are conveyed in their messaging strategy, and every time they connect with their customer so they can appeal to their instincts and intuition. Although we may like to think of ourselves as rational beings, emotion is intrinsically linked to decision-making. With this in mind, marketers need to pair the data-supported value-add of their product or service to how the consumer feels about it. Achieve this, and you create a loyal customer for the long term. ### PTC and Hewlett Packard Enterprise to Collaborate on Internet of Things Solutions PTC and Hewlett-Packard Enterprise (HPE) announced today a planned collaboration to facilitate the availability of the Converged IoT Solutions, based on PTC ThingWorx software and HPE Edgeline Systems. 
As planned, the collaboration will focus on industrial use cases, incorporating PTC’s ThingWorx® IoT platform technologies and HPE’s hardware and data services. PTC and HPE will demonstrate the latest example of the combined solution at IoT Solutions World Congress from October 25 to October 27 in Barcelona, Spain. The planned collaboration of HPE, a global leader in computing and data processing, with PTC, a leader in visualization and augmented reality, is expected to result in the availability of a pre-tested best-in-class hardware and software combination that will enable customers to solve IoT data management problems and make decisions from sensor data more effectively. As planned, the collaboration between PTC and HPE would demonstrate complete hardware and software technologies specifically designed for IoT edge computing and smart, connected solutions. This includes sensors, edge compute, real-time edge analytics, machine learning, and augmented reality. The demonstration at IoT Solutions World Congress builds on technologies that are currently being featured at HPE’s IoT Innovation Lab in Houston, Texas, and at leading industry events such as PTC’s LiveWorx® event. “Companies working together to solve industry challenges is the fastest way to accelerate IoT innovation and bring about meaningful business value,” said Andrew Timm, chief technology officer, PTC. “Our work to date with HPE demonstrates what is possible when two leading IoT companies come together with a mutual goal of addressing some of the IoT’s foremost challenges and creating new IoT solutions.” “The IoT promises access to immense amounts of pent-up data which hold great insights that enable customers to accelerate business, engineering, and scientific outcomes. The HPE Edgeline Converged IoT Systems coupled with Aruba connectivity and PTC’s leading technologies are a strong combination for unleashing these data and insights for our customers,” said Dr. Tom Bradicich, VP & GM, Servers and IoT Solutions, Hewlett Packard Enterprise. Connect with PTC and HPE at IoT Solutions World Congress PTC and HPE will be demonstrating an example of their latest combined IoT technology at IoT Solutions World Congress in Barcelona. PTC will be at booth D561 and HPE will be at booth D519. Additionally, PTC and HPE will deliver a keynote presentation on the ways that the IoT enables business value on October 27, 2016 from 16:15 to 17:00 in Room 1: Accelerating Insights at the IoT Edge (BT13) with HPE’s Olivier Frank and PTC’s Andrew Timm. Additional companies exploring IoT opportunities with PTC and HPE include National Instruments, OSIsoft, and Deloitte. ### Compare the Cloud Launches the Disruptive Pitch TV Show [embed]https://www.youtube.com/watch?v=FHhTxpGVdkA[/embed] Compare the Cloud’s 24-hour online technology channel, Disruptive Tech TV, today announced the premiere of its monthly technology pitch live show, Disruptive Pitch. The show, filmed in central London, will see emerging technology providers pitch to a panel of judges, including guest judge experts from the industry. [easy-tweet tweet="Disruptive Pitch will premiere as a monthly technology pitch live show." hashtags="tech, pitch, "] Disruptive Pitch is a one-hour show, free to all viewers, and streams from www.comparethecloud.net/live, https://livestream.com/disruptivelive and www.disruptivetech.tv/live online channels. The first episode broadcasts on 26th October 2016, 4:30PM GMT.
Viewers can also follow www.twitter.com/comparethecloud and www.twitter.com/disruptivelive for show updates. “Today we find that many young and innovative companies struggle when trying to create exposure for their businesses,” said Andrew McLean, producer of Disruptive Pitch. “Compare the Cloud Limited is launching www.disruptivepitch.tv which will be a monthly event for the next six months focussing on disruptive technology organisations, with the first show streaming live”. Recordings of the show and excerpts will be available via: www.youtube.com/comparethecloud, www.youtube.com/disruptivelive and www.twitter.com/comparethecloud. The format of this show will be a live stream backed by a significant advertising campaign to bring in 10,000 relevant online viewers during each episode. Each company pitching will be required, within a 3-minute time limit, to pitch their respective business or organisation to the four judges and audience to go through to the final round. “No PowerPoint,” added Andrew McLean. “We’re looking for people to explain their technology as they would to a client - their elevator pitch, if you will. As the show progresses we will work with each finalist to create the perfect pitches for the final show”. Each month we will have two regular judges from the Compare the Cloud team and two distinguished guest judges from the IT industry. Each team will be scored by industry experts out of ten for their respective pitch, with two finalists announced at the end of each show. [easy-tweet tweet="Companies entering the live show will also have the opportunity to network with major IT vendors" hashtags="tech, disruptive"] Companies entering the live show will also have the opportunity to network with major IT vendors and partners before and after the scheduled event. The show is currently looking for UK technology companies available for filming in the London area. If your organisation wishes to be represented in future shows, drop us a note here http://www.disruptivepitch.tv/#contact or via pr@comparethecloud.net with details about your technology. For further information, watch the show trailer here https://www.youtube.com/watch?v=6_-MsbwRqEQ&feature=youtu.be ### WiFi on trains could leave commuters vulnerable to hackers Do you ever have long commutes? Are you ever annoyed by the constant cutting out of signal? Would life not be made so much easier if trains just had WiFi? From 2017 free WiFi will be rolled out across a number of UK train operators, thanks to the Department for Transport’s £50 million initiative to increase WiFi on trains. You could soon go from Bournemouth to Newcastle without fear of not being able to scroll through Instagram due to poor signal. The Department for Transport (DfT) is currently running a programme whereby the department will offer funding to train operators to launch projects in order to implement WiFi on trains. With recent demos showing the way in which insecure WiFi hotspots can lead to those accessing the network being targeted with ransomware, Intel Security launched a Freedom of Information request on the DfT to find out whether it is setting up mandatory guidelines for public train WiFi to accompany this programme. “While this will hugely benefit a number of commuters, who can work remotely during their journey to and from work, this also comes with significant security risks if the right precautions are not implemented,” explains Raj Samani, CTO EMEA Intel Security.
A Freedom of Information Request (FOI) uncovered that the DfT “has not linked receiving funding for the on-train Wi-Fi with including a specific cyber security strategy.” While the department will be providing some suggested cyber security guidelines, it is not a mandatory requirement for train operators to follow these in order to be deemed eligible for funding to launch open WiFi on their trains. “I severely hope that despite the lack of mandatory guidelines, each individual train operator puts strict security measures in place to protect commuters who access their WiFi,” continued Samani. “Free WiFi hotspots are a breeding ground for hackers and our latest demo has uncovered that there is a potential new threat set to hit users of public WiFi. Intel Security ran a demo using a fake WiFi hotspot and proved that those that access the network can be targeted with ransomware. The ransomware then lies dormant on the user’s computer until they return to the office and log on to the business network. The ransomware then activates and locks down network assets, holding them to ransom. “Alongside the train operators, I urge commuters to take precautions when accessing public WiFi. Creating a virtual private network (VPN) is one of the best ways to keep your browsing session private.”   ### Are businesses ready for the rise of remote working? Remote working has become one of the most sought after benefits for employees in a broad spectrum of industries in the digital age. Government figures, which were analysed by the TUC last year, revealed the total number of individuals now working from home on a regular basis now stands at 4.2 million - and, as time progresses, this figure looks likely to increase further. [easy-tweet tweet="Remote working has become one of the most sought after benefits for employees" hashtags="cloud, tech, BYOD"] The ability to work from home is also an increasingly important factor among individuals who are looking for a new job, and offering this benefit is just one of the ways modern businesses can compete for the best talent. Research carried out by digital marketing specialist I-COM recently revealed that 45 percent of professionals look for permission to work from home on a regular basis when searching for a new role, while only 20 percent of respondents were currently able to take advantage of remote working. However, with the increased freedom presented by remote operation comes a lot of responsibility, and potentially a whole host of issues for the businesses that fail to consider the implications of offering remote working to members of staff. With the number of mobile office workers reaching new heights, and bring-your-own-device (BYOD) policies now the norm, organisations can no longer ignore the need to construct an infrastructure that goes beyond the four walls of the office. Here, we put the spotlight on remote working and take a closer look at how businesses can ensure they do not leave themselves vulnerable to threats as their workforce becomes increasingly independent. Improving efficiency with the right technology Employees looking to enjoy the benefits of remote working are more productive and efficient if they have access to the right technology. Therefore, businesses should ensure they are using software that is compatible with mobile devices, so they can easily be taken on the road. It may seem obvious to many businesses, but others have so far failed to catch on to the advantages offered by Google Apps - and in particular, Google Docs. 
By adopting the use of the search giant’s cloud-based office tool, co-workers can easily collaborate on work, edit and pick up on documents while at home or in any other location. [easy-tweet tweet="Businesses should ensure they use software that is compatible with mobile devices" hashtags="tech, cloud, mobile, BYOD"] The issue of security Advancements in computing infrastructures have meant that remote employee security has become a challenge. Thanks to remote working, systems are often decentralised and diffused to create an environment where confidential data needs greater protection - the only issue is, it’s even trickier to protect. One of the key challenges businesses face in overcoming security threats presented by remote working is that they are tasked with managing a broad array of mismatched devices running different operating systems. However, it is not impossible. Businesses need to roll out efficient security strategies to protect themselves from outside threats with this new way of working. Protecting every device is just one method that IT departments should use to protect the business as a whole. Malicious software designed to swipe data often infects a computer through an email or website. Therefore, to reduce the potential of such infections, it is important to utilise the best security software and appropriate practices. Similarly, utilising the features of a cloud service provider can allow companies to maintain a high level of protection. Cloud service providers use data encryption technology while transmitting confidential information from remote locations to the company intranet. Businesses should make use of a virtual private network to ensure that all internet traffic remains fully secure. A third-party VPN service will have all the necessary security patches installed, meaning firms are continuously monitored for any signs of malware. Looking to the future While the opportunities presented by remote working may cause some business owners to question whether office life could soon be a thing of the past, there are many arguments in favour of maintaining a more traditional approach to work. Yahoo is just one example of a brand that chose to go against the grain and ban remote working altogether back in 2013, stating that: “Some of the best decisions and insights come from hallway and cafeteria discussions, meeting new people and impromptu team meetings.” [easy-tweet tweet="There are many arguments in favour of maintaining a traditional approach to work." hashtags="cloud, tech, IT, BYOD"] It is incredibly important that businesses in the 21st century are able to adapt, and remote working is just one way they can do so. Those companies offering remote working, and giving employees the chance to enjoy a healthier work/life balance, are likely to have a happier workforce. However, as technology increasingly removes the need for staff to go into the office, it is essential that employers make sure they do not become a thing of the past. ### Should Cost Be The Overriding Factor When Choosing A Cloud Platform? Cost is one of the primary considerations for enterprise organisations considering cloud migration. A cloud platform’s on-demand pricing and elasticity make it easier to achieve optimal utilisation than with colocated or in-house bare metal.
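As a rough illustration of that utilisation argument, the sketch below compares a fixed monthly cost for an on-premises server that sits idle much of the time with a pay-per-hour cloud instance; every figure in it is invented purely for the example.

```python
# Illustrative comparison of a fixed on-premises monthly cost versus paying only
# for the hours a cloud instance is actually used. All figures are invented.
HOURS_PER_MONTH = 730

def on_prem_monthly_cost(amortised_hardware=400.0, power_and_space=150.0,
                         staff_share=250.0):
    """On-premises kit costs the same whether it is busy or idle."""
    return amortised_hardware + power_and_space + staff_share

def cloud_monthly_cost(hourly_rate=0.50, utilisation=0.30):
    """Pay-as-you-go: only the hours actually consumed are billed."""
    return hourly_rate * HOURS_PER_MONTH * utilisation

if __name__ == "__main__":
    print(f"On-premises: £{on_prem_monthly_cost():.2f}/month regardless of load")
    print(f"Cloud at 30% utilisation: £{cloud_monthly_cost():.2f}/month")
    print(f"Cloud at 90% utilisation: £{cloud_monthly_cost(utilisation=0.9):.2f}/month")
```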
But cost savings are far from the only benefit of cloud migration, and ignoring other factors can lead to poor vendor choices, especially for companies without large IT and DevOps teams. Security, support and management services should also be considered. [easy-tweet tweet="Cost is one of the primary considerations for enterprise considering cloud migration. " hashtags="cloud, tech, IT"] Any company with a large on-premises infrastructure deployment is aware of the associated hardware, networking, building, and staffing costs. Migrating to a cloud platform doesn’t make those costs disappear — the cloud vendor has those costs and will pass them on. But cloud providers benefit from economies of scale that most companies can’t match. Their costs are lower, and because cloud clients only pay for what they use, they don’t have to spend to maintain, house, and staff an infrastructure deployment that may be idle for most of its lifecycle. So cost is a motive for cloud migration, but using cost as the only factor in making a decision is a mistake. Some cloud vendors — including many of the biggest names in the business — cut management services and support to the bone to compete on price. Clients get the infrastructure and the cloud capabilities, but that’s about it. We’ve all heard stories of what happens when a cloud server goes down and takes data with it: the response is usually a shrug with the implication that redundancy and reliability are your problems, not the cloud vendor’s. [easy-tweet tweet="Cost is a motive for cloud migration, but using it as the only decision making factor is a mistake." hashtags="cloud, tech, migration"] If an organisation chooses a cloud platform solely by price, they’ll have to ensure that they have an IT and DevOps department capable of making the most of a cloud platform that makes management, performance, and stability the responsibility of the client. This is especially problematic for smaller companies that don’t have the wherewithal to build great DevOps and IT departments. Decision-makers in these enterprises should think about who is going to manage their infrastructure. Choosing a cloud vendor that offers management services may push up the price, but, because of the economies of scale we discussed earlier, outsourcing will still be less expensive than cultivating the expertise in-house. Choosing on price alone is a significant cause of failure for enterprise cloud projects. Selecting a vendor that can provide management services and has existing partnerships with domain experts is often the best way for smaller companies to make the most of the cloud’s potential. ### ZTE wins Best Wireless Broadband Innovation award for Pre5G massive MIMO ZTE Corporation, a major international provider of telecommunications, enterprise and consumer technology solutions for mobile Internet, today announced that it has won the Best Wireless Broadband Innovation award for its Pre5G massive multiple input, multiple output (MIMO) technology at Broadband World Forum in London. Sponsored by Informa, the global business media company, Broadband World Forum is an important event in the broadband industry and this award further recognises ZTE’s innovation in Pre5G massive MIMO technology. In a commercial network, the single-carrier peak rate of Pre5G massive MIMO exceeds 400 Mbps, increasing spectral efficiency by four to six times when compared with that of existing 4G networks.
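For readers who want to sanity-check those figures, the short sketch below shows the basic spectral efficiency arithmetic. Note that the 20 MHz carrier bandwidth is an assumption made here purely for illustration; the announcement does not state the carrier width used in the commercial network.

```python
# Back-of-the-envelope spectral efficiency calculation for the figures quoted above.
# The 20 MHz carrier bandwidth is an assumption for illustration only.
def spectral_efficiency(peak_rate_mbps: float, bandwidth_mhz: float) -> float:
    """Spectral efficiency in bit/s/Hz is throughput divided by bandwidth."""
    return peak_rate_mbps / bandwidth_mhz

if __name__ == "__main__":
    pre5g = spectral_efficiency(400, 20)   # ~20 bit/s/Hz on the quoted peak rate
    implied_4g = pre5g / 5                 # mid-point of the claimed 4-6x improvement
    print(f"Pre5G massive MIMO: ~{pre5g:.0f} bit/s/Hz (assuming a 20 MHz carrier)")
    print(f"Implied 4G baseline: ~{implied_4g:.0f} bit/s/Hz at a 5x improvement")
```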
In addition, Pre5G massive MIMO technology is compatible with existing 4G terminals (such as 4G customer premises equipment (CPE) and handsets) so that users can enjoy a high-speed broadband experience without changing their terminals. Pre5G massive MIMO technology solves the internet’s last-mile problem by improving Internet access and therefore enhancing the user experience. Compared with digital subscriber line (xDSL) and very-high-bit-rate digital subscriber line (VDSL) technologies, massive MIMO can provide a more competitive access rate without the expense of fibre to the home. In fact, this is highly practical for both mobile carriers and fixed network carriers. Earlier this year at Mobile World Congress (MWC) in Barcelona, ZTE’s Pre5G massive MIMO base station, described as a "disruptive innovation", won both the Best Mobile Technology Breakthrough and the Outstanding Overall Mobile Technology - The CTO's Choice 2016 awards. Since 2015, ZTE has launched several commercial Pre5G massive MIMO pilot projects in multiple markets. ZTE has also signed memorandums on strategic cooperation in Pre5G/5G technologies with China Mobile, Softbank, Korea Telecom, U Mobile, and Hutchison Drei Austria for joint technical validation, testing, assessment and research and development. So far, more than 20 Pre5G networks have been deployed all over the world and ZTE will continue to promote a wider application of Pre5G. ### Lessons from the cloud With growing investments, an increasing mix of both public and private platforms, and new industry developments cropping up on a daily basis, it’s no longer a matter of debate to ask if cloud computing is here to stay – the only real debate left is now between cloud’s “computing” and “on-premise computing” capabilities! IDC recently asserted that the growth and benefit associated with major digital transformation initiatives and the implementation of “3rd platform” technologies such as mobility and data analytics would not have been possible without the cloud as the foundation. Meanwhile, Gartner predicts that more than $1 trillion in IT spending will be directly or indirectly affected by the shift to cloud in the next five years. But as our comfort with and acceptance of the cloud model leads us to migrate more processes and systems into the cloud, it also provides us with an opportunity to learn from this innovation cycle to better prepare for future ones. Specifically, why have businesses favoured the cloud and where did the acceptance, adoption and even eagerness for the cloud come from? And importantly, what skills were essential to making the change effective? Freedom to evolve The cloud has made it possible for organisations to free themselves from the tasks of selecting, procuring, installing, managing, maintaining and retiring racks of bulky onsite servers. The need for an impressive, intransigent physical system has been eradicated for an ideal, interactive off-premise alternative. So far so useful. However, the exhilarating benefit of the cloud is that it heralds IT’s change in role from fixing broken systems to fixing business problems. IT departments now have the opportunity to focus on innovation and real changes within organisations – leaving infrastructure specialists to worry about hardware investment cycles, backup scheduling and hard-disk obsolescence. Suddenly, the IT team has grown up. It empowers, evolves and maybe – hyperbole notwithstanding - even energises the business, rather than merely equipping it.
The lesson for future innovation is, therefore, to not only consider the way in which the business might immediately benefit practically from the new technology but also to look at how the IT team’s role and subjective perception may change. Where does the opportunity lie? Will the benefits largely be the removal of costs, or will the new technology remove burdens and increase the chance to contribute strategically? Negotiation of needs Some companies use the cloud because of its scalability – perhaps for backup needs, or for the elasticity associated with cloud - accommodating a wide variation in monthly or weekly demand - and would contract on a pay-as-you-go basis. But with this flexibility comes a need to negotiate with suppliers to protect the business with proper SLAs that mirror both the flexibility and the continuity the company is looking for. And, of course, a need to be able to accurately predict requirements, no matter how the business might change. So what’s the lesson to be learnt from how we dealt with the cloud for when the next significant evolution in business computing pops up? Ensuring the IT team had the soft skills to take advantage of - or to harness - the new capabilities that cloud offered was crucial to the success and speed of cloud adoption. It has shown that the technical skills, infrastructure and even imagination are one thing, but that learning the non-IT skills necessary to make it practical and acceptable to a business environment is just as crucial. When the next innovation appears, what will be the next non-IT power the IT team needs to learn? Actively learning from, and not just riding, innovation cycles is essential for the IT industry. Ours is too fast-paced an industry to miss opportunities, so as new cycles start, we must be prepared and flexible enough to capitalise quickly and efficiently, while also having the wisdom and experience to sort the innovation wheat from the chaff early. The nature and therefore usefulness of the newest technology will of course always be different as each cycle starts, but motivations for adoption and triggers for hype will remain mostly the same. Which means we as an industry need to observe and understand this cloud cycle, rather than rest on our laurels and watch it float by. ### Kaleidoscope of Clouds: Is cloud ERP right for your business? Your ERP system is the engine room of your business, supporting everything from purchasing through to the supply chain. It’s a business-critical system and with its high status comes anxiety around the best deployment option for a new ERP platform. With a kaleidoscope of cloud options on offer, it’s hard to know which, if any, will best suit your business’s needs. Earlier this year Deloitte research showed 56 percent of mid-market companies are already using some form of cloud-based services. Despite the increasing understanding and uptake of cloud by mid-market businesses, the benefits and opportunities need to be weighed up for your business. The choice isn’t as simple as ‘cloud or no cloud’ – there is a kaleidoscope of options available and the best deployment decision will be closely aligned with your business goals.
[easy-tweet tweet="56% of mid-market companies are using some form of cloud-based services" hashtags="cloud, tech, IT"] Here we explore the top five considerations when evaluating which deployment option is right for your ERP system: Having the freedom to customise Just as you want to be able to customise and configure your ERP system on-premise, the same level of personalisation should be available to you as a cloud-hosted or SaaS option. You want your business systems to work the way you do – not change the way your entire organisation works to fit with your software. The level of customisation between providers will vary, so it is worth understanding not only what is possible but also the impact when it comes to software upgrades and maintaining the customisations going forward. For example, working with UK communications service provider Comvergent Group, we’ve been able to create a personalised dashboard for individual users which is hosted in the cloud to handle complex and real-time data (rather than employees updating siloed spreadsheets on personal computers). Being in control of your data Being in control of your data is an essential requirement for many mid-market businesses. There will always be costs involved in transferring the sizable database that supports an entire business to another platform, but it should be made as simple and painless as it can be. Understanding this upfront is important. Your business circumstances are constantly changing, so you don’t want to be locked into a provider only because the cost to extract your data makes moving prohibitive. Receiving the right levels of support A good first step is to be aware of the degree of assistance provided for both the cloud platform and the software that you’re using. Increasingly, support requests are triggered by wider business issues relating to process management, staff training or the hardware devices involved. If you don’t have the business knowledge and technical capability internally to troubleshoot these kinds of issues, then you need to ensure you have a support team on hand to assist you. Service levels Following on from support, check that your cloud vendor has the appropriate infrastructure to offer the required level of response. This factor was particularly important for our customer Future Directions, a care provider for people with learning, physical and mental disabilities, which we supported through the process of migrating away from the NHS. This was a complicated task, particularly for managing and holding sensitive patient data across multiple locations. [easy-tweet tweet="The amount of data that a whole-of-business system generates is extensive" hashtags="tech, data, cloud"] The amount of data that a whole-of-business system generates is extensive, and it’s important to check that your cloud provider can handle the volume of transactions and keep pace with the drumbeat of your business. As your company grows, this will become even more important, as most FDs will want to avoid starting from scratch if the cloud provider’s functionality proves insufficient. Back-up and recovery Finally, should the worst happen, how well are you protected? It is pretty much a given that your system will be fully backed up as part of your cloud or SaaS package (you should still check what is in place and how this is charged). But there are variables around the point-in-time that the data can be restored back to, right through to automatically ‘hot switching’ to a backup system so no data is missed.
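One way to think about that point-in-time question is the worst-case window of lost data, often called the recovery point objective (RPO). The short sketch below, using illustrative figures only, shows how that window shrinks as backup frequency increases.

```python
# The worst-case window of lost data (the RPO) is roughly the interval between
# backups: fail just before the next backup and everything since the last one
# is at risk. Intervals below are illustrative only.
from datetime import timedelta

def worst_case_data_loss(backup_interval: timedelta) -> timedelta:
    """Assume the failure happens immediately before the next scheduled backup."""
    return backup_interval

if __name__ == "__main__":
    scenarios = [
        ("Nightly backup", timedelta(hours=24)),
        ("Hourly snapshot", timedelta(hours=1)),
        ("Continuous replication ('hot switching')", timedelta(seconds=30)),
    ]
    for label, interval in scenarios:
        print(f"{label}: up to {worst_case_data_loss(interval)} of data at risk")
```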
What is acceptable will depend on your business model and appetite for risk – be sure you fully understand the recovery terms. Weighing it all up To home in on the best deployment route for you, carefully consider your company’s size and growth, governance requirements, security needs, SLAs and tenancy options. Weigh up which deployment method best suits your business from a technical and operational perspective, and then evaluate the ROI and TCO of that option across many providers. If cloud hosting is right for you, then you shouldn’t have to compromise. Putting your ERP system in the cloud should deliver the same level of customisation, support, performance and business confidence as any other option. Choosing the right implementation partner will help you bring clarity to the kaleidoscope of deployment options, finding the one that best reflects your business and setting you up for sustained success. ### New regional strategy for Industries, Cloud and Services leads focus for Alcatel-Lucent Enterprise growth in year ahead ALE, operating under the Alcatel-Lucent Enterprise brand, is forming four regional sales organisations to better serve its partners, customers and their markets, by placing a focus on specific market segments for growth and accelerating the shift to new business models. Each of the four regions - Europe North, Europe South, Asia-Pacific and North America - will have its own dedicated staff supporting its defined industry-specific strategy for sales, cloud and the channel, with supporting sales enablement and services delivery. Rukmini Glanard joins ALE to head up the Europe North region, which includes North, Central and Eastern Europe, Russia and Israel, and brings over 17 years’ experience in the global telecoms and IT industry, having previously held senior executive roles with companies including Orange Business Services and Sita. Within these four regional sales organisations, ALE will extend its successful industry approach in hospitality to the healthcare, education, government and transportation sectors. This industry approach delivers tailored solutions, integrated with industry-specific business processes to enable new business opportunities, address growing demands for IoT-enabled communications systems and simplify the job of IT staff while reducing costs for businesses.
With a redesigned channel program that offers specialisations, training and greater revenue possibilities, ALE is simplifying how partners do business with ALE and enabling them to tap into new revenue streams with innovative, consumption-based technologies. ### APIs and cloud: The basis for the Estate Agent of the Future Technology is crucial to the survival of the estate agent. The extraordinary pace of technological advancements that have been seen over the last ten years has transformed the way businesses and consumers live and work. However, estate agents are only just waking up to the fact that it can do more than just speed up processes. [easy-tweet tweet="Technology is crucial to the survival of the estate agent." hashtags="tech, cloud, API"] The adoption of technology puts estate agents in a unique position to offer clients the best of both worlds – the opportunity to meet consumer expectations, while simultaneously managing their businesses better. However, the implementation of technology has historically been met with resistance from traditional estate agents. Many believe technology implementation is more disruptive than innovative, and that technology advancements offer much more benefit to the buyer than the agent. This is not the case. Technology has the advantage of delivering significant opportunities to streamline and adapt to market changes and can offer estate agencies the same sort of digital transformation benefits being seen in other industries. Cloud applications have made it quicker and easier to adopt technology, and it has started a movement away from using high-cost, on-premise, bespoke systems. Many of the legacy systems still used by estate agents were designed at a time when technology wasn’t developed in a unified way, and multiple formats were difficult to integrate with other applications. This made technology deployments quickly outdated, as new functions and processes took considerable time to install rather than being easily bolted on. Times have changed. Technology is now being developed with the needs of the estate agent truly at the front of mind of developers. One of the technologies truly underpinning the Agent of the Future is the API. APIs can tie multiple functions together and draw complementary technologies into a single, cloud-based platform that gives agents a full view of their business. By connecting via the internet rather than via slow, cumbersome, on-premise cables, newer cloud-based systems offer agents real mobile working. It enables them to react faster and deliver a seamless service regardless of their location. APIs provide the flexibility to integrate legacy and new third-party software for a seamless user experience. [easy-tweet tweet="Technology is now being developed with the needs of the estate agent truly at the front of mind of developers." hashtags="cloud, tech"] What does that look like in practice? Take the third-party company that can quickly generate an end-of-tenancy check for a property, with the report seamlessly delivered into the agent’s software against the customer record. This is the digitally connected, API-enabled estate agent and proves how a third party can provide a service seamlessly into the agent via the API. The alternative – the inventory company completing a report, scanning it, emailing it to the agent who then has to attach it to a customer record – is the equivalent of the legacy infrastructure that is preventing bricks-and-mortar agents being more responsive.
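As a rough sketch of what that API-enabled flow could look like in code, the example below pushes a completed report onto a customer record in a single call. The endpoint, field names and API key are hypothetical and stand in for whatever platform the agent actually uses.

```python
# Hypothetical sketch of a third-party inventory service delivering its report
# straight onto the customer record in an agent's platform. The URL, fields and
# key are invented; they illustrate the pattern, not a real product's API.
import requests

AGENCY_API = "https://api.example-agency-platform.co.uk/v1"   # hypothetical endpoint
API_KEY = "replace-with-a-real-key"                           # placeholder credential

def attach_inventory_report(customer_id: str, property_id: str,
                            report_pdf: bytes) -> dict:
    """Deliver a completed end-of-tenancy report against the customer record,
    replacing the scan-and-email round trip of the legacy process."""
    response = requests.post(
        f"{AGENCY_API}/customers/{customer_id}/documents",
        headers={"Authorization": f"Bearer {API_KEY}"},
        files={"file": ("end_of_tenancy_report.pdf", report_pdf, "application/pdf")},
        data={"property_id": property_id, "type": "end_of_tenancy_check"},
        timeout=30,
    )
    response.raise_for_status()
    return response.json()   # e.g. the newly created document record

if __name__ == "__main__":
    with open("report.pdf", "rb") as f:
        attach_inventory_report("cust-1842", "prop-77", f.read())
```

In practice the third party would authenticate with credentials issued by the agent's platform, but the shape of the interaction, one authenticated call instead of a scan-and-email round trip, is the point being illustrated.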
It’s another way to deliver an exceptional, seamless service. Similarly, take the agent who can quickly schedule an appointment directly into the calendar of their recommended Financial Advisor for the buyer even when they are not co-located. They can log in and see how the meeting proceeded before giving their client a call back – impressive customer service. The power of the API-enabled estate agent connects the whole business together providing a seamless experience to the client. The alternative – the agent, sends an email referral to the financial advisor hoping they will call the customer back, has to chase the financial advisor for feedback and an update before they can proceed with their sale – is also a legacy issue. Successful agents are increasingly looking to automate repetitive tasks to reduce the time and increase accuracy. From improving close rates to lost valuations and automated property match mail outs, agents are using the power of automation to get their time back to do what they do best – sell. Agents can work smarter, rather than harder by automating everyday workflows that would otherwise require human input by using the cloud. [easy-tweet tweet="The Agent of the Future is connected, and powered by APIs." hashtags="tech, cloud, IT"] This is achieved through an open architecture, accessible via a full-spectrum API, coupled with an incredibly powerful, outward looking workflow system. Residing on the central premise of security and agility, future-proofed platforms sit at the heart of the business and connect existing and new systems. This enables estate agents to share data and have access to a single version of the analytical truth. Perhaps the most important point about these platforms is their responsiveness to business change while facilitating fluid services across both human and digital touchpoints. The Agent of the Future is connected, and powered by APIs. Building a business on APIs gives an estate agent the freedom to hit all of their pain points at once, and use technology to enable a fully scalable and flexible business model. Estate agents must embrace technology to remain connected and relevant in an increasingly competitive market. Technology is at the epicentre of digital transformation in the estate agency market. Those that make the best use of its possibilities will thrive, those that don’t will face an uphill battle to be successful. ### Regulation and the Cloud: A move towards harmonisation The Financial Conduct Authority (FCA) recently published its final guidance for financial service firms outsourcing to the ‘cloud’ and other third-party IT services. The direction confirmed that there is ‘no fundamental reason’ why financial services firms cannot use cloud-based services in a manner that is compliant with the FCA’s rules. [easy-tweet tweet="There is ‘no fundamental reason’ why financial services firms cannot use cloud services" hashtags="cloud, tech, IT"] The regulator’s initial draft of the guidance received a fair amount of criticism from firms, stating that it was high-level and risk-based, and often unclear, which made it difficult for companies to innovate. As such, it is no surprise that organisations and service providers alike have welcomed the revised guidelines. However, it is not all plain sailing from here. Firms will need to consider their regulatory compliance in consultation with the guidance and look to their IT providers for some assistance before embarking on any cloud outsourcing. 
The challenges behind the cloud Cloud platforms represent a key technological development that has revolutionised the way in which businesses and customers share data. If implemented correctly, cloud-based solutions can transform a firm’s operations, making it agiler and capable of developing services that are tailored to customers’ individual needs. However, one of the biggest challenges for financial institutions is the need to rely on unwieldy legacy systems. Due to the sheer volume of critical data held in these systems, most banks are often left in a position where they are unable to replace their IT solution with a more up-to-date solution. Fortunately, with the latest cloud and mobile technology, this problem can now be solved. [easy-tweet tweet="Traditional banks can now compete with their digitally focused counterparts." hashtags="tech, cloud, IT"] By combining the computing power of legacy systems with the next generation of cloud and mobile technology, traditional banks can now compete with their digitally focused counterparts. By bridging the gap between legacy systems and mobile applications in this way, established players should be able to develop innovative mobile solutions that help to enhance consumer engagement, largely due to developments in cloud computing. The role of technology in cloud compliance Despite the many benefits the cloud can offer, many regulated firms have lagged behind when it comes to adopting this technology, fearing that security issues could result in regulatory fines and other penalties. That said, we should expect to see some changes in attitudes towards the cloud now that the FCA has revised its current guidelines. However, these guidelines are only the first step. To achieve the flexibility, convenience and scalability that the cloud can provide, firms will need to look to their IT providers for some assistance. Complex regulations mean that many companies still do not feel comfortable using this kind of technology. In particular, smaller companies are likely to be unfamiliar with the technology and the compliance requirements, and should, therefore, seek specialist expertise from an experienced advisor in order to gain the greatest benefits from the cloud. The necessity of tailored solutions It is tough for off-the-shelf IT solutions to meet all of a regulated firm’s particular FCA requirements, as one size just does not fit all. As such, it is important that companies adopt a tailored, hybrid solution that has been created specifically with their business goals and FCA guidelines in mind, particularly when it comes to cloud-based solutions. It’s essential that businesses recognise and understand that regulations are put in place to protect consumers and data. For this reason, management and compliance are changing all the time, so unless a business is willing to evolve and overhaul its systems and processes, it will only be spending money on ‘keeping up’, rather than gaining any new benefits. [easy-tweet tweet="Financial institutions should be taking advantage of the digital" hashtags="fintech, cloud, tech"] The FCA argues that financial institutions should be taking advantage of the digital due diligence tools that are available for this purpose and also work with the Government Digital Service to understand how a secure digital identity could be used in financial services. If firms are willing to focus their efforts in this way, it’s possible that compliance in the cloud will no longer be seen as a barrier when adopting the platform. 
In fact, if the cloud is adopted effectively, firms should be able to meet their specific business goals while staying compliant with the FCA. We already see many firms in the industry question the viability of older, more traditional technology systems.  This nod from the regulator will, therefore, come as a boost for businesses that are looking to gain a competitive edge by accessing the latest digital and mobile platforms via the Cloud. ### Small businesses and charities lacking basic digital skills Almost two in five UK small businesses (38%) and nearly half of UK charities (49%) lack basic digital skills, with a rising challenge amongst some small businesses around cyber security, according to findings from the third annual Lloyds Bank UK Business Digital Index. The most digital small businesses are twice as likely to report an increase in turnover as the least digital, and 65% of small businesses are using digital to reduce their costs, according to the survey of 2,000 small businesses and charities across the UK which is developed in association with digital skills experts Doteveryone and Accenture. The number of charities accepting online donations has more than doubled since last year, and the more digital charities are 28% more likely to report an increase in funding than the less digital charities, although fewer than three in five charities also have their own website. Using the new Doteveryone definition of Basic Digital Skills, which sets out five key skills needed to get the most out of being online (managing information, communicating, transacting, creating and problem solving), this year’s report shows that 62% of small businesses have all five skills. Lagging behind the larger small businesses are sole traders, with only 50% having basic digital skills and 78% are still investing nothing to develop these skills, down by 10% in the last year. The lack of key digital skills is a primary barrier to doing more business online, with 15% of businesses stating this is the main barrier, more than doubling since 2015.  For charities, the high turnover of staff and volunteers means it is difficult to retain digital skills and charities are often reliant on volunteers rather than embedding skills within the charity itself. Cyber security is also rising in prominence as a reason for 14% of small businesses not doing more online nearly double from a year ago. More than two-thirds of small businesses (69%) and charities (72%) state they need to develop their cyber skills. Nick Williams, Consumer Digital Director, Lloyds Banking Group said: “It’s very encouraging that the Business Digital Index shows an even stronger link between the digital maturity and organisational success of businesses and charities, with the small businesses most digitally capable being twice as likely to increase turnover. However, there are still too many without the basic digital skills which allow them to make the most of the internet. We need to motivate by raising awareness of the benefits of digital, including saving cost and time. Just as important is to remove the barriers and for some, concerns around online security are holding them back from adopting digital technology. We need to do more to reassure and support them to develop their cyber security skills.” Overseas trading Another possible area for growth is how businesses employ digital when trading overseas - such as using e-mail to overcome time zone differences, or international online payments. 
Currently, only one in five small businesses (21%) are using digital to support their overseas trading. This does vary by sector, with retail businesses rising to 26% and manufacturing businesses that use digital to trade overseas the highest at 39%. Of the proportion of small businesses which reported an increase in turnover over the past 12 months, a quarter (24%) are trading overseas. Social media and shifting advice preferences The rise of self-service digital was another clear theme in the 2016 Index, with businesses and charities both preferring to turn to friends, relatives or colleagues first, followed by online search, for help or information. Social media usage amongst small businesses and charities also saw increases, to 45% and 44% respectively, but still more than half of both groups are yet to embrace these digital channels as a way to interact with current or prospective customers. The increase in social media and free digital support may explain why 66% (2.5m) of small businesses and four out of five charities (78%) are still not investing their budget in digital skills. Organisations may instead be looking to more informal, low-cost (or free) resources to improve their digital skills. ### Atos teams with VMware to launch Digital Private Cloud offering Atos, a global leader in digital services, today at VMworld 2016 Europe announced the launch of its Digital Private Cloud offering, part of the Atos Canopy Cloud portfolio to accelerate customers’ digital transformation. The solution is part of the Atos Hybrid Cloud managed services portfolio, and is based on VMware’s software-defined data centre (SDDC) technologies. The Digital Private Cloud offering is available for customers worldwide immediately. Eric Grall, Executive Vice President of Managed Services at Atos, explains: ‘We understand the need to provide a unified, software-defined product that will enable our clients to leverage digital transformation. Atos has built one of the world’s largest hybrid clouds, and today I am pleased to announce the global release of the Digital Private Cloud (DPC), built in partnership with VMware. It is available to help IT leaders who articulate clear strategies in terms of security and governance, and are looking for partners who have the ability to assist them in their full cloud transformation, and in particular in private cloud”. This offering is based on a unified, fully automated and programmable architecture which will allow intelligent placement of applications and services on ‘purpose-fit’ infrastructure environments. As a part of this solution, organizations will be able to use VMware software-defined data center technologies. Digital Private Cloud (DPC) by Atos and VMware enables enterprises to: compete more effectively in the Digital Revolution by enabling innovation without compromising on security; optimize the cost and performance of applications and empower IT to operate at the speed of business; realize new digital business models with a scalable private cloud infrastructure that grows with the business; and use expanded ‘on-demand’ capabilities which enable customers to receive ‘public cloud’ benefits within the realm of control and customizations enabled through private cloud. “VMware is pleased to partner with Atos to help organizations realize the powerful business transformation benefits of the software-defined data center,” said David M. Parsons, vice president, Strategic Systems Integrators & Outsourcers, VMware.
“Digital Private Cloud enables intelligent placement of application and services on ‘fit for purpose’ infrastructure, built on VMware’s software-defined data center technologies, empowering organizations to leverage a unified, automated and scalable solution to grow with business needs and enable new sources of innovation.” In a recent report by Everest Group ‘Private Cloud Enablement Services – Market Update and PEAK Matrix Assessment: Marry with Public Cloud or Die, Sep. 21st, 2016’, Atos was named leader and star performer in private cloud enablement services. ### How can communications technology improve safety at sporting events? The increasing regularity of critical incidents worldwide means that the organisers of the main sporting events need to readdress and improve their safety and security procedures. While the Rio 2016 Olympics and Paralympics were, on the whole, a celebration of sporting excellence and culture, large scale riots, the targeting of athletes and spectators by criminal gangs and widespread fears about the impact of the Zika virus, meant security fears overshadowed the event. [easy-tweet tweet="The Olympic Games in Rio was marred by an influx in street crime" hashtags="cloud, tech, sport, security"] Despite the fact the Rio government deployed an 85,000 strong security force (that included 23,000 soldiers) to patrol and monitor the games, the event was marred by an influx in street crime. One of the most high-profile incidents saw the Chief of Security, Felipe Seixas, the victim of an attempted robbery at knifepoint outside of the Maracanã Stadium following the opening ceremony. For the organisers of high-footfall events—in this case, Brazil’s National Olympic Committee (NOC)—implementing effective security measures and practices can be a serious challenge. To ensure that it is best placed to manage a crisis, organisers need to have the tools to be able to communicate quickly and reliably with scores of people, including stewards, security officials, visitors and competitors. When incidents like the rioting close to the Rio Olympic Park take place, how can organisers make people aware of a fast-changing situation and what actions they should take to keep themselves safe? One solution is to implement the use of an independent, cloud-based critical communications platform to manage emergency responses and to prioritise the safety and security of everyone involved. Used by emergency services in the aftermath of the Boston Marathon Bombings in 2013, the Everbridge cloud communications platform was recently implemented to manage all communications in the lead up to, during, and after one of the biggest sporting events in the world—Super Bowl 50 in San Francisco. With more than 1.1 million people in attendance during Super Bowl week, emergency services, local authorities and organising bodies ensured visitors were kept up-to-date with all of the latest information as the week unfolded. These notifications included the most recent travel information such as rail or road closures, live traffic and parking updates, weather news, and any emergency alerts that required immediate action. Had the Brazilian authorities had a critical communications platform at its disposal when the riots broke out, they could have aggregated geolocation data from the platform to send out an emergency notification to everyone in the area warning of the danger and providing actions to guide them to safety. 
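To make the mechanics of a geofenced alert concrete, the sketch below shows, in rough outline, how last-known locations could be matched against an incident zone and the same warning fanned out across multiple channels. It is a minimal illustration only: the data, function names and message format are invented for this example and are not Everbridge’s actual API.

```python
import math

# Hypothetical illustration only: not Everbridge's API, just a sketch of how
# geofenced alerting could work in principle.

EARTH_RADIUS_KM = 6371.0

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points, in kilometres."""
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_KM * math.asin(math.sqrt(a))

def recipients_in_zone(people, incident, radius_km):
    """Return everyone whose last known location falls inside the alert zone."""
    return [p for p in people
            if haversine_km(p["lat"], p["lon"], incident["lat"], incident["lon"]) <= radius_km]

def build_alerts(recipients, message, channels=("sms", "email", "push")):
    """Fan the same message out across every configured channel per recipient."""
    return [{"to": p["id"], "channel": c, "body": message}
            for p in recipients for c in channels]

if __name__ == "__main__":
    people = [
        {"id": "steward-14", "lat": -22.9121, "lon": -43.2302},  # near the Maracanã
        {"id": "visitor-88", "lat": -22.9711, "lon": -43.1822},  # Copacabana, outside the zone
    ]
    incident = {"lat": -22.9122, "lon": -43.2300}
    alerts = build_alerts(recipients_in_zone(people, incident, radius_km=2.0),
                          "Incident near the stadium: move to the nearest safe exit.")
    for a in alerts:
        print(a)
```

In a real deployment the location data would come from the platform’s own aggregation of Wi-Fi, mobile app and access-card sources, as the article goes on to describe.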
At the same time, the platform could have notified on-duty security staff of the incident and ensured that securing the area and minimising any immediate danger became the top priority. [easy-tweet tweet="With a critical communications platform, you can collect aggregated geolocation data" hashtags="tech, data, geolocation, services"] This technology enables organisers to send out emergency notifications via more than 100 different communication channels and devices including SMS, email, VoIP calls, voice-to-text, social media alerts and app notifications. The platform continues to send out signals until an acknowledgement is received. Responses to these notifications would have provided emergency services and security officials with clear visibility of an incident, streamlining the process of understanding which people were at risk and what resources were available to manage the crisis. Implementing a cloud-based communications platform has many advantages for organisations. By having a system that operates entirely independent of an internal communications network, groups are ensuring that the respective lines of communication between safety officers and spectators remain open—even in the event of a cyber-attack or IT outage that may compromise an internal network or a rush of calls which may overload a telecommunications mast. Organisations can also use the technology to capitalise on data already being gathered—via Wi-Fi networks or stadium access cards—to gain an even greater understanding of where staff and spectators are located at all times. Furthermore, by using cloud technology to automate the time-intensive emergency cascade process, resources can be deployed more effectively and efficiently than before, ensuring that the safety of everyone involved is better protected. As attentions turn to the next major sporting events—such as Tokyo 2020 and the FIFA World Cups in Russia and Qatar— organisers will look to implement new strategies to improve the safety and security of everyone attending. Harnessing the power of the latest cloud-based technology to help ensure that the safety issues of Rio 2016 are a thing of the past will surely be high on the agenda. ### The Linux Foundation Unites JavaScript Community for Open Web Development The Linux Foundation, the non-profit advancing professional open source management for mass collaboration, today is announcing that JS Foundation is now a Linux Foundation Project. The JS Foundation is committed to help JavaScript application and server-side projects cultivate best practices and policies that promote high-quality standards and broad, diverse contributions for long-term sustainability. Today the JS Foundation touts a new open, technical governance structure and also announces a Mentorship Program to help encourage a culture of collaboration and sustainability throughout the JavaScript community. Initial projects being welcomed into the mentorship program include: Appium, Interledger.js, JerryScript, Mocha, Moment.js, Node-RED and webpack. The JS Foundation is a member supported organization; founding members include Bocoup, IBM, Ripple, Samsung, Sauce Labs, Sense Tecnic Systems, SitePen, StackPath, University of Westminster and WebsiteSetup. Developers rely on a growing portfolio of open source technologies to create, test and deploy critical applications. 
By creating a center of gravity for the open source JavaScript ecosystem, the JS Foundation aims to drive broad adoption and ongoing development of key JavaScript solutions and related technologies and to facilitate collaboration within the JavaScript development community to ensure those projects maintain the quality and diverse contribution bases that provide for long-term sustainability. “The JS Foundation aims to support a vast array of technologies that complement projects throughout the entire JavaScript ecosystem,” said Kris Borchers, executive director, JS Foundation. “JavaScript is a pervasive technology, blurring the boundaries between server, client, cloud and IoT. We welcome any projects, organizations or developers looking to help bolster the JavaScript community and inspire the next wave of growth for application development.” The JS Foundation is focused on mentoring projects across the entire JavaScript spectrum: client and server side application libraries; mobile application testing frameworks; JavaScript engines; and technologies pushing the boundaries of the JavaScript ecosystem. As a new Linux Foundation Project, JS Foundation and its projects remain community-driven and supported, while benefiting from guidance on quality, open governance and healthy community development practices. More about initial projects within the JS Foundation Mentorship Program: Appium, contributed by Sauce Labs, is an open source Node.js server used for automating native, mobile web, and hybrid applications on iOS and Android platforms as well as the recent addition of the Universal Windows Platform. Appium expands JS Foundation’s current test framework and tooling offerings into the device automation space. Interledger.js, contributed by Ripple, enables instant payments and micropayments in any currency, across many payment networks using the Interledger Protocol (ILP). By supporting this project, the JS Foundation is encouraging organizations and their application developers to consider new ways to think about payments on the web and look for ways to simplify and standardize those processes. JerryScript, contributed by Samsung, is a lightweight, fully-featured JavaScript engine for Internet of Things (IoT) devices that ships in commercial products today. As IoT is one of the largest and fastest growing sectors of the JavaScript ecosystem, JerryScript is just the beginning of JS Foundation’s efforts to support projects and developers within the IoT ecosystem. Mocha is a feature-rich JavaScript testing framework providing a command-line interface for Node.js as well as in-browser testing capabilities. Focused on supporting the entire JavaScript ecosystem, the JS Foundation brings Mocha under its mentorship alongside Lodash to ensure that many JavaScript application cornerstones will be available and supported long into the future. Moment.js is a lightweight JavaScript date library for parsing, validating, manipulating, and formatting dates and also provides time zone support to JavaScript through Moment Timezone. Another cornerstone of the JavaScript ecosystem, Moment.js helps empower developers to build amazing JavaScript applications. By supporting Moment.js alongside projects like Globalize and Jed, the JS Foundation hopes to foster collaboration for internationalization and formatting. 
Node-RED, contributed by IBM, is a flow-based programming environment built on Node.js – commonly used in the IoT space – and aimed at creating event-driven applications that can easily integrate APIs and services. Node-RED will be a major factor in the JS Foundation’s efforts to support the full end-to-end JavaScript ecosystem. web pack is a bundler for modules and is primarily used to bundle JavaScript files for usage in a browser. It is also capable of transforming, bundling, or packaging just about any resource or asset. For information about all of the projects hosted by the JS Foundation, please visit http://js.foundation/projects/. The JS Foundation’s open, technical governance model includes a Technical Advisory Committee (TAC) and a Governing Board. The TAC provides technical advice to the projects and informs the Board about technical opportunities it sees in the JavaScript ecosystem. A Board of Directors guides business decisions and marketing, ensuring alignment between the technical communities and members. JS Foundation will also build upon its work with standards bodies, such as W3C, WHATWG, and ECMA TC39, to nurture the open standards that browser vendors and developers rely upon. By working closely with the Node.js Foundation, the JS Foundation shows a shared commitment to ensuring the sustainability and longevity of projects in the JavaScript ecosystem.   ### Lessons from the giants of cloud-only business The role of the cloud as a driver of business transformation is often exemplified by disruptive brands that were ‘born in the cloud’. They present very compelling examples of how the cloud can enable agile, responsive businesses which challenge more established rivals. Just look at how the way we consume movies, listen to music and order taxis has been revolutionised in recent years. [easy-tweet tweet="Companies 'born in the cloud' are compelling examples of how cloud is a driver." hashtags="cloud, tech"] Brands like these have thrown down the gauntlet in all sectors, challenging others to up their game and be better attuned to customer needs. Cloud also provides the means for more traditional businesses to take up the challenge and emerge victoriously. But success requires more than just a piecemeal approach to deploying cloud. The benefits can only actually be realised if the business itself realigns to put cloud front and centre through a clearly defined enterprise cloud strategy. Achieving this can involve a measure of organisational realignment that new challenger brands will likely not have experienced. When a company starts with cloud at its centre, lines of business radiate from that centre and are fundamentally connected to it. This means the organisation can focus on capturing customer data and real-time information and using that to meet customer requirements better and develop new services. More established businesses may have been operating before cloud – or even before on-premise data centres. They may have a more organic, less highly structured approach to IT, which needs to be challenged and realigned, as part of the process of implementing an enterprise cloud strategy. [easy-tweet tweet="Organisations have the potential to compete with cloud-only challenger brands." hashtags="tech, cloud, IT"] Oracle’s research report, Putting cultural transformation at the heart of cloud success, bears this out. Our report highlights how organisational culture can be a barrier to established businesses getting the most out of the cloud. 
Thirty-three percent of respondents working in IT told us their organisation had an inappropriate IT culture for the cloud computing age, and 95 percent of those working in IT said lines of business investing in cloud technology causes complexity. Establishing an enterprise cloud strategy is the only way to become a cloud-only business. It provides the technology base which allows businesses to be more agile in moving workloads between different environments as needed, whether on-premise, converged or cloud. An enterprise cloud strategy also formalises the gathering of real-time customer data into a single location, ready to be analysed and used to gain competitive advantage. This helps eliminate silos, while centralising financial management to reduce organisational spending on the cloud. In the end, the most valuable lesson from challenger brands that have built their business solely on the cloud is that this is the only way for companies to be truly responsive, agile and flexible, meaning they can quickly deliver innovative new services to customers. But all organisations have the potential to compete with cloud-only challenger brands. To do so, they must realign their business around an enterprise-wide cloud strategy. It is this approach which gives challenger brands their edge: the ability to focus fully on a common goal in the most flexible and direct way. It also fosters organisation-wide access to all data and technology resources while avoiding the creation of data silos and duplicate technology purchasing. In a very real sense, then, cloud can be the glue that brings the business together to maximise its potential. ### How Cloud based telephony can benefit your business As an enterprise develops and grows, so does the technology needed to ensure operations run at an optimal level. Outdated legacy systems simply cannot support modern workforces that have embraced flexible working. The BYOD trend has allowed employees to work virtually anywhere, and a mobile-enabled workforce is necessary to stay competitive in today’s agile working environment. With remote working on the rise and cloud technology uptake continuing, many enterprises are looking to cloud voice to underpin business success. SIP trunking is an example of IP-based telephony that can immediately improve operations while delivering instant cost savings. Furthermore, adopting cloud voice futureproofs organisations, so that whatever demands change, services remain resilient, secure and online. SIP trunking is fast becoming the first-choice communications technology for businesses looking to embrace the cloud and mobile era. Cost Efficiency Many companies relying on traditional ISDN technology may find themselves paying more than they need to. From excess line rental and ongoing maintenance fees to costly call forwarding charges, these costs add up and can quickly eat away at a company’s bottom line. To reduce costs, organisations should regularly review their existing suppliers and seek alternatives on the market. Updating outdated telephony can significantly reduce operational costs, with savings in line rentals and call costs, as well as removing the need for significant capex on hardware – freeing up cash for other business ventures. Modern technology such as SIP can help organisations both in the immediate and the longer term.
This is because SIP is a more flexible, lower-cost alternative to ISDN (often including free internal calls between extensions and offices). For businesses with multiple sites, SIP trunking also offers the opportunity for line rationalisation and a reduction in the number of PBXs that need to be maintained. [easy-tweet tweet="Updating outdated telephony can significantly reduce operational costs with savings in line rentals" hashtags="cloud, tech"] Streamlined integration As more businesses look to move their data to the Cloud, cloud-based telephony is the natural step forward in integrating data and devices. SIP trunking also alleviates logistical plans in the event of your office relocating. The technology allows for businesses to keep the same geographical number without the continued costly call forwarding charges. Flexibility Working over capacity can put your enterprise at risk. Companies that are heavily seasonal need to scale their telecom systems quickly and effortlessly to meet customer demand. A prime example of this would be the retail or e-commerce sector during holiday periods – Black Friday, Cyber Monday and the Christmas season. During these high volume sales periods, these businesses see a massive influx of call traffic and need to react accordingly. With such an increase in call volume, scaling to meet demand is the only viable option to stay on top of the increased traffic. Traditionally, increasing to meet demand came at a high cost to the business with new hardware and ISDN lines being required, often under long-term contracts which become redundant once the high traffic period finishes. With SIP-based telephony, more lines can quickly be added and/or removed with minimal support from the provider. There is also no requirement to buy additional hardware meaning it’s possible to scale down following demand. Enterprise continuity To futureproof an organisation, it is imperative to be proactive when it comes to adopting new technology. The ISDN standard was developed in the 1980s. Put simply; it is archaic. With the emergence of mobile and cloud era, organisations require more efficient alternatives to obsolete technology. While many companies spend a considerable amount of resources ensuring their IT is resilient, fewer consider how outdated telephony can affect their finances. Massive data loss is an obvious threat to an organisation, but the risk of telecoms downtime is often overlooked, despite its ability to damage customer service levels and impact on potential business revenue. With SIP trunking, resiliency is built into the voice architecture by design. As a result, businesses can continue to operate with minimal impact in the event of the disruption. Calls can be quickly and easily re-routed, if not automatically, and call load balanced in times of unexpected peak demand. [easy-tweet tweet="With SIP trunking, resiliency is built into the voice architecture by design." hashtags="tech, cloud"] Unlike traditional telephony options, which require the regular maintenance of circuit boxes and a dedicated on-site IT support team, SIP is cloud based which reduces the strain on IT and allows organisations to monitor their traffic better. It also offers resiliency in the network – ensuring business continuity in the event of any unscheduled downtime. The operational change affects how day-to-day work is done. Updating to cloud-based telephony has the potential to impact upon all aspects of business. 
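The elasticity described under Flexibility above is easy to picture in code. The sketch below assumes a hypothetical provider client with a set_channels call (real SIP providers expose similar portals or APIs, but this is not any specific vendor’s interface) and simply sizes the trunk for a forecast peak, then scales it back down afterwards.

```python
import math

# Illustrative sketch only: the provider client and ScaleRequest shape are
# hypothetical, not any specific SIP provider's real API.

def required_channels(forecast_concurrent_calls: int, headroom: float = 0.2) -> int:
    """One SIP channel per concurrent call, plus a safety margin for spikes."""
    return math.ceil(forecast_concurrent_calls * (1 + headroom))

class FakeTrunkProvider:
    """Stand-in for a provider API client so the example runs offline."""
    def __init__(self, current_channels: int):
        self.current_channels = current_channels

    def set_channels(self, count: int) -> None:
        print(f"Trunk capacity changed: {self.current_channels} -> {count} channels")
        self.current_channels = count

def scale_for_period(provider: FakeTrunkProvider, forecast_calls: int) -> None:
    """Resize the trunk only when the forecast actually changes the requirement."""
    target = required_channels(forecast_calls)
    if target != provider.current_channels:
        provider.set_channels(target)

if __name__ == "__main__":
    trunk = FakeTrunkProvider(current_channels=30)
    scale_for_period(trunk, forecast_calls=120)   # Black Friday peak
    scale_for_period(trunk, forecast_calls=25)    # scale back down afterwards
```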
Utilising technology that can enhance communication, collaboration and aid productivity can only speed up decision making, help with customer service and ultimately drive business growth. ### LANDESK Expands Ransomware Protection Building on a multi-layered approach to defend against ransomware, LANDESK today announced new capabilities to its LANDESK® Security Suite. With a growing number of ransomware threats menacing enterprises, the latest additions aid in closing security gaps across the entire organisation—a strategy that’s recognized as the best way to battle ransomware. Ransomware is an imminent threat to organisations and shows no signs of slowing down. According to the FBI, criminals have already collected $209 million in revenue in the first quarter of 2016, and that number could exceed $1 billion by year-end. And while the average ransom is still relatively small—between $200 and $5000—the price is going up. “The good news is, accepting the inevitable doesn’t mean accepting defeat,” said LANDESK Chief Security Officer Philip Richards. “While ransomware poses a very real threat to organisations and their systems, by taking a sophisticated, multi-dimensional defense to protect against it, organisations can mitigate the negative effects.” Cyber watchdogs like US-CERT and the Center for Internet Security (CIS) agree. LANDESK Security Suite offers the recommended multi-layered endpoint protection, without disrupting productivity or business operations. With this latest release, LANDESK has expanded its suite in all three critical areas of defense: detection, prevention and remediation. “The LANDESK Security Suite provides a much needed solution that tackles how to respond to ransomware attacks, protect data and devices, and in the worse-case scenario, minimise impact should a security breach occur,” continued Richards. Detection. No organisation wants to fall victim to malware threats its AV vendor identified and tagged. Still, even with strong protection, it’s important to account for today’s highly dynamic malware, which can transform itself before or after an attack. The Verizon 2015 Data Breach Investigations Report found that 70 to 90 percent of malware samples are unique to a single organisation. That’s why LANDESK Security Suite enhances AV detection with active and passive discovery technologies, visibility across the network and actionable data. In this new release, IT can display information about the applications on each endpoint and act instantly on suspicious and malicious apps. Prevention. Realistically, ransomware will get in. When that happens, the goal is to minimize the chances it can execute. LANDESK Security Suite offers a range of industry-recommended preventive measures, including, but not limited to, application, device, and connection control and automated application and OS patching. With compromised websites regularly exploiting known vulnerabilities in software, patching internet-facing applications is particularly critical. That said, it can be time-consuming and applied inconsistently, and may break critical business applications as it seeks to repair others. Even seasoned experts like US-CERT sometimes fail to apply patches as necessary. With that in mind, Security Suite simplifies patch management with best practices, automated processes, fast deployment and no impact on users. 
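To illustrate the kind of gap analysis this involves, the sketch below compares a required-patch list against what each endpoint reports as installed and flags the shortfall. The patch IDs, endpoint names and data layout are invented for the example; they are not LANDESK’s actual report schema.

```python
# Hypothetical sketch of a patch gap report; the data and field names are
# invented, not LANDESK's actual schema.

required_patches = {
    "KB3185330": "Cumulative security update",
    "KB3194798": "Browser security update",
    "KB3197868": "Monthly rollup",
}

endpoints = {
    "finance-01": {"KB3185330", "KB3194798", "KB3197868"},
    "sales-07":   {"KB3185330"},
    "hr-03":      {"KB3185330", "KB3197868"},
}

def missing_patch_report(required, installed_by_endpoint):
    """Return, per endpoint, the required patches that are not yet installed."""
    return {host: sorted(set(required) - installed)
            for host, installed in installed_by_endpoint.items()
            if set(required) - installed}

if __name__ == "__main__":
    for host, missing in missing_patch_report(required_patches, endpoints).items():
        print(f"{host}: missing {', '.join(missing)}")
```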
In this release, the new Installed Patch Report provides an additional tool for timely, effective patching—offering easy access to data on installed patches and those that still must be applied. Additionally, Security Suite can now help keep ransomware and other malware from modifying the master boot record and rendering the system useless. Remediation. Detection and prevention are crucial, and in many cases highly effective, but organisations still need a plan in case ransomware executes on the network. Realistically, all it takes is one user who clicks the wrong link or downloads a malicious attachment in email to fall victim to an attack. Luckily, LANDESK Security Suite adds to the final piece of the protection puzzle. This latest release tackles the most widespread form of ransomware, which encrypts files and hides the critical decryption key before demanding ransom. Security Suite detects any attempt to encrypt files on the local machine, stops the encryption process, and notifies all other computers on the network so the ransomware cannot be unleashed on other users—effectively thwarting the attack. ### Forecasting Clearer Weather: Three Tips to Overcome Cloud Challenges before they Occur The cloud offers organisations an opportunity to quickly and efficiently deploy tools or applications for everything from new ventures to mission-critical projects. According to a new study by 451 Research, enterprise adoption of either private or public cloud is at 41 percent, with the estimates that it will rise to 60 percent by 2018. Cloud-based solutions have enabled organisations to be agile in ways that were not possible ten years ago, meaning that overall IT time and spend can be much more strategically focused on meeting a growing set of needs or objectives. Cloud infrastructures are inherently flexible, allowing companies to expand, react and achieve business goals without massive upfront costs typically associated with an investment into new technology. But how does the use of the cloud affect an organisation’s data and overall strategy for managing the exchange of valuable information with customers, vendors, or partners? [easy-tweet tweet="Cloud infrastructures are inherently flexible allowing companies to expand" hashtags="cloud, tech, hybrid"] Traditionally, enterprises, in particular, have been wary of storing and managing data in the cloud, pointing to concerns over the protection of sensitive data and the ability to meet compliance mandates. Besides the cloud also makes it easier for individuals or departments to implement new cloud technology and get projects off the ground quickly, without considering the IT team’s vision or policies, which can make controlling data or minimising vulnerabilities a real undertaking. Consider these three tips when determining how to implement the cloud while ensuring your data is secure: 1. Seek out tools or applications that provide flexibility without compromising security The essential benefit of the cloud is flexibility: being able to access and distribute information from and to anywhere with an internet connection. It is this flexible, open nature though that is the cloud’s attraction (and ultimate challenge) for a CIO or IT. However, flexibility should not come at the expense of protecting sensitive information and data that has been made more readily available through the cloud. 
By looking for tools or applications that can provide transparency around the movement of data, IT administrators can better manage unauthorised sharing or mitigate vulnerabilities quickly. Consider applications like a cloud-based managed file transfer solution, data loss protection tool or working with a cloud access security broker. [easy-tweet tweet="IT admins can better manage unauthorised sharing vulnerabilities quickly." hashtags="cloud, tech, IT"] 2. Check for regulations on handling data through government associations and industry organisations Most organisations now face some level of control, with the most highly regulated industries being healthcare, financial services, and retail. There are many regulations that specifically deal with how an entity should manage sensitive data. These are regulations such as Payment Card Industry – Data Security Standard (PCI DSS), EU-US Privacy Shield and the General Data Protection Regulation (GDPR), among others. When it comes to quickly launching cloud-based technologies, it’s important to understand which regulations affect your own organisation’s data and whether you also need to comply with regulations in other industries or countries, based on the type and destination of the data. In some cases, like with the EU-US Privacy Shield, you may also need to confirm that your cloud provider has clear information on how they manage data sovereignty. For those in the EU or the UK, your provider may need to have a regional cloud centre. 3. Implement solutions that can positively affect business results Organisations and their employees often turn to cloud solutions to get projects up and running quickly or ensure quick results. Often, we hear more and more from companies we speak with, that there’s a mandate to use the cloud for new projects. This puts more pressure on the cloud deployment or associated applications to return its investment quickly. In the end, cloud tools should ultimately improve business results. There are a couple of things that CIOs or IT teams should keep top of mind when evaluating cloud tools to ensure that these investments pay off for the business. Make sure you have a good understanding of your cloud application or service provider’s policies, especially related to support or downtime. If there’s an issue with the service, you want to have plans in place to know exactly who to call and set up procedures to have network traffic failover or be highly available to avoid your business also having any outages or lag-time. [easy-tweet tweet="IT teams should keep top of mind when evaluating cloud tools" hashtags="tech, IT, cloud"] It’s equally important to confirm that your current technology investments or legacy systems integrate with the new cloud services. If there is a challenge or conflict with the integration, you’ll have time to find workarounds or evaluate new services before the cloud tools go online and are operational for the larger organisation. Embracing cloud is not simply about following the latest trend, or doing what everyone else is, it’s about finding the right solution to support your business. Be it a small IT department for a software startup or a 200-person department for a FTSE 100 enterprise, the power of cloud should not be discounted as a means to enhance your overall business results. 
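As a small illustration of the failover planning mentioned above, the sketch below probes a prioritised list of endpoints and routes traffic to the first healthy one. The endpoint names are hypothetical and the probe is stubbed so the example runs offline; a production check would call a real health URL with a short timeout.

```python
# Minimal sketch of a client-side failover check, assuming a hypothetical primary
# and standby endpoint for a cloud service; not any vendor's built-in HA mechanism.

from typing import Callable, Sequence

def first_healthy(endpoints: Sequence[str], probe: Callable[[str], bool]) -> str:
    """Return the first endpoint that answers its health probe, in priority order."""
    for url in endpoints:
        if probe(url):
            return url
    raise RuntimeError("No healthy endpoint available - trigger the incident plan")

if __name__ == "__main__":
    # Stubbed probe so the example runs without a network; a real probe would make
    # an HTTP request to a /health path with a short timeout.
    down = {"https://files.primary.example.com"}
    probe = lambda url: url not in down

    active = first_healthy(
        ["https://files.primary.example.com", "https://files.standby.example.com"],
        probe,
    )
    print(f"Routing traffic to {active}")
```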
It is important to have a good understanding of how you, as a CIO, CISO, or IT department, can empower individuals within your organisation to adopt the cloud without creating unnecessary headaches or severe security vulnerabilities. At the end of the day, CIOs, CISOs and the IT team can become vital in the pursuit of business success, while protecting the organisation’s assets and data at the same time. ### Explorer wins 3 UKOUG Oracle Partner awards! Leading Oracle Platinum Partner, Explorer, were last night awarded ‘Database Partner of the year - Gold’, ‘Systems Partner of the year - Gold’ and ‘Training Partner of the year - Silver’ by the Oracle community at the annual UKOUG awards, held at The Café de Paris, London. This marks a significant point in Explorer’s remarkable growth, and is particularly special as the awards are voted for by the Oracle customer community. “These awards are the latest in a long line of recognition from the UKOUG that highlights our ability on all fronts of Oracle technology combined with our services. Thank you to all our staff for their commitment and loyalty towards our customers and, of course, to everyone who voted for us. Winning Gold for Database two years running is especially pleasing,” Jon Lingard, Sales Manager, stated following the announcement. Jon continued: “The Oracle landscape is evolving and it’s a very exciting place to be! From traditional Systems implementation & support through to Managed Services on Engineered Systems and now new projects in the Oracle Cloud, our customers have more choice than ever before. The number of PaaS and IaaS implementations that we’re managing is growing month on month, and our customers are benefiting hugely from our ability to deliver on premise and in the Oracle Cloud.” Simon Greenwood, Development Director, said: “As the #1 Application Express (APEX) development consultancy in the UK, it’s extremely gratifying to know that so many of our customers have voted for us on the back of the development projects we have delivered for them. We’ve seen great uptake of APEX in the Oracle Cloud too, and it’s opening up opportunities for our customers in emerging and competitive markets alike.” ### Battlefield Cloud The irresistible shift to cloud infrastructure has been accompanied by a certain amount of fear and uncertainty. Part of the reluctance has been a sense of “who is minding the shop” and a concern over the lack of visibility and control. In one sense, cloud data centre infrastructure is potentially more secure than private infrastructure, as it may be maintained by a large, specialised team with greater expertise to attend to security issues as they arise. In another, the cloud opens up several new attack surfaces that could be used by cybercriminals and other bad actors. Most organisations already face a serious security deficit. Few can detect an active attacker at work on their network. The industry average dwell time of some five weeks reflects the failure of traditional security to find an intruder that has circumvented preventative defences and is working towards a data breach. Companies should first address the problem they face on-premises and then address the blind spots in the cloud. Both are important steps to safeguard data and resources.
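Detecting an intruder of this kind usually comes down to spotting behaviour that deviates from a learned baseline, an idea the article returns to below. As a toy illustration only – the numbers are invented and real behavioural attack detection uses far richer models – the sketch flags a day on which a workstation suddenly contacts far more internal hosts than its history suggests.

```python
import statistics

# Toy illustration of baselining "learned good behaviour": invented figures, and far
# simpler than the machine-learning models real behavioural attack detection uses.

def is_anomalous(history, todays_value, threshold=3.0):
    """Flag a value more than `threshold` standard deviations above the baseline."""
    mean = statistics.mean(history)
    stdev = statistics.pstdev(history) or 1.0   # avoid division by zero on flat baselines
    return (todays_value - mean) / stdev > threshold

if __name__ == "__main__":
    # Distinct internal hosts contacted per day by one workstation over two weeks.
    baseline = [4, 5, 3, 6, 4, 5, 4, 5, 6, 4, 3, 5, 4, 5]
    print(is_anomalous(baseline, todays_value=5))    # False - a normal day
    print(is_anomalous(baseline, todays_value=48))   # True - looks like lateral movement
```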
The public cloud data centre can be a new ingress or egress point for attackers to gain a foothold into one’s network or as a channel for command and control communications and data exfiltration. In addition, the reconnaissance and lateral movement that an attacker must employ once they gain access can be done without fear of discovery. These four types of operational activities must be detected and stopped, if organisations want to protect their assets from attackers. Closing all the gaps is important in the drive to protect networks from a data breach. Attackers are agile, and if one tactic is too difficult or not possible, they can easily move to plan B. For some companies, this may mean that an attacker would turn to the cloud data centre to help accomplish their goals. If attackers can use a public cloud data centre for undetected ingress and egress, it invalidates some of the efforts an organisation has put in place to protect their own network. [easy-tweet tweet="An attacker could turn to the cloud data centre to help accomplish their goals." hashtags="hacking, tech, cloud"] Often, a development or test instance becomes the Achilles heel in the cloud. Many of these development or test instances may have been intended for a specific purpose at a specific point in time, but they can become forgotten and left up —a perfect entry-way for an attacker. One of these forgotten instances could be the easy route an attacker utilises. Finding active attackers in on-premises networks can be accomplished by identifying their operational activities as they probe servers and misuse credentials. Having a complete vantage to network traffic is essential, as the primary attack activities involve reconnaissance and lateral movement—things that happen on a network, between devices. Discerning these comes through behavioural profiling of all users and devices. Once a baseline of learned good behaviour is obtained, it is possible to detect anomalies. These anomalies need further refinement to find those that are truly indicative of an attack. Combining full network visibility with machine learning allows organisations to detect an attacker that might otherwise be invisible to the security team. Unfortunately, gaining access to a mechanism for visibility in the cloud has not existed until recently. Now, organisations can take advantage of Behavioural Attack Detection in the cloud and use native Amazon Web Service VPC Flow Logs or the new Gigamon Visibility Fabric for AWS to monitor their virtual network. They can also protect hybrid cloud environments, monitoring user access to cloud servers and detecting anomalous activity indicative of an attack. [easy-tweet tweet="Organisations can take advantage of Behavioural Attack Detection in the cloud" hashtags="tech, cloud, IT"] The cost and consequences of a data breach are rapidly climbing, particularly with the enforcement of the EU General Data Protection Regulation (GDPR) which applies to companies in the EU or doing business in the EU. Besides hefty penalties, damage to brand and reputation or loss of intellectual property or business secrets could put a company out of business. While it is nearly impossible to ensure that no endpoint will ever get compromised, it is possible to find active attacks quickly and prevent a simple malware infection from turning into a costly data breach. Closing all the blind spots for attackers to operate in without notice is essential to make this a reality. ### Who owns the cloud agenda? 
Every year, we see new reports on how cloud computing is gradually increasing its foothold in business. But who is actually driving cloud adoption? Many would argue that it’s not the business community. In most cases, it’s not even the technology user. Despite often being the ultimate recipient of the technology, the consumer is merely adapting to the wider trends. [easy-tweet tweet="Who is actually driving cloud adoption?" hashtags="cloud, tech, adoption, "] The real cloud drivers With the exception of Microsoft, who arrived relatively late to the cloud party, it’s the big vendors of the IT industry who have been the major influences behind cloud adoption over the last decade. Technology providers have in many ways reverse-engineered business cases that would enable them to deliver their solutions at a lower cost and improved bottom line. SaaS and IaaS were designed to be profit models for the industry that delivers them, not necessarily to meet a genuinely expressed requirement in the marketplace. So what does that mean for the end user? Is this the new era of the product-focused enterprise? Is the market being sold a cloud-shaped solution to a need invented by technology providers? [easy-tweet tweet="Is this the new era of the product-focused enterprise?" hashtags="tech, cloud, enterprise"] Stragglers and champions In Forrester’s trend evaluation in 2013, Oracle customers were polled on their intentions of moving their Applications hosting to the cloud. The report claimed that 89 percent were unlikely or hesitant to consider a cloud migration. In the months and years that followed, there were a number of strategic approaches on Oracle’s part, to improve the incentives for shifting to a cloud model – or to raise the barriers to using traditional hosting. For many software areas, however, the cloud option has been firmly embraced by users from the start. Adobe’s Creative Cloud is one product which immediately saw a huge uptake across the user base, far exceeding expectations, and is still growing rapidly – amongst both leisure and business users. When discussing cloud adoption from the market’s perspective, we need to look at it from the two distinct angles of how the market views cloud computing as a compelling solution. Migrating onto cloud If the business or user is approached with an active choice to change the way they operate, to go from one platform to using another, there has to be a number of expressed benefits which outweigh the risk and cost of the transition. The idea of moving existing on-premise solutions to the cloud can be daunting, particularly if the new model doesn’t appear to fit the business perfectly. Traditional industries may have concerns around data security, where others may not want to risk potential downtime when disconnecting legacy technology. As most organisations will find; there is never a perfect time to move to the cloud. In many cases, temporary disruption is part of the price you pay for long-term cost-savings. [easy-tweet tweet="Various interfaces can take time for the individual user to fully understand and use" hashtags="tech, cloud, "] Depending on the complexity of the IT infrastructure, there could also be a culture shift at play. New software, new hardware, new platforms – these various interfaces can take time for the individual user to fully understand and use, which can cause productivity dips or added pressure on IT Support. Starting on cloud For many cloud-based services emerging now, new platforms are fully designed with the user in mind. 
Netflix for example, being the poster-child of “Cloud done right”, managed to identify a cloud solution which perfectly balanced convenience, access and value for the customer. One of the reasons why it worked so well was because it didn’t require a transition of usage. The new value of on-demand movie streaming far outweighed the old value of the DVD or VHS collection. In a similar way, Salesforce.com didn’t become the most high-profile CRM system on the market by mimicking its predecessors, but by enabling businesses to make better use of their data in ways that were obvious from the outset. They disrupted the CRM market by setting a new standard that was completely customer-centric – which was why they quickly won thousands of advocates. Faster than the speed of Cloud Whenever we discuss the Third Platform and the growth of the use of mobile, social and cloud technology, it’s important to remember that not all cloud is created equal. The IT industry’s ability to speed up the rate of cloud adoption will rely hugely on their ability to create business cases that offer genuine value to the customer – by putting them first. The most successful cloud transitions happen when the user experiences an immediate enhancement of the service, without having to compromise on any of the aspects that are important to them. ### IT Pros Still Find it Difficult to Secure Data in the Cloud Survey results released by Lieberman Software Corporation today, to coincide with European Cyber Security month, show that 43% of IT professionals find it difficult to secure data in the cloud. The survey, which was carried out at Microsoft Ignite, also revealed that 73% of respondents prefer to keep their sensitive corporate data on premises, rather than in the cloud. “The cloud is ideal for businesses that need a cost effective, scalable and flexible means to transform their IT environments,” said Philip Lieberman, President and CEO of Lieberman Software. “Yet, IT professionals are still reluctant to put sensitive data in the cloud because they say it is difficult to secure.  What organizations need to understand is that the same security problems they face on premises follow them into the cloud. Migrating to the cloud doesn’t mean they face any more or less security risk than keeping data on premises.” Lieberman argues that attackers use the same automated cyber attacks on physical systems that they do on cloud-hosted systems. To succeed - whether inside the cloud or not - attackers need credentials. To gain these credentials, cyber criminals use tactics such as spear phishing and social engineering to circumvent traditional perimeter defenses like firewalls. Once inside the network, the attackers look for privileged credentials that allow them to move between systems and anonymously steal sensitive data. “A security solution that provides unique and frequently changing credentials for each privileged account ensures that even if an intruder steals a password, it is time-limited and cannot be used to jump from system to system on the network. And if this solution can be deployed in a cloud or hybrid environment, while also securing the credentials that underpin the administration of cloud portals themselves, we will see confidence in cloud security rise,” Lieberman concluded. ### New-gen Security for Smart devices Becomes a Must The technology has been continuously evolving throughout history. COVID-19 has increased the pace of this evolution to a whole new level. 
With this fast-paced growth, many new trends have come to the surface. The Internet of Things (IoT) is one of the emerging trends in the world of technology, helping to manage almost any device or situation you wish. But how does IoT work? An IoT system consists of sensors that send signals to the cloud through some form of connectivity. When the data reaches the cloud, software processes it and may decide to act accordingly – sending an alert, or automatically adjusting the sensors or devices – without the user needing to intervene. IoT is a powerful tool, but is it secure enough for us to use? In research conducted by HP, around 70% of IoT devices were found to be vulnerable to attack. Seeing security come last in this new age of digitalisation is not a happy sight: every piece of personal or professional information your devices hold is under continuous threat. So what are the IoT issues we need to address in order to make our devices secure? Data Encryption: Every transmission of data should be encoded so that only the intended receiver can retrieve it. Hackers are always looking for new ways and techniques to capture data that is useful to them, and data can be intercepted in moments. There are some measures that help keep your data safe: use the Secure Sockets Layer protocol (SSL) whenever data is exchanged online – the majority of websites already use SSL certificates to encrypt and protect data; prefer wireless protocols with built-in encryption; and use firewalls in front of web apps. Data Authentication: Even after data is encrypted, a device can still be at risk. Hackers hunt for small oversights they can exploit and can tamper with data that matters to you. To avoid this, use protocols that provide customer anonymity, secure session-key establishment and mutual authentication – and, most importantly, always think carefully before you buy any IoT device. IoT Hardware Issues: IoT has pushed tech giants such as Intel to focus more on security improvements in their smart devices. The popularity of IoT attracted many new entrants, and as the number of smart devices increased, so did the vulnerability to cyber-attacks. Hardware issues have raised many questions among IoT enthusiasts and have driven up manufacturing costs, leaving IoT looking like an elite technology. In time this can change, if safe and secure hardware with cost-efficient solutions is introduced. Hardware testing is inevitable: Smart IoT devices are usually tested before they go into manufacturing, which is a good way to eliminate security challenges from the start. But is that all we should rely on? Surely not – we can check for ourselves as well. Before buying an IoT device we can assess things such as device range, memory capacity, scalability, reconfigurability and latency, and there is plenty of online content, including videos, to guide DIY testing. Managing Updates: Let’s be real; updates are important for every product, but it is not always possible for an IoT device to receive over-the-air updates, or to update without downtime.
There are situations where data might need to be taken offline temporarily in order to apply updates, and some devices – older ones in particular – may not support updates at all. In cases like these, where keeping track of updates is difficult, a device manager is useful for tracking the versions running on each device. Safe Storage: Data storage is easier than ever before, and vast amounts of data can be stored online. Storing data is not the real challenge; backing it up safely is. The service you use to back up your data may not be secure enough for the job, and the data could be tampered with or stolen, so proper safety measures need to be in place. There are other ways to improve IoT security and avoid mishaps too, such as being careful about which IoT devices you buy and making sure both your data and the services through which you access it are secure. ### A hybrid cloud architecture works best for SMEs Only eight weeks ago, analyst firm IDC’s quarterly Worldwide Cloud IT Infrastructure report (August 2016) forecast that, by 2020, spending on cloud services will almost equal what is being spent on traditional IT. This increasing spend on public and private cloud infrastructure, anticipated to grow by around 15.5 percent, will be made up in part of hybrid cloud architecture. A combination of on-premises and off-premises deployment models, the hybrid cloud offers unparalleled speed, scalability, flexibility and, most importantly of all for smaller businesses, a healthy ROI. As overall business data volume is also growing fast, at a rate of more than 60 percent annually, it’s not hard to see why the hybrid cloud approach is burgeoning. Inadequate data protection can lead to a business’ downfall 2016 has been the year that companies of all sizes realised that data protection is a basic requirement for doing business. The inability to access customer, operational or supply chain data can be disastrous, and every second of downtime can impact ROI. Critically, losing data permanently threatens to damage operational function as well as business perception. The latter is especially key in business relationships with suppliers and customers: what may have taken years to develop can be undone in the course of a few hours of unexplained downtime. It has never been easier to take business elsewhere, so the ability to stay up and running irrespective of hardware failure or an extreme weather event is critical. The best of both worlds - speed and cost benefits Perhaps the most obvious benefit of hybrid cloud technology is that SMEs gain access to enterprise-class IT capabilities at a much lower cost. SMEs can outsource the management of their IT services to Managed Service Providers (MSPs), allowing for immediate scalability and removing the need to manage potentially complex IT systems in-house – or to hire internal IT experts to do so for them. Another significant benefit is budgetary. Outsourcing via MSPs means avoiding a large upfront expenditure on IT systems and instead enjoying the benefits of data protection – hybrid cloud data backup, for example – on a monthly retainer or a pay-per-user basis.
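As a rough, purely illustrative comparison of the two spending models – the figures are invented, not vendor pricing – the sketch below sets a one-off capital purchase against a per-user monthly subscription over the same period.

```python
# Illustrative arithmetic only: invented figures, not vendor pricing, comparing a
# one-off capital purchase with an MSP's per-user monthly backup subscription.

def capex_total(hardware: float, install: float, annual_support: float, years: int) -> float:
    """Total cost of buying and supporting on-premises kit over the period."""
    return hardware + install + annual_support * years

def opex_total(per_user_month: float, users: int, years: int) -> float:
    """Total cost of a per-user monthly subscription over the same period."""
    return per_user_month * users * 12 * years

if __name__ == "__main__":
    years = 3
    diy = capex_total(hardware=18000, install=4000, annual_support=2500, years=years)
    msp = opex_total(per_user_month=9.0, users=60, years=years)
    print(f"DIY on-premises over {years} years: £{diy:,.0f}")
    print(f"MSP hybrid backup over {years} years: £{msp:,.0f}")
```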
One UK business that has seen these benefits first-hand is Harbro Ltd, a £100 million multi-national agricultural feed supplier. On the recommendation of its MSP, Clark Integrated Technologies, Harbro implemented SIRIS 2, Datto’s hybrid cloud solution. Just six months later, the company was hit by ransomware. Datto’s SIRIS 2 quickly restored the 120,000 corrupted files by rolling back to an hour before the virus struck. If Harbro had still been backing up its data and files using traditional tape methods, it would have suffered days of downtime. Austen Clark, Managing Director of Clark IT, commented: “Preventing data loss and ensuring business continuity for our customers is key. The financial data the company holds on its clients and the loss of revenue and reputation due to a stall in business operations can have a fatal effect on any business - particularly in today’s climate. By selecting Datto, we can be sure that businesses like Harbro can keep on going.” The considerable upside of the hybrid cloud model is that local storage devices can make immediate access to data or services possible without any of the delay associated with hauling large datasets down from the cloud. This is particularly important for SMEs that may be affected by bandwidth and/or cost concerns. In the event of a localised hardware failure or loss of a business mobile device, the data can be restored locally in just a matter of seconds. The ability to back up files at any time, without any network downtime Often, hybrid models use network downtime to back up local files to the cloud, lowering the impact on bandwidth during working hours while ensuring that an off-premises backup is in place should a more serious incident occur. Of course, this style of network management is not a new approach, but with a hybrid cloud setup it is much more efficient. In a cloud-only setup, the SME’s server has to run one or more agents to dedupe, compress and encrypt each backup, consuming the server’s resources. With a local device taking on this workload, the main server is left free to deal with day-to-day business. Consequently, backups can be made efficiently as and when required, then simply uploaded to the cloud when more bandwidth is available. Good news for MSPs Finally, the hybrid cloud offers many benefits on the MSP side of the coin, delivering sustainable recurring revenues – not only via the core backup services themselves, which will tend to grow over time as data volumes increase, but also via additional services. New value-add services might include monitoring the SME’s environment for new backup needs, or periodic business continuity drills, for example, to improve the MSP’s customer retention and help its business grow. ### The first robots: History’s early automatons If you ask most people what their experience with robots has been, you’ll likely receive a response ranging from minimal to none. Outside of the Robot Wars reboot and science-fiction movies, robots still feel as though they are implausible and futuristic, certainly when it comes to everyday life. In years to come, we may all have self-driving cars and robot assistants, but that day feels unfathomably distant. And yet, for something that feels so otherworldly and remote, robotics has a surprisingly long history.
[easy-tweet tweet="In years to come, we may all have self-diving cars and robot assistants" hashtags="robotics, tech, "] Automatons are the ancient ancestors of today’s robotic inventions; self-operating machines capable of performing a range of functions determined by their particular mechanism. Reports of these machines have existed for centuries, with one of the earliest proclaiming the existence of a life-size, humanoid figure complete with artificial organs and capable of singing and dancing built in China in the 10th century BC. While it is difficult to verify the existence of some of these early automatons, others have received more substantial historical backing. Leonardo da Vinci used his interest in mechanical engineering to design, and in all probability build, a mechanical knight that was capable of moving its arms, legs and head, as well as “speaking” via an automatic drum-roll. Leonardo’s designs for the machine were discovered in the 1950s and faithful reconstruction has given credence to claims that the robot was presented to the Milanese court in 1495. Perhaps even more impressive, The Writer automaton is capable of writing any text up to 40 characters long. Designed by Pierre Jaquet-Droz in the late 18th century and made up of more than 6,000 parts all squeezed into the replica model of a small boy, the programmable nature of the automaton has seen it lauded as the precursor to modern computers. Moving beyond the mechanical While it is easy to see the connection between Da Vinci’s knight and, say, the pre-programmed industrial robots used today, the future of the technology is already looking to outgrow its humble beginnings. Artificial intelligence is likely to be at the heart of robotics development and will surely make our present day efforts look relatively primitive. [easy-tweet tweet="Artificial intelligence is likely to be at the heart of robotics development" hashtags="tech, robotics,"] Whereas the early automatons were made up of a collection of cogs, gears, levers and pulleys, robots of the future are likely to be more digital than mechanical. The robotic creations of the future are just as likely to draw from the AI-related fields of linguistics, logic, behaviourism and software development, as they are nuts and bolts mechanisms. Already there are attempts being made to create robots that learn from experience, build upon cloud intelligence and recognise human emotions. While these robots of the future may differ markedly from the automatons of the past, in many ways they stem from the same ambition. While automatons mimicked the likeness and behaviour of humans, the robots of the future will be able to do this while also emulating, and perhaps surpassing the human mind. It seems that mankind’s need to create is centuries, even millennia old, and no matter how complex our robot inventions become, they will certainly owe a great deal to their mechanical forebears made all those years ago. ### Grasshopper Launches in the United Kingdom GetGo, a subsidiary of Citrix Systems, Inc. (Nasdaq: CTXS) today announced the general availability of Grasshopper, a leading cloud-based phone system for entrepreneurs and small businesses, in the United Kingdom (UK). With a simple mobile app, Grasshopper gives users the ability to add a virtual business line to their existing personal mobile, home or office phone number, without having to install or maintain expensive hardware. Subscription plans start at £10 per month. 
Grasshopper is a reliable, easy way for entrepreneurs and small businesses to run their business from any location. Users simply pick a company phone number (local, national, or freephone options are available), customize their main greeting, and add unlimited extensions for different departments and employees. Users can then take advantage of the new Grasshopper mobile app to make and receive calls using their new numbers, listen to voicemails, receive faxes and be notified of missed calls immediately. Providing calls over the public switched telephone network (PSTN) rather than over a Voice-over-IP (VoIP connection), Grasshopper is able to provide superior call quality and reliability. In addition, it offers the ability to transcribe voicemails via email notifications so users can quickly prioritize inbound messages; alternatively, voicemails can be transcribed by a person on-demand. For those on the go, Grasshopper allows users to view and sign faxes and PDF attachments directly from their smartphone. As of October 2016, over 250,000 entrepreneurs in North America and the UK are using Grasshopper to expand their footprint and appear more professional, including close to 1,000 customers in the UK. Additional launches are planned for 2017 and beyond in countries like Australia, Singapore and South Africa. “We’re happy to be providing our dependable and easy-to-use virtual phone service to new customers located throughout the UK,” said Rouven Mayer, director of international marketing at GetGo. “Grasshopper puts the power in entrepreneurs’ hands and gives them the ability to grow and manage their business from anywhere in the world.” About Citrix Citrix (NASDAQ:CTXS) is leading the transition to software-defining the workplace, uniting virtualization, mobility management, networking and SaaS solutions to enable new ways for businesses and people to work better. Citrix solutions power business mobility through secure, mobile workspaces that provide people with instant access to apps, desktops, data and communications on any device, over any network and cloud. With annual revenue in 2015 of $3.28 billion, Citrix solutions are in use at more than 400,000 organizations and by over 100 million users globally. ### 4 Key Terms Used When Developing a Salesforce Test Data Migration Plan Every once in a while, you have to change your business’ systems. Migrating from the old system to a newer one is always exciting because you will be moving to a better performing system. However, this migration poses a great risk to the data that you have acquired over the years. Loss of data is a major cause for worry for many business owners. For this reason, you have to ensure that you set in place the best data migration strategy. [easy-tweet tweet="Migrating from the old system to a newer one is always exciting" hashtags="cloud, tech, migration"] Data migration is not as easy as plugging in a flash drive or a memory card to your computer and transferring to another one. It is a time-consuming process and one that will require serious planning. It is nonetheless one of the most beneficial tasks for all businesses. You will be upgrading to better data management practices and better-performing systems. In order to conduct the whole process, you will need the services of a data migration expert and your database administrator’s assistance. You also need to understand what is meant by the following terms. Legacy data Legacy data is simply the data that you want to move to another system. 
It goes without saying that the source of this data is the legacy system. It includes the information that is currently recorded in your storage system. This could be anything from scanned images and paper documents to database records, text files, and spreadsheets. All these formats of data can be shifted to another system. Data-migrator This is a tool that is used to move the data from the legacy system to the new system. Flosum is one of the best data migration tools that you will find on the market today. You can access it at http://www.flosum.com/salesforce-data-migrator/ and it will simplify your data migration process. This migrator works with just about all methods that you might want to apply in the data migration process. Data migration This is the process of exporting the legacy data to the target system through the data migrator. It can be done manually or automated. The specific method that you decide to use for the data migration process depends entirely on the systems that you will be using as well as the nature and state of the data that will be migrated. Data cleansing Cleansing of data must be done before you begin the migration process. It is all about the preparation of the legacy data for the migration to the new system. This is done because there is a disparity between the architecture of the legacy system and that of the target system. Often, the legacy data will not meet the criteria that are set by the target system. Data cleansing, therefore, manipulates the legacy data so that it will meet the requirements of the new system. The bottom line here is that if you understand the basics of data migration, you will find the whole job far easier to complete. The terms above are among the foundational concepts of every data migration project. A good comprehension of them will help a lot as you plan the data migration project. ### Dell EMC Unity Increases Efficiency And Lowers All-Flash Storage Costs By 4x ... For Free Dell EMC today introduced significant, free, non-disruptive, data-in-place software updates to the Dell EMC Unity™ family of storage systems. The new capabilities, featuring intelligent inline compression and support for the latest flash drives, are designed to enable substantial gains in storage capacity and efficiency for All-Flash configurations. Expanding Unity’s capabilities for the modern data center, Dell EMC also introduced integrated file tiering to the public cloud (including Virtustream), along with intelligent and predictive analytics through the new CloudIQ cloud-based storage analytics platform. CLOUD-BASED ANALYTICS Previewed earlier this year, CloudIQ is a cloud-based storage analytics platform that will be available later this quarter. CloudIQ provides near real-time intelligence and predictive analytics capabilities to proactively monitor, manage, and provide health scores for Unity storage. DOUBLE THE CAPACITY, SAME FOOTPRINT Unity customers will gain inline compression capabilities with the latest version of UnityOE software, further enhancing data storage efficiency with features such as thin provisioning, snapshots and “file system shrink”. With the ability to compress block-based LUNs hosted in All-Flash pools, Unity’s new compression capabilities are designed to help customers save up to 70% in storage capacity costs.
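As a rough rule of thumb (the figures below are illustrative, not Dell EMC's own - achievable compression depends entirely on the workload), the capacity saved by inline compression is one minus the reciprocal of the compression ratio, which is how a "4x" efficiency figure and an "up to 70%" saving describe the same order of improvement. A minimal sketch of the arithmetic:

```python
# Illustrative arithmetic only - real compression ratios depend on the workload.

def savings_from_ratio(ratio: float) -> float:
    """Fraction of raw capacity saved at a given compression ratio."""
    return 1 - 1 / ratio

def ratio_from_savings(savings: float) -> float:
    """Compression ratio implied by a fractional capacity saving."""
    return 1 / (1 - savings)

print(f"4:1 compression saves {savings_from_ratio(4.0):.0%} of capacity")  # 75%
print(f"A 70% saving implies roughly {ratio_from_savings(0.70):.1f}:1")    # ~3.3:1
```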
Unity All-Flash now provides up to 384TB in a 2U rack through new support for 15.36TB 3D NAND (SAS Flash 4) flash drives, doubling Unity’s drive density. In addition, the Unity 600(F) doubles the number of drives supported, increasing total usable capacity up to 10PBs. Extending the life of mixed SSDs, Dell EMC is also introducing intelligent wear leveling with Unity, allowing customers to mix different types of flash drives within an All-Flash pool. This capability is designed to further extend the life expectancy of All-Flash drives on Unity platforms by intelligently moving data off highly active drives over to drives with less activity within the same Unity storage array. LOWER TCO WITH CLOUD TIERING Expanding its capabilities for the modern data center, Unity now supports seamless, automated policy-based file tiering to public clouds including Virtustream, Amazon S3, and Microsoft Azure. With this capability, customers can archive inactive data to a public cloud service. This feature can help reduce capital expenditure on additional physical storage by freeing up primary storage capacity while reducing backup windows and lowering operational expenditures by reducing data management overheads. Furthermore, customers who use Unity cloud file tiering with Virtustream can save up to 60% on their file storage costs. MIGRATION MADE EASY Since its May 2016 introduction, thousands of customers have simplified and modernized their data centers with Unity, and upgrading to Unity has never been easier. Customers wishing to upgrade to Unity from EMC VNX® systems can leverage a new streamlined and simplified workflow user interface. Migration is implemented as a built-in, self-service solution from inside the new Unisphere management interface. Additionally, customers already using Unity can take advantage of data-in-place upgrades. “We’re always looking for innovative new ways to help our customers get the most out of their storage environments and bring a new level of simplicity to storage,” said Jeff Boudreau, SVP and GM midrange storage, Dell EMC. “By adding inline compression to our existing data reduction technologies, our customers can achieve up to 4x efficiencies in their storage, helping them to reduce significant overhead within their database and other workloads and maximize the value of their storage investments. With the addition of larger capacity flash drives for Dell EMC Unity, customers will be able to scale their environments up to 384TB in a 2U rack, driving up efficiency while keeping down TCO. And now with the availability of new CloudIQ storage analytics, customers can get proactive Health Scores of Unity systems including recommended remediation based on best practices and risk management procedures – ensuring Unity systems are available, protected, and performing.” AVAILABILITY Updates to the UnityOE, including support for 15.36TB flash drives, will be available toward the end of 2016 at no cost as a non-disruptive, data-in-place upgrade. CUSTOMER QUOTE: Brady Snodgrass, storage architect, First National Technology Solutions “Inline compression in Unity will allow us to more efficiently utilize our midrange storage capacity and, ultimately, reduce costs. Additionally, the ability to create completely isolated environments using IP multi-tenancy will greatly increase security and control, which, as a cloud service provider, is an absolute necessity. 
The simplicity with which we can perform data-in-place hardware upgrades is amazing on the Unity platform, as is our ability to upgrade an existing Unity array to a new model by swapping out the service processors.” ANALYST QUOTE: Ashish Nadkarni, program director, computing platforms, IDC “Businesses are putting unprecedented pressures on their midrange storage arrays, with ever-increasing sizes and numbers of files that need to be stored and accessed instantly. To get the most out of an array – in terms of value for money and capacity per drive – it’s important for organizations to demand high-capacity drives, data compression and tiering to the cloud. Smart businesses are choosing storage platforms that act as a hub, compressing data and directing it to the best storage target to match its SLOs, whether that’s an internal, high-capacity flash drive for hot data or a more cost-effective option in the cloud for information that’s not used as much.” PARTNER QUOTE: Jack Rondoni, vice president of storage networking, Brocade “To achieve maximum value from the Dell EMC Unity Family of solutions, organizations require a modern storage network that is easy to deploy and manage without sacrificing performance or reliability. Brocade Gen 6 Fibre Channel and IP storage networking helps unleash the full power and efficiencies of Unity. In addition, Brocade Fabric Vision Technology with IO Insight provides deeper visibility into the IO performance of storage infrastructure to further enhance monitoring and maintenance of service levels.”   ### New Selfie Technology launched by Lloyds Banking Group As part of the investment in digital technology to improve the customer experience, Lloyds Banking Group has just launched selfie technology to enable Bank of Scotland customers to open a current account seamlessly online. Customers applying online through the new simplified application form will sometimes be required to provide ID. They will be able to complete a simple step-by-step application process to open a current account and take pictures of their UK driving licence or passport, along with selfie images to confirm their identity. As the ID verification technology is web based customers can use a web browser on their smartphone or tablet to submit images – and there is no need to download an additional app. This greatly simplifies the current account application process for new customers and gives them the flexibility to open an account quickly from the comfort of their own home. Customers are guided through the process with simple instructions on screen and the whole process only takes a matter of minutes. Within a maximum of two working days the customer will be informed of the result of their application, although this can happen within an hour. Nick Williams, Consumer Digital Director, Lloyds Banking Group, said: “We recognise the importance of simplifying our processes to allow customers to easily open a new personal current account from whichever device they prefer. Today’s customers now expect an experience which is simple and efficient, and as they are increasingly digitally savvy they want to interact with us in a way that suits their busy and mobile lifestyles.” ### Nuance Provides Latest Voice Biometrics Technology to University of York for New Courses in Speech Science   Nuance Communications, Inc. 
today announced that it has provided the University of York with voice biometrics software solutions for use within the University’s new Continuing Professional Development (CPD) courses in Forensic Speech Science. The courses are aimed at scientists and public sector professionals looking to enhance their knowledge and practical skills in speech analysis. The software will also be available to graduate students studying for the University’s MSc in Forensic Speech Science and for postgraduate research more generally. The CPD courses were launched this week with the first cohort of participants drawn from law enforcement agencies, public and private sector laboratories and speech scientists from other universities seeking to extend their knowledge and skill sets. The courses are designed as intensive theoretical and practical three-day programmes where students learn about and use voice biometrics technology to better analyse speech in criminal cases. The University’s year-long MSc course, which is believed to be the only one of its kind in the world, focuses on the application of linguistics, phonetics and acoustics in investigations and legal proceedings. It enables postgraduate students, including practitioners within the public sector, to get a grounding in theory and hands-on experience with using the latest technology available in this area. Technological solutions are increasingly being used alongside established methods to quickly and accurately filter large volumes of data and to subsequently speed up casework. Dr Peter French, Professor of Forensic Speech Science in the Dept of Language & Linguistic Science, University of York and Chairman at JP French Associates commented: “Nuance is recognised as a market leader in research and development in this area. Its automatic speaker comparison technology facilitates investigations by allowing large volume comparisons of recordings. The process is extremely time-efficient and the user interfaces of the software are highly intuitive and user-friendly.” Dr Dominic Watt, Senior Lecturer of Forensic Speech Science in the Dept of Language & Linguistic Science, University of York and Chair of the Departmental Communications Group said: “We work hard to ensure that our courses are relevant for careers in industry and aim to provide students with the skills they need to succeed in their profession. The new CPD courses and MSc programme are tailored to the needs of the forensic, law enforcement and government sectors. On these courses you won’t just be sitting in a lecture hall taking notes; you’ll get to work on real case materials and see how technology can help identify individuals more quickly.” Dr Dunstan Brown, Professor of Linguistics and Head of the Dept of Language & Linguistic Science, University of York noted: “We are thrilled to be using the voice biometrics software from Nuance, which will provide for us an important new aspect of our training and research programmes.” Brian Redpath, Public Sector Director at Nuance Communications, concluded: “Forensic speech analysis is a powerful tool for the public sector assisting professionals in various casework activities. In the hands of trained personnel, this technology will enable improved productivity during both the investigatory stage and during preparation for trial. 
I am delighted that the University of York will include the use of Nuance voice biometrics, enabling postgraduate students and industry professionals to enhance their skills by embracing our technology.” In the general population voice biometrics has become an accepted form of authentication as well, one which has proven to be both secure and convenient. Opus Research’s latest voice biometrics census, completed in July 2016, shows dramatic growth in enrollments (more than 137 million globally), signaling voice as a ubiquitous, highly personalised authentication factor with the capability to combine command and control with identification and access management. Voice biometrics technology has indeed been embraced worldwide, and in the UK in particular has increasingly been deployed by a number of UK organisations to not only provide a more secure and seamless authentication experience, but also to reduce Fraud. Organisations like First Direct, TalkTalk and Barclays have rolled out voice biometrics to millions of UK citizens over the past several months alone.   ### Heavy Reading Survey: 86 Percent of Global Telecoms Report OpenStack Important or Essential to Their Success A survey recently conducted by Heavy Reading reveals that 85.8 percent of telecom respondents consider OpenStack to be essential or important to their success. The survey, commissioned by the OpenStack Foundation, explores current usage and adoption plans of communications service providers (CSPs) particularly with respect to NFV, 5G, IoT and enterprise cloud. Attend a webinar hosted by Heavy Reading senior analyst Roz Roseboro on November 9 for a deep dive into the survey results, and hear directly from telecoms at the OpenStack Summit Barcelona, October 25-28, 2016. Facing unprecedented traffic and rapidly evolving customer expectations, telecom providers worldwide are accelerating their adoption of Network Functions Virtualization (NFV) to increase network agility and mitigate costs, and OpenStack has emerged as the NFV infrastructure platform of choice. Already, numerous telecom providers and enterprise leaders have chosen to implement NFV with OpenStack. AT&T, Bloomberg LP, China Mobile, Deutsche Telekom, NTT Group, SK Telecom and Verizon are among the organizations documented using OpenStack for NFV. “The flexibility and versatility of OpenStack as a cloud and NFV platform allows telecom companies to combine business and communications IT under a single technology,” said Jonathan Bryce, executive director of the OpenStack Foundation, “We’re seeing incredible progress and rapid adoption of OpenStack by the telecom industry, and in return, the contributions the telecom industry is making to OpenStack have far-reaching value for other businesses with challenging requirements such as high-availability, scaling, fault management, and a diverse footprint of facility sizes and locations.” Axel Clauberg, vice president, Aggregation, Transport, IP at Deutsche Telekom AG, reports, "Deutsche Telekom is well on its way taking NFV into production. In March 2015, we announced CloudVPN, our first production NFV workload running on OpenStack, a service available in Croatia, Slovakia and Hungary. Deutsche Telekom supports OpenStack and contributes requirements and code to help advance NFV features more rapidly." Jörn Kellermann, senior vice president of Global IT Operations, T-Systems International, GmbH (a subsidiary of Deutsche Telecom) will be a keynote speaker at OpenStack Summit Barcelona. 
Key Findings: Heavy Reading surveyed its Light Reading database and received 113 responses from representatives of telecom companies around the world: 54 percent from the US, 14.2 percent from Europe, 11.5 percent from the Asia Pacific region, 8.9 percent each from Central/South America and Canada, and 2.7 percent from the Middle East.
- Highly Valued: 85.8 percent consider OpenStack to be essential or important to their success.
- Rapid Adoption: Telecoms are already using or currently testing new use cases with OpenStack for: NFV: 60.3 percent using or testing; 37.8 percent considering (total 98.1 percent). IoT: 41.3 percent using or testing; 49.5 percent considering (total 90.8 percent). 5G: 30.6 percent using or testing; 49.1 percent considering (total 79.7 percent).
- Multiple Benefits: In popularity order, the benefits cited most often are: offer new services more quickly (by far the most-cited benefit, outpacing the next by 24 percent); more rapid virtualization of the data center; reduced operational cost; and reduced software cost.
- Strong Community Engagement: 73.5 percent of respondents are engaged with OpenStack, primarily by contributing to the software, contributing requirements and use cases, and attending OpenStack Summits and OpenStack Days community events. Neutron is the OpenStack project most contributed to, followed closely by Nova.
- OpenStack Neutron Plug-in Popularity: The survey revealed that telecoms are using a mix of open source and commercial OpenStack Neutron plug-ins for SDN and virtual switching. The variety of plug-ins selected in the survey validates OpenStack's strategy of providing user choice.
- Containers on the Rise: 99 percent of respondents are considering containers for both VNFs and business applications. 68.4 percent will definitely use containers for VNFs, and almost all others are considering them (30.6 percent). This is slightly higher than using containers for business apps (63.6 percent definitely or likely, 35.5 percent potentially).
“One of the findings of this survey, which is borne out in other open source research we’ve conducted recently, is the high degree of awareness of OpenStack among the telecom industry,” said Roz Roseboro, senior analyst at Heavy Reading. “Also impressive is the large number of organizations using or testing NFV solutions on OpenStack; this number has grown considerably in comparison to similar studies we conducted only a year and a half ago. The number of telecoms using or testing Internet of Things solutions on OpenStack is also significant, as IoT is a rising star, and OpenStack is well positioned to be an integral resource in this high-growth arena.” The OpenStack Summit taking place this October in Barcelona, Spain, will feature a telecom/NFV keynote address, user presentations, workshops, panels and collaborative sessions. Among the telecom user speakers will be AT&T, China Mobile, Deutsche Telekom and Verizon. A highlight of the keynote address on October 25 will be a live NFV fault management demonstration with OpenStack Congress and Vitrage, based on the OPNFV Doctor Framework. Another special demonstration will be held in the Marketplace; EANTC (European Advanced Networking Test Center) will broadcast NFV tests from the Berlin test center, showcasing OpenStack-based interoperability and performance for a variety of telecom solutions.
Approximately 6,000 attendees from 50+ countries are expected to attend the Summit to discuss how the OpenStack open source cloud platform is helping companies increase agility and save money as software becomes more strategic to their business.   ### UK Organisations Risk Retention Crisis Due to a Lack of Digital Working Practices Research from Sungard Availability Services® (Sungard AS), a leading provider of information availability through managed IT, cloud and recovery services, has revealed that UK employees are leaving their current employers if digital expectations are not met. The study found that having access to the latest digital tools is considered crucial by 76% of UK workers, whilst a third (33%) admitted they would be embarrassed to work in an organisation without them. Worryingly, 21% of UK employees have actually left a place of employment as they felt they did not have access to the latest digital tools to remain competitive within their industry. While this figure is not as high as the US (32%), it should still serve as a cause for concern for UK businesses especially given growing fears of an ever widening skills gap within the region, as well as the stiff competition to attract, and retain, the best talent on the global stage. Investment and Upskilling Investment in digital tools is only half the battle. Organisations must also invest in their employees, ensuring that they have the skills and knowledge needed to make the best possible use of digital technology. While employees recognise the need for digital tools, many are struggling to get the most out of them. Nearly a third (31%) believe new digital tools are making their jobs more stressful, while 30% claim it has made their role more difficult. Troublingly, 23% say they do not understand how to use the new digital tools their employer has provided. A comparison of this research across different regions found that the UK is the most pessimistic country when it comes to digital confidence, with just 24% feeling that they are able to make the most of digital tools, this is in stark comparison to both the US and Ireland who are the most confident – with 42% in both regions feeling assured in their use of digital technology. Obstacles to Overcome Over half (52%) of UK employees cite having the right technical skills as the biggest challenge hindering digital transformation, with receiving the right training coming second at 37%. Meanwhile, 34% of workers complained they were not given enough or any training to get the most from the digital tools provided by their organisation, with nearly a quarter stating that the little training they do receive is not relevant or up to a good enough standard. Keith Tilley, Executive Vice President, global sales & customer service management at Sungard Availability Services commented: “Digital tools, from mobile working solutions through to cloud-based collaboration applications can be a game-changer for businesses and a powerful tool for growth. However, while deploying these tools may fall to the IT department, they must be adopted across the entire organisation in order to have any real or lasting impact. From the C-Suite down, all employees must have the right tools and training they need if organisations are to implement an effective digital-first strategy. Not doing so would prove an own goal and undermine the investments made in people as well as technology. 
“Happily, our research shows employees already understand the importance of digital transformation and are keen to learn more around the tools and techniques they will need in this new era of business. This is an important moment for the IT department, the CIO and their peers to further demonstrate the value they can offer. Not just in delivering digital tools, but also in offering the guidance and knowledge in helping employees to gain the most value from them. By doing so, they really will have a true digital transformation on their hands, and create a business ready to reap the rewards.” Eddie Curzon, Regional Valley Director at CBI added: “While technology is important, it is only part of a wider story. IT only does what it is told; its success depends entirely on the competency and acumen of those operating it. While employees understand the value of digital tools, it doesn’t necessarily mean they already have the skills in place to use them effectively. Businesses must take the time to invest in the workforce – closing any impending skills gap by offering staff the training they need. And with employees now demanding the implementation of digital working practices, the impetus is with senior managers to ensure that everything is being done to answer these demands and provide the workforce with the tools needed to help the organisation thrive. As ever, businesses that listen to and act on the suggestions of their staff will prosper, while those who fail to heed employee demands place themselves at risk of familiar staff retention issues. As the research shows, technology is now a critical factor in keeping staff productive and fulfilled in their roles. Ignore this at your peril.” ### CensorNet Releases App for Multi-Factor Authentication CensorNet, the complete cloud security company, has today announced the launch of the CensorNet App for Multi-Factor Authentication (MFA). This latest authentication method allows one-time passcodes (OTPs) to be delivered to the users’ mobile phone via encrypted push notifications from the CensorNet App.  The app adds to the SMS, voice call, email, and token delivery methods already offered by CensorNet. The app will provide CensorNet customers with greater freedom and flexibility, at the same time as offering a highly secure method of authentication. Every single app installation will use its own, auto-generated encryption key, ensuring only the intended receiver of a message will be able to decrypt it. As with each of CensorNet’s security tools, controls can be set at a granular level. For MFA, different OTP delivery methods are set based on what is appropriate for different user groups. Administrators can now determine if employees should or should not receive passcodes via the app. In addition, the solution can automatically select the most appropriate delivery method based on the context of the user. The application has been released as part of Version 9 of SMS Passcode which, from today, is renamed CensorNet MFA. Further updates to the product include PowerShell support as a new option for administrators to manage a CensorNet MFA installation, and Advanced Database Auditing to help customers comply with strict industry regulations and audit control requirements. Ed Macnair, CEO, CensorNet said: “Apps are increasingly being adopted for communication and collaboration within businesses, so it makes absolute sense to facilitate MFA in the same way. Our aim has always been to make security as simple and efficient as possible. 
Integrating an app that’s easily and fully customisable helps us do just this.” “Weak and stolen credentials are still a primary attack vector and our acquisition of SMS Passcode earlier in the year has enabled us to offer a truly integrated cloud security platform.  Renaming the product CensorNet MFA is the next step in ensuring we offer that seamless, single pane of glass platform.” CensorNet MFA Version 9 is available to customers from today and the CensorNet App can now be downloaded from the App Store and Google Play. ### Using data for decision making – why different approaches to intelligence matter Over the centuries, humans have thought a lot about what makes up intelligence. From general theories about reasoning and memory to the development of the Intelligence Quotient (IQ) from 1912 onwards, we have looked at ways to define what makes us smart and – more importantly – how we can be smarter. Today, organisations are developing their use of analytics to help business people make decisions in smarter ways. These deployments can cover point uses, such as data visualisations and dashboards, to more far-reaching and strategic Business Intelligence (BI) and Analytics implementations. However, as we seek to expand the use of analytics within businesses and public sector bodies, it’s important to recognise that not everyone works in the same way. [easy-tweet tweet="Organisation use of analytics help businesses make decisions in smarter ways." hashtags="cloud, data"] Emotional intelligence, or EI, was developed as a concept to discuss how people think about interpersonal relationships, manage behaviour and achieve their goals. In a work environment, better EI should correlate with strong performance in teams and where roles deal with complex emotional situations. The main thing here is that not everyone within an organisation will process information in an analytic fashion. Getting to a “data-driven” mindset can be more difficult than just providing user-friendly charts. Instead, successfully managing the decision-making process with the help of data can involve a lot more change management and education. For people used to working intuitively and based on their experience, making a decision based on the output of an analytics query can be a difficult habit to get into and well outside their comfort zone. While finance, marketers and sales leaders have been developing their analytical skills and started the transition to working with data every day, many other departments still have to make this shift. Thinking about EI can help. Designing data and analytics for everyone To make analytics work for everybody across the organisation, it’s worth building out a change management process to help educate new users. As part of this process, it’s important to get feedback on how the team involved currently measures its successes and opportunities. For example, a social care team may currently look at how their services help clients live more independent lives. This goal may be difficult to translate into traditional reports, so it is worth digging into how each of these successes can be linked to wider data sources such as budget figures or other care costs. By understanding these results in context, it’s then possible to build more appropriate metrics that can be used and modelled. This level of insight is only possible through discussion with those who are responsible for delivering the service. 
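To make that concrete, here is a minimal sketch (with entirely made-up team names, figures and column names) of the kind of metric that emerges when a team's reported successes are linked to wider data sources such as budget figures:

```python
import pandas as pd

# Hypothetical extracts: outcomes reported by a social care team and the matching budget data.
outcomes = pd.DataFrame({
    "team": ["North", "South"],
    "clients_living_independently": [42, 31],
})
budget = pd.DataFrame({
    "team": ["North", "South"],
    "care_spend_gbp": [510_000, 420_000],
})

# Joining the two gives a measure the team can own and track over time,
# e.g. spend per client supported to live independently.
kvi = outcomes.merge(budget, on="team")
kvi["spend_per_independent_client"] = (
    kvi["care_spend_gbp"] / kvi["clients_living_independently"]
)
print(kvi)
```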
By understanding how people make improvements, it’s possible to create measures that they can track over time. These are Key Value Indicators or KVIs. They are different from Key Performance Indicators (KPIs) in that KVIs should link to personal goals, while KPIs are often set at the organisation, group or departmental level. The relationship between KVIs and KPIs is not an exact science, but each individual should be able to feel that ownership of his/her KVIs will lead to direct improvement in overall KPIs, no matter how small. Alongside this process of setting up KVIs, IT may have insight into how data and decision making can go together. This has to be a collaboration between IT and the team involved to be successful. Without that connection to the emotional side of how people think, it’s difficult to get adoption of analytics to stick. Delivery of data needs a new approach From a technology perspective, the shift to more data-driven decision making runs parallel to a trend where companies are generating and keeping more data than ever before. A new approach is to build up data lakes alongside traditional data warehouses. However, combining all this data together and making it accessible and useful is the big challenge. [easy-tweet tweet="Data-driven decision making is parallel to a trend where companies keeping more data than ever." hashtags="cloud, tech"] Company approaches to BI and analytics have typically focused on small numbers of users – either specialised business analysts with skills in using data and statistics, or new roles such as data scientists working with big data sets. Some departments may have chosen to use data discovery tools to make the information within spreadsheets more usable. However, getting consistent data into the hands of people across an entire organisation requires a different approach. Rather than relying on central IT teams to share reports, the emphasis should instead be on helping everyone get access to data that they can work with in their own ways. These individuals will know far more about their own needs and the kinds of questions that they want to answer, so IT can help them better by stepping out of the way and providing self-service tools to governed data sets. This does require more guide rails to be in place around the data that people use – for example, providing access to marketing and sales data in a sandbox environment that makes use of “real world data” but without affecting the production systems that this data comes from. By helping people build queries in a semantic way, based on the questions that they want to answer, analytics can help them get more accurate responses to their requirements. Key to this is networking data from different sources together in a shared, common analytical fabric, rather than relying on data dumps or spreadsheets that are emailed around. By bringing people into the data to work with it, it’s easier for them to meet their own needs, rather than IT doing what the team thinks is required. This semantic approach to data also fits better with groups working to shared goals. Rather than looking at data models and building up knowledge of statistics, the emphasis should be on knowing the right questions to ask. Again, guiding the process is more important than defining what reports should be delivered. 
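One way to read the sandbox idea, sketched here with SQLite purely for illustration (a real deployment would more likely use a governed warehouse export, masked where necessary, and the database and table names below are hypothetical), is a periodic snapshot of production data opened read-only, so exploratory queries can never touch the live system:

```python
import shutil
import sqlite3

# Illustrative only: snapshot a hypothetical production database into a sandbox copy.
shutil.copyfile("production.db", "sandbox.db")

# Open the sandbox read-only, so self-service queries cannot modify even the snapshot.
conn = sqlite3.connect("file:sandbox.db?mode=ro", uri=True)
for region, total in conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
):
    print(region, total)
conn.close()
```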
[easy-tweet tweet="Guiding the process is more important than defining what reports should be delivered" hashtags="tech, cloud, data"] Looking ahead, it’s inevitable that more and more of us will be using data in our decisions every day. However, reaching this goal involves understanding the different needs that everyone has. While we will all need to build up skills in using data, IT should be focusing on how to make analytics useful to everyone. ### A View from the Edge Another year at the prestigious IBM Edge conference and what’s new? Well, quite a bit actually. Cloud with z Systems – IBM’s latest solutions create customer experiences through mobile and analytics, deliver agility and efficiency through the cloud, and ensure always-on service and data protection. [easy-tweet tweet="IBM Solutions create customer experiences through mobile and analytics" hashtags="cloud, tech, IBM"] Ubuntu OpenStack on IBM Systems: IBM and Canonical have a long and innovative alliance and announced the availability of Canonical’s Ubuntu OS, Ubuntu OpenStack and tooling on  LinuxONE, as well as IBM z Systems. Ubuntu OpenStack is the most widely used private cloud platform among enterprises and service providers with over 55% of production OpenStack clouds, more than all other vendor solutions combined. Enterprise open innovation – Solutions that take advantage of hybrid cloud scalability, blockchain, high-performance data analytics coupled with open source development, LinuxONE Enabling cognitive IT with storage and software defined solutions – solutions around the workload and data-aware infrastructure needed to efficiently capture, deliver, manage and protect data with superior performance and economics with a link into the cognitive. Architecting the future of cognitive business – from designing, building and delivering IT infrastructure – current and future visionaries share and educate with their understanding of tech world – from the young to the old(er) As there is an awful lot of information to cover here, even within each topic, I'll focus on a couple and leave the link at the bottom of this article for you to watch the replays at your leisure.   A Secure Blockchain: IBM LinuxONE I am continually amazed at IBM with their ability to present solutions way ahead of the rest of the large tech vendors. I know that this is a very bold statement to make, especially from me but I can back this up with one particular presentation theme – The Blockchain! Having a very solid grounding and understanding of this particular technology I am ideally placed to comment and I must say two things. IBM has a skill for finding great talent (companies that compliment their technology) and demonstrating the partnership in such a way that the audience simply put, just get it! Additionally, they have some of the smartest engineering divisions that constantly research, develop and push the boundaries of their technology – Enter the Blockchain on their LinuxOne platform! [easy-tweet tweet="IBM have some of the smartest engineering divisions that constantly research" hashtags="IBM, tech, cloud"] Ross Mauri, GM IBM z Systems, and LinuxONE, presented a great keynote that left us pondering on the sheer size of Cloud and the Internet of things. “In the region of 20, 50 and even 100 transactions have been seen on backend systems from one click of a mobile device. 
When architecting for the future we need to look at the growth of data this generates, from 6x the current growth in mobile data by 2020 to over 20 billion IoT-connected devices by 2020 too – this will drive massive growth in analytics,” noted Mr. Mauri. Ross Mauri is absolutely spot on with these statements and, again, systems such as the IBM LinuxONE brand that can house, execute and manage hundreds of thousands of transactions a second are perfectly positioned. This is why I personally feel that IBM has chosen a very sound combination of infrastructure and services with Blockchain, which is extremely transactional and compute-hungry. Everledger at IBM Edge Another highlight of the IBM Edge conference was the second-day keynote featuring Leanne Kemp, CEO of Everledger. IBM client Everledger is an example of innovation with blockchain. Everledger is a great start-up that launched last year with a fantastic idea of creating a digital DNA footprint for the diamond market, using the Blockchain and their own way of capturing the provenance of each diamond by creating a digital fingerprint (location, authenticity, history) that revolutionizes the insurance market for diamonds! At IBM Edge, Everledger announced a blockchain platform to digitally certify Kimberley Process export diamonds. The technology fights against insider threats, protects data, and secures entry points and integrity of the network through unique and secure features (underpinned by IBM LinuxONE), in order to meet the strict security requirements of the diamond industry. At IBM Edge, IBM demonstrated that they are visionaries in blockchain technology that is architected for the future – not only by future-proofing technology use cases but also by finding practical use cases for industries. Well done IBM Edge, again, for making my time very educational, enjoyable (well it was in Vegas) and also very productive. It seems that IBM is all in with Cloud as the way forward and all of their product sets fit nicely into a solid Cloud vision. IBMGO – replay link for the keynotes and interviews - https://ibmgo.com/Edge2016 ### So, what is Cloud Storage? Are you unsure about exactly what Cloud Computing is? You may have heard the term somewhere else before - in an advert or on a software update or packaging, but it was never explained to you. Sometimes it is easy for tech companies to assume we all know what something is without ever telling us. However, by knowing a little more about cloud storage, you could revolutionise how you save and use your documents, movies, videos, and photos. So, what is Cloud Storage? The 20th-century idea of storage was for data to be saved solely on the device in hand, be it a desktop computer or a mobile phone. More recently, the limitations of storing information on devices have been realised: they often lose performance when too much data is stored on their hard drives or memory chips. The solution has not involved actual clouds or anything in the sky, but the idea of your information being stored in a dedicated storage device somewhere else, but instantly retrievable from whichever device you sign into. This means effortlessly working on the same documents or using the same photos without transferring data via discs or USB devices or emails.
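As a toy illustration of that "save once, retrieve anywhere" idea - assuming an S3-compatible bucket named my-cloud-drive standing in for a consumer cloud drive, and a file that exists locally - the same object saved from one device can be fetched later from any other device you sign in from:

```python
import boto3

# Illustrative only: an S3-compatible bucket stands in for a consumer cloud drive.
s3 = boto3.client("s3")

# Saving a file from one device...
s3.upload_file("holiday-photos.zip", "my-cloud-drive", "photos/holiday-photos.zip")

# ...and retrieving the same file later, from any other signed-in device.
s3.download_file("my-cloud-drive", "photos/holiday-photos.zip", "holiday-photos.zip")
```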
Cloud computing now stores all information, no matter which device you upload it from or access it from, on remote servers with near-limitless storage capability - the cloud. It is now easier to use different file types and sizes, move documents including text files, photos and videos between devices and sync all of your information to mobile devices and smartphones no matter your location. Is Cloud Computing Safe? Reputable cloud computing companies offer users basic in-built encryption services which make it extremely difficult for hackers to decipher what your content says. Secondly, they should offer two-step authentication, which makes it harder for hackers to break your password. Of course, there are measures you can take to protect yourself, including strong passwords - never save them in a file, but write the password down on a piece of paper and keep it safe. Lastly, it is worth noting that storing your data on a remote server is safe for another reason: you can recover and restore any lost data from the cloud. ### Humanitarian Aid: What Happens When the Cameras Go Home? When many people consider humanitarian aid in disaster situations, they think about the media’s coverage of the crisis: people being rescued, aid packages being delivered, shelters being built. But what happens when the cameras leave? Peter Skelton is a London-based Physiotherapist and Rehabilitation Project Manager with Handicap International, a charity which remains in a disaster-affected region for months after the public’s attention has moved on. Peter specialises in helping people injured during emergencies, often in countries with limited resources and support frameworks. Speaking about his work, Peter said, “Most people’s experiences of physiotherapy in the UK come from their own direct interactions with a physiotherapist, normally because of a sports injury, back pain or a similar issue. That experience is completely different if you’ve had a major accident such as a spinal injury or an amputation, when you will see a very different side to physiotherapy. “In many ways, the work we are doing in disaster situations is not markedly different from what we would do in major trauma centres within the UK. The difference is linked to the resources we have available, and the situations in which people find themselves. “Invariably, in the UK when you provide treatment, you know that people can get access to the follow-up care that they need, you know that they’ll have support from social services if they need it, and they’ll generally have a supportive family around. There are all sorts of systems set up to support people while they are unwell and throughout the recovery process. In a disaster zone you generally don’t have access to these. “We aren’t dealing with disaster injuries in isolation. Frequently, patients will have not only experienced a catastrophic injury, but may also have lost their home, their business, family members, friends. The country itself may also be experiencing severe upheaval so they are unlikely to have the same social support that we expect to be available in the UK.” Peter Skelton works for Handicap International, an international aid organisation working alongside disabled and vulnerable people in situations of poverty and exclusion, conflict and disaster.
He has worked in emergency teams responding to crises in Ecuador, Nepal, Gaza, Iraq, the Philippines, Libya, Jordan and Haiti. Peter Skelton will be speaking at the World Extreme Medicine Conference and Expo at Dynamic Earth, Edinburgh, EH8 8AS on 18 November 2016, focusing on the issue of psychological first aid. The Psychological First Aid training package was developed by the World Health Organisation, and is targeted at anybody who is helping out in response to a disaster: humanitarian aid workers, medical professionals and even laypeople. It is designed to give a basic framework that they can use to deliver immediate support to people in disasters. Peter said, “There is a misconception that the victims of disaster are always traumatised. Actually, my experience has been that people in disasters are incredibly resilient. What they really need is access to things like shelter, food and water, and if you can help them to meet those needs then they’re going to be fine. “It’s only a much smaller number of people that require any specialist intervention and psychological first aid comes in one level below that.” Mark Hannaford, founder of conference organisers World Extreme Medicine, said, “Peter is a hugely respected figure on the UK humanitarian scene, and his perspective is of particular interest because of his experience of the long-term rehabilitation of disaster victims. “World Extreme Medicine was founded around a campfire in Namibia, and we coined the phrase ‘World Extreme Medicine’ as an umbrella term for all practices of medicine outside of a clinical environment, whether it is prehospital, disaster and humanitarian, endurance, sport, expedition or wilderness medicine. “Our message is that there is a great diversity of careers in medicine, and that traditional hospital environments are not the only option for a fulfilling career. To put it into a layperson’s terms, there’s never been a more exciting time to work in medicine.” The World Extreme Medicine Conference and Expo brings together leading experts from around the globe to share learnings on prehospital care, expedition and wilderness medicine, sport, endurance, humanitarian and disaster medicine. ### NASA Gives Thumbs Up to World Extreme Medical Conference Kate Rubins, one of three astronauts aboard the International Space Station, has transmitted a message of support to the organisers of the World Extreme Medicine Conference and Expo, which will be held at Dynamic Earth in Edinburgh on Friday 18 to Monday 21 November. Taking time off from sequencing DNA 400km above the Earth’s surface, Kate Rubins reinforced the importance of extreme medicine: “Here in Earth orbit we have a unique appreciation of the concepts of ‘extreme’ and ‘remote’, very applicable to the World Extreme Medicine Conference, especially provisioning and point-of-care diagnostics in similar remote environments as well as on a wider global scale. “The concept of extreme medicine resonates with so many corners of human health, such as disaster and humanitarian medicine, prehospital care, wilderness medicine, and in isolated villages in the developing world. “The breaking down of traditional silos between these disciplines is leading to more effective treatments and devices, and of course the sharing of knowledge and best practices on a wider stage.” The video was put on the World Extreme Medicine Facebook page last Friday and already has over 9,100 views.
The attendees at the World Extreme Medicine Conference represent an eclectic mix of disciplines, united by one thing: they all specialise in medical practice conducted away from a usual clinical setting, typically in remote and sometimes dangerous locations. Four core disciplines are covered by 100 key speakers: disaster and humanitarian medicine, extreme, expedition and space medicine, human endurance and sports medicine plus prehospital medicine. Highlights include: Disaster and Humanitarian Medicine Dr David Nott, an NHS surgeon who spends several months of each year working overseas for Médecins Sans Frontières and the International Federation of the Red Cross, will be speaking about his most recent work at a makeshift hospital in Aleppo, Syria, and previously, in other conflict zones. Peter Skelton, a London-based physiotherapy and rehabilitation specialist who has worked in emergency teams in Ecuador, Nepal, Gaza, Iraq, the Philippines, Libya, Jordan and Haiti, will be speaking about the importance of Psychological First Aid training to responders in disaster situations. Extreme, Expedition and Space Medicine Speakers come from as far afield as Australia, such as John Cherry, a rural doctor working in Orange, New South Wales, around 150 miles west of Sydney. Dr Cherry has had an incredibly varied career and will be speaking about how he created the blueprint for preparing ESA astronauts for medical situations in space. American MD Will Smith is travelling from Jackson, Wyoming, where he is the US National Parks Medical Director. He provides consultancy services to extreme medicine and rescue organisations across the world and will be sharing his experiences of practicing medicine in remote and austere locations. Human Endurance and Sports Medicine Speakers include the elite sports expert, Edinburgh-based Dr Andrew Murray, who has worked for the Olympic, Paralympic and Commonwealth games. He is also an incredible athlete in his own right, having run 4,300 km from Scotland to the Sahara Desert and completed a husky trek in -40C in Outer Mongolia. Pre-Hospital Medicine Londoner Eoin Walker is a Pre-Hospital Mass Casualty Incident Management Paramedic with the London Air Ambulance, and will be discussing prehospital care alongside Zoe Hitchcock. In 2013, Zoe suffered a cardiac arrest whilst shopping in Oxford Street, central London, and Eoin was the first on the scene and re-started her heart. World Extreme Medicine Founder Mark Hannaford said, “In today’s world, more than ever before, the human race is determined to access remote areas, whether it be for science, exploration, business or a myriad of other reasons. People going into these areas need medical support, and the skillsets of the medical professionals required are very different to those needed in a traditional clinical environment. “Likewise, there are conflicts and disasters happening in parts of the world where access to equipment and medicine is extremely difficult or impossible. Medical professionals in these conditions need to be able to work with very limited resources and frequently overcome new challenges. “The area of extreme medicine is in growth, and our message is that it’s a great alternative to a traditional clinical career. 
My belief is that there’s never been a more exciting time to work in medicine, and the fascinating speakers at the World Extreme Medicine Conference will prove that point.” Mark Hannaford concludes, “We are thrilled to be bringing 100 speakers to Edinburgh at a unique event attended by 800 doctors, nurses, paramedics, surgeons and medical students. New medical research findings will be shared, making the conference an unmissable and historic event.” ### The Linux Foundation Drives Standardization of Open Source Software Supply Chain   October 4, 2016 – The Linux Foundation®, the nonprofit advancing professional open source management for mass collaboration, today announced that the OpenChain Project has established its first set of requirements and best practices for consistent free and open source software (FOSS) management processes in the open source software supply chain. The OpenChain Specification 1.0 aims to facilitate greater quality and consistency of open source compliance to help reduce duplication of effort caused by lack of standardization and transparency throughout professional open source organizations. Open source is the new norm for software development, evidenced by nearly 70 percent of hiring managers looking to recruit and retain open source professionals within the next six months (see: 2016 Open Source Jobs Survey and Report). From society lifelines such as healthcare networks and financial institutions to in-car entertainment and movie production, open source has become a key software supply chain every major industry is dependent upon. Businesses ranging from startups to enterprises are looking to establish, build and sustain open source projects that support long-term innovation and reduce R&D costs. For open source software to continue to thrive, there must be a common set of requirements and best practices established to ensure consistency of use and quality of software. Individuals and organizations reliant on open source software must also have access to training resources and expertise such as licensing and compliance to uphold the integrity of code. “Hundreds of thousands of people around the globe, including the world's largest companies, leverage open source software, so we need to work together to support best practices for software license compliance throughout a supply chain,” said Jim Zemlin, executive director, The Linux Foundation. “Licensing, best practices, training, certification and other resources are needed to scale open source and protect the innovation built on top of it. The OpenChain Project is taking a major step forward by helping create software supply chains that are both efficient and compliant.” The OpenChain Project is a community effort to establish common best practices for effective management of open source software and compliance with open source software licenses. The project aims to help reduce costs, duplication of effort, and ease friction points in the software supply. Today the OpenChain Project releases its first specification that defines a common set of requirements and best practices for open source organizations to follow in an attempt to encourage an ecosystem of transparent sharing and open source software compliance. 
The goals and requirements of the OpenChain Compliance Specification 1.0 include:

- Document FOSS policy and training for software staff;
- Assign responsibility for achieving compliance via designated FOSS-related roles;
- Review and approval of FOSS content;
- Deliver FOSS content documentation and artefacts such as copyright notices, licenses, source code, etc.;
- Understand FOSS community engagement, including legal approval, business rationale, technical review of code, community interaction and contribution requirements; and
- Adhere to OpenChain requirements for certification.

The OpenChain Project has also established three Work Teams to collaborate on future refinements of the OpenChain Specification, to develop training materials and to create conformance criteria for organizations. The project will also begin the roll-out of a self-conformance program this year. Platinum Members of the OpenChain Project include Adobe, ARM, Cisco, Harman, Hewlett Packard Enterprise, Qualcomm, Siemens and Wind River.

### The IoT challenge isn’t capacity planning, it’s about database choice

No matter what it means to your business, the Internet of Things (IoT) in its broadest sense is affecting all businesses, from the sports brand delivering fitness trackers to the manufacturer predicting pump failures. But if you want to build an IoT application today, you will struggle to find a good cookbook because there are many careful considerations to be made upfront. How much data will I collect? Will my current tools perform? Will I run out of capacity, and if so how can I predict my need for servers in the cloud?

A good place to start is Gartner analyst Doug Laney’s famous Three V’s of Big Data: Volume, Velocity, Variety. You may not categorise your needs as big data as such, but it is likely that one of these Vs will become – or already is – a new challenge.

Volume is a commonly cited problem because it can be hard to predict how much data will grow with device deployment and subsequent analyses, and the cost of over-provisioning is unacceptable. Today you can opt for a so-called IoTaaS or IoT Platform and outsource large parts of your service to a Google or a Cisco (Jasper), or you can stay DIY by starting with the servers you need today and harnessing cloud and software-defined infrastructure to scale. The key here is not visiting the fortune teller to predict what you need, but understanding how web-scale companies scale efficiently and reliably on commodity hardware by choosing their technology stack carefully from the outset.

Then there’s Variety. The bad news is that most relational databases (and most NoSQL databases, for that matter) aren’t designed to query IoT data efficiently. The good news is that IoT data are, in essence, very simple – a stream of timestamped values – which is why the database you choose must be optimised to ingest and query this data as efficiently as possible. It is by getting this database foundation right that today’s pioneers have the Velocity to deliver the real-time analytics and deep insights enabling them to live up to the promise of IoT. In short, if you’re building for IoT you need a time series database that scales efficiently across many servers in the cloud.
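To make the “stream of timestamped values” point concrete, here is a minimal, product-agnostic sketch of the access pattern a time series store is optimised for: readings partitioned into time buckets so that a range query only touches the buckets it needs. The bucket size, device name and values below are invented purely for illustration; a real deployment would use a purpose-built time series database rather than an in-memory dictionary.

```python
from collections import defaultdict
from datetime import datetime, timedelta
import random

# Hypothetical in-memory illustration of IoT time series data: a stream of
# (device_id, timestamp, value) readings, partitioned into 15-minute buckets
# so that range queries only scan the buckets they need.
BUCKET = timedelta(minutes=15)

def bucket_key(ts: datetime) -> datetime:
    """Round a timestamp down to the start of its 15-minute bucket (a simple time quantum)."""
    return ts - timedelta(minutes=ts.minute % 15, seconds=ts.second, microseconds=ts.microsecond)

store = defaultdict(list)  # {(device_id, bucket_start): [(timestamp, value), ...]}

def ingest(device_id: str, ts: datetime, value: float) -> None:
    """Append a reading to the bucket it belongs to."""
    store[(device_id, bucket_key(ts))].append((ts, value))

def query_avg(device_id: str, start: datetime, end: datetime) -> float:
    """Average all readings for one device in [start, end), scanning only the relevant buckets."""
    readings = []
    b = bucket_key(start)
    while b < end:
        readings += [v for ts, v in store[(device_id, b)] if start <= ts < end]
        b += BUCKET
    return sum(readings) / len(readings) if readings else float("nan")

# Simulate an hour of a pump sensor reporting every 10 seconds, then query the last 30 minutes.
t0 = datetime(2016, 10, 1, 12, 0, 0)
for i in range(360):
    ingest("pump-42", t0 + timedelta(seconds=10 * i), 20 + random.random())
print(query_avg("pump-42", t0 + timedelta(minutes=30), t0 + timedelta(hours=1)))
```

The same idea – partitioning on device plus a time quantum – is what allows a distributed time series database to spread both writes and range queries across many commodity servers.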
Ticking time (series) bomb

Time series data are being generated everywhere: by devices in homes, on your wrist, increasingly in our vehicles, linking patients to healthcare service providers, and by industrial machinery, while the financial world has long been immersed in it – time series data is fast becoming the coolest kid on the block. Businesses that understand how to make full use of time series data can identify patterns and forecast trends, creating significant competitive advantage. For example, it has been estimated by McKinsey & Company that retailers who leverage the full power of their data could increase their operating margins by as much as 60%. At the moment less than 0.5% of all data are ever analysed and used, and a recent study by Oxford Economics found that only 8% of businesses are using more than 25% of their IoT data. Just imagine the potential.

Accepting that promise, the next question is how you can continue to capture that data cost-effectively. What if you need to archive? What are the implications when you’re required by law to keep seven years of data at the granularity you need? Try doing this with Teradata, Netezza or Oracle. The short version of the story is that it’s just too expensive and cumbersome. You need to be considering commodity hardware combined with operationally simple open source software.

And it’s not only the expense. Not to labour a point we’ve made many times, but these databases are just not designed to hold and manage unstructured, or specifically time series, data. They were built for a different era. Setting scalability aside, you should ask how your database will:

- meet the cumulative volume of “readings” or data points over the period of data retention required by the business;
- provide adequate query performance for time series data;
- manage data as it expires over time; and
- support the analysis needed for your use cases.

Growing ambition

The aforementioned IoTaaS providers, of course, are masters of scale. They have invested hugely in building and optimising their “stack” for seamless operation so they can focus on invoicing happy customers and delivering new features. But these technologies are typically free and open source – if a backroom startup can profit from them, why can’t the established enterprise? Well, here’s a secret: those that are succeeding in early IoT products and internal services already do. I am fortunate to live in the “new stack” world where our customers have hit the limits of what’s possible and needed solutions to grow.

Take The Weather Company as an example. Their system collects several gigabytes of data per second from many sources, including millions of mobile phones, sensors in weather observing stations, and thousands of daily flights that measure data on temperature, turbulence and barometric pressure. In October 2015, Bryson Koehler, executive vice president and CIO of The Weather Company, described the challenge: “At The Weather Company, we manage 20 terabytes of new data a day, including real-time forecasting data from over 130,000 sources. The sheer volume of time series data requires databases that can efficiently and reliably store and query time series data.
Riak TS delivers on this need and allows us to perform the associated queries and transactions on time series data while maintaining high availability and scale.”

Another good example is Intellicore, which decided to build its Sports Data Platform on a time series database that is used to provide live value-added analysis (called second screen) to spectators and broadcast audiences during sporting events such as the recent Formula E electric car championship. Intellicore acquired the live telemetry data from the Formula E racing cars, generating tens of thousands of events per second, and redistributed that data, normalised and analysed in real time, to millions of live online viewers thanks to Riak TS.

This is only possible because these IoT applications are highly distributed, like Riak TS. The Riak TS database is masterless and scales with just a few lines of code. As such, it provides fault-tolerant storage for the relentless stream of incoming data. If a server dies, the data are stored elsewhere, and another node will remain responsive to queries. I appreciate that at this moment you might be saying “but that is not what it is like for most companies.” As businesses wake up to the possibilities, the number of organisations starting to leverage IoT data for insight and competitive advantage will mean these sorts of use cases quickly become mainstream.

Design a distributed IoT application around a time series database that scales efficiently and the cloud is the limit. If you are planning to launch a distributed application, then you can now develop and launch into production for free using open source Riak TS – launched in April this year.

### Cloud based analytics: A data first approach

The cloud increasingly dominates virtually every aspect of IT deployment, yet few companies have successfully embraced cloud-based analytics. The problem is not merely technical – although scalability concerns combined with the cost of accessing cloud-based data can rapidly derail an analytics project. The real issue is that companies are investing heavily in analytics platforms only to discover the data lacks the depth required to deliver real business value. It is time to go back to basics and take a data-first approach – and the first step is to leverage low-cost analytics platforms to verify the value of existing data quickly.

Business Value

The financial model adopted by cloud providers means that while it is easy – and relatively cheap – to put data into the cloud, it is difficult and extremely expensive to get it out again. In fact, the problem goes far deeper than the sheer cost of typical big-data-analytics-in-the-cloud models. While many companies may have vast quantities of data – both structured and unstructured – there is no guarantee that data can deliver business value, however clever the analytics. Data collection, categorisation and storage processes are inherently complex – and from data quality to ease of access, organisations can spend thousands of pounds only to discover that the data is simply not available to answer the critical business questions. What happens when the retailer discovers transactions are stored at a basket level rather than individual item level as believed?
Or the insurance company with policy data that is so complex and fragmented it is impossible to reconstruct a live system? Or the management team learning that instead of retaining customer history for five years, it is stored for just three months to reduce IT costs? Data issues can fundamentally compromise analytics activity.

Test & Evaluate

Rather than undertaking an expensive and time-consuming process of building up hardware, software and analytics expertise only to discover the gaps in data history, the new realm of cloud analytics providers now offers the capability and technology to undertake this essential data discovery step incredibly quickly. Within a matter of weeks, a cloud-based service can look at a company’s data to determine whether there is any business value and whether that data can be used to meet key objectives. It is only once the quality of the data has been confirmed that organisations should embark upon an analytics project. At this point, the priority is to work in close collaboration with cloud-based analytics providers to identify business outcomes. In the world of big data it is possible to search for any number of patterns or trends, but without a clear business case, the vast majority of such activity will deliver little or no value.

Analytics on Demand

Attitudes towards analytics need to change if businesses are to gain real value. Just delivering analytics capabilities will not deliver transformative business insight: a broad analytics capability alone is not a fast track to boosting profit or gaining new markets. Organisations need a business case to drive the analytics activity – and none of this can be achieved if the initial data evaluation step has not been undertaken up front. Big, expensive, time-consuming and resource-intensive analytics projects are outdated and unproven. They are failing to enable organisations to gain the benefits of either the cloud or analytics. An analytics-as-a-service model eradicates the issues currently acting as barriers to adoption, from cost to control and security to confidence. It is once organisations have the chance to iteratively and interactively exploit analytics in the cloud to meet specific, tightly defined business objectives quickly and at low cost that the next stage of business innovation will be truly enabled.

### Coping With Today’s Complex Network Management Licensing

Licensing models for software have changed dramatically over the last decade, largely as a result of the greater number of cloud-based offerings, the growing number of Software as a Service (SaaS) solutions and the mainstream adoption of virtual environments. But while these changes to software licensing have, in some respects, been good news for budgets and buying power, they have brought their own challenges when it comes to the complexity they can add to environments.

These days, many IT managers tend to over-license software, either because they have failed to re-assign licences when decommissioning or commissioning hardware, or to safeguard themselves from falling foul of any audit. While this approach may provide a safety blanket, it comes at a cost as money is wasted on licences that aren’t being used – and may never be.
But worse, if companies are unable to get a grip on their licensing regime, they could be guilty of under-licensing. This could potentially leave them at risk if there were to be an audit.

Fast Paced IT Changes Are Not Reflected In Licensing Models

Licensing for network management software is nowhere near as advanced as you might imagine. This has created a huge headache for IT teams because the software touches all aspects of the modern and highly sophisticated IT infrastructure. As a consequence, purchasing can be tough, with organisations forced to buy separate licences for applications, network devices and network flow sources. Worse, these licences are often inflexible and difficult to scale up without incurring significant cost and negatively affecting return on investment, because of the artificial licence bands set by vendors. This becomes even more of an issue when IT teams have to deal with multiple software companies.

Is There A New Way?

Essentially, what network management needs is a variant of the pay-as-you-consume model that is evolving for IT hardware and some types of cloud-based software. This would provide companies with a more flexible licensing model that addresses the overall network, rather than a complex and unwieldy model tied to individual devices, applications or network flow sources. Organisations would no longer be required to buy technology-specific licences for applications, network devices or network flow sources because they would all be included. This would enable them to reallocate licences wherever and whenever they need without additional cost. IT teams would have the freedom to restructure their monitoring requirements at will while eliminating the artificial limitations that result in the unnecessary expense of unused, technology-specific licences.

How would this work? One option is to adopt a points-based system that assigns points to the network devices, servers and virtual machines/hosts, applications and users being monitored. A significant advantage of the points system is that it provides organisations with the flexibility to purchase a set of points and use them to control any mix of technologies (e.g. devices, applications, flow sources, virtual machines, etc.). IT teams can configure their points any way they like – a simple sketch of the idea follows at the end of this article. As long as managers stay within the points tier they have purchased, they never have to pay for additional licences. As they add more devices or applications, they simply buy more points; they are not forced to buy specific licence types. Compare this to traditional licensing schemes that force people to acquire licences locked to specific network devices, applications or network flow sources.

Conclusion

A flexible licensing system is much better suited to the dynamic and agile IT systems of today because it allows organisations to make changes to their licences quickly and effortlessly as and when they make changes to their network. There is far less potential for IT teams to be left with unused licences, to lose track of licences when they change the mix of technologies being monitored, or to purchase additional licences they don’t need. Networks may well be more complicated, but that doesn’t mean we can’t make them easier to manage.
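To make the points model described above concrete, here is a rough sketch of how it might work in practice. The point weights, the 500-point tier and the example estate are invented for illustration only, and are not taken from any vendor’s actual scheme.

```python
# Hypothetical illustration of a points-based licensing model for network monitoring.
# Point weights per technology type and the purchased tier are invented numbers.
POINT_WEIGHTS = {"network_device": 1, "server": 1, "virtual_machine": 1, "application": 2, "flow_source": 5}

def points_used(monitored: dict) -> int:
    """Total points consumed by whatever mix of technologies is currently being monitored."""
    return sum(POINT_WEIGHTS[kind] * count for kind, count in monitored.items())

def within_tier(monitored: dict, purchased_points: int) -> bool:
    """True while the current mix fits inside the points tier already purchased."""
    return points_used(monitored) <= purchased_points

# An IT team with a 500-point tier can rebalance freely: dropping flow sources
# frees points that can be spent on extra applications with no new licence purchase.
estate = {"network_device": 120, "server": 40, "virtual_machine": 200, "application": 30, "flow_source": 10}
print(points_used(estate), within_tier(estate, 500))   # 470 True

estate["flow_source"] -= 4    # stop monitoring four flow sources...
estate["application"] += 10   # ...and monitor ten more applications instead
print(points_used(estate), within_tier(estate, 500))   # 470 True - still inside the same tier
```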
### Securing your cloud identity

As more and more enterprises move to the cloud, IT managers have their hands full when it comes to ensuring they do not become exposed to security breaches, data privacy exposures and compliance issues. In the last year alone, there has been a significant increase in the number of CIO mandates to adopt cloud applications. Research from IDC claims that by 2019 IT managers and enterprises will be spending more than £100 billion annually on cloud services. Of course, when approached in the right way, the cloud can deliver considerable benefits to organisations, but there are some challenges facing IT managers along the way.

Covering all the bases

As organisations move to the cloud, they will likely continue to have many critical applications that remain on-premises, some for many years to come. So even if a “cloud-first” mandate exists, any cloud-based identity management solution must provide comprehensive coverage of this hybrid IT environment.

Dissolution of the “network perimeter”

Employees today can use their personal devices to access corporate accounts in the cloud. This means that IT managers need visibility into, and control over, that access. Often, the only link IT has between the end user on a smartphone and an account for a SaaS application is the user’s identity. So actually managing that identity is the key to managing the perimeter-less enterprise.

Data, data and data

We’re seeing an explosion of unstructured data in the enterprise and out to the cloud in storage systems such as Dropbox. More often than not this is data that was previously kept secure in a database or application, but in the name of convenience and collaboration it has now been distributed in a largely uncontrolled fashion. With potentially sensitive data making the move to cloud storage services, it is crucial for organisations to understand and manage where this data exists and who has access to it.

For IT managers to make the most of the cloud without exposing themselves to security and privacy concerns, there needs to be a shift in organisations’ overall approach to IT security. Since understanding “identity” is often the most critical element linking all this together, enterprises need to ensure that identity is at the centre of their IT and security approach. To do this effectively, the barriers and separate silos of security and operations processes need to be broken down, to provide better visibility into who is doing what, what kind of risk that represents, and to be more proactive in dealing with threats in real time – across the entire hybrid IT enterprise infrastructure. The ability to manage and control identities across the hybrid IT environment while securely migrating to the cloud requires sound identity governance. And fortunately for cloud-savvy enterprises, there is a new generation of cloud-based identity management solutions that meet the needs of managing this hybrid IT environment while extending the benefits of the cloud. However, as with all new markets, there will be technology claims that may exceed a vendor’s ability to deliver.
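Before looking at what a cloud identity governance solution should cover, it may help to sketch the underlying idea in a few lines of code: gather the entitlements each user actually holds across SaaS and on-premises systems, compare them against a desired-state model, and flag anything the model does not justify. The roles, users and entitlement names below are entirely invented; any real product would add workflows, certification campaigns and remediation on top of this basic comparison.

```python
# Minimal sketch of identity governance as a desired-state comparison.
# All role, user and entitlement names are hypothetical.
DESIRED = {  # role -> entitlements that role should grant
    "finance_analyst": {"erp:read", "expenses:approve"},
    "developer": {"repo:write", "ci:run"},
}

ACTUAL = {  # user -> (assigned role, entitlements actually held across all connected systems)
    "alice": ("finance_analyst", {"erp:read", "expenses:approve", "dropbox:share_external"}),
    "bob": ("developer", {"repo:write"}),
}

def access_review(desired: dict, actual: dict) -> dict:
    """Return, per user, any entitlements held that the desired-state model does not justify."""
    findings = {}
    for user, (role, held) in actual.items():
        excess = held - desired.get(role, set())
        if excess:
            findings[user] = excess
    return findings

print(access_review(DESIRED, ACTUAL))  # {'alice': {'dropbox:share_external'}}
```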
[easy-tweet tweet="IT managers need visibility into all the information about an identity" hashtags="tech, IT, cloud"] A comprehensive cloud-based identity governance must be able to connect to all enterprise systems, from the legacy applications that have been in use for years to the SaaS applications that are being adopted today. IT managers need visibility into all the information about an identity, across all the applications an enterprise uses, all the data they have, and across all users – no matter where they are located or what devices they may use. Cloud-based identity governance should also be able to govern everything. Organisations must have a grasp of who should have access, who does have access, and what users are doing with their access to all applications and data for all your users. This requires the ability to define the desired state and continually assess where access is not aligned with the model. Finally, IT managers must empower their users to work how they like to work, wherever they are and on whatever device they want to use. This enables organisations to increase collaboration both inside and outside of the network safely. By treating identity as a company-wide initiative, organisations can ensure visibility, control and governance to all data and applications. Only by taking an ‘identity-first’ governance approach can IT managers help their bodies become ‘cloud-first’ over time, maintaining a safe IT environment while keeping identity management at the foundation of it all. ### Connecting Devices - #CloudTalks with RealVNC's Agustin Almansi In this episode of #CloudTalks, Compare the Cloud speaks with Agustin Almansi from Real VNC about Connecting devices. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks VNC is used by millions of people worldwide to improve the operational efficiency of their business. We help organisations cut the cost and improve the quality of supporting remote computer systems. Our software reduces the expense and risk of field service activities, minimises downtime, and cuts the number of repeat calls. It enables our customers to comply with regulatory obligations and creates compelling new revenue opportunities. Whether you work in enterprise IT, consumer electronics, automotive, healthcare, telecommunications or manufacturing, VNC connects your business. Visit Exoscale to find out more: https://www.realvnc.com/ ### Snapchat "Spectacles" - Gadget of the Week How would you feel talking to someone while knowing they have the capability to record and share whatever you're doing because they are wearing Snapchat's first foray into the world of hardware? With a design that echoes nineties McDonald's happy meal toys the social company looks to add a sense of voyeur to their shared content that threatens to go full-1984.  [easy-tweet tweet="Snap looks to add voyeur to their shared content that threatens to go full-1984" hashtags="snapchat, tech, cloud"] Messaging app firm Snapchat has announced its first move into the gadget market with new wearable piece 'Spectacles' (sunglasses with a built in camera). The device that records 30 seconds of video at a time before syncing the data to the cloud will go on sale later this year priced at around £100. The risk lays with where this data will go as video moves to the cloud immediately so if the device is confiscated the content could already be safely stored and ready for sharing via the phone app. 
Many look to compare this threat to privacy with CCTV; however, as the name implies, CCTV content is closed-circuit, meaning it is not available to everyone in the way shared Snapchat content would be. The danger with Snapchat is a classic case of giving anyone the power to record in 'secret'. There is a light that turns on when video is being captured through the glasses, which at least allows everyone else to see when the camera is active and to know when to avoid being captured in a video they don't wish to be shared in.

Snapchat's 26-year-old creator Evan Spiegel explained in an interview his rationale for creating 'Spectacles'. “It was our first vacation, and we went to [Californian state park] Big Sur for a day or two. We were walking through the woods, stepping over logs, looking up at the beautiful trees. “And when I got the footage back and watched it, I could see my own memory, through my own eyes - it was unbelievable. “It’s one thing to see images of an experience you had, but it’s another thing to have an experience of the experience. It was the closest I’d ever come to feeling like I was there again.”

As part of this announcement Snapchat is rebranding itself as 'Snap, Inc.', a decision which underlines the company’s ambition to go beyond the ephemeral messaging app, a product which is highly popular with the younger demographic.

### Editing Software via the Cloud - #CloudTalks with SixSq's Marc-Elian Begin LIVE from Cloud Expo

In this episode of #CloudTalks, Compare the Cloud speaks with Marc-Elian Begin from SixSq about why data privacy is so important for their customers. This interview was filmed live at Cloud Expo Europe 2016. SixSq are software artisans, passionate about software development and technologies. They have learned their trade writing a lot of software and also enjoy reducing software, using refactoring, or contributing to others' good ideas, such that they do not have to write so much. Visit SixSq to find out more: http://sixsq.com/

### 4 Key Terms Used When Developing a Salesforce Test Data Migration Plan

Every once in a while, you have to change your business’ systems. Migrating from the old system to a newer one is always exciting because you will be moving to a better-performing system. However, this migration poses a great risk to the data that you have acquired over the years. Loss of data is a major cause for worry for many business owners. For this reason, you have to ensure that you put in place the best data migration strategy. Data migration is not as easy as plugging a flash drive or a memory card into your computer and transferring files to another one. It is a time-consuming process and one that will require serious planning. It is nonetheless one of the most beneficial tasks for all businesses: you will be upgrading to better data management practices and better-performing systems. In order to conduct the whole process, you will need the services of a data migration expert and your database administrator’s assistance. You also need to understand what is meant by the following terms.

Legacy data

Legacy data is simply the data that you want to move to another system.
It goes without saying that the source of this data is the legacy system. It includes the information that is currently recorded in your storage system. This could be anything from scanned images and paper documents to database records, text files, and spreadsheets. All these formats of data can be shifted to another system. Data-migrator This is a tool that is used to move the data from the legacy system to the new system. Flosum is one of the best data migration tools that you will find in the market today. You can access it at http://www.flosum.com/salesforce-data-migrator/ and it will help your data migration process by simplifying it. This migrator works with just about all methods that you might want to apply in the data migration process. Data migration This is the process of exporting the legacy data to the target system through the data migrator. It can be done manually or be automated. The specific method that you decide to use for the data migration process is totally dependent on the systems that you will be using as well as the nature and state of the data that will be migrated. Data cleansing Cleansing of data must be done before you begin the migration process. It is all about the preparation of the legacy data for the migration to the new system. This is done because there is a disparity between the architecture of the legacy system and the target system. Often, the legacy data will not meet the criteria that are set by the target system. Data cleansing, therefore, manipulates the legacy data so that it will meet the requirements of the new system. [easy-tweet tweet="Cleansing of data must be done before you begin the migration process" hashtags="bigdata, tech, cloud"] The bottom line here is that if you understand the basics of data migration, you will have a really easy time finishing the whole job. These above-mentioned terms are among the foundational features of every data migration project. A good comprehension of them will help a lot as you plan the data migration project. ### IPExpo and platform convergence This October we have the pleasure of co-hosting the IPExpo live streaming channel. A lot of people ask us what our thoughts around the cloud and other various IT disciplines are today. 2016 has been an exciting year for the cloud, as we see more mainstream adoption and the proliferation of hybrid devices which act as ramps to cloud on-boarding, I believe that the second phase of cloud computing has begun. As with any previous technological disruption, progress has always been measured by Moore's law and density of devices. Miniaturisation occurs in every industry and as devices get smaller and more compact so does the energy requirements and colocation housing needed to operate them. [easy-tweet tweet="Miniaturisation occurs and as devices get smaller so does the energy requirements" hashtags="tech, cloud"] In our opinion, The Cloud has always just been a platform for bigger things, a stage constructed with strong foundations that expand on demand to allow the show on top to truly blossom for its audience. As we enter areas such as "cognitive" led by IBM Watson and Google Deepmind and Machine learning industrialised and offered on demand with AWS, we are now just seeing the beginning of what is possible. Big Data is now entering Big Data 2.0, analytics are going beyond neural networks to become self-learning, our children are coding in school, and our data centres are adapting by offering connectivity to cloud providers. 
Business models based on traditional hardware sales are moving beyond transactional arrangements with margin resell, through to services delivered according to business need and project goals. Today's IT buyer is armed with information on a scale unimaginable just a few years ago and has usually already investigated possible solutions. Unfortunately, these decisions are often made without meeting competing companies, based on carefully crafted PR propaganda woven through the tapestry of search engines and buyer persona tools. In this environment, we believe it is vital that potential purchasers of solutions get a chance to meet face-to-face with potential suppliers, allowing for interactions outside of digital channels.

When we look at the different streams of the show, we view them all as segments of one large discipline. Allow us to elaborate on each stream below:

- Cloud – the foundation of every scale-out service or application.
- Cyber Security – the protection of the ‘evolving, fluid’ boundary that cloud has become.
- Networks and Infrastructure – without network access there is no cloud; this is the most overlooked area of cloud implementations.
- Data Analytics – without analytics, data is just binary 1s and 0s; analytics gives us the power to view it without confusion.
- DevOps – cloud, applications and data all fall under DevOps, a core area of structure and process that brings agility to cloud operations and development, allowing for agile iterations and fast execution of services.
- Open Source – Open Source is everywhere, on every device from the home to the internet. Without Open Source many of today's applications, from Big Data through to Cloud, would not be with us.

We hope that this blog has helped and perhaps even crystallised some thought processes or assisted in framing what you are going to do at IPExpo. We look forward to welcoming you onto our stand.

### Cloud Disruption - #CloudTalks with Cassini Reviews' Paul Bevan LIVE from Cloud Expo

In this episode of #CloudTalks, Compare the Cloud speaks with Paul Bevan from Cassini Reviews about cloud disruption. This interview was filmed live at Cloud Expo Europe 2016. Visit Cassini Reviews to find out more: http://www.cassinireviews.com/

### Exponential-e aims to become cloud provider of choice to the public sector

British cloud and network provider Exponential-e has been accepted on to the G-Cloud 8 framework, reinforcing its commitment to powering cloud initiatives that support the public sector in delivering its vision for a digital Britain. Using its wholly-owned 100 GigE carrier-class network, Exponential-e has been approved to deliver Infrastructure-as-a-Service (IaaS), Platform-as-a-Service (PaaS), Software-as-a-Service and Specialist Cloud Services (SCS) solutions through the latest iteration of the G-Cloud portal. “The cloud plays an integral role in enabling the public sector to achieve its vision for becoming digital by default. Yet despite total spending on the G-Cloud reaching £1.2 billion since its launch in 2012, less than half of that figure has been invested in cloud computing services,” explained David Lozdan, head of public sector at Exponential-e.
“We want to support the public sector in transitioning towards modern, cost-effective cloud architectures that provide the backbone for the digital transformation of public services.” The G-Cloud 8 framework can be used by organisations across the UK public sector, including central government, local government, health, education, devolved administrations, emergency services, defence and not-for-profit organisations. Using its highly reliable and resilient network, Exponential-e will provide private access to scalable cloud services that sit inside the corporate firewall and operate as an extension of an organisation’s WAN. Through this approach, public sector bodies benefit from a flexible consumption model whilst mitigating the risks associated with cloud adoption. “With public spending experiencing cuts of 8.3% since 2010, the G-Cloud framework sets out to deliver better public services for less cost and to implement technologies that increase public sector productivity,” added Lozdan. “We want to ensure that government departments develop an effective approach to the cloud, which allows them to benefit from shorter procurement cycles and quicker deployment to achieve efficiency savings and support the rapid deployment of new online services.”

### Cloud Computing: A Revolutionary Concept or Overhyped Trend?

It is now commonplace to say that cloud computing has changed the way we move data along digital avenues. The technology has been hailed as the champion of the information age, and indeed, it is hard to find a modern company which does not store and access data in the cloud. Still, if we take a step back and look at the evolution of cloud computing, do we really see a concept that came out of the blue and shifted the high-tech landscape of today? Is this a digital revolution, or another trend that has been blown out of proportion by the buzz-hungry online community?

Transcending space and time

People are no longer tied to their computers, enjoying the benefits of high-speed internet on the go. With data storage available from virtually everywhere, USB sticks and DVDs have become nearly obsolete. Now we can utilize amazing solutions such as Dropbox or Evernote, and keep all our data in one place. We do not have to upload our blog posts to an FTP server, for WordPress makes this process far less tedious. However, if you have been wondering how on earth people ever worked without this centralized architecture, you have been nurturing a misconception. Namely, cloud computing is neither revolutionary nor new. Although the term is relatively novel, the concept itself has been used to denote client-server architecture for 15 years and more. In those “olden days” there were mainframe computers that were connected to the computers of clients. Later, we witnessed the invention of laptops and the proliferation of VPN networks. There are also many other inventions that have contributed to the surge in online tech: new top-level domains like .me, the rise of mobile platforms, the boom in the app market, and so on.

The old and the new

Since then, the main framework has remained pretty much the same, with innovations encompassing makeovers, internet speed boosts, and renaming.
That does not seem to give enough reasons for announcing the game-changing technology, but the question is where does the hype come from then?  Well, it is safe to assume that the ballyhoo of this phenomenon is rooted in the widespread use, as well as its evolving applications. Moreover, cloud computing has become the darling of the users because it is easy-to-use, accessible, and secure. Uploading and syncing data runs like clockwork, the features are automated, and the design polished. There is no need to engage in the hassle such as manual synching or to use programs like desktop reader RSS applications. But, like it or not, under the surface, it is the same. In a sense, we have reinvented the wheel rather than a concept that is changing the face of the world: The cloud computing has been around for quite some time, albeit not in such a refined, sophisticated state. This is not to say that cloud computing is redundant, or anything like that. It is right at the center of rapid businesses and tech developments, accelerating our advance into the world of tomorrow. Many applications deserve the cheering and applause, as they make our life much easier. The problem is that overwhelming buzz clouds the rational debate and obfuscates some problems like numerous security threats. We need to see cloud technology for what it is, with all the advantages and flaws included in the big picture. Only then can we make informed decisions and gain an edge in the competitive business arena. From the ground up Thanks to the cloud computing, we do not have to work with one physical machine all the time, and are able to reap the benefits of the seamless integration and syncing. But, does cloud computing deserves the praise it gets? Well, I would say yes and no. The ability to work from wherever you want should definitely be celebrated and cherished. Yet, the groundwork has been laid a while back, the same as load-bearing walls and the roof.  Cloud computing, as we know it, has given the building’s facade a much-needed facelift, beautified its architectural features, and made them fit for the 21st century. ### Dell EMC Expands Broad Microsoft Support, Delivering New Innovations across Cloud and Converged Infrastructure Dell EMC today announced broadened support of Microsoft environments with new Microsoft Azure Services, Validated Systems for Microsoft Exchange and SQL, as well as customer momentum for the Dell EMC Hybrid Cloud System for Microsoft. The new services and validated systems can enable customers to increase business agility, improve efficiencies and lower costs with highly customizable hybrid cloud and infrastructure solutions. “For more than 30 years, Dell EMC and Microsoft have focused on delivering best-in-class, innovative solutions that span the entire Microsoft product portfolio to organizations all over the world,” said Jim Ganthier, senior vice president, Validated Solutions Organization, Dell EMC Converged Platforms Solution Division. “Together, we are working to both simplify and accelerate customers’ journey to the cloud, helping them build individualized cloud solutions that are future-proofed for business and IT transformation.” Through this alliance, Dell EMC today announced availability of a new suite of Azure Cloud Services, which enables customers to easily adopt hybrid cloud services, increasing efficiencies and extending the capabilities of their IT infrastructure. These cloud services are hosted, managed and delivered from Azure. 
Backed by Dell EMC enterprise-grade service level agreements, 24/7 tech support and service health monitoring, customers pay only for the services consumed and without long-term contracts. The new services include: Azure Backup– a flexible, secure, scalable backup solution in Microsoft’s cloud management portfolio. With no capital investment and consumption-based pricing, this service delivers significant cost savings while protecting data running in virtual machines or on physical servers, as well as in Azure. Azure Dev/Test Labs– a service that allows developers and testers to create and provision Windows and Linux environments quickly using reusable templates and pre-provisioned environments in Azure. As a result, customers can minimize time and waste, control costs and scale-up load testing efficiently. Azure Business Continuity Solution– utilizes multiple Azure services delivered with Dell EMC consulting, provisioning, break/fix support, and single-source pay-as-you-go billing. This solution brings enterprise-class business continuity to small businesses by making it accessible and radically simpler without having to design, build, or maintain it on their own. DELL EMC HYBRID CLOUD SYSTEM FOR MICROSOFT GAINS MOMENTUM, HELP BUSINESSES TO SIMPLIFY CLOUD DELIVERY, MANAGEMENT AND SCALE Since introduced in October 2015, customers from financial services to technology providers have selected the Dell EMC Hybrid Cloud System (DHCS) for Microsoft to power their hybrid cloud environments. These customers are benefiting from the industry’s first integrated, hybrid cloud solution that offers simplified, automated deployment and maintenance capabilities and unparalleled hybrid cloud governance, control, and policy-based automation for Azure with an optional, unique payment solution to reduce investment risk: Glarner Kantonalbank, an innovative digital Swiss bank, wanted to upgrade its already successful hybrid cloud. Glarner Kantonalbank chose the DHCS, installed with the help of Dell EMC Deployment Services, in the first rollout of its kind in Europe. As a result, it can expand and become a full service provider of digital services for other banks. CGI Group Inc., the fifth largest independent information technology and business process services firm in the world, offers private cloud deployments with a customized approach to meet client requirements. With DHCS, the company significantly improves time-to-value for clients by simplifying deployments, reducing risks, and cutting deployment time by up to 50 percent. PeeringOne, a Malaysian-based cloud provider, needed to meet growing customer demand for cloud services and guarantee reliability and performance. The modular scale-out architecture of the DHCS enabled PeeringOne to minimize capital expenditure and leverage cost savings to ensure compelling customer price points. Pulsant, a UK-based cloud hosting company, increasingly required a hybrid cloud based on Microsoft technology. After running a successful proof of concept, the company deployed two DHCS appliances and expects its new hybrid cloud offerings to drive up to $2 million (US) in new business. NEW VALIDATED SYSTEMS DRIVE BUSINESS INTELLIGENCE AND COLLABORATION EFFORTS Dell EMC’s converged infrastructure leadership spans tested and validated reference architectures, solution bundles, validated systems and turnkey, fully engineered systems for a variety of use cases. 
The Dell EMC converged infrastructure portfolio covers the “build” to “buy” continuum, so customers can easily and effectively adopt key technologies, such as flash, software-defined storage and scale-out architectures, in industry-leading offerings purpose-built for wherever they are in their digital transformation journey. In support of its service-defined infrastructure approach, Dell EMC today unveiled two additions to its Validated Systems portfolio:

The Dell EMC Validated System for Microsoft SQL is designed for superior performance, significant cost savings and scalability for future needs. It allows customers to process in-memory online transaction processing (OLTP) workloads up to 30 times faster with Microsoft SQL Server 2016, as well as consolidate sprawling legacy databases with the latest Dell EMC PowerEdge R830 servers for a modern data warehousing solution with exceptional performance and total cost of ownership.

The Dell EMC Validated System for Microsoft Exchange is a pre-architected and validated datacenter solution for email workloads. The highly scalable system delivers faster time to value by shortening the design, delivery and configuration time for Exchange Server 2016. Designed for exceptional scalability, this converged system lowers cost of ownership through resource consolidation, optimized storage design and efficient system management, built on the Dell EMC PowerEdge R730xd server.

These new Dell EMC Validated Systems can be configured, quoted and ordered in minutes, while lifecycle management tools allow customers to easily deploy, scale and update systems.

### HPE to Deliver an On-premises, Integrated Microsoft Azure Hybrid Cloud Solution

Today, Hewlett Packard Enterprise (NYSE: HPE) unveiled details of its upcoming HPE | Microsoft Azure Stack solution, an integrated system that will enable organizations to increase agility and flexibility by delivering Azure-compatible infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) services from their on-premises data center. HPE also introduced new HPE Consulting for Azure Hybrid Cloud services to help enterprises and service providers successfully implement Azure Stack, as well as enhancements to HPE Flexible Capacity and support for Azure.

Organizations are turning to hybrid cloud strategies that include both private and public clouds to reduce costs and increase flexibility while retaining the control some businesses require. Using Azure Stack powered by HPE integrated systems, enterprises and service providers will be able to realize the speed and simplicity of the public cloud, combined with the cost-effectiveness and security of an on-premises private cloud. Co-engineered by HPE and Microsoft, the HPE | Azure Stack solution will provide enterprises with an Azure hybrid cloud platform in their own datacenter that is compatible with Azure public cloud services.

“Many enterprises would like to host Azure within their data centers for performance, security or compliance reasons and many service providers would like to offer Azure-compatible services for data sovereignty or other targeted services,” said Paul Miller, vice president of marketing, Converged Data Center Infrastructure, at Hewlett Packard Enterprise.
“HPE is teaming with Microsoft to deliver the right mix of on- and off-premises Azure services with built-in security, operations management, pay-as-you-go pricing and the expertise that provides unmatched flexibility and cost savings for enterprise workloads.” The HPE | Azure Stack solution will be built on the world’s best-selling server, the HPE ProLiant DL 380, as a complete compute, storage, networking and software system. Using a consistent, uniform portal, IT operators and developers can quickly access both Azure public and private cloud services to rapidly provision cloud resources, build applications, and easily move them between Azure public cloud and Azure Stack running on-premises. New enhancements to HPE Flexible Capacity expand the ability to include Azure services as part of the pay-as-you-go solution for hybrid IT. In addition, HPE plans to support the HPE | Azure Stack solution within Flexible Capacity for a pay-as-you-go ownership model and unified billing of public and private clouds. “Microsoft and HPE share a view that hybrid cloud capabilities and consistency across public and private cloud environments helps customers accelerate their cloud strategy,” said Mike Schutz, general manager, product marketing, Microsoft Corp. “The HPE solution for Azure Stack builds off the relationship we announced last year, tightly integrating the hardware, software, support and services to help customers benefit from the pace of cloud innovation while reducing deployment complexity and risk.” Lower operating cost and complexity The HPE | Azure Stack solution will be able to deployed together with HPE Operations Bridge software to offer an analytics-driven, automated operations management solution for managing Azure private cloud, Azure public cloud and traditional IT, dramatically reducing operating cost and complexity with features such as causal and predictive analytics, automated remediation and integration with Microsoft System Center. Additionally, HPE hybrid cloud solutions provide multi-cloud support, allowing enterprises and service providers to manage cloud environments that include Azure, Amazon Web Services, VMware and OpenStack®. Deliver secure hybrid cloud services Security is one of the top concerns of customers adopting cloud technology. When deployed with HPE SecureData, the HPE | Azure Stack solution will protect cloud services in multiple ways, securing sensitive data traveling into and throughout the Azure public cloud and Azure private cloud and providing unified security information and event monitoring when deployed with HPE Security ArcSight. Additionally, since the HPE | Azure Stack solution will exist in an organization’s own datacenter, it conforms to all of its security and compliance requirements. User security controls, Azure private cloud and traditional IT are provided through a unified security and event monitoring. New expert technical consulting and support services for Azure hybrid clouds To help customers prepare for a successful deployment, HPE Consulting for Azure Hybrid Cloud provides new advisory and implementation consulting services for Azure hybrid cloud, including security, workload migration, identity, networking and backup and recovery. In addition, global, enterprise-class support for Azure public cloud and hybrid cloud services expand organizations’ IT capacity to innovate by out-tasking monitoring and routine management responsibilities with HPE Datacenter Care-Operational Support Services. 
Availability

The HPE | Microsoft Azure Stack solution is targeted for availability in mid-2017. HPE Consulting for Azure Hybrid Cloud services are available now to help organizations plan their Azure Stack deployments.

### Records Management Solution Introduces Industry-First Custom Security Controls

Alfresco Software, a leading provider of enterprise content management (ECM) and business process management (BPM) software, is giving government and enterprise organizations unparalleled ability to customize access to content by creating their own security access system. The new Alfresco Records Management (RM) 2.5 – the only open-source DoD 5015.02-certified solution – allows content administrators, records managers or security personnel to secure content by easily assigning highly granular security classifications and marks. Alfresco will be demonstrating the new Records Management 2.5 at this week’s ARMA LIVE! International Conference & Expo in Booth #625 in San Antonio, Texas.

Analyst firms such as Forrester confirm that records are increasingly generated digitally, and that public sector RM decision-makers are challenged nearly as much as private sector RM professionals by the volume of unmanaged digital documents that sit outside of records and information management programs.[1] This is being exacerbated by the introduction of new compliance requirements such as the Managing Government Records Directive (M-12-18) for the US Government, which has resulted in the creation of the Capstone Approach by NARA. Mature records and information management programs are now looking more strategically at data and documents across their entire life cycles. Developed originally to meet the security requirements of the U.S. Navy, other government agencies worldwide and commercial enterprises, Alfresco Records Management 2.5 offers a simple, user-centric interface that encourages faster, appropriate content classification.

“Increasing regulatory compliance obligations and modern ways of collaborating require a new way of managing the complete lifecycle of information. Managing sensitive data is a key part of this, but users struggle to deal with the ever-greater complexity of the classification process,” said John Iball, Alfresco’s Senior Product Manager. “If the classification process is too difficult or time consuming, users will often over-classify files, preventing some people who need to use the files from accessing them. Alfresco Records Management 2.5 allows for more simplified and granular control of content.”

Securing the Information

Many content management solutions deploy Access Control Lists (ACLs) – lists of users and groups able to access particular files – in order to manage access control. Alfresco also provides this ACL option, but recognized the need for a more manageable and scalable way of controlling the management of information and ensuring privacy, with the following updates:

Built-in Security Classification allows authorized users to define the security clearance level needed to access a file, document or record. The first level of the Security Controls provides the ability to apply a classification level (Top Secret, Secret, Confidential, Unclassified or other custom levels) to content. Users are assigned a clearance level that allows them to access content – not just records – based on that clearance level.

Custom Security Marks support a range of security scenarios and are applied to files to restrict access to certain content to the appropriate users or groups of users.
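The access rule implied by classification levels and marks can be sketched in a few lines. This is not Alfresco’s actual API, and it shows only one possible configuration (the user must hold every mark applied to the content, as well as sufficient clearance); the level ordering and mark names are illustrative only.

```python
# Sketch of an access check combining a security classification with security marks.
# Not Alfresco's API; levels, marks and the "must hold all marks" rule are assumptions.
LEVELS = ["Unclassified", "Confidential", "Secret", "Top Secret"]  # ascending sensitivity

def can_access(user_clearance: str, user_marks: set, content_level: str, content_marks: set) -> bool:
    """Allow access only with clearance at or above the content's level and every mark it carries."""
    cleared = LEVELS.index(user_clearance) >= LEVELS.index(content_level)
    return cleared and content_marks.issubset(user_marks)

# A 'Project X' record classified Secret: only suitably cleared users holding the mark get in.
print(can_access("Top Secret", {"Project X"}, "Secret", {"Project X"}))    # True
print(can_access("Top Secret", set(), "Secret", {"Project X"}))            # False - missing the mark
print(can_access("Confidential", {"Project X"}, "Secret", {"Project X"}))  # False - clearance too low
```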
Alfresco Records Management 2.5 is also the only system of its kind to enable a combination of security marks. The security marks configuration includes options for selecting that the user meets all security marks applied to the content; one or more marks applied to the content; or the same or greater clearance than that of the content. In contrast, other records management systems require users to possess all of the content marks or the same or greater clearance than that of the content. For example, a particular project, say “Project X” may have restricted user access. By applying a “Project X” mark to content, only users possessing the “Project X” mark can access the content. Alternatively, creating a group of marks allows access to users with one or more of the marks applied to the content. For example, if access needs to be restricted to particular nationalities, the content can include a list of permitted nationality marks, e.g. US, UK and CAN. Users of any of those nationalities can then access the content. Similarly, in the commercial world it allows easy access control based on roles in the organization. For example, by marking files from an “Executive” mark group with CIO, CFO, CIO marks, only users in these positions can access the files. By combining the different types of marks, complete and highly granular access control is provided. Built on Open Source and Open Standards for Global Flexibility Unlike legacy, proprietary systems that were not designed to manage today’s explosion of digital content, Alfresco’s open platform offers the most flexible way to configure and deploy content, business process and records management. Support for open standards enables Alfresco Records Management 2.5 to be easily integrated into existing IT environments and to adapt as new business needs and technologies emerge. Supports an Extensive List of World Wide and Regulatory Standards In addition to meeting U.S. Navy requirements, Alfresco Records Management 2.5 has been validated for use by other government agencies, including JITC as well as HMGCC in the UK to ensure wide application to public sector organizations. It also meets regulatory compliance standards such as UK government's National Archives Electronic Records Management Systems certification, 2002 requirements, Australian Victorian Electronic Records Strategy, VERS (specs 1-5), ISO 15489 and MoReq2010. ### Enterprise Augmented Reality App Revenues to Approach $6bn By 2021 A new Juniper Research study has found that consumer app revenues from AR will remain below that of the enterprise sector through to 2021, despite the high visibility of games such as Pokémon GO. Juniper forecasts that revenues from enterprise AR apps will reach $5.7 billion by 2021, rising tenfold from an estimated $515 million in 2016. The research found that enterprise interest in AR technologies has continued to grow, fuelled by improvements in the field of vision and latency on HMDs (Head Mounted Displays). Future releases of these units, such as Microsoft HoloLens, are set to drive AR content revenues in the enterprise sector as businesses opt for HMDs over smartphones and tablets. HMDs to Dominate Future Growth According to the new study, Augmented Reality: Developer & Vendor Strategies 2016-2021, the bespoke nature of content in the enterprise AR sector, such as DAQRI’s partnerships with Hyperloop and Siemens, will lead to higher app prices. 
The combination of higher content pricing and hardware costs will initially hamper adoption, and only the earliest tech adopters in the enterprise space will implement the technology over the next two years. The report finds that enterprise AR revenues are set to remain on smartphone and tablet devices for the foreseeable future. However, enterprise focus from manufacturers such as Microsoft and Vuzix will see the largest revenue opportunity for content developers on HMDs from 2020. Consumer AR’s Biggest Obstacle Whilst the ubiquity of the smartphone provides an immediate audience and distribution channel, the report advises developers to focus on continued innovation in order to keep users engaged. Research author Sam Barker added: “The nostalgia of Pokémon has worn off for most users and there are reports of a drop in app usage. This means that the majority of consumer applications will only have limited revenue opportunity and are likely to have a short shelf-life akin to the current wider app ecosystem”. Additionally, the high cost of HMDs and consumer hesitancy around public usage will deter consumers from purchasing AR glasses in the short term. ### What Difference Does Big Data Make When It Comes to UX Design? When it comes to digital presence, is there anything bigger right now than user experience? Generally, the answer is a firm ‘no’, because user experience has emerged as the sum of the characteristics that give a website a dynamic web presence. User experience is now the key factor in delivering business value, and digital presence carries the value of a successful brand in several facets. The big question is whether the enormous amount of digital data generated every moment can actually determine the success of our user experience. It is worth noting that Big Data is currently gaining enormous popularity as a means of validating and scaling user experience. The following discussion looks at the significance Big Data carries for UX design. Enormous Scope of Big Data The extensive digital data generated across platforms, devices and interfaces all constitutes Big Data. It is characterised by five vital attributes: variety, volume, velocity, value and veracity. This emerging field of data analysis takes in the exceptional growth of digital data in both volume and variety. Data is increasingly becoming a precious source of business insights that traditional analytics often cannot provide. Big Data offers huge scope for rigorous business analytics because a wide variety of structured and unstructured data sets from different sources can now be analysed to yield deeper, more results-oriented insights. Its value in offering business insights is why it is widely considered a key technology for business decision-making. New Liberator in Big Data for UX Professionals One of the notable characteristics of Big Data is that, as in several other fields of business, it has arrived as a liberator for UX professionals too. 
Big Data analytics has played a vital role in delivering essential insights from operational business data, and business decision-makers across the world now readily acknowledge that role. But what about using this new wave of digital data analytics to make the digital user experience better? This is the newer, and arguably exceptional, role that has emerged as a liberating mechanism for developers and UX designers. The large volumes of data derived from different sources can tell us much more, with greater depth and promise, about customers and their digital interactions. Factors that matter across many users and interactions can now be conveniently measured at scale, and the effectiveness of the UX confirmed accordingly. Relevance of Data Is Essential Volumes of digital data may contain insights, but not all data is pertinent and not all data is insight. It is the process of analytics and insight extraction that gives data relevance. The face value of raw data is limited unless it is put under the scanner of analytics to extract insights relevant to your business or to a specific use. Contextual, goal-driven data, gathered with your business processes, business outputs and customer interactions in mind, gives you the opportunity to know your customers better. A user buying a product or taking up a service, for instance, brings a buying habit and a set of preferences, along with the time and location of their engagement with the business. Integrating these signals together provides a picture of the customer and the business transaction. While all the data is there, bridging the gap between its sheer volume and the pertinent insights it contains is a tough job. Big Data Helps Visualise UX Design Data visualisation is where Big Data and UX meet head-on to deliver an intuitive, creative way of presenting information. Data visualisation refers to the visual presentation of data in graphical and pictorial formats. Mobile users are unwilling to read long-form content and favour data that is instantly viewable; given the small screens of mobile devices and their on-the-go nature, users are reluctant to read dense text attentively. This is why presenting complex data in a visually appealing way is preferable in mobile apps and websites. User experience design helps distil data into engaging images or graphics, and can even add interactive features to them. It is a convenient way to refine data into useful graphics that can be taken in at a glance; by translating complex information into a graphical display, the value hidden in the data is exposed and delivered meaningfully to the user. Scaling UX with Big Data The arrival of Big Data has changed the whole picture. Operational business data offers instant insights into customer interactions, while large data sets from different sources describe customers and user behaviour in greater depth. With these new, larger data assets available, decision-making can take every relevant factor into account. 
Customer motives, emotional responses and expectations all now feed into the measurement and validation of user experience. The objective of this multi-sourced analytical measurement is to establish the effectiveness of the UX, giving your company a profound understanding of customer interactions, the gaps that matter and the engagement that needs attention. Scaling the UX against specific business insights helps identify the areas that need change and positive focus in order to deliver business value. Conclusion For organisations, the quality of the user experience is vital to gaining a competitive lead, and this is often assured by observing user interactions. Big Data analytics not only offers data-driven insights about customers but also helps us validate and measure the UX to a great extent. ### Networked cameras within transportation; the difficulty in moving from analogue to digital solutions One of the largest considerations within the UK rail industry is efficiency – every decision made must be based on a clear ROI with actionable results. When we consider the shift from analogue surveillance to networked solutions, many operators still judge this ‘new’ technology on the basis of its closed-circuit predecessor – discounting the numerous intelligence benefits that are now on offer. The average age of rolling stock in the UK is now at a 14-year high of 20.2 years, with these assets built shortly before the invention of the first network security camera. It stands to reason that when safety procedures and policies were created, operators simply did not foresee cameras becoming such an essential tool. According to a recent report into the state of public transport, 30.4 percent of organisations revealed other priorities were a barrier to upgrading their surveillance network. 20.3 percent cited difficulty in securing funding, and 17.4 percent noted no clear business case, as they had seen no significant ROI. Notably, of those highlighting ROI, approximately three-quarters still use analogue systems. There is a clear education gap within transport when it comes to understanding the value of network video. The majority of carriages operate either by uploading security footage once they reach a depot, or by staff manually downloading the footage once every few days. Some operators only ever download the footage if an incident has been reported, meaning numerous issues often go unnoticed. This is why footage took so long to come to light in the recent ‘Traingate’ incident involving Jeremy Corbyn and Virgin. The technology currently exists, however, to take the hard work out of surveillance solutions. Education: a significant barrier to further adoption Instead of an incident being reported and an employee trawling through hours of footage, newer cameras have analytic functions that come into play. This means that instead of just operating at a surveillance level, these cameras can trigger alerts for any important event, such as tools left on the tracks, bridge strikes, graffiti or drunken behaviour. The issue, however, comes from educating the rail industry on the progression from these analogue solutions to networked systems, which can save time and money while increasing the overall safety and security of those travelling or working within a network. 
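As a purely illustrative sketch of the analytics-driven alerting described above (the event types, confidence scores and notification hook are invented for the example and do not come from any camera vendor’s API), events from networked cameras might be filtered and routed to operations staff like this:

```python
# Hypothetical sketch of routing analytics events from networked cameras.
# Event types, the confidence threshold and notify() are illustrative assumptions.

from dataclasses import dataclass
from datetime import datetime

ALERTABLE_EVENTS = {"object_on_track", "bridge_strike", "graffiti", "disorderly_behaviour"}

@dataclass
class CameraEvent:
    camera_id: str
    event_type: str
    confidence: float          # analytics confidence score, 0.0-1.0
    timestamp: datetime

def notify(message: str) -> None:
    """Stand-in for a real alerting channel (control-room dashboard, SMS, etc.)."""
    print(f"ALERT: {message}")

def handle_event(event: CameraEvent, min_confidence: float = 0.8) -> None:
    """Raise an alert only for events worth a human's attention."""
    if event.event_type in ALERTABLE_EVENTS and event.confidence >= min_confidence:
        notify(f"{event.event_type} detected by {event.camera_id} "
               f"at {event.timestamp:%H:%M:%S} (confidence {event.confidence:.0%})")

# Example: an object detected on the track triggers an alert; routine motion does not.
handle_event(CameraEvent("carriage-42-front", "object_on_track", 0.93, datetime.now()))
handle_event(CameraEvent("carriage-42-rear", "motion", 0.99, datetime.now()))
```

The point of the sketch is the shift it represents: instead of staff trawling through downloaded footage after the fact, only high-confidence, operationally relevant events reach a person, and they do so in real time.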
[easy-tweet tweet="Working above a surveillance level, these cameras trigger alerts for important events" hashtags="security, cloud, tech"] One of the largest issues with implementing these new systems, as well as fully realising the benefits that are on offer, is one of infrastructure, with existing communications networks in the UK presenting a significant obstacle. We are some way behind some of our continental neighbours with regards to WiFi technology, broadband, 4G and other networks. With the number of signal dead zones in the UK, we must increase the connectivity across the UK’s entire network to make use of a fully realised cloud solution within rail, allowing train operators to react to real-time alerts and prevent any issues that may arise. Increasing capabilities through technological innovation There is a significant disconnect between the technology on offer, its availability and the benefits it can provide. Rail operators are gradually shifting towards IP systems, although these solutions are broadly a work in progress for a large number of train operators still using analogue or hybrid systems. Education is still a strong aspect in the uptake of networked security cameras. As an industry, we must do more to fully inform the rail industry of the new solutions available and but also the cross-functional benefits they provide - ranging from security to predictive maintenance and analytics. [easy-tweet tweet="Education is still a strong aspect in the uptake of networked security cameras." hashtags="cloud, tech, security"] To fully realise the future of networked cameras within rail, we must first focus on improving the connectivity of the UK as a whole. Once this has been achieved, there is an opportunity to standardise systems and ensure transport hubs across the UK are benefitting from the latest networked technology. With a commitment from security and transportation industry professionals alike, there is a real opportunity to shift away from legacy systems, supporting a reduction in crime, an increase in operational efficiency and a better ROI. ### New NetApp Software and Flash Systems Simplify Data Management, Boost Performance in the Hybrid Cloud “Hospitals produce a lot of data, from radiology images to patient records and email. We currently generate up to 6 petabytes of data that grows by more than 50% year-on-year,” said Reinoud Reynders, IT manager, Infrastructure and Operations at The University Hospitals in Leuven, Belgium. “NetApp ONTAP software with All Flash FAS systems gives us a flexible environment with the performance and capacity we need to manage our data at a low total cost of ownership. For us, that is very important.” NetApp® (NASDAQ: NTAP) today announced new NetApp® ONTAP® software, flash systems and expanded public cloud support that provide the modern foundation to help customers maximise the value of their data in the hybrid cloud. The new offerings are key to helping customers transform their IT. Successful IT transformations integrate data siloes, automate processes to speed results and remove barriers to scale that limit growth. Today’s announcement includes: An update to NetApp ONTAP 9 software, making the company’s rich data services offering even better with built-in encryption for improved data security, support for massively scalable high-performance NAS containers and simplified provisioning for enterprise applications. 
Six new flash-optimised storage arrays, including the industry’s fastest unified all-flash systems[1] and highly scalable hybrid flash arrays, with simpler setup and serviceability. Expanded choice with ONTAP Cloud now supporting the Microsoft Azure public cloud, complementing existing support for the AWS public cloud. In addition, NetApp has taken the lead in optimising the data centre for flash with its support for industry-leading connectivity options. “IDC believes that the future of IT is hybrid cloud, and we see enterprise storage companies like NetApp making strategic shifts to focus on flash storage, virtualisation, high-performance hybrid cloud, and other cloud-related technologies that enable the portability of data and applications,” said IDC Storage Research Director Eric Burgener. “Vendors like NetApp are paving the way for customers who are rapidly transitioning to flash-optimised, software-defined storage architectures with rich and proven data services that monetise big-data analytics by delivering meaningful insight for making critical, fact-based decisions for their businesses.” New features in ONTAP 9 include:    NetApp ONTAP FlexGroup is a massively scalable, high performance NAS container ideal for the latest generation of applications in the EDA, high-tech, oil and gas, and media and entertainment industries. FlexGroup allows customers to scale a single container up to 20PB and 400 billion files. NetApp Volume Encryption offers granular, volume-level, software-based, encryption for data on any type of drives across AFF, FAS, or ONTAP Select systems without requiring special additional self-encrypting disks. ONTAP Cloud support for Microsoft Azure ONTAP Cloud is software-defined storage running in Azure that enables customers to use advanced ONTAP data services to efficiently share, move, protect and manage their cloud data. Customers can now gain greater infrastructure flexibility by integrating Azure cloud services into their enterprise data fabric. ONTAP Select now supports all-flash commodity servers. NetApp also refreshed its industry-leading portfolio of ONTAP-powered all-flash and hybrid arrays, to provide customers with unrivalled scale, speed and data services as they transform their IT. New features and capabilities introduced today include: The AFF A700 all-flash system, designed for high performance and configuration flexibility; and the AFF A300 all-flash system, optimised for midrange all-flash configurations. These new systems offer up to twice the performance of prior NetApp systems at half the latency. The systems are easy to set up in under 10 minutes, have greater serviceability and provide expanded support for multi-stream-write (MSW) and 15TB SSDs. New hybrid flash systems including the FAS8200 for enterprise workloads and the FAS2650 and FAS2620, optimised for small enterprises and medium-size businesses. The new FAS systems offer dramatically increased speed and responsiveness with up to 200% higher performance than that of earlier NetApp systems and integrated NVMe NetApp Flash Cache™ intelligent caching. The new FAS9000 for business critical workloads gives customers the ability to easily scale up to 14PB in a system and scale out to 172PB in a cluster. This capability provides customers with the right size solution for their needs today and growth options for the future.        
The industry’s first support for leading connectivity with 32Gb Fibre Channel and 40Gb Ethernet, in addition to the 12Gb SAS-3 storage connection, optimising the infrastructure for flash. Intelligent modular design for AFF A700 and FAS9000 to enhance reliability, availability and serviceability, and simplify future upgrades. “Simplification is a key enabler for today’s digital transformation,” said Joel Reich, executive vice president, Products and Operations at NetApp. “NetApp’s new ONTAP software and flash systems give customers a way to bridge existing and emerging IT architectures as they build and evolve their hybrid cloud.” ### Cloud-based disaster recovery continues to advance in the face of growing cyber attack threat Against a high-profile, high-stakes backdrop of a global rise in cyber attacks and data losses, efficient disaster recovery has become a top priority for businesses all over the world.
- Active, speedy disaster recovery and the ability to future-proof data storage have never been more important
- Virtualisation through the cloud allows the entire server to be copied or backed up to an offsite data centre
- Hybrid cloud models hold tremendous potential for what can be achieved in disaster recovery
Understandably so; agility in data recovery can entirely dictate the immediate and future fortunes of your business, while the ability to future-proof data storage has never been more important. First introduced in embryonic form in the 1970s, disaster recovery is now most cost-effectively realised in the cloud, delivering fast recovery times and multi-site availability at an accessible, affordable level. Effectively then, in a single stroke, what was only available to the few is now available to all, and there’s no more need for backup tapes. Through virtualisation, the entire server can be copied or backed up to an offsite data centre and run as the live system on a virtual host within a matter of minutes, instead of what once took weeks. That massive shift has completely changed how we all think about off-site data recovery. Because the virtual server is hardware-independent, everything can be safely and accurately transferred to a second data centre without the headache of reloading each component of the first data centre. Smart data centre operators are providing full disaster recovery services that not only replicate the servers between data centres but also replicate the entire network configuration, in a way that recovers the network as quickly as the backed-up cloud servers. Furthermore, cloud technology has allowed businesses to outsource their disaster recovery needs, providing much greater flexibility. Indeed, Disaster Recovery as a Service (DRaaS) is easily scalable should companies want to expand or contract their operation according to market needs. It’s also big business, with the DRaaS market predicted to be worth $6.4 billion by 2020. Nevertheless, while the cloud offers the best method currently available - and an excellent one at that - there is plenty of development still to come, particularly as the sophistication of threats continues to advance at the same rate as new technology. 
Today’s best-of-breed solutions enable companies to produce a live clone of their production environment and use it as a development and test platform that doesn’t impact day-to-day operations. Moving this workload entirely into the cloud immediately eliminates the need for cumbersome and expensive maintenance of development environments. A hybrid cloud model combined with on-premise backup holds significant potential, ensuring that company data in the cloud and information stored physically are continuously replicated and synced so that companies can rapidly recover onsite or from the cloud as need dictates. However, the real disaster recovery advancements are likely to come from those service providers that integrate new technologies available in the marketplace with their own innovations to ensure the customer experience is the best possible. As always, those that can offer the speed, reliability and competitive cost that the market demands will be the winners – but the ability to shift data on request between various sites while attaining always-on availability will be crucial. The cloud is creating an unprecedented level of scalability, and the sky is surely the limit for the disaster recovery needs of the future. ### Cloud Data Security: the newest gap for IT to fill IT teams are responsible for mission-critical applications, services and systems that businesses depend on to ensure success. On top of this, already over-worked IT pros are responsible for protecting corporate data in a world where data is moving across borders, systems and people. And it’s not an easy job. Although there’s a lot of technology that enables business integration, security is still an issue, and teams often struggle to protect data end-to-end even as they implement solutions to respond to heightened cyber security threats. In a recent survey we carried out, nearly 600 IT pros working in organisations across the globe with more than 500 employees were polled about their data security and compliance issues. In the UK, 78% of respondents rated their ability to transfer and share files securely as very important, and 70% already prohibit individual file transfer solutions through company policies. 
Other European findings included:
- 59 percent of respondents have policies in place that prevent the use of certain file transfer technologies or services for sensitive data, while 35 percent of companies in regulated industries such as healthcare, finance and government don't allow it at all
- Only 31 percent of respondents believe that their organisation's processes for mitigating risks in file transfer operations are very efficient, and only 28 percent believe that their organisation's processes for identifying risks in data transfer operations are very efficient
- 47 percent of the organisations surveyed have, or may have, experienced a significant loss of data resulting from a breakdown in the file transfer process
- 55 percent of respondents that did experience a significant loss of data said it was due to human or processing error
As the external threat of hackers causing cyberattacks and breaches continues to rise, security problems caused by employees making innocent mistakes or, in rare cases, looking to maliciously steal corporate data are also increasing. An effective way to prevent both innocent and malicious insiders from causing cyber security issues is to stop them from using unsecure public cloud file sharing solutions such as Dropbox or Box. Transitioning to a more secure Managed File Transfer (MFT) solution that offers extra layers of security and protection can make all the difference in an increasingly complex and unsafe world rampant with cyberattacks. The use of cloud-hosted applications has led to more IT teams battling a new kind of data security gap in their organisation. And as shadow IT permeates most organisations, non-secure file transfers can often be at the bottom of a long list of security gaps that need attention. Filling cloud data security gaps can pose a unique challenge for IT teams because they have no visibility into these data transfers. Additionally, IT teams struggle to meet compliance standards, whether internal or external, because they don't have a record of what files have been shared, who they have been sent to, or where the files went. This lack of control can weigh on IT teams, and as cloud applications evolve and the workforce becomes even more tech-savvy, the threat and the consequences will only continue to grow. To address these security gaps, IT teams can look to adopt secure MFT solutions that send files securely with complete tamper-proof audit logs, reports and dashboards, giving visibility into the movement of their company's data and a sense of control. Additionally, IT teams can:
- Prohibit most, if not all, EFSS and insecure/unsanctioned file transfer solutions
- Continuously educate employees on secure file transfer best practices and policies
- Train all staff on the MFT solutions in place for both person-to-person and person-to-system transfers
- Leverage workflow and automation tools included in MFT solutions to reduce the possibility of human error
Implementing an MFT solution and embedding it into a company's workflow is just the first step in addressing security gaps that occur from the rise of the cloud. IT teams should also set best practices for themselves to help prevent other gaps from forming. For starters, they should classify documents and implement content inspection practices around sensitive information, including PII and intellectual property. 
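To illustrate the audit-trail idea behind MFT in the simplest possible terms (this is not any vendor's product; the file names, parties and log format are assumptions made for the example), each transfer can be recorded in a hash-chained log so that a silently edited or deleted entry breaks the chain and becomes detectable:

```python
# Illustrative sketch of a hash-chained (tamper-evident) audit log for file
# transfers. The transfer itself is out of scope here; a real MFT tool would
# perform an encrypted transfer (e.g. SFTP/FTPS) and write a similar record.

import hashlib
import json
from datetime import datetime, timezone
from pathlib import Path

AUDIT_LOG = Path("transfer_audit.log")

def _last_entry_hash() -> str:
    """Hash of the most recent log line, or a fixed seed for an empty log."""
    if not AUDIT_LOG.exists() or AUDIT_LOG.stat().st_size == 0:
        return "0" * 64
    last_line = AUDIT_LOG.read_text().strip().splitlines()[-1]
    return hashlib.sha256(last_line.encode()).hexdigest()

def record_transfer(filename: str, sender: str, recipient: str) -> None:
    """Append a transfer record whose integrity depends on every prior record."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "file": filename,
        "from": sender,
        "to": recipient,
        "prev_hash": _last_entry_hash(),   # chaining makes silent edits detectable
    }
    with AUDIT_LOG.open("a") as log:
        log.write(json.dumps(entry, sort_keys=True) + "\n")

# Example usage: log a (hypothetical) outbound transfer.
record_transfer("payroll_2016_q3.xlsx", "finance@example.co.uk", "auditor@example.com")
```

Real MFT platforms pair records like these with the encrypted transfer itself, plus reporting and dashboards; the sketch only shows why such a log is hard to tamper with unnoticed.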
IT teams are on a long road to protecting their company’s data against breaches, attacks and careless employees, and there are bound to be a few fumbles along the way. Based on results from the previously mentioned survey, not even half – only 40 percent – of respondents working at global companies in regulated industries have adopted best-in-class, more secure MFT technology. But with the cloud data security gap only looking to widen, this number needs to rise significantly. By implementing these tips to fix many of the pressing issues presented by non-secure file transfers, IT teams will be well on the way to closing the cloud data security gap and preventing future ones from appearing. ### CPA Global IP Expert Recognised as World Leading IP Strategist ‘IAM Strategy 300 – The World’s Leading IP Strategists’ identifies individuals who are leading the way in the development and implementation of strategies that maximise the value of IP portfolios. Tyron Stading, President and Founder of Innography, a CPA Global company, has been named by Intellectual Asset Management (IAM) as one of the world’s leading IP strategists. The list is compiled annually, with IAM researchers speaking to a range of senior corporate IP managers and third-party IP service providers to identify IP leaders from around the world. Only individuals nominated multiple times, and by different parties, as outstanding IP strategists can be listed in the IAM Strategy 300. “The IAM Strategy 300 recognises the achievements of a very select group of men and women whose advice has consistently helped companies across the world to generate significant extra value from their IP,” says IAM Editor Joff Wild. “Developments over recent years have shown the strategic importance of IP to businesses, so locating individuals who understand IP value and how to create it has never been more important.” Tyron Stading founded Innography in 2006. Innography provides patent search and intellectual property analytics software that helps the world’s leading patent owners, innovators and decision-makers drive more business value from their IP investments. With a vision of redefining patent analysis, Tyron has published multiple research papers on intellectual property and filed more than 50 patents. CPA Global acquired Innography in November 2015. Simon Webster, Chief Executive Officer, CPA Global, comments: “We are fortunate to have a team of more than 250 intellectual property specialists at CPA Global. Every individual works towards the same vision: empowering a global IP community to achieve excellence in IP management and realise the potential of ideas. The exemplary work of Tyron Stading has been recognised by the IAM Strategy 300. We look forward to our customers collaborating and gaining insights from their expertise to optimise their IP strategies.” About CPA GLOBAL CPA Global is the world’s leading IP management and technology company. We believe that ideas change the world. Trusted by many of the world’s most respected corporations and law firms, CPA Global empowers a global IP community to achieve excellence in IP management and realise the potential of ideas. CPA Global does this by supporting the day-to-day delivery of IP tasks globally and providing the right information at the right time, enabling professionals to make better IP decisions for the future. 
CPA Global’s integrated suite of IP software, services and information is underpinned by an outstanding global team of over 2,000 IP professionals, working together to help customers deliver strategic value.   ### Suppliers need to work more closely with GDS if we’re to achieve the G-Cloud vision The G-Cloud framework has been with us for more than four years and by the end of June 2016 sales had exceeded £1.26 billion. Almost every aspect has come in for criticism in recent weeks: buyers for not using the framework, suppliers for not understanding what they need to do for a successful sale and the framework itself for being difficult to use by all parties and not providing the promised opportunities for SMEs. In my experience, there is some truth in all these comments – but it’s hardly surprising, and should not be a reason to complain, or to pull back from the framework. Moving to an entirely new method of public sector purchasing was never going to be a quick win, and if we as suppliers want to make it work we can’t sit back and wait for someone else to take the initiative. [easy-tweet tweet="Moving to an entirely new method of public sector purchasing is never going to be quick" hashtags="cloud, framework"] Let’s start with the facts: many public sector organisations are using G-Cloud, and many SMEs, including Fordway, have been awarded significant contracts. In the latest figures provided by the Government Digital Service (GDS), total recorded G-Cloud sales have reached £1,263,386,146, with 53 percent of total sales by value and 62 percent by volume awarded to SMEs. Through being on the framework since it was first launched, we’ve learned a lot. This includes analysing opportunities where we haven’t been successful, which has enabled us to improve our offer in future iterations of the framework. But other suppliers and we need to do much more. We need to start by sharing what we’ve learnt with the GDS so that everyone can benefit. It could help buyers to specify their requirements more accurately, suppliers to improve their service descriptions or the GDS to change search terms, service categories or other parameters. We want to make it easier for potential buyers to find our services – assuming that those services are relevant to their needs. While this may also help other suppliers, we have to be confident that our offer will stand up to the comparison on a level playing field. Suppliers can also advise GDS if there are services customers are asking for but can’t find in the framework, forcing them to look elsewhere. This could lead to the development of new services, categories or search terms, or perhaps changes to the scope of the framework itself. [easy-tweet tweet="Suppliers can also advise GDS if there are services customers are asking for" hashtags="cloud, framework"] GDS itself is being proactive, launching a discovery process for G-Cloud 9 to find out how future iterations of the framework should evolve and looking at overlap with the Digital Outcomes and Specialists framework. Earlier this year they ran a workshop at the Cyber UK in Practice conference to review the security around G-Cloud services, and more such events are in the pipeline. This gives suppliers and buyers an opportunity to help evolve the framework into something that meets all our needs. The original G-Cloud vision inspired me and many others, offering a way for the public sector to benefit from the creativity and agility of SMEs while reducing procurement costs (and giving me as a taxpayer better value). 
It’s up to all of us working in the sector to help make that vision a reality. ### Open Platforming for IoT - #CloudTalks with Maplese’s Marc Van Der Meent LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Marc Van Der Meent from Maplese about why data privacy is so important for their customers. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Maplese is an essential part of your IoT Ecosystem. It is an open platform that offers smart agile app development, visualisation & data storage. Find the latest developments of Maplese here. ### Cutting through the hype The IT industry loves acronyms, almost as much as marketers love buzz words and trends. However, combining the two can lead to a rather muddled vat of terminology that obscures meaning, rather than adding clarity. So, I’d like to help provide that clarity by boiling it all down and giving some perspective on the differences between backup, disaster recovery (DR), and Disaster Recovery as a Service (DRaaS), addressing how and when you should think about using these technologies. A backup is simply a snapshot of your virtual machine, typically taken once a day after close of business. Backups are stored for a set period of time, with a retention policy such as "one a day for a month and one a month for a year." They are usually kept in a place that is separate from the primary system, like a basement, another data centre, or, in the case of Backup-as-a-Service, in the cloud. [easy-tweet tweet="Backups are great for legal and regulatory reasons, data protection and so on." hashtags="tech, backups, data, security"] In my opinion, backups are great for legal and regulatory reasons, data protection and so on. However, they are not so great for business continuity. Why do I say this? The most recent copy you have is from last night, which is usually stale as most business applications change significantly during a working day. Equally, to re-awaken a system from a backup takes time. First, you need a ready and working system, rebooted and running smoothly, and then you have to mount the backup. If the backup is (more safely) stored elsewhere, there is also the travel time of getting it to the primary location to factor in as well. The whole process is not renowned for being very fast. So how does this compare to Disaster Recovery? Disaster Recovery (DR) systems replicate your data on an ongoing basis from a primary location to a secondary location. Usually, replication periods are measured in minutes, or even seconds, so the last copy of your data is fairly fresh. The secondary copy is held in stasis until it is re-animated via a failover trigger. The secondary location can be another data centre in another region, or even the cloud, in the case of DRaaS. In my opinion, DR is fantastic for business continuity because you can trigger a failover and be live at the secondary location in minutes. While some systems do enable an archiving-type function, most people agree that replication itself is not really the right tool for long-term data storage. This is where I believe that Disaster-Recovery-as-a-Service (DRaaS) really comes into its own. If you don't have access to a secondary location, or you don't want to pay for it, DRaaS enables you to execute the same Disaster Recovery replication with the cloud as your secondary location. In this instance, the speeds involved are similarly fast—it takes minutes or seconds to recover systems. 
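The backup retention policy quoted above ("one a day for a month and one a month for a year") translates naturally into a pruning rule. The snippet below is a minimal, illustrative interpretation of that policy rather than any product's behaviour; the cut-off values and the "first of the month" convention are assumptions made for the example.

```python
# Illustrative sketch of a "one a day for a month, one a month for a year"
# backup retention rule. Cut-offs and the first-of-month choice are assumptions.

from datetime import date, timedelta

def keep_backup(backup_date: date, today: date) -> bool:
    """Keep daily copies for the last 30 days, then only the first-of-month
    copy for the last 365 days; everything older is pruned."""
    age = (today - backup_date).days
    if age < 0:
        return False                               # ignore dates in the future
    if age <= 30:
        return True                                # daily copies for a month
    return age <= 365 and backup_date.day == 1     # monthly copies for a year

today = date(2016, 10, 1)
backups = [today - timedelta(days=n) for n in range(0, 400)]
kept = [b for b in backups if keep_backup(b, today)]
print(f"{len(kept)} of {len(backups)} snapshots retained")
```

Contrast that with DR replication, where the most recent copy is minutes or seconds old rather than up to a day old: retention answers the compliance question, replication answers the continuity one.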
With some providers, you only pay for storage while the systems are replicating/in stasis, and once they failover, you'll pay for CPU and memory resources as you consume them. That said, you will want to vet the cloud as an operating environment for your workloads. Levels of security, customer support, compliance and transparency vary greatly from provider to provider. It’s important to understand the needs of your business and choose a DRaaS provider, accordingly. [easy-tweet tweet="It’s important to understand the needs of your business and choose a DRaaS provider, accordingly." hashtags="tech, cloud, DRaaS"] DRaaS is a great way to achieve business continuity goals quickly and more cost-efficiently than owning a secondary data centre yourself. But don’t just take my word for it; take a look at how one of our customers, a multi-national financial services company, Bluestone, reduced its costs by 40 percent with our DRaaS solution. With offices in the UK, Ireland and Australasia, Bluestone is at the forefront of delivering innovative financial services to customers in a rapidly changing market that is subject to stringent compliance requirements. Facing ageing infrastructure and expiring supplier contracts, the company was pushing towards a cloud-first IT strategy to stay ahead of evolving customer and regulatory demands. Krisztian Kenderesi, the head of Global IT Operations for Bluestone, said it best: “Our IT team runs lean and mean and we don’t want to build our own DR solution or hire a DR expert-–for us, that expert is iland. The iland DR solution reliably runs in the background. I know it works and I don’t have to worry about the infrastructure availability, compliance and security. Our usability and maintenance overhead for the iland DRaaS solution is basically zero—much less than the headaches of an on-premise DR solution.” Below is a summary of the key benefits that Bluestone has found since using our DRaaS: Avoided downtime while simplifying its DR management – Bluestone can quickly recover from any IT incident, achieving recovery time objectives measured in minutes and recovery point objectives measured in seconds. Protected customer data with integrated advanced security – In the event of a failover, Bluestone’s workloads are protected against emerging IT threats with features like antivirus and malware detection, vulnerability scanning, whole disk encryption, SSL-VPN, intrusion detection and prevention, event logging, deep packet inspection and other advanced capabilities. Simplified industry compliance – In addition to FCA regulations, Bluestone must maintain the ISO 27001 standard, which has clear requirements for considering security in a Disaster Recovery environment. We, therefore, support Bluestone with security and compliance reporting to speed up and simplify its auditing processes. Reduced IT resiliency costs by 40 percent – With our straightforward pricing model, Bluestone only pays for compute resources when it requires a failover. The company has achieved a reduction of 40 percent in the overall cost model of its DR solution. Nothing helps break through the hype like a real-world use case, and you can download the Bluestone case study here for more information. ### What hybrid IT professionals can learn about support from cloud middleware Public cloud usage within the enterprise has grown dramatically in the last decade. It has migrated from the preserve of testing and development into more business-critical systems like CRM and HR. 
Industry analyst firm Gartner recently predicted that the global market for public cloud would grow by 16.5 percent to $204bn in 2016 with IaaS seeing a growth of almost 40 percent. [easy-tweet tweet="Public cloud usage within the enterprise has grown dramatically in the last decade." hashtags="cloud, tech, hybrid"] Many – if not most – organisations will find themselves using public cloud services from more than one provider. It is increasingly common for different departments to buy cloud services individually. Additionally, some IT teams may find that one public cloud is superior to another regarding pricing, support or suitability for their specific workloads. This diversity in cloud platforms has resulted in a parallel diversity in automation and orchestration technologies – all of which can either partially or entirely aid organisations in managing the various application workloads across their different suppliers. Many experts have gone so far as to say that automation and orchestration technologies are intrinsic to the definition of a ‘true’ hybrid IT environment. These technologies are rapidly becoming the norm. According to Rightscale, 82 percent of enterprises are now using multi-cloud environments, compared to 74 percent in 2014, demonstrating just how rapidly the hybrid IT ecosystem is growing. The increased reliance on public clouds for multiple business functions comes with increased complexity, despite clear improvements in the quality, cost and security that public clouds have achieved in just a few short years. Each cloud environment usually comes with its management platform, SLAs, APIs and characteristics, making management difficult – especially when data needs to travel between clouds or ‘burst’ from private clouds to public clouds at peak demand periods. So how can IT professionals most effectively manage their hybrid environments? Complexity and automation There is a myriad of management tools from vendors such as Flexera, Scalr, xStream and ElasticBox that can act as an intermediate layer, enabling organisations to manage multiple cloud providers from a single platform. However, these platforms are often limited to management, monitoring, load balancing and scaling rather than addressing problems - like outages or issues with APIs - that may occur. Problem remediation in enterprise IT is a difficult problem in even the best of times, and will become even more complicated as interdependence between all segments of the IT estate increases. [easy-tweet tweet="Many software solutions have started to offer the ability to self-diagnose problems." hashtags="hybrid, tech, cloud"] Remediation among interconnected public and private clouds is why automation is now so critical to hybrid cloud usage. Many software solutions have started to offer the ability to self-diagnose problems and automatically fix them without the need for human intervention. Power to the people… as long as they’re in formation Although cloud automation is moving forward by leaps and bounds, even the most advanced automation providers are still limited to solving relatively simple issues such as discovering and registering new IT assets and handling ticketing for IT support events. This means that human intelligence is – and will continue to be – critical for the foreseeable future. There are not yet robots or programs that can physically visit a server rack and make a judgment call based on years of expertise in the industry or even simply unplug equipment. 
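The self-remediation capability described above (software that diagnoses a fault and applies a fix before a person gets involved) can be reduced to a simple pattern. The sketch below is hypothetical: the health probe, restart action and escalation hook stand in for whatever a real monitoring or orchestration tool would actually provide.

```python
# Minimal illustration of automated fault remediation: probe a service,
# attempt a bounded number of automatic fixes, then escalate to a human.
# All three helpers are hypothetical stand-ins, not a real vendor API.

import time

def is_healthy(service: str) -> bool:
    """Placeholder for a real probe (HTTP health endpoint, cloud API status call)."""
    return False  # in this sketch the service never recovers, forcing escalation

def restart(service: str) -> None:
    """Placeholder for a remediation action (restart a VM, redeploy a container)."""
    print(f"Restarting {service}...")

def escalate(service: str, attempts: int) -> None:
    """Placeholder for raising a ticket or paging an engineer."""
    print(f"Escalating {service}: still unhealthy after {attempts} automatic restarts")

def remediate(service: str, max_attempts: int = 3, wait_seconds: int = 5) -> None:
    """Try automatic recovery first; involve people only when it fails."""
    for _ in range(max_attempts):
        if is_healthy(service):
            return
        restart(service)
        time.sleep(wait_seconds)  # give the service time to come back
    if not is_healthy(service):
        escalate(service, max_attempts)

remediate("crm-frontend")
```

That final escalation step is exactly where the human judgment discussed above comes back in: automation clears the routine failures and leaves the rack-side decisions to people.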
Handling such issues in-house, however, can be challenging and a drain on an IT department’s limited resources. IT teams often find themselves working with multiple external support teams or spending a lot of time driving from location to location – all while finding a time that is convenient for everybody to carry out maintenance operations. This is especially problematic for customer-facing services that can’t be unplugged without impacting revenue. Due to these logistical constraints, many organisations find themselves appointing support and management companies to facilitate IT maintenance. Choosing to outsource this work raises the issue of finding the right support company that can successfully navigate all the different parts of the IT estate. Cloud management software can indirectly help organisations learn how to organise their IT suppliers most effectively. These platforms often sit as intermediaries between IT staff and cloud resources, and can be mirrored within the ecosystem of an organisation’s IT suppliers, vendors and services. Rather than having a large number of vendors operating in isolation – with each reporting to the corporate IT team like a ‘hub and spoke’ model – the future may lie in appointing one managing supplier to oversee operations, like an operating system mediating between applications and the underlying hardware. This gives the in-house IT team one single point of contact and a familiar partner to work with for support. Supercharging hybrid Whilst private cloud revenues are slowly increasing, public cloud consumption continues to grow fast. As the majority of organisations move to a ‘hybrid by default’ approach to technology, there is little doubt that systems will become exponentially more complicated over time – raising the importance of both system automation and experienced IT staff. Therefore, it is important to look closely at the current IT organisational structure and culture. Hybrid IT enables flexible technology solutions that are cost-efficient and do not include services that are beyond the scope of the application workload. Given how rapidly public cloud has been adopted by enterprises, and until systems for automation and orchestration become more autonomous with respect to fault remediation, capacity management and elasticity, skilled IT professionals will continue to be intrinsic to the successful operation and management of these environments. ### BT joins forces with Microsoft to simplify hybrid cloud BT today announces BT Compute for Microsoft Azure, a new service that allows BT customers to order Microsoft Azure alongside BT’s own cloud services through its award-winning Compute Management System (CMS) online management tool. The service enables customers to build hybrid cloud infrastructure with a single service wrap and contract, on a single bill. BT customers already use private and public cloud services – also known as hybrid cloud – hosted in BT Compute’s 48 data centres globally. With BT acting as their cloud services integrator, and by using CMS, customers can manage their cloud services end-to-end from data centre to network, maximising the benefits and minimising the complexity, risk and costs of moving to the cloud. 
Using local delivery with global scale allows BT to meet the evolving needs of organisations for cloud services and at the same time address the complexity of regulatory requirements. Integration of Microsoft Azure into the CMS expands the choice for BT customers. They gain access to Microsoft’s rapidly growing collection of integrated cloud services, including leading Infrastructure as a Service (IaaS) and Platform as a Service (PaaS) computing capabilities. Neil Lock, vice president of BT Compute at Global Services, BT, said: “Hybrid cloud has become a major focus for many large enterprises as they choose a variety of cloud solutions to suit their complex business needs. In fact, BT research suggests that 90 percent of its largest customers expect to be using a combination of public and private cloud in the next few years. Through our relationship with Microsoft, customers can build their own hybrid cloud environment and enjoy the benefits of Azure whilst removing costly management concerns from the equation. Our Cloud of Clouds portfolio strategy is all about empowering our customers to take advantage of the choice, flexibility and control of cloud without concerns about complexity and security.” Aziz Benmalek, vice president, Worldwide Cloud and Hosting Services, Microsoft Corp, said: “BT is a natural fit to provide managed cloud services on Azure. Through our relationship with BT, we're able to help customers with their infrastructure management and give them the ability to scale their business in a consistent hybrid cloud environment." The service will be available during the final quarter of 2016, adding further value to Microsoft Azure ExpressRoute, available through BT Cloud Connect Direct. ### The sky is the limit - the importance of Cloud Assurance to Digital Transformation Cloud computing is one of the technologies turning the way businesses are run upside down. New emerging powerhouses such as Airbnb do not own a single hotel and yet have a larger market cap than any hotel chain in the world. Even the largest established businesses are adapting to disrupt industries, as shown by General Motors’ investment in rideshare start-up Lyft. Digital transformation (DX) will affect all businesses. According to a survey conducted ahead of Code Computerlove’s ‘Leading your business through digital transformation’ event in June this year, 86 percent of business leaders think digital transformation is necessary within their organisation. Established enterprises are facing far more significant challenges than their startup competitors. New players such as Airbnb and Uber have a huge advantage: they can operate almost exclusively in the cloud. Established enterprises often have legacy systems and applications that cannot be virtualised and migrated to the cloud. Furthermore, regulatory compliance may require that some digital assets have to stay on a company’s premises. Therefore, the road ahead is more complicated than simply migrating infrastructure and applications to the cloud. Clouded vision The cloud offers enterprises the very attractive proposition of being able to increase infrastructure capacity whilst also being able to deploy new services as and when they are needed. This new-found flexibility does, however, have its drawbacks. Enterprises may lose visibility and control over their data and the quality of the service which they offer. 
The CIO is the key figure who can ensure that a business maintains control of its digital assets to assure continuity and quality of services. They are in a unique position to design a digital transformation strategy that pulls together disparate systems and assures the continued smooth running of the business. It is vital that the CIO handles cloud-based assets like any other IT resource. They will be deployed to support enterprise functions and mission-critical systems, so they need to be closely monitored. Once an enterprise embraces digital transformation, the CIO needs to have complete visibility across all cloud-based systems within the business. They should be in a position to veto any initiatives that do not comply with the overall corporate IT business assurance strategy. For the CIO, being able to monitor activity continuously and in real time across the entire service delivery infrastructure, both on-premise and in the cloud, is absolutely vital to supporting the business agility needed to introduce new services quickly in a digitally connected world. Digital transformation is serious business An ongoing challenge facing businesses in 2016 is that companies are relying on a mix of public cloud service providers, private cloud and on-premises technology, which makes it difficult to have a holistic view of the infrastructure. With more and more systems relying on the performance of external services, it’s a challenge to pin down the root cause of a service disturbance when it occurs, due to complex dependencies between multiple service delivery elements both on and off a company’s premises. This is why the CIO needs to provide oversight for the enterprise as a whole. What is needed is a complete end-to-end view of the entire service delivery infrastructure. Without this visibility, service failures and even criminal activities, such as hacking and Distributed Denial of Service (DDoS) attacks, can go unnoticed. With so much of modern business activity reliant on the effective operation of the network, it is vital to be able to identify and address issues quickly and effectively, lest they have serious consequences for the organisation. Sitting on Cloud 9 Cloud assurance will play an absolutely vital role in the DX process for the great majority of enterprises in the coming years. The CIO will be the figure in the strongest position to provide business assurance oversight across the entire business. They need to be able to manage every new cloud-based service and application adopted by the business. By providing this oversight, the CIO will be able to identify any anomalies or threats, as well as ensure that all new cloud applications fit in with the overall IT assurance strategy. Balancing the smooth, continual running of the business throughout the digital transformation process is no easy task; however, the CIO is perfectly placed to build unity into the IT strategy and ensure businesses reap the benefits in the coming years. 
According to MIT’s Center for Information Systems Research, 32 percent of revenue is at risk by 2020 due to digital disruption. This suggests a debilitating future for firms which lag behind in the digital revolution. Leaders will steal their revenue. Creating an innovation ‘fast-lane’ To survive and thrive, every company must become an innovative software company. Firms must harness the power of modern application software to differentiate themselves; create new products or channels that weren’t accessible before; and engage customers, partners and employees in new ways. Perhaps most critical of all, it means releasing applications early and often and ensuring a cycle of continuous innovation. The challenge for established firms, though, is that existing IT teams are rightly focused on maintaining core applications and critical systems. To overcome this, many companies embrace a ‘bi-modal’ IT approach to digital transformation, augmenting existing resources with small, cross-functional teams focused exclusively on innovative applications. Typically, team members are tech-savvy business people or business-savvy tech people - the point being that they’re able to bridge the gap between business needs and technical possibilities to transform ideas into applications quickly. It’s important to recognise that requirements for digital solutions are often fuzzy at the start of the project. Teams should work in short, iterative cycles in close collaboration with end users. The key is to define the minimum viable or useful functionality required of an application, build and release that version, then iterate continually based on user feedback. No need to Code Writing applications using traditional code-based approaches doesn’t always suit rapid-pace software development. Skilled coders are in short supply and can be challenging and expensive to recruit. And coding an application can take a long time, versus other approaches, without delivering the desired business outcomes. An alternative is to consider a low-code development platform, which offers a visual modelling ‘drag and drop into a workflow’ approach to building applications. Mendix is one of many such platforms. The use of visual models to define an application’s data model, user interface and logic creates a common language for business and IT to collaborate and progress the build iteratively. It also delivers significant productivity gains over hand-coding. Typically, suitable applications are built six, even ten, times faster using visual modelling than via traditional coding approaches. Visual modelling platforms also further shorten time to market by offering reusable components, drag-and-drop connectors to enterprise systems as well as IoT, big data and machine learning platforms, one-click cloud deployment, and more. How to use a Visual Modelling Platform While a logical mind is imperative, most low-code development platforms require no prior programming experience. Learners can expect to begin with a vendor-provided online training session or workshop. This will present the basics of how to use the platform to capture requirements, and build applications simply by selecting themes and components from a library and dropping them into a workflow. 
Delegates can learn in just a day or so to build their own simple web and mobile applications, which they can deploy locally or in the Cloud. Most have an active developer community offering guidance, support and meet-ups. Vendors typically also offer advanced training courses and workshops. Some provide formal exams too, allowing progress to certified developer status on that vendor’s platform. Of course, the ability for non-coders to quickly pick up these platforms shouldn’t imply that professional developers can't also leverage them. In fact, developers who value speed and close collaboration with peers from the business over getting bogged down in specific technologies often embrace visual modelling. Evidence from the Field Does it work? There are plenty of good examples around for using low-code development platforms to speed up digital transformation projects. At UK insurer Liverpool Victoria (“LV=”) the CIO says that using a visual modelling platform is helping them to achieve the speed and agility to compete in a growingly innovative UK industry.  Springer Healthcare, a publisher, built in just 12 weeks a mobile application for its sales force that would have taken 52 weeks to create using code, while Kao Beauty built a global sales application for its mobile workforce in just one tenth of the time coding would have taken. Another visual modelling advocate is the charity Action for Children. Its head of business systems development says that constructing applications this way is fast, giving working prototypes they can demonstrate to users in just days, enabling the step-by-step creation of user-friendly and fit-for-purpose applications. [easy-tweet tweet="There are plenty of examples for using low-code dev platforms to speed up projects" hashtags="cloud, tech, IT"] The demand for digital transformation projects is putting lots of pressure on business and IT leaders. Low-code development platforms are well proven to support the innovation and fast delivery of fitting applications that customers, employees and partners are eager to use. Skilled coders are worth their weight in gold. But your people don’t always have to be able to code to create great digital applications. ### Survey Reveals 50 Point Big Data Adoption Chasm in UK Public Sector Pentaho Corporation, a Hitachi Group Company, today published research commissioned from iGov revealing that despite 76% of respondents believing that Big Data could benefit their organisations, only 26% regard it as a top priority. This 50 point adoption chasm exists despite 79% of survey participants stating they believed big data could help improve efficiencies / reduce costs and 74% recognising the potential for big data to help deliver personalised services. The full report, which polled 132 UK senior public sector IT executives, is entitled “Effective Analytics: The Use of Big Data in the Public Sector 2016” and is available for download here. On September 28th at Whitehall Media’s Central Government Business and Technology event being held at London’s Inmarsat Conference Centre, Pentaho will discuss these and other adoption concerns with delegates, and how they can be overcome using modern business analytics and data integration technology. Skills gap and data quality top list of concerns In the public sector, where IT failures are highly visible, professionals cite multiple reasons for hesitating to adopt big data. 
75% of those surveyed lack an understanding of what big data can do for their organisations, while 63% don’t believe they have the appropriate skills in-house to get started. 50% see their legacy systems as a bottleneck to success. Other bottlenecks include project resourcing (55%) and concerns over data quality (53%). IT infrastructure concerns More than half (52%) of the respondents cited a range of IT infrastructure concerns as a reason for delaying big data adoption. 72% of those surveyed would have to integrate various data sources and 34% are doing this manually. Only 5% use a Hadoop-based solution to store and analyse data. These factors explain why 50% of survey respondents cited difficulties with managing large amounts of data as a top concern, with 47% concerned about data misuse and 44% about the accuracy of analysis. Closing the gap Despite these concerns, nearly half (46%) of respondents stated that they were interested in learning more about how big data analytics tools could support their goals within 12 months. According to Paul Scholey, Vice President EMEA & APAC, Pentaho: “Many survey respondents identified the same big data bottlenecks that private sector companies are alleviating today using modern tools like ours based on open data standards. These companies gain an advantage by using our software extensively to overcome the skills gap and automate complex, repetitive and time-consuming aspects of big data preparation.” Below are examples of addressable problems that respondents identified in the survey:
- 66% believe that streamlined processes where all data can be accessed and viewed through one system would help to improve the ways in which their organisation handles Big Data analysis and integration;
- 65% think it would be the integration of data at the source with little to no manual work; equally, 65% think the ability to ‘look-up’ data using a single interface that searches multiple sources would help;
- 64% think that a simple way of ‘cleansing’ data to ensure a higher level of accuracy would help;
- 60% believe they can build on effective and accurate data analysis to give a better insight into citizens.
### Sell your home in 90 days or get the cash with new online tech A third of sales fall through, causing chain-buyers to miss out on their dream home. Successful serial founders with £3m funding from leading investors. British ‘proptech’ (property technology) startup Nested has launched a new service that promises to sell a house within 90 days, or lend the seller the money for their next house purchase interest-free. This follows a recent report from Which?, which shows that three in 10 property sales now fall through, most often related to the property chain, with one in five failed deals (21%) caused by buyers unable to sell their own home*. Widely regarded as one of the most stressful experiences in life, selling a house beats having a child and changing jobs, and comes second only to going through a divorce**. The Nested service aims to take the stress and time out of the process and helps sellers escape the property chain by offering them certainty over their property sale from day one – without having to compromise on the sale price. 
Set up earlier this year by the founder of GoCardless, Matt Robinson, CEO, former Songkick CTO, Phil Cowans, and ex-McKinsey consultant and trained architect, James Turford, the team has already raised £3m from leading investors, including Passion Capital, Indeed founder Paul Forster and self-made billionaire brothers, Oliver and Alex Samwer. The Nested service will be available initially in London. Properties sold using Nested are guaranteed from day one and sellers will have the funds for their next purchase in 90 days, whether it is sold or not, putting them in the best possible position to purchase their next property. For example, if they have found their dream property and are about to lose out to a cash buyer. Matt Robinson, CEO of Nested, explains. “The whole property chain and selling process is broken, with sellers complaining about slow lawyers, endless phones calls, untrustworthy estate agents, and of course, being let down by the buyer at the last minute. With so much time and effort wasted on the sales process, we set out to solve the problems that trap sellers in a lengthy sales chain and with no guarantee of a sale at the end of it. “Today a third of agreed offers fall through, something we are seeing more of with increased uncertainty post-Brexit. Many sellers find themselves forced to start again, often losing the house of their dreams further down the chain as a result. Some resort to moving into rented accommodation in order to move on time for a new job or the start of the new school year, and end up paying for two properties. Nested makes sellers chain-free from day one, removing months of uncertainty and putting them in the strongest position to make their next purchase.” Using smart valuation technology and process management, Nested provides sellers with a property valuation within minutes and guarantees a minimum price from day one, promising to sell the house within 90 days or give them the money. Thanks to its valuation algorithm and unique processes, the company is able to guarantee the customer fair market value, unlike traditional home buying companies in the distressed house sale industry, which was widely criticised in a recent OFT (Office of Fair Trading) report. Nested charges a minimum fee of 1.8%, and if they can sell for more than the guarantee, they give the majority (80%) of any upside to the homeowner, whether the property is sold before or after 90 days. Nested retains the other 20%, meaning their incentives are fully aligned with the homeowner to sell for the maximum amount possible. ### Oracle Beats Amazon Web Services in Head-to-Head Cloud Database Comparison Oracle Executive Chairman and Chief Technology Officer, Larry Ellison today demonstrated that Amazon databases are 20 years behind the latest release of the Oracle Database in the Cloud. In his keynote presentation at Oracle OpenWorld 2016 in San Francisco’s Moscone Center, Ellison shared detailed analysis that showed that Oracle Database-as-a-Service (DBaaS) is up to 105X faster for Analytics workloads, 35X faster for OLTP, and 1000+X faster for mixed workloads than Amazon DBaaS. Ellison also showed that the Oracle Cloud is optimized for running Oracle Database while Amazon Web Services (AWS) is not. An Oracle Database running on the Oracle Cloud is up to 24X faster than an Oracle Database running on AWS. “Oracle’s new technologies will drive the Cloud databases and infrastructure of the future,” said Ellison. 
“Amazon are decades behind in every database area that matters, and their systems are more closed than mainframe computers.” Ellison also announced the availability of Oracle Database 12c Release 2 in the Oracle Cloud with the launch of the new Oracle Exadata Express Cloud Service. This service provides the full enterprise edition of the Oracle Database running on the database-optimized Exadata infrastructure. Starting at just $175 per month, Ellison showed this Cloud service is lower cost than similar offerings from Amazon. With the launch of Oracle Database 12c Release 2 in the Cloud first, Oracle has demonstrated that the Oracle Cloud is the most optimized, complete and integrated Cloud for Oracle Database. The latest release provides organizations of all sizes with access to the world’s fastest, most scalable and reliable database technology in a cost-effective and open Cloud environment. In addition, the world’s #1 database includes a series of innovations that add state-of-the-art technology while preserving customer investments and supporting their transition to the Cloud. Ellison shared detailed analysis during his keynote that showed how the new Oracle DBaaS delivers unparalleled performance for analytics, online transaction processing (OLTP) and mixed database workloads. In a direct comparison between Oracle DBaaS and Amazon databases, Ellison shared the following analysis:

Oracle Cloud Database is dramatically faster than Amazon Cloud Databases:
- Oracle Cloud is up to 105X faster for analytics than Amazon Redshift
- Oracle Cloud is up to 35X faster for OLTP than Amazon Aurora

Amazon is 20 years behind Oracle in database technology:
- Amazon Aurora is missing critical OLTP features that Oracle shipped 20 years ago, including scalable read-write clusters, parallel SQL and the ability to replicate encrypted databases
- Amazon Redshift is missing critical analytics features that Oracle shipped 20 years ago, including table partitioning, materialized views, support for rich data types and sophisticated query optimization

Amazon databases do not support mixed workloads:
- Oracle runs analytics workloads 1000+ times faster than Amazon Aurora
- Oracle runs OLTP workloads 1000+ times faster than Amazon Redshift

Amazon databases are more closed than IBM Mainframe databases, and are not compatible with on-premise enterprise database applications:
- Amazon Aurora, Amazon Redshift and Amazon DynamoDB only run on AWS
- With AWS, organizations can’t use dev/test for on-premises, can’t use disaster protection for on-premises, management is incompatible with on-premises
- Amazon databases are not compatible with existing enterprise database applications such as Oracle, DB2, SQL Server and Teradata and force organizations to throw away decades of on-premises investments

Ellison also demonstrated that AWS is not optimized for the Oracle Database:
- Oracle Database is up to 24X faster for analytics on the Oracle Cloud Platform than on Amazon Web Services
- Oracle Database is up to 8X faster for OLTP on the Oracle Cloud Platform than on Amazon Web Services
- AWS has limited storage performance: Amazon Elastic Block Storage limited to 48,000 IOPs/node, which is 8X slower than Oracle Cloud; Amazon Elastic Block Storage limited to 800 MB/sec/node, which is 19X slower than Oracle Cloud
- AWS cannot scale out Oracle across nodes: AWS provides no support for Oracle Real Application Clusters

Oracle is the only vendor with true workload portability across on-premises and Cloud deployments. 
This helps ensure customers can continue to leverage their existing investment, keep costs down and easily benefit from the efficiency of Cloud. With proven continuous innovations and industry-leading performance across the entire platform from infrastructure to database, including support for mixed workloads, Oracle Data Management Cloud is the leader today and in the future. ### Taking on Shadow IT and cloud sprawl Highlight launches Application Visibility (AppVis™) so enterprises can see and understand how their vital applications are performing, particularly the growing use of unauthorised cloud apps. Visibility is the first step in ensuring corporate data is well managed and helps to prevent accidental loss of information or security breaches. AppVis™ is one element in Highlight’s new Application Awareness™ suite of tools that delivers insight into the performance of business critical applications, wherever they’re hosted. It helps to manage shadow IT and contain cloud sprawl, and ensures that every customer interaction is a good experience. Presented in visually accessible charts, the cloud-based Highlight AppVis™ service automatically audits an IT estate to reveal every app, its location and usage patterns. With its friendly, concise and simplified reports, Highlight is not just for those who run the network, it is ideal for business level audiences. Richard Thomas, CEO of Highlight says, “It is so easy for individuals to sign up to a cloud app with a credit card and expose a corporations’ information. Seven out of ten UK workers use cloud technologies that are not supervised by their company* and nearly one in four organisations have no idea which “unofficial” apps are running on their IT infrastructure**. Organisations need to know what is happening and we can give them that insight, so that they can start from an informed position when deciding on cloud usage. Our aim is to help organisations to embrace cloud services and frame policies so people can use apps in a compliant, governed way. “This is a major shift in thinking,” he adds. “People no longer need to get hysterical about cloud security but rather use services like Highlight to help them manage this new world. In fact, more organisations are becoming aware that there are many valuable apps available in the cloud that can benefit their operations. With Highlight AppVis™ they can be fully aware of the apps being used and then develop security policies to ensure those apps are governed, compliant and budgeted correctly.” Service Providers and their Enterprise Customers can both view and understand the same information on Highlight AppVis™ to help deliver improved consistency, clarity, transparency, predictability and accountability. Highlight AppVis™ is built on Cisco’s latest Application Visibility and Control (AVC) solution. AppVis™ leverages multiple technologies to recognise, analyse, and control over 1000 applications, including voice and video, email, file sharing, gaming, peer-to-peer (P2P), and cloud-based applications.   About Highlight For over sixteen years, Highlight has been giving Enterprises clear, fast visibility into the performance of their technology – their public-facing websites, business-critical applications, and IT infrastructure. As well as delivering visibility and analytics, we help our enterprise customers manage the growing list of relationships they have with external providers. 
By improving these conversations, we strengthen these relationships and allow both sides to move forward confidently with projects to expand, deploy new technologies and transition to the cloud. Highlight is delivered to enterprises through our service provider partners, who range from Tier 1 providers with turnovers in the tens of billions, to boutique VARs with a headcount of twenty. They value Highlight’s fast and easy deployment, its simple pricing structures, the OPEX subscription-based format for predictable costs and the ability to scale with customers’ needs. The results include clarity of service delivery, reduced churn, improved customer advocacy and efficient cost management. The Highlight service is used in 90 countries, on 6,000 enterprise networks, including 40% of the FTSE-100. ### Duo Security Secures Access to the Microsoft Cloud Duo Security, a cloud-based Trusted Access provider protecting the world’s largest and fastest-growing companies, today announced that it’s helping thousands of Microsoft customers migrate workloads to the public cloud. Gartner forecasts that cloud application services (SaaS) will grow 20.3 percent in 2016, to $37.7 billion.* Much of this growth is driven by organizations moving on-premises Microsoft services to the cloud. Duo currently protects more than 8,000 Microsoft customers with its trusted access platform, with the most common workloads being email, Windows servers, remote desktops, and directory services. Read more at http://duo.sc/msftcloud Today, one out of every five corporate employees uses Office 365*, but a move to Office 365 can be complex, causing many organisations to migrate incrementally. As such, these organisations need to secure both their on-premises and cloud email infrastructure simultaneously. Thousands of organisations rely on Duo to secure access to both on-premises and cloud email environments. Organisations are also taking advantage of public cloud platforms for other Microsoft services, such as Windows servers and virtual desktop infrastructures. Today’s companies use a combination of Windows Remote Desktop Protocol (RDP) and other remote access services (RD Gateway, RD Web) to provide access to Windows servers and other business applications. Protecting access to remote Windows sessions, regardless of where they are hosted, is critical to security. Over 7,000 companies use Duo to protect on-premises and cloud-based Windows servers and virtual desktop infrastructures. Finally, many companies are utilising cloud-based Azure Active Directory to connect with popular cloud applications and improve the availability of their directory infrastructure. They also leverage Active Directory Federation Services (ADFS) to federate identities between multiple applications and Active Directory instances. Duo quickly integrates with both on-premises and cloud-based Active Directory and ADFS to enforce role-based policies for user access and authentication into any application. “One of the biggest reasons we moved to Microsoft Office 365 was to leverage the cloud-based email system,” said Chris Allamon, Vice President of Information Technology at Level One Bank. “After we converted our email system over, we gained some other benefits by also looking into SharePoint in the cloud. We use Duo single-sign-on with Microsoft Office 365.” To see a demo of how Duo can protect Microsoft services, visit Duo at Microsoft Ignite, hosted at the Georgia World Congress Center, booth #859. 
In addition Duo also announced the launch of their new single sign-on (http://duo.sc/ssopr) offering this week, meaning that in addition to Office 365 and other Microsoft applications, companies can provide a consistent and intuitive experience to access any other cloud application such as Salesforce, Box, Expensify, Slack and more. ### Survey Reveals A New Breed Of Exceptional Accountants Are Major Untapped Asset For Businesses Accountants currently seen as key to business growth, but miss out on decision-making 19 September 2016, LONDON – Research results released today by BlackLine, a leading provider of finance controls and automation software that enables Continuous Accounting, reveals that while accountants play a much larger role in achieving business success than previously believed, accounting departments overall are under-valued in their role to the business, with over one-third (34%) of respondents accepting that their financial teams are an underutilised resource. The survey, conducted with UK business decision-makers at medium and large enterprises, also found that many business leaders recognised the value that finance teams brought to longer-term business strategy – if faced with an economic downturn, around a third declared they would rather an ex-finance chief (34%) or ex-accountant (27%), run their business – over an ex-salesperson (13%). Even in a growing economy, more than three in five (63%) noted that sound financial and accounting processes and systems are the most important elements of well-managed growth. Whilst nearly one-third (30%) of business decision-maker respondents in BlackLine’s survey lauded their finance teams for contributing to the growth of the business beyond their role in financial management, the survey also found that finance teams can often get dragged down by time-sensitive reporting and month-end processes. Half of the business decision-makers surveyed noted that they felt their financial teams typically have an overly taxing volume of work. Andy Bottrill, Regional Vice President at BlackLine, commented: “To unlock the true value of an accounting team, businesses will need to provide the right tools that could automate the tedious manual work such as processing and reporting, and allow accountants to focus on high-value areas including fraud detection, compliance, and data analytics. Accountants could become Exceptional Accountants and on track to a CEO role.” Accounting is the most common career background of FTSE 100 CEOs according to the latest Robert Half FTSE 100 CEO Tracker, with almost one in four company bosses holding a chartered accounting qualification and 55% of chief executives coming from a finance background. Through Continuous Accounting, Exceptional Accountants can continually capture, validate and produce financial data throughout the month, to work in ‘real-time’ instead of in peaks-and-troughs. Implementing a continuous close function also enables a more accurate measurement of business progress, allowing underperformance to be quickly identified and corrected. This reduces ‘time to insight’ and accelerates ‘time to decision’, empowering Exceptional Accountants to gain more time to add strategic value to the business. Bottrill continued: “The results of this research are clear: finance teams have a huge untapped potential to act as growth drivers for their organisation. 
Without alleviating the tedious and sequential workload around reporting requirements through technologies like Continuous Accounting, businesses will miss out on exploiting the knowledge and capabilities of a new generation of accountants, what BlackLine calls Exceptional Accountants. Exceptional Accountants have a combination of deep knowledge of business performance and the ability to act strategically in the high-value areas such as data analytics and line-of-business advisory. They create innovation and opportunity within their organisation, and are pivotal to its success beyond simply managing finances.” About BlackLine BlackLine is a provider of cloud-based solutions for Finance & Accounting (F&A) that centralize and streamline financial close operations and other key F&A processes for midsize and large organizations.   Designed to complement ERP and other financial systems, the BlackLine Finance Controls & Automation Platform increases operational efficiency, real-time visibility, control and compliance to ensure end-to-end financial close management, fueling confidence throughout the entire accounting cycle. BlackLine’s mission is to continuously improve the quality, accuracy and efficiency of Finance & Accounting by centralizing key functions within a single, unified cloud platform. Enabling customers to move beyond outdated processes and point solutions to a Continuous Accounting model, in which real-time automation, controls and period-end tasks are embedded within day-to-day activities, BlackLine helps companies achieve Modern Finance status, ensuring more accurate and insightful financial statements and a more efficient financial close. More than 1,500 companies with users in approximately 120 countries around the world trust BlackLine to ensure balance sheet integrity and confidence in their financial statements. BlackLine is recognized by Gartner as a Leader in its 2016 Magic Quadrant for Financial Corporate Performance Management (FCPM) Solutions and as a pioneer of the Enhanced Financial Controls & Automation (EFCA) software category. Based in Los Angeles, BlackLine also has regional headquarters in London, Singapore and Sydney. ### The race for the future of driving In technology, first place isn’t everything. There are plenty of examples of businesses coming late to the party, perfecting an existing idea and walking away with the profits. After all, Apple didn’t invent the MP3 player, smartphone or the tablet. However, implementing an idea first does offer some benefits, particularly when it comes to high-value products that consumers are unlikely to chop and change so readily. This is why the race is on for technology firms to perfect the next stage of automotive evolution: the self-driving car. [easy-tweet tweet="The race is on for technology firms to perfect the next stage of automotive evolution" hashtags="tech, IoT, driverless"] Who’s in the driving seat? Despite the technology only being at the testing stage, there are already a number of companies openly pursuing self-driving technology. These can be split into two groups: traditional automobile firms and modern technology corporations. When it comes to the former, Audi, BMW, Ford, Honda and Volkswagen are just a few of the big-name competitors, while Google, Apple and Tesla are some of the more prominent tech firms involved. However, with each business choosing to divulge varying amounts of information, it is far from clear which company is most likely to dominate the autonomous vehicle industry. 
Tesla, the electric car firm led by Elon Musk, has perhaps been the most open about its self-driving projects, with the company’s “auto-pilot” feature having been included (although not enabled) on every Tesla release since September 2014. However, the company suffered a setback in May when a Tesla driver was killed after the onboard software was unable to pick out a white lorry against the bright sky. It remains to be seen whether updates announced this week will be enough to assuage consumer worries. Google has also been happy to share information regarding its self-driving car, having already clocked up more than 1.5 million miles, while Apple’s progress has proven more secretive, with rumours of setbacks and layoffs having plagued the mysterious “Project Titan” for some time. For Tesla, Apple, Google and the more traditional automotive firms, this race is perhaps one that is more likely to be won by the tortoise than the hare. A premature release followed by an accident of any kind that can be attributed to a software or hardware error would be disastrous for the company involved and could set back the entire self-driving vehicle industry by years. [easy-tweet tweet="This race is perhaps one that is more likely to be won by the tortoise than the hare" hashtags="IoT, tech, driverless"] The home straight Whichever company ultimately comes out on top, they may still have a difficult task convincing consumers to let go of the wheel. Aside from safety concerns, many may find it difficult to transition from driver to passenger, particularly given Mr Musk’s recent suggestion that human-driven cars could become illegal in the future. Will self-driving cars be such an easy sell if they are viewed as the beginning of the end for driving itself? On the other side of the argument, Gerry Keaney, Chief Executive Officer of the British Vehicle Rental and Leasing Association (BVRLA), believes that autonomous vehicles are likely to receive a warm reception from the public. “Every demographic study you read about young people living in the major cities shows that they are increasingly less motivated to learn to drive,” he said. “They are less motivated to take their driving test and, where they do learn to drive, they are less interested in owning a vehicle.” Ultimately, it is surely a matter of ‘when’, not ‘if’, we will see self-driving cars filling our motorways. Reservations are understandable, but technological progress usually carries on regardless. If that weren’t the case, we’d still be travelling around by coach and horseback. ### Mitigating the Security Risks: migrating legacy applications to the Cloud For many organisations, an increasing proportion of the applications that they rely on for their day-to-day business operations are hosted in the cloud. This stands to reason: the cloud offers a range of benefits, from cost to flexibility and scalability, which makes it an attractive option at a time when the need to ‘do more with less’ is high on the corporate agenda. [easy-tweet tweet="The Cloud offers a range of benefits from cost to flexibility and scalability" hashtags="cloud, tech, IT"] However, moving legacy applications to the cloud comes with its own set of security challenges. The way that an application is used and hosted in the cloud could be significantly different from how the original application was deployed on-premise. With shared tenancy and multiple users using the same stack, organisations need to assess and remediate new vulnerabilities. 
Planning the journey, and understanding and mitigating these security risks, is key. Application Security Cloud computing has fast become the de-facto model of computing for many key applications and services. In fact, it is estimated that, by 2018, 50% of the applications running in public cloud environments will be considered mission-critical by the organisations that use them. The cloud evolution and all the benefits that come with hosted applications and services – reduced operating costs and an ‘on-demand’ consumption model - now mean that the cloud-first, or cloud-only, approach to applications will soon be the default option. As with any significant change, there are security risks. Yet where the focus for cloud security has typically been on the controls and checks in place protecting the infrastructure on the cloud provider side, organisations must also consider security from the perspective of the design and architecture of the application that’s running in their cloud. Migrating apps to the cloud introduces new issues arising from inter-connections and interactions between components, authentication, logging and key management. A critical part of the path to migration is identifying any missing or weak security controls or flaws in the application itself that could increase the risk of a breach. [easy-tweet tweet="A critical part of path to migration is identifying any missing or weak security controls" hashtags="tech, cloud, security"] Organisations need to start by assessing their strategy including: Defining the boundaries One of the issues to resolve is defining the lines of responsibility. For on-premise applications, the organisation is wholly responsible for security. In the cloud model, responsibilities shift to the cloud provider, so it’s important to know exactly what these are and where the ‘hand-off’ is. Many network and application level security assessments become strictly cloud provider activities – or they can only be conducted if the provider permits the action.  Some assurance activities that were typically conducted by security consulting firms are now responsibilities of cloud providers. For example, with a SaaS (Software as a Service) model, in which the cloud provider runs the infrastructure and application for the customer, they also own the majority of security responsibility and control. In the IaaS (Infrastructure as a Service) model, the customer runs and manages virtual machines in a software-defined environment. This means that the customer has the greatest security control and responsibility. Security Assessments A full security assessment can help to identify the specific areas of risk. This will not only ensure that the application works optimally in the cloud, but also that there are no new risks as a result of the migration. [easy-tweet tweet="The cloud platform layer controls testing applications" hashtags="cloud, tech, security"] This should include a configuration review: assessing the cloud platform layer controls and testing applications for vulnerabilities, remediating what is found. Performing deeper testing, combining static testing (SAST) and dynamic testing (DAST) and code review will identify the risks.  Reviews of the design and architecture of the application, along with threat modelling, will identify where the risks are and where action needs to be taken.  This may highlight issues such as how authentication and authorisation is designed for the application, analysing the encryption mechanism and key management design. 
It will determine if you are protected against targeted threats such as malicious insiders, or an external cyber criminal. Ultimately, the responsibility for the security of the application rests with the organisation.  As more of us make the journey to the cloud, ensuring that application security has been addressed and that any flaws in design or architecture have been remediated - must become a core part of the migration strategy. ### Post-Brexit: What does this mean for UK companies storing data in the UK? The post-Brexit landscape is surprisingly similar to the one that preceded that momentous vote on 23 June. Far from bringing an end to months of political and economic uncertainty, the Leave vote and the ambiguity over what happens next continues to cause consternation amongst both UK companies and foreign investors alike. [easy-tweet tweet="The post-Brexit landscape is surprisingly similar to the one that preceded 23rd June." user="m7msuk" hashtags="cloud, tech, Brexit"] The technology sector, which has been attempting to resolve the US vs. European data protection conundrum now faces further uncertainty, and many businesses are particularly concerned about what form the UK’s data protection laws will take once the country has formally completed its divorce from the European Union. Although the UK is capable of making its own data protection laws, as with the 1998 Data Protection Act, as a member of the European Union it was also subject to EU laws, including the Data Protection Directive. Upon leaving the EU, the UK will no longer have to abide by this ruling, or its replacement, the General Data Protection Regulation (GDPR). The new rules, which come into effect on May 25 2018, concern users’ ‘right to be forgotten’, the right for them to know when their data has been compromised as well as the right to transfer data to another service provider without the fear of vendor lock-in. While it is highly likely that the UK will continue to abide by EU data protection regulations post-Brexit, it is understandable that some UK businesses are looking at the current uncertainty with some concern. Fortunately, there is a way for UK organisations to put this worry to one side and start planning for the future. The role of the MSP in a post-Brexit world As the growth of Public and Private Cloud offerings attract more IT investment, it becomes more important than ever for businesses to choose their managed service provider (MSP) wisely. By opting for a Managed Services partner who operates from UK-based data centres, businesses based here can ensure that national data protection regulations are met. That way, even if the UK and EU opt for different data protection laws, your UK business can continue as normal. [easy-tweet tweet="It's understandable that some UK companies look at the current climate with some concern." user="m7msuk" hashtags="brexit, tech, cloud"] Although M7 Managed Services have always taken a global approach to our customer and vendor partnerships, managing applications in IBM SoftLayer for example, our Private Cloud and Managed Services offerings are hosted on leading technology infrastructure, located in UK data centres. All of our customers’ data remains in the UK, ensuring that it meets compliance and regulatory standards. 
With more than 15 years of experience in infrastructure hosting and managed services and boasting a broad range of industry accreditations, we believe we have the expertise and resources to keep data safe and secure, while still allowing customers to collaborate with their partners all over the world. Whether your business needs application hosting, off-site data replication or disaster recovery, M7 has a UK-based solution that takes away the post-Brexit data conundrum. ### 10 Questions CIOs should ask before moving operations to the cloud There are a large number of enterprise-level CIOs who are planning to move their legacy IT operations to cloud environments in the next few years. When you ask them what their cloud architecture will look like or how those environments will be managed internally, their answers get very complex. The truth is, when the transition is ill-prepared, cloud deployments can create just as many problems as they solve. A sound onboarding plan is key to smooth migration. [easy-tweet tweet="When transition is ill-prepared, cloud deployments create as many problems as they solve." hashtags="cloud, tech, security"] Based on research from across the industry, SingleHop created the infographic below — detailing 10 questions CIOs should ask before moving operations to the cloud. Infographic created by SingleHop ### A Drone with Arms - Gadget of the Week Claws on a Drone - Perhaps not exactly the best way to make the drone revolution look less terrifying, but we have it now and the possibilities are endless. Feel like you haven't got enough cats in your life? Well, the world has potentially become a giant arcade claw game for you as you zip around the neighbourhood (literally) picking up the local feline community with ProDrone's latest offering, which is rather snappily named the "PD6B-AW-ARM". [easy-tweet tweet="The world has potentially become a giant arcade claw game for you with this drone" hashtags="drone, tech, IoT"] When thinking about this claw-handed drone that can fly at 6km per hour with a battery life of 30 minutes, it's hard not to revel in the possibility of attaching small weapons to the machines and conducting air battles. The advert itself doesn't seem to do the gadget justice as we watch this mighty piece of tech fulfil its purpose in... moving patio chairs. The mechanical arms are strong enough to lift 22 pounds (just over half of the drone's bodyweight). The eerie-looking tech's Japanese creator says the toy will have a range of industrial uses (beyond the moving of patio chairs, you'd have to assume), from cutting cables and lifting cargo to perhaps even dropping a life-saving buoy. Most frighteningly, the drone can use its claws to 'perch' like a giant robotic bird on ledges or railings. Despite being the pin-up drone for a dystopian future, it can't be ignored that these robotic devices will ultimately assist in heavy-duty jobs by reaching places more quickly and easily than their human counterparts. Of course, there are obvious negative connotations of the technology and fears rumble on of the drones being used for attacks on innocents, but this can be applied to most new technology. ### It’s time to DROP the same password In the past week, yet another corporate brand has been hacked, impacting up to 70 million users. Red-faced by the original email attack in 2012, Dropbox has had to release another statement to users to reset their passwords. 
[easy-tweet tweet="Dropbox has had to release another statement to users to reset their passwords" hashtags="security, tech, cloud"] In reality, this is not only a consumer but also a corporate security issue. Today we are using more and more cloud based services that morph into our daily lives. Personal and business accounts are becoming more bimodal (i.e. having two sets or modes of operation within IT) to keep up with business and digital transformation change. Right now the best way for organisations to support this mode is to consider how a leakage can be stopped even before it is compromised. DLP (Data Leakage Protection) is paramount in today’s 24/7 world. Some companies are finding it difficult to manage data protection and security on their own as there are so many paths to protect (i.e. monitoring, encrypting, filtering, blocking sensitive information kept at rest or in motion within the managed IT environment from perimeter to end point devices). Realistically, considering the following is just a start to developing and maintaining security: Stopping the stupid  - 73 percent of IT leakage is due to accidents or stupidity, according to data from InfoWatch (e.g. Laptops or USB keys lost or left, people sharing valuable data via email or even via personal file sharing sites). Putting in place the correct endpoint security protection. Training, training and re-training on ways to 'stop the stupid' and the risks to brand damage. Enhancing web gateway protection. Update Network Access Control (NAC) regularly - ensuring only company based devices are connected to the network and compliant with the organisation’s security policy. Build in end user analytics tools that can determine abnormal behaviour. Data leakage is definitely on the upward trend where more personal data leaks are occurring. Reported leaks globally in 2015 by InfoWatch were an astonishing 1500 and counting. This is a staggering 61 percent increase since 2012 when the alleged Dropbox attack occurred. Certain industries are targeted higher than others by external intruders such as high tech, manufacturing and retail versus Local Government. Surprisingly, healthcare is the only sector where insider leaks are the highest. This could be for a variety of reasons including mistrust of drug and medicine companies, including overcharging, which soon gathers momentum on social media. Mylan Inc has already seen brand impact with their share price dropping over 4 percent since last week with their price increase for EpiPens in the US. To put into perspective the impact on data leaks, last year alone saw at least 21 mega leaks which resulted in the loss of over 10 million records per brand. Organisations need to focus their efforts on containment, risk aversion and brand damage limitation. It’s the PR nightmare that most CxO’s in any organisation dread and result in sleepless nights. Today, reputational brand damage is shared on social channels  way before the organisation's PR team can make a publicity statement. Consider the Co-Op scandal of 2014 where salary data was leaked. This caused the CEO to resign and comment that it was unable to adopt professional and commercial governance. No-one recalls the CEO and his team rescuing the banking arm of the business from its largest crisis in its 150 year history. [easy-tweet tweet="Organisations need to focus their efforts on containment, risk aversion and brand damage limitation." 
hashtags="tech, security, cloud"] Originators of leaks vary but on the whole come from existing employees, Senior Execs and IT departments System Administrators (1.4 percent) (see  diagram below). Each leakage is not dissimilar and often occurs through malice or an inability to keep company data secure. In 2015 alone nearly 8 percent of the overall leaks were caused by access rights abuse, manipulation of data or data leaked on a “need to know basis”, according to InfoWatch's 2015 Data Leakage report. Leakage of Twitter’s quarterly update plunged shares by at least 18% in a single day and took some months to recover. Source of diagram: Data Leakage Report 2015, InfoWatch In summary, consider the points above and never re-use passwords (especially personal and business ones).   Brand damage can be very severe to an organisation, you only have to look at the financial impact reported on TalkTalk last year. ### Computer Says... Maybe to AI in the UK New research commissioned by IP EXPO Europe, Europe's number one enterprise IT event, shows that although 37 percent of respondents think that Artificial Intelligence (AI) will be a main technology theme for enterprises in 2017, there are mixed signals in terms of business application, investment levels and impact on society from UK IT decision makers (ITDMs). The results reveal that a significant 32 percent of respondents claim they are concerned about AI replacing human jobs overall, with 19 percent admitting that they are more worried about their job being overtaken by a robot than they were 12 months ago. Also, there is a clear perceived risk factor, with almost a third worried about the unknown impacts of increased AI. 22 percent of respondents also voiced concerns about AI providing another route for cyber security attacks. Interestingly, 1 in 10 UK ITDMs thought that no form of artificial intelligence was of value to business. However, the results also indicate that any hesitancy in uptake perhaps comes from a lack of immediately recognised business/organisational applications and perceived cost. In fact, 21 percent noted that although they didn’t think AI applied to their business right now, it could be one day. Another 18 percent saw the advantages of increased AI automation and machine learning technologies but claimed such initiatives were ‘too expensive’ to implement immediately. In parallel with these AI ‘non-believers’, the survey also showed a more positive take on AI from some respondents. 28 percent agreed that there are enormous benefits to be had and that they need artificial intelligence in their business now. A very confident 35 percent of ITDMs have never been worried about their job being overtaken by a robot. In fact, they think that it will make for a more efficient, smarter workforce enabling humans to focus on more high-value concerns. Automation (26 percent) and machine learning (23 percent) topped the list of types of AI most valuable to businesses, followed by cognitive computing (13 percent) and robotics or co-botics (12 percent). While 11 percent said that AI had already hit the workplace, some thought it would take longer to see the real impact with a quarter of respondents (24 percent) predicting it would take up to 3 years, 15 percent up to 4 years and 23 percent up to 5 years. This mixed picture is reflected in predicted investment levels. 
While 27 percent aren’t intending to invest anything at all in AI over the next 12 months, 36 percent are planning to allocate up to 30 percent of their total IT budget in some form of AI in the next 12 months. More interestingly, 22 percent will be investing between 31 percent and 50 percent, with 16 percent claiming it will be over 50 percent. Bradley Maule-ffinch, Director of Strategy for IP EXPO Europe comments; “With Nick Bostrom, one of the world’s foremost authorities on Artificial Intelligence as our opening keynote this year, we were keen to investigate the realities of the state of AI within the UK IT industry. The results reinforce that while AI is unarguably a ‘hot topic’, a range of unanswered questions still exist about the implementation, application and real benefits of AI in business. It is sure to be a highly debated subject at this year’s event.” Bostrom, Author & Founding Director of Oxford University’s Future of Humanity Institute will be the opening keynote at Day One of this year’s IP EXPO Europe. Bostrom will be discussing the impact that AI and intelligent machines will have on business and society, and sets out to answer the question: ‘Will AI bring about the end of humanity?’ IP EXPO EUROPE EXHIBITOR COMMENTS: Natalie Keightley, customer experience evangelist at Avaya: “Artificial Intelligence in enterprise is fast becoming a reality. And it’s in customer service where it seems to be gaining the most traction, most rapid. A range of AI, including chatbots, is increasingly being used to take away the menial tasks from agents, allowing them to focus on the human element that is so crucial to driving customer satisfaction and enabling them to provide better and warmer collaboration with their clients. The application of artificial intelligence to deliver on the combined objectives of the first-touch resolution and immersive experiences is moving beyond the conceptual stage. Avaya is already working on a self-learning chatbot that should be available in a few months. This type of AI can predict customer preferences and resolve problems – almost before the client knows they have one. Ultimately, though, there is one key element that will never change regardless of technological advancements: the customer. I believe AI will support this continued focus on the client because it will enable agents to spend more time assisting clients in the best possible way – the personal way.” Graham Jones, UK managing director, Exclusive Networks: “AI is already a factor in cyber security with advanced User Behaviour Analytics technology, from vendors like Exabeam, that analyses and risk assesses behaviours, aiding the detection, response to, and remediation of, advanced attacks. But in ten years the complexity of the digital world will be unrecognisable from today and I expect huge advances in processing power to enable advanced tracking, analytics and automated decision making to seep further and deeper into the wider world of humans and society. Perhaps we eventually become one of the things in the Internet of Things? Although, sizeable alarm bells will be centred on how far this is allowed to go in the name of security and who will make those decisions.“ Ojas Rege, Chief Strategy Officer at MobileIron: “My master’s thesis back in 1988 was on the use of artificial intelligence in systems design and testing, so this is a question near and dear to my heart. Identifying and learning from patterns in data will be a double-edged sword in mobile security. 
“On the one hand, the risk will be heightened as existing mobile threats become smarter and mutate more quickly to exploit chinks in the security armour. On the other hand, mobile security will be made more robust as threat prevention systems benefit from insights from massive data sets on past and present attacks. These systems will become smarter and more efficient at suggesting defensive actions to neutralise threats. The next ten years will be an arms race between hackers and security providers as both harness the very real power that AI holds for mobility security.” To register for IP EXPO Europe 2016 for free please visit www.ipexpoeurope.com where you can also find additional information about this year’s keynote and seminar sessions, including speaking times. Find us on Twitter and join the discussion using #IPEXPO. ### Diamonds are forever – Especially on the Blockchain If you haven’t been hibernating in a cave for the last 12 months, you will be familiar with the infrastructure called blockchain. This is not a new concept, and I’m sure we are all familiar with the Bitcoin story that utilises the blockchain and the reputation that surrounds Bitcoin from illegal purchases. As a quick refresher for the uninitiated in this technology, I will try to explain very briefly what this technology is before I move on to what I really want to discuss: diamonds! Blockchain is a distributed public ledger for recording transactions of things of value. In its rawest form, blockchain is an automated way to record all transactions in a way you can trust, at very little cost. Couple this with distributed nodes that can be housed globally, and this method of record keeping/transaction logging becomes practically impossible to corrupt. [easy-tweet tweet="Blockchain is a distributed public ledger to record transactions of things of value." hashtags="blockchain, tech, cloud"] So what has this got to do with diamonds? Well, quite a lot, considering £50 billion is lost annually across the US and Europe to insurance fraud, with 65% of fraudulent claims going undetected. Couple this with £100 million lost each year to jewellery theft, and there is a clear need to ensure the authenticity of objects such as diamonds to reduce the risk of fraud. This is precisely where Everledger enters the marketplace. It is built upon the fabric of blockchain, with the assurance of IBM technology (LinuxONE), and with some amazing elements that ensure the authenticity, certification, security and, quite literally, tracking of these gems! The Everledger Blockchain Story So how does Everledger do this? Well, let's go into a bit more detail on what I discovered when interviewing Leanne Kemp, CEO and Founder of Everledger. Everledger quite literally is a digital vault that allows you to trace, recall the history of and validate goods in the luxury space. With diamonds, 40 metadata points are collected and matched to the laser inscription on the stone, along with information on the stone’s origin and history. This is then digitised and stored on the blockchain, creating a digital footprint that can be validated, traced and checked with smart contracts. This is a game changer for the financial services and insurance industry. Everledger may have created the killer solution that the blockchain has been waiting for. Imagine you see that lovely engagement ring on eBay that would look fantastic on your partner’s finger. How do you know that it’s genuine? A piece of paper right? 
Well, with Everledger I could tell you the complete history of that diamond, as it will have its unique digital DNA signature that can be recalled and validated from the blockchain. From its mining origin to its authenticity, history and more, Everledger ensures the provenance of the diamond. When discussing more details with Leanne, she spoke highly of IBM’s commitment to blockchain and her business. Running a Secure Blockchain on IBM LinuxONE Leanne states: “When we looked at tackling global problems while ensuring transactional security, blockchain is the fabric that brings together the security and scalability that is essential to building a solution, and IBM plays an important part in achieving this with their LinuxONE infrastructure. When you see a firm that’s over 100 years old and has the scale alongside the agility to change direction when needed, that’s a company you need as a trusted technology partner.” [easy-tweet tweet="IBM plays an important part in bringing things together with their LinuxONE infrastructure." hashtags="cloud, tech, IBM, blockchain"] This is a bold statement from Leanne, but it’s backed up with hard facts that demonstrate IBM is out in front in the blockchain space. Let’s face it: they have a “Blockchain in the Cloud” (the Blockchain High-Security Network) available to test and develop upon, resources and experience that are truly global, and some of the most amazing academic initiatives available for the support of the agile start-ups we see today. This is coupled with a development platform (Bluemix) that is free to try that killer API out on, without burning large holes in your pockets for DevOps. Beyond diamonds—the future of Blockchain Diamonds are only the beginning for Leanne and Everledger. Leanne states further: “This technology is suited to any high-value goods that are produced today, and we are already working on significant developments for other goods/markets”. My final comments on this are simple. Everledger could just be that killer application that the blockchain needs to break free from the bitcoin stigma. The applications for this technology can quite literally change the way we look at contracts (replaced with smart contracts) and how we trace and track items to identify their authenticity without added risk. I’m looking forward to catching up with Leanne again at the IP Expo in London after the IBM Edge conference, as she has created something very special, built on amazing technology that scales infinitely from volume to products. In the very near future, cash will be replaced with digital currency; it's inevitable, and for good reason. It lowers risk, it increases accountability, and it scales while applying the appropriate security. Learn more about Everledger and Blockchain at IBM Edge If you are going to the IBM Edge conference in Las Vegas from the 19th of September, please do make a point of listening to Leanne present. She could quite possibly have the unicorn that blockchain has been waiting for outside the financial industry. If you’re not going to IBM Edge, you can listen to Leanne present online by watching the conference Livestream at IBMgo.com on Sept 20th from 8:30 am PST to 10:00 am PST, or 11:30 am EST to 1 pm EST (UK time, 4:30 pm to 6:00 pm). Register to Listen at IBM Go. 
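The "digital DNA" idea described above boils down to a tamper-evident record: hash the stone's identifying metadata, chain each record to the one before it, and any later alteration becomes detectable. The sketch below is a minimal, hypothetical illustration of that principle in Python; it is not Everledger's implementation, and the field names (inscription_id, carat, origin and so on) are invented purely for the example.

```python
import hashlib
import json
import time

def record_hash(record: dict) -> str:
    """Deterministically hash a record (sorted keys keep the hash stable)."""
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

class DiamondLedger:
    """Toy append-only ledger: each entry commits to the previous entry's hash."""

    def __init__(self):
        self.chain = []

    def add_stone(self, inscription_id: str, metadata: dict) -> dict:
        entry = {
            "inscription_id": inscription_id,   # laser inscription on the stone (hypothetical field)
            "metadata": metadata,               # e.g. cut, carat, colour, origin
            "timestamp": time.time(),
            "prev_hash": self.chain[-1]["hash"] if self.chain else "0" * 64,
        }
        entry["hash"] = record_hash({k: v for k, v in entry.items() if k != "hash"})
        self.chain.append(entry)
        return entry

    def verify(self) -> bool:
        """Re-hash every entry and check the prev_hash links are intact."""
        for i, entry in enumerate(self.chain):
            body = {k: v for k, v in entry.items() if k != "hash"}
            if entry["hash"] != record_hash(body):
                return False
            if i > 0 and entry["prev_hash"] != self.chain[i - 1]["hash"]:
                return False
        return True

ledger = DiamondLedger()
ledger.add_stone("GIA-123456", {"carat": 1.01, "cut": "round", "colour": "D", "origin": "Botswana"})
print(ledger.verify())  # True; editing any stored metadata afterwards breaks verification
```

In a real deployment the record would be written to a shared, distributed ledger rather than an in-memory list, but the verification step (re-hash and compare) is the same idea the article describes for validating a stone against its blockchain footprint.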
### TeleWare achieves world-first real-time call routing through Microsoft Azure

Communications technology business TeleWare has today announced that at midday on 13th September it was able to route, record and analyse a call in real time through Microsoft’s Azure cloud computing platform in a fully encrypted, compliant environment. Microsoft has confirmed this is a world first.

It had been thought that attempting to use an entirely cloud-based system to route, record and analyse a call would result in poor call quality due to loss of data packets and latency on the call. Previously, this would have required physical servers and hardware. However, TeleWare has been able to develop and enhance its solutions to achieve a seamless call that is of the same quality as one using physical equipment. This world first was carried out with a TeleWare customer, as opposed to a test environment, demonstrating that this is achievable without further enhancements to already deployed products.

Rob Corrigill, CTO, TeleWare said, “Many had advised that this wasn’t possible as the cloud just ‘wasn’t ready for this yet’ and we’ve certainly had to overcome some technical hurdles. However, our technology teams, whether they be development, infrastructure or the Network Operations Centre (NOC), have all pulled together with one common goal and have achieved the ‘impossible’. This is deeply significant as it demonstrates that our overall goal of going 100% cloud is possible.”

Steve Haworth, CEO, TeleWare said, “We’re delighted that the team has achieved this ground-breaking milestone, which has wide-reaching implications, not just for us, but for the wider Azure and cloud community. This is a huge accomplishment, which couldn’t have been made possible without the stellar work of the infrastructure team - each and every one of them should be congratulated.”

TeleWare’s fixed-line recording product will fully migrate to the cloud, with its MVNO product also planned to migrate in the coming quarters.

### Greater than the sum of its parts – why learning is key to DevOps adoption

DevOps, a set of best practices to enable software delivery teams to consistently deliver high-quality software at speed, has become an integral part of the modern enterprise. Utilised in industries ranging from automotive and finance to retail and telecoms, DevOps adoption within the enterprise is growing continuously, according to recent research into the state of cloud technology. Figures in 2016 highlight an 8 percent increase on the year prior, to 74 percent.

[easy-tweet tweet="DevOps has become an integral part of the modern enterprise." hashtags="tech, cloud, DevOps"]

The benefits of DevOps are vast, and can be summarised as a means of creating more value for businesses by breaking down barriers and encouraging collaboration between departments to solve problems together. The popularity of DevOps is, however, also its most significant sticking point. The pace of technological innovation has magnified the success of this movement and its definition is often considered in terms of technology adoption alone. A successful transition to DevOps will not only take technology into consideration, but people and processes too.

Processes are at the heart of all enterprises - to minimise disruption during the transition to DevOps, teams must be empowered to respond to change. This requires that teams not only identify a need and suggest improvements, but also have the tools to measure and action these new developments on a continual basis.
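As a minimal sketch of what “measure and action on a continual basis” can look like in practice, the snippet below tracks a single delivery metric against a target and reports whether the latest improvement experiment moved the team closer to it. The metric, target and figures are invented assumptions for illustration, not prescribed values.

```python
# Track one delivery metric (commit-to-production lead time, in hours)
# against a target condition and judge the latest improvement experiment.
from statistics import mean

baseline_lead_times = [52, 47, 60, 55]   # releases before the experiment (hypothetical)
current_lead_times = [41, 45, 38, 44]    # releases after the experiment (hypothetical)
target_lead_time = 36                    # the agreed near-term target condition

def assess(baseline, current, target):
    before, after = mean(baseline), mean(current)
    print(f"baseline: {before:.1f}h, current: {after:.1f}h, target: {target}h")
    if after <= target:
        print("Target condition reached - agree the next one.")
    elif after < before:
        print("Improvement, but not yet at target - run the next small experiment.")
    else:
        print("No improvement - revisit the current condition and try a different experiment.")

assess(baseline_lead_times, current_lead_times, target_lead_time)
```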
[easy-tweet tweet="Successful transition to DevOps will take people and processes into consideration." hashtags="cloud, tech, DevOps"] Learning as a tool to boost DevOps adoption This idea of continuous learning is borrowed from manufacturing - more specifically, the innovation in process improvement that came from the Toyota factories in the 1980s. As explained by Mike Rother in his book The Toyota Kata (McGraw-Hill Education), an Improvement Kata is a fundamental pattern for improving, adapting and innovating. This is achieved through deliberate practice in daily working, where we are effectively taught how to learn until it becomes second nature. To successfully implement this approach, however, the following points must be adhered to: Firstly, understand your desired direction – create a shared understanding of what needs to be improved Secondly, grasp the current condition – understand the state of play by describing processes as they really are. Create and measure a set of metrics that define the actual processes and the desired outcomes Thirdly, establish the next target condition. Create a shared understanding of the target state in the near future, perhaps in four weeks’ time. Define the new process and the metrics that validate the process Fourthly, run small experiments to help move the organisation towards the target Finally, repeat this exercise until the improvements become self-evident! These focussed iterations of incremental change encourage individuals to work in teams to learn about current processes, and to identify and solve problems and inefficiencies. DevOps is no longer restricted to developers and operations. Instead, we have seen a shift towards educating everyone in the software delivery cycle on continual improvement – teaching teams to solve their own problems and boosting productivity in the process. To see greater industry adoption, addressing DevOps as a holistic enabler across people, process and technology is vital. DevOps continues to solve challenges within businesses, revealing waste and creating more value. Companies are becoming more aware of efficient ways of working; it will take a renewed push for them to realise it is DevOps. ### 9 Steps to Develop a Game App Ever wondered how a crazy game like Pokemon Go reached to insanely high popularity? The gaming community is a vast ocean and the new gaming apps that have the capability to attract the users, create a wildfire amidst it while other apps just lay there almost invisible to the gamers. [easy-tweet tweet="Developing an app that creates such an effect on the audience is an achievable task" hashtags="app, gaming, tech"] Developing a game app which creates such a lightning effect on the audience is an achievable task provided, correct methods are taken into account. The vast genres of games in the market make it really difficult to figure out which type of game will make gamers catch their breath in astonishment. Whether will it be a match 3 game like Candy Crush or time consumer like Clash of Clans? Who knows? A sound market research will help you in figuring that out, only then you will be able to sketch out a scalable model for your app. This visual infographic takes into account all the basic facets of mobile app development. The infographic gives you a brief insight on the following: Market Research Concept idea of the app Designing Register as a developer Hire a developer Coding Testing is the key Take feedbacks and improve Marketing ### Voice in the Cloud – What’s Stopping You? 
The Cloud is no longer unfamiliar territory. By now, most businesses have some part of their IT infrastructure operating from the Cloud, whether it’s security, storage or business applications. So why aren’t more companies moving their voice infrastructure into the Cloud?

[easy-tweet tweet="By now, most businesses have some part of their IT infrastructure operating from the Cloud" hashtags="cloud, tech, IT"]

One important reason enterprises may be reluctant to transition to a hosted Unified Communications (UC) solution is concern about what to do with existing PBX infrastructure and PSTN access. Enterprises are careful with their budgets, so it doesn’t make sense to simply dispose of a legacy solution, especially if the solution is still functioning and serves the business well.

On the plus side, Cloud-hosted platforms have a compelling financial impact. Hosted platforms have predictable costs, which relieves businesses of major capital expenditure. In fact, from a purely capital expenditure (CAPEX) perspective, a move to Cloud UC eliminates expensive upfront capital investments and the extended pay-back periods needed to cover hardware expenses. A Cloud-based solution also means easier and more cost-effective future software upgrades - a huge benefit in a world where technology develops so quickly.

Then there are the savings on operational expenditure (OPEX). With Cloud UC you only pay for what you use, when you use it. This makes the choice of Cloud UC even more compelling, especially for enterprises that need to scale capacity up or down. Management of the communications network is also simplified. This frees up IT staff and reduces expensive service visits, because it’s simple to update software, add new lines and troubleshoot yourself.

All those benefits aside, enterprises are still concerned about their legacy investments. What enterprises need is a way to migrate to a hosted platform at their own pace while minimising upfront capital expenditure. If a scalable, step-by-step solution can be combined with a sense of familiarity, then that is a win-win. Familiarity helps remove the fear and uncertainty that often comes with implementing new technology.

[easy-tweet tweet="With Cloud UC you only pay for what you use, when you use it." hashtags="cloud, tech, UC, IT"]

Choose a name your staff will trust: Microsoft Skype for Business is already used by millions of businesses and has sold enough licenses to grow into a top three vendor (in terms of market share) in the global UC market. The launch of Microsoft’s Cloud Connector Edition (CCE) demonstrates that Microsoft is well aware enterprises need help migrating to Cloud UC at a pace that suits them. CCE is software that essentially allows enterprises to leverage on-premises PSTN connectivity with Office 365. While the software is free of charge and downloadable from Microsoft, enterprises still need to source a server and have qualified technicians perform the installation. This usually takes some time to get up and running, and it could cause some disruption to the business during operating hours. Furthermore, Microsoft strongly recommends a Session Border Controller (SBC) for security, dial plan integration and interworking between the PBX and Skype for Business endpoints, so that will also need to be sourced separately. While it sounds complicated, help is on the way in the form of clever solutions that bridge the gap between Microsoft CCE and enterprise SBCs - and they take only a fraction of the time to set up.
What help looks like Enterprises should look for a solution that integrates Microsoft CCE directly to their SBCs, essentially an “all-in-one” appliance that is easy to install. Keep an eye out for solutions that lower the total cost of ownership as well. One way of doing this is by matching Microsoft’s maximum throughput capacity, which is 500 concurrent calls in a single appliance. If you deploy an appliance that can’t scale to the 500 concurrent call capacity, you may have to purchase another appliance to meet your enterprise’s growth trajectory and demands doubling your investment. Another thing: flexibility is key. Enterprises should choose a solution that can be deployed either virtually or as a hardware appliance. And with security top of mind, it’s imperative to find a solution that provides comprehensive security features, including topology hiding and protection against Denial of Service attacks. [easy-tweet tweet="Enterprises should choose a solution that can be deployed either virtually or as hardware." hashtags="cloud, tech, IT"] With the right solution to link Microsoft CCE and existing SBC, enterprises can maintain POTS lines, fax machines, elevator phones, courtesy phones, paging systems and emergency lines – even present a single call back number regardless of which Cloud PBX user is making a call. So if you’re still “on the fence” about Cloud PBX, there are some purpose-built, game-changing appliances out there waiting to help you migrate to the Cloud on your terms. ### Finland – experts in IOT pull together to smarten up the UK utilities industry Finpro, the official Finnish export and investment promotion agency, is bringing together IOT and smart utility experts to showcase the latest global trends and innovation in renewable energy solutions at RWM, Energy, Water and Renewables event on stand G36-H39 at the NEC from the 13-14th of September 2016. Avarea, BaseN, Distence, Ionsign, Refecor and Wirepas will be showcasing their technologies in big data, analytics, asset management, M2M and gamification. Finland is leading the drive for renewable energy solutions and offers the smart utilities industry a wide range of solutions for several critical utility infrastructure segments, such as energy production (heat/cooling/power), water and waste treatment and related smart grid segments. Using big data and IoT, the solutions typically consist of extensive sensor networks that are connected to monitoring, optimisation and automation solutions. The data can be enriched with various external data, such as weather or financial information, to improve the efficiency of operations. Reijo Smolander, senior advisor at Finpro, says: “A recent report by Markets&Markets showed IoT in the utility market is estimated to grow globally from $4.63 Billion in 2015 to $11.73 Billion by 2020. Finland is a leader in the smart utility sector with smart grid functionalities already operational and with its open energy market, the country is an excellent testing ground for smart solutions. Plus, Finnish companies have the IOT know-how, software and mobile expertise that can be used globally to deliver an immediate and valuable impact to the utility sector.” Efficient smart grid 2.0 infrastructure Finland has relentlessly developed its electricity distribution domain and many smart grid functionalities – AMR, load profiling, real-time billing, remote control and monitoring – are already implemented in the current system. 
The level of automation and ICT systems in network operation is high, and large-scale implementations of advanced AMR systems have opened up new possibilities to develop network management and the electricity market. Finland is also home to the world’s biggest single IoT mesh network, which enables smart devices to connect automatically without any configuration, infrastructure or third-party networks.

Capitalise your Knowledge, a Growth Program run by the official Finnish export and investment promotion agency Finpro, brings together Finnish companies in the field of Big Data and IoT, helping to improve energy efficiency – in a smart way. Exhibiting on the Finland stand at RWM and showcasing some of these cutting-edge technologies are:

- Avarea – Avarea has analytics competence, especially in customer analytics and profitability, with wide experience in analytics methodologies including general statistics, machine learning, text mining, econometrics, time-series forecasting, operations research and mathematical optimisation.
- BaseN – delivers low-cost, smart living solutions for consumers to monitor energy and water consumption and benchmark usage against their neighbours in a gamification style, plus high-quality IoT solutions for collecting environmental data – such as pollen counts, noise pollution, radiation levels and fertiliser levels.
- Distence – provides solutions that enable customers to make their existing assets intelligent and bring them into their business processes. Distence offers competitive advantage by supplying customers with relevant, on-time and usable information from their industrial assets and processes.
- IonSign – At RWM, IonSign is launching its Neutron Smart Gateway - part of the Neutron family of products for reading smart meters. The new Neutron Smart Gateway is a product for collecting 3rd party Utility Meter Data from many meters across multiple types of FieldBus and Pulse meters through one data transfer.
- Refecor – offers companies off-the-shelf expertise in product and concept design services and specialises in Machine to Machine and energy efficiency, automated meter reading, electronic products and wireless technologies design and production support.
- Wirepas – Wirepas Connectivity solutions flexibly enable large-scale smart metering in IoT applications. Wirepas technology is used in a variety of applications including Smart Meters (Electricity, Gas, Water), Asset Tracking and Logistics, Clean Tech (solar), Smart Buildings (connected Beacons, environmental sensors, lighting), Smart City (Street lighting), Smart Retail and Smart Logistics.

### How to stop hybrid cloud complexity affecting the bottom line

Many enterprises are finding their way through the current wave of digital transformation, and a key obstacle they have to overcome is the number of users who expect 24/7 access to applications across any number of their devices. A growing number of enterprises are migrating applications and information stores to private and public cloud solutions, so that hybrid IT architectures – where information is stored in the cloud as well as on local systems – have become widely adopted.

[easy-tweet tweet="Hybrid networks have made monitoring the performance of apps and systems more challenging" hashtags="cloud, tech, hybrid"]

These more complex networks have made monitoring the performance of applications and systems more challenging, costly and time-consuming for IT departments.
Enterprises around the world are realising just how challenging maintaining application performance levels has become within the cloud hybrid environment. Riverbed’s Global Application Performance Survey found enterprises which are more fragmented in their performance management approach face negative impacts on their productivity and, as a result, the bottom line. Companies require a fully integrated, proactive, agile approach to ensure application performance is conducive to business value. Staying Agile The Enterprise Management Associates’ 2016 ‘Network Management Megatrends’ study found 90 percent of organisations have established deployment plans in preparation for the hybrid IT infrastructures. Of this 90 percent, 70 percent have either completed or are in the process of deployment whilst the remaining 20 percent plan to deploy within the next two years. For the majority of these enterprises, deployment will be an ongoing process and will develop as applications progress and the business needs evolve. To increase agility within the organisation, IT departments need to regularly evaluate and adopt new cloud services to deliver applications more rapidly. Achieving agility is easier said than done. Applications often falter or fail for a myriad of reasons – problematic code, network outages or server failures to name a few. As such, IT departments are often blamed. Riverbed’s Global Application Performance Survey found 71 percent of users say they frequently feel “in the dark” regarding why their enterprise applications are running so slowly. The Bigger Picture Most organisations’ approach to application performance management is not comprehensive enough – often it is far too fragmented. Separate IT teams use different tools to monitor network traffic, real-time communication issues, infrastructure, and application performance-related problems. Individual teams see what they are responsible for rather than the network ecosystem as a whole. In addition, IT is faced with challenging intra-department communication due to the teams’ use of different metrics. As such, instead of fixing the problem straight away, IT teams struggle to determine the root cause of the issue – this reactive approach is a far cry from the necessary proactive approach required for a complex application-driven environment. [easy-tweet tweet="Most organisations’ approach to app performance management is not comprehensive enough" hashtags="tech, cloud, hybrid"] To counter this fragmented approach, enterprises should implement holistic, real-time, end-to-end visibility into cloud and on premise application performance throughout the entire network. This way, IT is able to establish an overall view of how apps are performing and how they impact end-user engagement. By identifying the cause of issues, IT can fix them before the user notices. This proactive approach improves overall performance, visibility into application performance results and thus increased productivity and revenue for the organisation. Furthermore, enterprises can expect improved customer service, product quality and employee engagement. New technology that allows for increased visibility, optimisation, and control ensures prime performance of enterprises’ applications – whilst maximising IT efficiency and productivity. In addition, new technology also permits IT to configure application infrastructure and architecture to respond to the organisation’s needs. 
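As a minimal sketch of what that end-to-end view can start from, the snippet below polls a mix of cloud-hosted and on-premise endpoints from one place and records response times, so slowdowns are flagged before users start complaining. The endpoint URLs and the latency threshold are hypothetical placeholders, not part of any particular monitoring product.

```python
# Poll a handful of cloud and on-premise health endpoints and flag slow responses.
import time
import urllib.request

ENDPOINTS = {
    "crm (public cloud)": "https://crm.example.com/health",     # hypothetical URL
    "intranet (on-prem)": "http://intranet.example.local/health",  # hypothetical URL
    "api gateway (hybrid)": "https://api.example.com/health",   # hypothetical URL
}
SLOW_THRESHOLD_SECONDS = 1.5  # illustrative threshold

def check(name: str, url: str) -> None:
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=5) as response:
            status = response.status
    except Exception as exc:
        print(f"{name}: DOWN ({exc})")
        return
    elapsed = time.monotonic() - start
    flag = "SLOW" if elapsed > SLOW_THRESHOLD_SECONDS else "OK"
    print(f"{name}: {flag} status={status} latency={elapsed:.2f}s")

for name, url in ENDPOINTS.items():
    check(name, url)
```

A production tool would of course add scheduling, alerting and trend storage on top, but the principle is the same: one consistent measurement across every location where applications run.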
The integration of these infrastructures with other systems within the network ensures complete delivery and the best possible user experience. Visibility through the cloud With businesses increased adoption of cloud technology, and the sheer mass of data that is being generated across networks – guaranteeing the best possible application performance is imperative for success. Businesses need to be more agile than ever, for they will only be able to stay competitive if they have a complete view of how applications are performing irrespective of their location. Once businesses have a holistic view of their network (both cloud and on premise) – they will have the performance insight they require to ensure they deliver and maintain business-critical applications – achieving increased productivity and an improved bottom line. ### Active Risk Manager now available on G-Cloud 8 Sword Active Risk has announced that Active Risk Manager (ARM) has been accepted onto the Government’s G-Cloud 8 digital marketplace, for NHS organisations. Available as a Software as a Service, ARM will help organisations to implement the risk management policy, processes and procedures published in the guidance for NHS England, in the UK Government Risk Management Framework (January 2015). The Government framework outlines its commitment to risk management to help identify, analyse, evaluate and control the risks for NHS organisations. The latest version of Active Risk Manager (ARM 9), released in July, includes new features that encourage more flexible, collaborative working and engagement for risk management across the organisation. The digital marketplace enables public sector bodies to search and procure approved services on the G-Cloud framework, without needing to run a full tender process. Nick Scully, Chief Executive Officer at Sword Active Risk stated; “This achievement is recognition of Sword Active Risk’s commitment to service and quality, enabling us to become an approved supplier to the Government’s digital marketplace. It signifies an important milestone in our business growth and one that a number of our customers in the UK public sector have been looking for us to achieve to assist them with their risk management strategy. “Risk management is becoming increasingly important for NHS organisations as they seek to manage threats and opportunities that may have an impact on reputation and their ability to deliver their statutory responsibilities and patient care. The digital marketplace will now enable them to procure and deploy our proven risk management cloud solution without incurring costly infrastructure or maintenance overheads.” ARM 9 has new features that support the risk management process within an organisation including Document Vault, which allows users to store documents within ARM against Activity, Portfolio, Risks, Audits and Incidents. It supports all commonly used file types including MS Word, Excel, PDF and image files, while new data views allow users to Risk Adjust for both project and organisation levels, encouraging wider adoption. ARM 9 also has Real Time Alerts that can be configured to call users to action and drive workflow enabling greater engagement within the organisation, as well as a multi copy facility that enables Project Risk Managers to distribute risk registers across large project schedules quickly and easily. 
G-Cloud 8 services went live on the Digital Marketplace on 1 August 2016, giving the public sector access to more approved services from suppliers of different sizes across the UK. The total number of suppliers on the G-Cloud framework (G7 and G8) is 2726 (90% SMEs) and the public sector now has access to over 26,000 services. Recent G-Cloud sales figures published by the Crown Commercial Service state a current total of £1,263,386,146 since the launch of the digital marketplace, of which over 50% has been awarded to SMEs. ### Moving to the cloud: fact isn’t stranger than fiction The transformative power of the cloud across the enterprise cannot be denied. Beyond business use, the availability of cloud deployment options has altered the way organisations use, implement and purchase technology. Yet, it is frequently oversimplified and even misunderstood. Some, mistakenly view it as a mysterious tool which can be used to solve any IT challenge, while cloud sceptics only see a buzzword with no real use case to support its uptake. [easy-tweet tweet="The availability of cloud options has altered the way organisations implement technology" hashtags="tech, cloud, business"] As might be expected, the truth lies somewhere between the two. The cloud can provide real, powerful benefits to organisations of every shape and size but must be carefully planned and executed every step of the way – just like every other valuable business strategy. When moving workloads to the cloud, carefully consider the five biggest cloud myths outlined below. Myth One – Migrating to the cloud is purely a technology decision First and foremost, moving to the cloud is a business decision and needs to be handled as such. When choosing the right cloud-based system to support an organisation’s business goals, the decision must rest on far more than whether the move is technically feasible and what type of cloud environment might work best. The IT department will need to work with business leaders across the company before migrating any workloads to the cloud in order to set up a truly holistic plan. Whether moving just one solution or deploying cloud technology across every department, this liaison is vital. The organisation’s strategic goals, potential productivity gains, possible business benefits or downsides, and information security concerns must all feed into the final plan. [easy-tweet tweet="Moving to the cloud is a business decision and needs to be handled as such." hashtags="cloud, tech, IT"] Myth Two – Moving to the cloud is not secure You may have heard the phrase: “There is no cloud, it’s just someone else’s computer.” Indeed, some seem to believe that businesses are happy to leave vital corporate data sitting under a desk in some unknown basement. In a 2015 study, the Cloud Security Alliance revealed that information security concerns are still the primary roadblock preventing businesses from moving systems to the cloud. Yet, there is a huge chasm between this perception and the truth. At its core, every cloud vendor’s business is IT service provision. This means they must guarantee that the technology, physical locations and personnel all comply with stringent security standards. When faced with today’s threat landscape, getting security right can be hard. In reality, many individual businesses don’t have access to the right level of resources to achieve the required high levels of security and compliance which are considered standard for cloud providers today. 
Data security is one of their core competencies and therefore taken seriously. [easy-tweet tweet="At its core, every cloud vendor’s business is IT service provision." hashtags="tech, cloud, IT"] That said, IT should never blindly trust a vendor’s assurances of security. It is worth asking cloud providers to establish their security capabilities and fully outline their security processes and certifications. It is also important for companies to get an understanding of the security, governance and regulatory compliance requirements within their particular industry. Without this knowledge, businesses cannot know what is required from a cloud vendor and therefore cannot ensure they are set up to provide the essential services and security levels. Myth Three – Cloud means Software-as-a-Service While Software-as-a-Service (SaaS) can be an obvious use-case for the cloud, it is far from the only one. Of course, many companies do assess how cloud technology can deliver business services, whether through e-commerce, supply chain transactions or infrastructure services. However, some organisations instead seek a managed services offering. This can include hosting applications in a private cloud deployment, with access to dedicated functionality, application customisation and integration. Myth Four – Cloud: go all in or not at all Perhaps unsurprisingly, many vendors advocate a cloud-first strategy as the only way forward. The reality is that moving every single IT asset to the cloud might not always make business, regulatory or technical sense. [easy-tweet tweet="Many vendors advocate a cloud-first strategy as the only way forward." hashtags="cloud, tech, IT"] In many cases, organisations are benefitting from hybrid cloud deployment. Plenty of companies shift a number of systems and processes to a public or private cloud while keeping a selection on-premises. This often works because there is no ‘one-size-fits-all’ cloud solution: every individual deployment and business will differ. Cloud might be the answer for some use cases but others could require hybrid options or on-premises systems to drive success. While this can often be attributed to regulatory or data sovereignty requirements, sometimes IT simply understands that business drivers rely on certain data remaining within the organisation itself. Today, organisations can choose from a wide variety of cloud deployment options. No matter the end choice, each option must be thoroughly explored to guarantee it will work from both a technical and business standpoint. Myth Five – Save money by moving to the cloud Businesses can reduce costs by migrating to the cloud and it is true that consolidating systems can also help to increase savings. Yet, this is only one aspect of cloud migration. Cost-saving benefits need to correspond to the underlying factors behind the shift towards cloud. [easy-tweet tweet="Businesses face an ever-growing number of possibilities when it comes to the cloud." hashtags="tech, cloud, IT"] Eliminating the administration and maintenance required to sustain software can enable businesses to increase productivity and save resources. Other organisations can use the cloud to improve service levels, offering 24/7 support with even limited staff. No matter the end result, businesses must analyse wider organisational gains and the total cost of ownership rather than concentrating on upfront costs. Benefits must be viewed at a strategic level. Businesses face an ever-growing number of possibilities when it comes to the cloud. 
It is therefore important to thoroughly assess what exactly needs to be accomplished. Don’t allow a third-party vendor to hurry your decisions. So many deployment options are available, but finding the perfect option for your specific business requirements cannot be rushed.

### iPhone 7 - Gadget of the Week

We're not in 2009 anymore. Your phone doesn't have a floppy drive, and it most certainly shouldn't have a headphone jack. When Apple made the iPhone 6 and 6S, the world simply wasn't ready for the future... But now, apparently, it is, and we have been given an abundance of new features to persuade us to forget about the loss of the humble jack.

[easy-tweet tweet="Your phone doesn't have a floppy drive, and it most certainly shouldn't have a headphone jack." hashtags="iPhone7, tech, Apple"]

Apple officially announced the iPhone 7 and iPhone 7 Plus models this week at its annual keynote event held on 7th September 2016. Featuring a design very similar to last year's 6S, and the iPhone 6 the year before that, the new entry boasts a sleek, rounded aluminium body - but of course, that's not all. The new phone may look very similar to last year's model, but instead of the matte metal finish we've become used to, it has a glossy mirrored design. The new colour is called 'Jet Black' and is a much darker and richer black than in previous years.

So, besides a different shade of black, what else is new? Here's a quick rundown of the top new features headed to iPhone 7:

- Water Resistance
- 12MP Camera (Dual Camera on the Plus model)
- New Home Button
- A10 Fusion Chip - makes the handset 2x faster than its predecessor
- Stereo Sound Speakers
- iOS 10

What? You still can't charge it in a microwave? Nor wirelessly, apparently, despite the removal of the headphone jack, meaning that charging while watching Netflix in bed is a thing of the past unless you invest in Apple's AirPods (their Bluetooth in-ear headphones). That all sounds well and good (ignoring the price tag), but with a 5-hour battery life on Apple's AirPods, what happens when they also need charging and you're midway through your 8th episode of Narcos that evening? On the subject of AirPods, just looking at the wireless contraptions feels like enough to begin a panic attack about losing them. The future may well get rid of tangled wires, but I'd rather spend 20 minutes fiddling with a cord than lose the pieces altogether.

[easy-tweet tweet="What? You still can't charge it in a microwave? Nor wirelessly apparently..." hashtags="iPhone7, Apple, Tech,"]

Water resistance has been added to the handset, so when you get that urge to throw your phone into the sea after an argument with bae you actually can - and then dive to the ocean bed and retrieve it without the worry of water damage (presumably). Stereo speakers also make their first appearance with iPhone 7, a feature that seems a little pointless as most people's ears are further away from each other than the two ends of the phone are. Stereo speakers previously featured on HTC models to much acclaim; however, the technology has not been as well received on the Apple device. A new pressure-sensitive home 'button' debuts on the device, using technology from the 'force sensitive' trackpads on recent MacBooks - again, perhaps not the major innovation many were hoping for with the release of the new model, but it's what the consumer is getting.
The iOS release that coincides with the iPhone 7 looks to soften the blow of the lack of innovation, with iMessage getting an overhaul, as are many of the other built-in apps, to bring them in line with rival third-party apps. The smarter iOS aims to be ahead of the user, with predictions throughout the built-in apps, from guessing where you are looking to go through to evaluating which pictures you want to view in Photos. Finally, Shigeru Miyamoto was present at Apple's keynote to announce that Super Mario will be coming to the iPhone. This marks the first time a Nintendo game will run on a platform that is not one of their own (Pokemon Go, of course, being a licensed product of Game Freak).

[easy-tweet tweet="Apple know they do not need to beat, or even match what the competition is doing" hashtags="Tech, Apple, iPhone7"]

Apple know they do not need to beat, or even match, what the competition is doing at this moment in time. Why? Easy: because owning an Apple product is a status symbol. In the end it's all about taste - which OS you prefer, which design takes your fancy and maybe even which phone makes better calls (ignore that last one). No matter what, the iPhone will continue to succeed as most users succumb to the trend.

### System administrators and IT monitoring – How to ensure the long-term success of the cloud

Unsurprisingly, cloud computing, network virtualisation and on-demand technology services are growing at a rapid rate. If these new services take over, as many are predicting, we could see businesses accessing all of their IT needs remotely, no longer requiring their own infrastructure or applications. Were this to happen, it would result in a huge decrease in the importance of IT administrators and the Local Area Network (LAN), subsequently impacting the role of support software such as network monitoring.

[easy-tweet tweet="If these new services take over we could see businesses accessing their IT needs remotely" hashtags="tech, cloud, IT"]

The increasing use of the cloud will bring with it undoubted change, and there are numerous reasons why businesses should move to cloud computing, such as the range of software and IT services available and the increased efficiency it can bring. There are also claims that a shift to the cloud would cut costs by eliminating the need for IT administrators. Managers, with the help of the cloud, would apparently be able to cover all the jobs IT administrators do, such as changing passwords and creating new accounts.

However, this point of view is not without its detractors. Indeed, the idea that the IT administrator will be obsolete is highly debatable, as the cloud could not replace the more technical on-site jobs where specific expertise and knowledge is crucial. While the cloud would ensure the completion of more basic tasks, IT administrators would still be required to ensure the smooth running of the network and the network monitoring tools.

Another reason the uptake of cloud may not be on the scale or at the speed that some suggest is that cloud systems are very heavily dependent on Local Area Networks (LANs) and a stable internet connection. So, if a business unexpectedly loses internet connection, it wouldn’t be able to access the cloud, and simple, basic tasks such as printing wouldn’t work, as employees would find themselves unable to access the cloud app that sends the file to the printer service.

[easy-tweet tweet="It’s easier said than done to shift a business to the cloud."
hashtags="cloud, tech, IT"] In addition, the cloud – as well as the Internet itself – depends on millions of switches, servers and firewalls located across the world. Therefore organisations still need to rely on LANs, and demand high availability and quick response times from them. Another large issue for many companies is that it’s easier said than done to shift a business to the cloud. Businesses that rely on machine power, such as those working in industrial manufacturing, simply cannot move to the cloud right now because it does not yet offer the high availability that assembly lines demand. At present, 37 percent of the world’s GDP is produced by non-service industries, and machines need to be connected by secure LANs with ultra-high bandwidth. As LANs remain the epicentre of businesses’ IT infrastructure, the IT administrator will remain essential when it comes to maintenance and improvement in a business IT context. While the cloud is undoubtedly a significant technological development, and one that will have an enormous impact on businesses in the years to come, the enduring importance of traditional IT components cannot be overstated. The advent of cloud computing will ultimately rely on efficient networks, underpinned by sound engineering, and so network administrators and the network monitoring tools that they use to ensure the smooth running of their networks, are here to stay. ### BMJ uses Datapipe to build the foundations for change BMJ started out over 170 years ago as a medical journal. Now as a global brand, BMJ have expanded to encompass 60 specialist medical and allied science journals with millions of readers and have used Datapipe to help initiate a radical change within their organisation. BMJ have moved their digital platform to a fully automated shared-nothing architecture with virtualised infrastructure. In doing so, they have brought a new culture of sustainable development and continuous integration to the organisation. BMJ’s story is a familiar one: Its infrastructure had grown organically over time as new sites, applications and features were commissioned. However, as it grew it increasingly built a technical debt. “In a way, we were victims of our own success,” explains Alex Hooper, Head of Operations, BMJ. “The Technology Department’s focus was on getting the new products to market and there was little time to go back and revise the architecture. An expiring hosting contract and the subsequent review of hosting providers gave us an opportunity to pay off that technical debt and to design for the future.” BMJ had become a 24-7 organisation in recent years and its product portfolio had become international in profile, so the capacity and availability for allowing downtime – scheduled or otherwise – was diminishing. It needed to change its culture and move to a sustainable development cycle of continuous integration and automation. However, it was also used to being in control and keeping everything in-house, so it needed a managed service provider (MSP) who could work in true partnership. In order to find an MSP to partner with, they set a short task for the shortlisted vendors to enable them to demonstrate their expertise and commitment. “Datapipe was the only provider that stepped up to and completed our challenge successfully. It set the foundations of a fruitful working partnership.” Hooper explains. “We wanted someone with the whole package; someone we felt we could work in partnership with. 
Datapipe had managed AWS; they did hybrid clouds; they could help us expand into China and they had the adaptability to work with us in the way we wanted. Some vendors draw a line - you’re either fully managed or not at all - but Datapipe had the flexibility and the know-how to work BMJ’s way.”

The process of moving infrastructure became the lever that could bring cultural change to the organisation and cement in place a new DevOps way of working in one of the oldest publications in the UK. By the end of the move:

- BMJ was fully virtualised, with over 200 virtual machines running its applications 24-7 in a private cloud infrastructure.
- The release cycle improved from around one product update a month to up to three a day.
- Automation was brought in and, as a result, the interdependencies were managed or removed and the processes untangled. Product changes and releases became commonplace.
- Content has started to be delivered using APIs rather than weekly batch file transfer jobs. Services can now be built around the APIs.
- A DevOps culture has taken hold in IT and has even permeated to other departments, which now hold daily stand-up meetings.

BMJ is now considering its next step of moving workloads to the AWS public cloud.

Sharon Cooper, Chief Digital Officer at BMJ, says, “Datapipe delivered the infrastructure we needed to initiate a change of culture within our organisation. They worked closely with us and offered a professional combination of listening to our needs and giving advice to my team. Most importantly, we completed the change with zero downtime, so our customers were not affected.”

Tony Connor, Head of EMEA Marketing for Datapipe, says, “We are delighted to have helped BMJ build the foundations for change. It is exciting to be working with a company that has both a long, distinguished history and is also forward-thinking in embracing the cloud. The change has permeated beyond infrastructure and brought BMJ a new, modern culture of continuous integration.”

“BMJ has seen extraordinary change in its time. In recent history it has transitioned from traditional print media to become a digital content provider. With Datapipe’s help it now has the infrastructure and culture in place to allow it to grow its business worldwide, and it is well-placed to take advantage of technology such as the public cloud to cement its position as a premier digital publisher and educator.”

### What’s next for unified communications, and how can your business stay ahead of the game?

Effective communication is a vital foundation for both happy customers and engaged employees. In today’s cloud-first and mobile-first world, a company’s communication solution needs to evolve to meet the key challenges of doing more with less, enabling flexible working environments, reducing complexity and providing customers with a fantastic experience. The increased popularity of outsourced cloud-based solutions means unified communications are no longer a luxury afforded only to larger enterprises; they are available to even the smallest business, helping them provide more responsive engagement with their customers, reducing costs by integrating with existing solutions end users are used to, lowering travel costs and providing a more flexible, modern workplace.
[easy-tweet tweet="Increased popularity of outsourced cloud solutions means unified comms are no longer luxury" hashtags="cloud, comms, tech"] Capitalising on the opportunities for the Channel An important point consider, is where the big opportunities lie, and assess how prepared channel partners are to exploit them. Microsoft estimates that Skype for Business Online is a $50 billion+ opportunity for the channel over the next 3 years, and that’s a pretty big statement to make. However, it is understandable when you look at the facts. There’s no doubt that unified communication solutions such as Skype for Business Online are a huge opportunity for the channel. It’s also becoming increasingly clear that tens of thousands of old on-premise PBXs (private branch exchanges) will need to be replaced over the coming 3-5 years and partners that aren’t harnessing the opportunity as an ‘in’ to new customers risk getting left behind. [easy-tweet tweet="Microsoft estimates that Skype for Business Online is a $50 billion+ opportunity" hashtags="tech, cloud, comms,"] There are more people working for SMEs overall than for large organisations and the SME segment is one in which channel partners can add real value due to the inherent complexities of bringing together IT and Telephony in to a single unified communications solution. Channel partners who invest in understanding both the IT and traditional telephony worlds are well-placed to address SMEs’ concerns and provide businesses with independent, tailored advice and UC (unified communications) offers many different solutions from fixed and mobile telephony, collaborations tools and so on. Next steps for the Channel Profit from unified communications is driven by profit. The first step for any reseller is to deploy an UC solution within their own business, so that all employees experience first-hand how UC solutions, such as Skype for Business Online, can help increase productivity and improve customer service all whilst lowering the overall costs. [easy-tweet tweet="The first step for any reseller is to deploy an UC solution within their own business" hashtags="tech, comms, cloud"] However, for many partners, there are many new complex areas for partners to learn when they first investigate UC solutions, and so picking the right solutions and delivery partners is key to helping take advantage of the massive opportunity ahead. Knowing which technologies can meet the needs of customers in the simplest and most easy to use ways is vital. Although, simplicity is not the only key, it is equally important that the products work with the existing IT infrastructure and tools that end-users know and use every day. Reseller should look to build a fully integrated solution, complete with ongoing Managed Services which offers the full unified communications package right out of the box. How can vendors help channel partners along the way? Unified Communications vendors have a pivotal role in helping the channel prepare to exploit this massive opportunity from understanding how the different solution elements connect together, to simplifying how the solution can be deployed and end-users on-boarded. Vendors should work closely with resellers to offer flexible ways for them to migrate customers to the cloud in a way that suits them, for many organisations a hybrid on-premise/cloud world will exist for many years to come. Furthermore, operational revenue from collaboration services requires billing and a trained customer service team. 
Vendors can provide a range of supporting services, often on a 24/7 basis, to ensure partners can provide the tools, and the IT to support their customers. ### Hybrid IT is about to get interesting — The benefits of Hybrid Cloud and Azure Stack In my previous post I looked at hybrid IT (and cloud) and introduced the forthcoming game-changing technology from Microsoft – Azure Stack. Expanding on that,  this blog will go into a bit more detail on why you might want a private cloud in the first place and why specifically you would choose to use Azure Stack inside your own datacentre or for customer cloud solutions. Obviously the cloud still allows the use of virtual machines (VMs) but increasingly companies are drawn to the benefits of running infrastructure as a service (IaaS), as well as platform as a service (PaaS), which commoditises these elements and focuses time and effort instead on providing solutions to end customers. Azure Stack now takes this core idea with the benefits of public cloud and allows a company to replicate this on-premises (in their datacentre). What will Azure Stack be good for? Azure Stack for ISVs and dev/test is just one of the applications. If a company has already invested in a robust IaaS (VM) solution and if all it plans to do is build more VMs and run them in Azure Stack then the benefits may be harder to prove. One of the real world scenarios for Azure Stack is to offer ISVs, who want to take advantage of the cloud, a consistent environment where they can build applications and services in private Azure Stack and then simply move these to public Azure without having to re-engineer them. [easy-tweet tweet="Public Azure makes perfect sense for an ISV" user="PulsantUK" hashtags="cloud, tech, ISV"] Right now many ISVs are developing and moving their applications to Azure (public). Azure provides geographically redundant global datacentre environments that are externally hosted and scaled to the need of an ISV’s customer quickly and easily in all of the regions that ISV/application is serving. Azure (public) makes perfect sense for an ISV in that respect. However, as much as ISVs like the idea of the infinitely scalable dev/test environment that Azure (public) provides, they often need and want better control during the development process and where the code (the intellectual property) resides. They want development capacity in private environments with the ability to add resources as required for compilation and processing whilst doing so in an environment that is similar, if not identical, to the one in which the application or service will be deployed. Azure Stack provides the ability to completely control the physical location and allocated capacity of the dev/test environments. Because development is happening in Azure Stack (on-premises), the application can then be easily shifted from dev/test into Azure (public) because the two environments are effectively the same. Azure Stack for secure environments Another scenario that Azure Stack is ideal for is organisations that want and need the scalability that Azure (public) provides, but in a private, certified, highly secure environment. In my last post it was pointed out that one of the blockers to using public cloud (in spite of security improvements) are the very real issues of data governance and security. 
For those organisations that have applications they want to scale and manage in a cloud environment but require a level of governance and security, mandated by local or global standards (PCI, FSA, SOX etc.,) Azure Stack provides that capability. Not only can this be done within a datacentre but with the use of Azure Stack appliances the ‘private environment’ on which an application, service or data resides can be physically and easily moved to any secure location as required. Of course, once in this environment, there’s nothing to prevent a company, local public sector organisation, or bank from shifting from a private to a public model. Azure Stack gives an organisation a pathway to a public model, if and when they choose to, at any point in the future. An obvious use case would be for emergency services or for bodies such as the UNHCR or Red Cross. They can move an Azure Stack appliance with the necessary SaaS services and applications on it to the disaster hit region. This gives them a secure cloud environment in which to work, but with the ability to burst into the cloud to offer a ‘public’ facing web application, for example, to allow refugees to sign up for services or to collect data. [easy-tweet tweet="Azure Stack gives an organisation a pathway to a public model" user="PulsantUK" hashtags="tech, cloud"] Azure Stack for hosting companies Hosting companies and managed service providers have a unique opportunity in differentiating their services leveraging Azure Stack. The world of third-party hosted virtual machines has quickly become a commoditised market as organisations have spun up datacentres, calling themselves “cloud providers,” and effectively just hosting virtual machines. Azure Stack will allow hosting and managed service providers to provide such an environment and to differentiate themselves from their competitors. This is especially relevant as increasingly enterprises are beginning to realise that scalability and agility are of paramount importance, and these in turn are dependent on having a secure, private and above all a consistent platform that spans both public and private clouds Conclusion [easy-tweet tweet="Azure will provide highly scalable and globally distributed public cloud services" user="PulsantUK" hashtags="tech, cloud"] When you look at the current cloud landscape and the main players (Google, AWS, Microsoft Azure) it is clear that Azure Stack will give Microsoft a unique position in the marketplace. Azure will provide highly scalable and globally distributed public cloud services while Stack provides organisations with the ability to run the exact same applications, capabilities, and cloud scalable and redundant services on-premises. For hosting providers, this will be  a very easy way to extend reach globally by deploying Azure Stack appliances anywhere in the world, be that datacentres or on customers sites, and to manage those private clouds as one seamless Azure environment – one CONSISTENT and seamless Azure environment. ### Flash Forwards: The UK enterprise is hooked In an increasingly fast-paced enterprise space, where decisions are made in a heart-beat and customers’ exceptional expectations stalk service providers – the quest for high-tech, high-performance support is growing. All the while Flash technology has been scaling down its capacity costs while turbo-charging its potency – and UK enterprises are finally taking note. [easy-tweet tweet="The quest for high-tech, high-performance support is growing." 
hashtags="flash, tech, IT, "] After battling against barriers to Flash adoption, like the perceived high-costs and a lack of understanding among financial decision makers, it looks as though Flash has turned a corner. NetApp’s extensive survey of over 1000 UK IT decision makers delved into the deeper issues surrounding Flash adoption, business needs and perceptions of Flash. The findings demonstrate a remarkable shift in attitudes to Flash – most tellingly among small and medium businesses. Almost two thirds of businesses are now supported by Flash technologies across the UK enterprise, an outstanding figure that highlights the drive for high-performance support – fuelled by today’s always-on generation. As businesses respond to the need for scalable infrastructure with the ability to consistently support services as demand peaks and troughs, it is unsurprising that almost one fifth of businesses say nothing will stop them from adopting Flash. One of the more revealing findings from the study hints at the extent of the movement within the Flash market. While 55 percent of larger businesses have already adopted Flash technology, this is followed very closely by smaller businesses with a 53 percent adoption rate. Meanwhile, medium-sized businesses were the strongest advocates of Flash with 61 percent saying they’ve adopted the technology. This is surely testament to the advances being made in educating businesses on the true value of Flash and strengthening the business case. To see movement within the market from smaller and medium businesses is a positive indication of the deep rooted changes underway. It is perhaps symptomatic of today’s world, in which it is now widely accepted that to get ahead in business, you must make data your friend. This can only be done if you have the capacity to process and store data, with instant scalability and accessibility. [easy-tweet tweet="55% of larger businesses have already adopted Flash technology" hashtags="flash, tech, IT"] To understand the move towards a Flash bound future, we must get a handle on precisely how businesses are applying the technology. According to NetApp’s findings, 53 percent of businesses need Flash to support high-performance storage. Meanwhile, 40 percent need it to support high speed storage across multiple devices. In line with these needs, one-in-five businesses (18 percent) are deploying Flash technology to support payment processing and business intelligence applications – both of which require high performance storage and reliable access to data. It appears the UK enterprise and Flash are on a path to synonymy, as the technology becomes increasingly integral to day-to-day business. In order to maintain competitive, by keeping the attention of their customers without any momentary glitches causing them to drop off, businesses need a highly consistent data management infrastructure. Performance is key and enterprises are acknowledging the longer-term benefits of streamlining workloads. Gone are the days when Flash was considered too expensive and inaccessible to businesses. As big data boulders through Flash’s traditional barriers, reinforcing the need for high-performance storage, IT decision makers are more educated than ever – and financial decision makers are next on the agenda. Flash is the future of enterprise. Two thirds of the UK’s businesses are already living it. 
### VR brings the scare back to gaming Horror writers aim to get inside our minds and trigger reactions by showing scenarios audiences have been taught to fear. This has been adapted to gaming and with a significant boom in the horror genre in the last few years the gore and violence of industry staples such as Call of Duty have desensitised the gaming demographic. Just as gamers begin to get used to walking through dimly lit narrow spaces, slowly creeping past limp bodies slumped in chairs and accepting that everything is now spiders; it looks like Virtual Reality aims to bring terror a whole lot closer to the gamer. [easy-tweet tweet="Just as we accept what is scary; VR will bring terror closer to the gamer." hashtags="tech, gaming, horror, vr"] Many of our current fears and phobias stem from back in our evolutionary past and darkness (the go-to setting for horror games) has a recurrent context for humans throughout history going back before we were the top rung on the food chain. The inability to see clearly what is around us combined with predators being far more adapted to darkness led to connotations of danger when thinking of the night which then grew into folktales associated with the night such as vampires, witches, ghosts and werewolves. As a primary mechanism in early human development the notion of attachment means we are a social species (even if introverts deny it). Horror games leave your allies dead or missing and your character alone which triggers evolved mechanisms for heightened fear and vigilance for potential threats as nobody has your back. Other cues thrown into the horror mix are children, usually barefoot and accompanied by nursery rhymes wandering around a scene where they don’t belong (F.E.A.R. is a classic example) - Psychology suggests we find their appearance scary because it adapts our own childhood memories into hugely negative cues that suggest nothing and nowhere is really safe. Religion makes gamers wary of surroundings and messing with practices that go against many people’s beliefs about what is good scares us. This belief goes hand in hand with the unknown. A classic way to ruin horror in any medium is to reveal too much about the evil - once it is humanised an audience can relate and it becomes less scary. This is because without any backstory we have no clue what to expect - we cannot understand the psychology of a demon, however, we can understand better the psychology of a human trapped in a demon’s body. [easy-tweet tweet="Our current fears and phobias stem from back in our evolutionary past" hashtags="gaming, tech, horror, VR"] Involvement with content is key to fear and the closer you can get as a gamer, the better! Third-person play-throughs such as the Resident Evil series (bar new upcoming entry Biohazard) offer detachment from the horror as you clearly control another person and can therefore distinguish the difference between game and reality. However, when these lines blur and games such as Outlast offer an immersive first-person experience set in a relatable location with a semi-believable story triggering the above fears it becomes far easier to reach for the television remote to turn the TV-set off and reset your sanity and dry your sweaty palms. The next step in gaming is the massively hyped Virtual Reality experience and with both Playstation VR from Sony and the Oculus Rift finally getting a commercial release in the UK putting ourselves in the game is about to become a thing of normality. What does this mean for horror? 
It means your peripheries are about to be blocked and there will be no escape from the small pale child following you around the haunted asylum unless you completely remove the headset. Cheap jump scares will become nightmarish as creeps pop up literally in front of you to the extent where even Five Nights at Freddie’s can become scary again. Games such as Alone, Alien: Isolation, Slender: The Arrival and Lost in the Rift all use darkness, character separation and negative cues to create a highly disturbing atmosphere in VR titles that will reignite fear inside gaming universes. [easy-tweet tweet="Putting ourselves in the game is about to become a thing of normality" hashtags="horror, tech, gaming, VR"] Virtual Reality isn’t the only medium of re-establishing fear in the horror genre, Augmented Reality (best known now by the success of the over-mentioned Pokemon Go) is being adapted to bring horror to your own humble abode with Novum Analytics’ Night Terrors app. The app that grew from an ‘indiegogo’ campaign promises to be the ‘scariest game ever made’ and uses smartphone mechanisms to log the shape of your house or flat and inject horror through the means of AR with ghosts and ghouls appearing on screen in an ultra immersive campaign that turns your smartphone into a vintage styled camcorder while accessing your camera’s flashlight to further control reality and immerse the user. The ultimate aim can only be to combine these two disruptive technologies to allow gamers to roam freely while interacting with a real environment that includes aspects of AR gameplay. For now however, we can only enjoy what’s confirmed to come but with the likes of cult-horror sequel Outlast 2 being all but confirmed for a VR release it is an exciting time to be a horror gamer. ### The True Cost of IT – M7 Managed Services Wouldn’t life be much simpler if we had a crystal ball telling us exactly when disaster was about to strike and advising us how to prevent it? Just ask British Airways management this morning! Without this insight, however, it pays to be prepared for the worst. While some business managers and IT leaders recognise the need to put the appropriate safeguards in place, others are apparently prepared to take risks with their IT estate if it means they can cut back on costs. Finances are understandably important for businesses of all sizes, but when it comes to your IT systems and your reliance on them, can it really pay to cut corners on your disaster recovery? [easy-tweet tweet="Some Business Managers are prepared to take risks with IT estate if it means they can cut costs." user="@m7msuk" hashtags="cloud, tech, IT"] In order to answer this question, businesses must first have a clear grasp of the dependence their organisation has on all their different IT systems and the cost incurred for each hour or day they are without them. From a purely financial point of view, these costs will vary depending on the type and scale of disruption facing your company, but they are always unwelcome. Take a security breach, for example. These are seemingly becoming more and more commonplace, with headlines emerging on an almost daily basis. Despite this, some companies still adopt an “it won’t happen to me” attitude – until it’s too late. According to recent research, IT security breaches are set to cost businesses $2.1 trillion globally by 2019. 
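A crude way to get that grasp is to cost out an hour of downtime for each system you depend on. The sketch below is a minimal, illustrative calculation of the kind of arithmetic described above; every figure in it is a placeholder assumption to be replaced with your own numbers, not data from this article.

```python
# Illustrative downtime-cost estimate for a single system.
# All figures are placeholder assumptions, not real benchmarks.

lost_revenue_per_hour = 4_000      # sales that cannot be transacted
idle_staff_cost_per_hour = 1_200   # wages for staff who cannot work
recovery_cost = 10_000             # one-off incident response / restore
hours_down = 8

downtime_cost = hours_down * (lost_revenue_per_hour + idle_staff_cost_per_hour) + recovery_cost
print(f"Estimated cost of an {hours_down}-hour outage: £{downtime_cost:,}")
```

Even with conservative inputs, repeating this exercise per system usually makes the case for disaster recovery far more concrete than headline breach statistics.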
As it is impossible to protect against every threat facing modern businesses, an organisation’s disaster recovery plan can offer a safety net that’s better than an insurance policy. In most cases it will also probably reduce the cost of your insurance policy. Even if you are fortunate enough to avoid an IT security breach there are still plenty of reasons to invest in disaster recovery. Disruption comes in many forms, from natural disasters and equipment failure to internal mistakes. Predicting when one of these is going to occur is virtually impossible, but that doesn’t mean that customers won’t start looking elsewhere if your recovery strategy means they can’t transact with you for a number of hours or days. [easy-tweet tweet="Disruption comes in many forms, from natural disasters, equipment failure to internal mistakes." user="@m7msuk" hashtags="cloud, tech, IT"] In today’s fast-paced digital world, even the smallest amount of IT downtime can be enough to send your customers to one of your competitors. Lost business like this can be difficult to recover, but with the right disaster recovery plan in place, your customers needn’t know about the disruption at all. Instead, business continuity planning combined with the correct replication tools can provide a seamless solution to ensure your employees, clients and customers continue to receive the level of service that they usually expect. However, disaster recovery cannot simply be implemented as an afterthought or considered a necessary nuisance to help pass a company audit. If organisations feel they do not have the time, resources or expertise to implement a robust disaster recovery plan of their own, they should instead source a managed service provider (MSP) that can. For a monthly subscription fee, M7 Managed Services can deliver a DR plan that ultimately could be the difference between a business continuing or failing. By planning ahead, businesses might avoid financial ruin or reputational damage from future IT failures. For those companies seeking a solution, M7 is now offering a free DR solution trial so businesses can appreciate the benefits that DR offers without having to invest beforehand. Unless you have a corporate crystal ball, going without a disaster recovery solution is almost certainly a risk that simply isn’t worth your business taking.
### Genomics England works with EMC to deliver the 100,000 Genomes Project
EMC is providing the data lake required to support the large-scale data collection and analytics generated by the project’s genome sequences. EMC Corporation today announced that it is Genomics England’s official IT storage supplier, supporting the organisation in the completion of the 100,000 Genomes Project. Using VCE vScale, with EMC Isilon and EMC XtremIO, Genomics England will store the genome data securely for analysis. A ground-breaking project announced by the former British Prime Minister David Cameron in 2012, the 100,000 Genomes Project is sequencing 100,000 whole genomes from 70,000 NHS cancer and rare disease patients and their families. Genome sequencing has the potential to shift the way in which we approach healthcare. The project aims to deliver a new genomic medicine service with the NHS, to support better diagnosis and more personalised treatments for patients. Once a genome has been sequenced, the information, amounting to hundreds of gigabytes per genome sequence, is stored digitally. Data in the Project will increase 10-fold over the next two years.
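Those headline figures are easier to grasp with a little arithmetic. The sketch below is purely illustrative: the per-genome size is an assumption based on the "hundreds of gigabytes per genome sequence" mentioned above, not an official number, but it shows why whole-genome storage quickly reaches the petabyte scale described in the next paragraphs.

```python
# Back-of-the-envelope sizing for whole-genome storage.
# GB_PER_GENOME is an illustrative assumption, not a figure from the project.
GENOMES = 100_000
GB_PER_GENOME = 200        # "hundreds of gigabytes per genome sequence"
GROWTH_FACTOR = 10         # data set to increase 10-fold over two years

raw_pb = GENOMES * GB_PER_GENOME / 1_000_000   # gigabytes -> petabytes
print(f"Raw sequence data: roughly {raw_pb:.0f} PB")
print(f"After 10-fold growth: roughly {raw_pb * GROWTH_FACTOR:.0f} PB")
```

At around 20PB of raw sequence data, the estimate sits in the same ballpark as the initial 17PB data lake capacity described below.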
It will be key to provide the agility needed to analyse and compare these immense data sets. De-identified data from the 100,000 Genomes Project will also be made available to approved researchers from academia and industry to help accelerate the development of new treatments and diagnostic tests that are targeted at the genetic characteristics of individual patients. In the past, Genomics England used EMC Isilon for storage of its sequence library alone. The organisation has now chosen to use an Isilon data lake for all the data collected during genome sequencing. Once captured at the sequencing centre in Cambridge, UK, each genome file is stored on Genomics England’s secure IT infrastructure. The Isilon data lake will initially allow 17PB of data to be stored and made available for multi-protocol analytics, including Hadoop. Alongside the Isilon data lake, 24 X-Bricks of all-flash XtremIO are in place to support the organisation’s virtualised applications. EMC’s Data Domain and NetWorker are also used to provide backup services. The net result is resilient infrastructure that supports massively scalable data storage with robust analytics. The Genomics England computing environment is delivered by both on-premise servers and Infrastructure-as-a-Service, provided by Cloud Service Providers on G-Cloud. One of the key legacies Genomics England will create is an ecosystem of CSPs providing low-cost, elastic compute on demand through G-Cloud, bringing the benefits of scale to smaller research groups. “There are few better examples of the fundamental impact that analysis of data sets can have on society. It’s a privilege to be chosen as the IT storage provider for Genomics England and to be part of a revolutionary time for genome analysis. Genomics has the potential to transform healthcare and redefine the way the NHS operates, uncovering medical treatments, benefitting patient experiences and transforming the economics of universal healthcare in the UK”, comments Ross Fraser, Vice President and Managing Director, UK&I, EMC. “Delivering the platform for this large-scale analytics in a hybrid cloud model will help accelerate the impact Big Data analytics could have on the NHS, potentially delivering billions in efficiencies in care delivery and improving patient outcomes immeasurably.” Dave Brown, Head of Informatics Infrastructure at Genomics England, said: “This project is at the cutting edge of science and technology. EMC’s data lake platform provides the secure data storage that we need, with the flexibility and power to undertake complex analysis.”
### 'Traingate’ shows us that CCTV systems need the cloud
The furore over Jeremy Corbyn and 'Traingate' may have died down, but it highlights an unfortunate truth about CCTV which those of us who use it would do well to note. The press, whilst reporting Mr Corbyn’s mis-telling of the events which led to him sitting on the floor instead of a seat on a Virgin train, seem to have missed an important point. Given Virgin’s reputation for using the latest technology and providing accurate, rapid communication through every media channel, why could it not produce CCTV footage to counter Mr Corbyn’s claims immediately after the event, instead of leaving it a week and then producing some grainy footage with incorrect time-stamping? To be nice about it, CCTV in general isn’t very good.
In this day and age you’d think that footage would have been quickly identified within a secure central corporate repository with an accurate timestamp, downloaded and sent immediately to Mr Corbyn’s office with a polite note requesting that he withdraw his remarks. We live in an age when anyone can quickly upload video from their phones to Facebook, so why did this not happen? [easy-tweet tweet="Why is CCTV bad in an age when we can quickly upload video from phones to Facebook" hashtags="tech, security, CCTV"] I have no doubt that if Virgin could have got their hands on the CCTV footage of what actually happened as soon as they needed it they would have countered Mr Corbyn’s claims immediately. The trouble is that the CCTV footage is recorded to DVRs in each train carriage and, as trains are on the move, even getting hold of the footage within a week would have been no mean feat. The Daily Mail reported that some of the footage was incorrectly time-stamped. It is not that important in this instance, but if the system had been recording a serious crime or terrorist attack, timings would have been vital. CCTV systems are notoriously bad at keeping the correct time and have to be constantly checked to ensure they are correct. Another small point on time-keeping is that, to meet the requirements of the Data Protection Act (DPA), it’s essential that footage should be of evidential quality and is correctly timestamped. It’s not great for Virgin to have to display such an inaccuracy in the public domain – particularly as it makes the footage unlawful. Of course I’m pointing all this out for a reason – CCTV really does not have to be this way. Footage from one or many corporate CCTV cameras can be simply, securely and inexpensively consolidated onto cloud servers, where it can be quickly interrogated by authorised personnel from any location and on any device (even by a busy CEO on their smartphone on a Caribbean island, provided they have the authorisation). [easy-tweet tweet="IoT devices have to tell the right time or they won’t work very well" hashtags="IoT, tech, CCTV"] What’s more, IoT devices have to tell the right time or they won’t work very well, so cloud-based systems such as Cloudview are constantly polling NTP servers to ensure they are telling the right time regardless of location. In this age of instant opinion CCTV footage needs to keep up – and it can, as long as it becomes part of the IoT. ### Lessons on collaboration from Pokémon Go It seems you can’t escape Pokémon GO, whether it’s in the news, on social media, or in your day-to-day life as you see people of all ages hunched over their phones walking around outside. This wildly popular application has spurred discussions across a variety of topics—health, social interaction, marketing, gamification, pedestrian safety, and more. [easy-tweet tweet="Pokemon Go teaches the value of a comprehensive app to enable team collaboration." hashtags="pokemongo, tech, app,"] Observing strangers collaborate through this application can open your eyes to the potential potency (and potential shortcomings) of a business application that can foster teamwork in your business. It teaches us the value of a single, comprehensive application to enable team and project collaboration. The Quest Pokémon GO is a real-time application in which you can catch Pokémon while travelling around. It has geo-tracking and real-time updates, players all experience the same interface—provided they’re in similar areas. 
The same creatures pop up on the screen, at the same time, at the same location. This accounts for many observations of increased social interaction, and also accounts for the parallel in team and project collaboration. When encountering a player recently who was on a quest for a specific Pokémon he wasted no time in saying he drove over 45-minutes to reach the park we were in because he heard it was rich with a Pokémon he desperately wanted to add to his collection and asked if I’d seen any. I pointed him to an area where I’d found the creature he sought and heard his triumph when he found it. He then proceeded to scour the park. Anytime I found the Pokémon he was looking for, I called out to help direct him to the right area to catch it. Strangers collaborating to chase invisible pocket monsters on a smartphone application—who’d have thought? It’s all about teamwork [easy-tweet tweet="Strangers collaborating to chase invisible pocket monsters on an app - who’d have thought?" hashtags="app, cloud, tech"] For this particular Pokémon GO player, he was able to complete his quest for catching Pokémon because all players experience the application in real-time. By collaborating with other players, he identified the park as a place he needed to visit and was able to add the necessary Pokémon to his collection. This story is a testament to the power of a real-time applications and online collaboration capabilities accessible by all. If a player can accomplish so much from the application, what implications are there for businesses? The Pokémon GO collaboration experience could be better because it doesn’t offer the in-app ability to message another player or group, or share images of any kind. Although the application revealed the necessary Pokémon for this player, he relied on friends and external websites to provide the best possibility of completing his collection. The collaboration experience could be both easier and improved: taking a seamless application operating in real-time and adding the ability to message others, post discussion threads, and share images and insights. As well as work in other applications and links that would enable any player to get the information needed to play effectively. On the positive side, the synchronised application experience paired with real-time capabilities helped this player to complete his task. On the negative side, he had to lean on other applications to accomplish his mission to ultimately be successful. [easy-tweet tweet="What implications does Pokemon GO have for businesses?" hashtags="tech, app, "] A comprehensive interface certainly would have helped him accomplish his goal more efficiently, just as a more complete collaboration tool helps workers get more done faster. Relying on multiple applications to allow full collaboration creates a disjointed, less effective experience than if all of the capabilities were built-in. Unlike Pokémon Go, there are business applications with a complete set of tools to help your employees collaborate. If you give your employees the opportunity to utilise a communications system that promotes project collaboration on a platform designed with mobile in mind, they will be able to work efficiently and effectively, at any time, in any location. Many businesses today have one piece of the puzzle—like Pokémon GO—but choosing a truly integrated collaboration experience can really give you a competitive edge. 
This is especially true given it’s estimated that the average employee spends up to an hour every day trying to get in contact with people, find meeting rooms, and track projects and emails. For a workforce on-the-go, uniting multiple channels of communications in a single mobile-first application—email, chat, phone calls, text, video—can drastically improve both productivity and output for a business. Companies should choose a team collaboration tool that prioritises accessibility, visibility, and real-time tracking of various projects, documents, and presentations so workers get more out of every hour they put in. In order to get the most effective solution for your business, make sure it is:
- Mobile-enabled
- Real-time
- Inclusive of all communications channels
- Ready for project collaboration
- Able to share documents and track updates
- Secure
- CRM-compatible
### Bridging the gap between technical and commercial - #CloudTalks with Verizon's Martin Galea
In this episode of #CloudTalks, Compare the Cloud speaks with Martin Galea from Verizon about bridging the gap between technical and commercial. Every day, Verizon connect millions of people, companies and communities with powerful network technology. Not many companies get the chance to change the industry and the world through innovation. Verizon do. Verizon aim to make a positive impact on society. From working to improve education and healthcare to evolving sustainable business practices, Verizon share their success with the community to make the world a better place.
### Supercomputer 'Watson' edits the world's first AI Trailer
https://youtu.be/CWu_BjkYHqI
In June, 20th Century Fox released the thrilling trailer for new movie Morgan - a film that centres upon a hostile humanoid robot whose fate is determined by a risk-management consultant played by Paul Giamatti. Today (2nd September 2016) sees the full film's release in cinemas nationwide, but earlier this week the production company (rather aptly) dropped a rather eerie trailer created with the help of IBM Research's artificial intelligence program, Watson. [easy-tweet tweet="IBM's Watson delivers an eerie AI-made trailer for sci-fi thriller 'Morgan'" hashtags="morgan, film, IBM, tech, trailer"] In order to train Watson for the task, IBM had it sift through 100 horror movie trailers, each cut into separate "moments". Watson then performed visual, audio and composition analysis on each scene to develop an idea of what people deem scary. After all this, the machine was finally fed the entire 90-minute final cut of Morgan in order to find the 'right moments' to include in the trailer. The supercomputer instantly homed in on a total of ten scenes making up six minutes of content. A human was still required to arrange these scenes into a coherent story, but the real milestone is that Watson's involvement shortened the process to just 24 hours. On average, a film trailer can take between 10 days and a month to complete. So, if someone out there is working on an editing bot, we could be ready to boot all creatives from the post-production filmmaking world and let the machines take over! The supercomputer has been used in many creative and intelligence experiments, from assisting with cooking to reading and predicting emotions and competing (and winning) against humans on the game show Jeopardy!
This, however, is the first time Watson has been used to create media designed to appeal to a particular audience in a creative way, and the result is quite horrifying – and rather apt, when you consider the story is about a group of scientists who create a humanoid machine that rapidly gains capabilities and goes out of control. [easy-tweet tweet="How do you create a trailer about an AI human? You turn to the real thing!" hashtags="tech, morgan, film, trailer, watson"] The trailer itself may not feel as natural and fluid as your usual blockbuster trailer: the clips chosen aren't really the best representation of the movie and, while eerie, lack direction. The editing (done by a human) makes the best of the footage provided. This comes down to Watson selecting clips whose sound it flagged as 'scary' to audiences, rather than picking sections with the narrative and build-up that would normally explain the situation in a film trailer; for something with a seemingly complex plot, audiences may be left wondering what exactly the film is about after this minute of content. A full explanation of how Watson chose the clips for the trailer can be found here on IBM's research blogs.
### SGNL - Gadget of the Week
After a while floating through the seemingly endless pages of tech startup projects vying for your funding on Kickstarter, it becomes clear there will forever be a very thin line between helpful tools for our day-to-day lives and complete madness. They range from smart belts that follow tech fashion in tracking your steps, keeping you healthy through gamification and notifications that aim to keep you paranoid about sitting down for too long, to projects that bend the IoT to feeding your pets from the other side of the country while you watch them eat via a connection between your smartphone and an 'easyFeed' dispenser in your house. [easy-tweet tweet="SGNL will use bone conduction to allow phone calls through your index finger" hashtags="wearable, tech, wearable"] This week SGNL want you to Touch Your Sound. Well, what does that actually mean? Basically, SGNL will use bone conduction from your wrist to allow you to hear phone calls through your index finger, something that is bound to make you look like the coolest dude in the early noughties. As with most startup technology, the thought of this product seems like a silly one to adopt, but then again who could have believed we'd be so obsessed with sharing pictures of ourselves with dog ears superimposed onto our heads before the Snapchat update a few months ago? The stuff of kids' make-believe games looks to become reality with SGNL's aim to have everyone walking around talking into their hands, all while offering what smartwatches have so far failed to deliver: the ability to talk on the go. The Body Conduction Unit (BCU) can be nicely disguised by a fashionable watch face and of course also includes a built-in activity tracker that connects to an app on a smartphone. [easy-tweet tweet="The stuff of kids' make-believe games looks to become reality" hashtags="SGNL, tech, wearable"] The product was formerly Samsung's TipTalk, and I think it's fair to say the breakaway startup made the right choice in changing the name to SGNL.
The product has been developed to take the form of a smart watch strap (requiring the user to add to their smartband collection) that allows you to answer phone calls with your fingertip: simply stick your finger in your ear and use the band, with its integrated microphone, as the handset. Created as a solution for people who lose their earpieces, SGNL will allow users to leave their phones in their pockets or bags while simply raising their hand to answer calls without any extra wires or hassle. The team behind SGNL take credit for the product having a built-in 'background noise' reducer in the form of putting your finger in your ear. The strap design allows for smartwatch mounting to combine your wearable technology into a hybrid application straight from your wrist, connecting to your phone via Bluetooth. With products moving towards this 'deviceless' period, could we possibly expect the smartphone to become a thing of the past? Surely it could only be a matter of time if technology like SGNL takes off. SGNL looks to move into mass production by December 2016 and has raised over 150% of the funding needed. Check out the product video below: https://www.youtube.com/watch?v=40LfELeoAIw
### HCL and Mesosphere Enter Into Partnership
HCL Technologies (HCL), a leading global IT services company, today announced that it has entered into a partnership agreement with Mesosphere, a datacentre infrastructure and container orchestration company. The partnership combines Mesosphere’s Datacentre Operating System (DC/OS) with HCL’s unique Next-Gen IT & Operations capabilities to deliver a unified operational experience and achieve efficient resource utilisation for clients. “With the rising wave of digitalisation, cloud and automation, the role of IT has changed from managing assets and services to delivering perfect user experiences. 21st Century Enterprise Infrastructure must be prepared for this change to meet ever-evolving business and user expectations,” said Kalyan Kumar, Executive Vice President and Chief Technology Officer, HCL Technologies. “Through this partnership, HCL and Mesosphere are uniquely positioned to play a leadership role in the global cloud infrastructure management market. While Mesosphere will be providing best-in-class Enterprise DC/OS technology, HCL will be the global systems integrator for consulting, designing, integrating, building and running solutions for large enterprises.” “Forces such as cloud computing and containers are driving major changes in IT infrastructure, and are affecting businesses of all sizes in all industries,” said William Freiberg, Chief Operating Officer, Mesosphere. “DC/OS helps users take advantage of these trends, and is already revolutionising operations and application development inside customers’ datacentres and cloud environments. Our partnership with HCL will help us bring these results to a much larger range of companies across the globe.” Mesosphere is the creator of DC/OS (Datacentre Operating System), an open-source technology that is built around Apache Mesos™ and offers the most reliable, scalable and easy-to-use platform for running containers, stateful data systems and everything else that modern applications require. For users, DC/OS powers everything from streaming video to the Internet of Things—simplifying IT operations while delivering the types of services their customers and businesses demand.
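To make the container orchestration side of this more concrete, the sketch below shows roughly what deploying a service on a Mesos-based stack can look like, by posting a minimal application definition to Marathon, the scheduler DC/OS builds on. The endpoint, image and resource values are illustrative assumptions rather than details from the announcement, and a real DC/OS cluster would normally front Marathon with its admin router and authentication.

```python
import requests

# Minimal Marathon application definition: two instances of an nginx
# container, each with a small CPU and memory allocation.
# All values here are illustrative assumptions.
app = {
    "id": "/demo/web",
    "container": {
        "type": "DOCKER",
        "docker": {"image": "nginx:latest"},
    },
    "cpus": 0.5,
    "mem": 256,
    "instances": 2,
}

MARATHON_APPS_URL = "http://localhost:8080/v2/apps"  # assumed endpoint

response = requests.post(MARATHON_APPS_URL, json=app, timeout=10)
response.raise_for_status()
print("Deployment accepted for app:", response.json().get("id"))
```

Scaling the service up or down is then a matter of changing the `instances` field rather than provisioning new machines, which is the operational shift the partnership announcement is pointing at.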
HCL’s vision for the 21st Century Enterprise encompasses a modern infrastructure, including datacentre infrastructure services that are service-oriented. HCL helps its customers adopt best-in-class, next-generation hybrid datacentre architectures with strong IT Service Management process automation and service orchestration at the core. This enables enterprises to be lean and agile, creating unified experiences for end users.
### Accelerate your Compute at lower cost with innovation!
Imagine your computer software running faster; your datacentre footprint smaller. Imagine needing fewer servers, software and support licenses. Imagine what your business could achieve with better performance at a lower cost. For many businesses, this dream is already a reality. [easy-tweet tweet="Innovation drives IBM Power Systems and collaboration from the OpenPOWER Foundation" user="IBM" hashtags="IBM, tech"] Innovation drives IBM Power Systems, and collaboration through the OpenPOWER Foundation helps deliver business workloads and the most demanding high-performance compute requirements needed for scientific and genomic research. These servers are designed to be as compute-effective as a mainframe, bringing a formidable platform that revolutionises Linux workloads.
10 reasons why IBM Power and OpenPOWER rock Linux workloads
1. Innovate, innovate, innovate! – The OpenPOWER Foundation
The IBM Power processor is an awesome processor – so great that tech powerhouses IBM, Google, NVIDIA, Tyan and Mellanox formed the OpenPOWER Foundation in 2013. IBM opened up the technology around the Power architecture so members could build out their own customised servers, networking and storage hardware. Through the OpenPOWER Foundation, new innovative Linux-only Power-based servers have been created by the likes of Wistron and Mellanox, Tyan, Rackspace and Google. There are many new components such as GPUs and field-programmable gate arrays (FPGAs) designed and built specially for the IBM Power architecture. OpenPOWER is enabling a new Linux open ecosystem.
2. Linux all the way
Power is no longer just AIX; it’s all flavours of Linux. Linux-based software can utilise the Power platform and realise performance gains. At a minimum, the software only needs a recompile to run on Linux on Power. If you then go on to tune your software to exploit the innovative enablers such as multithreading, GPUs and FPGAs, you can realise even greater performance. Check out how MariaDB realised 3x the performance over Linux on x86 when tested independently by Quru. [easy-tweet tweet="Linux-based software can utilise the Power platform and realise performance gains" user="IBM" hashtags="tech, linux, cloud"]
3. Hyper-threaded processor
The POWER8 processor is designed for virtualisation; the cores are multithreaded with eight threads per core. This allows many workloads to be virtualised at the thread level, enabling increased processing ability with linear performance scaling. The POWER8 multithreading makes it easy to realise over 2.5x the performance, with no tuning! It’s like changing the processor core into an eight-lane highway, rather than having a slow single-track road to mosey along. Check out my performance blog for more.
4. GPUs
Imagine a card you can add into your computer architecture to massively increase the number of cores you can access. That’s the beauty of GPUs. The GPU virtual cores are ideal for processing compute-intensive workloads.
Thanks to the OpenPOWER Foundation, NVIDIA GPUs enable compute-intensive workloads and are now also able to handle those same loads even when faced with high-bandwidth requirements such as big data. With the new NVLink technology from NVIDIA, businesses can redefine how they handle compute and big I/O. It is so cool when innovation brings components like NVIDIA cards and NVLink to make light work of hungry Hadoop and analytics workloads.
5. FPGAs and CAPI
FPGAs are among my favourite innovations. I love that a programmable card can be configured specifically for your workload. You instruct the FPGA with the most efficient way to process your compute needs, and then you can reduce the number of instructions, the movement of data, and so forth. This can increase your workloads’ performance by thousands of times. FPGAs are perfect for machine learning algorithms and other situations where the processing is compute-heavy. Think visual recognition and surveillance applications. Using FPGAs with IBM’s CAPI technology removes the overhead and complexity of the I/O subsystem, giving you even greater performance.
6. Cache & Flash
Each generation of cache has seen architecture changes to bring innovation and greater performance. Having access to more cache means memory-intensive applications will perform better: memory latency is reduced, and the data in the cache is available to be processed immediately, with no waiting time for retrieval. POWER8 comes with a new Level 4 (L4) cache, boasting 128MB – unachievable by x86. There is also plenty of cache in L1, L2 and L3 – Power Systems significantly beats any other server it competes with. [easy-tweet tweet="Each generation of cache has seen architecture changes to bring innovation and greater performance." hashtags="tech, compute, cache"] Cache can be complemented with storage. Let’s look at flash storage, the fastest persistent tier available. Adding 75TB of IBM Flash Storage will absolutely rock your data response times.
7. Bandwidth
Bandwidth, or the I/O subsystem, may not sound like a thrill. But this is what moves data around quickly, and when you have data-hungry applications like data warehousing and analytics, the POWER8 I/O will deliver significantly greater performance and scalability. The actual I/O bandwidth is 230-410GB/sec, which is on average around 6x faster than Intel Haswell.
8. Footprint & Cost
With all the innovation, the ability to run mixed, highly virtualised workloads and all-around awesome performance, Power Systems are a dream come true for the datacentre. These servers easily halve the number of processors required, yet deliver increased performance for more demanding workloads. The bottom line? You need fewer servers, and in every TCO study I have seen comparing POWER8 with x86, both the initial and three-year costs for both commercial and open source software (with Enterprise Support Licenses) are lower for POWER8, including the server price. POWER8 servers continue to amaze me – the price for compute is much lower, and that’s what matters.
9. Reliability, Availability and Serviceability (RAS)
These essential server capabilities have traditionally been a leading-edge strength of Unix systems on Power. In the same way, they are available for Linux when running on POWER8. Check out what SUSE say.
10. Ecosystem
There is a lot of software, both commercial and open source, that has been officially ported to Linux on Power, and there is now a waterfall of vendors moving to the platform.
Customers are adopting it as it delivers value and enables their businesses on premise and in the cloud. Here is one who realised 25 times greater performance gain. “Porting was easier than expected” – Robert Love, Google If you haven’t checked it out, you’re behind your competition. Now is a great time to get your Linux ported to POWER8 and OpenPOWER. ### Growing divide between IT and business is leading many to take the wrong approach to cloud While many European companies are embracing the move to the Cloud, nearly half are struggling, wrestling with increases in Cloud integration costs and data silos according to a new study sponsored by Oracle. A key reason: more than 60% of a company’s overall IT spend is being driven by individual business units versus traditional IT departments, making it difficult for companies to fully benefit from the Cloud services they are subscribing to. Another significant part of the problem is that most organizations continue to fund their IT investments without aligning to revenue potential and innovative projects: two in three business decision makers said IT funding is too traditional and is stifling innovation, while one in three IT-decision makers admit their organizations’ IT funding models are hindering them from IT innovation, according to Oracle’s Putting Cultural Transformation at the Heart of Cloud Success report. For the research, Oracle partnered with Coleman Parkes to survey 600 senior IT and line of business decision makers across Europe and the Middle East Time to change funding models The findings reveal businesses must rethink their IT funding models and undergo a cultural transformation in order to fully exploit the benefits of cloud computing. One third (33%) of respondents say an inappropriate IT funding model is inhibiting their business. One third (33%) also believe their company’s IT culture is unfit for the cloud computing age. Tellingly, 72% of respondents say a new cloud funding model will allow IT departments to deliver more cloud services to the business, and 70% expect it will help the company to reduce costs. Shedding light on Shadow IT The Oracle study also found that increased IT spend outside the IT team (also known as Shadow IT) is standing in the business’ way. More than one third (35%) of technology respondents believe Shadow IT practices are inhibiting the ability of IT to deliver on business goals. Indeed, 46% said the approach they’ve taken to cloud so far has increased integration costs, with the same percentage saying it has led to the creation of data siloes. Additionally, the vast majority of respondents (95%) believe Shadow IT is a major cause of complexity. Roughly one-third say leaving lines of business to manage their IT-spend independently results in increased security concerns, making funding more difficult to manage, and diluting the company’s control of its IT. Johan Doruiter, Senior Vice President of Systems, Oracle EMEA, said:  “The issues companies face with their cloud resources are less to do with the technology itself and more to do with a lack of synchronization across lines of business. Decision-makers in each department are increasingly making cloud purchasing decisions without involving the CIO due to the ease of procurement. 
However, without one IT point-person to unify their cloud investment strategy companies will continue to struggle with individual departments tugging time and resource in opposing directions.” CIO as Cloud Navigator Oracle’s research reveals that in many companies there is no single person with a view across all technology investments. This makes it difficult for organizations to develop and follow a unified cloud strategy. The CIO cannot continue to be side-tracked and must be an integral player in leading the business as it transitions to an enterprise cloud model. With integration and data management still critical to organizations, alarmingly, CIOs control less than half the IT budget in 66% of businesses. Doruiter added: “Companies are having their expectations met by the cloud in many respects, but their approach to IT investment remains stuck in the past. The cloud is about seamlessly joining up data and workloads across the organization and yet we continue to see individual lines of business implement IT systems in siloes. This breeds added complexity and leads to integration issues that could easily be avoided with a more integrated approach. “CIOs must work more closely with line of business leaders to ensure IT is supporting innovation. They must serve as ’cloud navigator’, collaborating with each department to manage cloud procurement issues, cost and risk and ensuring that all lines of business are working towards a common cloud strategy. The CIO should also be the voice of change in the boardroom, calling on the CEO and CFO to mandate a unified approach to cloud across the business.” ### Hybrid IT is about to get interesting — Introducing Azure Stack Hybrid cloud, and by extension hybrid IT, is here to stay. Few companies will use pure public or private cloud computing and certainly no company should miss the opportunity to leverage a combination. Hybrids of private and public cloud, multiple public cloud services and non-cloud services will serve the needs of more companies than any single cloud model and so it’s important that companies stop and consider their long term cloud needs and strategy. [easy-tweet tweet="Hybrid cloud services will serve the needs of more companies than any single cloud model" user="PulsantUK" hashtags="tech, cloud"] Since so much of IT's focus in the recent past (and in truth, even now) has been on private cloud, any analytics that show the growth of public cloud give us a sense of how the hybrid idea will progress. The business use of SaaS is increasingly driving a hybrid model by default. Much of hybrid cloud use comes because of initial trials of public cloud services. As business users adopt more public cloud, SaaS in particular, they will need more support from companies, such as Pulsant, to help provide solutions for true integration and governance of their cloud. The challenge, as always in the cloud arena, is that there is no strict definition of the term ‘hybrid.’ There has been, until recently, a distinct lack of vendors and service providers able to offer simple solutions to some of the day-to-day challenges faced by most companies who are trying to develop a cloud strategy. Challenges include those of governance, security, consistent experiences between private and public services and the ability to simply ‘build once’ and ‘operate everywhere’. Enter Azure Stack. 
For the first time, you have a service provider (for that’s what Microsoft is becoming) that is addressing what hybrid IT really means and how to make it simple and easy to use. [easy-tweet tweet="Business use of SaaS is increasingly driving a hybrid model by default." user="PulsantUK" hashtags="cloud, hybrid, tech"]
So what is Azure Stack?
Azure Stack is, simply put, Microsoft’s Azure public cloud services brought into an organisation’s own datacentre, providing a private cloud solution. Under the hood, Azure Stack runs Microsoft’s vNext technology ‘stack’: Windows Server 2016, Hyper-V and Microsoft networking and storage. In truth, when it launches later this year, it will only have a limited number of Azure public services ready to go – but the exciting bit is that, no matter how you look at it, you will be running Microsoft’s Azure on-premises. It’s not just “something that is functionally similar to Azure”; running Azure Stack is running Microsoft’s public Azure cloud in your datacentre. The obvious first question on people’s minds is perhaps ‘why run a cloud service offering in your own datacentre at all’? Isn’t the whole idea of (public) cloud to push workloads out of your own organisation to third-party hosting solutions to minimise IT costs? Well, yes. Offloading infrastructure and datacentre workloads to the public cloud has been the main (but not only) use of public cloud so far. The challenge that still faces many CIOs and CTOs is that there are many workloads and services that they cannot, or refuse to, put in the public cloud (right now). Common reasons include security, compliance and assured reliability, along with the fact that core business apps are not (currently) allowed outside of the datacentre. One way of addressing this issue has been to build private clouds on-premises. This works, but often these are ‘disconnected’ from the public cloud services a company may be using, are poorly integrated (if at all) with other cloud services, and are often bespoke solutions that cannot make use of public cloud services without a lot of re-engineering, redevelopment and ‘know-how’. What Azure Stack does for the first time is provide a consistent platform that allows companies to, for example, stage the development and implementation of apps on-premises in a private Azure Stack cloud and then seamlessly move or scale out to the Azure public cloud. Azure Stack provides exactly the same environment on-premises as in the public cloud – not something similar, but an on-premises dev/test and internal datacentre environment that is identical to the public cloud (Azure) environment.
How is Azure Stack (or Azure public) better than an existing VM environment?
This is the simple question that completely differentiates Azure (public) and Azure Stack from a traditional VM-based environment. When you understand this, you understand how Azure Stack is a disruptive and game-changing technology. For a long time now, application scalability has been achieved by simply adding more servers (memory, processors, storage, etc.). If there was a need for more capacity, the answer was “add more servers”. Ten years ago, that still meant buying another physical server and putting it in a rack. With virtualisation (VMware, Hyper-V, OpenStack) it has been greatly simplified, with the ability to simply “spin up” another virtual machine on request. Even this is now being superseded by the advent of cloud technologies.
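To illustrate the shift described in the next paragraphs, the toy loop below sketches cloud-style scaling: capacity is expressed as a desired instance count derived from load, and the platform is asked to converge on it. The `get_load()` and `set_instance_count()` callables are hypothetical placeholders standing in for whatever metrics and scaling interface a real platform exposes; this is not any specific Azure or Azure Stack API.

```python
import math
import time

# Illustrative autoscaling loop: instead of manually provisioning VMs,
# declare how much capacity is needed and let the platform converge on it.
REQUESTS_PER_INSTANCE = 500          # assumed capacity of one instance
MIN_INSTANCES, MAX_INSTANCES = 2, 20

def desired_instances(requests_per_second: float) -> int:
    wanted = math.ceil(requests_per_second / REQUESTS_PER_INSTANCE)
    return max(MIN_INSTANCES, min(MAX_INSTANCES, wanted))

def autoscale(get_load, set_instance_count, interval_seconds=60):
    while True:
        target = desired_instances(get_load())
        set_instance_count(target)   # platform adds or removes capacity
        time.sleep(interval_seconds)
```

The unit of thinking becomes "how much capacity does the application need right now" rather than "which virtual machines do I have to build, patch and retire", which is exactly the contrast drawn below.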
Virtualisation may have freed companies from having to buy and own hardware (a capital drain, with the constant need for upgrades), but companies still have the overhead of an operating system (Windows/Linux), possibly a core application (e.g. Microsoft SQL Server) and, most annoyingly, a raft of servers and software to patch, maintain and manage. Even with virtualisation, there is a lot of overhead required to run applications when dozens of virtual machines are hosting the applications and services being used. [easy-tweet tweet="Even with virtualisation there is a lot of overhead required to run applications" user="PulsantUK" hashtags="tech, cloud, virtualisation"] The public cloud takes the next step: it aggregates things like CPUs, storage, networking, database tiers and web tiers, allocates a company the amount of capacity it needs, and gives applications the necessary resources dynamically. More importantly, resources can be added and removed at a moment’s notice without the need to add or remove VMs. This in turn means fewer virtual machines to patch and manage, and so less overhead. The point of Azure Stack is that it takes the benefits of public cloud and extends them to the next logical step in this journey — bringing those exact capabilities and services into your (private) datacentre. This will enable a host of new ideas, letting companies develop a whole Azure Stack ecosystem where:
- Hosting companies can sell private Azure services direct from their datacentres
- System integrators can design, deploy and operate an Azure solution once but deliver it in both private and public clouds
- ISVs can write Azure-compatible software once and deploy it in both private and public clouds
- Managed service providers can deploy, customise and operate Azure Stack themselves
[easy-tweet tweet="Azure Stack takes the benefits of public cloud and takes the next logical step" user="PulsantUK" hashtags="cloud, hybrid, tech"] Azure Stack will be a disruptive, game-changing technology for cloud service providers and their customers. It will completely change how datacentres manage large-scale applications, and even address dev/test and highly secure, scalable apps. It will be how hosting companies will offer true hybrid cloud services in the future.
### Japan’s special relationship with robots: Tech fact or fiction?
There has been much written and said about Japan’s appreciation of all things robotic. This oft-quoted cultural phenomenon has been propped up with plenty of explanations – cultural, economic and even spiritual – but solid evidence is harder to come by. Is it true to say that the Japanese view robots as friendly helpers while the West sees them as potential agents of the apocalypse? Or is this simply tech fiction posing as fact? [easy-tweet tweet="Are #robotics simply #tech fiction posing as fact?" hashtags="futurology"]
An economic necessity
It is true that Japan is one of the world leaders when it comes to supplying industrial robots, and the country’s government has plans for this to continue, aiming to grow robot sales from 600 billion yen (£4.6 billion) a year to 2.4 trillion yen (£18 billion) by 2020. And it is also true that a very real economic problem is driving the adoption of robotics on a mass scale. Japan’s rapidly ageing population means that individuals of working age will soon struggle to support their dependants.
Predictions for 2060 indicate that 40 percent of the population will be over 65 years of age. This, coupled with the country’s low immigration rates - less than 2 percent of Japanese residents were born outside the country - leaves an economic crisis looming on the horizon. Robotic workers offer a potential solution and will in the future, perhaps, be able to offer the necessary care required by Japan’s greying populace. However, outside of the aforementioned industrial robots and the high-profile, but experimental, humanoid robots picked up by the media (Honda’s Asimo, for example), the average Japanese citizen has very little interaction with robotic technology. The future may see every Japanese home ably assisted by an automaton helper, but the present certainly does not. [easy-tweet tweet="The future may see every Japanese home ably assisted by an automaton helper, but the present does not"]
A cultural fascination
Culturally, Japan is also viewed as having a more welcoming attitude to robots than the West. The oft-touted view is that Hollywood sees robots as killing machines, while Japanese media is much more likely to portray them as a benign presence. Examples of the man-versus-machine trope in cinema include Terminator, Blade Runner and I, Robot. But is this surprising, or even unique to robotics? Hollywood has a long history of sensationalism, and for every Skynet looking to take over the world, there’s a WALL-E or R2-D2 trying to save it. Japan’s love of robotics, on the other hand, is evidenced by Astro Boy, the hugely popular story of a humanoid robot that acts alongside humans to save the Earth. However, when you consider that Astro Boy was often battling other human-hating robots, the East-West divide becomes less clear-cut. Japan certainly has a pronounced interest in robotics, and some cutting-edge research is emanating from the Asian archipelago. Robots that can play baseball, create music and even cook have been built, but remain at an early stage of development. These innovative creations feed into our preconceived notion of Japan as the home of quirky, experimental robotics, but there are institutions all over the world conducting similar research, whether it’s Estonian delivery robots or robotic surgeons in the US.
A religious origin
Japan’s love for mechanical objects has also been attributed to certain teachings found in the local Shinto religion. Some consider the religion to be a form of animism, a set of beliefs in which non-human entities contain a spiritual essence. While this is normally applied to natural phenomena like mountains, rivers and animals, some have claimed that Japan’s fondness for humanoid machines also stems from these beliefs. Everything from the aforementioned Astro Boy to Japan’s 18th-century karakuri puppets has been attributed to Shinto’s animist tendencies, but we see a similar fascination with animatronics in Western cultures without a strong connection to animism. Toy Story, Furbies, and Leonardo da Vinci’s 15th-century mechanical knight are all examples of how widespread and enduring mankind’s interest in robotics is. [easy-tweet tweet="Japan’s love for mechanical objects has also been attributed to certain teachings found in the local Shinto religion" hashtags="tech, future"] Japan certainly has a strong affinity with robotic technology, particularly when it comes to industrial and humanoid robots, but to ascribe this to some special cultural, or even religious, relationship is a stretch too far.
Perhaps Japan’s position as a world leader in robotics is more practical - stemming from its aging population and the threat of economic decline. In any case, Japan’s technological ambition is surely one that many other countries will follow. In the future, the robotics industry may well prop up many others, from manufacturing to healthcare, and the work being undertaken by engineers and developers in Japan and abroad is likely to deliver global benefits. ### Arrow announces strategic investment Arrow is pleased to announce the completion of a strategic investment by Growth Capital Partners (GCP) in order to fast track growth and secure future M&A activity. Recent strategy has seen Arrow completing seven acquisitions in the last six years and whilst this has seen the business nearly treble in size and successfully transform from its mobile roots to a much broader based business communications supplier, the Arrow team are focused on complementing organic growth with larger acquisition opportunities which strengthen their hosted, data and IT services portfolio. GCP is an independent private equity company which invests in growing UK SMEs and will provide a significant acquisition fund to enable Arrow to continue their ambitious growth strategy and further develop their presence in the industry. Chris Russell, CEO of Arrow Business Communications, commented: “I am delighted that GCP has become a major investor in our business and look forward to working in partnership with them in order to help Arrow continue to flourish. GCP has an excellent reputation for supporting SME businesses in the UK and we welcome their support, experience and enthusiasm.” Arrow and GCP will work in partnership moving forward with both parties holding a 50% shareholding in the business. Led by Chris Russell and the current management team, Arrow will continue to operate from the same 5 locations delivering a personal level of support to their base of customers throughout the UK. Chris Russell, CEO of Arrow Business Communications, commented: “We knew that GCP were the partner for us from the moment that we met them and I am delighted that we have been able to complete this deal. This is testament to the hard work contributed by the Arrow team over the last 6 years.” Richard Shaw, Investment Director at GCP, commented: “We are delighted to have completed an investment to support the exciting growth plans of the Arrow management team. With a market position supported by exceptional levels of service differentiation and a proven buy and build track record, Arrow is ideally placed to benefit from continued business communication convergence. With our funding support and technology sector M&A experience we look forward to supporting the next stage of the buy and build strategy.” Arrow continues to reinforce its position in the market as a leading telecommunications and IT supplier to businesses throughout the UK. Arrow’s primary driver is to nurture close customer relationships through personal contact whilst offering dynamic solutions to suit all types of business. ### Missguided expands omni channel operations with Eurostop Online fashion brand selects Eurostop EPOS solutions for their first store in London’s Westfield centre, Stratford Eurostop, a leading supplier to fashion, footwear and lifestyle sectors, has announced that Missguided, the international online fashion brand, is using Eurostop epos-Touch software in its first bricks and mortar location at Westfield shopping centre in Stratford, London. 
Based in Greater Manchester, Missguided launched in 2009 selling fashion clothing for women aged 16-25 and now sells online to customers in 160 countries worldwide. The e-pos Touch software provided by Eurostop will be integrated into Missguided's ERP system. e-pos Touch is an intuitive, easy-to-use till system that will provide many business benefits for the Missguided management team, including real-time stock and sales information and the collection of customer contact data for marketing purposes. Deborah Loh, Marketing Manager at Eurostop, stated: "Using e-pos Touch in store enables retailers to capture real-time customer and sales data that they can use to develop omni-channel strategies across their business. We are delighted to be working with Missguided as they look to expand their multi-channel operations."

### Historic Dell and EMC Transaction Set to Close
Dell Inc. and EMC Corp. (NYSE: EMC) today announced that they intend to close the transaction to combine Dell and EMC on Wednesday, September 7, 2016. Dell Technologies, the name of the new combined company, will begin operating immediately following the close of the transaction. Today's announcement follows regulatory approval of the Dell and EMC transaction by China's Ministry of Commerce (MOFCOM), which has granted clearance for the companies' proposed combination. MOFCOM approval was the final regulatory condition to closing the transaction. EMC shareholders approved the transaction on July 19, with approximately 98 percent of voting EMC shareholders casting their votes in favor of the merger, representing approximately 74 percent of EMC's outstanding common stock. "This is an historic moment for both Dell and EMC. Combined, we will be exceptionally well-positioned for growth in the most strategic areas of next generation IT including digital transformation, software-defined data center, converged infrastructure, hybrid cloud, mobile and security," said Michael Dell, chairman and CEO of Dell Technologies. "Our investments in R&D and innovation, along with our 140,000 team members around the world, will give us unmatched scale, strength and flexibility, deepening our relationships with customers of all sizes." "I am proud of everything we've built at EMC – from humble beginnings as a Boston-based startup to a global, world-class technology company with an unyielding dedication to our customers," said Joe Tucci, chairman and chief executive officer of EMC. "The combination of Dell and EMC creates a new powerhouse in the industry - providing the essential technology for the next era in IT." At closing, EMC shareholders will receive $24.05 per share in cash in addition to a newly issued tracking stock linked to a portion of EMC's economic interest in the VMware business. Based on the estimated number of EMC shares outstanding at the close of the transaction, EMC shareholders are expected to receive approximately 0.111 shares of new tracking stock for each EMC share. Upon close of the transaction, EMC shares under the ticker symbol "EMC" will be suspended from trading on the New York Stock Exchange. Shares of the tracking stock, trading under the ticker symbol "DVMT," are expected to begin trading on the New York Stock Exchange on Wednesday, September 7.

### PCI FAQs and Myths
Payment Card Industry (PCI) compliance is a set of security standards that governs how businesses handle credit and debit card data.
Businesses must comply with these rules, outlined by the PCI Security Standards Council, in order for their merchant account to remain in good standing. [easy-tweet tweet="Every business that accepts credit/debit cards needs to obey these PCI standards" hashtags="tech, payment, cloud, data"] Every business that accepts credit/debit cards needs to obey these PCI standards, no matter what processing method it uses. BluePay has put together a slideshow guide, answering some FAQs and debunking some myths surrounding PCI compliance. Check it out:

### Logicalis Financial Services expands with new senior appointments
Logicalis Financial Services has made three senior appointments in the UK, North America and Australia. The Group appointed Paul Tweehuysen as Commercial Director based in the UK, Paul Christensen as Director of North America and Mark Hegarty as Director of Australia and New Zealand. Logicalis Financial Services is responsible for providing financial solutions and products to end-user customers of Logicalis to drive incremental business and profitability. Jens Montanana, CEO of Datatec Group, the parent company of Logicalis and Logicalis Financial Services, said: "All of these appointments make valuable additions to the team as the business continues to develop its offering across the UK, North America, Australia and New Zealand. Logicalis Financial Services' innovative financing solutions are instrumental to enhancing synergies across the Group and significantly contribute to our wider offering. The financing solutions Logicalis Financial Services offer customers are a key point of differentiation for us in a highly competitive and ever-changing marketplace." Paul Tweehuysen has joined Logicalis Financial Services with a key role in working with the sales teams to develop and execute transactions. He brings over 20 years' experience in financial services and has also joined the Senior Management Team. Previously Paul was Global Engagement Manager (EMEA) at Cisco Capital and Leasing Manager at Stanley Security Solutions Ltd. Paul Christensen, who has joined Logicalis Financial Services as Director of North America, has more than 25 years' experience in the financial services industry. Paul previously worked for CIT as Chief Sales Officer of Avaya Financial Services and Regional Finance Manager of EMC Global Financial Services. Mark Hegarty joined Logicalis Financial Services as Director of Australia and New Zealand and also has more than 25 years' experience in the financial services industry. Mark has previously worked for CNH Industrial Capital as National Sales Manager and as Managing Director at Future Finance Corporation. Christian Roelofs, Managing Director at Logicalis Financial Services, commented: "Paul T, Paul C and Mark bring a great deal of expertise and creativity to Logicalis Financial Services and are ideally suited to support the business in the next phase of its development. With the expansion of our team we now have a foundation across our core geographies to provide Logicalis with unique financial solutions. By integrating the financial services offering with the customer offering from the outset we can provide a genuine competitive advantage. With the shift towards Everything as a Service (EaaS), investment decisions are no longer simply a choice between leasing or buying. Enabling a new alternative to better address Logicalis customers' needs is our top priority."

### Do diagnostic engineers have a future with the IoT?
Senseye, the Uptime as a Service company, will be revealing how it's making diagnostic engineers a thing of the past at the NMI Future World 2016 "Deep Tech" conference, to be held in London, UK, on 15-16 September. Senseye's product PROGNOSYS is revolutionising predictive maintenance by automating the analysis of condition monitoring data and enabling prognostics – diagnosing current and future failures, automatically. The software is cloud-based and easy to use, and is already trusted by Fortune 100 companies to help forecast failure, lower unplanned downtime and save operational costs. The Industrial Internet of Things generates huge amounts of machine condition information, but this has been largely untapped due to human and technical limitations. Senseye will discuss where the disciplines of condition monitoring and prognostics have come from and how they are being shaped by advances in cloud computing, machine learning and artificial intelligence, making the traditional role of the human diagnostic engineer a thing of the past. Robert Russell, CTO of Senseye, says: "We're excited to discuss with our peers how advances in AI and Machine Learning are helping manufacturers in the real world; the IIoT we've been promised for some time is finally arriving!" "Senseye are a great example of a growing UK tech company that can exploit know-how in machine learning to deliver high-value solutions - in this case, helping to take manufacturing technology and capability to a new level," says Alastair McGibbon, CTO at NMI. Future World is a conference for entrepreneurs and businesses engaging in "Deep Tech". These developments are consistently introducing disruptions across domains from agriculture to medical, industrial to fashion. Senseye is offering a free assessment of how much manufacturers could save with its cloud-based, easy-to-use diagnostics, prognostics and condition monitoring product.

### Reduced Buyer Friction - Key to OS-Pay Success
New findings from Juniper Research claim that the value of digital and physical goods purchased through mobile 'OS-Pay' platforms will increase by fifteen times in the next two years. A combination of in-app purchases and website retail payments is projected to drive annual spend via Apple Pay and Android Pay to $8 billion in 2018, up from $540 million this year.

Seamless Transactions Key to Mobile Success
The new study, Mobile & Online Remote Payments for Digital & Physical Goods: Opportunities & Forecasts 2016-2021, found that the integration of OS-Pay into apps will prove to be a 'no-brainer' for developers looking to reduce buyer friction, where password entry on smartphones remains cumbersome. Staples, for example, has already reported that over 30% of its iOS app users make in-app purchases through Apple Pay. Meanwhile, despite the continued contraction of the consumer tablet market, more than 85% of remote goods payments are forecast to be made using mobile devices in 2021. "It is clear that even in markets where PCs and laptops have a high installed base, the smartphone is playing an increasingly important role where remote goods purchases are concerned," noted research author Steffen Sorrell. "For merchants, this means that the buyer experience must be made as frictionless as possible – from product search and discovery to purchase."

PayPal Beware
Apple has recently signalled its intent to offer Apple Pay to online merchants by the end of 2016, offering a similar payment mechanism to PayPal.
The research anticipates that this move will be welcomed by most merchants, as long as integration into their storefronts is made simple and rates are competitive. Meanwhile, Juniper expects the entry of Android Pay into this space once further country expansion has been achieved. This will no doubt boost the market further, attracting merchants who may have been undecided where Apple Pay is concerned. These OS-Pay solutions are likely to pose a threat to PayPal's Western dominance, although Juniper does not anticipate that combined sales via Apple Pay and Android Pay will approach those via PayPal for at least the next five years. The whitepaper, OS-Pay to Shake Up Remote Retail, is available to download from the Juniper website together with further details of the full research. Juniper Research provides research and analytical services to the global hi-tech communications sector, offering consultancy, analyst reports and industry commentary.

### Best Database Management Practices that All Remote DBA Experts Should Know
When setting up a database, one of the most difficult stages is picking the programming language to use. There are many options to choose from and not all of them will guarantee the reliability you deserve. However, deciding to work with Python could be one of the best choices you will ever make. [easy-tweet tweet="When setting up a database, one of the most difficult stages is picking the programming language" hashtags="database, tech"] Python is a general-purpose programming language distinguished by its high readability when compared to other languages. The language is also more efficient because programmers use fewer lines of code to achieve their objectives. It is highly preferred by programmers because it is open source, helps accelerate code development, has innumerable resources to help programmers solve problems and also offers advanced data structures like sets, tuples, lists, dictionaries and more. That said, creating a database with Python does not guarantee that your website will never have any problems in the future. There are a number of database management practices you need to consider to ensure that your Python database runs smoothly.

Optimise your staffing
The most important thing you need to do in database management is to optimise your staffing. This means you need well-organised, highly trained and experienced teams to help with your database needs. Your staff will also need ongoing training to equip them with the skills to leverage new technologies and to reinforce their management strategies. You need a team that is able to quickly identify problems and resolve them effectively. It is also important that you assign tasks to your team so that everyone knows who is supposed to be working on what. Do not just hire a single person to handle your database management needs. You need a team. Staffing issues contribute to most of the challenges in database management. Do you have high data entry staff turnover? Are you straining your existing resources? If you answered yes to any of these questions, it is time to find better ways of motivating your team.

Review the products and performance regularly
Keeping a close eye on the products involved in managing your database is just as important. Start by looking at what is installed, where it is installed and how it is being managed.
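As a rough illustration of what such a regular review can look like in practice, the sketch below snapshots a database schema and diffs it against a previously saved baseline, so that anything added, dropped or altered since the last review stands out. It assumes a SQLite database and uses made-up file names ('inventory.db', 'baseline.json'); it is a minimal example of the idea rather than a full monitoring tool.

```python
import json
import sqlite3

def snapshot_schema(db_path):
    """Capture table and index names plus their definitions from a SQLite database."""
    with sqlite3.connect(db_path) as conn:
        rows = conn.execute(
            "SELECT type, name, sql FROM sqlite_master "
            "WHERE type IN ('table', 'index') ORDER BY name"
        ).fetchall()
    return {f"{obj_type}:{name}": sql for obj_type, name, sql in rows}

def compare_to_baseline(db_path, baseline_path):
    """Report objects added, dropped or altered since the baseline was recorded."""
    current = snapshot_schema(db_path)
    with open(baseline_path) as fh:
        baseline = json.load(fh)
    return {
        "added": sorted(set(current) - set(baseline)),
        "dropped": sorted(set(baseline) - set(current)),
        "altered": sorted(k for k in current.keys() & baseline.keys()
                          if current[k] != baseline[k]),
    }

if __name__ == "__main__":
    # 'inventory.db' and 'baseline.json' are placeholder names for this sketch.
    print(compare_to_baseline("inventory.db", "baseline.json"))
```

The same pattern extends naturally to tracking installed software versions and configuration files alongside the schema itself.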
It is highly recommended to engage in product assessment at least once every three months. This will enable you to track the rate of change and help you comprehensively understand infrastructure expansion. It is only with close monitoring that you will be able to act proactively and prevent total system failure. Regular reviews will further enable you to prevent any unauthorised augmentation of your database. Needless to say, such changes can negatively affect the performance of your database or even put it at great risk. To closely monitor the database, DBAs need to maintain physical and logical models of the database. These will be their baseline, so that they are able to identify unauthorised changes with ease and fix them as quickly as possible. [easy-tweet tweet="Regular reviews will further enable you to prevent any unauthorised augmentation of your database." hashtags="data, cloud, tech"] Poor documentation is a common cause of delays in problem resolution. Therefore, whether you are working with onsite or remote DBA experts, it is imperative that you ensure they have documentation of the physical and logical models.

Improve the performance of critical applications
A challenge most organisations face is the performance of their essential applications. Improper configuration, poorly planned database layouts and a lack of insight into performance are some of the most common culprits. To prevent performance issues, the first thing that ought to be done is to come up with a good indexing strategy. This is also a crucial step in making sure that the organisation is ready for big data. It is up to your team to constantly review the indexes based on your usage. Unused and duplicate indexes need to be removed or they will bloat the database, while missing indexes should be added. Before you start deleting anything from your database, take time to double-check that it is not needed, and make sure you back up your database before removing the unneeded indexes.

Fix the data integration problems
Data integration problems are among the main issues that DBAs have to deal with, and they have intensified with the introduction of new data sources such as the Internet of Things, social media and the Cloud. This has led to applications with both structured and unstructured data stored in different locations, which has worsened the problem. Picking a data warehouse environment is the key to resolving most of the integration issues. You need to come up with standard operating procedures that enable you to convert from one data format to the desired format. Investing in the right software will further speed up the process. What you need to resolve data integration issues is the right architecture and tools. In addition, you need to implement data auditing processes with clearly assigned responsibilities. This will enable you to guarantee superior data quality. The data needs to be cleaned before being converted and integrated. [easy-tweet tweet="#Data needs to be cleaned before being converted and integrated." hashtags="cloud, tech"]

Take security seriously
Security threats are probably the main cause of database problems. The last thing you want is for hackers to gain access to your customers' crucial data. This would not only expose you to legal issues but also tarnish your company's reputation. The best way forward is to ensure that security problems never take place. This will only be achieved with a stringent security management strategy. You must check the applications that are accessing your database and ensure they are updated to tighten security. Encrypting communications between the apps and the Cloud through SSL/TLS will further improve the security of your Python database. The key to guaranteeing the security of your Python database is making sure that you constantly check the database for problems and keep up with the trends in database management. You can also outsource the work to a third party for personalised support.
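On the encryption point, a minimal sketch of an in-transit-encrypted connection is shown below. It assumes a PostgreSQL database reached over the network and the psycopg2 driver; the host, credentials and certificate path are placeholders read from the environment rather than hard-coded.

```python
import os
import psycopg2

def connect_securely():
    """Open a database connection that is encrypted in transit and verifies the server."""
    return psycopg2.connect(
        host=os.environ["DB_HOST"],              # e.g. a managed cloud database endpoint
        dbname=os.environ["DB_NAME"],
        user=os.environ["DB_USER"],
        password=os.environ["DB_PASSWORD"],      # keep credentials out of source code
        sslmode="verify-full",                   # require TLS and verify the server certificate
        sslrootcert="/etc/ssl/certs/db-ca.pem",  # placeholder path to the CA certificate
    )

if __name__ == "__main__":
    with connect_securely() as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT version();")
            print(cur.fetchone())
```

Pairing transport encryption like this with patched client applications and regularly rotated credentials covers the most common attack paths described above.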
### New EMC Hybrid Cloud Platform Enhancements Simplify the Journey to Cloud, Ignite IT Transformation
News Highlights:
- EMC announces general availability for Enterprise Hybrid Cloud v4.0
- Enterprise Hybrid Cloud v4.0 enhancements include expanded multi-site support, enhanced data protection and anytime VM encryption
- New Native Hybrid Cloud offering, built on hyper-converged VCE VxRail Appliances, provides a platform that delivers a simple and easy path toward the development and deployment of cloud-native applications

Full Story: EMC (NYSE: EMC) today announced general availability for EMC® Enterprise Hybrid Cloud v4.0, as well as planned enhancements for both Enterprise Hybrid Cloud and EMC Native Hybrid Cloud, designed to help customers advance their journey for both traditional and cloud-native applications. Enterprise Hybrid Cloud v4.0 extends IT's ability to update legacy application infrastructure to meet the demands of a digital world. A new Native Hybrid Cloud offering built on hyper-converged VCE® VxRail™ Appliances provides a platform that delivers a simple, fast and easy path toward the development and deployment of cloud-native applications. Understanding the needs of organizations to transform to a more modern, automated, self-service IT environment, EMC continues to develop and deliver innovations into its cloud platforms that are designed, supported and sustained as one to help customers simplify and accelerate their digital business initiatives.

Enterprise Hybrid Cloud
For businesses primarily running business-critical applications such as SAP, Oracle, or Microsoft Exchange, Enterprise Hybrid Cloud automates the delivery of infrastructure services to improve predictability, reliability, and efficiency of those services while increasing resource utilization. New capabilities in Enterprise Hybrid Cloud v4.0 include:
- Multi-Site Support: Extends Enterprise Hybrid Cloud from dual-site to up to four datacenter sites with up to four instances of vCenter. This enables EMC customers with physically distributed data centers to centrally manage their multi-site cloud using a self-service catalog
- Enhanced Data Protection: Data protection choices can now span multiple sites. In addition, new data protection options enable recovery to any point in time from outages, corruption or disasters for individual or groups of Virtual Machines via the self-service catalog. And enhancements to backup support allow modifications to backup service levels to be performed at any time throughout the lifecycle of a workload from the self-service catalog
- Anytime VM Encryption: The ability to add or remove encryption for VMs via the self-service catalog at any time in the lifecycle of a VM

Additional enhancements to Enterprise Hybrid Cloud are targeted for later this year.
These enhancements are planned to include new workflows that automate application-to-infrastructure provisioning to reduce the time and complexity of delivering new services.

Native Hybrid Cloud
For organizations looking to differentiate their businesses through the rapid development of cloud-native applications, Native Hybrid Cloud offers a turnkey Pivotal Cloud Foundry platform. EMC today announced plans for a new Native Hybrid Cloud offering based on the VCE VxRail 200 and 200F models. Designed to start small and quickly scale as demand for new applications grows, the platform is planned to provide organizations with a flexible, cost-effective and optimized hybrid cloud platform for cloud-native application development and deployment that is inclusive of IaaS, PaaS and developer and IT ops services. Benefits include:
- Customers can now quickly deploy a turnkey modern application development platform into their environment while leveraging their existing and proven investments in VMware infrastructure and knowledge of VMware vSphere and Virtual SAN technologies
- Delivers centralized infrastructure resource management, provisioning and performance evaluation from the familiar vCenter Server Console
- Delivers the benefits of EMC converged infrastructure: increased agility, simplified operations and lower risk

General availability of Native Hybrid Cloud on VCE VxRail Appliances 200 and 200F is planned for the end of the third quarter of 2016.

About EMC
EMC Corporation is a global leader in enabling businesses and service providers to transform their operations and deliver IT as a service. Fundamental to this transformation is cloud computing. Through innovative products and services, EMC accelerates the journey to cloud computing, helping IT departments to store, manage, protect and analyze their most valuable asset — information — in a more agile, trusted and cost-efficient way. EMC, VCE, and VxRail are trademarks or registered trademarks of EMC Corporation in the United States and/or other countries. All other trademarks used herein are property of their respective owners.

### Don't get grounded like Delta – challenge your Disaster Recovery plan
You can try and plan for it. You can train for it as much as possible. You can create multiple systems designed to prevent it. When a disaster does occur, it's usually when you least expect it. [easy-tweet tweet="Companies can lose up to $5,600 per minute on average during an outage" hashtags="tech, cloud, security"] Just ask Delta Air Lines. A huge power outage in the US delayed and cancelled flights worldwide, causing widespread chaos and significant damage to the Delta brand. Enterprises and businesses should view this as a cautionary tale – having a tried and tested disaster recovery plan in place is critical. I can't stress enough how imperative it is to have a disaster recovery plan in place that ensures you're prepared to cope with any disaster that might come along. The aim of any disaster recovery plan should be to get back up and running as soon as possible. There's the damage to your brand to consider, not to mention the costs involved. According to Gartner, companies can lose up to $5,600 per minute on average during an outage. To make sure your disaster recovery plan is up to the task, consider these six questions:

1. Has your disaster recovery team set roles and responsibilities?
Each member of your disaster recovery team needs a clearly defined role.
When a disaster hits, everyone needs to know exactly what their job is to ensure the plan is carried out without hesitation or mistakes. Contact information for each member is a must, as is clarity on each person's role, both for them and for the rest of the team. It sounds minor, but this will make all the difference in the event of a disaster.

2. Will your budget cover the unexpected?
In the same way a disaster can be unexpected, so can the costs involved in getting back online. Do you need cloud support? Usage fees need to be factored in. Will there be a need for external consultants to assist the disaster recovery team? They will most likely be more expensive than your employees, so there needs to be the necessary budget in place. Having a surplus in case of a disaster is key. Every minute of downtime can result in huge IT expenses, especially for larger enterprises. Considering these additional expenses and building them into the plan is critical to ensure your team doesn't meet any further snags when trying to get authorisation for these costs.

3. Can you mobilise your data?
[easy-tweet tweet="Data mobility means immediate and self-service access to data anytime" hashtags="data, tech, recovery"] If a disaster strikes, you'll need to consider the mobility of your data – is it confined to your physical infrastructure or can your data move freely to different locations? Data mobility means immediate and self-service access to data anytime, anywhere. It enables accelerated application development, faster testing and business acceptance, more development output, improved productivity and improved time-to-impact business intelligence. Without data mobility, recovery costs and the time needed to get back online can rise significantly.

4. What is most likely to go wrong with your data?
Once you have your plan in place, it's important to analyse it closely and be aware of what is most likely to go wrong when executing it. Home in on those vulnerabilities and ensure they are corrected. Finding ways to avoid them will make for a much smoother disaster recovery process.

5. Has your plan been put to the test?
Test, test and test again – making sure your disaster recovery plan is ready to be rolled out successfully at any time is an absolute must. After testing, analyse the results to see if the plan performed according to the specifications. Are there any areas that need to be ironed out or strengthened? Testing a disaster recovery plan should be done regularly so the plan can evolve and become the best it can possibly be.

6. Are your systems up to date?
Finally, having up-to-date systems is essential when forming a strong disaster recovery plan. Do you have the most modern backup and recovery solution, or is it just another one of the same traditional solutions that don't work? Considering the newest technologies, such as copy data virtualisation, will bring the way you manage disaster recovery to the next level. The risks associated with poor disaster recovery - data loss, unforeseen budget expenses, loss of customer trust - can be avoided by taking all of these elements into consideration during your planning phase.
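To put the budgeting question into perspective, the short sketch below turns the Gartner per-minute figure quoted earlier into a rough outage estimate. Apart from the $5,600 per minute, every number here is invented purely for illustration.

```python
# Back-of-the-envelope outage cost estimate (illustrative figures only).
COST_PER_MINUTE = 5_600      # Gartner's average cost of downtime, in USD
OUTAGE_MINUTES = 90          # hypothetical outage duration
CONSULTANT_DAY_RATE = 1_500  # hypothetical external consultant day rate, in USD
CONSULTANT_DAYS = 3          # hypothetical engagement length

downtime_cost = COST_PER_MINUTE * OUTAGE_MINUTES
recovery_cost = CONSULTANT_DAY_RATE * CONSULTANT_DAYS
total = downtime_cost + recovery_cost

# A 90-minute outage alone comes to $504,000 before any recovery spend is added.
print(f"Downtime: ${downtime_cost:,}  Recovery: ${recovery_cost:,}  Total: ${total:,}")
```

Even with conservative assumptions, the resulting figure dwarfs the cost of budgeting a surplus and testing the plan in advance.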
### New EMC XtremIO Features Offer Extensive App Integration and Automated Storage Management Workflows for Virtualized Environments
Updates Are Latest In Continued XtremIO Investment To Help Organizations Accelerate Their Journey To The Modern Data Center
News Highlights:
- EMC announces new EMC XtremIO application integration and management features for virtual machines via SMI-S and VMware vRealize Orchestrator
- Updates extend the full power of XtremIO iCDM functionality to virtual environments
- Simplifies storage management in virtual environments with XtremIO Management Server leveraging EMC Virtual Storage Integrator (VSI) and EMC AppSync
- EMC plans further application integrations and workflow automation support within the XtremIO user interface

Full Story: EMC Corporation (NYSE:EMC) today announced the release of new application integration and management features in the EMC® XtremIO® all-flash storage platform for VMware and Microsoft virtualized environments. The new XtremIO features offer a rich set of workflow automation integrations that enable application and infrastructure teams to take full advantage of simplified resource provisioning and management workflows, automation and orchestration in virtualized environments. Integrated Copy Data Management (iCDM) functionality also helps reduce the need to purchase separate dedicated hardware and software to manage copy data sprawl. Among more than a dozen updates to the XtremIO XIOS™ operating system, highlighted new features include:
- EMC and VMware product integrations between XtremIO, EMC AppSync®, EMC VSI and the VMware vCenter Web UI. This new integration enables the vCenter administrator to locally recover and restore virtual machines and datastores, and perform file-level restores from within the virtual machine. This also enables new iCDM workflows from the vCenter native user interface, allowing users to create and repurpose functional copies using XtremIO's virtual copy technology.
- New XtremIO vRealize Orchestrator Plug-in, which acts on its own or as a part of the VMware vRealize Automation workflow. The new vRealize Orchestrator plug-in allows infrastructure teams to build a self-service catalog of automated, end-to-end infrastructure workflows for virtualization administrators, DBAs, and application owners. EMC will enable VMware users to fully leverage XtremIO with VMware vRealize Automation to automate and orchestrate storage management and provisioning tasks, such as compute and storage provisioning and iCDM capabilities, natively within the VMware interface.
- Native support for Microsoft PowerShell versions 4.0 and 5.0 via the XtremIO Management Server. As a result, administrators can now benefit from a fully supported scripting environment, using PowerShell and VMware PowerCLI from XtremIO to automate and orchestrate any operation in a virtualized environment.
- New XtremIO support for the Storage Management Initiative Specification (SMI-S), allowing users to manage and provision XtremIO storage natively in their SMI-S user interface of choice. SMI-S enables broad interoperable management and provisioning of heterogeneous multi-vendor storage systems from a variety of SMI-S enabled applications. For example, XtremIO includes new support for Microsoft Hyper-V virtualized environments via the System Center Virtual Machine Manager (SCVMM) SMI-S profile. Going forward, EMC will continue adding support for other SMI-S client profiles in XtremIO.
EMC also released a beta version of XtremIO's new web-based HTML5 graphical user interface. This web-based user interface is designed to provide XMS users with a simple, lightweight option for XtremIO reporting, monitoring and management – and supports access via mobile devices. The rich set of application integration and storage workflow automation features built into XtremIO is designed to help simplify storage management and provisioning processes within major applications that are critical to both IT and business users. EMC will continue developing XtremIO to support integration and workflow automation capabilities for additional applications to help reduce the complexity of managing and provisioning storage and application copies.

See XtremIO at VMworld 2016
For the third consecutive year, XtremIO all-flash storage arrays have been chosen to power the VMworld 2016 Hands-on Labs, an extremely demanding, IO-intensive instructional environment built for testing and demonstration of a variety of virtual and cloud-based workloads with hundreds of users in more than 100 sessions over four days. XtremIO all-flash storage is part of the foundational infrastructure of the VMworld Hands-on Labs that powers the hybrid cloud, end-user computing and software-defined data center environments built on top of the VMware software stack. XtremIO delivers high IOPS and predictable sub-millisecond latencies while providing 100% storage uptime and maximum storage capacity efficiency (upwards of 12:1), to ensure a world-class user experience against the high performance and reliability demands of the Hands-on Labs environment.

Availability
New features within the latest release of the XtremIO operating system are available immediately as a free software update for customers with current support agreements. The XtremIO vRealize Orchestrator plugin will be available later this year.

### New Funding for Development of Bio-active Implants that Speed Recovery
Innovate UK is supporting research and clinical trials of a bio-active implant to treat post-operative infection after total knee replacement (TKR). This could significantly reduce the cost of post-operative complications to the NHS, estimated to be £300 million per year. Implant manufacturer MatOrtho and advanced metal coating developer the Wallwork Group are leading the programme. Significant support with clinical trials and evaluation is being provided by the Royal National Orthopaedic Hospital, University College London and Queen Mary University of London. Infection after total knee replacement is a distressing, complex and costly complication. The new implant, known as Smart Spacer, combines a bespoke TKR implant with an innovative surface coating that responds to the physical and chemical stimulation provided by use and implantation in the body. The patented chromium-nitride silver coating (CrN-Ag) is bonded directly to an existing cobalt-chromium TKR prosthesis. This will be employed as a temporary spacer device during the eradication of the periprosthetic infection. Infections that are resistant to standard antibiotics, including MRSA and Staphylococcus epidermidis, are treated directly by the Smart Spacer. This presents significant advantages over current treatments and opens up the possibility of coated prostheses eventually becoming a long-term primary treatment. Physical vapour deposition (PVD) is the technique used by Wallwork to apply the smart coating.
The company runs one of the largest specialist PVD coating facilities in Europe and manufactures the high-vacuum coating chambers used in the coating process. It has extensive development laboratories, staffed by highly trained scientists and technicians, who can optimise the nano-composite coatings to achieve the required surface characteristics and performance. In addition to the coating of medical devices, Wallwork provides surface engineering for components used in the aerospace, automotive, nuclear and other technologically advanced industries. The Smart Spacer has two initial functions. By releasing silver ions from the CrN-Ag coating, it can actively reduce infection. The coating also protects and shields the cobalt-chromium TKR device, preventing the leaching of potentially harmful metal ions into the patient. Initial trials have shown chromium and molybdenum ion leakage to be 200 times lower than for uncoated devices and well within accepted concentrations. The coating will operate in two phases, with movement and loading of the knee joint initiating a boost of silver ions from the sacrificial surface layer immediately after implantation to counter infection. After this phase the device will release lower levels of silver ions to maintain stability as recovery progresses. The programme will have several key stages. Primary development work in the laboratory will include simulated life testing of the coated knee joints by mechanical replication of normal movement. This will enable Wallwork scientists to optimise coating performance. When this phase is complete, closely supervised clinical trials will begin with patients. Conventionally, spacers are in place for six weeks after the removal of the primary TKR implant where post-operative infection has arisen. High levels of antibiotics are required to eradicate infection and permit the removal of the spacer and the fitting of a new TKR prosthesis. Smart Spacer has the potential to facilitate speedier stabilisation of the infection and may reduce the heavy reliance on antibiotics. Evidence from the performance of Smart Spacers will be used to evaluate effectiveness and may open possibilities for similarly coated TKR devices to eventually be adopted as primary knee replacements and in other surgical and dental implants. As populations live longer, the need for knee replacement increases. The consortium anticipates a high level of interest from around the world.

### Xangati to Demo Cloud Infrastructure and Control Solutions at VMworld 2016
Xangati will be running live demos of its cloud infrastructure management solutions at next week's VMworld in Las Vegas. Xangati – on Stand #2715 – will showcase its predictive analytics platform, trusted by large enterprises, public sector organisations and cloud service providers worldwide to achieve service level objectives for virtualisation infrastructure and VDI (virtual desktop infrastructure) environments. The Xangati ESP platform is a virtual appliance built on a patented in-memory architecture that collates, analyses and visualises performance data with unprecedented speed and scale across multiple software-dashboard modules. It crunches hundreds of thousands of interactions live, second by second, without the use of agents or probes, to enable predictive analytics for performance and capacity based on dynamic threshold algorithms and machine-learned heuristics.
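Xangati does not publish the details of these algorithms, but as a rough illustration of what a dynamic threshold means in practice, the hedged sketch below flags a metric sample as anomalous when it drifts well outside its own recent rolling baseline. The window size, warm-up length and sensitivity are arbitrary illustrative choices, not anything taken from the product.

```python
from collections import deque
from statistics import mean, pstdev

class DynamicThreshold:
    """Flag samples that sit far outside a rolling baseline of recent values."""

    def __init__(self, window=60, sensitivity=3.0):
        self.recent = deque(maxlen=window)  # e.g. the last 60 one-second samples
        self.sensitivity = sensitivity      # how many standard deviations count as anomalous

    def observe(self, value):
        anomalous = False
        if len(self.recent) >= 10:          # wait for a minimal history before judging
            mu, sigma = mean(self.recent), pstdev(self.recent)
            anomalous = abs(value - mu) > self.sensitivity * max(sigma, 1e-9)
        self.recent.append(value)
        return anomalous

# Example: latency samples in milliseconds; only the spike at the end is flagged.
detector = DynamicThreshold(window=30)
samples = [5, 6, 5, 7, 6, 5, 6, 7, 5, 6, 6, 5, 48]
print([detector.observe(s) for s in samples])
```

The appeal of a learned baseline over a fixed threshold is that the same logic works for a quiet VDI pool and a busy database host without manual tuning.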
vSphere cloud system administrators are able to remediate performance and capacity underutilisation issues automatically and in real time, resulting in cost reductions. Xangati's anomaly correlation analytics also identify and self-heal intermittent and chronic resource contentions, leading to improved application delivery. "With Xangati ESP for Cloud Infrastructure solutions featuring enhanced storage and cloud visibility, vSphere sysadmins can proactively solve real-time performance and capacity challenges with automated remediation in seconds, rather than the retrospective days and weeks of post-mortem analysis involved with traditional monitoring," said Atchison Frazer, CMO, Xangati. Xangati ESP for cloud infrastructure is an IT operations and analytics control platform for cloud and hybrid-cloud environments, enabling service assurance and optimised capacity utilisation with the use of virtual servers and containers. Xangati combines insights, data intelligence and predictive analytics to help enterprises evolve from traditional performance monitoring to complete control with one self-learning tool to ensure better business outcomes.

### EMC Expands "Data Protection Everywhere" Portfolio For VMware Environments
EMC Data Protection Suite, EMC Data Domain and EMC RecoverPoint for Virtual Machines Boost Speed, Simplicity and Efficiency for VMware Workloads
London, UK – August 24, 2016
News Highlights:
- EMC expands integrated data protection offerings, optimizing protection of VMware workloads on VMware Virtual SAN and VMware vSphere while making the process fast, simple and efficient
- VCE VxRail Appliance customers can now expand on built-in protection capabilities and leverage a wider range of EMC data protection products, including EMC Data Domain Virtual Edition, which provides scalable protection storage, cloud enablement, and 50% faster backups
- Launches EMC Data Protection Suite for VMware for comprehensive enterprise data protection in hyper-converged environments
- Becomes the only VM replication vendor to support vSphere APIs for I/O Filtering, enabling fully integrated virtual replication data services through RecoverPoint for VMs

Full Story: EMC Corporation (NYSE:EMC) today unveils new products and support that optimize the protection of VMware workloads across the spectrum of VMware environments to enable protection everywhere. This includes enhanced support for VMware Virtual SAN, VMware vSphere and expanded data protection options for VCE® VxRail Appliances. The integration between EMC's data protection portfolio and VMware software empowers vAdmins to provision, monitor and manage the protection of their virtual workloads through the standard VMware interface. The updated version of EMC® Data Protection Suite™ for VMware, combined with EMC Data Domain Virtual Edition, provides fast, simple and efficient protection for VMware workloads, enabling continuous end-to-end protection through the native VMware interface. Meanwhile, support in RecoverPoint for Virtual Machines (VMs) for VMware vSphere APIs for IO Filtering empowers vAdmins to manage replication through vSphere Storage Policy-Based Management, leveraging the common VMware framework.
EMC Data Protection and VCE VxRail Appliances
As IT organizations increasingly look to hyper-converged infrastructure appliances (HCIAs) to simplify and improve operational efficiency for more and more use cases, they need a wider range of additional capabilities and more cost-effective scaling to support those workloads. VCE VxRail Appliance customers can now expand on built-in protection capabilities and leverage a wider range of EMC data protection products to provide application consistency for backups as well as comprehensive monitoring, reporting and backup file search capabilities. Since VCE VxRail Appliances are powered by Virtual SAN, they can be deployed with EMC Data Protection Suite for VMs and EMC Data Domain Virtual Edition to provide a simple yet comprehensive solution for businesses which require cost-effective scaling to support workloads, or require comprehensive monitoring, analysis, reporting and backup file search capabilities, as well as backup and recovery and continuous data protection for mission-critical applications including Oracle, SAP, DB2, and more. With today's EMC Data Protection Suite for VMware and EMC Data Domain Virtual Edition announcements, EMC is making the extended protection of Virtual Machines on VxRail Appliances:

Simple
- Simplified management makes data protection for VMs fast and easy
- Centralized monitoring, analysis, reporting and search of backup data gives users one place to find out everything they need to know
- A single-vendor approach offers a unified point of contact for sales and support from hardware to software to protection and enables management through a single native interface

Fast
- Backup times can be reduced by up to 50%
- Recovery times are up to 30x faster when compared to traditional image-level restores
- By deploying solutions that are built to work together, data protection for VMs on VxRail Appliances can be up and running in minutes

Efficient
- Protection storage requirements can be reduced by up to 30x
- Up to 1,000 virtual machines can be continually protected for zero RPO
- Software-defined protection storage offers simple deployment
- Support for mission-critical applications, including Exchange, SAP, SQL, SharePoint, Oracle, DB2, Sybase, and IBM Domino, for backup and replication

EMC Data Protection and VMware Virtual SAN and VMware vSphere APIs for IO Filtering
For organizations leveraging VMware Virtual SAN, the Data Protection Suite for VMware also provides industry-leading Disaster Recovery support for Virtual SAN through EMC RecoverPoint for VMs. Having announced at VMworld 2015 that EMC would become VMware's first replication design partner, this year EMC becomes the only VM replication vendor to support VMware vSphere APIs for IO Filtering, as it brings fully integrated virtual replication data services through RecoverPoint for VMs. RecoverPoint for VMs employs advanced multi-site synchronous and asynchronous replication technology, offering up to zero RPO with any-point-in-time recovery via a highly resilient, no-single-point-of-failure architecture for the most advanced VSAN topologies. Because RecoverPoint for VMs is storage agnostic, replication is supported between VSAN and non-VSAN environments, enabling even greater flexibility. And VMware vSphere APIs for IO Filtering integration empowers vAdmins to manage replication through vSphere Storage Policy-Based Management, leveraging the common VMware framework.
EMC Storage and VMware
In addition to its data protection portfolio, EMC will be demonstrating VMware integration in its primary storage arrays at VMworld. EMC Unity is the company's first midrange storage array to deliver native support for vSphere Virtual Volumes and Storage Policy-Based Management. EMC has also made available a free community edition of EMC UnityVSA™, ideal for testing and development use. This VSA can be deployed in a vSphere environment and provides full access to the HTML5-based user interface, allowing users to get insight into how to use Unity. EMC VMAX offers continued support for Virtual Volumes, enabling native integration for enterprise storage. EMC will also demo the VMware capabilities of its XtremIO all-flash arrays, such as the VSI plugin, AppSync and ESA, at the show. With its true scale-out architecture, EMC XtremIO excels by delivering consistent and predictable performance at sub-millisecond response times, making it perfect for the demands of businesses running large numbers of VMs. And its integrated Copy Data Management and deduplication offset the storage demands created by XCOPY to deliver exceptional performance.

Availability:
- EMC Data Protection Suite for VMware is available today.
- EMC Data Domain Virtual Edition is available today.
- EMC RecoverPoint for VMs supports VMware vSphere APIs for IO Filtering integration today.
- EMC Unity and EMC VMAX support Virtual Volumes currently.

### A state of play – keeping gamers online
The rise of DDoS attacks in online gaming
From World of Warcraft to DOTA 2, online gaming has well and truly exploded thanks to games consoles such as Xbox One and PlayStation 4. Connected devices have also played a significant role in fuelling this trend and widening the demographic, with consumers increasingly using their smartphones and tablets to immerse themselves in a world of online gaming. In fact, the world of online gaming has become such a phenomenon that there are even events dedicated to this thriving industry. Last week Gamescom, the world's biggest gaming event, took place, seeing gamers from around the world come together to play the latest games and see the newest trends set to hit the industry. [easy-tweet tweet="Not a week goes by with no reports of DDoS attacks hitting the online gaming industry" hashtags="gaming, tech, security"] But just like every other online industry, the rise in cyber-attacks has affected online gaming. Distributed denial of service (DDoS) attacks, which attempt to make an online service unavailable by overwhelming it with traffic from multiple sources, are an incredibly popular way to disrupt online games. Not a week goes by where there aren't reports of DDoS attacks hitting the online gaming industry – with Pokémon Go the latest victim.

The impact of DDoS attacks
For online gaming sites, DDoS attacks are more than a mere inconvenience - they can have massive and detrimental effects on revenues and customer loyalty. According to research from Cisco, the number of DDoS attacks globally grew by 25 percent last year. In terms of gaming, DDoS attacks account for more than 5 percent of all monthly gaming-related traffic, and more than 30 percent of gaming traffic while these attacks are occurring. [easy-tweet tweet="More than 30 percent of gaming traffic while DDoS attacks are occurring." hashtags="cloud, tech, DDoS"] According to research by the Ponemon Institute, DDoS attacks on a gaming site can cause revenue losses from $40K per hour to as much as $1.5 million in a year.
Organisations that fall victim to a DDoS attack typically suffer damage in multiple ways: deteriorating customer trust, lost revenue, negative brand impact, or slowed web innovation and expansion. Online gaming sites therefore have two priorities: to protect themselves against security attacks while also delivering an outstanding user experience – a difficult balance. So how do online gaming websites ensure that they maintain their users' loyalty as well as protect their revenues?

The unlikely saviour
Ultimately, there is no way to stop a DDoS attack before it happens – but online gaming sites can mitigate the effects of an attack. While DDoS prevention appliances can help, a single appliance can cost over $50,000 – and can't always guarantee safety from the average DDoS attack. There is also the option of establishing additional data centres to fend off DDoS attacks. But again, the cost of this is high – and it makes little sense to establish additional data centres just to deal with DDoS attacks. As a result, many online gaming sites turn to cloud solutions. While it may not seem like an obvious choice, a content delivery network (CDN) with DDoS absorption capabilities can help to fight off DDoS attacks. Why? Because a CDN offers more than just DDoS protection - it provides additional capabilities that are very valuable for gaming websites. By bringing gaming content closer to gamers wherever they are in the world, a CDN can reduce the number of download failures and server crashes, ensuring a seamless experience for every gamer. A CDN with a cloud-based infrastructure can also help with spikes in player numbers. Because player numbers can be unpredictable, huge spikes in demand can impact performance – meaning a gaming website becomes a victim of its own success. A CDN can balance these increases in demand in real time, meaning performance, availability and efficiency are not compromised.

The importance of DDoS protection
But it is important to note that not every CDN is equipped to protect against DDoS attacks. In fact, some CDNs have claimed to offer DDoS protection when, in reality, they have relied on their infrastructure to scale and increase the capacity of game servers with points of presence (PoPs) located around the world. It is these PoPs that allow some DDoS attacks to be absorbed without blocking access to web content and applications. However, this alone will not do the job, and a CDN that relies solely on sheer bandwidth to absorb DDoS attacks won't help you. You need a CDN that comes with specialised DDoS expertise, which can intelligently identify malicious traffic and go beyond HTTP/HTTPS attacks.

The game goes on
For online gaming sites, reliable, fast and secure connections are critical because gamers expect no less than quick, responsive and secure gameplay. Ultimately, gaming sites need to be able to perform at their absolute best in order to generate revenues and keep gamers coming back for more – and that means fending off attacks. A CDN might offer you the right protection while improving your web performance, but ultimately you need a CDN with the right kind of DDoS protection to keep a state of play.

### The Cloud: The perfect home for big data analytics for companies of all sizes
It wasn't long ago that businesses felt real-time analytics was a laughable and unnecessary concept, but in the past twelve months the pendulum has swung in the opposite direction and the demand for real-time data insights is now seen as a critical path to success.
[easy-tweet tweet="Real-time data insights is now seen as a critical path to success." hashtags="cloud, tech, security"] Real-time data analytics used to come with a costly price tag, but how can cloud computing enable businesses of all sizes to benefit from big data analytics at speed? Technology is cheaper than ever before Thanks to Moore’s law, technology is cheaper than ever before, so businesses of all sizes are now able to take advantage of the latest technological advances previously only enjoyed by large corporations with big budgets. However, it’s not just Moore’s law that is enabling businesses to get their teeth stuck into big data analytics, the cloud has opened the doors for many too, largely due to what appears to be a very attractive price model. Cloud: Opex vs Capex One of the industries that has benefitted the most from advantageous pricing structures is the big data industry. Despite relatively restrictive budgets, small businesses are now able to make the most of their money; the Cloud enables them to work within an Opex model instead of a Capex one. These smaller companies are more flexible and are able to adapt quicker to new technology, allowing them to integrate it within their business more easily than it is perhaps for larger enterprises. [easy-tweet tweet="Despite relatively restrictive budgets, small businesses are now able to make the most of their money" hashtags="cloud, tech"] The Cloud means that businesses no longer need in-house servers for their data, so they can save money on buying and running hardware and infrastructures that they might quickly outgrow. Smaller businesses also need a degree of scalability to remain flexible with their growth, which is something the Cloud also provides businesses with, as businesses only need to invest in the software and solutions that they will need. This is of great benefit to organisations that want to get started with data analytics fast but cannot invest upfront in analytic infrastructures. Cloud: The perfect home for big data analytics Business intelligence reporting and data analytics workloads are perfect for the Cloud; you can run reports when you need to and only pay for the computing power you use. Companies can start with their data analytics within short timescales, which in turn speeds up business intelligence and reporting projects.  All this with just a few clicks of the mouse and at a price that suits their operational expenditure budgets. Customer-focussed The flexibility and scalability of the Cloud means that it has never been easier for companies to get involved with big data and data analytics as everything can be hosted in the Cloud at a cost that is transparent. By taking advantage of a database-as-a-Service offering, companies of all sizes can make big data the focus of their business plan, and the benefits of this are vast. For instance, the opportunity to analyse data sets to understand and act upon customer behaviour can be crucial for a business getting its customer service right. Having a fast analytic solution allows smaller businesses to react to any event, resolve any problems and respond to queries as soon as they are reported.  This improves the competitiveness of the brand, for if a business takes too long to respond to a situation, customers will just find a competitor who will give them what they want and give it to them more quickly. 
[easy-tweet tweet="Having a fast analytic solution allows smaller businesses to react to any event" hashtags="tech, cloud"] This is crucial for smaller brands, as it means that they are up-to-date with current technology and those who do not continue to adapt to it risk losing out on future business. Cloud as the new route to market In addition to the benefits gained by businesses that are able to access fast analytics, software vendors are also gaining an advantage. The big public cloud providers like Amazon Web Services and Microsoft Azure are creating a new route to market via their online marketplaces.  Using these public clouds, businesses can get up and running using software for a myriad of jobs, including analytic databases. In the same sense that Apple has the app store for consumers, AWS and Azure have marketplaces for business, and companies are increasingly moving their infrastructures into the cloud. Digital democratisation What was once financially impossible for startups to achieve is now possible. Smart and innovative ideas are now capable of becoming reality, and the Cloud has provided businesses with the opportunity to start to compete and challenge the industry giants. This new competition can only be good for the industry, as it encourages more research and development and will push advancements in technology at a quicker rate as companies strive to improve and upgrade in order to stay ahead of their rivals. The Cloud is democratising Big data and all in the business community can benefit. It will be interesting to see what new data-driven businesses are created as a result of the powerful combination of real-time data analytics in the Cloud. ### Retail Analytics: The Third Industrial Revolution The first industrial revolution began just over 250 years ago. Another followed just under 150 years ago, and now we are in the midst of a third - rather than using steam power or electricity, this revolution is now being driven by big data and data analytics. It is clear that data is the new currency. [easy-tweet tweet="It is clear that data is the new currency." hashtags="data, tech, comms,"] Data analytics is essentially defined by its use, so, what is retail analytics? The answer is refreshingly simple: employing data analytics in a retail context. With this in mind, it can be shaped to fit pretty much any context. For example, service analytics, business analytics, digital marketing analytics, security analytics – this list goes on and on. Ultimately, the basic goals of data analytics and consequently retail analytics are to: Transform data to information, to knowledge and to wisdom Drive the creation of actions based on this resulting wisdom Anticipate what is likely to happen and prepare for it Influence what may happen to gain competitive advantage I have also heard this described by Splunk as ‘making data accessible, usable and valuable to everyone’. What sets data analytics apart from traditional business intelligence is that the focus is on real time insight, allowing today’s decisions to be based on today’s data. The art of the possible in terms of queries do not need to be specified ahead of time. Once you have the data, you can ask whatever you like, however you like. 
[easy-tweet tweet="What sets data analytics apart from traditional business intelligence is that the focus is on real time insight" hashtags="cloud, tech"] For example, one of the most critical decisions an online retailer can make is when to put up a holding or busy page on their website to protect it from being overwhelmed by sheer load from visitor traffic. This decision has profound implications for key success factors such as customer experience, ability to trade and brand credibility – we have all seen the newspaper headlines around ‘Black Friday’ trading. Using data analytics for real time insight enables retailers to predict these trends and make well-informed decisions ahead of time, often saving the business from potential trading disasters. As a general rule, the quicker you can put enlightening information at the fingertips of decision makers, based on what has happened, the more effective the decisions they make can be. This is especially true if decisions need to be made in real time and if there is an appetite to automate decision making and instigating process and workflows based on those decisions. Overall, optimised application of well-formed, outcome-driven data analytics can make the difference between glorious peak trading and painful peak profile. [easy-tweet tweet="Using data analytics for real time insight enables retailers to predict trends" hashtags="cloud, tech"] However, it’s not just in predicting trading patterns where retail analytics can add value, there are a number of emerging trends that can be identified in every day retail scenarios. Firstly real-time offers i.e. creating targeted offers that can be received at a kiosk or on a receipt as a result of the day’s shopping. Video analytics are also becoming commonplace in order to gather information on the flow of shoppers in-store, measuring how shoppers observe product placement and to gain insight on how best how to lay out displays. Finally, sentiment analysis is becoming a huge tool, examining the language and extracting that data from blogs, social networks, reviews etc to gauge customer feeling towards a product or service. It may take a little longer for retailers to transform themselves into precision analytical machines and the initial investment may inhibit some retailers from exploring their analytic capabilities. However, some of these investments in analytics can generate income quickly, improve productivity and even lower costs. Not only that, the ability to predict buying trends, customer preferences and trading patterns helps to safeguard business against future disruption. The overall trends are clear: retail is a data-intensive industry, and taking advantage of all that data to operate and manage the business better requires analytics. Most retailers have only scratched the surface of what is possible, and now it’s up to decision makers and business owners alike to fully realise and embrace the potential of this third industrial revolution. ### Hybrid cloud: The future of infrastructure While a full public cloud infrastructure has worked well for some pure-play digital companies such as Netflix, most enterprises are now seeing that despite the benefits of cloud, not all workloads can or should move to the cloud. The hybrid cloud model is here to stay for the foreseeable future. [easy-tweet tweet="The hybrid cloud model is here to stay for the foreseeable future." hashtags="cloud, hybrid, tech"] Why hybrid? 
We still see software licenses that are not cloud-friendly, and compliance and governance restrictions in some sectors and/or geographies continue to preclude 100 percent cloud infrastructure. Equally, there are many legacy applications that have performance or availability characteristics that just aren’t well-suited to public cloud. For these reasons and more, most enterprises continue to retain some on-premises infrastructure, and instead look to service providers to help them enable hybrid cloud hosting, rather than making a wholesale switch. Using cloud hosting, even only in part, has advantages that are well understood. The care and feeding of infrastructure, or even operating systems, is not normally one of a business’s core functions. Organisations of all kinds stand to benefit from the economies of scale, safety, security and access to expertise that managed cloud and other hosting services offer. In a hybrid scenario, enterprises must figure out how to get the most out of these advantages alongside the necessity of retaining on-site infrastructure.

Building the optimum hybrid environment

Recent 'MarketsandMarkets' research forecasts that the hybrid cloud market will grow at a rate of 22.5 per cent per annum until 2021. The question among prospective adopters is shifting from “why hybrid?” to “how hybrid?”. Once this is the question, the key practical challenge for a business adopting this hybrid approach becomes setting up the right IT management structure with your cloud services provider to get the most out of your hybrid environment. For any given enterprise, at the beginning of your journey towards a hybrid environment, the tools in use by the in-house teams might be quite different from a hybrid cloud hosting provider’s tools and technologies. As you look to partner with a cloud service provider, the first thing to do is to audit your existing environment. Begin by making a list of the tools and processes your organisation currently uses for tasks such as OS monitoring, application monitoring, patching, antivirus, authentication and auditing. Essentially, build an inventory of everything you currently use to manage your application portfolio, including the underlying infrastructure. These tools will need to be evaluated against those that a provider is able to offer, and the decision needs to be made as to whether the cost savings that come with using a service provider’s tools outweigh the benefits of customisation and familiarity that you have with your own tools. Often the tools used in the enterprise were selected because they provide a particular value or a particular customisation that is important for the existing systems. On the other hand, the tools chosen by service providers are often chosen because they operate efficiently at scale and provide the ability to implement a standard offering. In these instances, teams must work together to evaluate the business need for customisation versus the ability to work with, or within, the tools provided by the service provider, and ensure that they can work towards an efficient delivery model at a compelling price point. That said, in some instances where, for example, in-house teams have spent years fine-tuning their monitoring environment to provide the right information in the right way, it may not be the correct move to change to the provider’s solution. Similarly, an enterprise might already have credential management processes in place for its own two-factor authentication.
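As a starting point for the audit described above, the inventory can be as simple as a structured list that is later compared against the provider's catalogue. Below is a minimal sketch; the tool names, categories and decisions are purely illustrative, not recommendations.

```python
# Illustrative tooling inventory for a hybrid cloud audit.
# Tool names and decisions are examples only - substitute your own estate.
inventory = [
    {"category": "OS monitoring",          "tool": "Nagios",       "customised": True,  "decision": "keep in-house"},
    {"category": "Application monitoring", "tool": "AppDynamics",  "customised": False, "decision": "evaluate provider tool"},
    {"category": "Patching",               "tool": "WSUS",         "customised": False, "decision": "adopt provider tool"},
    {"category": "Antivirus",              "tool": "Corporate AV", "customised": False, "decision": "adopt provider tool"},
    {"category": "Authentication",         "tool": "RSA SecurID",  "customised": True,  "decision": "extend into hybrid cloud"},
]

# Flag the areas where heavy customisation argues for keeping or extending the
# existing tool rather than switching to the provider's standard offering.
for item in inventory:
    if item["customised"]:
        print(f"{item['category']}: {item['tool']} -> {item['decision']}")
```

Even a simple list like this makes the later conversation with a provider easier, because the trade-off between standardisation and customisation can be argued line by line rather than in the abstract.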
In cases like these, it may make sense to extend the people, tools and processes already in place into the hybrid cloud environment as well. For each of your tools, you will need to ask not only whether it can be extended into the hybrid cloud, but also whether this is the best approach. One key consideration is to understand the level of customisation that both exists and is ultimately required across your portfolio of enabling tools and technologies. A skilled hosting service provider can accommodate almost any combination of the management tools that an enterprise needs to operate in ways that are both customised to the business’s needs and cost-efficient.

Cloud services: Finding the right provider

Deciding on a service provider is not only a question of the tools they can offer, but also how they will work with you to manage your hybrid infrastructure. Compared with other cloud models, hybrid cloud provides fewer natural delineations and prescribed areas of ownership between in-house teams and third-party providers. You should therefore look for evidence of an appropriate level of service rigour, SLAs, and capacity for customisation and integration between your environment and the provider’s environment. A desirable service provider should also be able to demonstrate a long history and provide references regarding their OS and application management capabilities. This is not because they will necessarily be managing your applications, but because a provider with a history of managing applications, all the way up through the application stack, will have highly optimised and finely tuned OS management practices. After all, infrastructure doesn’t exist to be infrastructure: it exists to run applications. A hybrid environment also requires connectivity between providers and in-house infrastructures. The power of hybrid cloud is based on the ability to extend the network and thereby provide a seamless experience between assets in the cloud and those that are on-premises, enabled by a high-speed network. A key consideration is therefore finding a provider that can offer this high-speed connectivity on the same networks that are connecting your enterprise data centre. In some cases, and notably if you are highly virtualised on-premises, finding a provider with a compatible hypervisor can also be important. [easy-tweet tweet="Finding a provider with a compatible hypervisor can be important." hashtags="cloud, tech"] The hypervisor can be a factor that streamlines the hybrid experience. If, for example, you’re running VMware on-premises and you partner with a VMware-based service provider, the integration and compatibility of moving workloads between the on-premises environment and the managed cloud can be dramatically simplified. Integration between the on-premises cloud and the public cloud through APIs can often allow for a level of automation that’s necessary to ensure a seamless and scalable experience across the different assets. So, depending on the size and sophistication of your own situation, it may be advantageous to find a provider that offers a cloud platform that surfaces APIs for integration. In this day and age, most will.

The evolution of hybrid infrastructure

Cloud computing has evolved and matured rapidly over the past few years, which has enabled the development of increasingly sophisticated and customisable hybrid models.
Hybrid tools and enabling technologies such as the network now allow for discrete management options and seamless integration with on-premises infrastructure, leading to increasing opportunity for enterprises that take advantage of these offerings. [easy-tweet tweet="Hybrid tools and technologies such as the network allow for discrete management options" hashtags="cloud, tech"] Hybrid infrastructure management is all about choosing the combination of management options that will enable flexibility and appropriate customisation, while also reaping the benefits of hosted services. Businesses that learn to manage their hybrid infrastructure effectively stand to benefit substantially, both financially and technologically.

### Digital Transformational Returns - #CloudTalks with T-Impact’s Keith Stagner

In this episode of #CloudTalks, Compare the Cloud speaks with Keith Stagner from T-Impact about embracing the digital revolution and digital transformational returns.

#CLOUDTALKS

T-Impact designs, automates and improves business processes and workflows. 92% of the business is referral-driven. That means survival depends on delivering exceptional results for customers, every single time.

### Forget the Wristband, the Future is on the Body

How Smart Garments are Transforming the Wearable Tech Market

Wearable technology has taken its time. Limited by price, bulkiness and low functionality, mainstream adoption of Wearables has been stubbornly slow, and although recent developments in design and technology have triggered a rise in sales of health trackers like Fitbit, the uptake of other devices remains sluggish. To integrate technology into the physical space of a user, it needs to feel natural, and this is where smart fabric technology is changing the future of Wearables. [easy-tweet tweet="Smart fabric technology is changing the future of Wearables" hashtags="IoT, Tech, Smart"] By weaving a device into the fabric of a user’s clothes, it becomes an integral part of their normal activity - enhancing their experience rather than hindering it. The technology is invisible, lightweight and can be spread across the body, gathering information, sending signals and responding to changes. From a tennis shirt which monitors heart rate, breathing and stress, to Ping, a hooded jacket that allows the wearer to stay connected to social media through a series of gestures or patterns, the potential uses for smart garments are endless. To take a purely aesthetic approach, fabric with electronic capabilities can be made to light up, change colour, and display real-time visuals. These features have the ability to transform garments completely, and the fashion and advertising industries have been quick to pick up on them. In 2012, EE was one of the first brands to use smart garments as part of a marketing campaign, commissioning designers CuteCircuit to make a Twitter dress to mark the UK launch of the company’s 4G mobile network. The dress was worn by Nicole Scherzinger and displayed live tweets from her fans whilst she was on the red carpet. This was just the start, and since then we have seen many major brands forming partnerships with tech companies to make smart fashion a reality, including a ‘Polo Tech Shirt’ from Ralph Lauren and Google’s partnership with Levi’s to create a pair of ‘smart jeans’ that interact with smartphones and tablets by swiping the fabric. Between conductive fabrics and sensor-clad smart garments, Wearables will soon intertwine so closely with fashion we won’t be able to tell them apart.
[easy-tweet tweet="Wearables will soon intertwine so closely with fashion we won’t be able to tell them apart." hashtags="tech, wearables"] Far from the glamour of the red carpet however, smart garments have been saving lives. The U.S. Military was one of the first organisations to recognise the practical, performance enhancing potential of smart fabric, and for the last 15 years has been developing tech-integrated clothing to help soldiers in the field. Alongside their initial functionality, one of the biggest benefits for the military of integrating their devices into smart garments is the size and versatility that comes from the fabric. Soldiers typically carry around 50kg of equipment but by transferring some of this to Wearables, they are not only easier to carry, but also significantly lighter. Soldiers are equipped with smart garments such a temperature-regulating gloves, clothing embedded with GPS trackers, hands free communication devices and a vest sensitive to infrared light (invisible to the human eye) which will alert a soldier if a sniper gun is being aimed at them. Military research into smart garments is on going, with new items being tested and developed constantly, and it is the tech behind these garments that will ultimately filter down into mainstream production. Although it may be a while before we see smart garments on the high street, it is clear that they are already succeeding in specialised markets such as fashion and military wear. The benefits are also being recognised by health professionals and athletes with huge speculation on how smart garment technology could be put into widespread use across these industries. As application and usage of smart fabrics grows in these markets, it is inevitable the technology will become more readily available to the mainstream buyer, and a future in which regular device-users can tailor their wardrobe to their technical needs seems more and more imminent. ### Visibility in Hybrid IT - #CloudTalks with SolarWinds' Christoph Pfister In this episode of #CloudTalks, Compare the Cloud speaks with Christoph Pfister from SolarWinds about Visibility in Hybrid IT and playing a part in IoT. #cloudtalks At SolarWinds, we are fanatical about putting our users first in everything we do. We strive every day to deliver powerful functionality that is easy to use with one of the fastest and longest lasting ROIs in the market. Our approach is to deliver "unexpected simplicity" and redefine the expectations IT Pros have for enterprise software. ### How to deliver the best returns on your automated marketing software Automated marketing software that communicates with customers in real time as they interact with your brand will generate an enormous return on investment – but only if used correctly. Nine times out of ten, brands are using marketing automation technology to lead, rather than support, the customer experience, and should switch their focus to succeed. [easy-tweet tweet="Automated marketing software that talks with customers in real-time turns a big return on investment" hashtags="tech, comms"] Health check your data You might think that your database with hundreds of thousands of contacts is impressive, but unless you are actively engaging in a two-way conversation with each person, it’s just data. If you don’t understand the true value of that data, you won’t be able to use it to communicate effectively – or profitably. 
It’s in your best interests to regularly review the quality of your customer information and remove any inactive, irrelevant or duplicate contacts. Take time to establish the accuracy of the data you collect at source. Brands that dump every email sign-up from their website, or worse, buy random data lists and simply integrate them into their existing marketing campaigns without checking quality first are going to damage the personalisation and relevancy of their automated marketing communications – potentially beyond repair. The most valuable customer information is that which is earned from your real target audience – as opposed to bought or collected at random – which means you need to offer something in exchange for it. For example, a supermarket that promises to send vegetarians only meat-free special offers will attract the personal customer data they’re looking for with minimal effort. [easy-tweet tweet="You won’t be able to use it to communicate effectively if you don't know the value of data" hashtags="cloud, tech, comms"] Reach the right customers Once your data is clean and clearly segmented, you can use it to engage with your customers and prospects. Direct your marketing budget at nurturing these contacts with strategic content campaigns that speak directly to their needs and offer them something useful. Make sure that every piece of content you send is personalised and delivered to each individual via their preferred touchpoint. Focus on relevancy –  sending emails to your loyal customers in London about your summer sales in Edinburgh is not going to do your brand any favours, especially if they’ve also asked to receive the information via SMS. With automated marketing technology on your side, there really is no excuse for getting this wrong. If you do abuse automation in an attempt to reach as many customers as possible – regardless of relevancy – not even the most advanced systems will be able to help you win back lost customer loyalty. Predict the future There is no plug and play automated marketing software that guarantees positive results from the outset. Every customer engages with your brand differently. Some click on an email, others pick up the phone and there are those who respond to discounts with an immediate purchase. You could take advantage of insight services such as propensity modelling which allows you to segment your customers according to which actions they’re most likely to take, and therefore make more informed decisions on who to target. This is a great option particularly when you don’t have huge amounts of historical behavioural or transactional data on individuals; you can map similarities of your base to your best customers and test from there. [easy-tweet tweet="Segment your customers by which actions they’re most likely to take with propensity modelling" hashtags="comms, tech"] Review the results Use your automated marketing software to help you continuously track metrics such as conversions, and adjust your campaigns in response to the results. Customer loyalty, and how it translates into tangible transactions, is not as easy to measure, but tools like Net Promoter Score (NPS) can help you better understand how different types of customers view your brand. This in turn will prompt you to focus on strengthening ties with ‘promoters’ (those likely to promote your brand to others) and ‘passives’ (those who are satisfied, but unlikely to promote) instead of trying to win over detractors (those likely to spread a negative brand message). 
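As a rough sketch of how NPS is typically calculated (respondents scoring 9 or 10 count as promoters, 7 or 8 as passives, and 0 to 6 as detractors; the score is the percentage of promoters minus the percentage of detractors), here is a minimal example with invented survey responses:

```python
def net_promoter_score(scores):
    """NPS = % promoters (scores 9-10) minus % detractors (scores 0-6)."""
    promoters  = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100.0 * (promoters - detractors) / len(scores)

survey_scores = [10, 9, 8, 7, 6, 9, 10, 3, 8, 9]   # illustrative responses
print(f"NPS: {net_promoter_score(survey_scores):.0f}")   # prints "NPS: 30"
```

Tracking how this number moves over time, and for which customer segments, is usually more informative than any single reading.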
Your satisfied customers are ultimately your most valuable asset, and you need to keep them happy, as their repeat business is worth far more to your company’s bottom line than a customer who has only engaged with your brand once. The irony is that many marketing budgets are poured into acquisition campaigns with little return, while retention campaigns are often less expensive to run and yield the most profitable results.

### The Globalisation of FinTech

September 22 2016, Veber, London

Veber is hosting an exclusive dining event, bringing together leading minds in the FinTech startup space. This is an opportunity for C-Level FinTech experts to join their peers in an evening of drinks, good food and discussion around the future of FinTech and changes in the traditional banking world.

### Capitalising on the Cloud: Top tips for IT Leaders

The cloud is no longer the new kid on the block in the world of the CIO. It has become part of the “business-as-usual” infrastructure for most organisations. In its simplest form, it allows companies to procure technology as a service, including infrastructure, applications, platforms and business processes, by connecting to external providers. With IT no longer linked to the funding needed for ownership, companies can source, scale and deliver capacity unbound by physical location, labour or capital. Cloud is now key in helping companies embark on a digital transformation; in fact, analyst firm Gartner predicts that the aggregate total of what it calls “cloud shift” spending will hit $111 billion this year alone, almost doubling to $216 billion by the end of the decade. [easy-tweet tweet="Cloud is now key in helping companies embark on a digital transformation" hashtags="cloud, tech"] This is particularly relevant at a time when IT leaders and CIOs openly admit to struggling with the pressure of meeting their organisation’s expectations around digital transformation. New research from CloudTalent commissioned by Loudhouse Research shows that although 90 percent of UK businesses are engaging in digital transformation projects, nine out of ten IT leaders are finding it hard to cope with the subsequent changes to their operating model, and greater levels of pressure are weighing heavily on their IT teams as they try to bring these projects to life. [easy-tweet tweet="90 percent of UK businesses are engaging in digital transformation projects" hashtags="cloud, tech, stats"] The fact of the matter is that making businesses fit for the modern age is not as simple as just “moving everything to the cloud” - the reality is less clear-cut. While many organisations have mastered aspects such as remote monitoring and management of IT assets, they struggle with how to leverage cloud-based services to their full advantage. This is mainly down to friction within organisations, which tend to forgo some of the softer and more subjective areas of change that may not necessarily be obvious. It is clear that key disciplines such as demand management, financial control, and organisational construct and make-up are huge contributors to an organisation’s ability to consume cloud services; many are overlooked, and as such adoption can be negatively impacted. A key issue for IT leaders looking to the cloud is the wide range of options available and choosing which is suitable for their business, from “on-premise” to “public” and “private.” Logicworks recently presented a strong case for the public cloud, emphasising its cost and technical benefits.
This is in keeping with public cloud vendor Microsoft's Azure cloud which has an aggressive roadmap, scale, and a combination of stand-alone cloud services and the ability to interoperate with on-premises systems. However, in a similar post, Scalr shows the elasticity of the private cloud for modern businesses. These debates are ongoing, but the consensus is the hybrid model utilising on premise, public and private cloud is the best option. For digital transformation, CIOs and IT leaders must now look to leverage these various clouds to support their business. This can be done by: Take a multi-geared approach for your specific issues Despite embracing multiple cloud options on the way to digital transformation, many companies still need in-house capability and the IT teams to manage them. CIOs must take the necessary steps to guide their business toward new technologies, while still fulfilling traditional IT provision, implementation and maintenance roles. To manage this complex mix of applications, technology, platforms and services – IT leaders should take a “multi-geared” approach to delivering the unique set of IT solutions their organisation needs for their current situation. Role assessment If a business lacks the expertise, processes or tools to utilise the cloud, partners can be brought in to help. This can place IT Leaders in an aggregation role, managing multiple relationships not only with traditional vendors, but also with cloud providers and distributors. Indeed, according to CloudTalent’s research, this can already be seen in the 94 percent of companies who currently use external consultants and specialists to help relieve digital transformation pressure and deliver projects. Be holistic Companies need to look at how the cloud will benefit all departments and areas, not just the IT department. With new cloud options and outside relationships comes the need to adjust business practices. By examining processes, from finance to sales and customer service, they can determine what should be tweaked – or completely rearranged – in order to improve operations and, as a result, better adapt. [easy-tweet tweet="With new cloud options and outside relationships comes the need to adjust business practices." hashtags="cloud, tech"] For companies undergoing digital transformation, the cloud will not just redefine the delivery of application and infrastructure services, but also unlock scalability, flexibility, security and control along with continued lower infrastructure costs. There is pressure on IT teams to deliver, but doing it right means turning companies into agile and intelligent digital businesses, better equipped to navigate the digital age. ### A How-To Guide to Crowdtesting 10 critical factors to successfully perform remote testing with the crowd As the world of technology continues to integrate with every aspect of day-to-day life, it's imperative that comprehensive and extensive testing of software and devices takes place. However, due to resource and time restrictions, organisational blindness, platform fragmentation among many other issues, testing often becomes an afterthought to the development process. Crowdsourced testing (crowdtesting), an original approach to testing that utilises a crowd that represents a digital product’s target audience to perform testing is able to address many of these issues. Today, crowdtesting has become an established solution as a complementary service to traditional testing methods in virtually all industries. 
[easy-tweet tweet="Crowdtesting has become a complementary service to traditional testing methods" hashtags="cloud, tech"] Its perks vary from being able to produce quick results due and around the clock availability of testers (thanks to crowdsourcing), to the availability of virtually every device and operating system imaginable. Compared to laboratory testing, crowdtesting tends to be more cost efficient and thanks to a diverse, global crowd; websites, apps and all manner of technology can function and be relevant on an international basis. However, to benefit from its many advantages, the characteristics of the method need to be taken into consideration. In many ways, crowdtesting differs from traditional methods when it comes to checking functionality and usability of digital products. The following 10 factors are a must to successfully perform crowdtesting and integrate it into the development process including practical recommendations from Philipp Benkler, Managing Director of Testbirds a crowdtesting service provider: 1. Employer Side Responsibility Testing is an essential part of the development lifecycle and crowdtesting can be applied to both waterfall and agile testing methods. Therefore, it’s a must that the method is embedded into the overall strategy in a way that works on an individual basis. When working with a crowd, it’s vital that tasks are clearly outlined and crowdsourced workers have a strong grasp on the responsibilities that are expected of them. This is especially true when it comes to crowdtesting due to the fact that the crowd is often located internationally and communicated with remotely. Similarly, it’s important that the entire process is well organised and thought through in detail before testing begins. Who oversees the testers and who creates the test design? Is it the service provider or the company itself? Who implements the results of the testing process and responds to questions in case the software doesn’t work after fixes are applied? Solution: Including crowdtesting into early planning allows it to become a fixed part of the development process. If responsibilities are clearly defined early on and the processes are well-established, service providers, clients and the crowd will all be able to have an efficient and successful testing experience. [easy-tweet tweet="Prior to launch, a crowd test can also help customers use your app." hashtags="tech, cloud"] Personal Recommendation: “Most of the time you only have one shot at success, especially if it’s an online shop or gaming app. There are so many alternatives available on the market that a customer can choose instead of your product. Therefore, involving the crowd, YOUR crowd or target group as soon as possible in order to get feedback to be able to deliver what they want is extremely important. Prior to launch, a crowd test featuring a number of different devices to test functionality can also help customers use your app instead of deleting it because it doesn’t work as expected.” 2. Realistic Planning and the Right Testing Method When developing technology today, time pressures are extremely high as companies attempt to deliver their digital products to the market as quickly as possible. Due to delays or a tough schedule, there’s often not enough time for testing. 
Even though the crowd works quickly and is extremely flexible, it’s still important that the testing phase is planned realistically and the right testing method is chosen based on the individual requirements of the digital product in question. It’s equally important that time is set aside for fixing issues found during testing. It doesn’t make sense to find problems if there is no time to fix them.

Solution: Depending on the digital product in question, it could make sense to test single components or functions before the product is ready. For example, using the crowd to test prototypes or mock-ups allows problems to be fixed before they arise. This in turn allows the development process to stick to tough time schedules despite the inclusion of extensive testing.

Practical Recommendation: "We often get calls from people who want to test over the weekend and go live the Monday after. This is completely manageable using our crowd. But how do clients expect to fix the bugs in such a short time period? We always recommend calculating a testing phase of up to two weeks and an extra two weeks for fixing the problems that are found."

3. Careful Preparation

Planning is an absolute must no matter what testing method is used. With crowdsourced testing it’s especially important, as testers work remotely, meaning that they are unable to receive feedback immediately. Before testing, it’s important that the goal is kept in mind, questionnaires with detailed instructions on what to do are designed, and the steps that will guide testers through the process are well defined. It’s important that the people designing the test try to visualise the problems that might arise or aspects that might be unclear to testers when setting them up. Asking the right questions means getting the right results.

Solution: Testing in a laboratory environment and testing remotely are two different things; therefore, the requirements of those conducting the tests are also different. Due to the specific needs of remote testing, it is a must to work with people who know how to set up clear instructions and have experience of how testers will act on and carry them out.

Practical Recommendation: “Being well prepared is an absolute must, especially for tests where you need to gather feedback from a tester. Questions need to be defined carefully as we want to refrain from receiving the same feedback for two different questions. Take your time when defining them. At Testbirds, a project manager is always there to help and advise you with expert input.”

4. Best Possible Conditions

For testers to complete their tasks quickly and efficiently, it is a must that a high-functioning, user-friendly online platform and infrastructure is available for testing. It’s equally important to keep in mind the interests of all stakeholders involved. Besides flexibility when it comes to different testing methods, the platform should also feature helpful aspects such as bug exporting tools, tracking tools, and direct communication channels between testers and the managers of the test. It’s important that these factors are taken into consideration when choosing a service provider.

Solution: Continuous servicing and development are essential aspects of a productive and successful infrastructure. Regularly gathering feedback from stakeholders grants a better understanding of what is needed or missing. The rule of thumb here is to never stop improving.

Practical Recommendation: “Our testing platform is exclusively developed and maintained by Testbirds.
This means that whenever there is something wrong we can immediately react and try to fix it. For example, if a tester reports that the button to submit a bug doesn’t work on their device, we can instantly fix it. Similarly, every client has different needs. In order to create the best possible conditions for them and their team on our platform, we try to gather feedback for improvements after every test run.”

5. Reliable Testing Environment

Testers need easy and reliable access to test, no matter the kind of digital product that is being tested (beta apps, live online shops, connected devices, etc.). Applications have to be stable enough to be tested. It’s difficult to test when updates are running, as this affects the process and leads to unreliable results. Furthermore, it’s important to document which version has been tested and on which one bugs have occurred. For unreleased software, security issues have to be considered due to confidentiality; therefore a safe method of access needs to be provided. [easy-tweet tweet="Testers need easy and reliable access, no matter the kind of digital product being tested" hashtags="cloud, tech"]

Solution: A VPN is a must for secure and easy access to the testing or staging environment. For fast beta app distribution, cloud-based tools are often a great solution.

Practical Recommendation: “Whenever we test products that have not yet gone live, we have clients who fear the app will be distributed to all 150,000 members of our crowd. In reality, though, only testers who are specifically chosen have access to the product. We also offer our own app distribution tool, BirdFlight, where we offer full control over who has access to a beta app. After a test is completed, we remove it internally from our servers hosted in Germany so that no one can access it again.”

6. Entry Tests and Training

In the lab, test managers can help testers who encounter problems. Remotely, however, testers have to partly take over the tasks of the managers themselves. In essence, they have more responsibility. For this reason, it’s vital to train people appropriately and make them aware of requirements. It’s equally important to show them how best to use the platform. Entry tests can ensure that testers know how to handle their devices and the testing platform, which in turn can yield results of a higher quality.

Solution: Tutorials and videos that show how the platform works and how to report bugs can ensure that testers are well prepared to handle the responsibilities expected of them.

Practical Recommendation: “All testers who sign up to our platform have to complete an entry test before they can participate in real tests. We do that so that our testers have a chance to become familiar with our platform and the way in which we conduct tests. This way, they can later concentrate on the actual test instead of trying to figure out how our platform works. In addition, we offer training in our “Bird School” where testers can learn more about, for example, functionality testing or why good usability is important, among many other topics.”

7. Choose the right testers

The crowd is a reflection of society. It can represent a number of age groups, jobs, education levels and family circumstances, among many more demographics. These are often normal people without extensive testing knowledge or expertise. In addition, what is required of them differs from project to project. To select the right people, you need as much information about them as possible.
For usability testing, people who represent the target group of the digital product being tested are a must, as in the end they will be the ones to use and review it. For functionality testing, this isn’t as big an issue; instead, it’s important that testers are experienced and have the necessary devices. [easy-tweet tweet="The crowd is a reflection of society. It can represent many demographics" hashtags="cloud, tech"]

Solution: Inexperienced testers search for bugs and usability issues in a different manner from testing experts. When hunting for bugs or figuring out shortcomings in user experience, it’s important to have a group of testers with varying degrees of experience.

Practical Recommendation: “Let’s say a client asks us to test their app with their target group consisting of women over 50 years old. While we can easily accomplish this, an important question to consider is whether it makes sense to include their target group if they are only looking for technical issues in the app. Sometimes, the target group of a digital product isn’t the type of person who knows how to find bugs. For these situations, we have special bug finders on our platform. They know where apps often have flaws and will try everything they can to break the app. This will get you the best results.”

8. How to Handle Further Enquiries

Test managers are unable to provide direct feedback when testing remotely. Therefore, it’s even more important that the feedback they do provide is perfect. It’s important to try to head off possible technical issues that testers may encounter with strong and clear instructions. When there are issues, it’s vital that test managers react quickly to get high-quality results. Checking the feedback from testers carefully and changing instructions based on misunderstandings is also a must.

Solution: Always provide testers with a personal contact and have their information readily available. Communicate the availability of this person in case testers need to ask something. A forum for communication between the testers themselves can also be helpful and allows them to fix issues on their own.

Practical Recommendation: “In every test there are some testers who have questions about the setup or an app that can’t be installed on a particular device, etc. At Testbirds, project managers are the testers’ point of contact. During the setup and briefing, they will receive a lot of information on the product and the internal processes so that they can answer almost any question that testers might have. Some tests also have a dedicated forum where testers can communicate and help each other. Finally, if necessary, the client can also communicate with them directly.”

9. Detailed Documentation and Quality Assurance

One of the critical aspects of successful remote testing is comprehensive documentation. Testers have to report the steps they take in a detailed and precise manner. Every step needs to be documented and supported with evidence such as screenshots. Reports have to be reviewed with regard to how complete they are, whether they are traceable, whether they are of a high quality and whether they are reproducible. Have the testers done what they were supposed to do? Did they test on the right devices and software? How important or critical is the bug in question?

Solution: Testers learn and develop with time and projects. Give feedback on what they should be aware of for the next project and what can be done better, and clearly explain situations that may arise, such as why a bug was not accepted.
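To make the documentation requirements above a little more tangible, here is a minimal sketch of the kind of structure a bug report might follow, together with a basic completeness check a reviewer could apply. The field names and values are purely illustrative and are not the format of any particular crowdtesting platform.

```python
# Illustrative bug report: every step documented, supported by evidence,
# with device details and an assessment of severity and reproducibility.
bug_report = {
    "title": "Checkout button unresponsive after applying voucher",
    "steps": [
        "Add any item to the basket",
        "Apply a voucher code at checkout",
        "Tap the 'Pay now' button",
    ],
    "expected": "Payment screen opens",
    "actual": "Button does nothing; no error is shown",
    "device": {"model": "Example Phone X", "os": "Android 6.0", "app_version": "2.3.1"},
    "evidence": ["screenshot_checkout.png"],
    "severity": "high",
    "reproducible": True,
}

def is_complete(report):
    """Basic completeness check a reviewer might apply before accepting a bug."""
    required = ["title", "steps", "expected", "actual", "device", "evidence"]
    return all(report.get(field) for field in required)

print("Accepted for review" if is_complete(bug_report) else "Returned to tester")
```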
Practical Recommendation: “All clients have their own way of documenting bugs in their internal systems. This is why we are working with a flexible tool that can be adapted to all kinds of bug documentation systems. Furthermore, all bugs run through an internal quality assurance process in order to make sure they are valid and well documented for clients. This way we can ensure that clients only receive relevant bugs and no duplicates that clog the system.”

10. Fair Payment and Incentives

When it comes to remote testing, testers are the key to success. Happy testers are good testers. It is therefore a must to provide adequate incentives and motivating tasks. Other than fair payment, a reward system can also be used to motivate people, based on, for example, the quality of the bugs or gamification of the testing process. Additional payment for bugs found when performing functionality testing can lead to an increase in creativity and engagement when hunting for bugs.

Solution: Experience points for participating in tests help to qualify testers and can also serve as motivation for the testers themselves. Rankings and awards for extraordinary achievements can also be used to further motivate the crowd.

Practical Recommendation: “A lot of testers test with us because they are interested in trying out new products. However, others also test with us to earn some money while testing new or soon-to-be-released products. It’s a win-win situation. Other than those aspects, Community Points are extremely popular with our testers, which they receive for performing tasks such as participating as backup testers or writing an excellent test report. This in turn allows them to climb the ranks, which has now become a popular competition on our platform.”

### 'Goodbye' Point Of Sale: 'Hello' Points of Interaction

We’re all well-versed in how to shop. Head to the store, browse, pick what you want, pay, take your item. Whatever you buy, wherever you buy, it largely follows this process. In fact, the till is probably the only real point of direct and personal interaction between retailer and consumer in many instances. [easy-tweet tweet="Tech isn't so much blurring the lines between physical and digital as overlaying one on the other." hashtags="tech, cloud"] However, this is changing. Technology isn’t so much blurring the lines between physical and digital as overlaying one on the other. Cloud, Open APIs and big data have combined to bridge the gap between online and in-store and to reshape retail. At the heart of this is the Point of Sale (POS), which is evolving into something much, much more exciting.

This way to pay. Or that way. Actually, any way.

The till has been an integral part of shopping; if you want something, you do have to pay for it, after all. It’s a single-purpose part of the store devoted to the transaction. Thanks to the cloud, though, the dedicated POS is giving way to Points of Interaction (POI). Instead of dedicated tills, an off-the-shelf tablet like you might use at home can do the same job. All the clever stuff, including security, can be done in the cloud. For the retailer, this means they are able to turn everyone on the shop floor into roaming sales staff rather than having dedicated members of staff manning the tills. In being able to take payment anywhere, staff can ‘queue bust’ and take payment more efficiently and in a more personal way. For the shopper, this gives them more freedom to pay quickly and conveniently.
[easy-tweet tweet="In being able to take payment anywhere, staff can ‘queue bust’." hashtags="tech, cloud"] One-time offer, just for you. No really. Payment has always been something of an island, separate to the rest of the experience. With Open APIs, different data streams can be integrated with one another in real time. A cloud-based POS can bring together payment, loyalty inventory and marketing to create an altogether richer, more personalised POI. Having all these functions working together helps create an omni-channel retail experience for the consumer. Payment card data can be used to tie a shopper’s online profile with the person in-store. This creates a more complete understanding of each consumer, filling in the gaps. This fuller understanding paves the way for hyper-personalised, relevant marketing. Not just via email or post – but in-store while you pay. Collecting data is one thing but being able to bring different sources together into once place turns it into something far greater than the sum of its parts. Real Omni-channel Omni-channel generates a lot of noise and a lot of column inches – for many retailers it’s where they see their market and their business headed. This means bringing together all channels and creating a consistent experience. Payment is a constant across every channel in retail and can unify a brand’s retail experience. Cloud, data and the Open APIs which bring these together are creating newer, richer and more personalised retail experiences. As a result, the traditional POS is evolving into something far more exciting; the point of payment will become part of a wider experience in line with the modern-day shopper’s expectations. ### New To VoIP? Evaluate Before You Invest Voice over IP (VoIP) systems are enjoying strong growth worldwide. According to Global News Wire, the VoIP services market should reach $140 billion in the next five years as companies look for ways to enhance their telephony without being tied to expensive, on-site hardware. For many businesses, the sharp rise in VoIP prompts a drive to adopt before their plain old telephone service (POTS) solution is completely outdated. The problem? Jumping in isn’t always the best solution; here are four tips to help newcomers to VoIP evaluate their options before investing. [easy-tweet tweet="VoIP prompts a drive to adopt before old telephone services are completely outdated." hashtags="cloud, comms"] On or Off? The first decision you need to make when considering a VoIP system is location: do you prefer to host the VoIP servers on site or off-premises? Both are viable options. According to Beta News, if you have no existing communications hardware, it’s often more cost-effective to choose hosted options, since setup time is minimal and there’s no new CapEx. Staying on-premises, meanwhile, gives you more granular control over security and access. When it comes to choosing a provider, research its strengths and pick one that matches your needs. For example, some VoIP partners have vetted a host of cloud providers to ensure top-quality service. Check Your Bandwidth Early-stage complaints about VoIP included sound “jitter,” lag and dropped calls. Cloud-based technologies have largely eliminated these issues, but you may still experience problems if your Internet connection doesn’t have enough bandwidth. Here, even the best provider won’t be able to meet expectations since too-small bandwidth pipelines will naturally limit call quality. 
Before investing in any VoIP solution, make sure you’ve got room to spare on your existing Internet connection — if not, consider adding a second, dedicated line. As noted by It Pro, it’s also important to make sure your router is up to the task of prioritising voice traffic and implementing quality of service (QoS) rules for VoIP data packets. [easy-tweet tweet="Cloud-based technologies have largely eliminated issues of VoIP 'jitter'." hashtags="VoIP, cloud, tech"] Consider Costs The cost of a hosted VoIP system is much like that of a cloud computing service — without any hardware purchasing or upgrades, you’re looking at pure OpEx spend. While it’s tempting to cut costs wherever possible to limit the long-term impact, when it comes to VoIP, you get what you pay for. In other words, the value of your VoIP solution is defined by its reliability, adaptability and usability; while you can find bargain-bin providers, the type of service they offer will always be commensurate with their price. Features and Futures Last but not least, consider what you need right now and what you may require down the line. This means reining in the impulse to spend big on every available VoIP feature and instead dial it back to include features that meet specific business needs. For example, if you’re in the middle of a business expansion, it’s ideal to look for providers that specialise in mobile device support and enhanced video chat to keep staff and stakeholders in the loop. It’s also important to consider the future; does your prospective VoIP provider leave you room to expand — by doubling or tripling your usage if necessary — and does it support the emerging infrastructure for unified communications-as-a-service (UCaaS)? If not, you may want to keep looking. New to VoIP? Understand your options, know your limits, find your price point and consider your future before picking a provider. ### Protecting a mobile workforce – lessons from France and Turkey The recent terrorist attack in Nice and civil unrest in Turkey demonstrate how important it is that organisations can communicate with travelling employees that may be in danger, but organisations need to strike a balance between employee protection and privacy invasion. How can cloud-based communication tools help organisations protect employees and minimise operational risk? [easy-tweet tweet="organisations need to strike a balance between employee protection and privacy invasion." hashtags="tech, cloud, comms"] In 2016 a new wave of terror attacks and unrest struck mainland Europe with attacks in France, Belgium and Germany coupled with severe military unrest in Turkey as factions of the country’s armed forces attempted a coup d'état to overthrow the government of President Recep Erdogan—leaving 312 people dead and more than 1,500 injured. Business today is global. In the event of an emergency or crisis situation, secure, effective and reliable two-way communication with employees is crucial. Modern workforces are mobile, so businesses of all sizes need to ensure that the bilateral lines of communication between management and employees remain open, whatever the situation. Power of multi-modal, cloud-based communication In an emergency it is vital that organisations are able to implement effective crisis management plans. A vital part of that process involves the use of multi-modal, cloud-based communications platforms to ensure that notifications are sent out quickly and reliably, ensuring staff are aware of any danger and can respond accordingly. 
No single delivery path is ever 100 percent reliable, 100 percent of the time. For this reason, emergency notifications must be multi-modal and utilise every available contact path to communicate with employees until delivery is acknowledged. The only way to guarantee crisis alerts and communications reach employees is to adopt a cloud-based, multi-modal solution that enables users to quickly and reliably send secure messages to all members of staff, individual employees and specific target groups, even when traditional routes are unavailable. [easy-tweet tweet="It is vital that organisations are able to implement effective crisis management plans." hashtags="cloud, tech"] These mass notifications are sent out through multiple contact paths, including SMS messaging, emails, VoIP calls, voice-to-text alerts, app notifications and many more. In fact, with cloud-based software, there are more than 100 different communication devices and contact paths that businesses can use to communicate and send secure messages to their workforce. This is a crucial area where cloud-based platforms have an advantage over other forms of crisis communication tools; unlike the call cascade systems and SMS blasters of the past, emergency notifications are not only sent out across all available channels and contact paths, but continue to be sent out until the recipient acknowledges them. Employees are then able to use the platform’s two-way polling feature to respond immediately, providing an organisation with clear visibility of an incident and crucial data that enables it to deploy resources effectively to ensure the safety and protection of its people. In previous crises, having the cloud-based software installed on employees’ smartphones and other Wi-Fi enabled devices has proved invaluable. During the terrorist attacks in Brussels in March 2016, the GSM network went offline, making standard mobile communication impossible. The citizens of the Belgian capital were unable to send messages to family, friends and work colleagues. The team at Brussels Airport made its public Wi-Fi discoverable and free of a network key, allowing anyone with a Wi-Fi enabled device to connect, send and receive messages. For crisis management and business continuity, the flexibility that multi-modal functionality provides is essential to ensuring that a high level of response is received quickly when emergency communications are sent. [easy-tweet tweet="A high level of response is received quickly when emergency communications are sent." hashtags="cloud, tech"]

Privacy v Protection

What the recent spate of terror attacks has demonstrated is that crisis communication plans are far more effective when supported by detailed location information. Easy access to accurate location data means a business can take a risk-based approach to targeting communications. For instance, if you had a sales force operating at various locations across a city when a series of attacks happened, how would you know exactly where your people are and which ones might be in danger? Some companies use GPS tracking dongles, but these are expensive and liable to be lost or stolen. It could also be argued that they are intrusive, as they cannot be turned off. Employees want to be safe in the event of an emergency, but they may also have concerns about their privacy. The answer is to capitalise on information that is already being collected and use it smartly to understand more about where an employee might be.
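To make the acknowledge-or-escalate pattern described above a little more concrete, here is a minimal sketch. The channel functions are placeholders rather than any real platform's API; a production system would call the provider's SMS, email, voice and push services, wait between attempts and record delivery status.

```python
# Placeholder delivery functions - a real platform would call its SMS, email,
# voice and push-notification services and report whether the recipient acknowledged.
def send_sms(person, msg):   print(f"SMS to {person}: {msg}");   return False  # no acknowledgement
def send_email(person, msg): print(f"Email to {person}: {msg}"); return False  # no acknowledgement
def send_push(person, msg):  print(f"Push to {person}: {msg}");  return True   # acknowledged

CHANNELS = [send_sms, send_email, send_push]

def notify_until_acknowledged(person, message):
    """Try every contact path in turn until the recipient acknowledges receipt."""
    for channel in CHANNELS:
        if channel(person, message):   # True means the recipient acknowledged
            return True
    return False                       # nobody acknowledged - escalate to the incident team

notify_until_acknowledged("employee@example.com",
                          "Avoid the city centre; reply SAFE when you are secure")
```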
Employees already provide information on where they are in various ways. For example, when employees log on to a company Wi-Fi network, it is safe to assume they are in a certain building, and when they enter that building, access pass information (which is being collected anyway) also provides insight into their location. By combining this data, the employer can understand where someone is located without needing to generate additional location information or cause concern amongst employees that they are being tracked. In this way, organisations can receive regular updates and alerts regarding their employees’ last known locations. In a crisis situation, the employer has a better idea of where people are, allowing Incident Management teams to co-ordinate a more effective response.

Staying Connected

For any organisation, the protection of its workforce in the event of an emergency is of paramount importance; infrastructure can be rebuilt over time, whereas people’s lives cannot. Organisations need to have crisis communication plans in place that work in real life, not just on paper. An essential part of this process includes utilising a scalable, flexible cloud-based solution to help businesses communicate with their employees quickly, reliably and securely in any situation, enabling them to protect staff members who find themselves at risk during an emergency. By learning the lessons of the past, organisations can be better prepared to handle the crises of the future.

### Video surveillance data: The IT challenge lurking behind our public safety

[embed]https://www.youtube.com/watch?v=_8r9oBR6Si0[/embed]

Video surveillance plays a vital role in keeping us safe. Away from the Big Brother caricature that is sometimes presented, the UK’s 4 million-plus CCTV cameras help keep our streets safe and our property secure. But managing video content on this scale brings its own challenges. [easy-tweet tweet="Managing video content on a large scale brings its own challenges." user="Pivot3Inc" hashtags="cloud, tech"] Video data requires a great deal more storage space than text or image-based content, and with the amount of CCTV footage being recorded daily, storage must be rapidly scalable. What’s more, CCTV video data may be sensitive for the individual being filmed, but of great interest to the public at large. Take, for example, video footage of a mugging. The individual victims may wish to remain anonymous, but the police and the wider public need the footage in order to prosecute the perpetrator. As a result, CCTV video data must have the highest levels of both security and reliability.

Storage

The fact that CCTV cameras require full-motion, high-definition video streams, sometimes running 24 hours a day, means that general-purpose IT storage is often not up to the task in terms of data quality and availability. In addition, camera systems are operated in a different way to many other IT applications, making the task of data storage somewhat different. With other IT tools, an end-user will quickly notify the operator if a service is performing below the expected standard, but CCTV cameras will continue to capture footage regardless of whether back-end IT infrastructure is keeping up. If it isn’t, you may find that a crucial piece of video data is not at the quality you need, or not stored at all. Pivot3’s hyperconverged vSTAC OS platform manages video data differently. This technology has been optimised to deliver the highest performance levels for CCTV workloads.
Currently operating across more than 1 million HD cameras in 53 countries, Pivot3 delivers enterprise-class storage that protects your surveillance platform from data loss and failures. vSTAC OS transforms traditional servers into purpose-built, highly fault-tolerant virtual shared storage arrays that support even the most demanding, data-intensive video applications. Security Of course, with something as serious as CCTV footage, security is paramount. If an external party interfered with surveillance data, it would be much more difficult for charges to stand up in court. Pivot3’s Virtual Security Server ensures businesses can access their video data remotely wherever and whenever they need to, while maintaining the level of access control that they need. [easy-tweet tweet="Pivot3’s Virtual Security Server ensures you can access video data remotely" user="Pivot3Inc" hashtags="cloud, tech, security"] As well as access control, organisations also require 100 percent confidence in their stored data. Whereas traditional SANs can lead to data loss, Pivot3 solutions maximise throughput to reduce dropped frames and image degradation. With Pivot3’s hyperconverged infrastructure solutions, video surveillance data has become easier to manage than ever before. Whether you work in the military, police or events management, video surveillance is a crucial part of keeping the public safe. However, in order to guarantee this safety, you need an IT solution that can deliver high definition video data securely and reliably 24 hours-a-day, seven days a week. ### How cloud computing makes automated supplier statement reconciliation accessible to all The key advantages of cloud-based software, namely access to sophisticated technical systems without the burden of upfront capital expenditure or the need for extensive and ongoing IT support, need no further explanation. Discussion has now moved on to how these increasingly flexible ways to use technology can benefit and add value to the organisation. [easy-tweet tweet="The key advantages of cloud-based software need no further explanation" hashtags="cloud, tech"] Across the enterprise, the use of technology frees up staff to carry out work that requires the finesse of human input. When it comes to finance, as noted in this article, cloud-based accounting platforms mean that software can do more of the ‘heavy lifting’ of tasks such as data entry and reconciliations, enabling accountants to provide more value-added advisory services. Within an organisation, accounts payable teams are also benefiting from cloud-based systems that can relieve them of the labour-intensive, repetitive tasks they have traditionally undertaken. In particular, they are released from the burden of supplier statement reconciliation. Financial governance Supplier statement reconciliation is a key element of financial governance and a compliance requirement for some companies to ensure supplier liabilities are accurate. It is an in-depth checking process that ensures all invoices/credits have been received and there are no erroneous postings on the vendor’s account or potential duplicate payments waiting to happen. However, while it is potentially business-critical, undertaking the reconciliation process manually is arduous and time-consuming. A statement from a single supplier can contain hundreds or even thousands of invoices and take hours or even days to manually reconcile. Hardly surprising then that it’s an unpopular job. 
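In essence, the checking process described above is a matching exercise: every line on the supplier’s statement is compared against the accounts payable ledger, and anything missing, duplicated or mismatched becomes an exception to investigate. The sketch below is a minimal, hypothetical illustration of that logic in Python; the simplified records (just an invoice number and an amount) and the tolerance threshold are assumptions for illustration, not a description of any particular product.

```python
# Minimal sketch of statement-vs-ledger matching (hypothetical data model:
# each record is just an invoice number and an amount). A real system would
# also match on dates, currencies, credit notes and partial payments.

def reconcile(statement_lines, ledger_entries):
    """Compare a supplier statement against the accounts payable ledger."""
    ledger = {}
    for inv_no, amount in ledger_entries:
        # Keep every posting so duplicates in the ledger remain visible.
        ledger.setdefault(inv_no, []).append(amount)

    exceptions = []
    for inv_no, amount in statement_lines:
        postings = ledger.get(inv_no, [])
        if not postings:
            exceptions.append((inv_no, "missing from ledger"))
        elif len(postings) > 1:
            exceptions.append((inv_no, "possible duplicate posting"))
        elif abs(postings[0] - amount) > 0.005:
            exceptions.append((inv_no, f"amount mismatch: {postings[0]} vs {amount}"))
    return exceptions

statement = [("INV-1001", 250.00), ("INV-1002", 99.50), ("INV-1003", 40.00)]
ledger = [("INV-1001", 250.00), ("INV-1002", 99.50), ("INV-1002", 99.50)]

for invoice, reason in reconcile(statement, ledger):
    print(invoice, "->", reason)  # only the exceptions are left for the AP team
```

Done by hand across hundreds of statement lines, every one of these comparisons has to be made by eye; automating them is what turns the job from days of checking into a short list of exceptions.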
[easy-tweet tweet="Supplier statement reconciliation is a key element of financial governance" hashtags="cloud, fintech"] Not only that but, because managing the day-to-day task of processing invoices through to payment is already a full workload for the accounts payable team, only a select number of supplier accounts can be reconciled. Introducing cloud-based automation technology Step forward automation. Raise the bar further still and make the automation software available in the cloud, and therefore straightforward and affordable, and statement reconciliation is transformed, meaning accounts payable have only the exceptions to manage. Statements received in paper, PDF and Excel formats are uploaded into the cloud-based statement processing system and automatically reconciled against data from the accounts payable ledger. Any credit notes and invoices that are missing or duplicated are flagged, so that immediate action can be taken. Any miss-postings and/or data errors are highlighted for the accounts payable team to investigate further. The benefits of this relatively simple move are far-reaching. Effective use of human expertise Instead of laborious statement checking the accounts payable team can focus on resolving the exceptions flagged by the automated reconciliation system, proactively requesting missing invoices and credit notes from suppliers for example. Sharing reconciliations directly with suppliers enables them to see the real-time status of their account and cuts down the volume of phone call and email queries to the accounts team, freeing up their time still further. Organisations have a real-time view of the invoices and credit notes that have not yet been processed. This gives them an accurate picture of their supplier liabilities, improves accruals reporting and validates the accuracy of their invoice processing. Duplicate invoices (and the subsequent payments) are easily spotted, as are payments to suppliers that cannot be matched to an invoice. Easy visibility of these means payments can be recovered and cash flow improved. Meanwhile, the accounts payable team can add direct and visible value to the organisation because they have time to focus on tasks that require human insight, rather than carrying out essential, but mundane and time-consuming activities. The advantages of cloud-based statement reconciliation extend beyond the organisation.  Relationships with suppliers can be improved because they have a 360-degree view of their statement status without having to call (or chase payment). In the long term this has the potential to strengthen the supply chain. [easy-tweet tweet="The advantages of cloud-based statement reconciliation extend beyond the organisation" hashtags="cloud, business, tech"] Making good business practise affordable and accessible Statement reconciliation is clearly good business practice but prior to the flexibility offered by cloud computing, automating the process was out of reach for many organisations.  Today, affordable and accessible software means it is a straightforward step to eliminate much of the repetitive activity it requires. The end result is a streamlined organisation that can maximise the expertise of its accounts team and develop a strong supply chain. ### More data, more intelligence – the use of SIEM beyond compliance SIEM (Security Information and Event Management) technology is growing in popularity in Europe as continuing innovation boosts its brainpower. 
Despite this, many businesses are failing to realise its full potential. Often, when those who control a company’s purse strings think of SIEM, they visualise the technology as little more than a ‘box ticking exercise’ for compliance purposes alone. Perhaps this is understandable, as an increasing number of data protection regulations, such as the upcoming General Data Protection Regulation (GDPR), mean businesses must act to ensure they are looking after data correctly. The problem this creates is that the value of SIEM in delivering ROI beyond compliance is often overlooked, with perceptions of the technology as too expensive and difficult to use preventing it from garnering any further consideration. In reality, SIEM can extract value from the data companies already hold as well as protecting it, all in a cost-effective and easy-to-operate manner. [easy-tweet tweet="SIEM can extract value from the data companies" hashtags="data, cloud, security"] Maximising workforce productivity In recent years, as the threat of data breaches and the complexity of IT systems has increased, so has the value of data. In theory, every device connected to an IT network generates data, or logs. The problem is that each system generates these logs in a different format. It is similar to an EU summit where officials are not wearing language headsets; everyone is speaking a different language. The information is available, but nobody understands it. SIEM takes this information and normalises it, effectively converting it into a single language. This helps IT managers to maintain control over the sheer volume of security logs generated from each system operating within their IT infrastructures, as interpreting this information manually is complicated and time-consuming. SIEM simplifies this process by automating the task. This means highly skilled security staff no longer need to spend copious amounts of time sifting through data in order to retrieve actionable intelligence. Rather, they can utilise the insights provided by a SIEM solution to make sensible business decisions. [easy-tweet tweet="SIEM takes information and normalises it, converting it into a single language." hashtags="data, tech"] Translating data into business intelligence One of the key misconceptions regarding SIEM is that it is purely a security-focussed tool. If businesses were to broaden their use of the technology, they could experience a greater ROI. An example of this could be as simple as utilising SIEM to monitor a business’s printing needs. Many company printers are leased from external firms, and a SIEM solution can recognise which printers are being utilised and how often. If a certain printer is not being used frequently, for instance, the business can then save money by reducing the number of printers leased. In addition, within a security scenario, SIEM can add further context to a security situation, which is crucial when making intelligent business decisions. For example, the software may flag that an employee has accessed a file they are not permitted to access. On its own, this could lead to disciplinary action. However, if a SIEM tool is also connected to car park surveillance cameras, it may notice that the person who has accessed the file has not yet shown up to work. Furthermore, if the technology is linked to the HR department, it may register that the same employee is on annual leave. 
This additional context provides vital information, allowing an organisation to recognise the difference between an HR issue and a security breach. Clearly, SIEM over-delivers when it comes to compliance. For many organisations, the next step is to make sure it is delivering – even at a basic level – to support productivity, decision-making capabilities and security procedures. We believe the future of SIEM involves more than just compliance. This is a tool that, in a world with more data than ever before, helps sift through the noise to make the most intelligent security and business decisions. ### Bluebird - IBM Sterling as an integrator into manufacturers The manufacturing industry is entering the digital age. Although many factory floors are already home to a wide range of robotics and other hardware offerings, manufacturers are now increasingly embracing a number of software-based IT solutions to improve efficiency and reliability as well as automate processes. However, as manufacturers introduce these complex IT solutions, a number of challenges, particularly around integration, can appear. [easy-tweet tweet="Manufacturers are now increasingly embracing a number of software-based IT solutions" user="BluebirdITS" hashtags="cloud, tech"] Overcoming SAP challenges Many manufacturers choose SAP systems for their IT infrastructure, but even with the proven success of the IT modules, challenges still arise. Fortunately, IBM Sterling B2B Integrator, as delivered by Bluebird IT, can help businesses securely overcome these challenges, allowing them to modernise and improve their manufacturing processes. One of the primary issues that manufacturers face is missed service level agreements, or SLAs. Inefficiencies when converting file formats to IDocs (intermediate documents) can make order processing an unnecessarily long process. However, Sterling B2B Integrator allows the exchange of a full range of SAP IDoc messages as standard, streamlining the process. Similarly, a number of SAP processes can be automated, removing the need for staff to get bogged down in repetitive manual tasks and decreasing the risk of error. Ultimately, by reducing these errors and speeding up the ordering process, manufacturers experience a lower cost of ownership and a better return on investment. Making your partnership productive The key to creating productive relationships with your partners is becoming as easy to do business with as possible. At Bluebird we recognise that not all of your partners are the same: some are large and prepared for a highly secure level of internet-based transfer, while other, smaller companies may be using faxes and emails. Your customers and partners are becoming more empowered, and Sterling B2B Integrator offers you the agility to integrate with any partner regardless of size, protocol or file transfer sophistication, as well as meeting any partner mandates. All of this allows you to keep up with your ever-changing partner community. Plus, with new partners being onboarded all the time, IBM Sterling helps you to manage your trading partners more efficiently and with fewer resources. [easy-tweet tweet="Compliance is complicated further still by supply chains crossing national boundaries." user="BluebirdITS"] In an increasingly globalised business environment, companies need to collaborate across enterprise boundaries. With IBM Sterling B2B Integrator, processes can be automated even if they are shared between different partners with differing technologies, policies, procedures, preferences and priorities. 
When you consider how fragmented the manufacturing supply chain can be, moving all the way from supplier to consumer, this level of integration is vital. Another key aspect for manufacturers to consider when integrating with a diverse range of partners, particularly when it comes to supply chain management, is visibility. A step to improving partner integration is through real-time visibility, letting manufacturers oversee and audit product, information and finance flows. The B2B Integrator offers a dashboard that helps foster compliance with SLA’s and alerts you to transactions & events that fall out of bounds, as well as offering real time information to auditors and more importantly your partners. Even if the technical challenges of integration have been overcome, manufacturing firms could still find themselves facing a number of regulatory hurdles. The issue of compliance is complicated further still by supply chains that cross national boundaries. Failure to meet these regulations could have devastating consequences for businesses, leading to fines, lost business and long-term 'reputational' damage. Sterling B2B Integrator can help manufacturing businesses meet their customer, partner and industry demands. Even if your supply chain is operating across disparate document definitions, systems and compliance standards, our comprehensive solution can give your manufacturing business greater visibility into file transfers, reduce task complexity and boost automation, giving your company the tools required to collaborate with its partners, wherever they are in the world. Bluebird is an IBM Premier Partner specialising in IBM Managed File Transfer, Business to Business Integration, Enterprise Content Management and Business Process Management solutions.  We sell IBM software and deliver services including design, development, implementation, project management, training and technical support. More about Bluebird IT here.   ### Hewlett Packard Enterprise to Extend Leadership in Big Data Analytics by Acquiring SGI Hewlett Packard Enterprise today announced that it has signed a definitive agreement to acquire SGI, a global leader in high-performance solutions for compute, data analytics and data management, for $7.75 per share in cash, a transaction valued at approximately $275 million, net of cash and debt. SGI products and services are used for high-performance computing (HPC) and big data analytics in the scientific, technical, business and government communities to solve challenging data-intensive computing, data management and virtualisation problems. The company has approximately 1,100 employees worldwide, and had revenues of $533 million in fiscal 2016. "At HPE, we are focused on empowering data-driven organisations," said Antonio Neri, executive vice president and general manager of Hewlett Packard Enterprise. "SGI's innovative technologies and services, including its best-in-class big data analytics and high-performance computing solutions, complement HPE's proven data centre solutions designed to create business insight and accelerate time to value for customers." The explosion of data, in volume and variety, across all sectors and applications is driving organisations to adopt high-end computing systems to run compute-intensive applications and big data workloads that traditional infrastructure solutions cannot handle. This includes investments in big data analytics to quickly and securely process massive data sets and enable real-time decision making. 
High-end systems are being used to advance research in weather, genomics and life sciences, and enhance cyber defences at organisations around the world. As a result of this demand, according to International Data Corporation (IDC), the $11 billion HPC segment is expected to grow at an estimated 6-8 percent CAGR over the next three years, with the data analytics segment growing at over twice that rate. SGI's highly complementary portfolio, including its in-memory high-performance data analytics technology, will extend and strengthen HPE's current leadership position in the growing mission critical and high-performance computing segments of the server market. The combined HPE and SGI portfolio, including a comprehensive services capability, will support private and public sector customers seeking larger supercomputer installations, including U.S. federal agencies as well enterprises looking to leverage high-performance computing for business insights and a competitive edge. "Our HPC and high-performance data technologies and analytic capabilities, based on a 30-plus year legacy of innovation, complement HPE's industry-leading enterprise solutions. This combination addresses today's complex business problems that require applying data analytics and tools to securely process vast amounts of data," said Jorge Titinger, CEO and president of SGI. "The computing power that our solutions deliver can interpret this data to give customers quicker and more actionable insights. Together, HPE and SGI will offer one of the most comprehensive suites of solutions in the industry, which can be brought to market more effectively through HPE's global reach." HPE and SGI believe that by combining complementary product portfolios and go-to-market approaches they will be able to strengthen the leading position and financial performance of the combined business. Overall, HPE expects the acquisition to be neutral to earnings in the first full year following close and accretive thereafter. The transaction is expected to close in the first quarter of HPE's fiscal year 2017, subject to regulatory approvals and other customary closing conditions. ### The country with the craziest Euro Cup fans: lessons from a CDN There are many metrics by which the hardcore-ness of a nation’s soccer fans can be measured: merchandise purchases, how far they travel for games, fervor for face painting and impassioned acts of hooliganism. However, the fans in the stadium (and those getting kicked out of the stadium) only tell part of the story. How about those at-home fans? What if there were a way to measure just how into their European Championship games each nation was? [easy-tweet tweet="What if there was a way to measure how into a European Championship game each nation was?" hashtags="tech"] Finding out who’s actually paying attention In this day and age there is one major indicator that someone is actually paying attention to something, be it a conversation, their work, a movie or a sporting event: he or she is not messing around on the internet at the same time. Naturally then, in order to find out which nation’s at-home fans were most invested in the games, it had to be determined which nation had the biggest drop-off in internet usage during games as fans abandoned their digital devices to cheer on their countrymen. 
Methodology to the madness Internet activity measuring duties fell to Imperva Incapsula, who harnessed the powers of their massive CDN to gain a clear picture of what was or was not going on online during each game of the Euros. A CDN or content delivery network is a globally distributed network of proxy cache servers designed to deliver website content to users as quickly as possible by cutting down the physical distance between users and the server they are redirected to based on geolocation, amongst other website performance improvement methods. The Imperva Incapsula content delivery network serves tens of millions of users in Europe and handles 12 million requests per minute, so they had an advantageous position when it came to observing the internet habits of European soccer fans. The data used was amassed by comparing the number of sessions to the CDN at the time of the game as compared to the average number of sessions that occurred at the same time and on the same day of the week from May 20 to June 8. The tournament began on June 10th. Group Stage, where the underdogs (mostly) ruled the day To kick off the tournament it was generally the countries whose teams were major underdogs in their matches who tuned in most ardently and recorded the biggest drops in internet usage. Albania, Russia, Ukraine and Hungary all won the battle of at-home fan fervency while Switzerland, England, Germany and Austria were a little more chill about splitting their attention between devices, perhaps because they were more confident in the outcome of the games. The exceptions to this loose rule were Spain, whose fans dutifully ignored the internet as they took on the lower-seeded Czech Republic; Portugal, who outshone Iceland when it came to dropping their devices; and the home team and major favourite France, whose fans weren’t willing to overlook the game against Romania and recorded a 19 percent drop in internet usage. On to the Round of 16 France fans kept on keeping on when it comes to focusing on football. While France’s internet usage declined 23.3 percent during their game against the Republic of Ireland, Irish fan internet usage actually rose 0.4 percent. The Icelanders who stayed offline to record a 2.4 percent drop in internet usage were rewarded with a big upset, thumping England whose disenchanted fans recorded a 9.6 percent rise in internet-ing. Italy and Germany both inspired drops of over 15 percent with their big wins over Spain and Slovakia, respectively. The Quarter Finals, where it was Germany vs. Italy on and off the field Germany and Italy played in a nail-biter that was decided on penalty kicks, with Germany walking away the winner. German fans also won with an 18.6 percent decrease in usage to Italy’s also-impressive 15.5 percent drop. France, meanwhile, kept ignoring the internet and kept on winning, taking down Iceland. Portugal’s at-home fans also had a notable showing with a 14.6 percent decrease during their win on penalty kicks over Poland. Semi ignoring the internet During the semi-finals it was old internet ignoring favourites Germany and France once again posting double-digit drops, with Germany seeing a 14.6 percent drop and France posting a 13 percent decrease. On the field it was all France with a 2-0 win. 
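Underneath the rankings, the methodology described above reduces to a single comparison: sessions observed during a match versus the average for the same time slot and weekday in the pre-tournament baseline window. A rough, hypothetical sketch of that calculation (the session counts below are invented for illustration) looks like this:

```python
# Hypothetical illustration of the drop-off metric: compare sessions seen
# during a match with the baseline average for the same hour and weekday,
# then express the difference as a percentage.

def usage_change(sessions_during_match, baseline_sessions):
    """Negative result = drop in internet usage, positive = rise."""
    baseline_avg = sum(baseline_sessions) / len(baseline_sessions)
    return (sessions_during_match - baseline_avg) / baseline_avg * 100

# Baseline: sessions at the same hour on the same weekday before the tournament.
baseline = [1_020_000, 980_000, 1_000_000]
during_match = 810_000

print(f"{usage_change(during_match, baseline):+.1f}%")  # -19.0% -> fans watching, not browsing
```

The interesting part is not the arithmetic but the baseline: comparing like-for-like time slots strips out the normal daily and weekly rhythm of traffic, so the remaining difference can reasonably be attributed to the match.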
Finally The UEFA European Championships 2016 finals saw a blistering match-up between France and Portugal, and while both teams’ fans did their part by ignoring the net, in the end it was the internet imitating life with Portugal coming out on top with a whopping 60 percent decrease in internet usage and a 1-0 win. France had a 39 percent reduction in internet usage. Though CDNs are much more typically used for things like speeding up websites, improving performance and search engine rankings, cutting bandwidth bills and protecting against DDoS attacks, observing European Championship internet traffic was worthwhile as well, even if only to keep Russian hooligans from being named the top fans based on how many people they hospitalised. ### Three ways Businesses have changed their approach to data since 2015 Recent research from Gartner revealed that more than three quarters of companies are investing or planning to invest in big data by 2017. As with the adoption of any new technology, the journey to properly leveraging data is a long and challenging one - just 0.5 percent of data is ever analysed or used. However many are now beginning to understand the critical role data and analytics will play to the future of successful companies; The Economist’s Intelligence Unit recently found the majority of business leaders believe data is “vital” to their organisation’s profitability. As adoption and the understanding of what data can and should achieve grows among businesses, data is shifting from a vision to a reality for many. It has gone from a buzzword to business practice. [easy-tweet tweet="#Data has gone from a buzzword to business practice." hashtags="cloud, tech"] At Tableau, we have first-hand insight into how businesses are working to capture and analyse data. Customers that span nearly every industry and vary from small businesses to enterprise corporations use Tableau Online to connect to and analyse their data each with their own unique data sets and objectives. Their data is stored in a wide variety of places—from individual databases, to cloud, on-premise, and hybrid deployments. In an effort to help businesses navigate this diverse landscape, we’ve developed the ‘Cloud Data Brief’, which analyses the usage patterns of over one million anonymous data source connections published to Tableau Online by more than 4,000 customers. This brief identifies emerging trends that we believe are representative of the wider industry and that all organisations must consider when developing a data strategy. Data’s centre of gravity is moving to the cloud. Cloud is revolutionising IT, and this is particularly true of data. As the size, scale and complexity of data grows, cloud has proved the ideal platform on which to host it, empowering businesses to easily capture and store their data in cloud databases and Hadoop ecosystems.   Using Tableau as an example, we saw a 28 percent increase in usage of cloud-hosted data within Tableau Online over the past 15 months. By the first quarter of 2016, nearly 70 percent of our customer data was hosted in the cloud. This marks a noticeable shift. To put things into perspective, just last year the split was almost even at 55-45 cloud to on-premise deployment. [easy-tweet tweet="Cloud is revolutionising IT, and this is particularly true of data." hashtags="tech, cloud"] Data gravity indicates the pull of data on services and applications. 
If your data lives in the cloud, for example, you’ll likely want your data tools – from processing to analytics – running in the cloud as well. Data’s centre of gravity is now squarely focused on the cloud, and that focus will only grow larger in the future. Organisations building data ecosystems should concentrate their efforts on cloud workflows to ensure their systems are ready for this change in data gravity. In the move to cloud, hybrid data technologies are becoming the norm Hybrid databases can be deployed on both cloud and on-premise networks. When not all your data can be moved to the cloud, or you want to make the move incrementally, hybrid options give the flexibility to bridge that gap between cloud-hosted and on-premise environments. Gartner recently predicted these hybrid offerings will become the norm by 2018. For businesses transitioning to the cloud, however, the Brief reveals that for Tableau’s customers in particular, hybrid is already the norm. The reason for this is straightforward. As established businesses move their operations to the cloud, they often find themselves with data that must remain on-premise. Security and privacy requirements, for example, can necessitate storage behind corporate firewalls. Other times, moving operations to the cloud is a slow and incremental process achieved over months or years. These scenarios are driving demand for hybrid solutions. Returning to our data as an example, the most popular hybrid databases used in Tableau Online are SQL Server, MySQL, and PostgreSQL. In the first quarter of 2016, these three databases accounted for over half of all data sources used by our customers. Meanwhile, connections to cloud-native data sources like Amazon Redshift and Google BigQuery are rapidly gaining market share. At the beginning of 2014, they represented just 12 percent of all connections in Tableau Online. By the first quarter of 2016, they had grown to 28 percent. [easy-tweet tweet="Cloud-native data accounts for only a quarter of connections used in Tableau." hashtags="cloud, data, tech"] Even with such consistent growth, however, cloud-native data sources account for only a quarter of all connections used in Tableau Online. Hybrid sources, on the other hand, have never dropped below 60 percent. Data storage is expanding beyond our traditional concepts of databases and businesses are analysing data from many sources. With the rise of the Internet of Things (IoT), data is now flowing from everywhere and everything. As a result, the capture and storage landscape has expanded to meet the requirements of new and variable streams of data. There are over 40 types of data sources represented in Tableau Online. Even when file-based sources like Excel and web applications like Google Analytics are excluded, there are 32 distinct types of databases and Hadoop ecosystems represented. This diversity is indicative of the wide and varied landscape of data tools available today. For instance, newcomers like Snowflake are rethinking the delivery of databases, while hosted solutions like Amazon Elastic MapReduce have extended the capabilities of traditional Hadoop ecosystems. In the future, it’s almost inevitable that the landscape will only become more crowded. There’s a clear need for businesses to connect their analytics tools not to one, but to many data sources spanning databases, Hadoop ecosystems, and web applications. 
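The practical upshot of this diversity is that the connection becomes configuration rather than a fixed assumption. As a rough, hypothetical sketch (placeholder hosts and credentials, with SQLAlchemy chosen purely for illustration), the same piece of analysis can be pointed at an on-premise database and cloud-hosted ones simply by swapping connection strings:

```python
# Hypothetical sketch: one analysis routine pointed at several data sources by
# swapping SQLAlchemy connection URLs. Hostnames and credentials are placeholders.
from sqlalchemy import create_engine, text

SOURCES = {
    "on_prem_sql_server": "mssql+pyodbc://user:pass@onprem-host/sales?driver=ODBC+Driver+17+for+SQL+Server",
    "cloud_postgres": "postgresql+psycopg2://user:pass@cloud-host:5432/sales",
    "cloud_mysql": "mysql+pymysql://user:pass@cloud-host:3306/sales",
}

def row_counts(table_name):
    """Run the same query against every configured source."""
    results = {}
    for name, url in SOURCES.items():
        engine = create_engine(url)
        with engine.connect() as conn:
            results[name] = conn.execute(text(f"SELECT COUNT(*) FROM {table_name}")).scalar()
    return results

print(row_counts("orders"))
```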
For organisations looking to capitalise on the breakneck speed of innovation in the data landscape, it is necessary to build a data workflow that focuses on flexibility and choice above all else. ### Brexit - In times of uncertainty, Opex rules The future of the UK and its finances is undeniably uncertain in the wake of Brexit. With a news landscape that changes daily, making decisions regarding the technology spend within an organisation is one of the CIO’s biggest challenges. In this unpredictable climate, it is only natural that every business will be hesitant to spend a penny more than it absolutely has to. [easy-tweet tweet="The future of the UK and its finances is undeniably uncertain in the wake of Brexit" hashtags="fintech"] In recent years CIOs have pursued the opportunities presented by cloud service providers, offering ‘flex-up, flex-down’ rental models for hosting and storage. No longer did CIOs need to commit capital to investments that demanded a clinical focus on sweating assets over a three-to-five-year term to deliver the appropriate ROI. Now, rapid deployment and short-term consumption are all made possible under the ‘cloud-based’ operating model. Today, despite the current market’s volatility and reluctance to spend on anything with a mid-to-long-term return, businesses simply cannot afford to ignore their IT requirements or they will risk falling behind. Using outdated technology leads to a competitive disadvantage and a real risk of failing to operate at full business potential. Furthermore, when and/or if the economic uncertainty settles, they will potentially have to undertake a large financial outlay to get up to speed. The good news is there are ways to stay ahead of the curve and efficiently manage financial IT investment. Historically, in the macroeconomic conditions of the eighties, many businesses moved from large-scale enterprise investment to more flexible financing models, and the current uncertainty will see this situation occur again. Businesses need scalability and flexibility, and the Opex “pay as you go” model offers flexibility according to market demands. To preserve the hosting and storage function, smart providers are now offering to de-risk investment in core infrastructure and the end-user estate. More than simply a leasing model for hardware (which still demands a full-term commitment or incurs a cancellation penalty), the option to rent the full end-user or core infrastructure bundle is now a rare but realistic option. And critically, the provider assumes the risk of the business de-aggregating or shrinking; stop using, stop paying. [easy-tweet tweet="There are ways to stay ahead of the curve and efficiently manage financial IT investment." hashtags="cloud, tech"] So for many, committing to a one-time capital expenditure - a Capex model - and a multi-year depreciation schedule just isn’t cost-effective, desirable or dynamic enough in the current climate. Businesses that move from Capex investments to an Opex model can benefit from greater flexibility and diminished risk, with IT spend closely linked to the ebbs and flows of business demand and market forces. Opex allows companies to engage the resources and expertise of managed services experts, including cloud hosting, while eliminating the need for capital investment in technology projects, replacing up-front cost with pay-as-you-go operational spending and the opportunity for increased agility. 
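The financial argument is easiest to see with a small worked example. The figures below are entirely hypothetical: a one-off purchase depreciated straight-line over five years against a per-user monthly service charge. The point is not the absolute numbers but the behaviour - the Capex charge is fixed however the business changes, while the Opex bill tracks demand.

```python
# Hypothetical Capex vs Opex comparison. All figures are invented for illustration.

CAPEX_OUTLAY = 300_000          # one-off purchase
DEPRECIATION_YEARS = 5          # straight-line depreciation
OPEX_PER_USER_MONTH = 55        # pay-as-you-go rate

def annual_capex_charge():
    return CAPEX_OUTLAY / DEPRECIATION_YEARS        # fixed, regardless of usage

def annual_opex_charge(active_users):
    return active_users * OPEX_PER_USER_MONTH * 12  # tracks demand

for users in (100, 80, 50):     # headcount shrinking year on year
    print(f"{users} users: Capex £{annual_capex_charge():,.0f} vs Opex £{annual_opex_charge(users):,.0f}")
```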
[easy-tweet tweet="Opex allows companies to engage the resources and expertise of managed services" hashtags="cloud, tech"] For those under pressure from the CFO who is naturally concerned that historic investments, which still have book value to be managed, the transition to a predominantly Opex model doesn’t have to happen in one migration. Blended models that make use of current investments in technology, coupled with historic contracts with sub-contractors, can be built into a service delivery model that evolves over time. Refreshing Capex financed assets with Opex leased equipment when the technology ceases to be fit for purpose or when the book value is fully written down. Integration happens both at the technology and commercial levels; but once migrated the flexibility and diminished risk become tangible benefits. And to any ‘late adopters’ of the cloud model, Opex presents a great opportunity for organisations that were uncertain before, but now wish to test the water - by investing in rental, hybrid models – and only paying for the data or services used, with any peaks in demand covered using contractor resources. And thankfully the CIO can adopt when new investment become a necessity whilst sweating the existing investments in a fully hybrid model – there’s no material outlay to adoption. Employing an Opex model will also potentially accelerate further adoption of cloud technology in business, as organisations continue to build trust in the cloud ecosystem. ### Why the UK needs its own privacy-focused search engine Most of us would agree that privacy is important. Without it, our ideas and actions are exposed to the world without our consent – something that even the most open of individuals would surely accept is not without its issues. Privacy is not about hiding illegal activity, but about being able to choose what we keep to ourselves and what we share. [easy-tweet tweet="Privacy is not about hiding illegal activity but choosing what we want to share" hashtags="cloud, security"] Unfortunately, maintaining our privacy is becoming increasingly difficult and in the digital age even something as seemingly benign as using a search engine comes at an unexpected cost. With personalised advertising providing such a lucrative market, search engine providers are harvesting personal information on an unprecedented scale. Sharing our personal data, including our age, location, likes and dislikes, is the modern cost of using the Internet. Protecting UK privacy Although there are privacy-focused search engines available, such as DuckDuckGo in the US, Oscobo is the first to target the UK market. This is important as location can play a crucial role in providing relevant results. By using a UK-based privacy search engine, users can receive results that are targeted to a UK audience, but without the search engine having to harvest individual location data. Of course, if users decide to add further location details into their search string, such as “Places to eat in Manchester,” for example, they can still receive more precise results. There are also a number of other reasons why the UK was evidently ready for its own privacy-focused search engine. Although the most prominent surveillance headlines of recent years have surfaced in the US, namely those involving Edward Snowden and the NSA, the UK has some significant privacy controversies of its own. 
The ongoing debate surrounding the Investigatory Powers Bill, currently being scrutinised in Parliament, suggests that online privacy is an important issue here in the UK. In addition, the more users understand about the type and quantity of information being collected by Google and similar sites, the more their privacy concerns are likely to grow. [easy-tweet tweet="the UK has some significant privacy controversies of its own." hashtags="cloud, security"] Already, the privacy-search market is showing promising growth, outstripping that of the search market as a whole. All of this points to the fact that there is a receptive audience in the UK ready for a privacy-focused search engine. In the future, however, there are plans for Oscobo to expand into other European markets. Although privacy may be a universal right, each market has its own particular concerns, which is why we believe that the UK and other nations deserve a reliable, secure and surveillance-free search engine all of their own. ### The Facebook Internet Facebook has over 1.71 billion monthly active users* - more than the population of the world’s largest country - and it is continuing to grow daily. 1.57 billion* of these users access the site from a mobile device and Facebook are responding to this by rolling out new features on their Facebook and Messenger app whilst the desktop site remains relatively unchanged. With more features and content being added to the app regularly, a closed ecosystem is developing and users can access almost anything they need without having to leave. Facebook doesn’t just want to own the Internet; Facebook wants to be it. [easy-tweet tweet="A closed Facebook ecosystem is developing, users can access anything needed without having to leave" hashtags="social, tech"] The reason this is working is down to the unparalleled popularity of Facebook as a social media site. With a monopoly on users, Facebook can drive away competition from other services simply by being something that people are already on board with. Take the introduction of Facebook Live, for example. Live video streaming has been popular for some time with apps like Periscope, but now that the same service is available on Facebook, why go anywhere else? There is no need to download a new app or persuade friends to sign up, and users now have another reason to go on Facebook. Similarly, Facebook games, Instant Articles and video integration have increased users’ app ‘dwell time’. By delivering content users want to see based on their own likes and the opinions of their friends, the experience of playing, watching or reading is enhanced on Facebook and users are much more likely to engage. [easy-tweet tweet="With a monopoly on users, Facebook can drive away competition from other services" hashtags="social, cloud, tech"] One of the most interesting features Facebook has introduced to its app to increase time spent on the site is one that might have passed users by. The Facebook browser is a stripped-down web-view window that allows users to stay within the Facebook site when opening external links (links that would previously have meant navigating away). By opening links on Facebook, the in-app browser shares information back to the site whilst the user has a normal web view experience. Although it is still basic, the Facebook browser allows users to bookmark pages, navigate back and forth, and displays the sentiment of other Facebook users in a response bar at the bottom. 
Most importantly, loading is fast and entirely seamless between the app and browser making it highly unlikely users will opt to view this content elsewhere. So what does Facebook gain from this move towards providing users with a closed ecosystem? The short answer is data. By adding more features and therefore increasing users’ ‘dwell time’, Facebook are not only increasing the amount of time a user can be shown ads, but also gathering more information on the kinds of ads they want to see. With the in-app browser, Facebook learns what websites a user has visited, and for those taking the time to comment or react, exactly how they feel about the content delivered. Put this information together with information on the games they like to play and the articles they like to read, and the personal and social information already shared on their profile and Facebook have a rich and complex insight into the sentiment and personality of each individual user. This information can be used to create content that keeps users hooked and is shared with advertisers to deliver highly tailored, relevant advertising that appeals to the user without them feeling like their online space has been invaded. Ultimately click rate and sales are increased and the value of Facebook’s advertising space is raised. Facebook is changing the way we use the internet and by adding more features to its ecosystem, users are becoming self sufficient on the site with less reasons to leave and fewer decisions to make, sharing their habits, tastes and opinions with the Facebook simply by interacting. The next logical step for the company would seem to be the development of a full Facebook browser. This would not only give users more reason to head straight to Facebook when they go online, but also keeping them off major competitor sites such as Google. Improving the limited functionality of its existing in-app browser by adding the option to type URLs and tab support would be a game changer and bring Facebook one step closer to its goal, meaning that going online really does become synonymous with going on Facebook. [easy-tweet tweet="Ultimately click rate and sales are increased and the value of Facebook’s advertising space is raised." hashtags="social, tech"] *As of July 2016 - https://zephoria.com/top-15-valuable-facebook-statistics/ ### It’s not all doom – now is the time for cloud Currently, the public sector IT community faces an environment where the only certainty is uncertainty. This predicament is down to the Brexit vote. It is becoming harder to entice stressed and distracted ministers to think about much else in the wake of government being reshaped to consult and deliver a new role for the UK in the world. [easy-tweet tweet="CTOs should take the opportunity to fix their underlying IT infrastructures." hashtags="cloud, tech"] No-one knows which form Brexit will take, and there are many possibilities, so its impact on the wider public sector in the four nations of the UK is hard to pin down. That means government policy on everything from borders to farm payments could be up in the air for years - Brexit complicates life for government CTOs and CDOs. Digital and technology projects linked to policy will have to wait for a clearer picture. However, there are also reasons to be cheerful. Rather than waiting for the go ahead on projects that may never materialise, CTOs should take the opportunity to fix their underlying IT infrastructures. 
With flexible, low-cost infrastructure and skilled development teams in place, technologists can be agile when the time comes for them to implement new government policies and build new services. Now is the time for cloud. Making healthy gains Ex-Minister Francis Maude set the tone for Cloud when he announced the ‘Cloud First’ policy for central government in 2013, making it clear that Cloud is one of the few bright spots in a static public sector IT market. Cloud services, including SaaS, PaaS and IaaS, only accounted for around two per cent of overall public sector IT spend at that time, but have been growing at a very healthy 45 percent per year since. ‘Cloud First’ was characterised as a policy that would essentially drag the unwilling adopters of government IT into the 21st century. However, this was deemed to be unfair; public sector IT professionals have long since understood the potential benefits of cloud and ‘as-a-service’ delivery models. [easy-tweet tweet="‘Cloud First’ aims to drag the unwilling adopters of government IT into the 21st century." hashtags="cloud, tech"] Kable’s 2016 survey told us that public sector IT professionals expected to gain: greater flexibility and scalability in computing resources (19%); improved IT efficiency and agility (17%); greater cost effectiveness (17%); reduced capital expenditure (16%); mobile and ubiquitous access to applications and services (15%); greener IT (11%); and other benefits (5%). A simple and quick route Following the positive wave of optimism after GDS was founded, ‘Cloud First’, with its focus on commodity public cloud services, was greeted with enthusiasm. New digital teams, bringing agile working practices to government for the first time, often found public cloud services quicker and easier than their own IT departments to work with. The first generation of digital services were well suited: greenfield, handling little sensitive data, and with light backend integration requirements. G-Cloud has been an excellent vehicle for those early cloud projects: not just commodity IaaS, but also SaaS tools supporting new collaborative ways of working. It offers buyers a simplified, quick route to access a wide range of suppliers, especially SMEs who are often shut out of the public sector market. On average, the public sector is now spending over £50m per month via G-Cloud (although some 80 per cent of that has been for professional services, rather than SaaS, PaaS and IaaS). Where there’s hope, there’s hype The next wave – call it public sector cloud 2.0 – is being driven by the transformation of existing services, rather than brand new projects. Opportunities will be opened up as legacy IT outsourcing contracts come to an end, and organisations look to shift workloads to the cloud. Kable analysis of G-Cloud sales data – supported by anecdotal evidence from suppliers – suggests that sales growth has reached a plateau. Public sector cloud is still in its infancy and has huge growth potential, but much of the low-hanging fruit has gone. As my colleague Gary Barnett has argued, people sometimes talk about cloud as if it had magical, unicorn-like properties. And herein lies the issue. The ins and outs of the reality of cloud migration are ignored and buyer enthusiasm is not always backed up by expertise. 
We see evidence of this in cloud-hosted government digital services – apparently not architected for cloud – potentially crashing under completely predictable demand spikes: DVLA’s digital tax disc registration and GDS’s Register to Vote service, for example. Making a success of cloud is not about swapping out expensive infrastructure for cheap public infrastructure, and it’s not about G-Cloud. It’s about enterprise architecture, data quality projects, new deployment processes, governance models, security policies and service management. It’s about hybrid cloud and cloud orchestration. It’s anything but simple. [easy-tweet tweet="People sometimes talk about cloud as if it had magical, unicorn-like properties." hashtags="cloud, tech"] Here comes Cloud 2.0 The competitive landscape is also changing. As requirements evolve, a number of UK-based SMEs have succeeded on the back of a public sector preference for UK data centres. Life will unfortunately become more challenging for them as public cloud giants Microsoft and AWS open UK data centres this year. Systems integrators, including IBM, CSC and others, can additionally provide hybrid cloud services on a large scale. In the wake of Brexit, smaller players could end up relying on client fears about data sovereignty to keep them in business. However, it would be preferable for suppliers to offer a positive vision to the public sector, not just a negative one. Tackling the barriers to cloud adoption should be the main priority, using services such as data governance frameworks, security audits and migration programme management. SMEs also need to start getting over the G-Cloud hype. Some suppliers have learnt the hard way that Digital Marketplace is no substitute for sales and marketing – this is becoming even more apparent as buyer needs become more complex and less suited to G-Cloud. Those suppliers that wish to remain in the commodity space will need partners that can bring them into clients’ hybrid environments. Above all, suppliers should be helping clients to build their own expertise and knowledge, so that the UK government can become the well-informed customer for cloud services that it aspires to be. And, perhaps, to get its infrastructure into shape for the day when politicians and the public finally decide just what Brexit means for the future. ### Are your employees putting your cloud data at risk? Ransomware has spent quite a bit of time in the spotlight as a major culprit of data breaches. There is, however, another threat that, while not as sensational as ransomware, can be just as risky. That threat is people. Of all the data breaches reported in the UK during Q1 2016, ICO data reveals that 62 percent were caused by human error. In fact, ransomware wouldn’t be as prevalent as it is if it weren’t for people like you and me making blunders such as accessing insecure web pages, downloading infected software or clicking a phishing link in an email. [easy-tweet tweet="People accessing insecure web pages are responsible for ransomware being prevalent." hashtags="cloud, tech"] Unfortunately, the cost of human mistakes is larger than one might think. According to research by the University of Portsmouth, fraud and human error are costing UK organisations around £98.6 billion a year. The actual number is even larger, as the reported figure doesn’t include undiscovered or unreported incidents. While some might think that storing data in the cloud keeps it from being vulnerable to ransomware, they’re wrong. 
Ransomware can encrypt files on hardware and cloud services alike. Of course, data in the cloud is always susceptible to human error. Minimising the risk your employees pose to cloud data begins with education. However, figures from Experian reveal that only 46 percent of companies enforce obligatory security training for all employees. Among those that do offer employee security training, 43 percent only provide basic training that omits many of the serious data breach risks their businesses face. [easy-tweet tweet="While some might think that storing data in the cloud keeps it from being vulnerable to ransomware, they’re wrong" hashtags="tech, cloud"] When educating your employees, focus on the following principles to help ensure your staff follow cybersecurity best practices. Handle data with care. The majority of incidents attributable to human error are associated with sheer carelessness or lack of knowledge about how to properly handle data. To prevent unauthorised access to data, employees should consider who else might be able to view the information they store in the cloud. They should avoid storing sensitive data on a shared drive or cloud network that’s accessible by people who aren’t authorised to access the data being uploaded. Any sensitive data files must always be encrypted, regardless of where they’re stored. Quickly identify phishing emails. [easy-tweet tweet="Teach employees to look for these common characteristics of phishing emails" hashtags="cloud, tech"] An unsettling number of employees are falling victim to phishing attempts. According to research from Verizon, people opened 30 percent of phishing messages, up from 23 percent last year. Of that 30 percent, 13 percent also opened the attachment, giving malware a clear path to the network. Teach employees to look for common characteristics of phishing emails: poor design; incorrect spelling and grammar; requests for personal details; suspicious attachments; and URLs that don’t match the company’s primary domain (to view a URL without clicking a link, users can hover over the link with their cursor). Respond appropriately to a suspected ransomware attack. If employees suspect a device they’re working on has been impacted by ransomware, it’s critical that they stop working on the device immediately and notify IT. Ransomware can wipe files after a set amount of time, so delaying action could result in data loss. Even if you have backups and are able to recover your systems, restoring the production environment can take as long as a few days, which could lead to costly downtime. For example, Lukas Hospital in Neuss, Germany, had complete backups of all systems in place, but when it was plagued with TeslaCrypt 2.0 ransomware, the hospital estimated that it would take up to 48 hours before its IT environment was fully functional again. As a result, 20 percent of the hospital’s surgeries had to be rescheduled, and less critical care had to be temporarily shifted to other hospitals. Apply security patches. New security threats are continually surfacing, exploiting vulnerabilities in hardware and software. To protect their systems against these threats, users need to apply system patches immediately when prompted. Even delaying the updates by a few days could increase the likelihood of the system and network falling victim to ransomware attacks and other cybersecurity risks. Create secure logins. 
Employees need to create sentence-based complex passwords that involve special characters, numbers and a mix of lower- and uppercase letters (e.g. “To be or not to be” becomes “2BorNot2B_ThatIsThe?”). Whenever possible, use two-factor authentication to increase security. Avoid shadow IT. As if the risk posed by human error and ransomware alone weren’t enough, shadow IT only aggravates the threat. Research from Cisco reveals that CIOs estimate that their organisation has 51 public cloud applications in use, but the actual number is more like 730. If your employees are uploading restricted data to an unauthorised cloud application – such as Google Drive, Dropbox and Evernote – without proper encryption, this increases your security risk. Encourage your employees to enlist IT’s help in selecting and implementing cloud solutions. The IT department should be empowered to act as a trusted adviser rather than merely a strict policy enforcer, which will minimise the likelihood of employees resorting to shadow IT. Follow your organisation’s security policies. Having clear, enforceable security policies in place helps your employees know what data they have permission to view and handle and how they’re allowed to view and handle that data. With most UK businesses (95 percent, according to a BT study) using mobile devices for work purposes, a bring-your-own-device (BYOD) policy is a must. Your BYOD policy should address issues such as data security, remote management, data transfer, backups, data wipe and technical support (office or field based). If you work with a managed services provider for your IT support, ensure that the vendor can assist with developing and supporting your BYOD program. Retrain after a breach. Having employees who are educated in security best practices reduces the chance of unauthorised access to data as well as ransomware compromising your data and network, but it’s not a fool-proof solution. Employees do still make mistakes. Unfortunately, Experian has discovered that a whopping 60 percent of businesses that have experienced a breach make the error of not retraining staff after a breach has taken place. If a breach occurs, review what went wrong and have your employees go through security training again, with special emphasis on weak areas. Employees ought to observe cybersecurity best practices, but you should do your part by ensuring your IT infrastructure is designed to protect against ransomware and other threats. To protect your perimeter, implement a network solution that includes intrusion detection and prevention (IDS/IPS), deep packet inspection and perimeter anti-virus, and malware blocking. Also be sure to back up your systems. If ransomware takes your systems hostage, paying the ransom is never recommended, as it only encourages hackers, so the key to recovering your IT environment is a reliable disaster recovery as service (DRaaS) solution that creates and updates a complete image of your system as frequently as you specify, sometimes even as often as every 15 minutes. [easy-tweet tweet="Whatever backup or #DRaaS solution you choose, files should remain encrypted in transit and at rest." hashtags="cloud"] Whatever backup or DRaaS solution you choose, files should remain encrypted in transit and at rest. Also verify that the vendor offers service level agreements (SLAs) that ensure that your data can be restored within your recovery time objectives (RTOs) and that provide adequate recourse in the unfortunate event that data is lost. 
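On the encryption point, one common belt-and-braces approach is to encrypt the backup archive on your own infrastructure before it is handed to any backup or DRaaS provider, so the file is ciphertext at rest wherever it ends up, with TLS still protecting it in transit. A minimal, hypothetical sketch using the Python cryptography library (the file names and key handling are illustrative only, not a recommendation of any particular vendor's workflow):

```python
# Hypothetical sketch: encrypt a backup archive client-side before handing it
# to a backup/DRaaS provider, so it is stored as ciphertext wherever it lands.
# Requires the 'cryptography' package: pip install cryptography
from cryptography.fernet import Fernet

def encrypt_backup(src_path: str, dst_path: str, key: bytes) -> None:
    with open(src_path, "rb") as src:
        ciphertext = Fernet(key).encrypt(src.read())   # authenticated symmetric encryption
    with open(dst_path, "wb") as dst:
        dst.write(ciphertext)

key = Fernet.generate_key()   # keep this in a key vault, never alongside the backup
encrypt_backup("backup-2016-08-15.tar", "backup-2016-08-15.tar.enc", key)
```

Whether you encrypt client-side or rely on the provider, the SLA questions above still apply: losing the key is as final as losing the backup itself.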
Although ransomware is a legitimate threat and is certainly deserving of the media’s attention, often the more immediate – and more easily managed – risks can be within the organisation rather than outside it. Educating your employees and doing your part to protect your network are excellent starting points for preventing a data disaster. ### More cloud to hit the UK The usage of Cloud technology is spreading around the world and extending its reach beyond traditional IT teams and into the realms of sales, accounts and business leaders who want to be more mobile, more scalable, store more data and take advantage of the latest technology on offer in order to keep up with the competition. [easy-tweet tweet="The UK is now experiencing a period of huge cloud growth." hashtags="cloud, tech"] Whilst the Cloud has established itself in the US, the UK has been slightly slower in embracing the technology but is now experiencing a period of huge growth. Microsoft Azure is gaining popularity but Amazon Web Services (AWS) continues to lead in public cloud adoption. The popularity of AWS also has a large impact on DevOps - the power of AWS and automation of process is a massive driver to businesses adopting a DevOps way of working when it comes to creating apps and websites. With around 97 recognised consulting partners in the UK, the competition is growing and businesses are competing to find the best IT talent. Due to the fast expansion of the cloud industry, businesses are facing the challenge of finding the skilled people to implement and maintain cloud services. In fact, lack of resource/expertise is recognised as the number one challenge in the industry, according to the RightScale 2016 State of the Cloud Report. [easy-tweet tweet="Competition is growing and businesses are competing to find the best IT talent." hashtags="cloud, tech"] In the UK, the skills shortage is, in part, due to a slower adoption of this emerging technology, compared to the US who is leading in cloud services and is at the centre of Cloud’s global expansion. The sudden growth of Cloud in the UK means that there isn’t a talent pipeline to fill the high level of jobs now in demand. The numbers of consultants, engineers and architects holding certification are growing but not fast enough. Those who’ve built up hands on knowledge of the platform are engaged and enjoying the projects they are delivering and those companies are developing a pipeline of business on the back of that expertise. The extent of the UK’s digital skills crisis is exemplified in a recent report published by the UK’s Science and Technology Committee. One of the main and most significant findings was that 12.6 million adults in the UK lack basic digital skills. With 90 percent of jobs now requiring digital skills, the digital skills gap costs the UK economy £63bn in lost income and is having an adverse effect on businesses, with 93 percent saying that the skills gap is affecting their commercial operations. Not only is there a lack of digital skills generally, there is also a deficit in the specific skills and knowledge needed to implement cloud services like AWS and the adoption of a DevOps way of working, which are particularly difficult to set up. Recruitment in these fields is incredibly fierce as businesses look for individuals with backgrounds in large data-centre administration or from hosted data-centre backgrounds who have the experience to understand and utilise these services and the ability to quickly learn how to implement them. 
Another challenge which continues to prevail in the cloud industry is security and information governance. Interestingly but not surprisingly, security is more of a concern amongst those taking their first steps towards the cloud than it is amongst more experienced users, according to the aforementioned RightScale report. There are still stigmas to overcome but the financial sector’s recent adoption of cloud services is a large seal of approval for security in public cloud services. [easy-tweet tweet="Security is more of a concern amongst those taking their first steps towards the #cloud" hashtags="tech"] With the growing popularity of AWS and DevOps and the increasing awareness businesses across multiple industries have of Cloud benefits, employers need to up their game to secure the skills they need to fully embrace the Cloud. IT professionals with Cloud experience are highly sought after and have the upper hand in the employment game. These technically-skilled individuals are able to dictate a cost for their time and demand the projects and clients they wish to work on and with. As well as seeking new talent to enhance their existing IT capabilities, businesses should also look to up-skill their IT and development staff to ensure they have the best chance of future-proofing their business. Find more at Networkers. ### Reducing Threats and Management Headaches across Private Clouds While public cloud implementations are steadily increasing, private clouds in customer’s own data centres continue to be deployed because of the perceived higher levels of security and control they offer. However, the management of a private cloud can be complex and many organisations underestimate the scale of the challenge. [easy-tweet tweet="Management of a private cloud can be complex and many underestimate the scale of the challenge." hashtags="#cloud #tech"] Security and management in particular continue to be pain points. Many arrive with the assumption that in a private cloud, IT departments have more control so the environment will either inherently be more secure, or will be easier to implement and maintain security controls. Unfortunately, many organisations subsequently find the management challenge is greater than anticipated and adoption is difficult, which often has more to do with organisational and IT transformation issues rather than the technology itself. This puts a whole new dimension on the scale of the challenge, and raises the question of whether the internal IT team is up to it. [easy-tweet tweet="In order to secure your private cloud you will need to manage your cloud footprint" hashtags="cloud, security"] In order to secure your private cloud you will need to manage your cloud footprint – including key performance metrics, network and virtual machine configuration, disaster recovery and backup, intrusion detection, access management, patching, antivirus to name a few considerations. The sheer volume of security controls needed can be overwhelming and not many IT departments have the people power or expertise to be able to manage diverse security solutions from multiple vendors. However, the real kicker is often not just the security technology itself, it’s the internal processes required to implement private cloud security. The IT team needs to work with the security department and procurement to define responsibility for who looks after what when it comes to maintaining infrastructure. 
Internal departments also won’t necessarily prioritise your request and complex negotiations and acts of diplomacy often ensue. Evaluation processes alone can take six months or more and it’s a dance that needs to be done for not one security solution but for multiple solutions. Add to this the fact that the threat landscape continues to grow and become ever more complex. Some cloud initiatives are increasingly stalling or getting cancelled altogether because the security risks are deemed to be too high. This results in an uncomfortable situation for IT leaders as lines of business in their organisations are still demanding the agility, scalability and cost savings that cloud computing can deliver. IT leaders know they can’t abandon cloud altogether, the benefits are too great – yet they also know whose head will be on the line if an outage, data loss or hacking incident was traced to a cloud workload. It is not surprising that some IT leaders look back with nostalgia on the simple outsourcing models of the past. [easy-tweet tweet="Some cloud initiatives are stalling or getting cancelled because the security risks are deemed to be too high"] A secure hosted private cloud from a provider like iland can provide an answer for companies with workloads that require isolated infrastructure. A hosted private cloud can deliver all the integrated security, management, support and availability levels you would get in a traditional data centre environment, but with all the convenience and benefits of cloud. Instead of buying, installing, integrating and maintaining separate security solutions, secure hosted private cloud solutions like iland’s also offer a purpose built management console that integrates the management and reporting of security and compliance settings, smoothing the path to completing audits. Embedded security features in a hosted cloud platform include role-based access control, two-factor authentication, VM encryption, vulnerability scanning, anti-virus/anti-malware, integrity monitoring, intrusion detection and detailed firewall and log events. Additionally, all private cloud resources can be managed from a single place, with access to performance metrics, granular billing data, VM management capabilities and DR management. Hosted private cloud customers also benefit from flexible pricing models that deliver predictable and controllable operational expenses in reservation or burst options while avoiding the capital expenses of on-premise private clouds. [easy-tweet tweet="Secure hosted private clouds offer stretched IT departments the best of both worlds" hashtags="security, cloud"] As the security challenge continues for cloud workloads, secure hosted private clouds offer stretched IT departments the best of both worlds. Removing the burden of hardware management, shifting capital expenses to operating costs and benefitting from the latest innovations. This model seamlessly augments on-premise data centres and delivers a robust, enterprise-class secure private cloud with all the features of a public cloud – but without the management overhead. ### Is your Wi-Fi ready for the Internet of Things? The internet of things (IoT) has been dubbed ‘Industrie 4.0’ in Germany for its potential to spark a new industrial revolution. Eighty percent of companies see IoT as the most strategic technology initiative in their organisation for a decade; there could be 38 billion IoT devices in the world by 2020, and by 2025 the market for IoT technology could exceed $6 trillion. 
As companies embrace this disruptive trend, how is it changing business? Critically, with Wi-Fi networks at the heartbeat of any IoT strategy, how can you ensure it’s going to be ready to help you make the most of IoT’s potential to find new ways of working more efficiently, productively and cost-effectively? [easy-tweet tweet="Periodically, technology arrives that changes everything. IoT is one such technology." hashtags="#IoT"] From manufacturing to healthcare: new ways of working Periodically, technology arrives that changes everything. IoT is one such technology. It will allow us to rethink the way we do things with particularly far-reaching benefits in: Manufacturing: It has long been seen as a sector that will be changed most by IoT and, to date, the majority of IoT sensors, 40.2 percent, are found here. By collecting real-time data from across supply chains, manufacturing lines and machines, manufacturers are shaving seconds off processes, driving down defects, working more safely, cutting costs and improving productivity. For instance, using data from machines and robots on the plant floor, and using predictive analytics, it’s possible to schedule pre-emptive maintenance to avoid line closures. Healthcare: IoT has a huge range of applications in healthcare. Take the example of a critical patient. In many cases they need to be triaged, tested and treated within set time-frames. By placing Bluetooth-enabled wristbands on them, they can then be tracked through a hospital using Bluetooth beacons. If their progress slows, alarms can be automatically raised with senior clinicians who can intervene. Transport and Logistics (T&L): Of the many ways IoT will be used in T&L, one of the most interesting is using smart cameras in depots to assess the dimensions of items. This added intelligence can be used by Big Data systems to provide loading guidance that optimises vehicle space to reduce unnecessary mileage, costs and CO2 emissions. Retail: Bluetooth beacons are allowing retailers to track customers around aisles. The data can be used to identify successful store layouts (by using heat mapping) and to connect with customers’ smartphones to interact in new ways such as welcoming them in-store and sending promos and discounts. [easy-tweet tweet="IoT will create a huge volume of data from sensors that will connect to the internet using your Wi-Fi network." hashtags="IoT, tech"] IoT will create a huge volume of data from sensors that will connect to the internet using your Wi-Fi network. This has implications. If your network isn’t business critical already, it’s likely to be; if it’s a few years old it may struggle to cope with the surge in demand; and, given the sensitivity of data it will handle, you may need to rethink security. What we’re talking about here is the need to provide ‘carrier-grade’ network performance. But the likelihood is your IT team are not wireless experts – so it’s important to think about the best way to deploy, integrate and manage your network. With all this in mind, and with the help of Gartner’s IoT stack, here are five things to think about when getting your Wi-Fi optimised IoT ready. 1. Versatile connectivity: A whole range of devices will connect to your network – not only tablets, PCs, phones and laptops – but also smart sensors such as temperature controls. 
While most new devices will be based on the latest standard (802.11ac Wave 2), which offers the extra bandwidth and scalability that is so essential in the world of IoT, if you have legacy devices connecting to your Wi-Fi you need to make sure the network can support them. 2. Communications and networking: Look for Access Points that not only include Wi-Fi radios but Bluetooth and Radio Frequency (RF) sensors. Bluetooth will make it easier (and lower the cost of) deploying Bluetooth-enabled devices – e.g. in retail, Bluetooth-driven Electronic Shelf Labels or beacons. RF sensors are important as they will scan the network neighbourhood to look for any interference and proactively jump to new channels to protect quality of service. This autonomy will allow your IT team to take a hands-off approach to day-to-day management. 3. Use the power of the cloud: If you have a large number of users, many sites, demand for guest access and the requirement to scale – to pretty much any size – cloud-based Wi-Fi could be for you. With cloud systems, the power of switches and controllers and the intelligence of the network are put into the APs, which will self-register over the internet with the host cloud. If you need more capacity, or more sites, you just add more APs (there’s no need for controllers). This simplifies the way networks are built and changes the economics of Wi-Fi from a hardware play to a more flexible and versatile ‘Op-Ex’ model. 4. Apps and services: Look for networks that make it easier to on-board guests (while managing authentication) and include technology to monitor network activity. That monitoring should include both Wi-Fi and Bluetooth clients to find suspicious behaviour. This is especially crucial in retail, where we have seen incidents of Bluetooth being used to steal payment card details over an unprotected Bluetooth system. 5. Remote control and management: Most vendors can give you loads of network stats, but to make that meaningful you need analytics that give you intelligence, such as how your apps are performing, who’s connecting to the network and from where, what the quality of those connections is – and, critically, how you can improve in these areas. Make sure, too, that your network allows your IT team to see every AP and remotely manage and troubleshoot the network. Radical Simplicity Your IT team has a lot on. The last thing they need is to take on the management of a complex wireless network that’s now, more than ever, business critical. Technology is helping. Just as Wi-Fi networks have become super-fast – in excess of one gigabit download speed – they’ve also become easier to install, manage and maintain. There’s nothing in IT that does not benefit from radical simplicity and, with Wi-Fi, we’re getting to a place where carrier-grade network performance can, with minimal intervention, be set up and maintained by the networks themselves. ### Mainframe: The Ultimate Cloud Platform Cloud computing has become the default option for IT service provision, be that off premise, on premise or via a hybrid model. Against this backdrop the words ‘mainframe’, ‘cloud’ and ‘agile’ may not often appear in a sentence together, due to the popular perception that the Mainframe is not as agile as other architectures found in cloud data centres. However, IBM’s zCloud offering can, and in fact does, provide a cloud computing infrastructure with the agility, scale, security, availability and performance to respond to today’s and tomorrow’s changing business needs. 
[easy-tweet tweet="Scalability, availability, virtualization, and usage-based metering are all part of Mainframe" hashtags="IBM, cloud"] One of the main challenges for CIOs with a large mainframe environment is that as their business grows, so too do the costs to maintain the environment. Take the case of financial services industry – 92 of the world’s ‘Top 100’ banks use mainframe technology and increasingly mobile queries the world over are delivered by – you guessed it, a Mainframe. In a situation where mobile banking volumes are exploding, IBM’s zCloud service can help financial services clients get the additional compute power they require with the added flexibility and scalability needed to meet their demands, without additional capital expenditure. Coupled with scheduled and unscheduled compute demand, this flexibility provides a vital function to supporting the daily operations of many a Mainframe client. IBM’s zCloud offering has helped customers the world over reduce ‘Total Cost Of Ownership’ (TCO) by up to 30 percent through reducing their on premise Mainframe footprint and moving to a ‘pay-as- you-use’ model base. If new product workloads demand more compute power, then so be it quick, easy, and cost effective. [easy-tweet tweet="Cloud computing has become the default option for IT service provision via a hybrid model." hashtags="mainframe, cloud, IBM"] Another recent example is a UK financial services company who recently launched a new product that enabled insurance companies to access its data from mobile devices to validate individuals and provide them with online insurance quotes. The IBM zCloud solution enabled the firm to compete in this low-cost- per-click market by using the flexibility and speed of the low-priced cloud platform service, thereby generating a whole new revenue stream. Finally, one of the reasons for Mainframe not being perceived as agile and cloud-like is perhaps due to the fact that Mainframes are often used in "low profile" yet mission critical “back-office" applications. IBM is focused on this area, through delivering a range of services to compliment the zCloud offering; they are designed to focus in on delivering greater agility for application development. A recent example of a client who leveraged these services was a European Bank which re-launched its newly designed customer-facing digital banking solution where the IBM zCloud offered the backend system. The inherent flexibility and extra capacity capability in these solutions helped the UK based bank to be more agile and radically improve turnaround times to set up custom environments – the net outcome was gaining the confidence of the developers to try out new solutions and deliver them rapidly to the market. Scalability, availability, virtualization, and usage-based metering have long been part of the Mainframe environment, which the distributed world has zealously tried to recreate, and often claim credit for. IBM’s zCloud portfolio of contributions and services takes these virtues out of the back office and thrusts them out into the bright lights of the future – creating perhaps, the ultimate Cloud Platform. To find out more about IBM zCloud take a look here. ### Gaming gets serious to solve workplace challenges You may think of gaming and the workplace as incompatible concepts. The former is usually thought as being fun, entertaining and trivial, while the latter is often typecast as important, structured and serious. 
However, more and more businesses are starting to realise that the world of gaming has much to offer the world of work. Sometimes dubbed “gamification,” business tools are starting to adopt a number of game-based features in order to improve employee engagement and productivity. [easy-tweet tweet="Gaming and the workplace: One is fun yet trivial, the other is typecast as important." hashtags="cloud, gaming, IBM"] Expanding on this trend, IBM has created its own project looking at the business applications of gaming. IBM Serious Games, built on the SmartPlay Framework, uses games, not as a leisure activity, but as a means of tackling important business challenges. These Serious Games allow participants to sort and analyse data, overcome real-life issues and test potential solutions. The game play element helps employees to remain focused and establish a clear goal-orientated method of work. The exact makeup of a Serious Game will depend on the industry, its target audience and the decisions taken by the development team, but there are usually a number of common features. Video simulation often plays a key role and participants learn by doing, rather than simply watching or listening. Already, Serious Games have been used in a number of industries, from healthcare to the military, as well as in schools and universities. Below we’ve highlighted some key examples of how IBM is using gaming to do more than simply entertain. Project Ares [embed]https://www.youtube.com/watch?v=JGgEjP7F2gk&cm_mc_uid=05689010754514665145846&cm_mc_sid_50200000=1468941280[/embed] Cyber-attacks are rarely out of the headlines these days, and it is increasingly important for individuals to know how to defend themselves from malicious actors. Project Ares has been developed to guide users through the ever-changing research and strategies relating to cyber defence. Using a massively multiplayer artificial intelligence based platform, Project Ares uses a combination of real-time threat intelligence, a nextgen AI engine and natural language dialogue provided by IBM Watson to provide cyber warfare training. Users can select missions, each with their own distinct backstory and bespoke skills, and choose whether to attack or defend IT infrastructure. Mission objectives are broken down into smaller objectives and the in-game avatar named Athena provides helpful guidance throughout. [easy-tweet tweet="The game helps trainees to better understand the modern-day cyber-attack landscape" hashtags="cyber-attack, cloud, tech"] The game not only helps trainees to better understand the modern-day cyber-attack landscape, it also allows security trainers to oversee progress and keep up-to-date with each user’s individual strengths and weaknesses, meaning that Project Ares can serve as both a training and recruitment tool. Medical Minecraft As well as providing practical applications in the workplace, Serious Games can also be used to educate participants of all age groups. Using the hugely popular sandbox game, Minecraft, combined with the power of IBM’s Watson and a bespoke plugin, developers can incorporate cognitive computing into the game, letting users learn about a number of real-life concepts. As an example, IBM made Watson Dialog for healthcare accessible through a Minecraft plugin, which let players ask Watson questions about diseases and their best possible treatments. Similarly, a Disease plugin was also introduced to show how infectious diseases spread in the game, mirroring a real-life epidemic. 
By using the vast stores of knowledge at Watson’s disposal, Minecraft provides an entertaining way for participants to engage with any number of educational topics. All work and no play As businesses continue to search for ways to boost productivity, many are turning to Serious Games as a way to get their employees more engaged while at work. By incorporating game-based elements, organisations can both educate and entertain in equal measure. ### Reskilling IT for the Cloud Era The increased adoption of cloud services is redefining the role of the IT department. A perennial need to do more with less, all while meeting increased demands from the business, means IT needs to become more collaborative and consultative. [easy-tweet tweet="IT needs to become collaborative and more consultative." hashtags="cloud, tech"] The IT team needs to act as a broker of cloud-enabled services across the organisation, to ensure new capabilities fit with an integrated cloud strategy in which all data and IT resources are joined up and made the best possible use of. They need to become a partner in innovation, working with the business to understand and deliver upon its goals and objectives. The focus of the IT department is shifting away from managing the data centre stack and maintaining hardware. Innovations such as the Oracle Cloud Machine – which brings remotely-managed cloud services on premise, supporting the creation of a cloud-enabled strategy for all critical enterprise applications – mean that the traditional support function of IT is becoming less relevant. But that isn’t to say IT’s role is becoming less relevant. Far from it. Although cloud enables the rapid set-up and deployment of services, implementing cloud-based technologies isn’t as simple as just switching them on. The process needs to support existing business processes and legacy technology, while enabling new ways of working. Indeed, the classical virtues of understanding the strength of architectures will remain, as they are crucial in embracing the enterprise cloud model without introducing new complexities or data silos. As cloud reduces the complexity, there will also be a growing imperative for IT professionals who can design, develop, migrate and integrate new applications, as well as extend current ones, at speed. In short, the traditional maintenance mindset of IT will give way to a more strategic, software-oriented vision. [easy-tweet tweet="Traditional maintenance mindset of IT will give way to a more strategic, software-oriented vision." hashtags="cloud, IT"] At an operational level, IT is already taking on an expanded role with the rise of DevOps, in which the lines are blurring between development and operations, and IT teams are becoming more application- and service-oriented. Another form of role expansion will come through the lines of business, which increasingly will look to IT to help process vast amounts of data across finance, sales and marketing. IT departments could increasingly be structured around lines of business, with data specialists working on projects for HR, finance or marketing, using cloud-based analytics tools. This all creates a critical time for CIOs. Identifying and developing the right skills for their departments will become a necessity, and they need to plan early. They will need to think of resourcing, training and talent acquisition across areas such as development and integration, but also consider how IT roles are evolving as their business becomes increasingly digital. 
Looking to a future where IT staff will act more like business application developers focused on user experience than maintenance personnel, IT leaders need to start articulating to their teams what this future means and how cloud can enable team members to expand their roles and develop their careers. ### The true cloud generation “True Cloud”… here to stay? Consumers are already reaping the rewards of ubiquitous access to their data anytime, anywhere and from any device. Many are even using cloud-based services such as Dropbox for work. Employees are embracing the cloud and businesses are having to adapt accordingly. [easy-tweet tweet="Consumers are already reaping the rewards of ubiquitous access to their data anytime" hashtags="cloud, IoT"] With market research firm Gartner predicting the global market for cloud-computing services to reach $240 billion by next year, cloud services appear to be here to stay; but what about the next generation… the true cloud generation? What is “True Cloud”? Not all hosted offerings marketed as “cloud” are true cloud solutions. Many suppliers are “cloud washing” or “dirty hosting”. True cloud, often referred to as Software-as-a-Service (SaaS), offers massive economies of scale and long-term strategic benefits not achievable by on premise or hosted solutions. True Cloud solutions offer the following:
- Scalable usage: SaaS solutions offer high scalability, which gives customers the option to access more, or fewer, services or features on demand.
- Automatic updates: Rather than purchasing new software, customers can rely on a SaaS provider to automatically perform updates and patch management. This further reduces the burden on in-house IT staff.
- Accessibility and persistence: Since SaaS applications are delivered over the Internet, users can access them from any Internet-enabled device and location.
Many Enterprise Resource Planning (ERP) solutions that are described as running in the “cloud” are in reality designed to run on premise. While these vendors may host these “cloud” ERP solutions, they typically are unable to fulfil the above criteria, meaning that businesses are not enjoying the benefits of true cloud. True Cloud-delivered ERP is what we call Enterprise SaaS. Meeting business needs Businesses are turning to the cloud because it offers a simpler, faster and more flexible solution. It allows businesses to scale up or down quickly, providing flexibility when responding to changing market demands. Most organisations are keen to focus on their core business rather than technology and the complexity of running or updating their software and infrastructure. Enterprise SaaS takes cloud to the next level, enabling businesses to reduce costs, adopt emerging technologies such as smart mobile devices and future-proof their organisations. The new market With increasing mobility, employees are expected to work across multiple locations using a number of devices, while accessing the same business data. With the right software this is possible without compromising on functionality. Businesses can deploy on premise mobile apps, but this approach is not without its challenges. Managing security and devices is complex, particularly when a device-installed app is used. Many vendors attempting to provide mobile solutions often confuse simplicity with reduced functionality. Enterprise solutions are generally rich in functionality, data and processes. 
They “dumb down” the mobile front ends, providing only limited functionality and process to make the user experience simple. By combining cloud and mobile, and using software that works across all devices, organisations can access their enterprise information from any device. With software that is wholly web-based, users simply sign in for a service they can use anywhere, any time and on any device. What does the future hold for “True Cloud”? As understanding of the technology and market matures, we will see increasing adoption of the cloud. Widespread adoption will be driven by consumer demand for cloud as employee personal use continues to increase. Just as cloud becomes universally accepted in consumer technology, we will see businesses embrace it within their organisations. [easy-tweet tweet="Widespread adoption will be driven by consumer demand for cloud" hashtags="cloud, IoT"] As this preference for true cloud increases, demand for on premise software will decrease and we will see many vendors facing a major dilemma. Many of the ERP software solutions of today will simply reach end of life and the industry will experience a rationalisation and shake-out not seen since the days of Y2K. ### IP Expo Europe announces 2016 event focusing on the future of the technology industry in the wake of Brexit IP EXPO Europe, Europe's number one enterprise IT event, has today launched its 2016 IT showcase, to take place on 5-6 October 2016 at the ExCeL in London. Following the success of last year’s event, IP EXPO Europe has announced a collection of influential speakers who will be holding keynotes over the two-day event focusing on the six themes of Cloud, Cyber Security, Networks and Infrastructure, Data Analytics, DevOps, and new for 2016, Open Source. In the wake of the Brexit vote, this year’s IP EXPO Europe will explore how this change will impact the current demand for tech skills, with a look to the demands of businesses in the future and the growth in technologies such as automation and artificial intelligence (AI). It will answer the many questions that businesses may have about the future of their companies through a series of seminars and speaker programmes, as well as address the fears that many IT professionals may have regarding the future of the workplace. With six sub-events under one roof, IP EXPO Europe 2016 is expected to attract a broad audience of enterprise IT professionals and enthusiasts to discuss how to evolve their businesses into a more technical future that embraces advancements like AI, and to network with other like-minded professionals in the industry. Bradley Maule-ffinch, Director of Strategy for IP EXPO Europe, commented: “After record interest last year – 14,600 attendees across the two days – we are looking forward to a real buzz of activity at IP EXPO Europe 2016, driven by the wealth of knowledge and future-gazing our range of high profile speakers will provide, in combination with the rise and adoption of emerging technologies in the enterprise. We are also extremely excited to introduce the Open Source Europe sub-event, which is sure to create excitement among the IT leaders and professionals.” This year, IP EXPO Europe will be collaborating with Hewlett Packard Enterprise (HPE) for the first time on a special series of seminars focused on the future of computing and exploring the importance of STEM. 
Throughout the two-day event, HPE will discuss evolutions in the technology industry and their state-of-the-art technology in a series of seminars focused on business transformation, alongside demonstrations at their stand. Building on the success of the last decade, this year’s IP EXPO Europe will encompass a number of high-profile keynote speakers and industry leading speakers, including:
- Gavin Jackson – MD at Amazon Web Services
- Mark Russinovich – CTO at Microsoft Azure
- Miles Ward – Global Head, Solutions at Google Cloud Platform
- Eugene Kaspersky – Founder and CEO at Kaspersky
- James Lyne – Global Head of Security Research at Sophos
For further information and to register for free for IP EXPO Europe 2016, please visit: www.ipexpoeurope.com. Get involved on Twitter using #IPEXPOEurope. About IP EXPO Europe IP EXPO Europe is Europe’s leading IT event, designed for those looking to find out how the latest IT innovations can drive business growth and competitiveness. Now in its 11th year, the event showcases brand new exclusive content and senior-level insights from across the industry, as well as unveiling the latest developments in IT. It covers everything you need to run a successful enterprise or organisation. IP EXPO Europe 2016 now features a brand new theme, Open Source, joining the existing Cloud, Cyber Security, Networks and Infrastructure, Data Analytics and DevOps – incorporating six events under one roof and making it the most comprehensive business-enhancing experience for those across IT, industry, finance and facilities roles. ### The Future Workforce Dilemma: Should Robots Be Held To A Higher Standard Than Employees? They never show up late and they can work 24/7. They don’t take lunch breaks, sick days or require paid leave. They don’t leave coffee cup rings on the desk, or complain about the air conditioning. So, if they’re less of an HR headache, should robots be held to a higher standard than their human counterparts? [easy-tweet tweet="If they’re less of an HR headache, should robots be held to a higher standard than their human counterparts?" hashtags="tech"] Defining Standards What do we mean by ‘higher standards’? Merriam-Webster defines a standard as ‘something set up and established by authority as a rule for the measure of quantity, weight, extent, value, or quality’. The established measure of quantity is usually determined by the average performance over a period of time, or a published set of metrics for a specific task, as determined by a respected body. For example, Company X may have determined that on average, over the last twelve-month period, a financial analyst can produce 65 accounting entries per month – defining the standard. So, if a robot is subsequently created to produce accounting entries each month for a year, and averages 100 accounting entries, do these results make the robot superior to the employee? Balancing Quality and Quantity A robot is programmed with the logic of the majority of use cases - in this example, accounting entries. Could the robot be programmed for 100 percent of the cases? Absolutely, but is the programming specific to something that happens once a quarter or year, for example, justifiable from a cost perspective? That is something for the organisation to decide. In our example case, the robot produced around 54 percent more entries than the human (100 versus 65). 
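For clarity, that percentage comes straight from the two figures already quoted: a 65-entry-per-month human standard against a 100-entry robot average. A quick back-of-the-envelope check (in Python, purely illustrative) confirms the uplift:

```python
# Quick check of the uplift implied by the figures quoted above (illustrative only).
human_entries_per_month = 65   # the Company X standard for a financial analyst
robot_entries_per_month = 100  # the robot's observed monthly average

uplift = (robot_entries_per_month - human_entries_per_month) / human_entries_per_month
print(f"Robot output exceeds the human standard by {uplift:.0%}")  # -> roughly 54%
```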
This raises three issues:
- From a purely results-driven perspective, the robot did a much better job
- The exceptions that the robot was not programmed for fell to the human to resolve
- The robot only had to access the entry once, whereas the human had to access it more often to amend any errors
When we consider the weight element of a standard, these issues bring to light a common misconception about robots – that they can only carry out simple, transactional tasks. In reality, robots thrive on complexity. If we take a specific, complex entry and program a robot to complete it side by side with a human, the robot wins every time. An average accounting entry may take twenty-five minutes for a human to complete. This example defines the extent, or level of effort, required to complete the entry. Robots don’t get distracted, and they don’t have to go back to the entry to check it for errors. How can we measure value or quality? Measuring value or quality is important because it is not always monetised when developing a business case for robots. Accounting entries made by humans may have been revisited a number of times before they passed inspection and have been subsequently approved and posted. Further down the line still, the approver may have overlooked some of the human errors, and any incorrect entry will have slipped through the net. [easy-tweet tweet="it is not always monetised when developing a business case for robots" hashtags="tech"] On the other hand, the robot’s entry was right first time, so it will pass inspection and be posted without query. Fast forward to a quarterly or yearly audit of entries, either by an internal compliance team or external auditors: as they review, they may catch the errors made by the reviewer for the human entries, but there are none to catch from the robot. In fact, when compliance or external auditors encounter robotic entries, just like systematic entries, they spend less time scrutinising them because they know the human error is absent. This actually equates to not only a minimised risk of financial error, but more importantly, reduced audit costs. So, should robots be held to a higher standard because of the higher value they deliver with their output? The caveat is that humans will always be required within an operation, because the robot must be programmed for changes, and humans need to handle the one-off transactions the robot has not been equipped to handle. It is not so much a question of competition between humans and robots, as it is collaboration. ### The Rise of Azure, and why Microsoft is betting on CSPs I’ve just about recovered from my Microsoft WPC-induced jet lag, following the conference, which took place in Toronto last week. It was a great event, and as a representative of a Microsoft Cloud Solutions Partner business it was incredibly insightful as a guide to the changes in Microsoft’s strategy. 
It’s clear to me that Azure is key to Microsoft’s future. Looking back 4-5 years, Windows would almost certainly have had the biggest keynote, followed by Office and then the Server and Tools division. This year Cloud and Enterprise which owns: Azure, SQL, Windows Server, Dev Tools, EMS and Power BI received a keynote twice the length of Office and Windows – a subtle but clear sign of the changing of the guard. This has also been reflected in the targets for FY17 within Microsoft, which reveal all the focus is on growing Azure aggressively. To support partners Microsoft is doubling its investment in cloud platform internal-use rights, as shared by: Gavriella Schuster, CVP of Microsoft’s Worldwide Partner Group. That move will support partners’ work with Microsoft's cloud technology and pursue activities such as building demos, she said in her keynote. Microsoft is now enlisting partners in their mission of driving digital transformation for customers. Digital is now front and centre for CEOs, but the partner ecosystem must be able to align digital transformation with customers' business outcomes, too. Jeff Immelt from GE provided fascinating insights as to how GE has embraced digital to transform their business. [easy-tweet tweet="Microsoft is now enlisting partners in their mission of driving digital transformation for customers." hashtags="tech, cloud"] Windows is still an important business for Microsoft though, with more than 350 million Windows 10 users already. The new Windows 10 Enterprise E3 subscription service provides partners with a significant opportunity to develop new managed service offerings around delivering enhanced security solutions for their customers. SQL Server 2016 growth is also a priority, as Microsoft aims to help customers turn the idea that “data is the new electricity” into reality, using real-time business insights and analytics to help drive digital transformation. That being said, Microsoft doesn't want partners to neglect hardware. According to Microsoft, “customer experiences come to life through hardware solutions," and this is another ecosystem agenda item for 2017, particularly with the announcement of “Surface as a Service”. Schuster claims that Microsoft has seen 17,000 partners transact since the launch of CSP last year. In May, CSP transactions overtook the number of Microsoft Online Services Adviser program and Microsoft Open programs transactions for the first time. WPC reinforced many things this year, but the key message was that digital transformation is vital, it isn’t an option, it’s a necessity not only to thrive but to survive. Businesses need to embrace the digital world to build better products, keep customers engaged and most importantly, stay relevant in a rapidly advancing marketplace. [easy-tweet tweet="Businesses need to embrace the digital world to build better products" hashtags="tech, cloud"] We are now entering the fourth generation of the industrial revolution; AI and machine learning will significantly change productivity and jobs in next 30-50 years. Microsoft’s Brad Smith summarised this aptly - 40 years ago, every elevator had a job inside, a person pressing the buttons for you. Today, this doesn’t happen, and why should it? In 40 years’ time we will think it was equally quaint that there used to be a job in every taxi when there are so many driverless cars and drones roaming our streets – jobs are changing, and businesses need to make sure they don’t fall behind. 
### Managing Authentication in the Cloud Era Security concerns around cloud adoption are on the rise. According to the 2016 Cloud Security Spotlight Report from Crowd Research Partners, 53 percent of organisations see unauthorised access, through the misuse of employee credentials and improper access controls, as the single biggest threat to cloud security. Today, perceived security risks are a significant factor holding back faster adoption of cloud computing. However, the many benefits that cloud services deliver, from more flexible working practices to improved efficiency, are driving organisations and security teams to find more secure ways of working. As such, implementing the right security can have a profound impact on business transformation. Trusted access to the cloud provider [easy-tweet tweet="Outsourcing services to a #cloud provider means more centralised controls" hashtags="tech, IT"] While the benefits of using the cloud are numerous, outsourcing services to a cloud provider means more centralised controls. The cloud hosting provider’s services also often include an administrative/management dashboard with the controls to manage an organisation’s cloud services. This means the username and password for the cloud’s dashboard or accounts would be the ‘master key’ to the entire IT infrastructure. Protecting access to these cloud applications with only a username and password is insufficient for today’s user targeted attacks. If those credentials are stolen and exploited, it can be very difficult to regain control. Cloud security is certainly many-faceted, but guarding the entry point should be an integral first step to protecting data, networks, resources and other company assets located in the cloud. With that in mind, keeping cloud logins secure is vital for businesses. Streamlining login security is also essential in making security easy to implement and manage, which is why using a two-factor solution that works for on-premise and enterprise-level cloud apps is ideal. Adding an extra layer of security means an attacker can’t log into an administrative account by simply exploiting stolen credentials – they would need the associated physical device in order to successfully authenticate and gain access to the cloud management console. Identifying security risks Hackers will typically choose the easiest target, and ‘path of least resistance’, which is why they are increasingly targeting end users directly, with an estimated 95 percent of breaches now involving stolen user credentials. The continued rise in phishing scams and use of social engineering techniques to steal credentials means that even the most complicated and ingenious of passwords is no longer enough. [easy-tweet tweet="Devices running outdated operating systems and browsers pose a significant security threat." hashtags="#firmware"] The users themselves are not the only risk; devices running outdated operating systems and browsers also pose a significant security threat. Research from Duo Security’s labs team found a concerning lack of device security in the enterprise, with 25 percent of Windows devices running now-unsupported versions of Internet Explorer, leaving those un-patched systems open to more than 700 vulnerabilities. Mitigating these risks and ensuring only trusted access to a company’s cloud services is achievable by adhering to the following golden rules on access control and device visibility: Passwords are not enough First and foremost – ensuring users are who they say they are. 
Using only passwords as a form of authentication or access can leave organisations vulnerable to stolen credentials, especially as users often reuse passwords for multiple accounts, both work and personal. Employing two-factor authentication across all logins provides an essential extra layer of security that doesn’t rely on user behaviour. Solutions such as cloud-based two-factor are quick to deploy, easy to use and easy to manage for administrators and IT teams. Securing employee devices Attackers will exploit any vulnerability, and out-of-date systems are a potential treasure trove. With that in mind, organisations need to focus on the health of devices, ensuring they are running the latest operating systems and removing any blind spots, in order to mitigate any risks these devices may pose. Providing administrators with data on device ownership and health allows them to make risk-based access control decisions. As such, out-of-date devices should not be permitted to connect to the cloud and the network, or access any highly sensitive data. Embracing the Cloud and BYOD The era of widespread cloud adoption and BYOD (Bring Your Own Device) has encouraged organisations to set specific usage policies on employee-owned PCs, smartphones and tablets, addressing the rules that govern how all workers access the corporate network. With more sensitive company data stored in the cloud, organisations need stronger, more secure, trusted access to enable employees to work freely and safely from wherever they are. ### Discussing New Gen Apps at Red Hat Summit 2016 When Compare the Cloud was invited to speak at the Red Hat Summit in San Francisco, of course we graciously accepted. The topic? Next-generation applications on Red Hat Enterprise Linux and Enterprise Platforms. [easy-tweet tweet="Although the topic was vast the panel honed in on key discussion points that I will elaborate on." hashtags="IBM, Apps"] The format was a 45-minute panel discussion with Al Gillen (IDC), Jim Wasko (IBM) and myself (the good-looking one, 2nd from the left), chaired by Mary Hall (IBM). A lively debate followed, with discussions based around the main theme of enterprise application delivery through Red Hat Linux and IBM LinuxONE infrastructure. Although the topic was vast, the panel honed in on a few key discussion points that I will elaborate on. Developing the Next Killer App How can you achieve true scalability with that “killer app” without degradation of performance and security? With pretty much the majority of infrastructure running Linux (various flavours) today, you need the core backbone (technical infrastructure) to be scalable, with the ability to run at full capacity without faltering services or outages when peak usage bursts are apparent. However, this flexibility is great for performance, and for the wallet. Rigid security is also key – enter IBM’s LinuxONE architecture! Built upon the same principles as their Mainframe technology, the LinuxONE environments are purpose-built Linux powerhouses in a box. One such environment can cater to over 7,000 Linux virtual machines with the tightest security available, and can run at 100 percent utilisation without any degradation of service. One would think that such a product range is too good to be true, but it is indeed veracious, and the range offers more or less what you require, depending on your needs. 
Moving on from the obvious good fit for this topic, the audience was quizzed on their thoughts as to how they see enterprise technology fitting with the next unicorn application. There were some interesting responses, and the main one centred on how to develop said unicorn so it can scale without costing an arm and a leg. Again, this played right into the hands of IBM’s product offerings, as they now offer free public access to a development environment to test and configure Linux-based VMs. However, in the words of a shopping channel commercial – that’s not all – you also have access to ready-made VM templates that give you the option of different flavours of Linux (Red Hat being the focus during the session). Additionally, IBM has a suite of software products, in the form of IBM Bluemix, that can assist with the creation and deployment of your unicorn code – and it can also be trialled for free! Open Source Development IBM is certainly changing, with a big focus on assisting Open Source development of applications as well as their delivery, and I still find it hard to believe that it’s free! Never has there been a better time to pick up that code book and create the next billion-dollar application cheaply and securely, with the flexibility of a hybrid cloud approach alongside traditional technology. The LinuxONE environment could just become the new normal, with all of these benefits rolled into one distinct package that provides the most secure environment available – needed not just to repel external attacks but, in this day and age, to protect the intellectual property that could make you a billionaire. [easy-tweet tweet="#IBM is certainly changing with a big focus on assisting Open Source development of applications" hashtags="tech"] In summary, across the Red Hat Summit (not just our discussions) two major trends were prevalent throughout the few days: environment monitoring and performance, and security. These are major trends in the industry right now and a myriad of vendors are riding the bandwagon to success. We are in the age of the constant evolution of technology, spurred on by the constant evolution of entrepreneurship. Come on, let’s start coding and create something really special! To test drive IBM LinuxONE for 120 days, sign up for a test drive here. To learn more about the Top IT Trends for 2016, read IBM’s Enterprise Linux Insights blog. ### Financial services, like everyone else, are heading into the cloud Ask any CIO and COO if they’re concerned about using the cloud and they’ll often cite concerns around data security, compliance and transparency as significant barriers to adopting cloud computing services. [easy-tweet tweet="#Security, compliance and transparency are often cited as concerns about #cloud"] Yet individual business areas are often keen to take advantage of the benefits that these technologies offer, leading to the rapid growth of shadow IT systems, where internal stakeholders procure their own cloud-based services without the knowledge or control of IT management - it can take as little as 15 minutes and a credit card to get set up. Unsurprisingly, millennials are often at the forefront of this trend – as ‘digital natives’, they are familiar with cloud-based technology, particularly where employers have adopted a ‘bring your own device’ strategy that encourages greater use of personal mobiles, laptops and tablets. 
In financial services, traditionally a sector with one of the most complex and highly regulated IT environments, pressure for the rules to recognise the benefits and flexibility offered by cloud solutions is growing. This month the UK Financial Conduct Authority (FCA) published long-awaited guidance on using the cloud, which is a first step – if not a comprehensive guide – towards its expectations of regulated firms in financial services. Data compiled by the International Data Corporation shows global spending on cloud services is set to rise from $70 billion in 2015 to more than $141 billion in 2019, with $6.8 billion spent by the banking sector alone in 2015. [easy-tweet tweet="Global spending on #cloud services is set to rise to more than $141 billion"] Spending on software as a service – which enables clients to access applications through their own devices without managing or controlling the underlying infrastructure – will continue to dominate, making up nearly two-thirds of predicted total spend. From a business perspective, managing these disparate cloud environments creates unnecessary complexity and cost, often amplifying operational and regulatory risks. In financial services, 52 percent of firms we surveyed last year raised compliance concerns as a high-risk factor. One of the most frequent concerns we hear is around maintaining data security and protection. Users of cloud services retain oversight of, and accountability for, their data, and good governance is key to developing scalable programmes that can incorporate continuous improvements. As best practice, cloud service users should regularly include assessments of market and technical requirements and incorporate regular security updates into their contracts with service providers. So what should IT executives do? The first and most important step is to recognise – and celebrate – the fact that cloud technology is here to stay. Ease of access to cloud technology, and the breadth of solutions it offers, from payroll services to smart, searchable audits, means that it is more important than ever for IT to be seen as a partner that enables rather than blocks access to new technology. Effectively balancing the benefits and risks of cloud technology should feel like a joint venture between IT departments and the wider business, not command and control from the centre. Process owners should ‘own’ the supporting systems, while IT executives should steer and advise on best practice. One of the biggest challenges in moving towards this model is embedding a good understanding of the risks and challenges across the Board and senior management team. While Board members are not expected to understand how the technology works, they should have a clear view of the associated risks and mitigating actions. However, they can often be the furthest removed from how these solutions are used day-to-day across the business, which makes it difficult to take a considered view of cloud technology. Rapid growth in the use of cloud technology should be matched by greater collaboration within businesses, between users and providers and between firms and regulators. While we welcome the FCA’s guidance, we don’t believe it has gone far enough, or sufficiently recognises the nuances in public and private cloud environments. To be a truly useful guide for firms – and further its regulatory requirements to promote innovation and competition – the regulator needs to respond to the way the cloud is used today too. 
### Containers, APIs and Cloud Machine Learning – three disruptive trends in cloud The initial wave of cloud migration has been about relocating workloads to cloud infrastructure. As business confidence in public cloud platforms increases, the pace of adoption has been accelerating, with transformation projects happening across thousands of applications. The automated migration of workloads, primarily virtual machines, has been the dominant flavour of this wave, but very little of this transformation has affected the application or data layers involved in these migrations. [easy-tweet tweet="The initial wave of #cloud migration has been about relocating workloads to cloud infrastructure."] This first wave has proved successful from the public cloud vendor perspective, with some clear winners gaining a significant share of the market. However, the next wave of transformation is likely to be quite different in nature and will involve more application- and data-level changes. It is here that containers, APIs and cloud machine learning are likely to be disruptive, and they promise to change the competitive landscape for public cloud vendors. The nature of competition is likely to shift from a pricing-centred IaaS approach to a feature-centred, value-driven approach around PaaS offerings. Below, we explore the reasons behind this shift. A container-based approach By enabling easy cross-cloud migration, containers are eliminating application-level dependencies on public cloud features. Those who moved to cloud platforms early are now re-architecting their applications to leverage containers. This container-based microservices approach also means choosing the best fit for a particular microservice rather than a one-size-fits-all solution. Thanks to containers, it is now possible to easily port applications across environments, including two competing cloud environments, disrupting the competitive landscape for public cloud providers. [easy-tweet tweet="By enabling easy cross-cloud migration, containers are eliminating application level dependencies on public #cloud features."] Application and data layer services Secondly, with several rich application and data layer services being hosted on public cloud, the focus is increasingly shifting towards NoOps and serverless computing. Enterprises are realising the value of powerful PaaS offerings (for example, Google BigQuery or AWS Lambda) for running special use cases, saving both time and effort. Similarly, several cloud-based APIs, such as Google Maps, Speech, Translate and Text, offer capabilities that support business use cases and are otherwise difficult for a traditional enterprise to replicate. We are witnessing scenarios where a large part of a platform with commodity features is hosted on one public cloud and another part with premium features sits on a separate public cloud, specifically to leverage its API ecosystem. Machine learning Finally, while there is a lot of general awareness about machine learning, enterprises are realising that the technology is not easy to implement. Specialised algorithms take a long time to develop and perfect, and this is the third area where clouds are starting to differentiate beyond the basic IaaS play. Public cloud providers, either through in-house R&D or the acquisition of boutique machine-learning-focused firms, are exposing larger suites of trained machine learning (and in some cases, deep learning) models through APIs for easy integration into cloud applications; the sketch below illustrates what this kind of API-level integration can look like from an application's point of view.
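The following is a rough, hypothetical sketch of consuming such a hosted model. The endpoint URL, parameter names and response shape are placeholders invented for illustration, not any specific provider's real API.

```python
# Minimal sketch: calling a hosted machine-learning model over a REST API.
# The endpoint, payload fields and response shape are illustrative placeholders.
import base64
import requests

API_URL = "https://ml.example-cloud.com/v1/images:annotate"  # hypothetical endpoint
API_KEY = "YOUR_API_KEY"

def label_image(path: str) -> list[str]:
    """Send a local image to a hosted vision model and return predicted labels."""
    with open(path, "rb") as f:
        payload = {
            "image": base64.b64encode(f.read()).decode("ascii"),
            "features": ["LABEL_DETECTION"],
        }
    resp = requests.post(API_URL, json=payload, params={"key": API_KEY}, timeout=30)
    resp.raise_for_status()
    # Assumed response shape: {"labels": [{"description": "forklift", "score": 0.97}, ...]}
    return [label["description"] for label in resp.json().get("labels", [])]

if __name__ == "__main__":
    print(label_image("warehouse_photo.jpg"))
```

The point is less the specific call than the pattern: the heavy lifting (training, serving, the GPU hardware) stays with the provider, while the application integrates over a simple HTTP interface.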
Cloud platforms are also able to provide the required hardware (including GPUs where necessary) to deliver industry-grade responsiveness from the trained machine learning algorithms. While initial launches have been around broad use cases such as Google Speech and Vision, it is likely that future launches will see algorithms for specialised industry use cases. The cloud market is currently seeing strong price competition for commodity services in line with Moore’s Law. However, it is logical to assume that cloud providers will want to start offering differentiated services to introduce more value-centric pricing, and then to retain workloads on the basis of these differentiated services. Containers, APIs and cloud machine learning are set to transform the public cloud landscape significantly in the coming years. ### Touch-Button Home Security to Make You Feel Like a Rock Star Everyone knows that driveway gates can greatly improve home security. But beyond choosing between wooden and iron gates, did you know that you also have a choice of access control? There’s nothing quite like rolling up to your drive and, at the touch of a button, watching your gates swing open to welcome you. It can make you feel like a rock star in your own home. [easy-tweet tweet="There’s nothing like watching your gates open to welcome you home at the touch of a button." hashtags="security, IoT"] Let’s take a look at a few of the high-tech options available. 1. Smartphone/Tablet With an estimated 76 percent of adults in the UK owning a smartphone, it’s easy to see why this one is top of our list. There are innovative apps that allow you to open and close your gates with just one swipe of your smartphone. No need to carry a transmitter! If you are anything like us and you normally have your phone on you at all times anyway, then this option is for you. 2. GSM (Global System for Mobile Communication) Intercom This allows you to use your mobile phone to open your gate from wherever you are in the world. When a visitor drives up to your gates, they press a button and your GSM intercom will ring your mobile or home phone and allow you to speak to them. You can then let them in with a push of a button. If you are away on holiday there are no roaming charges, as it doesn’t work on 3G, just a GSM signal. It’s completely wireless. This device also comes with a keypad, so you can let yourself in by simply entering your passcode. 3. Handheld Transmitter The cheapest option for controlling your electric gates is a handheld transmitter. Buy one for each member of your family and they can come and go as they please with the press of a button. By choosing a transmitter with a long range you can open your gate from the road before you reach it, meaning you no longer have to stop and wait until it opens. But don’t forget to buy a spare in case you misplace it! 4. Wireless Keypad The simplest choice is a wireless keypad positioned at the entrance to your property. You can choose a code and hand it out to those you want to have access to your property. If you fall out with a relative, you just need to change the code and redistribute it. Simple! This isn’t the best option if your gate is on the roadside, as you can’t set the gates to open before driving up to them or the keypad. [easy-tweet tweet="Find the best security gates for your home and feel like a rock star" hashtags="IoT, security"] There are many different options and variations of those that we have discussed in this post. 
For more information, find a local installer who can talk you through the best options for you. ### Best practices for IT security: The SMB top four If you’re running a small business, chances are there’s enough on your mind that IT security isn’t a top priority. It makes sense — with 50 per cent of SMBs failing in the first five years, it takes a combination of determination, effort and good luck to make a small business work. The problem? Ignoring IT security could land you in the wrong 50 per cent if consumer or credit data is stolen, information is destroyed or a post-incident investigation reveals you didn’t do enough to protect this data. It’s not all doom and gloom, however: here’s a look at the top four IT security practices for SMBs. [easy-tweet tweet="If you’re running a #SMB, chances are there’s enough on your mind that IT #security isn’t a priority"] Recognise your risk SMBs are now attractive hacker targets. Why? Because cybercriminals know that small businesses are often sitting on critical consumer data such as names, addresses, Social Security numbers and credit card information. They’re also aware that SMB IT security — as a general rule — isn’t on par with enterprise defences, meaning attackers have a better chance of getting in, getting what they want, and getting out before they’re detected. Want proof? New research from the independent research firm Ponemon Institute found that 50 per cent of SMBs experienced data breaches over the past 12 months. Your best practice here? Design IT security with high risk in mind: you’re not a second choice or “also ran” for hackers — in many cases, you’re a top target with valuable resources. Plan for a serious, coordinated attack. Defend your data The next best practice to secure SMB IT? Make it standard practice to fully defend your data. Start by making sure that every piece of critical information on your network is encrypted. This starts with data in transit — sent from and received by your business — but it’s also important to protect data at rest. If hackers get their hands on anything, it should read like gibberish, not shine like gold. As OpenDNS points out, SMBs should also take steps to regularly back up their data. This might take the form of off-site servers, cloud storage or even tape drives; just make sure you have more than one copy. Prioritise passwords Where possible, hackers prefer the easy route to more complex and high-risk methods — why get caught trying to subvert antivirus programs or sophisticated defences when they can simply log in through user accounts? If you don’t think it happens, think again: as noted by recent research, top passwords from 2015 included the ever-popular “123456,” “password,” “starwars” and the oh-so-secure “letmein.” How do you solve this problem? Start with a hard-and-fast timeframe for password changes; six months is a good rule of thumb. Make sure everyone — from owners and managers down to front-line employees — follows the same rules. For example, don’t let staff re-use the same password, opt for a minimum character length (eight or more) and prevent the use of repeated characters. Since you’re probably not an IT pro, it’s also worth spending on reputable password management software to help manage user logins.
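To make those rules concrete, here is a minimal, illustrative policy check. The thresholds and the tiny blocklist are assumptions drawn from the guidance above rather than any formal standard, and in practice a proven password manager beats hand-rolled code.

```python
# Minimal sketch of the password rules described above: minimum length, a small
# blocklist of known-bad passwords, a no-reuse check and a repeated-character
# check. All thresholds are illustrative assumptions.
COMMON_PASSWORDS = {"123456", "password", "starwars", "letmein"}
MIN_LENGTH = 8
MAX_REPEATS = 2  # assumption: disallow the same character three or more times in a row

def password_issues(candidate: str, previous: set[str]) -> list[str]:
    """Return a list of policy violations; an empty list means the password passes."""
    issues = []
    if len(candidate) < MIN_LENGTH:
        issues.append(f"must be at least {MIN_LENGTH} characters")
    if candidate.lower() in COMMON_PASSWORDS:
        issues.append("is on the common-password blocklist")
    if candidate in previous:
        issues.append("has been used before")
    for i in range(len(candidate) - MAX_REPEATS):
        if len(set(candidate[i:i + MAX_REPEATS + 1])) == 1:
            issues.append("contains a run of repeated characters")
            break
    return issues

print(password_issues("letmein", previous=set()))            # fails: length + blocklist
print(password_issues("Tr1cky-Passphrase", {"OldPass1"}))    # passes -> []
```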
Think outside the organisation Bottom line? You can’t do everything yourself. In the same way you outsource manufacturing, accounting software and even marketing responsibilities, it’s now possible to tap a reputable third party to handle SMB IT security. Managed service providers not only have access to substantial cloud resources — keeping your servers free for critical, as-needed data — but also a wide variety of specialised tools and solutions designed to protect key assets. In addition, the right service partner can help draft a customised IT security policy that meets the specific needs of your business. Here, the key is research and reputation: look for a provider staffed by IT experts with substantial experience in the industry, and always opt for a partner that offers 24/7 service. [easy-tweet tweet="It is now possible to tap a reputable third party #cloud vendor to handle #SMB IT #security"] Running a small business is no easy task, but leaving IT security off the table is a surefire way to increase the chance of network compromise. Protect yourself by recognising risk, defending data, prioritising passwords and opting for outside help. ### Fidor Group acquired by Groupe BPCE Fidor Group, the German digital challenger bank and fintech pioneer, is today announcing its acquisition by Groupe BPCE, the second largest banking group in France. Groupe BPCE has signed an agreement with the key shareholders, founders and managers of Fidor Group relating to the acquisition of their equity interests in the company. Fidor will capitalise on the support of Groupe BPCE to enable a strong international expansion, continue the development of its proprietary digital banking technology, and strengthen its presence in Europe. Following the deal, Fidor will remain an independent business. Founder and CEO Matthias Kröner will continue as the Chief Executive of Fidor, keeping a shareholding in the bank and leading its business strategy, development and international expansion as before. Fidor Group will also adjust its internal structure, with the new ‘Fidor Holding Group’ acting as a parent company to the rest of Fidor’s business offerings - Fidor Bank, the challenger bank, and Fidor AG, the digital white-label technology solution provider for digital banking and any future corporate offerings. Matthias Kröner, CEO and founder of Fidor, commented: “This move will allow Fidor to continue its international expansion and drive the development of our innovative digital technology even further. In a world of increasing volatility, it is important to be a member of a strong group and this transaction is strongly improving our overall financial sustainability. We are excited to have such a well-established partner as BPCE in the financial world that recognises the need for an entrepreneurial approach to banking and innovation.” Kröner continued: “With a simplified shareholder structure, Fidor’s senior team, including myself, will be able to focus on expanding our core business offering and explore more market opportunities all over the world.” François Pérol, Chairman of BPCE, added: “This operation constitutes a key step in the acceleration of the digital transformation of our group. It further demonstrates our commitment to innovation, to developing a customer-centric approach enabled by digital banking technology, and to being more involved in the digital and mobile banking field. 
We are very proud and happy to welcome Fidor’s teams, communities and clients into Groupe BPCE.” The closing of the transaction will be subject to customary regulatory change-of-control approval from the European Central Bank and BaFin, and to clearance from the German competition authority, expected in Q4 2016. The deal was facilitated on behalf of Fidor Group by Heussen Law and Zelig Associates. Founded in 2009 by its CEO Matthias Kröner, Fidor is one of the world’s first “fintech” banks, pioneering the collaboration between traditional financial services and technology businesses. Fidor offers a unique proposition with its collaborative banking experience, where 350,000 community members work together to help build the bank’s services and products. Fidor has also developed its own proprietary technology platform – the Fidor Operating System (fOS) – which enables open, fast and advanced API banking. This week, Telefónica also announced the launch of ‘O2 Banking’, its mobile-only bank account, in partnership with Fidor. ### Network anomalies: Gaining cloud control Controlling your IT estate is vitally important for any successful business. But control is only possible when businesses have complete and reliable visibility over their network traffic. Without this, problems easily fly under the radar until it’s too late. So much information about us is now stored digitally, which means that it can be susceptible to attack. Malicious actors can use undiscovered vulnerabilities to infiltrate networks and wreak havoc, whether that’s by stealing data or carrying out a DDoS attack. However, when companies have a clear view of their network, these attacks are more difficult to carry out. [easy-tweet tweet="Malicious actors can use undiscovered vulnerabilities to infiltrate #networks and wreak havoc" user="Xangati"] Often, the first indication that something may be amiss within your company network will be relatively minor. On the surface, your critical processes may be running as normal, your CPUs may be operating at expected utilisation levels and nothing gives cause for concern. A closer look at network traffic, however, may reveal anomalous behaviour. These anomalies may not seem malicious at first, but any network traffic that differs significantly from what is generally expected is often a sign of malicious intent. But how do businesses identify this unusual network behaviour? Xangati’s Anomaly Index Fortunately, a number of tools enable businesses to monitor their network traffic in great detail. Often dubbed “Network Behaviour Anomaly Detection” (NBAD) tools, they provide real-time tracking of critical network characteristics, looking for patterns which match the signatures of known security threats. At Xangati, our Anomaly Index offers a slightly different approach to network monitoring. In order to provide a holistic view of network performance, the index is accompanied by four other metrics: Performance, Availability, Relative Capacity and Current Efficiency. This means that system administrators can view all the information they need directly from the Xangati dashboard, allowing them to identify possible threats, use analytics to assess behavioural anomalies and keep availability high. Additionally, Xangati’s network tools can be paired with any number of additional modules from the Xangati ESP for Cloud Infrastructure solution. 
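To illustrate the underlying idea of anomaly detection in its simplest form, the toy sketch below flags a traffic sample that strays too far from a rolling baseline. It is a generic illustration with made-up thresholds, not how Xangati's Anomaly Index is actually calculated.

```python
# Toy baseline-and-deviation check: flag a traffic sample as anomalous when it
# sits more than `threshold` standard deviations from a rolling baseline.
from collections import deque
from statistics import mean, stdev

class TrafficAnomalyDetector:
    def __init__(self, window: int = 60, threshold: float = 3.0):
        self.samples = deque(maxlen=window)   # e.g. observed bytes/sec, one per interval
        self.threshold = threshold

    def observe(self, value: float) -> bool:
        """Record a new sample and return True if it looks anomalous."""
        anomalous = False
        if len(self.samples) >= 10:           # need some history before judging
            mu, sigma = mean(self.samples), stdev(self.samples)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                anomalous = True
        self.samples.append(value)
        return anomalous

detector = TrafficAnomalyDetector()
for rate in [1200, 1150, 1300, 1250, 1190, 1280, 1220, 1240, 1210, 1260, 950000]:
    if detector.observe(rate):
        print(f"Unusual traffic level detected: {rate} bytes/sec")
```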
Riding out the storm Because of the potential disruption they can cause, Xangati classifies anomaly-related events as “storms.” Service Storms refer to email and upload/download activity, while abnormal behaviour, which could be a sign of an impending DDoS attack, is dubbed an Unusual Activity Storm. However, knowing that a storm is on the way is only the first stage of anomaly analysis. Xangati ESP utilises a graphical interface to give IT administrators a clear view of the storm-related event, how severe it is likely to be and the recommended response. After the storm has passed, Xangati ESP can replay the event for further analysis, and the Anomaly Index can generate a report looking at the anomalous activity over a certain time period. [easy-tweet tweet="Understanding #network behaviour is vital if businesses are to maintain control over their IT estate" user="Xangati"] Having a clear understanding of network behaviour is vital if businesses expect to maintain control over their IT estate. Any number of potential dangers may be hiding in your IT infrastructure, but by using solutions like Xangati’s ESP and Anomaly Index, your business can ride out any lurking storms and ensure that IT tools remain available for your employees and customers. ### Looking for a step change in IT productivity? Cloud is rapidly becoming the core platform for adding innovation and business value to an organisation. Yet organisations often overlook some of the more obvious places where cloud can have a bigger and more immediate impact on IT productivity. Areas such as mobile solutions, the Internet of Things and cognitive computing can revolutionise business processes. [easy-tweet tweet="#Cloud is rapidly becoming the core platform for adding innovation and business value to an organisation."] So where is the cloud likely to have the biggest impact on IT productivity in 2016? We have identified the following business cases where cloud can provide a step change in efficiency and effectiveness, enabling business and IT leaders not only to improve IT productivity but also to seize the opportunity to progress their enterprise capabilities and their personal careers. 1. Making mergers and acquisitions seamless In the intense negotiations around mergers, acquisitions and divestitures, cloud services might not be top of mind. However, they can make these processes much more painless in a number of ways. They can help organisations to realise synergy benefits more quickly, simplify integration while accelerating the change programme, reduce costs through efficiencies, mitigate costly migration investments and encourage financial flexibility. Any merger or acquisition is a critical business moment with complex processes to be negotiated, so it’s worth looking at how these can be simplified. 2. Preparing for the unexpected A systems failure or security breach could hit at any time, and while preparing for the unexpected may seem an impossible task, a lack of sufficient testing is often to blame. Testing via the cloud has a range of benefits including rapid access to resources, cloning and provisioning, shared access to testing resources and an expanded range of tests that can be covered, which can be useful for simulating the unexpected demands created by a malicious attack. Leaders should start by assessing their current test practices and services against the cloud testing possibilities available – which could help the business to be better prepared in the event of an incident.
3. Building an agile edge Customer expectations are ever-growing and, as such, businesses are under constant pressure to provide continually improved customer experiences. This requires a systematic and iterative approach to leveraging a range of technologies – meaning fast, agile development is required, as well as rapid access to new innovations. [easy-tweet tweet="Business leaders are coming to realise that cloud enables a culture change to design-first thinking" hashtags="cloud, tech"] Business leaders are coming to realise that cloud enables a culture change to design-first thinking and agile practices – enabling them to better manage unpredictable demands. 4. Increasing IT efficiency Business leaders responsible for IT are launching cloud-based transformation programmes to make a step change in IT productivity and increase the velocity of new function delivery. Cloud technology models provide well-documented, ready-to-use, standardised IT services which allow organisations to focus on the service rather than the technology details. Increasingly, IT leaders are starting to review their existing systems to assess which can be moved to cloud technologies unchanged, and which require re-platforming, reworking or replacement. 5. Addressing privacy concerns With the proliferation of data and increasingly connected systems, today’s business leaders are faced with mounting regulatory requirements, putting growing pressure on IT systems. Failure to meet these requirements can have critical business impact. [easy-tweet tweet="One solution business leaders are looking to in order to manage this is moving to a private #cloud" hashtags="tech"] One solution business leaders are turning to in order to manage this is a private cloud – dedicated solely to their organisation to enable greater control and privacy. 6. Implementing business transformation Whether the business is responding to a perceived competitive threat, implementing a major new programme, or appointing a new executive, business change is an ideal time to consider the value of cloud services. Cloud services simplify collaboration with partners and provide a framework for developing innovative ideas, enabling them to be rapidly deployed and scaled up in response to changes in demand – making business change a far more seamless, painless process. As these business cases demonstrate, business leaders are already recognising the power of the cloud and the opportunities it provides across various disparate functions. As we look forward to 2016, the cloud will only become more vital to organisations looking to revolutionise their IT operations, giving the IT department the opportunity to demonstrate its value not just as a support function but as a driver of business change. ### How Is Technology Transforming the Industrial Packaging Industry? Cloud Software for Business Cloud technology is developing at an incredible rate, and who knows where it could take the packing supplies industry. In terms of warehouse management and logistics, it is already making a significant difference to the way we pick and store our stock. [easy-tweet tweet="Cloud technology development is enhancing the packing supplies industry" hashtags="cloud, tech"] Steve Banker from Forbes Online believes that this is only the start of IoT use in warehouses and has emphasised how changing to the cloud will benefit employers, employees, and customers. Banker claims that: “Most well-run warehouses have a Warehouse Management System (WMS) that depends upon bar code data. 
Warehouse floor level operations personnel and warehouse managers execute their tasks based on this data.” A warehouse management system (WMS) is a software application that supports the day-to-day operations in a warehouse. To enhance your organisation’s efficiency, your WMS must provide centralised management of tasks such as tracking inventory levels and stock locations. The Packaging Industry and Technology In-store retailers are feeling the wrath of e-commerce businesses. Thanks to the efficient technology used in warehouses, and to ‘the Amazon effect’, customers are receiving their packages quicker than ever. Making your business more centralised should be at the top of your priority list. Although cloud software is hosted off-site, it can be accessed and managed over the internet. The future of warehouse management lies in IoT. Cloud technology has grown massively in popularity over recent years, and 2015 was a standout year: 90 percent of businesses used cloud technology in some capacity in 2015; 85 percent of users were confident in their provider’s ability to provide a secure environment; and 24 percent of IT budgets were allocated to cloud solutions. It is predicted that Amazon will have a 19 percent share of the U.S. apparel market by 2020, which is 6.7 percentage points up on 2015’s already impressive figure. With e-commerce sites like Amazon and eBay moving towards cloud technology, UK retailer Argos has decided to use voice picking in its warehouses. The gamble has paid off: pick accuracy error rates have gone down from just over 1 percent to less than 0.3 percent. Part of Amazon’s success is its use of cloud and robotics; the company currently uses over 30,000 robots in its facilities, but how can this work for your business? The Future of Warehouse Management Businesses such as Ferrari's Packaging have recently installed battery-powered, self-propelled robotic wrapping machines. Investing in this sophisticated technology means it can be used to wrap multiple packaging products. There are lots of benefits to using this new technology, one being that these machines are much safer than manual stretch-wrap machines: they are equipped with an active brake emergency-stop bumper that can bring the machine to a halt on any impact. Using a machine to boost productivity in this way creates a more efficient foundation for businesses. For the first few months, concentrate on adapting your business by supplying your staff with the digital training they will need. Technology will only keep growing, and it will soon become a help rather than a hindrance to your employees. ### Managing and maximising your virtualised resources Virtualisation has provided many benefits for businesses. Previously, organisations would have needed to employ a number of physical servers, perhaps one for each application, each of which would have been left severely underutilised in order to absorb the inevitable, but unpredictable, spikes in users. 
[easy-tweet tweet="#Virtualisation allows organisations to make more efficient use of their IT infrastructure" user="Xangati" hashtags="GPU"] Virtualisation, however, allows organisations to make more efficient use of their IT infrastructure by leveraging software to create any number of virtual machines (VMs) on a single server, enabling business resources to become more scalable, reliable and cost-efficient. With virtualisation deployed, hardware utilisation levels quickly shot up from approximately 10 per cent to closer to 80 per cent. However, virtualisation has not been without its own challenges. Business applications that require higher levels of graphical performance can be difficult to deliver using conventional virtualisation resources. This, really, is an extension of a pre-existing issue that can occur when using physical servers; the CPU can struggle to scale when it comes to graphics computation, particularly as it has to handle all other core IT processes. This led to the introduction of the graphics processing unit (GPU) to take some of the strain away from the CPU. The increasing use of graphics-intensive resources was partly driven by the uptake of virtual desktop infrastructure (VDI), in which the entire desktop operating system is hosted within a VM on an external server, instead of a local machine. With the rise of VDI, individual, physical GPUs were no longer necessary, as multiple endpoint PCs could share the resources of a single, high-end GPU. As such, many devices now utilise their own virtualised GPUs (vGPUs) and allow their vCPUs to focus on other vital processes. Managing your virtual resources Managing virtual GPUs can be difficult, with virtual infrastructures often shifting as resources fluctuate to accommodate a number of different VMs, each with their own distinct requirements. Connected to this is the upkeep of the physical GPUs that underpin the virtual resources. But gaining a clear view of what is causing GPU availability issues or slow-downs may be easier said than done. With the Xangati ESP for Cloud Infrastructure virtual appliance, administrators can collect information from all of the various virtual components in order to manage and monitor the organisation’s cloud infrastructure. Managing physical GPU resources also becomes more straightforward when using Xangati’s ESP Extension for NVIDIA pGPU, providing visibility into the impact that pGPU performance has on virtualised environments. Effective GPU management can yield a number of business benefits. Xangati ESP gives IT leaders a view of their virtual infrastructure inventory, including profiles of each pGPU, second-by-second monitoring of critical metrics, and a storm-tracker to help with troubleshooting. Reports can also be generated that help IT management make accurate assessments of their busiest pGPUs, letting them improve future efficiency and overall user experience. [easy-tweet tweet="A whole host of industries, from healthcare to entertainment, now rely on extensive #graphical output" user="Xangati"] A whole host of industries, from healthcare to entertainment, now rely on extensive graphical output. By using Xangati ESP, in combination with VDI and GPUs, businesses can run large virtual environments within reliable and cost-effective IT infrastructure. With pGPU and vGPU deployments becoming increasingly prominent, businesses need to maximise their return on hardware investment and ensure they get the most out of previously wasted resources. 
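As a closing illustration of the sort of per-GPU metrics this kind of monitoring is built on, the sketch below polls NVIDIA's nvidia-smi utility. It assumes the NVIDIA driver tools are installed on the host and is offered purely as an example of raw metric collection, not as a representation of how Xangati ESP gathers its data.

```python
# Minimal sketch: sample physical GPU utilisation and memory use by polling
# nvidia-smi. The alert threshold and polling interval are illustrative.
import subprocess
import time

QUERY = ["nvidia-smi",
         "--query-gpu=index,utilization.gpu,memory.used",
         "--format=csv,noheader,nounits"]

def sample_gpus() -> list[dict]:
    """Return one reading per physical GPU: index, utilisation %, memory in MiB."""
    out = subprocess.run(QUERY, capture_output=True, text=True, check=True).stdout
    readings = []
    for line in out.strip().splitlines():
        idx, util, mem = [field.strip() for field in line.split(",")]
        readings.append({"gpu": int(idx), "util_pct": int(util), "mem_mib": int(mem)})
    return readings

if __name__ == "__main__":
    for _ in range(5):                  # five one-second samples
        for r in sample_gpus():
            if r["util_pct"] > 90:      # illustrative saturation threshold
                print(f"GPU {r['gpu']} is saturated: {r['util_pct']}%")
        time.sleep(1)
```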
### BT outage highlights need for Backup and Disaster Recovery The Telecity data centre “outage” that knocked 10 per cent of BT internet subscribers offline in the UK this week is a stark reminder of how vulnerable our systems are, as well as how important it is to have a solid Backup and Disaster Recovery (BDR) solution in place. [easy-tweet tweet="As #data grows in volume and value, businesses are using the #cloud to meet their data management needs"] As data continues to grow in volume and value, businesses are increasingly turning to the cloud to meet their data storage and management needs. But backing up and storing data in the cloud presents its own challenges – data is typically a company’s greatest asset, and requires protection from disasters, loss or theft. It must also be easily accessible – to approved personnel – for employees to be productive and better serve customers as part of today’s highly digital global business landscape. The danger here is that downtime can be disastrous. And as the Telecity outage shows, this downtime can come from anywhere: from someone spilling a cup of coffee over a server, a hacker attacking a company’s website or a data centre suffering a power failure, all the way up to earthquakes, floods and hurricanes. And businesses can’t dismiss these outages as minor inconveniences. The average cost of an infrastructure failure is $100,000 (£68,000) per hour, with critical application failures coming in at between $500,000 (£340,000) and $1 million (£680,000) hourly, according to a report from market research company IDC. IDC also highlighted that the average total cost of unplanned application downtime per year for the Fortune 1000 companies is $1.25 billion (£850 million) to $2.5 billion (£1.7 billion). At the same time, companies are expected to be able to process and protect data all the time. Customers (whether consumers or business users) are quick to vent their anger on social media when websites are offline or e-tailers suffer data breaches. It’s not just a matter of a company losing money during any unplanned downtime; there is also the fact that you may be sued by customers for any losses your failure may have caused, plus the potential reputational damage amongst end users of your products. This means BDR planning is not a ‘nice-to-have’ – it is a business imperative. Despite this, it’s remarkable how difficult it can be to convince executives that investing in BDR services for your systems and data makes absolute financial sense. The problem is, they look at it as a cost, not an investment. It’s very difficult for them to see the value of something not happening. It works better if you present it as an insurance policy, which is, to be honest, what it is. After all, like insurance, it’s there to protect your business – you hope you’ll never have to use it, but you have to be protected against disasters that can happen at any moment. As with any insurance policy, there are costs involved, like premiums, and it is important to make a strong business case for an investment. Does it make sense for you to get a disaster recovery solution? What is the best deployment option? How does the cloud impact today’s BDR solutions? How do you get your executive team to accept that insuring corporate data with a BDR solution is necessary? The best way is to demonstrate that disaster recovery is not a cost, but an investment with a positive ROI. 
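In back-of-the-envelope terms, that ROI argument can be reduced to a few lines. The inputs below (hours of downtime avoided and the annual cost of the BDR service) are assumptions for illustration; only the hourly downtime cost echoes the IDC figure quoted above.

```python
# A rough ROI sketch for a BDR investment: avoided downtime losses versus the
# annual cost of the service. Replace the example inputs with your own estimates.
def bdr_roi(downtime_hours_avoided: float,
            cost_per_downtime_hour: float,
            annual_bdr_cost: float) -> float:
    """Return ROI as a multiple of the annual BDR spend."""
    avoided_loss = downtime_hours_avoided * cost_per_downtime_hour
    return (avoided_loss - annual_bdr_cost) / annual_bdr_cost

# e.g. 8 hours of avoided outages a year at the IDC figure of $100,000/hour,
# against a hypothetical $150,000/year BDR service:
print(f"ROI: {bdr_roi(8, 100_000, 150_000):.1f}x")   # -> ROI: 4.3x
```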
IT departments seldom justify their purchases using ROI – often using operational needs, cost savings or productivity enhancements as the key arguments. However, for a BDR solution, an ROI analysis provides the most effective and objective argument for investment. Remind your executive team of the worst-case scenario – without a BDR solution in place, the company is at risk. [easy-tweet tweet="#Data is typically a company’s greatest asset, and requires protection from disasters, loss or theft" hashtags="DRaaS"] And remind them that the downside is not just the purely financial impact, but also the potential damage to customer happiness and brand image, not to mention the possibility of lawsuits and financial sanctions imposed by regulators for breaching compliance rules. ### Pokémon Gone - has Pokémon Go proved that augmented reality producers can't rely on Google or Amazon? If you want to know what a zombie apocalypse would look like, head to Manhattan: people are literally falling over themselves while they catch Pokémon. It’s no surprise that the Pokémon Go app has become a social phenomenon given the series’ ever-popular catchphrase – “Gotta catch ’em all!”, but there have been several other side effects that developer Niantic Labs will be less keen to shout about. Players are being gripped by an irrational fear – fear that the Pokémon they’ve caught won’t end up in their PokéBall because the game will crash – while other users have trekked to the location of a rare catch only to find they can’t log in to the app at all. [easy-tweet tweet="Pokemon Go users have trekked to the location of a rare catch only to find they can’t log in to the app" hashtags="tech"] Within six days of launching in the US on 6 July, Pokémon Go had been downloaded by 21 million users. While some players were able to chase down a Pikachu or Psyduck straight away, others had to grapple with another monster altogether – downtime. The app then launched in Europe on 16 July, but again Niantic had not correctly predicted the demand generated by new players. The game was quickly overloaded as servers could not handle authentication and application requests. It’s clear that plans to scale out the appropriate pieces of infrastructure, including registration, authentication and application servers, were unfortunately insufficient. The downtime has had such an effect that hacker groups like PoodleCorp have claimed responsibility, although there is little evidence to conclude an attack caused the outages. The Google Cloud Platform, which hosts the game, has come under pressure for failing to handle the demand. Even arch-rival Amazon Web Services poked fun at Google, with CTO Werner Vogels tweeting: “Dear cool folks at @NianticLabs please let us know if there is anything we can do to help! (I wanted that drowzee)”. In theory the cloud should offer all the capacity you need, but phenomenal demand has caught both Google and Niantic Labs out. So would it be different for Amazon? Where do ambitious developers turn if they want to capitalise on the latest craze with their own augmented reality app? [easy-tweet tweet="In theory the cloud should offer the capacity you need, but phenomenal demand has caught Google and Niantic Labs out." hashtags="cloud, tech"] In many ways it’s a case of keeping things as simple as possible. 
Load balancing, which distributes the workload of your systems across multiple individual systems, has come a long way in a relatively short space of time, evolving into application delivery controllers (ADCs) that integrate core load balancing with an array of advanced application services, including security. The good news is that a simple foundation can be rapidly implemented and then scaled up as necessary. This way, developers can make sure they keep the monsters on the screen, rather than in the system. If you are considering how best to launch a new application, here are six characteristics of the future-proof load balancer: Keep it simple, keep it smart Place emphasis on performance, security and adaptability. The trick is to make sure you have foundations in place so that you can easily add functionality down the road. It has to be customisable A basic ADC set-up will get you up and running, but having an optimised solution that specifically addresses your needs will result in faster application rollout times and increased operational simplicity. Virtualise It’s not recommended to use an existing ADC to serve multiple applications. This can impact end-user experience at peak times and even during cyber attacks. New virtual ADCs (vADCs) isolate any issues with your applications and can be configured differently without affecting neighbouring set-ups. A vADC is also much faster and easier to launch, as the time needed to add a new ADC unit, rack it, connect it to the network and configure it is dramatically cut. Build as you go Deploy an ADC solution that can grow as the business scales by allowing enhancements to be added via a modular framework. Meet the application SLA challenge End users expect the same quality of experience at all times. Make sure your ADC delivers applications consistently and provides the tools to monitor and manage SLAs. Connect to next-gen switches Adoption of 10GE and even 40GE ports is taking off. Ensure your ADC can connect to these switches to enable your applications to benefit from the latest technology. It is unfortunate, though understandable, that Niantic struggled with the launch of Pokémon Go. But the real test will be after the initial surge of interest cools. Will the app continue to struggle? It may be riding high on a wave of nostalgia mixed with new technology for now, but most players will eventually drop away if problems persist. It would be crazy to suggest that Pokémon Go has been anything other than a tremendous success, but the launch problems have nevertheless dented the developer’s reputation, along with the platform it depends on. Continued issues could directly impact potential revenue due to lost players who either downloaded the game and had problems or decided not to download it at all. Hacker groups claiming responsibility in order to gain media attention will further magnify any issues. Even if the claims are untrue, it screams out that the infrastructure is vulnerable and encourages further attacks and negative attention to what is otherwise a popular and successful game. I’m sure that Niantic knew they would have a hit on their hands when they developed Pokémon Go, but even they must be surprised at the magnitude of its popularity across the globe. I’m equally sure we will see many more innovative uses of augmented reality applications in the future. Niantic has forged a path for others to follow, but has also highlighted the need for a carefully considered load balancing strategy. It’s not just good practice – it’s super effective. 
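As a footnote to the load-balancing discussion above, the sketch below shows the simplest possible distribution strategy, plain round robin, purely to make the core idea concrete. A production ADC layers health checks, session persistence, TLS offload and security policy on top of something like this; the backend names here are invented for illustration.

```python
# Toy round-robin distributor: each incoming request is handed to the next
# backend in turn, spreading load evenly across the pool.
from itertools import cycle

class RoundRobinBalancer:
    def __init__(self, backends: list[str]):
        self._pool = cycle(backends)

    def pick(self) -> str:
        """Return the next backend to receive a request."""
        return next(self._pool)

balancer = RoundRobinBalancer(["app-server-1", "app-server-2", "app-server-3"])
for request_id in range(6):
    print(f"request {request_id} -> {balancer.pick()}")
```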
### What can M7 do for you? The cloud computing market is now so broad that businesses can be forgiven for not knowing where to start. Far from simply offering additional storage, managed service providers (MSPs) now supply a range of IT solutions for a broad spectrum of industries with varying levels of support. [easy-tweet tweet="Choosing the right vendor, and the right solution, is absolutely vital when moving to the #cloud" user="m7msuk"] At M7 Managed Services, we realise that choosing the right cloud vendor, and the right cloud solution, is absolutely vital, so we work with our customers to deliver a bespoke service that is tailored to their business needs. As part of our MSP offering we provide a number of secure, highly available cloud hosting services, all delivered on an IBM and Cisco platform across two UK data centres. Using the IBM SoftLayer Cloud, we also enable businesses to deploy applications quickly and reliably in both virtual and bare-metal environments. Many companies don’t have the time, staff or IT resources to test, deploy and integrate new technologies. With M7 as their MSP, businesses can outsource the complexities of this deployment, instead receiving an IT platform that can support the software they need to be successful. It is also worth considering moving your IT solutions to the cloud because of the additional reliability that it provides. Hosting critical applications with an external MSP allows businesses to access vital workplace tools whenever and wherever they need to. In today’s always-on economy, this is a key requirement. Our many accreditations, certifications, references and SLAs also provide the reassurance that we are able to deliver resilient IT solutions that meet the uptime and availability standards that you need. As well as technical solutions, M7 are often chosen by businesses for the level of support that we provide. We champion long-term partnerships as part of a consultative, ongoing approach that ensures our customers’ requirements and expectations are understood at all times. Our 24/7 help desk is also there to remedy potential issues as soon as possible. Disaster Recovery If problems arise, it is important for businesses to have a robust disaster recovery plan in place. Having a cloud-based secondary infrastructure is the perfect way to get your business moving quickly again when disaster strikes. Cloud DR, often referred to as Disaster Recovery as a Service (DRaaS), means that businesses require less data centre space and fewer on-premise resources. This can result in significant cost savings compared with in-house disaster recovery options, enabling smaller businesses to access the same highly resilient and hugely cost-effective DR solutions enjoyed by their competitors. [easy-tweet tweet="A second site provides businesses with a #disaster recovery solution following disruption" hashtags="DRaaS, Cloud"] At M7 Managed Services, our second site provides businesses with a disaster recovery solution that is ready to go live immediately following any disruption to their primary IT infrastructure. It also provides a great ad-hoc test environment for customers who wish to trial new IT solutions, but are concerned about how they may affect their existing IT set-up. This enables businesses to introduce new, innovative technology, while minimising downtime and ensuring business continuity. 
### 3D printing and IoT: Building ‘digital enterprises’ of the future 3D printing is continuing to gather momentum and grow in demand, as demonstrated by many recent headlines. Companies and individuals across a range of industries are experimenting by extruding various materials, including plastic and metal, to prototype their products and parts. [easy-tweet tweet="Manufacturers using #IoT solutions saw an average 28.5 per cent increase in revenue between 2013 and 2014"] Whilst much of the value of 3D printing comes from enabling designs or design cycles which were previously challenging to achieve using conventional methods, extra capabilities resulting from the development of the Internet of Things (IoT) promise to make 3D printing useful in many new and more exciting ways. In fact, manufacturers utilising IoT solutions saw an average 28.5 per cent increase in revenue between 2013 and 2014, according to a Tata Consultancy Services survey. The next generation of talent has the opportunity to ‘digitally connect the dots’ of a modern factory floor to gain smarter ‘real-life’ insight over their competition, linking 3D processes through the use of IoT. A new array of business opportunities If a business leader is not considering implementing fundamentally different processes or part designs, then they will miss out on much of the benefit and on opportunities for overall cost reduction and value enhancement. 3D printing fundamentally allows for rapid design cycles, which can reduce design time, improve designs in a given time, or often both. Moreover, this innovative printing process can reduce the overall costs of manufacturing, even if the end process implemented is not 3D printing itself. Looking ahead, extra capabilities resulting from the integration of the IoT with 3D printing promise even more exciting opportunities. With the advent of big data and the idea of remote printing, we will soon be able to gain smarter ‘real-life’ insights over the competition and link up 3D processes like never before. Industry-specific examples Businesses operating in the aeronautical and healthcare sectors serve as good illustrations of how 3D printing and the IoT can be combined to deliver tangible business benefits. To keep pace with an increasingly competitive international market, it’s crucial that the UK’s aerospace industry continues to find ways to improve passenger experience whilst scaling back costs and becoming more energy efficient. For this reason, many businesses within the aeronautical manufacturing industry are already considering new design and manufacturing techniques and technologies such as 3D printing that will help them meet demands for greater efficiency and innovation while, at the same time, enabling them to work within ever tighter budgets. At the United Launch Alliance, the joint venture between Boeing and Lockheed Martin, for example, a move to 3D printing technology for the manufacture of components reportedly saved the companies $1 million in a year. However, with 3D printing still being a relatively new technology, engineers need to create adequate quality control methods, and the IoT can play an important role in this. GE Aviation, for example, now connects big data to 3D printing using strategically placed sensors to collect and analyse manufacturing information and detect production problems in real time. The technology identifies factors such as temperature and structural integrity, and the data gathered helps improve the quality of output across many manufactured products. 
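As a simplified illustration of this kind of sensor-driven quality check, the sketch below evaluates a single in-process reading against fixed limits. The sensor fields and thresholds are hypothetical, not GE's actual system, and a real additive-manufacturing line would stream far richer telemetry than this.

```python
# Toy in-process quality check for a 3D printing line: flag readings whose
# chamber temperature or layer deviation fall outside assumed tolerances.
from dataclasses import dataclass

@dataclass
class SensorReading:
    printer_id: str
    chamber_temp_c: float
    layer_deviation_mm: float

TEMP_RANGE_C = (180.0, 220.0)      # assumed acceptable build-chamber range
MAX_LAYER_DEVIATION_MM = 0.05      # assumed structural-integrity tolerance

def check_reading(r: SensorReading) -> list[str]:
    """Return any quality alerts raised by a single in-process reading."""
    alerts = []
    if not TEMP_RANGE_C[0] <= r.chamber_temp_c <= TEMP_RANGE_C[1]:
        alerts.append(f"{r.printer_id}: chamber temperature {r.chamber_temp_c}C out of range")
    if r.layer_deviation_mm > MAX_LAYER_DEVIATION_MM:
        alerts.append(f"{r.printer_id}: layer deviation {r.layer_deviation_mm}mm exceeds tolerance")
    return alerts

print(check_reading(SensorReading("printer-07", 231.5, 0.02)))
```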
In the healthcare industry, the use of 3D printing is currently helping to improve the quality of many patients’ lives. For example, individuals with just one finger can now use personalised joysticks to control their wheelchair, whilst customised bionic eyes are helping people with profound vision loss to recover some sight using a retinal implant. At the same time, the data revolution is empowering the healthcare industry by enabling the development of connected devices such as wearables and tailored apps, which are helping consumers take control over their own health in a highly personalised manner. Combining these two revolutions – digital manufacturing and big data – will equip the industry with tools that can revolutionise the way health is monitored, analysed and improved in order to enhance everyone’s quality of life. What next? With an ever-increasing demand for fast, quality-produced parts and the convergence of software and hardware, the importance of 3D printing techniques is likely to continue growing and developing further. To plan the journey towards realising the aspiration of being a true ‘digital enterprise’, on-demand manufacturing is key, and innovative manufacturing companies need to differentiate themselves by always looking to go one step ahead. For this reason, manufacturers are likely not only to adopt 3D printing technologies that allow them to transform their operating models but also to digitally connect their processes, so that information about product usage, production capabilities and market and customer requirements can be shared and analysed much faster than ever before. [easy-tweet tweet="#IoT solutions are in a great position to merge with the new #digital manufacturing processes"] Ultimately, IoT solutions are in a great position to merge with the new digital manufacturing processes, to digitally connect the dots on the manufacturing floor and help bring together the new ‘digital enterprise’ of the future. ### Q&A with KRCS Group Compare the Cloud has been interviewing cloud experts and learning about their companies, and today the spotlight is on KRCS, a retailer and Apple expert partner. We sat down with Philip Woods, Director at KRCS Group, to find out a bit more about his business. [easy-tweet tweet="Smaller businesses can really struggle with the #migration to #cloud technologies"] Tell us about yourself and your experience in the cloud computing industry? KRCS has worked with all sizes of business and educational establishments since 1983, but our work in recent years with larger organisations has shown us that a move to cloud-based services on mobile technologies has a huge impact on productivity. At the start of 2015 we identified that smaller businesses can really struggle with the migration to cloud technologies, particularly the early stages of the process that involve transferring from traditional infrastructure products. It can be daunting for businesses without in-house IT knowledge. Who are KRCS and what is your differentiator in a crowded ERP market? KRCS is a retailer and an Apple expert partner. Our expertise stretches to cloud-based POS solutions like Vend. We take a long-term, consultative approach to our business relationships with retailers, and we genuinely feel we offer a unique perspective on their challenges. A successful business relationship for KRCS is primarily based upon the success of our customers. 
We will do everything we can to share the knowledge that we’ve built up over 15 years trading as an Apple retailer to assist in any area of a retailer’s business, not just POS. What countries have KRCS deployed into? KRCS has only worked with UK-based retailers, and primarily those in and around our home territories, particularly the Midlands and north of England. What are the development plans for Vend at KRCS? One of our key USPs as a Vend expert is that we’re local to our customers. A significant development plan for the rest of 2016 is to expand our agenda of retailer ‘meet-up’ events in our stores. We are planning to run one per month as a showcase for what’s new in the world of Vend, but also as a general business networking event for like-minded local business owners. Where did the name KRCS originate from? KRCS was initially registered as KR Computer Services in 1983. K and R refer to the Christian name initials of our two founders and Directors, Ken Woods and Rod Bishop. What are the trends your organisation is noticing within the marketplace? Currently the trend is towards subscription-based cloud solutions that don’t require a huge upfront investment. We are also seeing a genuine desire from retailers to make their solution scalable. All start-up retailers have a dream of owning multiple stores one day, and thanks to cloud solutions like Vend they are able to work towards this goal from day one. In this way their operation can scale seamlessly as their business grows. How do you intend to attract new customers onto your platform? The answer to this question could be a lengthy one, so I’ll just mention a few key points: we offer a complete range of advanced cloud-based solutions for retailers, including Vend for POS, Deputy for staff scheduling and Xero for accounts and payroll; we specialise in supplying and servicing iPads, the best platform to run these systems on; we aim to exceed every single one of our customers’ expectations, and when people receive excellent service, they tend to tell both their friends and their business allies; and we don’t try to flood the market with pointless marketing messages. We keep it personal! A question we ask everyone: “What is your definition of cloud computing?” Cloud computing is a software solution that’s available from anywhere with an internet connection. The definition then splits into two, because larger organisations may actually host the system in-house, while smaller businesses instead access the software remotely on a securely managed, shared server platform to ensure low cost and high reliability. Tell us what your average day entails? I’m a Director of an SME, so no day is average. My day can involve planning for a significant new business venture, like our relationship with Vend, or trying to find an emergency technician to open a failed security shutter. Who is the team behind the KRCS Group? KRCS is essentially a family-run business. My father (Ken Woods) is our Managing Director and is still actively involved in the business on a daily basis. My brother (Robert) and I effectively run the operational side of our businesses, but our senior leadership team includes four people without the Woods surname, so we can ensure an independent opinion is heard. [easy-tweet tweet="#Cloud computing is a #software solution that’s available from anywhere with an internet connection"] And finally, if you could change one thing within the cloud computing industry what would it be? 
I think it would be the vast array of solutions seemingly available to do the same thing. In time I hope that the market rationalises, but right now it’s a nightmare for business owners trying to navigate to the right solution for their business, as independent advice like ours is very hard to come by. ### The rise of the robots – a networking perspective When a switch is worth 1,000 robots! Cutting-edge robots and other advanced smart machines are set to be added to the rapidly expanding Internet of Things, which is projected to reach 25 billion devices by 2020. Robotics has already been used in manufacturing to great effect for over a decade, performing delicate and precise tasks with a higher success rate than humans. With advancements such as 'deep learning' robots, delivery drones and ubiquitous knowledge-sharing between machines, widespread robotics adoption is becoming far more feasible. In healthcare, there are already robotic services in operation, with automated pharmacy dispensing and robotic trolleys – robots that can navigate between floors and even call the lift using a Wi-Fi sensor. The hospitality sector has also been a keen adopter of robotics to deliver services, and in education robots are being deployed successfully as a tutor, tool or peer in learning activities, providing language, science and technology education. Balancing risk and reward We must exercise caution when considering a broad introduction of robots and smart 'Machine to Machine' (M2M) devices. Robotics delivering such essential services will create an enormous demand for greater bandwidth and prioritised, on-demand connectivity – but can these demands be met with legacy IT systems? Without a strong and capable network as a backbone, individual smart devices will not flourish in the workplace, bottlenecked by capacity limitations. How can businesses that operate in traditional IT environments hope to successfully exploit and integrate this advanced automation? Can their networks ever meet the demands for smooth interoperability, large amounts of bandwidth, and continuous reliability in order to support constant data streams? The answer is yes – new developments in network technology are enabling businesses to develop the right infrastructure to support the adoption of robots. [easy-tweet tweet="Developments in network technology are enabling the development of infrastructures to support robots." hashtags="tech"] When a switch is more than just a switch Adequate bandwidth must be available to ensure time-sensitive tasks with little margin for error are not interrupted. This is where the network truly shows its worth – the Application Fluent Network, composed of robust switches capable of withstanding heavy demand for bandwidth and connectivity, will be best poised to adopt automation to maximum effect. The most advanced generation of SDN-ready, Application Fluent Network switches incorporates features such as embedded network analytics and deep packet inspection. These application- and device-aware switches allow for data prioritisation, letting smart machines and robots operate unhindered by bottlenecks. Industrial-grade switches are also now available, capable of operating 24/7 at the network edge in challenging industrial conditions such as high temperature, dust and humidity, and are a necessity in many of the differing workplace environments where robots will be deployed. 
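To illustrate the data-prioritisation idea mentioned above in the simplest software terms, the toy scheduler below always forwards the highest-priority application class first, so a robot control frame is never stuck behind a bulk backup. The traffic classes are assumptions, and real switches implement this in hardware QoS queues rather than in code like this.

```python
# Toy strict-priority scheduler: frames are tagged by application class and
# dequeued highest priority first, FIFO within the same class.
import heapq
from itertools import count

PRIORITY = {"robot-control": 0, "voice": 1, "sensor-telemetry": 2, "bulk-backup": 3}

class PriorityScheduler:
    def __init__(self):
        self._queue = []
        self._seq = count()   # preserves arrival order within a priority class

    def enqueue(self, app_class: str, frame: str):
        heapq.heappush(self._queue, (PRIORITY[app_class], next(self._seq), frame))

    def dequeue(self) -> str:
        return heapq.heappop(self._queue)[2]

sched = PriorityScheduler()
sched.enqueue("bulk-backup", "nightly image chunk")
sched.enqueue("robot-control", "arm stop command")
print(sched.dequeue())   # -> "arm stop command" is forwarded first
```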
If a switch is not robust and is frequently damaged or knocked out of service, millions of pounds’ worth of robotic automation could be rendered useless. These rugged switches are already available, and can be found in challenging environments ranging from offshore wind farms to underground tunnel infrastructure. But the challenges are twofold – physical and digital. By introducing Intelligent Fabric technology, enterprise networks can dynamically adapt to the requirements of virtualised workloads and simplify network operations through comprehensive automation. An Application Fluent Network strategy, on which Intelligent Fabric networking is based, provides the agility and flexibility that is vital for integrating robots into a network of any size, prioritising data & applications 24/7 so that service is always delivered in the right place at the right time. A cloud-centric future? Developments in cloud computing that harness the benefits of converged infrastructure for robotics appear highly promising in early-stage testing, such as cloud-based processing of complex data in the form of speech and facial recognition. Yet practical deployment of this advanced technology is made possible by developments in networking such as high data transfer rates and improved interoperability. The rise of cloud-based, on-demand computing has brought with it the ability to incrementally deploy robots for greater cost-efficiency. [easy-tweet tweet="Smart machines will not perform effectively when served by an unintelligent network" hashtags="#tech #cloud"] As 2016 moves onwards, we should be observant of how networking and cloud technology develop alongside robotics. Smart machines will not perform effectively when served by an unintelligent network. Enterprises will need intelligent, automated network prioritisation at all levels, to ensure the right data is delivered to the right location without interruption. ### Artificial Intelligence Could Be Set to Make Security Smarter The future of online security is in the hands of Artificial Intelligence (AI); that's the prediction many industry experts are currently making. Although the development of computers capable of intelligent thought is nothing new, AI experts are now starting to expand the capabilities of their technology. [easy-tweet tweet="The future of online #security is in the hands of AI; that's what industry experts are predicting"] Indeed, back in the 1950s breakthroughs such as alpha-beta pruning were hugely important, but mainly because they allowed computers to play chess at a more advanced level. While it's true that chess has been one of the benchmarks for the capabilities of AI machines over the years, developers have moved on to new pastures ever since Deep Blue beat Garry Kasparov in 1997. AI's Gaming Skills Could Provide Smart Security Solutions [Image: “Deep Blue” (CC BY 2.0) by James the photographer] Today, researchers such as those at Carnegie Mellon University are working to solve games such as No Limit Texas Hold'em. Unlike chess, where all the variables are known (each player knows the spectrum of possible moves in any situation), No Limit Texas Hold'em is a game of unknown variables. Because the betting is unrestricted and a player's cards aren't on display, the number of possible hand combinations and moves makes it almost impossible to predict what's going to happen.
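For readers unfamiliar with the chess-era technique mentioned above, the following is a minimal, self-contained sketch of minimax search with alpha-beta pruning. The tiny hard-coded game tree is an assumption purely for illustration; real engines search far deeper and use evaluation functions rather than fixed leaf values.

```python
# A minimal illustration of minimax with alpha-beta pruning: branches that
# cannot influence the final decision are cut off early.

def alphabeta(node, depth, alpha, beta, maximising):
    """Return the minimax value of `node`, pruning hopeless branches."""
    if depth == 0 or not isinstance(node, list):  # leaf: a numeric score
        return node
    if maximising:
        value = float("-inf")
        for child in node:
            value = max(value, alphabeta(child, depth - 1, alpha, beta, False))
            alpha = max(alpha, value)
            if alpha >= beta:      # the opponent will never allow this branch
                break
        return value
    else:
        value = float("inf")
        for child in node:
            value = min(value, alphabeta(child, depth - 1, alpha, beta, True))
            beta = min(beta, value)
            if beta <= alpha:
                break
        return value

if __name__ == "__main__":
    # Leaves are position scores from the maximising player's point of view.
    game_tree = [[3, 5], [6, 9], [1, 2]]
    print(alphabeta(game_tree, 2, float("-inf"), float("inf"), True))  # -> 6
```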
For AI researchers, imperfect-information games like poker pose a tantalising opportunity to develop the abilities of AI machines, and in 2015 Claudico took on some of the best players in the world. Developed by Tuomas W. Sandholm and his team at Carnegie Mellon University, the AI program narrowly lost the match but proved that computers can now make optimal moves against unknown variables. In fact, it's this result that's prompted Sandholm to suggest AI could soon be used to tackle cybercrime. Using the same level of analysis that allowed Claudico to play No Limit Hold'em, security systems of the future could detect the strategies of an opponent (i.e. a hacker) and then figure out a way to exploit the opponent's weaknesses. Modern Security is the Product of Evolution [Image: “Cloud-computing (1)” (CC BY 2.0) by incredibleguy] While this technology may not be perfected for another decade, the roots are already there. If we look at the evolution of online security we can see a movement away from static systems towards programs that offer a more dynamic form of protection. It used to be the case that small businesses would have to rely on stock security hardware to protect their systems. However, with small businesses now a target for hackers, something more is needed. This increased threat level has given rise to the use of cloud-based security solutions such as web application firewalls (WAFs) within the small business community. [easy-tweet tweet="WAFs offer a dynamic, cost-effective complement to any business hardware" hashtags="security, cloud"] Offering a dynamic, cost-effective complement to any hardware a business may own, cloud-based WAFs protect against common web application threats and vulnerabilities. From cross-site scripting to remote file inclusion and SQL injections, WAFs are currently the best way for businesses of all sizes to protect their data. What's more, this technology is both a product of the times and of the security industry's evolution. Indeed, because cloud technology is now a mainstream product (rather than the niche project that Salesforce.com was in 1999), industries are now able to tap into its capabilities, and this is what the security industry has done over the last decade. Instead of sticking to the hardware solutions of the past, leading security companies are now combining this technology with WAFs in order to provide multiple layers of protection. However, as with all things in the tech world, things will continue to evolve and that's where AI could come into the mix. Whether it's some form of standalone protection or the integration of AI with current solutions such as WAFs, it looks as though that's the way the industry is heading – which, for consumers, should be seen as a positive. ### The Dangerous (Augmented) Reality of Pokemon Go Augmented Reality has seemingly endless possibilities; however, attempts to merge the real world with computer-generated content have so far fallen flat. With innovations in Virtual Reality being made every day and VR headsets becoming mainstream not only for gaming but for video content too, it begs the question: how long will it be before we see the two integrated?
[easy-tweet tweet="#AR opens a door to a virtual universe where our minds are the only boundary" hashtags="security, futurology"] Described as ‘opening a door to a virtual universe where our minds are the only boundary’, augmented reality begins to sound like the stuff of dystopian nightmares where everything we see has superimposed images of information for everything in the real world in real time, from spare rooms in hotels to live prices on food in supermarkets and all it takes is for the user to look in the right direction. Before we are hit with this new visual universe however, Japanese gaming firm Nintendo have gifted the public their first highly addictive taste of augmented reality with their smartphone app debut, Pokemon Go. Nintendo, like many companies in recent years, have dabbled in making AR ‘fun’ and easily accessible with ‘AR Games’ on their 3DS console where, by using a recognised card, the console created simple games that ‘appeared’ in the real world via the hardware. Using the smartphone’s built-in GPS the game brings the Japanese company’s creature-filled world into reality and has set users the task of travelling throughout the real-world to ‘catch them all’. The smartphone app has become an inescapable force of nature over the past few days despite only being released a week ago in a handful of countries and builds on success of the giant video game franchise from the 90s. It benefits from a perfect blend of millennial nostalgia for the original games and the modern thirst for combining new tech like AR with the trend of casual gaming and social sharing on smartphones. [easy-tweet tweet="Pokemon Go benefits from a perfect blend of nostalgia and a modern thirst for combining new #tech"] The basic concept isn’t too dissimilar from real-world treasure-hunting resource ‘Geocaching’ which relies on users and moderators to hide capsules for other users to locate and ‘log’ by optimising smartphone GPS capabilities. Pokemon Go uses your smartphone’s GPS combined with the camera to create the illusion of the collectable monsters appearing on the city streets as you explore the area and allows you to compete alone or in teams to conquer ‘gyms’ which appear as map landmarks scattered throughout the real world. This interactivity with other users assists in the app’s addictiveness through competition and Nintendo plan to further the interaction options by implementing a trade option in order to help trainers ‘catch them all’. One of the most compelling angles of using AR features in combination with GPS is of course that of security. Early users of the application found it appeared to allow ‘full access’ to their Google accounts and smartphone location settings have been a long time concern for most users. In order to play Pokemon Go it is required for all these security settings to be opened by the user to allow the phone to track your location (even while it is locked and in your pocket). What is concerning about location services is where that data is stored and who can view it. By tracking where you go and even how long you stay in a certain place the data creates a personal profile for you as a consumer and this collected information could eventually be used for tailored advertising or perhaps rather sinisterly it could be our first steps transcending towards the ideologies of Orwell’s Airstrip One where the population is highly monitored by the state. 
Stepping away from the dangers of a totalitarian state and concentrating on the other perils Pokemon Go and AR technology may pose: more simply, perhaps, it will become a distraction. You will be able to whittle away the hours in-game and tone your calves while you walk miles catching new Pokemon, taking over gyms and hatching eggs, but much like texting it will create a perception filter around you while you do so. As a user you may be paying close attention to what’s in front of you via the smartphone’s camera; however, because it is a game, users tend to forget they’re still in the ‘real world’ while playing it, walking straight out into traffic and even stumbling head first into oceans while hunting for the best Pokemon. Because of this, the game already carries a hazard warning on its splash screen. Besides Pokemon Go possibly being intrusive to your privacy settings and attempting to kill you while you travel far and wide, it apparently also wants you to be the centre of some minor civil unrest, as ‘Poke-Stops’ and Gyms have popped up in-game in some rather unfortunate locations. The US Holocaust Museum in Washington has asked players to stop entering the AR world while inside its grounds to collect items and pokeballs, as it feels this is deeply disrespectful to the victims of Nazism. [easy-tweet tweet="Stocks for #Nintendo soar with the resurrection of the monster franchise" hashtags="pokemongo, app, security"] While stocks for Nintendo soar with the resurrection of the monster franchise, the app shows no sign of petering out, its AR becoming a bandwagon success as the power of social media allows gamers to make one another envious of the latest Pokemon they’ve caught and where. The irony, however, is that Pokemon Go is fast catching the top social apps, outpacing them for active users and time spent in the application. It has proved a success in bringing people together just as much as it has raised eyebrows, with groups using ‘pokestops’ to lure unsuspecting users into secluded areas and rob them. Even with these dangers rearing their ugly heads, it is hard to ignore how good it feels to be walking along a street on your day off with no idea exactly where you are, and making eye contact, followed by a wry smile, with a fellow Pokemon Master on their own adventure. Augmented Reality offers some very exciting ideas for the future of gaming as well as practical uses, and making the technology work eventually comes down to getting the formula right and bringing users together in a community where AR assists rather than distracts. ### Cloud opens up opportunities for those tied to ERP There are many compelling reasons why organisations turn to Enterprise Resource Planning (ERP) software to manage core business processes and collect and interpret data from a variety of activities such as HR, supply chain management or financial control. [easy-tweet tweet="There are many compelling reasons why organisations turn to #ERP software to manage core business processes"] According to figures from Allied Market Research, the ERP market is showing no signs of slowing down and is predicted to reach $41.69 billion by 2020. There’s no doubting that ERP delivers to businesses on many levels. Sales forecasts are no longer based on guesswork, order volumes can be easily balanced with inventory, and workforce planning can be brought in line with business needs. ERP systems help with these and many more challenges for tens of thousands of organisations.
System frustrations However, for all of the many benefits it offers to the organisation as a whole, it’s not uncommon for certain business functions to become frustrated by the rigidity and limitations of the functionality on offer from their organisation’s prescribed ERP system, whether that’s JD Edwards, Oracle, SAP or Sun Systems. While ‘core functions’ such as finance, manufacturing and HR are well supported by ERP’s ‘core features’, other business functions, procurement being one, can often find themselves in receipt of ‘add on’ functionality that purports to be a fully blown system. However, in reality this more niche functionality may be the result of a gap-filling acquisition of a technology that is poorly integrated or not part of a long-term roadmap. Making the move Let’s take procurement’s plight as an example. We find that many procurement professionals are looking to move away from what’s offered to them in their ERP system towards best-in-class eSourcing and eProcurement technology. This will enable them to automate manual processes and contain and better manage organisation-wide spend, thereby delivering a multitude of business benefits. It’s worth noting that there is a stark difference between the make-up of ERP and procurement systems. While ERP tends to be a ‘deep’ system used by relatively few people, purchasing systems perform ‘shallower’ processes, relatively speaking, and can be used by almost anyone in an organisation who needs to buy something. Where ERP needs to manage complexity, procurement needs to offer simplicity to its user base. The same applies to other outlying functions less well served by ERP. Increasingly, procurement teams are facing the choice of continuing to use their ERP system or making the move to a cloud-based eProcurement system that can integrate seamlessly with it. But how can departments like procurement convince the rest of the business, especially IT, that ERP no longer meets their needs and that a move to a cloud-based system can be integrated securely and without disruption? The integration challenge We’ve seen many of our customers seek to take these steps before being halted by the integration challenge, and by concern from IT managers that integrating with a remotely hosted, third-party system may pose a risk to the organisation, especially where business-critical master data and finance systems are concerned. When it comes to eProcurement software in particular, one of the biggest roadblocks is often the question of integration with financial processes. However, the tide is now turning, and some cloud-based eProcurement solutions now come with ready-configured integration Platform as a Service (iPaaS) to integrate securely with any required system, offering users freedom of choice and the ability to automate, improve and better manage many of their day-to-day procurement processes. Efficiency benefits For example, one of our customers uses JD Edwards (JDE) for finance and used its procurement module for over ten years to raise purchase orders and approve invoices. The system wasn’t considered very efficient or easy to use, so certain departments chose to bypass it altogether, preferring instead to process their orders manually. However, the complexity and limited functionality of the existing system were preventing the organisation from making wide-scale purchasing efficiencies and restricting its view of organisation-wide spend.
Having decided to integrate a new eProcurement system with the JDE finance system – enabling a number of efficiencies including better spend control and more efficient order processing and payments – the organisation opted for a hybrid cloud approach, allowing us to host our cloud-based service from within its data centres. Another of our customers, a global organisation of scale, was keen to make efficiencies in the management of its indirect spend across Europe. Multiple systems were being used across the region for indirect purchasing and these were largely manual, paper-based processes that did not provide full visibility and control over expenditure. As a result, collaboration between the purchasing teams and finance, and with suppliers, was not integrated and could lead to duplicated spend or even the business purchasing goods or services it didn’t need. After considering the option of using the procurement element of its ERP to improve indirect purchasing across Europe, it chose instead to move its entire European operations to a separate, cloud-based eProcurement system integrated with SAP. It now has a single view of Europe-wide purchasing, enabling significant cost savings. [easy-tweet tweet="One of the biggest roadblocks for eProcurement #software is #integration with financial processes"] Best-of-breed cloud-based solutions offer a host of benefits across a business, not just for one team, so for those parts of the business poorly served by ERP, cloud-based solutions complete with iPaaS may offer a viable and low-resistance alternative to the limited functionality currently on offer. ### A day in the life of a company in FinTech The FinTech market is accelerating at a rapid pace. Global investments in FinTech firms reached $5.3 billion over the first quarter of 2016, an increase of 67 per cent compared to the same period twelve months previously. In order to sustain this growth, the FinTech industry is operating within a very broad market, targeting any and all areas of the financial sector. [easy-tweet tweet="One of the areas that #FinTech firms are making an impact in is hedge funds" hashtags="Investment, Cloud"] As a result, the day-to-day life of a FinTech business can vary massively depending on its particular circumstances. Everyone from startups to multinational banks is now joining the financial technology sector, and many businesses are now focusing on the delivery of bespoke solutions rather than becoming a jack of all trades. Even so, there are some common elements among FinTech firms. Namely, many FinTech businesses are committed to disrupting traditional ways of working. This could take the form of mobile payment platforms, investment analytics, peer-to-peer transfers or many other yet-to-be-discovered disruptive approaches. A hedge fund horror story One of the areas that FinTech firms are making an impact in is hedge funds. Hedge funds may have become synonymous with the worst excesses of the finance industry, but they remain a viable approach to investment. As a result, cutting-edge startups are seeing how they can use digital technologies to reinvigorate an investment strategy that has been in existence since the 1940s. One of the ways that they are doing this is by utilising data-intensive tools and analytics solutions to help managers and investors optimise portfolios, manage risk and track performance. Taking a data-heavy approach, however, does come with its challenges, particularly regarding security.
FinTech firms store, transfer and analyse vitally important data regarding hedge fund investments and security can be difficult without vast IT resources. Failure to keep this data protected could easily lead to a “hedge fund horror story” where fraud and corruption are allowed to occur. However, cloud computing can provide FinTech firms with the resources to safeguard hedge fund data while still embracing the latest digital technologies. At Veber, we offer FinTech businesses robust security features as part of our cloud hosting packages. This includes a range of firewall options (virtual, shared and dedicated), anti-virus protection and encrypted access over our virtual private networks (VPNs). The latter is particularly important when it comes to hedge funds, as remote customer access is often necessary. Another potential horror story surrounding investment technology is not having the required network infrastructure to manage data flows. This could result in poor forecasting, or even unexpected downtime. For FinTech companies operating in an immature market, these kinds of disruptions can be terminal to future business success. With cloud computing, however, organisations benefit from the network infrastructure of their third-party vendor. Our network technology promises high availability and increased performance to ensure that FinTech businesses and their customers have access to the data that they need, whenever and wherever they need it. [easy-tweet tweet="#Cloud computing can provide #FinTech firms with the resources to safeguard investment data"] The FinTech industry is changing fast. What the future holds is difficult to say, but data will likely remain at its heart and cloud computing offers the industry the security, reliability and scalability that can prevent FinTech horror stories from becoming a reality for your business. ### Why we can look at award winners to shape and improve our game What to learn from a micro-SME winning a G-Cloud award When running EuroCloud in the UK, a role I enjoyed for 3 years to July 2015, there was one unmistakeable signal that shone out from the activities we undertook. The cloud vendors needed help understanding how to be successful on G-Cloud.  G-Cloud events were sell-outs, participation in the Q&A was tireless, the networking party was frenzied and the feedback was positive and always called for more. [easy-tweet tweet="Why do only a small number of vendors make enough money to make being on G-Cloud worthwhile?" hashtags="GCloud"] The question that needs to be answered is why a small number of vendors are successful and a large proportion (90 per cent for SaaS) don’t make enough money to make being on G-Cloud worthwhile.  The events EuroCloud run always include inviting the successful to show-and-tell how they do it; it’s a credit to the community that these firms are prepared to give away part of their competitive advantage. But it is in their interest that G-Cloud grows and becomes the model Cloud marketplace. That’s also why we started the G-Cloud Outstanding Success Story Award. And why my firm (micro-SME that we are) is pleased to sponsor it. It showcases how to succeed; look here for the real story on what works, and how the winner has adopted not just a functional service but has wrapped the whole-product in a packaging of non-functional attributes which have been correctly assessed to be requirements or desirables and compensate, if compensation is necessary, for not being a global vendor. 
This year’s winner, Linkspace, a flexible data management system that empowers business process owners to create, modify and control the data in and around their evolving business processes is developed on Open Source components by Ctrl O Ltd, a micro-SME that is getting it right when so many firms with a lot more marketing muscle have not yet found the path to success. What did they do? Linkspace (SaaS) was deployed by the MoJ in two contrasting instances to securely manage sensitive data that required concurrent access and updated by dispersed staff with highly granular security administration (who can look at what and do which tasks).  Linkspace was selected against opposition probably in large part because Ctrl O have been painstaking in making themselves easy for that type of customer to do business with. I asked Andy Beverley, Ctrl O’s CEO, to explain the secret formula that seems to be eluding some pretty heavyweight vendors who are clearly successful off G-Cloud: "I think it's down to two things. Firstly, supplying exactly what the customer wants in as simple a manner as possible: we know the language in which our customer articulates their need, and understand how they want to fulfil a requirement. Secondly, the whole service needs to meet the sometimes-difficult needs of the public sector. That includes the security requirements, and keeping the overall management easy and uncomplicated, without a myriad of potentially costly support options. In the case of Linkspace, because the software can be adapted as the business process evolves, there's no costly external consultants' bill every time something changes." [easy-tweet tweet="The G-Cloud Outstanding Success Story Award provides positive examples for #cloud firms" hashtags="GCloud"] In my view, this is an exemplary story, which is the whole purpose of the G-Cloud Outstanding Success Story Award, it is a sign-post that simply shows the way to those vendors who are open-mindedly looking for answers.  It is a story that has many of the marketing best practice features that are common in many of the successful G-Cloud SaaS vendors I have studied. It’s worth taking a look at the full case study which can be found here. ### Public Cloud vs. On-Premise: What to consider? The decision to use public cloud infrastructure for big data management is a fairly simple one for IT teams who have an immediate need for storage and computing or who are driven by an organisation-wide initiative. [easy-tweet tweet="For those deciding between on-premise and public #cloud, there are several criteria to consider "] For those weighing their options between on-premise and public cloud, there are several criteria to consider in deciding on the best deployment route. Data Location – Where is the data generated? Data can be viewed as having “mass” and thus can prove difficult (and expensive) to move from storage to computing. If the big data management model is not the primary location for data, best practices suggest establishing the enterprise data hub close to data generation or storage to help mitigate the costs and effort, especially for the large volumes that are common to big data management workloads. That said, IT teams should explore the nature and use of the data closely, as volume and velocity might allow for streaming in small quantities or transfers of large, single blocks to an on-premise environment. 
Often, if data is generated in the public cloud or if the data is stored long term in cloud storage, such as an object store for backup or geo-locality, public cloud deployment becomes a more natural choice. Workload Types – What are the workload characteristics? For periodic batch workloads such as MapReduce jobs, enterprises can realise cost savings by running the cluster only for the duration of the job and paying for the usage as opposed to keeping the cluster activated at all times. This is especially true if the workload is only run for a couple of hours a day or a couple of days a week. For workloads that have continuous and long-running performance needs such as Apache HBase and Impala, the overhead of commissioning and decommissioning a cluster for the term of the event may not be justified. Performance Demands – What are the performance needs? One of the underlying tenets of Hadoop is tightly coupled units of compute and local storage that scale out linearly and simultaneously. This computation proximity enables Hadoop to parallelise the workload and significantly accelerate the processing of massive amounts of data within a short period of time. However, a common foundation of cloud architectures is pools of shared storage and virtualised compute capacity that are connected via a network pipe. These capabilities scale independently, but the network adds latency and shared storage can become a performance bottleneck for a high-throughput MapReduce job, but the exact performance needs vary from workload to workload. The ecosystem of cloud vendors offers enterprises many architectural options and configurations that can address more directly the particular needs of a workload. For example, IT teams should examine the proximity of storage to compute as well as the degree of shared resources within the service as potential factors to performance, from fully virtual instances to standalone, bare-metal systems. Performance is often an important criterion when processing large volumes of data typical of Hadoop workloads. For non-production, development, or test workloads, this factor might be less of a concern, which makes running these workloads against shared storage a potentially viable option. For production workloads, public cloud environments are still viable, but IT teams need to be more deliberate in their selection of proximity and resource contention, for example, in order to meet the performance requirements. Data location, like cloud-based storage, and types of workloads, like periodic batch processing, are strong influencers on the decision to deploy into the public cloud, yet many see the total cost of ownership—in terms of rapid procurement and provisioning of resources and the associated opportunity costs—as the most important motivator. Cloud TCO – What is the difference in Total Cost of Ownership (TCO)? Calculating the TCO of a public cloud deployment can extend beyond the options for compute, storage, data transfer, and the pricing thereof. A good starting point to narrow down the options is to use reference architectures from your service provider, for the cloud environment of choice. Based on the options from the reference architecture best suited for the workload or workloads, enterprises can further develop their expected usage patterns and arrive at a more accurate TCO for deploying a big data management model in the public cloud. 
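As a hedged, back-of-the-envelope illustration of the batch-workload economics described above, the sketch below compares paying for a transient cluster only while a periodic job runs against keeping an equivalent cluster on around the clock. All node counts and prices are assumptions, not figures from any provider's price list.

```python
# Back-of-the-envelope TCO sketch: transient, job-scoped cluster vs always-on.
# Cluster size and hourly price are illustrative assumptions only.

def monthly_cost(nodes, price_per_node_hour, hours_per_run=0.0, runs_per_month=0.0,
                 always_on=False, hours_per_month=730):
    hours = hours_per_month if always_on else hours_per_run * runs_per_month
    return nodes * price_per_node_hour * hours

if __name__ == "__main__":
    nodes, price = 20, 0.35  # assumed 20-node cluster at $0.35 per node-hour

    # A 2-hour MapReduce job run once a day vs the same cluster left running.
    on_demand = monthly_cost(nodes, price, hours_per_run=2, runs_per_month=30)
    always_on = monthly_cost(nodes, price, always_on=True)

    print(f"On-demand (2h x 30 runs): ${on_demand:,.0f}/month")   # $420
    print(f"Always-on (730h):         ${always_on:,.0f}/month")   # $5,110
    print(f"Ratio: {always_on / on_demand:.1f}x")
```

For a continuously running HBase or Impala service the on-demand advantage disappears, which is exactly the trade-off the workload-type criterion above is meant to capture.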
[easy-tweet tweet="#Data can be viewed as having mass and thus can prove difficult to move from storage to computing"] While deployment choice does not fundamentally change the architecture of the data management model, the additional benefit of on-demand provisioning and elasticity in the public cloud does open new possibilities for this evolution in data management. ### Stormy weather: Compliance and security in the cloud With Microsoft’s announcement on the 1st July about the general availability of 'public switched telephone network' (PSTN) Calling as part of Skype for Business Online, agile working just got made even easier. As more businesses embrace agile working this means that staff are increasingly working out of office, bringing their own devices into the workplace, or using their mobiles to make calls as well as sometimes even arranging deals. In the never-ending battle to reduce costs and increase margins, businesses are looking at all options with many turning to cloud solutions, such as Skype for Business for making and receiving calls, as a cheaper alternative. The benefits however, are far more than simply commercial – a key feature of cloud solutions are their flexibility and scale. For example, it can make working with overseas colleagues, partners and customers much more seamless, as well as making business expansion into new regions far more 'instant'. It allows organisations to flex up and down based on its needs. [easy-tweet tweet="A key feature of #cloud solutions are their flexibility" hashtags="security"] However, as with all new technologies, there comes a certain amount of uncertainty, misunderstanding and fear. In the case of the cloud, many fear the potential compliance and security risks that come with the freedom to work anywhere at any time - but should they? On the one hand, the agility that these solutions provide is perfect for staff, particularly with the ever increasing number of mobile workers. They’re not bogged down in the office and can freely travel between meetings, clients and in some cases countries. From a business perspective, giving your workforce this level of flexibility results in a more productive, engaged and happy workforce. However, it may give lead to internal IT headaches, since it could open a debate over the levels of security, compliance and data protection that cloud solutions such as Skype for Business provide online. The financial services industry has had to deal with these issues for a while. It has been a requirement for some time that calls relating to trades need to be recorded to prevent any potential market abuse. MiFID II (Markets in Financial Instruments Directive) will be extending this to calls from everyone in the advice chain, an additional burden on compliance departments. In addition, the US has been recording mobile calls since the introduction of Dodd Frank. In the past, a large number of firms simply banned client communication from mobile devices as they could control their own fixed landlines. However, it’s not just financial services that need to monitor and record calls. We have seen a rise in non-financial service businesses looking for call recording solutions to increase staff productivity whilst improving the customer experience. Any business that buys or sells over the phone might find themselves one day dealing with an irate customer contesting an order. This means someone has to find out which side originated the mistake, if any. 
If the call wasn’t recorded then it can become a battle of ‘he-said, she-said’, with no real possibility of getting to the bottom of the issue or being able to learn from any incidents. The only way to do this is to make sure that every call is recorded, no matter the device or location. [easy-tweet tweet="Where can businesses store the high volumes of call data being collected?" hashtags="cloud, security, business"] Recording mobile and online calls is crucial for compliance, but once the call is recorded, what next? An average day on a trading floor or in the sales department of a busy company can result in hours and hours of conversations. Where can businesses store the high volumes of call data being collected? All of these conversations have to be stored and easily retrieved when an issue arises. The sheer expense and compute-intensive nature of storing hours of calls can be beyond many businesses, especially when margins are already being squeezed. The cloud is an obvious solution, but some businesses are still concerned about security. If it’s not tangible and easily locked away like sticks upon sticks of memory, it’s not hard to see why it doesn’t feel particularly safe to some. The challenge for any business, then, is how to store the valuable data being collected while meeting regulatory compliance and protecting the business from online security threats. An increasing number are looking to encryption services to prevent any mishaps or leaks of customer and trade data. As businesses become more and more flexible, they are still having to deal with continued waves of rigid regulation and an increasing number of online security threats, often in the form of illegal hacks. Making sure that there is a solution that will keep companies compliant and protected is of paramount importance – no matter the device, there must be a simple way for staff to record their calls and encrypt the data in the cloud. There may be stormy weather, but a solid solution should keep companies in the clear. Register for Cisilion and TeleWare’s joint event on 17th August, where speakers will be talking about security and compliance in the cloud, ROI and Skype for Business. ### The stratospheric rise of cloud-based video services In recent years, a bewildering array of X-as-a-Service acronyms has sprung up. Cloud computing has made it possible for UK businesses to tap into an increasingly wide range of services provided over the Internet instead of on-premise, with all the advantages that offers. According to the Cloud Industry Forum, over three-quarters of UK-based business cloud converts use at least two cloud-based services and one in eight have deployed five or more. As UK organisations’ engagement with the cloud deepens, sifting through the various X-as-a-Service options can be confusing, especially since many capabilities are available as more than one option. [easy-tweet tweet="A company with its own streaming service powered by a vendor is a good example of #IaaS for #video"] A good example is video. Video is increasingly a part of the way we do business today. As familiarity with recording and watching work-related videos on our smartphones and other devices grows, so too does the expectation that video, social and mobile tools will be commonplace at work. Video is becoming part of the business fabric, used not just for the chief executive’s speeches, but for e-learning, knowledge sharing, job interviews, compliance, call monitoring, marketing and more. So how can businesses get video-ready?
There are a number of cloud-based options available today: Software-as-a-Service This is the biggest cloud market and the fastest growing too. SaaS lets an organisation license and use third-party applications through nothing more than a web browser, removing the need to download or install software. It eliminates all the related expenses of hardware acquisition, provisioning and maintenance, as well as software installation and version upgrades. A corporate video portal is a good example here: the SaaS provider takes care of everything video-related, including all the underlying technology, the user experience and the user interface. All the company needs to do is start using the product. Here SaaS offers a one-stop solution for a business that wants an out-of-the-box way to use video for corporate communication and knowledge sharing. Infrastructure-as-a-Service IaaS exists for businesses that simply want remote access to data centre infrastructure. They want to build their own capabilities and just need the power of the cloud to scale up computational power, storage, and networking. Here, companies are responsible for managing their own software infrastructure, including operating system updates as well as their own applications and data; the provider takes care of provisioning the virtualised hardware and enables elastic scaling of these resources. A company with its own streaming service, powered by a vendor like Amazon Web Services or Microsoft Azure, is a good example of IaaS for video. The company licenses virtualised infrastructure from the provider, and is then responsible for deploying and managing the software to do everything video-related itself. Both DevOps teams and developers are required to deploy and maintain this solution, bringing in the required developer resources as the solution is scaled up. X-Platform-as-a-Service PaaS is roughly halfway between SaaS and IaaS—building on top of IaaS, it adds the capabilities to handle software and scaling seamlessly, including abstracting and handling the operating system, managing servers and elastic scaling logic. PaaS enables organisations to build more sophisticated systems, more rapidly, without worrying about the underlying infrastructure required. It offers far more flexibility and control than SaaS, and significantly stronger foundations and support than IaaS for building custom applications. In recent years we’ve seen the rise of the XPaaS: X stands for a more specific need or domain, taking the PaaS foundation to an even more skilled and specialised tier, whilst maintaining the flexibility required when innovating and building custom applications. It is the rise of the Specialised Cloud Platforms. For businesses wanting to quickly create their own video applications or workflows, Video Platform as a Service, or VPaaS, is an ideal choice. VPaaS is a self-service specialised cloud platform enabling companies of any size and from any industry sector to quickly prototype, build and scale video-centric applications and experiences and easily integrate video as a native data type into existing business workflows. VPaaS provides an end-to-end platform as a service for efficiently working with video as a central data type in your workflows. It can be used in any scenario where video is required to enhance business needs, whether for highly secure and private environments like government and surveillance, consumer online media and entertainment sites, remote patient care, or learning and training applications.
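To give a flavour of what “video as a native data type” looks like to a developer, here is a minimal sketch of a typical VPaaS workflow: ingest a source file, request a transcode, then poll until a playback URL is ready. The base URL, endpoints and field names are hypothetical placeholders rather than any specific vendor’s API, and the example assumes the Python requests package is available.

```python
# A minimal sketch of a VPaaS-style workflow with hypothetical endpoints:
# upload a source video, request a transcode, poll until playback is ready.
import time
import requests

API = "https://api.example-vpaas.com/v1"          # hypothetical base URL
HEADERS = {"Authorization": "Bearer YOUR_API_TOKEN"}

def upload_and_transcode(path: str, profile: str = "h264-1080p") -> str:
    # 1. Ingest the source video.
    with open(path, "rb") as f:
        media = requests.post(f"{API}/media", headers=HEADERS,
                              files={"file": f}).json()

    # 2. Ask the platform to transcode it to the requested profile.
    job = requests.post(f"{API}/media/{media['id']}/transcode", headers=HEADERS,
                        json={"profile": profile}).json()

    # 3. Poll until the job completes, then return the playback URL.
    while True:
        status = requests.get(f"{API}/jobs/{job['id']}", headers=HEADERS).json()
        if status["state"] == "complete":
            return status["playback_url"]
        if status["state"] == "failed":
            raise RuntimeError(status.get("error", "transcode failed"))
        time.sleep(5)

if __name__ == "__main__":
    print(upload_and_transcode("town-hall-recording.mp4"))
```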
The video industry is evolving fast. By using VPaaS, developers can quickly incorporate capabilities like media ingestion, management, transcoding, delivery, playback, enrichment, and monetisation into their own workflows, whilst staying current with the latest industry standards, remaining compatible with all devices and able to scale seamlessly as demand grows. [easy-tweet tweet="#Video is an incredibly powerful tool, offering a highly personal and engaging way to communicate" hashtags="Cloud"] Video is an incredibly powerful tool, offering a highly personal and engaging way to communicate and enhance business needs. It’s also a demanding data type to work with, complex and bandwidth-intensive. As the market increasingly demands offerings that include video as a native data type, specialised cloud services are becoming a critical part of UK organisations’ content and product strategies. Choosing the right cloud strategy is the key to successfully adding video to any workflow. ### Developing for blockchain: Creating a secure environment Fintech news is one of the industry’s hottest topics right now and blockchain lies at the centre of many of the headlines. This digital ledger technology, which facilitates reliable and anonymous financial transactions, has been prophesied to overhaul traditional payment infrastructures the world over. Applications using blockchain could become commonplace in the future, as multinational banks and fledgling startups alike begin to embrace the fintech revolution. However, in order for this to take place, developers need a reliable and secure environment in which to create and test their solutions. [easy-tweet tweet="Applications using #blockchain could become commonplace in the future" hashtags="Fintech, IBM"] With this in mind, IBM has announced the launch of its security-rich business network on IBM Blockchain, powered by LinuxONE™. The new service will allow developers to execute network simulation and better understand blockchain capabilities in the context of a secured blockchain network. By accessing the security-rich business network on IBM Blockchain remotely, they will be given the opportunity to use a virtualized environment running on the most advanced Linux-only system. By using a blockchain service built on IBM LinuxONE™, developers will benefit from the highest levels of security and compliance, which is absolutely vital when dealing with sensitive financial information. Securing the future of finance Despite all the promise that surrounds blockchain applications, there is a significant hurdle that must be overcome first: security. All financial transactions, whether they take place informally between friends or as part of complex international monetary schemes, rely on trust. Blockchain, at this early stage of adoption, must secure the trust of its users or risk fading into obscurity. Fortunately, the technology has security at its heart, enabling self-verifying, immutable and corruption-resistant transactions. Enterprise-grade security, however, requires further safeguards to the blockchain itself, protecting the code against the wide range of potential IT vulnerabilities. IBM’s LinuxONE offers this protection against both horizontal and vertical threats, through advanced virtualization technology and tamper-resistant data encryption. Horizontal protection is vital for organisations dealing with blockchain technologies, as contractual agreements may be applicable across company lines, even amongst competitors.
In this multi-tenant scenario, it is imperative that the majority of financial information is kept private, with each party only able to see the data that is relevant to their shared projects. IBM LinuxONE takes tenant isolation to the next level by separating the tenants using firmware and hardware, resulting in actual physical separation. This technology is known as logical partitioning (LPAR) and enables businesses to share resources while still enjoying independently assured security. Blockchain applications running on LinuxONE are also protected from vertical attacks through a special type of LPAR technology known as IBM’s Secure Services Container. This takes all blockchain software and places it within a signed, trusted appliance-like vehicle – verifying that all software has not been tampered with. Once running, the container is locked down and safe from privileged user tampering. LinuxONE also encrypts all data and ensures that the relevant encryption keys are stored on tamper-resistant crypto cards. Blockchain is rightly generating a lot of hype in the fintech community and the wider industry. If the technology is to reach its full potential, then developers need to have the freedom and security to create and test a wide variety of applications. The security rich business network on IBM Blockchain offers developers the opportunity to create innovative new solutions with the knowledge that they are working within the highest possible security parameters. [easy-tweet tweet="The #IBM #Blockchain Security Rich Business Network offers developers the opportunity to create new solutions"] If you’re interested in working with the security rich business network on IBM Blockchain, sign-up here to try the complimentary beta. ### Why platform is the least important part of your cloud journey For most organisations these days the question is no longer whether to adopt cloud, but when is the right time and what services to move. Like any other major infrastructure change, it needs to be driven by a compelling event, such as a move to new premises, a need to refresh your existing infrastructure, the end of an outsourcing contract or a major organisational change. If you’ve recently made a big investment in your data centre, wholesale migration to cloud is probably not for you at the moment. [easy-tweet tweet="Moving to the #cloud is in effect just another #infrastructure #migration project"] Moving to cloud is in effect just another infrastructure migration project – with one further complication, which can make it three times as long and five times as complicated as you originally thought. The good news is that the technology decision is the easy part! Having helped many organisations implement various types of cloud over the last five years, and reviewed a range of different services, I’m happy to say that the actual platform you choose doesn’t really matter. Heresy, particularly from a managed cloud provider? Perhaps, but it’s based on practical experience and an in-depth knowledge of what’s out there in the market. Let me explain. I believe there are four requirements for a successful move to cloud. First, you need a vision that everyone can understand and align to, as well as the compelling reason for the project to go ahead which I discussed earlier. Second, you need to get people on board throughout the organisation, from commitment at the top to support from the team at the coal face. 
Change is always difficult and particularly with cloud, as your existing people will be worried that their jobs are at risk so may not fully commit to the project. It may also be that you are migrating to cloud because you have existing people problems in your IT service delivery. If people are going to support the project, there’s got to be something in it for them, which generally means job security, new skills and hopefully recognition and increased salary. To help with this part of the plan, the SFIA provides an excellent model for IT staff alignment which will help you assess what capabilities you have and what you need, develop and retain staff and reduce project and operational risk. Communication is also vital. Too much is never enough, as your staff will always assume the worst when there is silence. Keep them informed throughout the change process. Third, you need to get your processes aligned with the cloud provider; when dealing with the major public cloud providers, unless you are a huge company or the government, it is unlikely that they will change their existing processes to suit you. An excellent starting point is a business and IT alignment review to ensure that your organisation has accurately defined the service levels it requires for the key operational processes that IT supports in order to fully understand their cost, performance and availability implications. We find many organisations operate their IT without defined and agreed service levels, or have defined service levels but no way of measuring them to ensure they are being met. Once you have defined what services you need, you need to decide which can usefully be provided via cloud and which to retain in-house.  If migrating to cloud, you also need to ensure that your current IT operations processes are compatible with your chosen cloud provider. The final stage is to choose which platform is best suited to your needs. And if you’ve done your preparation correctly, it really doesn’t matter which platform you choose, as all the technology is pretty good. We’ve looked at all the leading public cloud services and they’re all very capable, although billing models vary. It’s important to check the small print, particularly the SLAs offered, and remember that you will still need to monitor performance against the SLAs yourself to ensure you receive the contracted service. Some legacy or bespoke services, or those where you need a non-standard SLA, may be difficult to transfer to a public cloud service. However, you can still potentially manage them from a single point. [easy-tweet tweet="Some legacy or bespoke solutions may be difficult to transfer to a public #cloud service"] So don’t be confused by all the providers constantly introducing new services. Microsoft, for example, introduced or updated more than 500 services on Azure last year. Begin with the vision, the people and the processes, and the platform decision should be straightforward. ### Cloud Migration - #CloudTalks with Velostrata’s Ady Degany LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Ady Degany from Velostrata about cloud migration projects and hybrid technology. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Velostrata software is deployed within virtual appliances, and installation requires just a few steps. First, download and deploy the Velostrata OnPrem virtual appliance in the data centre. 
Second, leverage Velostrata to create a Virtual Private Cloud, and a Virtual Private Network to securely link the VPC to your Velostrata OnPrem appliance. It’s important to note that all traffic between these points is encrypted end to end, both in flight and at rest. Third, deploy a vCenter Plug-In from within Velostrata OnPrem. Then, log in to the extended vCenter and click to automatically deploy a pair of Velostrata Edge virtual appliances in active/active configuration into the VPC. Each pair supports 50 concurrent VMs, and you can add more for scale. Velostrata provides comprehensive tracking and performance reporting, so you have a complete view of your applications. ### Finland: Setting the benchmark for a new era of smart mobility The global market for intelligent mobility is predicted to be worth around £900bn by 2025, with the potential to reduce the amount of cars needed on urban roads globally by up to 20 million vehicles per year and resulting in emission savings of 56 megatonnes CO2 per year. So what is it exactly? Intelligent Transport Systems (ITS) encompass a broad range of information and communication technologies that improve safety, efficiency and performance of transportation systems. These technologies are used to collect data which is analysed and distributed through different channels to varying terminals in order to improve the capacity of traffic networks and serve commuters better. [easy-tweet tweet="Digitalization of traffic infrastructure has led to a new #service intelligent transport model"] This digitalization of traffic infrastructure has led to a new service intelligent transport model: Mobility as a Service (MaaS). Currently commuters buy the means of mobility, such as a car or a bike, or the tickets for a specific transport mode. MaaS uses networked transport systems to introduce a pay-as-you-go approach, so when a commuter selects the service, price, timing and destination, the MaaS system recommends the optimum travel plan, which may include several means of transport. Providers, including train, bus and taxi companies, will transmit their information in a common format, which passengers will access using just one website, app or agent, regardless of their destination or chosen mode of transport. And this is where Finland is already leading the way, where road infrastructure and public transport already benefit from intelligent traffic solutions. In 2015, over 23 key organisations joined forces to establish the world’s first MaaS ecosystem for traffic, which is changing the face of public transport by harnessing the capabilities of public and private entities and creating a seamless, demand-based, and flexible travel experience for the public. Open data for the open road MaaS will only work if the system is fed timely, accurate data, which is something Transport for London (TfL) is facilitating through the opening up of its data silos to outside agencies. In doing so, the capital’s transport authority is working out how it can disrupt itself rather than disrupting others. TfL, which operates one of the largest contactless payment systems in Europe, has responsibility not only for public transport but also the capital’s roads and traffic lights – as well as an emerging MaaS operation of its own. This is growing all the time, and TfL expects to integrate various shared mobility options as the various transport service markets converge. 
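As a rough sketch of the MaaS idea described above – providers publishing offers in a common format and a single planner choosing the optimum chain of legs for the traveller – consider the following. The providers, prices and journey times are invented for illustration; a real system would consume live feeds and handle ticketing and payment as well.

```python
# A minimal sketch of MaaS-style trip planning: candidate routes from several
# providers in a common format, with one planner picking the optimum by the
# traveller's priority. All providers and figures are invented for illustration.
from dataclasses import dataclass

@dataclass
class Leg:
    provider: str
    mode: str
    minutes: int
    price_gbp: float

def plan_trip(candidate_routes, priority="fastest"):
    """Each candidate route is a list of legs; pick one by total time or cost."""
    def total_time(route): return sum(leg.minutes for leg in route)
    def total_cost(route): return sum(leg.price_gbp for leg in route)
    key = total_time if priority == "fastest" else total_cost
    return min(candidate_routes, key=key)

if __name__ == "__main__":
    routes = [
        [Leg("CityBike", "cycle hire", 12, 2.00), Leg("Metro", "underground", 18, 2.40)],
        [Leg("Metro", "underground", 40, 2.80)],
        [Leg("CabCo", "taxi", 22, 14.50)],
    ]
    best = plan_trip(routes, priority="cheapest")
    for leg in best:
        print(f"{leg.mode} with {leg.provider}: {leg.minutes} min, £{leg.price_gbp:.2f}")
```

A service like Whim layers payment, ticketing and provider settlement on top of exactly this kind of aggregation.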
Other cities are outsourcing this work to third parties, with Montreal and Denver contracting with Xerox, the company best known for its office automation products. It already processes more than a million public transport tickets every day alongside 16m parking fines per year. [easy-tweet tweet="Xerox’s research indicates that 51% of Europeans will decide where to live on the basis of mobility services."] Its back-end systems collect various data from transport provider points in each city – which in Montreal’s case is split across 19 different companies – and use them to analyse how residents are getting about. This analysis helps authorities plan new routes, work out where transport assets need to be deployed, and calculate the most profitable services. This will become increasingly important, with Xerox’s research indicating that 51 per cent of Europeans will soon be deciding where to live or work primarily on the basis of mobility services. The need for uniformity MaaS needs just four things to thrive: traffic data, trip planners, analytics and a payment system, but the rapid introduction of competing services means there is the risk of a fractured system. MaaS Global, the world’s first MaaS operator, argues that there needs to be a small number of interfaces (APIs) for universal data sharing that work not only on a city-wide or national scale, but also from country to country. This year, starting in Helsinki, the operator will be rolling out a service called Whim. Whim fulfils every requirement of true MaaS, not only unifying timetables, transport options and congestion data, but also handling payments and distributing the proceeds to the transport providers. With a single app, customers will be able to book a journey across multiple modes of transport, including hired cycles, and the system will even reward them for choosing ‘greener’ methods of transport where a lower-carbon option is available. MaaS Global is already talking to other cities around the world, with the potential to roll out its services on a global scale. Reinventing the car Of course MaaS is not just about urban mass transit: there are some places that trains and buses still do not serve, for which a timeshared car or private vehicle is still the only option. It is for this reason that Renault-Nissan, the company behind the Zoe and Leaf – the biggest-selling electric cars in the world – is working on cars specifically geared towards sharing. It is also looking to roll out autonomous driving at every price point within the next ten years – not just at the high end. So, are we in the middle of an auto revolution? Rightware, the company behind some of the automobile industry’s most striking user interfaces, believes so. As commuters we rely more and more on screens in every mode of transport – not only on the instrument panel, but also in the form of tablets and smartphones. The company sees a future in which these work seamlessly, presenting a unified view of your journey, whatever device or mode of transport you happen to be using. Rightware’s UI design application, Kanzi, is responding to this change, as the company starts to focus less on graphics excellence, and more on connectivity by any means, including TCP/IP and Bluetooth, between the mode of transport and passengers’ or drivers’ personal devices. The shift in thinking required to bring this about – and other emergent trends in mobility – is so dramatic that Renault-Nissan sees a talent war developing within the automotive market.
The company argues that the demand for robotics and data analytics expertise is particularly hot, which proves that the time for innovation is now. ### Death of the Speeding Ticket Right now, the driverless car that takes you from A to B at the press of a button, parallel parks for you and which never gets lost, seems more likely than ever. The next generation will be incredulous that people actually drove themselves; they’d also struggle with the notion of speeding since an autonomous car will always observe national speed limits, regardless of how much of a hurry you might be in, or how much you shout at it. With various reported estimates for the number of connected devices and things – possibly including driverless cars - predicting exponential growth, network systems are only set to become bigger, more pervasive, and increasingly more complex. [easy-tweet tweet="Network systems are only set to become increasingly more complex." hashtags="cloud, IoT"] Everything at cloud speed Even today, everything is about creating and delivering a rich user experience, at ‘cloud speed’. The need to innovate, build value and scale services has become critical for any organisation, especially when confronted by the ‘born-in-the-cloud’ start-ups blazing a trail across established markets and disrupting many businesses in the process. Inevitably, the sheer volume and velocity of data needed to support digital services continues to impact and stretch traditional networks. Creaking under the strain of the tremendous bandwidth and workload demands of applications and resources, the data centre network must be able to scale, perform tasks faster while anticipating and adapting to change. While data centres have a long-established presence in many organisations as a technological and operational backbone, not all networks are made equal.  Often built with overlapping generations of technology ‘sprawl’, adding layer upon layer of operational complexity; legacy networks typically end up constraining businesses with their rigid infrastructure, cumbersome operations, misaligned cost structures and inadequate, outdated security capabilities. The evolved data centre As data traffic, either from devices or other systems and ‘users’ continues to escalate, data centre infrastructure investments will have to focus heavily on a new network strategy, lowering costs while increasing capacity, service delivery and service innovation. The evolved data centre network has to adapt, scale and evolve to respond to changing market conditions, dynamically and quickly. So what does the future hold for data centre design? Here are my top four suggestions: 1. OPEN: As businesses migrate towards network virtualization and the cloud, no single vendor will have a solution for it all. But an open, multi-vendor environment is an example of how the I.T. community is building flexibility and freedom of choice right into the heart of the network. And as networks experience a variety of changes, an open infrastructure not only provides best-in-class solutions but can mitigate risk, manage contingencies and be flexible enough to grow and evolve with the business. 2. SIMPLE: The promise of a programmable, virtualized platform that builds automation throughout the network and which can be tailored to manage service delivery across all environments is a key aspect in driving a successful cloud strategy. 
Highly scalable, secure and ready to adapt to customer needs, cloud platforms can provide significant agility for enterprises and service providers operating in an increasingly competitive marketplace. Applications and services can be re-configured, re-deployed and delivered on industry-standard hardware, with virtual devices programmed for building self-service, automated capabilities, at ‘cloud-speed’. 3. AGILE: Software-Defined Networking (SDN) is a network technology that provides the means to simplify complex operational tasks through automation. By leveraging software control capabilities to automate key networking functions, SDN can minimize or entirely eliminate many labor-intensive tasks. By creating a highly agile network infrastructure, new services and applications can be rolled out more quickly, more efficiently and with less risk, all at the click of a mouse. 4. SECURE: All organisations are threatened by a constantly changing landscape, with the surface area for cyber-attacks set to become larger than ever. Changing how and where to deploy security in the network becomes even more critical, as threats exist not only outside the network but also, more than likely, already inside it. The traditional ‘castle’ method of relying solely on network security at the perimeter is no longer enough. To counter emerging and unknown risks, a fully automated approach is needed which embeds security (both physical and virtual) within the entire network at all levels, not just at the edges, and which dynamically adapts to new threats. [easy-tweet tweet="So what does the #future hold for #data centre design?" hashtags="cloud, IoT"]
### The growing importance of age checks for online businesses
The internet has changed the world as we know it. It’s 2016 and you can now buy practically anything online – clothing, groceries, flights abroad and even family pets! The internet has it all. [easy-tweet tweet="UK retailers are still failing to verify the age of customers online before selling them age-restricted items"] But, as the world enters a more digitalised era, and with increasing numbers of businesses expanding their operations online, what happens when the goods or services they offer carry an age restriction? This is where online age checks come in, and it’s an issue that has been gaining increased awareness of late. In fact, earlier this year the Queen mentioned online age checks in her speech, while location-based dating app Tinder announced last month that it was upping its minimum user age in the face of mounting public pressure. But Tinder isn’t the first company to come under fire for not doing enough to protect youngsters from the perils of the internet. Earlier this year, Amazon faced heavy criticism, and arguably suffered reputational damage, when it came to light that the 16-year-old who was convicted of fatally stabbing Aberdeen schoolboy Bailey Gwynne had bought the knife on Amazon. To make matters worse, no age checks were carried out – not even when the item was delivered; it was instead left in a shed. This has led to promises from the government to crack down on those not adequately protecting under-18s from accessing age-restricted goods and services. Businesses now need to review the measures they currently have in place and ensure these are stringent enough. After all, no one pities the off-licence caught selling cigarettes and vodka to 14-year-olds, so why should online businesses be any different?
Responsibility lies with business As online age checks grow in awareness, more and more businesses operating on the web are beginning to realise the responsibility lies with them to have effective age checks in place. Accusing parents of ‘a lack of control’ over their children’s use of the internet is diminishing in its validity as an excuse for those businesses trying to dodge responsibility. Sifting out customers who do not meet the legal minimum required age is imperative in helping businesses to stay on the right side of the law. In order to do this, websites need to have online age-checking measures in place. As with any service adopted to protect a business, these measures need to be monitored periodically to ensure they continue to meet business needs. As technology and regulation advance, novel ways to check the age of customers are being introduced. This means businesses can benefit from a fluid approach to age checking – installing multiple technologies on their sites, or using service providers who adopt new technologies as they become available. What could happen to a business that fails to verify its customers’ ages? The implications for a business that fails to verify the age of a customer before granting access to anything with an age restriction can be serious. And yet, even with the Amazon revelation, another study since found some of the UK’s biggest retailers are still failing to verify the age of customers online before selling them age-restricted items. These scandals can leave a permanent mark on a business and besmirch its reputation. There are also legal implications to consider, and these cannot and should not be taken lightly. For any business selling or granting access to age-restricted goods, content or services to anyone under the minimum required age, individuals deemed responsible could face up to two years’ imprisonment and an unlimited fine. What are the downfalls of the current age-verification checks online businesses are using? Once upon a time, merely asking “Are you over 18?” sufficed as an age check for businesses that sold age-restricted goods and services. But times have changed and, as we now live in an era predominantly shaped by information technology, businesses operating within this sphere will have to find technology that can mirror the age checks relied upon in the offline world. But unfortunately, this is not yet the case in many instances. At present, there are many websites that still only ask a customer to click a box to confirm they are over 18. Similarly, there are also many websites that simply ask a customer to enter a date of birth. There is no mechanism in place to verify whether or not the date of birth is actually correct or has been falsified – they only calculate whether or not the date entered is 18 or more years ago. It’s fair to say that neither of these measures is particularly robust, as the short sketch below illustrates. Then there’s the delivery of the items. For a number of businesses operating online, a third-party courier is used to deliver any goods. Because of their third-party status, the courier may not feel obliged to ask for proof of age from customers of another company, while others may not have the processes in place to check age. This is demonstrated in the recent study undertaken by auditor Serve Legal, which tested 1,000+ retailers that home deliver alcohol and found over half (56 per cent) were not checking ID at the point of delivery.
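To illustrate just how little the self-declared date-of-birth check described above actually verifies, here is a minimal, hypothetical sketch of such a gate (it is not taken from any retailer’s site). It does arithmetic on whatever date the visitor types in; nothing checks that the date is genuine, which is exactly why the approach is so easy to defeat.

```python
"""Minimal sketch of a self-declared date-of-birth age gate.
It only trusts the claimed date of birth - there is no verification step."""

from datetime import date
from typing import Optional


def claims_to_be_over(dob: date, minimum_age: int = 18,
                      today: Optional[date] = None) -> bool:
    """Return True if the *claimed* date of birth implies the minimum age."""
    today = today or date.today()
    had_birthday = (today.month, today.day) >= (dob.month, dob.day)
    age = today.year - dob.year - (0 if had_birthday else 1)
    return age >= minimum_age


if __name__ == "__main__":
    # A truthful 14-year-old is blocked...
    print(claims_to_be_over(date(2002, 6, 1), today=date(2016, 7, 1)))  # False
    # ...but nothing stops them re-submitting the form with an earlier year.
    print(claims_to_be_over(date(1990, 6, 1), today=date(2016, 7, 1)))  # True
```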
The public are becoming more informed and more vocal about businesses failing to verify the ages of customers they sell age-restricted goods to. On top of this, the government will soon be making moves and introducing further measures to punish businesses not playing their part. Companies need to seriously consider a strategy and review the measures they already have in place to ensure youngsters are not gaining access to things they shouldn’t be. It’s of paramount importance that companies begin to introduce more stringent checks in order to ensure they keep their reputation, and ultimately their business, intact. What preventative online age-verification measures are available to businesses? Online age-checking mechanisms are available which can be implemented at the point that a product or service (which may be inappropriate for minors) is about to be displayed to a customer. An alternative to this, referred to as a ‘payment-gate’, is an age check that sits within the buying process and prevents a payment transaction from taking place with someone who is under age. For goods that are purchased online and delivered in the real world, the last line of defence is the face-to-face check at the point of delivery. Learning from the mistakes of Amazon, businesses should consider implementing a policy to ensure proof of age is presented upon delivery of items. It’s up to the business to consult with any third-party courier services they use in order to inform them of whether or not an age-restricted item is to be delivered and whether or not age-verification checks need to be done at the door. A business using its own delivery service needs to ensure staff are aware of the policy regarding the delivery of age-restricted items. These measures are pretty straightforward but it is important to remember they are not 100 per cent fool-proof. There are, however, more innovative measures emerging that allow businesses to implement more robust age-verification checks. One such example is AgeChecked – software that pulls information and data from a range of sources to verify the age of a customer. Easily added to websites, the software creates gateways that redirect website users based on their age, meaning that those who do not meet the minimum age requirement are prevented from purchasing goods that are age-restricted or entering sections of websites that host age-restricted content. AgeChecked appears simplistic - easy and quick for customers to use - yet it incorporates a wide range of age-checking methods working in over 160 countries worldwide. Designed to work seamlessly within shopping carts, payment pages, or content delivery, it is an example of the new generation of age-checking services that are emerging to meet the requirements of the online age. [easy-tweet tweet="With increased awareness online age-checking is not going away" hashtags="AgeCheck"] With increased awareness and it now being an issue on the government’s agenda, online age-checking is not going away. Preventing under-18s from accessing goods, services and content that are age-restricted is becoming a necessity for businesses. And now is the time to act – businesses that don’t, or that are slow on the uptake, will face not only reputational damage but may also no longer exist if a hefty enough fine is imposed. For more information, please visit: http://agechecked.com
### Looking for a step change in IT productivity?
Cloud is rapidly becoming the core platform for adding innovation and business value to an organisation.
We are often overlooking some of the more obvious places where cloud can have a much bigger and immediate impact on IT productivity. Areas such as mobile solutions, the Internet of Things and cognitive computing can revolutionise the business processes. So where is the cloud likely to have the biggest impact on IT productivity in 2016? We have identified the following business cases where cloud can provide a step change in efficiency and effectiveness, enabling business and IT leaders to not only improve IT productivity but seize the opportunity to progress their enterprise capabilities and their personal careers. 1. Making mergers and acquisitions seamless In the intense negotiations around mergers, acquisitions and divestitures, cloud services might not be top of mind. However, they can make these processes much more painless in a number of ways. They can help organisations to realise synergy benefits quicker, simplify integration accelerating the change programme, reduce costs through efficiencies, mitigate costly migration investments and encourage financial flexibility. Any merger or acquisition is a critical business moment with complex processes to be negotiated, so it’s worth looking at how these can be simplified. 2. Preparing for the unexpected A systems failure or security breach could hit at any time and while preparing for the unexpected may seem an impossible task, a lack of sufficient testing is often to blame. Testing via the cloud has a range of benefits including rapid access to resources, cloning and provisioning, shared access to testing resources and an expanded range of tests that can be covered, which can be useful for simulating the unexpected demands created by a malicious attack. [easy-tweet tweet="Testing via the cloud has a range of benefits including rapid access to resources..." hashtags="cloud, security"] Leaders should start by assessing their current test practices and services against the cloud testing possibilities available – which could help the business to be better prepared in the event of an incident. 3. Building an agile edge Customer expectations are ever-growing and as such, businesses are under constant pressure to provide continually improved customer experiences. This requires a systematic and iterative approach to leveraging a range of technologies – meaning fast, agile development is required, as well as rapid access to new innovations. Business leaders are coming to realise that cloud enables a culture change to design-first thinking and agile practices – enabling them to better manage unpredictable demands. 4. Increasing IT efficiency Business leaders responsible for IT are launching cloud-based transformation programmes to make a step change in IT productivity and increase the velocity of new function delivery. Cloud technology models provide well documented and ready to use, standardised IT services which allow organisations to focus on the service rather than the technology details. Increasingly, IT leaders are starting to review their existing systems to assess which can be moved to cloud technologies unchanged, and which require re-platforming, reworking or replacement. 5. Addressing privacy concerns With the proliferation of data and increasingly connected systems, today’s business leaders are faced with mounting regulatory requirements, putting growing pressure on IT systems. Failure to meet these requirements can have critical business impact. 
One solution business leaders are looking to in order to manage this is moving to a private cloud – dedicated to their organisation only, enabling greater control and privacy. 6. Implementing business transformation   Whether the business is responding to a perceived competitive threat, implementing a major new programme, or appointing a new executive, business change is an ideal time to consider the value of cloud services. [easy-tweet tweet="#Cloud services simplify collaboration with partners, provide a framework for developing innovative ideas and enable innovative ideas to be rapidly deployed and scaled up in response to changes in demand"] Cloud services simplify collaboration with partners, provide a framework for developing innovative ideas and enable innovative ideas to be rapidly deployed and scaled up in response to changes in demand – making business change a much more seamless, painless process. As these business cases demonstrate, business leaders are already recognising the power of the cloud and the opportunities it provides across various disparate functions. As we look forward to the rest of 2016, the cloud will only become more vital to organisations looking to revolutionise their IT operations, providing the IT department the opportunity to demonstrate its value beyond a support function to a driver of business change. ### How to keep your cloud safe In the past several years, cloud adoption has grown rapidly. The latest studies reveal that cloud adoption in the UK now stands at 84 per cent with companies using at least one cloud service. As investments in the cloud increase, so do concerns regarding security and the risks associated with storing sensitive information on cloud platforms. So what security essentials should a company consider when storing data in the cloud? Cloud security starts with the same three 'pillars' as internal network security: confidentiality, integrity and availability. Yet, businesses need to recognise that the cloud stretches these three pillars in new ways. For example, there is a greater attack surface whatever the delivery model. [easy-tweet tweet="Private #cloud is the most secure, it doesn’t compromise company policy but it’s expensive to do right." hashtags="business"] Private cloud is the most secure, it doesn’t compromise company policy but it’s expensive to do right. Community cloud involves shared infrastructure with unified security, compliance and jurisdiction requirements, although it can be restrictive. Public cloud is flexible from an adoption perspective, but you have to accept the policies of the service provider. Finally, hybrid cloud combines all these aspects, although success depends on the eventual service choice (x-as-a-service). Once you have identified the architecture that fits your requirements, there are further questions to ask. Are you able to answer the following with confidence? What are the controls on privileged administrators and how are they supervised? Where is data held? How is it held (encrypted/resilient/high availability)? Will legal obligations to protect company data be impacted if the provider has a distributed architecture (i.e. multiple data centres across different countries)? What about backup and archiving? What is the provider’s viability? Any probability of company failure or acquisition? Does the cloud solution integrate with the company’s IT infrastructure? Will the workforce be affected by how they access data? Certifications - who audits them and how frequently? 
Does the provider have provisions for disruption caused by attacks, and for business continuity and disaster recovery? One point businesses must be aware of - data security remains their responsibility. It is not transferred to the provider. No single security method will solve every data-related problem, so multiple layers of defence are critical, from access control, system protection and personnel security, to information integrity, network protection and cloud security management. As well as hackers targeting a specific cloud service or corporation, companies must also take into consideration the risks posed by employees. Research released by Experian showed that 60 per cent of security incidents were caused by employees; this risk is exacerbated further by staff working remotely or the use of personal mobile devices to access sensitive materials outside of the company network. Consequently, organisations need to implement a strong security and awareness strategy that includes acceptable usage policies for employees, enabling them not only to improve their cyber security behaviour, but to become true custodians of the company’s sensitive data, cloud or no cloud. [easy-tweet tweet="it is critical to make sure that #cloud infrastructure and disparate applications are integrated" hashtags="business"] Finally, it is critical to make sure that cloud infrastructure and disparate applications are integrated, yet independent of each other, so that the impact of any compromise or breach can be contained. This is a crucial step to securing the cloud across a business.
### Steps to secure apps in the cloud: From dev to production & beyond
Cloud security is a massive subject, usually focused on the security of the infrastructure itself. But what about securing the apps hosted and developed in the cloud? After all, the vast majority of modern apps are deployed in the cloud – and it’s a fast-growing segment. For example, according to Cisco research, cloud apps will account for 90 per cent of total mobile data traffic globally by 2019. [easy-tweet tweet="#Cloud #security is such a massive subject, usually focused on the security of the infrastructure itself"] Organisations are under intense and growing pressure to deliver web apps that satisfy global demand – from both customers and internal staff - and create differentiation and business advantage. This urgency means web application development can often leave certain aspects in second place, not least security. This problem is intensified when developing cloud-first applications, where changes are made multiple times each day, with the inherent danger of introducing security vulnerabilities far more frequently. Our own Web Application Vulnerability report found that just under half of web apps contain a high-severity vulnerability, such as Cross-site Scripting (XSS), SQL Injection or Directory Traversal, that could lead to data theft. Types of apps commonly in the cloud Clearly today’s enterprise IT focus is on moving to the cloud, but as millions have often been invested in their existing infrastructures it might be unrealistic to expect them to move everything to the cloud (although evidence shows movement in that direction). While web apps are arguably easier to move to the cloud, traditional thick-client apps are also following suit. Companies are offloading their security burden and getting more out of the cloud by putting their security infrastructure there.
Rather than using a company’s own infrastructure, a plethora of cloud services such as Office 365 and Google Apps make it cheaper, more effective and more convenient. With cloud security, organisations want the same peace of mind they had from their on-premise security infrastructure. Using a modern cloud vulnerability scanner makes pushing security to the cloud easier – where a scan for web apps was previously run in-house, a cloud scanner doesn’t have to take account of physical machines; plus as there’s less resource implications, admins don’t need to worry about making sure everything is ready from the IT side prior to running the scan. However, putting your sensitive data in the cloud can be a contentious issue - mainly of trust. So a mix of cloud/on-premise apps works for both SMEs and larger companies, the precise mix depending on resources allocated within the security team or IT team. Don’t forget, smaller businesses are still exposed to the same threats, they are just trying to find the most cost-effective security testing solution – and cloud security offers lower barriers to entry. Especially if they are a start-up with a cloud-first approach! Where to start? Large organisations can have thousands of web applications and web services to maintain –even for small businesses they can number over 100 - so where does the security team start? Remember it’s not just about identifying vulnerabilities, but it’s crucial to fix vulnerabilities based on how business critical the web application is. Key questions include: how severe is the vulnerability? Is it a business critical application? Is it an internal-only, or an Internet-facing web application? The most effective way is to approach an application as an attacker would - from outside in. Scan each application individually using dynamic testing across a wide vulnerability assessment area. A scanner will then produce a report based on the criticality of the app to your organisation. When dealing with hundreds of applications it’s often difficult to prioritise action, but clearly the first to fix should be the highly business critical apps. If it supports it, you will also receive a CVSS3 score which helps define the priority list further and is easy to add weightings to make it more appropriate to your business.  Armed with this intelligence, an accurate action list can be built for the team to start working on.  Save money by identifying flaws before they reach production Having a ‘let’s fix them later’ approach to vulnerabilities can be very costly and dangerous for organisations. With any vulnerability, the longer it remains in your design & build process, the more it costs to fix – the advice is clear, the earlier you start your security effort, the better. A web application’s staging environment should be as close to production as possible, therefore it’s recommended to do security testing at this point. If a vulnerability is identified, it’s always easier/cheaper to fix here rather than when the vulnerability makes it into production. It’s also possible to run an automated scan on just a subset of the app, even prior to staging, in the development environment (e.g. by looking for XSS flaws in a new feature a developer created).  Whether on-premise or cloud-native apps, the dev environment should be regularly checked for flaws using automated testing. 
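To make the prioritisation guidance above more concrete, here is a minimal sketch of ranking scan findings by combining a CVSS v3 base score with weightings for business criticality and internet exposure. The weighting scheme, application names and findings are assumptions for illustration, not any particular scanner’s output format.

```python
"""Illustrative sketch: ranking vulnerability findings by CVSS v3 base score,
scaled up for business-critical and internet-facing applications."""

from dataclasses import dataclass


@dataclass
class Finding:
    app: str
    vulnerability: str
    cvss: float              # CVSS v3 base score, 0.0 - 10.0
    business_critical: bool  # does the app support a critical process?
    internet_facing: bool    # internal-only apps are usually lower priority


def priority(f: Finding) -> float:
    """Simple weighted score: severity, scaled up for critical or exposed apps."""
    weight = 1.0
    if f.business_critical:
        weight += 1.0
    if f.internet_facing:
        weight += 0.5
    return f.cvss * weight


findings = [
    Finding("intranet-wiki", "Directory Traversal", 7.5, False, False),
    Finding("web-shop", "SQL Injection", 9.8, True, True),
    Finding("marketing-site", "Cross-site Scripting", 6.1, False, True),
]

if __name__ == "__main__":
    # Highest-priority findings first: the business-critical, internet-facing
    # app tops the list even before the raw severity scores are compared.
    for f in sorted(findings, key=priority, reverse=True):
        print(f"{priority(f):5.1f}  {f.app:<15} {f.vulnerability}")
```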
For some, automated scanning is more suited to a cloud model – where scanning web apps hosted in the cloud becomes easier, without the headache of installing or maintaining any software. Then later on, if all reasonable flaws have been fixed, a company can escalate their security efforts by manually pen-testing the application for logical/business logic security bugs. [easy-tweet tweet="In the race to get #cloud apps to market, #security must not be left behind"] Test early & often In the race to get cloud apps to market, security must not be left behind. Dealing with identified flaws - prioritised correctly - should be a critical part of your ongoing security cycle. Don’t forget… it’s always much more cost-effective for companies to test early and frequently. ### Digital healthcare: Securing your patients’ trust In the US, it is estimated that cyberattacks cost the healthcare industry $6 billion annually. Similarly, both private and public-sector healthcare firms in the UK are increasingly finding themselves targeted by hacks and other forms of malicious activity. Altogether the NHS has now been fined more than £1.3 million by the Information Commissioner’s Office for poor security practices. Clearly, as healthcare businesses begin to embrace more digital technologies, they also need to ensure that they install the relevant safeguards to protect patient data. [easy-tweet tweet="As #healthcare businesses adopt more digital technologies they must protect patient data" hashtags="Security"] In order to understand how to best protect healthcare information it is important to realise why cyberattacks are targeting this industry. Data stored by health firms can include email addresses, credit card data, employment information and much more that could be used to commit fraud. Medical records are also extremely personal, meaning that they should be protected even when there is little or no financial risk involved. The cost of a data breach could not only cost businesses millions of pounds in regulatory fines, but also untold reputational damage. Curing your security troubles with the cloud Modern healthcare businesses, however, are now challenged to keep patient data secure while also continually evolving their IT infrastructure. In the past, protecting information was relatively straightforward: making sure that filing cabinets were locked. Today, digital records can be accessed from all over the world and the rise of IoT healthcare devices promises more access points in the future. This means that a holistic approach to patient security is a necessity. One way that health firms can improve their IT security is by employing a reliable cloud vendor. At Zsah we provide managed cloud, managed IT, software engineering and consultancy to a raft of industries including telecoms, finance, retail and healthcare. Cloud computing can help healthcare firms by providing flexible, scalable IT solutions, but it can also bolster security. Long gone are the days when cloud computing was seen as inferior to on premise security protocols. Today, cloud firms ensure the most stringent safeguards are in place so their customers can meet the necessary compliance standards. With Zsah, all of our data stays within the United Kingdom and our firewalls use IPS (Intrusion Prevention), IDS (Intrusion Detection) and DDOS (Denial of Service) protections to secure the network and your data. 
We can offer managed cloud to take some of the pressure off of your in-house IT teams or help you to deliver private cloud solutions for the highest levels of privacy and security. The cloud also provides healthcare firms with an added layer of security in the event that unexpected disruption occurs. The disaster recovery options provided by cloud solutions mean that whether disruption is caused by a malicious agent or a natural disaster, core processes can be resumed quickly and you can continue to deliver the service that your patients rely on. [easy-tweet tweet="#Cloud computing is now able to deliver a number of #healthcare benefits and keep patient data secure" hashtags="Security"] Cloud computing offers many benefits for modern health firms, allowing medical professionals to access information all over the world, collaborate on cutting-edge research and maintain closer contact with their patients. Crucially, the cloud is now able to deliver these features while keeping patient data as secure as possible. At Zsah there are a number of cloud computing options available to healthcare firms, but one thing remains the same: The high standards of security that you and your patients expect. ### When Big Data goes Wrong Since the invention of the television, we as human beings have become accustomed to trusting the output of a screen. Military campaigns always have a media and propaganda element as part of the planning. The capture of the national broadcaster is always key in any rebellion or coup d’etat. However, the issue is trust. [easy-tweet tweet="Why do we tend to almost hypnotically fall for the messaging laid before us? #BigData #Security"] Why do we tend to almost hypnotically fall for the messaging laid before us? This is especially the case when it is using the right mix of colours and aims for confirmation bias (the part of our mind that makes instant judgments at speed). The Myth of the Data Scientist The word scientist, in most of our minds, conjures up a man in a white lab coat that deals in certainty and fact. A scientist discovers new frontiers whilst carefully documenting an outcome and sharing it with the greater world. The problem though is the so called ‘data scientist’ who is building these algorithms and analytical insight. For example, predictions of the Iraq War through to using Big Data to predict flu outbreaks. Does the wisdom inherent make the data analysis real? So what is a data scientist? Generally, a data scientist is a system developer or statistician; or as we are finding all too often, someone who has done a course in a lab using one of the main open source distributions, who has appointed themselves as the solver of all problems. Not unlike the snake oil salesman of Wild West times. The similarity to ERP Promises of old  Do a reasonable amount of googling and you will see the promises of ERP systems from many years ago that offered unparalleled insight and analytics to businesses. Unfortunately, with many of these promises the enterprise usually follows like sheep, based upon a high profile success case that incurs pressure from the board to adapt or die. So what is the big issue? The big issue is that companies small and large are betting on becoming the next Amazon or Netflix. The marketing hype and science, combined with the dark Machiavellian public relation types, have the general market believing that Big Data will save the world. 
Whilst there are big data and analytics projects in use today within both the scientific and machine learning community that can be hailed as a success, we have to temper the hype with caution. [easy-tweet tweet="Big Data has a chance to make a meaningful impact, but only if the rhetoric is calmed down" hashtags="security, big data, "] Big Data has a chance to make a meaningful impact, but only if the rhetoric is calmed down - the pressure cooker slowed - so that this technology has a chance to shine. Yes, we understand that storage vendors and cloud providers want to espouse Big Data benefits, but let’s not go through another round of hyperbole as we did with cloud. Our tips for a Big Data project: Start small, go with a little data and analyse with an open mind Think of an outcome you are aiming for and reverse the project plan from there Do not get sucked in by the rhetorical, a title is meaningless without the human occupying it being able to communicate clearly Choose the vendor you will go with wisely and make sure that they offer the cradle to grave support and consultancy, do not get stuck with point solutions Always check data regulations when handling sensitive or personal data that is going to an external cloud provider Everyone makes mistakes, those that claim never to or have elevated their status to demi-gods need to be avoided like the plague If something sounds too good to be true then it probably is Always check backup and data retention policies, check terms and conditions of engagement with a lawyer - watch out for the liability clauses Avoid the blame game where possible, limit the scope to as few external parties as possible Cost of everything and value of nothing, make sure that costs are accurately reflected. Do not become a never ending subscription with a low entry point to get you onto a platform ### Is multi-tenant cloud relevant anymore? In the early nineties multi-tenancy was relevant, a new breed of solution providers were emerging, led by Salesforce.com. The multi-tenant approach employed a software framework, which used a single instance of software running on a server that served multiple tenants; a tenant being a group of users who share a common access with specific privileges to the software instance. With a multi-tenant architecture, a software application is designed to provide every tenant with a dedicated share of the instance including its data, configuration, user management, tenant individual functionality and non-functional properties. In the 2000s, multi-tenancy underpinned almost all 'Software as a Service' (SaaS) enterprise application delivery – Salesforce was a pioneer of this approach. In fact enterprise software was designed and constructed with cloud deployment in mind. But why was this? Fifteen years ago infrastructure was expensive. There was no cloud – at least not like we have today. This meant companies wanting to offer SaaS applications had to build and run their own costly data centres. Virtualization technologies were still in their infancy and there were no DevOps tools; making it hard or even impossible to automate instantiation of new instances. Meaning even more manually intensive activities. Executives naturally tried to limit by means of software the usage of expensive resources such as hardware, software and people, that were needed in creating data centres. 
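For readers newer to the construct, here is a minimal, hypothetical sketch of the data-tier half of the idea: one shared schema and one shared instance, with every query scoped by a tenant identifier so that tenants never see each other’s rows. The table, field and tenant names are invented for illustration.

```python
"""Minimal sketch of multi-tenant data access: a single shared application
instance serves several tenants, and every query is scoped to the caller."""

import sqlite3


def setup(conn: sqlite3.Connection) -> None:
    conn.execute("CREATE TABLE invoices (tenant_id TEXT, number TEXT, amount REAL)")
    conn.executemany(
        "INSERT INTO invoices VALUES (?, ?, ?)",
        [
            ("acme", "INV-001", 120.0),
            ("acme", "INV-002", 80.0),
            ("globex", "INV-900", 999.0),
        ],
    )


def invoices_for(conn: sqlite3.Connection, tenant_id: str) -> list:
    # Every query carries the tenant_id filter; tenants share one schema and
    # one instance but can never read each other's rows.
    return conn.execute(
        "SELECT number, amount FROM invoices WHERE tenant_id = ?", (tenant_id,)
    ).fetchall()


if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    setup(conn)
    print(invoices_for(conn, "acme"))    # [('INV-001', 120.0), ('INV-002', 80.0)]
    print(invoices_for(conn, "globex"))  # [('INV-900', 999.0)]
```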
[easy-tweet tweet="Implementing multi-tenancy at both the application and database tier allows you to maximise utilisation of resources" hashtags="cloud, multi-tenant, "] Implementing multi-tenancy at both the application and database tier allows you to maximise utilisation of those resources. This is achieved by having more customers running on the same instances; limiting the number of instances required and allowing you to save on hardware and software licenses. The past fifteen years has seen an ongoing debate about application multi-tenancy. Discussions focus upon software architecture, data models, practical problems with operations, data separations between tenants and so forth. All good topics that are worth discussing, however, in my opinion, what would be more pertinent would be to question the relevancy of multi-tenancy in the everything cloud age. At the risk of upsetting cloud purists my answer to the title of this article is no. Times have changed Today, 15 years or more after the creation of application and database multi-tenancy, things are different; the economic necessity for multi-tenancy at the application tier has disappeared. We have huge clouds offering cheap infrastructure and tooling for automatic scaling in and out, creating new instances and tearing them down again without any of the manual interventions that used to be necessary. There are new services popping up every day, like databases being charged per transaction rather than per instance, which means you do not have to pay in the event the tenant is not using the database. You have availability zones offering service redundancy, ensuring that your application, services and database are – nearly – always available. Best of all, these cloud resources are very cheap, and there are lots of them. [easy-tweet tweet=" vendors could and should question the value of retrofitting multi-tenancy into their database and application layers" hashtags="cloud, multi-tenant, data"] With all these new infrastructure technologies available at a low price, vendors could and should question the value of retrofitting multi-tenancy into their database and application layers. It may quickly turn out to be a bad investment compared to building installation and upgrading automation scripts into the underlying cloud management system for handling quick, on-demand instantiation of new tenants. While there are many other reasons for updating existing software, simply adding multi-tenancy should no longer figure on the list. When building a completely new system, there is no reason not to design the application tier as multi-tenant, but it is not the overriding imperative it once was for a SaaS application. Rather than saving cost, as it did fifteen years ago, in some cases it may even add unnecessary complexity and overhead to the design of your application. In summary 1. Do not refactor multi-tenancy into existing apps just for the sake of it. 2. Focus on installation and upgrade automation, so no human intervention is required to: a) Instantiate new tenants b) Conduct horizontal scaling 3. For new or major refactoring, include multi-tenancy into the architecture provided this does not add unwarranted complexity and overhead to your software. Why would customers care? They should not care – all they should care about is the SLA. So why is it that customers do seem to care about a software architectural construct? 
Salesforce and other SaaS pioneers greatly succeeded in persuading customers that multi-tenancy is a feature of the product rather than an architectural construct originally created to save operational costs for the software providers themselves. Although I agree that fifteen years ago application multi-tenancy was important to lowering cost and therefore a great USP for the SaaS providers; today I’d say it doesn’t really matter. ### All dressed up with nowhere to load: The story of a very frustrated IT department and its quest for digital transformation Half a year ago, we could have told you that we were all on the cusp of a ‘business technology revolution’. A myriad of data-driven technology trends were highlighting a route to a ‘business process mecca’; a path, if chosen, that would enable businesses to maximise their potential and gain competitive advantage. [easy-tweet tweet="For many businesses, #legacyIT infrastructure continues to hold them back from innovating" hashtags="Transformation"] That supposed future vision is happening right now. We’re seeing the convergence of big data, the Internet of Things, 5G, cloud computing, and mobility. Business leaders and senior IT decision-makers alike are realising that having a superior network in place, one that can efficiently handle the huge data demands that come with the rise of these trends, is vital to business growth. A scalable, agile network infrastructure could be the difference between leading from the front, and total stagnation. Adopt to adapt IT departments are very aware of what is at stake for their companies. At Brocade, we recently issued a report which found that the vast majority of IT decision-makers (87 per cent) were part of organisations that had strategies in place to transform their digital operations. An even larger proportion (94 per cent) stated that their organisation’s CIO and IT managers valued the adoption of digital transformation in order to achieve business objectives. ‘Computer says no’ However, despite evidence that IT departments know what needs to be done, some businesses still find themselves falling behind competitors because they are failing to sufficiently adapt their approach and embrace a digital transformation. They are all dressed up with nowhere to load. The problem is a familiar one for fans of Matt Lucas and David Walliam’s hit comedy ‘Little Britain’: “Computer says no’. For many businesses, legacy IT infrastructure continues to hold them back from innovating on their terms. Despite knowing what they need to achieve, IT departments have found themselves having to say ‘no’ to new business opportunities far too often, instead focusing efforts and resources on ensuring their current legacy system can handle increasingly overwhelming quantities of data. In fact, if IT departments could spend less time ‘keeping the lights on’, they could devote more time to creating value, reducing costs, and increasing revenues as much as 12 per cent according to our research. How to start saying ‘yes’: The New IP approach Digital transformation can sound like a daunting proposition, but if broken into digestible steps, it can be much easier to implement at a pace that is considerate of struggling IT staff, and their budget. There are a number of achievable changes organisations can begin to make that will ensure their IT departments are finally able to start saying ‘yes’. Firstly, nothing is more important than adopting The New IP approach. 
Modern New IP technologies are software-enabled and based on programmable networks, unlocking the power of the network as a platform for innovation. The New IP approach gives organisations a much higher ceiling for adapting to modern technology demands, freeing up their IT departments to build upon what they have and evolve with technology, rather than attempting to cope with more from each advance. The New IP is a class above the legacy networking systems that are holding companies back. Introduce automation across the board However, just implementing a new networking system isn’t going to make your business a digital enterprise overnight. Even once the system is layered with all the required software, it will need to be agile enough to respond to demands within milliseconds. Consider online retailers: as it hits a busy sales period and shoppers are maximising their own resources making purchases online or via mobile, the retailers’ IT systems will need to be able to handle a greater demand on the servers, provisioning more resources if need be so that customers can continue to have a smooth shopping experience. Additional recent Brocade research highlighted how a failure to optimise application performance could mean missing out on an 11 per cent increase in revenue as disappointed customers decide to shop elsewhere or spend less. In a traditional legacy system scenario, IT teams would have had to manually raise tickets which moved from one siloed IT focus to another. However, with the right network and algorithmic solutions in place, it’s possible to automate this process. Enabling new server capacity to be freed up or new cloud space to be provisioned as soon as it’s needed without waiting for a never-ending line of tickets to be dealt with means that the business can maximise its potential. Shut down the ‘no’ programme Senior decision-makers need to ask themselves if they are truly giving their IT departments the support they need to help businesses perform at their peak. The convergence of digital technologies can be seen as a series of challenges but it should ultimately be regarded as a collective wealth of opportunities. IT departments don’t have to be stuck in process, and should be empowered to innovate and help business decision-makers deliver true business value. [easy-tweet tweet="Simply implementing a new networking system isn’t going to make your business a #digital enterprise overnight"] By making your network agile and scalable, bridging IT silos, and enabling end-to-end automation of cross-functional IT workflows, you can shut down the ‘no’ programme for good and move into the business of ‘yes’. ### Teamwork makes the Dream Work: Democratising mobile app development Whether remotely ordering a Chinese takeaway or approving employee expenses, end-users are ultimately after one common thing, simplicity. This is one of the main success factors for a good mobile app, and as the convenience of easily using a mobile app anytime and anywhere continues to top the ‘must have’ list, simplicity should definitely not fall outside the app development equation. [easy-tweet tweet="convenience of easily using a #mobile #app anytime and anywhere continues to top the ‘must have’ list, simplicity should definitely not fall outside the app development equation."] With this in mind it just doesn’t make sense that so many businesses still get wrapped up in so much complexity, when instead the focus should be on turning simple ideas into working business apps. 
This is nothing that can’t be resolved with some good old company incentives and fun get-togethers, where most employees would be happy to brainstorm plenty of useful and feasible ideas for apps that make life at work easier and more efficient. Of course, there are many entrepreneurs and like-minded people who can figure out the ‘what?’ – which applications would make a difference and make life easier for everyone. The only thing getting in the way is the ‘how?’. How do we go about creating these genius mobile applications? Well, reassurance that these ideas can be achieved is the first step. [easy-tweet tweet="Developing a #mobile #app shouldn’t be any more difficult than creating a web app or even an engaging design mockup"] We need to get rid of the idea that engaging and successful mobile apps can only be created by knowledgeable and experienced - or you may even dare to say ‘geeky’ - developers. Developing a mobile app shouldn’t be any more difficult than creating a web app or even an engaging design mockup. Planning the best way to take advantage of the native capabilities of different platforms does mean burning some additional brain cells, but luckily most of the code generation can be automated, or enabled with a touch of JavaScript. So, caught up in the excitement of the broader app development possibilities, you may be asking, “Should every app idea be pursued?” No, not so fast. It’s important to judge an app beyond its initial idea, otherwise you’ll be judging a book by its cover, and we all know that this approach can lead to unexpected and undesirable consequences. The proof is in the pudding. Therefore, being able to test the theory - even if incomplete - makes for an informed, easier and wiser judgement of whether there is good enough potential to move forward. Before tackling all the details of the app to be developed, app creators need better ways of roughing out their ideas to generate interest in them. With the democratisation of mobile app development, visionaries who want to innovate within the company are able to sit down next to the CFO and showcase an app that was designed in an afternoon and made accessible from the app store, with the capability of running on an iPhone or Android device. Although this would not yet be ready for production or linked to enterprise systems, it would be fleshed out enough to show the potential before an investment decision is made. This same scenario is what Kony had in mind when it released a free starter edition of the Kony Visualizer 7 mobile app design and development portfolio. By giving away this valuable solution, Kony is offering the ability to create and show a native working prototype to a boss or an investor without first needing to have the budget for an enterprise software license. One of our customers, a European energy conglomerate whose name I’m not allowed to mention on this occasion, recently hosted a hackathon for more than 500 employees from across the company, including business units, HR, Learning, Marketing, etc. The head of IT at this major European energy provider called upon the support of people from across the business to come up with innovative ways to perform various activities better, from improving customer service to finding new ways to work more efficiently. The participants were put into over 100 cross-functional teams that used Kony Visualizer to develop mobile app designs and prototype apps.
Company leaders were able to test the resulting app previews as working code, across multiple devices, before deciding which one to go for. [easy-tweet tweet="Mobile is a powerful innovation catalyst across every aspect of the enterprise " hashtags="app, business"] Mobile is a powerful innovation catalyst across every aspect of the enterprise – from providing mobile apps for sales teams to promoting greater efficiency, enabling increased productivity, and equipping field teams with the ability to deliver faster responses and improved services to customers. By democratising the mobile app development process, companies can empower employees across the organisation, not just the IT department, to play a key role in innovating and transforming the business. ### Setting up a cloud call centre? Here are 10 questions you need to ask yourself now Despite the increasing number of customer service channels available, most people still turn to the humble telephone when they want to contact a business. As such, your business’s contact centre remains central to maintaining customer interactions and relationships. For this reason, as managers continue to wake up to the technology’s many advantages, cloud infrastructure has become the fastest growing sector of the call centre market. In addition to the cost benefits, cloud contact centres offer scalable ‘anywhere, anytime’ access, easy integration with existing systems, and a wide range of features that help businesses deliver excellent customer service. If you’re one of the many managers contemplating leveraging a call centre in the cloud for the benefit of your business, here are 10 questions you need to ask yourself now. 1. Do you want a managed or non-managed service? Opting for a provider that offers a managed service means they will work with you to understand your requirements before building a system to meet them. On the other hand, if you choose a non-managed service provider, you will need to have a much greater understanding of your exact requirements and preferred network configuration pre-purchase. So if, for example, you don’t have staff capable of configuring a cloud call centre network, a managed service will free up time and resources, and make things a whole lot easier. 2. What telephone numbers do you need to include in your network? When creating any telephony network, knowing the numbers you want to use and ensuring they are available for your chosen provider to work with is fundamental. If you are working with a provider offering a managed service, this is likely to be part of the installation process. If it is not, you will need to compile a list of all your numbers to ensure your provider can use them to route all incoming phone calls on your behalf. 3. Do you fully understand all your technical requirements? [easy-tweet tweet="When moving to a #cloud telephony system, you need to ensure your network is geared up to handle the increase in #traffic it will encounter"] When moving to a cloud telephony system, you need to ensure your network is geared up to handle the increase in traffic it will encounter when your voice channel moves to Internet Protocol (IP). It is important to consider what type of interaction your cloud provider will have with your existing infrastructure. Again, if you opt for a managed service, your provider will typically provide support to ensure your network meets the technical requirements needed. 4. Who will need access to what and when? 
With the implementation of any cloud call centre, different people will need different levels of access and the use of different features. Therefore, after ensuring your network is ready for the cloud, you need to consider your approach to access rights. Pertinent questions at this stage include: what features will your middle and senior managers need access to? Are these different to the features your sales team will need? And who is going to be the system administrator for your cloud call centre network? 5. What type of routing solution will you need? After determining who will need access to what within your team, you can start to map out your call routing requirements. First, consider all the different reasons why your customers might contact your business. You can then start to review whether the calls of some customers should be priority routed depending on the context of their inquiry. 6. Do you want to integrate customer-friendly solutions? Some businesses frustrate their customers by using interactive voice response (IVR) systems that are too complex or difficult to navigate. A major benefit of cloud call centres is that they can easily incorporate computer telephony integration (CTI), which allows for the provision of intelligent IVRs. From a customer-friendly perspective, intelligent IVRs offer the possibility of dynamic menus that will help your customers speak to the correct person, faster. 7. Do you want reports that monitor metrics and data? A cloud call centre with CTI offers the opportunity for your service provider to compile smart reports. With such a wealth of data potentially available, you will be able to access reports that provide detailed insights on important sales, service and marketing metrics. These can include things like the number of prospects your agents are currently engaging, customer satisfaction scores and retention rates, all of which can be used to generate a better understanding of your business’s sales and customer service successes. 8. What training will you need to provide? The return on investment (ROI) of any technology is limited to the benefit enjoyed by the end user. Therefore, investing time in effectively training your staff on the new cloud telephony system will pay for itself many times over. It’s worth noting that, if you elect to go with a managed service provider, they will often provide training to help you and your staff get the most out of the system. [easy-tweet tweet="Investing time in effectively training your staff on the new #cloud telephony system will pay for itself"] 9. How much ongoing support will you need? Any contractual agreement with the supplier of your cloud solution is likely to include technical support. The level of ongoing technical support provided by most service providers is tiered by cost and designed to suit varying requirements. Using your knowledge of your team, consider exactly how much hand-holding is required to get your new system working and operating smoothly on an ongoing basis. 10. Can a Salesforce Open CTI adapter improve sales and service? Salesforce’s cloud technologies help nurture growth by ensuring businesses can better maintain their customer relationship management (CRM) platform, without having to make a huge investment in hardware and software. A Salesforce Open CTI adapter allows all kinds of integrations and applications to plug into Salesforce in the cloud.
As a result, it provides a wealth of benefits for helping call centre agents deliver sales success and excellent customer service. Want to learn more about how a Salesforce Open CTI adapter can help your business deliver better customer service over the phone? Then take a look at NewVoiceMedia’s whitepaper, ‘The Advanced Guide to Salesforce Telephony Integration: How to Compare Salesforce Open CTI Adapters’. ### Industry Solutions Forum 12 July Avnet Customer Centre, capitol building, bracknell You are cordially invited to join M7 and Avnet for the UK re-launch of the Lenovo Industry Solutions Integration Programme, to be held at the Avnet Customer Centre, Capitol Building, Bracknell on July 12th. The event is aimed at Independent Software Vendors and Systems Integrators who are looking to build their business through international solutions delivery. The M7 Solutions forum has been running since 2008 and provides an opportunity to hear vendor product and programme updates as well as meet with other ISVs and M7 partners. ### Want to be a Business Cloud Ninja? Easy steps to getting to grips with cloud app security for businesses of any size. One of the key reasons why businesses were initially put off by cloud services was concern over security. And whilst many damning headlines have trumpeted the dangers from hackers and data leaks over the past decade, the truth is more nuanced. As businesses looked into the technology more deeply, they realised that a cloud solution was actually a way to improve their security. This is what cloud providers have banked on when developing a reliable, trusted business model after all. Now millions are moving their business to the cloud, with the latest figures from the Cloud Industry Forum showing that cloud adoption rates in the UK are at 84 per cent and still growing. So if you are new to cloud, or not au fait with the IT jargon inherent in the deeper discussions of business software, read on to fast track your way to business cloud ninja-hood. [easy-tweet tweet="Read on to fast track your way to business cloud ninja-hood" hashtags="cloud, ninja, security"] "Bujinkan" (divine warrior training hall) – Dual-factor Authentication One of the many criticisms of consumer cloud solutions is that they rely on a single password and username to control access. We use these services every day – think Facebook and Yahoo! But with enterprise apps you need to look for dual-factor authentication, because business data is often much more valuable than your individual data. That’s why criminals are desperate to get it. Some cloud solutions build this in. With Microsoft’s Office 365, for example, you get an additional layer of security, with the ability to add dual-factor authentication to the login process. To access your data on any service with dual-factor authentication, you need the standard password and username, and must also provide additional proof that you are who you say you are. On a mobile device this can be using the fingerprint reader, or on a desktop, it can be a text message with an ID that you then input into Office 365. Some banking software does this too, making your data much more secure. Consider this a grounding before learning the really impressive ninja skills. "Buyu" (warrior friends) – Regular Data Backup What use is data that isn’t also secure from loss? Once it’s in the cloud, the point is that you want it back sometimes. Some services offer a ‘free’ backup and recovery service as part of the whole solution. 
With Office 365 as an example, you not only get a scalable collaborative tool to meet your business case; your data is also continuously backed up and saved in multiple locations around the Microsoft network. If you do accidentally delete a file, it can be restored quickly and easily with just a few clicks. [easy-tweet tweet="What use is data that isn’t also secure from loss?" hashtags="cloud, security"] Any cloud service should offer this backup facility. Depending on the solution, you might also want a full versioning and auditing system, enabling you to see who made what changes, to which files, and when. These systems make it easy to go back to a certain point in a file’s life to correct an error. Should something be overwritten or deleted in that file, you can easily spot where mistakes were made. Ninja skills indeed. By extension, your backup should be able to work as a disaster recovery solution. Should your business be affected by floods, fire or any other natural disaster, you can still carry on working with your files on Office 365, no matter where you are in the world, or which device you choose. Your data is held securely within the Microsoft Office 365 data centre. This is automatically copied to other data centres within your geographical area. So even if the Microsoft data centre is affected, there’s another copy in a secondary location that can be relied on. Dan (black belt grade) – Always up-to-date One of the key areas where businesses expose themselves to security problems is in not keeping up to date with the latest versions of the device operating system or the application they are using. It’s easy to understand why, when the day-to-day business of doing business is so onerous. However, hackers have become increasingly sophisticated and deliberately target businesses that are using out-of-date or unpatched applications. They exploit known problems in the systems to gain access to your data and other parts of the network. With smart cloud solutions like Office 365 you no longer need to worry about which version of the software you’re running, as it automatically uses the most up-to-date software. Because the data remains within Microsoft’s Office 365 network, you’re protected from other exploits and problems on your viewing device. Ninjutsu (the martial art of the ninja) – Privacy Settings at all Levels On a typical office file sharing system, you are usually allowed to access files in a given section, or project, or on a particular server. Once you start to allow external users into projects, which is becoming the norm in most businesses, security becomes more difficult to manage. If you give your external users access to a particular server, it’s difficult to prevent them downloading anything they find on that server. Likewise, on a project they may be able to view files that should only be visible to management or director-level users. Choose a solution with variable settings from the get-go. You will want privacy and security settings that can go right down to file level. You can then decide exactly who can see a particular file or project, and those users only have that view and nothing else. Additionally, look for the ability to make files viewable only by particular grades within a business, or to restrict all non-internal users from accessing files or projects. Most importantly, you should be able to remove viewing rights quickly and easily at the end of a project. 
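To make the idea of file-level permissions concrete, here is a minimal, purely illustrative sketch in Python – not the API of Office 365 or any other product, and all names are hypothetical – showing how per-file visibility, grade-based access, an internal-only restriction and quick revocation might fit together.

```python
# Illustrative sketch only (not a vendor API): file-level permissions with
# grade-based visibility, an external-user restriction and quick revocation.
from dataclasses import dataclass, field

@dataclass
class User:
    name: str
    grade: str       # e.g. "director", "manager", "staff"
    internal: bool   # False for external collaborators

@dataclass
class FilePolicy:
    allowed_users: set = field(default_factory=set)    # named individuals
    allowed_grades: set = field(default_factory=set)   # whole grades, e.g. {"director"}
    internal_only: bool = False

    def can_view(self, user: User) -> bool:
        # External users are blocked outright on internal-only files.
        if self.internal_only and not user.internal:
            return False
        return user.name in self.allowed_users or user.grade in self.allowed_grades

    def revoke(self, user: User) -> None:
        # End-of-project clean-up: remove an individual's access in one step.
        self.allowed_users.discard(user.name)

# Usage: a board paper visible to directors plus one named external adviser.
policy = FilePolicy(allowed_users={"external.adviser"}, allowed_grades={"director"})
adviser = User("external.adviser", "consultant", internal=False)
print(policy.can_view(adviser))   # True while the project runs
policy.revoke(adviser)
print(policy.can_view(adviser))   # False once access is removed
```

In a real cloud service these controls are set through the provider's sharing and admin screens rather than code, but the underlying checks follow the same pattern.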
Soke (grandmaster) – Putting it all Together A ninja-like awareness of the cloud environment doesn’t have to be like walking a tightrope or balancing a dagger on your nose. It comes down to remaining aware, staying alert to the possible threats and weaknesses of your processes and services, and staying strong with the right security for your needs. In the world of martial arts this is known as 'zanshin' – a state of awareness that a ninja displays. ### Why privacy is a universal human right Online search engines have become an essential aspect of everyday life for all of us. All ages, nationalities and professions now possess a wealth of knowledge at their fingertips. However, while the Internet, and the search engines we use to navigate it, have become a universal right, they are beginning to encroach on another core value: our privacy. [easy-tweet tweet="Growth in the #privacy sector is outpacing the overall rate of growth in #search generally" user="Oscobo"] When we launched Oscobo last year we did so with the knowledge that a privacy-focused search engine could have universal appeal. Growth in the privacy sector is outpacing the overall rate of growth in search generally, indicating that there is a definite market for search engines that do not harvest your personal information. Below I’ve taken a look at two very different examples of user groups that may want to protect their online data. Privacy at home There are plenty of reasons why families all across the UK would want to safeguard their personal data from being harvested by search engines. Have you ever tried searching online for an offer on a family holiday? Well, your previous search data may count against you. The information that you’ve supplied to Google and similar platforms will affect the price you are quoted. When online brands have information relating to your location and profession, it is easy to see how results become targeted at you, rather than tailored for you. Another factor to consider is whether you have young children using the family computer. Do you really want every aspect of their lives being stored and eventually analysed so they can be bombarded with targeted advertising? Because we do not use cookies, look at your IP address or store any user data, we provide “pure” search results based entirely on the words you typed into your search request. Why should you have to share your age, sex, marital status and previous purchases every time you search online? At Oscobo we believe strongly that this no longer needs to be the case. Privacy at work We’ve also found that a number of businesses are keen to adopt a more privacy-focused approach to online searching. There are lots of examples of corporate data that you’d rather keep private, and your search terms are likely to be included within this. However, when search engines store data on your search criteria in the cloud, it means that they could potentially become targeted by cyber attackers or local intelligence agencies. Because we don’t store any data on our users, you can be sure that there is no personal information that can be targeted. If there’s no data stored, there’s no data to be hacked. [easy-tweet tweet="When #search engines store your data in the #cloud, it may be open to #hacking" user="Oscobo"] Whether you’re a stay-at-home parent or a corporate high-flyer, there’s no reason for your personal information to be shared against your wishes. 
If you believe that online privacy should be for everyone, make sure you choose a search engine that delivers “pure” results without tracking its users. ### Size & Security – Not the same thing at all A few years ago, cloud computing was being touted as the biggest and best new innovation for companies to increase productivity and cut costs. Fast forward to today and the use of cloud-based core services such as DropBox or Google Mail are almost ubiquitous among smaller organisations. If you’re a small or medium sized company then having the capacity to ‘outsource’ not only your software and its maintenance but also your data storage, means that you can save significant time and money on internal resources. At the top end of the scale however, the largest enterprises have seen less reason to adopt cloud services. Sharing computing resources doesn’t provide many added advantages if you already require a vast amount of technology to cater for your business needs. These organisations often consider it to be more efficient to source their own dedicated data centres or to self-manage their hardware on-site. There has also been a widespread distrust of the cloud in terms of security, with in-house IT teams instinctively wanting to preserve their control of systems and hardware. Attitudes are beginning to change however. As the cloud market matures and suppliers continue to improve their offerings, the economics of implementing the technology are starting to stack up for larger enterprises as well. Ironically, given the long-term suspicion of the cloud, one of the most important reasons for this change is security. [easy-tweet tweet="Ironically, given the long-term suspicion of the #cloud, one of the most important reasons for this #change is #security."] It has taken a long time, but the senior management teams are starting to recognise the potentially disastrous consequences of a security breach for their organisations. Compliance is also becoming more of a headache. Regulations are being tightened up on a regular basis - the EU General Data Protection Regulation (EU GDPR) is just around the corner for example, and this legislation will still apply even after the recent Brexit vote. Thanks to this, in-house teams are beginning to realise just how difficult security and compliance can be. In the FinTech sector in particular, governance and legal requirements can be extremely onerous, and across all organisations there is an absolute need to keep risk to a minimum. It is becoming much more widely accepted that the real security experts may in fact be those working outside the organisation, rather than within it. In Gartner’s recent predictions for IT organizations and users for 2016 report, the analyst firm stated that “Through 2020, 95 percent of cloud security failures will be the customer’s fault.” Gartner’s suggestion is that it will be weaknesses in the internal procedures of companies that lead to compliance failures and increased levels of business risk, rather than failures of the external providers. As an example of the investment taking place in security for cloud services, in 2014, Microsoft, as one of the first cloud providers on the market, undertook a four-year process to ensure a selection of its cloud products were compliant with the standard clauses laid out by the European Commission in 2010. Clearly, it is not a simple process to bring cloud technology in line with security regulations. 
[easy-tweet tweet="Clearly, it is not a simple process to bring #cloud #technology in line with #security regulations."] As a result, more large organisations are now choosing to place their security in the hands of the experts. This is not a panacea; no matter how superior the security measures offered by a cloud provider, companies are still ultimately responsible for their own data. They need to be extremely careful when checking the security credentials of their provider; whether that’s for an accounting system or for a call centre, and the legal controls between provider and customer need to be tighter than ever. A recent survey by Toshiba revealed that 56 per cent of businesses plan to implement a new cloud-based solution this year, and that for 50 per cent, security is a key area of investment. Our expectation is that we will see a continuation in the slow movement of the large corporations to the cloud, and the steady rise of security to the very top of their priority lists. ### Understanding Fintech compliance and regulations Providing an overview of the current (and likely future direction) of laws and regulations as they pertain to financial technology companies in the UK just became a lot more challenging. The recent outcome of the so-called “Brexit” referendum has created a great deal of uncertainty and it remains to be seen whether the UK will adopt a similar approach to that used by Norway and Switzerland when it comes to IT regulations, or attempt to carve its own path. However, while the future is impossible to predict, there are some aspects of Fintech compliance that we can be relatively certain about. [easy-tweet tweet="Businesses should use a secure #cloud hosting provider in order to meet future #Fintech regulations" user="veberhost"] Cybersecurity Regardless of whichever regulatory body ultimately oversees the fintech industry in the UK – one aspect at the top of the list of “must haves” – is cybersecurity. Both the frequency and level of sophistication of data breaches continues to rise across the globe. Alongside this, the average “costs per (hacked) user” and the cost of business lost to competitors as a result of the bad publicity following the disclosure of a breach is also growing. Even the world’s central banks – with all of their considerable monetary muscle - are not immune from being targeted by malicious actors – as the recent $81 million Bangladesh/New York Fed SWIFT Banking Network hack proved. Governments and regulators are discussing how to make companies share details of attempted and/or successful hacks, without threats of recrimination, with authorities – and similarly, where governmental systems have been compromised – then details of any incidences will likewise be shared with companies. Increasingly, companies are choosing to effectively outsource the task (but not the responsibility) of cybersecurity. The costs to a company of hiring sufficient staff with the necessary skillsets seem to be skyrocketing. Hence, it makes commercial sense to hand over the security aspects of your business to a trusted partner (data centre/cloud provider) to reduce a potentially huge capital expense – into a lesser, operational expense – replete with associated tax advantages. Governmental surveillance of data The former EU/US Safe Harbour Agreement which had been in effect between the two countries for over 14 years – is set to be replaced by the “EU-US Privacy Shield” and is expected to come into force in July 2016. 
Once it is in effect, governmental agencies will have to adhere to strict guidelines limiting their ability to conduct surveillance – but data can once again flow and/or be stored on either side of the Atlantic. Also, in 2018 the EU’s “General Data Protection Regulation” (GDPR) will come into effect. Failure to comply with the various regulations in this respect can be extremely expensive (e.g. fines of up to 4 per cent of a company’s worldwide turnover). Further, politicians have also been of the opinion that directors of companies found to be in serious breach should be subject to personal fines and even imprisonment. MiFID II Another EU directive – the “Markets in Financial Instruments Directive – Part Two” (aka MiFID II) – is soon to be implemented (estimated January 2018) and will require financial firms to implement changes and controls affecting a broad swathe of topics, covering everything from research, derivatives, data protection and storage, as well as the need to record meetings (over and above telephone calls). The UK’s Financial Conduct Authority (FCA) already mandates that anyone directly involved in equity trading must record calls. However, MiFID II broadens the scope of individuals coming under its mandate – it’s not just the top City traders, it’s also financial advisers and commodity traders not previously regulated by the FCA. Consequently, the amount of data that will need to be securely stored will soon skyrocket. Since the data comprises highly sensitive information, any breach of confidentiality – which falls under the purview of both the UK’s Information Commissioner’s Office (ICO) and, as mentioned above, the new General Data Protection Regulation (GDPR) – could result in significant financial penalties. Furthermore, MiFID II makes it very clear that recordings must be securely archived. Even in the event of a successful cyber-attack, conversations must be encrypted so they are unreadable. Businesses that are looking to comply with MiFID II and the GDPR should seek to use a cloud hosting provider like Veber that has a history of reliably safeguarding its customers’ data. Mobile Devices Firms need to carefully consider how they can prevent employees from inadvertently exposing them to stiff fines or bad publicity through the misuse and/or loss of mobile devices. By implementing proper controls and providing formal training on how and when mobile devices and their social apps may or may not be used, firms can do a lot to mitigate their exposure. Financial businesses may also benefit from the support of their cloud provider. Veber can supply financial businesses with virtual private networks (VPNs) and endpoint access that allow mobile workers to connect to remote data centres even when they are not using the local network. This means that financial businesses can enable their employees to work flexibly while still complying with the stringent security regulations associated with the financial sector. Cloud Hosting As mentioned previously, fintech firms may decide that the best way to meet regulations and compliance is to use a cloud vendor. At Veber we have already worked with a number of financial businesses who come to us for our high levels of availability, performance and flexibility. Financial information can prove hugely damaging if leaked, so businesses should look for a vendor that meets the highest compliance standards, has reliable SLAs and provides the scalability to grow. 
The right cloud supplier will not only help you meet regulations, but also outpace your competitors. [easy-tweet tweet="Despite an uncertain future, #cybersecurity will still play a crucial role within #Fintech compliance" user="veberhost"] Act now, before it’s too late Although the UK’s position on forthcoming regulation is not entirely clear, businesses cannot afford to wait around. Fintech companies should begin reviewing how their IT solutions stand up against future compliance standards, particularly as the EU is likely to remain a prominent trading partner. And even with an uncertain future, one thing remains clear – cybersecurity will continue to play a crucial role within Fintech compliance and regulation. ### 9 Tech tips for taking your business to cloud nine The speed at which innovations are implemented in the world of information technology often leaves business owners dazed and confused. Today, solutions are built and put into practice in a matter of minutes. At a time when the cloud is becoming the most important globally available web super-service – its revenue climbing to about $53 billion in 2015 in the US alone – business managers have to follow some strict tech tips to keep pace with the ongoing changes. [easy-tweet tweet="The #cloud is quickly becoming the most important globally available web super-service"] Approved vendors If you simply type the phrase "cloud services" into a search engine, you will get thousands of different results. Such a wide variety of options can have a negative effect on businesses and their cloud choices. The rule of thumb here is to hire only approved cloud vendors that have proper cloud certifications. So, feel free to ask your potential provider about their expertise and legal background before you make any other moves. Smooth data transfer Now that you have made up your mind and found an approved provider, it is important to ensure a smooth data transfer to your cloud account. Before you start this procedure, check your Internet speed. The upload speed is the key factor in this process. It should be at least 1 Mbps – and ideally higher – if you want a smooth transfer. Short downtime You will know that you have scored a great cloud deal when you see that your provider conducts regular updates and security checks. However, those actions should not affect the accessibility of the cloud services you use. So, when you are considering different cloud providers, take their uptime record into consideration, too. Unlimited scalability What matters the most, when it comes to cloud services, is having the option to adjust them to your current business needs. If a cloud provider insists on any illogical limitations or forces you to take more than you want, do not cooperate with them. You are the customer here and they have to go every extra mile to take you to cloud nine, cloud service-wise. Limiting apps Using apps and cloud-provided software services instead of installed programs has been a trend for a few years now. Business owners make huge savings this way, paying only for the services they need at a given moment. However, using too many SaaS products at the same time could lead to uncontrolled expenses, annihilating the budget surplus made through such cloud services. Therefore, every business should name an employee or a team that will be in charge of app management. Safe virtual location One of the basics for every business that keeps data online is to find the safest possible location for its needs. 
This means that you should not experiment with hosting providers, but simply go with the flow and opt for one of the well-known ones. Proper encryption As you are getting familiar with your cloud possibilities, it goes without saying that business owners have to keep their eyes wide open in terms of data encryption. In a nutshell, entrepreneurs using cloud services should accept nothing less than multi-level encryption. Password alterations Due to some potential threats coming from former employees (as well as clever hackers) every business working in the online environment should alter passwords from time to time. You could make great use of some free password managers for this purpose, in addition to your own creative moves. Simple data organisation When a business is somewhere between the traditional device-based storage and those new cloud software and storage options, there is a higher risk of data loss. Therefore, you need to introduce some new features for organising your business content. For instance, you can download and (photo)stream photos and videos with cutting-edge Bulkr photo tools, to keep your visual content well-organised. Photos are becoming a vital part of web presence and they require special care. [easy-tweet tweet="Times are changing so fast that falling behind with technology is a one-way path to a failure" hashtags="Transformation"] The times are changing so fast that falling behind with technology is a one-way path to a failure. On the other hand, incorporating all those tech features into your online business policy should launch your business on cloud nine and ensure a lucrative future for your business. ### 5 Steps to embracing digital disruption in the connected economy It is no secret that we live in a world of constant change where digital disruption affects businesses on a daily basis. If you want to be a market leader you need to have agility, a strategy that encompasses both digital and business goals, and be a totally connected organisation.  Digital is fundamental to the whole business, not just the IT function, and it is essential for success in today’s ‘Connected Economy’. [easy-tweet tweet="#Wimbledon has embraced the #digital age and every year it delivers more detailed #data insights" user="IBM"] This year, in its ‘pursuit of greatness’, The Championships, Wimbledon, is again a leader and successful player in the digital Connected Economy, and here are some key lessons other businesses could learn from it: 1. Embrace new digital channels Wimbledon has completely embraced the digital age and every year it delivers more detailed insights across a vast array of different channels to continually engage its existing fan base and attract new audiences. Since the championships began in 1877, Wimbledon’s engagement with its fans has evolved. In the last 10 years, the rise of social media has enabled Wimbledon to target new demographics that interact in new ways. Themes that epitomise all that The Championships stand for, such as the ‘English Country Garden’, have been shared digitally all over the world through channels such as YouTube, Facebook, SnapChat and Instagram. Wimbledon has actively embraced these new channels to draw in new and younger audiences from around the world to deliver the ultimate ‘Wimbledon Experience’ for those enjoying the games in person or remotely. 2. Have a digital strategy Everyone in an organisation needs to be driven by and be part of the digital revolution. 
The removal of silos is key as we focus on using data, both internal and external, to drive how we operate, engage and adapt in an agile way to changes in our own organisation and the wider market. Wimbledon is driven by data, taking feeds from: each tennis match, capturing details of every serve, volley and player movement; social media sentiment, including hashtags, trending topics and feeds from TV and the media; many third parties, such as the Met Office; and historical player and match data. In fact, Wimbledon captures 3.2 million data points from 19 courts across 13 days with an accuracy target of 100 per cent and a sub-second response time. It does this using highly trained tennis analysts and IBM Systems to transform and enrich data in near real time, providing insights to commentators and media that help to bring The Championships to life for fans around the world. 3. Innovation, Collaboration and the Internet of Things It’s essential to have a digital strategy but delivering it successfully requires collaboration and an innovative approach. The key is to integrate business and technology functions so they work in a connected way. Collaboration in the Connected Economy is also about engaging with the people or organisations we do business with in a dynamic and seamless way. In a Connected Economy, the Internet of Things (IoT) affects us all. If we can harness the IoT we can use it to see what is happening at the point of delivery and to influence that point dynamically. We can realise new insights enabling us to adapt our offerings, services and products, and how we interact with the world. At Wimbledon the IoT is part and parcel of capturing ‘tennis play’ data ready for analysis so it can be shared to provide insights for the fans, media, players and coaches. The IoT also captures social feeds, which are then used to gain insight and enable Wimbledon to respond accordingly to the sentiments and trends that are identified. Wimbledon’s focus on its fan engagement not only helps determine how it serves up information but helps define the strategy to attract new fans to tennis. We work tirelessly with clients like Wimbledon to ensure they are on the right track in their pursuit of greatness. 4. Break new ground, don’t just enhance what was there Don’t be a Luddite and stick with what you have – think and innovate! Companies like Uber, which became the world’s leading taxi company with a unique business model, and Apple’s iTunes, which took the music industry into a new digital era, are examples of just how quickly disruption can occur. To stay ahead of the competition, this year Wimbledon has added new capabilities and media channels, meaning The Championships can be watched via an Apple TV app, which can live stream content and information directly into fans’ homes in the truest sense. Wimbledon is making sure it stays ahead of the game when it comes to its fans’ entertainment requirements. Not only this, but new for The Championships in 2016 is the Wimbledon smartphone app, which now enables fans onsite to create their own ‘Wimbledon Experience’ with a checklist of to-dos and daily updates on their favourite players. It also allows fans that can’t be at this year’s games to gain updates on their favourite players and create their own story. This year, Wimbledon has also launched the Cognitive Command Centre, which analyses social sentiment and identifies trends and hot topics, including conversations on other topical sporting events, in real time. 
This enables Wimbledon to tailor the content it feeds to its fans based on their interests and what they are talking about, helping to truly deliver the ultimate fan experience. 5. Connected Infrastructure - Don’t forget the backend Being data-centric is a huge part of the Connected Economy.  Equally important is the ability to access and deliver information at the ‘speed of thought’. Leading businesses, just like Wimbledon, have infrastructure that is working incredibly fast, utilising the best software, hardware, cloud and network to deliver the greatest service. Wimbledon uses an open hybrid cloud architecture to ensure perpetual change can be managed in a seamless manner, which is crucial in striving to gain the competitive edge. Wimbledon’s on premise workloads utilise IBM Power Systems and storage configured as a private cloud, while other workloads reside on IBM SoftLayer Cloud, where flexibility of resources is dynamically aligned with demand, enabling the systems to scale 100 times or more if demand is there. The IBM Bluemix Cloud is a key technology component with many of its prebuilt API capabilities, such as natural language processing, enabling significantly reduced delivery cycles and bringing ready-to-deploy capabilities that drive many of the digital and fan experiences. The pursuit of greatness is an endless journey, with many new innovations waiting round the corner. Digital disruptors will continue to find new ways to engage and businesses must embrace this change to be successful in the Connected Economy – if companies don’t, their competitors will. In a world where digital changes can take place at speed: innovation, collaboration and investment in people and digital enablers are the drivers businesses need to help them in their own pursuit of greatness in today’s Connected Economy. [easy-tweet tweet="At #Wimbledon the #IoT is part and parcel for capturing ‘tennis play’ #data ready for analysis" user="IBM" hashtags="Analytics"] To read more about The Connected Economy, check out the Harvard Business Report here. ### Why shadow IT should stay firmly in the spotlight Digital transformation can be the difference between success and obsolescence for a 21st century organisation. Those that are thriving in the current, fast paced, environment are doing so because they are providing digital experiences that meet the ever changing demands of today’s consumers. Organisations looking to embrace digital transformation need to be agile so that they can evolve their product and service offerings in line with consumer expectations. [easy-tweet tweet="#Digital transformation can be the difference between success and failure for a 21st century organisation" hashtags="Cloud"] However, with the current business landscape caught in an accelerating cycle of technology-oriented transformation, many organisations still continue to treat IT as a separate entity. Businesses of all sizes have found themselves battling to compete in the ultimate do or die; to embrace digital transformation or drown amongst those struggling to stay afloat, whilst fighting the tide of change. Organisations just need to look to the banking industry as an example of digital transformation done right. Not so long ago, customers had to walk into their local branch with a bank book to access, transfer or even check on their deposits. Fast forward to the present day and consumer-led digital transformation has advanced to the online and mobile banking so many of us enjoy today. 
With £2.9 billion being transferred online each week and a six per cent decline in in-branch transactions, it is a prime example of digital transformation. Yet, this digital transformation is often not as swift as many employees would like. While many IT departments get bogged down with tactical and strategic technology investment decisions, they can lose sight of the fact that employees are forging ahead regardless. Staff are taking advantage of the proliferation of easily accessible solutions and apps that can be acquired and implemented quickly, to help them undertake their day-to-day work. But whilst this could often deliver business value, it can also bring with it untold risks. IT needs to regain its ability to protect the business and at the same time become the vanguard of digital enablement and transformation. Shadow IT is the phrase used to describe the emergence of software and applications by stealth; brought into an organisation under the radar by employees, without prior knowledge or approval of IT. Individuals, or sometimes entire departments, are driven to do this in response to market dynamics that can be borne out of either market disrupters, such as more progressive competitors, or new market entrants. Employees are increasingly using applications in their private lives – whether it be file transfer utilities such as Dropbox or communication apps such as Skype – and looking to use them in their business lives too. Whilst the IT department may shudder at the thought of rogue software infiltrating their networks, opportunity is often lurking in the shadows. In a competitive environment, IT departments can model the innovative nature of shadow IT to expand the solutions across the organisation. IT should focus on rebuilding intimacy with their internal business customers in the same way the wider organisation puts the end customer front and centre of strategic planning. Digital transformation thrives on a culture of collaboration and in order for organisations to truly embrace IT in their digital initiatives, it is imperative they stop burying their heads in the sand. Shadow IT is here to stay. Enterprises and IT departments need to appreciate the drivers of shadow IT and the speed and agility benefits it brings to the business. To help facilitate this, channels of communication should be established through internal digital forums, where staff can contribute ideas on new solutions and apps, as well as learn about the wider implications for security and compliance that IT have to deal with. IT can, in turn, demonstrate leadership by identifying feature-rich options that deliver business value. Through shining a light on the applications that previously resided in the shadows, the IT department can create a safe environment for experimentation and accelerated business benefit.  This can remove many of the historic obstacles and help to roll out new solutions throughout the organisation as a whole. By encouraging the trialling of new applications within a safe and controlled environment, it means IT can thoroughly test them. This not only ensures that they provide the desired business benefits, but can safeguard against a negative impact on existing systems and processes within the wider IT estate, not least breaches of regulatory compliance. [easy-tweet tweet="Evolving culturally is pivotal to a successful #digital journey" hashtags="Cloud, Transformation"] The ability of an organisation to evolve culturally is pivotal to a successful digital journey. 
A solid partnership between the business and the IT department provides the capability to accelerate value, both for end customers and for the wider business. ### How digital technologies are changing the legal sector The legal sector is facing a shift from the old ways and having to adjust to the demands of the digital age. With everyone online, attracting clients now happens through digital channels. But this also means that there are new pitfalls to watch out for. Relegating digital development to marketing is an archaic mindset and a mistake to avoid. This simple guide points out a few things to keep in mind when dealing with a brave new tech-savvy world. [easy-tweet tweet="The #legal sector is facing a shift from the old ways and adjusting to the demands of the #digital age" hashtags="Cloud"] How clients choose their representation With the wealth of information available online, it is no wonder that people look to the internet to decide who they want to consult. Many firms now offer the option of an online consultation. This means that the pool of options has grown vastly for clients. You can look for legal advice from firms across the country from your own desk. Legal firms now need to put a lot of thought into investing in their online reputations. Dynamic websites and interactive options are standard upgrades. Word of mouth just doesn’t cut it anymore. Public awareness Again, with so much information available at their fingertips, clients approach legal firms already armed with information. They are not looking for a lecture on the basics of law because they have already done their homework. What clients want to know is what you can do for them in their given situation. Keep in mind that they will usually approach more than one company. In this case they could be quite knowledgeable about what others offer too. Treating clients in a straightforward and transparent manner is the best way to go. Networking Networking has moved on from in-person appearances at events to the internet. Websites like LinkedIn are a great example of individuals taking networking into their own hands. At the same time we are advised ever more strongly to be careful about what we post online. For instance, employers can judge potential employees based on their Facebook profiles. Keep in mind that professional and private reputations can be negatively affected. Networking online requires careful management and monitoring. Privacy It is also important to realise that the openness brought on by digital advancement can mean a loss of privacy. There have been countless cases of breaches in security and systems getting hacked. Clients do not want their information getting out and neither do companies themselves. Interactions online are like a minefield when it comes to what should or should not be put out there. Being careful and bringing in experts is a great step towards becoming more modern as well as ensuring safety. Streamlining and speeding up the process Thanks to automation, you can speed through most bureaucratic processes. From finding the right representation to delivering documentation on entire legal processes, there’s very little that can’t be placed in the fast lane. In the past, research meant digging into dusty archives. Now searching databases is as easy as typing relevant terms into a search bar. Programs that can compile data in a matter of minutes are a great help too. Email and video conferencing allow direct contact between lawyers and their clients. 
Bringing in more than one concerned party has never been easier. [easy-tweet tweet="Relegating #digital development to #marketing is an archaic mindset and something to avoid" hashtags="Legal"] This is why everyone involved in the legal sector should become well acquainted with such digital tools. Firms that can up their efficiency have a competitive advantage in the industry, as they can provide a faster and more flexible service. ### Following in Edison’s footsteps: Collaboration, innovation and the cloud There’s a popular view that innovation is something of a mystical process; the preserve of rare geniuses receiving a flash of inspiration as they work alone. This view of innovation, however romantic, is generally incorrect. Even somewhat eccentric inventors such as Alan Turing and Thomas Edison worked in close collaboration with wider teams, without which their work would not have been possible (indeed, Edison is widely recognised as the originator of research and development teams within businesses). Even Isaac Newton, who spent most of his time working alone, admitted: “If I have seen further it is by standing on the shoulders of giants”. [easy-tweet tweet="By bringing down workplace barriers, organisations can enable #collaboration and drive #innovation"] In our digital age this fact holds true: innovation is the result of a collective effort; of enterprise-wide collaboration based on a full understanding of goals and context. The more easily a business can collaborate, the more likely it is to come up with truly innovative (and lucrative) customer services. Digital enterprises depend on effective collaboration and engagement among employees, partners, and customers in order to remain competitive and deliver cutting-edge products and services. The key question facing businesses is how they can enhance collaboration. What is it exactly that enables free and effective collaboration? For me, the answer is simple: collaboration is what happens when organisational barriers are removed. When people are able to work freely with the people they need to work with, drawing on the resources and content they need at any given time, then they are able to collaborate. Such an environment is a world away from the simple ‘file sync and share’ platforms that currently dominate the enterprise space. One good way of breaking down these barriers is to embrace the agility of cloud platforms. These services seamlessly blend content, people, processes and communications as part of a cohesive workflow. This is about much more than simply being able to share documents with teams; it’s about delivering content that is contextually relevant to the task at hand. Cloud platforms can deliver just that by focusing on two areas: content and process. When it comes to content, modern cloud platforms allow businesses to free their enterprise content by allowing employees to access the information they need anywhere, anytime and on any device. Crucially, the cloud also enables freer and more intuitive working practices. No longer do documents need to be emailed as attachments, downloaded, uploaded and manually version controlled. Teams can manage version history and access control within the cloud. Moreover, cloud platforms can change the way businesses collaborate by enabling social conversations – teams can talk as a group in a dynamic environment, regardless of where in the world they are based. In short, cloud platforms enable better decision making through enhanced information exchange, social collaboration, and mobility. 
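As a purely illustrative aside – not the API of any particular cloud platform, and with all names hypothetical – the Python sketch below shows the kind of version history and audit trail described above: every save records who changed a document and when, and an earlier version can be restored without emailing attachments back and forth.

```python
# Illustrative sketch of cloud-style document versioning (hypothetical, not a
# real platform's API): each save records author, timestamp and content, so a
# team can audit changes and roll back to an earlier version.
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass(frozen=True)
class Version:
    number: int
    author: str
    saved_at: datetime
    content: str

class Document:
    def __init__(self, name):
        self.name = name
        self.versions = []  # ordered list of Version records

    def save(self, author, content):
        v = Version(len(self.versions) + 1, author, datetime.now(timezone.utc), content)
        self.versions.append(v)
        return v

    def history(self):
        # Audit trail: who made which change, and when.
        return [f"v{v.number} by {v.author} at {v.saved_at:%Y-%m-%d %H:%M}" for v in self.versions]

    def restore(self, number):
        # Rolling back is simply saving an old version's content as a new version.
        old = self.versions[number - 1]
        return self.save(f"restore of v{number}", old.content)

# Usage: two edits, an audit check, then a rollback to the first draft.
doc = Document("proposal.docx")
doc.save("alice", "First draft")
doc.save("bob", "Second draft with pricing")
print(doc.history())
doc.restore(1)
print(doc.versions[-1].content)  # "First draft"
```

On a real cloud platform this bookkeeping happens server-side and is surfaced through the product's own interface, but the model is essentially the same.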
Traditionally however, other barriers to collaboration have remained behind the scenes in business process management. Businesses today need to quickly deploy and adapt processes to meet their operational demands. However, the cost and complexity of implementing infrastructure to support new tools has in the past been prohibitive. The cloud solves this by offering visibility and management of business processes, delivering a full lifecycle process management environment that allows businesses to easily extend existing Software-as-a-Service and on-premise applications across the enterprise. By being able to tailor their applications in the cloud and extend them to wherever they are needed, businesses can ensure they are getting the most of their content while providing an environment in which collaboration can flourish. [easy-tweet tweet="One good way of breaking down workplace barriers is to embrace the agility of #cloud platforms" hashtags="Collaboration"] Any business serious about innovation needs to ensure that collaboration can take place easily within their organisations. By enabling agile content and process management through the use of tools like Oracle’s Content and Process solution, cloud platforms can help them do just that. ### Network survival handbook: 7 essential tips When it comes to the modern IT infrastructure, government IT pros are under more pressure than ever. Evolving compliance mandates, growing demands from employees to access information whenever and wherever they want and the constant looming dark threat of a cyber-attack, can make managing the infrastructure more daunting than ever, and at the heart of this is the network. [easy-tweet tweet="The first key step to #network survival is knowing the network’s capabilities, needs and resources"] Today, there is no question about keeping a network well managed and maintained – it is a necessity; a matter of survival. The modern government IT professional is less a computer geek and more a Bear Grylls-style survival expert and, as such, needs to be prepared for any threat, challenge or hurdle which comes their way. To assist them in this task, SolarWinds has put together its top seven network survival tips to guide you safely through today’s IT wilderness. 1. Map out your network Any explorer wouldn’t get far without a map – of your network, that is. The first key step to network survival is knowing the network’s capabilities, needs and resources like the back of your hand. Although this may seem an obvious suggestion, asset discovery and network mapping is more important than ever due to the amount of devices connecting to the network. By skipping this key survival tip and moving ahead without a plan, you are likely to make assumptions and guesswork, leading to uncertainty, doubt and errors. 2. Wireless is the way forward The use of wireless equipment in government is on the rise due to the low cost of purchase (compared to traditional wired installations) and maintenance. However, things can often get quickly out of hand when Bring Your Own Device (BYOD) comes into play, causing all sorts of issues for government customers. However, tools such as wireless heat maps, which help visualise both usage as well as coverage gaps allow staff to manage over-used access points and under-served areas of the building. Meanwhile device tracking tools can allow agencies to track rogue devices and maintain BYOD policies. 
In the past, many of these tools have been too expensive to acquire and use; however, newer options that you might not be aware of are now available. 3. Be prepared for the Internet of Things Government agencies are increasingly experimenting with the Internet of Things, and the IT professionals behind this need to remember that many of these ‘things’ connect directly to the internet. Because they do not coordinate with a centralised controller on the LAN, each device incurs a full conversation load, burdening the WAN and every element in a network. Worst of all, many of these devices prefer IPv6, meaning you’ll have more pressure to dual-stack all of those components. So how does the IT pro survive this? Specific application firewalls can help to make sense of the most complex device conversations, whilst getting IP address management under control and preparing for IPv6. Firewalls can also drive effective quality of service to ensure that critical business traffic is managed, classified and segmented – enabling the IT pro to monitor the overall flow. 4. Allow your network to grow and change Government networks are growing; however, the infrastructure doesn’t always follow the initial plan. You need to predict the scalability and growth needs of your infrastructure so that you can stay on top of demands and manage costs. This can be done by leveraging capacity forecasting tools, configuration management and web-based reporting. In hindsight, most errors and problems, such as network outages, are predictable, so make sure that you catch them in a timely manner by implementing the tools to help you. 5. All about the application The whole point of having a network is to support your end users. While it means expanding your view beyond a narrow (and admittedly manageable) focus, your IT environment will thrive and flourish when you gain a holistic view of the entire picture, including the impact of the network on application issues. Rather than siloing things such as network management, storage, web and compute, you should take an overall view of the infrastructure in order to achieve your mission and support your stakeholders. 6. A bad workman blames his tools Sophisticated tools can do all manner of things that make an IT pro’s life easier in today’s IT department. However, we shouldn’t overlook the need for certifications, training and skills which ensure that the tools are used in the best way to manage the network. Intelligent, state-of-the-art tools are important, yet when paired with the right skill set, they are unbeatable. 7. Revisit, review, revise The network is a living, breathing entity. As it grows, the IT professional needs to grow and change with it – agility and adaptability are key in order to keep up with the network and ensure it consistently runs at its peak. Successful network management is a cyclical process: the network needs to be constantly re-examined so that changes can be addressed and rectified as they arise. [easy-tweet tweet="As a #network manager, the key to survival is through preparation, flexibility and patience"] As a network manager, the key to survival is preparation, flexibility and patience. The modern government IT pro needs to be as prepared as Bear Grylls for everything technology – their Mother Nature – throws at them, and these seven hard-and-fast rules should provide a lifeline. ### Where are you better connected: At sea or in space? 
It is often said that we know more about the Moon than we do about the deepest parts of our oceans here on Earth. Whether or not that statement holds up to much scrutiny, it is certainly true that both the open sea and the darkness of space share a remoteness that brings with it a number of challenges. [easy-tweet tweet="Sea & space share a remoteness that brings with it a number of #communication challenges"] Communication is foremost among these shared challenges, and the level of connectivity that we’ve come to expect on land is much harder to achieve at sea or in space. Connectivity at sea The 2015 Crew Connectivity survey shed some light on the level of communication available while travelling on the high seas. According to the report, 79 per cent of respondents had access to satellite telephone, 43 per cent had internet access and 24 per cent had access to SMS messaging. Many of these communication services are only available in certain parts of the ship, meaning privacy can be an issue, and they may be in high demand. Whilst it seems that the 2006 Maritime Labour Convention has helped to ensure that crews are not completely cut off, there is still far less connectivity than we would expect on shore. Connectivity in space Initially it may seem unlikely that astronauts would have any connectivity with their loved ones back home. The International Space Station, for example, orbits some 400 km above the Earth. However, technology allows those onboard to stay in touch with Earth despite the vast distances and logistical challenges involved. In fact, in 2010 NASA introduced a software update that granted astronauts personal use of the internet in space. Although speeds are slow (reportedly worse than dial-up), the online connection helps to mitigate feelings of isolation and even allows them to keep up to date with any social media goings-on. In order to get online, the data request must travel from the astronaut’s laptop to a network of geosynchronous satellites, then to a receiver on the ground and then back to the astronaut - a journey of around 22,000 miles. As well as online access, astronauts can also make phone calls in space using a specialist program known as Softphone. Astronauts simply dial the number through their computer keyboard and then speak into a specially designed headset. [easy-tweet tweet="Astronauts often have to put up with lag and periods of complete #communication blackout"] It may seem unlikely, but in many ways connectivity is less of an issue in space than it is at sea. Having said that, astronauts often have to put up with lag and periods of complete communication blackout when out of range. It is also worth noting that journeys at sea may only last a matter of hours, making those periods of poor connectivity much more bearable than months in space. In any case, continued development in wireless communications means that it may not be too long before sailors and astronauts alike can experience the same level of connectivity as they do on solid ground. ### European C-Suite see IT as key to increasing future profit Fifty-two per cent of European Chief Marketing Officers (CMOs) and Chief Financial Officers (CFOs) believe Chief Technology Officers (CTOs) and Chief Information Officers (CIOs) will be their closest allies in driving future corporate profits and increasing margins for their organisations. Forty-one per cent have already begun to align themselves more with the technology department in order to boost their effectiveness in their own role. 
These are some of the key findings in an in-depth report from profit optimisation experts, Vendavo, and academic Patrick Reinmoeller, Professor of Strategic Management at Cranfield School of Management, based on a poll of 200 C-Level executives from across Europe. While it’s clear that many of the CMOs and CFOs questioned view technology as strategically important for driving future margin increases, there is some dissatisfaction with the insight they currently have. The research showed that the ability to manage margin effectively is directly linked to company performance above market expectations, yet many organisations admit they do not currently have the tools to do so. Nearly four in ten respondents (thirty-eight per cent) would not know how to increase margin tomorrow. Those organisations which are delivering relevant up to the minute pricing data into the hands of their sales teams are more likely to be performing ahead of market expectations than those that are not – fifty-five per cent that use real-time data are performing ahead of market expectations. [easy-tweet tweet="C-Suite suggests that their organisations’ sales force is also in need of better data" hashtags="C-Suite, cloud"] The C-Suite suggests that their organisations’ sales force is also in need of better data. Forty-three per cent believe their sales teams still use gut feeling to make pricing decisions when closing a sale, while sixty-seven per cent rely on the relationship with the customer. Patrick Reinmoeller, Professor of Strategic Management at Cranfield School of Management said, “Pressure on margins has never been greater, while global competition has never been tougher. Businesses have to build new capabilities to understand the market and their customers, pick up the weak signals, make the right calls and swiftly align resources to seize emerging opportunities or avert threats. The best performing CFOs and CMOs are already using IT to influence how strategies are developed. Providing intuitive tools that offer those making the critical judgments with better insights based on better data is going to be core to raising the level of how organisational knowledge is created and deployed.” “CTOs and CIOs have an immensely important role to play in optimising margins and driving corporate profits,” commented Robert Irwin, Vice-President, Business Consulting, Vendavo Europe. “Their colleagues on the senior management table are currently struggling to make sense of the data that surrounds their business and they’re looking to IT to help – not just in putting the right systems in place to capture and serve up the data, but in developing the processes that will enable the broader C-Suite to make the decisions on which the future of their organisation could rest.” ### Managed Support as a USP - #CloudTalks with Zayo's Aaron Shelley LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Aaron Shelley from Zayo about how their approach to people, policy and process helps them to differentiate from their competitors. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Zayo Group provides Communications Infrastructure services, including fibre and bandwidth connectivity, colocation and cloud services to the world’s leading businesses. Customers include wireless and wireline carriers, media and content companies and finance, healthcare and other large enterprises. 
Zayo’s 110,000-mile network in North America and Europe includes extensive metro connectivity to thousands of buildings and data centres. In addition to high-capacity dark fibre, wavelength, Ethernet and other connectivity solutions, Zayo offers colocation and cloud services in its carrier-neutral data centres. Zayo provides clients with flexible, customised solutions and self-service through Tranzact, an innovative online platform for managing and purchasing bandwidth and services. ### How do you know you need DR? Some businesses may feel like they can get by without a reliable disaster recovery solution - that is, until disaster actually strikes. Taking the “it won’t happen to me” approach is hugely risky, particularly with the broad range of incidents that could conceivably disrupt your business. Malicious attacks, insider threats, hardware faults, software integration problems and more recently ransomware - the list of potential disasters that could affect your company is extremely wide-ranging. [easy-tweet tweet="The reality is that technology fails, humans make mistakes and nature is unpredictable" hashtags="DisasterRecovery, Cloud"] What’s more, many businesses are choosing to implement disaster recovery solutions because opting to go without one can have serious implications. Unplanned downtime and lack of availability can be hugely costly, particularly when affecting core IT processes. According to a 2016 Availability Report, the gap between what IT systems can deliver and what businesses need is costing firms up to $16 million every year. When you factor in additional damage to a company’s reputation, disaster recovery starts to make a lot more sense from a financial point of view. In addition, it seems that incidents of unplanned downtime are actually on the rise, suggesting that many businesses are struggling to put a successful disaster recovery or business continuity programme in place. Perhaps businesses are not yet convinced that they need disaster recovery, but that could change very quickly. “My business will be fine. It has been so far” While each individual business will require a bespoke solution that suits its needs, it is likely that all companies will benefit from disaster recovery in the long term. Sensible business leaders are increasingly thinking in terms of “when”, not “if”, disruption will occur. The reality is that technology fails, humans make mistakes and nature is extremely unpredictable. When businesses accept that these factors are outside of their control, implementing disaster recovery is the next logical step. This often starts with an assessment of your business processes to ascertain how long they can remain down and at what cost. Once businesses are armed with this information they can mitigate disruption much more effectively. However, perhaps the single biggest reason why businesses need to employ a disaster recovery solution is that they now operate in an “always-on” environment. In the past, before the Internet and cloud computing, customers accepted periods of unavailability as the norm. This is no longer the case. Disaster recovery is vital for modern businesses because consumers, clients and partners will quickly move on if companies are not able to recover from disruption. Here is a real scenario that caused one company to review its DR plans: “We had a gutter running above the communications room and along the roof housing our air conditioning units for the building. 
Seagulls were destroying insulating material around some of the big pipes and this managed to get into the gutter. We had torrential rain on a Friday which resulted in the gutter filling quickly due to the insulation material getting jammed. The water level rose and started to come into the building. In the corner of the comms room water started to flood in, coming down the wall half a meter from the electrical box which could have blown, but we were lucky. A massive panic and shutting down of equipment followed. Moving it under this stress is not funny, but you cannot help but laugh, seeing Carl standing with a girl’s umbrella in the corner of the room directing the incoming water to a bin with holes in it ..… Luckily for us the facilities manager got on the roof and freed the gutter which resulted in the water stopping. If this had happened just 1 hour later no one would have been in the building and we would have suffered a massive outage from water damage to our essential IT Infrastructure. You may think it will never happen, but this goes to prove that it can.” M7 can help businesses (that are) looking to protect their investment by embracing disaster recovery solutions. The managed services that we deliver include off-site data back up with a wide range of recovery options, for customers who wish to host their own infrastructure. For businesses preferring to make use of M7’s advanced infrastructure solutions, we incorporate full business recovery using the most cost-effective technology currently available into our design and build process. [easy-tweet tweet="The global market for #disaster #recovery is expected to be worth $6.4 billion by 2020" hashtags="DRaaS, Cloud"] The global market for disaster recovery is expected to be worth $6.4 billion by 2020. The rapid growth of this market is testament to the fact that more and more businesses are realising that they too need to be fully prepared in the event of any disruption. If you don’t want to leave your company’s success to chance, disaster recovery offers the safest and most reliable way of protecting its future. ### Security as a Service - #CloudTalks with Alert Logic's David Howorth LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with David Howorth from Alert Logic about the innovation happening around security services, particularly with regard to public cloud. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Alert Logic has more than a decade of experience pioneering and refining cloud solutions that are secure, flexible and designed to work with hosting and cloud service providers. Their services focus on delivering a complete solution that lives in the cloud, but is rooted in real expertise. ### Why the cyber security skills gap shortage is a boardroom issue The average cost of an online security breach for UK businesses is between £1.46 and £3.14 million. Worse still, in 2015 the industry reported a significant increase in the number of breaches in both large and small organisations at 90 and 74 per cent respectively. [easy-tweet tweet="Cyber #security is the biggest challenge for the UK right now and it’s spreading across many industries " hashtags="Cloud"] One suggested reason for such a high increase in security breaches is that businesses are becoming more aware and effective in detecting and reporting cyber crimes. Good news if that is the case, but that doesn’t explain why organisations are still under threat. More importantly, what can they do to protect themselves? 
Cyber security is the biggest challenge for the UK right now and it’s spreading rapidly across many industries – not just in IT. To tackle the issue head-on, we must first understand the causes behind it. The tidal wave of cyber threats The data explosion has created a huge amount of Internet traffic that flows through corporate networks at rapid speed. Nowadays, up to 80 per cent of a company’s data traffic is Internet-related. Lurking inside are malware and spyware waiting for the right moment to strike and infiltrate corporate networks. Our growing reliance on public and on-premise Wi-Fi is creating opportunities for criminals to conduct illegal activities right in front of us – yet these are often hidden in blind spots. As the Internet traffic ebbs and flows, corporate users are also downloading unauthorised mobile and cloud-based applications, and uploading sensitive data onto public cloud storage systems like Dropbox and Box. Many employees are unaware of the dangers of shadow IT. As such, they are unintentionally destroying their corporate security – and along with it the company’s reputation – from the inside out. CISO/IT has no choice but to firefight in these situations. But without a clear single view of the various elements, it’s difficult to pinpoint where the cause of the breach originates or where the best place is to start (re)building defences. Tackling cyber threats from high and low The challenge can be very different depending on the size of the organisation. Large enterprises have established IT security teams that grow with business needs. As such, they often have multiple point products implemented in silos. This creates a significant amount of data for analysis, which is an ineffective and time-intensive way of monitoring potential threats. Target was one such example. Their silo approach became a risk in itself because IT was not able to act on the data quickly enough to stop the data exfiltration from happening. For small and medium-sized businesses, the challenge escalates in other ways. Many cannot afford to hire their own security specialists as these are often expensive and in high demand. Yet SMEs experience the same level of cyberattacks as large enterprises. In some cases, they are at a greater risk than most. Blocking the attack is key, but organisations also need to be agile enough to be able to react to imminent threats quickly and effectively. The paradigm shift to adopting Security-as-a-Service solves these problems for both large and small organisations. A security service removes the need for hiring a dedicated team of security specialists to maintain hardware and deal with uptime/availability. As an example, the Zscaler Internet Security Platform provides up-to-date threat feeds and adds scalable new functionalities (for example, sandboxing) to detect new threats as they emerge. Running a security platform on the cloud offers the added advantage of 24/7 protection for roaming users. It also provides better integration with SIEM systems to automate the identification of new threats and infected devices. As a result, this will free up valuable time for security specialists to focus on protecting the architecture of the internal network, the data centres and inbound firewalls. They will also have a more effective way of identifying infected devices and ensuring procedures are in place to disinfect those devices quickly, so that business users maintain a high level of productivity. 
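To make that kind of automation concrete, here is a minimal sketch of the idea: it cross-references an exported threat feed against outbound connection records pulled from a SIEM and flags the devices that have contacted known-bad domains. The file names, field names and formats are illustrative assumptions, not those of any particular vendor’s platform.

```python
import csv
import json

def load_bad_domains(feed_path):
    """Return the set of known-bad domains listed in a (hypothetical) threat-feed export."""
    with open(feed_path) as f:
        feed = json.load(f)
    return {entry["domain"] for entry in feed.get("indicators", [])}

def flag_infected_devices(siem_export_path, bad_domains):
    """Cross-reference outbound connections against the feed and group hits by device."""
    flagged = {}
    with open(siem_export_path, newline="") as f:
        for row in csv.DictReader(f):
            if row["destination_domain"] in bad_domains:
                flagged.setdefault(row["device_id"], set()).add(row["destination_domain"])
    return flagged

if __name__ == "__main__":
    suspects = flag_infected_devices("siem_export.csv", load_bad_domains("threat_feed.json"))
    for device, domains in sorted(suspects.items()):
        # In a real deployment this step would raise a ticket or trigger quarantine automatically.
        print(f"{device}: contacted {', '.join(sorted(domains))}")
```

In practice the feed and the log query would be refreshed continuously, which is exactly the kind of repetitive work a cloud security service is well placed to take off the in-house team’s hands.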
Cyber security strategy is a collective effort Organisations need to wake up to the fact that shadow IT is happening right here, right now. CISO/IT cannot stop users from using their own apps. Enforcement will only encourage more people to deviate and break the rules. Instead, they should create an open policy, but one that puts the onus on the individual to stay safe online. Cyber protection requires a concerted effort across the whole organisation. IT lays out the policy guidelines; HR coordinates training and oversees employment liability; marketing ensures the message resonates through internal communications and partner networks. But most important of all – the initiative must be led from the top, which is why savvy CEOs are fast becoming the driving force behind cyber security strategies. Once the strategy is in place, the next step is to introduce it to the wider company, as well as supplier and partner networks. Education is key and part of the roll-out must ensure all employees buy into the concept and understand why they may be held accountable for security breaches. Users need to be made aware of the inherent risks of the Internet and shadow IT. This applies to the workplace as well as their private lives. BYOD continues to thrive in the workplace, and employees are increasingly logging onto public Wi-Fi networks using their work laptops and mobile devices. In both cases, they are opening up the defence system and inviting hackers to invade. Skills shortage attracts ‘cowboy’ services The bigger skills gap challenge, however, is that global demand for cyber security experts outstrips supply by almost a third. According to the non-profit security consortium (ISC)2, private and public sector organisations will require six million security professionals by 2019 to effectively tackle cyber crime. However, only 4.5 million have the necessary qualifications. The skills shortage in cyber security will mean that IT and business leaders need to outsource security protection and defence mechanisms. As applications move outside of traditional data centres into the cloud, the smart approach is to deploy security measures that also run in the cloud. One of the immediate benefits is 24/7 monitoring, which provides CISO/IT with better visibility into unusual spikes in traffic and allows them to anticipate possible cyber attacks before they hit the core network. However, the lack of technical skills in-house restricts the freedom with which organisations can customise and manage their own security infrastructures. Instead, they have no choice but to look externally for assistance from consultants and managed service providers. Businesses need to be careful when selecting a technology supplier. A wrong choice could lead to a false sense of security, more chaos and disastrous consequences. [easy-tweet tweet="Choosing the wrong supplier can lead to a false sense of #security, more chaos and disastrous consequences" hashtags="Cloud"] All is not lost. If executed and promoted in the right way, the spike in market demand and generous training investment will spur a new generation of talent into the industry, guaranteeing a safer digital future for everyone. This is why cyber security initiatives must be driven from the top and be incorporated as part of the boardroom strategy. ### Identity as a Service - #CloudTalks with OneLogin's Charles Read LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Charles Read from OneLogin about Single Sign-on (SSO) and identity management solutions. 
This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks OneLogin provides customers with a Single Sign-on (SSO) and cloud Identity and Access Management (IAM) solution. Businesses of all sizes use OneLogin to secure enterprise data, while increasing IT administrator and end users efficiencies. Implementation of the identity management system can be achieved in hours not days, delivering a fully featured administrative and self-service portal. Strong multi-factor authentication, mobile identity management for one-click access on smartphones and tablets, and real-time directory synchronisation add an extra layer of protection. ### The technology driving the rise of crowd-based capitalism The growth of widespread digital technology has changed the way we spend and earn money. The ‘on demand’ culture of online retail has given rise to a surge of new platforms specialising in taking familiar services and using the online network to turn them into a peer-to-peer commercial exchange. [easy-tweet tweet="The growth of #digital technology has changed the way we spend and earn money " hashtags="SharingEconomy"] Whilst giving a ride, running an errand or borrowing an outfit were once something shared between people in an immediate social circle, the growth of websites and apps connecting people online has made it possible for these services to now be exchanged with strangers for a fee. The platforms facilitating this change are driving the rise of what has become known as the ‘Sharing Economy’, a system that relies on ease of use, low prices and brand trust of the technologies behind it. Take our first example, giving a ride. French company BlaBlaCar matches passengers who need to travel with drivers who have empty seats. Bookings are made quickly and securely, with payment upfront, and drivers who would previously have been making the journey at their own expense are now covering costs or even earning as they go. By ensuring that all users are identity checked and profiles moderated, BlaBlaCar has built up trust in its service and adds value to the transaction by insuring each journey and offering further safety options such as a ladies only service. A key part of the brand integrity BlaBlaCar has earned comes from the integration of social profiles such as Facebook to their service. Alongside reminding users that they are dealing with real people, the social capital carried by social network users has transformed the review and recommendation process for customers. Users are much more likely to trust an opinion or feel safe traveling if a friend in their network has responded positively to their own experience. The fact that the BlaBlaCar lift sharing network now transports more people per year than America’s rail network Amtrak, despite not having any concrete investment, is proof in itself that a large proportion of consumers see this peer-to-peer exchange as a preferable alternative to purchasing through the more traditional market. Website and app TaskRabbit also uses a similar trust model build confidence in its brand. By aggregating a wide range of ‘taskers’ who can fulfil odd jobs and errands, TaskRabbit matches people who have the time and/or skill to complete tasks with people who need them doing. The platform provides transparent pricing and online payment options, eliminating the need to ask for quotes or pay in cash. 
However, the most revolutionary aspect of the platform is the collation of trusted traders all in one place, removing the need to ‘shop around’ and that feeling that you might have got a better deal if you’d looked elsewhere. The effect this service has on the economy is huge. We see spending increase as users are given more options to buy tasks they may have done themselves, or previously put off due to the hassle of sourcing labour. TaskRabbit markets it as ‘buying time’ and it’s working, not just for the customers but for the traders as well as ‘taskers’ are able to work flexibly, selecting the jobs they wish to take on so the workload can fit around their lifestyle. This flexibility is another important part of the Sharing Economy as it gives the trader and the customer more control. Companies like Rent the Runway are offering consumers flexibility in their purchase options by encouraging a move towards low commitment, lower cost rental options. Rent the Runway provides a high-end fashion rental service, allowing users to hire clothes for special occasions that for most buyers would otherwise be unaffordable purchased as a one-off. Whilst their partner retailers would previously have relied on a limited customer-base paying high prices, Rent the Runway provides a service where customers are paying lower amounts, to rent out outfits more frequently. The company partners with stores and designers to offer a wide range of choices, provide styling advice and clothing insurance during the rental period, using the brand capital of the designers integrated with the social capital of user feedback and ‘real life’ examples of customers wearing the clothes to build customer loyalty. Frequent users of the service have the option to commit slightly more and pay a one off monthly fee to select three items of clothing to keep for up to six months. Many of the tech platforms mentioned above are well on their way to becoming household names, but as they become more established in the Sharing Market, what are the next generation of technologies driving crowd-based capitalism and what are they doing differently? Look out for names like OpenBazaar - a platform for traders and consumers to connect directly without the facilitation of a third party. This and other platforms like Lazooz, where users can offer lifts in their empty car seats, sound familiar to BlaBlaCar and eBay, but the difference is that apps and websites like these are pioneering the concept of a decentralised market, where the consumers and providers participating in the market are also running it. In these examples, the technology allows users to connect with each other through the network and conduct transactions between themselves with no centralised infrastructure to bump up fees. The crowd is not only the source of the labour but also provides unregulated information and money. [easy-tweet tweet="Flexibility is another important part of the #SharingEconomy as it gives the trader and customer more control"] How far we move towards this peer-controlled market model and reject the brand trust for a system based entirely on social trust is as yet unknown, but it is clear that our growing access to tech like smartphones, digital ID verification and the ‘social capital’ of Facebook has shifted economic activity away from traditional institutions towards the peer-to-peer marketplace and this will only increase as the technology improves. 
### Internet Performance Management - #CloudTalks with Dyn's Paul Heywood LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Paul Heywood from Dyn about how Internet Performance Management delivers greater visibility and control across the global Internet. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Dyn is a cloud-based Internet Performance Management (IPM) company that provides visibility and control into cloud and public Internet resources. Dyn’s platform monitors, controls and optimises applications and infrastructure through Data, Analytics, and Traffic Steering, ensuring traffic gets delivered faster, safer, and more reliably. ### How will the EU Referendum affect IT compliance in the finance industry? On the 23rd June, the UK electorate faces its biggest decision in a generation. The debate surrounding the EU referendum has focused on a variety of issues: immigration, national sovereignty and economics to name but a few. However, the technology industry, based both in the UK and abroad, will also be waiting eagerly for the electoral result. If you happen to have read any headlines about Safe Harbor, Privacy Shield or the upcoming General Data Protection Regulations (GDPR), then you’ll be well aware of the hands-on role that the EU plays with regard to data protection and IT compliance. Many businesses are now facing a period of uncertainty as they wonder if, and how, the EU referendum result will affect IT compliance. The finance industry is already undergoing a period of great disruption, with Fintech startups unsettling the established players. When confusion over the UK’s relationship with the EU is added to this, businesses are understandably concerned. Looking ahead, the biggest change to IT compliance is likely to come in the form of the GDPR, which is due to be enforced in 2018. [easy-tweet tweet="Tech firms will be hugely interested in the outcome of the EU referendum" user="veberhost " hashtags="EURef, Brexit"] What Brexit means for GDPR Many financial businesses may believe that they should wait until the result of the referendum before assessing whether they need to comply with the EU’s General Data Protection Regulations, but this is a dangerous approach to take. It is likely that UK businesses will need to comply with GDPR, regardless of whether we vote in or out, providing you are dealing with EU customers or businesses. It is also likely that in the event of Brexit, EU businesses will claim that the UK’s pre-existing compliance laws – the Data Protection Act 1998 – is not stringent enough. Ultimately, this means that UK companies need to start assessing their compliance standards now, before it’s too late. For some businesses, this may involve a long, hard look at their internal IT resources, but for many others their relationship with their cloud vendor will take on added importance. Cloud compliance Finance firms worried about whether their cloud vendor will meet compliance standards in a post-referendum UK must begin acting now. Ensure that your supplier already adheres to the highest security and compliance standards and if they do not, it may be time to find a new vendor. At Veber, we have a compliance team in place that is focused on ensuring quality, environmental and security standards. As well as our ISO accreditations, our own “Veber Promise” guarantees that you’ll receive a best in breed service. 
Also, as an approved G-Cloud supplier you can be sure that our broad range of cloud services have been vetted against the most stringent public sector standards. For financial businesses, it is vital that their cloud vendor has a clear understanding of IT compliance in order to avoid regulator fines and reputational damage. At Veber, our experience and expertise can give your IT leaders peace of mind, no matter what happens on the 23rd June. [easy-tweet tweet="Finance firms worried about #Brexit affecting compliance in the UK must begin acting now" user="veberhost " hashtags="EURef"] Businesses that are looking for stability, whether we end up in or out of the European Union, are best served going with a UK-based cloud vendor that offers only the highest levels of data security and compliance. ### Moving to the cloud: A security breach waiting to happen? When Gartner predicted at the start of the year that 95 per cent of cloud security failures would be a customer’s fault, many in the industry took a sharp intake of breath. [easy-tweet tweet="Policy, process and people need to join together to ensure every application decision is thought through" hashtags="Cloud"] It’s true there’s often a sense among IT security teams that they can no longer protect what they no longer manage at their fingertips, despite knowing that moving to the cloud brings great benefits. It’s akin to leaving your child at a nursery – a tough decision to make and often filled with a natural sense of fear but the right thing to do on balance. And it’s right that Gartner highlights the concern. Security needs to be at the heart of all IT decisions. There have been too many high profile breaches for complacency. But it’s not necessarily true to say that adoption of the cloud is impeded because of trust however. The reality is that there are many established cloud service providers that have better general security operations and practices than some enterprise organisations. It makes sense. Knowing security concerns have been a deterrent to adoption, large providers have put a huge emphasis on establishing strong standards – you can now find ISO standards, assurance registries and frameworks for establishing which providers you can trust. Having said that, that’s not enough. There are still risks. Inevitably, when you share resources with others there is always the potential for ‘collateral damage’. No one wants to be a cyber domino but the possibility exists. If the service provider is overwhelmed by a DDoS attack then unintended victims can be brought down too. Hackers go for glory and a service provider is a honey pot for causing maximum disruption to customers. Similarly, if an application running on the cloud has a vulnerability that is breached, the breach could lead to unauthorised access to other parts of the cloud environment. So while there is a common thought that eventually everything will be in the cloud, it’s not surprising that 79 per cent of IT professionals think that some applications will always need to be hosted on premises. That’s because there are applications that require zero latency as they are tied to physical operations, or they are bound by data sovereignty issues that won’t allow data to be hosted elsewhere. These scenarios depend on security and so the decision to move lock stock isn’t as simple as it sounds. The very nature of having on and off premises applications means you make multiple demands on the security team. Often that’s where trouble lies. 
It’s not unusual for organisations to suffer a breach and discover it was because of an application no one in the security team knew about. That’s in part why Gartner’s prediction was so severe. Finding and deploying cloud services will likely reduce some of the discipline around application security, from creating secure code to ongoing maintenance and management. This is where policy, process and people all need to join together to ensure every application decision is documented and thought through. But with a pressure to manage security operations across a company’s own data centres and networks, as well as across multiple cloud providers it’s easy to see how a ‘hybrid’ strategy could run into problems. If you’re supporting the orchestration of security policies across multiple environments, then there will be added pressure on stability and agility. No security team would want to stand in the way of being able to respond to market demands but equally keeping the status quo protected is a full time job. So what’s the answer? Stepping back is never a bad thing to do. It creates the time to consider how you’ll support the company strategy that needs to be delivered in the next 12 months and the next 10 years. For example, if you’re in an industry where IoT will be fundamental to your success then the applications you’ll be dealing with could grow as quickly as the devices you have connected to the network. That brings risk. There’s therefore no harm in taking a risk-based approach to considering the options and determining the best cloud adoption when faced with strategic scenarios like this. It’s really important to assess the impact of a breach, the applications that will render operations useless if they suffer any downtime, right through to the merits of a private versus public cloud strategy, and what the service provider can offer – will application layering streamline management for example? With such an approach you can determine where the priorities lie, how cloud technologies need to blend and where the weaknesses will be. Of course, much of this analysis is about assessing the capability of the cloud against the strategy. But there is one fundamental development of the last six months that can also help with the decision making process – namely ‘always on’ cloud protection. Always on detection and mitigation is a widely adopted technique in corporate networks and it makes sense to extend what you already have to the cloud. Why wouldn’t you take advantage of cost efficiency, scalability and working with technology you already know? If you knew your infrastructure, no matter its guise, could detect very specific threats like an SSL attack and keep data anonymous you’d take it. And you’d be even more inclined to adopt if the service also recognised new applications had been added, needed to be protected and were automatically brought into the security fold. Given the applications management complexities described, and the constant need to stay ahead of the competition, having one less thing to worry about will be a boon to the security team. The inadvertent breach will be far less likely and if the worse happened the defences would go up automatically. [easy-tweet tweet="Managing #security policies across multiple environments, will add pressure on stability and agility" hashtags="Cloud"] Gartner’s predictions were well founded, but these are the advancements that reduce the risks and make the decisions about how and when to adopt cloud more clear-cut. 
But most importantly it makes reaping the benefits of cost saving and operational efficiency that the cloud promises a definite reality. ### Public cloud and its rise to prominence Over the past few years, organisations have witnessed cloud computing move higher and higher up their business agenda. It was a movement, which first began as an operational and strategic investment, initially requiring significant change management, but eventually reduced enterprises’ costs, which led to an approach focused more on innovation management. [easy-tweet tweet="Many companies are still wondering whether the public #cloud is the right step for their business"] Thanks to the virtualisation of IT, the transfer of data from physical servers to the cloud has enabled businesses to work in a more flexible way to better embrace the concept of an open enterprise. Yet, so many years after the public cloud first rose to prominence, many companies are still wondering whether it’s the right step for their business. The pros There are many business benefits, which come with storing data in the public cloud, as opposed to on premise. The main benefit is the flexibility – public cloud offers an ‘on-demand’ service for data access, storage and management. This flexibility can drive adoption of bring your own device (BYOD) solutions, as well as enabling more flexible working practices, allowing employees to work with familiar devices whilst being able to access required data from any location. Storing data in the public cloud can be a great fit for companies, which do not deal with highly sensitive information. As well as providing flexibility and scalability, public cloud can also greatly reduce overheads. By opting for public cloud, businesses no longer need to store data on physical servers and can outsource the cost of talent needed to secure and manage that data to the cloud provider. Over the years, public cloud has been best suited for smaller organisations and start-ups, as due to the nature of these businesses, they don’t tend to have the scalability or disposable budget to invest in strong hardware and back office functions. Public cloud, therefore, can be a beneficial solution for smaller businesses considering their storage options and looking to speed up their workflows. However, public cloud’s greatest asset can also be seen as its greatest downfall. Lack of geographical limitations restricting users as to where they can access the data from, can expose organisations to a security breach, especially if data is stored in one country with its own set of regulations, but accessed in another. Which brings us to the cons. The cons More sensitive data will always require stricter security and preferably a highly regulated network, at least until organisations invest in better identity management. Deciding to jump into the public cloud with a one-network-fits-all strategy is a common mistake, which can get businesses into hot water. Before deciding to move to the cloud, it’s critical for companies to examine the nature of the data they handle, in order to establish how many virtual information networks comprise their infrastructure, what data should be handled by which network, and which security settings should be attributed to each. For now, many organisations utilise private cloud for that purpose, yet for companies, which require an ‘in-between’ solution for public and private cloud, a hybrid cloud infrastructure may be the solution. 
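One way to approach that classification exercise is to treat it as a simple placement policy: each sensitivity tier maps to a hosting environment and a minimum set of controls. The sketch below is purely illustrative; the tiers, environments and controls in it are assumptions rather than a prescribed standard.

```python
from enum import Enum

class Sensitivity(Enum):
    PUBLIC = "public"              # e.g. published marketing material
    INTERNAL = "internal"          # e.g. day-to-day operational data
    CONFIDENTIAL = "confidential"  # e.g. customer records, proprietary IP

# Illustrative placement policy mapping each tier to an environment and controls.
PLACEMENT_POLICY = {
    Sensitivity.PUBLIC: {
        "environment": "public cloud",
        "controls": ["TLS in transit"],
    },
    Sensitivity.INTERNAL: {
        "environment": "public cloud",
        "controls": ["TLS in transit", "encryption at rest"],
    },
    Sensitivity.CONFIDENTIAL: {
        "environment": "private cloud",
        "controls": ["TLS in transit", "encryption at rest", "strict access management", "defined data residency"],
    },
}

def placement_for(tier: Sensitivity) -> dict:
    """Return where a data set of the given sensitivity should live and the controls it needs."""
    return PLACEMENT_POLICY[tier]

if __name__ == "__main__":
    print(placement_for(Sensitivity.CONFIDENTIAL))
```

Even a simple table like this forces the conversation about which workloads genuinely need the private side of a hybrid estate and which can safely take advantage of public cloud economics.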
CFOs will be pleased to hear that properly configured hybrid solutions will not compromise sensitive and confidential corporate data as this model gives businesses the flexibility to choose where to store different sets of information – allowing users to take advantage of the scalability and cost benefits associated with the public cloud whilst still maintaining security levels for sensitive data. The continued pressure to drive down IT costs can ultimately make the financial implications of hybrid cloud adoption the defining business driver: getting the cost right can often outweigh all other discussions with the CFO. [easy-tweet tweet="Security will eventually become device, system and data agnostic" hashtags="PublicCloud, Cloud"] The future Public cloud’s journey to prominence is not over yet. In the long term, there will inevitably come a time when the public cloud will be a place for enterprises to store all of their data and applications - under the condition that organisations are able to solve issues around identity management and information rights management. Security will eventually become device, system and data agnostic – focusing instead on people, data and permissions. Once that happens, sensitivity of information will no longer be an obstacle to storing data and applications in suitably secure public clouds. ### SSH & DevOps - #CloudTalks with SSH Communications Security's Matthew McKenna LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Matthew McKenna from SSH Communications Security about the ubiquity of SSH security protocols. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks In 1995, the company’s founder, Tatu Ylönen, invented the Secure Shell protocol, which soon became the gold standard for data-in-transit security. Today Secure Shell is one of the most widely used protocols in the world and SSH Communications Security has grown to serve over 3,000 customers around the globe. ### Disaster recovery and cloud backup A few years ago people were constantly asking, what is the cloud? How will it help me? Why would I move to the cloud? Etc… Whilst those questions are still valid more and more people are able to answer them to some degree, whether talking about business or personal use. [easy-tweet tweet="As more people are relying on the #cloud it is more important than ever that #data is secure and recoverable"] Still a hot topic and increasingly relevant to businesses across all sectors, over 50 per cent of businesses in the UK are cloud enabled. As more people are relying on the cloud it is more important than ever that data is secure and recoverable. Many people worry that as they cannot see where their data sits it is not in their hands and therefore at risk. Whilst precautions must always be taken to ensure all data is secure and backed up, hosting data in the cloud means you are much less susceptible to natural disaster and human error. I recently had a discussion with a start-up business who had not thought about cloud based back up for their data. They were a recruitment agency who had been working hard for 6 months meeting prospective clients and candidates with all data being recorded and saved on a laptop. One evening the laptop was stolen from inside the front door of the owners’ house. This meant that 3 months’ worth of work was lost forever. Obviously devastating for the owners it also taught them a valuable lesson and they would now not be without a cloud based backup solution. 
Imagine if this had happened 3 years down the line…! Whilst computer viruses, hackers and malware are well publicised and scaremongering is everywhere, natural disasters and human error are rarely spoken about in the UK. Few of us want to think about these, as there is little we can do to stop them. Human error can be reduced by education and thorough processes, but how do you really eliminate the risk of flooding, fire and theft? Back-up and disaster recovery are hot topics, so educate your employees and minimise risk! Other benefits of cloud computing include flexibility, meaning improved efficiency and productivity. Team members can work where they want, when they want, whether that is on the road, from home or hot desking. It is also easy to ramp your IT capacity up or down as your business grows. Business costs are reduced too: onsite hardware costs fall as applications and data are stored in the cloud, and with lower capital expenditure, budgeting becomes easier as costs are more easily predictable. Team members can collaborate freely on centrally located documents rather than sending documents back and forth as email attachments, which can quite often lead to confusion and people working on incorrect documents. Cloud computing gives businesses a competitive advantage as you can move quickly and easily with the times and keep up with large enterprise businesses. As discussed above, one of the key advantages is security: even if you were to lose a laptop, you can easily access all data that is stored and backed up in the cloud. The price of replacing a laptop is nothing compared to replacing data, the loss of which could lead to huge fines, loss of trust and even the breakdown of your business. [easy-tweet tweet="Back-up and #disaster recovery are hot topics, so remember to educate your employees and minimise risk" hashtags="DRaaS"] Contact Complete I.T. for an unbiased view on the best back-up and disaster recovery solution for your business. ### Data Privacy & Protection - #CloudTalks with Exoscale’s Matthew Revell LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Matthew Revell from Exoscale about why data privacy is so important for their customers. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Exoscale provides Cloud hosting with an emphasis on Simplicity, Scalability and Safety for SaaS companies and web applications. Based in Lausanne, Switzerland, and with data centres spread throughout Switzerland, Exoscale benefits from the data protection granted by Swiss laws. 
Patient data can be of an extremely sensitive nature, meaning that the upmost levels of privacy are demanded. Healthcare firms therefore, whether in the public or private sector, must choose their cloud vendor extremely carefully if they are to meet the compliance standards required. At Zsah, we ensure that our customer data is 100 per cent private and always remains within the UK. These high security and privacy standards have allowed us to work with the NHS and a number of other customers that deal with sensitive information. Inertia provides another challenge to modernising IT infrastructure in the healthcare industry. Many medical institutions use legacy IT systems that require time and money to replace. Even if IT systems are relatively modern, cloud migration provides another obstacle that must be overcome. Fortunately, cloud vendors like Zsah also provide bespoke support as well as technical infrastructure to help their clients. Any healthcare firms worried about the migration process will receive expert consultation to ensure it goes as smoothly as possible. Conversely, once healthcare firms commit to modernisation, they’ll find that there are a multitude of benefits on offer. Firstly, the scalability of cloud services means that medical institutions can deal with spikes in demand and still offer the same level of service. Constant monitoring, like that offered by Zsah, also helps to ensure that your IT solutions do not suffer unplanned downtime, which could affect your ability to treat your patients. However, perhaps the greatest benefit that cloud computing can deliver concerns collaboration. By storing healthcare information in the cloud, medical professionals can aggregate data, discover trends and potentially cure diseases faster using the latest analytics software. In this sense, cloud computing may not simply deliver efficiency, scalability and reliability, it could actually help save lives. [easy-tweet tweet="Healthcare must continue to evolve at pace, and #cloud computing looks set to play a vital role"] Modernising IT infrastructure within the healthcare sector may take time, but many of the perceived challenges can be overcome by using a secure and reliable cloud supplier. Zsah has already worked with a number of healthcare providers and continues to provide the compute power, managed support and high compliance standards that the industry demands. Looking forward, healthcare must continue to evolve at pace, and cloud computing looks set to play a vital role every step of the way. ### Why smart IT leaders are harnessing the cloud to understand data Savvy businesses use the cloud to ease workloads, increase productivity, facilitate mobility and reduce unprofitable expenditure. More and more IT leaders are looking for ways to work with the large volumes of data they handle on a daily basis, and the cloud lets them navigate the information they have to keep them at the top of their game. [easy-tweet tweet="IT leaders are often looking for ways to work with huge #data volumes on a daily basis" hashtags="Cloud"] Navigating data Data professionals and line of business users now have the ability to incorporate information from many different sources, including Amazon Aurora, Google Sheets, Adobe Analytics, Salesforce and Salesforce Wave Analytics. They also have the added advantage of bringing this together with the mass of data directly inside Apache Hive and Microsoft Azure SQL Data Warehouse, so there’s no shortage of information to make the most of. 
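As a small illustration of what bringing those sources together can look like, the sketch below joins two hypothetical CSV extracts - one from a CRM, one from a web analytics tool - into a single view using pandas. The file names, column names and the choice of pandas are assumptions made purely for the example.

```python
import pandas as pd

# Hypothetical extracts, already exported to CSV from two of the many possible sources.
crm_opportunities = pd.read_csv("crm_opportunities.csv")    # e.g. pipeline data from a CRM
web_analytics = pd.read_csv("web_analytics_visits.csv")     # e.g. visit data from an analytics tool

# Join the two on a shared customer identifier so pipeline value and web
# engagement can be analysed side by side.
combined = crm_opportunities.merge(web_analytics, on="customer_id", how="left")

# A simple roll-up: total pipeline value and average visits per region.
summary = combined.groupby("region").agg(
    total_pipeline=("opportunity_value", "sum"),
    avg_visits=("visits", "mean"),
)
print(summary)
```

The value, of course, is not in the join itself but in putting the combined view in front of the people who can act on it.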
But with so much information to navigate, it can feel overwhelming. The cloud is at its best when it helps cut through that confusion. It enables IT leaders to deliver intelligence to the right people at the right time. It gives them the much-needed flexibility to scale up and down according to demand, meaning they can quickly analyse data when needed. Keeping competitive with the cloud and analytics The true power of the cloud, however, comes when businesses use it to make the most of big data sources to execute a focused approach to analytics. Clever CIOs and their teams can ensure they retain the best advantage and stay competitive by pulling this information from various sources to turn into real-time, actionable insight. When done well, the results can be revolutionary. Take Leeds Teaching Hospital, for example. It uses the hybrid cloud to support advanced analytics and analyse files, retail drug sales and hospital visit history. This helps the hospital identify costly errors that were never recorded in clinical databases, better enabling it to avoid future monetary waste. What’s more, this analysis doesn’t have to be confined to the IT team. Self-service analytics equips wider teams by letting them easily dissect and analyse data problems themselves, in more depth than ever, without being a programmer at heart. This gives them deeper insights, in hours rather than weeks. And as data workers are more familiar with their own functional domains, they become far more effective at working on various projects at one time. Predicting a revolution It comes as no surprise that the volume of information organisations have to manage is going one way: up. The cloud offers a reliable platform which turns both unstructured and structured data sources into valuable business insight. IT leaders need to make sure they’re on top of technology which helps them go further. The cloud enables them to enter the realm of predictive analytics which, when executed well, can transform business performance. The National Trust is a stand-out example of this. In earlier years, its marketing campaigns weren’t matched to specific audiences. On a mission to provide a more tailored experience to its supporters, the Trust decided to use data to predict which information and updates would be most interesting for specific groups of supporters and potential supporters. It has been able to cluster supporters together based on interest or habitual location, which the Trust has then used to develop predictive models that suggest which content will be of most interest for these particular groups. By using predictive analytics, the Trust is able to ensure it is being as effective as possible by taking action based on real-time insight. [easy-tweet tweet="The #cloud lets businesses embrace predictive #analytics which can transform business performance"] The cloud provides the foundation for these analytics technologies which will give businesses the crucial edge they need. Those IT leaders that haven’t yet harnessed the capability of the cloud need to get on board to make sure they have the resources in place as business demands evolve. Once they are using the cloud to help analyse big data, and then go on to draw predictions from this analysis, the possibilities are endless. 
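To make the clustering idea behind examples like the National Trust’s a little more concrete, here is a minimal, purely illustrative sketch using scikit-learn. The supporter attributes, the number of segments and the choice of k-means are all assumptions for the purpose of the example; the Trust’s actual data and models are not public.

```python
import pandas as pd
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# Hypothetical supporter attributes; the columns are illustrative only.
supporters = pd.DataFrame({
    "visits_per_year":     [1, 12, 3, 8, 0, 15, 2, 6],
    "coastal_interest":    [0.9, 0.1, 0.7, 0.3, 0.8, 0.2, 0.6, 0.4],
    "heritage_interest":   [0.2, 0.9, 0.4, 0.8, 0.1, 0.9, 0.3, 0.7],
    "distance_to_site_km": [5, 40, 12, 25, 3, 60, 18, 30],
})

# Standardise the features so distance in kilometres does not dominate the
# interest scores, then group supporters into a handful of segments.
features = StandardScaler().fit_transform(supporters)
supporters["segment"] = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)

# Each segment can then be matched to the content most likely to resonate with it.
print(supporters.groupby("segment").mean())
```

The real work starts after the clustering step: interpreting the segments and deciding which content or campaign each one should receive.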
### Cloud migration: Shining a light on the hidden costs Salesforce recently announced the availability of its Internet of Things (IoT) cloud, which will be hosted on Amazon Web Services (AWS) and represents a step change in the way the company integrates third-party infrastructure. Salesforce says the move was brought about by the need for flexibility to handle “uncontrolled exponential growth” of the IoT service, but it also demonstrates the need for the company to keep a close eye on costs. [easy-tweet tweet="Salesforce’s move to #AWS highlights that operating your own data centre is increasingly costly" hashtags="Cloud"] The cloud’s the limit? Salesforce’s decision to move to AWS highlights the fact that operating your own data centre is increasingly expensive and time-intensive. Pushing services to the cloud removes the need for upfront investment in data centres that won’t reach capacity for some time. Moreover, it gives organisations the ability to configure and scale infrastructure resources according to demand almost instantaneously. Other key benefits of moving applications to the cloud include: significant cost savings achieved from a reduction in capital expenditure and potentially lower operational expenses; simplified processes for businesses to implement disaster recovery solutions and backup systems while avoiding large-scale capital investment; and a more transparent costing structure for business partners. Given the opportunity for cost savings, it won’t be long before CFOs start asking their colleagues in IT whether more of the company’s applications could be hosted in the cloud, rather than on more expensive in-house servers. But to give the best guidance to the business when answering these questions, IT leaders have to go beyond conversations about run costs for on-site versus public clouds. They also have to identify and explain the hidden costs of cloud migration that CFOs typically don’t see. Exploring the hidden costs If only the cloud were as simple as its marketing. While there can be financial benefits with a shift to the cloud, the financial impact of migrating assets to the cloud can be substantial. In the absence of efficiencies created elsewhere, our analysis shows that a company migrating a significant chunk of capacity to the cloud would not be able to offset migration costs for two to four years. It’s akin to moving from a high-cost city to a lower-cost city; even if it’s the right move over time, you still have to face moving costs. Telling the cloud cost story In the long term, as risk-adjusted pricing continues to drop, economies of scale will lead more companies to migrate applications and data over to the cloud. In the short term, the most important thing that IT leaders can do is demonstrate the complete story of cloud costs, in terms of migration and projected future costs. [easy-tweet tweet="IT leaders must demonstrate the complete story of #cloud costs, including hidden migration costs"] Equally important, IT leaders need to figure out how their cloud story fits in with the story that the CFO is trying to tell investors. Where will IT’s cloud strategy contribute to scalable cost savings? Will it instead deliver speed, to fuel enterprise growth initiatives? IT leaders need to understand this context, and demonstrate the fit between cloud economics and enterprise economics. 
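As a back-of-the-envelope illustration of that payback argument, the short sketch below works out how long it takes monthly run-rate savings to offset a one-off migration bill. Every figure in it is an assumption chosen only to show the arithmetic, not a result from the analysis referenced above.

```python
# Illustrative payback calculation; all figures are assumptions.
one_off_migration_cost = 1_800_000    # refactoring, parallel running, retraining (GBP)
on_prem_monthly_run_cost = 210_000    # current data-centre run rate (GBP)
cloud_monthly_run_cost = 160_000      # projected cloud run rate (GBP)

monthly_saving = on_prem_monthly_run_cost - cloud_monthly_run_cost
payback_months = one_off_migration_cost / monthly_saving

print(f"Monthly saving: £{monthly_saving:,.0f}")
print(f"Payback period: {payback_months:.0f} months (roughly {payback_months / 12:.1f} years)")
# With these assumptions the migration pays for itself in about three years,
# in line with the two to four year range quoted above.
```

Framing the numbers this way makes it much easier to show a CFO both the moving costs and the point at which the savings genuinely start to accrue.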
### Hedge Fund IT Solutions - #CloudTalks with Hentsu's Marko Djukic LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Marko Djukic from Hentsū about how their IT solutions help with agility and compliance in the financial services industry. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks Hentsū grew out of the needs of asset managers to evolve beyond the traditional resource and capital heavy IT model, allowing managers to scale up or down, pivot strategies, and adapt to changing market opportunities at a moment notice – securely, efficiently and on demand. ### The top 4 myths about cloud telephony services Despite the rapid evolution and adoption of cloud telephony systems, many organisations are still reluctant to take this step. This is largely due to concerns around data security. Some believe that sticking to traditional data servers minimises the risk of data breaches. However, innovations in cloud telephony and cloud contact centre technology have been significant and most security concerns about the cloud - such as those around the protection of data and sensitive information - can be allayed by addressing a few common misconceptions and misunderstandings. [easy-tweet tweet="Despite the rapid evolution of #cloud #telephony systems, many firms are still reluctant to take this step"] 1. Cloud telephony is a new technology, therefore it is more vulnerable to security breaches Security is an age-old concern with cloud services and, as adoption of new cloud telephony surges, the misconception that security is a challenge persists. However, security concerns are the same, regardless of physical or virtual components. After all, when we refer to ‘the cloud’, we are still talking about data storage and software applications within a data centre. In addition, leading service providers always employ industry, and in some cases military, standard encryption, intrusion protection systems and firewall configurations to ensure security of client data and to avert outside network intrusion. The increase in security layers added within the cloud has introduced additional methods of authentication. The technology can record and monitor geolocation, device history and the preferred browser linked to your account and requires personal information before logging in. 2. The level of data control and access is reduced Another misconception around cloud telephony systems is that the level of data control and access is reduced. However, cloud telephony actually gives businesses better access and control through administrator rights. Data is typically archived and calls and communication data is stored by cloud telephony providers using military grade call encryption. This archived data in the cloud can be accessed immediately through the call log search capability and is far ahead of in-house systems from both a security and search point-of-view. In addition, all files are stored centrally and everyone sees one version. Greater visibility means improved collaboration, which results in better productivity – a welcome boost to the bottom line. 3. Local computers and networks are better protected than cloud-based telephony systems Even though data may be hosted with the cloud telephony provider’s data centres, that doesn’t make it any less secure. The term ‘cloud’ may seem to convey that data is just floating around somewhere but, in fact, it is a very safe place to store data. 
If anything, a cloud-based telephony system offers an increased level of security over local computers and networks. Cloud telephony providers can afford to invest in innovative, intelligent security measures that can outstrip those that organisations invest in themselves. Generally, data is lost when an organisation loses control of how it’s stored, shared, or secured and what end users do with it. The cloud gives back that control, from data centre to delivery to endpoint. 4. Cloud telephony does not meet industry security and compliance requirements Cloud telephony providers can offer military grade data security, which far exceeds compliance requirements. With industries such as banking and financial services facing heavy communications regulation, cloud telephony solutions hold the key to meeting strict compliance. In fact, cloud telephony systems can more securely and cost effectively meet PCI DSS, MiFID II, FCA, Central Bank, SEC and Safe Harbour requirements than traditional systems. [easy-tweet tweet="The biggest risk of not embracing #cloud #telephony is that companies will fall behind their competition"] The biggest risk of not embracing cloud telephony and technology is that companies will fall behind their competition. By moving to a cloud telephony system, companies can take a significant leap forward in terms of ensuring data security and meeting regulatory compliance, making these concerns a thing of the past for cloud adopters. ### Cloud is driving a massive shift in the accounting industry Technology is driving a once-in-a-generation shift in the accounting industry which we expect will hit a critical inflection point this year. [easy-tweet tweet="The move to #cloud accounting platforms means software is doing more of the heavy lifting"] The move to cloud accounting platforms means software is doing more of the heavy lifting, freeing advisors from the drudgery of data entry so they can focus on higher value activities and help boost the success of their clients. With more accountants choosing to work with their clients in the cloud, the industry is undergoing a significant shift - especially when it comes to how we perceive and demonstrate value. Working on a single ledger, in real-time with our clients means we’re able to develop closer client relationships and up-level to offer more CFO-type services. Advisory services are becoming the new norm More often than not, firms are turning to advisory services as a way to differentiate and generate new revenue. Advisory and consulting services revenue growth was up in 2015 for 32 per cent of small- and mid-size North American accounting firms, according to the latest International Federation of Accountants (IFAC) Global SMP Survey. There’s a similar trend happening here in the UK, and right across the world. To make advisory a sustainable part of your practice, you shouldn’t just be doing your client’s sums, you should also be doing your own. Before you get all tangled up in helping your clients, figure out the level of service you’re willing to deliver and what this means for your own business. Cloud platforms like Xero are enabling accountants to do the technical, compliance-oriented work faster. It’s giving them more time to interpret the data and provide value-laden advice to the small businesses owners they work with. The role of an accountant is shifting from someone who just crunches historical numbers to someone who is a critical member of a small business owner’s team. 
Technology is actually driving relationship-based accounting partnerships and moving the advisory needle.

Automation takes a load off the accountant’s back

Automation is one trend that’s being discussed right across the accounting industry. HMRC is looking at how filings can be better streamlined and we’re building automation into our platform so advisors can better collaborate with their clients. There has been some concern that automation will make accountants irrelevant, but that’s very far from reality. Automation and technology changes in the industry actually place advisors back at the centre and open up a raft of opportunities. By freeing up their time, automation lets accountants offer advisory packages to small businesses that were previously not commercially viable – either too expensive for the small business owner or without enough margin in it for the accountant. With better access to data, the role of the accountant as a small business advisor is stronger than ever. It creates an opportunity to expand services beyond the traditional activities. By providing meaningful accounting information to your clients in real time, it is possible to offer high-value services such as chasing debtors with automated debtor-chasing apps, or sourcing funding and finance through secondary finance platforms that take data from online accounting platforms to make rapid lending decisions.

Data is driving deeper insights

The idea of data-driven advice is also evolving. With small businesses using a range of software solutions to run their businesses, from point-of-sale solutions to employee management, all of which feed into their cloud accounting solution, accountants can provide detailed, granular and customised advice, quickly. There are also new products and features that can help accountants in their transition to the cloud and advisory services. Xero now offers integrated payroll and auto-enrolment services to help streamline customer management. Assisting small business owners to ready themselves for auto-enrolment is a massive business opportunity for the UK’s accountants. Over the next two years the bulk of the country’s 5.3 million small businesses will have to contend with their staging date and, for many, how to deal with the changes is still a mystery. However, given that the majority of the UK’s small businesses still use simple paper-based methods or spreadsheets to keep their books, the challenge of having to contend with more complicated payroll requirements will not be trivial. They will need their trusted advisor by their side, more than ever before. [easy-tweet tweet="Moving to #cloud based accounting can help businesses to manage finances digitally"]

By helping your clients transition to cloud-based accounting and small business software now, you’re going to help them avoid a lot of pain when they’re required to manage their taxes and accounts digitally by April 2018.

### Where are you on the G-Cloud Maturity Model?

There are some things going wrong on G-Cloud. There are some going right. There’s finger-pointing and name-calling. [easy-tweet tweet="G-Cloud can be good for public services, citizens, taxpayers and our industry – if it goes right" hashtags="Cloud"]

Personally, my focus is SaaS, but you’ll be onto a safe bet if you think 90 per cent of this applies to IaaS, PaaS and SCS (Specialist Cloud Services). And just so you know, I’m all in favour of G-Cloud. It’s good for public services, citizens, taxpayers and our industry – if it goes right.
It can deliver competition, flexibility, efficiency, new ways of working and new outcomes. Currently, it is doing some of this, but it can do more and it needs to grow up a bit. Like any new market, like any new organism, it’s immature in its early years. Tend it, root and branch, and it will bear fruit.

In SaaS terms, £120 million has been spent in the four years G-Cloud has been around – roughly £60 million in the 12 months to 31 March 2016 (75 per cent to SMEs). Is £60 million a lot of SaaS? Well no, not exactly. But here are a few observations – it’s bigger than it looks:

- It’s growing exponentially. Looking forward, its significance will define an epoch if it survives infancy and gets all that is required for sturdy growth.
- G-Cloud has been operating in four years of severe government expenditure constraint. Anyone who has run a big IT budget will know that in that situation the majority of your spend was fixed several periods ago. Your discretionary spending is very small, and your new IT projects are dwarfed by keeping your existing ones running. G-Cloud is meant to deliver savings (cloud, opening the market to agile SMEs, thinking Agile). We know it does, although it is difficult to estimate how much. So from an opportunity cost perspective, G-Cloud spend is delivering more for less than traditional procurement would do. That £60 million price-tag would be higher under another procurement model.
- It’s SaaS, so it’s mainly recorded as pay-per-use, pay-per-month/quarter/year. This £60 million figure would be at least three times larger under perpetual licences for the systems it replaces.
- From the vendor’s perspective the £60 million is pretty much pure application revenue. Professional services (often excluding support) fall under another revenue class – SCS – so implementation, migration and training are all earned by SaaS vendors on top of this.

So that £60 million is a much bigger figure than those with which we may be judging it, but nevertheless, it’s still £60 million, which is not enough.

G-Cloud isn’t working. DON’T PANIC! It is working for some. 100 SaaS vendors made over £100,000 in sales over the 12 months to March 2016. Revenue figures like this have significance – anything over £100,000 and the market can sustain one FTE. Any CEO has got to believe they can get to that sustainable level of revenue in a reasonable space of time, or they should go find a better market. But G-Cloud has been running for four years and 1,200 unique SaaS vendors are not yet at the magic number. Over 1,100 sold nothing in the last 12 months. So 8 per cent are making a living or better and 92 per cent aren’t.

Here is a worthy quest. You see, I know these companies. They are some of the most successful, entrepreneurial and valuable companies in the economy. They are our good-news, growth industries. They have the brightest, hardest-working, most driven individuals and teams in UK PLC (OK, and from our trading partners, no slouches there either). And yet, 92 per cent of them are selling diddly-squat – why?

No Excuse

My three-year stint running EuroCloud in the UK, the trade association for supply-side cloud businesses, has given me great insight and access to vendors, CCS (Crown Commercial Service), GDS (Government Digital Service), public sector customers and professional advisors across the field. I hear a lot of excuses. A lot of myths are being spun. But I don’t buy it. On the whole, the vast majority of procurements are run totally fairly, under the rules.
No one has “stitched-up the market”, there’s no “old-boy network” and the 100 successful vendors didn’t all do their sales training at Hogwarts. Try asking “why?” – that’s the most important three-letter word for creating progress.

Eureka! Well, I’ve found it. I’ve found the causes of failure and the non-functional contributors to greater success. I define the functional attributes of a service as what it does to create value, plus its price, and separate these from the non-functional attributes, which for G-Cloud are the 60+ characteristics disclosed on the face of the Digital Marketplace, such as Networks Supported, Location of Datacentre, Browsers Supported, etc. When functional attributes are a cause of failure, the vendor gets eliminated from a shortlist because the price is too high or the solution doesn’t have the required functionality. This is just straight competition – it’s little different from what vendors face in the commercial sector. Considering only vendors who are reasonably successful outside G-Cloud, I can assume they have adequate functional attributes. So for vendors that are successful outside G-Cloud but not on it, it’s something within G-Cloud we need to look at to explain failure and success.

Is G-Cloud a broken framework, incapable of supporting significant SaaS transaction volume? Well, it has weaknesses. For example, I’m not keen on the 24-month maximum contract term – that’s a hostage to ransom. If a customer has to invest a lot in implementing a new system, but doesn’t know what the price to renew will be in month 25, they won’t buy through G-Cloud – or, if they do, they expose themselves to exploitation. But are the conditions stipulated for doing business on G-Cloud so poor that this is the cause of failure? As the 24-month contract term example indicates, there are a number of structural features of the framework that effectively limit the scope of G-Cloud as a SaaS marketplace. It was always intended for the more commoditised services. But, as 100 vendors made £55 million in the last 12 months, someone is making decent returns in a young market against the headwinds of austerity, conservatism, risk-aversion and inertia. So I deduce it’s not something chronically wrong in the state of G-Cloud, because 100 vendors are the proof it can and does work. So, it’s something wrong with the 92 per cent; we need to look at where the unsuccessful differ from the 8 per cent who are making hay. Yes, that’s right, Watson: when you’ve eliminated the impossible, whatever remains, no matter how improbable, must be the truth. It could just be our fault; the fault could be with the vendors who aren’t succeeding. And, to cut a long story short, it is. The evidence, when you look for it, is very clear. The conclusions are documented, and detailed guides on how to cure each of the various ailments I’ve uncovered are all in G-Cloud Success, Analysed (and Failure Explained). From this work I can probably save some 400+ companies nine months and £50,000 or more, each, by making the following big, tactical recommendation: if you have been on G-Cloud for over 12 months, have not made a sale or been notified that you are on a shortlist, and you are copying your G6 or G7 details straight over to G-Cloud 8 – for pity’s sake, STOP! You don’t have to be Einstein to work out it’s insane to leave everything the same while hoping things will change.

Answers, answers, answers

Our SaaS vendors are gifted and successful in their commercial field. But G-Cloud is a different game.
There is a linear pathway to success: understand the new rules, adapt to the new playing field, learn and understand the techniques and equipment – only then can your natural prowess win through. I call it the G-Cloud Maturity Model. Establish where you are now, then plan what you need to do to get to the ultimate level – where your product and sales ability will win.

The G-Cloud Maturity Model for Suppliers

| Level | Action | Note | Example |
|---|---|---|---|
| 0 | Get listed | Getting your service listed on the Digital Marketplace does not mean your service is conformant and capable of being sold. | Wrong documents have been mistakenly attached, or images of documents are so blurred they are illegible at any magnification. |
| 1 | Get conformant | Over 50 per cent of SaaS vendors on G-Cloud today are chronically non-conformant and disqualify themselves from making a shortlist. Vendors can use our study to see and rectify the problems. | Inadequate pricing information supplied. We have classified chronic failure into 6 major categories, with different variants in each; inadequate pricing information is one of the 6 categories. Most zero-sales vendors have multiple failures. |
| 2 | Get optimised | Comparing sales success against the structured data on the Digital Marketplace in an analytical model shows that successful vendors exhibit a statistically significant number of positive attributes that unsuccessful vendors do not. | ISO27001: 60+ per cent of successful vendors appear to be accredited; only 40 per cent of unsuccessful ones are. What’s to argue? Most vendors, particularly SMEs, can get certified in time for G8 go-live. Benchmark and discover where you fall short against successful peers in over 60 attributes we map against revenue success. |
| 3 | Get marketing | Successful vendors, generally, are leaps and bounds ahead in several areas of marketing best practice. Comparing the good with the bad, we list guidance on wording the target search fields and the crucial role of the Service Definition. | Ensure your Service Definition is full and covers the mandatory points; it should not be a ‘teaser’ or general overview, it should be complete. This is one of several marketing deficiencies exhibited by a significant proportion of the weaker performers. Once identified and understood, not one should take much more than a morning’s work to get 80 per cent perfect. |
| 4 | Get selling | If you are at level 3 you should be scoring some hits. But if you want to go the full distance, remember: folks don’t know what they don’t know. You need to build awareness – particularly where you have a solution to a problem but prospects are not thinking of G-Cloud for the answer. They won’t even look for you on G-Cloud if they don’t know about what you’ve got. | Get out and about. Listen to public sector pain (and learn the language of the market): content marketing, exhibitions, seminars and workshops, events, trade press. Don’t just talk, get listened to. |

What are the steps needed to succeed on G-Cloud? G-Cloud is continually evolving. It is growing up fast and the demand side (government managing agencies and public sector customers) is getting there. Don’t fret that they could do it more quickly; we vendors may be holding them back. If four in five vendors can’t even be considered for a procurement because of some fatal flaw in their paperwork, the customers are going to turn their backs on G-Cloud and go back to their comfort zone of direct procurement.
We vendors are responsible for our own evolution; we need to appreciate where we are on the G-Cloud Maturity Model and make it a firm policy to move to level three by the time applications are filed on June 21st. In most cases this can be achieved with surprisingly little effort. We have to get it right for G-Cloud 8, a big leap forward: the 8 per cent making a sustainable business can become 33 per cent, and if you have read this far and are determined to do something about it, then success is not far away. The extent of the work involved to achieve a compliant application which can successfully support your sales plan is often just a few hours – a morning’s concentrated effort. And the tools? Well, I have provided those: G-Cloud Success, Analysed (and Failure Explained) is available here. It is priced at the equivalent of four hours’ consultancy, a value reflecting the personal investment of over 400 hours in this first, publicly available, in-depth study into success and failure on G-Cloud. [easy-tweet tweet="In #SaaS terms, £120 million has been spent in the four years G-Cloud has been around" hashtags="Cloud"]

I wish everyone about to embark on G-Cloud 8 rich success; and I also take this moment to salute the hard-working teams at GDS and CCS for creating and maintaining this magnificent opportunity.

### The Best Is Yet To Come: IoT and Big Data Integration

Five years ago, it seemed more like science fiction than fact: a wirelessly connected, always-on world of devices endlessly transmitting information. Today, it's the Internet of Things (IoT), and according to research firm Gartner, the market is headed for $300 billion in revenue by 2020. Yet companies have barely scratched the surface of IoT; wireless sensors and monitoring devices are still in their infancy. The result? The best is yet to come — device sophistication and data collection will quickly ramp up over the next few years. To make best use of IoT, however, companies need to address the inherent challenge of connected devices and big data integration.

The Volume Vector

[easy-tweet tweet="According to research firm Gartner, the #IoT market is headed for $300 billion in revenue by 2020."] The Internet is already teeming with connected devices, everything from mobile phones to tablets and traditional desktops. Adding in IoT, however, blows this number out of the water — every temperature sensor, every light bulb, every printer and fridge could potentially act as a data collection network endpoint. The Internet is the network these devices will use to make their events and data available. The increase in the number of devices will come with a potentially huge uptick in big data. As smaller and more numerous devices gather ever-more-specific information, the available “pool” of data quickly grows beyond what even the most sophisticated information-handling tools can handle. The volume vector of IoT means companies must find new ways to manage the influx and extract meaningful results.

Streamlined Storage

One option? Streamline storage. As noted by Data Informed, many companies are now moving away from site-based storage to platform-as-a-service (PaaS) solutions that let them store data off site and continually expand to meet the increasing needs of IoT. Flash storage is another possibility since it offers improved density and retrieval times while lowering the chance of hardware failure. However, even flash and PaaS won't be enough to meet the sheer volume of big data produced by IoT devices.
As a result, there's a growing trend toward tape-based storage that sees non-critical data physically stored off site over the long term, rather than compromising a company's digital footprint. Bottom line? Multiple storage types are necessary to handle the information deluge.  even flash and PaaS won't be enough to meet the sheer volume of big data produced by IoT devices Funnelling Down The next step in managing IoT data? Recognizing that not everything is important. Can companies drill down and grab usable data from light-switch monitoring sensors? Sure — with the right analytics tool it's possible to improve space utilization and reduce total electricity spend. But will this same data always have comparably high value? Not likely. Enterprise Apps Today puts it simply: Some data just needs to be read and thrown away. The trick is telling actionable, immediately relevant data from less useful counterparts, and then funnelling this data into a usable stream. As IoT technology becomes more sophisticated, expect a commensurate rise in data funnelling technologies that help companies automatically separate critical data from overflow information. [easy-tweet tweet="As #IoT tech becomes more sophisticated, expect a rise in data funnelling technologies"] Seeing Is Believing Even with the right data on tap, it may be difficult for companies to develop actionable, IoT-based strategies. Why? Because while IT and analytics professionals are used to working with this kind of unending information stream, C-suite decision-makers have a very different worldview. The result? A kind of miscommunication when data hits the boardroom: If executives can't make sense of what they see, there's little chance they'll choose to act. The solution? Improved data visualization tools. By simplifying gathered data into easily consumable “bites” of information, it's possible to communicate the core message of complex concepts without overwhelming those without prior IoT knowledge. Think of data visualization like a translator, allowing decision-makers and stakeholders to effectively tap the critical message of IoT integration. Think of data visualization like a translator, allowing decision-makers and stakeholders to effectively tap the critical message of IoT integration. The best is yet to come for IoT and big data. With the right storage solutions, funnel techniques and visualization tools, companies can get ahead of the game. ### Shifting applications to hybrid cloud: Does your database strategy keep up? Hybrid cloud implementations have grown in number as companies want to speed up their application deployments. Nearly 75 per cent of large enterprises planned to have hybrid cloud deployments in place by the end of 2015 according to Gartner. Moreover, the amount of data stored in hybrid cloud deployments is expected to double in the next two years. [easy-tweet tweet="Nearly 75% of large enterprises planned to use hybrid #cloud deployments by the end of 2015" hashtags="Gartner"] Hybrid cloud deployments are becoming more popular as IT teams want to get the flexibility of cloud while also making use of their internal IT skills and assets. By using a mix of on-premise IT, private cloud and third-party services, IT can deliver applications that can scale alongside the needs of the business. Behind this wider goal, hybrid cloud implementations can support company IT teams in being more flexible compared to deploying internal IT or using public cloud services on their own. 
Options for hybrid cloud include the ability to expand infrastructure resources only when required to meet peak demand, the option to outsource the infrastructure management and responsibility to a third party, as well as more tactical deployments for sourcing secondary storage for disaster prevention. However, running database implementations as part of these hybrid cloud deployments can present challenges. These fall into three categories: #1: How Simple Is It to Manage? The first thing to consider when running a database in a hybrid cloud configuration is how easy it will be to have the database run across multiple locations. If the IT team wants to base the application or service on a single database, then the technical architecture of that database has to be able to run across multiple locations without running into any difficulties in performance or heavy lifting to make this work. At this point, it’s important to look at the underlying architecture of any database that is considered. Master-slave implementations are used for relational databases where one instance is in control of all data reads/writes across a number of “slave” nodes that are used primarily for read operations. In contrast, a masterless database is one where every node is the same; all can service read and write activities. For hybrid cloud deployments, master-slave implementations will almost always have parts of the cluster that are devoted to different activities and functions. For example, some parts of the cluster will handle write operations, while others only handle reads or are marked as failover-only. This can make it more difficult to manage the distribution of data over wide geographic areas. A masterless database takes the opposite approach -  any node within the cluster can take on any operation at any time. This approach can be better suited to hybrid cloud deployments where the number of nodes in the cluster can go up or down in response to demand. At the same time, this approach can service applications that are geographically dispersed, as the nearest node in the cluster can handle all operations rather than needing requests to go back and forth to a master node. One of the biggest concerns is how to smartly move data between on-premise and cloud providers to support any service. The use of a masterless architecture with flexible replication capabilities that enable read/write anywhere independence, the ability to selectively replicate data between on-premise and cloud providers, and keep data synchronised in all locations greatly simplifies hybrid cloud database management.  Lastly, management and monitoring tools used should be able to seamlessly incorporate any machines running the database on cloud providers with on-premises hardware that house the same database. To the tool, machines in the cloud should appear the same as those run on the enterprise’s IT infrastructure.    #2: How Scalable Is It? One of the biggest draws of the hybrid cloud model is the ability to scale quickly. The aim is to avoid compute resources sitting idle and so capacity must be able to expand or shrink based on either current or forecasted demand.  However, predictably scaling a database across internal IT resources and external cloud services is not an easy task. It’s important to understand how databases scale in the cloud, so that performance and scalability requirements can be met. While all databases should be able to scale initially, not all databases are able to scale in a predictable fashion. 
Again, the architecture of any database plays a key role in how scalable it will be across private data centres and cloud providers. For some databases, scale is not linear, so the amount of resources needed to cope with an increase in service requests would be much higher. For masterless database architectures, linear scalability should be available via node increases for both read and write operations. Any database used in a hybrid cloud deployment should deliver predictable scalability. Master-slave architectures are less likely to deliver the same predictability when it comes to scale. For hybrid cloud deployments, where investment in cloud services may be required to deliver results back to the business, this lack of control over costs can affect both the business case and any potential return on investment.

#3: How Secure Is The Data?

Security continues to be a big consideration for hybrid cloud deployments. Handing over any responsibility for the data can be a big hurdle to overcome, particularly if there is a perception of increased risk around unauthorised access. Other common concerns include account compromise, cloud malware, excessive data exposure and over-exposed personally identifiable information (PII). To alleviate these worries, it’s worth looking at the security management and support tools that exist around the database and whether these can run across multiple locations. This should ensure the same levels of protection and security for data no matter where it’s housed. As a base, encryption should be used for all data transferred over the wire, between nodes, and at rest. Similarly, authentication and access authorisation should be in place for access to data across all sites. Lastly, smart auditing functions should be applied so that database access can be monitored from both the cloud and internal locations. All of these security controls should function uniformly across the hybrid cloud deployment.

Alongside security management, there is also the issue of data sovereignty to consider. This covers how stored data is subject to the laws of the country in which it is created and then located. Many of the issues surrounding data sovereignty deal with enforcing privacy regulations and preventing access to data that is stored in a data centre located in a different country. Cloud and hybrid cloud computing services can work across traditional geopolitical barriers, while companies can make use of multiple providers to deliver a service. With so many options available for hosting data, company IT teams have to consider which data regulations are applicable to their operations. There have been a lot of developments around the regulations on data residency, particularly in Europe. The death of the Safe Harbour agreement has come at the same time as a new regulation on data protection in the EU. Under the terms of the European Union’s General Data Protection Regulation (GDPR), companies found breaching these rules could be fined as much as four per cent of their annual turnover. All companies doing business in Europe have two years to put safeguards and management processes in place to protect customer data, as well as maintain adequate controls over how that data is processed over time. Adhering to data sovereignty security requirements in a hybrid cloud deployment comes down to the granularity of control over time.
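One practical way to achieve that granularity is to control replication per keyspace in a distributed database such as Cassandra, an approach picked up in the next paragraph. The following is a minimal sketch using the open-source Python driver for Cassandra; the contact point, data centre names, keyspace names and replication factors are illustrative assumptions rather than values from any real deployment.

```python
# Minimal sketch: location-aware replication via keyspace definitions.
# Assumes a Cassandra cluster whose data centres are named "dc_frankfurt",
# "dc_paris" and "dc_us_east" (hypothetical names) and a reachable seed node.
from cassandra.cluster import Cluster

cluster = Cluster(["10.0.0.1"])   # seed node address is a placeholder
session = cluster.connect()

# Keyspace for EU personal data: replicas stay within the two European sites.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS customer_data_eu
    WITH replication = {
        'class': 'NetworkTopologyStrategy',
        'dc_frankfurt': 3,
        'dc_paris': 3
    }
""")

# Keyspace for non-sensitive telemetry: free to replicate to a US cloud region too.
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS telemetry_global
    WITH replication = {
        'class': 'NetworkTopologyStrategy',
        'dc_frankfurt': 2,
        'dc_paris': 2,
        'dc_us_east': 2
    }
""")

cluster.shutdown()
```

With a layout along these lines, secondary copies of the regulated keyspace never leave the European data centres, while less sensitive data can still take advantage of cheaper or closer cloud capacity elsewhere.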
For example, any database deployed under an application should retain data for specific countries or geographies in the appropriate location, while other data can move freely between clouds that exist in other geographies. The rules should then be applied across the hybrid cloud automatically so that customer data always remains in the appropriate location or locations over time. Replication of data across database nodes for disaster protection should itself be location-aware – for example, in Europe, data can be located with nodes in Germany and France, rather than sending secondary copies of transactions or data to nodes held in other countries. Replication in fully distributed databases like Cassandra can be controlled at the keyspace level. This allows data covered by data sovereignty requirements to be easily restricted to local data centres, while other data can be placed into different keyspaces that allow replication between local centres and other chosen cloud providers. [easy-tweet tweet="#Data #security requirements in hybrid #cloud deployments come down to granularity of control over time"]

Because the database is the heart of nearly every application, it’s important to ensure any database being considered for a hybrid cloud deployment is simple to operate in such an environment, can scale in a predictable fashion, and ensures solid data security.

### Q&A with Khaos Control

Compare the Cloud has been interviewing cloud experts and learning about their companies; today the spotlight is on Khaos Control Cloud. We sat down with Mike Cockfield, Founder and Managing Director of parent company Keystone Software Development, to find out a bit more about his business. [easy-tweet tweet="Keystone Software Development was founded in 2000 in order to revolutionise business management software" hashtags="Cloud"]

Tell us about yourself and your experience in the cloud computing industry.

Keystone Software Development was founded in 2000 in order to revolutionise business management software for SME retailers. Our ERP solution, Khaos Control, is currently used by hundreds of retailers across the UK. Khaos Control Cloud is our first cloud solution and builds on the experience, knowledge and skills that we have developed during that time, to produce a compelling product aimed at small and micro-retailers that need to ditch their spreadsheets in order to grow their businesses.

Who are Khaos Control Cloud and what is your differentiator?

We are a wide-ranging business management solution for small and micro retailers that is completely browser-based. Khaos Control Cloud provides a breadth of functionality that is unrivalled at our price point, ensuring that companies can focus on developing their businesses, rather than worrying about how to integrate their CRM with their stock control and their accounts. Our solution encompasses all of these functions and much more, which provides businesses with a powerful, all-round solution with which to manage their organisations.

What countries has KC:Cloud deployed into?

Initially we are focusing on the UK. However, Khaos Control Cloud is location-agnostic and can already be used worldwide. The solution has been developed in a way that will make it quick and simple for us to move into non-English-speaking territories.

What are the development plans for KC:Cloud?

Our development plans for Khaos Control Cloud are aggressive and far-reaching.
This is not a product that will stand still and early adopters will benefit hugely from the ongoing development phases that we have planned for the product. After launch date we will involve our user community with the product development roadmap. Users will be able to up-vote the features that will have the most impact for them and will be part of the development process for Khaos Control Cloud. As a cloud solution, updates will be released to the entire user community regularly and seamlessly, ensuring that the product is constantly moving forward. Where did the name Khaos Control Cloud come from? As a company we develop solutions that help to control the chaos in customers’ businesses so that they can focus on growth, hence Khaos Control. With such a striking product name, we decided to build upon that when naming our browser-based product, hence Khaos Control Cloud. What are the trends your organisation is noticing in the marketplace? The key trend for us is the continued growth of pure play retailers; companies that only sell online. Online marketplaces and the ability they provide for individuals to test and trial products and brands continues to drive the expansion of micro and small retailers. These companies get so far with manual tools, spreadsheets and other single solutions, but they are held back by not being able to get control of their business as a whole. The second key trend from our perspective is Omnichannel retail. Providing a consistent experience to your customers, regardless of the way in which they're interacting with you is becoming vital, regardless of the terminology you use to define that. That is only possible if you have the right system and business processes in place in order to join everything together. If you're using Seller Central, Magento's CRM and other marketplace back-ends to manage your customers and order fulfilment then you will miss out on the opportunity to provide that consistent experience and every person on every channel will be treated as an individual. Having a single system that allows you to control all of that customer data and interaction means that you can get back to understanding who your customers are and how and where they're interacting with you. You can make them feel valued and ensure that their experience is as excellent as possible, every time they shop with you. How do you intend to attract customers to your new platform? We're looking to grow Khaos Control Cloud as organically as possible, focusing on generating content that will help the entrepreneurs and businesses we know Khaos Control Cloud will help from day one. We're building our following on Facebook and Twitter, as well as looking to raise awareness of our brand and the product via more traditional methods - hence this interview! We are also looking to mix in physical events, and the first of these will be a series of roadshows across the country during the first week of July 2016. What is your definition of cloud computing? For us, cloud computing means providing browser-based solutions to our customers that can be used securely anywhere and at any time. What does your average day involve? An early start! I'm still a developer at heart, and I love nothing more than getting into the office at 6am and spending a couple of hours with my head in a new development challenge or problem before the office gets busy. I deliberately work in the heart of our open plan office, so once the working day is in full swing I'm as accessible as possible. 
To-do lists and task lists are essential tools for ensuring that accessibility doesn't end in tasks being missed! There will be meetings every day, ranging from strategy and general marketing through to 1-2-1s and technical. I like to keep things dynamic and agile, but I also know that giving my time up to my colleagues is vital to ensuring that they feel valued and supported. I always take a break for lunch; getting away from my machine is vital in order to gather my thoughts and focus on what I want to do that afternoon. On an ideal day, I'll be done by 4pm and will relax with some home-cooked food and some time for the family and myself.

Who is the team behind Khaos Control Cloud?

Keystone Software Development now has more than 40 employees and it feels a little unfair to only highlight those working with me on our cloud solution. However, the key technical people behind the product, in addition to myself, are:

- Steve Jefferson - Senior Developer
- David Capps - R&D
- Matt Hadden - Developer
- Max Read - Designer

[easy-tweet tweet="#Cloud computing means providing solutions for our customers that can be used anywhere and at any time"]

If you could change one thing about the cloud computing industry?

Standardise browsers! Everyone has their own favourite and new browsers continue to thrive, with Vivaldi being the latest example. But they all have their own quirks and features, and taking advantage of them in a way that benefits our users is a challenge!

### The role of the cloud in reducing global emissions

Global greenhouse gas (GHG) emissions have increased tenfold in the last century – 90 per cent since the 1970s. One of the most significant, but avoidable, causes of carbon dioxide (CO2) emissions is gas flaring. International efforts to limit the global temperature rise, such as the COP21 Paris Agreement, the World Bank's Zero Routine Flaring by 2030 initiative and CO2 emissions trading schemes like the European Union (EU) ETS, are putting companies under more pressure to reduce flaring operations through higher taxation and increased regulation. [easy-tweet tweet="Global greenhouse gas #emissions have increased tenfold in the last century" hashtags="GreenTech, Cloud"]

Building a holistic picture of flaring

It may seem surprising that around half of plants flaring gas in the chemicals and oil and gas industries simply estimate their emissions. By measuring pipe size and guessing the flow rate, companies are submitting estimates with an uncertainty of up to 20 per cent. Historically, measurement processes have been entirely manual, involving large workforces visiting flaring sites to take physical measurements. In addition to being a significant cost, a manual process increases the risk faced by engineers by placing them in hostile locations. Developments in gas flow measurement, particularly ultrasonic technology, have transformed what was a largely labour-intensive and inaccurate process. Today flare gas measurement can be accurate to within 1 per cent, and measurements can be logged remotely on an ongoing basis, reducing the need for human intervention. Businesses are now looking to the cloud to further automate the process and to deliver critical insight that can reduce taxation and increase revenues.

Cloud automation

The most forward-thinking operators are creating national operations centres by providing connectivity between flare sites.
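The connectivity between a flare site and a national operations centre described in the next paragraph is, at its core, a telemetry feed. The sketch below shows roughly what such a feed might look like; the endpoint URL, API token, site identifier and payload fields are all hypothetical, and a real installation would use whatever ingestion API the chosen CEMS vendor actually provides.

```python
# Minimal sketch: pushing ultrasonic flow-meter readings from a flare site
# to a cloud-hosted CEMS ingestion endpoint. All names and URLs are assumptions.
import time
import requests

CEMS_ENDPOINT = "https://cems.example.com/api/v1/flare-readings"  # hypothetical
API_TOKEN = "replace-with-site-credential"                        # hypothetical

def read_flow_meter() -> float:
    """Stand-in for a real ultrasonic flow-meter driver; returns m3/h."""
    return 1423.7  # placeholder value

while True:
    reading = {
        "site_id": "flare-site-07",          # hypothetical identifier
        "timestamp": int(time.time()),
        "flow_m3_per_hour": read_flow_meter(),
    }
    response = requests.post(
        CEMS_ENDPOINT,
        json=reading,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        timeout=10,
    )
    response.raise_for_status()  # surface transmission failures to site monitoring
    time.sleep(60)               # one reading per minute
```

Once readings arrive centrally, the CEMS can aggregate them per site, process or country, which is what makes the dashboards and trend analysis described below possible.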
When flare gas is accurately measured, for example by an ultrasonic flow meter installed on a pipe, emissions data can be fed directly into a cloud-based Continuous Emissions Monitoring System (CEMS) to give a company access to precise, real-time information on the amount of gas it is flaring. Declining absolute caps under the EU ETS mean that flaring natural resources in industry should only take place when absolutely necessary. Using a cloud-based CEMS to track long-term flaring volumes across a company's whole operation will give it insight into how close it is to reaching its CO2 emissions cap. In 2020, EU ETS emissions caps will be 21 per cent lower than in 2005. By 2030, they will be 43 per cent lower. With such substantial decreases, every cubic metre counts. Flaring information can be presented on a dashboard for executives to review in real time, showing how each site, process, manager, state, or country is performing. A connected, fully automated process enables remote reporting and adjustments, increases safety, reduces workforce requirements and gives a company much greater visibility into site best practice. This enables a business to reduce workforce costs, increase employee safety, reduce carbon tax obligations and deliver significant environmental benefits.

Working smarter with cloud

Companies will need the accuracy provided by measurement technology and cloud connectivity to maintain revenue while implementing reduction schemes. Cloud technologies help drive ROI across whole operations during the implementation of new schemes to reduce emissions, by building a better picture of trends over time. For example, in a plant where flaring only happens during maintenance procedures, accurate real-time data can enable the flaring process to be managed more effectively – reducing both the amount of wasted gas and the taxation paid. With insight into trends over time, the plant will be able to predict flaring volumes more accurately and work out the viability of gas capture technology. This enables a cost (taxation) to be turned into revenue (natural gas sales). [easy-tweet tweet="Over the next ten years, the #cloud will become central to limiting global temperature rises" hashtags="GreenTech"]

Cloud technology is enabling almost every industry in the world to work more effectively and efficiently. Over the next ten years it will become central to limiting global temperature rises. High-emission industries like chemicals and oil and gas will need to drive cloud adoption, harnessing its speed, flexibility and collaboration capabilities to draw new insight from flaring data and deliver better emissions management.

### Post-Cloud Deployment: Making it work for the long term

Cloud infrastructure has heavily infiltrated enterprises over recent years and many key decision makers will be able to tell you of its many benefits to business. These include, most notably, increased storage, access from all locations, security and the ability to manage and scale. The ever-increasing dissemination of these benefits across industries is contributing to rapid cloud deployment. [easy-tweet tweet="Companies use #analytics to track customer experience, ensuring #cloud solutions deliver premium results"]

However, organisations also want to know about the longer-term value of cloud deployment to both enterprise and consumer alike. How can bottom-line results be achieved by maximising cloud infrastructure to collect valuable strategic data for decision-making, or to optimise the customer experience, for example?
The answer is in a clear post-deployment cloud strategy that will enable businesses to manage for the future. So, where do you start? First, a business must identify how the cloud provides added value to its customer base. Enterprises are under increased pressure to accommodate fast-changing customer needs, and to keep up, IT managers need an environment that allows them to develop applications, test products and go to production much faster. In this scenario, businesses bring application development and client services to the cloud, ensuring a quicker, more secure platform of engagement.

Putting Customer Needs First

Shifting applications to the cloud is just the start of the deployment journey. Firms must also deploy the proper edge and connectivity infrastructure in their respective offices, stores, restaurants, etc. to provide customers access to those applications. Customer satisfaction is a top priority in today's business landscape, so it's crucial that Ethernet edge switches are quick enough and smart enough to maintain an optimal user experience. This means prioritising business traffic ahead of bandwidth-consuming, non-mission-critical traffic and moving network intelligence to the edge. Not only does an intelligent edge device prioritise business traffic, it also supports a strategy that aligns with the subscription-based financial models most often associated with the cloud.

Utilise Analytics

Once customers are connected and able to use applications, companies can leverage analytics to track customer experience, ensuring the cloud solution is truly delivering premium results. Analytics also enable a business to become a more agile, revenue-generating operation, enabling the monetisation of network infrastructure. Analytics can also help IT departments make decisions about where they should invest more or – conversely – pull back, based on activity reports on which software is being used on the network. These data-driven decisions combat the onset of shadow IT, which is still prevalent in many organisations and frequently the cause of security issues and unnecessary business expenditure. If a company is paying for a licence that people aren't using, the CIO and IT department can orient themselves based on analytics to understand what's actually transpiring on the network. From there, they can better evaluate the software that has been adopted from a security and revenue standpoint.

Vertical Cloud

In addition to the enterprise, the application of analytics and monetisation can be highly valuable in several vertical sectors as well. In universities, for example, IT managers can see how students interact with applications on a network to determine how it's helping them or proving to be an obstacle. In large sports stadiums, analytics can show crowded areas to appropriately allocate wireless capacity and provide support for fans. Family members and patients in hospitals are able to interact and connect seamlessly during emergency situations. Quality of experience is a relevant measurement for every industry, whether in a waiting room, classroom or boardroom. [easy-tweet tweet="Post-deployment strategies will become crucial to offering customers a seamless experience" hashtags="Cloud"]

Where Next for Cloud?

As more companies implement cloud solutions, their post-deployment strategies will become the determining factor for success when offering customers a seamless experience.
It all starts with analytics, which offer businesses precise visibility into collected data that could increase sales and improve the customer experience. From there, it’s the organisation’s responsibility to leverage their solution as a resource for monetisation. With these steps, companies will be positioned to succeed following the implementation of their cloud solution. ### The most pressing FinTech trend? Cybersecurity The FinTech space this past week has again been pretty much dominated by news about Blockchain technologies. For now however – I want to draw your attention to another, equally if not more important branch of FinTech, namely, cybersecurity. [easy-tweet tweet="No matter the future possibilities and allure of #Blockchain – #cybersecurity is a genuine threat" hashtags="FinTech"] No matter the future possibilities and allure of Blockchain and its derivatives – cybersecurity is a real threat in the here and now. It’s been said many times - but in my humble opinion it is nonetheless worth repeating – that as far as a security breach is concerned – it’s not a matter of if it happens – but rather when it happens. It seems surprising to me that given the bad press, loss of business and regulatory fines that result from data losses and websites being effectively taken off-line, that these types of threat are not getting as much attention as the future/game-changing possibilities of Blockchain. Given the above – I was therefore happy to discover that IBM have published a Whitepaper which focuses on cybersecurity and the effective steps that firms can take to minimise their exposure – even when dealing with a workforce which insists upon using their own mobile devices for both work and for pleasure purposes. This “Bring Your Own Device” (BYOD) approach is far more realistic than trying to force employees into only using an “enterprise” and/or in-house computer. It’s what they’re used to using throughout the day – and for many, the prospect of using some “boring old-fashioned pc terminal” can be a real turn-off. This is why I found the IBM paper so interesting, focusing as it does on the real world issues of firms who want to balance their employee's need to be reasonably happy versus the firm's needs and obligations to protect sensitive data and generally keep systems up and running. This approach further got me thinking that perhaps FinTech companies that up until now have provided good, solid enterprise systems should now consider how they might modify the front-ends of their products to mimic the most popular front-ends of the most popular apps out there. Think about it – we’ve all witnessed the incredible speed with which (particularly younger) smartphone users check their email, Twitter, Facebook et al feeds – share images/videos – send texts – seemingly without any real conscious effort. If enterprise product engineers could improve their somewhat boring front-ends and generally increase the “User Experience” (UX) – think about the likely effects on the employees productivity levels – and more importantly, the firms’ bottom-lines. In adopting this approach – firms would still of course have to be mindful of exactly how they mitigate any possible cybersecurity threats. The aforementioned IBM whitepaper sets out in some detail various options in this respect. 
One other “Non-Blockchain” trend I’ve noticed is how tech companies and even some governments are creating “sandboxes” and/or “garages” where firms can test out new apps, services, products etc. in safe environments, to assess their effectiveness and suitability before they are unleashed on the public. [easy-tweet tweet="A recent #FinTech trend is the creation of sandboxes to test new apps & services in safe environments"] You can find some more background reading on this trend here.

### Unified Communications - #CloudTalks with ScanSource's Paul Emery LIVE from Cloud Expo

In this episode of #CloudTalks, Compare the Cloud speaks to Paul Emery from ScanSource about their new Unify cloud-hosted solution. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks

ScanSource is the leading international distributor of specialty technology products, focusing on point-of-sale (POS) and barcode, communications and physical security solutions. ScanSource’s teams provide value-added services and operate from two technology segments, Worldwide Barcode & Security and Worldwide Communications & Services. ScanSource is committed to helping its reseller customers choose, configure, and deliver the industry’s best products across almost every vertical market in North America, Latin America and Europe.

### Data & Risk Intelligence - #CloudTalks with LogicNow's David Sobel LIVE from Cloud Expo

In this episode of #CloudTalks, Compare the Cloud speaks with David Sobel from LogicNow about how his company's management platform for MSPs helps with service delivery. This interview was filmed live at Cloud Expo Europe 2016. #CloudTalks

LOGICnow is the only 100 per cent SaaS ITSM solution for IT pros who manage organisations with anywhere from a few endpoints up to tens of thousands, spanning multiple locations and functions, securing and managing their systems and data. They provide the only collective intelligence solution, using analytics and machine learning to bring together all the data across the platform and provide knowledge and actionable insights to their users, increasing the effectiveness of the IT pro.

### Will the banks take control of the FinTech movement?

Blockchain and its derivatives (e.g. Bitcoin, Ethereum) have pretty much dominated the FinTech headlines these past few weeks. [easy-tweet tweet="In order to gain maximum leverage from #FinTech innovations, a more international approach is needed" hashtags="Blockchain"] For a summary, be sure to take a look at some of my other Blockchain blogs on Compare the Cloud.

That said, it’s worth noting that “FinTech” generally is grabbing the attention of Governmental Agencies: Clearly, there must be something going on – and if there is – we should have a meeting to determine what it is we need to do. Regardless – let’s have a meeting anyway to ensure there’s nothing we need to do… With apologies to Civil Servants everywhere.

On a more serious note – it seems increasingly apparent to me that in order to gain the maximum leverage from FinTech innovations, a more “international” approach will be needed. Perhaps the Central Banks will lobby to ensure they retain their current “central hub connecting all the spokes” status – all the while refusing to recognise Bitcoin, Ether or similar as a reserve currency. Whilst some countries would certainly enjoy the prospect of seeing the “Almighty Dollar” topple from being the veritable King of the Castle – it’s unlikely to happen anytime soon.
So – within individual countries – will those respective Central Banks permit digital currencies to be used to settle payments due in other (non-digital) currencies or formally exchanged? And if so exchanged, which market(s) should be referenced to establish a reliable/current exchange rate? If these so-called Distributed Ledger Technologies do not in theory at least require Trusted/Secure connections – then who exactly will become the effective guards – and ultimately, “Quis custodiet ipsos custodes?” As I see things currently, there are simply too many vested interests in maintaining the status quo. Sure – FinTech companies can certainly attack banks and steal away from them some low hanging fruits for sure – but can they realistically take on the world’s central banking systems and ultimately replace them? Far more likely in my opinion is that we’ll see the banks “invest” in various FinTech start-ups – and little by little, take control. After all – if you can’t beat them – find somebody bigger who can – or buy them! [easy-tweet tweet="It's likely we’ll see the banks invest in various #FinTech start-ups – and eventually take control" hashtags="Blockchain"] The banks will remain – but potentially huge numbers of employees from within – plus scores more from the eventually likely to be defunct intermediaries (more on this next week – but for now think “Clearing Brokers”) will no longer be required. Costs will be significantly reduced - profits are therefore likely to increase – so how can banks not invest in FinTech? ### Vicom Infinity: An Open Ecosystem to Fuel Innovation Tuesday, June 28 2016, 8am to 10am JW Marriott Essex House Central Park South, New York, NY Conducted at the JW Marriott Essex House located in New York City overlooking Central Park South, the meetup will consist of presentations on Open Source as a driver for innovation and Blockchain’s impact on conducting future business, concluding with a panel discussion. Attendees will meet and discuss these topics with their peers leading to a better understanding of common issues and ways to optimise their IT environments. Agenda (8:00 -10:00 a.m.) Breakfast/Networking Open Source; Driver for Innovation Blockchain; Impact on Future Business Panel Discussion Adjourn ### Pulsant announces acquisition of Onyx and appoints Mike Tobin as chairman Pulsant, the cloud computing, managed hosting and colocation expert, has announced the acquisition of Onyx, an IT infrastructure services provider. This acquisition forms part of Pulsant’s strategy to enhance capabilities and strengthen its market position. The acquisition will benefit both existing and potential customers. [easy-tweet tweet="Pulsant's acquisition of Onyx will benefit both existing and potential customers" hashtags="PulsantUK"] Onyx focuses on providing cloud, colocation, remote and on-site IT management, workplace recovery, applications management and security solutions to customers. Backed by mid-market private equity firm Livingbridge in 2011, it now has five datacentres in Edinburgh, Glasgow, Sheffield and Newcastle, and other business continuity centres and offices around the UK. The company’s staff, technology platforms and infrastructure will be integrated with Pulsant’s over the coming months. The combined business will have almost 400 staff, revenues of £75m and over 4,000 customers in a variety of industries. 
Its operations are underpinned by a network of 15 owned and operated datacentres across the UK, and leading-edge private, multi-tenanted and public cloud platforms. The acquisition will add to Pulsant’s capabilities as a provider of hybrid IT services — particularly in the areas of applications management, remote and on-site managed services, workplace recovery and security — and expand its reach in the UK to deliver more options for resiliency and a stronger portfolio of solutions to customers.

“Bringing Onyx and its capabilities into Pulsant is the latest step in our targeted acquisition strategy. The acquisition increases the scale of our business and the breadth of services we can offer, which is crucial as multi-cloud environments become more complex and more important,” says Pulsant CEO Mark Howling.

Neil Stephenson, Onyx CEO, comments: “We have known Pulsant for many years and always felt that combining Onyx and Pulsant would provide a very strong UK-wide player. Both businesses have complementary capabilities and locations, and this acquisition enables the enlarged business to offer a stronger, broader range of services to our joint customers. The cultures of our two organisations are also very similar, which was a big factor in selecting Pulsant as the acquirer.”

This acquisition marks a significant milestone in the development of Pulsant, and is its first major acquisition since Oak Hill and Scottish Equity Partners became Pulsant’s major shareholders in July 2014. As the business moves into its next stage of development, the company has also announced that it has appointed Mike Tobin, OBE, as chairman. Tobin was previously CEO at Telecity for 13 years, overseeing its growth from a market cap of £6 million to £1.6 billion, and floating it on the London Stock Exchange in 2007.

Howling says: “Mike has an excellent reputation in the industry, a broad understanding of the datacentre and cloud markets and experience of growing businesses in the UK and overseas. I am delighted that he has agreed to become our chairman, which is a significant signal of Pulsant’s quality and ambitions.”

Pieter Knook, Pulsant’s existing chairman, will continue to serve on the board as deputy chairman. [easy-tweet tweet="The acquisition will add to Pulsant’s capabilities as a provider of hybrid IT services" user="PulsantUK"] For more information visit: http://www.pulsant.com/pulsant-acquires-onyx/

### Is the Fintech Revolution being held back by regulatory hurdles?

Traditionally, banking has been viewed as one of the most staid, old-school, stuck-in-the-past industries. [easy-tweet tweet="#FinTech intends to totally disrupt the lives of those working in the financial world" hashtags="Blockchain, Finance"]

Many people have in the past opined that perhaps, over time, banking might become regarded as something of a “utility” service – rather like water, gas and electricity services – running in the background to support shiny devices on the surface – but that, nevertheless, banking would always exist. “FinTech”, which of course is short for financial technology, intends to totally disrupt the lives of those working in the financial world – and perhaps completely change the very structure of banking and other monetary systems.
Anyone following news articles about Bitcoin, Ethereum and more generally, Blockchain will realise that the technology offers so many potential benefits – that banks (even Central Banks) are actively working on projects collaboratively – and/or by ‘partnering’ with FinTech companies – to prove whether the technology actually can deliver – as opposed to possibly being hype. According to a survey conducted by Citigroup, in 2015 almost $20 billion was invested into FinTech companies – a ten-fold increase from just five years earlier. These new technologies are poised to fundamentally disrupt the biggest players in finance. Numerous e-payment services now compete with the likes of Western Union and PayPal – Crowd-Funding type companies make getting a loan cheaper and easier. Foreign-Exchange services are under serious threat from the likes of Circle and TransferWise who claim to be up to 30 times cheaper than banks. Bots will soon advise you and manage your money from your smartphone. Finally, Bitcoin followers believe their digital currency will become the new gold-standard and even Reserve Currency. According to a Citigroup report last week, FinTech may be on the cusp of an “Uber moment,” as Antony Jenkins, the former chief executive of Barclays, predicted last year. Some 800,000 people will have lost their jobs at financial services companies to some of the newly dreamed up software in a decade, the report said. “Roughly 60 to 70 per cent of retail banking employees are doing manual-processing-driven jobs,” the report explained. “If all the current manual processing can be replaced by automation, these jobs can disappear or evolve.” Despite these predictions – there remain many not inconsequential barriers to the prospects of the FinTech companies’ success. Legal, regulatory and compliance requirements have evolved over many years and are an essential part of the banking infrastructure – at a significant cost. How will FinTech companies demonstrate that their systems and procedures are fully compliant with the volumes of laws and regulations that are in place to protect the public? Some US Regulators have offered a potential olive branch in these respects: The Office of the Comptroller of the Currency recently said it thought a new regulatory framework needed to be created to help foster innovation among FinTech companies while ensuring compliance. [easy-tweet tweet="How will #FinTech companies demonstrate that their systems are fully compliant with existing regulations?"] Much lobbying and new-lawmaking would seem to be on the horizon! ### Would you know if your cloud had been hijacked? The benefits of cloud computing have been extensively documented, but what isn’t discussed as much is the growing opportunity for cyber attacks that cloud environments create. The majority of cloud storage providers house vast amounts of data in centralised, multi-tenanted cloud environments. This is great for business efficiency because it allows multiple users to access cloud data across numerous different devices and platforms. However, this approach isn’t without risks and being aware of what they are is extremely important for any business looking to embrace the power of the cloud. 
[easy-tweet tweet="#Cloud account hijacking can result in the leaking or destruction of sensitive company IP" hashtags="Security"] The growing threat from cloud hijacking Cloud account hijacking is a growing phenomenon that occurs when an organisation or individual’s cloud account is accessed by a third party without permission, usually with malicious intent. Attacks of this nature are becoming increasingly commonplace in identity theft schemes, with criminals using the stolen account details to conduct fraudulent activity. However, what’s concerning is that more sophisticated cloud hijackers don’t even need a username or password to gain access to the account. Instead, they are using malicious attachments in emails or browser extensions to gain access to the account’s authentication token. Once in, they are able to pose as the legitimate account owner whilst simultaneously conducting a wide range of damaging activities such as stealing sensitive data, spreading malicious code or even sending traffic to counterfeit websites. Are the risks real? The risks are definitely real. Cloud account hijacking can be particularly devastating at an enterprise level, depending on how attackers use the information they steal, or what else they get up to whilst in the account. Obvious concerns are the leaking or destruction of sensitive company IP. However, the reputational damage caused by actions such as redirected web traffic to fraudulent websites or spamming customer bases with phishing attacks could be catastrophic. Additional legal implications and regulatory fines are also a very real concern for businesses in highly regulated industries, such as healthcare and finance, particularly if confidential stock market or patient data is exposed, for example. What can businesses do to make their cloud accounts more secure? There are a number of simple steps that any organisation can take to quickly improve the security of its cloud accounts and data: Require multi-factor authentication. Several tools exist that require users to enter static passwords as well as dynamic one-time passwords, which can be delivered via SMS, hardware tokens, biometrics, or other schemes Encrypt sensitive data before it goes to the cloud Ensure service providers conduct background checks on all employees who have physical access to data centre server rooms Restrict the IP addresses allowed to access cloud applications. Some cloud apps provide tools to specify allowable IP ranges, forcing users to access the application only through corporate networks or VPNs Implement secure solutions for cloud account hijacking defence Make sure all data is securely backed up in the event that data is lost in the cloud Take time when choosing a new cloud service provider Businesses must also take proactive steps when choosing a new cloud service provider, and prioritise security alongside other key factors such as ease of use and scalability. One important step is to compare the cloud security and data-integrity systems of different cloud service providers within their contracts. Companies are well within their rights to examine the number of data loss or interference incidents a cloud service has experienced in the past. It is extremely prudent to know who you are going into business with. How often do they experiences downtime? How do they monitor and manage vulnerabilities? Do they allow clients to audit the performance in any of these areas? 
All of these are important questions that businesses should do their best to find the answers to before signing on any dotted lines. Security platforms that extend to both the cloud and mobile devices will further bolster security. The ability to control or block risky data activity based on behavioural and contextual factors all add further security layers. Additional capabilities to look out for include end-to-end encryption, application control and continuous data monitoring. [easy-tweet tweet="To tackle the threat of #cloud account hijacking, businesses need a data-aware approach to #security"] As the threat of cloud account hijacking continues to grow, a data-aware approach to security has become more important than ever. Cloud computing may well be the business model of the future, but organisations who want to capitalise on its many benefits would also do well to verse themselves in its risks and how to effectively mitigate them. ### Choosing the right type of Blockchain: Public, Consortium or Private This week I will attempt to differentiate between the three different varieties of Blockchain structures – namely: Public; Consortium and (Fully) Private. [easy-tweet tweet="The Public type of #Blockchains can be accessed and read by virtually anybody" hashtags="Bitcoin, Cryptocurrency, FinTech"] Let's begin with the Public kind. As currently envisioned, these are Blockchains that virtually anybody can access and read. Further, virtually anybody could send transactions and expect to see them get included in the chain (at least if they are real/valid transactions.) Finally, virtually anybody can choose to participate in the verification process (called the Consensus process) to assist in determining which transactions get added and for verifying what current form and status the Blockchain is currently in. In an effort to suppress and hopefully prevent any cheating – these Public Blockchains are encrypted and require significant economic resources to compute the trustworthiness of the data. Thus the trustworthiness of the chain is not considered as being centralised. Financial incentives are provided to those participants in the Consensus process. Consortium Blockchains, however, only permit the Consensus process to be controlled and performed by a select number of nodes. As a possible example, consider a Central Bank which allows only specified, trusted Banks to provide the necessary calculations and thus verify transactions before adding them to the block. Furthermore, any rights to read the Blockchain can also be restricted (or if desired – made public) – most often via the use of certain approved API’s. Consequently, this variety is often known as a partially-decentralised Blockchain. The final category – (Fully) Private Blockchains – are the kind where only one organisation is permitted to write (create/input) new transactions. The read permissions however may still be made public – or restricted to only a certain select few. It is envisioned that such environments will be used where (say) management/auditors etc., need to view the activities which are internal to a company and where public read access does not apply. The Private Blockchain is by far the easiest to control and/or modify. There is no need to get involved in protracted discussions with other companies/individuals whenever changes need to be made. However, as the number of “trusted” participants increases it becomes more difficult to agree upon and then test any changes needed. 
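To make the distinction between the three varieties a little more concrete, below is a deliberately simplified sketch of a hash-linked ledger whose write permissions vary by mode. It is an illustration only: the class, the mode names and the sample participants are invented, and real consensus, incentives and cryptographic signing are omitted entirely.

```python
# Toy illustration of how write permissions differ between Public,
# Consortium and (Fully) Private chains. Real consensus, mining rewards
# and transaction signing are deliberately left out.
import hashlib, json, time

class ToyLedger:
    def __init__(self, mode, authorised=None):
        self.mode = mode                       # "public", "consortium" or "private"
        self.authorised = authorised or set()  # who may write in non-public modes
        self.chain = [{"index": 0, "transactions": [], "prev_hash": "0" * 64}]

    def _may_write(self, participant):
        if self.mode == "public":
            return True                        # anyone may submit and verify
        return participant in self.authorised  # consortium: selected nodes; private: one organisation

    def add_block(self, participant, transactions):
        if not self._may_write(participant):
            return False
        prev = self.chain[-1]
        prev_hash = hashlib.sha256(json.dumps(prev, sort_keys=True).encode()).hexdigest()
        self.chain.append({"index": prev["index"] + 1,
                           "timestamp": time.time(),
                           "transactions": transactions,
                           "prev_hash": prev_hash})
        return True

consortium = ToyLedger("consortium", authorised={"central_bank", "bank_a", "bank_b"})
print(consortium.add_block("bank_a", [{"from": "A", "to": "B", "amount": 100}]))          # True
print(consortium.add_block("unknown_fintech", [{"from": "C", "to": "D", "amount": 5}]))   # False
```

Read permissions are a separate dimension again: in the Consortium case they would typically be exposed, or withheld, via approved APIs rather than handled inside the chain itself.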
Some argue that with truly Public Blockchain approaches, the system is generally more secure – since even programmers may lack the necessary permissions to engineer spurious transactions. Whichever “variety” businesses opt for will to a great extent depend upon their respective needs to satisfy any disclosure or transparency requirements as demanded by their regulators and/or by law. [easy-tweet tweet="The #Blockchain variety that businesses opt for will depend upon their respective transparency needs" hashtags="FinTech"] In the next article I’ll be writing about how the Consensus process and Smart Contracts are actually performed – as well as providing some examples of where Blockchain is (potentially) a great fit – and also where it most definitely is not. ### The network foundations for business success Although best known for the telephony services it offers to UK consumers, TalkTalk has been steadily growing its business customer-base by developing a flexible, high capacity IP network. Already, the company is beginning to reap the rewards of this move. [easy-tweet tweet="New developments like #NFV promise to disrupt traditional approaches to business #networks" hashtags="Virtualisation"] The revenue for TalkTalk Business constituted more than 30 per cent of the company’s total £1.8 billion figure, representing a five per cent increase on the previous year. A major factor in this growth has been increased demand for the data products and next-generation voice services being delivered by its direct channel and wholesale partners. Looking to build on this success, TalkTalk already has a number of ambitious plans for the future. A partnership has recently been agreed with NTT Com for TalkTalk to use its IP transit network to carry the additional SIP voice traffic that its business customers will generate. Similarly, TalkTalk Business is working on a network upgrade scheme in partnership with network system integrators and network vendors like Axians and Juniper Networks. These proposals should ultimately deliver better quality of experience (QoE) for customers, as well as new features like network function virtualisation (NFV). Network functionality is increasingly important to businesses of all sizes, with more and more industries requiring access to high speed connectivity that is able to underpin a raft of mission critical applications and service, from unified communications to big data analytics. New developments like NFV also promise to disrupt traditional approaches to business networks, by replacing dedicated hardware (relating to routers, firewalls and VPN) with virtualised software functions. [easy-tweet tweet="Future proofing your #network is not easy, as it is impossible to predict the impact of new technologies"] Future proofing your network is not easy, particularly as it is impossible to predict what impact new technologies are likely to have. However, by embracing innovative network approaches like NFV, CIOs and IT leaders are laying the groundwork for tomorrow’s business success, today.   ### Does Blockchain need to mature before hitting the mainstream? The last few weeks or so have not been such a great time for aficionados of Blockchain technology. [easy-tweet tweet="Recent analysis from SWIFT did not make good reading for die-hard fans of #Blockchain" hashtags="FinTech, Bitcoin"] First off – I got sight of a detailed analysis conducted by the (worldwide) banks very own SWIFT (Society for Worldwide Interbank Financial Telecommunications) body. 
For those of you unfamiliar with this entity – it is a FinTech company of sorts – owned by central and regular banks around the globe and which enables them to (amongst other things) transfer payments in pretty much any currency on a truly global basis. Regardless – the analysis was far from what the die-hard fans of Blockchain wanted to hear. SWIFT’s analysis pointed to the “need for a focus on Distributed Ledger Technology’s abilities to cope with the following key requirements”: strong governance; data controls; compliance with regulatory requirements; standardisation; an identity framework; security and cyber defence; reliability; and scalability. Furthermore, they produced a graphic depicting the amount of progress DLT has made with regard to each of these categories – notably, three of the requirements were categorised as "not yet addressed." In short, it appears that SWIFT are not convinced that, as it currently stands, Blockchain technology is at a sufficiently mature stage to enable existing/legacy systems to be replaced. A study performed by Morgan Stanley arrived at broadly similar conclusions. Both bodies seem to agree that there exist significant potential cost-saving benefits – but not unless and until all of the issues (compliance, security etc.) are ironed-out – which they feel will likely take between 5 and 10 years from now. Not to be outdone, Citigroup also published a report which focused on the possibility of DLT challenging existing cross-border and Trade Finance payments systems. The paper – produced by researchers Keith Horowitz, Adrien Porter and Michael Cronin – concluded: “We don’t view blockchain as an intermediate-term strategic threat to the profitable high-value cross-border payments business for the banks.” However – in the case of Letters of Credit the outlook was more bullish: the Letter of Credit is a manually intensive process, making it ripe for digitisation. A typical transaction can entail 150 different records and up to 10 different parties signing off on the transaction. Moreover, wrote Horowitz, about 80 per cent of these transactions end up with some form of discrepancy that delays the transaction. Evidently, a type of smart contract could be developed that would record a transaction’s particulars, codify it through the blockchain, and release the funds only after particular conditions are met, said the Wall Street Journal. “We believe blockchain can help tackle the main challenge of trade today, which is just taking the paper-based process and digitising it, while adding features such as smart contracts that would trigger actions such as payment release and/or shipping of goods,” wrote Horowitz. Indeed – Barclays Bank announced that they had recently conducted successful tests using so-called Smart Contract technology on a Blockchain platform to trade derivatives. So currently – it would seem that Blockchain technology can provide benefits only where the number of transactions is relatively small (e.g. it currently could not cope with the thousands of trades per second which are processed currently on Equities/Options markets) and even then – only once the necessary Legal/Compliance issues (to say nothing of the need for ‘Standards’ – as in methods to formally identify instrument types, using Legal Entity IDs or similar, Currency Codes et al) all exist and have been suitably tested. Various governmental bodies have come forward offering their two-cents (or Bits?) 
worth of advice: The UK government could use distributed ledger technology to track student loans and aid money, according to Minister for the Cabinet Office Matt Hancock. Not to be outdone – the ECB came out with: The European Central Bank is looking into the use of distributed ledger technology (DLT) for the Eurosystem's market infrastructure but is wary of the far-reaching consequences for the role of central bank money Finally – and just to prove that SWIFT are not entirely without issues themselves – check out the following headline: "Bangladesh Bank hackers compromised SWIFT software, warning issued." [easy-tweet tweet="#Blockchain can currently provide benefits only where the number of transactions is relatively small" hashtags="FinTech"] In my next post, I will be illustrating how the currently envisioned three varieties of Blockchain structure proffer different strengths and weaknesses – as well as pointing to which areas of Banking, Trading, Finance generally will likely benefit (or not) from what’s available. ### Data Protection & Compliance - #CloudTalks with NetApp's Martin Warren LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with Martin Warren from NetApp about the importance of recognising the long-term value of corporate data. After reviewing the many interviews filmed at Cloud Expo Europe 2016, Martin Warren was awarded the Cloud Talks trophy for the best interview of the day. #CloudTalks Throughout the world, leading organisations count on NetApp for software, systems, and services to manage and store their data. They help enterprises and service providers envision, deploy, and evolve their IT environments. Customers also benefit from NetApp's open collaboration with other technology leaders to create the specific solutions they need. ### Future proof your business: How to become a complete cloud solutions provider Cloud services are rewriting the rules of business, improving the way people collaborate and revolutionising the way we work. For most businesses, this is making the world a lot easier, but for IT service providers, cloud services are making the landscape more complex. [easy-tweet tweet="To maximise profitability and future proof your business, the key to success is on building the value add" hashtags="Cloud"] The proliferation of cloud technology means traditional providers must adapt their offering from transactional services to an integrated solution, delivering services traditionally outside of their core capabilities. The benefits of switching to the cloud are vast. For instance, the latest research from Microsoft and IDC revealed that the cloud is key to growth. Microsoft’s cloud-focused partners are growing at double the rate of non-cloud resellers, and enjoy one and a half times more gross profit. Resellers, service providers and ISVs should take advantage of the long term business opportunity and recurring revenue streams presented by becoming a complete cloud solution provider. However, as cloud adoption has risen, so has customer expectation. The customer lifecycle is now more important than ever, meaning providers must deliver a seamless customer experience through the entire journey. For IT providers to maximise profitability and future proof their business, the key to success is on building the value add. The good news is the door is now open for partners hoping to make the most of this opportunity. 
To guide your business through this challenging landscape, Michael Frisby, Managing Director of Vuzion and Cobweb Solutions, shares his top tips to ensure your switch to a cloud-first model is a success. Build a rich portfolio of services Delivering a tailored solution which meets your individual customer needs is key. Partner with a cloud aggregator who offers a marketplace of relevant and compatible services. This will enable you to implement and combine new capabilities within your own IT services offering in order to expand the revenue and profit opportunities for your business as a whole. It’s important to consider how your offering fits together. Think about whether new cloud services will work to enhance your business and brand with new and existing customers, or if they will appeal to a certain demographic. Consider Billing-as-a-Service Partnering with a provider who offers Billing-as-a-Service capabilities is vital for success, particularly for small businesses which may not have the resources or expertise to take this on in-house. Not only will this boost your own services offering, but using a provider who offers automated customer invoicing, payment collection and credit control minimises the overheads and complexities associated with consumption-based billing. What’s more, this will free up your time to concentrate on delivering better customer service and growing your business - especially if you partner with a provider who offers a 24/7 support service. Why is this necessary? Cloud services require immediate, accurate monitoring and billing. They must support several monetisation strategies, such as upgrading from trials to subscription and bundling services. Differing services will require differing pricing, ranging from recurring monthly fees to usage-based fees. Usage needs to be tracked and reporting must be available on demand. For a traditional business, this is a significant change to accounting, customer care and delivery processes (a simple illustration of consumption-based billing appears at the end of this article). Benefit from a smart ecosystem of partners For businesses that are new to selling cloud services, teaming up with a provider who can connect you with an ecosystem of partners you can collaborate with and learn from is invaluable. Working together with other partners can help you to more easily create and deliver an integrated range of business solutions to best meet your customers’ requirements. No one business or vendor can necessarily offer everything, and that’s where a partnership of complementary services can provide access to the skills and business offerings that customers need – wrapped in the package delivered by your business through the cloud. Remember, marketing is vital for business success Cloud technology has changed the dynamic between traditional IT resellers and cloud providers. As a result, providers are expanding the services they offer to include customer lifecycle management, lead generation, training and sales support. This is great news, particularly for smaller resellers who may not have the resources available to make this investment! When making the switch to the cloud, look for a provider who offers support when it comes to marketing automation, CRM integration and market insight integration. This will boost your customer lifetime value. Partnering with a provider who offers go-to-market support to win and retain customers and create up-sell and cross-sell opportunities will ensure your business is geared to succeed in the cloud. 
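As a rough illustration of the consumption-based billing challenge described above, the sketch below totals recurring seat fees and metered usage against a simple price list. The service names, rates and mixed recurring/usage model are invented for the example, not any provider’s actual tariff.

```python
# Hypothetical consumption-based billing sketch: recurring subscription fees
# plus metered usage. Service names and prices are illustrative only.
RECURRING_FEES = {"hosted_email": 4.50, "backup": 7.00}      # per user, per month
USAGE_RATES = {"storage_gb": 0.03, "support_hours": 55.00}   # per unit consumed

def monthly_invoice(subscriptions, usage):
    """subscriptions: {service: seat_count}; usage: {metric: units consumed}."""
    lines = []
    for service, seats in subscriptions.items():
        lines.append((service, seats * RECURRING_FEES[service]))
    for metric, units in usage.items():
        lines.append((metric, units * USAGE_RATES[metric]))
    return lines, round(sum(amount for _, amount in lines), 2)

lines, total = monthly_invoice({"hosted_email": 25, "backup": 25},
                               {"storage_gb": 480, "support_hours": 1.5})
for item, amount in lines:
    print(f"{item:<15} £{amount:8.2f}")
print(f"{'TOTAL':<15} £{total:8.2f}")
```

In practice a Billing-as-a-Service partner layers automated invoicing, payment collection and credit control on top of metering like this, which is precisely the overhead the tips above suggest outsourcing.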
[easy-tweet tweet="The #cloud has altered the relationship between vendors, businesses and traditional IT service providers"] In short, the rise of the cloud has irreversibly altered the relationship between cloud providers, businesses and traditional IT service providers. To stay competitive, cloud providers must adapt their partner models to deliver greater value and business success for both parties, and resellers must harness the digital era and bring cloud tools into their remit! By so doing, providers and resellers will reap the rewards, as will the everyday businesses to whom their services benefit. ### Understanding Blockchain: What is it and why should you care? A simple way to consider what Blockchain technology could provide is to think of a ledger (e.g. an Excel spreadsheet) that can only be updated when a majority of trusted participants (known as ‘miners’ connected via ‘nodes’) have simultaneously agreed that the proposed transactions are valid and will not permit a situation whereby any erroneous “double-billing” could take place. [easy-tweet tweet="The #Blockchain ledger maintains a full and transparent audit-trail of all transactions ever made" hashtags="FinTech"] Once the miners have reached a sufficient consensus and the transactions are approved – only then will the new entries be permitted to be updated and committed to the ledger. The miners receive compensation in the form of newly created digital currency and/or by receiving a fee for processing transactions. Further, the Blockchain-style ledger maintains a full and potentially transparent Audit-Trail of all transactions ever made via that particular ledger. The Bitcoin currency garnered some pretty bad publicity for apparently allowing some bad actors (drug dealers, money launderers) to use this new digital currency to conceal their true identities and profit from their ill-gotten gains. However, in many senses, the Bitcoin currency is what potentially provides Blockchain technology with such a broad appeal. Suitably authorised participants could transfer value between themselves in virtual real-time – and all the while – without the necessary involvement of existing firms in this space, such as Western Union, banks et al – at virtually no cost. Imagine how this prospect could change not just existing banking structures – but potentially replace the need for central banks too. A politically independent Global Currency unit of exchange that can be, if necessary, transferred to/from other units of exchange – in minutes, has certainly captured the imagination (and not insignificant Research and Development Funding) of the world’s largest banking firms – and even that usually staid “Old Lady of Threadneedle Street” – the UK’s Bank of England. All of these entities are busily working away at figuring out just how they might use the technology to at least stay in the game – or partner with other firms. Indeed – one country’s central bank, Estonia – has already invested and implemented a form of Blockchain technology and claims that as a result, they enjoy the lowest credit card fraud rate in all of  Europe. As mentioned above, transactions are submitted, and then verified by the miners – but what stops these miners from conspiring to somehow modify the new transactions? 
The full technical response to this danger will be covered in a future posting – but for now, just imagine that a majority of the miners would all need to submit exactly the same responses – at exactly the same time – or their proposed transactions get rejected. Also in future postings I will explain in more detail how there exist essentially three different varieties of Blockchain’s so-called “Distributed Ledger Technology” (DLT) and the benefits/weaknesses of each approach. Some pundits are already proclaiming the technology as heralding the dawning of some kind of New Age – others are simply interested in how much money they might make/save by eliminating scores of middlemen/mid-back office costs and the like – whilst still others are not convinced that their existing models and frameworks are yet redundant. [easy-tweet tweet="Estonia's central bank has already implemented Blockchain technology to combat credit card fraud" hashtags="FinTech"] As is often the case with new/groundbreaking technological advances – the level of commitment and ultimately co-operation by lawmakers, regulators, politicians, bankers and the like will determine just how much of a sea-change this technology will actually bring to the world. ### Industry 4.0: It’s time to collaborate or die We are now in the era of Industry 4.0 – the fourth industrial revolution – ushering in autonomous smart cars, robots that can perform tasks with impressive efficiency, and 3D printing. The SmartThings Living Future report created by a group of academics and futurologists recently launched its vision of the future depicting 3D printed food, underwater ‘bubble’ cities and the colonisation of the Moon and Mars. However, the technologies behind this can only develop as fast as their previous incarnations allow: they have to be built on solid foundations. [easy-tweet tweet="Industry 4.0 – the fourth industrial revolution – has to be built on solid foundations" hashtags="Industry40"] Innovating is a difficult task – it wouldn’t be innovating if it was easy. There are countless examples of innovations with the best intentions that have fallen flat on their faces. Universal product codes and electronic data interchange are examples of this. However, this pales into insignificance next to the inability to agree on common standards. Connectivity is the cornerstone of Industry 4.0. It is dependent on interoperability: devices and systems that are able to communicate with one another. Developing multiple standards leads to fragmentation – a huge barrier that will restrict the potential of Industry 4.0. To use a consumer analogy, it’s like having a connected home with an automated ordering service where your fridge is operating on a different OS. The central hub can’t gather information on whether the fridge needs to replenish juice supplies, and the system doesn’t work. Closed systems just don’t work. Unlocking the potential of IoT will rely heavily on interoperability and the ability to address the same old challenges. Tech giants who have spent so long building up these walls will have to play nicely together to agree standards, using open source frameworks to enable innovation between them and other innovators. The Industrial IoT Opportunity Industry 4.0 involves the computerisation of machinery and automation using robotics, as well as the intelligent measurement and analysis of data to improve efficiency, profitability and safety. 
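To give a flavour of what “intelligent measurement and analysis of data” can mean at its most basic, here is a deliberately simple sketch that flags a machine for inspection when a sensor drifts well away from its recent rolling average. The sensor values, window size and tolerance are invented for the example; real predictive-maintenance systems are far more sophisticated.

```python
# Toy sketch of the predictive-maintenance idea: flag a machine when a
# sensor reading drifts more than a set tolerance away from its rolling
# average. All readings and thresholds here are invented.
from collections import deque

def drift_monitor(readings, window=10, tolerance=0.15):
    """Yield (index, value) for readings more than `tolerance` (15%) off the rolling mean."""
    recent = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(recent) == window:
            baseline = sum(recent) / window
            if abs(value - baseline) > tolerance * baseline:
                yield i, value
        recent.append(value)

vibration_mm_s = [2.1, 2.0, 2.2, 2.1, 2.0, 2.1, 2.2, 2.1, 2.0, 2.1, 2.2, 2.9, 3.4]
for idx, reading in drift_monitor(vibration_mm_s):
    print(f"Reading {idx}: {reading} mm/s – schedule an inspection")
```

Real IIoT platforms apply this kind of analysis across millions of sensors in real time, which is where the investment figures below come in.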
Third party sources predict that global investment in the Industrial Internet of Things (IIoT) will reach $500 billion by 2020. Companies that introduce automation and more flexible production techniques to manufacturing can boost productivity by as much as 30 per cent. Plus predictive maintenance of assets can save companies up to 12 per cent over scheduled repairs, reduce overall maintenance costs by up to 30 per cent and eliminate breakdowns by 70 per cent. These automation technologies will be driven by advanced sensors, big data technologies and intelligent machine applications that will harvest contextual data, manage, analyse and serve it back to the user or device as relevant information, all in real time. Look at the proliferation of mobile devices using intelligent software, traditional computers, big data servers, and IoT devices – you then have a picture of the magnitude of contextual data available. While the IIoT is still in its infancy, its ability to evolve will depend on how well this explosion in data is integrated and served across an ecosystem of devices. Let History’s Mistakes be Today’s Lessons Each ‘industrial revolution’ to date has failed to lay the groundwork that will successfully solve the challenges of data and device interoperability – even as far back as railway gauges in the first industrial revolution. The upshot of this is that there is a real danger of the industry trying to run before it can walk. To avoid this, foundational integration challenges should first be addressed. Businesses still struggle to integrate their ERP with other core business applications due to the shift from batch to real-time integration so as to provide outbound data from the ERP to other systems. Many universal product codes that track trade items in stores are still not fully integrated with the customer journey data and profile, meaning that it’s harder to offer personalised offers based on preferences. Additionally, Electronic Data Interchange (EDI) – the abilities for companies to exchange documents electronically – has still not been resolved as message formats are still not standardised. With this all considered, the interoperability challenge still looms large. There is a lack of interoperability between devices and machines that use different protocols and have different architectures. There is a reason for this. The technology giants who drove web innovation, such as Google and Apple, want to keep control. This desire for control has led to other OEMs and partners following suit, creating their own standards for the development of applications based on proprietary operating systems or devices, which results in vendor lock-in. This represents a massive problem for ubiquity: a lack of common standards with popular devices and systems that do not share data with each other if they are not all connected within the same ecosystem. Technology giants need to find a way of cooperating that doesn’t threaten IP while also building a mutually beneficial open standard that encourages collaboration from a developer perspective. It’s the tech giants that have the power to ensure that the IIOT will still foster innovation and collaboration. Open Architectures and Web Languages: the Perfect Cocktail Open technologies in an open architecture will go some way to helping businesses learn and develop systems that integrate. 
These include new open technology frameworks like NativeScript and React Native that help developers to develop IIoT apps that will work across systems and have the ability to share data across them all. Businesses should encourage IIOT developers to use Web languages to scale applications across any device or platform. JavaScript, a popular Web language, is the only language that runs on every single platform. It is the only true "write once, run everywhere" language. Whether that's directly on a JavaScript engine, or as an abstraction to native languages, the important part is that the language that the actual business logic is coded in stays the same. [easy-tweet tweet="Continuing with a walled garden approach means we’ll never see the benefits of Industry 4.0" hashtags="Industry40"] Collaborate or Die If the tech industry continues with this walled garden approach, we’ll never see the benefits that Industry 4.0 promises. Open and secure technologies are crucial in providing the foundations to deliver higher productivity and cost savings that industry craves. It is a case of ‘he who collaborates wins’, and the ones who do today will be tomorrow’s winners. ### How IT teams can drive business growth and deliver a competitive edge With digital capability becoming one of the most important differentiators in modern business, the expectations placed on technology teams have grown significantly in recent years. IT departments are no longer a reactive troubleshooting service solving day-to-day problems. Nowadays IT teams are also expected to be a key proactive driver in developing innovative business strategies and helping firms gain a competitive edge by embracing the latest digital technologies. [easy-tweet tweet="IT departments are no longer a reactive troubleshooting service solving day-to-day problems" hashtags="Cloud"] So what is the current reality? To what extent are technology teams truly ready to meet these heightened expectations? Our recent Tomorrow’s Tech Teams research examined the structure and purpose of IT departments today, speaking to both IT leaders and workers. It found that both groups face huge challenges in delivering technologies like cloud services, mobile apps and turning data into actionable insights. Shockingly, it reveals that the IT leaders of the biggest UK businesses think their teams are not currently fit for purpose. In fact, on average, they are a staggering four years behind their most innovative competitors. Almost one in three IT leaders state that their teams need to be completely overhauled in order to drive digital transformation in their business. Weighing up the skills imbalance The survey supports our previous findings that one in four businesses are looking overseas to source talent in order to manage the well documented skills imbalance and shortage within the UK IT industry. Two thirds of IT leaders have said that their teams lack the expertise to push the company forward, and that they could increase overall productivity by 31 per cent if their teams had the right balance of IT skills, knowledge and experience. In comparison, the majority (71 per cent) of IT workers feel that their skills and knowledge are not being fully utilised by their organisations. They believe this is mainly due to a lack of investment (46 per cent) and up-to-date training (34 per cent). 
What is clear from the research is that IT departments are not currently in a fit state to change from being a largely reactive service to being at the heart of business strategy and innovation. So how can organisations overcome this challenge and achieve digital transformation? Review and restructure As a first step, businesses must analyse and restructure their IT teams in order to enable innovation. A thorough review not only looks at how existing capabilities meet current business requirements but also the ideal future skill set. This can be done through performance reviews and more informal weekly catch-ups discussing employees’ existing skills, challenges and focus areas. However, these conversations must align with wider company vision and strategy so that employees are not only aware of the organisation’s direction, but are in full support of it. This will provide businesses with the required insights to assess what areas they need to improve in order to achieve their goals. Additionally, by engaging employees and asking for their opinions on where they think the department needs to change and meet business demands, individuals will feel involved and responsible for their development.  It might be the case that the structure of the IT team and how it works with other areas of the business may also need to be revisited. Increasingly, digital transformation means that IT professionals need to work far more closely with Marketing, Sales and Operations to help deliver more efficient, cost-effective operations for these departments. Many areas of businesses are beginning to implement new tools and systems themselves, such as cloud technology to allow employees to access files anywhere. It is therefore important that IT departments are in control of new technologies being integrated into the business. This means that IT professionals must be able to understand the needs and demands of other parts of the business in order to add value beyond traditional technical support. Provide tailored training Training should be designed to support overall business goals and requirements while, at the same time, assisting the career aspirations and needs of the individual that were highlighted in the initial review process. These training programmes need to be bespoke not just in their content but also in how they are delivered to employees. Broadly, different employees will respond to different training stimuli and environments, so you need to ensure you are meeting their needs by providing a wide range of training exercises. For example, offering interactive sessions, online forums for discussing challenges and ideas, short videos or mentoring, not just traditional, classroom style training, can support a culture of creativity and knowledge sharing. In addition, providing an employee the opportunity to work on a project within a different area of the business will allow them to gain exposure to wider operations and enhance their skills at the same time. Build a culture of creativity Furthermore, businesses and IT leaders must encourage individuals to think creatively about long-term projects that will impact the bottom line, not just fixing short-term problems. Employees should question existing systems and proactively think about how they can be improved. Embracing new technologies will be a big part of this. 
Whether it’s designing and implementing the latest mobile app or delivering cloud services, it is imperative that a strong business case is developed to demonstrate how investment in a new service can boost turnover or help reduce costs. [easy-tweet tweet="Businesses and IT leaders must encourage individuals to think creatively about long-term projects" hashtags="Creativity"] By ensuring that the right structures and development programmes are in place, tech teams will become more productive, strategic and driven to achieve digital transformation that will deliver on business goals now and in the future. ### IBM and Cisco announce trailblazing collaboration IBM and Cisco have announced the next stage in the two companies long-term collaboration project - combining the former’s IoT Cognitive Computing and the latter’s Edge Analytics solution to deliver greater value in Hyper-Distributed environments. [easy-tweet tweet="The joint solution offered by IBM and Cisco will enable businesses to use all important data" hashtags="IoT, IBM"] Hyper-Distributed environments occur when vast quantities of data are being distributed outside of the data centre - making control a complicated issue. Connecting the wide variety of data sources together allows businesses to gain access to more detailed analytics and generate more relevant insights. The joint solution being offered by IBM and Cisco will enable businesses to use all the data that is important to them. The first of its kind approach enables IoT analytics at the edge of the network and collects data for longer term analysis in the cloud. Cisco brings edge and fog computing processing solutions, as well as edge performance analytics, to the partnership, while IBM can deliver cognitive & predictive analytics, machine learning and end-to-end security as part of its Watson IoT platform. Already there are a number of industry use cases that demonstrate the potential of the IBM-Cisco partnership. Electronics, Travel, Real Estate and a number of Construction sectors stand to benefit from better use of industry data by increasing productivity and reducing operation costs. Mike Flannagan, Vice President and General Manager for Cisco’s Data & Analytics Group explained what kind of benefits the partnership with IBM could deliver. [easy-tweet tweet="There are many industry use cases that could benefit from the IBM-Cisco partnership" hashtags="IBM, Watson, IoT"] “Now, industrial organisations and those in remote locations with intermittent network connectivity can take advantage of the cloud, cognitive computing and network intelligence, working together – analysing sensor reading at the point of collection, eliminating the need to transfer all, or unessential data to the cloud,” he said. “The combination of these technical capabilities provides the flexibility of processing and analysing data everywhere, at the edge and in the cloud, so it can be leveraged in time and context as the business needs to use it.” ### 4 common IT security challenges faced by small businesses “Small businesses are equally as vulnerable as large enterprises when it comes to data security,” warns King of Servers’ managing director Albie Attias. But what is it about small businesses that puts them at such a risk? Albie explains below. 
[easy-tweet tweet="Small businesses are equally as vulnerable as large enterprises when it comes to data #security"] Providing adequate security on a tight budget Small businesses often have limited budgets for expenses such as IT because it is hard for some business owners to visualise the financial return from their investment in this area. With EU law introducing fines for businesses that do not adequately protect their data however, it is becoming clear that bosses need to see the importance of investing in security. Albie Attias says: “For some small businesses, investing more heavily into security will mean cutting back IT expenditure in other areas. This puts a lot of pressure on the IT department, which, in smaller companies, may only consist of one or two people. Finding that budget is incredibly important for businesses however, as leaking data can incur fines, on top of a bad reputation.” Reducing the risk of human error Human error is one of the main contributing factors in data leaks, yet many small businesses offer little or no staff training regarding best practices when it comes to keeping data safe. Albie Attias explains why he believes this to be the case: “Small business owners may not feel they have adequate knowledge to personally deliver a training session on IT security, and hiring an external professional can be expensive.” Staff training does not have to break the bank however, as Albie suggests ways in which small businesses can do so on a shoestring budget: “If the business has an in-house IT department or person, they can help to create a handbook of rules or deliver a simple one-hour training session on best practices for employees to follow. If your IT department’s resources are already stretched, there are many resources online that advise staff members on common habits that can put data at risk. Simply run this by your IT department to confirm it is accurate and send round to your staff, explaining its importance.” Executing regular security audits “Security audits are underrated,” says Albie. Why? Well, security audits are integral to helping businesses establish security vulnerabilities and reduce threats. Without carrying out regular audits, businesses risk leaving gaping holes in their security and implementing an ineffective strategy. However, in small businesses, they are often overlooked or put on the back burner as they can be time and resource-consuming. Albie explains why they are in fact worth the resource and effort: “It is often the IT manager’s responsibility to protect small businesses from security threats, yet if they are not given the time, means and assistance to carry out an effective audit, they will struggle to use their budget effectively. This can lead to wasted expenditure as they implement a security strategy based on gut feeling and guesswork rather than issues highlighted through a controlled procedure.” As part of the King of Servers security awareness series, which is taking place on their website, we have written a guide for small businesses, on how to successfully conduct a security audit. This is worth referring to when you are ready to start conducting regular audits. Accepting that cyber criminals don’t just target large businesses Because the media only really reports on data leaks from big household names (ahem, Sony), small business owners make seek solace in the common misconception that it won’t happen to them. This is not the case, as smaller businesses often make easy pickings for hackers. 
Albie points out that not all data leaks are visible: “Just because a small business hasn’t physically ‘seen’ that it has lost data, it may be inclined to think its data is safe. This is not the case; credit card data, for instance, could be getting leaked over a long period of time before a small business picks up on it.” This brings us back to why staff education and security audits are so important – the first aspect of improving your business’s security is to accept that it is at risk. [easy-tweet tweet="Data protection is important for small businesses and there are still low-cost means of improving IT #security"] Data protection is incredibly important for small businesses, and although limited budget often creates barriers, there are still low-cost means of improving IT security. As more operations move online, bosses need to consider how they are storing their data to avoid leaking personal information. ### 5 value imperatives for measuring virtual TCO in 2016 How we measure the financial implications of investments into virtual infrastructure will dramatically change during 2016. Where previously you would only really measure the more traditional negatives, you can now fully understand where your server infrastructure is operating at its best. [easy-tweet tweet="How we measure the financial implications of investments into #virtual infrastructure will change during 2016"] Here are five questions that you should ask yourself when trying to determine the best virtual TCO model in 2016: Will the solution be suitable for your technology ethos? My ideal set-up is one where I can see critical components in real time. This means that I have complete visibility over my entire infrastructure. If this type of system is similar to yours, you should aim to cut your time-to-resolution by at least 70 per cent. Can you trust your data? A siloed system is incredibly reliable; you trust that it can perform in most circumstances. I know from personal experience that a legacy, siloed system has the capability to perform in any situation. These days systems and infrastructures are so dynamic that there are things constantly changing on the fly. You need to know that your system can cope with these rapid changes while still performing to a high level. What are the benefits of your investments in performance visibility? It’s imperative that your virtual infrastructure gives you an advantage over your competitors. However, you can spend as much as you like improving the infrastructure of the system, but if you don’t correctly sort the wheat from the chaff, you will find yourself running an inefficient system. You need to ensure that you identify any redundant apps as well as any applications which aren’t performing to their maximum potential. The benefits of an efficient system are obvious to any large-scale IT operation. Does your data analytics identify the correct problem? The relationship between identifying slow performance through data analytics and real-time machine-learned recommendations needs to be cohesive and rewarding. Some of the major benefits of strong analytics include increased revenue, productivity and access, as well as a decrease in overall man-hours. Can you properly identify every financial benefit? Properly identifying dollar benefits that aren’t just e-commerce QoS responsiveness or SaaS availability can sometimes seem like the Neverending Story - it can be difficult to do as well as being incredibly time-consuming. 
[easy-tweet tweet="It’s imperative that your virtual infrastructure gives you an advantage over your competitors" hashtags="Virtualisation"] Some of the key things to watch out for in order to avoid any revenue leakage are poor performance, customer experience, productivity enhancements, workload volume and easy to achieve SLA commitments. ### Do the metrics of a data centre impact upon its agility? You need to understand how your data centre is performing in order to make better decisions about how and where you invest in your network. There are various ways to measure performance, here are the top five. [easy-tweet tweet="You need to understand how your data centre is performing in order to make decisions about #network investment"] How can you merge heuristics with data centre metrics in a way that delivers the best use of space and cooling power? It used to be that you would only use heuristics to identify traffic patterns. However, you can now apply them to network security so that you can quickly identify and act on any potential hacks. In order to get better information, it’s best to cross reference NetFlow metrics with different types of KPIs from your more conventional silos. Doing this helps you identify any contention issues as well as laying the foundations for a more intelligent investment plan. This increases productivity and makes your system far more efficient. How can heuristic-based data centre metrics help our operations? As the modern data centre has become more and more complex, (think conversational flows, replatforming, security, mobility, cloud compatibility) heuristics has become more and more important. This technology gives us the capability to perform back of envelope calculations as well as taking the risk out of human intervention. The end product is ideal, a machine-learned knowledge base. Is it possible to properly model costs as well as operational metrics? When it comes to managing and upgrading our system, most of us have to make do with a fixed budget. This means that we are susceptible to ‘over-providing’ hardware and infrastructure. The main cause of this is that we can’t properly see the complexities that come as part and parcel of a contemporary data centre. What you need is an effective infrastructure performance management tool. This will help you properly calculate your capacity and make a better informed investment decision which means that you won’t overspend in a bid to prevent overloading that you can’t even see. Can financial metrics-based modelling benefit data centres? Data centre managers can deliver core IT operational metrics to business-side managers in a language that makes sense by continuously illustrating a realistic view into available capacity against overall efficiency, acceptable performance thresholds between running hot and wasted “headroom,” and a greater degree of granularity in terms of ROI benefits to virtualisation over maintaining legacy infrastructures. Using financial metrics, data centre managers can deliver concise, easy to read metrics to people outside of the IT industry. This includes people such as business-side managers and other stakeholders. This is achieved through simply showing available capacity against overall efficiency as well as performance thresholds. Showing return on investment makes it easier to communicate your good performance to peers. 
You will find below some of the best metrics with which to demonstrate ROI analysis:
TB Storage Reduction: Variable Cost / GB / Month ($)
Server Reductions: Annual OPEX / Server ($)
VM Reductions: Variable Cost / VM / Month ($)
Applications Reduction / Year ($K)
Database Reductions: Variable Cost / Database / Year ($)
Consulting / Contractor Reduction ($K)
Revenue Improvement / Year ($K)
Blended/Gross Margin (%)
Facilities & Power / Year ($K)
Ancillary OPEX Reductions / Year ($K)
Is it possible for data centre managers to provide a holistic operational metrics solution? One of the best ways to visualise performance is through high-fidelity, streaming, multi-threaded dashboards. These provide easy-to-understand data composed of data points and their key interdependencies, which include endpoint devices, physical infrastructure and virtualised applications. [easy-tweet tweet="The best way to minimise the impact of a service outage is to automate your system" hashtags="Automation"] The best way to ensure that you minimise the negative impacts of a service outage is to automate your system. We would recommend using a platform like ServiceNow. This helps increase agility and responsiveness. However, none of this is possible without good data and visualisation. In order to predict the future, you need to understand what’s happening in the now. ### What is Network Function Virtualisation? You will find yourself hearing a lot about Network Function Virtualisation (NFV) over the next couple of years, so here’s a little guide to make sure that you understand what it is and how it can help you manage your system or your business. [easy-tweet tweet="#NFV is a new process for designing, deploying or managing a #network service" hashtags="virtualisation"] Essentially, NFV is a new process for designing, deploying or managing a network service. With NFV, you can decouple the network's key functions (firewalling, NAT, DNS, intrusion detection and caching) from your proprietary hardware so that you can then run these actions through virtualised software. NFV has been designed to combine all of your standard networking components in a way that makes it easier for you to manage and install system modifications without having to physically go in there and change them yourself. How does it benefit my system? As we all know, operating and installing equipment in a system can be an incredibly complex operation. There is a huge amount of equipment with lots of different needs and requirements. NFV simplifies this operation and makes it easier to manage your system. The key benefit of NFV is that it gives system managers the potential to quickly scale their systems up or down in response to whatever the system needs at that exact moment. Being flexible and having the capability to respond to any ad-hoc requests is an absolute must for any person working in the modern systems industry. Another benefit is that you will also no longer need to spend capital on new hardware just so that you can build a service chain. This is possible because server capacity can now be increased through software, rather than building an over-provisioned system. Is it compatible? Installing NFV shouldn’t be seen as replacing your system. In fact, NFV will complement your existing system and should work as part of a more holistic system management operation. The business benefits of NFV become quickly apparent: you can install or deploy services in a matter of seconds - something that used to take a few weeks to complete now takes moments. 
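To illustrate the service-chaining idea in the simplest possible terms, the toy sketch below treats network functions as plain software applied to a packet in sequence, so that re-ordering or extending the chain is a software change rather than a hardware install. The packet format, rules and function names are invented for the example and bear no relation to a real NFV platform.

```python
# Toy illustration of NFV-style service chaining: network functions as
# software applied in sequence. Rules and packet format are invented.
def firewall(packet):
    return None if packet["dst_port"] in {23, 445} else packet   # drop blocked ports

def nat(packet):
    packet["src_ip"] = "203.0.113.10"                             # rewrite to a public address
    return packet

def service_chain(packet, chain):
    for function in chain:
        packet = function(packet)
        if packet is None:
            return None                                           # dropped along the chain
    return packet

chain = [firewall, nat]   # re-ordering or extending this list is a software change
print(service_chain({"src_ip": "10.0.0.5", "dst_port": 443}, chain))
print(service_chain({"src_ip": "10.0.0.5", "dst_port": 23}, chain))   # dropped by the firewall
```

In a real deployment each function would run as a virtual machine or container under an orchestrator, but the principle is the same: the chain is defined and changed in software, in moments.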
Time is a luxury in systems management, so anything that gives you more of it has to be seen as a bonus. The more quantifiable benefits of NFV include reduced capital and operating expenses, as the systems now require less space and power. You can also reduce the need for unnecessary infrastructure and wasteful over-provisioning. [easy-tweet tweet="The idea that network function virtualisation isn’t compatible with #SDN simply doesn’t hold true" hashtags="NFV"] The idea that NFV isn’t compatible with SDN simply doesn’t hold true; you can do SDN with or without NFV. The technology merely complements your existing product strategy by making it more efficient. This is achieved through reducing capital costs and increasing productivity. ### The smart home appliances that are available now According to many industry analysts, we’ll soon all be living in smart homes, with Internet-connected devices providing greater functionality than ever before. Below we take a look at some of the appliances that early adopters are already putting to use. [easy-tweet tweet="Many of the electronic devices that we use in the kitchen are prime candidates for #IoT technology" hashtags="SmartHome"] In the kitchen Many of the electronic devices that we use in the kitchen are prime candidates for IoT technology. Smart fridges are already available that let you monitor your food via an embedded camera and alert you when anything is starting to go bad. Samsung’s Smart Fridge also comes with built-in speakers, a screen displaying news and weather, and offers seasonal recipe suggestions. There are also plenty of other ways that businesses are updating our kitchen tools. Modern dishwashers utilise cameras to monitor areas that are already clean, helping to reduce water usage. There are also pans with temperature sensors, chopping boards with built-in scales and app-connected ovens. In the bathroom There’s plenty of innovation in the bathroom too, whether your priorities lie with singing practice or saving water. Wireless speakers are now available as built-in parts of the shower, while the Waterpebble monitors exactly how much water you’ve used and lets you know when your water usage is getting excessive. For those of you who want your smart home devices to provide a little more comfort, there are applications that allow you to store personal settings for your shower and bath and implement them from anywhere on the home network. There’s now no need to step into a freezing cold shower or scalding hot bath. The rest of the home Gadget fans are already spoilt for choice when it comes to smart home devices. Mobile-controlled vacuum cleaners, IoT light bulbs and garden sensors that monitor your soil conditions all give you greater control than ever before, all from the comfort of your smartphone. The smart home hub The real prize for huge tech firms like Google and Apple is developing a smart home hub. Sales of connected fridges and vacuum cleaners may well make them some money, but the entire home IoT ecosystem is likely to rely on a central hub - ensuring that this is where the real revenue will come from. [easy-tweet tweet="Integration will be a key concern as the smart home ecosystem develops" hashtags="IoT, SmartHome"] Integration will be a key concern as the smart home ecosystem develops, which is why winning the race to become the number one smart home hub is so important.
Will the homes of the future be running on a variation of Android, iOS, or something completely different? Time will tell, but in the meantime there are already plenty of smart home devices to tinker around with. ### How digital marketing agencies will change over the next few years Digital marketing agencies have been chopping and changing since they opened shop, and it’s becoming harder and harder to predict where this massively mobile industry will go next. There are so many different types of agencies now (full service, integrated, SEO) that putting your finger on what a digital marketing agency does can be quite difficult. In this guide we take a look at how and why things will change. [easy-tweet tweet="It’s becoming harder to predict where the digital #marketing industry will go next"] 1. Integrated agencies will become the norm One of the less celebrated aspects of the rise of social media and social media marketing is the fact that it has combined the two previously incompatible worlds of PR and digital. More user-generated content means that you can bypass the old world of meeting and buttering up journalists. You can now send your own positive messages through social media accounts, which ultimately means that a digital agency can no longer focus solely on digital, nor a PR agency solely on PR. PR agencies need to understand every part of the digital puzzle if they are to compete with their digital counterparts. One thing you might notice is that there could even be a small dip in the number of PR agencies operating by the end of the decade. 2. Increased mobile marketing Mobile was king last year, with mobile searches overtaking desktop for the first time ever. It had been the channel of choice for agencies dealing with younger brands for quite some time, but it was in 2015 that it really took off as the best way of marketing almost anything. This trend is set to continue over the next few years and it won’t be long before every agency has a mobile division. Understanding how to deliver a wide range of content through this ever-growing platform will soon be more than a luxury; it will be essential. 3. Analytics will become more commonplace The concept of coverage has broadened considerably. Long gone are the days when you would simply skim through the day’s papers in the hope of finding something relevant to the brand that you were looking after. Coverage now represents many different things – it could be social buzz, increased traffic or related user-generated content. The way that these things are tracked will become more and more in-depth. Proving the value of what you are doing can only be done through demonstrating coverage, and the better you are at analysing and presenting it, the happier your clients will be. 4. Content will become even more regal It seems like you can’t go a day without someone saying that ‘content is king’. You hear it everywhere from huge conferences to client meetings. But just because it’s a cliché doesn’t mean it isn’t true. Generating content is becoming key to every marketing plan. Expect to see agencies adopt larger video departments, sign more freelance writers and employ more bloggers. Content departments will become more specialised, and it’s definitely one of the most exciting aspects of working in a digital marketing agency. It will be fascinating to see how this develops over the next few years.
[easy-tweet tweet="Digital marketing is an incredibly exciting place to be with new technologies creating constant change"] Digital marketing is an incredibly exciting place to be right now with new technologies and ideas all contributing to an ever changing state of flux. Marketing strategies have moved away from the ‘hypodermic needle’ model to an ever–growing world of user-generated messages and stories. How agencies react to this will be key. ### This Day in Tech: The world’s first automobile accident As Henry Wells drove along upper Broadway in New York City on the 30 May 1896, little did he know he was about to make history. Evylyn Thomas, equally unaware of her role in the fateful event, was blissfully riding her bicycle along the same street. That is until the two collided, and the world's first recorded automobile accident took place. [easy-tweet tweet="120 years ago today, the world's first recorded automobile accident took place" hashtags="ThisDayInTech"] The New York Daily Tribune report of the accident is matter-of-fact, unaware that variations of this event would become an all too common occurrence in the years to follow. “The wagon [automobile] operated by Henry Wells, of Springfield, Mass., wobbled furiously, going in a zig-zag fashion, until it seemed that the driver had lost control of it. Evylyn Thomas, of No. 459 West Ninetieth-st., was approaching on her bicycle, when suddenly the wheel and horseless carriage met, and there was a crash,” the article reads. “A crowd gathered, and the woman was picked up unconscious, her leg fractured. An ambulance took her to the Manhattan Hospital, where last night it was reported that she would recover soon. Wells was taken to the West One-hundred-and-twenty-fifth-st. station, and held pending the result of the injuries to Miss Thomas. The wagon went on in charge of another operator.” The development of the automotive industry is of course a fantastic technological achievement, but it has also introduced a potentially deadly hazard into our daily lives. In the year leading up to September 2015, there were 188,830 reported road casualties in the UK alone, with 1,780 fatalities. However, are modern advances bringing us closer to the day where we can consign road accidents to history? Self-driving cars Much has been said about the potential impact of self-driving cars. Elon Musk, Tesla’s chief executive, has stated that the rise of autonomous vehicles could see humans banned from being behind the wheel, claiming that it is too dangerous to have “a person driving a two-tonne death machine.” For now, however, much progress still needs to made if self-driving cars are to prove themselves as a safer alternative to human-driven vehicles. A number of collisions have taken place during off-road tests - normally attributed to human error - but back in March Google admitted that its self-driving car was to blame for crashing into a public bus in California. Teething problems, however, are to be expected and the growth of the autonomous vehicle market will have to struggle against regulatory issues and public inertia as much as it will its own safety record. Despite these early issues, the future is promising for the automotive industry. Imagine a self-driving car that automatically reduces its speed to keep the correct separation distance from the car in front, or a vehicle that could send data to other road users to inform them of upcoming hazards. 
[easy-tweet tweet="Imagine a self-driving car that automatically reduces its speed to avoid hazards" hashtags="Autonomous, SelfDriving"] In the 120 years that have passed since that first recorded accident, motor vehicles have embraced a number of safety features, from crumple zones to diagnostic software. As technology continues to progress, are we approaching the day when we read about the world's last automobile accident? ### How the cloud lets us share our life-changing experiences It was not that long ago that intrepid adventurers took off into the unknown, well aware that the next time they made contact with their loved ones would be upon their return. The isolation and anxiety that followed was an accepted part of the journey. Communication has not always been as pervasive as it is today. [easy-tweet tweet="#Cloud computing is enabling us to share our experiences more freely than ever before" user="Dropbox"] Cloud computing, however, is enabling us to share our experiences more freely than ever before. Digital technologies can now overcome even the harshest of conditions and modern-day daredevils are using them to ensure that their personal stories have the broadest possible impact. Tempest Two Earlier this year, James Whittle and Tom Caulfield completed what is likely the be the biggest challenge of their lives – rowing unaided from the Canary Islands to Barbados. Recognised by many as the toughest voyage in the world, the 3000 mile cross has been completed by fewer than 500 individuals, a smaller number than those that have conquered Everest or the North Pole. If you are able to get over the intimidating nature of their task, however, there is plenty of beauty to be found in the wide expanse of the Atlantic Ocean. Alongside whales, sharks and towering waves, the journey also had something else worth sharing. “Many of you reading this will undoubtedly be surrounded by the hustle and bustle of modern day life,” wrote the pair on the completion of their arduous journey. “London and other cities offer a huge amount in terms of stimulation, but one thing it lacks, is simplicity. This simplicity gives you time to reflect, and puts things into perspective like never before.” Of course, sharing this experience when floating in the middle of the Atlantic Ocean is easier said than done. For Whittle and Caulfield, cloud computing provided them with the means of sharing their once in a lifetime experience. With just a laptop, a few cameras, a satellite phone and a Dropbox Business account, they were able to regularly update their family, friends and followers on their progress. As well as sharing some breathtaking images of their surroundings, the cloud also gave followers an insight into the two travellers’ states of mind. Over the course of the two month journey, they shared blog posts, images and videos that revealed their struggles and successes. As well as letting us know about the physical toll that the journey was having on their bodies, we also found out how the pair kept spirits up and broke up the monotony of rowing day-in, day-out. [easy-tweet tweet="Via the #cloud, the Tempest Two team were able to share their personal adventure with many others" user="Dropbox"] Spending 54 days on a boat no bigger than a minibus with just your rowing partner for company could certainly be an isolating experience. Through cloud computing, however, the Tempest Two team were able to share their personal adventure with many others all over the world. 
### Grow your ecommerce business with cloud ERP Whether you are a business owner, an entrepreneur or an eBayer, your main goal is to make sales. [easy-tweet tweet="Cloud business management solutions simplify the background processes that are often neglected" hashtags="ERP"] This infographic explains how cloud business management solutions (ERP) simplify the background processes that are often neglected. Worried about stock control, order processing, accounting, payment and courier integration? Would you love to see these worries become easy and functional? The infographic below shows you how simply cloud ERP helps growing e-commerce businesses. And if you decide to go abroad for a few weeks, or away on holiday, you can access all the features you need to run your business from anywhere, from any device, be it tablet, desktop or mobile. This guide lists the benefits of cloud ERP solutions, which should help you to decide which approach is best for your e-commerce business. [easy-tweet tweet="#Cloud ERP can provide clear automation, with no added stress and many more happy customers"] There is an easier way to manage your business. If you are after clear automation, parcels despatched in time, with no added stress and ultimately more happy customers, cloud ERP could be the solution. ### Troubleshooting the multi-cloud Tracing an application performance issue to its source across today’s private, public and multi-cloud deployments can be a time-consuming challenge. Application Delivery Controllers (ADCs) can help scale and maintain application performance and access at the server and application level. [easy-tweet tweet="Tracing an application performance issue to its source can be a time-consuming challenge" hashtags="Cloud"] However, with applications and components spread across numerous networks, cloud services and geographic locations, network congestion may be just as responsible for application performance issues as the server. Packet capture and netflow tools can provide some diagnostic help and relief, but these solutions require a fair amount of time, expertise and patience. In a hybrid and multi-cloud environment, tracing root cause often requires consulting multiple on-premise, network and cloud based tools and intelligence. In the meantime, you’re not only stressing IT resources, but downtime and performance may be impacting business productivity and revenue. There is another way however. Some of today’s emerging application delivery controllers provide not only a single centralised ADC management tool across multiple cloud and data centre environments but also awareness of the SDN OpenFlow protocol and the network visibility and automation that it provides. By integrating with standard SDN controllers, ADC management tools can enhance application performance not only at the server level but the network level as well. The diagram above shows how an SDN-aware application delivery controller architecture might resolve an application performance issue by detecting network congestion. In this example, the SDN aware ADC tool has detected traffic congestion between VBR 2 and 3 on port 2-3 to 3-2 on the path to the servers it manages. Once congestion is discovered, the tool instructs the SDN controller to redirect traffic automatically via another less congested route to by-pass the bottleneck and improve application performance. 
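To make the sequence just described a little more concrete, here is a hedged Python sketch of how an SDN-aware ADC manager might behave: poll link utilisation, spot congestion on the path to its servers, and ask the controller to steer traffic around the bottleneck. The controller endpoint, link names and thresholds are purely illustrative assumptions, not a specific product’s API.

```python
# Illustrative sketch only: detect congestion from link statistics and ask an
# SDN controller (hypothetical REST endpoint) to reroute around the bottleneck.

import json
import urllib.request

CONTROLLER = "http://sdn-controller.example.local:8181"   # assumed endpoint
CONGESTION_THRESHOLD = 0.85                                # 85% utilisation

link_stats = {                        # would normally come from the controller's stats API
    "vbr2:2-3 -> vbr3:3-2": 0.93,     # the congested path described above
    "vbr2:2-4 -> vbr4:4-2": 0.41,     # a quieter alternative
}

def reroute(avoid: str, prefer: str) -> None:
    """Ask the controller to install flows that bypass the congested link."""
    payload = json.dumps({"avoid": avoid, "prefer": prefer}).encode()
    request = urllib.request.Request(f"{CONTROLLER}/reroute", data=payload,
                                     headers={"Content-Type": "application/json"})
    urllib.request.urlopen(request)   # fire-and-forget for the purposes of the sketch

alternative = "vbr2:2-4 -> vbr4:4-2"
for link, utilisation in link_stats.items():
    if utilisation >= CONGESTION_THRESHOLD and link != alternative:
        print(f"congestion on {link} ({utilisation:.0%}), rerouting via {alternative}")
        reroute(link, alternative)
```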
[easy-tweet tweet="By integrating with standard #SDN controllers, ADC management tools can enhance application performance" hashtags="Cloud"] IT involvement is little to none and network related application issues are addressed and resolved in seconds or minutes, before they have an impact on the business. It’s a surprisingly simple and quick solution to a seemingly complex issue.  ### Embrace the cloud, but don’t forget your umbrella Software as a Service, or SaaS, is being embraced by businesses across all industries and from all corners of the world. However, organisations must be aware of the risks of moving to the cloud, as well as the opportunities.  [easy-tweet tweet="There are many benefits of Software as a Service such as elasticity, expertise and usage-linked costs" hashtags="SaaS"] Advantages of SaaS There are many benefits of Software as a Service (SaaS) such as elasticity, expertise and usage-linked costs, all of which are becoming increasingly well-known in business. Gone are the days when the words ‘cloud computing’ would be followed by radio silence and confusion in the boardroom. Instead, business owners are now looking to actively embrace cloud and explore all it has to offer. The SaaS model can be adapted to benefit any industry. It offers a flexible, accessible and scalable way of working, enabling companies to significantly improve efficiency and productivity. SaaS applications can be particularly valuable when it comes to HR management, customer services, procurement and facilitating internal collaboration by enabling employees to update and share documents in real-time. But the benefits of SaaS extend further from the use of the applications. The rapid rate at which it can be implemented means that it causes little disruption – a business could be up and running with its SaaS service within a few hours, with employees trained to use it the same day. It also requires no man-power to install and no additional hardware investment, meaning that it helps to keep costs down. Risks to overcome  Although there are many obvious benefits to embracing SaaS, these can become insignificant if a business fails to acknowledge the risks associated with trusting a third-party cloud provider with its business-critical applications and data. These risks come in a number of different forms ranging from a data breach, where data in the cloud is accessed by cyber criminals and may be stolen or altered in some way, to the sudden loss of critical applications if something goes wrong at the SaaS provider’s end. If ignored, these risks can quickly become very real and can even lead to the worst case scenario for any business – being unable to perform vital day-to-day tasks leading to long-term disruption and the reputational damage that can be experienced as a result. The severity of these threats highlights the importance of good cloud practices – but given the responsibility for mitigating the risks lies predominately with the provider, it can sometimes be difficult for businesses to know what to do. Creating a safety net Since SaaS providers are subject to the same stresses and strains as any organisation, you need to be confident that the critical applications and software you are trusting them with will remain accessible if they run into any technical or financial difficulties. Therefore, in order to be safe in the cloud, it’s important to carefully scrutinise any SaaS package, including contingency plans, during initial conversations with cloud providers. 
Although this may sound overly cautious to some, it can be the difference between efficiency savings and a disaster. Any good provider should offer services to protect their business users from commercial failure, and I would advise all businesses to demand a contingency plan as part of any SaaS package – and to make sure that the contingency plan is robust and dependable if it is ever called upon. This is standard practice amongst reputable providers but should always be reviewed and double-checked, as it is an integral part of business continuity planning. [easy-tweet tweet="By putting contingency measures in place, companies of all sizes are able to embrace what #SaaS has to offer"] By taking the time in the first instance to ensure these contingency measures are in place, companies of all sizes are able to fully embrace what SaaS has to offer, while being safe in the knowledge that the applications and data they need on a daily basis will remain accessible and functional, no matter what. ### Legacy IT Challenges - #CloudTalks with Expedia's Elizabeth Eastaugh In this episode of #CloudTalks, Compare the Cloud speaks with Elizabeth Eastaugh from Expedia about how the business is a technology company first, and a travel company second. #CloudTalks Although many people think of Expedia as a travel company, internally the organisation sees itself as more of a technology business. In particular, Expedia has had to overcome a number of challenges relating to legacy IT as one of the forefathers of the online travel industry. ### Cloud computing and the importance of buying British The significance placed on geographical boundaries has been steadily eroded by the rise of the Internet, and more recently, cloud computing. Being able to access customers and services from all over the world certainly has its advantages, but it also poses certain problems. Even with the rise of globalisation, national borders are not quite meaningless, not yet anyway. The location of your cloud services remains highly important, which is why UK firms should remember to “buy British” whenever possible. [easy-tweet tweet="There are advantages to going with a 100 per cent UK #cloud vendor – hosted in the UK, with UK support" user="zsahLTD"] Under the open market, businesses may be tempted to choose the cheapest cloud service provider, regardless of location. However, there are a number of advantages to going with a 100 per cent UK company – that is, a cloud provider hosted in the UK, with UK support. Firstly, businesses may benefit from more reliable security by going with a UK cloud provider. An international vendor, for example, could leave your data subject to regulations that you, and your customers, may not be comfortable with. There have been a number of instances of foreign governments making requests to view data stored with cloud providers, most notably in the case of Edward Snowden’s NSA revelations. When businesses store their data with a cloud supplier located outside the reach of UK law, they have far less control over its security. Ongoing court proceedings also suggest that even data stored on European servers may not be safe if the company holding it is a US subsidiary. For example, the US government has requested access to data being held by Microsoft’s Irish branch – a request currently being contested by the US firm. The final outcome will no doubt be of interest to companies all over the world, but there is a simple way of avoiding the pressures of governments based in the US and elsewhere.
By using a UK cloud provider, like zsah, businesses can be assured that their data is protected against external interference. Similarly, the recent Safe Harbour fallout continues to leave data privacy in a state of limbo when it comes to using US-based cloud providers. The EU-US Privacy Shield is on the way, but has not yet been finalised. Under this cloud of uncertainty, British businesses can gain peace of mind by going with an EU, or better yet, UK-based provider. However, opting for a British cloud vendor is about much more than just security. At zsah, we recognise that a reliable cloud service provider must be able to deliver first-rate support at all times. As such, we constantly monitor your network to assess issues and address any problems before the user is affected. Our team is also contactable at all times should you experience problems with your cloud solution. Based in the UK, we have a robust understanding of the local cloud market and the needs of our clients. [easy-tweet tweet="UK #cloud vendors have a robust understanding of the local market and client needs" user="zsahLTD"] Zsah is a proud member of the UK cloud vendor community. Boasting some of the country’s finest technical and support staff, we are able to offer customers all over the world a range of critical solutions, from Infrastructure as a Service to application hosting. Cloud computing may have changed many things about the way UK businesses operate, but the importance of “buying British” remains as great as ever. ### The 4 steps towards transforming your business with copy data virtualisation As terabytes turn into petabytes, the surge in data quantities has sent costs spiralling. At this current rate of production, by the end of this year the world will be producing more digital information than it can easily store. [easy-tweet tweet="Copy #data virtualisation can free an organisation’s data from its legacy physical infrastructure"] This deluge of data caused by increased connectivity poses a serious challenge. The sheer quantity of data being created is causing the complexity and cost of data management to skyrocket. By 2020, IDC predicts that 1.7 megabytes of new information will be created every second for every human being on the planet. Trying to make sense of this data is going to be a huge challenge for all organisations. So, why does this problem exist? It’s down to the proliferation of data copies - multiple copies of the same thing or outdated versions. Consumers make many copies of data: on backup drives, multiple devices and cloud storage. Businesses are worse because of the need to maintain copies for application development, regulatory compliance, business analytics and disaster protection. IDC estimates that 60 per cent of what is stored in data centres is actually copy data, costing companies worldwide as much as $44 billion to manage. The increase in copy data also poses a significant security threat. A recent IDC study found that the typical organisation holds as many as 375 data copies. Each added copy increases an organisation’s “area of attack” and gives hackers looking to get at important information more material to work with. Right, so what is the way forward? You’ve heard of server virtualisation. You’ve heard of network virtualisation. But have you considered virtualising your data yet?
Copy data virtualisation – the process of freeing an organisation’s data from its legacy physical infrastructure – is increasingly what forward-thinking companies are doing to tackle this problem. By eliminating copy data and creating a single ‘golden master’, virtual copies of ‘production quality’ data become available immediately to everyone in the organisation who needs them. For IT managers, copy data virtualisation could be the way to address the ever-increasing rise in data. Like any significant overhaul or change, implementing it across an organisation requires planning and strategic thinking. I believe there are four key steps for businesses to follow to maximise the copy data virtualisation opportunity. Here they are: Your choice of platform Every organisation faces its own set of challenges, and this is what will shape the choice of platform. Whilst this choice has to suit the needs of each individual business, there is a set of criteria commonly used. A typical enterprise will have workloads across a number of different systems – virtual machines on VMware and physical machines on Windows, for example. The platform of choice has to be able to support all of these systems, as well as databases and applications. This is all a must for copy data virtualisation to take effect. There are more considerations too. The platform should be infrastructure-independent to allow for different use cases. A single point of control will make it simple to manage, and a platform with support for hybrid cloud offers the ability to take different applications into alternative data centres. The initial use case IT departments adopt a number of overlapping technologies, such as software for backups, snapshots, disaster recovery, and more. Data virtualisation removes the need for all of these redundant technologies by creating virtual, on-demand data copies that suit a number of these use cases with one platform. However, this doesn’t happen instantly, and nor should it if you want your move to copy data virtualisation to be successful. Choose one use case initially and roll it out first. In doing this you’ll be able to iron out any issues that come up before a wider roll-out. What exactly do you need to consider? So, your platform has been chosen and an initial use case identified – what next? Now it’s time to understand your specific needs so you can design the underlying infrastructure to support them accordingly. You need to be asking some important questions:
- What rate is the production data changing at?
- What is the retention time required for virtualised backup?
- How many virtual data copies do you need at any one time?
- What testing will be done with that data (performance, functionality, scaling etc.)?
- How much bandwidth do you need (especially important if you’re working with multiple data centres across different locations)?
- How is your data being replicated and encrypted?

It’s pivotal that you’re able to answer all of these questions before you start investing in infrastructure – it will save you a lot of time and money, and a back-of-envelope sizing exercise like the sketch below is a useful place to start. Hybrid cloud Lots of organisations have begun harnessing both private and public cloud offerings to create a hybrid cloud infrastructure. These hybrid clouds adopt the control and security of a private cloud, along with the flexibility and low cost of public cloud offerings. Working together, they can give organisations a powerful solution to meet the increased demands on IT from the rest of the organisation.
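Picking up the sizing exercise referenced above, the back-of-envelope Python sketch below shows how answers to those questions translate into infrastructure numbers: one ‘golden master’ plus retained changes is compared with keeping full physical copies. Every figure here (change rate, retention, copy count) is an illustrative assumption.

```python
# Back-of-envelope sizing sketch, all figures illustrative: one golden master
# plus retained changed blocks versus multiple full physical copies.

PRODUCTION_TB = 100          # size of the production data set
DAILY_CHANGE_RATE = 0.03     # assumed: 3% of the data changes per day
RETENTION_DAYS = 30          # retention required for virtualised backup
PHYSICAL_COPIES = 8          # backup, DR, test, dev, analytics, compliance...

# Traditional approach: every use case gets its own full physical copy.
physical_footprint_tb = PRODUCTION_TB * PHYSICAL_COPIES

# Copy data virtualisation: one golden master plus incremental change history;
# virtual copies are presented on demand from that single store.
virtual_footprint_tb = PRODUCTION_TB * (1 + DAILY_CHANGE_RATE * RETENTION_DAYS)

print(f"Full physical copies:    {physical_footprint_tb:.0f} TB")
print(f"Golden master + changes: {virtual_footprint_tb:.0f} TB")
print(f"Estimated reduction:     {1 - virtual_footprint_tb / physical_footprint_tb:.0%}")
```

With a rough estimate like that in hand, the hybrid cloud option introduced above becomes much easier to weigh up.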
One of the main benefits of this hybrid cloud approach is enhanced agility – using public cloud means you can experience fewer outages and less downtime. Using a private cloud is good for the testing and development of new applications before deciding on where you’d like to host applications permanently. A hybrid approach also allows you to use multi-purpose infrastructure – for data recovery and test and development simultaneously, for example – helping to cut down on costs and complexity. By implementing copy data virtualisation and reducing physical copies of data sooner rather than later, organisations will have to spend less on storage and can get to the most important stage of data management quicker – analysis. The results are wide-ranging, from less data moving across networks, less stored data, significantly reduced storage costs, and the total removal of costly operational complexity. [easy-tweet tweet="Copy #data virtualisation and reducing physical copies of data, means organisations spend less on storage"] We like to view data virtualisation as a way to give an organisation virtual sanity. ### Modern finance, continuous accounting and the future of finance The finance department is changing. Companies no longer rely on manual, spreadsheet-based practices when it comes to their internal finance processes. According to research carried out by BlackLine in 2015, almost 80 per cent of UK financial decision-makers acknowledged that using cloud-based applications to automate and control key financial processes would have huge benefits. [easy-tweet tweet="Modern Finance refers to a form of process automation designed specifically for finance and accounting" hashtags="FinTech"] Since then, we’ve seen more and more large and mid-sized corporates adopting what we call a ‘Modern Finance approach’ – using process automation to ensure their internal financial processes are completed on time, accurately, and comply with regulation. This approach not only makes it easier to spot potential errors and late reconciliations, it saves huge amounts of time for accountants at month-end, ensuring happier, less stressed staff and, ultimately, fewer errors. But what’s the next step for businesses embracing automation? Modern Finance today – the future of finance is now For those not familiar with the term Modern Finance, it refers to a form of process automation designed specifically for the finance and accounting function. BlackLine was actually the first company to develop a software-as-a-service (SaaS) offering to respond to this need. Software like BlackLine’s original account reconciliation solution enables accountants to upload numbers to a secure, automated system that makes it simpler to complete reconciliations each month. Hours of time wasted manually inputting data are saved, costs are reduced through the elimination of paper-based files (auditors can log into a BlackLine portal to review past financial statements, rather than rifling through ancient box files full of Excel documents) and accountants are able to spend their time on other tasks, such as supporting the CFO on strategic business decisions. Recent research commissioned by BlackLine found that over 40 per cent of UK financial decision-makers (FDMs) don’t have total faith in the accuracy of their data. This is up from 27 per cent in 2014-15, which is a concern. Accounting teams are up against more and more restricting regulatory changes, plus the ever-present threat of cyber-attacks. 
With threats like these hanging over them, it’s not surprising that many CFOs and other FDMs feel they can’t completely put their faith in company financial data. However, it also uncovers a genuine issue – data inaccuracy. Whether accidental or by deliberate action, an error on a spreadsheet can cause carnage for a business. Thanks to Modern Finance applications, however, it becomes much easier to spot an inconsistency. CFOs and Financial Controllers no longer have to trawl through Excel documents to locate a potential error – it is easily highlighted on their dashboards. Adopting a Modern Finance approach makes staff happier, too. BlackLine’s recent research found that a worrying proportion of accountants are looking to leave the profession – because they feel they aren’t being given the right tools for the job. Businesses neglecting to explore Modern Finance practices are putting their entire departments at risk by allowing accountants to become stressed and overworked – not to mention missing out on the benefits of automation. Whilst we’ve seen a significant increase in companies embracing Modern Finance, there is still a great deal to be done in terms of educating accountants and their managers about the benefits. Continuous Accounting: The next step Despite the struggle still faced by many accountants, the Modern Finance movement is picking up speed. BlackLine now services over 1,300 large and mid-sized businesses worldwide, a number which continues to increase as well-known brands such as British Gas, Western Union, and LV embrace a Modern Finance approach. But what’s the next step in achieving maximum efficiency in the financial close? The answer is: Continuous Accounting. We’re all aware of the concept of Continuous Development – the idea that software can be developed, tested, deployed and upgraded continuously. BlackLine has applied this concept to finance and accounting, in conjunction with the launch, late last year, of a new suite of applications designed to take Modern Finance as we know it to the next level. Continuous Accounting dispenses with the traditional ‘record-to-report’ process accountants use, which follows a rigid, step-by-step process, replacing it with a more ‘continuous’ process. Fig.1: a traditional record-to-report process Fig.2: a Continuous Accounting model Continuous Accounting enables team members to smooth out the workload peaks by automating certain activities earlier in the period, for example zero balance account reconciliations. This means that there is no spike in activity at month end, where accounting departments rush to close off all reconciliations. Workflow is dramatically improved, meaning even more time is saved. [easy-tweet tweet="Continuous Accounting enables team members to smooth out the workload peaks by automating certain activities"] Will Continuous Accounting catch on? Of course, our view is that it will. In an ever-changing business environment, security is under greater threat than ever, from both internal and external sources, both malicious and unintentional. This, coupled with adherence to increasingly stringent and changing regulations, shows a need to embrace a new way of working. The initial success of the Modern Finance movement shows us that there is a better way to deal with finance processes – a more continuous way – that will change the way we work, ultimately, for the better. 
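As a small illustration of the kind of rule-based automation Continuous Accounting relies on, here is a hedged Python sketch that auto-certifies zero-balance accounts and reconciliations within tolerance, and flags the rest for review, earlier in the period rather than in a month-end rush. The account data, tolerance and naming are assumptions for the example, not BlackLine’s actual logic.

```python
# Illustrative sketch of automated reconciliation checks: certify the easy
# cases automatically, flag material differences for an accountant to review.
# Balances, account codes and the tolerance are illustrative assumptions.

ACCOUNTS = [
    {"code": "1000-CASH",     "gl_balance": 125_400.00, "subledger_balance": 125_400.00},
    {"code": "1200-AR",       "gl_balance":  98_750.50, "subledger_balance":  98_120.50},
    {"code": "2999-CLEARING", "gl_balance":       0.00, "subledger_balance":       0.00},
]

TOLERANCE = 1.00   # differences below this are treated as immaterial

for account in ACCOUNTS:
    difference = account["gl_balance"] - account["subledger_balance"]
    if account["gl_balance"] == 0 and account["subledger_balance"] == 0:
        status = "auto-certified (zero-balance account)"
    elif abs(difference) <= TOLERANCE:
        status = "auto-certified (within tolerance)"
    else:
        status = f"flagged for review (difference {difference:,.2f})"
    print(f"{account['code']}: {status}")
```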
### Search Engine Privacy: What the Internet knows about you When Tim Berners-Lee made his ideas for the World Wide Web available in the early 1990s, he did so without patenting them and without asking for royalties. This principle remains the foundation for the free Internet that we use today. Except, the Internet isn’t free. [easy-tweet tweet="The currency we exchange for our Internet services isn't dollars or pounds, but personal information" user="oscobo"] In the modern world, the currency that we exchange for our Internet services isn't dollars or pounds, but information; personal data that we are handing over in ever increasing quantities. The price that we pay to use the Internet, therefore, is a heavy one: our privacy. Of course, there is plenty of information that you openly give to Google and other web service providers. Most people don’t have a problem supplying their name, telephone number and address when they sign up for Gmail, for example. However, you are also sharing information whenever you enter a search term. The more that Google knows about your behaviour, your likes and dislikes, the more targeted their advertisements are and the more they can charge brands for them. Perhaps the most worrying aspect of this practice is that a lot of search engine users are not aware of exactly how much information Google is harvesting. It’s only by accessing the little-known “Web & App Activity” page that users are able to deduce the data that has been collected. You can delete this information, but only after you’ve ignored Google’s warnings about how it will affect your tailored results. By visiting your “Ads Settings” page, you can then see how Google uses this data to make money. You’ll likely see a number of results that genuinely match your interests, and plenty that don’t. The ad targeting algorithm is not perfect, but it does help them to increase their ad rates compared with blanket-serving the same ads to everyone. Of course, Google is not the only offender. Many other search engines, and indeed web services generally, collect user data in order to sell ads or the data itself. Facebook users can also download a copy of all the information they’ve shared with the social network (from deleted friends to individual pokes), highlighting just how much of our lives are now stored online. However, not every online search engine collects personal data. At Oscobo we believe that personal data should remain personal. We do not build online profiles based on your behaviour, we do not track your movements and we do not store data on our users. Of course, this means that we do not provide you with tailored results, but in many instances these “personalised suggestions” are simply ways for other search engines to turn your data into profit. Instead, we provide “true” search results based only on the words you type. There are many reasons why you might want to withhold your personal data - avoiding spam, receiving unbiased prices - but the main reason is simply the protection of privacy. You wouldn’t expect to hand over your phone number, home address and date of birth before using a face-to-face service, so why has this become accepted practice online? [easy-tweet tweet="Many search engines and other web services collect user data in order to sell ads or the data itself" user="oscobo"] Privacy is important and needs to be protected, even as we adopt an ever increasing range of web services.
With so much of our lives lived online, the time is right to find out exactly what the Internet knows about you and whether you’re comfortable with it. ### How IT pros can approach the challenges of multisite branch IT IT professionals are under increasing pressure to deliver an advanced, coordinated IT capability across all sites in a business, no matter how remote or how small. But, as always, this must be achieved without putting further strain on already overstretched IT budgets. The challenge is to deliver state-of-the-art, often complex and usually dynamic IT to remote sites without the need for expensive on-site expertise, or without over-taxing already stretched centralised IT resources. Thankfully, the emergence of cloud managed servers, which provide local advanced IT functionality as a service and without the need for local resource, are changing the game and opening up new opportunities for the multi-site IT professional. [easy-tweet tweet="The emergence of #cloud managed servers are opening up new opportunities for the multi-site IT professional"] There are a host of major needs that are driving organisations to review their branch IT strategy, including spiralling remote site support costs, exacerbated by the cost of the time and travel needed to get to these remote sites if on-site support is required; new customer models, like multichannel retailing, requiring high-functionality applications; local data storage needs where data must be stored and manipulated locally for policy, privacy or control reasons; data-heavy apps that can’t reliably operate over the network; the consolidation of network functions; and preparing for the Internet of Things in the future, where masses of data is created with a need to store and analyse this data locally, with periodic communication and reporting to centralised control points. These examples — and many more like them — share a common requirement for advanced local functionality, integrated across distributed organisations, without local IT skills to commission, manage or support them. This can lead to inertia within an organisation with new and innovative processes, business models, or customer engagement strategies not being adopted because of the difficulties of making them work at the local level. A key role of the IT professional is to marry the strategic direction of the business with a set of equally strategic technology choices, and then develop a roadmap that delivers a programme of change that seeks to align the two. There are a range of options open for IT professionals to consider. In the recent past, the IT pro has rightly chosen a cloud-first strategy to solve remote site IT, but then struggled to articulate a branch strategy that is consistent with the cloud. So why have IT pros headed straight for the cloud in the first place? Traditional servers need regular love and attention.  Historically, the infrastructure used to carry out server maintenance on remote locations was lacking, resulting in high support costs and low service levels. However, a cloud-first strategy is not a cloud-only strategy. Many of the new and innovative applications (as well as most legacy ones) require IT infrastructure on the remote site. Consequently, although the public cloud has a significant role to play in branch IT, it does not support some of the most demanding business requirements of the distributed organisation. 
Two major and interconnected developments — hybrid cloud and hyper-convergence — are allowing us to redefine servers in a way that meets the needs of advanced local IT capability without advanced local IT skills. Disparate remote sites and branches without local IT skills can increase the load on central IT departments. Fortunately, the availability of new types of cloud-managed servers opens up new opportunities for effective, cost-efficient and high-functionality remote site IT. With the advent of cloud-managed servers, an IT pro can now drive a coherent and aligned strategy across the business. They offer new opportunities for IT service providers to prosper in the cloud world, and they enable IT professionals to meet the common objectives of most multisite organisations:
- a consistent technology platform for all locations and businesses, with a single set of data;
- best practice in IT and support for operational excellence, by providing a standard technology stack on which to build a consistent set of practices and achieve predictable performance;
- resilience to keep operating, with scale-out architectures and automated failover in the event of hardware faults;
- a good functional fit to requirements out of the box, with built-in keep-current technology and automated software releases, so systems can be kept up to date over many years without people being sent on-site.

Cloud-managed servers deliver packaged IT functionality, which can be run on any site, regardless of local IT support skills or the resilience of local network connectivity. In many ways, they are the ideal solution to the needs of the remote site, inasmuch as they allow a flexible IT strategy, with applications placed where they deliver the most benefit and support resources optimised across the whole distributed organisation. [easy-tweet tweet="Cloud-managed servers deliver packaged IT functionality, which can be run on any site" hashtags="Cloud"] This fundamental shift in cost-effective, high-functionality, small-site IT delivered by cloud-managed servers is a game-changer in the world of branch IT. Looking ahead, this approach to delivering complex local application capability without a correspondingly complex IT infrastructure looks set to become the norm. ### Connected Cars 2016 28 - 30 June 2016 Olympia Grand, London To help you maximise your revenues, Connected Cars 2016 will bring together different teams from OEMs and Tier 1s – including Infotainment, Product Planning, Strategy and R&D – to focus on these issues. Through 15+ hours of networking (5 more than last year!) and interactive formats away from the conference stage, you’ll be able to have frank discussions with your peers. Our programme will be developed and overseen by an advisory board of 10 industry visionaries, ensuring it meets your business’ needs and that you leave with clear strategies to stay ahead of the competition. ### Cloud & DevOps World 21 - 22 June 2016 Olympia, London Cloud & DevOps World will return to Kensington Olympia, London, to bring together the industry’s leading technologists and innovators to discuss the future of Cloud Computing. This year’s edition will no longer focus on Cloud implementation, but on the strategies, business models and technologies which can activate the Cloud and drive new opportunities for your organisation.
With case studies and thought leadership pieces from the likes of HMRC, Betfair, HSBC, Google, Twitter, Airbus, Lufthansa, M&S, Sky, Lonely Planet, Hotels.com and many more, Cloud & DevOps World will give you the opportunity to drive your business into the digital economy and realise the potential of Cloud Computing. ### IoT Tech Expo Central Europe 13 - 14 June 2016 Berlin Congress Center, Germany The Internet of Things (IoT) Tech Expo will bring together key industries from across Central Europe for two days of top-level content and discussion. Industries include Manufacturing, Transport, Health, Logistics, Government, Energy and Automotive. Introducing and exploring the latest innovations within the Internet of Things, this conference is not to be missed. Taking place in Berlin’s Congress Center on 13-14th June, this year’s IoT Tech Expo Central Europe in Germany will host thousands of attendees, including IT decision makers, developers and makers, OEMs, government and local council officials, automotive execs, operators, technology providers, investors, venture capitalists and many more. The IoT Tech Expo is set to showcase the most cutting-edge technologies from more than 100 exhibitors and provide insight from over 200 speakers sharing their unparalleled industry knowledge and real-life experiences. ### Infosecurity Europe 7 - 9 June 2016 Olympia, London The Infosecurity Europe visitor audience is made up of a diverse range of IT security professionals coming from key sectors such as: IT distribution companies; IT hardware, software, manufacturers and suppliers; finance, banking and insurance; and government. With 12,006 unique visitors from 80 countries passing through its doors, Infosecurity Europe 2015 was a great success. Our team is working towards the delivery of an even greater 21st edition of Infosecurity Europe in 2016. ### How the cloud era is changing the insider threat One of the biggest security challenges now facing all organisations is the “insider threat”: either accidental or malicious activity by an individual with access to their network. Concerns around those with privileged access to the company’s data assets are significant: in our own survey of more than 500 IT security experts, 70 per cent of respondents considered the insider threat riskier than an outside attack. When attackers gain insider access they can stay undetected within the network for months and cause real and lasting damage. Risks could include loss or theft of data, or even malware being introduced to the network. [easy-tweet tweet="One of the biggest #security challenges now facing all organisations is the insider threat" hashtags="Cloud"] Now, in the age of the cloud, assumptions made about who the ‘insiders’ are need to be reconsidered. The perimeters are shifting and, if our data is no longer under our own watchful eyes, what’s at risk when we don’t know, and haven’t been responsible for hiring or vetting, the very people entrusted to look after our most valuable asset? After all, we can’t simply walk into the data centre and oversee the individuals that are responsible for managing our IT estate. So how do we regain control of the human risk of ‘insiders’ at third-party providers? The shifting network boundaries As cloud computing has become ubiquitous, businesses of all sizes have reaped the benefits of flexibility and scalability.
However, shifting network boundaries have given rise to misunderstandings about where the lines of responsibility start and finish when it comes to data security. When users can more easily access and share data, there’s not always clarity around when the user is responsible and when this responsibility rests with the cloud provider. Yet, given their privileged access rights, the consequences of damage done by a malicious insider, such as a cloud administrator, might be far more devastating than anything that could be perpetrated within the company itself.    The added challenge is that ‘insiders’ have an advantage of knowing the best way to infiltrate the network. With privileged access rights, they may have intelligence on knowing where to strike for maximum effect and how to disguise what they’ve done. And of course, there’s an impact on the service provider. When even minor performance issues, delays or downtime can result in significant damage to their reputation, the impact of activity at the hands of a rogue employee could be devastating.  Best practices for visibility and control There’s always some risk involved in handing over responsibility to a third party, however there are ways to control the partnership through a combination of sound processes, transparency and visibility of their activities. Establish the partnership parameters. From an operational perspective, organisations should exercise due diligence when selecting their partner and ensure that there are contractual obligations governing security policies and procedures that the cloud provider will adhere to.  Organisations should also not be afraid to ask questions about those with privileged access rights. These are the systems administrators that will be managing the cloud computing environment, so it’s important to understand what kind of checks and controls are in place for these individuals.    Monitor administrator’s activity. Restricting external administrator access is also a challenging exercise. For this reason, it’s essential to have tools in place that can monitor third party and external administrative activity. Organisations must know what is happening across the network in real time and protect against unauthorised access.  Identify anomalies. New approaches, known as User Behaviour Analytics (UBA) are enabling organisations to understand what is really happening on the network and identify any unusual activity. They work with machine learning algorithms, which create a profile of users and can pinpoint abnormalities in their day-to-day activity. This can identify if there has been a data leakage, or database manipulation and the cause of the incident can be quickly identified. [easy-tweet tweet="The added challenge is that ‘insiders’ have an advantage of knowing the best way to infiltrate the network"] Fundamentally, organisations need to take control of the partnership with their provider and apply the same strict security standards that they would have within their own organisation. Visibility of the activities of privileged users helps to control the ‘human risk’ and means that any actions at the hands of a malicious ‘insider’ can be quickly stopped in its tracks. ### Data Storage & Disaster Recovery - #CloudTalks with Insight’s Mike Wheeler In this episode of #CloudTalks, Compare the Cloud speaks with Mike Wheeler from Insight about how their Discovery Workshops can help improve business processes, agility, security and productivity. 
#CloudTalks Shaped by the evolution in IT, Insight is now a leading provider of Intelligent Technology solutions, helping companies around the world implement innovation and improve business performance. ### How cloud technology is impacting finance Cloud adoption is on the rise. As noted by a recent RightScale survey, hybrid cloud adoption now tops 71 per cent year over year. The cloud is also maturing — the need for highly skilled professionals has replaced security as the top business concern. In effect, a “skills gap” has emerged as purpose-built cloud technology advances at a breakneck pace. Despite a burgeoning market, however, the financial industry has been slow to adopt this new technology; here are five ways cloud advancements are impacting the future of finance. [easy-tweet tweet="The need for highly skilled professionals has replaced #security as the top business concern" hashtags="Cloud"] Strategic Thinking Financial institutions ensure continued profit, not only through sound present-day business practices, but by making intelligent predictions about future trends. As noted by Forbes, finance teams are increasingly on the front line of decision-making and often tapped by other departments to make informed judgments about both spend and investments. Cloud-based finance systems now offer the ability to analyse historical and operational data, in turn generating both prescriptive insights — used to streamline current business processes — and predictive groundwork, used to suggest the most profitable course of action moving forward. Collaboration Financial decisions are rarely made in isolation. Often, approval from one or more business units across disparate geographic locations is required before high-value agreements are signed or new strategies are adopted. Cloud technologies support the adoption of critical communications technology such as voice over IP (VoIP) and video conferencing — allowing international connections without needing to leave local offices. Cloud solutions also make it possible for project teams to collaborate in real time and create a continuously updated set of documentation to ensure both data continuity and security.  Fraud Detection The advent of sophisticated malware strains and phishing efforts make fraud a serious concern for every financial institution. Both customer accounts and internal networks are at risk for cybercriminals that breach IT defences — at best, companies lose time tracking and stopping malicious actors. At worst, they lose money and public trust. Advanced cloud-analytic platforms use a combination of raw data and behavioural analysis to determine the likelihood that a particular transaction or user behaviour is fraud rather than legitimate. Companies can set a specific notification threshold in addition to creating automated responses to both quarantine threats and ensure security pros have the necessary data to act.  Resource Use To meet the challenges of increased computing demand, financial companies have historically made large, on-site hardware investments. The problem? Even with rapidly increasing server density and enhanced data storage processes, there's a finite limit to available resources. Not so in the cloud — companies pay for the performance they need and can scale up on demand. In addition, businesses are not responsible for the maintenance or upgrading of cloud-based hardware, meaning they can transition from a variable combination of CapEx and OpEx spending to consistent and controllable OpEx month to month. 
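To put a rough number on the resource-use point above, the toy Python sketch below compares an on-site purchase sized for peak demand with cloud capacity paid for as it is used. All prices and demand figures are illustrative assumptions, not benchmarks.

```python
# Toy CapEx-vs-OpEx comparison with illustrative figures: on-site hardware is
# sized for the busiest month, while cloud capacity tracks actual demand.

monthly_demand_units = [40, 45, 50, 55, 70, 95, 120, 90, 70, 60, 50, 45]  # seasonal peaks

# On-site: buy enough capacity for the peak month and amortise over 3 years.
COST_PER_UNIT_CAPEX = 900                  # purchase and install, per unit of capacity
capex_per_month = max(monthly_demand_units) * COST_PER_UNIT_CAPEX / 36

# Cloud: pay only for the capacity actually consumed each month.
COST_PER_UNIT_OPEX = 30                    # per unit of capacity, per month
opex_by_month = [units * COST_PER_UNIT_OPEX for units in monthly_demand_units]

print(f"On-site (amortised):   {capex_per_month:,.0f} per month, every month")
print(f"Cloud (pay-as-you-go): {sum(opex_by_month) / len(opex_by_month):,.0f} per month on average, "
      f"ranging from {min(opex_by_month):,.0f} to {max(opex_by_month):,.0f}")
```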
Intelligence According to the Financial Times, the development of artificial intelligence (AI) in the cloud also promises to positively disrupt the financial sector. There are several possible avenues of advancement based on this premise. For example, adaptable intelligence tools could be used to enhance security through the use of facial and behavioural recognition technology. Language-based intelligence, meanwhile, could drive the development of enhanced self-service finance solutions, giving consumers increased flexibility without compromising security. Eventually, cloud-based AI could help streamline the distribution of resources and data across complete enterprise networks. [easy-tweet tweet="Cloud #AI could help streamline the distribution of resources and data across complete enterprise networks"] The cloud has moved from outlier to fundamental infrastructure for many organisations. While financial companies have been hesitant to adopt this technology in an effort to ensure both compliance and data security are unaffected, cloud solutions are now mature enough to not only support effective financial governance but help companies secure a solid, revenue-driven future. ### The evolution of IoT: Fog computing Did you know that in only two years’ time, we, the users of the Internet, will be producing as much as 50,000 gigabytes of data per second? [easy-tweet tweet="Our data centres will soon become overcrowded with information" hashtags="IoT, FogComputing"] This isn’t just an interesting fact; it is also an upcoming problem that we will have to face very soon. The obvious question is: where are we going to store all of that information? While some suggestions go to such extremes as implementing data storage disks in any device capable of carrying them, a whole other aspect will begin to give us additional headaches: data acquisition and real-time computing. As the need for real-time data acquisition continues to grow, our servers, and our providers, will simply begin to crack under the pressure. Proof of how much of an issue an over-clogged channel can present for both the end user and the service provider lies in the frequent cyber-security breaches that we are witnessing on a daily basis. Our providers and governments are simply incapable of spotting the actual threat and warning us in time before our records get exposed and our accounts hacked. And if you were wondering why this is the case, the problem is actually a technical one, but a very simple one at that: we are hogging all of the Internet! With so many operations computing simultaneously, spotting the threat becomes an almost impossible task. It is as though you are trying to find a needle in a haystack. Before the end of this decade we will have more than 22 billion devices online. Now imagine 22 billion devices downloading and uploading information, constantly. How will this astonishing number of operations influence the speed of our own connections and real-time data acquisition? The answer is – a lot. And since entire industries, businesses, and enterprises of all sizes rely on real-time data when they are making their decisions, this issue may have a catastrophic outcome for many. So what can we do? What’s the solution? This answer, too, is a very simple one – our hardware will evolve. And one of the alternatives that might just present the resolution of this problem is called Fog Computing. 
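To make the idea concrete before the next section expands on it, here is a minimal sketch of what a fog node does: process the bulk of the data locally and pass upstream only a condensed summary, or anything that genuinely needs the central data centre’s attention. This is purely illustrative – the node, the local store and the threshold below are invented for the example rather than taken from any real fog framework.

```python
from statistics import mean

CLOUD_QUEUE = []   # stand-in for traffic sent upstream to the central data centre
LOCAL_STORE = []   # stand-in for data kept on the local fog node itself

def fog_node(sensor_readings: list[float], alert_threshold: float = 75.0) -> None:
    """Aggregate readings locally; forward only what needs outside attention."""
    summary = {
        "count": len(sensor_readings),
        "average": round(mean(sensor_readings), 2),
        "peak": max(sensor_readings),
    }
    LOCAL_STORE.append(summary)        # the bulk of the work stays local
    if summary["peak"] >= alert_threshold:
        CLOUD_QUEUE.append(summary)    # only exceptions travel upstream

# Thousands of raw readings reduce to one local record; the data centre hears
# about this batch only because a single reading crossed the threshold.
fog_node([20.1, 21.3, 19.8, 80.2, 22.0])
print(len(LOCAL_STORE), len(CLOUD_QUEUE))
```

The point of the sketch is simply the ratio: the raw stream never leaves the device, so the channels to the provider stay free for the requests that really need them.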
Foggy (with a chance of) Computing Our data centres will soon become overcrowded with the amount of information that we are producing and asking for. We all want, as professionals and as private users, the best possible service, the best cloud storage, the best analytics software, not to mention that we want our data to be secure and encrypted. The problem is – all of that information is stored and processed within our data centres. They are computing, receiving and delivering information to a rapidly growing number of users, constantly. This is why alternatives such as Fog Computing present a much-needed aid. The effective resolution may just be a simple decentralisation of the computing process and data acquisition. The idea of relocating 90 per cent of the process to a local cloud computing server, and limiting our data requests to only those which seek outside information, is called Fog Computing. This means that we will have a piece of hardware, presumably no larger than our current Internet modem. And while today all of the cloud computing process happens in the data centre of our service provider, or in the data centre of our platform provider, in the future we will probably have our very own, private cloud computing server that will handle all the grunt work. This method will allow us to keep the channels of communication open for much more important tasks, such as real-time acquisition. It will also have a positive effect on the current, alarming state of cybercrime. Since your vendors won’t be swamped by multiple requests, they will probably have a more transparent insight into threats and will presumably even discover them much sooner. Of course, this decentralisation of data and cloud computing will bring a great variety of other questions, such as access control, the process of authentication, and the very location of your most valuable files. Transferring to a complex system of this sort will also have quite a few drawbacks. It isn’t just a question of the security of such a system, but also a question of marketing. How will industries, and individual users, accept this trend that will soon become a harsh reality and a necessity? With some of the industry leaders already working on developing their Fog Computing strategies, and with so many smaller businesses and enterprises still unaware of this upcoming problem, we are bound to witness yet another long and tiresome adaptation process. [easy-tweet tweet="Fog Computing will need years to prove itself as a mandatory addition to our #cloud computing world"] So although it is inevitable, Fog Computing will need years to prove itself as a mandatory addition to our cloud computing world. In the meantime, enjoy your Internet speed, because as scary as it may sound – you might just end up missing it. ### Telephone Compliance - #CloudTalks with BT smartnumbers' James Foley In this episode of #CloudTalks, Compare the Cloud speaks with James Foley from BT smartnumbers about how the cloud is a step forward in terms of data security. #CloudTalks Organisations are looking to the cloud to provide the next generation of business applications and utility, and nothing is better suited to a cloud solution than the storage, discovery and retrieval of corporate call recordings. Built to the most demanding compliance requirements, the BT smartnumbers vault provides a level of scale, security and resilience that is far beyond most organisations’ infrastructure. 
And yet it has been designed for ease of use and simplicity of deployment. ### The problem with Blockchain Bitcoin was the cool kid on the financial services block not so long ago. Despite being thrown back into the media spotlight recently, with entrepreneur Craig Wright claiming the digital currency is his brainchild, Bitcoin had lost appeal long before this. It is essentially an un-regulated currency that goes against all Know-Your-Customer (KYC) and Anti-Money-Laundering (AML) principles. [easy-tweet tweet="The industry has now turned its attentions to the technology that underpins #Bitcoin: #Blockchain" hashtags="FinTech"] The industry has now turned its attentions to the technology that underpins Bitcoin: Blockchain. It has been hailed as holding the key to reinventing the face of finance, and recently, we’ve seen four major banks test Blockchain for credit default swaps, Deutsche Bank enter discussions with a London Innovation Lab for the technology, and RBS plan to pilot a product based around this new phenomenon. It’s no secret that there has been a ton of VC money invested into Blockchain in recent years, and we’ve seen a whole host of start-ups emerge that are focused on its progression. Blockchain company R3 is now looking to raise a staggering $200 million in funding. However, it seems as though one key question has been overlooked amid this industry buzz – does Blockchain actually offer anything to the end-customer that could not be achieved with existing technology? Where does the customer fit in? Some argue that within the remittance space, the technology could help to facilitate global and instant payments in mere seconds, and at an extremely low cost. With these two features, it would serve as a better alternative to traditional channels such as SWIFT or ACH. Whilst this may be true, it is only the case because banks have under-invested in global payment networks for far too long. In reality, being able to send $10 from New York to Sydney within seconds could have been achieved much earlier without the requirement of Blockchain if only banks had addressed the issue as the problem arose. However because they had all of the power in the finance sector, a lack of competition meant that they had no real need to innovate, and there was no financial incentive to invest in payment networks while profits on cross-border payments were so high. One area that could massively benefit from Blockchain is in settlement. The opportunity to overhaul outdated legacy systems that are slow and inefficient is massive. However, settlement is little other than a back end process that offers very little to the end customer. What matters to the consumer is being able to access finance options that suit them, and that make their lives easier. They want to be able to bank from their mobile with ease, and access a range of financial services that help to enhance their life, all at a good price. Blockchain simply doesn’t achieve this. Exploring the excitement Blockchain initiatives, like many other FinTech solutions, are encouraging change within the incumbents and this is undoubtedly one of FinTech’s greatest achievements. With a constant stream of media attention around Blockchain, the concern that it could truly disrupt the financial sector has prompted banks to take action and look into what they could gain from this technology. Whilst a complete transformation of the industry may not be a realistic expectation, banks have had no choice but to investigate the possibilities. 
The potential Overhauling outdated legacy settlement technology is a key area where Blockchain has the potential to shine. It is a massive industry with millions exchanged daily, and the companies that are able to build these technologies could well be worth billions in the future. However, it is all about the end-user, and a proposition that only brings little value to the consumer cannot feasibly be dubbed as transforming financial services as a whole. If a solution is to truly disrupt the industry, the complete user experience needs to be re-built. Only when solutions are created that bring full transparency to the market, change the channels that consumers interact with to buy financial products, or offer completely new products, will the industry see real change. A key example is JP Morgan. A consumer wouldn’t choose to use their services because of the technology they use to settle transactions, but rather they would choose them because they have a trustworthy reputation. [easy-tweet tweet="#Blockchain will continue to gain media headlines by transforming some niche banking sectors" hashtags="FinTech"] There’s little doubt Blockchain will continue to gain media headlines, and in some senses, for all of the right reasons, for example, transforming some niche banking sectors. However, it is vital to keep in mind that whilst it may be able to improve some areas, banks will have to rethink their customer offering if they are to truly reinvent themselves. ### Converged data management firm Rubrik expands into Europe Rubrik, the Converged Data Management Company, has announced its expansion into Europe, the Middle East and Africa. The company has experienced rapid customer growth in North America, and launched its operations in response to tremendous customer and partner demands in EMEA.  Rubrik’s Converged Data Management software powers the Company’s r300/r500 Series appliances, which deliver automated backup, instant recovery, unlimited replication, and data archival at infinite scale. [easy-tweet tweet="Rubrik has launched new operations in response to customer and partner demands in EMEA" user="rubrikInc"] “Rubrik is the game changing innovation for data protection in the last two decades, and the company’s steep growth in North America has proven that,” said Bipul Sinha, Rubrik Co-founder and CEO.  “We’re excited to bring our solution to EMEA with our growing team and our strategic distributors and reseller partners.” The company also announced the appointment of IT industry veteran Karl Driesen, as Head of EMEA Sales. As the former Vice President, EMEA at Palo Alto Networks, Driesen built a team of over 200 employees and a $200 million business in five years. Most recently, Driesen was a Vice President with Elastica. “I’m excited to join the Rubrik team,” said Karl Driesen, Head of EMEA Sales.  “The combination of strong demand for Rubrik’s converged backup and recovery solution, and our solid EMEA distribution, reseller and support networks will enable us to make a significant impact in the marketplace." 
Setting their sights on new markets, Rubrik has undertaken a series of developments over the last two quarters, including:
- Hiring customer account teams in the UK, Belgium, France and the Netherlands
- Signing up major European and Middle Eastern distributors such as BigTec, Commtech, and Evanssion, including their partner networks
- Certified Partner Field Engineers in the Rubrik Master Program
- Sold production deployments in EMEA to end user customers in the Business Services, Finance and Government verticals.
[easy-tweet tweet="Setting their sights on new markets, Rubrik has undertaken a series of developments over the last two quarters"] “Rubrik does everything we wanted from a solution, and works as described, insanely simple,” said Holger Sell, Manager of Corporate IT Services at Totaljobs Group (UK). “The overall support and feature set together with a clear product vision made the decision for us. I see Totaljobs Group as a company making a difference. A place that helps 60,000 people find a job every month, and a centre of excellence for innovation and product development. Rubrik’s innovative solution fits right in.” ### Key takeaways from the Zayo Capabilities Panel at Cloud Expo Europe Zayo Group hosted a panel of cloud industry experts during Cloud Expo Europe April 12-13 in London. The panel was hosted by Timothy Creswick, CEO and Founder at Vorboss, and included Karl Deacon, SVP Enterprise Platforms at Canopy, Steve Hall, CEO at Crown Hosting, and Aaron Shelley from Zayo Group. [easy-tweet tweet="The #cloud term can be both a help and hindrance in terms of brokering a discussion with customers" user="ZayoGroup"] Aaron Shelley, manager of technical sales enablement at Zayo Group, shares the top 5 take-aways from the members of the panel, including what hosting and infrastructure providers should be doing now to help their customers with cloud and IaaS solutions: 1) The term ‘cloud’ can be both a help and a hindrance for customers Aaron: The cloud term can be both a help and a hindrance in terms of brokering a discussion with customers. Because cloud is a fluid term, it does offer you an advantage: architecting cloud as an idea can become everything, depending on the customer’s needs. Karl: The vagueness of ‘cloud’ allows for a broader discussion of what people actually need to achieve from a business perspective and then how the 'cloud' should help them. Customers are now looking to transform the enterprise as a result of cloud-enabled solutions and it has become the tool that allows businesses to do just that. Steve: Customers aspire to digitally transform themselves but don't know how to do that, so they will often turn to cloud providers for input. 2) Self-service is prevalent in the cloud Karl: Cloud, social and mobile are all important for customers today. Self-service is critical in driving potential change in a managed way and bringing quick benefits. Cloud is still (a question of) outsourcing and how you can get the best out of your cloud service provider (including fully managed service and self-service). Aaron: Smaller, more mid-market businesses are used to doing it for themselves. Zayo can be at both ends of the spectrum. Customers demand self-service for greater speed and agility. 3) What is the role of the IT department in a situation where departments can self-service for the cloud and IaaS applications they need? Aaron: Internal IT teams need to upskill further on how to onboard a robust cloud strategy. 
The successful ones are not the service huggers but the ones which allow others to take control. Where do I see success? Leaders who are discussing what is tactical and what is strategic for the business to be able to achieve in IT transformation terms over the next 12 to 24 months. Karl: IT is an enabler to transform a business from the inside-out, and the IT team has become the strategic enabling service within those organisations to effect more strategic change often without their colleagues realising a change took place! It is always going to be about how new technology will change how businesses work. For example, Siemens and GE are building analytics into their overall development strategies so they will change their relationship with their customers over time. Technology is now the enabler to drive change to reach scale and to develop closer links with customers. At that point, you are in an enviable position. Steve: It’s all about organisational and THEN technological change. Every time I have seen it the other way round it has gone wrong. 4) Major factors in the last 12 months which have accelerated the cloud Karl: It is business economics which is driving cloud adoption. It is not just about technology innovation, but a desire to change the way we do business and adopt the cloud. Steve: Companies are becoming more confident in the cloud and therefore less risk-averse. Business desire has always been there, but it's become a more comfortable thing to do and there is less risk involved now. Aaron: It's a business solution. Newcomers tend to diagnose the cloud from a business angle. Where does it hurt? What does the business need? In a phrase: disaster recovery. 5) What happens to the businesses getting left behind? Are there risks to not adopting cloud? Karl: Again, this is a business issue. The companies who are left behind, whether they adopt the cloud or not, are those that are not being innovative and competitive in their markets. Steve: There are now a lot more hybrid solutions available at market level. No way is cloud ‘one size fits all’ and I tend to agree: hybrid is now a popular choice to help support this innovation and tailor it to these business needs. [easy-tweet tweet="Companies are becoming more confident in the #cloud and therefore less risk-averse" user="ZayoGroup"] Aaron: It completely depends. At what point are you in your business lifecycle? It depends on your company’s current market offerings. Pure public cloud is the exception not the rule, the majority are now shifting into the hybrid. If there is not a business driver behind you, you are not making the right decision. Everything should be used in moderation, including cloud development. ### Live From IBM Bluemix Garage, London Tonight! DisruptiveTech.tv, is recording LIVE tonight at 19:30 GMT. Broadcast time: Wednesday 18th May 2016 - 19:30pm   ### Cloud Voice Services - #CloudTalks with BT smartnumbers' James Foley In this episode of #CloudTalks, Compare the Cloud speaks with James Foley from BT smartnumbers about how their cloud voice services are bringing agility and mobility to businesses. #CloudTalks The BT smartnumbers portfolio provides the most resilient voice services offered by BT, with a 99.999 per cent service availability guarantee. More than 2000 organisations and 250,000 users trust BT smartnumbers for fixed and mobile phones. ### Don’t do your business a disservice: Be determined with your domain name In business, your name is everything. 
Business owners spend a significant amount of time and money creating a name that reflects who they are and what they do. As often the first thing customers encounter, your name is unavoidably linked to your brand, identity and reputation. So why not pay just as much attention to your domain name? [easy-tweet tweet="Your name is unavoidably linked to your brand, identity and reputation" hashtags="Domain"] From extension names to number usage, security issues to hyphens – there are many things to take into consideration. But before rushing to buy your domain, keep our five top tips in mind to ensure your domain name is the perfect fit for your current and future needs. 1. I spell po-tay-to; you spell po-tah-to If we were face-to-face and you said your domain name, would I be able to write it down? This may seem like a silly question, but it’s surprising just how many businesses fall at this first hurdle. The best way to ensure there is no miscommunication is to do exactly that. Some call it the ‘Radio Test’ - ask a selection of people to listen as you say your website address and have them write down what they hear. Say your business is called ‘Fifty Times Better’ – that could be written in a variety of ways: www.fiftytimesbetter.com, www.50xbetter.com or how about www.50timesbetter.com? This simple example shows how important it is that the verbal version of your domain name is well understood. 2. wHy mAk3 ThIng5 c0mpLicAt3d? Although you want your domain name to give adequate information about who you are and what you do, it’s not the place to try and squeeze in as many key words as possible. Don’t overdo it – using too many words can make it difficult for people to identify even one! As we’ve already seen, numbers or hyphens in your domain name can prove confusing when conducting the Radio Test, not to mention that Google actually penalises domains that include too many hyphens. Shorter and symbol free names are catchy, easily recognisable and understandable, making it easier for your audiences to find you. 3. Prevention is better than the cure Taylor Swift is an award winning singer-songwriter said to be worth at least $200 million. However, if you search for her official Twitter handle it is not simply ‘taylorswift’ but ‘taylorswift13’. The ‘taylorswift’ account is suspended, but Taylor Swift herself is still yet to take control of her handle. You may be thinking that your business is too small to start seriously thinking about a domain name. But as a business grows and attracts more attention, it becomes more susceptible to ‘cyber squatters’ – and if your business is doing well, it’s unlikely they will give it up without a fight. We suggest that you register your domain name under several variations to protect your brand. The bigger you get, the harder you can fall and the more people that may be looking to trip you up! United Airlines learnt the hard way when a parody Twitter account sarcastically replied to grievances from angry customers. Registering multiple domains beforehand can save you a lot of time, money and stress that will come with having to try and buy them back later on, or worse, if your customers think these fake accounts are yours. 4. Don’t get lost in translation Remember, what makes sense in your native language may have a completely different meaning in another. A good example of this is when over the Christmas period, Ford released a car called the Pinto. 
The ad campaign included the slogan ‘put a Pinto under your tree’ – which to you and me is a nice play-on-words. For Portuguese speakers however, the word ‘pinto’ has a completely different meaning and this proved incredibly problematic (I’ll leave you to find out why). Do the research – make sure meaning travels across as many markets as possible. 5. Keep the future in mind If your ambitions are global and you’re thinking about long term development, it’s worth thinking about the markets where you’re likely to do business in the future and secure your name with different country domain extensions. There is an extensive list of global extensions you can procure. Of course this does take a level of future planning, however if you don’t do it early, the day you want to branch out, you may find it’s already been taken and naturally - it’ll be for sale for a very high price. Only recently did Apple lose a trademark fight over the name ‘iPhone’ in China. Make the small investment now to avoid scrambling later. [easy-tweet tweet="We suggest that you register your #domain name under several variations to protect your brand"] So that’s our top five tips to choosing a domain name. Now is the time to protect your virtual identity. In this digital age, a good name is no good unless you also have a good domain name attached to it. ### How to avoid the data breach blame game Deciding who is accountable for data security within your business is far from clear-cut. Individual errors may have led to the breach, but these may have only been possible as a result of poor security policy, at which point blame often shifts towards the IT team. However, in the wake of a data breach, it is important that businesses do not start throwing accusations around wildly. Determining who is ultimately responsible for company security is not about realising who is to blame, but rather who is in charge of making sure similar mistakes are not repeated. [easy-tweet tweet="Recent research by #VMWare found that cyber #security can no longer be left to the IT team alone" user="VMware"] Recent research carried out by VMWare, however, has found that cyber defence can no longer be left to the IT team alone. In fact, 29 per cent of IT Decision Makers (ITDMs) and office workers believe that the CEO should be held responsible for a significant data breach. Similarly, when asked who should be most aware of how to respond to a data breach, 38 per cent of office workers and 22 per cent of ITDMs said the board, while 53 per cent of office workers and 40 per cent of ITDMs believed it was the remit of the CEO. Evidently, the response to cyber attacks is changing and it is important for businesses to understand why. Firstly, organisations are coming to terms with the fact that it is often a case of if, not when, a data breach occurs. 24 per cent of businesses expect a serious cyber attack to hit their organisation in the next 90 days and one look at last year’s headlines will reveal how harmful they can be. Reputational damage as a result of a data breach can be difficult to recover from, as the likes of TalkTalk and Ashley Madison are now discovering. With the frequency and impact of data breaches becoming better understood, it is not surprising that businesses are moving towards a more holistic security policy, one that comes all the way from the C-suite. 
“The issue around accountability is symptomatic of the underlying challenge faced as organisations seek to push boundaries, transform and differentiate, as well as secure the business against ever-changing threats”, explained Joe Baguley, CTO, VMware, EMEA. “Today’s most successful organisations can move and respond at speed as well as safeguard their brand and customer trust. With applications and user data on more devices in more locations than ever before, these companies have moved beyond the traditional IT security approach which may not protect the digital businesses of today.” The need to constantly innovate mentioned above has proven difficult to integrate with existing security measures for some companies. The rise of mobile devices and cloud computing has created far more access points, and hence vulnerabilities, for corporate data, and the expected growth in IoT technologies is only going to exacerbate the issue. [easy-tweet tweet="Communication moves data #security from being a blame game to more about collaborative solutions" hashtags="VMWare"] Technical solutions, including a software-defined approach to security and encryption on data at rest and in transit, will help, but cultural changes are also in order. IT security must be demystified if businesses are to become better protected and for that to happen, clear and continuous dialogue must take place between the IT team and the board. That way, security becomes less of a blame game and more about collaborative solutions. ### Modern Finance: The story so far In today’s world, cloud-based process automation is becoming omnipresent in everything from banking to shopping, and businesses are beginning to feel the benefits of smoother processes and ROI. [easy-tweet tweet="Cloud-based process automation is becoming omnipresent in everything from banking to shopping" hashtags="Cloud"] However, this was not always the case. A few years ago, if you asked a finance or accounting professional about the state of their work processes, more often than not, they’d tell you they were snowed under with laborious, time-consuming manual processes, and cloud-based business solutions were an unknown and frightening prospect. In fact, it wasn’t until quite recently that businesses began to adopt cloud-based software solutions designed specifically for finance and accounting professionals – which, when you think about the huge benefits, is mind boggling. The automation of financial processes, or as we at BlackLine like to call it, the Modern Finance approach, not only makes life easier for finance professionals by saving time, cutting costs and virtually eliminating risk of error, but benefits the wider business in ways that a few years ago, we couldn’t have begun to imagine. And it all started with an idea… The birth of BlackLine and the Modern Finance prototype The concept of Modern Finance was designed by BlackLine back in the early Noughties. Founder and CEO Therese Tucker, comfortably retired from her position as CTO of SunGard Technologies, was enjoying maternity leave and had no intention of going straight back into work. Until, that is, she began to reconnect with former clients and heard their horror stories about being kept in the office at all hours, poring over spreadsheets and box files each month in an effort to complete their reconciliations on time. 
The primary issue was that, whilst other areas of business were embracing process automation and thus moving forward in terms of efficiency and productivity, finance and accounting departments were finding themselves left behind. Staff were overworked, spending hours on manual data entry in spreadsheets – a process which, by its very nature, was not secure and was left wide open to mistakes, accidental or otherwise. Time, money and paper were being wasted and something had to be done. It was from these conversations that Therese formulated the idea for an automation solution, something that would modernise the way accountants worked. From this idea, BlackLine’s original account reconciliation automation solution was born. The solution essentially automated and controlled the financial close process – by providing visibility into data, the solution enabled accountants to rapidly detect any anomaly, thereby limiting the potential for mistakes to happen. The process enabled a more continuous approach to accounting – instead of a spike in activity at month end, accountants could spread out their duties over the month, leading to a smoother close. Therese developed the original BlackLine solution entirely on her own and was sole investor in the product for the first five years. It wasn’t long before former clients began requesting a trial of the new solution. Very quickly, BlackLine’s success began to pick up speed. In 2009, Therese led her now-burgeoning company through a migration to SaaS, a risky move at the time and one which could have alienated users. Luckily, it paid off, and enabled BlackLine to later develop a full suite of solutions for accounting departments. Modern Finance today: Still a need for automation education Today, the Modern Finance revolution is in full swing. Not only are businesses acknowledging the benefits of process automation, they are continuing to build on the original reconciliation solutions they have purchased, enabling them to take the ‘next step’ in automation. Increasingly, businesses are seeing fewer errors, a more timely month-end close, and happier staff. One such example would be BlackLine customer Western Union which, following the adoption of BlackLine’s solutions, was able to sync up over 20 global accounting offices, enabling the Group Accounting Director to view in real time which reconciliations were complete and which were outstanding. And BlackLine? We now have over 1300 customers in over 100 countries; large and mid-sized corporates including LV, KFC, British Gas and RSA, to name just a handful. In late 2014, BlackLine’s Finance Controls and Automation Platform – a scalable, unified cloud platform built around Therese’s original solution – was recognised by Gartner as a ‘best of breed’ solution in the newly created category, Enhanced Finance Controls and Automation (EFA). Further to this, BlackLine is also an SAP Gold Partner, Oracle Gold Partner, and participates in the partner programs of NetSuite and several other ERP providers. Despite the success of BlackLine – and the happy customers – it’s clear that there is still a great deal of work to do when it comes to educating finance and accounting departments on the benefits of automation. In 2008, it was estimated that close to 90 per cent of spreadsheets had errors in them. Whilst we don’t have any data to hand that conveys the extent to which this has decreased, we know that those using automation tools report a higher level of job satisfaction and are confident in the accuracy of their data. 
Some recent BlackLine research found that in 2014, around 27 per cent of CFOs were concerned about the accuracy of their financial data. This rose to a whopping 44 per cent in 2015, likely due to increasingly tight regulation, which businesses must comply with. The threat of cyber security breaches has loomed large on the business agenda for the past few years and this is only becoming more prominent. What many of these businesses still don’t realise is that a Modern Finance approach can help them to ensure compliance, will keep their data secure from threats and will also enable them to spot potential errors earlier on. [easy-tweet tweet="It’s clear that cloud-based process automation, particularly within the finance industry, is the future" hashtags="FinTech"] It’s clear that cloud-based process automation, particularly within the finance industry, is the future. The next step is to ensure that businesses of all sizes are better educated about the benefits, learning from those who have successfully embraced a new era – the era of Modern Finance. ### FinTech & International Payments - #CloudTalks with Currency Cloud’s John Hammond In this episode of #CloudTalks, Compare the Cloud speaks with John Hammond from Currency Cloud about how their FinTech solutions reimagine how money moves through the digital economy. #CloudTalks Established in early 2012, Currency Cloud is a FinTech company in the heart of London. Their mission is to power next-generation enterprises with a transparent, fast, easy-to-use and secure payments engine that will transform the way businesses move money around the world. Today, they work with over 125 platform customers and their service has reached more than 150,000 end-customers. They now process in excess of $10 billion in payments every year, across more than 40 currencies in 212 countries. ### What the European General Data Protection Regulations (GDPR) mean for your business Following the approval of the new General Data Protection Regulation (GDPR), businesses must be prepared for a new set of standards surrounding data processing. Although there remains a two-year lead in period, organisations must begin work now if they are to meet the regulations. Crucially, businesses must be able to distinguish between fact and fiction when it comes to deciphering what impact the GDPR is likely to have going forward. Making sense of the mixture of speculation, misunderstanding and erroneous interpretations could mean the difference between success and failure when it comes to complying with the ruling. [easy-tweet tweet="Despite the two-year lead in period, organisations must begin now if they are to meet new #GDPR standards"] Lisa Dargan, Business Development Director for Ultima Risk Management, has taken a look at some of the GDPR myths already circulating and what impact the legislation will really have on your business. Myth 1 - You must appoint a qualified, independent Data Protection Officer (DPO) Not true. It was strongly suggested that the GDPR would require every organisation with more than 250 employees, or processing in excess of 5,000 personal data records, to appoint a Data Protection Officer. However, this proposal was removed from the legislation at the drafting stage. Instead, Section 4 of the GDPR states that DPOs are required if you are: a public body a private sector controller whose core activities involve ‘regular and systematic monitoring of data subjects on a large scale.’ (Notice that what constitutes “large” is open to interpretation.) 
a private sector controller whose core activities involve the processing of special categories of personal data (i.e. sensitive information). Businesses should also be aware of the importance of an independent DPO. They can still be an existing employee, but they must have an independent reporting line, and directly report to the Board without interference. They should also have a thorough and up-to-date understanding of data protection law if your business is to meet compliance standards. Myth 2 - I am an SME, so the GDPR doesn’t apply to me Not true. The GDPR applies to all businesses ‘engaged in economic activities’ that involve the processing of personal data. Although there are some exemptions for micro and small businesses when it comes to record keeping, SMEs still need to be aware of the new ruling. Smaller firms may be working with large customers and so will need to ensure that the relevant data is managed appropriately. Myth 3 - I’m acting as a data processor – my customers, as the data controllers, can deal with the difficult stuff Not true. Over the next two years, data controllers will need to review all of their supplier contracts to ensure that they meet the new regulations, but data processors also have direct responsibilities under GDPR, including a requirement that they (or their representatives) maintain a record of processing activities including:
- The name and contact details of the processor or processors, or where applicable, the processor’s representative
- The name and contact details of each controller (or the representative) the processor is acting for, and their data protection officer
- The categories of processing carried out on behalf of each controller
- Transfers of personal data to a third country or an international organisation, including the identification of that third country or international organisation and the documentation of appropriate safeguards (e.g. contractual clauses within inter-company data transfer and sharing agreements based on risk assessments etc.)
- Where possible, a general description of the technical and organisational security measures the recipient of the transfer has implemented
The records need to be in writing, including in electronic form, and made available to a supervisory authority on request. Myth 4 - I encrypt my personal data so there’s no way I’ll get fined Not true. Security is important, but fines can be issued for failure to meet data controller/processor obligations, as well as security breaches. Regulators can impose penalties of between two and four per cent of annual turnover, depending on the severity of the infringement. 
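To put that range into concrete terms, the arithmetic is straightforward; the turnover figure below is invented purely for illustration and is not drawn from the article.

```python
# Purely illustrative: the exposure band implied by a fine of 2-4 per cent of
# annual turnover, using a made-up turnover figure.
annual_turnover = 50_000_000  # a hypothetical £50m-turnover business

low, high = annual_turnover * 0.02, annual_turnover * 0.04
print(f"Potential fine: £{low:,.0f} to £{high:,.0f}")
# Potential fine: £1,000,000 to £2,000,000
```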
Some considerations taken into account before issuing a fine include:
- The nature, gravity and duration of the infringement
- The purpose of the processing concerned
- The number of data subjects affected
- The level of damage suffered by data subjects (including infringement of their rights)
- Whether the infringement was intentional or negligent
- Any action taken by the controller or processor to mitigate the damage suffered by data subjects
- The degree of responsibility of the controller or processor, taking into account technical and organisational measures implemented
- Any relevant previous infringements
- The degree of cooperation with the supervisory authority in order to remedy the infringement and mitigate the possible adverse effects
- The categories of personal data affected by the infringement
- The manner in which the infringement became known to the supervisory authority, in particular whether, and if so to what extent, they were notified
- Whether any previous measures ordered against the controller or processor relating to the same subject-matter were complied with
- Whether approved codes of conduct or approved certification mechanisms were in place
- Any other aggravating or mitigating factors such as financial benefits gained, or losses avoided, as a result of the infringement
Encryption will not solve all your problems. You will also need to consider ‘organisational and technical’ measures, not just in relation to security management and data protection, but potentially in terms of documented privacy impact assessments. These are now mandatory where new processing operations are likely to result in a high risk to the rights and freedoms of data subjects. Businesses should also ensure that they have a thorough governance and compliance regime in place in order to ensure their accountability obligations are met. In response to the GDPR ruling, data processors and controllers need to think ahead and prepare for the coming impacts on their IT infrastructure. Is your business able to:
- Identify where personal data is stored, processed and transmitted by utilising data discovery and data audit tools?
- Record how consent for processing personal data was obtained, who it was obtained from, who it has been shared with, whether it has been changed, its accuracy disputed and approval for disclosure under data sharing agreements (internal, external and inter-company)?
- Do your applications/systems developers understand the GDPR implications?
- Are you preparing to perform documented privacy impact assessments and criteria for prior consultation with data protection authorities as part of your compliance regime?
- Are your applications/systems able to support the GDPR data deletion requirements?
- Are you planning application changes to support the new rights of data subjects to receive copies of their personal information in common (interoperable) electronic format and/or forward that data to another entity (portability)?
- Are you proactively talking to your software suppliers, service providers and data processors? Have you identified them, and are you planning contract reviews?
- Are you a data processor or software solutions provider?
- Will your incident management and investigation procedures enable compliance with data breach notification obligations, to notify supervisory authorities where necessary within 72 hours?
- Are you considering what, how and when you may need to notify data subjects that a breach has occurred and what assistance you will provide them?
- How will you review online privacy information notices and achieve online consent?
- How will online consent translate into recording that consent, and how will subsequent withdrawal of consent trigger potential data erasure?
- How will the data erasure/portability requirements impact your current data retention and archiving arrangements?
- What resources and support will you need for your GDPR reform project?
Myth 5 - The GDPR will not be relevant if we leave the EU, so businesses should wait before acting This is not an advisable approach to take. If the UK remains in the European Union, the GDPR will supersede the UK Data Protection Act, but if we leave, the complex withdrawal process could mean that the UK is forced to implement similar legislation in order to comply with the EU rules. The free flow of information will remain vital to the success of UK businesses whether they are based in or out of the EU, meaning that organisations are better off complying now, before it’s too late. [easy-tweet tweet="The free flow of information will remain vital to UK businesses whether they are in or out of the EU" hashtags="EUref, GDPR"] For further guidance on how to comply with the GDPR, the Information Commissioner’s Office has published guidelines via its new micro-site: https://dpreform.org.uk/. ### Dealing with ransomware: Why data protection needs a more holistic approach Ransomware refers to a strain of malware that attacks computers, encrypts the files on them and then demands payment to unlock them. Having been a rare form of attack only a year or so ago, ransomware now hits thousands of organisations every day. The attacks themselves are indiscriminate, hitting public sector bodies like police forces, hospitals and councils, as well as private companies and individuals. [easy-tweet tweet="The confluence of IT industry trends has made backup more challenging" hashtags="Ransomware"] Ransomware is also spreading beyond the traditional Windows PC to target Linux and Mac machines, as well as mobile phones running versions of Android. As more and more potential targets for ransomware are created, it’s important that all organisations look at their approach to data protection in more detail. Step 1 – Education The most common route for all malware attacks into organisations is still email, with attacks disguised either as a link or an attachment. This is not a sophisticated approach, but it is still successful. Rather than the mistake-ridden missives of the past, better design and grammar in the emails make them harder to spot. At the same time, workforces within companies are getting more mobile, taking them increasingly outside the perimeter security implementations that can help to stop attacks getting through. Use of company assets outside the business – or employees using their own devices for work purposes – can exacerbate this risk further. Education here can help. Users can and should be trained to spot attacks on them, whether these are the latest phishing attacks based on social engineering or attempts designed to get payloads opened. Encouraging users to manage their work and not get rushed into opening potentially suspect emails – even when they have the appropriate name or branding on them – can help prevent some of these issues in the first place. Step 2 – Backup and data protection across all devices, not just some While education can help, it’s not the only answer. 
It relies on every user being 100 per cent vigilant all the time, and leaves no room for human error. Alongside keeping staff up to date on problems, it’s therefore important to look at data backup. Backup is one of those tasks that is often looked at centrally by IT. However, the confluence of IT industry trends has made backup more challenging. The growth of mobile working, the use of multiple devices and more deployments of cloud applications all have an impact on backup strategies. Protecting data across all assets – not just those held centrally – is vital when it comes to defeating ransomware, so backup has to move beyond being something done only for centrally-held IT. As ransomware can strike at almost any IT asset – from a phone, laptop or tablet through to the files held centrally – backup has to cover each and every device equally. Alongside this, holding multiple versions of each file is required too, just in case files were infected some time before the attack was spotted. Protecting data on each device does mean thinking about how to get data off those IT assets. Mobile or remote workers may not come into the office for regular imaging of their devices, while relying on users to protect data themselves runs the risk of steps not being completed. Instead, client backup should be as unobtrusive as possible. Cloud-based approaches can help here, as data on devices can be protected regardless of location or device type. By sending updates of data securely at regular intervals, a history of files can be created. If a ransomware attack does make it through, then restoring the files can be done by going back to a “known good” version (a simple sketch of this idea follows at the end of this piece). Any approach here should also bear in mind how sensitive the data that users create is. Anything containing personally identifiable information (PII) should automatically get detected and secured with encryption, regardless of the backup strategy that is put in place. If a device does get attacked, either by ransomware or by more traditional malware that looks to steal data, then these files should be protected automatically. Step 3 – Stopping vulnerabilities faster Without backup in place, it’s highly unlikely that a recovery from a ransomware attack will be successful. Some attacks have been defeated because their encryption was poorly implemented, but taking advantage of this requires significant technical expertise. Without education, it’s harder to stop employees from succumbing to attacks. However, it’s also worth linking up with the IT security team to prevent known vulnerabilities from being exploited by ransomware. Putting updates in place as soon as they are released can help prevent some of the most common forms of ransomware from being successful, as the exploits they are associated with have often already been fixed by the software providers. This approach won’t help if an attack does get through, but it does help to reduce the likelihood of a call from an end-user who has a problem. [easy-tweet tweet="The only way to ensure protection against #ransomware is through a comprehensive data protection scheme"] Ransomware has become a prevalent threat for everyone. However, no matter how educated employees are and how up to date a company’s IT assets may be, there is still a great chance of being infected by ransomware. The only way to ensure protection against ransomware is through a comprehensive data protection scheme. Preventing the problem through effective data backup is the best cure. 
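As a rough illustration of the versioning and PII points above, the toy sketch below keeps a timestamped copy of each file so that an earlier, “known good” version can always be recovered, and flags anything that looks like it contains personal data so it can be encrypted before it leaves the device. It is an assumption-laden illustration rather than a real backup client: the directory layout, the email-only PII pattern and the restore logic are all invented for the example.

```python
import re
import shutil
import time
from pathlib import Path

BACKUP_ROOT = Path("backup_store")  # assumed local stand-in for a cloud backup target
EMAIL_PATTERN = re.compile(r"[\w.+-]+@[\w-]+\.[\w.]+")  # crude PII check: emails only

def looks_like_pii(path: Path) -> bool:
    """Very rough check for personally identifiable information (here, just email addresses)."""
    try:
        return bool(EMAIL_PATTERN.search(path.read_text(errors="ignore")))
    except OSError:
        return False

def back_up(path: Path) -> Path:
    """Keep a timestamped copy so earlier, clean versions survive a later infection."""
    dest_dir = BACKUP_ROOT / path.name
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / f"{int(time.time())}{path.suffix}"
    shutil.copy2(path, dest)
    if looks_like_pii(path):
        print(f"{path} appears to contain PII - encrypt this copy before it leaves the device")
    return dest

def restore_known_good(name: str, infected_at: float) -> Path:
    """Pick the newest copy taken before the suspected time of infection."""
    copies = sorted((BACKUP_ROOT / name).iterdir(),
                    key=lambda c: int(c.name.split(".")[0]))
    clean = [c for c in copies if int(c.name.split(".")[0]) < infected_at]
    if not clean:
        raise FileNotFoundError("no clean version available")
    return clean[-1]
```

Regular, unobtrusive runs of something like `back_up()` build exactly the file history the article describes, and `restore_known_good()` is the roll-back step once the time of infection is known; a production client would add encryption, deduplication and transport to the cloud target.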
### Encryption and Cloud Security - #CloudTalks with WinMagic's James LaPalme LIVE from Cloud Expo In this episode of #CloudTalks, Compare the Cloud speaks with James LaPalme from WinMagic about the importance of encryption and other security features when moving to the cloud. After reviewing the many interviews filmed at Cloud Expo Europe 2016, James LaPalme was awarded the Cloud Talks trophy for the best interview of the day. #CloudTalks WinMagic provides application aware Intelligent Key Management for everything encryption, with robust, manageable and easy-to-use data security solutions. Although security used to be thought of as the cloud's weak link, modern IT solutions let businesses embrace new technologies without worrying about the safety of their corporate data. ### Why performance versus promise matters in the cloud world Industry analysts project that global software-as-a-service (SaaS) spending will continue to grow, from $49 billion in 2015 to $67 billion in 2018. IDC research supports this, suggesting that SaaS revenues are growing nearly five times faster than traditional software products. With this in mind, cloud really is now a reality for businesses today. And, as more companies come to trust and rely upon the cloud, many are increasingly focusing their attention on requirements such as service availability and performance. [easy-tweet tweet="Trust helps inspire customer satisfaction, which is the critical metric for success in the #cloud world"] After all, modern enterprises rely on the cloud for their most critical business systems, so it’s absolutely imperative that these applications are available and can be scaled up or down as needed. This reliability builds trust, and with that trust comes a relationship between vendor and customer that can develop into a long-term engagement. Equally important, from a vendor perspective, is that trust helps inspire customer satisfaction, which is the critical metric for success in the cloud world. As more business applications move to a multi-tenant cloud deployment model, software vendors are being challenged to satisfy their entire customer base with a single service-level agreement (SLA). Consumers have become accustomed to the user experience provided by companies such as Facebook, Twitter, and LinkedIn—services that rarely have any downtime longer than a browser refresh. Although some service interruptions are necessary and even mandatory—such as for major version updates—the expectation exists that they should be brief and unobtrusive. As well as having seamless availability, customers also want their applications to stay up-to-date and historically this meant systems had to be shut down by vendors during updates. The headaches associated with major upgrades of on-premise systems can leave customers months out-of-date in terms of maintenance updates, and with some vendors, there is a patchwork of service packs and bolt-on products to ensure users have the latest features. Extended downtime and an almost perpetual state of running yesterday’s version isn’t a recipe for keeping customers happy. It’s at this junction of availability, reliability, and running the latest system that the journey to zero downtime really becomes important. Service availability in today’s connected, always-on world is a whole different ball game to the legacy client/server world. In the cloud computing era, the responsibility for uptime and availability falls on the supplier’s shoulders. So what should a best practice approach look like? 
A single, workable SLA is highly desirable, and that approach should mesh with the way companies think about the design and delivery of the product – with an ultimate aim of achieving one code line, one version, and one customer community. Customers should have the opportunity to preview new releases before they go live, so they benefit from testing and planning that takes place in this period. This brave new world is very different to the on-premise ERP upgrade experience, which normally requires months of planning and weeks of post-rollout fixes. As more businesses shift to the cloud, questions are also being asked about the transparency of SLAs. Do you know the specifics of your SLA? Are there any clauses in your contract that allow unscheduled downtime and still allow the vendor to meet the agreed-upon SLA? Any lack of transparency around SLAs is something customers should avoid. That said, customers shouldn’t be thinking solely about the SLA promise when they look at the cloud, but more their chosen vendor’s past performance and, consequently, how successful they will be as a future partner to the business. [easy-tweet tweet="Vendors must continually earn customer trust if they want to extend the relationship long-term" hashtags="Cloud"] In the cloud world, vendors must continually earn the customer’s trust if they want to extend the relationship long-term, because with a subscription model, the customer truly is in control. For this reason, it really is performance rather than promises that matters for businesses moving to the cloud today. ### Understanding the new era of Internet governance Henry Ford is reported to have observed that “If I had asked people what they wanted, they would have said faster horses”. Short enough to be popular on social media, this homage to leadership is as likely to be used to damn ‘luddite’ dissenters today as at any time in history. [easy-tweet tweet="The domain name system offers new ways of conducting business and of expressing an opinion"] Yet whatever logic the saying has when applied to genuinely revolutionary creations, the idea that leaders have monopolies on the best ideas, and therefore that they can and in some cases should bypass popular opinion, is not without controversy.  This is particularly true where decisions are liable to erode those rights which are so fundamental to our daily lives that we simply take them for granted.  We often see this tension in the development of digital media services, where new opportunities are paid for by revising our social and cultural foundations, or at least our assumptions about the extent of those foundations. Ongoing discussions within ICANN, the organisation that controls the domain name system, are putting a modern face on this value exchange.  With historically low barriers to entry, the domain name system, and in particular the new top-level domains (TLDs) such as <.cloud>, offers new ways of conducting business and of expressing an opinion.  But as the domain name system has grown in size and in importance, it has enabled a volume and diversity of privacy and property infringements that would never be experienced offline. Freedom of expression is a fundamental right, of course; but so too are the rights to respect for private life and to property, including intellectual property.  Can these competing interests be given equal protection, or must one cede to the others?  How, and by whom, should the balance be determined?  
Through the creation of bespoke rights protection mechanisms (RPMs), several TLD operators are taking the initiative to determine where the balance lies within their own domain spaces, but the absence of a global solution to expressive-but-infringing content across all top-level domains means that in practice many stakeholders remain without any remedy or protection against abusive behaviour.  So, what to do? Recent history shows how a global solution could be achieved.  In 1999 the global Internet community created a low cost, accessible and, most importantly, supranational means of tackling cybersquatting and typo-squatting in domain names that also preserved Internet users’ rights to freedom of expression.  As a result of contributions from stakeholders representing all aspects of the domain name system, the community created the Uniform Domain Name Dispute Resolution Policy (“UDRP”), an administrative procedure to which those selling domain names must now adhere in order to comply with their contractual obligations to ICANN. The UDRP has been a huge success.  It provides certainty to rightsholders and registrants alike, and it means that disputes can be resolved without costly litigation and potentially life-changing damages or cost awards being made against a losing party.  A small number of interest groups want ICANN to scale back the UDRP as part of its ongoing RPM review, but those groups have not offered any compelling alternative.  In fact, not only has the global Internet community benefitted from the stability and predictability of the UDRP, but it is appropriate in the context of ICANN’s RPM review to consider whether the model of the UDRP might offer a solution to the issues raised above, and could be extended to address more than just the use of trade marks in domain names.  There is no doubt that a collective process such as that which created the UDRP would allow any proposed solutions to draw on the experience and perspectives of the disparate interest groups within the Internet community. That it would result in a globally-binding solution that would create a level playing field and could not be altered by mere bilateral agreement also favours this approach. As shown by the steps taken by individual registry operators to develop their own RPMs, a new era of Internet governance is coming:  the question that remains is what will it look like?  Should conduct and content be regulated by bespoke solutions developed by commercial service providers, by minimum standards set out by governments on behalf of their citizenry, by detailed rules agreed upon by the global Internet community, or by some other means? [easy-tweet tweet="The creation of bespoke RPMs suggests a new era of Internet governance is coming" hashtags="ICANN"] There are many ways for the community to get involved in this debate, from attending ICANN meetings, to joining a mailing list via http://icann.org, to liaising with technology and disputes lawyers familiar with ICANN. But whatever the method you use to have your say, the time to get involved is now. ### Data Protection & VAR - #CloudTalks with Silverstring's Alistair MacKenzie In this episode of #CloudTalks, Compare the Cloud speaks with Alistair MacKenzie from Silverstring about data protection and making the most of the opportunities presented by cloud computing. #CloudTalks Your business environment uses and produces massive volumes of data, swelling powerfully every day. 
Silverstring helps you manage the challenge of storing it, backing it up, protecting it, and recovering it, across public, private and hybrid cloud platforms. By using the provisioning speed and flexible billing of major cloud providers, Silverstring’s DRaaS:
- Cuts overall costs by 50 per cent, whilst delivering enterprise-grade corporate data protection
- Replicates the reliability of IBM’s TSM solutions into private, public and hybrid cloud environments, through the Predatar app
- Eliminates capital expenditure costs and offers flexible monthly billing options
- Enables you to test and be sure of your business’s recovery capability at any time
### In today’s cloud era, the Chief Customer Officer has never been more vital We’ve been living in a cloud era for some time now and have watched its impact in transforming businesses, but what does it mean for customer service? Higher demands from customers have resulted in a new strategic role within organisations - the Chief Customer Officer. What does that role need to deliver in today’s cloud era to have an impact on company success? [easy-tweet tweet="Higher demands from customers have resulted in a new strategic role: the Chief Customer Officer" hashtags="Cloud"] Cloud and customer expectations The dominance of cloud has heavily impacted customer expectations. As consumers, we have all become used to having information available at our fingertips, whenever we want it, wherever we are and however we want to consume it. Forrester notes that “As customers become more accustomed to digital touchpoints, their expectations increase.” This is especially true for newer generations, now born into a digital world. It’s no different in the business world when serving an ever-growing digital generation. According to Forrester, “Millennials are becoming the largest demographic group in your workforce. Their experience with digital technology shapes their expectations of and behaviour in the workplace. They expect and need high-quality technologies and easily consumable technology services that enable them to succeed.” Forrester states that we are now in the “age of the customer”. It is therefore time to prioritise those that have the most impact on a business - the customer. Customers expect the best service, products and experience, and there is no reason for companies to fail in delivery. Again, like consumers, customers expect the same level of service whenever and wherever they are and in a way they want to receive it. The good news is the technology is there to help, with cloud ERP eliminating friction and errors in customer-facing processes to create satisfying experiences for everyone involved. Introducing the Chief Customer Officer With customer complacency now gone, customer loyalty is much more difficult to achieve. As the cloud era gains more dominance and the customer has become king, it is no surprise to see the rise of a new role - the Chief Customer Officer (CCO). This role is vital in ensuring companies remain competitive. The CCO has to look both internally and externally to make sure the right teams, technology, policies and processes are in place to offer second-to-none customer service, and that it is felt by each and every customer. Customer relationships have also been changing. No longer are they built on one-off transactions. Customer service is now responsible for building real, long-term relationships where every customer touch with a company is easy, enjoyable and valuable. It is only then that customer loyalty can be created.
No-one expects a completely perfect service all of the time, but it is how companies interact in good times and bad, that also helps to formulate a strong relationship and one which may reap benefits in the long-term for both parties as they grow together. A CCO needs to work in tandem with all departments of a business, both back and front office, to ensure a customer has a smooth interaction. Whether they want to buy more, get advice, check status or even complain - each connection should be received with the same level of enthusiasm, professionalism, consistency and accuracy, resulting in strong customer satisfaction. Impact of technology Cloud technology, if implemented with the right strategy, can impact dramatically on how a CCO can succeed. If each and every person in a company is able to see a complete 360 degree customer view, the chances of excellence in customer service delivery are significantly improved. If anyone can view data from sales, finance, HR and services in one true customer record it provides great insight to give more understanding of a customer, and what is needed to serve that customer well. It’s not enough to patch together different cloud apps, as that can still result in siloed information between departments leading to a lack of consistency in how customers are served. To share data across all departments in the easiest way all cloud applications should be on the same platform. Not all CCOs will have direct decision making when it comes to technology, but the most savvy will ensure they have influence and the best possible opportunity to succeed in exemplary customer service delivery. The digital age has increased competition overall. Forrester states that “competition is increasingly based on the strength of your digital experiences and digital business. At the same time, your customers’ expectations are formed by their best experiences in any industry.” Without a strong customer-centric position, a company can sit back as dissatisfaction is communicated publicly and quickly and a customer moves on to another provider. Having a CCO in place can ensure that customer-centricity remains a priority for all. Customer service shift Expectations driven in the cloud era mean that customers expect companies to form relationships with them. For too long many have rallied around to try and problem solve with those customers who are not happy and not receiving the service they believe they have signed up for. A successful CCO will ensure the right touch points are in place to continue to build relationships with satisfied and dissatisfied customers. This reduces attrition, increases retention and greatly improves the likelihood of cross-selling. Not only that, but it can help a company build fantastic references. Third party endorsement is easier for others to view and has more credibility than ever before. [easy-tweet tweet="The #cloud era is set to stay and the CCO role is set to continue its rise"] The cloud era is set to stay, customer demand will continue to increase and the CCO role is set to continue its rise. It’s a busy role, but vital for those companies who wish to win through in a digital age where companies must centre everything they do around the most important party - the customer. ### Friday the 13th: Our Modern Technology Fears Fearing change is nothing new. Human beings are biologically programmed to be wary of the unknown, lest it provides a threat to their existence. 
However, the rate of change taking place over the last century or so is far greater than at any other point in human existence – fuelling a rise in technophobia. [easy-tweet tweet="The rate of change taking place over the last century or so is fuelling a rise in technophobia" hashtags="Friday13th"] Last year’s Survey of American Fears saw technology take second place, losing out narrowly to man-made disasters such as terrorism. In honour of Friday the 13th, we’ve taken a look at some of the creepier technological developments that are driving this anxiety. DNA Hacking Manipulating human DNA has long been the realm of science fiction, but it is fast becoming science fact. Developments in genetics and genomics have given scientists a greater understanding of our personal biology than ever before, and for every dream of a disease-free world there is an alternate nightmare scenario. Could DNA hacking lead to a new wave of personalised biological weapons designed to target specific individuals? It may sound far-fetched, but there are already claims that the US government collects and destroys personal items used by the President to prevent his DNA falling into the wrong hands. Creepier still, WikiLeaks claimed in 2010 that the US was actively collecting biometric information belonging to other world leaders. True or not, there is certainly much potential for good, and bad, to come from our genetic tampering. Government Surveillance Technology is unfairly portrayed as the enemy of personal privacy, with Internet cookies, government metadata collection and GPS trackers giving the authorities access to a great deal of information regarding your whereabouts, actions and motives. Of course, there is also plenty of technology, from encryption to VPNs, designed to shield your behaviour from surveillance. Big Brother may not be watching, but NSA revelations have certainly led many to believe that it is. Tech Warfare War has often brought out the dark side of technology, even if it may ultimately end up saving lives. From destructive machines that fill our skies and seas to the atomic bomb, technology has played a leading role in warfare for many years. Developments continue today through cyber warfare targeting foreign nations’ networks and computers and drone warfare on foreign lands. The latter, in particular, has the potential to increase the sense of detachment surrounding warfare. With military personnel selecting targets from behind a computer screen, taking a life could easily become a cold and disinterested affair. The Uncanny Valley As robotics develops we are starting to see machines that ever more closely resemble humans. Things start getting unsettling, however, when robots look almost human, but are clearly not. Psychologists describe this phenomenon as the “Uncanny Valley,” whereby observers exhibit a feeling of revulsion at the point where a robot’s appearance becomes more human-like. Examples of these creepy automatons are increasingly evident, including the Geminoid-F and Telenoid R1 robots, both designed by Hiroshi Ishiguro. Artificial Intelligence Although many of our modern-day technology fears are being encouraged by a media that feeds on scaremongering and hysteria, the threat of an AI-led disaster has received backing from a number of respected scientists and technology thought-leaders.
An open letter from the Future of Life Institute urged AI researchers to ensure that we reap the benefits of the technology, “while avoiding potential pitfalls.” To date, the letter has been signed by more than 8,600 individuals, including Stephen Hawking, Elon Musk and Steve Wozniak. Professor Hawking has also gone on record multiple times to explain the threat that artificial intelligence poses to humanity. Crucially, once AI matches or surpasses current levels of human intelligence, its behaviour will become difficult to predict. Will it prove benign towards its creators, or, as Professor Hawking warns, “spell the end of the human race?” [easy-tweet tweet="Professor Hawking has often explained the threat that artificial intelligence poses to humanity" hashtags="AI, Friday13th"] It is important to note that many of the technologies listed above are some years from being realised, and even if they are, they are likely to provide more advantages than causes for concern. It is not surprising that change can unnerve our modern-day sensibilities, but it is important that we do not let our fears stand in the way of innovation. ### Mobile Apps & Sporting Success - #CloudTalks with Quickscore's David Geensen In this episode of #CloudTalks, Compare the Cloud chats with David Geensen from Quickscore about how they came up with the idea for their app-controlled Smart Scoreboard. #CloudTalks As well as an improved matchday experience, Quickscore's Smart Scoreboard also provides a great opportunity for clubs, schools and universities to generate advertising/sponsorship revenue and an excellent opportunity to maximise ROI. ### Three magic words for cloud transformation: “Open, Hybrid & Integrated” At Huawei Cloud Congress Europe 2016 there was a raft of interesting talks, ranging from virtual reality to supercomputers. Whatever technology was being discussed, however, the conversation inevitably came back round to cloud computing and the challenges and opportunities it provides for businesses. [easy-tweet tweet="Huawei outlined their three magic words for cloud transformation: Open, Hybrid and Integration"] It was during one of these talks that Krzysztof Celmer, Senior IT Solution Consultant at Huawei, outlined his “three magic words” for cloud transformation: “Open, Hybrid and Integration.” Below, I’ve taken a look at why these three principles are key for businesses of all sizes and across all industries. Open It’s not just Huawei that is singing the praises of open cloud environments. The likes of IBM, Dell and Rackspace have all pledged to support open cloud standards. It’s easy to see how open cloud platforms benefit the end-user by giving them more choice and greater interoperability, but vendors also stand to benefit. By opening themselves up to greater collaboration, cloud vendors gain access to the vast resources provided by open source communities. Open standards have helped to drive innovation, opening up platforms to new developers, new applications and, ultimately, new ideas. It’s all well and good having a closed ecosystem if you can convince enough customers that all of your cloud offerings are best in class, but in reality this is difficult for a single vendor to achieve. Open standards turn competitors into collaborators and create a win-win ecosystem for customers and businesses alike. Hybrid Certain applications may be better suited to a public cloud environment and others private.
With this in mind, many businesses are moving to a hybrid cloud approach in order to grasp the best of both worlds. Huawei’s FusionCloud offering makes use of OpenStack cascading technology to connect public and private environments. Businesses must have the utmost confidence in both in order for their hybrid deployments to be successful. Hybrid cloud recognises that combining the flexibility and scalability of public cloud with the security of private cloud represents the ideal infrastructure for many organisations. Increasingly, this means blending platforms from multiple vendors, which brings us to the third magic word. Integration As the cloud market opens itself up to a wider range of vendors and niche applications, the integration challenge becomes more pressing. What’s more, businesses must ensure that cloud platforms not only work together, but also with their existing on-premise technology. In fact, a recent survey indicated that 81 per cent of respondents think that cloud applications must be fully integrated with each other as well as with on-premises software in order to experience the full rewards of cloud computing. The integration challenge is also multi-faceted. Businesses must be aware of where their data sets are located in order to meet compliance standards and to generate vital insights. API limitations may also scupper your integration plans, and businesses must make sure that they have the expertise to overcome this hurdle. Fortunately, many cloud vendors are now making integration a priority, particularly as competition between vendors intensifies. Some vendors are even offering Integration Platform as a Service (iPaaS) as a means of delivering additional value to their customers. [easy-tweet tweet="Certain applications may be better suited to a public #cloud environment and others private" hashtags="Hybrid"] Although there’s no magic formula for cloud computing success, keeping in mind the three key principles of “open, hybrid and integration” will certainly help you when choosing your cloud vendors. ### Enterprise IT 2016 31 May - 3 June 2016, Marina Bay Sands, Singapore Enterprise IT 2016 is Asia’s LARGEST integrated sourcing platform for IT professionals across key industries.
- Network with more than 1,200 exhibitors from around the world
- Leading launch pad in Asia; be the first to see and understand how these new solutions can transform your industry
- Source from the latest cutting-edge technologies such as IoT and Smart City solutions
- Witness first-hand as exhibitors present their latest launches at the Xperience Zone or experience live demonstrations at the Hospitality Suites
- Connect with new suppliers and reinforce ties with business partners to enhance your business
- Share your views on global trends with peers from around the region
- Learn & gain insights from the experts in the industry at the CommunicAsia2016 Summit as they speak about disruptive digital trends.
View our conference programme here. ### SuiteWorld 2016 May 16-19 2016, San Jose Convention Center, San Jose, CA SuiteWorld is the #1 ERP cloud event of the year! SuiteWorld 2016 is planned entirely around making you and your business even more successful with NetSuite. Choose from more than 150 breakout sessions, enjoy numerous networking opportunities and explore the SuiteWorld Expo, where you'll hear about new ideas and offerings from our developer partners, solution partners, system integrators and more.
You will learn:
- How to most effectively implement and use the latest NetSuite features
- What's planned for future NetSuite releases
- Useful insights and tips from other NetSuite customers
Register for the event here. ### Cricket, IoT and Mobile Apps - #CloudTalks with Quickscore's Abdul Mohamed In this episode of #CloudTalks, Compare the Cloud chats with Abdul Mohamed from Quickscore about how their Smart Scoreboard is bringing innovation to cricket. #CloudTalks Quickscore brings next-generation scoreboard technology to club cricket. Displaying in-depth individual stats, match stats, match highlights and milestones, the technology not only promises to improve the match experience, but also to deliver digital advertising opportunities for clubs, schools & universities. ### London may be a tech powerhouse, but it’s not best for business! Location, location, location; geography has always been an important factor for businesses; indeed, choosing the right spot can be the difference between success and failure. The right environment can place organisations in the best possible position to serve their customers, expand their reach, add to the bottom line, or literally ensure they are in the right place at the right time! [easy-tweet tweet="Location, location, location; geography has always been an important factor for businesses"] There are a number of factors to consider when determining what makes a city ‘business ready’, and our recent study, analysing a range of government and third-party data to rank 10 major UK cities, threw up a number of surprises. For a start, Cambridge was named ‘best UK city for business’, closely followed by Oxford and Brighton. Is the Capital losing its grip? It may come as a shock that London, long hailed as the epicentre of business activity, failed to make it into the overall top three. However, when delving deeper into the research, the reasons become painfully obvious. Who, for example, is surprised to hear that London office rent prices are the highest in the country? Or that its employment rate is low in comparison to other cities? Yet, despite being hailed as a hub for new businesses, the UK capital could only manage ninth place when it came to start-up survival rate. A combination of the aforementioned factors, plus the high concentration of already established businesses, is clearly hampering start-up survival chances. Rise of the ‘Northern Powerhouse’ While southern cities still dominate the overall rankings list, the findings highlight a number of areas in which the North is flourishing. For example, York came out on top for start-up survival rate, helped by schemes such as the Whyte Knight Fund, which gives organisations the financial help and support that they need to establish themselves in the market. With so many start-ups vying for business, it could be argued that the North inspires and helps nurture the sort of entrepreneurial spirit that finds it difficult to survive in the capital. Flying the flag still further for the ‘Northern Powerhouse’, Leeds was also placed highly, ranking number one when it comes to access to graduate talent. With this thriving start-up culture in the North, cheaper rent and lower crime rates, the region is well placed to attract skilled workers. And, in an environment where access to the right skills is still a challenge, the North is putting itself at an advantage.
For tech, London remains ‘Top Dog’ Despite the great start-up culture, both Leeds and York subsequently took the bottom two places when it came to average internet download speed. Having access to a fast and reliable internet connection is extremely important to all businesses – especially now that every organisation is expected to be ‘digital by default’. Although clearly not holding these cities back, this could be an issue that will need to be addressed if the North wishes to fully establish itself as a tech hub, or to compare with more favourably endowed regions of the UK. Both Northern cities also ranked low when it came to their ability to combat cyber security issues. It is in this space that London comes into its own, with the capital emerging as the most resilient location when it came to combating cyber security breaches. With Silicon Roundabout hosting almost eight times as many tech firms as anywhere else in the UK, it seems that London is still making good use of its technology talent. [easy-tweet tweet="With the complex IT landscape facing UK businesses today, there is no such thing as the perfect location"] Ultimately, it seems that there is no such thing as the perfect location for business, with the findings showing room for improvement and investment in some areas for every region in the UK. With the complex IT landscape facing UK businesses today, it is unrealistic to expect one region to excel on all fronts. In order to succeed, businesses should seek to work with experienced partners to help them identify and plug any gaps they may have, no matter the city in which they might be based. ### The ransomware problem: Why SMBs are vulnerable to attack In recent years, ransomware has become a popular method for hackers looking to extort money from small and medium-sized businesses (SMBs). The concept of ransomware, however, is not a new one, and has been creating problems for small businesses since 1989, starting with the ‘AIDS Trojan’. Distributed via a floppy disc, the ransomware mimicked a software expiry notice, requiring users to pay a ransom by post so files could be decrypted. This ransomware, however, was considered easily breakable due to an over-reliance on symmetric cryptography, along with a less than perfect distribution method, and passed without significant damage. [easy-tweet tweet="In recent years, #ransomware has become a popular method for hackers looking to extort money from SMBs"] In today’s business landscape, we are witnessing a ransomware gold rush. This has been brought on by a combination of technological progression and the greater proliferation of ready-made ransomware packages available to scammers through the Darknet. SMBs are a prime target for hackers due to the high rate of return of successful scams, alongside the relative ease of infiltration. Also, larger businesses often place greater emphasis on investing in security compared to SMBs, making them a more difficult and time-consuming target. Sitting ducks: why SMBs are least prepared for a cyber-attack Ransomware is a high-margin scam – especially when it targets smaller, less secure businesses. Contrary to popular belief, this type of scam is neither difficult, nor does it require a great deal of intelligence from the attacker. Another problem businesses are facing is that, due to the low cost of producing this type of attack, a ransomware campaign only needs a low conversion rate to be considered a success.
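The arithmetic behind that last point is simple, and a deliberately simplified sketch makes it clear why the economics favour the attacker (every figure below is an invented assumption for illustration, not data from any real campaign):

```python
# Purely illustrative arithmetic: why a mass ransomware campaign can be
# 'successful' with a very low conversion rate. All numbers are invented.

def expected_return(targets: int, conversion_rate: float,
                    average_ransom: float, campaign_cost: float) -> float:
    """Expected profit from a campaign under the stated assumptions."""
    paying_victims = targets * conversion_rate
    return paying_victims * average_ransom - campaign_cost

# Hypothetical scenario: 50,000 phishing emails, one in a thousand recipients
# infected and paying a £500 ransom, against a £2,000 outlay for a ready-made kit.
profit = expected_return(targets=50_000, conversion_rate=0.001,
                         average_ransom=500.0, campaign_cost=2_000.0)
print(f"Illustrative expected profit: £{profit:,.0f}")  # £23,000
```

Even if the real figures differ wildly, the shape of the calculation stays the same: low fixed costs mean that a fraction of a per cent of victims paying up is enough to make the campaign worthwhile, which is why volume attacks on less-protected SMBs remain so common.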
In comparison, focussing resources on attacking a single large company can often yield no results. In a recent survey of UK businesses, however, over one third of those had suffered a ransomware attack, with 31 per cent admitting they would rather pay the ransom instead of losing vital data. The problem with this approach is that there is no guarantee that a business will ever receive the decryption key, due to the command and control server potentially being under investigation from a security vendor or law enforcement. Consequently, an organisation could pay a large sum of money to retrieve its data, and receive nothing in return. Creating awareness: educating SMBs on the ransomware threat SMBs are the most lucrative target for ransomware attacks as they usually possess more significant financial resources than standard users, while rarely undertaking the comprehensive security policies of larger companies. Some companies make hackers’ jobs simpler by posting company email addresses online. While this is a minimal risk with modern security solutions and continuous data protection policies, a large number of SMBs do not take advantage of the security available to them. Bitdefender recommends business users and IT administrators should set up regular offline, off-site backups to critical data to prevent malware from finding the network connected storage, and encrypt this data. Deploying a company-wide security solution is also recommended, as this will help spot malicious payloads landing via drive-by attacks or spear-phishing attempts. IT administrators are also encouraged to set up access control lists and restrict user permissions on endpoints to ensure employees don’t accidentally install suspicious or rogue software. [easy-tweet tweet="SMBs are a prime target for hackers due to the high rate of return of successful scams & the relative ease of infiltration"] We predict 2016 will be the year of ransomware, and that the number of victims will significantly increase across the board. The increased ransomware detections we have observed not only suggest that it has become a highly lucrative business, but also that malware developers will soon begin exploiting new platforms, as seen with Linux. As malware developers broaden their perspectives by targeting operating systems that have a large market share, the chances of infection increase exponentially, making a security solution that can stay abreast of a constantly shifting threat landscape indispensable. ### Business Analytics & Data Discovery - #CloudTalks with Zizo's Peter Ruffley In this episode of #CloudTalks, Compare the Cloud speaks with Peter Ruffley from Zizo about how their analytics as a service offering can help generate Big Data insights. #CloudTalks Zizo offers a unique approach to Big Data analytics by delivering cloud analytics and data science as a service through their patented technology. With the ability to deliver solutions in days via their pattern database delivered as a service, they enable business users to discover new insights to improve performance, revenues and customer experience, no matter what size of business, or scale of data; all for one low, monthly cost. With a focus on delivering business value, Zizo serves customers across many sectors, including Media, Retail, Insurance, Logistics and FMCG. ### Man (Dis)connected – Philip Zimbardo – What it Means to be Male in a Connected World No stranger to controversy – Psychologist and author Prof. 
Philip Zimbardo rocked the academic world in 1971 with his Stanford Prison Experiment. His research continued with such celebrated works as The Lucifer Effect, studies of time, heroism and social intensity syndrome. [easy-tweet tweet="In Man (Dis)connected, Zimbardo explores the role technology is having on men socially and sexually"] In his latest book, Man (Dis)connected, Zimbardo explores the role technology is having on men academically, socially and sexually. In this #Disruptive Voices interview we caught up with Dr. Zimbardo during his book tour of the United Kingdom and asked him the effects the connected world is having on young men. He discusses internet, computer games & pornography addictions and how this is affecting a new generation of males. Video Transcript: It’s too easy for young people to live in a virtual world and give up the social world, the real world. Hi, I’m Philip Zimbardo – I am the founder and President of the Heroic Imagination Project and when I’m not talking about good and evil I’m talking about the impact of technology on young men everywhere in the world which I illustrate in my book “Man (Dis)connected’ and there’s an American version coming out next month called “Man Interrupted” which I did with Nikita Coulombe, my young co-author. The problem is we now live in a world of technology. We have our cell phones, people walking down the street in America with a Starbucks coffee and a cell phone plugged into their ears. You see bunches of kids, teenagers, sitting around – no-one’s talking to each other, they’re all on their cell phone or they’re watching a YouTube video. So technology has taken over. The danger is getting addicted to it. That it’s just too easy for young people to live in a virtual world and give up the social world – the real world. Because it provides so much entertainment – with YouTube videos, everybody has endless Google searches and now, of course, the big media – Amazon – any movie you want, any song, you can have all the music in the world at your fingertips. And again, endless movies. And so I appreciate the value of that. The alarm that I’m sounding in this book is for men who’ve grown up in the internet generation, maybe up to age 29 now, meaning when they were little kids they were accessing the internet. For those young men the internet has become too important. I mean important enough so that it replaces much of the real life connected with real people. So the two things that I focus on are video games and online free pornography – and now they’re almost like, for me, a twin evil. There’s nothing wrong with either of them – the problem is when you do it either in excess and when you do them in social isolation – for me that’s the twin killer. You’re all alone, five, six, eight, ten hours every day, every night – in your room playing video games, watching pornography and you’re not interacting, you’re not socialising. So for many young men they never learn social communication, they don’t know how to carry on a conversation. We know if you programmed your computer or a robot to carry on a conversation – there’s a thousand algorithms, a thousand rules they have to learn. And you only learn how to talk to people and feel comfortable doing it by observing others and by practising and getting feedback. They don’t observe others, they don’t practise and so they feel awkward – and when you feel awkward you avoid it. So I’m arguing it’s a new kind of shyness. The old shyness, that I had studied for many years, was a fear of rejection. 
Now it’s not knowing what to do – so it’s not fear of rejection – it’s almost as if you’re in a foreign land and you don’t know the language and you don’t know how to ask where the loo is. So it’s really that. Now, we do know that there is lots of research that says playing video games in moderation can have positive cognitive effects – building up sense of mastery, hand-eye co-ordination. So it’s really when does a moderate amount become excessive? Now, some people argue it’s really not the number, it’s not like “gee, when you do ten hours”, it really becomes addictive when you prefer to do that over anything else. You would rather be playing video games than hanging out with your friends, than being in school, than playing sports, than being out in nature. So it becomes all consuming and when you’re not playing video games, you know, when you’re at a family dinner, when you’re in class – you wish you were there. Now again, video games are designed by men for men and for boys. So, it’s obviously… it is possible to have more socially redeeming games – I’m not even talking about violent games – I’m not saying, you know, video games are bad because they’re violent – I’m saying set that aside. Games can be designed that are socially redeeming, games can be designed that promote good causes like conservation, sustainability and games could be designed where there are two or more people physically present. Like you’re a pilot and this is your navigator – your co-pilot. And so that’s what I’ve been pushing for. But the games are designed by men for men, so male values of domination, of aggression, and it turns out the same is true for pornography. Pornography - both of these are multi-billion dollar businesses, so it’s not like other things in past years we said “it’s a fad” – teenagers go through it – no, these are multi-million dollar businesses that are only going to get more fascinating, more involving, more engaging. And again, video pornography is designed by men for men. Now, I’m even setting aside the objectification of women. But what it means is that you press a button, or you type in a few words ‘www dot porn’ and there’s a world of pornography – there are things that you’ve never even imagined were possible – but also it’s a strange world because everybody is beautiful. The women all look like models. The men are physically… studs, they have enormous penises that they perform for twenty or thirty minutes on end. And what you don’t realise – this is all edited. If they have an orgasm too soon they edit it out, they start all over again. But the point is, you’re a young man who hasn’t had sex yet and the more pornography you watch the more it becomes the social norm. ‘This is what women want, this is what women expect, this is what sex is and this is what’s expected of you’. And the problem is those expectations are incredibly high. Namely, you see somebody like you, only it’s a man performing incredibly, I mean with an enormous penis, non-stop, sixteen positions in twenty minutes and suddenly you feel inferior – inadequate. So that’s my concern. Now, the other thing is because the pornography is designed by men, really for men, it takes all the romance out of sex. No-one kisses, no-one touches gently. No-one communicates, there’s no words, nobody talks. There’s no negotiation of boundaries – no woman says ‘this is okay, this is not okay’. 
So, again, the idea is there’s nothing wrong with video games, there’s nothing wrong with pornography – it’s when either of them is used in excess, and when either of them is done in complete social isolation. And when they become addictive, in the sense that the young man, the young boy, would rather be doing… playing video games, watching pornography, than anything else in the universe. Yeah, I mean technology replaces people. In the automobile industry in most countries there’s no workers – there’s robots. When you go to the supermarket now you can check out yourself – so that replaces supermarket check-out people. So it increases profitability by replacing people with machines. [easy-tweet tweet="There’s nothing wrong with video games or pornography – it’s when they are used in excess" user="PhilZimbardo"] But, as I’m a social psychologist, I believe people need people. People have to want to be with other people. And when you get to a point where you say ‘you know virtually all of my needs can be satisfied by technology’ and now the newest technology – Virtual Reality – where you wear the goggles – and the world is in your head. This article was originally posted at DisruptedTech.tv
Today, it is simply no longer a matter of plugging in ‘dumb’ machines, but of fully integrating smart technology with a pre-programmed knowledge of the end-to-end processes and best practice in place. Repeatable and regular processes can be streamlined with robotic process automation. The finance function, in particular, is poised to benefit here. Think about it – many of the most complex and high-pressured activities within a company are financial ones. Even in today’s digital economy, many of these time-intensive and high-volume tasks rely too much on manual effort. From invoice processing, accruals and reporting to close, many accounting and finance teams across the country have to regularly overcome the headache of spreadsheets and manual data input to close the books accurately and on time. Furthermore, the pressure to deliver a 100 per cent accurate close can result in regular overtime and the creation of a somewhat fraught atmosphere. This isn’t just a question of manual effort. At many points during the financial close process, the entire close needs to be suspended while it waits for the successful completion of one step before it can continue. To manage this, many businesses may be using task lists and spreadsheets, which invites the possibility of human error. More often than not, this waiting takes place because some steps need a level of manual validation before the process can be marked complete. These ‘pause’ instances aren’t just inconvenient – they conceal undocumented processes and hide unnecessary manual intervention points. Error management is another considerable cause of time losses. Consider all of that time spent filtering through sloppy data to manually reconcile errors – this time could be better spent if it were redirected to allow teams to work on more strategic and valuable analysis of the data. Time to implement robotics A patchwork approach to robotics is not an effective solution. There is little merit in replicating one task at one point in a process, and then another in a different place, if these systems are not integrated with one another. A holistic approach to robotics is required to progress thinking from a simple ‘user-centric’ model, which mimics individual actions, to a ‘process-centric’ model, which robotises both the application and the system. For the finance function, holistic robotic process automation brings the potential to streamline financial processes and also to standardise them. At the most basic level, automatically generated reports ensure that you can rely on data integrity with the utmost assurance. Built-in business rules can eliminate the need to micro-manage and allow users to keep track of processes and trigger actions, whether at their desk or on the move working remotely. Make no mistake, strong opportunities and results are to be had from standardising processes across the finance function. Robotics as an enabler With robotics in place, business leaders can achieve all of the rigour with none of the effort. As many finance teams will be all too aware, compliance can be a persistent strain. More time can often be spent on meeting compliance standards than on the tasks which those controls are programmed to govern. The reality is that compliance should be a natural by-product of any business process. By directly engaging with the system of record, robots can validate, track and document complete processes end-to-end, to deliver a thorough audit trail.
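To make that idea concrete, here is a loose sketch of what a ‘process-centric’ close step with a built-in business rule and an audit trail might look like. The step names, tolerance and figures are invented for illustration and are not taken from any particular product:

```python
# Minimal, illustrative sketch of a process-centric close step: a built-in
# business rule validates the work, every action is written to an audit trail,
# and the next step is only triggered automatically if validation passes.
# Step names, thresholds and data are invented for the example.
from datetime import datetime, timezone

AUDIT_TRAIL = []

def log(step: str, outcome: str, detail: str) -> None:
    """Append a timestamped record so the whole process is documented end-to-end."""
    AUDIT_TRAIL.append({
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "step": step,
        "outcome": outcome,
        "detail": detail,
    })

def reconcile(ledger_total: float, bank_total: float, tolerance: float = 0.01) -> bool:
    """Business rule: the two balances must agree within a small tolerance."""
    difference = abs(ledger_total - bank_total)
    passed = difference <= tolerance
    log("reconciliation", "passed" if passed else "flagged",
        f"difference of {difference:.2f} against tolerance {tolerance:.2f}")
    return passed

# Illustrative run: only move on to reporting if the rule is satisfied,
# otherwise route to a human rather than pausing the whole close.
if reconcile(ledger_total=120_450.32, bank_total=120_450.32):
    log("reporting", "started", "reconciliation clean, report generation triggered")
else:
    log("escalation", "raised", "exception routed to finance team for review")

for entry in AUDIT_TRAIL:
    print(entry)
```

The detail matters less than the pattern: the rule is part of the process, every action is recorded as it happens, and the next step fires automatically only when validation passes, so the audit trail emerges as a by-product rather than being assembled afterwards.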
The true strength of robotic process automation lies in its inherent ability to drive further self-remediation and improvement. By their very nature, robots generate streams of valuable and accurate data simply as a result of executing the process itself. Once captured and stored, this information can then be analysed and exploited to review previous performances and to home in on specific areas for improvement. In ‘Industry 4.0’, business leaders are now acutely aware of the power of ‘big data’. However, it is now time to align this buzzword with robotic process automation. [easy-tweet tweet="A holistic approach to #robotics is needed to move from a ‘user-centric’ model to a ‘process-centric’ model"] Performance, accuracy, analysis and productivity. The time is now for finance teams to stamp out inefficiencies. Working staff harder, rather than addressing underlying processes, should no longer be an option today. Business leaders that realise the true benefits of robotic process automation will be those that retain a competitive edge in ‘Industry 4.0’. ### Simplify your data centre and get ready for the cloud-based digital economy Most people love simple-to-use products and services: something that is easy to understand, intuitive and exactly meets their needs. In a world of multiple choices, with many vendors competing for their business, they not only want the best experience and service but also expect everybody to match – or even surpass – their high level of expectations. As such, many digital businesses have transformed themselves to better meet these sky-high demands by anticipating customer and user preferences and continually innovating to maintain competitive advantage. [easy-tweet tweet="#Cloud architecture provides a cost-effective, pay-as-you-go resource that can scale quickly"] Continuing the need for simplicity, organisations are always looking to streamline company operations. More efficient processes provide many benefits when applied to delivering a rich customer experience, including reduced cost, fewer complaints, fewer errors, and stronger market differentiation. For management, striving for simplicity, closely coupled with innovation, has become a cultural mind-set. While the owned data centre has long supported business growth, a great many organisations are now making cloud adoption a fundamental part of how they do business. A cloud architecture provides a cost-effective, pay-as-you-go resource that can scale quickly and provide flexible, fast-to-deploy, provisioned services, with applications that can be rapidly spread across geographically distributed sites or cloud environments. At Juniper, we’re helping to unlock new business models for this connected world and create growth opportunities. As a foundation of the digital business, the network is a critical enabler for enterprises to integrate cloud-based services, especially using hybrid models. Cloud providers rely on the network to build a flexible and pay-as-you-go infrastructure, which enables them to deliver cost-effective services to their clients. A simple, well-designed infrastructure can speed up traffic, avoid errors, remove congestion, and enforce and maintain ubiquitous security, while also costing less to operate than traditional networks. From a data centre perspective, we recommend two key elements in designing a simplified high-performance network: Architecture Adopting open standards is the ideal route to simplifying network evolution.
It increases flexibility, protects legacy equipment and enables the introduction of best-of-breed innovation from a wide development community. Switching, routing and security, coupled with automation and orchestration – all should be working in conjunction with an open technology ecosystem to accelerate the deployment and delivery of applications for enterprises and service providers alike. For Juniper, 'open' means adhering to industry standards and platforms such as OpenStack. It also means publishing APIs that allow partners and customers to talk directly to the operating system and define the network in the manner that best suits their unique needs. It means adding components easily from other vendors as and when you choose, whether it's network devices, storage or servers. Customers running bi-modal IT, i.e. two separate, coherent modes of IT delivery, one focused on stability and the other on agility, will also benefit from the flexibility and easier integration. Topology Juniper recommends implementing a topology that reduces the number of tiers to ensure low latency and allow the rapid transfer of data across virtualised environments. Many traditional data centres still work with a hierarchical tree structure, which was fine in the old client/server environments, but today apps and websites integrate data from numerous geographically distributed sources. As a result, the biggest data transfers nowadays are east-west (server-to-server). The preferred topology for new data centre networks is a flattened 'spine-and-leaf' approach, which presents a much better foundation on which to build a data centre in terms of scalability, performance, agility and cost. 'Simple' from the end user's perspective also means rapid access to information and short download times. With many businesses moving toward a service-oriented, scalable infrastructure, flattening the network should empower them not only to create opportunities for deploying automation and virtualisation technologies but also to drive up performance across the network to improve the overall customer experience. [easy-tweet tweet="Redefining the role of networking should create new opportunities for businesses to succeed" user="JuniperNetworks"] In short, redefining the role of networking should create new opportunities for businesses to succeed in the digital economy. Simplicity always wins over complexity in the data centre – especially when building a network with a strong foundation that drives innovation through your organisation. ### Data Recovery and Outsourcing - #CloudTalks with IT Specialists' Matt Kingswood In this episode of #CloudTalks, Compare the Cloud speaks with Matt Kingswood from IT Specialists about outsourcing to third parties and the best way of protecting company data. #CloudTalks ITS is a nationwide Managed IT Services provider, delivering UK-based services to businesses of all sizes. They offer a broad range of IT services, including managed backup and recovery, tailored UK cloud hosting, complete infrastructure implementation, flexible telephony solutions and web creative services to businesses across the country. Recently re-branded, ITS has a long history, dating back to 1896 when it first opened as a printing company. Since the 1960s, when ITS first transitioned into computer services, a series of mergers and acquisitions has shaped the brand we know today. Formerly known as Kalamazoo, ITS is now part of the global IT software and services provider Reynolds and Reynolds.
In total, Reynolds and Reynolds employ over 4,300 people worldwide and have operations throughout Europe, Canada and Mexico as well as those in the UK and US. With three regional offices located in the UK and field engineers stationed from Edinburgh to London, ITS offers customers minimum downtime through various forms of IT support, hardware break/fix and disaster recovery services. ### The transformation of IT operations IT operations are changing. The globalisation of business, expansion of market opportunities and rising customer expectations are changing the way businesses use technology. IT has become more operational, focused on agility and information. In a recent prediction for 2016, Nick Patience, an analyst with 451 Research, argues that analytics will become more “prevalent throughout the layers of technology businesses use.” At the same time, mobile penetration is also continuing to rise sharply, with mobile-phone users expected to number 6.1 billion by 2020, which is almost one billion more than the current global adult population. [easy-tweet tweet="The use of technology and scalable tools has meant changes to the role of the CIO and IT support services"] As a result of the growing use of technology and emergence of scalable tools, the role of the CIO and IT support services has evolved as well. The most innovative companies with leading CIOs are managing these changes by developing an operational strategy to IT, enabling profitable growth and agility by lowering costs and becoming more strategic in their resourcing. Operational excellence is key to any progress. Accenture views operational excellence as a key attribute of successful companies in any industry. Their white paper on the subject argues “relatively small improvements in business processes can put companies significantly ahead of competitors,” calling on organisations to decide “which processes really matter—and develop the disciplines to manage and sustain them.” For those selling IT, relevance is critical. Companies want their IT suppliers to provide solutions that include support, guidance and a quality service. Consequently, improving one area of a business, by as little as nine or ten per cent can “dramatically differentiate a provider from competitors” say Accenture. As a result, IT companies are consequently adopting much more adaptable and efficient models of service to customers. The CAPEX (capital expenditure) vs. OPEX (operational expenditure) debate has raged for a number of years, and OPEX has won. Taking an operational approach to IT not only improves efficiency, but it’s proven to be cost effective, enabling a ‘pay-as-you-grow’ revolution. Businesses and people alike are no longer interested in owning the asset. Subscription models from the likes of Netflix through to Amazon Web Services have all contributed to the growth of the pay-as-you-grow subscription model. For businesses, taking this approach provides fast scalability. In our space, Inventory Management for example, organisations can just add new products and hardware to their support contracts as they scale. In my view, the pay-as-you-grow economy is evidence for a need to take an OPEX rather than a CAPEX approach in IT. OPEX provides scalability, and ensures organisations are not having to shell out for products in addition to any support they might require. 
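To see why pay-as-you-grow is so attractive, it can help to sketch the two models side by side with some invented figures (a rough illustration only; none of these numbers come from a real contract, and the crossover point depends entirely on the assumptions):

```python
# Illustrative comparison of buying capacity up front (CAPEX) versus paying
# only for what is used each month (OPEX). All figures are invented examples.

def capex_cumulative(months: int, upfront: float, support_per_month: float) -> float:
    """Total spent after N months when capacity is bought outright on day one."""
    return upfront + support_per_month * months

def opex_cumulative(months: int, fee_per_unit_month: float, units_used) -> float:
    """Total spent after N months when paying only for the units actually used."""
    return sum(fee_per_unit_month * units_used(m) for m in range(1, months + 1))

# Hypothetical growth curve: start at 10 units of capacity and add one a month.
growth = lambda month: 10 + month

for horizon in (6, 12, 24):
    capex = capex_cumulative(horizon, upfront=30_000, support_per_month=300)
    opex = opex_cumulative(horizon, fee_per_unit_month=80, units_used=growth)
    print(f"{horizon:>2} months: CAPEX £{capex:,.0f} vs OPEX £{opex:,.0f}")
```

On these made-up numbers the subscription costs far less over the first year and only overtakes the up-front outlay after almost two years of steady growth, which is the point: spend tracks actual need rather than a day-one guess about capacity.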
A good supplier will provide field maintenance, repair services, technical support and even hardware support training to enable your business to improve its service and better retain customers. With the proliferation of data, cloud technology and SaaS (software as a service) solutions, the landscape has changed dramatically, and today’s businesses who wish to be here tomorrow need to look to take advantage of the new status quo in their IT operations. Technology and CIOs need to be prioritised to lead the charge as organisations evaluate how they engage with customers and use new information to improve the way they do business. The foundations of this have to be 'what's in it for the customer'. [easy-tweet tweet="Taking an operational approach to IT improves efficiency and has proven to be cost effective" hashtags="Transformation"] There has been a movement within IT towards becoming more operations focused. As organisations look to become more competitive and meet the growing demands of the globalisation of business and rising customer expectations and competition, they need to develop agility and use tools that are flexible and easy to scale. Focusing on agility and information requires leadership from CIOs and I expect their role to continue to grow in importance in driving growth of operations based models within the IT market. In short, as the function of IT and business culture continues to develop and change, the benefits of taking a more operational approach will become clearer, highlighting a need to grow and empower IT leaders. ### The digital solution to Britain’s productivity gap Once again, the latest UK labour productivity figures showed how its productivity gap continues to grow. Indeed, the difference between British productivity and the remaining members of the G7 group of industrial nations is now the greatest since modern records started in the early 1990s. [easy-tweet tweet="The latest UK labour productivity figures showed that the productivity gap continues to grow"] And while there have been improvements with regards to employment – with the unemployment rate falling to 5.2 per cent last year, the lowest in nearly a decade – a clear challenge for both businesses and policymakers is how to get more output out of every hour worked.  The government is acutely aware of and working to tackle what Sajid Javid named “the economic challenge of our age”. Last July, a mix of deregulation and government intervention across apprenticeships, universities, skills, housing, transport and finance were announced to try to address Britain’s poor productivity performance. However what was not mentioned is a key element which could be a game changer in British productivity. For many professional occupations, IT and access to data is often cited as significantly hampering the ability for workers to achieve their full potential. Indeed, new research from Nimble Storage on the app data gap revealed that 61 per cent of Brits surveyed believe the speed of the applications they use significantly affects their ability to perform best at work.  And with organisations of all sizes modernising and transforming in order to expand their capabilities and progress digital projects, applications are becoming an ever-more crucial part of our professional experience. Companies frequently rely on hundreds – if not thousands - of applications to power every conceivable business process, from sales and marketing, to product development. 
But the success of these new digital processes depends on uninterrupted, rapid access to data. When the delivery of data to applications is disrupted, it creates a gap which may negatively impact the user’s experience, the business outcome, and ultimately hinders organisational effectiveness. Overcoming this “App-Data Gap” is therefore essential to all businesses hoping to drive greater productivity. The App-Data Gap is a significant blight across the UK, with 94 per cent of British IT staff admitting to experiencing delays when accessing and inputting information while using business software applications at work. While another 42 per cent estimate that they waste between 10 to 30 minutes a day on application response delays. So for companies looking to drive greater productivity from staff, managers need to look beyond positive cultural changes and also consider how the IT infrastructure that they are providing their workers can have an important impact. And although frequently perceived solely as a storage issue, in more than half of cases examined by Nimble Storage, application breakdowns stemmed from complex infrastructures, which then created the app data gap that disrupted the data delivery – ultimately meaning information was not instantly available and slowed down the process. So it’s important that organisations that look to their IT infrastructure to improve productivity address the issue head-on. Companies must look to new technologies that incorporate both flash-optimised architectures and predictive analytics to respond to concerns as they arise. This not only will help organisations anticipate and rapidly resolve complex issues, avert hot spots, and simplify planning, but will enable them to make long-term predictions around their storage capacity and performance, and plan to make improvements seamlessly and non-disruptively. [easy-tweet tweet="In more than half of examined cases, application breakdowns stemmed from complex infrastructures"] Improving productivity is essential both to organisations individually and to the greater economy. It is therefore important that companies look to digital solutions to address a decrease in productivity. And with digital process driving so much of British industry, it is essential that unproductive processes are rectified to reduce workers’ frustration and immediately enable employees to save wasted time and - ultimately -be more productive. ### VoIP & Cloud Telephony - #CloudTalks with Swytch's Chris Michael In this episode of #CloudTalks, Compare the Cloud chats with Chris Michael from Swytch about VoIP and avoiding international roaming charges. #CloudTalks On average, there are two active SIM cards for every adult in the UK, meaning a huge demand for additional mobile numbers. Swytch meets this demand by providing additional UK mobile numbers on your existing phone. Swytch numbers live in the cloud, so as long as you can access the internet from your device you have access to all of your Swytch numbers, all of the time, directly within the app. As well as added convenience, Swytch numbers also provide additional privacy and security. ### How to move your data to a new virtual home in the cloud When moving to a new home, your prized possessions are shifted from room to room, then to a storage box, then to a removal van and finally placed in the new house. In this lengthy process, your treasured belongings are often lost, damaged or discarded. 
It’s easy to think that this is part and parcel of the moving process – after all, that new house with the extra space and better location has to be worth it, right? [easy-tweet tweet="Taking your first step towards the #cloud requires a lot more than a simple software upgrade" hashtags="Migration"] This analogy is surprisingly appropriate for businesses moving to the cloud. Solutions like Office 365 offer a wide range of benefits in terms of productivity, network availability, and easier emailing and collaboration from anywhere and from any end device. In spite of these benefits, just like a house move, getting from on-premises Exchange and Office tools to the cloud or hybrid environment can present significant challenges for IT teams. Whilst many people expect to lose items during a move, more often than not, these losses are caused by a lack of preparation. Sorting, packing and moving involves a lot of work and it can be stressful, but it is also an opportunity to clear out the old junk and move into new premises with a clean slate. The process of moving data to the cloud is the same. Even in the preliminary stages, there is plenty that can go wrong. So, the better prepared a company is for the move, the more smoothly the process will run – and smooth processes lead to lower running costs in the long term as well. Companies that decide to take their first step towards the cloud need to be aware that it is a lot more than a simple software upgrade; it involves a complex infrastructure migration process. Good preparation is the key to success. Those organising the migration process should ensure that all data is ready for the move and that the company network has the necessary capabilities to interact with the cloud. Packing up – Sort and label your data The first thing your company needs to decide is whether it actually wants to move all of its data to the cloud. Often, it makes more sense to archive some of it. For example, emails usually take up an enormous amount of storage space. Migrating absolutely everything means that these messages have to be physically moved into the cloud, despite the fact that users rarely access very old emails. The implications of this can be significant. For example, research has shown that, on average, Office 365 migrations can take 30 per cent longer than the original plans. The main reason given for this is that businesses have underestimated the volume of surplus email. Overshooting the migration timeline by a third can considerably increase the cost of the move. During an Office 365 migration, the Exchange servers move to the cloud. This gives businesses the perfect opportunity to decide where the existing data will be stored and is an excellent opportunity to clear out any unwanted mess, such as unnecessary old emails. The same applies to many users’ PST files and local archives. Similar to labelling storage boxes with the room they need to end up in, IT teams can save a great deal of time by sorting and labelling their data with keep, archive, migrate and delete. The archiving of emails and existing PST files should take place before the actual move begins, allowing the slimmed down mailboxes to be moved to the new cloud environment seamlessly and efficiently. Preparing for the move – Reinforce the network The route to the Office 365 cloud can be tricky without the right preparation. 
Companies frequently face two major problems in relation to data traffic:
- The present network is simply too slow if too many people are online
- Office 365 has to compete with other web-based applications for available bandwidth

In traditional IT environments, a single firewall provides security for a single on-premises Exchange server. All email traffic is routed via this central point, even if staff are travelling or working at home (known as traffic backhauling). When switching to Office 365, it is essential to set up direct internet breakout, which will distribute the traffic across external networks. Multiple firewall instances are required to secure these direct internet connections. This can involve a lot of work for IT managers, unless they implement a centrally managed firewall solution for distributed networks. Moving day – Migrating to the cloud Once everything is prepared, the move to your new home in the cloud can begin. Getting there – which, by the way, can take weeks or even months – should be achieved without detours or damage. One thing to consider at this point is a plan for migrating user settings, such as profiles and address books, to the new cloud environment. Instead of moving all their data to the cloud, more and more companies are opting for a hybrid solution. This means that some users and storage locations stay with Exchange on-premises, while the rest move to the cloud. [easy-tweet tweet="Instead of moving all their data to the #cloud, more and more companies are opting for a #hybrid solution"] In your home move, you might employ a removal company and home cleaners to help make it a smoother process. A company’s IT managers may want to do the same; when moving from one Exchange to another, you are well advised to call on the expertise of third parties. They will have the necessary know-how and a portfolio of tools and products for ensuring a smooth migration to Office 365. ### Introducing Oscobo: A different kind of search engine Compare the Cloud has been interviewing a number of digital innovators and today the focus is on Oscobo, a UK-based Privacy Search Engine. Founded in 2015, Oscobo was launched on the belief that personal data should remain personal. We sat down with Rob Perin, co-founder at Oscobo, to hear a bit more about the company. [easy-tweet tweet="Oscobo is the UK’s first Privacy Search Engine - providing true search results"] 1. What is Oscobo? Oscobo is the UK’s first Privacy Search Engine. We are proud to say that we do not track you, we don’t use cookies, we don’t look at your IP and we store no data on the user. We just provide “true” search results based solely on the words typed in the search box. Our motto is “we don’t know who you are and we don’t want to.” 2. Where does the name Oscobo come from? “Scobo” means “to look into” or “probe” in Latin and the letter “O” means “not” or “non” in Swedish. So we have taken some creative licence by mixing our languages, but to us “Oscobo” means “not to probe, not to track.” 3. What inspired you to create a different kind of search engine? Fred Cornell and I are the co-founders of Oscobo, and we worked together in 2000 in the start-up incubator IdeaLab. I went on to work for BlackBerry as Commercial Director for Europe and Fred as Director of Operations for EMEA for Yahoo! While at Yahoo, Fred learnt about the online search industry and became uncomfortable with the way personal data was tracked, recorded and sold to advertisers. 
After 10 years working there he left the company and set out to put the record straight. By 2015, I’d left BlackBerry and was in search of another game-changing company, and Fred shared his idea of developing a privacy search engine with me. We both saw a gap in the market, especially in the UK where no UK-centric privacy engine existed, and after a period of developing and testing the platform in Europe we launched Oscobo in January 2016. 4. Why do you think online privacy is becoming increasingly important? Some people who use the internet think most content is free and others understand that they are funding it with their personal data. However, as more and more technology is creeping into our lives with the use of tablets, phones, online banking, etc., more and more data is being collected. People are starting to realise just how invasive and, indeed, how dangerous it is to leave so much personal information on the web and are starting to think twice about sharing so much information online. 5. What would you say to users who feel that, having nothing to hide, they have little need for a privacy-based search engine? Using a privacy search engine has nothing to do with hiding things. It is to do with protecting your identity online so you will not be spammed, cold-called, and perhaps even lose out on certain offers based on your profile. People who use a privacy search engine believe they have a right to surf online without everyone knowing everything about them, their spending habits, their income bracket, what they “like” and what they are looking for. Your ISP will know exactly what pages you are looking at and must legally file that. The issue is when your data is being collected and then sold on to advertisers. Also, data that is stored is data that can be hacked. 6. Who is Oscobo aimed at? Oscobo is aimed at those people who believe they have a right not to share their age, sex, marital status, buying habits and income bracket every time they search for something online. It’s for people who want to be given true algorithmic search results based purely on what they type in the search box and not based on what advertisers “think” they want to see. 7. Is there a trade-off in terms of usability because you do not keep personal user history? There is a small trade-off. We don’t hold any data, so you could type “Five Star Hotel Paris” once and then a second later go to search it again and we would not be able to know who you are or predict what you are typing. I would say the “cost” of giving up the convenience of not having to type terms twice is far outweighed by the benefits. If an advertiser knows you are a middle-aged manager who frequently flies for business and has been looking all day for a flight to Paris, they are more likely to increase the price than if they think you are a young student looking for the first time. [easy-tweet tweet="Oscobo is for people who want to be given true algorithmic #search results based on what they type"] 8. What are Oscobo's plans for the future/expansion? Oscobo will be rolling out across Europe and internationally, providing localised search results in each country in the near future. We will also be developing some of the features on the site in the next few months to improve usability and have a roadmap of products and apps to come out later in the year. 
### Public, Private & Hybrid Cloud - #CloudTalks with EMC's Rob Lamb In this episode of #CloudTalks, Compare the Cloud chats with Rob Lamb from EMC about finding the right cloud platform for your mission critical applications. #CloudTalks EMC Corporation is an American multinational corporation headquartered in Hopkinton, Massachusetts, United States. EMC sells data storage, information security, virtualization, analytics, cloud computing and other products and services that enable businesses to store, manage, protect, and analyse data. EMC's target markets include large companies and small- and medium-sized businesses across various vertical markets. ### Cloud’s critical role in emergency communications Businesses can and will face emergency situations – any incident that has an impact on an organisation’s business continuity has the potential to escalate into a crisis.  Crisis situations involving IT downtime can range from routine IT and operational incidents to sophisticated cyberattacks and acts of nature such as a flooded server room or power outage. [easy-tweet tweet="A SaaS #cloud platform will enable a business to notify every relevant employee quickly"] While these events cannot be avoided, organisations can manage and minimise the physical, financial and reputational damage of a crisis by proactively implementing effective cloud-based systems and procedures to cope with periods of IT downtime. The new threat landscape Cyberattacks are becoming increasingly common. The number of companies reporting distributed denial of service (DDOS) attacks has been doubling year on year.  According to the 2015 report from the Ponemon Institute, ‘2015 Cost of Cyber Crime Study: United Kingdom’, cyberattacks cost businesses on average £4.1 million per incident and incidents take an average of 31 days to resolve.  If internal systems are compromised, what will be the cost of reduced productivity?  Losses in the areas of labour, revenue, and service all contribute to the total cost of an IT outage. Businesses are unable to go on hold for a month to resolve a cyberattack; which is why companies need an alternative to the internal network that enables them to communicate quickly and effectively in the event of a breach.  The best way to achieve this is to implement a cloud-based communications platform that has the security, reliability and scalability to communicate effectively when an organisation’s internal network is compromised. The cost of a crisis Generally the cost of IT downtime can be distilled into two key areas: productivity loss and revenue loss.  North American Systems International (NASI) recently looked at these issues, and provided companies with a formula to quantify each of these loss areas as a starting point for calculating the total cost of IT downtime. The formula given by NASI to calculate losses resulting from reduced employee productivity during an IT outage is P x E x R x H, where P is the number of employees affected; E is the average percentage they are affected; R is the average employee cost per hour and H is the number of hours of the outage.  Revenue lost to an IT outage can also be calculated.  The formula in this instance is (GR/TH) x I x H. In this calculation GR is the business’ gross yearly revenue; TH is the total yearly business hours; I is the percentage impact and H is the number of hours of the outage.  
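To show how the two NASI formulas combine, here is a minimal sketch in Python. The productivity inputs are hypothetical; the revenue inputs mirror the retailer example discussed next.

```python
# NASI-style cost of IT downtime, as described above.
# Productivity loss = P x E x R x H; revenue loss = (GR / TH) x I x H.
# The productivity inputs below are hypothetical, for illustration only.

def productivity_loss(p_employees: int, e_pct_affected: float,
                      r_cost_per_hour: float, h_outage_hours: float) -> float:
    """P x E x R x H."""
    return p_employees * e_pct_affected * r_cost_per_hour * h_outage_hours

def revenue_loss(gr_yearly_revenue: float, th_yearly_hours: float,
                 i_pct_impact: float, h_outage_hours: float) -> float:
    """(GR / TH) x I x H."""
    return (gr_yearly_revenue / th_yearly_hours) * i_pct_impact * h_outage_hours

if __name__ == "__main__":
    # Hypothetical outage: 200 staff, 60% affected, £30 per hour, 4-hour outage.
    prod = productivity_loss(200, 0.60, 30.0, 4.0)
    # Online retailer trading 24/7: £2.4 billion yearly revenue, 50% of revenue
    # dependent on the affected systems, 4-hour outage.
    rev = revenue_loss(2_400_000_000, 365 * 24, 0.50, 4.0)
    print(f"Productivity loss: £{prod:,.0f}")
    print(f"Revenue loss:      £{rev:,.0f}  (roughly £{rev / (4 * 60):,.0f} per minute)")
```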
As a practical example, a retailer that generates 50 per cent of its revenue online on a gross yearly revenue of £2.4 billion will lose more than £2,000 for every minute of IT downtime. IT outages usually lead to a flow of related costs, and a cost analysis should also include factors such as late delivery surcharges, overtime costs, loss of customers, additional marketing costs and impact on share price. It is easy to see how the cost of an IT outage can lead to a business crisis. The central role of the cloud Communication is critical during periods of IT downtime if the impact on key areas is to be minimised. By reducing the time between knowing there is a problem and solving that problem, businesses can significantly limit damages. Effective communication in crisis situations depends on two factors: delivering the right message to the right individual, and receiving an acknowledgement that the message has been delivered and actioned. For example, if an organisation faces a cyberattack, it will need to send both internal and external notifications. Internal experts will need to be located and informed of the issue instantly so the situation can be assessed and immediate action can be taken. Customers might need to be informed early to protect consumer confidence and brand reputation. A SaaS cloud-based platform will enable a business to notify every relevant employee quickly, regardless of the extent of the impact on the internal network. Responses to the notification can be received within minutes and the business will know who is available to fix an issue and where they are located. By allocating responsibilities to the nearest people with the right skills, the impact of the issue on the business – its revenue, productivity and reputation – will be substantially minimised. Conclusion To assure critical communications during system downtime, major service disruptions, and even complete network breach, an organisation’s critical communications platform must be completely separate from its normal network. Utilising SaaS-based cloud tools is the best way of ensuring this separation, and therefore of maintaining effective communications during a crisis. [easy-tweet tweet="Communication is critical during periods of IT downtime if the impact on key areas is to be minimised" hashtags="Cloud"] A unified critical communications system needs to be easily adaptable to effectively manage a future crisis. Customers, employees and stakeholders are increasingly connected and businesses need the ability to communicate critical information to a wide range of individuals quickly and reliably during an emergency situation or periods of IT downtime. By implementing cloud-based SaaS tools, organisations can ensure an effective plan is in place before the crisis occurs; if organisations are proactive, resulting losses can be minimised. ### Top five ways to manage cloud Infrastructure-as-a-Service According to Gartner, the cloud services industry is growing more than five times faster than other IT categories as companies shift away from internal hardware to greater use of public cloud infrastructures. While the cloud is enabling enterprises to shift costs away from purchasing hardware and services and running their own data centres, another question emerges: are companies actually saving money and spending wisely on their cloud infrastructure? 
The massive shift in spend identified by Gartner underscores the need for enterprises to start tracking, managing and optimising cloud services usage—essential for eliminating waste, controlling costs and improving utilisation. [easy-tweet tweet="Infrastructure as a Service offers the ability to only incur infrastructure costs when needed" hashtags="IaaS, Cloud"] Infrastructure as a Service (IaaS) offers the ability to only incur infrastructure costs when needed, which is great for test workloads—no need to buy a permanent piece of hardware and have it sit idle most of the time. Simply turn on virtual machines, which are computer files that behave like an actual computer, for the amount of time needed. When the virtual machines are off, payment stops. However, most people are not used to turning virtual machines off – which means organisations often pay for more cloud service than they need. The cost of the wasted usage appears small—pennies per hour. But because a large enterprise may have hundreds or thousands of idly running virtual machines, the pennies add up quickly. For many production loads, the virtual machines always need to be on – so, is there a way to save money here? Yes. Depending on the cloud provider, subscription pricing may be an option, meaning lower prices for “reserved” instances. The savings compared to pure, on-demand pricing can be significant – 40 to 50 per cent over time. However, most businesses have virtually no insight into whether they’re spending too much on cloud services, or are efficiently utilising their investment. Most have multiple cloud accounts that are often purchased in a decentralised fashion by different departments and business units—with no single view of adoption and usage across the company. They need a way to automatically import usage and billing data from multiple accounts to deliver a centralised view across the enterprise. In addition, as companies continue to adopt cloud services and create even more complex, heterogeneous IT environments, their asset management tools must evolve and expand to optimise not only on-premises hardware and software assets, but cloud infrastructure services as well. Below are the top five ways organisations can optimise their cloud infrastructure services:
1. Utilise a dashboard that provides an executive-level view of the use of cloud resources, presenting aggregated consumption patterns and total spend across all cloud subscriptions. Enterprises can analyse data and view consumption reports and breakdowns based on instance sizes and types and subscription models (on-demand or “reserved”), assess consumption rates by department and perform various trending analyses.
2. Track cloud services spending and usage across all departments to eliminate over-buying and underutilisation of cloud capacity and instances.
3. More effectively negotiate volume discounts and rationalise usage of cloud instances.
4. Enable IT’s transition to a service-based organisation by centralising management of cloud services—minimising overhead, managing costs and facilitating chargeback of cloud services.
5. Optimise usage of pre-paid capacity (reserved instances) to minimise costs.

Businesses are quickly subscribing to public cloud services in order to decrease infrastructure costs and provide flexible computing resources that can be rapidly scaled upward or downward, depending on demand. However, they can’t recognise real cost reduction and efficiency without visibility into, and control over, cloud services spend. 
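As a rough illustration of the arithmetic above — idle virtual machines billed by the hour, and the 40 to 50 per cent saving available from reserved capacity — the sketch below uses hypothetical hourly rates; real prices and discount levels vary by provider.

```python
# Illustrative only: hypothetical hourly rates, not any provider's real pricing.

ON_DEMAND_RATE = 0.05          # £ per VM-hour ("pennies per hour")
RESERVED_DISCOUNT = 0.45       # article cites 40-50% savings for reserved capacity
HOURS_PER_MONTH = 730

def idle_waste_per_month(idle_vms: int, idle_hours_per_day: float) -> float:
    """Monthly spend on virtual machines that are running but doing nothing."""
    return idle_vms * idle_hours_per_day * 30 * ON_DEMAND_RATE

def reserved_saving_per_month(always_on_vms: int) -> float:
    """Monthly saving from moving always-on workloads to reserved pricing."""
    on_demand = always_on_vms * HOURS_PER_MONTH * ON_DEMAND_RATE
    return on_demand * RESERVED_DISCOUNT

if __name__ == "__main__":
    # A hypothetical estate: 500 test VMs left running idle 16 hours a day,
    # plus 300 always-on production VMs still billed at on-demand rates.
    print(f"Idle-VM waste:   £{idle_waste_per_month(500, 16):,.0f} per month")
    print(f"Reserved saving: £{reserved_saving_per_month(300):,.0f} per month")
```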
Software Asset Management processes and technology need to be able to account for, track and manage cloud infrastructure services. [easy-tweet tweet="Businesses are quickly subscribing to public #cloud services to decrease #infrastructure costs"] Cloud services can provide real benefits, but, regardless of the type of cloud services used, companies need to consider employing software licence and subscription optimisation processes and technology to optimise their investment in cloud services and software. These tools will help identify idle and under-utilised instances and subscriptions, and identify licence compliance issues. This will help maximise return on cloud services investment, whether in IaaS, SaaS or other types of cloud services. ### Virtualization & Cloud Infrastructure - #CloudTalks with Xangati's Atchison Frazer In this episode of #CloudTalks, Compare the Cloud speaks with Atchison Frazer from Xangati about virtualization and infrastructure performance management. #CloudTalks Over 400 customers among enterprises, government agencies, healthcare organizations, educational systems and cloud providers use Xangati’s solutions to provide the intelligent performance management of workloads running in your VI and VDI environments. Xangati’s solutions provide visibility across the entire infrastructure avoiding silos of inaccessible and out-of-date information; yet, deliver consumptive and interactional information in a context that is relevant and actionable to any specific critical workload. ### Staff Heroes eases administrative burden of short-term employment A new online platform has been launched that aims to eliminate staff shortages by providing an end-to-end service that allows employers to tap into a talent pool of high-quality temporary staff. [easy-tweet tweet="Staff Heroes aims to eliminate staff shortages by providing an end-to-end recruitment service"] Dubbed Staff Heroes, the service addresses the issues faced by the hospitality industry when recruiting short-term staff via traditional methods. Available as an online and mobile platform, Staff Heroes already boasts more than 1,000 professionals, or Heroes, across London, ready to be booked for paid-per-hour shifts. The solution offers employers plenty of flexibility and pay scales can be set depending on job type and experience. Posting a job also promises to be a simple experience, taking less than five minutes, and recruiters can view Hero profiles to determine the most suitable candidate. The financial aspects are all handled securely by Staff Heroes, removing another administrative burden from the recruitment process. Workers are paid directly and businesses are only charged when a shift has been completed. “Staff Heroes is to the staffing sector, what Uber is to the taxi industry,” explains Laurent Gibb, Co-Founder and CEO of Staff Heroes. “We remove the need for businesses to spend time and resources on unnecessary admin and agency fees, by providing a secure, automated platform to source quality workers with ease.” “We are empowering businesses to take control of their hiring needs via a flexible and user-friendly platform. 
This also benefits our Heroes who will be able to identify local opportunities to engage in short-term work and earn money.” [easy-tweet tweet="Flexible and contractual employment is playing an increasingly large role in the UK’s economy"] Flexible and contractual employment is playing an increasingly large role in the UK’s economy, as businesses and employees look to improve payroll efficiencies and find a better work-life balance. Recent statistics from the Recruitment and Employment Confederation (REC) indicated that on any given day in the UK, 1.2 million people are employed on a temporary, contract or interim basis. ### The future of banking must involve collaboration The banking industry is under threat as never before, and it needs to adapt or face the consequences. How can it go about this change, and is the answer more collaboration with cloud-based and agile fintech firms? [easy-tweet tweet="Most of the innovation in financial services is to be found in the #fintech startup eco-system"] There was once a time when no-one would have predicted that Microsoft would ever be toppled from a position of worldwide dominance in business software and systems. But just as it and other behemoths of the tech industry – Nokia, IBM, BlackBerry – have come under sustained attack and have had to reinvent themselves, shrink or both, as a result of the disruption brought by cloud-based and more agile businesses, it’s clear that banks will have to engage strongly in digital in order to remain relevant in the shifting economy. But how easy is this for banks, which are usually very big, not that agile and unable to respond quickly to changing market conditions? This, allied with the severe threat posed by supermarkets and other providers, means that the banking industry itself could be at risk. How can banks address this, and rather than missing out, is the answer collaboration with smaller, more agile fintech providers? Digitising services is not enough The rise of digital has been astonishing, and banks are aware of the need to change and adapt their offerings. They are investing significant sums in replacing legacy systems, and working to upgrade capabilities so that customers can now do online what they once could only do in branch. However, the problem is that by digitising existing services, banks are simply doing what their customers expect of them: moving a service to a different channel. The reality is that these developments are nowhere near as innovative as customers now demand. Industry analyst group Gartner highlighted this in a July 2015 webinar on Managing Innovation and Digital Transformation in Financial Services, pointing out that while banks are busy digitising services, they should in fact be digitising the entire business. Collaboration is key to a healthy banking industry However, banks are not generally structured to do this on their own, and to remain relevant and meet their customers’ changing requirements, they must look to partner and collaborate with more agile and cloud-based fintech providers. Most of the innovation in financial services is to be found in the fintech startup eco-system, and partnering would not only make the digitisation process much more straightforward, but would also allow banks to offer more differentiation, which is very important in the face of competition from other banks and new market challengers. Rather than seeing fintech firms as potential rivals, banks should instead collaborate to offer their customers a superior and differentiated service. 
By their very nature, fintech startups are more flexible and agile than banks and can bring a product to market significantly quicker in most cases. They are also more used to innovating on a day-to-day basis, and delivering the type of services demanded by customers in 2016. A number of banks have already started this process. In the UK, Santander has partnered with peer-to-peer lender Funding Circle - the bank refers SMBs with rejected business loan applications to them, and in return Funding Circle directs any SMBs needing banking services to Santander. At Ormsby Street, we have recently launched CreditHQ, our small business credit checking SaaS in Italy and Germany, as a result of a partnership with one of Europe’s biggest banks. Such moves are indicative of a general acknowledgement that it makes little sense for a bank to build systems and software from scratch when fintechs have already done a lot of the hard work for them. The challenge comes when they want to assess and integrate multiple partners. [easy-tweet tweet="Rather than seeing #fintech firms as potential rivals, banks should instead collaborate" hashtags="Banking"] As banks explore numerous options, including collaboration and acquisition, one thing is clear, that in the past few years they have recognised the need for change. The next step is to embrace the disruptors who’ve brought innovation to the sector, and find the right partnership model in which to do so. ### Operating Systems and MSPs - #CloudTalks with RDB Concepts’ Phill Evans In this episode of #CloudTalks, Compare the Cloud chats with Phill Evans from RDB Concepts about how they deliver managed support and services around databases and operating systems. #CloudTalks ### Google and the War of the Clouds: IoT Isn’t the Point At London’s recent Cloud Expo, Barak Regey, Google’s director of cloud platforms for Europe, recommended that cloud companies simply abandon trying to predict how their data and cloud capacity needs will evolve over the next five years. [easy-tweet tweet="There are so many companies creating #cloud offerings these days that the market has become very crowded"] The cloud expert stated that with the explosion of the Internet of Things (IoT) and the pace of innovation and change in the technology industry, cloud organisations simply won’t be able to calculate and prepare for how much cloud capacity they’ll need. Interesting, but not really the point The rather defeatist forecast is not necessarily inaccurate, (it is impossible to calculate how much usage IoT is going to require because new devices are being invented every day), but it does kind of miss the point: As long as companies thoroughly plan how they manage their cloud environments, they should be in perfectly good standing to scale efficiently and continue to capitalise on the benefits provided by the cloud. The Real Capacity Projection Point What Regey misses is perhaps the more interesting point – rather than warning companies off being able to plan for the future due to lack of knowledge surrounding IoT capacity, he should attempt an outside look in at Google and the other big providers such as AWS, Microsoft and IBM. When it comes to capacity projections, the biggest threat should be felt by these large providers – they are each other’s worst enemy. The big four, amongst a myriad of other cloud providers that are continually appearing in the marketplace to take a slice of the cloud opportunity, are saturating the market really rather quickly. 
In the same way that startups in the nineties began creating the ".com bubble" that burst the decade after, there are so many companies creating cloud offerings these days that the market has become a very crowded place. It seems many technology providers these days have a cloud offering, and many of these offerings are being fuelled by borrowed investment. Some of these offerings are becoming so big that they are even threatening the biggest players. Even hosting providers who traditionally stick to “power and space” are now providing cloud services. All of this together is likely to lead to a levelling of the markets over the next five years that may well be as normalising and balancing as IoT and technology change is disruptive. Bigger players will consume the smaller ones, household names will need to differentiate themselves, and everyone will need to provide new incentives to retain clients. Power to the Consumers Rather than feeling powerless, it’s the organisations that consume cloud services that are in the strongest position. Theirs will be an opportunity to capitalise on a market that is becoming crowded and therefore much more competitive. As long as these organisations plan how they will manage their cloud environments, this “war of the clouds” should be nothing but beneficial to them. How Can Businesses Plan to Manage Their Cloud Environments? If careful planning is a business’s best chance at leveraging the cloud amid the uncertainties of capacity usage, are there specific points to note? Of course! Here are some of the basics that your cloud migration plan should cover, as well as questions companies should ask themselves as they prepare to move to the cloud:
- Do you have clearly defined goals that you want to achieve as a result of moving to the cloud?
- Do you really understand your data? What data can/should be moved to the cloud, and what should be kept in-house? Does any data need to be kept in-house due to latency, security or geography?
- What is the size and scale of the data you need to move to the cloud?
- What levels of security and SLAs do you require from your chosen cloud provider?
- Who will manage your migration process to the cloud? Do you have the resources in-house to manage a migration, or will you have to outsource?
- Do you know the next migration step? Don’t assume that the cloud is the final resting place, as you might find it more expensive than you thought. Be sure to manage how and where it is hosted efficiently.

[easy-tweet tweet="Even #hosting providers who traditionally stick to “power and space” are now providing #cloud services"] Rather than worrying about capacity projections, businesses would do better to focus on these guidelines and ensure they have honest, detailed answers to these questions. With these answers in place, businesses are better placed than ever to harness the power of the cloud in an agile, flexible way amid a rapidly changing technology landscape. ### Predictive Analytics - #CloudTalks with InsideSales.com's Jim Steele In this episode of #CloudTalks, Compare the Cloud speaks with Jim Steele from InsideSales.com about how predictive analytics and machine learning can be used to accelerate sales. #CloudTalks ### Bringing visibility and control to your sprawling hybrid cloud infrastructure If you cannot see something, it is impossible to monitor it, assess it and improve it. In a business environment, this lets problems fester unnoticed until they eventually wreak havoc with your productivity. 
However, the growth of digital technology has meant that achieving visibility is far from straightforward. With hybrid IT and multi-cloud, achieving a holistic overview of your business operations is harder than ever, with business processes extending across different personnel, hardware and software. [easy-tweet tweet="Infrastructure monitoring solutions give businesses complete control over their IT resources" hashtags="Xangati"] Monitoring your hybrid cloud infrastructure  As business technology became more diverse, infrastructure monitoring solutions were developed to give organisations visibility into their IT performance. However, these were largely silo-specific and application unaware, making them increasingly inadequate for hybrid cloud infrastructures, where resources may not be owned by the company and may sit outside corporate firewalls. One of the ways that businesses are overcoming these challenges is through the use of cross-silo data and analytics. At Xangati, our infrastructure monitoring solutions are designed to give businesses complete control over their IT resources, regardless of their complexity. There are a variety of different solutions within our portfolio, offering enhanced visibility into end-user metrics and enabling your organisation to move away from traditional performance monitoring tools. For example, our Xangati Service Assurance Analytics framework delivers a rapid and scalable way to locate and resolve, or prevent, end-to-end performance issues in your hybrid cloud infrastructure, and guarantee performance, capacity and efficiency. We are able to map infrastructure behaviour to autonomic controls across traditional silos in real time and analyse impacts to app performance, efficiency and delivery to optimise virtualised workloads across any compute platform from private to hybrid cloud.  More than just control Being able to monitor your hybrid cloud infrastructure offers more benefits than simply gaining greater control over your IT. Monitoring, which may involve live recordings and end-to-end visibility, can quickly evolve into analysis, via machine learning, metric-based alerting and predictive analytics. With this information to hand, businesses can quickly set about optimising their hybrid IT infrastructure. System faults can be identified, security problems can be pinpointed and performance metrics evaluated. Crucially, by gaining cross-silo visibility into the entirety of your business processes you can examine the cause of performance degradation and eliminate it. Of course, the optimisation of your IT infrastructure plays a vital role in understanding how infrastructure monitoring delivers ROI, through cost reduction and revenue generation. With regards to the former, cross-silo monitoring can greatly reduce resource wastage and hence, improve cost efficiency. A predictive monitoring system can also lead to increased uptime and performance, which in turn delivers a better experience for the end-user and therefore boosts revenue generation. [easy-tweet tweet="Making sense of the sprawling complexity of #hybrid IT is an important challenge" hashtags="Cloud, Xangati"] At Xangati we believe that monitoring controls are the most effective tools for empowering efficiency within your business. Whichever solution you decide to implement, we’ll ensure that you have the intelligence you need to understand which aspects of your IT infrastructure are in need of improvements. 
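As a generic illustration of the metric-based alerting mentioned above (not Xangati's actual implementation), the sketch below flags a metric sample that drifts well outside its recent rolling baseline.

```python
# A generic metric-based alert: flag samples that sit far outside the recent
# rolling baseline. Purely illustrative; not any vendor's algorithm.
from collections import deque
from statistics import mean, pstdev

class MetricAlerter:
    def __init__(self, window: int = 60, threshold_sigma: float = 3.0):
        self.samples = deque(maxlen=window)   # recent samples form the baseline
        self.threshold_sigma = threshold_sigma

    def observe(self, value: float) -> bool:
        """Record a sample; return True if it should raise an alert."""
        alert = False
        if len(self.samples) >= 10:           # wait for a minimal baseline
            baseline, spread = mean(self.samples), pstdev(self.samples)
            if spread > 0 and abs(value - baseline) > self.threshold_sigma * spread:
                alert = True
        self.samples.append(value)
        return alert

if __name__ == "__main__":
    latency = MetricAlerter()
    for ms in [12, 13, 11, 14, 12, 13, 12, 11, 13, 12, 12, 95]:  # last sample spikes
        if latency.observe(ms):
            print(f"ALERT: storage latency {ms} ms is far above its recent baseline")
```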
Through our detailed performance reports we can deliver targeted capacity planning, improved resource efficiency, real-time monitoring and much more. Making sense of the sprawling complexity of hybrid IT is an important challenge and it’s crucial that your monitoring tools are up to scratch. If they are, your business will be able to do far more than simply monitor its infrastructure; it will also be able to analyse, control and optimise it. ### SD-WAN - #CloudTalks with Riverbed's Paul Griffiths In this episode of #CloudTalks, Compare the Cloud speaks with Paul Griffiths from Riverbed about the benefits provided by their SD-WAN solutions. Continue reading for more information about how SD-WAN can deliver improved performance and agility for businesses. [easy-tweet tweet="Compare the Cloud speaks to Riverbed's Paul Griffiths about the benefits of SD-WAN solutions" hashtags="CloudTalks"] SD-WAN: An agile enabler of enterprise movement towards the cloud Gone is the time when IT assets were limited to a handful of data centres. Gone is the time when users and applications were all bound by one unified MPLS network. Today, businesses are increasingly mixing off-premises assets into their existing IT infrastructure. Productive users are everywhere, on-premise but also on the road or at home. The Internet is becoming the backbone of enterprise communications. As enterprises become more hybrid, the shape of the network itself is dramatically changing. The underlying networks are getting more diverse in terms of performance and security. MPLS is now combined with the Internet using a variety of transports from DSL to fibre and even 4G/LTE. With HD Internet video or Unified Communication and Collaboration (UCC), the traffic mix and the communication requirements are getting richer and more dynamic. The network has never been so heterogeneous, distributed and complex. Architectures built for the network as it was 10 years ago are rapidly losing relevance:
- Managing multiple WAN paths and distributed local Internet breakouts is becoming crucial but lacks efficient solutions
- Being too static, mechanisms like QoS become a nightmare to manage
- While network performance can be controlled and optimised on-premise, guaranteeing performance for mobile users and/or off-premise applications is extremely challenging
- Holistic visibility on the traffic requires more instrumentation devices than ever
- Visibility on the performance delivered by off-premise cloud service providers is a new problem without a practical solution

Over the past few years, a novel architecture has emerged to solve similar problems at the data centre level: Software Defined Networking (SDN). Today, vendors are emerging with solutions to deliver guaranteed application performance to the modern users and workloads of the hybrid enterprise, by applying the SDN principles to the WAN in the form of so-called SD-WAN solutions. 
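One of the SDN ideas being carried over to the WAN is a central, application-aware policy that chooses a path per traffic class. The sketch below shows the shape of such a decision; the application classes, thresholds and path names are hypothetical and do not represent any vendor's SD-WAN logic.

```python
# Hypothetical application-aware path selection for an SD-WAN-style policy.
# Application classes, path metrics and thresholds are illustrative only.
from dataclasses import dataclass

@dataclass
class PathHealth:
    name: str            # e.g. "MPLS" or "Internet"
    latency_ms: float
    loss_pct: float

# Central intent: which traffic classes insist on the premium (MPLS) path.
CRITICAL_APPS = {"voice", "erp"}          # hypothetical classes
MAX_INTERNET_LATENCY_MS = 80.0
MAX_INTERNET_LOSS_PCT = 1.0

def select_path(app_class: str, mpls: PathHealth, internet: PathHealth) -> str:
    """Send critical apps over MPLS; use local Internet breakout when healthy."""
    if app_class in CRITICAL_APPS:
        return mpls.name
    internet_ok = (internet.latency_ms <= MAX_INTERNET_LATENCY_MS
                   and internet.loss_pct <= MAX_INTERNET_LOSS_PCT)
    return internet.name if internet_ok else mpls.name

if __name__ == "__main__":
    mpls = PathHealth("MPLS", latency_ms=35, loss_pct=0.1)
    internet = PathHealth("Internet", latency_ms=60, loss_pct=0.4)
    for app in ("voice", "office365", "recreational-video"):
        print(f"{app:20s} -> {select_path(app, mpls, internet)}")
```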
As the market for SD-WAN solutions begins to emerge, the requirements for an excellent SD-WAN solution are becoming clear:
- Optimisation capabilities for on-premise and cloud-based applications like Office 365 or Salesforce.com
- A network- and application-aware path selection capability to direct traffic onto the appropriate network (MPLS, Internet)
- Dynamic tunnelling with a central control plane, allowing secure backhauling of branch traffic to the corporate data centre across the Internet
- A simple interface to zScaler or other cloud-based security services, enabling local Internet breakouts without requiring further investment in on-premise Internet security appliances
- Inbound QoS to manage local Internet breakouts and protect business Internet traffic against surges in recreational Internet use
- Deep and wide visibility of all assets interconnected by the SD-WAN, with holistic visibility of network usage and performance, and integration with end-user experience monitoring of on-premise and SaaS applications

In addition, a proper SD-WAN central management console marks the start of an era of dramatically improved manageability and usability for control capabilities like QoS, path selection and VPN management. Ideal SD-WAN management consoles expose an intuitive interface and management plane based on high-level abstractions such as applications, sites, uplinks or networks, matching the way users see their IT environment. Ideal SD-WAN solutions should rely on a control plane designed to support intent-based configuration, translating global parameters into local policies. [easy-tweet tweet="Using SD-WAN, customers can implement new, more efficient, management workflows" hashtags="SDN"] Thanks to SD-WAN, customers should be able to implement new, more efficient configuration and change management workflows that make hybrid-networking capabilities genuinely usable. SD-WAN has the potential to deliver to the business the performance and agility it needs for business-critical applications, while controlling and reducing network costs at the same time. ### Huawei Cloud Congress Europe - Live Blog Huawei Cloud Congress (HCC) Europe 2016 is one of the top Huawei ICT events, where Huawei and ecosystem partners gather to share their latest, innovative, cutting-edge IT solutions and interesting ideas on IT practices and transformation. Hundreds of business leaders, IT technology KOLs and media participants from across Europe will be in attendance. [easy-tweet tweet="Compare the Cloud will be providing live updates from Huawei Cloud Congress Europe 2016" hashtags="HCCEU2016"] Throughout the event, Compare the Cloud will be providing live updates on all the latest announcements, product demonstrations and expert opinion around cloud transformation, via our live blog below. [scribble src="/event/2040450" /] ### Branch IT - #CloudTalks with Riverbed's Paul Griffiths In this episode of #CloudTalks, Compare the Cloud speaks with Paul Griffiths from Riverbed about branch IT. Continue reading for more information on why this approach to business technology is so important. [easy-tweet tweet="In this episode of #CloudTalks, Compare the Cloud speaks with Paul Griffiths from Riverbed about branch IT"] Introducing a New Approach to Branch IT The lifeblood of many companies today depends on branch offices. Whether these are remote sites, retail outlets or manufacturing plants, they must be agile and able to quickly respond to the business’s ever-changing needs. 
But too often, branch offices operate as independent data centres which are difficult to support and protect. Consequently, service outages and data loss are a common occurrence, leading to productivity issues including missed sales opportunities, customer churn, assembly-line stoppage and, ultimately, lost revenues. How can businesses efficiently address their branch office needs? The solution is to take a completely new approach to branch office IT which will improve system performance and resiliency, ensure reliable data backups, and greatly reduce operating expenses, particularly as more companies adopt a hybrid enterprise IT infrastructure that combines on-premises and cloud or SaaS-based applications and services. By implementing a “Zero Branch IT” model, businesses will no longer install new equipment and assign additional on-site support at each location. Instead, they will centralise data without compromising performance, while enabling instant provisioning of new applications and services at remote locations and branches, as well as making instant recovery of applications and services a reality. The challenges of outdated branch IT A recent Riverbed report found that 50 per cent of corporate data is stored in branch offices and that branch offices represent 50 per cent of an average company’s total IT budget. This creates an insecure and complex network of distributed servers and storage deployed solely to meet local performance and reliability needs. In other words, half of today’s IT organisations are using outdated methods of operation, forcing branches to subsist on decentralised, ad hoc, and rigid legacy infrastructures. In addition to being costly and complex to manage, outdated infrastructures limit IT’s ability to proactively respond to businesses’ ever-changing needs, prevent security breaches, and recover from unplanned outages. How Zero Branch IT supports wider business goals Though CIOs are expected to play a central role in driving business objectives, few organisations take into account the IT challenges involved in rolling out new services across all branch locations, such as WAN constrictions, security concerns, and minimal staff. Disorganised, legacy branch infrastructures are costly to both the business and the IT department, making even the smallest propositions a worrisome task – and making it harder for the CIO to support the business. As an alternative, CIOs can implement a Zero Branch IT model which will address the needs of the IT department as well as those of the organisation as a whole. To better understand this new model, IT can imagine the branch as a smartphone – a simple device, fully equipped with applications and high-speed access to data over the cellular network or the Internet. When buying a smartphone, mobile providers just provide the device itself, not a rucksack full of application servers, storage, and backup infrastructure users must also carry and maintain. Branch offices and other remote sites can operate in a similar way. Taking a new approach to branch IT enables the CIO to manage everything inside a secure, central data centre and deliver performance out to the branches. The result will be an almost non-existent IT infrastructure footprint with no remote servers, storage racks or backup and recovery systems. 
The benefits of optimising the network As wide area networks (WANs) are notoriously unreliable and do not offer protection against the creation of localised pockets of systems and information stores, Zero Branch IT requires tools that enable the convergence of IT systems and applications with WAN optimisation technologies. This will offer local LAN performance, bringing data back to the data centre, while maintaining application performance at all branch offices. The most recent Gartner Magic Quadrant for WAN Optimization conveys that WAN optimisation technologies can provide a range of features that improve application performance running across the entire WAN, and reduce the overall cost of the WAN. Gartner describes the typical WAN optimisation setup as a deployment of appliances at the central data centre and in each branch office. Another option is to deploy the appliances as virtual machines or as a cloud resident service. For mobile or remote users, WAN optimisation can be deployed as a soft client that runs on individual user devices. Businesses can cut their branch IT costs by eliminating the need to purchase, maintain and protect servers, storage and backup systems in branch offices. Additionally, by using a centralised infrastructure managed via a single console or dashboard, IT can achieve greater visibility and control over the network, so as to quickly and easily redeploy, upgrade, move, or migrate systems, applications and services to accommodate the opening of new branch offices. Using new technologies, IT can store and protect sensitive data in a centralised, strictly controlled location, with stringent backup and replication policies. Using specialised applications, businesses can easily access that information in an agile way. They can therefore deploy new services, applications, or entirely new branch sites while ensuring maximum productivity of branch staff. New tools also enable real-time continuous data capture and analysis so that companies can view network delays, providing speed, insight and control no matter where data is stored. As a result, businesses will experience far fewer instances of system outages, slow application performance and downtime, making it easier for employees to work anyplace, anytime, using an ever-increasing selection of work-issued and personal computing devices, including laptops, smartphones and tablets. [easy-tweet tweet="With a new approach to branch IT, businesses can bring disparate systems to the data centre"] By taking a new approach to branch IT, businesses can bring disparate systems and applications to the data centre manned by the full-time IT team, driving tangible economic benefits, including efficiencies of scale, improved employee productivity and the ability for all remote offices to share expensive backend solutions. A Zero Branch IT approach enables today’s organisations to leverage IT strategies such as branch converged infrastructure, storage delivery, virtualisation and WAN optimisation to address the unique needs of branch offices, all while delivering better business performance overall. 
### Dropbox Infinite unveiled to solve local storage issues Dropbox has unveiled the latest addition to its cloud solutions portfolio, promising to “reimagine how people find, access, and collaborate with large amounts of data.” [easy-tweet tweet="Dropbox Infinite lets you access all your files from your desktop, regardless of your local storage space"] Named Dropbox Infinite, the new offering allows users to access all of their Dropbox files from their desktop, regardless of how much local storage space they have. The usual green checkmark will appear next to a file if it has been synced locally, while all other items will have a new cloud icon. The decision to launch Dropbox Infinite emerged as a response to a common customer issue, namely that the limitations of their local storage meant that they did not have clear visibility over their company’s resources. Now, instead of having to switch to a web browser to see all of the company’s files, a much better user experience is achieved via the desktop. “With Project Infinite, we’re addressing a major issue our users have asked us to solve,” explained Project Manager Genevieve Sheehan. “The amount of information being created and shared has exploded, but most people still work on devices with limited storage capacity. While teams can store terabyte upon terabyte in the cloud, most individuals' laptops can only store a small fraction of that.” As well as visibility and real-time access, Dropbox Infinite also promises universal compatibility. IT teams will be able to utilise the solution across any computer operating Windows 7 or higher or Mac OS X 10.9 and up. [easy-tweet tweet="As well as visibility, Dropbox Infinite also promises universal compatibility" hashtags="DropboxInfinite"] Dropbox revealed its new Infinite project during its Dropbox Open event earlier this week, but there were also plenty of other announcements made. The company commented on its growing customer momentum, its expanding European leadership team and a new Dropbox Platform update including a new File Properties API. ### Enterprise mobility is not an app, it is a way of life Apps have become an indispensable part of our lives since the launch of the iPhone in 2007 and the subsequent launch of the app store in 2008. They are now available on a variety of other platforms, with more of them available than most people can care to count. [easy-tweet tweet="#Apps are now a mainstay of life but it is clear that they have certain limitations" hashtags="Mobility"] Apps are now a mainstay of life for many today but despite the proliferation, it is becoming increasingly clear that they have certain limitations, especially when it comes to the business environment. There is a common misconception held by many organisations that the availability of an app makes their product or solution ‘mobile enabled’ and this is something that needs to be addressed. There is no disputing that an app is a very valuable tool to have but enterprise mobility is something else entirely. A recent study suggests 14.1 million people in Britain want flexibility in their working hours or location, equivalent to almost half the working population. While apps are essential for enabling these working conditions, true mobility is about allowing people to do business in the most efficient way, regardless of their location. Apps are only one part of the puzzle that enables this. A good app presents many opportunities for the enterprise as it can transform and simplify business processes. 
It has the potential to save time and resources by taking out some of the complexities that come with other solutions. The app has to be part of a wider mobility strategy that addresses employees’ needs, not just another solution that is added to the mix to satisfy the demand for mobility. This is the challenge of enabling true enterprise mobility: it’s multi-faceted. It goes without saying that an effective mobile app is essential for any solution in today’s business world. However, we should also admit that an app doesn’t always match the desktop experience for managing a complex business process end to end. If users still need to access the desktop application for some or all parts of a process, it will be difficult to convince many to download and rely on the mobile app. Despite this, downloading an app would still seem like a small price to pay for a significant increase in efficiency for people who have to perform a particular process every day. However, users will either never download these apps or they will download and never use them. Organisations need to provide other mobility options, including mobile responsive design for tablets, VPN connectivity, smartphones and wearables, and even the old favourite, email. Email is one of the most underestimated business apps. An overwhelming majority of business people have at least one email app on their smartphone and they use it every day. If organisations can provide something that can be actioned via email without having to log on to the desktop or leave their email app, that would be a great mobility experience for most users. This is something we have seen with the approval of purchase orders through our platform, and being able to approve purchase orders remotely makes a huge difference to our customers. We have seen PO turnaround times cut from the previous average of two or three weeks to 17 hours. Most of these approvals are done via email, even though our platform offers both email and app choices. It is the same story for suppliers. Most suppliers don’t get too many POs from a customer and they usually invoice once per month. For these guys, making all the information they need available at their fingertips keeps their process simple and it also improves efficiency. We have already established that apps are great and that they play a very important role in everyday life and business functions. But why go to another app to do what you could do in the app you’re already in? Most people won’t do it. [easy-tweet tweet="An app is a very valuable tool to have but enterprise #mobility is something else entirely"] Organisations need to continually think about how to make the business process more efficient for users and how technology can help with that. There needs to be a focus on enabling people to work how they want, when they want and with as little friction as possible. This is what enterprise mobility truly is and there isn’t an app for that. ### Capacity Planning - #CloudTalks with Sumerian's Peter Duffy In this episode of #CloudTalks, Compare the Cloud speaks with Peter Duffy from Sumerian about how their Capacity Planning solutions utilise big data and analytics. #CloudTalks ### KCOM joins the Cloud Industry Forum The Cloud Industry Forum (CIF) has announced that KCOM, which delivers communications and IT services to customers across the UK, has become the latest Cloud Service Provider (CSP) to join its growing membership roster. 
[easy-tweet tweet="By joining CIF, KCOM will ensure that its customers receive up-to-date information about the #cloud market"] KCOM’s cloud services deliver business critical collaboration tools tailored to organisations in public and private sector industries including retail, finance, health and professional services.  For example, KCOM redesigned, delivered and today supports real-time operational and strategic IT for Rail Settlement Plan, including Live Sales Management, a Ticket on Departure system which produces over one billion tickets per year, ordered online and collected at the train station. KCOM is an AWS Premier Consulting Partner and holds AWS Managed Service Provider competency status. KCOM is also an official Microsoft CSP. By joining CIF, KCOM will ensure that its customers receive up-to-date information about the changing cloud market and help to guide the focus and direction of the industry body. Established in 2009, the Cloud Industry Forum is a not-for-profit organisation that aims to improve standards and education in the cloud industry. CIF provides transparency through certification to a Code of Practice to endorse credible providers of cloud services and, through its Cloud Adoption Roadmap assists end users in determining the core information necessary to enable them to adopt such services with confidence. Stephen Long, MD at KCOM, said: “Our cloud portfolio has grown considerably in recent years, and, as cloud is a key focus for the business, it’s important that we can stay at the forefront of industry developments. Aligning ourselves with the Cloud Industry Forum, one of the UK’s leading authorities on cloud, gives us access to technical insight and the latest developments in the UK cloud market, which in turn allows us to improve the service we provide our customers. We are very pleased to have joined CIF and look forward to working alongside CIF’s other members to support its various initiatives.” [easy-tweet tweet="CIF provides transparency through certification to endorse credible providers of #Cloud Services "] Alex Hilton, CEO of CIF, commented: “Our members are directly responsible for the shape, format and governance of the Cloud Industry Forum and, as such, are instrumental in influencing public confidence and industry best practice. KCOM is a very welcome addition to our membership roster, and I believe that the company’s heritage and expertise in connectivity and cloud communications will be highly beneficial to our mission.” ### The honeymoon’s over. Why we need to chat better, not more Communication at work is notoriously hard to get right. Fred Brooks identified it as a key aspect of any team’s success in the software engineering bible, The Mythical Man Month, when he found that the more people you have inputting or communicating, the more problems you may encounter in actually getting projects done. The same sentiment was echoed in a recent article by Basecamp founder, Jason Fried on the pros and more notably, cons, of chat platforms on employee productivity and well-being. [easy-tweet tweet="This current wave of technology innovation has seen a never-ending slew of new ways to communicate" hashtags="Cloud"] This current wave of technology innovation has seen a seemingly never-ending slew of new ways to communicate. As somebody who was at Skype during their hyper-growth phase, I witnessed the move from voice to chat, and have followed the many iterations of chat since. 
These tools, by and large, have made access to communication easier, but certainly in the workplace, we haven’t even begun to scratch the surface of making communication better. A lot has been said in recent weeks about team messaging tools. What started as a gushy love affair hailing the beginning of a new dawn of communication is beginning to give HR managers and staff alike some serious headaches. I can certainly agree with a lot of the negativity. I recently wrote an article defending email’s place in the world and I’ve been pretty positive that by seeing messaging as a substitute for email, you’re just replacing one set of problems with another. Hello chat-box zero. I think there are a few fundamental issues with using messaging as a team: The first is that it has created an added distraction. Messaging is not alone in being a culprit of this problem. Our phones are beeping all the time, our desktops are notifying us of new emails, tweets, messages and updates. Focus is fundamentally important to getting good work done well and it’s slowly being eroded by our modern toolchain - not to mention our decreasing attention spans and just how easy it is to go shopping on Amazon or read some news mid-way through a work task. I feel that making messaging alone the scapegoat for this problem is pretty unfair; it may have amplified it, especially if you don’t manage your notification settings and availability well. The good news is there are other little tools to help with these issues. I’ve recently started using a Mac App called Focus that can be configured to disable distracting websites and notifications, allowing you to get work done. This comes down to people taking back control of their time and tasks. Defining other rules of engagement for how your team communicate and actively working towards being smarter, not louder or faster, when it comes to communication is critical to success. We need to get better at delineating specific scenarios when we ought to communicate face to face, when email’s best, and when it’s ok to thrash it out over chat (and for how long ideally), so we’re not using chat as a de facto solution, sucking all our time away. But I’d like to take a minute away from the negativity around messaging and talk about some of its strong points, which I feel have been undervalued in recent weeks. In particular, a sense of belonging. This is so fundamentally important to our beings it fits neatly in the middle of Maslow’s hierarchy. Increased home and remote working has enormous benefit in building a team and allowing your team to win back on the work-life balance front - however, even in that setting, feeling connected to your colleagues, your peers and cohorts in your industry remains fundamentally important. We see this a lot at Gitter. We provide a home for developer communities, allowing technically minded people to come together and connect across companies, across cities and continents. What we see happening on the network continues to reinforce my belief that forming connections with like-minded groups and individuals is fundamental to the future of work as much as good teamwork amongst your colleagues is. Every day we see people connecting, sharing ideas, solving problems and learning together. What’s even more fascinating is that the number of direct message conversations makes up just over half of all of the conversations on the network. 
We constantly hear stories of people learning new skills, receiving job offers and creating new technologies together through these connections. [easy-tweet tweet="Defining rules of engagement is important in working towards smarter #communication in the workplace"] I feel that messaging has a huge part to play in how people connect in this way. It’s more intimate and personal than browsing profiles on LinkedIn, and it feels more immediate and connected than message-in-bottle posts or emails. I can still remember the first time I used chat online, back in the early ’90s; that sense that you and someone else, or other people, were talking together across untold distances created a feeling of connectedness I’ve seldom felt since; long may it continue. ### Customer Engagement - #CloudTalks with Interactive Intelligence’s David Paulding In this episode of #CloudTalks, Compare the Cloud speaks with David Paulding from Interactive Intelligence about the importance of communication, collaboration and customer engagement. #CloudTalks ### TeleData Data Centres acquire cloud specialist 1st Easy to accelerate growth TeleData UK, a leading independent data centre operator and colocation service provider based in Manchester, has today announced the acquisition of cloud and hosting service provider 1st Easy Ltd as the next stage in its strategic plan to accelerate growth and expand its infrastructure hosting portfolio. [easy-tweet tweet="TeleData have been providing colocation services from their privately owned data centre since 2007" hashtags="Cloud"] Cheshire-based 1st Easy have been operating within the hosting sector since 1999 and in recent years have specialised in the provision of VMware-based public cloud solutions and data centre infrastructure services. TeleData have been providing colocation services from their privately owned data centre, located close to Manchester’s Airport City, since 2007 and have also recently launched Work Area Recovery services at the facility. The acquisition of 1st Easy will see TeleData integrate IaaS (Infrastructure as a Service), DRaaS (Disaster Recovery as a Service) and a number of other hosting products into its wider offering. Matthew Edgley, Sales Director at TeleData, comments: “TeleData are already unique in the data centre market place as the only provider to offer colocation, work area recovery solutions and a BS5979 alarm receiving centre within the same facility.  Successful organisations of today demand a highly secure, resilient and hybrid IT enabled environment to protect and serve their business and its technology – the acquisition of 1st Easy further expands our capabilities to serve these demands.” 1st Easy are to continue to operate under their own brand name and the company states that all current staff are to be retained. [easy-tweet tweet="1st Easy will continue to operate under their own brand name and all current staff are to be retained" hashtags="TeleData"] Shaun Wilcock, TeleData CEO, comments: “This is a smart acquisition for TeleData, and a positive result for both companies, all employees and our customers. We will be sharing an incredible pool of talent, technology, infrastructure and market experience to better serve our customers and realise even greater opportunities in an exciting marketplace; one where uniqueness and differentiation within your offering is vitally important.” ### Why are business applications so boring? There is a misconception that business must be serious, uninteresting and, to an extent, sterile. 
Certainly, when it comes to business applications there are plenty of tools that lack the engagement to go with their functionality. While you might think that this functionality is the be all and end all, in reality a boring application can be hugely damaging in terms of staff productivity. [easy-tweet tweet="One of the ways that business applications sometimes fall short is by having a poor user interface, or UI"] Boring by design One of the ways that business applications sometimes fall short is by having a poor user interface, or UI, and there are a number of reasons behind this. Firstly, the nature of business apps means that developers spend the majority of their time and resources focusing on the required corporate features - meaning that design can take a backseat. When time is tight and budgets even tighter, getting businesses to invest money in creating a more engaging UI can be a tough sell. However, recent research suggests that companies may be missing a trick by creating difficult and uninspiring applications. Businesses with engaged employees outperform those without by up to 202 per cent and workplace applications play a major role in fostering this engagement. “Often, corporate applications fail to generate the employee engagement levels hoped for, particularly if a cross-platform solution is chosen to cut costs and achieve faster delivery,” explained Matthew Jones of shinobicontrols. “This forces useful features to be sacrificed, or worse, completely overlook user experience. This impacts not only employee productivity when using the app, but also their willingness to engage in its future use. Thankfully, the tide is changing. Organisations are realising that if employee engagement is a priority, the only option is native.” Although UI and app development in general may ultimately have to conform to the company’s bottom line, it is important businesses are not overly myopic during the app development process. A more engaging UI may cost more in the short term, but if it helps to generate a more creative and inspired workforce, then it will certainly prove to be money well spent. Another reason why business applications can fail to inspire their users is because, unlike consumer-grade applications, it may not be possible for employees to switch to a competitor app if the user interface is a hindrance. Even with the rise of shadow IT, there is much less incentive for developers of business apps to spend time on the UI. After all, if there is only one application that enables you to input holiday requests, then employees will continue to use it no matter how badly it is designed. In order to overcome poorly designed business applications, organisations need to ask themselves some key questions. Firstly, do your developers have the required skills to design business-grade applications with consumer-grade user interfaces? If not, it may be time to call in outside help. Secondly, are employees utilising corporate applications to their fullest? If they are not, then productivity is sure to take a hit. Gamification Another way that businesses are making their applications more engaging is by embracing gamification. Crucially, this is not about turning your business apps into games - it simply refers to using game design elements to make them more interesting and engaging. 
There are a variety of ways that businesses can incorporate gamification into their business applications. Creating an online social community, for example, has enabled users to feel more invested in the games they are playing and the same principle is now being applied to work-based apps. Creating a collaborative platform where users can discuss the app, share materials and access a shared knowledge base is a great way of boosting engagement. Similarly, not many games simply throw players in at the deep end without any form of help. Instead, they include training sessions and manuals to help with those first steps. Likewise, applications must provide assistance to users, whether in the form of elearning, quizzes or other targeted coaching resources. Another aspect being borrowed from games is employee rewards. By meeting pre-defined goals and tracking progress, staff can receive recognition in the form of points or virtual achievements. This also provides a clear overview of who is using the corporate tools and to what extent. [easy-tweet tweet="Business apps must provide assistance to users, in the form of elearning, quizzes or other coaching resources"] Business applications have long had a reputation for being functional, but largely forgettable. However, organisations are now starting to realise that this approach can cause issues, particularly in the long term. By embracing gamification and other features associated with consumer software, business apps are starting to become far more engaging, both in terms of their design and utility. ### Managed Services - #CloudTalks with Daisy's Mark Scaife In this episode of #CloudTalks, Compare the Cloud speaks with Mark Scaife from Daisy about their cloud offerings including hosting services, business continuity, security services and more. #CloudTalks ### Why most retailers aren’t ready for Millennials 
We are moving toward a Millennial world - a world where the “real world” and online culture are continuously blending together, blurring the lines between on- and offline shopping and changing the shape of the retail landscape as we know it. [easy-tweet tweet="We are moving toward a #Millennial world - where the “real world” and online culture are blending together"] The implications this Millennial world (individuals between the ages of 18 and 35) brings for retailers are significant. Not only do retailers need to be up to speed on the basic technologies and platforms available for them to interact with their Millennial customers, but they also need to join them together seamlessly from the customer’s point of view. And, whilst many retailers may have made advances in their multichannel experience, there is still some way to go to close the gap between Millennials’ expectations for a consistent experience across both physical and digital retail outlets, and the experience retailers are currently offering. This article explores those gaps and looks at other emerging opportunities that online retailers can capitalise on to drive omni-channel engagement and ultimately growth. Mind the consistency gap! The shopping preferences of Millennials are very different from those of past generations: there is no longer such a thing as an offline or an online journey to purchase, as every journey is a combination of both. This is a generation that is hungry for convenience so brands need to be able to connect the dots between supplier, location, customer and product information in real-time, anytime. For Millennials, a seamless shopping experience means the ability of a retailer to deliver a consistently personalised experience for each individual customer, at every touch point – anytime and anywhere. To this end, retailers need to be able to have a single “conversation” with their customers - not one that changes from smartphone to PC to physical store. Failure to meet these needs may mean that the customer will turn to a competitor who can meet them. The good news is that, compared to last year, online retailers have made progress in offering a consistent shopping experience with 83 per cent of online retailers now offering a device agnostic solution and specifically adapting the design and functionality of their sites per channel. However, in a recent survey, amongst the 86 per cent of retailers who claimed to sell through multiple channels, one in two admitted that there was some degree of inconsistency in what they offer to customers across their different online channels. Delivering a Millennial-ready checkout experience There are many facets to creating a seamless online retail experience but one element that is regularly overlooked comes towards the end of the customer’s shopping journey and is in fact one of the main reasons for cart abandonment. Recent research has revealed that 58 per cent of British online shoppers have abandoned a shopping cart due to payment issues in the last 12 months. Poorly designed user interfaces may translate into lost customers, as this research highlighted, with just under a third (32 per cent) of shoppers abandoning their cart due to a poor and cumbersome checkout process. Another one in four shoppers (24 per cent) did not complete their purchases because they had security concerns about the site. 
To increase their chances of online success, retailers may also consider an omni-payment strategy to keep Millennial customers happy. The front-end of a website may be beautifully designed, with a vast selection of merchandise, but if Millennials cannot pay how they want to, products will remain unsold. In fact, one in five shoppers have abandoned a purchase because the online retailer did not offer their preferred payment method. Tellingly, the checkout experience is also a deciding factor in terms of repeat visits and ensuring customer loyalty, with 84 per cent of British online shoppers saying a poor online checkout experience would be enough to stop them returning to a particular company's website or app in the future. A simple and efficient checkout process could help to convert frequent online shoppers into repeat and loyal customers. Indeed, 86 per cent of weekly shoppers said this applies to them. Investing for the future The good news for the retail industry is that many retailers are now working on forging a convincing online presence and are recognising the growing importance of other platforms, such as mobile and social media. In fact, 96 per cent of online retailers are planning to invest in some area of ecommerce in the next 12 months. One front-runner in terms of investment priorities for online retailers may be to make better use of social media (38 per cent). This is not surprising as social media is prevalent on mobile devices. In addition, one in five online retailers plan to invest in optimising payments for smart technology (18 per cent), adding new payment methods (19 per cent) and creating a simplified payment process (19 per cent). [easy-tweet tweet="Retailers may consider an omni-payment strategy to keep #Millennial customers happy"] Final thoughts Although retailers have clearly made headway in offering shoppers a consistent cross-channel experience, becoming fully omni-channel remains a work in progress. This is partly due to retail not yet catching up with shopping habits that continue to evolve rapidly, and partly because new technology continues to be developed apace, increasing the possibilities for consumers and retailers alike, both online and in-store. The Millennial generation represents the future of ecommerce, and as this new generation becomes more commercially significant, online retailers must recognise that they will have to continually adapt their strategies to remain relevant to an audience that is increasingly mobile, visual, social and collaborative. ### Integrating and managing multiple cloud services The optimum cloud solution for most organisations is likely to be hybrid - a mix of in-house, third party and managed cloud services based on cost, SLAs and whether the organisation has the in-house skills to manage a particular service. However, this creates the new challenge of managing multiple cloud services from different providers. [easy-tweet tweet="Once an organisation moves to the #cloud, it needs to actively manage its portfolio and monitor performance"] The skills needed to prosper in this new world are much more service-orientated than technically driven. IT teams need to be fully conversant with service level management, strong in IT contract negotiation and development, have a good understanding of risk awareness and their organisation’s appetite for risk and take a sound approach to supplier relationships. And it can get even more complex. 
When using multiple providers running cloud-based services, the organisation also has to consider inter-supplier relationships, supplier dependencies (on each other as well as the organisation’s dependence on them), inter-supplier SLAs and back-to-back SLA management. Doing this effectively requires two things: a service management approach, and the right tools to manage multiple cloud providers. Taking a service management approach The first step in managing multiple cloud providers is to clarify the objectives for each cloud service. Why are you moving a particular service to cloud? What are the measurable indicators of success? How will this model complement your business strategy and goals? The next step is to decide on a procurement strategy, which could be a long-term or short-term partnership with the chosen providers. It is vital to think about how to ensure suppliers commit to a partnership approach with each other. The business needs to challenge its suppliers to work closely together, implement efficiencies and nurture a culture of continual improvement where the expectation is that it will happen and that each and every supplier will participate in the process. We recommend a service management approach: either a light touch which allows the suppliers to deliver the contracted services while your organisation concentrates on the business improvements, or a more structured approach to ensure supplier management at a more granular level. Monitoring multiple suppliers from a single pane of glass Once an organisation starts moving services to cloud, it needs to actively manage its cloud portfolio and monitor performance against the agreed SLAs to ensure it is receiving the contracted service. Organisations whose IT service is now dependent on multiple cloud and other external suppliers will want to know: How well are my service providers performing against contractually agreed SLAs? If they are not performing, where is the problem? This is particularly important where multiple providers are responsible for elements of the IT service. Is the aggregated service delivering suitable performance to our user community? This is leading to a growth in new services (Cloud Monitoring as a Service, or CMaaS) to monitor the performance of multiple suppliers, all of whom will claim “it’s not their fault” when a problem arises. These services aim to provide organisations with full visibility of how well both each individual provider and the overall IT service are performing. An effective monitoring service should pull together service availability and other performance information. It should have the ability to carry out synthetic transactions against defined services and applications, show overall system health, and monitor response times and latency. It should also consolidate events and other performance statistics across the IT supply chain. Ideally, an organisation would like to be able to carry out this monitoring from a single pane of glass. Fordway has developed a solution with its CMaaS service, taking publicly available information from cloud providers such as Amazon CloudWatch and using the AWS integration tool plug-in for the company’s toolset, which is based on Microsoft System Center. The same can be done for other public cloud services such as Azure, Office 365, ServiceNow and Salesforce. Network monitoring is also provided and the results integrated into event correlation and then displayed on custom HTML5 dashboards which offer policy-based SLA measurement. 
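To make the provider-monitoring idea more concrete, here is a minimal sketch, assuming an AWS estate fronted by a classic Elastic Load Balancer, of pulling a latency metric from Amazon CloudWatch and checking it against an SLA target. The load balancer name, region and two-second threshold are illustrative assumptions, not details of Fordway's CMaaS service.

```python
import boto3
from datetime import datetime, timedelta, timezone

# Illustrative assumptions - substitute your own load balancer name, region and target.
LOAD_BALANCER = "example-elb"
REGION = "eu-west-1"
SLA_LATENCY_SECONDS = 2.0  # assumed contractual response-time target

cloudwatch = boto3.client("cloudwatch", region_name=REGION)
end = datetime.now(timezone.utc)
start = end - timedelta(hours=1)

# Pull the average back-end latency reported by CloudWatch over the last hour.
stats = cloudwatch.get_metric_statistics(
    Namespace="AWS/ELB",
    MetricName="Latency",
    Dimensions=[{"Name": "LoadBalancerName", "Value": LOAD_BALANCER}],
    StartTime=start,
    EndTime=end,
    Period=300,              # one datapoint per five minutes
    Statistics=["Average"],
)

# Flag any five-minute window that misses the assumed SLA target.
for point in sorted(stats["Datapoints"], key=lambda p: p["Timestamp"]):
    status = "SLA breach" if point["Average"] > SLA_LATENCY_SECONDS else "within SLA"
    print(f"{point['Timestamp']:%H:%M} average latency {point['Average']:.3f}s -> {status}")
```

In a full CMaaS toolchain, readings like these would feed event correlation and dashboards rather than being printed, but the underlying provider data is the same.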
Where contractually allowable, agents and gateways can also be placed inside a service provider’s infrastructure to provide more detailed information. The service should provide an overview and enable an organisation to drill down into each of the elements to identify where an issue resides and the potential causes of performance issues. Services such as this can also be used to monitor traditional IT services such as in-house environments, plus hosted and private cloud services where agents can be deployed or gateways installed into the monitored environment. Cloud management provides a single point of contact Cloud management might on first thought be perceived as something that is not required for cloud services, as they are all designed to be commodity services, primarily with user self-service through web portals. However, most organisations prefer, and in many cases need, a human voice and face plus organisation-specific information from their services. Additionally, there may be several cloud providers who collectively provide your IT service. Thus we are seeing the introduction of cloud management services which provide service integration, management and monitoring for all cloud services contracted by an organisation. They offer major incident and problem management, with escalation to third parties if required, and may also include asset management of devices and infrastructure. The features to look for in a third-party cloud management service include:

- Customisable services and reporting
- Cross-supplier service consolidation and reporting against defined service levels
- Independent review and reporting on third-party supplier performance
- Service on-boarding and service management for multiple partners
- The ability to work with other organisations, e.g. network providers for WAN links
- 24 x 7 monitoring and support

[easy-tweet tweet="A service management approach will make it straightforward to manage multiple #cloud suppliers"] Cloud offers significant advantages but it also brings new challenges to IT teams. A service management approach and choice of the right tools will make it straightforward to manage multiple suppliers while focusing the majority of IT resources on business priorities. For more information visit www.Fordway.com ### VoIP & Cloud Telephony - #CloudTalks with NFON's Tamsin Deutrom-Yue In this episode of #CloudTalks, Compare the Cloud speaks with Tamsin Deutrom-Yue from NFON about how their cloud telephony solutions are empowering businesses all over Europe. #CloudTalks ### Cloud computing for startups Startups are playing an increasingly important role in the national economy. According to figures analysed by the Centre for Entrepreneurs, 608,100 new businesses were formed in 2015, an increase of 4.6 per cent compared with the previous year. [easy-tweet tweet="As #startups often do not have the financial clout of their larger competitors, keeping costs down can be vital"] However, it has also been reported that as many as 90 per cent of startups fail, with spiralling costs usually the main culprit. According to CB Insights, 29 per cent of startups cite a lack of cash as the reason for failure, with 18 per cent blaming pricing and cost issues and 8 per cent stating that a lack of investor or financing interest led to their demise. 
Given that startups often do not have the financial clout of their larger competitors, keeping costs down can be the difference between success and failure. One of the ways in which startups are keeping their budgets in check while still making use of innovative digital technologies is by embracing cloud computing. New businesses are adopting the full range of cloud solutions, whether that’s infrastructure as a service (IaaS), platform as a service (PaaS), software as a service (SaaS) or any other managed service. Crucially, the cloud’s subscription payment model makes it far more affordable for businesses than traditional IT solutions, where upfront CAPEX costs can prove prohibitive. Take, for example, the way that cloud computing is helping startups to keep their resources more secure. Companies dealing with large amounts of data are also tasked with keeping this data secure. Perhaps they have the budget to build their own private data centre and invest in a wide range of digital and physical safeguards, but perhaps they do not. In this situation, cloud computing can provide the solution. For a subscription payment, startups can access a private cloud and a host of security options from their cloud vendor. However, it is important to note that startups are using cloud computing for reasons that extend beyond finance. Scalability is another key benefit, particularly amongst businesses that are expanding at a rapid rate. With cloud computing, IT resources are owned and managed by a third-party vendor and then leased amongst their subscribers. This means that if a startup needs more bandwidth or computational power, they simply have to request it from their vendor and perhaps pay an amended subscription fee (a short scripted sketch of this appears at the end of this article). With traditional IT solutions, scaling up and down is much more challenging as in-house infrastructure may need to be overhauled, often at great cost. What’s more, the cloud enables startups to compete with more established players by giving them access to the expertise and resources of their chosen cloud vendor. For example, with on-premise solutions startups would be tasked with the maintenance, security and administration of all their IT solutions, which may be challenging for firms that do not have in-house IT teams or large budgets. Conversely, businesses that use cloud services will have access to 24/7 managed support from their supplier. This has meant that a wide range of software and hardware is now available to startups, which previously would have been impossible to manage internally. [easy-tweet tweet="The #cloud lets startups compete with more established players - by accessing their vendor's resources" hashtags="zsah"] Having experienced life as a startup ourselves, zsah is well aware of the challenges facing young businesses and so we have worked closely with a number of entrepreneurs to create the zsah Cloud Startup Program. By qualifying for the scheme, you’ll gain access to powerful cloud infrastructure, rapid scalability, 24/7 managed support and free credit for up to one year. There are many hurdles that startups have to overcome if they are to achieve long-term success. Fortunately, the rise of cloud computing has meant that accessing vital IT resources is no longer one of them. 
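As a rough illustration of the scalability point above, the sketch below shows what "requesting more computational power" might look like for a startup running on AWS, assuming it already uses an Auto Scaling group; the group name and region are hypothetical, and this is a generic example rather than part of the zsah programme.

```python
import boto3

# Hypothetical Auto Scaling group name and region, for illustration only.
GROUP_NAME = "startup-web-asg"
autoscaling = boto3.client("autoscaling", region_name="eu-west-1")

# Look up the group's current desired capacity...
groups = autoscaling.describe_auto_scaling_groups(AutoScalingGroupNames=[GROUP_NAME])
current = groups["AutoScalingGroups"][0]["DesiredCapacity"]

# ...and ask the provider for two more instances to absorb a traffic spike.
autoscaling.set_desired_capacity(
    AutoScalingGroupName=GROUP_NAME,
    DesiredCapacity=current + 2,
    HonorCooldown=True,
)
print(f"Requested capacity increase from {current} to {current + 2} instances")
```

Scaling back down when demand drops is the mirror image of the same call, which is what makes the pay-as-you-go model workable for small teams with no spare hardware.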
### Cloud BPM and the anarchy of innovation Almost universally, innovation is regarded as a positive force. How many businesses – particularly in the technology space – now have a Chief Innovation Officer, whose job it is to originate new ideas and to recognise and foster innovative ideas generated by others, as an essential part of their C-suite? [easy-tweet tweet="Introducing innovation will inevitably result in change and quite possibly conflict" hashtags="Cloud"] Anarchy, on the other hand, is generally associated with destruction, and conjures up images of insurrection, disorder, chaos, and mayhem. So what do I mean by “the anarchy of innovation”? The reality is that innovation means change.  Introducing innovation – even if it’s little “i” innovation in the form of new ideas, changes, fixes – particularly if on the fly – will inevitably result in change and quite possibly conflict. And there’s no question big “I” innovation in the form of a major business transformation will drive massive change. Innovation can break repeatability and potentially result in the delivery of a different experience for users – often not regarded as a good thing since users tend to like, expect, and even value a consistent experience. While some would suggest that innovation is in direct conflict with and can be crushed by over-controlled, process-driven cultures, business process improvement doesn’t have to be like that. In fact, with the capabilities that modern cloud-based BPM platforms bring to the table, it really shouldn’t be. Imagine for a moment a process-driven culture. Did you picture grey people with a grey culture, constrained by pages and pages of roles and procedures? That is a mindset born of outdated, legacy approaches to BPM that results in a perception of process as a separate documentation and audit activity, rather than one that sees process as a fundamental component of doing business. Instead, process knowledge needs to keep up with an organisation’s own creativity. That means not only keeping up with how we truly do things, but actually sharing the ownership of know-how with the right people in the business so that they are aware of this knowledge asset, and can challenge it, debate it, revise it, and, most importantly, use it. Moving from a legacy on-site BPM system (or even Word document based method) to a cloud-based BPM platform provides a unique opportunity to make processes dynamic, easily accessible, and easy to integrate into an employee’s day to day activities. Innovative cloud-based BPM solutions also make it easier for teams to truly collaborate around process improvement and to innovate. Let me provide a case in point from a remote part of Australia. From 2010 to 2013, the City of Karratha was transformed from a quiet backwater into a booming metro area as families flocked to the area to make the most of its mining resources. That had a definite upside, as Karratha experienced significant investment in both private and public infrastructure projects. But the speed of that growth posed some serious challenges to Karratha’s city council. Anxious to serve its expanding population, the city council added new staff, many of whom had never worked in the public sector and, as a result, were unfamiliar with public sector responsibilities. 
The council itself was also on a steep learning curve, adding new services with which even experienced staffers were unfamiliar. Recognising that many existing procedures were incorrect, out of date, inefficient, or worse, non-compliant for local government operations, Karratha employed Promapp cloud-based software to consolidate processes, improve procedures and documentation across the organisation, and make sure staff were following approved methods. The software also addressed gaps in education and training. While disrupting previously engrained processes and procedures, the process improvement initiative enabled staff across the entire organisation to tap into up-to-date, centralised process management information, making it significantly easier for city departments to work together collaboratively, sharing information and understanding the various workflow impacts on the entire organisation. Even processes that seemed unnecessarily detailed were dubbed “a really useful exercise.” For example, if a key person was absent from work, the documented processes made it relatively easy for someone to step into their shoes and continue moving everything forward. Putting links, forms, screenshots, and guidelines into the processes gave users everything they needed to complete the task at hand. Moreover, by creating and documenting processes within the BPM software, the quality of the processes themselves improved. The software’s flexibility and cloud-based interface allowed users to add relevant information and important links to websites. It also facilitated co-design of processes, allowing for input from individuals within the organisation. The result is processes that aren’t solely aimed at compliance, but also suit the way in which people want to work. This not only creates buy-in from the users, but improves everyone’s understanding of how things work and, more importantly, why. In short, Karratha’s city council introduced innovative changes without anarchy. Clearly, innovation can cause anarchy, but only if it isn’t properly managed. By establishing a platform for people to engage with process and make the case for change, innovation can be fostered. A voice needs to be given to those working within the process, allowing them to share their insights and ingenuity. That platform should facilitate the coordination and integration of changes and improvements. Cloud-based BPM software is the ideal platform for implementing these kinds of innovative process changes. Ultimately, it is all about harnessing innovation, rather than stamping it out. [easy-tweet tweet="Moving from a legacy on-site BPM system to a cloud-based platform lets businesses make processes dynamic"] Can it be done? Absolutely. First, though, you need to change the way the organisation looks at processes. Only then can you truly change the way you look at innovation and fully realise the positive benefits that innovation can deliver. ### The rise and rise of workplace analytics Advanced, invisible, predictive, prescriptive, real-time, collaborative and pervasive - data analytics now comes in many guises, all of which expand the narrative significantly from its big data focus. [easy-tweet tweet="Data #analytics comes in many guises, all of which expand the narrative significantly from its big data focus"] Gaining particular traction is the latter, which as the name bears out, describes how analytics can be fully integrated within a business, capable of penetrating all strands of an organisation. 
It’s a level of immersion which is now recognised as a prerequisite if optimal value is to be derived from the data at our disposal to drive a competitive advantage. Achieving this has become the Holy Grail of any progressive business, a logical progression fuelled by a growing recognition that to reap the full benefits, more people within an organisation need to be in on the act. As the usage of analytics expands from its original purpose around operational reporting to become a core tool to better understand the full customer journey and be deeply and invisibly embedded across the full gamut of daily processes and woven into popular applications, jobs and tasks become quicker, easier and more efficient with learning and insight more easily shared. Getting there demands the right platform that can facilitate a more intuitive management of the huge and static volumes which define big data and shift the focus from the mounds of intelligence per se, to exactly how it is collected and harnessed. As information floods in from social media and the IoT, across customer, market and sensor data and operational financial information, the mission to distribute it to the right people at the right time has spawned a range of more accessible analytic platforms which have transformed a traditionally complex process that once relied on stand-alone, costly and complex BI reporting tools. TIBCO’s Fast Data platform is one such example of this reform, bringing analytics much closer to the data sources and in turn putting an end to the sequential processes that saw data collected and assembled with the analysis and action happening at a later point. Here, actionable data is generated, integrated and analysed to inform and coordinate in one hit so that the right commands are executed across the multitude of complex processes with unprecedented speed, agility and reliability in real time. It’s a speed and ease compounded by the rise of self service solutions which have brought analytics to a far wider audience and has been intrinsic to achieving greater accessibility to the latest information. Driven in part by the explosion of the cloud, this approach has changed the way that business intelligence tools are accessed, all of which has made analytics cheaper, accessible and straightforward, so it is no longer the exclusive domain of experts and data scientists. By equipping every individual across an organisation with the materials to drive the business metrics through the use of data, the opportunities for growth, competitive advantage and enhanced business value expand vastly. At the same time comes the potential to learn from historic data and apply that intelligence to real time data to influence business outcomes. We’re seeing how this can empower people across all strands of the business. Front line analytics may sound like a gimmick or something firmly in the future, but it’s already a reality, as those on the floor in the retail environment, for example, are now empowered to deliver better customer service and respond in real time to customer intelligence on purchasing history, habits and product availability on mobile devices. As a result, they can tailor the offer to personalise the customer experience and fully exploit the opportunity. 
That way, the critical business moment that presents itself can be seized right away - something increasingly necessary across a wide range of industries. At the other end of the spectrum, those heading up the technology are now far more visible within the wider business, illustrated by the growing prominence of the Chief Analytics Officer. While the role may still not be as mainstream as the usual C-suite counterparts, momentum is building. Along with the recognition of its strategic importance as a digital pioneer comes a mandate to improve operations and identify future growth opportunities. Increasingly, it is a role that sees greater collaboration with the CIO, to empower the business to better leverage BI by making agile analytics pervasive and allowing access to predictive analytics at every point of customer engagement. As well as driving new levels of innovation and business performance, this also helps to promote the buy-in and engagement needed for optimal success. [easy-tweet tweet="The way that business intelligence tools are accessed has changed, making analytics cheaper and more accessible"] As digital transformation and the move to digital business gather momentum, pervasive analytics will be a key enabler in making this a reality. ### DevOps & Digital Transformation - #CloudTalks with Rackspace's Mike Bainbridge In this episode of #CloudTalks, Compare the Cloud speaks with Mike Bainbridge from Rackspace about the true meaning of digital transformation. #CloudTalks ### Cloud convergence and the state of play for smaller players
As the cloud market matures, the roles of tech players are transforming rapidly. Roles established in the old ‘provide-it-yourself’ IT market are rapidly becoming less relevant in the cloud market. Some global vendors - Google, Amazon and Microsoft - were far-sighted enough to create strategies to become core players in the new era of cloud by becoming ‘hyperscale’ cloud vendors. Using their enormous economies of scale, they now provide the overwhelming majority of cloud-based compute and storage for cloud solutions – Infrastructure as a Cloud Service. What this means is that previously separate vendor roles like hardware sales, software sales, ‘hosting’ and managed service providers have lost their place in the market, with role convergence fast becoming a defining feature of the emerging cloud market. [easy-tweet tweet="Some global vendors were far-sighted enough to create strategies to become ‘hyperscale’ #cloud vendors"] With more and more of our cloud-based computing power being supplied by a very small group of hyperscale vendors, are we headed towards a completely homogenised marketplace with little choice or does this create new opportunities for the fantastic breed of smaller market players that have emerged in recent years? Where the market is today – and tomorrow The appetite in business to continue the provide-it-yourself IT model is declining rapidly – it has too many disadvantages. Major providers have played to their strengths, building hyper-scale cloud infrastructures and individual cloud services. But does this mean they have cornered the cloud market? The answer is yes and no. Yes, they’re on track to own much of the global Infrastructure-as-a-Service (IaaS) market but these are only the basic ingredients of cloud solutions. Today’s businesses want to move completely away from the provide-it-yourself model for IT and move to a ‘consumption’ model – simply using the business tools they need to run the business and improve their competitive capability but not having to build, operate and manage it for themselves. IaaS, even at hyperscale, does little on its own to enable businesses to do this; it simply shifts the compute and storage to the cloud. What’s more, larger players know that it’s an impractical business model for them to try and build complete, bespoke cloud solutions for, say, companies of between 250 and 1000 seats, which represent the overwhelming majority of companies with sophisticated IT requirements. How smaller cloud providers compete It’s self-evident that smaller businesses can’t compete with our hyperscale peers in the provision of the raw ingredients of cloud solutions - cloud-based infrastructure - we simply can’t bring the massive economies of scale to bear. So, the starting point for smaller players is not to try and compete with the hyperscale vendors on their core offering - IaaS. The opportunity for smaller providers is their proximity to and understanding of customer needs. Smaller players are much more able to be agile, flexible and responsive to customer requirements than hyperscalers. Therefore, for smaller, innovative businesses, it’s about optimising the fantastic opportunities that the hyperscalers have created through the provision of high quality cloud infrastructures, available at accessible cost.  
Using these high-quality, reasonably priced raw ingredients, smaller players - Cloud Solution Providers – can design, provision, build, as well as manage, operate and support complete, sometimes complex, cloud solutions so that all customer businesses have to do is consume what they need to run their business. In this way, smaller players can transform the hyperscalers’ raw ingredients into a complete, immediately consumable solution that adds substantial value for customer businesses. What does this look like in real terms? A genuine ecosystem with smaller providers working in a symbiotic relationship with hyperscale organisations. Customers benefit from the advantages of large providers - high quality and best price - combined with the skill, experience and agility of smaller players with really innovative, customer-focused outcomes. Why go for a smaller player? In the future, the raw cloud ingredients are likely to be delivered by four to five hyper-scale cloud vendors who compete hard to deliver fundamental cloud solutions at the lowest possible price. The new breed of service providers, Cloud Solution Providers, will then transform these into complete solutions totally in sync with customer needs, managing and operating these via a comprehensive support platform. For an organisation looking for a new cloud partner, the benefits of teaming up with the smaller players are clear. There’s the simple fact that, to a smaller player, the customer will be far more important than they would ever be to a hyperscale provider. Smaller companies will be far more concerned about customer service, and react more attentively to customer needs than hyper-scale vendors will ever be capable of achieving. They also tend to specialise in key verticals so can bring that expertise with them too. [easy-tweet tweet="Smaller companies will be more concerned about customer service than hyper-scale vendors" hashtags="Cloud"] Time for a shift The provide-it-yourself market is dying rapidly and the cloud services market is growing at an astounding pace. Hyperscale cloud providers are a reality within this new ecosystem and there is a rapidly growing group of small, innovative pioneering Cloud Solution Providers that have already found ways of not just co-existing but thriving together. It’s a critical point for the market, regardless of size. Because cloud is a very different, consumption-of-complete-services model, we’re seeing that huge shift in the IT market. Adaptability is what will separate winners from losers here. Forward-thinking providers of all sizes need to work together to re-shape their businesses and respond to these changes to thrive in this converged space. ### A disaster recovery plan in the cloud More and more businesses are moving their systems from under the stairs or the back cupboard to a place in the cloud – which in most instances dramatically improves security, reliability and availability. But as organisations seek different and better ways to protect their critical data, they need to be aware of a number of key criteria to ensure businesses are ‘cloud-ready’ before incorporating it into disaster recovery plans. Disaster is always the situation that companies aren’t expecting – or want to think about. While there is no definitive answer that will solve all fiascos, there are ways to avoid being caught out. [easy-tweet tweet="Businesses must realise that there is a major difference between Business Continuity and Disaster Recovery"] Disaster Recovery vs. 
Business Continuity Businesses first need to realise that there is a significant difference between Business Continuity and Disaster Recovery when considering their options within the cloud. Understanding this difference, and therefore the end goal, impacts the most suitable use of the cloud. Disaster Recovery is a process that is enacted in usually more drastic scenarios and can take hours or even days to take effect, whereas Business Continuity is the ability of an application or service to continue or return to service within minutes or seconds. Moving a service from self-managed and on-premise to a professionally-run managed service will normally intrinsically improve the organisation’s Business Continuity capabilities – almost without planning to. This improvement is a by-product of the service provider having to consider the implications of service outages while maintaining adherence to SLAs. Meanwhile, more deliberate efforts are required to improve Disaster Recovery effectiveness. It involves active collaboration between the business and the service provider, which may involve replicating the service in another location in case of a fundamental disaster. Migrating to the cloud Some companies use cloud solely as a backup option and on a pay-as-you-go scheme. Setting a server up to run systems from the cloud on an “if needed” basis prepares businesses for a number of data eventualities. It is a reliable, cheap and secure disaster recovery strategy. It is also its distributed and offsite nature that makes the cloud a prime candidate for taking a pivotal role in business continuity and more intense disaster recovery planning. But as good as the cloud is for these purposes, it actually adds another element of risk to the equation that must be mitigated first. When considering migration to the cloud, the issue of data sovereignty inherently comes to the fore. There are a number of rules surrounding data protection in the EU, and the good news is that they cover the whole of the region. Unfortunately, the European Court of Justice has invalidated the Safe Harbour scheme between the EU and the US, which is set to be replaced with a new Privacy Shield framework. In the interim, most organisations are relying on a mutual non-disclosure agreement (NDA) between themselves and anyone else who gets their hands on their data under the EU agreement. This mutual NDA is essential, particularly if you are using an American company to host your cloud systems and data. Then there is encryption. Who has the keys to read your data? The more secure you make it, the harder it is to use. Several people must be aware of how to access the encrypted data to ensure that it’s effectively used day-to-day, and in the case of a disaster. Finally, whenever business-critical applications and data are moved offsite, the network connection not only comes under greater strain, but its reliability becomes critical. In fact, with cloud as part of a disaster recovery strategy, your network becomes the most critical single component of your organisation’s entire business continuity infrastructure. Build a great cloud on a poor network and you may as well not bother. Is your network ready for the cloud? So what are the key components of a cloud-ready network? 
We consider a network to be cloud-ready if it incorporates the following aspects:

1 - A secure and scalable converged network that is centrally managed. The cloud relies on a robust network infrastructure that is dynamic, secure, scalable, distributed and can be monitored and managed centrally. The network should also be a converged network, i.e. all types of traffic – voice, data, video and mobile – should be managed efficiently within the same network.

2 - Built-in scalability and redundancy. Scalability to manage the peaks in demand that come from using the cloud, and redundancy in order to keep the lights on if one aspect of the network falls down. A network that is built in this way will also operate on a range of different access technologies (DSL, fibre, Wi-Fi, cellular etc.) in order to further minimise risk.

3 - A virtualised network. With virtualisation, a single “best-of-breed” IP network can be built on a range of different technologies from different suppliers, enabling the organisation to choose the best components to support its business continuity requirements. It facilitates convergence and flexibility, and since you no longer physically own the network, it enables you to shift an even greater proportion of IT costs to the more flexible OpEx model commonly used among cloud suppliers. Lastly, with a virtualisation layer unifying all technical aspects of the network, application developers only have to develop for one infrastructure.

4 - Application performance monitoring. Cloud applications are useless if their performance is so poor that it slows everyone down. This not only damages productivity, but may encourage employees to bypass the cloud altogether and return to on-premises applications that are outside of the business continuity infrastructure. The network must therefore be able to monitor the performance of all the applications running on it, so that they can be optimised differentially, with priority attached to critical application flows accordingly.

Building the right foundations

Moving critical applications to the cloud without a strong and flexible network in place is like building a house with no foundations. Preparing for every disaster is not possible, but attempting to do so could prevent you being caught out by something that is screamingly obvious. If enterprises fail to ensure their own corporate networks have the right levels of bandwidth, flexibility, latency and security, then they will at best never see the benefits of the cloud and at worst open up their business to significantly greater operational risk than if they’d remained on premise.

### Hybrid IT: Critical to long-term business success

We are in the midst of a once-in-a-decade shift in the technology landscape. Nowadays, end-users expect nothing less than a seamless consumption experience, keeping them blissfully unaware of the changes happening in their IT environments. In order to meet and exceed this expectation, IT infrastructures are having to evolve in an era dominated by hybrid IT: migrating some infrastructure services to the cloud whilst continuing to host critical services on-premises.
[easy-tweet tweet="In order to meet expectations, IT infrastructures are having to evolve in an era dominated by hybrid IT"] A recent SolarWinds report, IT Trends Report 2016: The Hybrid IT Evolution, revealed that 90 per cent of IT professionals say that cloud technologies are important to their organisations’ long-term business success. With improved flexibility and agility, infrastructure cost reduction, as well as eliminating the need for maintenance of an underlying infrastructure, it’s clear that adopting a hybrid IT infrastructure, driven by cloud services integration and delivery, is essential for future business success. Hybrid IT – worth the effort When working in a hybrid environment, 86 per cent of respondents in the SolarWinds study listed ‘increased infrastructure flexibility/agility’ as the top five rated benefits. Having the choice between leveraging on-premises or cloud services, IT professionals will have the ability to consider workload resource distribution, security and performance needs, amongst other things. Along these lines, a hybrid IT approach can also deliver greater cost efficiency, with 70 per cent of IT professionals listing it as the second highest benefit. With hybrid IT, the IT department is able to better align with business needs and choose specific services for workload allocation. Is it cheaper to store this data or run this application on-premises or in the cloud? Finally, ‘eliminating responsibility for maintenance of underlying infrastructure’ was named as the third biggest benefit (67 per cent), indicating that hybrid IT provides IT professionals with less technical debt to deal with, which translates into more time to qualify and quantify best-of-breed services. It is undeniable that hybrid IT holds plenty of benefits to the IT team, as well as the business as a whole, but what are the barriers to adoption? Cloud will not be suitable for all workloads anytime in the foreseeable future and, even if it were, few companies would convert all of their existing services and applications to run in the cloud The skillset to unlock hybrid IT’s potential As illustrated by the study’s findings, IT professionals need to arm themselves with a new set of skills, products and resources to succeed in the hybrid IT era. The number one barrier to adopting hybrid IT is security and compliance concerns (66 per cent). IT professionals can ensure the security of their IT infrastructure, both on- and off-premises, with hybrid IT monitoring and management tools. Security and compliance starts with awareness of potential threats and developing countermeasures. Nearly two thirds (59 per cent) of respondents indicated that one of their main challenges in moving towards a hybrid IT environment is the work it takes to get there. Although overcoming tech debt and tech inertia can be a time consuming and potentially difficult task, this is in reality a short term challenge, which will provide fruitful when it comes to the long-term benefits for the entire business. Finally, as when overcoming any challenge in life, the IT team needs to develop and improve key skills and knowledge in order to successfully manage hybrid IT environments. By learning about service-orientated architectures, automation, vendor management, application migration, distributed architectures, API ecosystems and hybrid IT monitoring and management tools, IT professionals will position their IT career for viability and longevity. 
Today’s IT professionals need to reach across traditional roles and become polymaths in order to be successful in the hybrid IT world.

The reality is hybrid IT

Cloud adoption is everywhere and used by nearly everyone in modern business, whether they know it or not. Cloud will not be suitable for all workloads anytime in the foreseeable future and, even if it were, few companies would convert all of their existing services and applications to run in the cloud. The resulting dynamic—one set of critical on-premises services connected with another set of services in the cloud—is hybrid IT. And at the centre of this evolution is the IT professional, who needs to ensure always-on performance of applications, devices, networks and systems—regardless of whose premises they sit on. In turn, they need to be empowered with executive support to gain the skills and tools required to properly manage hybrid IT environments. And if executed successfully, this will allow businesses to truly unlock the innovative potential of the cloud.

### Business Analytics - #CloudTalks with Assimil8's Karl Mullins

In this episode of #CloudTalks, Compare the Cloud speaks with Karl Mullins from Assimil8 about how their business analytics solutions utilise consulting expertise to generate the best results for their clients.

### Bringing order to DevOps

The benefits of using an Agile development methodology and DevOps for continuous deployment are well understood. Both are now key building blocks for modern enterprise software creation and delivery. With the rapid uptake of hybrid cloud, the infrastructure for software delivery is often dispersed geographically across multiple private and public data centres and cloud providers. Coupled with the fact that DevOps encourages developers, testers and operations to deploy infrastructure as they need it, this can lead to multifunction and multisite server and software deployments, which can be hard for those tasked with infrastructure management and security to control.

One emerging aspect of this burgeoning infrastructure and security management issue relates to networking. The practices currently emerging in networking are Network Function Virtualisation (NFV) and Software Defined Networking (SDN). NFV allows generic hardware to be configured via software as the type of network equipment required for a particular function. SDN moves the management of the network to a software-based central controller and can also abstract higher-level network functions from the underlying network hardware. Basically, these two practices deliver virtualisation of network functionality in much the same way that we have seen server and storage provision virtualised over the last decade. NFV and SDN make it easy to deploy network devices like load balancers, firewalls and other security infrastructure throughout the hybrid cloud networks that are now in use, especially on networks using Agile and DevOps, as it’s easy to incorporate deployment into automated workflows. In scenarios such as this it is vital for the security and consistency of the deployment and testing functions that virtualised networking components are deployed with appropriate policy configurations applied.
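As a rough sketch of what “deployment with appropriate policy configurations applied” can look like inside an automated workflow, the Python snippet below pushes a firewall rule to an SDN controller over a REST API, with the policy declared alongside the deployment. The controller URL, endpoint path and payload format are invented for illustration; real controllers each expose their own APIs, so treat this as a pattern rather than a recipe.

```python
import json
import urllib.request

# Assumed SDN controller endpoint - purely illustrative, not a real product API.
CONTROLLER_URL = "https://sdn-controller.example.local/api/v1/firewall-rules"
API_TOKEN = "REPLACE_ME"  # Would come from a secrets store in practice.

def deploy_firewall_rule(rule: dict) -> int:
    """Push one firewall rule to the (hypothetical) controller and return the HTTP status."""
    request = urllib.request.Request(
        CONTROLLER_URL,
        data=json.dumps(rule).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request, timeout=10) as response:
        return response.status

if __name__ == "__main__":
    # Policy travels with the deployment: only the test subnet may reach the new service.
    rule = {
        "name": "allow-test-subnet-to-app",
        "source": "10.20.0.0/24",
        "destination": "10.30.0.15/32",
        "port": 443,
        "action": "allow",
        "expires": "2016-06-30T00:00:00Z",  # Forces review and removal when no longer needed.
    }
    print("Controller responded with HTTP", deploy_firewall_rule(rule))
```

The expiry field in the example anticipates the point made next: virtualised components that are no longer required should be removed, not left running.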
From a security perspective it’s also vital that the correct permissions are applied, so that only those who should be using the network functions are granted access, and that the virtualised networking components are shut down and removed when no longer required. This last step is essential to reduce the targets for anyone attempting to compromise a network. Old components that are no longer used may not get updated with security changes and so can present a major vulnerability on a network.

The uptake of NFV and SDN virtualisation of networking functionality, which was previously delivered via dedicated appliances, means that there is a need to incorporate both into DevOps workflows. It is essential that there are management tools in place that can offer visibility, tracking and logging of NFV and SDN functions throughout the hybrid cloud. It is also essential that they deliver management templates, scripts, role-based access control and tools that administrators can use to configure these software devices consistently in a sometimes chaotic DevOps environment. Ideally, administrators should be able to ensure that anyone deploying a virtualised network function within a DevOps workflow anywhere throughout the hybrid cloud does so in a secure and consistent way. Administrators should also be able to get an overview of what is in use throughout, and be able to approve changes to standard configurations before they are exposed to external networks.

Most organisations that have adopted Agile development and DevOps practices have a set of configuration tools and in-house workflows that are based around standard management tools and APIs. For example, Microsoft PowerShell is increasingly popular as the scripting language of choice for automation. Also, communication with development and deployment tools via RESTful APIs is very common. Whichever underlying toolset powers a development and deployment workflow, the management tools for NFV and SDN that are adopted should be able to link into the same scripting tools and APIs. This allows the virtualised networking aspects of modern delivery workflows to be managed in the same way as other parts of a DevOps workflow.

Having a common set of management interfaces, and preferably a single management view of all the parts of the DevOps workflow, will make it much easier to ensure consistency across the hybrid cloud and to secure all parts of the development and deployment workflow. This will pay dividends over time, as everyone involved with the development and deployment of the software products will spend less time managing infrastructure and more time making the products better.

### The security challenges telecoms operators face in deploying SDN and how to overcome them

In the traditional model of networking, the main vectors that needed securing were the networking devices themselves and particularly any external management measures. With the SDN model, there are now additional layers.
These include the separation of the control plane from the data plane and the ability to be able to programmatically manage networks through an application front end. With functionality moved into these additional layers you now have to worry less about the network devices themselves as they typically just move packets. However, the devices that have flows pushed down, the SDN controllers, the application frontends and the APIs that allow the layers to interact with each other need to be rigorously assessed as flaws here could impact the entire network. Hybrid networks that still use more traditional networking devices, but support SDN protocols, will need to ensure the security assurance of the network devices as well, as more traditional attack surfaces will remain. Preventing attacks in a changing environment The first step is to identify the key parts of the existing infrastructure and where important assets are stored. Focusing on securing these and then architecting future changes with security in mind should help prevent issues arising. Traditional measures such as the use of firewall rules and access control lists will still be applicable and the use of SDN may allow more flexible application of firewall rules to hosts that move around the network. VLANs and containers can be rapidly configured and hosts placed into network zones that both restrict traffic and allow detection opportunities for inter-zone traffic. It may be possible to interface threat management and monitoring systems with the network controllers. Rather than alerting that a host is making a connection to a malicious IP address or domain, the firewall rules could be automatically modified to prevent the connection from going out at all when the Indicators of Compromise (IoC) feed is updated. Network routing protocols such as BGP could be reconfigured in the event of a volumetric attack such as a DDoS attack to black hole malicious traffic.  Using Out Of Band (OOB) channels for control traffic and the use of hardened bastion hosts or jump boxes will remain a key way of protecting administration interfaces. Security tools With the use of encrypted data channels, the monitoring of endpoints and logging information becomes increasingly necessary. Cloud services can be difficult to monitor and perform forensic activities on and so a focus should be on analysis of the logs those services make available. Although SSL encrypted traffic can prevent deep packet inspection, flow data showing which hosts were communicating can allow investigation to identify which hosts to analyse. A greater focus on endpoint monitoring can overcome some of the challenges presented. However, these solutions will be most effective if constantly monitored to determine baseline behaviour and maintained, rather than just relied on in the event of a breach. Cloud services can be difficult to monitor and perform forensic activities on and so a focus should be on analysis of the logs those services make available Cost effective implementation For telcos, implementing additional measures such as implementing OOB connectivity between data centre points of presence will be relatively cheaper than a business trying to achieve the same objective across an enterprise WAN at different offices. Many SDN solutions are open source, meaning a lower capex and allowing resources to be invested in the staff who will support and secure the equipment. The main cost will be building and growing the security capability for the organisation. 
This can be achieved either by recruiting and training an in-house team or through outsourced services. It is only with this capability, and effective processes, that an operator will be able to make efficient use of the security systems it invests in. Although a number of products exist on the market, as networks get more complex, organisations will find that skilled defenders, armed with an array of open source tools, become increasingly important.

The future

With increasingly dynamic systems, stringent change management and documentation processes will be required to maintain a current view of the landscape. Out-of-date records and documentation are already a common problem with more static infrastructure, and as it becomes more fluid this problem will grow exponentially if not brought under control. The measures used to secure and monitor an SDN environment are not that dissimilar to those used in a traditional environment. The main difference will be that, with the ability to programmatically manage the system, there will potentially be a larger landscape to secure in a faster changing environment. This difference also means that, if leveraged properly, it could also be easier to document, manage and maintain through automatic means.

Summary

There are now opportunities to take advantage of the ability to programmatically manage a network. It could be easier to manage systems in a manner that overcomes the issues of managing security in an increasingly dynamic landscape, such as in a cloud environment where virtualised hosts can be created and removed in different geographical locations to cope with changing demands. The challenge now is to evolve processes and culture to be able to keep pace with the constantly changing nature of threats and the risks they pose.

### Octo Telematics: Insurance Telematics – a key proof point for IoT

Earlier this week, Jonathan Hewett, CMO at Octo Telematics, gave a presentation on how insurance telematics is a proof point for the Internet of Things. The keynote presentation took place at the annual Smart IoT event in London’s ExCel centre. Octo Telematics is currently at the centre of a global IoT revolution, and so this event is the ideal platform to discuss the issues at hand and their impact on businesses. Telematics has been transforming auto insurance in recent years through behavioural, contextual and driving analytics. Octo Telematics now collects over 320,000 data points per minute through advanced driver technology. This comprehensive set of real-time driving data is then processed into actionable insights and intelligence, which enables insurers to carry out mission-critical processes such as pricing, crash and claims management, and developing their customer relationships. However, as devices begin to communicate with each other leveraging the Internet of Things (or IoT), the telematics market is set to explode beyond the automotive sector as we begin to collect more and more data about our lives. Insurers already know the benefits of monitoring the data from smoke detectors and similar devices, but this is only the surface of the potential for increasingly individualised policies. Insurance, then, is a market poised to benefit from IoT.
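To give a flavour of the kind of processing involved, here is a deliberately tiny Python sketch that turns a handful of raw driving events into a crude per-driver behaviour score. The event fields, sample values and scoring rule are invented for illustration and bear no relation to Octo’s actual data model or algorithms.

```python
from statistics import mean

# Invented sample of raw driving events; a real telematics feed would stream
# hundreds of thousands of data points per minute.
events = [
    {"driver": "d1", "speed_kmh": 62, "limit_kmh": 50, "harsh_brake": False},
    {"driver": "d1", "speed_kmh": 48, "limit_kmh": 50, "harsh_brake": True},
    {"driver": "d2", "speed_kmh": 38, "limit_kmh": 40, "harsh_brake": False},
]

def risk_score(driver: str) -> float:
    """Crude behavioural score: the share of a driver's events showing speeding or harsh braking."""
    own = [e for e in events if e["driver"] == driver]
    flags = [e["speed_kmh"] > e["limit_kmh"] or e["harsh_brake"] for e in own]
    return mean(1.0 if f else 0.0 for f in flags) if own else 0.0

for driver in ("d1", "d2"):
    print(driver, round(risk_score(driver), 2))  # d1 scores 1.0, d2 scores 0.0
```

A real pricing or claims pipeline would of course use far richer context than two flags, but the principle is the same: raw events in, a behavioural signal out.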
As new sources of data about the people and things we are insuring become available, insurers gain new opportunities to identify and manage risk. In many cases, consumers can be incentivised to play an active part by sharing data and, based on the insights from that data, modifying their behaviour to reduce risk. It’s a fantastic model for the insurance industry, one that offers greater transparency and value for all stakeholders. The SMART IoT London conference, held on April 12-13th 2016, has been established to help businesses unlock the potential of the Internet of Things. The two-day conference is focused on understanding the business opportunities enabled by smart products and new ecosystems. SMART IoT London is an annual conference that attracts over 20,000 professionals, making it the largest gathering of IoT expertise in the world. Visitors can gain the latest insights from IoT thought-leaders, practitioners and visionaries.

### The IT manager’s guide to securing your cloud – Infographic

Cloud technology is here to stay, as demonstrated by the fact that 71 per cent of businesses either use it already or plan to implement it in the near future. However, many of the companies who are resistant to the cloud cite security concerns as the primary reason, with a LinkedIn cloud security report revealing that 63 per cent of businesses have worried about the possibility of unauthorised access to their cloud. When you consider how easily hackers can gain unauthorised access to confidential data in the present day, you can understand why businesses would be concerned about the true privacy of data stored online. However, there are several actions that they can take in order to at least minimise the possibility of their cloud storage being hacked.

The first, and easiest, step is to choose strong passwords. It boggles the mind that people still use passwords containing their own name, a company name or, heaven forbid, the word ‘password’. This is the online equivalent of posting a sign on your front door saying ‘Dear burglars, the key is under the mat’. Use passwords that contain upper case letters, lower case letters, numbers and symbols. Also, activate two-factor verification, and it’s even advisable to enter false answers to security questions so that they’re all but impossible to guess.

After that, it’s essential to conduct an audit of the devices to which your cloud is connected, as well as the files within your cloud. Go through all of the applications and files in your cloud and remove any which you don’t need or no longer use. However, make sure that all important files are backed up elsewhere in case there is a security breach. Any particularly critical files should also be encrypted.

When it comes to selecting cloud storage providers, read through their terms and conditions to examine how committed they are to the security of your cloud. The temptation might be to skim over these, as they are lengthy, but then you might not spot that a provider is essentially giving itself permission to access your data and do what it wants with it. Steer clear of any providers like this, and also try to apportion your data between multiple providers so that, if one account is hacked, files stored in other accounts won’t be affected.
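For the “encrypt your most critical files” advice above, a minimal client-side approach might look like the following Python sketch, which uses the cryptography library’s Fernet recipe to encrypt a file locally before it ever reaches a cloud provider. The file names are placeholders, and key management (who holds the key, and where a recovery copy lives) remains the hard part.

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate the key once and store it somewhere safer than the cloud account itself
# (a password manager or key management service, with a copy in the DR plan).
key = Fernet.generate_key()
cipher = Fernet(key)

# Encrypt a critical file locally before uploading it to cloud storage.
with open("payroll-2016.xlsx", "rb") as source:
    encrypted = cipher.encrypt(source.read())

with open("payroll-2016.xlsx.enc", "wb") as target:
    target.write(encrypted)

# Decryption works the same way in reverse, provided the key is still available.
restored = cipher.decrypt(encrypted)
```

The provider then only ever sees ciphertext, which is exactly the trade-off the article describes: the more tightly the data is locked, the more deliberate you must be about who can unlock it day-to-day and during a disaster.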
[easy-tweet tweet="Ensuring your #cloud is secure will take time, but the effort will be worth it for the peace of mind it brings"] Ensuring that your cloud is absolutely secure will take time, but the effort will be worth it for the peace of mind that it brings. Among its main benefits are that two-factor verification is almost impossible to hack, cloud storage providers come with security certification and data can be accessed from another device if one device isn’t working. Also, data can be wiped remotely in the event of a device connected to your cloud being lost or stolen. ### Pivot3 acquires NexGen storage – Compare the Cloud looks at what this really means In February of this year Pivot3 announced the acquisition of NexGen storage – we look at why this could be significant for service provision and the software defined data centre. [easy-tweet tweet="Pivot3’s acquisition of NexGen sends a very strong statement to customers evaluating the #hyperconverged market"] Pivot3 provides Dynamic Hyper-Converged solutions, they’re very good at it with more than three times the patents of their nearest competitor and customers in more than 53 geographies.  They’re in a fiercely competitive market and analysts predict exponential growth in hyper-convergence so expect to see new entrants, consolidation and failures.  Indeed, the expectation is many of todays providers will be purchased or simply disappear due to lack of funding.  Pivot3’s acquisition of NexGen therefore sends a very strong statement to customers evaluating this market – that is they’re here for the long haul and in an early adopter market that’s significant as buyers need to place bets.  It therefore follows that providers making these sorts of investments are making a clear statement of their intent and long term credentials.  In other words they’re a safe bet – customers need providers to support them through this transition, not disappear due to lack of funding six months into the relationship. In terms of the technology itself Pivot3 is already serving customers globally with its brand of hyper-convergence - that is the ability to perform unified data centre tasks from x86 commodity servers using its software as the glue to tie compute, storage and networking bandwidth together.  So how does NexGen storage enhance this?  On the surface of it they provide PCIe Flash storage and have been very successful; we expect though there’s more to this than just storage alone. Pivot3 are in a fiercely competitive market and analysts predict exponential growth in hyper-convergence so expect to see new entrants, consolidation and failures Pivot3 has seen an opportunity to expand the traditional notion of hyper-convergence.  We expect them to migrate NexGen's software capabilities into their own operating system – one of these is Quality of Service (QoS), so IT no longer has to overprovision for peak workloads, but instead can use just-in-time dynamic provisioning of resources and scale as the business grows. This has relevance for service provision as it enhances the management of workloads in a multi-tenanted environment – service providers can guarantee a quality of service to their customers and we see this as a big step towards the adoption of hyper-convergence in this sector.  Unlike other storage arrays and hyper-converged systems that treat all data the same, Pivot3’s QoS governs performance targets, input/output (I/O) prioritisation and data placement, allowing customers to meet business service-level agreements. 
For example, an organisation serving the healthcare market can prioritise mission-critical hospital system applications over the internal Microsoft Exchange server, preventing a surge in employee email downloads from interrupting life-or-death health care system operations. This is a great example and the same concept is true in any service provision environment. Michael Frank, Manager IT Services Group, NCS Credit explains: “We categorise all of our applications as mission-critical, business-critical or non-critical.  We need our systems to be able to recognise that not all applications are created equal and to process them appropriately and differentially.” Ever-increasing user expectations and application demands determine that not all data should be treated the same and Pivot3 can certainly manage this now by taking their hyper-converged proposition to a level where service providers can deploy it not just as a platform, but a platform that enhances their own offering to customers, as well as complementing the overall shift to a software defined data centre. Finally, we also expect Pivot3 to leverage NexGen’s channel providing them with further footprint in geographies they already serve and an opportunity to expand into new regions.  It's not surprising then that Pivot3 made this move – not only does it send out a very clear statement regarding their intent and longevity in this market, it also significantly enhances their hyper-converged solution. Finally, the new routes to market provided by NexGen salesforce and the channel will also prove hugely beneficial.  [easy-tweet tweet="We expect Pivot3 to leverage NexGen’s channel to expand into new regions"] We started this blog saying why this acquisition could be significant for service provision and the software defined data centre – it's much more than that and the message this conveys in the early adopter market of hyper-convergence is a powerful one. ### Multi-tenancy is made for the arts What’s often missing from discussions about cloud in the arts is whether the underlying architecture is multi- or single-tenant. When you’re comparing cloud-based applications, you really need to understand the difference. [easy-tweet tweet="Differences between single and multi-tenant architecture have a huge impact on arts and entertainment companies"] Multi-tenant basically refers to a single instance of software used by multiple clients or ‘tenants’. In a single-tenant architecture, each tenant accesses a separate instance of the software. When a new tenant comes along, the supplier has to add a new, separate instance. Those distinctions have a huge impact on arts and entertainment companies when it comes to data, security and customer service. Data and analytics There is a perception that single-tenant is superior because it provides data isolation. The multiple customers (or tenants) on multi-tenant software are sharing a single instance of the application, and this increases the risk of data being shared with other tenants – or so the thinking goes. It isn’t true. There are many options for managing data in a multi-tenant environment which sit on a continuum between isolated databases and shared databases. Whether the data is secure in both single and multi-tenant environments comes down entirely to the quality of the software engineering. 
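As a small illustration of the shared-database end of that continuum, the Python sketch below shows the sort of tenant scoping a well-engineered multi-tenant application applies to every query, so that one tenant’s records can never surface in another tenant’s results. The table and tenant names are invented for the example; it is a pattern, not a description of any particular ticketing or CRM product.

```python
import sqlite3

# A single shared database with a tenant_id column on every table -
# one common way of isolating tenants without running separate instances.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE bookings (tenant_id TEXT, customer TEXT, amount REAL)")
db.executemany(
    "INSERT INTO bookings VALUES (?, ?, ?)",
    [("theatre_a", "Alice", 42.0), ("theatre_b", "Bob", 55.0)],
)

def bookings_for_tenant(tenant_id: str):
    """Every query is scoped to the calling tenant; there is no unscoped code path."""
    return db.execute(
        "SELECT customer, amount FROM bookings WHERE tenant_id = ?",
        (tenant_id,),
    ).fetchall()

print(bookings_for_tenant("theatre_a"))  # [('Alice', 42.0)] - theatre_b's data never appears
```

The isolation therefore lives in the engineering discipline (and in the tests that enforce it), which is precisely why the quality of that engineering, rather than single versus multi-tenancy itself, determines how safe the data is.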
For arts organisations, the isolated database end of the continuum makes most sense, as the tenant databases are usually quite large and because the ability to move databases around between servers makes it easier to handle the peak demands experienced in our industry. Multi-tenancy and security So how can a cloud provider guarantee that its multi-tenant architecture handles data securely for different clients? The key is the maturity level of the SaaS architecture.  At level 1, every time there’s a new client, the supplier adds a new instance of the software application which is only accessible by that client. That instance runs independently of other instances and all instances might be running a different version of the software.  At level 2, the supplier still runs separate instances but aims to keep them all on the same version.  The key word here is “aim”. Keeping all instances of the software on the same version might sound easy in theory, but imagine you have, say, 200 of them, and then consider the time and effort needed to make this happen. On the other hand, at levels 3 and 4, the supplier can run a small number of instances serving a potentially large number of clients, and changes or fixes can be rolled out far more quickly and efficiently as only a small number of instances of the software need to be maintained. The difference between levels 3 and 4 SaaS maturity is an important one. Without it, every customer client would always have to be on exactly the same version. This makes it nearly impossible to run a beta program, or hold certain customers back from a new release if they aren’t ready for it. In a single-tenant architecture, each tenant accesses a separate instance of the software. When a new tenant comes along, the supplier has to add a new, separate instance. Imagine a multi-tenant vendor with 200 customers but only a small number of instances of the software. Then imagine a single-tenant software provider who also has 200 customers. The single-tenant provider has to create 200 separate instances of its software. Imagine how that impacts areas like: Security - A multi-tenant SaaS provider can secure its shared instances and spread this cost across all of our clients. This works much better than having to secure each one of 200 instances and keeps costs down. System resources - Running 200 instances means 200 copies of the application which all have to live in memory on the web servers. Rather than having to buy masses of hardware for 200 instances, multi-tenant software can focus on quality hardware for the small number of shared instances it's running. Upgrades - If there’s a hotfix that needs to go out quickly, applying it to each of the shared instances of the software is much more straightforward if the architecture is multi-tenant. A supplier working with single-tenant architecture would need to apply it to each of their 200 instances. Scalability - Multi-tenant can far more easily handle peaks in demand across the client base. That’s because the instances are not single computers but clusters of servers behaving as one. Maintenance - Maintaining 200 instances properly costs a lot of money, which is either passed on to the customer or addressed by quietly under-resourcing the maintenance and upgrade frequency. Why isn’t everyone in the arts embracing multi-tenancy? Multi-tenant software doesn't come out-of-the-box like single-tenant, so the jump from level 2 to levels 3 and 4 of SaaS maturity will generally require an entire rebuild from the ground up. 
I also suspect that there’s a lack of will and/or skill in many of the vendor engineering teams serving the arts industry. Think about all the successful SaaS businesses out there - Salesforce, Xero, Google Apps, Zendesk – they’re all built on multi-tenant architecture. That’s not a coincidence. [easy-tweet tweet="Multi-tenant architecture can far more easily handle peaks in demand across the client base" hashtags="Cloud, Scalability"] The downside of multi-tenant is that it requires significant R&D to create and sustain. However in the long term, single-tenant software services for the arts simply won’t be viable if they don't take the plunge. ### Are cloud and virtualisation technologies key to ensuring the digital security of businesses? Following numerous high profile data leaks over the past 12 months, Catalin Cosoi, Chief Security Strategist at Bitdefender, discusses why businesses are increasingly adopting cloud technologies to secure their systems. [easy-tweet tweet="Cloud #virtualisation allows a buffer from elements that may endanger a company’s security"] High profile data breaches are on the rise, with organisations such as TalkTalk, Ashley Madison and Carphone Warehouse, as well as University computer network ‘Janet’, all suffering from large-scale breaches in recent months. With the average cost per breach to large UK firms now hitting £4.1 million, and an increasing number of attack surfaces brought about by the Internet of Things (IoT) and Bring Your Own Device (BYOD) policies, there are now numerous means by which malicious actors can breach a network. By 2016, it is predicted that 6.4 billion mobile devices will be in use, increasing to almost 21 billion by 2020. Poor security currently presents a significant barrier to the adoption of new business-enabling technology. The uptake of IoT in enterprise sits at around 30 per cent of that seen in consumer markets, with 40 per cent of UK organisations having no mobile device security policies in place. Cloud virtualisation, however, allows a buffer from elements that may endanger a company’s security, while mitigating risk levels through secure, off site technologies. One of the best approaches of adopting cloud computing is a hybrid implementation between public and private cloud infrastructures How the cloud can mitigate the increasing risk of cyber-attacks Security concerns have failed to hinder cloud uptake as businesses begin to realise the positive security aspects of cloud and virtualisation technologies. Such services have now become standard business practice, with 93 per cent of businesses in the UK utilising different elements of cloud technology. This increasing reliance on cloud services not only points to increased security and optimisation of cost, but also to business growth. Non-critical services deployed on a public cloud are able to serve a larger number of clients while also keeping costs to a minimum. There had previously been problems of piling up security technology upon security technology to combat the greater attack surfaces present, however, a virtualised approach provides a far more efficient method. Since it’s not feasible to have an individual security solution installed on each IoT device, mostly due to fragmented operating systems, the solution could involve an IoT framework that IT personnel can use to control the type and amount of information IoT devices can tap into when connected to the corporate network. 
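Such a framework could, at its simplest, boil down to a default-deny policy table mapping classes of IoT device to the categories of corporate data they may touch, enforced at a gateway or network controller. The Python sketch below is a hypothetical illustration of that idea; the device classes and data categories are invented.

```python
# Hypothetical policy: which data categories each class of IoT device may access
# once it joins the corporate network. Enforced at a gateway or network controller.
DEVICE_POLICY = {
    "environmental_sensor": {"telemetry_upload"},
    "smart_lock": {"telemetry_upload", "employee_directory_read"},
    "unknown": set(),  # Unrecognised devices get no access by default.
}

def is_allowed(device_class: str, data_category: str) -> bool:
    """Default-deny check applied before any IoT request is forwarded."""
    allowed = DEVICE_POLICY.get(device_class, DEVICE_POLICY["unknown"])
    return data_category in allowed

print(is_allowed("environmental_sensor", "telemetry_upload"))   # True
print(is_allowed("environmental_sensor", "customer_records"))   # False
print(is_allowed("not_on_the_list", "telemetry_upload"))        # False
```

The value of centralising the rule like this is that IT can reason about, audit and tighten one policy rather than chasing a security agent onto thousands of fragmented devices.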
One of the best approaches of adopting cloud computing is a hybrid implementation between public and private cloud infrastructures. Offloading non-critical operations and data to the public cloud, and keeping sensitive, vital information in a private cloud, ensures that the public cloud can be leveraged when scaling non-critical services. Cloud technology: A key business enabler The tide is clearly turning, with the cloud no longer seen as a security threat, but a key business enabler. Many businesses now turn to virtualised environments simply for increased application and system deployment times. One of the key advantages of utilising cloud virtualisation systems over traditional hardware is significant improvement to recovery time, minimising the risk of hardware failure, or disaster response, as power outages or hardware deprecation will not affect businesses’ bottom line. [easy-tweet tweet="Many businesses now turn to virtualised environments for increased application and system deployment times" hashtags="Cloud"] Virtualisation eliminates repairing, maintaining and troubleshooting network hardware, making it easier to focus on other aspects that drive business, while maintaining strong security at its core. In enterprise environments, convenience, reliability and cost effectiveness are of paramount importance – often leaving security as a secondary concern. Virtualisation is one method of merging these two disparate interests, with security becoming an integrated part of a business’s day-to-day operations. ### A safe introduction to BYOD It might not seem like it, but smartphones have only been ubiquitous for the past five years. Before that, devices such as Handspring’s innovative Palm Treo series were too far ahead of their time to take off. The same rings true for bring your own device (BYOD), a concept that has existed for years but is only now able to be securely realised. This capability offers a tempting possibility for many businesses, as Howard Williams, marketing director of digital engagement specialist Parker Software, explains. [easy-tweet tweet="The concept of bring your own device, or #BYOD, has existed for years but is only now being securely realised"] The last five years of development in enterprise technology has created a perfect storm, culminating in increasing interest in the idea of BYOD in workplaces. Businesses are considering this possibility as a way of extending the geographical reach of internal systems and networks, effectively streamlining the flow of data between employees, customers and even satellite offices. Until this perfect storm, BYOD had been a pipeline idea, with relatively few employers implementing it fully, but there is now a genuine argument that it is a feasible concept. Concerns about device interoperability and security have held BYOD back, but now, due to the rise of cloud-based technologies, these are no longer a significant risk. Cloud-based software offers compatibility between all manner of devices, while cloud software-as-a-service (SaaS) makes login credentials and data as secure as they were in an office environment. Naturally, it takes time for IT professionals and business owners to understand fully how secure this makes their data. Some decision makers still have perceived reservations about transitioning to the cloud in light of numerous infiltrations of popular cloud networks such as Apple’s iCloud. 
In reality, the cloud provides the same degree of security as storing documents on localised servers or computers — if a break in occurs, the files are vulnerable. The last five years of development in enterprise technology has created a perfect storm, culminating in increasing interest in the idea of BYOD in workplaces Lost in transit If there are still valid security concerns surrounding BYOD, they are the result of social engineering attacks or plain old human error. Any network incorporating personal devices faces the risk of data and files getting lost when people misplace them. Seamlessly transferring information between a mobile device and a desktop computer may seem perfect, but you can never rule out human error. These errors can be exacerbated in fast-paced work environments. When employees begin to be inundated with new information and tasks, files can easily be left behind or not acted on in a timely fashion. Oversights such as this can quickly accumulate and spell trouble for the company. This is where we turn to cloud-based automation software for a solution. Rather than just being a tool to make administrative tasks easier, some business automation software can be used to unify multiplatform sources. With features such as this, organising data and files in a more complex way becomes achievable and straightforward. In a sales environment, for example, it is important that transaction details are stored on a central client database. When there are many sales to process and feedback into that database, it can be easy for an employee to forget an entire sale or an individual task within a sale. Automation software such as ThinkAutomation can prevent this, by instantaneously storing all transactions and documents in the correct place. The software can parse input data from a variety of sources and feed it back to the central output location, in this case the database. [easy-tweet tweet="Security concerns surrounding #BYOD are often the result of social engineering attacks or plain old human error"] Easy and efficient processes like this make automation software some of the most exciting technology available in a sales environment today. BYOD and cloud computing connect a multitude of devices to make interfacing with data simpler, but it is business automation software that brings the devices together so that the information is readily usable. By using the two together, businesses can implement BYOD securely and reap the competitive advantages it provides. ### Channel partners must transition to a cloud first model – or risk getting left behind In the digital era, channel partners must transition to a cloud first model or they risk getting left behind. Resellers, service providers and ISVs should harness the massive opportunity and long term business value presented by becoming complete cloud solution providers. [easy-tweet tweet="#Cloud services eliminate cost, scale and reliance barriers and enable businesses of all sizes to compete"] Why? Because cloud services are rewriting the rules of business. They are making it easier for businesses to operate, interact and transact in the market, increasing efficiency and flexibility for employees and opening up the borders to expansion. Cloud-based services eliminate cost, scale and reliance barriers and enable businesses of all sizes to compete more successfully. 
For resellers, cloud services open the doors to recurring revenue streams and a sustainable business model focused on building integrated solutions, value-add and a seamless lifecycle experience for customers. Over time, more and more IT services will be delivered from the cloud. This provides traditional resellers with the opportunity to expand their offering and profit from delivering cloud services and building value-add using professional services. The latest research from Microsoft and IDC revealed that Microsoft’s cloud-focused partners are growing at double the rate of non-cloud resellers, and enjoy one and a half times more gross profit. When making this transition, channel partners must embrace a new role – a complete cloud solution provider that can provide end-to-end solutions and value add across the customers’ lifecycle, becoming a trusted advisor. This new landscape can be challenging to navigate. To help partners get started on their journey to the cloud, Michael Frisby Managing Director of Vuzion and Cobweb Solutions, shares his key considerations for the channel to bear in mind when switching to a cloud first model. Adopt a strategic approach Cloud services and partner programmes are driving significant disruption across the IT channel. It’s vital for partners to have a strategy in place for how they are going to develop their organisation’s structure, sales and service capabilities to embrace the cloud opportunity. This means adapting their business model to a new way of creating, delivering and managing services and value-add, as well as altering the way in which they work with customers to address their individual business needs. IT service providers need to move from a transactional service to an integrated solution offering in order to provide more value to customers. The key to success lies in working with a provider who proactively helps their partners to build on their capabilities to drive sustainable profitable growth, rather than being just a marketplace to transact through Collaboration is key If IT service providers don’t yet offer the services their customers require, they need to find a partner that can help them source new services and capabilities in the shortest time possible. The right partner is ideally one with experience in packaging, delivering, managing and supporting cloud services and who can easily transfer these capabilities. Resellers, services providers and ISVs should partner with a value-add cloud aggregator who is expanding their portfolio of cloud services for delivery through channel partners. A provider which delivers the full spectrum of offerings across the cloud platform will enable partners to bring new cloud service sales capabilities and provide complete cloud-based solutions, as well as enhancing go-to-market and customer life cycle management services. Future proof your business Being able to source complementary cloud and professional services quickly will be vital to building a sustainable channel business model. Partnering with a cloud services provider who was born in the cloud is key. By so doing, channel partners will benefit from their heritage and expertise. Not only this, experienced providers can package up the support and knowledge to help traditional IT resellers, SIs and ISVs to make the successful transition to reselling cloud services. 
The key to success lies in working with a provider who proactively helps their partners to build on their capabilities to drive sustainable profitable growth, rather than being just a marketplace to transact through. Finding a cloud ecosystem which offers integrated solutions instead of individual services is essential, and will enable traditional IT partners to gain real value from the growing cloud landscape and keep their customers at the cutting edge. Remember, the customer experience is vital to succeed In this changing landscape channel partners will always be relevant but to stay competitive their role as trusted advisors will become ever more critical. There is no compromise. Customers want trusted advisors that pull together and package the best services into solutions and deliver them to their specific needs. Partners must evolve from being a transactional IT provider to a complete cloud solution provider, helping customers to maximise the benefits they deliver. Evergreen cloud services will mean the end of one-off upgrade projects. Resellers need to adapt their customer engagement, revenue and profit models to deliver ongoing value to end users. Those that help their customers to unlock the value of their IT investments will be the most successful. [easy-tweet tweet="Partners must evolve from being a transactional IT provider to a complete #cloud solution provider"] The right provider can help partners to successfully manage the customer lifecycle and speed up go-to-market time and growth rates. This will enable partners to serve their customers with mature cloud services and solutions, combined with streamlined operations and experienced customer care. By so doing, partners will be able to make greater margins with less risk than by constructing their own services portfolios from scratch. This will enable them to focus on building their own value add on top of cloud services, whilst being confident their business model is future proof. ### The biggest threat to G-Cloud is Amazon With Oracle, Salesforce, Amazon and many other US suppliers now part of the government’s G-Cloud project, offering services via the Digital Marketplace and establishing UK data centres so that they fit inside the data sovereignty concerns, are UK SME suppliers coming under increasing competition? Is the government building a new oligopoly like the one that they initially fought so hard to break up? Kate Craig-Wood, co-founder and MD of British IaaS supplier, Memset explores. [easy-tweet tweet="G-Cloud has been one of the most effective initiatives implemented by the government to date" hashtags="Cloud"] G-Cloud has been one of the most effective initiatives implemented by the government to date, but is far from perfect as its relatively limited adoption shows. You just have to look at the numbers. As of January 2016 the latest published figures show that the current total of reported G-Cloud sales is over £1 billion. But you have to remember that that £1 billion is the total G-Cloud spend to date. The actual run rate is about £40 million per month. That is close to half a billion a year, so still good, but when you break that down further less than 20 per cent of that is actually cloud services (*aaS). The other 80 per cent is Lot 4 SCS, the consultancy services - hardly the aspiration of G-Cloud. 
There have been some good reasons for the lack of true cloud adoption; the security impact levels have been a mess, PSN didn’t work, and government departments needed time to re-build their in-house IT capabilities and, in many cases, re-learn their own systems and applications after decades of relying on outsourcing. However, a shift is slowly happening and G-Cloud stands at the forefront of a new paradigm in government ICT; one where oligopolies, over-priced services, vendor lock-in, fragmented (expensive) networks and poorly understood and implemented security standards are a thing of the past.

US threat

Recently we have seen US giants like Amazon Web Services (AWS, IaaS), Oracle, Google (mainly SaaS, with Google Apps, and PaaS, with Google App Engine) and Salesforce (SaaS) joining the Digital Marketplace, causing some to fret that the celebrated 52 per cent of total G-Cloud sales awarded to SMEs will soon become a thing of the past. For the most part we should not be concerned. Most SMEs can still provide customers with an innovative edge, high agility and a competitive price-point, provided they have a USP. Our specialty, for example, is deploying low-cost generic hardware and open source software to provide government-accredited high-security hosting at commodity prices. That said, when it comes to the bottom layer of the stack (infrastructure as a service; virtual machines etc.), AWS is absolutely a threat - not just to suppliers large and small, but also to government.

The war is starting in the IaaS space and will soon move up the stack into PaaS and SaaS. AWS is already overwhelmingly dominant in private sector IaaS (Gartner estimated recently that Amazon Web Services offers as much computing capacity as the next 14 players in the market combined), and that's without the inevitable price war really having started - IaaS prices have fallen much more slowly than underlying costs in recent years. Consider their long-term strategy; once their competition is eliminated, AWS, Google et al will inevitably exploit vendor lock-in at their customers' expense. Let us not have any illusions either; it doesn't matter if you are an old-school "IT giant" - those guys have the jump on you, and your business structure is probably far too outdated, bloated, inflexible and sluggish to compete in that space anyway.

Building an open source community

AWS's success is because they cleverly created an ecosystem of companies who build and innovate on their platform. They then duplicate or acquire those innovations. Google can be viewed similarly in SaaS and PaaS, and is trying to play catch-up fast in IaaS with their Compute Engine. The only way to beat them is at their own game. If we want to avoid a situation where government is having to do all this (G-Cloud) again in 10 years' time to break away from a new oligopoly, suppliers and the public sector actually need to get together as a community and start working around open standards and open source technology. Thankfully the open source community is now in a good position to fight back. Technologies like OpenStack grow in adoption and maturity by the day. Even the unlikeliest of open source adopters, Microsoft, has seen this. Their newfound love of open-source code demonstrates that they understand the threat and recognise that the only hope against the new IT behemoths lies in collaboration.
[easy-tweet tweet="Suppliers and the public sector need to work together on open standards and open source technology" hashtags="GCloud"] I'm not advocating protectionism nor saying we shouldn't compete with each other - that is at the heart of Britain's free trade spirit and inspires innovation - I'm just saying that we should keep in mind how to usher in the new paradigm of better, lower-cost, higher-security public sector ICT services: We need to work together as a community, around open standards and open source technology. ### The data behind dating: Can data ever really replace chemistry? There are an estimated 5,000 dating services in Europe and a whopping 91 million people around the world using dating apps. There is no doubt that the modern world and social norms have shifted, such that online and in-app algorithms are now an acceptable – and often encouraged – way to meet a future partner. It is user data that drives the success of these matches, helping to find that ideal partner, and this data has driven a genuine shift in modern dating habits. How can these dating sites ensure that they are managing and using this data in the best ways possible, as well as ensuring that it’s properly protected? Finally, what does the valuable currency of personal data mean for social norms now and in future? [easy-tweet tweet="What does the valuable currency of personal #data mean for social norms now and in future?"] Each online and app-based dating service attempts to differentiate itself from the competition to tap into one of the many lucrative markets available in this space. Whether they are targeting the young, the old, the professional elite or a specific religion, there is one thing every service has in common. They all use a series of algorithms to analyse each user’s data, to make the best possible matches, based on demographics and shared interests. Dating site OKCupid is particularly well-renowned for its dating-based data insights and has drawn some interesting trends based on its twelve years of existence. Apparently, for example, iPhone users have more sex than Android users. All users tend to lie about their height, with the average person two inches shorter than they claimed. Heterosexual and homosexual people have the same average number of sexual partners – six by the age of 30. One question that these platforms raise is over security – we’ve all heard horror stories of fake profiles and scammers. However, it’s not just personal welfare that should be considered, but also data security. How do these dating websites store and manage user data? The good news is that there are legal bodies in place that regulate how companies handle users’ data. Companies collecting and processing data must implement technical and organisational measures to ensure a level of security that is “appropriate to the risks represented by the processing taking place and the nature of the data in question”, according to EU regulation. These regulations will soon be made more stringent with changes to EU Data Protection regulation. Although the finer details of the regulation are yet to be ratified, one of the areas specified is that individuals can request their data to be deleted under the “Right to be Forgotten” clause. Any data centre manager or IT worker will know that eliminating all traces of data is a large and potentially difficult task. This became mainstream when Ashley Madison data was leaked, causing thousands of individuals to delete their accounts. 
How do these dating websites store and manage user data? The good news is that there are legal bodies in place that regulate how companies handle users’ data For dating sites, it’s going to be vital to ensure that they are compliant and looking after user data correctly, otherwise they will face a hefty fine of up to four per cent of their annual turnover. These penalties would be substantial and are something companies should avoid at all costs. For dating apps and websites – just like any other business – this means having an effective data privacy programme and data management practice in place. Whether they store user data on premises or with an external private or public cloud provider, they should assess and reassure customers that data is collected, processed, accessed, shared, stored, transferred and secured in accordance with all laws and regulations, keeping them safe and ultimately allowing them to eradicate their data, should they be ready to end their online dating days. NetApp’s own clustered Data ONTAP storage operating system is one such example, and can be used across cloud and on-premises infrastructure to create a Data Fabric that acts as a single system, meaning that data is more easily managed and controlled. There is definitely some truth that these dating sites reduce the number of frogs you have to kiss before you meet Mr or Mrs Right, but do they really do more than this? Can they truly help you find The One? For all the people that tell you that online and app-based dating is a waste of time, there are plenty more that will tell you they are now happily settled or married as a result. eHarmony, for example, is now responsible for more than two million marriages – and with lasting effect. The divorce rate is less than four per cent, which is much lower than the average of 44 per cent in Europe and 53 per cent in the US. [easy-tweet tweet="For dating sites, it is vital to ensure they are compliant and looking after user #data correctly" hashtags="Security"] Thanks to a sophisticated mixture of psychological profiling, data, algorithms and marketing, the online dating industry is worth in excess of €1.5 billion. As long as all of the data it produces can be properly managed and secured, there’s no reason why the dating industry can’t continue to be as successful as its happily-ever-after matches, complementing chemistry, rather than negating it. ### Disruptive Tech TV launching live at Cloud Expo Europe DisruptiveTech.tv, an offshoot of Compare the Cloud, is launching on April 12th, 2016, and will be broadcasting live from Cloud Expo in London. Cloud Expo encompasses four events in 2016: Cloud Expo Europe, Cloud Security Expo, Data Centre World and Smart IoT London. DisruptiveTech.tv will launch across the four events, and stream live coverage of interviews with technology experts and speakers across the two days. Exclusive interviews with industry leading experts will be featured across the two days of broadcast. Broadcast times: Tuesday 12th April 2016 - 11am-12:30pm, 2:30pm-4pm Wednesday 13th April 2016 - 11am-12:30pm, 2:30pm-4pm ### Securing the IoT For the past two years, IoT has remained consistently in the headlines, with much discussion surrounding its potential to drive huge economic benefits and dramatically improve our lives and businesses. 
Predictions such as the one by Accenture, stating that industrial IoT alone could add $14.2 trillion to the world economy over the next 15 years by improving productivity, reducing operating costs and enhancing worker safety, highlight the enormous potential IoT can bring to all aspects of our existence. However, despite these clear benefits, there are urgent security issues that must first be addressed for IoT’s potential to be fully realised. [easy-tweet tweet="With security being a vital aspect of #IoT, the world’s leading experts will be at Smart IoT London" hashtags="SmartIoT16"] With security being such a vital aspect of IoT, many of the world’s leading IoT experts will be attending Smart IoT London on the 12th and 13th of April to discuss and debate how best to address the issues and opportunities that accompany IoT. In advance of the show, some of the speakers have shared their thoughts on how to tackle the challenges ahead to unlock the full value of the IoT. IoT hacks could have disastrous consequences There is an undoubted consensus that security is one of the most pressing issues facing IoT. As Srdjan Krco, CEO, DunavNET noted, “security is one of the top challenges in the IoT era. Having a large number of devices in the field makes IoT solutions very susceptible to various attacks, from eavesdropping to forging data. The latter can have dire consequences and in some cases might be difficult to discover quickly.” The severity of IoT hacks was echoed by Raphaël Crouan, Founder & Managing Director, Startupbootcamp IoT | Connected Devices, who commented that “as the number of use cases within Industrial IoT grows, so does the security risk. Hacking into a smart kettle is one thing, hacking into smart manufacturing plants or power grids is something much more serious.” As Thingalytics author and PLAT.ONE CEO Dr. John Bates noted, “even US Vice President Dick Cheney did not get a networked pacemaker fitted, fearing that hackers could shut down his heart!” As Bates observed, this shows that with IoT being implantable and swallowable, as well as wearable, there is now also a need to protect our bodies from hackers. Taking responsibility So how can these challenges be addressed to ensure a more secure connected world? For Phillip Pexton, Senior Analyst, Beecham Research, responsibility is key, and we must establish “who is responsible for security and at what stages “. As Pexton noted, “until the liability for the first major breaches are decided in court and a precedent set, this is still a very open topic.” Similarly, Pexton observed that we must also determine “who owns data pertaining to an individual’s activities/energy usage/etc.? The individual, the device manufacturers collecting the data or the service provider?” Discussion of data ownership also brings with it discussion of data privacy, which is a further key issue for the IoT Protecting privacy Discussion of data ownership also brings with it discussion of data privacy, which is a further key issue for the IoT. Evelyn De Souza, Data Privacy and Security Leader, Cisco Systems, Cloud Security Alliance Strategy Advisor, commented that “as the Internet of Things gains further ubiquity, data on consumer lifestyles and habits will take on increased value to IoT manufacturers and their ecosystem of partners. And, as our lives are increasingly digitally connected, so, too there will be a digital data trail associated with increasingly intimate aspects of our lives. 
This is something that is often not discussed as much as it should.” These concerns were supported by Krco, who added that “increased availability of IoT services is introducing various privacy concerns, which vary according to individual factors like personal characteristics, cultural background and norms, etc. In addition to data privacy concerns emanating from leaving digital footprints on social networks, credit card transactions etc., IoT installations in smart cities, shops, and buildings are adding another layer at which personal information from location to the actual activity and context is being recorded.” Human and technology concerns For Innovate UK Lead Technologist Jonny Voon, technology security issues are trumped by human concerns. “The weakest link in the security chain is not the technology, algorithm or machine; it’s the human,” noted Voon, highlighting the importance for businesses to train their teams against security risks. However, there are undoubtedly steps that must be taken to safeguard the devices themselves, with Voon adding that “cyber security should be inherent and designed from the very beginning, and not just for IoT.” As Crouan explained: “There is a lack of subject matter or uniformity across security processes, making the jobs of hardware engineers very difficult. Vulnerabilities exist across a range of products, caused by poor encryption or backdoor access. It only requires one weak spot to open up a whole IoT network, and that could put thousands of devices at risk.” Need for guidelines To help further promote security, it is also vital that guidelines are established. As Clive Longbottom, Founder and Research Director, Quocirca said, there is currently little guidance (industry or legal) on IoT data security as yet, meaning that organisations will need to ensure IoT data systems are flexible enough to embrace new security demands.” This was echoed by Voon, who stated that, “government policy and regulation will need to change and adapt to the way we use and interpret data.” Simon Jones, Editor of WearableTechWatch also supported this, commenting that “the IoT needs standards – fast – in security, for the protection of everyone, especially since we’re all going to be outnumbered by so-called “intelligent” devices by a ratio of up to 7:1.” Concerns giving rise to opportunities Despite these concerns, some of the security challenges also bring with them opportunities. As De Souza observed, “in many ways digitisation has given end users more empowerment. So in this digital age, ought people not to have increased control, versus the current trend of surrendering data streams that are personal and intimate to manufacturers, service providers and other third party organisations? “And, increasingly data streams from the various IoT models we will consume at work and home will have a monetary value”, De Souza continued. “They already do today in our digital world, which is far less intimate than the data streams that will come from the world of IoT. There is an opportunity to enable people to make decisions around privacy based on the monetary value of their data.” As is clear from many of the speakers’ comments, addressing IoT’s security challenges will be essential in order to ensure its widespread success and unlock its true potential. With security featuring heavily in the Smart IoT London agenda, we’re looking forward to seeing these issues addressed to help promote a more secure connected world. 
[easy-tweet tweet="Security will feature heavily at Smart #IoT London, as we promote a more secure connected world" hashtags="SmartIoT16"] To register for Smart IoT London, please visit http://www.smartiotlondon.com/about/visit ### Why MSPs need to be prepared for the coming IoT explosion According to a recent Gartner report, there will be 20.8 billion Internet of Things devices by 2020. Whether you agree with this statistic or not, it is certainly true that the Internet is being increasingly populated by devices, as opposed to people. While some of these IoT devices will target the consumer space, others will certainly prove useful for business customers, and managed service providers should ensure that they are ready to make the most of the coming IoT explosion. [easy-tweet tweet="For #MSPs and IT resellers, the Internet of Things provides a number of exciting opportunities" hashtags="IoT"] For MSPs and IT resellers, the Internet of Things provides a number of exciting opportunities, particularly around device management. Many businesses are already struggling to track and manage all of the devices operating across their network, and the expected influx of IoT products will only exacerbate this issue. Some MSPs are beginning to offer Internet of Things as a Service (IoTaaS) to their customers, in recognition of the new business challenges surrounding connected devices. This could involve IoT management software, consultancy, staff training or data storage solutions, depending on a company’s particular needs. Another significant IoT opportunity presenting itself to MSPs involves business automation. The huge volume of data created by IoT sensors and other devices is giving organisations the information they need to automate many processes that previously required human intervention. The manufacturing sector, in particular, has been quick to embrace IoT automation, but many other industries are likely to implement IoT business automation in the future. Harnessing the devices and data required for automation is unlikely to be easy, however. MSPs should begin looking at their automation software offerings now, if they do not want to miss out on this potentially lucrative market. The manufacturing sector, in particular, has been quick to embrace IoT automation, but many other industries are likely to implement IoT business automation in the future Of course, with all the additional network access points and datasets associated with the Internet of Things, MSPs will also need to redouble their security efforts. Reports have already begun to surface that consumer IoT products are being targeted by hackers, and it won’t be long before any existing business weaknesses are also exploited. IoT security could, therefore, become a strong revenue stream for MSPs, provided they are able to put the necessary safeguards in place. On the other hand, MSPs that are not prepared for the additional risks surrounding IoT devices could face harmful data breaches, regulatory fines and long-term reputational damage. For managed service providers hoping to grasp the coming IoT opportunity, it is important that they have a clear understanding of the technology and what it could bring to their customers’ businesses. You may also need to evaluate how you deliver your services given that IoT devices and software usually require 24/7 management. This could have an impact on customer contracts and the workload of your service desk team. 
You’ll also have to consider whether you actually have the IT tools in place to administer such a broad spectrum of IoT devices, many of which will be made by different manufacturers and use a variety of operating systems. [easy-tweet tweet="MSPs that are not prepared for the additional risks surrounding #IoT devices could face harmful data breaches"] If MSPs are concerned about their ability to offer IoT services, they could always partner with a vendor that is capable of providing the assistance that they need. At Dell we are always looking to support MSPs in any way we can, whether that’s via sales-enablement tools, staff training or marketing resources. According to a survey by Vanson Bourne, 58 per cent of MSPs believe that they need a partner vendor in order to offer IoT services, and they would certainly be better off acting sooner rather than later. With a true end-to-end offering, Dell is well placed to support all aspects of MSPs’ requirements. The Internet of Things market is accelerating rapidly and MSPs need to get on board now to ensure that this opportunity doesn’t pass them by. ### Considerations for enterprises before migrating to the cloud Cloud computing is on the rise, with enterprises shifting their applications, data and IT infrastructure to cloud environments. A recent IDC survey found that 58 per cent of companies are using web-based, on-demand computing services, including both public services such as Amazon Web Services and private cloud facilities, for two or more applications, up from 24 per cent just 14 months earlier. [easy-tweet tweet="For most decision-makers, the question comes down to whether they need private, public or hybrid #cloud"] The future of the cloud, whether private, public or hybrid, is without doubt secure. Digitisation strategies are being rolled out across enterprises to gain all the benefits of low latency, agility, elasticity and economies of scale the cloud brings, while helping businesses innovate by virtue of its inherent flexibility. With this in mind, what should enterprises consider when starting to migrate to a cloud infrastructure? No two businesses’ IT looks the same, and as IT structures vary greatly from one company to the next, each requires a different outlook on migration according to its specific needs. For most decision-makers, the question comes down to the type of cloud implementation they want to pursue – private, public or hybrid. Private cloud setups are often reserved for large-cap corporations with a surplus of resources and technical expertise, while the public cloud offers fast and easy storage for a better price. However, organisations typically seek to incorporate elements from both styles of service and create their own hybrid cloud solutions that better fit their computing requirements and long-term goals. A smooth migration to the cloud is essential to gaining the maximum value from an enterprise’s cloud investment. Conversely, a poorly thought-through and planned migration can prove more than a headache; it could cost a business a great deal of money, time and effort. A comprehensive strategy is the most important element of any successful cloud migration, so what are the steps to getting there? Cloud migration doesn’t have to happen all at once; companies have a great deal of flexibility to move their resources gradually if they wish Assessment: The first component of any effective, comprehensive cloud migration is the assessment. 
This assessment must take into account the likely impact migration will have on every aspect of the organisation. In the majority of cases, this will include the entirety of the company’s existing infrastructure, all of which must be audited, inventoried and analysed. Planning: Once the assessment phase is complete, it is essential to plan out the migration in detail to minimise complications and ensure security and compliance standards are maintained at all times. Expertise and experience are essential to this phase. Every organisation’s cloud migration is unique, and failing to address these distinctions will inevitably cause problems throughout the migration process. Yet actually designing a migration strategy that accounts for these differences in the most productive, efficient fashion is really only possible with the benefit of tremendous experience in these specific matters. Oversight: It is critical to keep an eye on the effectiveness of the migration strategy, both during the actual migration and in the immediate aftermath. Considering the complexity of cloud deployments, there’s always the potential for issues to arise. With keen oversight, these problems will likely prove minimal and can be handled before they cause any significant delays or inefficiencies. For example, moving a large database to the cloud is no small task. The database needs to be assessed for potential issues migration may cause, from evaluating all the current applications through to the practicalities of running the database in the cloud. In some cases, it may make sense to completely move database providers rather than stick with an inflexible legacy product. In planning we need to consider how to physically move the core data, which may run to many terabytes in size or be constantly updating. It may make sense to run the new cloud-based database in parallel with the legacy database, or it may be better to plan a “big-bang” move over one weekend. Oversight by a trusted partner will guide you through the steps of thorough testing and the go-live phase. Migrating business-critical applications and processes needs careful planning and execution, and the CIOs or decision-makers of the project must select a provider they trust. Any potential concerns, for instance around reliability and security, need to be addressed well in advance. Cloud migration doesn’t have to happen all at once; companies have a great deal of flexibility to move their resources gradually if they wish. Once the benefits have been experienced and the system is tried and tested, businesses can move more of their assets to the cloud. [easy-tweet tweet="With a trusted partner and careful planning, the risks associated with migrating can be reduced" hashtags="Cloud"] With a trusted partner and careful planning, the risks associated with migrating can be drastically reduced and even the most complex IT ecosystems can be moved to the cloud. Businesses can now reap the benefits of a move to the cloud and future-proof their business with the agility and flexibility that the cloud offers them. ### Don’t let hyperconvergence market hyperbole cloud the broader vision The battle for hyperconvergence supremacy is intensifying. A raft of legacy IT vendors stake their claim to an increasingly visible and competitive market and a multi-billion dollar opportunity. IDC predicts that sales of hyperconverged systems will reach $2 billion worldwide this year, doubling to nearly $4 billion by 2019. 
[easy-tweet tweet="IDC predicts that sales of #hyperconverged systems will reach $2 billion worldwide this year" hashtags="HCI"] From EMC’s CEO purporting to have the solution that will change the game, to VMware’s assertion that it is already leading the industry, their move on the market, which is growing at more than 150 per cent annually, comes with an avalanche of new products and solutions. It’s a contest marked by a degree of ambiguity, as declarations of top dog status appear to be based on different criteria, depending on who is making the case and whether their sales numbers include hardware and software. However, a prevailing consensus across the board, from EMC, Cisco and HP to VMware, is that Nutanix remains the one to target in this space. As the first operator to combine compute, storage, hypervisor and virtualised management software in an integrated package, we’re aware that all eyes are on us and what we are going to do next. It’s a level of scrutiny and expectation which is a testament to the growth and impact made in a relatively short space of time. Just as hyperconvergence comes to mainstream prominence and dominates the wider agenda, for Nutanix, the narrative has already moved on to the Enterprise Cloud and the webscale infrastructure more associated with the likes of Google, Amazon and Facebook. An integrated offering means businesses will no longer face a choice between having to sacrifice speed for security or vice versa when it comes to their IT infrastructure That’s not to relegate hyperconvergence to the sidelines, but to shift the emphasis to putting the infrastructure in a broader context and how it underpins a broader vision to make the public cloud a reality for business. The enterprise cloud embodies this mission, fuelled by a very specific aim to address a gap that exists in the current offering between the public cloud experience and the on-premises experience provided by vendors. The plan is to deliver the solution that assures the agility, ease of use and built-in integration that is synonymous with the public cloud. Here, security is an integral component as opposed to a box-ticking afterthought, fuelled by technologies which enable applications to be deployed across a mix of virtual and cloud environments and a solution which is 100 per cent software defined. Design also takes centre stage, affirming that a sleek and engaging aesthetic needn’t be the sole preserve of the giants of the consumer space. Here it has value and relevance beyond an attractive interface, capable of permeating all levels of the organisation to enhance ways of working and ultimately the customer offering. In short, this integrated offering means businesses will no longer face a choice between having to sacrifice speed for security or vice versa when it comes to their IT infrastructure; instead they can quite simply get the best of both worlds. Now they can enjoy the scale and scope with the peace of mind that comes with a resilient and predictable solution, and one that, in common with the public cloud’s defining quality, improves every time it is used, working seamlessly in the background and evolving with changing needs. [easy-tweet tweet="An integrated offering means businesses no longer have to choose between speed & security" hashtags="Cloud"] It’s a move that involves a re-evaluation of cloud and an appreciation of the full scope of capabilities which comes with permission to think bigger. 
We have seen over the years how the cloud has become an invaluable respite from the limitations of traditional three tier infrastructure but all too often it has been deployed reactively to address a specific issue when it arises. Now used discerningly and strategically for the next-generation enterprise, it delivers value back to businesses through continuous innovation with unlimited potential. ### Cloud control: Which type of cloud solution makes the most sense for your business? Cloud computing is playing an integral part in changing the business landscape as we know it. It is allowing organisations to rapidly scale their computer resources, manage and process a myriad of data, and help control their IT spending. Thanks to cloud computing, small firms are now able to enter new markets and perform more competitively in the new digital economy. According to a recent survey, nearly half of all small firms in the UK now use cloud computing to increase flexibility and reduce costs. [easy-tweet tweet="#Cloud is allowing organisations to rapidly scale their computer resources & manage a myriad of data"] That said, with data growing at 40 per cent a year and IT spending only rising by 5 per cent a year, there’s an increasing imperative to invest. The challenge for many of today’s business leaders is understanding what type of cloud solution makes the most sense for their organisation, as one size does not fit all. Public? Private? Or the best of both worlds? Cloud computing has traditionally broken down into two models: public and private. The difference between the two lies in who maintains control and responsibility for the data centre and is ultimately responsible for ensuring application service levels are met. In public cloud computing, some or all aspects of operations and management are handled by a third party service provider. Public clouds offer a number of advantages; such as low costs and scalability. They can be extremely useful for companies needing to deal with large amounts of data or for those that don’t have the resources to manage their own infrastructure. However, concerns have been raised about how secure these public clouds are, and where the data they hold is physically stored; something that can be a major issue from a regulatory perspective. As a result, public clouds are often thought of as being suitable primarily for non-sensitive data or applications. Private cloud offers a very different set of benefits. The private cloud model is based on the data centres being owned and maintained by the organisation using them. What that means is businesses have the ability to create a virtualised IT infrastructure that prepares them for the future, is built on an organisation’s own terms, and still maintains the flexibility and scalability of cloud-based applications. Unlike the public cloud, private cloud computing does require an organisation’s own IT team and sufficient resources, which can sometimes prove impractical for small companies. Private and public clouds bring their own unique advantages and disadvantages and many companies may require elements of both. As a result, we are seeing a third option become increasingly popular that combines the best of both models: hybrid cloud. Hybrid clouds fuse the qualities of both public and private clouds, allowing companies to take advantage of both cloud models in a way that works best for their business, now and in the future. 
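One way to make that hybrid trade-off concrete is to write the placement decision down as a simple rule. The sketch below is illustrative only (the Workload fields and the routing rules are assumptions for this example, not a prescribed policy): sensitive or regulated workloads stay on infrastructure the business controls, while elastic, non-sensitive ones are candidates for the public cloud.

```python
from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    holds_personal_data: bool   # e.g. customer records subject to data protection rules
    regulated: bool             # e.g. data that must stay in a specific jurisdiction
    bursty: bool                # demand spikes that suit pay-as-you-go capacity

def placement(w: Workload) -> str:
    """Return a suggested placement for a workload in a hybrid estate."""
    if w.holds_personal_data or w.regulated:
        # Sensitive or regulated data stays on infrastructure the business controls.
        return "private"
    if w.bursty:
        # Elastic, non-sensitive workloads are the classic public-cloud fit.
        return "public"
    return "either"  # placement can then be driven by cost rather than risk

if __name__ == "__main__":
    for wl in [
        Workload("customer-billing", holds_personal_data=True, regulated=True, bursty=False),
        Workload("marketing-site", holds_personal_data=False, regulated=False, bursty=True),
    ]:
        print(wl.name, "->", placement(wl))
```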
Private and public clouds bring their own unique advantages and disadvantages and many companies may require elements of both Key considerations: Business goals vs. resources Balancing a business’s long and short-term goals against the resources available can be a challenge. Here are some of the key considerations businesses need to think about before investing for the first time or boosting their current offering: Resources and skills: One of the most crucial questions is: does the business have the resources to effectively implement a hybrid cloud approach? Hybrid clouds require not only time and financial resources, but also an in-house technical expert with knowledge of the cloud infrastructure – both public and private elements – if it is to be effectively managed. Reliability: Data is the lifeblood of any business, so ensuring that IP is safe is imperative. Companies therefore need to consider how to deploy a reliable cloud structure so that mission critical applications and data are always available. Shift in IT mindset: If the business has a designated IT team, it’s important that this team believes in a service-oriented approach, rather than the traditional infrastructure-oriented thinking. This will ensure any technology implemented now will ultimately prepare the business for the future. Security: At a basic level, security measures need to balance the probability of a threat occurring, the impact of a security breach, and the cost of implementing countermeasures. To avoid putting data at risk while keeping costs low, companies need to find a middle ground and establish what data needs to be placed in private clouds versus public clouds. Most importantly, the business will need to vet public cloud suppliers rigorously, as privacy and security issues around data can impact the location where the data is physically stored. [easy-tweet tweet="Public #cloud is often thought of as being suitable primarily for non-sensitive data or applications" hashtags="Security"] With all of this in mind, businesses may be apprehensive about completely overhauling their existing infrastructures to dive headfirst into the cloud. However, to avoid the pitfalls, organisations must take the time to really understand where their business is now and where it aims to be in 5 to 10 years. This should play a significant role in guiding cloud investment levels for any organisation. Technology may be here to help businesses change, but decision makers need to be savvy. The symbiotic relationship between technology and business works best when companies understand what they need and why. The cloud offers a unique opportunity to level the playing field as any business - no matter the size – can now enjoy real competitive advantages. ### Cloud Expo Europe and the cloud platform convergence Moving back through time it’s easy to see how platforms converge onto single devices or as a converged stack. An example of this is iPhone or Android devices bringing us the functions of music (Walkman), telephone (mobile), Internet (laptop) and a plethora of applications that were previously point products. [easy-tweet tweet="It’s easy to see how platforms converge onto single devices or as a converged stack" hashtags="Cloud, CEE16"] I was having a conversation with the Cloud Expo Europe team recently and wanted to share my thoughts about the converged platforms at each collocated event. 
An analogy I will be using is that of a large multi-tenant building: The foundations of the future stack: The data centre Every building requires a foundation and the foundation of any cloud or hosted solution is the data centre. As well as performing this key role, the data centre is the conduit that allows utilities to be delivered to the building, such as power or electricity. The data centre is also the lifeblood of the often overlooked essential item of the cloud: ‘connectivity;’ the staple that supplies an Internet connection and allows SaaS/cloud and mobile applications to come alive on the device you are holding. The apartments in the building: The cloud A building is to be resided in, securely accessed and above all should be a comfortable, familiar environment for the tenants. In the cloud we have a plethora of options for segregating each apartment cost effectively, allowing for customised furnishings and a comfortable operating environment to be consumed. The big advantage of this multi-tenant building is each apartment has ‘burst’ capacity that allows any relatives or friends to take short-term rentals by the minute, hour or month. These apartments are where the cloud systems are operated, consumed and developed upon. Turn the heating down and dim the lights please: Smart IoT From footsteps that can create electricity, to automated home heating programs, through to fitness monitoring applications. Today, Smart IoT (Internet of Things) is all around us both visible and obscurely embedded into everyday items. Today we are in an age where intelligent sensors that are almost invisible to the eye are being mass produced to connect every conceivable item around us. From a drilling plant telling us the best place to locate oil, through to our fridges reading bar codes and automatically ordering goods online as they run out. The huge computational horsepower required to ingest, store and analyse this plethora of data is located in the cloud. Add augmented reality into this mix, where IoT data will be reconstructed to allow for modelling machines and systems in virtual reality, and we have the beginnings of the virtual universe. All this kit and no security? Cloud security Any building holding high value items and contents should have a level of security corresponding to the value of replaceability within the system. This extends to the data we export that consumers trust us with and the documents that professionals such as lawyers (cough** lets not go there), healthcare and finance professionals are entrusted with. Today’s security threats go way beyond what I would describe as physical entry to a building. Threats now range from the IoT device that switches on our lights at home to the toys our children play with. The security boundary is no longer just the office, this threat is the Internet connection we use within our homes and on the tablets we use to connect to our corporate systems. Cloud security services provide the elastic perimeter that help to protect our physical and digital world. Platform convergence I hope you have enjoyed this blog and albeit brief journey into how I view the platforms converging into a single stack. Compare the Cloud are at Cloud Expo so why not come and visit us for a chat on this topic or take a wander over to our Disruptive Tech.tv stand. We love talking cloud and technology and we promise you we never sell anything…. 
You can register for your free ticket to Cloud Expo Europe, alongside Data Centre World, Smart IoT London and Cloud Security Expo, here. [easy-tweet tweet="#Cloud #security services provide the elastic perimeter to protect our physical and digital world." hashtags="CEE16"] My name is Andrew McLean and I am the Chief Technology Officer of Compare the Cloud. If you wish to arrange a briefing with us please contact pr@comparethecloud.net ### Taking advantage of multi-cloud computing For the last few years, much of the industry buzz surrounding cloud computing has centred on hybrid cloud, with the vast majority of businesses adopting a mix of public and private services. If you accept that hybrid cloud is the dominant industry environment then the next logical step is multi-cloud, with businesses combining multiple public, private and managed clouds depending on their bespoke business needs. In order to fully commit to a multi-cloud infrastructure, however, businesses will need to overcome some challenges, notably those surrounding management and integration. [easy-tweet tweet="The next step is multi-cloud, with businesses combining multiple public, private and managed clouds" hashtags="MultiCloud"] Before businesses start thinking about which vendors to work with, they must first decide whether multi-cloud computing is right for them. Many organisations are switching to a multi-cloud architecture because a single cloud approach just isn’t specialised enough for their business needs. Modern enterprises have varied requirements, some of which will be extremely complex, and so running multiple clouds lets businesses avoid a jack-of-all-trades, master-of-none approach. With a multi-cloud approach, each offering can be mapped to a specific business process. The move to a multi-cloud approach is also hardly surprising given that each of the different cloud models provides its own distinct benefits. The variety of different cloud vendors is also now so broad that it often makes sense for firms to handpick a number of different cloud solutions. Organisations that face particularly stringent regulatory pressure may also suit a multi-cloud architecture, as will companies with a loose business-unit structure wishing to unite disparate cloud resources. One of the most effective ways to simplify a multi-cloud architecture is to make it look as though each cloud service comes from a single supplier However, in order for businesses to avoid financial and management issues relating to multi-cloud architecture, they must first undertake careful planning. Often the first consideration is which kind of cloud model will be the basis for your multi-cloud architecture. Deciding between software as a service (SaaS), platform as a service (PaaS) or infrastructure as a service (IaaS) may be determined by your existing cloud vendor or an assessment of your applications and IT systems. Once that’s been determined, businesses should begin matching their business applications to the most suitable cloud vendor. Important considerations include not only the features and benefits of each provider, but also the cost. Businesses need to evaluate the price of each cloud service, as well as possible integration costs and whether it makes financial sense to move the application to the cloud at all. One of the most effective ways to simplify a multi-cloud architecture is to make it look as though each cloud service comes from a single supplier. 
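To picture what that "single supplier" illusion can look like in practice, here is a minimal sketch of a thin abstraction layer over two storage back ends. It is an assumption-laden illustration rather than a reference design: the class names and routing table are invented, and the provider classes only print what a real implementation would hand off to the vendor's own SDK or API.

```python
from abc import ABC, abstractmethod

class ObjectStore(ABC):
    """One interface the rest of the business codes against."""
    @abstractmethod
    def put(self, bucket: str, key: str, data: bytes) -> None: ...

class PublicCloudStore(ObjectStore):
    def put(self, bucket: str, key: str, data: bytes) -> None:
        # A real implementation would call the public cloud provider's SDK here.
        print(f"[public] stored {len(data)} bytes at {bucket}/{key}")

class PrivateCloudStore(ObjectStore):
    def put(self, bucket: str, key: str, data: bytes) -> None:
        # A real implementation would call an on-premises storage endpoint here.
        print(f"[private] stored {len(data)} bytes at {bucket}/{key}")

# A simple routing table maps a data class to a backing cloud, so applications
# never need to know which vendor sits underneath.
STORES = {
    "public-web-assets": PublicCloudStore(),
    "regulated-records": PrivateCloudStore(),
}

def store(data_class: str, bucket: str, key: str, data: bytes) -> None:
    STORES[data_class].put(bucket, key, data)

if __name__ == "__main__":
    store("public-web-assets", "site-media", "logo.png", b"...")
    store("regulated-records", "case-files", "claim-001.pdf", b"...")
```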
APIs may enable businesses to control multiple clouds with the same tools, while external management tools may provide further standardisation. What’s more, many businesses are using a single colocation provider to host their private cloud stacks, while simultaneously giving them direct and fast access into the large public clouds. A recent survey found that 45 per cent of businesses expect to use 3rd party colocation providers to help host their multi-cloud solutions in this way. [easy-tweet tweet="The diversity of the #cloud landscape may actually increase, as vendors become more specialised" hashtags="MultiCloud"] Although multi-cloud architecture is beginning to gain traction in the business environment, it is still in its early stages, which means that many exciting developments are still to come. Much is being made of the cloud vendor landscape consolidating, but it is unlikely that the major cloud players are going to be going anywhere anytime soon. However, the market is evolving in such a way that there will remain a place for innovative new players. The diversity of the cloud service landscape may actually increase, as vendors become more specialised and focused on increasingly niche market sectors. This will provide organisations with greater choice than ever before, allowing them to pick and choose their services to create a cloud strategy that is optimised for all their business processes. ### Big data, big business: Top career paths for IT pros Companies are in a tight spot. Technology is evolving at a breakneck pace, leading to increased consumerisation of popular offerings such as cloud computing and data analytics. IT security, meanwhile, struggles to keep up with evolving network threats even as employees look for better access and consumers demand better service. For IT professionals, this presents a unique opportunity — as noted by a recent ISACA report, there's a widening “skills gap” in the industry. While all tech sectors are dealing with increased demand, the hot ticket for 2016 is big data: Effective analysis and insight are critical for organisations to stay ahead of the curve. Here's a look at three big data jobs that offer big career prospects.  [easy-tweet tweet="While all tech sectors are dealing with increased demand, the hot ticket for 2016 is big #data" hashtags="BigData"] Big data architect Interested in building a big data strategy from the ground up? That's the role of big data architects, responsible for creating the blueprints of data management systems. As noted by V3, there are huge career opportunities for qualified IT pros — with the right company you could make north of $200,000 per year. Of course, landing this job requires both industry experience and commensurate training. The V3 piece points out that companies looking to fill the skills gap are seeking the best of the best — IT pros who are experts in the field and stand head and shoulders above the crowd. The role of big data architect is ideal for a tech pro looking to take ownership of a company's big data strategy and get in on the ground floor. Skills required include the ability to collaborate with multiple IT teams, research new data opportunities, define needed technologies and create an end-to-end strategy for managing and analysing big data. 
Technology is evolving at a breakneck pace, leading to increased consumerisation of popular offerings such as cloud computing and data analytics Big data engineer According to Information Week, big data engineer ranks as the third-“hottest” IT job in 2016 with a top-end salary of approximately $190,000. In many respects this role is a mix of data scientist and engineering positions; you'll be required to build on the existing big data framework in place and develop new ways to handle information. For example, big data analysis using MapReduce or Hadoop is critical, along with data warehousing, transformation and data collection. Bottom line: In this role, big data is your baby, from start to finish — and everywhere in-between. For many IT pros, this career track offers the benefit of working within an established structure but thinking outside the box to get the job done. Big data consultant Not all big data jobs are FTEs. For many companies, the drive to secure top-tier talent collides with IT budget constraints, leading some to choose outsourced data experts over hiring in-house experts. For IT professionals, going the consultant route offers the benefit of flexible scheduling and separation from office politics — your task takes top priority. According to Forbes, the emerging discipline of data consultant requires a varied skill set. For example, in 2015 the demand for VMware expertise in big data jobs rose almost 800 percent, while open source bumped up 333 percent, and Python programming knowledge increased by nearly 200 percent. As a big data consultant you could be tasked with any project at any time — an ideal fit for IT pros looking for a challenge that comes with solid compensation. [easy-tweet tweet="For many companies, the drive to secure top-tier talent collides with IT budget constraints" hashtags="BigData"] Big data is big business. Architects, engineers and consultants stand out as some of the top-performing jobs in 2016. ### If you aim to deploy a hybrid cloud in 2016 - here’s what not to do Most, if not all, organisations by now have adopted some form of cloud solution. In fact, many private organisations are now moving to adopt a hybrid stack, with some form of Infrastructure-as-a-Service or Platform-as-a-Service solution sitting on top of a cloud application model where end-users see no noticeable lag to their performance. But what about choosing who to partner with when opting for a hybrid cloud provider? [easy-tweet tweet="Recent studies show that much of private enterprise is most definitely keeping its #hybrid options open" hashtags="cloud"] Here in the UK, hybrid cloud is of particular interest, as most enterprises have tended to see-saw between public and private cloud adoption. Recent studies show that much of private enterprise is most definitely keeping its hybrid options open. Just in time for some spring-cleaning here are four clear ideas on what to consider when you move to a hybrid cloud: #1 Don’t: Use a provider who can’t tailor hybrid cloud solutions One of the biggest advantages to the hybrid cloud is its ability to match workloads to the environments that best suit it. Aim to build a hybrid cloud solution with incredible hardware and impressive infrastructure, but if you don’t tailor your IT infrastructure to the specific demands of your workloads, you may end up with performance issues, improper capacity allocation, poor availability and/or wasted resources. 
#2 Don’t: Neglect your current data centre or co-location needs There’s any number of reasons an organisation may wish to outsource its IT infrastructure: from shrinking its IT footprint to driving greater efficiencies, from securing capacity for future growth, or simply to streamline core business functions. The bottom line is that data centres require massive amounts of Capex to both build and maintain, and legacy infrastructure does become obsolete over time. This can place a huge capital and upfront strain onto any mid-to-large-sized businesses’ expenditure plans. But data centre consolidation takes discipline, prioritisation and solid growth planning that looks ahead to what your needs may be not just at this moment in time, but in five or six years’ time. #3 Don’t: Skimp on your hardware costs For larger workloads, should you seek to host on premises, in a private cloud, or through co-location, and what sort of performance needs do you have with hardware suppliers? A truly hybrid IT outsourcing solution enables you to deploy the best mix of enterprise-class, brand-name hardware that you either choose to manage yourself or be fully-managed from a cloud hosting service provider. Performance requirements, configuration characteristics, your organisation’s access to specific domain expertise (in storage, networking, virtualisation, etc.) as well as the state of your current hardware often dictates the infrastructure mix you adopt. The bottom line is that data centres require massive amounts of Capex to both build and maintain, and legacy infrastructure does become obsolete over time #4 Don’t: Forget about business continuity planning Once you determine the threats to your infrastructure, the next step is to determine how those threats would impact your business continuity in terms of significant downtime, from expensive repairs to lost income to regulatory fines. First, make sure all managers and stakeholders of these systems are involved in the completion of the business impact analysis. In fact, a recent study by leading DR provider DataBarracks shows that nearly 73 per cent of UK SMBs have no defined business continuity plan in place. Second, assess the status of your systems - understanding the distinctions between them will help you prioritise the processes of any effective disaster recovery planning process: Mission critical — your business can’t operate without them Business-critical — your business can’t operate effectively without them Non-critical — your business can operate without them for an extended period of time without additional expenses Fundamentally, the right hybrid cloud hosting provider should not only be able to design and build out your environment, but also manage it throughout the building and life-cycle process. Ultimately it’s worth checking through these questions before you choose any potential hybrid cloud partner: Can my provider right-size my workloads for me? What are my specific security/compliance concerns? What’s an accurate assessment of my current data centre estate? Have I included all my hardware/remote IT needs? Have I got all of my business continuity or disaster recovery plans in place? Did I omit any of my most critical RfP needs? [easy-tweet tweet="Data centre consolidation takes discipline, prioritisation and solid growth planning " hashtags="cloud"] Any provider worth their salt should feel like an extension to your internal IT department. 
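The continuity tiers listed under #4 above lend themselves to being captured as data rather than left in a document. The sketch below is a hypothetical illustration (the tier names mirror the list above, but the recovery time objectives are placeholder figures every organisation would set for itself), showing how that classification can drive the order in which systems are restored.

```python
from dataclasses import dataclass

# Tiers from the business impact analysis; the RTO figures below are
# placeholders - each organisation sets its own targets.
TIER_RTO_HOURS = {
    "mission-critical": 1,      # the business can't operate without it
    "business-critical": 8,     # the business can't operate effectively without it
    "non-critical": 72,         # the business can wait an extended period
}

@dataclass
class System:
    name: str
    tier: str

def recovery_order(systems: list) -> list:
    """Order systems so those with the shortest recovery time objectives come first."""
    return sorted(systems, key=lambda s: TIER_RTO_HOURS[s.tier])

if __name__ == "__main__":
    estate = [
        System("staff-intranet", "non-critical"),
        System("order-processing", "mission-critical"),
        System("reporting-warehouse", "business-critical"),
    ]
    for s in recovery_order(estate):
        print(f"{s.name}: restore within {TIER_RTO_HOURS[s.tier]}h")
```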
Trust is important to any commercial partnership: it’s a partnership where a partner’s cloud solution architects understand not just your company’s specific needs for capacity planning, but one where you work together to design and deploy the most flexible solution available to you. ### The journey into the cloud The cloud is rapidly becoming a star in the IT strategies of organisations throughout the world. Companies of all sizes are waking up to the benefits of moving some or all Oracle applications to the cloud, reducing the need for up-front capital investment in order to accelerate return on investment, as well as gaining greater business insight and an excellent user experience. [easy-tweet tweet="Some organisations are still cautious about moving business-critical applications to the #cloud"] Blazing a trail The reality of today’s cloud technology is now matching - or even exceeding - the hype of a few years ago. However, some organisations are still cautious about moving business-critical applications to the cloud, based on perceived risks that are often overstated and rooted in outdated concerns. To allay these fears, we can draw on the experiences of early adopters, such as Reading Council, to prove the value of migrating Oracle applications to the Cloud. Organisations today are under increasing pressure to embrace new technology, both in order to demonstrate innovation and to improve performance. Within the Oracle community, organisations are at different stages in migrating their business-critical applications, but more and more are embarking on - or at least planning their route for - the journey to the cloud. A key driver is that the cloud reduces the need for specialist skills in-house, liberating IT departments from systems maintenance and empowering them to focus on more strategic initiatives. Organisations today are under increasing pressure to embrace new technology, both in order to demonstrate innovation and to improve performance The state of play Claremont’s recent survey on The Future of Oracle-based Applications delivers a fascinating insight into the state of play in Cloud migration. The survey asked: Has your business migrated any of its Oracle-based applications to the Cloud? 13 per cent have migrated some Oracle-based applications to the cloud; 13 per cent are in the process of migrating Oracle-based applications to the cloud; 74 per cent of businesses have not migrated any Oracle-based applications to the cloud; 40 per cent say they will move to the cloud within 12 months, 60 per cent within one to three years. While only 26 per cent are actively engaged in the cloud at present, perhaps the most interesting result is in the intentions of the 74 per cent who have not yet migrated. With 60 per cent planning on moving to the cloud within three years, organisations are clearly excited about the benefits of the cloud, making late adopters an ever-diminishing minority. The survey also asked organisations engaged in the cloud which applications they have moved. HCM/Payroll is currently the most common application, with 62.5 per cent of organisations moving it to the cloud. In future, this looks set to change, with 80 per cent of organisations keen to move ERP, once any apprehensions have been dispelled (18 per cent currently host ERP in the cloud). Conclusions Moving business-critical applications to the cloud only makes sense if you see clear benefits. 
The mass migration over the next one to three years demonstrates that the vast majority of organisations see them very clearly indeed. This movement towards the cloud is surely influenced by the experiences of early adopters. For instance, the survey indicates that 75 per cent of organisations who have moved their Oracle applications have reduced their operating costs as a result, with 63 per cent saying their system is now easier to upgrade and maintain. Perhaps even more impressive, 63 per cent cite their move to the cloud as enabling new business models for the organisation. [easy-tweet tweet="The movement towards the #cloud is surely influenced by the experiences of early adopters"] The organisations who believed in the promise of the cloud have blazed a trail and have proven the value of moving some or all Oracle applications to the cloud. The right hosting partner will be able to work with you to determine if moving to the cloud is right for you and to help make that journey as seamless as possible. ### Clouds lifting: A brighter public sector security outlook ahead? While the pace of cloud computing adoption has reached new peaks, the public sector continues to lag relative to the private, despite initiatives to allay fears and spur uptake. There are a number of key reasons for this, but the major issue above all others is still the perception of insecurity. Although some innovative councils and even, more recently, the Metropolitan Police are beginning to embrace cloud adoption, the majority of the public sector remains overly risk-averse, and this has been a crucial blocker of uptake. [easy-tweet tweet="While #cloud computing adoption has increased, the public sector continues to lag relative to the private"] In many minds, loss of control will always equate to an insecure environment. A continued lack of due diligence from the media has also added to this feeling of insecurity. Continually, large-scale cyber breaches of all forms are reported as “cloud” breaches, when most still target corporate networks, not the CSPs (Cloud Service Providers) themselves. The distinction, if made at all, is often unclear. The marketing strategy of some cybersecurity firms has also had a negative role to play. In lieu of data to conclusively demonstrate ROI of security spend at board level, scaremongering has become a default marketing strategy. This is counterproductive in the long term. However, despite these fears, a cloud environment can actually be far more secure than in-house capabilities. Firstly, the nature of cloud computing makes reconfiguration in response to threats far easier. These threats are real, but are not necessarily more or less threatening to the cloud than to any other environment. We are too ready to forget the shortcomings of more familiar environments, particularly with regard to economies of scale and specialisation. Indeed, one of the major benefits of moving to the cloud is the ability to leverage the expertise of the vendor. Most CSPs with sufficient scale see many thousands of times more threats than the average enterprise. In a growing and diverse threat landscape, this is a powerful driver of uptake. Additionally, given the prevalence of risk from insider threats, there is also a strong argument that cloud environments can significantly limit the potential damage a malevolent employee can wreak by physically separating them from where data is stored. This also makes common tactics such as social engineering much more difficult. 
The nature of cloud computing makes reconfiguration in response to threats far easier. These threats are real, but are not necessarily more or less threatening to the cloud than to any other environment CSPs would be well served to highlight transparency and security reporting features and capabilities. Transparency in particular can allay fears over loss of control. If these fears also extend to vendor lock-in, CSPs should emphasise interoperability. For example, some cloud services have gained such widespread uptake that they have become de facto standards, with their functions emulated by others. Compliance with these APIs can ease the issue of vendor migration. CSPs should also help customers test for security, regardless of any other provisions in place. Crucially, buyers should never accept a one-size-fits-all approach, regardless of how basic or limited a requirement they believe is needed. A good CSP will always be willing to work with the customer to create a cloud environment that is particular to their organisation. Migrating to the cloud is the perfect time to undertake a holistic security audit of processes, assets and people. While the CSP has a crucial role to play in allaying fears, the buyer should of course undertake significant due diligence on their potential provider. IT security standards such as ISO27001 and the Cloud Security Alliance Cloud Controls Matrix (CCM) should be supplemented by personnel security standards such as BS7858. The Cloud Security Principles issued in 2014 offer helpful guidance when building or implementing a cloud computing platform in the public sector. Empowered by changes to the Government Security Classification Policy, the requirements are far less prescriptive and more flexible than previous iterations and allow for easy adoption of off-the-shelf cloud products at the lowest official security band. [easy-tweet tweet="Migrating to the #cloud is the perfect time to undertake a holistic security audit of processes and people"] Additionally, recent changes to the VAT regime for contracted-out services in central government and the NHS have also shifted the cost calculation, as commodity cloud is now eligible for rebate. Taken together, these are powerful enablers of uptake, and of significant help in bridging the public-private cloud computing gap. ### Security report reveals disparity between mobile app security perception and reality Arxan Technologies, a provider of application protection solutions, has announced the publication of its 5th Annual State of Application Security Report. The new research is based on the analysis of 126 popular mobile health and finance apps from the US, UK, Germany, and Japan, as well as a study examining security perspectives of consumers and app security professionals. Arxan discovered a wide disparity between consumer confidence in the level of security incorporated into mobile health and finance apps and the degree to which organisations address known application vulnerabilities. While the majority of app users and app executives indicate that they believe their apps to be secure, nearly all the apps Arxan assessed, including popular banking and payment apps and FDA-approved health apps, proved to be vulnerable to at least two of the top 10 serious security risks. Among the research findings: Consumers and app executives believe their mobile health and finance apps are secure. 
A combined 84 per cent of mobile app users and mobile app executives believe that their mobile health and finance apps are “adequately secure,” and 63 per cent believe that app providers are doing “everything they can” to protect their mobile health and finance apps. The majority of mobile health and finance apps contain critical security vulnerabilities. 90 per cent of the mobile health and finance apps tested had at least two of the Open Web Application Security Project (OWASP) Mobile Top 10 Risks. More than 80 per cent of the health apps tested that were approved by the US Food and Drug Administration (FDA) or the UK National Health Service (NHS) were also found to have at least two of the OWASP Mobile Top 10 Risks. The security and safety risks are real and significant. 98 per cent of the mobile apps tested lacked binary protection – this was the most prevalent security vulnerability identified. 83 per cent of the mobile apps had insufficient transport layer protection. Such vulnerabilities could result in application code tampering, reverse-engineering, privacy violations, and data theft. In addition to sensitive data being taken, the vulnerabilities could lead to a health app being reprogrammed to deliver a lethal dose of medication, or a finance app to redirect the transfer of money. Most consumers would change providers if they knew their apps were not secure. 80 per cent of mobile app users would change providers if they knew the apps they were using were not secure. 82 per cent would change providers if they knew alternative apps offered by similar service providers were more secure. “Mobile apps are often used by organisations to help keep customers ‘sticky,’ yet in the rush to bring new apps to market, organisations tend to overlook critical security measures that are proving crucial to consumer loyalty,” said Patrick Kehoe, CMO of Arxan Technologies. “Our research in Arxan’s 2016 State of App Security Report demonstrates that mobile app security is an important element in customer retention. Baking in robust mobile app security is not only a smart technology investment to keep the bad guys out, but also a smart business investment to help organisations differentiate from the competition and to achieve customer loyalty based on trust.” ### Why cloud storage continues to divide opinion Cloud technology gives consumers access to a wealth of applications and expands data storage capability beyond the confines of the home PC or laptop. It gives people the opportunity to access content on the move and to share their data. However, preference for using cloud storage divides opinion. [easy-tweet tweet="26% of technology enthusiasts recently cited the #cloud as their least preferred storage method" hashtags="IEEE"] Through an IEEE online survey, 26 per cent of technology enthusiasts recently noted that the cloud is their least preferred method for storing their information. On a scale of one to five, 23 per cent ranked it mid-range with only 15 per cent citing it as their first preference. No doubt consumers are sensitive to the havoc cyberattacks can wreak. A raft of recent data hacking incidents, affecting private individuals and corporations, has raised awareness of the potential dangers when security measures are compromised. Cybersecurity is a necessary safeguard to our digital lives as more and more of our private and personal information is stored on third party platforms. 
It’s a modern-day reality that more of our assets and devices are now ‘connected’ and, along with the convenience this brings, also comes the threat of privacy invasions. Such intrusions can extend beyond the vulnerable devices that may immediately spring to mind – such as smartphones and laptops - to wearable devices, connected homes and connected cars as well. There is a growing need for vigilance and heightened measures to protect our digital information. Faith in digital payments Interestingly, the same IEEE survey reveals growing trust in the security of mobile payments. 70 per cent of respondents suggested that by 2030 mobile payments will be secure enough to replace cash and credit cards as the way to pay.  It certainly points to a high level of faith in digital systems for currency at this early stage of mobile payments’ life cycle. Although, we also see from the survey that for 46 per cent of respondents payment information hacks represent their biggest concern over mobile payment technology, while 33 per cent worry about unauthorised payments. Cybersecurity is a necessary safeguard to our digital lives as more and more of our private and personal information is stored on third party platforms The year 2030 may sound some way off but it’s a mere 14 years. If the survey respondents are right, we can anticipate a future where today’s infants will never know cash in their pockets. Cash, having dominated the exchange of value since coins were first used around the 7th century BC, could become obsolete a mere 80 years after the first credit card payment (as we would know it) was made, and within the comparative blink-of-an-eye since mobile payments were introduced. Of course, cash is already being phased out in many arenas, particularly transport where ‘change for the bus’ is an increasingly irrelevant saying. Offering speed and convenience, the popularity of mobile contactless payments is growing as a means for completing low value transactions such as purchasing a takeaway coffee. The ‘I’ in Internet A surprising number of the survey’s respondents demonstrate a high level of security awareness when it comes to the internet at home and to the impact on the individual of cybersecurity threats. Technology is being used to monitor home internet activity with over a fifth (22 per cent) from the survey saying they have auto alerts set up to flag any attempted connectivity, 11 per cent stating they use visualised monitoring in real-time and three per cent indicating they connect to a cloud monitoring system.  They’re statistics that suggest more people now understand some inevitable vulnerabilities go hand-in-hand with being ‘online’ and that there are measures that can, and should, be taken to protect oneself and one’s assets. Such awareness bodes well for the continuing health of the cybersecurity industry. The next generation of cybersecurity professionals needs to be encouraged and developed to ensure protection in the future. Those developments in cybersecurity will mostly affect identity theft, according to 42 per cent of survey respondents with over a quarter (27 per cent) calling out online anonymity, 18 per cent piracy and 12 per cent viruses. 
[easy-tweet tweet="Awareness has grown over the importance of #security, and the risks when it fails" hashtags="Cloud, Storage"] This survey into views on digital safety and the future of cybersecurity reveals enthusiasm for progress, with a high proportion of people anticipating that digital forms of currency exchange will supersede cash in a short space of time. Yet, it also shows reluctance to give too much control over personal data to third parties for it to be stored somewhere in the cloud. Awareness has grown over the importance of security, and the risks when it fails. In an increasingly digital world, with unprecedented levels of connectivity and the proliferation of cloud services, cybersecurity has never been more important. ### IBM Mad Scientists Event: Robots, AI and IoT Last week the Compare the Cloud team were invited to the IBM Mad Scientists event at IBM Southbank to experience some of the more “out there” technology projects currently being explored by the company’s developers, members of the LJC and university researchers. Promising IoT, AI and robotics, the event was unashamedly one that valued innovation over functionality, which helped to explain some of the stranger projects on show. Still, with the human imagination providing the only limit to the creations on offer, we’ve highlighted some of the most noteworthy proposals below. [easy-tweet tweet="Twitter-controlled drones were for entertainment rather than social media-based warfare" hashtags="MadScientists, IBM, IoT"] How emotional do we need robots to be? One of the earliest talking points of the night came from a talk by developer Dave Snowdon about why emotions are playing an increasingly large role in robotic development. With Aldebaran’s Nao robot offering support in the form of unintelligible noises, Mr Snowdon explained that giving a non-human object a personality is a difficult balancing act. When it comes to personal assistants and avatars, for example, a lack of emotion can come across as cold and unengaging, but misguided attempts at giving robots emotions can also lead to a feeling of unease - the disconnect between the human and inhuman leads observers to experience a sense of the “uncanny.” Robot personalities will continue to progress, but it may be a while before we experience one that is convincing without being creepy. Twitter-controlled drones While the words “Twitter-controlled drones” could conjure images of social media-based warfare, the devices we saw were purely for entertainment purposes. By tweeting a variety of commands like “take off,” “turn left,” and “flip forward,” we were able to make the quadcopters perform aerial acrobatics at the touch of a button. Promising IoT, AI and robotics, the event was unashamedly one that valued innovation over functionality, which helped to explain some of the stranger projects on show. LED disco It was time to watch out for the dad dancing as we made our way to the LED disco, where Twitter messages could be displayed on a Saturday Night Fever-esque dance floor. Our own Neil Cattermull was throwing all manner of shapes as the lights beneath him spelt out his Twitter handle. It’s not yet known what happened to the footage of this amazing spectacle. Gaming on the cheap The event also showcased a couple of homemade gaming efforts by IBM DevOps specialist Paul Mandell. The first saw a £5 IKEA table, an old LCD screen, some broken PC speakers, and a Raspberry Pi turned into a retro gaming fan’s dream. 
Pac-Man, Space Invaders and a host of other classics were available without you having to dip into your spare change. Also on display was a “Zombie Bunnies” video game, which saw you charging around the IBM Hursley data centre collecting coins while avoiding hordes of the undead. The most impressive thing about this game is that it was created from start to finish in just 24 days. IoT innovations Given the promise surrounding the Internet of Things, it was hardly surprising that connected devices played a sizeable role at the Mad Scientists event. There was a kettle that tweeted your mum to tell her you were having a cup of tea, a soil monitor that lets you know when your plants need watering, and IoT soap dispensers that inform you when they’re in need of a refill. The latter idea, in particular, could prove useful in a host of other areas - essentially making sure that everything from vending machines to medical supplies is kept stocked up. [easy-tweet tweet="The projects on display were about #innovation, experimentation and imagination" hashtags="IBMMadScientists, IBM, IoT"] It remains to be seen whether any of the innovations on show at the Mad Scientists event will ever become commercial products or services, but to worry about that would be to miss the point entirely. The projects on display were about innovation, experimentation and imagination, all of which must be in place long before the moneymen get involved. ### The hurdles facing cloud service providers and how they can jump them The storing of data has become an increasingly important concern as more and more data is being generated at a global level. In recent years, virtualisation has transformed the data centre, with Cloud Service Providers (CSPs) among those driving the change. [easy-tweet tweet="Massive, fast-growing, unpredictable virtualised workloads are dealt with by thousands of CSPs every day" hashtags="cloud"] Massive, fast-growing, unpredictable virtualised workloads from customers are dealt with by thousands of CSPs every day. This significant level of growth can be handled only if CSPs can contain its complexities and curb its unpredictability. It remains a challenge for CSPs to identify the pains and priorities that define their businesses over a period of time. With this in mind, VM-aware storage (VAS) provider Tintri surveyed 78 CSPs to discover the issues that they face. The research, and subsequent report, focused on storage because it is central to the success of CSPs and can help them to thrive if they utilise it correctly. Virtualisation on the increase The research revealed that 75 per cent of respondents have virtualised over 80 per cent of their environments. The fastest-growing CSPs actively use storage to unlock new efficiencies and deliver differentiated services to their customers. To help them compete more effectively and grow their businesses, they should be working towards the following: Align storage with virtualisation: While conventional storage is highly effective for physical workloads, LUN and volume-based storage architectures have little benefit to offer a highly virtualised footprint. CSPs cannot afford to lose time shuffling virtual machines (VMs) between LUNs and they cannot have low ceilings imposed on their ability to manage a large and growing number of VMs. That's why they need storage specifically built for virtualisation that can offer density and simple management. 
Given the heavy competition in cloud services, CSPs need to stand apart (and expand margins) by offering highly differentiated services Compete on differentiated services: Given the heavy competition in cloud services, CSPs need to stand apart (and expand margins) by offering highly differentiated services. That's often accomplished by procuring different tiers of storage. Looking ahead, CSPs need to build on storage that allows them to isolate virtualised applications and set different Quality of Service (QoS) tiers on a single device. That way they can help their customers step up to higher-revenue services and guarantee the performance of customer applications. Think manageability: According to the report, CSPs cite performance as their biggest pain point, which makes sense: above all, they want to provide the best possible services to their customer base. But poor application and VM manageability drastically hinder storage performance - not to mention the CSP's bottom line. CSPs don't need more boxes - they need greater automation and better troubleshooting visibility in order to scale profitably. Is VM-aware storage the solution? With this in mind, CSPs need to consider a VM-aware storage solution because it will make it easier for CSPs to purchase and grow, while offering customers guaranteed high performance, transparent VM-level analytics and differentiated services. [easy-tweet tweet="CSPs need to build on #storage that allows them to isolate virtualised applications and set different QoS tiers"] CSPs require reliable, scalable and high-performance storage to power their cloud infrastructures and they need to offer differentiated services to stand out in a competitive marketplace. They also have to acquire the right enabling solutions with flexible terms that allow them to grow at their own pace. No wonder that, for many CSPs, selecting storage can be a make-or-break business decision. ### World Backup Day: Does your business have a strategy in place? March 31st has been declared World Backup Day. With so much of our lives, photos, and videos being stored in digital form, it is important that we begin to make backups of our precious data. Backing up your data is one of those easy-to-do procedures that many people ignore. [easy-tweet tweet="World Backup Day is a reminder for IT leaders to explore the industry’s advancements in #storage systems" hashtags="data"] People now create and generate over 1.8 zettabytes of data per year. That’s a lot of data that we need to protect! Unfortunately, nearly 30 per cent of people have never even backed up their data. Backing up your data will protect your life’s work when that hard drive fails. If you are a small business, a data backup can be what saves your company. World Backup Day is here to make sure that people actually start backing up. Below we've collected a few comments from industry experts about why data backup is so important: Ellen Rome, Vice President of Marketing, X-IO:  "On the consumer level, backing up data has never been easier, especially with so many mobile devices connected to the cloud. If only it was that easy for businesses and the enterprise. IT leaders are struggling to keep up with growing demands for data backups and recovery that requires the right balance of performance, availability, simplicity and affordability. 
World Backup Day is a great reminder for IT leaders to explore the industry’s advancements in storage systems that won’t charge extra for high availability software features such as mirroring, snapshots, replication and stretch clustering.” Peter Godden, VP, EMEA at Zerto: “Every day we’re reminded that the growing complexity of data centre technologies and troves of data pose a significant challenge to not just C-level technologists, but business leaders as well. World Backup Day is a light hearted way for organisations of all industry types and sizes to take stock of the importance of protecting and recovering their critical IT infrastructure as they see the stark realities associated with this critical task. More than a celebration of backup, the day marks the overarching significance of making critical data and applications available to a point within seconds of loss either due to man-made, natural, or criminal disaster. This approach to creating the highest levels of confidence of ‘readiness’ among IT and business leaders is really the heart of what helps bring World Backup Day to life, particularly for highly regulated industries such as healthcare and financial services.” Geraldine Osman, VP International Marketing, Nexsan: "On World Backup Day it's important to draw attention to the wasted money spent on storing infrequently accessed data on primary storage, which slows down backups and raises costs. We estimate that 80 per cent of data on primary storage can be easily identified as data that never changes, or changes infrequently. By offloading this data onto more appropriate storage, companies can ensure it’s backed up, whilst simultaneously speeding up the backups of their now lighter primary storage system. By choosing the right storage for the task you can have a backup strategy that saves your organisation time and money, even in this day of ever growing data.” Nigel Tozer, Solutions Marketing Director, EMEA, Commvault: “Your data, your responsibility’ is a clause embedded in the majority, if not all, public cloud contracts. The big public cloud providers typically use snapshots and replication to reduce the risk of data loss, but neither of these offer the protection of an actual backup copy. Clearly this isn’t so much of a problem for backup data sent to the cloud, but for critical data that only resides in the cloud, World Backup Day is a great reminder to check exactly what protection is in place and take action if needed to avoid undue risk.” [easy-tweet tweet="World Backup Day can easily be observed by spreading the word of #data integrity and automatic #backup"] World Backup Day can easily be observed by spreading the word of data integrity and automatic backup. Share World Backup Day with all your friends and family. The virtues of both local and offsite data backup sites should be extolled. ### Hong Kong tops Cloud Readiness Index 2016 Hong Kong has come out on top in this year’s Cloud Readiness Index (CRI), representing a climb of four places. 2016 also marks the first time that the study has included non-Asian markets for comparative analysis. However, the comparison does not make great reading for Western countries, with the CRI placing the likes of Hong Kong, Singapore, New Zealand and Australia above Germany, the UK and the US. 
[easy-tweet tweet="Hong Kong has come out on top in this year’s #Cloud Readiness Index" hashtags="CRI, Asia"] The index, compiled by the Asia Cloud Computing Association, suggests that physical infrastructure is one of the key reasons why the Asia Pacific region outperforms other markets. Strong performance in terms of international connectivity, broadband quality, green policies and data centre risk all contributes to Asia’s heightened cloud readiness. A cloud divide emerges The 2016 Cloud Readiness Index does contain some warning signs for the Asian market, however, with a cloud computing divide on the horizon. The difference in scores between lower-ranked countries is much more pronounced, with a difference of 12.5 points separating the 8th and 9th placed countries, compared to an average difference of just 2.6 points. When coupled with the fact that the top eight nations remain unchanged since the 2014 report, it suggests that the gap in cloud readiness could widen. Unchecked, this digital divide could have wider economic ramifications for the region, particularly given the increasing importance of cloud computing. Reasons for this divide are difficult to pin down but could stem from the multi-year digitisation programmes that some countries are implementing. These plans include gCloud, broadband and other connectivity rollouts that have seen certain economies rapidly improve their cloud readiness, leaving other markets behind. Hong Kong’s Digital 21 strategy and Singapore’s iN2015 Masterplan are both examples of successful digitisation projects, and other nations will need to formulate effective proposals of their own if they are to avoid falling further behind. The next step Many of the leading markets in the CRI, however, already have plans in place to further improve their cloud readiness by adapting to changes in the wider digital economy. For example, cybersecurity has become a much greater concern in recent years and the 2016 Cloud Readiness Index includes a new parameter to reflect this. Subsequently, many nations, including Australia, Malaysia, New Zealand, Singapore and Thailand, have amended their cybercrime legislation. Further digital development plans are also afoot in countries all over the world and the latest CRI report makes it clear that countries must continue to develop their cloud infrastructure in order to ensure a prosperous future, whether based in the Asia Pacific region or elsewhere. [easy-tweet tweet="A #cloud computing divide could be on the horizon for Asian markets" hashtags="CRI, Asia"] “As data becomes the currency of the future digital economy, and as cloud computing continues to mainstream as a technology, ensuring the seamless flow of data through cloud infrastructure becomes central to a country’s cloud readiness,” the report reads. “Countries must be aware of where they stand in preparation for this, and to this end, the Cloud Readiness Index has been developed to provide perspectives which work to ensure that Asia Pacific economies do not lag behind global technology trends.” ### Hybrid is now: How to manage on-premises and cloud applications Modern businesses are well on the way when it comes to cloud adoption. Whether they hope to take advantage of the cost savings or simply increase their overall agility, this ‘cloud first’ mentality often leaves IT professionals tasked with managing applications in a challenging new environment: the hybrid data centre. 
[easy-tweet tweet="A ‘#cloud first’ mentality can leave IT professionals managing applications in a challenging #hybrid environment"] Despite the rapid rate of cloud adoption, it’s expected that many businesses will continue to have a significant part of their daily operations tied to physical infrastructure for the foreseeable future, due to cloud economics, as well as security fears and regulations – the cloud is for everyone, but not necessarily for everything. Rather than re-architect applications for cloud, IT teams are most likely to ‘lift and shift’ applications from traditional infrastructure to the cloud, which in turn can cause a whole set of challenges when it comes to management. At the end of the day, unless it’s a brand new application architected from the start to be cloud-aware, IT pros face the challenge of running an app in the cloud which was designed for the ground, and so has many of the same properties as one built for on-premises – and the same challenges: uptime, performance management, patches, issue resolution, capacity planning, etc. The ultimate objective of the IT pro when it comes to applications is to give the end user the best application performance possible, and therefore silos within teams often do not work. With this reality in mind, SolarWinds offers several best practices to help IT pros align the management of on-premises and cloud-based applications in this new reality of Hybrid IT: End user and application focus In a hybrid environment, there is less of a focus on infrastructure components and department silos defined by technology layers, and more focus on end user experience and overall application uptime. Traditional infrastructure serves only to make the application work, whereas cloud infrastructure is ephemeral and dynamic. In order to achieve resilience and flexibility, IT professionals should note that in the cloud servers are disposable, and so if they begin to fail or problems arise, it is time to kill and replace them. They should also consider ways to implement a more iterative process in order to allow teams to move faster and serve the business more effectively. It’s also important to remember automation, deployments, tests, monitoring and alerts when it comes to faster end-user servicing. One unit with one goal The ultimate objective of the IT pro when it comes to applications is to give the end user the best application performance possible, and therefore silos within teams often do not work. There should be one unanimous IT team rather than separate database, virtualisation and storage teams. The team as a whole should be responsible for the performance of applications, so when an application is down, it’s everyone’s role to help rectify the issue. This level of teamwork requires a shared set of goals, transparency, and a consistent set of tools that provides everyone with a consistent set of metrics and visibility across the stack. The IT team needs to have the same common goal: maintaining application uptime and improving the end user experience. Research cloud providers When evaluating a cloud provider, IT professionals should be careful to do their research based on the individual requirements of each workload. Administrators should pay close attention to a cloud provider’s upgrade and patch processes and service interruptions, SLAs, recommended architectures and available services and capabilities. The right questions need to be asked when making the move to a hybrid data centre: What is the security model? 
How resilient is the environment and what are the architectural implications? What are their response times? What will they take responsibility for? How will they help me enforce governance rules? How do they support meeting compliance requirements? Full stack monitoring Transitioning workloads to the cloud – even if just a few – can be a mammoth task. A successful transition requires a reliable, full-stack monitoring system. The cloud allows more flexibility, more control and instant changes, which is great but will also be reflected in your monthly bill. This creates a responsibility and an opportunity to optimise everything – all elements should be tried and tested, and the impact of every change known. When managing a hybrid environment, combined visibility is key, and so is adopting a monitoring model which allows you to review the health of both physical and cloud infrastructure in one dashboard, and helps compare performance and resource utilisation on premises and in the cloud. The end goal is performance certainty, both in terms of end-user experience and resource utilisation (and cost). [easy-tweet tweet="Due to economics, as well as security fears – the #cloud is for everyone, but not necessarily for everything"] These best practices can be useful when applied to both traditional on-premises IT and cloud. The goal of moving certain workloads to the cloud is that the IT department can begin leveraging cloud-based principles and some of the new services more efficiently, providing them with the benefits cloud natives enjoy. Through a holistic approach of adopting these best practices, IT pros can manage both on-premises and cloud-based applications, allowing for a faster and more agile business. ### Fast cars and fast data: Formula One lays out a winning circuit for the modern enterprise Big data drives the modern enterprise. It is now routinely collected and shaped and deployed to power applications spanning all aspects of operation—from production to ERP to supply chain to CRM, the list can go on and on. And the speed at which all of this transpires is astounding. Fast Data now drives near real-time customer interactions, instantaneously adjusts processes for optimisation, and curbs security breaches automatically. The scope of just how great an impact our collective shift to big data-based operations has made on the nature of business can best be illustrated by looking to... Formula One (F1) racing. [easy-tweet tweet="The biggest common denominators for both F1 and the modern enterprise are complexity and speed" hashtags="F1, BigData"] In the interest of full disclosure, I grew up in Monaco, so I was exposed to F1 from an early age and am a serious fan. And while that may colour my views on Fangio and Vettel, it doesn’t negate the fact that the apex of international auto racing is actually analogous to a data-driven enterprise operation in a number of ways: Some enterprise functions require strategic real-time data to take immediate action, akin to the split-second decisiveness required of skilled F1 drivers during the course of a race. Other functions also require precision, but permit more time to identify patterns based upon recurring data and make decisions informed by both past experience and the demands of present circumstance, akin to the efforts of the pit lane crew. 
And all those involved in managing enterprise operations are kindred to the F1 engineers who leverage disparate information and expertise to make and continually improve those beautiful F1 chassis and engines. Likewise, their enterprise counterparts leverage disparate data to improve the way a business runs. The biggest common denominators for both F1 and the modern enterprise are ever-advancing technical complexity and speed. For smooth function and any chance of success in both realms, the right data needs to get to the right people at the right time—and it needs to do so with consistency and great speed. Think about enterprise managers and how they parallel F1 engineers. They rely on data to provide context on how the business or team is doing internally, as well as context on how the market space or particular race is functioning. They have a game plan, but are continually examining conditions and looking for the best set-up—the one that is going to produce the best results given the particular situation. Increasingly, what is needed is the ability to make quick contextual decisions as well as the flexibility to adapt based on all the data coming in. Data in the enterprise is fuelling a similar course for business, where innovation has advanced our capabilities exponentially over a few short years, and again we’re witnessing the marvel of speed Today, you need the data fast as well as the ability to align those decisions and execute rapidly. In business, the Netflix model is the obvious example: leveraging analytics and speed with a massive data-management effort to create a solid and scalable operation. And the velocity at which that particular organisation is adapting is quite impressive: they exemplify a philosophy that embraces agility. This doesn’t mean every enterprise has to do exactly what Netflix does. But just as no two F1 cars are identical, all F1 cars do have to function under the same “formula” equipment and rules. So while the “Netflix way” won’t work for every organisation, it does present an indication of how enterprise tools and techniques are rapidly evolving, and enterprises will have to address those facts. Innovation in the effective management of big data, particularly in areas such as real-time analytics, has redefined performance capabilities both on and off the track. Because the speed of innovation has spiked so rapidly where data collection, processing, and actionability are concerned, one emergent way of addressing it is through partnerships, which have become very important both in F1 and in enterprise operations. In F1, you’ll see tyre-manufacturer Pirelli supplying mountains of important diagnostic information about performance under various conditions to the various F1 engineering teams, which they incorporate when testing and preparing their cars for competition. Likewise, you’ll often see design entities collaborating to create a team’s cars; hence the difference between entrants and constructors in F1 (for example, team Ferrari has a Ferrari engine this year, but so do team Sauber and Toro Rosso). The same is true in industry, where more and more enterprises integrate a variety of third-party technologies to best enhance the performance of their core operations, and ecosystem-building has become increasingly important across the board. [easy-tweet tweet="The Fast Data race in enterprise is already well under way and shows no sign of stopping." 
hashtags="BigData"] Formula One as a sport has evolved over the last 50 years from relying entirely on tinkering, instinct, recollection, and hand-written signs to the use of advanced computer-assisted design, telemetry, event processing, and deep analytics as a matter of course. The rapid rate of innovation advances the capabilities of the cars and their drivers, and race fans get to witness the breathtaking marvel of speed. Data in the enterprise is fuelling a similar course for business, where innovation has advanced our capabilities exponentially over a few short years, and again we’re witnessing the marvel of speed. The 2016 Formula One season kicked off on March 20 with the Australian Grand Prix in Melbourne and will culminate in November with the final Grand Prix in Abu Dhabi. The Fast Data race in enterprise is already well under way and shows no sign of stopping. The chequered flag awaits! ### Building a hybrid cloud for your enterprise? Here are the pros and cons you should consider The growth of cloud computing over the last decade has been unprecedented. From a relatively unheard-of concept, the cloud has become a key part of the boardroom conversation amongst CIOs and directors of IT at companies across industries, sizes, and revenues for its promise of organisational transformation. [easy-tweet tweet="From a relatively unheard-of concept, the #cloud has become a key part of the boardroom conversation"] A large number of enterprises have already built their own private cloud networks, hosting essential applications and providing anywhere, anytime access to mission-critical data for employees scattered across the world. In many cases, the effort pays off, resulting in increased productivity, reduced costs and ease of access. And there’s no shortage of companies that have built public cloud offerings to leverage this trend - tech giants like Amazon, Google, Microsoft, and Oracle have all entered the market in the last few years, not to mention the thousands of smaller service providers offering more niche solutions. These services can be cheaper alternatives to building a private cloud infrastructure, or provide a necessary extension to the limits of a private cloud, allowing for occasional bursts in computing capacity. The biggest trend, however, is to blend the benefits of both private and public cloud offerings to create a hybrid cloud infrastructure. These hybrid clouds exploit the control and security of a private cloud, along with the flexibility and low cost of public cloud offerings. Together they form a powerful solution to meet the demands on IT from the rest of the organisation. The biggest trend, however, is to blend the benefits of both private and public cloud offerings to create a hybrid cloud infrastructure. Cloud computing has a seemingly endless list of reasons why companies should adopt it: cost savings, improved security, enhanced agility, greater accessibility and flexibility, among many others. However, any change to IT infrastructure poses risks and challenges that must be taken into account and built into the plan for rollout. For that reason, we’ve put together this guide with best practices for building a hybrid cloud. Hybrid Cloud – The benefits Cost savings - One of the foremost benefits of implementing a hybrid cloud is not having to spend the money to build infrastructure to withstand occasional bursts in system usage that happen only a small fraction of the time. 
Instead, organisations can leverage public cloud offerings to offload some of the heavy usage, and only have to pay for it when they need it. With less money spent on infrastructure, more funds can be devoted to other critical projects that help move the business forward. Improved security - While the perception that the cloud is insecure is a persistent one among members of traditional IT teams, many believe that customers of service-provider environments actually suffer fewer attacks than on-premises users do. The myth that cloud computing is less secure than traditional approaches stems largely from the fact that having things stored off-premises feels less secure; however, this does not reflect reality. Enhanced organisational agility - By leveraging the public cloud in times of heavy usage, organisations can experience fewer outages and less downtime. For developing and testing new applications, the hybrid cloud also offers an attractive option for hosting them - buying time until a decision is eventually made as to where to host them permanently. Greater accessibility - With employees becoming increasingly mobile, greater accessibility to business-critical applications is a necessity for any 21st century enterprise. Gone are the days when employees only needed to access their email when they were at their desks, or only needed to update a spreadsheet or access an application during business hours. Today, business happens 24/7, and for companies to compete effectively, the cloud offers the advantage of anywhere, anytime access. [easy-tweet tweet="Greater accessibility to business-critical applications is a necessity for any 21st century enterprise" hashtags="mobility"] Hybrid Cloud – The challenges Tools and skills - Operating a hybrid cloud solution effectively takes tools and skills. Not everyone has these skills, and it can be costly to get them. If your organisation has recently decided to move to the cloud, it might be necessary to look for outside talent that has the necessary skillset to accomplish the transition. Moreover, the team implementing the project will probably need additional training to learn the systems, and all of this costs time and money, bringing us to our next point… Cost - As always, cost is key. It plays a major role in planning to execute a hybrid cloud strategy. While the public cloud can offer an attractive option for its flexibility and relatively low cost to operate, building a private enterprise cloud requires significant expenditure and can become expensive very quickly with all the physical hardware necessary. At the same time, heavy use of public cloud resources can rack up unexpectedly high usage bills that may not have been planned for. Security - It’s at the forefront of everyone’s mind when they think of the cloud. While we’ve already seen that cloud computing is not inherently any less secure than traditional computing, and in fact faces fewer attacks, there are still considerations to take into account when building out a hybrid cloud. The proper precautions must be taken to ensure data is properly protected and that the right people maintain control. Additionally, depending on the industry, there may be regulatory requirements that prohibit data from being stored off-site, which would prevent the use of a public cloud entirely. Depending on the industry, there may be regulatory requirements that prohibit data from being stored off-site, which would prevent the use of a public cloud entirely. 
Data and application integration - This is a particular challenge when taking the move to a hybrid cloud into account. Applications and data exist in a symbiotic relationship, with each one being useless without the other. Oftentimes they’re chained together. So when considering where to store each of them, it’s essential to ask whether the infrastructure they’re placed on matters. For example, if an application lives in a private cloud and its data lives in an on-premises data centre, is the application built to access the data remotely? Technologies like copy data virtualisation can decouple data from infrastructure and make this problem less of a headache. Compatibility - Across infrastructure, compatibility can prove itself to be a major issue when building a hybrid cloud. With dual levels of infrastructure, a private cloud that the company controls, and a public one that the company leverages, the chances are that they will be running different stacks. Can you manage both using the same tools, or will your team have to learn a new set in order to effectively oversee them? Networking - Another important factor to consider in hybrid integration is designing the network around it. For instance, will very active applications be living in the cloud? It’s necessary to consider the bandwidth usage that this could take up on the network, and whether or not it could cause problems in bottlenecking other applications. [easy-tweet tweet="Across infrastructure, compatibility can prove itself to be a major issue when building a hybrid cloud"] Like every IT project, building an enterprise hybrid cloud brings many benefits and some challenges. When properly accounted for during planning, organisations can minimise these difficulties and maximise the value they bring for the company. ### How the cloud has enabled BYOD It is predicted that in 2016 BYOD will become a business necessity rather than a privilege. In a recent survey, 72 per cent of organisations polled were permitting or planning to implement BYOD. BYOD includes employees accessing corporate data such as emails or documents through the use of their own devices. The cloud has been at the forefront of enabling this change. [easy-tweet tweet="It is predicted that in 2016 #BYOD will become a business necessity rather than a privilege"] There are many benefits to using cloud storage over the more traditional computer data storage. Cloud storage offers flexibility. It means that data is always available to the user, from any location with Internet access. Three of the most popular types of storage are iCloud, Dropbox and Google Drive. The most successful is Google Drive. Its capacity to keep data in one place, using online storage, and the flexibility to continue working even when offline, makes it user friendly. Dropbox allows the simple sharing of a number of different devices, whilst iCloud backs up mail, calendar, contacts and storage. The main concerns around BYOD are safety and security. Using strong passwords with a range of numbers, letters, capitals and symbols is strongly advised These types of storage and their many components such as Google Docs, Google Spreadsheets and online calendars have made working more streamlined. With these facilities teams can work anywhere and at anytime. It’s unsurprising that businesses have embraced these changes especially as recent research has indicated that it makes their workforce more efficient and productive. The main concerns around BYOD are safety and security. 
Using strong passwords with a range of numbers, letters, capitals and symbols is strongly advised. Other obvious recommendations include not reusing passwords and never sharing them. If you are using your own device, password security is unlikely to be breached; however, using browsers within the workplace can confuse matters. Often browsers will ask if you’d like them to remember that password. If you don’t sign out of the browser, a colleague could use your login information. Always remember to log out. Using a trusted BYOD platform such as Purple is also key to providing a better experience for staff using their own devices. Security is essential. The WiFi network must be encrypted and the connection secure. Padlocks don’t guarantee security, so always check that your service provider is the best you can find. [easy-tweet tweet="A trusted #BYOD platform is key to providing a better experience for staff using their own devices" hashtags="Security"] With more and more companies adopting a BYOD IT policy approach, the need to assess how they implement and secure their data against potential breaches is ever more important. BYOD is here to stay. ### Collaboration, containers & AI: Trends in open source reflected by “Rookies of the Year” Since 2008, Black Duck has done an annual review of the world of open source to recognise top new projects launched during the previous year. The selection committee is made up of members of our Product Management and Product Marketing teams with data analysis help from the team that manages Black Duck’s OpenHub.net service. Our goal is to leverage Black Duck’s unique insight, as a company that has amassed a KnowledgeBase of nearly a million and a half open source projects, to shed light on projects that might not otherwise be on the radar of mainstream tech media. [easy-tweet tweet="Much has changed in open source since the first Black Duck Open Source Software awards in 2008" hashtags="OpenSource"] The 2015 Black Duck Open Source Software “Rookies of the Year” were announced in early March 2016, marking the eighth year of the awards. Much has changed in open source since our first awards in 2008. Open source is no longer the domain of academia and idealistic developers. OS components have become essential building blocks for many enterprise and commercial applications. OS development is now essential to the business strategy of the majority of software development companies. The open source rookie class of 2015 reflects those dynamics. For-profit software companies sponsored and contributed to the bulk of the top projects. Businesses choose an open source development model for many reasons, but one thing is clear: open source is not viewed merely as a tool for competitive success and profitability, but also as a strategic element in achieving those goals. In some cases, the open source projects are adjuncts to their sponsors' core products or offshoots of internal development initiatives. In other cases, the projects are drivers of the core products themselves. Businesses choose an open source development model for many reasons, but one thing is clear: open source is not viewed merely as a tool for competitive success and profitability, but also as a strategic element in achieving those goals. 3 trends shaping the open source industry in 2015 As we went through the selection process, I was struck by three trends in the state of open source. 
Open Collaboration: Enterprise real-time communication and collaboration continue to be fertile grounds for open source innovation. Many of us, me included, are dependent on tools like GoToMeeting and Slack. But what if you don’t want to be locked into a proprietary solution? Our rookies class has three great options: Rocket.chat and Mattermost are Slack alternatives, while Hubl.in provides simple and lightweight video conferencing Docker Containers: Just two years ago we added Docker to our Open Source Rookies list. Since that time, market interest in DevOps, rapid deployment technologies, and Docker in particular has exploded. There were many projects launched in 2015, and it was hard to choose which ones to name on the rookies list. In the end we selected four. Kontena and Nulecule (part of Red Hat Atomic) provide solutions for orchestrating and simplifying deployment in complex environments. Chef’s Inspec helps companies add automated compliance testing to their DevOps environment, while Capital One have put their own DevOps dashboard solution, Hygieia, into open source. Artificial Intelligence:  We may still be a way from truly self-aware machines, but development teams are racking up impressive achievements in helping computers get smarter at getting smarter. IBM’s Deep Blue supercomputer beat Grand Master Garry Kasparov at chess in 1997. In 2011 IBM’s Watson trounced humans at the quiz game Jeopardy. And in March 2016, researchers at Google DeepMind, the Alphabet-owned artificial intelligence research company, announced that they had created an artificial intelligence system that beat a professional Go player in a best of five match-up. Facebook CEO Mark Zuckerberg responded through Twitter that his company’s AI researchers are also close to beating humans at the game. Both Google and Facebook are very active in the open source community and this year we recognised two projects backed by them. The first is Bazel, a large scale build automation system developed and used by Google. The second is React Native, an extension to the React mobile development framework, sponsored by Facebook. Both of these projects look to be very promising solutions for development teams. Enterprise real-time communication and collaboration continue to be fertile grounds for open source innovation The full list of 2015 Black Duck Open Source Rookies of the Year include: Rocket.Chat – an open source web chat platform built for communities and companies wanting to privately host their own chat service. Mattermost – an open source, on premise Slack alternative, written in Golang and React. Hubl.in – a free and open source video conferencing solution. MXNet – a lightweight deep learning library created by DMLC, the people behind CXXNet, Minerva and Purine2. Bazel – a subset of Google’s internal software development tools, building software quickly and reliably through a shared code repository in which all software is built from source. React Native – a Facebook-sponsored framework for building native mobile applications using the React JavaScript library. Kontena – an open source container management solution “built to maximise developer happiness.” Nulecule – a specification for packaging complex multi-container applications while ensuring smooth deployment across all instances. InSpec – an open source compliance testing framework for specifying compliance, security and policy requirements. Hygieia – Capital One’s enterprise DevOps dashboard, released last year as an open source project on GitHub. 
Glucosio – one of the world’s first open source diabetes monitoring applications. Honorable Mention: Vault – a tool for securely accessing API keys, passwords, certificates, employee credentials and other sensitive resources. Honorable Mention: RancherOS – a miniscule Linux distribution specially designed to be the easiest way to manage Docker containers. Honorable Mention: OWASP Security Knowledge Framework (SKF) – a free, open source web app security system based on OWASP security standards. This year's Rookies are impressive examples of how far open source has come, with start-ups like Mattermost and Glucosio as well as big players like Google, Facebook and Red Hat leveraging the open source community to help drive innovation in everything from DevOps and Docker container solutions to diabetes monitoring and real-time communication. [easy-tweet tweet="Many businesses are leveraging the open source community to help drive innovation" hashtags="OpenSource"] All these are sophisticated initiatives where the open source approach is a core part of the business strategy for speeding development, promoting adoption and providing the most value to their customers. Is there another Docker in the group?  Only time will tell.  But when you talk to the really smart people behind these projects and say to yourself, “I'd love to work on that,” you know you have a group of winners. ### GDPR compliance: Don’t cloud the issue of data protection by design Following a period of lengthy debate, fine tuning and approvals from various political entities, the EU General Data Protection Regulation (GDPR) will soon come into effect. With just one final stage of approval yet to come, the GDPR is set to finally become law this spring, leaving organisations with two years to achieve full compliance with the regulation. [easy-tweet tweet="With just one final stage of approval yet to come, the #GDPR is set to finally become law this spring" hashtags="compliance"] Whilst two years may seem like a sufficient amount of time, the fact is that IT teams are faced with the challenge of controlling a hugely complex web of cloud use in the workplace. As a result, GDPR compliance will be tricky: recent research conducted by Netskope and YouGov revealed that almost 80 per cent of IT professionals in medium and large organisations do not feel confident that they will be able to ensure compliance with the regulation before the expected deadline of spring 2018 falls. almost 80 per cent of IT professionals in medium and large organisations do not feel confident that they will be able to ensure GDPR compliance before the expected deadline of spring 2018 Organisations striving for GDPR compliance must consider enterprise cloud app use. It is a difficult hurdle to clear: cloud apps create unstructured data which is not only more difficult to manage but also explicitly included within the regulation. The research found that while nearly a third of IT professionals admit to knowing full well that shadow IT is rife in the organisation – meaning that employees are using unauthorised cloud apps at work, placing data at risk – only seven per cent have implemented a solution to deal with this problem. Blanket block policies when it comes to apps are not an option because cloud app use leads to such huge productivity gains. Businesses have to find a balance, enabling continued use of cloud apps while ensuring that structured and unstructured data, both at-rest and in-transit, are protected. 
But how can organisations securely allow employees to use cloud apps while ensuring GDPR compliance? Under the GDPR, companies are required to take active measures to protect their data. Legal arrangements such as policies, protocols and contracts are not sufficient to guarantee GDPR compliance. Instead, organisations must ensure data protection and compliance in all areas by implementing deliberate organisational and technical measures. Known as ‘data protection by design’, these actions extend beyond the traditional security measures aimed at ensuring data confidentiality, integrity and availability. Both cloud vendors and cloud-consuming organisations must recognise that the GDPR will have wide-ranging and significant ramifications GDPR compliance will not be possible if organisations do not control and secure data in cloud apps. Closely managing a business’ interactions with the cloud is a good starting point. In order to achieve this, IT needs to: Discover and monitor every cloud application in use by employees. Know which personal data sets are being processed by employees in the cloud – for instance, customer information such as name, credit card details, address, or other forms of personally identifiable information (PII). Secure data by implementing policies to ensure that employees are not using unmanaged cloud services to store and process PII. Policies should be sufficiently granular in order to prevent unwanted behaviour while simultaneously ensuring compliant use of the cloud can continue. Coach users in best practice so they adopt the services sanctioned by IT. Use a cloud access security broker to evaluate the enterprise-readiness of all cloud apps and cloud services so the business can guarantee that all data is protected both at rest and in transit. [easy-tweet tweet="#Cloud apps and #ShadowIT in the workplace has made personal data more difficult to control" hashtags="compliance"] The explosion of cloud apps and shadow IT in the workplace means that personal data has become even more difficult to track and control. Both cloud vendors and cloud-consuming organisations must recognise that the GDPR will have wide-ranging and significant ramifications in terms of data control and protection. IT security teams will need to take full advantage of the two-year grace period before penalties for non-compliance come into effect. Examining cloud app use in the organisation is the best place to begin your journey towards GDPR compliance. ### Let’s go hack your email today I am sure that you still remember Microsoft's famous Windows tag line “Where do you want to go today?”, which was the title of a large Microsoft advertising campaign. Well today I’d like to go hack your email… figuratively speaking that is… [easy-tweet tweet="When it comes to explaining #email #encryption, there is no better way than walking the walk"] I talk to a lot of business and cloud service providers out there and there is one thing I have learned over the years. When it comes to explaining email encryption, there is no better way than walking the walk instead of just talking the talk. With that in mind I went last year to an event in Johannesburg, South Africa with a different approach than the usual discussion about email encryption. This time around I set up the workshop in order to demonstrate how an email is hacked as opposed to how it is protected and the response I got from the audience was radically different to anything else I had seen up until then - they simply got it. 
So this is what I will be doing here, I will just show you how an email gets hacked and how easy it can be. Hacking an email account Hacking someone’s email doesn’t take much more than a telnet session and some basic knowledge. In this example an AppRiver security analyst performs an IP scan, an SMTP login and creates new messages right from a telnet session. The analyst then goes on to demonstrate how someone with the right credentials is able to filter through incoming or outgoing email traffic to find and read any email message. [embed]https://youtu.be/fgI-vGiROYk?list=PLYEHhg0bNgd0Rf1jAelIFLW_dTuPS2q_y[/embed] Your email is a postcard How hard is it to hack an email account? The difficulty of hacking someone’s email varies with the level of security in place, but overall it is a task that even a ‘junior’ hacker can complete fairly easily. The degree of difficulty may increase depending on the barriers to entry, but as we have all seen in the news in recent years it happens on a regular basis, and examples of email hacking come from many directions: cyber criminals, intelligence agencies, business competitors, former employees, parties to legal battles - the list of cases hitting the news is pretty much endless. So you really need to start thinking of an email just as you do of a postcard. You are sending a postcard across the world and any postman or other person handling it can read it if they decide to do so. Now, would you include contracts, payrolls, patient information, bank accounts, critical business decisions or any confidential information on a postcard? So what makes you think that unencrypted email is just fine? Encrypted email You have seen how easy it is to intercept and read any email in the previous video, so now let’s see what happens when you run the same approach on encrypted content. A security analyst mimicking a hacker starts by intercepting the message and then proceeds to access the content, but the difference comes when he tries to read that content. The hacker may get hold of the message, but it will be fully indecipherable and his attack will therefore end up being useless. [embed]https://youtu.be/OhYCfw52ics?list=PLYEHhg0bNgd0Rf1jAelIFLW_dTuPS2q_y[/embed] Encryption: It’s hard to use One of the reasons why businesses have been holding back when it comes to encryption is that they find it hard to use and not intuitive enough. That was a fair excuse back in the days of PGP deployments that frustrated users to no end, but it is definitely not something you can say today. There are many examples showing that encryption technology has become easy to use: just trial encryption on your corporate email for a couple of days and you will see how easily it integrates with Outlook, Chrome or mobile. [easy-tweet tweet="Businesses can hold back when it comes to #encryption because they find it hard to use or not intuitive enough"] The good news is that with clear examples such as the ones seen above, companies understand why they need to roll out email encryption to at least the key departments that manage sensitive corporate data, and they are finally shutting down these hacking attacks and keeping their data secure. ### Global governance is what makes big data valuable The bigger big data becomes, the more valuable it gets.
Most CIOs understand that the benefits of analytics increase when you collect, store and analyse data from more sources, in greater quantities and without latency. Give a global fashion retailer access to real-time data from its entire store footprint and it can not only react more quickly to new trends but also keep a tighter rein on costs across its supply chain. This is big data working as a strategic asset. In recent Capgemini research, 61 per cent of organisations acknowledged that big data is now as valuable as their actual products and services. [easy-tweet tweet="61 per cent of organisations acknowledge that big data is now as valuable as their products and services" hashtags="BigData"] But the bigger big data becomes, the harder it is to manage. For CIOs, it’s a problem as intractable as it is inevitable. Sooner or later, you run into the laws of physics. If our fashion retailer wants to store all its transactional data in the cloud, enabling the firm to run complex analytics on customer behaviour patterns, the CIO knows all too well that the minute the data exceeds one or two petabytes (incidentally the point at which it becomes really useful) it also becomes immovable. This is of course great news for the cloud providers. With widely accepted open standards emerging only slowly, the illusion of choice is, for now, cloud’s dirty secret. Once a customer is signed up, switching between providers becomes physically impractical. The most cost-effective method of moving a petabyte of data from A to B has long been to rent a room from your cloud provider, transfer the information onto a lorry-load of hard discs, and FedEx them to the data’s new home. The irony for organisations navigating the transition to digital business is that big data, the Holy Grail of transformation, is simply too big to handle digitally. So how do global businesses run global analytics on immovable data? The solution is to behave like a mining company. The size of the mine dictates that you take the digger to the mine, not vice versa. If the data is too big to move, you take your analytics to the data. This is fine in principle, but the nature of global business operations is that data tends to be generated according to the local environment. Our fashion retailer, for example, may have its marketing operations and headquarters in the UK, manufacturing operations in India, and a growing sales market in China. With hundreds of millions of Chinese consumers, the business needs access to as much transactional data as it can lay its hands on. While the Chinese government may decide that any sales data created in China must remain within its borders, given the likely size of its Chinese data set, the retailer will be forced in any case to position its analytics locally. The problem is that the Chinese arm of the business uses different metadata – different ways of describing the most important data – making it very difficult for the analytics to function. The Chinese office may, for example, use different descriptive terms for fundamental items such as customer, size, colour and place of purchase. And as the business grows, the metadata grows, and it becomes harder and harder to set data norms that all employees can easily follow. The company may be a global business, but its analytics are operating at a local level.
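One practical expression of that metadata problem, sketched below purely for illustration (the field names and regions are invented, not taken from the retailer example), is a thin mapping layer that translates each region's local terms into the global schema before the analytics run.

```python
# Illustrative sketch: normalise region-specific metadata to a shared global schema
# before running group-wide analytics. The field names and regions are invented.
GLOBAL_SCHEMA = ["customer_id", "size", "colour", "place_of_purchase"]

REGION_FIELD_MAP = {
    "cn": {"buyer_ref": "customer_id", "cut": "size",
           "shade": "colour", "store_city": "place_of_purchase"},
    "uk": {"cust_id": "customer_id", "size": "size",
           "colour": "colour", "store": "place_of_purchase"},
}

def normalise(record: dict, region: str) -> dict:
    """Map one local sales record onto the global field names, dropping extras."""
    mapping = REGION_FIELD_MAP[region]
    mapped = {mapping[field]: value for field, value in record.items() if field in mapping}
    missing = [field for field in GLOBAL_SCHEMA if field not in mapped]
    if missing:
        raise ValueError(f"{region} record is missing global fields: {missing}")
    return mapped

# Example: a Chinese store record becomes readable to analytics built in the UK.
print(normalise({"buyer_ref": "C-1042", "cut": "10", "shade": "indigo",
                 "store_city": "Shanghai"}, region="cn"))
```

The mapping itself is trivial; the governance question is who owns it, who is allowed to extend it as new local terms appear, and how lightly that can be done without slowing local teams down.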
It’s becoming increasingly important for multinationals to understand not only the growing importance of big data as a strategic asset, but also the governance required to manage it as an asset. Big data is only effective in a global organisation if it is treated as a global asset. For our fashion brand to successfully sell jeans to Chinese consumers at scale, it needs to know exactly how many Chinese consumers are buying size 10 following a sales promotion, which colours are most popular, where those sales are occurring and what else customers add to their baskets. This information must travel quickly across the company’s global supply chain for the company to maximise the sales opportunity. But if the company’s brand new stock management analytics (created in the UK) can’t read Chinese metadata, the business opportunity will be lost. For data analytics to be successful across the globe, a light touch is required. It’s up to the CIO – together with her business peers – to work out the minimum governance required to turn the data into an asset that drives global collaboration. That demands a simple, consistent blueprint, a set of rules that enable collaboration rather than inhibit it. It also requires a governance that mirrors the specific dynamics of the organisation, balancing the needs and realities of global and local. [easy-tweet tweet="Big data is only effective in a global organisation if it is treated as a global asset" hashtags="BigData, cloud"] Managing big data as a business asset is getting progressively more difficult. Fragmented privacy laws remain difficult for CIOs to work with. Data’s sheer size means real-time insights can be difficult to generate, while the data itself is often immovable. Yet the biggest barrier holding CIOs back is governance of an increasingly vital business asset. Until organisations can create a global blueprint to enable simple collaboration, they’ll never realise the full value of their investment in big data. ### Bitcoin, blockchain and cryptocurrencies: What’s all the fuss about? Part Two In part one we looked at the buzz surrounding bitcoin and other cryptocurrencies, including why they are about so much more than just cybercrime. In part two we discuss what is drawing businesses to new blockchain-based cryptocurrencies. As Ben Broadbent (in 5th), the Bank of England's Deputy Governor, recently explained: “The main point here is that the important innovation in bitcoin isn’t the alternative unit of account – it seems very unlikely that, to any significant extent, we’ll ever be paying for things in bitcoins, rather than pounds, dollars or euros – but its settlement technology, the so-called ‘distributed ledger’.” The potential for distributed ledger technology to make retail payments more efficient, and the financial system as a whole more resilient, is behind the bank’s focus on RSCoin. [easy-tweet tweet="Where #bitcoin is unregulated and slow, RSCoin would be regulated and need to be much faster, says @BillMew"] RSCoin would be a closed, “permissioned” blockchain offering the advantages of digital currencies – fast and cheap transactions permanently recorded in a shared distributed ledger – without the troublesome openness of the Bitcoin network, where anyone can be an anonymous node on the network.
Instead of anonymous miners, only the central bank and vetted financial operators would be allowed to validate RSCoin transactions. And where bitcoin is unregulated and slow, RSCoin would be regulated and would need to be much faster. It takes quite some time for individual Bitcoin transactions to be confirmed as having consensus approval: around 10 minutes on average. Newer algorithms such as Ethereum are faster, processing transactions in 17 seconds. Central banks will need to improve dramatically on this. And other players are piling in too. The Toronto Stock Exchange (its parent TMX Group is in 6th) has hired a Bitcoin entrepreneur as its first chief digital officer as it explores the capabilities of blockchain, the technology behind the virtual currency. JP Morgan Chase is conducting blockchain tests that could result in 'live' applications later this year. Its CEO Jamie Dimon (in 10th) believes the first application of blockchain, bitcoin, is doomed because the Department of Justice will never allow a competing currency to exist in the long haul. He believes in blockchain though. The Toronto Stock Exchange has hired a Bitcoin entrepreneur as its first chief digital officer Indeed there are many other blockchain initiatives - most are primarily composed of banks and exchanges – but one, the ‘Open Ledger Project’ which is overseen by the not-for-profit Linux Foundation, is driven by technology heavyweights including IBM, Intel and Cisco (Ginni Rometty, IBM’s CEO, came 25th in our ranking). So why all the fuss? Blockchain has the potential to reduce overhead costs by eliminating middlemen. Currently you need central authority or an impartial mediator to prove ownership or authorship of information when trading directly with any counterparty. Blockchain guarantees authenticity across institutional boundaries, providing new ways to check authenticity of records, content, and transactions. A shared ledger technology would not only bring a new level of efficiency to the labyrinthine global payments market, but it would also provide a platform for secure, transparent, transactions that could eliminate the corruption, bribery, theft, and tax evasion that blights developing nations. [easy-tweet tweet="#Blockchain has the potential to reduce overhead costs by eliminating middlemen" hashtags="bitcoin"] Blockchain has the potential to change everything! ### Searching for secure DNS infrastructure: Can hybrid cloud architecture help? DNS servers deliver critical services to businesses such as Internet visibility for customers, partners and employees – and external access to network applications and other important services such as email. [easy-tweet tweet="Businesses should deploy #cloud #DNS security alongside on premise solutions to create a hybrid environment"] Because of the fundamental role they play in IT infrastructure, DNS servers are exposed to Internet-based attacks and therefore must be secure at all times. Losing email service or Internet connectivity due to attacks could significantly impact a business’ profitability. An IDC survey conducted in June 2014 revealed that 72 per cent of respondents had been the target of a DNS attack in 12 months, and as a result, 45 per cent were impacted by downtime, 36 per cent reported loss of business, and 40 per cent had intellectual property stolen. Cyber incidents rose by 38 per cent last year[1] and are only expected to increase over the course of 2016. So what can be done about this? 
The DNS service is one of the most crucial IT services of any company. It underpins business continuity and should be a vital part of any global company’s security plan. But so far, existing solutions such as firewalls or generic anti-DDoS systems have demonstrated only their inability to protect mission-critical DNS services. Hackers have many different objectives and will use any vulnerability to attack, causing interruption to a business’ activity, corrupting data, or worse – both. DNS attacks are becoming more and more sophisticated, and organisations must stay ahead of the threats using security solutions that can protect against a range of attacks. The best approach for businesses is to deploy cloud-based DNS security alongside on premise solutions to create a hybrid environment for even greater resilience, performance and protection. Hybrid cloud architectures offer reliable and scalable solutions that allow businesses to manage the DNS server in-house and the domain name in the cloud. They also offer an established infrastructure with multiple points of access to achieve 100 per cent availability of DNS servers from a single management console. It is industry best practice not to rely on a single technology, in order to limit any single point of failure, so organisations need to understand these practices when building DNS architectures. Doing so ensures that access to network applications and other indispensable services, such as email, is never lost. With the right management tool, cloud-based DNS deployments also allow IT teams to revert to local deployments at the click of a button. This reversibility makes hybrid cloud deployment very adaptable and flexible to individual user and business needs. A test on cloud infrastructure can be set up quickly, and can be changed back to in-house in one easy step. There are not many solutions that can manage both in-house and cloud DNS servers, but those that can are able to guarantee 100 per cent availability and can be integrated to operate alongside other security solutions, such as those that absorb DDoS attacks and mitigate zero-day vulnerabilities. In a report conducted by IDC in 2014, only 12 per cent of UK respondents were already using DNS cloud deployments across their organisation for new applications, but 23 per cent were interested in such solutions and planned to look into using them in the future. However, PwC’s report on The Global State of Information Security Survey 2016 revealed that 69 per cent of organisations now use cloud-based cybersecurity frameworks – less than two years later. Despite this, implementing a hybrid cloud architecture does come with its challenges. For instance, there is an increase in security complexity, and a different kind of security protection is needed. On premise and cloud-based DNS servers also don’t share the same configuration, which can, at times, bring its own set of problems and can increase the time needed to set up a hybrid system. Hybrid DNS architectures also require IT staff to learn a new range of endpoints and responsibilities, which can add to costs and extend the time scale until the system is fully up and running.
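As a purely illustrative sketch of the kind of routine check an IT team might run across such a hybrid setup – the resolver addresses and hostname are placeholders, and it assumes the third-party dnspython package rather than any particular vendor's tooling – the snippet below queries the same name against the on-premise and cloud resolvers and flags any failure or disagreement.

```python
# Illustrative only: compare answers from an on-premise and a cloud DNS resolver.
# Assumes the third-party 'dnspython' package (pip install dnspython); the
# resolver IP addresses and hostname below are placeholders.
import dns.resolver

RESOLVERS = {"on-premise": "10.0.0.53", "cloud": "198.51.100.53"}
HOSTNAME = "www.example.com"

def lookup(server_ip: str) -> set:
    """Return the set of A records HOSTNAME resolves to via one specific server."""
    resolver = dns.resolver.Resolver(configure=False)
    resolver.nameservers = [server_ip]
    resolver.lifetime = 3  # seconds before the server is treated as unavailable
    return {record.address for record in resolver.resolve(HOSTNAME, "A")}

answers = {}
for name, ip in RESOLVERS.items():
    try:
        answers[name] = lookup(ip)
    except Exception as exc:  # timeouts, SERVFAIL and similar failures
        print(f"ALERT: {name} resolver ({ip}) failed: {exc}")

if len(answers) == 2 and answers["on-premise"] != answers["cloud"]:
    print("WARNING: on-premise and cloud resolvers disagree:", answers)
```

A check of this sort does not replace the management console described above, but it illustrates how little is needed to keep an eye on whether the two halves of a hybrid DNS estate are still telling the same story.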
[easy-tweet tweet="On premise and #cloud-based #DNS servers don’t share the same configuration which can cause challenges"] Companies are beginning to understand the need for protecting their DNS servers with multiple technologies, however still do not fully trust solutions that don’t store data on premise. But as DNS attacks continue to grow and develop in 2016, more efficient and increased security is needed. Combining your DNS security with a hybrid cloud architecture is your best form of defence; its scalable, simple to deploy, cost effective, flexible, and most importantly, secure. [1] PwC The Global State of Information Security Survey 2016 ### CeBIT 2016: Why Digital Transformation is more than just a buzzword I attended CeBIT 2016 from the 14th to 15th March, and over the course of the two days I had the opportunity to spend time with Software AG and some of their partners, amongst the exhibitors in the Digital Business Solutions hall. [easy-tweet tweet="Maybe it’s time to tip #DigitalTransformation out of the Buzzword Bucket, says @KateWright24" hashtags="CeBIT2016"] The overriding theme of these two days was Digital Transformation. As someone who is fairly new to the IT industry (having moved from recruitment in July 2015), for me Digital Transformation had been thrown into the ‘Buzzword Bucket’. The BWB is where I mentally store all the words and phrases that seem to get aimlessly thrown around with little understanding of what they mean. Trying to educate myself in this industry has been no simple task with everyone providing juxtaposing explanations of these buzzwords. Digital Transformation has been no exception, and has sat cosily in the BWB along with the likes of Hybrid Cloud, Big Data, Disruptive Innovation, Uberisation etc. However, spending some time with SAG has helped to shift my view of Digital Transformation and maybe it’s time to tip it out of the Buzzword Bucket (sorry Uberisation - you’ll be in there forever). Speaking with Wolfman Jost, CTO of Software AG, he was able to (finally) provide an understandable, short explanation of what Digital Transformation is: “Digital Transformation is the creation of new business applications through the use of digital technology.” This is simple enough and it is applicable across all industries. Companies must now embrace digital to avoid falling behind. Take Blockbuster for example - had they embraced the fact that their target customer was going digital with the emergence of Netflix and other online streaming services, they may have been able to innovate and reinvent. But for most organisations it isn’t so easy to change their attitude and focus. Through discussions with Wolfman Jost and Q&As with SAG partners, there are some key challenges that businesses are facing: Legacy systems This is especially difficult for large organisations and those who have been around for 40+ years. It is great that IT moves so fast and opens up new doors, but every time there’s a new application, it means that your current systems become more and more outdated. Often IT departments will focus on just upgrading key areas of the infrastructure and over time you find that you have a long history of applications that now can’t keep up with the speed of the business. It is at this point that organisations need to realise that they have no choice but to innovate or risk falling behind. Software AG’s platform focuses on how these businesses can deliver fast by modernising what they already have in place. Seamless integration becomes a must. 
Early(ish) adopters vs Laggards We all know the deal here - the public sector and governing bodies always get tarred with the same brush for falling behind and only embracing technologies just as they’re going out of style, whereas banking and finance are known to be the early adopters. However, when it comes to DT, Urs Kramer of Sopra Steria said that even the banks haven’t been fast enough. It’s difficult enough trying to integrate a DT platform with legacy infrastructure, but the task becomes even more laborious when you first have to change the attitudes of these organisations. As Wolfram says, “software is leading the world”, and these organisations need to switch their focus to become software-centric rather than product-centric. Which platform? Any organisations that choose to switch their focus to DT are going to need support to do so, and there are a number of organisations that can provide a Digital Transformation Platform, but who do you work with? Wolfram made the critical point that a lot of the key providers are very large organisations with too many focus areas, meaning they can suffer from ‘jack-of-all-trades’ syndrome. This adds extra challenges, as their attention is diluted and they may not be able to provide a holistic solution to suit your organisational needs. On the flip side, they may be trying to provide you with too much, and you risk being locked in under a myriad of services. SAG themselves realised this, which is why they have moved away from some of their old service offerings and have invested heavily in their Digital Transformation Platform to ensure that their customers are supported in five key areas: business and IT transformation, data transformation, application integration, process implementation, and decisions and analytics. Reflecting on my new, less sceptical, attitude to Digital Transformation, it’s good to celebrate organisations, particularly large ones, that have embraced the DT era. Some of my favourite examples include Facebook experimenting with money transfers via Messenger; Zara using iBeacons to track a customer’s journey through the store and improve the retail experience; and how Amazon personalises your experience through recommendations based on your shopping habits. [easy-tweet tweet="It’s good to celebrate organisations that have embraced #DigitalTransformation @SoftwareAG " hashtags="CeBit2016"] One of my favourite DT stories though? When Coke launched their campaign for people to personalise their Coke bottles and cans - it was a simple idea, but one that translated across the globe, with people of all ages and backgrounds getting involved. Looking further into this, the platform behind Coca Cola’s Digital Transformation journey is Software AG. With names such as Coca Cola, ING, Fujitsu and PayPal under their belt, it’s clear that SAG are leading the way for business Digital Transformation. As Wolfram Jost says: “Every company will become a software company.” ### Bitcoin, blockchain and cryptocurrencies: What’s all the fuss about? Part One Bitcoin, blockchain and cryptocurrencies have been around for a while now, but having once been dismissed, they are starting to be taken seriously in interesting quarters.
We take a look at the concept and the implications for the future, as well as where all the current buzz is coming from and what it’s all about. [easy-tweet tweet="Is #bitcoin a #currency, a commodity, or a hybrid of both asks @billmew"] Bitcoin itself was created back in 2009 by a developer or group of developers going by the pseudonym Satoshi Nakamoto. As a decentralised virtual currency or cryptocurrency, bitcoins are exchanged digitally and managed by a peer-to-peer network, rather than a central bank or authority. There is debate as to whether bitcoin should be considered a currency, a commodity, or a hybrid of both. Each bitcoin is a piece of code that has its own transaction log with timestamps. The coins are created by mining servers, with a total limit of 21 million bitcoins. They are stored in an owner's virtual wallet and can be transferred and exchanged for goods and services. Transactions are public and, although they are relatively anonymous, it is possible to trace identities back to real-life individuals. Association with risks and crime Not only is the value of bitcoins exceedingly volatile, but the cryptocurrency currently falls under no regulation or legislation. There have been incidents of online bitcoin wallets being compromised by hackers, leading to the theft of bitcoins. And while bitcoins have also been tarnished by their association with crime, there is a growing recognition that the concept of cryptocurrencies and the underlying blockchain technology could well have a role to play in the future of payments. So what’s the buzz? A big data analysis of the top opinions and influencers on the topic demonstrates the issues at hand here. Compare the Cloud’s global top 10 #CloudInfluence ranking for Bitcoin, Blockchain and Cryptocurrencies provides a snapshot in time. [table id=61 /] Top of the list is Allen Stefanek, president and CEO of the Hollywood Presbyterian Medical Centre in the United States, which “was crippled” when hackers were alleged to have held its data for ransom and demanded 9,000 bitcoins - worth roughly US$3.4 million. While Stefanek later revealed that the attackers were actually demanding only 40 bitcoins, or approximately $17,000, it was a high profile example of the way that bitcoins have become the favoured currency for cyber-criminals. People probably have less sympathy for Martin Shkreli (8th in the ranking), who rose to notoriety as a price-hiking pharmaceutical boss. He wanted to buy out the new Kanye West album “The Life Of Pablo” and even paid the bitcoin equivalent of $15 million, so he could later resell the record to Kanye’s fans. Unfortunately it was a scam and he has been seeking to recover the funds ever since. Such incidents have led to many merchants distancing themselves from cryptocurrency. It was even claimed recently that Microsoft (4th in the ranking) would no longer be accepting bitcoin at its Windows Store, a little more than a year after announcing it would support the digital currency. However, Microsoft later confirmed that the message on its website had been posted by mistake, and that it would in fact continue to accept bitcoin. Then one of the lead bitcoin developers, Mike Hearn (in 2nd), said that he was ending his involvement with the cryptocurrency and selling all of his remaining holdings because it had “failed”.
So, it’s just a failed enterprise used by criminals then? No, there’s more to it. Bitcoin may well have been tarnished by associations with such dubious use cases – many of them criminal in nature – while also gaining notoriety as much for its novelty as for the volatility of its valuation. However, there is increasing interest in the underlying blockchain technology and in the real potential of both it and cryptocurrencies. Business, trade and currencies are built on trust, but the current procedural, organisational, and technological infrastructure required to create institutionalised trust is expensive, time-consuming, and, in many cases, inefficient. Bitcoin may be tarnished and unregulated, but the underlying blockchain technology could provide a disruptive model to transform the way that we currently handle transactions, contracts, and indeed trust. [easy-tweet tweet="#Blockchain technology could provide a disruptive model to transform the way that we handle transactions" hashtags="bitcoin"] In Part Two we look at why organisations are suddenly looking at new blockchain-based cryptocurrencies. ### A BI Revolution: Taking the first steps to data democratisation The democratisation of data has been a hot topic for a while — and it’s finally here. Business intelligence (BI) no longer rests solely in the hands (and minds) of IT. Today, all end users, from salespeople to marketers, can have easy access to data and tools that improve their daily decision-making. Cloud BI tools promise an unprecedented level of self-sufficiency to more or less an entire organisation. [easy-tweet tweet="The promise of easy access to data is certainly achievable today" hashtags="BusinessIntelligence, data, cloud"] The promise of easy access to data is certainly achievable today. However, despite the “democratisation” of BI, many organisations have yet to realise the benefits; they are not leveraging BI tools to unlock easy access to data, despite it being technically possible. In fact, Gartner surveys reveal that 70 per cent of potential BI users in an organisation fail to adopt CIO-sponsored analytics tools. With 71 per cent of companies planning to increase their analytics budgets in 2016, it is imperative that users adopt new analytics tools and make sure that investment doesn’t go to waste. If you’re not ready to implement cloud BI tools across your entire organisation, consider a Proof-of-Concept trial to prove value. Democratisation of data is a result of tools and capabilities that are incredibly flexible and readily available. Organisations are no longer locked into massive requirements-gathering exercises before undertaking an analytics project — cloud BI tools allow you to conduct Proof-of-Concepts with your own data and just a few plug-and-play BI tools. Looking for predictive capabilities? Buy an app. Need an ETL tool for data migration? Just plug it into your existing CRM. From data enrichment to report creation and data visualisation, third-party apps and tools will allow you to empower your teams with self-service BI tools now. What you can do now While data democratisation may seem like a tall order, it will be a key differentiator for organisations moving forward.
When you begin to plan how your organisation will make data more accessible, consider how the following criteria could (and should) come into play: How data-driven decisions will become the norm for end users and their processes. Data should be so abundant across your organisation that consumption is easy and automatic. Think specifically about how BI tools can revamp mobile, embedded analytics, rich visualisations, and data-driven workflow integration for your company. How end users will be able to directly influence how data is presented and consumed.  If it’s easy for them, they’ll adopt the tools. If not, your organisation will most likely struggle with poor adoption. How your data visualisations are aligned to key business outcomes. BI tools today are so powerful that they’re able to make large data sets immaterial. Only the most relevant information is served up at a given juncture in the workflow process, despite capturing every meaningful customer touchpoint, relevant structured or unstructured attribute, or new data source like IoT. Invest in the “art” of dashboarding to serve up key metrics that address issues and prompt a next step. How your business will be enabled with self-service data preparation, data enrichment, and data integration capabilities. Keep in mind, there will be a learning curve. Considering who the users are and what they want out of BI, then developing training for them, will improve adoption. [easy-tweet tweet="IT needs to be more involved in the business’ strategic goals and understand users need for #data visibility"] IT and the business need to develop partnerships to make this new co-mingling of capabilities run more smoothly. Business needs to look to IT for insights surrounding data governance and security, while IT needs to be more involved in the business’ strategic goals and understand users need for data visibility. The path toward data democratisation can be disruptive, and for many there may be a turbulent road ahead, but the impact it will have on business outcomes like revenue growth, efficiency, and customer retention will be well worth the investment. ### Play Data: Could number crunching make you a better parent? It is easy to lose sight of how much data influences our lives. Many of the decisions we think we made ourselves actually come to us partially formulated. From the charities we’re asked to support to the products we buy online, there’s always an element of data-led intelligence behind those journeys. The days of the cold lead are gone. [easy-tweet tweet="There is always an element of #data-led intelligence behind our #online journeys" user="comparethecloud"] As a society, we’re very comfortable letting data suggest our movies. We even use it to help us select a romantic partner via dating websites. We value the assistance. Yes, it’s easy to be creeped out by hyper relevant retargeting adverts (the first time at least), but if everything we saw online, on TV or on our smartphones was arbitrarily plucked out of the sky, we’d soon be exhausted by the sheer irrelevance of it all. it’s easy to be creeped out by hyper relevant retargeting adverts So when it comes to those big life decisions, there’s a growing trend towards handing over the keys to the people with the data. Be that in health, romance, investing and even parenting. A recent study of 1,000 UK adults conducted by Xoomworks found that one third of parents would use data to make key parenting decisions. 
The parents said they’d use productivity data for a variety of parenting applications, from using voice analytics to improve speech development to using wearable tech to keep their kids fit and healthy. 10 per cent of parents said they'd also use demographic data to help them select their children’s friends. Presumably using crime stats by area would be a good way to prevent your child visiting friends in crime hotspots, but it could never replace common sense parenting. And there are big questions around ethics and effectiveness in this regard. But the key point is that parents are open to it. As a society, we’re very comfortable letting data suggest our movies. We even use it to help us select a romantic partner via dating websites. We value the assistance The Xoomworks study revealed that parents most valued better problem-solving ability as an output from using data. 22 per cent wanted better speech and would consider using voice analytics - a marketing tool typically used to measure the mood, sentiment and preferences of a customer - to improve how their children spoke. Improved hand-eye coordination, fitness, capacity for empathy were also ranked as desirable outcomes from the use of data. We can already do all of these things with data, but normally in an experimental setting with willing and interested adults. So should parents feel weird about experimenting in this way on their kids? Jon Boggiano, a 37-year-old entrepreneur and dad of three from Charlotte, North Carolina, doesn’t. He’s very comfortable employing data to help his “data driven” family achieve their goals. “My younger two children have used wearable speech tracking devices for the past year. The devices track the amount of words the child hears, prompting adults to engage more with children where necessary. “All of my children wear Fitbits.  It is a fun game for the family and our kids are outside a lot.  They routinely beat me on steps and it is also interesting to see the periods where they have many fewer steps.  For our son he usually fidgets more when he has had less time to spend being active. “Third, we've used all sorts of analytic services like adaptive maths programs to improve academic performance. “I have no ethical reservations really.” [easy-tweet tweet="Improved hand-eye coordination, fitness, and empathy were ranked as desirable outcomes from #data use" hashtags="Parenting"] We’re increasingly more open to using data to optimise business processes. It lets us challenge assumptions and make smart decisions. So why not use it for parenting? ### Insights on IT in the United Kingdom: A CompTIA snapshot It’s always interesting to look at how technology service providers and resellers are conducting business in different geographic locations. What’s true in one location may not be true in another, and best practices, industry regulations and business drivers can vary from one place to the next.  [easy-tweet tweet="Seven in ten UK #channel firms report at least some degree of business transformation" hashtags="cloud"] CompTIA, a non-profit trade association with approximately 2,000 member companies, 3,000 academic and training partners, 65,000+ registered users and more than 2 million certifications issued, recently published their 5th Annual State of the Channel Report – and there were some interesting insights and takeaways specific to the UK market included. 
CompTIA Faculty and Managing Director Tracy Pound recently presented the results of this report at Continuum’s first, sold-out European Partner Day in February. An abstract from CompTIA’s website reads: Seven in ten UK channel firms report at least some degree of business transformation taking place in their organisation. This suggests a highly dynamic environment, and there are many obstacles to overcome as channel firms seek new partnering arrangements, meet new customer needs and build new models around emerging technology. The three key drivers of growth in managed IT services today include seeking out new partnerships, meeting new customer needs, and building new models to adapt to emerging technologies and solutions This summary touches three key drivers of growth and evolution in managed IT services today that aren’t bound to any particular geography; seeking out new partnerships, meeting new customer needs, and building new models to adapt to emerging technologies and solutions. In looking at the UK specifically, however, here are some of the key takeaways and insights reported: Channel revenue When asked where channel revenue is coming from today, 46 per cent of firms claimed the majority is coming from existing customers, while 35 per cent reported it’s from a balance of both net new and existing customers. When looking to the future, however, the numbers begin to shift a bit. 41 per cent of organisations expect to see more revenue over the next two years come from net new customers, while 32 per cent expect more revenue from existing customers. Vendor relationships Looking at vendor relationships, 68 per cent of respondents were “mostly satisfied” and only 21 per cent reported being “very satisfied.” Interestingly, twice as many US-based respondents reported being “very satisfied” with vendor partner relationships. The report also explored top reasons to drop vendor partner programmes; responses included the cost of membership, low margins or minimal discounts/rebates, lack of marketing support and several others. Is your vendor relationship healthy? Here are 10 red flags when evaluating or working with an IT management platform provider! Hardware dependency As cloud and virtual resources become increasingly accessible and widespread, MSPs are finding they no longer need to rely as heavily on hardware sales to achieve revenue goals. 38 per cent of UK firms identified hardware sales as “very important” (the distinction from the US here was also sizeable, where 55 per cent reported hardware sales as very important). Not surprisingly, cloud infrastructure was identified as the technology area with the highest-expected revenue growth. What are other cloud trends to look out for? IT experts share their 2016 cloud computing forecast here! As cloud and virtual resources become increasingly accessible and widespread, MSPs are finding they no longer need to rely as heavily on hardware sales to achieve revenue goals The future of the UK channel The report also contains some interesting insights and predictions regarding the future of the UK channel. 52 per cent of firms identified their opinion as being “generally optimistic”, and 35 per cent had “mixed feelings.” Respondents looked to opportunities such as cloud computing opening new doors, an increasing demand for managed services, and the fact that customers still want a local provider who acts as a trusted advisor. 
Conversely, a skills gap for emerging and complex IT solutions, growing competition from new players like telecom providers and the notion that many aspects of IT are becoming easier to deploy and manage internally were reported as reasons to be pessimistic about the channel’s future in the UK. [easy-tweet tweet="46 per cent of firms claimed the majority of #channel revenue is coming from existing customers" hashtags=" cloud"] CompTIA’s 5th Annual State of the Channel and UK State of the Channel reports are available for CompTIA premier members. To learn more about premier membership, click here. ### Use Xangati to ensure that your workload is optimised when operating in a hybrid cloud environment The worlds of IT and cloud computing are in a constant state of flux. Technology constantly changes as vendors try to ensure that their systems are the easiest and most productive to use. However, this evolution of technology and methodology has led to a situation where we are now seeing more and more difficulties for IT managers and systems administrators. [easy-tweet tweet="Technology constantly changes as vendors try to ensure that their systems are the most productive to use" hashtags="cloud"] Nowadays, the vast majority of systems are being upgraded and developed with hybrid-cloud infrastructures in mind. This integration means that systems will have both improved flexibility and deployment as well as the ability to host SaaS. What is a hybrid cloud environment? A hybrid cloud environment is seen as an advanced, highly customisable networking solution that allows managers to apply it as they see fit. This helps them to develop a system that is able to suit any business need. Nowadays, the vast majority of systems are being upgraded and developed with hybrid-cloud infrastructures in mind The benefits of moving towards a hybrid cloud system are obvious – increased customisability, flexibility, cost-effectiveness as well as scalability. However, there are some long term issues to take into account when migrating to a hybrid cloud system. Such a fine-tuned machine requires more than simple observation, you need to be able to manage the system in a sophisticated way that decreases the risks involved. If the system isn’t performing in the way that we want it to, we often blame the system's technological ability rather than the way that the system is being managed. Without the correct management tools, managers cannot identify and react to potential problems such as bottlenecks and poor performance correctly. There are other ways to deal with this of course, some companies employ ‘Tiger teams’ – a group of people whose sole responsibility is to monitor stacks and identify threats. However, this is really expensive and doesn’t really improve the overall productivity of the system. Another flaw of these Tiger teams is that they are really only a reactive solution. They do not give you the complete control that you need to pre-empt potential problems before they happen. The benefits of moving towards a hybrid cloud system are obvious – increased customisability, flexibility, cost-effectiveness as well as scalability What is the solution to this? Hybrid cloud environments are becoming more and more complex which means that it is now essential that businesses focus on maintaining the way that processes flow in order to increase productivity. 
An ideal flow would feature the following processes: cloud-installed apps that feature across the whole of the infrastructure; any processes or applications that have been implemented in any data centre; and the ability to identify system faults, security issues, loopholes, performance problems and capacity gaps. However, with Xangati you can evolve from traditional monitoring to actually identifying and managing your system. Xangati gives you complete control and allows you to make more informed decisions on how you manage your system. Some of the key benefits of Xangati include the ability to monitor (live recordings, context-specific dashboards, visual trouble tickets, end-to-end cross-silo visibility, and scalability), to analyse (machine learning, best-practice thresholding, cross-silo metrics-based alerts, predictive analytics, and capacity planning), and to control (root cause correlation, assisted remediation, unique analytics, and your system's performance, capacity and efficiency). [easy-tweet tweet="There are some long term issues to take into account when migrating to a #hybrid #cloud system"] This tool has been developed by Xangati as a solution to the common problems associated with managing and operating a hybrid-cloud system. Visit xangati.com to find out more about how Xangati can help you manage your system. ### Is 2016 the year to move your ERP to the cloud? Whilst businesses of all sizes and sectors have been willing to explore cloud-based applications for years now, many IT departments remain reluctant to move their ERP. It’s easy to see why. The on premise option gives a sense of security from knowing your business-critical processes are in your own hands. [easy-tweet tweet="Smaller companies can access up to date #ERP technology - which would've been impossible a few years ago"] But has technology now reached a stage where cloud-based ERP is a viable option for your business? The honest answer is possibly. That’s not a cop out – like most IT decisions, it really is all dependent on your own business and processes. Cloud-based ERP benefits are well documented. There’s the reduced upfront cost that comes with all cloud offerings. An on premise ERP system requires a significant capital investment, especially if you have to buy new hardware and expensive servers. In contrast, cloud systems are paid for through predictable monthly fees, becoming an ongoing operating cost which can be much easier to manage. It means many smaller companies can access the most up-to-date ERP technology, something which wouldn’t have been viable for them a few years ago. Then there are the lower ongoing costs. Running and managing an on premise solution requires a good level of IT expertise within the business, creating staffing and training costs. There is also the energy cost of running servers and the space needed to house them. For cloud alternatives, all that is taken care of by the vendor. If you need to be up and running quickly, cloud ERP ticks the box yet again. The time to go live is much quicker with cloud than on premise, with the process measured in weeks rather than months. Then there’s the added security and data recovery. People are becoming accustomed to internet-based applications being capable of securing financial and commercial data - we all bank online don’t we? Even the biggest businesses struggle to match the data security, storage and back-up systems of the big cloud providers.
Choose cloud ERP and you’re effectively storing your data in the world’s most secure data centres. Add on the benefits of being able to work remotely and access your business information and processes on a range of devices in real-time, and you start to wonder why on earth anyone would consider on premise at all. But as we mentioned at the start, cloud isn’t quite the panacea which solves every business issue just yet. One of the main obstacles faced by cloud ERP customers is a reliable internet connection. We work with a high number of manufacturers who operate on remote sites where internet speeds aren’t what they ought to be. Your ISP needs to be able to provide consistent and reliable high speed internet access, or putting your mission critical business application in the cloud can be a risk too far. That said, there is a trade-off to consider. With on premise solutions, downtime can be created due to simple real world issues like snow and more drastic business continuity issues such as floods and fires. If you have cloud ERP, your workforce can work from home as easily as in the office. Only you can decide whether the benefits of that outweigh the risks associated with relying on your particular strength of internet connection. If your business has less specialised needs, such as professional services firms, a straightforward out-of-the-box cloud ERP system can be ideal Finally, the other critical factor that needs careful consideration is the nature of your business process. Today’s cloud ERP is inherently stable, highly scalable (up and down) and often has all the capability and flexibility of its on premise cousin. But if you have very specific, complex business processes and need a highly customised and integrated ERP solution, on premise can still be the best option. The need to integrate disparate systems might override any of the benefits of cloud. If your business has less specialised needs, such as professional services firms, a straightforward out-of-the-box cloud ERP system can be ideal. [easy-tweet tweet="Today’s #cloud #ERP is stable, scalable and has all the capability and flexibility of on premise solutions"] As with all IT investment decisions, it pays to find a trusted partner who can analyse your business’s needs and advise on the best option. Cloud ERP offers huge benefits for the right customers, but jumping in without careful consideration can cause more problems than it solves. ### Fighting your IT fears: Five tips for better cloud security It's a safe bet: More than one IT professional has likely woken up in a cold sweat, convinced that an upcoming cloud move will open their organisation to significant security risk. No surprise, then, that the cloud security market is on track to reach almost $12 billion in the next six years as businesses look for ways to leverage the agility and flexibility offered by cloud computing while minimising risk. Here, John Schumaker, Cloud Services Manager at Gordon Flesch Company, proposes five quick tips to improve cloud security, and your sleeping habits: [easy-tweet tweet="Here are five quick tips to improve #cloudsecurity" user="comparethecloud" hashtags="cloud, infosec"] Prioritise placement For many IT professionals, the benefits of cloud computing are balanced by the worry of lost or stolen data. However, there's a simple way to solve this problem if you're just starting a move to the cloud: Keep confidential information close to home. 
Think of it like a trial run; use the cloud for often-accessed files and documents that aren't confidential or required for regulatory compliance. You get the benefit of cloud service without the spectre of serious data loss. Always encrypt Over time, you'll want to move critical line-of-business (LoB) data onto secure cloud servers. Though what happens if another client on the same public server is hacked, or your vendor is somehow compromised? By encrypting data in-house or by using a third-party service, however, you significantly reduce the value of this information to an attacker. For many IT professionals, the benefits of cloud computing are balanced by the worry of lost or stolen data Best bet? Look for a vendor that offers “zero knowledge.” In this scenario, not even your cloud provider has any knowledge of your encryption method; local admins hold the key. Even if your provider had its database hacked or a government agency demanded access, your files stay safe: Only you can unlock their contents. Outsource with care There comes a time when most businesses use enough services and store enough data in the cloud that security-as-a-service (SECaaS) starts to make sense. The key? Outsource with care. Look for a vendor that offers customisable security solutions and total transparency: You need to know exactly how your data is protected, how new threats are detected, and how the company will respond in the event of an attack. Pay attention to personnel While outside actors can pose a serious threat to cloud security, IT admins also need to manage internal threats. In many cases, employees and executives don't mean to compromise security but do so out of ignorance or in an effort to quickly complete their current task. To manage insider actions you need a reliable cloud monitoring tool that permits greater scrutiny of those with greater access: What are they accessing, when and why? It's also critical to offer security training in proportion to access — those with permission to move or alter valuable data need more training and clear expectations for use. In many cases, employees and executives don't mean to compromise security but do so out of ignorance or in an effort to quickly complete their current task Drill down to devices The last tip for better cloud security? Target devices on your network. With most users now accessing the cloud through their smartphone or tablet, it's not enough to secure desktops: Mobile devices with corporate network access must include application management tools to separate personal and professional data, in addition to patch management solutions that ensure app security is always up to date. [easy-tweet tweet="To manage insider actions you need a reliable #cloud monitoring tool that permits greater scrutiny" hashtags="Security"] Moving to the cloud is a daunting prospect for many IT pros. De-stress by prioritising data placement, always encrypting information, outsourcing where applicable, and managing both employees and their devices. ### The public sector must empower its CIOs The CIO (Chief Information Officer) job title has grown in popularity over the last decade; we are now firmly in the information age and this presents the public sector with both opportunity and challenge. At the same time, austerity measures combined with a growing population has put pressure on public sector organisations to deliver more for less. 
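As a footnote to the 'always encrypt' tip above, the sketch below is a minimal illustration of client-side encryption before upload, so that only the key-holder – not the cloud provider – can ever read the stored file. It assumes the third-party cryptography package, and the file names are placeholders; a production setup would also need proper key management.

```python
# Minimal illustration of the 'always encrypt' tip: encrypt locally before upload
# so the cloud provider only ever stores ciphertext. Requires the third-party
# 'cryptography' package (pip install cryptography); file names are placeholders.
from cryptography.fernet import Fernet

key = Fernet.generate_key()              # keep this key on-premise, never in the cloud
with open("local.key", "wb") as key_file:
    key_file.write(key)

fernet = Fernet(key)
with open("customer_records.csv", "rb") as source:
    ciphertext = fernet.encrypt(source.read())

with open("customer_records.csv.enc", "wb") as target:
    target.write(ciphertext)             # this encrypted blob is what goes to the cloud

# Later, only someone holding local.key can recover the plaintext:
with open("local.key", "rb") as key_file:
    restored = Fernet(key_file.read()).decrypt(ciphertext)
assert restored == open("customer_records.csv", "rb").read()
```

Handled this way, a breach at the provider exposes only ciphertext, which is the essence of the 'zero knowledge' arrangement described above.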
There are many opportunities to transform and improve services, but perhaps none more so than harnessing the power of technology as a key enabler. Outgoing Executive Director of Digital in the Cabinet Office, Mike Bracken, commented "There is a salient lesson for every institution in every sector, which is this: the internet always wins. Government is no different. We ignore that at our peril.” "the internet always wins. Government is no different. We ignore that at our peril." - Mike Bracken Many organisations are using this shift to replace traditional heads of IT with business focused CIOs, however according to a recent iGov survey only three per cent recognise them as transformation focused business leaders. The challenge for CIOs is that they are expected to maintain and lead IT departments whilst delivering change and organisational improvements. Issues arise when CIOs don’t get the support they require to transform an organisation, and subsequently fail to become the business leaders they are capable of being. [easy-tweet tweet="#CIOs are expected to maintain and lead #IT departments whilst delivering change "] The challenge Public sector projects are often more difficult to drive forward and implement than their equivalent in the private sector. Research by McKinsey and Company found that IT projects driving public sector change were six times more likely to experience cost overruns and 20 per cent more likely to run over schedule than similar private sector projects. With the right leadership in place that understands challenges and stakeholders, delays and cost overruns can be reduced. In the accompanying report to this research, analysts recommend ‘hiring and nurturing the right talent’ specifically commending the UK government policy of seeking to attract highly talented individuals into leadership roles. If we examine some of the failed public sector projects in recent years, lack of leadership has been a contributing factor. One of the most high profile has been the Department for Work and Pensions (DWP) Universal Credit, a scheme designed to replace some of the current out of work benefits and improve efficiency in supporting job-seekers. Despite the well thought out plan of uniting a number of benefits systems, the programme was labelled a £700 million flop by the Public Accounts Committee which criticised the DWP for its “alarmingly weak management” of the project. The issue with this particular project was that money was spent on IT systems that didn’t work as they should, eventually being totally written off, highlighting a clear failure in leadership. The fact that just three per cent in the public sector recognise their CIOs as a driver of transformation is concerning, but provides an opportunity Strong public sector CIOs combine IT expertise with deep understanding of business needs. Many will also have experience in the private sector and be used to operating within strict budget guidelines. Giving these leaders the power to drive change must be a priority for organisations. In turn, they will empower those around them to deliver solutions based on a clear vision and operational targets. One successful project, that I’ve seen recently, is the transformation of the workplace at Epic CIC, an employee led mutual focused on providing support services in the royal borough of Kensington and Chelsea. 
Led by a visionary CIO in Jamie Holyland, Epic was able to transform the way they work by introducing Google Apps to foster collaboration and Chrome devices to support mobile and flexible working. The project will save £140,000 a year, but equally important, the system is shaping the services Epic can provide and the way it provides them.  For example, it can seamlessly integrate with technologies such as social media that will be vital to developing their services in the future. It isn’t just small public sector organisations in which CIOs can deliver real change either. One much larger and particularly successful project was in the Department for International Development (DFID). A unique department, DFID were looking for an efficient way to manage the distribution of aid worldwide, that would allow for regular engagement with a mix of different partners. They eventually chose to use Google sites for this project, providing workspaces where staff and local partners can share and access documents, no matter where they are in the world. Starting with 40 sites initially, DFID is adding more sites and adopting other aspects of Google Apps technology as time goes on. Through intelligent leadership, DFID was able to introduce an efficient tool to help them improve aid delivery. Delivering more for less is a target being pushed from the top down in the public sector. Unlike traditional ‘heads of IT’ who were primarily placed in supporting roles in organisations, CIOs are expected to drive change. Experienced CIO Steve Day commented that IT leaders in the public sector now need to “inspire those around them, be visible and get out from behind their desk; if you don’t have the respect and support from your CEO, directors, the political parties and cabinet members, and proper engagement with users, even the best ideas will fail.” [easy-tweet tweet="The current environment requires public sector CIOs to lead, and they must be empowered to do so."] The current environment requires public sector CIOs to lead, and they must be empowered by directors, politicians and senior management to do so. The fact that just three per cent in the public sector recognise their CIOs as a driver of transformation is concerning, but provides an opportunity. In these times of change, with the right support, these business focused innovators can embrace their role and redefine the position as the leader of digital transformation not just a supporter. ### What exactly is hybrid cloud: A stepping stone, a sanctuary for regulated industries or a rampart for old tech vendors? Epoch-making technology change occurs only occasionally, but when it does there is often a transition phase in which hybrid solutions emerge. [easy-tweet tweet="Hybrid cloud can be seen as an intermediate step on the path to full public utility cloud, says @BillMew"] Hybrid cloud: A stepping stone to public cloud In a recent Harvard Business Review article entitled "The Prius Approach," Nathan Furr and Daniel Snow explained how hybrid technologies help companies survive disruption and shape the future. For example when Edison’s light bulb threatened their future, incumbent gas lighting firms borrowed the filament technology from the electric bulb to form a hybrid technology the gas-powered lightbulb, which improved their own efficiency fivefold. This succeeded not only in starving Edison of profits for 12 years (and nearly bankrupting him), but it bought them enough time to prepare a profitable exit into the adjacent heating business. 
Experts in disruptive innovation point to that kind of move to bolster a doomed technology as the last gasp of a dying industry. In the same way hybrid electric cars, like the Toyota Prius, will allow auto manufacturers time to develop new skills and capabilities to prepare for the era of all electric cars. And depending on your perspective, hybrid cloud can be seen as an intermediate step on the path to full public utility cloud. For many industries there is evidence that the transition to full public cloud is accelerating and there are commentators (such as @DavidLinthicum) claiming that the era of private and hybrid cloud is already ending. Such are the economies of scale and efficiency improvements with cloud that the advantages of public cloud are obvious and undeniable – advantages that cannot be sufficiently replicated in private cloud deployments. Old tech vendors see hybrid as a way of retaining some control while defending and extending the life of their legacy platforms Hybrid cloud: A rampart for old tech vendors While many of the old tech vendors are doing their best to expand their cloud business, this is coming at a cost – with supposedly new cloud business more often than not involving cannibalising higher margin business with existing clients, rather than being net new client wins. However, with many enterprise workloads likely to remain on-premises for some time to come, old tech vendors see hybrid as a way of retaining some control while defending and extending the life of their legacy platforms. They hope that hybrid will buy them enough time to catch up and bring new capabilities to the cloud. Hybrid cloud: A sanctuary for regulated industries The whole argument for hybrid cloud is that no matter how compelling the public cloud services are, some workloads will need to be hosted in private clouds for the following reasons: The need for technical control over one’s compute environment – especially where legacy platforms or architectures would prohibit a move to the cloud. Reasons of proximity – some data intensive applications or latency sensitive workloads benefit from having compute power near data sources or big data stores. Reasons of protection, risk aversion, compliance and regulation – the need for multiple redundant facilities in each local market as well as regulatory accreditation. While innovative new tools and technologies (such as containers) are already enabling firms to overcome the technical challenges, the main bastion for private cloud in terms of latency, security and regulation remains the financial services sector. Will banks ever adopt public cloud? Well, public cloud providers, such as Amazon Web Services (AWS), are already working closely with financial regulators to ensure their services are in line with the security requirements. AWS has achieved widely respected certifications such as PCI DSS Level 1, ISO 27001 and ISO 9001, and banks across the world are not only embracing cloud, but are even adopting public cloud. In the Netherlands banking regulator, De Nederlandsche Bank (DNB), is in favour of public cloud, and Dutch retail bank Robeco Direct is already using Ohpen’s ‘bank in the cloud’ software running on AWS. So what’s stopping everyone moving to public cloud? One old expression still holds true: “Why do people rob banks? Because that’s where the money is!” The fact that the banks remain the main target for hackers and cybercriminals of all kinds means that a heightened level of security is essential. 
Some time ago banks were described as simply technology companies with a banking licence, because everything they did was already digitised. It might be more accurate to describe them as security companies, because everything they do is based on trust: if their security is compromised and trust is lost, a bank is out of business. This is not to argue that private environments always have lower latency and higher security than public ones, but whatever level of regulatory certification AWS, Azure and Google obtain, the banks are going to want to be hyper secure. [easy-tweet tweet="Whatever level of regulatory certification vendors obtain, the banks are going to want to be hyper secure."] After all, while AWS beat all the competition (IBM included) to win the CIA’s business, the CIA doesn’t use the main AWS cloud facilities. Instead it uses an especially secure private cloud provided by Amazon. Should the banks settle for anything less? ### Understanding the data protection ramifications of Safe Harbour Following the recent Safe Harbour ruling, many companies are now concerned about how they can store their data securely but still make it available to their employees at any location. [easy-tweet tweet="Daniel Model looks at what the #data ramifications of the fall of #SafeHarbour were "] Safe Harbour refers to an agreement between the European Union and the United States. First adopted by the European Commission in 2000, it allowed companies to transfer personal data from a European Union country to the US in compliance with the EU Data Protection Directive. At the start of October last year, the European Court of Justice decided that Safe Harbour did not give data transfers between Europe and the US adequate protection, declaring the agreement void. The doubts surrounding the agreement are due to scepticism towards the US’s Patriot Act, which in certain cases allows the US government to access data saved in the US and monitor it for any potential terrorist threats. For EU and US companies, the Safe Harbour agreement was a ‘safety net’ that allowed the transfer of EU data to the US. It allowed any company to issue self-certification that confirmed EU data would be protected if it was transferred to and saved at US data centres. For EU and US companies, the Safe Harbour agreement was a ‘safety net’ that allowed the transfer of EU data to the US The invalidation of Safe Harbour raises questions and problems for many companies with regard to backing up and sharing data in the cloud. There are two possibilities for securely storing and accessing data in a cloud. One option is for the user – either private or corporate – to find a cloud provider whose data centres are operated in Europe. Alternatively, companies have the option of setting up their own private clouds and using them to provide their employees with data, IT resources and applications. The market offers many options for both approaches. Today, offering customers safe data storage and sharing in the cloud is part of the mission of almost every major manufacturer. Cloud users must work with a proven and secure provider If a company opts to use a public cloud architecture, it requires a suitable and reliable cloud service provider. The highest priority here is to ensure that the provider’s data centre resources are based in Europe or, even better, the UK. Users can also inquire as to whether the provider’s internal data backup is located only within these data centres or if copies are made in data centres in other countries.
Service level agreements concerning how and when data is stored and under what conditions it is transferred back should also be key considerations when selecting a cloud data protection services provider. You should also take into account the provider’s level of encryption to prevent any accidental or targeted misuse of data. Full control in the company’s private cloud The second option for guaranteeing secure data protection, access and sharing is a private cloud architecture. It is admittedly somewhat more complex, but it gives companies an increased level of control over critical business data and digital information. While the company will have more resources to manage, a private cloud solution has a comprehensive range of options with regard to the provisioning of services, access rights, selection of applications and device support. That, in turn, gives employees greater flexibility, the tools they need for their job, and the ability to deliver the same user experience as they would get with a public cloud. The safety of data and devices can be guaranteed according to internal standards set by the company. [easy-tweet tweet="A #privatecloud is not affected by the ramifications of #SafeHarbour" user="comparethecloud"] A private cloud is not affected by the ramifications of Safe Harbour and it allows for the use of a wide range of cloud-based services, rather than applications such as Box or Dropbox, which should probably not be used in view of the changes related to Safe Harbour. Companies that want to maintain secure clouds and full control of their resources and data should choose a private cloud and the right applications for their employees and IT personnel. In short: data security in the cloud must be the top priority A very important conclusion can be drawn from the Safe Harbour debacle. No matter whether a company chooses a public or private cloud, it should look for proven, compliant vendors that leverage local data centres and are committed to security in the cloud. ### Compare the Cloud launches Disruptive at Cloud Expo 2016 Disruptive, an offshoot of Compare the Cloud, is launching on April 12th, 2016, and will be broadcasting live from Cloud Expo in London. Cloud Expo encompasses four events in 2016: Cloud Expo Europe, Cloud Security Expo, Data Centre World and Smart IoT London. Disruptive will launch across the four events, and stream live coverage of interviews with technology experts and speakers across the two days. Exclusive interviews with industry-leading experts will be featured across the two days of broadcast. [embed]https://vimeo.com/159080871[/embed] Disruptive is an alternative approach to the everyday technology that is ingrained in our lives. Focussing on the IoT, analytics and truly digital and disruptive technology, Disruptive aims to get people interested in the tech behind technology again. Disruptive is a global technology channel which has been in development for the past six months in anticipation of the launch at Cloud Expo. Andrew McLean comments: “DisruptiveTech.tv is designed to change the way we look at technology, and the way we cover events in the technology sector.
It’s about bringing an exciting new approach to live coverage of technology, and pointing a spotlight at the truly disruptive technologies that are being developed by the biggest, and smallest, players in the industry.” Compare the Cloud will be filming short one-on-one interviews with cloud experts and industry leaders across the two-day event. To follow DisruptiveTech.tv’s live launch at Cloud Expo 2016, follow @disruptivelive on Twitter, check out the hashtag #disruptive, or visit the website at www.disruptive.live. You can watch the event at: www.comparethecloud.net/live or https://www.disruptive.live You can discover more about the events on our events page: https://www.comparethecloud.net/events/ ——————- About Cloud Expo Europe Cloud Expo Europe is the UK’s largest technology event and the world’s largest independent cloud event, featuring a major exhibition with a record 400+ cutting-edge suppliers plus a compelling conference of 300 global expert speakers - and thousands of visitors. ——————- About Compare the Cloud Compare the Cloud is an industry-leading cloud commentator, aiming to educate the public on the latest IT and cloud trends. Compare the Cloud champions the adoption and value of cloud computing, helps companies select services and providers best suited to their needs, and acts as a central hub, information resource and community for all cloud-related issues. ### IBM Acquires Optevia, Expanding Role as Solutions Provider for Public Sector Clients IBM today announced that it has acquired Optevia, a privately owned Software-as-a-Service (SaaS) systems integrator specialising in Microsoft Dynamics CRM solutions for public sector organisations. Optevia will join IBM Global Business Services and help meet the increasing client demand for CRM SaaS solutions within the public sector. [easy-tweet tweet="#CloudNews: @IBM have acquired #PublicSector #SaaS company @Optevia" user="comparethecloud"] Industry experts estimate that the worldwide Customer Relationship Management (CRM) opportunity is in excess of $23B, with cloud-based CRM solutions expected to surpass 50 percent of that total. The acquisition of Optevia will help IBM establish itself as a premier SaaS and digital consultant and accelerate leadership in CRM solutions. Optevia’s main focus on UK Emergency Services, Central Government, Local Government, Health Authorities and Housing and Social Enterprises allows it to offer its clients highly differentiated solutions. Optevia’s client base includes ministries, councils, regulators, licensing and grant management organizations, transport authorities and social housing organizations. “By acquiring Optevia, IBM will be able to provide Public Sector clients and prospects with a range of unique, industry focused CRM based solutions,” said Joanna Davinson, IBM Public Sector Leader - Europe. “This strategic acquisition will help strengthen IBM as a SaaS provider and Global Software Integrator.” Since its founding in 2001, Optevia has rapidly established itself as a trusted partner for Public Sector and Social Enterprise clients. The acquisition will allow IBM to scale Optevia’s solutions across other areas worldwide, where Optevia's software, assets and highly skilled, industry-focused workforce, coupled with its expertise, will significantly increase IBM’s current capabilities. For more information on IBM public sector services visit: http://www.ibm.com/uk-en/ ### How is the cloud changing the way we do business?
Now established as common practice for reducing operational expenditure, cloud computing has long been acknowledged as a technology game changer. It’s shaping up new business models through shared computing resources and propelling business innovation. Everything from taxis to handymen services are being streamlined across the cloud. [easy-tweet tweet="Businesses are able to design their operating models around technological capabilities thanks to the #cloud"] The Sharing Economy In 2015 we saw the rise of the “Sharing Economy”. In the context of the cloud, “sharing” refers to enterprises helping users share goods or services across peer to peer platforms. Businesses across all sectors are now able to design and build their operating models around technological capabilities thanks to the cloud. This is helping improve flexibility and efficiency to extend business reach into global markets, taking advantage of global value chains. The “Sharing Economy” model, fuelled by the flexibility of the cloud, is on track to fundamentally change the nature of competition. Boosting collaboration With the rise of this “Sharing Economy”, cloud is turning companies that would traditionally see each other as competitors into partners, working together across specialised service platforms that enable them to operate a streamlined, efficient business. The “Sharing Economy” model, fuelled by the flexibility of the cloud, is on track to fundamentally change the nature of competition. Cloud-based software is helping companies automate their manual processes, configuring them to meet an organisation’s specific requirements. Moreover, in a world where customer demand patterns shift at a moment’s notice, cloud software is giving companies the agility to modify business processes or roll out entirely new ones much more rapidly than traditional approaches. As such, businesses’ competition points are changing - shifting away from relying on the efficiency of operational technology as a key competition point, to focusing on boosting and refining their core offering. The rise of the “super-niche” As corporations increasingly outsource services to specialists, new business models are likely to emerge that focus more on the specific inconveniences and frustrations within B2B processes that can be streamlined across the cloud. As a result, entrepreneurs will be looking out for minute niches to own. For example, Amazon Dash Button is a B2B initiative to get manufacturers of devices that use consumables, such as printers to build in sensors that gauge how much has been used up and automatically put an order in to Amazon when a top-up is needed. With the rise of this “Sharing Economy,” cloud is turning companies that would traditionally see each other as competitors into partners Consumers vs businesses In the consumer space, cloud has completely revolutionised the way we think about investing in and owning physical assets. For example, we stream music and films, rather than investing in physical records, downloads or DVDs. We often talk about ‘consumers’ and’ businesses’ separately, forgetting there’s an important association between them. At the end of the day, businesses are run and managed by ‘consumers’ in another facet of life. 
This means that as we become increasingly dependent on instant services and transactions, business leaders are taking these expectations with them into their corporate lives, which is in turn translating into an increased demand for cloud providers that can offer this level of streamlined, flexible, instant access. [easy-tweet tweet="Cloud has revolutionised the way we think about investing in and owning physical assets" hashtags="SharingEconomy"] As such, the demand for these niche providers will continue to soar across B2B, and these emerging businesses will likewise fuel the growth of the on-demand and sharing economy even further – creating a self-perpetuating cycle of growth and demand for the entire cloud industry. ### How Cloud and Managed Services are enabling the ISV Community Google estimates that software development will be worth £30 billion to the UK economy by 2025. Cloud computing has been a key accelerator for this sector and has removed barriers to entry, enabling low-cost experimentation and delivery and, in turn, changing the software business model. [easy-tweet tweet="#Cloud computing has been a key accelerator for #software development" user="comparethecloud"] The democratisation of access to compute resources, combined with widespread low-cost network access, means cloud computing now provides the perfect platform for the development and scalable delivery of applications. the cloud is not a panacea for ISVs But the cloud is not a panacea for ISVs and challenges still remain in the delivery of secure, reliable software solutions to end-users. However, by partnering with a managed services provider that can offer agile development platforms and robust delivery services, many software developers will be able to reap the benefits of increased focus on core business areas, improved user experience and faster, more agile DevOps cycles. A developer’s cloud Developers are now leveraging Infrastructure-as-a-Service (IaaS) platforms to spin up cloud instances using their software and to tear them down when they’re done, without any manual intervention, and all from one portal (a short sketch of this pattern follows below). However, there are multiple service providers that offer a variety of cloud services, and software businesses need to consider their specific needs carefully when selecting a provider. Key considerations should include:
- Uptime commitment with the service provider, particularly how this relates to the critical needs of the relevant application
- Whether a transparent billing mechanism is built in with the cloud offering
- API functionality to allow your software to drive the infrastructure
- Tools or native platform capability that allow rapid cloning and deployment of whole environments, facilitating faster and more reliable DevOps cycles
Delivering a better user experience The SaaS model is creating new opportunities for both ISVs and their customers. Consumption-based charging models enable low cost of entry and low cost of software so clients can experiment with applications that optimise business processes, drive higher efficiency, productivity and growth. [easy-tweet tweet="The #SaaS model is creating new opportunities for both #ISVs and their customers" user="comparethecloud"] Cloud-based models allow ISVs to focus on their core goals of developing and delivering applications and improving their customer experience.
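As a rough illustration of the API-driven provisioning and teardown described under “A developer’s cloud” above, the sketch below uses AWS’s boto3 SDK to launch a short-lived instance and terminate it once the work is done. The region, AMI ID and tag values are placeholders for the example rather than recommendations, and the same pattern applies to any IaaS provider’s API.

```python
import boto3

# Placeholder region and AMI ID: substitute whatever your provider/account uses.
ec2 = boto3.client("ec2", region_name="eu-west-1")

# Spin up a small instance for a build or test run, tagged so it can be tracked.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "isv-build-test"}],
    }],
)
instance_id = response["Instances"][0]["InstanceId"]

# Wait until the instance is running, do the work, then tear it down so billing stops.
ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
# ... deploy and exercise the application here ...
ec2.terminate_instances(InstanceIds=[instance_id])
```

Wrapped in a script or a CI job, the same few calls give a DevOps cycle the "clone, test, destroy" rhythm described above without any manual intervention.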
Tasks like capacity management, infrastructure budget management and platform availability can all be offloaded to a cloud partner and, importantly, these costs can be married to usage and revenue for the ISV. Potentially other tasks can be offloaded too - ISVs working with a Managed Service Provider can also offload tasks such as patching, replication, redundancy and security. With the right partner the ISV can deliver agility to the DevOps cycle and then rely on the MSP to implement change control, security or compliance enhancements, business continuity and a robust availability and performance SLA for the production applications. With the right partner the ISV can deliver agility to the DevOps cycle ISVs can take a pragmatic approach to a move to SaaS, establishing whether it’s a good fit for their client base, their financial model and their software offering. But they can still take advantage of a partner that can deliver hybrid solutions that fit most use cases. Applications delivered to end-users in heavily regulated industries may not be suited to multi-tenanted platforms, but may still sit well on discrete, dedicated infrastructure with appropriate security and compliance controls. Cloud platforms where resources are rented by the hour may not necessarily offer the best value to applications with predictable workloads or those where the end-user signs fixed-term contracts. [easy-tweet tweet="The #cloud is reducing barriers to entry for new #software businesses " user="comparethecloud"] The combination of opportunities presented by IaaS and SaaS models has expanded the options available to ISVs for software development and delivery and, in turn, provided a greater number of options and better value solutions for end-users. The cloud is reducing barriers to entry for new software businesses and allowing existing ISVs to be more agile, customer-responsive and innovative. Both customers of these solutions and the ISVs themselves stand to gain considerable benefits in transitioning to the cloud and taking advantage of cloud infrastructure and managed services, as long as due diligence is undertaken in this transition. ### Promapp Wins Contract to Support East Africa Solar Revolution Promapp has announced a new contract win with UK-headquartered BBOXX to supply its market-leading software in support of the rollout of solar power throughout Africa. [easy-tweet tweet="#CloudNews: @Promapp software will support the rollout of #solarpower in #Africa with @BBOXX_hq" user="comparethecloud"] BBOXX offers an on-grid experience in an off-grid world, powered by a unique financing model that sells solar systems to the mass market on a monthly payment plan. While its finance and engineering operations are based in London, manufacturing is conducted in China and more than 150 staff work for the organization in East Africa in a network of retail shop operations as well as in a variety of support and service roles. Through this vast network of shops and outlets, BBOXX focuses on giving access to the fundamental need for electricity, combined with superior customer service. However, rapid company and organizational growth had led BBOXX to conclude that its strategy of managing business processes in areas such as customer service through its existing Word, Visio, Excel and Access software as well as a legacy SharePoint environment was unsustainable.
Indeed, it was increasingly unable to support a consistent approach to operational process, particularly around customer payment default issues. BBOXX prices its direct sales units to match these existing energy costs BBOXX systems are typically paid for by customers in monthly payments over 12 to 36 months, which include the price of the product as well as maintenance costs. These monthly payments are adjusted to the local market, matching the cost of kerosene fuel, making clean solar energy a viable alternative for all. Customers in Uganda, for example, usually earn $150-$200 per month and spend $10-12 on energy expenditure such as purchasing kerosene, batteries, and charging their phones. BBOXX prices its direct sales units to match these existing energy costs, spreading the cost of a solar system over time to widen its customer base, while enabling consumers to purchase clean renewable solar energy. Various billing processes were captured in inconsistent formats, causing confusion and limiting engagement from teams across East Africa. Following a comprehensive market review, BBOXX concluded that the best path forward would be to deploy a centralized and consistent sales, billing, and reporting process approach across its network of remote shops, which would also promote ownership and accountability at the local level as well as ongoing innovation and process improvement. [caption id="attachment_35592" align="alignright" width="300"] Laurent Van Houcke, Chief Operations Officer, BBOXX[/caption] “Although we work in very rural, isolated places, we still need to provide a consistent process approach for our staff to assist in managing distribution points,” says Laurent Van Houcke, Chief Operations Officer, BBOXX. “Great business process management software enables remote teams to improve process accessibility, provide a consistent execution of process and drive a continuous improvement culture.” The decision by BBOXX to deploy Promapp follows a comprehensive market review, but as Van Houcke explains, “Promapp is in a league of its own in terms of being user friendly, intuitive to use, and has an inherent capacity to support good communication across a diverse and remote workforce. For example, we have a department focused on sales and stock management which will now be able to define the process required to handle a customer order from the initial sale to ultimate installation in the village community.” “Promapp will also enable us to visualize processes in a live, dynamic environment. For example, where we might have a customer who has defaulted on a monthly payment, our staff in the shops now have a process where they can access a call center and communicate in real time via SMS to resolve the individual customer situation. The aim is to simplify our processes, enable easier access, and create greater staff involvement, thereby supporting our overall auditing efforts.” Promapp will also support BBOXX in making changes to processes on the fly, enabling even the most remote users to access up-to-date information, ultimately resulting in a more consistent quality of service for customers. While Promapp will be supporting current processes in the company to effectively manage existing customer orders, service, and payments, it will also be a strategic asset in supporting the company’s growth plans, which include a doubling of shops and the training associated with staff recruitment to manage the rollout of the new retail presence.
BBOXX aims to provide 20 million people with electricity by 2020. [caption id="attachment_35593" align="alignleft" width="300"] Ivan Seselj, CEO, Promapp[/caption] Ivan Seselj, CEO, Promapp, says, “We are thrilled to be working with BBOXX, a world-leading business innovator, and to have been selected to deploy our software in helping to support its quest to provide remote communities with access to reliable electric power. The company is a powerhouse of growth and we look forward to seeing the rapid impact which our business process management software will have on accelerating BBOXX’s vision of propelling access to energy across Africa as well as improving the lives of low-income people through access to clean energy.” ### The importance of maintaining a startup spirit As businesses expand and develop, their growing financial and reputational influence allows them to do things that they may have never dreamed of in their earliest days. However, this growth can also be accompanied by challenges, with businesses playing it safe rather than embracing the innovation that provided them with success in the first place – essentially, they forget where they’ve come from. [easy-tweet tweet="At @Dell we believe fostering a #startup spirit is important no matter how large and successful your business is"] That’s why, here at Dell, we believe that fostering a startup spirit is important no matter how large and successful your business is. Startups are characterised by risk-taking, agility, hard work and a culture of inclusivity – all of which could benefit established firms just as much as the new kids on the block. One of the ways that we try to keep that startup culture alive at Dell is by working with creative, new businesses on a daily basis. Our Dell for Entrepreneurs programme aims to give emerging businesses the resources, expertise and solutions they need to get to market faster. Whether a business needs advice on pitching, IT solutions or anything else for that matter, we’re happy to help in any way we can, because we realise that this is a two-way process. Dell for Entrepreneurs also puts us in contact with innovative thinkers and inspiring entrepreneurs – the kind of people that help keep the startup spirit thriving at Dell. Our Dell for Entrepreneurs programme aims to give emerging businesses the resources, expertise and solutions they need to get to market faster It’s also vital to remember the important role played by startups in the wider economy. Although not all of them will have been successful, a staggering 608,110 startups were launched in the UK alone last year. These businesses provide jobs and services to countless individuals across all sectors of the economy. Of course, the startups that do develop into household names may have an even bigger impact. Facebook was a startup, Twitter was a startup and, in fact, Dell was a startup – even the most successful multinational companies had to start somewhere – and likely needed help to do so. When startups are given the opportunity to succeed, the wider economy also benefits, including more established industry players. The Dell Women's Entrepreneur Network supports the crucial role that women play in driving global economic growth Taking an active involvement in entrepreneurial and startup programmes can also have wider social benefits. Establishing partnerships in less developed countries has already shown itself to have positive impacts in tackling poverty.
Startups like UpEnergy and WeFarm, in Uganda and Kenya respectively, are using entrepreneurialism to overcome genuine local problems. Similarly, our own Dell Women's Entrepreneur Network is focused on supporting the crucial role that women play in driving global economic growth. By connecting female entrepreneurs with the right networks, sources of capital, knowledge and technology, we’re putting equality and diversity front and centre of the startup ecosystem. [easy-tweet tweet="Larger firms would also do well to embrace the #startup traits of innovation and risk" user="comparethecloud"] No matter how much you’ve grown as a business, don’t neglect the startup culture that helped you along the way. Innovation and risk-taking are fast becoming known as traits relating to startups specifically, but larger firms would also do well to embrace them. Getting involved with the startup and entrepreneur community can not only help your own organisation to flourish, but it can also have other advantages. Embracing the startup spirit provides broader economic and social benefits that support innovation happening across the globe. ### Securing data transfers in the financial service sector Technology has always been important to the financial services sector, from the ancient techniques used to mint coins to modern-day encryption methods. Recently, however, a specific set of technology concerns has emerged within financial institutions, whether they are huge multinational banks or Fintech startups. [easy-tweet tweet="#Technology has always been key for #financial services, from ancient minted coins to modern #encryption"] For example, here at Bluebird we’ve been working with a UK-based tier 2 bank to help with their strategy for expansion and ensuring that the IT infrastructure was robust enough to meet expectations, compliance and regulation. Technology is providing more opportunity than ever before for financial institutions to expand, create new revenue streams, and ultimately overtake more established, but perhaps less agile, industry heavyweights. However, growth is not always easy to manage, which is where Bluebird comes in. In particular, our IBM Managed File Transfer software is helping financial businesses to achieve scalability while still adhering to the industry’s stringent compliance regulations. Businesses experiencing rapid growth often struggle to manage the volume of files that they need to receive, particularly in the financial sector where the sheer weight of invoices, credit checks and onboarding can become overwhelming. Technology is providing more opportunity than ever before for financial institutions to expand IBM MFT contains a suite of tools to help businesses overcome these challenges, by providing the security and reliability that their employees and customers demand. Examples of the featured products include IBM Sterling Control Centre, which provides a management solution for all file transfers and associated Service Level Agreements, and IBM Sterling Connect:Direct, a point-to-point file transfer solution optimised for high-volume data delivery between enterprises. Given the rise of Fintech startups, we’ve also ensured that IBM MFT contains software that is specifically geared for use by firms that may not have a huge amount of time and resources. MFT Go!, for example, is a complete managed service solution for secure data file transfer that is delivered, either on-premise or from the cloud, for a set annual fee.
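For readers less familiar with the category, managed file transfer tooling essentially takes the kind of scripted, point-to-point transfer sketched below and adds scheduling, monitoring, retries and audit trails at enterprise scale. The fragment is a generic illustration using the open-source paramiko SFTP library, with invented hostnames, paths and credentials; it is not an example of the IBM products themselves.

```python
import paramiko

# Illustrative connection details; a real deployment would pull these from
# configuration and a secrets store rather than hard-coding them.
HOST = "sftp.partner-bank.example"
USER = "acme_transfers"
KEY_FILE = "/etc/keys/acme_transfers_rsa"

# Open an SSH connection (trusting only hosts already in known_hosts)
# and start an SFTP session over it.
ssh = paramiko.SSHClient()
ssh.load_system_host_keys()
ssh.connect(HOST, username=USER, key_filename=KEY_FILE)
sftp = ssh.open_sftp()
try:
    # Push the day's invoice batch and confirm the remote file size matches.
    local_path = "invoices_batch.csv"
    remote_path = "/inbound/invoices_batch.csv"
    remote_attrs = sftp.put(local_path, remote_path)
    print("Transferred", remote_attrs.st_size, "bytes to", HOST)
finally:
    sftp.close()
    ssh.close()
```

Everything a one-off script like this leaves out, such as guaranteed delivery, SLA reporting and audit trails, is precisely what the managed products described above are there to provide.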
Setting up a secure file transfer system between financial institutions and their partners is required for compliance purposes, but upfront costs can be prohibitive. With MFT Go!, however, businesses can remain safe in the knowledge that their subscription charge will not change no matter how quickly they grow and how much data they transfer. [easy-tweet tweet="#Financial institutions need to be able to move #data within and between organisations securely" user="bluebirdITS"] Financial services firms are adopting these kinds of IT solutions because they provide security and reliability in an industry where both are of paramount importance. A single data breach can lead to pay-outs, compensation claims and long-term reputational damage that many businesses simply cannot recover from. As such, financial institutions need to be able to move data, both within and between organisations, securely. Unreliable data transfers are also likely to lead to a lack of trust and hinder employee productivity. At Bluebird IT, we understand the importance of both reliability and security in the financial service sector. With more than 30 years’ experience delivering world-leading IBM technology, we’ve worked with countless businesses across a wide range of industries. If your organisation is based in the finance sector, you’ll be acutely aware of how important security is to your partners and customers. At Bluebird we share this view, so why not see if our industry-leading B2B file transfer and management technology can help take your business to the next level? ### Is cloud computing facilitating poor security practices? Despite the myths and rumours, the cloud does not make it easier to have poor security from a technical standpoint; it may, however, make poor security feel less painful psychologically. The thing many fail to understand is that the cloud is just the same technologies from an on-premise environment running somewhere else. Any risks that there would have been running a CRM app like Salesforce on premises are still there in the cloud. [easy-tweet tweet="Businesses think there are either unique risks or magical protection afforded by adopting the #cloud"] Though the organisation’s share of the risks is smaller, since the provider takes care of some, in a case like Amazon’s EC2, where servers are running in the cloud, the organisation is just as responsible for security from the operating system up as it would be if the server were in its own data centre. Many fail to see that clearly and think that there are either unique risks or magical protection afforded by running in Amazon’s world. Of course, when something is “over there” it feels like it’s less your problem. So many things that would have perhaps been a nagging feeling about a server you set up in your data centre may feel more distant when they are running in the cloud. Has the cloud made people more complacent? The cloud hasn’t made people more complacent to risks, but it also doesn’t seem to have made them more attentive to them either. This varies from organisation to organisation, of course. Some see the very specific language about what duties and risks are theirs in the contracts with their cloud providers and it wakes them up to all the things that may go wrong that they have forgotten. The complacency comes from the fact that risks are still prioritised for action alongside everything else that pulls on organisations.
If it will cost twice as much to fix a security risk as it would to increase profit margins by a third, what do you think an organisation will do? Organisations will ultimately act to further their main interests and IT security risks don’t often make the cut. Organisations will ultimately act to further their main interests The single most common mistake users of public cloud make is to not read their contracts and understand where their responsibilities truly lie. Often people are unclear as to when and how the creation of a server in the cloud moves from the care and security of the provider to them. I’ve run into folks who mistakenly thought their cloud provider was patching servers through some back door for them. They weren’t, and the servers went unpatched for months. Often organisations will forget that the layer of management given to them by the cloud provider will also need some security. The administrative users and rights used to configure and control the cloud systems will need to be treated just as carefully as any other privileged users in their systems. Another mistake that is common is to think that the cloud provider will have services that their on-premises systems did, simply waiting for them to use. It’s true that Amazon, Microsoft, and others do build in many services for customers, but before moving to the cloud, organisations really must do a full inventory of everything they were doing on-premises to identify gaps. Security is often an area where things are missed when moving workloads from on-premises to the cloud. Maybe there are different groups involved – the operations folks are spurring the move to the cloud for cost reasons, but the security folks only find out at the last minute and have to scramble to make a change to support the move. How do you secure your data in the cloud? Properly securing public cloud resources is, in the end, no different than securing systems running on-premises. In principle there are no differences, and in operation the differences are minimal. The real trick to appropriate security in the public cloud is to treat it as if it’s just another data centre. Attempt to build security that’s at least as good as what you had on-premise, or perhaps even take the opportunity of the new build-out to make improvements that you would have made on-premise if you had only had the time. If there are ways that you want to apply security patterns that turn out not to work because things are running in the cloud, then deal with them as exceptions. You won’t find many. [easy-tweet tweet="Properly securing #publiccloud resources is, in the end, no different than securing systems running on-premise"] From a security perspective, cloud has been mature for years. If you look at the intimidating list of security and even compliance certifications that the major cloud providers have, you can see that no IT shop except the most elite (and well-funded) has ever come close to offering a platform as well secured. They have to. If the cloud providers had a major gap in security, especially considering how much attention is being paid to that security, then they would be finished overnight. It’s sufficient to say that if you have very poor security in the public cloud, it’s likely you brought it in with you when you walked through the door. ### Deutsche Telekom and Huawei Launch New Open Telekom Cloud On March 14, 2016 at CeBIT 2016, Deutsche Telekom announced the launch of Open Telekom Cloud.
The new public cloud platform will provide European enterprises of all sizes with on-demand, pay-as-you-go, secure cloud services to respond to fast changing market conditions. Huawei was selected as the hardware and software solution provider for Open Telekom Cloud. [easy-tweet tweet="#CloudNews: Deutsche Telekom announce the launch of #OpenTelekomCloud" user="comparethecloud"] Deutsche Telekom is driving digitization in Germany and across Europe. Huawei’s innovative hardware and software including servers, storage, networking and Cloud OS solutions provide the infrastructure and expert technical support for its new public cloud services, which will accelerate Deutsche Telekom’s position in the European cloud services market. “We are adding a new, transformational cloud offering to our existing portfolio of cloud services,” said Deutsche Telekom CEO Tim Höttges at CeBIT in Hanover. “For our business customers in Europe this is an important new service to support their digitization, and a critical milestone for us in our ambition to be the leading provider of cloud services in Europe.” Commenting on the deal, Huawei Rotating CEO Eric Xu said: “Huawei and Deutsche Telekom have a common strategy and vision in terms of cloud computing. Both parties have the dedication, experience and talent to serve enterprises, as well as outstanding public cloud technology and skills. “The strategic partnership allows each party to fully play to their strengths, providing enterprises and the industry with various innovative public cloud services that are beyond those provided by over-the-top content players. At Huawei, we are confident that, with esteemed partners like Deutsche Telekom, we can turn Open Telekom Cloud into the standard of public cloud services for the industry at large.” [caption id="attachment_35553" align="aligncenter" width="606"] Huawei’s Rotating CEO Eric Xu gave a speech at the press conference announcing the launch of Deutsche Telekom’s Open Telekom Cloud.[/caption] Open Telekom Cloud is an OpenStack-based Infrastructure-as-a-Service solution, powered by leading new ICT from Huawei and operated by T-Systems, the enterprise-focused subsidiary of Deutsche Telekom. The solution provides rapid access to flexible, affordable and secure computing, storage, network components and other services. The new offering will help Deutsche Telekom gain a strong foothold in a market segment largely dominated by U.S. cloud services providers. “We will combine Deutsche Telekom’s ubiquitous network, connectivity, and T-Systems’ understanding of enterprise applications and service advantages with Huawei’s strong R&D capability, innovation and experience in serving enterprises. We will also continue to create differentiating and innovative cloud services to meet the needs of enterprises and the industry, strengthening T-System’s leading position,” expressed Dr. Zhang Haibo, Head of Global Public Cloud Solutions, Huawei. According to analyst firm Pierre Audoin Consultants (PAC), there is significant demand for a Germany-based public cloud solution.  “Access to a scalable, inexpensive public cloud provided by a German service provider from a German data center under German law will be very attractive to many customers in Germany,” explained Andreas Zilch, Senior Vice President – User Business & Lead Advisor, PAC Germany. 
“The combination of a competitive service and German legal security represents a unique selling point right now.” Open Telekom Cloud allows enterprises to order IT infrastructure and software solutions in a central booking portal in just a few clicks, making resources available in minutes. In this way, Deutsche Telekom is helping its customers to digitize their business models by providing simple and flexible services that can be scaled to suit organizations of any size and in any industry. “More and more customers are discovering the advantages of the public cloud. But they want a European alternative,” said Anette Bronder, head of the T-Systems Digital Division and in charge of the cloud business. “With the Open Telekom Cloud we are now offering customers the right platform and solution. It is simple, secure and affordable.” SAP is one of the first Open Telekom Cloud users and service providers. T-Systems will initially provide the necessary infrastructure services for frequently used SAP applications, as well as SAP HANA analysis technology, in Open Telekom Cloud. These services are also available in any combination of public and private clouds. "House of Clouds" in Biere Open Telekom Cloud will be set up in Europe's most cutting-edge data center, located in Biere, Saxony-Anhalt. Consequently, any data processed will be subject to Germany's strict data protection law. The Biere data center, and its twin center in Magdeburg, host almost all of Deutsche Telekom's ecosystem of technology and software partners. Deutsche Telekom's "House of Clouds" offers short distances to link one application to another, one cloud to another. Leveraging the extensive experience of T-Systems' cloud experts, data and entire application environments can be transferred easily from the public cloud to an even better protected private cloud. Experience products and services live at CeBIT from March 14-18 at the Deutsche Telekom stand C38 in hall 4 and at the Huawei booth B54 in hall 2. To learn more about Huawei at CeBIT 2016, please visit: http://enterprise.huawei.com/topic/2016Cebit/index.html ### IoT helping to empower the building industry and create the zero-energy home Opportunities for the Internet of Things (IoT) are emerging fast, and it’s an exciting space to be in right now. One company, HOUZE, is helping to integrate disruptive technologies into consumer and commercial real estate developments and buildings. Their vision is to create an affordable net-zero energy home using advanced building technologies, and Kii has collaborated with them to help realise this dream. [easy-tweet tweet="#IoT has the potential to revolutionise the #housing and #building industry"] IoT is a disruptive force that has the potential to revolutionise the housing and building industry by redefining quality of life. By using next-generation building systems, technologies and materials, the HOUZE zero-energy home aspires to meet the highest energy efficiency standards, while creating a safer, stronger structure and aiming to deliver a near-zero carbon footprint. As an IoT platform provider, Kii will help HOUZE to create IoT solutions and services for the real estate development and building industries, for both new development and retrofit. The combination of clever energy management from HOUZE and Kii’s IoT platform will change the way homes are built and retrofitted with advanced energy-saving technologies.
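To make the "net-zero energy" goal concrete, the small sketch below totals a home's generation and consumption readings to check whether it produces at least as much energy as it uses over a given period; the figures and field names are invented for the illustration and are not drawn from HOUZE's or Kii's actual data model.

```python
# Hypothetical daily meter readings in kWh: energy the home generated (solar)
# and energy it consumed. A net-zero home generates at least as much as it
# uses over the period being measured.
readings = [
    {"generated_kwh": 18.2, "consumed_kwh": 14.5},
    {"generated_kwh": 6.1, "consumed_kwh": 16.0},   # an overcast day
    {"generated_kwh": 21.4, "consumed_kwh": 13.8},
]

generated = sum(r["generated_kwh"] for r in readings)
consumed = sum(r["consumed_kwh"] for r in readings)
net = generated - consumed

print(f"Generated {generated:.1f} kWh, consumed {consumed:.1f} kWh, net {net:+.1f} kWh")
print("Net-zero or better over this period" if net >= 0 else "Still drawing from the grid overall")
```

In a connected home the same balance would be computed continuously from live device telemetry rather than a hand-typed list, which is exactly the kind of monitoring an IoT platform is there to support.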
HOUZE is establishing a network of ‘living labs’ across the globe illustrating the lifestyle benefits of IoT HOUZE is also establishing a network of ‘living labs’ across the globe that will illustrate the lifestyle benefits of IoT in both residential and commercial buildings.  Kii will be the platform enabling smart automation and monitoring in these living labs, which will contain smart devices and solutions from a variety of device manufacturers that operate within the HOUZE IoT Ecosystem. [easy-tweet tweet="The global platform from @kiicorp enables all three layers of #IoT solutions (things, services, apps)"] Kii enables customers across the world to rapidly create compelling IoT solutions with its scalable, easy to use and feature rich IoT platform. The global platform enables all three layers of a typical IoT solution (things, services, apps), thereby significantly reducing the time it takes to create solutions, freeing up the customer to focus on their solution differentiation. Kii provides a flexible deployment model (public cloud, dedicated cloud, private cloud) globally, thereby enabling seamless solution deployment for customers of all sizes. ### Study shows clear ROI for omnichannel engagement centre solutions In today’s world, consumers are faced with almost infinite choices, constantly competing for their attention and spending power. As a result, they are quick to voice their anger or jump ship to a competitor if expectations are not met. Most brands now recognise that providing good customer experience (CX) is imperative to their current and future success. Most have a company-wide strategy in order to govern their approach to providing this experience. [easy-tweet tweet="Most #brands now recognise that providing good #CX is imperative to their success" user="comparethecloud"] However, while these strategies are often well-intentioned, too often they are not joined up. For example, taking customer calls might be handled by a contact centre agent while responding to messages on social media may be the responsibility of community managers or the corporate communications team. Even worse, these separate teams acting as individual siloes within an organisation don’t have access to the same information or may hold different information altogether. [caption id="attachment_35383" align="alignright" width="350"] Source: http://www.genesys.com/uk/about-genesys/resources/forrester-study-reveals-benefits-of-an-omnichannel-engagement-center[/caption] Often, this can cause frustration on the part of customers and doesn’t make for the experience that companies are striving for when they undertake these ambitious CX strategies. We’ve all experienced this scenario ourselves, when calling the bank or electricity provider for example turns into an ordeal because customer service agents don’t seem to know what we’re talking about - despite having called up about the issue we’re having numerous times. ‘Omnichannel’ CX has long been regarded as the solution to these issues. Omnichannel seeks to give the customer the best experience possible by gathering context and information across all engagement channels throughout the customer lifecycle. So in an omnichannel experience, the agent can see how many times a particular customer called or engaged with an agent, the details of those conversations, and the customer’s personal information and preferences—and then apply all of that customer information in the moment of a new interaction, seamlessly and proactively. 
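To make that idea of cross-channel context concrete, here is a minimal sketch of the kind of unified interaction history an omnichannel engagement centre assembles so that an agent sees every previous touchpoint, whichever channel it arrived through. The class and field names are invented for the illustration and do not reflect any particular vendor's schema.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Interaction:
    channel: str          # e.g. "voice", "web_chat", "email", "social"
    timestamp: datetime
    summary: str

@dataclass
class CustomerContext:
    customer_id: str
    preferences: dict = field(default_factory=dict)
    history: list = field(default_factory=list)

    def record(self, interaction: Interaction) -> None:
        """Append a touchpoint from any channel to the single shared history."""
        self.history.append(interaction)

    def briefing(self) -> str:
        """What an agent would see before handling a new interaction."""
        lines = [f"Customer {self.customer_id}: {len(self.history)} previous contacts"]
        for item in self.history[-3:]:   # the three most recent touchpoints
            lines.append(f"  [{item.channel} @ {item.timestamp:%d %b %H:%M}] {item.summary}")
        return "\n".join(lines)

# A customer who phoned and later opened a web chat: the agent sees both.
ctx = CustomerContext("cust-0042", preferences={"preferred_contact": "email"})
ctx.record(Interaction("voice", datetime(2016, 3, 14, 9, 5), "Reported a billing error"))
ctx.record(Interaction("web_chat", datetime(2016, 3, 15, 17, 20), "Asked for refund status"))
print(ctx.briefing())
```

However it is stored, the point is that context is keyed to the customer rather than to the channel, which is what lets a new interaction start from everything that has gone before.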
But until now, questions about the value of omnichannel have remained. It sounds good in theory, but does it actually benefit the bottom line? Analyst firm Forrester Consulting recently published a study commissioned by Genesys which looked to answer this very question. The firm interviewed and collected data from five multinational companies that had implemented an omnichannel customer engagement solution. These companies included an American multinational financial services corporation, a Chinese multinational computer technology company, an African mobile communications company and a large financial conglomerate in South America. Prior to implementing the omnichannel solution, the companies in the study had all deployed call centre (voice only) or contact centre (limited channel support) technologies — most often using solutions from multiple vendors. In each case, agents had to switch between multiple screens to serve a customer and frequently had little information about the purpose of their call or the customer journey history. These limitations led to long call times during which customers had to provide the same information multiple times and did not receive personalised customer service. the companies involved in the study would see an ROI of over 150% over 5 years by implementing an omnichannel customer engagement solution The study found that the companies involved would see a return on investment of over 150% over five years by implementing an omnichannel customer engagement solution in response to these issues, with a payback time of just over 12 months. The study identified several areas where the solution produced significant productivity and cost gains. While it may seem like an obvious starting point, getting rid of the various different and dispersed software and hardware elements that make up the existing CX architecture can produce significant cost and productivity savings. Moving to a cloud-based solution removes the need for hardware, while one software license rather than several creates significant economies of scale. IT labour and maintenance costs are also reduced. [easy-tweet tweet="#Omnichannel solutions can resolve the problem quickly and effectively, giving the customer an outstanding experience"] Another key area identified by the study was call time. If an agent doesn’t have the full record of a customer’s previous interactions, the customer will have to spend time repeating themselves and going over their issue once again. With an omnichannel solution, this doesn’t happen and the agent can resolve the problem much more quickly and effectively, giving the customer an outstanding experience. Forrester found that call times were reduced by a full minute – leading to a combined total saving of almost $20 million over five years. Web chat is another customer engagement tool which has grown in sophistication in recent years, and is being implemented more and more. One of its main uses is to help customers who are browsing the web who may be in need of assistance or are unsure how to proceed with a transaction. Live agents can recognise these issues, using contextual and background information to assist these customers. web chat integration has serious benefits for the bottom line Again, the Forrester study found that web chat integration as part of a wider omnichannel solution has serious benefits for the bottom line. Having agents on hand to help customers online meant a 30% reduction in abandonment and also created more opportunities to cross-sell and upsell.
Overall, the study reported that web chat integration would lead to a projected revenue increase of $1 million over five years. Automation is also increasingly being used to provide customers with outstanding CX, and is one of the key aspects of an omnichannel solution. This was also identified as a key area of financial benefit in the study. Automated outreach, chat, self-service and call-back features all allow customer queries to be resolved speedily and efficiently, in a way that suits them. This means that agents can be better deployed, leading to a reduction in headcount and therefore significant cost savings. While omnichannel has long been regarded as the solution to providing outstanding customer experience to consumers, there hasn't always been a clear business case and too often the concept has been regarded as a woolly marketing term. Now, in light of the recent Forrester study, there is clear evidence that implementing an omnichannel customer engagement solution not only gives customers the best experience possible but also directly benefits the bottom line by providing an ROI of over 150%. ### Cybersecurity closed door evening, sponsored by 8MAN Last week I attended an exclusive, invite-only cybersecurity discussion in the City of London at an evening hosted by 8MAN. It was a lively debate on how the current and future state of cybercrime prevention is shaping up. Jens Puhle, UK Managing Director for access rights management specialist 8MAN, moderated the evening, and was an incredibly knowledgeable host. [easy-tweet tweet="CTC's @NeilCattermull recaps @8Man_UK's #CyberSecurity evening" user="comparethecloud" hashtags="infosec"] Research has found that almost 40 per cent of data breaches in the UK are caused by insider theft, but 8MAN believes the true figure is much higher. It has found that many firms are not equipped to discover data theft in the first place, or are either unwilling to report crimes or unaware of how to do so. One of the speakers of the evening, Esther George, initiated and designed the Global Prosecutors E-Crime Network (GPEN) in 2008 to enable cybercrime prosecutors around the world to learn and benefit from sharing information, experiences and strategies with each other – which is just incredible, and such a revolutionary development in the way we look at cybercrime. Esther has spearheaded 8MAN's launch of Cyber Detect & Protect, a new consultancy service to help organisations protect themselves from insider cybertheft and successfully work with prosecutors to bring criminals to justice. She comments: “Insider theft is one of the most damaging kinds of data breach a company can suffer, but the level of crime is drastically under-reported. Many firms have no idea how to identify a breach, but even if they do it is still all too common for it to go unreported, either because the firm fears the backlash of it going public, or does not know how to proceed. “There is a lack of knowledge about what information is needed to help the authorities build a solid case for prosecution, or where to report the crime in the first place. Many firms go straight to the Information Commissioner’s Office (ICO) and never take their case to the police.
While the ICO will investigate and can issue fines, this means the police and CPS may not even be aware of many cases.” Another extraordinary speaker was Colin Nicholls QC, who successfully defended in the UK’s first hacking case in the House of Lords, which led to the passing of the Computer Misuse Act 1990 (R v Gold and Schifreen [1988] AC 1064). Philip Manning, a consultant in financial markets and former board member of ADIB (UK) Ltd, MBNA Europe and Barclays Financial Services, also spoke, as did Philip Virgo, the Chairman of the Conservative Technology Forum and former Secretary General of EURIM, the European Information Society group, and Simon Fordham OBE, the Chairman of The International Cyber Security Protection Alliance. It looks like the Met are getting tough on Cybercrime in a big way It looks like the Met are getting tough on cybercrime in a big way, but the international links to cybercrime still delay and mask many convictions due to individual countries' bureaucracy. DCI Gary Miles of the Metropolitan Police was due to speak at the event but unfortunately couldn’t attend; the Metropolitan Police’s Operation Falcon provided the statement below in his absence. [quote_box_center] Operation Falcon, Metropolitan Police “U.K. law enforcement (NCA, Met Falcon, Regional Organised Crime Units) are really keen to build better working relationships with business. This area was never a priority, so frankly forces largely ignored it. Business didn't often report as they had no confidence we would do anything about it and, if we did, often we didn't have the capability or experience to investigate effectively. That's changed: the Home Office has substantially invested in new capacity and capability, but it needs to be bigger. The Met are also investing up to 500 officers and staff in this area now, a huge increase in the face of 20% cuts. No other forces have invested to this level. Most haven't increased staffing at all. This led to the creation of Falcon (Fraud and Linked Crime Online). Since we started in August 2014 we've made more than 1,000 arrests. Everyone is getting charged, very few are getting acquitted and we're getting good sentences. If we can identify a real person (the big challenge), all the evidence is on their devices or in their bank accounts, so they've got nowhere to go. More cyber criminals aren't convicted because forces don't investigate, as they don't have the capability or capacity. The two biggest challenges are the fact the crime is often global, so there are jurisdictional issues and delays, and we need communications data to prosecute. Often that data is in countries or companies that can't or won't give it to law enforcement. Increasing levels of encryption also make it harder and harder to intercept comms or retrieve it evidentially through digital forensics. The government's draft Investigatory Powers Bill addresses some of this and is a big step forward, but only covers UK-based providers. What about overseas? That's a massive gap. These are the biggest challenges to successful prosecution, although there are plenty of UK-based cyber criminals to tackle as well as the international ones. Where business can help is firstly to report everything, so we all understand the scale of the problem and properly resource an effective response. Where big business can help is supporting us through their resources, be it skilled staff supporting investigations, training, access to business intelligence and technology, or sponsorship to help fund our work.
On the mistake front, companies leave it too late to tell us they've had a problem, or we often only communicate through lawyers. This delays everything, leads to misunderstanding and means evidence is often lost. Tell us early what is happening in a big breach. If you change your mind about cooperating or prosecuting later, we'll walk away. We understand businesses' desire for confidentiality and the need to get business back to normal as quickly as possible, and we won't undermine that. The only time anything is public from our point of view is when someone is charged and a reporter sees them at court. A cyber incident crisis might be the first for the company, but it won't be for us. We can advise you on what worked and didn't work in other incidents to help businesses make the right decisions under pressure. I'd like to see more international cooperation between governments and law enforcement to bring offenders to justice, along with timely access to international comms data. I have a counter-terrorism background and want to see the same level of cooperation in this area.” [/quote_box_center] ### Competing in the cloud has never been more important for traditional banks in the face of new digital competition As digitally-focused challenger banks continue to emerge, it’s clear that more traditional institutions are having to take a fresh look at their technological infrastructure if they are to begin competing on an equal footing, never mind taking a lead. [easy-tweet tweet="Digitally-focussed #banks are challenging and disrupting the legacy #financial model " user="stevenicsbbox" hashtags="fintech"] In order to stave off the increasingly sophisticated threat of the likes of Atom, Mondo, Monese and Starling, the more established retail banking leaders have to start taking a leaf out of their challengers' book by rejecting legacy models and making a leap into brave new territory. Effective, secure and fast mobile banking, matched by joined-up service design, is proving to be a major disruptor as we continue to see a decline in physical branches and a move towards a fully digital service. More and more, the bank is becoming a virtual concept rather than a high street or call centre entity - though many still want the peace of mind that accompanies talking directly to experienced banking staff. More and more, the bank is becoming a virtual concept rather than a high street or call centre entity The picture is not looking good for the traditional retail institutions, with significant underinvestment currently the norm; indeed, it’s thought that digital latecomers could see up to 35 per cent of net profit eroded, while those maximising digital's benefits may realise a profit uplift of 40 per cent or more (McKinsey & Company, 2015). Essentially, those that are failing to make a meaningful transition are only putting off the inevitable: the real investment required to protect against the risk of revenue loss and to create a differentiated customer experience featuring truly innovative digital applications. Regular bad press and sustained pressure from financial services regulators tell us that the old guard is now facing a crossroads: adapt and invest now, or cling to outdated core systems that could eventually prove catastrophic. The latter course risks a situation where banks fade along with their older client base.
Clearly, we’re all going increasingly mobile – we expect to now do everything with immediacy on our smartphone from any remote location – and traditional banks must reflect and offer this agility if they are to keep our cash and custom. In other words, digital banking is the key battle ground, and the new kids on the block are often doing it better, while simultaneously avoiding traditional pain points in the customer experience. [easy-tweet tweet="Traditional banks must offer agility if they are to keep our cash and custom" user="comparethecloud" hashtags="fintech"] Nevertheless, while increasing digitisation might mean cost savings, it’s also often accompanied by the double-edged sword of less customer engagement. Analysis suggests that it’s a delicate balancing act, with lack of engagement easily triggered by a less-than-preferred channel, giving rise to a situation where the customer could walk away. In that sense then, the challenge is on to bridge the gap and ‘humanise’ the digital banking experience; to provide some of the comforts of the physical branch alongside the agility of the virtual bank. Either way, it’s clear that innovation in itself is not enough. Digital convenience must be matched by engaging customer service at every step of the process. [easy-tweet tweet="The challenge is on to bridge the gap and humanise the #digital banking experience" user="comparethecloud" hashtags="fintech"] The cloud represents a compelling answer. Through cloud migration and targeted, determined action, banks can handle big data, centralise everything for speedier transitions and realise cost reductions, while still meeting high customer expectations. The ability to add new business functionality quickly is another major asset, while cloud providers’ increasing drive to heighten security and meet compliance demands is making the cloud a natural choice for financial institutions. However, the ability to maintain service in all circumstances could be the real differentiator. Moving to a cloud solution that is designed to provide a high level of resilience is hugely attractive to the banking sector, and that is where it can excel. Legacy systems can also work in harmony with the cloud Legacy systems can also work in harmony with the cloud, so the move from old to new doesn’t have to be so dramatic and daunting. As a strategic platform for financial institutions looking to improve their services, the cloud is difficult to rival; in fact, it’s more a case of when, rather than if, banks will migrate their systems. Just how easy they want to make it on themselves though could be the real long-term question. ### Leica Microsystems sees Global Data Protection and Compliance with Druva Druva has announced that Leica Microsystems, a multi-national company that develops and manufactures microscopes and other scientific instruments, has deployed inSync for central protection and visibility of data stored on end-user devices. With Druva inSync, Leica Microsystems can manage and control how data is protected on each device, while also meeting the compliance requirements of each geographic region that the company operates in. [easy-tweet tweet="#CloudNews: @Druvainc's inSync to be deployed across Leica Microsystems" user="comparethecloud"] Leica Microsystems operates in more than 100 countries worldwide, as well as working with a network of international distribution partners. Its 4,000 staff cover multiple countries, working remotely and while on the road. 
This meant that data protection for employee endpoint devices was a potential challenge. Using Druva, the company’s IT team can automatically backup data from all endpoints to a central location without any interaction required. “Before Druva, we had a large number of sales and service employees working remotely and it was challenging for IT to make sure that the data on employee devices was secured and protected. Users were supposed to back up their own devices using custom software, but they did this infrequently, if at all. With Druva inSync, our company data is protected, the user experience is seamless and the overall data protection implementation is very, very reliable,” said Oliver Barner, European IT Services Manager at Leica Microsystems. As part of its implementation of Druva, Leica Microsystems can manage how data is protected and retained based on local and regional data protection requirements. “InSync met our requirements as a multi-national company - German data privacy regulations prohibit the company from using geo-location to track laptops in the country, but in other places, such as the U.S., it’s part of the company’s strategy to combat lost or misplaced devices. It’s great that we can tailor the solution when it comes to these sorts of features,” explained Barner. Druva provides cloud-based endpoint data protection and governance, working with public cloud instances from AWS and Microsoft as well as private cloud implementations. Druva customers can deploy globally and scale to any number of users and select from 32 global storage locations to meet local compliance requirements and regulations. “Companies have more mobile workforces than ever before – employees want to work where and when it suits them, and companies want to support that trend to get more productivity from those employees. However, protecting the data that employees create has to be part of any strategy here. Druva helps companies ensure that their data is protected and governed in accordance with local market legislation and compliance regulations, while also making it simple to administer and run,” commented Rick Powles, Vice President EMEA at Druva. ### Identity is everything: protecting the organisation from the growing human threat It seems that every other day we hear of or read about a data breach. Yet it’s chiefly the big-brands and other high-profile companies – or in the case of Ashley Madison, controversial ones – that get the most airtime. Imagine just how many incidents we don’t hear about. It’s estimated that on average there are more than four breaches per day, the vast majority of which never achieve notoriety. [easy-tweet tweet="Imagine just how many #data breach incidents we don’t hear about" user="comparethecloud @sailpoint" hashtags="infosec"] With SMBs and large companies in the list of those exposed, it’s an undeniable fact that no company is safe. Last year there were at least 1,400 data loss events recorded, releasing over 169 million records. The common thread with all these data breaches? Someone inside the company did something they weren’t supposed to do. Exposure points have evolved      As hackers become increasingly more skilled at breaking through perimeter defences, new technologies are needed to help secure enterprises and their data silos. Just as we have evolved our thinking about how our employees work and access data, so too have we evolved our approaches to protect against digital threats. 
new technologies are needed to help secure enterprises and their data silos If this past year has shown us anything, however, it’s that network security alone is no longer sufficient. The perimeter that once held our information safe has been eroded. Instead of brute force attacks and SQL injections being the norm, intruders have begun to favour social engineering as the primary attack vector, allowing them to instigate a breach from within. Phishing emails and other means through which people inadvertently release information represent the greatest threats to companies today. A billion points of exposure Companies operate with multiple internal and external users entering their systems and accessing their data every day: employees, contractors, vendors and suppliers, partners and customers. Considering the sheer volume of users, applications and various levels of data access, it is easy to imagine an enterprise managing over a billion points of access. These points of access can easily become points of exposure. Out of those billion points of exposure, it only takes one to be compromised for an organisation to suffer damage worth millions. [easy-tweet tweet="The #security vector companies need to focus on is human" user="comparethecloud" hashtags="infosec"] This should imply that the security vector companies need to focus on is human, but it is something they still struggle to handle today. Whether intentional or inadvertent, people cause a large portion of data breaches, and likewise are responsible for some of the largest breaches we have experienced. As hackers advance their strategies, more data breaches will occur from users doing something they weren’t supposed to do. Identity is everything With the inevitable occurrence of a data breach, the network perimeter disappearing and the ever-present risk from the human vector, organisations must adapt and secure their best asset and simultaneously their greatest threat: their identities. User identities “hold the keys to the kingdom,” and for an organisation to be safe, securing those identities is everything. identity management must be at the core of an organisation’s security programme To do this, identity management must be at the core of an organisation’s security programme. Since identities are most likely to be targeted, securing them must be the top priority. By focusing on all the systems to which users connect, whether they are on-premises or in the cloud, security can be holistic. Only when IT departments have all the information can they make the right decisions. The good news is that managing this complex network of users, systems and access is possible for IT departments with the right technologies. Those billion points of exposure are dynamic, constantly changing and extend beyond the physical walls of the enterprise to customers, partners, vendors and contractors. Therefore, organisations require a governance-based identity management solution – one that can holistically and automatically manage identities at a granular level. [easy-tweet tweet="The points of exposure are dynamic, constantly changing and extend beyond the physical walls of the enterprise"] By employing a governance-based approach to identity and access management (IAM), knowing “who has access to what” becomes possible for all users and all the applications to which they have access, as they move through the company over their lifecycle.
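As a purely illustrative aside, and not a sketch of any particular IAM product, the "who has access to what" question at the heart of a governance-based approach can be pictured as a simple entitlement map that is queried for access reviews and updated on joiner, mover and leaver events. The users, systems and access levels below are invented.

```python
# Minimal, hypothetical illustration of "who has access to what":
# an entitlement map keyed by user, reviewed on joiner/mover/leaver events.

entitlements = {
    "a.jones": {"crm": "read",  "finance-db": "none",  "hr-portal": "none"},
    "b.patel": {"crm": "admin", "finance-db": "read",  "hr-portal": "none"},
    "c.osei":  {"crm": "none",  "finance-db": "write", "hr-portal": "admin"},
}

def access_report(system: str) -> dict:
    """Answer 'who has access to what' for a single system."""
    return {user: level for user, rights in entitlements.items()
            if (level := rights.get(system, "none")) != "none"}

def revoke_all(user: str) -> None:
    """Leaver event: strip every entitlement for a departing user."""
    entitlements[user] = {system: "none" for system in entitlements[user]}

print(access_report("finance-db"))   # {'b.patel': 'read', 'c.osei': 'write'}
revoke_all("c.osei")
print(access_report("finance-db"))   # {'b.patel': 'read'}
```

A real governance platform would of course pull these entitlements automatically from the connected systems rather than holding them in a hand-written map, but the review-and-revoke cycle it automates is essentially the one sketched here.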
By harnessing a user-centric approach and integrating all the systems together (such as data governance, network security, user behaviour analysis) into the IAM platform, IT would now have visibility into the entire security ecosystem. Only then can the organisation truly put identity at the core of its security and protect their most crucial information. ### Pressure is on CIOs to Innovate and Digitise It is now well accepted that innovation enabled by digitisation and process change is a major lever in delivering business growth.  Indeed, the vast majority (94%) of IT directors in a recent survey by Robert Half believe that technology will play an important role in driving their organisations’ expansion in the year ahead.  [easy-tweet tweet="94% of IT directors believe #technology will be driving their expansion in the next year" user="comparethecloud"] Businesses are actively investing in cloud-based platforms as part of their infrastructure strategy to provide superb service to their clients, especially among small- and medium-sized businesses. As a result we have been seeing an increase in demand for IT professionals who can manage the transition of public and private cloud services, including Software as a Service offerings. The areas in which IT directors believe technology will play the biggest part in driving business growth are improving the customer transactional experience (45%), and driving process or employee efficiency (32%). The most important technology cited by IT directors to achieve these objectives is cloud adoption The most important technology cited by IT directors to achieve these objectives is cloud adoption (34%), followed by virtualisation (29%), business analytics and intelligence (31%), and mobile solutions and application development (both 24%). While cost and convenience are driving UK organisations to adopt the cloud for new and legacy IT systems, there are still some concerns about the security implications of moving from an on-site to a private, public or hybrid IT infrastructure. Security concerns Indeed, the biggest challenges associated with cloud are increased security risks (cited by 52% of CIOs), increased legal and data privacy concerns (37%), implications for cross-border data transfers (33%) and potential increased costs due to user ineffectiveness and/or excessive time wastage (28%).  [easy-tweet tweet="The majority of #CIOs believe the benefits of #cloud outweigh the associated #security risks"] According to Ryan Rubin, Managing Director (UK Security & Privacy Practice), Protiviti: “Despite the security risks associated with cloud, the majority of CIOs still believe that the cost and convenience of cloud computing outweigh the risks of migrating systems onto a cloud platform for many business cases.  “Some of the risks associated with cloud services, such as ensuring that data is not moved across borders, dealing with outsourced cloud providers, maintaining confidentiality and availability of data stored in the cloud, should be actively managed in organisations’ risk posture and security policies and practices.” the pressure is on CIOs to find the best people with appropriate technical experience So from a skills perspective, the pressure is on CIOs to find the best people with appropriate technical experience as well as knowledge about how to keep data safe in a new cloud-based environment. 
Demand outweighs supply Demand for specialist professionals in these areas is already outweighing supply, so businesses need a competitive edge to attract IT talent. Although competitive remuneration is still an important component, businesses also need a strong employer brand, as skilled professionals are often deciding between multiple options. Being perceived as an innovative company has become essential for UK companies to attract, recruit and retain strong IT candidates. It’s vital to demonstrate to technology candidates that you can offer a challenging, innovative environment. Robert Half’s research found that while 50% of CIOs have responded to the need for adequate resourcing for the IT departments which have embraced cloud by up-skilling existing staff, one in five (21%) have hired skilled interim professionals to take on the load. [easy-tweet tweet="Being an #innovative company is essential for UK companies to attract, recruit and retain strong #IT candidates"] The role of contractors A common approach that we are seeing more of is to appoint an appropriately skilled and experienced contractor to lead a project and to encourage knowledge transfer to the permanent team, so that gaps are filled as quickly as possible. All of the signs are that IT security professionals are in high demand. CIOs need to consider how to ensure they have the right resources in place to manage the new cloud environment, whether that involves re-skilling existing teams or bringing in skilled contract IT professionals. ### Blockchain, Finance and the Internet of Things! I am forever looking at new tech, especially within IoT and finance, and there’s a big storm brewing. This technology storm is going to be immensely disruptive in the tech and finance industries – let me explain why. [easy-tweet tweet="CTC's @NeilCattermull discusses the #technology storm which is going to disrupt #fintech"] So what is this disruptive and potentially revolutionary technology? Let’s start at the beginning with some explanations of the existing terms in the industry, together with a short video from IBM that explains how the process works.

Bitcoin – A digital currency created back in 2009 that is now traded publicly and started the blockchain infrastructure we see today

Coloured Coins – The easiest analogy is to compare these with types of asset classes

Smart Contracts (digital) – Computer protocols that facilitate, verify, enforce or negotiate a contract

Blockchain – There are two types of records utilised in the chain of blocks:

Transactions – The content stored in the chain of blocks (the blockchain)

Blocks – The record and timestamp of each transaction, with logs, for the blockchain

Sidechain – Tech focused on improving the blockchain (basically blockchains that integrate with others) – this is a major area of improvement and opportunity for the whole fintech industry, and I feel massive growth will come from the interoperability of blockchains, so get developing!

Mining – Blocks get created by “miners” who use very specific tech/software that is designed and written specifically for your individual needs

Now let’s cue the video and watch IBM explain it here! [embed]https://www.youtube.com/watch?time_continue=5&v=4RRD4Jy6aWw[/embed] sneak preview and announcement of IBM’s involvement within the blockchain environment Does blockchain make more sense now? I hope so. Now onto the main body of this article – my sneak preview and announcement of IBM’s involvement within the blockchain environment.
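Before getting to that, a quick illustrative aside for anyone who prefers code to definitions. The sketch below is entirely hypothetical – it is not IBM's, Bitcoin's or any real ledger's implementation – but it shows the core idea from the glossary: each block carries transactions, a timestamp and the hash of the previous block, and that last field is what chains blocks together and makes tampering detectable.

```python
import hashlib, json, time

# Bare-bones, purely illustrative blockchain: each block records transactions,
# a timestamp, and the hash of the previous block, which links the chain.

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def new_block(transactions: list, previous_hash: str) -> dict:
    return {
        "timestamp": time.time(),
        "transactions": transactions,
        "previous_hash": previous_hash,
    }

# Build a tiny chain of two blocks.
genesis = new_block([], previous_hash="0" * 64)
block_1 = new_block([{"from": "alice", "to": "bob", "amount": 5}],
                    previous_hash=block_hash(genesis))
chain = [genesis, block_1]

def chain_is_valid(chain: list) -> bool:
    """Each block must reference the hash of the block before it."""
    return all(chain[i]["previous_hash"] == block_hash(chain[i - 1])
               for i in range(1, len(chain)))

print(chain_is_valid(chain))   # True
# Tampering with an earlier block breaks every link that follows it.
genesis["transactions"].append({"from": "mallory", "to": "mallory", "amount": 999})
print(chain_is_valid(chain))   # False - the tampered block no longer matches
```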
From development to execution, it seems that IBM has a solution that fits the demands of most industries, and now, with the resilient tech of z Systems for blockchain, they have proved yet again that they are ahead of the game with emerging technologies. Oh, did I mention that IBM are members of the Linux Foundation too? “For security in a blockchain environment not all platforms are created equal. For regulated and other entities IBM can provide full encapsulation of the transaction, code and the blockchain. The z Systems brand provides the highest security platform in the world today and this is essential when applying technology to regulated environments,” says Ross Mauri, General Manager, IBM z Systems & LinuxONE. [easy-tweet tweet="IBM is making moves in the right areas with the z Series tech at the heart of it" user="neilcattermull"] Any way you look at it, IBM is making moves in the right areas, with the z Series tech at the heart of it. Given the need to manage the IoT and the security it demands, the processing power and sheer throughput that this range of tech achieves is amazing. Let’s face it, mainframe technology has always been known for its security, resiliency and transactional processing power – just ask the settlements or client services division of any bank. Now, with the evolution of the Internet of Things (a vast number of connected devices that all need to talk to each other securely, resiliently and fast – see the trend), it makes sense to look at this tech for your digital transformation. It isn’t just for regulated markets either; there are other industries that will eventually adopt the digital transformation technology – blockchain – including government, retail, manufacturing, healthcare, music and gaming, to name a few. New IBM Partnerships: Apple & SWIFT With all of these industries having a need to change, and to be managed by new technologies such as blockchain, it is no wonder that IBM have partnered with some of the best tech companies of our time, including the most recent IBM announcement with SWIFT, amongst many others. Partnering with the brightest and best around simply guarantees a form of future-proofing, and caters for the increasing public demand to support new vendor releases while staying compatible with the latest technical trends. A good example of this is IBM’s recent partnership with Apple, which now handles over two thirds of mobile payments in the world with Apple Pay. Imagine this service using the z Systems product range as its main infrastructure hub – it simply makes sense. Maybe this will happen, but at the moment the partnership has just made it a lot easier for the pair to develop and execute applications together. Of course, as I stated previously, IoT is the next ‘cloud buzz’ and security is paramount to the stable adoption of services. The devices utilised need to be secured and managed appropriately, and in my view we are not quite there yet. But looking towards the future, I feel IBM is definitely in the lead in this space, with the most secure and resilient tech around. ### Why Have Virtual Desktops Become Increasingly Popular? Virtual desktops have seen a sudden and dramatic rise in recent years, with the technology becoming extremely influential in businesses all over the world. Virtual desktops now use a central database, where everything is stored, regardless of how old or new each individual machine is. So why has the technology seen such a meteoric rise?
[easy-tweet tweet="#VirtualDesktops have seen a sudden and dramatic rise in recent years" user="comparethecloud" hashtags="VDI"] Security One of the most important aspects of a business is security and this is whether it physical or virtual. Security is an important aspect of virtual desktops, as it offers the ability to track external devices and also prevent external devices copying data from any files from local based machines. Virtual security gives an advantage for businesses, as data is never actually stored on each individual device. Once mobile desktops store and hold important information offer a risk far larger than the security of a virtual desktop. Virtual security gives an advantage for businesses, as data is never actually stored on each individual device A virtual desktop not only consistently monitors all devices, it also stores everything in one central server, making it remote and not associating files to certain devices. Even if anything happens to your desktops your information is protected. Easy To Manage In many businesses there are those who are working behind the scenes in order to consistently keep every single machine working. Even though there are specialists in their own departments it becomes increasingly difficult to keep everything managed and maintained. Using virtual desktops means everything managed centrally, including any installed applications, any individual files and making it easier to find if there any problems. Energy Efficiency Is there anything more beneficial to businesses then saving money? Virtual desktops can help companies all over the world become a little more energy efficient. This is because virtual desktops, or a virtual desktop session will use a lot less electricity than an individual desktop computer. Companies can find themselves reducing their carbon footprints, as well as outgoings. [easy-tweet tweet="A #VirtualDesktop session will use a lot less electricity than an individual desktop computer" user="comparethecloud"] Flexibility Virtual desktops are beneficially versatile, as you are able to access files from anywhere with internet access. This can increase productivity for staff members, as it no longer requires them constantly work from their offices. Not only can it be beneficial in terms of flexibility for employees, it can be increase the budget for investments into things such as new desktops and other aspects of the business. Building your businesses image plan based on VDI provides a different user authorisation, deploying applications and issuing a privacy control for all desktops. These can all be managed by the IT department, making it incredibly easy to have overall control of your businesses desktops. Consistent Brand Image It can take longer to set up a desktop in the same image as every other computer in the initial stages, however virtual desktops make that process a lot more efficient and easier. It uses the same operating system, installed applications and reduces costs for administration and support. Financial Benefit virtual desktops support mobile workforce In terms of financial benefit virtual desktops can offer a significant difference. As money is spent bringing in the main server hardware will only benefit you in the long run. Virtual desktops can benefit you if there are any problems and hardware issues, as only the main server will need a troubleshoot, rather than individual machines. There are no longer a need to buy desktops, as virtual desktops support mobile workforce and those who work remotely. 
### Zoopla Property Group moves to Cloud BI with Birst As a recent first time home buyer I can honestly say that Zoopla is a game changer in the way we look at property. The information it makes available to users is invaluable, it has the potential to ease some of the stresses that purchasing property puts on a buyer. Having direct access to statistics and charts for the areas that you’re looking to purchase in takes the hassle out of trying to value an area, and predicting where to buy investment properties that will make good returns.  [easy-tweet tweet="#Zoopla has chosen @Birst’s #CloudBI platform for its #analytics and business intelligence programmes" user="comparethecloud"] Last week it was announced that Zoopla Property Group (ZPG) has chosen Birst’s Cloud BI platform for its analytics and business intelligence programmes. Birst Cloud BI will be used to integrate data from across ZPG’s leading online property brands such as Zoopla, PrimeLocation and HomesOverseas. ZPG owns and operates a range of online property brands used by customers for managing the process of finding their next home location, whether this is in the UK or overseas. As the company expands the range of services it offers to house buyers and home-owners, the role of analytics will continue to develop, as well. As part of the move to Cloud BI, ZPG will bring together information from multiple applications, including its CRM system, marketing automation, web analytics and social platforms. By bringing the data from their multiple brands together, ZPG aiming to use Birst BI to improve its marketing campaigns through more efficient targeting, audience segmentation and attribution. Jay Larson, CEO at Birst, said, “The role of data within businesses continues to develop, particularly in online industries, where decisions have to be made faster and more accurately than the competition. For companies such as ZPG, Cloud BI brings together information from multiple applications in a governed way, enabling IT to control data centrally while giving users across the business access to analytics. This approach helps people ask their own questions, share the results of analytic enquiries with their peers and contribute to improvements in performance for everyone.” "The role of data in this growth plan is significant, as it will support our employees in tailoring outreach, marketing and customer service campaigns for the right audiences at any given time" “Today, ZPG provides customers with online services that support their house hunting. We are expanding this range of services to become the primary market resource for everyone involved in buying, selling and managing property. The role of data in this growth plan is significant, as it will support our employees in tailoring outreach, marketing and customer service campaigns for the right audiences at any given time. Birst has been chosen to support these plans, based on our requirements for a flexible, agile BI platform that will keep pace with our aims,” said Stephen Morana, Chief Financial Officer at ZPG. The marketing department  is a good example of where the use of cloud can provide more value – but there needs to be a decision made on whether the improved value is just for the department, or for the business as a whole. Carl Tsukahara, CMO at Birst, said, “Many companies are embarking on digital transformation initiatives too, which can complicate this process as well. 
Joining together data from multiple departments can therefore really help Marketing, as this can be used to recreate the whole customer journey using data. Networking this data together and letting people ask their own questions can help show where there are real opportunities to improve results for the whole business.” [easy-tweet tweet="Joining together #data from multiple departments can benefit the #Marketing department"] “Customer data sits in multiple places within a business – these can be traditional apps, Cloud-based services, or information provided by third parties. Getting insight out of this data is a great opportunity for Cloud deployments. Business users don’t care about the issues of data quality or normalisation, they want to get their questions answered. For IT, issues like accuracy of data over time and collaboration can affect the success of these projects, so keeping governance and control over the data sources can be important. However, this can ultimately make the data discovery process and use of analytics easier too. By virtualising BI using Cloud, analytics can be made available to everyone.” ### Save the legwork of finding a reputable cloud provider – let the government do it for you The current Brexit debate brings the differences between the cultural outlook in Britain with that of its continental neighbours to the fore. One aspect is the respective attitude to regulation and accreditation, but for a largely unregulated sector like cloud what does this matter? Well, maybe it should. [easy-tweet tweet="How does the #Brexit debate relate to #Cloud? Neil Laver of @Veberhost explains" user="comparethecloud"] At a European level there are regulations such as the EU Data Protection Directive, which will apply to the whole of Europe including the UK if we want to provide cross-border services (whether we leave the EU or not). According to 451 Research, the most competitive prices for cloud services globally are found in the US. On average Europeans pay between 7 and 19% more, depending on the complexity of the application, but deals are available to those that shop around. the most competitive prices for cloud services globally are found in the US There was found to be a ‘protection premium’ for hosting services in-country or in-region, rather than using the cheaper option of US services. This reflected the extra investment needed by European cloud users as result of three pressures: the need to meet local regulations, the need to boost performance by bringing apps closer to users and the use of local customer service. If we drill down to a national level we can see that even within Europe prices and regulations differ. Recently Compare the Cloud interviewed the director general of ANSSI – the people behind the French National Digital Security Strategy. “ANSSI … runs a national accreditation and certification program. It publishes a list of rules for companies (including those in the traditional IT hosting and cloud sectors), and then conducts independent evaluation on their performance before providing the appropriate certification. 
Almost unique to France, Guillaume Poupard admits that this process isn’t necessarily cheap for the firms that take part, but he argues that it is definitely worthwhile – not only for the firms themselves that need a set of standards to apply and benefit from the ability to show that they are up to standard, but also to their clients who have the peace of mind knowing that they are working with a firm that has met such high standards.” Meanwhile the UK market is not only more mature, but also less regulated providing a far more cost-competitive environment. This may mean better value, but how can clients have the same peace of mind as their French cousins that they are working with a firm that are reputable and meet high standards of service? how can clients have the same peace of mind as their French cousins that they are working with a firm that are reputable and meet high standards of service? Well, there are some firms that are accredited by the UK government – those that wish to tender for government business and are part of the UK G-Cloud scheme. The UK government initiated the G-Cloud program in 2012 to deliver computing based capability using cloud and it has been hugely successful, providing benefits to both customers and suppliers alike with over £900m worth of sales having now taken place via the G-Cloud platform since its launch. With the G-Cloud framework there is a Digital Marketplace that is provided by the Crown Commercial Service (CCS), an organisation working to save money for the public sector and the taxpayer. The CCS acts on behalf of the Crown to drive savings for the taxpayer, ensure quality of service and improve procurement efficiency. Central government departments, local government, health, education, not-for-profit and devolved administrations can all use CCS’ procurement services. Let the government check out suppliers for you! G-Cloud has substantial benefits for both sides. For vendors the benefit is clear – to be awarded as an official supplier for G-Cloud demonstrates that the company has met the standards set out in the G-Cloud framework. Furthermore, it also opens up access to new opportunities to supply the public sector in the UK. Likewise it brings recognition to the brand and further emphasises the firm’s position as a reputable provider of digital services. [easy-tweet tweet="#GCloud gives quick and easy access to a roster of certified suppliers that have been rigorously assessed"] For the public sector organisations G-Cloud gives quick and easy access to a roster of approved and certified suppliers that have been rigorously assessed, cutting down on the time needed to research and find such vendors in the marketplace. This provides companies with a head start in finding the cloud services that will best address their business and technical needs. Even if you aren’t a public sector organisation, if you are looking to source a cloud provider and want to find a reputable one, then why not start your search with those that have been awarded a place on the G-Cloud framework agreement. G-Cloud provides companies with a head start in finding the cloud services that will best address their business and technical needs Of course you still need to ensure that their platform, service level agreements, native management tools and support teams can deliver the solutions that best address your business goals as well as your security and compliance requirements. 
Nevertheless, you have a ready-prepared list of firms that are reputable and meet high standards of service. One firm that meets all the standards set out in the G-Cloud framework and also has a reputation for local service excellence is Veber. Why not check us out! ### How network monitoring can boost your company’s operational efficiency It doesn’t matter how operationally sound you already are, finding new ways to cut costs and increase revenue is music to a business leader’s ears. For any company that relies on its IT infrastructure to carry out its core functions, network monitoring can provide a viable way of boosting the ROI of its IT investments. In addition, with more and more businesses utilising virtualised or hybrid cloud resources, network monitoring has begun to take on even greater significance. Far from simply giving IT teams more visibility and control, network monitoring can make a genuine difference to an organisation’s financial wellbeing. [easy-tweet tweet="#Network monitoring can make a genuine difference to an organisation’s financial wellbeing" user="comparethecloud"] To understand how network monitoring can benefit your company’s finances, it is first important to understand exactly what is meant by the term. Network monitoring refers to a system that monitors and analyses all of the individual components within a network, notifying administrators when a device fails or performance levels are dropping. Often this involves the use of software tools that check the activity and health of a business network 24 hours a day, seven days a week. As with any other IT solution, network monitoring systems are constantly evolving, which means that businesses must also constantly reassess whether they have the right tools in place. By taking a proactive and thorough approach to assessing their network monitoring systems, businesses can quickly evaluate where their existing tools are underperforming and whether an alternative solution will provide better return on investment (ROI). Understanding the wealth of network monitoring options available to your business can also see your IT team gain greater control and visibility over the network, as well as providing financial benefits. Understanding the wealth of network monitoring options available to your business can also see your IT team gain greater control and visibility Evaluating ROI for your network monitoring tools When assessing the financial performance of your network monitoring tools, one of the first elements to consider is how they help you to control costs. An effective monitoring solution must be able to operate across departmental silos in order to deliver a holistic understanding of whether your network is hindering productivity and affecting your bottom line. The ability to identify complex network challenges and notify managers if any components are not functioning correctly is crucial, and many businesses are finding that the addition of integrated enterprise solution tools is delivering greater insights. Downtime can be greatly reduced, limiting the cost of “dead time,” as well as longer-term strategic costs, such as reputational damage. Having early warning systems in place as part of your network monitoring solution can also help to reduce costs by enabling businesses to plan for system expansions or repairs before downtime occurs.
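To make the idea concrete, here is a minimal sketch of the kind of periodic health check a monitoring tool runs under the hood. The hosts, ports, thresholds and alerting method below are all hypothetical; a real product adds dashboards, history, baselining and escalation on top of this loop.

```python
import socket, time

# Hypothetical watch list: (hostname, port) pairs the business depends on.
WATCHED = [("web01.example.internal", 443),
           ("db01.example.internal", 5432),
           ("mail01.example.internal", 25)]

CHECK_INTERVAL_SECONDS = 60      # how often to poll each service
TIMEOUT_SECONDS = 2.0            # anything slower than this counts as down

def check(host: str, port: int) -> str:
    """Return 'up', 'slow' or 'down' for a single service endpoint."""
    start = time.monotonic()
    try:
        with socket.create_connection((host, port), timeout=TIMEOUT_SECONDS):
            elapsed = time.monotonic() - start
        return "slow" if elapsed > TIMEOUT_SECONDS / 2 else "up"
    except OSError:
        return "down"

def alert(host: str, port: int, status: str) -> None:
    # Placeholder: a real tool would email, page or raise a ticket here.
    print(f"ALERT: {host}:{port} is {status}")

while True:
    for host, port in WATCHED:
        status = check(host, port)
        if status != "up":
            alert(host, port, status)
    time.sleep(CHECK_INTERVAL_SECONDS)
```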
[easy-tweet tweet="Enterprises can gain a more secure handle on their resources and how they are allocated as a result of #network monitoring"] In addition to cost reduction, network monitoring can also boost revenue generation. Enterprises can gain a more secure handle on their resources and how they are allocated as a result of the visibility provided by network monitoring. For example, your sysadmins can identify potential problems before they occur so instead of fighting fires, they have a clear idea of how their IT budget and staff time is being allocated. This in turn gives business leaders a clear insight into how available resources can be better allocated to other business processes that provide the company with added value. Many businesses are already realising the potential benefits to be realised by employing effective network monitoring tools. Organisations can achieve reduced levels of downtime, greater performance and enhanced asset utilisation, as well as better visibility and control across the company’s entire network infrastructure. Taken together, all of these benefits also deliver fantastic tangible efficiency gains, by ensuring that enterprises and large public sector entities not only reduce their costs but also boost their revenue generation. Organisations can achieve reduced levels of downtime, greater performance and enhanced asset utilisation Xangati have network solutions available for both virtualised and hybrid cloud environments that provide IT administrators with the real-time data they need to take the guesswork out of network monitoring. With end-to-end visibility, Xangati ensures that businesses are no longer left in the dark about their network performance. ### Protecting partnerships in the manufacturing industry Manufacturing, like many other industries in fact, is reliant on collaboration and partnerships. In the automotive industry, for example, the production of each vehicle relies on a complex network of suppliers delivering each component on time and to the correct location. What’s more, the amount of administration that accompanies these partnerships is vast. From purchase orders to design specifications, there is a broad spectrum of files that need to be transferred and stored securely. [easy-tweet tweet="Faced with large volume #filetransfers, manufacturing firms are looking for ways to manage transfers securely"] This is true too of many other manufacturing sectors, including smartphones, heavy industry and engineering. Faced with such a large volume of file transfers, manufacturing firms are increasingly looking for ways to manage transfers securely, on-time and with the lowest amount of complexity. At Bluebird IT, we can deliver a suite of IT solutions that helps to secure all your manufacturing partnerships, regardless of how large your operation. We recently worked with the largest manufacturer of their product type in the world to provide an electronic data interchange (EDI) solution that met their complexity and security requirements. EDI software is vital for exchanging documents between businesses in a fast and simple manner. In fact, EDI tools can facilitate this exchange without any human intervention at all, ensuring that invoices, POs and any other documents are received, transformed and deposited appropriately. EDI software is vital for exchanging documents between businesses in a fast and simple manner EDI is just one facet of our B2B Integration solution, however. 
Other features include the fast and efficient onboarding of new business partners, the transformation of data into formats that integrate with backend systems, and the delivery of a central platform and data hub for all business processes with external business partners and suppliers. Taken as a whole, our B2B Integration solution can result in the automation of a number of business processes, cutting the amount of time employees spend on manual tasks and boosting productivity. Automating business integration processes can also help eliminate error from the data transfer process, leading to further time savings. Security is another key aspect of any B2B integration solution. Encryption is increasingly being adopted by firms across a multitude of industries, particularly those that have to continuously share information with their partners. Effective B2B integration should encompass multiple encryption standards and safeguard files both in transit and at rest. Authentication services are also vital if organisations are to ensure that sensitive information is only available to the relevant individuals. Bluebird’s B2B offering includes a number of authentication services, including single sign-on, LDAP and local user authentication, so manufacturers can have peace of mind before sharing any information. [easy-tweet tweet="Effective #B2B integration should encompass multiple #encryption standards" user="bluebirdITS"] Technology has made the world a smaller place, and manufacturing firms are increasingly finding that they need to be connected with suppliers, partners and customers all over the world. They also need this communication to be reliable, secure and simple. With the services provided by Bluebird, businesses no longer have to worry about the complexity and affordability of their file management and transfer solutions. If you want your employees to spend more time building great products and less time jumping over IT-related hurdles, then be sure to choose a vendor with experience delivering outstanding business integration software. ### The UK Cloud Armageddon The UK Cloud Computing and MSP market today reminds me of Florence and Rome during the Renaissance, full of Machiavellian whispering and Borgia scheming, although I would say there was less politics in yesteryear. [easy-tweet tweet="Commoditisation coupled with tired business models is driving #Cloud politics" user="comparethecloud"] What is driving this? The simple answer: commoditisation coupled with tired business models, with external forces driving down costs. Let's analyse each of these in turn: Commoditisation: Traditionally, IT companies made large margins from customer sales. The large majority of this margin was made from physical hardware such as servers and storage, and – as with anything that has moving parts – support contracts on this IT hardware made considerable margins. Now along come cloud computing and Software as a Service, which do not require large capital expenditure on physical goods, and the emergence of mega-scale cloud providers, with a nod towards Amazon AWS for driving this model. And what does this bring us? Shrinking margins, and ‘Borgia-esque’ venture capitalists riding their spreadsheets, demanding more revenue to satisfy investments or looking for exits. sell, merge or acquire to grow to a scale where providing IT and cloud services becomes profitable The answer: sell, merge or acquire to grow to a scale where providing IT and cloud services becomes profitable.
And this is what is driving today's UK marketplace: a fear of commoditisation and the need to scale to reach profitability. When you are an IT organisation that has taken investment, either from the stock exchange or the venture capital community, there is always a Faustian bargain struck – one of future growth and profitability. In the era of IT commoditisation we are now seeing a bloodbath play out in front of us. Akin to the mobile telephone dealer and photocopier eras before us, revenue streams, EBITDA and other metrics of profitability and growth need to be paraded before the paymasters. [easy-tweet tweet="In the commoditisation of #IT era we are now seeing a bloodbath play out in front of us" user="comparethecloud"] Tired business models: a model based on capital-expenditure-heavy hardware investment without innovation and R&D investment is dead. Any business that is not looking at API-based interconnections through to Big Data and analytics is crying out for modern leadership. Look at any successful organisation today and you will see a number of shared factors, namely:

Investment in areas such as Big Data and analytics

Embracing of social tools and methodologies

Being customer-centric, not chasing revenue at all costs

Innovation born from diligence and customer needs

Leveraging metrics such as CLV/LTV to measure growth over time

Investment plans based on medium- to long-term initiatives

A focus on sectors rather than broad categories, with a true understanding of what drives those customers

Building of innovative marketing functions such as buyer personas

Allowing thought leadership and content to be consumed over multiple touch points

Developing products and services based on tomorrow, not today, and working consultatively with customers

[easy-tweet tweet="Any business not looking at #API #BigData and #analytics is crying out for modern leadership" user="comparethecloud"] External influences: since Marc Benioff of Salesforce fame declared ‘software is dead’ many years ago, SaaS, cloud computing and other forces have eroded traditional IT margins. The full force of what I call the on-demand consumption model has only just started to cause pain to traditional partners. What Salesforce has done in a short while to established players within the CRM sphere is now being played out across the infrastructure sector. The only area where cloud is, in my view, going to drive higher consumption is that often forgotten cloud staple, connectivity: higher demand for direct connectivity and data centre bandwidth will really drive the growth in this sector. As for the rest, commodity requires differentiation to stand out from the crowd. With regards to business models, we are in one of the greatest times of technical and business change, therefore an agile business strategy (not the DevOps one) combined with an open mind and a propensity to innovate will be key. commodity requires differentiation to stand out from the crowd And if in doubt, consult the original stories of the Borgias and the Medici – perhaps in those books you may find today's answers. ### MariaDB Fortifies Enterprise-Grade Features for OLTP in Spring 2016 Release MariaDB® Corporation, the recognised leader in open source databases for Online Transactional Processing (OLTP), today announced the Spring 2016 release of MariaDB Enterprise. MariaDB’s upcoming offering addresses the most pressing enterprise data management challenges.
New capabilities defend data against application and network-level attacks, support faster development of high-performance applications, and deliver higher service levels at lower cost. MariaDB also introduced the MariaDB Security Audit service to help customers identify and remedy data security weaknesses.

[easy-tweet tweet="#CloudNews: @MariaDB’s upcoming offering addresses the most pressing enterprise #data management challenges"]

The performance and security enhancements reflect growing use of MariaDB in large, mission-critical applications in on-premise and cloud environments. The MariaDB Enterprise Spring 2016 release raises high availability to a new level with connection pooling, automatic failover, and integration of Galera multi-master cluster technology. Customers with mission-critical applications will benefit from over a dozen transaction processing performance improvements that include InnoDB storage engine optimization, memory optimization to boost query response, and numerous enhancements from WebScaleSQL created by the user community.

The Spring release provides security at every level in the database. It protects against SQL injection and denial of service attacks, and transparently encrypts data both at rest in the database and in motion to and from applications. Customers can reduce the risk of data breaches by storing encryption keys in their choice of independent key management systems, including ones from Eperi, and can leverage stronger Kerberos authentication, password validation plugins, and role-based access control.

The innovations within MariaDB Enterprise were created in part through collaboration with the MariaDB open source community. The company's ability to leverage that vast community for continued innovation and rapid threat identification ensures the continued superiority of MariaDB capabilities at a faster release schedule than proprietary database vendors can offer.

"Our customers and user community are our inspiration, our guide, and our partners in creating database software that can harness emerging opportunities and anticipate security threats that all companies face. Proprietary solutions just can't keep up," said Michael Howard, CEO of MariaDB.

The MariaDB Security Audit service helps companies evaluate the security policies, technologies, and practices used with their MariaDB database to identify procedural and technical gaps that create vulnerabilities and increase legal, financial, and brand reputation risk. The service will also assist existing MariaDB users in leveraging the full extent of the solution's security capabilities.

With innovations created collaboratively by the MariaDB open source community and MariaDB engineering, MariaDB Enterprise Spring 2016 delivers the most highly available, high-performing and secure open source database on the market. The MariaDB Enterprise subscription includes the newest version of MariaDB server 10.1.12, LGPL connectors, MariaDB MaxScale 1.4, developer enablement and DBA productivity tools, database management plugins, and expert 24x7 support and services to address the needs of mission-critical applications.

Availability: The MariaDB Enterprise Spring 2016 release, with MariaDB MaxScale, will be available this spring. For more details about features or for pricing information, visit www.mariadb.com.
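To make the encryption-in-motion and role-based access control features described above a little more concrete, here is a minimal sketch of how an application might connect to a MariaDB 10.1 server over TLS and manage read-only permissions through a role rather than individual accounts. The hostname, credentials, certificate path, database and role names are placeholder assumptions for illustration only; they are not details from the MariaDB release.

```python
# Illustrative sketch only: connect to MariaDB over TLS and grant read-only
# access via a role. Host, user, password, paths and names are placeholders.
import pymysql

conn = pymysql.connect(
    host="db.example.com",
    user="dba_admin",
    password="change-me",
    database="orders",
    ssl={"ca": "/etc/ssl/certs/mariadb-ca.pem"},  # verify the server and encrypt data in motion
)

try:
    with conn.cursor() as cur:
        # Roles let you define privileges once and assign them to many accounts,
        # instead of granting table access user by user.
        cur.execute("CREATE ROLE reporting")
        cur.execute("GRANT SELECT ON orders.* TO reporting")
        cur.execute("GRANT reporting TO 'analyst'@'%'")
    conn.commit()
finally:
    conn.close()
```

In a sketch like this the role becomes the single place where read-only access is defined, which is the general idea behind role-based access control; how a given deployment actually manages keys, certificates and accounts will depend on its own security policies.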
### India m2m2iot Forum 2016, April 22 - 23

April 22 - 23, 2016, New Delhi, India. TSDSI, the Telecommunications Standards Development Society, India, and TCOE, Telecom Centers of Excellence, India, come together with m2m2iotpaper.com to organize the India m2m2iot Forum 2016, in New Delhi, India, on April 22 - 23, 2016.

India m2m2iot Forum is the platform that continues to actively facilitate new ideas, businesses and partnerships in M2M + IoT developments. It will provide a unique platform for M2M + IoT evangelists, innovators and disruptors, and manufacturers from the small and medium enterprise segments, to interact with governments - policy makers and regulators - and operators, to come together and create the right enabling environment for innovation, ensure fair competition, facilitate cross-sector partnerships and meet the common challenges of legal regulations, privacy and security, and, most importantly, STANDARDS-driven implementation and management of M2M + IoT based services in India. India m2m2iot Forum strives to bring together the major stakeholders from the entire ecosystem of M2M + IoT to discuss, deliberate and also conclude on these industry opportunities and challenges, learn and share knowledge, and build the actual framework for the success of this small and medium enterprise segment. This joint forum is intended as a bridge between the 'movers and shakers' of the M2M + IoT universe in India and the standards development community of India and the world.

Website: http://www.m2m2iotforum.com/

### SanDisk Interview at the Lenovo Technology Centre Launch

Earlier this year Compare the Cloud attended the launch of Lenovo's Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo's partners. Compare the Cloud interviews Justin Wheeler of SanDisk about how they are working with Lenovo, and how the Technology Centre is going to benefit their business.

[easy-tweet tweet="Catch up with @SanDisk's Justin Wheeler on @comparethecloud's coverage of the @LenovoPartner Technology Centre"]

### "The Cloud". There's more to it.

Like a lot of people, I like to read the Sunday papers. Generally, it's to review sporting events and current affairs; however, on Sunday 14th February - Valentine's day, for the romantic or the married - I was drawn to an exposé about cloud computing. The thing that struck me was the use of the term "the cloud" and the generic, undefined meaning associated with it. It is my belief that the IT industry often places more value on a term than on what it actually represents.

[easy-tweet tweet="#Cloud Computing is a concept that has three core characteristics, so why do we have such trouble defining it?"]

Let me try and explain. Cloud Computing is a concept that has three core characteristics. When all three are present when delivering IT infrastructure or an application, essentially you have a cloud service.

- Customer self-service. Automation and orchestration of tasks and processes are completed in the background where the customer doesn't see them, and the customer is only concerned when it doesn't work.
- Scalability. No major step changes are chargeable or visible to the customer when increasing usage. They can scale the service as and when they need to, and associated usage should be reflected in the billing if provided by a third party.
- Wide Area Network (WAN). A cloud service should be delivered to the customer out of a purpose-built data centre across a WAN.
This could be the internet or a corporate network.

in my mind cloud computing has been with us for some time

So in my mind cloud computing has been with us for some time. Think about the early internet e-mail services that were around: Hotmail, Yahoo and so on. They all had, and still have, the above characteristics. So when we refer to "the cloud", are we just referring to these three basic characteristics? To my mind we are not, and this is where the confusion exists. The technology industry is specialising in delivering its products as a service, rather than designing and shipping the various components for the customer to build. This change is in part due to the cloud concept, but more credit has to go to the advancements in technology, such as virtualisation and low latency networks. These newer, faster, more functional IT services are delivered incorporating the characteristics of cloud computing. Therefore, when we use the term "the cloud", are we intimating that the service that was delivered in house by your IT team will now be delivered by a specialist technology company? Otherwise known as outsourcing. However, businesses do create their own private clouds by purchasing the component pieces outright, building, and delivering the service back to their internal customers. Is this investment ownership model discounted from the term?

[easy-tweet tweet="Attributing application specific benefits to the generic term of #cloud is disingenuous at best" user="comparethecloud"]

Another interesting aspect about the use of "the cloud" is the benefits which are attributed to it. Some of the benefits are plainly due to the functionality of the application, which was obviously created for the role in which it was deployed. The application is delivered as a service; however, it's not the cloud delivery model that gives the benefit. Attributing application-specific benefits to such a generic term is disingenuous at best. If you are not in the IT industry, the term must seem vague. And if you use it in the same context as the internet then you must think "the cloud" is a single entity. Which it is not, yet we in the IT industry still use "the cloud" to refer to multiple, separate IT services that incorporate three different service models (IaaS, PaaS and SaaS) and three different delivery models (Public, Private and Hybrid), and that are available in a plethora of different commercial models. The only aspects they have in common are the three characteristics. The truth about adopting cloud computing is that every business needs to research the market based on their requirements, and go through the appropriate levels of due diligence and planning. They need to consider all options to make the right choices for their businesses. Delivering an IT service is simply not a one-size-fits-all discussion. There are a range of specialist products and service providers that now offer real choice; the new challenge is making sense of it all.

The truth about adopting cloud computing is every business needs to research the market based on their requirements

A simple life beckons for those running an end user IT organisation. However, overplaying "the cloud" doesn't reflect the progress in the industry and the work required to take advantage of it.

### Planning a shift towards DevOps? Here are 5 key traits your DevOps team needs

DevOps. It seems to have become everyone's favourite buzzword. There's a reason for that though.
According to Gartner, it's growing so fast that a quarter of all Global 2000 organisations will deploy it by the end of this year.

[easy-tweet tweet="#DevOps is more about culture than tools and technology" user="comparethecloud" hashtags="cloud, IT"]

So what exactly is DevOps? As Tony Bradley explains, "DevOps is more about culture than tools and technology. A group of traditional developers and IT engineers who understand and embrace DevOps culture can be successful, whereas a team of DevOps experts experienced with the associated DevOps tools that don't accept and adapt to the cooperative, collaborative nature of the DevOps culture is likely doomed to failure."

At the heart of DevOps are four crucial elements: speed, quality, control and cost. Speed is fundamental to competitive execution and market positioning. Quality is vital to successful implementation and long-term viability. Control - the command of data use, security, access and process - is crucial to safe operations. And cost, as we all know, is a key element of nearly all business decisions.

At the heart of DevOps are four crucial elements: speed, quality, control and cost

While it is a common assumption that implementing DevOps is a primarily technical process, we see that the cultural aspects and adjustments are equally as important. Every DevOps team needs to possess certain traits in order to successfully tackle this cultural shift.

Communications = key

Until recently, IT professionals had strictly defined roles and responsibilities that allowed them to work independently rather than collaboratively. As a result, communication skills weren't a priority when putting together an IT team. However, as rapid deployment and streamlined processes have emerged, communication has become key to making smooth transitions from one phase of the project to the next. Enforcing good communication can lead to better results in a shorter amount of time and ultimately helps organisations save money.

Is there an 'i' in team?

[easy-tweet tweet="Flexibility is key to effectively implementing a #DevOps methodology in an organisation" user="comparethecloud"]

Flexibility is key to effectively implementing a DevOps methodology in an organisation. For those jumping on the DevOps bandwagon, the phrase "it's not my job" shouldn't be an option. While it's common for organisations to experience a clash between development and operations teams when first implementing a DevOps strategy, successful interdepartmental integration requires collaboration in order for the team to reach their goal of satisfying the needs of the business. Think of implementing DevOps as working with a team of teams. While each team brings different skills to the table, it is important for all teams to support each other to deliver the most powerful results as effectively and quickly as possible.

Bring on the change

We've all heard the saying that the only constant in life is change, whether it involves something as small as adjusting our daily commute or as big as a new career. And like everything else, the implementation of DevOps brings about a large cultural shift for an organisation.

DevOps brings about a large cultural shift for an organisation

Gartner analyst George Spafford recommends implementing a cultural change programme to make team members aware of the mutual goal. To begin, he encourages developing a small pilot plan to test the waters initially by deploying tests and taking careful note of what works and what doesn't.
It's important to know your team and what works best to motivate the group, to keep them positive and interested. Laying out such a road map and embracing the cultural change will result in a more focused team that will optimise the outcome.

Don't fear failure

If you've been doing your DevOps research, you'll know that there are just about as many articles on DevOps failures as there are successes. To be on a DevOps team you need to accept that failures can happen, but you can't fear them. According to a Gartner study, 75% of enterprise IT departments will have tried to create a bimodal capacity by 2018. However, less than 50% of them will reap the benefits that new methodologies like DevOps promise. Accepting failure and being patient are crucial for a team to get the most out of their DevOps efforts.

Maintain the enthusiasm

[easy-tweet tweet="#DevOps is here and it’s the next big thing. You’re probably getting tired of hearing that by now!"]

DevOps is here and it's the next big thing. You're probably getting tired of hearing that by now. A successful DevOps team needs people who want to make a difference and have the excitement to drive a significant business transformation. This involves the willingness to listen to customer feedback and adjust accordingly. Since customers are the main driver of continual software updates and releases, it is crucial to be interested in what they have to say and be more than willing to be accommodating. There will be many highs and lows, and despite processes breaking and things not going according to plan, people involved in DevOps need to maintain continuous enthusiasm for the journey ahead of them. With these five traits, your team will be able to successfully implement a DevOps strategy and navigate the minefield of cultural change that comes along with it.

### .cloud domain launches globally and is set to be one of the most popular domains for businesses and individuals

The .cloud top-level domain extension is now finally available to the general public on a first-come-first-served basis. Businesses and individuals will now have the opportunity to adopt the new domain extension - an alternative to the likes of '.com', '.uk' or '.io' - to better communicate their online brands.

[easy-tweet tweet="#CloudNews: .cloud domain launches globally as an alternative to .com" user="comparethecloud"]

The level of interest in this new extension has been strong already in the early phases of the launch. Over the past few weeks leading international cloud players such as Weebly, ePages and Odin released dedicated portals for their cloud offering as part of the .cloud Pioneer Program that launched in October 2015. Innovative start-ups have also jumped at the opportunity to better position themselves in the industry. From emerging cloud service providers such as ClouDesire, to social enterprises such as FoodCloud and FashionCloud, an innovative exchange of digital content for the fashion industry, newer ventures are ensuring that they have a distinct brand online.

Iseult Ward, co-founder of FoodCloud and an early adopter of .cloud, said: ".cloud synced perfectly with our company name and brand. By using .cloud we can clearly explain to the public and our key audiences what we do and how our app and web-based business is powered by the cloud." FoodCloud was started in Dublin in 2013 and in two years has grown to work with businesses and charities across the 26 counties in the Republic of Ireland and four cities across the UK, moving over 10 tonnes of food a week.
The new domain is set to support growth plans as FoodCloud expands over the next year to offer its food redistribution solution to more and more communities, with the end goal of having a world where no good food goes to waste. The .cloud commercial launch got underway in mid-November, with the Sunrise phase ranked among the top ten launches of new domain extensions since 2013 (based on registration requests). The following Priority Registration phase saw incredible interest, with more than 2,000 orders and close to 200 domain names receiving multiple requests, which will now be assigned through auctions.

[quote_box_center] A global technology company already profiting from a .cloud web identity is leading online shop software provider ePages, which launched its epages.cloud partner portal as a .cloud pioneer. Richard Stevenson, Head of Corporate Communications for ePages, said: "ePages' mission is to deliver best-in-breed e-commerce cloud services, and the values of cloud are within our DNA. Our portal explores how merchants' success is driven every day by the power, resilience and efficiencies of cloud-based software-as-a-service - no other TLD could have captured this better, or delivered more engagement.

"Becoming a pioneer for .cloud was a logical and pleasing step in our corporate story, and of course an exciting one - the nTLD is among the most innovative and eagerly anticipated. Epages.cloud is a resource for any provider, reseller or developer wishing to benefit from the selling of e-commerce to the SMB and web professional markets. Our .cloud domain is the perfect web address to correctly position ePages to its partners, to communicate modernity and availability and affirm the integral role that cloud technology plays in delivering excellent online services. Our .cloud web identity has been a crucial element in securing the traffic and engagement we've won with the portal so far. Our industry audience, often senior execs and web professionals, are innovators themselves and expect us to be as smart, fast moving and ambitious as they are - our .cloud domain is the right emblem to reflect that."[/quote_box_center]

A network of over 70 registrars worldwide is now ready to accept orders for new .cloud registrations. These include leading providers like GoDaddy, 1&1, Name.com, Gandi, Domain.com and OVH. Francesco Cetraro, Head of Registry Operations at .cloud, added: "Today marks an important day for everyone who has been eagerly awaiting the opportunity to own and use a .cloud domain. We have been thrilled with the response to date: first with the wide level of interest in our Pioneer program, which turned out to be one of the best subscribed to date. The Priority phases were also very well subscribed, showing there is a clear interest among companies and individuals to get access to a modern domain extension that helps them better reflect their business capabilities and position their brands in a unique and powerful way."

Stefano Maruzzi, VP EMEA at GoDaddy, said: "Cloud computing has quickly emerged as a key driver of innovation and its importance as the backbone of the digital economy is only going to grow in the coming years. IDC estimates that total spend on cloud IT infrastructure worldwide reached 32.6 billion dollars at the end of 2015. In this context, the .cloud extension represents an attractive and relevant opportunity for creative people and businesses of all sizes.
As a leading provider of complete solutions to SMBs worldwide, GoDaddy is extremely pleased to be one of the first providers to share the opportunity offered by .cloud with its customers around the world." To find out more about getting a .cloud domain visit: https://www.get.cloud/home.aspx

### MSPs and the importance of secure file transfer

Managed service providers (MSPs) are proving increasingly popular across a wide range of industries. If the rise of cloud computing has supplied businesses with previously inaccessible IT solutions, then the rise of the MSP has enabled these same organisations to better manage, integrate and secure them. Spending on managed services is expected to reach more than $193 billion by 2019, almost double the figure recorded at the end of 2014, and this is hardly surprising considering the benefits that MSPs can deliver. Managed service providers can supply expertise, hardware and software to suit your needs, along with robust security protocols.

[easy-tweet tweet="The rise of the #MSP has enabled organisations to better manage, integrate and secure #cloud solutions" user="bluebirdITS"]

However, the ability of MSPs to meet their customers' requirements ultimately depends on the technology that they utilise. At Bluebird IT, we recently worked with a global MSP to help them deliver a secure data transfer service. When finalising the technology, we had to consider security and governance capabilities, as well as the ability to easily onboard new customers. For the MSP to be successful in the long term, the ability to scale up the technology to meet growing demand was absolutely vital. Working together, we were ultimately able to create batch file transfer infrastructure that met PCI DSS compliance standards, as well as other key regulations, making it secure enough to meet the demands of their customers in the tier 2 financial services sector. By choosing secure, governed and flexible technology from Bluebird, the MSP was able to bolster its reputation and set itself up for potential domination of its market.

File transfer is just one of many services provided by MSPs, but it is one of the most important

File transfer is just one of many services provided by MSPs, but it is one of the most important. No matter which industry you work in, there is likely to be the need for partnerships and collaboration, either internally or externally, which require the safe transfer and storage of data between all parties. Companies across all sectors will experience this challenge, whether exchanging purchase orders and invoices between partners or simply enabling greater workplace collaboration. Poorly implemented data transfers can lead to company networks becoming overloaded, a lack of visibility over where data has been transferred and, ultimately, a fall in productivity. In addition, failure to meet compliance standards when it comes to data protection can see businesses faced with compensation claims and even long-term reputational damage. As a result, MSPs must ensure that the file transfer service they deliver to their customers is able to meet the most exacting security and reliability standards.

[easy-tweet tweet="Poorly implemented #data transfers can lead to company #networks becoming overloaded" user="bluebirdITS"]

When MSPs are looking at their technology options, it is important that they opt for hardware or software that has a proven track record of success.
After all, an MSP that is unable to justify its technology choices is unlikely to be able to convince customers of a consistent level of service. In order for MSPs to earn the trust of their customers, they need premium technology delivered by a trusted partner with more than 25 years of experience. No matter which industry niche you want to dominate, Bluebird IT can help provide you with the proven technology you need in order to do so.

### Netskope introduces first complete threat protection and remediation solution for enterprise cloud apps

Netskope has announced the availability of Netskope Active Threat Protection (NATP), a first-of-its-kind threat protection solution for the cloud access security broker (CASB) industry. With a comprehensive vantage point over cloud app usage, NATP combines threat intelligence, static and dynamic analysis, and machine-learning based anomaly detection to enable real-time detection, prioritised analysis and remediation of threats that may originate from - or be further propagated by - cloud apps. These new Netskope capabilities integrate with industry-leading tools to create a defence-in-depth solution that reduces the time required for cloud threat detection and forensic analysis from hours to minutes.

[easy-tweet tweet="#CloudNews: @Netskope introduces complete threat protection and remediation solution for #enterprise #apps"]

With 4.1 percent of enterprises' sanctioned cloud apps laced with malware, and total cloud app usage - including unsanctioned or "shadow IT" apps - extending into the thousands per enterprise, organisations have been largely unprotected by traditional perimeter security providers. The increasing complexity of the threat landscape and frequency of attacks has also led to an unprecedented shortage of skills and cognitive overload for IT security professionals. NATP addresses the lack of cloud visibility with a 360-degree view into sanctioned and unsanctioned cloud app usage, even if the user is accessing the app remotely or from a mobile device. This vantage point over the cloud vector goes beyond other CASB solutions that fail to see all app usage and data movement. NATP goes even further by understanding the context of the usage, such as who is uploading, downloading and sharing data - information that may prove critical when thwarting an attack or limiting its blast radius. To help IT address the complexity of the threat landscape and skills shortage, NATP is designed to prioritise potential threat dangers during scanning without sacrificing the comprehensiveness of the scans performed. This is done at high speed and in real time before surfacing forensic analysis in a single Netskope dashboard or via a customer's security information and event management (SIEM) solution. To expedite or automate remediation efforts, NATP comes with a granular policy enforcement engine and can trigger workflows such as quarantining, or a customer can integrate with their existing remediation toolset.

Key features of Netskope Active Threat Protection include:

- 360-degree cloud vantage point: NATP offers a 360-degree view into sanctioned and unsanctioned apps, distilled into users, activity and context, all in one central dashboard.
- Prioritised threat protection: Industry-first prioritised threat protection provides deep contextual-based insights from threat intelligence, static and dynamic analysis and anomaly detection, to detect, analyse and quarantine the latest viruses, advanced persistent threats (APTs), spyware, adware, worms, ransomware and other malware.
- Remediation built for the cloud: NATP leverages the Netskope policy enforcement capabilities along with cloud-specific integrations with endpoint detection and response (EDR), sandbox and SIEM vendors so that the time required for forensics is reduced from hours to minutes.

[quote_box_center]"With the constantly evolving landscape of malware, ransomware and other threats to the enterprise, IT need not only 'rip the blindfold off' when it comes to shadow IT, but must be able to react immediately to ensure the safety and security of sensitive data," said Sanjay Beri, co-founder and CEO, Netskope. "With Netskope Active Threat Protection, customers can now take advantage of the Netskope deep cloud app visibility and granular policy enforcement capabilities in tandem with the benefits of a complete threat protection suite. We have collaborated with a number of leading enterprise security companies to offer this service to our customers and ensure that we are one step closer to safer enterprise cloud app usage."[/quote_box_center]

NATP also integrates with leading IT security vendors to provide best-of-breed capabilities and extend existing enterprise investments:

- Threat intelligence feed aggregation and sharing: NATP automatically aggregates and normalises threat intelligence feeds to increase threat detection. In addition, as a participant in the FireEye Cyber Security Coalition, Netskope integrates with the FireEye platform to share intelligence. Finally, Netskope Active Threat Protection communicates using STIX/TAXII or OpenIOC standards to exchange threat context and detection information, and Netskope customers can easily leverage existing threat intelligence aggregations that they have built over time.
- Zero-day threat intelligence: Zero-day intelligence feeds from FireEye ensure NATP detects and protects against the latest threats.
- Sandboxing: NATP provides certified integrations with FireEye and Cyphort. Additional sandboxing providers can be leveraged through pre-built integrations.
- Endpoint intelligence and incident response: NATP integrates out-of-the-box with the Carbon Black EDR solution. The integration is bi-directional; endpoint behavioural data is pulled into the Netskope platform, where it is analysed against user, activity and content data. Netskope cloud app policies can also be pushed to the EDR for seamless remediation.

### Zsah's App of the Month: Gotta Go! App

For March's App of the Month we have selected Gotta Go! App. Now, if you're a big Netflix-er then you may have already heard of this. It got huge PR by featuring in the show Chelsea Does. The app creates excuses to get you out of sticky situations. It may be a bit disingenuous, but let's be honest, we've all been there - you're stuck somewhere you don't want to be and a little white lie is your best option.

[easy-tweet tweet="Need a reason to run off? @zsahltd have found an #app that gives you all the excuses you need!"]

The problem is that the classic white lies just don't cut it anymore - people see through them. How many times have you heard, "I've got to get on a conference call"?
Or, "I need to go home and feed the cat"? With this app, you can spend some time crafting a bank of credible excuses, assign them to one of your 'contacts' and then, when you need it, that 'contact' will message you with your get-out-of-jail-free card. We recommend using this app when:

- You're on an awful first date
- You've been dragged out for after-work drinks with your colleagues
- The group bill arrives at the restaurant

[easy-tweet tweet="Check out the #gifs on @zsahltd's app of the month @gottagoapp" user="comparethecloud"]

We do not recommend, I repeat DO NOT RECOMMEND, you use this app when:

- Your loved one is giving birth
- You've just caused a car accident
- You're having cold feet at the wedding

GIFs via GIPHY

### Microsoft Interview at the Lenovo Technology Centre Launch

Earlier this year Compare the Cloud attended the launch of Lenovo's Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo's partners. Compare the Cloud interviews Anthony Saxby of Microsoft about how they are working with Lenovo, and how the Technology Centre is going to benefit their business.

[easy-tweet tweet="Catch up with @Microsoft's Anthony Saxby on @comparethecloud's coverage of the @LenovoPartner Technology Centre"]

### Linux, OpenSource and IBM LinuxONE

Recently, I had the opportunity to attend IBM's INTERCONNECT conference in Las Vegas. The conference covered IBM technologies for Cloud, IoT and Linux. During the conference I sat down to talk with Ross Mauri, GM IBM z Systems and LinuxONE, on what IBM is doing with Linux and the new IBM LinuxONE Community Cloud.

[easy-tweet tweet="CTC's @NeilCattermull spoke to @rossmauri about #Linux, #OpenSource and @IBMLinuxONE at #IBMInterconnect"]

We discussed Linux and the IBM LinuxONE purpose-built hardware designed for open source technologies, probing deeper and asking about the latest announcement of the IBM LinuxONE Community Cloud from IBM. So what is the IBM LinuxONE Community Cloud? Well, it's actually a really big announcement of a service from IBM that has never been done before. Imagine, for the first time in history, that you can build, create and test your traditional x86 applications for free!

"We have a great story on the Community Cloud. A partner tracked me down to discuss his experience and he told me that he just ported his app to LinuxONE using the Community Cloud service, which enabled us to port the application onto the community cloud easily. This gave my developers, based in the Ukraine, the ability to test and retest very quickly and have the application running within a week by using the power of open source, the power of Open Power and having the ability to use previously unreachable public infrastructure very quickly and for free," said Ross Mauri, GM IBM z Systems.

"Following on from this, we (the LinuxONE team) feel that we have enabled developers and ISVs alike to use the Community Cloud service as a 'go-to-market strategy' that is designed to swiftly enable application creation and delivery with the most secure and resilient systems in the world," continues Mauri.

[easy-tweet tweet="CTC tried and tested the @IBMLinuxONE #CommunityCloud creating wemadethis.tech" user="neilcattermull"]

Previously it would have been very difficult to access and utilise the amazing tech that IBM delivers on the LinuxONE platform, and this was always a barrier to entry for anyone looking to use this environment.
Think of the community cloud as an ecosystem for developers and independent software providers to create and test with, and also to test performance and reliability against x86 technologies. The LinuxONE platform significantly outperforms x86 architecture time and time again, together with the highest security at your fingertips.

Think of the community cloud as an ecosystem for developers and independent software providers to create and test with

As a final thought we will leave you with this. If you have ever wondered how an application would run on a true enterprise technical powerhouse but were afraid of the costs, don't be, as you can now have your cake and eat it too. The performance is amazing, the security is mandatory and the price is, well, it's free! You can sign up to trial the LinuxONE Community Cloud to test drive it yourself at https://developer.ibm.com/linuxone

Well done IBM for creating a community for developers to create easily, operational directors to sleep easier and project managers to budget more for less, with the performance increase against x86 architecture as well as the ability to create Docker instances on the fly for application delivery.

### The Miniaturisation of IT and Data Centres

Today we all hear about the consumerisation of IT, but the one subject most vendors and data centres do not wish to face is what I define as the miniaturisation of IT.

[easy-tweet tweet="The miniaturisation of #IT and #datacentres is real says Daniel Thomas of @Comparethecloud"]

I am a proud iPhone 6s+ owner (and of an Apple Watch, iPad Pro, MacBook Air and iPad Mini 4, all of these in gold), and when I hold my iPhone up I wonder how… The 'how' I wonder is how this one small device has more processing power than all the Allied and Axis powers combined during World War II. It is amazing when you look at the original Turing bombes or the later Colossus machine and wonder how far forward the human race has come.

an iPhone now performs more functions than ever imaginable when mobile telephones were invented

But as with any device it is subject to reform, refinement, resizing or adapting to a new format altogether. An example of this is the number of functions my iPhone performs that previously required other devices:

- Email – this required a desktop or laptop
- Music – this required a C90 cassette or record player
- Music store – I have iTunes, which is digital delivery of services rather than the traditional vinyl record or cassette
- iBooks – I read my books based on a digital download rather than traditional book manufacture or consumption

The above are just a few examples where functions and form are consolidated into a more compact device. Let's do a bit of future gazing, one of my personal hobbies: looking at current devices and services and predicting future form and function.

Data centres / Servers / Storage

Future Form: Do we need such huge properties and devices guzzling power, spinning up out-of-date components? My view is that the power of the large providers such as Microsoft Azure, IBM SoftLayer and Amazon AWS will all fit onto a single form factor of today's single server within 20 years. As I predicted many years ago on this blog, we are beginning to see the fusion of biomechanics with computing. As quantum computing and synthetic DNA storage become mainstream, today's devices will become more and more obsolete. I hear with interest of many networking and storage companies moving towards being 'software only' companies; we will see more of these announcements in the next few years.
[easy-tweet tweet="Do we need such huge properties and devices guzzling power, spinning up out of date components?" user="comparethecloud"] Hybrid, Hybrid, Hybrid, before we see what I term as ‘the great leap’ towards cloud based technologies, we will first see a gradual migration, using hybrid technologies which resemble current architectures. These Hybrid technologies such as those used within the car industry will allow familiarisation with future enhancements whilst retaining the current look and feel of IT hardware. The hybrid transition will move into the laggard space by 2018. Future Function: Those that invest in data centres should now move onto another website here’s a great link www.whufc.com. In my view data centres that do not modernise and open up their doors, embracing local and national communities will die. Like the much-maligned 1960’s tower blocks being pulled down around the UK, the data centres that do not offer more than technical real estate will be akin to lemmings walking towards a cliff. The function of the future data centre will be in my view very different today and based upon biotech and be more technical hubs for those that need premises. in my view, Data centres that do not modernise and open up their doors, embracing local and national communities will die. My view is that within 10 years most mega-cloud providers would of built their own data centres so they control the full stack, whilst regional data centres will lose 80% of capacity, which will need to be replaced by other revenue streams. Servers / storage and other hardware functions will be software controlled and take 99% less footprint that todays technical architectures. My final thoughts modernise now or you will be extinct like a dodo. [easy-tweet tweet="The #datacentres that do not offer more than technical real estate will be akin to lemmings walking towards a cliff" user="comparethecloud"] Laptops and mobile devices  the issue is whether the artificial intelligence community can allow for the form to be made into a chip... Future Form: We could make this as small as an atom if needs be, the issue is whether the artificial intelligence community can allow for the form to be made into a chip. The reason I say this, voice dictation and thought control will be essential to miniaturise these devices. The keyboard layout is crucial to many familiar with creating spread-sheets and word processing functions. My prediction: the laptop and mobile will be chip sized with embedded virtual reality functions that will pop out a virtual keyboard for those who wish to type using QWERTY. Future Function: IoT, M2M and any other acronym I can throw out there will be controlled consumed and executed by a bio chip. We are already starting to see home automation and personal health embedded via apps onto these devices. My view is long term everything from our passports to movies will be embedded onto every human as a biochip with artificial intelligence functions interacting with our thoughts. ### Nutanix Interview at the Lenovo Technology Centre Launch Earlier this year Compare the Cloud attended the launch of Lenovo’s Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo’s partners.   Compare the Cloud interviews Andrew Brewerton of Nutanix about how they are working with Lenovo, and how the Technology Centre is going to benefit their business. 
[easy-tweet tweet="Catch up with @Nutanix's Andrew Brewerton on @comparethecloud's coverage of the @LenovoPartner Technology Centre"] ### Cloudian Interview at the Lenovo Technology Centre Launch Earlier this year Compare the Cloud attended the launch of Lenovo’s Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo’s partners.   Compare the Cloud interviews Sean Norris of Cloudian about how they are working with Lenovo, and how the Technology Centre is going to benefit their business. [easy-tweet tweet="Watch @CloudianStorage's Sean Norris on @comparethecloud's coverage of the @LenovoPartner Technology Centre"] ### IBM Cloud Security Enforcer and how it stops shadow IT By now, you would have hoped that there had been enough data breach headlines and cyberattack scaremongering for every business to realise the importance of security. However, companies are often so busy focusing on external threats that they neglect their internal vulnerabilities. One such security failing that needs to be tackled from within is shadow IT. [easy-tweet tweet="#Cloud and #smartphone #technology has eroded the role of the #IT department in many businesses" user="anke_philipp" usehashtags="no"] The rise of cloud computing and smartphone technology has eroded the role of the IT department in many businesses. The average employee is, by and large, comfortable with purchasing and using devices to access company data and applications, without making IT personnel aware. BYOD, remote working and cloud applications have all brought countless benefits, but they have also wrestled control away from businesses. Of course, in most cases the employees engaged in shadow IT do not have any malicious intent. However, when staff act without authorisation, there can be risks involved. Firstly, shadow IT can be costly as many individual licenses fail to benefit from economies of scale. The use of third-party cloud applications also means that company data could be at risk. Clearly IT teams need to reclaim some level of control – but it is currently proving a challenge. A recent survey by Cloud Security Alliance found that almost 72 per cent of executives and IT managers do not know the number of shadow IT apps within their organisation. when staff act without authorisation, there can be risks involved For IT managers looking to gain a handle on the shadow IT taking place within their organisation, IBM Cloud Security Enforcer offers some hope. It provides IT teams with a clear dashboard breaking down the cloud apps being used and by whom. It also offers alerts if employees are breaking company policy or acting in a way that could put business data at risk. Previously, IT managers would have had to trawl through mountains of data to gain this information, but now assessing the risk of shadow IT is a much simpler process. Crucially, it give businesses visibility, as well as control. IBM Cloud Security Enforcer also provides real-time threat intelligence, so businesses can respond rapidly to any potential vulnerabilities or attacks, and by covering PCs and mobile devices, it guarantees robust protection. Identity and access management is also provided to ensure that individuals only have access to the resources that are relevant to their work. IT managers can create an employee launchpad of approved cloud apps to ensure that there is always an easily accessed alternative to shadow IT.  
[easy-tweet tweet="The rise of #shadowIT stems more from employees trying to be productive than it does deliberate deception" usehashtags="no"] In fact, it is important to realise that IBM Cloud Security Enforcer isn’t about clamping down on employee freedom. The rise of shadow IT stems more from employees trying to be productive than it does deliberate deception. It’s important, therefore, that organisations still encourage this productivity, but introduce better safeguard for the business. With IBM Cloud Security Enforcer, it’s much easier for business to reduce risk without crippling employee freedom. ### Stormagic Interview at the Lenovo Technology Centre Launch Earlier this year Compare the Cloud attended the launch of Lenovo’s Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo’s partners.   Compare the Cloud interviews Hans O'sullivan of Stormagic about how they are working with Lenovo, and how the Technology Centre is going to benefit their business. [easy-tweet tweet="Catch up with @stormagic's Hans O'sullivan on @comparethecloud's coverage of the @LenovoPartner Technology Centre"] ### Close To Home: The Case For Private Cloud Computing Public, hybrid or private? When it comes to cloud computing, your organisation has a wealth of choice — all three models are now viable, thanks to maturing technology and increasing standardisation efforts. As notes by Forbes, however, despite the widening market, many companies still consider public deployments the de facto cloud choice. In fact, the private cloud infrastructure market gained a solid 16 percent last year. What's more, public and private compound annual growth rates are expected to keep pace over the next five years. Bottom line? There's a real case for going private — here are four reasons you may want to keep the cloud close to home. [easy-tweet tweet="Chris Surdenik shares his four reasons you may want to keep the #cloud close to home" user="comparethecloud"] Compliance All businesses are subject to some compliance regulations — for example, any retail organisation using point-of-sale (POS) services or collecting credit data online for e-commerce purchases must adhere to PCI DSS data handling standards. Certain industries, however, must ensure they can accurately track the movement and access all information in their organisation. Consider health care: HIPAA standards define specific requirements for the handling of electronic health records. Finance companies, meanwhile, must be able to demonstrate — even in the event of a network breach — that they've done everything possible to safeguard consumer data. While many public offerings now advertise specific standard compliance, this task is easier if you retain total control of your cloud environment. While many public offerings now advertise specific standard compliance, this task is easier if you retain total control of your cloud environment Security Security is another key element of the private cloud. Although public cloud solutions have evolved to rival the defensive protection of on-premises alternatives, many IT professionals remain uncomfortable with the idea of outsourcing any portion of security control to public providers — what happens if connection with public servers is lost, or your provider permits a data breach? Staying private permits granular control of all security tasks and keeps hardware within reach. 
Clarity

As noted by TechTarget, private clouds are a good fit for companies with superior line-of-business clarity - in other words, if you have a specific cloud vision and have the capital available, private cloud may be the better choice. While it's impossible for most businesses to match the feature sets of public cloud offerings, in some cases less is more: by designing a private cloud to suit specific needs, you can avoid the problems of both overprovisioning and overspending.

[easy-tweet tweet="If you have a specific #cloud vision and have the capital available, #privatecloud may be the better choice"]

Customisation

You may also want to consider going private if customisation tops your priority list. While public clouds offer a host of features and on-demand compute resources, limitations exist - public providers profit by delivering service under a "common denominator" model. Private clouds, meanwhile, allow your IT staff to customise not only software and platforms, but also infrastructure to meet specific needs. For example, industrial design firms may have graphics-heavy workloads that require graphics processing unit (GPU) throughput not supported by public deployments. Public-private customisation is also emerging. Next Platform notes that it's now possible to run a "slice" of public cloud on premises to create a truly hybrid environment.

You may also want to consider going private if customisation tops your priority list

Considering the cloud? While public remains the most popular choice, private alternatives offer benefits to address unique corporate needs.

### Pivot3 Interview at the Lenovo Technology Centre Launch

Earlier this year Compare the Cloud attended the launch of Lenovo's Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo's partners. Compare the Cloud interviews Daniel Keelan of Pivot3 about how Pivot3 is working with Lenovo, and how the Technology Centre is going to benefit their business.

[easy-tweet tweet="Catch up with @Pivot3's Dan Keelan on @comparethecloud's coverage of the @LenovoPartner Technology Centre"]

### Are Vendors falling out of love with MSPs?

Every industry around the globe follows trends that create cause and effect relationships. Within the IT industry there have always been ebbs and flows, normally induced by following the vendors' marketing funds.

[easy-tweet tweet="Is the #Vendor #MSP love story coming to an end? " user="comparethecloud" hashtags="cloud, strategy, IT"]

This time last year any IT vendor or distributor was calling for the recruitment of MSPs (Managed Service Providers) with the gusto of a town crier ringing out the latest news or the penance of a medieval monk wearing a hair shirt. The issue I am seeing now, though, is the lack of sales generated by the MSP channels, due to them not being sell-through organisations and only using IT hardware and software for their own multi-tenant consumption. The knock-on effect within the vendors (aside from dwindling staff bonuses) is disillusionment within the channel functions, combined with prostrating apologies to traditional partners, who were left out of marketing funding pots for the last year.
Playing to the laggard space

every vendor salesperson has a family to feed and clothe, therefore they will always go where the money is to enable targets to be hit

My personal view is that every vendor salesperson has a family to feed and clothe, therefore they will always go where the money is to enable targets to be hit. Like the insurance industry many years ago, which changed from a commission-based to a salaried structure, the days of the vendor's shiny-suited salesman being the middleman are slowly drawing to a close. The next two years will not see a major step change, but what I am already starting to see is the emergence of a new vendor salesperson, one who is focused on being essential to clients and adding value rather than being coin-operated. The MSP community, with all this recent attention, has tended to act with quite a swagger; in my view this is akin to the dot-com period of many years ago. This swagger, combined with the easy availability of vendor marketing funds for a simple MSP rebrand, has now caused a period of stagnation and disillusionment.

Sorting the MSP wheat from the chaff

It is easy to apply a badge of honour to either a new or existing business; we have seen this consistently over the years with the overuse of the word 'cloud', and we are now seeing it with the term 'MSP'. Having been exposed to a number of providers over the years, there are a number of core differentiators that we see that define quality from quantity in the MSP space.

[easy-tweet tweet="There are core differentiators that we see defining quality from quantity in the #MSP space"]

The following are my personal top five points I use to define a quality MSP (not in order of preference):

Ability to pay salespeople

As simple as this sounds, an MSP model is based on subscription rather than sell-through margins. A service may make only small profits, but over a period of time these far outweigh sell-through profit margins. A model that rewards salespeople within this environment is key to maintaining growth.

Consultants not salespeople

It may seem that I am taking aim at the sales function; I make no apologies for this. Monthly or quarterly contracts have removed the once renowned 'hit and run' tactic of old. End-users (the ultimate product consumers) do not wish to be 'sold to'; they wish to be informed and educated, and to view the MSP's front-line staff as partners, NOT self-interested, coin-operated robots who will say anything to win a deal.

Reformed marketing

The days of 'let's buy a marketing list', then 'email spam that list', followed up by 'let's cold call that list', are over. If you are a marketer residing in a vendor funding such actions, or an MSP executing such actions, hang your head in shame. You are no better than the annoying people we get tele-spamming our mobiles with unsolicited calls. Face it: this kind of marketing went down with the Titanic. Grow up, embrace social and digital; be essential, be informed, but above all be relevant. The latest trend I have seen in successful providers is the leveraging of big data and analytics to determine direction and strategy. PS, many PR agencies are about as useful as a 'cat flap in an elephant house' - think twice before appointing one.

Self-determination

I have seen a number of hardware and software purchases made on the back of promises by vendors to help the partner sell out to end-users. Whilst this is great for the MSP, is there not a need for self-determination within an MSP? The saying 'be cruel to be kind' is relevant here.
If an MSP needs a vendor to sell out and market a service for it, then that MSP is never going to get off the vendor crutch and determine its own destiny. From a vendor standpoint, do not spoon-feed your partners; educate, digitally transform, train, plan and teach them to be self-sufficient. Or better still, enable the distributors to do this.

Technical ability

Never before in the history of the computing industry has the technical team had a chance to stand up front and be respected. I often hear that without sales there is no technical department; my view is that, in terms of client adoption and retention, the age of the 'techie' is here. With clients tuning out from repetitive marketing automation and not bothering to meet salespeople, this is your time to shine!

Technical ability can be defined in many ways, but my personal view is that it is not some BS vendor accreditation that anyone can exam-cram and pass

Technical ability can be defined in many ways, but my personal view is that it is not some BS vendor accreditation that anyone can exam-cram and pass, but the ability to fix customers' issues, especially when under heaps of pressure. Like the 'Ice Man' in the movie 'Top Gun', this is your chance to move that coolness into a more pre-sales consultancy role. Cloud and SaaS services are moving so rapidly that technical departments which do not embrace change, and which advise outdated, costly solutions, are doing their clients a disservice. Aligning the technical department to agile business strategy will be more prevalent in Q4 2016; my advice is to embrace this now and implement it.

### See the Launch of the Lenovo Technology Centre

Earlier this year Compare the Cloud attended the launch of Lenovo's Technology Centre in Basingstoke and interviewed a range of experts on the benefits that the Technology Centre is going to bring to Lenovo's partners.

[easy-tweet tweet="Recap the @LenovoPartner Technology Centre launch on @ComparetheCloud"]

### University of Portsmouth to host a new centre of excellence for satellite data

The University of Portsmouth is to host a new centre of excellence for satellite data, which will promote economic growth in the UK's space sector.

[easy-tweet tweet="#CloudNews: University of Portsmouth to host a new centre of excellence for #satellite #data"]

The centre will allow the south coast business community to tap into satellite technology that could improve their products and services, for example, developing technology to help robot submarines use live satellite data to inform their route through the ocean – similar to drivers using live traffic updates. The South Coast Centre of Excellence in Satellite Applications will be led by the University of Portsmouth's Institute of Industrial Research. It is co-funded by the Satellite Applications Catapult and the UK Space Agency. The new centre will use data from thousands of man-made satellites which orbit Earth. Some satellites take pictures of the planet that help meteorologists predict weather and track hurricanes; others are used mainly for communications, such as beaming TV signals and phone calls around the world. A group of more than 20 satellites makes up the Global Positioning System, or GPS. Experts at the centre will work with industry, particularly small businesses, to ensure they can use satellite data to make new products and services commercially available. Its market focus areas cover maritime and marine, and autonomous systems. Professor of Industrial Systems, David Brown, said: "Satellites looking towards Earth provide information about clouds, oceans, land and ice.
They give us insight, collect and distribute vast quantities of data and give us a detailed picture of what is happening in our world such as allowing us to monitor and protect our environment, manage resources and respond to humanitarian disasters. [caption id="attachment_35274" align="alignright" width="300"] Professor David Brown[/caption] “To be leading this new centre is a great success for the University. The centre will encourage and promote collaboration between universities and industries, especially SMEs, which can only enhance and promote local economic growth.” Other partners include the Universities of Brighton and Southampton, National Oceanography Centre, Marine South East, Offshore Renewable Energy Catapult and Hampshire County Council. Professor Catherine Harper, Dean of the University of Portsmouth’s Faculty of Creative and Cultural Industries, said: “The Institute of Industrial Research has been working with businesses for a number of years to deliver solutions and opportunities of strategic importance to industry and research. University of Portsmouth's lead in this new innovative centre is of considerable significance to this city and region.” A new centre of excellence is also being established in the south-west. There are existing centres in the East Midlands, north-east England and Scotland. Universities and Science Minister Jo Johnson said: "The Government is backing the UK's successful space sector and ensuring more businesses can benefit from space technology through the Satellite Applications Catapult. These new and extended centres will deliver more support to businesses and scientists across the country, and help the space industry reach its ambition to grow to £40bn by 2030 and generate 100,000 new jobs.” Stuart Martin, CEO of the Satellite Applications Catapult, said: “We’ve witnessed a significant upturn in regional engagement over the past two years through the initial three centres. This has included activities linking up the science base, large industry and SMEs to help develop satellite-derived applications and solutions. “The network’s expansion to cover the southern region of the UK will hopefully result in generating further understanding and awareness of the opportunities that satellite data and technology can provide – especially among market sectors not currently engaging with them – and developing new commercial opportunities for the UK’s Space sector.” ### Fresh fight for Cloud e-tailers as they offer soup to nuts … along with milk, eggs and other fresh foods Alphabet launched its Google Express service in 2013 to offer non-perishable goods to test markets in a few major US cities. Recently it announced that it is now expanding into fresh items as well in two test markets - LA and San Francisco. The pilot scheme will ship milk, eggs and other fresh foods. In doing so it was not only taking on the supermarkets and other grocery players, but also its main cloud rival, the retail giant Amazon. [easy-tweet tweet="#Alphabet is taking on supermarkets and other grocery players, but also its main #cloud rival, #Amazon"] However just as Google extended its non-perishable service into groceries, Amazon moved to extend its existing grocery delivery service (already available in a number of major US cities) to the UK through a deal with UK supermarket Morrisons. 
Traditional players in the supermarket and grocery trade are right to fear the encroachment of the two cloud giants into their turf – here’s why: Amazon – the cloud and logistics giant that sells everything: While Amazon is already one of the world’s leading retailers, this is not where its profits are made, nor where it gains its real competitive edge. It is its cloud business AWS that provides the firm with most of its profit, and it is its logistics and fulfilment operation that gives it the edge. Amazon may well sell a great deal itself, but an increasing proportion of what it offers is provided by other merchants. They simply use Amazon for its logistics and fulfilment operation. Indeed, to increase the scale of this operation, it has recently started to invest in aeroplanes to cut out the air-freight operators, and in shipping operations to get into bulk freight as well. Its deal with Morrisons is a ‘prime’ example of this. The UK supermarket chain had a good regional presence in Scotland and northern England, but less coverage elsewhere, and was later than its rivals into the online grocery delivery business. Tesco, which is the largest local supermarket player, has UK-wide coverage and 40% of the online grocery delivery business, with a wide range of non-grocery items available via Tesco Direct either for delivery or click and collect from its stores. Overnight Morrisons becomes a merchant and wholesaler with nationwide coverage and logistics via Amazon.co.uk. Amazon’s Prime Now option is currently available to members of Amazon’s Prime service in London, Birmingham, Newcastle, Manchester and Liverpool, who can order over 15,000 items for one-hour delivery costing £6.99, or choose two-hour, same-day delivery slots free of charge. Amazon Pantry, meanwhile, is available across the country to Prime customers, who can order more than 5,000 everyday items such as nappies, detergent and mouthwash for one-day delivery. Alphabet – the cloud and advertising business that lets you search for things Google’s parent Alphabet has its fingers in a host of different pies, but at heart it is a massive advertising business – after all, this is where most of its profit is made. We don’t pay to use its search engine, it offers basic email and storage for free as well, and its Google Apps and wider cloud operation aren’t massively profitable yet. However its search engine presence allows it to serve up a range of products to potential shoppers via Google Shopping. [easy-tweet tweet="Purchases on Google comes as Facebook and Pinterest also try to make it easier for people to shop says @billmew"] Google wants to go further than this though. It is looking to make it easier for people to shop while using its website. Last year it launched a trial feature allowing smartphone users to make online purchases from their search results. The introduction of “Purchases on Google” comes as Facebook and Pinterest also try to make it easier for people to shop while using their sites, putting themselves in position to profit from transactions or related advertising. Google sees ‘Purchases on Google' as a big step towards helping retailers drive more mobile conversions and win more customers.
Add to this its logistics and fulfilment operation with Google Express, a service that is now expanding into fresh items as well, and you can see how it is lining up against Amazon in so many areas. Amazon and Google aren’t going to have it all their own way though - their UK rivals are fighting back: Sainsbury's is aiming to "future-proof" its business with the £1.3bn offer to buy Argos owner Home Retail Group (HRG). HRG isn’t the most successful or profitable retailer, but it has an incredibly efficient logistics operation. Indeed its stores, being more like warehouses with a collection desk, are already well set up for the kind of click and collect operation that many groups are seeking to offer. The UK grocery market is already ultra-competitive, with incumbents like Sainsbury’s and Tesco along with Asda, which is owned by Amazon’s US rival Walmart. Then there are the discounters like Aldi and Lidl that are also stealing market share. At the same time the UK market is already the most advanced in Europe for online delivery, with 27% of British shoppers saying last year that they shop online for their groceries monthly, compared with 22% in 2010. Amazon and Google both have major operations elsewhere (Google taking on Apple with Android and Amazon taking on Netflix with Prime Movies), but across a broad swathe of their operations they are now direct rivals as well as serious challengers for the retail incumbents. This should be of greater concern to the incumbents than to shoppers – after all, any increase in competition or innovation in service and efficiency should be to the benefit of customers, as long, that is, as a good level of competition is maintained. ### Are you compromising your company’s data security? With data protection being an ever-present concern for companies of all sizes in all industries, you can never be doing too much to ensure that any sensitive documents you have are well protected. Recent information from the ICO shows that one of the biggest causes of information security breaches is the loss and theft of paperwork, and companies are falling into this trap completely unnecessarily. [easy-tweet tweet="When it is so easy to go completely digital securely, there really is no excuse for taking a risk with paper documents"] When it is now so easy to go completely digital securely, there really is no excuse for taking a risk with paper documents. Additional security is not the only benefit, either. In fact, here are five good reasons to scan the documents that you have, and move exclusively to digital. 1) You can make the most of your working day Workers in information industries spend around 11.2 hours every single week creating and managing documents. You could be saving as much as half that time by switching to digital documents. [easy-tweet tweet="Workers in information industries spend around 11.2 hours every single week creating and managing documents" user="comparethecloud"] Digital documents have two key qualities that can help you save time: they are searchable and they are shareable. You can search through files on your hard drive or in a cloud-based system in seconds, and you can also easily search for specific words in most documents. With digital documents, finding that obscure piece of information can take seconds instead of minutes, and you will have more time to spend acting on that information.
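To make the ‘searchable’ point concrete, here is a minimal sketch of a keyword search across a folder of scanned, OCR’d text files. The folder name and file extension are illustrative assumptions, not a recommendation of any particular product:

```python
# A rough sketch of the "searchable" benefit: recursively scan a folder of
# plain-text documents for a keyword. The folder path and *.txt extension are
# hypothetical; a real archive would usually live in a document management
# system or cloud store with its own search index.
from pathlib import Path

def find_keyword(root_folder: str, keyword: str) -> list[tuple[str, int]]:
    """Return (file path, line number) pairs where the keyword appears."""
    matches = []
    for path in Path(root_folder).rglob("*.txt"):
        try:
            text = path.read_text(encoding="utf-8", errors="ignore")
        except OSError:
            continue  # skip unreadable files rather than abandoning the search
        for line_no, line in enumerate(text.splitlines(), start=1):
            if keyword.lower() in line.lower():
                matches.append((str(path), line_no))
    return matches

if __name__ == "__main__":
    for file_path, line_no in find_keyword("scanned_documents", "invoice 4721"):
        print(f"{file_path}: line {line_no}")
```

A dedicated document management system or cloud search index would return results far faster at scale, but even this naive scan beats leafing through a filing cabinet.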
You can also easily send documents to colleagues and clients via email or share them across various online platforms, allowing for much faster feedback and collaboration on those files. Having your documents in digital form will therefore not only increase the speed with which they can be made use of, but potentially their quality as well, as it is easier for others to contribute and give you feedback. 2) You have the capacity to adopt a flexible approach to work Digital documents are fundamentally mobile. With all of your documents digitised, you won’t have to worry about carrying them with you from one location to another, or about making do without them. You can have access to your documents any time that you have access to the internet, or you can download them onto your laptop or tablet to have access to them anywhere. Digital documents are fundamentally mobile You would not only have more options with where you travel for your work, you will also have the opportunity to work while you travel, and make the most of that time that would otherwise be wasted. With this approach, a train journey stops being a blank spot in the day, and becomes another opportunity to carry out valuable work. 3) You can communicate efficiently with people outside the company Whatever it is that your company does, there is scope to do it even more efficiently by switching to purely digital documents. Perhaps you’re part of a consultancy firm that has regular contact with clients, or your business is working alongside another business and you need to communicate often with them - in all these situations having digital versions of your documents will make the communication faster, easier and more secure. International communication is also much easier with digital documents. You can send files to companies in the USA as easily as you can to a client in the same city as you, and you’ll never have to pay postage. International communication is also much easier with digital documents 4) You will save on resources By switching to digital documents, you’ll be saving money and space as well as helping the environment. Think about how much space a couple of filing cabinets take up in the office. True, it might not be a huge percentage of the total space that you have available, but it is space that you are losing unnecessarily when all your documents can be stored in hard drives or the cloud. Digitising your documents helps you to make the most of all the money that you are paying for your real estate. It also means that you won’t be wasting all the paper that could be filling up those filing cabinets over the years to come. Once they have been scanned, existing documents can be easily and securely recycled, and once your company has gone digital there won’t be any further need for paper to build up around the office. 5) You will not be compromising on security [easy-tweet tweet="With #digital documents you can have complete control over the #security of your documents" user="comparethecloud"] With digital documents you can have complete control over the security of your documents. If your company handles particularly sensitive data, then there are international accreditations that will help you prove to clients and board members that you are doing everything you can to protect the data that you have. Ultimately, having digital versions of your documents is far safer than leaving them all in physical form, where they are vulnerable to damage, theft and destruction in situations that are often beyond your control. 
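On the point about keeping control of security, one simple and widely used measure is to encrypt sensitive files before they are archived or uploaded. The sketch below uses the third-party Python cryptography package purely as an illustration; the file names are hypothetical and this is not something the article itself prescribes:

```python
# Illustrative sketch: encrypt a sensitive document locally before storing or
# sharing it, so only holders of the key can read it. Requires the third-party
# "cryptography" package (pip install cryptography); file names are hypothetical.
from cryptography.fernet import Fernet

def encrypt_file(source: str, destination: str, key: bytes) -> None:
    """Write an encrypted copy of `source` to `destination`."""
    with open(source, "rb") as f:
        ciphertext = Fernet(key).encrypt(f.read())
    with open(destination, "wb") as f:
        f.write(ciphertext)

def decrypt_file(source: str, key: bytes) -> bytes:
    """Return the decrypted contents of an encrypted file."""
    with open(source, "rb") as f:
        return Fernet(key).decrypt(f.read())

if __name__ == "__main__":
    key = Fernet.generate_key()  # in practice, keep this in a key vault, never alongside the data
    encrypt_file("board_minutes.txt", "board_minutes.enc", key)
    print(decrypt_file("board_minutes.enc", key).decode("utf-8"))
```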
There is no additional benefit to keeping the documents in paper form, so why take the risk with security on top of that? Your existing documents can be securely scanned and disposed of, and you can move forward with digital documents from that point onwards. ### Cisco and NetApp All-Flash Converged Infrastructure Removes Constraints for the Modern Enterprise Data Centre   Today, Cisco and NetApp®, the developers of the FlexPod® converged infrastructure (CI) solution, have seen FlexPod partner and customer enthusiasm grow for the performance, agility and economic benefits of the all-flash data centre. [easy-tweet tweet="#CloudNews: @Cisco and @NetApp are removing constraints for the modern enterprise #datacentre" user="comparethecloud"] FlexPod with All Flash FAS (AFF) accelerates Citrix time-to-market for new software versions and features Citrix solutions, used by more than 100 million people globally, power business mobility through highly secure, mobile workspaces that provide instant access to apps, desktops, data and communications on any device, over any network and cloud. Like most software companies, Citrix constantly improves and updates its products to deliver value to customers. Citrix wanted to improve efficiency and move further into DevOps by introducing new capabilities for automation. Immediately after installing the FlexPod with AFF, the company noticed a drastic improvement. Testing at scale is consistent, fast, and reliable, and tests no longer need to be extended or repeated. All have contributed to faster time to market for Citrix XenApp and XenDesktop virtualisation solutions. [quote_box_center] “We’ve had really good success with NetApp All Flash FAS and the FlexPod infrastructure. We have a higher confidence in our data. We know it works. And we’re going to continue expanding that as fast as possible,” said Satish Sambandham, senior manager, Product Development, at Citrix Systems. Customers recognise that flash equals speed. But as all-flash array pricing has come down, there’s a growing recognition that flash provides significant management benefits as well, especially when integrated within the simplified operation of a converged infrastructure solution. [/quote_box_center] “In the daunting economic climate of today’s IT, converged infrastructure can not only simplify users’ experiences and ROI, but also drive partner profitability,” said Mark Peters, Practice Director & Senior Analyst at ESG. “Cisco and NetApp are jointly enabling their users and partners to more quickly realise the benefits of the specific technologies and expertise that each company contributes to the FlexPod solution.” "FlexPod with All-Flash FAS is a natural fit for converged infrastructure since it simplifies data management,” said Lee Caswell, vice president, Product, Solutions and Services Marketing at NetApp. "In a world of limited IT skills, the all-flash FlexPod allows users to focus on high impact work and makes storage performance tuning a thing of the past." “FlexPod with All-Flash FAS is the type of solution that our customers are looking for in their data centres,” said Mark Melvin, CTO of ePlus. “The performance and economic benefits of flash have fundamentally changed our customers’ business models and design paradigm. 
FlexPod with All Flash FAS offers a compelling solution to solve many of their data center challenges.” “Shifting business pressures and operational limitations are pushing our customers to rethink how they build and manage data centre environments,” said Jim Kebert, vice president of Avnet’s NetApp solutions business in the Americas. “Our partnership with NetApp has been about combining their excellence in data management with our industry-leading solutions distribution practices. The announcement today is the perfect opportunity for us to continue our focus around the FSA One Framework and the new FlexPod with All Flash FAS to our customers.” With the FlexPod Advantage initiative, NetApp and Cisco are enabling partners to help customers accelerate their journey to a modern data centre. The featured solution is the FlexPod with NetApp All Flash FAS storage, Cisco UCS servers with the latest M4 processors and Cisco NEXUS 9K switches with Application Centric Infrastructure (ACI). The solution is ideal for enterprises looking to consolidate mixed workloads and protocols across flash, disk and cloud resources. Key benefits of the FlexPod platform include: Impressive performance gains: Up to 20X reduction in latency with all flash storage; over 200% faster SQL response times with new server CPUs (See here for more detail) New standards in management agility: Up to 30% reduction in application testing time and over 80% faster provisioning with ACI (See here for more detail) Clear economic value: Over 75% ROI in just 17 months, generating savings for both CapEx and OpEx. Source: Forrester “The Total Economic Impact Of NetApp’s FlexPod Datacenter Platform” “We believe that FlexPod utilising the latest Cisco UCS servers, Nexus 9K switches, Cisco ACI, and NetApp All-Flash FAS can deliver new levels of performance and create a modern technology foundation to meet the demands of business critical applications,” said Dhritiman Dasgupta, vice president of Product Marketing at Cisco. The companies also announced FlexPod with Infrastructure Automation. This new solution utilizes Cisco UCS Mini, NetApp FAS storage, and automation capabilities that radically simplify and accelerate the ordering, delivery, and installation of IT at remote and branch office locations. For details on how FlexPod with Infrastructure Automation can be deployed in less than one hour, visit here. For more details on the FlexPod with NetApp All Flash FAS and Cisco ACI, visit here.   ### HPE teams up with vArmour for application-aware micro-segmentation I caught up recently with an old friend Dean Hickman-Smith, who is one of the Exec’s at cloud security leader vArmour after they announced a big partnership with HP Enterprises. Here’s a brief Q&A which serves as a great primer on one of the hottest trends in cloud security right now. [easy-tweet tweet="vArmour's Q&A with @Comparethecloud about the HPE partnership announcement"] Daniel Thomas (CTC): Congratulations on the new partnership with HP Enterprise. What’s the gist of the partnership and what can customers expect as a result? Dean: Thank you. We’re very excited with our new HPE partnership not only because this means we can better service customers globally, but also because of the incredible validation this provides for our most differentiated capabilities, application-aware micro-segmentation with advanced analytics. Daniel: That is a mouthful - what does that mean - application-aware micro-segmentation with advanced analytics? 
Dean: At a simple level, it means that we help customers see and stop sophisticated attacks in their data centre, virtual and cloud environments. Here’s how to think of it a level deeper. According to Cisco, by 2019 more than 86% of workloads will be processed by cloud data centres (Cisco Global Cloud Index, 2015). As companies began moving more and more workloads to the cloud (many of them in converged and now hyper-converged infrastructure models), we noticed that they tended to rely on their legacy perimeter-based models of security. But the perimeter breaks down in the cloud. In short, we all know that the bad guys are going to get in, but once they’re in, how can you see and stop them before they take control of your critical systems? No firewall can help you there. We’ve seen this story in the headlines. Application-aware means moving the level of understanding and protection from the perimeter to the individual workload, so you can monitor application and user behaviour and detect threats at the application layer, not just the network layer. Micro-segmentation refers to the ability to segment applications using a set of rules rather than placing them in specific places in the network. Not only does this free people up to spin up workloads wherever they have the resources, but more importantly, it lets them quarantine those workloads when they see something suspicious. This helps stop the lateral spread of threats in the data centre. That’s where the analytics come in. Customers have so many workloads and so much happening in their data centre and cloud environments that they need to know when an application starts to act strangely, like a web server trying to talk to a database. That shouldn’t happen. Daniel: So how does all this fit with the HP Enterprise products? What is the ‘better together’ idea? Dean: We are going to market with two products within HP Enterprise. The first solution is built around their hyper-converged platform, which combines with vArmour to ensure every workload has security built into it as storage and compute resources are launched. It then lets them segment their most sensitive assets – from personal information or healthcare records to credit card numbers – all in the same, shared infrastructure pool. This helps companies reduce their threat surface. The “better together” idea here has to do with protecting companies from threats without slowing them down. [easy-tweet tweet="Application traffic flowing inside virtualised and #cloud #datacentres is a blind spot for many enterprises"] The other opportunity is to go to market with HPE’s leading SIEM (Security Information and Event Management) platform, HP ArcSight. These are huge systems that monitor incoming information from all kinds of enterprise systems, correlate that information, and manage security response operations. Application traffic flowing inside virtualised and cloud data centres is a blind spot for many enterprises, and there has not been a simple way to feed context and application-layer activity to SIEM platforms for correlation and response to incidents. With vArmour, it will be easy for users of HP ArcSight to see application communications from every workload across public and private clouds. They will also be able to respond in real time to sophisticated threats in ArcSight by making policy changes dynamically through vArmour to stop an attack.
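To picture the rules-based segmentation and quarantine Dean describes, here is a deliberately simplified sketch. The labels, rules and quarantine logic are hypothetical illustrations of the general idea and bear no relation to vArmour’s actual policy model or APIs:

```python
# Illustrative only: a toy model of rules-based micro-segmentation. Workloads
# carry labels, flows are permitted by label-to-label rules rather than by
# network location, and anything outside the rules gets the source quarantined.
# The labels, rule set and Flow type are hypothetical, not vArmour's model.
from dataclasses import dataclass

ALLOWED_FLOWS = {
    ("web", "app"),  # web tier may talk to the application tier
    ("app", "db"),   # application tier may talk to the database tier
}

@dataclass
class Flow:
    source_label: str
    dest_label: str

quarantined: set[str] = set()

def evaluate(flow: Flow) -> str:
    """Allow flows that match a rule; quarantine the source workload otherwise."""
    if (flow.source_label, flow.dest_label) in ALLOWED_FLOWS:
        return "allow"
    quarantined.add(flow.source_label)
    return "quarantine"

# A web server talking straight to the database is exactly the anomaly Dean mentions.
print(evaluate(Flow("web", "db")))  # -> "quarantine"
print(evaluate(Flow("app", "db")))  # -> "allow"
print(quarantined)                  # -> {'web'}
```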
And here, the “better together” idea has to do with clearing up these blind spots and giving companies a new arsenal of tools to respond and stop threats. Daniel: So our readership are all manner of cloud practitioners. What would you want them to understand about the current state of cloud security. Dean: I would say two things - as a result of this partnerships it’s now simple and cost effective to deploy top notch security without slowing down your business. Don’t back off the challenge. The second is to stop thinking of infrastructure and security as totally separate entities, these come together in a converged world, just as they are coming together through this partnership with HPE. stop thinking of infrastructure and security as totally separate entities ### Pervacio opens regional hub in Finland Increasing demand for its services across Europe has led to mobile device automation solutions specialist Pervacio opening a regional development, engineering and management hub in Finland.  [easy-tweet tweet="#CloudNews: @PervacioInc opens regional hub in Finland" user="comparethecloud"] The company’s new office is in Turku on the country’s southwest coast. Finland’s former capital, Turku is home to several university campuses plus a dynamic IT industry, offering not only the resources that will enable Pervacio to innovate, but also that all-important talent pool. Pervacio will take advantage of this location to recruit highly experienced personnel with more than 20 years’ experience in mobile device software development. The Turku office will provide local program management, a call centre and certification for the Scandinavian countries of Denmark, Norway, Sweden, Finland and Iceland. In due course it will expand to support the broader EU region and will develop research and development activities. The new intake of program managers will bring wide experience and knowledge of the mobile device industry, particularly in warehouse, retail and consumer PC applications. Meanwhile, Pervacio’s new developers are experts in device USB and TCP/IP connectivity, embedded software development, Windows ecosystem, application architecture, web connectivity and PC application development. Pervacio has developed a platform that provides end-to-end management for mobile devices irrespective of the network, type of mobile device, operating platform or where the device is in its usage lifecycle. Its solutions are purpose-designed to solve the unique set of problems faced by those operating in the forward and reverse supply of mobile and smart electronic devices. More and more end users, who include carriers, manufacturers, retailers and insurance underwriters, are using Pervacio’s patented products to simplify, centralise and speed up their operations while simultaneously cutting costs. Pervacio’s CEO, Sanjay Kanodia, said: “Stakeholders across Europe are recognising our unrivalled investment in research and our engineering and certification network. It is because of this commitment, along with our global reach, that they are choosing to partner with us, creating a significant increase in demand for our services. 
“The new hub in Finland will operate alongside our well-established offices in the UK and Hungary; enabling us to expand our operation while also providing ongoing support for existing customers throughout Europe.” The decision to open a Scandinavian base marks the latest phase in the company’s business growth and follows the launch of Pervacio’s Global Innovation Centre and further expansion of the hub serving the EMEA/Asia Pacific region, both of which are located in India. ### Productivity and the Cloud: 5 IFTTT Recipes You Need to Know IFTTT, which stands for If This, Then That, is the technology that takes automation to a whole other level – and for free! It works on the basis of cloud connectivity to multiple channels such as DropBox, Gmail, Google Drive, Facebook, Evernote and OneDrive amongst many others, giving you the freedom to choose your connectivity partner of choice. So upgrade your productivity and prepare for a completely digital existence with IFTTT recipes that take care of automating your life. If you’re wondering where to start, take a look at these five amazing IFTTT recipes that will definitely increase your productivity in the workspace. [easy-tweet tweet="The basic assumption of #cloud #connectivity to multiple channels means life could be simpler!" user="comparethecloud" usehashtags="no"] Use Voicemail to Set Yourself Calendar Events and Notifications If you’ve ever struggled with the calendar app and wondered why it couldn’t be simpler, you don’t have to any more. With the aid of some nifty IFTTT tech, you can now automate the event setup process just with the help of voicemail. For instance, you can set a voice reminder for events that are tagged #callme in Google Calendar. Or, you can make a phone call and set up an event in Google Calendar itself. For those looking for fun and deviant ways of using IFTTT, you can even schedule phone calls to yourself just to get out of a calendar event (meeting, date, whatever) as long as it’s logged in your calendar app. Just use an event entry on your calendar with a “Save me!” or “Call me!” and you should get a phone call within fifteen minutes of the time you prescribe. Share Meeting Notes Automatically with Your Group This IFTTT feature works on the principle of “If an email is sent, then create a document in a shared Google Drive folder.” Obviously, this implies that you already have a Google Drive folder with sharing rights that is ready to receive emails. The best part about this is that you can automatically save all emailed meeting notes (or, for that matter, class lectures, shared articles, etc.) with the people you want to share them with. While this works in the office space, it also works in environments like schools and universities. Additionally, you can even create reminders for your team by scheduling them through Slack or Evernote. Scheduling a daily reminder for the team or updating minutes of a meeting are a breeze with pre-defined IFTTT recipes, as the rough sketch below shows.
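Every one of these recipes follows the same trigger-and-action pattern. The snippet below is a purely illustrative, self-contained imitation of that pattern; the trigger, the email structure and the stand-in ‘shared drive’ folder are hypothetical and have nothing to do with IFTTT’s real channels or APIs:

```python
# Illustrative only: a toy "if this, then that" engine. Triggers and actions are
# plain Python functions; the email dictionary and the local folder standing in
# for a shared drive are hypothetical, not IFTTT's real channels or APIs.
from pathlib import Path

def save_to_shared_folder(email: dict) -> None:
    """Action: write an emailed note into a folder standing in for a shared drive."""
    folder = Path("shared_drive/meeting_notes")
    folder.mkdir(parents=True, exist_ok=True)
    (folder / f"{email['subject']}.txt").write_text(email["body"], encoding="utf-8")

RECIPES = [
    # (trigger predicate, action) pairs: "if this, then that"
    (lambda e: e["subject"].lower().startswith("meeting notes"), save_to_shared_folder),
]

def handle_incoming(email: dict) -> None:
    """Run every recipe whose trigger matches an incoming email."""
    for trigger, action in RECIPES:
        if trigger(email):
            action(email)

handle_incoming({"subject": "Meeting notes 12 May", "body": "Actions agreed: ..."})
```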
Automatically back up photos you are tagged in [easy-tweet tweet="#AutoSave your tagged photos using #IFTTT protocols, and never lose a memory again" hashtags="cloud"] One of the most interesting aspects of life in 2016 is the easy access to digital photography: with Instagram and instant uploads, it’s just easy to capture every moment. However, when it comes to storing these memories, many of us have our lapses: especially if those are photos that we’re only tagged in, in which case they depend entirely on the uploader and can be removed at any time. Now, though, you have the option of automatically saving each and every photograph linked to your account – be it from Instagram or from Facebook – and then saving it to your associated DropBox, OneDrive or Google Drive folder. In fact, some recipes will even allow you to create instructions for your camera to directly upload photos to your Facebook album, etc. Take control of situations based on Location-centric Recipes With a simple IFTTT recipe, you can control certain aspects of your life with just the location features on smartphones. The phone’s GPS navigation system can automatically detect your location and perform a number of predefined tasks, such as the following. It can turn up the ringer once you leave your workplace, school or hospital. The ringer can be turned down automatically as soon as you reach your office, school or a hospital. It can track your mileage depending on the location and automatically update it to spreadsheets. It can let you send reminder emails to your colleagues, friends and family based on when your flight lands. Create Recipes to Be Updated on Goings-on This can work in three ways: update you on important news, give you up-to-date information on all the goings-on in your company or workplace, and keep you up to date with class announcements. How? Most people are part of broader Facebook groups which keep updating information: for instance, a class that was cancelled or an event announced on a business page can often reach social media before any verbal alerts. Stay on top of everything with automated recipes that can directly send updates or posts to your own Facebook page, so that you get all the notifications no matter what you might be doing or where you are. Or you can create a recipe such that important notifications and important posts on social media - or even on the internet - directly reach your inbox. In fact, the same works for important news such as new laws being passed or traffic and weather alerts. Essentially, cloud IFTTT recipes can simplify your life; you just have to learn how to cook them up! ### Put your feet up and let the cloud automate your business Consider this. You're the head honcho of a fast growing company selling lots of finely crafted business products and services. You've got plenty of great people working in specialist teams making sense of everything from web analytics, social media and business lead generation, to customer relationship management (CRM) and business process automation. Despite this, your sharp managerial intuition tells you there's still more value to be had. [easy-tweet tweet="Can the #cloud give you more free time at work? " user="comparethecloud" hashtags="automation"] You're right, everyone's at it. Some of us enjoy it in the morning before breakfast, some of us at night. Some of us enjoy thinking about it over a hot beverage and others like to discuss it around the office water cooler. A few of us even like to make it a team event, but how many people can really say they are good at cloud computing? The reality is that the best examples of cloud use are happening hand in hand with digital transformation.
By moving all your business tools into a single platform you can achieve two things: improved efficiency inside your business, and a bigger, happier customer base. Orders will begin to move faster, customer complaints will dry up and you will see those all-important return on investment figures go through the roof. It all sounds great, but it's not always clear exactly how you can get the ball rolling. [easy-tweet tweet="By moving all your business tools into a single platform you can achieve two things..." user="comparethecloud" hashtags="cloud, automation"] This is why Parker Software has developed a suite of business automation tools to take out the hard work. This powerful cloud-based setup works on three key areas: business automation, customer engagement, and lead generation. The automation function can analyse emails, SMS, social messages and file attachments for keywords. Once that's done, it can run intelligent triggered-responses, so if there are appointments to be scheduled, databases to be updated and files to be translated, it simply does it all in real-time. Digital engagement tools like live chat software have been around for a while – but not many can use intelligent triggers made by a customer's browsing behaviour to automatically initiate a live call between customer and sales person. Why wait for the customer to leave the website, compare your competitors, and spend time mulling the decision to call you? Combine this with a prospect detection tool that can provide in-depth financial and structural details of businesses that visit your website, and it's easy to see how the cloud can help you convert better than ever. it's easy to see how the cloud can help you convert better than ever So there we have it, using the cloud doesn't have to be a mere fantasy. By improving the digital transformation of your business processes and facilitating better engagement with customers, the cloud can become your best friend. Before you know it, you'll be able to put your feet up, hot beverage in hand. ### What’s Next for The Internet of Things?  Whenever we talk about the next big trends in tech, Internet of Things and Wearables always come up. The promise of a boom has, however, failed to materialise (excuse the slight pun).  [easy-tweet tweet="When will #IoT and #wearables be simple, effective and affordable enough to actually be market critical? "] That is not to say that it won’t happen though. Experts disagree on the exact size of this potential market, but predictions for the next 10 years’ growth tend to be in multi-trillion dollar range. Just like mobile and smartphone adoption, we’re essentially waiting for the tipping point where these technologies will be good enough, simple enough, and affordable enough to become pervasive. In the meantime, many companies are hard at work innovating in that space, and many of them cluster in what might seem an unlikely corner of the world: Canada’s British Columbia province. Those lucky enough to have visited is capital Vancouver will know that there’s more to it than stunning scenery and wildlife. In terms of film and TV production, the city is the third largest centre in North America, losing out only to New York and Los Angeles. It is also home to thousands of ICT companies, earning it the nickname ‘Silicon Valley North’. The government of the province is keen to promote this fact, which is why it is bringing a delegation of its rising tech stars across the pond to Mobile World Congress in Barcelona. 
Looking at the DNA of some of these, a pattern of innovation in the IoT space emerges that suggests that it might be worth keeping an eye on the region. It is, after all, home to Sierra, the world’s largest machine-to-machine wireless company. The company was recently tipped as one of the top Canadian tech stocks to watch  as it has squarely positioned itself in the middle of the sector. It specializes in creating embedded wireless modules that can fit into various objects, enabling network connectivity. Crucially, however, Sierra also invested in its data analytics and cloud computing. These are the elements that can render those connected devices ‘smart’ and where the real game-changing possibilities lie. data analytics and cloud computing are the game changing elements that can render connected devices ‘smart’ Security then becomes a much bigger issue in that scenario, as it is clearly important to protect this myriad of personal devices against tampering, but it is impractical to embed each one with a separate security mechanism. Wedge Networks, based in Calgary, has developed a cloud-based cybersecurity solution that protects the entire network rather than requiring users to download and update software to individual devices. They predict that as the Internet of Things becomes more prevalent, the demand for this type of Security as a Service (SECaaS) will soar to an estimated to $30 billion by 2020. When people discuss how the Internet of Things will transform our homes and lives, they often have a hard time going beyond Nest or Smart Fridges that can tell you when you’ve run out of milk and order some for you. But the more interesting IoT applications probably won’t require you to purchase specific new devices or learn to operate them. They should gradually and seamlessly integrate into existing technologies and incrementally bring the physical and virtual world together. [easy-tweet tweet="The most interesting #IoT will gradually and seamlessly integrate into existing technologies" user="comaprethecloud"] One example of how this can work is the way Eventbase is using bluetooth Beacons. These low-energy flexible devices can be placed on any location and programmed to interact with smartphones and wearables. This meant that by placing one of those beacons inside a plant pot on a table, a conference attendee at last year’s Cannes Lions festival could order a complimentary glass of wine on their Apple Watch, and have it delivered to their exact location in less than a minute. On a similar vein, another coin-sized device named Linquet promises to make losing your valuables a thing of the past, allowing you to attach it to pretty much anything and then program it to alert you whenever an item moves out of a certain range. Again, the implications of building a smart network of such connected and programmable devices brings many tantalising possibilities. The company already suggest that you could attach the device to a pet’s collar, for example, to prevent it wandering off. In future, however, could we make this technology small enough to include in microchips, so that we could locate lost pets automatically without the need for them to be handed into a vet and scanned? Embedded sensors are definitely a trend with many interesting applications, such as the one developed by SoudOfMotion. Their inexpensive $50 sensor that can be mounted into to shoes, limbs, oars, paddles and other equipment to track progress for both amateur and professional athletes. 
By using the Earth’s magnetic pole to measure pivotal motion, it works like a compass to provide much more accurate and reliable and granular data to help people improve their sporting technique.  A sensor attached to a power cable can read the harmonics whenever an appliance is turned on It is easy to see how in future these types of functionalities will be built into devices and automatically connected to networks. One company that demonstrates how that could work is ThinkEco Power, which developed a way to monitor household appliances without the need to install additional tracking devices. A sensor attached to a power cable can read the harmonics whenever an appliance is turned on, a processor reads the signature, and an algorithm deciphers which appliance it relates to. By learning people’s activity patterns, this sort of technology could be used to help prevent elderly and disabled people dying unnecessarily or suffering a stroke, heart attack or a preventable accident in their home. It means a person’s activity – or inactivity – can be monitored, so if regular actions such turning the TV and radio on, using equipment such as a walking assistant, turning on a hearing aid or even boiling the kettle fail to take place, a notification can be sent to a relative, neighbour, or even the emergency services. Further uses could see it detecting fire or CO hazards and turning off the mains to prevent further damage. [easy-tweet tweet="The #IoT is effectively in its infancy, but that makes it all the more exciting - we get to help it grow up!"] This all goes to show the possibilities as the technology scales up, as many of these are effectively prototypes. The Internet of Things is effectively in its infancy, but if the steepness of the mobile growth curve is anything to go by, we should brace for exponential growth, and looking at these early innovators provides a tantalising glimpse at what’s to come. ### What will you do if the cloud bursts? A sudden loss of critical services is a business emergency. Whether a power cut, a crucial delivery that fails to arrive or even a broken heating system, dealing with such a failure becomes the number one corporate priority. [easy-tweet tweet="The #cloud remains one area in which many businesses still don’t have a well-oiled emergency response strategy"] Most businesses are well-versed in how to respond. The generator kicks in. They default to their secondary supplier. But the cloud remains one area in which many businesses still don’t have a well-oiled response strategy. The elephant in the room Global SaaS software revenues are forecasted to reach $106bn in 2016, according to Forrester’s Tech Market forecast. That’s a 21% increase on 2015 spending levels, so it’s clear that businesses are embracing cloud applications at a phenomenal rate. It’s also clear why. Cloud computing offers economies of scale for businesses of all sizes, whether in terms of file storage, network infrastructure, data backup or business analytics. Shared services delivered as a utility over the Internet are scalable, flexible and accessible anywhere - and, consequently, can deliver dramatic efficiencies and cost savings. Cloud computing offers economies of scale for businesses of all sizes But in the rush to embrace and trust SaaS, too many businesses are still too quick to ignore - or at least, fail to thoroughly assess – the risks inherent in transferring responsibility for business-critical applications, data or information architecture, to an external provider. 
How reliable is your SaaS provider, really? Measuring reliability Reliability isn’t a concrete concept, and as such is difficult to quantify. No business will deliberately partner with a SaaS provider that looks set to fail in the next few months – but, as we all know, predicting such business failure is very difficult. It’s not just about whether that SaaS provider will continue to trade successfully – it’s also about whether they will continue to honour their contracts, or continue to provide the same levels of service that they originally sold. Failure to do so isn’t necessarily a malicious action, nor the result of the SaaS provider winding up business or going bankrupt – although these are crucial possibilities to consider. Loss of critical resources, changes to licenses or terms and conditions or, perhaps most damaging of all, a malicious cyberattack leveraged at the SaaS provider, can all have significant knock-on effects on the cloud services delivered to corporate customers. The contingency plan [easy-tweet tweet="What can businesses do to protect themselves from #SaaS provider failure?" user="comparethecloud"] What can businesses do to protect themselves from SaaS provider failure and ensure they don’t slip into a business emergency? Service Level Agreements - SLAs, of course, are a standard part of any SaaS contract. But businesses need to check that they cover a range of scenarios, not just a generic service failure. A comprehensive SLA will include details of what happens in the event of bankruptcy and dissolution as well as service outage. Information security standards - SaaS providers need to prove that they are taking sufficient measures to mitigate against data breaches – whether as a result of malicious cyberattack, internal sabotage or basic human error. And SaaS customers need to demand proof of these measures too. Interim continuity services - These act as a SaaS safety net, guaranteeing access to SaaS applications and data in the event of provider failure. An up-to-date copy of the business’s data and/or production environment is provided, giving it time to switch to an alternative solution without service interruption. These measures are not complex, but they go a long way to ensuring that businesses can enjoy all the benefits of the cloud, without the risk of coming to a standstill. ### The IoT supports endangered sea mammal conservation in the Philippines Fishermen in the Philippines are taking part in a groundbreaking new ‘citizen science’ project to help save the endangered dugong population using the Internet of Things (IoT), smartphones and the cloud. [easy-tweet tweet="#IOT changing the world for the better: @KiiCorp is helping fishermen in the Philippines to track #dugongs"] Known as ‘sea cows’, dugongs are vegetarian sea mammals found in the warm coastal waters from East Africa to Australia, including the Red Sea, Indian Ocean and Pacific. Related to manatees and similar in appearance, dugongs are now classified on the IUCN’s (World Conservation Union) Red List of Threatened Species as ‘vulnerable’. While legally protected, figures suggest the population has declined by 30% in the last six decades, due to entanglement in nets, illegal hunting and loss of habitat. The project, run from the island of Busuanga by local NGO, Community Centred Conservation (C3), has historically relied on expensive helicopters to spot and record dugong sightings in an attempt to generate a database of their numbers and locations. 
Now IoT cloud platform provider, Kii, has stepped in to work on ambitious new conservation work with the local fishermen to help monitor and track the dugong population in the region. Partnering with Smart Earth Network (SEN) and C3 who are managing the project, Kii is providing the cloud platform where fisherman can upload geo-located images of the sea mammals using a smartphone and an Android app. Kii is providing the cloud platform where fisherman can upload geo-located images of the sea mammals Around 30 fishermen are taking part of the initial trial period using smartphones from local mobile provider, Cherry Mobile. Once out at sea, they photograph any dugongs they spot and then upload the data to a central database on the Kii platform, allowing for a more precise and comprehensive picture of the dugong population to be built up, which in turn enables more targeted conservation activities to be put into practice. [gallery link="none" size="medium" ids="35098,35100,35099"] Each image indicates the exact location of each dugong via GPS, allowing C3 to map the sightings and get a specific information like timings of sightings, migration patterns and so on. The plan is to share the data with other conservationists worldwide, and also with the local Council of Development to help C3 lobby the local government. [easy-tweet tweet="This collaborative approach to environmental tracking is one of the first of its kind to use #IOT and #mobile"] The fishermen, many of whom are unable to read or write, are being trained by the C3 project team on how to use the smartphones and provided with local charging facilities. While early days, the project has already been well received by both conservationists and the local community. Chris Poonian at C3, a non profit organisation, says: “Traditionally, we have had to track these amazing sea creatures from the air, which is expensive and unreliable. Using mobile phones to monitor them is an innovative approach. This collaborative project is one of the first of its kind to employ this kind of technology. If successful, IoT and mobile technology could have important applications for surveys of rare species throughout the world.” Simon Hodgkinson, founder of SEN, which provides a platform for conservationists and technologists to share ideas, network and develop innovative solutions, adds: “Technology can provide a level of assistance in conservation that we could only dream of a few years ago. The Internet of Things can be harnessed to massive benefit to the natural world and we are seeking challenges from other conservation groups, which could benefit in the same way. Technology and conservation have to unite to open a new era of environmental activism.” The Internet of Things can be harnessed to massive benefit to the natural world According to Masanari Arai, founder and CEO of Kii: “So often we’re used to seeing our technology delivering benefits for business, but now we’re involved in a truly rewarding conservation project that will deliver benefits for an endangered species.” ### Channel migration part 2: MSP to ISV ISVs are by nature specialists in a segment – software tends either to be functionally or vertically aligned – i.e. for accounts or marketing, or maybe for telcos or retail (other niches exist such as security, but these are becoming increasingly overcrowded). At the same time an ISV’s IP is encapsulated in the software that it sells. 
[easy-tweet tweet="Analyse #Channel migration with @BillMew, from #MSP to #ISV" user="comparethecloud"] Alignment and differentiation: MSP a (the specialist)  v. MSP b (The generalist) Previously we suggested that differentiation in future will increasingly depend on specialisation. Compare the advantages of two different MSPs: MSP A specialises in dealing with legal firms, understands the legal value chain, is familiar with all the main legal software packages, has experience in managing and integrating them, has specialised process models and methodologies, has sales staff and consultants with IP in this area and maybe has even developed its own software modules to integrate with or between other legal packages and platforms. MSP B is more of a generalist. It is great at what it does, but serves a mixed bag of clients from a number of different sectors and industries. Both MSPs have made the transition from the transactional tech world and have successfully established annuity revenues in new areas of operation such as Cloud, Mobile, Security and Cognitive (including big data and analytics) – all the areas touted by IBM to its business partners. MSP A is able to establish a reputation for being the best at what it does – managed services for the legal sector. As its brand and reputation increase it evens achieves a level of eminence in this area where it becomes well-known for providing managed services for the legal sector and prospects from this sector start to beat a path to its door (reducing its cost of sale). MSP B is less focused and finds it more difficult to establish a differentiated brand. With less ability to build IP or replicate services across all its clients as they are in such different businesses, the cost of services this disparate set of clients starts to rise with time. When pitching for new business MSP A focuses on a sector where its chances of success are high, its cost of sale is lower and its ability to add value and therefore charge a higher margin is greater. When pitching for new business MSP B struggles to differentiate itself. On too many occasions its excellent client service reputation gets it as far as the final short list, but it loses out on the final deals to rivals that have greater differentiation or value to offer. A series of lost bids have pushed its cost of sale up significantly. [easy-tweet tweet="#MSP specialists out-compete the generalists as a result of their greater ability to build #IP" user="billmew"] Over time specialists like MSP A start to dominate each niche. They out-compete the generalists as a result of their lower cost of sale, lower cost to service clients, and greater ability to build IP or replicate services. At the same time their reputation for expertise and added value in their niche not only makes marketing easier, but it also allows them to charge a premium for their services. Generalists like MSP B find it increasingly difficult to compete with these specialists. Automation (becoming become more ISV-like) Much of the value that MSP A is able to provide as a specialist is in its IP. Not only is this inherent in the specialist knowledge of its staff, but it can also be applied through replicable solutions, templates or modules that the firm has developed over time. MSP B may well be excellent as what it does, but focus in its IP is too widely spread and impossible to codify and automate in this way. The more that MSP A codifies and automates its IP, the more ISP-like it becomes. 
Eventually it is able to fully automate elements of its IP into mobile apps, or plug-ins and modules that integrate with the applications and processes that it regularly supports. As competition increases this automation enables MSP A to increase its value add and margins further, giving it even greater competitive advantage over generalists like MSP B. This may appear a like a rather theoretical scenario, but the trends are already apparent and pressure already exists to focus in order to increase differentiation and automate in order to increase efficiency. Things aren’t going to change overnight – after all there is still some margin to be made in resale. Nevertheless it is telling that a vendor like IBM (that has been encouraging its partners to migrate from VAR to MSP for years) is refocusing its channel programmes to encourage greater specialisation around key competencies. IBM is refocusing its channel programmes to encourage greater specialisation around key competencies The pace of change isn’t going to let up any time soon and fresh from transitioning into MSPs many firms are already finding it a very crowded market in which any form of real differentiation is hard to achieve. They need to look ahead and start to plan their next migration already by understanding how they can specialise, differentiate and automate in order to become more ISV-like. ### Hershey’s Partners with Infosys to Build Predictive Analytics Capability on Amazon Web Services Infosys, a global leader in consulting, technology and next-generation services, has announced that its Infosys Information Platform (IIP) is now available on Amazon Web Services Marketplace (AWS Marketplace). This will enable businesses to gain robust data insights quickly, while tapping into the flexibility and the lower cost of a cloud-based platform. [easy-tweet tweet="#CloudNews: @Infosys to build #PredictiveAnalytics capability using #OpenSource information platform on #AWS"] Enterprises can now test IIP on AWS free of cost for a one-day trial (AWS usage fees apply), giving them the opportunity to experience in real-time how the solution can operationalise their data. For example, Hershey’s LLC, North America’s largest chocolate manufacturer, recently used IIP on AWS to analyse retail store data. The company wanted to gain valuable, revenue-generating insight faster than a traditional analytics implementation could deliver. Phil Lerro, Senior Director, Enterprise Information Service, Infrastructure and Workplace Solutions, Hershey’s LLC "We needed to establish our Hadoop landscape and extend our analytics and big data capabilities quickly. Partnering with Infosys, we had the landscape up and the data lake seeded for our analysts in less than a week. Using the Infosys Information Platform on AWS accelerated our deployment by weeks, if not months." Dave McCann, Vice President, AWS Marketplace, Amazon Web Services, Inc. “At AWS Marketplace, we are always looking to offer innovative solutions that help our customers harness the power of the AWS Cloud. By making IIP available in AWS Marketplace, Infosys is reducing the cost and complexity of gaining access to powerful, enterprise-class open-source technologies that enable data-intensive enterprises to innovate efficiently for their needs today and in the future. This solution can be deployed in minutes and tested, then purchased.” Abdul Razack, Senior Vice President and Head of Platforms, Infosys “IIP on AWS is helping our clients realize the real promise of big data analytics faster. 
Over 200 clients have already experienced the power of IIP on AWS and seen that speed-to-insight does not have to come at the cost of expensive, inflexible proprietary approaches.” Additional examples of IIP on AWS include: A mining operator uses the real-time data analytics of IIP on AWS – ingesting and processing 27,000 messages per second – to predict which of its vehicles is about to fail, enabling repairs to be scheduled before downtime occurs. An office automation vendor used the machine learning and in-memory computing capabilities of IIP on AWS to analyse more than two million records of previous purchases and identify customers with a propensity to purchase particular products and services; the complete set of predictions was produced in seven seconds. A freight rail operator wanted to reduce unnecessary braking events, which resulted in operating delays. Infosys used the R programming language and IIP on AWS to analyse an expansive set of operational metrics and create a delay event prediction model. This helped the company adjust its Positive Train Control system to improve on-time performance. IIP on AWS is designed to foster such rapid adoption of end-to-end, scalable big data processing and analytics. As an open source platform, IIP carries no vendor lock-in clause, allowing projects to adapt easily and take advantage of new and emerging open source technologies. IIP takes leading open source components and adds value by wrapping technology enablers, accelerators and services around them. Infosys also helps customise, integrate and implement the IIP platform alongside clients’ existing enterprise applications, and provides ongoing support for the open source components. Infosys has deployed 29 full production IIP projects. In a market where data scientists are in high demand and short supply, Infosys offers access to more than 200 experienced data scientists in its analytics practice. To learn more about Infosys IIP, please visit https://www.infosys.com/iip. ### SentinelOne Introduces First Next Generation Endpoint Protection Built for Linux Servers SentinelOne, the company that’s transforming endpoint security by delivering real-time protection powered by machine learning and intelligent automation, has announced a powerful new solution aimed at protecting enterprise data centres and cloud providers from emerging threats that target Linux servers. [easy-tweet tweet="#CloudNews: @SentinelSec announces end-point security for #Linux servers" user="comparethecloud"] Today’s internet is largely powered by Linux servers, many of which have become the target of attackers looking to utilise this vast pool of resources for much larger and more aggressive campaigns. Traditionally these have consisted of DDoS attacks, but more recently attackers are increasingly using these compromised resources to distribute malware to other systems outside the affected company. When it comes to protection, Linux systems suffer from the same shortcomings inherent in traditional antivirus software, which relies on static signatures for detection of threats and provides no means to detect the thousands of new threats that emerge daily. “As we have seen, Linux endpoints, whether they are servers or other devices, are not immune to malware and other forms of attack,” said Tomer Weingarten, CEO of SentinelOne. 
“To address this new threat plane, SentinelOne EPP now provides the same exceptional level of integrated threat detection, prevention and remediation for Linux machines as it does for Windows and OS X devices.” To detect and block even the most sophisticated threats and zero-day attacks, SentinelOne uses a lightweight autonomous agent to monitor all activity in both kernel and user space (including files, processes, memory, registry, network, etc.) on the protected device. Each agent leverages the SentinelOne Dynamic Behavior Tracking (DBT) Engine, which uses sophisticated machine learning to predict threats across any vector against a full context of normal application behaviour. Once malicious activity is detected, SentinelOne immediately employs a series of automated mitigation and quarantine processes to eliminate the threat in real time. SentinelOne also maintains a detailed audit trail of activity for forensic analysis and reporting, which is delivered to the management console in real time. SentinelOne is the only vendor that enables organisations to both deploy next-generation endpoint protection and replace antivirus while ensuring that industry and government regulatory requirements are met. Unlike other next-generation endpoint security products, SentinelOne EPP is certified by AV-TEST to meet regulatory requirements for antivirus protection on both OS X and Windows machines, with Linux certification under way. ### Is Big Data the problem? No – but data copies are Big Data. It's the buzzword of the moment. And it’s got everyone talking. The Big Data trend has the potential to revolutionise the IT industry by offering businesses new insight into the data they previously ignored. For many, it is seen as the Holy Grail for businesses today. For organisations, it’s the route towards better understanding exactly what their customers want – and allows them to respond appropriately. [easy-tweet tweet="#BigData is more than just a #tech buzzword, it has the potential to revolutionise the #IT Industry"] In an age where Big Data is the mantra and terabytes quickly become petabytes, the surge in data quantities is causing the complexity and cost of data management to skyrocket. At the current rate, by 2016 the world will be producing more digital information than it can store. Just look at that mismatch between data and storage. For reference, one zettabyte would fill the storage on 34 billion smartphones. by 2016 the world will be producing more digital information than it can store The challenge The problem of overwhelming data quantity exists because of the proliferation of multiple physical data copies. IDC estimates that 60% of what is stored in data centres is actually copy data – multiple copies of the same thing or out-dated versions. The vast majority of this stored data consists of extra copies of production data created every day by disparate data protection and management tools such as backup, disaster recovery, development, testing and analytics. [easy-tweet tweet="An estimated 60% of the #data we currently store is useless copy data" user="comparethecloud"] IDC estimates that up to 120 copies of a given piece of production data can be in circulation within a single company, and that the cost of managing this flood of data copies has reached $44 billion worldwide. Tackling data bloating While many IT experts are focused on how to deal with the mountains of data that are produced by this intentional and unintentional copying, far fewer are addressing the root cause of data bloating. 
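To put those copy-data figures in context, here is a rough, back-of-the-envelope illustration. The data volumes, copy counts and cost per terabyte below are purely hypothetical assumptions – not IDC's figures – but they show how quickly physical copies, rather than the production data itself, come to dominate storage bills.

```python
# Rough, illustrative arithmetic only - the copy counts and cost per TB below are
# hypothetical assumptions, not figures from IDC or from this article.
production_tb = 50                 # primary production data set (TB)
copies_per_tool = {                # physical copies typically kept by each tool
    "backup": 7,                   # e.g. weekly full backups retained
    "disaster_recovery": 1,
    "dev_test": 3,
    "analytics": 1,
}
cost_per_tb_year = 300             # assumed fully loaded storage cost ($/TB/year)

copy_tb = production_tb * sum(copies_per_tool.values())
total_tb = production_tb + copy_tb

print(f"Production data: {production_tb} TB")
print(f"Copy data:       {copy_tb} TB ({copy_tb / total_tb:.0%} of everything stored)")
print(f"Annual storage cost of the copies alone: ${copy_tb * cost_per_tb_year:,.0f}")
```

Real environments sit anywhere between IDC's 60% average and the 120-copy extremes, but the multiplier effect is the point: every tool that keeps its own full physical copy compounds the problem that the 'golden master' approach described below is designed to remove.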
In the same way that prevention is better than cure, reducing this weed-like data proliferation should be a priority for all businesses. why aren't we addressing the root cause of data bloating? The ‘golden master’ Data virtualisation – freeing organisations’ data from their legacy physical infrastructure, just as virtualisation did for servers a decade ago – is increasingly seen as the way forward. In practice, copy data virtualisation reduces storage costs by 80%. At the same time, it makes virtual copies of ‘production quality’ data available immediately to everyone in the business, anywhere they need it. Data virtualisation – freeing organisations’ data from their legacy physical infrastructure That includes regulators, product designers, test and development teams, back-up administrators, finance departments, data-analytics teams, and marketing and sales departments. In fact, any department or individual who might need to work with company data can access and use a full, virtualised data set. This is what true agility means for developers and innovators. Moreover, network strain is eliminated. IT staff – traditionally dedicated to managing the data – can be refocused on more meaningful tasks for growing the business. Data management licences are reduced, as there is no longer a need for back-up agents, separate de-duplication software and WAN (wide area network) optimisation tools. [easy-tweet tweet="By eliminating physical copy #data and working off a golden master our storage problems are over"] By eliminating physical copy data and working off a ‘golden master’, storage capacity is reduced – and along with it, all the attendant management and infrastructure overheads. The net result is a more streamlined organisation, driving innovation and improved competitiveness for the business faster. ### EMC Announces the EMC Provider Cloud System At Mobile World Congress, EMC Corporation announced the new EMC Provider Cloud System (PCS), a software-defined Network Functions Virtualization Infrastructure (NFVI) reference architecture designed to enable communications service providers (CSPs) to deliver distributed, multi-service, carrier-grade cloud solutions quickly and economically. [easy-tweet tweet="#CloudNews: @EMC announces their new EMC Provider Cloud System" user="comparethecloud" hashtags="cloud"] In today’s communications service provider market, advanced software and Internet and mobile connectivity are combining to lower barriers to entry, enabling new entrants to offer traditional telecom services without having to make the large capital investments that have historically been required. At the same time, many providers are finding ways to bundle telecom services with newer services like streaming media and public cloud hosting to generate incremental revenue and increase customer loyalty. To compete in this new world, communications service providers need a platform enabling them to rapidly build and deliver new services with a minimum of operating expense and maximum speed. EMC Provider Cloud System delivers carrier-grade availability and reliability, blending automation, programmability, and predictive analytics to support next-generation workloads. EMC’s PCS reference architecture is based on Network Functions Virtualization (NFV) industry standards and is designed to leverage technologies from VMware. 
Comprised of an NFVI layer, advanced data and security services, and a management and orchestration layer, the EMC PCS makes it possible to deliver traditional and next-generation telecommunication services, and to use Big Data and analytics for communications service providers to optimize operations and create new revenue opportunities. Using virtualized platforms such as that prescribed in the EMC PCS reference architecture, research shows that communication service providers can achieve more than 60% lower TCO over non-virtualized approaches. Scalability and Carrier Grade Availability Underpinning EMC PCS is a modular NFVI stack that can scale from a single node to large distributed environments. This scalability is enabled by software that can cluster server, storage and networking into a platform that hosts virtual network functions (VNFs). VNFs include traditional telecommunications services, such as the software that implements cellular networks, co-existing side-by-side with nontraditional services, such as cloud hosting and media and content delivery. Carrier-grade availability is provided by management and orchestration software that helps ensure resiliency and six nines availability across the entire environment and by EMC’s industry-leading disaster recovery and data protection technology. EMC’s Provider Cloud System delivers: Choice: Providing flexibility with standard X86 hardware and integration with open standards and Application Programming Interface (API) instrumentation at all layers. Agility: Management and orchestration software enables the provisioning of different NFVI personalities to support a variety of VNF workloads. Intelligence: Big & Fast Data capabilities combine real-time intelligence with Data Lake technology to optimize network operations and unlock new use cases. Automation: Advanced software services provide the carrier-grade data protection and service assurance features that operators require. [quote_box_center]David Hudson, General Manager of Telecom Transformation, EMC Corporation “Massive changes in communications, collaboration, mobility and content present today's communication service providers with unprecedented opportunity. To help customers thrive during this period of massive transformation, EMC is delivering scalability, carrier-grade reliability, and choice with the Provider Cloud System reference architecture. [/quote_box_center] Partner Ecosystem With the ability to choose software components that fit their needs, communications service providers can dramatically increase the flexibility and agility of their business.  Service providers will be able to offer new value-added services on a unified, virtualized infrastructure while shortening the introduction of those services. EMC supports a thriving community of best-in-class VNF and mobile network solutions providers, including Affirmed Networks, Cellwize, and Versa Networks, each of whom has recently validated their offerings in connection with EMC’s PCS reference architecture. EMC is providing a platform on which partners can build services from common blueprints, and collaborate together using DevOps best practices to build services in completely new ways. [quote_box_center]David Wright, Vice President of Operations, Telecom NFV Group, VMware “The combination of EMC's Provider Cloud System with VMware’s production-proven NFVI brings together technology from the leaders in virtualization and cloud into a carrier-grade offering for communication service providers of all sizes. 
Whether starting with small edge deployments or building out at data center scale, operators get the benefits of customized, pre-integrated hardware and automated management and orchestration software from EMC, with the industry's most trusted and widely deployed network virtualization platform from VMware.”[/quote_box_center] ### 2016: The year of divide and conquer in the cloud Every year since 2008, we’ve been told that we’re in the year of the cloud. I’m both amused and frustrated by the discussions on the cloud at times, especially when they fail to consider the realities of adoption. [easy-tweet tweet="Every year since 2008 has been Year Of The #Cloud, are we done with that yet? " user="comparethecloud"] I do not believe the cloud should be compared to a four letter word, far from it, but it can’t be the best at everything, for every business requirement. According to Rightscale, 82 percent of enterprises have a multi-cloud strategy, however, most predictions and statistics fail to point out the amount of a business's operations choosing to adopt cloud technology. I’m certain that 82 percent aren’t simply migrating everything to the cloud. Could some of these organisations just be adopting cloud at the departmental level? A handful of employees might have just found an effective cloud-based tool that assisted with a particular task. I agree that cloud adoption is rapidly on the rise. Growth in demand is evident, and according to IDC’s Dan Vesset spending on cloud technology will grow 4.5 times faster than for on-premises equivalents in 2016. I also believe 2016 will be a year when enterprises closely examine the data and applications they are using in the cloud, and the business implications. Migration to the cloud impacts regulatory compliance, operational performance, and systems integration. As such, I expect adoption decisions to be more nuanced in 2016. Enterprises will look at what they can shift to the cloud, what might work best in a hybrid environment and the tools that may be better placed on-premises. Migration to the cloud impacts regulatory compliance, operational performance, and systems integration Consequently, 2016 will be the year of data classification and data management. One major reason for this has been the growth of concern around security. In just one week in October 2015, we saw three significant data breaches in the UK alone: TalkTalk, British Gas and Marks & Spencer. On top of this, just last month the European Union announced that it had agreed on a proposal for a Network and Information Security directive, aiming to better protect European citizens’ data. This follows the ruling that the Safe Harbour agreement, the means by which data was transferred between Europe and the United States, was invalid and in breach of European law – and is being replaced by a more rigorous framework known as the EU-U.S. Privacy Shield. The result has been an increased focus amongst the business community on information, how it is being stored and the safest means to transfer it. [easy-tweet tweet="#Data classification and management are becoming increasingly important when deciding on #cloud solutions"] In this environment, data classification and management are becoming increasingly important when deciding between public cloud, on-premises and hybrid deployments when determining where to store and distribute company data. 
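To make the 'divide and conquer' idea concrete, the sketch below shows the kind of simple, rule-based classification an organisation might start from when deciding where each data set should live. It is a minimal illustration only – the attributes, thresholds and placement rules are assumptions made for the example, not an industry standard.

```python
from dataclasses import dataclass

@dataclass
class Dataset:
    name: str
    contains_personal_data: bool   # in scope for data protection rules?
    regulatory_residency: bool     # must stay in a specific jurisdiction / in-house?
    latency_budget_ms: int         # worst latency the consuming systems can tolerate
    needs_elastic_scale: bool      # bursty or unpredictable demand?

def place(ds: Dataset) -> str:
    """Very rough placement rule: regulated or latency-critical data stays close,
    bursty non-sensitive workloads go to public cloud, everything else goes hybrid."""
    if ds.regulatory_residency:
        return "on-premises"
    if ds.latency_budget_ms < 5:
        return "on-premises"
    if ds.needs_elastic_scale and not ds.contains_personal_data:
        return "public cloud"
    return "hybrid"

for ds in [
    Dataset("payroll records", True, True, 100, False),
    Dataset("web analytics events", False, False, 200, True),
    Dataset("customer CRM", True, False, 50, False),
]:
    print(f"{ds.name}: {place(ds)}")
```

A real classification exercise would of course involve far more criteria (and people), but even a crude first pass like this tends to surface the compliance and compatibility issues discussed above before they become migration surprises.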
Interestingly, in research examining the most influential factors for enterprises considering data transfer, key cloud benefits and selling points such as any time access to data are cited as less important than issues of compliance and software integration. The research highlights demand for clarity of data over the productivity benefits of the cloud. Making a decision on how to house data, without clear classification based on thorough considerations to the business implications may mean future issues or vulnerabilities. An effective cloud or hybrid migration must include an examination of the types of data different departments of an enterprise are holding. What data is used for, who needs access to it, how much control they are required to have over it, and how much latency is tolerable are all critical factors. Classifying data in this way, putting it into boxes, will in many cases reveal issues with moving entire enterprises to the cloud. These could include anything as complex as compliance, to something as simple as compatibility issues. [easy-tweet tweet="Putting #data into boxes, will in many cases reveal issues with moving entire enterprises to the #cloud" user="comparethecloud"] Ultimately, choosing an approach to managing data requires a divide and conquer approach. The decision is no longer about which is more secure. Cloud, on-premises and hybrid approaches each have their own benefits and require an evaluation of business needs. On-premises models offer the additional component of control over an organisation’s network, cloud models can be quickly deployed and scaled fast to meet changing needs, often making them more cost efficient as a result, and hybrid models combine the best of both these models. In short, I expect 2016 to be the year organisations get to grips with the cloud vs. on-premises debate and choose to divide and conquer, producing bespoke solutions based on their unique needs. ### Channel migration part 1: VAR to MSP Migration is not something that we tend to do willingly. Despite the hardships that they’ve faced, the majority of Syrians initially chose to stay and fight or to hold out and get by as best they could. The worsening situation has forced many to conclude that joining the mass migration is now their only hope. Unfortunately by the time things get this bad, many will have lost everything. [easy-tweet tweet="Take a look at #Migration from #VAR to #MSP with @BillMew" user="comparethecloud"] The hardships and devastation in Syria may be on a different scale to that in the tech industry and in the IT channel in particular, but we’re seeing similar patterns of behaviour and a reluctance to migrate, whatever the hardships faced. Many Value Added Resellers (VARs) have made a good living from resale, integration, installation and upgrades for many years and are doing well from the wave of companies moving to the cloud at present. In almost all of these areas, however, there are worse times ahead. Resale is already in decline, integration is becoming all about APIs, upgrades are managed remotely in the cloud and the move to the cloud is nothing but a one off transition. Resale is already in decline, integration is becoming all about APIs Most of us have seen these trends coming for some time now and vendors have been doing all they can to help their VARs make the transition to becoming MSPs, while at the same time trying to recruit new MSPs to replace the elements of their channel that won’t survive the migration. 
For some time now vendors such as IBM (whose annual channel event, the PartnerWorld Leadership Conference (PWLC), we attended last week) have been providing assistance with varying levels of training, strategy guidance and financial support to help VARs make this transition. We have discussed the challenges of going from a transactional to an annuity-oriented structure, and of spanning the chasm in between, in many blogs on this site, and I don’t propose to delve into the intricacies again in this blog other than to emphasise how hard the transition can be. [easy-tweet tweet="The problem remains the lack of urgency among many in the #channel says @BillMew" user="comparethecloud"] The problem remains the lack of urgency among many in the channel. Some have made the transition, but many are either finding it too daunting or have made only hesitant progress in building new cloud-based annuity revenue streams. If they are not careful, they risk finding that the transactional revenue streams they will need to fund the transition start to fall off dramatically long before the new revenue streams are in a position to carry the weight. They risk being left with nothing. A further problem is that the move from VAR to MSP is just the first step on the path that many will need to make. Having called on its channel to make the VAR to MSP transition for the last five years or so, IBM announced at this year’s PWLC that it was restructuring its channel programme to focus on the next step that its channel is going to need to make. Unfortunately, becoming an MSP and changing your business from a transactional to an annuity-oriented structure won’t be enough on its own to guarantee long-term survival. IBM’s partners are also being asked to focus on a set of competencies that will enable them to differentiate themselves in a market that is increasingly awash with competing MSPs. Many IBM partners have done all they can to comply with the company's call to transform their practices in line with IBM's strategy – away from its traditional business towards its new priorities – Cloud, Mobile, Security and Cognitive (which includes big data and analytics). While IBM’s revenues as a whole have declined in recent quarters, its channel revenues have risen, but many partners are still finding the pain of transition greater than the benefits. At PWLC, Marc Dupaquier (@marcdupa) heralded the advent of the era of cognitive computing and the maturation of IBM's Watson platform as the partners' opportunity for growth, before presenting them with an outline of a revamped PartnerWorld program. Key among the changes is a set of competencies being introduced by IBM that will provide certification and support for MSPs that specialise in particular niches – such as industry segments. [easy-tweet tweet="At #IBMPWLC @Marcdupa heralded the advent of the era of #cognitive and the maturation of #Watson"] Historically, VARs have grown opportunistically, taking whatever business they could from whatever quarter it came. Consequently, many VARs have a mixed bag of clients from a number of different sectors and industries. They are naturally loath to give up any of these clients. The reality is that differentiation in future will increasingly depend on specialisation. 
For example, imagine an MSP that specialises in dealing with legal firms, that understands the legal value chain, that is familiar with all the main legal software packages, that has experience in managing and integrating them, that has specialised process models and methodologies, that has sales staff and consultants with IP in this area and that may even have developed its own software modules to integrate with or between other legal packages and platforms. A specialist of this kind should not only be able to win business from a generalist rival, but should also be able to charge a premium, while at the same time reducing its costs (as its solutions are largely replicable between clients). The reality is that differentiation in future will increasingly depend on specialisation Most ISVs are already in a position where they are specialised in a segment (either an industry one or otherwise). As MSPs align themselves in this way, and as they seek to automate elements of their IP into apps, templates or modules, they will gradually become more ISV-like. The next part of the channel migration is therefore from MSP to ISV, which we’ll explore in the next part of this blog. ### Most successful "mobile-first" sectors revealed UK mobile market share rises to 60% [easy-tweet tweet="#CloudNews: The UK is a leading #mobile first country says new report from @SimilarWeb" user="comparethecloud"] The UK has emerged as a leading “mobile-first” country with 60% of all websites now viewed on smartphones or tablets, a major report into online mobile habits has revealed. In the report, “How The UK Surfs the Mobile Web”, global cross-device market intelligence company SimilarWeb analysed the top 10,000 websites visited in the UK across 24 categories, from sports to shopping. It found that 59.6% of website traffic came from mobile devices in 2015, a rise of 13.4% on the previous year. The study finds a number of sectors are leading the way in their mobile-first strategy, with pet and animal sites leading the field with 73% of UK visitors arriving via a mobile rather than a desktop. In this sector, Pets4Homes, which allows pet owners to put puppies and cats up for sale, secured an 81% mobile share, followed by Pets at Home (77% mobile share), through strong visual images of pets and animals. Other leading mobile-first UK players are health sites (73% mobile share), driven by on-the-go medical queries, not least for the NHS, which secured 289.3 million visits, 79% of which came via mobile. The least likely sites to be visited via mobile, the report finds, are computer and electronics sites (with only a 36.95% mobile share). Mobile engagement lower in the UK, except for booming literature-sharing sites The shift to mobile, however, represents a challenge for companies in securing visitor time on site and pages viewed. On average, UK visitors spent 8 minutes 49 seconds visiting desktop sites, but this dropped to 5 minutes 59 seconds for the same sites viewed on mobile phones or tablets. The average number of pages viewed also dropped, from 6.7 on desktop to 5.24 when users are on mobile. The most successful sites in keeping UK mobile users on site are booming books and literature communities – achieving an average mobile visit of 9 minutes 28 seconds per session. Leading literature community quotev.com, with 27.9 million mobile visits last year, registered a 73% mobile share compared to its desktop visits, and an average session lasting 24 minutes 52 seconds.  
Meanwhile, Wattpad, another community linking readers and writers, saw 27.4 million mobile visits, with an average mobile session lasting 18 minutes and 23 seconds. The least successful sector in engaging mobile visitors is science websites, with the average mobile visit lasting only 1 minute and 40 seconds. Joel Zand, SimilarWeb Digital Insights Analyst, said: "We live in a multi-device world where the Internet is accessible almost everywhere we go. In the UK, most of the population uses a mobile device to access the web. "For marketers, this behaviour can have quite an impact on how to reach an audience. Mobile web engagement is highly correlated to differences in industries. "Benchmarking your own website’s performance against the industry, and understanding where you stand in relation to your competitors, can help you build a better experience for your customers, and keep them coming back to your site.” ### Wearables in the arts: ready to take centre stage? The buzz around wearable tech just keeps getting louder. Forecasts show adoption growing at a compound annual rate of 35% over the next five years, with anywhere from 148 million to 200 million units shipped by 2019. [easy-tweet tweet="Could the #Arts be the sticking point for #IoT #wearables?" user="comparethecloud"] So what does that mean for the arts? A recent study by Carnegie Mellon University in the US looked at how wearables are being used by the creative industries today, and considered how they’re likely to be used in the not-too-distant future. Many of the predictions gel pretty neatly with the kinds of queries we’re taking each week from arts organisations about the possibilities of wearable tech. Take smart glasses, for example. Google Glass may currently be on hiatus, but in the arts industry smart specs are still generating loads of interest. Glass was used for media and entertainment purposes 54% of the time in field trials, so the arts could well be the place where smart glasses find their most enthusiastic early adopters. in the arts industry smart specs are still generating loads of interest From our perspective they hold the promise of accelerating the shift towards a truly digital box office. Imagine if you could equip front-of-house staff with smart glasses at multiple points in the foyer to cut CoBO queues, and use facial recognition technology, barcodes or QR codes to make checking in really simple. It could be a complete game changer. The first brave theatre team just needs to take the leap. Look to other sectors and it’s already happening. Virgin Atlantic recently experimented with Google Glass as an alternative to airport check-in desks. It allowed staff to access all the information they needed about a passenger’s booking, freeing them up to provide more attentive service. The response from customers was overwhelmingly positive. [easy-tweet tweet="Imagine a #CRM app for #smartglasses which discreetly displays sophisticated #customerinsights"] The next step would be to imagine a CRM app for smart glasses which discreetly displays sophisticated customer insights (e.g. donor status, genre preferences, frequency of attendance) right in front of your FoH staff’s eyes. It could spur the development of new techniques for improving service and deepen the relationship with customers. Glasses aren’t the only wearable format of course. What if you could have real-time ticket sale data, ROI reports and email marketing stats delivered to a screen on your wrist? 
Salesforce, the cloud based CRM platform is already making this possible by making its enterprise software accessible via an app for the Apple Watch. It allows users to access data, request reports by voice and respond to notifications with a few taps on the wrist. With more enterprise apps on the horizon for Apple Watch, the idea of using smartwatches to improve the way we work in the arts is intriguing. the idea of using smartwatches to improve the way we work in the arts is intriguing We’re already using cloud apps on our devices to stay on top of emails when we’re out and about so the transition to doing the same thing on a small wearable device doesn’t seem so farfetched. Salesforce says that its app “bridges the last mile between insight and action”. Extend that idea to the box office and it’s not hard to imagine a CRM app on your watch could help marketing, FoH and fundraising teams react quickly to notifications and demand signals as they happen in real time. Exciting? We think so. Notifications would alert you when something important happens,for example when you sell out a show or cross a milestone in your fundraising goals. You’d have key stats enabling you to react faster to data in real time; for example, the number of open opportunities in your fundraising pipeline. Access to stats could be far easier with a single swipe or a voice command. Of course we’ve thought about our own software and how it might work on a smartwatch. Perhaps like a health tracker it could nudge users towards better behaviour through regular updates on marketing campaign metrics, perhaps congratulating you when you achieve a higher email open rate than usual, or prompting you to make a course correction quickly when something doesn’t go quite as planned. Used in tandem with iBeacon technology, you could market extra services or products to customers, or ask them for feedback on a specific aspect of the venue or service, based on where they are in or around your building. [easy-tweet tweet="Using #SmartWatches is potentially more secure than radio comms in a theatre setting" hashtags="iot"] Front of house teams could use smart watches instead of a radio when dealing with customer issues and potentially broadcasting sensitive information, for example VIPs, major donors or people with access needs. It might also be useful for venue management by pushing notifications to your watch to let you know how seats are filling up in the auditorium, for instance. Your Google Glass-wearing digital box office team could, meanwhile, make check-in even easier by helping customers ‘tap in’ when they arrive at the venue.  When it comes to actually making this a reality, software suppliers who want to adapt their arts CRM and box office software for wearable devices would need a reliable cloud infrastructure first. Making the most of these opportunities will require a massive shift in perception and changes to infrastructure. Bandwidth in general needs to increase and arts venues need to invest in reliable in-house WiFi – none of which comes cheap. Even if we can get the IT systems in the arts sector up to speed, adoption of wearable technology still has a long way to go Even if we can get the IT systems in the arts sector up to speed, adoption of wearable technology still has a long way to go. But it’s important that arts organisations begin to prepare now for a sudden shift in acceptance. 
A recent incident on Broadway shows just how much theatregoers’ behaviour has already been altered by the arrival of new, personal technology. Looking further ahead, what if a brain-sensing wearable could allow gallery goers to get inside the minds of curators to create a brain-based dialogue on new installations? Ready or not, it’s coming. ### #InfoSec Tweet Chat, March 4th 2016 #Infosec Tweet Chat 4pmCET | 3pmGMT March 4th 2016 Join us on twitter at #InfoSec to discuss how your cloud business is changing, and what your challenges and aspirations are for security. [scribble src="/event/1870061" /] ### Ahead in the Cloud: harnessing the power of Threat Intelligence Cybercriminals are experts at finding the chinks in your organisation’s armour. For most businesses, the weak points lie in the grey spaces between security devices. [easy-tweet tweet="Security vendors have long told us we can solve every emerging #security challenge with their latest Magic Box"] Security vendors have long told us we can solve every emerging security challenge with their latest Magic BoxTM, but the reality is these solutions often don’t communicate with each other, creating silos that leave holes for cybercriminals to exploit. Similarly, these silos of security devices have left many enterprises lacking the visibility they need to spot any potentially malicious behaviour happening across their IT estates. In trying to solve this challenge and harness more insights from devices, a tsunami of data has been unleashed that’s made things even more complex. The deluge of data The volume of alerts, alarms and threat feeds is increasing exponentially, creating a sea of data. Hidden within this ocean are the currents that reveal the threats organisations need to worry about, but they’re almost impossible to find. Meanwhile, cybercriminals can exploit these murky waters to continue business as usual. cybercriminals can exploit these murky waters Cutting through this confusion to harness the power of Threat Intelligence is far from easy. Expert people, robust processes and advanced Big Data analytics must all combine to deliver useful, contextualised Intelligence in real-time. For instance, to filter information and uncover the patterns that build Threat Intelligence requires hugely skilled people – at a time when every business is facing up to a global cyber skills shortage. There aren’t enough experts to go around and, even if you can find talented people, they won’t come cheap and are hard to keep. To create Threat Intelligence, organisations need not just the right people, but the right processes and technologies. But even this combination is not enough. Insights also need to be rapid, cost-effective and easy-to-consume. Trying to go it alone and build capabilities in-house is a risky strategy; enormous investments are required, with no guarantee of success. Clouding up So, should organisations give up on the dream of Threat Intelligence enabling better informed decisions, more focused security spending and pre-emptive defences against the world’s most dangerous and relevant threats? No. There is another way: cloud. should organisations give up on the dream of Threat Intelligence? No. There is another way: cloud Threat Intelligence and the cloud are, appropriately, a marriage made in heaven. Individual organisations may be unable to justify the construction of Threat Intelligence capabilities, but specialist providers of security services can, with the cloud providing the perfect delivery system. 
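As a minimal illustration of that idea, the sketch below matches internal telemetry against indicators from a shared threat feed – the essence of turning raw logs into contextual intelligence. The indicator values, log records and field names are invented for the example; a real cloud-delivered service would of course work across vastly larger data sets and many more sources.

```python
# Minimal sketch of combining internal telemetry with an external threat feed.
# The indicators and log records below are invented for illustration only.
threat_feed = {                      # indicator -> context from a shared feed
    "203.0.113.45": "known botnet command-and-control",
    "198.51.100.7": "credential-stuffing source",
}

internal_events = [
    {"host": "web-01", "dest_ip": "203.0.113.45", "bytes_out": 48_000},
    {"host": "db-02",  "dest_ip": "192.0.2.10",   "bytes_out": 1_200},
    {"host": "web-03", "dest_ip": "198.51.100.7", "bytes_out": 310},
]

# Enrichment: a match against shared intelligence turns a lone log line into context.
alerts = [
    {**event, "reason": threat_feed[event["dest_ip"]]}
    for event in internal_events
    if event["dest_ip"] in threat_feed
]

for alert in alerts:
    print(f"{alert['host']} contacted {alert['dest_ip']} ({alert['reason']})")
```

The value, as described above, comes less from the matching itself than from the breadth of the feed: the more organisations and global sources contributing, the more clearly common threats stand out from the noise.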
A cloud platform purpose-built to power Threat Intelligence services gives organisations swift and simple access to cutting-edge data analytics, expert people and robust processes – delivering a richer, more cost-effective solution. Cloud-based solutions can also be extremely agile in changing functionality, as well as providing the scalability and service levels to rapidly adapt to different needs. It’s raining insights [easy-tweet tweet="The #cloud’s greatest advantage is that it connects billions of pieces of disparate information"] The cloud’s greatest advantage is that it makes it far easier to connect billions of pieces of disparate information in a secure and scalable environment. Add in big data analytics and some data scientists to scrutinise the data and you now have the capability to tally security-related information from across an organisation’s IT estate with external sources, such as threat feeds and derive Threat Intelligence in a way that is meaningful to that organisation. By bringing together all this information, a cloud platform also has the capacity to ‘normalise’ data to determine what’s good and bad much more efficiently and effectively. Typically, today’s enterprises work with small data sets gleaned from a limited number of internal devices, whereas a vast pool of internal and external data stored safely and securely on a cloud platform allows the normalisation process to be much more accurate and granular. To put it another way, working from a small internal data set means that most incidents seem like zero-day attacks. In contrast, when using information from thousands of similar businesses and global information feeds, common threats emerge much more clearly – as do the actions needed to mitigate them. Businesses today require answers – not more questions. The cloud offers the ability to crunch huge volumes of data into consumable, contextual Intelligence that’s relevant to securing and protecting individual organisations. Thus Threat Intelligence can become a seamless service: taking information from within an organisation, combining it with global threat data, extracting relevant insights, and then delivering actionable advice back into the end business. [easy-tweet tweet="#Cloud-based approaches make Threat Intelligence agile, useable, cost-effective " user="comparethecloud"] Cloud-based approaches make Threat Intelligence agile, useable, cost-effective and, most importantly, hugely successful at fighting back against cybercriminals. When it comes to IT security, it’s now possible to keep one step ahead in the cloud. ### Choosing cloud over in-house – how retailers make the choice Since the mid-nineties what we now know as ‘cloud’ technology has been talked about in various different terms –as hosted software, Software as a Service (SaaS) and even as managed services. Now the term ‘cloud’ is commonplace. [easy-tweet tweet="For retail organisations considering #cloud systems, the main benefit is not needing to rely on fixed #servers"] The NIST (National Institute of Standards and Technology) definition of Cloud Computing describes it as “a model for enabling ubiquitous, convenient, on-demand network access to a shared pool of configurable computing resources (e.g. 
networks, servers, storage, applications and services) that can be rapidly provisioned and released with minimal management effort or service provider interaction.” Compelling reasons for the cloud For retail organisations considering cloud systems, the main benefits are that they don’t need to rely on fixed servers, they get real time access to data, scalability and simple software upgrades. All compelling reasons for choosing this option, since most retailers wish to focus on their core business and don’t want the hassle of worrying about IT systems. Many software houses have developed purely cloud based solutions which offer all of the benefits outlined above to retailers and also are fail safe, having implemented back- up service for the rare time that there is a failure. Keeping it local Although these benefits are widely acknowledged within the IT industry, for retailers it is important to consider that cloud solutions are a continual commitment, rather than a one-off IT investment. With cloud there are monthly charges whether for the application, the data or both. Implementing a local server, however, means an initial upfront cost, which, depending on the size of the business, often sees a payback within 18 months. for retailers it is important to consider that cloud solutions are a continual commitment Access to data is a key point to consider when deciding whether to go in-house or cloud. Although we now have the benefit of faster internet connection with leased lines, there will always be a restriction to the speed of the connection, whereas with local servers, speed is not an issue. For many retailers, a local server based infrastructure is still the best option. This may be due to unreliable ‘internet’ connectivity, security issues, questions over control of the data, or that it is easier to utilise integration tools when all software is stored locally. Remaining flexible for business As a well-established supplier to the retail sector for over 25 years, at Eurostop we are fully aware of the various requirements of our customers, which is why we recommend keeping options open. What is important is that a retailer is able to choose either of these methods and to adjust as the business requirements change. [easy-tweet tweet="A retailer might start with a #hosted or #cloud platform as they may have limited technical skills says @eurostop"] For one store or many – start with what works best As a single store retailer a retailer might start with a hosted or cloud platform as they may have limited technical skills or resource to manage a server, backups and upgrades. As the business grows to include a number of outlets at different locations, they may want to start integrating other specific packages and therefore want to take the management of their retail systems and data in house. At the other end of the scale a large retailer with multiple outlets and a warehouse operation that already has a server array in the business may wish to have their retail management solutions as a fully integrated element of their IT infrastructure. 75% of Eurostop’s new customers are now choosing a cloud solution, and while smaller retailers favour cloud for all the obvious benefits, the general trend is upwards, even amongst larger retailers. smaller retailers favour cloud for all the obvious benefits The key point to remember is that having the freedom to focus on the functionality should be the most important thing to an organisation. 
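As a rough illustration of the payback point made earlier, the sketch below compares the cumulative cost of a monthly cloud subscription with the upfront-plus-running cost of a local server. The figures are assumptions chosen purely for the example – not Eurostop or vendor pricing – and any real comparison would also need to weigh support, upgrades, connectivity and staff time.

```python
# Illustrative payback arithmetic only - the figures are assumptions, not vendor pricing.
server_upfront = 12_000          # hardware, licences and installation (one-off)
server_running_monthly = 150     # power, maintenance, occasional support
cloud_monthly = 800              # subscription covering application, data and upgrades

# Months until the cumulative cost of the cloud subscription overtakes
# the upfront-plus-running cost of the local server.
month = 0
while cloud_monthly * month < server_upfront + server_running_monthly * month:
    month += 1

print(f"Break-even after roughly {month} months")   # about 19 months with these numbers
```

With these illustrative numbers the break-even lands close to the 18-month payback mentioned above, but the point of the exercise is the comparison itself: the right answer shifts with business size, growth plans and how requirements change over time.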
How well the retail system supports the operations – whether fulfilling orders, tracking sales and managing stock should be the main concern. Requirements may change over time, so we recommend choosing a supplier that can provide a range of options to support the business. ### Cloud can drive innovation but only if you have the right Data Fabric It is well known that cloud is increasingly becoming the default IT option for businesses with many looking to cloud for some, if not all of their infrastructure needs. This interest primarily stems from two areas – firstly the cost benefits, flexibility and speed at which large scale cloud infrastructure can be deployed; and secondly from a business desire to innovate more but with decreased risk. [easy-tweet tweet="The process of #innovation cannot take place without a strong #data management strategy" user="NetApp @comparethecloud"] While the cloud has been sold as the key to innovation for many companies, that process of innovation cannot take place without the business having a strong data management strategy in place across their entire IT footprint, whether that is in the cloud or on-premises. The implementation of a Data Fabric can give companies that data management strategy. Finding your cloud comfort level: what businesses need to consider For technology decision-makers, notably CFOs and CIOs, the new reality is that their organisation’s computing technology and data will likely be cloud-based. What businesses are now faced with is deciding what to move to the cloud and when to move it. Linked to this is the general consideration of how to transition from an on-premise to a cloud computing technology environment well. the new reality is that their organisation’s computing technology and data will likely be cloud-based When starting out the transition the most important thing for companies to keep in mind is that they need to understand the true value cloud computing can bring to their business. CIOs and CFOs must remain closely aligned on IT decision-making to ensure that cloud is used in the most appropriate ways for the organisation. This involves assessing technology in the context of business purpose, risks and cost. There is no point in trying to transition too much too soon. Architecting applications for the cloud can have pitfalls and attempting to run before you can walk is a sure fire way to cause an IT issue during a cloud migration. It is best to take one area, get that right and then consider other areas of the business’s IT landscape. For example, many businesses look to use the cloud for disaster recovery first. Instead of having a secondary data centre location for back up, they choose cloud. Disaster recovery in the cloud can be used by making exact replicas of data saved on private cloud storage. Once this is right businesses can look to other areas to move to cloud. The benefits of deploying a Data Fabric management strategy for cloud Moving applications to the cloud is the logical next step for businesses. That said, disaster recovery is one thing, moving whole applications can be more difficult particularly with regards to ensuring data control, portability and flexibility is maintained. This is where a Data Fabric data management strategy comes in. 
[easy-tweet tweet="Moving #applications to the #cloud is the logical next step for businesses says @NetApp" user="comparethecloud"] Businesses need a data management strategy that makes all IT infrastructure work as one – whether it is public cloud, private cloud or legacy on-premise systems. This is what a Data Fabric does. It lets companies use their existing data-centre practices on data residing in the cloud. IT teams can easily share applications across different clouds, and move their data to the most appropriate place. With this, CIOs are not locked into one cloud supplier. Instead they can move workloads from one cloud to another, based on the service level, costs and capabilities needed for the relevant business outcomes. A Data Fabric can improve efficiency and accelerate innovation in the business. Industries can benefit from being able to tap into different types of clouds, using a common application model, without worrying about learning unfamiliar cloud scale-out programming techniques. They can build applications that collect and analyse data from people and processes, and they can run them across these different cloud models without having to change them. [easy-tweet tweet="The era of cloud-centric IT operations is just beginning" user="NetApp @comparethecloud" hashtags="cloud"] The era of cloud-centric IT operations is just beginning. The future will include multiple cloud providers, woven together operationally, wrapped in a fabric that unifies the view of the data. What lies ahead is the freedom to pursue new ideas that weren’t feasible before. CIOs cannot hesitate. The future is finally here and they need to embrace it. ### Netflix: Not Your Typical Migration Story The biggest news to hit the data centre universe lately is surely that Netflix, after seven long years, is finally shutting down the last of its data centres in a bid to host nearly its entire offering in the cloud (Amazon Web Services, specifically). [easy-tweet tweet="Why has it taken 7 years for @Netflix to move wholly to #AWS?" user="comparethecloud"] The move has made the headlines big time, positioning Netflix as a cloud pioneer and AWS as the Holy Grail, but the real question that sprang to mind when the headlines emerged was this: Was it really such a surprise? Furthermore, why did it take seven years to make this move? The Cloud: Otherwise Known as Someone Else’s Datacentres The headlines are heralding the approaching “shutting down of the final Netflix datacentre,” as though it’s a monumental move, which should really prompt us to ask: why is one of the world’s largest home streaming video businesses opting to offer its services via one of the world’s largest public cloud providers such a big deal? why is one of the world’s largest home streaming video businesses opting to offer its services via one of the world’s largest public cloud providers such a big deal? Netflix is only doing what other companies have been doing in droves for several years: moving to the cloud. And while “the cloud” may seem the solution to an array of problems presented by aging, expensive and underpowered datacentres, when we refer to “the cloud,” what we’re really talking about here is just another collection (albeit much bigger and fully serviced) of datacentres. In this case, it just happens to be owned and run by Amazon. But Why did it Take Seven Years? 
Seven years ago Netflix had only been in the video streaming business for two years and the demand these services was not near high enough to consider branching out beyond its existing infrastructure. However, seven years on Netflix Vice President of Cloud and Platform Engineering, Yury Izrailevsky, had some great insight on the journey of this migration, which is possibly some of the most interesting behind the scenes migration information we’ve seen outside our own migration projects:  [easy-tweet tweet="Moving to the #cloud was a lot of hard work, and we had to make a number of difficult choices along the way says @Netflix"] “Given the obvious benefits of the cloud, why did it take us a full seven years to complete the migration? The truth is, moving to the cloud was a lot of hard work, and we had to make a number of difficult choices along the way. Arguably, the easiest way to move to the cloud is to forklift all of the systems, unchanged, out of the data center and drop them in AWS. But in doing so, you end up moving all the problems and limitations of the data center along with it. Instead, we chose the cloud-native approach, rebuilding virtually all of our technology and fundamentally changing the way we operate the company. Architecturally, we migrated from a monolithic app to hundreds of micro-services, and denormalized our data model, using NoSQL databases. Budget approvals, centralized release coordination and multi-week hardware provisioning cycles made way to continuous delivery, engineering teams making independent decisions using self service tools in a loosely coupled DevOps environment, helping accelerate innovation. Many new systems had to be built, and new skills learned. It took time and effort to transform Netflix into a cloud-native company, but it put us in a much better position to continue to grow and become a global TV network.” Perhaps the more relevant question is this: how long would it take a similar sized provider to move its services to the cloud starting in 2016? And, perhaps more importantly, would they do it in the same way Netflix has? Not Your Typical Migration Story The Netflix migration story is an interesting account and one any technologist would like to hear more about, but it’s not a typical migration story. It’s a great testament to Netflix that they were able to actively transform themselves this way, building a new infrastructure and achieving continuous delivery on so many levels while at the same time scaling and moving piece by piece to the cloud. It’s a great testament to Netflix that they were able to actively transform themselves this way The ability for a company to fund a re-design of their entire application landscape is unusual, and strategically is the right thing to do – as Gartner talks more about datacentres of micro-services in the same way. However, in comparison to other large enterprises that have been around some time, Netflix is very new, and so to re-architect everything would be relatively simple as the company wouldn’t have any significant legacy apps. More established companies are not looking to downsize their datacentre footprints as such because in the majority of cases, it’s unlikely such a move is needed. Most large enterprises are happy with the solution they already have, which is often a hybrid cloud infrastructure that makes best use of all worlds. The cost of moving entirely to the cloud could well be astronomical, which makes it all the more unappealing. 
Beyond this, most companies that are older than seven years’ old have legacy applications that are simply not cloud-friendly. Such organisations will leverage the commodity nature of cloud providers (burst and retraction of compute and network capacity as two examples) for specific applications, rather than shoe-horn every system into a service model that wouldn’t be appropriate for a proportion of their systems. [easy-tweet tweet="Most companies wouldn't look to migrate wholly onto the #cloud because of significant legacy apps"] Will it take seven years? No, because unlike Netflix, most companies are selective about what they migrate to the cloud, rightly recognising that the cloud is not suitable for every application that they host within their datacentre. The best approach would be not to fork-lift applications directly into the cloud, but to select those that are easy to move and provide best value. Planning for this to take months, rather than years should be the expectation. If you’re an established organisation, moving everything to cloud is sub-optimal in the same way that holding everything in-house is also not the best application and value strategy. ### For small business the answer's in the cloud The cloud is more than a technical innovation.  It’s a movement.  A democratisation of IT networking that is helping SMEs and start-ups save time, cut costs and improve their flexibility.  But it’s not as simple as it might seem. Choosing to use the cloud as a platform for your business is just the start.  There are many different flavours of cloud technology and just as many partners to advise you.  Nowadays, it’s possible to operate your entire business from the cloud, although for many small businesses, it can feel like a big step to put all your eggs in one internet-enabled basket.  [easy-tweet tweet="Choosing to use the #cloud as a platform for your business is just the start" user="comparethecloud"] Start by thinking through your requirements and listing them by priority. You might need a document sharing system so your colleagues can access content with ease wherever they are based? You might want to open your system up to external contractors or partners who support your business so they can all work on up to the minute versions of any report? Think about your business processes and your regular workflow. You’re looking to become more efficient, reduce duplication of effort and speed up delivery. However, it’s critical that you consider how the work you do impacts your clients, customers or beneficiaries. If your internal processes become too complicated as a result of adopting a new platform, then the ones that will feel the pain and complain loudest will probably be the customers. Are you starting from scratch or looking to replicate your current working practice onto the cloud? Don’t just think about the technical side of things; consider any costs, any risks, what support you might need, how long this will take and the impact it might have on the day-to-day business during the transition.  [easy-tweet tweet="Are you starting from scratch or looking to replicate your current working practice onto the #cloud?" user="comparethecloud"] It’s too easy to misjudge the work and complexity involved and be swayed into thinking this is as simple as uploading a file and sharing a password.  But migrating too much information, or moving too quickly, can lead to ‘virtual sprawl’.  If you thought it was tricky to control content on computers and in filing cabinets, times that by 50.  
You can lose control quite quickly as the volume of content grows, with or without your knowledge. There can be an ‘out of sight, out of mind’ problem; who has access to what applications? Or how do I monitor work across the team? Often the most sensible approach is to engage a contractor or small agency who can help you assess the pros and cons. Remember, it’s your data they will be handling and have access to, so you need to trust them implicitly. Remember, it’s your data they will be handling and have access to so you need to trust them implicitly Selecting a supplier isn’t always easy and so, depending on the value of your project, it’s probably worth going through an informal tender or supplier selection process. Recommendations from friends and colleagues are usually a good barometer; however, what’s appropriate for one business isn’t always going to be right for another. Maintain commercial competitiveness by comparing at least two or three suppliers. How much can you spend? There are a variety of cost models. Cloud providers usually don’t require up-front fees, instead favouring monthly pay-as-you-go costs, which differs from the traditional model of how you procure most technology services, where the service, hardware or software is purchased upfront. Whilst leasing might be financially attractive for your business, do qualify the terms by which that arrangement is maintained. Some smaller businesses have complained of feeling ‘locked in’ with a cloud provider who can make it cumbersome to extract data if the client has a change of heart. Security is paramount Just as you would expect a guarantee if you were purchasing a product outright, make sure you’re happy with the guarantees you get from a cloud provider. Cloud performance, availability and monitoring must be to a standard that you are happy with, because your business will be at the mercy of the platform that is now carrying it. Security is paramount, of course. Think through your highly secure material and what measures you might need to take to protect it. And ensure that access to the cloud in your business is smooth. It goes without saying that unless your connectivity to the internet is running beautifully, the best cloud platform in the world is going to be impossible to reach. [easy-tweet tweet="#Technology can help small businesses and #startups punch well above their weight" user="comparethecloud" hashtags="cloud"] Technology can help small businesses and start-ups punch well above their weight. The cloud brings a 24/7 availability to your business, accessible from diverse geographies. In future, it’s going to be impossible to imagine life without it. So start soon – but keep a clear view before you go into the cloud. ### prpl Foundation reveals vision for a secure Internet of Things The prpl Foundation, an open-source, community-driven, collaborative, non-profit foundation for driving the next generation of datacentre-to-device solutions, has announced availability of a new document entitled Security Guidance for Critical Areas of Embedded Computing that lays out its revolutionary vision for a secure Internet of Things. It describes a fresh hardware-led approach that is easy to implement, scalable and interoperable. The prpl Foundation’s guidance aims to improve security for devices in a rapidly expanding connected world, where failure to do so can result in significant harm to individuals, businesses and nations.
[easy-tweet tweet="#CloudNews: prpl Foundation has revealed their vision for a #secure #IoT"] “The Internet of Things is connecting our world in ways not anticipated even a decade ago. This connectivity finds its way into everything from light bulbs and home appliances to critical systems including cars, airlines and even hospitals,” said Art Swift, president of the prpl Foundation. “Security, despite its huge and increasing importance, has so far been addressed in piecemeal and often proprietary ways. “Given ubiquitous connectivity and the rapid emergence of IoT, the need for a well-designed, structured and comprehensive security architecture has never been greater,” he continued. Embedded systems and connected devices are already deeply woven into the fabric of our lives, and the footprint is expanding at a staggering rate. Gartner estimates that 4.9 billion connected things were in use by the end of 2015, a 30% increase from 2014. This will rise to 25 billion by 2020 as consumer-facing applications drive volume growth, while enterprise sales account for the majority of revenue. Security is a core need for manufacturers, developers, service providers and others who produce and use connected devices. Most of these – especially those used on the “Internet of Things” – rely on a complex web of embedded systems. Securing these systems is a major challenge, yet failure to do so can result in catastrophic consequences. “Under the prpl Foundation, chip, system and service providers can come together on a common platform, architecture, APIs and standards, and benefit from a common and more secure open source approach,” added Cesare Garlati, prpl’s chief security strategist. The new Security Guidance Document lays out a vision for a new hardware-led approach based on open source and interoperable standards. It proposes to engineer security into connected and embedded devices from the ground up, using three general areas of guidance. These are not the only areas that require attention, but they will help to establish a base of action as developers begin deal with security in earnest. These areas include: Addressing fundamental controls for securing devices. The core requirement, according to the document, is a trusted operating environment enabled via a secure boot process that is impervious to attack. This requires a root of trust forged in hardware, which establishes a chain of trust for all subsystems. Using a Security by Separation approach. Security by Separation is a classic, time-tested approach to protecting computer systems and the data contained therein. The document focuses on embedded systems that can retain their security attributes even when connected to open networks. It is based on the use of logical separation created by hardware-enforced virtualization, and also supports technologies such as para-virtualization, hybrid virtualization and other methods. Enforcing secure development and testing. Developers must provide an infrastructure that enables secure debug during product development and testing. Rather than allowing users to see an entire system while conducting hardware debug, the document proposes a secure system to maintain the separation of assets. By embracing these initial areas of focus, stakeholders can take action to create secure operating environments in embedded devices by means of secure application programming interfaces (APIs). 
The APIs will create the glue to enable secure inter-process communications between disparate system-on-chip processors, software and applications. Open, secure APIs are thus at the centre of securing newer multi-tenant devices. In the document, the prpl Foundation offers guidance defining a framework for creating secure APIs to implement hardware-based security for embedded devices. The report is available at: http://prpl.works/security-guidance/. ### Can London be the tech hub of the world? Technology has become such an integral part of the way that we live our lives that it is understandable that cities all over the world are striving to become headquarters for technological innovation. London is already making great strides in this regard, with its Silicon Roundabout area rightly recognised as an industry-leading tech hub. [easy-tweet tweet="More than 350,000 Londoners are now employed in #digital businesses" user="zsahLTD @comparethecloud" hashtags="techcity"] With more than 350,000 Londoners now employed in digital businesses, the UK capital was recently hailed as the tech centre of Europe. However, in global terms, the city still has work to do. Overtaking the likes of Silicon Valley and New York, while holding off the challenge of rapidly growing Asian tech hubs such as Singapore, will not be easy, so London will have to make the most of its resources if it is to become the tech hub of the world. One of the first areas that London must address if it is to improve its world standing is recruitment. There is a notable skills gap that must be addressed if tech firms are to have access to the talent required to succeed. A recent survey found that 77 per cent of the capital’s technology startups believe that they could achieve faster growth if they had access to staff with better skills. Clearly, the demand for talented individuals is there, but the supply is not. Fortunately, efforts are being made to close the skills gap by placing stronger emphasis on coding in schools and launching free online training programmes like Tech City UK’s Digital Business Academy. There is a notable skills gap that must be addressed Even if London is able to improve its access to employee talent, it must ensure that it can match it with world-class network infrastructure. Parts of the city still suffer from connectivity issues, with the government prevented from direct investment due to competition laws. Slow broadband speeds and a lack of access to mobile Internet are hugely damaging to workplace productivity and mean that organisations may find it difficult to access some of the more bandwidth-intensive applications. Although projects like the government’s Broadband Connection Voucher Scheme have been successful, more still needs to be done. London cannot become a world-leading tech hub if it does not have access to world-leading broadband speeds. [easy-tweet tweet="London cannot become a world-leading tech hub, if it does not have access to world-leading broadband speeds" user="zsahLTD"] One of the key components of a successful tech hub is a thriving startup community. However, in London, fledgling businesses are often up against it due to the high costs of living, transport and renting office space. Company budgets are already stretched thin, with firms having to manage growth, marketing, recruitment and all other business aspects with a small team.
For many businesses, London’s high rents are the final nail in the coffin, particularly given the high concentration of more established competitors based in the capital. Culturally, London and the UK as a whole will need to shake off their image of being overly sceptical and afraid of taking risks. The most successful tech hubs are known for embracing cutting-edge, and often disruptive, innovations and so there is no time for inhibitions or caution. This reputation is already shifting thanks to the entrepreneurial efforts of Silicon Roundabout and greater financial support for startups. If London carries on in this vein, it should receive further recognition as a truly global tech hub. If London carries on in this vein, it should receive further recognition as a truly global tech hub However, perhaps the best way for London to secure its position as a worldwide hub of technological innovation is for businesses, old and new, to work together. Cross-industry support can help keep innovative ideas from fading into obscurity and boost the local economy. For example, the zsah Cloud Startup Program provides young businesses with powerful cloud infrastructure that is rapidly scalable and available on a pay-what-you-use basis. There are also a number of agile new firms providing marketing, accountancy and other services at much more flexible and affordable rates. It is vital that startups realise this and make use of their fellow companies. A thriving tech hub benefits all of London’s businesses, enabling each and every one of them to share in the benefits of local success. If London is to become the tech hub of the world, it will need the government and private industries to work together to overcome international competition. ### Ransomware attacks demonstrate the security imperative Last month, Lincolnshire County Council’s computer systems were shut down following a malicious ransomware attack. This naturally caused a great deal of public concern, and certainly demonstrated the need for businesses to consistently review their security practices. [easy-tweet tweet="#Ransomware can attack any business, at any time, are your staff prepared? " user="comparethecloud" hashtags="infosec"] Lincolnshire County Council’s computer systems, now back online, were closed for four days following a severe ransomware infection. The software encrypted the council’s data and demanded a £350 ransom payment in order to restore access to the files. This is clear evidence that, no matter which sector or region your business operates in, leaders have a duty to ensure that employees are well versed in cyber security practices. the rate of high profile attacks is growing exponentially Indeed, the last six months in particular have seen an incredible number of cyber attacks on businesses in both the private and public sector, from ransomware to targeted hacks. From TalkTalk, Sony, Ashley Madison and Wetherspoons in 2015, to HSBC and Lincolnshire County Council this year - the rate of high profile attacks is growing exponentially. Unless enterprises undertake company-wide initiatives to improve their internal security practices, hackers will continue to decimate businesses’ computing systems on an increasing scale. There are some basic steps that can be taken when it comes to ransomware. These include: Keep up to date with security news and updates  Knowing what to look out for is half the battle.
Keep up to date with the latest developments in malware and ensure you download the latest antivirus software updates and patches, as this can help to protect against many threats. Back up your business-critical data offsite If you’re ever caught out by ransomware, this could be your lifeline. Be careful though: if you don’t catch the malicious software in time you might unintentionally back up encrypted files and lose your clean, unencrypted data. Partnering with a specialist helps ensure fast recovery and peace of mind. Communicate risks with your staff regularly All it takes is one uninformed employee to open an infected attachment for your whole business to be affected. Make sure your team is clear about the recommended procedures to follow in the case of a breach - as well as helping to prevent attack in the first place, this can help to get incidents under control quickly, reducing the amount of damage caused. [easy-tweet tweet="All it takes is one uninformed employee to open an infected attachment for your whole business to be affected"] Additionally, businesses can also take advantage of pre-existing initiatives such as the government’s Cyber Essentials Scheme (CES). For SMEs, which might not have the dedicated in-house IT staff to address cyber security challenges, the CES provides advice and guidance for those looking to take their first steps into cyber security or simply to improve existing processes. In the current environment it is imperative these resources are utilised. ### Ride-Sharing with BlaBlaCar's new MariaDB Databases The ride-share economy is booming, and one of the pioneers of the industry has just announced a partnership with MariaDB. BlaBlaCar is using an open source database cluster to meet its scalability and availability needs, and the partnership with MariaDB is allowing them to access their data more effectively. [easy-tweet tweet="#RideSharing is getting a #Database turbocharge in the partnership between @BlaBlaCar and @MariaDB"] With 25 million members across 22 countries, notably India and most recently Brazil, BlaBlaCar certainly is amassing a lot of data to deal with. Mass adoption of the BlaBlaCar service has revolutionised the long distance transport market. BlaBlaCar isn't intended to get you home from the pub on a Friday night; it targets drivers and riders looking to travel between cities, offering an alternative to the coaches and trains that we have relied on for so long. BlaBlaCar now carries 10 million passengers per quarter and its application is used by over seven million visitors per month. Those are pretty crazy statistics for a socially driven ride-share app, especially considering most of us won't smile at fellow passengers on the train! Mass adoption of the BlaBlaCar service has revolutionised the long distance transport market BlaBlaCar's growth means they must ensure the global scalability of their data management platform. The capability of their service is crucial and requires a scale-out architecture. Nicolas Blanc, the Head of Platform Architecture at BlaBlaCar, says improving the service for users of the application, as well as for the development teams, and simultaneously meeting his marketing teams’ needs are challenges he knows well. Blanc said, "Our needs are evolving rapidly, and our database solution must ensure high performance and total availability while at the same time allowing us to quickly and flexibly acquire new data sources. MariaDB is an ally thanks to its flexible and scalable design."
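In practice, an application gets that availability from a multi-node cluster simply by being willing to talk to more than one node: if the first node is unreachable, it tries the next. The sketch below uses the MariaDB Connector/Python package (`pip install mariadb`) to show the pattern; the host names, credentials and schema are placeholders, and this is an illustration of the idea rather than BlaBlaCar's actual configuration.

```python
import mariadb  # MariaDB Connector/Python

# Hypothetical cluster nodes spread across two data centres.
CLUSTER_NODES = ["db1.dc1.example.com", "db2.dc1.example.com", "db3.dc2.example.com"]

def connect_to_cluster(nodes, user="app", password="secret", database="rides"):
    """Return a connection to the first reachable cluster node."""
    last_error = None
    for host in nodes:
        try:
            return mariadb.connect(host=host, user=user,
                                   password=password, database=database)
        except mariadb.Error as exc:
            last_error = exc  # node down or unreachable, try the next one
    raise RuntimeError(f"no cluster node reachable: {last_error}")

conn = connect_to_cluster(CLUSTER_NODES)
cursor = conn.cursor()
cursor.execute("SELECT 1")  # trivial health-check query
print(cursor.fetchone())
conn.close()
```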
In an environment which includes nearly 250 physical servers and a hundred virtual machines spread across two data centres, the flexibility of MariaDB’s architecture and the MariaDB Enterprise Cluster™ give BlaBlaCar complete control of the consistency, reliability and scalability of its database applications. At BlaBlaCar, some tables handle more than one million logs per day and the data archives’ volume exceeds 1.5 terabytes. The storage engine chosen was InnoDB, which allows BlaBlaCar to manage the data’s consistency and integrity. [easy-tweet tweet="Using the @MariaDB Enterprise Cluster allows @BlaBlaCar a highly available and scalable architecture"] Using the MariaDB Enterprise Cluster allows BlaBlaCar to have and maintain a highly available and scalable architecture without complex IT operations. Serge Frezefond, Cloud Architect at MariaDB, says: “We are very proud to have BlaBlaCar as a customer... BlaBlaCar is one of the world’s leading sharing economy companies who trust MariaDB with truly business-critical data.” The sharing economy is a very interesting progression, both technologically and socially, for our global society, and it's nice to see that technologists are working to make the sharing economy successful in the long term. ### Why Attend Cloud Financial Services Asia Cloud Financial Services Asia is the thought-leading forum on cloud and financial services in Asia. Taking place on 23-24 February in Hong Kong, this event aims to bring together cloud infrastructure and architecture leads from major financial institutions with both local and international solution providers. [easy-tweet tweet="#CloudFSAsia is happening February 23-24, 2016 in Hong Kong" user="comparethecloud"] Featuring 25+ expert speakers including representatives from the China Construction Bank, Ageas Insurance, AWS and the Hong Kong Monetary Authority, the Cloud Financial Services Asia 2016 event is the only APAC summit dedicated to cloud deployment in financial services. Having attended (and spoken at) the last event held in New York City, I would suggest that it is an event well worth attending. Deep insights into the financial services industry and cloud infrastructure migrations will be shared, and current and future solutions will be explained and demonstrated. In the regulated world of Fintech and the associated IT infrastructure solutions, it is imperative to have a solid understanding of the trends and governance changes that are occurring around the world, not just in one region. With the Cloud Financial Services Asia event I believe that you will not be disappointed, and that you will come back educated, informed and happy knowing that the takeaways will be valuable to your business. it is imperative to have a solid understanding of the trends and governance changes that are occurring around the world and not just one region The agenda has been specifically designed to provide the depth of knowledge required to inform a complex procedure such as moving core infrastructure services to the world of financial cloud, and to facilitate a unique networking opportunity. In a heavily regulated, some would say restricted, world of technology and banking, it's difficult to see the wood for the trees. Financial services has always been a complex mix of regulation, compliance and governance, and with emerging technologies that quite simply "change the rules” for banking (Blockchain et al.), you need to stay in, and ahead of, the game.
The executive-level summit is a key networking and information-sharing forum for thought leaders and early adopters in Hong Kong. Key agenda topics include creating a pan-Asian regulatory framework Key agenda topics include creating a pan-Asian regulatory framework, protecting critical data in the cloud and how to optimise your cloud implementation and migration process. The program is not just a discussion about the benefits of cloud but a guide to successful implementation. With a mixture of keynote presentations, panel discussions and breakout sessions, the event will provide both the depth and breadth of knowledge required to support your organisation’s transition to the cloud. [quote_box_center] 10 reasons to attend Cloud Financial Services Asia The only event in the region dedicated to cloud adoption in financial services, no other event offers such a specific, focused agenda on key financial vertical challenges Over 120 Executive and Director level stakeholders from major financial institutions will be in attendance Key regulators are in attendance to give their perspective on how to create a regulatory framework that allows cloud adoption in Asia to boom In-depth presentations and thought-leading panel discussions giving new insights to attendees Dedicated networking breaks to facilitate 1-1 networking A collection of the most innovative cloud solution providers will be joining the discussion to show how their solutions can help financial institutions in achieving their goals Join the emerging cloud and IT community in Hong Kong who are converging to share their outlook for the year Reaffirm long-standing relationships with industry executives and cement your partnerships ahead of competitors Meet new partners who are interested in moving processes to the cloud Start the Chinese New Year by bolstering your knowledge and partner base and speaking with new potential partners before budgets are allocated [/quote_box_center] ### Do I need a CDN if I’m already Running My Website on a Cloud Host? Hosting a website on a cloud-based server is a smart decision. It typically makes your website faster, more secure and more agile. Cloud hosting is also helpful in letting you manage the scalability of your website depending upon its traffic. Compared to regular hosting, cloud hosting is a major advantage for any website. [easy-tweet tweet="If everyone is moving to #cloud hosting, is everyone going to be on par for web performance?"] However, virtually every notable website today is shifting to cloud hosting, which means that even if you are using cloud hosting and are enjoying all its benefits, you are merely on par with most other websites in terms of performance. At this point, the key is to implement other methods of improving your site’s speed, security and customer experience. One way of accomplishing this is to deploy a content delivery network (CDN). the key is to implement other methods of improving your site’s speed, security and customer experience A CDN is a set of servers distributed in strategic locations all over the world. The network caches elements of your website, which are then delivered to visitors from the server that is located nearest to them. With the reduced distance, latency drops and page load times fall, which means the user gets to see the content from your website faster. A CDN is used in conjunction with web hosting and it essentially enhances the performance of your website.
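How much of that work a CDN can actually take off your origin depends largely on the caching instructions your site sends: long-lived headers on static assets let the edge servers answer most requests without touching your host, while personalised responses are marked as uncacheable. The Flask sketch below illustrates the idea; the max-age value and routes are arbitrary examples, not a recommendation for any particular CDN.

```python
from flask import Flask, jsonify, send_from_directory

app = Flask(__name__)

@app.route("/static/<path:filename>")
def static_asset(filename):
    # Static assets: allow CDN edges (and browsers) to cache for a day,
    # so repeat requests are served from the location nearest the visitor.
    response = send_from_directory("static", filename)
    response.headers["Cache-Control"] = "public, max-age=86400"
    return response

@app.route("/api/profile")
def profile():
    # Personalised responses: tell edges not to store them at all.
    response = jsonify({"user": "example"})
    response.headers["Cache-Control"] = "private, no-store"
    return response

if __name__ == "__main__":
    app.run()
```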
Following are some of the major advantages that a CDN brings to your website: Lower Bandwidth Consumption Bandwidth consumption can be a major bottleneck if your site has heavy traffic. On one hand, limited bandwidth can cause disruptions in access to the website. On the other hand, it is also one of the key drivers of maintenance expenses. By using a CDN, both these problems are instantly resolved. Since the CDN delivers data such as images, stylesheets and scripts from your website through its own servers, the amount of data served directly by your main host is minimised, and your main server consumes less bandwidth. For a cloud-hosted website, this can minimise bandwidth expenses. In other cases, this will help mitigate any bandwidth limitations, thus ensuring higher availability. [easy-tweet tweet="A #CDN can stop the #bandwidth bottleneck says @daanpepijn" user="comparethecloud"] Lower Maintenance Costs Bandwidth consumption is directly tied to the maintenance expenses of a website. The more bandwidth your site consumes, the greater will be the amount you are billed for your hosting subscription. Or, at the very least, you are more likely to exceed your bandwidth allocation. Since a CDN delivers cached website elements from its servers, it reduces the amount of data delivered from your main server. This means the reduced bandwidth consumption directly results in a reduction of bandwidth expenses as well. Better Global Coverage Cloud-based hosts usually maintain a more centralised architecture compared to CDNs. So a cloud hosting service may have its data centres mostly in a single country or region, or in only a very few locations globally. In comparison, a CDN has far greater global coverage in terms of data centres. Since the strength of a CDN is its geographical Points of Presence (POPs), CDN servers are typically located in a larger number of global locations. So between the two, a CDN is more effective in covering your site’s users all over the world, all the while offering better site-load speeds. the strength of a CDN is its geographical Points of Presence A Solid Layer of Security Cyberattacks and website security breaches are becoming an everyday affair. No matter how good your site’s security is, there is always room for improvement. A CDN comes with an additional layer of security for your website. CDNs like Incapsula make use of the reverse proxy topology, which puts the CDN infrastructure in front of the main site servers in the path of incoming user traffic. As a result, if any bad actors decide to launch a barrage of DDoS attacks, the CDN infrastructure will meet it outside the doorstep and mitigate it before it can affect your servers. Since DDoS attacks are a very common nuisance for websites, such an effective solution against them is very advantageous. Incapsula also makes use of its Web Application Firewall (WAF) to deal with hacking attacks on your website. Although cloud hosting services also come with some security features, a dedicated solution like this provides a much higher grade of protection, while also enabling compliance (e.g. meeting PCI DSS requirements). So if you opt for a CDN, not only will you be getting better site speeds, you will also have a more secure website. To answer the question: Yes. You need a CDN even if you are already hosting your digital assets on the cloud.
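A practical corollary of that reverse-proxy topology is that the origin should refuse requests that have not passed through the CDN, otherwise attackers who discover the origin's address can simply bypass the protection. One common pattern is a shared secret header injected at the edge and checked at the origin; the sketch below is illustrative only, with a hypothetical header name and secret, and providers such as Incapsula document their own mechanisms (including allow-listing their published IP ranges).

```python
from flask import Flask, abort, request

app = Flask(__name__)

# Hypothetical shared secret configured both at the CDN edge and at the origin.
CDN_SHARED_SECRET = "rotate-me-regularly"

@app.before_request
def require_cdn():
    # Drop any request that did not come through the CDN, so DDoS floods and
    # probes aimed directly at the origin never reach the application.
    if request.headers.get("X-CDN-Auth") != CDN_SHARED_SECRET:
        abort(403)

@app.route("/")
def index():
    return "served via the CDN"

if __name__ == "__main__":
    app.run()
```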
[easy-tweet tweet="A #CDN brings immense value to your #website in terms of speed, efficiency and enhanced security"] Cloud hosting has a huge number of advantages but it has a different set of capabilities compared to a CDN. A CDN brings immense value to your website in terms of speed, efficiency and enhanced security. So even if you have your website hosted on a cloud server, deploying a CDN will provide better protection and optimisation for your digital assets. ### IoT Tech - the future is adorable Last week I attended the IoT Tech Expo at Olympia in London, and can I just say that the future is looking pretty cute! Innovation abounding, the expo was full of attention grabbing displays, but the stand out had to be Pepper.  [easy-tweet tweet="The #IoTTechExpo and its hottest products reviewed by @Rhian_Wilkinson of @Comparethecloud"] Pepper is a humanoid robot that has been designed as a day-to-day companion by Aldebaran. With the ability to give hugs and fist bumps, Pepper is personable and likeable - a real change from the hard edged 'terminator' style robots western culture is permeated by. Based on your voice, facial expressions, body movements and the words you use, Pepper can interpret your emotions and offer appropriate content. This is making Pepper a hit in Japan, where Nestlé is planning on equipping more than 1,000 stores with Pepper robots to enhance the customer experience. Pepper is personable and likeable - a real change from the hard edged 'terminator' style robots western culture is permeated by There have also been in-home trials for Pepper robots, which can gradually memorise your personality traits, your preferences, and can adapt himself to your tastes and habits. These options for personalisation make Pepper infinitely more interesting than previous humanoid robots, because he become truly unique to you and your needs. As much as I loved meeting Pepper, other technologies caught my fancy too - IoT is infiltrating so many industries that don't normally cross your mind when you think of technology. I was surprised to find Feeligreen, a French startup, showing off IoT-C. The Internet of Things for Cosmetics. Feeligreen's offering is a handheld ergonomic device that uses non-invasive micro-current technology and LEDs to increase the effectiveness of skincare treatments. The device connects to your smartphone where you can track progress, and design skincare programs using the device to achieve your goals. [easy-tweet tweet="#IoT is infiltrating industries that aren't normally associated with #technology" user="rhian_wilkinson"] The Activ-feel device comes with 4 skincare products: targeting stretchmarks, blemishes, ageing, and cellulite. Early testing shows that using the device with the skincare products provide noticeable results within 7 days, compared to approximately 28 days without the device. I don't know about you, but I certainly like the sound of that! Not just targeting vanity, there is also a sports version of the product, which comes with with heating, cooling, relaxing and body sculpting gels. Onwards and upwards to the bigger ideas, companies like Covisint are the platform behind the IoT. They are simplifying our lives by doing things like unifying our identities into singular devices. They are the people who mean we can make our regular coffee order, pay, gain our loyalty card points, all with minimal effort on the purchasers behalf. No more pulling three or four cards out of your wallet at every checkout! 
companies like Covisint are the platform behind the IoT Right at the other end of the spectrum I got to speak with Nigel Maris of EveryWare, a company that specialises in industrial IoT applications. Their solutions are so varied that I can only just begin to describe how it all works! They allow companies to monitor where deficiencies are in their businesses, using sensors and IoT networks to create connected solutions that work better for the future. The IoT Tech Expo was a busy event, and I must admit that I didn't get to all the stalls I wish I could have! I think we have an exciting future of connected homes, cars and lives ahead of us, and IoT is taking us there! ### Pay as You Go Innovation It is clear to see that cloud has lived up to the hype. It is now firmly in a high-growth phase and only shows signs of accelerated growth over the next few years. In my view, there are three key reasons why more and more businesses are turning to cloud technology: innovation, flexibility and reduced operating costs. These three are not independent of each other, and together they are providing more UK businesses with opportunities to innovate and grow. [easy-tweet tweet="There are 3 key reasons why businesses turn to #cloud: #innovation, #flexibility and reduced operating costs"] Traditionally, when companies have prepared their resources for anticipated growth, the approach has been to invest heavily in technology infrastructure to support that growth. The risk here is over-investment, with no guarantee that the growth will materialise as anticipated. Naturally, over time this has led businesses to become cautious and, as a result, innovation has suffered. The fear of over-investment risk has meant innovation has suffered Innovation has been, and will continue to be, a key driver of UK productivity growth and economic prosperity. Between 1985 and 2013, Gross Expenditure on Research and Development grew by 52 per cent in real terms, with the UK ranking 7th in the world (2014) in terms of overall level of R&D spending. However, its share of innovation spending as a proportion of GDP places it behind several of the leading research nations. In order to support this future innovation, businesses have been looking for solutions that allow them to continue to innovate without the associated risks or costs. Historically, developing and testing has been viewed as a risky necessity, requiring organisations to set aside, in some cases, vast sums of capital through the development of dedicated infrastructure and prototyping. In addition to this, business applications are becoming ever more complex, making it difficult for organisations to build and maintain in-house testing environments that mimic reality. [easy-tweet tweet="Cloud-based #innovation testing has the potential to offer a compelling combination of benefits"] Cloud-based testing has the potential to offer a compelling combination of lower costs, pay-as-you-go models and the elimination of intensive capital investment. For example, cloud computing allows disk, server and network capacity to be added from anywhere within a data centre in a matter of seconds, and it enables businesses to quickly and easily prototype new products with minimal investment, as well as addressing the growing demand for sophisticated test environments. This is an important development as businesses become more cash-strapped but continue to compete in a global market.
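The pay-as-you-go testing model described here amounts to creating an environment only for the duration of a test run and tearing it down again afterwards, so no capital sits idle. A minimal sketch using the AWS SDK for Python (boto3) is below: the AMI ID, instance type and region are placeholders, and a real environment would add networking, security groups and proper error handling.

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Placeholder image and size; swap in whatever your test stack actually needs.
AMI_ID = "ami-0123456789abcdef0"
INSTANCE_TYPE = "t3.micro"

def run_throwaway_test():
    # Provision a short-lived test server, billed only while it exists.
    resp = ec2.run_instances(ImageId=AMI_ID, InstanceType=INSTANCE_TYPE,
                             MinCount=1, MaxCount=1)
    instance_id = resp["Instances"][0]["InstanceId"]
    try:
        ec2.get_waiter("instance_running").wait(InstanceIds=[instance_id])
        print(f"{instance_id} is running; point the test suite at it here")
    finally:
        # Tear the environment down so nothing is paid for idle capacity.
        ec2.terminate_instances(InstanceIds=[instance_id])

if __name__ == "__main__":
    run_throwaway_test()
```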
cloud benefits include on-demand flexibility, diminished need to hold assets, improved collaboration, greater efficiency and most importantly, reduced time-to-market Other benefits brought on by the cloud include on-demand flexibility, a diminished need to hold assets, improved collaboration, greater efficiency and, most importantly, reduced time-to-market for key products and applications. This flexibility allows businesses to work together without the difficulty of merging clunky IT systems and without geographical limitations. As we are all too aware, existing legacy infrastructure isn’t flexible enough for this kind of resource allocation. Most resources are dedicated to specific applications or functions, making it difficult, and sometimes all but impossible, to collaborate. It is clear that cloud technology has supported, and will continue to support, innovation here in the UK, while also helping to break down geographical barriers. Quick, simple and cost-effective – the cloud will continue to serve and support business growth. ### Brocade Paves the Way to 5G with New IP Offerings for Mobile Network Operators Brocade today announced new offerings to help mobile operators re-architect their networks to support 5G services and deliver higher agility and performance at a fraction of the cost of traditional networks. These innovative New IP offerings uniquely position Brocade as a disrupter with 5G capabilities that solve the key challenges facing mobile network operators (MNOs) and mobile virtual network operators (MVNOs) today. [easy-tweet tweet="#CloudNews: Brocade Paves the Way to 5G with New IP Offerings" user="comparethecloud"] The massive amounts of data created and moved by users consuming Internet and video services, as well as the number of connections associated with the Internet of Things (IoT) and machine-to-machine (M2M) services, have stretched mobile networks built on legacy architectures to their breaking point. In addition, traditional networks cannot adequately support innovative emerging use cases such as in-flight connectivity and services. Enabling new services and meeting customer demand requires more than just a revamped, higher-speed air interface. A fully mobile and connected society requires a pervasive mobile network infrastructure that is software-enabled, virtualized, modular, scalable and built on open standards. Brocade is providing a path to deliver on this mobile network vision by simplifying the infrastructure and making it possible for operators to begin the transformation to 5G today. With mobile devices capable of generating diverse traffic patterns in terms of data rate, traffic load, mobility, coverage and criticality, Brocade® offerings address the challenges of emerging services—such as IoT, M2M, high definition video, inflight connectivity and virtual mobile networking—by improving flexibility, cost-efficiency and scalability over legacy networks. “There is a desire among mobile operators to introduce virtual mobile core technology to support new services and business models at a low cost-of-production,” said Gabriel Brown, senior analyst for Heavy Reading. “However, because it involves a challenging set of applications that are inherently risky to experiment with, they are seeking to identify lead use-cases, such as MVNOs, enterprise VPN and IoT services that can deliver a fast return without adversely impacting the mass-market consumer EPC.
Over time, as operators gain experience with the technology, we expect virtualization to become the mainstream operating model. The future 5G core will be ‘cloud-native’ from inception.” Brocade Virtual Core for Mobile Solution The Brocade Virtual Core for Mobile (VCM) is the first-of-its-kind, fully virtualized Evolved Packet Core (vEPC) suite of products that supports network slicing with independent localization and scaling of the control and user planes—ahead of the targeted standardization date established by the 3rd Generation Partnership Project (3GPP).  Specifically designed for virtual environments, the Brocade VCM transforms mobile networks by eliminating expensive and proprietary hardware devices, long upgrade cycles and over-provisioning.  The Brocade solution can be deployed in days instead of months.  Depending on specific network and deployment requirements, the same software package can serve as a complete vEPC or a combination of mobility management entity (MME), home subscriber server (HSS), serving gateway (S-GW) and packet data network gateway (P-GW). The overall deployment model of the Brocade VCM can be used to create network slices that are optimized for IoT traffic profiles to enable a mobile packet core-as-a-service offering for hosting MVNO services or to support non-traditional, innovative services from mobile operators.  The solution interoperates with all major radio and 3G/LTE core network equipment through standard interfaces and bridges 5G capabilities to older networks. SmartSky Networks, a provider of high-speed in-flight data communications, has selected the Brocade vEPC to connect its new SmartSky 4G air-to-ground broadband network to more than 250 cell sites that will be strategically placed across the U.S.  “Brocade is the only company that offers an end-to-end virtual EPC solution built from the ground up.  Other vendors take a short cut by repackaging old solutions,” said David Claassen, vice president of Network and Service Architecture at SmartSky.  “With support for separation and independent scaling of control and user planes, the Brocade vEPC solution is enabling us to provide seamless data service across the continent while eliminating redundant functionalities and internode dependencies.  As a result, we expect to see dramatically lower total cost of ownership and increased performance compared to physical node-based packet cores.” Brocade Mobile Edge Computing Capabilities Mobile edge computing provides the infrastructure to deliver applications, network services and content from the edge of the mobile network. Based on the industry-leading Brocade vRouter, Brocade offers mobile edge virtual infrastructure capabilities that integrate routing, IPsec termination and network firewall functions with the virtualization infrastructure in the host operating system. This approach enables MNOs to capture new revenue by extending the cloud to the edge of the mobile network. These capabilities also allow for consistent introduction of virtual network functions in closer proximity to end users as well as a platform for hosting third-party applications, especially vertical industry applications associated with IoT. To provide a complete solution, these capabilities integrate with the Brocade SDN Controller and the Brocade Virtual Packet Broker. Brocade also provides a comprehensive network visibility solution for pervasive monitoring of physical and virtual mobile networks.  
This end-to-end network visibility solution includes network packet brokers and protocol probes, with a software-based approach that delivers real-time programmability of the visibility infrastructure and a dynamic, scale-out architecture that supports an unprecedented 100 million subscribers per packet core. “Today’s announcements solidly plant Brocade in the business of delivering innovative networking solutions that help MNOs and MVNOs solve their most critical business and technology problems,” said Kevin Shatzkamer, chief technology officer of mobile networking, Brocade. “The combined offerings for vEPC, mobile edge computing and network visibility provide a future-proof technology foundation well in advance of 5G availability which will help MNOs accelerate new revenue streams today.” Brocade will be demonstrating its solutions and technologies at Mobile World Congress, Feb. 22-25, 2016 in the Brocade booth located in Hall 2 of the Fira, 2G29. ### Cloud Colonialism: can local players stand up to the US cloud giants? After Facebook managed to get itself embroiled in an embarrassing row in India, we look at the US cloud firms and ask if they are leading a wave of tech neo-colonialism. [easy-tweet tweet="Are the major #cloud players leading a form of #Tech colonialism? " user="Billmew, @comparethecloud"] The Facebook row comes days after India banned a programme called Facebook Free Basics, which offers free but restricted internet access to people in poor, rural areas. Marc Andreessen, one of Facebook’s directors, reacted by saying that “anti-colonialism” was to blame for the country's economic struggles. “Denying world's poorest free partial Internet connectivity when today they have none, for ideological reasons, strikes me as morally wrong,” Andreessen, the well-known Silicon Valley investor, wrote on Twitter. When someone compared the Free Basics service to “internet colonialism”, he replied: “Anti-colonialism has been economically catastrophic for Indian people for decades. Why stop now?” Despite later apologising and deleting the tweets, he was accused of endorsing colonial rule by thousands of angry Twitter users as his post was circulated widely in India, which has over 22 million Twitter users. Soon the phrase “East India Company” started to trend on Twitter as users accused Facebook of acting like the former British trading company, which effectively ran most of the Indian subcontinent in the late 18th and early 19th centuries. Cloud Colonialism Are we seeing a rehashing of colonialism for the modern world? This started me thinking about the way that the “East India Company” had outflanked its French and Dutch rivals in India, by working with and eventually subjugating local tribal leaders. A similar scenario was played out in North America when the French and English allied with local native tribes, using them as proxies in their struggle to gain control of the ‘New World’ – as was shown in Michael Mann’s film adaptation of ‘Last of the Mohicans’ which depicted 1757 during the French and Indian War. Amazon has made great strides in attracting partners and service providers to its AWS marketplace, with Microsoft’s Azure marketplace and IBM’s Bluemix and cloud marketplace both seeking to do the same. In an environment that is undoubtedly ‘hybrid’, many local CSPs and MSPs, as well as local telcos, are evaluating where they stand.
While volume economics would indicate that workloads and data will eventually migrate to the largest and cheapest platforms possible – favouring public cloud and the US giants – this is mitigated somewhat by security, provenance, and sovereignty concerns – favouring private or hybrid cloud, where the local players can come into their own. Many local players saw OpenStack, the open cloud platform, as a way of taking on the US giants, but the OpenStack movement has been slower than its giant US rivals to reach a level of enterprise-ready maturity and to provide a slew of innovative services. [easy-tweet tweet="Did Facebook cross the digital colonisation line? " user="comparethecloud, @billmew"] Recently HPE gave up the fight to take on the giants in public cloud and refocused on private and hybrid instead. Along with HPE, players like VMware and Red Hat now all see the market heading in the direction of hybrid and have allied themselves with Microsoft’s Azure public cloud. Microsoft has even provided them with its Azure Stack to allow them to develop their own private and hybrid clouds in its image – effectively seeking to rally them all to its banner and its public offering. So what are the options? Should local CSPs, MSPs and telcos bet their future on OpenStack? Will the community be able to make the strides in usability, reliability and innovation that it has been promising for so long? Should they follow HPE, VMware and Red Hat and use the Azure Stack to develop their own private and hybrid clouds that would link seamlessly with Azure? But if everyone is doing this, then how can individual firms differentiate themselves? Should they throw their lot in with AWS instead and ally themselves with the market leader with the largest pool of skills, largest client base and largest marketplace? However, Amazon’s retail operation has been quick to decimate rivals where and when it saw that there was money to be made (just ask book or DVD sellers), and who is to say that they won’t eat your lunch too? Once regulatory issues have been ironed out (such as Privacy Shield, which it is hoped will replace Safe Harbour on the data privacy front), the international reach of cloud services will make this a truly global market in a way that no other market has been before. Local support and service will remain important, and concerns about security, provenance, and sovereignty will persist, but the market will have global dynamics and reach. [easy-tweet tweet="Does choosing a global giant provider put you at risk of finding one day that you’ve been colonised?" user="billmew"] So… do you take a stand against the global giants or do you work with them at the risk of finding one day that you’ve been colonised? Make your mind up soon because your future depends on your decision. ### Classics never die What do the winklepicker, the majestic perm and thin client computing all have in common? One answer could be that they have all had their day - more than once. Trends are often said to go in cycles, but some would argue these ones never really went away and that they've been plugging away in the background, waiting to be reborn once again. [easy-tweet tweet="Jonathan Wilkins tracks the life of #thinclient #computing - the concept of one centralised computer power"] Thin client computing refers to a network of machines with limited inbuilt processing power and memory. These devices work by using a mainframe to provide central processing and storage.
Traditionally, this involved a user working at an output device, or display monitor called a dumb terminal. These had very little computational power to do anything except display, send and receive text. A dumb terminal did not have much processing power or memory, so the user couldn't save things locally. Their use was very limited and was mainly reserved for inputting data – believe it or not, they didn't even have Minesweeper. [easy-tweet tweet="A #DumbTerminal didn't even have the power to play #Minesweeper" user="comparethecloud"] Similar to the perms of 80s footballers, the benefit of thin client computing was that it was rugged and required infrequent and inexpensive maintenance. Terminals had no hard drive, fans, motherboards or inputs that could break down, so maintenance was relatively inexpensive. In comparison, modern PCs - even industrial PCs - are expensive to maintain. Operating system vulnerabilities, constant patching, patches that break other applications and users downloading viruses are just some of the issues. Not to mention the added cost of renewing licenses for anti-virus, firewall, office suites and operating systems. Even if properly secured and administered, things can go wrong for a variety of reasons. In the 1970s and 80s, thin client computing was a staple in many industries In the 1970s and 80s, thin client computing was a staple in many industries until the 90s rolled around and brought with them house music and the rise of PC networks. These had their own processing power and the humble dumb terminal and mainframe became overlooked, with many predicting their obsolescence by the dawn of the millennium. [easy-tweet tweet="The 90s brought the rise of house music and PC networks " user="euautomation @comparethecloud"] However, like many trends, thin client computing never did truly die out. Sectors from finance to government services continued to use thin clients because of their reliability, security and simplicity. This brings us to using thin clients with the cloud. Improvements in thin client hardware, software, connection protocols and server technology have meant a resurgence of thin client computing in a cross section of industries. In the past, thin clients were not quite able to deliver the power needed for high performance demands, but not anymore. Mainframes and dumb terminals have now given way to high-performance cloud-based workstations that you can carry around in your pocket. One of the main advantages of thin client computing today is that when smartphones, tablets and thin client computers are combined with the central processing power of the cloud, they enable users from multiple locations to send files to each other with ease. Or work on live documents where there's no confusion about the latest version or revision. Encrypted access to private clouds also means thin client computing can actually be safer than PCs. Security has taken a step up too. Encrypted access to private clouds also means thin client computing can actually be safer than PCs. Furthermore, thin client computing negates a common problem with PCs in the workplace: users plugging in infected USBs. Many thin clients don't have USB ports for this exact reason, or their USB ports can be deactivated to protect the network. The moral of the story is that you should never dismiss a technology just because it's obsolete, or considered to be so. 
The technology columnist Stewart Alsop famously stated the last mainframe would be turned off in 1996, but they're still going strong today and Alsop was later forced to eat his words. [easy-tweet tweet="Never dismiss a #technology just because it's obsolete, or considered to be so says Jonathan Wilkins"] Similarly, the world of industrial automation still uses many obsolete motors and inverters that meet current energy efficiency standards and can fit into your Industry 4.0 applications, if you give them a chance. Sometimes you have to go backwards to go forwards. You never know, 2016 could be the year we start seeing footballer-style perms again. ### London StartUp Curve to close the gap between traditional and mobile payments Today, Curve launches to simplify payments and reshape people’s experience of managing their money. The London-based startup allows people to combine an unlimited number of bank cards into one physical payment card, accepted everywhere MasterCard cards work. It is supported by a mobile app that gives people a clearer picture of their payments. This includes services such as the ability to see all transactions in one place - in real-time, access to super low foreign currency rates with no fees, and using Amex at places it isn’t accepted. Curve offers the advantages of products such as Apple Pay, but in the widely accepted form of a card and without the £30 transaction limit. [easy-tweet tweet="#CloudNews: Curve launches to simplify payments and reshape people’s experience of managing their money"] Curve is built on the MasterCard Network and works the same way as every other card, supporting the latest Chip and Pin, magstripe and Contactless technology. Curve is a Prepaid MasterCard, so has bank-level protection and all the security features of traditional cards. “With Curve you have one card and see every single transaction in one screen. We’re not another new bank or extra service to deal with, we transform your existing fragmented financial world into somewhere crystal clear, designed for the user,” says Shachar Bialick, CEO and co-founder of Curve. “Mobile payments have the potential to bring similar benefits, but cards work everywhere and people are used to them. So we’ve created the best of both worlds - all the benefits of mobile payments via a ground-breaking card.” “The concept of payments as we know it is on the brink of being shaken up in a meaningful way,” says Taavet Hinrikus, CEO and co-founder of Transferwise and investor in Curve. “Today, Curve makes the first step at being the company to finally make this significant change in impacting how people pay for goods and services.” “At MasterCard we are at the forefront of supporting pioneering payment solutions that improve people’s day to day lives,” says Eimear Creaven, Group Head of Growth Markets. “Using our network, Curve is bringing to market genuine innovation in payments and helping customers to manage their financial lives better.” HOW IT WORKS The Curve card can be ordered through the website (www.imaginecurve.com) or via the Curve app. The setup process is simple: users download the app and scan an unlimited number of existing debit and credit cards through the Curve app, which syncs them with the Curve card. The Curve card looks and works just like one of your standard bank cards with Chip and Pin, magstripe and Contactless technologies, and doesn’t require a battery. You set a default card in the app and can change it to another card with one swipe before making a payment.
The Curve app displays every transaction made, with a running total across each month. Manage and understand your spending on the go by labelling transactions with a swipe of your finger. The Curve team offers 24-hour customer support, and protection against lost or stolen cards. The Curve card has an initial one-time cost of £35, which includes shipping and is payable only when the card is sent out. The card and app are free to use. SECURITY Curve protects financial information with industry-leading security and fraud prevention systems. With Curve your original card details and data are never revealed to the merchant. Instead, when making a purchase with Curve, online or in-store, a unique token is used to process your payment, keeping identity and actual card numbers safe. These added layers of security protect against fraud and remove the worry that often comes when a bank card is lost, as the original cards are safely at home. Initially available on iOS, Curve soft launches today in the UK to a limited number of users in the business community – including businesses, entrepreneurs and freelancers. It will roll out to consumers in the coming months. People can sign up and order by downloading the app, or at www.imaginecurve.com. ### Richard Agnew from Veeam has some big news for CTC's #CloudTalks Richard Agnew of Veeam speaks with Compare the Cloud for #CloudTalks. Read the full press release for Veeam's latest Availability Report below. [easy-tweet tweet="Richard Agnew of @Veeam speaks to @Comparethecloud for #CloudTalks"] 2016 Veeam Availability Report: Availability Gap Widens With Application Downtime Costing Enterprises $16 Million Each Year The study illustrates that there is an alarming disconnect between user demands and IT’s ability to enable the 24/7 Always-On Enterprise, with the number of downtime incidents and the length of downtime greater than in 2014 New research from Veeam® Software, the innovative provider of solutions that deliver Availability for the Always-On Enterprise™, clearly illustrates that despite numerous high-profile incidents in the last year, enterprises are still not paying enough attention to the needs of their users. In its fifth year, the Veeam Availability Report showed that 84 percent of senior IT decision-makers (ITDMs) across the globe, a two percentage point increase on 2014, admit to suffering an ‘Availability Gap’ (the gulf between what IT can deliver and what users demand).  This costs businesses up to $16 million a year in lost revenue and productivity, in addition to the negative impact on customer confidence and brand integrity (according to 68 percent and 62 percent of respondents respectively). This figure has risen a staggering $6 million in 12 months, despite almost all respondents saying that they have implemented tightened measures to reduce availability incidents, and despite 48 percent of all workloads now being classified as ‘mission-critical’ (rising to 53 percent by 2017). With the world’s connected population soaring to record levels last year (3.4 billion people, or around 42 percent of the globe) and predictions that there will be almost 21 billion connected devices by the end of 2020, the need to deliver 24/7 access to data and applications has never been more important. However, it seems that enterprises have not received that message, despite more than two-thirds of respondents stating that they have invested heavily in data center modernisation specifically to increase availability levels.
“When you talk to more than 1,000 senior ITDMs you expect that there will be some that are still struggling to deliver on the needs of the Always-On Enterprise – the Enterprise that operates 24/7/365, but these findings are alarming,” stated Ratmir Timashev, CEO at Veeam. “Modern enterprises are becoming software-driven businesses, so IT departments can no longer get away with services that are ‘ok’; always-on availability is paramount. However, since our last study, the number of annual unplanned downtime events has increased (from 13 to 15), and they are also lasting longer and taking far more time to recover from. In today’s economy, where speed and reliability are imperative, this is unacceptable. If this trend continues, I fear for the companies we surveyed.” [quote_box_center] Key findings in the 2016 Veeam Availability Report include: Availability is of paramount importance… yet enterprises are failing Users want support for real-time operations (63 percent) and 24/7 global access to IT services to support international business (59 percent). When modernising their data centers, high-speed recovery (59 percent) and data loss avoidance (57 percent) are the two most sought-after capabilities; however, cost and a lack of skills are inhibiting deployment. Organisations have increased their service level requirements to minimise application downtime (96 percent of organisations have increased the requirements) or guarantee access to data (94 percent) to some extent over the past two years, but the Availability Gap still remains. To address this, however, respondents stated that their organisations are currently, or are intending in the near future, to modernise their data center in some way – virtualisation (85 percent) and backups (80 percent) are among the most common areas to update for this purpose. Data at risk SLAs for recovery time objectives (RTOs) have been set at 1.6 hours, but respondents admit that in reality recoveries take 3 hours. Similarly, SLAs for recovery point objectives (RPOs) are 2.9 hours, whereas 4.2 hours is actually being delivered. Respondents report that their organisation, on average, experiences 15 unplanned downtime events per year. This compares to the average of 13 reported in 2014. With this, unplanned mission-critical application downtime length has increased from 1.4 hours to 1.9 hours year over year, and non-mission-critical application downtime length has increased from 4.0 hours to 5.8 hours. Just under half of respondents test backups only monthly, or even less frequently. Long gaps between testing increase the chance of issues being found when data needs to be recovered – at which point it may be too late for these organisations. And of those that do test their backups, just 26 percent test more than 5 percent of their backups. ‘Financial’ impact is substantial As a result, the estimated average annual cost of downtime to enterprises can be up to $16 million. This is an increase of $6 million on the equivalent 2014 average. The average per hour cost of downtime for a mission-critical application is just under $80,000. The average per hour cost of data loss resulting from downtime for a mission-critical application is just under $90,000. When it comes to non-mission-critical applications, the average cost per hour is over $50,000 in both cases. Loss of customer confidence (68 percent), damage to their organisation’s brand (62 percent) and loss of employee confidence (51 percent) were the top three ‘non-financial’ results of poor availability cited. [/quote_box_center]
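To make the arithmetic behind a headline figure like this concrete, here is a back-of-envelope reconstruction using the survey's averages. The split of incidents between mission-critical and non-mission-critical workloads is an assumption for illustration only; Veeam's own methodology may weight the responses differently, and the "up to $16 million" figure reflects the upper end of responses rather than a simple average.

```python
# Back-of-envelope estimate of annual downtime cost from the survey's averages.
# The 50/50 split of incidents across workload types is an assumption.

incidents_per_year = 15

mission_critical = {
    "share_of_incidents": 0.5,         # assumption
    "hours_per_incident": 1.9,         # survey average
    "cost_per_hour": 80_000 + 90_000,  # downtime + data loss, per hour
}
non_mission_critical = {
    "share_of_incidents": 0.5,         # assumption
    "hours_per_incident": 5.8,         # survey average
    "cost_per_hour": 50_000 + 50_000,  # downtime + data loss, per hour
}

def annual_cost(profile):
    incidents = incidents_per_year * profile["share_of_incidents"]
    return incidents * profile["hours_per_incident"] * profile["cost_per_hour"]

total = annual_cost(mission_critical) + annual_cost(non_mission_critical)
print(f"Estimated annual downtime cost: ${total:,.0f}")
# Roughly $6.8 million with these assumptions - well short of the $16 million
# upper bound, which shows how sensitive the figure is to incident mix and duration.
```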
“While it’s easy to think that this survey paints a picture of doom and gloom, there are shoots of optimism as well,” added Timashev. “Almost three quarters of enterprises recognise that they have areas which need addressing and intend to do so in the next six to 12 months. It is not always easy to divert precious funds to invest in infrastructure, but there is acceptance that this needs to be done. We are seeing enterprises starting to realise the importance of availability solutions and, in particular, the role cloud and cloud-based services such as Disaster Recovery as a Service (DRaaS) can play. Enterprises appreciate the need for an Always-On, always-available operation and I am confident that users will see this become a reality sooner rather than later.” To find out more, please visit https://www.veeam.com/ and to download the full version of Veeam’s Availability Report, click here. ### Do the metrics of a data centre impact upon its agility? IT operations managers need to understand how their data centre is performing in order to make better decisions about how and where to invest in their IT infrastructure. Key to those right-sized investment decisions is looking at the various ways to measure performance; here are five that should be considered top priorities. [easy-tweet tweet="IT operations managers need to understand how their #datacentre is performing" user="atchisonfrazer @comparethecloud"] How can you merge heuristics with data centre metrics in a way that delivers the best use of space and cooling power? It used to be that you would only use heuristics to identify network security traffic patterns. However, you can now apply heuristics to infrastructure performance issues so that you can quickly identify and act on any potential hacks. In order to get better information, it’s best to be able to cross-reference NetFlow metrics with different types of KPIs from your more conventional silos. Doing this helps you identify any contention issues as well as laying the foundations for a more intelligent investment plan. This increases productivity and makes your system far more efficient. How can heuristic-based data centre metrics help our operations? As the modern data centre has become more and more complex (think conversational flows, replatforming, security, mobility, cloud compatibility), heuristics has become more and more important. This technology gives us the capability to perform back-of-envelope calculations as well as taking the risk out of human intervention. The end product is ideal: a machine-learned knowledge base. Is it possible to properly model costs as well as operational metrics? When it comes to managing and upgrading our systems, most of us have to make do with a fixed budget. This means that we are susceptible to ‘over-provisioning’ hardware and infrastructure resources. The main cause of this is that we can’t properly see the complexities that come as part and parcel of a contemporary data centre. [easy-tweet tweet="We over-provision because we can’t properly see the complexities that come with a contemporary #datacentre"] What you need is an effective infrastructure performance management tool. This will help you properly calculate your capacity and make a better-informed investment decision, which means that you won’t overspend in a bid to prevent overloading that you can’t even see.
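As a rough illustration of the kind of capacity check such a tool automates, the sketch below compares allocated resources with peak observed utilisation and flags instances that look heavily over-provisioned. The 40% threshold and the sample data are assumptions chosen purely for illustration.

```python
# Minimal sketch: flag over-provisioned instances from allocation vs peak usage.
# Threshold and sample figures are illustrative assumptions, not a product's logic.

instances = [
    # name,      vCPUs, peak CPU %, RAM GB, peak RAM %
    ("web-01",   8,     22,         32,     35),
    ("db-01",    16,    78,         64,     81),
    ("batch-01", 8,     12,         32,     18),
]

OVERPROVISION_THRESHOLD = 40  # flag anything peaking below 40% of its allocation

for name, vcpu, cpu_peak, ram, ram_peak in instances:
    if cpu_peak < OVERPROVISION_THRESHOLD and ram_peak < OVERPROVISION_THRESHOLD:
        # A smaller instance would still leave comfortable headroom
        print(f"{name}: over-provisioned (peaks at {cpu_peak}% CPU / {ram_peak}% RAM)")
    else:
        print(f"{name}: sizing looks reasonable")
```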
Can financial metrics-based modelling benefit data centres? [easy-tweet tweet="Financial metrics allow #datacentre managers to deliver concise, easy to read metrics to people outside of #IT"] Data centre managers can deliver core IT operational metrics to business-side managers in a language that makes sense, by continuously illustrating a realistic view of available capacity against overall efficiency, acceptable performance thresholds between running hot and wasted “headroom”, and a greater degree of granularity in terms of the ROI benefits of virtualisation over maintaining legacy infrastructures. Using financial metrics, data centre managers can deliver concise, easy to read metrics to people outside of the IT industry. This includes people such as business-side managers and other stakeholders. This is achieved through simply showing available capacity against overall efficiency as well as performance thresholds. Showing return on investment makes it easier to communicate your good performance to peers. You will find below some of the best metrics with which to demonstrate ROI analysis:
- TB Storage Reductions: Variable Cost / GB / Month ($)
- Server Reductions: Annual OPEX / Server ($)
- VM Reductions: Variable Cost / VM / Month ($)
- Applications Reductions / Year ($K)
- Database Reductions: Variable Cost / Database / Year ($)
- Consulting / Contractor Reductions ($K)
- Revenue Improvement / Year ($K)
- Blended/Gross Margin (%)
- Facilities & Power / Year ($K)
- Ancillary OPEX Reductions / Year ($K)
Is it possible for data centre managers to provide a holistic operational metrics solution? One of the best ways to visualise performance is through a high-fidelity streaming multi-threaded dashboard. These real-time operation dashboards provide easy to understand intelligence made up of data points and their key interdependencies, which include endpoint devices, physical infrastructure and virtualised applications. The best way to ensure that you minimise the negative impacts of a service outage is to automate your system. We would recommend integrating with an IT operations management platform like ServiceNow. This helps increase agility and responsiveness. However, none of this is possible without good data and visualisation. In order to predict the future, you need to understand what’s happening in the now. ### IT Recruitment and the munching cloud The IT industry marketing machine would have you think that I'm writing my job/skills predictions sat on a sun-kissed beach, tapping my prophecies into a totally secure cloud which sits in a container, flexing up and down seamlessly to meet the demands of the myriad other Mystic Meg jobs/skills pundits across the globe. Somewhere bots are automatically creating storage/admin space just for my prophecies, and occasionally a human looks at a single pane of glass, checks everything's fine and dandy and then lets out a long, self-satisfied yawn. [easy-tweet tweet="How is #IT recruitment changing with the industry? asks @MillsHill_Neil" user="comparethecloud"] Of course this image isn’t real, but the reality is very, very close.
To be quite frank, big structural change is happening, and there is power in knowing and accepting this change, particularly in the corporate / enterprise marketplace. My favourite saying at the moment is 'cloud is munching everything in front of it' and that’s true. The big corporate capitalist machine loves to save money - if expensive / complex enterprise-grade infrastructure can sit in the cloud more cheaply and efficiently and reduce CAPEX, then the CEO and the board will be game on. Yet, of course, not everything can or should sit in the cloud - certain industries / sectors cannot have 'stuff' sat in a cloud. So think hybrid. cloud is munching everything in front of it says @MILLSHILL_NEIL We are seeing the effects of this change on jobs / roles in the IT industry. We are seeing traditional IT services players morphing into Cloud / Managed Services providers. This sector is buoyant; they are busy transitioning new clients of all sizes onto their cloud / platform. So if you think it through, the need for on premise IT teams is being reduced - yes, the IT industry marketing machine will have you believe that cloud releases internal IT to focus and solve the bigger core issues of the business; but don't believe the hype, the reality is that it leads to lower on premise staff numbers. So after seeing and understanding this big picture, what else are we seeing/predicting? Here are my thoughts on the future of recruitment in IT. Internal Infrastructure Support Engineers are moving to Cloud / Managed Services providers, and Cloud / Managed Service providers need staff to woo, win and technically prove why to move to the cloud; thus more sales / presales solutions architects are needed. Cloud / Managed Service providers are realising that winning new business is the easier part of the equation; the bigger part is retaining business, so they are requiring more engagement managers, client account managers, project managers and service delivery managers. There will be a big reduction in the number of road-warrior IT consultants plumbing new technologies into on premise clients, e.g. why plumb in a new tech when you can buy it in from a cloud vendor? There is also a reduction in on premise legacy old school storage roles needed, but conversely old school storage contract rates may rise. why plumb in a new tech when you can buy it in from a cloud vendor? In house desktop support roles are being vastly reduced, although we are seeing a transition of that role into a VIP support role - someone's got to be available 24/7/365 if the CEO can't access the cloud from their iPad, right? Corporate IT staff numbers are being reduced. Those remaining are becoming highly skilled all-rounders, e.g. infrastructure, storage, virtualisation, networking, accessibility, security, etc - they have to be able to cover all the bases. There has been a big rise in Cloud Solution Architects, both in service providers and on the client side. Someone's got to design this stuff, right? Then we’ve had a surge of Cloud Platform Architects too - big platform providers in particular are the winners here; think AWS, Azure and possibly Google. Cloud Security - there is big demand here, so it’s a no-brainer that cloud security professionals are needed. Ditto with DevOps - there is demand for the full suite of infrastructure skills, plus an understanding of developer languages, so that new and old stuff can simply be built and work in the cloud.
[easy-tweet tweet="There is high demand for #Security and #DevOps professionals says @millshill_neil" user="comparethecloud"] I’m expecting to see an increase in regards to Cloud Apps Architects - a role where old legacy applications are 'stripped'/coded and just to made to work in the cloud. The same for Cloud API Architects - roles where applications/infrastructure is built so that 3rd parties can access and pay for a 3rd party 'cloud machine' to 'crunch data', paid per cycle/call/'crunch'. This is Big opportunity for those with the skills to execute. Big Data / Analytics / Digital - this is going to be all about how the corporate uses its internal / social media data. Big opportunity. Companies need people to know how to build apps/infrastructure to message me on Twitter when I'll get the drone delivery of that product I thought about 5 mins ago right?  we’re moving towards Networking / Wireless everything And in conclusion, we’re moving towards Networking / Wireless everything - everything sitting on a network, always on, accessible from anywhere, and that will bring a whole realm of recruitment challenges in itself. ### IaaS, the infrastructure IT utility? When a business today opens an electricity bill it sees the amount of electricity used over the billing period and the cost per kilowatt hour. What the business doesn’t see is a separate flat rate allocated per appliance and to the various pieces of electronic equipment that have been deployed in the office. Imagine paying a flat rate for the electricity to power your air conditioning, but only using it heavily for 2 months a year; commercially it doesn’t make sense. Yet we are willing to accept the flat rate model and the complexities that go with it when we buy and deploy cloud IaaS. [easy-tweet tweet="Should we be paying for #IT and #cloud the same way we pay for utilities? " user="comparethecloud" hashtags="utilitycomputing"] Is the current cloud infrastructure (IaaS) market, delivering an easy to use, consumer friendly service, similar to common utilities? Or should we be looking at the benefits of an alternative commercial model, utility computing, which differs from the widely used cloud IaaS model. When comparing IT infrastructure to a utility model of delivering a service it is important to define what we mean. Standard utilities such as electricity and gas are delivered in a model where we, the consumer, only pay for the actual amount we use. Generally, this is considered to be the fairest way to buy utilities, as well as other goods and services such as food and commodities. All associated costs for the service, such as production and delivery, are included in the amount we pay at the end of the billing cycle. The key to the success of the commercial model is there is a single unit of measurement that enables the service provider to measure, and charge for the amount of resource that we have consumed. the single unit of measurement is what makes the utility sales model successful This model essentially removes the consumer from any of the responsibilities and financial investment into the supply, production and maintenance of the utility. The consumer is only concerned about the unit cost, the amount of resource used, and the reliability of the supplier. With regards to changing supplier, all suppliers use the same unit of measurement, and are therefore easily comparable. The complexities of managing supply and demand are removed from the consumer and the buying process is simplified to make the resource easy to use. 
Let’s look at how this model translates into the world of IT infrastructure and the cloud IaaS market. Traditionally, organisations / the consumer owned and invested in the technology and people to deliver their IT. The organisation took on the responsibility of managing supply and demand, so they could deliver data and applications to the business. They also, inadvertently, took on headaches such as keeping pace with changing technologies, recruiting and skilling specialist staff, scaling to meet demand, financing ad-hoc capital investment and building specialist facilities - the list goes on. [easy-tweet tweet="Should there be a single unit of measurement for #Cloud services? " user="comparethecloud"] This is not a simple model for the consumer to understand, and therefore control. As a consequence, costs and delivery standards were, and still are, wildly different between organisations. The cloud era has brought about IaaS and more of a utility aspect to infrastructure delivery. Cloud Service Providers (CSPs) are specialised and the consumer pays on an opex basis, with no ownership of assets and no responsibility for supply and demand. However, it is not common practice to pay for IaaS as a true utility, or in other words to only pay for the resources that have been used. if you deploy 100 IaaS instances and most of them are over-provisioned for the workloads they support, then you are wasting a considerable amount of money Most CSPs sell their cloud IaaS instances at a flat rate depending on the amount of resources (CPU, storage etc) allocated. It doesn’t matter how well utilised the instance is over the billing period; the customer pays the full amount. Therefore, if you deploy 100 IaaS instances and most of them are over-provisioned for the workloads they support, then you are wasting a considerable amount of money. For context, we see far more over-provisioned servers than we do under-provisioned. The other problem is variation in sizing (resources allocated) and flat rate pricing models for the IaaS cloud instances. Because each CSP’s rates and instance sizes differ, it is an overly time-consuming and complex task to work out who is offering value for money, and then to forecast your future spend. The cloud model has changed infrastructure delivery for the better; however, it is still not delivered in a consumer- or business-friendly model. What both the traditional investment ownership and cloud IaaS models have in common is that billing units are fixed logical containers of resource. The logical containers have just progressed from physical server, to virtual server, to cloud instance. If we want to be more efficient with infrastructure usage and IT spend, the industry needs to look to our common utilities for inspiration. to be more efficient with infrastructure usage and IT spend, the industry needs to look to our common utilities for inspiration For a utility computing model to be effective in the IaaS space there must be a common unit of infrastructure measurement. 6fusion’s Workload Allocation Cube (WAC) is a patented algorithm used to formulate a single unit of IT measurement that combines six compute resource metric readings. A thousand WACs is equal to one kWAC hr, similar to gas and electricity utilities. This unit of IT measurement is already in use to facilitate trade on a utility computing spot exchange called the Universal Compute Xchange (UCX). The exchange has a number of vetted CSPs selling their IaaS products through its online trading platform.
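The point of a single unit of measurement is that several different resource readings collapse into one billable number. The actual WAC algorithm is patented and its weightings are not reproduced here; the sketch below is a simplified, hypothetical stand-in that only shows the shape of the idea - normalise each metric, sum into units, bill on usage.

```python
# Hypothetical normalisation of several resource metrics into one billable unit.
# Weights, metric names and prices are illustrative assumptions, not 6fusion's WAC.

# One hour of measured consumption for a workload (illustrative figures)
usage = {
    "cpu_ghz_hours": 4.2,
    "ram_gb_hours": 16.0,
    "storage_gb_hours": 120.0,
    "disk_io_gb": 3.5,
    "lan_gb": 1.2,
    "wan_gb": 0.4,
}

# Hypothetical weights turning each metric into 'units' of consumption
weights = {
    "cpu_ghz_hours": 1.0,
    "ram_gb_hours": 0.5,
    "storage_gb_hours": 0.01,
    "disk_io_gb": 0.2,
    "lan_gb": 0.1,
    "wan_gb": 0.3,
}

units_consumed = sum(usage[k] * weights[k] for k in usage)
price_per_unit = 0.002  # illustrative market rate per unit

print(f"Consumption this hour: {units_consumed:.2f} units")
print(f"Charge this hour: ${units_consumed * price_per_unit:.4f}")
```

Because every provider would be metered in the same unit, comparing suppliers or forecasting spend becomes a matter of comparing a single price per unit, just as with kilowatt hours.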
The best way for customers to buy via the exchange is to engage a registered broker. The broker role is a relatively new one in IT but not uncommon in other industries. The broker will be a member of the exchange and has access to better pricing than non-members. Their role is to facilitate trade between the buyers and sellers. However, as this is IT, a buyer may have specific needs for their workloads, so the broker will need to understand the requirements and purchase units with all the necessary assurances in place. The requirements organisations currently have, like performance, security, compliance and service levels, will still need to be addressed. [easy-tweet tweet="What the #utilitycomputing marketplace gives the customer is pricing transparency and usage based billing"] What the utility computing marketplace gives the customer is pricing transparency and usage-based billing, so finance and business stakeholders can see where their IT spend is going. The simple commercial model also enables benchmarking and forecasting of future consumption and spend. This model seems superior to the currently more popular flat rate per IaaS instance, where the price is calculated on the resources allocated. Benchmarking and forecasting future deployments and spend is inherently more difficult in the flat rate model. If the future of IT infrastructure is to be treated as a utility and a commodity that is on tap to drive innovation and customer engagement, then consumers must demand a more commercially savvy charging model. It’s time to consider utility computing and the commercial benefits it can bring. ### Cloud Financial Services Asia, 23-24 February 2016 23-24 February 2016 Hong Kong [easy-tweet tweet="#CloudFSAsia is happening in Hong Kong on 23-24 Feb 2016 #CloudFS #Fintech"] Cloud Financial Services Asia is the thought-leading forum on cloud migration in Asia. Taking place on the 23-24th February in Hong Kong, this event aims to bring together cloud infrastructure and architecture leads from major financial institutions with both local and international solution providers. The agenda has been specifically designed to provide the depth of knowledge required to inform a complex procedure such as moving core infrastructure services to the cloud, and to facilitate a unique networking opportunity. Download the brochure to find out more. #CLOUDFS #CLOUDFSASIA ### Finding your gurus: How tech recruitment has changed The recruitment process must continually adapt and evolve to meet changes to its business environment. In the technology industry, these changes seem to occur at an ever-faster pace, with smartphone technology, cloud computing and the Internet of Things just some of the innovations to have been introduced in recent years. These new technologies are accompanied not only by new skill sets, but also by cultural changes within the workplace that must be reflected in how businesses acquire and retain talented employees. With the rate of change showing no signs of slowing in the tech industry, recruitment teams need to show adaptability of their own. [easy-tweet tweet="The #recruitment process needs to adapt and evolve to meet changes in industry says @Veberhost" user="comparethecloud"] One of the most significant changes to tech recruitment in recent times is the emphasis now placed on flexibility.
In the not too distant past, many tech jobs would see employees working nine to five in an office environment, tethered to their PCs. However, the growth of smartphone technology and cloud computing has meant that employees can access vital work resources from anywhere in the world. This has broadened recruitment in the tech industry beyond geographical confines. It no longer matters if the best software developer or cloud architect is located thousands of miles away, as technology firms are able to benefit from greater mobility than ever before. tech recruitment in recent times now has emphasis placed on flexibility It is not just remote working that has shaken up the tech industry, however. Because the expertise required to work for a technology business is liable to change as new hardware, software and methods are promoted, recruitment is beginning to identify the importance of broader skillsets. For example, DevOps, wearable technology and even cloud computing were not mainstream concepts even ten years ago and it is difficult to say with certainty what new innovations are on the horizon. As such, tech recruitment now focuses on individuals with adaptable talents that can react to new developments. Businesses cannot replace staff wholesale when the latest tech innovation emerges and so recruitment teams are recognising the importance of flexible candidates. How businesses find these candidates is also being impacted by technology. In addition to the formal recruitment process, social networks are being used by businesses to gain greater insights surrounding potential employees. Previously we have utilised LinkedIn to get a greater understanding of candidates’ strengths and weaknesses prior to interview, but other social media outlets are also of value to the modern day recruiter. A candidate’s Twitter profile, for example, can be a great way of demonstrating a passion for their industry and as a networking tool. [easy-tweet tweet="A candidate’s Twitter profile can be a great way of demonstrating a passion for their industry" hashtags="recruitment"] When undergoing the recruitment process it is also important to recognise how it differs between large and small tech firms. Larger organisations often utilise huge recruitment networks, where team leaders liaise with HR staff to source the best possible candidates. Microsoft, for example, currently employs more than 115,000 members of staff worldwide. In order to manage this huge talent pool, large organisations often display well-defined career paths, generous benefits and a very targeted recruitment process. Smaller firms, on the other hand, often attempt to foster a more entrepreneurial spirit, offering rapid progression if the business succeeds. veber retains an entrepreneurial spirit at the heart of its recruitment process Veber has grown significantly since being founded in 1999, but still has this entrepreneurial spirit at the heart of its recruitment process. As well individual responsibility and drive, we also expect our employees to be flexible, self-aware and with a passion to continue learning. [easy-tweet tweet="Knowledge, drive, self-motivation and a desire for personal development are valued by #tech #recruiters"] Even with all the changes taking place within tech recruitment, some of the most valuable attributes have remained the same. Knowledge, drive, self-motivation and a tendency towards continued personal development have been valued by recruiters for many years and will continue to be held in high regard. 
However tech recruitment, in particular, has had to demonstrate a willingness to adjust its methods to better reflect technological innovations and changes in workplace culture. With the tech industry remaining committed to change at all levels, it seems likely that recruitment teams, and indeed potential candidates, will need to continue demonstrating their adaptability for the foreseeable future. ### Take one Safe Harbour regulation, shred it, reheat it and what do you have? Privacy Shield The EU and US have agreed a new pact to replace the data transfer mechanism called Safe Harbour that was declared invalid late last year. It is hoped that the new pact, called Privacy Shield, will make it easy for organisations to transfer data across the Atlantic, countering the threat that tech firms of all sizes have been facing, which would have made it impossible to send personal information for processing in US data centres. [easy-tweet tweet="The EU/US #PrivacyShield announcement analysed by @APJ12 of @GTTCOMM" user="comparethecloud"] Question: How many lawyers does it take to overturn a 15-year EU-US privacy arrangement? Answer: None. Max Schrems wasn’t a qualified lawyer. He was just a student from Austria studying law during a semester abroad at Santa Clara University in Silicon Valley when he started a campaign against Facebook for privacy violations - including its violations of European privacy laws and alleged transfer of personal data to the US National Security Agency (NSA) - that eventually led to the downfall of the whole EU-US data transfer mechanism called Safe Harbour. Ever since Safe Harbour was overturned, businesses have been seeking a quick and clear resolution. They have been encouraging the European Commission and US administration to move quickly to agree and implement a new arrangement that would allow trans-Atlantic data flows to resume on a secure and stable legal footing. [easy-tweet tweet="Ever since #SafeHarbour was overturned, businesses have been seeking a quick and clear resolution"] International data transfers not only enable global trade, but are also central to many companies’ ability to collaborate and operate both internally and with the partners and clients that they serve. The delay in negotiating a replacement for the previous, but now defunct, EU-US data transfer mechanism has left firms in limbo without a safe legal footing for such data transfers, which are seen as critical to the global digital economy. The new agreement is called the EU-US Privacy Shield and it includes the following provisions: A US ombudsman will be created to handle complaints from EU citizens relating to any allegations of Americans spying on their data. A written commitment protecting Europeans' personal data from mass surveillance will be provided by the US Office of the Director of National Intelligence. An annual review conducted by the EU and US will ensure the new system is working properly. A host of bodies including the European data privacy watchdogs, their US counterpart and the Federal Trade Commission will monitor arrangements and flag up any problems. Companies that are found to abuse or fail to comply with privacy safeguards could be prevented from making use of the trans-Atlantic data transfer arrangements. So that’s easy then - it’s all sorted. As ever, things aren’t quite as simple as this.
A few significant hurdles remain: Approval - national watchdogs across the EU are now examining the provisions outlined in the pact, and all 28 EU nations then need to approve the arrangements. Opposition - many privacy campaigners remain adamantly opposed to any pact that might allow trans-Atlantic data flows to continue and some have vowed to do all that they can to combat the EU-US Privacy Shield. Implementation - firms will then need clarity on what is expected from them in order to comply with the new privacy safeguards so that any changes to their systems and processes can be implemented effectively. So while the clarity and certainty in international data privacy regulation that businesses are crying out for is not quite here yet, we do at least have an indication of the likely provisions. There has already been significant progress towards finding a replacement data transfer mechanism for Safe Harbour and we now need to complete the approval process and make it work. ### CTC live from #IBMPWLC Bill Mew of CTC is reporting live from #IBMPWLC in Orlando, Florida. [scribble src="/event/1850924" /] ### Securing Access in a Cloud Environment According to the Oxford English dictionary, the modern-day definition of the word ‘privilege’ is “A special right, advantage, or immunity granted or available only to a particular person or group”. Mmm. I bet those tasked with securing their organisation’s IT wouldn’t use those words. For them, ‘privilege’ as in ‘privileged access management’ is a real headache. And one that’s about to get worse, particularly as organisations increasingly move to a cloud-based computing environment. [easy-tweet tweet="Privileged access management is problematic says Bruce Jubb of @Wallixcom" user="comparethecloud"] In fact, privileged access management is so problematic that it has emerged as the top risk in two separate studies carried out into cloud computing security. Gartner listed it as the number one risk amongst seven (Source: ‘Assessing the Security Risks of Cloud Computing’), whilst another report – this one from the Cloud Security Alliance – listed it as the most important of three key risks (the other two being Server Elasticity and Regulatory Compliance). So why does this area represent such an Achilles Heel for companies? The problem might lie in the ‘binary’ view that IT professionals have of IT security, with a world divided up into two groups: insiders and outsiders. Of these, outsiders have traditionally been viewed as being the most ‘hostile’ or representing the greatest threat, and so the majority of security resources are spent on 'defending the perimeter’. 55% of all cyber-attacks last year were carried out by people who had privileged access to an organisation’s IT system The reality is that these two groups have blurred to the extent that outsiders are now insiders. And insiders have the potential to do considerably more harm to an organisation. In fact, 55% of all cyber-attacks last year were carried out by people who had privileged access to an organisation’s IT system (IBM’s 2015 Cyber Security Index). And in a cloud-computing environment, where is the perimeter? With so many privileged accounts not just being made available to administrators and super users but routinely to external service providers too, how can these accounts be controlled and monitored in a truly effective way? Blaming the cloud service provider for ‘their’ lax security procedures may not cut any ice either.
According to Gartner’s ‘Top Predictions for IT Organisations and Users for 2016 and Beyond’, one prediction looked particularly chilling: ‘by 2020, 95 per cent of cloud security failures will be the customer’s fault’, they wrote. The problem – as I see it – falls into two main areas. (So fix those and you’ve fixed the problem). Problem number one concerns control. Being able to successfully manage users accessing the right resources at the right time dramatically reduces the risk of a breach. However the vast majority of firms are - for legacy reasons - reliant on directory services to control access and manage users of network infrastructure. The problem with that is it’s easy enough to grant access but hard to actively control or even revoke it. [easy-tweet tweet="It's easy enough to grant access but hard to actively control or even revoke it" user="wallixcom" hashtags="security, cloud"] Problem number two is around visibility. You may know that you have a set of privileged users who log into a critical infrastructure of systems containing highly sensitive data, but do you know when, for how long and what they’re doing during those sessions? So, what can be done to get a better handle on this vital group of users? IT admins are the beating heart of an organisation’s IT infrastructure (even if they are made up of contractors and not employees, but that’s a different topic) and so leadership teams are understandably daunted by the thought of disrupting ‘business as usual’. I’ve outlined five conditions that I believe must be met for the more efficient management of privileged users. Passwords: Those shared accounts have got to go. Organisations must have the ability to generate, hide, disclose, change or sustain passwords for target systems and secure them in a certified safe. Access control: Being able to define, award and easily revoke access to each system for each privileged user is a must. Monitoring: the ability to view and control the connections and user activity on systems, and generate alerts on events. This is not only a big help when it comes to compliance but also in the event of a breach. Seeing is believing: the ability to watch video recordings of privileged user sessions. Audit: the ability to create a reliable and enforceable audit trail of all privileged user activity on the target systems. Privileged accounts remain both a weakness for insider threats and a target for external attackers, and with more systems in place and more data in use, there are more privileged users than ever. Those tasked with securing these systems need to adopt and execute as efficient a privileged access management strategy as they can. ### The secrets of a successful big data implementation In the face of increasing austerity, the UK public sector must explore new ways of increasing operational efficiencies in order to generate cost savings, while improving the quality of services it provides to citizens. [easy-tweet tweet="The UK public sector must explore new ways of increasing operational efficiencies" user="comparethecloud" hashtags="bigdata"] For the public sector, adopting big data and predictive analytics can deliver enormous benefits that reveal new insights and solutions to complex problems, such as medical advances that can save lives. Big data can also help drive efficiencies by enabling more precise targeting of resources and delivering improvements in citizen services.
Preparing your internal organisation for big data To get maximum value from big data, internal organisation is a must, with at least a baseline understanding of its potential and the right governance in place to ensure the consistent building of solutions. An organisation needs to have the right resources in the right places. This may mean having data scientists in each department, or perhaps a science centre of excellence, depending on need. An organisation needs to have the right resources in the right places Managing the cloud infrastructure Although big data solutions can be deployed internally, the volume and connectivity requirements can exceed the data centre capacity available; and it can be challenging for a non-specialist IT team to scale out big data infrastructure on demand. Ultimately, the cost of ensuring the availability and durability of data managed in house tends to become prohibitive. For these reasons, many big data projects fail to realise their full potential, or may be abandoned. Cloud-based solutions, however, change the way that infrastructure is deployed and paid for, helping organisations break down the capacity and operational management barriers to big data solutions. many big data projects fail to realise their full potential Frameworks like G-Cloud enable public sector organisations to contract, order and set up cloud services in a matter of hours, putting an end to long procurement cycles. G-Cloud suppliers’ consumption-based commercial models and usage-based pricing make costs transparent, so budget and expenditure can be accurately tracked. The open data question Data that can be freely used by, modified by and shared with other public sector organisations is the most effective route to genuine insight and actionable intelligence. The UK government has a well-established open data agenda with aims that range from helping people understand how government works and how policies are made; to making raw data available to enable people to create useful new applications. [easy-tweet tweet="The UK government has a well-established open data agenda" user="comparethecloud" hashtags="data"] For an organisation to work effectively with shared data, it needs a highly available, highly connected repository with scalable capacity for storing and processing the data. Cloud computing is the most viable, cost-effective and rapidly deployable solution to this challenge. But organisations need to choose their provider carefully, to be confident that the private data stored alongside the public data will be secure, protected and accessible only by the organisation that created it and that it follows data protection regulations. Selecting the right tools and technology For public sector organisations, using open-source tools and technology is often preferable to their proprietary equivalents. The nature of open source helps avoid the vendor lock-in, which many have suffered in the past, finding themselves held to ransom on price or proprietary solutions. With the chance that the technology landscape will have moved on before a proof of concept is finished, an organisation needs to maintain close contact with the open-source community, and keep up to date with toolset development and overall development direction. However, there is a risk of encountering some dead-ends, committing time to a tool only to find the industry heads off in a different direction, leaving it unsupported and unmaintained. 
To minimise the risk of this happening, organisations should try to favour tools with a longer life which have been widely adopted. To reduce the risk further, they should choose tools whose project contributors come from more than one company. Furthermore, organisations must make sure the technology choices support the business’ long-term aspirations, or can be extended to do so. Deliver maximum ROI with big data In order to demonstrate value to stakeholders, the programme should be structured to deliver short-term value as quickly as possible, and include a vision for the longer term, with an agile structure that’s responsive to change, enabling the capitalisation of new opportunities as they occur. A big data implementation is often, in reality, a programme of business change, as well as a technology programme. Big data analyses can unlock insight hidden deep within data, which may challenge the organisation's existing understanding of its users, its customers, or the environment and lead to change within the business processes. To maximise the potential for success of a big data programme, a public sector organisation needs to start its big data journey by carefully assessing its business objectives and establishing a strategy before implementation. The most likely route to success is a business-first, technology-second approach. Overall, there is a huge potential for government departments to share data in a federated way that will allow each department to be more efficient, and give all departments a better overall picture of a given scenario. By using scalable and open-source technologies, the public sector can make huge savings and look at use cases that were previously seen as too complex or costly. Adopting a big data strategy will also ultimately lead to a better experience for citizens when they interact with government services. ### Data is the key to a customer’s heart on Valentine’s Day This time last year, Google revealed that there had been a year-on-year increase of 41% in ‘last minute Valentine’s gift’ searches. Time is an invaluable commodity in the modern world and, with the best will in the world, Valentine’s Day can creep up quickly on lovers. [easy-tweet tweet="Could #BigData be the key to keeping your sweetheart sweet this #ValentinesDay?" user="comparethecloud @NetApp"] The apparent increasing reliance on the Internet to save the day shows there is a massive opportunity for retailers to make Valentine’s Day a far less stressful time of year for their clients. All that is required is an approach that is customer curious rather than impersonal, and a more sophisticated use of data to inform marketing, promotion and sales strategies. Most leading online retailers nowadays will target promotions, discounts and offers at customers. This is increasingly happening on mobiles, taking factors such as location and buying habits into account. However, recommending that I buy something based on a previous purchase is not customer curious – it’s a fumble in the dark rather than a helping hand. Most leading online retailers nowadays will target promotions, discounts and offers at customers Customer curiosity is about asking questions of your customers so that you can develop a persona for them that is contextually aware, and using that data to provide a superior service.
For example, a florist sending me a text message when I’m at work saying that there is 25% off flowers until 10th February is not customer curious. If, however, I receive a text message when I’m walking through the area during the week preceding Valentine’s Day that says ‘10% off geraniums if you buy in-store’, because their data records show that I bought geraniums last Valentine’s Day and on my wife’s birthday (they are her favourite flowers), that is an experience that I will remain loyal to. It’s contextually aware and tailored to me. [easy-tweet tweet="Customers are loyal to experiences rather than brands, so use #bigdata to your advantage" user="comparethecloud"] In today’s economy, customers are loyal to experiences rather than brands, and brands that are customer curious will provide better experiences. Data is crucial to exercising customer curiosity, and retailers that use and manage data more intelligently than others will win the hearts of their customers this Valentine’s Day. ### When Sneaky Marketing Backfires Recently we have seen a proliferation of mistakes from marketing departments across the world. There have been some fantastic marketing spins that are only now biting back at the companies that implemented them. [easy-tweet tweet="Are you guilty of sneaky marketing? Are you ready for it to backfire? " user="rhian_wilkinson" hashtags="marketing, branding"] Let’s focus on the painkiller product Nurofen, as it has been revealed that its maker has been selling exactly the same product at inflated prices, changing nothing but the packaging and marketing materials. Essentially, someone came along and said hey! We should target our product here, here, and here, but let's brand it for each of them, without making it clear that it's actually all the same thing! They're definitely silly enough to think they need to buy twice the amount of product to fix one kind of problem! All of the Nurofen products in question use the same active ingredient, ibuprofen lysine, specifically 342mg. Yet each of these products has been marketed differently. The green packet for back pain, purple for migraine pain, pink for period pain, and brown for tension headaches. These products are all additional to the standard Nurofen product. Despite containing exactly the same active ingredient, in exactly the same dosage, the products were being sold at a price almost double that of the standard product. The sneaky marketing tactic won Reckitt Benckiser, Nurofen’s maker, a Shonky Award from consumer group Choice in 2010. The Australian Competition and Consumer Commission (ACCC) launched action against Reckitt Benckiser in March of 2015, and a ruling has just been made that will require all Nurofen specific pain products to be removed from retail sale within three months, and that the company post corrective notices in newspapers and on its website. Don’t market your product as the same thing five different ways Moral of this story? Don’t market your product as the same thing five different ways. Nurofen is a painkiller; the ACCC is not disputing that it can be used for many different kinds of pain. But Nurofen has been misleading consumers into thinking that they need to buy the same product three times if they have period pain, a tension headache, and back pain (three pains that often come at once!). This is not only misleading, it is potentially dangerous if a consumer uses all three products at once, taking a potentially harmful dosage of the painkiller.
[easy-tweet tweet="People are much more likely to buy one product that achieves five goals" user="comparethecloud" hashtags="marketing"] Take a look at your suite of products, are you trying to sell your customer the same thing five times over? Psychologically people are much more likely to buy one product that achieves five goals, than 5 products that each do one thing. The cloud industry is obsessed with acronyms that could frankly be gone without, there are companies who have entire internal languages, which are separate to what they actually refer to their products as externally. All of this confusion is making it harder for consumers to know what they are buying, and what they should be getting from their products. There is so much focus now on big data telling us what to sell and when, surely we should know to trust ourselves sometimes. The Harvard Business Review recently did a Spotlight on Digital Customer Engagement. The focus was how to balance big data with traditional marketing tactics. The playoff between short and long term marketing. If you have been running a campaign to increase brand awareness, and it is going well, but all of a sudden a big data analyst says you can make a quick buck by targeting your advertising at a certain segment, should you abandon your long term content goals? you shouldn't abandon your long term content goals in order to get fast results My argument here is no, you shouldn't. Companies are far too focussed on their short term goals a lot of the time. Only caring about meeting this quarter's targets, and on-boarding as many customers as possible. Without sticking to the long term plan for their business's brand, they will lose out. Nurofen's sneaky sales tactic may have made them more money in the short term, but now that it has come to light, it has negatively impacted their customer loyalty. They lied to customers in order to hit their targets. It was savvy, it was smart, but it was sneaky. People don't like being tricked into purchasing things they don't need. [easy-tweet tweet="Are you #marketing for the long term and protecting your #brand? " user="rhian_wilkinson @comparethecloud"] So ask yourself, are you marketing for the long term and protecting your brand and your customer? Or are you just getting that customer on board for a short ride, and waiting till they fall off the rickety marketing bandwagon you built? ### What do you do about cyber-security when your data centre is all at sea? Recently I was dispatched to Lille to attend the 8th International Cybersecurity Forum  to report live,  and shortly after I shared some key insights from the top keynotes focusing on the political agenda around cyber-security, including copies of keynote speeches at the event from European Commissioner Günther H. Oettinger, French Interior Minister Bernard Cazeneuve, and Security Minister at the UK Home Office John Hayes. [easy-tweet tweet="Take #DataSecurity out to sea with @BillMew's interview with Patrick Hebrard from the #DCNS group" user="comparethecloud"] In addition to interviewing France’s top Cyber-cop (Francois-Xavier Masson, Chef de l’ OCLCTIC) and its top Cyber-regulator Guillaume Poupard (general manager of ANSSI), I also spoke to naval cyber-security expert Patrick Hebrard from DCNS Group. 
Long before I was a cloud pundit, I was a weapons engineering officer in the Royal Navy, but this was back in the late 1980s when the Cold War opponents were Russian submarines in the Baltic and when the main systems on board had been developed in the 1960s or 1970s. Back then there wasn’t any sign of anything even as advanced as a PC on board. Things have moved on in the last 25 years – considerably. Combating modern cyber threats to naval systems IT managers everywhere are having to deal with the increasing volume and sophistication of cyber-threats – both those that are external and deliberate, such as hacks and data thefts, and those that are down to incompetence, such as data loss or lack of encryption. Modern warships are just like floating data centres with a combination of PC-based systems, Unix systems and more bespoke weapons systems. The only difference in the context of a naval warship is that once it leaves harbour the data centre is truly off the grid. One might think that having no direct link to the outside world might ensure complete security, but vulnerabilities still exist. IT managers everywhere are having to deal with the increasing volume and sophistication of cyber-threats Just as hackers are able to hack corporate data systems, it’s possible for a small group or a state with the right skills to hack the systems of an enemy ship, to steal data, to take control of the ship, of its information system, of its weapon system, or of one of the many monitoring and control systems used for managing the ship’s mechanical systems, from its power generation and supply to its propulsion and steering. There are external threats that could come from a developer who has deliberately, malevolently or even accidentally introduced malware into a system or an item of equipment. Then there are threats from simple negligence and ill-informed use of compromised equipment by maintenance engineers, or use of infected devices, CDs or USB keys by members of the crew. [easy-tweet tweet="Just as #hackers are able to hack corporate #datasystems, it’s possible to hack the systems of an enemy ship"] While common sense and discipline among maintenance staff and crew is the first line of defence, Hebrard suggests that this kind of static protection is no longer enough and that dynamic protection through detection, responsive systems and counter-measures is now best practice, just as it is for more physical threats such as a missile attack. When an incident occurs and the detection of cyber-signals or other signs of cyber-attack triggers the counter-measures, there may be limited cyber-skills on board, so teams work in collaboration with shore-based operational security centres. The human factor is addressed through best practices in cyber hygiene, in line with the rules defined by the national information system security agency (ANSSI). ANSSI also runs a national accreditation and certification program that all systems need to pass before being adopted for use on board French warships. Integrating the various systems and minimising the cyber-threat is then down to specialists like DCNS. It believes in ‘Security by Design’ from conception to deployment, including both warranty and maintenance phases. The unfortunate reality is that sailors have been picking up nasty viruses in foreign ports of call for hundreds of years.
The difference now is that the greatest threat might well come from a virus on a discarded USB key or infected mobile phone. As ever discipline among the crew and dockyard staff is crucial, as is the protection afforded by experts such as DCNS. ### An introduction to zsah Compare the Cloud has been interviewing cloud experts and learning about their companies, today the spotlight is on zsah, a London based MSP that prides itself on customer loyalty and satisfaction. Established in 2002, zsah provides cloud, software engineering and managed services; we sat down with three of zsah’s leading men to hear more about what their business is all about. [easy-tweet tweet="An introduction to #MSP @zsahLTD from @ComparetheCloud" hashtags="cloud"] Amir Hashmi, Managing Director & Founder Being a techy by nature Amir had spent his early years refining his skills and helping a number of major corporations including Bertelsmann and BP with a wide variety of high profile projects. Amir is an Experienced technology and business process expert with significant exposure to both the Enterprise and SME spaces. Fahd Khan, Business Development Manager Fahd has been with zsah since 2015 as a Business Development Manager working with and helping develop business relationships with a number of new as well as existing clients.  Fahd is a very hands-on and experienced Business Developer with an entrepreneurial streak. Paul Hocking, Sales & Account Director Paul joined zsah a little over two years ago, but has been selling outsourced IT services for over 20 years and was an early adopter of co-sourced IT delivery models, where an external services provider works hand in glove with a client’s own IT organisation to deliver an optimised and flexible IT solution to support the client’s business plan. Can you tell us about zsah as a whole?  AH: Established in 2002, zsah provides cloud, software engineering and managed services. Their service offerings add value to your existing functions, or complement your IT operations, where and when you want. Based in London, zsah's teams consist of smart, driven, responsive people who deliver a personal, dedicated service. Together with our robust, reliable infrastructure, zsah's people can support, develop and manage your technology needs so you can do what you do best. This combination of personal service, experience and strong architecture is rare, and having it yields potent capabilities, maintained and developed for you. zsah's teams consist of smart, driven, responsive people who deliver a personal, dedicated service What do you think of the Cloud Industry at present: AH: The industry at the moment is exciting, but getting crowded with people who don’t have the experience or credentials to do what they are promising. FK: I think the cloud industry is seeing unprecedented growth but along with that growth the quality of service is being watered down. In my experience we have come across more and more customers who are experiencing poor levels of service; something which for us being in the managed services industry is an absolute priority and the founding aspect that zsah has built its success on. PH: The delivery of services across the Cloud is here to stay; will continue to grow; and is the way in which most companies will get the majority of their IT services - just like buying in other "utility" services. Can you tell us a little about zsah’s main strengths? AH: Experience, expertise, flexibility and talent. 
We know what organisations require and we understand applications as well as infrastructure. Everything is in-house and we don't believe in outsourcing work; this helps us control the work we do and ensure standards are at their highest. [easy-tweet tweet="Everything is in house and we don’t believe in outsourcing work says Amir Hashmi, Founder of @zsahLTD"] FK: The quality of service we offer. As zsah has grown we've ensured that our high standards remain the same and have hired effectively to maintain that. A lot of companies grow and start doubling the workload on their team, and inevitably that goes on to affect the service being offered, which in turn can affect the reputation of the company. Considering the vast majority of zsah's growth has been driven by our reputation, this is something that is vitally important to us and a strength we always maintain and build on. PH: One of zsah's main strengths is that all services are provided from world-leading data centres, but through technology wholly owned and managed directly by zsah - and with the support and back-up of a strong and very experienced technical team that can support companies through advice, implementation and changes to their IT. What do you think sets zsah apart in a competitive cloud marketplace? AH: In our exact space we don't have that much competition, so because of our commitment to delivering an exceptionally high level of service, we naturally stand out from our competitors. FK: The fact that aside from cloud services we also offer end-user support as well as development services. So in many ways we end up being a “one stop shop” for our customers for all their IT requirements. Also, everything we do is in-house, secure and truly private, and we are a UK sovereign organisation, so we are not subject to the Patriot Act, unlike the big three (Rackspace, AWS & Azure) amongst many other cloud companies. PH: The ability to deliver the same level of services as the big players, but with a level of technical support and relationship with organisations that cannot be delivered by the commodity players or by cloud service brokers. We work very closely with our clients, acting as part of their own IT organisation - or in many cases as their end-to-end IT team. [easy-tweet tweet="We work very closely with our clients, acting as part of their own IT organisation says @zsahLTD"] Can you describe the ideal client for zsah's services? AH: Our clients are often SME organisations or software re-sellers who use us as a technical support, infrastructure and platform provider. But we also have clients across the spectrum, from start-ups to media companies and a range of public sector organisations requiring a private and secure cloud partner. FK: I would say SaaS companies are an ideal customer, as we have dealt with a good number of SaaS companies who have been supported by zsah's cloud infrastructure and have gone on to grow and flourish into bigger businesses. Public sector organisations have also reaped huge benefits from our honest advice and efficiency savings. PH: Any company or public sector organisation that requires general or specific IT infrastructure or platform services, but with a relationship where they can avail themselves of advice and additional support as and when required.
In most cases, the key is the intimacy of the relationships we have with our clients and a level of personalised and technical support which they don't feel they get from the larger, commodity providers. Could you tell us a bit about some of zsah's current clients? AH: They vary from SMEs to large multinational firms. The common thread between them is that they require the best solutions and a level of management that other firms do not offer. They choose us instead of choosing a combination of two or three other suppliers, which would complicate their service delivery landscape. FK: We support and host infrastructure for a variety of different businesses, from small start-ups to large enterprises. The greatest benefit of this is that we have a wide range of experience with all types of businesses and are often familiar with the requirements and solutions that most new customers need. Where do you see zsah heading in the future? AH: I see us as the leaders in private and hybrid, high-performance, managed hosting. Our solutions are wholly scalable and proven for years, and we expect our client base to grow both in number and in terms of the size of the organisations we serve. FK: We see ourselves growing into leaders in truly private/hybrid and managed cloud services, as well as giving some great startups of the future the platform to help grow and establish their businesses. PH: zsah will continue to grow, based on its unique positioning in the market and its strengths as outlined above. Whilst we will expand into global markets, following the demands of some of our larger clients, our core strength will continue to be the UK market, where our physical presence of both data centres and support offers a significant advantage to clients - whether for reasons of data privacy or for geographic support. [embed]https://youtu.be/c6t0fFIzW5g[/embed] ### Provisioning Cloud Applications Enterprise app stores have become popular, employee-friendly ways to encourage employee self-service – allowing users to get the apps they need, when they need them. But many organisations don't fully utilise their app stores across all application types – such as for provisioning cloud-based apps like salesforce.com or Office 365. By failing to provide a truly universal enterprise app store, companies are sabotaging their own efforts. [easy-tweet tweet="Many organisations don’t fully utilise their #app stores across all application types" user="comparethecloud"] The enterprise app store must provision all apps employees are authorised and likely to use – regardless of the type of app or the platform it will be run on. This includes SaaS apps that are hosted in external cloud environments. To the user, accessing enterprise app store apps should be a one-stop shop; otherwise it will just represent another system that doesn't really deliver what the user wants, and it will be circumvented. Such a universal enterprise app store must provide access to authorised enterprise applications that are suitable for the device and operating system. In addition to providing a selection of apps for different devices, an enterprise app store should also enable the installation of applications from an assortment of locations, from a local application library to applications that run in the public cloud.
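As a rough illustration of the idea – a hypothetical sketch rather than any particular vendor's API, with every name below invented for the example – a universal store amounts to one catalogue that records where each app lives, plus a dispatcher that routes an approved request down the right provisioning path, whether that is a local package deployment, a mobile push, or a SaaS account in the public cloud.

```python
# Hypothetical sketch of a universal app-store dispatcher. None of these names
# belong to a real product; the point is that local, mobile and cloud (SaaS)
# apps share a single catalogue and a single approval/provisioning flow.
from dataclasses import dataclass

@dataclass
class CatalogueEntry:
    name: str
    source: str   # "local", "mobile" or "cloud"
    target: str   # package ID, store URL or SaaS tenant endpoint

CATALOGUE = {
    "office-365": CatalogueEntry("Office 365", "cloud", "https://tenant.example/o365"),
    "cad-suite": CatalogueEntry("CAD Suite 2016", "local", "pkg://cad-suite-2016"),
    "expenses": CatalogueEntry("Expenses", "mobile", "store://expenses-app"),
}

def provision(app_key: str, user: str) -> str:
    """Route an approved request to the provisioning path for its source type."""
    entry = CATALOGUE[app_key]
    if entry.source == "cloud":
        # Create the SaaS account, then email the user their login details.
        return f"Created {entry.name} account for {user}; login details emailed."
    if entry.source == "local":
        # Hand the package ID to the desktop deployment system.
        return f"Queued {entry.target} for deployment to {user}'s device."
    # Otherwise push the app to the user's mobile device via device management.
    return f"Pushed {entry.name} to {user}'s mobile device."

print(provision("office-365", "a.user@example.com"))
```

The cloud branch is the one that matters most here, because public-cloud apps are exactly the type many stores leave out.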
These cloud apps are most often Software-as-a-Service (SaaS) applications such as Office 365, Box, and salesforce.com, and are gaining more and more subscribers in the enterprise space.  A universal app store approach gives employees a consistent experience across all of their devices A universal app store approach gives employees a consistent experience across all of their devices from desktops and laptops to tablets and smartphones. Moreover, the experience will most likely be familiar to the employees since most users already know how to access public app stores and download apps. A universal app store delivers this experience by presenting a “single pane of glass” for Windows, mobile, and Apple users for physical, virtual, mobile, and cloud applications. This eliminates the need for multiple “portal” solutions. Many enterprise app store products do not connect to a diverse set of deployment systems, and do not have the out-of-the-box capability to provision cloud applications. So enterprises should be careful when selecting enterprise app stores – because many products and custom solutions still rely on heavy customisation that is costly to develop and difficult to maintain. [easy-tweet tweet="Governance is a less understood aspect of an effective enterprise #app store" user="comparethecloud"] Governance is a less understood aspect of an effective enterprise app store. Any solution chosen must be able to automatically manage apps on devices across multiple management systems, provide governance such as determining license position, license reservation, software harvesting and leasing, and providing extensible options for OS deployment. Many organisations are struggling to provide the same level of service for cloud apps as they are for other application types. Employees commonly use a different request process for cloud apps or submit a request via a help desk ticket, resulting in a different level of service, often leading to dissatisfaction and shadow IT. Administrators who use an app store capable of simultaneously provisioning cloud applications alongside on premises and mobile apps are able to provide the same business approval processes and questions across all application types. By creating this familiar, transparent interface, employees use the app store for all their needs, including cloud apps. Once the application request is approved and provisioned for the user successfully, they simply receive an email which includes pertinent login details. With a universal app store in place, employees request apps like Salesforce.com, Microsoft® Office 365, Box, and others from any desktop or mobile device. This simplifies the user experience and automatically provisions cloud accounts, providing the same level of automation and control as desktop and mobile apps. ### PTC ThingWorx Joins MK:Smart to Enable Rapid Development for Internet of Things Applications PTC has announced ThingWorx, a PTC technology, is joining the MK:Smart project, a £16M HEFCE (Higher Education Funding Council for England) smart-city initiative, led by the Open University, which aims to tackle barriers to economic growth in the city of Milton Keynes, UK. ThingWorx will provide a rapid application development environment as an additional feature to the MK data hub. Partners of this project include BT, Milton Keynes Council and UCMK (University Campus Milton Keynes), supported by ThingWorx Partner InVMA. 
[easy-tweet tweet="#CloudNews: #ThingWorx, a @PTC technology, is joining the MK:Smart project" user="comparethecloud" hashtags="smartcity"] The MK Data Hub enables a data-centric approach to open innovation and provides a platform that can be used to deliver apps and services to address the challenges of an expanding city. This provides opportunities for local companies to take data-driven innovations to market. ThingWorx provides a purpose-built IoT application development environment that is free to SMEs (small and medium-sized enterprises) in MK:Smart for the development of their market-ready applications. Pre-connected to data available in the Data Hub and open to new data sources, the ThingWorx® platform will accelerate the development process of combining data sets and building rich, interactive applications. Training and advice for SMEs is provided by the team at UCMK, supported by BT and ThingWorx Partner InVMA. In addition to this technical support, MK:Smart offers a range of business engagement activities for innovative SMEs who are seeking to engage with the MK Data Hub to build new data-driven applications and services. “ThingWorx is a great addition to the MK Data Hub,” said Professor Enrico Motta, Open University, Project Director, MK:Smart. “It will enhance our platform by facilitating rapid prototyping and delivery of IoT applications, a key aspect of smart city solutions. I am looking forward to seeing the results from this exciting collaboration.” “ThingWorx is honored to join MK:Smart, one of the world’s preeminent smart city initiatives,” said Russ Fadel, president, ThingWorx. “Integrating ThingWorx with the MK Data Hub will not only enable universities, high tech businesses and incubators to quickly and securely deliver IoT applications that unlock the value in the city’s open data sets, but also to accelerate economic growth in Milton Keynes by promoting the creation of new local business ventures. MK:Smart is truly demonstrating the way forward for cities around the world.” ### ANSSI – the people behind the French National Digital Security Strategy Recently I was dispatched to Lille to attend the 8th International Cybersecurity Forum  to report live,  and shortly after I shared some key insights from the top keynotes focusing on the political agenda around cyber-security, including copies of keynote speeches at the event from European Commissioner Günther H. Oettinger, French Interior Minister Bernard Cazeneuve, and Security Minister at the UK Home Office John Hayes. [easy-tweet tweet="ANSSI's GM Guillaume Poupard spoke with @BillMew about France's approach to #cybersecurity" user="comparethecloud"] In addition to interviewing France’s top Cyber-cop (Francois-Xavier Masson, Chef de l’ OCLCTIC), I also met the man who heads up L'Agence nationale de la sécurité des systèmes d'information (ANSSI). Effectively the French national agency for computer security, ANSSI is responsible for defining the French National Digital Security Strategy. ANSSI sets out the rules and policies for protecting state information systems and for monitoring and verify the implementation and adoption of all such measures. It provides a monitoring service, as well as detection and warning systems and is responsible to leading the response to computer attacks on any critical infrastructure or state networks. 
At a time when France has been the target of a series of brutal terrorist attacks and, like all modern economies, is under constant cyber-attack from hackers and scammers, the role of ANSSI has never been more critical. ANSSI has been led since March 2014 by its general manager, Guillaume Poupard, who was kind enough to give us an interview. Like all French agencies, ANSSI works to a strict brief. It only focuses on cyber-defence – and doesn't conduct intelligence or cyber-attacks. Protection of all of the nation's ministries and its critical infrastructure is core to its role. It coordinates the activities of government agencies and departments as well as those of key public and private sector entities – such as the finance or power companies. ANSSI focuses on cyber-defence for France Between 2009 and 2016 its headcount grew from 80 to 500 as its remit extended to rules and policies covering all critical national infrastructure, providing legal protection, defining mandatory controls and recording and reporting attacks. [quote_box_center]It sees its role in two main areas: Data theft prevention ANSSI has a primary focus on protecting economic intelligence, but it also seeks to cover technical and personal information. Incidents are often hard to detect, with thieves not wanting their presence to be visible so that they can return time and again for more and more data. Attack and sabotage prevention ANSSI works with firms in the transport and finance sectors as well as utilities to ensure that all systems are resilient and all infrastructure is protected from attack or sabotage. [/quote_box_center] While it is directly responsible for defining the rules and monitoring their adherence, it finds that real effectiveness is built on more than just rules, and that constructive dialogue with the firms it liaises with is critical. For example, internet service providers (ISPs) play a pivotal role, providing support with audit, detection and reaction. ANSSI also runs a national accreditation and certification program. It publishes a list of rules for companies (including those in the traditional IT hosting and cloud sectors), and then conducts independent evaluation of their performance before providing the appropriate certification. The process is almost unique to France, and Poupard admits that it isn't necessarily cheap for the firms that take part, but he argues that it is definitely worthwhile – not only for the firms themselves, which need a set of standards to apply and benefit from the ability to show that they are up to standard, but also for their clients, who have the peace of mind of knowing that they are working with a firm that has met such high standards. [easy-tweet tweet="#ANSSI is seeking to align its rules and standards with similar regimes emerging elsewhere in Europe" user="comparethecloud"] ANSSI also needs to work with many foreign companies – such as the global internet firms that span the globe. It is also seeking to align its rules and standards with similar regimes in Germany and emerging ones elsewhere in Europe. Its aim is to find a set of rules that can be applied fairly to all, but that are stringent enough to provide the right level of protection. In working with firms both within France and elsewhere, Poupard finds it essential to have a high level of mutual trust, and that this must be based on a real understanding of the situation – even if this means access to source code.
We asked Poupard how cloud has changed things in his industry. He told us: “When cloud first arrived ANSSI advocated caution, but we are aware that cloud has become part of the way that organisations operate now – we tell firms that they probably need cloud, but not all clouds are the same and neither are all CSPs and MSPs. We believe that in terms of security our standards and our accreditation and certification program are therefore as important as they have ever been. We have therefore started a new certification program specifically for CSPs and MSPs.” Poupard argues that SMBs that use these CSPs and MSPs find that the certification program provides a valuable mark of safety and gives them peace of mind. So what's the next big challenge, we asked Poupard? He sees the Internet of Things (IoT) as the next area of focus. He said we've seen a huge amount of innovation and ballooning volumes of data from IoT, but there hasn't been enough focus on security. Indeed, Poupard believes that for IoT security to be effective it needs to be baked in right from the start. Do you believe that other countries need to follow France's lead with a national cyber-security accreditation and certification program? Get in touch and let us know what you think! ### Moving into Services: Where to Start IT-focused channel partners have multiple questions to consider before making the jump to services. After all, if the delivery of services were a walk in the park, we'd all be doing it already, right? But many partners are doing it, and doing it successfully, so what's the best way in? How do service margins stack up versus the margins from traditional technology sales? What about business growth with services? Is it about acquisitions or other considerations? [easy-tweet tweet="How do service margins stack up versus the margins from traditional #technology sales?" user="comparethecloud"] Let's take each question one at a time, starting at the very beginning. How do we start this journey? The journey into services starts with identifying what your customers are using and what they ultimately need from you or other technology/service providers, specifically: Could the answers to how we enter the services market come from our existing customer base? What is it that our customers are asking for? Are we finding ourselves providing ad hoc support to customers in certain non-core areas because “it's the right thing to do” as a supplier? What services do our customers buy that we are not currently providing? As a trusted IT supplier, are there services that we could provide? How could we provide the services our customers are interested in with minimal investment and maximum credibility? Could print-as-a-service be a defensive strategy as well as a profit opportunity? If you're hesitating to move into services, or maybe you have started down the path and are now looking to improve your portfolio of offerings, then print-as-a-service could be a good answer for you. At Xerox we are seeing a shift in traditional print resellers who started their services journey with Managed Print Services (MPS) and are now moving into broader IT services – if this is an emerging competitive threat to your business, then perhaps print is a defensive strategy as well as a profit opportunity for you. How do service margins stack up? It's true that in the early phases of selling services, the margins probably won't stack up as well as traditional technology sales.
Just remember that services can bring long-term stability, but you do have to make an initial investment. And certainly margins will improve after any initial cost of sale or implementation has been absorbed. Where does growth come from? Services may help you smooth out the troughs associated with a pure technology business — but how can you drive growth? New customer acquisition remains an important part of the equation. But adding new services is just as important. The more services you offer, the greater the sway you have with customers and greater your revenue and profit potential. [easy-tweet tweet="Services may help you smooth out the troughs associated with a pure #technology business"] Reality check As an IT solution provider, you can view “print-as-a-service” as an opportunity or a threat. It’s an opportunity if you are prepared for the move into providing print as a service and mine the print spend in your customer base. If you choose not to engage customers on print, it’s important to ask who is or will be managing your customers’ requirements in this space to avoid missing a key opportunity. ### An Introduction to Bluewolf I recently caught up with Vera Loftis, UK Managing Director at Bluewolf, a global consulting agency, in the company’s London office. Vera joined Bluewolf in 2007 to assist with the already established growth through the consultancy arm of the business, and she moved to the UK 4 years ago. [easy-tweet tweet="CTC's @NeilCattermull catches up with @Vera_Loftis of @bluewolf" user="comparethecloud"] Bluewolf specialises in leading cloud technologies like Salesforce and delivering solutions to clients across all business verticals. It has grown considerably since its inception more than 15 years ago. Vera says, “We pride ourselves on understanding our clients’ business needs inside and out. I know most companies boast such statements but our growth is testament to this, Founded in 2000, we started with one office, now we have 12 offices globally, over 500 Salesforce certifications with 9500 success stories.” This is quite the impressive statement, so I probed deeper with some client side questions, basing them around a business vertical I know very well – Financial Services. I was impressed with Vera’s knowledge within this space, she was able to provide the correct advice on product sets and the market evaluations of the applications, as well as usage for a typical financial brokerage. In a Jeremy Paxman style interview Vera really impressed me with the way she described Bluewolf and the story to date. Salesforce implementations can be very difficult to execute and deliver in my experience; and Bluewolf claims to do this 50% faster than anyone else in the market, a bold statement! Bluewolf claims to be able to implement salesforce 50% faster than anyone else in the market So, I asked the 60 million dollar question “Why would I go to Bluewolf for a Salesforce implementation rather than Salesforce?” Not an easy question to answer, but Vera answered it well! “Salesforce are very good at what they provide, however clients look to us for a partnership that lasts. What I mean by this is that we have in-depth experience within the product set across all business sectors and work so closely with them that we provide ongoing support after their migrations are complete. This coupled with the understanding that if they are not suited to the product set, we will tell them”. 
With this understanding of planning, implementations, and ultimately the support of Salesforce, I can see why Bluewolf has grown as it has done. With its long-established office in the UK (four years), it is looking to assist with the explosion of cloud computing throughout Europe, backed by the immense experience of 15 years of consulting on and supporting Salesforce. But don't just take my word for how impressive Bluewolf's experience is: IDC rated it a worldwide leader in the Salesforce Implementation Ecosystem in 2015, amongst many other awards gained. [easy-tweet tweet="IDC rated @BlueWolf a worldwide leader in the #Salesforce Implementation Ecosystem in 2015"] To sum up – Bluewolf is a global leader in Salesforce implementations and support, with a track record of delivering migrations 50% faster than anyone else around, together with ongoing support. Additionally, they work with other product sets within the cloud ecosystem and can advise you on these when you meet with them, making them unbiased and very flexible. Vera certainly knows the industry very well and it was a total pleasure to meet someone who really can do what they say in the world of cloud. ### Infradata Acquires Quonubes Group Limited Infradata, the leading independent Network Integrator, today announced the acquisition of Quonubes Group Limited, a UK-based virtual network service provider that specialises in multi-vendor integration technologies for Software Defined Networks (SDN) and Network Function Virtualization (NFV). [easy-tweet tweet="#CloudNews: @Infradata have acquired UK-based virtual network service provider Quonubes" user="comparethecloud"] Infradata believes that SDN and NFV technologies will increasingly shape the future of networking and security for both Service Providers and Mission Critical networks. Infradata is an early adopter of these technologies. Over recent years, Infradata has invested in acquiring the right skills, competences and hands-on experience in building, integrating and migrating Software Defined SP Networks and Mission Critical networks. Recently, Infradata has supported major customers building their software defined data centers and automating their end-to-end service provisioning and delivery. The acquisition of Quonubes is the next step in Infradata's SDN and NFV strategy. Quonubes has developed SDN-based solutions for both Service Providers and Mission Critical networks, such as Software Defined Data Center solutions as well as vCPE (virtual CPE) as a service. Infradata will leverage the combined skills of both organisations to accelerate the development of Quonubes' innovative multi-vendor SDN platform. The acquisition brings technologies and skills together and enables enhanced support for clients seeking flexible network services and advanced multi-vendor manageability. At the same time, technical and marketing staff from Quonubes will form a new UK operating unit for Infradata, enabling it to expand its UK business. With the acquisition of Quonubes, Infradata will continue to be the leading Network Integrator in the SDN and NFV market. “SDN and NFV are new technologies that have the potential to radically change how networks are delivered. Throughout our long relationship with the Quonubes team we have admired their level of expertise and innovative technologies. Of particular interest was the ability to support flexible network services using their multi-vendor integration platform.
This also allows seamless deployments of VNFs in a manageable and scalable manner,” said Leon de Keijzer, CEO of Infradata. Commenting on the acquisition, James Morgan, former CEO of Quonubes Group Limited and future Managing Director of Infradata UK, added: “There was far greater international interest in our innovative multi-vendor integration platform than we could cope with as a start-up. Infradata not only has the resources and scale to take the platform forward on a global basis for the networks of the future, but also the capability to design, deploy and support the networks of today. This focus and expertise in managing Mission Critical network services makes Infradata a compelling organisation for companies to work with.” Quonubes clients will benefit from enhanced international technical support services centered around Infradata's SDN lab in Leiden, the Netherlands. At the same time, Infradata's existing customers will gain access to Quonubes' multi-vendor integration platform as well as its virtual network services catalogue. The acquisition of Quonubes Group Limited closed on November 16th, 2015. Terms have not been disclosed. ### AppCarousel and Mobica Partner in Connected Device Markets to Deliver Complete App Solutions AppCarousel, the leading provider of app platforms for connected devices, and Mobica, the leading provider of software engineering, testing and development services, have announced a strategic partnership leveraging each other's expertise to offer end-to-end app and software solutions to markets worldwide, including connected car, connected fleet, Internet of Things, and smart devices. [easy-tweet tweet="#CloudNews: AppCarousel and Mobica announce partnership for #App ecosystems" user="comparethecloud"] AppCarousel's platform delivers a range of solutions to organizations looking to deploy apps and app ecosystems, including app marketplaces, digital goods commerce, and app lifecycle management for all types of connected devices. Mobica provides services to those organizations, including design, software creation and software solutions. This partnership enables AppCarousel and Mobica to offer complete turnkey solutions from app building through to app deployment, ongoing app management and monetization. In the rapidly expanding connected car sector, automotive manufacturers and their supply chains are increasingly turning to AppCarousel and Mobica to create great app experiences for the connected driver, along with custom software management solutions for the vehicle, to ensure apps and critical on-board software stay up to date and relevant. Mobile and connected device companies come to AppCarousel and Mobica because they need custom apps, a white-labeled app store and an app management platform to make their app ecosystems a reality. Together, AppCarousel and Mobica provide end-to-end solutions to companies that require custom app creation, distribution and monetization solutions for their developers, partners and customers. "Many organizations do not have the expertise to determine their app strategies on their own, so the AppCarousel and Mobica partnership enables us to address all of their needs, whether it's creation of apps and software or a platform to make those app strategies a reality," said Emanuel Bertolin, CEO of AppCarousel. "Our two companies are complementary and we are excited to be working together on many opportunities, including the burgeoning automotive sector."
"With this strategic partnership, businesses can entrust AppCarousel and Mobica to address all of their app-related needs, all the way from creating apps and software to getting them out there in a managed scalable manner," said Jim Carroll, Mobica CTO. "With Mobica's extensive software engineering and development services and AppCarousel's app management and over-the-air update expertise, our partnership delivers a unique approach to the connected device markets we serve, including automotive, IoT and smart devices." ### Don’t rely solely on your broadband connectivity: it’s time for 4G to take centre stage Over the past couple of years organisations have woken up to the critical importance of 100% Internet access uptime. With the reliance upon cloud based applications and the cost benefits of VoIP calls, continuous, fast connectivity has become an essential business requirement. Yet the divergence in cost, coverage and quality of service available to companies in different geographic areas continues to challenge the evolution of a long term comms strategy – especially for those still waiting for the roll out of superfast broadband services. [easy-tweet tweet="Organisations have woken up to the critical importance of 100% Internet access uptime" user="comparethecloud" hashtags="cloud"] What’s more, news that exploded the headlines this week of BT’s major broadband outage, which left tens of thousands of households and businesses without a connection for hours, shows that businesses shouldn’t waste time waiting much longer; even when a superfast broadband connection does arise, there’s still the possibility of it failing. In contrast, 4G network coverage is getting ever better. Speeds are great and with 4G data costs at a fraction of the 3G alternative, 4G clearly has a role to play. Many organisations still consider mobile purely as a back-up comms option – but given the speed, resilience and reliability of the 4G infrastructure, is it now time for businesses to consider 4G as a primary, rather than secondary connection? is it now time for businesses to consider 4G as a primary connection? Fixed versus Mobile Most organisations now recognise the complete reliance upon an Internet connection. Indeed, quality and reliability of comms increasingly informs decisions regarding business location. However, far too many organisations still feel constrained by the lack of top quality fixed line connections available – and certainly by the huge variability in price. Today, a fixed line Ethernet connection can cost between £300 and £2000 per month depending on location and the company’s data needs – a huge divergence in price, especially for the SME. While there are strong benefits - not least the lack of contention which delivers dedicated, guaranteed speed – many SMEs understandably baulk at the investment, especially those facing the top end of the price range. Companies also recognise that a single comms connection represents a significant business risk – a secondary or back-up solution is essential to avoid expensive, non-productive down time. is there any need to invest in a fixed line anymore? The alternatives to fixed line options vary depending on location, and include point to point radio , satellite and mobile - increasingly 4G as the network coverage expands. Indeed, in many areas where the ADSL service is weak, growing numbers of SMEs have adopted 4G as a contingency comms solution – and discovered the failover performance far outstrips the main connection. 
Speeds are good, if variable, and more than adequate for the needs of most smaller organisations; and data costs are just a fraction of those on a 3G network. So is there any need to invest in a fixed line anymore? Flexible Model Having tried 4G as a back-up or short term solution in new premises, the quality of performance has prompted growing numbers of organisations – especially SMEs – to use 4G as the primary connection. However, the variability of 4G can raise concerns regarding consistent performance for phone calls, VPN or cloud based Citrix sessions that require a consistent minimum bandwidth. So while a 20Mbps 4G connection is brilliant for routine Internet access, some organisations question whether it will always be enough for simultaneous data transfer or voice calls. Organisations are also justifiably concerned about the lack of Quality of Service (QoS) guarantees on offer from mobile providers. [easy-tweet tweet="The concept of primary versus secondary #connection has been replaced by a bonded model " user="comparethecloud"] So what is the option? In fact, there is no need to make an either / or fixed versus mobile decision. The concept of primary versus secondary connection has been replaced by a bonded model that enables organisations to gain additional resilience and contingency from bonding different connections together. With this approach, a company can bond fixed and/or mobile connections - including connections from different mobile providers for additional contingency – and gain the benefit of the combined bandwidth to deliver day to day performance improvements. The bonded model also addresses the QoS issue by capping the top speed of the aggregated connections at a point where a more consistent connection can be guaranteed. In addition to creating the most reliable connection possible, by taking bandwidth from different network operator cell locations the bonded service ensures that even if one operator’s network is congested, the overall connection remains stable and high performing. Long Term Solution In addition to delivering the key requirements today, namely consistent bandwidth plus business contingency, this bonded approach builds in future proofing. Companies can adapt the comms strategy in line with network changes and provider pricing. For example, a company may opt today to restrict the 4G usage to a set data volume – but as soon as the provider makes a pricing adjustment, that restriction can be removed. Companies can opt for a mixed fixed and mobile approach today, or go for a mobile only mixture of 3G and 4G networks, from different vendors for contingency. enabling companies to invest in a comms model that truly reflects business needs irrespective of location is key The key message is that the choice is broad – enabling companies to invest in a comms model that truly reflects business needs irrespective of location. Is a mobile first comms strategy viable today? To be honest, it depends on business needs. Certainly the arrival of 5G will certainly challenge the validity of fixed line connections, but that is a few years away yet. In the meantime, 4G is becoming a compelling business solution – as both primary and back-up option. ### Axians Networks appoints new Managing Director Specialist networking systems integrator, Axians Networks Limited, part of VINCI Energies UK & RoI is delighted to announce the appointment of Russell Crampin as its new UK Managing Director with immediate effect. 
[easy-tweet tweet="#CloudNews: @AxiansUK has appointed Russell Crampin as new Managing Director" user="comparethecloud"] Russell will build upon Axians Networks' service and delivery excellence by shaping and developing solutions to enable customers to maximise new market opportunities. Russell will head up Axians Networks' board of directors and report into Axians' Executive Chairman, Rochdi Ziyat, who is also CEO of VINCI Energies UK & RoI. Russell joins from Logicalis, where he held the position of Sales Director. With over eighteen years' multidisciplinary experience in the technology market, Russell is a seasoned industry figure, having held director and senior management roles in sales, operations, business transformation and service delivery. Commenting on the new appointment, Rochdi Ziyat, Executive Chairman at Axians Networks, said: "Russell brings with him a wealth of experience and knowledge in delivering business value to customers utilising innovative solutions. His industry experience and commercial acumen are an invaluable asset as we look to re-affirm Axians as the 'go-to network partner' of choice for the Service Provider and Research and Education sectors. I am confident that Russell has the right pedigree to lead Axians on the next important phase of our journey.” Russell commented: "I am excited to be joining Axians at a significant time of growth and investment and welcome the opportunity to help customers exploit the new opportunities we are seeing in the industry.” Russell graduated from the University of Exeter with a degree in Mathematical and Computer Science. ### Meet France’s top Cyber-cop - Francois-Xavier Masson Recently I was dispatched to Lille to attend the 8th International Cybersecurity Forum to report live, and shortly after I shared some key insights from the top keynotes focusing on the political agenda around cyber-security, including copies of keynote speeches at the event from European Commissioner Günther H. Oettinger, French Interior Minister Bernard Cazeneuve, and Security Minister at the UK Home Office John Hayes. [easy-tweet tweet="Get the latest on French #CyberSecurity from @BillMew and the top French #CyberCop"] I've been to many IT security events in the UK. Never have I seen the British police manning a stand at any such event, but in Lille the French Interior Ministry had a stand that was crawling with cyber-cops – many of them heavily armed. We asked to speak to the top cyber-cop to get a direct line on what their main challenges are. OCLCTIC - One acronym to beat them all! The greatest challenge that they face is the volume of data, but this data is not uniformly spread across the various threats: the greatest volume by far is in relation to scams and hacks, where they are overloaded. Child exploitation involves a far lower volume of data and terrorism less still – meaning that it is far more like finding a needle in a haystack. Different technologies and techniques are therefore required in each area. Sometimes interrupting these data flows is a real challenge. A great deal of the scams, hacks and fraud come from abroad and arrive in enormous volume (as with DDoS attacks), while child exploitation is hidden and terrorist activity is either public (where the authorities seek to interrupt and take down propaganda) or secret (where activists seek to communicate, coordinate and plan their actions).
the central office for the fight against crime related to information and communications technology The French National Gendarmerie has the most policing resources both physical and electronic, but the Interior Ministry (headed by Bernard Cazeneuve who was a speaker at the event) has established a specialist unit called L'Office central de lutte contre la criminalité liée aux technologies de l'information et de la communication (OCLCTIC) – which means the central office for the fight against crime related to information and communications technology. Meet France’s top Cyber-cop Francois-Xavier Masson, Chef de l’ OCLCTIC, is the unit’s head. He explained that the cyber security unit is focused on technologies and techniques to address the massification (growth in volume) of threats, attacks and breaches as well as the sophistication of criminals through their use of everything from malware to encryption and the dark net.  This body is not only the main unit focused on organised crime, scams, fraud and hacking, but it also works closely with Europol on an operational basis (and with Interpol on more of an information-sharing basis only). OCLCTIC employs techniques different from those of many of its counterparts in Europol because French law still doesn’t allow it to operate undercover or on the dark net. [easy-tweet tweet="French law still doesn’t allow OCLCTIC to operate undercover or on the dark net" user="billmew" hashtags="cybersecurity"] Cooperation is seen as their greatest weapon – not only ensuring that the various French government agencies work effectively together and with their international policing counterparts, but the private sector has a major role to play as well. Masson suggests that they are learning to work with ISPs and the major global social media enterprises and that cooperation is improving all the time. Indeed working with the private sector is essential for OCLCTIC if the unit is to hope to keep pace with the latest advances in technology. Masson and his team need to work with other international agencies to find the necessary proof – even if this means pursuing hackers back to their bases in Russia, or scammers back to their bases in Africa or the Balkans.  there is always going to be a difficult balance to be had between privacy and investigation He accepts that there is always going to be a difficult balance to be had between privacy and investigation. Despite being the target of significant recent terrorist outrages, the French still see liberties as culturally important and are keen to guard their privacy. This isn’t necessarily the greatest barrier to effective investigation though, he suggests. The authorities are always at a slight disadvantage as they are always going to need to have evidence in order to pursue any prosecution and this can mean the need to wait for proof, even when you are reasonably sure who is responsible. ### Crisp Thinking Launches World’s Most Advanced Platform for Social Apps to Protect Young People from Sexual Exploitation and Bullying Crisp Thinking, the leader in social risk protection, today marks Safer Internet Day by unveiling Combat – the world’s most advanced solution to address the rise in online grooming, sexual exploitation, cyber bullying and radicalisation through social networks and mobile apps. 
[easy-tweet tweet="#CloudNews: Crisp Thinking launches new advanced social monitoring app for #SaferInternetDay" user="comparethecloud"] Building on Crisp Thinking's 10-year heritage of helping social networks, apps and kids' games protect more than 420 million users – and of stopping 1.9 million possible sexual predators attempting to speak to 9.5 million children – Combat provides new and advanced protection by examining in real time all the risk factors in every exchange on social media, whether it involves images, video or chat. It is backed by a team of global safety experts based around the world who are alerted within minutes of potential issues, allowing for accurate and rapid intervention. The increased use of image and video sharing on social media and messaging apps brings not only an increased and new set of risks to young people, but also a much bigger problem for the networks and app providers striving to protect users. Combat provides a complete solution, bringing together enhanced discriminating technology of incredible accuracy with a team of over 200 risk protection experts based across the globe and managed from a 24/7 UK command centre. The sophisticated technology within Combat can spot the 'needle in the haystack' from billions of pieces of content, producing almost negligible rates of false positives. Crisp's Social Risk Defence team are on hand 24/7 to interpret, prioritize and react, adding an important human element to the interpretation of risks. [quote_box_center] “The massive increase in real-time image, video and group messaging has unfortunately been more than matched by the activities of on-line bad actors such as child sex offenders, extremist groups and others looking to exploit young and vulnerable people,” says Adam Hildreth, CEO and founder of Crisp Thinking. “We are also seeing an increasing trend in tactics where young people are tricked into sending explicit images or videos which are then used as blackmail for money or to further extort indecent images from the young person. “Combat builds on our ten years of expertise in spotting the bad people, bringing a new level of protection for young people, allowing social networks and mobile apps to offer a new level of safety without compromising functionality, user experience or privacy. “We have chosen to unveil Combat on Safer Internet Day because, as the world-leader in this field, we have already reviewed over 90 billion pieces of user-generated content per year and have protected more than 420 million users over the past 10 years. We are committed to protecting young and vulnerable people who have the right to enjoy the internet without having their safety put at risk.” [/quote_box_center] Combat is delivered as a fully managed SaaS solution available by subscription according to the number of active users on a network or app, giving any social network, mobile app or kids' game unrivalled levels of protection for its users in a highly cost-effective manner. With the fully managed service, a team of 200 risk experts around the globe supplies a unique combination of skill, experience and technological innovation, capable of monitoring social interactions in 15 languages to counter the global explosion in risks from child sex offenders, trolls, extremist groups, radicalisation, sextortion, and child abuse content. ### What Safer Internet Day means for professionals The 9th February 2016 sees the return of Safer Internet Day.
Coordinated in the UK by the UK Safer Internet Centre, the aim of Safer Internet Day is to promote the safe, responsible and positive use of digital technology for children. [easy-tweet tweet="The aim of #SaferInternetDay is to promote the safe, responsible and positive use of digital technology" user="RNelsonLLP"] Childwise, a research agency, recently announced a landmark change in how children spend their time. Children are now, for the first time, spending more time playing and socialising online rather than watching TV. Creating a safer online community therefore has now become more important in society than ever before. Despite the focus of Safer Internet Day primarily being on children, adults can also use this time to consider how they themselves can be more vigilant whilst being online, namely for their career’s sake. In this article, I explore what Safer Internet Day could mean for various professionals in terms of their own conduct, regulation and career. let's look at what Safer Internet Day could mean for various professionals in terms of their own conduct, regulation and career Regulation also extends outside the workplace Regulators for professions such as teachers, doctors, nurses etc. are becoming increasingly alive to issues relating to the online activity of professionals. The regulators regulate professionals as a person, rather than just their professional conduct in the workplace. There has long since been controversy regarding the appropriate boundary of regulation of professionals. As an expert in professional disciplinary defence, I have heard of a number of professionals that have risked losing their careers in relation to their online activity. Many of these individuals had regarded their private online conduct as something that would remain entirely private, only to subsequently face a devastating inquiry by their regulatory body. [easy-tweet tweet="When does your private online conduct stop being private? #SaferInternetDay" user="comparethecloud @RNelsonLLP"] How can professionals keep themselves safe online? Whatever personal views are held on the appropriateness of intervention by regulators into private online activity by professions, it is crucial to accept that this is happening and to act accordingly. Following these useful tips when engaging in online social interactions with strangers should help to ensure that your reputation and career remain safe, avoiding any fitness to practise investigation by your regulator: Never reveal what your professional status is; Never reveal your full name and/or address; Be alert to safeguarding issues relating to children and young people; Report any concerns regarding the safety of children online to the appropriate safeguarding authorities; Don’t fall into the trap of thinking that what you write online won’t ever be traced back to you; Don’t write anything online, even on an anonymous basis, that you wouldn’t be comfortable explaining to your regulator if it came to that. To find out more about Safer Internet Day, visit UK Safer Internet Centre and find out how you can get involved. ### Amazon Web Services Announces Amazon Lumberyard — A Free, Cross-Platform, 3D Game Engine Integrated with AWS and Twitch Amazon Web Services Inc. (AWS) today announced Amazon Lumberyard, a free, cross-platform, 3D game engine for developers to create the highest-quality games, connect their games to the vast compute and storage of the AWS Cloud, and engage fans on Twitch. 
Amazon Lumberyard helps developers build beautiful worlds, make realistic characters, and create stunning real-time effects. With Amazon Lumberyard’s visual scripting tool, even non-technical game developers can add cloud-connected features to a game in minutes (such as a community news feed, daily gifts, or server-side combat resolution) through a drag-and-drop graphical user interface. [easy-tweet tweet="#CloudNews: @AWS launches #AmazonLumberyard - free cross platform #3D game development engine" user="comparethecloud"] AWS is also announcing Amazon GameLift, a new service for deploying, operating, and scaling session-based multiplayer games. With Amazon GameLift, Amazon Lumberyard developers can quickly scale high-performance game servers up and down to meet player demand, without any additional engineering effort or upfront costs. New AWS service, Amazon GameLift, lets game developers quickly scale their session-based multiplayer games to support millions of players with AWS’s highly available cloud infrastructure Amazon Lumberyard is free, and available today in beta for developers building PC and console games, with mobile and virtual reality (VR) platforms coming soon. Amazon GameLift has a small per-player fee, plus for both Amazon GameLift and Amazon Lumberyard, developers pay standard AWS fees for AWS services used. To learn more about Amazon Lumberyard, visit http://aws.amazon.com/lumberyard. To learn more about Amazon GameLift, visit http://aws.amazon.com/gamelift. Building technology capable of making the highest-quality games is difficult, time-consuming, and expensive. Game developers either have to spend several years creating the more than 20 significant technology components that are needed to build the highest-quality games (such as real-time graphics rendering, world and character editors, animation systems, physics simulation, low-latency networking, particle systems, scripting systems, terrain generation, and more), or they have to invest in commercial game engines that are expensive and do not include native integrations with Twitch or cloud back-end technologies (like AWS). And, as live, multiplayer games have risen in popularity, game developers have also had to invest thousands of hours to build and manage the back-end infrastructure needed to connect their games to the cloud and support high volumes of fluctuating player traffic. Amazon Lumberyard is the only game engine that gives developers a combination of free, feature-rich development technology, native integration with the AWS Cloud to make it easier for developers to create live and multiplayer online games, and native integration of Twitch features that help developers connect their games to the world’s leading social video platform and community for gamers. By starting game projects with Amazon Lumberyard, developers are able to spend more of their time creating differentiated gameplay and building communities of fans, and less time on the undifferentiated heavy lifting of building game engine components and managing server infrastructure. And, with Amazon GameLift, developers can be sure that on day one, their live, multiplayer games can scale to support millions of players, while maintaining the high performance gamers expect. 
New game engine helps developers create the highest-quality games, build cloud-connected gameplay features, and build communities of fans on Twitch; beta available for free download today “Many of the world's most popular games are powered by AWS's technology infrastructure platform," said Mike Frazzini, Vice President of Amazon Games. “When we’ve talked to game developers, they've asked for a game engine with the power and capability of leading commercial engines, but that's significantly less expensive, and deeply integrated with AWS for the back-end and Twitch for the gamer community. We're excited to deliver that for our game developers today with the launch of Amazon Lumberyard and Amazon GameLift." With Amazon Lumberyard, game developers can: Create the highest-quality games—Amazon Lumberyard helps developers build rich, engaging, world-class games—from a full-featured editor, to native code performance and stunning visuals, and hundreds of other features like performant networking, cloth physics, character and animation editors, particle editor, UI editor, audio tools, weather effects, vehicle systems, flocking AI, perception handling, camera frameworks, path finding, and more. Developers also have full access to Amazon Lumberyard source code, making it easy to customize the technology to create differentiated gameplay. Build live, online features in minutes—Live,online games enjoy higher engagement and retention than offline games. Amazon Lumberyard’s visual scripting tool, with its drag-and-drop graphical user interface, makes it easy to build connected game features that access AWS services, such as DynamoDB, Lambda, and S3. In minutes, game designers can create features such as granting a daily gift or sending in-game notifications without having to write a single line of code. Amazon Lumberyard also comes integrated with AWS’s C++ SDK to provide developers access to dozens of AWS services through native C++ code, the most common language used to make games. Reach and engage fans on Twitch—Amazon Lumberyard is integrated with Twitch so that developers can build gameplay features that engage the more than 1.7 million monthly broadcasters, and more than 100 million monthly viewers on Twitch. With Amazon Lumberyard's Twitch ChatPlay, developers can use a drag-and-drop visual scripting interface to create gameplay features that let Twitch viewers use chat to directly impact the game they are watching in real-time. For example, with Twitch ChatPlay within Amazon Lumberyard, a developer could build a game that lets viewers on Twitch control a character or vote on game outcomes using chat commands like "up," "down," "live," or "die." And, the Twitch JoinIn feature within Amazon Lumberyard helps developers build games that allow Twitch broadcasters to instantly invite their live audiences to join them side-by-side in the game, with a single click, while others continue to watch. Amazon GameLift, a new managed service for deploying, operating, and scaling session-based multiplayer games, reduces the time required to create multiplayer back-ends from thousands of hours to just minutes. With a few quick steps in the AWS Management Console, developers can deploy game servers across the AWS Cloud, start connecting players to games, and scale capacity up and down to meet player demand. Developers can also identify operational issues using Amazon GameLift’s real-time reporting of game server capacity and player demand. 
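To give a sense of what those "few quick steps" look like outside the AWS Management Console, here is a minimal sketch using the AWS SDK for Python (boto3) rather than anything published with the announcement; the fleet and player identifiers are placeholders, and it assumes a GameLift fleet has already been created and deployed.

```python
# Minimal sketch of driving Amazon GameLift from boto3: start a game session on
# an existing fleet, reserve a player slot, and adjust fleet capacity.
# FLEET_ID and PLAYER_ID are placeholders, not real resources.
import boto3

gamelift = boto3.client("gamelift", region_name="us-east-1")

FLEET_ID = "fleet-12345678-aaaa-bbbb-cccc-123456789012"  # hypothetical fleet ID
PLAYER_ID = "player-42"                                  # your game's own player ID

# Start a new game session on the fleet for up to eight players.
session = gamelift.create_game_session(
    FleetId=FLEET_ID,
    MaximumPlayerSessionCount=8,
    Name="demo-match",
)["GameSession"]

# Reserve a slot in that session; the response carries the IP address and port
# the game client should connect to.
player = gamelift.create_player_session(
    GameSessionId=session["GameSessionId"],
    PlayerId=PLAYER_ID,
)["PlayerSession"]
print("Connect to", player["IpAddress"], player["Port"])

# Scale the fleet up or down to follow player demand.
gamelift.update_fleet_capacity(FleetId=FLEET_ID, DesiredInstances=4)
```

The same calls can sit behind a matchmaking or session-directory service, which is where GameLift's real-time reporting of capacity and player demand becomes useful for spotting operational issues.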
With Amazon GameLift and Amazon Lumberyard, developers can create multiplayer back-ends with less effort, technical risk, and time delays that often cause developers to cut multiplayer features from their games. “Amazon has been a great partner and we are deeply excited about both Amazon Lumberyard and Amazon GameLift,” said Josh Atkins, Vice President of Creative Development, 2K Games. “The integration of a fantastic game engine with amazing cloud services presents a wonderful opportunity for both independent developers and established publishers.” “Developing and maintaining a back-end infrastructure for multiplayer games requires a lot of time, resources, and expertise that are beyond the reach of many developers,” said Chris Jones, Chief Technology Officer, Obsidian Entertainment. “Amazon GameLift removes much of that burden from the developer, allowing them to focus their energy on bringing their great game ideas to life.” Pricing and Availability Amazon Lumberyard is available for download in beta for PC and console game developers. The Amazon Lumberyard engine is free to use, including source. There are no seat fees, subscription fees, or requirements to share revenue. Standard AWS fees apply should developers choose to use other AWS services. Amazon GameLift is available at launch in the AWS US East (N. Virginia) and US West (Oregon) Regions, with additional Regions coming soon. Amazon GameLift costs $1.50 per 1,000 Daily Active Users plus the standard AWS fees for AWS services they consume. ### Can Cloud be used in the Financial Services Industry? Compare the Cloud recently presented a webinar with Dell, explaining the intricate details of whether to Cloud, or not to Cloud, when discussing the Financial Services industry. [easy-tweet tweet="Catch up on the first webinar on #Financial Services from Dell with @NeilCattermull" user="comparethecloud"] The first webcast in the series happening throughout Q1 was titled Dell Cloud Insights for Financial Services Series: Can ‘cloud’ be used by Financial Services? We covered a lot of ground in relation to the financial sector during the webcast, including approaches to omni-channel banking and improving customer experience through a cloud computing foundation. One of the key things that was discussed was the fact that in the new digital age, banks are not embracing the agility that is needed to stay in business. We also looked at how new FinTech approaches such as mobile payment systems including Apple Pay are affecting the financial giants. in the new digital age, banks are not embracing the agility that is needed to stay in business One of my favourite topics to talk around - Security in Financials - was another big focus on the webcast. Security is a huge challenge for most banks, and increasingly private and hybrid cloud models can offer the same if not better security than they currently employ, but with increased agility. During the webcast participants learned about how updated Cloud models for Financial Services Firms offer better agility, together with best of breed security for a better experience to financial institutions and their customers. As well as approaches to Cloud delivery in banking – Which approach suits regulated entities and why? Subsequent sessions in the series of webcasts are happening throughout Q1, and Dell will drill into more detail on specific areas such as security and European regulations, with guest speakers known for their experience in these areas. 
A short overview of Dell’s Cloud Services solutions and coverage of a recent customer partnership was provided during the webcast. Listed below are some of the questions asked during the session:

Migrating legacy systems can mean many things. It is always a good idea to stop, assess, and make the best decision for the bank's goals and technological maturity level. Otherwise, are they risking moving from "old" legacy to "new" legacy?

Concerned that Cloud providers might do their own analytics on my data. Is this a legitimate concern?

Can Financial Institutions operate on Public Clouds or should it always be a Private Cloud?

What about VAT on different services? And is where the services are performed an issue?

To hear the whole webinar recording and to view the presentation please click here. There are 13 webcasts from Dell planned in the series, with the following webcasts scheduled in March; supporting invitations will be distributed via social media to keep the momentum and help build out the community. Please click on the registration links below for further information:

Mar 02 2016, 2:00 pm: Understanding the restrictions that regulations actually impose on cloud usage, presented by Frank Jennings, lawyer specialising in cloud and technology, Wallace LLP

Mar 15 2016, 2:00 pm: Financial Services workloads that make sense in the cloud (case study), presented by Nick Hyner, director of Cloud Services for EMEA, Dell Services

Mar 22 2016, 2:00 pm: Can Public Clouds be suitable for traditional / legacy workloads? Presented by Jay Hibbin, Evangelist & Sales Director, Century Link

### IronKey™ by Imation acquisition by Kingston Digital

In order to further strengthen its investment and commitment to Encrypted USB technology, and to further evolve its solutions for customers of all levels who want to protect mobile data, Kingston today announced it has acquired the technology and assets of IronKey™ from Imation Corp. [easy-tweet tweet="#CloudNews: Kingston Digital acquires IronKey™ by Imation" user="comparethecloud"] Due to regulatory and legal requirements there will be a 30-day integration process during which both companies can complete the final details. Until this period ends it will be business as usual for both companies. Expect to see further details as the acquisition completes.

### Cloud strategy - the next chapter

Recently, analysts Esteban Kolsky of thinkJar and Denis Pombriant of Beagle Research Group conducted research on cloud strategy to get at the heart of enterprise cloud application adoption, as well as to identify some of the challenges companies face with their deployment. [easy-tweet tweet="#Cloud adoption may be here, but #CloudStrategy is in disarray" user="comparethecloud"] Overall the research shows that while cloud adoption is here, cloud strategy is in disarray. So are we at risk of global cloud disillusionment if businesses do not develop a cloud strategy which works across the business, and one which all those who introduce and implement cloud solutions can operate within? The results show there are several aspects which contribute to the lack of coherent cloud strategy adopted in business.

The governance problem

No one department is in full control, meaning there is no uniform governance of cloud strategy. 71.3% of cloud systems decisions are made with either IT supporting departments making systems decisions or IT collaborating with departments making joint decisions.
13% of cloud decisions are made without IT participating, with departments or lines of business (LoBs) making decisions on their own, and nearly 16% are made entirely as an IT decision.

IT NEEDS TO BE PART OF ANY CLOUD STRATEGY IN ORDER TO ENSURE COMPLEXITY IS REDUCED

While departments may be closer to business requirements for success, IT needs to be part of any cloud strategy in order to ensure complexity is reduced and to avoid the pitfalls that can come with a disjointed strategy. Cloud strategy is not designed with a ‘platform first’ approach, which can result in large numbers of apps on many different platforms, making them unwieldy to manage. This counteracts some key benefits of creating simpler business solutions, which save both time and money.

Multiple platforms in use

39.8% of companies still use more than four platforms, causing problems in delivering real-time business insight and one true customer record. If decisions are ‘apps first’ based, there is no consideration of how data in apps can work collectively to give true business insight, enabling companies to act faster and serve customers more effectively. This also causes businesses to duplicate work systems, which can be eradicated by reducing the number of platforms or using just one. Only 14.8% of companies surveyed were using just one platform, showing that there is room for improvement in reducing the complexity of implementing and managing cloud applications. [easy-tweet tweet="There is room for improvement in reducing complexity to implement and manage #cloud #apps"]

Technical problems still an issue

Technical problems still ranked highly when it comes to cloud applications, and these gathered the highest number of votes. This could stem from the complexity of managing cloud apps on many platforms, but it also shows that IT needs to increase its capability, awareness and education in managing cloud applications. This will see IT gaining confidence within its organisation and have others trust its ability to provide a sound cloud strategy. It will enable IT to reduce the potential number of problems that adopting cloud applications engenders for organisations, and enable IT departments to remain lean.

Problems that can be resolved with the right cloud strategy

[quote_box_center] The seven most popular issues facing cloud app customers were found to be those which cloud applications are heralded to resolve:
capped or reduced IT budgets
slow performance
application accessibility
synchronisation problems
operating multiple databases
number of applications and enabling technologies in use
ongoing data quality concerns [/quote_box_center]

Companies need to acknowledge that simply bringing in a new cloud application is not the only requirement for solving a business problem. If the right strategy and infrastructure are missing, the investment made to introduce a cloud application will be lost and could worsen an existing issue or introduce new problems to a business.

Business benefits

One aspect that comes out strongly from the businesses surveyed is that the introduction of cloud applications is enabling some business benefits. The speed of preparing management information has increased in 46.7% of companies, 54.7% have seen it improve the quality of management information and 49.3% report improved customer service.
[easy-tweet tweet="The speed of preparing #management information has increased in 46.7% of companies thanks to #Cloud"] Looking forwards Imagine what can be achieved if complexity is further reduced with the right cloud strategy and an optimal decision making infrastructure in place for companies to drive forwards? The research shows it is simply not enough to adopt cloud applications, but cloud decisions need to be taken in the right way with the right underlying strategy in order to reap the full benefits. Businesses can then use insights and analytics to really see efficiency gains, plus the ability to act faster and deliver excellent customer service through achieving a full 360 customer view. The cloud industry may be in full swing as far as adoption is concerned but we can look forward to seeing its next chapter as firm strategies are introduced to allow cloud to realise its full potential. If business ignores the strategy issue, the whole industry could risk disillusionment as it fails to live up to its promises. IT IS SIMPLY NOT ENOUGH TO ADOPT CLOUD APPLICATIONS, BUT CLOUD DECISIONS NEED TO BE TAKEN IN THE RIGHT WAY WITH THE RIGHT UNDERLYING STRATEGY IN ORDER TO REAP THE FULL BENEFITS ### BTI Systems acquisitions strengthens Juniper NFV Play This proposed acquisition of BTI Systems looks like good news for Juniper Networks from a number of different perspectives. For one, BTI is not short of customers – its website lists a total of 380 including data centre specialists Equinix and Interxion and cloud hosting company Rackspace. [easy-tweet tweet="Analysis from @AxiansUK shows the acquisition of BTI Systems was a smart move for @JuniperNetworks"] The jewel in BTI’s crown appears to be the NFV enabled BTI 7800 Series Intelligent Cloud Connect platform which debuted in 2014, a device combining high capacity optical switching with MPLS routing and NFV based applications targeted squarely at cloud service provider data centres. That product in particular has seen some traction in the market, deployed by Asia-Pacific network services provider Pacnet last year to provide 10G and 100G connectivity to its ISP and cloud services provider customers in the region, for example. Global colocation solutions provider CyrusOne is also using a 100G enabled BTI 7800 platform to address demand for colocation services and data centre interconnect bandwidth for its own customers, a number it calculates at 675. Bringing BTI under its wing looks like it will strengthen Juniper’s position in what has become an increasingly competitive market Bringing BTI under its wing looks like it will strengthen Juniper’s position in what has become an increasingly competitive market for 100G metro optical and data centre interconnect equipment, one that has come to be dominated by a small number of very large players following recent merger and acquisition activity. Market research firm IHS/Infonetics has predicted huge growth for metro optical port shipments for example, which it calculates were up 145 percent in 2014 over the previous year. That was anticipated to grow another 118 percent in 2015, but the market is really expected to explode in 2016, when new equipment built specifically for metro Ethernet market, including the BTI 7800 Series, should see wider adoption amongst network service providers. 
From the NFV perspective, the challenge for Juniper and BTI is to convince those same cloud and network service providers, which are primarily seeking network and data centre capacity upgrades, that embedded NFV components are worth implementing on top to open up additional value-added revenue streams and help accelerate NFV adoption elsewhere. [easy-tweet tweet="The challenge for Juniper and BTI is to convince CSPs that embedded NFV components are worth implementing"] Certainly, BTI’s portfolio gives Juniper an important, additional component in its broader SDN/NFV strategy aimed at building virtual network overlays across multiple data centres to offer greater redundancy and resilience for cloud services. Juniper confirmed it will integrate BTI technology with its own NorthStar WAN SDN Network Controller, for example.

### ‘Thin’ is the new green

Core benefits of a thin client infrastructure include the ability to reduce TCO and improve manageability and security, but have you ever considered its low carbon credentials? [easy-tweet tweet="Thin is the new green, and green is the future says @DavidAngwin of Dell" user="comparethecloud" hashtags="thinclient"] History was made at the 2015 United Nations climate change conference in December as world leaders agreed upon a final deal which included a "legally-binding" agreement to keep global warming below 2˚C. The international community, governments and the public around the world will now be keeping a watchful eye over organisations to ensure they are playing their part in delivering against this commitment.

UK businesses will be expected to devise prudent and long-term policies to rein in their energy consumption. Those that don’t will risk incurring hefty fines and a public backlash. IT sustainability will of course be a key part of any carbon footprint reduction plan, and one approach which is definitely worth considering is deploying thin clients within a virtual desktop infrastructure (VDI). Thin clients have been around for decades and today’s improved systems are an increasingly popular approach, particularly within the healthcare, education, government, and retail sectors.

Simply put, a thin client is a desktop device connected over the corporate network to a central server. All application processing and storage is done by the server and the thin client is merely a device for input, display and output. It uses cloud technology, and desktop resources are centralised into one or more data centres – either on or off-premise. [easy-tweet tweet="Centralising infrastructure to a server running multiple #ThinClients is green and it optimises #hardware resources"] It is this centralisation which delivers many of the advantages thin clients are known for - namely optimisation of hardware resources, reduced software maintenance, and improved security - but there is a significant ‘green’ element which shouldn’t be overlooked.

[quote_box_center] When considering the entire lifecycle, cradle-to-grave, thin clients offer numerous environmental benefits. These include energy efficiency, a longer life-span, improved reliability, less packaging, and fewer raw materials:
Lower failure rate: Thanks in part to the absence of moving parts such as hard disks or fans, thin clients use as little as 6 watts of energy, which can be a 90 percent reduction compared to existing PCs.
This will significantly reduce energy usage and expenditure, especially across large sites.
Longer life-span: Thin clients have no hard drive and host no applications. Studies show that thin clients last on average twice as long as conventional desktop PCs, which are typically considered obsolete after three to five years because of advances in computing speeds, storage capacity and functionality.
Heat reduction in working environments: Low energy consumption also means low heat emissions, which dramatically reduce the need for air-conditioning. Aside from the environmental impact, excessive heat can make working environments uncomfortable for users, and research shows this is likely to impact on productivity and morale.
Reduction in packaging and transportation: Thin clients are 60 to 70 percent lighter in weight and smaller in size than PCs, which means a reduction in packaging. Furthermore, freight costs, and the environmental impact of transportation, can be reduced as many more devices can fit into each shipping container.
Fewer raw materials and waste: Manufacturing a computing device requires raw materials such as metals, water, electricity, and chemicals. Thin clients contain fewer components and in turn fewer mechanical fasteners and adhesives, all of which need to be disposed of at the end of the product life. [/quote_box_center]

Thin clients and a Virtual Desktop Infrastructure are an economical solution for organisations that are looking to significantly reduce their carbon footprints. Modern systems deliver a greener, leaner and more flexible and productive working environment than you might have realised.

### 5 Value Imperatives for How to Measure Virtual TCO in 2016

How we measure the financial implications of investments into virtual infrastructure will dramatically change during 2016. Where previously you would only really measure the more traditional negatives, you can now fully understand where your server infrastructure is operating at its best capacity. [easy-tweet tweet="How do you determine the best virtual #TCO model in 2016? #cloud #cost" user="comparethecloud"] Here are five questions that you should ask yourself when trying to determine the best virtual TCO model in 2016:

Will the solution be suitable for your technology ethos?

My ideal set-up is one where I can see critical components in real time. This means that I have complete visibility over my entire infrastructure. If this type of system is similar to yours, you should aim to improve your time-to-resolution by at least 70%.

Can you trust your data?

A siloed system is incredibly reliable; you trust that it can perform in most circumstances. I know from personal experience that a legacy, siloed system has the capability to perform in any situation. These days systems and infrastructures are so dynamic that there are things constantly changing on the fly. You need to know that your system can cope with these rapid changes while still performing to a high level.

What are the benefits of your investments in performance visibility?

It’s imperative that your virtual infrastructure gives you an advantage over your competitors.
However, you can spend as much as you like improving the infrastructure of the system, but if you don’t correctly sort the wheat from the chaff, you will find yourself running an inefficient system. You need to ensure that you identify any redundant apps as well as any applications which aren’t performing to their maximum potential. The benefits of an efficient system are obvious to any large-scale IT operation. [easy-tweet tweet="Identify redundant #apps and apps not performing to their maximum potential says @AtchisonFrazer"]

Do your data analytics identify the correct problem?

The relationship between identifying slow performance through data analytics and real-time machine-learned recommendations needs to be cohesive and rewarding. Some of the major benefits from strong analytics include increased revenue, productivity and access, as well as a decrease in overall man-hours.

Can you properly identify every financial benefit?

Properly identifying dollar benefits that aren’t just e-commerce QoS responsiveness or SaaS availability can sometimes seem like the Neverending Story: it can be difficult to do, as well as incredibly time-consuming. Some of the key things to watch out for in order to avoid any revenue leakage are poor performance, customer experience, productivity enhancements, workload volume and easy-to-achieve SLA commitments.

### The rise and rise of customer experience management

There is probably more focus now on providing a good customer experience than at any other point in time. There are a number of things to consider when looking at customer experience management software – what’s the best way for brands to approach this? [easy-tweet tweet="We live in an era of the empowered customer, how are #CEM solutions addressing this?" hashtags="marketing, data, bigdata"] We live in an era of the empowered customer. Where many consumers were once inert and passive about dealing with and accepting a bad customer experience, they will now not only shout about any bad experiences on social media but are also much more willing to take their custom elsewhere. With poor customer experience potentially damaging a brand and also impacting the bottom line, this has meant that smart organisations are delivering a customer experience – smart, quick, efficient and in keeping with the customer’s preferences – that encourages loyalty and attracts new customers. But what is the secret to achieving this? It is ultimately down to having the right people and the right processes in place, but it’s also a question of deploying the best and most effective technologies to manage the customer experience.

What exactly IS customer experience management?

Industry analyst organisation Gartner has defined customer experience management as “the practice of designing and reacting to customer interactions to meet or exceed customer expectations and, thus, increase customer satisfaction, loyalty and advocacy.” It is a strategy that requires process change and many technologies to accomplish. What constitutes a good customer experience varies according to the individual, but good customer experience management (CEM) solutions will almost inevitably involve using the data a brand holds on its customers in the right way, to anticipate their needs and address them before they even become an issue.
Strong analytics enable brand, product and service to continuously reflect emerging customer preferences. Some organisations will remain stuck in the traditional way of delivering a customer experience, but those with vision and purpose will focus on leveraging a number of processes and technologies to make customer intelligence visible and accessible throughout the business, to whoever may need it, whether a front-line customer service agent or otherwise. [easy-tweet tweet="Customer experience management is about using the #data brands hold on its customers in the right way" hashtags="CEM"]

To build, or to buy?

The CEM choice for most brands is essentially whether to build or to buy. While there is some merit in an organisation looking to develop its own CEM offering, this is typically a lengthy process and impacts on legacy platforms, while time is of the essence. If a company is offering a poor customer experience, then waiting for internal teams to build a system is not a feasible option. There are a number of options to consider for any brand looking to implement CEM software, or update their existing system, and they would be best advised to look for best of breed point solutions to bring them up to speed. A big part of this should include looking at SaaS offerings. These are more flexible than on-premise options, and allow a company to scale much more effectively. While there are also cost benefits to using cloud CEM, the real benefit is the flexibility it affords and the ability to be easily configured to suit an organisation’s specific needs.

Highly regulated businesses still might prefer to own the data and perhaps even the server farm on which it is held, but the security with cloud-based options has improved immeasurably over the past few years. Customer data is very precious of course, and the repercussions of a data outage of some kind are not good, but the security involved with SaaS CEM is as strong as most organisations will ever require. We consult with many companies on how best to approach CEM, and unless there is a compelling reason otherwise – usually around regulation – SaaS is a better option for most. There is more choice, more contemporary design and architecture and they are more flexible than on-premise solutions; SaaS providers are generally more willing to work with organisations of different sizes and stages of development.

### Amir Hashmi of zsah talks to CTC for #CloudTalks

Amir Hashmi talks to Compare the Cloud about his company zsah Technology Group for #CloudTalks.

### Vendors need to adapt in the cloud era

The cloud promises to change the way IT is consumed and managed forever. The pace of change is accelerating and we can safely assume that, in the not-too-distant future, most or all services will be delivered from the cloud, on a subscription model. [easy-tweet tweet="VMware's @Philcroxy shares his thoughts on why vendors need to adapt to #cloud" user="comparethecloud"] Customers will, of course, still need supplier organisations to provide and manage those services – as hosting and managed services providers, and as specialist aggregators, administrators and performance managers. This role will naturally fall to the ‘reseller’ – although they will need to transform their businesses quite radically in order to meet the changing needs of customers. But what lies beyond the cloud for vendors has not been discussed at such length.
How will they be impacted by these changes? Will they have to change and adapt as much as their reseller partners? At VMware, we have always taken the position of being the thought-leader and visionary within our sector of the market. We pioneered virtualisation and when we brought the concept within the reach of every organisation that needed to run multiple servers and optimise use of resources, it opened up a myriad of new opportunities both for resellers and end customers.

Now VMware is leading the way again, with technology that will also enable both the network and storage infrastructure to be virtualised and managed with software. But the scenario in the market is entirely different today. The first wave of virtualisation meant we could reduce the size and cost of managing data centres; in this new era, the availability of cloud services makes an almost infinite number of infrastructures and approaches viable. Apps, services, infrastructure, storage, security, the entire network, can be virtualised and situated on-premise or in the cloud and managed remotely. This is shifting power towards the users.

The possibilities are now so vast and open that customers can effectively design the ideal infrastructure in concept, take it to their trusted provider, i.e. the reseller (how ‘resellers’ develop their capabilities in this respect is a subject worth considering separately), and ask them to refine and deliver it. They will be expected to advise on and provide the optimum infrastructure for a given application workload, whether that is ‘on-premise’ at the customer, in a managed private cloud or in a public cloud. What’s more, they’ll then need to be able to seamlessly move workloads from one environment to another as requirements change, without the need for complex and expensive migrations. [easy-tweet tweet="Entire #networks can be #virtualised on-premise or in the #cloud and managed remotely shifting power to users"]

This has profound implications for us as a vendor. We have been aware of this far-reaching change taking place for some time, and have effectively re-engineered the way we drive our research and development. We are also changing the way we work with our partners and customers. Their strategies and directions – what can be achieved by the customer with technologies and services, rather than simply what technology can do – are now the starting point for new ideas and developments. We are also being much more sensitive and adaptable to the individual needs of resellers and integrators. Our partners have the closest regular contact and engagement with end-user customers and, in the era of hybrid infrastructures upon which organisations have a very high level of reliance, we expect relationships between the customer and reseller to become much closer and more interdependent. This will make partner relationships even more important to VMware too.

Instead of today’s clearly-defined relationships between us, partners and customers, we envisage a much tighter and more complex set of interactions. VMware and its partners will have to become much more aware of and responsive to customer needs than we ever have been in the past.
Having already undergone this transformation ourselves, offering our partners and their customers ultimate choice in how they procure and consume our software, we are leading resellers in the same direction. We are building what we believe is a much more harmonised and liquid ecosystem that will allow partners and our own business strategy and direction to be guided by the real needs of our mutual customers. ### Black Duck Hub Receives ‘Ready for IBM Security Intelligence’ Validation Black Duck®, a global leader in automated solutions for securing and managing open source software, today announced that it has received IBM PartnerWorld’s Ready for IBM Security Intelligence designation for its Black Duck Hub security solution. As a result, Black Duck Hub has been validated to integrate with IBM Security AppScan to better protect customers around the world. [easy-tweet tweet="#CloudNews: @black_duck_sw receives 'Ready for @IBM Security Intelligence' validation" user="comparethecloud"] The technology integration allows organisations to identify and manage application security risks for both custom-developed and open source code through a single view within IBM Security AppScan that provides comprehensive information about vulnerabilities and the ability to manage remediation. Black Duck Hub identifies and inventories the open source in applications and containers and maps any known security vulnerabilities by comparing the inventory against data from the National Vulnerability Database (NVD) and VulnDB. Hub also provides continuous monitoring for newly discovered open source vulnerabilities. IBM® Security AppScan® Enterprise enables organizations to mitigate application security risk, strengthen application security program management initiatives and achieve regulatory compliance. Organizations worldwide are struggling to keep their applications safe from vulnerabilities. Among their top challenges are visibility and control over risks in open source code. Thousands of new vulnerabilities in open source are reported annually and 98 percent of organizations are using more open source in their applications than they are aware of, leaving them exposed to vulnerabilities such as Heartbleed, Shellshock, Ghost or Venom. “It’s not uncommon for open source software to make up 40 to 50 percent of a large organization’s code base. By integrating Black Duck Hub with AppScan, IBM customers will gain visibility into and control of the open source they're using. This will enable them to better understand and reduce security risks,” said N. Louis Shipley, Black Duck CEO. “We’re dedicated to enabling a holistic approach to enterprise application security management,” said Lawrence Gerard, Program Director, Application Security, IBM. “Through our technology integration with Black Duck, our joint customers will be able to identify and remediate security vulnerabilities in both their open source and custom code – all through IBM Security AppScan Enterprise. This gives them a more complete and effective way to manage application security." 
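To picture the inventory-to-vulnerability mapping described above, here is a deliberately simplified sketch. It is not Black Duck's or IBM's implementation: the component names, versions and CVE entries are made up, and a real tool matches a full software inventory against the complete NVD and VulnDB data rather than a hard-coded list.

```python
# Illustrative only: a toy version of mapping an open source inventory to
# known vulnerabilities. The advisory data and inventory below are made up;
# a real scanner works against the full NVD/VulnDB datasets.
from dataclasses import dataclass

@dataclass
class Advisory:
    cve_id: str
    component: str
    affected_versions: set

# A tiny stand-in for a vulnerability database snapshot.
ADVISORIES = [
    Advisory("CVE-2014-0160", "openssl", {"1.0.1", "1.0.1f"}),  # Heartbleed
    Advisory("CVE-2014-6271", "bash", {"4.3"}),                 # Shellshock
]

# The open source components discovered in an application or container image.
inventory = {"openssl": "1.0.1f", "bash": "4.4", "zlib": "1.2.11"}

def map_vulnerabilities(inventory):
    """Return (component, version, CVE) tuples for every known match."""
    hits = []
    for advisory in ADVISORIES:
        version = inventory.get(advisory.component)
        if version in advisory.affected_versions:
            hits.append((advisory.component, version, advisory.cve_id))
    return hits

for component, version, cve in map_vulnerabilities(inventory):
    print(f"{component} {version} is affected by {cve}")
```

The value of a product like Black Duck Hub lies in doing this matching continuously and at scale, and in feeding the results into a remediation workflow such as IBM Security AppScan, rather than in the lookup itself.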
Key features available to IBM AppScan customers using Black Duck Hub:
Comprehensive identification of open source: Rapid scanning and identification of open source libraries, versions, license and community activities using the Black Duck® KnowledgeBase™ – the industry’s most complete database for open source
Assessment of open source risks: Automated mapping of open source inventory to known vulnerabilities
Integrated remediation orchestration and policy enforcement: Open source vulnerability remediation prioritization and mitigation guidance
Continuous monitoring for new security vulnerabilities: Ongoing monitoring and alerting on newly reported open source security vulnerabilities

### 6 Nations 2016: How data is paving the way for sporting success

At the highest levels of sport, the finest margins can mean the difference between success and failure, driving teams to hunt for any competitive edge possible. Increasingly, data is providing this edge, giving coaches and players access to greater insights into match day performance. [easy-tweet tweet="#data #analytics is giving coaches the edge in the #6nations #rugby says @bsquared90"] In recent years, the 6 Nations rugby tournament has been a closely fought affair, with the previous three championships all decided on points difference. Given these fine margins, it’s hardly surprising that professional rugby coaches have embraced data analytics in a big way in order to help get their team over the line.

Accenture, the Official Technology Partner of the RBS 6 Nations Rugby Championship, has access to tournament data going back to 2000 and uses its analytics platform to measure and compare a huge range of statistics. Teams of data scientists are able to measure this information and present it in a way that makes it easy for coaches to evaluate the relative strengths and weaknesses of each individual player or team. The use of data in sports has helped to make both players and trainers more accountable, and it is not just rugby that is getting on board. The NFL already utilises RFID sensors, and other examples of IoT sporting equipment are also being developed to collect information.

However, former South Africa coach Nick Mallet knows from personal experience that data is not the be all and end all, particularly when there are human factors at work that are difficult to quantify. [caption id="attachment_34690" align="alignright" width="300"] 2015 Italy vs Wales Game Tracker[/caption] “There was a young, up and coming number eight called Bobby Skinstad and in every one of the match stats, he was better than Gary Teichmann. I picked him purely on stats, which was a hell of a bad mistake, because Teichmann was an outstanding leader,” he said. “So I got rid of a captain to put in a talented player, but I lost ten per cent of everyone else’s performance. And arguably I lost the World Cup in 1999 because of it.” [easy-tweet tweet="Sometimes even #bigdata #analytics can't predict a winner, the human aspects play a big part" user="bsquared90"]

Similarly, Ireland’s Paul O’Connell was not, statistically speaking, in last year’s 6 Nations team of the tournament. However, most viewers of the 6 Nations would agree that O’Connell’s influence as a leader and talisman was vital for Ireland in retaining the championship in 2015. This highlights the importance of not trusting data without its surrounding context.
Whether you’re based in a sporting environment or a business one, insights can only be gained by utilising the expertise of your data scientists in combination with individuals that are familiar with its contextual surroundings. In this case, Nick Mallet has been able to advise the Accenture team about the types of rugby datasets that are most useful. For example, giving more penalties away than the opposition would generally be seen as a negative attribute, but without additional context, such as where on the field these penalties were given away, it is impossible to know if this metric played a key role in the outcome of a match. In isolation, data is largely meaningless. Without data, however, it is easy to dismiss opinions as lacking evidence, and when data is used to support your claims it can deliver a powerful advantage. [easy-tweet tweet="Don't trust #data without its surrounding context says @bsquared90" user="comparethecloud" hashtags="bigdata, analytics"]

Data is also being used to help rugby fans engage with the sport, by utilising another piece of cutting edge technology. Accenture has developed a proof-of-concept virtual reality experience to enable fans to immerse themselves within the game, exploring real-life venues before being presented with visualisations created from analysed game data. Creating a clear and engaging visual representation of data is extremely important if the information is to be fully understood, and Nick Millman, managing director, Big Data and Analytics Delivery Lead for Europe, Africa and Latin America at Accenture, believes that combining VR and big data could have broad applications for a wide range of industries.

“In this proof of concept, we used analytics and overlaid visualisations of the insights onto a virtual reality environment,” he explained. “This application could provide some interesting potential uses for businesses; for example, we could create realistic training scenarios for hazardous work environments, test customer experience in a number of disparate locations, or explore and detect patterns in big data using 3D imagery.” [caption id="attachment_34695" align="alignleft" width="300"] 2015 Team of the Tournament[/caption]

The amount of data available to sporting teams and corporate businesses is growing all the time. For organisations, the challenge is no longer how to acquire data, but how to use it effectively. With analytics platforms, interactive dashboards and virtual reality technology progressing at a rapid rate, there are also a multitude of options when it comes to presenting data. As we approach this year’s 6 Nations tournament, sporting teams and businesses must both remember that data must be contextualised, analysed and acted upon correctly if it is to help them win over their competitors.

### The cloud is the future of mobile

Nearly 80% of UK organisations have formally adopted at least one cloud-based service. Cloud technology and services are now integrated into all of our lives, making us more connected, more productive, and hopefully freeing up valuable time. This is true of both home and professional lives.
[easy-tweet tweet="#mobile number flexibility has been relatively late to enter the #cloud in full force" user="comparethecloud"] Considering its central place in the modern digital landscape, it seems strange that mobile number flexibility has been relatively late to enter the cloud in full force, but key consumer and business trends are now meaning companies are utilising the cloud to bring mobile communications in-line with other technology. Virtualisation of mobile A cloud mobile number works just like any other mobile number A cloud mobile number works just like any other mobile number, but the number is removed from the physical SIM card and placed in a ‘cloud-based mobile network’.  Users can then access their phone number in the same manner they would access any other cloud-based service – through a connected device. Virtualisation of mobile network services in this way is almost identical to what has been done with music, documents, software, and general file storage: it gives users the flexibility to access their number easily on any connected device, at any time, and from anywhere in the world. Modern businesses; outdated communications Recent figures released by the Office for National Statistics (ONS) show that 76% of businesses do not employ anyone.  At first glance this may appear strange.  How can a company get any work done when it has no employees? What the study in fact demonstrates is the large number of people who are self-employed or operate single person companies. There is also an increasing trend towards companies using contractors and freelancers on a temporary or project-led basis.  Recent figures released by ONS show that 76% of businesses do not employ anyone Businesses that are still employing people are seeing a huge rise in employees bringing their own mobile devices to use at work (BYOD). The most recent Tech Pro Research survey on the subject shows that BYOD has been adopted by 59% of organisations, with a further 13% planning to do so soon.  While this offers benefits to businesses (for a start their employees are now purchasing the devices they need to perform at work!), it does present drawbacks.  When an employee is working for a business their phone number becomes associated with their role, and the contacts the worker makes in connection with their role also become associated with that number.  So what happens when the worker leaves? Cloud mobile for business Cloud mobile numbers can give companies the tools they need to thrive in a modern business landscape. They can allow them to anticipate and prevent issues of increased costs and decreased visibility associated with BYOD, while also enabling a new level of control over a company’s intellectual property (IP). New employees can be allocated a business phone number to their own mobile phone for as long as is required; the business does not have to pay for a new device or contract, and the employee can keep using the handset they know while still retaining their personal number. The business is always in control of the number and when an employee moves on the business can immediately assign the number to their replacement, keeping the contact point with the business rather than the employee.  [easy-tweet tweet="#Cloud #Mobile allows you to run multiple numbers on one device" user="comparethecloud"] Owners of small businesses and freelancers can use a cloud-based mobile service to separate their personal and business communications on one device. 
Owners of several businesses can also separate their communications, by having individual phone numbers for each of their businesses, accessing them all through just one device. Whenever they receive a call, they will know exactly which business it relates to and can answer appropriately, and when they make a call they can choose which number to display on outgoing calls and texts.

Cloud mobile for everyone

Cloud mobile numbers also lend themselves well to being used as temporary numbers, which can be particularly useful for communication in our increasingly digital lives at home. In cases where protecting privacy is crucial, such as dating (both online and in the real world) and using online marketplaces, cloud numbers are an easy and transparent way to make calls or send SMS text messages to someone you are just getting to know. [easy-tweet tweet="Don't trust that #Tinder date enough to give them your real number? Maybe a #Cloud number is the way to go!"] In addition to multiple mobile phone numbers on one phone, cloud numbers allow users to avoid expensive roaming charges when abroad. While data charges can still be a problem, when connected to a Wi-Fi network users can make calls to any mobile phone number around the world for a cost-effective and transparent call rate. This is great for expats, who want a UK mobile number as a point of contact but often struggle to manage cost-effective communications with home while they are based abroad. In personal and business lives, virtualisation of mobile is poised to change the communications landscape forever, enabling people to decide not just whether to take a call or not, but exactly how to take it too.

### EU-US Privacy Shield replaces Safe Harbour: Changing attitudes towards data privacy

The rise of digital technologies has enabled businesses and the data that they hold to be more mobile than ever before. However, multi-national agreements are often required in order to guarantee the security of this data. The most high profile of these, the Safe Harbour agreement, was created by the EU and US between 1998 and 2000 in an attempt to protect citizens’ data on both sides of the Atlantic. [easy-tweet tweet="The EU Commission has announced a new framework for transatlantic #data flows: the EU-US #PrivacyShield"] Back in October last year, however, the Court of Justice of the European Union declared the Safe Harbour agreement to be invalid following a complaint by Austrian citizen Maximillian Schrems. The initial reaction to the abolition of the Safe Harbour agreement was one of uncertainty and confusion. Organisations rushed to amend their user agreements and many considered relocating customer data where possible. The lack of clarity over when the agreement would be replaced only added to the sense of businesses left in limbo.

This finally came to an end this week when the European Commission announced that it had agreed a new framework for transatlantic data flows: the EU-US Privacy Shield. European Commission vice president Andrus Ansip has stressed that the new framework will protect the fundamental rights of Europeans. “We have agreed on a new strong framework on data flows with the US,” he said. “Our people can be sure that their personal data is fully protected. Our businesses, especially the smallest ones, have the legal certainty they need to develop their activities across the Atlantic.
We have a duty to check and we will closely monitor the new arrangement to make sure it keeps delivering. Today's decision helps us build a Digital Single Market in the EU, a trusted and dynamic online environment; it further strengthens our close partnership with the US. We will work now to put it in place as soon as possible."

The EU-US Privacy Shield includes three key elements for protecting personal data belonging to EU citizens:
Firstly, US companies will now be subject to stringent obligations from both the US Federal Trade Commission and European DPAs when importing data from EU citizens.
Secondly, there must be transparency regarding US government requests to access data, and mass surveillance will no longer be carried out.
And finally, EU citizens will have more opportunities to make formal complaints if they feel their personal data has been misused.

Whether or not the EU-US Privacy Shield will completely appease privacy advocates, however, remains to be seen. Although the framework has now been agreed, it will be difficult for businesses and individuals to understand how it differs from Safe Harbour, in practice, until more details are revealed. Neil Laver, Sales Director at Veber, welcomes the EU’s decision, but stresses that it doesn’t mean data privacy is guaranteed. “It's good to see the progress being made by the US and the EU commission,” he said. “However, we really need to see the small print, due later this month, to be fully confident about the agreement. Even then, if companies really want to ensure that their data is safely within the UK or the EU then they need to choose cloud providers and hosting companies, like Veber, who can guarantee this.” [easy-tweet tweet="The EU-US #PrivacyShield agreement is simply representative of changing attitudes towards #data #privacy " user="bsquared90"]

For businesses, the new agreement assures them that they can continue serving their customers in both the US and the EU, but organisations may find that 2016 has a huge impact on the way that they handle data, particularly with the General Data Protection Regulation (GDPR) due for adoption in the coming months. Dave Packer, vice president of product management at Druva, believes that the combination of GDPR and the EU-US Privacy Shield will be important but shouldn’t be allowed to overshadow basic security protocols. “Taken together, there have been a lot of changes for businesses dealing with EU customers to consider,” he said. “Druva believes the best approach for the moment is to concentrate on applying data security best practices – so encryption, firewalls and anti-virus technologies and the like – but also make sure that there is a more proactive approach in place towards managing data as it is created regardless of location, rather than centrally after the fact.”

Many would say that the new agreement is simply representative of changing attitudes towards privacy. The Snowden revelations regarding government surveillance, which ultimately led to the disintegration of Safe Harbour, have meant that public and organisational awareness of data protection is at an all-time high. The EU-US Privacy Shield simply puts into writing what businesses have known for some time now – that data privacy can no longer be taken for granted.
### Hybrid Cloud = “The Great Enabler of Digital Business”, And More A recent report by IDG Research Services commissioned by EMC demonstrates the close connections between hybrid cloud computing and digital business. The report pegs hybrid cloud as the “great enabler of digital business”, but it’s more than a proven enabler. Hybrid cloud can accelerate progress and amplify the benefits of digital business initiatives. There’s a benefits “multiplier effect” when companies put more workloads in hybrid cloud and put the agility of hybrid cloud to work.  [easy-tweet tweet="#Hybridcloud can accelerate progress and amplify the benefits of #digital #business initiatives"] It probably goes without saying that digital business is high on corporate agendas (90% of respondents to the global IDG survey count it among their top business priorities) and, therefore, high on the agendas of CIOs and IT organisations. Consumer technologies and the Internet have altered forever the expectations of consumers and commercial customers. People and organisations expect to do business quickly, easily, anytime and anywhere. To remain competitive, companies must further digitise their processes and interactions – and deliver engaging customer experiences. Hybrid cloud is today’s best infrastructure for digital business How does hybrid cloud play a role? It’s today’s best infrastructure for digital business. It has the necessary underlying characteristics for keeping pace in a fast-changing digital economy: speed, flexibility, reliability, scalability, cost effectiveness. And it enables digital business in specific ways. Hybrid cloud supports rapid testing, deploying, and scaling of new applications by enabling businesses to leverage public and private cloud applications and services together seamlessly. New mobile apps can integrate with core systems so a company can bring all of its capabilities into play. But the hybrid cloud value proposition goes further. The IDG survey showed that hybrid cloud adoption reduces IT operating expenses by a remarkable 24% on average. That’s a big number, both as direct savings and as opportunity to reinvest. But the real kicker is that the surveyed companies put around 40% of their savings into new business and technology initiatives. Then there’s the multiplier effect. The more aggressive adopters – those with a significant number of workloads in hybrid cloud – enjoy the biggest gains. They are three times more likely to be achieving their digital business goals and achieving infrastructure readiness than non-adopters.  Their IT operating expense reduction increases to a whopping 29%.  This translates into an even greater opportunity to reinvest savings. Hybrid cloud is, indeed, the great enabler. Consumer technologies and the Internet have altered forever the expectations of consumers and commercial customers “Becoming digital is a priority for nearly every business on the planet.  But how to get there is not as obvious. This study makes it perfectly clear that hybrid cloud – and the savings and agility it brings with it – is a key enabler to becoming a digital business,” says Jeremy Burton, President, Products and Marketing, EMC Corporation. 
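To see what those savings figures mean in cash terms, here is a back-of-the-envelope calculation. Only the percentages (the 24% and 29% operating expense reductions and the roughly 40% reinvestment rate) come from the survey quoted above; the £10m baseline IT budget is a hypothetical figure chosen purely for illustration.

```python
# Back-of-the-envelope view of the IDG/EMC figures quoted above.
# The baseline IT operating budget is hypothetical; only the percentages
# (24%, 29% savings and ~40% reinvestment) come from the survey.
baseline_opex = 10_000_000  # e.g. a £10m annual IT operating budget

for label, saving_rate in [("typical hybrid cloud adopter", 0.24),
                           ("aggressive adopter", 0.29)]:
    saving = baseline_opex * saving_rate
    reinvested = saving * 0.40  # share put into new business and technology initiatives
    print(f"{label}: saves £{saving:,.0f}, of which £{reinvested:,.0f} is reinvested")
```

On those assumptions, the more aggressive adopter frees up roughly half a million pounds more each year, which is the "multiplier effect" the report describes.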
[easy-tweet tweet="Becoming #digital is a priority for nearly every business on the planet says Jeremy Burton of @EMCcorp"] [quote_box_center]Top statistics from the report include: 92% indicate their organisation’s competitive strategy calls for digital business initiatives 63% indicated they are already on their way to achieving digital transformation goals, whilst 90% call digital business a “top priority” for the next one to three years 88% believe that hybrid cloud capabilities are important or critical to enabling digital business transformation 87% said improving the customer experience was the key driver for digital technology investments 83% currently use or plan to use a hybrid cloud environment, and 73% agree that a hybrid cloud model creates a path to digital business [/quote_box_center] Read the full report here. ### Investing in the right cloud With Gartner predicting that 20 per cent of business content will be authored by smart machines within the next two years, it might come as a surprise that most businesses are only just beginning to realise the degree to which the right IT strategy and execution will make or break their business model in the coming years. [easy-tweet tweet="#Innovation and #success in 2016 will require companies to improve their control over #cloud strategies" user="comparethecloud"] As new and emerging technologies come to light, the infrastructure that supports these devices and applications is becoming more and more important. For many, continuing to make innovation and success a reality in 2016 will mean improving their control over cloud strategies, as deeper data insights, consolidated cloud systems and limitless test environments provide businesses with the tools to grow faster and accelerate revenues. As a result, today’s businesses are demanding an IT infrastructure that is flexible enough to cater for operational growth and reliable enough to support critical applications. many organisations are reluctant to invest in designing, buying, building, deploying and managing the complex and costly infrastructures that are needed to support new technologies Yet despite demand, many organisations are reluctant to invest in designing, buying, building, deploying and managing the complex and costly infrastructures that are needed to support new technologies. Outsourcing these demands to the Cloud has become mainstream, and has enabled organisations to level the playing field and punch above their weight. A prime example of how successful this approach can be is the Government’s G-Cloud initiative, a framework that promotes public sector cloud adoption. Since launching in 2012, a total of £856 million sales have taken place via the G-Cloud, allowing public sector bodies to transition towards a ‘cloud-first’ policy and shift the complexities and expenses of hardware to cloud suppliers. Variations on a cloud As organisations move towards the cloud, IDC predicts that Infrastructure as a Service (IaaS) will be the fastest growing area of this market between 2014 and 2018. But before enterprises adopt IaaS, they need to consider how they plan to consume and use the many variations of cloud that exist. [easy-tweet tweet="IDC predicts that #IaaS will be the fastest growing area between 2014 and 2018." user="comparethecloud" hashtags="cloud"] The private cloud is one example of a model that has evolved. 
Many enterprises have built their own infrastructure and automation for development and testing or running internal web services, often using new and efficient methods to facilitate it. Yet, when compared to the public cloud, it doesn’t always provide the same degree of flexibility and agility, and therefore can be restrictive for businesses. In comparison, the public cloud has always been available as a service that can be consumed by everyone. It appeals to, and is used by, the mass market for backing up files, streaming music and films and email services. Surprisingly, despite advocating a private solution, a lot of global organisations still adopt the public cloud in one way or another, notwithstanding concerns around security. This solution may be suitable for parts of IT, but is certainly not secure enough to provide storage for the most important and sensitive data moving across the network. In contrast, a hybrid solution can provide the flexibility and agility of a public cloud, whilst still providing the security of a private cloud. Increasingly, it is the hybrid cloud model that will provide the foundations of the infrastructure for next-generation technology. Instead of making an upfront capital investment in hardware or relying solely on existing architectures, enterprises can simply purchase more computing power or server space from their cloud provider as and when it’s needed. enterprises can simply purchase more computing power or server space from their cloud provider IT orchestration As businesses drive their digital agenda towards a hybrid cloud model, they will need to be equipped with the IT infrastructure to deliver the best services available. Combining existing infrastructure with additional private cloud services means that resources can be adapted to business requirements. Long-term, this allows IT to evolve from backroom engineers into IT strategists who act as a vital resource for meeting business objectives and driving revenue. While 2015 was the year that the cloud evolved, 2016 will be the year that IT takes control. Having the right technology will be paramount - facilitating sales and providing a competitive differentiation by tuning, customising and integrating applications with ever-increasing precision. As the industry turns the corner to become more streamlined, safe and agile, the organisations that can fine-tune this capability and capitalise on this potential will surge ahead to success in the years to come. ### Revolutionising the customer experience with the cloud The advent of cloud computing has not only transformed the technology sector, but also the way that organisations run their businesses. According to the latest research from the Cloud Industry Forum, roughly four in five UK-based organisations use at least one cloud service today, and a significant proportion of those organisations use many more than that. To put this into context, that figure stood at just under two in five in 2010, demonstrating the astronomical rise of the delivery model. [easy-tweet tweet="#Cloud has transformed more than the #tech sector, it has changed the way people do business"] Several years ago the primary objective for migrating to a cloud-based model was to save money and avoid capital expenditure. But while cost savings do often materialise, there are no guarantees and, depending on the solution in question, a business could feasibly end up spending the same on a cloud-based service as they would have done for an on-premises version.
Costs are, of course, important, and there’s no denying the appeal of reducing the cost of running and maintaining IT, but we have started to see a shift in the conversation about cloud from how it can save money to how it can generate it. Ultimately, cloud is not about the technology per se, but what it enables you to do and how it can transform the IT department from a cost centre to something that can enable genuine business change. Ultimately, cloud is not about the technology per se, but what it enables you to do The ability to transform business and customer interactions is one of the greatest strengths of cloud computing, and, indeed, improving customer service stands as one of the key objectives that is driving the continued investment in cloud services. You want to be able to offer your end users, be they internal or external, more support and transform their experience – which, in turn, brings with it significant competitive advantage. The potential of cloud to transform the end-to-end customer experience is perhaps nowhere more apparent than in the contact centre, where cloud computing has enabled businesses to fully embrace omni-channel engagement. This is critical for businesses looking to improve their mobile self-service offering and divert costly, common or non-complex enquiries away from the contact centre. Customers today want to be able to navigate between different channels, from social media and mobile apps to the traditional voice call – all without losing context. It’s convenient and seamless, and it’s critical that businesses can respond with the same agility and provide a frictionless service regardless of the contact method used – which is precisely what cloud-based solutions help to achieve. the security aspect of cloud is still under scrutiny There are, however, a range of obstacles – be they real or perceived – that stand in the way of the wholesale transformation of businesses and their use of cloud, chief among them the belief that cloud is insecure. Although it isn’t seen as the absolute barrier to adoption it once was, security is still an objection that many businesses have about cloud computing, and is cited as the primary reason for keeping services in-house. Plainly, data security is not something to be treated lightly, but typically a good cloud provider will have security nailed and will have invested in sufficiently high levels of protection. With economies of scale on their side and a breadth of customers with differing risk requirements, Cloud Service Providers are in a position to spend more on security and, indeed, are able to offer far greater levels of security than most businesses can afford in their own right. It’s fundamental to the cloud business model; if cloud were inherently insecure, the industry would just evaporate. [easy-tweet tweet="If #cloud services were inherently insecure, the industry would have just evaporated, but it hasn't. " user="comparethecloud"] It’s critical that businesses can find mechanisms to overcome these perceived barriers as the ‘opportunity costs’ of not moving to cloud are too great to ignore. Rather than asking “What will I save?” or “What will cloud cost me?” businesses should be asking “What will the cost to my business be if I don’t move to cloud?” Those that have already moved to cloud are reaping the benefits and securing competitive advantage. Those that haven’t will already be finding themselves at a distinct disadvantage.
### January 2016 Global #CloudInfluence Organisation Rankings January’s leaders in Compare the Cloud’s Global #CloudInfluence Rankings are the market share leaders Microsoft and Amazon. Both announced impressive cloud revenue growth this month, with their cloud computing operations – AWS and Azure – playing an increasingly vital role for each parent company. [easy-tweet tweet="Check out the top global #CloudInfluence organisation charts for January 2016" user="billmew"] AWS revenues grew about 69% in the last year, from $1.42 billion to $2.41 billion. It almost tripled its profits from $240 million to $687 million. Microsoft lumps Azure into a larger segment called "Intelligent Cloud," which bundles a number of different enterprise products and services, so it’s hard to make a direct comparison. It did state that revenue growth for Microsoft Azure was 127%, with overall revenue from server products plus Azure of about $5.1 billion – this being the vast majority of the "Intelligent Cloud" segment's overall $6.34 billion (the rest comes from consulting and support). While no direct comparison is possible, Azure is clearly growing sales faster than AWS, meaning that the number two in the market is gaining on the market leader. In Compare the Cloud’s Global #CloudInfluence Rankings, Microsoft has had the highest share of attention for a few months now and is simply maintaining its overall lead. Other players have a long way to go to catch them both, but while Google is taking time to revamp its cloud offering under new leadership, its parent company Alphabet posted strong results with profits of $4.9bn. The results announcement sent its share price up as much as 9% in after-hours trading, meaning that Alphabet was then worth around $568bn, surpassing Apple, which had a value of $535bn. This made Alphabet the most valuable company in the world. Microsoft, Amazon and Google are not only the world’s three biggest public cloud operators, but they are now also among the largest server and storage manufacturers in the world. Much of the actual manufacturing may be subcontracted to OEMs or ODMs, but they control the specifications and act as both the manufacturer and customer. Microsoft, Amazon and Google are not only the world’s three biggest public cloud operators, but they are now also among the largest server and storage manufacturers in the world The move to migrate workloads to the cloud – whether public, private or hybrid – is accelerating. And while volume economics would indicate that workloads and data will eventually migrate to the largest and cheapest platforms possible – favouring public cloud – this is mitigated somewhat by security, provenance and sovereignty concerns, which favour private or hybrid cloud. A key announcement by Microsoft this month was its preview of the Azure Stack, a set of software that will allow enterprises and service providers to create a private cloud based on the Azure model that Microsoft operates at a hyper-scale level. Not only will private clouds using this stack be able to operate like Azure, but application developers will be able to use a 'write once, deploy to Azure or Azure Stack' approach. This will be achieved by using APIs that are identical to Microsoft Azure, allowing applications based on open source or .NET technology to run on either on-premise or public cloud.
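As a rough sketch of what that “write once, deploy to Azure or Azure Stack” idea could look like in practice, the Python snippet below issues the same Resource Manager-style request against two different management endpoints. The endpoint URLs, the api-version and the token handling are illustrative assumptions for the example, not details taken from Microsoft’s announcement.

```python
# A minimal sketch of "write once, deploy to Azure or Azure Stack": the same
# ARM-style REST call is simply pointed at a different management endpoint.
# The Azure Stack URL, api-version and bearer token here are placeholders /
# assumptions for illustration.
import requests

AZURE_PUBLIC = "https://management.azure.com"                  # public Azure
AZURE_STACK = "https://management.local.azurestack.external"   # example on-premise endpoint

def create_resource_group(endpoint, subscription_id, name, location, token):
    """Issue an identical Resource Manager request against either cloud."""
    url = (f"{endpoint}/subscriptions/{subscription_id}"
           f"/resourcegroups/{name}?api-version=2016-09-01")
    return requests.put(
        url,
        json={"location": location},
        headers={"Authorization": f"Bearer {token}"},
    )

# Same code path, different target (SUB_ID and TOKEN would come from your tenant):
# create_resource_group(AZURE_PUBLIC, SUB_ID, "demo-rg", "westeurope", TOKEN)
# create_resource_group(AZURE_STACK, SUB_ID, "demo-rg", "local", TOKEN)
```

The point of the sketch is that nothing in the deployment logic changes between the two targets; only the endpoint (and the credentials behind it) differs, which is what makes the hybrid, write-once model attractive.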
In addition, management and automation tools will give IT professionals oversight into the transformation of on-premise data centre resources into Azure IaaS/PaaS services, as they define and manage their hybrid environments in order to handle regulation, data sovereignty, customisation and latency issues. [easy-tweet tweet="AWS and Google aren't showing much in the way of a #hybridcloud strategy" user="billmew"] With neither AWS nor Google showing much in the way of a hybrid cloud strategy, this represents a massive advantage for Microsoft and Azure, and a real threat to corporate cloud rivals such as IBM and Oracle (which were 3rd and 4th in this month’s rankings). In addition, players like HPE, VMware and Red Hat all see the market heading in the direction of hybrid and Azure and have allied themselves with the Microsoft public cloud – VMware was 8th in the current rankings, while HPE and Red Hat fell outside the top 50. [table id=60 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point in time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Why you should choose managed services over unmanaged cloud When it comes to adopting the latest forms of business technology it can be easy to get carried away. The need to overtake or remain ahead of your competitors can drive some businesses to adopt innovations when they are ill-suited to their business needs or without a clear strategy for their implementation. [easy-tweet tweet="Should you choose managed services over unmanaged cloud? @zsahLTD debates on @Comparethecloud"] Although cloud computing is one innovation that has certainly justified the hype surrounding it, some organisations have struggled to get the most out of this relatively recent technology. Whether it’s due to a lack of planning, time or resources, some IT leaders associate the cloud with integration and reliability problems as much as they do productivity and scalability benefits. For these businesses, managed services provide a viable alternative, whereby a third-party vendor handles the implementation, management and maintenance of your IT estate. Whether or not your company should choose managed services over unmanaged cloud will depend on your precise business requirements, but there are certainly some general pros and cons that IT decision makers should consider. Although an unmanaged cloud service can still deliver a number of benefits to businesses, this is often dependent on the user having the requisite technical skills, time and IT budget. When organisations opt for unmanaged cloud without possessing these requirements, they can often be left frustrated and frantically searching for more IT budget to solve their issues. UNMANAGED CLOUD SERVICES ARE OFTEN DEPENDENT ON THE USER HAVING THE REQUISITE TECHNICAL SKILLS, TIME AND IT BUDGET Having said that, there are some cases where businesses will benefit from an unmanaged approach to cloud computing. Unmanaged hosting can often cost less on a monthly basis than managed hosting – although of course your own in-house management costs may negate these savings. It also gives businesses the freedom to run their cloud platform as they see fit.
Self-management of backups, security and integration can give businesses a greater sense of control over their IT infrastructure, but it is also a serious responsibility. As a result, many organisations are embracing managed services in order to benefit from the many advantages of the cloud without putting unnecessary strain on their internal resources. Firstly, managed services often come with the highest levels of technical support. Many providers have expert teams monitoring your cloud services for any potential faults or vulnerabilities and are able to offer 24/7 availability. What’s more, the best managed service providers will supply dedicated account managers and engineers to work on your IT resources to ensure a consistent level of service. [easy-tweet tweet="Managed services for #Cloud often come with the highest levels of technical support" user="zsahLTD" hashtags="MSP"] Managed cloud also has advantages when it comes to security. Managed vendors depend on robust security measures in order to keep their reputation intact and their customers happy. As such, they are willing to invest in the most up-to-date defences against cyberthreats. Similarly, managed service providers will supply your business with a clear data backup plan, should disruption occur. A data breach is a serious issue and a managed cloud can give you peace of mind that a professional team is securing your infrastructure. At Zsah, we understand that most businesses want to be able to focus on their core processes without having to jump over IT-related hurdles. With our managed services, you no longer have to devote time and manpower to managing your own infrastructure or applications. In addition, although we offer our own cloud portfolio, we are happy to manage cloud platforms built by other suppliers and will even oversee on-premise equipment. Before settling on their cloud package, businesses must ask themselves whether they have the budget, time and technical expertise to manage their IT resources internally. If you have any doubts, then managed hosting can provide the helping hand that you need to get the most from your cloud services. BUSINESSES MUST ASK THEMSELVES WHETHER THEY HAVE THE BUDGET, TIME AND TECHNICAL EXPERTISE TO MANAGE THEIR IT RESOURCES INTERNALLY ### Sky’s the Limit: Let the Cloud be your king To cloud or not to cloud? In its early days, this was a question businesses could ask themselves at leisure. However, with security concerns, increasing profitability, improving business growth and reducing operational costs at the top of the agenda for most organisations, now is not the time to procrastinate. Now is the time to have those discussions about which areas of the IT function should be divested skywards. [easy-tweet tweet="Now is the time to have those discussions about which IT functions should be divested skywards" user="insight_UK"] Contrary to popular opinion, an increasing number of enterprises today are more confident with the idea of using cloud services; in fact, 73% are in the process of developing a cloud strategy. With the growing popularity of services such as Gmail, Hotmail, DropBox and Facebook, it is increasingly rare to encounter an individual who doesn’t take advantage of the Cloud on a daily basis. And yet there still remains apprehension from businesses contemplating a migration to the Cloud, despite its speed, agility and cost-effectiveness. So what is causing this hesitation? Firstly, it’s a question of size.
Larger IT outfits often find it less costly to host certain applications in-house on local hardware as opposed to bolting on extra colocation resource or hosting such apps in the Cloud. In addition, where there are many local, bandwidth-hungry users, in-house hosting may be more prudent. Businesses also have to think carefully about which data sets they can afford to let ‘out of their sight’. Where precisely will the provider be housing your data? Who maintains the hardware? The network? it is increasingly rare to encounter an individual who doesn’t take advantage of the Cloud Security also remains a big concern for most businesses, especially given numerous high-profile breaches hitting the headlines in 2015. However, research suggests that Cloud-based offerings can sometimes be more secure than traditional Datacentres, so the tide may be turning. This is thanks to many Cloud-based gateways rapidly deploying anti-malware signatures and blocked URL lists as soon as they’re available. This is certainly advantageous with BYOD adoption continuing to sky-rocket and the workforce becoming increasingly distributed. This renewed confidence in reinforced security has boosted the popularity of cloud-based services, especially among SMBs, and with good reason. The cost savings, flexibility, decreased time to market and plenty more besides make perfect sense for those with limited financial and logistical resources. Recent research notes that 64% of SMBs in Western Europe are already using three cloud-based applications on average; among the most popular were hosting (55.4%), backup (41.5%), emails (41.4%), data storage (40.9%) and booking systems (22.9%). [easy-tweet tweet="64% of #SMBs in Western Europe are already using 3 #cloud-based applications on average"] The Cloud is maturing, evolving, and diversifying at a frenetic rate, with the range of applications and hosted offerings expanding exponentially. Several recent studies have also indicated that Cloud adoption is now having a positive influence on customer support, with 70% of firms now able to reinvest cost savings into their operations as a direct consequence of moving to the Cloud. This has led to marked incremental improvements in customer experience, as well as driving potential growth by delivering the scope and scalability of resources, and encouraging a focus on core business. The global Cloud market is forecast to be worth over £80 billion The global Cloud market is forecast to be worth over £80 billion by next year and the benefits of this technology appear limitless. Businesses are running out of reasons not to invest and send at least some parts of their infrastructure skywards. The question, it seems, is no longer whether to cloud or not to cloud, but what to cloud, why, where and how. ### The less obvious industries embracing the Internet of Things The Internet of Things has been much hyped, but the number of mainstream applications remains limited. There has been plenty of industry buzz surrounding connected cars and IoT thermostats, but away from automobiles and smart homes, many other industries are quietly getting to grips with the technology. Although some businesses remain tentative about the Internet of Things, rightly exercising caution regarding security and privacy worries, more agile startups are already preparing for a more connected future.
[easy-tweet tweet="CTC is talking #Connected farms and lesser known #IoT uses with @BSquared90 " user="comparethecloud"] One of the less obvious industries beginning to embrace the Internet of Things is agriculture. With the convenience provided by the modern supermarket it is easy to forget the complex processes needed to transfer masses of crops from field to fork. In less developed countries, the agricultural industry is even more perilous and technological innovation is likely to prove necessary in order to meet the  food demands of a global population expected to reach 9.6 billion by 2050. The “Towards Smart Farming” report, issued last year by Beecham Research, found that the Internet of Things, combined with big data analytics, would enable farmers to gain more detailed insights about their crops, optimising efficiency and maximising productivity. the move towards smart farming is being encouraged through various projects and programmes “In Europe, the move towards smart farming is being encouraged through various projects and programmes funded by public and private money. These include EU initiatives and projects at a national level,” says Saverio Romeo, principal analyst at Beecham Research and joint author of the report. “While the M2M agricultural sector is still emerging, M2M and IoT technologies will be key enablers for transforming the agricultural sector and creating the smart farming vision.” Already, a number of agricultural firms are trialling IoT technology. Hahn Estate Winery is utilising IoT sensors, alongside overhead drones, to combat ongoing drought in the US state of California. By collecting data on pathogens, moisture levels, humidity and a number of other metrics, agricultural firms like Hahn Estate can boost production levels. Other examples of IoT crop monitoring are also being employed in cocoa farms in Indonesia, connected farms located entirely within shipping containers and many other places all over the world. There are so many factors that contribute to the success or failure of a particular crop and by using the Internet of Things, farmers are gaining better visibility and control. [easy-tweet tweet="#IoT crop monitoring is giving farmers more insight into the successes and failures of their crops"] The Internet of Things is also making a major impact on the healthcare industry. Connected devices are not only expected to give medical practitioners greater insight into the wellbeing of their patients, they are also predicted to put greater emphasis on self-care. For example, the Philips Medication Dispensing Service automates the pill taking process so that patients can manage their medication when at home. The device is connected to the user’s phone line and issues an alert if it’s time to take their medication or if it needs refilling. Similarly, UroSense automatically measures body temperature and urine output for catheterised patients, allowing doctors to diagnose potentially life threatening conditions faster than previously possible. In addition, the growth of wearable technology is also helping to provide healthcare services with vital information. Monitoring vital signs can give loved ones peace of mind that elderly family members are healthy or simply keep doctors abreast of your wellbeing. Whether from smart wristbands or connected appliances, The Internet of Things promises to harvest and store vast quantities of information about our bodies. 
When this data is coupled with analytics programs, doctors may be able to use it to reduce the time it takes to diagnose and treat patients, ultimately saving lives. the future benefits of IoT devices could be broader than we previously envisioned Although the Internet of Things will certainly bring advantages to the automotive, finance, marketing and home appliance industries, other business sectors also stand to benefit. In fact, both agriculture and healthcare are already demonstrating that the future benefits of IoT devices could be broader than we previously envisioned. ### Will the cloud make IT managers obsolete? For IT managers, cloud computing poses something of a challenge. For those who’ve spent the majority of the last few years doing engineering-style work rather than consulting, the natural question is whether the cloud will put your job at risk. Will the big changes to your carefully-constructed IT stack make you irrelevant? [easy-tweet tweet="Is #IT being lost to the cloud? Why are we hearing that IT’s job is over?" user="comparethecloud"] I don’t believe in all or nothing. Somewhere between the two visions of everything and nothing lies a possibility that’s far more grounded in reality – some change. What’s not going away any time soon is the need for on-premise infrastructure across a variety of environments. But for those environments embracing change, I often hear too much conventional wisdom that IT is being ‘lost to the cloud’, that IT’s job is over. This defeatist attitude shows a depressing lack of imagination from a sector that has been given a great chance to grasp exciting new opportunities. if all you’re doing is helping to keep the lights on, then you’re not adding much value to the business And, let’s face it – even without all these new technologies, if all you’re doing is helping to keep the lights on, then you’re not adding much value to the business. SaaS cloud suites are a great example. With the big players all offering a range of software that would otherwise have had to be deployed on premise, previous barriers, including the expense, have been removed. The deployment aspect is now taken from your hands, the cost spread. So why is this an opportunity? It’s obvious – it frees you up to dive straight into making sure new software is having a positive impact on the business. Instead of having time to kill, you’ll have your work cut out. No longer is your job title an IT manager – you’ve become an IT project manager, a consultant. On top of what you’ve already got going on, you’ll now need to evaluate, pilot and test. Training, creating staff reference materials and promoting each new piece of software being rolled out will be within your remit. It’s not completely unexpected to need some help. This is where traditional Value Added Resellers (VARs) can come in handy, helping to deliver some of that user interaction that often forms part of a SaaS rollout. This lets IT management teams get on with coordinating the project and ensuring the needs of the business are clearly defined and met. There’s also a complementary role, with VARs able to help boost the resources available for IT during these critical stages. Day-to-day operations require a finite level of resource The value of this can’t be overstated. Day-to-day operations require a finite level of resource. However, when projects demand additional resource, using the services available from an IT consultant can help fill that gap.
This lets you ramp up the engineering, project management and consultancy services available to you quickly, and in an even more granular fashion than a lone IT contractor could provide. For those used to this way of working, it may sound like nothing new, but to a tribe of IT managers out there used to doing more 'engineering' in their role, it can be a daunting change. It’s a change to embrace, though, and one that can benefit everyone. The IT manager's job evolves and actually grows as opposed to shrinking. New versions get released; new features get added, and so on. There’s always a reason to keep going round the cycle, helping the business continue to extract value from the product. [easy-tweet tweet="There’s no point investing in #technology if it's not used to its fullest potential" user="comparethecloud"] After all, they're paying for it via a subscription, month after month. There’s no point investing in the technology if they’re not going to use it to its fullest potential. And helping a business to do just that is without doubt a full-time job in itself! Beyond the original period of deployment, you might still need the services offered by external organisations like VARs. Cloud SaaS providers love change, and you may well find yourself having got used to the software’s user interface one day only to wake up the next to find it looks completely different. It won’t be long before the phone starts ringing with users looking for guidance It won’t be long before the phone starts ringing with users looking for guidance, so keeping a VAR on hand beyond the initial rollout isn’t a bad idea. Think of it less as a support contract and more as a consultancy contract. Ramp up resources, bring a consultant in to manage acceptance of the new version and keep that productivity growing. But it’s not for everyone. Some will choose other options, like sticking with a more technical route and embracing development or DevOps. It’s easy enough to brush off the idea of consultancy, but you’d be losing the ability to more closely align IT with business goals – and going it alone is far riskier than taking the plunge. After all, if you’re not going to make yourself obsolete on your own terms, someone else will do it for you. ### Extracting the value from big data using algorithms Data has become the currency of the always-on world. But on its own, it’s not enough. Businesses need to make sense of the wealth of data they have. This is where the algorithm economy comes in, and using algorithms to make sense of data will be a business priority for decision-makers in 2016. [easy-tweet tweet="#Data has become the currency of the always-on world" user="comparethecloud"] Big data has grown in interest and use in recent years, but companies are beginning to see the necessity of extracting value from it. Intel recently predicted more than 50 billion devices and ‘things’ will be connected worldwide by 2020. This represents a massive influx of data access points, bringing with it a wealth of business opportunities and challenges. For the most part, focus has been on getting systems in place that take care of data storage and management. In an always-connected world, with employees and customers connected to the back-end systems using a variety of devices at any time and from any location, it makes sense to have the building blocks in place. However, this is no longer enough. A shift is happening that is seeing organisations focus less on big data...
A shift is happening that is seeing organisations focus less on big data, and more on streamlining the data they already have. According to Gartner, worldwide IT spending is forecast to surpass $3.6 trillion in 2016. The IT industry is being driven by digital business, and by an environment shaped by a connected world. Businesses cannot operate without IT, and IT cannot operate without data. This cyclical nature of the always-on environment means that data links together all aspects of a business, but whether it is managed through virtualisation, modern storage, the cloud or other enterprise computing options, data is still very complex even when it is being streamlined. [easy-tweet tweet="Businesses cannot operate without IT, and IT cannot operate without #data" user="comparethecloud" hashtags="bigdata"] The algorithm economy translates business challenges into meaningful actions using the data that is available. Algorithms can take real-time analytics to the next level with an analysis unique to a business and its requirements and goals. Gartner’s SVP and global head of research has predicted that algorithms are defining the future of business, and that products and services will be defined by the sophistication of their algorithms, giving organisations a competitive edge. This adds further complexity as additional security needs to be implemented to safeguard the organisation, its data, and the algorithms that will drive it. Regulatory compliance is helping to mitigate this to a certain extent as it provides companies with a blueprint of the boxes they need to tick for data surety. decision-makers need to find a balance between managing data and analysing it As the Internet of Things (IoT) continues to gather momentum, the pressure will be on decision-makers to find a balance between managing data, analysing it, and shifting strategy to meet changing demands of the market. Algorithms will take on greater importance to help regulate this from an organisational perspective. Companies can no longer turn a blind eye to the always-on business and the need to evolve their approaches and systems with it. ### Cyber-security: the political agenda – insights from the 8th International Cyber-security Forum in Lille What should we expect – or indeed what is it reasonable to expect – from our political masters in terms of strategy, policy and regulation for cyber-security? [easy-tweet tweet="Read @BillMew's recap of the 8th International #CyberSecurity forum in Lille #FIC2016"] Data is the fuel for digital transformation: 144 billion emails are sent worldwide every day, 30 gigabytes are published every second, 800,000 new websites appear daily, and the amount of available information doubles every two years… all this with only 42% of the world’s population digitally connected. VIEW THE ARCHIVED LIVE COVERAGE OF #FIC2016 HERE Europe exports and consumes data but the nature of data and the interconnectedness of markets make it difficult to apply ownership. Should we consider data as a "common good" that is immune to ownership or appropriation? In this social media age, does the idea of a private life still make sense? With confusion between data security and computer security – i.e. between the vehicle and its contents – how should virtualised technologies, cloud computing and new business models be managed? How can data be considered in context? Which technologies and solutions are needed to ensure data security from start to finish?
Which governance solutions have been or should be implemented by organisations? How can we create a trusted environment conducive to the development of new data uses for citizens, businesses, local authorities and the state? Cybersecurity, a shared responsibility Recently we attended the 8th International Cyber-security Forum #FIC2016. Focused on data security and privacy, the event provided a forum for an open discussion among service providers and trusted security solution professionals, as well as end users from the public and academic sectors. It also featured keynote speeches from some of the most influential political figures in this sphere. #FIC2016 featured keynote speeches from some of the most influential political figures in Information Security European Commissioner Günther H. Oettinger Speech Overview: Cyber threats do not respect borders. We therefore must act together, not in isolation. And we need to remember that cybersecurity, data protection and online privacy are very closely linked to each other. [easy-tweet tweet="The threat landscape has changed and intensified since the adoption of the 2013 European #Cybersecurity Strategy"] The threat landscape has changed and threats have intensified since the adoption of the 2013 European Cybersecurity Strategy. However, our priorities and fields of action remain as valid as three years ago. Let me tell you where we stand on what I consider the cybersecurity policy priorities for the European Commission: enhancing cybersecurity capabilities and cooperation across the EU; making the EU a strong player on cybersecurity; mainstreaming cybersecurity in EU policy-making; and last but not least, ensuring a high level of trust and privacy protection in the digital economy. A full copy of his #FIC2016 speech is available in English here. Bernard Cazeneuve, French Interior Minister Speech Overview: We currently face security challenges that are considerable and that require a joint response, both European and international. In 2015, France itself was the target of barbaric terrorism. Never before had we experienced attacks of this nature or magnitude on our own soil. These terrorists use cyberspace as a powerful vehicle for propaganda and recruitment, just as fraudsters and hackers use it for their own ends. We need to collaborate to be effective in response across all areas: Legal means: with the adoption of new antiterrorism laws in November 2014 and laws on Intelligence in July 2015. Collaboration and best practices: working across government departments, between governments and between the public and private sectors – including the major US internet and social media companies. Planning and resources: employing the Plan de Lutte Anti-Terroriste (PLAT) – the counterterrorism plan – and the technical and manpower resources necessary to make it work. A full copy of his #FIC2016 speech is available in French here. John Hayes, Security Minister at the UK Home Office Speech Overview: We need to keep businesses, citizens and public services safe. After all, trust and confidence in the security and privacy of data is crucial for consumers, businesses and investors. Our two countries already recognise the need to work together to face this threat through strong government-to-government dialogue in support of our two key approaches to cyber security: working in partnership – with business, with public services and with citizens – and closing the skills gap in cyber security and cyber crime.
[easy-tweet tweet="In 2016, we will develop the next National Cyber Security Strategy" hashtags="cybersecurity"] In 2016, we will develop the next National Cyber Security Strategy.  We are going to nearly double our investment - £1.9 Billion over the next five years – working with Industry, supporting the best cyber start-ups through the creation of two cyber innovation centres and we will launch a £165 million Defence and Cyber Innovation Fund, to support innovative procurement across both defence and cyber security. We will also increase the capabilities of the National Cyber Crime Unit to develop stronger defences for government systems and create a new National Cyber Centre which will work with industry, academia and, most importantly our international partners to protect against cyber-attacks. A full copy of his #FIC2016 speech is available in English on Compare the Cloud’s main website here. [quote_box_center] In addition to reporting on these keynote speeches we also interviewed key figures in the French fight against Cybercrime. Further interviews will be published shortly based on our discussions with the following: Patrick Hebrard, from DCNS, a cybersecurity expert who discusses the challenges of protecting warships and weapons systems France's top #cybercrime cop - Francois-Xavier Masson Chef de OCLCTIC - the cybercrime agency in the Ministry of the Interior Guillaume Poupard, director general of ANSSI - the French security standards body that is charged with protecting critical infrastructure and setting out security strategy, standards and accreditation for France. [/quote_box_center] ### How analytics can streamline your business in 2016 As a nation we monitor everything. We are constantly recording everything we do from tracking our spending to even our eating habits. The way we do this has rapidly changed, from writing down our expenses and meals in a notepad to now using online banking and adding up our food intake in one of the abundant apps on offer. [easy-tweet tweet="The way businesses analyse their key information has been revolutionised" user="comparethecloud" hashtags="bigdata"] Just like the way we track our personal lives has changed, the way businesses analyse their key information has been revolutionised too. Those businesses that adapt and make the most of all the data at their disposal will inevitably reap the benefits. The internal business benefits of utilising analytics Analytics were one of the fastest growing technology trends last year and will continue to grow in 2016 as businesses recognise the value on their bottom lines. to be as efficient and effective as possible a business needs to review its successes and pitfalls We all know to be as efficient and effective as possible a business needs to review its successes and pitfalls. Analysing the behaviour of individuals, teams and the company as a whole is critical to ensuring resources are appropriately distributed, praise is given and help is provided where necessary. To make the most informed decision about their businesses every team manager needs access to a dashboard that displays all the key information in real time. CIOs and CEOs are best placed to track the trends in data that can be used to make the wider business more efficient and productive. 8x8’s own analytics suite helps businesses of all sizes make faster decisions, well informed decisions based on critical information about a company’s performance. 
How analytics can improve customer relations Analytics is not only about understanding your own business, but also about how your customers perceive your service. Take call handling, an integral part of customer service, as an example. Missed calls, or calls diverted to the wrong extension, can cause confusion, anger and in some cases loss of business. When an integrated analytics suite is in place, team members can see these missed calls and return them, saving possible financial loss and helping to resolve queries quickly once answered. But it’s not just about how quickly calls are dealt with; team capacity needs to be taken into account so callers are not left waiting to speak to an operative. Analytics allows businesses to pinpoint certain times when call volumes are likely to surge. Once these trends have been identified, it is much easier to accurately plan expected capacity needs and ultimately keep customers happy. The big picture [easy-tweet tweet="#Analytics are key for companies to consolidate business performance and improve productivity" hashtags="bigdata"] Analytics are key for companies to consolidate business performance and improve productivity. Whether you’re a CIO or a team leader, using analytics across your business will help you generate powerful insights so that you can make fact-based decisions that will improve your business costs and efficiencies. Our daily world is converging into a sea of bright colours, graphs and charts – everywhere from our financial statements, phone bills, data usage, water consumption, utilities and more. Everything today is summed up in a dashboard. If we’re used to using these charts in our everyday lives then it makes sense to transfer this insight to the business world too. there’s nothing better than a highly informative dashboard that makes us look and act more intelligent After all, there’s nothing better than a highly informative dashboard that streamlines data, points us in the right direction, and makes us look and act more intelligent so we can add real value. ### You Will Experience A Major Incident in 2016 According to Dimensional Research, nearly 90 percent of companies with 5,000-10,000 employees experience a major incident at least monthly, while a quarter of companies with more than 10,000 employees experience one at least weekly. [easy-tweet tweet="90% of companies with 5-10,000 employees experience a major incident every month" user="comparethecloud" hashtags="cloud, IT"] This is – to say the least – a highly concerning challenge for CEOs, CIOs and IT managers to consider, which – no doubt – causes sleepless nights for some. Add the complexity of technological advancements, and change, across the business landscape into the mix, and watch your team’s stress levels rise. There are so many factors these individuals need to consider in order to keep businesses’ technology infrastructures running at both a strategic and tactical level. Aside from investing in cloud-based communication solutions that enable any business process or application to trigger two-way communication throughout the extended enterprise during these time-sensitive events, what other areas are a priority to stay on top of? Cloud security Only a few years back, many of our customers were sceptical about cloud migration due to security concerns. But, as cloud platforms continue to improve, many of these concerns are actually easing and dissipating.
the bad guys are getting smarter too The drawback, though, is that the bad guys are getting smarter too. Data breaches and malware attacks increase every year. Last year, some significant cyber-attack victims included TalkTalk, British Gas and, most recently, website development and hosting platform Moonfruit. Therefore, as is expected, IT departments will need to spend more on data encryption, malware detection, firewalls and other ways to improve security. And it won’t be just about installing systems. Educating employees will be more important than before too. [easy-tweet tweet="IT departments will need to spend more on #data #encryption and #malware detection" user="comparethecloud"] Data encryption Many companies have been swapping email communications for faster, better-encrypted applications. Even terrorists are following suit and using encrypted social communication apps to manage their communications. Some security experts, in fact, think terrorist organisations are even using PlayStation 4, which is particularly difficult to decrypt. Therefore, at some point there will likely be a greater public debate about the nexus between cloud security, public safety, personal privacy and information volume. Even terrorists are following suit and using encrypted social communication apps DevOps, DevOps, DevOps We all know and appreciate that efficiency is everything in business. Today, having developers just throw new software over the wall to operations doesn't make much sense – it's the same as marketing throwing collateral over the wall to sales. This isn’t feasible and doesn't work. Typically, organisations that have adopted agile development and a DevOps mentality have reaped the benefits, and they'll only do better as they get better at it. According to New Relic, companies that deploy DevOps practices code up to 30 times more frequently than their competition. This reduces the risk of change and provides faster access to new functionality. Furthermore, Puppet Labs’ 2013 State of DevOps survey says the vast majority of deployments are successful. [easy-tweet tweet="Companies that deploy #DevOps practices, #code up to 30 times more frequently than their competition"] Cloud migration Investing relatively small amounts in piecemeal cloud implementation will become a thing of the past. Due to the change occurring across the industry, there will be a greater movement of companies towards adopting and implementing cloud solutions, with budgets from inside and outside IT, and expenditure moving to above-board operational expenditure. What this ultimately means is that businesses will create real value, through solutions, storage, service providers and outsourcing. Hybrid cloud adoption Hybrid clouds combine the power and scale of public clouds with the security and technical control of private clouds. According to Gartner, nearly half of large enterprises will deploy a hybrid cloud environment before the end of 2017, and with better API compatibility being a strong feature of hybrid cloud approaches, companies can use private clouds as staging environments before deploying to a public cloud. And, if Gartner is correct, which I suspect it is, now is the time to explore hybrid clouds. With the plethora of technological challenges that CIOs are up against, having a solution in place to enable intelligent communication processes across the business to manage operations and deal with incidents is utterly critical.
it is an imperative to keep systems secure While it is an imperative to keep systems secure or move towards cloud computing, as the Dimensional Research data indicates, there is an extremely high possibility of a major incident occurring within your business. This makes it vital to ensure that, aside from staying technologically ahead of the game, your communication processes are up to scratch too. ### Reining in industrial automation The benefits of cloud computing in the manufacturing industry The domestic canary isn’t what most of us would consider a tool for industrial labour. But in 1988, hundreds of these birds were made redundant from their small, yet significant roles in Britain’s coal mining industry. Earlier mines lacked air ventilation, which meant these ill-fated birds were favoured as the gas detectors of choice. Today, advancements in connectivity and the introduction of cloud computing are making monitoring industrial environments a lot more sophisticated. [easy-tweet tweet="The #Cloud and the Canary. How are they related? " user="comparethecloud" hashtags="industry, manufacturing"] Head in the clouds Manufacturers today are faced with challenges unlike any they have previously experienced. Complex regulatory standards, the big data trend and ever-growing pressure to compete with emerging economies make it difficult for many organisations to achieve sustainable success. In recent years, we’ve heard a lot about cloud computing, but its role in the manufacturing industry is not always clear. Outside of the factory walls, technology is advancing at a rapid pace and, inevitably, the world is becoming more technologically intertwined than ever before. While cloud computing certainly isn’t a new phenomenon in the IT and consumer environments, the technology has only recently started to take its place in the manufacturing industry. As industrial automation becomes more intelligent and manufacturers embrace machine-to-machine (M2M) technology, cloud computing is set to become the obvious solution to store and manage the ever-growing expanse of production data. Aside from increased storage space, the cloud helps manufacturers to reduce costs, change business models, provide new services, increase agility, optimise performance and, ultimately, drive profitability. cloud computing is set to become the obvious solution to store and manage the ever-growing expanse of production data Energy data management For industry, there are few topics as widely discussed as the European Union’s Energy Efficiency Directive. While legal requirements, such as complying with the Energy Savings Opportunity Scheme (ESOS), are clear, strategies for meeting these requirements are not always as straightforward. Performing a meaningful evaluation of a manufacturing facility’s energy efficiency is only possible when complete energy consumption figures are available. This data should be compared against other factors, such as machine and process data, number of items produced and more. To make sense of the copious amounts of data produced on the shop floor, many manufacturers are deploying energy data management systems (EDMS). Generally, an EDMS is set up locally and embedded into the existing IT infrastructure, but there are a number of different scenarios available, including moving the EDMS to the cloud, a possibility which enables company-wide analysis of energy data. Traditionally, organisations with more than one production site would monitor energy efficiency locally.
But now, cloud computing allows energy managers to track this data on a global scale, regardless of their location. Not only does this help set benchmarks for company-wide efficiency standards, but it also creates opportunities to develop new business models to tackle energy management head on. Silver linings One of the greatest benefits of moving to the cloud is the cost advantage. Rather than paying a set amount for on-site infrastructure, some manufacturers choose the cloud because expenses are easier to calculate and charged on a pay-as-you-go basis. What’s more, as the platform is hosted by a third party, there’s no need for hardware installation or ongoing maintenance at your end, keeping operation costs to a minimum. But cloud computing isn’t just for large organisations. Manufacturers of all sizes are already using the cloud to gain access to company-wide production data, key performance indicators (KPIs) and energy efficiency figures – all at the push of a button. Calm before the storm [easy-tweet tweet="The benefits of #cloud computing are clear, that doesn’t necessarily mean it’s an easy decision for #manufacturers to make"] Although the benefits of moving to cloud computing are clear, that doesn’t necessarily mean it’s an easy decision for manufacturers to make. Before making the move, organisations must decide on the objectives and reasoning for choosing cloud computing in the first place. From there, they should set a specific migration strategy in place. One of the most common concerns to arise when moving to the cloud is how the move can be achieved without disrupting the organisation’s existing IT infrastructure. Another is how to integrate an entirely new procedure into established operational routines. For some, there is an inevitable fear surrounding the idea of storing production data off-premises. Although these concerns are all valid, the fears aren’t as justified as they might seem. Most cloud providers have invested heavily to ensure the infrastructure is safe and resilient to any attacks Most cloud providers have invested heavily to ensure the infrastructure is safe and resilient to any attacks. For example, Microsoft’s cloud platform, Azure, is ISO 27001 certified and provides disaster recovery as a service (DRaaS). By automating the replication of your factory data, Azure will provide a secondary data centre to act as your recovery site, meaning even in the unlikely event your data is lost, it isn’t gone forever. What’s more, Microsoft’s Azure platform is fully compatible with a number of energy data management systems. COPA-DATA’s software, zenon, for example, is based on Microsoft’s latest technology and therefore can be seamlessly integrated into the Azure cloud platform. This consistency provides a complete end-to-end Internet of Things (IoT) solution for manufacturers, from industrial machinery to the cloud and mobile devices. All of this means concerns about integrating the current IT infrastructure with the cloud are diminished. Above the clouds [easy-tweet tweet="Moving to #cloud computing isn’t just about moving #data storage off-site" user="comparethecloud"] Moving to cloud computing isn’t just about moving data storage off-site. Used correctly, the cloud can enhance an organisation’s performance in production, efficiency and, potentially, its entire business model. With so much potential, it’s hard to see why any manufacturer wouldn’t consider cloud computing.
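As a simple, hypothetical illustration of the company-wide energy benchmarking described earlier in this piece, the Python sketch below compares energy consumed per item produced across several production sites. The site names and figures are invented for the example; in practice an EDMS would draw these values from shop-floor metering and production data.

```python
# Hypothetical company-wide energy KPI: kWh consumed per item produced,
# compared across sites. All site names and figures are made up.
sites = {
    "Plant A": {"kwh": 182_000, "items": 91_000},
    "Plant B": {"kwh": 240_000, "items": 96_000},
    "Plant C": {"kwh": 310_000, "items": 155_000},
}

for name, data in sites.items():
    specific_energy = data["kwh"] / data["items"]  # kWh per item produced
    print(f"{name}: {specific_energy:.2f} kWh/item")

# The most efficient site becomes the internal benchmark for the others.
best = min(sites, key=lambda s: sites[s]["kwh"] / sites[s]["items"])
print(f"Benchmark site: {best}")
```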
The days of canaries in coal mines are long gone; today’s industry is a world of calculated risks and intelligent data. To evolve effectively, manufacturers should carefully consider their needs, objectives and business goals because, without taking these steps, they risk getting left behind. ### Seeing clearly with cloud-based CCTV Cloud computing appears to have been accepted into the IT mainstream as enterprises of all sizes increasingly adopt public, private and hybrid services. Whether they choose Software as a Service, Platform as a Service or any other flavour of aaS, they benefit from the flexibility, capacity, cost savings and ease of access that cloud offers. [easy-tweet tweet="Why have so many #CCTV users been reluctant to innovate and move to the #Cloud?" user="cloudviewcctv"] One sector which has been so far reluctant to adopt any sort of cloud model, despite the potential benefits, is corporate CCTV. Why have so many CCTV users been reluctant to innovate and how can cloud help the corporate security community with the unique challenges it faces? Police forces and utility companies are already using this type of system, but for the majority of corporates, analogue solutions which require manual oversight still dominate the market. The main reason for lack of interest in cloud-based CCTV appears to be the longevity of analogue technology. Only around one-fifth of new CCTV installations are digital, despite the widespread availability of digital systems. Users give two key reasons for resisting change: it would require an expensive ‘rip and replace’ of their existing infrastructure, which is simple to install and works well; and digital CCTV is more complex, insecure and expensive to install. On this second point they are only partially correct, as analogue systems are equally insecure if they are connected in some way to a corporate network (e.g. via a DVR). The main reason for lack of interest in cloud-based CCTV appears to be the longevity of analogue technology With an installed base of hundreds of millions of analogue cameras, these concerns have been a deterrent for proponents of cloud technology because of the hurdle of connecting the cameras to the internet and the security problems this connectivity can bring. However, both concerns can be addressed by the latest generation of cloud solutions. First, analogue systems can now be brought up to date and beyond without the need to replace cameras or recording equipment. In effect, users can now take an analogue camera and plug it into the digital world by simply adding an intelligent adapter. This enables footage to be streamed directly to a cloud storage system using standard internet connections – regular broadband, 3G or satellite services – with recording initiated either by automatic event triggers or manual activation. Users can view event-triggered alerts, live views of CCTV feeds and recorded footage from one or more cameras at one or more locations wherever and whenever they want via their smartphone, tablet or PC. The costs of changing to this type of system are minimised because there is no need to replace cameras or cabling. There is minimal complexity, with no need for third-party hardware such as routers and firewalls in order to connect to the internet.
It can also be used to add remote monitoring and alerting to a CCTV system without the need for VPN tunnelling, fixed IP addressing or other configuration changes to ensure security from unauthorised access (hacking), or to add visual verification to intruder alarms.

The second reason many are reluctant to move to digital systems is security. Any insecure embedded device connected to the internet is a potential target for attacks. These are usually initiated by computer botnets, but in recent months we have seen major distributed denial-of-service (DDoS) attacks triggered by malicious requests from CCTV cameras. [easy-tweet tweet="DVR-based CCTV systems have a number of security weaknesses that leave companies vulnerable says @cloudviewcctv"] Traditional DVR-based CCTV systems have a number of security weaknesses that leave companies vulnerable, both as an entry point for corruption and as a pivot for attacking a corporate network. These include the use of port forwarding, few automatic firmware updates, a lack of oversight because footage may rarely be looked at, and a predisposition among manufacturers to include 'back door' functionality. They are also vulnerable to data exfiltration - by their nature, DVRs carry a lot of network traffic in both directions. How can organisations tell what that traffic is and where it's going? This, combined with their large hard drives, makes DVRs the ideal point to extract vast quantities of data from a network.

Adding a digital encoder to an analogue CCTV camera enables it to be connected to the internet but does not solve the host of security issues that come with internet connectivity. Encoders are also expensive. Meanwhile, many cloud-based video solutions use the same IP connection and 'port forwarding' techniques as an old-fashioned DVR, so the security problem remains. However, cloud solutions are available which only require outbound connections. And because an adapter only has to perform a fraction of the functionality of a full DVR, it is much less powerful and hence much less attractive to a potential attacker.

A final aspect which must be considered is data security. The 1998 Data Protection Act outlines the steps that organisations must take to preserve the confidentiality of gathered data. CCTV users need to ensure that their potential providers have strictly defined controls around the access to, and management of, customer data, and do not share that data with a third party without the explicit consent of the user. To ensure that sensitive data is secured in the cloud, organisations need to look for systems that offer authentication, end-to-end encryption and a digital signature to ensure data integrity.

[quote_box_center] We recommend asking a potential cloud CCTV provider a series of questions to ensure that they comply with data security requirements, such as:
How do you ensure secure communication between cameras and users?
Once data has reached the cloud, how is it protected from unauthorised access, and what happens if the cloud system itself is breached?
How is data integrity ensured? For example, how does the user know that the data is current and not from, say, two weeks ago?
Where is the data held? The Data Controller must know where the data is residing and must be happy that it is compliant with Data Protection regulations. [/quote_box_center]

It remains to be seen when the CCTV sector will fully embrace cloud technology.
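As an illustration of the outbound-only, integrity-protected design described above, here is a minimal sketch of an adapter-side uploader: it opens only an outbound HTTPS connection and attaches an HMAC signature so the receiving service can verify a clip has not been tampered with. The endpoint, shared key and header names are hypothetical assumptions, not taken from any specific vendor.

```python
# Minimal sketch: event-triggered clip upload over an outbound-only HTTPS
# connection, with an HMAC-SHA256 signature for integrity checking.
# Endpoint, key and header names are hypothetical.
import hashlib
import hmac
from pathlib import Path

import requests

UPLOAD_URL = "https://cctv.example.com/api/v1/clips"  # hypothetical endpoint
SHARED_KEY = b"replace-with-per-camera-secret"         # hypothetical shared secret

def upload_clip(camera_id: str, clip_path: str) -> None:
    clip_bytes = Path(clip_path).read_bytes()
    # Sign the clip so the cloud side can detect tampering in transit.
    signature = hmac.new(SHARED_KEY, clip_bytes, hashlib.sha256).hexdigest()
    response = requests.post(
        UPLOAD_URL,
        data=clip_bytes,
        headers={
            "X-Camera-Id": camera_id,
            "X-Signature": signature,
            "Content-Type": "application/octet-stream",
        },
        timeout=30,
    )
    response.raise_for_status()

if __name__ == "__main__":
    upload_clip("warehouse-entrance-01", "/var/spool/cctv/event-20160114.mp4")
```

Because the adapter only ever dials out, there is no listening port to forward and nothing for an attacker to probe from the internet, which is precisely the advantage of the outbound-only designs discussed above.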
Cloud-based CCTV offers significant benefits and can be added to existing analogue systems without breaking the bank or compromising security. Many cloud systems do have vulnerabilities but, more often than not, they apply the rigour of well-thought-out security and data protection standards, which are likely to offer better security at a lower cost than anything a single client could implement alone. Moreover, they provide the physical security benefit of holding the data in a remote location.

### Is Oracle a sleeping cloud giant?

I have heard many times the famous quote regarding cloud computing spoken by Larry Ellison of Oracle - "cloud is a fad". I have also witnessed a number of people claim that Oracle is too late for cloud, that cloud is only AWS, and that the market is subject to an early-mover advantage; this includes people within my own organisation. [easy-tweet tweet="Let's run with the assumption that first mover advantage means market control..." user="comparethecloud"] For a moment let's assume the cloud purists are correct: that the AWS early advantage captured the hearts and minds of the development and startup communities and launched household names such as Netflix et al. Using this assumption of 'first mover advantage' would mean that all products and marketplaces are under the control of the originator of the product or service. If that were true:

Karl Benz would have complete control of the automobile market, and there would never have been Henry Ford with his Model T and mass production

Motorola, which invented the first mobile telephone, would have retained ownership of the mobile marketplace (I must put down my shiny gold iPhone 6 Plus 128GB)

Richard Trevithick would have dominated the worldwide need for steam locomotives with his patents, creating wealth beyond imagination

Second Life would have outstripped Facebook in new user sign-ups and advertising revenues

The point I am trying to make here is that pioneers do not always capture a marketplace. And where technology and certainly the Enterprise is concerned, the transference of e-commerce, born-on-the-web approaches to customer service and problem resolution will not be based on self-service web support. [easy-tweet tweet="Enterprises and #SMBs will require problem resolutions from human beings, not automated algorithms"] Enterprises and SMBs will require/desire problem resolutions from human beings, not automated algorithms. The Enterprise has always been the traditional domain of the large IT players such as IBM, HP, Dell and Oracle; and whilst there is always going to be talk of disruption, these companies are transforming.

So back to Oracle; why do I feel they are a sleeping giant? See below a picture of a job advertisement that has been published across social media recently. I must admit, after seeing this advert and reading the job posting link, I had to stand back and nod in agreement. When looking at the description, a number of things stood out for me:

Use of social media for prospecting and profiling customers, leveraging social selling techniques

Building out consultative inside sales teams rather than enterprise sales types

Aiming towards a younger, millennial workforce (I judged this by the imagery used)

The separation into three simple-to-understand areas of sales engagement

My view is that a sleeping giant has taken the traditional Chinese business approach of entering a market late with a view to disrupting it with a new approach and price.
Oracle Cloud has signalled both of these very loudly to the marketplace. A significant advantage that plays to Oracle's strength is its far reach into its traditional customer base. Perhaps, with this team being assembled across the globe, Oracle is intending to disrupt itself, a move we suspect the majority of its legacy customer base would welcome. We will be keeping a close eye on Oracle Cloud developments in the coming months. My personal view is that a giant has awoken from a self-induced slumber, and it will be attacking the cloud market in anger soon.

### Speech by John Hayes at the 8th International Cybersecurity Forum in Lille

The following is a speech by John Hayes, Security Minister at the UK Home Office, at the 8th International Cybersecurity Forum in Lille #FIC2016. Monsieur le Ministre Cazeneuve, mesdames et messieurs les organisateurs du Forum International de la Cybersécurité, je vous remercie de me donner l'occasion de prendre la parole au sujet de la sécurité des données et de la vie privée. [Monsieur Cazeneuve, ladies and gentlemen, organisers of the International Cyber Security Forum, thank you for giving me the opportunity to speak about data security and privacy.]

Forums such as this, drawing international visitors – from governments, citizens, business and research – serve a vital role in bringing us together to explore the threats and the opportunities that arise because we all now live in a digital world. Living in this digital world, all of us here today understand that data security, and with that, wider cyber security, underpins the digital economy. It is needed to keep businesses, citizens and public services safe. After all, trust and confidence in the security and privacy of data is crucial for consumers, businesses and investors.

As we all know, the scale of the threat to the security of data is significant. In the UK alone, 90% of large businesses and 74% of small businesses had a cyber-breach in the past year. These breaches can be hugely costly and damaging to businesses. But, as we all know, cybercrime and wider issues of cyber security transcend national boundaries; we cannot, therefore, deal with these threats alone. Our two countries already recognise the need to work together to face this threat through strong government-to-government dialogue. The UK National Cyber Crime Unit, part of the National Crime Agency, works closely with French (and other European) colleagues in Europol's Joint Cybercrime Action Taskforce (J-CAT). The NCA also has dedicated International Liaison Officers located here in France so that we might achieve quicker, more effective criminal investigations between our two countries. We continue to strengthen this important relationship. In the coming months we hope to welcome colleagues from French Customs and the Police Nationale to the National Cyber Crime Unit to continue to identify mutual opportunities in dealing with mutual threats; long may this cooperation continue.

When I consider the approaches we, as Governments, have taken to set the direction for how we deal with cyber security – and with that, cybercrime – I welcome the parallels that can be drawn between your National Digital Security Strategy and the UK National Cyber Security Strategy. This is because both recognise the two key approaches to cyber security: (1) working in partnership – with business, with public services and with citizens – and (2) closing the skills gap in cyber security and cybercrime.
I'd like, then, to outline how the UK has approached these, before setting out, in brief, our broader approach.

Through the UK's Cyber Security Strategy, first published in 2011, we have invested £860m in a National Cyber Security Programme to support the UK's economic prosperity, protect our national security and safeguard the public's way of life, building a more trusted and resilient digital environment. Adopting a partnership-based approach, we have worked with industry to transform business understanding and response, and to date a huge amount has been achieved. One success is the 'Cyber Essentials' certification scheme. Launched in 2014 as a joint industry and Government scheme, this sets out clear basic standards for cyber security. We know the majority of successful cyber-attacks exploit basic weaknesses. Cyber Essentials addresses those basic weaknesses. The message behind it is clear: if you adopt Cyber Essentials in your business, you will protect your business against the majority of threats on the internet. The Cyber Essentials scheme isn't just aimed at the large prime firms – it is also intended to help them manage their third-party risks, which is why we have made the scheme suitable for smaller businesses, including those who are part of larger supply chains. Over 1,200 Cyber Essentials certificates have now been issued. In Government, we now require suppliers of most contracts and services to hold a Cyber Essentials certificate.

We are also working with the audit community to take cyber security out of the IT department and into the boardroom. The Cyber Governance Health Check, now in its third cycle, helps the UK's top firms understand and improve their level of cyber security; last year's health check data demonstrates good progress. For example, 88% of FTSE 350 firms now include cyber security in their risk register, up from 58% in 2013.

The National Cyber Security Programme has also invested in law enforcement capability and skills: establishing the National Cyber Crime Unit (NCCU), located in the NCA, to provide a strong overarching response to combatting the most serious cyber criminals. The NCA is also investing in state-of-the-art equipment and specialist expertise, keeping pace with the criminals who threaten the public. Cyber teams have also been established within each of the Regional Organised Crime Units (ROCUs) across England and Wales to bolster the national and local response, and cybercrime training courses have been introduced for all police forces. Finally, we are also training police officers and staff in how to identify and secure evidence on digital devices.

But now, in 2016, we are looking to develop the next National Cyber Security Strategy. We are going to nearly double our investment – £1.9 billion over the next five years – to protect Britain from cyber-attack and develop sovereign capabilities in cyberspace, and it will continue to focus on partnership with business, with public services and with citizens.

As a former Minister for Skills who oversaw a substantial increase in apprenticeships, I know the importance of continuing to address the skills gap in cyber security. Building on the 2011 programme, the Government has now put in place interventions to improve cyber security skills at every level of education – from coding in schools to cyber security at post-graduate and doctoral level.
The challenge now is to hugely increase the number of young people and existing workers moving into the cyber security profession, to ensure we have the skills we need now and in the future.

We will continue to work with industry, supporting the best cyber start-ups through the creation of two cyber innovation centres, and we will launch a £165 million Defence and Cyber Innovation Fund to support innovative procurement across both defence and cyber security. The National Cyber Security Strategy also reflects our need to build on our capability to address cyber security threats in 2016 and beyond. There will be an increase in the capabilities of the National Cyber Crime Unit to develop stronger defences for government systems; this will include further investment to develop law enforcement capabilities at a national, regional and local level to investigate, disrupt and protect against cybercrime. We will create a new National Cyber Centre which will work with industry, academia and, most importantly, our international partners to protect against cyber-attacks. Using these strengthened capabilities, we need to continue to work together to disrupt the criminal marketplace and target those criminals who believe they can act anonymously online. We also need to go after the infrastructure and financial networks that are the source of attacks and profits for cyber criminals.

One of the greatest British Prime Ministers, Benjamin Disraeli, once said: 'Circumstances are beyond human control, but our conduct is in our own power.' We all know that the issues surrounding cyber security are not going away. The threat to data security and privacy remains significant. But, although we cannot determine the circumstances we face, we can determine our response. By continuing to work in partnership we can continue to tackle this threat. Chacun parmi nous joue son rôle dans ces efforts, et il est très important que nous continuions de le faire dans l'avenir. Je vous remercie de votre attention. [Everyone amongst us plays a role in these efforts, and it is very important that we continue to do so in the future. Thank you for your attention.] Thank you.

### Webcast: Can ‘cloud’ be used by financial services?

Date: Feb 02 2016
Time: 2 p.m. GMT (8 a.m. CST)

There's no escaping that, in the new digital age, creating an omnichannel customer experience that puts the customer first needs a solid cloud foundation to be successful. Financial services companies face more challenges than ever before from new competitors that use cutting-edge cloud architecture — without the migration issues of legacy infrastructure. And insurance companies need to implement cloud services and big data analytics before they can even begin to embrace the latest in Internet of Things insurance services. In our first webcast in the Insights on Cloud in Banking, Financial Services and Insurance series, you'll learn how cloud can increase your agility and provide best-of-breed security for your financial services firm and your customers. Featured speaker Neil Cattermull, director of Compare the Cloud, will offer his considerable insights from more than 20 years in IT management working at major banking institutions. He'll lend his deep technical understanding of cloud and centralised IT infrastructures to help provide an overview of which approach to cloud best supports regulated entities.
### New Complete Cloud-Based Security Solution from iSheriff

iSheriff has announced the release of iSheriff Complete, a comprehensive cyber security platform designed to provide 360-degree protection of an organisation's devices and communication channels. iSheriff Complete is the industry's first comprehensive cloud security platform to provide fully integrated endpoint, Web and email security, delivered through a single Web-based management console with a single set of enforceable security policies. [easy-tweet tweet="#CloudNews: @iSheriffINC announces iSheriff Complete, a comprehensive #cybersecurity platform"]

“In order to provide a secure network, a security manager must be able to see and control all communications in and out of that network,” said John Mutch, CEO, iSheriff. “iSheriff Complete provides superior malware protection and full control of Web, email and endpoint vectors, providing the only integrated cloud-based security platform available today. We deliver complete cyber security through the cloud, all controlled by a single, easy-to-use interface. iSheriff Complete implements security policies and schedules reports in minutes, so our customers can get back to running their businesses.” As a cloud-based platform, iSheriff removes potential malware and viruses before they ever reach the network by providing a ‘clean feed’ to customers for their email and Web traffic.

[quote_box_center] iSheriff Complete is available now and includes:
Endpoint, Web & Email – Complete vector control and visibility is central to the iSheriff platform, which offers a tightly integrated security solution, unlike ‘best of breed’ point products that often do not work together. iSheriff Complete is designed to enforce common security policies across all vectors, as well as track users both on and off the network.
Threat Detection Engine – Working 24x7x365 to stay ahead of the latest cyber threats, iSheriff security operates around the globe, detecting new threats and developing and propagating new signatures in a matter of seconds.
Active Response Console – The console is designed to be easy to use, allowing organisations to control their entire cyber security system from one interface. Threats are displayed instantly, along with what actions have to be taken based on defined policies. [/quote_box_center]

“We’re in an age of fast-moving change and innovation powered by connected mobile devices and cloud computing, which has opened up the influence of technology to everyone, including criminals,” Mutch continued. “Cloud-based services and software of all varieties have been broadly adopted. By eliminating the need for complex and expensive hardware and software installation and offering scalable, per-user pricing, cloud-based platforms open access to services that were previously cost-prohibitive for many businesses.” iSheriff is one of the leading companies providing effective security completely from the cloud. The company delivers the industry's easiest-to-deploy-and-manage Software-as-a-Service (SaaS) solution for protection against today's multi-vector threats and compliance challenges, combining the power, flexibility and ease of a cloud-based service offering with a seamlessly integrated suite of critical technologies. For more information on iSheriff, please visit www.isheriff.com and follow the company on Twitter at @isheriffinc.

### What is Next-Generation Endpoint Security?

It seems like an easy question: How do you stop a breach?
The answer's not so simple. Organisations like yours can spend a fortune acquiring the best security technology and personnel – and still get breached. It's because their security systems were designed to defend networks against malware. But it's not just malware you need to worry about these days, and it's not just your network you need to protect. You have to stop breaches where they start: at the endpoint. [easy-tweet tweet="security systems were designed to defend #networks against #malware, but things are changing" user="comparethecloud"] As organisations grow and become more distributed, adding more endpoints across the enterprise, sophisticated adversaries will continue to aggressively target their data and IT infrastructure. Rather than over-relying on popular anti-virus tools, which alone are insufficient and unable to properly combat advanced cyber-attacks, organisations need to leverage next-generation endpoint security tools in order to more effectively detect and prevent all attack types, at every stage – even malware-free intrusions.

Mistakes Enterprises are Currently Making on the Endpoint

Relying solely on anti-virus technologies. In today's sophisticated threat landscape, anti-virus technologies alone are not sufficient to prevent persistent and advanced attacks. Adversaries evolve their tradecraft faster than security companies can update their tools. Compounding the challenge, attackers increasingly employ malware-free intrusion tactics. In fact, less than 40 percent of attacks today involve malware. You cannot rely on security at the perimeter alone to keep the enterprise safe.

Solution: Anti-virus software is still useful and must be kept up to date. However, responding only to threats that have already been identified is like being a bank guard who lets a robber come in because the police haven't released a description of a robbery suspect yet. A good bank guard knows to look for malicious activity anywhere it might be found. Traditional anti-virus solutions may catch run-of-the-mill malware, but are no match for advanced adversaries going in with stealthy intrusion tactics. Organisations need to employ next-generation antivirus capabilities that can detect and prevent unknown malware and, importantly, go beyond that to block attacks that do not use malware at all.

Failing to monitor your enterprise endpoints. The conventional "defence-in-depth" model has focused on defending the perimeter of an organisation. Today, more often than not, adversaries are finding ways to penetrate the network and execute code at the system's endpoints. We are also witnessing a continuous and ever-evolving sophistication in adversary tradecraft beyond anything we've seen before. Watching the perimeter only allows for "silent failure." That is, once an adversary is inside, he operates freely without threat of detection because nobody is looking. He will operate with impunity, posing grave danger to your organisation.

Solution: Employ technologies that monitor endpoints continuously. Real-time and historical endpoint visibility is critical for making the transition from reactive security to proactive hunting and detection. Aggregating large swaths of data and looking for anomalous behaviour across the enterprise will help to identify indicators of attack. If you can identify adversary activity expeditiously, you can isolate and mitigate the attacker's impact on your network.
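To make the continuous-monitoring idea above more tangible, here is a toy sketch of the kind of aggregation a next-generation endpoint tool performs at far greater scale: collect process events from many hosts, build a baseline of what normally runs, and flag combinations that have rarely or never been seen before. The event fields and the rarity threshold are illustrative assumptions, not any vendor's actual telemetry schema.

```python
# Toy sketch: aggregate process-launch events from many endpoints and flag
# parent/child combinations that fall outside the observed baseline.
# Event fields and the rarity threshold are illustrative assumptions.
from collections import Counter

def build_baseline(events):
    """Count how often each (parent, child) process pair has been seen."""
    return Counter((e["parent"], e["process"]) for e in events)

def flag_anomalies(baseline, new_events, rare_threshold=2):
    """Return events whose parent/child pair is rare or unseen in the baseline."""
    return [
        e for e in new_events
        if baseline[(e["parent"], e["process"])] < rare_threshold
    ]

if __name__ == "__main__":
    historical = [
        {"host": "hr-laptop-07", "parent": "explorer.exe", "process": "outlook.exe"},
        {"host": "hr-laptop-07", "parent": "explorer.exe", "process": "outlook.exe"},
        {"host": "dev-ws-12", "parent": "explorer.exe", "process": "chrome.exe"},
        {"host": "dev-ws-12", "parent": "explorer.exe", "process": "chrome.exe"},
    ]
    incoming = [
        # Word spawning PowerShell is a classic malware-free intrusion pattern.
        {"host": "hr-laptop-07", "parent": "winword.exe", "process": "powershell.exe"},
        {"host": "dev-ws-12", "parent": "explorer.exe", "process": "chrome.exe"},
    ]
    baseline = build_baseline(historical)
    for event in flag_anomalies(baseline, incoming):
        print("Investigate:", event)
```

A real platform does this across billions of events with far richer context, but the principle is the same: the anomalous behaviour, not a malware signature, is what raises the alert.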
[easy-tweet tweet="Do you know the critical building blocks for effective cyber defence?" user="comparethecloud" hashtags="security"]

What to look for in Next-Generation Endpoint Security Solutions

When evaluating next-generation endpoint security solutions, organisations should ensure that technologies provide the following capabilities:

Complete Protection – Solutions today need to prevent attacks from both known and unknown malware, allowing organisations to defend against attacks that existing security tools can't stop. Modern threats come in all shapes and sizes. You need a solution that covers all types of attacks, from commodity malware to the most advanced persistent threats. Ensure that next-gen endpoint protection tools provide proactive and continuous protection against everyday threats, as well as sophisticated attacks that are undetectable and invisible to traditional malware-centric defences.

Endpoint Visibility – Visibility and continuous monitoring across every endpoint in an environment is a key requirement. This capability allows you to discover and investigate current and historic endpoint activity in seconds – providing you with a complete and searchable forensic record of endpoint events. Time to respond should be measured in milliseconds – with time to remediation in minutes or hours, not days, weeks or months. This capability should span all major platforms, including Windows, Linux and Mac.

Lower Cost & Complexity – Endpoint security platforms that are 100% cloud-delivered reduce costs by 75% versus traditional on-premise solutions and allow for frictionless deployment of sensors to hundreds of thousands of endpoints in minutes. Cloud delivery provides protection where your users are – on or off the network. Cloud-delivered endpoint solutions can bring significant benefits with respect to deployment times, reduced hassle and complexity with updates and maintenance, and immediate out-of-the-box protection capabilities.

Indicator of Attack Approach – Organisations need to move beyond a reactive Indicators of Compromise approach to a proactive attack detection strategy. Security tools need to focus on identifying adversary objectives, as opposed to simply detecting malware tools. Detection of attacks in progress provides the ability to spot an attack prior to a devastating data breach.

These core areas are no longer just part of an emerging approach but critical building blocks for effective cyber defence. While there is no end in sight to the arms race between attackers and defenders, the tools at the disposal of enterprise security professionals are dramatically improving. In the defender's toolbox, the Next-Generation Endpoint category of tools is proving that an evolution in the way that endpoint security is handled is both necessary and available.

### Egnyte Debuts in Microsoft’s Cloud Storage Partner Program

Today, Egnyte announced the availability of an all-new Co-Authoring integration as a select member of Microsoft's Cloud Storage Partner Program. Available immediately, Egnyte customers using Office Online will now be able to collaborate in real time on PowerPoint, Word and Excel documents stored within their Egnyte environment. [easy-tweet tweet="#CloudNews: Egnyte announces Co-Authoring integration with Microsoft’s #Cloud #Storage Partner Program"] With more than 1.2 billion Microsoft Office users to date, this integration will be a significant addition to our product portfolio.
As a part of Egnyte's open platform, which enables users to securely collaborate on files in any app, on any device, via any storage, this new functionality will give customers a whole new way to work.

“We’re excited about the next step in our journey to make Office files more open and accessible for Egnyte customers,” said Kirk Koenigsbauer, corporate vice president, Microsoft Office. “Starting today, real-time co-authoring is available for customers storing documents in Egnyte’s hybrid environment, giving them enhanced flexibility and a more seamless experience while working with Office applications.”

According to one study, less than 10 percent of enterprise data is stored in a public cloud. As such, we're happy to be a part of the initial group of partners, as this not only validates our place in this market, but also the demand for our hybrid solution. While the need to move workloads to the cloud is steadily increasing, a large majority of businesses remain committed to storing a portion of their data on-premises for investment protection, security and performance reasons – especially large enterprises. Egnyte's hybrid approach to file sharing uniquely positions us to offer Microsoft Office customers the flexibility to work on content stored in the cloud and on-premises, which our fellow Microsoft Cloud Storage Partners cannot.

"Technology is always advancing the way we work, which positively impacts our operational efficiency," said Paul Creed III, technology project director at Kent State University. "Our Microsoft Office Online users can view, create, and collaborate on documents stored in the cloud via Egnyte using the Microsoft or Egnyte web interfaces. The ability to use existing applications — that are familiar and part of everyday workflows — in conjunction with our Egnyte account has improved productivity in such a seamless way, which allows our staff to focus on the task at hand."

The Microsoft Co-Authoring integration will fit nicely into the many existing integrations we have developed for Outlook, Office Online, Office 365, and Office mobile apps. Moving forward, our engineering team is excited to continue working as a part of the Microsoft Cloud Storage Partner Program to create better experiences for all of our joint customers. Being a part of Microsoft's Cloud Storage Partner Program has been invaluable to the proliferation of our own Technology Partner Program at Egnyte. By executing successfully at the highest level with folks like Microsoft and Google, we are confident that we can replicate that success with a wide variety of new partners. In addition to our work with Microsoft, we have built a tremendous suite of integrations with companies such as DocuSign, Jive, OneLogin, and Salesforce as well as industry-specific partners like FotoIn, SmartUse, and SmartSheet. In 2016 we will look to expand this program so customers can deploy best-of-breed solutions throughout their organization.

### m-hance Limited joins the NetSuite Solution Provider Partner Program

NetSuite Inc., the industry's leading provider of cloud-based financials / ERP and omnichannel commerce software suites, today announced that UK-based m-hance Limited, an award-winning gold Microsoft Dynamics partner, has joined the NetSuite Solution Provider Partner Program, adding NetSuite to its existing portfolio, which includes Microsoft Dynamics GP and Microsoft Dynamics CRM, to meet the growing demand for cloud ERP among businesses in the UK.
NetSuite will be marketed under ‘HighCloud Solutions,’ a new division specifically created for m-hance’s cloud ERP practice. Partnering with NetSuite will allow m-hance to gain an immediate presence in the cloud enterprise resource planning (ERP) market. Under the partnership, m-hance will provide full implementation, support, customization, integration and training services to mid-market clients in the wholesale distribution, manufacturing, professional services, retail, financial services, media and publishing, and not-for-profit industries. Headquartered in Manchester with offices in London, Dublin and Glasgow, m-hance has more than 20 years’ experience in implementing and supporting sector-specific ERP and customer relationship management (CRM) projects, and brings proven expertise in helping organisations get the most out of their software and a maximum return on their investment. The creation of HighCloud Solutions will mean a dedicated focus on NetSuite, while m-hance will continue to focus on its partnership with Microsoft. With the NetSuite Cloud, customers of HighCloud Solutions will be able to run core business processes such as finance and accounting, ecommerce, HR, CRM and project and resource management from a single, unified cloud platform. By implementing NetSuite customers can expect to reduce local IT hardware demands, gain real-time business visibility of their organisation and improve operational costs, efficiencies and productivity. “I am delighted that we have become an Accredited NetSuite Partner,” said Stephen Driscoll, CEO of m-hance. “It is imperative that we offer our customers and our prospects a ‘true cloud’ solution. The creation of HighCloud Solutions will allow us to meet that objective. Our highly experienced team, combined with the industry’s number one cloud product will create a market-leading solution.” “Experienced ERP practices like m-hance are seeing the demand for cloud ERP solutions across the UK and realising the advantages of partnering with NetSuite, the leader in the market,” said John Campbell, EMEA Channel Director for NetSuite. “We welcome the new HighCloud Solutions business and look forward to a lasting relationship benefitting the two companies and our mutual customers.” Launched in 2002, the NetSuite Solution Provider Program is the industry's leading cloud channel partner program, providing hundreds of channel partners with a cloud solution to offer prospective customers and grow their businesses as well as industry-leading margins and incentive programs. With cloud computing at the forefront of the hottest trends and cloud ERP leading the way, channel partners representing on-premise products are continuing to build new practices based on NetSuite's superior cloud business management suite. Designed to help solution providers transform their business model to fully capitalise on the revenue growth opportunity of the NetSuite cloud, the NetSuite Solution Provider Program delivers unprecedented benefits that begin during recruitment and range from business planning, sales, marketing and professional services enablement, to training and education. For more information about the NetSuite Solution Provider Program, please visit http://www.netsuite.co.uk/portal/uk/partners.shtml. Today, more than 24,000 companies and subsidiaries depend on NetSuite to run complex, mission-critical business processes globally in the cloud. 
Since its inception in 1998, NetSuite has established itself as the leading provider of enterprise-ready cloud business management suites of enterprise resource planning (ERP), customer relationship management (CRM), and ecommerce applications for businesses of all sizes. Many FORTUNE 100 companies rely on NetSuite to accelerate innovation and business transformation. NetSuite continues its success in delivering the best cloud business management suites to businesses around the world, enabling them to lower IT costs significantly while increasing productivity, as the global adoption of the cloud accelerates.

### Tackling Big Data with the New IP

Big Data is quickly becoming critical for businesses of all sizes and across all industries. With connected devices exploding in number and cloud computing becoming the norm for businesses and consumers alike, the amount of data that organisations have to store, analyse and understand has increased, and continues to increase, exponentially. And data generation isn't about to slow down. Business data worldwide, across all companies, doubles every 1.2 years. [easy-tweet tweet="#BigData is critical for businesses of all sizes and across all industries" user="jewellmh @comparethecloud"]

The Internet of Things (IoT) is a key driver of this data growth, and with Gartner estimating that 6.4 billion connected 'things' will be in use this year, we can clearly see why. The sheer volume of data produced by these connected devices will mean IT infrastructures are likely to fall well short of what businesses will require of them. In today's market, traditional, outdated legacy networks are still very much at large. These networks were often first architected twenty or more years ago, meaning that they were simply never designed to cope with new technologies such as cloud, social and mobile, and the huge volumes of data that they generate. If businesses cannot update their core infrastructure to cope with these new technologies, it's inevitable that they'll fall behind competitively. A recent report from IDC has highlighted that IoT may never reach its full potential if investment in the technologies underpinning it doesn't keep pace. Furthermore, a recent survey commissioned by Brocade found that 76% of UK CIOs are concerned that their network will prevent them from meeting business objectives – an alarmingly high figure considering that big data volumes are imminently going to expand.

So how can companies adapt their infrastructure to handle big data? The answer is The New IP. In simple terms, The New IP is a new approach to IP networking, one that's based on programmable networks rather than traditional hardware-led solutions. The New IP is highly virtualised, open, automated, flexible and scalable, and is not limited by user type. IT departments are increasingly under pressure to give their workers an agile, resilient and scalable network, one capable of delivering data and applications to any device, anywhere, at any time. The New IP is critical to making this possible. Adopting a New IP approach to data centre networks will give businesses a dedicated platform for IP storage that solves the problem of limited expansion and flexibility. The New IP will be able to offer the same performance, predictability, availability, and operational simplicity that were once achieved using older infrastructures, but applied with modern devices in mind.
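To illustrate what "programmable" means in practice, here is a minimal sketch of pushing a traffic-steering rule to a network controller over a REST API rather than reconfiguring hardware box by box. The controller URL, credentials and payload schema are hypothetical, not those of any particular SDN product.

```python
# Minimal sketch: program a network path through a controller's REST API
# rather than configuring individual devices by hand.
# Controller URL, credentials and payload fields are hypothetical.
import requests

CONTROLLER_URL = "https://sdn-controller.example.com/api/v1/flows"  # hypothetical
AUTH = ("netops", "replace-with-real-credential")                   # hypothetical

def steer_storage_traffic(src_subnet: str, dst_subnet: str, priority: int) -> dict:
    """Ask the controller to prioritise IP storage traffic between two subnets."""
    flow_rule = {
        "match": {"src": src_subnet, "dst": dst_subnet, "protocol": "tcp"},
        "action": {"queue": "storage", "priority": priority},
    }
    response = requests.post(CONTROLLER_URL, json=flow_rule, auth=AUTH, timeout=10)
    response.raise_for_status()
    return response.json()  # controller reports the installed rule's status

if __name__ == "__main__":
    result = steer_storage_traffic("10.10.0.0/16", "10.20.0.0/16", priority=200)
    print("Installed flow rule:", result)
```

The point is less the specific payload than the operating model: network behaviour becomes something software can change in seconds, which is what allows the infrastructure to keep pace with the data volumes discussed above.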
This approach has been emerging over the past few years but, thanks to advances in networking, the New IP is now beginning to gain traction globally. The New IP will be the catalyst that allows a wealth of new technologies – from Big Data to IoT – to fulfil their potential as transformative tools for business.

A New IP to enable smart technologies

Adopting the New IP will enable a new era of network technologies such as software-defined networking (SDN). SDN is a new model for building, managing and extending data centre networks that offers considerable technological and financial benefits. [easy-tweet tweet="#SDN is a new model for building, managing and extending #datacentre networks" user="comparethecloud"] As a relatively young concept, it is one that in many cases has only recently been understood by networking managers. However, according to recent research by Brocade, 40% of IT departments are planning to deploy SDN within the next five years, with 30% of those likely to do so in the next three years.

The transition to the New IP will not be instantaneous. It is likely to be an iterative process as technologies mature and organisations gradually come to terms with the changes that are required in order to future-proof their business. However, organisations that are serious about SDN and the New IP should be exploring the options now so that they can move together as the technology evolves. Taking a "let's wait and see" approach is, in this case, unlikely to be helpful as it will mean that enterprises fall behind and opportunities are missed altogether. It's clear that Big Data is rapidly rising up the list of priorities for many businesses. For a company looking to harness the power of Big Data to make faster, more intelligent business decisions, having the right network infrastructure in place will be all-important. Investing in The New IP now will allow IT teams not only to handle the growing data volumes of today, it will also give them the foundation they need to adapt and evolve their infrastructure in the years to come.

### Cloud Adoption in 2016: The State of UCaaS

According to Gartner's latest report, cloud infrastructure adoption is one of the main reasons why IT services will reach an astonishing $940 billion in revenue by the end of 2016. This estimate promises great progress after the slight drop we've witnessed in the past few months. Nevertheless, even during 2015 the need for UCaaS platforms increased by a strong 16%. While most small businesses and medium-sized companies are adopting cloud platforms in order to boost the overall efficiency of their business, and also in order to remain competitive in the market, the hybrid cloud still remains a popular choice for many, primarily because it is still the most economical solution. [easy-tweet tweet="Unified Communications as a Service has predicted 41% market penetration by 2020" user="comparethecloud"] However, a less familiar cloud computing model is projected to become a dominating trend over the next five years, and this will already be evident by the end of 2016. The need for Unified Communications as a Service (UCaaS) systems is constantly increasing. It is expected to reach 41% market penetration by the end of 2020, according to BroadSoft.
Surprisingly, although this cloud computing service may seem a little challenging when it comes to management, the solution is more popular with mid-sized companies and small businesses. This is why 42% of the 129 respondents surveyed claim that more than half of unified communication will happen through a mobile device by 2020. Mobile accessibility and structural simplicity are winning small businesses over rapidly, BroadSoft claims in its report. 18% of those surveyed even believe that these types of platforms will push email out of business.

The State of UCaaS

For those unfamiliar with the term, UCaaS is a cloud computing model which uses third-party applications primarily for transferring various kinds of data. Through a single model, it integrates applications for collaboration, conferencing and communication in general. This makes UCaaS a great platform for business streamlining and project management.

The question of quality of experience and the question of quality of service still keep UC away from the freemium market, and only paid solutions are available. This factor presents a serious problem for the providers marketing the product. Since the technology and the cloud servers behind services of this complexity have to provide reliability, real-time encryption of any data transfer, plus high responsiveness across varied workloads, a free solution is simply not scalable for the moment. At least not for the current investors. Conversely, Dave Michaels, a Principal Analyst from TalkingPointz, suggests that there is a business model that will allow users to try out this service for free. How? By selling custom-tailored solutions to business owners. Certainly, we have to count on the fact that IT and cloud computing standards are constantly rising as well, and features are bound to become slightly outdated, and consequently more cost-effective, sooner or later. [easy-tweet tweet="The majority of #servers are now switching to #SSD storages says @DanRadak"] For example, the majority of servers are now switching to Solid State Drive (SSD) storage, which is undoubtedly a crucial factor when it comes to the quality of experience of UCaaS, IaaS, SaaS and PaaS cloud computing systems. Now that even private users have adopted SSDs almost instantaneously because of their performance, despite their still debatably high price, the idea behind this pricing model seems more viable. If anything, providing prospects with free services inside a basic UC package could present an inexpensive way of building the customer base. Let's not forget that a great number of cloud computing vendors have implemented this approach successfully. Popular cloud platforms (like the project management collaboration platform Asana, for example) often have millions of free users. However, that is a marketing strategy which brings conversions too. (Asana has hundreds of paying users today.) Applying this marketing strategy to a UCaaS system is simply inevitable; the only question is which vendor will come up with a scalable solution first.

### The Cloud Market in Canada – A native’s perspective

Canada has gone through some changes recently and, most pressingly, the economy has taken some blows. In 2015 I was in a senior PM role for the software mobile device management (MDM) division at BlackBerry, so I'd like to think I have quite a solid grasp on where the market is at present.
Below is my comment on the cloud market in general in Canada, from the perspective of a local. [easy-tweet tweet="Hear how the #TechIndustry is developing in Canada from @MrBrianEmery" user="comparethecloud"]

In Canada we have some very interesting developments underway and, in my opinion, we have massive opportunities in making the cloud more apparent and available. AWS recently announced that it is opening up a data centre presence here after trying to dominate the territory. This, I believe, is a smart move considering the current state of the economy (an all-time low for the Canadian dollar) as well as cloud tech coming of age here. Obviously they are not the only cloud titan to make moves in this market. MS Azure is also a common service, but both of these firms are viewed here as outsiders looking in.

Who are the main players in Canada? IBM has a very dominant foothold in Canada, which became further established with the acquisition of SoftLayer back in 2013. This really gave us more choice and flexibility amongst the other large companies (coupled with a strong IBM relationship already established). However, there are other competing local players such as CenturyLink, Bell, iWeb, RackForce and a handful of others, each offering similar services; but this is infrastructure, and it is hard to differentiate. [easy-tweet tweet="Canada is going through a major #tech transition at the moment says @mrbrianemery"]

It is also interesting to see some of the very large cloud-centric firms targeting Canada; they all see this territory as one with steep growth, and to be honest I feel this too. We (Canada) are going through a major tech transition at the moment. The lack of major cloud providers having a presence here until recently means we have had to be innovative, and in some ways it has isolated us. Even smaller cloud-centric IT firms have differentiated themselves to play in this market and are winning business. With this said, we also have a skills shortage and look to outsourced development houses to assist (full stack skill sets) as we are playing catch-up. On the other hand, we have leapfrogged the pain of the evolution of infrastructure that most of the world has had to go through, which in my opinion puts us in a very good place.

Recruitment in Canada

This week there was a local recruitment event organised by Communitech, a sort of incubator of technology companies in the Waterloo area of Ontario. They are also a very good B2B and B2C business support organisation that really assists growth within the IT sector for their clients. The Tech Jam on the 19th of January was absolutely packed, filled with people like myself looking at who is hiring whom, and what skills are in demand. A very innovative approach was seen – vendors pitching their respective companies and asking the audience to approach them. [easy-tweet tweet="Canada has bypassed the growing pains of general #cloud infrastructure" user="mrbrianemery"] The skills in high demand from the 50-ish vendors exhibiting differed by the size of the organisation, but I can say that full stack developers, product managers, Java developers, software developers and testers were in high demand. This in itself is a good indication that we have bypassed the growing pains of general cloud infrastructure and are on the road to an Open Source application revolution. Of course, I should also mention that the financial institutions are looking to improve and integrate mobile, agile working too.
TD Bank and Scotiabank were the main sponsors, along with a handful of fintech companies. Blockchain is a relatively new concept over here and I think this is a good time to get on board with this financial service enabler. I must admit I knew very little about blockchain until it was explained to me, but it does make sense to implement a fast, effective and inexpensive way of processing information that ultimately benefits us, the end consumers of financial products.

We are a relatively small community and we do have the luxury of looking across the pond and seeing where everyone else has paid the price of innovation in an evolutionary market. Canada is about to boom with cloud technology. Yes, we have some economic challenges, but we are a relatively green field for new approaches to the underpinning technology and have a sound, solid base to build upon. With our Open Source heritage, and now the larger cloud players taking an interest in the tech scene, we are looking forward to a very bright future for tech in our country.

### Neil Cattermull reviews IBM LinuxONE

Yesterday IBM announced its supremacy in the Linux community again, following on from last year's announcement of the LinuxONE product range. In case you were sleeping, on holiday or just not online: IBM released support for Linux, and various flavours of it, within its ultra-resilient and scalable high-end infrastructure tech. But hold on, this is not the usual product set rebranded; it actually is a Linux powerhouse that has all the features of the most resilient tech on the planet, but now scales and offers practically every flavour of the OpenSource software available. [easy-tweet tweet="Thoughts on the @IBMLinuxONE announcement from @NeilCattermull"] So with Linux executing over 80% of the world's financial trades, it seems logical that it should be housed on the most resilient tech on the planet too! That's what IBM has done (it has actually been doing it for decades), and now the list of supported software is extensive.

[quote_box_center]
From March 2016, the Ubuntu Linux distribution and cloud tool sets (Juju, MAAS, Landscape), adding Ubuntu to the existing SUSE and Red Hat availability.
ISV tools including Apache Spark, Node.js, MongoDB, MariaDB, PostgreSQL, Chef and Docker.
Industry-standard Apache-based capabilities for analytics and big data. The components supported include Apache Spark, Apache HBase and more, as well as Apache Hadoop 2.7.1.
The Go programming language, which was developed by Google. Go is designed for building simple, reliable and efficient software, with contributions to the code from this summer.
From March 2016, optimised Cloudant and StrongLoop tech for LinuxONE, offering a highly scalable environment on Node.js, which enables developers to write applications for the server side using the language they prefer. Cloudant, an enterprise-grade fully managed NoSQL database, stores data in JSON format, common for mobile data.
[/quote_box_center]

It seems that major developments have taken place in the Linux community from an IBM point of view, and even more options are continually becoming available, using the IBM brand to enhance the OpenSource community. The release of a purpose-built technical product range that is at home as the mainstay for the majority of the world's mission-critical services just extends the reach of OpenSource environments.
With the entry-level LinuxONE system, codenamed Rockhopper, all of the above statements hold true. It allows for hybrid cloud environments as well as connectivity to on-site tech that can be managed in one place, and it can act as a hub for the security placement that is so desperately needed for cloud environments. It never fails to amaze me, the steps that IBM continues to make for OpenSource technologies and its commitment to advancing above and beyond the norm with its infrastructure (and I've been to most of these group meetings). If you have a Linux estate and it's mission-critical, you have to take a serious look at the IBM LinuxONE product range – and more fool you if you don't.

### New Year – New Innovations: Introducing ECS 2.2

Today EMC has announced general availability of the newest version of ECS (EMC Elastic Cloud Storage), v2.2. With the industry's most complete suite of software-defined storage (SDS) solutions, EMC is firmly committed to helping organisations harness today's digital transformation. A key part of EMC's SDS portfolio, ECS is a software-defined, object-based cloud storage platform that provides global, exabyte-level scale with compelling economics for customers.

Mark Zuckerberg of Facebook recently announced that his New Year's resolution is to build an artificial intelligence system that will be able to control his home – way more interesting than the traditional "lose weight" resolutions! The amount of data created and ingested by this "Iron Man"-influenced platform will definitely require a storage system with a modern, flexible and versatile architecture. You may not have the resources of a Tony Stark or Mark Zuckerberg, but for organisations trying to keep up with the blazing pace of digital transformation, ECS can help. ECS is built from the ground up to handle the massive burden generated by object data from mobile devices, social media and the Internet of Things (IoT), something many traditional storage platforms aren't designed to handle.

[quote_box_center]
Announcing ECS version 2.2. With new features under the hood, EMC ECS 2.2 delivers:
Software-Defined Storage: ECS can be deployed on certified hardware or as a turnkey appliance.
A multipurpose platform: With native support for multiple protocols (like AWS S3, OpenStack Swift and HDFS), ECS breaks down dedicated storage silos. With native NFS support, ECS can also support file data without the need for a file gateway.
Smart storage: With ECS, users can search metadata across exabytes of unstructured data without a dedicated database. This enables data analytics that can harvest valuable insights while greatly simplifying cloud and IoT application development.
Low Total Cost of Ownership (TCO): Today's release further lowers the storage overhead for cold archive scenarios. A new single-pane-of-glass view that provides complete system health can help customers to reduce operational costs.
Data Protection: With "data at rest" encryption support, users can apply protection to business-critical data. ECS also fully complies with SEC 17a-4(f) and CFTC 1.31(b)-(c) regulations.
[/quote_box_center]

Customers around the world are seeing great success with the ECS platform. Case in point: ATOS SE, a leader in digital services, has seen exciting results with ECS. ATOS' 93,000 employees in 72 countries provide tech and transactional services to clients in a variety of industries. In 2015, ATOS was able to transform its services using ECS. Check out their story in this video. The momentum doesn't end with customers.
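Since ECS exposes an S3-compatible API (as the protocol list above notes), existing S3 tooling can generally point at it directly. Here is a minimal sketch using boto3, with a hypothetical endpoint and placeholder credentials rather than real ECS account details:

```python
# Minimal sketch: talk to an S3-compatible object store (such as ECS)
# with standard S3 tooling. Endpoint URL and credentials are placeholders,
# and the bucket is assumed to exist already.
import boto3

s3 = boto3.client(
    "s3",
    endpoint_url="https://ecs.example.com",   # hypothetical ECS endpoint
    aws_access_key_id="OBJECT_USER_ID",       # placeholder credential
    aws_secret_access_key="OBJECT_USER_KEY",  # placeholder credential
)

BUCKET = "iot-telemetry"

# Store an object and read back the bucket listing, exactly as you would
# against any other S3-compatible service.
s3.put_object(Bucket=BUCKET, Key="device-42/2016-01-29.json", Body=b'{"temp": 21.5}')
for obj in s3.list_objects_v2(Bucket=BUCKET).get("Contents", []):
    print(obj["Key"], obj["Size"])
```

The practical benefit is that applications written against S3 do not need to change when the objects live in a private or hybrid deployment instead of a public cloud.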
ECS is compatible with a growing number of ISV solutions, such as Hortonworks, and has been certified on four levels of HDP. Check out what Hortonworks VP of Global Channels and Alliances, Chris Sullivan, has to say about ECS: “After rigorously testing and certifying Hortonworks Data Platform with EMC ECS, we are saving customers time to implement the solution, while providing them with an assurance of interoperability. With ECS, customers can benefit from a software-defined object storage system with HDP for analytics and unlocking business insights.” Want to learn more? Check out the ECS homepage, follow @EMCECS on Twitter or test out the ECS Free + Frictionless edition software.

### Making the leap onto the cloud

I was speaking to a few managers and directors recently at a networking event, and after making initial introductions and mentioning my line of work, they started saying how all they seemed to hear about nowadays was the cloud, whether in business headlines, trade headlines or online campaigns. They were reading about it every day and everywhere. It's clear that there is an increasing buzz from organisations of all shapes and sizes about the varying possibilities and advantages that the cloud can give businesses. [easy-tweet tweet="The #Cloud buzz is building says @TheRealFahdKhan of @zsahLTD" user="comparethecloud"]

Although many companies are now realising the multiple advantages of cloud technology, there are still many businesses who have uncertainty and doubts over moving their IT infrastructure to the cloud, as well as an element of fear related to making such a change within the company. Even some IT veterans can be overwhelmed by the numerous cloud providers in the industry and the huge variety of services and advantageous features each company provides. Whether a company wants to move to the cloud to help drive its business growth, cut hardware costs or increase accessibility for its employees, there are still many questions that I've experienced organisations asking when considering making the leap. Some of these questions are left unanswered, and IT teams can often be left without a guiding hand after moving their infrastructure over to the cloud.

[quote_box_center]
There are many questions I've answered relating to cloud technology in the past, but the questions that keep coming up again and again in conversations include:
How does migrating my company's infrastructure to the cloud improve my business productivity?
Our business's main focus is its customers; will they see any positive change or benefit from making the move?
How will I stay ahead of my competition?
Apart from scalability and reduced costs, what else can the cloud do for my business?
Will the cloud help resolve my business continuity challenges?
[/quote_box_center]

All of these are valid and important questions to ask in order to gain a better understanding of what the cloud can offer your business, how your company will benefit from the virtualisation of its infrastructure, and what this means for the future of your customers and of your company. These questions provide a basic understanding from a business perspective to help you make a calculated and informed decision before you make the inevitable move. We will explore the common questions asked above, which should provide you with a better understanding and give you the initial confidence to make the leap onto the cloud!
How does migrating my company's infrastructure to the cloud improve my business productivity? As with many aspects of the cloud, there are several beneficial points to take into account. When you move your IT infrastructure to the cloud you are virtualising your servers – depending on the cloud option you choose, some more so than others. The cloud offers the ease of access and flexibility that can help future-proof your business by providing accessibility to a level that you will not have experienced before. Whether you're having a coffee in Starbucks or enjoying a holiday abroad, the cloud strongly supports working remotely, and this allows your team to work from anywhere with internet access rather than being dependent on being in the office. This is especially beneficial in situations where a member of your team is unable to make it into work, or when workloads are high and deadlines need to be met: rather than long hours in the office, they can opt for a change of scene or the comfort of their own home and continue to work thanks to the accessibility that the cloud provides. When you are moving your IT infrastructure to the cloud you are virtualising your servers Lastly, in the past, when companies have been dealing with any expansion or changes to their servers, it could take days or weeks to set up a new server. With cloud services, not only are such changes cheaper and more affordable, but major changes to the servers can be made in minutes, as opposed to the lengthy, time-consuming processes that older, more traditional methods would require. Our business's main focus is its customers, will they see any positive change or benefit from making the move? An essential part of maintaining a good relationship and a high quality of service with your customers is to ensure that you are responsive and maintain a high level of service to all customers. There are a number of ways to do this; one is by meeting customers' changing requirements over time by updating and evolving technologies in line with current trends. Also, having minimal issues and a smooth, secure business supported by the cloud means your team can dedicate more time to customers and less time to internal issues. Cloud companies that provide a managed service, like zsah, can take care of minimising any potential issues your systems may face, creating more time for your team to dedicate its focus towards driving the business. How will I stay ahead of the competition? the cloud is everywhere As previously mentioned, the cloud is everywhere and it is the fastest-growing trend revolutionising businesses and the way we work. Businesses that are successful in any market are the ones that are ahead of the game and cutting edge, whether they are constantly developing their services or taking the company into new directions and new markets. One thing to remember is that they are ever-changing, and the key to this is to be, and always remain, open to change. In today's market, for any forward-thinking company it's not a case of "should we move onto the cloud" but rather of ensuring they have taken the right steps and made the best choices for "when" they make the move. By migrating your IT infrastructure onto the cloud you make sure your organisation is keeping up with essential game-changing trends in the market – and it does not stop there, considering the numerous ongoing benefits of moving onto the cloud.
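To make the earlier point about provisioning concrete: where racking and configuring a physical server can take days or weeks, a cloud server is typically one API call away. Below is a minimal, illustrative sketch using AWS as one example provider – the AMI ID is a placeholder, credentials are assumed to be configured already, and nothing here is a recommendation of a particular provider or instance type:

```python
import boto3

# Assumes AWS credentials are already configured (e.g. via environment variables).
ec2 = boto3.client("ec2", region_name="eu-west-1")

# Launch one small virtual server; the equivalent physical change could take weeks.
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",  # placeholder AMI ID
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
)

print("Launched instance:", response["Instances"][0]["InstanceId"])
```

Scaling back down again is just as quick, which is what makes the "minutes rather than weeks" comparison realistic.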
[easy-tweet tweet="By migrating #IT infrastructure on to the #cloud you are keeping up with essential game changing trends"] Apart from scalability and reduced costs what else can the cloud do for my business? One of the key factors why businesses decide to make the transition on to the cloud is due to the reduced costs, scalability and business agility it provides, but apart from these factors there are many others. A well-established cloud infrastructure provides dynamic storage, processing and memory. Most cloud services are typically pay as you go, so aside from initial Capex overall expenses are considerably low thus saving companies money in multiple areas from security to software updates and licensing costs. Increased security is another benefit that many companies worry about and this is an area that quality cloud providers spend vast sums of money to insure they have the highest security features to protect themselves and their clients from any potential security breaches. cloud computing is more eco-friendly An aspect that many companies do not take into account is that cloud computing is more eco-friendly.  Businesses on the cloud only use the server space they need which decreases their carbon footprint.  Using the cloud has resulted in at least 30% less energy consumption and carbon emissions than using on-site servers.  This is something SMEs get the most benefit from as the cut in energy use is even greater. Will the cloud help resolve my business continuity challenges? Business continuity is an absolute priority for any company, especially for web based businesses.  Of the many innovations in business cloud computing has provided, business continuity is one of the greatest with a range of services that help maintain continuity via Disaster recovery, mirrored servers and health checks. Time has proven that the next evolution for businesses is taking place right now, on the cloud.  Future proof your business and make the transition now. ### How being a SaaS company helps Guidebook to better serve its clients Buying event apps can be costly, confusing, and frustrating. The reason that Guidebook's business model is software as a service, is because technology changes so rapidly, and event organisers must always stay with (or ahead of) the curve. Paying someone to develop a one time app is more often than not a costly mistake that people make only once. [easy-tweet tweet="Hear about how being a #SaaS company works for @Guidebook" user="comparethecloud" usehashtags="no"] For instance, in the past 8 years, countless new device types and operating systems have been developed, design standards have changed substantially, and security requirements have become more complex. If an event organiser paid for a bespoke blackberry app in 2012, it would be useless to them today. On the other hand, if they had chosen to partner with Guidebook, and purchased the software as a service on an annual basis, their app would be at least as up-to-date today as it was in 2012. By developing a platform that we constantly maintain, analyse, and improve - we are better able to serve our clients with the exciting new features they expect. Networking, session registration, live polling, and many others are no longer 'nice-to-haves' but 'must-haves' - and all the better if they can be successfully deployed with no risk. All new features and improvements begin from detailed metrics analysis, or direct feedback (from users, clients, or staff). 
They are then deployed and tested on the Guidebook app, before being rolled out for ALL of our subscribers. Financial Positives of the SaaS model We regularly see companies desperately trying to hold on to their bespoke app, spending more and more money to keep it updated and relevant. Adding more and more new horseshoes to Buckaroo... before he eventually (and inevitably) kicks! Just like that old used car you're always pouring money into, a bespoke app is entirely dependent on two factors: the technology it was first built upon, and the skill and integrity of the mechanic. bespoke apps are entirely dependent on 2 factors, the technology it was first built upon, and the skill and integrity of the mechanic To be clear, for extremely large, one-off/annual events, there is certainly an argument to be made for choosing the bespoke option. It gives the organiser complete control over every aspect, and allows them to develop completely new and innovative features and break the mould. This can naturally be offset with a counter-argument that questions the need for a complicated, hyper-advanced app that the average user has no idea how to use. For companies that run multiple events of varying sizes, it is absolutely vital that they have a platform which can scale, and which they can manage themselves. As soon as you add the complication (and cost) of a third party to manage your app's content, you reduce all of the benefits an app can offer, such as flexibility, autonomy, lower costs and ubiquity. It's really a very simple formula: if deploying an event app is cheap and easy, this will equal more apps. More apps will increase user familiarity, as well as an organisation's understanding of how best to utilise them. Eventually this will create ubiquity, which will mean that anyone not deploying an app will be firmly behind the curve, and disappointing their users. Commercial Benefits of the SaaS Model [easy-tweet tweet="#SaaS businesses are able to constantly update to keep their customers happy " user="comparethecloud"] By consistently delighting clients with new features they have asked for, or even ones they have never thought of, SaaS businesses are able to retain their clients - and hopefully increase the value of all those relationships over time. In 2012, there wasn't an app on the market that offered networking, session registration, live polls, surveys, and detailed user metrics. And none of the above were added to the platform overnight. They have been added systematically during our company's development, and the cost per event has actually gone down for our clients. The SaaS model has produced the perfect storm here at Guidebook, where clients are successfully served with a better product year-on-year, for an extremely affordable price. Put simply, the quality of our offering grows at a vastly superior rate to our prices - which means constantly better value. The more consistent revenue that comes from being a SaaS company allows a much more long-term approach to business plans and product roadmaps - and Guidebook, its clients, and its users have all been beneficiaries of that. ### The march of Object Storage We have started to see a number of vendors make more positive noise towards object and virtualised storage by adding new functions or whole new products in the storage space. This led me to think - why?
[easy-tweet tweet="#ObjectStorage is the way of the future, @Anke_Philipp tells you why" user="comparethecloud"] Well it's simple, we are producing so much data today that the current setups will not hold together in the mid to long term. So they are having to address this today, and as they say, you need to look back to move forward. We are producing so much Data today that the current setups will not hold together Brief History of Data Storage in the last 15 years The way we store data has evolved immensely since the turn of the millennium. Fifteen years ago we had a lot less data, and it was mainly stored in a local or direct attached model. By the mid 2000s we started to see a move to scale-out storage driven by the need for greater data protection/DR, mixed environments, virtualisation etc. but there had been an increase to a lot more data and a lot of silos of storage. Now we are in the cloud era and it was this drive to cloud based services and storage - plus the hard learned lessons of the passed years - that has lead us to re-examine how we store our data. Add to this Big Data, Analytics, more Virtualisation, VDI, personal data and the fact we wish to access it 24/7, things are becoming hard to manage and control. It’s this drive to cloud based services and storage that is leading us to re-examine how we store so much data So why you ask: Object Storage? The biggest problem with traditional approaches is scalability. NAS lacks the ability to scale as a single system, especially in exabyte and larger environments we are seeing today. Today’s SANs are already complex when they are deployed with a file system layer on top. Scaling them out makes the problem a lot worse, and then there is the big one: cost. Prices are out of control; in some cases £1k per TB when it is £100 in PC world for the same basic disk! If this is enabling tier one vendors to run there own private jets, well then you have to look very closely at the whole storage story! [caption id="attachment_34263" align="alignright" width="300"] Join the #ObjectStorage #CloudChat on Twitter, Jan 29th[/caption] How is Object Storage different from NAS or SAN? Well first, Object Storage is not new, it certainly pre-dates Amazon’s S3. Object Storage is essentially just a different way of storing, organising and accessing data on disk. An Object Storage platform provides a storage infrastructure to store files with lots of metadata added to them. The back-end architecture of an object storage platform is designed to present all the storage nodes as one single pool. With Object Storage, there is no file system hierarchy. The architecture of the platform, and its new data protection schemes (vs. RAID, the de-facto data protection scheme for SAN and NAS), allow this pool to scale virtually to an unlimited size, while keeping the system simple to manage. Plus it's not limited to location so data can be spread worldwide or local - your choice. With Object Storage, there is no file system hierarchy Users access object storage through applications that typically use a REST API (an internet protocol, optimized for online applications). This makes object storage ideal for all online / cloud environments but not limited too as we are now starting to see new applications object aware. 
[easy-tweet tweet="REST API access makes #objectstorage ideal for all online / cloud environments" user="anke_philipp"] When objects are stored, an identifier is created to locate the object in the pool; applications can very quickly retrieve the right data for the users through the object identifier or by querying the metadata (information about the objects, like the name, when it was created, by who etc.). I think of it as the concierge service in a hotel, you give them your bag they give you a ticket the bag goes off to be stored safely, you come back handover your ticket and there’s your bag. This approach enables significantly faster access and much less overhead than locating a file through a traditional file system. Current generation of object storage platforms are designed with openness and flexibility in mind. The industry has learned some tough lessons from using proprietary systems. One initiative to prevent vendor lock-in, is SNIA’s cloud data management interface (CDMI). This is a set of pre-defined RESTful HTTP operations for assessing the capabilities of the cloud storage system, allocating and accessing containers and objects, managing users and groups, implementing access control, attaching metadata, billing etc. [quote_box_center] Object Storage Summary Data is stored as objects in one large, scalable pool of storage. Objects are stored with metadata – information about the object. An Object ID is stored, to locate the data. REST (API) is the standard interface, simple commands used by applications. Objects are immutable; edits are saved as a new object. [/quote_box_center] So... will we see the march of object-based storage? I think YES! This is because we are going to produce so much more data which we will need store, but we will expect the costs to be lower. Added to this, with people becoming more open to their data being dispersed to different geographically spread data centres, and the law catching up with regards to data governance - it is only going to move one way and that is forward (at least until the new great leap in storage technology comes along). ### IBM Enhances LinuxONE for Hybrid Cloud Environments IBM today announced new technology features and collaborations for its LinuxONE family of high-performance Linux systems. The enhancements will enable organisations of all sizes to develop, deploy and manage applications for the cloud simply and efficiently with robust security. The updated IBM LinuxONE Rockhopper, the entry point into the portfolio, is now enabled with improved speed and processing power. The enhancements to the LinuxONE family of systems give clients the option to develop, deploy and manage applications for the cloud, simply and efficiently with robust security. (Credit: IBM) [easy-tweet tweet="#CloudNews: @IBMLinuxOne enhances for Ecosystem Growth and Refreshed Processing Power"] IBM LinuxONE gives clients the choice they desire with the consistency they need in the cloud with new enhancements: New Hybrid Cloud Capabilities – IBM is optimizing its Cloudant and StrongLoop technologies for LinuxONE. The new features will offer a highly scalable environment on Node.js, which enables developers to write applications for the server side using the language they prefer. Cloudant, an enterprise-grade fully managed NoSQL database, stores data in JSON format, common for mobile data, enabling users to save time by storing data natively in the system, without the need to first convert it to a different language. 
Ecosystem Growth – IBM is expanding supported software and capabilities for LinuxONE. IBM LinuxONE recently ported the Go programming language, which was developed by Google. Go is designed for building simple, reliable and efficient software, making it easier for developers to combine the software tools they know and love with the speed, security and scale offered by LinuxONE. IBM will begin contributing code to the Go community in the summer. Through new work with SUSE to collaborate on technologies in the OpenStack space, SUSE tools will be employed to manage public, private and hybrid clouds running on LinuxONE. Enhanced Systems – IBM is announcing refreshed versions of the LinuxONE family, which includes the Emperor and Rockhopper, to improve speed and processing power. In March, IBM Open Platform (IOP) will be available for the IBM LinuxONE portfolio at no cost. IOP represents a broad set of industry-standard Apache-based capabilities for analytics and big data. The components supported include Apache Spark, Apache HBase and more, as well as Apache Hadoop 2.7.1. Continuing its commitment to contributing back to the open source community, IBM has optimized the Open Managed Runtime project (OMR) for LinuxONE. This repurposes IBM's innovations in virtual machine technology for new dynamic scripting languages, infusing them with enterprise-grade strength. "IBM is strengthening its expansion into the open community, providing developers more choice and flexibility with LinuxONE," said Ross Mauri, general manager, IBM z Systems and LinuxONE. "The platform's broadened ecosystem and new hybrid cloud capabilities underscore the security, efficiency and performance that clients need, while delivering the flexibility and possibilities of open source they love." Clients using the platform benefit from its powerful right-time insights capabilities. The UK Met Office, a leader in climate and weather services for the public, business and government, uses LinuxONE to process transactions and run analytics, deriving insights more quickly than before to deliver critical information in real time. "It is essential for us to deliver services based on accurate data that paints a full picture, and do so as quickly as possible," said Graham Mallin, Executive Head of Technology, Met Office. "LinuxONE has enabled our organization to provide our services to clients based on weather and climate data faster, and today's announcement will enable us to go even further with this life-saving work." Following an intention announced last year, Canonical is offering its Ubuntu Linux distribution and cloud tool sets (Juju, MAAS, Landscape) to LinuxONE clients. With the addition of Ubuntu to the existing SUSE and Red Hat distributions, organizations now have a third distribution option for the LinuxONE system. LinuxONE, introduced in August 2015, is enabled for a broad range of popular open source and ISV tools including Apache Spark, Node.js, MongoDB, MariaDB, PostgreSQL, Chef and Docker. The new LinuxONE systems are currently scheduled for availability in March. For more information on the IBM LinuxONE Systems Portfolio, visit http://www.ibm.com/linuxone and join the conversation on Twitter using @IBMLinuxONE and #LinuxONE. ### GDPR is coming and businesses need to prepare The General Data Protection Regulation (GDPR) is expected to be deployed in 2017, tying together the principles of data protection within every nation in the EU.
As we well know, GDPR has been devised to address the changing ways in which companies function in the modern world. This will be done by tackling concerns surrounding the protection of personal data on social networking sites, as well as data stored and transferred in the public Cloud. But how will these changes affect those of us on the ground? The introduction of penalties of up to 4% of global annual turnover and the obligation to report data leaks are sure to have a significant impact on the way companies approach data protection. [easy-tweet tweet="Concerns about protection of #data stored and transferred in the public #cloud are prominent"] In the absence of a crystal ball, here is a view of what companies can do to ensure that they are protecting the sensitive information they hold about their employees and customers and avoid being hit by new data protection penalties. Whether your business is based in Europe, or you are a non-EU business with data traded or stored inside Europe, it is probable that the upcoming European data protection regulations will change how you deal with your staff and client data. Considering international collaboration The recent growth in globalisation has given birth to a borderless corporate philosophy where data can immediately be shared between different countries and devices. Our means of storing and distributing data have dramatically evolved in the last ten years and it is imperative that the regulations around protecting data are evaluated and altered to mirror this; a business's firewall cannot defend sensitive data when it is shared with third parties. Our means of storing and distributing data have dramatically evolved in the last ten years One of the major changes in the GDPR is the expansion of the territorial scope of the laws to include not only companies that are established in the EU, but also those that are based elsewhere yet process the personal data of people residing in the EU. Now, many organisations that were outside the scope of application will be directly subject to the requirements. This development brings the important question of data residency squarely into the limelight. Now more than ever, EU-based businesses and individuals are questioning whether their data is being handled and stored in EU-based data centres. Under GDPR, businesses will have to ensure that the information stored in their data centres never leaves the country-specific legal area without authorisation. [easy-tweet tweet="The ECJ believes that #data from Europe is not always safe when stored overseas" hashtags="infosec, datasecurity"] The recently invalidated Safe Harbour Agreement between Europe and the United States shows that even the European Court of Justice (ECJ) believes that data from Europe is not always safe when stored overseas. Although customers will likely be able to approve the transmission and processing of their data on both sides of the Atlantic, businesses currently run the risk of violating European privacy laws and allowing business-critical information to fall into the wrong hands if they store data outside the EU. Dealing with sensitive information – what is the right approach? Whether your business has 50 or 5,000 employees, it is likely that you deal with a substantial amount of sensitive client and customer data, be that contact details, social media activity or professional and personal records.
Especially when considering the increase in Cloud services, this sensitive information is now likely to be stored outside of the business itself, as well as internally. With GDPR around the corner, it is imperative that businesses evaluate the way they gather, categorise, store, distribute and defend the data they acquire. With GDPR around the corner, it is imperative that businesses evaluate the way they gather, categorise, store, distribute and defend the data they acquire Under the new regime, the definition of 'personal data' is expected to broaden, bringing many more types of information into the regulated perimeter. Businesses will also have to make sure that this far wider scope of relevant data is secured by adopting modern encryption methods and other technical safeguards such as rights management, two-factor authentication, operator shielding and full audit trails. While this may sound like a significant burden to businesses, many data-handling solution providers are already offering data protection by "default", meaning that products and services are automatically provisioned with the highest level of privacy. Information is a valuable asset to businesses and one that needs to be effectively guarded to protect customer information and trade secrets. However, it also needs to be communicated and shared with third parties. There is a fine balance between operating effectively with the right access to data, while also protecting the privacy of the data subjects and complying with regulations. Simple and secure technical measures need to be put in place to ensure this balance is possible. [easy-tweet tweet="Information is a valuable asset to businesses and one that needs to be effectively guarded" hashtags="infosec"] Straightforward, but secure collaboration In this day and age, a large proportion of businesses are required to work with external parties in order to stay current, but they must similarly meet data protection regulations if they are going to protect client information and their own reputation. By introducing an easy-to-implement and easy-to-use storage and collaboration technology, businesses will find it much more straightforward to achieve both requirements of modern business. When sharing data with third parties, wherever they are located, businesses should consider adopting a collaboration platform that is simple for employees to use and supports them in their everyday work, so it is perceived as a helpful tool rather than a hindrance. [quote_box_center] To make it easier for your business to protect customers' data and comply with GDPR, you should opt for a collaboration platform that: Protects data transmission and storage by cryptographic means Has strong authentication measures to ensure only authorised users can access data Allows you to tailor users' access rights and modify what they can do with a document Provides a tamperproof audit trail, enabling traceable and transparent insight into how documents are being used and edited Is accessible securely when employees are travelling abroad or are out of the office [/quote_box_center] If you haven't already done so, now is the best time to start preparing your business for the full implementation of GDPR in 2017. By reviewing the way your company collects, stores and shares data with these new data protection regulations in mind, you will be able to ensure your ongoing compliance and avoid devastating fines and reputation damage in the future.
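As one hedged illustration of "protecting data by cryptographic means", the sketch below encrypts a record before it is handed to any storage or collaboration service, using the widely used Python cryptography library. It is a minimal example of at-rest encryption only – key management, access rights and audit trails, as listed above, still need their own controls – and the record contents are invented purely for illustration.

```python
from cryptography.fernet import Fernet

# In practice the key would come from a key management service, never from source code.
key = Fernet.generate_key()
cipher = Fernet(key)

# A hypothetical customer record that would fall under GDPR's broadened 'personal data'.
record = b'{"name": "Jane Doe", "email": "jane@example.com"}'

# Encrypt before the data leaves your control...
token = cipher.encrypt(record)
# ...so whatever storage or collaboration platform holds it sees only ciphertext.

# Decrypt only for authorised users who hold the key.
print(cipher.decrypt(token))
```

The point is the ordering: the data is protected before it crosses the business's boundary, which is exactly the balance between sharing and safeguarding described above.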
### Application rationalisation: Four steps to make it a resolution that lasts Many of you will have started 2016 with a resolution in place; maybe getting the exercise bike out of the garage and committing to being more active, or cutting out chocolate and cigarettes. These are great things to commit to, but have you considered that you should probably be doing the same for your business too? IT and cloud systems can also benefit from being a bit leaner, more effective and fine-tuned, which is why application rationalisation should be a consideration. [easy-tweet tweet="Have you begun your process of application rationalisation for 2016 yet? " user="comparethecloud" usehashtags="no"] Application rationalisation – or the process of evaluating applications to see which ones are needed and which can be discarded – is something you may well already have had conversations about. After all, who doesn't want to tap into hidden cost savings or improved business performance? The problem is that such initiatives are often doomed to failure from the start due to poor organisation and a failure to put a set of processes in place. To prevent this from happening, here is a four-step plan to follow for application rationalisation success: Think about the details – not just the big picture As with your resolutions, the best way to achieve success is to know how you're going to do it before you get started. It's no good saying you're going to lose weight, stop smoking and get fit all at once; you need specific targets, timeframes, and plans for how you will get there. Similarly, if you want to wean yourself off unnecessary applications, you need to think long and hard about your goals for both the short and long term. Is it cost reduction, or are you looking to consolidate which applications you are using to simplify processes, for example? The key is to set a goal right at the beginning; aiming to achieve everything at the same time is often a fast track to achieving nothing. Once you have this fixed in your mind, it's time to put together a realistic plan and timeline for how you're going to achieve it. Build up your support network Once you know what you want to achieve, you need to work out how you are going to succeed. To do so, you need to put together a team of people from across the business to examine which applications are needed and valued. This will help to determine which ones can be rationalised or updated. Having a cross-departmental team representing different areas of the business – from HR to Finance, Sales and Marketing to Operations – will enable you to learn how applications are used between and across teams. put together a team of people from across the business to examine which applications are needed and valued Are your applications working together or operating in silos? With a team of stakeholders in place, you need to look at your applications in detail. How are they working together? If certain applications are only being used by one team, are they needed? Are they supporting a single project? Can this be achieved elsewhere? Do you have two applications (or more) doing the same job? Asking these types of questions will help to identify duplication, or where teams are using different tools to provide the same function. This will be critical when it comes to selecting what stays, what goes and what can be merged. Application rationalisation isn't just a New Year's resolution While conducting your application rationalisation, keep an eye open for insights to be used for the future.
Though you may decide that some applications are more necessary than others, it is good practice to note where risk might reside, or how teams can work together more effectively by granting access to certain applications across teams. Above and beyond this, it's an opportunity to put in place continuous lifecycle management for all applications, so you can revisit which ones you might need or not need in 3 / 6 / 12 / 24 months' time. It's important that the business as a whole decides whether it needs to update, change or remove applications, rather than just individual teams. For instance, you might find that an updated operations application replaces something used by the finance team, or vice-versa. [easy-tweet tweet="Continuous lifecycle management for all applications is important for #IT health" user="comparethecloud"] Modern business relies upon a variety of specific applications to function, but some have overstayed their welcome and are undoubtedly holding back organisations, costing time and money. Application rationalisation is important, but it needs to be carried out with clear goals in mind to guarantee long-term success. Planning will help to remove unnecessary applications, which will help to improve long-term processes. Sharpening IT systems is a great resolution to make, just don't give it up. ### PTC's Augmented Reality 'ThingEvent' 28th January Date: January 28, 2016 Time: 9:00 AM EST and 3:00 PM EST Registration: www.thingevent.com Later this week, on January 28, PTC will host its exclusive ThingEvent, focused on augmented reality (AR) in the enterprise. This livestream event will highlight real-world customer solutions from leading manufacturing companies as well as new technologies that will transform how humans interact with and experience the physical 'things' around them in the era of the Internet of Things (IoT). By seamlessly overlaying digital data onto the physical world – and providing a rich, visual, interactive experience – augmented reality can fundamentally improve and transform the way we design, manufacture, operate, and service smart, connected products and systems. ThingEvent attendees will see live demonstrations of practical uses of augmented reality for business, presented by PTC executives and representatives from leading manufacturing companies. These PTC customers will share the benefits of presenting enterprise information and live streaming sensor data through augmented reality, enabling deeper insight into product behavior and usage, better execution of service procedures, more comprehensive training of staff, as well as many other value propositions that are driven by various use cases. The event program will feature PTC's CEO Jim Heppelmann, co-author of recent thought leadership articles published in the Harvard Business Review, who will explain how augmented reality will play a primary role in PTC's IoT platform. The program will also include Jay Wright, PTC SVP and General Manager for Vuforia, who will discuss the expansion of the Vuforia platform to enable new enterprise solutions in addition to consumer-facing applications. Recently acquired by PTC, Vuforia is the world's most widely used augmented reality platform. [quote_box_center] PTC invites you to join a worldwide audience of thousands via live streaming of this exciting event: "The incredible opportunities presented by the IoT give rise to the need for a whole new generation of enabling technologies," said Jim Heppelmann, president and CEO, PTC.
"We hope you can join us on January 28 for this special event. Be there to witness the vision, hear the strategy, and see the unveiling of a new suite of products, coming later in 2016, that will harness the power of augmented reality and transform forever how we experience things." [/quote_box_center] Social Media ThingEvent presenters will interact with the global livestream audience via social media. Attendees should follow and use the hashtag #thingevent to post comments and questions. Be certain to also follow @PTC, @ThingWorx and @Vuforia_AR for the latest news, events and general happenings across PTC. ### When Enterprise and Social Collide: the future of collaboration The way that we all collaborate at work is changing, just as the way that we all collaborate with our friends already has, and it's all about social media. Like many of my colleagues I use different social platforms in different ways: for me Facebook is for Friends and Family, Twitter is for Work and Arsenal, and LinkedIn is for Work and Contacts. We all strike a slightly different balance, but as a rule most of us are now quite comfortable using social media for public collaboration, whether work or personal. [easy-tweet tweet="Email has had its day, we all overloaded, slaves to our inboxes, Enter #social #collaboration tools" user="billmew"] There is another side to collaboration however, the less public kind – for example the collaboration that occurs behind closed doors at work, and that, in most organisations, is still done via email. Many believe that email has had its day: not only are we all overloaded, slaves to our inboxes, but in future we will all use social tools for collaboration at work instead. To this end companies like IBM, Microsoft and Jive have for years now been selling and enhancing their tools for workplace social collaboration. These are like Facebook, but for internal use. If IDC is to be believed, then IBM is the leader in this space: it has named IBM its worldwide market-share leader in enterprise social software for the fifth straight year. Gartner's latest Magic Quadrant on workplace social software also had IBM as a leader, along with Jive, Microsoft and Salesforce. The Forrester Wave for social depth platforms had Lithium, Jive, Acquia and Telligent as its leaders. Meanwhile IBM, EMC and Box were the leaders in Forrester's File Sync and Share Wave. Enter the real social giant, Facebook… In January 2015 Facebook launched a pilot version of its Facebook at Work tool. It uses familiar Facebook features like the news feed, groups, messages and events, but has been designed purely for use within individual companies. Employees' information is not accessible to the outside world, and it is kept separate from their personal Facebook profiles. The challenge for all social collaboration tools is adoption The challenge for all such tools is adoption. Employees won't move away from email until they have an alternative that is not only easy to use, but also provides immediate and obvious advantages. It is also a cultural thing. Unless you can change the culture of a firm and get enough staff to use a new tool, it won't be of any value. After all, in collaboration terms it's no use having a phone if there's hardly anyone else to call. The traditional enterprise giants have the advantage of knowing the enterprise computing market, having an installed base and being able to integrate with their other corporate applications.
Facebook, however, has the advantage that its tool is already familiar to almost everyone. It is also easy to use, secure and available on desktop and mobile devices, along with its complementary messaging tool, Work Chat. [easy-tweet tweet="In #collaboration terms it's no use having a phone if there's hardly anyone else to call" user="BillMew" hashtags="social"] [quote_box_center] Case study: iAdvize is a firm that provides a messaging platform enabling businesses to engage customers and prospects on their website and on social media from a single tool (Chat, Voice, Video). Its employees are only too familiar with social media and external collaboration. iAdvize was pre-selected to be one of the first companies in Europe to take part in the private beta of the 'Facebook at Work' solution before its official launch. In July 2015 it started using the solution, and it rapidly became the company's number one internal communications tool. More than 90% of its employees now use the business version of Facebook every day and generate an average of 1,000 interactions on a daily basis (posts, likes and comments). The Surveys feature is used to let employees gather their colleagues' feedback on different topics. The Watch group has become the best place to share articles, research and other resources about news and trends related to iAdvize's market and real-time customer engagement. As with Facebook, hashtags and tagging people optimise the impact of posts. New staff at iAdvize can find all the information about the integration process and can also view the profiles of their new colleagues to work out who's who! And the Events feature is used to get people together for a talk, a meeting or a corporate event. Key figures: More than 90% of employees use 'Facebook at Work' every day 75% of employees use the 'Facebook at Work' mobile app an average of 1,000 interactions a day 113 groups have been created since the launch iAdvize Team is the biggest group with 143 members the smallest is, unsurprisingly, The Matthieu group with 3 members on average, 5 events are created every month on 'Facebook at Work': international evenings, conferences, races, Movember events, the Christmas party and many more. Examples of active groups: Gong: where employees can share personal and team achievements with the rest of the company - a new client, a successful deployment, a product innovation, the launch of a communication campaign, etc. 1 day, 1 customer: where employees post short customer success stories with a meaningful figure or fact. The Genie's tips: where anyone can share good deals - a great restaurant, a baby-sitter, an idea for a gift, etc. Newbies: where newcomers can find all the necessary information to help them through the first days/weeks/months (integration process, practical questions, etc.). Product News: where you can find out about all product evolutions and innovations. The Genie Running team: where runners can organise lunchtime runs and plan group participation in official races. [/quote_box_center] In another recent blog, "Expansion: up, down, left, right, forward or back – which way is best?", we look at the six strategies that companies can use for expansion. The fifth of these, "climbing the product stack", requires you not only to understand your own clients, but to understand their clients as well, which is not as easy as it sounds. Google tried to do this when it launched its own social platform, and we all know Google+ wasn't the success that had been hoped for.
If Facebook gets things right with Facebook at Work, as it appears to be doing, then the corporate technology players in the market will need to look out. [easy-tweet tweet="Corporate technology players in the market will need to look out if #FacebookatWork is as good as it seems" user="billmew"] ### Time to quash the cloud myths Gartner had it right when it said 'cloud computing, by its very nature, is vulnerable to the risks of myths'. Even in 2016, a quick Google search highlights that enterprises are still unsure about the perceived risks that come hand in hand with cloud computing. [easy-tweet tweet="There is still a worrying amount of misinformation about #cloud" user="bilco105 @steamhausMCR"] Just a few of the top searches reveal a worrying lack of information surrounding the concept, with many still in the dark about 'what is cloud computing', 'what is meant by cloud services' and the 'risks of using cloud computing'. It's time to get rid of the fluffy stuff and dispel some of the myths when it comes to cloud. We've put together three of our favourites here. 1. ''I won't know where my data's being stored or if I'm being compliant'' It's great that people are starting to think about this – especially with changes to UK data protection rules coming in early this year as part of legislation being standardised across the EU. With data breaches potentially costing businesses as much as 4% of their annual turnover, it's a great time to be questioning the ins and outs of your data security. With data transfers within the EU – and those between the UK and US previously covered by 'Safe Harbour' – set to become more tightly controlled, it's important to make sure that you're asking the right questions. If your cloud provider is worth its salt, it'll make sure you know exactly where your data is located and whether you're staying within the law, wherever you're operating. [easy-tweet tweet="Data breaches potentially cost businesses as much as 4% of their annual turnover" via="bilco105" hashtags="data, security, infosec"] 2. ''Sharing hardware and datacentre facilities with other customers is a risk to my data'' This isn't true if you pick the right partner. Using a platform such as Amazon's AWS, for example, means you can benefit from the company's global expertise and rest assured that its technology is subject to the most rigorous tests and audits. The sheer size of the platform means Amazon can make significant investments in all elements of security. What that means for businesses is that it's often more secure than on-premise hardware and can provide better datacentre security isolation than having a dedicated infrastructure. And, for the ultra security-conscious amongst us, you can also look to dedicated hardware with full isolation. It's not that sharing hardware and datacentre facilities is never a risk – but with platforms like AWS, it doesn't have to be. Take public cloud, for example – could other users or provider staff access the data stored on its platforms? Numerous policies exist to ensure that this is strictly forbidden. Add in a number of 'at-rest' encryption offers for data, and you've got a platform that's about as secure as it can be. 3. ''Being in control of my own infrastructure will always make it more secure'' Adam Selipsky, a VP at AWS, once said: "People think if they can control it they have more say in how things go. It's like being in a car versus an aeroplane – you're much safer in a plane." The very same perception exists in managing cloud storage infrastructure.
being in full control doesn't necessarily mean you're safer from IT disasters This opinion is typically held by CIOs in enterprise organisations, whose responsibility for applications and software naturally prompts them to think that keeping in-house control over their cloud computing will put them in a better position to secure their data. But in reality, relinquishing the management of that data and allowing a partner to store it for you means you'll be embracing their huge level of expertise and their ability to meet tough compliance requirements, while gaining a higher level of availability and automation of services. And that's all while retaining control of your data. Upcoming changes in legislation, paired with a growing awareness of how cloud computing can tackle security concerns, are helping to redefine perceptions and debunk the cloud myths that are holding businesses back from embracing it. ### Expansion: up, down, left, right, forward or back – which way is best? So you've built a successful business doing what you do – you do it well and your business is going well, but you want to do better, so what's the best move? As with a spin of the dice there are six options: [easy-tweet tweet="Roll the dice with @BillMew and look at where expansion should be happening" user="comparethecloud"] 1. A step to the left: expand into new markets Going global, or at least entering new neighbouring markets, is an obvious option, but routes to market can be a challenge (depending on your type of business). ISPs or MSPs can offer services across borders, but sales, marketing and support need to be market-focused. One answer is to work with a distributor (such as Arrow) that can help you enter and exploit new markets. The main focus here is to serve more potential clients and grow revenues while, if possible, achieving economies of scale. 2. A step to the right: consolidate into local market niches A less obvious step is to focus on fewer markets and try to specialise more in order to drive up margins. It might once have been possible for a jack-of-all-trades to make a good living, but it is becoming increasingly important to focus in order to develop an enhanced value proposition to a particular market segment. It might once have been possible for a jack-of-all-trades to make a good living, but it is becoming increasingly important to focus If you specialise in solutions for legal practices, understand their value chain from end to end, have skills in all the specialist software packages that they use and have developed your own IP (templates, modules, methodologies or packages) for this market, then you can earn a far higher margin as a specialist than a generalist would, and you'd be far more likely to retain and win business. Initially you focus on margin growth, but once you earn a good reputation in the niche market you may experience revenue growth also. 3. A step forward: expanding your product portfolio There may be new products or services that you can develop or resell that complement what you already do. You don't even have to do anything extra yourself – investing time in developing partner networks can enable a set of smaller firms (each specialist in what they do) to work together as a team of best-of-breed players that can challenge the bigger players for larger deals. For example, a communications specialist like GTT, which doesn't offer cloud services and so doesn't compete with its partners, might be able to add an extra dimension to any proposition.
Add a few more such players and you soon get a dream team. [easy-tweet tweet="Investing time in developing partner networks can enable a set of smaller firms to be best of breed players" user="billmew"] 4. A step back: consolidating your product portfolio Just as there is an opportunity to enhance margins by specialising in a particular market niche, there is also an opportunity to become a product specialist. There are risks in doing so – Amazon has crushed numerous specialist retailers (starting with bookshops) by entering their markets and undercutting them. The only protection here is to develop differentiation in any product niche through IP. The problem is that in the cloud era it is easier than ever to copy what others do. 5. A step up: climbing the product stack Many firms sell to a market, believe that they understand what that market sells into, and think that they could move up the stack by entering it. Oracle understood databases and thought that entering the market for applications that sit on top of databases would be straightforward. Years of effort and billions spent acquiring Siebel and PeopleSoft have only generated limited success here. The problem is that you may understand your own clients, but to enter their market you'd need to understand their clients as well, which is not as easy as it sounds. 6. A step down: descending the product stack Often it is easier to descend the product stack. Apple has a market-winning smartphone, but struggled to climb the product stack to offer a mapping app. Instead Apple has had more success descending the stack to develop its own components, such as processors. Being its own client, it had no problem understanding its own needs. Expansion doesn't necessarily need to mean revenue expansion Expansion doesn't necessarily need to mean revenue expansion. Some of these strategies are more focused on margin expansion. Indeed, for many of the VARs, MSPs and other channel players that we meet, the second option, consolidating into local market niches, is the best tactic for long-term survival. We're not suggesting you base your strategy on a roll of the dice. You need to be focused on whatever strategy is right for your business, but sitting still and doing nothing is not a viable option. This is especially true if, as many small firms find, you have evolved over time into a jack-of-all-trades with a scattering of clients across a number of market segments – with little in common between all of your clients. Continuing to serve a broad set of markets and a broad set of requirements will become increasingly difficult, and you will lose out to the specialists described in option two above. Combining this with the development of partner networks described in option three might also help. For many small tech firms, rolling a 2 might well be the best result they could hope for. ### The evolution of the second wave of cloud services According to Cisco we're in a second wave of cloud adoption, with the number of enterprises moving to the cloud through hybrid and full solutions fast rising. Eighty-four per cent are using some form of cloud, compared to 48% in 2010, according to the Cloud Industry Forum. The early adopters tended to be cutting-edge and smaller companies, attracted to innovative new technology or the ease of no longer having to manage infrastructure. Larger organisations as a breed tend to be more cautious, so mass adoption among these has only more recently become the norm.
[easy-tweet tweet="84% of enterprises are now using #Cloud in some way" user="comparethecloud"] As bigger businesses have made the move the conversations around cloud computing have changed: no longer debating the benefits of the cloud, but taking a more strategic approach. In the early days cost was consistently touted as the key driver, according to a recent Harvard Business Review Analyst Services report collaboration and business agility are now the top motivators. With cloud services such as Dropbox and iCloud a daily part of the tech savvy consumers’ life, a demand has been created for the same level of speed, convenience and efficiency in the business world. As a result, cloud services are rapidly evolving to support enterprises and take advantage of this current demand. Business collaboration services in particular have grown in prominence, the fast growth of new platforms like Slack, alongside more established tools like SharePoint and Google Drive is evidence for this. Introducing a collaborative element to an enterprise workforce can significantly increase efficiency by encouraging more teams to work together, benefiting from others’ knowledge and expertise. central IT and business units are increasingly making decisions together As the cloud has evolved, central IT and business units are increasingly making decisions together. Allowing business operations to move quicker, departments to work more cooperatively and easing and automating processes, the cloud has given business leaders more information, and more time to make better informed business decisions. Consequently, more varied enterprise level services alongside more effective and faster reporting services are emerging to help businesses gain this value. This is one particular area that we’ve seen a lot of uptake and development in recently. Microsoft for example has blogged extensively on creating reports and scorecards for SharePoint. Auditing and reporting tools are enabling enterprises to gain insight into a number areas of IT software, from migration milestones to unused services and licenses. By understanding where tools are going unused, organisations are able to better control software spending. Auditing tools are seeing significant uptake because they have the ability to make organisations more efficient and focus spending in necessary areas. [easy-tweet tweet="Auditing and reporting tools are enabling enterprises to gain insight into #IT #software" user="comparethecloud"] The continued growth in the number of enterprises adopting cloud services is only going to produce more products and drive further innovation. A key part of demand has, and will continue to be a driver for even greater efficiency, to make better business decisions and get things done quicker. Growth among enterprise tools is enabling big businesses to adopt and benefit from the same levels of efficiency, agility and innovation as the most disruptive startups, which can only be a positive thing. ### SC CONGRESS LONDON 10 February 2016 |8:30 am – 6:30 pm ILEC Conference Centre 47 Lillie Rd | London SW6 1UD  SC Congress London 2016 is already in the works! Join us on 10 February 2016 for superior programming, a robust Exhibit Hall featuring today's leading brands in IT security and an opportunity to network with hundreds of your cyber-security peers and leaders.   ### Video Conferencing with Lifesize On Monday 11th January I had the pleasure of video conferencing with Craig Malloy, CEO and Tiffany Nels, CMO of Lifesize. 
Lifesize is a video collaboration and digital meetings company that specialises in the development of software deployed across a cloud platform to provide reliable, high-quality video conferencing solutions. [easy-tweet tweet="Read the review of @LifesizeHD's tech by @KateWright24" hashtags="videoconferencing, comms"] Headquartered in Austin, Texas, the company was established in 2003 and bought by Logitech in 2009. Over the years, Lifesize's video conferencing software complemented Logitech's hardware. Whilst Logitech's focus has continued along the hardware and retail route, Lifesize's vision has evolved towards a SaaS and B2B focus. As a result, on December 28th Lifesize officially split from Logitech, with the backing of three VC firms: Redpoint Ventures, Sutter Hill Ventures and Meritech Capital Partners. As Lifesize enters a new phase, I was interested in talking with the company about their focus for 2016, and I had the added advantage of doing this over Lifesize's own video conferencing platform. Looking at the product offering itself, I have to say I was extremely impressed. I often find myself dreading video conferences: the awkwardness of lag causing unintended interruptions, the need to repeat yourself due to audio cut-outs, and the other person looking like a stilted, blurry version of themselves. However, I was pleasantly surprised by my experience using Lifesize's platform. To begin with, there was no annoying download for a plug-in that would sit on my computer unused (a pet peeve of mine). Once 'on air', Craig and Tiffany were remarkably clear on screen, and the interview flowed well throughout, with no lag or buffering. I was pleasantly surprised by my experience using Lifesize's platform During the course of the interview, I quizzed Craig and Tiffany on the challenges that the industry is facing and how to counteract them. As a product that will frequently be used to connect people on a global level, it has been essential for Lifesize to understand the difficulties faced in different areas of the world. Craig commented on the highly regulated industries in the USA, the issue of data sovereignty in Germany, and bandwidth and pricing challenges in emerging markets. To achieve the high level of connectivity needed for their product offering, Craig said there is a need to be flexible in adapting to the hurdles that different global markets bring. [easy-tweet tweet="There is necessity to be flexible to adapt to the hurdles that different global markets bring" user="katewright24"] This flexibility is apparent in the two models they offer: 'Land and expand', in which organisations can scale the Lifesize services up (or down) as needed, or 'Top-down', better suited to large corporations, where the platform can be deployed broadly across the organisation. Providing different deployment models, coupled with a high-quality product, has supported Lifesize in building their customer base right across the spectrum, from healthcare to finance to the government sector and more. Moving into 2016, after a year dominated by restructuring, Craig is looking forward to a year focused on: building customer awareness, development of the Lifesize team, and increasing their global brand awareness. In my opinion, you can never improve on face-to-face contact with a person, but when a plan B is needed, Lifesize's software is a great option. ### Cloud analytics and small business – the perfect match Small businesses are generally much more willing to use the cloud than they were a number of years ago.
Yet SME take-up of cloud analytics is still not that common – why is this and how can providers make the most of this opportunity? [easy-tweet tweet="#SME take-up of #cloud analytics is still not that common" user="comparethecloud"] Use of cloud services by small businesses has been slower than one might have expected, given the cloud's attributes – flexible pricing, easy access, speed of roll-out, scalability – and its suitability for those small businesses. However, over the past five years or so, (mis)perceptions about the security and reliability of cloud services have been turned around and it is unsurprising to learn that a majority of small businesses now use cloud services of one form or another. However, while this is encouraging, many of those services are merely cloud versions of software the small businesses were using before – word processing, file storage, that kind of thing. That's not to say that this is without benefit, but cloud analytics remains a relatively untapped resource. These are services that can add immense value to a small business without them really doing any of the heavy lifting themselves. Yet small business take-up of cloud analytics is low. cloud analytics remains a relatively untapped resource A step into the unknown? Part of the reason that small businesses have been relatively reluctant to use cloud analytics is a general lack of awareness. I run a fintech start-up, and analytics is an integral part of our small business credit-checking SaaS, CreditHQ. However, we work with 27,000 small businesses, comprising hundreds of different business types, and many of these are simply unaware of what cloud analytics services there are, and how these might help their business. People can get too involved with the day-to-day running of a business to remain up to speed with all the latest technology and platforms available. The founder / manager of a small business may well also be responsible for marketing, HR, IT and other disciplines, so keeping track of these services is just one of thousands of items on the 'to-do' list. There is also a fear of the unknown – to the uninitiated, even the term 'cloud analytics' can sound overly techie. Throw in other terms like 'big data' and it is easy for business owners to assume such things are not really for them. [easy-tweet tweet="To the uninitiated, even the term #cloudanalytics can sound overly techie" user="comparethecloud"] Insight-packed data But the reality is, there are many cloud analytics providers out there that will do nearly all of the work on behalf of a small business. In our case, we know that trying to understand a complex set of data about every customer is the last thing a busy small business needs, so CreditHQ's Insight Engine looks at this information and decides what's important, suggesting how businesses can respond when dealing with different types of customer or different circumstances. Many questions can be answered with relatively few data sets; it just requires smart analysis The same is true for all manner of other services, from sales forecasts to recruitment, and it is not even the case that a small business needs its own big data. Many questions can be answered with relatively few data sets; it just requires smart analysis and extrapolation of insight. And there are many options for doing this; it just takes a little research to find the best option for a particular business.
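To make that concrete, here is a minimal sketch, in Python with pandas, of the sort of 'smart analysis on a small data set' described above. The file name, column names and payment thresholds are illustrative assumptions rather than part of any particular product.

```python
import pandas as pd

# A few hundred invoice rows are plenty for a first pass at segmenting
# customers by payment behaviour. Columns assumed: customer, amount, days_to_pay.
invoices = pd.read_csv("invoices.csv")

profile = invoices.groupby("customer").agg(
    avg_days_to_pay=("days_to_pay", "mean"),
    total_billed=("amount", "sum"),
    invoice_count=("amount", "count"),
)

# Simple, explainable segmentation rather than 'big data' machinery.
profile["segment"] = pd.cut(
    profile["avg_days_to_pay"],
    bins=[0, 30, 60, float("inf")],
    labels=["prompt", "slow", "at risk"],
)

# Which slow-paying customers matter most to cash flow?
print(
    profile[profile["segment"] != "prompt"]
    .sort_values("total_billed", ascending=False)
    .head(10)
)
```

Even a rough cut like this turns a flat list of invoices into a prioritised 'who to chase first' view – the kind of insight a cloud analytics service surfaces automatically, without the business owner writing any code at all.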
If there is resource within the small business, and it's a small volume of data being analysed, then a spreadsheet will be more than adequate for identifying trends and segmenting insights. Otherwise, a small business may want to investigate external analytics providers – none should be too expensive, and all should deliver true insight back to the business. Of course, insight on its own is of little use – it is how that insight is used that can make the difference. But if a small business can identify a particular issue to address, then cloud analytics can make a world of difference. ### Avoiding delays and diversions on the journey to the cloud Rarely would an IT manager cite a rollout of new technology that went completely to plan. Implementing cloud services is likely to be no exception, and with many organisations still working out what the cloud can do for them, this perhaps comes as no surprise. New 'Journey to the Cloud' research by UK IT managed service provider Redcentric found that nine out of ten organisations moving to the cloud are held back by hindrances that either delay or divert their route. Around half of those surveyed claimed that they were badly affected as a result of being held up, suggesting that the impact of a delay is more severe than diverting off course when deploying a new cloud service. [easy-tweet tweet="9/10 organisations moving to the #cloud are held back by hindrances that either delay or divert their route"] Factors causing a delayed or diverted cloud journey often lie within the organisation itself. Almost a third of IT managers claimed that 'gaining approval and internal sponsorship' was one of the top reasons for a cloud delay, suggesting that a perception problem is stopping several organisations from rolling it out within their preferred timeframe. It can be difficult to convince those at board level that the cloud is a worthwhile approach, meaning that IT managers and CIOs have to work hard to ensure that the benefits are fully understood. It isn't just getting the organisation on board that's proving to be a challenge. The same proportion of IT managers (one third) also cited 'cost-cutting' as another reason for a delay, hinting that some are experiencing reduced budgets as the journey progresses. Cost-cutting is diverting the journey to the cloud 'Cutting costs' also scored as the top reason for a diverted cloud journey, cited by over a third of IT managers. This means that budget cuts not only postpone the process; they also cause some organisations to take a different route altogether once they have started their journey to the cloud. This sits alongside 'changing business objectives', which was the second most common reason for a cloud implementation to be diverted. Organisations should be prepared to experience some obstacles along the way, but by using cloud services rather than on-premise infrastructure, those obstacles will be less detrimental – you can rest assured that a detour can still take you to your intended destination. Among these delays and diversions lie real concerns among IT managers with regard to business continuity. The majority (59%) say that they would feel most anxious about business downtime if their cloud strategy derailed, while nearly half (46%) cite concerns over service continuity and quality. Maintaining productivity appears to be a key priority when rolling out the cloud, as IT managers tend to be conscious about losing vital business during the transition.
[easy-tweet tweet="IT managers are diverting #cloud plans because of fears around business continuity" user="comparethecloud"] While cloud deployment is rarely without its risks, there are some precautions you can take to mitigate some obstacles. The following steps can help ease the process:- Devise a rigorous plan: A lack of planning is often the reason behind experiencing setbacks when rolling out the cloud. Ensure you formulate a strategy that takes into account every stage required for a successful implementation, while developing a contingency plan that can prepare for possible obstructions. Seek advice: In selecting the right managed service provider (MSP), you’ll have likely chosen someone who can adequately guide you and advise how to deal with any complications along the way. Once the MSP understands your organisation and what you require from the cloud, you can be confident that a tailored solution can mitigate common problems. Involve stakeholders: Only by consulting people across the organisation will you truly identify what your business collectively needs from the cloud. In creating a dialogue between key stakeholders who will use the solution, you are more likely to build a cloud that fits the wider picture. As a result, your journey towards the cloud should take the right direction, meaning that there’s less of a chance of having to alter your strategy in the process. That said, experimenting along the way can pay off. Don’t let thorough planning stop you from investigating the different ways you can benefit from the cloud; keep an open mind and be willing to alter your strategy. But as Redcentric’s research shows, obstacles that get in the way can be detrimental, so the need to map out a plan that keeps you on track is high. Benefits you can reap from the cloud are plentiful; and a smooth transition can enable a positive cloud experience right from the start. ### PwC launches new global technology team to exploit Bitcoin technology PwC has recruited 15 leading technology specialists to exploit and commercialise blockchain, the technology that powers the crypto-currency, Bitcoin. The new Blockchain team will be based in PwC’s Belfast office and is expected to grow to over 40 digital and technology specialists during 2016. [easy-tweet tweet="#CloudNews: PwC launches new global technology team to exploit #Bitcoin technology" user="comparethecloud"] The announcement comes just days after Sir Mark Walport, the UK's chief scientific adviser urged the government to adopt the blockchain technology to run key public services like tax collection, benefits or the issuing of passports. PwC’s decision to focus on blockchain technology represents a major step in the firm’s move towards developing FinTech (financial technology) solutions that are becoming a catalyst for change and innovation in the financial services industry. Announcing the establishment of the Belfast Blockchain team, Steve Davies, PwC partner and EMEA FinTech Leader, said: “There’s clear evidence that banks, institutions and even governments are looking at blockchain technology as a secure storage and distribution solution. “Now there is growing interest and a real demand from our clients to help understand the implications of blockchain and how to respond to it. So, as the blockchain juggernaut continues to gather pace, PwC will be well placed to service our clients’ needs at a global level. 
“PwC is now breaking new ground in developing radical FinTech solutions and these appointments represent the first stage of our plans to grow a world-class FinTech offering. “We expect the initial core team of 15 experts to grow rapidly, with PwC in Belfast continuing to expand, exploit and deliver technology and digital solutions to global clients.” While there is continued debate over the role of Bitcoin as a mainstream “currency,” the underlying blockchain technology, which consists of "blocks" of data in a digital ledger, is believed to be highly resistant to malicious tampering. The NASDAQ exchange will soon start using a blockchain-based system to record trades in privately held companies, while Bank of England research suggests that blockchains could have far-reaching implications for the financial industry. The UK's technology sector is increasingly focusing on FinTech as offering radical and disruptive solutions to existing business models. The sector raised a record $3.6 billion of venture capital in 2015, with FinTech accounting for almost a quarter of all investment raised by London-based tech companies. However, the impact of blockchain is still underestimated by many in the financial services industry. PwC's forthcoming Global FinTech survey polled 545 leading asset managers, FinTech businesses and key players in banks, fund transfer payment companies and insurers. Only 9% of asset managers say they are very familiar with blockchain, and just under three in ten are not at all familiar with it, the survey will say. [quote_box_center] Welcoming the creation of the PwC Blockchain team, Ashley Unwin, PwC UK Executive Board member and UK and EMEA consulting leader, said blockchain technology could be the single greatest advance in the FinTech sector in a decade: “Blockchain technology is worrying major players in the financial services industry as they don't know where it will go or its potential to disrupt business models. However, in document delivery and settlement processing alone, it will offer significant cost reduction and efficiency gains. “We are confident that these disruptive FinTech technologies will trigger a huge increase in demand for blockchain expertise and we intend to be a leader in exploiting these disruptive new technologies.” [/quote_box_center] ### Rise of the Millennial I remember the time when I sat at my Sinclair ZX Spectrum 128K looking at a TV screen playing Hunchback, a perfect formation of squares with addictive gameplay. Fast forward to my first PC, a Tiny Pentium MMX with a whopping 2GB drive, then the frustration of my 36.6k modem humming into the speaker and a glimpse of a website, then boom. Connection lost - redialling. [easy-tweet tweet="#Millennials are infiltrating the workplace, are you enabling them? You should be." user="comparethecloud"] When I try and explain these early computing frustrations to a generation born on the Internet, it generally elicits a look of pity and a feeling of getting very old. As we look at the generation coming into board positions and rising through the ranks of our companies, trying to frame them with a term such as 'millennial' borders on ridicule. Like every generation before us, a younger, more vibrant generation has always disrupted the status quo, from the hippy movement of the 60s to the rave generation of the summer of love, 1989-1999.
we are seeing the rise of a generation that will disrupt mankind more than any other My 'personal' view is that we are seeing the rise of a generation that will disrupt mankind more than any other in history. When we look back at periods in time such as the Italian Renaissance or the Industrial Revolution, we saw science, art, and industry change dramatically. Have we seen such a disruption since these times? I would argue not. Are we on the cusp of something amazing? Maybe. As the old guard slowly moves on, the practices that were deemed acceptable seem to be fading: for instance, blatant advertising without a message, the loss of brands to reviews and recommendations, and above all, information delivered at the speed of thought. Think about that last paragraph for a second and transpose yourself back in time, placing a tablet or mobile device in your hand with a 4G connection or broadband. How would you have reacted? Imagine being able to Skype or Facetime - it would have felt like something from a Star Trek movie, and completely blown your mind. Today this new generation accepts such access as normal and expected. So how do you capture the attention of a confident, technically aware audience that is used to goods and services being delivered at digital speeds? [easy-tweet tweet="Hear the #Millennial view on how to grab their attention on @Comparethecloud"] Rather than answer this myself I decided to throw this question to our resident internet generation for potential answers. [caption id="attachment_34275" align="alignright" width="200"] @KateWright24[/caption] Kate Wright, Customer Experience Manager at Compare the Cloud, born 1990 I want to be informed by advertising. If the website or campaign doesn't give me any options for secondary sources of review, I won't consider the product. If I'm actually going to buy something it's going to be from word of mouth and recommendation. Anything that I see online I've probably investigated because someone mentioned it to me. I've definitely moved away from making impulse purchases; I take the time to research things, and the information is so easily accessible now that it is very rare for me to buy something and have regrets. [caption id="attachment_34279" align="alignright" width="200"] @Rhian_Wilkinson[/caption] Rhian Wilkinson, Managing Editor at Compare the Cloud, born 1992 For a brand to capture my attention, there has to be a genuine aspect to their advertising. It also has to involve something that is intrinsically interesting to me. I won't be sold something that I don't desire. I think audiences are a lot more aware than they were previously; we recognise the subliminal advertising in films - the Coke billboards in the backgrounds - and we definitely recognise the blatant advertising of Samsung phones (I swear they've paid for half the blockbusters that have come out in the past two years). There also needs to be an aspect of appropriateness to the advertisement. I don't want to see adverts for technology in my browser when I'm shopping for shoes. I want to see adverts for other shoes. [caption id="attachment_34278" align="alignright" width="200"] @8DavidAmoah[/caption] David Amoah, Digital Media Producer at Compare the Cloud, born 1991 I like blatant advertising. Things like the Oasis advert. I'm thirsty. They want money. They can quench my thirst, so I'll buy their product. Easy! I don't like products lying to me; consumers aren't stupid. We've been raised in an era when we can find out anything at the click of a button - brands can't hide from us.
### The evolution of cloud: from stratus to nimbus Cloud comes in many forms. The main types of Cloud are 'stratus', 'cumulus', 'cirrus' and 'nimbus'. They don't look much like each other, but they are all Clouds. The same is true of Cloud computing. No two Clouds look much like each other – the services are often very different (and sometimes tenuous), but they all adopt the same moniker. Over the past few years the name 'Cloud' has been adopted by everyone from large-scale providers of platforms such as AWS, to large SaaS vendors such as Salesforce.com, to cable manufacturers ('carrying the Cloud') and switch manufacturers ('powering the Cloud'). [easy-tweet tweet="The term #Cloud has meant whatever the marketing department wanted it to" user="comparethecloud"] In fact, the truth is that the term 'Cloud' has meant whatever the marketing department wanted it to mean. It carried very little meaning because it was such a generic term. One might consider this 'stratus' … very widespread and creating a bit of a fog to cover what is underneath. The industry is evolving, however, and customers are beginning to see through the hype of marketing department claims. Obviously tenuous claims, such as those from the cable manufacturer or the switch manufacturer, are filtered out by customers as exactly what they are … marketing hype. These might be described as the 'cirrus' versions of Cloud … wispy and thin with no real substance. the 'cirrus' versions of Cloud … wispy and thin with no real substance So, what of other forms of Cloud? There are, of course, those that claim to have 'Cloud' offerings that are merely different terms for existing services. For example, an operator offering 'Cloud' services when really all that they offer is a managed service, but they think 'Cloud' makes it more on trend, or those that offer 'Cloud' services for nothing more than one form of hosted environment or another. [easy-tweet tweet="True cloud computing is on demand computing" user="comparethecloud" hashtags="cloud"] There are so many of these that they are difficult to identify; the simple test of whether these services are true 'Cloud' or not is whether they are an 'on demand' service. True 'Cloud' computing is 'on demand' computing, i.e. you can turn it on when you want to and off when you don't. Those that don't offer true 'on demand' services are not offering true 'Cloud'. One might refer to these as the nimbus 'Clouds' … full of promise but rather unpredictable – be careful what you choose, you might just get wet! the nimbus 'Clouds' … full of promise but rather unpredictable What about those, then, that do offer true 'on demand' Cloud environments? What about AWS or Azure or Softlayer, or what about Salesforce.com or maybe Dropbox? These are, by any measure, true Cloud environments. They are 'IaaS' (Infrastructure as a Service), 'PaaS' (Platform as a Service), or 'SaaS' (Software as a Service). There is, however, a bit of a problem. They are also huge! Try configuring a server on AWS, for example – it would be less complex to design the ISS! [easy-tweet tweet="Try configuring a server on AWS, for example, it would be less complex to design the ISS!"] There are so many variations on a theme, but ultimately there is equally no flexibility. You get what they want you to have and you have no choice. This isn't to pick on AWS; the same is true of pretty much all of the giants.
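To give a flavour of both that configuration burden and what 'on demand' actually means in practice, here is a hedged sketch using Python and boto3. Every identifier below – the AMI, subnet and security group – is a placeholder assumption, and a real deployment would involve many more decisions still.

```python
import boto3

ec2 = boto3.client("ec2", region_name="eu-west-1")

# Even a single on-demand server forces choices about image, size,
# networking, firewalling and storage up front (placeholder IDs throughout).
response = ec2.run_instances(
    ImageId="ami-0123456789abcdef0",             # which OS image?
    InstanceType="t2.medium",                    # which of dozens of sizes?
    MinCount=1,
    MaxCount=1,
    SubnetId="subnet-0123456789abcdef0",         # which VPC subnet?
    SecurityGroupIds=["sg-0123456789abcdef0"],   # which firewall rules?
    BlockDeviceMappings=[{
        "DeviceName": "/dev/xvda",
        "Ebs": {"VolumeSize": 50, "VolumeType": "gp2"},  # how much storage?
    }],
)
instance_id = response["Instances"][0]["InstanceId"]

# The 'on demand' part: the same API switches the server off again the moment
# it is no longer needed (done immediately here purely for illustration).
ec2.terminate_instances(InstanceIds=[instance_id])
```

The point is not the API itself but the number of up-front decisions a supposedly simple 'server on demand' requires – and, equally, that a true on-demand service can be turned off just as easily as it is turned on.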
Try being an SME customer of someone like Salesforce.com and asking them for a variation to your licence – you would have more chance negotiating with Kim Jong-Un. These large Cloud providers are great, but they are rather like the good old 'cumulus' – flat at the bottom and all billowy at the top. Great if what you want is at the bottom but impossible to get to speak to anyone at the top! the good old 'cumulus' – flat at the bottom and all billowy at the top There is one final category of Cloud that we haven't mentioned. This is a sort of hybrid (not to be confused with 'Hybrid Cloud' in compute terms) that combines what is good from the big Cloud providers with the flexibility and agility that many customers are looking for. One might refer to these as the 'altocumulus' – smaller clouds that form at low altitude and have many shades in their layers. the 'altocumulus' – smaller clouds that form at low altitude and have many shades in their layers There are a number of these smaller Cloud providers coming up to challenge the huge providers. These providers offer similar on demand computing to the likes of AWS or Azure, but they have much greater agility, lower pricing and more localised offerings (which suit those concerned about data security in the Cloud). Watch out for Ormuco, Carenza, IaaS365 and MigSolv – AltoCumulus are the Clouds of the future! ### eBook: Dissecting the Differences Between DocuSign and SigningHub e-Signatures Ascertia has released a new eBook comparing the e-signatures produced by SigningHub and DocuSign. The study involved using each solution to upload and sign the same document before the results were compared side-by-side using the independent Adobe® Reader® for signature verification. This analysis included comparison of the document's format, e-signature appearance, strength of the digital signature and long-term verifiability. Key evaluations included: Document Format: Was the document converted to a secure format before signing? E-Signature Appearance: What options are available for the user making their e-signature mark on the document? Digital Signature Strength: How is the document locked after signing so that no further changes can be made, how is the user's identity linked to the document, and what level of non-repudiation is achieved through the signed document? Long-term Verifiability: Can the user's signature be verified in the long term? Delivered as a public or private cloud service, or as an on-premise solution, SigningHub is designed to optimise how businesses deliver, review, approve and sign documentation. For those interested in reading the full eBook, the document is available here. ### 8th International Cybersecurity Forum Lille Grand Palais 25th-26th January, 2016 With the 8th edition of the FIC dedicated to data security and privacy, the goal of this year's event is to foster discussion among representatives such as service providers, trusted security solution professionals, as well as end users from the public and academic sectors. ### What's trending for #Cloud and #BigData right now? ### Paid Social Media Marketing: the millennial outlook I ignore Twitter advertising. Paid posts on Instagram get scrolled past. Facebook's targeting is admittedly more advanced, but still, it doesn't garner massive results. The generation that was susceptible to paid social marketing has grown up, gotten smarter, evolved past being sold to.
[easy-tweet tweet="Paid social media marketing, from the outlook of #Millennial @Rhian_Wilkinson" user="comparethecloud"] Pushing unwanted content onto a reader will not make them engage with it. You can put a photo in front of my face as many times as you like, it doesn’t mean I’ll open my eyes to see it. Paid social marketing works in one industry, music. You know your audience, you know that their friends probably like your product too, and you can target them based on knowing that if their friend likes a band, they are also likely to buy a ticket to a show, purchase an album, or stream a song. Pushing unwanted content onto a reader will not make them engage with it You may be thinking to yourself, but all marketing works like that! Sorry but it doesn’t. Marketing your technology product to me through social media in the hope of making a sale is pointless, I am not in a position to purchase your offering. Seeing a twitter advert for a piece of tech does not interest me at all, the focus needs to move away from paying for social media banners and adverts, and towards quality content that builds branding.  My twitter feed right now is serving me advertising from an online shopping boutique, a herbal tea company, and a web advertising company which is imploring me to increase stack availability and get a free ‘Data Nerd’ t-shirt. I don’t care about any of those things. The shopping boutique is targeting me because I’m a 23 yr old female, without considering my full social profile. I do not buy luxury boutique goods, especially not things that come in rose gold. Unless you have incredibly advanced analytics, don’t try targeted advertising, because chances are, you’re just wasting your money and annoying a potential customer. Had I found this particular shopping website on my own through organic search, I may have visited their website, but because they served me a poorly targeted advert, they have ostracised me. If I come across them organically now, I am much more likely to skip visiting their site due to the fact that my first impression of their brand is not favourable. [easy-tweet tweet="Do you know how to talk to a millennial? Let alone market to them? " user="comparethecloud @rhian_wilkinson"] Targeting 'millennials' is such a craze in marketing right now, the problem is, there isn’t enough millennials in the marketing department to make effective decisions. Regardless of your high opinions of yourselves, you probably aren't 'in touch' with 'the kids' and it's almost guaranteed you're not 'down with pop culture'. If you are over 40, chances are high that you don’t know how to hold a conversation with a millennial, let alone effectively market to them. I don’t remember the last billboard I saw. I don’t remember the last leaflet or freebie I got given at an expo. I have branded sticky notes on my desk that I was given months ago, and while I use them frequently, I can’t tell you who the company is, or where they gave them to me. But I can identify quality content that has helped me understand a product, giving me more confidence in that brand's knowledge in the market. Traditional marketing tactics are no longer effective for millennials, my generation is too fickle. We have the internet. If we want to know about a company we ask Google. We read feedback, and before we even approach you, we have almost definitely already decided if we want to buy from you or not. You need to capture our attention in a genuine way. 
You need to capture our attention in a genuine way Stop wasting your money on paid advertising on social media; redirect your budget into marketing that actually builds your brand. Millennials buy into brands, not products. I can buy a pair of sneakers from hundreds of different suppliers; I choose Nike because of the brand. Branding is everything: if a great band signs to a record label that is undesirable to their fans, they've 'sold out', and often they'll lose a percentage of their fanbase. When an average band signs to a huge record label, they've 'made it' and will gain fans. Cloud services are the same. Focus on building your brand through quality content and engagement in the right streams. Acquiring a fantastic product for your range doesn't mean that people are going to buy it from you. If your brand isn't desirable, and your services aren't up to scratch, customers won't come to you. Consumers are savvier than they used to be, and much more aware of what they are getting when they pay for services. Marketing and advertising are changing, so pack your bags and jump on the wagon, because if you stay stuck where you are, you will wither away. ### Cleversafe Object Storage #CloudChat Jan 29th Join the #CloudChat on #ObjectStorage, January 29th, 3pm GMT / 4pm CET. Cloud computing is changing the landscape of CSP/MSP business models. In their search for new revenue opportunities, CSPs and MSPs are constantly looking to add new cloud solutions to their portfolios. The right cloud storage infrastructure can help you differentiate and add value to your cloud storage offering. Join us on Twitter at #CloudChat to discuss how your cloud business is changing, and what your challenges and aspirations are for storage. [easy-tweet tweet="Join the #ObjectStorage #CloudChat Jan 29th 3pm GMT/4pm CET" user="comparethecloud @cleversafe"] ### Why encryption is proving controversial in the UK and US On the surface, more robust and widely implemented encryption seems like the sort of idea that would gain widespread acceptance. Better privacy and more secure data transfers would surely be welcomed by businesses, consumers and governments alike, right? However, although the benefits of encryption are not being disputed, governments from around the world are becoming increasingly concerned that these benefits could be used for nefarious ends, particularly when it comes to matters of national security. [easy-tweet tweet="How are encryption and privacy being received in the #cloud industry at present?" user="zsahLTD" hashtags="infosec"] The UK and US are two of the most high-profile national governments calling for a ban on end-to-end encryption, instead requesting backdoors that will enable communications to be monitored. Citing the use of encrypted messages by terrorists and other criminals, the governments argue that widespread encryption poses a serious security threat. In the wake of the Charlie Hebdo shootings early last year, UK Prime Minister David Cameron made his thoughts on encryption clear: “Do we want to allow a means of communication between two people which even in extremis with a signed warrant from the home secretary personally that we cannot read? My answer to that question is no, we must not. The first duty of any government is to keep our country and our people safe."
Comments such as this from Cameron have led some to argue that the encryption ban is just another example of the state exploiting fears to constrain civil liberties, particularly given that expert opinion cites it as being impractical and largely ineffective at preventing criminal activity. Still, the battle between privacy and security is set to continue, with businesses across a broad spectrum of industries likely to be affected. the battle between privacy and security is set to continue What this means for MSPs Managed service providers will also be keeping a watchful eye on developments to encryption, not only in the UK and US, but all over the world. Because cloud computing enables the remote delivery of IT resources, the legal requirements faced by MSPs can sometimes become overly complicated. Although a business may be based in a country where encryption is legal, if the MSP that it is using is based in a country where encryption is banned, then company data could become susceptible to surveillance programmes. For managed service providers that specialise in encryption support, their very existence is being threatened by anti-encryption legislation. [easy-tweet tweet="For MSPs specialising in #encryption support, their very existence is threatened by anti-encryption legislation"] In the wake of the Snowden revelations, a number of technology firms such as Apple, Google and a multitude of MSPs have gravitated towards encrypted communications, but not only in response to government surveillance. End-to-end encryption also helps alleviate security fears by ensuring that even if data is intercepted it is useless to anyone but the intended recipient. Plans to introduce encryption backdoors, therefore, have potentially serious consequences for managed service providers. These backdoors, although intended for government use, will become targets for cybercriminals. Instead of offering protection, an encryption ban would introduce vulnerabilities into MSPs and other businesses that put sensitive data at risk. Whether or not an encryption ban is introduced in the UK, US or elsewhere, recent controversy has certainly reignited the privacy-security debate. With no way to satisfy everyone, MSPs, businesses and governments are left to constantly reassess how much privacy must be sacrificed in order to remain safe. ### The Most Important Questions to Ask a Cloud Computing Service Provider Today, even small businesses are working out of multiple locations all over the globe. It's never been easier or more efficient. Gone are the days of needing your own expensive hardware and an expert IT team to keep it running smoothly, because now your team can use the cloud. [easy-tweet tweet="What questions should you be asking your #cloud service provider?" user="comparethecloud"] The Cloud enables a team member in the Philippines to access the same files and information updated and created earlier in the day by a team member in the Netherlands. Both team members provide details of their progress for the other and for you to review across multiple time zones and thousands of miles. With cloud computing's increasing popularity there is a sizeable field of cloud computing service providers competing for your specific needs. With that in mind we have put together a list of the most important questions you'll want to ask a cloud computing service provider before moving onto their service. 1. How can my team get to their files on the cloud?
Your company will be assigned a specific URL or path via an app, where team members will each be assigned a specific username and password. If that information is entered correctly, your team members will be able to access their files 24/7/365 from anywhere in the world on any device. 2. What is the process of creating a cloud account? Typically, you'll be directed to your dashboard where you will start creating accounts for your team. Some service providers will have a representative help you go through the setup process while others will provide you with a link to a "How To" page or maybe a video. If you are not tech savvy, you will get better use out of a service that helps you set things up, allowing you to ask questions along the way. 3. How reliable is your service? Nobody likes being shut out from their work. Cloud outages are frustrating, costly and a reality of the technology. If you encounter a service provider who tells you that they are "never down", go to the next name on your list. Reputable firms will post their uptime history online. Nobody likes being shut out from their work 4. Can I get more cloud space as my business grows? As your business grows, the amount of storage and space you need will also increase. Make sure the provider you choose will allow you to add space and user accounts. Find out what the charges are for additional storage space. 5. What if the unthinkable happens: a crash, data loss, etc.? Most businesses that are considering a cloud computing service provider are expecting their data to be safe. Rightly so. But what if your cloud provider somehow deletes or loses your data? How will they right the wrong? Be certain to ask them for their policy regarding data losses in writing. Will they compensate you for losses? What backups does the company have in place to reduce risk? Don't be afraid to ask or think you are asking something out of line. Be direct and find out if the company has ever encountered such a situation and how it was resolved. 6. What security measures do you take? You only need to look at the news and headlines to know that security is a major concern for storing your data. Any reputable company has several security measures in place that are standard to their operation, including security audits, anti-virus protection, firewalls, encryption and authentication processes. 7. Who at the company will have access to your data? It is not uncommon for a cloud provider to also have access to your files. Because of that, it is important to know who they employ and what kind of background checks they carry out to protect their clients. It is necessary to ask these questions and get the answers so you can make the best decision regarding a cloud computing service provider. ### PwC announces conditional agreement to acquire Outbox Group PwC has today announced that it has conditionally agreed to acquire a leading European technology consulting business, Outbox Group, bolstering its ability to offer specialised cloud-based solutions and transformational services for clients across the UK and Europe. [easy-tweet tweet="#CloudNews: PwC announces conditional agreement to acquire Outbox Group" user="comparethecloud"] The Polish-based Outbox specialises in customer, digital and technology services, working with leading platforms such as Salesforce, Microsoft Dynamics, Oracle and SAP. Its addition will further enhance PwC's cross-industry customer and digital capabilities to deliver innovative solutions across all channels, platforms and devices.
The deal comes after previous acquisitions by PwC, including the European consultancy Mokum, and Booz & Company (now Strategy&), in 2014. The addition of the Outbox Group will increase the firm's contingent of technology practitioners to almost 3,000 across EMEA. More than 250 Outbox employees will join PwC. PwC's UK and EMEA Consulting Leader, Ashley Unwin, said: "This acquisition represents a major milestone for PwC's UK and Central and Eastern Europe alliance and its commitment to invest in emerging markets. “It is also a significant addition to our customer and digital capabilities and sees the establishment of a Centre of Excellence for these skills within PwC in Europe." PwC's UK and EMEA Technology Consulting Leader, Jonathan Tate, commented: "Our clients are prioritising growth and investing to deliver great experiences to their customers. “This acquisition was driven by a rising demand from our clients in digital and customer transformation as well as the need to offer services from strategy through to execution. "Outbox will allow us to present a truly differentiated offering, enabling us to deliver larger and more transformational solutions to businesses across the UK, Poland and the rest of Europe. It will also support one of the firm's strategic priorities to further embed technology into its services." Outbox managing director, Nicholas Mobbs, who will join PwC as a partner, said: "We created Outbox 10 years ago in Poland. Through dedication and hard work we have tapped into the wave of disruptive technological change, leading to considerable success across Europe with our unique position around a customer-first, multi-technology strategy. This deal will provide PwC with the ability to offer a unique combination of world-class skills and services, by delivering true cloud-based business transformation projects to the market and benefitting existing clients. The potential market for customer experience, CRM and digital is estimated at over 6 billion euros and we see this growing even in challenging economic times." The deal was signed on 31 December and is expected to be formally completed next month. ### Helping to ease the file transfer headache Why do you need a managed file transfer solution? Corporate data is often sensitive and highly important, yet organisations frequently fail to take the necessary precautions to make sure it is transferred correctly. Many organisations have multiple technologies and departments which help manage the movement of critical corporate data. For batch transfers there are data transmission departments, and alongside them a whole raft of unmanaged transfers, which include large email attachments and transfers via FTP servers – both headaches in themselves. [easy-tweet tweet="Why do you need a corporate managed file transfer system? " user="bluebirdITS" hashtags="MFT"] Compound this with exponential growth in the size and volume of information being moved within and between organisations, and you've got the formula for a migraine. This is without even beginning to consider the security of sensitive corporate data and the need for organisations to satisfy regulatory and industry requirements within their file transfer operations. Business as usual will not work anymore using the usual methods for moving data Business as usual will not work anymore using the usual methods for moving data. Thankfully, however, there are now products to alleviate the pain that all of these transfer woes bring.
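For context, the sketch below (Python, using the paramiko SFTP library) shows roughly the kind of ad-hoc script that grows up around unmanaged transfers; the hostname, credentials and paths are placeholder assumptions. It retries on failure, but offers none of the scheduling, checkpoint restart, audit trail or alerting described next.

```python
import time
import paramiko

# A rough sketch of the hand-rolled transfer script an MFT product replaces.
# Host, credentials and paths below are placeholders, not real endpoints.
def push_file(local_path, remote_path, attempts=3):
    for attempt in range(1, attempts + 1):
        try:
            transport = paramiko.Transport(("sftp.example.com", 22))
            transport.connect(username="batch_user", password="change-me")
            sftp = paramiko.SFTPClient.from_transport(transport)
            sftp.put(local_path, remote_path)   # whole-file upload, no checkpointing
            sftp.close()
            transport.close()
            print(f"Transferred {local_path} on attempt {attempt}")
            return True
        except Exception as exc:                # crude retry, no alerting or audit trail
            print(f"Attempt {attempt} failed: {exc}")
            time.sleep(30 * attempt)
    return False

push_file("/data/outbound/orders_20160129.csv", "/inbound/orders_20160129.csv")
```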
Managed file transfer solutions are allowing high-volume, optimised point-to-point file transfers, without the stress. By using a managed file transfer solution you are eliminating the need for manual intervention in the data delivery process. Productivity is improved and reliability increased by removing the risk factor of manual intervention. [quote_box_center]Did you know that a comprehensive managed file transfer solution can:
Automate and manage file transfers in 24x7 unattended operations
Schedule jobs on a one-time, recurring or continuous basis
Assign and manage file transfer workloads
Give event-driven notification
Provide a process language for building scripts that integrate with back-end systems
Provide checkpoint restart
Automate recovery from network interruptions
Automate alert notifications for success/failure
Interface with operating system security for user authentication
Provide a complete audit trail of data movement through extensive statistics logs
Provide enhanced authentication and encryption
Deploy a protocol which has NEVER been breached[/quote_box_center] This list covers just a small number of the benefits that a managed file transfer system can offer your business. At Bluebird ITS we specialise in making sure your files remain safe and secure on the way to their destination. A managed file transfer solution could vastly improve your day-to-day business processes. ### M2M and IoT's transition from hype to reality 2016 may still be in its infancy, but the M2M/IoT industry's transition from hype to a more mainstream reality is gathering pace, buoyed by an increasingly consumer-driven and 4G-enabled environment. And with Gartner's forecast of some 5.5 million devices to be connected daily in the next 12 months, the savviest M2M managed service providers are set to capitalise by seizing new application opportunities, which in turn puts the sector at the very heart of IoT innovation. It's the reason why at Wireless Logic, our immediate focus centres on driving added-value services without compromising our core bread-and-butter connectivity offering. While today's customer may want a more integrated service beyond basic connectivity, this is matched by an expectation of resilient, watertight systems that can meet the demands of complex, high-bandwidth and business-critical applications and ensure stellar signal strength. customers now expect 5 bar signal strength at all times In fact, this cannot be overstated; in our experience, if a customer only gets one bar of signal strength from an operator, they will feel particularly short-changed. They now expect five bars, a demand that intensifies significantly as we deliver to over two million connections while meeting the four-fold rise in the average subscriber's data consumption. This is critical bedrock, and one that, in my opinion, any network operator worth their salt will build on through a more integrated offering, particularly extra services which can address any connectivity gaps. Indeed, for the many companies who find the extra cost and complexity of developing, deploying and operating cellular M2M applications prohibitive, outsourcing this function to an added-value service provider becomes an attractive option.
Indeed, providing stand-alone platforms such as our own infrastructure-as-a-service proposition, Net Pro, enables the customer to manage their M2M activity in a cost-effective and accessible way and leverage the scope and flexibility that comes from access to multiple bearer services and a choice of over 30 mobile network operators. And with 82% of our customer base going down this route in one month alone, it reflects the growing appetite for access to a technology platform and level of infrastructure that facilitates a fast route to market as an off-the-shelf, affordable platform. [easy-tweet tweet="#M2M technology will be driving the proliferation of cameras embedded in commercial vehicles showing real-time video"] It's evidence of the more affordable accessibility required if M2M technology is to diversify into the consumer-driven IoT arena and properly branch out from what has traditionally been an exclusively enterprise domain. Perhaps not surprisingly, the self-driven market will be key to a more mainstream push. Only when we see devices directly available through retail outlets – equipped with a SIM card which the customer can activate online with a QR code, ready to sign up to a contract there and then – will we be free of the delays and barriers that have proved prohibitive to a more mainstream take-up. Of course, some examples are already filtering through. M2M technology will be driving the proliferation of cameras embedded in commercial vehicles showing real-time video, a proposition set to explode in commercial fleet management. User-based insurance is the stand-out case which, despite still being at a formative stage in its development, has already proven its worth in the value of the data collated and its ability to drive a more equitable and transparent approach to how insurance premiums are calculated. Parking is also set to get a smart makeover, consigning tickets and barriers to the past as automatic number plate recognition (ANPR) and smart payment terminals record your car registration number when you enter and exit the car park. As the market heads towards this space, the M2M network operator is poised to meet the greater challenges beyond the more straightforward demands of routing data from one machine to another, to potentially routing from multiple machines to the consumer tablet, which I believe will prove to be an increasingly vital cog in the IoT ecosystem. ### IBM and SAP: For systems integration in the fast lane As SAP systems integrators, you'll likely be well aware of the longstanding partnership between IBM and SAP. For decades the two companies have worked together to deliver collaborative IT solutions that are cost-effective, reliable and deliver a level of performance that enables businesses to outpace their rivals. When it comes to systems integration, you can't afford to be slapdash, but neither can you afford to waste time, particularly when new business challenges and opportunities are emerging all the time. [easy-tweet tweet="The SAP HANA Enterprise Cloud solution is now available across #IBMCloud" user="SWauschkuhn"] It's for these reasons that we're delighted to announce that the SAP HANA Enterprise Cloud solution is now available across IBM Cloud, bringing together two of the technology industry's major players once more. For us, the importance of delivering SAP business solutions securely, but swiftly, has never been greater – making sure systems integration remains firmly in the fast lane.
The first speed boost we are giving to SAP systems integrators is SAP HANA on Power8. IBM Power Systems is designed to capture and process huge swathes of data from a multitude of sources, enabling organisations to deliver cutting-edge insights. In fact, both SAP and IBM customers are sure to benefit from the combination of the former's strategic in-memory database and the latter's Big Data platform. Power Systems users will be able to analyse information in real time, without having to switch SAP workloads to a separate platform, and SAP HANA users will be able to deliver their applications and workloads with greater reliability and availability than ever before, thanks to the increased performance and scalability of the Power8 platform. IBM Power Systems is designed to capture and process huge swathes of data [caption id="attachment_33933" align="alignright" width="300"] Read more on IBM Softlayer here[/caption] [caption id="attachment_33611" align="alignright" width="300"] Read more on IBM & ShadowIT here[/caption] IBM has worked hard to enable SAP HANA customers to get started on the Power8 platform as quickly as possible. IBM Solution Edition is a fast-start option that provides three base configurations to enable rapid HANA deployments on the Power server, with customers also given the option of running extra virtualised workloads and adding capacity as and when they need it. IBM Power users that are working with non-HANA SAP workloads will also be able to realise the benefits of SAP HANA on Power in no time. In most cases, customers will be able to leverage existing processes, meaning that new procedures will not be necessary and SAP systems integrators can really hit the ground running. Automated Modular Management (AMM) is another IBM solution that brings industry-leading automation and integration to SAP customers. AMM for SAP HANA is a subscription service offering with monthly payment options, hosted on Softlayer. It enables businesses to manage their cloud infrastructure simply and quickly by bringing greater automation to their IT resources. Developers can build and deploy applications using the SAP management console and SAP HANA Studio. Plus, AMM for SAP HANA utilises pre-deployed infrastructure and fully automated software installation. the flexibility of IBM Cloud, combined with the SAP HANA database, enables organisations to generate real-time insights What's more, the flexibility of IBM Cloud, combined with the SAP HANA database, enables organisations to generate real-time insights that could help them deliver faster business processes or cement customer relationships. Real-time analytics is vitally important in the agile business world of today, because delivering solutions faster than your competitors isn't just about raw speed; it's also about rapid changes of direction to meet new challenges. Up-to-date analytics enables businesses to actually pre-empt trends and react in advance. [easy-tweet tweet="Automation is a vital component of IBM and SAP's latest collaboration" user="SWauschkuhn"] Whichever industry you work in, outpacing your competitors is vital, but there's no point in racing to the finish line if you've lost all your supporters. With IBM and SAP's latest collaborations, it's all about less haste and more speed. Automation is a vital component of this, removing the manual tasks that take up so much of your time, but without diminishing reliability or performance.
Similarly, getting started with SAP HANA on Power8 and AMM for SAP HANA has been made as straightforward as possible, so you can begin immediately developing new solutions, bringing more customers to the cloud and integrating mission-critical applications. ### Unisys to Deliver Enterprise Security Innovation on the AWS Cloud Unisys Corporation today announced it will provide enterprises with the award-winning Unisys StealthTM micro-segmentation security solution on the Amazon Web Services (AWS) Cloud, available for customers to immediately acquire and deploy from the AWS Marketplace. [easy-tweet tweet="#CloudNews: Unisys Stealth now available on AWS" user="comparethecloud" hashtags="security, cloud, infosec"] The solution will provide advanced security to AWS customers, while providing Unisys clients with the ease of access and scale of the AWS Cloud. Unisys optimised its Stealth offering to give AWS customers a fast and convenient way to protect vital information and applications against evolving threats. AWS provides a highly reliable, scalable, low-cost infrastructure platform used by more than a million active customers across 190 countries around the world. With Stealth on AWS, organisations can easily integrate additional protection, comply with regulations, and micro-segment off their virtual machines from neighbours when working on the cloud. The solution is another step in Unisys’ reinvention as an IT technology and services leader focused on secure digital transformation. It also illustrates Unisys’ asset-light strategy of working with cloud technology providers, which complements Unisys’ own advanced capabilities in infrastructure and service management, analytics, application services and software, as well as security. “Security and cloud computing are strategic priorities for businesses today,” said Peter Altabef, Unisys’ president and chief executive officer. “Enterprise-proven security that evolves to meet future threats will provide additional assurance to enterprises and governments that are moving core operations to the cloud.  Integrating Stealth onto the AWS Cloud advances Unisys’ leadership in security, and reflects our commitment to continually deliver innovation that solves real-world business challenges.” “Security is top of mind for our customers, and at AWS, it’s our number one priority. Unisys Stealth on AWS provides extra layers of security for enterprises moving their workloads to the AWS Cloud,” said Stephen Schmidt, chief information security officer, Amazon Web Services, Inc. “We are pleased this solution is now available on the AWS Marketplace, offering our customers immediate access to the security protection capabilities of Stealth.” With Stealth on AWS, users can quickly and easily micro-segment their own portions of the cloud from other users while keeping their own encryption keys. They can unify their internal security protections with those on the cloud, enforce virtual machine-to-machine encryption in the cloud, and reduce attack surfaces. In addition, Stealth on AWS allows organisations to extend entire workloads securely from data centres to the cloud; manage access via existing identity systems including Active Directory or LDAP; and easily add integrated supply chain partners to micro-segments without giving them broad access – at the packet level without any new hardware, firewall rules, or application changes. 
Unisys Stealth software uses identity-based micro-segmentation techniques and encryption to protect data and applications on the AWS Cloud. Stealth protection makes data and applications invisible to hackers and unauthorised users by encrypting traffic between all Stealth-protected endpoints. Clients can acquire Stealth directly from the AWS Marketplace, where Unisys also offers an AWS Test Drive that allows enterprises to access a private sandbox using Stealth. Using a step-by-step lab manual and video, they can learn more about Stealth and how it works, with no charges incurred as part of the AWS Free Tier. Alternatively, enterprises that want to dynamically extend their on-premise infrastructure to AWS can use Stealth's "cloudbursting" capability, which automates the shifting of secure workloads directly into AWS. "Enterprises can now use Stealth as their single platform for securing both their data center and their cloud environments, substantially reducing the complexity and cost of multiple platforms," said Tom Patterson, vice president and general manager for global security solutions, Unisys. "This comprehensive level of protection removes roadblocks for many organisations looking to leverage the cloud, and will unlock huge cost savings and business agility needed in today's competitive environment." ### How Big Data is Changing Ecommerce beyond Recognition Data is practically everywhere and we are already used to it. With constant contributions from diverse interfaces, growing volumes of digital data are opening new vistas of opportunity for every facet of life. In the retail industry, particularly the booming online retail sector, this change is most obvious. Experts say that big data is the new horizon, with this overwhelming volume of digital data likely to hold the answers to an array of issues and problems in the retail sector. [easy-tweet tweet="Understanding web traffic #data is key in knowing triggers for purchasing" user="comparethecloud"] On the other hand, the success of an online retail store or ecommerce site is measured by the volume of buyers or the gross value of transactions in a given time. To boost these numbers, tracking buyers and their behaviours, and accordingly spotting new opportunities for sales, is important. Understanding site visitors and knowing what triggers them towards making purchases is best done by analysing traffic data. This intensive analytical process, which deciphers a wide array of useful insights, is what big data analytics refers to, and it has seminal importance in driving conversions for ecommerce sites. How Big Data is represented in the ecommerce realm Big data, in its quintessential definition, encompasses a broad spectrum of information useful to retailers, whether in a web store or in-store. Big data in the realm of ecommerce is represented by three characteristics: velocity, volume, and variety. Ecommerce big data is represented as velocity, volume, and variety of data Velocity represents the speed at which data arrives from transactions, user actions and so on. Volume represents the amount of data gathered from diverse sources, measured in terabytes, petabytes and beyond. Variety represents the multiple types, sources and formats of data. The ways Big Data is changing ecommerce While many people have an idea of the supposedly 'Big' role of Big Data in digital interfaces, they are not entirely sure in which ways it can play a vital role for their business.
Irrespective of the hype it has received in recent years, Big Data is not something that will be overshadowed by another technology any time soon. Any business, regardless of its size, must take the opportunity – or at least come to terms with the scope – offered by the huge volume of digital data and the analytical tools available to deal with it. In more ways than one, the potential of Big Data can be turned to a business’s advantage. 1. Customisation A huge volume of customer data is collected through various digital sources, including loyalty programmes, the browsing patterns of website visitors, and the purchase history and behaviour of past customers. Retailers now have the opportunity to process this information and plan a customised approach for different customer segments, based on the insights gained from that customer data. 2. Customer specific offering [easy-tweet tweet="#BigData is allowing more specific and accurate customer targeting" user="comparethecloud"] As a retailer, when you have a relatively accurate customer profile you can better address that customer’s concerns, wants and sensitivities with specific offers. Creating a more accurate customer profile is also easier now, as you have diverse touch points and sources for collecting and processing customer data. Once you have a specific customer profile, it can guide you on the price and promotional offers that are likely to re-engage the customer and convince them to make a buying decision once again. Across retail this has already proven to be one of the most effective customer retention strategies, since it allows the discount, price and offers applicable to each potential customer to be judged more accurately. 3. Making customer service better Did you know that a whopping 68% of web traffic simply leaves because of unsatisfactory customer service? Customer service has long been the most neglected area in many web retail stores, despite the critical role it plays in boosting sales and customer engagement. By facilitating coordination and communication among various channels of information flow – including emails, phone calls, forum posts, live chat messages and typical web behaviours – Big Data can help make customer service more interactive, timely and purpose driven. Understanding the issues concerning customers and addressing them in a time-bound manner requires allocating resources to identify the gaps in communication channels. Conflicts of interest, delays and unaddressed issues can thus be better tackled by accessing digital customer data in real time. 4. Smooth operation and supply chain management For any ecommerce business, transparent operations and well-defined supply chain management processes play an important role. Here Big Data can play a very prominent part: by accessing and analysing huge volumes of operational and customer data, it can forecast likely glitches and disruptions in the process and so prevent delays, disturbances and process leaks. By accessing data on warehousing, shipping and other operational procedures and updates in real time, and communicating this to the retailer, disruption and delay in the business process can be avoided. ### Top tips for ensuring a smooth data centre migration to the Cloud As the demands placed on data centres intensify, more quality cloud-based solutions emerge. 
This means Data Centre Migration to the cloud is becoming an increasingly unavoidable consideration for IT managers. Although it is a complex undertaking, the efficiency benefits further down the line can often justify the time, effort and capital required. [easy-tweet tweet="Migration to the #cloud can take several forms: #IaaS, #SaaS and #PaaS" user="comparethecloud"] Migration to the cloud can take several forms: Infrastructure as a Service (IaaS), Software as a Service (SaaS) and Platform as a Service (PaaS). IaaS involves substituting servers in your data centre for servers in the Cloud. The leading providers of these types of services are Amazon Web Services and Microsoft Azure, though a plethora of local data centres provide similar, albeit somewhat less scalable, services. SaaS, of which Salesforce is probably the most well-known, is at the other end of the scale to IaaS, whereas PaaS, such as AWS Elastic Beanstalk, Force.com (by Salesforce) and the Azure web platform, sits in between these two approaches. Managing your own data centres allows you to remain in complete control, but can be very costly. For this reason, many have decided to look at migrating some or all of the hosting from their own data centres to the cloud. However, this does mean entrusting your data and some or all of your operations to a third party. What happens if there is a security breach, data loss or an outage? These issues boil down to two questions: do you trust the third party, and if you do, will your regulatory environment allow you to entrust your data and operations to them? Once a decision has been taken to migrate to the Cloud, it is important that you plan thoroughly and stick to a clear and thought-out structure. Here are some tips to help ensure your Data Centre Migration to the Cloud flows well: 1) Consider the risks and how you are going to manage them Generally, the main risks in a Data Centre Migration are loss of data and unplanned periods of downtime. These risks can usually be mitigated by performing trial migrations and testing the relevant applications after the trial migration. More testing incurs greater costs, however, and a balance may have to be struck between the risks avoided by thorough testing and the cost of the testing. Classifying applications by business criticality and the technical difficulty of migrating them can save time and ensure consistency in this decision-making process. [easy-tweet tweet="A balance may have to be struck between the risks avoided by thorough testing and the cost of the testing"] 2) Identify potential incompatibility issues Legacy apps often do not translate into new environments because the right operating systems and hardware are not available – IaaS does not normally offer the older operating systems that require legacy hardware. There are two ways of dealing with this issue: finding a way to run the legacy application on modern hardware, which may require emulating the legacy hardware, or replacing the legacy application with a newer one with similar functionality. The new application may even be SaaS, which would eliminate infrastructure concerns after migration. Replacing the application is a project in its own right and may come with extensive licensing, integration and training costs, but such an exercise may be necessary anyway, and doing it as part of the migration can be cheaper than performing the migration and then replacing the application a short time later. 
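As a rough illustration of tips 1 and 2 above, here is a minimal, hypothetical triage sketch: it classifies an application inventory by business criticality and migration difficulty, and flags apps whose operating system is not offered by the target IaaS. The inventory, scores and supported-OS list are invented examples rather than a definitive method.

```python
# A minimal, hypothetical triage sketch: classify applications by business
# criticality and migration difficulty (tip 1) and flag apps whose operating
# system is not offered by the target IaaS (tip 2). All data here is invented.

SUPPORTED_OS = {"windows-2012r2", "rhel7", "ubuntu-14.04"}

inventory = [
    {"app": "payroll",       "os": "windows-2003", "criticality": 5, "difficulty": 4},
    {"app": "intranet-wiki", "os": "ubuntu-14.04", "criticality": 2, "difficulty": 1},
    {"app": "order-entry",   "os": "rhel7",        "criticality": 5, "difficulty": 2},
]

def triage(app: dict) -> str:
    if app["os"] not in SUPPORTED_OS:
        return "remediate first (legacy OS: emulate, re-platform or replace with SaaS)"
    if app["criticality"] >= 4 and app["difficulty"] >= 3:
        return "migrate late, with full trial migration and testing"
    if app["criticality"] >= 4:
        return "migrate with trial migration and targeted testing"
    return "migrate early with light testing"

for app in inventory:
    print(f"{app['app']:<15} -> {triage(app)}")
```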
3) Check the network you are migrating to can be configured to run your apps During a Data Centre Migration you will be migrating multiple applications which currently exist within a network configured to a certain spec. For IaaS migrations, if the new network does not work in a similar way to the current one, the migration will have to take this into account or the applications will not function correctly after migration. To help avoid issues like this, make sure that you fully understand your current data centre network infrastructure, considering elements such as firewalls, domains and trusts and ensure that you are given a good understanding of the one you are moving to. If the IaaS does not permit the desired settings, adjustments must be made to the applications to allow them to work within the constraints of the IaaS. make sure that you fully understand your current data centre network infrastructure 4) Consider network latency It is likely that Cloud based infrastructure will be physically further away from your users than your data centre was. Very often, a traditional data centre is on the same site as the users of the applications it hosts. Cloud infrastructure may be thousands of miles away. For many applications, especially those with a browser based architecture, this will not be a problem. For some applications, especially older ones, an increase in network latency can cause a major drop in performance. To counter this, the application can either be replaced by a more modern one, be hosted on a thin client system such as Citrix XenApp, or be redesigned to work better with higher latency. 5) Talk about discovery Depending on how long the original Data Centre has been up and running, there may be existing applications which have outlived the presence of the employees that set them up in the first place. It is very feasible that the knowledge of how exactly these applications work and even the very fact that they exist left your company with those staff members. [easy-tweet tweet="The main risks during #datacentre migration remain unplanned downtime and loss of #data"] If you are considering the migration of an entire data centre, it is crucial to know in full what is hosted in the data centre and how each application interacts with others. Lack of awareness about this can result in lost functionality and potential loss of business-critical data. Network tracing and discovery tools can be used to analyse systems over time and re-discover any forgotten components and relationships. The main risks during migration remain unplanned downtime and loss of data. Those technical risks are complemented by the risks of unforeseen delay and cost, caused by such factors as migrated applications failing tests before going live. To mitigate these risks, a strictly structured and planned approach is essential. ### The Alpha and the Omega of Cloud There was much discussion in 2015 about whether AWS had stolen such a lead in public cloud that would make it impossible for most rivals and all late entrants to catch up and compete. As we have covered before there are rarely new technology markets that provide true first mover advantage. Normally the winners are the fast followers that are able to learn from the mistakes of pioneers and improve on their offerings to deliver differentiated and therefore winning propositions. Examples of this include Google (not the first search engine), Amazon (not the first e-commerce site), etc. [easy-tweet tweet="Alpha or Omega? Who will prevail in the long run? 
@BillMew analyses AWS v Oracle on @comparethecloud" via="no" usehashtags="no"] Two competing theories, described by the first and last letters of the Greek alphabet, illustrate the opposing views and the spectrum between them: The Alpha theory postulates that there is real first mover advantage this time, resulting from the fact that AWS has succeeded in establishing such an impressive market share lead, innovating so quickly (which eliminates the rivals’ ability to differentiate), and keeping prices so low (to exploit its economies of scale and undermine the profitability of new entrants). The Omega theory postulates that there is still room for the right rival with enough financial muscle, enough of a hold over its corporate clients and enough of a differentiated approach to exploit its opportunity as the ultimate fast follower. In case you hadn’t already guessed, in cloud Alpha stands for AWS, while Omega stands for Oracle. Alpha = AWS, Omega = Oracle Famous for initially describing cloud as nothing but hype, Larry Ellison and his crew have now put all their energy into making Oracle the leader in cloud and doing what rival giants like Dell, HP and others have already failed to do – establish a public cloud offering to rival AWS. Oracle certainly has the financial clout to make a significant challenge and its database dominance provides it with a lock-in and hold over its corporate clients – one that is the envy of many rivals. The big question is whether it can build a competitive and differentiated cloud proposition to challenge AWS. This is a battle that Oracle has to win to survive, because over time, as things move to the cloud, databases will become commoditised and its hold over its corporate client base will be lost. Austin, Texas is the focus of Oracle’s massive investment in cloud infrastructure. Its aim is to skip first-generation cloud infrastructure and gain competitive advantage by moving directly to the next generation of technology with a container-enabled infrastructure. It hopes that it can migrate its application portfolio and its database clients to this new cloud infrastructure before they find a way of loosening its hold over them. AWS isn’t sitting still though. Not only is it exploiting its existing infrastructure to maintain a market share lead over its rivals, but it is innovating at a fantastic rate as well as taking the battle directly to them. In this case AWS’s database migration tools are seeking to provide the escape route that many Oracle database clients are after, and in doing so to undermine Oracle’s core franchise before it can establish a public cloud franchise to rival AWS. [easy-tweet tweet="AWS is seeking to undermine Oracle’s core franchise before it can establish a rival #publiccloud " user="billmew"] So what are the Alpha and Omega characteristics that we should be looking for? For the Alpha theory to prevail we need to see signs of real first mover advantage and evidence that it is being converted into a dominant market position, such as: Economies of scale: AWS has a scale advantage over all its close rivals and a massive one over new entrants. It is using this scale to realise economies and lower prices to make life difficult for other players. Ecosystem support: the AWS marketplace has become the largest portal for services, with most ISVs or service providers focusing on it as the primary platform for their offerings. 
Even the almost evangelical support for open-source OpenStack hasn’t enabled it to rival the AWS ecosystem. Developer following: while others such as Google have made a big play of focusing on DevOps and container-friendly environments, the platform that most developers are focused on is AWS, and the pool of AWS skills is the largest and fastest growing. Barriers to entry: with an enviable lead over its rivals in terms of both market share and growth rate, Amazon not only has a data centre estate that is larger than its rivals’ combined, but it can afford to maintain investment at a level that rivals would find almost impossible to match. You need big bucks to be able to play this game and the ability to maintain an eye-watering level of investment. For the Omega theory to prevail we need to see signs of real fast follower advantage and the ability to learn from early mistakes or initial generations of technology to leapfrog and establish a new dominant market position, such as: New generation of technology: often we have seen newer waves of technology that have made previous technologies obsolete or redundant, allowing a window of opportunity for new players – as with 2G, 3G and 4G mobile networks. However, while containers and other new technologies have emerged, they have been evolutionary and can be rolled out across existing cloud infrastructures. Compelling differentiation: often fast followers, learning from the mistakes of pioneers, are able to add value to provide a compellingly differentiated proposition that enables them to attract clients in droves. However, Oracle already has a dominant position in the database market and almost as many clients as it could have. These corporate clients are not so much looking to flock to Oracle as to find a means of loosening its grip on them in order to escape its clutches. Break-through enhancements: the last remaining tactic for fast followers is to alter the paradigm by changing the competitive dynamic with some form of break-through enhancement. If Oracle has any such break-through enhancement up its sleeve then it has yet to make this public. A proposition worth paying a premium for: finally, any fast follower would need to be able to charge a premium, but in Oracle’s case, as an insider recently disclosed to Infoworld, “on 90 percent of our large deals. ... [@Oracle #Cloud is] ... being given away”. This is the road to ruin. By now you’ll be sensing that the Alpha theory has much evidence to support it while the Omega theory looks implausible at best. Maybe Larry Ellison’s public cloud is nothing but a pipe dream, but only time will tell if he can succeed. ### Manufacturing leaders are embracing public cloud to cut costs and drive efficiencies, UK study reveals Manufacturers are embracing the public cloud, with 85 percent of Line of Business (LOB) decision makers using some form of public cloud services to cut costs (33 percent) and drive efficiencies (29 percent), a new study from EMC, VCE and VMware today reveals. The research, which questioned over 600 business decision makers across the UK, shows that the majority of these deployments are happening in collaboration with IT: over three-quarters (87 percent) of LOBs consult IT, with almost half (43 percent) running everything by them, including the final decision-making process. 
As manufacturers take advantage of the scale, flexibility and agility of public clouds, the study highlights the main reasons for adoption: almost half (44 percent) use it to host internal digital applications and services, such as HR, meeting scheduling and purchase management apps, with just over a third (37 percent) relying on it for data back-up and recovery services. This is closely followed by those using it to host external digital applications and services (35 percent). However, despite overall enthusiasm for the benefits that public cloud services can bring to a business, manufacturers are still concerned about security and data management issues. LOB decision makers cited worries about security exploits (50 percent), internal data loss (36 percent) and the reputational cost to the business of these services being breached (43 percent) as prime concerns. Just 8 percent of respondents said they had no concerns over using public cloud services. Despite this, 50 percent of manufacturing LOBs expect the amount they spend on IT to increase by 25 percent over the next year. “Manufacturers are united in their appreciation and use of public cloud services, and understandably so – it can offer the agility and flexibility that many LOBs in the industry need to keep up with rapidly changing market demands,” comments Rob Lamb, Cloud Business Director, UK and Ireland, EMC. “For manufacturing IT departments to be more heavily involved in LOB IT decisions, they need to embrace a cloud strategy that allows others to continue cutting costs and drive efficiencies while mitigating security and data loss concerns.” “To truly realise the full benefits cloud computing can offer, the IT department must offer the business easy access to cloud services, but in a controlled, compliant manner,” comments Richard Munro, Chief Technologist and Technical Director for vCloud Air, EMEA at VMware. “It’s here where a hybrid cloud strategy can fit seamlessly, allowing IT and businesses to embrace the public cloud with confidence, in a secure and manageable way.” For more information on how Hybrid Cloud can deliver the flexibility and agility of public cloud, without compromising safety and compliance regulations, please visit: http://uk.emc.com/cloud/hybrid-cloud-computing/industry-research/index.htm ### Virtualisation looks set to increase in 2016 Despite an expected increase in organisations embracing virtual server platforms, the role of data centres will still feature prominently in 2016. The desire for organisations to reduce their carbon footprint whilst having a solution that still enables high power performance is a key requirement in the migration towards virtual servers. This is further supported by research undertaken by the Cloud Industry Forum, which found that an estimated 84 per cent of enterprises adopted some form of cloud technology in 2015. [easy-tweet tweet="Data Centres are here to stay in 2016, even the increased uptake of virtual server platforms can’t change that"] In addition, when weighing the differences between CAPEX and OPEX, research indicates that there will be an 11 per cent shift in IT budgets from traditional in-house IT systems to ones based on variations of the cloud network, further supporting the shift towards virtualised servers. Yet, despite the rise of virtual servers, the importance of data centres should not be underestimated. Many organisations have not fully migrated to cloud services and are instead adopting a combined approach between cloud-based platforms and data centres. 
What we are now seeing is that whilst organisations use cloud based services for 80 per cent of their operations, the other 20 per cent are still within co-located data centres. Data centres still remain a popular option for those who prefer having direct physical access to where their data is stored, not to mention the increased connectivity of having your data centre located nearby. But perhaps the most important driving factor is the cost-effectiveness of data centres. Rent was between 30 and 40 per cent cheaper over the previous year, and with prices looking to stay low, it seems highly likely that data centres will continue to provide a very viable solution, especially for start-ups. Another key trend that continues to impact both cloud services and data centres is the widespread adoption and presence of the Internet of Things (IoT). Estimates predict that connected devices will increase to 50 billion by 2020, resulting in greater data consumption and prompting a greater need for data storage systems and more power. [easy-tweet tweet="The ever-increasing demands on online activity will add pressure on power needs" user="comparethecloud"] In line with the IoT, the ever-increasing demands of online activity will add further pressure on power needs. In 2015, we saw longer online gaming sessions, significant e-commerce activity fuelled by events such as Black Friday, and the growing use of streaming media such as Netflix and Amazon Prime. The surge in these activities once again highlights the necessity for servers to provide consistent high power to cope with increased demands. Looking forward, the demands on power and the drive to reduce footprint will only accelerate the adoption of virtual servers. However, what we have seen and will continue to see is the ability of cloud services and data centres to co-exist and provide a cost-effective and powerful solution to support the huge boosts in online activity and connected devices. ### 2016: What does the future hold for DevOps? If you haven’t heard of the term DevOps before, you’re probably preparing to stop reading right now. But you shouldn’t! While the term does originate from IT management philosophy, the lessons we can take from its methodologies range far beyond this. [easy-tweet tweet="If you have an understanding of ‘lean software delivery’ then you’ll have an idea of what #DevOps is all about"] Gartner is predicting that 25% of Global 2000 organisations will be leveraging DevOps within their organisations in 2016. Given this trend, it’s important that people understand what DevOps is, why it’s important, and what it can do for your organisation. What’s DevOps all about? If you have an understanding of ‘lean software delivery’ then you’ll have an idea of what DevOps is all about. And if you don’t, fear not. Lean software delivery derives from lean manufacturing methodologies, and the basic premise is not that difficult to understand: by only producing what is needed at the time it is needed, you minimise waste, reduce overheads and therefore boost your profits, and you also make it easier to make decisions quickly because there’s less administration weighing you down. 
The solution that manufacturing companies came up with in the 20th century translated very well to software companies, and currently a majority of software companies have integrated these lean software development processes into their way of working. currently a majority of software companies have integrated lean software development processes into their way of working All sorts of companies are describing themselves as software companies nowadays, and this is at the core of why Gartner is predicting an uptick in the use of DevOps in businesses. Companies such as Target and Lego have already made significant strides in implementing DevOps to improve internal company process, and when such household names begin doing something new with successful results, the rest of the corporate world pays attention. So how does DevOps actually work? There is one problem that almost every large organisation will experience at some point in its development, and that is one of silos - meaning teams within a company that do not communicate with each other. Unsurprisingly, silos make work more difficult due to lack of coordination across teams, which ultimately leads to conflict and duplication of work. It was in seeking to overcome the problem of siloing that DevOps first appeared: in a software organisation two teams that commonly have to work very closely together are Development and Operations, hence the portmanteau DevOps. What began as a methodology for ensuring Development and Operations work efficiently together has now outgrown its original purpose, as many DevOps specialists will excitedly tell you. When you think about it this shouldn’t be surprising: any organisation can experience siloing problems, not just ones that have large and well-established Development and Operations teams. What’s emerging is the realisation that DevOps is fundamentally about streamlining delivery mechanisms, with different groups properly communicating and avoiding unnecessary bottlenecks. DevOps seeks to describe working procedures that ensures a rapid delivery, cultural harmony and technological fit within organisations: it’s clear that this is an awfully big job, and this is one of the primary reasons why the field is full of such hype, misconceptions and poor understanding. [easy-tweet tweet="The realisation that #DevOps is fundamentally about streamlining delivery mechanisms" user="comparethecloud"] What’s waiting in 2016 for DevOps? Many firms right now are choosing to abandon the name DevOps in favour of terms like “rapid release” - for them the name is too weighed down with misunderstanding for the reasons I’ve outlined above. They believe that DevOps should no longer be a job description of a dedicated team, but a way of working that should filter down to everyone within the organisation. I disagree - this cannot happen by magic. DevOps is a job description, but its true value lies in DevOps specialists’ integration into a unified delivery team that spans the entire deployment spectrum. That’s a vision any quality CEO will get behind. My prediction for 2016 is that more people will realise DevOps is not a passing buzzword, and that in the coming years agile operations will become just as ubiquitous as agile development is now. In the new year, managers will overcome their fear of the popularity of DevOps and begin to optimise their operations according to the examples set by lean IT management methods. 
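As a concrete, if simplified, picture of what “streamlining delivery mechanisms” can look like in practice, here is a minimal sketch of an automated release gate: the deployment step only runs if the test step passes, so the hand-off between development and operations becomes a scripted, repeatable pipeline stage rather than a manual exchange between silos. The commands (pytest, ./deploy.sh) are placeholders for whatever your own pipeline actually runs.

```python
# A minimal, hypothetical release gate: run the tests, and only deploy if they
# pass. The test and deploy commands are placeholders for your own tooling.

import subprocess
import sys

def run(step: str, cmd: list[str]) -> None:
    print(f"--- {step} ---")
    result = subprocess.run(cmd)
    if result.returncode != 0:
        # Stop the pipeline before anything is released.
        sys.exit(f"{step} failed; aborting the release.")

if __name__ == "__main__":
    run("unit tests", ["pytest", "-q"])
    run("deploy", ["./deploy.sh", "staging"])
    print("Release candidate is on staging; promote to production when checks pass.")
```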
### Common Cloud Objections: How To Handle The Naysayers According to a recent Verizon survey, hybrid cloud computing is now “mainstream,” with 53 percent of respondents saying they use between two and four cloud providers. Despite a booming market and the marked advantages that come with moving off premises, however, you may experience pushback from executives and IT pros alike. Here are three common objections — and how you can handle the naysayers. Legend of Legacy Every company has applications and services they've been using so long it's difficult to imagine a different way of completing key tasks. It's no surprise, then, that one of the most common cloud objections centers on the disruption of these long-held apps. According to AllBusiness.com, the biggest worry for cloud naysayers is that they'll have to “rip out” existing applications and force them to work with the cloud. This co-mingling could prove costly, time-consuming and the end result may not deliver equal performance. [easy-tweet tweet="One of the most common cloud objections centers on the fear of disruption" user="XOComm" usehashtags="no"] Answering this objection means clearing up a misconception: There is no need to shift or move legacy apps if they don't play well in the cloud. In some cases, moving is simply too costly and companies are better off waiting until they find a viable cloud substitute. Some legacy apps, meanwhile, will never make the transition because of their age and design. In both scenarios, there is no need to disturb the working of legacy apps — move other non-critical services to the cloud first, and then introduce apps that play well with your legacy solutions over time. Unstable Security The next big objection to cloud computing? Security. As noted by MSP Mentor, security is often the first answer given when companies are asked why they're slow to adopt cloud services. On the surface, the concern makes sense: When data moves off site, IT naturally loses a measure of control. If shared cloud servers are hacked in an attempt to grab the data of other companies, your data ends up stolen or destroyed as collateral damage. Dealing with this doom-and-gloom attitude demands two answers: First, as the cloud market has diversified, security has become an industry unto itself, with many providers exclusively focusing on the provision of secure cloud services — in other words, they live and breathe cloud security, and offer far more robust options than simply having data “close to the chest.” It's also possible to leverage the service of “zero knowledge” providers that encrypt your data and then give you the only key. As a result, it's not possible for malicious actors, government agencies or even these cloud companies to access your data without permission. Bad Bandwidth Once legacy and security concerns are addressed, naysayers often fall back to bandwidth arguments. Since cloud services rely on an active, stable and high-volume Internet connection, it's possible to lose access to critical data or applications if your Internet Service Provider (ISP) can't keep up or your network hardware is too old. In this case, the cloud provider is largely left out of the equation: Even the best services in the world are useless if you can't stay connected. 
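One practical way to take the emotion out of the bandwidth argument is to measure it. The sketch below times TCP connections to a cloud endpoint to get a rough feel for round-trip latency before any workload moves; the hostname is a placeholder for the region or service endpoint you actually plan to use.

```python
# A minimal connectivity check: time TCP connections to a cloud endpoint to get
# a rough sense of round-trip latency. The hostname below is a placeholder.

import socket
import time

def connect_time_ms(host: str, port: int = 443, attempts: int = 5) -> float:
    samples = []
    for _ in range(attempts):
        start = time.perf_counter()
        with socket.create_connection((host, port), timeout=5):
            samples.append((time.perf_counter() - start) * 1000)
    return sum(samples) / len(samples)

if __name__ == "__main__":
    endpoint = "example-region.cloud-provider.example.com"  # placeholder endpoint
    print(f"Average TCP connect time: {connect_time_ms(endpoint):.1f} ms")
```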
This objection falls flat if companies take the time to assess cloud needs rather than leaping without a look. Cloud adoption aside, it's worth evaluating the performance of your network and replacing key devices — or switching ISPs — as needed. When infrastructure is up to par, start by taking small bites of the cloud rather than trying to move all services en masse. One key benefit of the cloud is on-demand resources backed by simple scalability; test apps to see if you've got the necessary bandwidth, improve if necessary, and repeat. [easy-tweet tweet="One key benefit of the cloud is on-demand resources backed by simple scalability" user="XOComm" usehashtags="no"] Ready for the cloud but dealing with naysayers? Combat legacy, security and bandwidth worries by leveraging flaws in their logic. ### Distil Networks Acquires Sentor ScrapeSentry to Add 24/7 Security Operations Center and Expert Team of Analysts Distil Networks, Inc., the global leader in bot detection and mitigation, today announced its acquisition of managed security services provider ScrapeSentry, a spin-off of leading managed security firm Sentor. ScrapeSentry’s team of experienced security analysts will provide real-time threat monitoring, analysis, investigation, and response, as well as post-incident reports and best practices to ensure a rapid response to malicious bot attacks. With today’s news, Distil’s 24/7 managed security service is available to mid-market and enterprise customers and is based on each customer’s unique security policy and response plan. Bots represent a new generation of threats, making up nearly 60 percent of all web traffic. Bots are a platform used by hackers and fraudsters alike and are the key culprits behind web scraping, brute force attacks, competitive data mining, online fraud, account hijacking, unauthorized vulnerability scans, spam, man-in-the-middle attacks, digital ad fraud, and downtime. Distil offers the first easy and accurate way to defend websites against these malicious bots -- without impacting legitimate users. By monitoring each page request and building a fingerprint of each incoming connection, Distil allows customers to accurately detect bots in real time and then provides options to mitigate the attack. [quote_box_center] “Distil’s technology is built on machine learning and catches 99.9 percent of the bad guys. By adding ScrapeSentry’s experts to our team, and a human element to our solution, we are getting closer to closing that point one percent gap. With ScrapeSentry’s strong security pedigree and background in managed services, the team’s hands-on, granular approach will provide our customers with dedicated SOC resources along with even faster response times.” -- Rami Essaid, co-founder and CEO of Distil Networks [/quote_box_center] “Bots and scraper programs continue to cause serious problems for online business models. Together with Distil we will have a platform that we can use to provide protection for a broader audience and all customers will benefit from the unmatched strength of the joint development and research teams. ScrapeSentry’s experience in managed security services is a natural extension for Distil and we are thrilled to join the company’s team and mission to make the web more secure.” -- Martin Zetterlund, CEO for ScrapeSentry [quote_box_center] “The most costly attacks against businesses take advantage of legitimate website access. 
While WAFs and scanners can help detect cross-site scripting and SQL injection issues that are properly fixed in the code, Distil Networks' ability to analyze behavior provides the best chance of detecting and blocking bot-driven attacks that take over accounts, scrape sensitive data and drive up fraudulent transactions.” -- Eric Ogren, senior analyst at 451 Research [/quote_box_center]   ### The True IP Communications Story If your small to medium sized business is still using an outdated communications system that takes up an entire telecom room, it’s time to upgrade.  While this was once the only way to do business, the age of the internet has changed everything.  Now there’s a much better and more affordable way to handle your business communications needs: VoIP. VoIP stands for voice over internet protocol, and in layman’s terms that means your communications will be done through the internet. IP communications are an easy and inexpensive upgrade to make. While you may be reluctant to throw out tens of thousands of dollars of communications equipment you just paid off, the switch is totally worthwhile. Making this change will eliminate the need for support and maintenance costs. That alone makes the switch a good business decision. Learn more about IP communications and how making the switch can save your business money from this infographic! ### Getting to Grips with the Business of IT In the past, IT departments tended to lead the adoption of technology to support core business activity. When a new technology such as desktop computers came to market, it took some time for businesses to take advantage of it and even longer for the business to become dependent on it. Once it was adopted however, it became subject to business rules, subject to the expectation that it kept up to date with the rest of the market.  [easy-tweet tweet="Adopting technology-as-a-service models has changed the way #technology and applications are used" user="comparethecloud"] This environment has changed. Today, most organizations operate digitally, adopting technology-as-a-service models that have changed the way technology and applications are used. Support now comes from external vendors and not in-house staff. With adoption of cloud as a service model it is becoming increasingly possible to not have an IT department at all. digitalisation cannot be held back, particularly in the era of Cloud, Big Data, and the Internet of Things IT in business Most IT departments to this day remain close to their technology roots. This is partly because they continue to be run by technologists and engineers whose primary interest lies in the adoption and challenge presented by new technology. However, not every highly qualified engineer makes a good business person, but most organizations promote people who are good at their jobs into management. This has resulted in many IT departments traditionally not being run as a business and good models for how IT should run have been piecemeal or slow to develop. This is despite the key role that IT has increasingly had in enabling digital business and in turn influencing how that business should be run. While some standards have been developed as guides for how different elements of IT should be run (COBIT for governance, ITIL for service management, TOGAF®, an Open Group standard, for architecture), there has been no overarching standard that can encompass how all of IT should be holistically managed. 
Despite all the technological advances, IT has yet to become a smooth-running business machine. No longer business as usual Neither the business nor the technological climate is the same as it was when companies took years to implement a software upgrade. With today’s emphasis on business and technology agility, everything happens nearly instantaneously. With this pressure (among many others) facing IT, departments need to either change or adopt a model where IT is more effectively managed. Failure to do so will result in a certain level of chaos that will hinder an organisation from achieving its business objectives. The absence of an effective management model for IT means that companies aren’t able to move quickly in a digital age. An inability to utilise data effectively could, for example, result in problems when looking to invest in a new product that customers are keen to use. Such mistakes can be the difference between success and failure in the modern economy. By adopting a coherent business model for the IT operation, you are able to make better choices for your IT department. With technology evolving so quickly, you have to be able to understand what will help achieve your organisation’s business goals. [easy-tweet tweet="IT departments without a good business model and strategy will not know when to invest in the correct technologies"] IT departments that do not have a good business model and have not aligned their portfolios with the business strategy will not know when to invest in the correct technologies, and as a result may miss out on the new technologies that could provide significant business benefit. Without a framework to map how new technology could fit into the business, the result may be a technologically advanced department, but not one able to support the broader business. A new world for IT To help avoid the consequences for the IT department if it isn’t run more like a business, industry leaders have collaborated and formed a consortium that addresses how to better run the business of IT. With billions invested in IT every year, these companies recognised that such investment must be made wisely to show long term results. The Open Group’s IT4IT™ Forum The result is The Open Group’s IT4IT™ Forum, which in October released its first standard for running IT more like a business. This standard provides a unified operating model for IT, supplying the “missing link” that previous IT-function specific models don’t address. The standard allows IT to achieve the same level of business discipline, predictability and efficiency as other business functions. The time is right for IT4IT as digitalisation cannot be held back, particularly in the era of Cloud, Big Data, and the Internet of Things. The IT4IT standard places IT at the heart of a business model that allows IT to be a core part of the modern enterprise, providing a clear path for digital businesses to compete and thrive for many years to come. ### Understanding DevOps in 2016 I recently sat down with Vadym Fedorov, a Solutions Architect from SoftServe, to discuss DevOps. Fedorov specialises in Enterprise Technologies, Cloud Solutions and DevOps. 
As a Solution Architect Fedorov manages the full software development life cycle, including application design, performance analysis, code optimisation, re-design and platform migration, requirements analysis, usage and development of design patterns and infrastructure planning.  [easy-tweet tweet="CTC is talking #DevOps with Vadym Fedorov from SoftServe" user="comparethecloud" hashtags="cloud"] Fedorov is certified by major technology providers including Cloudera (Hadoop), Microsoft and Cisco, so he is more than qualified to help me (and you!) understand DevOps. Q. DevOps seems to be a buzzword - why should we care? In the not too distant past enterprises had a small number of servers, and it was standard for one system administrator to manage up to 30-40 servers. Today, organisations have hundreds of servers, and their infrastructure has also changed. find out how “DevOps” was born Whereas in the past the deployment of infrastructure was linked to the manual deployment of hardware and software, most Operations and Infrastructure teams now work with a cloud infrastructure provider that offers programming access for infrastructure management. As a result, System Administrators had to start programming or Programmers had to start thinking about infrastructure in order to automate the provisioning and management of infrastructure – and “DevOps” was born. It lends itself to customer facing software: an app, a website, firmware updates, but that software customer could equally be internal: helping a business try new technologies or drive continuous improvements. It’s also enabled by advances in technology. The Increased automation of infrastructure combined with collaboration tools can make it possible for everyone in a team to work together in an agile approach, accelerating software delivery like developer teams have with their coding for many years. Q. You mentioned automation, what role does it play in DevOps? Automation is the backbone of cloud services, helping providers offer customers powerful control panels and hugely flexible and dynamic configurations. Cloud providers are opening up these automation layers through additional services and APIs and this means companies are able to stage, test and deploy new software automatically from continuous delivery tools which orchestrate the software pipeline. This level of automation is typically harder with on-premise servers, networking and storage that aren’t built for the software defined era. [easy-tweet tweet="Automation is the backbone of cloud services" user="comparethecloud" hashtags="cloud, devops"] Q. What opportunities does cloud offer? For startups, it is a highly cost effective and lean infrastructure to take an idea from concept to a public beta or trial. For example, we helped user experience optimisation expert Yottaa employ agile development on Amazon Web Services to deliver its operation Intelligence platform into production. Larger companies can leverage the same infrastructure for R&D, or production applications where multi-tenant hardware suits their needs. As long as your application has a distributed design from the outset, it can scale up on a flexible IaaS as needed to fulfil demand without crippling CAPEX. ### Tim Poultney speaks to CTC for #CloudTalks Tim Poultney takes time to talk to Compare The Cloud about his company, VEBER, as part of the #CloudTalks series. 
#CloudTalks ### Brad Ross speaks to CTC for #CloudTalks Brad Ross takes time to talk to Compare The Cloud about his company, Pivot3, as part of the #CloudTalks series. #CloudTalks ### Tackling a new CIO agenda IT is changing organisations and their workforces beyond recognition. It’s said that every employee is now a digital employee; we enjoy technology, and recognise its relevance to the wider corporate agenda. Our evolving technical knowledge also allows us to adopt shadow IT in the workplace, in an attempt to streamline individual workloads. As such, being just ‘the IT guy’ is no longer an option – IT professionals are now expected to step up to take a valued and strategic role within the wider business. [easy-tweet tweet="IT professionals are now expected to step up to take a valued and strategic role within the wider business says @colt_technology" hashtags="cio"] With this renewed focus, IT leaders are expected to juggle day-to-day operations whilst remastering business units to respond to the new digital economy. The CIO is regarded as a lynchpin for digital transformation at all levels, whether consulting on new technologies, shaping cross-departmental initiatives or leading pivotal company-wide projects. In the instance of IT challenges or disasters, they are also the first port of call. Such pressure could be viewed as a given in this role. After all, the IT function has always come with a high element of risk. But as technology becomes more engrained in the business, how are these mounting expectations impacting the individual at the helm? [easy-tweet tweet=" The #CIO is regarded as a lynchpin for #digitaltransformation at all levels" user="colt_technology @comparethecloud"] This updated job description should, in theory, equal a new mind-set. Yet in a recent Colt study it was found that the majority (68%) of CIOs are still making pressured decisions based on their own instinct and experience, above any other factor. The majority (76%) also admit that their intuition can be at odds with other sources, such as big data reports or advice from third parties. While this is understandable given the fast pace of change and potentially contradictory data on offer – it may also indicate a more deep-seated issue to be addressed. What is the cause of this reliance on tried-and-trusted methods of decision making? CIOs certainly feel a greater pressure on their shoulders, so one could attribute this to an increased feeling of personal risk. This sentiment was echoed in the same ‘Moments that Matter’ study, where more than three-quarters (76%) of senior IT leaders said they felt more individual risk when making decisions, as IT evolves into a more strategic role. After all, if the stakes are high and the individual is compelled to make the right decision that will result in business (and thus career) success – gut instinct and professional judgement will surely outplay the potentially conflicting insights or advice. [easy-tweet tweet="Are #IT leaders making full use of the expertise that already exists in the company?" hashtags="cio"] Yet the question remains: are IT leaders making full use of the expertise that already exists in the company? Do they make the most of internal – and to some extent – external expertise to cope with this huge digital transformation? 
Whilst CIOs are already adapting their way of thinking – focusing less on operational matters and instead majoring on delivering value to the business – there is still a concerning pattern which emerges from the research. The findings suggest that some IT departments still act in an insular way by making their own IT-based decisions. When dealing with issues or risks, the IT professional tends to consult with others in their department, rather than the wider business. This approach worked well when IT’s main objective was to prioritise the internal needs and pressures within a business. But to meet the demands of today’s digital world, innovation must be led by the needs of the end-customer and of the market. According to Gartner, more than 21 percent of IT investment now takes place outside of an official IT budget. Department leaders have their own priorities and budgets, which must keep up with market demands and the increasing pace of change. The modern day CIO is therefore expected to be a trusted advisor across the business and to take the time to learn from counterparts in other divisions – their priorities, struggles and customer needs. Fostering these solid relationships enables insights on different functions, but also provides the CIO with a sense of the business’s tolerance of risk. This depth of understanding will help them prioritise the projects to focus on – whether prototyping, piloting or rolling out – and identify the ideas that will make the most business impact. In an age of digital transformation, it is interesting and understandable that IT decisions are often still based on instinct and experience. Yet looking to the next phase, ensuring solid communications and collaborative approaches can promote enthusiasm around new forms of tech adoption. Most importantly, collaboration presents a common goal to drive innovation and create advantage, no matter whose idea it was originally. For the CIO, this approach helps to ease the pressure of IT-driven decision making, and helps them grow into a new generation of IT leader. ### Virtual HPC Clusters Enable Cancer, Cardio-Vascular and Rare Diseases Research eMedLab, a partnership of seven leading bioinformatics research and academic institutions, is using a new private cloud, HPC environment and big data system to support the efforts of hundreds of researchers studying cancers, cardio-vascular and rare diseases. Their research focuses on understanding the causes of these diseases and how a person’s genetics may influence their predisposition to the disease and potential treatment responses. [easy-tweet tweet="A HPC cloud environment designed by OCF is working to save lives with eMedLab" user="comparethecloud" hashtags="healthtech"] The new HPC cloud environment combines a Red Hat Enterprise Linux OpenStack Platform with Lenovo Flex System hardware to enable the creation of virtual HPC clusters bespoke to individual researchers’ requirements. The system has been designed, integrated and configured by OCF, an HPC, big data and predictive analytics provider, working closely with its partners Red Hat, Lenovo and Mellanox Technologies, and in collaboration with eMedLab’s research technologists. The High Performance Computing environment is being hosted at a shared data centre for education and research, offered by digital technologies charity Jisc. 
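As a rough illustration of the self-service model described above – researchers assembling bespoke virtual clusters on demand – here is a minimal sketch using the OpenStack SDK. It is not eMedLab’s own tooling: the cloud name, image, flavor and network IDs are placeholders, and eMedLab researchers would use the platform’s web interface rather than raw API calls.

```python
# A minimal sketch of standing up a small virtual cluster on an OpenStack cloud
# with the openstacksdk client. All identifiers below are placeholders.

import openstack

conn = openstack.connect(cloud="emedlab")   # placeholder cloud entry in clouds.yaml

IMAGE_ID = "REPLACE-WITH-IMAGE-UUID"        # e.g. a genomics analysis image
FLAVOR_ID = "REPLACE-WITH-FLAVOR-UUID"      # cores/memory chosen per project
NETWORK_ID = "REPLACE-WITH-NETWORK-UUID"    # the project's private network

def build_cluster(prefix: str, size: int):
    """Create `size` identically configured nodes and wait for them to be active."""
    nodes = []
    for i in range(size):
        server = conn.compute.create_server(
            name=f"{prefix}-node-{i}",
            image_id=IMAGE_ID,
            flavor_id=FLAVOR_ID,
            networks=[{"uuid": NETWORK_ID}],
        )
        nodes.append(conn.compute.wait_for_server(server))
    return nodes

if __name__ == "__main__":
    for node in build_cluster("rare-disease-pilot", size=4):
        print(node.name, node.status)
```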
The data centre has the capacity, technological capability and flexibility to future-proof and support all of eMedLab’s HPC needs, with its ability to accommodate multiple and varied research projects concurrently in a highly collaborative environment. The ground-breaking facility is focused on the needs of the biomedical community and will revolutionise the way data sets are shared between leading scientific institutions internationally. The eMedLab partnership was formed in 2014 with funding from the Medical Research Council. Original members University College London, Queen Mary University of London, London School of Hygiene & Tropical Medicine, the Francis Crick Institute, the Wellcome Trust Sanger Institute and the EMBL European Bioinformatics Institute have been joined recently by King’s College London. “Bioinformatics is a very, very data intensive discipline,” says Jacky Pallas, Director of Research Platforms, University College London. “We want to study a lot of de-identified, anonymous human data. It’s not practical – from data transfer and data storage perspectives – to have scientists replicating the same datasets across their own, separate physical HPC resources, so we’re creating a single store for up to 6 Petabytes of data and a shared HPC environment within which researchers can build their own virtual clusters to support their work.” The Red Hat Enterprise Linux OpenStack Platform, a highly scalable Infrastructure-as-a-Service [IaaS] solution, enables scientists to create and use virtual clusters bespoke to their needs, allowing them to select compute, memory, processors, networking, storage and archiving policies, all orchestrated by a simple web-based user interface. Researchers will be able to access up to 6,000 cores of processing power. “We generate such large quantities of data that it can take weeks to transfer data from one site to another,” says Tim Cutts, Head of Scientific Computing, the Wellcome Trust Sanger Institute. “Data in eMedLab will stay in one secure place and researchers will be able to dynamically create their own virtual HPC cluster to run their software and algorithms to interrogate the data, choosing the number of cores, operating system and other attributes to create the ideal cluster for their research.” Tim adds: “The Red Hat Enterprise Linux OpenStack Platform enables our researchers to do this rapidly and using open standards which can be shared with the community.” Arif Ali, Technical Director of OCF, says: “The private cloud HPC environment offers a flexible solution through which virtual clusters can be deployed for specific workloads. The multi-tenancy features of the Red Hat platform enable different institutions and research groups to securely co-exist on the same hardware, and share data when appropriate.” “This is a tremendous and important win for Red Hat,” says Radhesh Balakrishnan, general manager, OpenStack, Red Hat. “eMedLab’s deployment of Red Hat Enterprise Linux OpenStack Platform into its HPC environment for this data intensive project further highlights our leadership in this space and ability to deliver a fully supported, stable, and reliable production-ready OpenStack solution. Red Hat technology allows consortia such as eMedLab to use cutting edge self-service compute, storage, networking, and other new services as these are adopted as core OpenStack technologies, while still offering the world class service and support that Red Hat is renowned for. 
The use of Red Hat Enterprise Linux OpenStack Platform provides cutting edge technologies along with enterprise-grade support and services, leaving researchers to focus on the research and other medical challenges.” “Mellanox end-to-end Ethernet solutions enable cloud infrastructures to optimize their performance and to accelerate big data analytics,” said Kevin Deierling, vice president of marketing at Mellanox Technologies. “Intelligent interconnect with offloading technologies, such as RDMA and cloud accelerations, is key for building the most efficient private and cloud environments. The collaboration between the organisations as part of this project demonstrates the power of the ecosystems to drive research and discovery forward.” [quote_box_center] The new high-performance and big data environment consists of: Red Hat Enterprise Linux OpenStack Platform; Red Hat Satellite; a Lenovo System x Flex system with 252 hypervisor nodes and a Mellanox 10Gb network with a 40Gb/56Gb core; and five tiers of storage, managed by IBM Spectrum Scale (formerly GPFS), for cost-effective data storage – scratch, Frequently Accessed Research Data, virtual cluster image storage, medium-term storage and previous versions backup. [/quote_box_center] ### Consolidation Among Cloud Service Providers Time to Get Big, Get Local or Get Out! While much of the recent focus has been on the big players in the Cloud market, which are all competing for scale, there remains a competitive play for focused MSPs that is often overlooked. [easy-tweet tweet="The #cloud market is heading the same way as automotive manufacturers says @zsahLTD" user="comparethecloud"] After an initial growth phase in which many pioneers create a new market, there typically follows a consolidation phase in which mergers, acquisitions and divestitures herald the beginning of a level of market maturity. There were once hundreds of small motor manufacturers, but now you have the volume players like Volkswagen, Ford or Toyota and the specialists with niche brands like Porsche, Mercedes or Jaguar. Those caught in no-man’s land tend not to survive – just look at Fiat – it lacks the volume to compete on scale and the brand to compete on quality. The same trend is currently playing out in the cloud market. Signs of this process are as follows: 1) Generalists quit: HP has been a leader in printers, PCs and corporate servers, but it realised that it couldn’t compete in public cloud so pulled out (shortly before splitting into two firms) to focus on what it really was good at. Likewise Verizon has attempted to span the areas of wired and wireless connectivity as well as cloud, but is now reportedly looking to sell off some of its data centre facilities, in a move to refocus on its wireless business. 2) Large players acquire smaller ones: Before admitting defeat, HP had bought Eucalyptus, ConteXtream, Aruba Networks and Voltage Security. Meanwhile Microsoft has bought companies like Adallom, while IBM has added Gravitant, Blue Box, and Cleversafe to its earlier big SoftLayer purchase. 3) Specialists emerge: As markets mature, niche or vertical segments appear that are served by local, specialist or premium providers, as has been the case with the many segments now served by local MSPs. 
Unlike Verizon, which tried to do everything from connectivity to cloud, MSPs focus on their core area of expertise and partner across value networks in which each player – from the MSP itself to the connectivity providers, ISVs and other specialists it teams with – contributes to a solution that meets a client’s needs more closely than the big players ever could. So what should clients make of this, and what should they understand if they are to get the most from their provider? For commodity requirements (like long-term storage) consider the big players – for example AWS Glacier costs less than a cent per GB per month. For business-critical needs consider an MSP with the specialist skills and knowledge that you are looking for – one that understands your business. Also try to avoid generalists and instead concentrate on those suppliers that are best at what they do and that focus on driving value and competitive advantage through their core strengths. Consider value networks where the various players have experience of working together to provide best-of-breed components in a fully integrated overall solution. Avoid the jack-of-all-trades that claims to be able to do it all. Be aware: firms that get acquired are either keen to sell (cashing out) or forced to sell (fire-sale bargains). If your provider is one of these and does get acquired then you need to be prepared for the consequences – your applications and data may change hands, meaning potential changes in terms of service, SLAs, etc. [easy-tweet tweet="There are #MSPs for every industry, you just have to find one that serves you right says @zsahLTD"] In any business there are commodity needs (like long-term storage) that are hardly critical, but there are also areas where your data and your applications are critical, not only to the way that you do business, but also to the value that you provide to your own clients. If you are a law firm, there are MSPs that serve this segment: they understand how law firms operate, they are familiar with the applications that law firms use, and they have worked regularly with the ISPs and connectivity suppliers that also serve this market segment – they are probably the best option for you. The same is true for many other segments. Whatever your business, there is likely to be an MSP out there that is right for you – we’d like to think that it might be Zsah, but even if it isn’t then these rules still apply! ### PTC Completes Acquisition of Kepware PTC today announced it has completed its acquisition of Kepware, a software development company that provides communications connectivity to industrial automation environments. The acquisition enhances PTC’s portfolio of Internet of Things (IoT) technology, and accelerates the company’s entry into the factory setting and Industrial IoT (IIoT). Kepware’s KEPServerEX® communications platform will become a strategic component of the PTC ThingWorx® IoT technology platform. Once the companies’ products are integrated, machine data can be aggregated into the PTC ThingWorx platform, integrated with a wide array of internal and external information, and then automatically analyzed using ThingWorx machine learning capabilities. The integration will allow organizations to gain enterprise-wide insight and to proactively optimize mission-critical processes – enabling them to improve operational performance, quality, and time to market. 
“With Kepware, PTC will be able to accelerate its delivery of world-class solutions to industrial environments, and help manufacturers, infrastructure operators, and others realize the enormous potential of the Industrial Internet of Things,” said Jim Heppelmann, president and CEO, PTC. “We welcome and look forward to working closely with our Kepware employees, customers, and partners.” “Heterogeneous industrial connectivity will be one of the big challenges for creating a truly connected and digital factory. By enabling customers to combine external data inputs with industrial controls through the integration of Kepware with PTC ThingWorx, PTC is extending its leadership in evolving the on-demand economy,” said Hyoun Park, Chief Research Officer, Blue Hill Research. Over the past 12 months, privately-held Kepware generated approximately $20 million in revenue. PTC drew on its credit facility to finance this transaction. PTC expects Kepware to be neutral to its FY’16 non-GAAP EPS, and will provide additional financial guidance related to the acquisition on its fiscal Q1’16 earnings call, scheduled for Wednesday, January 20, at 5:00 pm Eastern Time. ### Known Vulnerabilities are the Leading Cause of Exposure to Data Breaches and Cyber Threats BMC, the global leader in software solutions for IT, in association with Forbes Insights today released results from a security survey of more than 300 C-level executives across Europe and North America, revealing that known vulnerabilities are the leading cause of exposure to data breaches and cyber threats. The report also confirms a significant gap between the security and IT operations (SecOps) teams, which is contributing to unnecessary data loss, production downtime, and potential reputation damage. The survey revealed that 44 percent of security breaches occur even when vulnerabilities and their remediations have previously been identified. Put simply, it takes far too long to fix a vulnerability once a patch becomes available. When asked why, 33 percent of executives surveyed stated it was challenging to prioritise which systems to fix first, since the security and operations teams may have different priorities. While the joint efforts of security and IT operations ultimately determine an enterprise’s security strength, the individual goals of these two groups are often out of sync. The biggest areas of risk for an enterprise are outdated and poorly synchronised internal procedures that thwart efforts to quickly defend against known threats. When asked about the challenges faced by IT and security, 60 percent of executives surveyed said the IT operations and security teams have only a general or a little understanding of each other’s requirements. Yet, 50 percent don’t have a plan in place for improving the coordination between these two groups. “Today, it often takes companies months to remediate known vulnerabilities – exposing them to potential breaches for six months or more as they work to resolve known threats,” said Jason Andrew, GM and VP of Sales, BMC EMEA “To discover, prioritise and fix vulnerabilities quickly calls for improved coordination between the security and IT operations teams. Narrowing the SecOps gap is critical to protecting an organisation's brand and also ensures customer confidence in the ability for the business to protect its information.” As companies prepare for 2016, CIOs need a plan to address the SecOps gap. 
European businesses are lagging behind their North American counterparts in this regard, with only 37 percent of those surveyed (compared to 60 percent in North America) planning to purchase or implement SecOps solutions in the next twelve months. C- level executives across EMEA should therefore consider a number of actions outlined in the report, including: Create cross-functional working groups to share security, compliance and operational concerns while implementing regular meetings to build loyalty and trust. Develop collaborative workflow processes that smooth interactions of security, IT operations and compliance personnel. Replace error-prone manual processes with intelligent compliance and security platforms that automate the testing and rollout of security patches and provide centralised information management tools.   “It is time to rethink the traditional, departmentalised, siloed approach to security given the increasingly sophisticated threats,” said Roy Illsey, principal analyst, infrastructure solutions  at Ovum. “Both security and IT operations groups must be held accountable for identifying and fixing issues quickly and integrate security and IT operations activities to further protect their organisations.” ### Is Your IT Estate More Jekyll or Hyde? The road to hell is paved with good intentions. Nowhere is this sentiment better illustrated than in Robert Louis Stevenson’s enduring tale of Jekyll and Hyde, as the good doctor, in developing a mysterious elixir to try and control his dark side, ended up unleashing the crazed Mr Hyde into an unsuspecting world. Much like Jekyll’s elixir, cloud computing is being brought into IT infrastructures with the hope of benefitting businesses, improving productivity and acting as a driving factor in opening new revenue streams. The brew, however, has proved too heavy a mixture for many businesses according to our new research. Although considered a gateway to innovation, it is ironic to see that harnessing new technologies like cloud computing  are a leading cause in the increasing levels of complexity which are now preventing many organisations from innovating. It is a twist worthy of Stevenson’s story. Harnessing new technologies like cloud computing are a leading cause in the increasing levels of complexity which are now preventing many organisations from innovating Good Intentions It all starts innocently enough. As is the case with many well-established organisations, IT departments are already committed to running much of their mission critical processes across a pre-existing IT estate. A ‘rip and replace’ route into the cloud is not a viable solution but the lure of the cloud elixir is too strong to resist. [easy-tweet tweet="A rip and replace route into the #cloud is not viable" user="SungardAS" usehashtags="no"] Instead, cloud adoption has come in smaller sips, with IT departments adding cloud services as an extension or as a replacement for end-of-life solutions. Organisations are now building, against a long-term strategy, increasingly sophisticated estates using a variety of traditional on-premise solutions alongside external services – including public cloud, co-location, or managed hosting services. This is Hybrid IT. The picture isn’t all bad: our research suggests that a number of organisations who have adopted a Hybrid IT approach have done so as a deliberate strategy and experienced a number of rewards – with over half of businesses (53 per cent) pointing to an increase in business agility. 
In addition, 21% of businesses have enjoyed greater levels of availability as a consequence of their Hybrid IT deployment, and 31% are finding increased levels of security. Indeed 77 per cent of organisations stated that adopting a Hybrid IT approach is a necessary part of staying competitive within their industry. At the Mercy of Mr Hyde Yet others have seen something much darker emerge, an evil side to Hybrid IT that’s becoming progressively more difficult to control. Under the guise of IT complexity, Mr Hyde has well and truly reared his ugly face, increasing IT operating costs for nearly a third of businesses – adding an average of £251,868 to IT spend every year. Mr Hyde has well and truly reared his ugly face, increasing IT operating costs for nearly a third of businesses Moreover, half of organisations agreed they do not have the skill sets needed to manage a complex Hybrid IT environment. Given the importance of Hybrid IT, this is disturbing – organisations admitted they were unable to deal with security issues (38 per cent), the integration of private cloud environments (27 per cent), or even managing different IT systems across the separate departments of their organisation (22 per cent). The end result is a system that IT departments can’t control, much as Jekyll was left struggling to contain Hyde. Most importantly, more than half (53 per cent) of IT decision makers claimed that this complexity is hindering innovation in their organisation. Is your organisation in the grip of the savage Mr Hyde? Here are four key areas to assess: [easy-tweet tweet="Is your organisation in the grip of the savage Mr Hyde? Here are four key areas to assess..." user="SungardAS" usehashtags="no"] Keep Watch for Dark Omens… You have no road-map: Building a Hybrid IT environment with ad hoc purchases and trying to keep numerous disparate applications integrated in a single system is a recipe for disaster. Hybrid IT might be, for many businesses, a stepping stone towards a cloud-first policy but a failure to invest in the right applications now with a view to their true business value will lead to significant issues in the future. You view the cloud as a silver bullet:        The cloud can be great for some projects, but not all applications are suited to this technology and running too many incompatible applications can cause major problems. As truly interconnected hybrid clouds are developed this may ease this issue but the nature of many organisations’ IT will be to remain hybrid, which means cloud will only ever combat part of the monster. Running too many incompatible applications can cause major problems You’ve left yourself with no escape: Rushing to consume the cloud can give you a hangover – and without an exit strategy, organisations can end up locked in and at the mercy of whichever cloud providers they have chosen. What happens if your provider wants to raise the costs? Or it suffers frequent outages? CIOs must make sure their organisation is protected and that doesn’t just mean adding another cloud supplier to the heap. You are starting to lose control: With multiple platforms it is easy to lose grip, so this is where the CIO needs to find a balance between having strong and robust governance in place, without making it so strict that you reduce the opportunity to drive innovation. Exorcizing Your Demons Stevenson’s story may have ended in tragedy but the same does not have to be true for organisations in the grip of a Hybrid IT nightmare. 
One insight is critical in conquering Hybrid IT: a successful IT estate doesn’t come overnight, and it’s important to plan your attack before you embark. Casting a light into the darkest corners of your data centres can give you a clear understanding of each application and service’s importance to your business. Not all technologies or processes are equal, so arming yourself with the knowledge of these requirements will help encourage a successful Hybrid IT strategy, increasing availability, reducing security threats and improving overall resilience. [easy-tweet tweet="A successful IT estate doesn’t come overnight, it’s important to plan your attack before embarking" user="SungardAS" usehashtags="no"] While some organisations are able to vanquish their demons without extra support, for others, bringing in the right expertise is fundamental. Although having those specialists in-house might be the preferred option, the wide range of platforms and technologies which comprise a Hybrid IT estate can make this an expensive and near-impossible option. In this instance, a Managed Services partner – whether used just to advise or to offer full infrastructure support – can be essential. Moving towards the cloud may seem like an irresistible potion to swallow – and one with countless positive outcomes – but it is important to remember that, as with any elixir, its success depends on establishing the proper ingredients you need before you take that first sip… For more information about the horrors of Hybrid IT, or to download the whitepaper, click here. ### The Kings of Big Data Let's look at a historical experiment in Big Data: a project was implemented with the intention of determining the population of France and its socio-economic factors such as health and financial prosperity. Using birth and death records from the French parishes, a figure was obtained by multiplying the number of births by 26. Third-party anomalies such as birth hygiene were also tracked and accounted for. All this was achieved using only partial parish records, with an accuracy of projection to within half a million. Guess what? This was done in 1781 by the French genius Pierre-Simon Laplace. [easy-tweet tweet="Has #BigData existed since the 1700s? " user="comparethecloud" hashtags="data, analytics"] [quote_box_center] There are many descriptions of what ‘Big Data’ is; the quote below is from the Wikipedia description: “Big data is a broad term for data sets so large or complex that traditional data processing applications are inadequate” Another frequently used description, often quoted by a plethora of cloud and Big Data marketers, is: “Big data allows us to provide future predictions based on previously collected data”.[/quote_box_center] Monsieur Laplace built upon something called Bayes' Theorem, an often much-maligned theory first conceived by Thomas Bayes in 1740. Bayes, a religious figure, wanted to make rational decisions about the existence of God and, strangely, based his theorem on a cue ball thrown onto a table. Bayes wanted to discover the probability of where the ball would land. Each throw generated a new data point that updated his estimate based on the prior probability; each throw brought Bayes closer to the true probability. The point here is that today we would generally call these statistical/theoretical models ‘algorithms’ – but remember, there was no Hadoop, Apache Spark, BigInsights or MapReduce in those days. 
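To make that prior-to-posterior updating concrete, here is a minimal sketch in modern Python of the billiard-ball experiment described above. It is purely illustrative – obviously nothing Bayes or Laplace ever wrote – and the grid of candidate positions, the hidden “true” position and the number of throws are all assumptions made for the demonstration:

```python
# Illustrative sketch of Bayes' billiard-ball thought experiment.
# Assumption: the unknown quantity is the table position p of the first ball,
# and each subsequent throw only reports whether it landed to the left of it.
import random

random.seed(42)

GRID = [i / 200 for i in range(201)]        # candidate positions for p
posterior = [1 / len(GRID)] * len(GRID)     # flat prior: no idea where p is
true_p = 0.3                                # hidden position we try to infer

for throw in range(1, 51):
    landed_left = random.random() < true_p  # one new data point
    # Bayes' rule: reweight each candidate by how well it explains the throw
    likelihood = [p if landed_left else (1 - p) for p in GRID]
    posterior = [pr * lk for pr, lk in zip(posterior, likelihood)]
    total = sum(posterior)
    posterior = [pr / total for pr in posterior]
    if throw in (1, 10, 50):
        estimate = sum(p * pr for p, pr in zip(GRID, posterior))
        print(f"after {throw:2d} throws, estimated position = {estimate:.3f}")
```

Each new throw simply reweights the candidate positions by how well they explain the observation – the same prior-to-posterior step described above, just executed in milliseconds rather than by hand.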
Data sets were compiled and analysed by paper and quill (not even a ballpoint pen). Let's roll on to the 1940s, when the United Kingdom was being decimated by U-boats attacking our merchant fleets during the Battle of the Atlantic. From the U-boat pens of Lorient in France, communications were issued that were coded by a machine called ‘Enigma’. Cryptography had moved on considerably, with many thinking the German naval code simply unbreakable (the same way many feel about today's encryption). Faced with impossible odds, in stepped Alan Turing. Using a refined version of Bayesian theory, Turing slowly but surely built Bombes (early computers) and utilised long strips of thin cardboard to decrypt the German cyphers. Turing eventually got to the essence of the problem through probabilities, calculating across large data sets the probability that each deciphered message was part of a genuine character set. This then allowed the German naval messages to be decrypted, saving thousands of lives. I would also argue that most, if not all, big data and analytics products in existence are based in some form on Bayesian principles Roll on to 2016: so where does all this leave us today? Let's look at artificial intelligence: many of today's AI algorithms use Bayes to determine probabilities and to make judgements. I would also argue that most, if not all, big data and analytics products in existence are based in some form on Bayesian principles. [easy-tweet tweet="#BigData is propelling discoveries like never before" user="mrAndrewMcLean"] The analysis of large data sets – from DNA through to health, economics, chemistry and computer science – uses these principles to forecast, analyse and refine existing data and predictions. The use of so-called Bayesian networks looking at cause-and-effect relationships, coupled with the falling costs of computing and cloud, is propelling discovery like at no other time before. My advice, though: if you are thinking about adopting a big data approach in your organisation, why not look to the past and show your project sponsors that, actually, Big Data is not new at all. For further research I highly recommend the book "The Theory That Would Not Die". ### Veeam Availability Suite v9 Available Today Veeam’s Most Scalable Solution to Date Heralds a New Era for Enterprise Availability, Adding 250+ New Innovative Features to its Award-Winning Solution Veeam® Software, the innovative provider of solutions that deliver Availability for the Always-On Enterprise™, today announced the general availability of Veeam Availability Suite™ v9. The latest version of Veeam’s award-winning availability solution introduces more than 250 new innovative features and enhancements to help organisations deliver on the needs of the Always-On Enterprise, enabling SLAs (service level agreements) for recovery time and point objectives, or RTPO™, of less than 15 minutes for ALL applications and data. According to a Forrester/Disaster Recovery Journal survey of global disaster recovery decision-makers, 35 percent of companies reported that a mismatch between business expectations and technology capabilities was one of the biggest challenges faced when recovering from their most recent disaster or major business disruption[i]. The survey found that business managers and executives are demanding their IT organisations deliver better availability, but IT is struggling to meet those expectations. 
With Veeam Availability Suite v9, Veeam is enabling enterprises of all sizes to close this gap (the Availability Gap) and deliver 24/7 availability. “Today’s enterprises are more demanding than ever before, generating ever-increasing amounts of data to which they expect 24/7 access; this places an immense burden on IT to deliver,” said Ratmir Timashev, CEO at Veeam. “Veeam Availability Suite v9 addresses these needs with our most robust, scalable and feature-rich offering to date. Almost 42,000 customers registered for pre-launch updates (double that of v8), which illustrates the desire among enterprises to move from legacy backup solutions to innovative availability solutions. With this latest launch, Veeam is solidifying our position as the only company able to meet the needs of today’s Always-On Enterprise.” Veeam Availability Suite v9 introduces hundreds of enterprise-focused enhancements and new innovative features delivering the following capabilities: high-speed recovery, data loss avoidance, verified recoverability, leveraged data and complete visibility. Notable highlights within Veeam Availability Suite v9 are: [quote_box_center] EMC VNX and VNXe hybrid storage array integration A powerful combination of Veeam and EMC delivers lower RTPOs with Veeam Backup from Storage Snapshots and Veeam Explorer™ for Storage Snapshots. This integration helps IT professionals rapidly create backups from EMC VNX or VNXe storage snapshots up to 20 times faster than the competition and easily recover individual items in two minutes or less — without staging or intermediate steps — all with minimal impact on production virtual machines (VMs).  NEW Additional Primary Storage Integrations for HPE, NetApp and EMC Veeam On-Demand Sandbox™ for Storage Snapshots and enhancements to Backup from Storage Snapshots for storage solutions from EMC, HPE and NetApp to help  leverage enterprise storage investments for dev/test and eliminate impact from backup activities on production environment. Direct NFS Access for any NFS-based primary storage to improve backup performance and remove impact from backup activities on VMware vSphere hosts. NEW Veeam Explorer for Oracle (and many other enhancements for the family of Veeam Explorers) Veeam Explorer for Oracle: Features unique transaction-level recovery of Oracle databases, including agentless transaction log backup so customers can restore databases to a precise point in time, thus achieving low recovery point objectives (RPOs); quick and simple 1-click restore of individual databases; support for databases running on Windows- and Linux-based servers, and fully compatible with Oracle Automatic Storage Management (ASM) New eDiscovery features for Veeam Explorer for Microsoft Exchange: Useful for complying with legal or government investigations, new features include detailed export reports specifying exactly what was exported, from where, and with what search criteria; as well as search query result size estimation capabilities; and Major enhancements for Microsoft enterprise applications: Allows restores of Group Policy Objects, integrated DNS records and configuration partition objects with Veeam Explorer for Microsoft Active Directory, full-site restores with Veeam Explorer for Microsoft SharePoint, and table-level restores with Veeam Explorerfor Microsoft SQL Server.  
DRaaS powered by Veeam Cloud Connect Replication Included in v9, Veeam customers get a fully integrated, fast and secure means to replicate workloads to the cloud, enabling Disaster Recovery as a Service (DRaaS) with dramatically improved recovery time objectives (RTOs). By building a technology that securely and seamlessly bridges the user to a trusted service provider of their choice, Veeam customers no longer have to build and maintain a disaster recovery site, offloading a tremendous amount of cost and complexity from their IT infrastructure. The extended functionality also gives Veeam Cloud & Service Providers the ability to provide their customers with fast and secure DRaaS with the new Veeam Cloud Connect Replication for Service Providers offering. [/quote_box_center] For further details on the new Veeam Availability Suite v9, visit the Veeam blog.   ### Water, Water Everywhere, but not an Internet Link OR Minimising the Hassle and Risks of IT even in the Most Challenging of Circumstances Small and medium businesses (SMBs) face many of the same challenges of their larger counterparts, but have fewer resources to use to meet them. In particular IT has always been a challenge and SMBs have typically had very limited IT skills. [easy-tweet tweet="Learn about how an #MSP can help you in the event of a disaster such as #Flooding with @Veberhost"] This is where an MSP (Managed Services Provider) comes in. Under normal circumstance the MSP serves to take the hassles out of IT and to provide the support and specialist skills to help the SMB move to the cloud, operate in the cloud and keep everything operational and secure. And it’s not just about on-going operations and security. Smaller firms on average tend to move offices more often as they seek a better rental deal, better location or simply expand or downsize. The lift and shift of their IT operations can be minimised by hosting their apps in the cloud. Life starts to get more interesting when things go wrong though. And with weather events like flooding becoming more frequent and more severe, we have to consider the options. If your offices flood then the chances are that any equipment sited there is going to be taken out completely, even if not directly then almost certainly by lack of power or physical access. Firms that plan for this eventuality don’t always think through the potential impact. You may have an emergency power backup, but have you considered your cooling system? One firm in Italy had a small data centre that had local generators to power their servers as well as their cooling systems (cooling can be a big issue in the Italian summer). When a flash flood hit the area, they were located away from the flood plain and even when the power company went down their local generator kicked in. Unfortunately the local water company was also hit and it didn’t have back-up generators. While the data centre had enough power to run the server and the cooling, when the water ran out everything started to melt down. With a cloud-based service mirrored across multiple sites and managed by a local MSP to meet their specific needs, the firm would have been ok. Another firm in New York with its main offices in Manhattan had access to an emergency facility in New Jersey in the event that their offices were ever out of action. It had desks and terminals that were ready to use and servers that could be ready to run their applications on site at short notice. 
When a major snow storm hit, none of the staff could reach Manhattan so the contingency plan was enacted and staff all received a message to head for the alternative site in New Jersey. But the roads were blocked everywhere and none of the staff could get there either. With a cloud-based service mirrored across multiple sites and managed by a local MSP to meet their specific needs, staff could have accessed all the firm’s apps from home and the firm would have been OK. [easy-tweet tweet="Hoping that disaster won’t happen to you simply isn’t a realistic strategy" user="veberhost" hashtags="MSP, DR"] Indeed, many people imagine that remote cloud-based services would be hard to reach in a crisis, but as these examples show, in most scenarios there is some way of getting access these days. If you can’t get broadband or mobile where you are, you can normally move to where you can. The problem is that if the servers you’re trying to connect to have melted down, or cannot be reached because they are local or in an inaccessible location, then you are stuck. Having your apps in the cloud, where they can be mirrored across multiple internationally diverse locations, means that you can reach your apps and data and keep the business running even in the most challenging of circumstances. Closer to home we have seen flood events in all parts of the UK. I can’t help but be touched by the impact on people’s lives, but my thoughts always then turn to hoping that local businesses have taken the time to prepare for disaster recovery. Hoping that it won’t happen to you simply isn’t a realistic strategy. Getting a local MSP that can set things up to meet your specific needs during the good times and still keep things running when it gets bad is essential. ### December Top 50 #CloudInfluence Individuals Final individual report for 2015: Unlike our monthly #CloudInfluence ranking for organisations, which has featured the same handful of top cloud firms in the top ten each month, the #CloudInfluence ranking for individuals has been far more fluid, with fewer regulars and far more change month on month. [easy-tweet tweet="Check out the review of #CloudInfluence Individuals over 2015 with @BillMew" user="Comparethecloud" hashtags="cloud"] End-of-year honours go to George Kurian, CEO of NetApp, whose profile was given a boost at the end of the year, in part due to the firm’s announcement that it intended to acquire SolidFire for $870 million, but also in part due to an unfortunate 14% drop in the firm’s share price over the month of December as a result. Two of the most consistent performers over the year have been Microsoft CEO Satya Nadella (in 4th this month) and FBR Capital Markets analyst Daniel Ives (in 6th this month). Nadella has led Microsoft’s transformation into one of the leading cloud players, while Ives is the go-to source of comment for many of the US networks on the performance, strategy and results of many of the big tech firms. Climbing the rankings to enter the top three in December were Aaron Levie, Box’s Co-Founder and Chief Executive Officer, in 2nd, who has been promoting the firm’s collaboration with Salesforce along with Marc Benioff in 21st, while Larry Johnson in 3rd has been busy promoting the Cloud Awards that he organises. Oracle executives were also active, with Mark Hurd in 5th, Safra Catz in 16th and Larry Ellison in 25th, all seeking to position Oracle as the main challenger to AWS as the database vendor embarks on a massive and ambitious transformation into a public cloud vendor. 
We will be exploring the prospects for Oracle in a forthcoming blog. George Kurian at NetApp and Satya Nadella at Microsoft are CEOs who are also their firm’s main cloud spokesperson or evangelist. Other CEOs of large firms that often appear high in the rankings are Oracle’s Mark Hurd, given Oracle’s big cloud push, and Marc Benioff of Salesforce, which is totally cloud focused. Indeed the top evangelists for Microsoft in December, aside from Satya Nadella in 4th, were Nicole Herskowitz in 7th and Scott Guthrie in 20th, all helping the firm reach number one in the organisation ranking. Steve Ballmer also appeared in 22nd. Conversely, few executives from rivals AWS, Google and IBM appeared in the December rankings, and their CEOs appear intermittently if at all. Google CEO Pichai Sundararajan, better known as Sundar Pichai, hasn’t appeared in any of our monthly rankings, and neither have Larry Page, CEO of Google's parent company Alphabet, or co-founder Sergey Brin. All three are far more heavily identified with Google’s other activities, like search and advertising, than with cloud. Meanwhile Jeff Bezos, CEO of Amazon, and Ginni Rometty, CEO of IBM, may talk about cloud more often, but they only tend to appear in the rankings every third month when their quarterly results are announced. Instead Andy Jassy, SVP of Amazon Web Services at Amazon.com, often appears, as does Robert LeBlanc, SVP of IBM Cloud. Recent changes may impact this – Google has recently appointed VMware cofounder Diane Greene to run its cloud business. At the same time, while Lance Crosby may have been about the last of the SoftLayer leadership team to leave IBM last January, the more recent departure of 18-year veteran Danny Sabbah, IBM’s cloud CTO, may be a more significant loss. [quote_box_center] Other notable appearances this month include: In 8th, Meg Whitman, now running only half of her former empire – the HP Enterprise part. In 10th, Gary Shapiro of the Consumer Electronics Association, as the world gears up for CES 2016. In 22nd, Kim Hammonds, CIO of Deutsche Bank, evangelising about cloud as an increasingly important focus for the fintech sector. [/quote_box_center] [table id=58 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### New Year’s Resolutions from Druva and LAN2LAN New Year’s Resolution #1 – Don’t put off looking at how you will deal with regulations for too long “The European Union’s General Data Protection Regulation will hang over IT teams for the foreseeable future – any organisation that collects data on its customers within Europe will have to have its security and data protection affairs in order. For companies that are moving more of their IT infrastructure into the Cloud, this represents a big challenge. For those with mobile workforces – and to be honest, these days that is almost all companies – the challenge will be even greater. Look at this sooner rather than later.” Jaspreet Singh, CEO, Druva [quote_box_center] New Year’s Resolution #2 – Don’t just look at DR in isolation “Disaster recovery has been a big selling point for cloud services over the past two years. 
This year, we’ll see more collaboration taking place between the DR team and the compliance team within the business. The reason for this is that there are more files being created on mobile devices rather than within the confines of the office, which leads to potential gaps. Mobile devices also open up more channels for communication between employees, and between the company and its customers or suppliers – think about how text messages are being used, or chat apps are getting added to improve collaboration. All of those channels could contain data that is relevant to eDiscovery and audit. Without taking mobile and edge computing into account, companies run an unnecessary risk.” Jaspreet Singh, CEO, Druva [/quote_box_center] New Year’s Resolution #3 – Plan ahead while trying to be agile “More companies are going Cloud, or increasing their investment in Cloud rather than on–premises IT. This shift has led to more conversations around security and disaster recovery in general too – the risk of failure amongst public cloud services providers has led to better planning discussions taking place. You’d think that issues would dull the appetite for Cloud, but instead there is recognition that it has to be thought through and architected properly. Those companies that have done a good job around Cloud DR and planning are leading the way.” Gary Duke, Sales & Marketing Director, LAN2LAN [quote_box_center] New Year’s Resolution #4 – Don’t think about Cloud on its own – focus on the outcome "Rather than the Cloud being a solution to every problem, there’s much more focus on how to make use of Cloud over time and in a more systematic way. This approach can help CIOs modernise their IT faster and let them concentrate on building solutions for business challenges, which is where the real value can be created by IT.” Gary Duke, Sales & Marketing Director, LAN2LAN [/quote_box_center] ### Protecting cloud based inventions When considering cloud computing and intellectual property, it’s not enough just to look at copyright, the most powerful means of protecting an invention is a patent.  Copyright can only catch another company who copies your code, but a patent can stop anyone who infringes your patent claims. However, obtaining a patent within the software arena - and therefore the cloud computing environment - comes with a number of challenges. [easy-tweet tweet="#Cloud computing is a grey area when it comes to what can and can't be patented" user="comparethecloud" usehashtags="no"] Key challenges Cloud computing is a grey area when it comes to what can and can’t be patented. The primary challenge, and one for the whole of the software industry, is that software per se can’t actually be patented in Europe.  To overcome this exclusion, we need to prove that a cloud invention has a technical effect beyond the simple use of software and make a case for its technical features; purely business or administrative methods are not seen as technical.  Patents are territorial rights and not many companies can afford to patent all over the world The second challenge is choosing the jurisdiction in which the patent is being applied for and granted. Patents are territorial rights and not many companies can afford to patent all over the world.  For example, if we were looking to obtain a patent for a system with a user in the UK and a server, we might want a patent in the UK. 
However, it’s very likely the server element will be located in the cloud, and the invention could be in the server, yet the server could be located anywhere in the world. This offshore facet means it can often be difficult to enforce a patent against infringers. Generally, this cloud computing patent infringement problem relates to network access to a shared pool of configurable computing resources, such as networks, servers, storage, applications and services. Typically it arises when some aspect or part of the computing resources which are accessed and used to infringe the patent are outside the territorial scope of the patent (offshore). So where do we apply for the patent and how can we enforce it? So where do we apply for the patent and how can we enforce it?  Whilst this might seem an insurmountable problem there are beacons of hope that can help our patent strategy and it is an area where judges really work hard to protect the patentee. As an example: US company Align Technology Inc created the Invisalign system, including an incremental adjustment invention for a dental brace appliance and patented it in Europe. The invention was based on providing a digital data set showing the patient’s teeth as they are (a) and a digital data set of the teeth as they should be (b), with devices designed to move the teeth from position a) to b) over time. The critical element of the invention was that the dental devices were made from a series of calculations that provided successive digital data sets between position a) and position b).  When Align Technology wanted to use their patent in Germany to stop Ortho Caps GmbH using this technology, a challenge in enforcing the patent arose because the process carried out by Ortho Caps was outsourced offshore. Normally, to enforce a patent, everything defined in the main patent claim needs to be carried out by the infringer in the jurisdiction. In this case, the digital data was produced in Pakistan and the dental braces were made in Germany. However, the German infringement court found that the alleged infringer had put the patent to use in Germany, even though the digital data sets were not produced in Germany. In the leading UK case, which relates to interactive casino games played on the web with an offshore host computer, the UK courts came to a similar conclusion - it would be wrong to apply the old ideas of location to inventions of the type under consideration. So in both countries the courts helped the patentees to enforce their rights. It would be wrong to apply the old ideas of location to inventions of the type under consideration [cloud located] Patent law complexities The complexity of patent laws in different jurisdictions and the cost involved in applying for patents present a further challenge. Consequently, an invention tends to be patented in the more important jurisdictions, and in areas where it is expected competitors may well operate, with patents usually enforced in the location of the user. A new idea? The number of patents in the software area and similarities in software approaches for different purposes present a further challenge for cloud based inventions, as previous work means similar ideas may well exist and have been patented, making it hard to have or to prove a new idea. Given the challenges, it’s no wonder companies are concerned about the ability to patent their cloud based inventions and to assert that patent if some parts of the invention would usually take place outside of the jurisdiction. 
[easy-tweet tweet="Get the expert's top tips on applying for #cloud patents" user="comparethecloud" usehashtags="no"] [quote_box_center] Top tips: applying for a patent to protect an invention Prior to being a Patent Attorney at Haseltine Lake I was a European Patent Office Examiner, so I understand both sides when it comes to applying for and granting a patent.  These are my tips for applying for a patent: You can achieve a software patent if you can prove it is technical, so have a good idea where the border is between technical and non-technical. Know what kind of invention you have, for example a new banking method could be all about improved security, so it’s the improved security which is the technical feature. The way you write the invention is crucial: Use the correct technical vocabulary.  For example, consider the impression the write up of the data structure would give to a European Patent Office Examiner.    A simple word change could be significant, for example “accessing” a digital data set instead of “providing” can make the method relevant in the country where the user is located, irrespective of where the system is in the cloud. The written patent application needs to be granted and enforceable, and to give the best possible chance, use the many different categories of patent claim available, such as the method, the computer program carrying out the method, the IT system, the server and the user terminal.  Of course the challenge is to write a patent application that works well in many different jurisdictions, each with its own laws! Currently companies need to assert a patent in a national court, so decide which countries to patent in based on your competitors’ activities. Whilst the patent system is slow to progress, inventions in the cloud move at speed, so protect your invention (your company and any future investment) by patenting it from the start – once it’s disclosed, it’s too late for a patent! [/quote_box_center] ### An Introduction to Axians Compare the Cloud has been sitting down with cloud experts to learn more about their companies, industries, and thoughts on the future. Our latest interviewee has been David Millar, Services Director of Axians which is a subsidiary of VINCI Energies. [easy-tweet tweet="Read the interview with @AxiansUK's David Millar on @Comparethecloud"] VINCI Energies is active in numerous sectors working to improve the daily life of people today, from energy and industry to transportation, buildings and smart cities. Axians is a global brand dedicated to information and communication technologies (ICT), bringing together the specialist IT solutions and services networks of VINCI Energies. Axians supports its customers throughout the entire lifecycle of their ICT projects to help them achieve their goals and improve their performance. Their ambition is to become a recognised leader in a global market with strong growth prospects, particularly in segments like the cloud, big data and mobility, in synergy with the other VINCI Energies brands and business lines.            David Millar, Services Director, Axians UK David is responsible for the management and development of Axians’ services-led business in the UK, he has been with Axians for 11 years. His primary focus lies in the creation of differentiated value propositions for the company’s service provider and public sector clients, enabling them to leverage a competitive advantage. 
Previously he has been Managing Director at The Sales Advisor LTD (2001 – 2005), and Enterprise Sales Director (2005 – 2009) and Business Development Director (2009 – 2012) of Imtech ICT before taking up the position of Services Director at Axians UK. David, can you tell us about Axians as a company? DM: Axians UK has a strong technical heritage based on the integration of networking, security and data centre solutions. We help customers operate some of the most complex and challenging network environments today. I’d like to think we’re doing a good job as we currently support many Tier 1 Service Providers networks’ as well as some innovative Tier 2’s and HE environments. For example, we’re providing complementary skills and support for the Janet network, the UK’s national research and education network, provided by Jisc. We know we need to keep pushing the boundaries on what is possible to help our customers’ continue to innovate and develop their services. What do you think of the Service Providers industry at present? DM: The services industry is changing -faster now than I’ve ever seen! As consumers, we look for our data to be on the device that we want, when we want it and where we want it, cut, sliced and diced in a way that just works. This presents a huge challenge for service providers and higher education institutions alike. To be able to meet all their customers’ demands is a tricky balance between infrastructure, accessibility and security. And consumers are quick to think with their feet – if they are not happy, they will move elsewhere. In fact, reducing customer churn is one of the biggest challenges we see faced by telco’s today. we look for our data to be on the device that we want, when we want it and where we want it, cut, sliced and diced in a way that just works And of course, the customer is always right. There are no excuses anymore for poor performance or a second rate quality of service, because the technology is there, it works– it just needs harnessing and integrating in the right way. We know at Axians UK that we need to evolve what we’re doing for our customers too.  We have to look at ways for us to retain customers by increasing our relevance, all the while ensuring we can still help meet their service delivery challenges. What are Axians UK main strengths? DM: There are many things that I can point to here; our assets, both physical and people, are a big part of it. The quality of our technology and of our staff is what stands us apart in this competitive industry. We are always working to exceed technical excellence. This is why we’ve spent time training and building our technical and service abilities. What is important for our customers is that we are a safe pair of hands with the ability to complete projects on-time, on-budget and with minimal surprises. Plus as a company, we have a grand focus on strong human values and social ethics and a culture of qualitative relationships.  We like to think of ourselves as ‘the linking pin’ of the connected planet and although we operate in 15 countries and employ 7,000 people we can act as a small company, as we are an agile team – and  our customers’ have the assurance that we have the backing of a large parent company. [easy-tweet tweet="We like to think of ourselves as ‘the linking pin’ of the connected planet says David Millar of @AxiansUK"] What do you think sets you apart in a competitive cloud marketplace? 
DM: With newer technologies, we are helping to lead the charge with software defined environments and the virtualisation of network functions. These sorts of technologies are starting to shape the conversations we’re having with our customers. Our expertise is in helping customers find ways of providing services in a fast, advanced, predictive and agile manner, while lowering the cost of doing so. A big chunk of our success is the relationship that we’ve got with Juniper Networks. We’re one of a few partners in the UK with Partner Professional Services (PPS) and Partner Support Services (PSS). These accreditations keep partners like us ahead of the technology curve. We are integrated, certified and engrained in the portfolio of solutions that we deploy. Yet, it can’t just be about training. It can’t just be about technology. It has to be about a deep understanding of the market you play in, and the key partners you’re working with. Can you describe the ideal client for Axians UK services? DM: Our ideal customer is anybody whose network is central, and core to their business.  We tend to look at our services as being applicable to any network. Any network, or any customers’ network, is part of a life cycle, it is born, it evolves and eventually it will be replaced. We’ve got to be able to offer services across that entire life cycle. We look at the reactive way to deal with support and the proactive way of how to help customers optimise the network they have in place, as well as preparing it for the demands of tomorrow. Support and professional services are two sides of the same coin and we are always looking to add new services to meet customers’ new challenges head on and bring more value. If you don’t evolve, you die - so we have to continue to evolve our services practice too. Support and professional services are two sides of the same coin What would you say are the key things that are shaping the current progressions in the service provider market? DM: Anything to do with automation and virtualisation of functions, with being able to understand networks better through analytics and in the securing of customer data  - no matter where it is present  - are key things that are shaping the industry at the moment. Recent high profile security breaches have certainly moved the latter back up the corporate agenda. Where you see Axians UK heading in the future? DM: I see us expanding visibility across Europe; showcasing our capabilities and expertise further to become a well-recognised brand. We have to stay ahead of the curve for skills and specialisms, which is part of our heritage and culture. It is how we are able to sustain Juniper Elite partner status as well as continue talks with customers who are looking to grow their market share and become more profitable. The network is their lifeblood – we will continue to drive innovation in this area. It is an exciting time to be working in this industry as there is always something new on the horizon! ### New Xangati platform gets automated storm remediation  Xangati, an innovator of hybrid-cloud performance management, has announced the latest version of its Xangati Virtual Appliance (XVA) architecture that features automated storm remediation for virtualised and VDI infrastructures. Other new features include native support for Microsoft Hyper-V environments and ServiceNow integration, so that XVA trouble tickets and storm alerts can be shared easily with the ServiceNow ITSM (IT Service Management) portal. 
Even the best run cloud environments are hit by storms of all kinds such as storage, CPU, memory and boot storms, which cause contentions that cripple applications and disrupt end-user experiences. With this latest Xangati release, virtualisation system administrators are able to remediate CPU and memory performance issues by automatically balancing workloads across vCenter hosts. By leveraging XVA’s real time key performance data, capacity planning and efficiency/cost optimisation, IT professionals can prevent and remediate virtualisation performance degradation. Xangati’s Efficiency Index also measures the extent to which available CPU, memory, storage and network interface capacity is fully utilised. “This new functionality means that VARs now have the ability to run a true managed service performance analytics practice with service-level agreements, based on definitive metrics for MTTR (Mean Time to Resolution),” said Atchison Frazer, VP-Marketing, Xangati. Key features of Xangati’s latest release include: Automated CPU and memory storm contention remediation Microsoft Hyper-V enhancements: Runs natively on Hyper-V, while monitoring and managing multiple hypervisors ‘CPU Wait Time Per Dispatch’ reporting for Hyper-V Support for multiple SCVMM (System Centre Virtual Machine Managers) to accommodate larger infrastructures Deep support for NetApp Storage Systems configured in Cluster-Mode ITSM notifications and metrics easily discoverable by ServiceNow and Splunk Enhanced XVA storm-tracker utility GUI for scale and ease-of-use New optional Executive Dashboard for XenApp environments XenApp 6.5 and Zone Controller logs included in XVA Visual Trouble Tickets XVA install/upgrade and setup workflows enhanced “Xangati is moving closer to its long-term vision of an autonomic infrastructure management that delivers an automated way to stave off complex degrading conditions and v-storm contentions, in addition to end-to-end, closed-loop orchestrated visibility to real-time performance indicators,” said Atchison Frazer, VP-Marketing, Xangati. For more information or for a free trial, visit http://xangati.com/xsr12-upgrade/. Existing Xangati customers with active support contracts can access the XVA upgrade via the Xangati support portal.   ### Your end-of-year report: December Global #CloudInfluence Organisations Many pundits have offered their reflections on the evolution of the cloud market in 2015, but few if any have had a year’s worth of data on which to base their finding. Our #CloudInfluence data gives a day by day rating throughout the year for every major global player – a veritable goldmine. [easy-tweet tweet="Look back over #CloudInfluence in 2015 with @billmew on @comparethecloud"] Over the year Microsoft and Amazon have vied for leadership in the Compare the Cloud #CloudInfluence rankings with Microsoft’s sizeable marketing machine prevailing over Amazon’s more lean marketing operation. Behind these two Google, IBM and Oracle have vied for third place without any of them being able to establish a clear advantage over the others. The outlook for 2016 is largely similar. Amazon has established enough of a lead with AWS and growing at a speed that out-paces its rivals. If it is able to continue innovating with new services and investing to expand its estate (as it has done recently with the announcement of new facilities in the UK and Korea) then its market share lead should be assured. 
Likewise Microsoft Azure may trail AWS in public IaaS and PaaS, but it has established a clear lead over the others. It has also been successful in exploiting its .NET franchise to develop a compelling hybrid proposition, is rivalling Salesforce and others in SaaS, and has become the preferred public cloud option for HP, Dell and others, putting it in a very strong position. The latest report from Synergy Research Group says that public cloud services grew at 51 percent in 2015 as customers increasingly flocked to AWS and Azure, while private and hybrid cloud grew at just 45 percent. This hides the fact that AWS and Azure have been growing far faster than the market as others have either struggled in public cloud or exited it entirely. So what are the potential disrupting factors to shake up the market in 2016? 1) Interest rates: these are inevitably going to rise (albeit slowly). Participation in the top tier of the cloud market requires a phenomenal level of investment as rivals seek to compete with ever-expanding data centre estates and facilities. While Amazon is seeking to grow organically, borrowing to invest and funding interest from on-going operations, many of its rivals have massive cash piles to deploy. In 2014 Steve Brazier at Canalys predicted that AWS could not be profitable and would be unable to keep pace with its cash-rich rivals. In 2015, when AWS disclosed that it was profitable, he revised his outlook to predict its demise when interest rates go up, as this would increase AWS’s costs in a way that its rivals, free from borrowing, could avoid. In reality only Microsoft, Apple and Oracle really have cash piles of any great size. IBM and others are in the same boat as AWS, needing to fund expansion from debt or revenue. 2) OpenStack: in theory OpenStack has such broad support, including from so many tech giants like IBM, HP, Intel and Dell, that it should prevail. However, in reality innovation by committee is rarely successful, and many of its backers are fighting harder to differentiate themselves within the OpenStack community than they are for the community itself. And just as OpenStack started to become enterprise-ready, one of its main backers, HP, stumbled and quit public cloud before announcing that its new preference in public cloud was Azure rather than any OpenStack alternative. Containers may offer some salvation, but at the cost of making OpenStack itself less essential. 3) The Larry Factor: as we will be exploring in greater detail in a blog later this week, Larry Ellison has done a u-turn and committed Oracle’s future to the once-derided cloud. One of the cash-rich rivals that Brazier warned us about, Oracle is investing in the mother of all facilities in Austin, Texas. I’ve always been one to view Larry’s announcements with a level of scepticism, and I think that coming from behind to overtake AWS in public cloud might be too much of a challenge even for him. Of course we will be tracking #CloudInfluence day by day and reporting quarterly to keep you abreast of how 2016 unfolds. And you’ll be able to see how these factors and our predictions fare. #CloudInfluence is of course an indicator of the share of voice and influence each player has, but given the very close correlation with the respective success of each player, it provides us with an insightful barometer on the cloud market as it evolves to become the dominant mainstream IT segment. Oh to live in interesting times! 
[table id=59 /]

NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### Where's the business opportunity to invest in Cloud Orchestration?

There is currently a huge void between a successful IT implementation and a valuable business outcome. From our observations, a system being online, and functioning as anticipated, often only makes up 20 per cent of the outcome the company expects. The remaining 80 per cent is made up of system integration, configuration, and customisation (40%); end-user experience, training, and support (30%); and system lifecycle management (9%) – while cloud orchestration and subsequent automation is only one per cent.

[easy-tweet tweet="Is orchestration one of the most ROI heavy elements of any IT system implementation?" user="davidfearne" hashtags="cloud"]

Most customers don't even think about orchestration as one of the most Return On Investment (ROI) heavy elements of any IT system implementation – or as an accelerator to a successful business outcome. But, to businesses, an outcome shouldn't just be flashing lights or a shiny web console – it's much, much more. There are three examples that highlight the importance and business benefits of cloud orchestration and automation:

1. Application lifecycle management

Orchestration and automation help to streamline application and user acceptance testing. The importance of automating the test process, along with the build development of test environments, should never be underestimated. Once a working and tested environment is approved, cloud orchestration can help with the build process – helping both future tests and developments, and fixing any bug activity.

2. Complex situations run like any other

It is vital cloud users test and re-test unusual loads, attacks, losses, and disasters. Once point one is in place, it becomes a minor task to test the system for the unthinkable and the inevitable. Some situations will be positive, such as an increased load, which is specifically an area where orchestrating a response and then automating the actions is critical. However, business systems crashing during high workloads due to the success of the company is unacceptable. It's also critical for the organisation to be able to test, in detail, how the business could be vulnerable to attacks using a mirror of the production system. To ensure this doesn't happen, companies must test availability strategies for losses and disaster, and then compare the results to the delivery of that business outcome before anything fails.

[easy-tweet tweet="It is vital #cloud users test and re-test unusual loads, attacks, losses, and disasters" user="davidfearne" hashtags="data"]

Another contention between IT and the business is HA/DR investment, as it's more often than not an estimation and can leave an organisation crippled in the event of a failure. Before any event, test and report on the facts – then modify and re-test as required until there is an agreed, tested and documented availability strategy in place.
3. Orchestrate to your audience

Orchestrating the end-user functions means more consistent outcomes and better-served customers – and can be done 24/7 with no extra cost. Understanding what's hampering the end-users is also essential. End-users are both your customers and the workforce delivering the business outcome. Most IT projects are designed to transform the workforce, so investing time and energy into understanding what that means to them, and how one department differs from another, is vital. This can be a lot of work, but just think in terms of ROI or Return on Time. Orchestrating simple tasks, including service desk activities such as re-setting passwords and unlocking accounts – unless they are in conflict with security orchestrations – makes things much easier for the end-user. Nevertheless, when orchestrating end-users' functions, ensure it's used as a tool and not a substitute.

Orchestration, and subsequent automation, underpins 100 per cent of the modern IT infrastructure – allowing organisations to look forward rather than fire-fight various elements of a business. The end goal should be saving time that can be re-invested in research and development, allowing an IT organisation to take a true bimodal stance.

[easy-tweet tweet="Orchestration, and subsequent automation, underpins 100% of the modern IT infrastructure" user="davidfearne"]

### Could your IT infrastructure harm your business? Part II

In December we published, as you may remember, Part I of this series, "Could your IT infrastructure harm your business?" Picking up from where we left off, we're going to continue talking about data security.

Firstly, you need to consider that there are three types of data, all requiring different levels of protection – non-personal data; personal data; and financial data. These also bring in the question of compliance – especially for financial data, where FCA and PCI rules determine the levels and types of security that must be implemented. These rules dictate not only how data should be stored, but also certain aspects of physical compliance relating to your IT infrastructure. Failure to comply with (and ensure continued compliance with!) such regulations is a serious breach of company obligations and will result in fines, or worse, against the company and its Directors.

[easy-tweet tweet="There are 3 types of data, all requiring different levels of protection: non-personal, personal, and financial" hashtags="data"]

As is now well known, one of the key issues with the recent breach of IT security at TalkTalk was that the data was not all encrypted. Data must not only be encrypted to an appropriate level, but different types of data have to be stored separately to provide a critical additional level of security, such that even if one level of data is breached, full personal and financial data is not disclosed. Proper encryption and storage of data (including separation of data types) will reduce the risks of an incident such as the one at TalkTalk by a factor of 100. Best practice security solutions will ensure that personal data is well protected, even when firewall and virus protection layers are breached.

In the case of the US dating website Ashley Madison, data was encrypted but best practices were not fully implemented in other aspects of their security policy, relating to access rules and processes.
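To make the encryption and separation principle above concrete, here is a minimal sketch in Python using the open-source cryptography library: personal and financial records are kept under separate keys, so that a breach of one class of data does not expose the other. The record contents and key handling are purely illustrative – a production system would hold keys in an HSM or a key management service rather than generating them in application code.

```python
# Illustrative only: separate encryption keys per data classification, so that
# compromising one store does not expose the other. Requires the 'cryptography'
# package; key handling is deliberately simplified for this sketch.
from cryptography.fernet import Fernet

# One key per data classification, stored and managed independently.
personal_key = Fernet.generate_key()
financial_key = Fernet.generate_key()

personal_store = Fernet(personal_key)
financial_store = Fernet(financial_key)

# Encrypt each record with the key for its own classification.
personal_record = personal_store.encrypt(b"name=Jane Doe;email=jane@example.com")
financial_record = financial_store.encrypt(b"card=4111111111111111;expiry=01/18")

# Even if the personal-data key leaks, the financial record stays unreadable.
print(personal_store.decrypt(personal_record))
try:
    personal_store.decrypt(financial_record)
except Exception:
    print("Financial data cannot be read with the personal-data key")
```

The design point is the separation itself: two stores, two keys, two sets of access rules, so a single breach never yields the full picture.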
The Ashley Madison hack was perpetrated with the help of inside information – clearly illustrating the need for clear security processes, implemented across the organisation, to avoid risks from internal agents. This leads on to a discussion of processes, organisational considerations and the requirement for clear ownership of IT security.

Ownership and Responsibility

Given the very real dangers in the modern connected IT world, and the potential for loss of confidential customer data resulting in significant damage to reputation and direct business, companies MUST have clear ownership, responsibility and accountability for all aspects of IT security. There is a strong argument for the role of a Chief Data Officer, reporting to, or having a direct role on, the Board. Organisations should also regularly review their processes, including who has access to data and systems. You might want to consider the use of external audit services as part of the review process to ensure ongoing compliance with best practice.

Management review meetings typically review aspects of performance (and risk) using a form of "balanced scorecard" – looking at the overall position of the company against financial, staffing and other metrics. We contend that IT security should be a specific additional scorecard element, reviewed as part of the normal company review processes and owned by a senior member of the company's Board or Management Team.

[easy-tweet tweet="Organisations should regularly review who has access to #data and #systems" user="zsahltd @comparethecloud"]

What about the impact of external providers?

[caption id="attachment_33737" align="alignright" width="300"] Read part I of Zsah's report here[/caption]

Most companies employ some level of outsourced IT provision, either through the use of Cloud-based applications (e.g. Sage accounting or Salesforce.com CRM systems provided through a "Software as a Service" (SaaS) solution), or through more wholesale use of external IT providers or Cloud Service providers. In all such cases, these solutions involve holding your customers' data outside your own physical environment. Of course, most reputable providers use highly secure and well-developed data centres that should provide better physical security than the alternative of an in-house IT system in your own offices. Even if your own servers are in secure rooms, who has access to these locations, and can you be certain that physical security cannot be breached?

So, external Cloud Service providers should provide secure services – but there are steps you need to take to check that their processes are in compliance not only with best practices and external rules (FCA, PCI, etc.), but also with your own security policies and processes. For example, if using one of the "big brand" Cloud Service providers such as AWS or Rackspace, you will be tied to their contracts and embedded policies. You therefore need to be aware of these in sufficient detail to ensure that your end-to-end security solution is fit for purpose. There is also a separate issue of data privacy. You typically don't have control over where data might be held, with the risk that external agents such as the US Government might have rights to access your data.

[quote_box_center]
There are a number of questions you should be asking your external Cloud Service provider, including:

- What is your security policy and processes?
An example of the level of detail required is that if PCI data is held, this should be on a server in a physically locked cabinet, not just within an area covered by a general security lock.

- If the provider is fairly new to the market, what rules do they follow and what is their track record, financial strength and longevity?
- How are updates and maintenance handled?
- Does the architecture employed comply with best practices as outlined in this article (e.g. dual-layer firewalls)?
- Who do they use for infrastructure or specific elements of their own delivery? Many Cloud Service providers are partially or wholly "brokers" of services, using another provider for the actual infrastructure.
[/quote_box_center]

Summary

[easy-tweet tweet="Organisations need clear ownership and understanding of all aspects of IT security" user="zsahltd" hashtags="cloud"]

Recent cases highlight the need for a fully coordinated security policy covering physical, people, processes, IT security (protection from hacking) and encryption/data separation to protect from loss of data if attacked. Organisations need clear ownership and understanding of all of these aspects of IT security – even if some elements of your IT solution are outsourced. Managed properly, the use of an external Cloud Service provider should significantly increase your own capability and overall IT security.

### Shadow IT – innovation versus risk

Shadow IT – as the name implies – is the installation and use of applications by employees that operate in an organisation's shadows, without the knowledge or approval of IT. These applications have not been vetted, authorised, controlled or supported by the IT team, and therefore are often not in line with the organisation's requirements for data management, security and compliance. At the same time, shadow IT is considered by many as an important source of innovation, enabling the faster development of applications and tools not considered by the IT team, and the prototyping of ways of working that benefit the entire business.

[easy-tweet tweet="#ShadowIT applications have not been vetted, authorised, controlled or supported by the IT team " via="edmacnair" usehashtags="no"]

Finding the right balance

Shadow IT can lead to innovation, with applications meeting the needs of the business, but it can also lead to risk. It is a balancing act for businesses, with each individual organisation needing to decide just how far along the risk curve it is prepared to sit. By allowing – or at the very least turning a blind eye to – shadow IT, new working processes that can benefit the wider business may be found. However, shadow IT can also be a security nightmare, especially when you consider that those members of staff who are likely to use their own solutions tend to be risk takers and are inherently less concerned by the need for all-encompassing security measures. So, how can businesses protect themselves from this laissez-faire attitude to risk, and how should they decide their own risk profile?

BYOD as a catalyst

In some ways, bring your own device (BYOD) is the one to thank, or blame, for shadow IT. Allowing such personally chosen devices into the work environment has led to the creep of employees' own chosen applications being used to access or manage corporate information and data.
The general consumerisation of IT and the trend for staff to bring in their own devices has meant that every employee is now a potential user of shadow IT, easily deploying mobile applications. It is no longer just rogue, tech-savvy staff wanting their own tools. Shadow IT has recently become a broader issue. A starting point for businesses wishing to mitigate the risk is to ensure that their acceptable usage policies are updated to cover modern-day working practices, such as home working and BYOD.

Coming out of the shadows

[easy-tweet tweet="According to Atos 36% of #shadowIT purchases is being spent on file sharing software" user="comparethecloud" usehashtags="no"]

According to Atos, 36 per cent of shadow IT spending goes on file sharing software, 33 per cent on data archiving, 28 per cent on social tools and 27 per cent on analytics. Often, the use of shadow IT is down to impatience with the IT department: the primary reason for shadow IT cited by respondents is the IT department's inability to test and implement new capabilities and systems in a timely manner, thus smothering creativity and productivity. The trend for businesses to move core processes to the cloud – and staff's general acceptance of it – has also accelerated the prevalence of shadow IT, whilst at the same time making it harder to monitor. According to a recent survey conducted by Fruition Partners of 100 UK CIOs, 60 per cent of respondents said there is an increasing culture of shadow IT in their organisations, with 79 per cent admitting that there are cloud services in use that their IT department is not aware of.

A motivational force

It is wrong to discuss shadow IT without examining the benefits it can bring in innovation. A recent Frost and Sullivan report entitled The New Hybrid Cloud showed that 49 per cent of staff are comfortable using an unapproved application because it helps them "get their job done more quickly and easily". The rise of shadow IT may actually inject a healthy dose of innovative thinking into organisations, so it shouldn't be disregarded from the start. The ability to test new approaches to business problems that could have a positive impact on the bottom line is vital to employees at all levels. If they are encumbered by the need for permissions, or for budget approvals to get to the technology they need, things can stall at a time when market conditions change quicker than ever. Not to mention, shadow IT applications are often far cheaper than their 'official' counterparts procured through the IT department.

Chasing innovation

Some of the world's largest companies are discovering that instead of trying to drive out shadow IT, it is best to embrace it as part of a wider culture of innovation. Adriana Karaboutis, VP and global CIO of Dell, recently said: "I don't chase shadow IT, I chase innovation. When you work in a technology company and have 110,000 best friends that understand technology well and probably even better than you do, you have to be out there working, listening and determining how you can create even more value for the employees and customers that you serve as opposed to being defensive about owning IT."

I would tend to agree. The onus on forward-thinking businesses shouldn't be on stamping out shadow IT, but rather on encouraging employees to adopt and get the most out of their tools of choice in a secure and productive fashion.
[easy-tweet tweet="The onus on forward-thinking businesses shouldn’t be on stamping out #shadowIT" user="edmacnair"]

Shining a light on the use of shadow IT

Especially since the advent of shadow IT – and the exponential rise of cloud applications – organisations need to be able to monitor an individual's data flow at the most basic level, regardless of whether users are in-office or mobile. Solutions such as cloud application control (CAC) can provide businesses with this visibility and the ability to discover, analyse and control all the information staff are accessing or sharing – whether across authorised or unauthorised applications. It is about managing security risks without stifling innovation. By 'following the user', businesses can ensure that employees are safe and secure at all times, whether they are using authorised applications or those from the shadows.

### Cloud and IT are trapped by their narrow vision

I would define myself as a people person. I have come to this conclusion from years of working in customer-facing environments (recruitment and retail), but also through studying International Business and Spanish at university, where a year living in Spain taught me how to adapt to different cultures, social norms and business nuances. Moving into the world of IT and Cloud computing has brought a number of challenges – some days I think it's more difficult adapting to this world than learning Spanish! From an outsider's perspective, I have begun learning about an industry that, in my opinion, makes itself inaccessible to outsiders. It was George Eliot, author of Middlemarch, who once said, "It is a narrow mind which cannot look at a subject from various points of view."

[easy-tweet tweet="For outsiders to be accepted into the #cloud industry, it needs to evolve says @katewright24" user="comparethecloud" usehashtags="no"]

Having now travelled around Europe to various cloud and IT conferences, it strikes me that for outsiders to be accepted into this industry, it needs to evolve. This evolution should include (but not be limited to) better representation of women, and the loss of acronyms and jargon that are simply there to confuse and baffle audiences.

One area in which I believe this evolution is beginning to take place is the emergence of the CMO, or 'Chief Marketing Officer', as a user of cloud and associated technologies. The arrival of this role in the industry has enabled us to break down the barrier between the 'techies' and the 'commercials'. CMOs must truly understand the technology well enough to broadcast it into the public domain so it is understandable to the end user, eliminating the annoying IT-babble attached to it. It has also allowed business-level discussions to be representative of the CMO's function at board level.

The rise of the CMO should be a wake-up call to every person located within the traditional IT function. This wake-up call is directed at those technically minded individuals, telling them that if they truly understand technology, they should have the ability to explain it clearly. If you cannot simplify the technology to me, then you must be lost in your own myriad of jargon.

As a millennial, I have grown up using the internet and have no recollection of the days of just four-channel TV – to be honest, not being able to pause and record live TV through my Sky+ box is a very distant memory!
My generation has, from day one, been exposed to the 'consumerisation' of technology, where applications and services come ready-to-use, hidden away behind amazingly designed interfaces.

[easy-tweet tweet="If you truly understand technology, you should have the ability to explain it clearly" user="katewright24, comparethecloud" usehashtags="no"]

The wider point I am trying to address is that without the simplification of jargon and acronyms, the professional discourse will be ignored in favour of more business-minded voices. Allow me to offer an example (thanks to Andrew Mclean, my CTO, for explaining the tech). When I search on Google I really don't care that "Google has massively parallel databases that have abstracted the hardware layer and provides for searching based on a combination of worldwide data centres and smart algorithms and databases shared over many nodes." In fact I just want my search results please (preferably without the advertisements, but that's a whole new topic).

The point of this blog is not a moan, but to try and help the IT and cloud industry understand that it is isolating itself from what could be a very large fan-base. The cloud and IT person has a chance, with this changing and disrupting economy, to become the hero both internally and externally to an organisation. Big Data, analytics, IoT and cloud are all combining into a central focus that will enable business to be disrupted like never before. Make sure the IT function becomes central to that role. The first stage of this is to step outside of IT and learn other functions, disciplines and industries. Any organisation can be affected by collective madness and the lack of ability to execute on a wider perspective. Cloud and IT's propensity for being introspective makes them the perfect industry to be affected by this.

### Amazon Web Services Launches Korean Datacenters for Its Cloud Computing Platform

Korean companies including Samsung Electronics, LG Electronics, SK Planet, CJ O Shopping, Nexon, and Mirae Asset welcome the new Seoul Region to expand their use of the AWS Cloud.

Amazon Web Services, Inc. (AWS), an Amazon.com company, today announced the launch of the Asia Pacific (Seoul) Region, its fifth technology infrastructure region in Asia Pacific (APAC). Starting today, Korean-based businesses and global companies with customers in Korea can leverage AWS's industry-leading infrastructure technology platform to build their businesses and run their applications in the cloud. Many thousands of Korean customers have been using the AWS Cloud from other AWS Regions for several years. Now, with the launch of the Seoul Region, Korean-based developers and companies, as well as multinational companies with end users in Korea, can securely store and process their data in AWS in Korea with single-digit millisecond latency across most of Korea. Developers can sign up and get started today at: http://aws.amazon.com.

The Seoul Region consists of two Availability Zones at launch. Each Availability Zone includes one or more geographically distinct datacenters, each with redundant power, networking, and connectivity. Each Availability Zone is designed to be resilient to any issues in another Availability Zone, enabling customers to operate production applications and databases that are more highly available, fault tolerant, and scalable than would be possible from a single datacenter.
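For developers, taking advantage of the new region is largely a configuration change. The sketch below is a minimal illustration using the AWS SDK for Python (boto3), assuming credentials are already configured: it lists the Seoul region's Availability Zones and launches one instance in each so that an application can survive the loss of a single zone. The AMI ID and instance type are placeholders for illustration, not recommendations.

```python
# Minimal sketch: point the AWS SDK for Python (boto3) at the Seoul region
# (ap-northeast-2) and spread instances across its Availability Zones.
# Assumes AWS credentials are configured; AMI ID and instance type are placeholders.
import boto3

ec2 = boto3.client("ec2", region_name="ap-northeast-2")

# Discover the Availability Zones available in the Seoul region.
zones = [z["ZoneName"] for z in ec2.describe_availability_zones()["AvailabilityZones"]]
print("Seoul Availability Zones:", zones)

# Launch one instance per zone so the application can tolerate the loss of a single AZ.
for zone in zones:
    ec2.run_instances(
        ImageId="ami-12345678",        # placeholder AMI for illustration only
        InstanceType="t2.micro",
        MinCount=1,
        MaxCount=1,
        Placement={"AvailabilityZone": zone},
    )
```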
With this launch, the AWS Cloud is now available from 32 Availability Zones across 12 geographic regions worldwide, with another four AWS Regions (and nine Availability Zones) in China, India, Ohio, and the United Kingdom to be available in the coming year. “Customers continue to choose AWS as their infrastructure technology platform because we have a lot more functionality than any other cloud provider, a significantly larger partner and customer ecosystem built around AWS, and unmatched maturity, security, and performance,” said Andy Jassy, Senior Vice President, Amazon Web Services. “Our Korean customers and partners have asked us to build AWS infrastructure in Korea so that they can run more of their workloads on AWS and approve new initiatives that could change their business and customer experience; we’re excited about delivering this to our customers today.” Since its inception in 2006, AWS has changed the way organizations acquire technology infrastructure. With more than one million active customers worldwide and over 50 services for compute, storage, databases, analytics, mobile, and enterprise applications, AWS has become the new normal for companies of all sizes in all industries to deploy their applications. Nexon is Korea’s premier gaming company, providing 150 games in 150 countries. With 68 percent of the company’s total sales of US $1.5 billion coming from their overseas business, Nexon is positioning itself as a global game provider. “For gaming companies like Nexon, it is difficult to estimate if an investment on infrastructure is going to pay off. By using AWS, we’re able to experiment with different games and invest in those that develop a following. With AWS, we did not need to make a huge initial investment and are able to manage our IT infrastructure more effectively,” said Sang-Won Jung, VP of New Development at Nexon. “We are currently running our new mobile MMORPG game, HIT, 100 percent on AWS. This game set the record for achieving the number one sales ranking in the Korean mobile gaming industry in the shortest period of time. With the new AWS region in Korea, we plan to use AWS not just for mobile games, but also for latency sensitive PC games as well.” Mirae Asset Global Investments Group, the number one asset management company in terms of profits in Korea, migrated its web properties from its on-premises data centers to the AWS Cloud. “In order to stay competitive, online and mobile businesses are more important than ever in the financial services sector. Using AWS enabled us to improve our web service environment and reduce annual management costs by 50 percent through the consolidated platform of web services such as servers, network, database, and security,” said Wan-Geu Kim, Managing Director of the IT Department at Mirae Asset. “With the launch of the AWS Region on Korean soil, we will now move even more of our sensitive and mission-critical workloads to AWS.” [quote_box_center] Investing in Korea’s Cloud Future Along with the new region, AWS has also worked to build a vibrant local cloud technology ecosystem in Korea. The rapidly expanding AWS Partner Network (APN) in Korea includes Independent Software Vendors (ISVs) and Systems Integrators (SIs) who are building innovative solutions and services around the AWS Cloud. ISV partners such as Ahnlab, Dreamline, Hancom, IGAWorks, and TMAXSoft are providing a variety of software, security, and connectivity solutions that can be used in conjunction with AWS. 
SIs including Bespin Global, GS Neotek, Megazone, and Vsystems are helping enterprises to migrate to AWS, deploy mission-critical applications on AWS, or provide a full range of monitoring, automation, and management services for customers' AWS environments. More details on these partners and solutions can be found at: https://aws.amazon.com/partners/. AWS is also delivering its AWS Educate Program to help promote cloud learning in the classroom with eight local universities, including Sogang University, Yonsei University, and Seoul National University. Since its launch locally in May 2015, over 1,000 Korean students have participated in AWS-related classes and non-profit e-learning programs, such as "Like a Lion." The new Region adds to Amazon's existing cloud computing investments in Korea. Teams of Account Managers, Solutions Architects, Technical Support Engineers, Professional Services Consultants, and various other functions support customers in Korea.
[/quote_box_center]

### Oscobo launches the UK's first anonymous search engine

Two entrepreneurs are on a mission to bring a safer and fairer experience to all UK internet users. With a promise to never store or mess around with a user's personal data, the UK's first anonymous privacy search engine, Oscobo (www.oscobo.co.uk), has launched today. Founded by Fred Cornell (ex-Yahoo!) and Rob Perin (ex-BlackBerry), Oscobo provides consumers with pure search results based solely on the words that are typed in, and not a profile that's been amassed as a result of previous search engine use.

[easy-tweet tweet="The UK has its first anonymous search engine thanks to @oscobo" user="comparethecloud" hashtags="infosec"]

Whenever consumers interact online, visit websites or use a search engine, they leave a vast amount of personal and private information behind. Many falsely believe that when using the incognito mode on their browser, no one can see what they do. The truth is that consumers are still being tracked by IP address, logins and other means. We are moving into a post-cookie world where tracking companies and advertisers use a range of methods to track users across all devices and locations; these include location, super-cookies, device ID, offline activity and store cards – even if the user is not logged into their search engine account. All of this information is collated by traditional search engines to create highly accurate profiles of who users are, affecting the results that are displayed.

"Over months and years a consumer's personal history both in and out of a number of high profile search engines is tracked to monitor their online activity," explains Rob Perin, co-founder of Oscobo. "This data is then stored, profiled, and sold to advertisers who bombard the consumer with spam, banner ads, and potentially cold calls. Worst of all, it can even increase the online prices of goods and services based purely on who they are perceived to be online. We believe that this is fundamentally wrong and that personal data should remain just that, personal. Hence Oscobo was born."

Oscobo is committed to giving internet users another option. It will provide them with private and secure search that does not store or share personal data, enabling them to have a safer and fairer experience when operating online. When a consumer selects Oscobo as their search engine, no footprint is left – this means that none of their personal and private data is stored or shared.
Once they leave the Oscobo site, no records are kept and the search engine will not know who the consumer is when they return.

Fred Cornell, co-founder, comments: "It doesn't stop there. The tracking methods, or scripts, that run on most popular internet pages are slowing down browsing. This can be very expensive for consumers accessing the internet on their mobile devices, especially if they are not on an all-you-can-eat data plan. It's true that some scripts can be good and needed for the improvement of web sites, but many are now so-called third-party trackers that are simply there to harvest as much information about a user as possible. This is certainly not something the user is expecting or would be prepared to pay for."

Cornell adds: "It's pretty simple really – we are on the consumers' side. We will not store or mess around with their personal and private data, or use technology that costs the consumer additional money for no benefit."

### IBM SoftLayer and the importance of an MSP channel to leverage the cloud

For many modern businesses the cloud is vital to the way in which they operate, but in order to deliver the level of service that their customers demand, they need to ensure that their infrastructure is up to scratch. IBM SoftLayer has been providing cloud infrastructure as a service (IaaS) to a broad spectrum of companies, from startups to multinational corporations, for a number of years now. As well as bare metal and virtual servers, customers can access networking and big data solutions and a host of flexible IT benefits.

[easy-tweet tweet="IBM SoftLayer recognises the importance of integration and automation" user="anke_philipp" hashtags="IBMSoftlayer"]

One way in which IBM SoftLayer is different to other cloud infrastructure platforms is that it recognises the importance of integration and automation. Despite having a growing number of data centres all over the world, SoftLayer builds each centre to the same specification and equips it with the full SoftLayer catalogue of services – meaning that customers get the exact same features and reliability across the entire portfolio. The way in which SoftLayer private cloud contracts and expands also means that flexibility is always at hand.

[caption id="attachment_33611" align="alignright" width="400"] Click here to read more about IBM Security[/caption]

For managed service providers (MSPs), the benefits of IBM SoftLayer are particularly noteworthy. Hosting applications and services can prove challenging in terms of resources and finances, particularly during periods of rapid growth, and many MSPs have benefitted from some level of assistance. Whether you specialise in storage, security or any other form of cloud service, partnering with SoftLayer gives you access to robust, dynamic and agile infrastructure that ensures you can deliver a level of service that meets not only today's demands, but those of the future too.

IBM SoftLayer has long recognised the importance of the MSP channel when leveraging the cloud, launching its channel referral program in 2011. Providing support for MSPs, the program doesn't tie partners to long-term contracts, but does give them access to IBM campaign assets, in-person training opportunities and other exclusive benefits, helping more and more companies gain access to SoftLayer's flexible cloud infrastructure.
Businesses are free to build on and resell any SoftLayer service as if it was their own, without having to manage any of the underlying infrastructure.

For managed service providers, the cloud is already too powerful to ignore. Previously, fears surrounding cloud adoption, particularly those centring on security, may have been justifiable, but now a reluctance to embrace the cloud will see more agile competitors overtake you. This is particularly important when you consider future technological developments, like the Internet of Things and big data, which will also rely on cloud computing.

[easy-tweet tweet="IBM SoftLayer helps MSPs make the transition to the #cloud" user="anke_philipp"]

IBM SoftLayer helps MSPs make the transition to the cloud by providing infrastructure as a service that is automated and robust, but also flexible, meaning you can tailor your service to your customer's needs. The MSP channel lets organisations all over the world benefit from the cloud, all while IBM SoftLayer provides the dynamic infrastructure needed for modern business success.

### CTC Interviews Kurt Acker at IBM Linux Council New Jersey

CTC Interviews Kurt Acker at IBM Linux Council New Jersey

### CTC Interviews Buzz Moschetti at IBM Linux Council New Jersey

CTC Interviews Buzz Moschetti at IBM Linux Council New Jersey

### CTC Interviews David Rossi at IBM Linux Council New Jersey

CTC Interviews David Rossi at IBM Linux Council New Jersey

### CTC Interviews Neale Ferguson at IBM Linux Council New Jersey

CTC Interviews Neale Ferguson at IBM Linux Council New Jersey

### On the ground or in the cloud

One of the most significant developments in the IT world to date has been the cloud. It has completely transformed IT and brought a great many benefits to businesses by cutting IT infrastructure costs, improving efficiency and, more importantly, increasing agility. However, though the cloud is often talked about in a positive light, there are concerns around the time and money it takes to constantly maintain sufficient uptime, as well as security breaches which may occur in the cloud.

[easy-tweet tweet="Cloud is one of the most significant developments in the IT world to date" user="comparethecloud" hashtags="cloud, IT"]

It therefore goes without saying that implementing a cloud infrastructure isn't a decision to be taken lightly. IT professionals need to be able to justify the move by weighing up the pros and cons of moving to the cloud.

What cloud service providers don't tell you

The operational expense of the cloud isn't often talked about. While cloud computing services can be quite cost-effective to set up, they can become more expensive than an in-house team over time because of the additional services required to run a workload. Most cloud service providers will provide a basic virtual server at a compelling price; however, it is important to consider additional costs such as bandwidth, networking and VPN services, the cost of standby or DR systems, and other costs required to run an application, which can quickly add up. Once the transition to the cloud has taken place, bandwidth and capacity have to meet the demands of both users and applications as they grow. This means the total cost can rocket, and it's important to remember this will only increase as the application continues to grow.
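One way to keep that total in view is to model the full monthly bill rather than the headline server price. The short sketch below does exactly that, using entirely illustrative figures – they are not any provider's actual prices – to show how bandwidth, networking/VPN and standby DR line items can come to dominate the bill as traffic grows.

```python
# Illustrative cost model: headline virtual-server price versus the full bill once
# bandwidth, VPN/networking and standby DR systems are added. All figures are made
# up for illustration and are not any provider's actual pricing.

def monthly_cloud_cost(gb_transferred: float) -> float:
    base_server = 40.0                  # headline virtual server price
    vpn_and_networking = 25.0           # VPN endpoints, load balancer, etc.
    dr_standby = 60.0                   # warm standby environment elsewhere
    bandwidth = gb_transferred * 0.08   # per-GB egress charge
    return base_server + vpn_and_networking + dr_standby + bandwidth

for gb in (100, 1_000, 10_000):
    print(f"{gb:>6} GB/month -> estimated bill: £{monthly_cloud_cost(gb):,.2f}")
# As traffic grows ten-fold, bandwidth rather than the server itself dominates the bill.
```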
Most cloud service providers offer commodity hardware, with uptime and SLAs that may be different from those of the hardware being used on-premise to run the same applications.

Lastly, one of the biggest concerns around the cloud is security. When a business moves to the cloud, someone else is automatically handling the business's most prized asset – its data. Though the perception is that the cloud is not secure, the reality is that it can be as secure as, and maybe even more secure than, on-premise deployments. The leading cloud service providers offer many security capabilities including isolated networks, security groups, federated identity, encryption of data in transit and at rest, as well as perimeter security solutions. This does not mean you can trust the cloud to 'automatically' keep your systems and data secure. It is still your responsibility to understand how the cloud technology works, what protection you need for each set of data, and to establish the right processes to maintain security. Remember: security is not a technology but an ongoing process supported by technology.

The business benefits of the cloud

[easy-tweet tweet="When launching a new project, significant CAPEX is invested" hashtags="CAPEX, cloud"]

Although there are challenges to adopting the cloud, it can offer significant business benefits. One of the main reasons businesses are moving to the cloud is the reduction in capital expenditure (CAPEX) that comes from switching to a cloud service provider. When launching a new project, significant CAPEX is invested in purchasing servers, storage, networking devices, cabling and developing disaster recovery procedures. But none of this is necessary when moving to a cloud-based infrastructure, which means capital can be invested elsewhere in the business. More importantly, hardware investments require careful planning. In the cloud, infrastructure can be provisioned dynamically and scaled up or down as the workload requires, minimising waste. Cloud infrastructure is created in seconds via an API call or a control panel click, which enables IT to move at a much more agile pace.

Alternatively, when an IT department is stretched or under-resourced, many choose to move to the cloud so the service provider can take on the management of various components more efficiently, especially hardware and certain services. This minimises the workload for the IT pro so they can focus on other priorities, such as desktop support and network monitoring. Similarly, by switching to the cloud there is less maintenance for the IT department. IT departments are always investing in new products and equipment, but the reality is that these investments won't last forever and there are always expenses involved, whether for software updates or part replacements. A cloud model does not involve any of these. That's not to say that no maintenance is needed from the in-house team once they have moved to the cloud, but it's definitely the next best thing.

The cloud also provides the flexibility to scale infrastructure up or down as required. If a new business project has its IT infrastructure in the cloud and experiences an unexpected shutdown or needs downsizing, the cloud is flexible enough to handle it in real time. If this was in-house, the IT team would have to hold on to all the purchased hardware until the next project came along. Likewise, it is incredibly quick to scale up.
An on-premise scale-up has to go through hardware and software purchases, infrastructure reconfigurations, equipment delivery times, the cost of hours of consultant time, and so on. Whereas in the cloud the provider does all of this on your behalf – freeing up time for the IT department.

The choice is yours

Moving to the cloud could completely transform a business, but it isn't for everyone. Total costs and security should always be front of mind when deciding whether or not to move to the cloud, and the IT team needs to be certain that all confidential data is stored securely. On the flip side, the cloud can also massively help to reduce the time it takes to complete a project, increase productivity by making everything a lot more efficient, and streamline IT processes. Ultimately, it's about finding a solution that works for the IT team and meets the business objectives of each specific workload.

### CTC Interviews Mary Wallace at IBM Linux Council New Jersey

CTC Interviews Mary Wallace at IBM Linux Council New Jersey

### CTC Interviews Gaylan Braselton at IBM Linux Council New Jersey

CTC Interviews Gaylan Braselton at IBM Linux Council New Jersey

### CTC Interviews Howard Hardiman at IBM Linux Council New Jersey

CTC Interviews Howard Hardiman at IBM Linux Council New Jersey

### Compare the Cloud at the IBM Linux Council

Neil Cattermull gives his overview of the IBM Linux Council.

### Six Degrees Group announces major investment to support ambitious growth plans

Six Degrees Group ("6DG"), the converged technology infrastructure provider, has announced the launch of a £12 million organic investment programme to support the company's ambitious business growth plans for 2016. The substantial additional capital investment comes six months after funds managed by Charlesbank Capital Partners acquired 6DG with the goal of building the UK's largest mid-market managed services provider. With a commitment to recruit 50 people, 6DG will be building and developing its sales, marketing and customer service teams, amongst others, to service its growing customer and partner base. The £12m investment for growth is a purely organic play, designed to complement 6DG's ongoing ambitious acquisition agenda. Since its inception, 6DG has developed a reputation for acquiring a series of specialist technology companies from across the UK, adding depth and technical capabilities to its growing portfolio of solutions. Delivering cloud, colocation, data and unified communications solutions to the UK mid-market, 6DG offers customers multiple services through one single provider.

Alastair Mills, CEO of 6DG, commented: "We want to own the mid-market. We now employ 400 people, have over 2,000 customers and own the most comprehensive technology infrastructure of any UK managed service provider. Now we are making a recruitment investment that is unprecedented in our sector as we look to bring the best talent in our industry to Six Degrees to fuel our future growth."

### What were the top data recovery concerns in 2015?

A spike in adoption of complex, high-end software-defined storage (SDS) systems was the top trend impacting the international data recovery industry in 2015, resulting in demand for more enhanced recovery technologies for businesses. Other trends on the rise include the need for better data privacy and security, as well as enhanced legacy data management technologies.
[easy-tweet tweet="A spike in adoption of #software defined storage was the top trend in #DR in 2015" user="comparethecloud"]

Growing demand for recovery services from SDS systems

Recovery from SDS systems is not only possible, it's now expected. Growing industry adoption of SDS is reflected in the increasing demand for recovery solutions from these systems when they prove fallible. With vendors establishing their own proprietary methods of storing data within their systems, we have found that most enterprise-level failures resulting in data loss require a custom recovery solution. In fact, we've seen a two-fold increase in the number of high-end system failures resulting in data loss in 2015, and have found that recovery success with SDS rests on accurately pinpointing the failure, analysing and deciphering proprietary storage designs, rebuilding file systems and developing solutions to restore critical data.

At the leading edge of storage innovation lies hyper-convergence, an approach to IT architecture that consolidates and manages computing, networking and storage resources via software to run on any manufacturer's hardware. Hyper-converged storage offers simplicity and ease of use, rapid implementation, space savings and quick redundancy, all of which make it very easy to expand and protect data while realising cost savings.

[easy-tweet tweet="The need for data recovery from hyper-converged systems, will rise significantly in 2016" via="no" hashtags="data"]

The need for data recovery from hyper-converged systems, which launched in the mainstream market in late 2014 and gained adoption in 2015, will rise significantly in 2016. Recovery is challenging in part because data is fully integrated into the unit, making it difficult to gain sector-level access to the disks. Further, system pre-configuration, especially with HDD, SSD and flash cache, or a lack of information about system configuration, can also pose recovery challenges. Kroll Ontrack research and development engineering teams are analysing this emerging storage trend and developing strategies to overcome data loss when it arises.

Global data privacy and security laws

Like many industries, the data recovery industry is not immune to the impact of global data privacy and security laws. In fact, the evolving data privacy landscape, and the impetus by highly regulated organisations to keep sensitive data on premise, have resulted in a growing need for onsite or remote data recovery solutions. In addition to addressing data privacy concerns, onsite technical expertise is increasingly required when complex data configurations or the massive size of systems make it difficult for organisations to home in on the target data for recovery. Highly complex systems and the obligation to protect data privacy are driving more onsite requests than ever before. While some organisations are able to leverage remote technologies like Ontrack® Remote Data Recovery™ (RDR®) to perform data recovery without a hard disk or other storage device leaving an enterprise's data centre, we're seeing that some highly regulated industries, such as healthcare and financial services, can disallow even connecting remotely to their network for privacy reasons. As a result, we're seeing a growing trend for onsite technical support and recovery.
Where's my legacy data?

As organisations grapple with their legal obligation to keep and maintain access to regulated data, maintaining legacy data management systems can be costly and tedious, leaving some organisations at risk if they fail to comply. A 2015 global Kroll Ontrack survey of 720 IT administrators found that nearly one third of organisations do not have clear insight into the information within their tape archives, and more than one third store more than 100 tapes in their archive. Not surprisingly, Kroll Ontrack is seeing a demand for technologies that not only catalogue the information and location of data to meet legal demands, but also consolidate catalogues from various systems into a single, searchable inventory to eliminate costs associated with maintaining legacy systems.

End-of-life data destruction

2015 has once again proven that data privacy and security are paramount to any organisation handling digital data. Certainly, nefarious cybercrime and data breaches continue to plague the news, prompting organisations to beef up data security practices. However, IT administrators and even drive manufacturers themselves are combatting a slightly different data security challenge, namely ensuring that built-in data wiping tools are securely and completely sanitising drives prior to re-use or disposal.

[easy-tweet tweet="Nefarious #cybercrime continues to plague the news prompting organisations to beef up #datasecurity practices"]

As a result, we've seen an uptick in requests from IT departments and drive manufacturers for third-party validation that a system's sanitisation function is 100 percent effective in securely deleting data. There's been an exponential increase in demand for our Erasure Verification Services, where our engineers perform detailed analysis of sanitised drives and provide reporting on the results to validate whether deletion methods are secure. It's another layer of protection organisations have at their disposal, and one that we believe will only grow in importance in 2016 and beyond.

### SoftServe Achieves Amazon Web Services Big Data Competency for Delivering Data Solutions in the Cloud

SoftServe, a leading solution provider and software development company, has achieved the AWS Big Data Competency from Amazon Web Services (AWS). The competency designation recognizes SoftServe's continued success in helping customers build effective data management solutions, as well as providing methods and techniques to process and analyze data productively, at any scale. SoftServe's Big Data Team has extensive expertise in the design, implementation and modernization of Big Data solutions including Hadoop-based systems, MPP Data Warehousing, BI Visualization and Advanced Analytics.

"Big Data Analytics represents a tremendous opportunity for organizations to derive new forms of innovation. Success in Big Data Analytics depends on having an infrastructure and services that orchestrate Big Data technology components for collecting, storing, processing, analyzing and visualizing data from literally everywhere, including IoT devices, web logs, and social media information," explains Serhiy Haziyev, VP of Software Architecture, SoftServe.
"We have extensive experience delivering Big Data Analytics projects for Fortune 100 and startup companies alike, and in our experience AWS has proven to be a mature, scalable and affordable environment for Big Data."

The AWS Competency Program is designed to highlight members of the AWS Partner Network (APN) who have demonstrated technical proficiency and proven customer success in specialized solution areas. With a team of more than 20 Hadoop-certified specialists, and partnerships with leading Big Data and analytics technology vendors such as Cloudera, Hortonworks, MicroStrategy and Tableau, SoftServe has helped its customers find new opportunities and insights from the mountains of data generated by processes (e.g. machines, social networks, and smart devices) embedded in daily life.

One such customer is Yottaa, the developer of a leading cloud platform for optimizing web and mobile applications. With Yottaa's platform, enterprises can manage, accelerate, and secure end user experiences on all devices in real time with zero code change. SoftServe helped Yottaa build a solution to address high data throughput, large data volumes (estimated in hundreds of terabytes per year), and near real-time analytics. Adopting an agile development methodology, SoftServe successfully managed the prototyping process from Proof of Concept (PoC) to Minimum Viable Product (MVP) using AWS, delivering a fully featured in-house Operational Intelligence platform into production.

### Top 10 Tech Predictions of Predictions

#1 – Robots will replace humans

Source – Robot manufacturers and robotics professors looking for funding

Our Prediction – The entire human population will soon be living a luxurious life of leisure as as-yet-unrevealed robots, wholly unlike the unconvincing wire-strewn devices we currently see, remove any reason to get out of bed each day.

[easy-tweet tweet="All humans will soon be living a luxurious life of leisure as robots do our jobs" user="comparethecloud" hashtags="IoT, robotics"]

Automotive manufacturers and others will indeed further automate existing uses for machines. The task of integrating the various layers of software required to perform even such basic tasks in a 'smart' way will be harder than ever. On the bright side, software development will also be taken over by robots, so the promise of standardised, interoperable and compliant systems may have a chance after all.

#2 – Your parents discuss the 'Internet of Everything'

Source – Telcos and Big Data vendors

Our Prediction – The irrational urge to connect every domestic and industrial appliance to the internet becomes unstoppable. More new uses will be found to link sensors, smartphones and corporate systems. Integration of multiple devices and interfaces will cause more security holes, because the app security industry hasn't yet figured out that real security problems happen because of poor interfaces at the whole-system level.

#3 – Malware, hacks and phishing will empty your bank accounts

Source – IT Security Vendors

Our Prediction – The cyber security hype cycle will continue to churn as companies keep throwing money at cyber training, tools and consulting. Yet still, after billions spent over the past 10 years, several name brands will experience significant breaches. Customers will be up in arms, experts will continue to be mystified and CEOs will head for the door. Those who continue to ignore the underlying structural integrity of their software will once again make headlines.

#4 – Software will be measurable (as if..)
Source – IT departments

Our Prediction – The dark art of code development will soon become, just a little, more like a science. Just as with many other advances in technology, the ability to improve quality will spur more innovation. At this heavenly time of year, it is perhaps relevant to remember that the father of quality himself, W. Edwards Deming, once said "In God we trust, all others bring data". The combination of automated function points, a recognised set of global standards and tools to visualise progress on quality will finally provide some certainty to the effort and the costs in developing software, both in-house and by outsourcers.

#5 – Big Data and analytics will be even… Bigger

Source – Big Data vendors (again)

Our Prediction – Even cleverer insights are possible, yet arguably not necessarily needed, thanks to super-clever software that analyses every micro-moment of every smartphone and web session. The data of the future will be so, er, Big. As data pools increase, the data hype will decrease. The vast amount of Big Data created will finally focus attention on real issues, like application health.

[easy-tweet tweet="The data of the future will be so, er, Big" user="comparethecloud" hashtags="bigdata, analytics, data"]

#6 – DevOps becomes Dev-Oops

Source – Storage and Data Management vendors

Our Prediction – DevOps, the bastard son of Developers and Operations, has become the 'Jack of All Trades' who is 'Master of None'. The losers in this, the guys down in the basement with the still-spinning disks, make a bid to reclaim their 'rightful place'. The de-skilling of IT operations and smaller teams will produce worse code more quickly, which will result in IT systems which are fragile, not just agile.

#7 – 3D Printers Print 3D Printers

Source – 3D Printer manufacturers

Our Prediction – Just like PCs, iPhones and microwaves, every household will soon have its very own 3D printer. Reliability, safety and standards will be overrun by convenience and speed as the Hacker Manufacturer cripples the consumer electronics sector, displacing millions of jobs worldwide as Mom and Dad print whatever they need rather than buying it online. "Hey Mom, can you print me a new iPhone? This one is cracked."

#8 – The Cloud Turns to Smog

Source – Cloud-based enterprise software vendors

Our Prediction – Now that the rush to the cloud is mainstream, the clear choice to cloudify everything will get a bit murky as legacy apps and brittle architecture fail to hold up to the demands of the cloud. To solve cloud performance, reliability and stability issues, businesses will simply move apps back on premise.

[easy-tweet tweet="Legacy apps and brittle architecture will fail to hold up to the demands in the #cloud in 2016" user="comparethecloud" usehashtags="no"]

#9 – Outsourcing becomes Outcome-based at last

Source – Systems integrators

Our Prediction – Instead of just paying for bodies, clients will start to pay for outcomes, or at least to pay by output for software-related work. As automated software measurement becomes a broader reality, application service providers will be able to meter their output and provide a meaningful invoice to clients.

#10 – Tech predictions will cease to exist by 2018

Source – CAST

Our Prediction – As the hype around vendors' tech predictions gets a bit too cynical, the media will stop publishing these types of articles.

So what?
As has been proven time and again, in many cases the effects of new innovations are overestimated in the short term and underestimated in the long term. The common theme CAST sees in all of these predictions, and one we thoroughly agree with, is that the importance of software, and so of its quality, will be significantly greater in 2016. As leading proponents of software quality, we find it perhaps curious that software quality has not featured in our own list. CAST does, therefore, have one prediction to make: that IT costs will remain unpredictable until the industry adopts the global Automated Function Point metric developed by the Consortium for IT Software Quality. CAST is pioneering its use with some of the world's largest and most intensive users of IT. Until its adoption is standard practice, we can expect some very unpredictable IT results. Here's to a productive 2016!
### Natalie Shelley speaks to CTC for #CloudTalks
Natalie Shelley takes time to talk to Compare The Cloud about her company, Bluebird ITS, as part of the #CloudTalks series.
### Customer Analytics, Insights and Experience Forum Back After Last Year's Sold-Out Success
Following the overwhelming, sold-out success of the previous run in Australia, the Customer Analytics, Insights and Experience Forum 2016 is anticipated to be one of the largest congregations of Customer Success thought leaders and professionals in Europe. Held at the London Marriott Hotel West India Quay from 16 - 18 February 2016, this conference will address how customer data can be effectively transformed into valuable customer insights that drive smart business decisions. Ultimately, this will increase positive customer engagement and enhance overall customer experience.
Dynamic discussions over the three days will explore leading topics such as:
- Personalization and Segmentation
- Customer Journey Mapping
- Predictive Analytics
- Real-time Analytics
- Voice of the Customer
- Single Customer View
Delegates will hear from those at the forefront of customer analytics on strategies to increase customer retention, purchases and, consequently, customer lifetime value. They will also gain cutting-edge insights on how using real-time analytics and predictive analytics can transform organizations into agile ones that stay ahead of their customers. Kirsten Kuhnert, Director, Customer Experience and Competitor Intelligence at Microsoft, will be delivering her keynote address on Taking Your Customer Analytics Capabilities to the Next Level. Alongside her is a line-up of renowned customer analytics experts including:
- Barbara Cominelli, Director of Commercial and Operations, Vodafone
- Davide Cervellin, Head of EU Analytics, eBay Inc.
- Kriti Sharma, VP, Head of Product, Real Time Big Data Analytics, Barclays
- Dan Jermyn, Head of Big Data & Innovation, Royal Bank of Scotland
- Sherif Choudhry, Managing Director (Digital and Financial Services), Accenture
- Simon Wood, UK Head of Customer Experience, TNS
"In today’s digital world, customers are becoming more empowered, self-directing and research-driven than ever before," said Joana Narciso, Conference Director at Clariden Global, the pre-eminent institution behind this conference. "Coupled with the proliferation of voluminous and complex customer data, this presents many opportunities and challenges for companies striving to get ahead of the competition." Visit the Customer Analytics, Insights and Experience Forum 2016 website to view the full agenda and speakers' profiles. 
### Shifting business acceleration to the cloud
The long-predicted growth in cloud adoption is now finally accelerating innovation, scalability and business agility, resulting in inventive business models and business growth. As many companies already have highly virtualised infrastructure in place, their IT strategy is increasingly focused on cloud adoption as a means of not just driving cost efficiencies but also innovation. Increasingly, businesses are looking at ways to ease the burden of meeting regulatory compliance and security requirements by implementing the relevant cloud adoption strategies. [easy-tweet tweet="IT strategy is increasingly focused on cloud adoption says Nachiket Deshpande" user="comparethecloud" hashtags="cloud"]
Cloud computing is maturing at a rapid pace, with many "as-a-service" offerings such as infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS), desktop-as-a-service (DaaS), disaster recovery-as-a-service (DRaaS) and software-as-a-service (SaaS). These developments have paved the way for the anything-as-a-service (XaaS) model. This can be seen as the foundation for the next stage of cloud development: the "vertical cloud". The vertical cloud is designed to deliver the core applications, tools and corresponding ecosystem of a specific vertical, allowing organisations to customise cloud services to their specific needs and tastes.
The vertical cloud allows enterprises to pick and choose what to operate in the cloud and what to operate on their own premises, based on the security and compliance requirements that govern their businesses. In industries such as banking and finance or insurance, for example, regulatory compliance is the prime driver when choosing the relevant architecture for IT infrastructure. With major regulations on banking and finance on the way, including Basel III and the EU General Data Protection Regulation (GDPR), ensuring regulatory compliance will remain a major area of investment during 2015 and into next year. However, using the vertical cloud shifts the onus of compliance to the cloud provider on account of their proven and re-usable governance and security frameworks. Vertical cloud offerings can come pre-packaged with the required regulatory obligations and thus offer organisations relief from the encumbrance of ensuring compliance themselves. [easy-tweet tweet="Analysts foresee that a significant acceleration in global #cloud-related spending will continue"]
Continued growth in cloud and IT infrastructure spending
Analysts foresee that a significant acceleration in global cloud-related spending will continue. During 2015, global cloud IT infrastructure spending is forecast to grow by 24.1 percent to USD 36.2 billion, which is about a third of overall IT infrastructure spending, according to IDC. This trend is no different in the United Kingdom, where adoption continues to grow. For example, the banking and finance sector is predicted to spend 75 percent more on IT infrastructure year-on-year in 2015, a big part of which is dedicated to cloud services. Recent guidance issued by the Financial Conduct Authority is likely to perpetuate this trend by advocating the implementation of the cloud by financial services organisations, paving the way for firms in this sector to take advantage of cloud services and the innovation they can foster. 
The main factors driving cloud adoption are industry competition and the pace of change brought on by digitisation. However, businesses need to be more nimble and use the cloud to absorb planned and unscheduled changes swiftly and seamlessly. In order to enable companies to deal with market trends and deviations, the cloud value chain takes a holistic approach to the current business in the context of a changing market. Here is a snapshot of a few such phases which, in rapid evolutionary terms, lead to the adoption of the vertical cloud, a concept that encompasses them all.
[quote_box_center] The cloud service economy is here. The current trend in cloud adoption is to look beyond asset management and traditional methods used to accomplish business outcomes (e.g. developing, testing and repairing). Instead, various flexible ‘as-a-Service’ models offered by cloud firms allow businesses to employ technology solutions themselves, freeing IT teams to focus instead on architectural and advisory services. From the perspective of IT infrastructure, the expectation in the cloud service economy is to have 'value' delivered directly by the investment made, instead of through traditional and laborious ways of realising it.
Anything-as-a-Service as a prelude to vertical cloud. Cloud thinking is spurring some organisations to explore running their entire IT operations on the Anything-as-a-Service (XaaS) model, with cost varying with service consumption. Ultimately, however, it is digital disruption across industries that has added to the increasing complexity of continuing with traditional in-house handling. As such, businesses are faced with the need to handle next-generation requirements such as big data analytics, cognitive computing, mobility and smart solutions, the Internet of Things (IoT) and other examples of digitisation. [/quote_box_center]
[easy-tweet tweet="Security and regulatory compliance are complex and exhaustive" user="comparethecloud" hashtags="infosec, cloud"] Security and regulatory compliance are complex and exhaustive, requiring IT infrastructure and applications to be constantly ready for ever-evolving demands. Hence, pursuing a vertical cloud strategy can help businesses not only to advance and accelerate business growth and gain competitive advantage, but also to ease the burden of security and regulatory compliance.
### Ensuring Security in Today’s Cloud Environment
The cloud offers an array of benefits to businesses, including accessibility and greater control over applications. Not surprisingly, then, we have seen significant growth in the adoption of cloud and mobile applications on a global scale, and Okta’s recent Businesses at Work Report highlights that an increasing number of employees use phones and tablets to access both personal and work-related information. However, with this increased uptake of the cloud and the rise of devices in the workplace, security concerns have escalated as data breaches continue to rise. [easy-tweet tweet="Security is a major obstacle when it comes to adopting digital technologies" user="comparethecloud" hashtags="infosec, digital"] Security is a major obstacle when it comes to adopting digital technologies, and now more than ever it’s crucial that organisations keep cloud applications and sensitive data protected. What can we do to eliminate these challenges and ensure companies adopt cloud and mobile technology easily and securely? 
Data breaches on the rise
The increasing number of data breaches and leaks of personal information over the past few years has put security at the top of the agenda for many businesses. This is supported by 51% of nearly 2,000 senior decision-makers expressing concern about security as a challenge for adopting digital technologies, according to Accenture. Businesses now realise that to maintain the security of end-user computing they need to control the growing number of users, devices and applications that span traditional company and network boundaries. It’s important that IT departments make the transition to ensure that the focus is put on people as opposed to the device. By utilising “what you have” with simpler and more secure methods of verification – such as SMS, push notifications for phones and watches, or hard tokens – companies can ensure users are who they say they are, resulting in a reduced risk of access by unauthorised parties.
The rise of MFA
Organisations are increasingly leveraging multi-factor authentication methods that are easy to use and more secure, instead of traditional security questions – such as “What’s your mother’s maiden name?” or “What was the name of your first pet?” – as a second form of verification. This demonstrates that, due to the countless number of breaches, businesses are eager to put the necessary measures in place in order to protect themselves. Traditionally, businesses have used a second factor to authenticate users to protect VPN gateways. With companies putting more sensitive data in cloud-based apps, such as email content in Office 365, sales data in Salesforce.com and employee information in Workday, they need this second factor for cloud apps. That’s why we see the move towards SMS, soft token and hard token multi-factor authentication to ensure the information is only being accessed by approved stakeholders. [easy-tweet tweet="We're seeing the move towards SMS, soft token and hard token multi-factor authentication" via="no" hashtags="security, cloud"] As more data is transferred to the cloud and remote access to that data is required, organisations of all sizes across diverse industries are adopting multi-factor authentication to achieve strong authentication and user verification, as highlighted in Okta’s report.
Gaining access anywhere – and knowing about it
As flexible and remote working arrangements become more prominent and employees have access to the cloud anywhere and from multiple devices, organisations need to know who has access to what applications and where they are accessing them from. For businesses to keep their environment secure while giving people access to the best tools, their systems need to know who the users are in any context – they need to bring on an identity management solution that will provide users with a single access point. By implementing a comprehensive identity management solution, organisations can gain visibility and control over different applications, access points and users. In turn, users can easily access every application they need via one portal with a seamless user experience. 
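To make the soft-token second factor described above a little more concrete, here is a minimal sketch of verifying a time-based one-time password (TOTP), the mechanism behind most authenticator apps. It uses Python's pyotp package purely as an illustration; neither the library nor the enrolment flow shown here is something described by Okta or the report, and real deployments would add secure secret storage and rate limiting.

```python
# Minimal sketch: verifying a soft-token (TOTP) second factor.
# Assumes the pyotp package (pip install pyotp); enrolment and storage are illustrative only.
import pyotp

def enroll_user() -> str:
    """Generate a per-user secret, stored server-side and shown once to the
    user (usually as a QR code) so their authenticator app can import it."""
    return pyotp.random_base32()

def verify_second_factor(stored_secret: str, submitted_code: str) -> bool:
    """Return True only if the submitted six-digit code matches the current
    time window for this user's secret."""
    totp = pyotp.TOTP(stored_secret)
    # valid_window=1 tolerates a small clock drift between device and server.
    return totp.verify(submitted_code, valid_window=1)

if __name__ == "__main__":
    secret = enroll_user()
    current_code = pyotp.TOTP(secret).now()          # what the user's app would display
    print(verify_second_factor(secret, current_code))  # True
    print(verify_second_factor(secret, "000000"))       # almost certainly False
```

The point of the sketch is that the check compares something the user has (the enrolled secret generating the code) rather than something they merely know, which is why a stolen password alone is no longer enough.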
With the rise of digitalisation and the momentous increase in cloud adoption, it is important for organisations to add the necessary security layers to ensure they protect their constituents and their most valuable and sensitive information. The transition from traditional security questions and the increasing adoption of multi-factor authentication and single access points demonstrate that companies value the additional layers of security now more than ever. If they understand the importance of gaining visibility of users, they can realise the true benefits of operating in a cloud environment.
### Refashioning data security with a nod to cloud
American business magnate Warren Buffett once said, “It takes twenty years to build a reputation and five minutes to ruin it. If you think about that, you’ll do things differently.” Hot on the heels of the fallout from the TalkTalk hack, for many organisations, and their Chief Information Security Officers (CISOs) in particular, that stark reality rings true. Doing things differently in relation to data security strategy is no longer a project for the wish-list, but a boardroom priority. CISOs are rapidly becoming the point people for customer advocacy, brand protection and providing new ways to effectively secure employees and company data. This is all while reducing costs and simplifying business processes to drive competitive advantage. [easy-tweet tweet="It takes twenty years to build a reputation and five minutes to ruin it - even in #cloud" user="comparethecloud" usehashtags="no"] Yet, in the precarious position of balancing advances in mobility, cloud computing and the Internet of Things, succeeding in this newly developed role means a shift in approach. Gone are the days in which the concept of fortress building is acceptable for keeping the ‘bad guys’ out and maintaining control. Organisations are now tasked with securing employees working from remote locations, across insecure networks and applications.
Keeping up with the IT crowd
For many, this shift begins with assessing the advances in technology and how they can be applied to security. As IT embraces virtualisation, consolidation and the consumption of cloud technologies, platform and software services, security and risk professionals have been left behind. While automation and integration may be standard practice across IT processes, this is not always the case for security protection. [easy-tweet tweet="As #IT embraces #virtualisation, and consolidation of #cloud, security and risk professionals have been left behind" via="no" usehashtags="no"] To advance from fortress building to securing the borderless enterprise, CISOs are now looking for a new breed of security solution. This incorporates data security and encryption functionality with cloud-based security and policy-driven data protection that can be delivered across all connectivity channels. Tools that can provide cloud-scale visibility and crowd-shared threat intelligence are also hotly pursued. And this change is being demanded with immediate effect. Research carried out in a commissioned study conducted by Forrester Consulting on behalf of Zscaler indicates that 82 per cent of firms require functionality that provides strong integration with data security or encryption technology now, or within the next year. 
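As a rough illustration of the encryption functionality referenced above, the sketch below encrypts a payload on the client side before it is handed to any cloud service, using Python's cryptography package. This is an assumption for illustration only, not a tool named by Forrester or Zscaler, and the locally generated key is a simplification; a production deployment would keep keys in a KMS or HSM rather than alongside the data.

```python
# Minimal sketch: encrypt data client-side before it leaves for the cloud.
# Assumes the cryptography package (pip install cryptography); key handling is illustrative only.
from cryptography.fernet import Fernet

def generate_key() -> bytes:
    """Create a symmetric key. In practice this would live in a KMS/HSM,
    never next to the data it protects."""
    return Fernet.generate_key()

def encrypt_for_upload(key: bytes, plaintext: bytes) -> bytes:
    """Return an authenticated ciphertext token that is safe to hand to a cloud store."""
    return Fernet(key).encrypt(plaintext)

def decrypt_after_download(key: bytes, token: bytes) -> bytes:
    """Recover the original bytes; raises InvalidToken if the ciphertext was tampered with."""
    return Fernet(key).decrypt(token)

if __name__ == "__main__":
    key = generate_key()
    token = encrypt_for_upload(key, b"customer record 42")
    print(decrypt_after_download(key, token))  # b'customer record 42'
```

The design point is simply that the cloud provider only ever sees ciphertext, so the organisation keeps control of its data even when the storage itself sits outside its own perimeter.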
Embracing a unified approach
With this transition towards enhanced security capabilities, something must also change in the way this functionality is delivered. Teams are currently operating under the burden of multiple hardware appliances and point solutions, commonly delivered across a range of vendor portfolios with little flexibility and integration. As a consequence, the natural next step is to look at how organisations can consolidate their existing security functions into one central framework. Results from Forrester support this suggestion. An overwhelming majority (98 per cent) of IT security professionals believe that an integrated security platform would be more effective in delivering a broad range of cyber security capabilities versus point solutions delivered by multiple vendors. In fact, 76 per cent of respondents claimed that the approach would be very effective in comparison. Additional insight highlights that professionals see these platforms as removing the barrier to delivering advanced techniques such as analytics and machine learning, which depend upon the delivery of consistent big data across all technologies. Of course, when managing a variety of tools, extracting the consistent data to fuel insights becomes incredibly difficult.
Making cloud security pay
In the same way that lines of business and security professionals both have to consider the cost implications of a change in strategy, they must work in tandem to support the overall business goals. Enabling business agility and competitive advantage depends not only upon productivity, or in this case security, but also upon reducing overheads and making intelligent investments that won’t compromise business stability. Combining integrated security platforms with cloud deployment options will deliver improved security and higher scalability. It also means lower overheads and a shift of resources into performing critical tasks. As a result, CISOs should be looking at how they can make significant strides from a cost perspective. This prioritisation is mirrored in findings among 130 security professionals involved in the Forrester study, for whom reducing costs is a major goal and the top-rated driver for adopting cloud security technologies. That said, costs weren’t the only means of making cloud deployments pay. Nearly half (49 per cent) of participants claimed that they would adopt cloud security as a service to gain better security than can be achieved with on-premise deployments. In fact, 48 per cent said that one of the top priorities for implementing cloud tools would be to secure areas that on-premise tools cannot, such as remote locations, mobile devices and Internet of Things solutions. This capability is becoming critically important as CISOs look beyond the fortress walls towards a watertight strategy for the future. 
[easy-tweet tweet="One of the top priorities for implementing #cloud tools would be to secure areas that on premise tools cannot" via="no" usehashtags="no"] Tips from the top Naturally, the commonplace methods of securing the enterprise must be adapted as organisations face the restrictions of evolving threats and distributed workforces. Executives need to first assess their existing tools and the functionality they deliver to make decisions about how they can apply more advanced techniques. Next, they must determine the most effective way to deliver these capabilities. Building a security ecosystem that prioritises integration and orchestration across all security technologies in the company’s portfolio is key. This will enable teams to actively seek tools where vendors have developed integration between their products already, or by assessing managed security offerings where service providers can deliver pre-integrated platforms. By making steps to effectively address the need to promote cloud-based security as an enabler for competitive advantage, CISOs are expected to protect the everywhere enterprise in a cost effective and flexible way. Only then will they be to fulfil their role in maintaining the delicate balance between customer trust and brand reputation. ### Do you have the right formula for cloud survival? Consolidation and the lessons from motor racing history  Is your Cloud vendor more like a works teams like Brabham, McLaren or Williams, a premium brand like Lotus, Mercedes and Ferrari, or a volume one like Toyota or Volkswagen? In the early days of motor racing, like today, a number of drivers competed with each in time trials in order to qualify for a place on the grid. The difference back then was that most of the drivers were enthusiastic amateurs. Eventually works teams started to emerge that were more organised and more professionally focused and in more recent times much better funded. Teams such as Brabham, McLaren, Williams and so on started to squeeze out the private entrants. Some these firms were then in turn displaced or bought out by the big automobile manufacturers that nowadays dominate motor racing as the main sponsors / competitors.  [easy-tweet tweet="The initial players in the #OpenStack community are being replaced and bought out at an incredible rate" user="billmew"] I know the history well as my father use to motor race. As an enthusiastic amateur he owned a garage in Tunbridge Wells and raced at weekends with the mechanics from the garage acting as his pit crew. In the 1960s they towed the car behind their van across the continent seeking to qualify for races such as the 1961 Formula 3 Grand Prix at Monza in Italy. There were 90 entries but only 30 places on the grid to be decided by lap times. They managed to get the last position on the grid for the final and were the only private British entry, racing against the works teams. Soon the private entrants were squeezed out entirely by the works teams and the whole affair became rather professional. However, before they did, my father made his name in history: in one race in 1963 having failed to get away correctly at the start, he twice broken the Formula 1 lap record at Brands Hatch fighting his way back from the back of the field to finish second. 
A short while later the formula classifications were restructured into the Formula 1, 2, 3 and 4 categories that we know today – meaning that the old Formula 1 lap record at Brands Hatch is still held to this day by none other than John Mew. See more here: Tunbridge Wells Motor Club: “John Mew In at the Best Time”
Similarly, the band of enthusiasts who were the initial players in the OpenStack community are being replaced and bought out at an incredible rate. With OpenStack taking far longer than many had expected to take off, many of the initial players have simply run out of funds. Nebula ran out of funds a while ago and, rather than sell out, it simply shut up shop and its team moved on. Others have chosen to sell out. IBM has recently acquired Blue Box and Cisco has gobbled up Piston Cloud, which will be incorporated into Cisco’s OpenStack Private Cloud, joining MetaCloud. EMC had earlier acquired CloudScaling, while HP had acquired Eucalyptus. By absorbing early OpenStack firms these giants are simply doing what they do best: turning point technologies into fully productised offerings within their vast portfolios. Just as the works teams were the only ones with the funding to take Formula One forward, these big players are now stepping in to take OpenStack forward. About the only remaining stand-alone OpenStack player is Mirantis, but for how long? I mean, how long will it be able to compete with the level of investment in R&D, channel enablement and marketing that these giants can bring to bear? The same wave of early expansion followed by consolidation has been seen in many industries. Cloud (and OpenStack in particular) is just going through the same growing pains.
The parallels here go further. Premium brands like Lotus, Mercedes and Ferrari, which have invested in Formula 1, compete with each other at the top end of the automobile market, where it is very much a premium play, but they can’t compete with volume manufacturers like Toyota or Volkswagen in terms of volume. Similarly, Cisco, EMC, IBM and HP are investing in OpenStack as a private or hybrid play for premium cloud offerings, knowing that they cannot compete on volume or price with the mega-scale public cloud vendors like AWS, Azure and Google. The lessons here are clear: don’t assume that the initial players in any market will be the same ones that end up commercialising it. And don’t accept a lift from a driver with the surname Mew (we all learned to drive from our father). This post was originally published July 24, 2015.
### LENUX ON MAINFRAME
[caption id="attachment_20673" align="alignright" width="300"] Len Santalucia[/caption] Whilst at the LinuxCon event in Seattle this year we caught up with Leonard Santalucia, CTO of Vicom Infinity. Vicom Infinity is one of the Founders of the Open Mainframe Project, which was announced at the event. Vicom Infinity is based in the heart of NYC with additional offices outside the city. They provide customers with sophisticated IT solutions through world-class IT products and uncompromising service, leading customers through an information technology transformation with zero impact on business operations. [easy-tweet tweet="Len was the first person to push the @IBM #linux on #mainframe message to Wall Street " via="no" hashtags="Lenux"] Len established an IBM Linux on z Systems Mainframe Executive Customer Council on Wall Street, and 15 other councils across the USA. 
IBM is still running most of these councils today (they were originally formed over a decade ago), and consulting with Len to help run them. Len also started the “Linux Center of Competence” at IBM HQ on Madison Ave in NYC, ensuring that knowledge and skills for Linux on the mainframe were advanced, well understood and supported. So, with all of these achievements under his belt, we think that Len was ideally placed to comment on the launch of the IBM LinuxONE, a purpose-built mainframe for Linux environments – the only one of its kind – released on the 17th of August this year! What did Len have to say about this? Well, there’s no surprise that he had a lot to say.
“It’s been an amazing journey over the last 50 years for the mainframe. To see a purpose-built product based on mainframe architecture from IBM, the LinuxONE, is absolutely fantastic. Support for Ubuntu, Red Hat and SUSE, as well as having the scale of a powerhouse computing environment, is an excellent move. The reliability, availability, integrity, and security of a z Systems mainframe simply cannot be matched today by x86 technology. This is why Linux on z Systems has been so widely adopted across both the public and private industry sectors, even before the purpose-built LinuxONE product line was released, and I am very proud to say that I had the honour and privilege to be a part of the process!”
[quote_box_center]With further comment, Len explains: “In today’s age, mobile agility is a must, together with security and redundancy. A z Systems mainframe that can supply you with hypervisors and open source software that even stands up to earthquake simulations, let alone a completely resilient in-built architecture, takes away any doubt that could keep you awake at night. We are also very proud that Vicom Infinity is considered a key go-to IBM Premier Business Partner for Linux on the Mainframe for Cloud, Analytics and Mobile service applications and implementations. This plays well into the hands of the traditional Managed Service Provider that can now offer hosted services from one of the most secure and resilient computing environments ever made. These include services such as printing-as-a-service and database consolidation for database-warehousing-as-a-service, as well as amazingly fast I/O throughput that handles hundreds of thousands of transactions per second securely. This is coupled with a new flexible pricing model that allows clients to pay only for what they use and not for the entire system! As an example, the LinuxONE can run as many as 8000 Linux VMs and also run effectively at 100% utilisation – you tell me what x86 environment can do that!”[/quote_box_center]
I doubt that there is anyone (certainly no one we know) who could question Len’s comments; the facts of LinuxONE speak for themselves. It’s a very strong move by IBM to release a new mainframe product line that caters to the brave new world of Open Source, but I think a smart one. With elasticity of computing resources as well as security, both key for cloud computing today and in the future, IBM has ensured its place as one of the main driving forces behind mission-critical service provisioning, while catering for new services coming through the ranks offered by service providers, for possibly another 50 years to come. 
The new IBM LinuxONE product range is aptly named after penguins: the Emperor, for the large-scale enterprise, and the Rockhopper, the entry-level model. This post was originally published on Aug 27, 2015.
### Crouching OEM Tiger, Hidden Cloud Dragon
A View On The Cloud Dream In China
Today when we look at the Chinese market we think of it as an OEM (Original Equipment Manufacturer) or white-label goods provider – goods that are produced in China and rebranded for sale elsewhere, such as Apple iPhones. On November 20, 2012, China’s newly elected president, Xi Jinping, spoke about a term known as the “China Dream.” This dream, in terms of cloud and technology, is for China to move beyond a manufacturing economy to one of technical greatness. China wants to become the designer, producer and retailer of these solutions, not just the manufacturer. It is unjustified to maintain that most Chinese innovations are derived from reverse engineering or technology repackaging. While it may have been true in the past or remain true in certain areas, the China of today has surpassed that. Real innovation is now commonplace in China, just as it is in countries which previously were just markets for outsourced skilled IT labour, such as India. [easy-tweet tweet="The technical community in the Western Hemisphere is always keen to focus on the hyperscale #cloud vendors such as #AWS" via="no" usehashtags="no"]
The technical community in the Western Hemisphere is always keen to focus on the ‘hyper-scale’ cloud vendors such as AWS or Microsoft Azure, but in terms of sheer scale and population density there is no region that matches China – a country that epitomises ‘hyper-scale’ in every sense. In China, acting at such massive scale – call it ‘hyper-scale’ if you wish – is normal. Some Chinese banks have as many customers, and therefore as many accounts to manage, as half the local banks in Western Europe combined. China Unicom has over 400 million subscribers and counting – that’s more than the entire population of the US, and about the same as the entire population of Western Europe. Now step back and think about this for one minute: that's 400 million subscribers, all using mobile internet. The magnitude of the systems, data centres, and infrastructure required to power such a huge number of users is immense in both technical and organisational terms. To achieve true ‘hyper-scale’ globally, a cloud platform has to be ‘road tested’ with millions of subscribers. There are only a few cloud providers that can make this claim today.
To develop and continually improve the everyday feat of technical engineering that is the Chinese market, networking and telecommunications giant Huawei has been investing globally in a number of universities (approximately 20) to develop 5G technology, with an estimated $600 million investment committed to 2018. It is our belief that this significant innovation and investment, alongside a solid partnering strategy, will push Huawei to stratospheric heights internationally, and enable it to become the dominant force and de facto choice for all new 5G deployments. When we think of technical and development hotspots, one usually assumes Eastern Europe or the Indian subcontinent to be the key areas. India has certainly become a key player in the worldwide technical economy, with companies such as Tata Infotech and Infosys becoming common brands amongst the Enterprise CIO elite. 
India has benefitted from significant investment from offshoring contracts ranging from software development through to call centre operations. These contracts originate in both Europe and the US, but the cost of development in India is now rising rapidly, with increasing salary demands and infrastructure burdens altering the balance in favour of new entrants to the offshoring market, such as China. An example of success dates back to 2008, when IBM and the City of Wuxi created the first cloud computing shared service. This service, whilst initially slow to gain traction, has now broken into profitability and succeeded in penetrating its core market.
A lot of technology companies today evangelise and promote the term IoT, or Internet of Things, as part of a vision involving a proliferation of connected devices. So what does this mean for Chinese manufacturing and cloud computing? The simple answer is ‘computational and analytics processing on a scale never seen before’. Imagine the sensor data from all those factory machines, ranging from RFID outputs through to nanotechnology, with the computational need to collate, store, retrieve, process and analyse all this data. This will require cloud resources of monumental proportions combined with the necessary software, services and consultancy to achieve a successful result.
As Red Army commanders have always understood, China’s greatest asset is its people. Even when ill-equipped the Red Army made its numbers count, but imagine for a moment if they were as well equipped as any Western army. In engineering and technology this will soon be true. China is reinvesting the profits from its manufacturing success in skills for the future. Chinese universities are now turning out 7.5 million graduates a year (London’s population in 2013 was 8.038 million). So that's only 0.5 million short of the population of London being highly educated and moved into the Chinese workforce every year. It makes the 46,000 engineering graduates in the UK each year look like a rounding error. For every three engineering graduates being produced in the UK there are 500 being produced in China. China is perceived by many to be 'the world’s factory'. I suspect we have all lost count of the number of times the impact of cloud computing has been compared to the industrial revolution… so if cloud is the ‘industrialisation of Information Technology’, why would the ‘world’s factory’ not be in a perfect position to be the world’s cloud partner? UK and European companies should look at the wider opportunities available and embrace the operational / cultural considerations needed to become a partner in the China dream. Originally published March 2015.
### Double Down on the Cloud: It’s a No-Brainer
It’s December. Another year has gone and we have another chance to take a moment to predict how our industry will grow and develop next year. 2015 may have been a ‘year of the cloud’, but who’s to say 2016 can’t be a different type of year for rapid adoption of cloud-based applications around security, analytics and collaboration. [easy-tweet tweet="John Marlow shares his thoughts on #cloud #vendors, #applications, #security and #collaboration in 2016" via="no" usehashtags="no"] Here are my specific predictions for the coming year:
1. 
Cloud vendors will become the go-to communications solutions providers.
FTSE 100 companies and many more have already adopted cloud CRM platforms like Salesforce, and this year I expect to see cloud PBX and cloud contact centres move upstream. It’s more economical for companies to move to communication solutions in the cloud as it saves time, space and human resources, as well as giving companies more opportunities for new, additional and flexible communications solutions that can improve the way they do business, such as apps and collaboration tools.
2. Cloud applications will untether corporate workers from their desks.
No one likes feeling chained to their desk, and cloud solutions free employees to work anywhere with high productivity. Enhancing traditional company phone systems, for instance, with cloud applications that run as well on mobile phones as they do on desktops will enable users to have untethered and rich experiences regardless of their location or device. Availability of such cloud business apps will increase, pushing further development of more user-relevant functionality and allowing workers to do more in the palm of their hand or on other devices like wearables, which are fast becoming the next phase of device-led innovation.
3. Analytics drawn from cloud app data will become part of everyday business.
The connected employee is evolving and growing at a rapid pace; end users have become massive consumers of services and information, pushing up the volume of data being presented and utilised by businesses and users alike. Cloud-based applications are only pushing this data volume higher. 2016 will bring with it the need for logical presentation and consumption of this data. Accessibility of these tools will be key to their successful adoption and growth, as these analytics become an essential part of a business’s cloud solutions. Utilising these analytics to empower just-in-time, actionable business insights will be the new normal for many C-suite executives.
4. Cloud security solutions will become paramount.
[easy-tweet tweet="It is predicted that by the year 2020, the #IoT will include 26 billion units." user="comparethecloud" usehashtags="no"] With more and more business conducted in the cloud, security is imperative and leaves no room for error. The explosion of IoT will also make cloud security even more essential, as Gartner predicts that by the year 2020 the IoT will include 26 billion units. As hacks get more sophisticated and the number of vulnerabilities increases, it will be down to the cloud providers to keep businesses’ data secure, with significant investment and development in this space being inevitable. Add to this the possibility of new legislation from the EU and UK parliaments for more robust and top-grade cloud security, and it is clear that security will be a key concern in 2016.
5. Collaboration tools will become essential for business communications.
The workplace as we have known it is changing. The rise of social media and the need to be reachable anywhere, anytime, coupled with the limitations of legacy communications tools like email, have resulted in a higher focus on collaboration and team interaction, forcing broad adoption of collaboration solutions in corporate lines of business. 
Collaboration tools are poised for accelerated adoption for team projects and person-to-person communication, possibly challenging email someday as the primary form of written communication. Collaboration tools will become essential for users as they look to streamline work, improve their efficiency and discover new ways of interacting with colleagues and customers alike.
### Top cloud speculative analysis and futurism from 2015
My favourite hobby (aside from fanatical support for Arsenal FC) is a mixture of speculative analysis and futurism – basically reading the runes within the technology industry (and the cloud sector in particular) and seeking to predict the strategic direction of its key players, factions and technologies. And it’s great fun. [easy-tweet tweet=".@BillMew shares his favourite blogs from 2015" user="comparethecloud" hashtags="cloud"] As a final flourish at the end of this year I have been asked to reflect on the year and share with you a short list of the articles I found most interesting and most fun to compile. Here you go. I hope you enjoy reading them as much as I enjoyed putting them together.
1. Why telcos would rule the world if they got their act together
Here I look at the natural advantages that telcos enjoy and how they would have had a dominant position in hosting, in cloud services and in apps if they had been more dynamic and innovative.
2. HP quits Public Cloud; Dell buys EMC; What’s happening?
Here I try to make sense of the way in which the old-era tech titans are struggling to respond to the young Turks of the cloud era.
3. Falling like Dominoes: Snowden, Safe Harbour, Crypto and the future of CyberSecurity
Security and privacy have dominated many discussions this year, but the landscape took a massive turn when the Safe Harbour arrangement was struck down and, with current encryption at risk, the future will look very different.
4. Future of Cloud: Part One
Here I start to look at the new competitive dynamic that the increasing pace of the transition to cloud has caused.
5. Future of Cloud: Part Two
In the second part I look at how private cloud, and in particular OpenStack, will fare in the short, medium and long term.
6. Ultimate Guide to Cutting Sales and Marketing Costs for Cloud Services
This final flourish pokes fun at the similarities in the cloud mantras being promoted by so many of the would-be cloud players. What most of them lack, however, is any really compelling differentiation and a stand-out proposition to alter the dynamics of the market. The technology that so many of them are pinning their hopes on – Hybrid Cloud – is a necessary stepping stone as organisations transition from the old world to the new, but the end point lies elsewhere. 
You’ll hear more from me on this in 2016. In the meantime, have a great festive break, and I hope you enjoy revisiting these blogs. We’ll see you back here next year.
### Considerations for Data and Applications in 2016
More and more data and applications are moving to the cloud, which means that those applications and data are accessed by individual endpoints rather than being saved centrally or protected by the existing company IT security infrastructure. For CIOs, coping with this shift in how IT services are delivered means that they have to think about their security and DR planning. Security and DR are going to see some challenges in 2016, so we spoke to another two cloud experts about their thoughts on the future. Here are the predictions from Jaspreet Singh, CEO at Druva, and Wolfgang Kandek, CTO at Qualys. [easy-tweet tweet="Read @Comparethecloud's next two #CloudExpert 2016 predictions from @wkandek and @jaspreetis" via="no" usehashtags="no"]
Wolfgang Kandek, CTO at Qualys: [caption id="attachment_33659" align="alignright" width="220"] Wolfgang Kandek[/caption] Many IT security vendors are predicting that there will be a huge upswing in attacks on mobile devices. Personally, I do not see mobile phones being the next big security target of 2016. iOS and Android are immensely better than traditional computing endpoints when it comes to commercial malware; the incidents of malware getting onto phones mainly came when there were failures in the process around development of the applications that people use, as in the case of XcodeGhost, or when people downloaded fake applications from crack sites. In PC security terms, this is going back around 15 or 20 years in terms of approach. However, other mobile devices – particularly laptops that don’t often get back on to the corporate network – will be targeted. Devices used by mobile workers that have to access Internet-based services from untrusted locations will be at risk if only traditional security approaches are used. Instead, it is important to think about how these endpoints can be kept secure when they are never within the corporate network and can’t rely on large firewall implementations or network security solutions to keep them secure. As more applications and IT infrastructure move into the Cloud, the amount of IT on the corporate network will be reduced. This will make endpoint security more important. Taking a “secure by design” approach to how those services are accessed, including multi-factor authentication for access control and continuous scanning for vulnerability management, will therefore become more important in 2016 as well. I think we will continue to see PCs and laptops being the primary targets, as these offer attackers the greatest return on investment. Making use of Cloud security alongside Cloud applications will be a natural next step as more IT services move into the Cloud. [easy-tweet tweet="iOS and Android are immensely better than traditional #computing endpoints when it comes to commercial malware"]
Jaspreet Singh, CEO at Druva: [caption id="attachment_33658" align="alignright" width="200"] Jaspreet Singh[/caption] Today, more company data is being held on mobile devices than in central storage. IT teams are therefore relying on individuals taking best-practice approaches to saving data. This is not ideal. Cloud DR can help IT plan more effectively, whether that data is held centrally or is found on individual devices like laptops and mobile phones. 
Gartner recently predicted that people would continue to have multiple personal devices; by 2018, employees will have three to four personal devices that can be used alongside enterprise-provided IT devices. As people use these additional phones, tablets, PCs and other devices, it will spur the commoditisation of enterprise storage, leading to price reductions. In this “mobile-first” world, devices will be where you merge various services and SLAs into a single system that works seamlessly for end-users. Abstracting the data away from the device can actually help here too – rather than tying users to specific devices with isolated storage of data on each one, the business can manage all data assets wherever they happen to be. The consumerisation of cloud storage will see businesses focusing less on managing the infrastructure to contain and hold data; instead, business IT will concentrate more on building out services that bring value by using the volumes of available stored data. [easy-tweet tweet="By 2018, employees will have 3-4 personal devices that can be used alongside enterprise-provided #IT devices" via="no" usehashtags="no"]
### 10 cloud predictions for 2016
Compare the Cloud has been inundated with predictions for 2016. Today we’re going to focus on two Top 5 lists of predictions, from Glenn Bladon of IT services firm ECS and Rajiv Gupta of Skyhigh Networks. [easy-tweet tweet="2016 predictions from #cloud experts Glenn Bladon and Rajiv Gupta" user="comparethecloud" usehashtags="no"] Let’s begin with Glenn Bladon’s 5 predictions. Glenn is Director, IT Consulting, at ECS; with over 25 years’ experience and a thorough understanding of the IT industry, he brings an executive view to his predictions.
[quote_box_center]
1. Discourage Cloud Sprawl
Having heard how easy it is to launch new cloud services, in 2016 we expect line-of-business managers to demand new services from their CIOs and IT teams without realising how difficult they are to deliver within the desired timeframe. CIOs will need to fully embrace the cloud for certain applications – and take responsibility for planning, management and integration – to avoid the prospect of being bypassed by managers deciding to develop their own new services directly using SaaS, PaaS or IaaS.
2. Which Development Model?
2016 will see CIOs develop clearer strategies for further cloud adoption. It will be interesting to see which development models they adopt: Agile, DevOps, Bi-Modal IT, ‘All in’, or Hybrid IT. My view is that hybrid cloud environments will dominate.
3. IaaS adoption will explode
Amazon Web Services’ and Microsoft’s plans to open data centres in the UK for the first time will result in a mass move to AWS and Azure by UK businesses: the price points, features and speed at which businesses can turn on this infrastructure make it a no-brainer. Because these two companies own 95% of the public cloud market globally, it is unlikely that private clouds will be able to keep pace.
4. Security fears over Public IaaS will diminish
Public IaaS offerings will turn a corner in 2016 and start to be seen as more secure than on-premise or outsourced data centres. The decision by Amazon, Microsoft and others to open or expand data centres in the UK also reduces some of the problems associated with data sovereignty/data residency rules – although organisations will still need to ensure they are complying with the appropriate national and international data protection requirements.
5. 
More Sophisticated Cloud ROI
Calculating cloud ROI in ways other than capex/storage cost reductions can be difficult, but in 2016 we’ll see some organisations starting to measure cloud ROI by looking at business outcomes. For example, measuring the ROI of a new, customer-facing cloud-based service by looking at the innovation costs in relation to improved or new customer relationships. [/quote_box_center]
And our second set of 5 predictions comes from Skyhigh Networks co-founder and CEO, Rajiv Gupta. Rajiv has more than 20 years of enterprise software and security experience and is considered a leading web services expert. [easy-tweet tweet="Massive #cyberattack costs in 2015 mean rises in cyber Insurance in 2016 says Rajiv Gupta" via="no" usehashtags="no"]
[quote_box_center]
1. Cyber security insurance prices will double.
Insurance companies absorbed massive cyber-attack costs in 2015. In response, rates and premiums are on the rise. Companies will balk at prices and may need to agree to unfavourable terms in order to afford coverage: Anthem had to commit $25 million towards any future costs to secure $100 million in coverage. Many insurers max out coverage at $75 or $100 million – well below the cost of a catastrophic breach, which can reach a quarter of a billion dollars.
2. European regulators will resurrect Safe Harbor.
Global companies paid attention when the European Court of Justice struck down the data transfer agreement known as Safe Harbor, which allowed companies to store Europeans’ data with US cloud providers. The ECJ’s decision certainly raised valid issues: companies should be wary of leaving sensitive data unencrypted in cloud services, especially those located in countries with dubious privacy records. Not all data is sensitive, however, and Safe Harbor’s absence will impose unnecessary and unrealistic limitations on operations in the cloud. Regulators will compromise to facilitate global access to data.
3. The majority of cloud security incidents will come from insiders.
Cloud service providers have improved security to the extent that breaches on the provider side will become few and far between. This leaves enterprise employees as the weak link. 90 percent of companies experience at least one cloud insider threat per month. Whether malicious or unintentional, your own employees will be your greatest cloud security threat.
4. Companies will start to pay off cloud security debt.
More and more companies are full-speed ahead on cloud, but so far security has lagged behind. There’s a gap between where cloud security budgets currently are and where they should be based on overall security spending. According to Gartner, companies allocate just 3.8 percent of cloud spending to security, compared to 11 percent of overall IT budgets. In 2016, budgets for cloud security will outpace overall IT security spending as companies play catch-up.
5. OneDrive will become the most popular cloud file-sharing app.
Currently in fourth place for data volume uploaded, OneDrive will surge in the rankings as companies move to the cloud with Office 365. Companies have already shown confidence in Microsoft’s cloud platform as a system of record for sensitive information, uploading 1.37 TB per month with 17.4 percent of files containing sensitive data. There is still a huge growth opportunity, however: 87.3 percent of organisations have at least 100 employees using Office 365, but 93.2 percent of employees still use Microsoft on-premises solutions. 
Microsoft has invested over one billion dollars in security, and recently released a new Office 365 API for partners to monitor and secure sensitive content. Satya is taking cloud security seriously, and companies who were previously hesitant will migrate to Microsoft’s cloud offerings. [/quote_box_center] ### A look to the future: cloud trends set to take off in 2016 2016 will cement the idea that the cloud revolution is here to stay. Cloud computing has become a vital part of any enterprise IT strategy, democratising the way in which IT delivers services and how users access information and business services. At the centre of this lies the realisation that data, as a critical business asset, can be efficiently stored in the cloud. [easy-tweet tweet="#Cloud is set to disrupt the #data and #analytics landscape even further" user="tableau @comparethecloud" usehashtags="no"] In the coming year, cloud is set to disrupt the data and analytics landscape even further. The database, integration and analytics markets will continue their race to understand how each can make the most of this opportunity. As competition heats up between the major cloud players, demands in the market, new partnerships and acquisitions will give rise to cloud challengers. This will ensure that 2016 will see plenty of innovation in the cloud sector. Non-traditional data sources are in demand The race is on for your data - and not just typical internal data assets. The cloud giants, from Amazon Web Services to Microsoft, want organisations to move their data into their ecosystems, whether this is data from web platforms or data from machines and devices. [easy-tweet tweet="The #cloud giants, from #AWS to #Microsoft, want organisations to move #data into their ecosystems" user="comparethecloud" usehashtags="no"] Many companies have already taken steps towards building an enterprise data lake in the cloud. Given the inexpensive storage options that cloud players provide, as well as the appeal of zero capital expenditure of hosted solutions, this is an attractive option. These organisations will be especially open to easy paths for including non-traditional (but increasingly vital) data sources in ever-scalable platforms. The cloud price war continues The cloud price battle rages on as leading providers continue trading blows on how cheaply they can offer their services. The end game boils down to channel partners assisting with the on-boarding of customers onto preferential cloud platforms. Reduced cost adds a new element to the cloud war This year, the contest will intensify as key partners get dragged into the arena. With partners able to capitalise on the deeper resources of larger providers, the ability for them to offer their own services at a reduced cost adds a new element to the cloud war. Large enterprises choose cloud 2016 marks a tipping point for cloud computing. Large enterprises from every industry have woken up to the advantages it offers and are transitioning their entire infrastructure and data ecosystems to the cloud. Adopting a cloud strategy boosts efficiency, cuts cost and risk and can help to streamline a firm’s productivity. 
These benefits are now impossible to ignore, especially as CIOs peer five years into the future and the alternative of massive unsustainable overhead stares menacingly back… Cost remains key for IT [easy-tweet tweet="Keeping tabs on #cloud deployment costs, and their capacity to expand rapidly, is vital for #IT"] Keeping tabs on cloud deployment costs, and their capacity to expand rapidly, is vital for IT. As the primary reasons for choosing a cloud strategy are cost and efficiency, CIOs must be able to verify those benefits. Over the next twelve months (and beyond), IT leaders will rely on cloud analytics solutions to explore their usage and billing data, and enable them to spot costly services and prevent budget overruns. What's more, thanks to the rise of mobile analytics, these solutions are on hand 24/7, 365 days a year. As a result, IT leaders will be able to do all this from their smartphone or tablet, even whilst based in the field or travelling to a meeting. Moving data to the cloud is easier than ever With self-service cloud analytics and data prep now a reality, 2016 will be all about simple methods for pushing data from inside organisations as well as from web platforms into cloud data ecosystems. The ability to let an individual without a technical background move data into a cloud ecosystem quickly and easily is on the way. Simple solutions that break down the complexity of data integration, staging, and transformation and focus solely on letting business users drop data into preferred cloud databases and warehouses are in the pipeline. Data privacy remains at the forefront of the cloud agenda In light of European data rules on the Safe Harbour Principles, U.S. companies moving user data across the Atlantic need to be explicit on their position on data privacy, or run the risk of losing their customers' faith. For firms big enough to already have dedicated infrastructure in Europe, the message is simple. For those that don't, 2016 will be a test. They must decide: do they take a chance and hope that a new agreement is hammered out that allows them to continue operating solely in the U.S.? Or do they protect their customer loyalty by proactively committing to European data infrastructure and getting involved in the data privacy debate? The future is hybrid One foot in the cloud and one foot on the ground? When it comes to a technology roadmap, a hybrid approach has largely shaken off the 'playing it safe' perception. It is now openly accepted as the best path for some organisations. As a result, solutions and services built to support this model will blossom like never before. Mobile and cloud analytics overlap and merge [easy-tweet tweet="In 2016, the #cloud and #mobile markets will continue to overlap one another" user="comparethecloud" hashtags="tech"] According to research firm Gartner, in 2016, the cloud and mobile markets will continue to overlap one another. In a world where devices are always connected and data increasingly resides in the cloud, words like 'mobile' and 'cloud' no longer matter. It simply becomes about answering questions quickly and communicating results. As the cloud becomes mainstream, it's increasingly clear that there is no new normal. Change is the constant. Over the coming year, the providers, forms, and purposes of the cloud will continue to evolve.
However, we can count on the fact that more people than ever will be operating in the cloud: storing and working with data fast and efficiently, so that they can see and understand its value. ### It's a Hacker's Life! A few weeks ago I shared with you the joys of Unisys Stealth, a product that was developed for military cloaking of IT and has been reworked for commercial use. Now you might be wondering why I'm talking about Stealth when the title of this blog is 'It's a Hacker's Life!'. The hacker cannot hack what he cannot see, so let's briefly go inside the mind of the hacker to see where he would go if he could see your network. [easy-tweet tweet="Learn how hackers breach your networks, and how to prevent it from happening" user="neilcattermull" hashtags="cybersec"] I recently sat through a presentation from Ilia Kolochenko, CEO of High-Tech Bridge SA, and I thoroughly enjoyed his explanation of what a hacker does with regard to enterprise and SME environments. The following is an overview of my favourite parts of his presentation. To begin, let's look at some statistics. Frighteningly, we are highly vulnerable purely due to our lack of focussed attention when it comes to applications. [quote_box_center] "27% of all security breaches at banks in 2014 involved web app attacks" | Verizon "70% of vulnerabilities exist at the application layer, not network" | Gartner "4/5 intrusions involved insecure web apps" | Frost & Sullivan and High-Tech Bridge "74% of respondents consider public-facing web applications as the major threat" | SANS "30,000 websites are hacked every day to distribute malware" | Sophos Labs "86% of all websites have at least one serious vulnerability" | WhiteHat Security "96% of tested applications have vulnerabilities" | Cenzic [/quote_box_center] Let's take a walkthrough of a hacker's logical steps when trying to gain access to your data. Let's assume that a hacker is trying to gain access to your network - we can follow the steps that they are likely to take. In this scenario we assume a non-sensitive web application attack. Step 1 - To begin they could compromise your website, even if it doesn't have any confidential data on it! Step 2 - Then they could place an exploit-pack (malware) on one of your website's pages, keeping the same design/style in place so you don't notice the alteration Step 3 - Moving forward, the aim is to contact the victim (your employees, your big clients or partners) via email Step 4 - Once contact is established, they may send a link to your website by social network or email Step 5 - Snap! The victim clicks! The vulnerability in your browser or its component is exploited Step 6 - The victim's device is now compromised, and a backdoor installed to control the device remotely Step 7 - From here the attackers could get into your own or your VIP client network, and do all kinds of damage. Step 8 - The final stages of the attack include the attackers carefully patching your website, to prevent others from hacking it Step 9 - Then the real kick in the teeth: the attackers can re-sell access to your website on the Dark Web Amazing, hey? Well, this is at the lower end of sensitivity, so imagine the chain of events at the higher end, say a bank or equivalent. Let's see how a hacker may gain entry.
Step 1 - Quickly fingerprint IDS/IPS/WAF (if any) to define how to silently bypass them Step 2 - Compromise one of the web applications, or one of its components Step 3 - Patch the exploited vulnerability to prevent competing hackers from getting in Step 4 - Download all valuable data from your databases Step 5 - Download your backups and source codes of web applications Step 6 - Backdoor your web application to get instant and invisible access to it Step 7 - Try to re-use your IT team passwords to compromise other internal systems Step 8 - Try to re-use your customers' passwords to compromise their emails, PayPal, etc Step 9 - Sell your data on the Dark Web and/or blackmail you with demands for ransom Now you have a firmer idea of a hacker's process when accessing your network; forewarned is forearmed, as they say. Many companies turn away from investing in tight security principles and don't see the inevitable coming. We need to stop knee-jerking after the fact and start to be proactive with cyber security! With cybercrime becoming more lucrative than the drugs trade, the more visible you are, the more risk you expose yourself to. [easy-tweet tweet="Hackers can't hack what they can't see, and luckily with #UnisysStealth we've solved this problem"] Hackers can't hack what they can't see, and luckily with Unisys Stealth we really do solve this problem. Cloaking your network using Stealth ensures you are reducing your attack surface and micro-segmenting your network with encrypted communities of interest, giving you the best possible protection from hackers and allowing you to maintain a more successful, predictable and safe network! ### Cheques are here to stay; get ready for new UK government legislation In the UK, nearly £535 billion of cheques were processed in 2013. While this represents a declining trend in cheque transaction volumes (to just under 720 million), over nine out of ten organisations still continue to use them. In 2009 the Payments Council proposed the removal of cheques by 2018, as they were perceived to be an archaic method of payment. However, this was rejected as it caused considerable anxiety for many people in the UK, particularly those who are elderly, housebound or rely on cheques to conduct their day-to-day business. CHEQUES ARE HERE TO STAY!!! So we know that cheques are here to stay, but we also know that the government wants to address the challenges with all payment methods available for individuals and businesses. So what are the challenges specific to cheque payments and what measures are the government taking to optimise this element of the industry? [easy-tweet tweet="Sick of waiting for your cheques to clear? Going #digital can change it all says @james_mccaskie" user="bluebirdITS" usehashtags="no"] The current 2-4-6 standard Today, cheque clearing operates on a maximum '2-4-6' timescale. This means that a customer paying a cheque into their account starts to earn interest on the money no later than two days after depositing the cheque; no later than the fourth day, the customer is able to withdraw the money from the deposited cheque, but the cheque can still 'bounce'; only on the sixth day can the customer be certain that the money is theirs. Currently, cheques are physically transported from bank branches to clearing centres, where the cheque is read and cheque data is exchanged electronically between banks. Under current arrangements the paying bank takes responsibility for detecting cheque fraud.
Here the paying bank can examine the paper cheque to determine that it is genuine and has not been fraudulently altered, and to establish that there are sufficient funds and that the cheque has been signed, dated and written correctly. Click the images below for more on #Fintech [caption id="attachment_31440" align="alignright" width="300"] Top 10 Fintech Startups[/caption] [caption id="attachment_33439" align="alignright" width="300"] FCA Compliance[/caption] These aspects of the cheque system add delay and expense, contributing to the time it takes for cheques to clear, and the high fixed costs faced by banks and building societies in running the various centres and transporting paper cheques between branches. Please step forward the government! Proposed 2-day clearance As part of the current legislation, banks reserve the right to see any physical cheque before accepting or processing transactions. The government is proposing to amend that right, to reduce the roadblock of having to physically deliver cheques. Cheques will still have to be written on paper cheque books issued by banks; it will not be possible for a customer to 'write' a cheque digitally on their phone. The newly proposed legislation will reduce the time to clear cheques to less than two days, which will ultimately reduce costs for banking organisations, improve the service to us, the public and small businesses, and offer a more secure and auditable trail of the payment process. So how will this be achieved? Step forward cheque imaging! Cheque imaging converts the paper cheque into a digital image, providing the following benefits: Cheques cleared quicker - Cheque imaging represents the most reliable vehicle for the industry to upgrade the current '2-4-6' standard to a new maximum timescale of '1-2-2', offering greater certainty and clarity for cheque users. Types of cheque fraud eliminated - Cheque imaging provides new opportunities to combat fraud and tackle security threats that currently affect cheque users. By reducing the time lag between a cheque being written and money being moved between accounts, cheque imaging helps eliminate the types of cheque fraud that could take place within this window. [easy-tweet tweet="Going #Digital can stop cheque fraud, speed up deposits and improve cheque use overall" user="james_mccaskie @bluebirdITS" usehashtags="no"] Fewer cheques lost - Converting the paper cheque into a digital image also eliminates the risk of a cheque being lost whilst in transit between banks, mitigating the possibility of human error in the current system that sees a percentage of cheques go missing each year. Compliance – A fine representing 10% of European company revenue awaits those out of compliance…need I say more? Customer attrition - Financial benefits are often not enough to keep or win customers. In truth it is about offering secure and innovative services to the population. I'm regulated by the FCA…what do I do next?! It's time to investigate whether your company is ready for cheque imaging. Being non-compliant with the proposed legislation could result in a fine representing 10% of your annual turnover (a sobering thought, right?). In honesty, this probably shouldn't be your largest fear. With my banking customers, attrition is often the largest business challenge. Financial benefits are often not enough to keep or win customers. In truth it is about offering secure and innovative services to the public.
After investigating this, I would recommend reviewing the Gartner Magic Quadrant for Enterprise Content Management. Review carefully and then get in touch with me if you have any questions. If you have any concerns about implementing cheque imaging or content management solutions, please feel free to get in touch with me on james.mccaskie@bluebirdits.co.uk or on Twitter @james_mccaskie. ### Hey Meg! We're happy to help out, just give us a call! Change is the one constant – and nothing has more change than the world of information technology. Recently we've seen epic mergers and divestitures – Hewlett Packard splitting into Hewlett Packard Enterprise and HP Inc (HP Ink – so close!), Dell acquiring EMC, and Google creating multiple companies under their Alphabet umbrella. Organisational changes of any kind can be challenging, but on this scale the risk and IT headaches reach a higher plane. Splitting up a company's IT infrastructure is not as easy as replicating it or just drawing a dotted line down the middle. What's the key to success? To take a closer look, I spoke with Richard Donaldson, director of Infrastructure Management and Operations, eBay. Having recently undergone the divestiture of PayPal, Richard explained that the main challenge for their team was how to separate the compute resources and the data between the two companies. [easy-tweet tweet="Splitting up a company's IT infrastructure is not easy" user="comparethecloud" usehashtags="no"] This was no small task. eBay and PayPal had 125,000+ assets and hundreds of vendors providing business services across the company. Luckily, four years ago eBay undertook an inventory management project to understand what they had, what it was used for, and what was and was not allocated. They established that a portion of the infrastructure was unallocated and realised they could make better gains with more of a just-in-time approach, so they looked to optimise their unallocated assets. From our work in helping firms divest of or merge IT systems, here's a summary of ServiceNow's best practices: 1) Audit all IT processes – Look at each process and establish what actually happens. Once you understand what you are dealing with, you can make more informed decisions. Do you know what assets or inventory you have globally, who owns them, how they were purchased, what business units they are allocated to and even if they are still used? You might think that you do, but if you really had to untangle a business service – such as employee onboarding – and divest it, could you honestly say that you knew all the components that made up that service? I'd wager most people do not. This lack of visibility can lead to replication and uncertainty. I've seen companies with infrastructure still in commission that was originally used to support a service that is now outsourced! 2) Decide to "fix it now" or "clone it" – You need to make a choice as to whether to clean out what you found in the audit and build a system that allows you to manage things right. The clone-and-go option replicates the architecture you already have to just lift and shift it. The downside – you might be replicating something that is redundant and simply providing a short-term fix without addressing the long-term issues within IT processes.
eBay chose the "Fix-it-now" route and took the first step of upgrading their IT management system, initially using ServiceNow as a foundation for the divestiture and then as a platform for more sustainable future success for both companies. 3) Plan for three stages – First, create an intermediate system that both companies can use as the divestiture builds up. In this way both companies use the same data and processes to manage the way they work, by replicating the intermediate system to two new company instances. Second, flip the switch and split the networks. Third, delete the intermediate system. Once the divestiture happens you can remove the linking system, allowing both companies to work independently on systems that they have designed and optimised together. [easy-tweet tweet="Companies need to take a rigorous approach to standardisation and enforcement" user="comparethecloud" usehashtags="no"] Richard added that companies need to take a rigorous approach to standardisation and enforcement. Service management is a clutch player in these key aspects: Standardisation – We all manage servers, routers, workflow, etc. Service management helps you standardise on a platform and see what you have. Discipline and operational rigor – A functional CMDB gives you much-needed accuracy in the asset tables. Visibility to cut costs – Don't overprovision! Know what you have and need. Right-size inventory and support agreements. "Divestitures become a lot easier when you have a good service management philosophy. Get your arms around what you have and how it's used. It's not sexy, but you need to do it. There is no point in recreating something that is not used or decommissioned," Richard said. Wise words from a man who could probably write a book on this! With this insight and control, you're in a powerful position to absorb or divest business – on any scale. If you want to go deeper, check out how Rio Tinto and Pacific Aluminum sped up their split. ### The technology landscape in the year 2025 It is easy to predict what is coming up for the next year, but thinking further out is always a challenge. In this blog post I am going to look at advances made in 2015 that will impact us in 2025. [easy-tweet tweet="Daniel Thomas, Founder of @Comparethecloud speaks his thoughts on how developments of 2015 will relate to 2025" via="no"] Quantum Computing IBM this year announced a major breakthrough in quantum computing that allows for the detection and measurement of both kinds of quantum errors simultaneously, and the introduction of a square quantum bit circuit design. Impact in 2025 Quantum computing, when put into full commercial use, will allow for the solving of questions that have always eluded mankind. Whilst initially the technology will be seen as malevolent due to its ability to break current cryptography (although it also solves this issue), we will see more general use in 2025, akin to how we use devices today. Brain 2 Brain Comms Brain-to-brain communication has already had a significant breakthrough in 2015, when two brains were linked for a question-and-answer experiment. Impact in 2025 Is this the start of the Humanification we blogged about in 2013? How will our children learn in 2025? Will we see today's education methods dismissed as archaic? Are we looking at the rise of education algorithms that teach according to the subject's capacity to learn?
Brain-to-brain communication has the potential to disrupt all learning institutions and methods; all traditional education establishments would have to re-evaluate their curriculums. Information on a subject, or problem-solving capacity, would be instantly downloadable directly into the brain. How will this allow for a measure of intelligence? An example I use from 2015 is the question of whether a child should need to learn maths to an advanced level when calculators and a plethora of devices already perform this function. [easy-tweet tweet="How will our children learn in 2025? Will we see 2015 education methods dismissed as archaic?" user="comparethecloud"] Nuclear Fusion Nuclear technology is a much-maligned and frowned-upon source of energy - an example being the Japan earthquake receiving ten times more attention once the Fukushima reactor was in danger. Nuclear fusion is best described as the harnessing of sources as powerful as the sun. The advancement of this technology could revolutionise how we consume and manufacture power. Impact in 2025 The European Union's first reactor of this type is coming online in 2020. Five years into this journey, will we see, alongside the use of superconductors and the miniaturisation of systems, a dramatic change in how we power our data centres, homes and industrial units? Is this the first step towards a positive climate change revolution? The Robot becomes our friend? (Or does it) It is frankly amazing when you travel or speak with people from different cultures - an example being the Japanese, who have always seen robotics as an aid and a friendly advancement. My personal view is that the difference in our generation has been the portrayal of robotics. In Japan we have the friendly-looking Honda Asimo; compare Asimo to the mean-looking Predator drones and destructive Terminator robots we see in Western cultures, with the destructive stereotype burned into our children's brains by the continued growth of gaming consoles. Impact in 2025 We already have robotic hoovers, robotic industrial workers and a robotic seal that comforts the elderly in Japan (yes, really). As artificial intelligence advances we will see more 'learning' capability, automation and freedom in robotic 'services'. Note I said 'services': it is my view that repetitive tasks such as domestic cleaning will be 'service based' in the same way we order a taxi now. By this time drones would have become sufficiently robust and accurate to allow for home deliveries and automation of shopping services. Q. Will robots be evil? A. Are we talking about robots or the artificial intelligence that drives them? This is the fundamental question we should ask. Artificial Intelligence Artificial intelligence has made significant leaps and bounds in 2015, fuelled mainly by the cheap abundance of computing resource, the miniaturisation of devices delivering more power and, soon, the introduction of commercial quantum computing. Big data and analytics have also been significant investment growth opportunities; coupled with machine learning, 2015 has left us with the building blocks for advancement in human history like no other time since the industrial revolution and the electrification of industries. [easy-tweet tweet="Artificial intelligence has made significant leaps and bounds in 2015" user="comparethecloud" hashtags="AI"] Impact in 2025 The military forces of this world will always take technologies and abuse them for nefarious purposes.
My view on the malevolent predictions for AI is that we have had nuclear weapons since 1945, and these have been improved on drastically since that time; have we seen a nuclear holocaust? In 2025 what we will see is the destruction of any industry that relies on repetitive tasks, such as accountancy or medicine. AI will be the 'brains' behind all our devices, all of our choices, all of our health services and, above all, our social views. The issue with such a thing is that we believe everything and trust whatever is 'on our screens'; as humans we will need to use our intellectual capacity to move beyond 'acceptance' to challenging what we see. The question that I ponder is this: for AI and its mechanical cousin, robotics, to be truly aligned to human capability requires the one thing that, at this Christmas time, we all possess - 'humanity'. Humanity, thoughts, feelings and emotions will never be replicated until humans become a 'hybrid' with machines. I believe that in 2025 we will have advanced legacy technologies from 2015 such as pacemakers, ingestible tablets and other areas, but we will not be ready for the true singularity until 2035, by which time protections would be put in place. [easy-tweet tweet="Humanity, thoughts, feelings and emotions will never be replicated until humans become hybrid with machines"] What will our homes look like in 2025? The Internet of Things, or IoT, is already beginning to shape how we interact with the devices around us - from devices that constantly measure our heart rate and the steps we take during our day, to sleep devices that measure the quality and length of our sleep. M2M, or machine-to-machine, sensors that move beyond current RFID chips to smart low-power devices will allow for computations and measurements in every aspect of our lives. Whilst many of these things are here today, by 2025 we will have mastered the computational and algorithmic aspects to create a perfect balance. Therefore every device we own will be 'smart', from the TV we watch to the earpiece we wear. Our interaction will be seamless, thought-controlled and tuned to our every whim and need by smart algorithms that run our lives. The biggest cause of waste and loss in any environment is the 'friction' between objects. Imagine if the Japanese bullet train were powered in a vacuum on magnetic rails with no friction. Our homes and devices could harness this centuries-old magnetic technology to provide effortless movement of devices and heavy objects. Add to this the embryonic advances in graphene, nanotubes and molecular-based engineering, and could we have self-forming objects crafted and formed in minutes in front of our eyes? As we move into 2016, let us all look back at what has been, from a technology standpoint, a year of growth but still one of unfulfilled promise. Maybe in 2025 I will look back at this article and laugh at its inaccuracies; remember, though, that when cinema was first shown, people ran out of the theatre thinking the train was going to hit them. Predicting the future is not easy, but then neither is looking at the past. My final quote is from British scientist James Lovelock, an unsung national treasure.
"For each of our actions there are only consequences." And last but not least, a big thank you to those that inspire me: the team at Compare the Cloud, Simon Porter, Ian Jeffs, Andy Johnson, Rob Davies, Matt Lovell, Doug Clark, Omer Wilson, Michael Andrew-Foote, Gordon Davey, David Fearne and Alan Baxter, a secondary school teacher who forced a loud, unruly dyslexic child to make circuit gateways during numerous detentions (you're my hero). ### The future of the manufacturing industry lies in additive manufacturing The advent of additive manufacturing - better known as 3D printing - has opened up a whole host of opportunities for manufacturers looking for the competitive advantage offered by having their products designed and created as quickly as possible. Furthermore, advances and developments in 3D printing technology have contributed to the rise of 'digital manufacturing' across Europe, powering forward the digital economy and allowing the next generation of 'makers' to bring their ideas to market at speed. It is clear that additive manufacturing is big business for today's manufacturing industry. 3D printing - from hobby to major technology advancement The news has recently been awash with stories of major 3D printing breakthroughs. 3D printed limbs, bones, and even 3D printed skin have been hitting our headlines, demonstrating how 3D printing is revolutionising the healthcare industry, and indeed many other industries across the world. Additive manufacturing has the capability to change today's manufacturing sector in particular and can transform modern factory floors. Indeed, organisations such as Proto Labs are already harnessing the power of additive manufacturing. Developers can upload a 3D computer-aided design (CAD) file onto the company's website, with an interactive quote for that design being turned around in a matter of hours. This kind of fast response and rapid prototyping is delivering speed and efficiency benefits to today's manufacturing business. [easy-tweet tweet="3D printed limbs, bones, and even 3D printed skin have been hitting headlines" hashtags="3Dprinting "] Accelerating innovation One priority for today's manufacturer is ensuring speed to market to keep up with the accelerating pace of consumer and customer demands. The question is: how are fast production times achieved without compromising on quality? In the past, many UK manufacturers have had to offshore their manufacturing base for rapid production of parts while keeping operational costs to a minimum. The beauty of investing in additive manufacturing is that it allows manufacturers to re-shore their manufacturing base back to the UK - as fast mass customisation, at a low cost, can be delivered at home and closer to the customer. Additionally, by harnessing additive manufacturing technology, the UK manufacturing sector can stay ahead of the competition as it helps to improve the job prospects for young STEM (science, technology, engineering and maths) talent. For example, a young creative designer or software developer may not be aware that they can embark on a manufacturing career. As the worlds of hardware and software converge, the manufacturing sector needs the brightest and best in IT talent to help accelerate innovation in the new era of 'high-tech' manufacturing.
[easy-tweet tweet="Additive manufacturing can inspire a new generation of #STEM talent" user="comparethecloud" hashtags="3Dprinting"] Conclusion Additive manufacturing has the capability to truly transform the manufacturing sector in the UK, helping to revitalise a once-booming industry. Manufacturers have to adapt and embrace new digital business models that focus on speed, quality and customer demand, and these types of business model will require innovative software and the latest in technological advancements for ultimate success. Additive manufacturing can inspire a new generation of STEM talent, who can use their skills to breathe new life into the high-tech manufacturing sector of today. ### Should UK business become paper-free? Paper has been integral to the world of business for decades. Whilst the majority of us are comfortable tapping away on laptops and responding to email on our smartphones, for many people, particularly those who work 'in the field', pen and paper remain de rigueur. Whether it's completing a form on a construction site or scribbling notes for a report, using pen and paper at work is typical. [easy-tweet tweet="Almost 15% of an organisation's revenue is spent creating, managing and distributing paper documents " user="comparethecloud"] Unsurprisingly, almost 15% of an organisation's revenue is spent creating, managing and distributing paper documents. Furthermore, it is estimated that most companies could reduce their printing costs by 10–30% (Cap Venture). The alternative is of course to 'go digital'. For field-based workers, this means recording information on a mobile device via an app, which is then stored in the cloud, accessible anywhere, anytime - a simple illustrative sketch of this capture-to-cloud step appears below. For businesses, 'going digital' enables them to manage the business and staff online, as all activities and tasks can be viewed and managed via the web. Considering how digitally savvy most of us are, going 'paper-free', i.e. capturing everything on a mobile whilst in the field, should be obvious. However, for many companies the thought of eradicating paper altogether is perceived as costly and time-consuming, and, quite frankly, most businesses are resistant to change. So, what are the positives? Are the benefits worth the time and investment? Saving time and money The average UK small business owner spends 16 hours a week on paperwork – almost 50% of their time. By recording everything digitally, there is no time wasted re-purposing information when back in the office. With all information stored in the cloud, data can be accessed from anywhere, at any time, with little need for printing. The business can then focus its attention on actually using this information, rather than recording it. Increase productivity For companies, being able to digitally see and manage your employees ensures staff are as productive as possible. From an employee perspective, technology enables information to be captured more quickly and easily.
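To make that 'capture once in the field, store in the cloud' step more concrete, here is a minimal, illustrative Python sketch of what a field app's submission routine might look like. It is a sketch only: the endpoint URL, API token and field names are hypothetical placeholders, not a reference to any particular product or vendor mentioned in this article.

```python
import json
import urllib.request
from datetime import datetime, timezone

# Hypothetical cloud endpoint and API token - placeholders only.
API_URL = "https://example-cloud-forms.invalid/api/v1/site-reports"
API_TOKEN = "replace-with-your-token"


def submit_site_report(site: str, notes: str, completed_by: str) -> int:
    """Send one digitised field form to the cloud as JSON and return the HTTP status."""
    record = {
        "site": site,
        "notes": notes,
        "completed_by": completed_by,
        # Timestamping at the point of capture makes the record auditable later.
        "captured_at": datetime.now(timezone.utc).isoformat(),
    }
    request = urllib.request.Request(
        API_URL,
        data=json.dumps(record).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_TOKEN}",
        },
        method="POST",
    )
    with urllib.request.urlopen(request) as response:
        return response.status


if __name__ == "__main__":
    status = submit_site_report(
        site="Construction site A",
        notes="Scaffolding inspection complete, no defects found.",
        completed_by="J. Smith",
    )
    print(f"Report submitted, HTTP status {status}")
```

The point of the sketch is simply that, once the record exists as structured data rather than ink, it can be validated, timestamped and made instantly available back at the office - which is where the time-saving and accuracy benefits discussed in this article come from.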
Motivating your employees [easy-tweet tweet="55% of UK SMEs cite admin as their biggest hate, #Cloud could change that" user="comparethecloud"] With 55% of UK SMEs citing admin as their biggest hate (2015 Infusionsoft Small Business Market Research Sales & Marketing Report), empowering your staff with new, intuitive technology demonstrates that they are valued by their employer and worthy of investment. On a day-to-day basis, eliminating the headache of using paper, not to mention the time spent filing, will help to increase morale and productivity, as well as, ultimately, improving employee retention. More secure Security is a real 'buzz-word'. However, capturing important information on paper is a legitimate and significant potential security threat. What happens if your paper files are stolen or damaged in a flood? Is your storage facility truly secure? And how much is it costing the business to store information indefinitely? With all your data and information safely in the cloud, these risks are reduced. Compliance and auditing Compliance is a necessary 'evil' for the majority of businesses, covering areas such as health and safety, environmental, fire and accident reports. For some industries in particular, it is a huge burden in terms of time and cost. Going digital reduces this burden considerably, ensuring all data is stored in the cloud, instantly accessible and secure. Improving accuracy [easy-tweet tweet="Recording on paper might seem easy in the field, but re-purposing #data when you're back at your desk is time consuming"] Recording important information on paper might seem easy when out in the field, but re-purposing this data when you're back at your desk is time-consuming, mundane and can result in multiple errors. Deciphering scribbled handwriting is a potential minefield; in fact, the cost of human error to businesses in the US and UK is estimated to be £24 billion (Zurich). Digitally inputting this information ensures you're not having to re-input data back at the office. Both the employer and employee can then access and manage data quickly and easily. So, going paper-free offers multiple benefits to businesses, from employee motivation through to compliance. Consumers are only going to become even more digitally savvy and demanding every day, so for employees in the field, using a mobile app rather than paper is straightforward. For employers, being able to digitally manage their field-based staff is quicker, simpler and ensures higher levels of productivity. Whilst it might seem daunting to make such a seemingly significant change to your business, ultimately, becoming 'paper-free' will make your business more profitable. ### The security trends shaping business in 2016 IT security – in all its forms – has been a major cause of sleepless nights amongst IT industry professionals for as long as anyone can remember. For most of that time, engaging other areas of the business in the potential risks associated with data loss has been an uphill struggle. That's now changing, and here's why. In a business environment where being better connected confers many advantages, the reality of securing an IoT-enabled, fully networked world is becoming apparent. 2015 saw some high-profile breaches. Hacks like TalkTalk and Ashley Madison are making senior management ask themselves: how do you make sure it doesn't happen to you? At the same time, a concerted effort on the part of the government has undoubtedly raised awareness.
Cyber is now considered amongst the biggest threats to UK businesses, and the government initiative, Cyber Essentials, has sought to improve understanding of what it takes to secure an enterprise. Here are my predictions for the security trends we will be seeing more of in 2016. [easy-tweet tweet="Cyber is now considered amongst the biggest threats to UK businesses" user="achrispace" hashtags="cybersec"] Internal and external – it's all security Forget the old demarcation between insider threats and external attacks. 2016 should be the year we stop categorising the difference between the two. The biggest challenge for the year ahead will be joining up traditional perimeter defences with better protection against attacks from the inside. Cloud customers: internal security measures should be top priority Gartner has predicted that 95% of cloud security failures will be the customer's fault and, more specifically, attributable to poor internal security practices. Being able to fully trace and manage the internal movement of data isn't just going to be important if you have a cloud provider; it makes security sense too. If you look at the most high-profile hacks of recent years, weak internal defences are the common denominator. After the initial breach, when there are few internal barriers, lateral movement and therefore damage is easy. Strengthening internal access provision isn't just a cyber threat deterrent; it reduces the likelihood of data breaches from insiders, which actually account for the majority of data breaches. In 2016 we will have another reason too – complying with the EU's GDPR will require a review of how data is stored, processed and moved. Mitigating the risk of attack…through insurance If your firm is considering a cyber insurance policy, you are not alone. Cyber is now considered the biggest threat to UK businesses and the meteoric rise of the cyber insurance market is proof of that demand. Paying an insurance company to share some business risk makes good commercial sense. But be warned: putting a cyber-premium in place does not guarantee a payout should a breach occur, unless all required security measures are enforced. According to a study we conducted, around half of IT pros weren't able to tell if necessary security software updates were being made successfully, or if ex-employees or contractors still had access to the systems. Better, instead, to focus on getting some of these basic security measures in place and ensuring the IT department is involved in any decision-making regarding a cyber policy from the start. Focus on the people, not just the tech CISOs under pressure to provide impermeable defences against external threats may be relieved to hear current thinking suggests that enterprise security should be managed holistically, i.e. by the IT department working in conjunction with other business areas, like HR. Organisations may be missing 'predictable behaviour cues' that would presage a hack. In the holistic model, the IT department provides the IT security tools and the HR department provides the appropriate processes and procedures that need to be followed, as well as creating a necessarily more 'vigilant' culture. To put it into a real-life context, what's the chance that this Christmas bonus season a disappointed worker starts to behave in a way that demands closer scrutiny?
[easy-tweet tweet="If #cybercrime is the number one threat to UK business, why are there so few #technology experts at board-level?" via="no" usehashtags="no"] IT security needs to come out of the shadows TalkTalk is still counting the cost of its attack. Expect to hear more analysis of the breach in 2016, for instance, when CEO Dido Harding goes before a House of Commons Select Committee for a (very public) grilling. Perhaps the most important lesson from TalkTalk is the importance of having strong, IT-literate leadership. If cybercrime is the number one threat to UK business, why are there so few technology experts at board level? TalkTalk should be the battering ram security professionals use to open up the C-suite over the next 12 months. This festive season, that's something we can all raise a glass to. ### #XmasCloud Christmas Show - Recorded 18th Dec 2015 Get involved with the show via Twitter using the hashtag #xmascloud to our @comparethecloud account and join in the discussions below. Stay tuned for prizes from our "wheel of tat". Clips [embed]https://www.youtube.com/watch?v=iHk2iEfn1gQ[/embed] ### 2016, the year of hyper-converged architectures Looking toward 2016, I believe that hyper-converged architectures will be the primary mode of deploying virtualisation in any new Software-Defined Data Centres, and also that private cloud installations will start including hybrid topologies. [easy-tweet tweet="In 2016 third-party performance #analytics are going to be required to ensure 360-degree latency inspection" user="xangati"] Hyper-Convergence We can expect that third-party performance analytics are going to be required to ensure 360-degree latency inspection, because hyper-convergence is a software-centric architecture that tightly integrates compute, storage, networking and virtualisation resources, and other technologies, from the ground up in a commodity hardware system supported by a single vendor. Without a trusted source analysing all aspects of performance, the default answer for any degradation issues would be to recommend that you 'scale out' with more systems from that single vendor. Service assurance metrics are complicated even further when a hyper-converged infrastructure is extended to the cloud and its resources are shared. [easy-tweet tweet="The Dell-EMC merger will drive other #IT systems vendors to promote multiple #hypervisors " user="xangati" usehashtags="no"] Hypervisors In 2016 I think that the Dell-EMC merger will drive other IT systems vendors to promote multiple hypervisors in addition to VMware's vSphere; hopefully, this will in turn create growth headroom for KVM (open source), Citrix XenServer and Microsoft Hyper-V. If this happens as envisioned, 2016 will see concepts that transcend the hypervisor-virtualisation orthodoxy, such as containerisation, become much more palatable. Another trend that almost certainly will be spawned by the Dell-EMC alliance is the notion of a custom hypervisor pre-integrated with the converged stack, similar to what Nutanix has done with Acropolis. Microvisors Enterprises will explore and pilot many different permutations of containers in order to derive greater agility and scale, as well as operational flexibility. With or without OpenStack, in both private and hybrid-cloud environments, 'microvisors' for containers will start to gain traction.
From a managed, hosted or cloud service provider standpoint, the advantages of running container instances will help streamline the goal of multi-tenant management. Microvisors (thin hypervisors with the native intelligence to use only the necessary components of the OS to migrate software to cloud-centric infrastructure) will prove especially useful in migrating legacy applications to cloud platforms. Virtualisation and Automation 2016 is going to see key network functions increasingly virtualised; thus, Software-Defined Data Centres will create more performance challenges when storage, networking and hybrid cloud are added to the virtualisation mix. This will lead enterprises to explore more progressive frameworks for service assurance delivery, which will be increasingly automated and integrated at the orchestration layer. Traditional silos have isolated performance metrics, but these will now be required to share conversations about workload and traffic flows, and take into account all of their inter-dependencies, including v-storm contention and device conflict. Microservices Microservices, which break down IT systems and applications into smaller, more granular elements; containers, which take applications and services down to a self-contained, component level; and DevOps, which provides the framework for the IT infrastructure and automation to develop, deploy and manage the virtualised environment with greater agility, will begin to intersect, and that intersection will rapidly increase. The goal will be to create a more seamless management process while automatically tracking, reporting on and maintaining the minimum thresholds needed to meet service-level targets. ### Could your IT infrastructure harm your business? Part I Vulnerabilities in IT infrastructure are something all companies need to be aware of and work on constantly. This is Part I; return to us in January 2016 for Part II. Following recent high-profile IT and data security failures, how much risk is your IT infrastructure placing on your business? [easy-tweet tweet="Join CTC and @zsahLTD in analysing how #IT Infrastructure could be harming your business" user="comparethecloud" usehashtags="no"] Over recent months, there have been a number of high-profile and very well-publicised security issues resulting in a loss of customer data. These include the hacking of Sony's systems, the Ashley Madison dating site in the US, Marks and Spencer, Apple and, most recently, TalkTalk in the UK. All of these demonstrate the potential risks and resulting damage to business from breaches in IT security. The impact of such a breach of security can last for many years, with massive reputational damage and loss of customer confidence not only in the company's systems, but also in the brand itself. There is typically also a significant short-term business impact. In the case of TalkTalk, some reports claim that up to one-third of all customers have terminated or are looking to end their contracts in the immediate aftermath of the problems. In this article, we discuss the causes of security failures and outline the key lessons to be carried into any business environment to reduce the risks to your business, including any specific considerations for the use of cloud services. What are the main causes of recent security failures and how can businesses, big and small, protect themselves in what appears to be an increasingly insecure world?
Firstly, we need to consider that there are actually three distinct, but related, elements – overall security (including processes and physical security); hacking (i.e. attempts to breach the IT systems' security); and privacy of data held within your IT systems. To the non-expert eye, it is commonly considered that the primary risk comes from external hackers and that protection is achieved mainly through the use of security measures such as firewalls and virus protection software. Indeed, the majority of hacks and attempted infiltrations involve Denial of Service (DoS) attacks to bring down a company's IT services. A Distributed Denial of Service (DDoS) attack was the first part of the attack used to gain access to TalkTalk's customer data – and it is critical that all companies maintain up-to-date virus protection measures. [easy-tweet tweet="Agents looking to infiltrate your #IT infrastructure can still often find ways to get past security software" user="zsahLTD" usehashtags="no"] However, even with up-to-date security software, agents looking to infiltrate your IT infrastructure can still often find ways to get past this layer of security. One of the most common tactics employed by hackers is the "exploit" – a piece of software that manages to get past the security layer and, in doing so, opens up a channel through which more malicious software can be deployed. These are typically prevented by a process of notification (when detected) followed by the issue of a new security patch provided by vendors. The problem, though, is the delay involved in getting new security patches implemented in the market – one day in the world of IT infrastructure is like one year in normal business cycles. It is therefore critical that all businesses keep all aspects of the IT infrastructure up to date. For many businesses, the use of externally provided cloud services can significantly de-risk this part of the IT delivery – avoiding what might be a reliance on a potentially small internal IT department, removing this work from them and allowing them to focus on more strategic objectives. However, outsourcing the IT infrastructure does not outsource the problem, and we will talk later in this article about additional considerations for companies who choose to go down this route. Firewalls are a key part of the defence against external hacking and will, properly implemented, provide a barrier to potential "exploit" packages. However, given the constant re-invention of attacks from external agents, the time to implement patches means that the risks can never be completely eliminated. Many firewall manufacturers are now starting to implement "auto-patch" versions of their products – but don't be misled into thinking that this solves the problem. Many patches and software revisions will require a restart, with knock-on effects for the rest of your systems and availability to your staff and customers. [easy-tweet tweet="The best practice in the implementation of firewalls is to employ a dual-layer #firewall" user="zsahLTD" hashtags="infosec"] The best practice in the implementation of firewalls is to employ a "dual-layer" firewall. This will significantly reduce the risks of breach, as an exploit package that manages to open up a channel through the first firewall layer will hopefully not be successful at the next layer.
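To make the dual-layer idea concrete, here is a deliberately simplified Python sketch - a toy model under assumed example port rules, not a real firewall configuration or any vendor's product - showing why traffic has to satisfy two independently maintained rule sets before it reaches internal systems: a gap in one layer, or an exploit that slips past it, is not enough on its own.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Packet:
    """A highly simplified view of a piece of inbound traffic."""
    src: str
    dst_port: int


# Two independently maintained rule sets, e.g. two devices from different vendors.
# The ports listed here are illustrative examples only.
OUTER_ALLOWED_PORTS = {80, 443, 25}   # perimeter firewall
INNER_ALLOWED_PORTS = {443}           # inner firewall in front of sensitive systems


def outer_layer_allows(packet: Packet) -> bool:
    return packet.dst_port in OUTER_ALLOWED_PORTS


def inner_layer_allows(packet: Packet) -> bool:
    return packet.dst_port in INNER_ALLOWED_PORTS


def reaches_internal_systems(packet: Packet) -> bool:
    # The packet must satisfy BOTH layers; anything that only clears
    # the perimeter is still stopped before the sensitive systems.
    return outer_layer_allows(packet) and inner_layer_allows(packet)


if __name__ == "__main__":
    for packet in [Packet("203.0.113.7", 443), Packet("203.0.113.7", 25)]:
        verdict = "passes both layers" if reaches_internal_systems(packet) else "blocked"
        print(f"port {packet.dst_port}: {verdict}")
```

In practice the two layers would typically be separate devices, often from different vendors, so a vulnerability or misconfiguration in one is unlikely to be mirrored exactly in the other - which is the reasoning behind the "dual-layer" recommendation above.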
Data Security So, if we assume that we cannot 100% guarantee against a malicious attack getting through the security systems, what should companies be doing to protect their own, and their customers', data? Check back with us in January to read Part II, which includes our recommendations for securing your data. ### Customer Analytics, Insights and Experience Forum 2016 16-18 February 2016 London Marriott Hotel, West India Quay The Customer Analytics, Insights and Experience Forum 2016 will address how customer data can be effectively transformed into valuable customer insights that drive smart business decisions. Ultimately, this will increase positive customer engagement and enhance overall customer experience. A sell-out success in its previous run, this forum is expected to be one of the largest congregations of customer success thought leaders in Europe this year. This forum takes place from 16-18 February at London Marriott Hotel West India Quay and features a stellar line-up of regional experts from top companies such as Microsoft, Vodafone, Royal Bank of Scotland, eBay Inc, Accenture, HSBC, TNS, Volkswagen, Barclays and Lloyds Banking Group. You will hear about strategies to increase customer retention, purchases and, consequently, customer lifetime value. You will also gain cutting-edge insights on how using real-time analytics and predictive analytics can transform your organisation into an agile one that stays ahead of its customers. Register on our website with the promotional code L6056CLOUD50 to enjoy £50 off the current conference fees. This discount is only for 2-day conference packages and above and cannot be used together with any other discount schemes. For additional info, you may contact Eileen Tay at eileen.tay@claridenglobal.com ### Is private cloud the best option? A number of CIOs around the globe have expressed a clear intention to move legacy IT infrastructure to private clouds or shared services centres within their companies. According to a recent survey by Deutsche Bank, the main reasons they want to move to the private cloud are stability, security and service level agreements. But is the private cloud really the best option? A big drawback is that all the management, maintenance and updates are the responsibility of the company. Outsourcing IT to cloud vendors, on the other hand, removes this burden and ensures key targets, such as security and usability, are always met. Take cloud telephony services, for example, which shrink and expand depending on business needs. By managing these changes, as well as system upgrades and helpdesk access, vendors enable businesses to spend more time on value-adding work. [easy-tweet tweet="The main reasons companies want to move to #privatecloud are stability, #security and SLAs" hashtags="cloud"] Private cloud is not always the answer While the private cloud may be gaining traction as a way to safeguard the role of traditional IT, even with large IT budgets it's unlikely that a company not in the business of providing cloud services would match the investment and innovation of dedicated cloud vendors. One of the benefits of outsourcing to cloud vendors is the ability to try out the latest technology without having to find additional resources that could be spent on other business-growing activities. Indeed, there are countless cloud innovations and most companies simply don't have the time or skills required to investigate, develop and test these new technologies.
One of the biggest concerns for businesses using cloud providers is security. Many IT managers worry about potential leaks; however, these concerns shouldn't blind them to the benefits of the cloud. Today's cloud computing practices can actually improve security. Providers have stringent regulations they must comply with, such as the PCI Data Security Standard (PCI DSS), and their success depends on their ability to keep customers' data safe. Outsourced cloud computing compensates for many businesses' lack of understanding and knowledge of what these regulations involve, which translates to enhanced security. Furthermore, one of the key benefits of the cloud is that even if a company loses physical machines, the data itself will be securely backed up and instantly retrievable from separate data centres. Improving collaboration and communication In addition to enhanced security features, cloud computing can greatly improve collaboration and communication within an organisation. The cloud gives businesses the freedom to access data from multiple locations, as well as the ability for a business platform to develop organically at the same pace as the business environment. It can support multiple users so that they can share the application without compromising speed or capacity. All of these benefits help facilitate collaboration and communication, enabling users to easily share information and data across various locations. With a cloud service provider, businesses can have all of this managed for them. [easy-tweet tweet="#Cloud can greatly improve #collaboration and #communication within an organisation" user="comparethecloud"] Cost is another benefit of using cloud providers. While on-premise support requires regular maintenance, fixes and upgrades, the only ongoing costs required with cloud vendors are subscription fees, and training and configuration, which significantly reduces the company's capital expenditure (capex). Cloud providers also deal with any redundancy requirements. In a lot of cases, these costs are overlooked but, added together, can make a very real difference. Ultimately, organisations need to look beyond the private cloud. The cloud undoubtedly offers very real business benefits; however, it can often be overwhelming to be faced with the management of cloud technology – particularly for companies with little understanding of how it works. For businesses wanting to take maximum value from their cloud investment, the advantages of outsourcing to dedicated cloud vendors far outweigh the disadvantages. By reducing initial expenditure and guaranteeing compliance and security, cloud vendors take the weight off businesses' shoulders and enable them to enjoy the benefits of the cloud without spending valuable time and resources on maintaining it. It's a win-win situation. ### Cloud Computing in 2016 - what businesses need to know Cloud is growing at a faster rate than ever before. Worldwide cloud-related spend is predicted to reach 3.8tn this year. And I can't think of many organisations that haven't already migrated some of their IT requirements to the cloud or at the very least started to seriously consider it. We know the reasons behind this surge too - big data and the rapid adoption of multiple mobile devices.
But with this faster, always-on approach to business, companies face one major challenge – what to do with cloud storage and how to manage it effectively. [easy-tweet tweet="#BigData and the rapid adoption of multiple #mobile devices is causing a surge to #Cloud"] Who and what is going in the cloud? The kind of storage organisations opt for is very much dependent upon the nature of the business, the market in which they operate, their specific needs and their regulatory requirements. For instance, Barry Runyon, a research VP for Gartner, who focuses on the healthcare sector, recently predicted that "…a significant percentage of the healthcare providers' workload will move to the cloud in the next 5-10 years." This is driven by several factors including user demand, supplying remote access to staff and patients, and reduced budgets. Cloud storage enables a far more responsive health service [caption id="attachment_33484" align="alignright" width="300"] Read more 2016 Predictions here[/caption] Like any other business, healthcare staff use cloud services for everyday activities such as email, enterprise content management and mobile device management. But it is the other critical data such as medical image archiving, medical record systems, secure texting, clinical collaboration, transcription services, legacy decommissioning, and DR archives which require compliance with the strictest storage legislation and therefore present a challenge. Legislation of this kind – HIPAA (the Health Insurance Portability and Accountability Act), for example, requires compulsory storage of records for decades, privacy and data confidentiality, secure data disposal and secure storage of digital healthcare records – is a must in healthcare the world over. The Data Protection Act in the UK regulates, among other information, personal healthcare records, requiring mandatory disposal of electronic records after the retention period, accuracy of information, logging of any changes and strict confidentiality. Cloud storage enables a far more responsive health service, but it is not without its headaches. Areas of concern At the other end of the spectrum, smaller businesses are less affected by regulation but battle with their own fears, as highlighted by the University of Greenwich's recent findings that 66% of small businesses cite data security as their main concern. Researchers reported that only one in four firms have adopted cloud technology for business systems, with even fewer (19%) making use of the cloud for financial accounting and less than a third (31%) using it for customer relationship management. [quote_box_center]Our own recent research across almost 200 IT buyers highlighted that the most common barriers to buying cloud services are: concerns around security (54%); concerns about losing control of their systems, data and infrastructure (54%); and concerns that they have already invested too much in their current storage network (42%). [/quote_box_center] In addition to regulatory and data security concerns, businesses can also be affected by flood or fire, with some online studies suggesting that businesses without a well-structured recovery plan are forced to close within 12 months of an incident in which data has been lost. With more and more businesses embracing the cloud, there is an ever-increasing amount of data sitting off-premise that needs to be continually backed up and readily available.
But the risks may be far less than those which threaten data sitting in an on-site server room with poorer measures to protect against fire, theft and flooding. With each DC home to lots of cloud providers, the availability of cloud storage has never been better - whether the business in question is a small accountancy firm, a large online retailer or a healthcare authority. How to choose a cloud provider [easy-tweet tweet="Peace of mind around #cloud technology can only be achieved through a thorough assessment of the providers available" via="no" usehashtags="no"] What are the key considerations any board should take into account when looking for a cloud storage partner? Peace of mind around cloud technology can only be achieved through a thorough assessment of the providers available. Cost and time saving are not the only key factors. We would strongly advise that businesses find a partner that can help them migrate at their own speed. Not everything needs to be migrated away from the existing infrastructure and in fact, there may be good reasons to select a ‘hybrid colocation’ solution, mixing the existing physical infrastructure with the flexibility of a virtual one. It’s for this reason that the geographic proximity of a cloud provider to the existing business is critical - 58% of IT buyers say the location of their cloud provider matters. Colocation, colocation, colocation A local cloud provider not only offers speedy access to data and servers (for major upgrades or changes), but it provides optimal connectivity speeds (low latency equals better performance) as well as the reassurance that the business will meet any legal requirements to keep data within country borders. 75% of users want to solve issues via phone support Communication and support are also crucial - 75% of users want to solve issues via phone support and 67% want to solve them via support tickets. So selecting a supplier who can meet tech support requirements, day or night, 365 days a year is a must. One final, but vital point – data – where it’s kept and stored is critical. As more and more criminal and terrorist events occur around the world, governments will get tougher on meeting regulatory measures such as the Patriot Act. Even privacy-conscious France is considering new surveillance laws. Don’t be seduced by cost savings from foreign cloud providers We would strongly advise that all businesses are clear about where their services are being used and where the data resides. Don’t be seduced by cost savings from foreign cloud providers, as it could land the company in hot water with national regulators.  Selecting a good quality, local cloud provider that offers support and access to virtual servers and data will reap dividends in the long run (as well as providing peace of mind). If they allow you to store your physical servers on-site (colocation) and plug them directly into their cloud infrastructure (hybrid cloud) then you could end up with the best of both worlds. ### Xmas https://www.youtube.com/watch?v=y3B8j_g2wIY ### Nexmo Agreement with SAP Extends Social Chat Capabilities to SAP® Cloud for Customer Nexmo, a leader in global cloud communications platforms, today announced that it has signed a partnership agreement with SAP SE that allows it to integrate Nexmo Chat App API with the SAP® Cloud for Customer solution. 
By using Nexmo Chat App API, customer service agents will now be able to engage customers on chat apps like WeChat and Viber through SAP Cloud for Customer, and service customers in real time without having to leave the SAP solution. "We are excited to partner with SAP to connect customer service agents to their customers on WeChat and other chat apps," said Chris Moore, chief revenue officer at Nexmo. "SAP is a global leader in customer engagement and commerce, and we are thrilled to extend SAP's customer engagement channels to include social chat." The partnership is the latest for Nexmo's Chat App API, which connects brands and CRM platforms to customers via their preferred chat apps. In October, Nexmo connected KLM Airlines to its customers on WeChat via KLM's CRM platform. Nexmo also announced a partnership with Marketo and Bright Pattern. Chat apps like WeChat, KakaoTalk and Viber are crucial in reaching consumers, having amassed over 1 billion monthly active users worldwide. WeChat, the dominant chat app in China, recently exceeded 650 million monthly active users. Users are spending the majority of their time within the chat apps, using them to access diverse services like e-commerce, taxis and customer service. With this integration, customer service agents using SAP Cloud for Customer can service customers on the platforms where they are spending the majority of their time. To learn more, visit the SAP Community Network. ### The eternal perplexity about hosting: HDD, SSD or SSHD For those who are confused by the subject, let us make this clear: although you use an "online service for hosting", that data still needs to be stored somewhere physically. Over the years the options have changed, so here are the three most common, to help you see which one fits you best. [easy-tweet tweet="Is HDD, SSD, or SSHD the one for you? @DanRadak helps you choose" user="comparethecloud" hashtags="cloud"] HDD (Hard Disk Drive) This is one of the oldest and most widely used forms of storage. If you are reading this on your computer, the chances are it is stored on one. A hard disk drive consists of several fast-spinning platters coated in magnetic material, above which an arm with a read/write head moves, writing data to and reading data from those platters. The advantages of this approach are that it is cheap and can store a lot of data, and its capacity has improved tremendously over the past decade. Being the most affordable option, HDDs are often the choice of home users and small business startups that need to save every penny possible. The main disadvantage of this type of storage is its mechanical parts: it is fragile and sensitive to knocks, so extreme caution is required. The main reason it remains so widely used is its low price. SSD (Solid State Drive) The complete opposite of the HDD, the SSD is everything the HDD is not: durable, faster and more expensive. Based on NAND flash memory, it can achieve transfer speeds of around 500MB per second, several times faster than an HDD. For example, booting Windows 10 on a regular HDD can take a couple of minutes before the system is fully usable; with an SSD, that time is cut to about 20 seconds. An SSD is also physically smaller, which might seem trivial until you imagine an online server estate with a capacity of several thousand terabytes.
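To put those throughput figures into perspective, here is a rough, back-of-the-envelope sketch in Python. Only the ~500MB per second SSD rate comes from the article above; the 100MB per second HDD figure is an assumed value for illustration, and real-world speeds vary widely with hardware and workload.

```python
# Rough transfer-time comparison based on the throughput figures discussed above.
# The ~500 MB/s SSD rate is quoted in the article; the ~100 MB/s HDD rate is an
# illustrative assumption, not a measured value.

DATA_SIZE_GB = 10  # hypothetical working set, e.g. a server image

throughput_mb_per_s = {
    "HDD (assumed ~100 MB/s)": 100,
    "SSD (quoted ~500 MB/s)": 500,
}

for drive, rate in throughput_mb_per_s.items():
    seconds = DATA_SIZE_GB * 1024 / rate  # GB -> MB, then divide by MB/s
    print(f"{drive}: about {seconds:.0f} s to read {DATA_SIZE_GB} GB sequentially")
```

Under these assumptions a 10GB sequential read takes roughly 100 seconds on the spinning disk and about 20 seconds on the SSD, which is broadly the same ratio as the Windows boot-time comparison above.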
The difference in the cost of cooling and maintaining a smaller space is considerable. Although it is very fast, the SSD has several disadvantages, the most important being that data can be lost if the power supply fails; in the case of online hosting, however, this is not much of a worry, as many providers have independent power supplies with backup generators. SSHD (Solid State Hybrid Drive) [easy-tweet tweet="SSHD: Has rotating disks and writers like the HDD, but stores commonly used data to NAND memory, like SSD"] This system we might simply call a "Jack of all trades" (but master of none). As its name suggests, it is a crossover between an HDD and an SSD. It has rotating platters and read/write heads like the HDD, but stores the most commonly used data in NAND memory, as an SSD does. This puts it right in the middle in terms of price, speed and reliability: still affordable and faster than an HDD, but less reliable and much slower than an SSD. It is recommended for mid-range users such as medium-sized companies and semi-professional individuals. It has one disadvantage that makes it a little difficult to use, however, and that is software: tinkering with it is not an easy task, and setting up an SSHD will usually require professional assistance. Conclusion Simply put, our final verdict would be an SSD, thanks to its superior speed and reliability. Although the price might put you off, we would strongly recommend taking it into consideration, because quality managed hosting is all about speed. Nothing more, so why not make your life easier? ### What are IT's top priorities for 2016? The end of the year is often a great time for management to reflect on the year that has passed and what should be done differently. It's easy for today's leaders to get lost in the day-to-day tasks of management, and this time of year is useful for taking a step back and assessing the wider values that their company works towards. At Nitro we think that some things are too important to be compromised simply because people are busy. In this article we'll talk about what we see as the top three priorities for businesses in 2016, with a particular focus on IT management, and how the management teams of next year can improve on their business practices. [easy-tweet tweet="Where to focus on #IT in 2016 according to @iamjohnokeeffe from @NitroHQ" user="comparethecloud" usehashtags="no"] Speed and productivity It seems like everyone's talking about productivity these days, so much so that it often seems like productivity is an end in itself, rather than a means to an end. Business leaders in 2016 would do well to remember the costs of low productivity. According to a recent report from the IDC, document challenges are robbing organisations of 21% of their overall productivity; this breaks down to an individual productivity cost of £15,000 per person per year. So where are these productivity drains typically found? It doesn't take much experience of office life to know that email attachments are often the culprit. According to the IDC, almost half (48%) of workers have emailed the wrong version of a file to a colleague or client, while 81% of those surveyed have found themselves working on the wrong version. Stopping the slow decline of employee productivity is an important priority for IT departments in 2016 Stopping the slow decline of employee productivity is an important priority for IT departments in 2016, according to research from Nitro.
68% of IT managers stated that this was one of their key goals, and with 91% of IT professionals using printing and scanning as a primary content management tool, there's clearly a lot of room for improvement. Document security According to the IDC, 90% of US companies have experienced a data leakage or loss of sensitive or confidential documents in the past. In addition, 76% of workers said that document processes can create auditor issues in the future, and over half (54%) of workers have discovered that their company is exposed to significant risk due to stored company content that is not correctly tagged and identified. According to Nitro, security topped the list of considerations for IT managers for 2016, with 47% of managers citing it as their key priority. Despite this, 77% of organisations do not provide a secure document sharing solution, and a large portion of businesses, especially SMEs, rely on archaic document management practices such as email attachments and USB drives to handle documents. These methods of sharing are scarcely more secure than the paper they replace: they can easily fall into the wrong hands and have no built-in security mechanisms to control access. [easy-tweet tweet="77% of organisations do not provide a secure document sharing solution" user="iamjohnokeeffe" hashtags="infosec"] The biggest obstacle to providing document security in the enterprise market is ease of use: people will not bother using solutions they find difficult to understand and use. Companies should focus on finding the right solutions for their needs that combine safety with ease of use - it doesn't have to be an either/or scenario. There are tools on the market enabling you to track a document through its entire lifecycle, ensuring the right people see your content at the right time. Sustainability Sustainability has been popular in people's personal lives for a while now, but in enterprise companies it's still lagging behind. However, this looks set to change in 2016. Reducing energy use and reducing waste (and therefore costs) are the two strands that corporate sustainability normally falls under. Both of these factors are directly affected by document management practices. A recent McKinsey survey placed sustainability as a top priority for CEOs, with 49% classifying it as a top three initiative and 13% classifying it as their number one initiative. So anyone in a management position should become familiar with practices to improve sustainability. Nitro has identified several trends in IT professionals' priorities for 2016: equal proportions of IT professionals rated remote accessibility and sustainability as important to their content management processes next year. The key thing to note here is that both these priorities involve reduced paper usage, and with 45% of paper printed each day ending up in the bin, it's a change sorely needed. It is important that proper document processes are implemented from the top down Printing is the path of least resistance for employees, and unless they're given easy-to-use tools enabling them to reduce their reliance on paper and to refresh poor content practices, corporate sustainability will not improve. It's therefore important that proper document processes are implemented from the top down.
Document productivity tools are able to help in getting rid of these archaic processes such as printing, scanning, signing and email attachments, and it’s essential that employees are given access to the tools they need to overcome each of these three barriers. Significant change can’t be made overnight, but as managers prepare for the new year it’s clear that these are the priorities they should be working towards. ### Why the cloud isn’t just ‘hot air’ for channel partners The global market value of office print supplies, including print hardware, currently stands at an incredible $146 billion. There is a huge opportunity for those in the print channel to increase revenue and it’s likely they could be doing more to claim their piece of the market, but how? Looking to the Cloud You’re probably familiar with the cloud computing trend—the potential, the hype and the reality. In fact, 8 out of 10 partners surveyed by Xerox are considering cloud solutions to enable closer customer relationships. [easy-tweet tweet="You’re probably familiar with #cloud computing—the potential, the hype and the reality" user="comparethecloud" usehashtags="no"] Taking a cloud-based approach can revolutionise customer contact. For example, in the print industry the cloud can now be used for managing and automating supplies ordering and selling. While this might sound like a high-tech solution to a low-tech sale, think of it in terms of smartphones. Smartphones not only disrupted the cell phone business but also the photography and mobile computing industries. As it turns out, there’s room to rethink the print supplies order/delivery system and apply today’s emergent technology. Simplifying Supplies Orders [caption id="attachment_33569" align="alignright" width="300"] Read about Lenovo's Rent and Grow Scheme[/caption] What do customers think about ordering supplies? In a word – stressful! According to a September 2014 survey by Staples, small business owners spending more time managing printer consumables are “most stressed”. Matching part numbers with customer devices is a headache. According to Gap Intelligence, there are more than 60,000 different supplies part numbers with device associations in the market, ranging from laser printers to inkjet devices. With a cloud-based approach, print supplies tracking and ordering can be highly intelligent, automated and easier for partners and their customers. By monitoring the customer’s office, the system tracks supplies usage and remembers the part numbers across every brand. Software in the cloud knows when to trigger an alert, search multiple distributors for best ordering sources, place the order for all print manufacturers, and may even link to the back office ordering system. Plus, it can directly ship the order to a customer location or the location of your choice. The customer is always taken care of and best of all, there’s virtually no opportunity for a competitor to make a play. Cloud-based software systems run 24/7 Improving Sales Efficiency Downtime, among other things, can mean wasted time, lost revenue and heightened frustration. Cloud-based software systems run 24/7, aggregating the data from all devices, regardless of the manufacturer, at all customer locations. So there’s no concern about downtime. You and your reps can access real-time customer ordering and supplies data anytime, day or night. Imagine what that kind of currency and transparency of information could do for you. 
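As an illustration of the threshold-triggered reorder logic described above, here is a minimal Python sketch. The device names, part numbers and the 20% threshold are hypothetical examples for illustration only, not part of any particular vendor's system.

```python
# Minimal sketch of threshold-triggered supplies reordering, as described above.
# All device names, part numbers and thresholds are hypothetical examples.

from dataclasses import dataclass

@dataclass
class SupplyLevel:
    device: str             # monitored printer or MFP
    part_number: str        # consumable part number
    percent_remaining: int  # latest reading reported by the device

REORDER_THRESHOLD = 20  # reorder when a consumable drops below 20%

def check_and_reorder(levels):
    """Return the part numbers that should be reordered and print an alert for each."""
    orders = []
    for level in levels:
        if level.percent_remaining < REORDER_THRESHOLD:
            orders.append(level.part_number)
            print(f"Alert: {level.device} is low on {level.part_number} "
                  f"({level.percent_remaining}% left) - order raised")
    return orders

# Example run with made-up readings collected from monitored devices
readings = [
    SupplyLevel("Office-Printer-1", "TN-3480", 12),
    SupplyLevel("Office-Printer-2", "CE505A", 64),
]
check_and_reorder(readings)
```

In a real service the readings would arrive from device monitoring agents and the order step would call a distributor's ordering system; the sketch only shows the decision logic.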
Perfectly timed sales calls, insightful proposals to look-alike accounts and highly accurate forecasting let you pass on savings in time and money to your customers. By putting the process on autopilot, fewer resources are required for analysis, negotiation, ordering and delivery of supplies. Fewer touches mean fewer expenses. Maintaining Customer Loyalty [easy-tweet tweet="In today’s business world, a decision to buy can happen before a seller is ever engaged" user="comparethecloud" hashtags="cloud"] We all know that in today’s business world, a decision to buy can happen before a seller is ever engaged. With pricing and delivery information at buyers’ fingertips, differentiation of service is paramount. A modern, cloud-based approach to selling supplies offers just that. Who else has this level of intelligence and sophistication around supplies and service? As a technology provider, you should naturally be using cutting-edge technology to make life easier for your customers. And once a customer is signed up to a cloud based supplies service (and it only takes a few minutes), there’s very little reason for that customer to ever go elsewhere. ### The impact of the CSR on public sector technology Yet again, George Osborne is revealed as a master of the art of expectations Management. Following months of hints of politically painful cuts, we’d been expecting the Spending Review to create a sense of relief via last-minute reprieves, political and accounting fudges. In fact, the picture presented looked more like a series of rabbits pulled from Osborne’s hat rather than the U-turns described by the Opposition. Three leading analysts from Kable - Jessica Figueras, Andrena Logue and Alan Mo – have combined efforts to provide their views on the impact of the CSR on public sector technology.  [easy-tweet tweet="Three analysts from Kable comment on how the #CSR will affect public sector #technology" user="comparethecloud"] Central Government Thanks to more generous borrowing forecasts from OBR, and by quietly pushing back deficit reduction another year, Osborne was able to reduce the departmental resource savings target from £20bn to £12bn. Now, £9.5bn of savings will be “reinvested in the government’s priorities” rather than banked. This may explain the eye-catching promise to invest £1.8bn in digital transformation, much of which will probably be achieved by re-badging existing ICT programmes as ‘digital enabling’ rather than freeing up new money. technology and digital are key pillars of this spending review MoJ, Home Office and HMRC are where the most obvious gains from digitisation are to be found, and there was specific mention of digitising criminal justice in order to speed the closure of courts, and more money for border technology, including passports and immigration, as well as digital tax. All of these programmes are already underway, of course, though it’s good to know there is continuing support. £450m for GDS was the biggest surprise, but an almost total lack of detail – and silence from Aviation House – suggests that this sum would be over 5 years, representing £90m a year – a 55% rise from its current £58m budget, which is still very generous. GaaP got a mention, but the only new common platform promised was GOV.UK Pay. 
We’d suspect the extra cash will be spent on GOV.UK Verify, which will need to be scaled up fast to support new digital services across government, rather than being used to fund GDS-led development projects; but we won’t know for sure until more detail is available. What isn’t yet clear is how this £450m is made up, and what GDS’ implementation plan is to deliver GaaP, GOV.UK Verify and GOV.UK Pay. Watch this space. [easy-tweet tweet="The promise to invest £1.8bn in #digital transformation will probably be achieved by re-badging existing #ICT programmes" via="no" usehashtags="no"] Health and the NHS From an NHS perspective the CSR provided a better settlement for the NHS, and NHSE CEO Simon Stevens takes the credit for all his lobbying. The frontloading of £3.8bn in 2016/17 should be well received by the majority, to potentially stave off the winter of discontent that has been looming. On the tech side, there were few details on new money, which absolutely has to come through to support the upcoming digital maturity index and push interoperability, never mind day-to-day spend on core systems, which is lagging nationally. All of this has to be shored up to support the National Information Board’s 2020 digital and patient centric blueprint which plans to provide access to your health records in plain English on your smartphone. The £10m promised to support the ‘Healthcare Innovation Test Bed’ programme will better position healthcare in a combined R&D and digital context, which feeds into economic hub strategies that many councils have developed. Local Government  The ability to raise council tax by 2% to go towards adult social care may be welcomed by some, but it will have an unbalanced and unequal impact across the country. Localities with lower council tax receipts are often those with a higher proportion of the population eligible for social care; while many of the councils that have higher tax receipts also have a higher proportion of self-funders.  [easy-tweet tweet="There are real questions over how local #digital best practice can be shared" user="comparethecloud"] So local authorities will not be able to rely on the 2%, and local areas will need to double up efforts towards health and care integration, and early intervention. With this in mind, the extra £1.5bn for the Better Care Fund (BCF) is a small positive, and could open up opportunities for technology use. It is clear that technology can play an important enabling role, and most existing BCF proposals confirm a growing call for single integrated customer records, with locally configurable workflow helping drive process integration. Although GDS gained £450m to drive digital transformation in Whitehall, local government seems to have missed out completely, and with the end of the Local Government Digital programme expected in March, there are real questions over how local digital best practice can be shared. The spending review was surprising for everyone in terms of budget allocation. Along with central government, local government and health, there was money earmarked for education, police and the armed services. Concerns will certainly be raised over the large sums quoted for central government versus these other sectors and questions need to be asked over where exactly the money is coming from - the devil is necessarily in the detail - but one thing is for sure, technology and digital are key pillars of this spending review. 
### The STAC New York 2015 with David Weber Neil Cattermull interviews David Weber, Sales Director and CTO for Lenovo, at The STAC in New York in December 2015. David Weber discusses Lenovo's current placement in the market, and where they're heading in the future. ### Cyber London Announces Second Cohort Cyber London, Europe’s first dedicated cyber security start-up accelerator and incubator, today announced the eight start-ups participating in Cyber London’s second programme. [quote_box_center]The successful candidates, selected from a shortlist of outstanding international applicants, were announced at a private event in Cyber London’s offices in West London.  The teams, who start the programme this week, are:  Aves Netsec, Finland. An adaptive cyber deception system for efficiently countering the advanced attacker. BitNinja, Hungary. The first integrated server defense network powered by a machine-learning system. Fabric, UK. Secure and private mobile communications. Mazor, Israel. Solution for detection of threats on network and security components. Simudyne, UK. Identifies and reduces the risk from cyber attacks. Torsion InfoSec, UK. Delivering accurate rules and classifications-driven security for SharePoint and Office 365: Simple, Precise and Robust. UkkoBox, UK and Brazil. An easy tool that safely encrypts and spreads user files to existing cloud providers worldwide. Verity, South Africa. Enables organisations to prevent document forgery through digital document certification. [/quote_box_center] Kirsten Connell, the new MD of Cyber London, said: “We are thrilled to welcome such a diverse and impressive cohort into the second Cyber London programme. The breadth of talent in the early stage cyber security sector continues to fill us and our partners with confidence in Cyber London’s value to both entrepreneurs and the wider economy. I look forward to working with the teams over the coming months and helping them to go from strength to strength.” Cyber London’s accelerator programme consists of a 14-week programme where teams with innovative business ideas in the field of cyber security are given access to professional training and guidance from an accomplished network of mentors and investors. At the end of the programme the companies present their businesses to potential customers, investors and partners at an in-house Demo Day. In addition to training, mentorship and fully-furnished office space, the teams receive a £15,000 investment in return for 3 per cent equity. Richard Wilding, Director of New Ventures at BAE Applied Intelligence, said: “Nurturing start-ups with Cyber London helps ensure a vibrant supply chain full of  technologies that can keep the UK and Europe at the forefront of this industry. BAE Systems looks forward to supporting the development of the exciting young companies involved in Cyber London’s second cohort." Cyber London is fostering a thriving community of cyber security companies in the heart of London.  Of the eight companies graduating from Cyber London’s first programme, which took place between April and July 2015, all but one has been fast-tracked onto the next stage of their growth path. The companies on Cyber London’s first programme were: Ripjar SQR Systems Intruder Mentat Innovations Sphere AimBrain Ruuta Cyberlytic Andrius Sutas, co-founder of AimBrain, said: "Cyber London helped us immensely by connecting us with highly relevant people in our industry, as well as the governmental and security sectors. 
We feel very grateful for the time that the team and MD spent mentoring us and are super proud to be part of CyLon's alumni network."   ### Cloud Security – How to Protect Your Business With a wave of recent cyber-attacks on big-name companies, staying safe online has become a major concern for businesses of all sizes. For many, the cloud is a popular business solution, offering cost savings and increased business agility. However, with so much now stored in the cloud, protecting your business online isn't simply a task for IT managers; it's essential knowledge for all cloud users. [easy-tweet tweet="Protect your business: use complex passwords, #encrypt files, and train staff say Ben Simpson"] Use Complex Passwords Using the same password is essentially giving a hacker free run Passwords and security questions are often the gateway to our files and how the majority of our content is protected in the cloud. Ensuring your passwords meet the most complex standards is one of the key ways to protect your business. Make sure you update your passwords periodically and never use the same password across all platforms. Using the same password is essentially giving a hacker free run, as they will often try out passwords on all devices. You'll often find that websites and programs now come with built-in password strength analysers which will tell you how strong your password is. As a guide, keep your password to a minimum of 8 characters with a mix of lower- and upper-case letters, and incorporate symbols and numbers where possible to keep it as complex as possible. Use 3 random words and a selection of numbers, then combine different variations to create a unique set of passwords you can instantly remember. Try reversing words and swapping out letters for symbols, and make sure your words don't start with the same letter. Encrypt Files Some services and software packages automatically encrypt files for you; however, you may wish to encrypt them yourself if you don't have access to this service. Encryption can help in your quest to fight off cybercrime and keep important documents safe from hackers. Encryption scrambles the data on your files Encryption scrambles the data in your files, which makes them unreadable to outside sources and protects data from being stolen and interpreted on other devices. You can encrypt your hard drive for free fairly easily, and whilst you may think there is nothing of worth in the cloud or on your computer, files can easily end up in the wrong hands.  If you use Windows, then installing BitLocker and selecting Trusted Platform Module alongside a PIN will ensure your drive encryption is of the highest security possible. If your computer is older than Windows 7, you will only be able to use the USB authentication method. Train Staff Staff may be familiar with ways to stay safe online, but educating them on cloud security can help to protect your business from unwanted threats. We're all used to protecting ourselves online; however, some take a more relaxed approach in the workplace. [easy-tweet tweet="Mimecast found that 79% of employees use their personal email accounts for work" user="comparethecloud" hashtags="cybersec"] With more employers encouraging staff to bring in their own devices to work from, it's essential to ensure any devices they are using, especially when accessing the cloud, are up to date with the latest anti-virus protection software.
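Returning to the password advice above, here is a minimal Python sketch of the "three random words plus numbers and symbols" approach, using the standard library's secrets module for cryptographically secure choices. The short word list is a stand-in; in practice you would draw words from a much larger dictionary.

```python
# Minimal sketch of the "three random words plus numbers and symbols" advice above.
# The word list is a tiny stand-in for illustration; use a large dictionary in practice.

import secrets

WORDS = ["harbour", "crimson", "ledger", "orbit", "thistle", "gable", "mosaic", "pepper"]
SYMBOLS = "!$%&*?#"

def generate_passphrase(word_count: int = 3) -> str:
    words = [secrets.choice(WORDS) for _ in range(word_count)]
    number = secrets.randbelow(90) + 10        # two-digit number, 10-99
    symbol = secrets.choice(SYMBOLS)
    idx = secrets.randbelow(word_count)
    words[idx] = words[idx].capitalize()       # mix upper and lower case
    return symbol.join(words) + f"{number}{symbol}"

print(generate_passphrase())  # e.g. "crimson!Orbit!ledger42!"
```

The result is easy to remember as three words yet long enough to resist simple guessing, which is the point the article is making about complexity without sacrificing usability.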
A recent report by Mimecast, a security firm, found that 79% of employees use their personal email accounts for work. What you may not realise is that the personal information you share can end up being automatically backed up, mirrored and archived on systems and servers outside of your control, even after you've left the company. Diarise regular meetings with staff to keep them up to date with the latest security solutions. Ensure staff use complex passwords on any devices they use or files they are working on, and make sure they lock devices when away from their work space. Use a Verification Process If you're concerned that accessing files is too easy on the cloud, then deploy a verification process across your devices to add an extra layer of protection. If someone can get past your password then they will soon have access to your data, so adding an extra step will help add another barrier. Enabling a two-step verification process will require you to input two different pieces of information, for example a password and the answer to a security question, before you can access files. This could also be a code that's generated and sent to another device, which you may need to input before access is granted. Whilst this process can be time consuming, it can save you the hassle of having to clean up a data breach later down the line.  Finally, always keep a back-up of your files, so that if anything does go missing you haven't lost all that hard work. ### Snappy 2016 Predictions from Infosys Compare the Cloud spoke with Samson David, SVP, global head cloud, infrastructure services & security from Infosys. Samson has given us his predictions for 2016, including that on-premise will be left behind in 2015. Samson David is the Senior Vice President and the Global Head for Cloud, Infrastructure and Security at Infosys. Samson works closely with global 2000 clients to help define and build their cloud and infrastructure strategies and run end-to-end IT operations. He is also on the board of the ODCA (Open Data Centre Alliance). Samson is what we at CTC would class as a cloud expert, and he is more than qualified to share his thoughts on the future.  [easy-tweet tweet="Samson David speaks to @Comparethecloud about his predictions for 2016" hashtags="cloud, infosec"] Cloud security will increase in scale, and decrease in complexity. In 2016 we'll see cloud security evolve into simpler, virtualised controls and solutions that will have embedded security processes to help map current IT systems. Heavy, bolted-on protective layers that have difficulty scaling in the cloud will stay behind, and next year will bring lighter, scalable cloud security solutions. The significant challenge of making hybrid cloud a reality. Without the heavy uplift of people, process and technology across architecture, governance and service management, a hybrid cloud ecosystem will only bring a fragmented IT process within an enterprise. Businesses will have to make a concentrated effort on integration, policy standardisation, operations, cross-provider provisioning and end-user empowerment to implement a successful hybrid strategy. Backup and recovery will become synonymous with security. With the explosive growth of structured and unstructured data, improving backup and recovery time will be a big hurdle for the enterprise. Vendors will rely on automated tiered solutions and data de-duplication to address the challenges of heterogeneity of technology.
Encrypted data backups and agentless cloud-based replication will become the norm for data security. improving backup and recovery time will be a big hurdle for the enterprise DevOps will be a starting player for the enterprise team: Next year’s booming market demand for DevOps tools will eliminate siloed setups, narrow thinking and resistance to change. 2016 will see DevOps tools become an integral component to the new ways of working and change the organisational culture within enterprises. With a focus on people, process and tools, companies will reorganise talent and include DevOps in collaborative work models. [easy-tweet tweet="Applications will need to be cloud-ready and platform independent in 2016" user="comparethecloud" hashtags="cloud"] Applications will break free from on-prem restrictions: As all data inevitably moves to the cloud, applications will need to be cloud-ready and platform independent in 2016. On-premise will stay behind in 2015, as businesses adopt flexible and scalable solutions that are supported through multiple application technology platforms. ### Investing in the future, tech for 2016 With a new year upon us and technology continuing to change the way enterprises are run, now is the time to really think about where your company needs to invest in order to ensure that it remains competitive. Business automation software was once an extra competitive advantage, now it’s an absolute necessity for any organisation that’s looking to remain at the forefront of the industry, allowing it to devote more time to high-level strategy. Likewise, automation used to be the preserve of IT professionals; now staff across all departments must be able to manage complex automated workflows. [easy-tweet tweet="In 2016, it’s expected that all companies will have some degree of #automation software at their disposal" via="no" usehashtags="no"] In 2016, it’s expected that all companies will have some degree of automation software at their disposal. Nevertheless, how it is actually implemented is really the key differentiator – and consolidating under one central solution will remain the best approach. Driven by market demand, technological innovation will no doubt continue to evolve at an incredible rate, and it’s clear that the Internet of Things (IoT) and data in the cloud are here to stay. Those that can harness its possibilities in tandem with Big Data and leverage the insights it engenders will have a powerful edge. IOT and data in the cloud are here to stay As highlighted by the recent Deutsche Bank-Hewlett Packard deal, large enterprises will increasingly implement private cloud on demand to deliver flexible capacity, in addition to core capabilities. Part of the bank's long-standing restructuring efforts have included moving to a more flexible digital platform as pressure increases across the financial sector to cut costs, embrace web-based client platforms, and meet new competition from outside of traditional banking. [easy-tweet tweet="New market entrants will increasingly offer #cloud solutions with minimum risk for the customer" via="no" usehashtags="no"] Increasingly, new cloud vendors entering the market will seek more innovative technology-funding services to reduce capital and operational costs. In this way, companies will be able to get the benefits of a solution without investing in any hardware or license fees. 
New market entrants will increasingly offer solutions with minimum risk for the customer – in other words, companies can subscribe to the service and use it as they like straight away, since everything is already made available for use. However, if they don't like it, then they can unsubscribe at any time. 2016 will also see cloud services continue to evolve towards utility-based offerings. Businesses and private individuals will become increasingly comfortable with the concept of the Internet of Things, of no longer having physical data, and of course the cloud, touching everything from an individual's smart house service and personal storage, to SMEs embracing a full cloud platform for data, office products, voice and video. It's incredible to think that some are predicting the number of Internet-connected things will reach or even exceed 50 billion by 2020. Finally, and probably most importantly, in previous years we've seen companies of all sizes embracing many different technologies and opening their doors to a wide spectrum of software options to tackle their problems. 2016 will see a shift towards reducing the number of platforms and apps, as well as attempts to consolidate data in one central source. This will be done by removing potentially similar software providers from the billing list or moving to an altogether new and more functional platform to ensure the business uses one pivotal solution and consolidated data source. 2016 will see a shift towards reducing the number of platforms and apps ### Cloud28+ Launches Enterprise Cloud Catalogue with 680 Services Enterprise customers across EMEA can now accelerate to hybrid IT via central cloud service catalogue Hewlett Packard Enterprise (HPE) has announced the general availability of the Cloud28+ cloud service catalogue. The catalogue enables European enterprises to easily search for cloud services to meet their specific business requirements. Cloud28+ customers can also sort by the specific data centre location and provider by which a cloud service is operated, thus ensuring compliance with local laws and business requirements. Cloud28+ is a community of commercial and public sector organisations aimed at expanding cloud service adoption across the European region. A centralised enterprise app store, the Cloud28+ catalogue already features 680 cloud services from 150 Cloud28+ members across Infrastructure-as-a-Service, Platform-as-a-Service and Software-as-a-Service offerings. To date, more than 1000 end user organizations have pre-registered to use the catalogue. The new Cloud28+ catalogue provides a broad range of benefits for European customers: easy and transparent access to a huge number of enterprise cloud services, enabling customers to compare offerings based on functional and non-functional criteria, including price, service level agreements and certification levels; the ability to discover cloud-native ISVs with whom they can partner to embrace the Idea Economy; and the expansion of their go-to-market capabilities by publishing their own services in the catalogue, thus turning their IT teams into revenue-generating engines. [quote_box_center] "We created Zettabox to help European companies build sustainable futures in the cloud by ensuring compliance with increasingly strict data protection laws in the EU," says James Kinsella, Zettabox founder. "Cloud28+ was a logical community for Zettabox to join, as its mission is to build a cohesive and collaborative cloud environment, for Europeans by Europeans.
We are proud to be a part of this thriving and growing pan-European community.” [/quote_box_center] The Cloud28+ technology framework is based on HPE Helion OpenStack, ensuring portability of cloud services and eliminating vendor lock-in for the end customer. [quote_box_center] “The launch of the Cloud28+ catalogue is an important milestone on the journey to a European Digital Single Market,” says Xavier Poisson, Hybrid IT vice president for Europe, Middle East and Africa. “Cloud services are fundamental for the growth of the digital economy, and now customers across the EU’s 28 member states and beyond have a single source of hundreds of cloud services that enable them to transform to a hybrid infrastructure.”[/quote_box_center] Cloud28+ catalogue services currently include a wide variety of data management, end-user productivity, business app, digital marketing, software development, and security offerings, amongst many others. Additional categories and vertical solutions will be added in the coming months. [embed]https://www.youtube.com/watch?v=t5_yALuq7N4[/embed] ### Cloud the way you like it: What are you having for dinner tonight? Or what cloud buyers can learn from the convenience store revolution in retail... Where once families used to go to out of town hypermarkets to buy lots of food in one weekly shop and potentially waste some of it, the trend now is to use convenience stores and shop intelligently. They now buy their meal for that evening from the convenience store, shop in a supermarket for non-perishable foods and toiletries, and then pop into the discounters Aldi and Lidl for bargains. [easy-tweet tweet="Hypermarkets and #cloud have more in common than you think " user="veberhost @comparethecloud" usehashtags="no"] For the major food retailers (such as Tesco, Asda, Sainsburys and Morrisons in the UK) this change to intelligent shopping has so far been an even greater disrupter than the move to internet shopping and delivery. It has forced many of the major food retailers (who were architects of their own doom in competing with rival convenience store strategies) to rationalise their estates. Several have opted to close a number of their out of town hypermarkets, along with other underperforming outlets; as well as putting expansion plans on hold and their land banks up for sale. Only a few years back this would have been unthinkable. The parallels here with cloud and data centres are revealing Not long ago pundits were predicting that the hyperscale public cloud vendors would sweep all before them, but a very different market dynamic is emerging as cloud clients learn to shop intelligently for their hybrid cloud needs. The Hypermarket – public cloud: almost all firms now use the hyperscale public cloud vendors in some way or another – whether via strategic choice or shadow IT. When you need a lot of cloud for your money and the application or business operation is not mission critical and doesn’t need extensive adaptation to meet requirements, then it just makes sense. Each hypermarket has its own characteristics: I like the value you get at Asda or Morrisons, the range you get at Tesco or Sainsburys, and the quality that you get at Waitrose. Likewise in cloud. For example: AWS has a vast array of offerings and even more on offer via its marketplace from its ecosystem of partners, but it doesn’t have a private cloud offering or the best integration with the main private cloud standard - OpenStack. 
Microsoft has seamless integration between .NET environments and Azure with attractive cross-licensing between the two. It is also the public cloud of choice for major OpenStack players such as Dell and HPE. IBM and Oracle offer OpenStack-based hybrid environments: via SoftLayer in IBM's case, and via a bit of a promise from Oracle that it will be big and price competitive in IaaS any time now!?! [easy-tweet tweet="Almost all firms now use the #hyperscale public #cloud vendors in some way or another" user="veberhost"] The Discounters – burst out capacity and long term storage: With Amazon Glacier at $0.007 per GB per year (yes that's less than 1 cent) there's little point in looking beyond the likes of AWS or maybe Azure for burst out capacity and long term storage. I mean this is a commoditised offering and who's going to beat them on price? The Convenience Store – private cloud and bespoke services: The reason that much of the focus is now on smaller, more-focused local MSPs with niche capabilities and expertise (much like the focus in retail is on convenience stores) is that a one-size-fits-all approach is never going to work for everyone – or indeed for anyone with any kind of specialist need. For decades SAP tried to sell a standardised suite of applications to large organisations, seeking to persuade them that it could easily be tailored to meet their needs. However, organisations found that they were all too different and that SAP implementations often became a massive drain on skills and resources - and once implemented, the suite was a straitjacket restricting innovation. Cloud promised to free organisations from such restrictions, but for this to work they need bespoke services that can best serve their most critical requirements. This is where the local MSPs come in. They provide hybrid integration with the big boys while bringing their niche expertise and local focus to bear on what matters most for the client. Do you do a weekly shop and plan your evening meals a week in advance, or do you prefer the convenience and freedom of deciding what you'd like for dinner each night? Maybe you need to visit your local MSP and start enjoying the bespoke services that it can offer. If you don't have a local MSP then please contact Veber and we'd be happy to help, contactus@veber.co.uk. ### How to Rent-and-Grow with Lenovo Like most technology providers today, Lenovo have embraced the MSP marketplace to benefit from a more flexible approach to changing hardware requirements. Despite being better known for their end-consumer hardware, Lenovo are not just a hardware provider anymore (and haven't been for some time). There have been a number of interesting developments, namely their Rent and Grow scheme, at which we will take a closer look. The idea of Rent and Grow schemes isn't something new – in fact many industries have used them successfully for years: the retail industry with mobile phone tariffs or 'buy now, pay later' on household goods; the entertainment industry with the emergence of Netflix and Spotify; and even healthcare with dental plans. If anything, you could argue that the IT industry has been a slow adopter, but we are now beginning to see the shift to subscription models.
[easy-tweet tweet="The #IT industry has been a slow adopter, but we are now beginning to see the shift to subscription models" via="no" usehashtags="no"] In my opinion, the over-riding benefit of these models is the ability for companies to be able to map out their Customer Lifetime Value (and to be able to increase that value as customers grow their packages). Understanding the value of your customer is crucial to forecasting and budgeting, managing cash flow and changing the attitudes of how we service that customer. You could make a lump sum off one customer one month, but think of the value you could achieve by maintaining a relationship with that customer over a number of years, earning steady revenue. Suddenly, the attitude of how sales, marketing, and accounts view that customer is extremely different. The idea of Rent and Grow schemes isn’t something new With this in mind, Lenovo’s model offers a dedicated track for cloud and managed service providers by allowing them to pay as they provision more Virtual Machines. This ‘rental model’ minimises burn rates whilst also giving you the option to return the kit after 12 months or upgrade to a replacement. [easy-tweet tweet="Lenovo’s model offers a dedicated track for #cloud & #MSP providers by allowing them to pay as they provision more VMs"] Although there are a number of organisations offering similar packages in the market, Lenovo have built in additional benefits, making the model more tailored to the providers’ needs, and essentially more desirable than other offerings in the market. These additional benefits include: Co-marketing funding in partnership with Intel – this will support providers to generate new prospects Trade-in value for ageing equipment – this is particularly attractive for those operating in high-growth environments, where hardware will frequently need a refreshes 120 days deferred payments – having capped quarterly payments reduces your risk, helps to manage cash flow and gives you the choice of a contract end date if you choose to return the equipment Leveraging one of the most reputable tech brands – incorporating you into Lenovo’s ecosystem, and providing the added benefit of technical support courses. As with Cloud environments today, it is difficult to gauge which tech, services or even client acquiring will be achieved, but with flexible options that offer all the above, the risk of capital is vastly minimised. We now see businesses accepting joint risk with their partners. Being able to leverage an ecosystem of trusted business partners on the platform of the Rent and Grow scheme, coupled with an amazing tech offering, has put Lenovo in good stead for 2016. ### US-EU Safe Harbor: The Five C's Safe Harbor is a decade old privacy pact that underpinned cross-Atlantic data transfers for sensitive information. The European Court of Justice (ECJ) revoked the agreement in October 2015 citing concerns over US surveillance programs. The suspension created a dilemma for the 4,000+ companies that used Safe Harbor to comply with European data privacy principles. As the EU and US re-broker privacy definitions, business are left with temporary and cumbersome stop-gaps, such as Model Clauses, that don’t provide long term solutions. [easy-tweet tweet="A framework for global #data transfer has collapsed, what's next?" 
user="SVPR_Sweetheart" usehashtags="safeharbor"] CipherCloud has launched a set of Safe Harbor resources to educate organisations on what the suspension means for businesses and how they can build a plan to keep data flowing while Brussels and DC negotiate the new rules. This infographic overviews the revocation of Safe Harbor, its impact and proactive measures companies can take to satisfy privacy regulations. ### To Cloud or Not To Cloud: SME Guide to Cloud Computing As cloud computing continues to infiltrate public consciousness, small businesses across the UK are recognising the benefits it brings, particularly in regards to cost savings, business agility and security. Predictably, the conversation now is not about whether to migrate applications to the cloud, but rather, when and what the business should move to the cloud. For small businesses in particular, this can be a difficult predicament, with the various models falling under the all-encompassing ‘cloud’ label, this can be a difficult landscape for non-tech savvy business owners or managers to navigate. [easy-tweet tweet="The question of #cloud is no longer if - but when - says John Morris of @UK2Group" user="comparethecloud" usehashtags="no"] Much like the crystallised water variety, man-made clouds come in many varieties and with the different cloud models on offer in the market, each with their own advantages and disadvantages, it is clear that the cloud landscape can be daunting one for small businesses. The following is a guide for businesses that have outgrown on-premise servers and are taking their first step into cloud computing. Getting your head around the cloud ‘Cloud computing’ has evolved to encompass various methods of computing, but there are essentially three prevailing models within which the majority of businesses operate – public, private and hybrid cloud. In a public cloud the provider offers computing resources over the internet in a virtualised environment. Public cloud makes use of shared resources and could also be referred to as shared hosting. In public clouds, the company shares the server with other companies. This can have some security implications for businesses dealing with sensitive client data and as a result, public clouds are most suited for sharing non-business critical or non-sensitive information such as document sharing or collaboration tools. Migration of business data to a public cloud provider can significantly reduce business IT costs, whilst offering greater business agility and scalability. In public clouds, the company shares the server with other companies The majority of small businesses will take their first steps towards cloud computing by migrating this non-mission critical data to a public cloud environment. These clouds are most suited to small to medium sized businesses that may not need or be able to afford the additional costs associated with private clouds. Private cloud offers additional security, with the infrastructure operated solely for one named company – the business will not share its IT infrastructure with any other companies and access can be behind a company’s firewall via a virtual private network (VPN). This model is sometimes referred to as a bare metal dedicated server, with the additional security benefits making it the ideal destination for sensitive business data. 
Private clouds are most useful for organisations that deal with private information or have regulatory obligations to keep their data under their control, whilst requiring the computing benefits afforded by the public cloud, such as business agility and scalability. Private clouds are most useful for organisations dealing with private information Due to the additional benefits of the private cloud, costs will be higher for businesses employing this model. Data such as bank account details, private customer details and tax information should be migrated to a private cloud environment rather than public cloud. As can be guessed, hybrid cloud is a combination of public and private cloud, offering the cost savings associated with public cloud alongside the additional security of a private cloud, but it can also include on-premise infrastructure. Companies utilising hybrid clouds manage some resources in-house while others are provided externally. These clouds remain separate entities, although working together to fulfil the business’ computing needs. Businesses can migrate the sensitive information to a private cloud environment or keep this data on-premise, whilst more basic applications can be hosted in a public cloud environment, but these must communicate seamlessly to ensure the end-user is not affected. Are you ready for cloud? [easy-tweet tweet="As any small business grows, it will inevitably outgrow its on-premise servers" user="UK2Group" hashtags="cloud"] As any small business grows, it will inevitably outgrow its on-premise servers as these can be expensive to maintain, but how does the business owner or manager know when the time is right to migrate applications to the cloud? In the majority of cases, the migration to cloud computing will be driven by the three factors above: cost, flexibility and security. These are the issues that will determine which cloud model the business selects and what applications to migrate to the cloud. Whilst ultimately the decision on cloud migration will depend largely on the business and the industry it operates in, business owners and managers should not feel daunted by the prospect of cloud migration. Preparing for cloud migration today could help avoid stormy seas tomorrow. ### From CIO to Technology Hunter: Turning innovative thought into reality in the enterprise The modern CIO is responsible for a vast number of business decisions. Many of these require the freedom to think strategically about how technology can enable the business to grow and remain agile. Historically, it's been the CIO that’s steered the business through new challenges – such as the introduction of the internet or, more recently, catering to an increasingly mobile workforce. The CIO is left with a choice: choose the familiarity of well-known technology, or make educated decisions about potential risks posed by change. [easy-tweet tweet="With great power comes great responsibility, and the #CIO must lead the charge to create space for #innovation" via="no" usehashtags="no"] In today’s enterprise environments, the onset of the cloud together with an increasing number of SaaS applications has meant that it makes sense for businesses to rely on third parties to manage their data, or undertake essential maintenance functions of their infrastructure. This expanding IT eco-system can sometimes leave the CIO feeling powerless to make strategic decisions - instead they are left feeling pressured to make the most ‘cost effective’ choice in order to tick the corporate box.
This outsourcing of functions and processes alongside the growth of cloud computing has led to a change in authority over data. CIOs are adapting to a new style of leadership. Gone are the days of holding the keys to a physical server. The IT world is changing and with that comes many opportunities for CIOs willing to embrace their new role. As part of our work, we’re in touch with thousands of CIOs in multinational companies of all sizes, in a range of industries. Over the years we’ve witnessed changes in the role of the CIO. Originally, the CIO was responsible for automation, as well as management and operation of the company’s processes. Today, this is a prerequisite, but not enough to survive in the new world. [easy-tweet tweet="Today’s #CIOs must be focused on developing technology-empowered business strategies" user="comparethecloud" usehashtags="no"] The CIO as “Head Technology Hunter” Today’s CIOs must be focused on developing technology-empowered business strategies. Even if the company isn’t an ‘Early Adapter’, the CIO must be at the forefront of technology and must be the one who brings the company to that point. They are responsible for business outcomes no less than any other member of the board. Leading a business is no longer about focusing on operational systems, it is about analysing and identifying market trends, generating knowledge and insights, forecasting causes and effects of business activities and finding and implementing systems that will proactively advance the company. For example, the CIO can initiate a dedicated system that will identify and issue a warning about dissatisfied customers that might leave in the future, or a system that monitors changes in customer behaviour. It’s the CIO’s job to be the professional umbrella Unfortunately, this revolution hasn’t happened everywhere. There are many obstacles preventing CIOs from implementing these principles. Often it is because they are busy putting out fires and taking care of ongoing maintenance of information systems instead of hunting for innovations. Sometimes the added value the CIO can give the company has to be pointed out. It’s the CIO’s job to be the professional umbrella, the person responsible for promoting everything linked to technology and in possession of a broad vision of company needs and appropriate solutions. Changing the internal perception of the CIO The CIO’s place in the organisation is also changing. The role can be perceived as an IT Strategy Leader. VPs of sales or marketing are no longer the only ones responsible for the company’s results. The IT manager should be the CEO’s aide-de-camp in expanding the company. The CIO needs to identify trends inside and outside the company, to define new initiatives and to discover any barriers or problems involving the company’s automation and the advanced capabilities that information systems enable. [easy-tweet tweet="The #CIO should harness technology knowledge from other key players in the organisation" via="no" usehashtags="no"] The CIO should harness technology knowledge from other key players in the organisation. There is no reason why the initiative to buy a new application shouldn’t come from any of the company’s departments, especially since the internal customer’s understanding of technology is so high today. 
Non-IT managers, such as marketing and HR, are always on the lookout for new technological tools to spend their time in the most efficient way – and so the CIO must utilise this knowledge and collaborate with other executives to select new technologies. However, only the IT manager has the broad, professional vision necessary to decide. If systems are incompatible, reporting becomes complex or almost impossible. This is where leadership on the part of the CIO comes in. [caption id="attachment_32937" align="alignright" width="400"] Read more on CIOs here[/caption] Streamlining systems and creating time to think The good news for CIOs is that business management systems are evolving even further to support their role as a ‘Technology Hunter.’ Firstly, in order to free up the CIO to take the company’s technology strategy to the next level, it’s important to have a reliable, integrative and easy to operate system with all of the company’s information in one place.  A system which is open to new integrations but also capable of keeping data integrative will make sure that data is consistent, and managed in a flexible and responsive way. Secondly, CIOs need not fear putting their business management system in the cloud. One of the main issues is concerns of losing control over different aspects of system management, such as upgrades. Large companies that are bound by regulations are afraid of this process and prepare for it by strict backups and tests. This is where private cloud works best in the initial stage:  on one hand they free themselves of maintenance responsibilities, but on the other hand they still have a reasonable amount of control. Thirdly, a major demand of information systems today from the C-Suite is to provide the ability to draw conclusions and even deliver business insights. IT systems today are asked to incorporate intelligence abilities that not only analyse past performance, but generate forecasts and make recommendations to managers based on this type of predictive analysis. Intelligent technology at this level forms the basis of a strong and resilient IT department, and in turn frees up time for both the CIO and IT staff to focus their efforts on exciting projects which are targeted at growing the business and staying ahead of the competition. ### What is ISO 27001 and why do I need it? 
 When it comes to data security, there is a wealth of standards that you need to meet in order to achieve compliance. This can be quite confusing as understanding ISO certification can be difficult, especially at face value.  To make this simpler for you, here’s a quick roundup of ISO 27001. What is ISO 27001? Put simply, ISO 27001 is a specification for an information security management system (ISMS). It’s a model of working for frameworks surrounding the legal, physical and technical controls that are used when processing an organisation’s information risk management. [easy-tweet tweet="ISO 27001 is a specification for an information security management system " hashtags="ISO27001"] This standard provides complete guidance, covering everything from establishing and implementing the framework to the way in which it is operated and monitored. It even recommends ways to maintain and improve your systems. ISO 27001 works using a top-down, risk-based approach.  It generates scope, taking into account the context of the organisation, planning and analysing processes, current performance and addresses the findings to show where improvements can be made. It is important to note that ISO 27001 does not work independently Using ISO 27001 It is important to note that ISO 27001 does not work independently. Instead it requires input by management to examine the security risks present and take the appropriate actions based on the threats and vulnerabilities present. Management will have to create and implement their own security controls or other forms of risk management, i.e. risk avoidance or risk transfer, to address the problems present. The best practice is to adopt an overarching security management process that is ISO 27001 approved. This ensures that your security controls meet the required standards needed for your organisation on an ongoing basis. However, even with a system such as this in place you will still need to take manual action from time to time to respond to threats and make improvements and changes to the system. Security controls are very important, therefore it is vital that you take the necessary time to ensure that your system runs as efficiently as possible. Why gain certification? There are many benefits to be had through certification compliance. One of the most obvious benefits is that this shows that your organisation takes their information security management seriously. Having an independent assessment adds extra weight to this. Any organisation looking to work in an environment where secure file transfers are a priority will favour other organisations that have been certified ISO 27001 compliant. This states that the ISMS in place is compliant and there are measures being taken, on a regular basis, to ensure that it is as safe as possible. [easy-tweet tweet="How fast can you get an ISO 27001 certification? John Lynch explains on @comparethecloud" via="no" usehashtags="no"] How fast can I get ISO 27001 Certification? Unfortunately, there is no set answer as the time it takes to gain certification depends very strongly on your existing circumstances. If you are using software and programmes that already have ISO 27001 certification, then you will only need to change the way in which your business operates to gain compliance – typically this will take between 5 and 9 months. However, if there are no measures currently in place then this can take much longer. 
You will need to implement new programmes, carry out a risk assessment, address issues and change your day-to-day practice in order to meet the standards. At this stage, it could take up to two years to achieve certification. [quote_box_center]If you’re looking to achieve this standard as quickly and efficiently as possible, you will need: The right tools to monitor and evaluate your security A strong plan to assess and score risk To tailor the standard to your organisation’s needs Training across the board to work to ISO 27001 values and best practice[/quote_box_center] Don’t be put off by the time and costs of this certification, this process does not need to be complicated. With the right guidance and proper tools you will find that achieving certification is well within your grasp. ### Big Data: from Investment to Business as Usual Hotels.com is one of the world’s leading online booking services. The company has hundreds of thousands of bookable properties available worldwide, over 15 million Hotels.com Rewards members and over 14 million genuine customer reviews. Hotels.com’s success is a result of hitting the zeitgeist. The site offers customers a broad choice, reliable technology, and an intuitive user interface; creative marketing; and – perhaps surprisingly - a state-of-the-art data application and analysis platform. But, how can Big Data help to create a successful online booking service? [easy-tweet tweet="Multi-device usability is key for success " user="tbedos" hashtags="CX"] One of the most important success metrics for an online booking platform is to offer customers an excellent user and booking experience that is independent from the devices they are using or where they are. The continuous improvement of Hotels.com’s usability is based on analysis from various sources including clickstreams, reviews, personal preferences of users, and hotel profiles. App developers are able to improve the user interface and the algorithms behind the booking procedure by understanding the customer journey beyond just one session. The in-depth knowledge gained allows Hotels.com to provide consumers with a personalised user experience and custom-made recommendations. As an early adopter of advanced and predictive data analytics, Hotels.com is sharing some of its findings to give CIOs suggestions on how to carry through Big Data projects. It’s one thing to understand the principles of Big Data, but seeing industry projects running well could inspire CIOs. Hotels.com’s Big Data experience should serve as a success story for the industry.   Choose the right platform We all know that the right technology platform makes or breaks an IT project; the same can be applied to Big Data solutions. But how can we make the right choice? A technology decision should be based on a thorough assessment of business needs and deciding what could benefit from data analysis in the future. CIOs shouldn’t focus purely on investment costs when choosing the technology. The evaluation should also include performance, reliability, usability, data security and – this is very important - scalability. [easy-tweet tweet="The right technology platform makes or breaks an IT project" user="tbedos" hashtags="cloud, IT"] After weighing up all the pros and cons, Hotels.com chose Datastax Enterprise as its online data platform, which is based on the open-source Apache Cassandra NoSQL database. 
This allows Hotels.com to benefit from features such as built-in management services, extra security features, and external Hadoop integration that complements the features of a pure open-source solution free of charge. We also benefit from Datastax’s support, maintenance and update services. That leaves us free to focus on data analysis and supporting the Hotels.com business. Get the bosses on board A Big Data project requires both investment and cross-company collaboration: silo thinking could be an obstacle to long-term success. A main goal for Hotels.com, for example, was to break down the barriers between the online world of our booking platform and the offline world with our data warehouse solution at its heart. This required a massive change in entrenched processes. We found that the best way to get backing from the bosses was to prove that we could achieve a quick return on investment (ROI). To demonstrate this we collected more than 150 business use cases that would be possible with the proposed platform. We then selected a subset of 10, which were suited to illustrating a proof of concept within a narrow time frame. Seeing our proposed platform delivering quick wins helped to convince the board and served as a stepping stone to more challenging use cases. Data privacy comes first Customer trust is a precious commodity Customer trust is a precious commodity and respect for data privacy is one important key to success. Therefore CIOs should ensure that their Big Data strategy is carefully balanced with a commitment to protect customer data security.The use of anonymization is vital to protect every user’s privacy, especially when analysing large quantities of aggregated data. Make data analysis business as usual Regarding cost, time and organisational restraints, project leaders should always keep sight of long-term development to build a reliable, efficient and future proof platform. Scalability is an important component in long-term planning to ensure that the technological platform is able to keep pace with the ever increasing flood of structured and unstructured data. CIOs should start to integrate the analytics processes into everyday business once the platform has been established and first use cases have yielded results. Only when this has become business as usual should IT and data teams address new projects to advance the company’s business even further. More than analytics [easy-tweet tweet="For Hotels.com one of the most important uses of #data is to deliver a cross-device user experience"] Big Data is closely linked to data analytics, but Hotels.com’s Big Data platform goes far beyond that. One of the most important uses of data is to deliver a cross-device user experience for multiple screens and independent from where they are. On the one hand Hotels.com analyses user data in order to determine which devices are being used. On the other hand, Hotels.com logs data in order to be able to recognize the destinations each specific user has been searching for on the company’s booking platform and to ensure that customers are recognised on any device, so that they can pick up where they left off in their search for a hotel. This allows users to begin looking for accommodation, for example, on a tablet while travelling by train, before confirming the booking on a desktop at home without the need to start the booking journey again. This is only one example of how Hotels.com uses its data platform to improve the booking experience and make it as personal as possible. 
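To make the cross-device hand-off described above a little more concrete, here is a minimal sketch of how a booking platform might persist a user's last search in Apache Cassandra (the open-source database underneath DataStax Enterprise) using the Python driver. The keyspace, table and column names are illustrative assumptions for this example, not Hotels.com's actual schema.

```python
# Minimal sketch: persisting a user's last search so any device can resume it.
# Assumes a local Cassandra node; the keyspace and table names are illustrative only.
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])          # contact point(s) for the cluster
session = cluster.connect()

session.execute("""
    CREATE KEYSPACE IF NOT EXISTS bookings
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 1}
""")
session.execute("""
    CREATE TABLE IF NOT EXISTS bookings.last_search (
        user_id text PRIMARY KEY,
        destination text,
        device text,
        updated_at timestamp
    )
""")

# Written from the tablet app when the user searches...
session.execute(
    "INSERT INTO bookings.last_search (user_id, destination, device, updated_at) "
    "VALUES (%s, %s, %s, toTimestamp(now()))",
    ("user-123", "Barcelona", "tablet"),
)

# ...and read back on the desktop, so the journey resumes where it left off.
row = session.execute(
    "SELECT destination, device FROM bookings.last_search WHERE user_id = %s",
    ("user-123",),
).one()
print(row.destination, "last searched on", row.device)
```

In a production schema the table would typically be keyed and replicated very differently; the point of the sketch is simply that a single, shared record per user is what lets the search continue seamlessly across devices.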
### Zayo Closes Acquisition of Viatel in Europe Zayo Group Holdings, Inc. an international provider of Communications Infrastructure services, has announced it has closed its €98.8 million acquisition of Viatel's infrastructure and non-Irish carrier and high bandwidth business from the Digiweb Group, a full service telecommunications and managed services operator, based in Dublin, Ireland. The acquisition adds an 8,400 kilometre fibre network across eight countries to Zayo’s European footprint, including 12 new metro networks, seven data centres and connectivity to 81 on-net buildings. The Viatel businesses to be acquired are highly aligned with Zayo’s existing product and customer set, including a higher proportion of dark fibre revenue. The new capability significantly expands Zayo’s global capabilities, providing Pan-European infrastructure and connectivity to key subsea cable systems, delivering traffic to and from North America, Asia and Africa. The expanded network and infrastructure solutions are available immediately to Zayo and Viatel customers through their account representatives or via Tranzact, Zayo’s innovative platform for purchasing and managing infrastructure at zayo.com/tranzact. The suite of services available includes Dark Fibre, Wavelengths, Ethernet, IP, Data Centre and Cloud, across more than 92,000 miles -- nearly 150,000 kilometres --  of Zayo’s high-performance network. “The acquisition of Viatel’s European network business strengthens our strategic position in Europe and provides customers with access to our fibre network and expanded connectivity to key international markets,” said Dan Caruso, Zayo chairman and CEO. “Because of the complementary nature of the acquisition, we will begin cross-selling our full suite of services to both Zayo and Viatel customers immediately.” For more information, please visit zayo.com. ### Welcome to the new and easy era of containers and software-defined infrastructure Managing enterprise infrastructure is hard. From dealing with individually configured virtual machines, to using automation tools to deploy systems - management of data throws up so many difficulties. If managing data wasn’t difficult enough already, imagine trying to deploy a system that has multiple services and servers, then add on top of that production and let’s not forget about maintaining identical configurations for development and staging. All of these complications mean it is nigh on impossible to deliver systems as fast as is necessary. [easy-tweet tweet="#Docker and #Mesos will change the way engineers look at architecture says @ArtyomAstafurov" usehashtags="no"] Sounds rather gloomy doesn’t it? Not for long. The good news is that we now have production-ready systems that can change the way we configure systems and manage data centres. Hurrah. Newly developed cluster management systems such as the Mesosphere commercially supported Apache Mesos, and containerization systems, such as Docker help us to view individual data centre machines as a singular atomic unit. [easy-tweet tweet="#Docker allows each system deployed to be put inside a #container making it easier to manage" user="ArtyomAstafurov" usehashtags="no"]   Docker allows each system deployed to be put inside a container making it easier to manage, encouraging collaboration between developers and DevOps, and providing common language for systems deployment and management. 
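As a hedged illustration of that "common language", the short sketch below uses the Docker SDK for Python to run a service inside a container. The image name, container name and port mapping are assumptions made for the example rather than a recommendation.

```python
# Minimal sketch: starting a containerised service with the Docker SDK for Python.
# The image, container name and port mapping are illustrative assumptions only.
import docker

client = docker.from_env()                     # talk to the local Docker daemon

# Run an nginx container in the background, mapping container port 80 to host port 8080.
container = client.containers.run(
    "nginx:alpine",
    detach=True,
    ports={"80/tcp": 8080},
    name="demo-web",
)

print(container.name, container.status)

# The same container image can then be handed to a cluster scheduler such as Mesos,
# which is what makes containers a shared vocabulary between developers and DevOps.
container.stop()
container.remove()
```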
Apache Mesos combined with Docker is a technological godsend as it dramatically minimises configuration errors, which can be costly when it comes to distributed critical systems. Trust me when I say: Docker and Mesos will change the way engineers look at architecture and will make them think about their systems in terms of containers. These new systems are creating a very powerful environment for building highly decoupled enterprise systems. Want to read more about containers? Check out the blog from Bill Ledingham below! Strengthening Container Security Initiatives ### Severalnines adds silver lining to database management for CloudStats.me Severalnines, the provider of open source database management tools, today announced its latest customer, CloudStats.me. CloudStats.me is a cloud-based server monitoring platform capable of monitoring Linux, Windows, OS X servers and personal computers, web sites and IP addresses. Users of CloudStats.me range from system administrators to website owners and web hosting providers, such as WooServers, HudsonValleyHost and other SMEs. CloudStats.me is a UK company which currently monitors more than 1,300 servers and 1,000 websites, and its server monitoring platform is growing rapidly at 500-1000 new users per week, adding approximately 1.5 servers to each new user account. The rapid growth of the CloudStats user base and the number of services being monitored created a very high load on its database. This high load left the MySQL-based CloudStats database unable to handle the large amounts of incoming data collected by the CloudStats system, causing false alerts and further inconveniences to some of its users. CloudStats needed a new database distribution mechanism that would replace the current MySQL database configuration to avoid further customer churn. The MariaDB MySQL Galera Cluster, which allows users to create a MySQL cluster with “Master-Master” replication, was chosen as the replacement. The database is automatically distributed and replicated across multiple servers, and additional servers can be added to scale workloads, which makes it extremely easy to scale a database cluster and add additional resources to it if necessary. For the CloudStats.me team it was apparent, however, that the database cluster management required a number of specialist technical skills. This could create a burden on the development team as they would need to devote extra time to set up, configure and manage clusters. Instead of hiring additional database system administrators, it was much more efficient to use one platform that could quickly install, configure and manage a MariaDB Galera Cluster. Furthermore, such a platform would need to support easy cluster expansion, database availability, scalability and security. After some online investigation, the CloudStats.me team found Severalnines’ ClusterControl. The initial setup of the ClusterControl platform took only a couple of hours, and it went live straight out of the box. The whole installation process was fully automated, so the team could have clusters up and running in very little time. Severalnines also offered valuable advice on design of the new database architecture. Today, CloudStats.me sees the following benefits from using Severalnines ClusterControl: Complete control of its MariaDB cluster with an ability to scale at any time. Significant cost savings were achieved as ClusterControl helped reduce database administration costs by 90%.
There is additional bandwidth to save costs on technical issues with support included as part of the ClusterControl subscription. Vinay Joosery, Severalnines Founder and CEO said: “As we have seen recently, 2015 is truly the year of the cloud as the leading vendors are beating their expected sales targets. There is an appetite to ensure servers can be rapidly deployed and optimised for performance. It is therefore no surprise to see the success of companies like CloudStats.me. We are delighted to provide CloudStats.me with a scalable infrastructure and the ability to increase or decrease workload based on its customers’ needs.” Alex Krasnov, CEO CloudStats said: “Our cloud and server performance management business is witnessing a ‘hockey stick’ growth trajectory. This means it was particularly important to plan our infrastructure in such a way that it would support further growth and allow us to add more features into the CloudStats control panel. The support we have had from Severalnines has made our lives much easier because it provided the database expertise, which we needed to ensure our system has a maximum uptime. We plan to deploy more clusters and continue to integrate ClusterControl with our server monitoring platform.” ### Our first looks into 2016 and beyond As we near the end of 2015, we can begin to think about what not only the next year, but the future in general will bring us. Compare the Cloud spoke to two security experts about their thoughts on the future in security, and what cloud has for us in the years to come. [easy-tweet tweet=".@ComparetheCloud spoke to two #security experts about their thoughts on the future in security" via="no" hashtags="cloud, infosec"] Scott Tyson, Director of Global Sales at email security firm, Mailprotector has provided us with 6 predictions on the state of security in 2016, and Ian Collard, managing director, Identity Methods Ltd has looked even further into the future, sharing his thoughts on the finality of cloud in 15-20 years. We’ll begin with the shorter spectrum of views, here are Scott Tyson’s thoughts on 2016: [quote_box_center]Big breaches – we will continue to see headline-grabbing data breaches hit the media. We have already seen evidence of this in 2015 and the trend will only continue. As well as being headline grabbing, companies will go on to suffer reputational damage and data breaches and security attacks will start to hit the bottom line. Phishing isn’t going away – while more of us are aware of phishing scams today, they still represent big wins for cybercriminals. We will continue to see phishing emails in our in-box as the technology used by attackers gets even more sophisticated and attackers become more targeted and clever, increasingly using social engineering techniques, in their approach. Email security with built-in protection against phishing will be essential for smaller businesses. Alliances, mergers and acquisitions – we have seen a lot of acquisitions in the security sector again this year, and this will be in evidence in 2016 as niche specialists, like those in encryption and identity management, will be scooped up by bigger vendors looking to add firepower to their security portfolios. Data encryption is finally having its day – a niche technology that has been divisive when it comes to discussions about government snooping and surveillance, encryption technology will be even more crucial for securing data in the cloud in 2016. 
SaaS security - even greater adoption of cloud from businesses will create demand for cloud-based security. IDC’s Futurescape for Security predicts that enterprises will be utilising security software as a service (SaaS) as a greater share of their security spending. By the end of 2015, 15% of all security will be delivered via SaaS or be hosted, and more than 33% by 2018. Partners come into their own with a services-led approach – this has been a year of change, with greater adoption of cloud and mobile technologies. As a result, we have seen resellers moving away from sales focused on margin per sale towards monthly recurring revenue models. Many are starting to realise that significant revenue will be driven by cloud-based offerings as opposed to hardware, and we will continue to see them develop MSP divisions or services-led business models to embrace these changes.[/quote_box_center] And now, moving on from 2016, here is Ian Collard’s controversial take on the Cloud being defunct by 2035. "The Cloud will stay with us for another 15-20 years until the cycle changes and swings back again to enterprise." - Ian Collard [quote_box_center]Post-Cloud, or AC (After Cloud), will be brought about by ridiculously cheap technology, security scares, easier-to-use systems (i.e. deskilling), the Internet of Things, etc. Whatever the trigger, something will lead to bringing the crown jewels back in-house. The Cloud is, after all, just a modern version of the computer bureau. The work environment will be a very different place in 2030, refined by the skills our children will have developed for when they enter the workplace. Looking at ‘right now’, the market has already begun to move away from the “bolt on” suites from the likes of SAP and Oracle and is moving in a more self-contained and cloud-based direction. Well OK! What are you waiting for? Well, you’re actually waiting for your IT department to have their handcuffs taken off by your IT Security department. You’re going to put all of our HR data in the cloud? It’s going offshore? It’s subject to the US Patriot Act? Wow, you’ve really stirred up a hornet’s nest here. Nobody told you that stuff, so suddenly IT Security are all over it. Where are the safeguards? IT needs to develop a cloud federation capability to share your user names and passwords, and you need to extend the reach of your identity and access management. What’s going on here? None of that was on the menu when you made the decision. But that’s how it plays. Technology decisions need to be thoroughly thought through. But all this time the business needs to stay smart and understand from their colleagues in IT the implications of strategic change. We often hear the mantra of “IT should support the business” and that is perfectly true; however, whether for good or ill, often IT is the business! The business cannot function without it and the systems we choose reflect directly on our ability to function as management. [/quote_box_center] Here at CTC we love looking into the future. If you’d like us to publish your predictions for 2016, please get in touch with our Editor Rhian at PR@comparethecloud.net, or tweet her at @Rhian_Wilkinson. ### Federated Cloud: The Next Generation of Cloud Services It’s hard to argue with the impact the cloud has had on IT and business over the past decade. It’s made the world smaller by making collaboration easier. The consumer cloud file sharing services that have made our personal lives easier – Dropbox, Google Drive, iCloud, Microsoft OneDrive, etc.
– are now rampant within enterprise IT organisations. This first generation of cloud file sharing flourished because it followed the famous Apple mantra: “it just works.” Simple, functional user interfaces made it easy for end-users to understand and use. People got used to that ease-of-use in their personal lives and soon demanded the same of their enterprise file sharing. At the same time, this model requires almost no maintenance – it’s often an easy plug-and-play solution that enables fast, cheap collaboration. [easy-tweet tweet="The first generation of #cloud #filesharing flourished because it followed the famous Apple mantra: It just works"] But this public cloud model has always had a very big, very obvious problem. In an era of cyber-warfare, data breaches, hacking and concerns over government access, user data stored in a centralised cloud infrastructure can lead to serious privacy and security problems – both real and perceived. Whether it’s Dropbox passwords being exposed en masse, iCloud or Sony hacks of sensitive photos and emails or the NSA demanding (and getting) access to information stored on public cloud servers, these solutions simply didn’t have the enterprise-class security needed to ensure ownership and control of data. There are a host of additional issues with the public cloud model. For instance, when the file storage location cannot be changed, this creates legal and regulatory issues for enterprises. Additionally, software that cannot be audited or inspected puts all of the responsibility and power in the hands of a third party – even when it’s your data at risk of being exposed. Extensibility can also suffer, as administrators struggle to integrate or extend the solution into other systems. [easy-tweet tweet="User #data stored in a centralised #cloud infrastructure can lead to serious privacy and #security problems"] In response to these concerns, we then saw the rise of the private cloud – a second generation of solutions where software was installed on-premise, on infrastructure controlled and owned by the enterprise. This has the benefits of enhanced security, integration and extensibility, and has gained a big slice of the cloud services market share in recent years. But this model has its own limitations, most notably the inability to collaborate and share data outside of a specific enterprise running one of these private clouds. And when you think back to the reason cloud computing took off, the value proposition at its very core comes down to cost and collaboration. Without collaborative benefits, the private cloud is really just a watered-down version of what the cloud was supposed to be. With a monolithic, centralised first generation of cloud services that can’t protect your data and a second generation that’s more secure but discourages collaboration, the stage is set for the next generation of cloud computing. the stage is set for the next generation of cloud computing The idea of federation has taken root in enterprise IT systems, from architecture to identity management. But it has not yet made its way to the cloud. In computing, the word “federation” is used to describe a group of servers acting as a single system. The best example of federation in the enterprise is email. Regardless of whether you’re using a secure, private server (your company’s, for instance) or a large, public one (Gmail, perhaps), everyone can exchange information and retain the benefits and drawbacks of the service they’re using.
[easy-tweet tweet="In computing, the word “federation” is used to describe a group of #servers acting as a single system" user="comparethecloud" usehashtags="no"] Using the idea of federation for cloud-based file sharing, for example, picture a scenario where each user uses their local, on-premise service with all the benefits mentioned above, but with the additional capability of being able to communicate with each other across servers and organisations. This federation removes the final barrier preventing private clouds from reaching their potential – easy, secure collaboration with external parties. It’s the realisation of the potential we all envisioned the first time someone explained the concept of cloud computing to us (before the almost-immediate security/privacy concerns crept in and gave us pause).  Federated cloud services have the potential to be extremely powerful by combining the benefits of centralised consumer services with the security and privacy benefits of on-premise deployments. So how would this play out in a real enterprise? Think of a global company – collaboration has to take place not only among disparate offices across the globe, but with a network of other companies. Whether it’s a parts supplier, a management consultant, or any other organisation the company outsources to and collaborates with, there’s a real need for universal, real-time file access. With federated cloud services, it’s possible that teams and users across these different geographies and companies can share folders and documents – just like we all do within our own enterprises. Cloud computing is here to stay – it’s becoming an increasingly prevalent and important part of enterprise IT backbones. But despite the successes to date, cloud computing has yet to deliver on its initial promise of seamless, secure collaboration. Federation has been a transformative concept in the enterprise tech world for a long time, and the time has come for the federated cloud – the next era of cloud computing. ### MapR Announces MapR Streams, Creating the Industry’s First and Only Converged Data Platform MapR Technologies, Inc.today announced the industry’s first and only converged data platform and introduced MapR Streams, a reliable, global event streaming system that connects data producers and data consumers across shared topics of information. The MapR Converged Data Platform integrates file, database, stream processing, and analytics to accelerate data-driven applications and address emerging IoT (Internet of Things) needs. The integration of MapR Streams into a converged platform enables organisations in any industry to continuously collect, analyse and act on streaming data. From advertisers providing relevant real-time offers, to healthcare providers improving personalised treatment, to retailers optimising inventory, to telecom carriers dynamically adjusting mobile service areas, organisations must improve their responsiveness to critical events with the continuous analysis of big data. 
[easy-tweet tweet="MapR unveils #MapR Streams and the only Converged #Data Platform" user="comparethecloud" hashtags="cloudnews"] According to Gartner, “Even as traditional workloads become progressively more optimised, entirely new ones are arising, notably from the emergence of IoT dataflows – that will dwarf previous volume and velocity demands and demand new stacks of hardware, networking and especially information management technology for processing, securing and distributing new varieties of data.” “MapR is at the forefront of designing solutions for data-centric businesses as they operate today and provides the best big data platform with a core architecture in place to successfully address modern data challenges,” said Michael Brown, CTO, comScore. “Our system analyses over 65 billion new events a day, and MapR Streams is built to ingest and process these events in real time, opening the doors to a new level of product offerings for our customers.” MapR Streams can easily scale to handle massive data flows and long-term persistence while providing enterprise features such as high availability (HA), disaster recovery (DR), security, and full data protection. “We continue to be impressed with the innovative new features MapR has been integrating into its data platform,” said Brad Anderson, vice president, big data informatics, Liaison Technologies.  “Our customers are in regulated industries such as healthcare and financial services, which have incredibly demanding security, compliance and agility requirements. We are seeing the converged MapR platform with MapR Streams delivering more real-time data services to our users, while enhancing those important criteria.” Unlike other approaches that create data silos across multiple systems and lack the required enterprise-grade features and global replication, only MapR natively integrates data-in-motion and data-at-rest in one converged data platform. As a result, MapR enables developers to create new, innovative applications that reduce data duplication and movement, lower the cost of integration and maintenance associated with multiple platforms, and accelerate business results. “Bringing together world-class Apache Hadoop and Apache Spark with a top-ranked NoSQL database and now continuous, reliable streaming with global scale is a huge step forward in enabling enterprise developers to create the next-gen apps using big data,” said Anil Gadre, senior vice president, product management, MapR Technologies.  
“MapR continues to execute on its vision of making it easy for enterprises to get the competitive edge from big data by bringing together all of the essential components.” [quote_box_center]With MapR Streams, for the first time, developers can harness the power of a continuous, converged, and global data platform, allowing them to: Easily build scalable, continuous high-throughput streams across thousands of locations with millions of topics and billions of messages Unite analytics, transaction, and stream processing to reduce data duplication, latency, and cluster sprawl while using existing open source projects like Spark Streaming, Apache Storm, Apache Flink, and Apache Apex Enable reliable message delivery with auto-failover and order consistency Ensure cross-site replication to build global real-time applications Provide unlimited persistence of all messages in a stream [/quote_box_center] MapR works closely with numerous leading technology partners, such as dataArtisans, Databricks, DataTorrent, Streamsets, and Syncsort, to provide customers with the flexibility to choose the components they want in their real-time analytics data platforms. “MapR has fully embraced Apache Spark, and now with the scale, performance and flexibility delivered by MapR Streams, the solution is a great complement to Spark Streaming,” said Kavitha Mariappan, vice president, marketing,Databricks. “We look forward to our continued partnership with MapR and to offering our joint customers a powerful, converged data platform that simplifies the management of data in motion regardless of its source, location and format.” ### Choosing your cloud - the good, the bad and the ugly The biggest challenge any business faces is trying to predict the future; knowing what your data will look like in three or five years’ time so you can put in an infrastructure that is flexible enough to grow and accommodate change is a huge challenge. The amount of data that all businesses are generating themselves and storing on their clients is growing all the time. The infamous statistic from 2013 that 90% of the world’s data has been generated over the last two years continues to be quoted and I suspect if we looked closer now in 2015 we’d find the numbers astounding. Being able to know what data you want to keep, what you want accessible now and what you can file away is going to become increasingly difficult. [easy-tweet tweet="Choosing #private, #public or #hybrid #cloud is about being methodical in making your choice says @David_4D" usehashtags="no"] Unsurprisingly hybrid cloud has become the cloud model of choice for many organisations that want to build greater flexibility into their infrastructure in a cost effective and future proof manner. What’s not to like about hybrid? You get to keep some of the simple data on-premise yet free up your existing infrastructure by moving critical elements to the cloud, all the while relying on your cloud provider to have the latest provisioning and security in place that helps to ensure you keep abreast of your industry regulation and legislation. But is there a good, bad or ugly of cloud? Choosing private, public or hybrid is about being methodical in making your choice and not just diving in for what seems to be the quickest and simplest solution on the surface.  Private Cloud Private cloud as the name suggests means that only one specific organisation can operate and access data in this particular cloud environment. 
The idea is that you naturally have greater control and therefore privacy for your sensitive data. This is great if you’re an industry giant with exceptionally deep pockets and can afford excessive operational costs and if you’re someone who doesn’t like to share the same environment. [easy-tweet tweet="With #PrivateCloud you naturally have greater control and therefore privacy for your sensitive #data" via="David_4D" usehashtags="no"] Public Cloud The public cloud on the other hand offers a great virtualised service for consumers running at a fraction of the cost of private cloud and it’s accessible over any public network. However, the big BUT here is whether any savvy business should be sharing its crucial data on a resource that is so publicly available. In this case cheap doesn’t always mean better. Yes, it will offer the ultimate in scalability because it has such a vast user base and instant flexibility if your business grows overnight, however, with businesses becoming more and more concerned about data security a simple breach could provide a massive headache. Hybrid Cloud In my experience a hybrid cloud solution enables businesses to run multiple applications within the platform while also being able to connect back into colocation rack in the data centre or indeed linking straight back into the office using point-to-point connections. We’ve found that while people want all the benefits that cloud solutions have to offer they also want the security of knowing where their data resides and they rightly want to remain in control. A hybrid cloud solution offers just that. Deciding on which cloud is right Be guided by your industry regulations and Data Protection laws At the end of the day when it comes to cloud the decision is yours. Be guided by your industry regulations and Data Protection laws. Talk to your peers and partners about what has worked best for them. A cloud solution should provide you with: Scalability – so you can upgrade instantaneously with no downtime or capex costs Performance – that’s as good as a dedicated server Easy management – via an online secure control panel Reliability – mirrored SAN storage and hot migration of your virtual servers A choice of operating systems – Linux, Windows, in a variety of flavours and versions. And once you’ve taken these requirements into consideration select a partner where you can easily check their security measures – go and see it for yourself. Establish a comprehensive SLA – all reputable cloud providers will gladly provide one. And where possible select a cloud that offers a ‘try before you buy’ model which will help you to establish the success criteria appropriate for your business and ensure the version of cloud you’ve chosen is delivering maximum performance and value for your business. ### Riverbed survey highlights gap between business needs and the reality of app performance Riverbed Survey Finds 98% of Businesses Say App Performance Is Critical to Business Performance, Yet 89% Say Poor App Performance Negatively Affects Their Work  Riverbed Technology, the leader in application performance infrastructure, today announced the results of the Riverbed Global Application Performance Survey 2015, the first global survey of business decision makers on the business impact of application performance. 
The survey has revealed a major performance gap between the needs of business and IT’s current ability to deliver – 98% of executives agree that optimal enterprise application performance is critical to achieving optimal business performance. And yet, 89% of executives say the poor performance of enterprise applications has negatively impacted their work, and 58% say it impacts their work at least weekly. This performance gap is causing a series of problems for companies, from lost revenue and customers to lower morale to negative impact on brand image. Companies universally agree that business performance relies on application performance. And yet 9 out of 10 organizations suffer from poor performance on a regular basis. One cause of this performance gap is the move to hybrid IT. Migrating apps to the cloud brings agility and cost benefits, but, with other apps still on-premises, it also brings complexity. With apps, data and users literally everywhere, the work of optimizing and delivering great app performance has gotten much tougher for IT organizations. But you can’t control what you can’t see. And in order to close the performance gap, having a clear line of sight into how the apps are performing – and how the end user experience is being impacted – has also become a business imperative. “The results of the survey reflect what we’re hearing every day from IT leaders who are looking to deliver superior application performance in the midst of rapidly evolving, highly complex and hybrid IT environments,” said Jerry M. Kennelly, Chairman and CEO, Riverbed. “With apps, data and end users everywhere today, companies need end-to-end application visibility, optimization, and control everywhere as well to close the performance gap. Riverbed helps organizations improve application performance to drive tangible business benefits and performance.” Survey respondents specified their top three business benefits of optimal application performance versus the negative impact of poorly performing applications:

| Benefits of Optimal App Performance | Pitfalls of Poor App Performance |
| --- | --- |
| Improved employee productivity (51%) | Dissatisfied clients or customers (41%) |
| Time savings (50%) | Contract delays (40%) |
| Cost savings (47%) | Missed a critical deadline (35%) |
| Improved customer satisfaction (43%) | Lost clients or customers (33%) |
| Faster delivery of products to market (33%) | Negative impact on brand (32%) |
| Improved employee morale (31%) | Decreased employee morale (29%) |

The survey found that executives would be willing to sacrifice a lot for applications to work at peak performance at all times. In fact, 33% would give up their full lunch break. They would also give up a portion of their program budget (32%), caffeine (29%), and even chocolate (27%). Given the universally recognised importance of optimal application performance, why is it so difficult for IT to deliver? Globally, 71% of respondents say they have felt frequently “in the dark” about why their enterprise applications are running slowly, spotlighting a disconnect between IT teams and business executives. And outside the Americas region, that number grows even larger at 76% in EMEA and 75% across Asia.
Troublingly, executives can contribute to the problem as they try to work around it: 37% of respondents say they have used unsupported apps when corporate apps run slowly or stop working altogether, thus adding to infrastructure complexity with more “shadow IT.” Others have expressed frustration to colleagues (34%), taken an extended lunch (29%), used slow or down apps as an excuse for missing a deadline (26%), and even left work early (26%). Cloud Computing Benefits Business, But Also Adds Complexity Migrating apps to the cloud has delivered benefits to the business, but also some challenges. Nearly all (96%) of respondents use cloud-based enterprise applications in their work, 84% say their company’s use of cloud-based enterprise applications will increase over the next two years, and more than three-fourths (78%) of respondents say that moving key enterprise applications to the cloud has increased productivity. Additional benefits of cloud-based enterprise apps include increased flexibility (58%), cost savings (46%), increased agility (41%), and increased collaboration (40%). That’s the good news about cloud apps. The bad news is that hybrid IT contributes to the performance gap. There is an increased difficulty in getting end-to-end visibility into the complex, hybrid IT architectures that result from the use of both cloud and on-premises apps. Eighty-three percent of respondents say they believe trouble-shooting application performance issues is more difficult in a hybrid IT environment. In fact, according to a survey by Forrester[1], the majority of companies (51%) say that application complexity is now their primary obstacle to mastering application performance. On average, respondents estimate it takes seven hours for serious app problems to be completely resolved. In summary, business executives overwhelmingly agree that application performance is critical to business performance and driving results, yet the vast majority are impacted by poor app performance, creating a performance gap.  At the same time, business executives are leveraging the power of cloud-based applications and hybrid networks to elevate productivity and create happier, more loyal customers and employees. However, cloud and hybrid environments add complexity and application performance challenges that can also negatively impact business operations, and too often executives feel “in the dark” as to why poor app performance is happening and how to stop it. To deliver superior application performance in today’s hybrid environments, enterprises need a comprehensive solution that provides end-to-end application visibility, optimization and control. For the full 2015 Riverbed Global Application Performance survey report that also includes regional insights, please visit www.riverbed.com/global-application-performance-survey.html.  [quote_box_center] Riverbed Global Application Performance Survey Methodology The Riverbed Global Application Performance Survey 2015 is the result of a custom online survey by Wakefield Research of 900 business executives at companies with $500 million or more in revenue. “Executives” are defined as those manager-level equivalent or above. Research was conducted in October 2015 across eight countries: US, Brazil, UK, France, Germany, China, Australia, and India. Among the 900 respondents, 200 were in the US, with 100 in each remaining country. [/quote_box_center] ### Up in the clouds; Cloud Backups Once, backup was the most unpleasant job for the IT department. 
Good backup management never got you a pay rise, but data loss could easily get you the sack. Whenever there was a work experience student in the IT department, it was their job to change the backup tapes regularly and drive to the bank safe or other secure location. Backups were hard work and time-consuming, and are only ever important at the end of the process. However, in recent years, disk-based backups have made real changes and the cloud offers a whole range of new options and opportunities. What exactly is cloud backup, what are the challenges and can you completely rely on cloud backup only? [easy-tweet tweet="Good #backup management never got you a pay rise, but #dataloss could easily get you the sack" via="no" usehashtags="no"] Cloud backup not only covers backups in the cloud, but also backups from the cloud. More and more primary company data is now stored in the cloud. The best examples are Office 365, Exchange Online and OneDrive, plus a range of Enterprise File Sync and Share (EFSS) services. As more and more companies are moving over to transferring data to these SaaS-based cloud environments, it is crucial that the same level of data security is provided and that the data is backed up. This is known as cloud-to-cloud backup. Private cloud on premises or with service-provider Anyone opting for a cloud backup does not necessarily need to use a public cloud as offered by some of the big backup providers. They can also use suitable appliances to either set up a private cloud themselves or use a closed cloud operated by an IT service-provider they trust. In both cases, multiple backup appliances are used to form a private cloud. Many companies choose local service-providers in German-speaking countries. They provide proximity and trust and often ensure better legal security. Disaster-proof cloud backup management Just like any other kind of off-site data replication, the most important function of a cloud backup is disaster recovery. If data is lost both on the primary source and the appliance, for example, following a natural disaster, a fire or an explosion, all the data can be restored from the cloud. [easy-tweet tweet="#Cloud #backup means personnel are not required to deal with the backups on site, at the branches, or other offices" user="comparethecloud" usehashtags="no"] Backup and disaster recovery in and from the cloud are exciting new trends, but the cloud also offers major benefits in a third area of data backups, specifically backup management. The physical separation of the appliance and the management system means the management system remains intact and can be accessed from other places. This is very valuable, and not only in a disaster scenario. A shared interface for monitoring backups across different sites also saves money on a day-to-day basis. Personnel are not required to deal with the backups on site, at the branches, or other offices. Even in large companies with lots of branches, just a few administrators can control all backups around the world, and manage recovery at any site if necessary. Bandwidth usage is the biggest challenge in cloud backup In nearly all companies, data volumes are increasing exponentially, which increases the costs of the Internet connection and network infrastructure. To ensure data transfer to the cloud does not impact on the network performance and Internet connectivity of business-critical applications, modern solutions use incremental data backups and inline deduplication. 
[easy-tweet tweet="#Data volumes are increasing exponentially, which increases the costs of the #Internet connection and #network infrastructure." via="no" usehashtags="no"] Incremental data backup means only data that is new or has been amended is backed up. A full backup is carried out once to start off with and after that it is ‘incremental forever’. Inline data verification ensures that all the data in the backup is complete and usable, so it is no longer required to carry out regular full backups and build on them again. The concept of inline deduplication ensures less bandwidth and storage space is required from the outset Deduplication is also a very effective tool to reduce the amount of backup data transferred. This reduces data volumes massively where identical operating systems are run on numerous devices which are backed up, for example. The concept of inline deduplication ensures less bandwidth and storage space is required from the outset. An agent on the data source, for example the server, calculates a one-off hash value for each data block. The agent then sends this to the local backup appliance which compares it with the existing and processed data. If the value is unique, the appliance requests the block, if it is a duplication, the appliance merely adds a pointer. Complete rely on cloud backups Because of the data volumes and recovery time, it would be unusual for companies to rely completely on cloud backups. Recovery from a local appliance is quicker, and this also forms the gateway necessary for efficient communication with the cloud instance. However, with the developments in cloud backup and bandwidth availability and speed, this will be less and less a problem. Cloud computing is constantly growing and it is predicted that by 2018, 59% of the total cloud workloads will be SaaS workloads, up from 41% in 2013. Many benefits can be achieved by moving data towards the cloud and more organisations are beginning to realise its potential. Although a percentage of offline backups are still required at the moment, cloud backups are constantly evolving and will be on the landscape for a very long time. ### Unisys Stealth – The Invisibility Cloak for the Cloud We are living in the cloud age of “hybrid” and if you don’t know what that means I will share with you the analogy that springs to my mind. [easy-tweet tweet="Hide yo' wife! Hide yo' kids! Hide yo' #Cloud!" user="neilcattermull @comparethecloud" hashtags="security, infosec"] Moving to a centralised cloud environment with options is much like a marriage. Would you get married after the first date? No of course not! So being flexible and living together with a try-before-you-buy approach works. Similarly, you may want to upgrade or change your partner once you have found out they squeeze the toothpaste the wrong way or leave the toilet seat up. Hybrid, better than divorce says @NEILCATTERMULL It’s all about the options that suit you at that time, and let’s face it, every divorce will cost you more than the marriage! Hybrid is just that. Flexibility to other systems that suit you, commitment on your terms, as well as change of life scenarios such as capacity increases or decreases. However, and it’s a big however, you wouldn’t want someone to steal your partner away from you when you leave them at the bar. You need the security and peace of mind that you, and only you, can see your partner as attractive. Security is of the most paramount importance when talking about cloud environments. 
The more devices and infrastructure we connect to the outside world, directly or indirectly, the more protection we need – now more than ever, as cyber fraud generates more in a single year than all international drug revenue put together. [easy-tweet tweet="Originally developed for the US military Unisys Stealth has upped the ante for #cloud #security" user="neilcattermull"] Recently we reviewed a product that just stands out from all the others. What is it? Well, it's a product that was originally developed for the US military, and has recently been redeveloped for commercial use. It's called Stealth, from our friends at Unisys. [quote_box_center]Let me just highlight some of the real wow factors: End Point Cloaking – Masking the end point device, which is unique in the marketplace when put together with the other services it delivers – HIPAA and PCI-DSS standards are met. Micro-Segmentation – The ability to create communities of interest, e.g. separation of environments within the IT infrastructure estate – you define communities that are only accessible from where you dictate. Layer 2 Security – Security at the data link layer, which sits below the network layer of the OSI stack and is by far the best place to apply protection before traffic reaches the network level. Stealth for Cloud – The ability to use Stealth to "cloak" complete environments (even data centres), as well as hybrid environments that make use of public clouds such as Amazon's. The extraction point is secured – If anything or anyone has made their way into your infrastructure, Stealth intercepts the response, disallows any extraction from exiting and drops the packet. [/quote_box_center] There are many other features of Stealth that could be mentioned too; however, I am conscious of this article reading like an advert, so I would advise you to go and check the product out for yourself. I don't think you'll be disappointed! An additional interesting point is that Unisys has developed a "Stealth AWS" product that quite literally cloaks the AWS infrastructure - making public clouds infinitely more secure than they are today. Other cloud provider versions are in the pipeline, so we will see the product breaking out of the AWS sphere. [easy-tweet tweet="With Stealth from Unisys you quite literally can't hack what you can't see" hashtags="hack, infosec, security, cloud"] With security being key for any cloud environment, the cost of bad security should never be undersold. With intruders spending an average of 260 days inside a network before they execute a breach, the odds aren't in your favour. In fact, you may already have intruders looking at your data. With this product line from Unisys you quite literally can't hack what you can't see. Our advice is for you to take a look at Unisys Stealth and draw your own conclusions. Decide if you want an invisibility cloak placed around your data centre, public cloud and operational systems; personally, I'd be saying 'Yes please!'. ### The human dimension of cloud computing Research studies published across the world show a massive take-up of cloud computing, driven by a wide range of technological and commercial factors, notably the requirement to work with big data analytics. With estimates as high as 88% of organisations using cloud computing in one form or another, the success of the new digital paradigm can scarcely be in doubt.
Even though half (52%) of UK CIOs surveyed by Robert Half view increased security risks as the biggest challenge associated with moving IT systems to the cloud, eight in 10 (82%) say the economic benefits of cloud technology outweigh the potential risks. [easy-tweet tweet="82% of #CIOs say the economic benefits of #cloud technology outweigh the potential risks" user="comparethecloud" usehashtags="no"] The general shift to a greater digital focus is underpinned by this move to cloud computing. The vast majority (94%) of senior IT executives believe technology will play an important role in business growth in the year ahead. The main areas where technology will add value are through driving process or employee efficiency (cited by 32% of IT directors) and improving the customer transactional experience (45%). The main ways in which businesses are working to achieve these improvements are through cloud adoption (34%), virtualisation (29%), business analytics and intelligence (31%), and mobile solutions and application development (both 24%). Approximately 88% of organisations are using cloud computing in one form or another So cloud will play a huge role in driving business performance and growth in the year ahead and is already an important element of any IT strategy. Yet whenever there is wide-scale adoption of new technology platforms, demand for the skills required to work with those systems begins to rise and, alongside that, so do salary expectations for key positions of responsibility, as at least in the short term there is usually a skills shortage. According to Robert Half's research with CIOs, the huge rise in adoption of cloud computing initiatives is already driving organisations to employ greater numbers of cloud experts. Some 41% of IT directors said that they would hire additional staff to support cloud initiatives in the next 12 months. Of these, permanent employees would make up 17% of new hires, while contractors or interims would make up 24%, helping to fill the cloud skills gap. As well as hiring new cloud experts, IT directors are investing in training and education to bring their current teams up to speed. Almost half (49%) say that they deliver in-house training and development, while a third (32%) invest in external training courses and 30% provide webinars and e-learning. A quarter of companies deliver on-the-job training while a fifth (20%) rely on previous experience in another company. The fact that so many companies are investing in cloud initiatives as well as training for their teams means that there are rich opportunities and a clear career path for IT professionals with cloud skills. In our experience, the most important attributes in demand are demonstrable experience and a solid track record. If a company does not provide training, employees should take the initiative with self-study. [easy-tweet tweet="It's clearly a good time for professionals to focus on #cloud skills " user="comparethecloud" usehashtags="no"] When talking about the skills needed specifically for cloud support, virtualisation, together with a solid understanding of related areas such as the different storage technologies (SAN, NAS, DAS), networking and its related technologies, and backup and replication and their related technologies, will be key to success. As cloud involves a lot of technologies, professionals who are well versed in a variety of them, while looking to specialise in at least one aspect, will progress in their careers.
For those looking for certification, VMware VCP, EMC Proven Professional, CCNA and anything related to Veeam and Zerto will stand employees in good stead. With such a high proportion of companies planning to hire additional staff in this area, it's clearly a good time for professionals to focus on cloud skills, as well as security around cloud initiatives and across the technology function. ### What FCA's Latest Cloud Guidelines Mean for the Financial Sector Earlier this month, the Financial Conduct Authority (FCA) published new guidelines on how financial services firms should adopt, and ultimately take advantage of, cloud computing services. Laying out guidance in this fifteen-page consultation document, the FCA has effectively approved the use of cloud services, provided the "appropriate safeguards" are put in place to protect corporate and personal assets. [easy-tweet tweet="The FCA has effectively approved the use of #cloud services, provided the appropriate safeguards are put in place" via="comparethecloud" usehashtags="no"] As part of its newly-published guidelines on cloud and other forms of IT outsourcing, the FCA is quoted as saying that it sees "no fundamental reason why cloud services (including public cloud services) cannot be implemented, with appropriate consideration, in a manner that complies with our [the FCA's] rules." The FCA's latest show of support for cloud computing comes amid speculation that investment in cloud technology will reach $131 billion by 2017 — a figure that's sure to spur some industries and sectors into adopting the technology sooner rather than later. But what does the FCA's new stance on cloud computing mean for individual businesses, firms and organisations within the financial sector? John Salmon, head of financial services at Pinsent Masons and founder of the financial news site Out-Law.com, believes the FCA's decision to approve the use of cloud computing represents a significant leap forward for businesses in the financial sector. He said: "It is really positive for the FCA to recognise that the financial services sector can move ahead with plans to use cloud services, as long as appropriate safeguards are put in place. This is consistent with the regulator's efforts to promote innovation in the sector and should help more firms benefit from cloud solutions." For businesses that choose to implement cloud technology, the FCA has predicted that deploying it will serve to improve competition in the financial marketplace — helping small and medium-sized firms compete on a more level playing field with larger, more-established organisations. By outsourcing their IT services, finance companies will enjoy greater flexibility and efficiency in the market, something which could prove lucrative for the firms and their customers. [easy-tweet tweet="#Data protection should be at the forefront of their cyber-security mandate" via="AVRInternation3" hashtags="cybersec"] However, as with the integration of any new technology into a long-established marketplace, there are elements of risk which financial services companies will have to assess, mitigate and manage effectively if they're to succeed in the proficient deployment of cloud services. By outsourcing their IT services to the cloud, finance businesses need to be aware of the potential security pitfalls involved in hosting sensitive data in a network with variable levels of security control.
For this reason, data protection should be at the forefront of their cyber-security mandate, and the appropriate measures should be put in place to guarantee the safety of data and information. Security concerns shouldn’t deter finance businesses from implementing cloud-technologies, however. Once the recommended security measures are put in place, cloud computing can streamline processes, enable increased regulation and improve business continuity — effectively helping businesses better tailor the services they provide. ### Why work with an AWS partner? A couple of weeks ago I attended an AWS Summit, one of a series of events where Amazon Web Services presents its latest features to its users and evangelises its cloud services to professional prospects. Apart from the main auditorium, there was one major exhibition area: the Partners Zone. You could ask yourself: why is AWS leaving so much of the field to third parties? Don’t they have an enterprise strategy like IBM or HP?  The answer would be: no, they have a completely different business model than traditional IT suppliers. And, yes, they sure do have an enterprise strategy, probably one of the most powerful this industry has ever seen.  But, importantly, AWS’s core business is based on service innovation. They focus on broadening and improving the tools, processes and infrastructure that have disrupted a whole market and that keep them at the top of all cloud platforms rankings in the world. However, they do not manage services: that is where the partners come in. [easy-tweet tweet="AWS does not manage services: that is where the partners come in" user="sergi_mkt" hashtags="AWS, Cloud"] Instead of building a market channel, a sales network and hundreds of local management teams from the ground up, the leader of the so-called public clouds trusts certified managed service providers to advise, build and deploy tailored solutions to end customers. Meanwhile, AWS concentrates on improving its efficiency and economies of scale, and partners take advantage of one of the most advanced modern cloud infrastructures.  And the customers, what do they get? After two years helping organisations migrate to AWS, I see four main benefits a company can gain from dealing with an official AWS partner: Expertise: like with anything in life, when you are about to try something new it is wise to have someone who has done it before by your side. A DevOps team with experience on many other projects, ideally from the same vertical as you, can save huge amounts of time – and money! Proximity: Having a single, local point of contact for dealing with all their cloud issues means a lot to many customers. They want a specialised team that speaks their language, knows their people and business and, in most cases, that can monitor their platform 24/7 to proactively identify and resolve problems. Simplicity: while the partner takes care of all the tough stuff, you can focus on improving your applications. Some partners can also deal with all the billing so that the customer only gets one bill with both infrastructure and service management costs. Continuous improvement: your application, your users’ behaviour and, ultimately, your business, change constantly – shouldn’t your cloud platform? AWS experts can help you in predicting changes, optimising resource use and detecting opportunities. Also, a team that follows the latest AWS news, service upgrades and discount announcements can ensure that you are always up-to-date. 
These are four simple yet powerful reasons that make the AWS Partner Network one of the most efficient business environments in the tech market nowadays, and partially explain the high adoption rate Amazon Web Services has achieved in so many countries and in so short a time.  [easy-tweet tweet="The #AWS Partner Network is one of the most efficient business environments in the #tech market" user="sergi_mkt" usehashtags="no"] Of course, a company can build its own team to manage a cloud solution on AWS. As long as they are expert technicians, officially certified by AWS and with a proven track record in managing cloud services across multiple scenarios, they can be in charge of something that critical. However, the company behind them should also make sure this IT team keeps up to date with all AWS service releases, infrastructure improvements and cost reductions. Otherwise, they run the risk of becoming technologically outdated. At the end of the day, it all comes down to how crucial your cloud platform is to your business, and how short you need your target time-to-market to be. ### The Economics of the Cloud Already one of the defining trends of this decade, big data is revolutionising industries from manufacturing to meteorology through the power of predictive analytics. Big data has the potential to change the way we see the world. But ever-growing volumes of data demand fully flexible storage systems to accommodate this growth, all of which adds cost and complexity to any IT architecture. To illustrate, nearly three quarters of respondents to Tech Week Europe's July 2015 data storage report stated that high cost is a major problem when looking for storage solutions.  [easy-tweet tweet="The #Cloud is a lot like mobile phones - contract lock-in is scary and needs to change" user="WANdisco" usehashtags="no"] The cloud offers an attractive alternative. It has the potential to reduce costs and deliver a greater degree of flexibility; however, a few challenges remain as we start to question how we move the data either to the cloud or, at some point in the future, between cloud providers. It is unsurprising that companies looking to leverage cloud economics as part of their IT architecture may have concerns about changing their cloud provider further down the line. The difficulties presented by movement between providers within the cloud can create a 'lock-in' situation, driven by cost, security and practicality. During lock-in, businesses are compelled to stay with their original provider due to the challenges faced by movement within the cloud. This inertia threatens to remove one of the main economic reasons for moving to the cloud: the ability to choose the most cost-effective and flexible IT architecture. Although at many levels providers want to deliver the best possible solution to meet the needs of the customer, they have little incentive to make it easy to switch, which could potentially lose them business. A good example is the mobile phone industry where it is easy to change providers today, but in the past it was a lot more difficult. Another example is the banking and energy industries where various initiatives have looked to make it easier for customers to switch providers. As a result, prices have come down and companies have realised they have to be more competitive to attract and retain their customers. We are yet to witness the same level of flexibility among cloud IT service providers.
Transferring between cloud vendors is not just expensive, but can also take a great deal of time Transferring between cloud vendors is not just expensive, but can also take a great deal of time, putting unnecessary pressure on IT teams. Each cloud provider has a different infrastructure, causing complexities when moving between vendors. A leading IT media outlet recently changed cloud providers, and found that their new provider stored data in a different format to their existing provider. In order to switch they had to spend a great deal of time migrating and checking the data before it could be transferred, costing considerable time.  [easy-tweet tweet="The future of the #cloud depends on its ability to provide a flexible service for all" user="WANdisco @comparethecloud"] Companies switching cloud provider also worry about security. In order to transfer data between providers it is necessary to move the data temporarily out from behind the firewall. This exposure leaves the data vulnerable, sometimes for several hours whilst the transfer takes place. It is understandable why companies may be unwilling to change cloud provider, despite the potential long-term cost and service benefits. Solutions to these challenges are out there. WANdisco helps firms to migrate to the cloud without disruption and allows customers to move between different providers in a public or hybrid cloud environment, thus enabling them to extend their data centre. We believe that companies should be allowed to choose and change their cloud vendors at will, avoiding vendor lock-in. It is vital at this early stage in the development of the cloud that movement between providers is achieved as seamlessly as possible, encouraging customers to continually seek value for money. The future of the cloud depends on its ability to provide a flexible service for all. ### Commsworld, Axians & Juniper Networks in Scotland's Gigabit Revolution Leading network systems integrator Axians has been selected by Scottish-based Internet services provider Commsworld to deliver ultra-fast 10-gigabit Internet connectivity as part of a major upgrade to its Fluency network, making this the biggest single investment in the network to date. The additional bandwidth provided for Commsworld puts the company in an advantageous position to capitalise on the continued shift towards a digital economy. [easy-tweet tweet="Leading network systems integrator @AxiansUK has been selected to deliver ultrafast 10-gigabit Internet connectivity" via="no" usehashtags="no"] Axians has enabled Commsworld to develop a state-of-the-art network that spans the length and breadth of the U.K., providing a reliable backbone for its next-generation network. Commsworld’s Fluency network operates 21 points of presence (PoPs) in carrier-neutral data centres and has an extensive local-loop-unbundling (LLU) footprint across Edinburgh, Glasgow, Aberdeen and Inverness. Charlie Boisseau, Commsworld’s CTO, commented: “In the age of fibre optics and next-generation networks, there is no excuse for poor quality Internet services in Scotland – and customers rightly deserve more. 
Axians has enabled us to create a best-of-breed network based on Juniper Networks® high-performance technology to deliver fast and reliable connections across the country, helping shape the future of Scotland’s Internet community.” Fluency’s new core network is comprised of SDN-ready Juniper Networks MX480 3D Universal Edge Routers in each of its four core locations in Edinburgh, Glasgow, London and Manchester. In addition, Axians upgraded many of the metro access network nodes in Edinburgh and Glasgow with Juniper’s MX104 3D Universal Edge Routers to increase port capacity and enable multi-10Gb/s capacity over its fully optical backhaul infrastructure. Gerard Allison, senior vice president EMEA at Juniper Networks said: “Commsworld provides cutting-edge infrastructure solutions for Scotland’s many fast-evolving organisations, helping them provide outstanding online experiences to their customers. Commsworld’s selection of our next-generation MX Series routers via Axians further demonstrates their commitment to network innovation. This deployment helps Commsworld future-proof its business against fast-changing market demands and stay ahead of customer requirements.” Axians’ experience in designing and deploying carrier-grade networks built on Juniper Networks technology provided the skills and capabilities necessary for the project. Ian Parker, professional services consultant at Axians explained: “The Fluency network upgrade is a major initiative; the demands posed by the growing number of customers, applications and end-user devices require greater bandwidth, performance and flexibility, without sacrificing reliability in any way. Therefore, the technical choices are critically important to ensure a secure and reliable end-user experience for the businesses of Scotland. Juniper Networks’ proven MX Series 3D Universal Edge Routers,  which include physical and NFV/VNF-based platforms, were selected to reduce network latency and provide an agile, high-performance and scalable carrier-grade network. We’re delighted to have been involved in such a significant expansion, which promises to positively impact Scotland.” Commsworld’s Fluency network now forms the vital linchpin for several revolutionary projects with CityFibre, namely in supporting the Edinburgh CORE and Aberdeen CORE, as well as being selected for the City of Edinburgh Council’s wide area network (WAN) in a recent contractual win. Further projects are also in the pipeline, which will see Scotland becoming one of the most digitally-connected locations in the world. ### Security Concerns and Lack of Visibility Hinder Cloud Adoption say 65% of IT Pros Netwrix has released the results of its global 2015 Cloud Security Survey that show when it comes to migrating to the cloud, 65% of companies are concerned with security and 40% worry about their loss of physical control over data in the cloud. In particular, 69% of companies are afraid that migration to the cloud will increase the risks of unauthorised access, while 43% worry about account hijacking. Security is gaining increasing attention from cloud technology and service providers, but the lack of visibility into sensitive data stored externally raises fears that are still holding back wider cloud adoption. 
Netwrix surveyed more than 600 IT professionals worldwide, representing technology, manufacturing, government, healthcare, finance, education and other industries, to answer questions about cloud security, expectations from cloud providers and measures being taken to ensure data security. [easy-tweet tweet="71% of enterprises think of continuous auditing as an important part of ensuring data integrity in the #cloud" via="no" usehashtags="no"] Other key survey findings show that: A hybrid cloud deployment model is preferred by 44% of respondents as they transition from an on-premise infrastructure to a cloud-based model. Private clouds attract 37% of organisations prepared to invest in additional security. Companies migrating to the cloud plan to enforce internal security policies: 56% plan to improve identity and authentication management; 51% will utilise encryption; and around 45% of medium and large enterprises plan to establish auditing of changes and user activity. Overall, 13% of organisations reject the idea of adopting of cloud technology in the near future. However, 30% of them are ready to reconsider their decision as soon as cloud security mechanisms are improved. Some 30% of organisations already take advantage of improved cloud security, while more than 40% of organisations are ready to invest in additional security guarantees, if offered. Overall, 71% of enterprises perceive continuous auditing of cloud infrastructure as a very important part of security guarantees to ensure data integrity in the cloud. “We wanted to find out what are the exact reasons that prevent companies from cloud adoption and taking advantage of all the benefits it offers,” said Alex Vovk, CEO and co-founder of Netwrix. “The survey revealed that even though cloud is not a new technology, the market has a good potential to grow further. Advanced security solutions and true visibility into what is going on across the cloud infrastructure will help companies minimise security risks, take back control over business-critical assets and accelerate cloud adoption.”  Learn more about cloud technology concerns and download the 2015 Cloud Security Survey: http://www.netwrix.com/go/cloudsurvey2015 ### Ultimate Guide to Cutting Sales and Marketing Costs for Cloud Services Or "How to focus on talent and techniques that will actually make a difference" Are you looking for a way to slash your sales and marketing budgets? Why not try our CSiaB solution (Cloud Salesperson in a Box). Contents of the box include a life-size cardboard cut-out of a cloud sales person with a winning smile. Also included are audio files and presentation materials to enable your CSiaB to recite the cloud mantra as follows: [quote_box_center] The worlds of business and IT are changing! New disruptive players like Uber and AirBNB are a threat to us all. The future is cloud – but right now you have legacy systems so we’ll focus on Hybrid Cloud. The answer to your Hybrid Cloud needs is an integrated solution that links public and private clouds seamlessly. We have the answer. A new Cloud offering that does this all and does it better than everyone else’s cloud offering in some undefined way.  [/quote_box_center] For even better results try the deluxe CMiaB solution (Cloud Marketeer in a Box). Contents of this box include a life-size cardboard cut-out of a cloud marketeer with an ever bigger smile. 
The deluxe CMiaB solution also includes audio files that last longer and presentation materials in brighter colours to recite the cloud mantra with even greater impact. [easy-tweet tweet="Purchase your CMiaB solution from @Comparethecloud today! Free Lifetime No #BS guarantee!" user="billmew" usehashtags="no"] Also available for budget-conscious clients that lack storage space for these life-size cardboard cut-outs are CSaaS (Cloud Salesperson as a Service) and CMaaS (Cloud Marketeer as a Service). Included in these packages are videos that can be streamed live and projected onto screens to look and sound like sales or marketing staff – again with winning smiles! As a special promotion this month, bundled with all these solutions, we are including CEaaS (Cloud Education as a Service) – this service provides unlimited access to the Compare the Cloud podcast library (WARNING: listening to too much Andrew McLean and Neil Cattermull can be hazardous to your sanity). Seriously though, how many events and conferences have you been to recently where almost all the keynote speakers have been delivering variations on the same cloud mantra – the one cited above – without clearly explaining what their differentiation is? At least with the large public cloud providers you have some clear differentiation. On top of this, with the array of services from the AWS marketplace you have plenty of choice, while with AWS Glacier at only $0.007 per gigabyte per month (yes that's right – less than a cent!) you've got some amazing value services as well. In the private cloud camp all the vendors are converging on VMware or, as is increasingly the case, OpenStack (CloudStack and Eucalyptus having both bitten the dust). With all these vendors offering the same cloud mantra with the same technology stack, understanding the differentiation between their competitive offerings is sometimes hard to do. The unfortunate truth is that in all probability: Your value proposition and differentiation isn't all that well understood. What you do isn't necessarily seen as unique by your target market. Others could copy you if they were motivated to do so. When was the last time you got an independent third party to quiz your clients and prospects to get a real idea of how well they understand you and value you? Indeed what kind of competitive intelligence do you have on your market? [easy-tweet tweet="What kind of competitive intelligence do you have on your market?" user="billmew @comparethecloud"] The truth is that intelligence-based marketing is the ONLY way that you are going to create and sustain a differentiated brand. However, this doesn't mean investing more in traditional marketing techniques or in marketing headcount. Instead you need to focus on: Influencer Marketing: if you don't know who the main influencers are then start by looking at our monthly #CloudInfluence Rankings. If you don't have similar intelligence for the markets that you serve then you are competing at a disadvantage. Content Marketing: note that "90% of all content marketing gets NO reaction. That's massive wasted effort and opportunity cost." So you need to stop wasting your effort and resources on tactics that don't work and start applying rigorous data-driven intelligence to fine-tune your content creation and targeting. Social Selling: if your sales teams aren't active on social media and actively building their following and social connections then they simply aren't present in the same circles as your prospects.
Talent Acquisition: with marketing automation tools you don't need a large marketing headcount. You are better off focusing on a small number of really talented individuals who can really make a difference. The reality is that anyone who isn't top talent in one or all of the key areas above is little better than a cardboard cut-out! ### Zynstra Unveils New Line of Cloud Managed Server Solutions Zynstra, the pioneer in enterprise-class hybrid IT for the SMB, has unveiled its new range of cloud managed server appliance products. Using award-winning virtualization and cloud management technology, Zynstra's products are designed for organizations who want a secure, reliable IT infrastructure, without prohibitive management costs. Available immediately, Zynstra's new product line-up delivers advanced, cloud managed, cloud integrated, secure and reliable IT that can translate to TCO savings of up to 60%. Now, organisations with single or multiple sites can enjoy hyperconverged server technology delivered 'as a service' and managed from the cloud, making possible new levels of efficiency in buying, running and maintaining critical on-site IT. "We wanted to control our costs and deliver a consistent IT experience across our international sites," said Tim Newton, Information Technology and Systems Manager at Tristel. "The Zynstra solution allows us to deliver the IT our branches need, with centralised control." Zynstra's four new cloud managed server appliances run on HPE's ProLiant Gen9 hardware. They deliver modular and integrated storage, compute, and network capabilities. The new range includes an enhanced file system infrastructure, enabling increased storage efficiency locally and in the cloud. Intrusion detection and an ICSA Labs certified firewall are included for security, alongside a network gateway with caching for accelerated performance and network domain management for easier and secure system control. In addition, customers can run any server application, with integrated cloud management, backup and disaster recovery. Single sign-on integration to Office 365 is included. Zynstra's offering is delivered as a service and always kept current, avoiding prohibitive end-of-life costs, and ensuring software is always consistent across one or many sites. The appliances leverage Zynstra's patented virtualisation and cloud management technology to deliver modular and integrated storage, compute and network capabilities to all sites, regardless of size, remoteness, or local support availability. [quote_box_center] The new product range delivers servers designed for the cloud era. Features include: 'As a Service' economics – with predictable monthly costs Pre-built, pre-integrated server appliances; hardware configurable to a customer's specific needs, for rapid and optimal deployment from day one Memory and storage 'packs' to enable each implementation to stay right-sized throughout its life Ready to run and manage in minutes not hours Maintained 'as a service', to remove time-consuming day-to-day hardware and software maintenance, while ensuring the solution is always up to date Integrated with, and securely managed through, the cloud for maximum visibility of hardware, software and service performance, from wherever you are. [/quote_box_center] "Combining the latest ProLiant servers with Zynstra's unique virtualization and cloud management technology means we can transform IT for the SMB," said McLeod Glass, VP & GM, HPE Tower Servers and SMB Solutions.
“It’s IT made easy.” “This is the perfect solution for anyone who needs to manage servers on one or many sites that can’t be moved to large centrally hosted data centers, for reasons of security, performance, cost, or control” said Nick East, CEO and founder, Zynstra. “By deploying our solution, these sites can enjoy centrally managed but remotely distributed IT, as-a-service, with cloud-like economics. It’s ‘how IT should be’ for servers in the cloud era, and is especially suited to teams who need to manage IT in multiple remote locations.”   ### Data roaming charges –are we really free to roam? Earlier this year it was announced that data roaming charges will be abolished within the EU by 2017. This agreement, requiring telecom operators to treat most mobile data equally, has been long-awaited, particularly in light of US adoption of net-neutrality rules. For the first time in Europe the principle of data neutrality will be reached, which means internet service providers (ISPs) no longer need to favour national Mobile Network Operators (MNOs) to keep costs contained. Starting from April 2016, telecoms operators will only be allowed to add a stated surcharge for the services, making roaming within EU up to 75 per cent cheaper. With this upcoming decrease in revenue will it have a negative impact on service? [easy-tweet tweet="Data roaming charges will be abolished within the EU by 2017" hashtags="mobile, dataroaming"] Why roaming costs It all started back in the days of fixed line telecommunications; if you wanted to call someone overseas they needed to have a phone on the same network to answer. It was a way of funding international network development and compensating them for connecting the call. Otherwise the “other end” received no money as typically only the person making the call gets charged. The introduction of fixed line competition within countries meant that the model was extended to also cover national traffic. It all started back in the days of fixed line telecommunications... When mobile telephony started it followed the same process except you took your phone with you and “roamed” onto another network. Following a period of international roaming you were still billed in your home country but the local networks, who ran all the masts, had to be paid and this is done via the international roaming rate they set. If each operator sends each other the same amount of traffic then no one cares much about the actual rate as it nets off, but if the traffic is out of balance then it can be very lucrative. The more out of balance it is the less one party wants to re-negotiate the rate. Therefore, as the underlying cost of delivering a call has fallen the “roaming” part has done so at a much slower rate.  This means local prices have fallen faster than international roaming rates.  When most of the people paying these international rates were business users there wasn’t much drive to correct this, but as travel within the EU increased more people complained about the costs. This prompted the EU to do something positive and regulate reductions in the roaming rate. What does this mean for consumers? [easy-tweet tweet="The removal of roaming charges does not mean data will be free" user="comparethecloud" hashtags="mobile, dataroaming"] The removal of roaming charges does not mean data will be free; consumers will simply pay the same for data in each EU country. There have been discussions on whether data will become like many mobile contracts, and become ‘Pay as you go’ or ‘Unlimited’.  
This, however, is a long way off; it will take many years to be applied globally, and the market is already seeing some pushback on this. Even for service providers, there are significant benefits to the 'eat as much as you can' model, as it's much simpler to invoice and so the related back office/infrastructure needs are much smaller. However, the network still has to be paid for and there is always a point when it will be perceived as unfair that someone's phone isn't connecting because another is consuming capacity streaming music all day. What are the challenges to the providers? Convincing customers that other aspects of the service are worth paying for, and that the quantity of data consumed is increasingly no longer the major part of the overall cost, are the key challenges for providers today. At the same time, providers also need to encourage additional services that deliver more value and will inherently use more data, giving the service provider business revenues a natural lift. Ultimately the news of the charges being scrapped is great for consumers around the EU. The threat of bills worth hundreds and hundreds of pounds is soon to be a thing of the past. However, one thing is for certain - mobile phone providers will need to readdress their pricing in other areas, to compensate for the inevitable loss in revenue they will face. Consumers will need to be aware of these potential changes and ensure they gain transparency from their providers when looking to take out or renew a phone contract. Operators too will need to take care when looking to restructure pricing, guaranteeing they remain competitive and don't alienate their customers. MNOs aren't greedy but the charges did need sorting. They will, however, have to find new ways of funding the continued investment in infrastructure, as this was one of the means to do so. ### One year on: Apache Spark continues a winning battle against data management misery Take a deep breath and herald the arrival of Apache Spark – the missing link in data management set to make all our lives easier. I'll explain why Spark is so good shortly, but first let's get started by putting it into context. Back in the old days of IBM DB2 and Oracle, we used to look at our data in very structured form such as tables and spreadsheets. No surprise there, considering that tabular data and relational databases were the most efficient way to manage data with the computing power we had. Normalised, structured data spawned a very powerful ecosystem of patterns and tools to analyse data. [easy-tweet tweet="The days of structured #Data are done with says @ArtyomAstafurov" user="comparethecloud" hashtags="cloud"] All good stuff, yes? No. These systems only worked as long as the data was structured and normalised, and the format of the data sources was deterministic. The old systems failed to cope when the landscape dramatically changed, i.e. when we added things such as social feeds, news streams, and sensor data into the mix. This is data that we now call unstructured data or, more commonly, Big Data; in other words, data that couldn't be analysed with the tools and methods we inherited from the days when normalised and relational data prevailed. What did we do about Big Data? The solution came from Google and Yahoo! In 2004 Google published a paper on an algorithm called MapReduce, which streamlined analytics of Big Data, making it easy to run on multiple parallel machines. A couple of years later, Yahoo!
released Hadoop, which for many years was the industry standard for analysing Big Data. So what issues do we face today? Unfortunately for Hadoop and similar tools, technology never stands still. Big surprise there! System configuration management has become a significant factor in how fast we are able to develop and test a new analytics algorithm, and it is something we now have to adapt to. Enter Apache Spark Enter Apache Spark. The timing couldn't have been more perfect. A project that started in 2009 in the AMPLab at UC Berkeley has quickly evolved into a top-level Apache project with over 465 contributors by 2014. We are now witnessing its genius as it has unprecedented positive effects on the data management industry. [easy-tweet tweet="In contrast to #Hadoop's implementation of #MapReduce, #Apache #Spark provides performance up to 100x faster" via="no" usehashtags="no"] In contrast to Hadoop's implementation of MapReduce, Spark provides performance up to 100 times faster. By allowing user programs to load data into a cluster's memory and query it repeatedly, Spark is well-suited to machine learning algorithms. Spark allows us to interactively explore data. Say goodbye to the days when we were forced to model data with one set of tools and then implement and run our models with another, and say hello to a more streamlined data management approach. Spark closes the gap between data discovery and running analytics in production, giving an all-in-one approach to looking at data using a state-of-the-art functional programming approach. But hold your horses, because it gets even better. In the very near future Spark will continue to change and revolutionise the way we look at data, giving us the foundation to get the most out of it in a lean and agile manner. ### Uncovering the Truth About 5 Cloud Adoption Challenges There's no denying the fact that cloud adoption is on the rise. For proof, look no further than Cloud Sherpas' 2015 Enterprise Cloud Report, which found that 82% of enterprises identify cloud technology as a key part of their IT strategy and 75% are already implementing or using at least one cloud application. However, there's no denying the fact that many organisations still harbour concerns about adopting cloud technology. According to a 2014 study from IDG Enterprise, concerns about security (61%), integration (46%) and information governance (35%) top the list. [easy-tweet tweet="61% of people still have fears around #security when adopting #cloud" user="cloudsherpas @comparethecloud" usehashtags="no"] While these factors have each co-existed up until now, we've now reached the point of intersection, as organisations now face the prospect of adopting cloud technology or falling behind the competition. As a result, it's more important than ever to overcome any challenges that exist. But is it possible? The truth is, most organisations consider these to be greater challenges than they really are. In any case, you can overcome them with the proper strategy. With that in mind, here's what you need to know about overcoming the top five cloud adoption challenges. 1) Security Challenges When you get down to it, you'll find that security concerns aren't just confined to the cloud. Security has always been a concern in the on premise environment too, and with good reason, because we always need to keep security top of mind. As a result, security isn't a cloud-specific challenge.
[easy-tweet tweet="#Cloud #security issues have come about because organisations have failed to pay enough attention" user="cloudsherpas" usehashtags="no"] That said - security is still a challenge. While we should never stop paying attention to security, there are many organisations that have had some roadblocks to overcome in this area on their cloud journeys. In most cases, cloud security issues have come about because organisations have failed to pay enough attention to or make enough investments in this area. Overcoming these security challenges is a matter of developing and enforcing a comprehensive security policy that addresses the data of both you and your system. 2) System Challenges System challenges include integration, interoperability and system availability, each of which presents its own unique concerns. Integration is another challenge that is not cloud-specific. In fact, the cloud can make integration less challenging due to the resources made available by SaaS providers. Knowing how to approach integration in the cloud is crucial, as this should be different than you would on premise. The key here is to take a more holistic approach rather than building point-to-point integrations as needed. Looking at interoperability, we do have a cause for concern. The best thing to do here is to understand that the cloud market is maturing every day and to examine your current IT landscape to determine what the best fit is for your organisation. system availability is about doing your due diligence when it comes to selecting your cloud providers Finally, we have system availability, which is mainly a challenge on the front end of cloud adoption, as it’s all about doing your due diligence when it comes to selecting your cloud providers. Leading cloud providers run services for thousands of businesses, so they should have high redundancy and failover plans to maintain availability. If you do your research ahead of time and make sure your provider meets your standards, you should not face any challenges down the road. 3) Cost Challenges We can also break down cost challenges into a few different areas of focus: Implementation, structure and ROI. Challenges around cloud implementation costs are another case of concerns that are not cloud-specific. There are costs that come with implementing any technology, and that doesn’t change in the cloud. What does change in the cloud though is the structure of costs over the lifetime of these solutions. Whereas on premise technologies typically require a one-time investment upfront, the SaaS model is based on recurring costs. This change doesn’t have to be a challenge though; it’s simply a matter of preparing for the difference. You’ll also find that many of the “hidden” costs that came with on premise technologies (e.g. maintenance, upgrades) disappear in the cloud. Finally, the idea that the cloud lacks an effective ROI is absolute perception. The ROI your organisation realises all comes down to how you’ve built your cloud strategy and whether or not you have a strong business case for adopting any given solution. But if you do, the ROI in the cloud is very real. This recent data from Salesforce and IDC is evidence of that. 4) Legal and Compliance Challenges [easy-tweet tweet="The truth is regulators do not typically prohibit the use of #cloud technology" user="cloudsherpas @comparethecloud"] The truth is regulators do not typically prohibit the use of cloud technology. 
While you must pay particular attention to factors like data residency and security, once again this is not something that is cloud-specific. Meeting these requirements in highly regulated industries can be challenging, as it can be on premise. To overcome this, you need to do your due diligence about the regulations for your industry and region as well as what protections your cloud providers offer. Steps to take to help in this area include creating a cloud risk management strategy, engaging with regulators early on to discuss your cloud adoption activities, sharing any risk concerns with your cloud providers, managing any risks through ongoing measurement of performance and ensuring your IT governance framework contains a cloud strategy that's aligned with your regulator's standards. 5) Organisational Challenges Some of the most common organisational challenges include fear of/resistance to change and a lack of cloud skills among IT teams. While both of these challenges are very real, the best way to overcome them is to face them head on. For example, looking at fear of change, you'll want to introduce a change management programme that drives awareness and clearly communicates with users. In terms of IT skills, proper training can go a long way. Augmenting your internal IT team with an external partner can also help close any gaps. In general, educating your organisation on the value that cloud solutions can provide and dispelling myths surrounding the other challenges listed here can go a long way toward easing challenges in this area. No Challenge Is the Be All, End All To Cloud Adoption Yes, cloud adoption challenges exist, but many of these challenges have existed for a long time in the on premise world too. While it's important to identify challenges and pay them extra attention, it's also essential to recognise that no challenge is strong enough to prevent cloud adoption. As outlined above, it is possible to overcome common challenges to cloud adoption; it's simply a matter of proper planning and education when it comes to researching solutions and vendors, preparing your IT team and users and developing your cloud strategy. ### Calm down, this 'cloud' concept is not new… I started in the IT industry (with a full head of hair) as a youthful 18-year-old. During that time, and over the 27 further years in what is now the tech (or rebranded 'cloud') world, I have seen plenty of hyperbole and marketing enthusiasm for products, concepts and services. I sit here today (still in my prime I might add) and I hear words such as Uberisation or phrases such as 'digital disruption' or 'being Ubered'. So let me take you on a journey back in time to the years 1999-2001 and let's see what has changed. [easy-tweet tweet="CTC's @NeilCattermull looks at how tech hasn't changed in the past 15 years" via="no" hashtags="cloud, tech"] We all use the Internet on mobile devices and tablets today. Yes we do, but in 2000 I used my Compaq iPAQ, and when I needed Internet on the go I used my Nokia with a connection lead to dial into the Internet. We are truly mobile today. Hmm, so we never used laptops back in the year 2000? I had an awesome Toshiba in those days... Business will be conducted over the Cloud. Yes, and the precursor to cloud was the Internet; anyone remember the promises of the .com era? How Internet companies would disrupt traditional retail forever.
Some did, Amazon.com for instance, but who remembers Value America or the other .com losers? My view is that cloud is just a repositioning of the platform. And let's not forget one big winner from this age: Dell… The Cloud is centralised computing. What was the Mainframe then? And did we not bring centralised computing services back with the Citrix WinFrame product set? Weren't Citrix and early forms of Microsoft Terminal Services just a precursor to bringing Intel-based centralised computing to a corporate audience? Cloud is a recent development. So we never used services like Hotmail or Yahoo email in the year 2000? What about all those other services that allowed us to store files, such as newsgroups or corporate portals? We are in the midst of a new industrial revolution. Yes, just the same as the .com era. Who remembers the lunacy of the early promises offered by ASPs (application service providers)? This was the first step into the SaaS, or software as a service, marketplace. The revolution in technology is nowhere near as dramatic as the hype suggests; nothing has radically changed the marketplace. Social Networking allows us to communicate like never before. So we never used MSN or Yahoo messenger for near real-time communications? Is text messaging still not the predominant messaging platform? The point of this article is to try to alleviate the concerns of traditional IT resellers, and those who wish to sell or broker cloud and SaaS services, but find the market confusing. Don't worry, you're not alone. [easy-tweet tweet="The #technology space is always going to be full of hyperbole and tales of disruption" user="neilcattermull" usehashtags="no"] The technology space is always going to be full of hyperbole and tales of disruption; the key to any evolution is to measure the impact whilst remaining calm. Whilst there is no doubt that cloud has an impact on areas such as billing, security and application development, as a platform it represents a small fraction of the trillions in IT expenditure. So my message is this: not a lot has radically changed in the last 15 years. If you're thinking of adopting cloud services, take a deep breath, ignore the marketing hype, get informed and make your choices based on hard facts, not cloudwash. ### London, Frankfurt and Amsterdam Form the European Cloud Premier League Paris is relegated to join Ireland and Scandinavia in a lower league, while Spain and Italy lag far behind. With data sovereignty an issue and with Safe Harbour an increasingly distant memory, what are the locations that you need to consider when choosing to locate a data centre? The European data centre co-location market has grown by nearly 6% in 2015 Go where there's demand The European data centre co-location market has grown by nearly 6 percent since the beginning of 2015, with supply increasing by 11MW in the last quarter to reach 816MW, according to global real estate advisor CBRE. This growth follows investment of $8.7 billion in the sector in the second quarter, driven largely by increased M&A activity. According to CBRE, London had the most take-up for the second consecutive quarter across the main European markets that it describes as FLAP – Frankfurt, London, Amsterdam and Paris. Check out the latest CBRE European Data Centre report here and make sure you study the special editorial viewpoint from Compare the Cloud on page 5.
Join the Crowd If you want to be able to interoperate with other cloud providers with minimum latency then it makes sense to locate where everyone else does. Major players like Equinix have a presence in several European cities; Equinix aggregates thousands of peering sessions onto a shared fabric, connecting peers at 19 Internet Exchange Point (IXP) locations in 17 global metropolitan areas. In Europe Equinix partners with ALP-IX in Munich, AMS-IX and NL-IX in Amsterdam, DE-CIX, KleyReX and DataIX in Frankfurt, ECIX in Dusseldorf, and LINX in London, and operates CIXP in Geneva in conjunction with CERN. Notably missing from the list is Paris, which doesn’t have an Equinix Internet Exchange Point. Stand next to the big fella If your clients are going to be looking to manage hybrid environments then proximity or connectivity to the major public cloud vendors will be advantageous. Having initially based their European operations in Dublin, Amsterdam and Frankfurt, the mega-scale players (AWS and Azure) have announced that they are coming to London. While the other players don’t have a presence yet in Paris, IBM SoftLayer does – currently serving all four FLAP locations as well as Milan. Be on the Grid [easy-tweet tweet="You want to be able to link up to other major #datacentres across the globe" user="billmew @comparethecloud" usehashtags="cloud"] You also want to be able to link up to other major data centres across the globe, and it pays to go for a location that is served by multiple carriers to give you the opportunity to get the best rates. Again, it pays to be located near Internet Exchange Points such as those from Equinix, where most carriers tend to connect. And again, Paris isn’t as well served by exchange points and carriers as its main FLAP rivals. Get yourself a tax break Certain locations, such as Dublin, are attractive due to the favourable tax regime in Ireland, but many other locations are seeking to offer competitive tax breaks too. Keep cool With power one of the main costs and cooling one of the main power requirements, locating in a cool location, such as Scandinavia, has significant cost advantages as long as you’ve got the right connectivity with the rest of the world. Availability Of course you also need to take into account the availability of suitable sites and skills. This depends on whether you’re looking to build your own or buy capacity from others. The CBRE report tracks both available capacity and demand and is a vital reference point. Regulation The final fly in the ointment is regulation. While at a European level the Safe Harbour ruling has forced global enterprises and operators to ensure that they have a footprint in both the US and European markets, there are country-specific regulations that go even further. German privacy rules may be reasonably demanding, but Germany’s membership of the single European market prevents it from being too restrictive. Switzerland and Russia however have chosen to be significantly more restrictive, with the Russian data localisation law going as far as forcing companies that collect the “personal information” of Russian users to store their data on servers within the country and notify the authorities of their physical location when asked. 
[easy-tweet tweet="London is the top location for #datacentre activity just now" user="billmew @comparethecloud" hashtags="CBRE"] Taking all these factors into account and studying the demand and supply trends from the CBRE report, it is clear that London is the top location for data centre activity just now, with Frankfurt and Amsterdam both in close contention. Paris however is falling behind and risks being relegated to a lower league alongside Dublin and the main Scandinavian centres. Meanwhile Russia’s self-imposed exile makes it market for local operation only. Other major European cities currently lack the infrastructure, connectivity and critical mass to compete. ### Mobile business is awesome business “The best companies will be the most mobile” — Tim Cook We live in exciting times, technology and mobility is ingrained into all aspects of life. The evolution of mobile devices has come a long way from the quintessential yuppie accessory through to our current, beloved, all encompassing smart phone. Location intelligence, sensors and real-time alerts are positively augmenting our lives. Mobile phones are smarter and more powerful than ever and Moore’s Law tells us there will be no respite. The majority of mobile devices are loaded with a multitude of sensors. Each opening up an insight into our being. The number of internet searches carried out on mobile are surpassing those on desktop. Your customers are on the go and there’s no stopping them. Each one, with a direct channel to market in their pocket. [easy-tweet tweet="What can you do to help your business stay ahead of the curve? The answer is simple: think mobile-first" via="no" usehashtags="no"] Re-define your business vision Re-imagine on mobile. Re-invent on mobile. Re-architect on mobile. Self-disrupt on mobile. Include in your vision, delivery of remarkable customers experience. Be big, bold, audacious and be a little crazy. Bravely challenge the status-quo; sticking to your core business and doing it brilliantly. Aim to be the best, not the biggest. The abundance of cutting-edge technology will enable you to achieve your Moonshot quickly and realise a ROI even quicker. Love CX [easy-tweet tweet="Customer Experience (CX) is the magic bullet" via="no" hashtags="mobile, business, CX"] Love your customers. They’ll succumb to your advances. Re-kindle your relationship and revive an old flame by giving them an unbelievable experience. Get into their personal space. Be there all the time. Think the customer. Be the customer. Feel the customer. Get close. Get uncomfortably close. Do it well and it’ll be magic, not awkward at all. "Technology alone is not enough" — Steve Jobs Partner up You’ll have some great people on your team and they’ll do well up to some inflexion point. Here, you’ll need to scale-up, in line with your business vision. Doing it well requires experience made through mistakes, which could limit your time to market. Choose a technology partner by selecting a service provider who is passionate about helping you achieve your Moonshots. They’ll have learned from their mistakes and will be firmly immersed in the holy-trinity of business success: Technology, Design & Agility. They’ll be conversant in the latest scalable technology and understand the cloud-first, mobile-first vision you've set out. They will be able to bring industry best practice to you, supported by business acumen to build success. They’ll have a working domain knowledge to help the conversations be more meaningful. 
Their obsession with design will be compulsive. UX will be on their mind 24/7. Be prepared; they’ll know your customer better than you. After muscling in, they’ll come clean and set you two up for a long-lasting, contented relationship. Ensure your service provider is flexible, lean and unashamedly Agile. Mobile metamorphosis can and should be kept simple, and you should both learn from mistakes quickly. Tell them it’s not right. Analyse the data. Tell them again. Focus your learnings on a pivotal next step and keep post-mortems brief. Ensure they deliver critical production-ready value every 2 weeks. Real, measurable, valuable functionality every 14 days. Stipulate that. Meet their top team and challenge them in every way. Speak to their Software Engineers and project managers. Question their MO. Rip their code apart. Tear their designs up. Befriend them. Trust them. Learn what they do for fun and how they spend their weekends. [easy-tweet tweet="Technology alone is not enough — Steve Jobs" via="no" hashtags="CX, Mobility, digital"] Business leverage How have businesses leveraged the power in your pocket? For the love of coffee In the UK we've seen a shift in drinking habits, from the ceremonious tea-loving nation we once were to the laid-back, device-in-hand coffee culture we chill in today. The thought of waiting for tea to brew is so last century. [caption id="attachment_33227" align="aligncenter" width="600"] Great British cuppa?[/caption] Starbucks has a lot to do with that shift. Now, their latest app takes this even further. Once you've registered, the app allows you to manage all aspects of your coffee-drinking experience. You can register your loyalty card, top up credits or even send a gift-card to a friend. Starbucks have made it easy to find your closest branch and ensure it’s still open. It’ll give you directions and allow you to smugly jump the queue with a pre-paid mobile order. Just select, pay and order. Then pick up and enjoy. All that and you can even configure how many shots of espresso you want, along with choice of milk and frothiness! The app can also sync with Apple Watch to further enhance the experience from your wrist. In-flight flicks Once your flight is booked, an air of anxiety sets in: delays, lost luggage, traffic, passport, travel visa. For me, it’s all about the in-flight films. The films! The difference between a flight and a good flight. United Airlines’ approach to this is simple. On selected flights, you can bring your own device and stream whatever you want to watch from their huge media library. No more twiddling with those 1970s channel changers or thumbing those frustrating, sub-standard capacitive screens. They provide free WiFi on those flights and, so long as your device is charged, you can watch films, TV and listen to music through their Personal Device Entertainment app. Bring your own headphones and you are all set for a very comfortable flight experience. Large G&T please. [caption id="attachment_33228" align="aligncenter" width="600"] No bullock-cart needed[/caption] Age old logistics For centuries, humans and animals powered logistics, right up until the midst of the Industrial Revolution. The Fly-esque transportation of matter is still a few decades away. Physical products (still) need to be physically moved. For business, the ideal rests in a cost-effective and reliable logistics solution. Businesses in Delhi are now resolving their logistics woes through the SmartShift app. 
The app uses an Uber-style on-demand model built on Google Cloud Platform. Its USP is a competitive marketplace, bringing together businesses and delivery drivers — on mobile. A requester posts the job with details of pick-up and drop-off locations, cargo weight, value etc., along with a photo of the cargo to be transported. The delivery drivers in the locality can then tender-bid for that job and, once it is agreed, they’ll pick up and deliver as arranged. Payments (all electronic) are settled once a photo of the cargo is verified at the end-point. All in all, a perfect marriage of supply and demand, bilaterally bolstered through a game-changing experience. Ignite your moonshot Mobility is the future for business. Affordable, new-paradigm technology now lets you provide a constant and remarkable customer experience. A beautifully designed, static touch-point which speaks volumes and will be reflected in your top line. Stay ahead of the curve, out-smart your competitors and make your business awesome with mobile. MediaAgility UK Ltd. engineer beautifully designed, intelligent solutions to help clients ignite their moonshots. ### Seven different types of disruptive technologies Disruptive technologies have become a big topic in the IT industry recently, with many large businesses providing products that displace others. In short, a disruptive technology is something that replaces an existing technology – below are a few examples of disruptive technologies that are set to become more prevalent in society. Cloud computing An obvious one for us, this has completely revolutionised how IT and tech firms operate – cloud computing is replacing locally owned hardware. We are now more reliant on large-scale servers as opposed to more local options. [easy-tweet tweet="#Cloud has completely revolutionised how #IT and #tech firms operate" user="comparethecloud"] Internet of things As the world of IT and business is connected, so is the home, and increasingly everything. What might seem like a futuristic practice is quite real and ever-present in a large number of homes. The internet of things is a way of describing domestic products that are easily identifiable and work together to improve productivity. Mobile Internet As mobile internet has become cheaper and more readily available it has started to displace the more traditional desktop-based internet. As mobile speeds continue to get faster and faster we can only expect it to become more prevalent. Renewable Energy Renewable energy is becoming more and more commonplace. What might have once been viewed with scepticism is now seen as the future and the present. Ten years ago you would never have thought that you would see so many Priuses on the road. We’ll surely see the end of fossil fuels sooner rather than later? Robotics [easy-tweet tweet="Over 35% of jobs are in danger of computerisation over the next 20 years" user="comparethecloud" hashtags="robots, IoT, Tech"] One of the most intimidating disruptive technologies – in what might seem like a story from I, Robot, a large number of jobs will soon be performed by robots. According to the BBC, over 35% of jobs are in danger of computerisation over the next 20 years. They even produced an interesting quiz which shows you the chance of your job being performed by robots. 3D Printing Technology This technology is slowly but surely edging out traditional printing and manufacturing techniques. It’s now much easier for an individual to manufacture their own products and devices. 
However, there are some people who think that this practice might be a fad. It’ll be interesting to see if it takes off. Next-gen storage solutions Businesses are now developing technologies that will prioritise data based on its value to the firm. This means that while some applications will get pushed back, the end result is a faster and more efficient end-user experience. ### Application modernization: re-designing solutions for disruptive technology As disruptive technologies change how companies do business in a range of industries, generic commercial-off-the-shelf applications may not offer organisations the speed-to-market or flexibility they need to compete in mission-critical domains. Applications must preserve the organisation’s unique identity, while leveraging new technologies and platforms that offer flexibility, responsiveness, and a modern user experience. I see it every day: organisations desperate to transform an IT operating model and cost structure that has existed for at least 25 years—a quarter of a century—yet ultimately still spending their time and money doing the same things they’ve always done. Sometimes this can be simply frustrating, but at other times it can create a tremendous amount of business risk. [easy-tweet tweet="The business of technology itself has been disrupted by #Cloud" user="epstine" usehashtags="no"] Over the past couple of years, we have seen how businesses have been impacted by disruptive technology, and we have also seen how the business of technology itself has been disrupted. For example, cloud computing is changing how organisations purchase, deliver, and support enterprise systems—moving from upfront acquisition costs with delayed delivery times, to a more immediate model where functionality is delivered on demand and paid for on a monthly basis. The ability to innovate at greater speed is now possible, and the way users experience technology has led to heightened expectations. Not only has this shifted the balance of power from solution providers to their customers—it has changed the business model of technology companies and application providers while forcing them to innovate new products. What is amazing is how the revolution in consumer technologies has impacted the enterprise. Twenty years ago, you had technologies at work that were completely unfamiliar – and often, unavailable – at home. Something considered commonplace today, like email, was almost exclusively confined to the workplace; personal use was just emerging. New operating systems were just beginning to make computing accessible to the casual user. (Shout-out if you remember Windows 95, the excitement of AOL and waiting to hear that voice say, “You’ve Got Mail.”) Having multiple devices – laptop, tablet, smartphone – was virtually nonexistent. Yet access to these technologies and systems in the workplace created a desire and drive for consumer technologies to catch up with enterprise technologies.  [easy-tweet tweet="Shout-out if you remember Windows 95 and the excitement of AOL says @epstine" user="comparethecloud" usehashtags="no"] Now the reverse is true. Consumer technologies have become so inexpensive, so easy to use, and so much a part of our everyday lives that most users expect to have the same type of experience at work that they have in their personal lives. 
Just consider the implications of the Gartner study that suggested that nearly half of the workforce is using personal devices for work. All of this is now driving what we see in the workplace: there is an expectation that applications will be easy to install, configure, and use; that functionality will meet user expectations nearly on-demand, with updates regularly pushed out to users; that you’ll be able to access applications anytime, anywhere, on any device; and that you'll be able to experience full functionality at all times.  The consumer experience has changed what people expect out of technology and how they consume it. Customer expectations and market disruption have changed what it takes to compete: immediacy, innovation, individuality. Organisations are now in a position where good enough business applications are no longer good enough. This is NOT a keep-the-lights-on moment for most organisations. the organisations that win are those that respond with speed, accuracy, and a differentiated solution Look, the past ten years have seen a global financial crisis that is still roiling the global economy, a shift in the workforce with respect to where job growth is occurring, and transformative technological change. If we’ve learned anything from the past decade, it is that the organisations that win are those that respond with speed, accuracy, and a differentiated solution – one that meets the needs of their market, their customers, and their business partners. If you're dealing with applications that deliver that differentiation, they’re often legacy systems that are rigid; that you can't modify to react to the way you want to do business today or to the way the market is saying you need to do business today. You're going to fall behind. If you’re looking at commercial applications, they may offer the user experience or operating model you’re looking for, but they would require time and money to customise them to the point where they deliver the unique processes that set you apart. You're going to have a significant problem competing, either because you can't innovate or react, or because you’re doing things the same way everyone else does. [easy-tweet tweet="It is no longer acceptable for organisations to be constrained by the limitations of their technology" user="epstine"] There is a third way: a path that allows you to maintain your competitive advantage, but frees you from the confines of last-generation solutions. Modernising your application portfolio allows you to deliver cloud-ready, mobile-aware solutions suitable for the needs of a digital economy. Rapidly re-architecting solutions for today’s technologies allows you to meet customer expectations, maintain your differentiated identity, and take advantage of the financial and operational benefits of state-of-the-art solutions. It is no longer acceptable for organisations to be constrained by the limitations of their technology. It looks like the time has come for a change. ### The Use Of Cloud Software In Contact Centres The use of technology in contact centres is allowing them to reach much larger audiences and, by using cost-effective, pay-as-you-go, cloud-based solutions, they can also save money. This is very important for contact centre managers who have varying call volumes. 
[easy-tweet tweet="Using #cloud applications rather than investing in expensive #IT equipment has allowed more businesses to enter the market" via="no" usehashtags="no"] Call Centres are Changing Many of the call centres that advertise their jobs on CCJM will use cloud technology. According to New Voice Media ‘the cloud is simply an alternative acquisition or delivery model for contact centre or customer service systems and applications’. With more companies using cloud applications rather than investing in expensive IT equipment and software, it has allowed more businesses to enter the market. These new technologies are making it cheaper to run a call centre than it was a decade ago. Cloud Security An article on the CallCentre.co.uk website says that 73% of respondents to a survey did not class cloud security as a risk. Nearly all of those asked said that ‘good providers generally have better security than most of their customers’. The report also explains ‘serious cloud-based solution providers have invested very heavily in physical and logical security which many organisations have not done’. Interaction Analytics The online magazine Business Wire explains how at least 12% of contact centres in the UK have started to implement interaction analytics. This means that contact centre agents can engage with their customers by using more than one point of contact. This includes email, chat and voice, but will also allow the company to use many other social media platforms which will help improve productivity. There are approximately one million people working in the call centre industry and this figure is only expected to rise. Flexibility Once a company starts to use cloud technology they no longer have limits on where their staff have to work. The systems allow for all members of staff to work from home or in different parts of the country or the world. Once they are connected they can access, Voice over Internet Protocol (VoIP) phone systems, and also use Customer Relationship Management (CRM) software or apps. These all allow the person to take payments securely with card payments that are processed upstream in the cloud. Infinite Storage the company would need a never ending stack of hardware... Whenever you phone a call centre you’ll hear the words ‘your call may be recorded’. If all of these calls are stored in the building the company would need a never ending stack of hardware. But if a call centre uses the cloud for storage they will have unlimited room and will only pay for what they use. This not only saves money, but also space and energy. Reliability and Performance Many small and medium sized businesses can’t afford big IT systems or IT managers to run the systems, so using the cloud can save the company precious money and time. The use of this relatively new technology allows any member of staff to access any of the businesses data from anywhere they have access to a web browser. Now that people and companies are starting to trust the security of the cloud the faster it will grow. ### Software selection and skills take centre stage at OpenStack Summit The OpenStack Summit in Tokyo once again showed just how much support for the SDN based cloud computing platform exists amongst the vendor community. 
Fighting for headline space with Oracle OpenWorld and a raft of quarterly financial earnings statements from big network equipment vendors, cloud service providers and telcos was never going to go well from a publicity perspective, and the show went largely unheralded in the IT and telecommunications press. [easy-tweet tweet="The #OpenStack Foundation is tackling the SDN/NFV skills gap says @AxiansUK" via="no" usehashtags="no"] But the summit quietly got on with the serious business of informing and educating participants on the growing breadth and diversity of OpenStack technology solutions currently available. The big news came from the OpenStack Foundation itself as it looked to help service providers navigate the increasingly complex OpenStack landscape. Its newly announced Project Navigator tool was expressly designed to help end users build and deploy OpenStack-powered clouds by giving them a clearer picture of where multiple cloud-related OpenStack services and projects stand in terms of maturity, release schedule, packaging and documentation support. The Foundation also tackled the SDN/NFV skills gap, announcing a professional certification program for OpenStack cloud administrators. OpenStack training is currently offered by multiple vendors, including Canonical, Cisco, EMC, HP, Midokura, Rackspace and SUSE. The Certified OpenStack Administrator (COA) award looks set to guarantee a minimum, standard level of OpenStack knowledge and skillset, with testing administered independently by the Linux Foundation and the first certificates due to be given out in the first half of 2016. Rackspace launched a beta version of its Carina container environment Containers received more than their fair share of vendor attention. Rackspace launched a beta version of its Carina container environment, a cluster service which bypasses the hypervisor layer to integrate existing Docker tools and bare-metal infrastructure components into resource packages that can be provisioned at ‘near instant’ availability to support different PaaS, IaaS and big data workloads. And elsewhere Midokura announced that it will help cloud service providers bridge containers and virtual machines via an OpenStack cloud by contributing to Kuryr – a network plug-in initiative which uses OpenStack Neutron to provide networking services to Docker containers. [embed]https://youtu.be/LnVXJzRLgS4[/embed] [embed]https://youtu.be/Md24K2mi8Fw[/embed] ### Professionals beware, you’re about to be automated Today, when reading about the impact of Artificial Intelligence and robotics, the focus always seems to be on industrial or agricultural assimilation. Most commentators or self-proclaimed 'cloud experts' talk about the industrial revolution and the impact of electrification on the labour workforce. To liken the impact of cloud computing to this historical moment in time is just overhyped BS. Since the Norman conquest of the United Kingdom, which brought about the guilds and merchant classes, a human need has been to distinguish oneself amongst peers with professional accreditation or guild membership. These guild, craft and trade associations still exist today, but many have evolved into chartered professions, such as those of surgeons, accountants and lawyers. So why do we assume that professional roles cannot be ‘digitally disrupted' and thrown into the jobless abyss of virtual intelligence algorithms and robotics? Just look at the rise of High Frequency Trading replacing stockbrokers. 
Allow me to choose 3 professions: Accountancy Law (Not a Judge or Barrister) Health (General Practitioner) [easy-tweet tweet="Machine learning and artificial intelligence is still currently bound by binary input" hashtags="AI, digital"] Machine learning and artificial intelligence are still currently bound by binary input and output and rules-based structures. Or, in plain English, the machine only reads 0101010 and has to be trained manually (by a human) to understand the outcome. Digitally disrupting Accountancy The field of accountancy has, in my sincere opinion, started to go through digital disruption. Multi-purpose accountancy platforms such as SAGE, FreeAgent and Intuit (the American firm so often quoted by SaaS evangelists) all offer a cloud-based solution that takes care of bookkeeping through to invoicing and tax / payroll. What we have seen is a reduction in the personal contact an SMB has with the accountant, with more of the traditional tasks such as journals, ledgers and payroll being entrusted to cloud platforms. I personally describe a platform as a foundation for greater things to be developed above the core services. Therefore, in a similar vein to what Salesforce has done to selling, I predict that algorithms will get more intelligent and personal taxation will be done using machine learning. I predict personal taxation will be done using machine learning The question is, are we looking at a combination of a number of professional disciplines that will eventually replace accountancy with a set of algorithms? One of the largest UK big data implementations is at HMRC or “Her Majesty’s Revenue and Customs”. Are we slowly going to see HMRC create a master algorithm that battles law and taxation issues with accountants and cloud accounting systems? Will we see bots and virtual accountants and lawyers challenge each other, with self-healing and recalculating bots finally jousting to an outcome? Which leads me on to the next profession: law is both complicated and interpreted by professionals in different ways. But if the algorithms were combined... Digitally Disrupting Lawyers Lawyers are very much like cloud and IT professionals insofar as they have come up with an acronym-led language that is meant to shield the simplicity and intricacies of their profession from mere mortals. [easy-tweet tweet="Lawyers are very much like cloud and IT professionals ... acronym obsessed!" user="comparethecloud" hashtags="cloud, IT"] Like the previous example, machine learning and AI are forming perfectly good alternatives to using a professional lawyer. Here are some highlights: Lawyers read statutes, law books and case examples before coming to a conclusion - this could easily be achieved with AI, either as a companion or a self-taught algorithm. Repetitive contract tasks: we are already seeing crowdsourcing of legal work, and we view AI as 1-2 years away from being able to accomplish these tasks. The drive towards plain English: IT is a perfect function for helping the legal profession become ‘human’ in its language. An example is the plethora of insurance industry comparison sites that allow for a simplification of the underlying contract terms. How long before this tech is rolled out in a mass way and digitally disrupts the legal industry? Ambulance chasing cretins: those ambulance-chasing slippery types will unfortunately be able to take advantage of tech available today that will compile reports and allow the targeting of potential no-win-no-fee cases with a high probability of success.  
As Artificial Intelligence makes further progress towards becoming more autonomous, perhaps driven further by the possibilities afforded by Quantum Computing, will we see the lawyer ‘disrupted’? Digitally disrupting Health The temptation when talking about disruption in health is to be too futuristic about technology such as Nanobots, Gene Programming, and emerging Biotech therapies. We are already seeing the evolution of DNA profiling at a cost that would have been unheard of just 3 years ago, and the rise of the ‘virtual doctor’. In the UK the general practitioner is often the focal point of the community and also the main interface to the UK health service. The question in the future, with the cost reductions in advanced sensors and other technology improvements, is ‘do we actually need the GP?’ and would it be more cost-effective to diagnose and treat symptoms using preventative rather than reactive treatments? The bigger question is around the nature of a diagnosis, how this is achieved, and how we leverage technology to overcome the challenge of an ever-increasing elderly population. As we move further into the world of Artificial Intelligence and health, I would highly recommend a viewing of this video from IBM on Watson. [embed]https://youtu.be/P7ShPS8JhRA[/embed] ### Black Friday & Beyond: Dealing with web traffic spikes this festive season Web traffic spikes caused by major calendar events occur all year round. During the summer, calendars are filled with events, from festivals such as Glastonbury or Bestival, to popular sporting tournaments such as Wimbledon. These summer days now seem a faint memory as we rapidly approach the clutches of winter - and with winter comes Christmas shopping. In the run-up to Christmas, Black Friday and Cyber Monday will lure vast numbers of shoppers to seek out the best bargains online. Every retailer should understand the technology challenges of dealing with the varying levels of web traffic that this will cause and, in particular, the large spikes when a surge of sales is due to occur.  [easy-tweet tweet="Technology failures caused by web traffic spikes can be embarrassing as well as expensive, Campbell Williams explains" via="no" hashtags="IT"] Technology failures caused by web traffic spikes can be embarrassing as well as expensive, especially on such anticipated retail days, yet they still occur with predictable regularity. Many websites simply aren’t prepared to cope with upsurges in user activity, which could leave countless shoppers angry and disappointed this Black Friday weekend. Every year, the popularity of Black Friday appears to grow, meaning increasing pressure is put on the IT infrastructure that underpins participating retailers’ websites. These positive deviations in web traffic illustrate how demand has grown as people increasingly access the latest deals from their mobile devices, choosing to avoid the mad rush of the stores themselves. It is not surprising that Christmas shoppers seek out deals from the comfort of their homes, rather than risking being trampled in a stress-fuelled store frenzy. It is therefore essential that all websites are not only mobile-optimised, but also designed with an infrastructure able to cope with these seasonal peaks and troughs.  [caption id="attachment_33103" align="alignright" width="300"] 15 Powerful Quotes from George Osborne’s Announcement to Increase U.K. 
Cyber Security Spending[/caption] The adoption of cloud technology via Managed Service Providers (MSPs) is proving increasingly popular for organisations that need to scale their requirements and manage traffic spikes with flexibility. The approach gives them the ability to add capacity on a temporary basis at times when they predict an increase in website visits. This capacity can then be scaled back again during quieter periods, and users also have added scope to test for different traffic scenarios without taking their site offline; so they pay for what they use, making it much more cost-effective. [quote_box_center]Tips for creating a resilient website, for Black Friday & beyond: Insist on web infrastructure that is both scalable and flexible Be prepared in advance for jumps in traffic, rather than having to cope with a crisis Ensure the infrastructure is fully tested to ensure that bursts in website activity do not lead to sudden and costly downtime [/quote_box_center] Understanding your infrastructure is key to ensuring that your website is up and running at all times, coping with any increases in demand and not risking your business’s online reputation. ### Uptime funk you up! Cloud computing remains a hot trend in the IT world, having transformed the way businesses think of their IT infrastructure. The cloud often offers reduced costs and business benefits, including increased agility across the board. For many organisations, it is important to be able to provide reliable network and application services at all times, even during non-business hours. However, the perception is that the cloud is not reliable and that downtime, upgrades and maintenance windows are the norm. The reality is that through utilising the right tools and processes, businesses can make cloud uptime truly work for them. Livin’ it uptime IT organisations are going to be held accountable for keeping high uptime levels for many applications that are being moved to the cloud. When properly planned, a workload running in a cloud environment can be more resilient and meet higher availability requirements than an on-site alternative. [easy-tweet tweet="There remains certain scepticism around performance and uptime in #cloud" user="comparethecloud" usehashtags="no"] While some people might look at clouds gathering in the summer sky with pessimism, the same is also true with the cloud itself. There remains a certain scepticism around performance and uptime in the cloud, which stems from historical outages. No system is perfect; the cloud can fail too. Part of the problem is that when Amazon or other major cloud providers have a small outage, it is big news. Trade news, however, does not pick up the hundreds of outages happening every day in corporate data centres. When cloud computing was still relatively new, there were significant performance considerations, particularly for database systems (the heart of most applications), as well as data transfer speeds and a lack of maturity in how to architect applications to scale on the cloud. The industry has gone to great lengths when it comes to advancing and guaranteeing performance. The leading cloud service providers offer high-powered, as well as workload-specific, high-speed storage with guaranteed input/output operations per second (IOPS), and have also made auto-scaling easy, among other improvements. These advancements make the cloud much more reliable than the UK’s summer weather. 
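To see why the uptime figure behind each piece of infrastructure matters, here is a rough, hedged illustration: the availability percentages below are invented for the example, not any provider’s published figures, and the arithmetic simply shows how serial dependencies compound and how redundancy helps.

```python
# Rough sketch: how per-component availability figures combine.
# The percentages below are hypothetical examples, not any provider's real SLA numbers.

def serial(*availabilities):
    """All components are needed, so availabilities multiply."""
    result = 1.0
    for a in availabilities:
        result *= a
    return result

def redundant(availability, copies):
    """The service survives unless every copy fails at once."""
    return 1.0 - (1.0 - availability) ** copies

load_balancer = 0.9999   # hypothetical figures
app_server    = 0.999
database      = 0.9995

single_path = serial(load_balancer, app_server, database)
with_redundant_app = serial(load_balancer, redundant(app_server, 2), database)

hours_per_year = 24 * 365
for label, avail in [("single path", single_path), ("two app servers", with_redundant_app)]:
    downtime = (1.0 - avail) * hours_per_year
    print(f"{label}: {avail:.5%} available, ~{downtime:.1f} hours downtime per year")
```

The weakest serial dependency dominates the result, which is why it pays to read the guarantees behind every component rather than just the headline figure.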
The formula for achieving uptime in the cloud starts with understanding the expected and guaranteed uptime for each piece of infrastructure from your Cloud Service Provider (CSP). Service Level Agreements (SLAs) outline what organisations can expect from their cloud provider from a legal perspective, and they need to be analysed to understand the details and conditions that must be met for the SLA to trigger. Don’t believe me just watch Making the cloud work for your business requires understanding the resources and options each CSP offers and understanding the architectural recommendations and best practices. For new applications, cloud architecture best practice suggests using distributed applications that can be deployed in clusters of disposable servers built from a ‘cookie cutter’ image. This model allows quick scaling to handle increased loads, and reduces troubleshooting to eliminating a server and creating a new one from the original image. The cloud provides IT with the opportunity to easily set up mirrors of production systems, in active-active or active-passive configurations. Some CSPs provide greater peace of mind with data replicas in different data centres or continents. Setting up database replication can be as simple as checking a box. [easy-tweet tweet="Dynamic DNS can eliminate the load balancer itself as a single point of failure" via="no" hashtags="cloud"] It’s a good idea to replicate essential data; this can be done to a private site or by storing it with a different cloud provider. With a well-planned deployment and a good infrastructure, companies can efficiently load-balance their IT environment between multiple active, cloud-based sites – CSPs provide load balancers and tools to replicate entire workloads to a different region in a few clicks. Dynamic DNS can eliminate the load balancer itself as a single point of failure. So, if one site should go down, users would seamlessly be transferred to the next available connection. Cloud can also provide replication solutions for storage with efficient cloud-based backup. Organisations can have a dedicated link to a cloud-based data centre, where engineers are able to safely back up and restore from the cloud at a low cost per GB. For organisations that are restricted by data retention policies, the cloud provides a retrievable data solution which can be reviewed as needed. Cloud services make it relatively simple to create a versatile environment for organisations, which helps to optimise uptime and resilience. Cloud solutions provide a good forecast, with greater recoverability and business continuity, while the underlying technology continues to grow and develop. With cloud, organisations can adopt a flexible growth plan capable of scaling with IT infrastructure data demands. With cloud, uptime funk can give it to you in the form of a reliable IT infrastructure solution with minimal downtime. ### SD-WAN is here to stay? Software-defined wide area networks (SD-WANs) have been the subject of much discussion in the networking community lately. The explosion of cloud services and frustration with the rigidity, cost, and complexity of MPLS-based WANs is driving companies to consider transforming their network with the Internet, and pushing MPLS aside.  
That said, the question should not be, “Is SD-WAN here to stay?” but rather, “At what pace will the transition occur?” While it’s a known challenge to integrate both private networks and public cloud resources, the SD-WAN design provides the flexibility required to transition to the Internet as needed, visibility into both legacy and cloud applications, the ability to centrally assign business intent policies, and the means to provide users with consistent and dramatically enhanced application performance.   [easy-tweet tweet="It’s a known challenge to integrate both private #networks and public #cloud resources" user="@silverpeakchris" usehashtags="no"] As more organisations move to the cloud to gain a competitive edge in their respective arenas, many are turning to this SD-WAN design as a more cost-effective means of connecting users to applications. The good news is that we do not need to wait around for the underlying network infrastructure to be replaced before we can take advantage of these cloud-optimised SD-WAN designs.   If we look more broadly at the networking arena, much of the world has been waiting for Software-Defined Networks (SDNs) to take hold. However, implementing a true SDN today often means a rip and replace of the current network infrastructure, and many IT teams hesitate to jump into that ocean, invoking the “if it ain’t broke” mantra to justify their network design strategy. In contrast, SD-WAN offers a faster path to adoption by emphasising improvements in the WAN without disrupting the underlay architecture. There’s also the ability to transition to a full broadband-based WAN by using the path selection capabilities of an SD-WAN to build a hybrid WAN that leverages both broadband and MPLS. [easy-tweet tweet="Implementing a true #SDN today often means a rip and replace of the current #network infrastructure" via="no"] Companies routinely pay network service providers large sums of money for the same amount of bandwidth that employees can purchase in their homes for one-tenth of the cost.  Other examples of long-accepted network clutter include entire corporate backbones that are often built with limited to no redundancy and without encryption. Many network teams also find themselves relying on dated reporting mechanisms and insufficient information to troubleshoot and support critical network infrastructures. The list could go on. Ultimately, trends come and go in the technology industry... While the potential of SDN promises to improve the way networks are designed, built and supported, SD-WAN targets traditional WAN concepts and holds the key to unleashing significant improvements in the way WANs are built and managed. For the first time, it allows organisations of all sizes to connect their branches in a matter of minutes, leveraging the Internet in a secure and optimised manner to either augment or completely replace their legacy MPLS networks. Ultimately, trends come and go in the technology industry; that’s just the way the industry works. But the solution and impact of SD-WAN will be here for the long run. ### Closing the door to criminals in the cloud In the last decade, successful organisations have become the equivalent of Willy Wonka’s chocolate factory: a psychedelic wonderland for pushing out highly creative and innovative ideas that need to be turned into reality, and fast. At the heart of this transformation is IT, which has gone from being the support function that keeps the lights on, to a vital mechanism that transforms ideas into applications that add business value. 
You only have to look at how often updates are pushed to a smartphone to see how quickly this is happening. As organisations seek the agility to create new features and applications daily, adoption of cloud platforms and methodologies has increased due to their promise of transformative gains in execution, increased service offerings and lowered infrastructure costs. The cloud has become the backbone for the delivery of lightning-speed change and innovation. [easy-tweet tweet="The #cloud has become the backbone for delivery of lightning speed change and innovation" via="no" usehashtags="no"] However, moving applications and their associated data to the cloud can be slow, and not without risk. Regulators are mandating the protection of sensitive data at all times. With the cost and pace of regulatory reform continuing to march on, organisations are unable to justify the risk of having sensitive information in the cloud. The obstacles to the Cloud However, in reality it’s not governance, risk or compliance concerns that are the true barrier to progress. Instead, a huge blind spot is emerging in existing data security strategies. We have seen that organisations are spending a lot of money securing their production data. Yet the stringent security controls and protocols that are relied upon to mask sensitive data are not being applied to the databases that are being used to create new features or applications. Our research shows that 80 per cent of this non-production data sits unmasked within IT environments, opening the door to cyber criminals. Even with the existence of regulations like PCI compliance, Solvency II and the Data Protection Directive, personally identifiable information like name, age and location is visible to any developer, tester or analyst with the right access details. This means non-production environments are quickly emerging as the least secure point of entry for savvy cyber criminals – both on premise and in the cloud.  That’s not to say there isn’t technology that can help. When it comes to migrating to the cloud, organisations need to develop security approaches that start from the inside out. This means that before data is even migrated to the cloud, it needs to be transported in a form that is unusable if stolen by malicious cyber-criminals. The answer to being able to take advantage of cloud services is to embed data security into everyday practices and solve the data-masking stigma. organisations need to develop security approaches that start from the inside out Data masking, the process of obfuscating or scrambling the data, exists, but it’s a costly, time-consuming and manual exercise. Waiting an extra week to mask data each time the business needs a refresh can mean slipping behind the competition. As a workaround, some companies end up using synthetic or dummy data, which contains all of the characteristics of production data but none of the sensitive content. This solves the data privacy issue, but with production and test data not matching, it’s a fast route to more bugs entering the development process. And bugs mean delays. The golden ticket [easy-tweet tweet="Organisations need to insert a new layer into IT architecture that automatically masks #data as part of its delivery" via="no" usehashtags="no"] Instead of taking a weekly or monthly snapshot of production data and then manually applying data masking on an adhoc basis, organisations need to insert a new layer into IT architecture that automatically masks the data as part of its delivery. 
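As a rough sketch of what that masking layer might do as data is provisioned (the column names, salt handling and hashing rule below are illustrative assumptions, not a description of any particular masking product), consider:

```python
# Minimal sketch of masking PII columns as rows are provisioned to a non-production copy.
# Column names, the salt handling and the masking rule are illustrative assumptions only.
import hashlib

PII_COLUMNS = {"name", "email", "date_of_birth"}   # hypothetical sensitive fields
SALT = "rotate-me-outside-source-control"          # in practice, kept in a secrets store

def mask_value(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token."""
    digest = hashlib.sha256((SALT + value).encode("utf-8")).hexdigest()
    return f"masked-{digest[:12]}"

def mask_row(row: dict) -> dict:
    """Return a copy of the row with sensitive columns obfuscated."""
    return {col: mask_value(str(val)) if col in PII_COLUMNS else val
            for col, val in row.items()}

production_row = {"id": 42, "name": "Jane Doe", "email": "jane@example.com",
                  "date_of_birth": "1980-01-01", "balance": 120.50}
print(mask_row(production_row))
```

A deterministic, salted hash is one common choice for this kind of obfuscation because the same input always produces the same token, so relationships between masked tables still line up for developers and testers.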
One approach is to create a permanent, one-time copy of production data and then apply virtualisation and masking at the data level. This makes it possible to mask an entire database and then use this to generate multiple virtual copies of the data that are also masked. By adopting this approach, organisations can guarantee easy provisioning of masked data. IT can retain control by setting the data masking policy, data retention rules and who has access to the data. Developers, testers and analysts can all provision, refresh or reset data in minutes, and they only ever see the masked data. It also allows a centralised view of the organisation’s data, and safeguards information for whoever needs it, for whatever project. Whether on premise, offshore or in the cloud, all data is secured and can be delivered as a service to the application that needs it. Whether it’s from outside hackers or malicious insiders, those that want to steal or leak data will always target the weakest point within IT systems. By bringing data masking into a service-based model, organisations can readily extend masked data to any environment - including the cloud - instead of relying on synthetic data or duplicates of non-masked copies. Obfuscating data becomes a fluid part of data delivery, which means that any non-production environment can be moved to the cloud with zero danger of data theft. Even if data is compromised, its true value and integrity are not. security becomes an enabler Even more importantly, security becomes an enabler. Ask any organisation if they know where all their data is and the chances are they won’t know. With secure data being delivered as a service, organisations are able to centralise data, mask it and keep tabs on where it's going. Within 24 hours, businesses have full visibility into where every copy of their production data is and the confidence that it is masked. As a result, organisations become more agile, with both storage and production times reduced. With the blind spot resolved, organisations can then realise the benefits of the cloud and accelerate migrations without risking security. ### Piracy in Practice: Protecting your Property…and your Backs There are many gambles in business. In fact, some would argue, business is, in essence, a form of negotiated gambling. One area senior execs have been willing to gamble on is software. Not in terms of popping the latest copy of CS6 into the staff tombola, but rather weighing up the decision on whether or not to purchase and renew licensing agreements on products protected by intellectual property law, or continue on blissfully unaware. [easy-tweet tweet="Protecting IP means pushing for stricter laws and punishments when it comes to compliance and piracy concerns" via="no" usehashtags="no"] In some cases, the IT department doesn’t even have the ability to monitor and manage what software is installed onto corporate devices, meaning some licences are being installed without ever being acknowledged in the business books. Put simply, does the risk of a one-off fine and the slap on the wrist outweigh the £53 a month per employee software purchase or renewal costs? Until now, most would argue the risk was minor. But the streets are talking, and where the streets talk, the law follows. 
Get-Out-Of-Jail-Not-Free Card: In light of a recent BBC story by technology reporter David Lee, there have been suggestions that organisations with a vested interest in protecting their intellectual property, such as those in the music and film industry, are pushing for stricter laws and punishments when it comes to compliance and piracy concerns. The proposed endgame is a jail sentence that rivals those for serious criminal acts such as robbery and battery, with a maximum penalty of 10 years. But is this empty scaremongering that holds no real weight? Or is this a victimless crime in which innocent bystanders will be held up as “lessons to be learned”? Pointing the Finger: The effectiveness of this new legislation boils down to one principle: mislead by example. Until a significant influencer within the business realm is held accountable and scapegoated, I don’t feel CEOs will be willing to put in the time and effort needed to manage their licensing strategies. However, as the rumour mill continues to churn, it may cause managers and procurement teams to rethink their approach to software licensing in order to avoid being held accountable. A Global Issue: As this proposed law is UK-centric, global companies will need to be mindful of local copyright compliance within the countries they operate in. This could mean that managers overcompensate by purchasing enterprise licensing agreements to cover their backs (following expansion, staff changes, etc.). This can be costly, and once again may deter businesses from investing, with some opting to continue risky practices instead. Further to this, there is a considerable talent shortage of knowledgeable asset management professionals. This means that if these harsher penalties are realised, organisations will feel significant pressure to either cut corners or overspend, should they not be able to hire skilled asset managers. It is my suggestion that companies monitor these potential changes in legislation and prepare to seek alternative vendors, where alternative vendors with comparable functionality exist, should their vendors endorse these harsher changes. I absolutely understand the vendors’ justifiable wish to protect their intellectual property. High Cost Risk, Low Cost Solution: There are, however, existing solutions that combat the need for overcompensating on the purchasing of software licences. Software License and Asset Management packages allow IT managers to oversee where, and to whom, software licences are being deployed within your company, meaning you can make informed and accurate financial decisions on software that actually needs to be relicensed or purchased. You can also designate approved software so that all applications distributed throughout the company are licensed and compliant, whilst also reporting on any usage beyond corporate guidelines. This could prove to be a lifesaver when it comes to reducing business and legal liabilities concerning software vendor licensing! ### Automate and standardise your processes for streamlined business benefits Business leaders say that you should set your standards so high that even your mistakes are excellent. It’s good to know then that striving for business excellence has been made easier in recent years thanks to game-changing software that’s driven intelligent efficiency. 
Automating tedious tasks with the right tools can save you time, energy, and money – it’s the smart way to tackle time-consuming activities. [easy-tweet tweet="Today 99% of all business processes can be automated" user="stevenicsbbox" hashtags="tech, automation"] Today, 99 per cent of all business processes can be automated, and a PMG study of US corporate IT professionals conducted last year saw 98 per cent agree that it was vital to driving business benefits. In fact, many respondents reported that automation was already helping them meet business objectives related to enhancing the customer experience (61 per cent), increasing productivity gains (59 per cent), sharing knowledge (52 per cent) and delivering new products (46 per cent). Of course, the prime purpose of creating an automated task is to get rid of a manual task and, at the same time, make sure it’s being done exactly the way you want. Automation software lets you put virtually any task or process on auto-pilot and claim back hours in your workday – and with the right automation engine to oversee the operation, the driving of intelligent efficiencies becomes even more marked. Digital employee-of-the-month It’s like a ‘digital employee-of-the-month’ for every month. It’s the helpful member of staff who takes routine tasks off your hands so you can focus on the more important things. It never gets sick or tired, never takes a day off, and works 24 hours a day. In other words, it’s the dream employee. Automation software; it’s the dream employee While many think that enhancing technology in a business means that staff are needed less, it can actually mean that there is more time spent on non-tech activities like face-to-face customer interaction, collaboration and innovation. Essentially, the value of your time is maximised through the use of automation software, and your organisation is enabled to become proactive. Thanks to a standardised approach, it’s also easy to personalise the set-up to your company. Ready-made templates allow quick transference of bespoke branding so that it becomes a natural extension of your organisation. Integration benefits Perhaps the most valuable aspect, however, is the software’s integration benefits. Information that should really be shared across an organisation is often stored in silos, meaning that data has to be tediously ported over when it’s required. That’s exactly the kind of inefficiency that technology should be driving out of business operations. Perhaps the most valuable aspect, however, is the software’s integration benefits Automation software instead makes sure that admin tasks are joined up, providing a more detailed organisational overview. Within every business, applications can be combined to avoid unnecessary overlap, and thanks to this type of software, these opportunities are now actively sought out and identified. There are, in fact, numerous other benefits, from improved customer relationships and increased sales through better service, joined-up thinking, and the use of real-time data, to better working environments through higher productivity and more focused roles. Ultimately though, it’s all about agility: the ability to allow organisations to recognise and respond to opportunities more quickly. In other words, the establishment of a significant competitive edge. Setting your standards so high that even your mistakes are excellent is a worthy approach to business, but with the right automation software in place, now even the little mistakes need not happen. 
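Before moving on, a trivial, hypothetical illustration of the sort of routine chore that lends itself to automation: the sketch below (invented data and field names) totals a day’s orders by region, the kind of summary nobody should still be compiling by hand.

```python
# Toy illustration of taking a repetitive admin chore off someone's hands.
# The data, field names and figures are invented for the example.
import csv, datetime, io

raw_orders = io.StringIO(
    "date,region,value\n"
    "2015-11-27,North,120\n"
    "2015-11-27,South,340\n"
    "2015-11-27,North,80\n"
)

def daily_summary(csv_file) -> dict:
    """Total the day's orders per region, ready to drop into a report."""
    totals = {}
    for row in csv.DictReader(csv_file):
        totals[row["region"]] = totals.get(row["region"], 0) + float(row["value"])
    return totals

print(f"Summary for {datetime.date.today()}: {daily_summary(raw_orders)}")
```

Scheduled to run each morning, a job like this is exactly the kind of task an automation engine would take on without ever needing a day off.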
### This Thanksgiving, deals on your private data too
In a matter of years, we've seen Black Friday and Cyber Monday become two of the most anticipated days of the calendar year. While consumers eagerly await the chance to buy this season's hottest gifts, what they don't realize is that hackers are also anticipating a holiday treat: their personal data. This weekend, Zscaler uncovered a campaign in which malware authors are turning the holiday shopping season into an opportunity to scam large numbers of people by creating fake apps offering early access to Amazon.com Black Friday and Cyber Monday offers and deals. The Zscaler research team recently came across one such fake Amazon app, which masqueraded as an Amazon.com Black Friday deals app but was actually intended to collect the victim's personal data. The URL from which this fake app is downloaded is shown below: URL: http[:]//amazon[.]de[.]offer47263[.]cc/amazon[.]apk From the URL it can be observed that the malware authors are using cybersquatting to fool the victim, portraying the download site as a legitimate Amazon domain. Once the application is installed, it disguises itself as a legitimate Amazon app. [Image: Icon] When the user starts this installed fake Amazon app, it loads another app named "com.android.engine", as seen below. [Image: Loading application dynamically] This newly loaded child application asks for administrative privileges and other risky permissions, such as sending SMS messages and dialling phone numbers. [Image: Permissions] The newly loaded app will first register itself as a service. Even if we remove the fake Amazon app, the "com.android.engine" app will stay persistent and continue its activity in the background. Once this malicious app is installed on the victim's phone, the fake Amazon application will start giving the error message "Device not supported with App". This forces the victim to delete the fake Amazon app, thinking that something went wrong during installation. As the malicious child app does not have an icon, it is quite difficult for ordinary users to find and remove it. The presence of this app can be seen in the Settings > Apps > Running Applications section of the device, as shown below. [Image: Silently working in the background] [Image: Administrative access] The loaded malicious application has code for harvesting the user's personal data. The following code routine present in the app is used to collect the victim's browser history and bookmarks. [Image: Browser data] It is also able to harvest call logs and received inbox messages, segregating them into sender numbers, SMS body, incoming call numbers, contact names and so on, as shown below. [Image: Call logs] [Image: Inbox messages] The malicious app also gathers the victim's contact details. [Image: Contacts] This particular piece of malware was also found to be communicating with an IP address in Canada, "198[.]50[.]169[.]251", on port 4467, probably sending the harvested data through a network socket. [Image: Hard-coded IP] The following packet capture shows the malware communicating with its C&C (command and control) server. [Image: Packet capture] [Image: Data being sent] Especially during this holiday season, consumers need to be aware of the applications they're downloading and stay away from such fake apps. Always install applications from legitimate app stores and websites. Be aware of the permissions requested by an application during installation. Shopping apps should not be asking for access to your contacts or SMS. Keeping an eye on the permissions used by the app can save you from installing such fake apps.
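For readers who want to go one step beyond eyeballing permission prompts, below is a small defensive sketch. It is not Zscaler's tooling: it simply lists sideloaded apps over adb and flags any that hold the kind of permission combination abused by the fake Amazon app above. It assumes a machine with the Android `adb` tool installed and a device connected with USB debugging enabled; the "risky" permission list and the rough parsing of `dumpsys` output are illustrative choices.

```python
# Defensive sketch: list third-party (sideloaded) Android packages via adb and flag
# any that hold a combination of permissions similar to those abused by the fake
# Amazon app described above. Assumes `adb` is installed and a device is attached
# with USB debugging enabled; the permission list below is an illustrative choice.
import subprocess

RISKY_PERMISSIONS = {
    "android.permission.SEND_SMS",
    "android.permission.READ_SMS",
    "android.permission.READ_CONTACTS",
    "android.permission.READ_CALL_LOG",
    "android.permission.CALL_PHONE",
}


def adb(*args: str) -> str:
    """Run an adb command and return its stdout as text."""
    result = subprocess.run(["adb", *args], capture_output=True, text=True, check=True)
    return result.stdout


def third_party_packages() -> list[str]:
    """Packages installed by the user rather than shipped with the device (-3 flag)."""
    out = adb("shell", "pm", "list", "packages", "-3")
    return [line.split(":", 1)[1].strip() for line in out.splitlines() if line.startswith("package:")]


def granted_permissions(package: str) -> set[str]:
    """Very rough parse of `dumpsys package`: collect permission names marked granted."""
    out = adb("shell", "dumpsys", "package", package)
    perms = set()
    for line in out.splitlines():
        line = line.strip()
        if line.startswith("android.permission.") and "granted=true" in line:
            perms.add(line.split(":", 1)[0])
    return perms


if __name__ == "__main__":
    for pkg in third_party_packages():
        hits = granted_permissions(pkg) & RISKY_PERMISSIONS
        if hits:
            print(f"{pkg}: review manually, holds {sorted(hits)}")
```

A genuine shopping app has no business holding SMS or call-log permissions, so anything this flags deserves a closer look. For most users, though, the simpler advice above, sticking to official stores and reading permission prompts, remains the main defence.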
### Understanding Secure Mobile Working
Today, it is easier to work from anywhere, thanks to the increasing availability of high-quality Wi-Fi connections. Even just a few years ago, the fixed desktop with an Internet connection was still the norm. Today, company staff no longer need a permanent desk, as various mobile devices enable them to work in different locations. Mobile staff can share data with their coworkers – but it is imperative that this process is secure. Accessing your data – from any location The physical whereabouts of data is not the only aspect of work that is becoming more flexible. Bring Your Own Device (BYOD) is common practice in many areas today. Personal computers, tablets and smartphones will need to be incorporated into the network of the business if staff are going to be able to work on the devices they are familiar with. However, each member of staff may be working with company data on their own personal device, which presents unavoidable security risks. These devices could have security vulnerabilities that enable unauthorised people to access the data – or even the corporate network. [easy-tweet tweet="Company #data on personal devices; presents unavoidable #security risks" user="brainloop_en" hashtags="BYOD"] But it is possible for staff to work on their own devices without compromising company data. First, the company should ask itself the legal questions, such as "to what extent are employees allowed to access internal network services?", and "can they work with and save company data on their devices?" The firm may also want to introduce technical security measures. In these cases, it needs to guarantee security wherever the data is being used and stored, as well as securing the data transfer itself. For instance, a company could specify that only approved devices are permitted to access the internal network. This can be enforced by using secured VPN connections and encrypted hard disks. The company could also limit access to certain services, meaning that users' devices would act as a terminal for a dependable cloud application that gives staff a secure dataroom. Are 'practical' applications secure? The apps in widespread use on mobile devices are a particularly sensitive point. A large proportion of the free, business-oriented storage applications promote the fact that they offer modern file management, a competitive amount of storage space and documents centralised in the cloud. Then you have so-called productivity apps that let users draft ideas, collect information and make notes to share with and work on with colleagues. Is the data transmission secure between the mobile device and the cloud server? But are these cloud applications secure? Who can access the cloud servers and the data stored on them? Is the data transmission secure between the mobile device and the cloud server? And is the app only using the data it really needs? Recent research results show that these are justified questions. Researchers at the Fraunhofer Institute for Secure Information Technology (SIT) found that three-quarters of the most popular business apps do not meet companies' security requirements. IT specialists at Germany's University of Bremen found that many apps request more permissions than they actually need. 91 per cent of the 10,000 most popular Android apps tested by researchers at the Fraunhofer AISEC institute require permission to connect to the Internet – without telling the user why.
Most of the apps tried to send personal data to servers around the world as soon as the app was launched (without asking the user!) – and two-thirds of them sent the data unencrypted. So what can be done by companies and users to control this unwelcome data leakage from mobile apps? The four main mobile operating systems were examined in a new study by DIVSI (the German Institute for Trust and Security on the Internet). It concluded that apps running on a typical Android operating system have the most flexibility in terms of accessing data, whereas iOS and BlackBerry users can remove access permissions from apps and restore them later as required. This option is not offered by Android or Windows. These restricted control options show that companies running a BYOD strategy must make it a priority to provide staff with a secure collaboration and communication tool. How likely is data misuse? The risk of data misuse is significant. Losses to German business sit at an estimated €11.8 billion per year, according to a representative survey on industrial espionage by management consulting firm Corporate Trust. Two years ago, the figure was only around a third of what it is now. "We're probably already in Cybergeddon," says the study leader Christian Schaaf. "We can only hope that companies react soon and implement the appropriate security measures." [easy-tweet tweet="Cybergeddon - learn more from @brainloop_en by reading this article" user="comparethecloud" usehashtags="no"] Half of the 6,800 companies surveyed said they had fallen victim to a hacker attack on their systems. And 41 per cent had discovered eavesdropping or interceptions on their electronic communications. Customers or partners asking staff leading questions to extract information was the third greatest risk at 38 per cent, and in fourth place, with 33 per cent, came data stolen by businesses' own staff. Worst affected of all are innovative midsized companies – yet midmarket firms have inadequate awareness of the risks and few of them employ an effective protection strategy. Some companies are starting to respond by separating private and business use on mobile apps. That's a significant step but is not adequate to protect documents. What alternatives can businesses offer their staff? Information security is achievable in the cloud as elsewhere, but it requires a series of measures to be put in place. It is vital for companies to make their staff aware of the risks and offer them secure applications, in view of the precarious situation at hand. Employees should never be tempted to find a time-saving or more practical workaround – such as quickly sharing a document on a popular but non-secure application. This means that the security tools companies provide must meet the principal standards for security and reliability, as well as convenience and flexibility.
### PTC Liveworx gets busy with IoT
I attended the PTC Liveworx event, held 16th - 18th November in Stuttgart, Germany, where I was able to gain valuable insights into the changing world of the Internet of Things. The event boasted around double the attendance of last year, with more than 2500 attendees present, and many more logging in to watch the presentations live. At an event where I was able to meet David Coulthard, create my own IoT application for monitoring beer travelling across borders and fly a quadcopter, it is safe to say it was a great success.
[easy-tweet tweet="Read @KateWright24's key takeaways from the PTC Liveworx conference" user="comparethecloud" hashtags="IoT"] So, what were the key takeaways from the conference? We all know that IoT is beginning to disrupt the market, but understanding the magnitude of this disruption is another state of affairs. McKinsey have suggested that the the industry could be worth more than $4 trillion per year by 2025, with factories, public health and transportation and logistics/navigation eating into the largest slices of the pie. With such a large market to be targeting, there is undoubtedly going to be a number of challenges. PTC addressed these during the LiveWorx event and how they are combating them. Capabilities Jim Hepplemann, CEO of PTC (or Chief Thinker as he was coined at the event by a fellow attendee!), enforced the point that, “everything we know in the physical world can be brought into the digital world, and vice versa.” If you stop to think about this statement, it is truly mind boggling. We are reaching an era where 1000s of machines and devices can communicate with each other, seamlessly and with very little human input. [easy-tweet tweet="We are reaching an era where 1000s of machines and devices can communicate with each other" hashtags="IoT"] But, with so many elements to consider, how can one organisation manage this? PTC have realised the challenges this will bring, and to counteract it, they are evangelising the need for ecosystems and interoperability. During the past 24 months, PTC have invested heavily in strategic acquisitions and alliances which have seen them partner up with the likes of GE, Bosch and Windchill to merge together the key strengths of their partners to create an ecosystem equipped to develop holistic IoT products. We were able to see this in motion with the example of the analytics of a bike being drawn through a digital twin avatar (digital realty) - using this they were able to analyse speed, brake speed, tyre pressure etc. But then they took this further, using an iPad app to view the bike in front of them through the lens, with the screen showing the bike with the analytics sitting over the image (augmented realty). Seeing AR in motion put into context for me the future possibilities of how products are created, analysed and serviced. [caption id="attachment_33128" align="aligncenter" width="900"] img via PTC[/caption] Trust and Security Having a wide ecosystem of partners and other tech experts is clearly the way forward for those breaking into the world of IoT. The platforms for analytics and machine learning create a huge amount of transparency within the ecosystem, but how do we make this transparency selective so we don’t put our companies at risk? Dr Kilger from Ernst and Young highlighted that their findings found 82% of companies saying security is the most important aspect of emerging business models adapting to IoT. software, products and analytics can cover 90% of your security During a talk by Björn Peters at Exceet Secure Solutions on, “IoT Security: The Secure Approach to Digital Transformation,” he discussed Exceet’s ability to carry out ‘IoT Risk Assessments’ on companies to analyse their vulnerability, both internally and externally. From this they find security solutions that give different stakeholders in the business varying levels of access. The key is to finding a secure structure that doesn’t impede business operations - some security programmes can be a showstopper if you make it more complex than it needs to be. 
However, Björn did stress the importance of growing your ecosystem on a foundation of trust - software, products and analytics can cover 90% of your security, but there will always be those who will do everything they can to breach security levels. Consumers Jim Heppelmann made the point that companies are slow in adapting to the disruption of IoT. If businesses don't begin to make changes to ready themselves for IoT then they will soon fall behind and put themselves at a competitive disadvantage. However, this begs the question of whether consumers (outside of the enterprise world) are ready for this disruption. If you are a business that operates solely in the B2B market, then you will (hopefully) find your clients adapting within the next few years. However, those that operate within B2C markets may hit barriers as the general public adapts more slowly. [caption id="attachment_33129" align="aligncenter" width="900"] img via PTC[/caption] I spoke with one gentleman who helped to develop an IoT app that draws data from sensors placed in homes, targeted at those with dementia, Alzheimer's, Parkinson's and similar conditions, so that carers and health experts can monitor their patients. When he was trialling the device with a study group, he found that some patients were panicking about the idea of sensors being in their house, as if they were in a 'Big Brother is watching you' type of situation. The truth is that there are innovators and early adopters out there, but the vast majority of society will always fall 5-6 years behind in adopting the tech, and this can put some businesses at a real disadvantage. [easy-tweet tweet="The vast majority of society will always fall 5-6 years behind development in adopting tech" user="katewright24" hashtags="IoT"] PTC LiveWorx offered valuable insight and education into the world of IoT, covering the possibilities and disruption it will bring to the market whilst addressing challenges and threats within the industry. Having seen the way in which PTC are future-proofing themselves for the market disruption, I am interested to see how the industry will change over the next 10 years, and who will become the big player in the world of IoT.
### Motech hits fast track to business growth with NetSuite OneWorld
NetSuite Inc. has announced that Motech, an automotive service chain in the Philippines, has achieved exponential business growth with help from NetSuite OneWorld. Motech replaced six instances of Sage BusinessWorks used by its six stores with one instance of NetSuite OneWorld to manage mission-critical business processes including financial consolidation, local Philippine tax and regulation compliance, warehouse management, inventory management, order management and subsidiary management. With NetSuite OneWorld, Motech was able to streamline business processes across six company-owned stores in the country, achieving operational efficiency and cost savings. Additionally, NetSuite OneWorld gave Motech the agility, flexibility and scalability it needed to grow and adopt a new business model, enabling the company to expand into franchising. Since 2013, Motech has added 28 retail franchises across the Philippines. With each franchise accounted for as a subsidiary in NetSuite OneWorld, Motech has a real-time view of franchise performance and can better collaborate with franchisees to manage inventory, profitability and service levels.
As it continues its rapid expansion strategy, Motech plans to increase its number of stores to 56 by 2017, with a goal of 100 locations by 2020. “The flexibility, agility and visibility we have with NetSuite OneWorld has been a key driver, if not the main driver, in our continued growth,” said Motech President Eugene Naguiat, whose father, Rommel Naguiat, founded the company in 1977. “NetSuite has given us the agility to be creative in the business and run everything efficiently on the same platform.” After struggling with duplicate data entry, a lack of visibility across the business, and significant IT maintenance requirements, Motech realized that its previous software environment could not support its ambitious strategic objectives, turning to NetSuite for an agile, scalable cloud platform that could support innovation and growth. Implementation and ongoing support and optimization by PGE Solutions, a 5-Star NetSuite Solution Provider partner headquartered in Manila, has helped Motech capitalize on the transformational potential of the NetSuite platform. “I wouldn’t be able to envision an expansion of that magnitude if I didn’t have NetSuite to support the business,” Naguiat said. “NetSuite has made it possible to go from simply running the business to truly growing the business in very creative and effective ways.” “NetSuite has provided a standardized, stable and scalable platform that has helped Motech to grow its business exponentially,” said VJ Africa, CEO of PGE Solutions. “Motech has really taken full advantage of NetSuite to disrupt how car service is provided and reap the rewards of a customer-centric and data-driven approach to business.” With support for 190 currencies, 19 languages and automated tax compliance in more than 50 countries, NetSuite OneWorld has provided several benefits for Motech including: Easy multi-subsidiary management. NetSuite OneWorld enables Motech to onboard a new franchise as a “subsidiary” in a matter of hours, compared to days or weeks of work to set up on-premise systems in each location. The company can now more easily manage, aggregate and support all of its franchisees, with Motech also gaining a high-level visibility into franchisee revenues, margins, profits, conversion rates and other business metrics. NetSuite’s unified business management platform is also an attractive selling point for franchisees, who embrace the system for visibility and automation. Streamlined local tax reporting. NetSuite OneWorld’s unique local Philippines tax compliance functionality enables Motech to perform financial reconciliations and meet local tax calculation and reporting requirements at greater speed. Improved inventory and warehouse management. Motech now has zero inventory variance in company-owned stores and two warehouses, compared to writing off between P200,000 and P500,000 in inventory annually before NetSuite. Motech has also reduced the time needed for inventory reconciliation across its nearly 3,000 SKUs from five days of physical inspection at company-owned stores and its warehouses to a single day. Superior order management. NetSuite's order management has given Motech and its franchisees a real-time view into what is in stock, on order or on backorder at any given time, helping to eliminate manual bottlenecks and errors, and creating a smooth flow from sales quote to approved order for its six company-owned stores and 28 franchised stores. 
Motech also uses the functionality to encode the services and parts needed by each customer, using the data to analyze sales order information and project future parts and orders. Quick disaster recovery. NetSuite also helped Motech after a super typhoon hit the Philippines, flooding some of its branches. As a cloud-based solution, NetSuite enabled Motech to survive the disaster and subsequent flooding without damage to on-premise hardware or vital paper records. At the same time, Motech used its extensive customer database to reach out to customers in areas affected by flooding, offering needed services to get vehicles back on the road. Today, more than 24,000 companies and subsidiaries depend on NetSuite to run complex, mission-critical business processes globally in the cloud. Since its inception in 1998, NetSuite has established itself as the leading provider of enterprise-ready cloud business management suites of enterprise resource planning (ERP), customer relationship management (CRM), and ecommerce applications for businesses of all sizes. Many FORTUNE 100 companies rely on NetSuite to accelerate innovation and business transformation. NetSuite continues its success in delivering the best cloud business management suites to businesses around the world, enabling them to lower IT costs significantly while increasing productivity, as the global adoption of the cloud accelerates.
### KEMP enhances application delivery management with KEMP360
KEMP Technologies has previewed its new KEMP360 to help enterprise IT organisations simplify and streamline their application delivery management. The company has also launched an early access program. The KEMP360 framework solves common technology challenges by giving IT staff visibility and control of application delivery, along with centralised management and monitoring across KEMP's complete portfolio of ADC technologies. This also helps application owners to meet key business objectives such as operational cost reduction, SLA delivery and optimised quality of experience (QoE) for end users. By enabling proactive and early detection, diagnosis and resolution of issues, organisations can refocus their resources on other areas, while optimising application availability and performance. With support for application services running in traditional datacentre deployments or in public clouds such as Microsoft Azure, AWS and VMware, KEMP360 simplifies the monitoring, management, measurement and optimisation of application delivery across the enterprise, while also supporting customers migrating to virtualised and cloud environments. This is achieved through a single pane view of key metrics related to capacity utilisation, uptime and performance. The Central™ component of KEMP360 simplifies day-to-day application delivery management with comprehensive ADC and service management, log collection and administration. At the same time, KEMP360 Vision™ reduces the impact of application delivery performance issues with proactive monitoring, alerting and automated workflow triggering for remediation. "Our closed beta program for KEMP360, which ran through Q3 '15, helped us to learn a lot about our customers and identify interesting trends," said Jason Dover, Director of Product Line Management, KEMP Technologies. "Overwhelmingly, we found our enterprise and Service Provider customers hyper-focused on improving responsiveness and agility within their IT organisations – a key enabler being cloud adoption for new and existing app deployments.
KEMP is committed to enabling customers to accomplish these goals and the KEMP360 management framework will be integrated as a key part of this strategy during 2016.” For more information about KEMP360 and to register for the early access program, visit https://kemptechnologies.com/kemp360-early-access . ### ‘aaS’ solutions; cloud’s the limit The emergence of an as-a-service (aaS) economy is disrupting the traditional IT and business services industry and according to a recent report from Accenture and industry analyst firm HfS Research, organisations must transition to ‘aaS’ in order to remain truly competitive. A very different model to one the IT industry has historically embraced, ‘aaS’ refers to an increasing number of services that are delivered over the internet rather than provided locally, or, onsite. From Platform-as-a-Service (PaaS) to Software-as-a-Service (SaaS), these internet-delivered models are helping to give companies the flexibility to customise their computing environments and craft the experiences they desire, on-demand. [easy-tweet tweet="Historically the IT industry hasn't embraced aaS, but things are changing" user="Insight_UK @comparethecloud" usehashtags="no"] [caption id="attachment_33124" align="alignright" width="300"] Spotify[/caption] Some of the most well-known ‘aaS’ products include innovative transportation app ‘Uber’ and music sharing platform ‘Spotify’. What these businesses have in common is that essentially, there is no ‘product’ – the former, a taxi service that does not own any cars, the latter, a music streaming platform which negates the physical purchase of music. [caption id="attachment_33125" align="alignright" width="300"] Uber[/caption] A great example of a product evolving to meet the needs of the modern workplace is Office 365. It began as a floppy disk, then a CD-ROM, and now the whole Microsoft Office suite is available online for download. The value of ‘aaS’ So what are the advantages of this product-less model? The significance of an ‘aaS’ economy lies in three key areas: accessibility, scalability and infinite life span. For customers to be able to log-on from any location, at any time, from any device, is one of the most valuable assets of the ‘aaS’ solutions portfolio. As the traditional structures of working continue to break down as the workforce becomes increasing mobile – supported by the consumerisation of IT – it becomes essential for businesses to implement solutions which cater to this. Initiatives such as Choose Your Own Device (CYOD) and cloud-based software (SaaS) come under this umbrella, and looking ahead, as society becomes increasing connected through the growth of the Internet of Things (IoT), it is likely lead to the introduction of more ‘aaS’ solutions and the evolution of existing ones. [easy-tweet tweet="AaS solutions are fully scalable, organisations can be confident they will not outgrow it" user="insight_UK" usehashtags="no"] ‘AaS’ solutions are also fully scalable and organisations forecasting growth or looking for agility from their networking; storage or other computing resources can benefit from this ‘pay-as-you-go’ style model – confident that they will not outgrow it in a short timeframe. Services can be accessed as and when they are needed with the option to turn up or turn down capacity, dependent on business needs and demands. Moreover, ‘aaS’ is proving itself a remedy to the CIO headache of legacy technology. 
Its ability to evolve, adapt and grow as the IT landscape innovates and business requirements change, means it can cater to both the demand of today and the possibility of tomorrow. The benefits of ‘aaS’ are certainly appealing... Where do we go from here? The benefits of ‘aaS’ are certainly appealing but, whilst more than half of senior operations leaders view as-a-service as critical for business value, two thirds of enterprises are not prepared for an ‘aaS’ economy. In order to take advantage of these benefits it is vital for organisations to have the right business strategies in place internally, to ensure that they are using ‘aaS’ solutions to their full capabilities. Clear objectives and a real partnership between customer and technology partner are key factors here. Once these elements are in order, the opportunities are endless.  In today’s fast-paced world, organisations should be looking to embrace solutions likes these – ones which allow them to innovate quickly, meet their customers’ needs and ultimately, stay competitive. ### An Interview with Xangati Over the past few weeks we’ve been sitting down to chat with some of the great minds of the cloud industry. This week I had some time with Atchison Frazer of Xangati. Atchison stressed there is still a sense of wariness among the enterprise market when it comes to the cloud and virtualisation, read more below about his thoughts on cloud and Xangati, who he joined a year ago. [easy-tweet tweet="Read CTC's interview with #Cloud expert @AtchisonFrazer of @Xangati" user="comparethecloud" usehashtags="no"] Atchison Frazer, CMO, Xangati Atchison Frazer brings over 20 years of IT strategic marketing expertise to Xangati, most recently as CMO of KEMP Technologies, an emerging growth software developer competing in the enterprise Application Delivery Controller segment, and prior to that, CMO of Gnodal Ltd (now part of Cray), an innovator of High Performance Computing data-fabric technology and High Frequency Trading fintech infrastructure.   Atchison, can you give me an overview of Xangati?  AF: Xangati analyses IT infrastructures in a 360-degree, holistic, live and continuous manner. It not only generates visibility both to the metrics that are emanating from desktop services, network, compute and storage, but also connection brokers and other IT services in the network. Xangati exclusively leverages an in-memory analytics platform that ingests data at very high speeds and scales to large environments, without the need to deploy agents or probes. The company also correlates network data to cross-reference the interdependencies of flows and metrics processed by our analytics engine, enabling superior offerings to our customers. The notion of service assurance involves collecting data, monitoring, trending and performance analytics to provide actionable insights. This includes, for example, real-time analytics as issues materialise, such as post-mortem analytics of historic metrics, gathered when a customer pushes a button to record what's happening in their environment, and predictive analytics to prevent performance contention storms in the future. What do you think of the Cloud Industry at present? 
AF: We still see caution among the large, global enterprise market, the segment Xangati mostly serves; many of these organisations have only recently in the last 2-3 years deployed large virtualisation environments, whether that be VMware vSphere or Citrix XenServer, and are just now beginning to optimise those virtualised infrastructure investments to extend to application virtualisation, such as for virtual desktop services. However, almost all of these organisations, have hybrid-cloud infrastructure extensibility in their build-out plans, in addition to more transformational events such as software-defined networking, network functions virtualisation and container-based platforms. What are Xangati's main strengths? AF: Because Xangati essentially taps into existing flow-based technologies such as NetFlow for the Cisco estate or AppFlow when Citrix NetScaler is load-balancing web servers, for example, and the fact that the Xangati Virtual Appliance (XVA) can run natively within the customer’s hypervisor, cloud or container platform of choice, we’re the most flexible infrastructure performance management tool available, whereas many other competitors must rely on cumbersome provisioning of agents or probes to achieve the same level of data granularity delivery by XVA. Also, Xangati uniquely generates a cross-reference mesh to all functional components and objects in the virtual infrastructure, then overlays that grid with a storm-tracker utility that provides root-case analysis and triage-style troubleshooting for code-red performance degradation issues. [easy-tweet tweet="Xangati uniquely generates a cross-reference mesh to all functional components and objects in the virtual infrastructure" user="atchisonfrazer" usehashtags="no"] What do you think sets Xangati apart in a competitive cloud marketplace? AF: Given Xangati’s DNA in the virtualisation layer, we’re well-poised to take advantage of cloud service assurance analytics required to ensure that enterprise systems administrators have an independent source of truth, essentially one single pane of glass, to hold accountable internal and external service providers, and conventional silo counterparts, for meeting service-level targets. Could you describe Xangati's ideal client for me? AF: The ideal client for Xangati is an enterprise systems administrator (or a large public sector entity sysadmin) who is the licensee for virtualisation services from VMware, Citrix or Microsoft, and the managed/cloud/hosting services provider system admin is also running hypervisor-based virtual infrastructures and/or offering VDaaS (virtual desktop-as-a-service), as a good example. The other main buying centre for Xangati is app administrators who are responsible for virtualised desktop services leveraging VMware Horizon or Citrix XenApp/XenDesktop. Could you tell us a bit about some of your current clients? AF: XVA has been deployed in more than 400 customer organisations, ranging from mid-market to large enterprise, large public sector, and managed service providers. In the UK, we work with the National Health Service, British Gas, Sky, Colt and many others in concert with our go-to-market partners, such as Triangulate IT. Where you see your company heading in the future? 
[easy-tweet tweet="The future of @Xangati is very much tied to the hybrid-cloud market as well as the development of third-wave platforms" via="no" usehashtags="no"] AF: The future of Xangati is very much tied to the hybrid-cloud market transition as well as the development of third-wave platforms that transcend the hypervisor orthodoxy, such as containerisation and hyper-converged infrastructures. As these multiple tiers are layered over the on-premise physical and virtual infrastructure environments, visibility to performance degradation or v-storm contention issues becomes even more complex, especially for storage IOPS latency or anomalous network traffic patterns, and so with our in-memory architecture and auto-ingest capability of protocol/API metrics, we’re able to deliver pinpoint-accurate intelligence about how to optimise faster (live, second-by-second, continuous high fidelity stream) and with greater scale. ### 15 Powerful Quotes from George Osborne's Announcement to Increase U.K. Cyber Security Spending Addressing the Government Communications Headquarters (GCHQ) Tuesday, George Osborne, Chancellor of the Exchequer, announced a £1.9 billion investment in cyber security by 2020. In the wake of the horrific attacks in Paris, Osborne pledged solidarity with the nation of France as he shared details of a National Cyber Security Plan that had already been in the works. The new U.K. program will establish a National Cyber Centre (NCC) and increased support for the U.K.'s National Cyber Crime Unit.  [easy-tweet tweet="George Osborne, Chancellor of the Exchequer, has announced a £1.9 billion investment in cyber security by 2020" user="followcontinuum" usehashtags="no"] In his speech, Osborne outlined his five-step plan to make cyber security the U.K.'s top priority. While the government will play the largest role in this effort, service providers should take this latest program to heart. With a greater focus on not just containing, but combatting cybercrime, there will be more opportunities to receive the financial support you need, while closing the widening IT skills gap that's kept you from growing and scaling your businesses. During Osborne's speech, there were many poignant moments particularly relevant to U.K. IT solutions providers. We've grabbed the 15 we found to be the most relevant! A National Emphasis on Cybercrime Prevention 1. "Getting cyber security right requires new thinking. But certain principles remain true in cyberspace as they are true about security in the physical world." 2. "Citizens need to follow basic rules of keeping themselves safe – installing security software, downloading software updates, using strong passwords." 3. "Companies need to protect their own networks, and harden themselves against cyber attack." [easy-tweet tweet="Companies need to protect their own networks, and harden themselves against #cyberattack" user="followcontinuum" usehashtags="no"] 4. "The starting point must be that every British company is a target, that every British network will be attacked, and that cyber crime is not something that happens to other people." 5. "Over the past 18 months, for example, GCHQ has helped U.K. law enforcement tackle a number of high-profile operations against pernicious cybercrime malware threats, like Dridex, Shylock and GameOver Zeus. These have cost U.K. citizens and companies and government departments millions of pounds in the form of fraud, theft and damage[.]" 6. 
"I can tell you today that right now GCHQ is monitoring cyber threats from high end adversaries against 450 companies across the aerospace, defence, energy, water, finance, transport and telecoms sectors." 7. "In protecting the U.K. from cyber attack, we are not starting from zero." [easy-tweet tweet="In protecting the U.K. from cyber attack, we are not starting from zero." user="followcontinuum" hashtags="cybersecurity"] 8. "The truth is that we have to run simply to stand still. The pace of innovation of cyber attack is breathtakingly fast, and defending Britain means that we have to keep up." 9. "It is easier and cheaper to attack a network than it is to defend it. The barriers to entry are coming right down, and so the task of the defenders is becoming harder." 10. "Last summer GCHQ dealt with 100 cyber national security incidents per month. This summer, the figure was 200 a month. Each of these attacks damages companies, their customers, and the public’s trust in our collective ability to keep their data and privacy safe." 11. "But the Prime Minister, my colleagues at the top of government and I have decided that we have to make a top priority of cyber security, if Britain is to be able to defend itself, now and in the future." 12. "Internet service providers already divert their customers from known bad addresses, to prevent them from being infected with malware. We will explore whether they can work together – with our help – to provide this protection on a national level." 13. "[W]ith the right systems and tools our private Internet service providers could kick out a high proportion of the malware in the U.K. Internet, and block the addresses which we know are doing nothing but scamming, tricking and attacking British Internet users." [easy-tweet tweet="With the right systems and tools our private Internet service providers could kick out a high proportion of the malware in the U.K. Internet" user="followcontinuum" usehashtags="no"] 14. "We will never succeed in keeping Britain safe in cyberspace unless we have more people with the cyber skills that we need. This year’s Global Information Security Workforce Study estimates that theglobal cyber security workforce shortage will widen to 1.5 million by 2020." 15. "If we do not act decisively, the skills gap will grow, and limit everything we want to achieve in cyberspace. So we will launch an ambitious programme to build the cyber skills our country needs, identifying young people with cyber talent, training them, and giving them a diversity of routes into cyber careers." Osborne's full speech can be found here. [easy-tweet tweet="If we do not act decisively, the skills gap will grow, and limit everything we want to achieve in cyberspace." user="followcontinuum"] ### The secret sauce for international enterprises: technology – your friend in international trade It can feel overwhelming to think about working overseas, committing to contracts with partners, considering operating in new languages… and all the bureaucracy required to make it happen. Perhaps this are why the UK has historically lagged in exporting over recent years, but there is a huge range of support available to help businesses overcome such concerns. [easy-tweet tweet="The UK lags behind Germany, France, Netherlands, Italy and Belgium for intra-EU trade" via="no" usehashtags="no"] Drilling into the matter further, right now the UK is currently only 6th in the league table of intra-EU trade. We lag behind Germany, France, Netherlands, Italy and Belgium. 
In fact, UK export value within the 28 state EU actually contracted two per cent in the period from 2002 to 2013. For all but two member states, Ireland and the UK, the value of exports of goods to partners in the EU increased between 2002 and 2013. As examples, Germany increased by over 50%; the Netherlands by over 84 per cent; and Belgium by over 43 percent. The largest negative intra-EU trade in goods balances are recorded for France (just under EUR 89 billion) and the United Kingdom with nearly EUR 79 billion (European Commission, 2015). So in relative terms the UK has been losing market share with our nearest neighbours. Adding to the picture, Sage’s own research, launched alongside Exporting is GREAT, found that 75 per cent of the UK’s small-to-medium-sized enterprises are not currently selling overseas – and have no plans to. We questioned 400 firms and found that seven per cent export to Europe alone, 16 per cent export globally, but the rest do not export or have any intention of opening up internationally. Discovering that of our thousands of customers, only a quarter export, was a shock. Overcoming the fears that hold potential exporting enterprises back is crucial to the economic wellbeing of the UK as a whole. For an individual enterprise it can provide a huge boost to growth that can compound profit on profit and allow a business to rapidly reinvest into the business. [easy-tweet tweet="A technology partner can help UK businesses to overcome barriers preventing them from exporting" via="no" hashtags="cloud"] Whilst the government can assist with financial and market advice, a technology partner can help businesses overcome other barriers preventing them from exporting, through the use of technology solutions that simplify complexity and fight the fear. Your technology partners can advise what your business needs now and prepare you for the shape of your future business too. Here are a few challenges that can be overcome with smarter technology. How will my business communicate with employees and partners abroad? Multi-language business management solutions allow you drive the necessary administration of a multi-site, multi-country business through language barriers. That way your shipping functions can communicate with finance who can invoice manufacturing… and so on. If the money and goods keep moving on time, your business services customers on time. This only happens when all parties understand each other. How can my business manage international money movements? the software can take the strain Once again, the software can take the strain. The right business management solution can operate in multiple currencies to support the enterprise’s global customer base. If you choose a system with a product configurator tool you can greatly simplifies the design and pricing of product lines. Rather than manually changing figures and hitting the calculator keys, using the automation in smarter software can convert currency amounts in moments so you always understand the health of the business. If you will need to handle transfers and reporting from one country to another it’s useful to have a real-time (linked to the web) global vision while responding to local operational requirements. A strong local connection in the form of a cloud vendor with an international reach can also prove helpful, aiding businesses new to the market to easily assimilate into the local work culture, language, and regulations for compliance. How can I stay connected with my customers across time zones? 
When working across multiple time zones, the need to be online around the clock can be challenging. This pain point is particularly palpable for businesses that are newly setting up overseas that may not have on-ground staff to field requests. This is where moving to the cloud can keep the business running full-time, without additional staff costs normally required to ensure local infrastructure is always running. moving to the cloud can keep the business running full-time How can a global workforce stay in synch? Most businesses of a certain size will already be using mobility solutions based on the cloud, though many have yet to move from older enterprise resource planning solutions to more modern business management tools either based in the cloud or via ‘hybrid’ working solutions. With a cloud computing solution, hosted by a technology partner, company data can be secured behind state of the art security whilst becoming accessible to all employees and partners via any internet connected device, from tablet or smartphone to PC. That can make a huge difference in making faster deals and responding to new opportunities, connecting teams across the world with no lag. In fact, given the benefits that can specifically benefit exporting, is begs the question: Is cloud a technology or a business solution decision? [easy-tweet tweet="Is cloud a technology or a business solution decision? What do you think? " via="no" usehashtags="no"] ### Why more companies are setting their sights on an invisible data centre When Italian bank, Credito Valtellinese admits to forgetting the very existence of the Nutanix technology which powers the organisation’s IT infrastructure, it is a rare example of a positive oversight. Because in a similar vein to a good waiter, IT infrastructure that is doing its job, shouldn’t really be noticed at all, instead ticking over seamlessly in the background, only to command attention when there is some kind of disruption to the operation. [easy-tweet tweet="IT infrastructure that is doing its job shouldn’t really be noticed at all" user="comparethecloud" usehashtags="no"] It’s the reason why identifying the most transformative, robust and resilient solutions which can keep critical systems running 24/7 has become the Holy Grail for many businesses in the never ending quest to optimise efficiencies. And this scrutiny and drive is heralding a new era for the data centre environment - one that is on the cusp of entirely invisible infrastructure. Leading the charge is San Francisco-based Nutanix, the pioneer of converged infrastructure, whose raison d’etre has been to transform the entire make up of IT infrastructure by driving down cost and complexity and rethinking the approach to storage and compute for a virtualised world. Within a short space of time, this radical approach has become the go-to solution for over 1,700 of multi-sector enterprises from banks, airlines and social housing organisations to one of the world’s largest privately-owned health clubs. All are reaping the benefits of a configurable and scalable network which can evolve seamlessly with business needs, shape new business models and exploit new ways of working. 
In driving this architectural approach, known as web-scale IT, Nutanix took the lead from the global cloud services of Google, Amazon and Facebook, who in their mission to deliver IT services to millions, realised they would have to devise their own software-defined, hugely scalable and resilient IT infrastructure rather than relying on costlier and more restrictive proprietary solutions and custom hardware. As the benefits of the new approach became more commonly known, Nutanix soon emerged as the go-to solution equipped to bring web-scale IT to the masses and fuel more mainstream adoption with affordable and tailored packaged solutions to fit business requirements. It's a process which began with the launch of the flagship hyperconvergence software-centric architecture six years ago. By integrating compute and storage into one commodity hardware box, this marked the first major milestone on the journey to data centre invisibility, with the server hidden for the first time and the three-tier SAN architecture eliminated. the infrastructure’s capacity and performance can be enhanced simply by adding additional computing nodes A core differentiator established from the outset, and which remains a major draw, is the ability to transform the usually complex process of network expansion through an easy to use, scalable ‘pay as you grow’ model. Here, the infrastructure’s capacity and performance can be enhanced simply by adding additional computing nodes with no need to replace or upgrade existing servers, and all in a matter of minutes, as opposed to the days that traditional architecture would take. And now things are moving up a gear as virtualisation joins the server in becoming invisible with the launch of the Nutanix Xtreme Computing Platform, which raises the IT enterprise bar even further. Comprising two stand-out solutions, Nutanix Prism and Nutanix Acropolis, the software integrates virtualisation into the platform to simplify operations and deployment. And it represents an industry first through the flexibility afforded to IT professionals, who are empowered to make infrastructure decisions based on the requirements of the application, be it a traditional or emerging hypervisor. Indeed, this drive for an ever more versatile solution to negate the restrictions that come with a proprietary vendor extends even further, accelerating multi-hypervisor adoption, which flourishes in a network where complexity is reduced, as competing solutions drive greater commoditisation in this market. Indeed, for the diverse range of operating environments now embracing Nutanix solutions, enhanced agility is one of the defining characteristics, not only leading to significant cost savings and efficiencies, but also driving new service delivery and ways of working as a result. It's a case in point at Sondrio-based Credito Valtellinese, where Nutanix has brought virtualisation to the bank's data centres in Sondrio and Milan to better respond to complex processes such as acquisitions that have demanded an ever more agile approach to delivery. Nutanix Xtreme is powering a new era in an organisation with a heritage dating back to 1908, as it transitions from a focus on a physical branch environment to a more mobile way of both banking and working, as customers gain virtual access to the bank's services while staff work on the move via notebook and virtual desktop solutions.
A different domain, but the impact is proving to be just as transformative at health club Fitness First where six figure cost savings are on the cards having replaced the complex and expansive legacy data environment with a hybrid cloud using Nutanix converged infrastructure. The move has seen the previous several hundred servers culled to just two Nutanix hyper converged storage and server racks saving cost and space and leading to far greater flexibility. The result is a business that is far better placed to respond to the level of IT demands which have escalated in line with business growth, beginning with the one club in London in 1993 to a business which now has 1 million members and 370 clubs in 16 countries. An invisible future has never been so clear. ### IBM Europe Cloud Leaders #EUCloudChat Thursday 26 Nov @2pm GMT #EUCloudChat A new sense of urgency is taking hold as enterprises face the implications of digital transformation. Aligning company’s resources and strategic direction to respond to exceptionally dynamic and unpredictable forces in the competitive environment becomes a priority. [easy-tweet tweet="Join @IBM for the #EUCloudChat on Thursday 26th from 2-3pm GMT" user="comparethecloud" usehashtags="no"] Cloud is the enabler of business transformation. Enterprises understand the economic benefits of a flexible, virtualised infrastructure and a shift in cost structure from capex to opex. Yet the market is responding in its own time. CIOs wait to see proven business cases deployed by their peers, then approach migration in stages, based on immediate business priorities and other commitments. There are a lot of cloud providers to choose from. How does the enterprise find the right partner? Is a hybrid cloud, which involves a bit of both public and private cloud, a strategic decision or a way to avoid hard choices? Hacks and breaches make headlines. Is hybrid cloud safe? Is it better to use data centres here in Europe? Why? We hear a lot about the importance of standards and open source in cloud. Do they really matter more here than elsewhere in IT? We’ll discuss the hot topics of the European cloud market with the analyst community, CIOs, customers, consultants, academics, bloggers and other influencers of the cloud ecosystem. Hot topics: business transformation and agility, APIs, open source, public/private/hybrid cloud, security, EU regulations and cloud data localisation. The panel of cloud computing experts includes: Paul Miller, moderator: Senior analyst, Forrester Research @PaulMiller Sebastian Krause, VP IBM Cloud Europe @SebKrause Rashik Parmar, Lead Cloud Advisor, IBM Europe / Distinguished Engineer @RashikParmar Stefano Stinchi, VP IBM Software Cloud Europe  @stefano_cloud Bo Soevsoe Nielsen, SoftLayer Leader, IBM Europe   @bosoevsoe Several other IBM Cloud leaders from across Europe and the UK will also be online chipping in to answer any queries. ### Private/Public Nirvana: the combination of private cloud and public data centres A look at how WAN choke points, and the need to future proof private cloud investment suggests that public data centres or colocation centres rather than public cloud are the immediate evolution for on-premise data centres. Holiday makers can manage their own transport, driving all the way to their destination if necessary, but for international travel most choose to drive as far as the airport and make use of the wide variety of lost-cost, high-speed air services available from there. 
Others get a cab to the airport to save the hassle of driving at all, and some first class air travel even includes a limo service to take you to the airport. It just makes sense. These days, just as travellers don't necessarily need to drive themselves anywhere, companies adopting private cloud are finding that they don't necessarily need their own data centres. [easy-tweet tweet="Companies adopting private #cloud don’t necessarily need their own data centres" user="apj12 @comparethecloud" usehashtags="no"] Public, private and hybrid cloud Most enterprises are adopting a hybrid cloud strategy in order to make the most of the respective benefits of public and private cloud. For some workloads public cloud is not only the most cost-effective option, but it also offers rapid access to a vast array of available services, such as all those on the AWS marketplace. private cloud is used for workloads where you need control over the compute environment Conversely, private cloud is used for workloads where you need control over the compute environment for technical reasons (especially where legacy platforms or architectures are involved), or where proximity is important (such as for data-intensive applications where compute power needs to be near big data stores), or for reasons of protection, risk aversion, compliance and regulation. The question remains: which public cloud provider to go for, and where to site your private cloud. Private cloud in a private data centre The main argument for using private data centres appears to be inertia. Companies want to continue using and gaining a return from existing facilities and infrastructure. Once virtualised and optimised for cloud services, existing infrastructure can be, and often is, used to host private cloud. However, over time the following issues will become increasingly evident: The WAN bottleneck A wide area connection from the private data centre is required to interconnect with other corporate data centres as well as with public cloud services. There is a limited choice of carriers and connectivity options, and unless bandwidth is adequate this topology can create a bottleneck. Future proofing If, in the longer term, greater use of public cloud or colocation services is likely to lead to the eventual closure of the private data centre, then any further investment in this facility will eventually need to be written off. The move to private cloud can be a catalyst to relocate existing equipment sooner rather than later in order to optimise provision from within a public data centre or regional colocation centre. Once relocated, the migration from legacy to new infrastructure within the colocation centre can be phased in, in order to mitigate risk and optimise cost and performance. Private cloud in a public data centre or regional colocation centre With more than 1,000 carriers worldwide interconnecting in more than 2,000 colocation centres, one of the best ways for enterprises to leverage the available global fibre infrastructure is to locate within a regional colocation centre and then take advantage of the variety of major networks available from these locations.
[easy-tweet tweet=".@GTTcomm provides a high-speed, cost-effective on-ramp to the #cloud" user="apj12 @comparethecloud" usehashtags="no"] An enterprise opting to locate its private cloud services in a regional colocation centre and then connect to additional public cloud services from there will find that many of these cloud services are available from within the colocation centre itself or from a peered location with a very low latency connection. Equinix is an example of a colocation provider that offers a network hub service called Performance Hub. It provides direct Ethernet connectivity to IaaS cloud providers, such as Amazon Web Services (AWS) and Microsoft Azure. Access from the main enterprise sites to the colocation centre will be dependent on getting high-speed, cost-effective network links either direct or via a carrier hub. GTT is an example of this, providing a high-speed, cost-effective on-ramp to the cloud and then extending the reach to other colocation centres where other CSPs might be present. In this way, the chosen colocation centre need never be just an end destination but rather a hub from which to travel onwards. Further considerations: Mobile, remote and external access If a significant proportion of your traffic is from either mobile or remote workers then high-quality user experience can be maintained and network latency minimised from a colocation centre with direct connections to major carriers. Likewise if external client access is critical. Video and converged media The network requirements to support high-bandwidth, low-latency voice and video applications are increasing as are the converged services offered by carriers that link directly to major colocation facilities. [easy-tweet tweet="Consider the option of hosting your private cloud in a public data centre says @apj12" user="comparethecloud" usehashtags="no"] Before sinking any further investment into corporate data centres that may be redundant in a matter of years, especially if you’re about to embark of a large private cloud implementation, consider the option of hosting your private cloud in a public data centre or regional colocation centre. You wouldn’t normally drive all the way to a foreign holiday destination, indeed you might not even want to drive as far as the airport. Well you don’t have to. With GTT as your on-ramp to the cloud and with a 1,000 carriers worldwide that interconnect in more than 2,000 colocation centres, there is a compelling case for closing some, if not all, of your data centres and opting for the ‘Private/Public Nirvana’: a combination of private cloud and public data centres with links to everything beyond. ### Strengthening Container security Initiatives While the new world of IT offers a radically new way for businesses to innovate, it brings a new set of vulnerabilities due to lack of visibility, proper controls and security. And, with the vast amount of security breaches taking place today due to vulnerabilities in open source components, such as example Heartbleed, Shellshock, and Poodle, organisations are increasingly focusing on making the software they build more secure. Addressing the constantly changing landscape of open source security threats can seem a never-ending process. As organisations increasingly turn to containers to improve application and agility, vendors have been plugging security holes while attempting to expand flexibility for container deployments. 
[easy-tweet tweet="Vendors have been plugging security holes while attempting to expand flexibility for container deployments" via="no" usehashtags="no"] How container security currently works Container providers’ main focus today is to use encryption to secure the code and software version running in Docker users’ software infrastructure to protect users from malicious backdoors in shared application images and other potential security threats. However, this method covers only one aspect of container security, excluding whether software stacks and application portfolios are free from unknown, exploitable versions of open source code. Docker Content Trust only ensures Docker images contain the exact same bits that the developer originally put there, but does alert users of any vulnerabilities already present in the open source components. In fact, a current study by BanyanOps found that more than 30 percent of images in Docker Hub are highly susceptible to a variety of security attacks including Heatbleed and Shellshock. Security Risk 1 - The new threats to old versions Knowing that the container is free of vulnerabilities at the time of initial build and deployment is a necessary but insufficient requirement. New vulnerabilities that can easily impact older versions of open source components are being constantly discovered. An informed open source technology that provides selection and vigilance to users is essential to avoid this. Security Risk 2 – Data sensitivity and container location The security risk posed by a container also depends on the sensitivity of the data that’s being accessed by it, as well as the location in which the container is deployed. The security risk posed by a container also depends on the sensitivity of the data that’s being accessed by it Whether the container is deployed on the internal network behind a firewall or whether it's internet facing will affect the level of security risk. Containers deployed on an internal network behind a firewall, for example, won't be exposed to a publicly available attack.  For this reason it is critical to be aware of where your open source software is located, whether the code exhibits security vulnerabilities and whether a sharp open source profile exists. In other words, having visibility of the code inside containers is critical to container security. Will this slower down container adoption? Analysts hold diverse opinions as to whether concerns over security will slow down container adoption. Business necessity is likely to prevent container adoption from slowing down, as containers have proven to provide many benefits to businesses. These include improved scalability, fewer errors, faster time to market and simplified application management. Dave Bartoletti, principal analyst at Forrester Research, believes security concerns won’t significantly slow container adoption: “With virtualisation, people deployed anyway, even when security and compliance hadn’t caught up yet, and I think we’ll see a lot of the same with Docker,” Meanwhile, Adrian Sanabria, senior security analyst at 451 Research, believes enterprises will give containers a wide berth until security standards are identified and established. “The reality is that security is still a barrier today, and some companies won’t go near containers until there are certain standards in place”, he explains. 
[easy-tweet tweet="The presence of vulnerabilities in all types of software is inevitable says @bill_ledingham" user="comparethecloud" usehashtags="no"] Whatever the case, what is clear is that security remains a concern as application container deployment ramps up. The presence of vulnerabilities in all types of software is inevitable, and open source is no exception. As security and other gaps in the container ecosystem are filled, organisations are best served to take advantage of the automated tools available to gain control over all their software infrastructure elements, including containers. Taking the lead in setting up a strong application security strategy is imperative to all business and one that they should proactively be seeking to deploy.  ### An Introduction to Continuum Every now and again we like to give you, our loyal readers, the lowdown on the companies that have caught our eye. I sat down with Paul Balkwell, European Sales Director at Continuum to chat about what he has been up to recently at Continuum’s European Headquarters in the Thames Valley. Paul has an abundance of experience in the tech industries, 23 years to be exact. [easy-tweet tweet="Read CTC's chat with #cloud expert Paul Balkwell from @FollowContinuum" user="rhian_wilkinson" usehashtags="no"] Paul has built successful growth strategies for SaaS and IT Managed Services organisations, including the setup and delivery of a multi-million pound recurring revenue IT Managed Services business that, after 12 successful years, was acquired by a leading UK telecoms provider. Prior to joining Continuum, he was Head of Managed IT Services at IP Integration in the UK. Paul also served as New Business Sales Manager at Business Relations International for several years, and was elected to CompTIA’s  UK Channel Community Executive Council in 2015. Suffice to say, he is more than qualified to be one of our featured experts! Paul Balkwell, European Sales Manager, Continuum Managed Services Paul, can you tell me a little about Continuum? PB: Continuum are a 24/7 channel only white label Remote Monitoring and Management (RMM) organisation. Unlike a lot of the RMM providers who are a software only, we are unique, as we have approximately 650 engineers that can resolve up to 90% of all issues as well as carry out all routine maintenance and patching on a “pay-as-you-grow” model. We don’t tie our partners in to lengthy contacts, just a monthly rolling agreement. Basically we take away all the “daily grunt work” to free up internal resources to concentrate on other revenue making activities. So what does that mean for your clients?  PB: With most RMM providers that are software only, you might come in the morning to a couple of hundred alerts that your engineers have to work through. With Continuum, you might have 8 resolved tickets detailing what the issue was and what steps we took to fix it. Our partners also have the ability to pass issues to our engineering teams offering a true extension to their internal support teams. Now for the one we always ask, what do you think of the Cloud Industry at present? PB: It’s an exciting and growing market though I still think we are in a hybrid world where both business and home users have a mixture of onsite and cloud services. [easy-tweet tweet="The cloud industry is an exciting and growing market says Paul Balkwell of @FollowContinuum" user="rhian_wilkinson" usehashtags="no"] What would you say are Continuum’s main strengths? 
PB: Being channel only and combining our Remote Monitoring and Management software platform with our Network Operations Centre, populated with 650 engineers, makes us a great vehicle for Technology Managed Services Providers to scale and grow recurring revenues on a pay-as-you-grow model.
What do you think sets you apart in the competitive cloud marketplace?
PB: The fact that we are not just a “Software as a Service” offering; we offer a pay-as-you-grow “Service as a Service” model, giving our partners the freedom to add, change and modify their offerings without the burden of being tied into a long-term contract. This differentiates us, and ensures our clients are getting the level of service they require.
Can you describe Continuum’s ideal client for me?
PB: We like to consider our clients as partners to Continuum's offerings. Really they are any organisation offering Technology Managed Services, from SME to Enterprise, including IT Managed Services, Break Fix, Telecoms, Office Equipment and Data Centre providers that are looking to grow their business more efficiently. Our partners manage clients in many different aspects of business, ranging from the very small to the enterprise level of organisations. We enjoy having a broad spectrum of business sizes to work with; it keeps things fresh!
Where do you see Continuum heading in the future?
PB: In the future we're looking to continue our European growth and to continue offering additional services and revenue streams to our partners, including best-of-breed IBM SoftLayer, which we're really excited about, and a Managed Backup & Disaster Recovery Service (which is coming in Jan 16!), plus a Network Monitoring and Resolution Service covering firewalls, routers and switches. Continuum has an exciting and feature-packed future ahead!
Want to read more? Check out some of Paul's featured articles on Compare the Cloud by clicking the images below!
[caption id="attachment_22110" align="aligncenter" width="300"] Read about European Data Reform here.[/caption] [caption id="attachment_32880" align="aligncenter" width="300"] The Queen is Tech Savvy[/caption]
### UK start-up Klipboard announces $900,000 seed funding
Klipboard, the mobile app and web dashboard enabling businesses to manage their field-based staff digitally and become completely paper-free, has announced $900,000 of seed funding from a Seattle-based private equity group and a number of angel investors. The funding will be used to support Klipboard’s strategic growth in the UK. Klipboard is designed for companies with employees who work outside the office, allowing staff to complete and manage forms, reports and tasks via a tablet, anywhere and at any time. It enables businesses to digitally manage their employees in the field from the office, using the Klipboard web dashboard.
Draven McConville, CEO at Klipboard, said: “Klipboard enables businesses to become significantly more productive and efficient, by allowing staff who work outside of the office, to complete & manage forms, reports & tasks on their tablet, rather than scribbling on paper and wasting time re-writing. This seed funding reinforces the value of our product and brand, enabling us to support even more UK companies to become paper-free.”
Klipboard can help any business with employees who work outside the office, from organisations of 10 to 10,000. It is particularly useful for companies where compliance is key, i.e.
businesses with a stringent health & safety component, such as those in the construction, retail and facilities industries. The easy-to-use, consumer-style app enables staff to complete forms and reports and create and manage tasks, all in real time. For employers, the web dashboard ensures you can keep track of the work your employees are doing, as all activities can be viewed and managed digitally. Quick to implement for any business, Klipboard can be up and running in 15 minutes, irrespective of the number of employees.
### Business Continuity? Check! Follow these tips to plan and execute an effective disaster recovery plan
The vast majority of disaster movies depict the overtly dramatic fallout of catastrophes, both natural and manmade, such as entire city skylines going up in flames or the earth ripping open to swallow every object and human being in its path. What they overlook are the quieter, yet just as damaging, events, such as the decimation of entire data centres. Much like disaster movies have been around for ages, disaster-related IT system outages are nothing new. However, the severity of their consequences has soared to epic heights in the age of rapidly mounting data volumes. Losing a “small” percentage of data is no longer a “small” matter: losing even a few pieces of information can affect multiple business units and relationships with customers. [easy-tweet tweet="Much like disaster movies have been around for ages, disaster-related IT system outages are nothing new" via="VSI_DoubleTake" usehashtags="no"]
Given the havoc that data loss can wreak, it is natural to assume that most companies would create and test their business continuity plans and make disaster recovery (DR) a priority. Systems failure has certainly affected a significant number of businesses. Of the 3,076 respondents from across the globe who participated in Vision Solutions’ 2015 State of Resilience Report, 48 percent reported that their organisation had experienced a failure requiring DR to resume IT operations. Yet, astoundingly, 87 percent of respondents either had no DR plan or were not entirely confident the plan was complete, tested and ready.
Preparedness doesn’t have to be a tedious or overwhelming process
Factor in the heavy cost of downtime, and this lack of an effective business continuity plan seems even more self-destructive. IT operations are typically down one to two hours due to a failure. However, 57 percent of survey respondents reported downtime exceeding one hour, nearly a third of organisations lost a few hours of data and roughly a quarter lost more than a day of data. Of the respondents who indicated that their company performed a downtime cost analysis, 31 percent reported costs of more than $10,000. Preparedness doesn’t have to be a tedious or overwhelming process. Create a checklist for an effective business continuity plan using the steps below to secure the future of your data and the health of your company’s finances.
Form a Business Continuity Team
While the company IT team should make all employees aware of the business continuity plan and involve them in testing, business executives should also assign creation, testing and execution projects to dedicated staff. Since a DR plan is, in most cases, one piece of an overall business resilience strategy, the IT leader will be part of the overall team.
However, they should not stand idly by as another cog in the system; instead, the IT leader should step up and act as a leader, actively pursuing opportunities to demonstrate the plan’s value to business executives. [easy-tweet tweet="The IT Leader should not stand idly by as another cog in the system" user="VSI_DoubleTake"]
This distinction is important for a few reasons. First, leadership needs to support business continuity plans from the top down. Senior management should get involved both in the creation process and in decision-making on any proposed improvements. Management needs to demonstrate a willingness to review and test the plan.
Second, management should promote awareness of the plan and reinforce its importance across the company. Perception can make or break the execution of even the most meticulously designed plan, so employees must grasp the plan’s gravity and the impact of system failure and downtime on productivity. IT executives who communicate regularly with leadership about the availability of a business continuity plan will already have an advantage when a need arises to implement the plan swiftly, and empowering employees to understand that there are steps in place to navigate disaster by protecting the business-critical data they access every day speaks volumes about a company’s commitment to business continuity overall.
Finally, every IT employee in charge of a company-owned and managed server should play an active role in the planning and execution process, as the company must move all servers over in order to keep the business running. This process can become elaborate, as some servers may function on different operating systems or live in different databases. A platform-agnostic solution comes in handy in this scenario, facilitating movement of servers across all operating systems to avoid performance issues.
A platform-agnostic solution comes in handy...
Select a Solution that is Up to the Task
A platform- and storage-agnostic solution offers several benefits that can bolster a well-devised business continuity plan. Very few organisations run all of their operations on physical or virtual servers; the majority split operations between the two environments for optimal infrastructure efficiency. In an emergency, it is imperative that a DR solution offers users the ability to migrate to other platforms or into the cloud, or the effort will likely be in vain. Additionally, agile tools help users avoid vendor lock-in, which can literally “trap” valuable data on compromised software or equipment. Vendor lock-in can render even the best disaster recovery plan on earth powerless. Cloud DR can prove useful here, as companies can avoid some of the hurdles around recovery. [easy-tweet tweet=" Vendor lock-ins can render even the best disaster recovery plan on earth powerless" user="VSI_Doubletake" usehashtags="no"]
Data Recovery by the Second
When it comes to adequate data recovery, it is not enough to use snapshot technology, which records the current state of data every half hour to hour. There is simply too much new data moving through company systems now. The best solution is continuous data protection, which allows users to “time travel” and access files before they were lost, then copy and paste them into the current state so that operations can continue nearly uninterrupted. A continuous data protection solution records data in real time, offering a fine granularity to guard against loss (a simple sketch of the idea follows below).
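As a purely illustrative sketch of that “time travel” idea, consider a journal that records every change with a timestamp, so that state can be rebuilt as it stood at any moment before an error or deletion. Real continuous data protection products journal at the block or I/O level; the Python below only models the concept, and all names and values are made up.

```python
# Minimal sketch of the continuous-data-protection idea: journal every change
# with a timestamp, then replay the journal to rebuild state at any past moment.
# Purely illustrative; real CDP products work at the block or I/O level.

class ChangeJournal:
    """Changes are recorded in the order they occur (already time-sorted)."""

    def __init__(self):
        self._entries = []   # (timestamp, key, value) tuples; value=None means delete

    def record(self, timestamp, key, value):
        self._entries.append((timestamp, key, value))

    def state_at(self, timestamp):
        """Replay the journal up to `timestamp` to rebuild the data as it was."""
        state = {}
        for ts, key, value in self._entries:
            if ts > timestamp:
                break
            if value is None:
                state.pop(key, None)
            else:
                state[key] = value
        return state

journal = ChangeJournal()
journal.record(100, "appointment/42", "Dr Patel, 09:30")
journal.record(200, "appointment/42", "Dr Patel, 11:00")   # rescheduled
journal.record(300, "appointment/42", None)                # accidental deletion

# Recover the record as it stood just before the deletion
print(journal.state_at(299))   # {'appointment/42': 'Dr Patel, 11:00'}
```

The finer the journaling granularity, the closer a recovery point can sit to the moment of failure; snapshots taken every half hour, by contrast, leave up to half an hour of changes unrecoverable.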
Such fine-grained recovery is critical in industries such as banking, where institutions process transactions every moment of the day. And in healthcare, an oversight in data recovery can literally become a life-or-death situation. At the very least, it can cause chaos. Consider a hospital that loses all of its appointment data in a glitch, resulting in a slew of unhappy patients and frazzled staff. When secondary backup systems fail, deletions and errors can result in permanent data loss, as well as long-term reputational issues that can be difficult to rectify, depending on the volume and business value of the information that vanished.
Get your Data Priorities Straight
Users will be furious, frustrated and stressed out if email servers go down for an hour
The aforementioned example illustrates why it is important for each organisation to determine its own priorities in terms of the data it needs to protect most. Some types of data, such as email, are universally important. Users will be furious, frustrated and stressed out if email servers go down for an hour. Conversely, a server that runs monthly finance for the board is less urgent because it does not generate data as frequently as an email server. While not all data is created equal, it all deserves attention within the scope of your plan to determine appropriate backup parameters. The business continuity team should closely examine data that is specific to the enterprise and industry, and figure out how to prioritise it for recovery. There is no wrong or right equation; it depends entirely on the individual organisation and its workflow.
Too Much Testing is Never Enough
Companies should consistently test their business resilience strategy – not so much to check the hardware, as many would assume, but primarily to put employees through drills to make sure they know how to execute the plan appropriately. The human element of disaster recovery is often the trickiest to master, as it requires a great deal of shepherding and coordinating. It is sometimes the simplest details that go awry, and testing is often the moment when unanticipated issues surface, making regular testing important for the success of your individual plan. In a recent example, one company turned all of the power in its data centre off and then initiated a fail-over, only to discover a major problem: the team was locked out of the data centre and therefore was unable to complete the process with a required manual step. The battery that powered the physical security card keys was low, so the entire organisation was stuck waiting. Fortunately, IT was able to approve the manual step once it gained re-entry, but this example reinforces why drills are important.
Putting it All Together
Making a well-reviewed disaster movie is not an easy feat, but a well-planned and executed business continuity plan is within reach if organisations carefully follow the tips above. Effective disaster recovery of valuable data leads to business benefits that will earn rave reviews from employees across the enterprise as well as customers and investors in the market at large.
### Druva Announces Cloud-Based Global MSP Channel Strategy
Druva, the leader in converged data protection, today announced its cloud-based global managed service provider (MSP) channel strategy. Releasing new product capabilities for its top-rated inSync data availability and governance platform, including a new management console for MSPs, Druva now offers tailored capabilities to scale MSP adoption.
Druva simultaneously launched its dedicated PartnerSync programme for MSPs.  Purpose-built for MSPs to quickly jump-start Druva services strategies, the new programme offers a vast number of tools and substantial recurring margins. MSPs, when partnering with Druva, now gain access to industry-leading technology, a new administrative console for co-branded and centralised customer management, a cloud-based architecture with no hardware to manage, as well as partner portal enablement tools, lead generation resources, training, and certification. Participation in PartnerSync for MSPs offers highly competitive margins, deal registration protection, predictable recurring revenue, MSP and customer co-branding, a dedicated account manager, and world-class support. “Managed services and cloud-based services is a multi-billion dollar industry and continues to grow at impressive double-digits,” said Scott Siragusa, VP of Business Development and Channels, Druva. “More and more businesses are turning to MSPs to manage their daily data protection priorities and business needs, which is why we are building new capabilities tailored to the MSP community and allowing them to easily scale and grow their Druva business.” PartnerSync for MSPs is available at three partner levels: Authorised MSP, which offers entry-level access to the MSP programme; Certified MSP, a specialised tier that unlocks joint marketing and empowers your sales force with educational goals; and Elite MSP, the premier level of partnership with the highest margins available to partners. "Our customers leverage us for our ability to identify and extend best-of-breed technologies to support and secure their fast-growing workforce needs,” said Mitch Cottrell, COO of Modo Networks.  “The Druva MSP programme enables us to accomplish this in an easy, turn-key manner. Druva has demonstrated a consistent commitment to our partnership, and they listen to feedback.  This partnership will be more successful day by day as we learn and grow together."  Key Druva PartnerSync for MSPs capabilities include: 100% cloud-based, multi-tenant offering of inSync for ease of deployment and administration Co-branded inSync clients and admin console for creating user profiles, lifecycle management, BYOD policy creation and execution, data collection and eDiscovery enablement, personal backup and restoration for device refresh and OS migrations, etc. Centralised MSP Management console to create and manage customer licenses and accounts The new MSP management console and co-branding capabilities will be generally available in November and are available to Druva PartnerSync for MSPs only. For more information on Druva’s PartnerSync MSP programme, visit: www.druva.com/msp-partner-program. ### New Release of SaaS-Based Offering Includes Beta Support for OpenStack Neutron Platform9 Expands Private Cloud Management Capabilities With Dynamic Network Provisioning for VMware vSphere and KVM  Platform9, the company making private clouds easy, today announced beta support for OpenStack Neutron, providing customers with a turnkey solution for dynamically provisioning networks within vSphere and KVM environments. The company also announced general availability of its support for OpenStack Heat for orchestration as well as for OpenStack Ceilometer for programmatic alerts and event notifications. 
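For readers who have not worked with Neutron, the snippet below is a generic, purely illustrative sketch of what programmatic network provisioning looks like from a consumer's point of view, using the standard openstacksdk Python client rather than Platform9's own wizard or APIs. It assumes credentials for a cloud named "example" are already defined in a clouds.yaml file; all resource names are made up.

```python
# Generic illustration of Neutron-style network provisioning via openstacksdk.
# Not Platform9-specific; assumes a cloud named "example" is configured in
# clouds.yaml and that the account may create networks, subnets and routers.
import openstack

conn = openstack.connect(cloud="example")

# Create a tenant network and a subnet on it
network = conn.network.create_network(name="app-net")
subnet = conn.network.create_subnet(
    name="app-subnet",
    network_id=network.id,
    ip_version=4,
    cidr="10.10.0.0/24",
)

# Attach the subnet to a new virtual router
router = conn.network.create_router(name="app-router")
conn.network.add_interface_to_router(router, subnet_id=subnet.id)

print(f"Provisioned {network.name} ({subnet.cidr}) behind {router.name}")
```

The value proposition described in the announcement is that an operator gets this behaviour through a managed, wizard-driven service rather than having to architect and run the underlying Neutron, Open vSwitch or VDS integration themselves.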
Today's announcements follow the company's support for OpenStack Cinder last month, which enables customers to leverage the storage subsystem of their choice to provide persistent block storage in their vSphere- or KVM-powered private cloud. All of these capabilities further strengthen Platform9's unique managed private cloud solution, providing out-of-the-box automation that eliminates complex manual processes and the need for specialized skills, simplifying and accelerating IT service delivery.  "Customers are excited by the power and potential of OpenStack, but they are often derailed by its complexity and lack the specific skills to stand up and maintain an OpenStack-powered private cloud," said Madhura Maskasky, Platform9 co-founder and vice president of products. "Today's news demonstrates how we continue to make private clouds easy for customers. For example, network provisioning with Neutron -- which has been one of our most highly requested and anticipated new features to date -- is a simple wizard-driven exercise with Platform9. It takes just minutes, not hours or days, and can be accomplished with zero knowledge of OpenStack. It just works." OpenStack Neutron is a major component within the OpenStack private cloud framework, exposing a range of network provisioning functionality such as creating networks, virtual routers, configurations such as security groups, load balancers and more. Typically, IT teams find it exceedingly difficult to architect and integrate Neutron to work at scale in production, often contributing to stalled private cloud projects. Platform9 removes these barriers with built-in support for common network configurations such as Open vSwitch (OVS) and VMware's Distributed Virtual Switch (VDS) that work out of the box and are backed with a service-level agreement (SLA).  Enabling "Infrastructure as Code" with Orchestration Many organizations are modernizing operational processes, transitioning from manually provisioned workloads to an infrastructure-as-code paradigm in which cloud-based applications consume infrastructure as code. Similar to AWS CloudFormations, OpenStack Heat supports application blueprints that enable specifying infrastructure requirements for modern applications via a markup language, with the Heat framework automating the provisioning of infrastructure across OpenStack services as the application is deployed. By adding support for OpenStack Heat, Platform9 makes it easy for the large installed base of VMware and Linux customers to start their transformation from a static IT environment toward infrastructure-as-code within their private cloud environments.  Programmatic Alerts and Notifications In addition, Platform9's support for OpenStack Ceilometer is now generally available, enabling the use of programmatic alerts and event notifications. These in turn can be used to trigger orchestration policies via Heat. Product Availability Platform9 support for OpenStack Neutron will be generally available in the first half of 2016. Support for OpenStack Heat, OpenStack Ceilometer, and OpenStack Cinder is generally available now. ### Rethink! ITEM Europe 2016 April 25 – 26, 2016 Millennium Hotel London Rethink! ITEM Europe 2016 - IT & Digital Enterprise Minds - Driving Digital Business is an international knowledge and project exchange platform, which will bring together more than 150 IT experts to network and to discuss key industry topics, exchange knowledge, as well as create new partnerships. 
35+ CIOs, IT decision makers and senior IT executives from across the industry will share their knowledge through 20 best-practice case studies, 16 workshops and round tables, and two days of One2One partnering sessions. [easy-tweet tweet="Join over 150 IT experts to define the future of IT this April in London" via="no" hashtags="ITEM2016"]
Main topics include:
IT Strategy, Innovation, IoT & Industry 4.0
Enterprise Mobility & Agile Infrastructure
Cloud Technology & IT Virtualization
Enterprise Architecture, IT Infrastructure & Applications
Architecture management & DevOps in global companies
Big Data & IT Operations
IT Security & Compliance
Digital Transformation
Speakers from companies such as Hotels.com, Met Office, AutoScout24, RWE GBS UK, Boehringer Ingelheim Ltd., Telefonica UK, BMW, Deutsche Bahn Regio AG, and many more will join the event to present their challenges and opinions about the IT industry. Across 2½ full days, with 30+ case studies and roundtables and more than 30 hours of networking and exchange, participants will get exclusive access to IT decision makers, their projects, and the challenges they face, and will get the opportunity to discuss current trends, future technologies, and the latest innovations in the IT sector.
The event will take place on the 25th and 26th of April 2016 at the Millennium Hotel in London. An Icebreaker session will be held the evening before the event, on April 24. For more information please refer to the event website or download the conference agenda, which includes the full schedule of the event along with the topics and confirmed speakers. Click here to secure your ticket now! For any questions about the event, please reach out to Karolina Edge: karolina.edge@we-conect.com
### IBM’s Watson Forecasts Trends and Products for Holiday Season
IBM today launched the IBM Watson Trend App, a new way for shoppers to understand the reasons behind the top trends of the holiday season and also predict the hottest products before they sell out. The app is available via a free download at the Apple App Store. [easy-tweet tweet="Let #IBMWatson play Santa this year and take the stress out of Xmas shopping " via="no" usehashtags="no"] [embed]https://www.youtube.com/watch?v=uNW9xhQvKI8#action=share[/embed]
The IBM Watson Trend app distills the sentiment of tens of millions of online conversations by scouring 10,000 sources across social media sites, blogs, forums, comments, ratings and reviews. Unlike other apps or lists that provide a static ranking of “hot” products, the Watson app reveals how consumers feel about the products they are considering or have purchased. Using Watson’s understanding of natural language and machine learning technologies, the app uncovers consumer preferences to pinpoint patterns and trends and to reveal why people are choosing certain products or brands. The app also uses predictive analytics to forecast whether a particular trend is a fleeting fad or will continue to remain strong.
To uncover insights behind the top holiday trends and products, Watson’s natural language engine aggregates insights into distinct trend groups: content, context and sentiment. Each group is given a relative daily Trend Score, ranging from 0 to 100, based on the impact (size of the conversation) and the momentum (the rate of growth of the conversation). Users can view the top 100 trending products and the stories behind them across three categories – consumer electronics, toys, and health and fitness.
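IBM does not publish the Trend Score formula, but the idea of combining impact (conversation volume) and momentum (its rate of growth) into a relative 0-100 score can be illustrated with a small, entirely assumed sketch; the weighting, the numbers and the trend names below are invented for illustration only.

```python
# Entirely illustrative scoring sketch: not IBM's actual Trend Score formula.
# "Impact" is conversation volume, "momentum" is day-over-day growth; raw
# scores are rescaled to 0-100 relative to the strongest trend measured.

def raw_score(mentions_today, mentions_yesterday, momentum_weight=0.5):
    impact = mentions_today
    growth = (mentions_today - mentions_yesterday) / max(mentions_yesterday, 1)
    return (1 - momentum_weight) * impact + momentum_weight * impact * max(growth, 0)

def trend_scores(conversations):
    """conversations: {trend_name: (mentions_today, mentions_yesterday)}"""
    raw = {name: raw_score(today, yesterday)
           for name, (today, yesterday) in conversations.items()}
    top = max(raw.values()) or 1
    return {name: round(100 * value / top) for name, value in raw.items()}

print(trend_scores({
    "Lego Star Wars": (120000, 80000),   # big conversation, growing fast
    "Hello Barbie": (60000, 55000),      # big conversation, modest growth
    "Oculus Rift": (45000, 20000),       # smaller conversation, surging
}))
```

Under this toy weighting, a smaller but fast-growing conversation can outrank a larger, flatter one, which is the behaviour the app's "fleeting fad or lasting trend" framing implies.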
As of November 18, the top trends include:
Smartphone Photogs Drive Demand for Professional-Grade Cameras: While smartphone users have one of the most powerful mobile cameras at their fingertips, Watson finds consumers want more. Watson identified that standalone cameras like Nikon D-SLRs are the number one choice for amateurs looking to take their photography to the next level, due to their greater depth and superior quality. Amateurs aren’t the only ones looking to upgrade; expert photographers are setting their sights on Sony’s 42-megapixel Alpha 7R II mirrorless camera, which allows them to record Ultra HD 4K video.
Whether it’s Star Wars or Lego City, Buy Now Before It's Too Late: The new Star Wars movie, The Force Awakens, has helped make Lego bricks the "it" toy of the 2015 holiday season, and it’s likely that many sets will sell out, including the new Millennium Falcon set. However, according to Watson, in addition to Star Wars, two other sets will also likely sell out this season: Lego City and Friends. As a result, shoppers should take advantage of early Black Friday deals and make their purchases now. Beyond chatter about impending product shortages, Watson finds that adults and children alike are excited for the new Star Wars line, while other products such as Lego City are selling well among both boys and girls, which reinforces the growing trend of gender-neutral toys.
Traditional Toys Go Back-to-School: According to Watson, Mattel's connected toy, Hello Barbie, is at the top of parents' gift-giving lists, but it’s no longer due to her sense of style. The new Hello Barbie delivers an “engaging and unique play experience” by letting children have a two-way conversation with Barbie. Parents specifically call out the companion app feature that allows for parental control of all conversations. Other “edutainment” toys leading the pack, according to Watson, include the new PAW Patrol Imagicard Learning Game (for LeapFrog tablets).
’Tis the Season for Wearable Tech: When it comes to must-have tech gifts, wearable accessories top wishlists for both gamers and movie lovers. According to Watson, while excitement around Minecraft continues to be strong, the social conversations aren’t focused on the game itself but on the new Gameband, which allows people to play anywhere, on any device. For movie lovers, Watson identified excitement around the Oculus Rift (OR), a virtual reality wearable that not only enhances the gaming experience but can support made-for-OR movies.
As consumers use the IBM Watson Trend app to pinpoint what products are popular and why, IBM also reports on how consumers will shop for these gifts. With the Thanksgiving holiday just one week away, IBM predicts that, for the first time, more consumers will turn to their mobile devices than their desktops to seek out the best buys. Over the five-day holiday period, mobile traffic is expected to increase by nearly 57 percent, up 17 percent over 2014. Mobile sales are predicted to increase by more than 36 percent, up 34 percent over last year.
The Watson Trend App uses a combination of API capabilities from IBM’s open Watson Developer platform, including Sentiment Analysis, Keyword Extraction, Concept Tagging and Taxonomy Classification. In the future, IBM Watson Trend will continue to evolve with the addition of new capabilities, including geographic and language data, and an increased level of personalization that caters to each consumer's unique interests and preferences.
IBM continues to add to and stretch the boundaries of what Watson can do. The original Watson system was based on a single natural language Q&A API; today the platform has over 30 APIs powered by over 50 underlying technologies to provide capabilities that span language, speech, vision, and data insights. By sharing these capabilities via the Watson Developer Cloud, an open developer environment, a vibrant community of entrepreneurs, startups and established businesses are commercializing their own cognitive-powered products and services to transform a variety of industries and disciplines, including retail and customer service. For more information or to download the IBM Watson Trend App, go to www.ibmwatsontrend.com
### Bluebird IT explains the MSP service market
Managed services are skilled outsourcing functions that transfer in-house functionalities to be managed by a third-party managed service provider (MSP). The managed services market has witnessed accelerated growth in recent years due to advancements in cloud computing, big data and mobility services. Such outsourced services enable organisations to bring in competences that they lack, or to replace functions or processes that incur huge recurring costs. According to Research and Markets (2015), managed services reduce recurring in-house IT costs by 30-40% and bring about a 50-60% increase in efficiency. The same report suggests the global managed services market is expected to grow from $107.17 billion in 2014 to $193.34 billion by 2019, at an estimated CAGR of 12.5% from 2014 to 2019. [easy-tweet tweet="The global managed services market is expected to grow from $107.17bn in 2014 to $193.34bn by 2019" user="bluebirdITS" usehashtags="no"]
The market is clearly growing, and whether, as a service provider, you are providing security, analytics, storage or any other service to any industry, there is one key imperative that stands out… differentiation! How can I, as a service provider, offer a revenue-generating service that is unique? If not unique, how can I offer qualities above those of my competitors by better utilising the technology that underpins the service?
Let's explore the case of a European MSP that identified secure data transfer (as a service) into tier 2 financial services organisations as a profitable niche. Offering a highly secure, governed and flexible batch file transfer infrastructure that complies with PCI DSS and other key regulations provides an opportunity for differentiation and potential dominance of the market. Their choice of technology was based on security and governance capabilities, as well as the ability to easily onboard new customers to the service with a trusted advisor that has experience in designing PCI-compliant file gateway infrastructures. Their choice of advisor was key, as the relationship would enable a strategic expansion of the managed service across geographies, if successful. This partner will also offer an SLA-driven support centre to ensure any challenges with the service can be rectified, keeping the reputation strong and ensuring regular revenue by differentiating the quality of the service.
the one of a kind, differentiated service will be launched in January 2016
Having chosen best-of-breed secure file transfer technology and a partner with over 25 years of experience, the one-of-a-kind, differentiated service will be launched in January 2016. This one-of-a-kind, differentiated, premium service into a niche yet profitable vertical achieves the key use case for any MSP.
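As a quick sanity check on the Research and Markets forecast quoted above, the implied compound annual growth rate can be verified in a couple of lines; the calculation below is illustrative, not part of the report itself.

```python
# Sanity check on the figures quoted above: $107.17bn (2014) growing to
# $193.34bn (2019) implies a compound annual growth rate of roughly 12.5%,
# which matches the estimate attributed to the report.
start, end, years = 107.17, 193.34, 5
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")   # Implied CAGR: 12.5%
```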
So why not look at a trusted partner as the first step on your differentiation journey? Such a service would be rapidly consumed in the manufacturing or logistics verticals, and all that is left is for an MSP to take up the proven technology, with a proven partner and provide a unique business model. ### Avangate’s Fall ‘15 Release Helps Companies Increase Subscription Revenues And Accept More Global Payments For Frictionless Online Selling Avangate, the modern Digital Commerce solution provider trusted by thousands of Software, SaaS and Online Services companies to grow their business worldwide, today announced over 150 enhancements to its Avangate Digital Commerce platform. These new features are aimed at simplifying customer acquisition, increasing revenue uplift and conversions, and driving recurring revenue worldwide. “We are increasingly moving beyond the single purchase to literally hundreds of revenue moments, whether accepting a trial, first usage bill, or for a call to the service center,” said Michael Ni, CMO and SVP of Products at Avangate. “For the rapidly growing digital goods, software and services vendors, an estimated 10-20 percent of potential revenue is leaked across these revenue moments creating a major ROI problem. This Fall release is focused on helping customers convert revenue leakage to revenue uplift.” Avangate’s Digital Commerce platform enables businesses to sell directly to customers via online channels, where traditional sales tactics are too expensive and slow. These new enhancements complement the company’s proven fast and secure commerce platform used by over 4,000 software, SaaS and online services companies across the globe. “Avangate has enabled us to take some major leaps forward regarding business optimization, integration with third-party applications, new online sales channels and improving commerce KPIs.  We’ve seen a revenue uplift of more than 30 percent and we have more projects lined up that will support our growth further and improve customer experience through frictionless selling,” said Angel Brown, Director, Product Marketing, FineReader LOB, ABBYY NAHQ. This release enables Software, SaaS, and Online Services providers to: Simplify assisted sales – direct and resellers New optimized shopping cart and MyAccount reduces the steps to complete a self-service purchase and account management. At the same time, this release also enhances assisted selling and order automation, from reseller advanced packaging and promotion, through to extended PO payment flows with pre-filled documents and billing options. Retain more subscription revenue Avangate expands its leading Revenue Recovery Tools (RRT) with European Account Updaters and global retry logic to increase the percentage of payment authorizations, as well as additional marketing tools and communications to recover hard declines to drive overall Conversion and Marketing Campaign Effectiveness. Accept even more global payments This release continues Avangate’s leadership as a global payments solution managing international regulatory compliance, financial reconciliation, and global shopper support for its customers by adding features like simplified tax exemption management, plus Israeli and Turkish tax management. Additional features also enable vendors to maximize conversion by leveraging dynamic 3D Secure, direct debit in Europe, and Brazilian installment payments. 
Segment and optimize sales via the leading digital goods affiliate network
We’ve extended our Affiliate Network, voted #4 CPS Network in the Online Advertising Bluebook, to provide even greater distribution capabilities. New features include dedicated product landing pages and better-segmented affiliate categories and communication.
Embed subscription commerce into your customer experience
Comprehensive commerce-level APIs covering order and subscription management mean a more seamless commerce experience for shoppers. With this release, we have expanded REST, JSON-RPC and SOAP protocols, optimized the use of multiple payment tools including PayPal Express and installment payments, and added developer tools, including extended log monitoring and a sample code / developer portal.
Free Revenue Leakage Assessment
To help companies optimize their commerce operations and maximize the value of every customer, Avangate is also announcing a free revenue leakage assessment service to help software, SaaS and online services companies identify, assess and recover potentially lost revenue across the Digital Commerce Lifecycle. To learn more, visit avangate.com/uplift.
### Is Service The New Cash Cow?
We live in a world of ‘service as a something’. At the moment, it’s the turn of the service department to get automated, connected end-to-end, and energised with its analytics. (Music to my ears as I run Sony’s Professional Services operation across Europe). Traditionally, manufacturer service departments have been the Cinderella of most large organisations. They’re usually one of the last parts of the business to get modernised, or can be viewed as an afterthought by different parts of the organisation. In fact, you could argue that service as a line of business is even a bit late to the whole “as-a-service” bandwagon. And you’d be right. But unlike other lines of business that are already benefitting from this model, service is itself becoming a rich new revenue stream, and even an entirely new business model for manufacturers. Instead of selling a piece of industrial equipment to a buyer, manufacturers might loan it and then charge for repairs, monitoring or maintenance. (Simply making something and selling it is now seen as positively old-fashioned...). [easy-tweet tweet="We live in a world of service-as-a-something" user="servicemax @comparethecloud" usehashtags="no"]
At Sony, we are seeing increasing customer demand for managed services, particularly with our broadcast customers who want managed service contracts rather than break-fix repair service. I’m sure it’s the same for many other types of manufacturers. If you then throw IoT into the mix in the longer term, this will take things a step further, with sensors and devices connected to the internet maintaining communication among users, manufacturers, products and service providers for proactive maintenance before something breaks.
Product as a service is a win-win for customers and manufacturers alike in my view. Customers get the assurance of first-class service, the expertise to maintain it, and avoid a large upfront capital expenditure, while manufacturers get a recurring revenue stream, and visibility into any product ‘hot spots’ before they happen. (For example, we support more than 6,200 different products over multi-year service commitments with thousands of customers. As you can imagine, this type of visibility is extremely valuable).
With most companies struggling to grow new equipment sales on a global scale, savvy business leaders are finding their service departments can be much more profitable than ever before. This is one of the reasons – the servitisation of companies as a new revenue model – that’s making CEOs look at their service departments in a whole new light, with a service income mindset. My personal opinion is that service is actually the product. Over time, I think we’ll begin to see a much more widespread understanding and appreciation of this fact. But it will take time. There’s been a prevailing business model based on putting a lot of effort into just optimising profits from sales. Everything after that, including service, has been about minimising costs.
The shift now is towards an outcomes-based business model, with service providers committing to providing predetermined service levels and prices aligned with customer requirements. This involves longer-term thinking and defining outcomes and relationships. You can see this in more and more industries as people begin to explore how they can move to outcomes-based models. The market has started to question whether the old ways are necessarily the best option anymore.
The shift now is towards an outcomes-based business model
Of course, for this to happen there must be certain components in place. You need to understand the people, the processes and the terms of the outcomes. And you’ve got to have a system that can accommodate that. At Sony, we are using ServiceMax as our field service management platform for 24 countries across Europe. It supports our move to an outcomes-based model of recurring revenue (not to mention business benefits of more than €1 million to Sony and its customers through early detection of potential hot spots in product service requirements, increasing speed of resolution, and streamlining our end-to-end service processes). It also means our technicians can have a 360-degree, end-to-end view of customer relationships, including insight into products, contract management and past history, as well as fostering better customer interaction and standardising processes.
For manufacturers, this is a longer-term view, but it is happening now all around us. As the business landscape has changed, so too have customer requirements. It’s time manufacturers took a fresh look at what service strategy best fits their organisation and customer needs. By way of example, we are now selling business solutions to a much wider range of customers than ever before, such as corporate education, healthcare for remote 3D surgery, and digital cinema. In many cases, the engineering skills required to maintain some products are simply not available at the customer end. We are now expected to deliver this expertise and do so with a much more customer-centric approach than simply providing equipment. I’m sure it’s the same for other manufacturers.
I’d encourage you to look at your service department and its full potential with fresh eyes. By all means, equip sales teams with the latest Sony tablets, iPads and gadgets to empower them further. But remove the blinkers and limited role definition of your service team, or you’ll not only be under-serving customers, but also leaving money on the table.
### Driving Digital Transformation: The Evolving Role of a CIO
Few businesses have been untouched by technology.
From how we order a takeaway to how we book a doctor’s appointment or keep track of sales leads, the digitisation of business has led to the creation of new job roles and business models, and the evolution of others. Ovum predicts the second digital revolution will occur in the next ten years, driven by digital advancements through automation. With 38% of companies’ technology spend now sitting outside the IT organisation, Chief Information Officers (CIOs) are faced with new opportunities to grasp and new challenges to overcome. It is now time for CIOs to rethink their role. [easy-tweet tweet="38% of the Technology spend by companies is outside of the IT Organisation" user="BMCSoftware" usehashtags="no" hashtags="CIO"]
Redefining the CIO role
To understand how the CIO role is changing, we must first understand what the CIO role looks like today. At one end of the spectrum we have the traditional CIO, who reports to the CFO and views IT as a support function. CIOs who treat IT as a cost centre face the reality of being left behind if they do not adapt their approach. They struggle with more than 75% of their budget dedicated to running IT, and are under pressure to deliver heavy cost reductions (more than 10% per year).
CIOs who treat IT as a cost centre face the reality of being left behind if they do not adapt their approach
At the other end of the spectrum, we have the innovative CIO. With a seat on the board and a clear vision for the business, this CIO is more like a corporate CTO. They’re typically tech-savvy and view leveraging the right technology as an opportunity to innovate, to generate revenue, and to grow the business. Most of the time, they have outsourced most, if not all, of their delivery capabilities, and their current contracts prevent any strategic agility in activating their plan at corporate level.
To bridge these gaps, we’re seeing new roles created by businesses that fear they aren’t innovating fast enough. One such role is the Chief Digital Officer (CDO), which brings together the disciplines of IT and marketing. Gartner predicts that 25% of organisations will employ a CDO in 2015. But isn’t this position a way for the board to create the necessary tension in the organisation to shift to the Digital Economy?
Where are the opportunities for CIOs?
As we architect our digital future, businesses need to ensure the performance, integrity, efficiency and agility of their digital services are world-class. Digital Enterprise Management (DEM) is an approach to the management of digital services, infrastructure, processes and policy comprising a set of ground-breaking best practices and best-in-class software solutions to support continuous innovation in the digital enterprise. [easy-tweet tweet="Digital business isn’t just about replacing the old with the new" user="BMCSoftware @comparethecloud"]
The fact that digital transformation spans the entire organisation means businesses must rethink their whole business model. Digital business isn’t just about replacing the old with the new; it is about harnessing technology to enhance every aspect of an organisation, reshaping and redefining businesses from the ground up. With DEM, IT can become the corporate service broker, establishing a new digital value chain between digital service providers and customers. With DEM, R&D can accelerate to a new type of product lifecycle with Continuous Delivery, and IT can become the platform for innovation. To become this leader for innovation throughout the entire organisation,
CIOs have to develop a strategic plan with a well-defined digital transformation journey, alongside the business model evolution led by the CEO. The portfolio of digital initiatives can be tied to new revenue streams, improving the work environment (in and out of the office) and the productivity of staff, while revolutionising the experience of customers. It sets the platform for Internet of Things applications, including Big Data and Artificial Intelligence.
What challenges stand in the CIO’s way?
Getting started can often be the biggest barrier to digitisation; with an almost limitless number of places to begin a digital transformation, picking an effective set of foundational projects is not without risk. This challenge is compounded by having to maintain the operational integrity of the systems already in place. As every CIO knows, keeping critical IT systems running can be difficult and time-consuming enough, without the added pressure of trying to innovate at the same time. Rationalising the application portfolio, defining a new enterprise architecture embracing the systems of record and engagement, automating IT processes on a clear set of governance policies, and managing security vulnerabilities and intrusions should come at the top of the list when building the work plan for the next 24 months. [easy-tweet tweet="CIO’s must create a more user friendly and intuitive experience for business users" via="BMCSoftware" usehashtags="no"]
One of the most consistently valuable places to start is in ensuring that the IT organisation itself remains relevant, accessible and the easiest to do business with. CIOs must create a more user-friendly and intuitive experience for business users. Workers today expect IT services to be every bit as good as, or better than, the services they could consume outside the organisation. In this environment, CIOs must make the process of selecting and engaging with third-party technology suppliers as easy as finding and purchasing apps on an app store, for example. Realising this vision requires a gear-shift in how CIOs approach IT management. More than ever before, CIOs will have to demonstrate entrepreneurial spirit, vision, ambition and courage. Their ability to empower their teams and to hire new talent and new skills will drive them to define new roles and organisations that are more mobile, flexible and project-based, harnessing outside-in and inside-out perspectives on new use cases. Fortunately, many ‘new’ CIOs have recognised this and are already reaping the rewards of adopting a modern, digital approach to IT.
### How To Improve Accuracy With Cloud Computing, To Ultimately Improve Customer Engagement
You can eradicate a potentially damaging scattergun approach to marketing and CRM by using cloud computing systems to better target your consumers. Properly harnessed, a cloud-as-platform system will let your teams invest customer engagement resources where they’re needed most, and focus on the messaging strategy that will get your business the best results. When those results are delivered, a cloud-based Incentive Compensation Management system will then help you reward exceptional performance and motivate your e-commerce and CRM teams. Here are some of the benefits the right cloud computing philosophy could bring to your company.
Innovate faster than your rivals [easy-tweet tweet="The best advances in customer engagement arise with the advent of new, workable ideas" user="comparethecloud" usehashtags="no"] The best advances in customer engagement arise with the advent of new, workable ideas. Identifying the ideas that will move your business forward takes time, but by using the cloud to filter good concepts across your whole organisation and get them into development, you can save time and money. You won’t have a costly infrastructure to maintain, so you can direct budget instead towards development, testing, deployment and iteration of new customer engagement strategies. Developers can concentrate on creating and innovating, rather than maintenance and bug-fixing. The operations side is handled by whoever provides your cloud service; your specialists can focus instead on what they’re best at. Develop new services more cheaply Leading on from that, a cloud service will effectively let you be the front-end user, while the back-end is maintained by the supplier you sign up with. It means you pay only for what you use and what you need. Why should you have to pay to maintain your own cloud service? a cloud service will effectively let you be the front-end user There’s no need to purchase licences, development tools or additional bandwidth. It’s all there in the cloud - and it will expand as your needs grow. And it will evolve in the background so that your team can deliver the ideas, strategies and workflows that will improve customer engagement. You can recruit the UX experts, business analysts and designers you need to make sure a customer can engage with your brand as best they can. Launch highly targeted campaigns One of the most basic and attractive premises of cloud computing is the connectivity it offers. Teams and departments can upload and share data, information, content and insight and gather feedback effortlessly. It also helps you connect with specific demographics without the need for costly and time-consuming focus groups or pilot schemes. [easy-tweet tweet="One of the most basic and attractive premises of #cloud computing is the connectivity it offers" user="comparethecloud"] Say you want to advertise a product launch. You devise an eCRM campaign, send out three sets of emails to different target groups, each with a call-to-action that directs them to a landing page relevant to them. You can then monitor whether they go to your online store to purchase the product, whether they want to find out a little more about the product or whether they ‘bounce’ away from the page completely. You decide the last group, the bouncers, need more convincing, so you set to work on a different strategy to redouble your efforts there. The common thread in all of this? You guessed it - it can all be done in the cloud, quickly, cheaply and with total buy-in from all relevant members of a campaign team. ### PTC Announces PTC Windchill 11 for Smart Connected PLM PTC today announced PTC Windchill® 11 product lifecycle management software that achieves new levels of connectivity and process improvement across the entire, closed-loop product lifecycle. PTC Windchill is used by more than 1.5 million users around the world to manage and optimize their product development and lifecycle processes. With this major new release, PTC now offers a PLM system that bridges the digital world and the physical world.
Customers benefit through an improved systems-engineering process that is informed by real world usage and quality data from smart connected products. This benefit is extended through broad and easy access to the definitive source of truth of product information, enabling richer collaboration and faster decision making. The data stream from smart, connected products opens up new worlds of possibility: driving new business models, creating new service opportunities and improving design decisions. But to fully capitalize on this potential, companies need a product lifecycle management strategy that is equipped for this revolution: a PLM approach enabled by a modern IoT platform. PTC Windchill 11 is enabled by PTC ThingWorx™ technology to integrate data from physical products, web-based resources, and enterprise software systems to deliver unparalleled access, insight, and value in a modern and easily consumable user experience. “Smart, connected products are the future, but are intrinsically more complex to design, build and service,” said Brian Shepherd, executive vice president, PLM, PTC. “Products that once operated in isolation are now part of a connected ecosystem with extended interdependencies. PTC Windchill 11 is the solution for companies that are looking for improved and closed-loop product lifecycle management to thrive in this new era.” PTC Windchill 11 addresses the key requirements for today’s PLM initiatives – smart, connected, complete, and flexible. Smart PTC Windchill 11 offers stakeholders across all levels of the company easy and secure access to all relevant product information, keeping them connected with the fast-moving and dynamic process of product development. Specific enhancements include: Role Based Apps: PTC Windchill 11 offers instant access to relevant product information based on who you are and what you need to accomplish. New PTC Windchill Search: PTC Windchill 11 delivers new, multifaceted search capabilities to quickly filter and find specific product information. OSLC Standards: PTC Windchill 11 is a smart, open PLM system. PTC has incorporated OSLC standards for greater collaboration and connection to other systems. Connected PTC Windchill 11 introduces revolutionary new connectivity between products and the people who design, develop, test, build, and service them. Windchill 11 closes the ‘lifecycle’ loop with IoT data captured in real time during the operation of physical products. Product planning, design, and quality teams can learn from the product’s operational behavior to improve features that customers use most, configure offerings to usage patterns, and redesign parts or systems to improve quality and save on costs while meeting design requirements. Complete Complex products rely on an increasing number of engineering disciplines to closely collaborate throughout the product’s development. Changes made in one area need to be realized and integrated throughout all the others. This requires comprehensive Bill-of-Materials (BOM) management that encompasses system models, requirements, mechanical designs, electronic designs, software, documentation, quality planning and analysis, and production information shared across engineering, manufacturing and services organizations. PTC Windchill 11 offers features that advance the ability to manage this complete product definition, helping companies evolve to a part-centric bill-of-materials managed throughout the entire closed-loop lifecycle.
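To give a rough sense of what “part-centric” BOM management means in practice, the sketch below models a product as a tree of parts and rolls quantities up through it. It is a minimal, hypothetical illustration: the part numbers and structure are invented, and it is not a representation of how PTC Windchill itself stores a BOM.

```python
# Minimal, hypothetical sketch of a part-centric bill of materials (BOM):
# each part owns its child parts and per-assembly quantities, and totals are
# rolled up recursively. Not PTC Windchill's actual data model.
from __future__ import annotations
from dataclasses import dataclass, field

@dataclass
class Part:
    number: str
    children: list[tuple[Part, int]] = field(default_factory=list)  # (child, qty per parent)

    def add(self, child: Part, qty: int) -> None:
        self.children.append((child, qty))

    def rollup(self, multiplier: int = 1) -> dict[str, int]:
        """Total quantity of every descendant part needed to build this part."""
        totals: dict[str, int] = {}
        for child, qty in self.children:
            for number, count in child.rollup(multiplier * qty).items():
                totals[number] = totals.get(number, 0) + count
            totals[child.number] = totals.get(child.number, 0) + multiplier * qty
        return totals

wheel = Part("WHL-01")
axle = Part("AXL-01"); axle.add(wheel, 2)
cart = Part("CRT-01"); cart.add(axle, 2)
print(cart.rollup())   # {'WHL-01': 4, 'AXL-01': 2}
```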
Companies will be able to leverage the full, evolving product definition and provide accurate, reliable decisions for every stakeholder. Flexible PTC offers a variety of deployment options, from SaaS in the cloud to perpetual and on-premise solutions. Value-ready deployment services are offered from PTC’s own world-class Global Services organization or through a network of consulting and value-added partners. This flexibility makes PTC Windchill 11 ideal for both large enterprises and small-medium businesses that need product lifecycle management software with a fast time to value, a low total cost of ownership, and complete compatibility with their overall IT strategies. "At Lifetime we are very proud of our products and the impact they have on people's lives. PTC Windchill helps us move these products rapidly from concept to production with standard processes and strong team collaboration," said Dave Winter, executive vice president engineering and manufacturing, Lifetime Products, Inc. ### Irish Tax Institute Selects MarkLogic for new TaxFind Database MarkLogic Corporation, the leading Enterprise NoSQL database provider, today announced that the Irish Tax Institute has gone live with TaxFind, a new tax research database built on the MarkLogic platform. The Institute chose the MarkLogic database for its fast development time – it went from concept to production in just seven months - and its high-speed search capabilities across huge, complex archives. With the aim of building the industry’s best tax service, the Institute moved its flagship digital service from a relational database to MarkLogic. As well as providing the flexibility required, the MarkLogic database allows the Institute to enrich its content, create new member services and deliver information to all browsers and mobile devices. The organisation’s goal was to integrate a huge volume of data, bringing together in excess of 300,000 pages of tax content including archive material in Word, PDF, XML and HTML. By choosing the MarkLogic database, the Institute was able to easily digest and digitize information in any format and make it searchable. MarkLogic’s scalability and flexibility mean that the Institute can also add recorded audio/visual content from its conferences and lectures in the future and make it easily accessible with the power of MarkLogic’s search engine. “We have a sizeable archive of valuable tax information contained in many different formats. The challenge was finding a solution that could store and access the large repository of data whilst at the same time providing a simple search interface that pinpointed very quickly the most relevant material for members,” said Martin Lambe, Chief Executive, Irish Tax Institute. “To enhance the search capability we developed a tax taxonomy to enrich the content and we will continue to develop this further. Our members operate on a global basis and are reliant on technology and systems being available 24/7, 365 days a year. Irrespective of where our members are or which device they are using, the new cloud based platform will be available to support them,” Lambe added. In February 2015, the Institute appointed UK-based MarkLogic partner 67 Bricks because of its reputation for enriching publishers’ content and its experience building unified search across organizations. “67 Bricks selected MarkLogic for the Institute because its next-generation database software has the hardened-enterprise capabilities required by the customer.
The Institute has a mandate to use technology in innovative ways and the MarkLogic database combined with 67 Bricks content enrichment expertise fitted the bill,” said Sam Herbert, Client Services Director at 67 Bricks. “Its decision was validated as we were able to launch an innovative, industry-leading service in a very short timescale.”  “Irish Tax Institute’s TaxFind is a great example of how to integrate many silos of data to deliver a single view across the organization and launch a new, market-leading service,” said Adrian Carr, Group Vice President at MarkLogic. “Because of MarkLogic’s enterprise features, the Institute is able to do so with the high availability, security and reliability today’s organizations demand.” ### PTC and Bosch Software Innovations Announce Alliance to Deliver Industrial IoT Solutions PTC® and Bosch Software Innovations today announced a technology alliance to facilitate the integration of the ThingWorx® Platform and the Bosch IoT Suite. The new Bosch IoT Suite M2M Connector for ThingWorx allows for technical interplay between the two platforms and is now available on the ThingWorx Marketplace. The technology stack enables IoT developers to connect and control heterogeneous devices and systems, cost-effectively and rapidly develop IoT applications for complex IT landscapes, and easily and quickly adjust IoT solutions to the specific needs of individual companies and sectors. [easy-tweet tweet="ThingWorx & Bosch alliance enables integration of smart, connected products with cross-sector enterprise processes" via="no" usehashtags="no"] PTC and Bosch Software Innovations successfully tested the integrated technology stack in the Industrial Internet Consortium Track & Trace Testbed by wirelessly connecting tightening tools to operate seamlessly together on the factory floor. The Track & Trace application permits the monitoring of the status of all devices on the fly. The data collected is used to optimize production processes and tool maintenance. Irregularities or impending failures are detected immediately and the faulty device can be replaced before downtime can occur. The Track & Trace Testbed provides an excellent example of complex production landscapes such as those in the automotive, aerospace and mechanical and plant engineering sectors, which are made up of heterogeneous machines, devices and process landscapes that must be adjusted to work together. When combined, the ThingWorx and Bosch Software Innovations technology provides IoT developers the exact flexibility needed for connecting heterogeneous equipment and enterprise processes. [quote_box_center]Dr.-Ing. Rainer Kallenbach, CEO, Bosch Software Innovations: "Building on the long-standing relationship between PTC and Bosch, this alliance has the potential to significantly lower the entry barrier for customers into the IoT. This alliance combines deep know-how in Product and Service Lifecycle Management, leading IoT software products and in-depth experience connecting and managing heterogeneous asset landscapes. The heterogeneous environments we normally encounter in the IoT require a new, very open approach. The ideal foundation for this is the open Vorto standard for IoT interface definitions, backed by Eclipse, the leading Open Source community.”[/quote_box_center] In contrast to the two traditional options – standard software and individual development – the joint technology offering is precise and cost effective. 
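The Track & Trace scenario above boils down to watching a stream of tool readings and flagging anything that drifts out of tolerance before it causes downtime. The sketch below is a deliberately simplified, hypothetical illustration of that idea in plain Python; it does not use the ThingWorx or Bosch IoT Suite APIs, and the torque band and tool names are invented.

```python
# Hypothetical, simplified illustration of the Track & Trace idea: flag
# tightening-tool readings that drift outside an expected torque band so a
# tool can be swapped before it causes downtime. Plain Python only; this is
# not the ThingWorx or Bosch IoT Suite API.
EXPECTED_TORQUE_NM = (18.0, 22.0)   # assumed acceptable band for this station

readings = [
    {"tool": "tool-07", "torque_nm": 19.8},
    {"tool": "tool-07", "torque_nm": 21.9},
    {"tool": "tool-12", "torque_nm": 16.2},   # drifting low: likely wear
]

def check(reading: dict) -> str:
    low, high = EXPECTED_TORQUE_NM
    if low <= reading["torque_nm"] <= high:
        return "ok"
    return "flag for maintenance"

for r in readings:
    print(r["tool"], r["torque_nm"], "->", check(r))
```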
The Device Management component (M2M) from the Bosch IoT Suite provides a reliable connection and control of devices, while operating a secure, flexible and transparent infrastructure for distributed devices. Vorto, an open source tool initiated by Bosch Software Innovations and developed by Eclipse IoT, enables the creation and management of information models for integration into different platforms.  The ThingWorx® IoT application development platform enables rapid development of drag-and-drop business applications. A comprehensive security concept is designed to protect applications against unauthorized access. [quote_box_center]Jim Heppelmann, President and CEO, PTC: "Our alliance with Bosch Software Innovations is a perfect strategic fit for both companies. We have combined the world's best technologies to redefine the way daily work gets done and send a strong signal to the market. It has never been so easy for companies to enter into the IoT business. This new technology stack will decisively influence and optimize business processes, even for companies with mature, complex environments." Rob Gremley, President Technology Platform Group, PTC: "ThingWorx offers an easy-to-use interface with simple operating functions like drag-and-drop. This facilitates cross-departmental development of applications, even in complex environments, without the need for advanced IoT Engineers. Once the system and process complexity has been captured with the Bosch Software Innovations IoT platform, ThingWorx removes the last hurdle into the smart, connected world."[/quote_box_center] ### More than a cloud platform: What is OpenStack? Cloud computing may have entered mainstream conversation, but OpenStack cannot claim to be quite so well known. This is despite the fact that the OpenStack project began more than five years ago and is supported by well-known businesses like IBM, Dell, Yahoo and Intel. So what exactly is OpenStack? [easy-tweet tweet="Get to know #OpenStack better with @Bsquared90 on @Comparethecloud" via="no" usehashtags="no"] [caption id="attachment_32526" align="alignright" width="300"] Read more on OpenStack here[/caption] The simplest definition is that OpenStack is an open source cloud operating system. It was founded in July 2010 as a collaboration between Rackspace and NASA with the aim of allowing organisations of any size to develop and offer cloud resources running on standardised hardware. It seeks to shift cloud computing away from proprietary platforms towards a more open, accessible and innovative approach.  In terms of the software itself, the OpenStack project predominantly deals with Infrastructure-as-a-Service, aiming to deliver compute resources, such as virtual machines, network and storage. Prior to OpenStack, if a business wanted to build or deploy an application they usually did so on a dedicated server, which led to a lot of under-utilised servers and a lot of hassle managing and configuring them. Virtualisation improved matters by adding a hypervisor to the servers, but even then, when more and more virtual machines were added, it became increasingly difficult to manage. Hypervisors and servers could come in different brands which made configuration difficult and ultimately businesses began managing servers in much the same way as they would have with physical machines, except there was now an additional virtualisation layer in place. 
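As described below, OpenStack hides this hypervisor sprawl by pooling resources behind a single, consistent API. As a rough preview of what that looks like from a user's point of view, here is a minimal sketch using the openstacksdk Python client; it assumes a cloud named `mycloud` is already defined in clouds.yaml and that the image, flavour and network names below exist in that cloud.

```python
# Rough sketch of booting a VM through OpenStack's unified API with the
# openstacksdk client. Assumes a cloud named "mycloud" is defined in
# clouds.yaml and that the image, flavour and network names below exist.
import openstack

conn = openstack.connect(cloud="mycloud")

image = conn.compute.find_image("ubuntu-14.04")
flavor = conn.compute.find_flavor("m1.small")
network = conn.network.find_network("private")

server = conn.compute.create_server(
    name="demo-vm",
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{"uuid": network.id}],
)
# Block until whichever hypervisor OpenStack scheduled us onto reports ACTIVE.
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```

The point of the sketch is that the same few calls work whether the cloud underneath runs KVM, Hyper-V or another hypervisor; the caller never addresses a specific host.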
OpenStack takes all the hypervisors and represents them as pools of resources OpenStack takes all the hypervisors being used by an organisation and represents them instead as pools of resources. These OpenStack shared services can be split into compute, networking and storage features and crucially are all accessed via the same abstraction layer, rather than the underlying hardware or software. This means that you could be running Hyper-V, VMware or any other kind of hypervisor, but the user experience delivered by OpenStack would be consistent regardless. The OpenStack dashboard also provides administrators and other users with the ability to set up virtual machines, manage networks or make any other configurations they need to. [easy-tweet tweet="#OpenStack is #opensource and supported by multiple #cloud vendors" user="bsquared90" usehashtags="no"] Because OpenStack is open source and supported by multiple cloud vendors, it openly embraces innovation and is already being used to power public and private clouds across a range of industries. OpenStack is currently aiding academic research, healthcare services, finance, retail and the entertainment sector. One of the world’s foremost scientific undertakings, CERN, opted for OpenStack when it decided the time was right to embrace cloud computing. Researchers at the facility needed a platform that would be able to interact with its existing IT solutions and handle the vast quantities of data being recorded by the Large Hadron Collider. “OpenStack’s technical architecture clearly addressed our needs to run at scale,” explains Tim Bell, manager of infrastructure services at CERN. “Also, the technology and developer ecosystem around OpenStack are very vibrant and would enable us to build the services we needed within the cloud. With an open community, we can benefit from the work of the active contributors but also use our engineering skills to enhance the product for others.” "OpenStack is a global collaboration of developers and cloud computing technologists" As referenced by the CERN scientist, any definition of OpenStack must recognise that it is more than just a cloud computing platform: “OpenStack is a global collaboration of developers and cloud computing technologists producing the open standard cloud computing platform for both public and private clouds.” Like other open source projects, OpenStack also refers to the community behind the technology. Made up of more than 30,000 individuals and 555 supporting companies spread across 177 countries, the OpenStack community is growing rapidly. If you have a question about OpenStack, the community will be happy to answer it. If you have some code that you wish to contribute, you are free to do so and if you want to learn more about the software, there are online training programmes available. It is the inclusiveness of OpenStack that will help it to continue fostering innovation and prevent it from stagnating. The software currently has a six-month release cycle, but developers can design and code additions whenever they want, because ultimately it is up to the community to decide OpenStack’s future. ### What does the future hold for the financial and legal sector in the cloud? Spending on cloud technology is reported to reach $131 billion by 2017, up 18.5 percent from 2012, with analysts at Gartner predicting that 2016 will be a ‘defining year’ for the cloud, as cutting-edge technology becomes more sophisticated over the next few years.
There’s no two ways about it, cloud computing is becoming more and more prevalent in both our personal and work lives. The continued integration of the cloud into the enterprise appears inevitable; yet while many industries are embracing it, some remain tentative. Two of those somewhat slow to embrace the cloud have been the financial and legal sectors. Those days, however, may be numbered. [easy-tweet tweet="2016 will be a ‘defining year’ for the #cloud" user="highq and @comparethecloud" usehashtags="no"] With enterprises left, right, and centre integrating with the cloud, this piece will discuss the impact of the cloud now and in the future, why an increasing number of sectors are welcoming it, and why the whole of the financial and legal sector may soon be following suit. The barriers to integration Not only does cloud integration show no signs of slowing down, it’s continually picking up speed, with Goldman Sachs forecasting the cloud infrastructure and platform market to grow at 19.62 percent CAGR from 2015 to 2018, reaching $43 billion by 2018. The incentives for enterprises moving to the cloud are constantly growing, including improved collaboration, reduced costs, virtual data rooms, greater integration and more security. Security is a hot topic for any organisation, and especially those in the legal and financial sectors. In such highly regulated industries, where data is extremely valuable, security is of the utmost importance. Some within these sectors view the cloud as a risky venture when it comes to data security, when in fact, the opposite is true. data is extremely valuable, security is of the utmost importance According to a survey by Verizon Enterprises that spoke with 625 IT decision makers, the respondents who had transitioned to the cloud experienced no impact on data security (34 percent) or had improved security levels (39 percent). The cloud offers enterprises more enhanced security and more solutions for keeping their data secure and free from breaches. In 2014, more than 180,000 laptops, tablets and phones were either lost or stolen in the UK alone. Having any work data stored purely on your laptop or tablet puts it at considerable risk, especially if it ends up in the wrong hands. Having your data stored on the cloud, however, means that not only can you access it from any device, but if someone were to find your device, they would be no closer to accessing your private information. Not getting left behind Cloud providers are continually improving and enhancing their security features, aware that this is arguably the biggest barrier between them and sectors such as the legal and financial services. That isn’t to say, however, that these sectors aren’t already embracing the cloud. As we speak, an increasing number of financial and legal organisations are seeing the benefits of such a technological transition. 46 percent of surveyed firms in the European Union (EU) are already using advanced cloud services relating to financial and accounting software applications, and a new report from CipherCloud shows financial firms have increased confidence in cloud technologies, with 100 percent of respondents saying they would put certain personally identifiable information (PII) in the cloud. The legal sector, on the other hand, has been a few steps ahead of the financial services in implementing cloud software and its integration looks only to continue.
In a recent survey by HBR involving 308 in-house legal counsel from companies across 22 industries, some 44 percent reported an increase in their systems and tech budgets. This should come as no surprise: with nearly three quarters of in-house law departments reporting increased legal demands in the past year, many firms will require support from a platform capable of handling high quantities of data while keeping it all secure. A platform such as the cloud. nearly 3/4 in-house law departments reporting increased legal demands in the past year In industries as highly regulated and data-sensitive as the legal and financial services, hesitance to fully embrace the cloud platform is understandable. Technology adoption can often produce a domino effect, however, and the more organisations that transition to this platform, the harder it will be for other businesses to resist. There was a time, not too long ago, when many companies resisted the lure of email and electronic storage, still more comfortable with post and hard copies. They had found a system that worked and wanted to stick with it. You can’t fault that. An improved system reared its head, however, and from a business and financial standpoint, it became impossible to ignore. A similar situation is now occurring with cloud integration, and going into 2016, you will find companies finding fewer and fewer reasons to ignore this movement. The technology times are continually changing and enterprises have to decide whether to change with them, or risk being left behind. A risk, it’s predicted, that the financial and legal industries won't be willing to take. The future is now [easy-tweet tweet="By 2019, cloud applications will account for 90 percent of worldwide mobile data traffic" user="highq" usehashtags="no"] By 2019, cloud applications will account for 90 percent of worldwide mobile data traffic. Enterprises in the cloud aren’t a sign of the future, they’re a sign of the current times. The legal and financial sectors share many things, one of which is a highly competitive market. Organisations can’t afford to watch their competitors welcome the cloud platform, and retain a competitive advantage, while they stand idly by, still using dated technologies. Though some may be hesitant at first, as the advantages of enterprise cloud platforms continue to mount, this new technology will be embraced, and the adopters won’t look back. ### IOT TECH EXPO EUROPE – Olympia, London, 10-11 February 2016 [scribble src="/event/1836922" /] The Internet of Things Conference and Exhibition (IoT Tech Expo) brings together key industries from across Europe for two days of top-level content and discussion. Industries include Manufacturing, Transport, Health, Logistics, Government, Energy and Automotive. Introducing and exploring the latest innovations within the Internet of Things, this conference is not to be missed. Thousands of attendees are expected across both days, including IT decision makers, developers & makers, OEMs, government and local council officials, automotive execs, operators, technology providers, investors, venture capitalists and many more. IoT Tech Expo Europe is set to showcase the most cutting-edge technologies from more than 100 exhibitors and provide insight from over 200 speakers sharing their unparalleled industry knowledge and real-life experiences. This year’s IoT Tech Expo Europe will highlight the most innovative advancements in technologies which are affecting IoT.
There will be cases and dedicated tracks covering the entire Internet of Things ecosystem including Smart Cities, Connected Living, Developing for the Internet of Things, Connected Industry and Data & Security. Attendees are expected from across the entire IoT industry with representatives from sectors including government, energy, education, transportation, healthcare, manufacturing, M2M and many more. Take advantage of this opportunity to experience two days of discussion, exploration, in-depth analysis and high level networking at this year’s IoT Tech Expo. This is an invaluable opportunity to speak directly with a host of companies and individuals and create long-lasting business relationships. To register or find out more, please visit http://www.iottechexpo.com/europe/ ### Shutting down Shadow IT stifles innovation Shadow IT is a very real issue facing businesses of all sizes today; it encompasses every aspect of a business.  This includes day-to-day processes employees use to complete tasks right through to the management of IT systems. Gartner expects that by 2016, 35 per cent of enterprise IT expenditure will go on shadow IT resources. While organisations are aware of the trend, they often don’t appreciate the scale of it. [easy-tweet tweet="Gartner expects that by 2016, 35% of enterprise IT expenditure will go on shadow IT resources" user="OneLogin" usehashtags="no"] With the vast volume and variety of applications that are working their way into the IT ecosystem, businesses and their IT departments are overwhelmed. Cloud applications have become so easily accessible (all workers need is a credit card), the IT department is often ignorant of their existence. In fact, as little as 8 per cent of both small and large IT companies can say they have a good understanding of the number of unmanaged cloud apps used internally by their organisation. More apps expose more data, and the IT department struggles to remain compliant. Although the desire to control applications is not going to disappear any time soon, it shouldn’t restrict the innovation shadow IT presents to organisations willing to embrace it. millennials bring high expectations for their working environment There is a question here of innovation versus risk. As the ‘millennials’ begin to enter the workforce, they bring high expectations for their working environment.  Today there is a huge gap between what they expect and what they get. To attract and retain tomorrow's workforce, organisations must adopt innovative technologies which support mobile working. By allowing loosely managed innovation, new solutions that will benefit the wider business can be found. The reality is employees like open ways of working because it enables them to do their jobs quicker, more competitively and with better results. However with every staff member from HR to marketing working independently from the IT department and using various applications, to store, sync and share content, the company incurs extra risks. Shadow IT can become a security nightmare. So how can IT empower the business while protecting corporate data?  [easy-tweet tweet="Shadow IT can become a security nightmare" user="OneLogin and @comparethecloud" usehashtags="no"] In order to bring down the defensive barriers and take the worry out of embracing new applications, the IT department should embrace solutions such as IDaaS, multi-factor authentication and user provisioning, which will allow them to keep the benefits but eliminate the risk. 
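To make one of those controls concrete, the sketch below shows time-based one-time passwords (TOTP), the mechanism behind most authenticator-app multi-factor authentication, using the pyotp library. It is a minimal illustration rather than a recipe for any particular IDaaS product; in a real deployment the shared secret would be provisioned and stored by the identity platform, and the user name and issuer shown are hypothetical.

```python
# Minimal illustration of TOTP-based multi-factor authentication using pyotp.
# In a real IDaaS deployment the shared secret is provisioned and stored by
# the identity platform; it is generated and printed here only for the demo.
import pyotp

secret = pyotp.random_base32()           # shared secret enrolled in the user's authenticator app
totp = pyotp.TOTP(secret)

print("Provisioning URI for an authenticator app:",
      totp.provisioning_uri(name="alice@example.com", issuer_name="ExampleCorp"))

code = totp.now()                        # what the authenticator app would display right now
print("Current one-time code:", code)

# Server-side check at login: a password alone is no longer enough.
print("Code accepted:", totp.verify(code))
```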
Allowing employees to freely use applications not only minimises the drain on IT resources but increases productivity and employee engagement, which helps the business, and also helps the business compete for talent. With this in mind, organisations must ensure they embrace the many beneficial aspects of shadow IT (empowering employees), while empowering IT teams to ensure these new systems are secure and compliant. With systems such as Identity and Access Management in place, harmony can be achieved between these needs, combining business speed with operational excellence. ### DCHQ Adds Customers as Users Surpass 1,000 and Number of Supported Cloud Environments Expands Docker-focused startup arrives at Dockercon EU with new features that let Java developers deploy WAR files on any containerized application stack on any cloud without changing existing development processes. A San Francisco startup in the red-hot container management space is racking up users, new product capabilities and expanded geographic options for its hosted product. DCHQ builds container management software for enterprises using Docker, focused on application modeling, deployment and lifecycle management. The company is talking to users and exhibiting this week at Dockercon EU in Barcelona. Quick Facts More than 1,000 accounts have signed up for DCHQ.io Hosted PaaS More than 500 downloads for DCHQ On-Premise New Standard version is free; runs up to 1,000 containers DCHQ.io Hosted PaaS now available in India via local data center A raft of new features expand options for using different cloud technologies Java developers can deploy WAR files on any containerized application stack without modifying development processes More Downloads, More Customer Momentum DCHQ has surpassed 1,000 signups for its hosted product, approximately 300 of which are active users. The on-premise product has been downloaded more than 500 times, and the number of reference customers continues to grow. Also, to make the software more accessible for evaluation in production scenarios, DCHQ has launched a free Standard version that lets users run up to 1,000 containers at no cost. There are four new reference customers using DCHQ. Each has specific use cases that leverage the broad set of programming and operation automation environments that the software supports. DURAARK.eu is running a microservices architecture with containers developed on Java, JavaScript and C++. MarkaVIP is a Middle Eastern shopping community running a Magento PHP application with Nginx and MySQL. Streaming Networks is a digital video security provider running GlassFish with a MySQL Java application with ApacheDS for user authentication. ProLeads.io is a personalized sales automation company running a high-availability Ruby application with Nginx, Redis, and Mongo. Read more about these customers here: http://dchq.co/customers.html "I really love it how we can manage everything in our application infrastructure from one place, and allow us to have bash scripts that run automatically post-deploy,” said Anders Fredriksson, CEO at ProLeads.io. “This allows us to configure things in a much more custom way that otherwise would have been impossible or very complicated to do." Martin Hecher of DURAARK said, “Setting up the application template feels very familiar for developers which are experienced in Docker and Docker Compose. 
The DCHQ specific extensions like plugins and template parameters are useful and make lives easier when setting up post-provisioning tasks like a NGINX reverse proxy configuration.” India Launch Home to 15 percent of the world’s applications developers, India is a prime market for software innovation. In October, DCHQ launched DCHQ.in with the hosted version of its product running in the CtrlS data center in Hyderabad. DCHQ is the first cloud application development platform with a secure point of presence in India. This local, secure and low-latency resource simplifies the containerization and automation of multi-tier applications based on Java, Ruby, PHP, and Python. The company is giving away one server for application deployments to the first one hundred customers who sign up during the first three months. New Product Capabilities The latest version of DCHQ adds support for a broad diversity of cloud technology environments, including: Application-defined Infrastructure via YAML Support for VMware vSphere and Photon Scheduled application deployment to facilitate DEV & QE cloud automation Granular approval policies Auto scale-out — based on CPU and memory White-list commands in the in-browser terminal of the running containers For more details on new product capabilities, visit the DCHQ product update page on our website. “The market response to DCHQ has been just phenomenal,” said Amjad Afanah, founder and CEO of DCHQ. “Six months after our launch, we’ve delivered multiple versions of both hosted and on-premises products, and we’ve vastly expanded the number of technologies and environments supported. But most amazing is the number of downloads and customers using the software to simplify the deployment of containers.” ### Bet You Didn't Know the Queen of England was This Tech Savvy! What comes to mind when you think of Britain’s Queen supreme? We’re guessing it’s probably her extensive wardrobe of hats. Or perhaps the longevity of her reign (63 years, yikes!). But it’s probably not her prowess as techno-queen. While Her Majesty may not be a “thought leader” when it comes to technology, it turns out she has been an early adopter for decades. [easy-tweet tweet="The Queen may not be a thought leader for technology, she has been an early adopter for decades" user="followcontinuum" usehashtags="no"] What?! you gasp? One of the world's seemingly most staid leaders is secretly a hip technophile? Indeed. Check out some of these monarchical milestones in this timeline infographic! Infographic Companion Guide 1953: For the first time ever, millions of Royal subjects got to watch the coronation of their new Queen live as it happened, thanks to Elizabeth’s desire to use new television technology to reach out to her people. Though the event was broadcast in black-and-white, it was also recorded in an experimental 3D format, in both color and B/W. TV marketers rejoiced, as they sold a half-million television sets prior to the live broadcast. 1957: Just four years later, Queen Elizabeth II broadcast her annual Christmas message on TV, the first British Royal to do such a thing. 
She commented on the fact that people could see as well as hear her message live from their own homes, noting it was “just another example of the speed at which things are changing all around us.” Alas, her message was somewhat marred when aberrant weather instead treated listeners to a brief transmission from an American police officer, radioing in to report he was “gonna grab a quick coffee.” No one said technology was perfect. 1976: QEII was the first British Monarch to send an email. Bet you didn’t have email in 1976. 1997: Elizabeth unveiled her new website, another first for the monarchy, of course. 2006: The Queen goes multi-media, introducing Christmas-message-as-a-podcast, available via her website and also on iTunes. 2007: Elizabeth one-ups her podcasts by launching a new YouTube channel. Every Christmas message since has been posted here as a video. 2010: The Queen dives deeper into social media, establishing a presence on Facebook and Flickr. She added Instagram to her “multi-channel marketing” in 2011. Not surprisingly, she has millions of followers. Don’t you wish you did? If only she could use her channels for lead gen. 2012: 3D comes full circle for Queen Elizabeth. Having recorded her coronation in this format, she celebrated her 60th anniversary on the throne (no double-entendre intended) by giving a formal nod to technology, actually broadcasting in 3D. Her spokesperson said, “We wanted to do something a bit different and special in this jubilee year, so doing it for the first time in 3D seemed a good thing, technology-wise. The Queen absolutely agreed straight away.” 2014: The Queen is all a-Twitter, tweeting about a new exhibit at the London Science Museum. The subject of the exhibit was – what else? – the Information Age. With her 63rd anniversary as Queen, Elizabeth II is now the longest-reigning monarch in British history, surpassing even Victoria, who reigned for 62 years. While 400,000 subjects crowded into London to witness Victoria’s coronation, 27 million were able to experience Elizabeth’s on television. And while there was no way for Victoria to literally reach out to all her subjects around the world, Elizabeth can do that with a few quick clicks. The Royal House may be something of an anachronism these days, but clearly that doesn’t preclude adopting and promoting the latest technologies. Brilliant. It’s good to be Queen. What did you think of our infographic? What's next for the Royal Family - hoverboard motorcades? Sound off below! ### Cyber Security & Medical Equipment: Are We Leaving Ourselves Open To Hackers? Researchers Scott Erven and Mark Collao revealed breaches in the security of thousands of medical systems at the recent Derbycon conference. Hospital equipment including MRI scanners and defibrillators is vulnerable to attacks from hackers, putting patients’ lives at risk. Medical device internet security must be tightened While there is always a need for conventional medical equipment, including hospital bed castors used in patient care, there is also an increasing use of hi-tech, internet-linked machines for diagnostic, healing and investigative uses. With wily hackers always on the lookout for ways to penetrate cyber security, this poses a risk for patients and medical practitioners.
[easy-tweet tweet="There are breaches in the security of thousands of medical systems" user="comparethecloud" hashtags="healthtech"] The internet of things According to an article in the authoritative computer magazine The Register, Erven and Collao discovered weaknesses in 68,000 systems used by one large US healthcare organisation. The potential breaches were discovered by exploring the search engine Shodan. This particular search engine is designed to track down ‘internet connected devices, where they are located and who is using them.’ [caption id="attachment_32704" align="alignright" width="300"] The Cloud Doesn’t Have to Be a Healthcare Headache[/caption] Once the researchers played about with search terms and targeted clinics and hospitals they realised that WiFi connected medical equipment is vulnerable to hackers and can be used to steal patient data and other information. Many of the hackers don’t even realise that they are penetrating hospital machinery; they just carry out attacks to show off and wreak havoc. Fighting back In order to combat this growing trend, Caroline Rivett, the director of cyber security at global service providers KPMG, has called for greater awareness to the threats posed by hackers to vulnerable patients. ‘Medical device manufacturers should be designing and building cyber security into medical devices,’ said Ms Rivett. It’s not simply a loss of data that’s at risk, hackers could be able to manipulate treatment plans and even cost people’s lives. [easy-tweet tweet="Medical device manufacturers should be designing and building #cybersecurity into medical devices"] Keeping up with the growth of technology Technology has brought huge beneficial changes to many people’s lives but the medical device manufacturers stand accused of being slow to develop security systems to match the excellence of their healthcare equipment. It’s far more complex to solve a problem in the internet of things than it is to create a patch to protect a smartphone or PC from hacker penetration. The chief security director at security firm Kaspersky, David Emm claims that, ‘medical device developers must talk to security organisations before rolling out vital equipment for public use.’ Hope for the future Even though these security revelations are worrying, at least the problem as been revealed. It’s now up to medical device manufacturers to work in tandem with security experts to ensure that their devices are secure from hacking. Though, as hackers can invade the Pentagon and other high security establishments any equipment that uses WiFi connectivity is always at risk from potential attack. It’s only if manufacturers and developers work at speed to repair any weaknesses in their systems that the public can have total confidence in the security of hospital internet medical devices. ### Investing in Cloud Tools for Effective Financial Management Businesses are deciding to invest in online software to manage their cashflow because they want to spend more time focused on developing their business rather than preparing their accounts. Business owners want to invest in modern technology that has the ability to keep up with the growth of their business. In 2014, 24% of enterprises in the UK used cloud applications in their business IT infrastructure. 51% of these enterprises used cloud applications for email while 25% used financial or accounting software applications, according to a report by Eurostat. 
Effective tools to manage your accounts There are several tools out there, including all-in-one platforms such as Xero or applications that specialise in just a few tasks. For example, Satago can’t manage your expenses and payroll but it is designed to manage and follow up late invoices for you. We have listed the most common tools known and used by business owners. [easy-tweet tweet="In 2014, 24% of enterprises in the UK used cloud applications in their business IT infrastructure" via="no" usehashtags="no"] Xero Xero is popular with most small businesses and accountants because of its ease of use, the ability to collaborate with team members and your accountant, and the reduction in manual data entry required by users. However it fails to offer the functionality to automatically chase overdue invoices for you and it only allows you to send one invoice at a time rather than as a batch. It could be an option to use Xero and Satago together to effectively manage your accounts and chase invoices efficiently but this could be costly, especially for startups with a limited budget. Quickbooks Quickbooks offers customisable dashboards and feeds to instantly show you the health of your business and your next actions. You can also create invoices that are customised to your branding and layout with the option to add customisable fields. Quickbooks pulls data from your bank to save time and reduce data entry errors. However, a disadvantage of using Quickbooks is its limited audit trail, which means it is easy for users of the account to change financial information without leaving any sort of documentation. Another pitfall is the fees required to upgrade your software each year, which could become costly for some small businesses. Sage One Sage One is easy to use for someone who isn’t experienced with accounting software and has its own separate applications for accounting, payroll and an online cashbook - perfect for sole traders. However, Sage One cannot create advanced reports that drill down into individual transactions, and requires you to pay an extra £5 per month to use its payroll services for up to 15 employees. FreeAgent FreeAgent is an all-in-one platform that boasts the same features as other accounting applications. It also offers live profit and loss data, a tax timeline to keep you in the loop with important dates and drag-and-drop customisation. It’s slightly more expensive each month than other platforms, costing an additional £9 per month compared to Xero. Satago Satago specialises in invoice chasing and integrates quickly and easily with most cloud accounting software packages. Its features include automated customisable invoice reminders, integrated Experian risk data about your customers and the ability to send physical payment demand letters, including printing and posting carried out by Satago, so you never have to visit a post box or lick a stamp again. Clearbooks Clearbooks is an all-in-one accounting software package, similar to Xero and Quickbooks, that allows you to generate, store and submit your VAT returns directly to HMRC. The cloud tool easily integrates with PayPal, MailChimp, DueDil and other online applications to effectively manage your accounts and plan your next actions. Kashflow Kashflow allows you to manage invoicing, banking, quotes, credit control, purchases, payroll and other accounting tasks.
The three package options are ‘Starter’, ‘Business’ and ‘Business + Payroll.’ This selection allows you to choose a bundle that is tailored to your type of organisation at an affordable price that varies depending on the number of features you require to manage your accounts. Advantages of cloud tools for financial management Some businesses are still unaware of the extensive list of benefits that cloud tools can offer. Here are a few advantages: Manage your accounts on the move - You can access cloud accounting software anywhere with an internet connection and even on mobile for some well-known packages, including Xero. Everything is in one place when you need it - All of your accounts, invoices, expenses and more are stored in your application so there’s no worry of losing sheets of paper or receipts. This also means you can reduce the amount of paper you use if your accounts, invoices and payslips are stored and sent electronically. Flexible control of your accounts - Most accounting software packages enable you to grant access to staff or your accountant so they can either review the accounts, input expenses or even make changes to payroll if you desire. The flexibility of the application allows you to be more independent when managing your accounts, although it is advisable to still use an accountant to review your accounts. This ensures that you are inputting the correct data and everything is carried out to the legal standard. The challenges of cloud financial management tools [easy-tweet tweet="Before investing in the latest cloud accounting software, it is important to weigh up the advantages and disadvantages" via="no"] Before investing in the latest cloud accounting software, it is important to weigh up the advantages and disadvantages of using an accounting package for your business. Security - Xero, Quickbooks and other cloud tools state that they have bank-level security in place to ensure that your information is always safe and secure. There is a risk of hacking when storing data remotely but these accounting tools are designed with your security in mind. Unable to manage your accounts offline - The pitfall of cloud technology is the need for an internet connection to use it. Incorrect information - The accounting software is only as valid as the data you put into it. Your financial data could be incorrect if not reviewed by an accounting professional. ### An Introduction to Bluebird IT Solutions Here at CTC we’re aiming to open the cloud industry up, and part of doing that is increasing awareness of the number of companies that exist. Bluebird IT Solutions aren’t completely new to the cloud scene - they’ve been operational for four years in the UK, and are an IBM Premier Partner. I had a chat with Natalie Shelley and James McCaskie from Bluebird about their ideas on the cloud industry at present, and where they see Bluebird going in the future. NS: Natalie Shelley, Managing Director Trained as an engineer, Natalie was involved with running family businesses in dentistry for the following 15 years. She then switched focus to the IT industry where she worked in HP corporate marketing for the worldwide Customer Reference Programme and then in corporate marketing for the defence company QinetiQ. Natalie returned to IT to form the company Bluebird IT Solutions as part of the TIS Group. JM: James McCaskie, Sales Manager James is the Sales Manager at Bluebird IT Solutions, looking after the company’s most valued customers across Northern Europe.
James has always worked in enterprise IT sales with a focus on the financial services and industrial sectors. James has a business degree and specialises in understanding his customers’ business challenges while prescribing technology remedies to solve them. Bluebird as a whole: Bluebird is the UK operation of the TIS Group which was founded in Croatia in the 1970s. In operation for 4 years now, Bluebird is an IBM Premier Partner with a mission to create outstanding solutions for moving data securely between and within organisations. Managed File Transfer, Business to Business Integration and WebSphere technology are all areas in which Bluebird operates to deliver success for organisations’ data transfer needs. So, James and Natalie, what do you think of the cloud industry at present? JM: Cloud computing has been on the cusp of the hype curve for a number of years, but I think companies are now really starting to see the value of the cloud transaction model for both cost and business flexibility purposes. More frequently than ever we are getting asked about Cloud and how our rapidly growing customers can best take advantage of the model. Exciting times! NS: Companies can adopt cloud computing in a variety of ways – from simply using cloud-based applications alongside or integrated into their on-premise IT to completely replacing all on-premise IT – hardware and software - with hosted services. At present companies are probably somewhere in between these two ends of the spectrum but the trend is moving towards cloud as it comes with a great number of benefits – flexibility, simplicity and financial savings, to name a few – but this is offset against the perceived increased security risk. However, as companies gain confidence in the cloud, and in cloud security, the trend will continue. As an observation we’ve noted that the UK is slower at adopting the cloud than other countries though – for instance the Nordics. [easy-tweet tweet="The UK is slower at adoption of cloud than other countries says Natalie Shelley of @BluebirdITS" via="no" usehashtags="no"] What are Bluebird IT Solutions’ main strengths? JM: Our strengths lie in the fact that we are as secure as they come! One of our main product offerings is Connect:Direct, which has never suffered a security breach. We hit all the government security requirements and all transferable data is encrypted both in transit and at rest. We also have a really secure audit trail that we are able to follow easily. NS: Besides the superiority of the IBM technology that we provide, which I consider to be the best on the market, Bluebird’s technicians are extremely skilled and experienced and are experts in the solutions we provide. I therefore know that our company is providing the best that there is to our clients when it comes to secure Managed File Transfer and B2B Integration. What do you think sets you apart in the competitive cloud marketplace? JM: Our security! There have been so many breaches recently in the industry, so security is really at the forefront of people’s minds. We pride ourselves on the security that we offer, and we think our clients do too. Our support model is really strong NS: Our support model is also really strong - because we’re a small company we can invest more time into our clients and give them a more personal experience. Each of our account managers knows their clients inside out, and they’re able to respond really quickly if an issue does arise.
This means that everything is kept secure and the clients are kept happy - it also means the clients avoid any financial penalties that may be associated with data breaches. Could you describe the ideal client for Bluebird’s services? NS: Our ideal client would be any company that is required to transfer files and data securely between organisations. The requirement for governance and control over file transfers is often most relevant in the financial services sector, where we use our skills and experience to work, often with the smaller financial organisations, to develop a future-proof strategic IT infrastructure that they can use to handle expected growth. Are you able to tell us a bit about some of your current clients? JM: Our clients vary in sector across Europe, many of whom cannot be publicly referenced due to the nature of their business. However, they all have characteristics in common. They all have the desire to govern and control the transfer of data securely. Many of the companies we work with are using the technology to enable expansion, and we work with a lot of small banks and FinTech organisations to help them achieve that growth. And finally, where do you see Bluebird IT Solutions heading in the future? NS: Still delivering the best service and technology to our customers. ### Cobweb Solutions debuts new way of selling cloud services Cobweb, the newest Microsoft ‘cloud distributor’, debuts a new way of selling cloud services: the ‘Cloud reseller in a box’ and “CSP Enablement” with the Cobweb ecosystem Cobweb Solutions, one of the largest independent cloud hosting providers in Europe, today reinforced its status as a Microsoft ‘cloud distributor’ with a new way for partners to sell services from a range of cloud vendors. Cobweb’s new concept of the cloud reseller in a box, now available to partners, packages industry-leading services from a range of vendors including Microsoft, in an easily consumable and resalable bundle for resellers; whilst CSP Enablement provides a simple solution for end-to-end customer lifecycle management. These programmes work with the ‘Cobweb’, an ecosystem of integrated services that resellers can integrate into their portfolio, including billing-as-a-service and bundled services. It’s a true cloud ecosystem, a one-stop cloud marketplace where resellers can consume and resell services as utilities. This marks Cobweb building upon its enhanced participation in the Microsoft Cloud Solution Provider Program as an (indirect) 2-Tier provider. This extension allows Cobweb to sell services through the program to resellers. This expands cloud sales opportunities for partners by enabling them to provide direct billing, sell combined offers and services, as well as directly provision, manage and support Microsoft cloud services. Cobweb provides a sophisticated ecosystem that allows partners to deliver the benefits of their own expertise within a toolset that unleashes the power of the cloud. Cobweb swiftly enables channel partners to start selling cloud services with simple integration into their portfolio. The set-up in terms of choices and costs has been simplified and defined. The cloud reseller in a box capability, combined with Cobweb’s billing service, has simplified the process of administering delivery and customer services so that customers and resellers have a straightforward billing relationship built around Direct Debit.
Through this offering, reseller partners gain a faster go-to-market, can sell services quickly and can make more money on the services they sell. At the core of the Cobweb are Microsoft Office 365, Microsoft Azure and the rest of the Microsoft Cloud Solution Provider services, plus a rich portfolio of additional vendor cloud services and the Cobweb London cloud for business applications requiring UK data residency. [quote_box_center] Ash Patel, Director of Business Transformation at Cobweb, said: “Cobweb’s cobweb platform of packaged services offers a rocket pack for resellers looking to soar in the cloud. We’re answering the question, ‘when is a cloud distributor not a cloud distributor? When it’s a Cobweb’. Cobweb’s integration capability provides partners with sector-specific solutions, allowing companies, big and small, to seize new opportunities with cloud power. Additionally, we’re able to give partners a go-to-market as a service that other businesses would struggle to achieve on their own.” Julian Dyer, Chief Technical Officer at Cobweb, explained: “Partners will be able to execute, operate, manage and build at scale and integrate new services into their portfolio, whether or not they are already Microsoft partners.” [/quote_box_center] For 19 years Cobweb has been at the forefront of business transformation. As one of Microsoft’s leading UK cloud partners, we are the holders of 7 Gold Partner Competencies and were Finalists for Microsoft Partner of the Year in 2015. As the IT market consumes more cloud, Cobweb helps channel partners and IT professionals make the transition to become cloud solution providers. If you would like to know more, Cobweb is offering visitors to Microsoft’s largest UK partner event, Future Decoded, an insight into how it helps businesses of all sizes transform their IT infrastructure and harness the power of cloud computing. Visit us on 10th & 11th November. ### Armored Client for Citrix provides missing link in virtualisation security SentryBay has announced the launch of Armored Client for Citrix, which securely “wraps” the Citrix receiver - providing key endpoint and browser security for connections to XenDesktop and XenApp installations. The solution overcomes existing security threats such as keylogging, browser vulnerabilities, DNS poisoning and session hijacking. The endpoint software, which can be deployed alongside or post Citrix installation, ensures the security integrity of the endpoint - helping overcome key security and compliance issues. The Armored Client launches a separate secure desktop using a one-time random user and a locked-down “armored” web client – within which the Citrix Receiver operates. No other software (including malware) is able to run within this browser session, and it also overcomes kernel-based malware threats. Multi-level keylogging protection scrambles keystrokes, prevents screenshots and denies malware access, protecting the entry of login credentials into the Citrix Receiver. The protection operates throughout the Citrix session, securing data entry into all enterprise applications and web services. The dedicated session then leaves no trace when closed. Version 1 applies to HTML5-based versions of the Windows receiver, with the Citrix Receiver Web application versions due out shortly. A mobile version of the solution is currently in development and will be ready to launch in 2016.
This version does not require a corporate device or any MDM solution to be operating to provide the necessary endpoint and browser security.  Deployment of the Armored Client is far more economical for enterprises than issuing corporate managed devices for remote access via Citrix. Key technology within the solution is patented to SentryBay, meaning the blend of technologies cannot be provided by other vendors. The technology is applicable to other virtualised clients such as VMware, Microsoft including RDP, Oracle & others. For more information visit www.sentrybay.com ### Lifting the cloud on shadow IT The cloud is streamlining operations for businesses of all sizes. It offers on-demand, distributed access to the information and services that businesses rely on, fuelling new business models, increasing efficiency and therefore helping companies generate extra revenue. When used for file sync and share purposes, the cloud greatly improves the way in which lines of business operate, supporting instant collaboration between staff, and enabling sharing with customers. Despite these benefits, business cloud adoption is often behind where employees would like it to be. As a result, employees find themselves turning to shadow IT – unsanctioned consumer IT solutions which undermine the security and control of businesses’ IT departments. [easy-tweet tweet="Business cloud adoption is often behind where employees would like it to be" user="comparethecloud" hashtags="cloud"] Shadow IT usage in businesses How businesses store and share information is one of the biggest headaches for CIOs and IT departments, and can lead to serious security and compliance issues. If staff store and share corporate information using shadow IT services, there is an increased probability of security breaches. This is because businesses lose visibility of who has access to certain files, which means they are unable to control their content. However, the proliferation of shadow IT into the enterprise is largely the result of business users trying to find the best way to do their jobs. CIOs need to protect their companies from potentially negative impacts, and enable staff to operate at maximum efficiency whilst using the best technology to do so. The challenge is striking a balance between empowering staff to be autonomous and efficient and ensuring the company remains secure. CIOs need to protect their companies The integrity and privacy of a business’ information requires a secure, end-to-end cloud solution. Overlooking the security architecture of the chosen solution can cause a company to incur significant costs – not just in financial terms, but for a company’s brand, reputation and growth capacity. A five-pronged security strategy To satisfy the individual user and business as a whole, IT needs to introduce cloud agility combined with the security of on-premises storage. This should be underpinned by a five-pronged security strategy that targets the security of users, devices, networks, data centres and content. This helps IT monitor the behaviour of its whole business to help reduce rogue applications and risky file-sharing behaviours. Furthermore, because a business needs file sync and share services to keep up with demands and competitive pressures, it is important to deploy a solution that can adapt to changing security and compliance criteria quickly and easily. 
A pure-cloud (cloud only) file-sharing solution is limited to a single environment that is not flexible enough to adapt to evolving regulations and different security needs. A hybrid solution, where businesses can choose whether their data resides in a public cloud, data centre or on premises, enables businesses to be adaptive and store their data in the most appropriate manner. Recent changes to the US-EU Safe Harbour regulations, which determine how businesses from the US and EU must share and keep track of data, highlight the necessity for businesses to have an adaptive file-sharing solution in place to meet the demands of new regulations. Securing your business [easy-tweet tweet="IT departments should seek solutions that are easy to use across all devices" via="no" hashtags="cloud"] IT professionals have an opportunity to be viewed as a partner to business users rather than a disruptor. To encourage user adoption of cloud file services across the business, IT departments should seek solutions that are easy to use across all devices, and can be integrated with popular applications already used by the business. Understanding how users work will help reduce shadow IT whilst allowing employees to continue using their preferred enterprise applications to securely access the information they need, when they need it, regardless of where it is stored. Only then can businesses lift the over-shadowing threat of unsanctioned app usage and its implications for IT security. ### October Top 50 #CloudInfluence Individuals It has been a busy month. As we covered in our recent blog (HP Quits Public Cloud; Dell buys EMC; What’s happening?) we had the $67 billion Dell EMC deal, followed by Symantec spinning out Veritas, Western Digital buying SanDisk for $19 billion, EMC spinning out Virtustream and HP selling TippingPoint to Trend Micro for $300 million. HP chose to exit the public cloud because it couldn’t reach the scale needed to compete with AWS, Azure and Google, but Oracle announced a set of new services to maintain a presence in public cloud – promising to compete directly with AWS on price rather than scale – although exactly how Oracle can continue to compete on price in the longer term without scale remains unclear. October also saw re:Invent (the AWS fan fest) as well as Oracle World (the Ellison fan fest) and the Tokyo OpenStack Summit (the fan fest for the open cloud OS). There were therefore plenty of major announcements, events and other opportunities this month for senior executives to shine. [easy-tweet tweet="October's #CloudInfluence individual rankings are out now on @ComparetheCloud" via="no" usehashtags="no"] It is little surprise that Michael Dell topped the rankings – announcing the largest tech acquisition ever can have this kind of effect on your profile – even if much of the commentary questioned the amount of debt that Dell was taking on board in order to see the deal through. Satya Nadella, in second, was praised by Daniel Ives in third and by other financial analysts, like Bernstein’s Mark Moerdler in 15th, for the success that Microsoft is having in shifting the firm’s focus to software and cloud services as demand for the Windows operating system slows in a weak PC market. Peggy Johnson in 5th, Microsoft’s EVP for Business Development, was also quoted widely.
Robert Leblanc, IBM’s cloud leader, in 4th and Ginni Rometty, the firm’s CEO, in 7th were both commenting on yet further acquisitions as Big Blue turns a shade of sky blue, building out its cloud strategy. The purchase of The Weather Company involved one of the largest amounts ever paid for a large data store – and, with it, a network of monitoring stations. Martin Jetter in 18th, John Kelly in 23rd and Martin Schroeter in 24th completed an impressive showing for IBM. Andy Jassy in 6th and Mark Hurd in 10th took centre stage at their respective events - re:Invent and Oracle World. Celebrating that Google’s apps had been successfully scaled to over 1 billion users, and the firm’s investment in 'transformative' machine learning, was Google Vice President Sundar Pichai. The firm itself was squeezed out of the top ten organisations for the first time. Ridiculed by some for suggesting back in April that HP was not a public cloud player, before withdrawing his comments and then finally accepting reality six months later, was Bill Hilf, HP’s SVP and GM Cloud, in 12th. And in 14th was Joe Tucci, Chairman of EMC Corporation – Michael Dell’s target as Dell makes a strategic move to gain scale, just as HP does the reverse by splitting into two. [table id=57 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Why do you need to be digital ready? Small businesses are crucial to the global economy; however, it can be difficult to grow and succeed in today’s competitive markets. In fact, a study conducted last year found that over half of small businesses in the UK fail to survive longer than five years. There are a lot of challenges that can inhibit their success: obvious ones, such as gaining new customers, managing increasing costs, or securing repeat business, as well as more unusual ones, such as gaining access to the skill pool, or having the experience to elevate an idea into an achievable business goal. [easy-tweet tweet="Over half of small businesses in the UK fail to survive longer than five years" user="ekernevez" usehashtags="no"] How small businesses manage the increasingly complex processes they face is key to business growth. Many don’t realise the impact that relying on inefficient processes, which make them slow to react and vulnerable to human error, can have on the bottom line. For example, the inability to access documents remotely or from a mobile device, fragmented information silos, or uncontrolled paper trails are all very big problems in a world where people expect immediate access to information wherever they are. Businesses are subsequently turning to digitisation, the cloud and multi-channel communications, all of which, when managed correctly, can provide long-term benefits. Investing in digital It’s fair to say that today’s businesses are operating in a world where digital technology is increasingly expected. Take communications, for example. While physical mail is still an important channel to communicate with customers, the last decade has undoubtedly seen a shift towards more digital channels.
More and more customers are choosing to be contacted via email, social media and everything in between, and it’s crucial that businesses have a well-managed, multi-channel communication strategy in place that considers their customers’ preferences. Failing to do this risks alienating a large number of customers and, subsequently, losing business. Aside from customer demand, digital technology offers companies very real business-growing benefits. One of the biggest is that it can provide a much more direct route to the intended recipient. Indeed, digital communication can reach its destination instantly. The speed with which customer queries are resolved or invoices and contracts are sent can be the difference between securing repeat business and prompt payment, or losing out to a competitor. "Going digital also increases the accuracy and reliability of communications" Going digital also increases the accuracy and reliability of communications. The processing, preparation and sending of documents can all be automated, removing the opportunity for human error. Moreover, removing manual processes also decreases the burden on employees, who are then able to spend their time on other vital business-growing activities. Cloud technology The cloud, in particular, has the power to simplify the way businesses operate. While investing in the cloud may seem daunting, it provides very real benefits. It grants access to enterprise applications at a fraction of the price and mostly without the need for any costly and time-consuming IT services. Additionally, it lifts geographical constraints and improves efficiency. For example, documents can be saved remotely and securely, making them accessible from anywhere with an internet connection. By failing to adopt cloud-based technology, organisations will undoubtedly find themselves one step behind everyone else. The cloud is also useful when it comes to archiving. Firstly, archiving in the cloud is more cost-effective. Paper records soon fill cardboard boxes and filing cabinets, which are then banished to the basement or the corner of the office. This takes up valuable space that is being paid for and could be better utilised elsewhere. Furthermore, by removing the need for physical onsite storage space, it enables businesses to buy only the storage capacity they need, providing greater pricing flexibility. With archiving key to remaining compliant with government regulations, the cloud makes it easier for businesses by passing a lot of the requirements and responsibility to the Cloud Service Provider (CSP). This is particularly useful for smaller companies who don’t always have the time or resources to make sure they are adhering to today’s stringent regulations. For example, the regulations require businesses to maintain a certain level of record security, a cost which is picked up by them if records are physically stored onsite. With the cloud, however, security is part of the CSP package, thus ensuring compliance without any additional actions needed. [easy-tweet tweet="Being ‘digital ready’ is something all companies need to be thinking about" user="ekernevez" usehashtags="no"] Digital and cloud technology is playing an important role in helping businesses overcome some of the traditional barriers to growth. Faced with strong competition, many new or small businesses face a long, hard battle to establish themselves, which is why it’s so important that they have the most efficient processes in place.
Being ‘digital ready’ is something all companies need to be thinking about – according to Gartner, all businesses will soon be digital businesses. By taking this step and implementing digital and cloud technology, businesses can reap the rewards of an efficient, streamlined communication strategy and be in the best position possible to overcome the odds and grow. ### October Global #CloudInfluence Organisations Retaining the top slot in the global #CloudInfluence rankings was Microsoft. The company has been shifting its focus to software and cloud services as demand for the Windows operating system slows in a weak PC market - despite the launch of Windows 10. The strategy led by Satya Nadella was applauded by Bernstein analyst Mark Moerdler (both appear in the individual rankings this month). It involves a reduced reporting structure with three rather than six segments, including its growing cloud and mobile businesses. Recently the firm announced that Azure revenues have more than doubled. October was a busy month, with re:Invent (the AWS fan fest) followed immediately by both Oracle World (the Ellison fan fest) and the Tokyo OpenStack Summit (the fan fest for the open cloud OS). [easy-tweet tweet="Re:Invent gave AWS a major boost, helping it to return to second place in the rankings." via="no" hashtags="cloudinfluence"] Re:Invent gave AWS a major boost, helping it to return to second place in the rankings. Among the many new services announced were the AWS Database Migration Service and the AWS Schema Conversion Tool, designed to simplify the challenge of switching databases, as AWS directly targeted database service renewals and therefore Oracle’s core market. Oracle World saw Larry Ellison and his horde fight back, throwing everything (including the kitchen sink) at cloud with a host of new services. Interestingly, Oracle, unlike HP, chose to maintain a presence in public cloud – promising to compete directly with AWS on price rather than scale – although exactly how Oracle can continue to compete on price in the longer term without scale remains unclear. Oracle was in 5th in the rankings, behind both IBM and Dell. IBM maintained its challenge to the mega-scale cloud players with yet more strategic acquisitions as it continues to build out its cloud proposition. And Dell opted to steal the headlines with the biggest tech acquisition in history – namely the $67 billion purchase of EMC and with it 80% of VMware (as we covered in our earlier blog). HP made it into the top ten in the rankings for the first time, and indeed the last time as a single entity. Its retreat from public cloud and its 1st November split into two separate firms raised its profile higher than its cloud activities ever managed to do. VMware also appeared in the top ten amid speculation on its future following Dell’s acquisition of EMC. While the Tokyo OpenStack Summit (which we attended) gave all the main OpenStack players a slight boost, questions remain about the long-term viability of the open cloud OS. The concerns were probably best outlined by Peter Willis, BT's chief researcher for data networks, when he broke ranks with what has otherwise been a united front from all OpenStack vendors and supporters, with an outspoken presentation at the SDN & OpenFlow World Congress in Germany.
He outlined six challenges that OpenStack needs to address to be taken seriously: the connection of virtual network functions (VNFs), service chain modification, scalability, so-called "start-up storms", the security of OpenStack over the Internet, and backwards compatibility (see http://ubm.io/1HsIjga). In Tokyo, Jonathan Bryce, Executive Director of the OpenStack Foundation, acknowledged that BT’s concerns were valid and that they needed to be addressed, but without outlining how or when its contributors would be doing so. With Azure and AWS practically doubling their cloud revenues each year, Bryce declined to suggest a rate of growth that would be acceptable for the OpenStack community if it is to keep pace with the mega-scale vendors as it attempts to become enterprise ready and address BT’s concerns. Notable appearances further down the rankings included The Weather Company in 17th – one of IBM’s recent acquisitions. Normally we exclude weather companies from our rankings as they talk about the wrong kind of clouds, but in this case their inclusion was valid. Indeed, if the combination of The Weather Company’s meteorological data and IBM Watson’s predictive analytics can provide improvements in weather forecasting, then I for one am all for it. [table id=56 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### 5 Top tips for providing effective IT support 1. Evaluate how you currently work Ask yourselves “is this the best we can do?” Are you still working with outdated or legacy processes? Have you been working a certain way due to the limitations of the tools available? Are there small changes you could make to improve the support you offer your customers? Don’t reinvent the wheel – there are already best practice guidelines you could follow, such as ITIL. The world of IT changes every day; don’t get left behind. [easy-tweet tweet="The world of IT changes every day; don’t get left behind" user="comparethecloud" usehashtags="no"] 2. Utilise Self-Service Implementing Self-Service will not only save your analysts time by reducing the number of incoming calls, it will also mean they have more time to fix incidents, improving your SLAs and customer service. Out-of-hours help and support can be provided through a dynamic knowledge base that gives your customers the tools to resolve their own incidents. 3. Reward and recognise Apply gamification techniques to reward and challenge your staff to provide great support at all times. You can drive and motivate your support staff in a more engaging manner by providing rewards for exceptional effort and challenges for agents to compete among themselves. Happy staff means happy customers. 4. Social media Allowing customers a choice of platforms to contact you on enables you to provide a more responsive service. Tracking hashtags or keywords such as your company name allows your organisation to be more proactive in both customer incidents and customer feedback. [easy-tweet tweet="Tracking hashtags allows your organisation to be more proactive on customer support" user="comparethecloud" usehashtags="no"] 5.
Growth of support Look at other departments within your organisation, such as Facilities, Health & Safety and HR, and review how you could offer one effective support solution to the organisation. This will not only reduce overall costs, but also reduce duplication and improve visibility of the support needs of your organisation. ### The password is passé The password is dead. It’s time to admit it. After all, 75% of security breaches have been down to weak passwords, and security experts have started to accept that even strong passwords aren’t good enough. Security breaches have become a regular news feature. The recent hacks of user accounts at eBay, Ashley Madison and LinkedIn have highlighted the need for more secure forms of authentication. More recently, TalkTalk users were told their data might have been compromised in a cyber attack. Yet it appears that, despite these well-publicised attacks, users and service providers are slow to tackle the issue of inadequate password protection head on. Cyber attacks are easier to manage, and much less likely to occur in the first place, when users are prepared. Most Internet services can be accessed with just a username and password. Email addresses are often used as usernames, making it fairly easy for a hacker to find a valid login through brute force. The trend for businesses to use standardised email platforms (e.g. Gmail) is only amplifying security issues. To make this worse, passwords themselves come with several problems. Is it a case of user error? The number of active monthly Facebook users is almost 1.5 billion. Recent research found that a significant share of these users aren’t aware that they are on the Internet when using Facebook. This is particularly true of emerging markets, where the figure can be as high as 65 per cent. This may be explained, at least in part, by the fact that many of these countries have leapfrogged the fixed-line connections a lot of us will have grown up with and moved straight to a mobile environment. That said, in the US, the share of those who didn’t know that Facebook is dependent on the Internet was still five per cent. This shows how the spread of the Internet across the globe has not been accompanied by an awareness of how it works, leading to low levels of IT security awareness. Another factor is that people are still unconvinced that strong, unique passwords are essential to protect data. With insecure passwords abounding, users are making themselves - and their employers - vulnerable to attack. Strong passwords are often seen as an annoyance and overly complicated. A ten-digit password with lower case, upper case, numeric and special characters is difficult to come up with and even harder to remember. And frankly, when we entrust our bank balance to a four-digit PIN, it is understandable that many question why Facebook photos need to be protected with complex passwords. Put it on a Post-it note We’re told over and over again that our password shouldn’t be a pet’s name, date of birth or another easily guessed phrase. IT departments often set requirements for, or even generate, secure passwords for company IT services. The problem with these passwords is that they are only as secure as the way they are transmitted and stored. Passwords are hardly secure if they’re scribbled down on post-it notes and stuck to computer screens.
Slightly better than the pen and paper storage method, although just as far from secure, is when the generated passwords are emailed unencrypted to the user, to be stored on their computer and mobile phone forever. Situations like this don’t just happen in the office. When users sign up to websites outside of work, confirmation emails showing the username and password are sometimes emailed to the user, proving the password is stored, unencrypted, by the service provider. It’s high time for 2FA [easy-tweet tweet="Passwords alone are no longer enough to protect data" via="no" hashtags="datasecurity"] What’s clear from all of the above is that passwords alone are no longer enough to protect data. While PINs might seem like a very simple approach to protecting bank accounts, they are actually a method of two-factor authentication (2FA). It is based on the user both having the card and knowing the PIN that goes with it. 2FA has been around for many years but has mainly been used by large enterprises because of the cost of ‘tokens’ like RSA SecurID. With smartphones becoming more common, however, the cost of providing authentication devices can be reduced significantly. 66% of adults in the UK use smartphones, making them a cheap and handy medium for 2FA. By using a combination of apps and algorithms, we can generate Time-based One-Time Passwords (TOTP) with the same security as a physical token. These passwords can be used just once and are only valid for a short period of time. They are created using a constant shared between the client and server, and a variable: time (a minimal sketch of how this works appears at the end of this article). Smartphone users can download a free app to their phone to generate TOTPs for multiple services. While the number of people with smartphones is growing, 2FA can still be used by people without one. For example, text messages can be used to send one-time-use codes to users. You may have experienced this when asked to authenticate your Google account, for example. 93% of UK adults use a mobile phone, so codes sent via SMS could be used by the majority of the population. What’s next? It’s clear that traditional passwords are no match for the sophisticated tools hackers now have at their disposal. The industry needs to encourage the migration towards 2FA by explaining the importance of strong authentication. More importantly, it needs to make 2FA more appealing to users by providing simple, easy tools rather than making people jump through hoops to secure their data. The industry needs to encourage the migration towards 2FA Some companies have already started to take this step. Some banks, for example, require 2FA to transfer money or log in to online banking. Several web-based companies have also taken this approach and made it as simple as possible for their users to get to grips with 2FA. Making the process clear and simple for non-experts is key to ensuring 2FA is used. Two-factor authentication is not used widely yet, but it’s only a matter of time before it is seen as a must-have, not an optional extra, and becomes second nature for all of us. After all, there’s no such thing as being over-protected when it comes to data. Then the only issue that remains is ensuring that those we entrust with our data keep it safe at their end.
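For the curious, here is a minimal sketch of how a TOTP code can be derived from that shared secret and the clock, broadly following RFC 6238. It is written in Python using only the standard library; the base32 secret shown is an illustrative placeholder rather than a real credential, and production systems should rely on a vetted authentication library rather than hand-rolled code.

```python
# Minimal TOTP sketch (RFC 6238 style): both the authenticator app and the
# server run the same calculation over a shared secret and the current time window.
import base64
import hashlib
import hmac
import struct
import time

def totp(shared_secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    key = base64.b32decode(shared_secret_b32, casefold=True)
    # The moving factor is the number of 30-second intervals since the Unix epoch.
    counter = struct.pack(">Q", int(time.time()) // interval)
    digest = hmac.new(key, counter, hashlib.sha1).digest()
    # Dynamic truncation: the low nibble of the last byte picks a 4-byte slice.
    offset = digest[-1] & 0x0F
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Placeholder secret for illustration only; a real secret is provisioned per user.
print(totp("JBSWY3DPEHPK3PXP"))  # prints a 6-digit code that changes every 30 seconds
```

Because the code is short-lived and never reused, intercepting a single value is of little use to an attacker once the time window has passed.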
### Why More Businesses Are Moving to Hosted VoIP Solutions Cloud-based VoIP solutions are increasingly being used by businesses of all sizes for their flexibility and pricing. Choosing a hosted VoIP solution makes sense in many ways over traditional landlines for business use. Cloud-based phone systems are increasingly popular among business owners with 5-10 users, although larger corporate clients (100+ users) are moving to cloud-based VoIP solutions as well. [easy-tweet tweet="VoIP telephony, uses packet-switched telephony over the internet to route calls to devices" user="comparethecloud" usehashtags="no"] The reasons are first and foremost cost and improved efficiency compared to a standard telephone system. Another reason for its increased popularity is the increase in broadband speeds in cities. Business owners now have access to faster and cheaper internet connections with higher bandwidths, and a new office can sometimes already be well equipped for hosted VoIP. Hosted VoIP Overview VoIP (voice over IP) telephony uses packet-switched telephony over the internet to route calls to devices, as opposed to the circuit-switched systems used in traditional telephony. Although the technology is not particularly new, it is now ubiquitous, reliable and affordable enough to make it a serious contender for business phone systems. Gone are the days when you would routinely experience dropped or unintelligible calls; with many businesses now having fast internet access, and with improvements to VoIP systems, it makes sense that many business owners are opting for phone systems such as these. Low Total Cost of Ownership (TCO) & Fast Deployment With a cloud-hosted solution, the cost savings can be immediately apparent: there are no copper wires to be installed in the business owner’s building; no expensive hardware such as a PBX is needed, as the hosting provider runs the hardware remotely at a data centre; the phone system can easily be upgraded by way of new software updates; features such as call routing, Caller ID and Voicemail to Email often come with hosted VoIP solutions; and redundancy options are also available in case you lose internet access. Cost reduction is further enhanced by not having to employ staff to maintain an on-premise system. Managing servers and PBX systems can be prohibitively expensive. Further cost reduction is possible as the service is billed at a low monthly rate. It is important to check what exact features come with the VoIP solution you are looking to deploy. Some service providers deliver their phone systems with additional features such as IVR, remote call logging, contact saving, messaging systems and more. Deployment is as fast as setting up IP-based handsets, and it can scale as business needs expand. If bandwidth issues occur, a new line can be installed to cope with more phone users. Interestingly, data provided by PMC Telecom shows that over 90% of all hosted VoIP customers are from major UK cities, and over 50% of these are from Manchester or London. Other reasons businesses are making the move to Hosted VoIP The features that come as standard with hosted VoIP are becoming an “out of the box” solution. For example, features you would otherwise have to scope and cost separately – DDI, call recording, call forwarding/diverting at a user level, call queuing and on-hold music – all come as standard. Even with the on-hold music, there is a large variety of stock audio to use at no extra charge. In addition, call recording is straightforward, as all calls are recorded into your cloud and are accessible via the online portal which you use to manage your hosted VoIP system. Simply put, it saves time and is cost efficient.
A big advantage for businesses with multiple locations is that calls are entirely free office to office. This is the case even if your other office is on the other side of the world. Some businesses use just one VoIP phone or Skype to do this, but now that you can integrate it, it makes a lot more sense. Reduced Environmental Footprint Environmentally conscious business owners can also reduce their carbon footprint by moving to a VoIP solution. Billing is generally managed via your online portal, so there are no paper bills. If you have staff who work from home, they can also be connected to the internet through cost-effective business broadband solutions, which also come with a separate phone line; any charges incurred will appear on the main bill. Because server costs are borne by PMC, you also reduce your total power consumption – all data is stored remotely, thus reducing your net carbon footprint. [quote_box_center] PMC Telecom PMC Telecom is a British-based cloud hosted VoIP provider located in Manchester that provides old-style systems, as well as the new hosted VoIP. “Our business is based on telecoms; if we don’t move with the times, we won't survive. We have recently started offering hosted VoIP, and we are surprised at the reaction. Many businesses are now taking the plunge with hosted VoIP, but most have no idea what it is. If a customer wants a very expensive phone system, we let them know about hosted VoIP, and most either take it or do not trust making calls via an internet line. But who can blame them? Try to use hosted VoIP on a shared line, or look at early adopters of hosted VoIP, and it’s been a disaster. If your infrastructure isn’t there, it isn't going to work. Interestingly, our customer base is expanding more rapidly in Manchester and in London. This could be down to population density; however, there is strong evidence to suggest that two of our major cities are becoming early adopters of this technology. Very likely this is due to the fact that it makes our job a lot easier when a company already has the infrastructure in place. A problem with hosted VoIP, however, is that a lot of offices are on industrial parks. These are the last places for BT Openreach to install superfast broadband - and if your internet isn't good enough, hosted VoIP simply isn't an option, unless you install an additional line. This is, however, still really cost effective, and there is evidence to suggest this may start to be the “go-to” solution for phone systems” [/quote_box_center] ### Cloud Cover: It Takes Two to Tango Why IT departments must get in step with cyber-insurance providers The recent TalkTalk hack is a CEO’s nightmare come true, involving everything from calls for the CEO to resign to massive drops in the share price, with significant reputational damage thrown in for good measure. The reverberations of that attack are no doubt now being felt around the country as other CEOs desperately seek reassurances from their staff regarding the robustness of their company’s cyber defences. What it will also, no doubt, stimulate are telephone calls to those insurance companies that offer cyber-insurance policies, as some CEOs will judge that now would be a good time to start sharing some risk. [easy-tweet tweet="CEOs are desperately seeking reassurances from their staff regarding their company’s cyber defences" via="no" usehashtags="no"] Their IT departments will play a role in that process.
But according to research* that my company (Wallix) recently carried out, the chances are that that role will be a lesser one rather than a greater one, and that these IT departments are missing out on an opportunity to significantly raise their profile internally. Paying an insurance company to share some business risk makes good commercial sense. It’s what the Lloyd’s insurance market has done for centuries and it’s what insurers are now doing in offering cyber-insurance policies. In fact, so swiftly has the market grown that global gross written premiums nearly tripled in just two years, from $850 million in 2012 to $2.5 billion in 2014. the CEO can sleep more easily at night At first glance, once that cyber-insurance premium has been paid and the policy put in place, the CEO can sleep more easily at night, knowing that his or her company is covered. The reality, of course, could be very different and they could still wake up to find that the insurance company won’t be paying out. Here’s why. The ideal form of cyber risk management is a careful balance between appropriate internal IT security measures and the transfer of risk to the insurance company. But those security measures must be enforced, be enforceable and be working. We found two specific areas where this may not be the case, with in-house IT departments potentially putting their own companies in jeopardy. The first area concerned making security updates. Nearly half of the respondents who took part in a survey that we carried out as part of our research thought it would be either quite difficult (43%) or very difficult (10%) to ’identify whether…security software fails to make critical updates’. In the event of a cyber-attack triggering a claim on the policy, this is one of the first areas that the insurance company will look at and, in those circumstances, it seems that our unlucky 43% would have some explaining to do. The second area concerned the IT departments’ – some might say – laissez-faire attitude toward staff access. 50% of the sample felt that it would be either ‘difficult’ or ‘very difficult’ to identify whether any ex-employees still had access via accounts to resources on their network. The same percentage (50%) thought the same about ex-third-party providers accessing their network, and an even bigger proportion (55%) thought the same about ex-contractors accessing their networks. Former staff represent the greatest threat to cyber security Of these three groups, former staff represent the greatest threat. Research shows that 88% of cyber-attacks carried out by insiders came from permanent staff, 7% from contractors and only 5% from agency contractors. So not knowing which of your former employees still had access to your network seemed a mighty big security lapse to us, and one that the cyber insurance company would want to bring to the attention of senior management too when turning down the insurance claim. And let’s be clear. Although the vast majority of press coverage about cyber-attacks implies that they are being carried out by outsiders, the reality is that the majority of attacks are actually being carried out by insiders (55%). [easy-tweet tweet="The reality is that the majority (55%) of attacks are actually being carried out by insiders" user="comparethecloud"] [quote_box_center]So what can IT departments do about this state of affairs? Our recommendations are as follows: If your company is considering taking out a cyber-insurance policy, get involved in the decision-making process.
(This seems obvious, but 14% of our respondents didn’t know that their company was considering buying one!) Make sure that you have a clear understanding of the limitations of your existing technology and how that may affect your cover. Make sure that your regular and automated security activities (updates, patches, signatures, etc.) are working. Maximise your own visibility. If you suffer a breach, the insurance company will want to attribute the source, and the more data you have, the easier your job will be. Know your access control weaknesses. Most cyber insurance policies assume you have complete control and that you have visibility of every user who has access to your infrastructure. [/quote_box_center] According to some research carried out by the Ponemon Institute on behalf of Raytheon earlier this year, their respondents felt that cybersecurity will become a source of competitive advantage for firms within three years. In other words, those companies operating with the highest levels of IT security in place will gain market share at the expense of others with poorer defences. [easy-tweet tweet="#cybersecurity will become a source of competitive advantage for firms within three years" via="no" usehashtags="no"] The role of the in-house IT department in this ‘cyber security arms race’ (as Dido Harding, the embattled CEO of TalkTalk, called it) will become increasingly significant and will offer significant career opportunities, but only if those working in or managing those departments project themselves forward more forcefully. *You can read more about our research in our report (‘We May Not Have It Covered’). ### A Cloud journey to take you sky-high Reaching the end-point of a Cloud journey may seem like a distant prospect for many. Indeed, Redcentric’s ‘Journey to the Cloud’ research found that 41% of organisations have just begun the process of rolling out Cloud services, only a third have reached the halfway point and just 3% have completed their cloud implementations. It’s likely, then, that some organisations are establishing Cloud services tentatively, suggesting that they still require guidance, support and encouragement to give them confidence that they are working according to best practice. [easy-tweet tweet="41% of organisations have just begun rolling out Cloud services" user="Redcentricplc"] If the journey to the cloud is stalling, the problem may lie with the industry that the organisation operates in. Less than half (44%) of organisations claim that they work in a pro-Cloud industry, calling for an attitude change in certain sectors to make them more embracing of Cloud. Around one in five IT managers in public sector organisations, for example, told us that they work in a ‘quite anti-Cloud’ sector, so it may be hard for them to win over acceptance when implementing the Cloud. Don’t see it as beyond you, however, to convince board-level executives that Cloud is the way forward. As an IT manager, it is down to you to show how the Cloud can drive innovation in the organisation and support business objectives. only 4% of organisations were found to be using cloud to take risks While the Cloud is a worthwhile tool to support innovation, only 4% of organisations were found to be using it to take risks. It’s quite rare for organisations to accept the ‘ups and downs’ that the Cloud can offer, or to be open-minded about what direction they will take.
Instead, the most common approach to adopting the Cloud is to be evolutionary, as half of organisations showed signs of taking a steady approach where Cloud is a natural progression for the organisation. But this isn’t using the Cloud to its full advantage. Every organisation faces the need to innovate. Often this is to respond to a changing market place, maintain a competitive edge, or just to increase efficiency. The Cloud can change your whole business model, for example, by enabling services to be put online. Alternatively, the Cloud can let you scale up IT when seasonal demands occur, giving you flexibility of data storage. The Cloud can help you reach goals like this, but you’ll struggle to do so if you don’t trial and test different services, and instead just use the Cloud as a straight technology replacement. [easy-tweet tweet="The Cloud can change your whole business model" user="Redcentricplc" usehashtags="no"] Implementing and maintaining the Cloud has its risks, so it’s common for organisations to turn to a key stakeholder when concerns arise. A clear majority (79%) of organisations say that the in-house IT team is seen as the most valuable ‘partner’ when adopting the Cloud. This means that organisations are likely to look internally, as opposed to the service provider, when looking for Cloud guidance. This calls for the in-house IT team to receive substantial training prior to beginning a Cloud roll-out to ensure that they can advise and consult the rest of the organisation adequately. But even when you’ve won over Cloud acceptance internally, established how it will support your organisation and prepared for the transition, it’s important to establish your ultimate Cloud destination. While it pays to experiment along the way, you must ensure that you have a strategy in place to get the full effects. The journey you take can shape the destination you arrive at, so ensure that you take a route that meets your needs. Without a strategy, rolling out the Cloud can be a long and onerous process, but with careful planning you can be confident that the process is as quick, effective and safe as possible. ### Can ERP in the Cloud help CIOs meet their business objectives? On-premise Enterprise Resource Planning (ERP) software has earned itself a reputation for being expensive, time-consuming and inflexible. Until recently, however, many CIOs would have felt that moving a business-critical solution such as an ERP system to the Cloud was too risky, because of concerns that the Cloud couldn't deliver the reliability, speed and security needed for such a crucial system. But as trust in the Cloud increases and disruptive competitors emerge in almost every industry, CIOs are exploring whether ERP in the Cloud can give them the business agility and flexibility they need to stay competitive in today’s fast-moving and ever-changing business environment. So what are the potential benefits and challenges for organisations considering ERP in the Cloud? [easy-tweet tweet="CIOs are exploring if ERP in the cloud can help give them the business agility and flexibility they need" via="no" usehashtags="no"] Improved agility and scalability For CIOs considering a move to the Cloud, the main drivers are improved business agility and scalability. Today’s businesses need to be able to accommodate and adapt to changes in the business environment, including responding to threats from nimble start-up competitors, by being able to scale up and down according to the needs of the business.
Improved agility and scalability means that ERP in the Cloud is also ideal for businesses looking to set up a new office or quickly deploy to new markets overseas, as the need for additional hardware is removed and the associated implementation effort is significantly reduced. Enabling enterprise mobility For organisations concerned with improved enterprise mobility, particularly those that embrace bring-your-own-device (BYOD) policies for smartphones and tablets, ERP in the Cloud is a great enabler. Cloud solutions allow for easier and faster mobile application management and integration, no matter the operating system or whether the device is corporate-owned or BYOD. Reduce in-house IT support In addition, ERP in the Cloud reduces the need to have a large IT support resource in-house, as well as the costs associated with running your own data centres. It provides the resource allocation flexibility that comes with avoiding large upfront investments in the early phases of deployment. It also means that businesses will have access to the latest software capabilities and regular updates instead of larger, disruptive updates every few years. Improved collaboration Improved collaboration is also an important benefit of moving ERP to the Cloud, as it becomes much easier to collaborate with staff internally, as well as with partners and clients externally. For example, a retailer may need to work closely with, and plug in (integrate), numerous external partners in its supply chain. Trial ERP in a ‘lab’ environment Forward-thinking organisations are investing in ‘lab’ environments to trial new tech. The Cloud makes testing different scenarios or processes, customisations and prototyping possible. All of this can be done quickly and could be a great tool for CIOs, helping them to decide whether or not moving ERP to the Cloud is right for them. Organisations can see how this would work for them by using preconfigured industry template models and testing them on their own business. By testing ERP in the Cloud in a lab-style environment, businesses can gain a better understanding of the impact this would have if they were to roll it out across a line of business or their entire organisation, helping them make the right decision. Looking forward to the Internet of Things A move to the Cloud should also be considered in terms of what it can enable in the future. For example, the Internet of Things (IoT) is a growing area of focus for many CIOs, and getting a handle on the ever-increasing amount of customer data – which has the power to tell you more about your clients and how they are using your services – will be crucial. ERP in the Cloud is an enabler for this as it provides the flexibility and scalability to be able to collect and analyse increasing streams of data in real time. ERP in the Cloud can assist with collecting, cleansing and analysing an organisation’s data and turning that into the intelligence that organisations need to better serve customers. Using ERP to collect and analyse all of that data will lead to better, more proactive relationships with customers and enable the development of new business models and service offerings. Consider the journey to Cloud The challenges and complexities of moving an organisation’s ERP system to the Cloud should not be underestimated. Careful consideration of the journey from its existing ERP solution to its future state needs to be given not just to the technology change but also to the impact on the people and processes within the business.
Implementation partners will be able to help you plan for these changes, whichever stage you are at on the journey to Cloud. Organisations will still need to go through the single biggest task associated with any ERP implementation; data migration. In addition, integration with existing on-premises technology will still need to take place which can be time consuming and challenging. Having said that ERP in the Cloud will accelerate development and deployment considerably in comparison to on-premise. Is it right for your business? While moving to the Cloud may be the right step for some, every organisation is unique, has different business objectives and is at different stages in its life cycle. For relatively young and fast growing companies which do not already have an existing ERP system, ERP in the Cloud is likely to be the ideal choice. For others the decision may not be as clear cut and will depend on your individual business objectives. The decision to move an enterprise wide, business critical system such as ERP to the Cloud should be well considered, thought out, explored and tested out on your own business processes. The potential benefits particularly around agility and scalability make moving ERP to the Cloud a worthwhile consideration for everyone. Implementation partners should be able to work with you to test if ERP in the Cloud can help meet your business objectives and help you on the journey to deciding if Cloud is right for your organisation. ### The Power of Cloud: Helping SMEs Focus on the Bottom Line With money, livelihood and reputation on the line, running a small business can be tough. Every day is a careful balancing act of sales, solvency and customer satisfaction, all in an often unpredictable bid for success. It takes hard graft, an understanding of how to get an idea off the ground – and crucially, how to keep a business running. As a result, SME owners have to be laser-focused on the bottom line in order to keep their heads above water. [easy-tweet tweet="50% of SME owners put off doing the books" user="QuickBooksUK" usehashtags="no"] Poor financial management is one of the key reasons so many firms fail in their early days, with 44 percent of SMEs either running out of cash or coming very close to doing so within the first three years. Despite this, research shows that 50% of SME owners put off doing the books. Whether it’s keeping on top of invoices or managing cash flow, financial management is not why most people decide to start their own business. It can be costly, time-consuming and complex, and as a result, it is something that often gets pushed to the back of the queue. Anything to make this process easier goes a long way. Hiring an accountant is one option and can be an incredibly valuable asset, but sometimes, particularly in the early stages, this is a luxury many SMEs cannot afford. Tackling spreadsheets Many SME owners rely on spreadsheets to do their books. However, SME owners we spoke to said this often leaves them stuck in a cycle of spending hours each month, trawling through reams of paper-based records. In fact, many will waste up to a week a year solving issues like understanding formulas, getting the numbers to add up and maintaining version control. But advances in technology are making it possible for SME owners to move their finances online to work anytime and anywhere, making their bookkeeping much more efficient. Taking control with the cloud Most consumers use the cloud every day, probably without even realising. 
From buying goods through Amazon, listening to music on Spotify or sharing files via DropBox – all are impossible without the cloud. Many SMEs are starting to manage their finances in the cloud, conducting back office work like bookkeeping and accounting. A study found that 37% of SMEs are already adapting to the cloud, a figure that is set to rise to 78% by 2020. The advantages are obvious: the same ease of access, to get the right information at the right time, also applies to cloud accounting and can be a real game changer for business owners. Getting a complete overview of the company finances at the touch of a button enables them to make better informed business decisions very quickly. For example, they might look back at their last quarter and decide it’s time to pay their staff a bonus or plough more investment into a new technology solution. At the other end of the scale, they could avoid a nasty surprise by realising they need to ramp up sales quickly. The business benefits of cloud Not only does cloud accounting provide the owner with the bigger picture, it also helps in the day-to-day maintenance of the business. The software automatically syncs and categorises bank data, saving time and reducing data entry errors. It also removes the headache of VAT by providing the information a user needs to complete their VAT return every quarter.  Having a cloud-based solution also means employees are no longer confined to their desk. Logging into work systems and files via the cloud on a smartphone, tablet or laptop, employees can check the real-time status of an invoice while still with a client, collaborate with suppliers in real-time from a coffee house, run payroll from the road or even accept credit card payments for a sale. One of the biggest pain points for a SME owner to tackle is cash flow – how do you control what’s going out when it’s uncertain how much is coming in? This can lead to problems when new infrastructure or upgrades to business processes are needed, as these often require significant upfront investment. But with cloud tools, this capital investment isn’t needed because the business is subscribing to the use of a product in a pay-as-you-go model. The owner typically only pays a fixed monthly fee, with some services even allowing them to pay for only what they use.  Running the accountancy side of a small business has the potential to be a time-consuming and costly process. By taking advantage of the technology available, SMEs can use the cloud to relieve themselves of much of the burden of financial management. And the silver lining? Getting SME owners back to doing what they do best; designing new products and services, meeting customers and prospects and ultimately, driving their business forward. ### Building strong links in the retail security chain Most people are aware that credit card fraud is a real concern, but how many know just how extensive a problem it is? Do they know, for example, that more than half of the largest security incidents ever recorded have involved card data? Take the Heartland breach in 2009 which compromised up to 100 million cards and more than 650 financial services companies. Or the Home Depot breach in 2014 involving a five month attack on the retailer’s payment tills that is estimated to have compromised as many as 56 million credit cards. 
[easy-tweet tweet="More than half of the largest security incidents ever recorded have involved card data" via="no" usehashtags="no"]  Credit card fraud is also widely targeted, affecting a number of different sectors. The Trustwave 2014 Global Security Report found that most breaches last year targeted the retail (38%), food and beverage (18%), hospitality (11%), finance (9%) and professional services (8%) sectors. The bulk of those attacks were aimed at those organisations’ e-commerce platforms (54%), point of sale (33%) and data centres (10%).  There has also been a dramatic shift in the type of credit card fraud over the past few years, according to Financial Fraud Action UK 2013 stats. In 2002, 35% of fraud losses were from counterfeit cards and 26% from card-not-present fraud. By 2012, the figure for counterfeit cards had declined to 11% but losses from ‘card-not-present’ fraud accounted for 65% of the total. The reasons for this is the emergence of chip and pin, which has made it harder to commit fraud with counterfeit cards in tandem with a huge increase in e-commerce activity that has made card-not-present fraud more attractive. The fight against credit card fraud has been spearheaded by the PCI SSC Relying on PCI That’s why the payment industry has taken a standards and compliance approach to the problem, asking the industry as a whole, and retailers who rely on it, to formalise their approach to security. The fight against credit card fraud has been spearheaded by the Payment Card Industry Security Standard Council (PCI SSC), which was established to help businesses process card payments securely and reduce card fraud. The organisation developed the worldwide PCI Data Security Standard (PCI DSS), a set of requirements designed to ensure merchants and service providers adequately protect cardholder’s data. PCI DSS requirements apply to all payment channels, including retail shops, mail/ telephone order companies and e-commerce businesses. There are different requirements depending on a range of criteria, such as cardholder data storage, processing channels, security protocols transaction volumes and so on.  For example, merchants processing more than six million transactions a year are subject to an annual on-site audit and a quarterly vulnerability scan. Those with fewer transactions need to take part in an annual self-assessment questionnaire and a quarterly vulnerability scan.  Organisations are required to install and maintain a firewall to protect cardholder data, encrypt the transmission of cardholder data across open public networks and develop and maintain secure systems and applications. They also need to restrict access to cardholder data on a ‘need-to-know’ basis (when access rights are granted to only the minimum amount of data necessary to perform a task), track and monitor all access to network resources and cardholder data, regularly test security systems and processes and maintain a defined formal information security policy. Enforcement measures such as audits and penalties for non-compliance may be necessary.  [easy-tweet tweet="Security is only as strong as the weakest link" user="comparethecloud" usehashtags="no"] Security is only as strong as the weakest link For most retailers, the security challenge goes well beyond their own internal systems and efforts to comply with important standards such as PCI.  
The increasing importance of cloud computing means that retailers need to look towards their technology partners and suppliers to ensure there are no weak links in their security chain. But closing the loop internally and working to the highest security standards will ultimately stand for very little if vulnerabilities exist within technology partners. In the case of the Target breach, for example, it’s thought by many to have originated via a third-party vendor linked to Target. It’s perhaps the most high-profile illustration of the need for retailers to also draw their partners into their security circle, and to ensure consistency of approach. Retailers can’t just build a virtual ‘wall’ around their own systems Retailers can’t just build a virtual ‘wall’ around their own systems and rest easy believing their defences are strong. As the importance of third-party technology partners grows – particularly in areas such as cloud computing – retailers need to understand and trust the security standards of every partner providing a link to the outside world. PCI DSS compliance can be a complex, time-consuming and expensive business, especially for smaller companies that have enough on their hands trying to meet the obligations of running their day-to-day operations. In many instances, they might be better served working with a service provider that is already PCI compliant and can take away a lot of the burden associated with achieving the PCI DSS requirements. This can provide organisations with access to secure networks that protect cardholder data and meet the key security requirements of PCI DSS while guaranteeing best practice in the face of an unwelcome increase in external threats to data – and customer – security. ### A look into the future: How much can predictive analytics tell us? Being able to predict the future has been an ambition of mankind’s for centuries. Whereas ancient seers used to base their prophecies on celestial movements, our present-day approach to predicting the future is much more scientific. In fact, our desire to control and make sense of the disorder that we see around us in everyday life has led to a branch of statistics concerned chiefly with predicting what is yet to occur: predictive analytics. [easy-tweet tweet="Our desire to control the disorder that we see in everyday life has led to predictive analytics" user="bsquared90" usehashtags="no"] Since the birth of computer modelling, predictions have developed from those involving relatively simple mathematics to the kind that can only be made by the most advanced supercomputers. Early examples of predictive technologies included automated anti-aircraft weapons that used sensors to measure flight speeds and angles to help hit targets. Other early developments included the use of computers to aid weather forecasts, air travel and credit risk decisions. However, the sea change that enabled predictive analytics to enter the mainstream was the enormous increase in the amount of data being produced on a daily basis. The rise of Internet and smartphone technologies has meant that the amount of data being created is growing exponentially. In fact, the digital universe is doubling in size every two years and is expected to multiply 10-fold between 2013 and 2020. This big data explosion has given predictive analytics software much more information to work with, making predictions relevant to a broader range of businesses.
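To make the idea concrete, the sketch below shows, in deliberately simplified form, the kind of model that sits behind many of these predictions: score prospects on how likely they are to take an action, then rank them. It assumes Python with scikit-learn installed, and every feature name and number is invented for illustration; it is not drawn from any study or product mentioned in this article.

```python
# A minimal, hypothetical predictive-analytics sketch: learn from past
# behaviour, then rank new prospects by their predicted likelihood to respond.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy historical data: [visits_last_month, past_purchases, opened_last_email]
X = np.array([
    [1, 0, 0],
    [4, 1, 1],
    [2, 0, 1],
    [8, 3, 1],
    [0, 0, 0],
    [6, 2, 0],
])
y = np.array([0, 1, 0, 1, 0, 1])  # 1 = responded to a previous campaign

model = LogisticRegression().fit(X, y)

# Score new prospects and rank them by predicted probability of responding
prospects = np.array([[3, 1, 1], [1, 0, 0], [7, 2, 1]])
scores = model.predict_proba(prospects)[:, 1]
for customer, score in sorted(zip(prospects.tolist(), scores), key=lambda t: -t[1]):
    print(customer, round(float(score), 2))
```

Nothing about the model itself needs to be exotic; the value comes from feeding it far more historical data than any person could review by hand, which is precisely what the growth in data volumes described above makes possible.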
the digital universe is doubling in size every two years Predictive analytics has already been implemented across a wide variety of industries and its applications are continuing to grow. In healthcare, Stanford University used predictive analytics to train computers to analyse microscopic images of breast cancer tumours, while Google Flu Trends research was able to analyse search patterns to forecast an increase in influenza cases at a specific hospital. Similarly, urban planners have been able to use data to predict potential traffic bottlenecks and financial institutions have looked to statistics to foresee possible risk levels in their work. However, perhaps the most commonly used application of predictive analytics is in marketing, and the wealth of data now available about existing and potential customers represents a huge financial incentive for businesses. In a recent speech on the potential benefits of predictive analytics, Eric Siegel, an expert in the field, highlighted a less obvious example of the technology being used for marketing purposes. During the 2012 United States presidential election, Barack Obama’s campaign used predictive analytics not to determine which citizens would vote Democrat, but which ones would be receptive to campaign activity. The campaign team gave each individual voter a predictive score on how likely they would be to be influenced by certain types of activity – a tactic known as persuasion modelling. In this sense, Siegel claims, predictive analytics “empowers an organisation not just to predict the future, but to influence it.” [easy-tweet tweet="predictive analytics empowers an organisation not just to predict the future, but to influence it" user="bsquared90" usehashtags="no"] Many other marketing campaigns have also used predictive analytics to good effect by identifying customers that are more likely to respond to activity. Using data from previous campaigns, in conjunction with any other information they may have about customers, businesses can produce more targeted materials. Ultimately, predictive analytics doesn’t need to tell a business with certainty that an individual will purchase a product after viewing a campaign. Even a prediction that provides an increase of just a few percentage points in terms of marketing conversion rate could be worth thousands to a business. While most current examples of predictive analytics can’t tell us exactly what will happen next, the future of the technology is promising. Five or 10 years down the line, predictive analytics may be able to improve road safety by foreseeing collisions, predict the likelihood of a household appliance breaking down or provide a number of services that are personalised to your individual taste. In fact, much of this is already technologically possible and it only remains to be seen to what extent consumers will embrace it. As more data becomes available and processing power increases, it’s likely that more and more of the future will become easier to predict. ... it’s likely that more and more of the future will become easier to predict ### Druva to Increase Customer Cloud Choice and Flexibility via Microsoft Azure Strategic Relationship Extends Druva’s Public Cloud Portfolio for Data Governance and Availability Druva, the leader in converged data protection, today announced an expanded public cloud presence with the addition of a new strategic alliance with Microsoft Azure.
Teaming with one of the world’s largest software providers, the company will extend its Druva cloud solutions to Microsoft’s public cloud and infrastructure platform. This new relationship provides Druva customers with greater choice for global storage options to better meet their data storage, privacy and security needs, and also offers choice regarding their preferred infrastructure vendor. The ever-growing number of compliance and legal requirements is forcing companies to retain ever more data, as well as align with regional data privacy regulations. This, coupled with an explosion in data growth, means that for organizations trying to keep pace, expanding traditional on-premise storage infrastructure is rapidly becoming unsustainable. The public cloud has quickly become the more viable and secure option for this data and its governance and archival needs. With Druva’s new Azure relationship, customers now have a wider set of choices to best meet their data growth, security and regionally specific regulation requirements. “Druva has always taken a proactive approach to help our customers address their data availability and governance needs – and that involves offering customers strong data protection and security in the cloud for their sensitive workloads,” said Jaspreet Singh, CEO, Druva. “Our work with Microsoft Corp. underscores our commitment to broadening our cloud-related options and giving customers additional choice for deploying in the cloud securely and conveniently. Druva has quickly grown to become the de facto standard for data protection workloads in the public cloud.” Azure will extend the data storage footprint of Druva inSync, the #1 analyst-rated endpoint and cloud service data protection solution that integrates secure, scalable, high-performance backup, file sync across all user data, remote file access, data loss prevention, eDiscovery and automated compliance monitoring. This new relationship provides increased flexibility of inSync deployments, new go-to-market channels and offers more regions in which to store sensitive customer data. “The Microsoft Azure Marketplace delivers direct access to the cloud-ready applications and services customers are asking for,” said Steve Guggenheimer, Corporate Vice President and Chief Evangelist, Microsoft. “Druva built natively to the public cloud to take advantage of its elasticity, global presence and security to handle petabytes of customer data efficiently, which are also foundational elements of our Azure offering. Our mutual customers will reap the benefits of our joint efforts with cloud scalability and flexibility, always-on reliability, and international compliance support.” [quote_box_center] Additional benefits of Druva availability on Azure include: Security: Azure meets a broad set of international and industry-specific compliance standards, such as ISO 27001, HIPAA, FedRAMP, SOC 1 and SOC 2, as well as country-specific standards including Australia IRAP, UK G-Cloud, and Singapore MTCS. Microsoft was also the first to adopt the uniform international code of practice for cloud privacy, ISO/IEC 27018, which governs the processing of personal information by cloud service providers. Global Availability: Broad international Microsoft datacenter locations provide 21 storage regions around the globe, including Canada and China -- enabling Druva customers to meet data residency needs posed by evolving regional data privacy regulations.
Microsoft Customer Advantage: Enterprise customers who have standardized on the Microsoft platform can utilize their contract license credits towards their Druva purchase. [/quote_box_center] ### The Cloud Doesn’t Have to Be a Healthcare Headache Cloud technology arrived in the healthcare industry with a handful of clear benefits. Not only has it enabled CIOs to save money by decreasing the need for more infrastructure, it has also proved to be a low-cost solution that helps clinicians provide easier and better care for patients. In fact, recent research predicts that global healthcare organisations will increase their spending on cloud computing services by 20% a year until 2020, when the value of the investments will be more than £8.3bn. [easy-tweet tweet="Global healthcare organisations predicted to increase their spending on cloud by 20% a year until 2020" user="EXTR_UKI" usehashtags="no"] Back in 2013, UK health secretary Jeremy Hunt challenged the NHS to be entirely paperless by 2018 in a bid to “save billions, improve services and help meet the challenges of an ageing population”. One element of this plan was to make everyone able to access their own health records online by March 2015. Whilst we have seen some UK hospitals emerge as innovators and leaders in this initiative, there has also been a high level of reluctance from healthcare organisations to move entirely online given the risks involved in cloud migration. [caption id="attachment_20951" align="alignright" width="300"] IoT and your blood pressure woes[/caption] Despite the wealth of benefits cloud technology can offer healthcare professionals and their patients, the challenges involved have proven to be a stumbling block to moving forward with the NHS’s plans to go paperless. Here’s a look at some of the specific challenges healthcare organizations are facing when adopting cloud solutions and, more importantly, how these hurdles can be overcome: Compliance: With an online, connected healthcare system, now more than ever it is crucial clinicians work with vendors who understand and adhere to the 1998 Data Protection Act as well as the more specific healthcare acts such as the NHS Act 2006 and the Health and Social Care Act 2012. Failure to follow these could lead not only to embarrassing hacks, but also to potentially costly legal proceedings. Privacy & Security: As the health sector outstripped all other industries in last year’s Information Commissioner’s Office report with 747 individual data breaches, there is clearly an issue with privacy and security. To try and lower the number of incidents, clinicians are turning more and more to network providers to secure data. With the internet providing a quick and easy distribution channel for information, even the tiniest of data breaches can suddenly become national headline stories. Evolving Role of IT: With the explosion of devices and technology in healthcare, there has also been an increase in patient data. In turn, this has placed heightened demands on IT to provide better-performing cloud solutions in a more agile environment.
Today’s healthcare IT professionals are grappling with a shift in their responsibility from “keepers of the infrastructure” to “managers of application service delivery.” clinicians are turning more and more to network providers to secure data While there are undoubtedly challenges with cloud-based technology, it also presents an immense opportunity for hospitals to save money, decrease their infrastructure and provide better patient experiences. However, doing so requires organisations to have complete visibility and a clear understanding of what is taking place on their networks. Network-powered application analytics solutions allow clinicians to create policies based on network data. More importantly, these solutions can give clinicians and their teams a bird’s-eye view of their network, ensuring they have visibility into all network applications and activity. This means they can see who’s utilizing the network and what information they are accessing, so that patient data is kept private and only shared with authorised viewers. patient data is kept private In addition, network-powered application analytics solutions are simplifying the role of IT. As IT professionals shift into the role of “manager of application service delivery,” new analytics solutions allow them to view and manage all network devices and activity from one centralised location. The benefits of cloud technology for healthcare organizations cannot be overlooked; however, unleashing these benefits means overcoming several hurdles. Working with suppliers who understand these demands makes implementation and deployment a seamless operation. [easy-tweet tweet="The benefits of cloud technology for healthcare organisations cannot be overlooked" user="EXTR_UKI" usehashtags="no"] ### Seeding the Global Public Sector Cloud: Data Classification, Security Frameworks and International Standards All of a sudden, everywhere you look, the cloud is the new normal. Top service providers’ cloud revenues are doubling year on year at the start of what is predicted to be a sustained period of growth in cloud services. As IT workloads have migrated to the cloud, the private sector has led the charge. Governments have been towards the rear, with cloud spend to date generally accounting for less than five percent of a given country’s public sector IT budget. This looks likely to increase quickly as the public sector starts to overcome the blockers to cloud uptake. [easy-tweet tweet="All of a sudden, everywhere you look, the cloud is the new normal." user="KempITLaw" usehashtags="no"] The classic NIST definition of the Cloud specifies Software (SaaS), Platform (PaaS) and Infrastructure (IaaS) as the main Cloud services (see figure 1 below), where each is supplied via network access on a self-service, on-demand, one-to-many, scalable and metered basis, from a private (dedicated), community (group), public (multi-tenant) or hybrid (load balancing) Cloud data centre. The benefits of the Cloud are real and evidenced The benefits of the Cloud are real and evidenced, especially when comparing private and public cloud, where public cloud economies of scale, demand diversification and multi-tenancy are estimated to drive down the costs of an equivalent private cloud by up to ninety percent.
Equally real are the blockers to public sector cloud adoption, where studies consistently show that management of security risk is at the centre of practical, front-line worries about cloud take-up, and that removing those worries will be indispensable to unlocking the potential for growth. Demonstrating effective management of cloud security to and for all stakeholders is therefore central to cloud adoption by the public sector and a key driver of government cloud policy. Figure 1: Software as a Licence to Software as a Service: the Cloud Service Model Continuum A number of governments have been at the forefront of developing an effective approach to cloud security management, especially the UK, which has published a full suite of documentation covering the essentials. The key elements for effective cloud security management have emerged as: a structured and transparent approach to data classification; a transparent and published cloud security framework based on the data classification; and the use of international standards as an effective way to demonstrate compliance with the cloud security framework. Data classification is the real key to unlocking the cloud. This allows organisations to categorise the data they possess by sensitivity and business impact in order to assess risk. The UK has recently moved to a three-tier classification model (OFFICIAL → SECRET → TOP SECRET) and has indicated that the OFFICIAL category ‘covers up to ninety percent of public sector business’ like most policy development, service delivery, legal advice, personal data, contracts, statistics, case files, and administrative data. OFFICIAL data in the UK ‘must be secured against a threat model that is broadly similar to that faced by a large UK private company’ with levels of security controls that ‘are based on good, commercially available products in the same way that the best-run businesses manage their sensitive information’. [easy-tweet tweet="Data classification is the real key to unlocking the cloud" user="KempITLaw" usehashtags="no"] Data classification enables a cloud security framework to be developed and mapped to the different kinds of data. Here, the UK government has published a full set of cloud security principles, guidance and implementation dealing with the range of relevant issues from data in transit protection through to security of supply chain, personnel, service operations and consumer management. These cloud security principles have been taken up by the supplier community, and tier one providers like Amazon and Microsoft have published documentation based on them in order to assist UK public sector customers in making cloud service buying decisions consistent with the mandated requirements. Compliance with the published security framework, in turn based on the data classification, can then be evidenced through procedures designed to assess and certify achievement of the cloud security standards. The UK’s cloud security guidance on standards references ISO 27001 as a standard to assess implementation of its cloud security principles. ISO 27001 sets out control objectives for managing information security, and the controls themselves, against which an organisation can be certified, audited and benchmarked. Organisations can request third-party certification assurance and this certification can then be provided to the organisation’s customers. ISO 27001 certification is generally expected for approved providers of UK G-Cloud services.
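As a purely illustrative sketch of how a classification-driven framework might be expressed in practice, the snippet below maps the three UK-style tiers named above to a set of baseline controls. The tier names come from the article; the specific control values, field names and structure are invented assumptions for illustration, not the published UK guidance.

```python
# Illustrative only: a toy lookup that turns a data classification tier into
# the baseline controls a workload must meet. Values are hypothetical.
from dataclasses import dataclass

@dataclass
class Controls:
    permitted_clouds: tuple      # deployment models treated as acceptable
    encryption_in_transit: bool  # must data be encrypted on the wire?
    required_assurance: str      # certification expected of the provider

POLICY = {
    "OFFICIAL":   Controls(("public", "community", "private"), True, "ISO 27001"),
    "SECRET":     Controls(("community", "private"), True, "ISO 27001 + national accreditation"),
    "TOP SECRET": Controls(("private",), True, "national accreditation"),
}

def controls_for(classification: str) -> Controls:
    """Return the baseline controls for a workload's data tier."""
    return POLICY[classification.upper()]

print(controls_for("official"))
```

The point is simply that once data has been classified, the question of where a workload may run and what assurance a provider must show becomes a lookup rather than a debate.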
This pragmatic but comprehensive combination of data classification, a cloud security framework and the assurance provided by evidenced compliance with generally accepted international standards will go a long way towards unlocking the benefits, removing the blockers and enabling the public sector cloud around the world to achieve its potential. If you would like to see further information on this topic, please see our October 2015 white papers on Seeding the Global Public Sector Cloud, Part I - A Role for International Standards and Part II – The UK’s Approach as Pathfinder for Other Countries. ### Bitcoin set to defy sceptics and become 6th largest global reserve currency by 2030 TOP GLOBAL FINANCIAL INSTITUTIONS WILL INVEST MORE THAN $1BN IN BLOCKCHAIN-RELATED PROJECTS IN THE NEXT 1-2 YEARS [quote_box_center] ⎯ Bitcoin set to defy sceptics and become 6th largest global reserve currency by 2030, according to research ⎯ More than 1 million bitcoin transactions alone are now taking place daily, in excess of 10 times the publicly reported figure ⎯ Survey by Magister Advisors of more than 30 leading bitcoin companies demonstrates growing strategic significance of bitcoin and blockchain ⎯ Leading banks each already have 10-20 blockchain projects underway ⎯ China already has hegemony in bitcoin ‘mining’, the process that validates payments[/quote_box_center] A global survey of financial businesses focused on blockchain and bitcoin technology has found that an estimated $1 billion will be spent by the top 100 financial institutions on blockchain-related projects over the next 24 months. The view of the majority of respondents is also that bitcoin will become the 6th largest global reserve currency within 15 years. In the largest study of its type to date, Magister Advisors, M&A advisors to the technology industry, interviewed senior executives from thirty of the world’s leading bitcoin companies. [easy-tweet tweet="Blockchain is without question the most significant advancement in enterprise IT in a decade" via="no" usehashtags="no"] Jeremy Millar, partner at Magister Advisors, who led the research, said: “Blockchain is without question the most significant advancement in enterprise IT in a decade, on a par with big data and machine learning. What Java is to the Internet, blockchain is to financial services. We have now reached a fork in the road with bitcoin and blockchain. Bitcoin has proven itself as an established currency. Blockchain, more fundamentally, will become the default global standard distributed ledger for financial transactions.” BANKS WILL SPEND $1BN ON BLOCKCHAIN INNOVATION IN THE NEXT TWO YEARS Magister Advisors estimates that the top 100 global financial institutions will invest more than $1bn in bitcoin and blockchain-related projects over the next 12-24 months. The same research has found that leading banks have portfolios of 10-20 bitcoin-related projects underway. The initial use of blockchain is typically not to replace core infrastructure, such as wire transfers, but to complement it, often by storing ‘meta-data’ in areas such as settlement and clearing. But the potential is much greater, given the flexibility and robustness of the technology, ranging from property registries to security infrastructure to direct payments. Magister’s research finds that leading private blockchain companies are already signing seven-figure contracts with these institutions.
A number of projects are ‘near production’, having been proven internally to be sufficiently robust to satisfy production requirements. Jeremy Millar, partner at Magister Advisors who led the research, said: “Banks will initially be unwilling to remove the core infrastructure that handles the process of clearance and settlement but they will increasingly run parallel blockchain processes, evidenced by the spike in investment that our poll has identified. Blockchain is not a flash in the pan. The transparency of its processes and its array of potential applications make it virtually inevitable that it will become the default validation standard.” He added: “Blockchain technology will underpin a growing number of routine transactions globally as trust grows. Our interviews with thirty of the leading bitcoin companies worldwide cement our view that the currency is gaining traction. Growing vendor acceptance and the adoption of bitcoin in developing markets are creating a pincer movement that will lead to widespread business and consumer acceptance and adoption over time.” BITCOIN AND SPECULATION As with many other assets, such as futures and commodities, the primary use of Bitcoin today in developed markets is speculation. Interest in Bitcoin speculation has been enhanced by the lack of volatility and yield in traditional asset classes (much like gold a few years ago), leading to the creation of a number of Bitcoin ‘money market funds’ or ETF equivalents. Magister Advisors estimates that 90% of Bitcoin by value is being held for speculation, not commercial transactions. On the other hand, we estimate that 90% of Bitcoin transactions by volume are in fact commercial transactions, typically in developing economies. Jeremy Millar said: “It is worth noting that futures trading volumes are three times or more the volumes on stock exchanges. Traders are addicted to volatility and we mustn’t underestimate the significance of speculation.” BLOCKCHAIN AND BITCOIN WILL DIVERGE To date, blockchain and bitcoin have captured equal attention but blockchain is set to impact far wider aspects of business and consumer life. The majority of bitcoin transactions are currently taking place in developing economies, reflecting the appeal of the robustness of the technology in economies where an estimated 2 billion adults do not have bank accounts and especially in markets where corruption is endemic in financial services. Jeremy Millar said: “Two million smartphones are being activated per day, so the question arises in developing economies, especially those where corruption is a factor, why open a bank account?” REGULATION BY INNOVATION [easy-tweet tweet="In the developed world, there is a new generation of startups working on ‘cleaning up’ bitcoin."] Companies such as Coinanalysis and Elliptic are focused on tracing the source of Bitcoin transactions for proceeds-of-crime and anti-money-laundering purposes. Indeed, the arrest of DEA agent Carl Force and Secret Service special agent Shaun Bridges in part relied on suspicious activity reports filed by Bitcoin exchanges such as Bitstamp. Jeremy Millar said: “Ironically bitcoin has attracted negative publicity over its short life because attempts to rig it have been flagged by the blockchain technology that underpins it. It’s the inherent ability of the blockchain infrastructure to expose these attempts that has impacted perceptions when in fact it should shore them up. Studies, conversely, show that 80% of UK banknotes feature traces of drugs.
This self-regulating capability in blockchain will lend itself to an array of applications where corruption has hitherto been a problem.” BITCOIN TRANSACTION NUMBERS ARE UNDERREPORTED BY AN ORDER OF MAGNITUDE Reported data from blockchain.info indicates that there are circa 120,000 bitcoin transactions per day, but this under-reports the actual total, according to Magister Advisors’ research. “Reported numbers do not factor in the 300,000 daily ‘off-chain’ transactions by large wallet/vault players such as Coinbase, Circle and Xapo, pushing the daily total number of transactions to around 1 million,” said Jeremy Millar. DEVELOPING MARKETS MAY ACCELERATE GLOBAL ADOPTION Within certain markets, bitcoin as a currency is extremely popular, finding applications where financial institutions have a smaller footprint. In some of the large wallets, for example, 90% of the transactions are occurring in developing markets. There are also a number of significant bitcoin remittance companies such as Abra, Bitpesa and Atlas. Bitcoin for cross-currency payroll is also increasingly popular. Jeremy Millar said: “There are three drivers that are accelerating adoption of bitcoin: the maturity of the technology, the range of potential applications and the move from proof-of-concept to production. The early years of public exposure to bitcoin were confined to stunts, such as acceptance of the currency for space tourism bookings and one-off cashpoints. Now we see vendors like Microsoft and Dell – and mature online consumer finance businesses like PayPal – accepting the currency. It is still, however, effectively hobbyist. No compelling consumer use case exists in developed economies. Consumer and SME transactions in developing markets, on the other hand, will drive commercialisation.” These factors, combined with growing acceptance by, and investment from, the big banks, are accelerating bitcoin’s progression up the credibility curve. In certain markets, bitcoin as a currency is extremely popular. Within some of the large wallets, for example, 90% of the transactions occur in developing markets where an estimated 2 billion adults do not have a conventional bank account. The robustness of blockchain’s security makes it an appealing alternative for those disenfranchised from the traditional banking system. CHINA’S GROWING ROLE One notable additional factor is the growth of China’s involvement in blockchain, the underlying distributed ledger technology. More than half of ‘hashrate distribution’, the process that validates the shared ledger and enables payments, is carried out by Chinese ‘miners’. Commentators identifying this as a risk to the robustness of blockchain are misguided in Magister Advisors’ view. Jeremy Millar said: “Chinese miners are making a reported $150 million a year in aggregate and the technology is designed to expose attempts at malfeasance. Fundamentally, even if there was a desire to do so, there is no incentive to cheat.” ### Data centre security – Do you understand your risk? Let’s assume for a moment that you still manage all or some of your data in-house. By implication that means that somewhere in the building you have a room full of servers that need to be maintained and protected. And as a manager you’ll be aware of the physical risks that threaten the integrity of your data.
These include not only flood, fire and incursions by malicious third parties but also the havoc that can be created by unauthorised members of staff entering the secure area and, accidentally or deliberately, tampering with the equipment. Naturally enough you do your level best to protect your hardware and software from all these threats. Naturally you do your best to protect your hardware and software from threats So now let’s say that you’ve made an important decision to outsource your storage and IT functionality to an external data centre. As with the in-house operation, you’ll want to be absolutely assured that the risks will be effectively addressed. Certainly you will ask questions and your choice of provider depends heavily on the answers. The right questions But will you be asking the right questions? Or, to put it another way, unless you fully understand where the main areas of risk lie, you may not be in a position to assess the security provisions put in place by a potential provider. Risk misconceptions [easy-tweet tweet="As a species we’re not great when it comes to assessing real levels of risk and threat" user="comparethecloud" usehashtags="no"] As a species we’re not always terribly good when it comes to assessing real levels of risk and threat. The classic example is the motorist who drives many thousands of miles a month without a second thought while getting stressed at the (statistically) much safer prospect of catching a flight from London to New York. There are good reasons why the latter is perceived as more dangerous – not least that driving gives us a sense of control while flying puts us in the hands of others and that air accidents tend to be both well publicised and unpleasant. Air travel is, therefore, scarier but actually much less risky. And very often data centre customers will focus on the ‘scary’ headline threats, such as terrorism, theft by organised criminals or a major accident. This leads to common questions such as: What provision have you made to protect against an explosion? What has been done to prevent an attack on the data centre from, say, a gang driving a truck through the wall? What has been done to ensure the centre continues to operate if there is a major incident in the area? All good questions, and your data centre manager should be able to provide the answers. But the truth of the matter is that incidents of this kind are extremely rare. If we take the threat of bomb-blast as an example, there is currently no record of a data centre being attacked by terrorists in this way. Equally the incidence of data centres being affected by attacks on other installations is rare to the point of being negligible. Common threats And in reality the main and common threat to the integrity of data stems from a much more mundane source – namely the member of staff (or perhaps an external party) who gains access to the servers and maliciously or unintentionally causes an outage. This was probably a threat that you were aware of when running an in-house operation, but the expectation is that in an external data centre all staff will be suitably qualified and skilled and those who aren’t will not be given access to key areas. [easy-tweet tweet="It’s vital to ensure that all those with physical access to your servers should be thoroughly vetted and managed"] But the truth is that it’s vital to ensure that all those with physical access to your servers (within the data centre) are thoroughly vetted and managed.
At one level, a negligent or poorly skilled employee can cause an enormous amount of damage. At the malicious end of the spectrum, someone with a grudge or criminal intent could, in extreme circumstances, cripple your operations. So what is to be done? Good housekeeping Well first and foremost it’s important to thoroughly vet your own staff, and particularly those who may be visiting the data centre. Equally important, you should also be vetting anyone within your supply chain who might be given access. Data centre security It’s vital to establish how the outsource provider manages access to your IT hardware within the data centre. How are members of staff authenticated? What measures are in place to prevent an unauthorised person from stealing the identities of others to obtain physical or virtual access? Equally important, if security measures are ostensibly present, are they being actively enforced? For instance, let’s say an authorised person opens a secure door with a pass and is followed through by another party. Clearly the second party has no need to use a pass as the door is already open, but this is a breach of procedure. Will he or she be challenged, or are there electronic measures in place to prevent this kind of “tailgating”? Is the security of a professional standard? The value locked up in data is immeasurable. From client details and e-mail records through to transactional and operational information, data lies at the heart of corporate operations. Those protecting the data should be security professionals and not simply data centre managers with an added security responsibility. The value locked up in data is immeasurable. Outsourcing to a data centre can and should make information more, rather than less, secure. Good data centres have the resources and expertise to ensure its integrity. However, before deciding on a provider it is vital to fully understand the risks and ask appropriate questions. ### The growth hack tips that could help build your startup The headlines are full of startup success stories, of individual entrepreneurs with little more than an idea creating million-dollar businesses from scratch. But for every one of these stories there are countless others of startups that struggled to build on their early momentum. For fledgling businesses looking to improve their growth rate, data is often seen as the way forward, but it’s important to remember that it won’t solve all your problems. What it will do is provide your startup with focus and accountability, providing you know how to use it. [easy-tweet tweet="For fledgling businesses looking to improve their growth rate, #data is often seen as the way forward" user="bsquared90"] Building on an idea Most startup ideas begin as little more than instincts or gut feelings. You think your idea will appeal to millions, but investors are likely to require some harder evidence, which is where data comes in. Testing your idea amongst your target audience and recording feedback before launch is a great way of demonstrating tangible proof that there is a market for your product. However, creating a product or service, especially if your revenue streams are limited, is not easily achieved. [easy-tweet tweet="The MVP approach to #growthhacking explained by @bsquared90" user="comparethecloud" usehashtags="no"] One way that startups overcome this challenge is by taking the “minimum viable product” (MVP) approach.
This means producing something with just the essential features to test whether or not it resonates with your audience. For Dropbox, this was something as simple as a 120-second explanation video. Before they had any users, this simple clip gave potential customers a clear idea of what the product was about and generated the kind of interest that made investors sit up and take notice. The same approach can be used by more established startups that want to launch a new product. Airbnb is rightly held up as one of the most successful startups in recent times, but one of the reasons for their rapid growth is the company’s use of data combined with the MVP approach. Believing that professionally taken photographs would enable hosts to gain more business, Joe Zadeh, product lead at Airbnb, decided to test his theory. He first analysed the number of bookings achieved by listings with professional photos against those without them and then offered professional photography to hosts in order to expand the experiment. He not only found that professionally photographed dwellings got between two and three times as many bookings, but also that many hosts were keen for the site to provide this kind of service. This kind of data has led Airbnb to continue offering its own photography service and vindicated Joe Zadeh’s initial hypothesis. Growth hacking A recent trend amongst startups looking to accelerate is so-called “growth hacking.” This approach uses a combination of data-driven methods and other web-based tactics to help smaller businesses gain a competitive edge. For a digital company this could take the form of in-house testing to find out the most effective way of increasing your userbase. Twitter, for example, discovered that if users selected between five and 10 accounts to follow on the first day of signing up to the site, they would be more likely to continue using it long-term. This led the social network to launch a number of features that make it easier for new users to find relevant accounts. growth hacking focuses on marketing as a means of company growth More often than not, growth hacking focuses on marketing as a means of company growth. Rather than marketing bigger, which may rapidly drain your financial resources, growth hackers aim to market smarter. Incentivising existing customers to invite friends to your service is one such method, as is making it easier to share information about your startup via social media. Using cloud-based analytics software to make your marketing campaigns more targeted is another cost-effective growth hack that early startups can embrace. Creating a successful startup is not an easy process. You may be the only person with faith in your idea and so there are a lot of investors, colleagues and customers that you need to convince. One way of doing this is to use data and online marketing techniques to create a cost-effective marketing strategy that achieves the growth rates your business needs. ### Crowdsourced or broke: community collaboration before the Internet Crowdsourcing has become the ‘buzzword’ of the last few years, with the likes of Airbnb, Uber and BlaBlaCar seemingly taking over the world. Suddenly, everyone wants to contribute to the next innovative idea. Its popularity can be attributed to its ability to feed basic human needs, namely: to be connected with like-minded groups; to feel involved; and to feel valued for our contributions.
[easy-tweet tweet=".@katewright24 looks back at the history of crowdsourcing and how things have changed" via="no" usehashtags="no"] This has been further driven by the injection of digital media and social media platforms - now we don’t have the same barriers to communication. We have the twitterati, and basically anyone under 25 is a screenager. Your fellow ‘crowdsourcers’ can be 1000s of miles away, but still able to collaborate as easily as your next-door neighbour. What some people fail to remember is that crowdsourcing is not a brand new concept and it is not a ‘fad’ made possible with the growth of the internet. Crowdsourcing has been around for decades (even centuries!). So, let’s remind ourselves of some well known and original examples of crowdsourcing: 1. The Sydney Opera House (Obvs.) In the 1950s the New South Wales Government launched a competition to submit designs for the Sydney Opera House. The only guidelines given were that there had to be two performance rooms: one for opera and one for symphony concerts. The competition attracted more than 200 entries from 28 countries (yes - people were totes able to hear about and submit designs WITHOUT the internet!). The Sydney Opera House as we now know it was designed by Danish competition winner Jørn Utson. A brilliant example of people coming together and contributing to an overall project, with a truly iconic outcome. 2. Oxford English Dictionary It may have been Professor James Murray who established the Oxford English Dictionary, but he couldn’t have done it without the support of more than 800 unpaid, but willing, contributors. Over decades, hundreds of people submitted words and definitions, collated together to bring the first edition of the dictionary. Even now, it can be argued that the Oxford English Dictionary continues to be crowdsourced as new words are added annually, influenced by the general public and slang words becoming every day language (although I still do not approve of ‘twerking’ and ‘bromance’ being officially integrated into our vocabulary.…)  4. Car Washes - WOOT! [caption id="attachment_32657" align="aligncenter" width="1081"] Image courtesy of Jessica Simpson[/caption] They may be cliche… they may be somewhat tacky… but the use of bikini-clad women, washing cars to raise money for a cause or campaign is one of the most well-known examples of crowd-funding, and prove to be very effective. However, now we have the internet, maybe it’s time for these ladies to put a jumper and set up a JustGiving page - especially in the chilly UK!! 3. Toyota Logo Few people are aware that the Toyota logo is a product of a logo competition launched back in 1936. The triumphant emblem consisted of three Japanese katakana symbols for ‘Toyoda’ (the original name). This was then slightly modified for the launch of the new name, ‘Toyota’. [easy-tweet tweet="Toyoda - do you know what this means, or what it has to do with #crowdsourcing?" user="katewright24" usehashtags="no"] 5. Walker’s search for new flavours The Walkers' ‘Do Us a Flavour’ campaign took the UK by storm (and not just because of that awesome pun!). Here in the UK, there are few things we love more than a good crisp, evident by the 1.2 million entries sent in. Some of the creative suggestions included Lamb&Mint, Chill&Chocolate and Builder’s Breakfast. The nation completely got behind this campaign and everyone wanted to be involved - whether it was with an entry or casting a vote. 
It was so successful that Walkers have repeated the campaign numerous times since. So, five iconic examples, and all possible without the internet and social media. However, there is no denying that online platforms have supported crowdsourcing campaigns dramatically, with companies and applications such as Kickstarter dedicated solely to crowdsourced products. The number of crowdsourcing portals rose from 53 to 1,096 between 2009 and 2015, showing that crowdsourcing your innovative product or service is now a relatively easy option. Just remember, once you get strangers involved in your project, they will want to see an outcome - so make sure you can deliver! No cyberslacking, or you might end up the unwilling subject of a serious blamestorm. Researching this article, I discovered some real word gems that I 100% approve of being in the dictionary - see how many you can find in this article, and tweet them, and your other favourites, to me at @KateWright24. ### Infostealer APK Posing as Microsoft Word Document Recently, we (Zscaler) came across a piece of Android malware which was neither a porn app nor a battery status app, but was instead designed to look like a Microsoft Word document. This malicious app portrays itself as a document with an icon resembling Microsoft Word. Due to the ubiquitous nature of mobile devices, it’s no wonder that PC-based malware techniques are appearing in mobile domains. In early Windows malware attacks, attackers would often name the malicious files with eye-catching titles and use common icons to entice victims to open the file. We're seeing this same practice used for Android-based malware. Overview: The malware portrays itself as a data file with an icon similar to that used by Microsoft Word documents and is entitled '资料' (Data). It runs with administrative access and hence cannot be easily uninstalled. Once installed, the malware scans the device for SMS messages and other personally identifiable information such as the IMEI number, SIM card number, Device ID, victim's contact information, etc., and sends this to the attacker via email. [caption id="attachment_32676" align="aligncenter" width="225"] Malicious APK posing as Microsoft Word File[/caption] Technical Details: Once the malware is installed, it appears on the Android home screen as shown below: [caption id="attachment_32679" align="aligncenter" width="235"] Microsoft Word Icon[/caption] Initiation: As soon as the victim tries to start the app, it displays a fake error stating "Installation errors, this software is not compatible with the phone" and the icon then disappears from the device screen. [caption id="attachment_32675" align="aligncenter" width="225"] Fake Error Message[/caption] While this error is being displayed, the app executes a few major functions as noted below: Sends SMS messages to a hard-coded number. Starts an Android service, named MyService. Starts an asynchronous thread (SmsTask) which runs in the background. Starts another thread named MailTask, which also operates in the background. Calls phone numbers specified by the attacker. Sends SMS Initially, the malware tries to send the victim's device IMEI code in a message body to a hard-coded number. Assets.getInstallFlag gets the IMEI (or ESN number in the case of CDMA devices) [caption id="attachment_32680" align="aligncenter" width="640"] IMEI code fetching[/caption] And finally sends the message.
[caption id="attachment_32689" align="aligncenter" width="640"] Sending Message[/caption] MyService Service: [caption id="attachment_32686" align="aligncenter" width="640"] Service fetching inbox messages[/caption] The main task performed by MyService is to collect all the SMS messages from inbox of the victim's device.  Once that is done, it stores all the messages in its local logs. SmsTask Thread: Apart from logging SMS messages, MyService was not sending these messages anywhere. This functionality is exhibited in the SmsTask thread. SmsTask will also read the SMS messages and exfiltrates them. [caption id="attachment_32690" align="aligncenter" width="640"] Fetching inbox messages[/caption] Once the messages are collected, the app then sends them to attacker via email. [caption id="attachment_32687" align="aligncenter" width="640"] Calling SendMail method[/caption] A username and password for an email id were found hard-coded in the malware. [caption id="attachment_32688" align="aligncenter" width="640"] SendMail functionality[/caption] MailTask Thread: MailTask's main role is to collect contact information from the victim's device and send it to attacker via the same functionality explained in case of SmsTask. [caption id="attachment_32685" align="aligncenter" width="640"] SmsTask Thread[/caption] Sending Mail: The app sets up an SMTP host on port 465 for sending email. [caption id="attachment_32683" align="aligncenter" width="640"] Sending Mail[/caption] localMimeMessage contains the necessary data to be sent to attacker via email. In the case of SmsTask as mentioned above, localMimeMessage's body contains an SMS message list and in the MailTask instance, it contains contact numbers from victim's device. Calling Functionality: The malware was also designed to call phone numbers provided by an attacker via SMS. It has a broadcast receiver registered to trigger whenever a new SMS is delivered. The malware reads the SMS received from the attacker and acts accordingly. In one instance, malware was trying to fetch phone numbers received in SMS messages and then calling them, as shown in screenshot below: [caption id="attachment_32677" align="aligncenter" width="640"] Broadcast Receiver[/caption] We were able to confirm that the campaign was initiated on October 10, 2015 and almost 300+ users had fallen prey to this malware. The attacker was able to successfully retrieve message details and contact lists from the infected users. The following screenshots shows the list of emails received by the attacker: [caption id="attachment_32684" align="aligncenter" width="640"] Inbox[/caption] Further, each email titled "Message list" consists of full SMS conversations from the victims phones and email with subject "Contact list" contains a list of all the phone numbers fetched from victims contact diaries. [caption id="attachment_32695" align="aligncenter" width="640"] Messages from victim's device[/caption] [caption id="attachment_32696" align="aligncenter" width="640"] Contacts from victim's phone[/caption] There were 300+ such emails found in the C&C admin panel. Such malware creates a significant privacy & financial risk as it obtains contact information and private SMS messages. Prevention: It is recommended that users download apps only from official Android stores like the Google Play store. If you are infected with malware, you can follow the steps mentioned here for removing the malicious app. ### When does big data start becoming creepy? 
Big Data promises to bring a huge number of benefits to society. The ability to analyse the wealth of information created each day could enable customers to benefit from more relevant services and businesses to identify new revenue streams. However, there is already some growing resistance to how big data is being collected and how it is being used. Most individuals understand, and accept, that the amount of data being created is increasing rapidly – 90 per cent of the data available today has been created in the last two years – but the type of data being used without permission is sometimes less easy to swallow. Consumers are beginning to question whether the collection and storage of certain kinds of data actually offers them any benefits and if they have any claim over their own data at all. [easy-tweet tweet="There is already resistance to how #bigdata is being collected and how it is being used" user="bsquared90" usehashtags="no"] When it comes to specific industries, individuals are usually receptive to personal data use where a clear utility is easily demonstrated. In the education sector, for example, the UK government has enabled academic researchers to access data under secure conditions to help identify and plug some of the country’s skill gaps. There are now calls to extend this and, as long as privacy and anonymity are maintained, it could provide a major talent boost across several industries. Earthquake prediction, healthcare research and many other examples of big data projects are also easy to justify. there is a fine line between personal data use that is seen as relevant and that which is viewed as 'creepy' But sometimes big data is less obviously useful and certainly less transparent. The retail industry, for example, is beginning to find that there is a fine line between personal data use that is seen as relevant and that which is viewed as 'creepy'. A recent study by RichRelevance found that 72 per cent of UK shoppers found personalised product recommendations based on purchasing habits useful, but other forms of data use were less welcome. Seven in 10 found facial recognition technology 'creepy', while 76 per cent felt that being greeted by name as they entered the store was unsettling. A similar report by CSC also found that using smartphone data and other forms of digital data to monitor customers was seen as intrusive: 73 per cent of respondents were either not comfortable at all, or not particularly comfortable, with in-store behaviour tracking, and 71 per cent felt similarly about recording information such as gender and age whilst shopping. Clearly, when customer analytics is used to provide a customer benefit then it is less likely to be questioned, but even in this instance there are more nuances to consider. [easy-tweet tweet="There is a generational gap emerging when it comes to the collection of personal data" user="bsquared90" usehashtags="no" hashtags="bigdata"] There is a generational gap emerging when it comes to the collection of personal data, which emphasises that education and, indeed, time is needed before any new technology is fully accepted. The Royal Statistical Society discovered that younger people are more likely to trust organisations to use their data responsibly than older individuals. When surveyed, those aged 16-24 gave Internet companies a mean trust in data score of 4.5, compared to a score of just 3.4 for those aged 55 to 75.
The discrepancy may be partially the result of digital natives being more accepting of new technologies, but it is also worth noting that trust is not particularly high across either age range. Transparency is often missing when it comes to big data collection, and it is this aspect that is likely to make both young and old uncomfortable. This isn't only the case when it comes to covert, mass data collection programmes run by national governments. Taking a look at the list of permissions that you've agreed to when downloading an app highlights that we often willingly, but unthinkingly, allow huge quantities of data to be shared every second of every day.

What must change is how organisations communicate this data use to the public

It is important not to descend into scaremongering. Big data and analytics are not just about delivering targeted ads that bolster corporate wallets; they could drive medical efficiencies that save lives, or help urban planners build better infrastructure that benefits millions each day. Data is here to stay and it is only going to get bigger, supplemented by IoT devices and wearable gadgets. What must change is how organisations communicate this data use to the public. What is it going to be used for, how is it collected and what are the advantages? Only then will consumers view big data in terms of its practical benefits rather than its privacy concerns.

### Test and adapt quickly: how Expedia develops for the cloud

As a global travel brand, Expedia operates localised sites in over 31 countries with over 40 million monthly visitors, where customers can be found logging in and checking details, checking in for flights or booking travel round the clock - and when you're rushing to the airport or desperate to bag that bargain holiday, you simply don't want to be kept waiting. Being available can make or ruin someone's long-anticipated holiday, so we need to be available 24/7 to address those needs. Back in the 90s we broke the mould in the travel market, blowing it wide open to a far wider range of choices for customers. What this means today is that we process an enormous range of potential travel combinations every second.

[easy-tweet tweet="Live streaming analytics and cloud-based algorithms are important to @expedia" user="comparethecloud" usehashtags="no"]

To ensure continuous development, 24-hour uptime and support for multiple languages, a global workforce spanning multiple time zones is a must, and developer teams must have the ability to act as one. This is why tools such as live streaming analytics and cloud-based algorithms are important to us: we need near real-time decision-making, and the cloud allows us to serve our customers globally with fast-scaling capacity.

using the cloud to store queueing patterns and essential testing functionality becomes critically important

At Expedia, developer teams work on everything from fast-moving travel requests and agile development, to improving Expedia's mobile app and mobile UX, to on-the-job analytics for developing the "tech of tomorrow", to platform-related improvements touching most engineering teams' core components, to algorithms that help our online marketing activities. With developers based everywhere from the USA to Australia, and everywhere in between, using the cloud to store queueing patterns and essential testing functionality becomes critically important.
Using the cloud, each team can work independently, whilst still collaborating as a larger team. Engineering and testing/hosting in the cloud is crucial to enabling 24/7 uptime and availability across all of our various systems. A team based in GMT can pick something up, or escalate it as they finish for the day, for a team in an AST time zone to deliver on. We've been in the travel technology business for nearly twenty years, but the last half decade has seen an unparalleled level of innovation. Global travellers are becoming savvier with each passing year and, to continue to stay ahead in the market and deliver on their evolving expectations, our teams are constantly tweaking and adjusting the data we make available to our travel customers in a way that is both reliable and provides what they need, whilst also optimising our business-critical operations and stack architecture on the back end.

Back in the 90s the flexibility and choice revolutionised the travel industry

Back in the 90s the flexibility and choice revolutionised the travel industry. Today's customers expect to navigate through these choices in seconds on any device, and predictive and streaming analytics are crucial to helping customers do so. Cloud developers are increasingly becoming heroes in today's travel technology workplace. Thanks to cloud hosting and big data capabilities, as well as predictive and streaming analytics, developers across the Expedia ecosystem have the ability to do things today that they could not do only five or so years ago.

[easy-tweet tweet="Cloud developers are increasingly becoming heroes in today's travel technology workplace" user="expedia" usehashtags="no"]

In order to maintain our global market leadership, a commitment to continual innovation is essential. Our developers' main challenge now rests on the ability to test and learn faster than ever before – and that can mean failing fast in order to learn - whilst at the same time discovering the best possible consumer products they can build. Our developers are encouraged to try things at a rapid pace: to try, test and test again, and, if a test is not successful, to learn the essential lessons of what failed fast, and then move items along in the queue to the next task. We strive not to control or restrain our engineers - but rather to focus on fast recovery and information sharing with our global teams in diverse locations. It's also vitally important that our teams can share best practice learnings across platforms. So in addition to testing and learning with existing platforms, we encourage our teams to test new tools. The majority of our development stack teams work in Java, but our developers ultimately choose the languages that fit and also the manner in which they wish to deploy; then it's a case of making sense of the scale of the development that needs to occur. Often we use a combination of tools for our Big Data and predictive analytics projects - whether it is Cloud/AWS, Java/Scala, PHP, Python, Heroku or internal cloud infrastructure - and we aim to develop an improved experience for what our customer sees when they reach our sites.

[easy-tweet tweet="We strive not to control or restrain our engineers says @Expedia" user="comparethecloud" usehashtags="no"]

Whether it's the myriad of travel routes someone can take to get to a destination, or the number of popular queries a service receives, we try to bring all data into product development, and aim to repeatedly test features before we roll them out across the business.
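The article does not spell out the mechanics behind this kind of repeat testing, but a common pattern for live tests of this sort is deterministic bucketing: hash a stable visitor identifier together with an experiment name so that the same traveller always lands in the same variant, and separate experiments stay independent. Below is a minimal, generic sketch in Python - not Expedia's actual system, and the experiment and variant names are hypothetical:

```python
import hashlib

def assign_variant(visitor_id: str, experiment: str,
                   variants=("control", "new_search_ui")) -> str:
    """Deterministically assign a visitor to a test variant.

    Hashing the visitor ID together with the experiment name means the same
    visitor always sees the same variant, and different experiments bucket
    independently of one another.
    """
    digest = hashlib.sha256(f"{experiment}:{visitor_id}".encode()).hexdigest()
    bucket = int(digest, 16) % len(variants)
    return variants[bucket]

# Example: decide which experience to serve, then log the variant alongside
# the outcome so results can be compared once enough traffic has flowed through.
variant = assign_variant("visitor-12345", "flight-search-ranking-v2")
print(variant)
```

Because the assignment is a pure function of the identifiers, no per-visitor state needs to be stored to keep the experience consistent across visits.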
But it’s accomplished in a way that consistently keeps the goalposts clear: we aim to develop in a truly traveller-assisting way.  Just as our customer’s feedback helps us to improve our services for future travel needs, we often live-test on site so that we don’t have to debate what customers want, but rather we can test live and build on data rather than speculation. The scale of the opportunity, and what we aim to achieve is great: often the traveller accessing our service does not even realise they are subject to a live test, or that they are helping to inform how often someone else can choose the London - New York route at 14:57 on a Tuesday. The fact is we aim to constantly, and consistently, improve on valuable travel experiences our customers have in a real-time way. Given the importance of data and accessibility to that data, we imagine a hybrid approach of in-house and cloud platform services will remain one of the most effective ways we can do this successfully – particularly for an industry with as much decision-making and emotional needs as travel. So no matter if we are planning for executive travel needs or new millennials who are planning their own round-the-world trips, we aim to make our service delivery one that provides only the best traveller options available through current technology available to us. ### IBMz - Tweetchat - 3rd Nov - Make #HybridCloud Your Own Cloud Join the TweetChat! Use the Hashtag #Hybridcloud or #IBMz Join #IBMz & Compare the Cloud for our TweetChat on IBM zSystems, Hybrid Cloud & Making it your own. From 8pm GMT (3pm EST, 12pm PST) on November 3rd 2015. Use the window below or go to https://www.crowdchat.net/HybridCloud to log in using your Twitter account. ### IT Complexity is Stifling UK Innovation Research from Sungard Availability Services® finds that despite being considered a necessity in remaining competitive, complex ‘Hybrid IT’ systems are causing numerous problems for organisations Research from Sungard Availability Services® (Sungard AS), a leading provider of information availability through managed IT, cloud and recovery services, has revealed that over half of UK organisations believe that the complexity of their IT estate is hindering their ability to innovate. The research questioned 150 senior IT decision makers in UK organisations with more than 500 employees; with an average IT spend of around £36m per year.  The rise of cloud computing has placed IT into an era of transition, with organisations looking to embrace cloud and move away from the traditional approach. Many now find themselves in a state of Hybrid IT, running their business across a number of different IT platforms – whether that is private or public cloud, on premise servers, or data centre services. Unsurprisingly, compared to an approach in which infrastructure runs on a single platform, today’s so-called Hybrid IT estate is becoming increasingly complex, and defined as such by 100 per cent of UK respondents. However, despite this complexity, hybrid IT is viewed positively by 92 per cent of organisations and has been identified as a critical component of their success, with 77 per cent of organisations stating that it was a necessary part of staying competitive within their industry. In fact, three quarters of respondents (75 per cent) claimed that the move to a Hybrid IT estate was a strategic choice, with 51 per cent citing it specifically as their stepping stone to cloud. 
Those who have adopted a Hybrid IT approach have experienced a number of rewards – with over half (53 per cent) pointing to an increase in business agility, while improvements in customer service and the speed of product development have been noted by 39 and 37 per cent respectively.

New IT, Old Challenges

However, the research also revealed a darker side of Hybrid IT. As increasing numbers of businesses turn to this approach, the perennial issue of IT complexity is once again rearing its head; nearly half (47 per cent) of the UK IT decision makers rate their current IT estate as either 'very' or 'extremely' complex. Furthermore, 69 per cent of organisations say they are now running more complex IT systems than before. This complexity is adding significant cost to the running of IT estates, with nearly a third of UK organisations (31 per cent) having seen an increase in operating costs thanks to Hybrid IT, adding an average of £251,868 every year. Most worryingly, over half (53 per cent) of IT decision makers claimed that the complexity of their Hybrid IT estate is hindering innovation in their organisation. Add to this the fact that half of organisations (50 per cent) claim they do not have the skill sets needed to manage a complex Hybrid IT environment, and we are left with a real cause for concern. The research also found that matters of IT security were considered the biggest concern, with 38 per cent of organisations lacking the necessary skills to deal with security issues. Integration and interoperability were also crucial concerns: 27 per cent of organisations felt they struggled to integrate private cloud environments into their IT estate. Similarly, 22 per cent of respondents admit to difficulties in managing different IT systems across separate business departments.

[quote_box_center] Commenting on the findings, Keith Tilley, Executive Vice President, Global Sales & Customer Services Management at Sungard Availability Services, said: "Cloud computing may seem to be the natural way forward for enterprise IT but it is not simply a case of turning off legacy systems and pushing old processes into new environments. Most organisations in the UK have legacy systems - many running some of their most critical applications - and as such, Hybrid IT is a necessary transition in the adoption of cloud services. It is very promising that organisations recognise the strategic value of Hybrid IT, but it is equally disturbing that half our respondents feel unable to cope with the level of complexity it has added to their business." "Hybrid IT is a critical part of running a modern IT environment. However, like any IT strategy, it requires a well-considered and comprehensive roadmap. Building a Hybrid IT environment with ad hoc purchases and trying to keep numerous disparate applications integrated in a single system is a recipe for disaster. Hybrid IT might be, for many businesses, a stepping stone towards a cloud-first policy but a failure to invest in the right applications now will lead to significant issues in the future." [/quote_box_center]

The full report is available to download here.

### ePages launches epages.cloud partner portal

As a .cloud Pioneer, ePages leverages one of the world's first .cloud web identities

ePages, a leading international provider of online shop software, today launched a new partner portal – www.epages.cloud – as a new primary touchpoint and content hub for its global network of resellers.
As one of the very first companies to pioneer a .cloud domain name, ePages makes proud reference to the role that cloud technology plays in enabling powerful e-commerce software for the SMB market. The epages.cloud partner portal is a resource designed to provide guidance and inspiration to new and existing partners on the topic of enabling and selling value-added e-commerce solutions. ePages provides a range of materials that showcase its value proposition and role within the global SaaS ecosystem. Visitors to the website can learn how easily and efficiently they can add e-commerce to their existing reseller offerings. ePages' sales partners are leading hosting providers as well as ERP, logistics, telecommunications, and online listings companies. In developing its award-winning software, ePages collaborates with more than 80 technology partners such as online marketplaces, comparison shopping websites, payment providers and trust seal vendors.

Wilfried Beeck, CEO of ePages, comments, "The epages.cloud website is an ideal resource for any provider, reseller or developer wishing to benefit from the selling of e-commerce to the SMB and web professional markets. With more than 20 years in the commerce tech sector, we are a trusted expert to the service provider space". Integral to ePages' identity and value to the industry are its roots as an established cloud-based software provider. Beeck continues, "ePages is a fitting pioneer for the .cloud domain. It comprises the perfect online identity to signify our mission to deliver best-in-breed e-commerce cloud services. ePages software enables online retailers to fully benefit from the power and efficiencies of the cloud every day".

[quote_box_center]Francesco Cetraro, Head of Registry Operations at .cloud, comments: "We are proud that ePages is among the first Pioneers to communicate their proposition with a .cloud web identity. Trusted by over 140,000 online merchants worldwide, the company is a fine exemplar for how cloud-based technology can deliver excellent online services. epages.cloud is a great way for the brand to position itself to the service provider space, and we look forward to sharing in their future successes".[/quote_box_center]

In the upcoming months, the epages.cloud website will grow with more content from existing partners and suppliers, and outline ePages' 2016 plans for industry events and community initiatives. As ePages' own international footprint continues to grow, the portal content will likely reflect the increasingly global nature of the SaaS sector.

### Future of Cloud: Part Two

Overcast with Sunny Spells and Storm Clouds to Follow

Following yesterday's blog seeking to make a few pointed predictions, I thought I would add some thoughts on what impact this will all have for the future, with a short, medium and long term outlook for private cloud and OpenStack:

Short term: people will struggle to make a profit from OpenStack

With Liberty (the 12th release) OpenStack is starting to become enterprise ready, but it is still behind AWS on the maturity curve, needs a few more fixes, and the skills are lagging the technology – meaning that until the skills catch up and the skills gap closes a little, firms will struggle to make a profit from OpenStack.
[easy-tweet tweet="With Liberty #OpenStack is starting to become enterprise ready" user="billmew @comparethecloud" usehashtags="no"] Admittedly there are a few pioneering client engagements (many of which we heard about at the OpenStack Summit in Tokyo), but most players are still working on a number of PoCs (Proof of Concepts) and are yet to really industrialise their offerings and really reap the rewards. Medium term: there could be a fortune to be made moving clients to cloud The gold rush is almost upon us and many firms could strike it rich (at least for a short while). It is all about the applications. These were almost all written to run on Oracle or NoSQL databases. Once a few innovators figure out how to engineer a move from Oracle or NoSQL databases to containers, the flood gates will open and a lot of money will be made in the ensuing migration phase. Amazon AWS already had Oracle clearly in its cross hairs with its recent launch at re:Invent of tools to simplify data and database migration to AWS. Others will improve on this and enable migration directly to containers - which can then run on any cloud platform. This gold rush will eventually come to an end however – these applications can only be migrated once. It is a bit like VMware that made a fortune helping firms virtualise. It is something that you only do once. At the time VMware could charge a real premium for it and make a fortune, but eventually other cheaper options emerged (like KVM) and the market became saturated. VMware then struggled to add further value (and Michael Dell may find it hard to use VMware to help pay off the debts he incurred buying EMC). Interestingly mainframe applications may be among the ones last to shift, meaning that, in an ultimate irony, this legacy particular platform may outlast the Unix/RDBMS rivals that tried to replace it. There is a great deal of IT budget that has been consistently drained in the form of Oracle licence renewals that will then become free for reinvestment in a wave of possible innovation as firms leverage their freedom from the constraints of many of their legacy data stores and seek to capitalise on the potential benefits of containerised environments. [easy-tweet tweet="Sometimes the rise of containers is portrayed as competitive with #OpenStack" user="billmew" usehashtags="no"] During this gold rush phase proximity will be important, and firms will want to locate their compute power near their big data stores. This will be a real opportunity for private cloud implementations – a brief period in which the previously mentioned private cloud/OpenStack premium could well be justified. Sometimes the rise of containers is portrayed as competitive with OpenStack, but in the medium term at least OpenStack and containers will co-exist. Long term: OpenStack risks becoming a backwater The whole argument for private clouds is that no matter how compelling the public cloud services are, there will always be the following compelling drivers for private cloud: Financial advantages (in certain circumstances) – although the private cloud/OpenStack premium will become increasingly hard to sustain The need for technical control over one’s compute environment – although increasing usability and flexibility as well as innovations such as containers are making this less relevant. 
- Reasons of proximity (for instance, as mentioned above, putting the compute power near big data stores) – although once the gold rush is over and applications are all containerised, proximity will no longer be as much of an issue.
- Reasons of protection, risk aversion, compliance and regulation – although further innovations in security and other areas by public cloud vendors and their ecosystem partners are addressing these concerns.

The majority of compute cycles already run on public rather than private clouds (across SaaS, PaaS and IaaS). The share of compute cycles on public cloud will increase, meaning that the percentage of compute cycles on private cloud will consequently decrease. Add to this the unavoidable private cloud/OpenStack premium and the increasing relative irrelevance of at least some of the major corporate sponsors behind OpenStack, and it is not hard to see private cloud and OpenStack becoming a relative backwater – possibly even before the technology reaches full maturity.

So where does that leave us all...

This leaves little in the way of a compelling proposition for private cloud at all in the long term. There will be a gold rush and a brief private cloud bonanza – as the industry turns on the likes of Oracle, seeking to break down the barriers of the data store enclosures and free the data – but it will be short-lived.

There will be a gold rush and a brief private cloud bonanza

The challenge for enterprise tech vendors will be in remaining relevant. Will clients continue to pay HP, Cisco, Oracle or indeed anyone else a private cloud premium? Can they continue to provide the same operational and economic flexibility as public cloud, with the control and autonomy that comes with owning your own environment – while reducing the private cloud premium to an acceptable minimum? The challenge for the predicted 'handful of players [that] dominate public cloud with their own proprietary stacks' will be one of continuing to innovate, taking advantage of economies of scale and fast cycle times in deploying new functionality, and staying ahead of what can be achieved just as easily in software. In the future it won't be all about public and private clouds any more, but about local and global ones. Local clouds will be as stipulated by national governments (such as the G-cloud in the UK) and global ones will be provided by the usual suspects: AWS, Google, Azure and maybe also IBM (which is keeping pace through its SoftLayer and BlueBox acquisitions), with maybe Alibaba's Aliyun as well.

The future will be all about software

The future will also be all about software, as everything is becoming software defined. So let's see how things unfold over time. Please feel free to come back and challenge us on these predictions. We love nothing more than a good debate!

### HP Helion OpenStack 2.0 delivers a production ready, enterprise grade cloud platform

New lifecycle management and security enhancements help organisations implement and operate OpenStack technology

Today at OpenStack Summit Tokyo 2015, HP announced the availability of HP Helion OpenStack® 2.0, a production ready, open source based cloud platform designed to meet enterprise requirements. As organisations strive for the right mix of traditional IT and private cloud technology to run their mission-critical applications and protect sensitive data, many are turning to the flexibility and economics of the OpenStack® project.
[easy-tweet tweet="HP Helion OpenStack 2.0 offers an enterprise grade cloud platform" user="comparethecloud" hashtags="cloudnews"] HP Helion OpenStack 2.0 offers an enterprise grade cloud platform, adding new features to address organizations’ lifecycle management and security challenges, including: Easy provisioning of new infrastructure and the ability to repurpose existing infrastructure to meet scalability needs without impacting availability Rolling upgrades which facilitate entire cloud environment software upgrades without requiring planned or unplanned downtime Continuous patch management allowing security patches and updates without application interruption Easy to use administrator interface, centralized logging and monitoring at scale across a cloud environment Network configuration flexibility to enable connectivity with existing IT environments Strict OpenStack API adherence to enable cross-cloud compatibility and ability to leverage the upstream ecosystem of third party plug-ins HP Helion OpenStack 2.0 also enables customers to create and manage software defined networks (SDN) in a distributed, multi-datacenter environment through integration with HP Distributed Cloud Networking (DCN) and Nuage Networks Virtualized Services Platform. This removes the boundaries of traditional networking, unlocking the full automation and agility needed for hybrid cloud. “Enterprises want to benefit from the powerful capabilities of OpenStack technology, but they must have the enterprise grade capabilities required to support their businesses,” said Bill Hilf, senior vice president and GM, HP Cloud. “The configuration, security and scalability advances in HP Helion OpenStack 2.0 enable organisations to deploy OpenStack technology into production with the confidence that they are backed by the experience and support of a trusted end-to-end technology partner.”  Unmatched OpenStack technology expertise To help customers implement, their cloud environments, HP Helion Professional Services can provide a team of experienced HP architects and cloud technologists to help customers determine the right cloud strategy for them.  HP Helion Professional Services brings a unique perspective to the customer, gained from delivering thousands of lines of code to the OpenStack community, as well as hands-on customer deployments of large scale OpenStack clouds. HP Helion Professional Services has capabilities that span the entire cloud lifecycle, from advice and strategy, to transformation, implementation and management.  In addition, the HP Helion Ready Program connects customers with an ecosystem of HP Helion-certified solution providers and hardware vendors whose software solutions and hardware products have been tested and certified on HP Helion OpenStack, enabling customers to confidently deploy HP Helion OpenStack in heterogeneous environments. HP Helion OpenStack 2.0 is based on OpenStack Kilo. OpenStack Liberty was released on October 15. HP continues to be a leading contributor to the OpenStack community as the leading engineering contributor to OpenStack Liberty in terms of the number of commits, reviews, lines of code, contributing employees, Project Team Leads (PTL) and members of the Technical Committee. In all, HP had 210 employees contribute code to Liberty. HP is a Platinum Founding member of the OpenStack Foundation and a key contributor to multiple OpenStack projects, including funding, code, reviews, testing and training. 
HP currently holds an OpenStack Board of Directors position, eight PTL positions and three Technical Committee membership positions.

### Net Neutrality #CloudInfluence Special Report for October

Rule of Law: From Privacy to Net Neutrality – how perspectives vary between the EU and US

The Internet and the cloud both evolved organically, but as their impact and reach have increased, so have the efforts to regulate them. This has exposed a difference in regulatory perspective between the EU and US on a range of issues.

[easy-tweet tweet="First there was the right to be forgotten, then there was Safe Harbour" user="billmew @comparethecloud" usehashtags="no"]

First there was the 'right to be forgotten', in which the EU imposed additional privacy regulations on Google, and then there was the recent EU 'Safe Harbour' ruling on transatlantic data sharing and privacy. In both these instances it was EU regulators breaking new regulatory ground, but on the next big issue on the horizon, everyone is looking at the Federal Communications Commission (FCC) in the US. Previously, firms like Netflix had begun paying Internet Service Providers (ISPs) like Comcast to ensure that Netflix customers would continue to enjoy streaming video over the service without having their data speed slowed. Net neutrality campaigners feared that blocking, throttling, or pay-for-priority fast lanes would create a tiered or 'two-speed' internet. [caption id="attachment_32076" align="alignright" width="300"] Check out September's #CloudInfluence Rankings[/caption] At the end of February, the FCC approved a 313-page order that would allow it to treat internet access like a public utility. The new net neutrality ruling did away with such paid prioritisation and made it illegal for ISPs to slow down internet speeds based on the site a customer is using, or to charge internet companies to prioritise traffic that comes to them.

[easy-tweet tweet="The FCC approved a 313-page order that would allow it to treat internet access like a public utility" via="no" usehashtags="no"]

The impact of such regulations isn't always predictable. Supposedly it meant Internet-based services, particularly streaming services like Amazon Instant Video and Netflix that use huge amounts of data, would be spared prioritisation charges and therefore charge consumers less. However, Netflix has increased prices, and there is a fear that these streaming services will monopolise top content (such as sports) and charge a premium for it, just as cable and satellite firms have done in the past. The fight for an open Internet is far from over, however. Cable and telecommunication companies say the order was an overreach by the FCC and that such rules should be created by Congress.

The fight for an open Internet is far from over

By mid-April, a handful of lawsuits had been filed against the FCC, seeking to overturn the net neutrality rules. The Cellular Telecommunications Industry Association (CTIA) filed one such suit; so did the National Cable and Telecommunications Association (NCTA), US Telecom and the American Cable Association. Chief among the objections of the United Telecom Association (UTA) is that its members won't invest as heavily in broadband infrastructure if they are forced to abide by the new regulations, contravening the FCC's mandate to encourage investment. The FCC reclassified Internet providers under Title II of the Communications Act, making them common carriers subject to the same regulations as public utilities.
The UTA argues that the FCC's prior light-touch approach deliberately induced hundreds of billions of dollars in investment, all of which has now been devalued. With the legal and regulatory arguments likely to continue for some time to come, we wanted to look at which advocates on either side of the argument were having the most impact. Currently the main centres of debate are in the US and in India. The advocates of net neutrality are very vocal and visible, and include business leaders such as Mark Zuckerberg, politicians like Jeb Bush, and Megan Smith, the US CTO, along with numerous campaigning tech journalists.

| Index | Name | Company | Position | Twitter |
|---|---|---|---|---|
| 1 | Mark Zuckerberg | Facebook | Co-founder, Chairman and CEO | |
| 2 | Jeb Bush | State of Florida | 43rd Governor | @JebBush |
| 3 | Jessica Smith | Business Insider | Tech Journalist | |
| 4 | Megan Smith | Office of Science and Technology Policy | United States Chief Technology Officer (CTO) | |
| 5 | Kevin Martin | Facebook | VP Mobile and Global Access Policy | |
| 6 | Ian Sherr | CNET News | Journalist | |
| 7 | John Oliver | | Comedian | @iamjohnoliver |
| 8 | Lance Whitney | CNET News | Journalist | @lancewhit |
| 9 | Tim Berners Lee | | Internet Visionary | @timbernerslee |
| 10 | Vittorio Colao | Vodafone | CEO | |

The telcos are equally active, but far less visible, as they are operating mainly behind the scenes, with the CTIA, NCTA and UTA all lobbying the FCC as well as petitioning the courts. While the major telcos prefer to act collectively through these industry bodies rather than draw the fire of the net neutrality advocates by acting individually, some, such as Comcast, are nevertheless visible in our findings. The debate in India is led by lawyer and Bharatiya Janata Party (BJP) politician Ravi Shankar Prasad.

the lobbying power of the large telcos should never be underestimated

On the face of it, the advocates of net neutrality would appear to have the upper hand, but the lobbying power of the large telcos should never be underestimated. The telcos understand that net neutrality is a point of principle for many and is also unlikely to be something that they can reverse for the public Internet. But there is profit to be made providing private networks that bypass the Internet, and they want to ensure that net neutrality doesn't impact their ability to serve this parallel market. In most countries you have postal services for letters, with a common delivery system for all, and then a more competitive market for parcel delivery. Letters are covered by regulatory controls to ensure a universal service – a stamp is enough to send a letter to any address. All letters are treated and priced equally, whether delivered to an easy to serve destination such as a block of flats in a city, or a more remote one, such as a rural farmhouse. Parcel couriers tend to have variable pricing and delivery options based on priority, size or weight and destination (as well as volume discounts for large clients). The advocates of net neutrality want to retain this kind of common universal service for the Internet so that we don't have first and second class delivery that would discriminate between types of traffic over the internet. At the same time, the telcos realise that they may not be allowed to offer differentiated services over the public internet or to consumers' homes (at least at this time – it may follow eventually); however, they want to be able to compete for the lucrative equivalent of the parcel service – the private data networks that bypass the public internet and serve large corporate clients.
[easy-tweet tweet="Corporate clients have invested heavily in new cloud environments that depend on secure and reliable connectivity" via="no" usehashtags="no"] Corporate clients have invested heavily in new cloud environments that depend on secure and reliable connectivity. The pubic Internet is increasingly clogged by high bandwidth video from the likes of Netflix and rarely as secure or reliable as companies need. There is therefore already a thriving private connectivity market. Firms like GTT have a global network with hundreds of Points Of Presence (POPs) connecting corporate data centres and cloud providers internationally. The telcos want a slice of this pie. The question will be what counts as a letter or a parcel and when is a service tiered and when is it separate. The answers to these questions will dictate how the Internet evolves and how much of it becomes regulated. ### Future of Cloud: Part One Crystal Balls and Great Expectations  Back in May, following the OpenStack Summit in Vancouver, I said that: “… many of the main players in OpenStack only really play in the private cloud space – HP for example let slip that it wasn’t really a player in public cloud, before rapidly withdrawing the comment and claiming that it was, when in reality it isn’t. So it is getting increasingly easy to picture a future in which a handful of players dominate public cloud with their own proprietary stacks, while OpenStack becomes the de facto standard for private cloud.”  [easy-tweet tweet="It is easy to picture a future in which a handful of players dominate public cloud, with #OpenStack the de facto standard for private cloud"] Rackspace decided that it would offer managed public cloud services instead of commodity IaaS cloud. IBM scrapped its SmartCloud Enterprise public cloud product when it bought public cloud player SoftLayer. Dell got out of the market some time ago, although it now has an ‘arms-length’ presence again via VMware - which it inherited from its $67 billion EMC purchase. And finally HP announced last week that it wasn’t serious about public cloud after all. [caption id="attachment_32506" align="alignright" width="300"] HP quits Public Cloud; Dell buys EMC; What’s happening?[/caption] As predicted that leaves us with a handful of players dominating public cloud with their own proprietary stacks - although SoftLayer is largely OpenStack compatible and Google is keen to work more closely with OpenStack. It also means that OpenStack has effectively become the de facto standard for private cloud (VMware being expensive and having its on-going independence and therefore survival questioned). Buoyed up with a false sense of infallibility, I thought I’d stick my neck out and make a few further predictions: Open standards, security and great tech won’t provide competitive advantage for OpenStack OpenStack groupies often claim that open standards, security and great tech make the open cloud OS the natural choice over its proprietary, public cloud rivals. They are kidding themselves. Claiming that being open brings competitive advantage is futile. If this were true then Linux would have decimated Windows. The fact is that AWS may be proprietary, but the lock-in issues can be overcome by using good application architecture, as well as by using new technologies such as containers. Besides there are enough companies on the AWS marketplace with competing services in any area that clients have enough choice within the AWS ecosystem without looking elsewhere. 
On the security front, AWS continues to add security innovations and enhancements – and it beat IBM in the bid to run cloud services for the CIA. OpenStack has its own security challenges in any case. As for great tech and the ability to build complex systems for complex needs, more and more of the best tech is being built for multiple platforms (such as container technology), as companies find OpenStack skills in short supply, as more developers focus on AWS and as tech firms seek to spread their bets between AWS, OpenStack and elsewhere.

Private cloud/OpenStack won't be a safe haven

It's competitive enough in the public cloud between just three main players, but the private cloud/OpenStack world will be far more fragmented, and one in which it is exceedingly hard for players to differentiate themselves or to maintain margins. At the same time, Amazon is not afraid of taking the fight to the opposition: just ask Walmart, Etsy or any of the tech vendors. AWS already has a major private cloud client in the CIA. Don't assume that it won't decide at some point either to provide close API integration with OpenStack and drain business from its private cloud rivals, or to take them head on with a private cloud offering of its own. Even if this doesn't happen and containers simply take off, they're a whole lot cheaper to run right now on AWS.

The private cloud/OpenStack premium will hinder growth

The OpenStack skills shortage and poor usability are two sides of the same coin. If you could reduce complexity then the skills issue would be less of a problem, and vice versa. Until this is cracked, the cost of skills and complexity combined will continue to be a major overhead for OpenStack. At the same time, OpenStack is just part of the whole private cloud market that is worth $8.9bn and is growing at 35% (IDC), whereas AWS alone is not only already bigger than the entire private cloud market but is growing at 78%. AWS therefore has a scale advantage that is increasing rapidly. In addition, all of AWS's profits are reinvested in infrastructure and innovation for AWS, whereas OpenStack profits are spread across a fragmented community and reinvested in often duplicated or competing development initiatives.

[easy-tweet tweet="While private cloud is growing, it is actually falling as a share of the overall cloud market " user="billmew" usehashtags="no"]

All of this puts private cloud and OpenStack at a major cost disadvantage, meaning that there will always be a price premium for private cloud – even before you start to tailor a solution for the needs of a particular client. While private cloud is growing, it is actually falling as a share of the overall cloud market (as public cloud outpaces it), and it is also failing to make up for the traditional tech revenues that are being eroded from the traditional tech players. It is not hard to see private cloud revenues starting to shrink once the cloud sector matures and the private cloud/OpenStack premium becoming less sustainable (you'll be able to read more about my opinions on that in my long term forecast tomorrow).

Old tech giants: From Dominance to Relevance

In this environment the tech giants that until recently were tussling for dominance in the market now find themselves struggling for relevance. Take HP for example: it is a company with a great history, some talented staff and some fantastic technology, but even with all its skills and resources it could not get a competitive OpenStack public cloud off the ground.
As I've mentioned before, in public cloud AWS is benefiting from a real first mover advantage. Firstly, the capital costs and barriers to entry in public cloud are significant. AWS has had to spend about $1 billion each quarter building out its cloud platform.

[easy-tweet tweet="HP and others have not dropped out of public cloud because they wanted to, it is because they were forced to" user="billmew" usehashtags="no"]

Secondly, differentiation is hard to achieve in cloud, meaning that fast followers are struggling to overcome AWS's scale advantage by offering obviously improved alternatives – usually the approach that fast followers take. HP and others have not dropped out of public cloud because they wanted to; it is because they were forced to. What they do now, and how they maintain their relevance within a small and potentially shrinking private cloud market, will define their future, but the industry's future will be defined by others. Check back tomorrow for my cloud weather forecast.

### The Top Ten Cloud Computing Countries in the EU

According to research from Eurostat (the statistical office of the European Union), cloud computing services are used by one in five enterprises within the EU. As the Managing Director of Speechpath, a cloud telephony and VOIP provider based in Dublin, Ireland, my business relies heavily on cloud capabilities. I have put together this interactive StoryMap on the 'Top Ten Cloud Computing Countries in the EU' to highlight the usage of cloud computing specifically within the EU region, of which Ireland is a part.

[easy-tweet tweet="The information and communications sector is the largest adopter of cloud computing services in the EU" via="no" user="comparethecloud"]

The information and communications sector is the largest adopter of cloud computing services in the EU, at forty-five per cent. The next highest cloud computing adopter by industry is the area of professional, scientific, and technical activities, with a twenty-seven per cent adoption rate. Finland is the leading country for cloud computing in the EU. It is well above the EU average (which stands at 19 per cent), with one in every two enterprises in Finland using some form of cloud computing service – the highest rate among the twenty-eight EU countries. Italy comes in second place with two in every five enterprises adopting cloud services, followed by Sweden in third place.

### Is your business ready for the cloud?

We hear a great deal about the benefits of cloud, with many touting dynamic scalability and revenue expenditure models as valuable reasons to turn to the cloud. Whilst this may be true for some, businesses need to take the time to look at their own unique circumstances and consider if they are really ready for the cloud. There are a lot of inevitable infrastructure questions, but moving to the cloud is not just about the network; it's about the business, and this should therefore form the basis of all decisions relating to the cloud.

[easy-tweet tweet="Moving to the cloud is not just about the network" user="comparethecloud" hashtags="cloud"]

So what do businesses need to consider before looking to the cloud? First and foremost, businesses need to consider their organisational needs and suitability for the cloud. What applications are they using? How mission-critical are their workloads? What sort of data are they using?
A video production company which regularly works with large file sizes may find the bandwidth required to transfer those files is too large for its internet connection, so cloud is unlikely to prove an effective solution. Conversely, a retail business which experiences small, but regular, windows of peak demand may benefit from the dynamic scale cloud can offer. Equally, a strongly sales-focused business may benefit from a more SaaS-focused model, using the cloud for specific line-of-business applications such as CRM, email and voice systems.

[easy-tweet tweet="Just as traditional data centres present a specific set of problems to be resolved, so too does cloud" user="comparethecloud" usehashtags="no"]

Whilst there are significant benefits to be found in the cloud, it is important to recognise that it does not provide a universal solution. Just as traditional data centres present a specific set of problems to be resolved, so too does cloud. In changing the landscape of an organisation's network infrastructure, virtual environment and risk profiles, cloud presents a new set of circumstances for which the implications must be understood.

Security in the cloud

Let's begin with a look at security. A business considering a move to the cloud needs to consider how it will apply the security policies used in its on-premise environment when out in the cloud. For instance, there are many ways to extend security into the cloud using federation and single sign-on (SSO); a short illustrative sketch of this idea appears after the SDN discussion below. This means that when a user logs on to their machine, the username and password entered are used to confirm their identity, granting access to any SaaS applications as well. Not only does this make life easier for employees, it also means a company can maintain central governance of its security policies across multiple cloud platforms, managing access policies based on user profiles. This is very important to businesses where industry-based or ISO-based compliance controls are in place.

Understanding the role of Software Defined Networking (SDN) – extending the network into the cloud

By now, it goes without saying that virtualisation is a key enabler in the move to the cloud. What is less well known is the role that SDN plays in the process. SDN provides the ability to seamlessly blend the on-premise network into the cloud. Workloads can already be moved across separate network IP ranges, but the integration isn't as clean as it might be, with separate IP ranges in play.

SDN has the potential to allow virtual machines to move between on premise and the cloud

SDN, however, has the potential to allow virtual machines to move between on-premise environments and the cloud. This makes the location of compute and storage services irrelevant, even to the people managing the network: whether workloads are running locally or in the cloud, the user experience is the same and the delivery process is seamless. For businesses with bespoke applications, built to manage complex business processes, this level of integration could be key to cloud success. Ensuring that the on-premise and cloud networks are one fabric removes much of the complexity around application provisioning and provides much greater levels of flexibility. This may also influence a company's choice of cloud provider, as the need for networking expertise alongside cloud knowledge becomes even more imperative.
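To make the single sign-on point above more concrete, here is a minimal sketch of how a cloud application can accept a federated identity instead of managing its own passwords: the user authenticates once against the corporate identity provider, and the application simply validates the signed token it receives and maps the user's groups to access policies. This is a generic illustration rather than a description of any particular product; it assumes the PyJWT library, and the issuer, audience and group names are hypothetical:

```python
import jwt  # PyJWT, assumed dependency
from jwt import PyJWKClient

# Public signing keys are fetched from the identity provider's JWKS endpoint.
# The URL is hypothetical; in practice it comes from your IdP's metadata.
jwks = PyJWKClient("https://idp.example.com/.well-known/jwks.json")

def authorise(token: str) -> set[str]:
    """Validate a federated SSO token and map the user's groups to app roles."""
    signing_key = jwks.get_signing_key_from_jwt(token)
    claims = jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience="my-saas-app",              # hypothetical audience
        issuer="https://idp.example.com",    # hypothetical issuer
    )
    # Central governance: group membership maintained in the IdP determines
    # what the user may do in this application.
    group_to_roles = {"finance": {"invoices:read"}, "it-admins": {"admin"}}
    roles: set[str] = set()
    for group in claims.get("groups", []):
        roles |= group_to_roles.get(group, set())
    return roles
```

Because tokens are issued and signed centrally, disabling a user or changing their group membership in the identity provider takes effect across every SaaS application wired up this way, which is what makes the central governance described above practical.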
Other infrastructure-based issues to consider when looking to the cloud are internet latency and WAN optimisation. Businesses reliant on WAN optimisation in on-premise environments (used to improve their application response times) may need to consider alternative optimisation options in the cloud. Finding a partner who can deliver this as part of their cloud offering could be vital to ensure the level of user experience is maintained.

The impact on compliance

Finally, let's consider the sometimes overlooked issue of compliance. Some compliance issues are general across all businesses whilst others are industry specific. Government-regulated industries may face restrictions based on the classification of information, while online stores must adhere to Payment Card Industry compliance mandates. Such regulations may restrict use of the cloud but they may not prevent it across all business applications. The trick is to know the rules and assign protocols accordingly.

The trick is to know the rules and assign protocols accordingly

Privacy and movement of data are also important issues to be aware of. A company based within the EU which provisions cloud services from a supplier whose data centre is housed in the US would currently be in breach of regulations restricting the movement of data outside the EU. Another common pitfall relates to the issue of licensing. It is important to remember that every server, whether physical or virtual, needs a software licence. It is all too easy to overlook this issue in the virtualised world, where a VM can be flexibly provisioned at will, but each server will still need to be covered by a licence. It is likely, with the ongoing growth of the cloud and progression of technology, that regulations will change further in the coming months and years, so businesses need to keep abreast of the latest compliance regulations and amend their processes accordingly.

The hybrid advantage

Arguably the biggest advantage cloud gives us is its flexibility. Every business faces different challenges, and cloud offers the flexibility to meet these challenges on an individual basis, be it through a single SaaS application or a full infrastructure model. With a host of cloud brokers emerging in the market, businesses can obtain expert advice to assist them in the provision of IT services across all of the various operating models.

### SUSE Offers Beta Preview of SUSE OpenStack Cloud 6

Based on OpenStack Liberty, delivers high availability and non-disruptive upgrades plus Docker and IBM z/VM mainframe virtualization support to ease transition of enterprise workloads to the cloud.

[easy-tweet tweet="SUSE has launched beta testing of SUSE OpenStack Cloud 6" user="comparethecloud" hashtags="openstack"]

SUSE® has launched beta testing of SUSE OpenStack Cloud 6, giving customers an early look at the latest enterprise-ready technology for building Infrastructure-as-a-Service private clouds. Based on the OpenStack release Liberty, SUSE OpenStack Cloud 6 delivers high availability enhancements and non-disruptive upgrades along with Docker and IBM z Systems mainframe support to ease the transition of business-critical applications and data to the cloud. The Liberty-based beta will be demonstrated during this week's OpenStack Summit in Tokyo and at SUSECon in Amsterdam Nov. 2-6. "The key enhancements in SUSE OpenStack Cloud 6 are a result of SUSE listening closely to our customers and focusing on delivering the functionality they need to help their businesses thrive," said Michael Miller, SUSE vice president of global alliances and marketing.
"They need a robust and stress-free cloud that drives innovation. Improved high availability, non-disruptive upgrades and the inclusion of Docker support directly address these concerns. As the OpenStack market continues its rapid growth, we're staying focused on enterprise customers and meeting their business-critical requirements." Kohei Sudo, CEO and president of S2 Inc. Ltd., said, "The ability of SUSE OpenStack Cloud to support the integration of plug-in components such as MidoNet and Wipro Cloud Service Gateway into an enterprise-class OpenStack distribution gives us a comprehensive solution that satisfies both our technical and business needs. SUSE OpenStack Cloud allows us to quickly build custom private, public and hybrid cloud environments, providing tailored service to our customers that other cloud providers cannot match." [quote_box_center] SUSE OpenStack Cloud 6 enhancements include: Non-disruptive upgrade capabilities to ease migration to future OpenStack releases. Enhanced high availability features to enable customers to run legacy or business-critical applications in the cloud with the same level of availability of more traditional infrastructures. Support for IBM z/VM alongside existing support for Xen, KVM, Hyper-V and VMware, making SUSE first to include these hypervisor options. This gives SUSE OpenStack Cloud the widest range of hypervisor support available and allows customers to incorporate their mainframe platforms into their OpenStack private cloud. Docker support to let customers build and run new and innovative containerized applications. Full support for OpenStack Manila to provide direct access to the performance, scalability and management of the open source Manila shared file system service. Support for SUSE Linux Enterprise Server 12 Support Pack 1 to allow customers to build their OpenStack clouds on the latest version of the top platform for enterprise workloads. [/quote_box_center] Jeff Frey, chief technology officer for IBM z Systems, said, "IBM and SUSE have been working together for more than 15 years, providing enterprise-grade IT service with Linux on the mainframe. Including support for z/VM in a SUSE OpenStack managed cloud allows organizations to take full advantage of the performance, scale, security and high availability of the z Systems mainframe, and at the same time, realize the improved access, flexibility, agility and operational efficiency provided by the cloud." SUSE OpenStack Cloud 6 is scheduled for release early next year following beta testing. Organizations interested in joining the beta program should contact SUSE at cloud@suse.com. For additional information on SUSE OpenStack Cloud, see www.suse.com/cloud or visit SUSE in booth S15 during OpenStack Summit Tokyo Oct. 27-30 or at SUSECon in Amsterdam Nov. 2-6. ### 5 Top Tips for Winning with AWS As AWS increasingly becomes the preferred deployment model for enterprise applications and services, it’s never been more important for a software or AWS SaaS provider to effectively work with AWS.  Many leading technology providers are therefore optimising their software to run on AWS as well as building globally available cloud services delivered through AWS’ worldwide regions. 
[easy-tweet tweet="It’s never been more important for a software or AWS SaaS provider to effectively work with AWS" via="SplunkUK" usehashtags="no"] We’ve had some success on AWS, and thought we’d share some of our learnings in the form of best practices for you to keep in mind when developing your software or SaaS business in this way. 1. Embrace the Change If you’ve attended the keynote at any recent AWS Summit, you’ve heard the message “Cloud is the New Normal.”  Our advice is to take this to heart, and invest in your business knowing the momentum behind cloud will only continue to accelerate. “Cloud is the New Normal” This is a boon to businesses of every size and in every location around the world – cloud makes it easier than ever before to innovate, rapidly bring an offering to market, and serve your customer. 2. Relentlessly Focus on the Customer Experience Focusing your business on customer success is a must when building a business on AWS. Why? Because the number one driving factor behind everything AWS does is help its customers be successful and innovative.  Tactically, this can mean many things for a SaaS Partner, but the one that stands out is building technology integrations that can provide additional value to AWS customers. A good example of this involves AWS CloudTrail and AWS Config, services that deliver log data on AWS API calls around user activity and resource changes. When properly harnessed, these services help ensure security and compliance of AWS deployments. A handful of SaaS Partners deliver integrations for these AWS services. The importance of these integrations is clear when you think of how crucial security and compliance are for any successful AWS deployment. 3. Leverage Your Customers in Your Go-To-Market Strategy When it comes to building your software or SaaS business on AWS, nothing beats customer validation. One of the most compelling stories is when a customer fully integrates your technology into their AWS strategy. A great example of this is the Federal Industry Regulatory Authority (FINRA). FINRA is an independent regulator that examines all securities firms and their registered persons and surveils trading on U.S. markets. To respond to rapidly changing market dynamics, FINRA is moving its platform to Amazon Web Services (AWS) to analyse and store approximately 30 billion market events every day. FINRA uses Splunk Cloud to ensure security and compliance in their AWS deployment. 4. Choose AWS and Go “All-In” When building out your cloud strategy, you have to make choices. Our advice: When two roads diverge in the cloud, choose AWS. [easy-tweet tweet="When two roads diverge in the #cloud, choose #AWS says Praveen Rangnath" via="SplunkUK" usehashtags="no"] This is a best practice because AWS has the richest and broadest set of services in the market. Whether your offering is storage intensive, computer intensive, or I/O intensive, there are specific solutions for each of them. Regardless of what you need on the infrastructure stack – whether it’s automated provisioning, configuration or management, there is a mature solution that fits the bill. In addition, business today is global.  To successfully grow your business you need the ability to rapidly expand around the world.  AWS offers that through their 11 worldwide regions. 5. Leverage the Ecosystem If you’re building on AWS, chances are that other folks building on AWS will find it useful. This is what makes the AWS announcement of its SaaS Partner Program so exciting. 
If you're building a SaaS billing management solution, odds are we could use it for our SaaS operational and security monitoring solution. Since we're building a SaaS operational and security monitoring solution, odds are you could use it for your SaaS billing management solution. It's a win-win for all sides.

### Moving to the cloud: keeping the customer at the core of your approach

Companies are no longer considering cloud in terms of "why," but "when?" From ordering an Uber with the push of a button to signing a document with an e-signature tool, we live in a culture of now. In today's digital era, we are used to getting what we want, exactly when we want it. Many B2C companies have perfected this business model, pairing a personalised experience with instantaneous delivery and driving continuous customer satisfaction. B2B companies, however, have been notoriously slower to catch on.

[easy-tweet tweet="The reality today is that customers want information and services immediately" user="vera_loftis @comparethecloud" usehashtags="no"]

The reality today is that customers want information and services immediately - whether that customer is a business or an individual consumer. Moving to the cloud can help companies provide their customers with a seamless digital experience, but to do it successfully, they need to ensure that the customer is at the core of their approach. Focusing on these areas can get you there:

Create meaningful customer engagement

The abundance of digital devices and communication channels has eliminated barriers to information, thus increasing consumers' expectations of all brands. Needing to react to customers in real time to stay competitive creates an even greater need for companies to adopt more efficient technology and processes. For many, transitioning to the cloud is seen as the easy solution. Yet businesses often incorrectly approach a cloud implementation from a features and functionality perspective, focusing on the most granular of details instead of the bigger picture: the perspective of the customer. The customer is interested in the speed and simplicity of their experience, and businesses need to think about all of the touchpoints — end-to-end interactions, re-engagement, follow-up, return business, cross-departmental service, etc. — that positively impact customer experience along the way. All these stages need to deliver a seamless customer experience with no issues or aggravations.

Agility and constant innovation

[easy-tweet tweet="More than half a century ago, the lifespan of a business was 61 years. Today, it's only 18" user="vera_loftis" usehashtags="no"]

More than half a century ago, the lifespan of a business was 61 years. Today, it's only 18. The current market is undeniably aggressive, and in order to stay competitive, businesses need to ensure they are agile enough to adapt to their current environment. Cloud computing has the capability and flexibility to manage technology changes and updates, and continually innovate in real time. In Bluewolf's latest The State of Salesforce Report, 64% of companies surveyed released changes to their Salesforce instance at least monthly -- a 20% jump from last year -- plus, three times as many companies reported releasing at least weekly. As a business evolves, its primary needs are likely to change; companies are realising that the speed and frequency at which they can innovate count.
The value of the cloud doesn't end at the initial implementation; those that budget for continuous innovation will get the maximum value out of their investment.

Real-time data

It goes without saying that good, clean data is essential

It goes without saying that good, clean data is essential, but with an expected 87.9 billion consumer emails to be sent and received in 2015, how can businesses hope to rise above their competition and keep customers continuously engaged? Data is the lifeblood of the enterprise and quick access to the right data is crucial. Data access via the cloud allows context and insights from any device, enabling end users to make smarter day-to-day decisions, wherever they are. It also offers a single view of the customer across all departments. This empowers employees to take the next best action in the moment and to focus their time on making the customer experience great. In a 24/7 digital world, a seamless customer experience is what we as consumers desire and expect. In order to achieve this, businesses must avoid getting caught up in the nitty gritty or granular details of cloud technology and instead put the customer's needs at the forefront of this change.

### OpenStack Foundation Launches Professional Certification Program for OpenStack Cloud Administrators

OpenStack training providers will be able to offer the 'Certified OpenStack Administrator' test virtually, anywhere in the world.

[easy-tweet tweet="The #OpenStack Foundation today announced a new professional certification program" user="comparethecloud" usehashtags="no"]

As part of its strategic efforts to grow the OpenStack talent pool and global community, the OpenStack Foundation today announced a new professional certification program that is meant to provide a baseline assessment of knowledge and be accessible to OpenStack professionals around the world. OpenStack Foundation Executive Director Jonathan Bryce announced during his keynote address marking the opening of the OpenStack Summit in Tokyo, Japan, that the first certification, Certified OpenStack Administrator (COA), will be available in 2016. The Certified OpenStack Administrator is a professional, typically with at least six months of OpenStack experience, who has the skills required to provide day-to-day operation and management of an OpenStack cloud. More at the website: http://openstack.org/coa

There are dozens of companies around the world that currently offer OpenStack training, and representatives from that ecosystem across 10 countries have been involved in a community working group to help define the skills and capabilities of a Certified OpenStack Administrator: Asean Labs, Canonical, Cisco, Cloud Enabled, EMC, Firefly, HP, Midokura, Mirantis, Rackspace, SkyAtlas, Solinea, Stacklabs and SUSE. In addition to helping create the test curriculum and questions for the COA program, these companies will provide training and preparation for the COA test once it becomes available. Certification testing will be administered with the help of the Linux Foundation, which has created a virtual testing platform, making it possible to access the COA test anywhere in the world at an affordable cost. Testing for COA is expected to be available in Q2 2016. To find OpenStack training partners, visit the OpenStack Marketplace, which lists more than 20 training providers, many of whom also offer their own advanced or distro/product-specific certifications in addition to COA certification.
The full list of training companies supporting the new certification is: 99cloud, Aptira, AWcloud, Canonical, Cloud Enabled, Component Soft, Firefly, hastexo Academy, HP, Linux Academy, Linux Foundation, Midokura, Mirantis, Morphlabs, PLUMgrid, qSTC, Rackspace, Solinea, SwiftStack, SUSE and Tesora.

"This OpenStack professional certification program addresses the need for well-trained and highly qualified OpenStack administrators," said Jonathan Bryce, executive director, OpenStack Foundation. "We expect COA certification to become a valuable credential that any hiring manager would want to see on the resume of a viable candidate. Further, it is our hope that the OpenStack professional certification program will encourage new entrants into the OpenStack community and expand the talent pool within the industry."

### An Introduction to Veber Hosting

Every now and again we like to give you, our loyal readers, the lowdown on the companies that have caught our eye. Veber have been around for a while and they've pretty much seen it all, and now they're here to tell you all about it. We spoke to Tim, Neil and Rob about their business; you can read all about it straight from them below.

TP: Tim Poultney, Director
Tim has been the managing director of Veber for 16 years. He has managed the company through a number of re-inventions but Veber, in its current form, is the most successful. Prior to starting Veber Tim was in technology roles for SPAN and Marconi.

NL: Neil Laver, Sales Director
Neil recently joined Veber from Microsoft where he was a Sales Director for Cloud Productivity. He has a history of building successful businesses in senior sales and marketing roles at leading software and cloud companies.

RC: Rob Crowe, Operations Director
Rob joined Veber in January 2014 to head up and build the operations team. Rob has a strong track record delivering complex projects for many large companies. Rob has previously worked for Honeywell, Tesco and Marks & Spencer.

Veber as a whole:
Veber was founded in 1999, so it has been through all the ups and downs technology has seen since the last millennium. Initially just 'selling tin', Veber now specialises in providing unique hosting solutions for unique businesses. All our hosting solutions are designed to meet the customers' needs to ensure suitability, reliability and value for money. And Veber's core group of employees continues to be a select and dedicated team of experts who will know you, your business and your hosting solution.

What do you think of the Cloud Industry at present?
RC: The cloud industry is so disruptive at the moment. As a small team we work exceptionally hard to keep up with the fast-paced changes - my focus with Veber is finding a way to implement structures and procedures to ensure we stay up to date in the rapids of cloud. It's challenging, but I think we're up for it.

[easy-tweet tweet="The cloud industry is so disruptive at the moment says Rob Crowe of @Veberhost" user="comparethecloud" usehashtags="no"]

What are Veber's main strengths?
TP: Customer retention is one of our strongest factors. In 16 years of business we've only ever lost 6 clients - and 4 of those only left us because they went bankrupt! I think it's a credit to our customer service, and our product, that we have such loyal customers. In the past year alone we've seen our business grow strongly. All of our new business tends to come through recommendations and word of mouth, so we have a good reputation within the market as a reliable service provider.
What do you think sets Veber apart in a competitive cloud marketplace?
TP: Because we have such a solution-led sale, our customers really get what they need. We're not about selling them extraneous products for a bit of extra income; we're a very customer focussed business. It's all about keeping them happy. A lot of our clients were startups when they joined us, and as they've built their businesses and grown, we've been able to tailor their solutions for that growth - meaning they get just what they need from us.

What are your ideal clients?
RC: Startups, particularly those in the finance and banking sectors - a sector that we have great expertise in and whose needs we are really able to cater to. These types of start-ups require the latest best of breed technology. They want the near 100% uptime and deep technical expertise. They choose Veber as that describes what we offer.

[easy-tweet tweet="#Startups want the near 100% uptime and deep technical expertise" user="veberhost @comparethecloud" usehashtags="no"]

TP: It's about getting the startups at the right time - we find we have to target them before they feel they need to head to the monster that is Amazon, which can be a challenge, but our tailored service definitely offers a more personalised solution than using a major public cloud vendor who will only offer a very cut and paste service for its clients. They just can't offer the same level of personalised technical expertise that you'd get from Veber.

Could you tell us a bit about some of your current clients?
TP: Currency Cloud are one of our clients that we have worked with from when they were a startup, and now we're seeing their success in the international payments industry. We offer them anything from their virtual arena to a storage farm, tie lines and their data security.
RC: We have many other customers in the services and finance sector, all of whom have chosen Veber as we offer the best of breed technology and expertise. What is common to all of them is that their businesses are web based, so they need the critical security and reliability that we offer.

Neil, being that you're new to the company, could you give us a quick overview of what you hope to add to Veber?
NL: Tim, Rob and the rest of the team have done a great job in the recent past to acquire and build new customers. My aim is to accelerate that and begin to develop new products that we can offer to our existing customers and the many new customers that I hope to add. As Rob mentioned earlier, the cloud industry is changing fast and I have joined Veber to ensure that we are as agile as we can be, so that our customers are getting the best possible service from us.

[easy-tweet tweet="The #cloud industry is changing fast and @neillaver has joined @Veberhost to ensure they are as agile as they can be" via="no" usehashtags="no"]

Where do you see Veber heading in the future?
TP: I've never been more excited about the future. I am far less worried about customers moving to the public cloud - it suits some companies but many more see the value in a deeper relationship with a specialised group of experts like Veber.
RC: It's a real challenge to keep up with the innovation that is coming from our suppliers like Dell, Cisco and VMware. But from what I've seen from them, it's really exciting. Just in the past few weeks we've added flash storage and DDoS mitigation as new services from new suppliers. I see this expanding greatly.
NL: Our growth is allowing us to offer more and more solutions at greater value.
But as we grow, we'll continue to ensure we maintain the great customer service that our customers appreciate.

### Can finance reap the benefits of the cloud without compromise?

...Only if CFOs and CIOs align their requirements and expectations.

Cloud computing has been a catalyst in opening up much-needed dialogue between CIO and CFO. CFOs are being educated on the opportunities and risks associated with IT. CIOs are being provided with options with controllable and scalable costs to reduce risk and increase IT departments' innovation. Interestingly, their roles have flipped in some ways. More often than not it's now the CFO who has direct oversight of the IT department and IT-related spend, but it is the CIO who receives the constant flow of both technical and educational user needs, hardware and software patches and updates, and the changing needs of the overall business.

[easy-tweet tweet="Interestingly the roles of #CIO and #CFO have flipped in some ways" user="comparethecloud" usehashtags="no"]

When it comes to the cloud, CFOs, CIOs and other finance and IT group leaders have divergent attitudes, and where those differences trigger the most friction and loss of opportunity is in methods, processes and timing. Can finance and IT ever be in sync where cloud is concerned? Recent research from Saugatuck Technology shows that IT seems more concerned about existing systems capability than finance is, and is more apprehensive than finance about cloud-based systems falling short in areas like consolidation and regulatory requirements. What the research clearly points out is that both finance and IT need to be reassured that the road to the cloud for finance systems takes into account the need for strong financial consolidation and regulatory capabilities.

While strategic IT-Finance synchronisation is necessary at all times, it is not always a critical problem. The research indicates that IT and finance leaders tend to be well aligned on cloud when it comes to the most important areas. The challenge is how to harmonise what they see, how they see it, how they communicate it and how they put things into place. Interpreting cloud capabilities into ways of improving specific finance process effectiveness will be the first step on a years-long journey not just into, but with, cloud.

the majority of new finance systems will be cloud-based or cloud-enabled within the next few years

It is clear that finance, just like IT, thinks there is need and opportunity for improvement in finance systems and operations. And there is no question that the majority of new finance systems will be cloud-based or cloud-enabled within the next few years. The path – or paths – to cloud will shift somewhat as efforts lead to knowledge and experience leads to improved practices, but every realistic scenario includes hybridised on-premises-plus-multiple-cloud environments. In order to move finance systems to the cloud as securely, effectively, and quickly as possible, we will need to overcome some fairly significant differences in Finance-IT synchronisation. That both camps acknowledge the need for more sophisticated technology for the standardisation of methods, regulatory compliance and the integration of systems is already an important first step.
[easy-tweet tweet="IT decision makers have less confidence in current finance management systems" user="comparethecloud" hashtags="cloud"]

The research shows that overall, while the core IT and finance positions are extremely similar in patterns, IT decision makers have less confidence in current finance management systems, along with greater concern about the consolidation of subsidiaries and the cost of running or upgrading current finance systems. Results also suggest that IT leaders are more positive than Finance leaders about the benefits of Big Data analytics. Finance is clearly happier with their planning, consolidation and reporting capabilities than IT. So if finance is going to move to a cloud-based system, they need to be sure they don't compromise on the capabilities of the solution. They need a solution that meets their needs and can easily be extended to address the demands of the future. They absolutely must not compromise on that as they move to the cloud.

### Solving health challenges: collaboration and innovation in the cloud

From the moment that a patient steps into their doctor's surgery, they expect to be treated as an individual; for their healthcare to be as personalised as their internet shopping, banking or travel experience. Consumer-focused cloud platforms have transformed our daily lives and, as consumers, we are more vocal, more empowered and more aware of how we should be treated than ever before.

[easy-tweet tweet="Healthcare will use cloud to provide a personalised experience to patients" user="Aridhia" hashtags="healthtech"]

Despite the healthcare industry being slow to adopt cloud technology, there is a growing acceptance that as people move from being passive recipients to active, informed consumers in all other walks of life, there is a need and demand for healthcare to follow suit and provide a personalised experience to patients. In the cloud era, we find ourselves on the brink of a transformative shift in medicine, one driven by the explosion of data and the need to disrupt the 'business as usual' methods of conducting biomedical research with an innovative big data approach. Data is the foundation of cutting-edge research and is key to delivering innovations that can change how we practice medicine.

In the cloud era, we find ourselves on the brink of a transformative shift in medicine

No matter where you look in healthcare, research or business, three common themes emerge: collaboration is the solution to solving problems and translating innovation into practice, information governance is central to all uses of data, and consumerisation – building for population scale while offering personalised choice – is key to the success of scaling data-driven innovation. Balancing collaboration and information governance in the data world is not easy, and this is where the value of cloud computing in healthcare can really be seen. By applying the information approaches that have transformed other industries and building in audit, reproducibility and collaboration technologies to make it relevant and scalable for healthcare, precision medicine and biomedical research, Aridhia has created an open, flexible and secure cloud platform that can accelerate data-driven projects and power the drive towards personalised medicine. Take for example the CHART-ADAPT project, led by the University of Glasgow.
This collaborative project is enabling Neuro-ICU clinical teams to swiftly analyse anonymised, high-frequency, real-time data in patients with traumatic brain injuries using AnalytiXagility – Aridhia's cloud-based platform. It is only by working in collaboration with Aridhia and Philips Healthcare that the university team have been able to realise their ambition of rapidly analysing critical data via a bedside app on an individual patient level, quickly enabling personalised treatment decisions to be made, all while working within the strict bounds of information governance requirements.

Balancing collaboration and information governance in the data world is not easy

Such collaborations can have a real impact on patient care, but this is only achievable if collaborative research teams take advantage of the benefits that cloud technologies offer. Data by itself is not enough; it is our ability to communicate and collaborate around that data to create new knowledge and enable a personalised approach that is key to transforming healthcare. And while it is a widely accepted concept that innovation requires collaboration, the healthcare world has been relatively slow to embrace the transformation required to benefit from it.

there are already Europe-wide projects underway which are leveraging healthcare's big data to answer research questions

Research projects such as CHART-ADAPT have a common goal, but also a common need – that of a cloud-based platform in which to bring both expertise and data together. Such close multidisciplinary collaboration around data would have been unthinkable only a few years ago, but as cloud technologies advance, we now have the opportunity to transform biomedical research into a team sport where a range of skills and experience can be brought to bear on complex challenges. Funding programmes such as the Innovative Medicines Initiative and Horizon 2020 have picked up on the need to promote and enable multidisciplinary collaboration, and there are already Europe-wide projects underway which are leveraging healthcare's big data to answer research questions.

[quote_box_center]One such project that Aridhia is currently involved in is ADVOCATE, a Horizon 2020 funded programme. An excellent example of multinational collaboration around oral healthcare data, ADVOCATE relies on the ability to share data and coordinate knowledge and expertise across multidisciplinary and multi-institutional research teams, no matter what their skillset or where they might be. By delivering an innovative approach to data management, analytics and patient engagement, AnalytiXagility is helping this collaboration shift the focus of European oral healthcare from a reactive to a preventative and personalised model.[/quote_box_center]

Although such projects are currently in the minority, embracing collaboration in the cloud is beginning to resonate across the healthcare industry as modern medicine increasingly becomes an information science, and data analysis becomes progressively more challenging, but more crucial. As the volume and complexity of healthcare data grows, the advanced data analysis of vast amounts of genetic, clinical and patient data at the intersection of science and healthcare will become a mainstream enabler of therapy and drug development, clinical decision making, and predictive, preventative healthcare delivery. To get to that point, the healthcare industry needs a specialised cloud platform that addresses its particular needs.
[easy-tweet tweet="The healthcare industry needs a specialised #cloud platform that addresses its particular needs" user="aridhia"] For several years Aridhia has been involved in collaborative projects which bring together disciplines such as IT, pharmaceuticals, biotech, genomics and public health. What is clear in each project is that new clinical and patient experiences will be enabled by innovative data-driven applications that interact and serve the patient, researcher or clinician in the context of who they are, where they are and what they are doing in the moment, creating a personalised experience across the entire healthcare lifecycle. The goal of secure, collaborative, cloud-based research platforms such as AnalytiXagility is to create an environment in which each expert can focus on their own discipline and rely on the platform to do the rest.  ### Preview of the OpenStack Summit Tokyo As I packed my bags for the trip to the bi-annual OpenStack jamboree for supporters of the open cloud OS, and contemplated a very long flight from London to Tokyo, I thought I’d look at what we can expect and what the OpenStack movement needs to deliver if it is to thrive against its public cloud rivals. The battleground There are several main dimensions to the battle for cloud supremacy and here’s a brief look at how OpenStack currently stacks up: [easy-tweet tweet="Let's look at how #OpenStack stacks up in terms of capability, ease of use, ecosystem and sophistication" user="billmew" usehashtags="no"] Capability: OpenStack has an impressive array of corporate supporters from HP and IBM to RedHat and Mirantis, all contributing code for the common cause and adding their own OpenStack services. However the level of innovation and rate of delivery of new functionality from rival AWS is phenomenal and would be hard for the OpenStack community to match even if all of its corporate supporters were collaborating in perfect unison – but they’re not. AWS launched 487 services and features this year in the run up to re:Invent (its own fan fest) – everything from developer tools to email services – and then added even more at re:Invent. All of these AWS services contributed to the dominant position of one player – Amazon. Conversely the OpenStack foundation has just launched Liberty, the 12th version of its cloud OS - the new release being better able to handle network function virtualisation and network virtualisation in its other forms as well as containers (Kubernetes, Mesos and Docker Swarm). Heat orchestration has also been improved with more APIs allowing more automation and integration. While many of the main corporate sponsors have contributed to this and are making their own announcements in Tokyo, they aren’t working with a singular purpose as AWS is. The fact remains that they all have different agendas and are all competing for business in the OpenStack segment (all wanting to be the lead vendor on key engagements) and that many are distracted with other major issues – Dell is acquiring EMC, HP is splitting into two and IBM is reinventing itself. Ease of Use: [easy-tweet tweet="Amazon is nominally a commoditized offering" user="billmew" usehashtags="no"] Amazon is nominally a commoditized offering – i.e. very cheap and easy to use. However it is often criticized for not only being far more expensive than it would first appear (once all the options are considered), but also being somewhat more complex. 
It realises this and has made strides to address the issues – aiming to make it easy for IT executives and developers to find and use unique tools to solve issues quickly or expand capacity cheaply. For example, at re:Invent AWS launched tools to simplify data and database migration to AWS. While AWS has further to go on the ease of use front, OpenStack is way behind – it's often described as a science project for geeks or even rocket scientists. We expect to see 'ease of use' as a major focus in Tokyo.

Ecosystem:

I was among the 6,000 attendees at the last OpenStack Summit in Vancouver and while we expect more attendees in Tokyo, it is still way below the 18,000 attendees at AWS's re:Invent. This is just an indication, but given that AWS is on the short-list for nearly all enterprise infrastructure deals, and given its current dominance in public cloud, it has become the first choice environment for many developers, putting the OpenStack ecosystem (however loyal and vibrant) at a major disadvantage. After all, if you win the developers, you win the war.

if you win the developers, you win the war

Sophistication:

One of the most important battles is going to be for the large corporate clients that have traditionally looked to the likes of IBM, Oracle or HP for their IT needs and who need sophisticated systems to meet their complex requirements. Initially AWS only offered a basic set of compute and storage services, so the playing field should be stacked massively in favour of OpenStack's major corporate supporters who dominate the corporate IT market with well-established corporate offerings. They also have strong relationships with CIOs and credibility with LOB execs. While the established vendors (who are also major OpenStack supporters) definitely hold the upper hand, AWS is making a significant challenge. Amazon's recent launch of QuickSight, which sits on its Super-fast, Parallel, In-memory Calculation Engine (SPICE), already leverages AWS and partner data sources like Tableau, DOMO, Qlik and Jaspersoft and could soon evolve to challenge the incumbent vendors for LOB data visualisation services (including business intelligence and analytics). At the same time the AWS Database Migration Service and AWS Schema Conversion Tool simplify the challenge of switching databases, as AWS directly targets database service renewals and therefore Oracle's core market. Add to this the AWS foray into IoT and it is clear that the leadership positions of the traditional corporate tech vendors are under assault.

[easy-tweet tweet="In Tokyo we need to see #OpenStack and its major supporters come out fighting" user="billmew" usehashtags="no"]

In Tokyo (and beyond) we need to see OpenStack and its major supporters come out fighting. They need to move quickly and be aggressive if they are going to make inroads not only against AWS, but against Azure and Google too – and if they don't then their own respective franchises will increasingly be at risk. However, just as we shouldn't underestimate AWS, neither should we underestimate OpenStack and with it IBM, HP, RedHat, Oracle and the rest. AWS may well have a leadership position in public cloud, but its revenues of $6-7bn are a fraction of the size of any one of the giants supporting OpenStack and if the threat posed by AWS ever galvanised these giants into concerted and coordinated action, AWS could be stopped in its tracks.
As we wait to see what the OpenStack community have to offer next week in each of these key battleground areas, we took a look at which members of the community were being most vocal. All the usual suspects appear in our #CloudInfluence list of top OpenStack organisations – appearing somewhat higher in the rankings than they might otherwise do are Dell and EMC as a result of the recent acquisition announcement (the largest in tech history) and also Wipro in regard to slightly unrelated comments from a former employee in the UK who is suing the firm for discrimination and unfair dismissal. In addition, Symantec, which topped the list, had a publicity boost from the sale of its information management business, known as Veritas, to an investor group led by The Carlyle Group together with GIC, the Singaporean sovereign wealth fund, for $8 billion.

| Index | Name | Website | Twitter |
|---|---|---|---|
| 1 | Symantec | http://www.symantec.com | @symantec |
| 2 | Dell | http://Dell.com | @dellservices |
| 3 | Citrix | http://www.citrix.com | @citrix |
| 4 | Wipro | | |
| 5 | Pure Storage | http://www.purestorage.com | @PureStorage |
| 6 | Mellanox Technologies | http://www.mellanox.com | @mellanoxtech |
| 7 | NTT Communications | http://www.ntt.com/ | @ntt_comms |
| 8 | Cisco | http://thenetwork.cisco.com | @Cisco |
| 9 | VMware | http://vmware.com | @VMware |
| 10 | EMC | http://www.emc.com | @EMCcorp |

Unsurprisingly Michael Dell appears at the top of our #CloudInfluence list of top OpenStack individuals. He leapt into the lead on the announcement of the Dell EMC deal, but if you exclude him, there is healthy competition among other senior executives from OpenStack's major corporate supporters, with IBM having the largest number of executives on the list.

| Index | Name | Position | Company | Twitter |
|---|---|---|---|---|
| 1 | Michael Dell | CEO | Dell | @MichaelDell |
| 2 | Eyal Waldman | President and CEO | Mellanox Technologies | @EyalWal |
| 3 | Robert LeBlanc | SVP IBM Cloud | IBM | |
| 4 | John Thompson | Global SVP of partner and channel sales | Symantec | |
| 5 | Shreya Ukil | Former employee | Wipro | |
| 6 | Martin Jetter | SVP IBM Global Technology Services | IBM | |
| 7 | Kyle Zimmer | President and CEO | First Book | |
| 8 | Charles Liang | President and CEO | Supermicro | @Charles_Liang |
| 9 | Oded Sagee | Senior director of Industrial and Connected Home Solutions | SanDisk | |
| 10 | Chris McNabb | General manager of Dell Boomi | Dell | @chrismcnabb |

We are going to track how these lists change over the course of the next week or so and gauge the impact of the various announcements made at the OpenStack Summit in Tokyo.

### Everything you need to know about cloud servers & DDoS attacks

When a business considers moving some or all of its data and applications into the Cloud, there are likely to be several reasons behind the decision. The advantages of Cloud-based computing are well-established: it offers technology and support, cost-efficiencies and scalability; it means that your company will have access to the latest software and that all patches can be taken care of by the provider if you prefer; if you have massive amounts of data to store then the Cloud saves you space; your maintenance and obsolescence costs are reduced.

[easy-tweet tweet="There are thousands of #DDoS attacks worldwide on a daily basis" via="no" hashtags="cloudsecurity"]

But probably what most businesses are looking for today, in an age where there are thousands of DDoS attacks worldwide on a daily basis that can cripple a website for long periods, is security. Guaranteed uptime.
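Part of staying safe is simply being able to tell, early, that you are under attack. Purely as an illustration - the sliding-window length, the per-source request budget and the function names below are our own assumptions, not anything prescribed in this article or by any vendor mentioned in it - here is a minimal Python sketch of the kind of naive per-source rate check an in-house team might run as a first warning sign:

```python
# Illustrative only: a naive sliding-window rate check per source IP.
# The window length and request budget below are arbitrary assumptions.
from collections import defaultdict, deque
import time

WINDOW_SECONDS = 10   # how far back we look (assumed value)
MAX_REQUESTS = 200    # per-source budget inside the window (assumed value)

_recent = defaultdict(deque)  # source IP -> timestamps of recent requests

def record_request(source_ip, now=None):
    """Record one request and return True if the source now looks suspicious."""
    now = time.time() if now is None else now
    window = _recent[source_ip]
    window.append(now)
    # Discard timestamps that have slid out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()
    return len(window) > MAX_REQUESTS

if __name__ == "__main__":
    # Simulate a burst from one address and a single request from another.
    for i in range(250):
        burst_flagged = record_request("203.0.113.7", now=1000.0 + i * 0.01)
    print("203.0.113.7 suspicious:", burst_flagged)                    # True
    print("198.51.100.2 suspicious:", record_request("198.51.100.2"))  # False
```

A counter like this only buys a little breathing space; as discussed below, the heavy lifting against large volumetric attacks has to happen upstream, at the provider's edge.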
Third-party providers offer technology such as these 100TB cloud servers that provide some of the most effective deterrents against this malicious and prevalent form of cyber crime, which is continually mutating in attempts to exhaust network resources. There are three facts every business should consider in relation to DDoS attacks:

1. They are now so commonplace, so inexpensive and easy to organise in an act of political activism, extortion, retaliation or petty vandalism, that no company with an online offering, whether it's a multinational bank or a pizza delivery company, can or should consider itself safe. Effective DDoS detection, mitigation and response is vital.
2. It would be prohibitively expensive for all but the largest and wealthiest companies to 'build out' the infrastructure they need to afford complete protection, meaning that third-party solutions, especially those offered by Cloud services providers, are the obvious answer.
3. Relying solely on third-party solutions is unwise.

Why are Cloud servers so practical?

Cloud service providers are equipped with plenty of bandwidth, and data centres spread over multiple locations, making it far easier for them to cope with large volumes of traffic coming through. They also have trained teams capable of identifying anomalies early on, and know how to deal with them effectively. Providers can offload bogus traffic as the attack is ongoing, thereby preventing their clients' servers from being overloaded. The problem is tackled on the edge of the network instead. With the financial clout to develop or buy advanced automated tools that are a match for the latest types of DDoS attack, they are in a good, though not infallible, position to stop the worst of the damage from being inflicted.

Because there's the rub - it can be easy for IT professionals to delude themselves into thinking that by using Cloud servers to host their data and applications they are completely secure. They're not, clearly. One only needs to look at 2014's iCloud hack to see that there is the potential for infiltration. Even the largest, best resourced network can be overrun. There is a solid theory that the increased use of the Cloud is what's driving so many of these large-scale DDoS attacks - incredible amounts of disruption can be caused with just one target. What is key is having the wherewithal to deal with that situation quickly and effectively.

The ideal protection against DDoS attacks is twofold

The ideal protection against DDoS attacks is twofold. A third-party Cloud's secure environment, with automated tools such as Arbor Peakflow to manage and clean traffic, and proactive monitoring of networks to ensure any problems are spotted and stepped on immediately. And in-house measures, including but not limited to:

- Knowing the signs of being under attack
- Rate limiting your router and telling it to filter out obviously bad packets - it may buy you a little breathing space
- Knowing who at each provider you can call on for help
- Providing for emergencies with more hardware and bandwidth than you actually need - costly, but it could make all the difference.

### The Differences between EU and US Data Laws

European data laws have been in place since 1995. They were brought in as a reaction to the growing number of internet-based businesses owning large amounts of private data. The main idea behind the legislation is to ensure that use of private data either doesn't happen or that it is consensual.
[easy-tweet tweet="Data protection in Europe is very different to data protection in the US" user="Frontier_Tech @comparethecloud" hashtags="data"]

Below are some of the key tenets of the European Data Protection Directive:

[quote_box_center]
Data may be processed only under the following circumstances (art. 7):
- When the data subject has given his consent.
- When the processing is necessary for the performance of or the entering into a contract.
- When processing is necessary for compliance with a legal obligation.
- When processing is necessary in order to protect the vital interests of the data subject.
- When processing is necessary for the performance of a task carried out in the public interest or in the exercise of official authority vested in the controller or in a third party to whom the data are disclosed.
- When processing is necessary for the purposes of the legitimate interests pursued by the controller or by the third party or parties to whom the data are disclosed, except where such interests are overridden by the interests for fundamental rights and freedoms of the data subject.

The data subject has the right to access all data processed about him. The data subject even has the right to demand the rectification, deletion or blocking of data that is incomplete, inaccurate or isn't being processed in compliance with the data protection rules. (art. 12)
[/quote_box_center]

This focus on human rights and the interests of the individual rather than the collective is in stark contrast to the U.S. equivalent. U.S. data protection law is more reactive and tends to pander to the concerns of big industries. This was best demonstrated during the creation of the 'Framework for Global Electronic Commerce', when Bill Clinton and his vice-president Al Gore recommended that 'the private sector should lead' and that 'companies should implement self-regulation in reaction to internet technology'. This type of small-government attitude is indigenous to the United States, and their laissez-faire attitude to digital privacy is a hallmark of the American constitution. The implicit right to privacy is seen as guaranteed by age-old legislation, so they aren't too concerned with creating a new one.

Cultural differences

One of the reasons that the approach to privacy laws in the EU is so radically different from that of our American counterparts is history. We don't have a uniform constitution and tend to hold historical documents with less reverence. Another reason is that our history dictates our political philosophy. During WWII and the period after, Europe was rife with Communist regimes that actively used personal information as a way of benefiting themselves.

[easy-tweet tweet="European history has greatly impacted the way in which personal data is dealt with in the EU" user="comparethecloud" usehashtags="no"]

One massive example of personal data used in a malicious way was the use of personal information to send certain demographics to Nazi concentration camps. Although this type of situation will never happen again, there is now a certain stigma attached to the way that large governments and institutions manage private and personal data. As a reaction to this, certain companies and individuals have started to call for the EU to loosen their grip on data protection and start operating in a similar way to the U.S. There is a large group of service providers who would much rather see a more lenient system that operates on an ad-hoc basis, one that's more responsive to changes in technology and the market.
While the privacy of users must be valued, legislation needs to be able to move with the times and technology. Where do you stand on the issue?

### White Christmas Could Leave Businesses in the Red

A meteorological event forming over the equator has caused bookmakers to slash odds on a white Christmas. With 'El Niño' on the way, and UK forecasters predicting the harshest winter in 50 years, leading Cloud IT experts Net Solutions Europe (NSE) comment on the importance of resilient IT systems in mitigating the need for crisis management during the harsh coming months.

The nation's businesses could be forgiven for recounting the severe aftermath of the last winter of 2009/10, inspired by the 'El Niño Southern Oscillation'. Six years ago widespread transport issues, power failures, school closures and even severe flooding devastated parts of the country, Cumbria alone suffering £124m of damage to commercial property and businesses. With snow estimated to cost businesses nearly £500m per day across the country, the extended period of severe weather caused the economy to contract by 0.5% in Q4 of 2009. According to forecasting experts, the UK is set for even worse this winter, with odds on snow at Christmas dropping from 5/1 to 2/1 as conditions are set to rival the record winters of 1962/63, when lakes and rivers froze over, and 1946/47, when closed power stations forced the government to introduce measures imposing nationwide limits on power consumption. Even short-term power shortages in 2015, coupled with massive transport disruption or even flooding, have the potential to wreak havoc on the productivity of the nation's businesses.

"Premises losing power and issues with staff access are predictable effects of extreme weather conditions. The problem is further exacerbated when one internal server is responsible for housing all files and processes. Even the largest organisations can find themselves coming to a grinding halt."

"When a problem is predictable, a solution must exist, and one solution to the problem is cloud integration. Storing data in secure, purpose-built data centres not only creates confidence in the safety of vital information from both physical and online threats, but enables complete remote access to ensure productivity remains largely unaffected even when any members of staff are unable to get to work."

"The cost implications of investing in new IT systems can cause concern for some business owners," adds Tom, "but adopting and implementing Cloud into their organisation can often actually be a cost-neutral exercise, adding both flexibility and resilience."

### NetSuite partners with Tableau

NetSuite Inc., the industry's leading provider of cloud-based financials / ERP and omnichannel commerce software suites, today announced a partnership with Tableau Software, delivering a new data export feature to link Tableau's instant visual analytics to the NetSuite cloud.
The partnership combines the strengths of respective industry leaders—NetSuite in cloud ERP, CRM and ecommerce, and Tableau, named as a Leader in Gartner's Magic Quadrant for Business Intelligence and Analytics Platforms for the third consecutive time in February 2015. Key features and benefits to businesses include:

- Seamless business analysis: Through the joint solution, users can easily leverage ERP, CRM and ecommerce data to identify trends and key performance indicators (KPIs), and gain insights into overall business performance.
- Visual analytics: Easy-to-use data visualization can filter data dynamically, split trends across different categories or run an in-depth cohort analysis.
- Deep statistics: Export critical business data to run trend analyses, regressions, correlations and more.
- Interactive mapping: Beautiful, interactive worldwide maps that provide global insights into business performance.

Available now at no additional cost, the new export feature enables joint customers to export data from NetSuite into a native Tableau file format, which can be instantly consumed by both Tableau Desktop and Tableau Online, a hosted SaaS version of Tableau Server that makes cloud analytics possible in just minutes.

[caption id="attachment_32453" align="alignright" width="300"] Read more on NetSuite's latest announcements here[/caption]

"This integration opens new doors for joint customers to understand and act on NetSuite data in real-time using best-in-class analytics," said Dan Jewett, VP of Product Management at Tableau Software. "Robust analytics is a hallmark of leading-edge companies, and this makes it faster and simpler for NetSuite customers to take full advantage of Tableau to drive the business."

"Both NetSuite and Tableau are committed to meeting rising demand among customers for out-of-the-box integration that would let them go from data to insights in a few clicks," said Guido Haarmans, Senior Vice President of Business Development for Technology Partners at NetSuite. "Teaming up with Tableau and our joint development efforts addresses this demand and enhances the value of our respective technologies."

Tableau, a global leader in rapid-fire, easy-to-use business analytics software, has been operating on NetSuite since 2008. NetSuite OneWorld's strong financial controls, transparency and auditability were pivotal in Tableau's initial public offering in 2013, and continue to create tremendous value in Tableau's global financial operations and reporting across subsidiaries in Australia, Canada, China, France, Germany, India, Ireland, Japan, Singapore and the United Kingdom. Today, more than 24,000 companies and subsidiaries depend on NetSuite to run complex, mission-critical business processes globally in the cloud. Since its inception in 1998, NetSuite has established itself as the leading provider of enterprise-ready cloud business management suites of enterprise resource planning (ERP), customer relationship management (CRM), and ecommerce applications for businesses of all sizes. Many FORTUNE 100 companies rely on NetSuite to accelerate innovation and business transformation. NetSuite continues its success in delivering the best cloud business management suites to businesses around the world, enabling them to lower IT costs significantly while increasing productivity, as the global adoption of the cloud accelerates.

### Has the Safe Harbour Data Pact Denial Left You Feeling Well...Unsafe?

Now that the dust has settled, let's revisit last week's ruling regarding Safe Harbour.
In case you're unfamiliar with this international headline, the European Court of Justice ruled invalid a fifteen-year old data transfer pact between the United States and Europe.  When former National Security Agency (NSA) contractor Edward Snowden exposed the extent of the NSA's international data surveillance, he created a ripple effect. The result? Public anxiety around data privacy has become cemented in international law. What does this mean for IT service and solutions providers in the U.S. and U.K.? We can't definitively say, but here's a closer look at the situation... [easy-tweet tweet="Public anxiety around data privacy has become cemented in international law" user="followcontinuum" hashtags="data"] Is There a Trickle Down Effect? Right now you're probably thinking... but I'm not Facebook or Google. Aren't they the only ones who should be sweating bullets about this? True, the immediate impact will be experienced most by these tech mega-companies, but that doesn't necessarily mean SMBs are in the clear. Most of the coverage you'll read discusses how this decision affects the regulation of "big data." By this I mean the personal data companies collect regarding a person's social media activity or web search and purchase history. With (Un)safe Harbour, there's now uncertainty over whether this data can legally be distributed across the Atlantic. While sure, this is a problem for the Facebooks, Googles, and Amazons of the world, what about all of us who rely on their services to gather lead and client information for subjects outside of our borders? Can a U.S. company seeking to prospect in Europe legally gain access to a U.K. resident's web history to run targeted remarketing campaigns designed to hold their interest? We don't know at this point. Social and browsing data aren't the only examples of "personal data" as defined by the EU... [caption id="attachment_22110" align="alignright" width="300"] Read about European Data Reform here.[/caption] As we already learned in Getting Up to Speed with European Data Privacy Reform, to protect the privacy of citizens, businesses need to obtain consent from those European residents whose personal data they want to track. Additionally, with new changes to the single EU digital standard, all data that identifies an individual, whether directly or indirectly, will now be personal data. This includes IP addresses and pseudonyms. Does that then mean that this data can only be stored and accessed within the confines of the 28 EU member states?  It's also curious that the EU seeks to move to one, unified data regulation standard when this new ruling affecting approximately 4,500 companies establishes the precedent that each individual European country may choose how they regulate US companies' processing of personal data. [quote_box_center] The Community is Buzzing! While time will tell what comes of this judgment, a few leading publications are already weighing in. Tech Crunch, for instance, claims "companies will need to restructure their European data processing operations — such as building European data centres to process regional data," adding that "such shifts might require other significant procedural changes in how they manage user data flows." If you're an MSP operating in the U.S. with European clients or a European office, this potential change could be an incredible undertaking, both in terms of cost and time. 
The Financial Times also suggests that companies, especially cloud services companies like Amazon Web Services and Salesforce, may increasingly localise with data centres in Europe, potentially leaving SMBs at a disadvantage. In response to their findings, I wonder if AWS expanding its global infrastructure might mean smaller cloud hosting providers may not be able to keep up and MSPs may be forced to migrate to AWS. Think about the supply and demand angle here. If you're AWS and are experiencing a huge influx of businesses demanding your services out of necessity, won't you want to increase your price? Again, this is merely conjecture reflective of the uncertainty the ruling has introduced for businesses of all shapes and sizes.

[/quote_box_center]

What about U.K. MSPs and MSPs Considering International Expansion?

The Financial Times also draws attention to the millions that smaller EU companies may have to spend to take on "the strenuous task of shifting data held on American computing infrastructure back to Europe, or securing the technical legal agreements to satisfy the new data protection regime." Then, you have to consider those enterprising IT service providers that seek to take on clients in the U.K. (if they're a U.S. company) and the U.S. (if they're a U.K. company). There's an appeal for some larger U.K. MSPs to serve clients in the United States, since English is the primary language. With the scope of language fragmentation between European countries, targeting the U.S. may be the next logical step for U.K. IT solutions providers wanting to branch out of their territory. With this new data transfer grey area, does that mean that the majority of these businesses will delay expansion indefinitely?

Should We Expect any Loopholes?

The data protection authority (DPA) of the German federal state of Schleswig-Holstein has stated that any and all attempts to evade the European Court of Justice's judgment will be illegal, claiming "only a change in US law can make US companies compliant with European legislation." Furthermore, violators can expect to pay up to €300,000. Considering that Safe Harbor was a loophole that enabled U.S. companies to evade the full weight of the European Data Protection Directive via these seven principles, we should expect more resistance to the restriction of the free flow of data. In fact, we're already witnessing this. According to Business Insider, the US Department of Commerce still plans to carry out Safe Harbour, despite the fact that any arrangements made after the ruling "will not hold any legal weight with European authorities — meaning American companies who choose to take this route are opening themselves up to legal challenges from national regulators." And guess who they're directing further inquiries to, as expressed on their website? The European Commission (EC) or legal counsel. It's only been a short time since the ruling, and already we're facing a tangled mess of international bureaucracy along with business uncertainty.

[easy-tweet tweet="It's only been a short time since the ECJ #safeharbour ruling, and already we're facing a tangled mess of international bureaucracy" user="followcontinuum" usehashtags="no"]

Are there any other exceptions that are legally upheld? At a press conference, the EC's justice commissioner Vera Jourová confirmed data-sharing alternatives that can be used in the meantime while Safe Harbor negotiations are ironed out.
If you're an MSP operating in the healthcare IT vertical, it will behoove you to know that in life or death matters, a patient's medical records "can be transferred internationally in the person's own interest." Other exemptions include how you can transfer data in order to uphold a contract. Time will tell, but it's likely that more of these workarounds will surface as negotiations continue.

Closing Thought

The intent of this blog post was not to sensationalize a juicy news story and ignite public panic, nor was it to express that these are concerns of Continuum's. As a company, our official stance as reported in ChannelE2E's post is this:

"Without any concrete guidelines, it's too early to assess the full impact of this ruling. We are carefully monitoring the situation. As rulemaking proceeds, we will be assessing what, if any, actions may be required by Continuum or its partners to ensure compliance with European regulations." Rob Autor, Senior VP of Global Service Delivery at Continuum

As European Sales Director, I echo this sentiment. For now, we will wait and see what comes of this latest ruling from Europe's highest court. We recognise that many U.K. IT support providers have offices or clients in the U.S. and we'll be sure to update our European partners, should they have cause to worry. What I hope I've done instead with this article is present the facts of a timely, relevant tech story and highlight grey areas that need further defining. Uncertainty is never a good thing in business, and unfortunately, this decision has left a lot of MSPs - both in the U.S. and U.K. - scratching their heads. Are you one of them? What's your take on last week's judgment? Leave a comment below!

### Know your cloud providers

Choosing a Cloud provider is not simple. Not all vendors offer the same features, and often their backgrounds lend very different experiences to their current services. When deciding which provider to use it's helpful to consider them as a contractor and choose the one that best complements the overarching business strategy. Being aware of a provider's background and the unique experience that informs their current positioning within the Cloud market is important. It can strongly signal which are a good fit for the role they are expected to play in a business strategy. Most vendors have evolved from one of three main areas – Internet Service Providers (ISPs), Value-added Resellers or Mass Hosters – all of which have a great many positive features and, as always, a few negatives that should be considered in the decision-making process.

[easy-tweet tweet="Most vendors have evolved from one of three main areas #ISPs #VARs #MassHosters" user="fasthosts" usehashtags="no"]

Internet Service Providers (ISPs)

Many ISPs branched out into Co-Location (CoLo) as they already had large infrastructure networks. With limited changes required to resources and facilities, many began to offer managed services and from here naturally progressed to the Cloud. Despite a seemingly easy jump, ISPs can face challenges with automation, efficiency and maintaining industry standards at competitive prices. It's worth considering that infrastructure isn't always cutting edge, which could be a problem for some businesses.

Value-added Resellers (VARs)

VARs exist to buy and implement IT on behalf of other companies.
On-premise hardware was the traditional route, but with more companies realising the power of the Cloud, VARs realised that offering a complete solution also required Cloud hosting, and to do this they would have to partner with a hosting infrastructure provider. Customised infrastructure comes with a hefty price tag that can sometimes make the services unrealistic for SMBs. But, if cost is no problem, VARs perform to a consistently high standard without the need to be hands-on yourself. Mass Hosters These companies originate from shared web hosting, serving hundreds of thousands of websites across many servers. Over the years investment has centred on infrastructure, automation, resource management and speed of provisioning. In fact their best practice often sets the Cloud industry standard. Mass hosters make sure that they retain their loyal customer base with a mixture of standard products and fully customisable service wrap options, while always maintaining their focus on agility, price and features. [easy-tweet tweet="Understanding the pros and cons of each type of #Cloud provider is the most logical starting point" user="fasthosts"] Understanding the pros and cons of each type of Cloud provider is the most logical starting point and should assist in drawing up a shortlist of vendors that could fulfil strategic needs. To narrow this list further, these questions are key before signing any contract: What level of IT expertise do you have in-house and do you require from a vendor? Is the provider from a reputable brand that will remain profitable in a fluid marketplace? Is the vendor's knowledge of application and system-level managed services sufficient? Are the Support Team or available service wraps adequate? It is essential to cover the basics at the same time – ask about costs, security and uptime guarantees. Ultimately, whatever the vendor's background, make sure that the vendor has technical competency, great strategic partnerships and that any efficiencies offered are values that can and would be reflected across your business. ### Falling like Dominoes: Snowden, Safe Harbour, Crypto and the future of CyberSecurity In 2013, when Edward Snowden, the former CIA employee and former government contractor, leaked classified information from the United States National Security Agency (NSA), he set off a chain of events that have yet to subside. The slow-burning fuse that he lit caused a major explosion last week, with more predicted to follow. Safe Harbour, the legal framework for data transfers based on an agreement in 2000 between the European Union (EU) and the US, has been struck down in the European courts.  [caption id="attachment_32362" align="alignright" width="300"] Edward Snowden[/caption] Ever since the Snowden leaks it has been increasingly evident to most in the industry that the surveillance he unveiled and the data privacy regulations in the EU were mutually incompatible, and that it was therefore just a matter of time before Safe Harbour fell. Having seen the end of the Safe Harbour agreement coming for a while now, all the major cloud players had already taken steps to prepare for last week's ruling. The large firms, with a few exceptions, almost all now have data centres within the EU, and privacy and data management policies in place that adhere to local regulatory requirements.
[easy-tweet tweet="The Snowden chain reaction is ringing out through Europe with #safeharbour being struck down" user="billmew" usehashtags="no"] Smaller companies that lack the financial and legal means to get around the ruling as easily will be impacted more by the ruling. Most of the firms that relied on Safe Harbour simply don't have the resources - unlike Facebook, Google and Microsoft - to create alternative corporate legal structures for their transfers of data. They will therefore be left in limbo until each country rules on the Safe Harbour agreement.  [caption id="attachment_32361" align="alignleft" width="300"] Jimmy Wales[/caption] For those with adequate resources, there are of course are still mechanisms that companies can use to transfer data, including binding corporate rules and a presence in the EU to keep and store sensitive data.  Predicting exactly how each country will respond to the Safe Harbour ruling is almost impossible. At a recent briefing at IPExpo, Jimmy Wales, founder of Wikipedia expressed the concern that: “We may be moving to a balkanised era, where data has to be held in a country very specifically across many different jurisdictions." This balkanisation could lead to the creation of multiple country-specific Internets, each gated off legally and governed by its own privacy laws, impacting the ability to provide harmonised services across borders. In the longer run though, Wales believes that while data sovereignty will remain an issue, the focus will move from location to tokenisation and encryption – as tokenisation allows the data to remain in its country of origin and as all data becomes encrypted. This brings us to the next domino that is about to fall – cryptography. the next domino is about to fall – cryptography Most public key crypto-systems currently in use are built on integer factoring and discrete logarithms (such as RSA and ECC). The problem is that there is a quantum algorithm (Shor’s algorithm) that already exists, which, given enough computing power, can break them. And with the development of quantum computers there could soon be systems capable of capitalising on Shor’s algorithm to break these public key crypto-systems. Conversely there are symmetric key ciphers such as AES that are considered quantum-safe and will play an important role the post-quantum world of next generation cryptography. So what does this all mean for you? Each European Union company will need to designate a single person as data controller. This data controller will need to follow exactly how each country responds to the Safe Harbour ruling and to ensure that you comply with each regulatory requirement. [easy-tweet tweet="The #SafeHarbour agreement never provided any real protection for your #data" user="billmew @comparethecloud" usehashtags="no"] However it doesn't really matter what protective measures or agreements this person puts in place (the Safe Harbour agreement never provided any real protection), if there is a breach then you are fully liable. In addition to tokenisation to allow data to remain in its country of origin, your best bet is to ensure that all your data is effectively encrypted, but most current encryption will soon be rendered ineffective. If you are implementing a crypto system now then you need to be sure that it will protect the data for its entire lifecycle (and for things like patient data, this can literally be a lifetime). 
Indeed, for all of your data you need to be planning now for the transition to the use of quantum-resistant algorithms or quantum key distribution, or ideally both. You have been warned. ### HP quits Public Cloud; Dell buys EMC; What’s happening? Late last night under the cover of darkness HP put out a blog admitting that it is exiting public cloud as it is simply too far behind to ever catch up, and will now bet everything it has on an OpenStack hybrid model. This follows the announcement of the largest acquisition in tech history (Dell EMC). What’s happening here, and what (if anything) can we learn (or afford to ignore) from tech trends of the past? [easy-tweet tweet="#CloudNews: @HP quits public cloud" user="comparethecloud" hashtags="cloud"] what we can learn (or afford to ignore) from tech trends of the past Hypothesis 1: This time … first mover advantage is real  Historically, companies in the tech sector have rarely benefitted from first mover advantage. Instead, fast followers have normally been able to avoid the mistakes of first movers and capitalise on the potential of the emerging markets that the first movers have created for them. This was true in the past – MS-DOS was not the first operating system and Lotus 1-2-3 was not the first spreadsheet. And it is as true today – Google was not the first search engine. Facebook was not the first social network. Groupon was not the first deal site. And Pandora was not the first music site. It is arguable, however, that in public cloud AWS is benefiting from a real first mover advantage. Why is this? Firstly, the capital costs and barriers to entry in public cloud are significant. AWS has had to spend about $1 billion each quarter building out its cloud platform. And secondly, differentiation is hard to achieve in cloud, meaning that fast followers are struggling to overcome AWS’s scale advantage by offering obviously improved offerings – usually the approach that fast followers take. Google could have entered the market at the same time as AWS, but chose instead not to re-sell its own capacity until after Amazon had already perfected the model of selling it on a self-service basis, thus ceding first mover advantage entirely to AWS. Microsoft has had to leverage its Office franchise in order to get Azure into a competitive position. IBM was late to respond with its purchase of SoftLayer, albeit nothing like as late as Oracle and others in their response to the cloud threat. Rackspace lacked the ability to maintain a level of investment to match what Amazon sustained for AWS. Hypothesis 2: This time … it’s not a bubble (yet) Not only have we just seen the largest acquisition in tech history, but in the last 18 months the number of privately held technology companies in the US worth more than $1bn has risen from 30 to 80. What’s happening here?
[easy-tweet tweet="The Dell EMC deal is a major wake-up call to other mature tech stalwarts says Daniel Ives" user="comparethecloud" usehashtags="no"] As the WSJ reported FBR analyst Daniel Ives, one of the regulars in our Global #CloudInfluence rankings, has called the Dell EMC deal a “major wake-up call to other mature tech stalwarts.” He also adds: “We believe there will be wide-reaching ramifications across the tech space for years to come from this transaction.” Ives predicts that whether or not the Dell EMC deal is deemed a long-term success, it will force the hands of its competitors making them more aggressive on M&A, or “risk losing their seat at this game of high-stakes poker.” The WSJ blog goes on to give Ives’s list of possible deal candidates. At the same time the jump from 30 to 80 in the number of privately held technology companies in the US worth more than $1bn isn’t a sign that we are suddenly producing a wave of unicorns. It is simply an indication that these firms are delaying the point at which they move to IPO. The market hasn’t been entirely favourable and there is an increased level of investment by corporate investors including Google, Rakuten, Alibaba, Comcast and others that aren’t necessarily looking for the same speed or level of return that VCs need to. There are more big happenings coming On the heels the $67 billion Dell EMC deal On the heels the $67 billion Dell EMC deal, Symantec is spinning out Veritas, Western Digital is buying SanDisk for $19 billion, EMC is spinning out Virtustream and HP is selling TippingPoint To Trend Micro For $300M. If, as Ives predicts, this continues for the next few years with sustained M&A activity, then the buyers will mainly be the traditional tech companies and so-called pure play technology vendors. Being HP, Cisco, IBM, Microsoft and Oracle, the deals should be relatively large – not only are there more potential targets worth more than $1bn, but while smaller deals will continue, larger ones will be needed to deliver the necessary impact in the necessary timeframe. One particularly pointed article in Wired went further than Ives, describing HP, Cisco, Dell, EMC, IBM and Oracle as the walking dead, saying: “Oh, sure, they’ll shuffle along for some time. They’ll sell some stuff. They’ll make some money. They’ll command some headlines. They may even do some new things. But as tech giants, they’re dead.” as tech giants, they’re dead. If Ives’s prediction of a sustained level of M&A activity materialises, it will maintain values (at least in the immediate term), but in its aftermath there will be far few buyers or targets left standing. Once the dust settles the players left in the game will be very different to those of today. Hypothesis 3: This time … it will pay to be lean, clean and green In the past firms have at times made a play of being lean, clean or green. In future they will need to be all three. Lean: In a lower margin era in which differentiation is hard to create or sustain and competition is fierce, the industry simply won’t be able to sustain the number or size of players that it could in the past. Already we are seeing automation change the industry, with a battle just for top talent. We will see a bifurcation between a small number of massively talented employees addressing the most complex challenges, and a commoditised arena where lower skilled staff either find their roles automated or wages undermined. 
Whether you’re a developer, a marketer or any other member of staff, if you’re not among the top talent in your area of operation then you will be one of the others. Clean: Following the Snowden revelations, firms in the US were tarnished with an assumption that they were either colluding with the US government or vulnerable to its investigative powers. They have been quick to demonstrate their independence – with Microsoft resisting US government efforts to access data residing in Dublin and with Apple telling a U.S. judge that accessing data stored on a locked iPhone would be "impossible" with devices using its latest operating system. Survival for such firms will depend on having a clean image and a credible level of independence. Green: Some data centre operators have made a big play of using renewable energy for at least some of their newest or more prominent facilities. While this may be a point of newsworthy differentiation at present, it will soon become not only the norm, but also an absolute necessity. ### Do enterprises want to operate or consume an OpenStack cloud? When OpenStack was first developed, it seemed that every business had grand plans to adopt this futuristic technology. Enterprises had been crying out for an IT solution which delivered a fast, scalable, cost-effective service, and many organisations anticipated that within a couple of months they would have migrated all their IT requirements onto OpenStack. [easy-tweet tweet="#OpenStack has become a compatibility standard for the private #cloud market" user="comparethecloud" usehashtags="no"] OpenStack has become a compatibility standard for the private cloud market. Some traditional enterprises boast great success stories: despite not being ‘born digital’ like Facebook or Netflix, giants such as Disney and BMW are successfully using OpenStack with private clouds and are proof that it is ready for production environments. The majority of companies, however, are no closer to implementing OpenStack solutions in-house than they were two years ago. The reasons for this vary – whether cultural, operational or political – but the main barrier seems to be that in-house IT departments have yet to start treating infrastructure as a commodity. cultural, operational and political barriers are still preventing OpenStack adoption Perhaps this is not surprising: many in-house IT teams currently lack the operating models, business structures and skillsets required to efficiently and cost-effectively manage the transition from proprietary models to OpenStack. Similarly, whilst OpenStack is an excellent choice for enabling new, innovative production environments, it is currently too expensive and complicated to rewrite applications in order to migrate existing estates to the cloud. Enterprises face an ‘if it ain’t broke, don’t fix it’ mentality. [easy-tweet tweet="Enterprises face an if it ain’t broke, don’t fix it mentality." user="comparethecloud" hashtags="openstack"] Such reluctance to migrate away from existing infrastructure is only exacerbated when organisations do not feel equipped to make sense of the key challenges, decisions and expertise needed to get started with OpenStack. Operating an OpenStack private cloud requires a specific, technical skill set and enterprises are often discouraged by the complexity of the technology facing them. OpenStack operators must demonstrate a skilled architectural approach and advanced system administration skills.
They must have an appetite and a high tolerance for risk, and be willing to engage with and contribute to the open source community. The ideal team therefore requires a deep knowledge of modern DevOps, traditional systems administration and network engineering. Most traditional IT professionals are likely to specialise in only one of these areas – it is a rare combination to find all three in one employee, let alone a whole team. Similarly, whilst traditional in-house IT operators are used to manual configuration, the OpenStack operator’s role is, by contrast, steadily shifting towards codifying the design of infrastructure in software. This results in speed, agility, repeatability and cost effectiveness, but a more complicated underlying infrastructure. There are few cloud vendors adept enough to operate OpenStack cloud platforms. For most businesses the challenge is almost impossible. There are few cloud vendors adept enough to operate OpenStack cloud platforms Although the barrier to entry for operators may have risen, it has never been easier to be a consumer of OpenStack. Whilst operators must devote time to improving the platform and evolving service offerings, any developer can now spin up and use infrastructure without investing time, money and resources to manage the stack themselves. The era of the cloud tenant has arrived. So, what does it require to be an adept consumer of OpenStack? Infrastructure may be easier than ever to spin up, but in-house IT departments still need to take responsibility for securing and stabilising platforms effectively.  Consuming a hosted cloud requires a different skillset to operating it. A consumer does not just need to focus on what distribution to buy and on which hardware to deploy it. The savviest cloud tenants are able to evaluate the maturity of service offerings, assess platform security performance and monitor usage to assess cost effectiveness. An adept cloud tenant becomes a confident user of IaaS and provider of PaaS on top of that infrastructure. One of the most important decisions an OpenStack consumer will make is choosing a vendor partner. Whilst adopting OpenStack directly avoids vendor lock-in and mitigates licensing costs, most enterprises do not have the resources or confidence to develop in-house expertise. A vendor committed to open source software should help you navigate around lock-in and manage the continuous upgrades, bug fixes and releases offered by the OpenStack platform. An experienced vendor will become an invaluable extension of your in-house team. There is no doubt in the IT enterprise psyche that OpenStack private cloud offers significant advantages; OpenStack addresses regulatory requirements, network latency and security concerns and allows a business to become more agile, innovative and competitive. OpenStack cloud tenancy unlocks the door for those enterprises historically prohibited from operating this new technology due to cost, skill and resource deficiencies. The ability, and desire, to become an OpenStack consumer marks a long-awaited tipping point in the traditional IT mind-set and a significant step towards enterprise IT professionals treating infrastructure as a commodity. ### Egnyte Launches Industry’s First ‘Smart’ Reporting & Auditing File Service Leveraging Big Data to Reduce Cost, Reduce Risk, and Increase Visibility Egnyte, the market leader in Adaptive Enterprise File Services, today announced the launch of its all-new Smart Reporting and Auditing service.
This new service allows organisations to leverage comprehensive dashboards of system-wide analytics around content (creation, editing, viewing, sharing), users, devices, applications, and more to optimise their business’ internal and external data environments.    [easy-tweet tweet=".@Egnyte Launches Industry’s First ‘Smart’ Reporting & Auditing File Service" user="comparethecloud" usehashtags="no"] The Smart Reporting and Auditing service is the first of its kind and exclusive to Egnyte customers, leveraging the company’s hybrid technology to provide comprehensive visibility and control across an organisation’s entire content lifecycle. This includes file infrastructure, which can reside in the cloud, on premises, or a mixture of both, as well as user applications and devices (Watch video: Customers benefitting from Smart Reporting and Auditing) “There is a tremendous opportunity for organisations to take advantage of all the detailed file analytics and insights we’ve collected within their infrastructure, applications, and devices,” said Isabelle Guis, chief strategy officer at Egnyte. “Our focus on delivering this data in an actionable format will create an unparalleled level of efficiency and security that no other company in our space has the ability to do. It’s critical for businesses to have tools like Smart Reporting and Auditing when collaborating internally and externally, not only to optimise file infrastructure and protect against potential threats, but also to gain a competitive advantage.” The actionable intelligence created by Egnyte’s Smart Reporting and Auditing service will: Reduce Cost - Organisations will save money by making informed, data-driven choices about their infrastructure locally and globally — reducing bandwidth consumption, minimizing support issues, and increasing productivity. Reduce Risk - IT leaders are provided access to unique insights about user, file, and device activity that can be harmful to their organisation. The Smart Reporting and Auditing service helps maintain security and compliance with preventative alerts that notify IT about suspicious activities, internally and externally. Increase Visibility - IT and business users will be empowered to make smart decisions at every level with relevant, viable information that is delivered right to their Egnyte interface. IT will have more visibility into the security and efficiency of their entire corporate infrastructure and users will also gain insight into their usage patterns as well as internal and external collaborative efforts. "With content being at the core of just about every business process today, users are accessing files across multiple devices, anywhere, any time,” said Terri McClure, Sr. analyst at Enterprise Strategy Group. "While this data is extremely valuable, it is entirely too costly and time intensive to try and export data to spreadsheets and analyse it manually, as there is simply too much data. 
In order for organisations to solve this big data analytics problem, the availability of solutions like Egnyte’s Smart Reporting and Auditing service, which provides unique insight into how and where content is accessed and used, as well as by whom, will be increasingly important.” [quote_box_center]Egnyte customers enjoy the Smart Reporting and Auditing service for its positive impact on their businesses: “As the largest general contractor in Silicon Valley, we collaborate with a wide variety of sub-contractors on jobs that require real-time access to very large files — confidential design plans and blueprints,” said Joe Tan, director of IT at Devcon Construction. “Egnyte’s Smart Reporting and Auditing service gives us complete visibility into how the files are being shared and accessed, so we can effectively manage desktop vs. tablet device workflows out in the field. Since we often cooperate with our competitors, it also allows us to constantly monitor where those files are created, stored and distributed, so when a project is complete we can immediately remove access to our clients’ information.” “Egnyte is making the experience for administrators a little bit easier,” said Patrick Aitken, director of sales and business enablement at Marin Software. “The Smart Reporting and Auditing service is a game changer from an admin point of view. You’re able to look and see file analytics, as well as user statistics, internally and externally. I can easily login and check out regions and activity on links to monitor the effectiveness of content our teams have shared and understand which files are being accessed and by whom."[/quote_box_center] Egnyte’s Smart Reporting and Auditing service is available immediately for all new and existing customers. For more information on Smart Reporting and Auditing please visit www.egnyte.com/SmartReporting ### Why telcos would rule the world if they got their act together Telcos have certain unique characteristics and capabilities that could easily provide real competitive differentiation for them in a number of market niches. If they were innovative service and client oriented firms, rather than utilities simply looking to get a return on investment from sweating their infrastructure assets, then their competitors would need to look out. [easy-tweet tweet="Could a #telco be threatening your business in the future?" user="billmew @comparethecloud" usehashtags="no"] Telcos could rule the world of Hosting At a time when hosting providers are investing enormous sums in new data centre facilities, telcos have legacy real estate assets in the old exchanges that could be incorporated into a virtual data centre network with points of presence in almost every locality. BT, for example, owns hundreds of properties within the Greater London area. Many are exchanges that once housed bulky telecommunications equipment, but they are typically now largely empty apart from a set of switching gear in one corner. The rest of the space could be used to host racks upon racks of compute and storage. Every one of these properties is ‘on net’ and collectively they could be used to provide a compelling hosting proposition. The cost to BT would be limited. It would simply be making better use of existing real estate. Other legacy telcos are in a similar position. And none of their rivals has this vast array of ‘on-net’ real estate assets.
Telcos could rule the world of Cloud Services In a keynote at Telco Cloud in London earlier this year, Teliasonera made an interesting prediction of how the cloud market would play out, saying that “the winner in this business is the one who creates the most compelling offering based on services that others have innovated.” As we are seeing with the emergence of Cloud Service Brokers and MarketPlaces, it is often all about bringing together the right set of services and making them available in an easy and compelling manner to a focused and relevant client segment. In cloud services the telcos are taking on the big boys of the tech world, from legacy firms like IBM and HP to new players like AWS and Google. [easy-tweet tweet="#Telcos already have a direct relationship with almost every consumer and SMB in the country" user="billmew" usehashtags="no"] The thing about these large rivals, and in particular IBM and HP, is that they are predominantly focused on serving large enterprises and, despite their best efforts at channel management, have always struggled to reach or be relevant to consumers or SMBs. The telco advantage here is that they already have a direct relationship with almost every consumer and SMB in the country, with a monthly bill for usage-based services. These services have already expanded from telecommunications to broadband and TV, and could easily be extended further using the right marketplace to offer a range of consumer-oriented or SMB-oriented services. They don’t even need to innovate to create these services; they simply need to bring together services that others have already innovated in a compelling manner. The providers of the services would jump at the potential market reach that this would provide. Again telcos are in a unique position here. None of their rivals has this reach or indeed this existing billing relationship with the entire consumer and SMB community. Telcos could rule the world of Apps Each mobile phone has a unique identifier associating it with a particular user. It’s the phone number. In addition to knowing who you are, a mobile telco also knows exactly where you are, and they have a billing relationship with you. These characteristics are invaluable to any app developer. Why is it that when you log into an app on your smart phone, you still have to use your Facebook or Twitter ID to identify yourself? The telco already knows who you are. Why is it that phones use GPS to find where you are for location-based services, when the telco already knows where you are? Why do you need to use a credit card or Paypal to pay for a service when the telco already has a billing relationship with you? Unfortunately telcos have been slow to offer APIs for mobile app developers, so these developers have needed to find other ways of doing these things. Any telco anywhere in the world can almost instantly find you – they have to do so to route a call through to you, and it would be quicker and easier to get identity and location information directly from telcos rather than use the existing work-arounds - if only the right telco APIs were available to the developer community. Again telcos are in a unique position here. None of their rivals can identify and locate or even bill you as easily as they can. telcos are in a unique position In reality I am not suggesting that telcos could, or would, dominate any of these markets, but that they have a compelling competitive advantage that could provide them with a valuable proposition to offer to each ecosystem.
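To make that concrete, here is a purely hypothetical sketch of the kind of developer-facing API a telco could expose for identity, location and carrier billing. The base URL, endpoints, field names and credentials below are invented for illustration only - no such standard telco API is implied to exist.

```python
# Hypothetical sketch only: what a telco-provided identity/location/billing
# API might look like to an app developer. Every endpoint and field name
# here is an assumption made up for illustration, not a real service.
import requests

TELCO_API = "https://api.example-telco.com/v1"            # invented base URL
HEADERS = {"Authorization": "Bearer developer-api-key"}   # invented credential

def verify_subscriber(msisdn: str) -> dict:
    """Ask the network - rather than Facebook or Twitter - who this number belongs to."""
    r = requests.get(f"{TELCO_API}/identity/{msisdn}", headers=HEADERS, timeout=10)
    r.raise_for_status()
    return r.json()

def locate_subscriber(msisdn: str) -> dict:
    """Coarse, network-derived location - no GPS fix needed on the handset."""
    r = requests.get(f"{TELCO_API}/location/{msisdn}", headers=HEADERS, timeout=10)
    r.raise_for_status()
    return r.json()

def charge_to_bill(msisdn: str, amount_pence: int, description: str) -> dict:
    """Put a purchase on the subscriber's existing monthly bill."""
    payload = {"msisdn": msisdn, "amount_pence": amount_pence, "description": description}
    r = requests.post(f"{TELCO_API}/billing/charges", headers=HEADERS, json=payload, timeout=10)
    r.raise_for_status()
    return r.json()
```

The point is not the specific calls, but that the network operator already holds the identity, location and billing relationships that app developers currently reconstruct through social logins, GPS and card processors.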
The thing is that it is not the way they work and not what they do. It could be, but it isn’t. Think of the missed opportunity cost here and multiply it by the number of telco clients worldwide (that’s almost all of us, in the developed world at least).  [caption id="attachment_32326" align="aligncenter" width="300"] Listen to Friday Night Cloud Episode 4[/caption] ### Commvault unveils next generation portfolio New portfolio includes an open data platform, redefining the data protection market Commvault has announced the next generation of its market-leading solutions portfolio, delivering a broad spectrum of innovation designed to help customers address the increasing challenges of managing data and information in highly disrupted, fast-paced global business environments.   [easy-tweet tweet="The cloud is just another location at the end of the day" user="Rhian_Wilkinson" usehashtags="no"] I spoke with Greg White and Phil Curran on Monday about the new announcements, and I'm pleased to report they sound great. (This would be a bit dull if they were awful!) With a focus on the Open API architecture, White and Curran stressed to me that the improvements provide universal access to data under management in its standard captured format - avoiding storage device lock-in.  Commvault now gives customers and third party software partners the ability to write their applications directly to the Commvault Data Platform. This allows data governance policies to be defined for all data from the moment of its inception through its lifecycle. Customers can then benefit from common data management services of the platform across all data sources (from Commvault and third-party solutions and applications).  New Commvault innovations will permit customers to more easily consume new technology to evolve from traditional environments, close functionality gaps, eliminate the headaches of point solution integration, and address a range of emerging data needs from targeted requirements up to full, web-scale projects. In addition, new Commvault solutions can help shield customers from market consolidation disruption concerns while simultaneously assuring data protection, security, and compliance.  Together, integrated software applications and the new Commvault Data Platform represent the eleventh major version of Commvault solutions. “Commvault’s next generation software and platform are uniquely designed to help customers activate their data to unlock critical business insight and drive new value from their technology investments,” said N. Robert Hammer, Chairman, President and CEO of Commvault.  “The new Commvault release both leapfrogs point solution vendors in capabilities and functionality while also showcasing the performance, security, compliance, and economic benefits of a holistic data management strategy. For customers, our new platform makes it possible to better manage the challenges of disruption, whether they are seeking to move data to the cloud, manage data in the cloud, provide secure access to data, consume new technology to replace legacy systems, move from closed proprietary systems to open, or assure a sound data management strategy in the midst of unprecedented market change and consolidation.
Now, more than ever, companies are increasingly recognising the need for a trusted partner to assure a solid data management foundation – Commvault is that foundation.” [quote_box_center]Commvault’s new solutions portfolio includes a broad range of innovations, all designed to drive additional business insight and value for customers, that redefine the market in these key areas:
An open, standards-based approach: Runs natively in traditional on premise, hyper and converged environments, and across hybrid, private and public clouds, for protection, retention, access and compliance
Enabling disruption: Single, unified data management approach takes the risk out of adopting new technology, and helps companies deal with the uncertainties arising from marketplace disruption
The end of lock-in: Traditional data software format lock-in is eliminated, setting a new standard in hardware and infrastructure options for customers
Move to the cloud: Provides the bridge from legacy to transitional hybrid to true exploitation of the public cloud; simplifies native moves to and from the cloud to enable DR; fully migrates business critical workloads (compliance, data governance and backup) into the cloud; and assures data security to, from, and in the cloud through advanced authentication and encryption protocols
Redefines archive: Transforms inaccessible digital landfills into directly accessible active data sets through familiar user experiences (file explorer, Outlook, mobile, etc.)
Hardware array & converged and hyper-converged infrastructure ready: Leading the industry in snaps and replication management across HW arrays and infrastructures provides flexibility and choice
Governance throughout the data lifecycle: Apply active governance from when data is created based on content and context, rather than applying methodologies only after the data is created
End-to-end data security built in: Security is fully integrated for data in all locations, with management, audit and compliance monitoring & reporting throughout
Shift and reduce traditional IT spend through smarter IT: Speeds time to value, consolidates point solutions, and shifts resources from low to high value through faster business process execution and increased automation
Total Search: Search across all data (live and versioned), in all locations, across structured and unstructured data [/quote_box_center] Commvault’s next generation software and services offerings were already deployed with early adopter customers in a managed release, with full general availability in December. As of this latest version of its software, Commvault is moving to a continuous innovation pipeline with new features, functionality, and capabilities delivered in a regular and continuous fashion matching customer demand and market dynamics, and in keeping with the reality that more than half of the companies relying on Commvault are today consuming Commvault software as a service rather than on premise, through the company’s extensive, global partner network. ### Study Shows 93 Percent of IT Pros Consider Wire Data Foundational to ITOA ExtraHop, the global leader in real-time wire data analytics for IT intelligence and business operations, has released the findings of a report conducted by industry research firm TechValidate, revealing how organisations are building out their IT Operations Analytics (ITOA) practices.
The study, which polled nearly 100 CTOs and other IT decision makers at Fortune-1000 organisations, shows organisations are thinking more strategically about big data analysis for IT (ITOA), with 65 percent already combining data sources or planning to do so within the next year.  According to Gartner, "[d]emand for ITOA is intensifying among large and small to midsize enterprises. As a result, spend on ITOA, which approached $1.7 billion in 2014, is expected to grow by approximately 70% through 2015." The popularity of ITOA is on the rise based on the variety of use cases reported by respondents including application performance monitoring (90 percent), root-cause analysis (86 percent), network performance monitoring (85 percent), and IT security (67 percent). In terms of big data types, wire data and machine data ranked as the most valuable sources for IT visibility. Additionally, organisations that combine multiple data sources for ITOA glean greater insights from the data, resulting in quicker, more accurate data analysis. IT professionals who are combining wire data with other sources integrated the following data types: machine data (54 percent), agent data (31 percent), and probe data (26 percent). When it comes to wire data specifically, the research shows a majority (60 percent) of IT professionals understand the value and benefits of this data source for obtaining deeper ITOA insights. Ninety-three percent of IT decision makers rely on wire data sources for IT management and monitoring, more so than other traditional big data sources like machine, agent and probe data.  [quote_box_center]Key findings around wire data’s strengths as a data source for IT professionals include:
Seventy-three percent believe wire data provides a more comprehensive view of everything within the IT network environment
Sixty-seven percent cite the ease of access to the insights wire data provides without any extra installs on servers
Seventy-one percent of IT professionals look to wire data as an untapped resource used to obtain new insights other forms of data cannot provide
Seventy-five percent believe it is the most objective form of network-based data [/quote_box_center] “As awareness of Big Data matures, we are seeing faster and more widespread adoption of ITOA technology, and the importance of wire data as a source of insight has become a key topic in conversations with our customers,” said Jesse Rothstein, CEO, ExtraHop. “Businesses rely on IT as the lifeblood of their organizations, so the stakes are high for keeping that machine running as efficiently as possible. That’s where ITOA has really proven itself as a high-value, must-have initiative.” For more survey findings and details on ITOA, click here to download the full report: The State of ITOA Today. ### Why cloud is no longer a hard sell Analysts have been predicting significant uptake of cloud computing for a number of years, but businesses have been slower than expected to take advantage of its many benefits. However, attitudes are now changing, and cloud is no longer a ‘hard sell’ to the boardroom. Our research among in-house IT teams shows some of the reasons why. For the last four years EACS have carried out an annual ‘Optimise IT’ survey in which we asked IT directors and managers about their business priorities, technology priorities and the barriers to success.
While cloud was always rated in the top ten technology priorities, this year it has moved up the rankings to fourth, with only security, mobility and the end user experience rated as higher technology priorities. [easy-tweet tweet="#cloud is now the #4 technology priority" user="EACSLTD and @comparethecloud" usehashtags="no"] Our survey also found that not only has cloud become a higher priority for organisations, it has become much less difficult for the IT department to obtain buy-in from the business. This has been backed up by our experience working with organisations across all sectors to update and improve their infrastructure. This change is due to the fact that organisations are looking at cloud in a completely different way - not simply an alternative way to provide services, but as a means to address the barriers to success. When cloud first appeared it was seen as a potential replacement for traditional IT platforms, and the key consideration was price – did it offer a more cost-efficient alternative to traditional solutions? However, our survey identified that the top three barriers to success are resources, budget and time. Organisations realise that cloud can tackle all of these by helping them address the lack of time and the shortage of skilled staff they need to improve efficiency and productivity. [easy-tweet tweet="Replacing members of staff costs an average of £30,614 per employee" user="EACSLTD @comparethecloud" usehashtags="no"] There is a huge skills shortage in the IT sector, and a report from Oxford Economics found that replacing members of staff costs an average of £30,614 per employee. Two main factors make up this cost: the cost of lost output while a replacement employee gets up to speed, and the logistical cost of recruiting and absorbing a new worker. The first cost is by far the most dominant; they found that on average workers take 28 weeks to reach optimum productivity, which has an attached cost of £25,181 per employee. Cloud can help organisations address this: for example, using cloud as a service takes care of provisioning, which frees up in-house staff time for other, more value-added activities. We are not yet seeing many companies migrating their line of business applications and their core IT platforms into the cloud. The majority still prefer to keep these in-house or to use applications that have specifically been developed in the cloud, such as Salesforce.com. What is going into the cloud are applications such as end user email, and applications that encourage collaborative working, at a fixed price per month. In addition, we see a large adoption of offsite backup and business continuity solutions utilising the cloud.  Applications typically stay on premise if the application vendor does not support cloud technologies Applications typically stay on premise if the application vendor does not support cloud technologies, or the application integrates with a large number of other applications and services. These cannot be hosted effectively as their performance can be affected by network latency associated with offsite hosting.  [caption id="attachment_32326" align="alignright" width="300"] Listen to Friday Night Cloud Episode 4[/caption] The next move will be to transfer line of business applications into the cloud, enabling users to access company databases and file stores from any device anywhere.
This will become more attractive now that Microsoft has wrapped desktop applications into its cloud service as RemoteApp, as long as application latency and performance can be addressed. Risk and compliance continue to be important considerations, but our survey found that they are no longer insurmountable barriers to the adoption of cloud. Instead, they are viewed as technical challenges which can be addressed by the IT team with appropriate choices and due diligence. Cloud also helps organisations to address the second and third technology priorities identified by our survey: mobility and improving the end user experience. Users want better access to corporate resources and to be able to consume them from any location, including at home via their home web browser. Cloud helps to make this possible, enabling them to be productive and access their line of business applications wherever they choose to work from.  It also addresses the needs of younger employees, who are familiar with cloud technology from their personal applications and expect the same level of connectivity in the workplace. [easy-tweet tweet="Millennials are so familiar with cloud technology that they do not like the traditional desktop based way of working" via="no" usehashtags="no"] An extension of this is that many businesses are finding that young people are so familiar with cloud technology and working across multiple platforms that they do not like the traditional desktop-based way of working. As a result, organisations are turning to corporate equivalents of social applications such as Yammer, while actively encouraging their staff not to rely on email, as it reduces their ability to innovate as well as human interaction.  Our survey did not identify any clear trend in increased use of cloud by sector. Finance, banking and the public sector have increased compliance concerns, but these can all be addressed by an appropriate choice of solution and do not appear to be a deterrent. We recently spoke to an investment company that needed an infrastructure refresh, and we suggested three alternative solutions: on premise, hybrid (using the cloud for disaster recovery) or a full cloud solution. Our recommendation was to adopt a hybrid cloud solution as we believed that some of the cloud technologies required for a full migration were too new and untested to be considered.  Slightly to our surprise, they chose an all-cloud solution as they liked the availability and elasticity it provided and were keen to use the most up-to-date technology in their business, whilst accepting the risk of working with bleeding-edge technologies. Cloud, then, is a much-hyped technology which is finally becoming accepted in the boardroom as well as by the IT department. It enables businesses to address their key challenges of limited resources, budget and time, while making life easier for users by supporting mobility and improving the end user experience. ### NetSuite Lands in Europe For years now, the main question posed to NetSuite by European analysts has been “what about a real presence in Europe?” They were looking for a European data centre footprint. Last week Safe Harbour, the legal framework for data transfers based on an agreement in 2000 between the European Union (EU) and the US, was struck down in the European courts. In the immediate aftermath of the ruling, NetSuite announced the opening of not one, but two data centres in Europe – in Dublin and Amsterdam.
[easy-tweet tweet="The timing of NetSuite’s European #datacentre announcement could not have been better or more fortuitous" user="comparethecloud" usehashtags="no"] Napoleon once said: "I have plenty of clever generals, but just give me a lucky one." The timing of NetSuite’s European data centre announcement could not have been better or more fortuitous. Now, NetSuite CEO Zach Nelson and Co-Founder and CTO Evan Goldberg didn’t just pop out to the shops the day after the Safe Harbour ruling and pick up a couple of data centres. These things take a lot of strategic planning, but the better your strategy and the harder you work, the luckier you get. As American Founding Father Thomas Jefferson is said to have put it: “I’m a great believer in luck, and I find the harder I work the more I have of it.” NetSuite didn’t just pop out to the shops the day after the Safe Harbour ruling and pick up a couple of data centres... Of course all the major cloud players have seen the end of the Safe Harbour agreement coming for a while now and have taken steps (as NetSuite has) to prepare for this. Smaller companies lack the financial and legal means to get around the ruling as easily, and will be impacted more. There are still mechanisms, however, that companies can use to transfer data, including binding corporate rules and a presence in the EU to keep and store sensitive data. In the longer run though, while data sovereignty will remain an issue, the focus will move from location to tokenisation and encryption – as tokenisation allows the data to remain in its country of origin and all data is becoming encrypted. Whether or not NetSuite is simply lucky or has been working diligently to create its own good luck, the firm has been getting a lot of things right recently: Both data centres are powered by 100% renewable energy sources, which should appeal to firms seeking to limit their total energy consumption. The firm also launched a new Android-based version of its mobile client interface, which now includes full Netsuite functionality, and a version for Windows mobile devices is reportedly coming soon. There have also been big client wins with firms like Pret-a-Manger, WHSmiths and Misys (now NetSuite’s largest client with 5,500 users). At the same time NetSuite also announced a partnership naming Capgemini as the sole distributor of NetSuite in France. It isn’t clear whether the firm would strike such a deal in any other market (Capgemini already accounted for a high percentage of NetSuite’s business in France). [caption id="attachment_32454" align="alignleft" width="300"] NETSUITE CLOUD TOUR EUROPEEvent@Emirates Stadium in LondonPIX.Tim Anderson[/caption] Channel partners (known as business partners or BPs) play a vital role in extending a vendor’s market reach and adding peripheral functionality to extend the vendor’s core offering – typically with vertical market IP into vertical niches where the BPs play. However the BPs tend to be paranoid that the major vendors will either steal their largest clients/deals, or will extend their offering to eclipse the added value that the BP provides. [easy-tweet tweet="NetSuite needs to continue working diligently, on its core platform, and on its channel management." user="NeilCattermull" usehashtags="no"] NetSuite doesn’t have the marketing clout to capture the market on its own and it needs a thriving ecosystem to propel it forward. However, it also wants to provide more functionality for its clients by launching industry specific implementations. 
This risks eclipsing some of the value that current partners provide in tailoring NetSuite for industry niches. [caption id="attachment_32455" align="alignright" width="300"] NetSuite Cloud Tour Europe event at the Emirates Stadium in London. Pic: Tim Anderson[/caption] In order to maintain its run of good fortune, NetSuite needs to continue working diligently, not only on its core platform and its alliances with major players like Microsoft, but also on its channel management. It needs to assure its channel that the Capgemini deal won’t be repeated in other markets, and provide its partners with a clear roadmap for the industry-specific implementations to allow them to adjust their value propositions to continue extending their joint offering into as many market niches as possible. ### Business innovators win the race into the cloud Redcentric research shows cloud IT adoption is for much more than cost cutting and efficiency Innovation is becoming a prime objective of accelerated cloud adoption among UK private sector businesses, according to UK IT manager research by Vanson Bourne on behalf of cloud services provider Redcentric. [easy-tweet tweet="Andy Mills, Group Sales Director, Redcentric, said “Moving to the cloud is often seen simply as a cost cutting, rationalisation and efficiency measure"] Driving business innovation is now cited by almost 40% of private sector respondents and at the very top of the list for large businesses with over 3,000 employees - 59% of whom said it was the key goal for them moving to the cloud. Innovation was at the top of the priorities alongside cost cutting, business continuity and productivity. It beat improving customer service, reducing risk and improving collaboration hands down to become an A-list private sector cloud intention. Public sector IT respondents were less enamoured of cloud’s innovation potential, however - it was only important to one in four.  The research of 200 UK IT managers highlighted five cloud personas, with innovators taking the form of risk-taking Experimenters - using the cloud to test new ideas with little concern about ROI – or Progressives - for whom the cloud is about change, building a new type of organisation and achieving new goals. These innovative groups are among those making bigger, faster strides towards the cloud. Experimenters are the most likely to have reached the halfway point on their cloud journey, with 57% reaching this milestone and 29% extending their journey to reap further rewards. Close to one in three Progressives either have the end of their cloud journey in sight, have arrived at their final destination or have extended their journey in some way.  Andy Mills, Group Sales Director, Redcentric, said “Moving to the cloud is often seen simply as a cost cutting, rationalisation and efficiency measure, but these findings show that it can be so much more and businesses can use it as a way of driving innovation and fundamental change in the way that they operate, what they sell and to whom, and how they go to market.” The survey was carried out in spring 2015 by technology market research company Vanson Bourne, on behalf of Redcentric. Interviews were conducted with 200 IT decision makers from the public and private sector.
Redcentric’s Journey to the Cloud research report can be downloaded from http://www.redcentricplc.com/resources/reports/journey-to-the-cloud/ ### Managing Software & IT Assets via the Cloud There are many reasons for an organisation to move to the cloud; we find the most popular to be the reduction in costs as a result of moving from capital expenditure to operational expenditure. With cloud computing being taken up so rapidly and by so many, it’s hardly surprising that cloud security is a high priority. By migrating to the cloud, an organisation is effectively moving data from internal storage into a third party’s. It is imperative that the way in which that data is stored is scrutinised in order to ensure compliance with industry regulations. For an internal cloud solution, this may take the form of internal compliance checks or service level agreements. An external cloud solution means liaising with the vendor regarding the type of data being stored, how it is protected and how security will be audited to meet industry regulations. These regulations are constantly changing to keep up with the explosion of cloud computing, and it’s fair to say that organisations may struggle to meet their legal obligations surrounding the data stored in the cloud - and it goes beyond data storage. [easy-tweet tweet="Organisations may struggle to meet their legal obligations surrounding the #data stored in the #cloud" user="LDLtd" usehashtags="no"] Software Compliance in the Cloud Moving to the cloud will likely simplify the management of your organisation’s hardware; however, it is a very different story when it comes to the licensing of software. In the first instance, licensing models that predate the cloud are likely not to mention it in any agreement or licensing contract, posing legal questions about implementation and usage. On the other hand, cloud vendors with specific cloud licensing agreements present a different set of problems as they are complex and can vary depending on your software licensing model. Moving to the cloud will likely simplify the management of your organisation’s hardware; however, it is a very different story when it comes to the licensing of software If an organisation’s existing contract includes cloud-based products, it may be that the existing license can be renegotiated to a cloud-based model. Similarly, if there are no restrictions in the agreement, it could be that an organisation’s named- or concurrent-user-based licensing model can be applied to the cloud. Other vendors, however, calculate fees based on the server processor or number of cores, which often don’t map easily to vendor CPU units. Some vendors forbid the use of their software on any third-party hardware - that is, hardware not owned by the licensee - or within a virtualised environment. For some organisations, this could prevent migrating to the cloud completely, or mean enduring huge licensing fees. Migrating any licensed software to the cloud will introduce a third-party cloud vendor, which may bring restrictive licensing practices such as insisting an organisation is responsible for its own non-compliance. Alternatively, as the cloud duplicates data in order to make it available, your number of copies could exceed the permissible number of backup copies stipulated in an agreement. If the agreement also restricts data from crossing geographical boundaries, it’ll complicate matters even further! Every cloud has a silver lining It’s not all bad news, though.
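Before turning to that silver lining, here is a rough sketch of the kind of automated check discussed later in this piece: a script that counts how many cloud instances are running a given piece of licensed software and flags when that number drifts past the entitlement. It assumes an AWS estate queried via boto3 and a tagging convention invented purely for illustration; the tag key, product name and limit are placeholders, not guidance from any vendor.

```python
# Rough sketch, under assumed conventions: count running EC2 instances tagged
# as carrying a licensed product and warn when the count exceeds the
# entitlement. Assumes boto3 is installed and AWS credentials are configured;
# the tag key, product value and limit are illustrative placeholders.
import boto3

TAG_KEY = "LicensedSoftware"   # assumed tagging convention
PRODUCT = "example-product"    # placeholder product identifier
LICENSED_INSTANCES = 10        # placeholder entitlement agreed with the vendor

def count_running_licensed_instances(region: str = "eu-west-1") -> int:
    ec2 = boto3.client("ec2", region_name=region)
    pages = ec2.get_paginator("describe_instances").paginate(Filters=[
        {"Name": f"tag:{TAG_KEY}", "Values": [PRODUCT]},
        {"Name": "instance-state-name", "Values": ["running"]},
    ])
    return sum(len(res["Instances"]) for page in pages for res in page["Reservations"])

if __name__ == "__main__":
    running = count_running_licensed_instances()
    if running > LICENSED_INSTANCES:
        # In practice this might raise a ticket, email an admin or post to chat.
        print(f"ALERT: {running} instances running {PRODUCT}, licensed for {LICENSED_INSTANCES}")
    else:
        print(f"OK: {running}/{LICENSED_INSTANCES} licensed instances in use")
```

Run on a schedule, even a check this simple gives early warning of the compliance drift described above, and equivalent queries exist for other clouds and on-premise hypervisors.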
Some vendors with strict on-premise offerings have made it possible to deploy their software via the cloud. Take Amazon EC2’s collaboration with IBM as an example: [quote_box_center]“If you have acquired software entitlements under IBM’s International Passport Advantage or Passport Advantage Express Agreements (collectively PA or PA Program), you may have the ability to run IBM software programs on Amazon EC2 and you will simply pay the normal Amazon EC2 hourly prices for On-Demand or Reserved Instances.” [/quote_box_center] IBM makes it clear that it is allowing its software to run using Amazon’s cloud computing services, which operate on a “pay for what you use” basis. It sounds flexible, it sounds accessible and it sounds easy – which, after all, is what cloud computing is about. [easy-tweet tweet="Flexible, Accessible and Easy – what #cloud #computing is about" user="LDLtd and @comparethecloud" usehashtags="no"] Planning for the Cloud Ultimately, organisations are going to have to know their licensing agreements inside out before they can know if software compliance in the cloud is possible. There are lots of creases to iron out in the run up to migrating to the cloud, and understanding what restrictions and limitations there are from both the software vendor and the cloud vendor (if applicable) is the only way to be sure. Without this fundamental business intelligence, organisations can find themselves committing to long-term, costly business relationships with on-premise software or expensive cloud vendors. Larger organisations would do well to implement a License Management Service or application to handle the level of scrutiny necessary to ensure licensed software is not misused via the cloud. For those organisations where a License Manager Platform is less feasible due to scaling costs, resource should be allocated to configure a monitoring script that watches servers and raises an alert if or when the number of instances running licensed software exceeds the licensed capacity. ### IP EXPO Europe celebrates record numbers for 2015 Exclusive seminar programme focussing on the continuous transformation of IT drives 9% attendee increase IP EXPO Europe, which ran from 7th-8th October at ExCeL London, has yet again confirmed its place as Europe’s number one IT event, with a record number of attendees for its tenth year. The event attracted 14,592 visitors, a 9% increase on 2014, due in part to a brand new format, encompassing six events under one roof for the first time ever. The six sub-events, including the newly-launched DevOps Europe, Data Analytics Europe and Unified Communications Europe, examined the vast and continuous transformation of the technology industry, cementing IP EXPO Europe’s position as the go-to event to explore issues at the heart of the IT industry. Speakers included Jimmy Wales, entrepreneur and founder of Wikipedia, who opened the keynote programme to a captivated audience of over 900, discussing a new era of data-driven change. The line-up also featured industry leaders such as Mark Russinovich, CTO at Microsoft Azure, Michelle Unger, VP of Worldwide Sales at IBM Watson, and Robert Epstein, Windows 10 launch lead at Microsoft, who presented a keynote on Windows 10 for the Enterprise. This year’s attendee figures show a dramatic increase from 2014, which saw a 6% increase on the previous year.
With a quarter of attendees coming from the C-Suite, 40% of this year’s attendees were particularly interested in security, highlighting the continued concerns around the ever-increasing threat of cyber warfare, and 80% are currently involved in implementing new cloud technologies for their business. “This year’s IP EXPO Europe covered a full spectrum of issues that technology leaders across Europe face on a daily basis. Following the success of this year’s event, we’re very excited about the launch of IP EXPO Nordic next year – in addition to the continued success of IP EXPO Manchester, launched this year,” said Bradley Maule-ffinch, IP EXPO Europe’s Director of Strategy. Next year’s IP EXPO Europe  will run from 5th-6th October at ExCeL London. It will address new areas including Open Source and also further examine the benefits of collaborative code and continuous delivery. Full details can be found at: http://www.ipexpoeurope.com. ### Embracing cloud solutions needs trust Despite the way cloud based solutions have helped play a huge part in working from home, the latest figures from the Office for National Statistics (ONS) shows that 87% of workforces here in the UK still carry out their duties at their employers offices. With our little island becoming more and more crowded it is therefore of little surprise that the ONS estimate that the average commute for a worker has lengthened by 9 minutes since just over a decade ago. [easy-tweet tweet="In 2003, the average UK commute was 45 minutes - now it is 54 minutes" user="CloudBookingLtd and @comparethecloud" usehashtags="no"] In 2003, the average UK commute was 45 minutes - now it is 54 minutes, whilst the world average is about 40 minutes. This wasted time equates to 11 weeks a year commuting, based on a 37.5 hour standard working week. For those of us passionate about all the cloud has to offer it is worth highlighting that, research by Stanford University in the US has shown that remote workers are 13% more productive and take less sick days than those who commute. However, like many championing cloud based solutions for the future, we feel the benefits to the country would be greater if the mobile workforce was properly empowered.    a more focused workforce is often a happier one Without the energy sapping commute, a more focused workforce is often a happier one. The mentality we have in this country is to believe that more hours worked is the answer, a mentality that is stuck in the 1990s. In Sweden, in the past few weeks, a six hour working day has been introduced. Studies have shown this to not affect productivity and create a happier workforce with less conflict too.  Proper cloud based answers lie at the heart of a brave new future, where the workforce can spend more time with their families or pursuing other interests. [easy-tweet tweet="Proper cloud based answers lie at the heart of a brave new future" user="CloudBookingLtd and @comparethecloud" usehashtags="no"] The impact of working from home is worth potentially many, many billions to the well-being of economies. So, why with so many cloud based solutions able to analyse productivity and connect remote workforces hasn’t the cloud been embraced more? According to Cary Cooper, Professor of Organisational Psychology and Health at Lancaster University Management School has said that one of the main reasons employers don’t embrace home-working is simply a lack of trust.  
He says, and we agree, that the problem is often trust and the lack of awareness by businesses to manage people who work remotely. Fundamentally, a homeworker should be judged on results, and not how long they are at work. Richard Branson has in recent months announced he will be giving his Virgin employees’ unlimited vacation. Evernote and Netflix, two very successful newer companies are doing the same. These progressive companies know it’s not all about quantity, but quality. With trust in the cloud growing though we feel this bright new future is viable in the not too distant future.  We think in a generation or two’s time the workforces of the world will look back on this period of time and wonder why we travelled to places to do work that we could do at home. the world will look back on this period of time and wonder why we travelled to places to do work that we could do at home The technology is here, but trust in workforces and management skills to manage remote workforces have to be addressed before this working revolution is properly embraced. The cloud has so many of the answers, but for many employers they still have to fill in the gaps elsewhere. We hope they will soon because as countless studies show the pros of empowering the workforce to be more flexible far outweighs the cons. ### Hybrid cloud - what’s the big deal? Cloud computing has emerged as one of the most talked about subjects in business IT over the past few years, as company leaders begin to realise the benefits that this technology can bring. While the cloud has completely revolutionised the way many firms operate, it would be unfair to claim that the innovation is without its drawbacks, and smaller businesses in particular could stand to suffer if they fail to understand the potential risks. [easy-tweet tweet="#Cloud computing has emerged as one of the most talked about subjects in business IT" user="kingofservers"] For this reason, the hybrid cloud has emerged as a viable option for businesses looking to enjoy the efficiency that cloud computing can bring, while keeping some operations running in their traditional format to maximise the effectiveness of the infrastructure. But what exactly is this ‘hybrid cloud’ that everyone is raving about, and how exactly would it work for your business? We investigate below. What is the hybrid cloud? The hybrid cloud combines both public and private clouds, giving companies the chance to enjoy the best of both worlds. The hybrid cloud exists to capitalise on the advantages of combining a traditional, physical infrastructure with the cloud, which means the specific combination will depend entirely on the required outcomes and the specific needs of the company. What’s great about this type of IT infrastructure is it offers secure, on-demand and externally provisioned scalability, allowing businesses to contend with sudden or unexpected workloads. This option is specifically good for firms that require rapid scalability and flexibility, but with continued strong security. What’s the big deal? The hybrid cloud comes with a number of key benefits, which affirm its popularity among businesses of all sizes. Outlined below are a selection of its advantages: Cost benefits: One of the great things about the hybrid cloud is that it is easily quantifiable. Businesses can reduce the overall cost of ownership while improving cost efficiency by leveraging the benefits of hybrid cloud, allowing you to match your cost pattern to demand more easily. 
Support innovation: Moving to the hybrid cloud allows you to future-proof your business, giving you access to large public cloud resources, along with the ability to test new capabilities and technologies quickly and easily. Fit for purpose: The public cloud is great for start-ups, test and development and handling peaks and troughs in internet traffic. However, there have long been doubts over its security. Therefore, the hybrid cloud can deliver advantages for mission-critical applications in terms of advanced security, but is of limited use for applications with a shorter shelf life. Improved security: The combination of dedicated and cloud resources allows businesses to address any security concerns they may have, while giving them the chance to place customer data on a dedicated server, thereby keeping private matters private. Finding a one-size-fits-all solution for every business is near impossible due to the different sets of requirements each firm has. However, hybrid cloud has emerged as a viable solution to all of these needs. The wider security issue [easy-tweet tweet="Experts have long-since claimed that data in the cloud is just about as secure as that in a physical server" via="no" usehashtags="no"] As mentioned earlier, security is one of the key factors that has held businesses back from total migration to the cloud since its widespread adoption began. While experts on the matter have long claimed that data in the cloud is just about as secure as that in a physical server, businesses remain cautious about the potential threat of hackers. What is interesting is that some IT experts claim that data stored in the cloud could prove to be more secure than that on a company’s own server; however, firms are still reluctant to migrate private information. Those companies worried about the safety of their data could use hybrid cloud to their advantage, storing the most sensitive information on local servers, while exporting other data to the cloud. What’s more, they could opt to carry out critical operations in their own data centres, while using the cloud to carry out heavier processing tasks. This way, they can enjoy the advantages of both types of data storage. A key takeaway Despite frequent reassurance that the cloud is a safe way to store data, some companies would still rather cater for their own private information. The hybrid cloud offers the perfect solution, offering the best features of both the public and private technologies. It would be safe to say that the cloud is going nowhere, and the flexibility of hybrid options offers businesses of all types the chance to benefit. ### Digital investigations in the cloud: Avoiding the ‘blame game’ Organisations of all sizes are embracing the cloud and reaping the benefits of cost savings, flexibility and scalability that these services offer. In fact, UK Cloud adoption is now at a massive 84% and it’s predicted that 70% of organisations already using the cloud expect their adoption to increase over the next 12 months. [easy-tweet tweet="UK Cloud adoption is now at a massive 84% #cloudadoption" user="EnCase and @comparethecloud" usehashtags="no"] However, this growth is creating new levels of complexity when it comes to security breaches and digital investigations. With the scale and severity of security incidents increasing, the unfortunate truth is that no organisation is immune to the risk of data breaches.
And as the recent T-mobile data breach has shown, breaches involving a third party, or cloud hosting provider, create significant challenges. It's vital to understand how cloud providers would respond in the event of a breach and exactly where the lines of responsibility fall. Establishing clarity on how different types of breaches would be managed by a provider means that incidents can be dealt with swiftly and their impact minimised.     Multi-tiered services     [caption id="attachment_32267" align="alignright" width="300"] Click to register now[/caption] The first issue to address is that the catch-all term ‘cloud’ can encompass a range of different services; from hosted email services such as Gmail, to the wholesale migration of an organisation’s IT estate to a provider’s  storage and servers in the cloud.  It means that, increasingly, most organisation’s hybrid IT environments are now infinitely more complex - a combination of legacy and on-premise services, Infrastructure as a Service (IaaS) and Software as a Service (SaaS). This is adding to the challenge of knowing how, and where, specific data is stored and should a breach occur, the level of access that can be provided for investigative purposes. The reality is that the multi-tenant, shared environment of the cloud means that there’s no ‘one size fits all’ approach when it comes to determining the appropriate response to an incident. It will depend both on the type of cloud services that the organisation is using and the nature and scale of the breach. Clarity on the Lines of Responsibility Given this complexity, it’s essential that organisations moving to cloud services understand their providers’ approach to different incidents and have assurance that these obligations are underpinned in the contract. This helps to avoid the ‘blame game’ and minimises confusion in the investigation process. This is important as there’s a growing expectation that a data breach investigation will involve a third party. In our own survey of e-discovery professionals, nearly half - 44 % of the respondents - claimed that they will have a business need either now or within the next year for collecting from cloud repositories as part of e-discovery. Critical areas to consider are: The Security Processes of the Cloud Provider By entrusting your critical data to a cloud provider, you have a responsibility to ensure there is adequate security in place and that this meets all regulatory and legal requirements. Gain visibility into the specific levels of protection that are in place by your provider so that your data is safeguarded from cyber attacks. Ensure that checks are carried out on the provider’s security protocols: ideally, they should have an active, and carefully managed, vulnerability, audit and patch management process in place to ensure that all servers and infrastructure are updated regularly. In terms of remediating controls against known malware and data breaches, the cloud provider should have tools available to enable them to remotely triage, investigate and remediate threats to their infrastructure and supporting systems. Notification Processes  Any organisation entrusting data to a third party provider, must ensure that they understand the breach notification process and legal obligations if there is a data breach. 
This means not only establishing clarity in the contract on the cloud provider’s responsibility of timely notification to you – their client – of a breach, but also your responsibility for informing the relevant regional regulators. [easy-tweet tweet="Organisations must establish the geographical location of their #data and that it complies with EU DPR" via="no" usehashtags="no"] With forthcoming changes to the EU Data Protection Regulation organisation will be responsible for ensuring that – no matter where the data is held – it is stored and managed within the strictures of the EU DPR. Organisations must establish the geographical location of their data and that it complies with this.  For example, companies that provide cloud services within the EU and rely on data centres in the US will be contractually obliged to comply in accordance with the proposed changes in the European Union.      The Incident Response Processes Find out if your cloud provider conducts regular risk assessments of their clients’ data and that they have the appropriate crisis management steps in place, should an incident occur. Understand if they have conducted ‘fire drills’ to test these measures and would be ready to cope with the commercial impact of data loss for your data. Will they take responsibility for the loss of data or will they leave the organisation to deal with the fallout and have third-party insurance to cover damages? Establish what access your investigative team would have so that they can get to the root cause of an attack and understand where it originated and what happened, some of the largest providers do not provide access to the entire virtual machine and limit access that will impinge an investigation.  Not all incidents are of equal scope, and contracts should also specify the incident response protocols and different levels of support depending on the varying level of the breach. The response required for a malware attack on an Exchange environment running in the cloud will be different to a targeted web application attack. Cloud providers have to work with their clients on an individual basis to create a tailored response. This provides the peace of mind that SLAs have been built into the contract that will account for different attack vectors.    Planning is key and by addressing these areas organisations will not only avoid the complications which hamper many digital investigations, but will also have increased confidence that their cloud providers are upholding the security and integrity of their data.     Planning is key. ### Enterprise Digital Summit London Bill Mew is blogging live from the Enterprise Digital Summit [quote_box_center] 22 October British Academy, 10-11 Carlton House Terrace, London http://www.enterprise-digital.net/london.html Everyone’s talking digital, transformation, and new business models - but how do you make sense of the issues, cut through the hype and put these ideas to work for your business and your people?  We’ve already invited you to our Enterprise Digital Summit London on 22nd October on this topic, but the Agile Elephant team have just been allocated a limited number of free VIP spaces, and so we’d like to invite you personally. This is a one day conference for senior managers and business leaders explaining the mindset, leadership and approaches required to adapt and compete in today’s digital business landscape.  The emphasis is practical, not technical.  
There are keynote presentations from:
Stowe Boyd - well known futurist, researcher and maverick who coined the term "social tools" in 1999 and the term "hashtag" in 2007
Euan Semple - the author of Companies Don’t Tweet People Do!
Professor Vlatka Hlupic - author of The Management Shift and researcher on emergent leadership styles
As well as workshops and panel sessions, there will be practical case studies of companies making the shift to digital and deploying enterprise social networks successfully from: Vodafone, Bupa, Pearsons and Euroclear. Find out more here: http://www.enterprise-digital.net/london.html [/quote_box_center] ### Newest OpenStack Release Expands Services for SDN Newest OpenStack® Release Expands Services for Software-Defined Networking, Container Management and Large Deployments Cloud builders, operators and users unwrap a lengthy wish list of new features and refinements today with the Liberty release of OpenStack, the 12th version of the most widely deployed open source software for building clouds. With the broadest support for popular data center technologies, OpenStack has become the integration engine for service providers and enterprises deploying cloud services. Available for download today, OpenStack Liberty answers the requests of a diverse community of the software’s users, including finer-grained management controls, performance enhancements for large deployments and more powerful tools for managing new technologies like containers in production environments. Enhanced Manageability Finer-grained access controls and simpler management features debut in Liberty. New capabilities like common library adoption and better configuration management have been added in direct response to the requests of OpenStack cloud operators. The new version also adds role-based access control (RBAC) for the Heat orchestration and Neutron networking projects. These controls allow operators to fine-tune security settings at all levels of network and orchestration functions and APIs. Simplified Scalability As the size and scope of production OpenStack deployments continue to grow - both public and private - users have asked for improved support for large deployments. In Liberty, these users gain performance and stability improvements that include the initial version of Nova Cells v2, which provides an updated model to support very large and multi-location compute deployments. Additionally, Liberty users will see improvements in the scalability and performance of the Horizon dashboard, Neutron networking and Cinder block storage services, and during upgrades to Nova’s compute services. Extensibility to Support New Technologies OpenStack is a single, open source platform for management of the three major cloud compute technologies: virtual machines, containers and bare metal instances. The software also is a favorite platform for organizations implementing NFV (network functions virtualization) services in their networking topologies. Liberty advances the software’s capabilities in both areas with new features like an extensible Nova compute scheduler, a network Quality of Service (QoS) framework and enhanced LBaaS (load balancing as a service). Liberty also brings the first full release of the Magnum containers management project. Out of the gate, Magnum supports popular container cluster management tools Kubernetes, Mesos and Docker Swarm. Magnum makes it easier to adopt container technology by tying into existing OpenStack services such as Nova, Ironic and Neutron.
Further improvements are planned with a new project, Kuryr, which integrates directly with native container networking components such as libnetwork. The Heat orchestration project adds dozens of new resources for management, automation and orchestration of the expanded capabilities in Liberty. Improvements in management and scale, including APIs to expose what resources and actions are available, all filtered by RBAC, are included in the new release. 1,933 individuals across more than 164 organizations contributed to OpenStack Liberty through upstream code, reviews, documentation and internationalization efforts. The top code committers to the Liberty release were HP, Red Hat, Mirantis, IBM, Rackspace, Huawei, Intel, Cisco, VMware, and NEC. Focus on Core Services with Optional Capabilities During the Liberty release cycle, the community shifted the way it organizes and recognizes upstream projects, which became known by community members as the “big tent”. Ultimately, the change allows the community to focus on a smaller set of stable core services, while encouraging more innovation and choice in the broader upstream ecosystem. The core services, available in every OpenStack-Powered product or public cloud, center around compute (virtualization and bare metal), storage (block and object) and networking. New projects added in the last six months provide optional capabilities for container management (supporting Kubernetes, Mesos and Docker Swarm) with Magnum, network orchestration with Astara, container networking with Kuryr, billing with CloudKitty and a Community App Catalog populated with many popular application templates. These new services join already recognized projects to support big data analysis, database cluster management, orchestration and more. [quote_box_center] Liberty is a milestone release because it underscores the ability of a global, diverse community to agree on technical decisions, amend project governance in response to maturing software and the voice of the marketplace, then build and ship software that gives users and operators what they need. All of this happens in an open community where anyone can participate, giving rise to an extensible platform built to embrace technologies that work today and those on the horizon. Jonathan Bryce, executive director, OpenStack Foundation [/quote_box_center] ### Why SME’s Should Look to the Cloud for Growth In the UK alone 581,173 start-ups were registered with Companies House during 2014, which was reported to be a record high as Britain’s hunger for business boomed. So, with cloud the biggest IT revolution of the past decade, it’s no surprise that 49% of small businesses are using cloud platforms. [easy-tweet tweet="In the UK alone 581,173 start-ups were registered with Companies House during 2014" user="cloudsimplified " hashtags="startups"] With the ability to punch above their weight and compete with larger, more established companies, adopting cloud software has been the ticket to success for many young start-ups still in their infancy. Whilst many SME business owners can be scared of the concept of cloud, with concerns raised surrounding cost, complexity and security, if businesses don’t begin to adapt themselves they could soon find they can no longer compete. A recent report by Forbes estimates that cloud applications will account for 90% of worldwide data traffic by 2019, and with such a staggeringly high percentage small businesses should not be reluctant to transform their methods.
[easy-tweet tweet="#cloud applications will account for 90% of worldwide #data traffic by 2019" user="comparethecloud "] The benefits that can be reaped from embracing the cloud can not only increase productivity but enable them to grow far quicker than traditional businesses who are keen to stick to outdated practices. Financial Clarity  [caption id="attachment_32326" align="alignright" width="300"] Listen to Friday Night Cloud Episode 4[/caption] It wasn’t long ago that small businesses were having to find investment so that they could pay the large upfront fees for IT and software solutions. Taking out excessive overdrafts is the last thing that a business owner wants, and this is something the cloud has revolutionised for start-ups. Software as a Service (SaaS) offers a range of solutions so that small businesses can pay-per-use, and still have the same support and help as their much larger competitors. When cash-flow is something that SME’s are cautious of, this is a significant aid in enabling owners to keep an eye on costs and still have access to the most up-to-date software. Putting small businesses on a level playing field with their competitors is something that was impossible before the introduction of the cloud, without potential risk or fear of dramatic financial implications, but the cloud offers the same sophisticated software to all without discriminating on size.  organisations adopting cloud computing could cut their software costs by up to 20% A recent report from the European Commission found that organisations adopting cloud computing could cut their software costs by up to 20%. The financial agility that is now available is a fact which should be embraced by small businesses rather than ignored. Expansion  The old fashioned problem of hardware failures, loosing time thanks to delayed projects and missing staff, is all a thing of the past thanks to the implementation of the cloud. As long as there’s an internet connection, business can now be done anywhere at anytime, meaning that whether you hire freelancers or have employees out of the office, the business can continue running from any location. Allowing staff to access their applications and files remotely is extremely beneficial to small businesses who previously would have found this disruptive to the traditional constraints of working hours, and lost precious time and money. Instead SME’s are no longer restricted to a bricks and mortar location, and can expand their team much further a field and cope with the issues that would have held them back in the past. Gone are the days suffering lost days because your staff are stuck at home due to bad weather, or your hardware has died. SME’s are no longer restricted to a bricks and mortar location However, the cloud not only battles old problems, but helps to boost the existing problem of attempting to help a business appear more successful. For example, start-ups could have their head office in a more cost effective area but promote the fact they are a ‘national’ business due to the representatives they have across the UK. Creating the illusion that a business is much bigger than it appears can enable them to winner bigger contracts and grow more efficiently without being tied down by excessive overheads. 
[easy-tweet tweet="SME’s are vital for growth in the UK’s economy" user="CloudSimplified @comparethecloud" usehashtags="no"] It’s well known that SME’s are vital for growth in the UK’s economy, and making the most of what the cloud has to offer can be transformational for these businesses. But the adoption needs to be an educated one and supported by the wider business and digital communities.  ### IBMz Tweetchat Thursday 4pm BST #HybridCloud [Event Finished] Join the TweetChat! Use the Hashtag #Hybridcloud or #IBMz Join #IBMz & Compare the Cloud for our TweetChat on IBM zSystems, Hybrid Cloud & Making it your own. From 7pm BST (3pm EST) on November 3rd 2015. This event is now finished. See the posts from this chat below. Use the window below or go to https://www.crowdchat.net/HybridCloud to log in using your Twitter account. ### KEMP enables OpenStack community with load balancing as a service OpenStack integration delivers QoE and security in the private cloud KEMP Technologies has announced its OpenStack Load Balancing-as-a-Service (LBaaS) plugin, giving customers the power to manage and orchestrate advanced application delivery services in their private cloud environments.  [easy-tweet tweet="KEMP announces OpenStack Load Balancing-as-a-Service (LBaaS) plugin" user="KEMPtech and @comparethecloud" usehashtags="no"] With OpenStack adoption growing rapidly – expanding from the Service Provider space to an increasing number of mainstream verticals – enabling quicker private cloud innovation, reducing data centre licensing costs and escaping vendor lock-in, are key business drivers fuelling this trend. KEMP’s integrated OpenStack plugin makes elements that secure and accelerate application services, available across the Neutron Software Defined Network (SDN) within OpenStack. The ability to deploy and orchestrate these across virtual machines or bare metal servers gives customers flexibility and control to cater for the needs of individual applications. KEMP’s LoadMaster LBaaS plugin enhances OpenStack private cloud deployments by enabling application-centric services that improve quality of experience and security for published workloads.  “KEMP is committed to empowering developers in companies that are building agile environments to accelerate their business operations,” said Peter Melerud, Chief Marketing Officer for KEMP. “With our focus on innovating for the Software Defined Data Centre, OpenStack users can benefit from our investments in the latest network technologies.”  KEMP LoadMaster is an advanced Layer 2-7 load balancing/application delivery controller (ADC) solution. Flexible deployment options are available on a wide array of hypervisor and cloud platforms, as well as dedicated KEMP appliances and third party best-in-class bare metal servers. LoadMaster provides enterprise application integration and acceleration services that intelligently direct user traffic in order to help applications perform better in on-premise, virtualised and hybrid cloud data center architectures.  Download KEMP's OpenStack LBaaS Plugin at https://github.com/openstack/neutron-lbaas/tree/master/neutron_lbaas/drivers/kemptechnologies ### Is the skill gap becoming a skill famine? The importance of training the next generation has been on my mind recently. With the speed that technology is advancing, individuals are being skilled out by the industry through no fault of their own.  
IP Expo is somewhat of a meeting of the minds, with generations of technological talent converging at the ExCeL Centre in London. I spoke with a collection of men who have been driving innovation in the tech sector since before ‘cloud’ began. To begin my second IP Expo experience (IP Expo 2014 was my first ever cloud event), Alan Saldich, Vice President of Marketing at Cloudera, dropped this gem: “Data drives the modern world” I completely agree with Alan’s statement. We do now live in a world that is heavily data reliant; even things that you aren’t aware have data connections are generating big data that can be analysed for a better user experience - yet there is an alarming lack of data scientists available to analyse it. [easy-tweet tweet="In a #data driven world - how do we function when no one knows how to analyse the data? " user="alsaldich @rhian_wilkinson"] Last summer David Willetts, the Science Minister, announced a £52 million investment in new and emerging science talent, creating more than 7,800 education and skills opportunities over a two-year period. But will this be enough to fill the void in the STEM professions? Cloudera has taken training into its own hands, implementing training schemes for Hadoop - training over 50,000 individuals through training courses that are publicly available on the web and in their partner eco-system, as well as working with over 100 universities. [caption id="attachment_32320" align="alignright" width="300"] #IPEXPO[/caption] Ed Martin from Kemp stressed that there is a need for more individuals in the industry who are skilled in software and virtualisation deployment, because the industry is moving away from legacy systems. Martin said, “Universities are getting better at skilling graduates, but there needs to be more back and forth between the industry and the educators. Social media is helping too - it’s improving training, there are so many webinars and tweet chats now, it can be a resource for education, it offers a lot of options to people who are actually looking for ways to learn.” Oscar Marquez is Chief Technology Officer at ‘born on the cloud’ company iSheriff - they built an online certification course for their partners so that they would be properly educated on the product; the course is also now open to graduates. Oscar stressed that for him it’s not about a skill shortage, but more of a focus-based problem. "It is difficult to find well-rounded, accomplished people … There has been a shift from guys who come into the industry at base level and work their way up learning everything. It’s moved to a minute focus - there are guys who can only install a switch. "Apprenticeships may be the way of the future - starting someone as a cloud security specialist and then moving them up. It’s about learning the whole gamut of the company,” Marquez said. The best guys [entering the market] are gamers There has been a move away from mechanics in the IT industry - it used to be dominated by the guys who would have sat in their rooms and pulled apart their computer and gaming systems - the people who would find servers on scrapheaps and rebuild them from scratch. A feeling expressed by many of my interviewees was that the industry needs to go back and find those people again.
[easy-tweet tweet="The IT / Cloud industry wants to hire the kid who pulls apart his Xbox, because he is the future" user="rhian_wilkinson" ] Ian Bayly, VP of EMEA and Channel Strategy at XIRRUS said that for his company graduates were coming out well trained, but the key challenge they are having is keeping millennials motivated. The speed at which the millennial generation changes jobs is much faster than any previous generation. They are really looking to complete a year, and then, if there isn’t any incentive to stay on, they move up through changing jobs laterally. There is no longer a guarantee of ‘work hard, climb the ladder, run the company’. Millennials are much more cutthroat with their job choices, if they want to progress they often see the only way to do so is to look for a different role. Bayly suggested that introducing tiers could be the solution to this, giving a promotion based structure to advancement via learning. ie. you get your cisco qualification, you move up a tier, and get a bonus. Incentivised learning may be an option in the future David Williamson and Scott Breadmore of efficient iP stressed the importance of helping your own employees up-skill, in order to embed them into the company culture through empowering them on their education. “Universities need to build relationships and partnerships with the industry,” said Breadmore, there is room for collaboration to produce students that have one foot in education and the other in the industry and real life experience. [easy-tweet tweet="There is room for universities to collaborate more with the #IT industry" user="scottbread76 @rhian_wilkinson" hashtags="cloud"] William Culbert, Manager of Solutions Engineering, EMEA at Bomgar said that all undergrads need to find the time to get industry experience, and that their aspirations are being set too high by lecturers. IT is becoming a vocation “IT is becoming a vocation, so for some employees, the continued development is more important than having a degree," said Culbert. Insightful comments abounded at IP Expo, and NetApp’s Elliott Howard had some of the best. Here are, in my opinion, his top three: “I think we need to create mentors in the industry, as an industry we need to be stepping into education and taking control of what we think students need to know in 15 years.” “It is incumbent on the industry to focus, design and influence the outputs of universities.” “We need to stop sucking the life out of graduates, we need to let them teach us as well, they need to keep challenging us as an industry.” [easy-tweet tweet="Elliott Howard says we need to stop sucking the life out of graduates - we need to let them challenge the industry" user="rhian_wilkinson" usehashtags="no"] Chris Pace, Head of Product Marketing at Wallix said there are other way to skill employees to the levels you are looking for - suggesting that removing outsourcing is a good way to improve your in-house skill set, but recognised that there is a disparate level of training between universities and actual business needs. “Traditionally there is a resistance to the core needs of IT security industries. In order for us to develop around cybersecurity, universities need to acknowledge that things need to change in their training around IT security for them to become a part of the solution,” Pace said. “IT support is an industry that people fall into” So can we push more people to join STEM professions? I think yes. 
Tech is an exciting industry, the masses just need to be drawn in, and skilled, in the right way. When industry and education (of many facets, not just universities) learn to meet in the middle, the gap will be closed. Here’s hoping for a future full of apprenticeships and internships, and all the unicorns the tech industry can dream of. ### Why the future of cloud lies in autonomics If you’ve been following global trends in cloud technology development, you must have noticed that cloud computing provided us with a host of great innovations – for instance, consumption-based pricing or easy access to global computing resources. But they all came at a certain price: complexity. Anyone who manages a cloud infrastructure today will agree that doing the same for traditional data centres is way easier. What could potentially help us in managing the increasingly complex cloud environments of the future? Experts claim that it could be autonomic computing. Why have cloud infrastructures become so complex? There are many factors which account for the current complexity of cloud environments. All these amazing innovations unveiled a need for deeper management – take consumption-based pricing, for instance. It allows us to pay exclusively for what we use, but doing so requires constant monitoring and continual optimisation to avoid poor cost management and reduce resource waste. Cloud is under constant innovation in private, public and hybrid sectors Cloud is under constant innovation in private, public and hybrid sectors. Add to that the growing importance of multicloud deployments, and you’ve got yourself a pretty complex environment to manage. Perhaps cloud innovation has gotten out of hand and we’re unable to come up with effective ways to manage these newly-acquired infrastructures. We definitely need a new approach for supporting cloud environments, and one of the candidates is autonomic computation. [easy-tweet tweet="Perhaps cloud innovation has gotten out of hand and we're unable to come up with effective ways to manage these newly-acquired infrastructures" via="no" usehashtags="no"] What is autonomic computation? At the beginning of this century, IBM released a kind of manifesto which predicted a future software complexity crisis as a result of our inability to cope with the rapid growth of information technology, communication and computation. The manifesto proposed a public debate about autonomic computation as a potential solution. Autonomic computing refers to self-managing systems which can optimise, configure, protect and heal themselves without the necessity of human intervention. How come we’ve heard so little about autonomic computing? When IBM issued their vision of the impending problem of software complexity management, the danger didn’t seem real. The paper spawned a field of research which is still active, but its impact in the industry was highly limited – mostly because the gap between the complexity of software and our ability to manage it hadn’t become a serious factor regulating economic growth. Until the widespread adoption of cloud technologies. Why autonomic computing in cloud? Cloud autonomics would enable companies to make the most of cloud computing by automating its management with a set of business policies. Organisations wouldn’t need to expend resources optimising the security, usage and cost of cloud infrastructures – it would all be done automatically in accordance with business policies which define every aspect of management.
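To make the idea of policy-driven management concrete, here is a minimal, illustrative Python sketch of such a control loop. Everything in it is an assumption made for illustration: the Instance record, the shutdown hook and the thresholds stand in for whatever inventory API and remediation actions a real cloud provider or CMDB would expose. The governance rule it encodes mirrors the usage example given below, namely that idle development infrastructure running for more than a week is shut down automatically.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta
from typing import Callable, List


@dataclass
class Instance:
    """Hypothetical inventory record; a real system would populate this
    from a cloud provider's API or a CMDB."""
    name: str
    environment: str        # e.g. "development" or "production"
    running_since: datetime
    last_active: datetime


# Governance rule (illustrative thresholds): idle development infrastructure
# that has been running for more than a week is shut down.
MAX_DEV_RUNTIME = timedelta(days=7)
IDLE_THRESHOLD = timedelta(days=2)


def violates_policy(instance: Instance, now: datetime) -> bool:
    """True if the instance breaches the idle-development rule."""
    return (
        instance.environment == "development"
        and now - instance.running_since > MAX_DEV_RUNTIME
        and now - instance.last_active > IDLE_THRESHOLD
    )


def reconcile(instances: List[Instance], shutdown: Callable[[str], None]) -> List[str]:
    """One pass of the autonomic loop: monitor, evaluate against policy, act."""
    now = datetime.utcnow()
    acted_on = []
    for instance in instances:
        if violates_policy(instance, now):
            shutdown(instance.name)      # hypothetical remediation hook
            acted_on.append(instance.name)
    return acted_on


if __name__ == "__main__":
    now = datetime.utcnow()
    fleet = [
        Instance("dev-web-01", "development",
                 running_since=now - timedelta(days=12),
                 last_active=now - timedelta(days=5)),
        Instance("prod-db-01", "production",
                 running_since=now - timedelta(days=200),
                 last_active=now),
    ]
    stopped = reconcile(fleet, shutdown=lambda name: print("stopping " + name))
    print(stopped)
```

In a real deployment the same pass would run continuously on a schedule and the remediation hook would call the provider’s API; the point is simply that the policy, rather than a human operator, decides when to act.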
This kind of automation is called policy-automation. An organisation would have to configure an autonomic system with the right governance rules and then rely on this system to monitor the cloud infrastructure, applying necessary changes to bring it back in line with the policy. Benefits of cloud autonomics It should be clear by now that cloud autonomics offers many benefits to companies which decide to deploy such systems to manage their cloud environments. Here are the most important advantages brought by autonomic computing for cloud infrastructures. [easy-tweet tweet="Click to read the most important advantages of autonomic computing for #cloud infrastructures" user="comparethecloud"]
Usage – companies will be able to schedule the automated shutdown of idle or long-running infrastructures in support of their business policies (for instance, stating that idle development infrastructures running more than a week must be shut down).
Availability – business service level agreements (SLAs) can be supported by the automated migration of data to another region. Such an automated migration can also be used to back up storage from one medium to another.
Cost – companies will use cloud autonomics for the automated purchase of reserved capacity in accordance with organisational or functional needs. Another benefit could be the automated movement of a workload from one cloud provider to another in search of cost efficiencies.
Performance – organisations can use cloud autonomics for the automated increase of the machine type for a workload, employed to support the operation of workloads that do not scale horizontally.
Security – companies will benefit from the system automatically changing network or endpoint security settings to those which conform to established business policies.
It’s clear that cloud autonomics is something businesses should definitely get interested in. Experts agree that the future will hold challenges in cloud infrastructure management, so automating at least a part of the process through cloud autonomics might become a suitable solution for many organisations facing this problem. ### Friday Night Cloud: Episode 4: Don't Carry On Doctor Medical Clouds - Hype vs. Reality & The Internet of Egos. Neil has finally broken Andrew and a trip to the hospital is required. If the marketing is to be believed - Doctors with Google Glasses roam the futuristic corridors holding tablets which diagnose your every condition, analyse your records and tests and suggest the best cure for your ailment. Unfortunately the old school peeing in a cup and pushing bits of paper around are still the norm. Meanwhile Neil addresses the Internet of Egos - an ego filled rant against the bold claims pushed out into the technology market. We have also added SoundCloud for our listeners. ### Is the Digital Dark Age Looming Behind the Cloud? As a new transatlantic fibre optic system for high capacity connectivity is being laid between New Jersey and Ireland, large enterprises on both sides of the pond are preparing to take advantage of the extra bandwidth from the major cloud service providers, which will allow them to further build out their cloud infrastructures. As well as using the cloud to offer new services, many companies will begin to use the cloud to store their intellectual property, including digital content containing their corporate histories, as well as important day-to-day information and records kept for compliance.
[easy-tweet tweet="In the mad dash to cloud, companies are neglecting to address challenges presented by long-term digital asset retention" via="no" usehashtags="no"] However, in the mad dash to the cloud, some companies are neglecting to address the challenges presented by long-term digital asset retention. Migration plans have become very sophisticated and systematic, but it’s important to distinguish between the requirements for cloud based service delivery, and the unique requirements of retaining digital assets stored in the cloud for longer periods. Your Digital Transformation Time Machine  The cloud is as good for storing digital content as it is for service delivery because it’s safe and it’s durable: most cloud providers will ensure that multiple copies of everything will be stored in multiple locations. But just because digital assets are backed up and accessible via the cloud does not mean that they will be useable for the long term. Earlier this year, Google VP and Internet pioneer Vint Cerf warned of a “Digital Dark Age”, where our vast archives of digital content could become obsolete simply because they cannot be read by the latest version of software available. With increased take up of the new cloud power on offer, companies are beginning to realise that cloud storage on its own is not enough. These companies are looking to digital preservation solutions to ensure that their content and their corporate identities are protected and future-proofed for years to come. The Silver Lining It’s not to say that the cloud can’t have a role in digital preservation. Indeed it is ideal because it is affordable, durable and highly scalable. Many are implementing cloud based digital preservation solutions not only because they realise they need to mitigate risk, but also to preserve and maximise their brand assets and corporate knowledge in a cost effective way. This is not ‘store and forget,’ but rather a proactive approach to safeguarding critical digital content via a dedicated digital preservation effort in the cloud. cloud can have a role in digital preservation CIOs are beginning to understand that a viable cloud based digital preservation project must incorporate budget for the inevitable transition to new file formats as and when needed. Beyond this, there must also be the capacity to keep both metadata and indices up to date so that files can easily be found when needed. Many organisations are very focused on storage of their digital assets in the cloud, as well as the quest to ensure files are secure and accessible for the long term – in some cases that could be hundreds of years from now. However, what we are seeing is that it’s not just about deep archival storage, and 100 years isn’t necessarily the critical point that organisations should be worried about. The growing consensus amongst analysts and the industry is that “long-term” in the digital world now means any record being kept for over 10 years (or indeed any that are already 10 years old). This is down to faster technology refresh cycles that are causing file obsolescence. This requires an active approach to preserving digital information to ensure that vital records are findable, usable and readable when required. [easy-tweet tweet="We now require an active approach to preserving digital information" user="dPreservation and @comparethecloud" usehashtags="no"] Business is waking up to the challenge. 
As we see an increase in spending on IT cloud infrastructure, the challenge of how to properly manage and safeguard long-term electronic records (whether permanent or not) across the full information lifecycle will only become more prominent. With a little effort, the Digital Dark Age may never see the light, and the digital assets living in the cloud will not only be easily accessible, but also protected and future-proofed for generations to come. ### A joint approach to IT purchasing IT is well known for being a complex and expensive spend category, but what you may not know is that it’s also a major culprit for refusing to follow official procurement guidelines and procedures, choosing instead to make its own purchasing decisions. [easy-tweet tweet="One in three IT professionals choose to go it alone when making #IT spending decisions" user="WaxDigital" usehashtags="no"] We’ve recently conducted research that looks into the procurement department’s relationship with other business functions and uncovered that one in three IT professionals choose to ‘go it alone’ when making IT spending decisions, with an overwhelming majority (78%) of the IT professionals surveyed referring to procurement’s formal tender processes as a hindrance, unnecessarily extending the time it takes to process orders. According to our CPO Viewpoint research, 36% of procurement claims to own the process of buying new technology once IT has specified its requirements; however, interestingly, only 12% of IT agree that it is in fact procurement’s responsibility. Where procurement has been involved in the process, only 19% of IT respondents admitted that the procurement team had led the way on cost saving initiatives in their department. This compared to 43% of procurement claiming that they in fact took the lead. [easy-tweet tweet="Research findings point to #IT having an unfavourable view of procurement and being frustrated over its red tape" user="waxdigital" usehashtags="no"] The research findings certainly point to IT having an unfavourable view of procurement and being frustrated over its ‘red tape.’ However, there is also reason to believe that IT is failing to recognise the potential value that procurement delivers. The two departments were also found to have a polarised view of spending priorities for IT, as procurement’s focus is generally on devices and hardware, while IT is more concerned with infrastructure and security. However, one area where IT and procurement were more closely aligned is around supply chain risk. It would seem that almost half (46%) of IT identified that procurement’s most important contribution was in the areas of supplier risk and negotiation. [caption id="attachment_32203" align="alignright" width="300"] Register now to get 20% off with the code CLOUD20[/caption] So what is at the heart of this gulf between IT and procurement, and why should IT sit up and take notice of its organisation’s procurement policies and procedures? Clearly, technology is a complex spend category and those authorised to purchase within IT may claim it is quicker and easier to forge ahead with their own buying needs. However, it is crucial that IT thinks twice before side-stepping procurement’s official processes as potentially they are putting billions of pounds at risk annually. This department is also missing out on procurement’s supplier negotiation and risk management skills, which could help reduce costs, minimise risks and prevent potential project failures.
Surely, it is therefore vital that these departments look to work together more effectively. Working with, rather than against, procurement can deliver numerous benefits, so how can the two departments work towards collaborating more effectively in the future? [quote_box_center]Here are our three top tips for facilitating this process:
Deciding on spend priorities needs to be a shared responsibility between IT and procurement. This will help procurement gain a far better understanding of what the department’s needs are now and in the future.
IT should stop dismissing procurement’s processes as unnecessary ‘red tape’ and re-examine them instead – will the additional steps add value or necessary diligence to the project? Are there any bottlenecks in the process, and if so how can these be overcome? Find out if it’s possible to simplify formal tender processes and create ways to work together more strategically and collaboratively.
Savings targets identified by procurement should be aligned with the IT department’s objectives for the year. Are there any opportunities to save in low-priority areas, or contract terms that could be negotiated to mitigate cost increases? Both teams need to identify which opportunities can meet joint objectives and overall savings targets set by the organisation. A better understanding of how procurement’s processes work will help IT recognise the benefits of this approach.[/quote_box_center] IT purchasing can be complex and demands a certain balance of autonomy and assistance to drive great returns and value. Establishing goals, identifying key areas of spend and utilising the skills of the procurement function presents IT and procurement with the opportunity to really drive a tangible difference to an organisation. ### 6 Questions for Paul Gill, Head of Digital Engagement at Oxfam [caption id="attachment_32244" align="alignright" width="300"] Tweet Paul at @paulsimongill[/caption] Ahead of his presentation at Digital Strategy Innovation In London, we spoke to Paul Gill, Head of Digital Engagement at Oxfam. Paul runs the Digital Engagement team at Oxfam GB. The team crafts engaging digital experiences across the web, email, social media and digital marketing that make Oxfam's supporters (existing and potential) feel part of their work and mission. They also transform Oxfam's digital culture to help achieve this by being experts and role models. He has a personal focus on user experience and the cross-channel experience for Oxfam's supporters. Before Oxfam, he worked at digital agency Torchbox on UX and digital strategy. Which company do you think utilises digital most effectively? Given the wide spectrum of digital, this is a difficult question to answer. I've thought about it from the perspectives of product, cross-channel marketing, creative use of the medium, user experience and a strategic approach. I've also had to question whether I should include a traditional organisation that is doing its best to use digital, or an organisation rooted in digital. In the end, I went with Netflix. They understood the fundamental shift in the market, realised what consumers would want, invested heavily in the infrastructure, invested in the product (and the user experience), and used the product and WoM to market themselves. What I also like is that they appear to be an overnight success story. They're not - their history is DVD-as-a-service, but they transitioned to on-demand about 10 years ago.
I also like the (perhaps apocryphal) story that, in 2000, Netflix was offered for acquisition to Blockbuster for $50 million; however, Blockbuster declined the offer.

[easy-tweet tweet="Netflix is using digital most effectively says @PaulSimonGill " user="comparethecloud" hashtags="DigitalLDN"]

With traditional strategies there's a long-term and short-term plan. Are long-term strategies possible with digital or does its constant evolution mean that you can only plan for the short-term?

As you'll guess from my answer to the first question, I do believe that long-term strategies are possible. But you have to be flexible enough to adapt as the situation changes in the short term. Sometimes, not investing (and justifying that lack of investment) in opportunities in the short term to achieve the long term is a difficult, but necessary, decision.

How central is digital to your company's overarching strategy?

Oxfam aims to become a digital-by-default organisation. As a result, our nascent digital strategy (for public engagement) will be very closely aligned with the overarching organisational strategy.

Oxfam aims to become a digital-by-default organisation

What's next for digital in terms of its development and what do you see it affecting next?

I don't have a crystal ball, but two themes I'm interested in are:

Fragmentation. Of devices and networks. I love XKCD's world map of the Internet (https://xkcd.com/802/) and see this becoming much more granular. And I love that the granularity comes from types of network and types of interest. The map illustrates that the geographical boundaries that do so much to define us currently will shift, so that we come to be defined more by the networks and places in which we choose to spend our time online. By fragmentation of devices, I mean any object becoming part of the internet – so the internet of things is part of a continuum that started with the first phones.

Privacy. Mass-market awareness of privacy and issues relating to conversations and personal data being digital / anywhere is only just starting. I see public perception of this becoming greater, and having a bearing on online services from any organisation.

Which types of companies do you think will find it most difficult to implement digital effectively?

Slow-moving and risk-averse organisations are most at risk. And those with rigidly-defined silos. To become digital-by-default, you can't corral digital into one team – it's a meme, a theme, a modus operandi that works across the entire organisation. And whilst, for large organisations, digital is a long-term piece, you have to be flexible (or agile) enough to either make use of opportunities as they arise, or decide not to.

[easy-tweet tweet="Slow-moving and risk-averse organisations are most at risk of not implementing digital" user="comparethecloud" hashtags="DigitalLDN"]

What can the delegates expect from your presentation at the Digital Strategy Innovation Summit?

Understanding how the digital channel can both reflect and influence other channels, and some guidance on how this can work in a large organisation – illustrated with some compelling examples from our humanitarian, fundraising and campaigning work over the last year.

Compare the Cloud readers can save 20% on two-day passes with the code CLOUD20 for the upcoming summit.
### UK IT Professionals Slow to Respond to Cyber-Insurance

According to a recent survey, nearly half of the IT professionals polled thought there was 'insufficient need' to invest in cyber-insurance, whilst just over one third did not believe that their company would need to change its IT security policy when taking out cyber-insurance. These are the main findings to emerge from a recent survey on cyber-insurance carried out amongst IT professionals in the UK and France by Wallix, a software company providing cyber-security and governance solutions for information systems access.

[easy-tweet tweet="47% thought that there was ‘insufficient need’ to invest in cyber-insurance" user="comparethecloud" hashtags="cybersecurity"]

According to the 2014 Information Security Breaches Survey, 81% of large businesses and 60% of small businesses suffered a breach in the last year, with the average cost of breaches to business nearly doubling since last year. As cyber-insurance begins to be seen as a way to effectively cover costs and repair damage associated with a breach, Wallix's survey reveals IT departments are slow to react to the change.

[easy-tweet tweet="35% of UK respondents didn’t know which department would lead the purchasing decision for cyber-insurance" via="no" usehashtags="no"]

41% of respondents did not believe a change in IT policy would be necessary when taking out cyber-insurance, and half of the respondents thought it would be difficult to identify whether ex-employees, ex-third party providers or ex-contractors still had access to resources on their network. An audit trail that proves access rights are being managed appropriately, e.g. are revoked when an employee leaves the firm, is considered necessary to validate most cyber-insurance policies.

[easy-tweet tweet="41% did not believe that their company would need to change its #ITsecurity policy when taking out cyber-insurance" via="no" usehashtags="no"]

The breakdown is as follows: half of the respondents thought it would be either 'difficult' or 'very difficult' to identify whether any ex-third party providers still had access to resources on their network; 40% thought it would be difficult to identify whether any ex-employees still had access; and 55% (made up of 45% answering 'difficult' and 10% answering 'very difficult') would appear to have problems spotting any ex-contractors.

Headquartered in France, Wallix conducted the survey in both the UK and France. Although there was a great deal of uniformity in the responses to most answers, there was some divergence between the two countries in two question areas: which internal department led the organisation's purchase decision (according to the French sample, nearly a third thought the Finance Department led on this, whilst in the UK the Finance Department did not feature at all), and their confidence in their systems' abilities to make critical updates and in their treatment of third party providers. The majority of the French sample were very confident; their British counterparts much less so. For the British sample, 'Identity and Access' emerged as one of the top three cyber security challenges, alongside 'meeting compliance' and 'working with third parties'.

Commenting on the findings, the report's author, Chris Pace, Head of Product Marketing at Wallix UK, who commissioned the survey, said: "Cyber-insurance is rapidly coming of age and both the Government and the UK insurance industry have taken big steps to ensure that the UK leads the world in this field.
But the IT industry needs to raise its game. Our survey indicates that there is a degree of complacency within many organisations' IT departments and this needs to be eradicated if companies are not to be put at risk. We are frankly alarmed that the IT department does not feel the need to change the security policy when cyber-insurance policies clearly indicate that businesses must have complete control and visibility of every user who accesses their infrastructure. And yet according to our survey this clearly isn't happening. Hopefully our report will act as a wake-up call to those IT departments."

The survey findings have been incorporated into a report entitled 'We may not have it covered: Do IT teams understand the impact of investing in cyber-insurance?' The report is available to download from the Wallix website here. The online survey took place during July and August of this year. The sample was drawn from Information Technology professionals.

[quote_box_center] Based on the survey, the company has recommended five steps that it feels companies will need to follow so as to get the best from their cyber-insurance policy. These are:

1. Get involved. It's vital the IT department is part of any process to invest in cyber-insurance.

2. Know your limits. Make sure you have a clear understanding of the technology limitations that could affect your cover.

3. Belt and braces. Your regular and automated security activities (updates, patching, signatures etc) must be working. They could be the difference between an insurance payout and the spiralling costs and damage limitation resulting from a breach.

4. Maximise your visibility. If you do suffer a breach there's a possibility that your insurance company will want to attribute its source; the more data you have the easier that job will be.

5. Know your access control weaknesses. Many cyber-insurance policy terms make an assumption that businesses have complete control and visibility of every user who accesses their infrastructure. Start by ensuring you have effective management of privileged user access. [/quote_box_center]

### Heinz 57 – the need to protect valuable IP

The top secret recipe for Heinz Tomato Ketchup is thought to be known by only eight to ten people in the world; KFC's original recipe chicken is seasoned with a secret blend of eleven herbs and spices that is held in a vault in its Louisville, Kentucky HQ; and the ingredients used by Coca-Cola are a closely held trade secret known only to a few employees. Yet it's not just multinational behemoths that hold their valuable intellectual property (IP) dear. From large pharmaceutical organisations to small family-run businesses, all now operate with business-critical IP at the core of their operations and need to protect it. In fact, in today's digital age, up to 80 per cent of the value of UK companies is made up of intangible assets such as their IP.

[easy-tweet tweet="Up to 80% of the value of UK companies is made up of intangible assets such as #IP" user="CensorNet and @comparethecloud"]

The task of protecting company secrets has been made increasingly difficult by the emergence of cloud-based sharing apps such as Dropbox and YouSendIt, and the ease with which information – including valuable IP – can be transferred via cloud-based social apps such as Facebook, Twitter and Skype.
Not every cloud has a silver lining

Cloud adoption and cloud-based file sharing are becoming increasingly popular among the general public, and the unauthorised private use of them within organisations is causing concern among CIOs. However, because the bulk of security products were designed for an on-premise world, IT organisations are having a hard time keeping up. According to a recent survey conducted by Fruition Partners of 100 UK CIOs, 84 per cent believe cloud adoption reduces their organisation's control over IT, as cloud applications are so difficult to monitor.

84% of CIOs believe cloud adoption reduces their organisation's control over IT

It is no coincidence that, at the same time, breaches – and the theft of valuable IP – have increased. A recent UK Government survey estimated that in 2014, 58 per cent of large organisations suffered staff-related security breaches and that these account for almost a third (31 per cent) of the worst security breaches in the year. The average cost of such a cyber-security breach is substantial, with recent figures estimating it to be between £600k-£1.15m for large businesses and £65k-£115k for SMEs.

Businesses need to step up to the challenge of managing the rise of cloud applications. Only by gaining greater visibility, analysis and control over them can businesses truly protect the IP that could otherwise leave the safety of the organisation.

[easy-tweet tweet="Businesses need to step up to the challenge of managing the rise of #cloud applications" user="censornet and @comparethecloud" usehashtags="no"]

Controlling the exits

The simple truth is that all of your employees have a price. Whilst a quarter of employees would sell critical business data for the sum of £5,000, three per cent would be prepared to sell data for just £100, and a further 18 per cent if they were offered just £1,000. Most worrying of all, over a third of those surveyed (35 per cent) would sell confidential IP if the "price was right", so the dangers are real.

[caption id="attachment_32203" align="alignright" width="300"] Register now for #DigitalLDN 2015[/caption]

Because of this, it is important to manage and control any potential exit channels for your data, including cloud applications. Whether the IP is industrial design rights, a secret recipe, trademarks or customer data, it is paramount that businesses protect their valuable IP with the new breed of security solutions that go beyond simply protecting the web gateway against outsiders breaching the organisation's network perimeter, and instead monitor all users' interactions when they access the internet or applications.

Now more than ever, organisations need to be able to monitor an individual's use of corporate assets at the most basic level

Now more than ever, organisations need to be able to monitor an individual's use of corporate assets at the most basic level, regardless of whether users are in-office or mobile. Solutions such as cloud application control (CAC) can provide businesses with this visibility and the ability to discover, analyse and control the information staff are accessing or sharing. Once security solutions extend beyond the web gateway, they can address the fundamental gap that resides between traditional web security and cloud application control, thus securing the way in which we use apps today by 'following the user' to ensure no valuable IP is leaving the organisation.
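To make the 'discover, analyse and control' idea above a little more concrete, here is a minimal, purely illustrative sketch of the discovery step: scanning web-proxy events for large uploads to file-sharing domains that IT has not sanctioned. It is not any vendor's product logic; the event format, domain list and 10 MB threshold are invented assumptions for the example.

```python
from collections import defaultdict

# Hypothetical list of cloud file-sharing domains not sanctioned by IT.
UNSANCTIONED_DOMAINS = {"dropbox.com", "yousendit.com", "wetransfer.com"}

# Flag any user who uploads more than this many bytes to an unsanctioned app.
UPLOAD_THRESHOLD_BYTES = 10 * 1024 * 1024  # 10 MB


def flag_risky_uploads(proxy_events):
    """Summarise outbound (POST/PUT) traffic per user to unsanctioned domains.

    `proxy_events` is an iterable of dicts with keys: user, method, domain,
    bytes_sent. Real CAC products work on far richer telemetry; this is a sketch.
    """
    totals = defaultdict(int)
    for event in proxy_events:
        if event["method"] in ("POST", "PUT") and event["domain"] in UNSANCTIONED_DOMAINS:
            totals[event["user"]] += int(event["bytes_sent"])
    # Return only the users whose upload volume exceeds the threshold.
    return {user: sent for user, sent in totals.items() if sent > UPLOAD_THRESHOLD_BYTES}


if __name__ == "__main__":
    sample_log = [
        {"user": "alice", "method": "GET", "domain": "intranet.local", "bytes_sent": 512},
        {"user": "bob", "method": "POST", "domain": "dropbox.com", "bytes_sent": 25 * 1024 * 1024},
        {"user": "carol", "method": "PUT", "domain": "wetransfer.com", "bytes_sent": 2 * 1024 * 1024},
    ]
    for user, sent in flag_risky_uploads(sample_log).items():
        print(f"Review needed: {user} sent {sent / 1024 / 1024:.1f} MB to unsanctioned apps")
```

A real cloud application control platform would of course correlate far richer signals (user identity, device, content inspection, policy actions), but even a crude report like this illustrates why visibility is the starting point for control.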
### Why the future looks sunny for the UK cloud industry

The cloud revolution

In recent years, the number of businesses turning to "the cloud" has increased exponentially. Cloud technology has become an IT essential for small businesses and large enterprises alike. Why? Because it facilitates flexibility, agility and scalability. In 2010 just 48 per cent of SMEs in Britain were using cloud services. Fast forward to 2015, and 84 per cent of small and medium sized businesses in the UK have implemented cloud technology in their organisation. What's more, almost four in five of these firms are using at least two cloud services.

[easy-tweet tweet="In 2010 just 48 per cent of SMEs in Britain were using cloud services" user="cobwebsolutions" usehashtags="no"]

But why are so many businesses transitioning their operations to the cloud, and what advantages does this deliver for your business?

The business case for the cloud

Using cloud services enables SMEs to cut capital expenditure. This is often the primary driver for cloud migration, with UK businesses currently saving an average of 11 per cent from their use of the cloud, a figure forecast to rise to 19 per cent over the next five years. For businesses with a limited IT budget at their disposal, using a cloud provider is more efficient; not only are there fewer in-house costs, but critical business information is always readily available and the risk of data loss is vastly reduced. Hosting IT systems outside of the business infrastructure also means firms can benefit from greater resilience and reliability, along with 24/7 access to IT support. Ultimately, the financial flexibility of cloud services grants SMEs access to complex computing applications that would otherwise be inaccessible without significant financial investment. Thanks to "pay as you use" style cloud services, firms can easily scale up or reduce their cloud use as business requirements dictate.

Intangible competitive advantages

For SMEs to stay ahead of the curve in today's competitive landscape, agility is key. Cloud services facilitate this. What's more, CIF's latest research demonstrates that cloud computing offers a range of benefits that are more difficult to quantify, such as enabling businesses to become more responsive to customer needs and improving collaboration across the organisation. For example, for SMEs, adopting a flexible approach to working hours can be transformative. By implementing cloud solutions in your business, staff are empowered to work remotely. The days of a workforce chained to their desks and restricted to the traditional 9-5 working hours are a thing of the past. Thanks to cloud technology, employees can now access the same computing services, platforms and power at home and on the move as they can in the office. This applies to all communications and processes, from emails and invoicing, to sales data and telephony.

[caption id="attachment_32168" align="alignright" width="300"] Read about the ECJ's latest ruling[/caption]

The cloud also facilitates collaboration within the business. Now, thanks to cloud technology, information can be stored and managed centrally. Employees can access and work in the same documents in real time, without the need for hundreds of emails sharing documents back and forth as attachments. Moreover, thanks to cloud-based productivity tools for email, calendar and file-sharing, such as Office 365, communication processes have also been streamlined.
Why security is a silver lining

The misconception that storing company data "outside" the business is insecure prevails, with over 70 per cent of small and medium sized businesses citing this as the key reason for failing to move their applications to the cloud. Indeed, CIF's latest research revealed that concerns about data security have risen by almost 10 per cent over the last year, despite the fact that 99 per cent of firms using cloud services have never been subject to a data breach.

[easy-tweet tweet="More than 70% of SMBs say security worries are the key reason for not moving applications to the cloud" usehashtags="no"]

When it comes to security, cloud technology offers an advantage for SMEs. Using the cloud not only diminishes the risk of losing confidential data on a misplaced mobile device, laptop or USB stick; established cloud providers also grant SMEs the added protection delivered by leading security experts and the vast resources that service providers invest in ensuring their applications are secure.

A bright future

Despite lingering security concerns, the outlook for the cloud industry looks bright. With cloud adoption among UK businesses at a record high, we have reached a point where entire businesses can be run from the cloud.

the outlook for the cloud industry looks bright

What's more, this year more businesses than ever are planning to venture into the cloud for the first time, and for those businesses already taking advantage of the benefits it offers, additional services are being added every day. For SMEs to remain competitive, it's no longer a matter of if, but when, they choose to transition to the cloud.

### The 8 U.K. cybersecurity stats shaking the system

The United Kingdom (U.K.) has been buzzing about increased cybersecurity defences and protection, with recent data breaches like the famous Ashley Madison hacking scandal (among other instances) fuelling public anxiety. At the Financial Times Cyber Security Europe Summit, Ed Vaizey, Minister for the Digital Economy, warned U.K. businesses that they needed to implement stronger security measures and encouraged widespread adoption of the government's Cyber Essentials programme. The scheme outlines the basic controls all businesses should establish to mitigate the damage of data breaches and, upon completion, participating organisations receive a badge to demonstrate to clients that they have taken these necessary precautions.

[easy-tweet tweet="8 UK Cybersecurity stats analysed by @followcontinuum on @comparethecloud" via="no" hashtags="data"]

Now more than ever, residents are demanding this assurance as cyber threats become even more prevalent in the U.K. and data regulations continue to be re-evaluated for a new unified standard. What exactly is the state of cybercrime, and why is it causing such public unrest? Using Bit9 + Carbon Black, ThreatMetrix, Procorre, and PricewaterhouseCoopers as references, we've compiled eight cybersecurity statistics every U.K. IT solutions provider needs to know.

8 U.K. Cybersecurity Stats that Will Make IT Solutions Providers Rethink Their Business Model

The Current Threat Landscape

1. The U.K. is the target of the most cybercrime attacks, ahead of the U.S.

2. 74% of small businesses, and 90% of major businesses, have had a cyber security breach in the last year.

3. In the second quarter of this year, 36 million fraud attacks worth approximately £2bn were blocked.
Despite the increase in staff awareness training, breaches are still just as likely to occur through user error, with people exposing their networks to viruses and other types of malicious software.

[easy-tweet tweet="The U.K. is the target of the most cybercrime attacks, ahead of the U.S." user="followcontinuum and @comparethecloud" usehashtags="no"]

The U.K. Public's Response

4. 93% of the UK public advocates the mandatory and immediate reporting of all breaches to both the public and authorities.

5. 94% believe businesses should be legally required to establish processes to immediately detect data breaches.

6. 63% want sensitive data to be monitored 24x7.

[easy-tweet tweet="94% of UK public believes businesses should be required to establish processes to immediately detect #data breaches" via="no" usehashtags="no"]

The Need for Highly Skilled IT Talent

7. 14% of all new IT openings are jobs in cybersecurity.

8. The increase in demand for cybersecurity experts has led to a rise in wages, with 15% of professionals paid around £100,000 a year.

[easy-tweet tweet="14% of all new IT openings are jobs in #cybersecurity" user="followcontinuum and @comparethecloud" usehashtags="no"]

There are a few things you should take away from these figures. First, the security of clients' and prospects' data continues to be threatened, and the consequences will be dire unless IT service providers take action. As we learned with the HIV patient data breach that occurred earlier this year, many of these incidents, or the fallout they provoke, are often preventable. Just as a hacker will always find a way in, humans will always make mistakes. Ideally, you'd want to train every employee at the companies whose networks you oversee to be 100% security savvy, but expecting this to be the case is impractical. One accidental mass email send here or one foolish click there could shut down their business for good.

"Just as a hacker will always find a way in, humans will always make mistakes"

That's why you've got to be there for them as their trusted business advisor, providing 24x7x365 IT support. The majority wants this anyway! By following in the footsteps of successful MSPs before you and offering a proactive remote monitoring and management (RMM) solution, you can provide clients the peace of mind that the survey results determined they're looking for. As soon as their network is compromised, your technology will detect it and, depending on whether or not you're leveraging a network operations centre (NOC), either the NOC or your own team will deliver expedited problem resolution. Notice also that as the demand for your services climbs, so too does the need for highly-trained technicians. The U.K. market has reacted to the onslaught of cyber threats by offering more jobs and higher income to those with cybersecurity chops, but we've also learned that an excess of demand for IT labour has meant a shortage of IT talent in the U.K. In response, we encourage IT solution providers to adapt and embrace a new business model centred around labour arbitrage. Look for a solution with people-powered technology. When you take advantage of an RMM platform backed by a NOC that acts as an extension of your team, you're able to more easily scale and grow your business, without growing your payroll.
[quote_box_center]Sources: www.thecommentator.com www.computerweekly.com www.computerweekly.com www.cbronline.com www.pwc.co.uk [/quote_box_center]

### Digital Strategy Innovation Summit London 2015

Rhian Wilkinson is live blogging from the Digital Strategy Innovation Summit London 2015.

[quote_box_center]October 22-23, Stamford Bridge, London

As organisations and the global marketplace grow, the role of digital strategy leaders has become more important for ensuring successful digital strategy implementation and execution. Understanding current challenges will allow digital strategy leaders to excel in their role. This summit will be touching upon:

- Strategic innovation in digital
- Increasing customer engagement through digital strategy
- Tracking digital behaviours
- Managing your digital team
- Digital ROI
- See more here

Click here to visit the website and register. #DIGITALLDN [/quote_box_center]

### IT governance and IT restriction – maintaining an equilibrium

In the move towards a hybrid enterprise IT environment, where users can access applications, data and the underlying infrastructure located on-premises in data centres and private or public clouds, the way IT is managed has to change. The world is becoming a smaller place, with the latest technology connecting us to the whole global landscape. The introduction of a faster and more efficient IT infrastructure has the power to connect regions in almost every country, giving workers the freedom to become more mobile and productive, and making businesses less restricted by their physical location.

[easy-tweet tweet="It is still the responsibility of the CIO and his department to ensure the company’s systems and data are secure" via="no" usehashtags="no"]

The benefits of a flexible workforce embracing mobility and the cloud far outnumber the concerns over migrating to a different work environment in the cloud. These technologies drive innovation in the business and improve employee satisfaction, which results in favourable perceptions of the CIO and the employer as a whole. But how should the IT department manage a changing environment such as this? We know that the hybrid enterprise, while delivering multiple benefits, can expose enterprises to a seemingly infinite number of new attack vectors. In today's reality, users are just as likely to work with their favourite, non-IT-sanctioned cloud apps at Starbucks as they are to sit in a corporate office running centralised data centre apps, resulting in the rise of Shadow IT. Even when employees decide to use technology outside of IT's jurisdiction to do their job, it is still the responsibility of the CIO and his department to ensure the company's systems and data are secure.

Governing what we can't see

The way data is stored by enterprises and used by employees continues to change. Though the flexible environments we work in today promote productivity and employee efficiency, they've changed the way IT governs technology and manages security. When everything was in the office and data centre, governance was much simpler. But with the profusion of applications being accessed inside and outside the workplace, IT runs the risk of losing visibility and control. This could result in a series of security risks and potentially lost or compromised corporate data. Not only could this put sensitive information in the hands of the wrong person, it can severely damage the company's reputation, should customer or public information be compromised.
The reality is that IT cannot govern what it cannot see, and therefore does not have the ability to control access and usage.

[easy-tweet tweet="If #ShadowIT is happening, it’s likely because employees aren't being provided with the best tools to do their job" via="no" usehashtags="no"]

As concerning as the risks may seem, restriction is not the answer. If Shadow IT is happening, it's likely to be a result of employees not being provided with the best tools to do their job. Therefore, it becomes an issue that the organisation needs to address to ensure that the technology being provided is suitable for the workforce. But more importantly, the CIO should embrace an "IT governance" approach, which includes having the visibility in place to monitor user access, network traffic and application performance, in order to provide a holistic understanding of the way IT is being used, without limiting the way employees use it.

If you can see it, you can protect it

As IT overcomes the challenges of the hybrid enterprise, visibility into infrastructure is one thing that cannot be compromised. Lack of visibility into the network and application layers may hinder IT's ability to identify, predict, and prevent threats. Key questions to ask should include: What's on your network? Who's using it? How are they using it? Where are they accessing it? When did this all take place? Answers to these questions should be available in real time in order to provide the most accurate and up-to-date breakdown. The traditional manual approaches to tracking network status often fall short because asset inventories are almost never complete, and at best are only as current as the latest scan. Needless to say, that isn't ideal for security.

Companies that can control and manage complexity, without restricting user access, will be able to use IT as a competitive business advantage, instead of being weighed down trying to solve performance problems and security concerns of business-critical applications. Suffering from Shadow IT and a lack of visibility need no longer be an issue for IT. New technologies that allow visibility and control from one performance management platform mean that in the hybrid enterprise, with employees working from disparate locations, maintaining a balance between IT governance and IT restriction is possible.

[easy-tweet tweet="Suffering from #ShadowIT and the lack of visibility need no longer be an issue for IT" user="riverbed_uk" usehashtags="no"]

### Cloud communications: Why businesses are switching to VoIP

Having reliable communications is a prerequisite for successful businesses all over the world – every missed phone call could be potential revenue down the drain. However, communication isn't just about making sure your office telephone hasn't been left off the hook, and businesses are increasingly looking for more feature-rich telephony options.

The advantages of VoIP

Many firms are now transitioning their telephone infrastructure to a cloud-based voice over internet protocol (VoIP) system. Transferring all audio information digitally over the internet, VoIP offers businesses a number of benefits compared with other telephone systems. Firstly, from a financial point of view, hosted VoIP solutions can prove very cost-effective. With a low up-front cost, it can prove an attractive option for businesses that cannot afford to completely overhaul their existing infrastructure in order to improve their on-premise communications.
Much like other cloud services, businesses are then simply required to pay a monthly subscription fee based on the number of users.

[easy-tweet tweet="Hosted #VoIP solutions can prove very cost-effective says @bsquared90" user="comparethecloud" usehashtags="no"]

[caption id="attachment_32115" align="alignright" width="300"] Click to listen[/caption]

However, there is no reason for businesses to feel limited by the number of users that they have agreed with their cloud VoIP provider. One of the main benefits of this approach to telecommunications is that it is easily scalable. If your business experiences rapid growth, for example, your additional employees can be easily added to the hosted VoIP system without disruption. If companies decide to go with their own on-premise VoIP system, it is likely that they will need to accommodate future growth, at least in the short term, which may not be the most efficient use of resources.

Moreover, VoIP offers businesses a number of additional features that may prove attractive. Employees may be able to create a list of contact numbers that will be dialled in the event that they are unable to answer their main work phone. It could also be set so that their office phone rings a few times before the call is transferred to their smartphone. This could prove extremely useful for mobile workers who can't afford to miss calls because they are frequently on the move. Some VoIP systems also come with voicemail transcription, so instead of frantically scribbling down messages, employees receive an email with all the details included. For business owners, a cloud-based VoIP service can be used to monitor data usage regarding inbound and outbound calls and provide bandwidth utilisation information.

Moving Towards Unified Communications

For businesses looking beyond VoIP, adopting cloud-based unified communications may be an option worth considering. Sometimes referred to as unified communications as a service (UCaaS), this approach combines all of a company's communications over a single digital connection. According to recent research, the unified communications market is expected to grow from $15.12 billion this year to $24.88 billion by 2020, and it's easy to understand this growth in the light of the many business advantages it provides.

[easy-tweet tweet="The unified communication market is expected to grow from $15.12 billion in 2015 to $24.88 billion by 2020" via="no" user="bsquared90" usehashtags="no"]

Similar to VoIP but more extensive, many UC solutions employ a feature known as Presence that lets clients and customers know the best way of contacting an individual at any given time, whether it's email, mobile or any other form of communication. Unified comms also makes life simpler and more secure for employees. They can access all their communications from a single portal from any online-connected device and, because information is transferred digitally, it can be encrypted to ensure that sensitive data does not fall into the wrong hands.

a unified communications solution could offer a low-cost way of gaining a competitive edge

VoIP or UCaaS may not be appropriate for all businesses, particularly as they rely on a strong internet connection and enough bandwidth to support all of a company's incoming and outgoing communications. In addition, businesses may want to leave some critical phone lines operating across a PBX system, as power cuts and internet outages will leave VoIP and UCaaS solutions inactive.
However, if businesses are looking to increase the productivity and flexibility of their employees, a cloud-based VoIP or unified communications solution could offer a low-cost way of gaining a competitive edge.

### Xirrus Solves Security for Public Wi-Fi Networks with EasyPass Personal Wi-Fi

EasyPass Personal Wi-Fi gives any establishment the ability to deliver safe connectivity via secure personal networks

Xirrus, the leading provider of high-performance wireless networks, today announced availability of Xirrus EasyPass Personal Wi-Fi, a key feature now available in all Xirrus cloud managed networks that safeguards users and their data when accessing public Wi-Fi, guest networks and hotspot environments. With this announcement Xirrus is the first Wi-Fi provider to address the security concerns associated with connecting to these networks, and makes secure personal Wi-Fi a reality for the first time.

[easy-tweet tweet="76 percent of people know that public Wi-Fi is not secure" user="xirrus" hashtags="security"]

In a recent Xirrus survey on Wi-Fi usage, 76 percent of people know that public Wi-Fi is not secure, but 62 percent use it regardless of the security implications. EasyPass Personal Wi-Fi allows users to create a secure personal network via a simple, one-time process that authenticates all devices, ensuring the data across their platforms is safe inside the public network at all times. Unlike a VPN, which encrypts end-to-end connections back to a corporate network and requires additional software, EasyPass Personal enables users to easily create their own secure personal network that automatically encrypts the data on the Wi-Fi network. For the first time, businesses can easily offer security on their public Wi-Fi networks without additional complexity.

EasyPass Personal Wi-Fi delivers the following benefits and features:

Safe: Devices communicate within the secure personal network

Secure: Traffic encryption protects valuable data

Simple: Users connect all personal devices effortlessly

Easy: No IT required to deploy and manage for each user

"It used to be that simply providing a Wi-Fi hotspot for your customers was enough, but that isn't the case anymore. Customers are worried about security risks when using public Wi-Fi, while organizations want to use Wi-Fi to engage with customers in new ways," said Nolan Greene, research analyst, network infrastructure, at IDC. "With EasyPass Personal Wi-Fi, Xirrus delivers the ability for organisations to engage their guests, students and customers while offering secure connectivity at all times."

EasyPass Personal enables public Wi-Fi environments to reap the benefits of short-term secure Wi-Fi communications for businesses like hotels, restaurants, retail and event venues, as well as university dormitories, assisted care facilities and multi-tenant buildings which require longer-term use. With EasyPass Personal, these establishments can now promote an unprecedented level of Wi-Fi security, to offer new services, attract new business and create customer loyalty.

"Awareness of public Wi-Fi vulnerabilities is at an all-time high. Whether we want to acknowledge it or not, every time we connect to public Wi-Fi we are putting our data at peril and are at risk of identity theft. Until now there hasn't been a wireless solution to address this threat," said Shane Buckley, CEO, Xirrus. "EasyPass Personal Wi-Fi is groundbreaking for our industry.
With this introduction, Xirrus offers the first and only solution in the market that allows a secure private connection to be created within a public Wi-Fi network. Now establishments that have deployed Xirrus Wi-Fi can deliver the safest connection available and protect their users from cyber crime."

Xirrus EasyPass Personal is included at no charge with all XMS Cloud deployments and is available for shipment today.

### Choosing a new phone system – On-premise versus cloud

As older traditional phone systems reach end of life, more and more organisations will be looking to replace legacy equipment with more modern unified communications (UC) technology that can deliver additional benefits such as cost savings, enhanced customer service, more functionality and greater flexibility in terms of location-independent collaboration and integration of existing applications. The question though is whether they replace like with like and still go with a capital investment, or whether they opt for cloud-based communications. A recent report by IDC says that SME cloud spending will grow by nearly 20% by 2019, so there is clearly a rise in the popularity of hosted technology, but a decision between buying or 'renting' could be determined by a range of factors such as industry sector, size or other characteristics unique to an individual business.

[easy-tweet tweet="SME cloud spending will grow by nearly 20% by 2019" user="comparethecloud" usehashtags="no"]

So how do companies best assess what type of communications is right for them? Whilst there may be little difference in the functionality available with cloud-based UC compared with on-premise (be aware though that some vendors' cloud solutions may be watered down), there is a large contrast in the delivery and payment model. For many businesses this is often a starting point for discussion. For organisations such as start-ups that lack working capital, the cloud model is particularly attractive for communications as, with all cloud-based solutions, the initial outlay is low, using a 'pay as you go' approach for each user, and on-going costs are affordable and predictable.

In contrast, an on-premise unified comms solution can often involve a large capital investment for the system itself plus all the additional hardware such as servers and software licences for every user. The hardware investment has a dual implication:

1. upfront costs

2. on-going costs, in the shape of adding or changing phone extensions, upgrading new functionality, and future de-installing and re-installing of the solution in the case of a move or expansion.

For new and/or fast-growth companies, MACs (moves and changes) are likely to dominate, and if you do not have an IT Manager in-house, then it could be both expensive and time-consuming to rely on a third party if you go for an on-premise solution. In these circumstances cloud-based UC is likely to be preferable because it provides much more flexibility, both budget-wise and from a scalability point of view (a rough cost sketch follows below). The only hardware that is potentially required is desk-phones or headsets, and with many solutions such as Swyx, these can be eliminated altogether and replaced with a mobile, as all PBX functionality (calls, call routing, CTI etc.) and 'rich presence' or status information can be supported on a smartphone device.
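As a rough illustration of the capex-versus-opex trade-off just described, the sketch below compares the cumulative cost of an up-front on-premise system (plus annual maintenance) with a per-user monthly cloud subscription, to show where a notional break-even point might fall. Every figure is a made-up placeholder for illustration, not a Swyx or any other vendor's price.

```python
def cumulative_costs(years, users,
                     onprem_capex=20000.0, onprem_annual_maint=2500.0,
                     cloud_per_user_month=15.0):
    """Return (on-premise, cloud) cumulative cost after a number of years.

    All prices are hypothetical placeholders, not vendor list prices.
    """
    onprem = onprem_capex + onprem_annual_maint * years
    cloud = cloud_per_user_month * users * 12 * years
    return onprem, cloud


if __name__ == "__main__":
    USERS = 50
    for year in range(1, 8):
        onprem, cloud = cumulative_costs(year, USERS)
        cheaper = "on-premise" if onprem < cloud else "cloud"
        print(f"Year {year}: on-premise £{onprem:,.0f} vs cloud £{cloud:,.0f} -> {cheaper} is cheaper")
```

With these invented numbers the subscription wins for the first few years and the capital purchase only pulls ahead later, which is exactly the kind of break-even question each business needs to work through with its own figures.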
Similarly, when it comes to upgrading your software and system, with cloud this is done automatically by your service provider, so you are always on the latest and greatest without worrying about managing legacy equipment or downtime.

[easy-tweet tweet="If your business is highly reliant on communication, then with cloud you will have built-in redundancy supplied by your service provider" via="no" usehashtags="no"]

With the majority of phone systems (including on-premise) now based on VoIP and SIP trunking, the question of business continuity should also be raised. If your business is highly reliant on communication, then with cloud you will have built-in redundancy supplied by your service provider, whereas with on-premise you would need to invest in a second server at the same or a different location to offer a back-up facility. Conversely, there are also arguments for buying a solution, because once an organisation has recouped the initial investment, costs could be lower over the long term. For established companies that can afford the capital outlay this may be preferable to cloud, and it is even possible to have a CPE solution virtualised in a data centre, so you have all the advantages of cloud but can also realise a return on your investment. Maintenance costs can also be minimised if you can draw on the expertise of your own IT team, who can carry out any updates or even work on integrating your communications with other applications in the business such as your CRM or accounts. With Swyx's Visual Contacts function, for example, it is possible for the system to automatically draw from a company database, so that screen-popping of information occurs – including customer details or perhaps accounts data – and you can handle enquiries more efficiently.

If you are still unsure of which path to take, always speak to a vendor or reseller that offers both, so there is no bias and they can judge more fairly what makes best business sense based on your specific individual circumstances and overall objectives. The good news though is that whichever choice you make, unified communications is sure to reduce overheads, increase productivity and provide a competitive edge. According to the Cloud Industry Forum, more than 30% of small businesses and 55% of mid-sized companies plan to deploy some form of UC in the next 12 months, so now is the perfect time to discover how you can align your communications strategy with your business goals.

Checklist for Choosing 'Cloud' or 'On-Premise'

Here is a quick overview of the key attributes of companies and why they would opt for cloud or on-premise respectively.

| Cloud | On-Premise |
| --- | --- |
| Start-ups working with minimal capital | Cash-rich, preference for capex |
| No IT expertise in-house | IT Manager / expertise in-house |
| Already use other cloud services | More conservative industry, e.g. banking |
| Always want to be on the latest technology | Want to save money over the longer term |
| Green credentials (less hardware required, e.g. no server required) | Size and needs of business are static |
| Fast-growth companies continually moving or opening new sites / offices | |

Further Reading: Swyx has produced a free whitepaper on 'How SMEs benefit from cloud-based communications' available to download here.

### The Future of Marine: the Power of Data

Unfavourable macroeconomic conditions have weighed down on marine operators over the last few years.
The sector is under significant pressure and today the focus for vessel owners and operators is set firmly on optimising operations and reducing operational costs.

[easy-tweet tweet="A new set of technologies, driven by the merging of physical and digital, is transforming the marine industry" usehashtags="no"]

A new set of technologies, driven by the merging of the physical and the digital, is transforming the industry and providing key benefits for vessel owners. GE Marine's SeaStream Insight, which provides remote monitoring, asset support and predictive analytics, is one pivotal technology that is changing the game here. The technology is being showcased to 1000 customers, developers, media and analysts at GE's hallmark Industrial Internet event, Minds + Machines. It enables operators to make more informed decisions based on data and can help significantly increase a fleet's operational efficiency. With this in mind, here I'd like to share three examples of how SeaStream Insight can provide remarkable value to the marine industry.

Providing a holistic overview, empowering operators to make smart decisions

Even before a ship is built, SeaStream Insight can model behaviour based on user and system requirements to build out a blueprint of the entire vessel and predict its operational performance, something unseen elsewhere in the industry. But the tool is not only of huge benefit in the build phase: once the ship is in operation, it collects a mass stream of data relating to the condition of individual components from multiple networks. This is then processed and translated into clear, insightful information for vessel operators. This real-time monitoring provides owners with increased situational awareness, helping them to make smart operational decisions, faster.

Moving from planned to condition-based maintenance

SeaStream Insight uses GE's SmartSignal analytics platform to predict the future condition of a vessel's assets. It allows companies to monitor vessels in real time, record and analyse their histories and search for anomalies. It can give early warnings when an asset is exhibiting off-standard behaviour, identifying potential problems before they occur. Therefore ship operators can take action weeks, or even months, before a potential failure. This enables them to switch from planned to condition-based maintenance, reducing downtime and creating potentially significant cost savings.

[easy-tweet tweet="SeaStream Insight uses GE’s SmartSignal #analytics platform to predict the future condition of a vessel’s assets." via="no" usehashtags="no"]

Connecting experts from around the world remotely

This access to real-time insight from the vessels enables onshore experts, no matter where they are in the world, to remotely diagnose problems and advise on next steps immediately. In fact, the use of SeaStream Insight is anticipated to help reduce third-party costs associated with repairs, which will bring significant value especially for the offshore industry, as vessels are often operating in very remote areas. This approach can not only provide vessel owners with instant access to the knowledge of hundreds of experts globally, but also enable faster root-cause analysis. Use of SeaStream Insight can also help to bring considerable value across the different marine segments, including the transport, navy and offshore sectors, which may all face similar overarching issues but, crucially, have differing priorities.
Optimizing fuel efficiency for transport vessels

Predictive analytics tools can provide the ability to forecast weather conditions to inform optimum propulsion levels, therefore reducing fuel consumption

Fuel efficiency is crucial for Liquid Natural Gas Carriers (LNGCs), container ships and cruise vessels, as in some cases fuel costs account for up to 30-50% of total operational expenses. Predictive analytics tools can provide, for example, the ability to forecast weather conditions to inform optimum propulsion levels, therefore reducing fuel consumption. The ability to predict faults before they occur also helps to defer maintenance.

Increasing reliability for navy vessels

Whilst, thankfully, active combat duty is not a regular occurrence for most naval vessels, they must be ready for action at a moment's notice, so reliability, availability and maintainability are vital. Typically, all on-board systems are tested separately, and it is extremely difficult to test them all together until the ship is actually in active operation at sea, which can present a challenge. As naval vessels have zero tolerance for systems failure, even if they have redundancy built in, the ability to run diagnostics on all on-board equipment before it goes live is crucial. SeaStream Insight can model behaviours based on user and system requirement documents and build out a blueprint of the entire vessel even before it is built. Through vessel operation modelling, it helps to predict how different assets interact with each other. This modelling provides operational readiness and can give naval commanders increased confidence that their vessels will work at the level required once at sea.

Reducing downtime for offshore vessels

Reducing downtime is of huge importance within the offshore industry. Outages to the "money line" (control centre) that taps into oil and gas reserves account for a large percentage of total downtime. It has low redundancy, meaning a single failure in the chain of equipment can cause non-productive time. What we need is a holistic view of the entire system and the ability to help prevent problems before they strike. SeaStream Insight could help reduce unplanned downtime, increasing customer revenues. Predictive detection of problems will further help with crew training, cost reduction and, ultimately, de-manning, which addresses a global skills shortage in the offshore industry; after all, few vessel operators have access to sufficient qualified engineers to be able to deploy experts in every system aboard every vessel.

[easy-tweet tweet="The digital revolution is upon us says Andy McKeran of GE Marine" user="comparethecloud" usehashtags="no"]

The digital revolution is upon us. The use of these technologies is radically changing the efficiency of systems and machines, opening up significant opportunities in many sectors. However, increasing a vessel's operational efficiency is an extremely complex process, which starts at the design phase. GE's expertise in software solutions is optimised through partnerships with vessel owners and operators from the outset. This industry collaboration will ensure vessel operators are better prepared for the challenges ahead.
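To make the condition-based maintenance idea described above a little more tangible, here is a deliberately simple sketch of spotting off-baseline sensor readings with a rolling mean and standard deviation. This is not GE's SmartSignal: the window size, z-score threshold and simulated bearing-temperature data are all assumptions made purely for illustration, and a production system would model many correlated signals rather than a single sensor.

```python
from collections import deque
from statistics import mean, stdev


def detect_anomalies(readings, window=20, z_threshold=3.0):
    """Yield (index, value, z_score) for readings far from the recent baseline.

    A real predictive-maintenance platform models many correlated signals;
    this single-sensor z-score check only illustrates the basic principle.
    """
    history = deque(maxlen=window)
    for i, value in enumerate(readings):
        if len(history) == window:
            mu, sigma = mean(history), stdev(history)
            if sigma > 0 and abs(value - mu) / sigma > z_threshold:
                yield i, value, (value - mu) / sigma
        history.append(value)


if __name__ == "__main__":
    import random

    random.seed(1)
    # Simulated bearing-temperature readings (deg C) with two injected faults.
    readings = [70 + random.gauss(0, 0.5) for _ in range(120)]
    readings[60] += 6.0                                    # transient spike
    readings[90:] = [r + 5.0 for r in readings[90:]]       # sustained step change
    for idx, value, z in detect_anomalies(readings):
        print(f"Reading {idx}: {value:.1f} degC is off-baseline (z = {z:+.1f})")
```

Even a crude rule like this flags the spike and the step change well before a fixed maintenance interval would, which is the essence of moving from planned to condition-based maintenance.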
### SEO @ Scale for IBM WebSphere Commerce Customers

DeeperThanBlue, a digital channel solutions provider, announced today that it has partnered with OneHydra, the leading SEO automation solution, to develop a plugin that will help IBM WebSphere clients significantly improve their SEO (Search Engine Optimisation) performance.

The OneHydra solution allows online retailers that use the IBM WebSphere Commerce platform to efficiently manage the SEO of their website, regardless of the number of products and SKUs on the site. The solution is cloud based for rapid customer on-boarding and utilises a plugin developed by IBM Premier Partner DeeperThanBlue using the latest WebSphere Commerce versions and certified by IBM as Ready for Smarter Commerce. Using the plugin, OneHydra integrates in real time with merchants' (retailers') eCommerce platforms, both large and small, so SEO can be introduced and managed on tens of thousands of products without the time or expense involved in hiring a large team, simplifying digital marketing optimisation. SEO automation optimises large sites with minimal user intervention, with SEO changes being applied instantly across the full site. This makes it easier for consumers to find a retailer's products through better visibility, leading to increased traffic and revenue. As a result, return on investment is significant.

[quote_box_center]"We have worked with a number of SEO companies over many years helping retailers and merchants optimise their IBM Commerce sites, and in our view OneHydra provides a unique solution for addressing SEO at scale for large online and multichannel retailers. By deploying the IBM Commerce plugin with OneHydra, digital marketeers can demonstrate an improved return on expenditure from their customer acquisition activities," said Chris Booker, Sales & Marketing Director – DeeperThanBlue.[/quote_box_center]

Once OneHydra is installed with WebSphere Commerce, it works by gathering and distilling data and insight from the website, discovering, analysing and managing keywords, and identifying thematic cross-linking to drive stronger rankings. The connection with IBM WebSphere Commerce means that new content pages, once approved by a merchandiser, are automatically added to the IBM Commerce catalogue, resulting in real-time optimisation that is relevant to customers' search activity and seasonal trends. And deployment of OneHydra is pain-free too. The complete integration package can be installed in a matter of hours, with components based on WebSphere Commerce templates, meaning that every page is optimised by OneHydra. DeeperThanBlue can assist customers with the installation of the plugin into their WebSphere Commerce environment and with site template modifications.

Warren Cowan, CEO and founder of OneHydra, added: "We wanted to create a plugin for IBM WebSphere to make it easy for WebSphere Commerce users to quickly implement OneHydra.
Partnering with DeeperThanBlue was pain-free, and their expertise in the WebSphere Commerce platform means that IBM clients who use the OneHydra plugin can now benefit from traffic and revenue uplift nigh-on instantly."

### 1 in 4 employees would sell company patents for a trip to the Bahamas

A quarter of employees would sell company patents and credit card details for the price of a trip to the Bahamas

A third of employees have their price for selling their company's private data

25% would sell critical business information for £5,000, with some open to bribes for as little as £100

35% of employees would sell information on company patents, financial records and customer credit card details if the price was right.

[easy-tweet tweet="Third of employees have their price for selling their company’s private data" user="comparethecloud" hashtags="cloudnews"]

New research by Clearswift amongst 4,000 employees in the UK, Germany, USA and Australia found that for £5,000 – the price of a family Caribbean holiday, and less than three months of the average UK wage – 25% would sell such data and risk both their job and criminal convictions. 3% of employees would sell private information for as little as £100, rising to 18% who would accept an offer of £1,000. The number of employees open to bribes increases to 35% as the offer reaches £50,000. Such information can prove very valuable to competitors and criminals, and employee bribery can be an easy way in as security systems become more sophisticated. Certain companies and even governments have long targeted patents and other intellectual property which can give them an edge, whilst competitors can benefit from information such as when contracts are coming to an end. Criminals can use private information to steal money or bribe senior employees.

[quote_box_center] Heath Davies, Chief Executive at Clearswift, says: "Whilst people are generally taking security more seriously – 65% of employees said they wouldn't sell data for any price – there is still a significant group of people who are willing to profit from selling something that doesn't belong to them. This information can be worth millions of pounds."

"A case in point of the true value of data is the recent Ashley Madison hack, where user data was accessed by a member of their extended enterprise (part of their technical services), according to the site's CEO, and the effects have been monumental. The site announced earlier this year that it hoped to raise £130 million in an initial public offering in London, and it may now have lost out on this opportunity, reducing the value of its entire business. The attack may have burned a hole in its prospects and has already had a ripple effect on its sister sites Cougar Life and Established Men. As such it is important for companies to understand the risk and address it appropriately – this research can help them do that." [/quote_box_center]

The opportunity to sell valuable information is exacerbated by the ready access most employees have to it. 61% of respondents said they had access to private customer data, 51% to financial data such as company accounts or shareholder information, and 49% to sensitive product information such as planned launches and patents. Attitudes to data security were mixed, with only 29% saying that company data was their personal responsibility, and 22% saying they didn't feel it was their responsibility at all.
A corresponding Clearswift survey of 504 information security professionals found 62% think employees don’t care enough about the implications of a security breach to change their behaviour. Davies concludes: “It is not good business to live in fear of your employees, especially as most can be trusted. Getting the balance right has always been hard. But truly understanding where the problems come from, combined with advances in technology which can adapt to respond differently to different threats, really changes the game here.” Organisations need to find ways to control where sensitive data is stored and put safeguards in place which prevent it from leaving the company network. Many companies do this, but a lot of large companies with very valuable data do not.

### September Top 10 #CloudInfluence UK and Cloud Bursters After a bit of a lull over the summer vacation months, there were numerous influencers that made a real return to form in September. As a result, for the first time there was a remarkable overlap between the main top 10 and the top ten bursters. [easy-tweet tweet=".@Microsoft’s @SatyaNadella and @ScottGu have topped both the individual and the cloud burster rankings" via="no" usehashtags="no"] Microsoft’s Satya Nadella and Scott Guthrie topped both rankings. Also in both the main top ten and the top ten bursters were Aaron Levie, Robert Faletra, Matthew Prince and Bill Ruh. Apple’s Tim Cook, who was just outside the main top ten, made it into the top ten bursters. Then came Amit Zavery, who was publicising the two new additions to Oracle’s cloud platform for integration – Oracle SOA Cloud and Oracle API Manager Cloud Services. These new offerings join its other iPaaS services, including Oracle Integration Cloud, which was announced in June. Also commenting on these Oracle announcements was Ravi Gade, senior director of IT Applications at Calix, a supplier of telecommunications access equipment for service providers. [table id=55 /] In the UK ranking, Tom Reuner from analyst firm HfS Research was quoted widely on the evolving competitive dynamics in the IT services sector. Simon Porter recovers ground to climb back up to 2nd, and Donal Daly, Founder & CEO at Data Wranglers Ltd, makes a first appearance in the rankings at 3rd. [table id=54 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### The ECJ v DPC Ruling on the Safe Harbour Scheme Data location and protection have been at the forefront of our minds recently. Last week the Weltimmo case saw a landmark ruling passed - ensuring companies are held to the standards of the country they are operating in - regardless of where they may claim to be operating from. This week we see a fresh case come to light, with the safe harbour scheme taking centre stage. Here's a brief rundown of the case, and its results. [easy-tweet tweet="Safe harbour is dead." user="rhian_wilkinson" hashtags="cloudlaw, dataprivacy"] As is the case with other subscribers residing in the EU, some or all of the data provided by users to Facebook is transferred from Facebook’s Irish subsidiary to servers located in the United States, where it is processed.
Maximillian Schrems, a Facebook user since 2008, and an Austrian citizen, lodged a complaint with the Irish supervisory authority (the Data Protection Commissioner), taking the view that, in the light of the revelations made in 2013 by Edward Snowden concerning the activities of the United States intelligence services (in particular the NSA), the law and practice of the United States do not offer sufficient protection against surveillance by the public authorities of the data transferred to that country. The Irish authority rejected the complaint because, in a decision of 26 July 2000, the Commission had considered that, under the ‘safe harbour’ scheme, the United States ensures an adequate level of protection of the personal data transferred. In other words, the authority held that Mr Schrems had no grounds for his complaint. The case then went to the High Court of Ireland, which referred it to the European Court of Justice – and the Court has now declared the Safe Harbour Decision invalid. This judgment has the consequence that the Irish supervisory authority is required to examine Mr Schrems’ complaint with all due diligence and, at the conclusion of its investigation, is to decide whether, pursuant to the directive, transfer of the data of Facebook’s European subscribers to the United States should be suspended on the ground that that country does not afford an adequate level of protection of personal data. To simplify - Mr Schrems has brought a hell of a lot of attention to the fact that the United States may not be respecting EU citizens' data privacy rights - the European Court of Justice has acknowledged his fears, and the Irish authority now has to look into it. [quote_box_center] Ashley Winton, UK head of data protection and privacy at international law firm Paul Hastings, comments on the European Court of Justice’s landmark Schrems v DPC ruling. “The ECJ’s Schrems v Irish Data Protection Commissioner ruling has serious repercussions for multi-national companies with operations in Europe. Data Protection law in Europe provides that personal data may not be exported out of Europe unless certain conditions are met. More than 4000 US companies have so far enjoyed using the ‘safe harbor’ rules agreed between the European Commission and the US Department of Commerce which permit the easy transfer of personal data from Europe to the US. Many European data protection regulators, particularly those in Germany, have long believed that the conditions of the safe harbour scheme are not substantial enough, and the effect of today’s ruling will empower them to investigate and check the acceptability of any data transfer themselves. In addition, although the case today primarily concerns safe harbour, the ruling will also apply to other European Commission approved methods of transferring personal data internationally. Crucially, this case cannot be considered alone. Following the landmark case of Weltimmo last week, multinational companies that have elected to create an establishment in a more business-friendly jurisdiction are now likely to have their data protection practices scrutinised by local regulators all across the EU. There are currently no rules limiting individuals bringing complaints regarding data protection across multiple jurisdictions simultaneously, so we may now see these complaints springing up from every direction, where data is being shared around the world.” [/quote_box_center] It's been an exciting week for EU data - are any of you enjoying this as much as me?

### Wearing It Well: Is Field Service The Killer App for Wearable Computing?
For many of us, when we think of wearable technology we think in terms of gadgets aimed at the fitness market, enabling sports jockeys to gather stats about their latest workout, heart rate, and mileage, or of toys for techies who want the ability to view text messages and weather forecasts on their watch. It’s yet to explode into mainstream life, and is certainly still nascent in the enterprise. According to a 2015 Digital IQ survey, only three per cent of enterprises are investing heavily in wearable technology, down from six per cent in 2014. In business terms, many people view wearable computing as a technology that’s still in search of a business problem. Field Service is ideally placed to change all that, with the potential to become the catalyst for the elusive ‘Killer App’ wearables will require in their quest for market adoption. [easy-tweet tweet="Embedded sensors, processors, software, and connectivity are changing industries' approaches" via="DaveHartProfit and @ServiceMax" hashtags="Iot"] [caption id="attachment_32006" align="alignright" width="212"] Scroll down to see the infographic in full[/caption] The convergence of major industry trends such as the Internet of Things, Cloud Computing, Servitisation, and 3D printing is nudging field service to the forefront of the business agenda. Embedded sensors, processors, software, and connectivity in products, coupled with a product cloud in which product data is stored and analysed and some applications are run, are driving major improvements in product functionality and performance. This is also creating massive amounts of new product-usage data. Ironically, for an industry that was relatively late to join the information economy, field service could soon find itself at the leading edge of it in the next few years. Until just a few years ago, innovation in the Field Service industry was inching forward at a snail’s pace with manual processes and paperwork. The last two ‘revolutionary’ technologies to hit the industry were mobile, which freed techs from the tedious hamster wheel of triplicate paper-based processes, and more recently the cloud, which empowered access and visibility to unprecedented amounts of information. The latter gave rise to the Field Service Management industry, modernising Field Service for the twenty-first century with a dedicated business platform and automated end-to-end processes. While mobile freed techs from paper, their hands were still tied. Today, if a field worker needs how-to instructions about a specific piece of equipment, can’t remember every step of a procedure, or needs to seek guidance from a colleague, he or she must literally put down their tools and stop what they’re doing while they log on, look up the information required on their smartphone, tablet or PC, or make a phone call to someone who can advise them. This slows progress for both the technician and the customer in an industry where time is money. [easy-tweet tweet="With wearables, field techs have the potential to harness a range of supportive functionality" user="comparethecloud" hashtags="iot"] With wearables, field techs have the potential to harness a range of supportive functionality, such as augmented reality, 3D visualisations, and video conferencing to gain critical information and better facilitate service calls.
Schematics can be overlaid onto machines while they are servicing them and colleagues can help field workers in remote locations by seeing what they see onsite to lend support rather than incurring the expense of flying in an expert. These sorts of collaborations can be recorded and stored as reference materials for future jobs, as well as video evidence for using in disputes or investigations on safety or expensive equipment. Likewise cameras can immediately identify the parts and devices the technician is looking at, read serial numbers and bar codes, and automatically detect completion of steps. [easy-tweet tweet="Ruggedised wearable devices in the form of smart watches, smart helmets, etc. can be designed to alert for safety hazards" via="no" usehashtags="no"] Take the Oil and Gas industry, for example. With a decline in production rates, increasing production costs, a retiring skilled workforce leaving a void of knowledge and expertise among younger colleagues, not to mention a whole host of remote and dangerous working environments makes it an ideal candidate for wearable tech. Ruggedised, and even explosion-proof wearable devices in the form of smart watches, smart helmets, smart glasses, as well as sensors embedded into clothing can be designed to detect such things as radiation and chemicals, monitor a worker’s stress levels and alert them to safety hazards, suggestions and procedures – all of which can dramatically improve health and safety. And with augmented reality, access to a wide range of data, virtual over the shoulder coaching, and collaboration in a hands-free environment, wearables can not only enable greater efficiency, faster decision making, and reduced down time, but also lower costs by significantly improving worker efficiency. If we look back at the history of technology adoption, it is the proof of a hard ROI number and strong bottom line cost savings which typically catapult adoption into the enterprise. Wearable technology has the ability to empower field service workers to work seamlessly onsite with access to all of the data, insight and information they require. The equipment that technicians are being tasked with servicing is becoming increasingly complex, and companies are diversifying to repair and maintain a variety of products - including third party products, requiring a wider range of skills. In this context, wearables can not only ease the burden of servicing an increasingly diverse product base, but also enable access to faster, more dynamic intelligence that static manuals simply cannot provide. A recent IDC report estimates the global market for wearables will “swell to 111.9 million units in 2018, resulting in a CAGR of 78.4%”. We are still some way off from seeing wearables used to their full potential, but the field service industry will be both one of the biggest trail blazers and one of the biggest beneficiaries in this market. With Gartner predicting that companies in field service could increase their profits by up to $1 billion annually through greater efficiency and cost effectiveness, it’s not a question of if, but rather of when. 
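If we imagine the hands-free flow described above in code, a wearable's serial-number or barcode read could drive the next step of a work order along the lines of the sketch below. This is a minimal illustration only: the data structures, field names and step logic are assumptions made for the example, not any vendor's actual field service platform.

```python
# Hypothetical sketch: a wearable reads a serial number, the "platform" looks up
# the asset, marks the identification step complete and returns the next
# instruction. All data structures and field names are illustrative assumptions.

# Stand-in for the field service platform's records
ASSETS = {"SN-48213": {"id": "pump-07", "model": "XL Coolant Pump"}}
WORK_ORDERS = {
    "WO-1009": {
        "steps": ["identify part", "isolate power", "replace seal", "test run"],
        "completed": [],
    }
}

def on_serial_scanned(serial_number: str, work_order_id: str) -> str:
    """Return the next instruction to show on the technician's display."""
    asset = ASSETS[serial_number]                # camera read -> asset lookup
    order = WORK_ORDERS[work_order_id]
    order["completed"].append("identify part")   # step auto-detected as done
    remaining = [s for s in order["steps"] if s not in order["completed"]]
    return f"{asset['model']} - next step: {remaining[0]}"

if __name__ == "__main__":
    print(on_serial_scanned("SN-48213", "WO-1009"))
    # e.g. "XL Coolant Pump - next step: isolate power"
```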
### Alfresco Achieves Benchmarking Milestone by Processing 1 Billion Documents with Amazon Aurora [easy-tweet tweet="New Version of Alfresco One Previewed at AWS re:Invent, Features Impressive Speed and Scale in the Cloud" via="no" usehashtags="no"] Alfresco Software today announced that the upcoming version of its Enterprise Content Management (ECM) platform has achieved a significant milestone in cloud-based content management by surpassing the 1 billion document mark on a full cloud-centric technology stack based on Amazon Elastic Compute Cloud (EC2) and Amazon Aurora. Alfresco, a leading provider of modern ECM and Business Process Management software, is previewing Alfresco One 5.1 at AWS re:Invent this week. Alfresco recently announced that it certified Amazon Aurora, allowing Alfresco One Enterprise users the benefit of the scale, services, and cost structure of Amazon Relational Database Service (Amazon RDS). Amazon Aurora enables Alfresco One to store, manage, and retrieve billions of documents and related information with fast and linear scalability. Using new techniques of information modelling, indexing, and processing with the recently launched Amazon Aurora database, Alfresco can now support cloud-based workloads previously not possible for high-throughput insurance, banking, and case-based applications. The 1 billion document benchmark validates Amazon Aurora’s flexibility and scalability on real-life use cases on AWS. This demonstrates the ability to leverage AWS and Amazon Aurora for scalable deployments of content-centric solutions based on Alfresco to address business requirements of the modern digital enterprise, and enable a more agile and cost-effective deployment model compared to traditional on-premises deployments.

Alfresco One ingested over 1,000 documents per second, quickly reaching 1 billion documents

On this specific benchmark, Alfresco One ingested over 1,000 documents per second, quickly reaching 1 billion documents, backed by a 3.2 terabyte database in Amazon Aurora. Alfresco One, running on Amazon EC2, achieved this with linear ingestion scalability, an important success factor in large migration and consolidation projects. When benchmarked at this scale, the Alfresco One repository also showed virtually no performance degradation in content and metadata operations. “This benchmark demonstrates why Alfresco is a content management platform of choice for global enterprise deployments due to its ability to scale, both horizontally and vertically, and seamlessly leverage native features of cloud and hybrid environments because of its open and modern architecture,” said John Newton, CTO and co-founder of Alfresco. “The next version of Alfresco takes this up a notch with new scalability features designed to support cloud deployment of large-scale use cases, like extended enterprise collaboration and headless content platforms.” “Amazon Aurora is a relational database engine that combines the speed and availability of high-end commercial databases with the simplicity and cost-effectiveness of open source databases,” said Anurag Gupta, Vice President, Database Services, Amazon Web Services, Inc.
“We applaud Alfresco’s ability to leverage Amazon Aurora to address business requirements of the modern digital enterprise, and enable more agile and cost-effective content deployments.” [quote_box_center]Alfresco One 5.1, shipping soon and compatible with Amazon Aurora, will include new features related to scalability and large use cases, including:

- Solr search distributed index server to optimise index size and performance characteristics
- Improved metadata query performance for high volume transactional processing
- Improved integration with modern cloud and DevOps deployment and management technologies to enable seamless cloud deployment

[/quote_box_center]

### September Top 50 #CloudInfluence Individuals This month we have a proliferation of CEOs at the top of the rankings – 9 of the top 13. Leading them all, as he has done in several previous months, was Satya Nadella from Microsoft, who was followed by his colleague Scott Guthrie in 2nd and Nicole Herskowitz in 8th. They were each quoted widely as Microsoft announced a number of enhancements to its cloud services at its recent Azure Convention. Daniel Ives, a regular in the rankings, was also quoted widely commenting on everything from the new iPhones to the future of tech giants like HP, Oracle and IBM. [easy-tweet tweet="The top 50 #CloudInfluence Individuals are out now" user="comparethecloud" usehashtags="no"] Announcing that Oracle's software will be available on the cloud by the OpenWorld conference at the end of October enabled Mark Hurd to hit 4th. Sharing a stage at the BoxWorks conference in San Francisco, Aaron Levie in 5th and Tim Cook in 11th not only talked technology, but both also touched on corporate responsibility, arguing that American corporations have a responsibility to help improve equality, the environment and public education because of a lack of government progress in the past few decades. Doing his best to promote the 2015 CRN Fast Growth 150 List was Robert Faletra in 6th, followed by Matthew Prince from CloudFlare, the hot start-up that has attracted $110 million in funding from Fidelity, Baidu, Google, and Microsoft. In 9th was Bill Ruh from GE, which expects its software revenue to roughly triple to $15 billion by 2020 as it reaps significant gains from its digital operations. The company has predicted that the industrial internet will be twice the size of the consumer internet as industrial companies like GE and its customers generate massive productivity gains through the deployment of “intelligent” machines that eliminate unscheduled downtime. Rounding out the top ten was Salesforce's Marc Benioff, who was busy this month touting the company’s Microsoft and Cisco partnerships as a path to success in the Internet of Things (IoT). This followed a host of other Salesforce enhancements announced at Dreamforce. Just outside the top 10 in 12th was Meg Whitman, as HP restructures its cloud business ahead of its November corporate split. HP’s cloud R&D division will move into its software business, led by Robert Youngjohns. Its cloud product management group, bossed by Bill Hilf, will also be swept into the software division. According to Whitman, this will help HP’s hybrid cloud strategy a great deal. HP’s actual cloud product division will move into the Enterprise Group, the same group that flogs servers and networking equipment.
[easy-tweet tweet="#1 in the September #CloudInfluence Individual table goes to @SatyaNadella " user="comparethecloud" usehashtags="no"] [table id=53 /] NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### The IoT, Data Centres and High-Performance Computing Analyst firm Juniper Research estimates that the number of internet-connected things will rise by 285 per cent between 2015 and 2020, from 13.4 billion to 38.5 billion. The Internet of Thing has the potential to permanently disrupt 20th century business models. The IoT has grown to become a global environment of networked devices and sensors with the capacity to provide real time updates of highly specific information. Personal wearables and smart home appliances have effectively captured the interest and imagination of consumers but it is in the sensors embedded in public infrastructure, utilities, factories and supply chains where IoT has the greatest potential to make real, substantive changes to our day to day lives. These ‘devices’ provide critical, real time information that make production processes more efficient, identify potential problems instantly and act to remedy them. [easy-tweet tweet="In the age of #BigData, information gathering, storage and analysis all facilitate behaviour tracking and prediction" usehashtags="no"] In the age of Big Data, information gathering, storage and analysis all facilitate behaviour tracking and prediction, situational awareness and data driven analytics. The next step for the IoT is the jump from monitoring and analytics to automation. Rather than monitoring and reporting a problem for a human to rectify, these devices utilise Machine to Machine (M2M) communication to take action, independent of human intervention. This step will further increase productivity gains and safety standards by reducing reaction times to fractions of a second and allowing for automated error remedy and quality control. This is where the true potential of the Internet of Things is revealed. Data is collected and used as the basis for automated processes which then in turn feed more data back through the network which modifies the automated process further and so on. This is where the true potential of the Internet of Things is revealed However, none of this smooth, streamlined process would be possible without the substantial hardware support provided by the data centre. Data centres are absolutely at the heart of the Internet of Things. They provide the bulk storage capacity and high performance processing power required to handle the massive volumes of raw data that billions of connected devices generate. Back in 2013 the amount of information handled online, globally, was 4.4 zettabytes (the rough equivalent of 18,000 years of HD-TV). It is estimated to be at least ten times that volume by 2020. With the current proliferation of cloud based internet services it is easy to forget that ultimately all that data has to live somewhere. That somewhere is almost certainly a data centre. But traditional data centres are not capable of matching the kind of exponential, global growth that is predicted for the Internet of Things. 
[easy-tweet tweet="Data centres are absolutely at the heart of the Internet of Things" user="ioDataCenters" hashtags="iot, datacentre"] Given the sheer volume of structured, semi-structured and unstructured data necessary to facilitate M2M communication, automated systems and complex analytics, the modern data centre must be able to spread the storage and computing loads across numerous, global locations. Efficient system management platforms are the only way to make such data manageable and useful. Data centre providers must recognise that they have become a communications hub facilitating the movement of big data from one site to another that may not be known. As a result the new age of data centres must build on and off ramps to clouds, platforms, networks, service providers and enterprises alike. The data required for automated systems to function comes from a network of IoT connected devices all over the world While data volumes are increasing exponentially there is a functional limit to the processing power of single, high-performance computers. With this much data, and this many non-human users of data centres, standardisation must prevail within the physical data centre. This is where the value of a modular approach to colocation begins to enable scalability of high performance computing. The data required for automated systems to function comes from a network of IoT connected devices all over the world. Data centres must anticipate the need for global data analysis/aggregation and by embracing a standardised modular approach data centre providers allow their users to easily harness the processing power of multiple, collocated systems. The other key to unlocking the full potential of the IoT is data centre management software. Data centres have to mirror the automated systems and big data analytics services they enable. This requires the development and installation of a comprehensive data centre management system that can be automated to collect and monitor thousands of data points, in real time, from power efficiency to rack temperature. As long as there are manual links in the technological chain, the internet of everything cannot truly flourish. ### Dell MSP Conference Podcast - Mercedes Benz World - 10th Sept 2015 On 10th September 2015, Compare the Cloud hosted a live recording from the Dell Managed Service Provider Conference at Mercedes Benz World UK. Those interviewed included ProLabs, Pulsant, Storm Internet, VMWare, Nexenta, Cloud Industry Forum, Nutanix and Dell. Listen to the full episode: Alternatively a breakdown of all the excerpts can be found here:   ### Data and Information in the Cloud Cloud is now an essential business enabler, and the question has become not whether to use it, but by how much. So, IT leaders are faced with a conundrum: how can they maximise value from cloud computing, whilst maintaining data security? Virtualised machines and cloud-enabled business-critical applications allow data to be streamed in from millions of new internet-enabled devices escalating data growth in terms of both volume and complexity. [easy-tweet tweet="#IT leaders are faced with a conundrum: how can they maximise value from #cloud computing, whilst maintaining #data security?" via="no" usehashtags="no"] Despite the cloud’s presence, organisations are only recently starting to understand the level of cloud adoption that is appropriate for their business needs. 
The cloud has developed to a significant extent, with hybrid models becoming the favoured approach in terms of agility, while maintaining control of sensitive data on-premise. A Hybrid Approach to Cloud For all organisations, moving to the cloud – whether private, public or a combination – can be daunting. Greater business agility and low upfront investment are promised: however, if not handled systematically and driven by insights gleaned from data, cost and complexity can increase dramatically. A number of organisations are experiencing issues ranging from egress costs to wasteful utilisation, to complex and siloed management. Beginning with data-driven insights provides businesses with a better understanding of which workloads and applications are most appropriate for a public or private cloud, or on-premise hosting: in turn a successful hybrid model can be deployed.

Having direct access to an on-premise, private infrastructure means a decrease in the use of public Internet

Having direct access to an on-premise, private infrastructure means a decrease in the use of the public Internet, which effectively cuts down latency compared to public cloud services. With the hybrid cloud model, organisations are offered on-premise computational and storage infrastructure for processing data which requires extra speed or higher availability. Combine this with the benefits of the public cloud, where a workload may exceed the computational power of the private cloud component, and you have a compelling argument for hybrid cloud. Expanding the private component of a hybrid cloud also allows for flexibility in virtual server design: organisations can automate the entire virtual machine lifecycle, right through to archiving older VMs to the cloud. Another benefit of the hybrid model is the increased connectivity and collaboration offered to employees – which can often be a challenge in today’s digital world. The ability for teams to easily and securely share files should be coupled with the integration of remote workers into core business processes, such as internal messaging, scheduling, edge protection (laptops, tablets, etc.), business intelligence and analytics. [easy-tweet tweet="Although the benefits are clear for adopting a hybrid approach, it can still be difficult to know where to start" user="NigelTozer" usehashtags="no"] Although the benefits of adopting a hybrid approach are clear, it can still be difficult to know where to start. CIOs need to explore ways in which they can introduce a hybrid model that delivers deeply integrated cloud automation and orchestration tools, ensuring compatibility across cloud solutions and on-premise infrastructure. The hybrid environment is fast emerging as the norm for many CIOs. However, the key to a successful hybrid cloud deployment is firstly to understand which workloads and applications are most appropriate for a particular hosting. Secondly, it is important to leverage a single integrated console with an enterprise-wide view of data across these infrastructures. This will mean that IT leaders can control where to process data and maximise cost savings by identifying reasonable spend in relation to the value that data offers to the business. The Spending Shift – from Capex to Opex Cloud computing offered promises of cost savings, yet increasingly we are seeing headlines such as The Hidden Waste and Expense of Cloud Computing [1] or Cloud Computing's Wasteland [2]. So what is actually happening?
[caption id="attachment_31843" align="alignright" width="300"] The Issues Episode 3[/caption] Due to a lack of controls to help track and manage utilisation, businesses are being faced with unexpected costs; typically from an unusually large bill from their cloud provider after cloud instances are left running. In the traditional CapEx model, we invest heavily upfront in hardware and software. However, with the cloud subscription model, we can build a data centre with a credit card in a predictable Operational Expense (OpEx) model – which is wonderful in theory, but largely expensive. As organisations mainstream public cloud, holes are exposed within the maturity of their management processes and controls. As a result, developers have been deploying VMs at will and failing to take down workloads once they are completed. In order to tackle this growing concern, it is necessary that IT leaders ensure they have a data and information management strategy. This would enable them to capture a workload at the point of creation and attach data management services simultaneously. To support hybrid models, we need to be able to remain with the workload as it moves between on-premise to hosted private cloud, to hybrid and public clouds. Data is only useful when value can be gained from it Data is only useful when value can be gained from it, whether it is in the cloud or on-premise. Starting with backup and recovery, organisations are able to fast track into more advanced use cases such as DEV/test solutions and more. Here emerges the hybrid data analytics strategy: ‘analytics with purpose’ will be a guiding principle for progressing businesses. Regardless of whether it’s to introduce a Business Intelligence project, or move an advanced analytics strategy to the next level, organisations leveraging a hybrid cloud model will have opportunity: the chance to construct increasingly intelligent choices regarding structured and unstructured data in their environment. In addition to this, they will be able to rapidly mitigate the risk of compliance-related issues, regain valuable storage space and free up budgets in order to pursue vital opportunities in terms of business growth. ### 451 and 'Uberification' - the cloud as an agent for digital transformation Here at CTC HQ, we frequently discuss the impact that cloud is having in all areas. It has been a catalyst for the re-invention of computer service delivery. Implication: Cloud has marked a changing of the guard – with old tech behemoths being challenged by new cloud rivals. Implication: Cloud has been a disruptive influence on the channel, with many services being offered either direct or via marketplaces, meaning that VARs, distributors and other channel players are having to reinvent themselves and their value proposition. In many ways cloud has lead to the commoditisation of computing. Implication: The commoditisation is especially visible in the competitive price wars between the main public cloud players. Implication: It has also enabled the ‘as-a-Service’ economy where clients try a service to see how they can befit from it and quickly switch to rival offering if they don’t realise enough benefit, as they have little or no capital invested in it. [easy-tweet tweet="#Cloud has enabled the integration of disparate services into powerful solutions" user="billmew and @comparethecloud" usehashtags="no"] It has also been an enabler in the integration of disparate services into powerful solutions that would never have been possible before. 
- Implication: These possibilities are especially evident within the marketplaces that offer an incredible array of services that can be integrated into a cohesive package to meet a complex requirement.
- Implication: It has also led to a wave of big data solutions that apply analytics to a combination of internal and external data sources and applications.

It has enabled completely new business models that have challenged the status quo in a number of sectors.

- Implication: This digital revolution is particularly evident in the way that Uber and AirBNB have challenged incumbents in their sectors.
- Implication: Other sectors will be impacted in the same way as cloud enables challengers to launch other new business models.

Normally these issues provide us with more than enough to discuss for hours on end. Last week, however, we had an opportunity to chat to William Fellows from analyst firm 451. In his most recent report, 'Uberification' - the cloud as an agent for digital transformation, he is seeking to look beyond all of this to see if the transformational power of cloud that we have unleashed on every other sector may return to transform our own – more than it already has, and in ways that we might struggle to foresee. Defending Against a Stealth Attack [easy-tweet tweet="If you can’t tell who the sucker is at the table then it's probably you says @billmew" user="comparethecloud" hashtags="cloud"] The problem with all such truly disruptive innovation is that it’s almost always impossible to tell where, when or how it is going to hit you. I’m reminded here of the old poker adage that if you can’t tell who the sucker is at the table then it's probably you. The only real defence against disruptive innovation is to seek to be a disruptive influence yourself. Obviously coming up with the next great innovation isn’t easy – or we’d all be doing it. There are steps that you can take though:

- Only the paranoid survive: The first step is to avoid complacency and re-evaluate your value proposition. A simple SWOT analysis of each aspect of your business is an easy first step. [caption id="attachment_32076" align="alignright" width="300"] View September's #CloudInfluence Orgs[/caption]
- Partner creatively: Uber had the right backers, including Benchmark Capital in 2011 and Goldman Sachs, Menlo Ventures, and Bezos Expeditions later that year. Then there was Google Ventures in 2013 and Baidu in late 2014. Even if you’re not a startup there are value networks that you can join, and then there are the big System Integrators (SIs). Fellows sees the SIs, like Accenture and Cap Gemini, as having the consolidated capabilities – including experience, processes and knowledge – to act as the perfect partners for value networks where the smaller players provide point pieces to complete the jigsaw.
- Copy and improve: Usually if you spot a great innovation in its early life, it isn’t too late to compete. Google wasn’t the first search engine, it was just the best one around as the sector hit its main growth phase. It can be hard for incumbents (especially those with large vested interests in the old methods) to switch to the new ways of doing things, but it is possible – and many established firms create what is termed an innovation “sandbox” because it allows fairly complex, free-form exploration and even playful experimentation (the sand, with its flowing, shifting boundaries) within extremely fixed specified constraints (the walls, straight and rigid, that box in the sand).

Open to innovation?
[easy-tweet tweet="#cloud still isn’t for the faint hearted says William Fellows of @451research" user="comparethecloud" usehashtags="no"] Many in the OpenStack community hope that it can be the environment in which such innovative value networks can thrive. Fellows argues that so far Azure, Google and AWS have done the most to break new ground and that they are democritising access to compute capabilities, but they are not yet on the same user experience level as Uber. He maintains that cloud still isn’t for the faint hearted and that even the simplest marketplace still needs skills for integration and implementation. “We’ve not really seen our Uber moment yet – when cloud becomes accessible and usable for all.” Here at CTC we will be watching this market closely, seeking to spot the new players. We’ll be out at the Tokyo OpenStack Summit challenging the community to live up to its potential and to make the movement easier to access (for those among us that aren’t rocket scientists). “Two years ago you had to explain what OpenStack was. Now companies are interested in knowling more,” added Fellows. “OpenStack can play a disruptive role, but not as its currently constituted.” We tend to agree with Fellows. Value networks will have a very important role to play and large SIs are possibly the best candidates to lead these. We have also seen a resurgence in our latest ranking from some of the ‘old empires’. So before you get ‘Ubered’, start by being paranoid, looking for the right partners and looking out for embryonic ideas and innovations that you can adopt as your own. We’ll be looking too, and sharing our thoughts right here. To see the full 451 report click here. ### September Global #CloudInfluence Rankings The Empires Strike Back We’ve closely tracked the rivalry between Amazon and Microsoft for global leadership in the monthly #CloudInfluence rankings this year, and we saw Amazon take the lead over the summer. [easy-tweet tweet="September's #CloudInfluence ranking as determined by #bigdata #analytics is out now" user="comparethecloud and @billmew" usehashtags="no"] We wondered if this might be a significant shift in mindshare or whether it was simply a blip with Microsoft being temporarily distracted by the Windows 10 launch. We can now see that it was indeed just a blip, and while Amazon, along with Google, has fallen away, Microsoft and many of the other old empires (IBM, Oracle and even HP) are striking back! Any idea that Microsoft might have taken its eye off the ball and been distracted by the Windows 10 launch was dispelled this month with a flurry of activity and announcements regarding Microsoft Azure. The cloud platform not only got updates for big data, security and IoT, but also a new Azure Container Service, and competitive new pricing with the Azure Compute Pre-Purchase Plan. OneDrive for Business also got updates for security and data loss prevention. These were just some of the announcements coming out of its Azure Convention – a great overview is given here by Computer Business Review: 10 big things that happened to MS Azure this week Apple stood alone as the only consumer brand in the top of the rankings. It saw a surge of coverage of its cloud-based services – including its music streaming. IBM and Oracle in 3rd and 4th marked a resurgence of the old guard. Even HP that had previously struggled to make it into the top 50, rose all the way to 19th. All of these tech behemoths are investing heavily in cloud. 
IBM acquired both StrongLoop, a leading provider of enterprise Node.js capabilities, and Meteorix, a Workday services partner. Meanwhile Oracle launched two new additions to its cloud platform for integration: Oracle SOA Cloud Service and Oracle API Manager Cloud Service, a suite of tools that allow users to integrate on-premises and cloud-based applications. [caption id="attachment_32076" align="alignright" width="300"] Click to expand[/caption] This left Amazon in 5th, its lowest monthly ranking all year. It is all too likely, however, that having witnessed the ‘Empires Strike Back’, we will see the ‘Return of the Jedi’ as Amazon hosts its Amazon Web Services (AWS) re:Invent Conference on October 6-9 and the AWS faithful flock to Las Vegas. This will, of course, then be followed by the OpenStack Summit in Tokyo. It will be interesting to see the level of excitement that each tribe is able to generate. VMware in 6th will be a major participant in the Tokyo event alongside HP and many of the other main private cloud players. We will be reporting from the Tokyo OpenStack Summit and assessing the relative impact of the Azure Convention, AWS re:Invent Conference and OpenStack Summit. [easy-tweet tweet="September's Top 5 #CloudInfluence companies are @microsoft, apple, @ibm, @oracle and @amazon" via="comparethecloud" usehashtags="no"] [table id=52 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### Unshackling the power of footfall What is something that almost everyone does every day, without thinking? A simple motion that is the underlying form of movement for almost all human beings? Walking, putting one foot in front of the other. [easy-tweet tweet="Pavegen is the innovative energy solution turning your footsteps into power." user="rhian_wilkinson"] The average person takes over 216 million steps in their lifetime. There is huge energy potential in the simple action of a footstep, especially as footfall is constant within the urban environment. By designing a floor tile that harnesses something as simple as a step, Pavegen is activating access to a power source previously not considered viable. The average person takes over 216 million steps in their lifetime. Pavegen was conceptualised by Laurence Kemball-Cook during his time at Loughborough University. He undertook his industrial placement at one of the UK’s largest energy companies, where he was tasked with finding a sustainable energy solution for smart cities of the future. He realised wind and solar power would be ineffective in urban environments, due to high-rise infrastructure and pollution. Given that 75% of the global population is estimated to live in urban areas by 2050, Laurence worked to create an alternative means of renewable energy generation, specifically for this market. He recognised footfall as a potentially unharnessed energy source and came up with the concept of Pavegen. Pavegen aims to be an integral part of urban infrastructure and can help to develop the smart cities of the future.
It provides an interactive renewable electricity source to power applications such as street lighting, advertising displays and communications technology. Additionally, the tiles can collect, monitor and communicate real-time data through the use of their wireless API, calculating footsteps per tile for analytics and social media updates. [easy-tweet tweet=" Stepping into clean energy in the future is a reality thanks to @pavegen" user="rhian_wilkinson" usehashtags="no"] The footfall calculator allows for analysis of the most heavily used routes through the Pavegen system. For example, if a retail environment is activated with Pavegen, the business can see how people move around their stores by measuring footfall via the tiles. The engagement of customers with particular displays can be measured, and store layouts can be altered to maximise customer engagement based on activity hot spots. Additionally, the energy generated during busy periods of the day can be stored to power the store in periods where energy consumption is lower. My first thought upon hearing about Pavegen was that it has such huge potential for usage in events at scale, and the potential to begin reducing the energy impact of major events. At busy events the footfall data can be used to control crowds to prevent overcrowding and accidents, by sending out live updates on venue capacity, footfall and energy generation. Already in use at events as experiential installations, Pavegen is engaging event attendees to generate interactive power sources. Coinciding with the theme “Feeding the Planet, Energy for Life”, Pavegen promotes a sustainable energy resource accessible by all. [easy-tweet tweet="Can you imagine a dancefloor where the loudness of music is relational to the number of dancers? " user="rhian_wilkinson" usehashtags="no"] Showcasing the best of British innovation, Pavegen have an installation in the heart of the Expo Milano 2015. Working in partnership with Coca-Cola, the 30-tile dance floor works by controlling the music volume through the engagement of collective public footsteps: the more steps, the louder the music. Pavegen’s largest ever experiential installation was at the 2013 Paris Marathon. Working in partnership with leading French energy specialists Schneider Electric, they installed 176 tiles, including a 25 square metre span at the climax of the race on the Champs-Élysées, harvesting the energy of 40,000 runners. Pavegen has just completed its first crowdfunding round on Crowdcube, which it launched on the same day as the flagship installation in Canary Wharf. Successfully raising over £2 million via Crowdcube, Pavegen is working on ways to reduce the cost of manufacturing the tiles, whilst increasing the energy output each tile is capable of. Maybe your next step will be one that changes the world - with Pavegen, it really could be.

### Have you got a Skelton in your cloud closet? In July 2015, Andrew Skelton was sentenced to eight years for a data breach at supermarket group Morrisons. But what’s that got to do with running a cloud services business? Actually, it’s highly relevant when you consider who Skelton is. He was the company’s senior internal auditor, and how he stole and published sensitive employee data is a dramatic example of an insider hack by a trusted member of staff.
While cloud service providers like data centre and hosting companies have little to do with in-store bakeries, shopping trolleys with wonky wheels and the price of baked beans, they too could risk being blindsided by the threats posed by employees with privileged access rights. This can include senior administrators who, like Skelton, have legitimate access to sensitive data and systems. And, like Skelton, they could go rogue and cause financial and reputational damage on a huge scale. [easy-tweet tweet="We have tended to visualise the hacker as the outsider, but this isn't always the case" user="Courion and @comparethecloud" hashtags="security"] We have tended to visualise the hacker as the outsider. But serious data breaches like Morrisons’ are more likely to be the work of a disgruntled or criminal employee, and they highlight the importance of controlling the access of employees in any position who can reach sensitive data or systems. Thankfully, the internal threat is becoming better understood. According to the authoritative Verizon 2015 Data Breach Investigations Report, 55 percent of all insider breaches in the last 12 months were examples of privilege abuse. In other words, any employee account could be the subject of an outsider taking control for malicious motives. Of these cases, financial gain and convenience were reported as the primary motivators. [easy-tweet tweet="55% of all insider breaches in the last 12 months were examples of privilege abuse" user="courion" usehashtags="no" hashtags="cloudsecurity"] So what are the best strategies? While monitoring employee behaviours might seem an obvious place to start, doing so comprehensively would be both impractical and invasive. What’s more, with the vast amounts of complex access privileges assigned to a large number of employees, the problem is a technical one. It’s also likely that an insider hacker will be as sophisticated and capable as an external one, if not more so. Indeed a senior administrator within a cloud business will have access to more techniques and opportunities to hide their exploit. They may be able to operate within the business using multiple accounts under different identities. Some might possess access privileges from previous roles that are no longer appropriate, or have conflicting permissions, and should have been terminated long ago. Whether their staff are a risk or not, cloud businesses should be determined to get on top of identity and access management. Indeed, a prime strategy should be to undertake a regular and deep audit and clean-up of how access privileges are being assigned, with ongoing management and control through identity governance and management. This vital exercise can reveal some nasty surprises. For example, my company did an analysis of one global business and discovered 1,000+ abandoned contractor accounts, 100+ terminated employee accounts that needed to be de-provisioned, 14,000 inactive user groups and 25 or so users with access in excess of their role. And this was a business that had otherwise very robust data security and a large IT function.

Doing a thorough houseclean of access privilege is an extremely sensible first step

For businesses that might rely on temporary or contractor workers, a similar hidden set of risks may be lurking even behind an otherwise well run IT operation. Therefore, doing a thorough houseclean of access privilege is an extremely sensible first step.
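As a small illustration of what that first houseclean can involve, the sketch below scans a hypothetical CSV export of user accounts for two of the risks mentioned above: leavers who were never de-provisioned and contractor accounts that have gone stale. The file layout, field names and 90-day threshold are assumptions made for the example, not any identity management product's real export format.

```python
# Minimal sketch of an access-privilege "houseclean": scan an exported account
# list for leavers that were never de-provisioned and contractor accounts that
# have gone stale. The CSV layout and field names are illustrative assumptions.
import csv
from datetime import datetime, timedelta

STALE_AFTER = timedelta(days=90)  # assumed cut-off for a "stale" login

def audit_accounts(export_path: str, today: datetime) -> list:
    findings = []
    with open(export_path, newline="") as handle:
        for row in csv.DictReader(handle):
            last_login = datetime.fromisoformat(row["last_login"])
            if row["status"] == "terminated":
                findings.append((row["username"], "leaver account not de-provisioned"))
            elif row["type"] == "contractor" and today - last_login > STALE_AFTER:
                findings.append((row["username"], "stale contractor account"))
    return findings

if __name__ == "__main__":
    for username, issue in audit_accounts("accounts_export.csv", datetime.now()):
        print(f"{username}: {issue}")
```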
But this high standard needs to be sustained by choosing processes and systems that significantly reduce the risks by making access management and governance much easier to enforce and carry out. Complementing other HR and technology strategies, like perimeter protection and encryption, the chief information security officer (CISO) should have access to the very best intelligence about who has access to what, and a clear view of the anomalous behaviours that could be the precursor to, or immediate evidence of, an insider hack. Users tend to leave footprints wherever they go on the network, and their activities can be collected and scrutinised using predictive analytics. New intelligent identity and access management tools are able to sift through huge volumes of user activity and pinpoint and analyse the greatest access risks in real time. This enables businesses to quickly identify misuse of access privileges and take appropriate actions to mitigate the potential damage for their organisation before the insider hack occurs. With the use of real-time access insights, organisations will be able to detect not only existing security vulnerabilities but also potential risk areas, and identify the actual causes of these risks. For example, hidden Active Directory group nesting is a leading cause of inappropriate access that is usually under the radar of native access management. This new visibility of access privileges will result in improved control over how sensitive data is being used and shared by employees, and a better understanding of access risk. [easy-tweet tweet="With the use of real-time access insights, organisations will be able to detect existing security vulnerabilities " via="no" usehashtags="no"] Ultimately, the best practice for protecting your organisation against privileged access misuse may come down to a much more holistic approach: one that blends technology with the skills of an organisation’s human resources leadership in overseeing and controlling processes for new joiners, leavers, internal movements of staff and changes in roles and responsibilities. With the next generation of access intelligence solutions now available, enterprises can weigh the risks to vital assets such as intellectual property and customer information and address them instantly.

### European Court of Justice passes landmark Data Protection ruling On October 1st, 2015 the European Court of Justice passed a ruling that will change data protection for companies operating across multiple EU member states - particularly for those who are consumer-facing. In simple terms the ruling means that if a company is operating within the borders of a country, and targeting the residents of that country for business, it is subject to that country's data protection rules - regardless of the fact that the operating company may be based in another location. The ruling was decided in the case of Weltimmo s.r.o v Nemzeti Adatvédelmi és Információszabadság Hatóság (Weltimmo s.r.o v National Authority for Data Protection and Freedom of Information). Weltimmo is a Slovakian company that runs a property website dealing with Hungarian properties. In short, Weltimmo screwed up - and it's going to cause everyone in the EU a bit of a headache. Weltimmo's site is provided in Hungarian, and targets Hungarian citizens. They were offering a free month of advertising to new members - lots of Hungarians signed up, had their free month, then asked for their ads to be removed, and their data deleted.
Weltimmo did not comply with their requests, and began billing the advertisers. When they did not pay, Weltimmo passed their details to a Hungarian collection agency. An appeal to the Hungarian Data Protection Office led to a fine of HUF 10 million (approximately €32,000) for Weltimmo, for infringing Hungarian data protection laws. It is from this point that things started to get interesting. After an appeal, the case went to the European Court of Justice (ECJ) for a final ruling as to whether Hungary could apply its data protection laws to a company registered and operating in another EU state. [easy-tweet tweet="The ECJ ruled companies are subject to the data laws of the country they operate in" user="rhian_wilkinson" hashtags="datalaw"] The ECJ ruled that Weltimmo was subject to Hungarian data protection laws: "By today’s judgment, the Court recalls that, according to the directive, each Member State must apply the provisions it adopted pursuant to the directive where the data processing is carried out in the context of the activities conducted on its territory by an establishment of the controller." For more clarification: "The Court states that each supervisory authority established by a Member State must ensure compliance, within the territory of that State, with the provisions adopted by all Member States pursuant to the directive. Consequently, each supervisory authority is to hear claims lodged by any person concerning the protection of his rights and freedoms in regard to the processing of personal data, even if the law applicable to that processing is the law of another Member State. However, in the event of the application of the law of another Member State, the powers of intervention of the supervisory authority must be exercised in compliance, inter alia, with the territorial sovereignty of the other Member States, with the result that a national authority cannot impose penalties outside the territory of its own State." [easy-tweet tweet="The ECJ ruling on data protection has massive ramifications in the global age of information" user="rhian_wilkinson"] This has massive ramifications in the global age of information - companies such as Facebook have come under fire in the past for blurring the lines on personal data laws. In the Facebook case, Germany was challenging Facebook to allow profiles to be made under pseudonyms. German Data Protection law provides that a web service provider shall offer anonymous use of web services where this is technically possible and reasonable - Facebook does not allow anonymous use - and has been known to lock people out of their profiles if they are found to be using a fake name. Facebook has no branch or subsidiary in Germany - they only legally operate places of business in the US and Ireland. Before this ruling, it could have been argued that any processing of personal data is subject to the Data Protection laws of Ireland. But now everything has changed.[quote_box_center] Ashley Winton, UK head of data protection and privacy at international law firm Paul Hastings, has released the following statement in regard to the ruling. "Previously, European laws allowed multinational businesses with operations in Europe to be only subject to the data protection laws of one European country. This was to the benefit of many companies, some of whom elected to create an establishment in the UK or Ireland, where data protection laws and practices are more liberal and arguably more business friendly.
Following the case of Weltimmo, companies that have websites translated into another language, targeting consumers of member states outside of their own establishment, may now have to comply with the regulations in each individual member state. This dramatically increases compliance costs, particularly where a website is targeted at multiple member states, and makes the company subject to multiple data protection authorities. We expect that this case will be welcomed by data protection authorities, and as a result, social media and e-commerce multinationals will need to urgently consider their European data protection compliance strategies. With the appetite for enforcement high across a number of member states, the repercussions for non-compliance could be huge."[/quote_box_center] Now, we just wait and see what the fallout of this ruling will be. Popcorn, anyone? ### What approaches can you take to data and the cloud? Business intelligence (BI) has been around for years in various guises. It was born from the idea of simple, centrally managed reports from a company's key applications, with all requests for data going to the firm's corporate IT team. The central team would manage these requests individually and send out reports to each employee. [easy-tweet tweet="The current BI model is not sustainable in today's global business market says Southard Jones" user="comparethecloud" usehashtags="no"] While this model worked for critical applications and when data was needed only occasionally, it is not sustainable in today's global business environment. In the effort to maintain synchronisation across the different environments, administrators have to manage constant data loads and metadata updates. The constant tending needed by big, traditional BI implementations results in restricted access to data and long wait times for the business teams. Worse still, it is a complete barrier to end-user self-service and to users understanding the data directly. Over time, this expensive and time-consuming effort results in a long queue of people in every department, waiting to get reports. Resolving this is both a process and a technology issue. Business users don't want to wait for data that will help them become more efficient at doing their jobs. This leads them to turn to their own tools for data discovery and visualisation, resulting in multiple islands of data and documents within the business. In large enterprises, the problem is even worse: thousands of users are doing it their way and coming up with their own views. They can't be blamed for this. After all, they see the value of information for their daily lives. However, while it might benefit them individually, this approach is not sustainable for the business. Any lack of proper governance around self-service analytics can increase reporting errors and leave companies exposed to inconsistent information. According to analyst firm Gartner, only 10 per cent of self-service business intelligence initiatives will be sufficiently well-governed to prevent inconsistencies that adversely affect the business over the next couple of years. no one is sharing data – which stifles creativity. This is due to the fact that individuals are working in silos – and no one is sharing data – which stifles creativity. On top of this, each person will be producing different answers to the same question. 
For business leaders responsible for decisions that can be worth millions of dollars or pounds to their companies, this represents an unacceptable state of affairs. Getting the best of both worlds with cloud BI There is a clear business requirement for greater agility and access to data. However, this need can't compromise the overall governance, consistency and trust in the data. Solving this calls for a more flexible model for delivering data and analytics results, while at the same time maintaining trusted and agile collaboration between centralised and decentralised teams. Cloud BI can help marry up the benefits of central governance with local flexibility. With the ability to cut costs and increase efficiency all the way through the business, a cloud architecture is the key to meeting ever-increasing data and analytics requirements. For example, using cloud BI gives companies the opportunity to create virtual instances of data that link up different physical data sources into one place. These virtualised BI instances enable firms to extend analytical capabilities across multiple territories, departments and customers at a much faster pace. Think of it as networked analytics. This wide-reaching data access across the globe can support both local and aggregate views. This is important, as not every part of a business will run its operations or report on its activities in the same way, yet the central team can still consolidate the financial results in a uniform way. Ultimately, both the central team and the local users are working from the same data, but also working with that data in the ways that best suit them. Cloud-based data discovery or visualisation technology lets individuals run analysis and add their own data, all the while still using a centralised tool. Better still, enterprises that move to a completely "networked" environment, using virtualisation to redefine the way BI is delivered and data consumed, can provide more insight back to the business. [easy-tweet tweet="Using #cloud, central IT teams can create a network of interwoven BI instances that share a common analytical fabric. " via="no" usehashtags="no"] Using cloud, central IT teams can create a network of interwoven BI instances that share a common analytical fabric. This approach enables organisations to expand BI across multiple regions, departments and customers in a more agile manner. This analytic fabric is governed and controlled, but still allows users to be consistent and productive in how they work with data. This represents transparent governance for the central team without the overhead that slows down end-users. When using a networked model, users can compare data and see what each other is doing. For example, a marketing manager may be doing one piece of analysis around results in their territory. The results, and the approach to getting them, can be shared across the business, so that other marketers can make use of the same approach. Sharing these analytics 'recipes' helps improve performance across the business. Multi-tenant cloud One important requirement for this approach to work is that the BI platform has to be truly "multi-tenant". In most cloud deployments, multi-tenant means that multiple customers can be hosted on the same physical infrastructure. However, cloud BI has to consider this in more detail, as it is not enough to simply host multiple silos of data on the same cloud instance and hope for the best. 
Instead, those instances have to be able to interact and share data between them, but without the end user seeing all the complexity that is going on behind the scenes. Here, multi-tenant means networking virtual logical instances and applications together, so that the end user gets to see their results in context. However, any changes or additions each user makes cannot affect the central data. To be clear, this should not be based on data replication into new environments, as this simply creates new silos; instead it is a logical instance that is virtually replicated, changed and adapted for each individual or group around the organisation. This is completely different from traditional BI or discovery tools, which physically replicate data. One company already taking advantage of this approach is consumer packaged goods giant Reckitt Benckiser (RB). The company processes huge amounts of information from both its external data sources and internal applications to empower local sales organisations to sell better. Using networked BI, RB can do analytics across the globe and meet the needs of different teams. However, instead of having a big BI team responsible for creating and providing reports in each locale, all analytics are developed and run in the specific region, while central BI governs key data and maintains consistency for corporate metrics. This means that each region uses its own local data and virtual logical instances from corporate BI, giving it the agility to run its business operations its way, while central IT can report on trusted data around overall global performance. The team at RB can roll out this entire set of analytics to thousands of users in 22 different global locales in 24 weeks. businesses can empower users to create more insight. As this example shows, firms can dramatically improve efficiency by moving their BI to the cloud. By doing this, businesses can empower users to create more insight, as well as cutting costs. ### IBM scores 1 Billion Dollar Cloud Contract! IBM has secured yet another massive cloud contract in Europe that sets it apart from other players in the industry! Cloud is booming, and never more so than in the Nordics. It may be chilly out there, but it seems that the cloud business for IBM isn't. [easy-tweet tweet="IBM and EVRY partner for cloud services over a 10 year contract." user="Comparethecloud and @IBM" hashtags="IBM, EVRY"] IBM announced today that the leading IT services company EVRY has chosen it as the premier partner for cloud services under a 10-year contract. This is a big win for the IBM SoftLayer cloud platform, working off-plan until the data centre in Fet/Oslo comes online later next year. It seems that SoftLayer cloud services provisioning is really gaining momentum throughout Europe, with additional announcements being made monthly - Shop Direct, Avira, KPN, Sogeti and Finnair to name a few. So why is this constant stream of big names/brands moving to Big Blue? Well, I took some time out today to sum up a few key points. Presence – IBM recently announced the opening of its sixth data centre in Europe while adding language support for pretty much every spoken dialect! Connectivity – The low-latency connections throughout EMEA and the interconnectivity are a massive plus that cannot be overlooked, with access times of less than 30 milliseconds throughout Europe! 
Services – It's not only the traditional IaaS offerings that can be utilised: how about Bluemix, the DevOps toolset with services and APIs to thousands of additional products and services? There are hybrid capabilities that can burst out to pretty much anywhere and anyone (gone are the days where we work in silos and don't co-mingle with other providers), Watson - the daddy of the Big Data analytical world, now accessible to anyone who wishes to find/structure anything - and also the massive portfolio of products that keep getting added daily. So, I know what you are thinking – this note is a bit pro-IBM, hey? Well, yes, I suppose it is, but with all of the above considered, do you think I have over-inflated the message? The future isn't sunny, it's cloudy with a hint of differentiation. The future isn't sunny, it's cloudy with a hint of differentiation, and IBM seem to be galloping at a fierce pace to win this race. Keep it up, IBM, and tell us more! Read the full press release below. [quote_box_center] IBM and leading Nordic IT services company EVRY today signed and announced a 1 billion USD long-term partnership in which IBM was selected as EVRY's premier provider of cloud infrastructure services. As part of the agreement, IBM will transform EVRY's existing infrastructure services by using IBM's proven methodology and global expertise, and giving the company access to IBM's global cloud resources and capabilities. This includes providing services running on IBM's cloud infrastructure services, SoftLayer, based in the Fet/Oslo data centre later next year. By running these services on IBM Cloud, EVRY's customers, across a wide range of industries including banking and finance, government, energy, healthcare and retail, will get access to flexible and scalable hybrid cloud infrastructure. EVRY will continue to lead the development of value-added solutions and services and combine its strong local knowledge with use of innovative cloud technology and global scale from IBM. EVRY's customers will benefit from a faster time-to-market through leading-edge infrastructure solutions. EVRY will maintain responsibility for managing its relationships and delivering services to its customers. "A leading infrastructure business is core to EVRY becoming a Nordic Champion. It is the foundation from which we build solutions that create business value and business outcomes for our customers. EVRY has started this transformation journey, but in order to deliver the best infrastructure solutions in the market, we need to accelerate the ongoing transformation of our infrastructure business", comments Björn Ivroth, CEO of EVRY. Infrastructure services are the backbone of the systems that support businesses, and the foundation for new and innovative digital solutions that drive businesses forward. End-users expect to be able to access services 24/7 across a wide range of channels and devices. Because of this, it is critically important that businesses prepare their infrastructure for the new digital age. EVRY's ambition is to support customers with more technological innovation and at the same time to reduce complexity and increase the use of industry standard components in customers' infrastructure, since this will allow EVRY's customers to have a more competitive and agile approach to changing market conditions. "We have selected IBM as a global service provider and as a service delivery model for our basic infrastructure business. 
Customers will benefit from a faster time-to-market for leading-edge infrastructure, including new cloud based solutions. This strategic move allows EVRY to focus on being a customer-centric organisation with focus on value-added services and solutions built on leading technology", says CEO Björn Ivroth of EVRY. "Our partnership demonstrates how IBM's expertise, technology and services can help EVRY adapt to new market conditions and opportunities while having trusted infrastructure services supporting the ongoing operations," said Martin Jetter, senior vice president, IBM Global Technology Services. "The Nordic region has always been at the forefront of adopting new technologies early and we are excited to work with EVRY as they accelerate the enablement of their clients to lead in the digital era. IBM's unmatched IT Infrastructure and cloud capabilities provide a perfect foundation for EVRY to create and sell advanced cloud-based solutions for their customers across the full range of customers and industries they serve." The 10-year agreement will be subject to approval from Norwegian Competition Authorities. [/quote_box_center] ### Driving an IT department transformation The cloud is now well and truly ingrained into the everyday life of the enterprise. Security fears, concerns over control and service level agreements are gradually becoming a thing of the past, enabling CIOs to focus on leveraging the technology. In fact, a study of 1,656 CIOs by IBM has found that two-thirds of them are now exploring better ways to collaborate via the cloud and social networking tools, in order to better develop and deliver on their customers' requirements. As cloud computing moves responsibility for IT maintenance, support and updates into the hands of the cloud supplier, much of the 'traditional' work and budget of the IT department has shifted. Now that "keeping the lights on" is no longer their primary concern, IT leaders and departments can shift their focus elsewhere. The consumerisation of IT, and user frustrations with complex legacy systems driving workers to buy tools and applications that "just work", may also make IT professionals feel as though they are being edged out of yet more mission-critical technology decisions. [easy-tweet tweet="The cloud actually presents IT leaders with the ideal opportunity to move away from just being seen as the on-call helpdesk" via="no" usehashtags="no"] However, the cloud actually presents IT leaders with the ideal opportunity to move away from just being seen as the on-call helpdesk and soundboard for businesses' technology woes. They can now position themselves as strategists and innovators. For years, if the email system went down, someone needed access to the latest software, or a laptop or mobile device needed to reach the VPN, the call went to IT. With the cloud, a lot of the tactical helpdesk support issues are no longer the IT department's headache. This creates the opportunity for IT to evolve from its role as a technical support function. Freed from the shackles of mundane tasks, CIOs now have the opportunity to step back and think more like a CEO. By focusing on the business, CIOs can start to examine where they can drive revenue and reduce costs. How can current technologies be optimised to benefit the organisation? What technologies can be deployed to help the business meet its objectives and drive growth? What technology is going to give the business a competitive advantage? What products can the company build to boost sales? 
These are the kind of questions that IT leaders can begin to ask - and answer - in order to truly add value to the company's bottom line. Big data is another area where IT leaders have the opportunity to shine. Big data is another area where IT leaders have the opportunity to shine. Organisations are generating such vast quantities of data that they are unable to manage it effectively, requiring additional tools to capture, manage, analyse and process information. According to the IDC "Digital Universe Study," sponsored by EMC, the digital universe is growing 40% a year into the next decade, and by 2020 it will reach 44 zettabytes, or 44 trillion gigabytes. This wealth of data impacts an entire organisation, and CIOs now have the opportunity – and the power – to unleash a new wave of opportunities for businesses and people around the world. [easy-tweet tweet="The Digital Universe is growing 40% a year into the next decade" user="huddle and @comparethecloud" usehashtags="no"] But where should CIOs turn their attention first? There are many options, but 'collaboration' is the critical component of contemporary, effective global working practice – both within and beyond organisational boundaries. Forrester estimates that 86% of knowledge workers regularly collaborate with external partners. Huddle's own research with Dods suggests that this number rises to 95% in the UK public sector. This leads to enterprises drowning in content and employees feeling overwhelmed and confused. IT departments need to use their new role in the organisation to deploy collaboration technologies that can dramatically improve the efficiency of the workforce. With a multitude of devices now penetrating the workplace, CIOs also have the opportunity to establish effective mobile device strategies. It's now not only graduates and the younger members of the workforce that are bringing their smartphones and tablets into the office environment - CEOs and the senior management team are more than likely to pull out their tablet of choice in a meeting. Employees nowadays want to be able to access information at all times. So it is important that IT departments understand the apps that teams are using on their devices, and not just the devices themselves. The cloud has transformed the role of IT departments. It has opened the opportunity to elevate a technical support function to something that can add serious value to the bottom line of organisations and help them navigate a changing, turbulent future. CIOs just need to embrace the opportunity and select the right cloud platforms and capabilities that will support their vision for their organisation's future. ### It's Business Time: Okta Unveils Inaugural "Businesses at Work" Report [quote_box_center]An In-Depth Look at How We Get Work Done – Exploring Application, Mobile and Security Preferences Across Businesses, People and Geographies[/quote_box_center] Okta, the leading identity and mobility management company, today announced the findings of its inaugural Businesses at Work report. Based on data compiled from Okta's network of 4,000 pre-integrated applications and millions of daily authentications and verifications worldwide, the 35-page Businesses at Work report details how we get work done today and the preferences of IT leaders, employees and developers. "We're seeing companies of all sizes, industries and regions depend on cloud and mobile to propel their businesses forward," said Todd McKinnon, CEO of Okta. 
“Business leaders are making big investments in cloud and mobile, and Okta's dataset offers some insightful trends into the apps, services and security measures businesses are choosing to leverage." Key findings from this report include: Size doesn't (really) matter in the cloud: Company size is no longer a strong predictor of how many cloud or mobile apps a company licenses – with medians falling between 11 and 16 off-the-shelf cloud apps. The same goes for businesses using MFA to protect sensitive data – there has been a 40 percent increase year over year in companies protecting their sensitive data with multi-factor authentication for at least one app. Popular enterprise apps can be easily ousted: Certain enterprise apps maintain early leadership positions – e.g. Salesforce.com in CRM, AWS in infrastructure and Box in content storage – but others have lost ground to competition, including Google Apps, which now trails Microsoft Office 365 in almost every category. Media darling Slack has also come on strong, increasing usage 50 percent in Q2 2015. If software is eating the world, businesses are hungry: Businesses are making aggressive efforts to enable their partners, customers and contractors through new cloud-based applications, websites or portals. The number of external identities in Okta grew 284 percent from July 2014 to July 2015, while internal identities grew 192 percent in the same timeframe. UK businesses won't let go of the security question as a form of verification: While the global trend is for companies to move away from questions on birthplaces and bloodlines as verification methods – having dropped 14 percent worldwide since April 2014 – usage of traditional security questions in the UK actually increased 17 percent in the same timeframe. However, the UK is aligned with the rest of businesses worldwide in its increasing preference for SMS authentication (with usage growing 8 percent globally and 6 percent in the UK since April 2014). SAML is finally the security standard it set out to be: With companies putting a premium on security, developers are increasingly creating apps with the highly secure authentication mechanism Security Assertion Markup Language (SAML) baked in from the start. Nineteen percent of applications entered in the Okta Application Network today are SAML-enabled, a six-fold increase over the past two years. To view a full copy of the 2015 Businesses @ Work report, please visit: okta.com/Businesses-At-Work/2015-08. ### Data In The Cloud — What's Safe, What's Risky? With cloud vendors now running the "race to zero" in an attempt to outdo one another on the buy-in price of service, it's no wonder that companies are storing more data in the cloud. In many cases, running a local data centre simply isn't cost-effective, let alone preferable. Not all data belongs in the cloud. Yet the rush to offload local data means some information is ending up in the cloud that should never see the light of day; here's a quick rundown of what's safe and what's scary when it comes to cloud data storage. [easy-tweet tweet="Safe in the cloud: Collaborative Documents and In-House Apps" user="XOcomm and @comparethecloud" hashtags="cloud"] Safe: Collaborative Documents The big benefit of cloud services? Employees can easily collaborate anytime, anywhere, using the device of their choice. As a result, vendors have invested time and effort to make basic file sharing easy, fast and reliable. 
Instead of passing around presentations or spreadsheets using email or hard copy, employees can simply access the cloud service and make the necessary changes. What's more, IT admins can configure cloud apps to track changes, store all file versions and record which user made a specific change in the event of an audit or ownership issue. Safe: In-House Apps Many applications are now designed as cloud-native rather than PC or browser code repurposed to work in public or private stacks. As a result, it only makes sense to leverage public-facing applications such as CRM tools or social media management tools. Trying to manage the sheer volume of necessary apps in-house quickly eats available IT time, leaving no room for innovation or effective collaboration with other departments. [easy-tweet tweet="Risky in the cloud: Trade Secrets" user="XOcomm and @comparethecloud" hashtags="cloud"] Risky: Trade Secrets Have a unique marketing plan or product development strategy that forms the core of your business success? Don't store it in the cloud. Doing so runs several risks: first is the possibility of compromise by a malicious third party or former employee, which effectively negates any competitive advantage your business enjoys. Second is the risk of damage or loss; these files are often impossible to replace. [easy-tweet tweet="Scary in the cloud: Personnel Data and Tax Information" user="XOcomm and @comparethecloud" hashtags="cloud"] Scary: Personnel Data As noted by Next Advisor, it's never a good idea to store personally identifiable information (PII) in the cloud. This could include employee records, health care records in the case of medical firms, or litigation files for lawyers' offices. Along with the possibility of compromise or theft, there's the risk of legal challenge if user data is stored without explicit consent. There's also the problem of audits: if federal agencies or industry compliance organisations ask for a detailed accounting of personal data in the cloud, it may not be possible to provide. The result? Anything from warnings to sanctions or fines. Scary: Tax Information Never store tax data in the cloud. According to DZone, using the cloud to store tax information makes companies tempting targets for malicious actors. Plus, despite the resources at its command, the IRS may not be willing — or able — to help companies recover lost tax info. Your best bet is keeping tax information securely stored (and encrypted) on local servers where it can be monitored and regularly audited. Not all data belongs in the cloud. Collaborative documents and public-facing apps make sense; trade secrets, PII and tax information are better kept under virtual lock and key. ### New trading hubs a 'game changer' for emerging markets It's not every day a company gets to truly transform a market. Yet, with our newly announced partnership, Digital Realty is poised to do just that. In partnership with trading technology specialist GMEX Technologies Limited (GMEX Tech), we have plans to establish a network of emerging market trading hubs around the globe. The hubs will link far-flung trading exchanges to investors, providing real-time access to new sources of liquidity. The impact on developing markets is potentially enormous. [easy-tweet tweet="By improving access to liquidity, the hubs will help overcome one of the biggest challenges facing emerging markets. 
" user="digitalEMEA" usehashtags="no"] By improving access to liquidity, the hubs (the first of which is planned to be established in our London Chessington data centre) will help overcome one of the biggest challenges facing emerging markets. Many have assets that investors would love to trade, but suffer from the difficulty of access to those investors brought about by the market's often remote location. This is where Digital Realty is able to deliver significant value. Working with GMEX Tech, we will use our global network of data centres to help create cloud-enabled Points of Presence (PoPs). These PoPs will provide market participants with direct and secure access to the London trading hub. Other hubs are being planned for the United States and Asia. Participants will have the opportunity to place their own trading servers within a Digital Realty data centre. Participants will have the opportunity to place their own trading servers within a Digital Realty data centre, further solidifying their links to their chosen hub. Ideally, the hubs will become a space where emerging markets can come together with prospective investors and traditional exchanges in one central location. Participants in the new trading hubs may benefit in a range of ways. For established players, the hubs may create new trading opportunities and revenue streams. On the sell side, they may open up access to a huge pool of prospective new clients. Market operators may be able to extend beyond their own geographical boundaries and provide the ability for participants to access a broader range of asset classes, including securities, commodities and derivatives. Emerging trading markets should no longer be hostages to the tyranny of distance. The goal is that the hubs will enable emerging trading markets to operate as though they were physically located in one of the world's financial capitals. Compared to existing cross-listing relationships, the hubs will better allow truly equal partnerships to be created between participants. [easy-tweet tweet="Emerging trading markets should no longer be hostages to the tyranny of distance says @digitalEMEA" via="no" usehashtags="no"] Until now, many emerging market players have been hindered by the challenges associated with gaining access to core markets in financial centres. Many smaller markets simply do not have the technology or knowledge to establish the required linkages into large global exchanges. The planned hubs will diminish this barrier. The partnership between Digital Realty and GMEX Tech also means that the hubs will provide a secure, open and agile trading platform. Housed in purpose-built data centres and connected by fully managed and redundant network links, the hubs will be operational around the clock. The bottom line is that the hubs have been designed from the ground up with users in mind. They are 'participant' rather than 'exchange' focused. Our goal is that they will essentially become one-stop shops that provide users with access to multiple exchanges and sources of liquidity – a real game changer. I am very excited by this development and am looking forward to the hubs growing and generating new opportunities over the coming months and years. 
*For more information on the hub, please access the latest Financial Markets insight ### GE and PTC Form Broad Strategic Alliance [quote_box_center] GE and PTC Form Broad Strategic Alliance to Pursue Brilliant Factory Opportunity Companies will bring to market a solution as part of GE’s Brilliant Manufacturing suite that leverages PTC’s ThingWorx® Industrial Internet of Things application enablement environment in GE’s manufacturing software portfolio Companies will align global sales and marketing resources to jointly pursue digital manufacturing opportunities around the combined manufacturing solution Joint solution is accelerating GE’s Brilliant Factory initiative, is already live at a number of GE sites, and will be further deployed across GE factories worldwide PTC’s ThingWorx will become part of the Predix ecosystem [/quote_box_center] GE and PTC today announced that the two companies are partnering to deliver an innovative manufacturing solution that will be available within GE’s Brilliant Manufacturing Suite. This new GE-branded manufacturing solution leverages the capabilities of PTC’s ThingWorx Industrial Internet of Things application enablement environment. The result is an industry-hardened solution that features flexible dashboards and powerful data analytics integrated with GE’s software capabilities on the manufacturing plant floor. The joint solution connects disparate systems – from the shop floor to the ERP – and offers a dashboard and differentiated user experience with consumer-like drag-and-drop capabilities tailored to each user’s role. GE and PTC will align their respective global manufacturing sales and marketing teams to jointly pursue opportunities worldwide. As GE is one of the world’s largest manufacturing companies, the GE-PTC solution will be implemented within GE’s internal manufacturing plants as part of its Brilliant Factory initiative. “The Brilliant Factory integrates engineering and manufacturing design, leverages data to run our factories much more productively, and optimizes the entire supply chain,” said Jamie Miller, GE CIO. “This solution gives us the ability to see that data clearly and easily in a format that makes critical decisions possible, so we can increase machine uptime and predict maintenance before it is needed.” Leveraging PTC’s ThingWorx technology as part of GE’s Brilliant Manufacturing Suite provides robust manufacturing optimization capabilities coupled with: Role-based manufacturing dashboards with real-time manufacturing KPIs Standardized KPI models for all plants, connecting to heterogeneous system landscapes Collaboration, alerts, notifications and access to actionable data to make more informed decisions “GE’s world class manufacturing plants have provided a testing ground for this technology,” said Kate Johnson, Chief Commercial Officer for GE Digital. “A united GE and PTC team is now poised to deliver a modular and robust solution to manufacturers badly in need of optimization through digitization.” GE and PTC will also work together to certify ThingWorx for the Predix ecosystem. ”GE is a true pioneer in the use of technology to build advanced manufacturing facilities,” said Jim Heppelmann, president and CEO of PTC. “It is significant validation that such an industry leader has chosen to deploy our ThingWorx technology at scale. We look forward to collaborating with GE to deliver the promise of the Internet of Things to manufacturers around the globe.” For more information, please click here. 
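For readers wondering what a "real-time manufacturing KPI" boils down to in practice, here is a small illustrative sketch. The event format and figures are hypothetical and have nothing to do with GE's or PTC's actual data models; the point is simply that a plant-floor dashboard metric such as machine availability is ordinary arithmetic over a stream of status events.

```python
# Hypothetical shop-floor event stream: (machine_id, status, minutes in that status).
# Availability = running time / (running time + downtime), a common plant-floor KPI.
from collections import defaultdict

events = [
    ("press-01", "running", 420), ("press-01", "down", 35),
    ("press-02", "running", 455), ("press-02", "down", 5),
]

totals = defaultdict(lambda: {"running": 0, "down": 0})
for machine, status, minutes in events:
    totals[machine][status] += minutes

for machine, t in sorted(totals.items()):
    availability = t["running"] / (t["running"] + t["down"])
    print(f"{machine}: {availability:.1%} available")  # e.g. press-01: 92.3% available
```

The value of a platform like the one described above is not the arithmetic itself, but doing it continuously, across heterogeneous plant systems, and surfacing the result in a role-based dashboard.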
### Up on Cloud Nine: 5 Tips for Transitioning to the Cloud With the ever-changing dynamic of the modern workplace, making the move to the cloud can be transformative. With businesses outputting vast quantities of data, the demand for storage is soaring and can no longer be addressed with large data storage deposits hidden in the back of an IT department. In short: it's now a cloud and data-led world. moving to the cloud doesn't have to be an all-or-nothing solution. But moving to the cloud doesn't have to be an all-or-nothing solution. In the first instance, the transition should be driven by a real business problem. The best strategy is to identify the issue which needs to be solved at the outset and make a clear plan. [easy-tweet tweet="It's now a cloud and data-led world" user="comparethecloud" hashtags="cloud, data, bigdata "] A key challenge facing businesses is leveraging the opportunities data presents. Without insight, data is wasted. Organisations can speed up decision-making by empowering employees to use information more efficiently. Many fields - from marketing and sales to HR, product design and finance - can glean meaningful insights with the right software analytics solution. So how to navigate this challenging environment? Here are my five top tips to ensure your transition lands you on cloud nine: Welcome the opportunity for change Embracing the cloud offers vast and exciting prospects for businesses, giving them the perfect opportunity to pursue a new way of doing things. Rather than simply replicating existing solutions, businesses should leverage the flexible pricing, elasticity and instant provisioning delivered by the cloud. cloud data warehouses are just one example of how old problems can be successfully revisited. For instance, cloud data warehouses are just one example of how old problems can be successfully revisited. Systems like Amazon Redshift and Google BigQuery can be set up in minutes rather than weeks, and can be scaled to fit the size of your data. What's more, these systems are optimised for analytics, providing insight into big data by analysing devices, social media, or machine systems. Stay on top of technological developments Cloud technology is constantly evolving. As such, the best architecture at the point of transitioning may not be the best in a year, or even six months. By staying abreast of technological advances, businesses can keep track of inventive solutions and see how they can fit into workflows. Embrace flexibility Elasticity is a key advantage of cloud infrastructure, as it means an organisation's use can easily grow alongside the business. As there's no minimum requirement of involvement, a business can trial processes before scaling up or down as required, without having to commit to costs up front. What's more, in terms of cost structure, simple changes can have dramatic results. For example, storing files on Amazon Glacier rather than Amazon S3 can save your business two-thirds of your data storage costs. [easy-tweet tweet="Storing files on #AmazonGlacier rather than #AmazonS3 can save your business 2/3rds of your #data storage costs" via="no" usehashtags="no"] Love thy user as thyself One of the main challenges with transitioning to the cloud is that your users may end up being encumbered by endless login and password combinations to remember. Use a Single Sign-On (SSO) solution, such as OneLogin, which allows users to have one password for numerous applications. 
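To make the single sign-on idea concrete, here is a minimal sketch of what SSO looks like from an application's point of view. This is a generic illustration, not OneLogin's actual API: the identity provider signs a short-lived token once the user has logged in, and each application validates that signature instead of keeping its own password database. A shared-secret HS256 JWT is used purely so the example is self-contained; real providers typically issue RS256-signed SAML assertions or OIDC tokens.

```python
# Hypothetical sketch of SSO from the application's side, using the PyJWT library.
# Names (AUDIENCE, issue_token) are illustrative, not part of any vendor's API.
from datetime import datetime, timedelta, timezone

import jwt  # pip install pyjwt

SHARED_SECRET = "demo-secret"  # stand-in; a real IdP would publish a public key instead
AUDIENCE = "expenses-app"      # the application this token is intended for

def issue_token(user_email: str) -> str:
    """Plays the role of the identity provider: signs a short-lived token."""
    claims = {
        "sub": user_email,
        "aud": AUDIENCE,
        "exp": datetime.now(timezone.utc) + timedelta(minutes=30),
    }
    return jwt.encode(claims, SHARED_SECRET, algorithm="HS256")

def authenticate(token: str) -> str:
    """Plays the role of the application: trusts the signature, keeps no passwords."""
    claims = jwt.decode(token, SHARED_SECRET, algorithms=["HS256"], audience=AUDIENCE)
    return claims["sub"]

if __name__ == "__main__":
    token = issue_token("jane@example.com")
    print("Signed in as:", authenticate(token))  # expired or tampered tokens raise an error
```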
Enhancing user experience with subtle features like this will make adopting the cloud a far more enticing prospect, and significantly reduce user headaches! Data is not just for the IT guy Moving data and analytics to the cloud offers a range of benefits, from disaster recovery to self-service analytics for a mobile workforce. As organisations find more data to store and measure, the need for vast, scalable storage at a reasonable cost becomes paramount. Granting more people access to data analysis means better, data-driven decision-making. The results are clear in improved productivity, greater ROI and a more engaged workforce. [easy-tweet tweet="Granting more people access to data analysis means better, data driven decision making" via="no" hashtags="data"] Why give everyone access to data analytics? Because everyone is an expert in their field of the business; they know their data and will spot outliers, opportunities and issues faster than anyone else. Self-service data analytics through cloud-based services may be one of the lowest-cost revolutions you can bring to your organisation. ### Are You Leaving Your Business Open To Hackers? You may not even realise it, but the cyber door to your important documents and data could be wide open, allowing easy access to hackers. You wouldn't leave the door to your office open at the end of the day, effectively inviting burglars to enter your premises and fill their swag bag with your expensive equipment. So, why are businesses still allowing themselves to be open to hackers? Whatever the size of your company, hackers can pose a serious threat to your business. Whatever the size of your company, hackers can pose a serious threat to your business – which, alongside private and important information being leaked, could also be extremely costly. Even the biggest companies can fall victim to hackers. In May 2014, online marketplace eBay suffered a data breach in which 145 million customers' personal details were stolen. And they are far from alone! Most recently, controversial dating site Ashley Madison saw its users' records stolen, with hackers threatening to leak members' personal information. This wasn't only an embarrassing situation for the company, but for the users who are now facing their use of the infidelity site becoming public knowledge. [easy-tweet tweet="60% of businesses surveyed had suffered a data breach in the past year" user="comparethecloud" usehashtags="no"] Such situations won't only leave the company and those involved red-faced; they can also have a serious financial impact. A government report showed that 60% of businesses surveyed had suffered a data breach in the past year. The worst of these attacks were costing businesses up to £115,000. So how can you ensure you are protected from hackers? If you understand your company's weak spots, you can build a stronger defence. So, first it is important to understand how information can get into the hands of hackers. Although they are coming up with new and inventive ways all the time, there are a few common ways they can gain access: weak passwords; malware attacks – an infected website, USB drive or application delivering software that can capture keystrokes, passwords and data; and phishing e-mails – official-looking emails that prompt you to enter your password or click links to infected websites, or that simply pretend to be you in order to reset passwords. 
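To put the weak-password risk above into perspective, here is a rough back-of-the-envelope sketch. The guessing rate is an illustrative assumption, not a measurement; what matters is the gap between short and long passwords.

```python
# Rough comparison of brute-force search spaces for short vs long passwords.
# The guesses-per-second figure is an assumed offline cracking rate, purely illustrative.
GUESSES_PER_SECOND = 10_000_000_000  # 10 billion guesses per second

def years_to_exhaust(alphabet_size: int, length: int) -> float:
    """Worst-case time to try every combination, in years."""
    combinations = alphabet_size ** length
    seconds = combinations / GUESSES_PER_SECOND
    return seconds / (60 * 60 * 24 * 365)

print(f"8 lowercase letters:          {years_to_exhaust(26, 8):.6f} years")   # roughly 20 seconds
print(f"16 mixed chars (~80 symbols): {years_to_exhaust(80, 16):.3e} years")  # effectively forever
```

The same reasoning is behind the advice that follows: long, unique passwords are worth the inconvenience.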
There are several simple ways to protect your company against hackers, such as long, complicated passwords that differ for each account and avoiding automatically clicking through to external sites. [easy-tweet tweet="One of the biggest threats to security lies with the devices employees use" user="comparethecloud" hashtags="cloudsecurity"] However, one of the biggest threats to security lies with the devices employees use. These are the ideal places for malicious software to worm its way into your corporate network. The best form of defence against this is to set up a policy for device usage and install web-protection software. As important as it is to implement the above, these measures may not be entirely foolproof, especially when you rely on your entire workforce following procedure to protect the company. The most effective way to protect your documents (especially the most important and vital information) from hackers is by using secure data storage, which will keep your information both safe and accessible. It is a requirement that certain data is stored responsibly, and choosing the right storage provider is important. Knowing your sensitive data is secure means you can focus on other areas of the business, with the peace of mind that you won't face a costly, embarrassing, or difficult situation. ### MapR introduces industry's first in-Hadoop document database MapR Technologies, Inc., provider of the top-ranked distribution for Apache Hadoop that integrates web-scale enterprise storage and real-time database capabilities, today announced at Strata + Hadoop World the addition of native JSON support to MapR-DB, the top-ranked NoSQL database. The first in-Hadoop document database will allow developers to quickly deliver scalable applications that also leverage continuous analytics on real-time data. With these major enhancements, developers benefit from the advantages of a document database combined with the scale, reliability and integrated analytics of enterprise-grade Hadoop and Spark. These capabilities will enhance an organisation's ability to improve a range of use cases -- from personalising and delivering the best online shopping experience, to reducing risk and preventing fraud in real-time, to improving manufacturing efficiencies and lowering costs. To help developers experiment and experience MapR-DB, a developer preview with sample code is available for download. The MapR Distribution including Hadoop is uniquely architected to serve as a single platform for running analytics and operational applications. MapR-DB enables continuous analytics on real-time data, while reducing cluster sprawl, eliminating data silos, and lowering the TCO of data management. The native JSON support in MapR-DB also enables faster time-to-value by letting developers quickly stand up more business applications on more data types and sources. MapR-DB supports the Open JSON Application Interface (OJAI™), which is designed to be a general-purpose JSON access layer across databases, file systems, and message streams, enabling a flexible and unified interface to work with big data. "MapR continues to build on the innovative data platform at the core of its Hadoop distribution," said Nik Rouda, senior analyst, Enterprise Strategy Group. "The addition of a document database capability (JSON) neatly extends the powerful NoSQL MapR-DB to seamlessly cover more types of unstructured business data. 
This makes it faster and easier to build big data applications, without the burden of shuffling data around first.” “MapR enables enterprise developers to create new value-added applications which they have only dreamed about so far,” said Anil Gadre, senior vice president, product management, MapR Technologies. “By deeply integrating a NoSQL document database deep into our platform, developers can quickly build new applications which demand ultra-low latency and huge flexibility in supporting different types of data all in one cluster.”  With these new capabilities, developers can: Build and deploy scalable JSON-based applications – shorten application development cycles and avoid complex persistence methods while having the power to handle production workloads and global deployments Run continuous analytics – develop rich applications that leverage large scale, real-time analytics Gain faster time-to-value – quickly stand up more business applications on more data types and sources using familiar JSON patterns  “Cisco and MapR have a strong relationship working together and driving joint innovations. We see this trend continuing with the addition of native JSON capabilities into the MapR-DB in-Hadoop NoSQL database,” said Raghunath Nambiar, distinguished engineer and chief architect for big data solutions engineering for Cisco. “This new release with the OJAI interface enables organisations to capture and analyse complex data types generated by the Internet of Things and the new generation of big data applications. Combining MapR-DB with Cisco UCS® provides a new level of continuous, large-scale analytics and faster time-to value to our customers.” Software partner Visual Action has already begun integrating its Flaremap visualisation software with the new JSON capabilities in MapR-DB and Apache Spark. “We’re excited to be one of the first partners to work with the new document database capabilities in MapR-DB,” said Jim Bartoo, CEO, Visual Action.  “Complex and extremely large JSON documents, like those generated by oil and gas applications, need a fast, scalable and reliable platform. By combining MapR-DB with our front-end Flaremap visualisation software, our joint customers will be able to immediately analyse and chart huge volumes of JSON operational data and quickly make decisions that can positively impact operations.”  Availability A developer preview package of MapR-DB with native JSON support is available immediately and can be downloaded here. General availability of these new capabilities in MapR-DB will be next quarter. To see an interactive demo application of the new MapR-DB JSON capabilities with Visual Action’s visualisation software visit here. ### How the cloud can rain success on SMBs Embracing new technologies is essential in enabling small businesses to tap their full potential and remain competitive. Two years ago, Gartner predicted that 25 per cent of organisations would lose their market position by 2017 due to "digital business incompetence". Even though this may be a strong assertion, the bottom line is many small businesses don’t harness the full potential of such technologies. [easy-tweet tweet="Reluctance to digitally transform a business doesn’t always indicate technological unawareness." user="toshibaUK" usehashtags="no"] Reluctance to digitally transform a business doesn’t always indicate technological unawareness. 
Most SMBs are well aware of the potential of mobile technology, but a number of concerns remain – chief among them is concern around storing and accessing business data in the cloud. There are always decisions to be made before getting started: How does a business ensure its data is securely accessible? How is disruption avoided during the implementation process? Are the costs manageable? But SMBs shouldn't let any doubts keep them from reaping the benefits of mobile working, instead viewing technology like the cloud as the enabler it is. SMBs shouldn't let any doubts keep them from reaping the benefits of mobile working Cloud at the heart of mobility More so than their larger competitors, SMBs are in need of a competitive edge that levels the playing field for them. By honing their mobility strategy, SMBs can increase overall productivity and responsiveness, and improve customer experience and employee satisfaction. Cloud storage can be the centrepiece that allows mobile workers to stay connected and be productive no matter the time or place. It is crucial that businesses get this right at the beginning. By paying attention to a few important details, SMBs can make sure cloud technology becomes a value-adding part of their business. [easy-tweet tweet="Cloud storage can be the centrepiece which allows mobile workers to stay connected" user="toshibaUK" hashtags="cloudstorage"] There is no one-size-fits-all approach to implementing a cloud strategy. For SMBs, which often lack dedicated in-house IT resource, it can be appropriate to move only selected files onto the mobile platform. This way, disruption of business processes and security breaches are less likely to occur. Also, adopting a mobile strategy doesn't necessarily mean moving all documents into the cloud at once. Taking time allows businesses to ease into the new technology. It is only natural if mistakes occur more frequently while employees are getting used to different processes. A bite-size approach can be easier to digest, ensuring the right processes are in place before this approach is rolled out more broadly. Security and education: a strong IT framework a well-managed IT policy is essential to mitigating risk in the cloud One of the most common reasons for SMBs to abstain from the cloud is security concerns. Mobile working and the use of cloud technology can of course be a security threat if there is no strategy in place – in any business, the weakest link is often the employee, so a well-managed IT policy is essential to mitigating risk in the cloud. Companies need to restrict access where it isn't authorised, introducing mobile device management tools such as secure containers which enable devices to be remotely wiped if necessary. Equally important is making sure employees understand the risks associated with data theft. For example, if a paper file was marked 'Confidential' then it would almost certainly be looked after more carefully. Transitioning this mentality into the digital world can stop employees from using forbidden services or downloading highly sensitive files which can then easily be left in a taxi and accessed at the click of an unlocked smartphone or laptop button. [easy-tweet tweet="Equipping staff with the right hardware, built for business use, adds an extra layer of security" user="ToshibaUK and @comparethecloud" usehashtags="no"] Similarly, equipping staff with the right hardware, built for business use, adds an extra layer of security. 
The difference between business-grade and consumer devices is widening all the time, so it is important to implement a Choose Your Own Device strategy which provides flexibility within a controlled selection of secure laptops. Devices such as Toshiba's Satellite Pro R50-B include built-in protective measures such as Trusted Platform Module, which ensures sensitive business data stays safe on local storage. Being designed specifically for SMBs, these laptops are also lightweight and feature numerous connectivity options that make them ideal for mobile business and cloud solutions. Choosing the cloud to suit your business Different cloud services cater to different types of customers. As with hardware, there are broad differences between consumer and business-grade cloud storage – namely the features available, security standards and price. And as with devices, it's strongly advisable to go with business-targeted cloud solutions. Really, SMBs should completely eradicate consumer cloud services except for storage of publicly available and non-confidential data. When choosing a supplier, it is best to go for trusted names which not only offer value and security, but also the tools to work in a more connected and efficient way across the board. For added confidence, some vendors will offer a trial period of up to 90 days, allowing customers to test the service and experience the benefits first-hand. Toshiba Cloud Client Manager is one such tool, built to allow companies of any size easy and complete device management, with comprehensive tools to ensure efficiency and compliance. A reputable provider will offer multiple options to safeguard files. A reputable provider will also offer multiple options to safeguard files, from password-protected platform access to additional authorisation for sensitive files. SMBs may even find that their data is more secure in the cloud than it is on an on-site hard drive: data held with a reputable cloud provider is encrypted. Cloud backup also contributes greatly to peace of mind, providing a copy of all important documents in case anything should happen to jeopardise the safety of hard drive records. Granted, at the beginning, mobile transformation may require some investment. But while mobile devices are a one-off investment, cloud services do not require a big up-front capital expense. They offer an affordable, monthly price-point which tends to suit an SMB's cash cycle. What's more, cloud services are very easily scalable as the enterprise grows, allowing smaller organisations to get a process in place that suits their current needs and can adapt to meet their future requirements. Unlocking the productivity potential of cloud apps SMBs need only look at the great variety of business-specific cloud applications to gain an idea of the possibilities to step up productivity. From sharing and editing documents online to organising and tracking staff's time, cloud apps help avoid lengthy processes by making information accessible to everyone at any given time, regardless of their location. The abundance of apps also means there is a tool to boost productivity across all areas of the business – be it anything from sales to customer service or even HR. In addition, improved connectivity increases productivity by encouraging flexible working schemes such as working from home. For collaborative work, cloud apps offer a streamlined and less error-prone process. 
And one more thing to consider: by using cloud applications, SMBs can effectively outsource their IT to the cloud provider, who takes care of the updates and maintenance. [easy-tweet tweet="Improved #connectivity increases #productivity by encouraging flexible working" user="comparethecloud" usehashtags="no"] Instead of being a source of concern, cloud computing eases the burden of security, disruption and cost. The risks associated with IT transformation are easily manageable if a business analyses its security needs, partners with a trusted cloud supplier, and adopts cloud storage at its own pace. Within a well-executed mobility strategy, cloud technology can become an enabler for SMBs to harness the full potential of mobile working. The future looks bright for SMBs who understand the opportunity of investing in a powerful cloud service. In today's ever-more mobile world, it is crucial they seize it. cloud computing eases the burden of security, disruption and cost ### How In-Memory Architecture Benefits Cloud-Scale Performance Cloud services, including SaaS applications, are designed to be flexible, on-demand, self-service, scalable and elastic, which makes them ideal for enterprises seeking to transform from static legacy systems to more progressive digital computing operating models. In search of the best total cost of ownership, overall cost reduction continues to dominate as the main reason for cloud services investment. According to Gartner, CIO and IT director roles all rated 'cloud is a modern approach', 'innovation' and 'operational agility' as top drivers of cloud service adoption. The conclusion is that CIOs are focused on using the cloud to establish modern, innovative IT environments with operational agility and business competitiveness as key outcomes. [easy-tweet tweet="CIOs are focused on using the cloud to establish modern, innovative #IT environments" user="AtchisonFrazer"] However, because of perceived barriers to successful deployment, such as losing control of data and lack of visibility up and down the stack, few organisations will completely migrate to cloud-based SaaS. Instead, they will live with a mix of SaaS and traditional on-premise application deployment, with a focus on flexible integration. Extending on-premises solutions has been a consistent SaaS driver, as businesses continue to seek innovative approaches that are quick to deploy and often leverage the capabilities and data repositories of existing on-premises solutions. the power of cloud computing is in the speed and agility. Most enterprises have realised that the power of cloud computing is in the speed and agility gained in developing and operating cloud-based applications. But many have begun to implement agile methodologies and continuous integration for their newly developed web, mobile and analytics applications because of issues, such as security, that impact performance. As enterprises re-platform legacy applications to private and hybrid clouds, they must ensure that those apps will be able to scale and that, correspondingly, infrastructure performance monitoring capabilities scale at the speed of cloud. The underlying network infrastructure layer often presents the most challenges as organisations migrate from on-premise private cloud to hybrid-cloud delivery. Additionally, native cloud apps are being simultaneously designed with DevOps agility goals such as mobile-device responsiveness and low-latency endpoint access. 
Large-scale network operations often experience technical disruptions that degrade system performance and the overall IT infrastructure. For large networks, the source of degradation can be difficult to isolate, especially within virtualised or hybrid-cloud infrastructures, because the problem is often located on remote devices or manifests itself not as a complete failure, but merely as under-performance. [easy-tweet tweet="Isolating a poor performing component is substantially more difficult than isolating one that has completely malfunctioned" via="no" usehashtags="no"] Often, isolating a poorly performing component is substantially more difficult than isolating one that has completely malfunctioned. To solve network operation problems, IT administrators use fault management tools that explore and monitor key aspects of a network in a silo fashion. Degraded performance issues can range in scope and complexity depending on the source of the problems. Some examples of network operations problems include sluggish enterprise applications, the misuse of peer-to-peer applications, an underutilised load-balanced link, or lethargic VDI performance – all of which have an adverse effect on IT operations and, eventually, on an organisation's productivity and agility. One often-cited problem is that when monitoring network traffic for a relatively large enterprise, the amount of information relating to those packets can also be relatively large. The sheer volume of nodes and traffic in the network makes it more difficult for a standalone network-monitoring device to keep up with the amount of information. Fault isolation often requires tracking and storing large amounts of data from network traffic reporting devices and routers. Tracking huge amounts of data from large networks consumes massive amounts of memory and processing time that can hamper system performance. Accordingly, what is needed are scalable systems for efficiently storing and processing data for network monitoring, synchronously with processing interactional data correlated to other infrastructure components and virtualised objects. The solution lies in memory. A memory-driven engine provides the ability to quickly identify contention issues affecting the end-user experience, such as network device conflicts, NetFlow traffic patterns or IOPS degradation. This engine correlates and analyses the health of hundreds of thousands of objects and dozens of measurements within an IT environment's virtual infrastructure. By continuously comparing system profiles and thresholds against actual activity, it is possible to determine which objects are vulnerable to imminent performance storms. With microsecond-level data-crunching capabilities, it is possible to track, record and analyse more data, more frequently. Tracking huge amounts of data from large networks consumes massive amounts of memory and processing time that can hamper system performance Conventional, external database-driven performance monitoring solutions may miss intermittent contention storms, at the network level and beyond, that affect the performance of the virtual environment. Thus, even when other solutions do eventually identify a problem, it is minutes after end users have been impacted. And often, the information provided is insufficient to determine the root cause because it's primarily retrospective.
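To make this more concrete, below is a minimal sketch of the kind of in-memory threshold comparison such an engine relies on. It is an illustrative Python toy, not the actual product implementation: the object names, metric, window size and threshold are all invented for the example.

```python
from collections import deque, defaultdict

# Illustrative sketch of in-memory monitoring: recent samples are held
# per object and compared against a simple threshold profile, so objects
# drifting towards a "performance storm" can be flagged before they fail
# outright. All names and limits below are hypothetical.

WINDOW = 60               # keep the last 60 one-second samples in memory
LATENCY_LIMIT_MS = 20.0   # assumed acceptable rolling-average latency

samples = defaultdict(lambda: deque(maxlen=WINDOW))

def record(obj, value_ms):
    """Ingest one latency measurement and return a warning string if the
    rolling average breaches the assumed threshold, else None."""
    window = samples[obj]
    window.append(value_ms)
    avg = sum(window) / len(window)
    if avg > LATENCY_LIMIT_MS:
        return f"{obj}: rolling latency {avg:.1f} ms exceeds {LATENCY_LIMIT_MS} ms"
    return None

# Example: a VM whose latency creeps up second by second
for second in range(120):
    alert = record("vm-042", 10 + second * 0.5)
    if alert:
        print(f"t={second}s {alert}")
        break
```

Because the samples live in memory, each new measurement is evaluated the instant it arrives rather than after a batched database write, which is exactly the gap that conventional, database-driven tools are described as falling into.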
A real-time health score can be linked to the relative fitness of every client, desktop, network link, host, server and application that can affect the end-user experience, changing in real time to reflect the urgency of the performance issue. A significant performance shift will trigger an alert that is paired with a DVR-like recording. An in-memory analytics architecture allows for a real-time operational view of infrastructure health second-by-second, rather than just averaging out data over a five-to-ten minute period. In essence, Xangati identifies exactly which VMs are suffering storm-contention issues from networking resources at the precise moment it matters to take preventive action, regardless of whether apps run on-premise or in the cloud. ### Sunrise Software launches tailored solution for Service Providers Sunrise Software, a leading supplier of Service Management solutions, has launched an enhanced new version of its flagship product Sunrise ITSM, designed specifically to help Service Providers meet complex customer delivery needs. With the managed service sector continuing to celebrate rapid growth and combined sales across Europe reaching $294.28 billion, fast-growing firms are facing increasingly complicated and diverse demands from their customers. Launched in September, Service Management for Service Providers has been tailored to meet the needs of organisations that manage diverse service contracts with complex customer hierarchies, billing structures and customer-configured products. New features, created specifically for the requirements of Service Providers that outsource activities for their business operations, include managing multiple Service Level Agreements (SLAs) for third-party supply and maintenance contracts, correctly cataloguing products and services, and keeping track of customer entitlements, invoices and terminations. Neil Penny, Product Director of Sunrise, said: "As the industry continues to grow in scope, we have seen Service Providers faced with an ever more diverse range of suppliers across many complex customer contracts. This makes it increasingly difficult for them to provide a seamless, joined up service. "IT departments in large organisations effectively act as an effective internal Service Provider to the organisation, making their processes an ideal basis for managing external customers as well. IT Service Management processes and capabilities provide a strong foundation and best practice approach for a Service Provider to manage both suppliers and complex customer support arrangements. "Our latest release, Sunrise Service Management for Service Providers, uses this as inspiration to provide new functionality to meet the demands of customer service oriented organisations, allowing them to not only cope with the complex environment, but excel within it." New enhancements include opportunity/customer relationship management functionality and a feature to support a Return Merchandise Authorisation (RMA) process that allows customers to return faulty items that are under warranty. An Inventory/Stock Management feature enables item stock counts by location, including items out for delivery or in an engineer's van. Additional functionality includes the ability to track the time and resources spent on particular projects/customers on and off site.
For more information on Sunrise Software, its products, services and customers visit https://www.sunrisesoftware.com/ ### The Rise of Cloud BI  Up until fairly recently, Business Intelligence (BI) was rarely run in a cloud environment. Issues surrounding security, and concerns around entrusting sensitive corporate information to the cloud, meant the technology was seen as a much less credible alternative to the well-established, on-premise versions. [easy-tweet tweet="Peter Baxter, MD at Yellowfin, looks at the advantages that #cloudBI solutions are bringing to businesses" via="no" usehashtags="no"] But today, as cloud marches towards becoming a mainstream technology, the adoption of cloud BI is growing steadily.  Enterprises, large and small, are beginning to appreciate the many potential benefits of delivering BI in a cloud-based environment. Here, Peter Baxter, EMEA Managing Director at Yellowfin, looks at the advantages that cloud BI solutions are bringing to businesses. Baxter believes that, as the technology continues to evolve, enterprises across almost all industries will be looking to reap the flexibility benefits of cloud BI. Cloud unleashed Fundamental to this new and growing interest in cloud BI, is the appreciation of, and rise in, the cloud services market more generally. Improved security has seen concerns around storing critical information off-premise abate. Enterprises are overcoming these concerns and focusing instead on the advantages – primarily in the form of time and cost efficiencies enabled by the cloud.  According to research from the Cloud Industry Forum (CIF) carried out earlier this year, the overall cloud adoption rate in the UK stands at a considerable 84 percent. Indeed, on a global scale, enterprises across sectors are looking to the cloud for a multitude of business processes and services.  A new study by Cisco Systems and IDC (entitled “Don’t Get Left Behind: The Business Benefits of Achieving Greater Cloud Adoption”) suggests that, beyond the incentives of increased efficiency and lower costs, a “second wave” of cloud adoption is emerging with businesses adopting cloud services in order to achieve transformative and measurable business value. [easy-tweet tweet="a “second wave” of cloud adoption is emerging with businesses adopting cloud services for measurable business value" via="no" usehashtags="no"] The lure of Cloud BI Enterprises are embracing cloud-based BI for a number of reasons. Key factors behind its current appeal are affordability, scalability, low ongoing strain on IT resources, as well as speed of implementation and delivery of results.  These lucrative benefits are now also being coupled with the fact that cloud-based BI has moved, from being an ‘edgy alternative’, to a ‘tried and tested’ technology with an increasing number of successful and convincing use cases. Industry research also lends serious credibility to these touted benefits, with Enterprise Management Associates’ January 2015 report, “Analytics in the Cloud”, revealing time-to-delivery as the primary business motivation for choosing cloud-based analytics. It also uncovered the top three financial drivers behind the move from on-premise to cloud BI as: minimized hardware and infrastructure cost, reduced implementation costs and reduced administrative costs. 
Cloud BI offers enterprises a more affordable means of business reporting and analytics compared to traditional on-premise implementations, which often have a total cost of ownership (TCO) that exceeds the budgets for all but large enterprises.  The enhanced flexibility and scalability of cloud-based BI offerings has made enterprise-grade BI accessible to the mid-market, while enabling larger enterprises the option of rolling BI out across more departments and types of users – rather than purely the IT and senior executive teams. [easy-tweet tweet="Cloud BI offers enterprises a more affordable means of business reporting and analytics" user="comparethecloud" hashtags="cloud"] Together with the cost incentive, cloud BI offers small to mid-sized businesses (SMBs) a flexible, fully-scalable (both financially and technologically), low TCO solution that does not require significant in-house IT expertise to set-up, manage and govern.  It’s even possible for SMBs to run cloud BI services completely free – through services such as Yellowfin’s no-cost BI offering on Microsoft’s Azure Marketplace or Amazon’s AWS Marketplace – that enable businesses to set themselves up with a cloud-based BI environment in just a few clicks. Pervasive BI and the importance of the cloud Enterprises are increasingly looking to ‘next generation’ BI tools that are quick and simple to use, can analyse data from different sources, and provide impactful visualisations that are easy for business people to consume.  But, critically, they also need tools that enable key decision-makers to share this information with the rest of the business and draw input from other stakeholders (while maintaining governance and control of the analysis). Cloud BI offers the ideal platform both for analysing data from multiple sources and sharing content throughout the enterprise Cloud BI offers the ideal platform both for analysing data from multiple sources and sharing content throughout the enterprise.  Cloud BI platforms are better at integrating dataflows from different internal systems and external sources than many on-premise models.  As the volume and number of data sources (both internal and external) that organisations look to analyse for competitive advantage increases, so too will the business case for cloud BI. Similarly, because cloud BI tools tend to be accessible to a greater number of users within an enterprise compared to traditional server-based tools, due to their comparative affordability, organisations that embrace cloud-based BI are empowered to share findings with a greater number of users. The future of cloud BI While we have yet to see mass implementation of enterprise-wide cloud BI, the signs are that adoption rates will grow.  Analyst firm Research & Markets predicts that  the global cloud analytics market will grow at a Compound Annual Growth Rate of 26.29% from 2014 to 2019. Market statistics aside, the multitude of benefits that cloud-based BI and analytics offer – from reduced implementation and running costs, complete scalability, ease-of-use and the potential to offer BI to all types of users across an enterprise – are attracting the attention of businesses of all sizes. 
Indeed, we are almost at a stage when the question is not "should we be implementing cloud BI?" but "when will we be implementing cloud BI and reaping the benefits this form of BI technology has to offer?" ### Dell shines at Service Provider Conference in Weybridge Dell is not known for putting on big events in the UK; in fact, I can't remember the last time it did, but a few weeks ago we were asked to pop along to the Dell Service Provider day at Mercedes Benz World in Weybridge. We never refuse an invite to a good party so off we went - though we did bring our own sense of style. Yes we had a stand, and yes, we were dressed to provoke, and to get tongues wagging, but we wouldn't be Compare the Cloud if we didn't! [gallery columns="2" size="large" link="none" ids="31859,31860"] Our aim for the day was to get the low-down on what Dell wanted to say, as well as get the attendees' opinions on what they thought of the hardware seller! The perception in the market in the past has been that Dell is nothing more than a PC vendor - that may have been accurate once upon a time, but not any more. The announcements we heard at Weybridge certainly quashed any remaining 'just a PC vendor' murmurs. In the images below you can see how it all began for Dell in 1984 when the company had just started (yes that's Michael Dell) - conversely, today things have moved on considerably and they now sell to consumers, SMEs, mid to large cap companies and the public sector, covering systems, storage, networking, software and Cloud solutions - only 10% of their business is now with end consumers! So to back up these statements of massive change, it is appropriate that we use some hard stats that speak for themselves! Throughout the Dell event at the Mercedes Benz World location there were many speakers who evangelised Dell, and many who couldn't say enough "good things" about their partner of choice, Dell. Now this isn't uncommon at a vendor event as attendees are generally the organisers' clients and partners. A good turnout is expected, especially with the promise of driving a Mercedes Benz AMG around the race track at the end of the event (thank you Dell, I enjoyed the laps). However, Dell is not known to the industry for boasting about themselves, or for spending masses of money on lavish events to court their partners - in fact, Dell is not known for running events of this type at all. We were on hand to interview many of the attendees to see if they did indeed think so highly of Dell. I could go into all of the comments that were made by the interviewees but it's easier for us to summarise, and save ourselves all a little time. Every single interviewee, and others that were camera shy, sang Dell's praises! Not one company or individual said anything to the contrary and, more than that, they didn't just come along for the potential driving experience but to attend, listen and be educated on what Dell's strategy for the future is! We were impressed. Nutanix, Red Hat, VMware and Nexenta were but a few of the vendors who said not only that Dell's products were pivotal to their respective businesses, but that Dell were a pleasure to work with! Now, I know what you're thinking, I would say this as we too were courted with the option of tearing up the racetrack at 120 mph in a new AMG (although I did and then was asked to exit the vehicle). No, we really enjoyed listening to the responses, announcements and also had a lot of laughs with the guys at Dell and the attendees. Our experience proved that perceptions need to change.
Dell is not just a PC supplier and hasn't been for a very long time. [easy-tweet tweet="Dell is not just a PC supplier and haven't been for a very long time" user="NeilCattermull"] It may have been seen this way by the industry for a very long time due, in my opinion, to the lack of public announcements, the lack of events such as this one, and also the lack of companies such as us spreading the word. In this very overcrowded market we call 'Cloud', many vendors large and small are trying to buy, boast, evangelise and even overstep the truth to become players within the industry (heaven knows we have spoken to most of them). It is refreshing to see one of the vendors the industry least expected finally getting their act together and simply announcing what they have done and intend to do. I think Dell have organically grown into this space without the pomp and circumstance fanfares that others have indulged in, which in my book is absolutely fantastic to hear and see. Dell is the number 1 provider of servers in the US, and the number 2 in Europe I'll leave you with one last quote that was announced recently – Dell is the number 1 provider of servers in the US, and the number 2 in Europe. Didn't know that? Neither did I! What Dell has been working on in the software-defined space is very impressive; it has been keeping quiet but building momentum over the last few years and its partners think very highly of them. Keep it up Dell, and if you get any more opportunities to drive a new AMG around a track, can we come? To see more about the announcements and Dell's current services, watch the YouTube video below and spend some time absorbing what they are doing - it's impressive! [embed]https://www.youtube.com/watch?v=irUSVPFLetE&feature=youtu.be&t=17s[/embed] ### Special Report on Marketing Automation There is hot competition for market share and leadership in the marketing automation sector. So Compare the Cloud has completed an analysis of the market, looking at all the other rankings and adding one of our own. Rankings vary of course depending on what you're looking at and how you measure it. We start with our own rankings, which are based on influence ranking across news media, blogs and social media. If you'd like to understand more about how we analyse influence, there is an article available here. Hubspot has a clear lead over the others with a far higher profile. This profile was recently boosted by the announcement of various extensions to its marketing platform: a great overview of these additions and their implications was recently given by analysts TBR, and you can view it here. Marketo emerges as the clear favourite for big firms with Hubspot the SMB champion Some distance behind Hubspot, battling for second place, are Marketo, Oracle's Eloqua and SalesForce's Pardot. In addition there is one other outlier that has made a recent impact in the media - Zift Solutions. Recently announcing that it was to acquire the SharedVue division of The Channel Company, Zift is aiming to dominate the niche for automated marketing solutions focused on the channel. [easy-tweet tweet="#MarketingAutomation is a big trend for 2015, @BillMew analyses the industry leaders" user="comparethecloud" usehashtags="no"] Of course influence score is but one metric, so we have compared our ranking with others that we think are of value to try to provide the full picture overall. Datanyze Datanyze tracks key technology categories and publishes a wealth of market share and migration data for free.
Datanyze searches for web technology signatures on millions of the highest-ranking websites to determine which web technologies are being used. It allows you to view country-specific results as well as results for subsets such as the top 100 sites, top million sites or indeed its entire data set. For marketing automation it has Marketo in the lead, followed by Oracle's Eloqua, SalesForce's Pardot, BounceExchange, Hubspot and then IBM's Unica. [caption id="attachment_31461" align="aligncenter" width="600"] Image via: Datanyze[/caption] G2 Crowd G2 Crowd has a different perspective. For its G2 Grid, it rates products based on a combination of the Marketing Automation product's customer satisfaction and market presence. Its G2 score has Hubspot in first place followed by SalesForce's Pardot, Marketo, Act-On and then Oracle's Eloqua. For market presence, however, it has Hubspot again in the lead, but with Oracle's Eloqua, IBM's Silverpop, SalesForce's Pardot and Marketo as runners up. [caption id="attachment_31462" align="aligncenter" width="600"] Image via: G2 Crowd[/caption] Forrester [caption id="attachment_31463" align="alignright" width="278"] Image via: Forrester via B2C[/caption] Forrester earlier this year named the nine most significant lead-to-revenue management (L2RM) marketing automation platform vendors. In 2014 Forrester Research published "The Forrester Wave™: Lead-To-Revenue Management Platform Vendors", which evaluated the use of marketing automation by small marketing teams and large enterprises. Act-On was a leader in both categories, followed by Adobe, CallidusCloud, IBM, Marketo, Oracle-Eloqua, Salesforce-Pardot, Salesfusion and Silverpop. This was of course before IBM acquired Silverpop. Gartner Gartner's 2015 Magic Quadrant for CRM Lead Management tabbed Oracle and Marketo as the leaders, followed by Salesforce-Pardot, IBM Silverpop, Hubspot and Adobe. [caption id="attachment_31469" align="aligncenter" width="505"] Image via: Gartner[/caption] Gleanster For features and functionality Gleanster Research had Oracle Eloqua, Silverpop, HubSpot, Marketo and Act-On in the lead, but for overall value it had HubSpot, Marketo and Act-On. [caption id="attachment_31464" align="aligncenter" width="600"] Image via: Gleanster[/caption] TrustRadius TrustRadius also has a ranking for marketing automation vendors which is divided into segments by company size: for large firms its top-rated firms were Marketo, Oracle-Eloqua and Act-On, for mid-sized business it had HubSpot, Marketo and Act-On, and for small businesses it had HubSpot and LeadSquared. Its overall ranking however showed a clear lead for Hubspot. [caption id="attachment_31465" align="aligncenter" width="600"] Image via: Trust Radius[/caption] Obviously there is only limited agreement or alignment between all these various rankings. This is because each is seeking to measure a slightly different aspect or variable. This varies from pure marketing automation to lead generation, and from the needs of large firms to those of SMBs. However, reading between the lines, two main winners start to emerge. For SMB firms Hubspot has long been a popular choice and the continued buzz around its ever-expanding portfolio appears to be strengthening its position at this end of the market.
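For readers who want to reconcile such differing lists themselves, here is a small, purely illustrative sketch of one way to do it: a simple Borda-style count in Python. The orderings below are abbreviated from the rankings discussed above, and the method is our own toy example, not how any of the analysts named here actually compute their results.

```python
# Toy rank aggregation (Borda-style): each source's ordering awards points,
# with first place earning the most. The lists are abbreviated versions of
# the rankings discussed above, for illustration only.
from collections import Counter

rankings = {
    "datanyze":    ["Marketo", "Eloqua", "Pardot", "Hubspot"],
    "g2_score":    ["Hubspot", "Pardot", "Marketo", "Act-On"],
    "gartner":     ["Oracle", "Marketo", "Pardot", "Hubspot"],
    "trustradius": ["Hubspot", "Marketo", "Act-On"],
}

scores = Counter()
for source, ordered in rankings.items():
    n = len(ordered)
    for position, vendor in enumerate(ordered):
        scores[vendor] += n - position  # first place earns the most points

for vendor, score in scores.most_common():
    print(f"{vendor}: {score}")
```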
[easy-tweet tweet="SMB's most popular choice for #MarketingAutomation is @Hubspot" user="billmew" usehashtags="no"] Meanwhile at the enterprise end of the market a number of major players are seeking to gain a lead and we have seen firms like Oracle and IBM seek to do so via acquisition and expect further acquisition and consolidation ahead. However the firm emerging strongest at present is Marketo, and while both Marketo and Eloqua were seen by Gartner as market leaders, it was the extent to which Marketo holds a lead in market share (as shown clearly by Datanyse) that puts it in the strongest position. ### Friday Night Cloud: Episode 3: Morons and Tech [embed]http://traffic.libsyn.com/fridaynightcloud/FridayNightCloudEpisode3.mp3[/embed] In this rant of a podcast we discuss whether automation has been good for the human race and unleash a tirade on the impact of cloud, mobiles, IoT & technology on the wellbeing of people. The original quote on people "skimming research" It is clear that users are not reading online in the traditional sense; indeed there are signs that new forms of “reading” are emerging as users “power browse” horizontally through titles, contents pages and abstracts going for quick wins. It almost seems that they go online to avoid reading in the traditional sense. Sources: http://www.theatlantic.com/magazine/archive/2008/07/is-google-making-us-stupid/306868/ ### Criteo chooses Huawei for biggest private Big Data platform in Europe Huawei has been chosen by Criteo to supply 700 servers for its new Hadoop Cluster datacenter based in Pantin (Seine St Denis). Following a strict comparative study, Huawei won the tender to provide its "FusionServer RH2288H V3" model, which consists of a large number of high-capacity disks. The RH2288H V3 servers allow better storage density and reduce energy consumption by 10% thanks to innovations in terms of design and heat dissipation. [easy-tweet tweet="Criteo chooses Huawei servers to build the biggest private #BigData platform in Europe" user="comparethecloud" hashtags="cloudnews"] The new Hadoop platform of Huawei servers represents a gain of 30% in processing performance, thereby improving the Criteo Engine’s performance. In this first stage of the project, some 700 machines have been installed within the datacenter, providing more computing power and storage than the Criteo datacenter in Amsterdam, which has a total of 1,200 servers. In the longer term, the Pantin datacenter will consist of up to 5,000 servers over 350m2 with a power of 2 MW. "This is the biggest private Hadoop platform in Europe as of today," said Matthieu Blumberg, Senior Engineering Manager (Infrastructure Operations). "Huawei has undeniably good ICT solutions and extensive knowledge of Big Data. We were really impressed by the quality of their response to the tenders and support provided during the deployment of their servers." Robert Yang, Head of the Huawei France Enterprise Business Group, commented: "Huawei's goal is to provide our customers the best technology and cloud solutions to help them become the leader in their respective industries. We are proud to have built this partnership with Criteo: this is the kind of project we love to develop.” ### Top 5 Mythbusters for Cloud Collaboration Solutions As more companies turn to the cloud as a way to streamline workflows and cut costs, every vendor seems to offer some type of cloud solution. 
The benefits, enabling teams to collaborate across geographies and time zones and making media centrally accessible, can transform workflows. [easy-tweet tweet="Get the top 5 myths busted for Cloud Collaboration Solutions by @RoryMcVicar" user="comparethecloud" usehashtags="no"] But like any big industry shift, innovations are happening as fast as the transition and keeping on top of the considerations while continuing to move forward can be a big challenge for broadcasters and content creators. We’d like to help clarify some of the big issues – to bust a few common myths and help you get to a solution that delivers on all that the cloud promises.  Myth 1: We need to invest in our IT infrastructure and staff to support a move to the cloud False! The basic premise of working in the cloud is that you can access media and collaborate remotely without having to build out your own IT infrastructure to do so. The cloud vendor maintains the infrastructure and you get the benefit of their continual development, and of scaling and provisioning on demand and without limit. Where the confusion comes from here is that some vendors offer solutions that are essentially ‘private cloud kits’ – hardware and software that enable remote media access on infrastructure that you own and maintain. Your capital investment in a true cloud solution should be minimal; even zero. Look for a solution that is video-centric, not IT-centric and doesn’t require you to hire or train additional IT staff to set up and administer. [easy-tweet tweet="Your capital investment in a true #cloud solution should be minimal" user="rorymcvicar and @comparethecloud" usehashtags="no"] Myth 2: Only big public cloud vendors like Amazon and Microsoft offer the bandwidth required for pro video Busted! Public cloud solutions like Amazon’s S3 or Microsoft’s Azure are not alone in their ability to deliver the bandwidth you need for cloud-based workflows. There is a new category of ‘vendor cloud’ solutions, where the vendor owns and operates all of the equipment in its own cloud data centres, features and security are purpose-built for professional video, and speeds and capacity can exceed those of public cloud solutions. For example Aframe’s ‘vendor cloud’ is optimised to enable even remote users working on 3G or 4G connections to play back media and make review notes successfully. Also, keep in mind that the big public cloud solution vendors typically charge for uploads and downloads, which can make uploading even a single TV show master super expensive, never mind a day-to-day workflow reliant on them. Public cloud solutions also share their bandwidth across a huge cross-section of industries and customers, including the general cat-video-watching public. Public cloud solutions also share their bandwidth with the general cat-video-watching public One more tip – there are also ‘hybrid cloud’ solutions out there, software purpose-built for video file sharing and collaboration that utilise a separate underlying server network that the software vendor does not own/maintain. While these solutions don’t place any IT burden on you, they can optimise speed, features and security only to the level that their underlying platform can support, and you may have multiple points of contact to work through any issues. 
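To see why those per-GB transfer charges matter, here is a back-of-envelope sketch. The file size, reviewer count and per-GB rate are invented purely for illustration and are not any provider's actual price list.

```python
# Back-of-envelope sketch of how per-GB transfer charges add up for large
# media files. All figures are illustrative assumptions, not real pricing.
MASTER_SIZE_GB = 300         # one high-resolution TV show master
TRANSFER_RATE_PER_GB = 0.09  # assumed cost per GB moved out of the cloud
REVIEWERS = 5                # remote collaborators each pulling the file down

per_master = MASTER_SIZE_GB * TRANSFER_RATE_PER_GB * REVIEWERS
print(f"One master, {REVIEWERS} downloads: ${per_master:,.2f}")

# A day-to-day workflow multiplies that by every cut and every project.
VERSIONS_PER_WEEK = 4
print(f"Per week across {VERSIONS_PER_WEEK} versions: ${per_master * VERSIONS_PER_WEEK:,.2f}")
```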
[embed]https://youtu.be/YM0FIdyGej4?t=20s[/embed] Myth 3: We have to work on private servers to keep our media secure We understand why organisations mandate this – especially if they assume that their assets need to be in the public cloud to enable remote collaboration. But no! You don't have to forgo the cloud to maintain security. With a 'vendor cloud' solution like Aframe, your assets don't traverse the Internet without being encrypted. Make sure your vendor encrypts uploads and downloads with the AES 128-bit standard for secure connections to every client, and that it has layered protections spanning from the hosting platform to the browser, using 256-bit SSL encryption to protect your metadata, passwords, and proxies streaming from the cloud. Customer data should be stored redundantly in separate geographic locations at tier 1 data centres that are SSAE 16 compliant and ISO 27001 aware. In the case of Aframe, the cloud solution has passed strict security audits from multiple organisations including the BBC, Fox Sports, Endemol and others, so its users are confident that their media is safe. [easy-tweet tweet="You don't have to work on private servers to keep your media secure" user="rorymcvicar and @comparethecloud"] Myth 4: We already have a MAM system. We can use that for remote media access and collaboration Be careful here. Media asset management (MAM) systems are a foundation component of the video workflow, designed to handle finished media − edited, approved pieces. Trying to make those systems do double-duty as a collaboration system for work in progress clutters them up with raw footage, short shelf-life pieces and other production assets. your cloud collaboration solution should enable all users to share, organize and work with footage in an open, transparent way Also, because most MAMs are designed for technical users, they don't make it easy for non-technical users to find and use content. That's exactly what you want your cloud collaboration solution to do − enable all users to share, organize and work with footage in an open, transparent way with videos being discoverable throughout the process. You want to be able to file media in controlled project folders, add relevant data right from the start, then move projects into the MAM once they're finished. And MAMs don't typically include transcoding capabilities that let you upload files in any format knowing that they will be homogenised into your house format automatically, as the best cloud video solutions do. In the end, trying to use a MAM for collaborative production won't streamline your workflow, and could end up torpedoing your MAM. Myth 5: We don't create enough video to warrant a cloud solution. We get by with FTP/Dropbox/shipping drives Let's break this one down. Dropbox, Hightail and other file hosting sites are an easy, super cheap (or even free) way to make files accessible to remote teams. But they don't centralise media and notes, enable true collaboration or save you any steps. Most will require additional tools to play back media, add notes, merge metadata, and export results. Some don't even handle high-resolution media. Think about how you approach the creation of even one video piece. How much time do you spend uploading, copying and downloading footage? (For a single point of comparison, Aframe's transfer speeds are 15x faster than FTP.) Naming and grouping selected shots? Transcoding cuts into the different formats your reviewers need? Shipping DVDs and drives?
How does the editor receive and keep track of notes? How many versions do you have floating out there during the process? How often do your reviewers have trouble playing back files? After the piece is finished, how easy is it to find and re-use footage from that project? Those considerations go way beyond posting and retrieving files, of course, but the point is, even when you're creating a single video project, the time your whole team spends on manual processes adds up. Working with a cloud-based automated solution streamlines all of that, with some companies reporting two weeks of time saved in the edit room per finished piece/episode. That's significant because you can spend that time doing what matters most… which may be doing more video. In the end, why just get by when you can get more accomplished, faster, while saving everyone involved a lot of hassle? [quote_box_center]At Aframe we're helping broadcasters, promo teams, corporate marketing groups, and video producers across all industries, all over the world, transform their work with cloud-based video collaboration. We believe collaborative teams should never have to wait for media to begin working. We have our ear to the street and will keep busting myths to make the transition a bit easier.[/quote_box_center] ### Top 10 Fintech Startups to Watch in October Unless you've been living in the wilderness with no internet access for the past few years, you will be aware that the Financial Technology market, or 'fintech', is booming. With so many different areas to target, from security to bitcoin to mobile money wallets and many, many more, fintech startups are popping up everywhere and they are rapidly bringing exciting innovations to a multi-faceted market. [easy-tweet tweet="Here's @KateWright24's top 10 #Fintech companies to watch out for in October 2015 " user="comparethecloud" usehashtags="no"] So, who should we be keeping an eye on as we approach the final quarter of 2015? 1. Brisk Invoicing Brisk Invoicing provides a user-friendly online invoicing platform. It is ideal for small companies and startups that do not have an accounting department and need a simple, low-cost way to manage their finances. The platform is easy to navigate and is focused on ensuring that no payment goes overlooked. Brisk Invoicing automatically saves payees, schedules regular invoices and sends each invoice with a professional, tailored design to reflect your company's image. And that's not even the best part - you can use the service for FREE! Follow the link above to give it a go. 2. Osper When I was young, I had to learn the value of a pound with £5/week in pocket money, cash in hand and a little notebook where my mum kept track of what I'd spent. Oh, how times have changed! Now young people can learn to manage their finances with the likes of an Osper bank card and app, designed for those aged 8-18 to teach them financial responsibility. The card works anywhere that accepts MasterCard payments and the young person can keep track of their spending and balance via the Osper app on their smartphone. Parents/guardians can load money directly onto the card, via the app, and can set up regular payments e.g. pocket money. They will also get a notification if a payment is declined due to insufficient funds. All of this comes at the extremely reasonable price of £10 per card per year - you can even have a 3 month free trial, if you're still not sold. [easy-tweet tweet="Want your kids to be financially savvy?
@Osper could control the future's pocket money" user="katewright24" hashtags="fintech"] 3. Tink Tink is like the adult version of Osper because, although we may have decades more experience in keeping on top of money, that doesn't mean we're any good at it! Developed in Sweden, and looking to expand across Europe, the app helps you to understand how you spend your money by grouping your expenditure into categories. You can tailor the app to suit your financial goals, whether this be sticking to a budget or saving for something in particular. The platform is very user friendly, to make budgeting a little more fun (ok maybe "fun" isn't the right word…but you get what I mean!) At least now you know when you're spending more on coffee and wine, rather than…y'know….rent. Thanks, Tink! [easy-tweet tweet="Would you know if you spent more on coffee and wine than you do on rent? " user="katewright24" hashtags="fintech"] 4. Digital Shadows Despite the extremely exciting innovations brought to us by the fintech boom, there is a dark side to some of these services. Security breaches have become a rapidly growing concern - I'm sure you can all tell a tale of someone you know who has been the subject of hacking or fraudulent activity. The problem is that everyone is keen to find quick, cost-effective and more manageable ways to handle their finances, but at what cost? If you don't look for the most secure solution you can put yourself and your organisation at huge risk. With that in mind, Digital Shadows is designed to put your security fears to bed. It analyses your digital footprint and highlights potential threats, whether this be loss of sensitive data, compromises to your brand's integrity or hackers. They are based globally, and able to support more than 27 languages with their software - now is definitely the time to invest in safeguarding your business. 5. Jumio Continuing the theme of security, Jumio specialises in identity management and credentials verification to reduce fraudulent activity. Bringing on new clients is all well and good, but companies need to ensure that they know their clients are legitimate. Jumio is PCI Level 1 compliant and consistently carries out security audits to ensure it is always at the top of its security game. With a portfolio of clients including AirBnB, Easyjet and United Airlines, Jumio is set to make a big impact in the online security market. 6. Adyen Technological advances over the past 20 years have simplified international business - now anyone can set up a global business and operate across borders. However, there are still complications, one of which is the transfer of money across international borders. Different currencies, exchange rates, international fees and payment delays can create unnecessary hurdles in operating a business. Adyen aims to provide a streamlined solution for managing global payments, whether these are online, mobile or in-store transactions. Adyen can accept almost any type of payment across the world, and it does so with risk management at the forefront, so you can rest easy that transactions are secure. 7. Transaction Tree As we move more and more online, the world is gradually going paperless. Transaction Tree champions this with its digital receipt service for the retail industry. Although receipts serve their purpose, we all hate that monthly clear-out of our wallets/purses/handbags/pockets where we seem to have accumulated hundreds of pointless receipts - so why not go digital?
Not only are e-receipts cost-effective and environmentally friendly, but they also give companies a way to capture customer information for analysis. With customers' email addresses, organisations can communicate with their customer base through digital marketing. If used effectively, Transaction Tree can offer a great opportunity to its clients. 8. Atom Bank AtomBank could be the top one to watch over the next 2-3 years. They are not open for business yet, but have just recently received their full banking licence. The concept of AtomBank is a banking system that operates completely online. There are no branches, just an app on your phone. The Guardian has already dubbed them the "Uber of banking". With all their energy focused on the digital experience and the use of business analytics, AtomBank can adapt its banking per user to personalise your banking experience. The team was assembled by Anthony Thomson, the guy who got Metro Bank off the ground. If AtomBank can live up to all its hype once it opens for business, then it would be a real game changer in the banking market. Watch this space... 9. iZettle iZettle have developed card readers targeted at small businesses. Their card readers are secure, small (pocket friendly!) and easy to set up. The card readers work with any smartphone, or you can get ones that hook up to a cash register. They are particularly beneficial for those who have non-static businesses e.g. driving instructors, food vans and market sellers. Finally, those on the road can move into the 21st century without the risk of losing business when customers aren't carrying cash. You can even get their Card Reader Lite for free! They are based in Stockholm but trade internationally across 11 countries, including the UK, Mexico, Spain and the Scandinavian countries. 10. Aire Score Aire helps to build your credit score, as do many other companies out there; however, Aire specialises in supporting those who may struggle to build their credit history - e.g. students, migrants and those in the military - who either do not have much history to pull up, or whose credit history is based in another country. It can be extremely difficult to apply for a rental property or even get a phone contract without a strong credit history. Aire can support these people with building their 'banking resume'. You submit the data to build your case and Aire will compile your credit history. Aire believes that everyone should have fair access to finance and they have already built a strong following, with their software being discussed by the likes of TechCrunch, The Guardian, FinExtra and TechCity News. [easy-tweet tweet="The #FinTech market is on the rise and now is the time to be engaging its services" user="katewright24" usehashtags="no"] So, whether you are an investor looking for a new opportunity, a consumer looking to benefit from a new financial service or a startup looking for inspiration, it is clear that the FinTech market is on the rise and now is the time to be engaging. If you are interested in learning more about FinTech then check out some of our latest articles below: [caption id="attachment_22213" align="aligncenter" width="300"] What is a blockchain?
Building trust in Bitcoin[/caption] [caption id="attachment_21116" align="aligncenter" width="300"] Three common cloud myths that scare financial firms away[/caption] ### Contradictions, Creatives, Metrics and ROI – Lessons from #SMWLDN Last week we attended Social Media Week (SMW) London #SMWLDN and here's what we learned: 1) Social Media Week is a contradiction It is a physical event that is designed to promote practice in an online social environment. The London event is one of a set of global events – held across 20 cities and drawing 30,000 participants. These events provide a forum to discuss developments in an industry that is in flux, to share best practice within a profession that is still embryonic, and to network and demonstrate the latest tools in one of the tech world's greatest areas of innovation. Now in its fifth year, Social Media Week (SMW) London is the second-largest of the global events (behind New York City) and London is considered "the most social SMW city" of them all. As with past years, it was evident that those within this bubble participated with great enthusiasm, both in the discussions at the events and on social platforms. Outside the bubble there was, however, a level of at times scathing criticism from those tracking Social Media Week purely from the supposed "insights" posted by users of the #smwldn hashtag. While some of the criticism was simply sour grapes from those left out of the event, I could not help feeling that while there were a few great case studies presented, most of the really groundbreaking work was being done by people with too little free time to attend Social Media Week. 2) It's not just about the cool agencies and tech firms In the past, there was a sense that most of the limelight was focused on the latest, greatest tools or how the coolest agencies were using them to do great things for their corporate clients. In one of this year's sessions (entitled Social Creative Leadership: A Knife Fight in A Phone Box, hosted by @LeoTwit from Ogilvy @OgilvyUK, and featuring @Whatleydude @FrithFrith and @jedhallam) there was a discussion on the evolving demarcation and dynamics within the creative value chain and between different creative teams. It asked what the role is for creative agencies, media agencies, publishers and the social platforms, as well as the clients themselves, all of which have creative resources. The debate was worthy, but the knife fight failed to draw any blood (or indeed any bruises). The most compelling contribution was from the client present - Fritha Hookway @FrithFrith, Digital Marketing Manager at @Topshop. She made the point that larger firms or brands with the resources should try to retain social control in-house and certainly never outsource their community management. With larger firms being able to recruit the necessary skills and afford the same great tools as the agencies, why wouldn't they want to develop in-house capability and retain direct control of community management, and possibly also the whole social spectrum? But then where does this leave agencies, most of which are built to serve large clients and large budgets - the very clients most able to do it themselves - rather than the vast SMB sector where support is really needed? [easy-tweet tweet="Nescafe claimed that 'Dot com is dead' at #SMWLDN" user="billmew" usehashtags="no"] Other in-house social leaders that shone out included Barclaycard, British Gas and Nescafe.
In a discussion chaired by @javierburon, CEO of @SocialBro, with panellists @FloraBusby @simonlp @tomball1985 @alexpackham @gregallum & Sarah from @snackmedia, the conversation focused on what the future of paid advertising on social will be. Flora from Barclaycard @FloraBusby made an impassioned argument for better targeting and focus to deliver increasingly relevant paid social content. And Greg from British Gas argued for an adaptive strategy approach where spend is varied depending on how platforms or posts are performing. He also argued for appropriate resource allocation to community engagement, saying that British Gas currently aims to respond on Twitter within two and a half hours and is looking to improve that to two hours soon, but that the resource to go much further probably couldn't be justified. It was a great discussion - see a summary of the SocialBro debate on the firm's own website here. Nescafe, one of the giants of the food industry, is trying hard to present a human face to its operations and products. It shook things up a bit at Social Media Week by claiming that 'Dot com is dead'. The firm announced that the era of brand websites is now over and that it is converting its Nescafé.com website into a Tumblr blog. 3) Social media is leading the marketing drive to become data-driven and metric-focused [easy-tweet tweet="ROI on social is becoming more and more easy to demonstrate" user="billmew" hashtags="socialmedia"] Where once creative types ruled the marketing world, it is now centred on hard-nosed business metrics. In this environment simple "likes" and "retweets" are relatively meaningless metrics. See my previous thoughts on real metrics here. The tools are improving all the time, as is the ability of teams (whether on the client or agency side) to track the real return from campaigns more accurately, identifying users and linking content to click-through rates, actions, outcomes, direct purchases and ultimately revenue. This is making the ROI on social easier and easier to demonstrate. It is also driving a more metric-based and data-driven approach which, when integrated across the whole digital marketing mix, is transforming the way that businesses operate. ### Can Your Business's Cloud Storage Cross Borders? In the news recently, we have witnessed numerous data breaches which have left many customers' private information exposed to hackers. More than 300,000 US taxpayers recently saw their personal tax details accessed by organised crime fraudsters, whose attempts to hack the IRS were shockingly successful. Not all incidents are down to organised hacker groups though, and some businesses' lack of knowledge about different countries' laws and regulations can lead to this type of risk. [easy-tweet tweet="Information that crosses country borders creates problems for the data controllers" user="sixsq and @comparethecloud" usehashtags="no"] Information that crosses country borders creates problems, and as the company is the 'data controller', it is responsible for ensuring that the correct security measures are in place. If not, then this can have a huge long-term impact on a business. Now, more than ever before, it is important to protect your business's private and customer data from security threats. As the digital world continues to grow each day, more and more information is exposed online, and this raises many concerns.
Many factors need to be considered to protect confidential data stored by your business, and one of them is how a business manages its cross-border cloud hosting. Cloud Storage Security A lot of businesses use cloud service providers to manage their data storage, and for many it is a useful way to host applications on one platform. The issue is that there are regional differences in data protection laws, and this can impact cross-border cloud storage requirements. The issue is that there are regional differences in data protection laws It is important to be familiar with other international laws and regulations surrounding data protection. This is especially important when considering using a cloud service provider who hosts storage in different areas of the world. There are companies who have made innovations in the marketplace to ensure that their services offer flexibility when choosing cloud technology. For example, SixSq, who are part of the Rhea group and describe themselves as 'software artisans', partner with multiple cloud service providers to provide a diverse range of options to their customers. Below are some tips on how best to approach this dilemma for your own business needs. If the data you are storing is hosted by a cloud service provider in a different sovereign state, how does this impact the privacy regulations governing your data? Access to the data and its protection will be under the laws and regulations of the state it is stored in. But this is not the full story. Laws like the USA's Patriot Act can jump borders if the cloud company is owned by a company or owner registered in such a country. It is therefore paramount to understand the ownership of the cloud companies used. [easy-tweet tweet="It is paramount to understand the ownership of the #cloud companies you are using for international services" via="no" usehashtags="no"] Should this be a concern for any business using cloud storage in a different country? Sure, as long as data privacy is an issue. Based on data categorisation, some data might not be sensitive or might only be valuable for a limited time. But for sensitive data (e.g. customer, employee, corporate, medical or sensor data) this is critical, and therefore understanding the geolocation of the data stores is a must. Further, while where the data is stored is obviously important, where it is processed should also be controlled with great care. Is it recommended to use a provider who is based in the country you reside in? Not necessarily. But since we normally understand our own country's laws and regulations better, it is often a reasonable choice. Having said that, other countries might have either more favourable laws and regulations, cheaper prices or higher quality. Doing your homework is important, and getting help is probably a good idea to assist in the choice. What advice would SixSq offer to businesses who are worried about their data storage? ensure you are not stuck with any given provider My advice is to first ensure you are not stuck with any given provider. This means not only from a contractual point of view, but also regarding technology, architecture and process. SixSq is based on the principle of neutrality in the cloud space. One of our customers recently shifted from one cloud to another in a matter of hours. Their architecture allows it, and using our technology (i.e.
SlipStream to manage application deployment) ensured that they could simply re-deploy their application stacks and watch the terabytes of data being moved from one NoSQL backend to another. This was a great success: the customer was worried about lock-in, yet was able to shift its production system, with no downtime, from one cloud provider to another. This means that with careful planning and the right help and support, cloud independence can be achieved. Therefore, for customers with sensitive data storage needs, this type of defensive approach to cloud storage location can be turned into a serious advantage and can also eliminate an important risk from the business. [easy-tweet tweet="With careful planning, the right help and support, cloud independence can be achieved" usehashtags="no"] To find out more about international data protection laws and regulations visit the DLA Piper website for an extensive list. ### Public cloud: Because you're worth it There may be no form of revolutionary technology subject to more suspicion than the cloud – and public cloud in particular. From a certain perspective, this is understandable. Historically, the progression of technology has been slow and painstaking, but there's an argument that the three centuries of the Renaissance have nothing on the last fifty years. We've seen clones, satellite TV, electric cars, a guy landing a rocket on a comet, a guy landing himself on the moon, the internet, and Chilli Heatwave Doritos arrive in this time. But for every legitimately life-changing development, there've been three or four underwhelming flops touted as the next big thing. The Nintendo Power Glove. The Segway. LaserDisc. New Coke. [gallery columns="4" link="none" ids="31385,31386,31383,31384"] So treating the public cloud with a degree of caution is understandable, particularly given early concerns around security and flexibility. Equally, if you're running your own business, it's hard to change when you've been using on-premise IT solutions for years – particularly when dismantling your comms room seems like an expensive process that could pose certain risks to your data, and things appear to be humming along in decent, if unspectacular, fashion. sticking with your on-site solution isn't the smart play However, sticking with your on-site solution isn't the smart play. Cloud technology has been around for a while, and it's had plenty of time to outgrow any teething issues that may have arisen in the early going. Here are three reasons you should consider ditching your comms room and making the leap to public cloud. [easy-tweet tweet="Here are 3 reasons you should consider ditching your comms room and making the leap to #publiccloud" via="no" usehashtags="no"] 1. It's cheaper. They may still be widespread, but there's no getting away from the fact that comms rooms aren't exactly a bargain. Upfront expenses are substantial enough: servers can cost thousands of pounds, and if you're looking to expand, you'll have to buy more of them. It doesn't stop there, obviously. The day-to-day operation of a comms room can cost plenty too: you're powering the main servers, the backup servers, and paying operations engineers to make sure your entire IT infrastructure doesn't crash at a moment's notice. [caption id="attachment_22257" align="alignright" width="300"] Click here to read more about the digitisation of computing[/caption] Using hosted services is better… but not much better.
Their fee structure may seem superficially cheaper, but it also tends to lock you into a system of pre-set license packages, none of which will ever really meet your precise requirements. If you need to set up a system for 6 users, you’ll typically have to buy five or ten licenses at once – either slightly undershooting or massively overshooting what you actually need. Public cloud represents a significantly smaller investment – and with potentially greater returns. Firstly, you don’t have to pay any setup fees at all. With Amazon Web Services, you can get set up within half an hour on a pay as you go price plan – with no long-term commitment, ‘admin fees’, or hidden costs. Secondly, you won’t have to deal with long, drawn-out software updates that prevents your employees from accessing business-critical applications at inconvenient times. Finally, public cloud is typically available on a per-user, per-month basis, allowing you to scale your IT usage to the precise needs of your business: if you need one new license, you won’t have to buy four you don’t need; if you need to scale down, you can do so without sacrificing other user licenses. 2. It’s safe. [easy-tweet tweet="There may be no more pervasive – and damaging – myth about the #cloud than of its #security vulnerability" user="ifollowoffice" usehashtags="no"] There may be no more pervasive – and damaging – myth about the cloud than the myth of its security vulnerability. The truth of the matter is that most of the stories of cloud-related ‘safety flaws’ can be attributed to the aforementioned growing pains in the technology’s early history – or simple human error. No matter how robust security gets, setting your admin password to ‘password’ will always be problematic for your business. People still believe that the cloud is unsafe, however, because it’s out of sight – if you feel like you don’t have control over it, you don’t feel like it’s safe. Put simply, however, it’s better if your employees are nowhere near your security protocols. The technology cannot afford to be unsafe: its existence and continued success cannot afford to suffer regular security breaches. Public cloud companies spend more on security than any other kind – and they’re always incentivised to stay up to date with it. Cloud security is now as safe as on-premise, if not more so: with increasingly sophisticated encryption and authentication, organisations of all shapes and sizes are using the technology instead of relying on rusty server rooms. Some of these include Netflix – and the Pentagon (so if the cloud is truly unsafe, it’s quite possibly a threat to global security). 3. It’s flexible. The on-premise solution was built with the understanding that the overwhelming majority of employees would be working 9-5. It’s not really built for anything else. Downtime is scheduled outside of office hours, preventing employees from checking emails or accessing vital applications – inhibiting the possibility of flexible working. [easy-tweet tweet="With #publiccloud, it’s possible to liberate your employees from the fetters of the office." user="ifollowoffice and @comparethecloud" usehashtags="no"] With public cloud, it’s possible to liberate your employees from the fetters of the office. All of your applications are available from any device – at any time. Employees can work on holiday, on public transit, or from home – recent tube strikes (and proposed tube strikes) have made this latter opportunity more appealing to business owners than ever. 
The true promise of the cloud – beyond security, flexibility, cost, or anything else – is exactly this: that organisations will no longer be limited by their IT, but freed by it.

### How to Prepare Your IT Department for a Disaster

This year, disaster recovery (DR) has been a top priority for 45 per cent of UK IT departments. With the increase in legal and regulatory compliance, coupled with virtualisation and cloud-based strategies for disaster recovery, more IT departments (5 per cent more than in 2014, to be exact) are recognising the importance of DR. But knowing DR is necessary and implementing it effectively are two separate things. In a separate study by Timico, only 5 per cent of respondents said they were totally confident that their DR plan was adequate. [easy-tweet tweet="Only 5% of businesses are totally confident that their DR plan is adequate." user="timicouk and @comparethecloud" hashtags="DR"] UK businesses face both man-made and natural disasters, such as software and power failures, electrical fires, and flash flooding or high winds. Today, customers expect always-on service, so how can an IT department do its part to avoid downtime when disaster strikes? Each business has different requirements depending on the company’s industry and size, but following these guidelines can help the business prepare its IT infrastructure for a disaster.

Prioritise critical systems

One of the most important components of being ready for a disaster is classifying important systems and processes and mapping out any interdependencies. A financial organisation, for example, probably needs to restore customers’ online access to their accounts before it restores access to internal files. If the servers powering the customer portal are dependent on other systems or a specific power supply, the business must take measures to ensure redundancy of the core servers and power supplies. A financial organisation probably needs to restore customers’ online access to their accounts before it restores access to internal files Of course, prioritising systems and processes should not be conducted by IT alone. Input from other departments is critical to successfully recovering from downtime, because what IT may consider critical is not a top priority for other key departments. This is why the DR plan should be created in conjunction with a business continuity plan. Typically, the business continuity plan is driven by the results of business impact and risk analyses, which have identified the business’s core objectives and departmental priorities. [easy-tweet tweet="What IT may consider critical in #DR is not a top priority for other key departments" user="timicouk and @comparethecloud" usehashtags="no"]

Identify a solution for backing up and recovering an IT environment

Whether it’s legacy systems that can’t keep up with growing volumes of data, a lack of redundancy or storage media corruption, too many businesses realise during a disaster that they aren’t able to recover their data, or worse, their IT environment. In fact, data loss is up by more than 400 per cent since 2012, according to the EMC Global Data Protection Index. It’s important that a business reviews its current backup and recovery solution to ensure the strategy is adequate for the business’s needs and can protect critical data and systems.
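To make this prioritisation exercise concrete, here is a minimal sketch of how an IT team might record critical systems alongside target recovery time objectives (RTOs) and interdependencies. The system names and figures below are illustrative assumptions, not data from the article, and the snippet simply prints a suggested restoration order:

```python
from dataclasses import dataclass, field

@dataclass
class CriticalSystem:
    """One entry in a disaster recovery priority catalogue."""
    name: str
    priority: int             # 1 = restore first
    rto_hours: float          # target recovery time objective
    depends_on: list = field(default_factory=list)

# Hypothetical entries for a financial organisation; real values would come
# from the business impact and risk analyses described above.
catalogue = [
    CriticalSystem("customer-portal", priority=1, rto_hours=2,
                   depends_on=["auth-service", "core-db", "primary-power-feed"]),
    CriticalSystem("core-db",        priority=1, rto_hours=2),
    CriticalSystem("email",          priority=2, rto_hours=4),
    CriticalSystem("internal-files", priority=3, rto_hours=24),
]

# Restoration order: highest priority first, tightest RTO breaking ties.
for system in sorted(catalogue, key=lambda s: (s.priority, s.rto_hours)):
    deps = ", ".join(system.depends_on) or "none"
    print(f"{system.name}: restore within {system.rto_hours}h (depends on: {deps})")
```

Even a toy catalogue like this makes the dependencies visible, so it is immediately clear which supporting systems must be made redundant before a headline RTO can realistically be met.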
[easy-tweet tweet="Data loss is up by more than 400% since 2012" user="timicoUK and @comparethecloud" hashtags="dataloss, DR"] [quote_box_center]For example, Lyco, a specialist lighting e-commerce company based in Milton Keynes, had backed up to disk on-site. But as the organisation’s business grew, management realised the risk of housing backups on-site was too great. They wanted to move backups off-site while reducing recovery time objectives (RTOs). The backup software they were using, however, was not designed to write to a disk at a third-party site, so they switched to the disaster recovery as a service (DRaaS) solution BlackVault Managed Recovery Platform, which uses an on-site appliance in conjunction with a private cloud, BlackCloud. The benefit of this approach is that the organisation was able to manage backups on-site while efficiently sending them off-site to ensure data redundancy. During a disaster, employees are able to access the environment over the Internet or another connectivity option.[/quote_box_center] The appeal of DRaaS solutions is that they provide the ability to recover key IT systems and data quickly (within a 2-4 hour recovery time objective in some cases), which is crucial to meeting customers’ expectations for high availability. Decide how and where employees will resume operations Companies need to have an alternate work environment available at the time of an emergency, whether it’s employees’ homes or rented office space. If renting office space, the facility should be pre-contracted to help ensure it will be available during a disaster. Simply having a space is not adequate, however. Staff members need a way to access their work environment, including documents, business applications and communications platforms such as email and instant messaging. As a managed service provider, we have found that companies are increasingly using DRaaS solutions, which allow employees to access the environment through a VPN or online. Having a backup Internet provider can help ensure a reliable connection will be available.   Have a plan for receiving business phone calls Communication is key in any disaster recovery scenario, so businesses need to consider how they will continue to receive calls. If using landlines, the business should consult its telecomm carrier or managed service provider to review options for rerouting numbers in the event of a disaster. These offerings will expedite the reroute of telephone numbers, rather than calling at time of disaster to have the calls rerouted, which could take hours, if not a day or two. If a business has a cloud-based or voice over IP (VoIP) telephony solution in place, communication options can be remotely managed. Businesses are able to deploy pre-recorded greetings and redirect phones to staff cell phones or an alternate office location. This solution ensures employees can take inbound calls as well as make outbound calls in the event of a disaster. When redirecting calls to cell phones during a disaster, businesses should bear in mind that during a large-scale crisis, overloaded circuits can make it difficult to obtain a signal for placing calls, and emergency services might invoke the government’s Mobile Telecommunication Privileged Access Scheme (MTPAS) procedure. The London bombings of 7/7 is a prime example: for four hours, the network within a mile of Aldgate Tube station was disabled. In these situations, redirecting calls to a landline can provide a more reliable connection. 
[easy-tweet tweet="Do you know how your company plans to communicate in a disaster scenario? " user="timico and @comparethecloud" hashtags="DR"] Document and regularly test the disaster recovery plan Documenting the disaster recovery plan is an important step, because during a high-pressure situation it’s all too easy to neglect key parts of the plan. In addition, if any critical personnel who were involved in the planning process leave the business, subsequent employees can properly implement the plan. The documentation can also be useful if a managed service provider plays a role in implementing any part of the plan. For a plan to reach its maximum effectiveness, however, it should be tested regularly (annually at minimum) to work out any kinks before a crisis arises. Those responsible might balk at the cost of testing the plan because of resources consumed (e.g. bandwidth) and the disruption to daily operations, but the alternative of not testing enough or at all is a risk that could leave a business vulnerable after a disaster. It may help to break down disaster recovery testing into manageable parts until an organisation is able to complete a full test. Some businesses will perform an IT test of specific systems or processes before conducting a full-scale test involving end users. If an organisation is using a DRaaS solution, IT personnel should take advantage of the ability to spin up a sandbox environment so they can test recovery capabilities without affecting production systems.  After successfully completing a test run, the business can schedule a follow-up test, involving end users as necessary. With the right plan in place, a business can cope with a range of disasters – whether a small, localised one like server failure or a region-wide flood – without sacrificing uptime and customers. With the right plan in place, a business can cope with a range of disasters ### IO London Partners with the London Internet Exchange IO, a leading data centre services company, today announced its partnership with the London Internet Exchange (LINX) to provide a direct route for its customers to local networks.  The partnership means that IO will become one of just four LINX vPoPs in the UK, providing a direct route to the main LINX exchange points for customers. IO customers and prospects will benefit from improved Internet connectivity, latency and performance. LINX is a mutually governed Internet exchange point (IXP) which operates peering hubs for over 650 Internet Service Providers (ISPs) and network operators. This provides its member networks with the facility to peer their Internet traffic between one another. LINX helps distribute peering within the UK with multiple sites including regional exchanges in Edinburgh, Manchester and Cardiff, which allow ISPs and network operators to keep their traffic local. The partnership with IO London comes into effect immediately.  According to Nigel Stevens, IO’s UK Managing Director, this “transforms the data centre into a connected asset rather than a standalone house of data.” He said: “Extending our network connectivity into such a rich community of ISPs providing an extensive range of peering services is very exciting for us, and our customers. It will bring best-in-class performance, latency and control options to both our existing and prospective customer base.  “With the Internet dominating both our work and personal lives, Internet-based content is so viral in our society. 
Because of this, the measure of a good data centre is what it’s connected to rather than what’s in it.” LINX’s CEO, John Souter, said: “We’re delighted that IO has joined the growing number of LINX Virtual PoP partners. By becoming a LINX member via IO Data Centres, networks will have the ability to exchange traffic with the rest of LINX’s expanding community of ISPs and network operators in an efficient, and cost-effective, way.” IO launched its London data centre earlier this year. With more than 10,000 square metres, the Slough-based facility features highly resilient power, a dedicated substation, and houses the first international deployment of advanced ECO modules. These deliver ultra-efficient performance by using free, outside air, backed by traditional chilled water technology. The company operates six facilities across three continents, and two of the largest data centres in the world. It is a global leader in software-defined data centres, pioneering the next generation of deployments in this field.

### ‘Everywhere Enterprise’ speeds up the need for cloud security

A whirlwind of employees adopting cloud-based applications and social media platforms in the workplace has brought mobility to the top of the agenda for IT. The accessibility of networks means business now happens at any time, in any place, establishing an era of the ‘Everywhere Enterprise’. It is evident that there is no longer a set perimeter, as mobile devices can connect with ease to 4G networks (soon to be 5G) rolled out in every major city. To put this into perspective, we can even check our emails 200 feet below the surface on London Underground platforms with public Wi-Fi networks. IT struggles to keep up with monitoring such a diverse range of network access points While this workplace shift brings the benefits of increased agility and productivity, it also creates a number of security concerns. IT struggles to keep up with monitoring such a diverse range of network access points. It is inevitably losing control, with the employee now pulling the strings. As a result, the organisation is at risk of cyber security threats coming in from all directions. Along with the evolution of working practices, threats too have developed from viruses to more sophisticated, multi-faceted attacks. These attacks target both the user and the device they operate from, and mobility has made it even easier to breach corporate security measures. Securing users beyond the company walls means moving away from the traditional focus on perimeter defence. [easy-tweet tweet="CISOs require a new approach to security in the age of the everywhere enterprise" user="zscaler and @comparethecloud" usehashtags="no"] In this new age of progressive IT, CISOs require a new approach to security; one that provides consistent policy, protection and visibility to all users and devices, irrespective of location.

Challenging traditional security

The 1990s idea of using security appliances installed in a data centre to protect employees who are on their laptops, sitting in cafés and working via the cloud, no longer makes sense. Not only are security appliances tied to legacy location concepts, dictating limitations to the business rather than enabling it, they also tend to be built for one security function only. This creates an explosion of new appliances in the data centre to keep up with each new threat, all of which must be individually purchased, installed, maintained and updated.
[easy-tweet tweet="1990s approaches to IT security don't work anymore" user="zscaler and @comparethecloud" usehashtags="no"] Appliances also lack the pace to counter evolving threats and fail to meet the architectural flexibility to accommodate new enterprise technology. As a result businesses are operating on outdated security models that don’t provide enough visibility to enable security executives to maintain control.  As such, the traditional ‘block vs. allow’ strategy is no longer fit for purpose in today’s ‘Everywhere Enterprise’. As the working environment evolves, organisations require a shift to a ‘manage and monitor’ approach. After all, prohibiting access to Internet resources will only encourage users to bypass security controls.  Bridging the cloud gap Looking at new strategies to tackle the threats of the digital age means searching beyond the tools traditionally deployed in the enterprise, towards the benefits of cloud delivered security. However, while the return on investment of the cloud solutions has been well documented, the trend towards using them for security purposes has been treated with trepidation. That’s due to the perceived risks that have been driven by ill-equipped security appliances featuring in the majority of workplace architectures today. As cloud applications become more widely accepted and deployed however, CISOs are starting to see the competitive advantages of cloud computing in terms of flexibility, agility and competitive advantage. Why pay for capital investments and the resources to manage them when you could redeploy the money for strategic projects?  New technologies and processes can deliver enormous gains in productivity and efficiency to drive business metrics like revenue generation and customer satisfaction. And that’s not the only critical advantage. Cloud solutions are integral to helping businesses realise advanced security capabilities – most importantly, better visibility.  [easy-tweet tweet="Corporate network omnipotence is no longer a ‘nice to have’, it’s a business imperative" user="zscaler" hashtags="IT"] In today’s complex IT environments, the ability to see how every user, device and application is accessing the corporate network is no longer a ‘nice to have’, it’s a business imperative. The next generation of enterprise security is about the Direct-to-Cloud Network approach. This is much more than blocking threats. It will support critical security protection by enabling IT to take back control. Shifting power to the CISO The proliferation of mobile and cloud technologies has shifted the centre of gravity toward the user. Moving security to the cloud shifts the balance of power back in favour of the CISO. It allows companies to embrace innovation securely, while providing the visibility and controls needed to ensure compliance with corporate policies. It also helps executives to regain control of the enterprise’s digital assets and user activity, whether located internally or externally on the internet, so they can spot potential threats before they escalate. The challenge for today’s CISOs is shifting focus from basic infrastructure projects to more strategic initiatives. Moving security to the cloud is an example of this type of transformational process. It provides business agility, reduced costs and more importantly, it enables CISOs to use security capabilities as a business enabler. 
protection is no longer enough, prevention is now key Many CISOs are beginning to act on the principle that protection is no longer enough, prevention is now key. Forward thinking executives will be investing in cloud-based security to facilitate initiatives in light of this new reality. ### YouGov Powers Global User Experience and Improves Time To Market with MongoDB Market Research Firm YouGov Relies on MongoDB’s Scalability and ’Superior Performance’, Upgrade to MongoDB 3.0 Delivers Significant Cost Savings by Reducing Storage requirements by 70% MongoDB today announced that YouGov, one of the world’s leading market research organisations, relies on MongoDB to reduce time to market and improve user experience. YouGov engages in a continuous conversation with millions of consumers to better understand opinions, behaviours and preferences. To collect this large volume of data, YouGov built a platform called Gryphon using an internal database known as FastStore. Realising that the system would need to scale globally, YouGov migrated from FastStore to MongoDB in 2010. MongoDB’s ease of development increased the pace YouGov could deliver services into production. MongoDB is now the default non-relational database for YouGov, used for dozens of applications within the company. YouGov has experienced a 70% reduction in storage capacity by migrating to the latest release, MongoDB 3.0, which resulted in a significant cost saving. “I had been thinking about how to solve this non-relational data problem for years, then in 2010 when I saw MongoDB, the light bulb came on. I knew we had to use it,” said Jason R. Coombs, Executive Technical Director at YouGov, Plc. “Performance issues due to operating at scale were completely eliminated when we migrated to MongoDB. The implementation is dramatically simpler, which means our developers can focus more time on producing higher value applications.” Gryphon ingests all of YouGov’s global survey data into MongoDB, with each document capturing a user’s response and activity during an interview session. Then the data is served to YouGov’s suite of products, such as BrandIndex, a brand perception tool, or Pulse, a consumer behaviour tracker. YouGov considered migrating to Microsoft SQL Server, but it could not provide the speed of innovation or performance required. YouGov then went into production with MongoDB in 2010, making it one of the first large scale deployments of the database. “MongoDB has many times shown its superiority in performance, which directly impacts our customer experience and engagement. There is also no doubt that there are some applications we would not have built had it not been for MongoDB. It removes much of the complexity developers and ops teams face as new apps are built and rolled out,” explained Jason. YouGov uses MongoDB Enterprise Advanced a version of MongoDB that includes advanced software, support, certifications, and licenses.  With a global cluster of five shards, two in the US and two in Europe, the operations team relies on MongoDB Cloud Manager to run, automate and backup the deployment. “While MongoDB has evolved considerably since 2010, as has our own expertise, we continue to rely on the support that comes with MongoDB Enterprise Advanced. It’s less about break and fix support, and more about proactive, consultative services and tools – such as planning upgrades or advice on schema design for new apps,” concluded Jason. Joe Morrissey, VP of EMEA MongoDB said: “MongoDB is ideally suited for organisations like YouGov. 
Teams who spend every day challenging assumptions about how things should be done, to color outside of the lines and identify smart, efficient solutions to everyday problems. “Seeing one of our earliest adopters weave MongoDB into a culture of constant innovation is gratifying. And it’s inspiring to see how that culture is delivering concrete benefits that propel the business forward. It’s a team we could all learn from,” ### European Golf Tour Hits An Ace With Barracuda Networks The European Golf Tour operates three of the most prestigious professional golf tours in Europe from its headquarters in the UK. The events are nearly all played in Western Europe, with most famous taking place in the United Kingdom, Ireland, Germany, France and Spain. It is also the lead partner in the internationally respected and celebrated Ryder Cup. [easy-tweet tweet="One small IT team supports the entire of the European Golf Tour" user="barracuda and @EuropeanTour " usehashtags="no"] Graham Gifford, IT Director for the European Tour heads up a small IT team that supports one of Europe’s most well known and loved golf tournaments. With 2-3 events staged each week throughout the year, there is no opportunity for network downtime or time to implement complex data recovery processes. There was prevailing pressure to maintain critical applications and data from the organisation’s Exchange, VMware and SQL servers. Gifford had an existing tape-based backup and recovery architecture, but it was proving a massive burden; prone to human error, difficult to monitor and hard to scale in line with the organisation’s needs. Problems were compounded for Gifford's team, as recovery of deleted files was a time consuming and difficult process, which took up a large part of their resources and interfered with other critical work to support the organisation’s high profile tournaments.  Barracuda Makes The Cut  [caption id="attachment_31795" align="alignright" width="300"] Read more on Barracuda's latest announcement here[/caption] Gifford considered several competitor solutions, but selected the Barracuda Backup 1090 after a successful one-month trial. He explains: “We were already big Barracuda fans and had been using the company’s Web Filter and Email Security Service for a few years. We knew they were intuitive, reliable and scaleable - all the things we were looking for in a backup and recovery solution, and all the things we didn’t have with our existing product. In addition, the Barracuda Backup 1090 came with a wealth of features and functionality which other vendors couldn’t offer as an inclusive package. This made Barracuda an obvious choice.” Gifford recalls that the installation process was simple and straightforward, and that unlike many other backup and recovery solutions on the market, there was no need to reboot the network when the agents were added to the servers. He explains: “When we installed our previous backup and recovery solution, we needed to take all our servers offline and reboot them several times before we got the system working. Even then we were never 100% sure that it was backing up everything we wanted it to.” Obviously this leaves nerves a little frayed, fear of data being lost doesn't make for a strong game of golf.  Teeing Up The Network Previously the tape-based solution had provided the offsite backup and recovery required by the organisation, but it came at a high price; recovery was difficult, the solution as a whole was clumsy and unreliable. 
Seemingly basic problems such as tape jams or incorrect tape maintenance caused big problems for Gifford and his team. In addition, any changes to the network architecture or software updates to the servers cause the backup infrastructure to go into meltdown. Gifford now has the Barracuda Backup 1090, which delivers restores either from an appliance on site or remotely from the cloud. It has sufficient in-built intelligence to accommodate any physical or software changes to the network that it protects.   [easy-tweet tweet="Seemingly basic problems such as tape jams caused big problems for @EuropeanTour" via="no" hashtags="cloudbackup"] [quote_box_center]Gifford comments: “The Barracuda Backup 1090 automates many of the routine tasks that are both time consuming and prone to human error, and it copes effortlessly with the many small daily changes that take place with a network such as ours. This has removed a lot of stress from our lives and made the day to day running of the IT department a much calmer affair.   And the best thing of all,” explains Gifford, “is that the entire backup and recovery process can be monitored and controlled from a single screen. This gives us fantastic insight into network performance and a high level of confidence that everything is running as it should be. Any problems are proactively highlighted and we can deal with them before they cause any issues for the organisation.” [/quote_box_center] Help Yourself… The time that Gifford and his team spend on helping colleagues recover their lost or accidentally deleted files has been cut significantly - meaning they have more time to whittle away on the putting green. “It’s so intuitive,” he comments, “that everyone at the organisation can recover their own lost or deleted files with minimal assistance from us." Previously Gifford would have had to go rummaging through the tape archive, trying to identify which particular file they were looking for. Permissions are easy to set and allow everyone at European Golf Tour to access their own files in the archive and recover them as necessary - as easily as if they were looking through a junk folder in their email or the recycle bin on their computer.  Perhaps the biggest benefit for Gifford and his team is the most simple and understated. “It just works,” he enthuses. “Don’t underestimate how rare that is in a piece of IT kit... it makes me a very happy guy.”   Perhaps that’s why European Golf Tour is now testing Barracuda’s Next Generation Firewalls and Load Balancing solutions. With a portfolio of successfully running products already in place, why wouldn’t they want to see what else Barracuda can hit a hole-in-one on?  ### STEM skills gap must be filled, say UK IT professionals Last summer David Willetts, the Science Minister, announced a £52 million investment in new and emerging science talent, creating more than 7,800 education and skills opportunities over a two-year period. But new research published today by IP EXPO Europe, reveals that UK IT professionals want the Government to do more to encourage people into STEM professions.  The Government’s investment in STEM (science, technology, engineering and mathematics) professions promises to deliver 1,360 apprenticeships, 240 traineeships, 150 Industry Degrees, 230 Modular Masters Modules, 5,900 workforce development opportunities, and a cross-sectoral proposal to attract young people into STEM jobs, over two years. 
[easy-tweet tweet="75% of UK IT professionals say the Gvmnt needs to do more to entice students and young people into STEM" user="IPEXPO" usehashtags="no"] Despite this, three-quarters (75%) of UK IT professionals say the Government needs to do more to entice students and young people into these jobs.                                                 Bradley Maule-ffinch, IP EXPO Europe’s director of strategy, says: “It’s disappointing that businesses don’t feel the Government is spending enough on STEM initiatives. These jobs are high in demand and vital in boosting the UK economy so the Government must do more to boost recruitment into these professions.”  [quote_box_center]Mark Morrissey, Director of Education Programs at Cloudera, an exhibitor at this year’s IP EXPO Europe, says:  "There is increased competition to recruit technical talent that inhibits the market growth of several new, disruptive technologies," says Mark Morrissey, senior director of education for Cloudera. "Government, industry, and academia need to find avenues of collaboration to highlight the benefits of an IT related education, and help provide access to the training necessary to pursue STEM related careers." [/quote_box_center] Nine in 10 (86%) of IT professionals in the East Midlands say the Government needs to increase its spend on STEM initiatives, followed by 83% in Wales and 71% of respondents in Northern Ireland. Respondents in the North East were the least critical of the Government’s investment in STEM, with nearly a quarter (23%) saying they feel the Government is doing enough to close the STEM skills gap. IP EXPO Europe, taking place on 7-8 October at London’s ExCel, will serve as a venue for business, government and academia to work together to find solutions to the STEM skills challenge. The event will also explore how businesses are transforming as a result of the latest technological developments, and the macro and microeconomic opportunities and challenges businesses are facing in their journey to digital transformation. For further information, or to register FREE, visit: http://www.ipexpoeurope.com/. ### Superconductors: The future of the data centre? Superconductivity is the property whereby a material exhibits zero electrical resistance, meaning absolutely no energy is wasted and ensuring optimum efficiency. It has the potential to bring revolutionary advances to high end computing, delivering processing speeds far in excess of what is currently possible and yet, it is not a newly discovered phenomenon. It was first observed in 1911 by Dutch physicist Heike Kamerlingh Onnes, but the world still waits for superconductors to achieve mainstream use. [easy-tweet tweet="Superconductivity is the property whereby a material exhibits zero electrical resistance" user="bsquared90" hashtags="datacentre"] The problem lies in the fact that getting materials to exhibit superconductivity is easier said than done. It is only witnessed in approximately 25 elements and predominantly at extremely low temperatures, often very close to absolute zero (−273.15°C). The energy required to cool materials to this extent would negate any efficiencies gained from having zero electrical resistance. As such, achieving superconductivity at higher temperatures continues to generate much scientific research. 
There have been some breakthroughs and scientists have now been able create superconductors at temperatures of -70°C, still impractically cold for most applications, but a sign that, theoretically at least, there is no reason why room temperature superconductors can’t be developed in the future. Superconducting magnets are used in CERN’s Large Hadron Collider to accelerate sub-atomic particles With temperature thresholds still extremely low, applications for superconductors are somewhat limited. Superconducting magnets are used in CERN’s Large Hadron Collider to accelerate sub-atomic particles, while some large electrical providers have experimented with superconducting materials to improve efficiency. However, it is the future applications of superconductors that really promise widespread benefits. [easy-tweet tweet="Superconductors offer a potential solution to the inevitable problems facing the #datacentre industry" via="no" usehashtags="no"] In particular, superconductors offer a potential solution to the inevitable problems facing the data centre industry. The growth of cloud computing, Big Data and related modern phenomena is placing greater pressure on data centres in terms of their processing capacity and reliability, even without considering the vast amount of energy that they consume. In the US, this is expected to reach 200 TWh by 2020, representing approximately five per cent of the country’s total electricity consumption. The limitations of the semi-conductor technology that data centres are built upon, leading to wasted energy and the ever-increasing difficulty of meeting Moore’s Law, are likely to place significant pressures on the industry in the years to come. data centres stand to benefit from a much higher level of efficiency Superconductors offer a potential solution to these data centre challenges, even if they remain theoretical for the time being. Scientists at the National Institute of Standards and Technology are experimenting with the use of two superconducting electrodes to create the 0 or 1 binary values required for superconducting digital computer memory. If they are successful and can scale the technology, then data centres stand to benefit from a much higher level of efficiency. It could, in the words of NIST’s Ron Goldfarb, “revolutionise mainframe computation and data storage within a decade.” [caption id="attachment_22368" align="alignright" width="300"] Image by Julien Bobroff, Frederic Bouquet, Jeffrey Quilliam, via Wikimedia Commons[/caption]  If sustaining computer memory at the low temperatures required for superconductivity can be achieved, it is not only commercial data centres that stand to benefit. Superconductor research is currently taking place that could lead to the next generation of supercomputers. The US government has plans in place to create what would be the world’s fastest computer, one capable of making a quintillion calculations every second (also known as an exaflop), but just as with data centres the limitations of semi-conductor technology are beginning to prove prohibitive. Superconductors offer a way of powering these huge computing resources with far less wasted energy. For example, existing supercomputers consume approximately 10 megawatts of power in order to deliver 20 petaflops of computation. By contrast, superconductor computers promise 100 petaflops of performance for just 200 kilowatts of energy. 
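The efficiency gap implied by those figures can be checked with simple arithmetic. The short calculation below uses only the numbers quoted above, roughly 20 petaflops from 10 megawatts today versus a projected 100 petaflops from 200 kilowatts, to compare computation delivered per watt:

```python
# Figures quoted above: ~20 petaflops from ~10 MW for an existing supercomputer,
# versus a projected 100 petaflops from ~200 kW for a superconducting design.
PETA = 1e15

conventional_flops_per_watt = (20 * PETA) / 10e6        # ~2.0e9  (2 GFLOPS per watt)
superconducting_flops_per_watt = (100 * PETA) / 200e3   # ~5.0e11 (500 GFLOPS per watt)

print(f"Conventional:    {conventional_flops_per_watt:.1e} FLOPS/W")
print(f"Superconducting: {superconducting_flops_per_watt:.1e} FLOPS/W")
print(f"Improvement:     {superconducting_flops_per_watt / conventional_flops_per_watt:.0f}x")
```

On those projections the superconducting design would deliver roughly 250 times more computation per watt, which illustrates why the research is attracting so much attention.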
[easy-tweet tweet="The US government has plans in place to create what would be the world’s fastest computer #exaflop" via="no" usehashtags="no"] ### Digital Data Centres: Preparing for the Future Today How data centres within a wider digital ecosystem can give your organisation a competitive edge. What will the data centre of the future look like? One view is that it will be tall – perhaps very tall. Within the next few years, experts say, data centres could be driven by rail-based robotics capable of scaling your entire rack. This means data centre builders won’t be limited by horizontal expansion space and could fashion efficient structures that scaled upwards, not just out. Deploying robots could also lead to less downtime, as every aspect of data centre management would be forecast and controlled. And as robots can’t see, costly variables like lighting could simply be eliminated. data centres could be driven by rail-based robotics capable of scaling your entire rack While we may not have arrived at the robot-driven, ‘lights out’ data centre of the future quite yet, technology is moving at breakneck speed, putting IT teams under pressure to redefine their data centre strategies. [easy-tweet tweet="#bigdata, #analytics, #mobility and #cloud have become driving forces behind the rapid evolution of digital businesses." via="no" usehashtags="no"] In the past, it’s likely that your organisation built its own data centre, tailored to its needs. Your infrastructure team probably spent most of its time performing routine maintenance and upgrades, and firefighting the end-user problems arising from that centre. But not only do IT staff now have to provide an utterly professional, solid core infrastructure, they also have to deal with their company’s need to succeed in an increasingly digital world. High-speed broadband, big data and analytics, mobility and the cloud have become the driving forces behind the rapid evolution of digital businesses. Firms are making huge efficiencies as they master these digital technologies to improve their operations and their marketing effectiveness, seizing the opportunity to alter the value they can bring to customers. [easy-tweet tweet="#IoT has potential to bring every object, consumer and activity into the digital realm" user="rickycoop" usehashtags="no"] The Internet of Things (IoT) and its potential to bring every object, consumer and activity into the digital realm, is set to vastly multiply the amount of data that businesses and governments process. According to a recent report, by 2020 IoT will drive the creation of more than 25 million apps and 50 trillion gigabytes of data, and see 4 billion people connected to the global network. If your organisation uses data to create differentiating value, this means no let-up for the IT team – quite the reverse. Technology teams can expect to continue allocating many resources to supporting their enterprise’s new digital initiatives, instead of focusing on core IT needs. Organisations won’t have a lot of time to act, either, because these changes to a data-saturated, hyper-connected and broadband world are global – the ground is shifting under our feet. Enter the digital data centre What if you outsourced your data centre requirements to an innovation hub? The need to free up IT teams to focus on areas such as development and innovation has never been more pressing. But what if your data centre provider could help with that? 
What if you outsourced your data centre requirements to an innovation hub – a ready-made, ‘rentable’ scalable environment for planning, launching and refining long-term strategies? What if this digital centre directly connected you to partners who shared synergies with your business, so you could get expert support with implementation and guidance on managing infrastructure in convenient and secure locations? Far-sighted digital businesses are recognising the implications of tapping into data centres that don’t just sell power, space and cooling facilities, but have transformed into digital ecosystems with powerful online tools and fast interconnections to data, business partners and essential services. They are realising that by placing themselves at the centre of these ecosystems and mastering new digital relationships with potential partners, they can scale up, achieve advanced agility and shape business outcomes in new ways. Join the digital centre revolution What might this look like in practice? A digital data centre that is also a vibrant, connected digital ecosystem should be able to give you access to a huge choice of broadband and mobile networks, internet exchange points, content distribution networks and fixed lines. You should also be able to quickly and safely migrate some or all of your organisation’s workloads across a global network. At the same time, you could take advantage of easy integration with public, private and hybrid clouds. The benefits could include: growing more quickly and with less risk with a proven partner for infrastructure management gaining the ability to scale globally by leveraging secure locations in multiple markets connecting with a wide range of cloud providers and Infrastructure as a Service, Platform as a Service, and Software as a Service solutions. Becoming a digital enterprise is no longer simply about becoming more efficient Becoming a digital enterprise is no longer simply about becoming more efficient. It is also about operating in a smarter world where doing business looks very different. In this environment, proactive enterprises are letting market-leading specialists – and one day, perhaps even robots – manage their IT infrastructure. This is allowing them to focus on their core business, manage costs and lower risk. Will your organisation be one of them? Learn more about the future of the Digital Ecosystem at www.digitalcentre2020.com ### Securing Confidential Data by Beating the ‘DropBox Dilemma’  Figures released by Kaspersky have revealed that accidental data sharing by staff in organisations poses a greater risk than software vulnerabilities. 29 per cent of respondents to the Kaspersky study reported they had suffered accidental data leaks by staff during 2014. The study doesn’t specifically single out consumer-file sharing apps like Dropbox and Google Drive as the reason behind this. However, their ever-increasing use in the enterprise cannot be denied.  Employees turn to these well-known apps because of their ease of use. With the majority of them being free to use and easy to install, they are creating a headache for organisations. [easy-tweet tweet="#IT departments are anxious about losing control of files say @BrainloopInc" hashtags="data,dropbox"] IT departments are anxious about losing control of files inside and outside the workplace, especially as compliance requires knowledge of and control over the data’s location.  
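Because compliance depends on knowing exactly where files live, IT teams that do sanction cloud storage often verify hosting regions programmatically rather than assuming the default. The sketch below shows one hypothetical way to do this, assuming an AWS S3 bucket and the boto3 SDK; the bucket name and approved-region policy are invented for illustration, and other providers expose similar metadata in different ways:

```python
import boto3

# Regions this business has approved for storing personal data
# (an illustrative policy, not a recommendation from the article).
APPROVED_REGIONS = {"eu-west-1", "eu-west-2"}

def bucket_region(bucket_name: str) -> str:
    """Return the AWS region an S3 bucket is hosted in."""
    s3 = boto3.client("s3")
    location = s3.get_bucket_location(Bucket=bucket_name)["LocationConstraint"]
    # AWS reports buckets in us-east-1 with a null LocationConstraint.
    return location or "us-east-1"

def check_bucket(bucket_name: str) -> None:
    region = bucket_region(bucket_name)
    status = "OK" if region in APPROVED_REGIONS else "OUTSIDE APPROVED REGIONS"
    print(f"{bucket_name}: {region} ({status})")

if __name__ == "__main__":
    check_bucket("example-shared-documents")  # hypothetical bucket name
```

A check like this can run on a schedule, flagging any store that drifts outside the regions the business has signed off.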
With no centralised management or security, these consumer-grade file-sharing platforms can be a nightmare for IT administrators. Consumer file-sharing tools such as DropBox typically use the public cloud to store the data being shared. Sharing high-value confidential and sensitive data on these public platforms creates serious security and compliance risks. Organisations must keep data secured or face paying a high price. Organisations must keep data secured or face paying a high price. If personal data is compromised, the Information Commissioner's Office (ICO) has the power to fine the offending organisation up to £500,000. Personal data has a wide definition and includes any information that can be used to identify an individual. Organisations that don’t report data loss to the authorities run the risk of being caught out and could face large financial penalties. Secure file sharing and collaboration is achievable and doesn’t require complex processes. The challenge for IT departments is balancing access to confidential data for those who need to view it against keeping it protected from others who aren’t allowed access. The trick to succeeding is to have the right levels of control in place to ensure the business workflow runs smoothly. [easy-tweet tweet="The trick to succeeding is to have the right levels of control in place" hashtags="dataleaks, data, cloud"] It is likely that businesses will soon start to impose bans on the use of consumer storage platforms in the workplace. With this in mind, why not tackle the so-called ‘Dropbox Dilemma’ in your workplace now? Here are a few pointers to putting the right level of control in place to make secure file sharing work in your workplace. When looking to implement an enterprise-ready file sharing and collaboration solution, it is imperative that it is all-encompassing. It needs to be intuitive and integrated into the business workflow so as not to cause bottlenecks and backdoor leaks. Workforces are now more mobile than ever, and for this reason the solution must allow employees to securely manage and collaborate on confidential documents and other information both within the local IT infrastructure and also remotely. Don’t be caught between a solution that gives you convenience and one which gives you security. The two, along with an intuitive interface for less sophisticated users, should go hand in hand. Data leaks are usually down to either careless, clueless or malicious actions. Data leaks are usually down to either careless, clueless or malicious actions. If you take this into account in your security protection, you are eliminating the majority of risks that affect most companies. Introduce a clear document security policy in the workplace and make certain that all employees understand it. Data which is considered to be highly confidential should be secured and accessed only on a need-to-know basis. Implement security options that prohibit alterations from being made to documents unless the user is authorised. Despite attempts to expel consumer file-sharing programmes, employees may still look to use them. To protect against this it is important to monitor and audit your network on a regular basis. Unauthorised activity on the network can be picked up by scanning activity logs on a daily basis. [easy-tweet tweet="Having secure file sharing in place can improve your bottom line says @BrainloopInc" hashtags="data, cloudstorage, cloud"] Having secure file sharing in place can improve your bottom line.
It considerably reduces the risk of sensitive business information being compromised as well as financial vulnerabilities linked to failing to comply with laws and regulations. By introducing the level of security you need in place for file sharing and collaboration, you can stop worrying about security threats. This will allow you to focus on what’s really important – maintaining a successful business that can take full advantage of growth opportunities. ### Friday Night Cloud: Episode 2: The Naughty Bits [embed]http://traffic.libsyn.com/fridaynightcloud/FridayNightCloudEpisode2.mp3[/embed] In Episode 2 we discuss the Internet of (Naughty) Things, Online Gambling Scams, Rubbish of the Net and Internet Security.       ### Preparing for moving to the cloud? Department leaders are now purchasing more and more cloud technology themselves. Our own staff expenses cloud service is used by 65% of NHS trusts and it is nearly always the payroll and finance team leading the change.  These cloud pioneers have led the way and those that we speak to say the same things about the experience: IT is nervous about the implications cloud will have [in terms of their future role] and the board is apprehensive to support something, which is different, an unfamiliar cost model and takes data outside the ‘safe’ four walls of the office.  [easy-tweet tweet="#IT is nervous about the implications #cloud will have on its future" user="Neil_everatt and @comparethecloud" usehashtags="no"] If you’re now making your first foray in cloud, you need to know it won’t be as simple as paying an invoice and sending out login details to your team; you will have to work to win support for your project.  Here are some tips that we’ve learned from working with hundreds of customers, both private and public, to make using cloud applications successful first time. First of all, do your homework. First of all, do your homework. Create a shortlist of possible suppliers, suitable in terms of the service they can provide, but also review them for matching cultures, values and attitudes – you can learn a lot of this from their websites. Look them up on Companies House. How is the firm structured, one shareholder or ten? Is the company a well-established, mature business? Ask the question, how long have they been in business? More importantly, how long will they be in business? There can be no room for ‘fly by nights’. There are plenty of stories out there of suppliers going bust and giving customers just a few days to shift their data. The last thing you want is to have to return to the Board for more funds.    Ask for references. Any established business will be able to present written references and or put you in touch with happy customers.  [easy-tweet tweet="Check the suppliers’ security credentials. Does the supplier have ISO/IEC 27001:2013 certification?" user="neil_everatt" hashtags="cloud"] Check the suppliers’ security credentials. Does the supplier have ISO/IEC 27001:2013 certification? If you work in the NHS, is the supplier rated on the NHS’s Information Governance Toolkit, the internal security check used by NHS organisations?  Visit their premises; we have often hosted NHS data security staff visits. In fact, definitely visit their premises; you don’t want to be buying a critical application from a garden shed business. Check their uptime stats’ too and find out where your data will be held. Check your technology and infrastructure – can you get high-speed access to the cloud application from all sites and remote offices? 
Is the application designed in such a way that it is usable on low-bandwidth networks? Heavy, graphics-rich cloud apps might look nice, but they aren’t always practical. What’s the supplier’s attitude to updates? If they’re going to make massive updates once every year, you might find the latest version doesn’t work on your network; smaller, less intrusive updates are better. [easy-tweet tweet="Heavy, graphics rich #cloudapps might look nice, but they aren’t always practical." user="neileveratt and @comparethecloud" hashtags="cloud"] Also, don’t forget about the end-user experience. End users matter because they are many voices, and vocal ones. Is the cloud service easy to use? Does it need lots of training? Is there a supportive mobile app? How can users share feedback on the service? Is there a forum or voting system for new features? The bottom line is: do your research and be ready to present a thorough case to the board. You shouldn’t have to do this alone either. Ask the supplier to support you with the business case. They will work with lots of customers, so they can create this document for you really easily [but obviously interrogate it]. flick the switch and go for the big bang roll out. My last point would be this: if you’ve made the decision to move to the cloud, done your homework, prepared properly, trained effectively and marketed the new service to internal users, then flick the switch and go for the big bang roll out. Unless you really have to, never run two systems – old and new – simultaneously! It doesn’t work; people don’t like change, and you’ll find it difficult to roll out a new system if the old one is still available. Most of our customers are up and running in just a few weeks, and our customer churn is less than 2 per cent a year – that’s because they invest time up front to get their deployments right and we help them every step of the way. Do the same and you’ll find moving to the cloud a piece of cake!

### Following the Money: Investment in the Cloud

The reach of the cloud is unprecedented in technological innovation, save perhaps for the Internet itself. Mobile users and the Internet of Things (IoT) rely upon the cloud for back-end functionality. Consumers use cloud-based services, from Amazon to Dropbox to Google to Microsoft, often without even realising it. Enterprises use cloud-based applications like Salesforce.com, build applications to use cloud APIs, and are migrating their data centres into the cloud as well. Service providers, ranging from local telcos to international long-haul carriers, are seeing new business in providing direct cloud connectivity for enterprises, and are retrofitting their networks with SDN and NFV to better accommodate cloud applications. [easy-tweet tweet="Today’s #cloud applications and use-cases are spurring vendors and service providers to innovate." user="CloudEthernet" usehashtags="no"] “That’s only the beginning. We have entered an accelerating virtuous cycle: today’s cloud applications and use-cases are spurring vendors and service providers to innovate. Those new innovations lower barriers and inspire more enterprises to use the cloud. The increased business sparks more innovation. The virtuous cycle is accelerating – with no end in sight, as established players and startups jump into this market,” said James Walker, President of OpenCloud Connect. The best place to start when examining cloud innovation: the money trail.
According to Sean Hackett, Managing Director of 451 Research, “You should look at spending and adoption largely being driven by the enterprise. I think IT is exerting a little bit more control because the nature of the application and workload has changed a bit. Definitely the competitive landscape has changed with incumbent systems integrators and managed services providers looking to co-op the definition of cloud and move into the market. There is a change underway: If you follow the money and you look where enterprises are spending, they're definitely moving more revenue from on-premises to off- premises. Most of that money is really, from a cloud context, being navigated toward hosted private cloud which is a fairly rough definition of what I would think of as a traditional cloud environment.” There is a change underway: If you follow the money and you look where enterprises are spending, they're definitely moving more revenue from on-premises to off- premises. Dan Pitt, Executive Director of the Open Networking Foundation, also sees global telcos investing in private clouds. “They want to host enterprise services, starting with elastic cloud provisioning, to enable offload into their clouds.” He also sees investment in software-defined networks, where telcos “will start a new greenfield business with new technology to offer an enterprise-class cloud that's technically a little bit separate from their current network. And so that's how they're getting experience. They're starting small.” Pitt also looks at data centre web-scale operators: “They're investing in white-box and bare-metal solutions for servers we see in storage and now for networking in terms of switching and in terms of routing. Vendors too are investing, in semi- proprietary switching and routing. Everyone is saying, oh, SDN is the big thing so I have to be on the bandwagon. I have to go to my customers and say, yes, I've got SDN solutions. But if you dig a little bit below the surface, you'll see there's still a lot of proprietary technology. Some of it is necessary because there's a lot of brownfield installations and their customers cannot change as rapidly as the technology might change. But they're starting to introduce some of the new technologies”. Dan also sees a lot of money in glass fibers and photonics. “There's been a lot of investment in packet optical integration. The carriers love it. The optical vendors are doing a great job with this. They're still using some of the legacy protocols through existing equipment, but they're also putting in new OpenFlow controls down in their control plane. And in a few cases they're taking it down to the optical element itself.” [quote_box_center]Moving from building networks to thinking broadly about IT is another area of investment, said Chris Rezentes, the Asia/Pacific Regional Manager for Verizon. “The change from building network to new IT solutions means you still need network experts that know the network, but you need also those folks that have the IT experience to make things more efficient and reduce your cost and having the new ideas for revenues. That impacts where investment is going as well. At Verizon we're not seeing as much investment in global networks. We have the global network out there already. 
So you're seeing a shift away from that to more of the IT solutions and internal investment in that area.”[/quote_box_center] Rezentes continues, "Our experience when we've been meeting with those application providers, like the Microsofts and the Amazons, is that they're running into the same situation that we are. We have the experts that know our network and they have the folks that know their cloud applications." You have a lot of customers asking, is the Internet safe? Security is another investment area, says Rezentes, because of enterprise customer concerns. “You have a lot of customers asking, is the Internet safe? Because they haven't been hacked yet, they're willing to go with public internet and no security or firewall. And when they're hit, then it's oh, wait, now we need to change our way of thinking. We're still a little bit in that phase, although there is more awareness on the security side. But companies need to be willing to invest. What to invest in? That may be the million-dollar question really.” [easy-tweet tweet="Enterprises want their #IT partners to help drive business efficiencies" user="CloudEthernet & @comparethecloud" hashtags="cloud"] Enterprises want their IT partners to help drive business efficiencies, rather than emphasise network infrastructure, said Amit Sinha Roy, Vice President of Tata Communications, and that shift is also driving vendor investment. "Enterprises want to push out the infrastructure as far as possible, as far as regulation and security allows and as far as the confidence they have in their service provider, be it the cloud or the network for a telco. The enterprise is facing the same issues, and more, of getting the people who can perhaps maintain and grow and drive that system, and even design and architect the solution. That's going to keep driving forward in terms of offloading and investing further into these relationships.” ### Cloud and the move to managed services It’s now fairly universally agreed that the “break-fix” model of IT support is itself broken, and in need of a fix. It doesn’t work in the best interests of either the IT support provider or the customer. Only engaging with IT support when something has broken means that communication is predicated upon frustration and time-criticality – which isn’t a recipe for a harmonious relationship. In everyday life, we have a tendency to harbour suspicions about those who fix things when they go wrong – no one takes their car to a garage to be fixed and is happy when the mechanic finds additional problems that require fixing. As for the IT support company itself, the business model is a reactive one, beholden to IT failures. There’s no impetus to do a job that’s better than ‘just good enough’ – why would a provider fix things so well that the customer never needs to come back? [easy-tweet tweet="The “break-fix” model of #ITsupport is itself broken - and the #MSP model could be the solution" user="alistairforbes" usehashtags="no"] The Managed Service Provider (MSP) model, however, gives clients a far better service than the “break-fix” model – detecting and solving problems before they arise, rather than responding to emergencies. So many businesses rely so heavily on their IT systems that a problem simply means business stops – the backup systems that used to work are just no longer viable – you may be able to switch from mobile phone to landline if it stops working, for instance, but the chances are all of your contacts are stored either on the device or in the cloud. 
If IT systems are unavailable, even for a short time, it’s time wasted – no matter how short the gap between the break and the fix. The MSP instead builds and manages a system that is designed to prevent the break happening in the first place – a better quality service, paid for by a regular fee. IT support is no longer rewarded for things going wrong, but for things going right. However, shifting to managed service instead of break-fix can be complex. The best MSPs excel at process management, understand customer requirements, and have insights into market conditions and new technologies, but getting to that point demands a lot of time and effort and a different mind-set and skill set. Cloud is often dismissed as a marketing buzzword Cloud is often dismissed as a marketing buzzword, but this doesn’t do justice to the difference cloud makes in easing the transition from break-fix to MSP. Cloud-based management tools mean that on-boarding is simpler, less maintenance is necessary, and most importantly, that MSPs can spend their time focusing on their clients and their quality of service to those clients. There simply is no need to worry about having to maintain and update the systems and tools the MSP uses to do their job. The focus is put in the right areas. Cloud-based tools are not the complete solution, however - moving away from break-fix and towards MSP relies on more than just the introduction of cloud-based IT monitoring and proactive security management. With a constantly growing number of technology options – partly driven by the growth of cloud infrastructures - being a perpetually valuable MSP relies on offering customers a reliable and deep knowledge of a wide variety of technologies and the experience from which to advise on the best courses of action. The issue MSPs will start to face is the commoditisation of their services. By offering insight and input into the contribution IT can make to the overall success of a business, rather than simply being reactive, MSPs can become more of a partner to their customers, a far cry from simply fixing broken IT equipment. This sort of relationship, however, only comes with time and trust. [easy-tweet tweet="#Cloud has created a #businessIT landscape that inherently requires a different approach from IT service providers" user="alistairforbes" usehashtags="no"] Cloud has created a business IT landscape that inherently requires a different approach from IT service providers – as more choice has led to more confusion. One of the biggest challenges for companies in recent years has been IT fragmentation – different departments using different tools to meet their needs, and ‘shadow IT’ as people circumvent the limitations of the solutions with which they are provided by using their own solutions. The ambition for IT support to break out of the break-fix mould was already there – cloud is simply the all-important catalyst that enables MSPs to present a unified IT landscape to their customers and serve them effectively, despite the inevitable constant changes in business technology. The ambition for IT support to break out of the break-fix mould was already there - cloud is just the catalyst ### What is a blockchain? Building trust in bitcoin Financial processes are dependent upon mutual trust between all parties involved. The reason why individuals borrow money from a bank rather than, say, the wealthiest person in their town, is because banks offer a greater level of security. 
However, using banks as the facilitator for all monetary transactions is not always practical and, in any case, the 2008 financial crisis has damaged the trust upon which banking depends. In this climate, bitcoin and similar cryptocurrencies have emerged, but they too are faced with the difficult task of creating a mutual feeling of trust amongst users. When you factor in that banks have been ingrained within society for centuries while bitcoin is just a few years old, this challenge becomes all the more daunting. [easy-tweet tweet="#Blockchains contain records of all the transactions that have ever taken place using #bitcoin" user="bquared90" usehashtags="no"] The means by which bitcoin and similar cryptocurrencies create trust is a network-based ledger known as a blockchain. Much like the traditional ledgers used by banks all over the world, blockchains contain records of all the transactions that have ever taken place using the bitcoin currency. Unlike bank ledgers, however, blockchains are handled by a network of autonomous computers, not under the control or influence of any single individual or institution. Not even the operators of the blockchain’s various connection points, or nodes, can tamper with this inviolable ledger. the blockchain is self-regulating Bitcoin’s blockchain ledger is maintained by this network of nodes, essentially a network of computer owners that have downloaded a set of software tools that enable their devices to interact with other members of the network. The idea behind this is that each node checks the viability of every single blockchain transaction, ensuring bitcoins are not double-spent and only legitimising transactions once they have been checked against the existing ledger. To incentivise the community to legitimise transactions, bitcoin “miners” are rewarded for dedicating computer resources to this process in the form of new bitcoins. Thus, the blockchain is self-regulating, as it is in the interests of all members of the network to reinforce the legitimacy of the currency. Without this level of trust, bitcoin becomes worthless. Perhaps more powerful is the emphasis on transparency lying at the heart of the blockchain. All records are public: anyone is free to view a history of all the recorded bitcoin transactions. When combined with the decentralised nature of bitcoin, this offers a level of transparency that stands in stark contrast to the impenetrability of large-scale financial institutions. However, as many media outlets have been at pains to point out, bitcoin’s transparency only goes so far, which has both advantages and disadvantages. Anyone viewing the blockchain will not be greeted with a list of names, IP addresses, or even recognisable purchases. Instead, each transaction will be accompanied by a string of letters and numbers ranging between 26 and 34 characters. Each address is connected to an owner and he or she is free to share this address so that money can be paid into it, for example, but outside of this there is no easy way of making a personal connection to a blockchain address. This level of anonymity has, of course, led many to associate bitcoin with illegal actions, as the facilitator of a slew of shady transactions taking place on dark web black markets. 
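To make the self-regulating ledger described above more concrete, here is a minimal, purely illustrative Python sketch of a hash-chained ledger that any node could independently re-verify. The block structure, field names and addresses are hypothetical simplifications, not bitcoin's actual format or consensus rules.

```python
import hashlib
import json

def block_hash(prev_hash, transactions):
    # Hash the block's contents together with the hash of the previous block.
    payload = json.dumps({"prev_hash": prev_hash, "transactions": transactions},
                         sort_keys=True).encode("utf-8")
    return hashlib.sha256(payload).hexdigest()

def append_block(chain, transactions):
    # Each new block commits to the current tip of the chain.
    prev_hash = chain[-1]["hash"] if chain else "0" * 64
    chain.append({"prev_hash": prev_hash,
                  "transactions": transactions,
                  "hash": block_hash(prev_hash, transactions)})

def verify_chain(chain):
    # What every node does in spirit: re-check every link in the ledger.
    for i, block in enumerate(chain):
        if block["hash"] != block_hash(block["prev_hash"], block["transactions"]):
            return False  # a block's contents were altered after the fact
        if i > 0 and block["prev_hash"] != chain[i - 1]["hash"]:
            return False  # the chain of references is broken
    return True

ledger = []
append_block(ledger, [{"from": "addr_1Abc", "to": "addr_3Xyz", "amount": 0.5}])
append_block(ledger, [{"from": "addr_3Xyz", "to": "addr_1Abc", "amount": 0.2}])
print(verify_chain(ledger))  # True; edit any earlier transaction and this becomes False
```

Because every block commits to the hash of the one before it, tampering with any historical transaction invalidates every subsequent link, which is what allows a network of mutually distrustful nodes to agree on a single history.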
[easy-tweet tweet="This level of anonymity has led many to associate bitcoin with illegal actions" user="bsquared_90 and @comparethecloud" hashtags="bitcoin, fintech"] However, with enough legal backing and some investigative know-how law enforcement agencies have broken through bitcoin’s veil of anonymity. The FBI’s seizure of more than $3 million worth of bitcoins when it brought down the online marketplace Silk Road likely relied on bitcoin’s traceability. Unlike cash payments, digital currencies such as bitcoin always lead somewhere and although the exact details surrounding the Silk Road case are unknown, the public nature of the bitcoin ledger is more likely to have aided the prosecutors than the criminals. bitcoin has a long way to go before it gains the level of respect and dependability that long established physical currencies have Conversely, outside of the reaches of subpoenas and other more extreme legal methods, the identity of bitcoin users is largely protected. This brings with it many benefits, particularly for the millions of people all over the world without a bank account.  Similarly, in Saudi Arabia women are not allowed to open a bank account without their husband’s permission, meaning that they do not have full control over their own finances. Bitcoin offers these individuals and many others without financial autonomy the opportunity to conduct transactions on their own terms. In this respect, the anonymity offered by bitcoin is hugely empowering. Of course, bitcoin has a long way to go before it gains the level of respect and dependability that long established physical currencies have – in the history of world finance, it may ultimately turn out to be nothing more than a short-lived experiment. But for now it offers an exciting alternative to the monopoly of global financial institutions: disruptive and empowering, anonymous yet transparent, decentralised but secure. ### Automate for a more successful private cloud deployment It’s no wonder that an ever growing number of organisations are investing in virtualised, automated data centres and private clouds, considering the greater agility and better security offered compared to traditional client-server architectures or public clouds.  Despite the storage elements and server in the private cloud being largely automated, it is still likely that the network is provisioned and configured manually. To take full advantage of the various benefits that a private cloud can offer, organisations need to prioritise investing in scalable, automated network control. This will help avoid any deployments being delayed due to the legacy process. [easy-tweet tweet="Organisations need to prioritise investing in scalable, automated network control" user="abarirani" hashtags="cloud"] There are a series of phases that any IT department will go through as its private cloud infrastructure matures. The first of these stages is to pilot projects. At this stage, the IT team will take advantage of any non-critical applications and workloads, and use these to test out the cloud’s infrastructure and design. This allows the IT department to gain both experience and make any necessary changes ahead of moving onto business-critical workloads.  Once confident in their cloud’s design, they can ascend to the second phase, “production”, when one or a small number of business-critical workloads are moved onto the cloud. 
This allows the IT department further opportunity to make any necessary changes to the private cloud before fully rolling out the initiative.  The final phase sees the large scale-out of the private cloud. The transition will involve moving to geographically-dispersed private cloud environments in multiple data centres, and may also involve multi-vendor cloud platforms.  Stumbling blocks No matter the size or scope of a project, if the deployment of the private cloud is not in sync throughout the entire process, the operation can prove too risky for a business. We commonly see the server team handling the virtualisation component, but all network aspects may be tackled by a completely different team. The disparate groups that deal with the private cloud present a major challenge to any project. We commonly see the server team handling the virtualisation component, but all network aspects may be tackled by a completely different team. A possible result of this is a lack of visibility for the network team into the virtual machine (VM) resources as they’re created and destroyed. This can make it difficult to track and manage sudden spikes in demand. There is little point in networking teams even trying to comply with audit and security policies if they don’t have this visibility, as they won’t have access to accurate information regarding which DNS records and IP addresses are assigned to which VMs at any given time. There are numerous factors, including applications, locations and users, which need to be tracked for VMs as well as networks, DNS zones, and IP addresses. Server admins may well have access to this information, but it’s likely the networking teams will not. And by still using manual methods to react to the creation and deletion of VMs, their responses will often be slow.  Time is money Speed is an important factor to be considered, as a private cloud is only as fast as its slowest component. Due consideration must be given to the core network services when building a private cloud, such as assigning IP addresses and DNS records, so that VMs can be commissioned and decommissioned in a matter of moments. [easy-tweet tweet="Rapid delivery is a conditional promise of private #cloud." user="abarirani and @comparethecloud"] Rapid delivery is a conditional promise of private cloud. Manually provisioning DNS records and IP addresses in a virtual environment can delay delivery by hours or even days. This process can be inaccurate and inefficient, and may result in a virtual wasteland of unused IP addresses and DNS records. Moreover, just a few small keystroke errors could lead to IP address conflicts, resulting in significant downtime in the private cloud environment. Unreliable DDI (DNS, DHCP and IP address management) services can pose a significant threat to any organisation, with the risk of a potentially costly network outage. And the risks extend beyond the network itself. If the IP addresses of VMs are being used for billing internal “customers,” then manual processes may even lead to inaccurate charges, resulting in either lost revenue or disgruntled customers.  The importance of highly available DDI services in providing scalability and resilience cannot be overstated for private clouds running critical workloads, or those spanning several geographical locations. 
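As a rough illustration of the automation being argued for here, the sketch below shows the kind of hook a private cloud orchestrator might trigger when a VM is created or destroyed, so that IP allocation and DNS records stay in lockstep with the VM lifecycle. The DDI client, function names and address range are hypothetical placeholders; a real deployment would call its IPAM/DNS vendor's actual API.

```python
import ipaddress

class SimpleDDI:
    """Toy stand-in for a DDI (DNS, DHCP and IP address management) service."""

    def __init__(self, cidr):
        self.free_ips = iter(ipaddress.ip_network(cidr).hosts())
        self.dns = {}      # fqdn -> ip
        self.leases = {}   # vm_id -> (fqdn, ip)

    def on_vm_created(self, vm_id, hostname, zone="cloud.example.internal"):
        # Called by the orchestrator at the moment a VM is provisioned.
        ip = str(next(self.free_ips))
        fqdn = f"{hostname}.{zone}"
        self.dns[fqdn] = ip
        self.leases[vm_id] = (fqdn, ip)
        return fqdn, ip

    def on_vm_destroyed(self, vm_id):
        # Reclaim the address and delete the record, so nothing is orphaned.
        fqdn, ip = self.leases.pop(vm_id)
        del self.dns[fqdn]
        return fqdn, ip

ddi = SimpleDDI("10.20.0.0/24")
print(ddi.on_vm_created("vm-0001", "app01"))  # ('app01.cloud.example.internal', '10.20.0.1')
print(ddi.on_vm_destroyed("vm-0001"))         # address and record released immediately
```

Because allocation and clean-up happen in the same events that create and destroy the VM, there is no window in which the network team's view of DNS and IP assignments drifts away from what is actually running.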
And, looking forward, any limits to the scalability of a network could prevent deploying any additional tenants and VMs required to meet the demands of an organisation’s growth. Don’t take your chances - automate Success in deploying a private cloud is largely dependent on an organisation both understanding and giving urgent consideration to all critical factors Success in deploying a private cloud is largely dependent on an organisation both understanding and giving urgent consideration to all critical factors, such as those mentioned above. Hinging its approach on principles of automation, visibility and integration can help an organisation take effective control of its private cloud deployment.  In the beacons of successful private cloud deployments, we frequently see the management of storage and compute as being heavily automated, supporting the agile delivery of low-cost services to lines of business, resulting in tangible benefits delivered across the organisation. ### Is a cloud or on-premise model preferable for UC? When it comes to their communications systems, enterprises are battling technological and vendor confusion. Microsoft has disrupted the market enormously by introducing Microsoft Lync and the recent release of Skype for Business, leaving traditional players like Avaya and Cisco scrambling to respond to a new competitive threat. This has in turn led to confusion in the user market. Meanwhile there is a shift to the IT department becoming a service organisation within itself, treating employees as customers and using technology to allow individuals to achieve their goals of increased efficiency, collaboration and productivity through a transformational change of the business. As a result, IT departments are beginning to realise that often it’s cheaper to outsource basic communication services so that they can focus on transformational business change. This is true with Unified Communications (UC) in particular. [easy-tweet tweet="#IT departments are beginning to realise that often it’s cheaper to outsource basic #communication services" user="AzzurriComms" usehashtags="no"] It can seem like an easy decision to make but there can be some important considerations. When deciding whether to manage their own systems on-premise, or subscribe to a cloud or hosted managed service, the question organisations need to ask is “What is the business benefit from us owning and managing this system ourselves?” The changing role of on-premise On-premises software is installed and runs on computers on the premises of the organisation rather than at a remote facility. Many companies have on-premise software that is managed in-house. Although the company may contract with a Cloud Service Provider (CSP) for some aspects of their UC, in the end the company is responsible for its own system. There are times when having an on-premise service is absolutely critical – such as a hospital campus – where it is imperative to have a telephony service on-site. UC is part of an essential business transformation designed to improve customer relationships But the truth is, while UC is part of an essential business transformation designed to improve customer relationships, productivity and efficiency, agility, and business continuity, it is rarely a competitive differentiator within a company. If there is a clear case for competitive differentiation, availability, security or cost reasons, then on-premise can be the way to go. 
However, with the range of Unified Communications as a Service (UCaaS) offerings available today, an on-premise model is no longer the default option. Unified Communications-as-a-Service Meanwhile pure ‘cloud’ services are often still like UC on-premise and offer little to no flexibility or personalisation capabilities, leaving IT departments to tailor their businesses to the solution they have purchased. But a cloud managed service offered by a CSP can be tailored to individual customers and preferences. Below are several ways that a cloud managed service model can offload risk, and allow an organisation to focus on the business-critical levels of change. Vendor and Technology Decisions: The UC market is a fast-growing and fast-developing market. New vendors and technologies are emerging, and significant vendor consolidation is occurring. This flux makes it difficult to ensure that a three-year vendor and technology decision is appropriate for the project lifetime. Cloud managed services mitigate the risk associated with vendors and technology by contracting to meet a set of end-user features within an SLA. Any change in the market, such as a vendor exiting or a product reaching end of life, becomes the responsibility of the CSP. Installation and Migration With a cloud managed service, the CSP takes on the responsibility of an ordered and smooth transition to the new service, mitigating any risks involved in migrating users and ensuring a swifter RoI. A CSP also has the benefit of providing a standard cloud platform: provisioning a new customer on existing equipment designed for the task is faster, more efficient and more reliable than building a new environment. Service Assurance A cloud managed service provides a base UC platform to all customers. Although each customer has different requirements, the base platform is the same. This allows a CSP to use a number of different systems to monitor performance and maintain as close to 100 per cent availability as possible. System maintenance is both scheduled and reactive, managing system releases as they are available and tested, but also able to react quickly to any vendor alerts of vulnerability. Managing Expenditure and Costs When a CSP quotes the cost of the service, it is committed to delivering the service at that cost, mitigating the risk of any unanticipated expenditure for the lifetime of the contract. Any change in vendor, technology or products remains the service provider’s liability. Security and Compliance In a cloud managed service, the service provider mitigates the risk associated with malicious or accidental attacks on the service as part of the SLA. The CSP works in tandem with suppliers to maintain a robust system, with timely system maintenance to protect against threats as they become known. [easy-tweet tweet="Equipping workers with the right #UC tools will drive efficiency, collaboration and productivity." via="no" hashtags="cloud"] Equipping workers with the right UC tools will drive efficiency, collaboration and productivity. However, UC on-premise and fixed cloud services do not offer the flexibility needed to support the UC requirements of businesses today. To fully reap the benefits of UC, it is hard to look beyond a cloud managed service. UC-as-a-Service leaves commodities such as internet connectivity and IP systems to a skilled CSP, which can be a better and cheaper option. 
Once these basics are covered by a trusted partner, this leaves the IT department free to focus on enabling transformational business change. ### Managing hybrid cloud The decision to implement cloud computing is just the first step in a long journey. Each organisation then has to choose the mix of services that is best for their needs based on cost, SLAs and whether they have the in-house skills to manage a particular service. The result will probably be a portfolio of services from different providers plus some retained in house services – a so-called ‘hybrid cloud’. In this article we will look at the challenges of implementing and managing hybrid cloud, including choosing the right services, standards, management and secure access. So what exactly is hybrid cloud? The term was originally coined to mean on-premises private cloud integrated with community and public cloud. Products such as SalesForce, Google Apps and Microsoft 365 integrated into corporate desktops can be considered as hybrid cloud services, but they provide point applications and services only. I define hybrid cloud as a mix of two or more clouds used to provide core and common services to a user community. It is in effect just a way of provisioning services, which could be IaaS, PaaS or any other ‘aaS’. [easy-tweet tweet="I define #hybrid #cloud as a mix of two or more clouds used to provide core services to a user community" user="fordwaybloke" usehashtags="no"] Choosing the right services To make hybrid cloud work, organisations need to get their service management capabilities right. First, they need to define and understand the characteristics of each service they want. Then they need to map it to the available options and choose the most appropriate ones for their needs, which could be provided either internally or by a third party. Finally, they need to independently manage the service and monitor it against SLAs themselves. They need to have an audit function to ensure that the service is and remains fit for purpose and independent service monitoring and management either in house or contracted through an independent third party to ensure the provider actually provides what they are contracted to. A new role has been developed: cloud service broker This has led to the development of a new role: that of cloud service broker, someone who will both define the services and then determine the most appropriate way to provide, manage and secure them. CIOs could allocate the role of service broker to a member of their IT team to manage the cloud vendor/s if the organisation has the capability to act as broker. If they do not have the skills in-house then a third party can provide this service. Think about standards Once an organisation has chosen to provision services with different providers it will need the ability to integrate those services. Toolsets to do this and remote management and reporting capabilities are evolving within both the commercial and open source worlds. Early hybrid cloud options were quite proprietary, as standards take time to catch up. We now have standards in areas such as interoperability, web, authentication etc. and these will help to increase the spread of hybrid cloud services. However, an integrated cloud offering is still a work in progress for most organisations. It is relatively easy to integrate web services, but much harder for legacy IT services. 
[easy-tweet tweet="Once services are in the #cloud, they still need to be actively managed and monitored" user="fordwaybloke" usehashtags="no"] When defining, running or buying services, organisations should make sure that the interfaces used are as standard as possible e.g. XML, SOAP, REST, SAML etc. It is worth noting that this is rarely in a vendor’s best interests. Smaller vendors may be better at developing services with standard interfaces, so it may be better to choose services from challenger vendors rather than from the very large vendors who may use proprietary interfaces. CIOs should also ensure that they have or have access to good expertise around integration and migration, as these are the areas which always cause the most problems. Once an organisation has moved one or more services to cloud, the portfolio of services still has to be actively managed and performance monitored against SLAs to ensure it receives the contracted service. This means considering: Authentication to multiple services Auditing Billing control and management Service integration Service assurance Monitoring and managing a hybrid cloud Monitoring requires an audit function to ensure that the services your organisation has chosen are and remain fit for purpose. Organisations whose IT service is now dependent on multiple cloud and other external suppliers will want to know: How well are my service providers performing against contractually agreed SLAs?  If they are not performing, where is the problem?  This is particularly important where multiple providers are responsible for elements of the IT service. Is the aggregated service delivering suitable performance to my user community?  This is leading to a growth in new services (Cloud Monitoring as a Service, or CMaaS) to monitor the performance of multiple suppliers, all of whom will claim ‘it’s not their fault” when a problem arises. The aim of such services is to provide organisations with full visibility of how well each individual provider and the overall IT service are performing. They should consolidate events and other performance statistics across the IT supply chain, showing overall service health and providing the ability to drill down into specific services where required. [easy-tweet tweet="CMaaS monitors those who claim it’s not their fault when a problem arises. " user="fordwaybloke and comparethecloud" hashtags="cloud"] When choosing a monitoring service, look for integration with public cloud services (e.g. Office 365, Salesforce, Huddle, Google Apps), IaaS and PaaS services (e.g. Microsoft Azure, Amazon Web Services and Google’s App Engine).  Some services can also be used to monitor traditional IT services such as in-house environments, plus hosted and private cloud services where agents can be deployed or gateways installed into the monitored environment.   They should pull together service availability and other performance information and synthetic transactions against defined services and applications, showing overall system health, response times and latency. Where contractually allowable, agents and gateways can also be placed inside a service provider’s infrastructure to provide more detailed information. The service should provide an overview and enable you to drill down into each of the elements to identify where an issue resides and the potential causes of performance issues.  Why choose hybrid cloud? In my experience, medium sized businesses face the greatest challenges in funding their IT infrastructure. 
In my experience, medium sized businesses face the greatest challenges in funding their IT infrastructure.  With a small in-house team, they are unlikely to have the diverse range of skills required to run a complex IT infrastructure, and hence either have to take a ‘best guess’ approach or turn to external experts on a regular basis. Developing a hybrid cloud infrastructure with carefully chosen use of managed cloud services will enable them to focus internal resources on the services most critical to their business. For further information on hybrid cloud, contact Fordway.  ### Delivering Strategic IT Intelligence CIOs need better insight into the performance of the IT infrastructure to support strategic decision making and continually improve IT utilisation. Many CIOs, therefore, are understandably aghast at the loss of hard-won visibility into on-premise systems when they move to the cloud. When MSPs offer nothing more than basic Service Level Agreement (SLA) metrics, how can the CIO ensure IT evolves in line with business objectives? Without current and predicted IT performance insight, does the financially advantageous move to the cloud incur too much risk? [easy-tweet tweet="How can the #CIO ensure #IT evolves in line with business objectives?" user="managed247 and @comparethecloud" hashtags="cloud"] Business Intelligence Over the past decade, CIOs have started to transform the quality of IT performance information to better support strategic decision making. While still constrained by the challenge of implementing and maintaining the diverse monitoring systems required to provide in-depth insight, there is no doubt that many CIOs are working hard to deliver more information about user experience, system productivity and trends in performance. At best an MSP will offer SLA-related metrics - there will be no insight into how systems are actually performing Any gain in IT insight disappears, however, when systems are moved to the cloud. Shift the on-premise infrastructure to either public or private cloud and the CIO is at the mercy of third parties with little or no vested interest in delivering relevant information. At best an MSP will offer SLA-related metrics. There will be no insight into how systems are actually performing; no flagging of concerns regarding potential problems downstream that could affect user productivity; no insight into the alignment of infrastructure with the business roadmap. Go to a public cloud such as Microsoft Azure or Amazon AWS and the information is even more limited. So where does that leave the CIO when the board starts grumbling that email performance has dropped since the move to the cloud or, more critically, when asked what additional investment would be required to support a key new business opportunity? [easy-tweet tweet="Any gain in #IT insight disappears, however, when systems are moved to the #cloud" via="managed247" hashtags="bigdata"] CIOs appear to be faced with a stark choice: opt for the financial benefits of the cloud and lose any hard-won improvements in strategic insight. It doesn’t have to be this way. A fully comprehensive monitoring service that can run across on-premise, public and private cloud infrastructures fundamentally changes the game for CIOs. With real-time monitoring across the entire infrastructure, combined with business intelligence, a CIO can not only track performance and map that back to user experience today but also accurately predict requirements well in advance.  
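As a rough sketch of what the simplest form of such cross-environment monitoring might look like, the snippet below runs a synthetic transaction against a handful of service endpoints, wherever they happen to be hosted, and flags anything slower than an agreed threshold. The URLs and thresholds are hypothetical placeholders; a production service would add authentication, deployed agents and far richer metrics and reporting.

```python
import time
import urllib.request

# Hypothetical endpoints spanning public cloud, private cloud and on-premise services,
# each paired with the response time (in seconds) agreed in its SLA.
SERVICES = {
    "email (public cloud)":  ("https://mail.example.com/healthz", 0.5),
    "erp (private cloud)":   ("https://erp.cloud.example.com/ping", 1.0),
    "intranet (on-premise)": ("https://intranet.example.local/status", 0.3),
}

def probe(url, timeout=5.0):
    # Time a single synthetic transaction against a service endpoint.
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=timeout) as response:
        response.read()
        return time.monotonic() - start, response.status

def run_checks(services):
    for name, (url, sla_seconds) in services.items():
        try:
            elapsed, status = probe(url)
            breach = elapsed > sla_seconds or status != 200
            print(f"{name}: {elapsed:.2f}s (HTTP {status}) {'SLA BREACH' if breach else 'ok'}")
        except OSError as err:
            print(f"{name}: unreachable ({err})")

if __name__ == "__main__":
    run_checks(SERVICES)
```

Collected over time and mapped back to which provider owns each endpoint, even measurements this simple begin to answer the questions raised earlier: how each supplier is performing against its SLA, and where the problem sits when the aggregated service degrades.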
Conclusion The pressure on the CIO continues to mount – especially as other parts of the business begin to explore big data to gain greater operational insight. With data driven decision making at board level, the CIO needs to be able to demonstrate with confidence how the infrastructure is performing today but also be able to predict accurately the needs for the next two, three or four years ahead.  The good news is that there is no need to be torn between the cost benefits of the cloud and the board-level demand for detailed, strategic insight into both current performance and future requirements. With deep dive, infrastructure wide performance insight across on-premise, public and private cloud, CIOs gain the best of both worlds - there is no longer any need to compromise. ### The great rivalry between Azure and AWS Reflecting on the long game! Every now and then it pays to step back and reflect on the broader trend rather than the monthly grind. We decided to reflect on how the major cloud players had fared so far this year. All year Microsoft has topped the Compare the Cloud global #CloudInfluence rankings, but in July Amazon moved ahead. In August Amazon again held on to the top spot, demonstrating that there has been a possible shift in mind share in the cloud market. [easy-tweet tweet="Amazon is maniacally focused on efficiency, spending no more than it needs on anything. " user="BillMew" hashtags="AWS"] The monthly rankings tracks the level of “buzz” associated with each of the top organisations and individuals in the cloud sector– the level of opinion each is generating, the extent to which this is resonating and the influence it is having on others. There are two main drivers for this “buzz”: Marketing and PR, where firms seek to generate and promote the opinions of their firms and executives The ecosystem, where a firm’s partners, clients and other advocates help spread such opinions and enable them to go viral. Microsoft and Amazon are very different beasts. Microsoft has a massive marketing machine – one that has served it well for decades as it has built and maintained one of the most influential brands in either the technology or business sphere. Amazon has a very different culture, brand and approach. It is maniacally focused on efficiency, spending no more than it needs on anything – including marketing. It runs a very lean marketing and PR structure, often appearing to journalists as aloof or inactive by comparison to its rivals. Instead it focuses on innovation and on rolling out a constant stream of new services to its growing ecosystem of partners, clients and advocates. Until recently the Microsoft marketing machine held sway overall, with a consistent #CloudInfluence lead over Amazon each and every month – often by a considerable margin. However the Windows 10 launch has not only been a distraction for the firm, hogging much of its marketing focus of late, but it has also proven to be an unsettling period for its channel partners. [easy-tweet tweet="Will @Microsoft regain the lead once the dust has settled from the Windows 10 launch?" user="BillMew" usehashtags="no"] All the others appear to be jostling for third place with Apple, IBM, Google and Oracle appearing in the top ten every month and Cisco, Intel and Alibaba making several recent appearances. The big question however is… has there been a major shift in mind share with Amazon stealing top slot, or will Microsoft regain the lead once the dust has settled from the Windows 10 launch and it refocuses on Azure? 
We’d welcome your thoughts – tweet your opinions using #CloudInfluence – and come back next month to see which firm tops next month’s rankings. NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Getting Up to Speed with European Data Privacy Reform If you live in the UK or anywhere else in the European Union for that matter, you've likely been following the new EU data regulations reform movement which is expected to establish a consolidated data protection policy framework for all 28 member states. In case you're unfamiliar with the legal proceedings or if you're looking for more details, we've provided background information on the current data regulation legislative standard and outlined what the unified proposal entails. It can be easy to miss the big picture when combing through paragraphs of legal jargon, but hopefully this summary will help pinpoint the key implications and explain how you can proactively respond! The Current State of EU Data Protection Regulation is outlined below. The EU Data Protection Directive  As of right now, Europe is subject to the EU Data Protection Directive (Directive 95/46/EC), established by the European Union to safeguard the privacy and integrity of all personal data processed, used, or exchanged between EU citizens. In accordance with Article 8 of the European Convention on Human Rights (ECHR), the Directive is intended to protect "the rights of privacy in personal and family life, as well as in the home and in personal correspondence." 
The EU Directive includes the following seven principles: [easy-tweet tweet="The EU Directive: Notice - those whose personal #data is being collected should receive notice" user="followcontinuum"] [easy-tweet tweet="The EU Directive: Purpose - the collected #data should be used only for the purpose(s) provided" user="followcontinuum"] [easy-tweet tweet="The EU Directive: Consent - disclosure of personal #data with third parties may only be permitted if data subject consents" user="followcontinuum"] [easy-tweet tweet="The EU Directive: Security - personal #data that's collected should be kept secure from potential abuses" user="followcontinuum"] [easy-tweet tweet="The EU Directive: Access - #data subjects may access their data and correct any inaccuracies" user="followcontinuum"] [easy-tweet tweet="The EU Directive: Accountability - #data subjects will be able to hold data collectors accountable" user="followcontinuum"] Notice - those whose personal data is being collected should receive notice Purpose - the collected data should be used only for the purpose(s) provided Consent - disclosure or sharing of personal data with third parties may only be permitted if data subject consents Security - personal data that's collected should be kept secure from potential abuses Disclosure - those whose personal data is collected should be notified as to who is receiving it Access - data subjects may access their data and correct any inaccuracies Accountability - data subjects will be able to hold data collectors accountable for abiding by these seven principles Under this standard, each EU member state manages data protection regulations and their enforcement within its jurisdiction. Data controllers are the ones who obtain the personal data from citizens in their country, data subjects, and are held to the seven principles as listed above. Additionally, each member state must form a supervisory authority in charge of monitoring data protection and launching legal proceedings when data regulations are violated. Adding to its decentralized nature, the Directive must be implemented by each member state and written into their own data protection legislation. Up until recently this fragmented approach sufficed... New Digital Union Framework Agreed Upon According to CompTIA's 10th Annual Information Security Trends study, 55% of respondents claimed the increased interconnectivity of devices, systems, and users were among top factors impacting security practices. Now with this rise in interconnectedness and the proliferation of social networks and cloud computing, European data regulations are being revisited and have been in a continuous process of reevaluation by the European Commission since January of 2012. Recently, however, there's been a breakthrough! After universal agreement among the justice ministers of each state, what was once the EU Data Protection Directive will eventually become the General Data Protection Regulation (GDPR). The EU's European Council projects its adoption in either this year or the next, with a two year period before going into effect. Once this happens, because it will be a Regulation and not a Directive, all 28 countries of the European Union will be immediately subject to the legislation. what was once the EU Data Protection Directive will become the General Data Protection Regulation (GDPR) So what does this mean? 
With one data protection framework, one "single digital union," binding all of the member states of the EU, privacy regulations and European citizens' data will be managed throughout the entire territory, rather than in the individual countries. [quote_box_center] In response to this agreement, Director General of the European Consumer Organisation Monique Goyens gave the following comment: "EU laws are now lagging behind the pace of technologies and business practices. Our personal data is collected, then used and transferred in ways which most consumers are oblivious to. An appropriate update must put control of personal data back in the hands of European consumers. This new regulation is the opportunity to close gaps, ensure robust standards and stipulate that EU laws apply to all businesses operating here.” [/quote_box_center] As of September 2015: As of right now, the GDPR is still in draft-mode and will likely be for the next few months as the European Parliament, Council, and Commission negotiate a finished version. As stated originally, the law won't become enforceable for another two years. That doesn't mean service providers should remain idle though. Successfully implementing the new compliance and data protection standards will take time. Efforts should be made to begin planning today! Read on for suggested areas for review. With the new European Data Protection Regulation, businesses will need to obtain consent from those whose personal data they want to track There are a few new changes you should account for with this uniform regulation. Under the new standard, for instance, Computer Weekly reports that "all data that identifies an individual, whether directly or indirectly, will now be personal data." This increase in the amount of data that will need regulating (though perhaps not with the same degree of scrutiny) includes pseudonyms and IP addresses. Because of this, many more businesses will be affected, especially those that rely on customer profiling to build marketing and selling strategies around personal or behavioural data. With the new European Data Protection Regulation, these businesses will need to obtain consent from those whose personal data they want to track. How readily do you think these SMBs will provide this? Customer data rights isn't the only consideration that still needs to be fleshed out. After a brief summer hiatus, the parties reconvened on September 1st, to continue discussing the implications of the GDPR. Not everyone is on board with a single digital standard, however. On the same day, the Russian Data Localization Law went into effect. As a result, all personal data gathered from those in Russia must now be stored within the country's borders, establishing a precedent of data sovereignty in the midst of a more unified data regulation movement. Impact for Service Providers Serving the EU Such a significant change in legislation could mean MSPs all throughout the EU will be forced to adhere to tougher data protection laws. How then should you respond to these latest updates? Computer Weekly has released a comprehensive guide outlining key components of the unified data regulation framework, those ISACA suggests IT service providers pay attention to. 1. Accountability Review and update your privacy policies, procedures, and documentation since data protection authorities can ask for these at any time. One way to evaluate your policies is by performing a data protection impact assessment. 2. 
Governance Group and Data Protection Officers Assemble an internal policy governance group to monitor all activities. If your organization has more than 250 employees or if you regularly and systematically monitor data subjects, you'll be required to elect an independent Data Protection Officer (DPO) to oversee and report on data management processes. 3. Explicit Consent This stipulation requires data subjects to freely agree to the processing of their personal data and data controllers to prove consent. Subjects can opt out of direct marketing data usage. 4. Right to be Forgotten Under this regulation, data subjects can mandate removal of their personal data and refuse further distribution by the data controller. 5. Outside Parties Data controllers outside of the EU who process data of those within the EU will need to appoint a representative within the territory. 6. Data Breach Notification Data controllers will have to report any personal data breach to the data protection authority immediately and within 24 hours upon learning of the breach. If longer than this, they must provide the reason. Data controllers might also need to alert data subjects who've been affected in special cases. 7. Sanctions Data protection authorities will have the power to fine up to 2% of annual global turnover for violations. 8. One Lead Supervisory Authority The data protection authority in the EU member state in which a multi-jurisdictional data controller has its main establishment will monitor data processing of the data controller across all states. 9. The Cloud Cloud providers, referred to as data processors, will also be held responsible if there's a breach due to their own improper planning, policies, and procedures. While further implications of this new single digital union will continue to surface, MSPs can take action now to strengthen organizational protocol. Assess all of your internal processes and develop strategies around data classification, retention, collection, removal, storage and search. Track your efforts and frequently report on them and above all, train your employees to comply with the new policies and procedures you enact. ### It’s time for CIOs to change strategy in the fight against consumer technology There is no denying that the role of today’s CIO is increasingly complex and unpredictable. Every day there are new stories about data breaches or data losses, with high profile cases bringing data security into the mainstream news agenda. This is partly thanks to the proliferation and seemingly uncontainable nature of new technology solutions. One example is the use of file sync-and-share (FSS) solutions in an enterprise setting, a cause of many data breaches. The rapid adoption of FSS solutions for personal use, has lead to FSS becoming standard procedure when accessing, storing and sharing business information in the workplace. Unfortunately, as with most technology built with consumers in mind, FSS solutions fail to provide businesses with the security they need for protecting critical business information. This has opened a Pandora’s box for security and data loss challenges as FSS solutions are rarely managed by, or even visible to, IT departments. [easy-tweet tweet="It's time for CIOs to become enablers of consumer technology, rather than blockers" via="Commvault" hashtags="cloud "] In recent years, CIOs have exhausted much of their energy campaigning against the use of consumer technologies in enterprise. 
They are increasingly unaware of what's happening with technology in their own organisations and have little control of what is being downloaded or transferred across them. It is necessary for CIOs to change strategy and position themselves as the enablers of these technologies, rather than the blocker. Shadow IT is simply not going away. So, how can you ensure that data shared through services such as OneDrive, Dropbox and similar solutions remains visible to IT, backed up, and fully searchable for use by end users and for eDiscovery purposes? The following guidelines enable your workplace to support FSS, whilst maintaining control of its data: Accept Shadow IT behaviour is taking place: Don’t ignore rogue file sharing behaviour. According to a research report by Osterman Research, 68% of business users were storing work-related information in a personally managed FSS solution – without visibility or approval from IT. [easy-tweet tweet="68% of businesses are storing work-related information in a personally managed FSS solution" via="Commvault" usehashtags="no"] Listen to your colleagues: Learn how and why staff choose to use FSS on the job. If IT can understand why, it can then encourage this continued collaboration platform with the support of additional data governance and security measures such as encryption, backup and eDiscovery. It will also enable IT to better choose which applications to roll out across the company, in order to meet staff requirements. Promote easy-to-use services: Remember that one of the most important features of FSS services is a user-friendly interface and quick deployment. Look for a tool that combines the functionality, security and scalability required for an enterprise solution with the ease of use of a consumer-class FSS. Solutions which offer a virtual folder that acts as a “personal cloud” mean staff can freely collaborate and share work-related files just as they would via FSS solutions, but without the security risk. Take responsibility: FSS integration into endpoint data protection solutions needs to be an IT responsibility. By putting in place the correct guidelines and approval processes, the IT team can finally put away the rubber ‘no’ stamp and be the department that colleagues look to for the tools they need and want. Keep your options open: Users choose FSS to increase productivity, collaborate with colleagues and backup files. As such, be sure you can continue to support a wide range of devices and third-party applications when you implement that extra layer of protection. It’s important not to be limited to a specific platform or ecosystem either, so make sure users still have access to the same features across all devices. CIOs must now recognise that their staff should guide them Following these guidelines will reinstitute the IT department as the guardian of consumer technology solutions in the workplace, and therefore IT teams can ensure that they can recover, access and use company information no matter where it resides – while reducing costs and risk. In much the same way as the shop assistant lives by the saying ‘the customer is always right’, CIOs must now recognise that their staff should guide them. By learning from the habits that work outside the enterprise, it is possible for IT to provide a smart complement, and/or alternative, to existing solutions, while still adhering to corporate data management best practices. 
### Understanding the Digitisation of Computing A term I am hearing more often, from partners and vendors alike, is ‘digital transformation’. The question I ask, both of others and of myself, is: do we all truly understand what a digital transformation is? “The Codification of Computing is Upon Us” When pondering questions such as these I tend to simplify both the concept and the idea by drawing out charts on my office wall. The concept of physical or even analogue services is not really described in easy-to-understand terminology or examples, but I will try my best to simplify it here. [click_to_tweet tweet="Who remembers the Sony Walkman powered by a C90 Cassette tape? @ianajeffs #flashback" quote="Who remembers the Sony Walkman powered by a C90 Cassette tape?"] Physical, Digital or Analogue Services Who remembers the Sony Walkman powered by a C90 cassette tape? Or pressing forward on a CD Walkman? Who would have thought that a few years later we would be using multifunctional smartphones for calling, searching the Internet and listening to music? Another example is the ‘democratisation’ of the publishing industry, which allows for both enhanced physical and digital distribution. Today, any author has the ability to publish in a number of formats and have their publications distributed around the globe at the click of a button. Amanda Hocking, for instance, made her unpublished novel available on the Kindle and has since sold over 1.5m books; I wonder how many authors would have become millionaires without these distribution and promotion mechanisms? The digitisation of goods and services is creating a new wave of entrepreneurship, and companies that were unheard of a few years ago. Take Uber and AirBnB, whose services have democratised taxi services and travel accommodation. “The Codification of Computing is Upon Us” From the network to compute resources, the computing and networking hardware we use has started to become code. Getting from A to B used to require multiple routing tables and hardware investments, but with Software Defined Networking (SDN) this function can be obtained by using software sitting upon legacy hardware and systems. [click_to_tweet tweet="The computing and networking hardware we use has started to become code. @ianajeffs " quote="The computing and networking hardware we use has started to become code."] Cloud Computing or Infrastructure as a Service (IaaS) allows for the abstraction of many hardware services to act as one node or instance. And now we are seeing Cloud 2.0, which has moved beyond short-term workloads to fully implemented enterprise systems being adopted in the cloud. For example, IBM Cloudant allows a database instance to be replicated across different cloud providers and systems for fault tolerance. This means your database is available at the nearest geo-node to your location with a service level guarantee, allowing for unparalleled data processing and services. The API Economy Many of us come across the term API in our daily lives, but what does it actually mean? An Application Programming Interface is a defined way for disparate or connected systems to communicate with one another. When you book a holiday through a comparison engine, an API request is sent to a multitude of holiday providers with your parameters, such as location, flight, number of stars and cost. But you won’t see any of this - just a simple screen brought back to you with the best holiday packages. 
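To make that flow concrete, here is a minimal, hypothetical example of the kind of request a comparison engine might fire at one provider's API, and what it might do with the structured response. The endpoint, parameters and response fields are invented purely for illustration.

```python
import json
import urllib.parse
import urllib.request

# Hypothetical holiday-search API endpoint and query parameters.
params = urllib.parse.urlencode({
    "location": "Tenerife",
    "departure_airport": "LGW",
    "min_stars": 4,
    "max_price_gbp": 800,
})
url = f"https://api.example-holidays.com/v1/search?{params}"

with urllib.request.urlopen(url) as response:
    offers = json.load(response)  # e.g. [{"hotel": "...", "price_gbp": 749, ...}, ...]

# A comparison engine fans the same request out to many providers,
# then merges and ranks the responses before showing you a single screen.
top_offers = sorted(offers, key=lambda offer: offer["price_gbp"])[:5]
print(top_offers)
```

The same pattern - a structured request in, a structured response out - underpins the cloud APIs described next.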
Cloud services use a number of APIs, ranging from starting and stopping a virtual machine through to creating new instances and integrating other services such as web caching. As interoperability standards improve due to customer pressure we will start to see more services driven by the API. For Managed Service or Cloud Providers I believe we will see more applications using ‘plugins’ driven by an API. And any Value Added or System Reseller who is building this capability will be positioned for the next wave of computing. 3D Printing and the re-industrialisation of the UK Need a new hip replacement? Need that ‘hard to find’ Darth Vader figure? Why not go to a 3D marketplace and print it out? The 3D printing revolution is upon us, and is already being used in various industries from manufacturing to aerospace. Today, our economy is increasingly based upon knowledge and intellectual property. So how do you protect that intellectual property and then monetise it? And, as we see more manufacturing services move back to the UK - due to 3D printing lowering costs - how will we service the underlying infrastructure needed for what could potentially be a huge demand? Today, our economy is increasingly based upon knowledge and intellectual property [/vc_column_text][/vc_column][/vc_row] ### Industry Shaping Cloud Trends of Today Cloud computing has experienced exponential growth over the past few years, and enterprises are adopting it on a large scale. Businesses use cloud platforms to gain immediate access to computing power, and to create new applications on demand. This growth is only expected to increase over the coming years. Arvind Mehrotra shares his thoughts on the four major cloud trends right now. NIIT helps many customers as they embark on their journey into the cloud, and in my experience there are four major cloud trends right now that are shaping our industry: Hybrid Clouds lead to Bimodal IT There has been a long and ongoing debate about the relative merits of public and private cloud models. Hybrid clouds feature an infrastructure that combines private cloud security with cost-effective, powerful and scalable public cloud attributes. For enterprises, Bimodal IT is emerging as a strategy of managing two separate, coherent modes of IT delivery, one focused on stability and the other on agility. The former is traditional and sequential, emphasising safety and accuracy. The latter is exploratory and nonlinear, emphasising agility and speed for projects that help innovate, differentiate & transform the business. As hybrid models become mainstream, NIIT has been assisting customers to adopt this cloud deployment model, so that they can leverage a best-of-both-worlds experience. The SaaS share of total application spend is accelerating Currently 30% of the application budget is spent on SaaS-based applications. This is projected to grow steadily over the next few years. NIIT is helping its customers to deploy their on-premise applications on the Cloud and offer them as SaaS-based applications with a monthly subscription model. [easy-tweet tweet="Currently 30% of the application budget is spent on SaaS-based applications." user="NIITTech and @comparethecloud" hashtags="cloud, saas"] Public Cloud Adoption AWS is currently the cloud market leader with its greater economies of scale, but Microsoft is quickly catching up and finally getting Azure on track. NIIT has developed partnerships with both AWS and Microsoft Azure and built strong capability across them.
We are helping customers across the United States, Europe and APAC markets to migrate and deploy a broad range of applications to these public cloud ecosystems. Vertically-focused cloud businesses Deep vertical market expertise combined with the ability to deliver new apps quickly on a cloud platform is leading to faster customer wins and increased credibility in the vertical market served. NIIT is leveraging its Travel & BFSI vertical expertise to move on premise applications and solutions so that they can be offered as SaaS as managed services.    ### Find the right Cloud Server Platform for your project Over the past decade, the concept of ‘cloud’ has promised some clear and attractive benefits. However, such cloud ‘promises’ are often diluted down because the final cloud solution received is too inflexible or lacks power. There are some great solutions in both the ‘simple clouds’ (mostly for consumer applications) and the ‘mega clouds’ (for enterprise level needs), but the mid-market cloud (for SMBs and smaller-scale developers) is often the most under served. When buying cloud servers, one should consider how solutions are matching up to every one of these cloud promises – lower TCO, more power, ease of use, more reliability, greater flexibility, increased transparency, no vendor lock in, and long term support from a vendor.  When selecting a cloud platform, the conclusion should be based on the resultant overall cloud experience. Firstly, it is important to research the right cloud provider for your needs. Understanding the origin of cloud providers and the role their historical experience plays in supporting your goals is the key to a successful cloud strategy. Cloud providers have typically evolved from three distinct areas; these are Value-add Resellers (VARs), ISPs and Web Hosts. Each type of provider has brought with them vast but widely different experience as they have moved into the ‘cloud market’. [easy-tweet tweet="#Cloud providers have typically evolved from three distinct areas; #VARs, #ISPs and #WebHosts." via="1and1_UK" usehashtags="no"] For example, many VARs evolved into a cloud provider by partnering with a hosting infrastructure provider and offering customers a suite of bespoke cloud services. By combining infrastructure and applications they can be attractive for small businesses with a lack of skills in-house. However, for small scale users, the higher prices of VARs can often be a disadvantage. many VARs evolved into a cloud provider by partnering with a hosting infrastructure provider ISPs have been able to invest in and acquire cloud technology at a rapid pace, however achieving automation and efficiency is very challenging. ISPs can also face challenges delivering ever-changing levels of high speed connectivity and infrastructure while keeping costs affordable. This can mean that investing in cloud services is not always the priority for them and cloud customers could perhaps suffer. ISPs have been able to invest in and acquire cloud technology at a rapid pace Alternatively, a web host’s knowledge and skills originate from the shared web hosting market. These companies have invested heavily in infrastructure, not just enabling automation, resource management and fast provisioning but considering them central to their solutions – meaning a focus on delivering on agility, power and price – ideal for cloud server users. 
a web host’s knowledge and skills originate from the shared web hosting market When researching individual cloud solutions, it is important to understand how the platform came to be. Has it been re-engineered over years from older legacy systems or has it been redesigned in recent years to reflect the best possible server technology? Look for a provider that reviews its solution in terms of overall cloud-usability – you seek a server that delivers the most on cloud promises. It is important to understand the capabilities of a specific cloud platform today and its direction for the future. Begin by researching the components of the stack – what server hardware is used? What type of virtualisation? Is Xen used, or a premium option such as VMware? Look for indications from a vendor that a ‘premium partner’ strategy has been adopted in order to deliver to you the latest innovations or those with the best possible results. For example, storage technology is an area that can make an enormous difference to day-to-day performance. Newer cloud solutions will use pure SSD rather than spinning disk, which costs a provider more but is lightning quick, and is often paired with real-time deduplication and compression at the storage layer. A cloud user should experience virtually unlimited disk performance as a result. SSD is also a rapidly evolving area of tech so is likely to progress to become even better. Look for benchmarking studies from third-party sources around areas like this – the cloud providers with the best price-performance ratio will want to shout about it. [easy-tweet tweet="Newer cloud solutions will use pure SSD rather than spinning disk" user="1and1_UK" hashtags="cloud"] Similarly, with elements such as firewalling, load balancing and intrusion protection, look for premium suppliers and whether versions are up to date. Cloud providers that are investing in the best component parts for networking, computing, storage and web interface, for the long term, will be more committed to optimising your cloud journey. Ask about how their cloud offering is planned to develop in the future – the best platforms will have a continuous roll-out of new functionalities. Deployment time, billing transparency and ease of use are also areas that will affect your overall success with a cloud platform. For example, the ability to leverage multiple machines for short periods and with charges by the minute will enable the scalability promise of cloud to become reality. It is essential that any cloud solution allows you to grow upwards and downwards in real terms as your needs change. As well as fitting your budget, prioritise the providers that show strength with speed, security and levels of guaranteed uptime, proof of their technical competency. Ensure they are investing in strategic partnerships that you will benefit from in everyday performance terms, and that they use cutting-edge components and networking. Expect signs of innovation in their cloud packages and plans for continual evolution that will drive real efficiencies for you in the future. ### 80% of CIOs feel ‘ripped off’ by cloud providers 80% of CIOs feel ‘ripped off’ by cloud providers who make them pay a premium for what many consider basic support ElasticHosts’ survey finds CIOs expect more from cloud providers when it comes to reducing their support and maintenance burden Cloud hosting provider, ElasticHosts, today released the results of a survey* that reveals dissatisfaction with support and service in the cloud sector.
The survey found that 93% of businesses are now using the cloud in some shape or form; however, of those that do, three in four (75%) feel that the move to cloud has forced them to sacrifice service and support. [easy-tweet tweet="93% of businesses are now using the cloud in some shape or form" user="elastichosts and @comparethecloud" usehashtags="no"] Indeed, a third of respondents (33%) believe they have sacrificed the majority or all of support by moving to the cloud. For example, 84% of cloud users feel cloud providers could do more to meet expectations on reducing the support and maintenance burden on in-house IT staff. While cloud users often have the choice to purchase a package that offers greater support, 80% feel that this is a ‘rip off’, and that premium services offer little more than what they would normally consider basic support. “Many companies adopt cloud so they can take away the headaches related to managing their IT and reduce the burden on in-house IT staff. Therefore, the need for ongoing support and services will naturally be reduced, as it is outsourced,” commented Richard Davies, CEO of ElasticHosts. “Yet when using any service, you want to be able to ask questions – whether that’s to learn how to configure a server, or to query a bill –you should be able to do this without having to pay a hefty premium.”  The most common problems businesses encounter when using support and services from public cloud providers include: Slow response times to customer service queries – 47% Call handlers lack sufficient technical knowledge – 41% Use of automated phone lines/not being able to speak a human – 33% Complicated escalation processes – 28% Lack of 24/7 availability – 19% [quote_box_center]Davies continues, “With many providers, if you have a technical question, you call up and after a very long wait are then forced to go through a long winded automated service – meaning you will quickly be in a bad mood. If you are then transferred to call handlers that don’t have the required technical knowledge to help resolve your query, and haven’t heard of your company before, your blood will begin to boil.” “The industry should be doing more to help customers. Users are right to expect expert support included as standard with their cloud services. The first person that they contact for support should be an engineer with strong technical understanding of the service, not just a call handler.”[/quote_box_center] To find out more about ElasticHosts, please visit: www.elastichosts.com *Commissioned by ElasticHosts and conducted by independent research company Vanson Bourne, the survey was administered to 200 CIOs at large companies covering a cross-section of vertical markets in the UK. ### Dell MSP Conference - Recording Live Dell and Intel are inviting MSP's to their Service Provider conference at Mercedes Benz World on September 10, 2015. Hosted by Dell Executive Director Peter Barnes, this event is devoted to Managed Service Providers.     [embed]https://www.youtube.com/watch?v=b_s91_PBdDA[/embed] ### Friday Night Cloud: Episode 1: Pilot [embed]http://traffic.libsyn.com/fridaynightcloud/FridayNightCloud-Pilot-Final.mp3[/embed] In our pilot podcast episode we explore the Internet of (Strange) Things, discuss data breaches and look at the future of cloud technology. 
[easy-tweet tweet="CTC explores the Internet of (Strange) Things, discusses data breaches and cloud technology" user="mrandrewmclean"] Since recording it would appear that a wealth of strange and unusual Internet of Things Wearable (are there are any other sort?) bras are available. In Japan, a country where the marriage rate has fallen considerably in recent years - and not always through choice, the Konkatsu Bra can soon be the prize possession of any hopeful bride. You may expect to see such a device advertised in a 1950's magazine when times were less enlightened and men wore suits on every occasion - even a trip to the dentist. The bra not only includes a holder for an engagement ring but a LED timer which counts down towards the big day playing "The Wedding March" when it hits zero. Other handy attachments can also be added such as a pen holder to sign your marriage nuptials. It is unclear whether this garment is to be worn on a daily basis - simple hygiene would suggest that several would be required and quite how you wash electronic underwear without risking a timer malfunction, spinning out of control at a hastened rate like a failed bomb wire cutting scene from a b-movie, is suspiciously missing from the press release. The true love bra sends a message direct to your smartphone when your heart rate flutters in that special way to tell you you've found 'the one'. Even stranger is that it comes with an electronic lock which only opens when the bra reads the right level or arousal. [embed]https://youtu.be/B8Wd831gUt4[/embed] But even more implausible is the clap-off bra which, and I really can't believe I'm going to write this, solves the problem of keen lovers straining with clasps by simply clapping their hands for the bra to undo. I imagine such a device would be rendered utterly useless at public events, like say the theatre - but perhaps that's part of the thrill. [embed]https://youtu.be/OgJvHePIEdU[/embed] Perhaps a little more practical, if not slightly bewildering, is the Solar Bikini - the perfect way to charge your phone when out in the sun. The bikini top and bottom both have solar panels connected allowing the wearer to plug their phone directly into their crotch. Whether the problem of a dead phone justifies the sight of wearing two reflective glitterballs and electrically charged underpants is one of personal taste - fashion has never been my strongest point. On a more serious note, Cyrcadia Health is developing a bra that could alert women to the early signs of breast cancer. - Click here to read more about this potentially life saving garment. ### The Great British Cloud-off According to Mary Berry and Paul Hollywood, The Great British Bake-off (which is now on air) is going “back to basics” this year. Forget baked alaskas, poviticas and technical challenges that only expert bakers could handle - and say goodbye to scandals like bingate. This year, the bakes will be simpler and entirely focused on creating food that delights. You could argue that cloud computing is in need of the same recipe.   Cloud computing has disrupted the market and caused a great deal of expectation and concern in equal measure. The business benefits – including a more flexible way of working, improved collaboration and reduced cost and risk – have been chronicled extensively. But simmering security issues have also been widely covered. [easy-tweet tweet="#GBBO is going back to basics in 2015, does #Cloud computing need to do the same? 
" user="comparethecloud" hashtags="cloudoff"] As cloud solutions have advanced and different models have come to the fore, complications have also arisen. For instance, the hybrid model addresses many issues pertinent to exclusive public and private environments, but it has also created complications around managing a diverse environment necessitating different tools for each variety of cloud. By failing to manage the overall environment businesses face rising costs and a poor outcome; a bit like serving cake with chips. However, by bringing everything together to create a coherent, cost effective and high quality service, businesses can thrive. Given these challenges, organisations are now taking a more balanced approach to cloud computing. They’re recognising that small, considered changes can make a big difference.  Baking a great cake takes far more than just throwing flour, eggs, sugar and butter into a bowl. There is an art to how you combine the right ingredients. And when we talk about ‘cooking up a storm’ there are a number of comparisons to migrating to the cloud. Admittedly the metaphor takes some imagination but bear with me - the similarities are worth digesting. when we talk about ‘cooking up a storm’ there are a number of comparisons to migrating to the cloud Preparation Benjamin Franklin once said, “By failing to prepare, you are preparing to fail” and “an ounce of prevention is worth a pound of cure.” The best bakers plan ahead by ordering their ingredients in advance, preparing the area, wiping down the worktops and lining-up their utensils. This preparation extends to considering the sort of ingredients that they want to use. For instance, do they need to buy a certain type of flour or research a vegan cake? [easy-tweet tweet="As a general rule of thumb, non-critical, non-sensitive data is well suited for the public cloud." user="comparethecloud" usehashtags="no"] Successful organisations similarly plan ahead by clarifying their business objectives and needs by considering their approach. Imagine the cloud environment as a mixing bowl. Before steaming ahead, organisations need to consider what applications are right for cloud deployment and the security measures needed. As a general rule of thumb, non-critical, non-sensitive data is well suited for the public cloud. It’s an economical approach and there shouldn’t be any reputational or organisational damage if data is compromised. It’s a bit like buying own-brand chocolate. You might be wasting money by baking with Green & Black’s. However, in some cases, more expensive ingredients or appliances can make all the difference. Invest where you need to.  In the same vein, organisations are considering which applications, (i.e. business critical ones) are best placed in a private, managed cloud, with high levels of security to protect against malicious access or data leaks. Often a hybrid cloud, which combines aspects of all, is the best solution for a business. Measurements When considering your approach to the cloud it’s important to bear in mind that data classification can change and it’s not always simple to follow. For example, in the public sector G-Cloud accreditation previously provided organisations with clear and stringent guidance in terms of classifying data. However, this classification is misleadingly lax. In practice, the onus is on the organisation to make the right decision to put the correct security measures in place. 
This can only be achieved with a thorough data management strategy which applies differing levels of control to different data sets. This means companies must go above and beyond to ensure the correct structures and fail-safe options are in place to avoid future data loss, reputational damage and even a hefty fine from the Information Commissioner’s Office. In many situations, extensive internal knowledge or a partnership with an expert in navigating cloud architectures is a must. Similarly, bakers take advice from others, following recipes and using defined measurements for a reason – to prevent failure further down the line. For instance, there’s a reason why a chef sifts flour, why you need self-raising flour or baking powder, why you should measure carefully, pre-heat the oven and use ingredients such as eggs and milk that have been kept at room temperature. Presentation The Great British Bake-Off scores cakes according to presentation as well as taste. Food commentators often remark that ‘we eat with our eyes’. For bakers, the magic happens when they bring the different ingredients together to create the perfect cake. Similarly, the business impact is felt when a public and private cloud is ‘wrapped’ in a management layer, to ensure the cloud environment is aligned to business goals, security is adequately reinforced, the right type of encryption is in place and processes are automated to systematically identify and fix issues. This is the point at which we can separate the amateur bakers from the professionals. [easy-tweet tweet="The business impact is felt when a public or private #cloud is ‘wrapped’ in a management layer" via="no" usehashtags="no"] A fast-moving bake One aspect that doesn’t apply to the baking analogy is the speed of change. While baking appliances have evolved, tried and tested baking techniques still prevail; whereas the development of cloud computing has happened at pace since the start of the century and continues to accelerate. According to UK analysts TechMarketView, cloud computing will grow more than 20% annually between now and 2018. In the public sector specifically - while some areas don’t have the same cloud-first policy as central Government – the transformative benefits are recognised and progress is being made. However, uneven innovation often leads to a piecemeal approach and the recipe for success in this fast-moving environment is an integrated approach. To bake a great cloud, organisations need to take a complete infrastructure approach, with a clear idea of what they want to create, the set of ingredients required, the steps they need to follow to build the right cloud environment from the outset and the satisfying result they want to create. Bon appétit! ### Arrow ECS Big Data Analytics Live Blog Follow the live blog here for all the happenings at the REX today from the Arrow ECS Big Data Analytics Event.
The schedule:

13:00 Introduction with Oliver Reid and Peter Reakes
13:10 Keynote speaker David Fearne provides an introduction to the Solution Centre
13:50 Midmarket Analytics with Phil Powell and Jane Everden
14:20 PDA Mini with Jamie Taylor
14:50 Break and refreshments
15:00 The Sky’s The Limit: an overview of IBM Cloud Data Services with Robert Douglas and Richard Holmes
15:30 Watson: what’s behind the hype with Rebecca Sinnatt
16:00 Analytics Partner Elevator Pitches
16:10 Arrow Analytics Marketing: what’s coming up and how you can engage, with Emily Fallon

### No piece of data should be an island When you think about it, the mere act of aggregating data doesn’t actually provide any real business benefit. Nothing more than a collection of ones and zeros, data by itself can do nothing more than sit in electronic isolation. aggregating data doesn’t actually provide any real business benefit This situation only changes once that data is put to work. When it’s readily accessible by software applications, it becomes a powerful resource. When combined with data from other sources, its power can be increased exponentially. That string of ones and zeros can now help businesses innovate and grow. The bottom line? No piece of data should be an island. The process of transforming data from bits into business value happens in a range of ways. From a hardware perspective, it happens in “digital centres”, the new age of data centres. Secure storage and reliable access means the data is available as and when it’s required. [easy-tweet tweet="The impact of data-based capabilities is profound" user="jihann22 and @comparethecloud" hashtags="database"] From a software perspective, the transformation occurs through the use of database optimisation and analytical tools. Correlations and patterns can be found within data that drive business activity and improve decision making. The impact of these data-based capabilities is profound. As our ability to put data to work becomes ever more sophisticated, the potential for positive business impact is vast. For example, consider the challenge of customer retention. It’s a big issue for every business but it’s particularly important for financial services organisations. In this sector, the critical factors affecting customer retention include speed, accuracy, and personal relationships. Some might argue that the solution is to pool all data in a single location To achieve effective retention, financial services personnel need fast and efficient access to customer information. They need to be able to fully understand each customer’s relationship with the firm and the interactions they have had in the past. [caption id="attachment_21862" align="alignleft" width="300"] CTC is recording live at the Dell MSP Conference[/caption] In some instances, achieving a holistic customer view will mean extracting data from multiple data stores and combining it all in real time. This can be difficult to achieve when that data is sitting in disparate data centres or even in different racks within the same data centre. Fast and reliable connections are important to ensure these ‘islands’ of data can be combined. Some might argue that the solution is to pool all data in a single location. But, while that might sound enticing, the practicalities of dealing with legacy systems and dispersed operations make it little more than a pipe dream for most organisations.
Instead, effective cross-connection between different data stores and applications is one of the best ways to provide the information customer service staff need in the timeliest manner. Being able to efficiently and effectively access and analyse data (regardless of its location) will have a dramatic and positive effect on the quality of service delivered. [easy-tweet tweet="Fast, reliable connections are important to ensure ‘islands’ of data can be combined" user="jihann22" hashtags="data, analytics"] Ultra-connected and agile data centres are the underlying foundation that supports great customer service. They allow data to be accessed as needed and workloads to be efficiently and effectively deployed as and where needed. Depending on the requirements of the organisation, data centres can also support initiatives such as public, private and hybrid cloud infrastructures, peering, mobile device deployments and omni-channel contact centres. Once these value-adding pieces are in place, the humble data bits are able to deliver real business value. Data is no longer an island. Data is no longer an island. ### August Top 10 #CloudInfluence UK and Cloud Bursters The list of top cloud bursters, those appearing on the scene all of a sudden, was headed this month by Diane Bryant, who was quoted widely on Intel’s investment of $100 million in Mirantis, as was Alex Freedland in 6th. [easy-tweet tweet="The Top 10s for the UK and Cloud Bursters #CloudInfluence are out now" user="billmew" usehashtags="no"] Sean Boyle in 2nd made a rare outing, as those in AWS’s finance division aren’t often permitted to speak to the press, but on this occasion it was to promote the firm’s use of data to create greater clarity to support decision-making. Commenting on the various winners of the fifth Annual Cloud Computing Excellence Awards boosted Rich Tehrani, CEO of TMC, into third. [table id=49 /] Back in the UK, Simon Hansford holds on to first place for the second month running as Skyscape Cloud Services announced the latest wave of price cuts. He is followed by Lisa Hutt, who was recently announced as Aframe’s new CMO, and Kevin Price at SilverSky this month. [easy-tweet tweet="The only person to have appeared in the UK rankings every month is @SimonlPorter" via="no" hashtags="cloudinfluence"] Having held first place for months, Simon Porter dropped to second in May. He was then second again in June and third in July. This month he is down to 4th. To a large extent this is down to the fact that all of his influence comes from social media, where he is prolific – he is rarely quoted in the media and his blog posts have been less frequent – he is still, however, the only person to have appeared in the UK rankings every single month. [table id=50 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### The forecast for analytics is bright and cloudy Wake up and smell the smart Cumulonimbi The origins of data analytics – the science of analysing and drawing conclusions from data – and cloud computing can both be traced to the 1950s. Using cloud computing for analytics, however, has happened only very recently.
Data analytics was originally a backroom administrative task, a manual process of pen and paper, ledgers and tables. With the advent of databases and spreadsheets, analytics has slowly moved on, but it was the explosion of new business intelligence tools that enabled the effective slicing, dicing and visualisation of data that really accelerated change. In fact, a whole new profession has come into being to do the work: the data scientist. A key challenge is that data analysis is computer-intensive. To gain accurate insights requires huge amounts of data to be processed. Until recently this has been beyond the means of most companies. The Cloud grows up Cloud computing, as we know it now with elastic compute capacity, server virtualisation and pay-as-you-go pricing, first became commercially available in 2006. However, the original clouds were underpowered and not suited to data analytic processing. The amount of power that the original elastic compute unit provided was less than you would find in a modern smartphone. Furthermore, these first clouds suffered from ‘noisy neighbours’ – a result of virtualisation when more than one task is run on the same physical machine. Inherent bottlenecks on disk I/O or network resources meant that your virtualised cloud computer ground to a halt. In analytics - where your analysis is only as quick as your slowest lookup - this could be disastrous. [easy-tweet tweet="#Cloud computing as we know it, first became commercially available in 2006" user="EXAGolo" usehashtags="no"] Welcome to the Power Cloud However, times have moved on; no longer are you limited to one cloud provider. Hundreds of clouds, both large and small, are now available. For specialised analytical tasks there are clouds optimised for analytics, and to avoid the problem of noisy neighbours you can now use “bare metal”, where the virtualisation layer is removed so you benefit from full, single-tenant physical servers to run your analytics. Bigstep’s Full Metal Cloud is one such cloud provider, which promises an instant 500 per cent performance boost when compared to virtualised clouds. [easy-tweet tweet="Analytical jobs are now the ‘square peg to a traditional transactional database’s round hole’" via="no" hashtags="data"] Database technology has also evolved, and analytical jobs are now the ‘square peg to a traditional transactional database’s round hole’. The old way of dealing with this problem was indexing the database, creating partitions and data aggregates so you could set the database up in a way where some data could be analysed. But still, such analytical jobs ran slowly, in batches, and would take hours to complete. Hadoop, the framework for distributed processing of large data sets across computer clusters, has played an important part in this evolution. Although Hadoop scales well, it is very slow to process queries, and this is not ideal in a world where decisions are expected to take just seconds, not minutes. That’s where a fast analytic database comes into play. Users can continue to collect and store data in Hadoop clusters and then run fast analytic jobs in the database. Moreover, the latest database technology is ready for the cloud; indeed most solutions are now cloud-enabled and run in-memory with native connectivity to popular business intelligence tools such as Tableau – which is a particularly user-friendly platform for analytics and data visualisation.
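As a rough illustration of the "store in Hadoop, analyse in a fast database" pattern described above, the sketch below runs a single aggregate query against an analytic database over ODBC. The DSN, credentials, schema and table names are placeholders, standing in for whatever in-memory analytic database is actually deployed; they are not taken from any specific product.

```python
import pyodbc  # generic ODBC client; a vendor-specific driver could be used instead

# Illustrative connection details and schema - substitute those of the
# analytic database actually in use.
CONNECTION_STRING = "DSN=analytics_db;UID=report_user;PWD=secret"


def top_products_by_revenue(limit=10):
    """Run an aggregate query against the fast analytic store.

    The raw event data can continue to live in a Hadoop cluster; only the
    curated, query-ready tables need to sit in the analytic database, so
    interactive questions come back in seconds rather than batch hours.
    """
    query = """
        SELECT product_id, SUM(order_value) AS revenue
        FROM sales.orders
        GROUP BY product_id
        ORDER BY revenue DESC
    """
    with pyodbc.connect(CONNECTION_STRING) as connection:
        cursor = connection.cursor()
        cursor.execute(query)
        return cursor.fetchmany(limit)


if __name__ == "__main__":
    for product_id, revenue in top_products_by_revenue():
        print(product_id, revenue)
```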
As technology requirements increase, businesses need only pay for what they need at the current time The advantages of cloud are plain to see: Firstly, companies no longer have to invest in data centres. Secondly, it has enabled organisations to move away from a traditional CAPEX model (buy the dedicated hardware upfront in one go and depreciate it over a period of time) to an OPEX one where you only pay for what you need and consume. As technology requirements increase, businesses need only pay for what they need at the current time, rather than having to plan months or years in advance. This means the cloud also enables incremental growth. In addition, all the nuts and bolts of running a data centre, backup and recovery, networking and power are done for you. That’s a huge headache removed in one fell swoop. Taking data analytics to the clouds is undoubtedly the next step in the cloud revolution and the natural way forward for businesses if they want to get to grips with their data once and for all. EXASOL’s high-performance in-memory analytic database is now available in Microsoft Azure. ### August Top 50 #CloudInfluence Individuals We have a new leader at the top of August's #CloudInfluence individual rankings. During the summer holiday season there has been a slight lull in the overall volume of opinions and a limited number of major corporate announcements or events, but one of the commentators that has often appeared in our monthly rankings has been busy nevertheless. [easy-tweet tweet="Daniel Ives from FBR Capital Markets takes #1 in the August #CloudInfluence Individual Rankings" user="comparethecloud" usehashtags="no"] Daniel Ives, an analyst at FBR Capital Markets, has been issuing research reports on topics such as Windows 10 as well as providing press comment on things like the future for Symantec after its sale of Veritas, and on the strategic announcements from VMware at its recent user conference. Then in 2nd and 6th are Daniel Zhang, the CEO of Alibaba, and Simon Hu, the president of Aliyun, Alibaba’s cloud arm, following the announcement that Alibaba is to invest billions in Aliyun as it seeks to challenge the market dominance of AWS and Google. [caption id="attachment_22157" align="aligncenter" width="1341"] Three month view of #CloudInfluence Individual fluctuations[/caption] Diane Bryant was quoted widely on Intel’s investment of $100 million in Mirantis, which she described as the next step in bringing open cloud infrastructure to the entire industry as part of Intel's 'Cloud for All' initiative. Also quoted was Alex Freedland, Mirantis’s co-founder and president. Satya Nadella lost ground to his Microsoft colleagues Nicole Herskowitz, Cindy Bates, and Mark Russinovich as Microsoft made a number of announcements, including extending its support in containers from Docker to Mesos. In a rare outing Sean Boyle, who runs the finance division of Amazon Web Services, was permitted by the company to speak to the press about how the firm uses data to create greater clarity to support decision-making.
[easy-tweet tweet="The summer break saw a drop in opinions for most of the usual commentators in #Cloud" user="billmew " hashtags="cloudinfluence"] Commenting on behalf of Cisco on the need for cloud providers and internet firms to comply with the new EU Cybersecurity Law, Chris Gow was widely quoted saying: “We’re pleased to see digital service platforms subject to a different regime but we’re disappointed at the lack of recognition that it is the use of cloud that determines the security risk not the service itself.” [caption id="attachment_22158" align="aligncenter" width="1290"] August view of #CloudInfluence Individual Fluctuations[/caption] Meanwhile Cisco colleague David McCulloch responded to news of Facebook’s deal with EZchip, by saying that: “The software defined network threat to Cisco has clearly been overstated.” Eli Fruchter from EZchip disagreed. With technology giants including Facebook and Google developing their own networking-equipment systems to save money and handle an increasing volume of Internet traffic, he sees a real opportunity for EZchip and a threat to traditional hardware makers such as Cisco. [caption id="attachment_22222" align="aligncenter" width="800"] Top 10 #CloudInfluence Individuals for August 2015[/caption] Now that the summer holiday season has come to and end we are expecting competition for places in our next monthly ranking to really hot up as announcements and opinions start to fly. [table id=48 /] NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### 5 Facts: Are Apple Planning Their Own Phone Network? It was only a matter of time before rumours started to fly that Apple were planning their own phone network. There is constantly speculation over what Apple are working on, whether this is a newer model of an existing product or the first in a line of brand new products from them. Alongside ever-updated iPhones, iPads and Macs, they have also recently launched an Apple watch and are apparently working on an iCar and an Apple TV, so we can only imagine what will come next! [easy-tweet tweet="Is #Apple planning its own phone network? " user="comparethecloud" usehashtags="no"] Well, according to reports it could well be an Apple Network! Here are 5 facts that we know about it so far: 1. Apple is in talks to launch a mobile virtual network operator (MVNO) service in the US and Europe, according to Business Insider. They are currently privately trialing an MVNO (a virtual carrier that sees technology companies lease space from established carriers and sell it to customers directly) service in the US. Apple has been looking into launching its own MVNO since at least 2006, when it patented plans to launch its own network. 2. Therefore, if it does launch, Apple won’t actually be launching its own carrier, but would actually switch between real carriers to get the best services. Instead of paying your carrier every month, you will pay Apple directly for data, calls and texts. Apple then provides you with everything you used to get from your carrier, whilst the Apple SIM switches between carriers to give you the best service. 3. 
Apple has already launched its own ‘universal SIM’ in the iPad Air 2 and iPad Mini 3. This allows users to connect their device and then choose what network they want to get their data from. 4. Apple is already making itself less dependent on phone networks, with its plans to launch a Siri-powered ‘iCloud Voicemail’. This will see Siri answer phone calls and then transcribe them. At the moment your voicemails are stored with your carrier, so it would make sense for Apple to take control of these before launching an MVNO. 5. It is not likely to appear for another five years, if at all, with no guarantee that it will launch beyond the test phase. In fact Apple has completely denied it, which the company very rarely does. However, this could just be to appease carriers concerned by the news. Apple did tell journalists that they had no plans to release a tablet… Only time will tell whether or not Apple will launch a network, and even if it does there are no guarantees that it will be a success, with other large companies already trying and failing at it! In the meantime make sure the products you already have are working to the best of their ability with a company like Lovefone - if you are currently trying to read this through a cracked screen, you could be an hour away from having it back to brand new! ### New White Paper from Datapipe: Rethinking IT Operations in the Era of Cloud The drive to adopt public cloud services like AWS and Azure originates from straightforward business common sense: public cloud computing is a faster, cheaper, more agile alternative to building IT infrastructure on your own. Paying only for what you consume, with no upfront capital investment outlays, makes sound economic sense. Plus, instant access to scalable computing capacity is an advantage that was virtually impossible in the past. But for many organisations, cloud adoption has been limited to simple or sandbox environments, and has not yet shifted to effectively using the cloud at a broader enterprise level. Cloud computing customers expect a lower cost as they perceive the utility model as less expensive. Read the full white paper from Datapipe here: Rethinking IT Operations in the Era of Cloud ### August Global #CloudInfluence Organisations We are flattered by the level of interest that our monthly rankings receive, but some of our followers who find the rankings fascinating sometimes ask: “well, what do they actually mean?” In pure statistical terms our rankings are based on a big data analysis of the leading opinions across the top international news articles, blogs and social media content, to which we apply patented natural language processing algorithms. First we extract opinions that have been publicly shared relative to a specific topic (in this case Cloud), then we determine who held that opinion, and finally the algorithm calculates an influence score for the opinion, opinion holder, and the topic. This process is repeated every day, using the full-text articles from over 3 million sources including news, blogs, and social media. For Cloud we are currently tracking over 700,000 significant opinions. We are one of only three media companies to have access to this technology – the others being Forbes and the Economist (not too shabby, if we do say so ourselves). You can read more in depth on the meaning of influence here.
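The scoring engine described above is patented and proprietary, so the snippet below is only a toy illustration of the general idea - extract an opinion, attribute it to a holder, and weight it by reach and engagement. Every field name and weight in it is invented for the example and bears no relation to the real algorithm.

```python
from dataclasses import dataclass


@dataclass
class Opinion:
    holder: str          # person or organisation the opinion is attributed to
    topic: str           # e.g. "Cloud"
    audience_reach: int  # readership / follower count of the source
    engagements: int     # shares, replies and links back to the piece


# Invented weights, purely for illustration.
REACH_WEIGHT = 0.3
ENGAGEMENT_WEIGHT = 0.7


def influence_ranking(opinions, topic):
    """Aggregate a simple per-holder score for one topic, highest first."""
    scores = {}
    for opinion in opinions:
        if opinion.topic != topic:
            continue
        score = (REACH_WEIGHT * opinion.audience_reach
                 + ENGAGEMENT_WEIGHT * opinion.engagements)
        scores[opinion.holder] = scores.get(opinion.holder, 0.0) + score
    return sorted(scores.items(), key=lambda item: item[1], reverse=True)
```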
[easy-tweet tweet="For #Cloud @comparethecloud is tracking over 700,000 significant opinions from 3 million sources"] What this actually means is that we track the level of “buzz” associated with each of the top organisations and individuals – the level of opinion each is generating, the extent to which this is resonating and the influence it is having on others. There are two main drivers for this “buzz”: Marketing and PR, where firms seek to generate and promote the opinions of their firms and executives The ecosystem, where a firm’s partners, clients and other advocates help spread such opinions and enable them to go viral. In effect we have a market place where there are numerous vendors offering rival cloud solutions with some generating more of a buzz than others. This is important because it is those that can capture both market share and mind share that will prevail. Cloud comes in many colours but some are brighter than others The rankings for August illustrate this point well. Amazon and Google are the ‘born-on-the-cloud’ players that established an early lead in public cloud, but Amazon’s cloud is shining brighter partly because it has a more active ecosystem, but also because it has started to break out its AWS financials, which leads to a lot of press coverage, whereas Google has yet to make the same disclosure. Hence Amazon is in 1st and Google is only 6th. Alibaba, which recently grabbed many headlines as it announced its intention to invest billions to challenge them both, is just behind Google in 7th. [easy-tweet tweet="1st place in the August #CloudInfluence Rankings goes to @Amazon" user="comparethecloud" usehashtags="no"] [caption id="attachment_22082" align="aligncenter" width="1495"] Three Month View of #CloudInfluence Fluctuations[/caption] At the other end of the cloud colour spectrum are the corporate technology heavyweights, IBM and HP. Both have realised that they cannot take on the market leaders Amazon, Google and Microsoft head to head – especially in terms of scale and price. Indeed they are not actually seeking to take them head on or compete directly on price at all. Instead they are offering premium cloud services to the large corporate clients that they serve that require secure and reliable systems to serve their complex needs. The difference here is that IBM’s marketing machine has been in overdrive with partnering announcements almost every other week with Twitter, Tencent, Microsoft, SAP, HootSuite and many others. While HP’s output of headline grabbing announcements has been muted by comparison. IBM is therefore shining far more brightly – appearing in 5th in the August ranking while HP didn’t even make the top 50. While IBM should be concerned that its buzz is almost all marketing-led with little or no ecosystem amplification, HP should be even more concerned that it is failing to generate any buzz at all outside of the OpenStack community. Microsoft in 2nd of course shines brightly in a unique colour all of its own – with a massive marketing machine and considerable ecosystem to support its own proprietary cloud platform, as well as the Office apps that its rivals are seeking to emulate. Likewise Apple and Intel each stand relatively alone in their respective niches in 3rd and 4th overall this month. [caption id="attachment_22081" align="aligncenter" width="1497"] August View of #CloudInfluence Fluctuations[/caption] Further down the rankings, shining just that little bit less brightly were a few interesting names.  
General Electric launched Predix Cloud, which it claims to be the world’s first and only cloud solution designed specifically for industrial data and analytics. GE claims this platform-as-a-service (PaaS) will capture and analyse the volume, velocity and variety of machine data within a “highly secure,” industrial-strength cloud environment, adding that Predix Cloud will drive the next phase of growth for the Industrial Internet and enable developers to rapidly create, deploy and manage applications and services for industry. GE also announced that all its businesses will begin migrating their software and analytics to the cloud in the last quarter of 2015, and that the service will be commercially available to its customers and other industrial businesses by 2016. [easy-tweet tweet="Appearing in August's #CloudInfluence rankings, @VMware and @Rackspace thanks to #OpenStack announcement" via="no" usehashtags="no"] VMware and Rackspace both appeared in August’s tables as the two firms announced a new interoperable OpenStack cloud architecture to help customers reduce the complexity associated with hybrid or multi-cloud environments. Meanwhile PRS for Music (formerly the Performing Right Society) made an appearance in the rankings this month after the organisation responsible for collecting publishing royalties for musicians announced that it is suing SoundCloud for not paying songwriters for the use of their music. SoundCloud has consistently refused to acquire a PRS for Music licence. [table id=47 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### How to create a more collaborative classroom with private cloud It’s back-to-school season, but for today’s children the classroom looks very different from when we were at school. The modern classroom is no longer limited to pencils, papers, and projects on poster boards. Today’s pupils are creating digital files across a variety of devices and classwork is now scattered across desktops, laptops, and tablets. [easy-tweet tweet="In the digital world, education needs to transform to be cloud-ready" user="TransporterEMEA" hashtags="cloud"] In the digital world students need a way to collaborate with peers on projects, research data, or sync documents in real time between group members; devices such as mobiles and laptops facilitate this. Teachers also need an easy way to collect assignments or share class notes, without leaving any records vulnerable to data loss. On the administration side, schools increasingly face the challenges of ever-tightening budgets, limited resources and personnel. They need to make legacy IT equipment work within the current technology landscape to ensure both students and faculty have constant access to data. It's time for schools to stop fighting BYOD, and to take advantage of the technology Further changes to classroom working are also likely to be brought about by the trend of BYOD (bring your own device). Like many organisations, schools are realising that mobile technology is not going away and that instead of fighting the use of phones, it might be a technology worth taking advantage of.
If pupils bring their own devices then it will save schools much-needed resources, but as with enterprise BYOD, this way of working will need facilitating in a way that ensures data is kept secure. So, how can schools utilise the advances in technology whilst also protecting their highly sensitive data? As paper file systems are becoming cumbersome and out-dated, cloud storage might seem an attractive option, especially because this is how students are sharing their data outside of school. However, the convenience of public cloud comes at a price: once you hit the maximum storage allowance, the costs can really add up. In addition to the increased costs is the increased risk to data, a real concern for teachers, administrators and parents. The private cloud What schools and universities need instead is a private cloud solution that can create a more collaborative environment in the classroom: a device that gives students mobile access, gives professors and teachers the ability to work from home securely, and lets users upload and share files both locally within the school and remotely at an offsite campus. If you use a public cloud service then it’s important to remember your data isn’t 100% secure The most important reason for opting for a private cloud solution is to regain control of your data. If you use a public cloud service then it’s important to remember your data isn’t 100% secure. With a private cloud solution you own the server where your data is stored, giving the IT department complete visibility and control over privacy and location. The way to a more collaborative classroom Every year schools and universities are changing and modernising. You never know what you are going to find when you walk back through that classroom door in September. From blackboards and chalk to electric whiteboards and now projector screens and laptops, everything is modernising and evolving. As technology advances so too does our way of working and learning. [easy-tweet tweet="A lesson in data management that is well worth learning from Geraldine Osman" user="comparethecloud" hashtags="data, byod"] The ability to sync and share files instantly and securely through a private cloud appliance means that students can work in a way that is collaborative and productive whilst keeping their data secure. A lesson in data management that is well worth learning. ### WightFibre acquires the Click4internet network Huge boost for rural broadband across the Isle of Wight WightFibre announces the purchase of the Click4Internet wireless network, which covers the Isle of Wight and parts of Hampshire around the Solent. The move is a significant step for WightFibre and enables the company to now extend its superfast broadband coverage to the whole of the Isle of Wight. [easy-tweet tweet="Over 90% of the Isle of Wight can now enjoy superfast speeds" user="comparethecloud" usehashtags="no"] WightFibre has been providing fibre optic broadband to the island since 2001 and Click4Internet high-speed rural wireless broadband since 2008. Whilst the two networks have been working in partnership since 2012, this latest move now secures the opportunity for homes and businesses right across the island to receive faster broadband from a single island-based provider.
John Irvine, CEO of Cowes based WightFibre comments “This move is recognition of the issue faced by many across the Isle of Wight and reflects the importance we place on focusing on ways to improve rural connectivity beyond where it is today.” The combination of WightFibre’s extensive technical support and existing infrastructure and Click4Internet’s more rural coverage allows WightFibre to now deliver a superior service to more areas of the Island - providing a real alternative, improved choice and additional services for homes and businesses. Hi-speed internet access is seen as an important contributor to the quality of island life, from the way we work and how our children learn, to how we spend our leisure time and do our online shopping, through to how we contact public services and keep in touch with loved ones. Click4Internet Managing Director, Frazer Munro firmly believes that bringing the two networks together is a great deal for the island, “Click4Internet has been providing service to large areas of the island who have been poorly served by BT. With over 500 homes and businesses on our network we had reached a point where we needed help to accelerate the development and take up of our services. WightFibre will be able to deliver this. We feel we are leaving our existing network and customers in very capable hands”. WightFibre continues to lead the way in the roll out of faster broadband for the island and is already planning upgrades and enhancements to the existing Click4Internet network; improving speeds over the coming months to superfast and also making available phone and TV services to existing Click4Internet customers. These network enhancements are a reflection on Wightfibre’s commitment to these customers, the island and continued investment in its rural economy. This latest move combined with WightFibre’s continued partnership with Wight Wireless ensures that, in total, over 90% of the Island can now enjoy superfast speeds. ### Analytics solutions take away the guesswork Q&A with Annrai O’Toole, CTO, Europe, Workday Businesses have known for a while that data holds the answers to questions around improving performance, but how is cloud computing taking this to the next level? For decades, businesses have been using analytics solutions to unlock business insights which inform their decisions. But as technology evolves, so too do the analytics solutions. Whereas ‘business intelligence’, as it is commonly known, has traditionally focused on analysing past data to make educated decisions for the future, the combination of cloud computing and in-memory computing (IMC) has, today, led us to the next generation of analytics which can deliver actual recommendations to businesses on their next moves. So rather than just looking at past data, modern analytics solutions are instead made up of three broad subsections: descriptive (providing insights into what happened), predictive (predicting what might happen in the future), and prescriptive (delivering recommendations to business as to what steps they should take next to deliver optimal results). By using advanced data science and machine learning algorithms to provide leaders with insights, predictions and recommendations, businesses are armed with information to make smarter, and more strategic, financial and workforce decisions. [easy-tweet tweet="Modern #data #analytics are either descriptive, predictive, or prescriptive." user="annrai and @comparethecloud" usehashtags="no"] So, no more second guessing? 
In the past, many important decisions have been made based on a limited amount of information or on instinct, but those days are gone. Recommendations based on algorithms which analyse accurate and current data allow us to make confident and assured decisions. We are already experiencing these analytics solutions in our everyday lives. Take, for example, every time you log into Amazon or Netflix. The websites recommend books or films for you based on previous items you have purchased or viewed. The impact of having access to this information is extremely powerful in the decision-making process. What is stopping enterprises from implementing this in a professional environment? Businesses face challenges which are much more complicated than simply deciding which book to purchase next. If they are to obtain recommendations to answer the big questions such as ‘how do we increase revenue?’ or ‘how can we retain our talent?’, then they will require huge amounts of data, and the analysis needed to get the right results will be much more complex. But of course, access to this insight relies on having your data in order first. Legacy systems are clear roadblocks to enterprises adopting these analytics solutions, given that traditional ERP systems are situated across multiple servers and databases, with siloed processes and systems. As such, there is no unified view of a company’s data. The first step, then, for businesses is to move scenario data into a data warehouse or business intelligence application before the enterprise can reap the full benefits of analytics. Talent retention is one business challenge many companies find particularly problematic. How do prescriptive analytics help remedy this? At present, knowing when an employee is ready to leave is often based on a gut feeling from their manager. However, losing talented employees not only has an impact on quality of work and customer satisfaction, but it can also cost the company thousands of pounds to replace them, with a significant impact on a business’s bottom line. With so much at stake, surely manager instincts are not the most strategic solution. Using data is a much more accurate way of understanding the top risk factors driving staff turnover. By accurately assessing factors such as time spent in current role, number of job functions held, or time between promotions, as well as highlighting which teams are at the highest risk of turnover, businesses are armed with the right information to support decisions to drive successful business outcomes. If businesses were to implement these solutions, will they need to hire data scientists to make sense of all the information? [easy-tweet tweet="If the thought of algorithms leaves you scratching your head, you needn’t worry..." user="annrai" hashtags="bigdata, analytics"] If the thought of algorithms leaves you scratching your head, you needn’t worry - it isn’t as complicated as you might think, and you certainly do not need to be a data scientist to understand the results. All the work happens behind the scenes and once the data has been analysed, it is presented to an HR manager or finance team in a single, intuitive dashboard. Making the results easy to understand is crucial to the success of analytics solutions, and only then will they deliver valuable insights to managers in the business.
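To make the idea concrete, here is a minimal, purely illustrative sketch of the kind of attrition model described above. It is not Workday's implementation; the feature names, synthetic data and risk threshold are assumptions chosen for the example.

```python
# Illustrative prescriptive-analytics sketch on entirely synthetic HR data.
# Feature names and figures are hypothetical, not drawn from any real system.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(42)

# Hypothetical historical records: months in current role, months since last
# promotion, number of job functions held, and whether the employee left.
n = 500
months_in_role = rng.integers(3, 96, n)
months_since_promotion = rng.integers(0, 72, n)
functions_held = rng.integers(1, 6, n)
# Synthetic ground truth: attrition risk rises with stagnation.
risk = 0.02 * months_since_promotion + 0.01 * months_in_role - 0.3 * functions_held
left = (risk + rng.normal(0, 1, n) > 1.0).astype(int)

X = np.column_stack([months_in_role, months_since_promotion, functions_held])
model = LogisticRegression(max_iter=1000).fit(X, left)

# Score a current employee and surface a suggestion for their manager.
employee = np.array([[60, 48, 1]])  # 5 years in role, 4 years since promotion
p = model.predict_proba(employee)[0, 1]
print(f"Estimated attrition risk: {p:.0%}")
if p > 0.5:
    print("Suggested action: review progression and discuss development options.")
```

In practice the modelling runs behind the scenes; only the resulting risk scores and suggested actions would surface in the manager's dashboard.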
So the technology becomes the trusted advisor to the business, does this not take away the role of the manager in the decision making process? In a business environment, it is important to remember that these machine-produced recommendations should not run in a vacuum. Managers still play a crucial role as they ultimately have the final call on a company’s next steps. The prescriptive analytics merely act as intelligent guidance to support the decision making process rather than replacing it all together. What needs to be in place for this new breed of prescriptive analytics solutions to thrive? Accessibility is key. Companies depend on intelligent insights to grow their businesses, but if managers struggle to get access to the information then it becomes pointless. Data doesn’t have to be difficult. The next generation of analytics provide the right people, with the right information at the right time and consequently add real value when it comes to supporting growth in a business.  Data doesn’t have to be difficult. ### The 7 Deadly Sins of Data Loss Prevention in the Cloud The world of work is changing. More people are working remotely and more data is being stored and transferred using cloud based apps, leading to enterprises employing a hybrid of on-premises and cloud-based applications. This creates a fantastic opportunity for identifying cost savings, as well as utilising talent no matter where that sits. However, it also opens up significant security shortcomings when traditional Data Loss Prevention (DLP) solutions are deployed. In particular, as organisations move more of their sensitive data onto SaaS providers’ servers, traditional DLP solutions are unable to provide sufficient visibility into the SaaS environment, or worse yet, are not able to operate in the “as a service” environment at all. [easy-tweet tweet="Traditional #DLP solutions are unable to provide sufficient visibility into the #SaaS environment" user="comparethecloud" usehashtags="no"] Let’s take a look at seven challenges of providing DLP for cloud file sharing applications. 1. Lacking basic visibility The most obvious shortcoming of traditional DLP is that it can only monitor traffic on enterprise-controlled assets. However, traffic to and from a cloud application might not go over an enterprise network at all. It could be generated by a mobile user…through a native mobile application…over a mobile network. This falls out of the sight of traditional on-premises enterprise solutions, and as such, is fundamentally beyond the scope of what classic DLP solutions are designed to handle. 2. Failing to interpret encrypted traffic Traffic to and from cloud applications is typically encrypted. Even if a traditional DLP solution managed to gain network-level visibility into this traffic, it might not be able to interpret the underlying content. Again, without basic visibility, there is little that can be accomplished by traditional solutions. 3. Interpreting links versus raw data Traditional DLP solutions are predicated on processing raw data directly. However data is never being directly shared in cloud file sharing applications. Instead, what is being shared is some type of link to the content. The link itself reveals little to no useful information about the content itself. What must be done is to analyse the content being pointed to by the link, rather than the link itself – again something traditional DLP solutions can’t do. 
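As a rough sketch of that third point, the snippet below uses a hypothetical `CloudDriveClient` wrapper (invented purely for illustration; real file-sharing APIs differ) to show why a DLP engine must resolve a share link and scan the content behind it rather than the link text itself.

```python
# Illustrative sketch only: CloudDriveClient is a hypothetical stand-in for a
# cloud file-sharing API, not a real library. The share link is opaque; the
# DLP check has to fetch and inspect the file it points to.
import re

SENSITIVE = re.compile(r"\bCONFIDENTIAL\b", re.IGNORECASE)


class CloudDriveClient:
    """Stand-in for a cloud file-sharing service (hypothetical)."""

    def __init__(self, files):
        self._files = files  # maps share link -> file content

    def resolve_link(self, share_link):
        """Return the content behind a share link."""
        return self._files[share_link]


def violates_policy(share_link, client):
    # Scanning the link text alone reveals nothing useful...
    if SENSITIVE.search(share_link):
        return True  # ...this branch will essentially never fire.
    # ...so resolve the link and scan the underlying content instead.
    content = client.resolve_link(share_link)
    return bool(SENSITIVE.search(content))


client = CloudDriveClient({
    "https://drive.example.com/s/abc123": "CONFIDENTIAL: Q3 board pack draft",
})
print(violates_policy("https://drive.example.com/s/abc123", client))  # True
```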
[easy-tweet tweet="Traditional #DLP solutions are predicated on processing raw #data directly." user="ElasticaInc and @comparethecloud" usehashtags="no"] 4. Using ‘perimeter defence’ sharing semantics In the context of traditional enterprise environments, data loss or leakage has a well-defined meaning - namely the crossing of data across the enterprise perimeter. For cloud file sharing applications, however, the definition of leakage or loss is fundamentally different for two reasons. First, once data is hosted with a cloud provider, it already resides outside the enterprise network and can be shared with external third parties. Secondly, data in a cloud application is shared on a per-user basis. For example, if you want to share a file with someone else, you can typically do so by simply entering that person’s email address or the username they use for the application. Traditional DLP solutions do not understand these sharing semantics, and cannot assess if data is being “lost” or leaked. Traditional DLP solutions do not understand these sharing semantics 5. Applying algorithms not designed for file-based data Traditional DLP technologies might make different assumptions regarding the data they have to process - they may assume data is transmitted in a stream and has to be processed as such. When dealing with SaaS-based file sharing applications, the data model generally involves being able to access entire files containing sensitive data. Algorithms that are designed for streaming data might not perform well on file-based data. As a result, to achieve optimal performance for SaaS-based enterprise file sharing applications, it is important to develop algorithms that were designed specifically to take advantage of full-file content. 6. Viewing content myopically, while ignoring broader context Traditional DLP solutions might examine a piece of content in isolation and use that as the sole basis for determining whether or not the transmission of that content represents a violation. For DLP in cloud applications, we have access to much richer context about a particular file – has the file only been shared internally or is it being shared externally too? Who originated the sharing of the file? Was it an external party or did it come from the inside? [easy-tweet tweet="Traditional DLP solutions are not privy to the mechanics of cloud-based file sharing applications" via="no" hashtags="cloud"] Considering context is not just important for determining whether a policy is violated, but it is also important when remediating issues. You might be fine with a SaaS application hosting a file containing specific content, as long as the users with whom the file is shared are internal to the organisation. If an attempt is made to share the content with an unauthorised third party, then you might want to block only this type of access. Because traditional DLP solutions are not privy to the mechanics of cloud-based file sharing applications, they are unable to provide enforcement capabilities that are consistent with the way these applications work. 7. Relying on pattern matching Traditional enterprise DLP technologies rely primarily on basic pattern matching and regular expressions for identifying sensitive content. To identify a credit card number, the DLP solution might look for sixteen numbers formatted in the particular way. While this approach will be highly sensitive to finding credit card numbers, it will have poor specificity. 
In particular, there may be many instances of files containing digits that can be misconstrued as credit card numbers. To address this concern, it is important to apply techniques from natural language processing and machine learning. These approaches go beyond simply trying to understand the raw content, and instead focus on being able to understand the underlying context. For example, the presence of a sixteen-digit number is itself vague. If, however, in proximity to that number we see a name, an address, and a date – then we can have more confidence that we are dealing with sensitive financial information. These seven challenges highlight that data loss prevention in the new world of work needs to be redefined. DLP for SaaS applications is starkly different from what needs to be done for traditional on-premises enterprise applications. Clearly, for forward-thinking organisations that now rely on a hybrid of on-premises and cloud-based applications, a new approach to data loss prevention is vital to ensure security. ### Is Online Gaming an Invitation for Hackers? Without the right modes of protection, online gamers may be left open to hackers wanting to access their essential information such as financial and personal data. This is why it is of the utmost importance that anyone using the internet to pass across confidential information, whether via online gaming, shopping or banking, should take the correct precautions to avoid any mishaps online. [easy-tweet tweet="Participating in #onlinegaming should not be considered as an invite to potential #hackers" via="no" usehashtags="no"] However, this doesn’t mean that you can’t enjoy the undoubted thrills and spills of online gaming to the fullest. As long as you are safe online, there is no reason why you can’t enjoy your favourite games as often as you like. Participating in online gaming should not be considered an invitation to potential hackers; instead, it should be seen as a creative outlet and provider of quality entertainment, delivering unforgettable experiences for over 1.2 billion gamers worldwide. Security should just be seen as the necessary bedrock that allows this enjoyment to take place. The reality The risks that come with online gaming are not as serious as some naysaying corners of the media would have you believe. There are currently 700 million gamers using the internet to enjoy their games and, according to www.bobsbusiness.co.uk, the video-game industry is expected to exceed £86 billion by 2016. The majority of these gamers have never experienced any form of hacking or unwanted third-party intrusion. The risks are only present when firewalls and security systems are absent. According to the same article, 34% of social platform gamers experienced password hacking, with 17% of those affected choosing to stop playing as a result. The odds of a hacker gaining access to your online gaming account are slim, but operating with no safety mechanisms is a bit like walking out of your house every morning and leaving the door wide open. So what can you do to minimise the risk? Take the right precautions.
[easy-tweet tweet="The video-game industry is expected to exceed £86 billion by 2016" via="no" hashtags="cloudgaming"] Steps to Avoid Hackers [caption id="attachment_18105" align="alignright" width="300"] Read Neil Cattermull's review of Insomnia54[/caption] The first and easiest step that you can immediately take to prevent hackers from accessing your important data is to use a difficult password. To improve its quality, make sure that it includes a combination of numbers, lower case letters and upper case letters. Avoid using easy-to-guess passwords such as birthdays and pet names as this information can be easy to find on your social media accounts etc. Make sure that you have an effective firewall and anti-virus protection server activated on your laptop to ensure that no harmful viruses can gain access to your data. It is also important to make sure that you have backed up all of your online files and information so that if something were to happen, you can always be assured that you have a back-up. Effects on Online Gaming Gaming is only an open invitation to hackers if you make it one. In 2014, as reported on www.develop-online.net, the online gaming industry ‘fell victim to a hack that saw $1 billion fake dollars flooding into the gaming coffers’. This attack did set back the industry quite a bit. However, this was probably due to the fact that the gamers involved did not know how to practice online safety whilst gaming. As long as you take the right precautions, your personal data will remain safe whilst online so that you can continue to enjoy your favourite games without worrying about potential threats. Gaming is only an open invitation to hackers if you make it one. Rigorous safety is the key to hours of enjoyment. ### How Cloud Storage Helps Companies Move From Hot to Cold Low cost cloud backup solutions are currently transforming disaster recovery (DR) and data protection. The door has now opened for small businesses to adopt previously prohibitively expensive secondary DR capabilities. Providing businesses with flexibility over stored data, the cloud also offers a quick way to recover information after a disaster occurs, with the flexibility that it can be recovered to any location of choice. By virtue of being more dynamic, automated and accessible from any internet-connected location, organisations with cloud storage are believed to be up to twice as likely to be able to recover data after a disaster in four hours or less than those who have non-cloud solutions. [easy-tweet tweet="Block storage is still the traditional option for enterprise workloads" user="LeClairTech" usehashtags="no"] The majority of cloud providers now offer block storage and object storage. For example, Amazon offers Amazon Elastic Block Store (EBS) and Amazon Simple Storage Service (S3). Block storage is still the traditional option for enterprise workloads that require persistent storage.  Block storage is used by applications in the same way as they would consume storage from a storage area network (SAN) device.  Files are split into uniformly sized blocks of data, but have no metadata coupled with them. Object storage stores data and files as distinct units without any hierarchy. Each object contains data, an address and any associated metadata useful to the application that is using the object storage. Object storage is highly scalable as well as long-lasting, making it greatly advantageous. 
Most object storage systems have mechanisms to duplicate data and give up to 11 nines of durability. Object storage can be effectively used for DR, archiving and cost-efficient cloud backup. Data stored in object storage is easily and quickly accessible. The main limitation and challenge of using cloud storage for backup and recovery is in the initial seeding of data into the cloud. A typical data centre has many terabytes of data on-site, and migrating this to the cloud can be a time-consuming and costly task. Even with a high transfer rate wide area network (WAN) pipeline, it can take days to transfer this large amount of data. And even if you have the budget and patience to get your data to the cloud initially, when having to restore from the cloud, WAN bandwidth can be a limiting factor and recovery times (RTO) can lengthen considerably. As a result, many cloud services offer cloud seeding programs that allow enterprises to ship physical disks or tapes to the cloud facility so that the initial set of data can be seeded into the cloud. From that point on, any changed or new data can be transferred incrementally over the WAN. This incremental-forever backup strategy requires much less data to be sent over narrow and costly transfer pipes. [easy-tweet tweet="Incremental forever #backup strategy requires much less #data to be sent over narrow and costly transfer pipes" via="no" usehashtags="no"] Cold Storage Most data in a data centre cools over time. This means that it starts out “hot” and is accessed frequently. However, over time data is needed less often and cools to the point where it is cold, accessed very infrequently and predominantly kept as archives for data retention or compliance needs. In many enterprises, cold data must be retained for seven to 10 years for regulatory reasons, but with the expectation that it is accessed extremely infrequently. Historically, this data has been saved on tapes and stored off-site. However, while individual tapes are inexpensive, tape is a much-maligned technology in IT. Tapes often require expensive hardware systems that must be maintained, and the process for storing data for long-term retention using tapes is labour-intensive and potentially error-prone. Worst of all, as tapes age and bit-rot occurs, data restoration can fail entirely. As a result of this data pattern, there is a new form of cloud storage that is heating up. In 2012, Amazon introduced Amazon Glacier, which provides durable storage for data archiving and online backup for as little as one cent per GB per month. Glacier was designed specifically for data that is accessed infrequently and can tolerate a retrieval time measured in hours, typically three to five hours, as opposed to the seconds or milliseconds typical of object storage.
However, unlike Amazon Glacier, which will deliver your data within a service level agreement (SLA) of three-to-five hours, Google promises file access of three seconds or less “time to first byte.” [easy-tweet tweet="Cold storage can be a significantly more flexible and cost-effective tape replacement for many businesses" via="no" usehashtags="no"] Cold storage can be a significantly more flexible and cost-effective tape replacement for many businesses that no longer want to use rotational media for archival and retention. It seems like Google Nearline beats Amazon Glacier, offering faster retrieval times at a much lower cost than other cloud-based object storage options. However, there is a catch: Google Nearline is still designed for cold storage. Google states that you should expect 4MB throughput per TB of data stored, although this throughput scales linearly with increased storage consumption. For example, 3TB of data would guarantee 12MB per second of throughput. Therefore, while Google Nearline is faster to first byte than Glacier, if you have a lot of data, Glacier may be the faster option overall. Both Glacier and Nearline can incur additional costs as you access your data depending on how frequently and how much of your data you need to access. Many hyper-scale cloud companies including Amazon, Google, IBM and Microsoft, now offer these services Cold storage and object storage are both good systems for backups and archives. Many hyper-scale cloud companies including Amazon, Google, IBM and Microsoft, now offer these services. With a multitude of options, organisations should consider their own needs before jumping in and using a particular provider. Companies need to assess each system’s initial data seeding requirements, cost per GB, time to first byte retrieval, ease of use, frequency of access to data and overall throughput expectations. The best solution for individual businesses to do is seek a continuity and backup provider that supports the widest range of options, giving flexibility and a cost-efficient way to protect your data in this changing market. ### Demystifying the hype of the hybrid cloud Cloud computing is shrouded in myth and decked in media hype. One such fallacy is that significant numbers of businesses today operate and oversee their infrastructure exclusively within a cloud environment. In a study of small to medium and public sector organisations in the UK undertaken by the Cloud Industry Forum (CIF) 84% of those polled were using some form of cloud-based service and 78% two or more services. [easy-tweet tweet="50% expect to move their entire estate to the #cloud at some point in the future" user="comparethecloud" usehashtags="no"] In truth, many organisations are within, or are moving towards a hybrid cloud—maintaining and managing some resources in-house while using cloud-based services for others. This research sees opinions evenly split on future trends; 50% expect to move their entire estate to the cloud at some point in the future, while the remainder cannot foresee such a modus operandi. At the present time most are a way off migrating their entire infrastructure to a cloud platform with just 15% of them considering their primary IT model to be the cloud. cloud “isn’t yet all things to all men” Alex Hilton, CEO of CIF, commented that the cloud “isn’t yet all things to all men” and that it will sit alongside on-premise solutions “for quite some time to come.” The rush towards cloud-based outsourcing has lost momentum. 
Here are some reasons why… False economy The major argument for transitioning to a cloud platform is efficiency; shared resources are deployed as and when required, such that IT spend becomes an operating cost—no longer deemed CAPEX where hardware outlay is depreciated over time. Yet contractual service charges are not always preferable to hardware refresh in a climate of declining hardware costs. Security concerns [easy-tweet tweet="75% of organisations stated they had security concerns in migrating specific applications to the #cloud" via="no" usehashtags="no"] According to the CIF’s UK Cloud Adoption Trends for 2015 75% of organisations stated they had security concerns in migrating specific applications to the cloud and 59% reported concerns over data protection. The CFO’s arguments for improved efficiency are little comfort when concerns around maintaining levels of service, and worse still uncertainties in security and compliance, loom large. A rationalised and deskilled workforce would ensue Transition to the cloud will inevitably lead to labour lay-off and a deskilling of in-house resource. Removal of tried and tested methodology, together with technical team expertise and associated accountability seems a brash and unwise move. Lack of visibility and desire to manage critical systems A known factor in the reluctance of enterprises to embrace cloud technologies is that they have come to expect complete visibility of their infrastructure. A hybrid scenario allows on-premises IT services to be deployed for the most sensitive and critical workloads or for the organisation’s legacy systems and these can be monitored via conventional methods. The Mutiny Lite solution allows monitoring of up to 10 managed devices free of charge. Proactive network monitoring provides management of priority business services and notifications whenever they might be at risk. A colocation arrangement may be a sensible route to providing an optimal mix of control and capacity. [easy-tweet tweet="Hybrid take-up rates are not as high as once anticipated." user="comparethecloud" hashtags="hybridcloud, cloud"] Even hybrid take-up rates are not as high as once anticipated. Gartner previously estimated that 72% of companies would have put a hybrid cloud implementation strategy in place by the end of 2015. However, the latest insight has led them to announce that ‘while most companies will use some form of hybrid cloud computing during the next three years, more advanced approaches lack maturity and suffer from significant setup and operational complexity’. Gartner goes as far as to position the hybrid cloud in the ‘trough of disillusionment’ phase of the Gartner Hype Cycle. Until service providers reach new levels in functionality, security and visibility that is where it will firmly remain. ### Xangati Extends Storage Performance Analytics to VMware Virtual SAN Xangati, a service assurance analytics and infrastructure performance management innovator, is extending its cross-silo intelligence and storage performance analytics for VMware Virtual SANTM technology. Xangati collects information from all servers, switch and storage systems that comprise the virtualised data centre and gathers information from VMware vSphere®, VMware vCenterTM Server and VMware Horizon® to create a complete picture of end-user quality of experience and health of virtual desktop or virtual server infrastructures, augmenting VMware vCenterTM and VMware vRealizeTM OperationsTM. 
[quote_box_center]“Virtualisation environments need to deal with new apps and reconfiguration requirements on the fly for maximum flexibility and resource liquidity, but the lack of visibility of storage behaviour is often contentious,” said Atchison Frazer, VP-Marketing, Xangati. “With integrated Virtual SAN data correlation, vSphere admins will be able to run virtualisation more independently, with little storage expertise required and less reliance on external arrays and storage admins for optimising workload performance.”[/quote_box_center] Xangati has also implemented deep and direct integration into NetApp storage systems that shows the IOPS, throughput and latency that NetApp is delivering to the hypervisor. Additionally, data on NFS or CIFS (Common Internet File System) shares, iSCSI or Fibre Channel LUNs (Logical Unit Numbers) and overall network bitrates are collated by Xangati. Storage-related storms typically occur when applications unknowingly and excessively share a storage controller, a volume or even a LUN, which causes storage performance to deteriorate, often dramatically and spontaneously, especially among converged infrastructures. Relative to VMware Virtual SAN support, Xangati will collect information from the hypervisor about the Virtual SAN-based datastore as well as its constituents, including latency, throughput and IOPS, as well as capacity utilisation, resource efficiency and availability information. Virtual infrastructure performance data are collected with highly granular, second-by-second precision and then correlated by Xangati’s advanced analytics engine to adjust thresholds dynamically, eliminating false alarms. When a service impediment crops up, such as a resource contention storm involving Virtual SAN, Xangati will utilise its exclusive StormTracker technology to automatically determine root cause and analyse VSAN-specific metrics to resolve triage-level issues quickly. "VMworld® 2015 US is an ideal place to see firsthand how virtualisation is radically changing storage," said Gaetan Castelein, senior director, Product Management, VMware. "VMware Virtual SAN, with its ability to replace the hard-to-manage hardware infrastructure, is rising as a better way to handle the new I/O demands and puts storage front and center in the revolutionary changes we're seeing from virtualisation." ### What a successful service provider looks like in the U.S. and U.K. A rather large "pond" separates United Kingdom MSPs from those in the United States. Despite this distance, the superstars from each region closely resemble one another. They understand their respective markets and know how to take full advantage of the managed IT services business model to achieve long-term growth. While the verticals they pursue may differ or the challenges they meet may be unique to their locations, the core characteristics of successful MSPs remain the same in both the U.K. and the U.S. U.S. and U.K. MSPs That Get Top Marks... 1. Understand What a Managed Service Is MRR - This seems like a no-brainer, but in both markets there exists the misconception that if you touch information technology, you're an MSP. In both the United States and the United Kingdom, many break-fix providers and system integrators believe they're offering managed IT, but they don't understand what makes the business model so unique. [easy-tweet tweet="The misconception is that if you touch information technology, you're an MSP." user="comparethecloud" usehashtags="msp"]
user="comparethecloud" usehashtags="msp"] MSPs who excel do so because they understand that business longevity derives from having regular and predictable cash flow. As an MSP, you can count on a steady stream of monthly recurring revenue (MRR) for ongoing service rather than billing clients for the time it takes to complete unpredictable, "fire alarm" issues. Proactive IT Management - In addition to knowing that their balance sheets are fully managed, high-caliber MSPs understand that clients want the peace of mind that all of their IT-related concerns are fully managed as well. Simplicity is key. You have to make it easier for the client to work for you. Successful MSPs in both the U.S. and U.K. leverage intelligent remote monitoring and management (RMM) technology to proactively manage their clients' endpoints. Their clients, in turn, are not burdened with the expectation that it's only a matter of time before their system goes offline. They know their data protection is continuous, and they have you to thank! 2. Have Significant Experience in the Field Skills don't develop overnight. It takes time to hone your craft, and running a managed IT services operation is no exception. How do U.S. MSPs compare to their U.K. counterparts in this regard? According to CompTIA's U.K. State of the Channel report, [c]hannel firms in the U.K. tend to be on the older side in terms of years in business." In fact, almost "two thirds of [the 400] companies in CompTIA’s study report being in business for ten years or more, compared with 16% that have been in the market for four years or less." For MSPs that have been around the block for a few years, they've been successful because they've been able to refine their operations to be process-driven. They have established workflows and have fine-tuned their Service Level Agreements (SLAs) to maximize service delivery and client satisfaction, creating a stickier customer base. 3. Offer Cross-Sell and Upsell Opportunities [easy-tweet tweet="Popular cross-sells for both U.S. and U.K. MSPs include #backup and #disasterrecovery" via="followcontinuum" usehashtags="no"] After already establishing a stable MRR with RMM, the more successful MSPs in the U.S. and U.K. expand into offering other services to cross-sell existing clients. While it differs for each MSP, it's bundling these additional services that tends to take U.K. and U.S. MSPs to the next level. Popular cross-sells for both U.S. and U.K. MSPs include backup and disaster recovery (BDR) - offered by 72% of U.S. MSPs according to CompTIA's 4th Annual Trends in Managed Services Study - email security, and voiceover IP (VOIP) services. The top MSPs are immensely profitable, and a big part of their success comes from understanding the need for multiple revenue streams per client. Additionally, they capitalize on upsell opportunities like adding RMM agents on additional end points. In both the U.S. and U.K. markets, the most successful MSPs manage the entirety of a client's network, including firewalls, routers, and switchers. They've moved beyond solely serving servers and desktops. [quote_box_center]Note: Clients don't just volunteer when they're ready to spend more money with you. If you want to double your revenue year-over-year, you have to be dialed in to your clients' needs. The top MSPs are more than just service providers. They're true technology partners. 
They perform network assessments to gauge the health of users' IT infrastructure and know which questions to ask to determine whether there is an upsell or cross-sell opportunity. What if you think they need additional services, but they aren't initially sold? Stay on top of them! MSPs in the U.S. and U.K. act as virtual CIOs (vCIOs) and perform Quarterly Business Reviews (QBRs) to reaffirm the value they bring clients as IT consultants. In fact, consulting services for IT were the number one business activity that CompTIA's UK respondents claimed generated the most revenue for them last year. Meetings like QBRs are a great channel to re-open upsell and cross-sell conversations.[/quote_box_center] 4. Target Lucrative Verticals The verticals within the U.S. and U.K. managed IT services markets are largely the same, with one notable distinction. Education, government, and finance are three universally lucrative funnels in need of the data regulation MSPs can provide. As we discuss in our MSPedia article, HIPAA and the Healthcare Vertical Opportunity, the healthcare industry offers enormous opportunity for MSPs in the United States to grow, due to HIPAA compliance standards for covered entities and business associates. By 2020, IDC Health Insights projects that 42% of all healthcare data created in the Digital Universe will be unprotected but needs to be protected, as use of data and analytics continues to proliferate and more stakeholders are involved in delivery of care. U.K. residents are not subject to this data privacy law. Thus, healthcare IT isn't one of the main verticals that U.K. MSPs target. Nevertheless, managed IT services providers in both territories know where their services are in the greatest demand and craft a plan for how to strategically penetrate these markets and secure accounts. But Even Growth-Driven MSPs Can Still... 1. Face Obstacles to Expansion Scalable Services - In the U.S., the main challenge MSPs are faced with is hiring and retaining talent. Unfortunately, turnover among technicians is high. The main reasons for departure are that the employee wishes to increase their knowledge and certifications, feels overworked and/or undervalued, or thinks they can start their own MSP. This disruption to the staff also disrupts the workflows that these MSPs have established, and those who aren't leveraging a 24x7 network operations center (NOC) or help desk flounder to meet customer demands efficiently and scalably. Then, there's the issue of acquiring new talent when faced with a non-existent budget for new hires. It's difficult to grow your business and take on new clients when your existing personnel can barely manage the volume of demands they're currently receiving. Language Barrier - In the U.K., the problem is a matter of geography. The territory is relatively small, and U.K. MSPs would like to branch out into more of Europe. Between France, Germany, Spain, Italy, etc., however, the region surrounding these practices has business-stalling language fragmentation. Both U.S. and U.K. MSPs are better able to support clients whose first language is English, but the pool of potential candidates is much smaller in the U.K. 2.
Struggle to Implement a Sales and Marketing Strategy When asked how they'd rate their sales and marketing effectiveness, 37% of smaller UK firms that participated in CompTIA's study reported "hit or miss" and "ineffective." Even some of the most successful MSPs continue to need sales and marketing guidance. They possess the technical skillset needed to do their job, but are still learning how to run a business. It may be that they don't know how to recruit a sales team that will take their operation to the next level. Many of these MSPs fail to align their sales and marketing efforts. They shy away from marketing because they don't think they have the time or resources to develop a program with real ROI. As any MSP that works with Navigate 2015 keynote Robin Robins will tell you, investing in a scalable, sustainable marketing strategy pays off for years to come. You have to stay on top of them with relevant, non-salesy content. You have to do it right though! You can't expect the leads to flood in and close on their own. You have to stay on top of them with relevant, non-salesy content. A popular channel that successful MSPs in both the US and UK utilize is email marketing. They send educational newsletters and/or webinar invites to stay top-of-mind with prospective clients. When a lead is captured, the sales team is automatically notified so that a sales person can follow up immediately and move these leads down the funnel. To make sure sales and marketing are supporting one another, you have to have consistent messaging, an established workflow, and continual dialogue. How is Sales notified when a contact fills out a form? You should have this process nailed down. How does Marketing know which content will resonate with its audience? Sales should provide feedback from their conversations. Although successful U.K. and U.S. MSPs have made tremendous strides in the sales and marketing arenas, it continues to be a learning curve for many. ### Financial Services and IBM zSystems Mainframe: Vicom Infinity – Len Santalucia Discussing the benefits of IBM zSystems Mainframe for the Financial Services Industry. As an IBM Premier Business Partner Vicom are a reseller, integrator and MSP of mainframe hardware and software. ### Pulsant acquires the IT consultancy business of Spinnaker Red   Cloud and Colocation expert agrees deal with IT optimisation and transformation firm Pulsant, the cloud computing, managed hosting and colocation expert, today announced the acquisition of the consulting business of Spinnaker Red, a specialist consultancy that provides IT optimisation and transformation services across the complete technology infrastructure landscape.   Pulsant has an established, clear strategy to increase its capabilities, market position and size organically, and through appropriate, targeted acquisitions. In realising its strategy, it has acquired the consulting business of Spinnaker Red to bolster and complement its IT consultancy services, including project management, technical strategy and design and technical implementation. [easy-tweet tweet="Pulsant has acquired the consulting business of Spinnaker Red" user="pulsantUK" usehashtags="no"] Mark Howling, CEO of Pulsant, said: “We are delighted that the acquisition of Spinnaker Red’s consulting business has been finalised. 
As well as enhancing our cloud infrastructure services portfolio, Pulsant is now well-placed to offer really strong expertise in the areas of IT migration, project management and IT transformation to help customers move into hybrid cloud environments. “As a business, we remain focused on offering the highest level of cloud-based services and solutions and Spinnaker Red’s hugely experienced team will enable us to offer even greater levels of value and support to our customers.”  Spinnaker Red’s consulting business has a proven track record, and an impressive reputation in the industry. Together with its key methodologies, tools and processes, the organisation is an ideal partner for Pulsant, as it moves further towards fulfilling its long-term business goals. The two companies have previously worked together on a number of key projects and as part of the acquisition, Spinnaker Red’s consulting employees will be transferred to the Pulsant team. View the original release here: http://www.pulsant.com/pulsant-acquires-the-it-consultancy-business-of-spinnaker-red/ ### Retail spend on Internet of Things to Reach $2.5bn by 2020 New data from Juniper Research has revealed that retailers seeking to capitalise on IoT (Internet of Things) technologies will spend an estimated $2.5 billion in hardware and installation costs, nearly a fourfold increase over this year’s estimated $670 million spend. The hardware spend includes Bluetooth Beacons and RFID (radio frequency ID) tags. In the first instance, Bluetooth beacons enable visibility over footfall as well as the ability to push relevant information to consumers’ smartphones. Meanwhile, RFID aids in real-time asset tracking, reduced labour costs and even dynamic pricing according to stock levels and online pricing. IoT in Retail: A Knot to Tie Product, Customer & Service Together The new research, The Internet of Things: Consumer, Industrial & Public Services 2015-2020 found that leading retailers using the IoT to generate an ‘ecosystem’ are poised to gain market advantage and truly capitalise on the opportunity. Linking the hardware elements of RFID tags, beacons and connected consumer electronics, such as wearables, with software analytics promises in-depth business insight and an enhanced customer experience. “Retailers such as Zara and Target are already taking advantage of the benefits offered by RFID asset tracking” noted author Steffen Sorrell. “Meanwhile the beacon industry is expanding rapidly; used as a method to provide consumers with contextually relevant information in conjunction with their smartphone or wearable will enormously enhance the in-store experience.” The IoT Needs a New Security Model Additionally, Juniper Research found, with the number of connected units within the IoT (Internet of Things) forecast to reach 38.5 billion in 2020, attitudes and methods with regards to cybersecurity will have to undergo fundamental change. Where today’s security is principally focussed on access prevention, the IoT security model will require robust means of identifying inevitable network breaches. Should suspicious activity be detected, parts of the network can then be ‘shut off’ in similar fashion to marine vessel bulkheads to prevent attack spread. [quote_box_center]Additional Findings • 70% of IoT units are expected to be composed of non-consumer devices by 2020. 
• With diverse business models and aims of IoT projects such as service revenue, spend and cost-savings taken into account, Juniper forecasts the IoT opportunity to approach $300 billion annually in 2020.[/quote_box_center] The whitepaper, IoT ~ Internet of Transformation, is available to download from the Juniper Research website together with further details of the new research and interactive dataset. ### The stumbling blocks to cloud implementation – communication is king! Perhaps nothing benefits from hindsight quite as much as enterprise software projects. We’ve all heard horror stories about projects that were abandoned, significantly overran cost and time budgets, were barely used by staff, or failed to deliver on promised benefits for the entire organisation, managers and individual employees. However, the majority of IT project failures aren’t due to the technology, but the inability to effect change and a lack of communication. So here are my top five tips to bear in mind for a successful cloud implementation: [easy-tweet tweet="Successful #cloud implementation tip 1: Carefully select your internal project team" via="no" usehashtags="no"] Carefully select your internal project team: One key end goal for cloud implementations is to enable accurate and sophisticated data reporting. In order to ensure every single employee enters and updates their information in real time, it’s vital that there is buy-in for the new system throughout the entire organisation. An inclusive, cross-departmental internal project team which includes members from HR, payroll, finance, IT, security, training, recruitment and a project manager or business analyst can help to accomplish this goal. [easy-tweet tweet="Successful #cloud implementation tip 2: Include other key communities in your team" via="no" usehashtags="no"] Include other key communities in your team:  Additional project team members are required if the company serves a particular vertical – for example, add in the head of professional services, if you’re a consulting firm. If you’re working with an external systems integrator, make sure one or two of their core experts are on your team and treat them as part of your organisation – that way, you establish a good forum to communicate your organisation’s needs. In addition, their experience from prior cloud implementations will be very valuable. [easy-tweet tweet="Successful #cloud implementation tip 3:Defining roles and keeping the team together" via="no" usehashtags="no"] Defining roles and keeping the team together: Once you’ve selected your team, define roles including a primary contact, a coordinator and an internal rollout and communications lead. If your cloud implementation is taking place in phases around the world, do ensure that your internal project team takes the lead on each phase. Do keep your team intact and move them around your organisation on an as-needed basis since they are becoming experts and learning from each rollout they’re involved in. This is also an important factor in helping drive consistency in cloud implementation across regions. [easy-tweet tweet="Successful #cloud implementation tip 4: Agree on a message" via="no" usehashtags="no"] Agree on a message:  Your implementation team shouldn’t have to argue the merits of the cloud project with other staff. Instead, make sure you already have a clear internal message in place – ideally, a single sentence – as to what your organisation hopes to achieve by adopting the new cloud platform. 
This message is vital to be able to ‘sell’ the entire company on the importance and benefits of the move to the cloud. The best way for every employee to view the move is as a business decision, not a technology decision, where each individual has access to and ownership of the system. You will also need a C-level executive whose role in the project is to champion its benefits at a high level throughout the company. This will also ensure that the move to the cloud is seen as positive throughout the business. [easy-tweet tweet="Successful #cloud implementation tip 5: Communication is key" via="no" usehashtags="no"] Communication is key: There is often a great deal of integration and configuration involved in every cloud implementation. Consider which types and levels of integration are applicable to your organisation both today and in the years ahead. Cloud projects will often talk about unifying and simplifying processes. Be clear on how the implementation will affect each area of the business today and communicate this message to your consultants as well as your own internal teams. The importance of training and communication around a cloud project is often undersold. But getting it right ensures an organisation and each of its departments understand the functionality and benefits of the cloud implementation for their part of the business. Ultimately, this helps all employees to understand the positive impact it will have on their day-to-day work. ### How should CIOs respond to the rise of Shadow IT? There’s no question that cloud computing has become a necessity in both our work and personal lives. Mobile phones, computers, games consoles, cars and watches are all being synced to the cloud, offering new functionality, greater reliability and the ability to expand beyond traditional product boundaries. [easy-tweet tweet="#IT purchased without the permission, control or knowledge of the IT department is known as #ShadowIT" user="jewellMH" usehashtags="no"] Cloud computing now enables a seamless digital working life both inside and outside of the office. Working from home, sharing and storing data, remote capabilities and Bring Your Own Device (BYOD) have all been facilitated by the accessibility and ease of connectivity to the cloud. In fact, some reports have even suggested that cloud technologies are making physical offices obsolete altogether. However, with the average employee now very comfortable with the process of acquiring and using cloud services, the traditional role of the IT department – as a selector and purchaser of workplace technology – is coming under threat. IT purchased without the permission, control or knowledge of an organisation’s IT department is known as ‘Shadow IT’. A decade ago, Shadow IT spending was limited to basic computer accessories or boxed software. But in today’s market, vast sums are being spent by unauthorised employees on cloud-based software and services. For example, it is increasingly common for individuals or teams to purchase their own cloud storage solutions or direct mail platforms to replace existing internal tools. In many ways, this is only natural. For a digitally savvy employee in, say, HR or marketing, it will be second nature to get the tool they need with just a few clicks (and perhaps a company credit card). And, you might ask, if these tools are helping to drive productivity, why does it even matter?
The answer is that, as an isolated incident, it might not. However as a wider trend, Shadow IT can quickly become a significant issue for businesses. It’s been predicted that these types of purchases make up a colossal 40 percent of IT spending – a worryingly high figure considering all of this IT activity is happening outside of an IT department’s control. [easy-tweet tweet="40% of #IT spending is happening outside of the IT Department's control." user="jewellMH" usehashtags="no"] After all, with any purchase, buying individually rather than in bulk results in higher costs. For cloud computing this is very much the case, especially if the cloud services purchased are supplementing existing solutions provided by the company. Furthermore, using cloud services that have not been approved by IT can put business data at risk. By using third party cloud solutions without reading the small print, employees can’t be sure about what is happening to their data once it leaves their sight. Lastly, without input from the IT team there is likely to be little attention paid to the SLAs provided by cloud providers, compliance with government regulations and the plethora of other issues that businesses must consider before handing over money and data to third parties. However, with Shadow IT not likely to disappear any time soon, it seems clear that the CIO must find ways to adapt. A whopping 96 percent of UK CIOs say that business units have bought cloud services without the IT department’s involvement A recent CIO survey commissioned by Brocade, examining the views and experiences of over 200 CIOs from China, France, Germany, Russia, UK and the US, found that the rise of shadow IT is already causing the CIO serious pain. A whopping 96 percent of UK CIOs say that business units have bought cloud services without the IT department’s involvement, despite only 20 percent of organisations saying that this is permitted. This is an alarmingly high figure but gives definite colour to why 87 percent of UK CIOs are concerned about their own job security. Furthermore, the rise of cloud and shadow IT is also putting strain on the relationships that CIOs hold at board level, particularly in the UK. 32 percent of UK CIOs say they have a poor relationship with their CEO, compared with just 14 percent globally, and 40 percent say the same thing about their CFO, compared to 15 percent globally.  As cloud and shadow IT further ingrains itself into everyday working life, it is clear that CIOs need to act quickly to avoid a complete communications breakdown with business leaders. CIOs need to address the fact that cloud computing is becoming the new default model for purchasing and consuming IT services. [easy-tweet tweet="In today’s world of consumerised #IT, the old ‘command and control’ approach simply is not feasible" via="no" usehashtags="no"] The answer for businesses is not simply to clamp down on cloud services and try to force employees back on to the reservation. In today’s world of consumerised IT, the old ‘command and control’ approach simply is not feasible. Users want to use the technology that will allow them to do their job and are not prepared to sacrifice productivity just because it is corporate policy. Instead, a new approach to IT is required, one that prioritises flexibility and agility and that makes the official, IT-approved tool also the easiest and simplest one to use. It is only by doing this that IT teams can regain control and visibility over the business’ IT spending. 
### Is New Car Tech Putting Drivers at Risk? There is no doubt car technology has made a huge difference to safety, comfort and efficiency over recent years with the driverless car on the horizon. Even a fairly modestly equipped car includes the sort of safety and comfort equipment barely imaginable on cars of a decade or so ago. There is a possible downside though - the amount of tech can actually increase the risks of driving if misused. [easy-tweet tweet="In-car tech can improve the on-road experience, but is it making us bad drivers?" via="no" hashtags="cartech "] Sat nav A huge advance in helping motorists find their way, ‘sat navs' have helped road safety in many ways. There’s no more need to glance at maps, or handle cumbersome navigation books, and backseat drivers have been nullified. The screen shows at eye level and the vocal directions mean it’s easier to concentrate on the road. There again, there are risks if insufficient care is taken in their use; the danger is in slavishly following the directions. Sometimes errors are made by the sat nav such as directing drivers the wrong way down one ways streets, saying ‘turn right’ when it’s a no right turn junction and, of course, the episodes where drivers find themselves in a field or down a narrow track. [easy-tweet tweet="Being a slave to your sat nav can leave you up the creek" via="no" hashtags="satnav, roadsafety, cartech"] It’s important to keep a sense of awareness when using the sat nav. There can also be a tendency to lose focus on actual road safety and observation if too much attention is given to the sat nav and the screen. Driving in these cases becomes, instead, like a video game and it’s easy to lose focus of the real world dangers around. There may even be a section in the driving test soon regarding sat nav use - it’s been piloted in Scotland - so along with preparing for the driving theory test brushing up on sat nav operation may be required. This is probably the key to making the positives outweigh the negatives. By building sat nav systems into the natural process of learning the laws of the road, learners will be able to treat them as a natural part of the in-car operation. Anti skid, stability control and anti lock brakes Many cars offer as a minimum anti lock brakes (ABS) which help stop brakes locking up, and there are many more aids such as ESC (Electronic Stability Control) and various anti slip and anti skid features. It will soon be a reality that new drivers will have to train to use the technologies built into their driving experience The problems begin when drivers are irresponsible and rely on these safety aids to get them out of trouble. If corners are taken too quickly or risks taken in wet or slippery conditions, it would be folly to rely totally on the safety kit to save the day. These aids certainly make cars safer, especially in adverse conditions such as snow and ice, but sensible driving is still required and it’s important that drivers don’t think it’s safe to take more risks as a result. Speed awareness Thanks to higher levels of refinement and technical improvements to suspension, noise levels and vibration it’s easy to lose track of the speed being driven. Cars are far smoother to drive and spend time in than before, so the sensation of speed is vastly reduced. 
Drivers who learned the ropes in a ‘rough and ready’ old banger must adapt their skills to be able to recognise what driving at each speed ‘feels like’ or it can become easy to pick up speeding tickets or, worse, be involved in crashes. Electric cars and hybrids Electric cars and hybrids (when not using the engine) run close to silent, so they can be dangerous to pedestrians unable to hear that there is a car nearby. As a driver, it’s important to be aware that pedestrians and even other road users such as cyclists may not have heard you. It’s easy to forget – especially if you’ve been used to driving a car that made plenty of noise before. New tech benefits overall In the end, new car tech has been very positive and no doubt makes cars safer, more efficient and comfortable - but there are drawbacks. Ultimately, there is still no substitute for sensible driving practices. That’s why it is vital to ensure new drivers learn the rules of the road and build in new technology into their learning process. ### Businesses may be ready for cloud, but is it useful to them? Despite common perceptions, cutting costs isn’t the primary reason businesses are choosing cloud these days. The other major advantages are the agility and scalability cloud brings, enabling organisations to quickly respond to business demand. The combination of benefits is driving both IT and lines of business to rely on cloud to serve as a foundation for innovation and enablement. [easy-tweet tweet="Businesses may be ready for #cloud - but the lack of transparency is limiting how useful it is to them"] But the advantages of cloud cannot be fully harnessed if transparency into the environments is compromised. Clouds that limit visibility result in significant operational and financial issues, including performance problems or outages, challenges reporting to management, and unexpected bills. In fact, challenges with transparency restrict 63% of organisations from growing their cloud usage. That’s according to a recent global survey conducted by Forrester Consulting that we commissioned. The survey sought insights from 275 IT executives and decision makers who are experienced cloud customers. When it comes to data about cloud environments, what are organisations looking for from their providers? Clearly security and compliance information is important. Worryingly, 39% of those surveyed said they lacked security data and 47% said they lacked compliance data. Not surprisingly, the majority said they needed on-demand access to necessary reports to make compliance and audit processes easier. [easy-tweet tweet="Challenges with transparency restrict 63% of organisations from growing their #cloud usage"] That said, on-demand reporting technology only goes so far, and many respondents wanted suggestions and/or support from experts on staff at the cloud provider. In light of evolving security risks and corporate compliance concerns – especially as lines of business adopt cloud without IT involvement – cloud providers need to simplify the process for ensuring advanced security and compliance in the cloud, not get in the way. Beyond security and compliance, performance information, historical information and clear details about costs and upcoming bills are also key. Without this, businesses find it hard to plan for or meet the needs of their end users. It also makes it extremely difficult to budget properly. 
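Where a provider does expose usage and billing data programmatically, even a short script can give IT and finance the month-by-month view that respondents said they were missing. As one hedged illustration - it assumes an AWS environment with the Cost Explorer API enabled and credentials already configured, and other clouds have their own equivalents - something like this pulls monthly spend:

```python
import datetime as dt
import boto3

def print_monthly_spend(months=6):
    """Print unblended monthly cost for roughly the last `months` months via AWS Cost Explorer."""
    end = dt.date.today().replace(day=1)
    start = (end - dt.timedelta(days=months * 31)).replace(day=1)
    ce = boto3.client("ce")  # requires ce:GetCostAndUsage permission and configured credentials
    response = ce.get_cost_and_usage(
        TimePeriod={"Start": start.isoformat(), "End": end.isoformat()},
        Granularity="MONTHLY",
        Metrics=["UnblendedCost"],
    )
    for period in response["ResultsByTime"]:
        cost = period["Total"]["UnblendedCost"]
        print(f'{period["TimePeriod"]["Start"]}: {float(cost["Amount"]):.2f} {cost["Unit"]}')

if __name__ == "__main__":
    print_monthly_spend()
```

None of this replaces clear reporting from the provider, but it does mean unexpected bills show up as a trend on a chart rather than as a surprise invoice.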
organisations need to understand the performance of a cloud service to get the most from it Just like with their own servers, organisations need to understand the performance of a cloud service to get the most from it, whether that means making sure resources are running properly, anticipating potential issues or preventing wasteful “zombie virtual machines.” Due to a lack of transparency from their cloud providers, more than a third of the respondents in the survey ended up with bills they hadn’t expected and 39% found they were paying for resources they weren’t actually using. [easy-tweet tweet="Is your business working on preventing wasteful #zombie #virtualmachines? " via="comparethecloud" hashtags="VMs, cloud"] Cloud customers can use data to make better purchasing decisions. Clear information from a cloud provider will help companies discover where they need more resources, or even where they can regain capacity and maximise their spend. Once again though, beyond the on-demand data, customers require solid support to ensure they are getting what they need from cloud. In the survey, 60% of respondents said that problems with support were restricting their plans to increase their usage of cloud. Issues like slow response times, lack of human support, lack of expertise of the support personnel and higher-than-expected support costs started with the onboarding process and only continued. Aside from preventing customers from reaping the benefits of cloud, these issues leave businesses feeling that they’re seen more as a source of revenue than as a valued cloud customer. cloud customers should not settle for cloud services that limit visibility into the cloud environments When it comes down to it, cloud customers should not settle for cloud services that limit visibility into the cloud environments. Compromises in transparency mean sacrifices to very agility, scalability and cost benefits that drive organisations to cloud in the first place. And beyond transparency, customers should not underestimate the human element of cloud. A cloud provider’s customer support plays a huge role in speeding return on cloud investment, and ultimately, in determining success and failure of a cloud initiative. As the Forrester study states, “Whether you are a first-time cloud user or looking to grow your cloud portfolio, our research shows that your chances of success are greater with a trusted cloud provider at your side — one that gives you the technology and experts to solve your challenges.” You can read more about the survey findings in the study, “Is Your Cloud Provider Keeping Secrets? Demand Data Transparency, Compliance Expertise, and Human Support From Your Global Cloud Providers.” ### Security concerns in the cloud? 3 steps to minimise risk Despite the increasing adoption of cloud-based solutions in the UK&I, due to security concerns some medium sized enterprises which could benefit from access to these services are dragging their feet when it comes to implementation. Let's look at 3 key precautions that can be taken to minimise risk and ensure that businesses can benefit from the new generation of flexible and cost effective cloud delivery models. For the huge number of businesses with 250+ employees, cloud services offer more than just storage for data – for the first time these businesses have access to applications and communications technologies previously out of reach due to prohibitive infrastructural costs. 
The move from CAPEX to OPEX IT models enabled by cloud is indeed revolutionising the market. [easy-tweet tweet="The move from #CAPEX to #OPEX #IT models enabled by #cloud is revolutionising the market" via="no" usehashtags="no"] The scalability of cloud-based services makes these applications affordable. The pay-as-you-go model is perfect for running a lean operation with no wasted capacity or expenditure on under-utilised equipment. But businesses need to make sure they move securely and with the partner that suits their needs. The pay-as-you-go model is perfect for running a lean operation Raised concerns Data security seems to dominate much of the discussion about the future of the cloud. This is particularly prevalent in UK&I where Snowden, the NSA revelations and recent high-profile hacking scandals have entered the popular psyche and have raised questions about the security of the cloud. Public clouds, which are owned and operated by third party service providers are coming under increased scrutiny: Just who has access to the data? Where is the data located and in which jurisdiction? Is the equipment used capable of keeping the data secure and segregated? Cloud providers must be able to answer these questions and more. There are providers that can provide secure cloud services and they often go beyond the level of protection that the SME could provide themselves. There are three key considerations that businesses should take into account when looking for a secure and trusted cloud provider: location, reputation and equipment. [easy-tweet tweet="3 key considerations for finding a #cloud provider: #location, #reputation and #equipment" via="no" usehashtags="no"] 1. Data centre location, location, location The location of a data centre is the deciding factor for which laws and regulations protect the stored data, and which governmental bodies have the right to access that information. Here in the UK&I we have some of the most comprehensive regulations concerning the handling of private data and while this is reassuring for many businesses, some enterprises may start to struggle with data residency or sovereignty issues as they look to embrace cloud solutions which host the data in another country. But in response to these concerns, it may pay to opt for a UK-based cloud provider whose data centres reside solely inside the nation's boundaries. For example, we partner with CentriLogic in the UK to host our OpenTouch Enterprise Cloud UCaaS offer. Already a major data centre provider across North America, in the UK CentriLogic hosts all data and infrastructure in its state-of-the-art data centre located in Bracknell. 2. Your service provider is only as good as the company it keeps Who does your prospective provider partner with? This is another key consideration for businesses, not just for present support and security, but for future-proofing your cloud applications. Have a look at the service provider's technology partners and ensure that it is partnered with recognised industry leaders. Not only is this a good indication of the company's reputation, but it also means that it will have extensive technological support in the future, and this will keep you up to date with technological advances in both security and service delivery. It's essential that you check out your service provider to ensure that its reputation is good – and shown to be good by the companies that partner with it. 3. 
The quality of equipment will impact the quality of service and level of protection Linked to the question of who providers partner with is the importance of the quality and interoperability of the equipment they use. If cloud providers are not equipping themselves with the very best technology, then they are not going to be able to offer customers the best services or the best user experience within a secure environment. Are they offering services which make the most of open architecture? Cloud providers should be looking to support a best-of-breed approach to services in a vendor-agnostic environment, and not be locked into providing solutions from single vendors. This becomes critically important as the list of X-as-a-Service offerings grows, as it puts providers in a position to offer leading solutions in areas such as unified communications. A hosting environment which can support interoperability will be able to offer its customers the best range of services of the best quality. Dependability is key Local survivability is a crucial part of any distributed network A cornerstone of a secure system is dependability, and delivering applications or communications technology via the cloud places certain demands on the system. Local survivability is a crucial part of any distributed network, but not all cloud providers can offer the service continuity that mission-critical processes need in the event of the WAN going down. The same can be said for vendors offering bespoke services. Can they define DNS, firewall and security settings per customer? For any organisation looking to opt for a cloud solution, security and application integration are key to success. [easy-tweet tweet="For any organisation looking to opt for a #cloud solution, #security and application integration are key to success" via="no" usehashtags="no"] To cloud or not to cloud – or to do both? Businesses need to weigh up the need for certain business applications against the costs of deploying on-premise infrastructure, or indeed opting for a hybrid solution with some applications in the cloud and some on-premise. But if the cloud is right for your business and you want to benefit from the low TCO and advances in flexible deployment, it is important to bear in mind that not all cloud service providers are equal and the services they offer should reflect the twin business needs for security and service delivery. ### Top 10 Startups to Watch in September It seems that 2015 is passing us by incredibly fast. We are already almost at September, and as we ready ourselves for the last stretch of the year, a number of startup companies are bubbling in the background. These startups are developing, showcasing and seeking investment quickly to get their 'next big idea' out to market, and some are already gaining considerable traction. [easy-tweet tweet="September's #startup watch list on @comparethecloud from @katewright24" via="no" usehashtags="no"] Below are some of the startups that I believe are going to end 2015 on a high and hit the ground running in 2016: 1. Bla Bla Car We have moved into an age where the 'sharing economy' has taken the world by storm. Companies such as Uber and Airbnb have seemingly taken over the world in a short space of time. Having seen how quickly these companies have grown and turned a profit, I have no doubt that Bla Bla Car will do the same. Bla Bla Car is a way for people to get from A to B by car, travelling with people who want to make the same journey. 
Instead of having to drive alone, or pay for expensive train/plane/bus tickets, you can hitch a ride with someone going the same way. Alternatively, you can be the one offering the ride and post it on the site. Passengers and drivers are rated by their fellow travellers. It's cost-efficient and better for the environment. With this in mind, it won't be long before you're hitching a ride with Bla Bla Car to the airport, only to stay in Airbnb accommodation while you're away and use an Uber taxi for your sightseeing. Welcome to the new age of the sharing economy. 2. Podo For those of you who have (shamelessly) bought into the 'selfie stick' craze, there is a better alternative out there. Podo's tagline is "stick and shoot" - and that pretty much sums it up. It is a small camera that you can stick on almost any kind of surface to shoot photos or videos. It has Bluetooth, which connects it to your phone. You use the Podo app to set a timer, and the pictures and videos captured automatically load onto your phone, from which you can share them on all social media platforms. It allows you to be more creative with the pictures and videos you take, without the limits of the length of a selfie stick (and the one arm required to hold it!) 3. Kymira For anyone who is sporty - whether at a professional or semi-professional level, or just a regular at the gym - this could be an interesting product for you. Kymira is sports apparel that recycles the body's wasted energy to enhance performance Developed by graduates at Reading University, Kymira is sports apparel that recycles the body's wasted energy to enhance performance and accelerate the recovery process. The fabric has been forged with minerals that harness the energy your body produces and then convert it into infrared, which is delivered back to your skin through the fabric. This penetrates deep into the muscle tissue and delivers more oxygen to the muscles, thus enhancing your performance. It may all seem very "sciencey" but the product has been tried, tested and patented - and is already revenue-generating. Aside from allowing you to perform at your maximum potential, it is also thermo-regulatory and helps your muscles recover faster. I'm sure all of us can sympathise with the three-day ache after just one gym session. 4. Try.com Try.com gives you the opportunity to "try clothes from your favourite online stores for free. No, seriously." We've all paid for clothes online and then suffered the disappointment when what you ordered is not what you had in mind. Then you have to go through the hassle of sending it back and being out of pocket until the money is refunded. With this website you can order clothes to come to you for free, and then if you don't want them you just send them back within 10 days at no cost to you. With the likes of Zara, J Crew and Reformation on board, try.com is already rising in popularity and there is huge scope for growth. 5. RefME I graduated from university two years ago and the painful memories of Harvard referencing have not faded. RefME is a website and app that students can use to carry out referencing on their behalf. You create your projects/assignments/dissertations on your account and then start to add your references along the way. You can use your phone camera to scan the barcodes of books and journals, or give the application a website and it will format the reference for you. RefME is configured with more than 7,000 reference styles. Gone are the days of losing marks for incorrect referencing. 
Hooray! 6. Rormix For those of you who consider yourselves a bit of a 'music snob', Rormix is your chance to be ahead of the game - it is kind of like a Tinder for music lovers. The app provides a platform for emerging artists to market their music videos. When you download the app you set your criteria by the types of music/artists you are interested in. Rormix will then recommend music to you within those criteria. The app will produce 15-second clips - if you enjoy the clip then 'swipe to like', and if not you can skip on to the next one. The app is available on iOS, Android and Firefox and is free to download. Happy exploring! 7. Cocoon Cocoon has been developed as a new and fresh home security option. It is a small gadget with a siren, HD camera, microphone and motion detector. It is WiFi enabled so it can run through your own WiFi router. The device learns what is normal in your home and alerts you when something isn't right. You have the Cocoon app on your phone and it sends you real-time notifications. It's even pet-friendly, so have no fear that your hyper puppy will set it off! 8. Skully The team at Skully have designed a smart motorcycle helmet to ensure safety and functionality on the road. Essentially the team has taken all of the devices and systems in a car and integrated them into headgear. The Skully helmet includes a 180-degree blind-spot camera, heads-up display, situational awareness, GPS navigation and Bluetooth connectivity. In terms of safety, the heads-up display shows critical information within your line of sight so you can keep your eyes on the road. Also, the blind-spot camera ensures that you are fully aware of all your surroundings, avoiding any potential accidents. From a functionality point of view, the Bluetooth allows you to use your phone hands-free, whilst the GPS navigation means that you never miss a turn. As we move into the new era of driverless cars, there is potential for Skully technology to communicate with these cars. The future of this product is looking extremely exciting. 9. Blaze In relation to road safety, Blaze has redefined the bike light for cyclists on the road. The powerful laser light projects a bike symbol six metres ahead of you. This gives the cyclist a much larger footprint on the road and allows them to be seen by other vehicles. The best part about it? It has all angles covered. Whilst current bike lights will only flash for the back or front of the bike, the Blaze Laserlight shows the whole bike, allowing for great visibility. The light can be charged via USB, is waterproof and durable, and fits on 99% of handlebars. What more could you want? 10. JOBDOH JOBDOH is looking to redefine the industry of temporary recruitment. It is a mobile platform where employers can load jobs and temporary workers can book themselves in. Most of the roles are within the hospitality industry and are an opportunity for students and other part-time workers to earn cash by picking up shifts. There are thousands of recruitment agencies out there doing the same thing, but this eliminates the middle man by putting employers directly in contact with workers through an app. Let's be honest, no one really likes dealing with recruitment agents, and this allows quick and easy bookings. Currently the site is up and running in Hong Kong and is proving to be successful, with the hope of expanding globally. 
Let's be honest, no one really likes dealing with recruitment agents It will be interesting to track these companies over the next 12 months. Some are still in their infancy, whilst others have already started turning a profit. Seeing the vast range of start-up products and services that are emerging, and the innovative technology that drives them, it seems that now is the time to be investing in startups, or even to be bold enough to get your 'next big idea' out into the market. ### Five Clarified Misconceptions About Cloud It's clear that cloud technology has become a mainstream solution for many businesses and individuals. Despite its incredible popularity, the cloud is still often misunderstood. Users who want to make the most of cloud computing should have a clear idea about how it works, what its benefits and costs are, and what kind of problems it solves. Here are the five most persistent misconceptions about cloud computing, clarified to help you understand the technology and use it to your advantage. 1. Infrastructure location and ownership [easy-tweet tweet="#Hostedservices have always allowed enterprises to use data infrastructures of others" user="comparethecloud" usehashtags="no"] This is a mix-up deriving from a lack of understanding of the differences between the public and private cloud. Hosted services have always allowed enterprises to use the data infrastructure of others – mostly as a way to cut costs. This holds true for all public cloud computing environments. Private cloud environments, on the other hand, provide more options. Enterprises opting for a private cloud can use someone else's infrastructure, but can also choose from a host of potential ownership and location configurations exclusive to private cloud environments. [quote_box_center] Among such options you'll find: 1. Enterprise owns hardware, and it's located in a data centre (previously this was called co-lo, or colocation) 2. Provider owns hardware, and it's located in a data centre (otherwise known as "dedicated" infrastructure because the enterprise won't share it with any other businesses) It's worth noting that the public cloud follows this configuration as well, but differs in the fact that the hardware is shared among many enterprises, while hardware in a private cloud is only accessible by one organisation. 3. Enterprise owns hardware, and it's located on-site (for instance, in a data centre at your office). [/quote_box_center] 2. The public cloud can be easily breached Yes, the public cloud is publicly accessible, but this doesn't mean that every single user on the public internet will be able to access your infrastructure. Many enterprises worry about the issue of cloud security, and according to experts this concern has effectively curbed the adoption of cloud technologies in many sectors. Still, in order to stay secure in the public cloud, you should remember that there's a shortage of highly trained professionals with the skills to secure cloud applications. That's why, in general, public cloud environments stand a relatively higher chance of being configured inappropriately and exposed to breaches. Such mistakes are more difficult to make with private data centres. Of course, neither is fully secure against breaches. 3. Payment methodology One of the most important advantages of the cloud is the fact that there's just one monthly payment which can be easily incorporated into an enterprise's operating budget. 
It doesn't matter which environment you choose: costs will stay consistent month-to-month in public, private or hybrid cloud, allowing enterprises to improve their technology budgeting and planning. And yet, many enterprises are afraid that the cloud might generate additional costs. Before cloud computing, enterprises would pay a stable fee for hosted services as well, but that fee didn't cover the sort of infrastructure which accompanied these services. When paying for cloud environments, you're paying for both services and infrastructure, with no capital investment required. 4. Cloud providers are spying on user activity Next to cloud security, this is a classic fear of both enterprises and individual users. Everyone values their privacy – and cloud providers know it. It's safe to say that the entire cloud computing sector could collapse if someone caught a provider snooping around their customers' files. What providers focus on is building efficient security mechanisms which guarantee that data won't be accessible to anyone – not even them! [easy-tweet tweet="Enterprises often feel that relying on #outsourced #IT infrastructure is a fast way to lose control over #data. " via="no" usehashtags="no"] Enterprises often feel that relying entirely on outsourced IT infrastructure is a fast way to lose control over data. But this is another misconception about cloud computing – just because they're delegating the task of managing their data infrastructure to another party doesn't mean that it will move beyond their control. 5. The cloud can be scaled automatically This is a misconception which can seriously affect enterprise growth. Cloud doesn't guarantee the possibility of scaling up every single aspect of company operations. It's true that pure computing power can be scaled up most of the time, but the same doesn't hold for enterprise applications, for instance. If you expect your venture to grow, you should make sure that applications are developed to allow scalability in the cloud. Such apps should be modular, with their functionality split up into independent components, enabling you to manage, configure and maintain more servers should the app require them. adoption of the cloud addresses more than just technological enterprise needs The confusion around cloud computing concepts is a serious problem which affects the adoption rates of this cutting-edge solution. Industry jargon and the lack of consistent information available on the web make it even more difficult. As soon as managers realise, however, that adoption of the cloud addresses far more enterprise needs than just technological ones, they'll be ready to delve deeper into the cloud and discover that all the issues listed above are pure myths. ### Maxava awards IBM i funding to 34 groups Maxava is thrilled to announce that, having again received a record number of applications, the Maxava iFoundation has awarded funding to 34 groups to assist and strengthen the IBM i community. "The contribution of these not-for-profit groups to the IBM i community is invaluable and we are delighted to support their efforts in maintaining and growing a strong, mutually supportive network of IBM i professionals," says Allan Campbell, CEO of Maxava. the Maxava iFoundation has awarded funding to 34 groups, to assist and strengthen the IBM i community Established in 2011 to help IBM i user groups in any part of the world, the Maxava iFoundation has made significant contributions to grow and strengthen the IBM i community. 
Each year, the program offers grants to qualifying organizations that are focused on promoting and developing the IBM i community by upskilling and recruiting more business professionals to work with the IBM i platform. “It is so wonderful that Maxava has continued this effort.  It truly does make a big difference” says Laura Ubelhor, President of the Southeast Michigan iSeries User Group (SEMIUG). She continues, “The grant this year will once again help fund our development sandbox, used to develop the skills of both group members and students via various hands-on activities.” Campbell, happy to see the continued vibrancy of the user community, as evidenced by the continuing increase in the number of applicants year on year, comments, “It is particularly encouraging to see the focus from some of the groups on vocational training for new talent in the IBM i community.” [quote_box_center]President of New England Midrange Users Group, Dick Ferrara describes how the iFoundation grant would help NEMUG, “Over the past few years we have continued to flourish, thanks to the grant from the Maxava Foundation”. Ferrara says the grant would allow the group to bring in high caliber speakers which increases attendance at meetings, explaining, “This allows us to provide a deeper level of education for our member companies and other interested parties". And Charles Guarino, past President and current board member of the Long Island Systems Users Group LISUG adds, “Offering educational opportunities helps members keep their skill sets viable and relevant in today’s rapidly moving business environment.”[/quote_box_center] Maxava iFoundation’s newly appointed Grant Coordinator for North America and EMEA, Mariah Hartman, is based in Doylestown, PA.  Hartman is enthusiastic about the Maxava iFoundation community; “It’s really exciting to talk to all the groups and discuss their new plans for the 2015-2016 year.  We’ve recently set up a Maxava iFoundation Facebook page and I’m hoping groups globally, will share their success stories through this page.” “Maxava is committed to continuing to look for ways to enhance the IBM i community and the broader promotion of the platform’s relevance for modern, critical, commercial applications, especially in the Cloud,” Campbell says. “We will be looking to see where else we can lend support in promoting the IBM i in future – including within the wider IBM whenever possible”. Maxava iFoundation grants can be used to support activities such as educational conferences and workshops, speaker expenses, marketing and educational collaboration with local colleges and universities and other, similar growth activities. ### The third platform will challenge CIOs to adapt or fail Today, in the information age, the CIO reigns supreme in the world of corporate IT. From small businesses to multinational monopolies, regardless of their IT infrastructure, the CIOs manage their IT empires as insulated hierarchies with no opportunity to cooperate with other departments. However, as cloud computing technology develops, invaluable tools such as data analysis, seamless mobility and instantaneous social networking are transforming the third platform from a business-conscious convenience to an essential part of any businesses’ IT management structure. A force for information collaboration Even as you read this, third platform technologies continue to evolve Even as you read this, third platform technologies continue to evolve. 
Individual departments are better equipped than ever to take control of their cloud solutions and increase their own productivity. With other departments becoming more independent, the IT department has to develop accordingly or face the prospect of redundancy. The most effective way to do this is by adopting a more cooperative model for the department, one that suits the adoption of cloud computing. The cloud is open to every aspect of a business because of its simplicity. This means that many newcomers to IT management may not be experts on programming languages, and some IT managers may have never entered a line of code in their lifetime. This very idea might seem sacrilegious to veterans in the field, but these newcomers offer something that is just as valuable in its place: the ability and expertise to assess the benefits modern cloud computing software can bring to the table for every part of a business. [easy-tweet tweet="45% of the 900 participating IT managers across Europe in Barracuda's survey plan to utilise public cloud" via="no" usehashtags="no"] Get on board with a data-conscious approach It's only a matter of time before companies adopt public cloud solutions, as more and more businesses discover the cloud's ability to keep costs down and offer a competitive edge against rivals. In Barracuda's recent survey, 45 percent of the 900 participating IT managers across Europe plan to utilise public cloud to enhance their business in the near future. The survey also identifies the two areas that can benefit the most from public cloud usage: distributed working processes and team-working technologies. Both areas require the CIO to work alongside specialist departments whilst maintaining an outstanding knowledge base, so they can better understand how to improve each department's performance. The CIO brings their specialist knowledge to the table too, working with other company branches to ensure bring-your-own-device (BYOD) and mobility technologies are running at peak performance through the cloud. In short, the established role of the CIO is becoming obsolete as businesses invest in the cloud. As companies become more aware of the advantages of the cloud, the pressure on the CIO to embrace this new technology will only increase, so action needs to be taken now. The CIO needs to continue as the driver of IT projects by taking ownership of these technological advances and successfully integrating them into the business. [easy-tweet tweet="The established role of the #CIO is becoming obsolete as businesses invest in the #cloud." via="WielandAlge" usehashtags="no"] Trust no one The added flexibility of cloud solutions means that businesses can meet the demand for IT resources quickly and without the need for advanced development procedures. The new challenge for the CIO is to remain vigilant at all times, constructing individual corporate networks so that performance concerns do not override security considerations. With more users active in the cloud, and without proper management, it can become vulnerable to unauthorised and even malicious activity. This high level of suspicion is essential when using any kind of cloud solution, and will have to be built into the very infrastructure of the business in order to protect its data and ensure a fast-acting and effective response. Every detail must be scrutinised through the implementation of intelligent security gates, with any suspicious acts prevented and pursued. 
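What such a 'security gate' looks like in practice will vary widely, but the underlying idea is straightforward: every sensitive action is evaluated against who is asking, from where, and on what device before it is allowed to proceed. The snippet below is a deliberately simplified, hypothetical policy check - the roles, network ranges and device-posture flag are invented for illustration, not drawn from any particular product.

```python
import ipaddress

# Hypothetical corporate/VPN ranges and privileged roles, purely for illustration.
CORPORATE_NETWORKS = [ipaddress.ip_network("10.0.0.0/8")]
PRIVILEGED_ROLES = {"cloud-admin", "security"}

def allow_request(user_role, source_ip, device_compliant, action):
    """Return True only when role, network and device posture all pass for the requested action."""
    on_corporate_net = any(ipaddress.ip_address(source_ip) in net for net in CORPORATE_NETWORKS)
    if action.startswith(("delete:", "admin:")):
        # Destructive or administrative actions need a privileged role,
        # a managed device and a trusted network.
        return user_role in PRIVILEGED_ROLES and device_compliant and on_corporate_net
    # Routine actions still require a compliant, managed device.
    return device_compliant

print(allow_request("developer", "10.1.2.3", True, "read:reports"))   # True
print(allow_request("developer", "203.0.113.5", True, "delete:vm"))   # False
```

Real gateways add logging, anomaly detection and step-up authentication on top, but the principle of checking every request against explicit policy is the same.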
Join the crew or go down with the ship The CIO working in a cloud enhanced environment will work across every department to ensure that the business’ IT solutions are being utilised correctly, becoming involved in projects in every part of the company. In this way, the cloud not only threatens traditional IT positions, but traditional IT departments themselves. For CIO’s, the role is likely to become a more strategic position, focusing on the implementation of IT and services for the company as a whole. As the hierarchy changes it is up to every IT leader to ensure their place is secured, not hindered, by the third platform. ### Introduction to Vicom Infinity - Len Santalucia Introduction to Vicom Infinity's CTO, Len Santalucia. As an IBM Premier Business Partner Vicom are a reseller, integrator and MSP of mainframe hardware and software. [easy-tweet tweet="Learn more about Vicom Infinity from their website: http://www.vicominfinity.com/"] ### Why support is still key to cloud customers With vast numbers of companies currently migrating to cloud, we are on the edge of a brand new age of IT that allows for greater agility, value and speed with minimal disruption to business. However, the change hasn’t come about without its fair share of problems. As highlighted in the recent global survey iland commissioned Forrester Consulting to conduct, support is still a hurdle that many cloud providers need to get right for their customers. support is still a hurdle that many cloud providers need to get right for their customers Support is always a sure-fire hit when asking questions about technology. Rarely is support an area where customers feel comfortable, perhaps because our industry is remarkably bad at support (or perhaps, overly cost-conscious?), or perhaps because any frustration with the product manifests as a support issue. But, while it feels like asking about support is like shooting fish in a barrel, the survey did deliver some fine insights - and one troubling one. The survey polled 275 IT executives and decision makers that had significant experience as cloud customers. Perhaps not surprisingly, 50% of respondents were unsatisfied with support and on-boarding. And, one in five had EACH of the following problems – which meant many had multiple problems: On-boarding took too long (26%) On-boarding lacked a human aspect (21%) Lack of expertise of the support personnel (20%) Disappointment with time to resolution (22%) Lingering support issues (19%) Support costs higher than expected (18%) [easy-tweet tweet="60% of respondents felt that support was a barrier to growing their cloud footprint." hashtags="cloud"] But the kicker was this – 60% of respondents felt that support was a barrier to growing their cloud footprint. Now this isn’t shocking – but let’s pick apart what it means. Typically, astute customers don’t need support to keep spinning up boring VMs, over and over. They need support to do something new or interesting or innovative. Pulling back from cloud – and the flexibility and power it provides – is stifling innovation. [quote_box_center] Dante Orsini, SVP of business development at iland said: “As a cloud company, we’re always evaluating our service portfolio. So, we embarked on this survey to better understand where our services could be bolstered and whether customers value the additional benefits we include in our service, like our complementary 24x7 phone support,” said Dante Orsini, SVP of business development at iland. 
“The results have guided our product management team to double-down on what emerged as high-value relationship building elements; while, prioritising further enhancements including compliance, on-demand reporting, greater data transparency and more.” [/quote_box_center] If that’s the sort of growth that’s being inhibited, we’re all in big trouble. Forrester put a lot of good suggestions in the survey – which I encourage you to read – but a primary one is this:  Test drive support alongside your cloud proof of concept. Why not call the support line, and see how those guys respond to your questions? Can you even access a support line? Does someone respond quickly? Can you partner with that person for the next year of cloud deployments? [easy-tweet tweet="The quality of the cloud vendor’s support is knowable and testable." hashtags="cloud"] The quality of the cloud vendor’s support is knowable and testable. Before you choose a cloud provider, make sure you can easily access their technical experts to help quickly address any questions or issues. And read the fine print to make sure you won’t incur significant additional fees when you contact support. The findings and recommendations are published in the Forrester Consulting study, “Is Your Cloud Provider Keeping Secrets? Demand Data Transparency, Compliance Expertise, and Human Support From Your Global Cloud Providers.” It can be accessed here. ### More Cowbell? Cloud and Mitigating Risk There is a famous Saturday Night Live sketch where Will Ferrell adds “More Cowbell” to a track being produced by Christopher Walken. It was voted as one of the top ten skits that the show ever produced. But what does it have to do with Cloud computing? For many CIOs that are still considering their move to the Cloud – or weighing up whether they should commit even more of their IT to Cloud platforms – there is a perception of risk. In the UK, IT and cloud provider 2e2 was a significant Cloud operator that went bankrupt in 2013. The company had been worth hundreds of millions of pounds, yet it came crashing down quickly when it could not make payments based on its debt interest. For companies that had trusted their services to the 2e2 cloud, this represented a huge shock. For the wider industry, it struck at the very heart of the Cloud model. However, there are lessons that can be learned from this and from other IT service providers that have run into financial trouble. [easy-tweet tweet="It is not as simple as stating that the solution to Cloud risk is More Cloud." user="mastersian" hashtags="morecowbell"] Now, it is not as simple as stating that the solution to Cloud risk is “More Cloud.” However, Cloud can offer an appropriate and cost effective route to protecting against company failure as a cause of downtime. Cloud Mitigation can and should begin by understanding how and where your assets will be hosted. This requires some questions of your potential Cloud provider on the technical side as well as on the business and legal sides. Using DR planning best practices can really help ensure that your Cloud strategy does not fail. On the Cloud technology front, it’s important to know whether your data is being hosted on your behalf in a UK data centre and who is responsible for running that facility. One approach that can work well to mitigate risk while controlling cost is to use two different cloud service providers; one can act as a production site and the other as a DR provider. What is important to avoid is overlap between the two. 
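A trivial first check, nowhere near a full audit but cheap to run, is whether the two providers' service endpoints even resolve into different networks. As the next paragraph explains, reselling and rebranding can quietly put 'two' providers in the same facility; the hostnames below are hypothetical placeholders, and shared address space is only a prompt for further questions, not proof of shared infrastructure.

```python
import ipaddress
import socket

def resolved_network(host, prefix=24):
    """Resolve a hostname and return the surrounding IPv4 /prefix block, or None if it won't resolve."""
    try:
        return ipaddress.ip_network(f"{socket.gethostbyname(host)}/{prefix}", strict=False)
    except socket.gaierror:
        return None

# Hypothetical portals for a production provider and a DR provider.
prod = resolved_network("portal.production-cloud.example")
dr = resolved_network("portal.dr-cloud.example")
if prod and dr and prod == dr:
    print("Warning: both providers resolve into the same network block - ask where they really host.")
```

Asking each provider directly which data centre (and whose) will hold your workloads remains the real test; the script simply tells you when to push harder.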
not all Cloud service providers own and run their own Cloud operations While this might seem obvious, not all Cloud service providers own and run their own Cloud operations; instead, they may resell other services that are rebranded. For example, public Cloud platforms can be rebadged and sold by partners. Alternatively, local UK data centre operators can offer space within their Cloud offerings that can be resold. It is possible to buy into two separate brands and then discover that they are using the same physical location. Preventing this kind of issue can stop other potential DR issues too. Checking on the location of your Cloud provider can also ensure that issues such as power loss don't affect service provision. Using a Cloud DR service should protect your services if the production site goes down for reasons of network loss, whether this is down to electrical supply or Internet connectivity. It is highly unlikely that your Cloud DR service will be physically located on the same power grid or network area as either your company or your primary Cloud provider. However, you can check and confirm that this is the case as part of any move to Cloud DR. [easy-tweet tweet="#Cloud #DR should protect your services if the production site goes down for reasons of network loss" user="mastersian" usehashtags="no"] The challenge for DR and business continuity investment has always been the perception that this spending is on the IT equivalent of an insurance policy, rather than on creating business value. Cloud DR can help to reduce the cost of protecting IT assets and company operations so that these are more affordable. By using DR best practices, it is possible to make the most of what the Cloud can offer and mitigate those risks. Sometimes the solution can be as simple as "More Cowbell." Sometimes the solution can be as simple as "More Cloud." ### Xtium implements Abiquo's hybrid cloud software One of the biggest names in US cloud computing, Xtium has announced its implementation of hybrid cloud software from provider Abiquo. The deal, spanning forty-two months, deploys Abiquo's hybrid cloud solution across Xtium's three data centres. The long-term aim of the project is to migrate most, if not all, of its large customer base (SMBs and divisions of larger enterprises) to its Abiquo platform. This migration will allow Xtium and its customers to deploy new applications faster and in a more consistent way than before, and to benefit from a straightforward and clear user interface to manage and control their virtual environments. Abiquo's comprehensive integration options support Xtium in extending the platform's capabilities to offer innovative cloud services. Following an extensive market evaluation, Xtium found Abiquo's product and roadmap to be closely aligned with the strategy it had in place for its own future. Xtium has been provided with a ready-to-use solution that fits its requirements exactly, without the need to design and build a cloud portal from scratch, which would have been costly and time-consuming. Cloud computing is still a growing market, and with a growing number of organisations using public and private cloud solutions in tandem but having no means to unify the management process, hybrid cloud solutions are a sound business move for the future. Abiquo provides a unified management solution for both public and private cloud environments which, in turn, allows for greater flexibility and efficiency for the end customer. 
David Rode, CEO at Xtium says: “The team at Abiquo is up there amongst the best technology partners we’ve worked with. The whole team is responsive and flexible from not only a different time zone, but the other side of the world, providing us with not only the best available technology, but also the best customer service. We look forward to a long and successful partnership.”  Ian Finlay, Chief Operating Officer at Abiquo says: “We are delighted to work with Xtium. Not only has the company shown its commitment to us with a forty-two month deal, but it shows that hybrid cloud is truly the way forward for businesses of all kinds.” Mr Finlay continues: “What’s also great about this deal is it shows that there is still room in this market for high quality service providers to offer their own solutions, rather than trying to compete with the hyperscale providers on cost. Hybrid cloud delivers the best of both worlds to the end customer and helps service providers retain customer ownership across all their cloud computing needs.” ### Curing the Cloud Hangover Cloud Computing promised so much. It was presented as an IT cure-all; the solution to no end of traditional tech challenges. Now the hype has died down, organisations are beginning to wake up to the realities of the model, seeing both its limitations as well as its strengths. The era of the ‘Cloud Hangover’ has begun. That is not to say, however, that the cloud has been anything but a seismic shift in the IT landscape. The benefits of cloud computing span all facets of an organisation, enabling a shift from capital intensive to operational cost models, greater efficiency and agility, and the potential for reduced complexity. However, look more closely, and it becomes clear that the journey to the cloud is not always a simple one. [easy-tweet tweet="The journey to the #cloud is not always a simple one" user="comparethecloud and @Sungardasuk" usehashtags="no"] Interoperability, availability and cloud-related operational expenditure are among the serious concerns IT departments now face as they battle with the Cloud Hangover. Battling Costs Recent research from Sungard Availability Services found that the hangover is costing European businesses an average of more than £2 billion; with the overwhelming majority of businesses (81 per cent) in the UK, Ireland, France and Sweden having encountered some form of unplanned cloud spending. [easy-tweet tweet="81% of businesses in the UK, Ireland, France and Sweden have encountered unplanned #cloud spending" via="no" usehashtags="no"] Not only is each organisation within these countries paying an average of £240,000 per year to ensure cloud services run effectively, but they have also spent an additional £320,000 over the last five years thanks to unforeseen costs such as people, internal maintenance and systems integration. And despite being hyped as a way to reduce IT complexity, many early adopters of the cloud (43 per cent) have found that the complexity of their IT estate has in fact increased since their initial cloud investment. Organisations have rightly been engaged in the hype around cloud computing, but in the rush to join the party, often without considering their own organisation’s reality, many are now realising they have not truly thought about the cloud in the long-term. Cloud is, of course, here to stay. So with this in mind, how can organisations start to cure their Cloud Hangover? 1. 
Identify business drivers for moving to the cloud Why are you really moving to the cloud, what is your organisation hoping to achieve? There is no right or wrong answer, the cloud offers numerous benefits - such as automated software updates, reduced management overheads, reducing capital expenditure or increasing flexibility – but it’s crucial that your business has a specific aim in mind. 2. Nail down Cloud Service Level Agreements (SLAs) from the start Agreeing on the correct SLA can guarantee availability, security, capacity, performance and upgrades to name just a few benefits. Considering what elements are included in an SLA is essential in ensuring that your cloud deployment can meet those business and IT expectations right from the start. Check with each cloud service provider (CSP) to confirm the elements of the SLA, ensuring that there are no loopholes, and determine how and when the numbers of the SLA are measured. 3. Be sure to consider automation Being able to automate and streamline key operations within the cloud is a critical element of reducing the unexpected costs – both people and time – that so many organisations are now experiencing. Before deploying the cloud, IT decision makers need to consider which operations should be automated and which may need to remain under manual control. 4. Determine how ready you are for the cloud, including migration Not all applications can be migrated seamlessly into the cloud, particularly if they have been written for a specific orchestration platform or are vendor specific in some way. Re-writing the app for the cloud will be a time and cash intensive process often requiring outside help – which can also add to these costs. Of course, on the other hand, migrating to the cloud could also present the opportunity to assess, optimise, and streamline the IT estate. 5. Managing the money A clearly defined billing model at inception is critical Again, the right questions need to be asked at the outset. Confirm whether there are any upfront capital expenditures; whether the bill is a set operational expense; or how charging may change in the use of cloud bursting. A clearly defined billing model at inception is also critical; including how costs are calculated and in what increments. Businesses are complex entities and as a result their infrastructure requirements are complex as well. In short, it’s about picking the right platforms for the right applications – not trying to force square pegs into round holes. [easy-tweet tweet="The cloud has been a seismic shift in the IT landscape." via="no" hashtags="IT, cloud"] Organisations need to recognise cloud computing not as a technical achievement, but as a tool to deliver a specific, individual business outcome. There is no doubt that cloud can be as positive as the hype suggests, but it requires a sensible and pragmatic approach to achieve the transformational outcomes that many have seen. To find out more about the Cloud Hangover, please read the full whitepaper here and take the quiz to see where you are at in the cloud party.   ### Centrify finds Apple BYODs are liabilities Centrify Corporation, the leader in securing identities from cyberthreats, today announced findings from a recent survey it commissioned to determine penetration and security compliance of Apple devices in the workplace. 
Conducted by Dimensional Research, the Centrify Apple survey demonstrates that while people widely use Apple devices for work, lack of security and management of those devices exposes companies to significant liabilities. Of the total 2,249 US workers surveyed, nearly half (45 per cent) use at least one Apple device for work purposes. In addition: A majority of those Apple devices (63 per cent) are owned by the user as opposed to the company and are used to access work email, corporate documents and business applications 59 percent of Macs are used to access confidential company information 65 per cent of Macs are used to access sensitive or regulated customer information 51 per cent of iPhones in the workplace are used to gain access to business applications 58 per cent of iPads in the workplace are used to gain access to business applications However, despite the popularity of Apple devices in the workplace, businesses do not invest enough resources to secure or manage them. Over half (51 per cent) of all devices are secured by a password that is merely a single word or a series of numbers Most devices (58 per cent) also do not have software installed to enforce strong passwords More than half (56 per cent) of users report sharing their passwords with others Only 17 per cent of Apple devices have a company-supplied password manager Only 28 per cent of Apple devices have company-provided device management solutions installed Only 35 per cent of Apple devices have encryption of stored data enforced by their company Ultimately there is no discernable correlation between password strength and sensitivity of information accessed or accessible from a particular device or user [quote_box_center] “Centrify’s Apple survey spotlights the massive exposures that occur when devices do not comply with standard corporate security policies,” said Bill Mann, senior vice president of products and marketing, Centrify. “In particular, customer data represents a huge liability. Disclosure of regulated information such as healthcare records could expose corporations to fines and other legal action. Most importantly, there are solutions on the market today that can handily secure Apple devices without sacrificing user productivity. It’s time for IT to take action.” [/quote_box_center] Centrify provides the only fully integrated enterprise solution for Apple users, devices and applications across existing corporate networks and beyond. Whether Apple device users work inside or outside of the corporate firewall, Centrify provides all of the capabilities necessary to keep them secure and productive. Centrify integrates comprehensive mobile & Mac management with Identity and Access management. Unlike standalone Enterprise Mobile Management vendors, who can only do device management and push applications to devices, Centrify integrates the ability to secure and manage devices with the identity policy required to manage app access, across devices. The ability to combine the device’s security posture and location with a user’s role to make application access decisions is unique to Centrify. For more information about Centrify’s modern approach to Apple identity management, device management and application management, please visit www.centrify.com/apple. ### 10 Free and Helpful Online Tools for MSPs Everyone appreciates a helping hand. When you discover a free, online tool that makes your job easier, it can be hard to imagine life without it. 
With that in mind, we decided to reach out to you in order to compile a list of these helpful tools, all in one place. Why spend hours poking around online looking for a solution to your problem when it is very likely that one of your peers has already discovered and shared that solution? [easy-tweet tweet="Free online tools can be endlessly helpful for #MSPs " user="followcontinuum and @comparethecloud" usehashtags="no"] #10 - Evernote Check out Evernote In a nutshell, Evernote keeps you organized. In a world that is shoving an increasing amount of digital media down our throats, it can be difficult to keep track of what we have stored and where we have stored it. With Evernote, you can find anything, anytime, anywhere. Using its own Optical Character Recognition (OCR) service, Evernote is able to read text whether it be in keyboard text, PDF documents or even text in a photograph. Check out the example that Mark O'Neill gives in his article on "MakeUseOf." With Evernote, you would be able to find this photo by searching for the term "U.S. Air Force" or by looking up the number "94977" which appears on the back of the aircraft. The basic package of Evernote is free, but to get the full experience, you'll have to sign up for one of two subscription plans. The organization and search functionality is just part of what makes Evernote such a key tool. For a full rundown of Evernote's capabilities, check out the rest of Mark's article! #9 - Sidekick by HubSpot Check out Sidekick Sidekick is an application that you can add to your internet browser that allows you to gain information about your contacts. Sidekick is able to pull and provide information including, but not limited to, professional history, location, mutual contacts and email history. Sidekick also has extensive email capabilities, including the ability to schedule emails and track who has opened your emails and from which devices. #8 - Google Keep Check out Google Keep Much like Evernote, Google Keep is a good way to store and organize important information. Google Keep also gives you the ability to take notes and make checklists to help you keep your day, week, month or year in order! #7 - Charlie App Check out Charlie App Want to get some inside info on a prospect or new partner before you meet with them? Charlie App can help with that. Charlie combs through hundreds of sources and provides you with a simple, one-page profile on the person or people that you're preparing to meet with. Not only does Charlie App help you better prepare for a meeting by understanding the background and details of the person that you're meeting with, but it saves you the time that you'd spend collecting this information on your own! #6 - Canva Check out Canva Canva is a good way to increase your visual marketing efforts. Canva's easy to use templates allow you to make simple, but visually captivating graphics for social media, landing pages, direct mail campaigns and more! Below is an example of a template that Canva provides for creating a Facebook cover! Visual marketing can seem like a daunting and expensive task. However, by using tools like Canva, you can create and host high-quality graphics with minimal effort and zero cost. #5 - Password Safe Check out Password Safe Password Safe is a great way to eliminate the stress and turmoil that comes with having to remember a long list of user names and passwords. With Password Safe, one master password allows you to access you entire list of log in credentials. 
Just as using only one password for all of your accounts is risky, keeping a list of your passwords in a Word document or notebook is also unsafe. As one of our Spiceworks followers pointed out, there are other similar tools out there that solve the same problem. Check out KeePass

#4 - VirSCAN.org

Check out VirSCAN.org

VirSCAN.org is a free tool that uses antivirus engines to scan uploaded files for malware. Below is a snapshot of the VirSCAN.org tool and instructions for use. It should be noted that VirSCAN.org should not be counted on or used as a replacement for antivirus software. Instead, this free tool can be used to check on the amount of risk files pose to your system. Thanks to Redditor "Paultwo" for the tip!

#3 - Mail Tester

Check out Mail Tester

Are you concerned that your emails are coming off too "spammy?" Mail Tester allows you to check the "spammyness" of your email by simply sending the email to a provided email address. After you've sent the email, you can check out your results by clicking on the provided link. Even if it isn't spam that you're worried about, Mail Tester is a good way to get some constructive feedback on the makeup of your emails. As you can see in the above images, Mail Tester gives you access to a range of information on your emails, including broken links and suggestions for improvement.

#2 - Ninite

Check out Ninite

Ninite is a great tool used for downloading multiple apps all at once. Instead of downloading each of your needed applications individually, Ninite allows you to select all of your desired apps, and it installs them in the background! Check out the wide range of apps that Ninite offers for installation. Not only does Ninite install the apps for you, but it also updates applications that are out of date!

#1 - Spiceworks

Check out Spiceworks

If you're not already a part of the Spiceworks community, it's time to get in on the action. Spiceworks is THE place to gain information from your peers. Have a question that you just can't seem to get the right answer to? Post a topic on Spiceworks and let the community answer it for you. If you're facing an issue, it's highly likely that someone else has faced and solved that same issue in the past. Below is a screenshot from the "Water Cooler" thread on Spiceworks.

Perhaps one of these tools will make life easier for you. That is certainly our hope! Did we forget any important and free online tools? Let us know in the comments section!

### Why businesses are hungry for consolidated cloud IT platforms

It is difficult to argue any more that the cloud has not entered the mainstream of business IT. Indeed, businesses in every sector and of every scale are rapidly adopting the 'cloud first' mentality that even until comparatively recently was the preserve of early adopters and pioneers. Businesses are moving away from implementing all-encompassing monolithic IT systems in favour of more flexible end-to-end cloud solutions. For the channel, these shifts pose a unique set of challenges. With revenues from traditional technology sales declining and in some areas disappearing altogether (for example, prices for traditional telco bandwidth sales are roughly halving every five to seven years), businesses must transform to build new revenue streams from cloud by educating and commissioning sales teams in new ways.
[easy-tweet tweet="Prices for traditional #telco bandwidth sales are roughly halving every five to seven years" user="comparethecloud" usehashtags="no"] Channel players who grasp the business transformation opportunity, will be able to grow far more revenue via new cloud models than with the traditional sales model. Of course there are opportunities to sell additional managed services, but the real opportunity lies in the potential to deliver services and applications through intelligent platforms that simplify and consolidate IT functions for end users. Key drivers for change Businesspeople, driven by their experiences with large, expensive, complex IT systems, are increasingly looking to simplify their IT departments, consolidate functions and adopt an opex-based delivery model. However, changing business attitudes to IT delivery are not just driven by desires to move away from painful up front capex costs to more ‘pay as you go’ models - there are also strong operational and productivity drivers. Mobility and flexibility are key watchwords for businesses today - both in terms of optimising their business processes, but also improving productivity by giving employees the ability to work when and where they want to. Traditional IT sales feel a lot more cumbersome than the world of individualised ‘microtransactions’ Alongside this trend of mobility, there’s also the factor of consumer experience driving business change. Traditional IT sales, involving RFPs and the involvement of the board, feel a lot more cumbersome than the world of individualised ‘microtransactions’. Consumers, used to the ‘app store’ and ‘always available’ cloud services such as Google Drive, now increasingly expect to be able to provision things they need in a granular way, and at the click of a button. This attitude has quickly shifted into the business world as users look for ways to achieve higher levels of productivity and collaboration with colleagues. The integration challenge [caption id="attachment_21218" align="alignright" width="300"] Neil Cattermull's Top 5 IT Predictions for Finance Sectors[/caption] Of course there are already a myriad of cloud tools on the market designed to serve this growing demand: from productivity tools to network and communications management software. It is already common for businesses to be making use of at least one or two of these services - either within certain groups, or across the whole organisation. However, beyond ‘dipping a toe’ in this way it is simply not realistic for businesses to expect to be able to reliably and securely manage and scale its IT provision if it is based on an ad hoc collection of disparate tools relying on wildly different platforms. Instead what businesses really need are consolidated cloud platforms, which can quickly and easily integrate new services and be managed via a ‘single pane of glass’. Delivering this sort of simplicity to enable the flexibility, agility and scalability businesses want, requires advanced orchestration and automation capabilities working behind the scenes. This allows self-service and automated set-up of almost anything. Grasping the opportunity [easy-tweet tweet="Servicing the demand for flexible self-service #IT provisioning is a huge opportunity for the #channel" usehashtags="no"] To return to my earlier point, servicing this demand for flexible self-service IT provisioning represents a huge opportunity for the channel. 
Rather than relying on one off sales, self-service centralised cloud infrastructures open up new progressive, cumulative revenue streams. By giving end-users the power of self-service, combined with the ease of automated service provisioning and delivery, resellers will generally experience organic growth with minimal input from their sales teams. A good example of this is a business self-provisioning a higher bandwidth package (for example, when they employ additional staff or when Wimbledon is on) that, once delivered, rarely tends to be reset to original levels. Other examples are allowing the self-service of voice features, such as ‘hunt groups’ on telephones, or the setting up of security groups on a wireless network. It’s a win-win for both parties It’s a win-win for both parties. Customers feel more in control and can easily service a genuine business need without reams of paperwork and laborious requisitioning processes; resellers can focus on service and support rather than having to continuously upsell and push product. Piecemeal, siloed cloud tools simply will not deliver what businesses need in the future. Consolidation, integration and simplification will be the key watchwords in the near future. The demand from businesses and users is clear. Now channel players need to grasp the opportunity. ### Dustin Kirkland of Canonical on Ubuntu, the future & IBM LinuxOne at LinuxCon 2015 We took time out of LinuxCon 2015, Seattle to speak to Dustin Kirkland from Canonical on the latest Ubuntu developments, 'Wily Werewolf' and their work with IBM on the LinuxOne release. ### Colocation and cloud: what does the future hold It’s not always easy to know what to expect from the colocation industry. According to some, cloud technology is a threat to the data centre itself, while other sources point out that cloud is actually behind the surge in demand for colocation, responsible for the 87 per cent of data centre operators that plan to increase their facility spending over the next year. So what do you choose if you’re looking to refresh your hardware? Does cloud herald the end of the data centre, or will the data centre industry thrive in competition with webscale giants? [easy-tweet tweet="87% of #datacentre operators plan to increase their facility spending over the next year" user="beazerD" usehashtags="no"] As it stands, the colocation industry still seems to be growing at an impressive rate despite growing adoption of public cloud. While proponents of both have good cases to make, it’s likely that many CIOs will opt for a mix between the two. The cloud has left many rethinking their existing approach to storage, yet at the same time demand for colocation services is on the rise. So how will the colocation industry look over the next few years? Colocation as we know it Despite the rise of cloud hosting, the colocation industry is in rude health. It’s growing at a rate of nine per cent year-on-year, while it has even been claimed that the market will grow to over $36bn by 2017. In addition, any disruption in this market seems to be hitting in-house corporate data centre operators more than most. Industry leaders such as Google and Microsoft tend to build their own data centres, an uncommon practice outside of FTSE250 businesses. the market will grow to over $36bn by 2017 That the colocation market continues to thrive is particularly remarkable given how fractured it is. 
Industry leaders carve out their own focus areas, targeting specific sections of the market or taking vastly different positions on issues such as security, speed and scalability. This complicates the process for IT decision makers looking for that ‘one size fits all’ option, but it does indicate that the colocation industry is making an effort to target the user’s needs. Digital Realty, for example, offers a vastly different model to local specialist Leaseweb. Despite this, the cloud alternative remains much closer to the ‘one size fits all’ model.

The data centre of the future

The rate of growth in demand for colocation services is likely to shrink gradually over the next few years as cloud chips away at its market share, but it won’t change substantially. Plenty of firms are opting for colocation as a viable alternative to building their own data centres due to the greater confidence in security it provides. [easy-tweet tweet="Firms are opting for #colocation as a viable alternative to building their own #datacentres" via="no" usehashtags="no"] This is a key factor behind the growth in the ‘mega data centre’ – a buzzword to describe huge infrastructure projects with an incredible amount of colocated server space. In response to demand for colocation services, sites such as ViaWest in Oregon and SuperNAP in Nevada are emerging across the board, bringing in blue-chip customers such as eBay as anchor tenants. It’s certainly going to be interesting to see if these projects continue to attract high profile customers and whether or not the cloud is likely to have any impact on the sustainability of these projects going forward.

Hybrid workloads are also set to grow over the coming years, which is good news for small and medium-sized enterprises (SMEs) that might otherwise find colocation impractical. These firms may have to be ready for peaks and troughs in demand, but it isn’t usually cost effective for them to invest in the right hardware and scale out for peak capacity in a third party data centre. Instead, SMEs can opt for a ‘best of both worlds’ approach by colocating steady-state day-to-day workloads and sending data-intensive tasks to public cloud services where necessary (a rough, illustrative cost comparison is sketched at the end of this section).

Hybrid cloud still depends on using colocation facilities in data centres

Hybrid cloud still depends on using colocation facilities in data centres, so it isn’t going to impinge on the data centre as we know it. Moreover, cloud has been part of the picture for over a decade now, yet remains much smaller than the colocation industry. If cloud is taking over the world, it’s doing it very slowly.

The impact of cloud

The retail industry provides a good example of this. While the biggest names run their own data centre facilities, this luxury is becoming increasingly rare for the next tier down. Even where cloud begins to affect some of the colocation industry’s ‘bread and butter’ customers such as low-end hosters, the industry has been able to attract the great majority of cloud providers, blunting the impact on revenues. The debate on the impact of cloud divides the colocation industry in two. Some businesses plan to develop their own clouds or at least a cloud ecosystem, while others don’t touch it and believe it may bring them into competition with customers. There’s still no clear verdict on the success of products like AWS Direct Connect and Azure ExpressRoute. Do they add a new tier of services to bring new customers into the data centre, or merely pave the way for customers to leave for the cloud?
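To make the ‘best of both worlds’ trade-off concrete, the sketch below compares a colocation footprint sized for peak demand against a hybrid approach that colocates the steady-state baseline and rents public cloud capacity only during bursts. It is purely illustrative: the server counts, monthly colocation rate and cloud hourly rate are assumptions chosen for the example, not pricing from any provider.

```python
# Illustrative only: compares a steady-state colocation baseline with on-demand
# public cloud burst capacity. All prices and counts are assumptions, not quotes.

def monthly_cost_colo_only(peak_servers, cost_per_colo_server=220.0):
    """Size the colocation footprint for peak demand all month."""
    return peak_servers * cost_per_colo_server

def monthly_cost_hybrid(baseline_servers, burst_servers, burst_hours,
                        cost_per_colo_server=220.0, cloud_hourly=0.45):
    """Colocate the steady-state baseline; rent cloud instances only for bursts."""
    colo = baseline_servers * cost_per_colo_server
    cloud = burst_servers * burst_hours * cloud_hourly
    return colo + cloud

if __name__ == "__main__":
    baseline, peak = 10, 25          # typical vs. peak server count
    burst_hours = 60                 # hours per month spent at peak load
    colo_only = monthly_cost_colo_only(peak)
    hybrid = monthly_cost_hybrid(baseline, peak - baseline, burst_hours)
    print(f"Colo sized for peak:   £{colo_only:,.2f}/month")
    print(f"Hybrid (colo + burst): £{hybrid:,.2f}/month")
```

Under these assumed numbers the hybrid option works out cheaper, but the balance shifts as burst hours grow, which is exactly the calculation an SME would need to run against its own workload profile.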
Hybrid approach For the time being, cloud and the data centre are coexisting peacefully; it seems as if claims that cloud will destroy the carrier neutral data centre industry are misplaced. Instead, we’re seeing a hybrid approach. Buying decisions fundamentally come down to economics, and hardware is usually cheaper than cloud. Plus, when it comes to housing equipment colocation always provides a cheaper, more secure and more reliable option than going in-house. Even among SMEs, a market segment largely responsible for the popularity of hybrid cloud, colocation is an increasingly popular option for predictable day-to-day applications such as Exchange. Cloud will play an increasingly large role in the data centre of the future, but IT decision makers shouldn’t write off colocation just yet. ### Jimmy Wales confirmed as opening keynote speaker at IP EXPO Europe 2015 Jimmy Wales, entrepreneur and founder of Wikipedia, will deliver the opening keynote speech this year at Europe’s number one IT event, IP EXPO Europe 2015, taking place at London’s ExCeL. At the event on Wednesday 7th October, Wales will discuss the current state and future of Internet privacy and online censorship. Famous for creating the world’s largest online encyclopaedia and for his controversial views on online freedom of speech, in this insightful speech, Jimmy will address the dangers of government snooping, the value of encryption, and the negative impact of cyber security failures on freedom of expression and democracy. Bradley Maule-ffinch, IP EXPO Europe’s Director of Strategy, says: “Jimmy has strong views on European regulations relating to Internet censorship and the impact this has on businesses, so we are very excited to hear him speak at this year’s IP EXPO Europe.” Wales is also expected to discuss the business impact of Edward Snowden’s NSA disclosures, as well as the UK Government’s Investigatory Powers Bill, or ‘Snoopers’ Charter’, which calls for ISPs and mobile network operators to keep records of individual users’ browsing activity over a 12 month period. Wales’ keynote will take place at 10am on Wednesday 7th October, in the IP EXPO Europe keynote theatre, ExCel London. For further information, or to register FREE, visit www.ipexpoeurope.com or search for #IPEXPO on Twitter. [quote_box_center] Unfortunately due to unforeseen circumstances Snowden will no longer be speaking at IP Expo. [/quote_box_center] ### A Front Row Seat at The Field Service Big Bang What do you get when you mix field service innovation with the efficiencies of the cloud? A field service Big Bang of course. The field service industry is going through nothing short of a revolution. And it’s about time too. Historically field execution simply meant scheduling a technician to get to the customer site within a contracted window. Actually fixing the problem, or having the right parts or skills for the job was almost an afterthought. Thankfully those days are gone. But the problem remains that field service has not entered the information economy as quickly as other lines of business, nor has it moved at the same pace as the rest of the organisation. Unfortunately, the industry is now in a position where it can’t afford not to catch up. 
[easy-tweet tweet="Field service is the perfect fit for the #InformationEconomy where knowledge is the primary raw material" via="no" usehashtags="no"] In fact, if you think about it, field service is the perfect fit for the information economy where knowledge is the primary raw material and source of value. The issue is that, to date, it hasn’t properly converged or integrated with the wider IT-based information society. But that’s finally changing, and those of us in field service have a front row seat to an industry undertaking a major transformation. It’s already happening and on a global scale. Market forces, such as decreasing margins, servitisation or disruptive service, sluggish capital equipment sales, higher customer expectations, IoT enabled devices, and the realisation that field service can and should be a profit centre rather than cost centre, are just some of the pressures forcing service organisations to make a giant leap frog forward. [easy-tweet tweet="The ‘perfect storm’ - 1980s style processes and systems colliding with expectations of the twenty first century" via="no" hashtags="cloud"] And so we have the ‘perfect storm’ as it were, with 1980s style processes and systems colliding with the expectations of a twenty first century, information-based, customer led economy. Enter the cloud. It delivers all the end-to-end automation, lets you pay-as-you-go, gets you up and running in days not months, is user friendly for the ‘over 50s’ among us, and means you never have to worry about hardware or IT resources. Field service is actually inherently mobile, social and time sensitive. It needs new technologies that compliment these attributes. By empowering and mobilising service technicians with cloud-based, real time tools in the field they can do work-orders, request parts, schedule and be scheduled, look up manuals, take payments, renew maintenance agreements, use social channels to communicate problems swiftly and effectively and upsell and cross sell products and solutions where appropriate. All of this is done on a smartphone or tablet device. All the data is real-time. Technicians also have the option of working off-line and then synching up to reduce dead time administration at the end of a job. And customer relationship management systems pick up the information and ensure that the customer receives future communications, advice, updates and education. And of course all of that data is delivering valuable new insights about your businesses and customers.  Quite a disruption to the status quo, isn’t it? These tools can also be used to harness knowledge – particularly from your most senior technicians – and shared for future and ongoing reference and training purposes. By retaining all that golden knowledge and experience gathered over the years from your more mature technicians as they approach retirement, you are able to help train and develop the next generation who may have limited experience with older model equipment or assets. Quite a disruption to the status quo, isn’t it? But a necessary one. The role of the service organisation has transformed across multiple industries, not just for the likes of the big manufacturing giants. And it’s not just about delivering support anymore. It’s about finding a true understanding of customer needs, and provisioning solutions to ensure customer satisfaction, profitability and maximum value, while maintaining product reliability, availability and performance. 
Change is uncomfortable and sometimes poses its own set of risks and challenges The challenge for service organisations is delivering on all of this whilst delivering efficiencies at the same time. It simply can’t be done without automation. End-to-end cloud-based, social and mobile field service applications will outpace and outperform manual processes every time, no matter how good your field technicians are. Change is uncomfortable and sometimes poses its own set of risks and challenges. But rigid software and broken processes make change downright impossible. Cloud-based field service solutions can now accommodate and anticipate customer requirements, guide you through the transformation, and give you best practices to follow. Don’t just watch this revolution happen from the side lines. Join in or you risk market forces pushing you aside. ### Datapipe Paves the way for EducationCity’s Expansion to China and Beyond After successfully rolling out their educational materials to schools in the UK and US, EducationCity turned its sights to international markets, in particular China. EducationCity is transforming teaching and learning in language arts, math, science, computing, and English as a second language for elementary school teachers and students in more than 15,500 schools across the globe. They provide interactive creative learning tools and educational activities that are easily accessible online.  [easy-tweet tweet="Hosting outside of the target market was causing latency and performance issues, @Datapipe changed that" usehashtags="no"] EducationCity have been operating in China for around five years so far, initially hosting their Asia-specific learning tools in data centres around the UK and US. However, this approach didn’t offer a long-term solution as it created issues with latency and performance. EducationCity had to find a China hosting solution, but they lacked the expertise to address the legislative and regulatory roadblocks related to private and public cloud offerings in China. Datapipe's decade of experience in the asia-pacific market was invaluable To help navigate the many hurdles the company faced in expanding to China, EducationCity turned to Datapipe. Datapipe has deep experience in the Asia- Pacific market. The company opened its Datapipe Asia office in 2005 and has since provided on-the-ground support out of the company’s Hong Kong office and data centres in Shanghai, Singapore and Hong Kong. This experience made Datapipe an easy choice for EducationCity when seeking help for their move into China. The main roadblock EducationCity faced when expanding to China was the complex regulatory requirements set in place by the Chinese government. EducationCity worked with Datapipe contacts in both Hong Kong and Shanghai to navigate the legal landscape and help smooth EducationCity’s entry into data centres in Shanghai. “Datapipe was really good at guiding us through the move to China and helping us get the right sign-off from various government authorities. We wouldn’t have been able to do it without their on-the-ground support and expertise,” says EducationCity Technical Director, Graham Lyden. [easy-tweet tweet="6 months is all it took from engagement to deployment in China for @EducationCity" user="datapipe and comparethecloud" usehashtags="no"] The entire process from engaging with Datapipe to overcoming the complex procedures involved with hosting in China, getting contracts signed, and deploying EducationCity’s software in China only took six months. 
Now, because of Datapipe’s on-the-ground presence and experience in deploying public, private and hybrid environments in China, EducationCity is available in primary schools across the region. Additionally, EducationCity was able to utilise Datapipe Asia as an extension of their IT team, saving them the substantial overhead costs associated with opening a satellite office. ### Is personal data safe on a cloud based CSP? How is the growth of cloud based customer service platforms impacting data protection, especially for those organisations with highly sensitive personal data? The popularity of cloud based customer services platforms has grown at an exponential rate across businesses, public sector organisations and charities. In addition to removing the need for organisations to run their own systems, the platforms provide immediate savings of both time and money to the customer and offer flexible pricing options (which can generate further financial savings). What’s more they are a driver for a type of innovation that may not otherwise exist in traditional software offerings. [easy-tweet tweet="The popularity of cloud based customer services platforms has grown at an exponential rate " user="ExpAssist"] Adapting to new technology Adapting to new technology that allows access to applications, documents and software from any web-enabled device has inevitable risks. There have been numerous examples of security breaches across the public, charity and private sectors due to malevolent activity or operational errors releasing highly sensitive personal data. One very clear example is the recent high profile case involving mobile phone retailer Carphone Warehouse, which is at the centre of a major data breach investigation after up to 2.4 million customers’ financial information and personal details were accessed through a malicious cyber hack. Risks of data protection For organisations such as charities that hold highly confidential beneficiary and stakeholder information, data protection is a big issue, and rightly so. Staff can face criminal charges and the organisation may risk losing customers as a result of reputational damage. What’s more, organisations could also face fines worth up to £50,000. Having robust data security in place is absolutely paramount to avoid long-term damage. The biggest risk for customer data leaks is from a contractual, rather than technical perspective The biggest risk for customer data leaks is from a contractual, rather than technical perspective. So as cloud based customer services platforms continue to evolve and underpin processes of organisations throughout the world, it is crucial that organisations have a clear understanding of shifting regulations and responsibilities that must be upheld when protecting data. Effect of privacy breaches [easy-tweet tweet="Due to EU regulation changes organisations now have increased responsibility to minimise data breaches"] With the growth of customer service cloud platforms, and more data being generated by customers across multiple channels, many organisations are using third parties to store and handle customer information on behalf of organisations. But as the EU shifts its regulations to adhere to increased levels of cloud data storage and online customer service platforms, organisations now have increased responsibility to minimise data breaches. 
They must show they have developed policies and provided in-house training tools to teams, in addition to demonstrating that they have checked their data cloud provider is also taking the appropriate security measures to protect, secure and share highly sensitive customer data. Signing an agreement that ensures the cloud provider has enforced appropriate policies and procedures, and that teams have also been trained in the right way, is important when future-proofing the storage of customer data. If a leak occurs, it is likely the cloud provider won’t accept liability for data security – meaning the organisation will bear the monetary and reputational brunt of the consequences. The effects of privacy breaches on organisations using customer service platforms can be significant and long-lasting. For example, customers may lose trust in the company and take their business elsewhere, particularly if switching costs are low. In addition, where big data loss scandals are concerned, it takes a substantial amount of time and money to conduct a review into what went wrong and integrate more advanced security processes and procedures. Security breaches can have a major impact on reputation, which can consequently deter future customers and cause a long-term impact on revenue streams. Ultimately, prevention is key.

Preventing security issues

To prevent security issues, it is vital that companies work with an accredited cloud vendor with a working Information Security Management System that is certified to ISO 27001 or equivalent security standards. This is because the parameters for how data is stored and protected haven't changed since the last update of the Data Protection Act in 1998. With increasing numbers of organisations using integrated and customer-centric technology systems to provide support systems, it is important that personal and sensitive data is secured to prevent misuse or unauthorised modification of data, regardless of its type. In addition to signing a contract, it is also important that companies receive a technical description of how customers’ personal data is stored, secured and processed to validate the provider’s claims. However, cloud suppliers may be reluctant to share this information, as they may not want to reveal details of processes that could leave them vulnerable or easier to replace as a supplier. Ultimately, it is likely that the cloud vendor will have a number of different clients of varying sizes, and smaller companies may find it more difficult than larger enterprises to retrieve the information they need.

Advantages to outsourcing to cloud vendors

There are numerous advantages to outsourcing to a cloud vendor with a portfolio of different customers. For example, if the supplier has received various different briefs from a number of companies, it is likely their solution and the systems on which it runs will have gone through extensive testing and updates to support the influx of highly sensitive data. They will also be equipped to run and support this advanced level of infrastructure, and will be likely to have teams who constantly monitor systems to maintain the security of its contents. Conversely, if a company hires somebody to build a unique solution, then it is likely it may not have undergone the same level of testing, which ultimately may result in security breaches.
This is a particular risk factor if the company’s in-house team isn’t highly trained in managing the security of the server accurately. As more organisations invest in cloud based software to transform the customer journey and analyse business performance, investing in a robust security system and ensuring IT functions are understood and valued by all is absolutely critical for any organisation working with highly sensitive data. [easy-tweet tweet="Ensuring IT functions are understood is critical for any organisation working with highly sensitive data" via="no" usehashtags="no"]

### AWS to open London data support centre

Amazon has decided to target British start-ups by opening a London data support centre, or 'Loft', for businesses using AWS. The East London Loft will be the first time AWS has run such a venture outside of the US. Amazon describes the Lofts as places for startups to get in-person support and education on AWS. As well as being functional workspaces, the Lofts are designed to encourage networking between startups using AWS. According to the Financial Times, Amazon described London as the first “obvious choice” to launch outside the US, because of the number of customers it has with operations based in the UK, and said it would open a similar space in Berlin in October. London is a hub for tech startups, and AWS is a popular platform for deployment due to low prices and malleable solutions. Airbnb, Shazam and JustEat were all startups that ran on AWS. [quote_box_center] Andrew Roughan, Director at IO, said: "This is an astute move from AWS – moving into the vibrant London tech scene and offering a service to a huge start up economy. AWS have a scale platform that transforms IT economics. Start-ups do not have legacy IT complications to concern themselves with and public cloud gives them a great opportunity to scale without significant investment. However, this should not be confused with those start-ups who hold technology as their core competence, their mission critical platform that will make the difference for them winning or losing in their markets. For these start-ups, they may continue to choose public cloud but they should ensure they use an operator with an enterprise grade data centre and consider the build or buy benefit case at every stage of their journey. They should also consider the benefit case of private cloud and colocation in making this decision. Moreover, we believe public cloud is a sourcing decision and should not be confused as an IT strategy – it is one component of a complicated and tailored approach." [/quote_box_center] Having an AWS support centre in London will be extremely advantageous for startups choosing to host in the public cloud. But the Loft shouldn't be taken as the deciding factor for those looking for a hosting provider - consider your options wisely, and choose your provider on their merits, not because of their foosball machines.

### Why Agent-based Backup is actually best for Cloud Backup

Many backup technologies offer excellent on-premise protection for a massive range of applications and hardware deployments. Historically, a backup environment would have a backup server pulling data from a live environment, moving it onto disk, tape or a combination of the two at the same site or across the WAN to another site. This design works for the most part but has its limitations. [easy-tweet tweet="What are the limitations of disk and tape based cloud backup?" user="comparethecloud" usehashtags="no"]
user="comparethecloud" usehashtags="no"] Should an organisation have a distributed environment, an on-premise solution may be required for each site. In this case, centralised management may be difficult, increasing management overhead and potentially necessitating travel between sites. If an organisation has a single site, a method of offsiting backup data is required. If this process is a manual one, the potential for human error enters the equation. Alternatively if a second application/service is used to offsite backup data, the management overhead and cost is increased, potentially even doubled. Lastly, if an organisation, like many today, faces consistent data growth, an existing solution may require regular upgrading / replacing to allow more backup data to be stored. For these reasons, many organisations now naturally look to the cloud for their backup and disaster recovery.  Why wouldn’t I use an existing/well-known backup technology? [easy-tweet tweet="On-premise solutions were designed assuming the data being protected was in the same location as the primary target for backups" via="no" usehashtags="no"] We’ve seen many of the classic LAN/WAN-designed backup products add 'cloud connect’ functionality to their technology in recent times but there remain fundamental design differences between cloud-based backup technologies and their “designed for on-premise first” counterparts. On-premise solutions were designed with the assumption that the data being protected resides in the same location as the primary target for backups and as such are significantly less optimised for use over public networks. Whilst work has been done to retrofit them for cloud usage, fundamental differences remain between products designed for the cloud and those designed for the data centre. These differences are typical most obvious in the areas of security / encryption and bandwidth usage. An agent-based approach The advantage of an agent-based approach is that, in the unlikely event of a client failing, it fails in isolation to the rest of the environment. The agent based approach also allows you to benefit from distributed processing, resulting in faster, more efficient backups.  Multiple, lightweight clients can be configured in a matter of minutes and then administered centrally. [easy-tweet tweet="What is an agent-based approach to cloud backup? And how is it beneficial? " user="comparethecloud"] The advantages of an agent-based approach include: Robustness, Flexibility, Efficiency, Performance and Cost Robustness - In the unlikely event of a client failing, it fails in isolation to the rest of the environment. Flexibility - Agent-based technology enables customers to backup any device from anywhere and restore to anywhere, whether that device is physical/virtual/a server/laptop/SAN or NAS, next to the data centre or on the other side of the planet without the requirement to implement a backup server on that location’s LAN. Efficiency - Protecting large volumes of data over limited bandwidth is the biggest challenge with cloud backup, an incremental forever agent-based backup negates the requirement to move unnecessary changed data from the virtual host offsite e.g. page file, this can reduce the data volume sent by up to 70%. Performance - The agent-based approach also allows you to benefit from distributed processing, resulting in faster, more efficient backups.  Multiple, lightweight clients can be configured in a matter of minutes. 
### What does Private vs. Public cloud mean?

It seems like most businesses are ‘flying in the cloud’ these days. But while cloud popularity has grown rapidly over the past couple of years as organisations realise what that flexibility can do for them, it is not without its complications. One of these is deciding as an organisation whether to use a private or public cloud service. A private cloud is hard to maintain and requires many more resources to manage. On the other hand, the security risks are higher with a public cloud offering, and changing between public cloud platforms is not easy. [easy-tweet tweet="Gartner says more enterprises will look to use #public and #hybrid #cloud models " user="Marne_Martin" usehashtags="no"] Gartner has stated that more enterprises will look to use public and hybrid cloud models as they realise “it’s impossible to private cloud everything”. Gartner’s research director, Michael Warrilow, states “we will see ever-more public cloud adoption … public [cloud] is probably going to be 70 to 80 per cent of cloud workloads.” He continued, “it is hard work to private cloud everything. You have got to be like a cloud provider but you have also got to be like traditional IT as well. So you have got to do security, service delivery, etc. People will be too ambitious. We think they should only target private cloud for the most important and relevant workloads.” But what does this mean for your organisation? Let’s clear the clouds, so to speak, and go over what public versus private cloud computing means, in general and for your organisation.

Let’s clear the clouds, so to speak, and go over what public versus private cloud computing really means

What is a Private Cloud?

In the simplest terms, a private cloud refers to a cloud computing platform that is implemented within the corporate firewall, under the control of the IT department. According to SmartData Collective, there are five driving factors that influence an organisation’s decision to move to a private cloud:

- Gains in agility and speed
- Reduces company costs
- Improves overall service quality
- Moving aligns with company initiatives and plans
- Increases in data security

Others argue, however, that the private cloud increases costs due to the need for more personnel (which make up 60% of overall private cloud costs), frequent upkeep, and more complicated infrastructure. Some make the argument that compliance is actually harder to maintain, not easier, with a private cloud, due to the difficulty of replicating public cloud architectures that build in regulatory compliance mandates.

What Is The Public Cloud?

According to Gartner, a public cloud is “a style of computing where scalable and elastic IT-enabled capabilities are provided as a service to external customers using Internet technologies —i.e., public cloud computing uses cloud computing technologies to support customers that are external to the provider’s organization.” Public cloud services may be free or offered on a pay-per-usage model. Examples of public cloud services include Google (Gmail, etc.), Facebook, Salesforce, and Amazon Web Services.
The most common benefits of the public cloud are as follows:

- Easy scalability
- Cost effectiveness
- Increased reliability

[easy-tweet tweet="The biggest disadvantage of the public cloud is the security of organisational data." user="comparethecloud"] The biggest disadvantage of the public cloud is the security of organisational data. The question becomes: who really owns your data, you or the service your organisation is utilising in the cloud? Most public cloud providers have contracts and regulations that address these concerns for their clients. However, geographical security concerns are at play because your server could essentially be in a different country, governed by an entirely different set of security and/or privacy regulations, which may be a risk to sensitive data. Another disadvantage of the public cloud is that each public cloud vendor has its own unique platform, making transfers extremely difficult if not impossible. Because of this, there is almost a forced loyalty to a public cloud vendor when you choose to use them for your cloud and SaaS needs.

Comparing the Benefits and Costs of Private vs. Public Cloud

| | Benefits | Costs |
| --- | --- | --- |
| Private Cloud | Mission critical applications; "cannot fail" fault tolerances; security and trust; SLA | Equipment & hardware; virtualisation; data center; personnel |
| Public Cloud | Increased utilisation; simplification (do more with less); pay as you go; time to provision; SaaS | Loss of control; monthly fees; increased support cost; professional services; security risks (geographical) |

What about using both public and private cloud?

An integrated cloud service utilising both private and public clouds can perform distinct functions within the same organisation. Organisations are beginning to use this method of cloud to get the best of both private and public cloud offerings. Hybrid cloud offers a number of benefits:

- Scalability
- Cost efficiencies
- Security
- Flexibility

Conclusion

Like with mobile and data strategies, organisations must develop a cloud strategy that fits their needs and goals, their future, and most importantly, their customers. Whether this means sticking with the public cloud, going private, or developing a hybrid model, the goal remains the same: to streamline operations, increase efficiency, and please customers.

### Datapipe Releases White Paper on Hybrid IT: The Cloud of the Future

Leading cloud service provider Datapipe has released a new white paper covering hybrid IT and how it is going to work in the future. When cloud computing began to move from the cutting edge to the mainstream, business leaders were typically faced with two options: private cloud or public cloud. Initially, private cloud was by far the preferred choice, due largely to its superior security capabilities. In the coming years, this status quo will shift yet again.

hybrid IT will almost certainly prove to be the cloud deployment of the future

Hybrid solutions deliver significant advantages over purely private or public approaches, and these benefits will become even more compelling as time goes on. Download the full white paper from Datapipe here: Hybrid IT: The Cloud of the Future

### Building a digital platform to support European smart cities

Across the region, cities are racing to make themselves ‘smarter’. Aiming to improve the future quality of life for their citizens, councils and governments are hard at work on a range of projects that will shape urban centres for decades to come. These projects are taking many forms.
Some focus on transportation strategies while others look at issues such as waste management, high-speed networks, commercial buildings, and housing. Together, the projects are supporting a larger vision – a platform for a smart city. To work effectively, such a platform must connect all the components and projects that make up the overall smart city vision. These could be anything from sensor networks monitoring roads to green energy generation plants on rooftops. [easy-tweet tweet="Using a digital platform to share data between different systems means a #smartcity can become even smarter over time"] Using the platform to share data between different systems means a smart city can become even smarter over time. Traffic control signals can change based on the number of cars on the road, buildings can alter power consumption based on weather patterns, and citizens can have ready access to information and government services. To encourage the trend, the European Union established the European Innovation Partnership on Smart Cities and Communities. This initiative was designed to foster big ideas that could help large cities across the region reach their smart and green energy goals by 2020. [easy-tweet tweet="Vienna and Barcelona are already employing initiatives to create smart city ecosystems" user="digitalemea"] Some cities are setting their own objectives and timelines for creating smart ecosystems. For example, in Vienna, the city council is encouraging the creation of community-funded power plants. Solar panels located in solar power plants can be ‘purchased’ by residents with an agreed rent then paid back against their power bills. The city-owned energy provider Wien Energie has a goal of producing 50 per cent of its energy requirements from renewable sources by 2030. Meanwhile, Barcelona has captured global attention with its BCN smart city and 22@Barcelona initiatives. The city is investing heavily in everything from transportation systems and public infrastructure to payment systems and health services. Part of the plan involves a push to encourage a shift to electric vehicles in the city. The Council is investing in electric taxis and buses and has established a network of 300 free public charging points for private cars. At the same time, London is progressing with its Smart London Vision. This comprehensive plan tackles ways in which the city can cope with an estimated increase in its population of one million people during the next decade. Part of this plan involves innovative approaches to the use of energy in the city, such as re-using waste heat from underground shafts and electricity sub stations. As smart city platforms evolve, the promised benefits for citizens will become tangible We can expect to see many more such initiatives being adopted by cities throughout Europe and around the world in coming years. As smart city platforms evolve, the promised benefits for citizens will become tangible. The lifeblood of a smart city platform is data. From sensors, devices and citizens, this data is vital for the success and positive impact of all the interconnected systems. [easy-tweet tweet="The lifeblood of a smart city platform is #data" user="digitalemea and @comparethecloud" hashtags="smartcity"] Naturally, the vast amount of data generated by a smart city needs to be managed, stored, and processed within a digital centre ecosystem that is future proof and has access to low latency networks and robust cloud platforms.  
For this reason, purpose-built, connectivity-rich data centres are an essential component within the platform of every smart city, as they will support the myriad activities that go into taking a smart city from a concept to a reality.

### How the cloud is finally bringing innovation to the legal industry

The cloud may have been embraced by businesses across all lines of work, but some industries have, understandably, remained more hesitant than others. The legal industry, with its history dating back hundreds of years, is one such profession that has appeared reticent to adopt a technology only a few decades in the making. Innovation often provides faster results but, unlike most other business sectors, this is not something law firms have traditionally strived for. By charging clients an hourly rate, the legal sector does not have as much incentive to accelerate its business processes, and it is often difficult for firms to estimate accurately how long each task will take - leading to charges that are higher than expected. [easy-tweet tweet="The Legal Sector is beginning to embrace the cloud" user="bsquared90" hashtags="legalIT"] Recently, however, the legal sector has begun to embrace more modern ways of working and the cloud, in particular. The global financial crisis of seven years ago forced many businesses to change the way they work and find new efficiencies, and in these fragile economic conditions customers were no longer willing to put up with unexpected charges. One of the ways that legal firms have streamlined their business processes has been to outsource various services, specifically low value, time-consuming work. As well as transferring manual work, like document reviews, to lower cost legal services abroad, cloud computing is also helping organisations to realise the benefits of outsourcing.

the folly is assuming that physical storage is inherently more secure than digital

By moving their IT infrastructure or software to the cloud, legal firms have been able to achieve greater mobility and swifter disaster recovery. The latter, in particular, highlights the folly of assuming that physical storage is inherently more secure than digital. There are risks involved with all forms of storage, but the cloud at least offers firms automatic back-ups, meaning that files can be accessed even if there’s been an internal IT disaster. Similarly, the legal industry’s recent acceptance of outsourcing suggests a broader change in how the cloud is being perceived. Legal businesses, much like banks and medical services, contain highly sensitive information and many companies have been reticent to see this information stored digitally, let alone off-premise. Alex Hamilton, CEO of legal services firm Radiant Law, believes that modern approaches to cloud security offer a flexible way of resolving this issue. “I think it’s important to remember that we are always going to live with security risks, but there are now more nuanced ways of approaching data protection,” he said. “You can store contact data with Salesforce, for example, but keep transaction information in-house.” [easy-tweet tweet="41% of UK legal firms said they were likely to move their significant systems to the cloud" user="bsquared90" hashtags="legalIT, lawcloud"] Considering that third party cloud vendors rely on a reputation of trustworthiness in order to generate new business, security is often one of their key concerns. This can mean storing data with a cloud provider is actually more secure than keeping it in-house.
Of course, when the decision has been made to transition to the cloud, selecting the right supplier is critical. Service level agreements and accreditations have helped give many law firms (and their customers) peace of mind regarding cloud computing. With security concerns abating, cloud computing is being embraced in a variety of ways. The contracting process has been streamlined, performance analysis metrics have been introduced and automation, where possible, is saving companies time and money. Understandably, however, convincing some legal firms to move to the cloud remains a difficult task. This year, 41 per cent of UK legal firms said that they were likely to move their significant systems, such as their case management and customer relationship management, to the cloud, but the remaining 59 per cent said that this was unlikely or that they had not yet decided. For Alex Hamilton, it is worth remembering that not all legal firms embrace innovation to the same degree.  the legal sector cannot stand still when other industries are embracing innovation  “Innovation has generally been driven by in-house legal teams, placed under greater pressure to evolve and positioned closer to the core business. Internal teams have also historically had little support from IT departments, so are hugely receptive to external cloud solutions.” The discrepancy between the organisations that are willing to adopt new technologies and those that are more hesitant is not necessarily a sign that legal firms remain stubbornly rooted in tradition, but it does reflect that the cloud is not right for every business, nor for every business process. Organisations may still wish to keep certain types of information on-premise and that is an understandable decision. But even with such a well-established history, the legal sector cannot stand still when other industries are embracing innovation. Fortunately, firms are now beginning to recognise the advantages that cloud computing can offer, finally bringing a profession more than a thousand years old into the 21st century. ### Why the UK’s Public Sector is Still Afraid of the Cloud World-famous Zen Master Thich Nhat Hanh is known for saying, “there is a cloud in your tea”, to remind us that the water in our teacups was once rain in a cloud, and that everything in the world is interconnected. With the rapid rise of cloud computing, we now have a cloud in our computers, smartphones, smart TVs, and tablets—or rather, we can access a cloud from any of these devices, and the world is more interconnected than ever before. The cloud has moved from our teacups to our fingertips, yet the UK Government’s G-Cloud framework has shown that many in the public sector are still hesitant to take advantage of everything the cloud has to offer. Here, I’ll be exploring and debunking the key reasons for the public sector’s resistance to cloud computing, “one of the fastest growing segments in the computing industry that will take over and affect many or most aspects of computing.” [caption id="attachment_21689" align="alignleft" width="300"] "There is a cloud in your tea"[/caption] When Whitehall turned to the cloud Over the last few years, most of us have become familiar with public cloud-based platforms like Dropbox and Google Drive, while more and more businesses are turning to third-party service providers for cloud solutions such as hosted desktop and email. 
It therefore came as no surprise when the UK Government launched the G-Cloud programme in 2012 to encourage local and central government bodies to adopt cloud-based IT systems that would slash IT costs and overheads, be easily scalable, and allow for more flexible working. However, despite the many benefits of cloud solutions, the majority of local government bodies continue to use expensive, outdated IT systems. Even the government’s 2013 ‘Cloud First’ policy has made little impact beyond central government organisations. Outdated systems still the top choice Now, three years after the launch of the G-Cloud framework, the figures remain shaky. Although it was recently announced that public sector spending on G-Cloud services and products has climbed to a whopping £639 million, 76 per cent of the total sales were attributed to central government departments, while the wider public sector accounted for less than a quarter of the total sale value. [easy-tweet tweet="Public sector spending on #GCloud services and products has climbed to a whopping £639 million"] Worryingly, a June 2015 report by the Government Digital Service notes that nearly two thirds of the public sector still opts for outdated computing methods, and according to a July 2015 study conducted for cloud collaboration firm Huddle, more than half of public sector employees don’t believe their organisation could benefit from cloud-based services. With individual users and private organisations increasingly turning to the cloud to meet their changing IT needs, it’s hard not to wonder at the public sector’s continued resistance to adopting cloud solutions. Why so much public sector hesitation? Lack of awareness At its core, the public sector’s issue with the cloud appears to be largely based on a lack of awareness and understanding of cloud-based services. Huddle’s study confirms this, noting that 83 per cent of respondents don’t actually know what to do with the cloud, and 73 per cent have never even heard of the G-Cloud framework. 85% of the public sector believes that the time and cost involved in switching to cloud-based services isn't worth it There is also little awareness of the financial benefits of cloud computing: 85 per cent of the public sector believes that the time and cost involved in switching to cloud-based services wouldn’t outweigh the advantages. The reality, however, is that adopting cloud solutions could potentially save organisations up to 50 per cent in procurement costs, according to G-Cloud director Tony Singleton. Then, of course, there are the same financial advantages that appeal to private companies: no need to invest in or upgrade expensive hardware and software; no upfront capital expenditure; easy and affordable scalability. Security concerns Together with this lack of awareness comes a fear of the security and privacy risks associated with the cloud. Research shows that “the shift of control from technology users to third parties servicing the cloud through outsourcing makes people nervous”, and there is a natural fear of losing control of who can access sensitive data. According to Huddle’s study, this fear permeates the public sector, with many civil servants viewing the cloud as “insecure” and preferring data transfer and storage options like USB drives, the post, and courier services. However, these ‘traditional’ means of data transfer and storage are far less secure than cloud solutions. 
In recent years, the news has provided ample evidence of this, including the losses of USB drives containing confidential data by bodies ranging from the NHS to the Greater Manchester Police. Hardware can be lost, damaged, stolen or destroyed in countless ways, while these risks are drastically reduced with cloud-based data storage and transfer methods. Of course, there’s no denying the risk of hacking and ‘function creep’ (when data collected over time is used for an unanticipated, undesirable purpose), but these risks are far more likely with public cloud platforms than with the highly secure, heavily regulated private cloud services offered through G-Cloud. Furthermore, all G-Cloud suppliers must comply fully with the Government Security Classifications and Cloud Security Principles. The future looks bright—and cloudy [easy-tweet tweet="The entire public sector will eventually hop aboard the cloud " user="comparethecloud"] Although the government’s cloud initiative is still struggling, there’s little doubt that the entire public sector will eventually hop aboard the cloud just as so many businesses are now doing—and it won’t be because of government-led promotions or compelling speeches by politicians. The reason is quite simply that cloud solutions deliver far more benefits than traditional IT systems and computing methods. The cost benefits are immense, the flexibility undeniable, and the backup options and security are matchless when compared to outdated data transfer and storage options. With these tangible advantages being increasingly recognised and embraced by both the public and private sectors, it’s only a matter of time before we see the majority of public sector bodies harnessing the full potential of the G-Cloud programme. Until then, sit back with a cuppa, relax, and remember that there is a cloud in your tea (and in each of your WiFi-enabled devices)! ### Culling the Cloud: Key Apps to Keep Grounded Which apps should you move to the cloud, and which are best on the ground? For many companies, the temptation to move everything is overwhelming — C-suites worry about spending on cloud services without making best use of these investments, while IT professionals hope to automate key processes and shift the focus of their workload from managing technology problems to innovating new solutions. Yet leaving no app behind comes with risk; Information Week notes that while most enterprises believe they're using just 51 cloud services, the average company has more than 730 running at any given time. Issues of compatibility, security and performance are also worth considering — here's why some apps need to stay put. [easy-tweet tweet="Most enterprises believe they are using just 51 cloud services, but in reality the average company runs 730+"] Leveraging Local With cloud vendors offering the benefit of scale-on-demand resources in addition to high availability, why would companies choose to keep some apps closer to home? According to Forbes, bandwidth is one key consideration — applications that demand large-capacity file sharing (50GB or better) can easily max out even top-performing provider bandwidth, leaving companies with no room in the network pipeline for other operations. Another issue is security, especially in public clouds. 
While the days of lackluster data protection are long past, the simple fact that public cloud data exists in close proximity to other companies' information silos and is — even in small measure — beyond the control of IT, increases the chance of a data breach. ROI may also prompt businesses to keep their apps close to the chest. While it's possible to estimate the average cost of running, updating and maintaining a local app, doing the same in the cloud comes with a number of unknowns, for example the fluctuating cost of cloud services or the possibility of vendor shutdown. What's more, cloud ROI is often measured in more than cost reduction or increased profit — if companies aren't prepared to effectively measure and report these benefits, it's easy to lose track of cloud returns. [easy-tweet tweet="How do you know which apps should be sent to the cloud? And which should remain grounded?" user="columnit"] Both Feet on the Ground What's worth keeping under the purview of local IT admins? Legacy apps are first on the list. Any proprietary apps more than 10 years old or designed to work in a specific local environment are good candidates to stay put, since re-deploying them in the cloud is costly at best and disastrous at worst. Next are apps that contain sensitive data such as health records or legal documents. While some cloud providers are able to meet compliance standards set by federal and trade organisations, it's often not worth the risk — if sensitive data isn't properly handled, the result could be steep monetary penalties or the loss of consumer confidence. Enterprise Systems Journal, meanwhile, advises that companies leave file access on local stacks in large measure to avoid network latency. Printing is another no-go — sending documents to the cloud simply to have them bounced back to a local printer is a far more circuitous route than necessary. Performance-intensive, mission-critical apps should also remain under the control of IT admins, allowing them to manage things like encryption and resource allotment in addition to catching the telltale signs of small problems before they flare up into big issues. While it's worth giving workers access to IT-vetted file sharing apps, companies need to take a hard line against the use of multiple, unsanctioned, public cloud services that put corporate data at risk. Moving applications to the cloud helps free local resources and reduce IT frustration. Culling your cloud app set, however, is critical to maximise resource returns. ### Gartner's 2015 Hype Cycle for Emerging Tech Gartner, Inc.'s "Hype Cycle for Emerging Technologies, 2015," focusses on the movement towards digital business. New to the Hype Cycle this year is the emergence of technologies that support what Gartner defines as digital humanism — the notion that people are the central focus in the manifestation of digital businesses and workplaces.  [quote_box_center] "The Hype Cycle for Emerging Technologies is the broadest aggregate Gartner Hype Cycle, featuring technologies that are the focus of attention because of particularly high levels of interest, and those that Gartner believes have the potential for significant impact," said Betsy Burton, vice president and distinguished analyst at Gartner. 
"This year, we encourage CIOs and other IT leaders to dedicate time and energy focused on innovation, rather than just incremental business advancement, while also gaining inspiration by scanning beyond the bounds of their industry." "As enterprises continue the journey to becoming digital businesses, identifying and employing the right technologies at the right time will be critical," said Ms. Burton. "As we have set out on the Gartner roadmap to digital business, there are six progressive business era models that enterprises can identify with today and to which they can aspire in the future. However, since the Hype Cycle for Emerging Technologies is purposely focused on more emerging technologies, it mostly supports the last three of these stages: Digital Marketing, Digital Business and Autonomous." [/quote_box_center] Major changes in the 2015 Hype Cycle for Emerging Technologies (see Figure 1) includes autonomous vehicles moving to the peak of the Hype Cycle. While autonomous vehicles are still in early stages of development and certainly won't be storming the roads next week, their peak status represents a significant advancement. All major automotive companies are putting autonomous vehicles on their near-term roadmaps. Similarly, connected homes have moved to the pre-peak stage, with the introduction of entirely new solutions and platforms being enabled by new technology providers and existing manufacturers. Figure 1. Hype Cycle for Emerging Technologies, 2015 hybrid cloud is still five years away from hitting a plateau of productivity Andy Soanes, CTO of Bell Integration said, "Gartner is right to say hybrid cloud is still five years away from hitting a plateau of productivity: businesses are still finding the hybrid cloud sweet spot. Concerns about loss of control mean organisations are unlikely to jump in with both feet and follow Netflix’s recent move to placing almost all IT services on public cloud servers. “Finding the sweet spot - where the business is gaining the maximum benefit from a perfect balance of on-premise IT, private cloud and public cloud services, can be extremely difficult. A business must be able to identify its ‘crown jewels’: those applications where failure, or losing control of data, would have catastrophic consequences. On the other hand, IT should be wary of excessive caution; keeping applications in-house when it no longer benefits the business to do so. Losing control of an application, or putting the organisation at risk or failing compliance, is an understandable worry.  “An IT department that knows the services it has, the services it needs, and the changes it has to make to get there, will be able to identify its perfect cloud sweet spot: where the crown jewels are kept in the business, but the IT department’s energy is not spent performing housekeeping for low-risk, low-value applications. This will be where IT services are providing the maximum value to the business, regardless of where they actually reside.”  For a the full press release on Gartner's 2015 Hype Cycle for Emerging Technologies visit the newsroom. ### When is SaaS right for your business? Although cloud computing is still in its first flush of youth, it’s continually evolving at a fast pace and has made a large impact on the modern business – as has its cousin, cloud-based software as a service (SaaS). Many organisations are actively considering adopting the cloud, with Gartner predicting cloud computing as one of the ten strategic technology trends for 2015. 
Global cloud services spending reached $56.6 billion last year and will grow to more than $127 billion in 2018, according to recent data from International Data Corporation (IDC). This represents a five-year compound annual growth rate (CAGR) of 22.8 per cent, which is about six times the rate of growth for the overall IT market. [easy-tweet tweet="Global cloud services spending reached $56.6 billion last year and will grow to more than $127 billion in 2018"] There is no doubt that SaaS is now all around us. Our customer data is held in a SaaS CRM system; when I am working on proposals I do so on Google Docs – and I can work anywhere, on any device. SaaS is not magic though; whilst there have been many definitions for SaaS, the simplest one to remember is that when the software you are using is being managed and hosted by another organisation, it is true SaaS. Are you right to be cautious? Businesses can be cautious when it comes to accepting new types of technology. With SaaS, many believe that moving from on-premise systems to cloud solutions means forgoing security, data ownership, and weakening internal controls and audit trails. Despite these fears, in order to facilitate greater productivity and profit margins, it’s important for businesses to investigate the benefits that using SaaS can bring.  SaaS minimises demands and impact on IT Maintaining, customising and upgrading on-premise software is time-consuming, expensive and risky. However, SaaS software providers keep their systems up-to-date as a matter of course, so there is no need to apply patches and updates as you would with on-premise alternatives. With no costly software or hardware to deploy or maintain, SaaS also minimises demands and impact on IT.  Calculating the true costs The costs associated with traditional on-premise software can rise when updated hardware and additional licenses are needed to support an increased load that comes with a growing company. SaaS applications on the other hand are designed for scalability and can easily add more users, making them a smart move for many, but the benefits can be most significant for SMBs. This is because it allows them to tap into the vendor’s economies of scale and keep operating expense spend on core software systems to a minimum. It’s not just the purchase and running costs that can be avoided, but the hidden cost of time it would take an internal IT department to implement and manage the software in-house. For larger businesses, the cost benefits may not be quite as pronounced. They need to correctly analyse the lifetime cost of SaaS and compare it with more traditional on-premise software solutions. Whilst there may be minimal, or zero, upfront capital expenditure required, the operating expenses of a large licence, coupled with the fact that there may not be the same tax advantages such as R&D tax credits or the opportunity to use the depreciation of the capital assets, could mean that the SaaS solution might not be as cost-effective as initially thought.  Also, be cautious – some services purport to be SaaS when they're not really! Some charging models are only scalable upwards within an extended contract period; some solutions require many days to go live and so on. You may find your solution on a dedicated environment which can't be easily upgraded, or may suffer from outages.
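To make that lifetime-cost comparison concrete, here is a minimal back-of-envelope sketch in Python. Every figure in it is a hypothetical placeholder rather than a benchmark; the point is simply that a recurring subscription should be weighed against capex plus ongoing support over the same horizon.

```python
# Hypothetical lifetime-cost comparison of SaaS vs on-premise software.
# All figures are illustrative placeholders - substitute your own quotes.

def saas_lifetime_cost(users, per_user_per_month, years):
    """Pure subscription: no upfront capex, cost scales with seats."""
    return users * per_user_per_month * 12 * years

def on_premise_lifetime_cost(licence, hardware, annual_support_rate,
                             it_admin_cost, years):
    """Upfront licence and hardware, plus yearly support and internal IT time."""
    upfront = licence + hardware
    recurring = (licence * annual_support_rate + it_admin_cost) * years
    return upfront + recurring

years = 5
saas = saas_lifetime_cost(users=50, per_user_per_month=30, years=years)
on_prem = on_premise_lifetime_cost(licence=40_000, hardware=15_000,
                                   annual_support_rate=0.20,
                                   it_admin_cost=8_000, years=years)

print(f"SaaS over {years} years:       ${saas:,.0f}")
print(f"On-premise over {years} years: ${on_prem:,.0f}")
# Note: tax treatment (e.g. depreciation of capital assets) and price rises
# are ignored here, which is exactly why larger firms should model both.
```

Does size matter?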
By using cloud computing and SaaS, companies can usually scale their cloud services up and down as their requirements change, meaning they pay for what they need, rather than what they think they’ll need, thereby making it a cost-effective solution. It is important to remember though when deciding on which SaaS would be most effective for your business, that one size doesn’t fit all. Careful consideration needs to be given to what it is you ultimately need to deliver to have a positive impact on your bottom line. A common mistake is that businesses get caught up during the buying process and purchase an ‘over-specced’ solution that has superfluous functionality that may just add complexity. Locate your emergency exit Whichever SaaS solution is chosen, an exit strategy must be considered. Most systems make it relatively easy to put data in, but some have put in barriers – purposely or otherwise – to make the process of getting data out problematic. Coupled with this, beware of long term contracts; many providers want to lock you in for as many as five years, meaning that a flexible SaaS solution might not be as flexible as you think. Ideally, only sign contracts that are agile enough to not only scale, but allow you to fully migrate to alternative technologies and services as and when needed. It is important to realise just where your business sits in the customer hierarchy. Though they wouldn’t admit it, larger SaaS software companies are unlikely to give the same level of attention to a five-person SMB as they would to a large multi-national, as it contributes such a small percentage of their overall revenue, so don’t expect the customer service team to jump on any issues as soon as they come up. In from the shadows [easy-tweet tweet="35% of enterprise IT expenditures for most organisations is now managed outside the IT budget" hashtags="shadowIT"] SaaS and cloud computing have been major catalysts to the growing issue in IT departments of Shadow IT, where systems and solutions are built and used inside organisations without explicit organisational approval. This is because savvy employees can more easily circumvent the IT department when using an off-premise solution. The Shadow IT sector is growing steadily with Gartner claiming that 35 percent of enterprise IT expenditures for most organisations is now managed outside the IT department's budget. A hybrid cloud architecture, supported by a comprehensive cloud management platform, can provide CIOs and IT departments with the most viable solution to shadow IT. Teaching an old dog new tricks  Ultimately, IT teams can learn from some of the benefits SaaS brings. In the fast paced world that we live in, the modern employee wants things immediately, whether it is the latest app or access to the data they need to complete a task. This means that IT teams need to provide a highly scalable technology backbone to lead to better staff productivity and in turn a quicker time to revenue. Whilst at its core SaaS is mostly software re-packaged and centrally managed, there are certainly enough potential benefits to warrant it being on the radar of any business wanting to update its systems to meet shifting demands. ### Working smarter, not harder, in the Knowledge Economy Information is the lifeblood of any knowledge-based business. But gaining information is only a small part of the process. 
Knowing how to store, access and utilise data is, at the end of the day, critical to the success or failure of an enterprise. With more and more information at our disposal, we are in danger of slowing down rather than speeding up our business processes. It is the equivalent of the modern day traffic jam; there is so much traffic on congested roads that modern day cars move at a slower pace than the horse and carriage of yesteryear. [easy-tweet tweet="Information overload is the digital equivalent of a traffic jam says @PaulDHampton" user="comparethecloud"] In fact, information community association AIIM says that due to the enormous amount of information we create today, we are becoming less efficient, rather than more. It estimates that ‘lost’ documents at work cost the UK economy £15bn every year. It seems that the more tools we have at our disposal – tablets, phones, laptops, home PCs – then the more inefficient we are becoming at sharing and using the information we produce on them. Modern Enterprise Content Management (ECM) solutions must accommodate new ways of working. IT departments are under tremendous pressure to support a new class of connected, tech-savvy employees whose expectations for ease of use have been shaped by consumer web-based applications. Today’s employees want to find and share business documents as easily as they can share photographs with their friends or browse and buy a book online. This new working paradigm is being further transformed by an influx of Millennials. These digital natives – born in 1983 or later – grew up on intuitive, modern apps like Instagram, Snapchat or Uber. Millennials are highly mobile, very connected, and want IT solutions that allow them to get their work done independent of location, network, and device. Their expectations for ease of content access intensify the pressure on IT organisations to modernise their ECM strategy. The risk for IT is that if they do not deliver solutions that meet user expectations – making it easy to share with people outside the organisation and to access content on mobile devices when outside the office – end users will find their own tools. These unsanctioned tools lead to the proliferation of disconnected information silos and greatly increase the risk associated with confidential information. But whilst ECM tools are fine in themselves, we now also have access to Business Process Management (BPM) capabilities, which help to structure and unblock potential information silos. Whilst BPM is not new, it is becoming more sophisticated and adaptable as time goes on. Adding BPM capabilities to content management tools helps to move work to users who need access to the relevant information, and speeds the flow of information around the workplace. It is the true democratisation of the office. Aiming for the clouds Enterprise IT has been transformed by successive technology waves. First, cloud computing changed everything. Now, mobile devices are rewriting the rules of corporate IT. The realities of today’s enterprise architecture demand a new approach to ECM / BPM that is in sync with how IT supports the modern business. [easy-tweet tweet="Leading analysts like @IDC and @Forrester predict a major shift to #hybrid enterprise #content management." user="PaulDHampton"] To better support mobile users and external partners, many companies are moving some of their business content to the cloud. 
Leading analysts like IDC and Forrester predict a major shift to hybrid enterprise content management. The next generation approach involves storing content both on premises and in the cloud, with seamless syncing between the two locations. Hybrid ECM and BPM meet IT’s need for control and compliance, while freeing business users and external collaborators to be more productive. BPM, ECM and case management (also called dynamic case management, or DCM) platforms have traditionally been separate systems, whilst sharing some common architectural components. Whilst DCM and ECM platforms emphasise tooling for content management, recording and tracking dynamic unstructured collaboration of multiple functional groups, BPM suites include the stronger tooling required to codify structured, repeatable processes, workflows and tasks. BPM also adds the capability to analyse and control execution and performance. Analyst company 451 Research sees a growing integration between BPM, DCM and ECM platforms, driven by cloud content sharing capabilities. The convergence of these systems and markets will form a common framework for application development and integration of all types in response to increasing enterprise demand for vendor and technology consolidation. A look inside any modern data centre confirms that one size does not fit all. Today’s ECM systems need to support traditional on-premises deployments, virtualised environments, private cloud deployments and fully fledged public-cloud SaaS deployments – and everything in between. IDC estimates that by 2020, 13% of all data will be stored in the cloud, 61% will be stored on-premises, and the remainder will be ‘touched by the cloud’ (ie processed or transmitted). Flexible implementation options give IT the agility to address evolving business needs over time. [easy-tweet tweet="Data will soon be 'touched by the cloud' at the very least says @IDC" user="comparethecloud and @PaulDHampton"] Enterprises utilising hybrid cloud based ECM / BPM solutions are much more able to keep control of their data in a scalable and flexible manner. To be effective, cloud information management processing needs to be available anytime and from anywhere; process optimisation choices need to be as rich as they are in the enterprise. Less paperwork for all Employees prefer electronic collaboration too. A recent report from digital media company Adobe found that people despise paperwork so much, they would be willing to leave their job for another with equivalent compensation ‘if only the benefit were dramatically less administrative work’. It seems that more than two-thirds of us (69%), whether in the UK or US, are sufficiently fed-up with administration to jump to an employer with less form filling and filing to do. BPM processes can really help here.  It’s a way of making the whole process more streamlined for everyone and giving people more time to focus on the more fulfilling aspects of their work. The new generation of iBPM – or intelligent Business Process Management – is going to take the process one step further, using social media, collaboration, mobility, analytics and cloud services to enable completely connected workplaces. 
[easy-tweet tweet="We are looking towards creating a true Knowledge Age economy" user="PaulDHampton"] Gartner’s research, gathered over hundreds of interviews with clients of all sizes around the globe, found that successful BPM projects have seen at least double-digit ROI. More than three-quarters of projects reported an ROI of more than 15%, with a similar proportion claiming an increase in competitive advantage. The soft benefits were also tangible, with benefits across customer retention, employee satisfaction, and transparency, all linked to positive customer business outcomes. As the workplace becomes more diverse, sophisticated and interconnected, it seems improved businesses processes are the next phase in creating a true Knowledge Age economy. ### How to close the door on security risks Many homes and businesses need to have a certain level of security to satisfy their insurance companies, but thieves are always finding new ways to enter properties. With new technologies and online systems there are now even more security problems that have to be overcome. Technology trends and downfalls Residents and business owners are now moving to relying on ‘wireless home-control systems’. They allow the owner to remotely control lights, garages, security cameras and almost anything powered by electricity in the home or office. Recently it was revealed by Forbes Investagative journalist Kashmir Hill that smart homes are vulnerable to hacking. A couple in Oregon, (USA) had their 'smart home’ system commandeered by Hill. Hill discovered she could control their system due to security flaws, and after making a rather creepy phone call, she took over their appliances. If she had been a criminal she could have gained entry to the house or worse. The wireless home-control solution product from Insteon allowed Hill to access seven other households thanks to a security flaw that meant the technology was visible to search engines. There were no passwords, or even usernames being employed to protect the users’ homes. Unfortunately this is not an isolated incident - googling ‘smart home’ will potentially give anyone access to your home system. [easy-tweet tweet="Are your flickering lights and mysteriously locking doors a ghost or a hacker? " user="comparethecloud" hashtags="smarthome"] Cyber security The similarities between protecting your devices and your home is as straightforward ensuring that the doors and windows are locked, and making sure that the security on your computer or phone is just as safe. Many people are guilty of using the same password for each account they have, or some even use the word ‘password’, which is not the safest way of protecting your bank details. This is similar to having only one key for your whole building, and leaving a copy of it in your letterbox. The same goes for businesses that are sending confidential data around the world. It can be as easy for a hacker to gain access to these systems just as it can be for a thief to enter your property via an unsecured door or window. "PASSWORD" is NOT a SAFE password! Personal safety It’s not only important that you protect your home or shop, it’s also imperative that you ensure you and your staff are safe as well. If you feel confident that your surroundings are protected then you will feel safer in yourself. You can fit coded security locks to the external doors, so that if you are in the building alone you’ll know that you are safe. 
Technology enhancements to security assets are greatly improving the way you can protect your home. On-person security devices are also becoming readily available on the market - including phone apps that allow victims of domestic abuse to alert authorities without letting their abuser know that the signal has been sent out.  Risk assessment and Protecting your Premises If you are running a business or protecting your home you can get security advice from your local Police force. They will come to your premises, assess your current protection and give you advice on any updates that need to be carried out. Fit cameras and ensure that they record and timestamp what is happening outside the building, make sure there are no blind spots where potential burglars can hide, and check that all door and window locks are of a high enough standard.  [easy-tweet tweet="Ensuring your digital security is just as important as your physical security" user="comparethecloud"] Shops and residential properties around the country are constantly upgrading their protection against theft and damage. Using Security Direct UK is a good place to start your search for shutters, grilles and other protective measures which will give you peace of mind. It only takes one loose door, lock, or ‘weak’ password and the burglars and hackers will soon be inside helping themselves to your goods and data. It’s better to act now than regret it later. ### IBM LinuxONE: Revolutionising the Data Center The current of change that is now surging through Information Technology is challenging us to answer some fundamental questions about the future of our industry. With the growing dominance of big data, cloud and engagement technologies, we must ask ourselves:   How can we handle digital technologies needed for mobile, social and other demands? How can we make better decisions based on the explosive growth of data – and turn data into a resource? How can we build an infrastructure that optimises digital marketing strategies?  How can we do it all faster? Cheaper? And better?  No one has all the answers, but for more than 20 years now, Linux and IBM have been at the forefront of a movement in open source collaboration that has inspired ground-breaking innovation, expanded capabilities for developers and taken us to the brink of a new IT era.  This week Linux, IBM, and an extensive partnership ecosystem joined forces to raise the bar yet again.   While Linux has been available on the mainframe for the past 15 years, we are now introducing LinuxONE, the fastest, most secure and most open family of enterprise-grade Linux systems built on IBM’s mainframe technology.   [easy-tweet tweet="LinuxONE can scale up to 8,000 Virtual Machines, the most of any single linux system" user="comparethecloud" hashtags="linuxcon"] LinuxONE is the world’s most advanced Linux system.  It can perform up to 30 billion RESTful web interactions a day with more than 470,000 database accesses per second. It can scale up to 8,000 virtual machines – the most of any single Linux system.  And it builds on the mainframe’s phenomenal ability to deliver near 100% uptime for business-critical workloads. In addition, LinuxONE offers the Linux KVM hypervisor, allowing organisations to manage LinuxONE virtual machines with the same operational process and skills they use for the rest of their IT environment. 
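Because the guests are ordinary KVM virtual machines, they can be inspected with the same standard tooling used on any Linux host. Here is a minimal, illustrative sketch using the libvirt Python bindings; it assumes a host with the libvirt daemon running and the libvirt-python package installed, and nothing in it is specific to LinuxONE.

```python
# List KVM guests on a host using the standard libvirt bindings.
# Assumes the libvirt daemon is running and libvirt-python is installed.
import libvirt

def list_guests(uri: str = "qemu:///system") -> None:
    conn = libvirt.openReadOnly(uri)   # read-only connection to the local hypervisor
    try:
        for dom in conn.listAllDomains(0):       # 0 = no filter flags
            state, max_mem_kib, _, vcpus, _ = dom.info()
            print(f"{dom.name():20} active={dom.isActive() == 1} "
                  f"vcpus={vcpus} max_mem={max_mem_kib // 1024} MiB (state={state})")
    finally:
        conn.close()

if __name__ == "__main__":
    list_guests()
```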
I have spent more than 30 years in and around the enterprise space with z Systems and I have never been more excited about the breakthroughs in Linux and open source technology. But beyond all the purely technical advantages, what does this mean to you, the developer? For me, this is all about standardisation in two important areas.  First, we are giving developers what they want – the strongest integration of Linux with the enterprise-grade capabilities of LinuxONE, with greater access and choice and all the tools developers know and love. With this set of tools, they are no longer limited in what they can develop or deploy on this great platform. Developers can use the LinuxONE system with agility and confidence.     [easy-tweet tweet="With the #Linux Foundation @IBM will support the newly formed Open Mainframe Project" hashtags="linuxcon"] The second big advancement is in the management of the platform.  By adopting technologies like the KVM hypervisor and OpenStack extensions, we are integrating tools that standardise the operational control of the Linux environment, lower cost, and eliminate the need for new training and specialised skills. And of course, the KVM hypervisor for LinuxONE will be equipped with the optimisation our clients have come to expect from a mainframe. I am also proud that IBM is taking significant steps to bolster our already strong commitment to open source innovation. This includes donations of industry leading IT analytics technology to help identify and prevent failures before they occur and cause service disruption. And in collaboration with the Linux Foundation we will support the newly formed Open Mainframe Project. The new community will foster collaboration by enabling knowledge sharing around Linux on the mainframe. Founding members include IBM, CA Technologies, Canonical, L3C, Red Hat and SUSE. When the mainframe concept was conceived more than 50 years ago, it was IBM’s desire that it be “future made” – capable of adapting and being compatible with evolving technology. LinuxONE has more than fulfilled that promise. The enhancements we have just announced, including the greater integration of Linux and enterprise server technology, and the strengthening of our commitment to strong, open ecosystems, ensure that we continue to build a “future made” infrastructure that is always one step ahead of the next big disruption. [easy-tweet tweet="We will continue to build a #futuremade infrastructure" user="IBM and @Linux" hashtags="linuxcon"] ### The stupefying cost of underestimating the DDoS threat Imagine you owned a brick-and-mortar store. You’ve built this business over time and it’s more than your livelihood, it’s your passion. Now, what if you learned that there is a 45 percent likelihood your store will be broken into? Once you recovered from the shock that there is nearly a 1 in 2 chance that your store will be robbed, you’d likely upgrade your security system. It’s only logical. There is a 45% chance you will be targeted by a DDoS attack So why is it that when businesses and entrepreneurs learn that there is a 45 percent chance that their organisation’s website will be targeted by a DDoS attack, they still rely on no more than existing firewalls or other basic forms of internet security to protect them? Maybe because most of them don’t realise what a Distributed Denial of Service (DDoS) attack on your website truly costs. 
The initial financial hit A DDoS attack basically overloads a website and its servers with external communications requests – so many that the server can’t respond to legitimate traffic and is rendered essentially unavailable. But website downtime may not be the worst impact on a business. The average cost of a DDoS attack is $40,000 per hour, according to a recent survey conducted by Incapsula of IT professionals from nearly 300 North American businesses. No, that’s not a misprint. It actually is $40,000 per hour, but that's only the beginning of this nightmare. Nearly half of all DDoS attacks last between 6 and 24 hours. [easy-tweet tweet="A #DDoS attack could cost your business $40,000 per hour, are you protected?" user="comparethecloud"] Based on these numbers, the average cost of a DDoS attack is somewhere around $500,000 for each incident and in some cases, the actual cost is significantly higher. For instance, 4 percent of the IT professionals interviewed in the survey have experienced at least one DDoS event that lasted more than a week, and recently one lasted 38 days. What’s the cost of recovering the data? It’s hard to know where to start. [Figure: DDoS traffic generated by DNS and SYN floods over the course of the attack.] The longer-term costs In addition to costs incurred fighting off a DDoS incident, these attacks have been found to cause at least one of the following: software or hardware replacement, the installation of malware or a virus, a reduction in revenue, loss of customer trust, financial theft, and loss of intellectual property. Not only are these consequences costly, but it can take an organisation weeks or even months to recover from them. In some instances concerning the installation of malware or viruses and loss of intellectual property, the damage may never be fully undone. This is also true for loss of consumer trust. According to the survey, 43 percent of all IT professionals mentioned it as one of the outcomes of a DDoS attack, making it the third most common outcome. [easy-tweet tweet="43% of #IT professionals cite loss of consumer trust as an outcome of #DDoS attacks" user="comparethecloud"] These numbers likely would be even higher if the survey had polled support, sales and marketing teams. In most DDoS attacks, sales and customer service are two of the business areas that take the biggest financial hit, accounting for more than a third of all financial losses as the DDoS tide recedes. Targets of all sizes There have been plenty of high-profile attacks on major corporations in the news, including repeated attacks on Sony PSN and Xbox Live. Last Christmas Sony had several different networks taken down by DDoS attacks and has had its Sony Pictures division hacked so severely that unreleased movies were put online and its corporate computer network was rendered unusable. As for the Xbox Live DDoS attack, the Lizard Squad hacking group (the same group responsible for the repeated DDoS attacks on Sony PSN) says they are preparing some additional surprises, and are threatening to shut down the Xbox Live network forever. Certainly, the bigger the organisation, the larger a target it presents for DDoS attacks. There are plenty of websites and businesses that would never have to worry about losing $40,000 per hour as a result of a DDoS attack, simply because they don’t deal with that kind of money. 
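The back-of-envelope maths behind those headline figures is easy to check for yourself. The sketch below uses the hourly cost and duration range quoted above; the annual mitigation price at the end is a purely hypothetical placeholder added for comparison.

```python
# Rough cost model for a DDoS incident, using the survey figures quoted above.
# The mitigation subscription price below is a hypothetical placeholder.

COST_PER_HOUR = 40_000              # average cost of an attack, USD per hour
TYPICAL_DURATION_HOURS = (6, 24)    # nearly half of attacks fall in this range

low = COST_PER_HOUR * TYPICAL_DURATION_HOURS[0]
high = COST_PER_HOUR * TYPICAL_DURATION_HOURS[1]
midpoint = (low + high) / 2

print(f"6-hour attack:  ${low:,.0f}")
print(f"24-hour attack: ${high:,.0f}")
print(f"Midpoint:       ${midpoint:,.0f}  (the same ballpark as the ~$500,000 average cited)")

# Compare against a (hypothetical) annual mitigation subscription.
annual_mitigation = 30_000
print(f"Hypothetical mitigation cost per year: ${annual_mitigation:,.0f} "
      f"({annual_mitigation / midpoint:.0%} of one average incident)")
```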
Does this mean smaller websites and businesses aren’t targets? Unfortunately, the answer is no. Any website with customers, intellectual property, personal data, and financial information is at risk. On a weekly basis, small CMS-powered websites are hit by DDoS attacks, some of which are enabled by vulnerabilities in these platforms. DDoS perpetrators don’t have moral codes, professional ethics or sound business plans. The media only covers the most interesting and most prominent DDoS events. Many of the attacks we are dealing with have nothing to do with hacktivism, corporate espionage and geo-politics. Often a DDoS attack is simply a random act of vandalism by someone who decided to have some fun and take down a website, just for “RTs and LoLs”.    Investing in protection For many businesses, the unsettling reality is that it’s not a matter of if they’ll be hit with a DDoS attack, but when. As the saying goes, it’s better to be prepared. [easy-tweet tweet="36% of all IT professionals are not confident in their current #DDoS protection" user="comparethecloud"] The survey found that 36 percent of all IT professionals are not confident in their current DDoS protection, even though 46 percent were using purpose-built solutions. They have good reason. It’s been demonstrated time and again that counting on a firewall or ISP for protection in the event of a DDoS attack is misguidedly optimistic. Firewalls are easily overwhelmed and ISPs don’t have the resources to monitor all traffic to all websites they serve, not to mention the ability to respond to DDoS events in a timely manner. In fact, even if they detect the attack, their best option is to drop all incoming traffic, as they are not usually equipped to offer any filtering solutions that will allow you to stay online while “under fire.” Professional mitigation services may seem like a major investment. But the price is paltry compared to the costs of a DDoS attack, especially on a mid-sized to large organisation. ### The Worst Computer Viruses in History Antivirus and mobile security firm, BullGuard, has produced an infographic that highlights the major viruses of the past three decades and just how vulnerable the internet can be to these destructive strings of code. Viruses can be an irritating menace or they can be devastatingly destructive. It’s hard to believe today that when computing for the masses took off in the 1980s the concept of a computer virus was something generally scoffed at and dismissed as outlandish. [easy-tweet tweet="The Worst Computer Viruses in History #Infographic from @bullguard now on @Comparethecloud"] Today, we know different; their potential for harm has mirrored their increasing spread. From Slammer, which caused an estimated $10 billion worth of damage, to Conficker, which caused chaos and panic and infected about 15 million computers, they now spread from one corner of the world to another at the speed of light. ### Barracuda Launches CudaDrive New Secure Cloud File Service with Virtual Drive and User Data Protection Store, Protect, and Access Files from Anywhere  Barracuda today launched CudaDrive, a flexible and secure cloud file service for businesses. Delivered to desktops via virtual drive, as well as via mobile and web applications, CudaDrive enables businesses to securely store, share, and access files stored in the Barracuda cloud from anywhere. 
CudaDrive also includes user data protection for endpoints, and the option to add an appliance for an easy transition from existing corporate file shares. Additionally, the desktop virtual drive feature helps preserve valuable storage space by not requiring that a file be synced locally to access it, a key differentiator from legacy sync-only services. As with other Barracuda solutions, pricing is simple, with plans available by storage capacity with unlimited users rather than charging by seat or by user.  “We’ve helped many companies back up and protect server data with Barracuda Backup, and now with CudaDrive, we’re able to protect end-user data and provide easy sharing,” says Rod Mathews, GM Storage, Barracuda. “It’s not uncommon that users view, share, and store corporate data from many different access points, whether on a smartphone, tablet, or computer, presenting a challenge for companies to ensure that all of this data is protected. CudaDrive helps solve the common business challenge of enabling access to corporate files from any device, while protecting data stored on those devices in the event of a failure or loss.”  Storage Management via Desktop Virtual Drive Designed for business, CudaDrive delivers cloud file services via a unique desktop virtual drive interface and provides employees with instant and secure access to company data from any device. Unlike legacy sync-only services, the virtual drive feature offers users an identical view of their files stored in the Barracuda cloud, even if the files and folders are not stored on the users’ local hard drives, helping to preserve valuable storage space.  CudaDrive highlights include:
• User Data Protection - automatically protects documents, desktop, and downloads folders on endpoints, including individual computers and mobile devices, with the ability to add other locations with one click.
• Desktop Virtual Drive - intelligently caches data, minimising local storage requirements and maintaining access to files without the need to sync.
• Enhanced Security - encrypts all data in transit to, and at rest in, the Barracuda cloud, using 256-bit AES encryption - the same standard used by military and government agencies - and locally stored data becomes unavailable once users are logged out of the service.
• Simplified Pricing - can be purchased by storage capacity without any per-user fees.
Intelligent Bandwidth Management via CudaDrive Site Server The CudaDrive experience can also include the addition of an on-premises CudaDrive Site Server, allowing for intelligent bandwidth management for businesses. CudaDrive Site Server provides faster, direct access to files via a local cache, improving speed while on a company’s LAN. Further, the appliance enables file server consolidation by sharing files stored in the cloud via network drive shares. This eases the transition from traditional local file servers to CudaDrive cloud storage. With CudaDrive Site Server, users have ubiquitous access to files both in the cloud and stored locally. “Like many companies, our users and agents need access to corporate files while they’re away from the office, and we need to make sure the data they’re using is protected,” explains Steve Lenker, Director of Operations, Howard Hanna Realcom Realty. “Barracuda has designed CudaDrive to meet the demands of both IT and end users. 
CudaDrive offers customers the ability to protect corporate data, leverage flexible pricing, and the option to add a local appliance for local data control.” “As a Barracuda reseller, we’ve come to rely on the network flexibility Barracuda’s storage solutions offer our customers,” says Rusty Easter, Owner, REDtech, LLC.  “CudaDrive will help us expand our solution offerings to customers to help them protect and securely access their company data from mobile devices at an attractive price point.”  CudaDrive is Barracuda’s next-generation cloud file services platform, and is an evolution of Copy, the company’s legacy file sync and share service. Existing Copy for Companies customers will be migrated to CudaDrive. For additional information about CudaDrive, please visit: www.barracuda.com/products/cudadrive ### LinuxCon North America 2015 Compare the Cloud is blogging live from LinuxCon North America, Seattle - 17th - 19th August 2015. ### Business continuity is delivered through the cloud Backup and recovery are fundamental to business continuity, and both of these areas have progressed significantly over the past few decades. Traditionally, thousands of space-hungry tapes filled the data centre floor and data centre managers worked within specific backup windows to carry out their daily and weekly backups. As the volume of data within enterprises increased, change was afoot. The backup process required more time to complete, putting pressure on backup windows, applications and overall IT performance. As tape struggled to meet the requirements for primary backup and disk was becoming far cheaper, disk-to-disk backup started to take over as the primary backup layer, with tape increasingly being relegated to deep-archive. [easy-tweet tweet="The need to safeguard against #dataloss and #datacorruption is very real in #cloud" user="comparethecloud"] For IT Directors, the importance of backing up does not need to be highlighted – the need to safeguard against data loss and data corruption is very real, especially as the increase in data volumes shows no signs of abating. Today, a key priority for enterprises is balancing price and performance by determining which data is most important and most frequently accessed and where it should sit within the storage hierarchy. Who’s got your back? For organisations looking to back up as economically and efficiently as possible, there have traditionally been a number of challenges to consider. These include time constraints, security issues and management requirements, as well as capex and opex considerations of cost and capacity. These challenges have caused the backup and recovery industry to develop rapidly. As systems have become more sophisticated, with more focus on the constant availability of data, organisations have, in turn, become more demanding. Now, IT Directors look for solutions that can back up data and also create zero-capacity snapshot copies, as well as perform deduplication, compression and encryption. In addition, they want to reduce the costly management overhead of having to run and maintain the whole process themselves. Cloud-based backup is increasingly seen as a first-step in addressing these priorities The rapid rise of cloud-based backup and recovery solutions has been the result of these demands. Cloud-based backup is increasingly seen as a first step in addressing these priorities, closely followed by the deployment of Disaster Recovery-as-a-Service, also in the cloud. 
This is primarily due to the economics, elasticity and agility of the cloud, which makes it an ideal platform for backup and recovery and other aspects of data protection. [quote_box_center] In response to these demands, NetApp recently introduced AltaVault, which allows organisations to back up data directly into the cloud quickly and at up to 90% less cost than traditional on-premises solutions. Through intelligent deduplication, NetApp is able to reduce data sets by up to 30 times, making it a viable mechanism for storing large volumes of data at minimal cost. Added to which, the data is compressed and is securely protected using AES-256 encryption before it is sent to the cloud. AltaVault is compatible with 85% of leading backup software vendors, which means that existing investments are preserved and the need for training is minimised. The AltaVault solution is available in physical and virtual appliance forms, and is also offered in the cloud as a pay-as-you-go virtual appliance via Amazon Web Services, shortly to be followed by Microsoft Azure. One example of a company that is deploying AltaVault is Blach Construction, a construction company based in Northern California. As Blach grew and expanded, so did its data, driving the need to find a cost-effective backup and recovery strategy to safeguard this vital asset. Backup storage was Blach’s first foray into the public cloud. As a result, it saves 20 business hours per month, which the IT team previously spent monitoring backup jobs, tweaking its devices and managing connectivity issues with the SANs. Now, the IT team is able to focus more on supporting their construction sites. For any organisation, this move to the cloud spells business continuity. As AltaVault keeps a cached copy of recent backups, organisations are able to restore data rapidly when required, safe in the knowledge that a second copy of the data is vaulted in the cloud. For any data centre manager, this is key. Ultimately, you are measured on what you can restore and recover, not what you backed up in the first place. [/quote_box_center] Backup is merely a mechanism for retrieving critical information in a disaster scenario when your organisation experiences an unplanned loss of data. In order to keep the backup process as efficient as possible, organisations are now also looking to archive data that is not going to be accessed or edited in the near-term. This archiving, also known as cold storage, allows them to focus their backups just on the most recent, active data. Continuity and compliance Moving forward, the industry will develop further still with changes to EU data protection law to unite all twenty-eight EU member states, which currently have their own individual laws. The proposed regulation will look to give individuals more control over their data, with easier access to it and a “right to be forgotten”. For organisations, it will be necessary to rigorously evaluate the risks of any data handling and to guard against accidental loss of data. It will only be possible to process data if the data subject has consented, or if the processing is strictly necessary. 
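The mechanics described in the quote box above – deduplicate, compress, then encrypt before anything leaves the building – can be illustrated in a few lines. The sketch below is purely illustrative and is not NetApp's implementation; it assumes the third-party Python 'cryptography' package for the AES-256-GCM step, and the chunk size is an arbitrary choice.

```python
# Illustrative backup pipeline: chunk -> deduplicate by content hash ->
# compress -> encrypt with AES-256-GCM before sending data off-site.
# Not any vendor's implementation; requires the 'cryptography' package.
import hashlib
import os
import zlib
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

CHUNK_SIZE = 64 * 1024  # 64 KiB chunks (arbitrary choice)

def backup(data: bytes, store: dict, key: bytes) -> list:
    """Returns a list of chunk IDs; only previously unseen chunks are stored."""
    manifest = []
    for i in range(0, len(data), CHUNK_SIZE):
        chunk = data[i:i + CHUNK_SIZE]
        chunk_id = hashlib.sha256(chunk).hexdigest()   # content-addressed ID
        if chunk_id not in store:                      # deduplication step
            compressed = zlib.compress(chunk)
            nonce = os.urandom(12)
            sealed = AESGCM(key).encrypt(nonce, compressed, None)
            store[chunk_id] = nonce + sealed           # what would go to the cloud
        manifest.append(chunk_id)
    return manifest

if __name__ == "__main__":
    key = AESGCM.generate_key(bit_length=256)
    store = {}
    data = b"quarterly report " * 10_000              # highly repetitive sample data
    backup(data, store, key)          # first backup stores every unique chunk
    backup(data, store, key)          # unchanged data adds nothing new
    stored = sum(len(v) for v in store.values())
    print(f"original {len(data):,} bytes backed up twice -> "
          f"{len(store)} unique chunks, {stored:,} bytes stored")
```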
[easy-tweet tweet="IT Directors are under no illusions that backing up is a vital process for protecting business-critical data "] For businesses, there is going to be an increasing need to remain compliant and to ensure that data is backed up with the appropriate measures. Any organisation handling the data of EU citizens must comply with these regulations. This will most likely spell further changes to the backup and recovery industry as businesses evaluate their cloud backup solutions and ensure that they are fully compliant, at the same time as maximising cost-effectiveness, efficiency, and security. IT Directors are under no illusions that backing up is a vital process for protecting business-critical data and achieving business continuity within their organisations. The current debate is no longer about the benefits of backup and recovery systems, but instead, looking at how the industry will evolve further and the impact EU data protection regulations will register. ### MidoNet Achieves Certification with Red Hat Enterprise Linux OpenStack Platform 7 Midokura, the global innovator in software network virtualisation, today announced Red Hat Enterprise Linux OpenStack Platform Version 7 (RHEL OSP 7) certification for the latest version of Midokura Enterprise MidoNet (MEM), a scalable network virtualisation solution designed for Infrastructure as a Service (IaaS) clouds. With MEM as the first certified plugin for RHEL OSP 7, Midokura and Red Hat will help drive enterprise adoption for OpenStack cloud deployments. “The strength of Red Hat Enterprise Linux OpenStack Platform lies in joint engineering with RHEL and the robust ecosystem of certified networking solutions,” said Adam Johnson, VP of business for Midokura. “We are pleased to be working closely with Red Hat and the OpenStack Neutron core team to drive requirements from enterprises and web services companies into our joint solutions.” Announced in May, the latest MEM offers an intelligent, software-based network abstraction layer between the hosts and the physical network, by decoupling the IaaS cloud from the network hardware. Operators in turn can build isolated networks in software that overlays the existing hardware-based network infrastructure. The new enterprise-grade MEM release offers enhanced operational and management features, along with new support for OpenStack Kilo, and for container-based environments, such as Docker. Midokura continues to see overwhelming demand for its award-winning MEM and open source MidoNet technology worldwide. Midokura is the trusted network virtualisation vendor of choice for large-scale OpenStack deployments for leading organisations, including Blue Jeans Network, Dell and Colt-KVH. Midokura is a proud participant of the Red Hat Connect for Technology Partners program, which connects business and technical resources to third-party technology companies who are aligning with Red Hat’s OpenStack product offerings. For the full list of solutions certified for RHEL OS 7 and other products in Red Hat’s open hybrid cloud portfolio, visit Red Hat’s Certified Solution Catalog. For information on support, services and costs for the latest MEM technology, contactinfo@midokura.com. ### Udemy's beginners guide to AWS There is a cloud-computing service that dominates the internet; it powers thousands of businesses in 190 countries around the world. You have likely visited at least a few of these websites already today. 
It has the ability to produce as much power as dozens of servers, without taking up all that space. Yet, according to a survey conducted by Udemy, nearly two-thirds of Americans don't know what it is. [easy-tweet tweet="The cloud has been adopted by nearly 90% of businesses" user="udemy and @comparethecloud"] Amazon Web Services began in 2006, and since then has rapidly expanded across the airwaves of the World Wide Web. It powers large sites such as Netflix, Reddit and Pinterest, and each of those sites has over 35 million user accounts. So even though it seems very few Americans are aware of this service, it powers a lot of sites that many of us use every day.  Udemy is an "online learning marketplace" where users can take educational courses in a plethora of subjects, and they've created an infographic to tell us more about the prevalence of AWS, and why it is so important to our daily browsing.  [easy-tweet tweet="Read @Udemy's guide to #AWS on @Comparethecloud"] ### Linux vs. Linux! We're at #LinuxCon 2015, Seattle Join Us at LinuxCon 2015, LinuxCon North America Conference, United States of America, Seattle, 17 - 19 August 2015 - tweet us @comparethecloud #LinuxCon ### 5 more reasons why businesses should adopt the cloud The arguments in favour of moving contact centre solutions to the cloud are well worn and well understood. Pay-as-you-go models mean businesses can move to an opex rather than capex model. It’s flexible, so businesses can quickly scale their computing power up or down when necessary. Increased collaboration, better disaster recovery and the ability for employees to work flexibly mean that, more and more, cloud computing is the first choice for most businesses. This trend was confirmed by recent independent research commissioned by Genesys, which surveyed businesses using contact centres, with 55% of respondents stating that their entire contact centre functionality would be cloud based by 2020. [easy-tweet tweet="55% of respondents said their entire contact centre functionality would be cloud based by 2020"] This trend is hardly surprising, given the array of advantages that the cloud offers. However, above and beyond the common reasons listed above, there are other less understood reasons for making the switch. Here are 5 more reasons why you should consider moving to the cloud. Rent costs For many businesses, office rental is one of the biggest cost centres. With cloud computing, businesses can be flexible if a more suitable site location comes up, as they don’t have the hassle of a complicated and costly IT move, as well as the upheaval and downtime associated with moving servers and other hardware to another location. What’s more, because cloud computing also supports home working, businesses need only rent the amount of space they need on an average day. Rather than having desks for every employee, which often leads to large amounts of unused office space, staff can hot desk. This can mean huge savings on rent. Smaller office space can also mean lower council rates. Seasonal fluctuations Businesses that experience seasonal fluctuations need to hire more staff at peak times. In the past, businesses would have to purchase enough software licenses and bandwidth to cover these peaks, despite the fact that this extra capacity would not have been used for the remainder of the year. Cloud lets users scale up quickly and easily with pay-as-you-go licenses.  
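As a quick illustration of that seasonal point, the sketch below compares licensing for peak demand all year round with scaling up only for the busy months. Every figure in it is a hypothetical placeholder rather than a real vendor price.

```python
# Hypothetical comparison: licensing for peak demand all year round versus
# scaling a pay-as-you-go cloud contact centre up for the busy season only.
BASE_AGENTS = 40              # agents needed for most of the year
PEAK_AGENTS = 100             # agents needed during the 3-month peak
PEAK_MONTHS = 3
PRICE_PER_AGENT_MONTH = 120   # placeholder licence/seat price

# Traditional model: buy enough licences for the peak and hold them all year.
fixed_capacity = PEAK_AGENTS * PRICE_PER_AGENT_MONTH * 12

# Cloud model: pay for the base level, then add seats only while needed.
pay_as_you_go = (BASE_AGENTS * PRICE_PER_AGENT_MONTH * 12 +
                 (PEAK_AGENTS - BASE_AGENTS) * PRICE_PER_AGENT_MONTH * PEAK_MONTHS)

saving = fixed_capacity - pay_as_you_go
print(f"Fixed peak capacity: ${fixed_capacity:,}")
print(f"Pay as you go:       ${pay_as_you_go:,}")
print(f"Saving:              ${saving:,} ({saving / fixed_capacity:.0%})")
```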
the average annual cost in electricity of running an on-site server for a small business is around $750 Utility bills One hidden cost of on-site servers is the electricity bill. Running the servers not only consumes a huge amount of electricity; they also need to be cooled to prevent overheating and failure. The warm air servers produce also means the office itself has to be cooled, further adding to utility bills. Finally, servers also require constant management, with yet more machines running to support that management. A couple of years ago, Teena Hammond of ZDNet estimated that the average annual cost in electricity of running an on-site server for a small business was around $750. Telecoms bills In contact centres, call queueing can be expensive. Where contact centre operations have more than one site, a cloud solution allows calls to be queued at the network level when there are no agents available to take them. Once someone becomes available, the call can then be passed to an agent in one of the contact centres. This can make for significant savings on telephone bills. IT management costs Cloud computing means that on-site implementation costs and associated IT maintenance salaries are eliminated. What's more, ongoing IT maintenance is no longer needed at every site when hardware is removed. For multisite operations, the cloud offers the opportunity to put a single cross-site management team in place, with call routing and self-service controlled at a single point, which, in turn, reduces management costs and increases the available labour pool. cloud is fast becoming the only option for forward-thinking contact centre businesses Cloud computing is becoming a popular choice among IT decision makers in contact centre businesses. While benefits such as cost, flexibility and better collaboration are already well understood, there's a wide range of less understood advantages, including reduced rent and utility bills, and reductions in IT management costs. This means that cloud is fast becoming the only option for forward-thinking contact centre businesses. You can find more insights on cloud computing and the contact centre in the recent Contact Babel research The Inner Circle Guide to Cloud-Based Contact Centre Solutions. ### Planet of the Things Stephen Hawking touched a nerve when he reiterated his warning about the danger to humanity posed by artificial intelligence. In May this year, he and a group of leading scientists said: “Whereas the short-term impact of AI depends on who controls it, the long-term impact depends on whether it can be controlled at all. All of us should ask ourselves what we can do now to improve the chances of reaping the benefits and avoiding the risks.” Futuristic artificial intelligence may seem a far cry from today's Internet of Things (IoT), but in both cases the fundamental problem is about the uncertainty and risks of scaling complexity. Early experiments on the interactions between very simple elements – analogous to termites obeying a few basic rules – showed how surprisingly intelligent behaviour begins to emerge as the number of elements increases. Putting an emphasis on “surprisingly” – rather than “intelligent” – means that we are not predicting that some malevolent intelligence will emerge from the growing network of smart fridges, but rather that we may find ourselves facing unexpected consequences as we add billions of relatively simple devices to our already complex Internet.
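That "simple rules, surprising behaviour" point can be seen in a toy model. The Python sketch below is purely illustrative and is not from the article: it runs an elementary cellular automaton in which every cell obeys one trivial local rule, yet the row of cells as a whole produces intricate, hard-to-predict patterns as the steps accumulate.

```python
# Toy illustration of emergence: each cell follows one trivial local rule
# (Wolfram's Rule 30), yet the population as a whole produces intricate,
# hard-to-predict global patterns. Purely illustrative.

RULE = 30             # the local rule, encoded as 8 output bits
WIDTH, STEPS = 79, 40

row = [0] * WIDTH
row[WIDTH // 2] = 1   # start with a single "on" cell in the middle

for _ in range(STEPS):
    print("".join("#" if cell else " " for cell in row))
    row = [
        (RULE >> ((row[(i - 1) % WIDTH] << 2) | (row[i] << 1) | row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```

None of the individual cells is doing anything clever; the complexity comes entirely from scale and interaction, which is precisely the worry about billions of connected devices.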
[easy-tweet tweet="Futuristic artificial intelligence may seem a far cry from today’s IoT, but the fundamental problems are the same"] Even before we get on to those surprising consequences, however, there is the all-too-predictable certainty that criminal minds are already planning ways to exploit the IoT and create new forms of cyber attack. We recently saw a smart, Internet-connected fridge sending out spam as part of a junk mail campaign that had hijacked more than 100,000 connected devices. But why should this be any more worrying than the existing threat of botnet-launched spam campaigns? [easy-tweet tweet="Scientists are talking about trillion-sensor networks within 10 years." hashtags="IoT"] IoT – the added challenge The first big difference lies in the sheer number of devices that could be, and eventually will be, connected. The world’s population is around seven billion people, and already there are many more devices than that connected to the Internet – although estimates seem to vary considerably. According to IDC’s estimation the number of connectible devices approaches 200 billion while the number of sensors (e.g., the accelerometer in a smart phone) that track, monitor, or feed data to those things is already more than 50 billion, with scientists talking about trillion-sensor networks within 10 years. Of those 200 billion things around 20 billion are already connected, and the number is predicted to reach 30 billion connected devices by 2020. So the first problem is not so much about the impact of any particular thing as about the possibility of unpredicted responses or vulnerabilities emerging out of sheer complexity. ThE SPAM ATTACK OF THE Internet-connected fridge is a reality The second big difference, and the one posing more immediate risk, is the fact that most of the devices now being connected are new to the IT arena. Whereas each new computer added to the Internet comes with some degree of malware protection built into its operating system, things like smoke detectors, security alarms and utility meters come from a different culture: traditionally they were either autonomous units or else, if they were connected, it was on a closed, dedicated network. Fire alarms were installed by one company, control and instrumentation networks came from a different vendor, the electricity meter was installed by the power supplier and none of these networks overlapped. While computers and IT systems have for many years been fighting off attacks, none of these simple devices joining the IoT have inherent defences and they remain wide open to cyber attack. [easy-tweet tweet="None of the simple devices joining the IoT have inherent defences and they remain wide open to cyber attack." hashtags="IoT, cyberattack"] The risk is not only that the particular function could be compromised – say fire alarms disabled before an arson attack – but the IoT could provide a weak link or point of entry to an otherwise strong security chain. The infected fridge continued sending out spam mail without drawing attention to itself, because its normal operation was not affected. Despite this relative vulnerability, the most publicised attacks so far on IoT control systems have penetrated the system via IT: attackers using simple phishing-style means to breach the perimeter and then target privileged access accounts. 
As well as gaining access to databases and high value systems, this approach lets attackers use the same privileges to reach control systems, opening up whole new opportunities for sabotage and cyber war. That brings us to the third difference. A lesser difference, but potentially the most dangerous of all, is that many of the things joining the IoT have more of a direct physical role than the computers, game consoles and databanks currently populating the Internet. When the Stuxnet worm closed down some thousand centrifuges at Iran’s Natanz nuclear facility in 2010, IT departments all over the world woke up to the fact that a cyber-attack could cause actual physical damage. This was not simply an attack generating a signal to shut down the centrifuges, but one designed to force changes in the centrifuges’ rotor speeds that could lead to destructive vibrations and internal damage – causing far more serious delays to the nuclear program than any simple shutdown. [easy-tweet tweet="The most dangerous fact about #IoT? New devices have more of a direct physical role than computers ever did. "] A couple of years ago we heard about a breach affecting a Telvent control system designed to be used with “smart grid” networks. The attackers installed malicious software on the network and also accessed project files for its OASyS DNA system – designed to integrate an electricity company’s IT network with the grid control systems so that legacy systems and applications can communicate with the new smart grid technologies. There was nothing inherently wrong with OASyS DNA: it was a highly sophisticated system in use since the late 90s, but it was never designed to connect to the Internet. the IoT adds enormous extra scale to the already crowded Internet Project files provide a clever way to spread malware because vendors have full rights to modify customers’ systems through the project files. The files hold a lot of customer-specific system data, so an attacker could also use the project files to study a customer’s operations for vulnerabilities in order to design further attacks on critical infrastructure. The Stuxnet attack was a sophisticated example of how a project file was studied to discover how the centrifuges were controlled, and then modified so that the centrifuges behaved in a different, harmful manner. [caption id="attachment_17906" align="alignright" width="300"] View the Compare the Cloud special report on IoT and Cloud[/caption] So the IoT adds enormous extra scale to the already crowded Internet, and it also adds extreme diversity. On the one hand we are networking highly critical systems: industrial and utility grid control systems that could cause widespread damage or economic harm if breached; critical healthcare and remote medical devices containing sensitive personal data or responsible for life support; navigation and control systems for connected cars, air traffic control and so on. At the other extreme we have small low-cost monitoring devices, meters, wearable devices, simple switches for remote control of household lighting and so on. It would be unrealistic to insist that everything joining the #IoT should have its own built-in defences With such a range of devices it would be unrealistic to insist that everything joining the IoT should have its own built-in defences. The latest malware signature databases hold some sixty million records, and reliably identifying malware by current pattern matching techniques would require 3-4 GB of RAM.
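As a quick back-of-the-envelope check on those figures (the per-record sizes below are assumptions for illustration, not measurements of any real anti-malware product), sixty million signature records only need to average around 55-70 bytes each before an in-memory database reaches the 3-4 GB quoted:

```python
# Back-of-the-envelope check on the memory cost of in-memory signature matching.
# The bytes-per-record values are illustrative assumptions, not measurements
# of any real anti-malware product.

records = 60_000_000

for bytes_per_record in (55, 60, 70):
    total_gb = records * bytes_per_record / 1024**3
    print(f"{bytes_per_record} bytes per record -> {total_gb:.1f} GB of RAM")
```

That is a trivial amount of memory for a data-centre appliance, but far beyond what a ten-dollar switch or a smart meter can carry, which is the economic point made next.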
A more sophisticated defence is provided by behavioural analysis – studying how the code behaves when quarantined in a “sandbox” environment. Such analysis of behaviour for signs of malignancy is what computer scientists call an “NP-complete” problem – or what the layperson would call “very difficult”. Reducing operational costs is one major driver for IoT connection – so adding sophisticated cyber-security to a ten-dollar switch would be hopelessly uneconomic. There is no way that we can realistically defend the IoT on the militia model, where every device is armed against attack – so how is it possible to provide protection across such a vast and diverse cloud? How to disinfect the Internet of Things VASPA: Virtualization, Automation, Security, Programmability, and Analytics. Security is at the centre of the five key challenges being addressed by OpenCloud Connect (OCC), spelled out under the acronym VASPA, namely: Virtualization, Automation, Security, Programmability, and Analytics. The OCC, established in 2013, is an industry organisation embracing every type of cloud stakeholder – including major users as well as cloud service providers, network service providers, equipment manufacturers, system integrators and software developers. The most promising approach so far to securing the cloud, and so the IoT, is to adopt the SDN principle and consider the traffic flow as a virtual network, rather than a string of hardware elements, and so define a distinct “security layer” to orchestrate Security as a Service. Today’s Internet has been compared to a water supply without any guarantee of purity, leaving responsibility for filtering and sterilising the water to the customers. Internet users are expected to install their own anti-virus software, firewalls and other forms of security. Security as a Service, however, would mean providing traffic that is already decontaminated – so even the most humble connected switch on the IoT could benefit from the most sophisticated security, provided by the network itself. [easy-tweet tweet="Today’s Internet has been compared to a water supply without any guarantee of purity" user="comparethecloud"] On the network scale, deep packet inspection, pattern recognition with a cloud databank for security, behavioural analysis and other costly high-level malware defences become an economic proposition. Security as a Service provides a very attractive revenue stream and the ultimate added-value proposition for building customer loyalty and reducing churn. Security as a Service allows organisations to order whatever level or type of security is essential for their operation – knowing that it is being continually maintained and updated, and that it covers all their devices. ### Cloud and self-driving ERP – the new technology landscape Cloud is dictating how organisations are built and changing enterprise computing and software development forever. We are all looking to improve our interaction with enterprise applications; cloud holds the key and is increasingly seen as a central component of corporate IT strategies. 1. The rise of new service-led business models and innovation With more organisations adopting cloud computing, the focus on services is shifting from front-office applications such as customer relationship management and other customer service mechanisms to the back office. At the same time, traditional back-office applications like ERP have the potential, for the first time, to deliver valuable end-user experiences.
[easy-tweet tweet="By 2016 investment in traditional ERP systems will decline over 30 percent to less than $15B." user="unit4global"] A 2014 report by PwC predicted that by 2016, investment in SaaS solutions would more than double to $78B. This surge compares with investment in traditional ERP systems which will decline over 30 percent to less than $15B. PwC researchers argue that “monolithic legacy, on-premises ERP systems have often been designed to match a predictable drumbeat of production. However, that’s not going to work today. Customers are redefining the way business is done and non-standard is quickly becoming the new normal.” PwC’s analysis also shows that ongoing costs for hybrid ERP systems can often be higher than traditional ERP systems. For enterprises to get the most value from hybrid ERP, they should make them catalysts of strategic change, not just rely on them for cost reduction, says the study. monolithic legacy, on-premises ERP systems have often been designed to match a predictable drumbeat of production. However, that’s not going to work today. The rise of new services organisations is a result of these new business models and the benefits afforded by cloud-based ERP. In many cases they are true examples of innovation born out of the possibilities created by new digital technologies. Companies like Uber, Netflix, Amazon and Airbnb epitomise the digital economy having succeeded in delivering digital, service-led business model innovations. 2. The era of self-driving business apps The cloud enables companies to deliver self-driving business applications, tied to machine-learning, predictive analytics and in-memory technology. For example, modern cloud platforms like Microsoft’s Azure offer predictive analytics services such as Machine Learning, Power BI and Stream Analytics that redefine business intelligence. Organisations can make smarter decisions, improve customer service and uncover new business possibilities from their structured, unstructured and streaming Internet-of-Things data. With ERP systems providing the IT engine for enterprise, this capability will enable them to revamp ERP in a dramatic way, designed for people and not products. For example Public sector organisations will improve fraud detection on bill payments with advanced pattern recognition and machine learning, while not-for-profits have the ability to match campaigns to donation patterns and target donors more effectively. [easy-tweet tweet="We accept that cars will drive themselves in the near future; software is taking over the driving functions by collecting data "] We accept that cars will drive themselves in the near future; software is taking over the driving functions by collecting data from a number of sources and smartly interpreting it to deliver a seamless user experience. The ultimate in customer service will be when your car simply takes you where you want to go in the most efficient way. It will gather data from sources like external sensors, satellite navigation, knowledge of road layouts, speed limits and even congestion, road works and accidents (which will become infrequent incidentally), almost as they happen. This type of automated capability can also be applied to applications, and we’re now entering a new era in enterprise computing with the emergence of “self-driving” applications such as ERP. Analytics is the core to fully automating any process New predictive technologies provide huge opportunity for assisting users and eliminating manual data entry. 
Analytics is the core of fully automating any process, and analysing data is the core of understanding the surroundings, identifying patterns, predicting behaviour and identifying exceptions, as well as screening to make data personal and relevant. This is making the most of the power of machines. Users should be involved only when really needed, that is, when common sense is required. When users interact, business systems must deliver a great experience, similar to the best experiences we have from our favourite personal applications, from any device. This is no small development. A truly self-driving application, made possible in great part by the latest advanced analytics technology, can lead to huge productivity gains for organisations, as well as significant improvements in customer service and engagement. 3. Business computing adds value Typically, traditional ERP platforms provide non-intuitive blank forms for users to complete with the necessary data, which is error-prone, time-consuming and costly. Conversely, self-driving applications leverage the latest in predictive analytics, social media, mobile devices and machine learning tools to provide context to the information. This allows for more intuitive data entry based on pre-populated forms. As a result, companies can provide not only a better experience for workers regardless of the type of device they’re using, but also significantly improved decision making, planning, budgeting and forecasting, while also ensuring that customer service standards and business, financial and operational objectives are met more accurately and reliably. Self-driving ERP is the next great technology step for the enterprise [easy-tweet tweet="Self-driving ERP is the next great technology step for the enterprise" user="comparethecloud" hashtags="ERP"] The combination of cloud and self-driving ERP provides the route to success and a huge competitive differentiator for services organisations across both the public and commercial sectors, improving the depth, accuracy and flow of critical data to both internal and external stakeholders while enhancing user and customer experience. Essentially, it puts the needs of people first. ### It’s Not Just About Data The desire to harness Big Data continues to occupy the minds of most marketers. As more and more connected devices generate even greater volumes of user data, the development of data-driven marketing strategies that leverage Big Data to improve targeting has become an industry obsession. But is this fixation on Big Data actually a bit of a red herring? [easy-tweet tweet="Is this fixation on #BigData actually a bit of a red-herring?" user="wenczka and @comparethecloud"] Teradata’s 2015 Global Data-driven Marketing Survey reveals that 90% of marketers regard personalisation as a current priority, with 78% claiming to use data systematically to target their customers. The apparent Holy Grail is to move beyond segmentation to ‘true one-to-one personalisation in a real-time context.’ But how? The Present Focus The present focus on data mining to identify trends and deepen customer understanding is understandable. The development of the Internet of Things is just the latest catalyst for data capture. With 75% of global data created by consumers, marketers can be forgiven for becoming excited by the number of data points open to them for targeting. But bigger is not necessarily better. More data does not mean more insights, but it does mean more processing.
If marketers have a real-time remit, the likelihood is that the bigger the data set, the slower the output will be. And by the time you get there, the world may have changed. Effective Targeting Effective targeting is not just about the depth of understanding that can be achieved through analysis of all your many data points. It’s about understanding how your audience changes through time – and focusing only on the data that’s relevant to you, your users and your product. [easy-tweet tweet="Marketers need to be able to analyse trends in real time" user="comparethecloud" hashtags="digitalmarketing"] Marketers need to be able to analyse trends in real time. What are your users’ core values? Do they interact with your product in specific geo-locations, use a consistent device or engage at a certain time? By building up a dynamic profile of your users you’ll be able to target them more accurately. The challenge is to establish the core data that actually contributes to their browsing and purchasing preferences – and respond to it. Dynamic Model Ensuring your model has a dynamic element is essential. It’s unlikely that you’ll be targeting only one broad audience – your brand will have sub-demographics that are equally important. And since you’re likely to be targeting customers who frequently change their minds about products, it’s important to strike while the iron is hot and deliver responsive real-time messaging. The most effective means of achieving this is through programmatic advertising and machine learning. This allows brands to identify and engage with dynamic audiences at speed and scale. Machine learning platforms recognise prospects from ads served on the system and automatically identify new and subtle trends as they emerge. The approach accelerates customer profiling, meaning prospects can be identified, tracked and targeted quickly. it’s all too easy to get side-tracked by the sheer volume of available data In the current environment it’s all too easy to get side-tracked by the sheer volume of available data. Big Data opportunities may appear exciting, but the subsequent analytics will only slow you down. By the time you try to apply that demographic data, the chances are your audience will have shifted. If personalisation really is your goal, logistically there’s no other way to deliver relevant real-time targeting than via dynamic targeting systems. [easy-tweet tweet="78% of marketers are claiming to use data systematically to target their customers" user="wenczka"] ### The cloud hanging over the continent I've been thinking recently about customers and customisation. Earlier this month, I was at the Salesforce World Tour in Munich, which I thought looked busier than I'd ever seen it – a sentiment shared by many I talked to – and I wondered why that was. The Cloud Industry Forum's latest research shows that 84% of UK businesses were using at least one cloud-based service as of early 2015, with nearly four in five of those adopting two or more and over half expecting to move their entire IT estate into the cloud at some point. Germany has always lagged behind the UK in cloud adoption, though, in large part due to privacy issues, which to be fair are still a major point of contention for most end-users in the UK. The NSA's surveillance of Angela Merkel won’t have helped either.
Still, if attendance figures are any indicator, not to mention the approval that greeted Salesforce’s announcement that it will open its own data centre in the country, I think things are about to heat up – at least with respect to the professional services industry. "The NSA's surveillance of Angela Merkel won’t have helped..." Consider: The DACH professional services market is Europe’s largest and most mature: margins have become more important than ever. Public, private and hybrid cloud models of IT services delivery have matured greatly in recent years; the cloud is no longer seen as a risky investment. [easy-tweet tweet="The cloud has come of age, and best-of-breed solutions are truly available for the first time for businesses of all stripes."] The most popular server operating system (Windows Server 2003) has just reached the end-of-support stage in its lifecycle: new infrastructure investments must be made soon... SAP has made quite the hash of SAP Business ByDesign, its main cloud-based PSA offering for the mid-size market, and lost a lot of goodwill. Accordingly, opportunities abound for alternatives. But the real clincher has been the move from systems of record to systems of engagement. Cloud applications’ main impact is on improving core operations and front office activities for medium-sized businesses, so the German Mittelstand stands to benefit far more from these systems of engagement (how people work together) than they have done in the past with systems of record (back office systems like finance). [easy-tweet tweet="The UK has seen the cloud adoption rate grow by 75% since 2010, are we about to see similar growth from Europe’s economic powerhouse?"] For example, I used to be emailed a timesheet for approval, emailed back said approval, then opened the timesheet recording system to formally enter that approval (when I found time and if I remembered), then emailed the finance team to let them know (when I found time and if I remembered), etc., etc. And that’s assuming there were no queries to resolve, chasing of deadlines or any other ‘real life’ to deal with. Today, I can use Kimble to approve directly from the original email, and all relevant systems are updated and people automatically informed – even if I do it from my mobile phone. A medium-sized business benefitting from a more modern way of working together! It’s like moulding what you need from the clay, rather than building with the limitations of pre-fabricated Lego bricks. That’s pretty cool, but just the tip of the iceberg, because the common Force.com platform means a shared underlying architecture, and that means applications from different suppliers will sync up nicely: in other words, you can now build your very own IT software infrastructure to exactly your own size and specifications and update it as and when you need. It’s like moulding what you need from the clay, rather than building with the limitations of pre-fabricated Lego bricks. The cloud has come of age, and best-of-breed solutions are truly available for the first time for businesses of all stripes. Customisation for every customer. So the question that's really interesting me right now is, if the UK has seen the cloud adoption rate grow by 75% since 2010, are we about to see similar – or even faster – growth from Europe’s economic powerhouse? ### The Future Car Industry: Driving in the Clouds Futurists have long been predicting that the cars of the 21st century would eventually leave paved roads and take to the air.
That has not yet happened, nor is it likely to become a reality anytime soon. Yet that does not mean the future of the car industry will not see any radical changes. It will. And, in fact, the future of the car industry is in the clouds. [easy-tweet tweet="The future of the car industry could have us driving in the clouds." user="comparethecloud" hashtags="cartech, cloud"] Before explaining how the cloud paradigm and the car industry will work together, it is important to first understand why the dreams of futurists are nowhere near to being realised. First and foremost, the cars themselves just do not exist. No one has been able to overcome the fundamental law of gravity in a package small enough to function as a safe family car. Until that technology exists, air-based car transport will never happen. No one has been able to overcome the fundamental law of gravity in a package small enough to function as a safe family car A second hurdle that needs to be overcome, and one that is closer than ever to being conquered, is the absence of clearly defined 'lanes' in the air to control traffic flow. Advanced GPS and complicated computer algorithms are close to solving this problem. And this leads us to the real future of the car industry. As we seek to make the dreams of futurists a reality, we are also providing a way to connect modern ground-based cars to the cloud for greater safety and efficiency. “Demand for tech equipped cars has always been high. The current must have tech is the integrated nav and rear view camera, or any sort of digital connectivity such as Ford Sync. Cars equipped with either gadgets are sold very quickly. Low MPG’s is another common customer request” Paul Beadling – Internet Sales Manager at www.jenningsforddirect.co.uk “Demand is what drives innovation so by studying what is popular now we can predict what will come next.” Cars in the Cloud The cloud paradigm was originally developed as a means of improving network speed and efficiency. By hosting applications and data in the cloud, for example, large companies can now eliminate the need for much of their computer infrastructure and, with it, the problems inherent in building and maintaining individual networks. The cloud saves money, increases productivity, and reduces the need for expensive hardware, apps, and storage. Nevertheless, can the cloud be utilised effectively by the car industry? Absolutely. A case in point is Ford's new Adaptive Cruise Control technology. As with standard cruise control, it enables the driver to save fuel by allowing onboard computers to maintain speed. The difference is that Adaptive Cruise Control can adjust itself based on other traffic. If the computer's sensors detect slower traffic ahead, software will automatically reduce speed in order to maintain a safe distance. Once the traffic is cleared, the driver-set speed is resumed. [easy-tweet tweet="Cloud technology can lead us to the eventual goal of creating a network of driverless cars. " user="comparethecloud" hashtags="futuretech"] Let us apply this technology to the eventual goal of creating a network of driverless cars. In this sense, Adaptive Cruise Control is just the start. When each car on the road has the same capability and transmits onboard data to the cloud, a complex software system will be able to account for all vehicle traffic in a given area for the purposes of controlling speed and distance between vehicles. 
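To make the control logic concrete, here is a minimal sketch in Python of the decision adaptive cruise control has to make: hold the driver-set speed when the road is clear, slow to match a slower vehicle when the time gap ahead becomes unsafe, and resume once the traffic clears. It is purely illustrative (the safe-gap threshold, helper function and example values are assumptions, not Ford's implementation); a cloud-connected fleet would be making this kind of decision with shared, rather than purely onboard, data.

```python
# Minimal sketch of an adaptive-cruise-control decision. Not production
# vehicle code: the safe time gap and the example values are illustrative
# assumptions, and this is not Ford's implementation.

from typing import Optional

SAFE_TIME_GAP_S = 2.0   # desired gap to the vehicle ahead, in seconds

def target_speed(set_speed_kmh: float,
                 lead_distance_m: Optional[float],
                 lead_speed_kmh: Optional[float]) -> float:
    """Speed the car should aim for, given what the sensors report ahead."""
    if lead_distance_m is None or lead_speed_kmh is None:
        return set_speed_kmh                    # road clear: hold the driver-set speed

    own_speed_ms = set_speed_kmh / 3.6
    if lead_distance_m / own_speed_ms >= SAFE_TIME_GAP_S:
        return set_speed_kmh                    # the gap is already safe at the set speed

    return min(set_speed_kmh, lead_speed_kmh)   # close the gap no faster than the car ahead

print(target_speed(110, None, None))   # clear road        -> 110
print(target_speed(110, 35, 80))       # slow car close by -> 80
print(target_speed(110, 120, 90))      # traffic far ahead -> 110
```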
Fuel Efficiency, Safety, Security The future of the car industry is based more on fuel efficiency, safety, and security rather than finding a way to make cars fly. The cloud is of vital importance to these goals. Consider the following possibilities: Fuel Efficiency – Cloud-based software can be combined with GPS data to figure out the fastest and most fuel-efficient routes based on traffic, weather conditions, and other environmental factors. At the same time, cloud-based software will eventually be able to all but eliminate traffic congestion in major metropolitan areas. [easy-tweet tweet="Cloud-based software can be combined with GPS data to figure out the fastest and most fuel-efficient routes"] Safety – Perhaps nothing is more important than safety for combining the cloud with automotive technology. Everything from Adaptive Cruise Control to lane assist and road sign recognition helps to eliminate the mistakes humans make in order to create a safer road system. Connecting all of this technology with every car on the road will only be accomplished in the cloud. Security – Lastly, the cloud will play an important role in security by giving drivers real-time access to emergency services. People will be safer, their cars will be less likely to be used to commit crimes, and emergency services will be better utilised to respond to both criminal complaints and accidents. While futurists dream of flying cars requiring no paved roads, the car industry is working hard on integrating their vehicles with the cloud. The future of the car industry is indeed in the clouds, just not in the way most of us have been long dreaming about. The future of the car industry is indeed in the clouds, just not in the way most of us have been long dreaming about. ### Maximising the value of the cloud: Four tips for your business When the concept of cloud computing was first introduced in the early 2000s, companies were unsure what to make of this new technology. Most businesses didn’t even fully understand what the cloud was. Over the years, cloud services have evolved from primarily raw infrastructure to comprehensive services that can entirely replace on-premise infrastructure like email servers and phone switches. As the technology continues to mature, you’re probably hearing more and more about the value of the services and solutions that comprise the cloud. Which may be why the research firm Gartner estimates that worldwide spending on public cloud services will reach $210 billion by 2016. But while you probably want to get on board, it’s extremely likely that you’re unsure where to start. Good news: there are four easy tips that will help you transition your business into the cloud. Tip 1 – Do your homework to find the right cloud service provider [easy-tweet tweet="Tip 1 – Do your homework to find the right cloud service provider" user="intermedia_net and @comparethecloud"] Once you know which services you want to move to the cloud—whether it’s your email, your phones, your file server, your web server, your accounting system, or any other software—you’ll want to identify the right cloud provider. While there are many factors to consider during the search process, it’s important to keep these three key components in mind: reliability, integration and control. The consistent reliability of IT services is crucial for any business The consistent reliability of IT services is crucial for any business. 
When your IT is down, your business can incur extremely high costs: employees can’t do their jobs, customers get angry, you lose sales and valuable IT resources are diverted to handle the crisis. Protect your business from these headaches by selecting a provider who offers a 99.999 percent uptime guarantee—which assures less than 30 seconds of unscheduled downtime per month. Second, integration of cloud services can introduce a host of new support, billing and management complexities. Top notch providers can facilitate an integration experience that allows you to easily transfer user and device settings across multiple services without accruing unnecessary IT management costs. Finally, controlling and managing your new cloud services should not create an additional burden on your IT team. In fact, it should make IT management more efficient. Selecting a provider with a simple yet powerful control system will help to avoid unnecessary labour costs while increasing your business’ capabilities. Tip 2 – Audit existing IT systems before making the move [easy-tweet tweet="Tip 2 – Audit existing IT systems before making the move" user="intermedia_net and @comparethecloud"] Many companies make the leap into the cloud without thoroughly evaluating their IT infrastructure. When assessing the move from in-house to a cloud infrastructure, consider the services you already have in place. How can you build around it to maximise efficiencies? For example—don’t jump into the cloud all at once. Instead, slowly integrate just a few services at a time, and allow time for people to get used to the changes. Also, identify which services and applications make the most sense to move to the cloud, and then start the process with the services that are easiest to integrate, such as email, file sync and share, and messaging.  Tip 3 – Don’t forget about security! [easy-tweet tweet=" Tip 3 – Don’t forget about security!" user="intermedia_net and @comparethecloud"] Security breaches can be a costly drain on time—to say nothing of the risk to your business, your customers and your competitive advantage. That’s why you need to look beyond the price tag of a cloud provider. When it comes to protecting your business’ data and resources, features and pricing alone might only reveal a portion of the larger picture you need to make an informed decision. Make sure you demand high standards of security and protection from your cloud provider. take the time to research the security background of the cloud provider you’re considering before signing the contract So take the time to research the security background of the cloud provider you’re considering before signing the contract. Be sure to dive deep into the details of the security offered by the cloud providers and stick with companies that work with the best-known names in the security industry. Many providers use lesser-known security tools that provide only partial or minimal protection, so it’s important to do in-depth research of the security procedures. Tip 4 – Plan for the future: Develop a long-term strategy for cloud success [easy-tweet tweet="Tip 4 – Plan for the future: Develop a long-term strategy for cloud success" user="intermedia_net and @comparethecloud"] Cloud technology must address immediate needs, but a successful cloud strategy should be flexible enough to respond to evolving business needs. Like other business endeavours, creating a strategic plan and roadmap will help you stay on track while working towards accomplishing your goals. 
Part of this process should be planning for and identifying future IT needs. With a deliberate approach to gaining new insights into the technology, matching the right cloud solutions to your business needs, and establishing the right foundation for future growth, you can pave the way to success in the cloud. ### How Cloud Is Helping the Financial Sector to Become More Agile Retail banking, insurance and the capital markets are fast adopting cloud computing in the quest to grow revenues, reduce costs and increase responsiveness to risk. The financial sector is struggling to overcome uncertain economic growth, increasing regulatory pressure and customer cynicism in order to maintain profitability and enhance customer confidence in its products. Against a background of increasingly wary, tech-savvy customers, what is needed now is a higher level of interaction, such as instant access to information online and delivery of products and services through handheld and desktop devices. [easy-tweet tweet=" Retail banking, insurance and the capital markets are fast adopting cloud computing" user="comparethecloud"] Strategic Deployment of Cloud Computing While many financial services companies have been responding to the new challenges by modernising their legacy applications and upgrading their infrastructure, a number of forward-looking institutions are deploying cloud computing strategically with the intention of transforming how the entire organisation operates. They are scanning and selecting from different models of cloud computing that will enable them to successfully meet the challenges and to improve their business agility. Business agility is the competence of an organisation to adapt cost-effectively and swiftly to environmental changes, according to the leading consulting firm McKinsey & Co. When a company is agile, it shows in increased revenue growth, greater cost reduction, and more effective handling of business risks and threats to reputation. Revenue Growth Challenge [easy-tweet tweet="Significant impact has been made on American financial services by the Dodd-Frank Wall Street Reform"] An uncertain economic recovery, a highly competitive environment and an ever-tightening regulatory framework are posing immense challenges to revenue growth. Significant impact has been made on American financial services by the Dodd-Frank Wall Street Reform and Consumer Protection Act. This, together with other imminent legislation, is squeezing the fee income of retail banks, making it imperative for them to quickly devise new products and services that will provide the necessary revenue growth. Companies operating in the capital markets, as well as debt relief companies, are also coming under intense pressure to provide additional services to boost diminishing revenues. The situation is identical in the insurance sector, with companies facing increasing challenges to provide customers with a variety of products and services through an assortment of channels, especially Internet- and app-based ones. Companies that are agile are able to quickly devise appropriate products and services, activate new delivery channels, improve the quality of customer engagement, boost revenues and swiftly adapt to the changing competitive and regulatory environment.
Responsiveness to Reputation and Risk Threats [easy-tweet tweet="Businesses that are agile can react to changing environmental conditions and face new financial risks"] The financial sector’s recent crisis has ignited a great deal of distrust and ill will among customers and provoked governments to conduct detailed industry reviews. This has led to increasing regulatory stringency which, together with extensive media coverage of data breaches, has fuelled customer concern about data confidentiality and the security of their financial transactions. Customers have also woken up to the need to monitor their accounts more closely. Retail banks, as well as insurance companies and capital market companies, need to respond quickly to changing regulations and customer concerns by improving data security and making information available on a variety of channels, including mobile devices. Organisations must also quickly generate the reports required by the regulatory authorities and comply with changing accounting standards, which require them to process larger volumes of data and produce reports of increasing complexity. Businesses that are agile can react to changing environmental conditions and face new financial risks as well as threats to their reputation. They can not only anticipate the imposition of new regulations but adapt quickly to changed circumstances. Business agility also proves to be vital for reacting to data security threats or market shocks. Cost Reduction [easy-tweet tweet="One of the biggest differences that cloud has made to the financial sector is reduction of operating costs."] One of the biggest differences that the adoption of cloud computing has made to the financial sector is the reduction of operating costs. The ability to discard old technology and adapt quickly to changing competitive and regulatory environments is enabling companies to dramatically reduce the cost of delivering products and services, as well as the cost of regulatory compliance. When customer expectations are on the rise against a background of severe market churn, there is nothing better than cloud computing to improve service delivery while keeping services efficient and cost-effective. Companies are also better placed to meet customer demand for information and transactions on mobile and other handheld devices. ### Head in the sand – or ahead in the Cloud? Cloud computing, forecast to become a $150 billion industry by the end of 2015, is not only growing, it is transforming almost every industry. And it could transform TV and broadcast media too. Although the cloud is increasingly being used to help companies distribute media to content delivery networks and social media, it’s not yet being fully optimised to improve the expensive and sluggish first mile of content creation. So why is it that so many senior executives still have their head in the sand – when they could get ahead in the cloud? Aframe’s David Peto examines the footage. The proliferation of digital, mobile and social media has spawned a new and burgeoning competitor in the war for viewing figures The challenges for broadcast media are familiar to everyone. The proliferation of digital, mobile and social media has spawned a new and burgeoning competitor in the war for viewing figures: the consumer. In the age of the smartphone, everyone is a cameraman. And in the era of YouTube and Twitter, everyone has an outlet.
Worse, in a market led by an insatiable consumer thirst for instant gratification, traditional media companies risk being left behind. [easy-tweet tweet="The sluggish first mile of content creation can be helped along by #cloudcomputing" user="comparethecloud"] To satisfy the appetite, broadcast media conglomerates are bolstering human resources to meet consumer demand, but they remain hamstrung by dated processes that slow down the time it takes to get footage from camera to screen. To conquer the problem, executives need to eradicate the obvious inefficiencies that are hindering progress – and many of these are right at the beginning of the production cycle. First mile v last mile It would be disingenuous to suggest that media conglomerates are not currently embracing the Cloud. They are. But many have simply focused on using it for distribution at the end of the process. The ‘first mile’, however – that cost centre and home to crucial workflows for getting content into the system and working with it before it is broadcast at the other end – remains sluggish and inefficient. Despite disparate video journalists being poised in all manner of remote places, the race to get content to air as quickly as possible is still compromised. And, to compound the problem, broadcasters face the ever-growing issue of multiple formats. Footage may – eventually – reach its chosen destination, but if it’s in the wrong format, the process is again significantly slowed down. [easy-tweet tweet="It’s important to remember that all clouds are not equal." user="davidpeto and @comparethecloud" hashtags="cloud"] There is a widespread need to democratise the chaos of modern production. But with standardised formatting unlikely to happen any time soon, broadcasters need to find a solution to the problem. Improving the speed and efficiency of the ’first mile’ content creation process and delivery can help media conglomerates reach the last mile of distribution much more quickly. This brings the major benefits of cost reduction, improved competitiveness and increased profitability. And the solution is already out there. The time is now for media conglomerates to address the content creation piece of the broadcast workflow and its classic inefficiencies, for which cloud computing – particularly its multi-tenancy and its ability to support many discrete users – is a real opportunity. A truly end-to-end solution can significantly reduce the time and cost of getting video from camera to air. But it’s important to remember that all clouds are not equal. Media conglomerates need to look for a partner that not only understands the specifics of broadcast media, but whose cloud solution has the speed and stability to support multi-use, reliably and securely. Specialist cloud video production solutions can help overcome ‘first mile’ costs and challenges Specialist cloud video production solutions can help overcome ‘first mile’ costs and challenges. They can save companies a lot of money up front, and help them become more profitable by winning the eyeballs they so desperately need. So senior decision-makers at today’s top media conglomerates face a clear choice: do they keep their head in the sand, or do they get ahead in the Cloud? ### Cloud technology is no longer an IT issue but a departmental decision For businesses today, the decision to deploy cloud services no longer sits solely within the IT department.
Deep technical knowledge is not essential to get the most from the cloud, and more often than not demand is coming from the business functions themselves, who are realising the benefits that cloud-based services could bring to their day-to-day operations. A recent report from Kronos surveyed 1,000 office workers in organisations with between 50 and 500 employees across the UK, Netherlands and Belgium. The research highlighted that those working at small and medium-sized businesses are aware of the benefits of cloud computing and are increasingly demanding it as an enabler of enterprise mobility and flexibility. The key statistic from the report was that 64 per cent of employees believed cloud applications were important in supporting their day-to-day work, while 83 per cent preferred cloud applications to those deployed on premise. It went on to find that, on average, workers use six different cloud applications per month, which shows the popularity of this technology is continuing to rise among SMEs. [easy-tweet tweet="64 per cent of employees believe cloud applications are important for day-to-day work" user="comparethecloud"] With this in mind, businesses need to consider the importance of providing a cloud solution, particularly for data-hungry departments such as marketing and finance. Here the cloud provides real business value, not just IT efficiencies. Flexibility for the finance department A report prepared by CFO Research in collaboration with Google highlighted that 64 per cent of finance executives said that implementing a cloud solution had reduced operational costs by up to 20 per cent within the company, with a further 15 per cent anticipating that in future the reduction could far exceed this. As well as the cost saving, the survey highlighted that the finance department also understood the companywide agility and productivity the cloud could provide. 36 per cent of the respondents said their company’s move to the cloud was motivated by a need for greater flexibility, and 35 per cent by a need for improved productivity. Cloud technology can also positively impact employee performance, as staff will have access to state-of-the-art cloud applications, with the report showing 81 per cent of respondents believed a complete implementation of cloud-based systems would improve employee productivity as well as contribute to high-value activities such as corporate strategy setting (68 per cent). Marketing moves to digital Marketing departments can also benefit hugely from cloud solutions. The technology helps to deliver timely campaigns and allows the department to communicate with others quickly and efficiently. According to Fourth Source, the biggest change within the marketing landscape is the transition from an en masse broadcast call-to-action approach to a media-rich, real-time custom messaging strategy. Within this department, the aim now is to use specific information to communicate more effectively with customers anytime, anywhere. it is vital for companies to rely on the fastest computing performance available With the increased use of video advertising, mobile apps and the development of the latest mobile technologies, cloud solutions now allow marketers to take advantage of these trends, as the amount of data being generated can be carefully scrutinised and analysed to deliver the most engaging online campaigns. The approach for the marketing department is often focussed on speed and precision rather than creativity.
Therefore, it is vital for companies to rely on the fastest computing performance available, to deliver timely communications and scale up or down as capacity is needed. Data explosion With businesses witnessing a data explosion, it’s not just large-scale enterprises that need to ensure their data storage and management procedures are up to scratch. From creative departments through to marketing and finance, reliance on customer information and digital data is becoming more crucial than ever before. With cloud services providing the perfect solution for those SMEs lacking the necessary infrastructure in-house, a private cloud option can provide more control and confidence in the safety of data stored in the cloud, without the big-business price tag. ### Three Myths about Cloud and Security Businesses regularly ask whether transitioning to the cloud is secure. The question is understandable, as there has been a lot of fear, uncertainty and doubt (FUD) spread across the industry over the past few years. Gartner thinks that cloud computing, by its very nature, is uniquely vulnerable to the risks of myths. much of the negative chatter about cloud security is unfounded However, much of the negative chatter about cloud security is unfounded, and is simply an extension of what is already being dealt with across the physical infrastructure. So let’s take a look at some common statements against cloud, and why they are better classified as myths than truths. Myth #1: Cloud is riskier than traditional IT [easy-tweet tweet="Myth #1: Cloud is riskier than traditional IT" user="digitalrealty"] Your infrastructure isn’t necessarily more secure just because it is located on-premise. Data breaches, data loss, account hijacking, and denial of service attacks have been concerns since the invention of the mainframe computer back in the 1960s. And while it is true that improperly configured cloud resources can result in these types of vulnerabilities, the same can be said of improperly configured physical infrastructure. The key to a secure cloud implementation is working with a competent, experienced provider who makes investing in the strongest forms of networking security, intrusion detection and monitoring services core to their business. In many cases, the security controls an experienced provider can handle may go well above and beyond your in-house IT team’s capabilities. Myth #2: You can’t control where your data resides in the cloud [easy-tweet tweet="Myth #2: You can’t control where your data resides in the cloud" user="digitalrealty"] Heavily regulated industries such as healthcare and finance have controls that dictate where personally identifiable information and protected health information (PHI) can reside. For these types of organisations, it is sometimes (incorrectly) assumed that cloud isn’t a viable option because the location of data cannot be controlled. This “cloud myth” is easily discredited by understanding that organisations can choose to work with a provider that operates out of specific data centres in specific geographic regions. Highly regulated organisations can also leverage a private cloud deployment model that provides them with increased control and governance versus the public cloud.
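In practice, controlling where data resides is usually an explicit setting rather than something left to chance. As a minimal sketch (using the AWS S3 API via boto3 purely as a familiar example; the bucket name is hypothetical, and other providers and private cloud platforms expose equivalent region controls), a storage bucket can be pinned to a chosen geography at creation time:

```python
# Minimal sketch: pinning stored data to a chosen geography at creation time.
# boto3/S3 is used purely as a familiar example; the bucket name is
# hypothetical, and other providers offer equivalent region controls.

import boto3

s3 = boto3.client("s3", region_name="eu-west-2")  # London region

s3.create_bucket(
    Bucket="example-regulated-records",           # hypothetical bucket name
    CreateBucketConfiguration={"LocationConstraint": "eu-west-2"},
)

# Objects written to this bucket are stored in the chosen region, which can be
# evidenced to auditors alongside the provider's own compliance documentation.
```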
Myth #3: The cloud is not suitable for compliant workloads [easy-tweet tweet="Myth #3: The cloud is not suitable for compliant workloads" user="digitalrealty"] As I mentioned above, organisations with compliant workloads can still leverage the cloud; however, a private or hybrid cloud deployment model may be better suited to their needs. A private cloud allows organisations to enjoy the flexibility and scalability benefits of the cloud, while still keeping data secure and in their control. A hybrid model provides the added benefit of being able to “burst” to a public cloud for non-compliant workloads such as development and testing. ### Break down regulation barriers to cloud adoption There is no doubt that cloud computing has now achieved mainstream deployment in the UK. Recent research from the Cloud Industry Forum (CIF) found that some 78 per cent of UK organisations are adopting at least one cloud-based service, an increase of 15 per cent over previous figures. More telling is that turning to the cloud is now not just the preserve of large blue-chip organisations, with 75 per cent of SMEs also embracing cloud technology. With cloud technology continually evolving, it has now become a mainstream solution for businesses and an integral part of an organisation’s overall IT strategy. According to Gartner, cloud computing has been highlighted as one of the top strategic technology trends in 2015 that organisations cannot afford to avoid. Across the wider business landscape, web hosting, email, CRM, data back-up and disaster recovery continue to be the most pervasive cloud services used. However, organisations within heavily regulated industries such as financial services, healthcare or legal services have thus far shied away from cloud technology, unsure of the right strategy and afraid of the potential security risks. The Cloud Security Alliance recently found that although take-up is increasing within financial services, with private cloud the most popular for those testing the waters, security is still their main concern. Times are changing. A report undertaken by Ovum this month revealed that 54% of IT decision makers globally say they now store sensitive data in the cloud. The cloud has a distinct benefit for smaller institutions in heavily regulated industries. They can take advantage of the skills and better security that cloud providers such as Cube52 offer, rather than having to invest in their own staff, software and hardware. The money saved can then be used for better education of staff and to ensure that security is regularly tested and fit for purpose. One of the main regulatory requirements that has historically dissuaded heavily regulated industries from moving away from their legacy on-premise solutions is the need for sensitive data (whether customer or financial information) not to cross geographical boundaries. The issue of location – data sovereignty – is currently top of mind for many because the EU Data Protection Directive adopted in 1995 is set to be replaced with new legislation, known as the EU General Data Protection Regulation, some time this year. What is important to remember is that whilst the cloud exists in the ether, that ether will ultimately always be located in a physical location, so it can be managed accordingly.
Organisations should choose a vendor that can guarantee the location of its datacentre, with proximity being a key factor in this decision. But, whilst cloud location should no longer be a barrier, consideration should be given to whether a public, private or hybrid setup is the right one.  Public clouds are based on shared physical hardware which is owned and operated by third-party providers. The primary benefits of the public cloud are the speed with which you can deploy IT resources, and the fact it is often the cheapest option as costs are spread across a number of users. However, the security of data held within a multi-tenanted public cloud environment is often a cause for concern in heavily regulated industries. Private cloud is a bespoke infrastructure dedicated purely to your business. The private cloud delivers all the agility, scalability and efficiency benefits of the public cloud, but with greater levels of control and security. This makes it preferable for industries with strict data, regulation and governance obligations. Another key benefit of private cloud is the ability to fully customise the infrastructure components to best suit your specific IT requirements, something that cannot be achieved so easily in the public cloud environment.  Hybrid cloud is a more recent addition and allows a business to combine public cloud with private cloud or dedicated hosting. This way, a business can benefit from the advantages of each within a bespoke solution. For example, a business could use the public cloud for non-sensitive operations, the private cloud for business critical operations and incorporate any existing dedicated resources to achieve a highly flexible, highly agile and highly cost effective solution. [easy-tweet tweet="The rationale for moving to cloud is no different for businesses in heavily regulated industries"] Overall, the rationale for moving to cloud is no different for businesses in heavily regulated industries than those that aren’t: flexible infrastructure, faster provisioning and time to market, low capital expenditure, and relief from skills shortages in their own IT departments. Security must remain an important consideration, but with flexible, resilient and secure solutions available there is no reason why all industries can’t embrace an aspect of cloud technology today and reap the benefits. ### The ‘Intergalactic’ Infrastructure of the Future The Evolution of Cloud Today, most companies are using some form of cloud services. According to Gartner, the Worldwide Public Cloud Services Market is now worth $131 billion: when you consider that ten years ago the only clouds people had heard of were the ones in the sky, this is pretty remarkable growth. So why has cloud adoption enjoyed such phenomenal success? And is it really such a new concept? It could be argued that the idea of cloud was actually introduced as early as the 1960s by J.C.R. Licklider, who voiced his idea of an ‘intergalactic computer network’. Licklider’s idea was that everyone on the globe would eventually be interconnected, accessing applications and data at any site, from anywhere. Today, we can see that we are moving ever-closer to Licklider’s intergalactic future, with the cloud acting as the primary delivery mechanism.
The ‘cloud’ has become something of a catch-all phrase for anything that can be delivered via the internet, whether it is infrastructure, data, applications, or a platform. However, at the fundamental root of all IT innovation is the compute power that drives and supports it - so to narrow the scope, I have focused on the evolution of infrastructure, rather than Software-as-a-Service and Platform-as-a-Service. [easy-tweet tweet="We are moving ever-closer to Licklider’s intergalactic future, with the cloud acting as the primary delivery mechanism."] The Iron Age  To understand how we have come to the version of cloud we have today, it is worth having a look back to life before ‘cloud’ and how the infrastructure environment has developed over the years. It could be argued that the mainframe represents the first iteration of cloud as we know it today. Widely acknowledged in the 1950s as the ‘future of computing’, large-scale mainframes, colloquially referred to as "big iron", provided a central infrastructure shared by various applications and IT services. Like the cloud, businesses could scale resources up and down, depending on their needs. Aside from maintenance and support, mainframe costs were attributed according to Million Instructions Per Second (MIPS) consumption; the more it was used, the more MIPS were consumed, and the higher the cost. While revolutionary at the time, and still in use to this day, mainframes also have limitations. Mainframes require massive up-front investment in physical servers that depreciate rapidly in value, and are expensive to run and maintain. Companies are also limited by the amount of server capacity they have on-site, which means they can struggle to scale capacity according to need. [easy-tweet tweet="The introduction of personal computers in the 1970s changed everything." user="comparethecloud and @elastichosts"] Yet one of the main reasons that people started to move workloads away from the mainframe and onto client/server systems was actually one of the reasons people are today moving away from client/server systems and into the cloud: decentralisation. As mentioned above, mainframes act as a central resource, meaning in the early days they had to be connected directly to computer terminals in order to operate. While this was not a problem when companies only had a handful of computers, the introduction of personal computers in the 1970s changed everything. Throughout the late 1980s and early 1990s the distributed client/server model became extremely popular, as applications were migrated from mainframes with input/output terminals to networks of desktop computers. This offered newfound convenience and flexibility for businesses, but also added layers of complication in terms of managing this new distributed environment. The World Wide Web By the mid-1990s the internet revolution was having a massive impact on culture and the way we consumed technology, and was also moving us closer to the cloud that we know and love today. While the distributed on-premise model that had emerged in the 80s had offered huge cost and productivity gains, as IT became more integral to business operations the demand for compute power increased alongside it. This created a new set of problems, as companies had to find money for new servers and space to put them, leading to datacentres and adding further layers of complexity to infrastructure management.
Not only this, but there was a lot of waste due to the variable need for capacity; companies had to pay up front for servers to support their peak levels of capacity, even though this level was only required infrequently. This made capacity planning a mammoth task, and often meant companies had to trade off performance at times of peak traffic. Hosting companies emerged to fill this gap, promising to manage businesses’ infrastructure for a fixed monthly fee. Hosting opened the door to what we see as cloud today, by helping businesses to get rid of their physical servers and the headaches associated with running them. Yet while hosted services had a lot of benefits, businesses increasingly began to feel locked in to rigid contracts paying for capacity and services that they were not using, and began to crave more flexibility. Then in the early 2000s came virtualisation, which allowed businesses to run different workloads on different virtual machines (VMs). By virtualising servers, businesses could spin up new servers without having to take out more datacentre space, helping to address a lot of the issues faced in the on-premise world. However, these machines still needed to be manually managed, and still required a physical server to provide the compute power needed to run them. VMs, coupled with the internet, enabled a new generation of infrastructure cloud services that we see today. Cloud providers could run multiple workloads from remote locations, helping companies to deploy resources as and when needed, without contracts or large up front investments. By giving businesses access to a network of remote servers hosted on the internet, companies could start to store and manage their data remotely, rather than using a local server. Users could simply sign up to the service, pick an instance size of server, and away they went. Most importantly, they could make changes according to demand, with the option to stop the service whenever they wanted. Breaking bad habits Not a lot has changed over the past ten years and this model largely reflects the cloud computing we see today. While users today have greater choice over the instance size of VM they wish to deploy, they still pay for the service based on the level of capacity they provision, whether they use it or not. Unless businesses are prepared to deploy expensive and complicated technology to automatically scale capacity according to usage, the only way to avoid overspending is to have a member of staff manually adjust it, a resource-intensive and time-consuming solution. As a result, most companies just set a level which should cover their needs, so that some of the time they are over-provisioned, and some of the time they have to sacrifice peak performance as they are under-provisioned; a far from ideal solution. This trend is evidenced in recent research showing that 90% of businesses see over-provisioning as a necessary evil in order to protect performance and ensure they can handle sudden spikes in demand.  [easy-tweet tweet="Recent changes in the #Linux kernel have enabled a new generation of scalable containers"] This suggests that users are not enjoying the full benefits of the flexibility cloud can provide; instead, they are just picking up their old infrastructure bad habits and moving them into the cloud.
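To put rough numbers on that over-provisioning gap, here is a back-of-the-envelope sketch; the hourly price and the usage profile are invented purely for illustration.

```python
# Back-of-the-envelope sketch with invented numbers, illustrating the gap
# between paying for provisioned capacity and paying only for actual usage.
hourly_usage = [2, 2, 3, 2, 8, 2, 2, 2]   # vCPUs actually needed in each hour
provisioned = max(hourly_usage)            # capacity sized to cover the peak
price_per_vcpu_hour = 0.05                 # hypothetical rate

capacity_billed = provisioned * len(hourly_usage) * price_per_vcpu_hour
usage_billed = sum(hourly_usage) * price_per_vcpu_hour

print(f"Capacity-based bill: ${capacity_billed:.2f}")  # $3.20 for the 8 hours
print(f"Usage-based bill:    ${usage_billed:.2f}")     # $1.15 for the same work
```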
However, the introduction of containers could be the answer to these problems. Recent changes in the Linux kernel have enabled a new generation of scalable containers that could make the old Virtual Machine server approach redundant. We have seen the likes of Docker making waves in the PaaS market with its container solution, and now such companies are starting to make an impact in the infrastructure world as well. These containers are enabling cloud infrastructure providers to offer dynamically scalable servers that can be billed on actual usage, rather than the capacity that is provisioned, helping to eliminate issues around over-provisioning. Using Linux containers, businesses no longer have to manually provision capacity. Servers scale up and down automatically, meaning that they are only billed for exactly what they use – like you would be for any other utility. Not only is this cost efficient, but it also takes the mind-boggling complexity out of managing infrastructure; businesses can now spin up a server and let it run with absolutely no need for management. The Intergalactic Future Is Here!  It looks like the revolutionary ‘intergalactic computer network’ that J.C.R. Licklider predicted all those years ago is finally set to become a reality. And it is funny how things come full-circle, as people start to move back to a centralised model, similar to that provided by the early days of the mainframe. As our dependence on cloud in all forms increases, the big question is where next? I believe that just as companies naturally gravitated towards the cloud, leaving hosting companies out in the cold, the same will happen with capacity-based vs. usage-based billing; logic dictates that containers will win out in the end. [easy-tweet tweet="J.C.R Licklider's predicted intergalactic computer network is becoming a reality" user="elastichosts"] ### The New Cloud Darling Name a multi-billion dollar market that’s getting a long-overdue cloud makeover: Field Service Management. This $18 billion industry has only just begun to dip its toe into the huge pool of possibility thanks to software-as-a-service, the tablet and smartphone revolution, and senior management’s recognition that service is indeed strategic. Add to this the commercial insight into addressable revenue-generating opportunities and the resulting top line growth, and you’ve got a new cloud darling. Service businesses represent around seventy per cent of the world’s economy, yet to date, only about a third of the world’s large service businesses currently use field service management solutions. It’s a market poised for growth, and it’s applicable to all vertical service industries and businesses of any size. [easy-tweet tweet="Field service is an industry that was comparatively slow to join the information economy" user="comparethecloud"] Over the past forty years, service hasn’t really changed that much. There’s a lot of paper, a lot of disjointed, manual processes, and no meaningful connection between the field service technician and the context of a customer’s business challenges. You can’t actually track data on paper, particularly in an economy that depends upon the swift, accurate transmission of information. Field service is an industry that was comparatively slow to join the information economy and typically does not move at the same pace as the rest of the business. Until now, the efficiency, productivity and disruption of the cloud had passed the field service industry by.
But that’s all changing. Empowering service technicians with cloud-based, real-time tools in the field means they can do work orders, request parts, schedule and be scheduled, look up manuals, take payments, renew maintenance agreements, use social channels to communicate problems swiftly and effectively, and upsell and cross-sell products and solutions where appropriate. All of this is done on a smartphone or a tablet. All the data is real-time and technicians have the ability to work offline, saving time-consuming administration at the end of a job. And customer relationship management systems pick up the information and ensure that the customer receives future communications, advice, updates and education. And of course, all of that data is delivering valuable new insights about your business and customers.  [easy-tweet tweet="There are more than twenty million field technicians globally. That's a big #cloud market! "] There are more than twenty million field technicians globally. Most of them are not working this way…yet. But CEOs are increasingly waking up to the ‘sleeping giant’ that is their service department. On average, service margins are nearly eleven percent higher than equipment margins, and in a recovering economy with delayed capital equipment purchases, that counts for a lot. Better yet, with the right tools in place, service centres can transform from being a cost overhead to a profit centre, delivering leads, taking orders and providing customer intelligence. Field service isn’t exactly known for being a sexy or even innovative industry, but there are some major developments on the horizon that could soon change its image. The Industrial Internet is just one example. There are currently around 25 billion devices connected to the internet. By 2020, this figure is expected to rise to 50 billion. Our civilisation now depends on machines working. It’s already commonplace for manufacturers to track machine performance of high end equipment with technicians determining whether a problem can be resolved remotely, reducing the costs of unnecessary service visits. However, in the future, machine-to-machine learning will mean that equipment will communicate with technicians about specific faulty parts, with the machine essentially telling the service department when it is ‘sick’. Likewise, in another ten years, it may be commonplace for field techs to have 3D printers in their vans to simply print the parts they require. Gamification techniques may also be used to reward service techs for meeting an SLA, cross-selling, or supporting another engineer with a phone fix. Granted, this is not how conventional field service operates today, but I think these sorts of advances are just a matter of time. Joining the cloud revolution is just the first step, and as a result, we will see incredible changes in both the short term and longer term. Cloud and mobility adoption are creating a new breed of field service management solutions that are radically redefining what ‘business as usual’ means for the field service industry.  ### Three Common Cloud Myths That Scare Financial Firms Away In response to consumers’ demand for instant, always-on service, UK financial firms are showing a growing interest in cloud services.
Alan Grogan, chief analytics officer of the customer services group at RBS, even suggested developing a cloud service specifically for financial services organisations, despite the financial industry being criticised for lagging behind other industries in technology adoption. [easy-tweet tweet="Financial firms can challenge what they think they know about cloud technology" user="comparethecloud" hashtags="fintech"] Financial firms need the capability to restore highly sensitive information in a secure infrastructure and recover key systems and data within predefined recovery time objectives — a need cloud technology can fulfil. But while most financial organisations have adopted some form of cloud strategy, a fraction of firms have steered clear of the cloud. According to the Cloud Security Alliance 2015 Cloud Survey Report, as many as 19 per cent of companies whose customers typically do not interact with banks electronically have a strict no-cloud policy, and 82 per cent of mid-sized organisations have no cloud strategy in place. What exactly is keeping financial firms from integrating cloud-based solutions into their businesses? The following myths could be to blame.  MYTH: The cloud is insecure Security concerns are one of the top reasons financial organisations are hesitant to adopt cloud-based solutions for their businesses. In the Cloud Security Alliance survey, 60 per cent of respondents ranked data confidentiality as their highest cloud adoption security concern. Businesses view hacker attacks, data loss and identity theft as real threats, and it doesn’t help that the UK has been named Europe’s worst country for data breaches. [easy-tweet tweet="60 per cent of CSA survey respondents ranked data confidentiality as their highest cloud adoption security concern."] To ascertain whether the cloud infrastructure is secure enough to handle highly sensitive information, it is crucial to first understand the differences between private and public clouds. Whereas public cloud models store data in a multitenant environment in which customers share the same infrastructure, the private cloud allows businesses to back up their data and entire system configurations in an environment restricted to a specific community. The perception of insecurity is typically associated with the public cloud model. To mitigate the risk of security breaches in the public cloud, financial firms can use a hosted private cloud (e.g. BlackCloud) that maintains strict security controls such as data encryption in transit and at rest. For additional control, firms can store critical data in an on-site managed platform (e.g. BlackVault) that integrates with the cloud. MYTH: It’s the IT department’s decision to adopt cloud technology It’s a common misconception that it’s solely the IT department’s responsibility to make the decision to adopt cloud technology. However, research from Harvard Business Review Analytic Services shows that cloud leaders, or companies that have generated new business opportunities through a mature approach to the cloud, don’t leave cloud decisions entirely up to IT. While cloud leaders are likely to have a CIO leading cloud initiatives, leaders from other parts of the business partner with IT in selecting and deploying services. In the same way, for a cloud implementation to be successful in a financial organisation, executive management needs to be involved in the decision to introduce cloud backup and recovery to the workplace.
Not only will the IT and executive management teams be able to work together to identify a solution that benefits the entire organisation, but employees are also more likely to receive thorough training on using the cloud technology if the initiative is supported by senior management as well as the IT department.  MYTH: Transitioning to the cloud is too complex Some financial firms believe transitioning their IT infrastructure to a cloud environment will be too time-consuming and could lead to downtime or data loss during the transition. There is some truth to this myth. For example, if a small firm’s limited IT team attempts the transition while managing its day-to-day tasks, the project can become more than the team can handle. But implementing a cloud service doesn’t have to be cumbersome and unmanageable. If complexity is a concern, an IT managed services provider can help with project planning, implementation, maintenance and testing to ease the burden on internal IT staff. Additionally, Harvard Business Review Analytic Services points out that cloud leaders with a cohesive cloud strategy that’s supported across the business have been able to reduce project implementation time and complexity.   With a better understanding of the myths associated with cloud-based solutions, financial firms can challenge what they think they know about cloud technology and realise that the cloud can become an integral part of their ability to meet their customers’ expectations. ### Moving to the Cloud? Retailers Should Pack Carefully For many businesses, the key decision isn’t whether or not to migrate to the cloud; it’s deciding which parts to take and which parts to leave behind. Cloud computing is an undeniably alluring option for companies. According to a recent study, 61 percent of retailers said that cloud technology has led to faster responses to shifts in markets and changing customer needs. The cloud affords businesses the flexibility to reduce or expand capacity based on their needs at any given moment. Why buy more servers just to accommodate a short-term traffic spike on Cyber Monday? The economical option is to lease cloud-based capacity on an as-needed basis. But embracing the cloud doesn’t have to be an all-or-nothing proposition for retailers. In fact, the most cost-efficient strategy would likely include a mix of cloud-based applications as well as dedicated servers. Moreover, different applications with varying usage characteristics can be combined to share cloud capacity – furthering cost-saving benefits. Consumers' growing desire for digital interaction is spurring retailers to innovate, with many looking to cloud-based development platforms like IBM Bluemix that allow them to assemble new solutions fast, then experiment and refine.  This approach enables even the largest enterprises to compete with nimble and innovative startups.  For example, a retailer could use this approach to quickly build a consumer mobile app or an analytics application that receives automated data from scores of Internet of Things (IoT) devices used in a retail store to monitor everything from the temperature of the freezers to motion detectors. Both applications require "born-on-the-cloud" style development, which is characterised by agile development and a mashup of multiple forms of data from sources including weather and social media.
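As a rough illustration of that kind of IoT-to-cloud feed, here is a minimal sketch of a store device reporting a freezer temperature reading. The endpoint URL, store identifier and payload shape are hypothetical; a real solution would use the ingestion API of whichever cloud platform the retailer had chosen.

```python
# Hypothetical sketch of an in-store IoT device posting a reading to a cloud
# analytics endpoint. The URL and payload fields are invented for illustration.
import json
import time
import urllib.request
from urllib.error import URLError

INGEST_URL = "https://retail-analytics.example.com/v1/readings"  # placeholder

reading = {
    "store_id": "store-042",
    "device": "freezer-07",
    "metric": "temperature_c",
    "value": -18.4,
    "timestamp": int(time.time()),
}

request = urllib.request.Request(
    INGEST_URL,
    data=json.dumps(reading).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

try:
    with urllib.request.urlopen(request, timeout=5) as response:
        print("Accepted:", response.status)
except URLError as exc:
    # A real device would queue the reading and retry rather than drop it.
    print("Could not reach analytics endpoint:", exc)
```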
[easy-tweet tweet="Sales applications that are hosted in the cloud can be modified on-the-fly" user="comparethecloud" hashtags="retailcloud"] Certain functions - like e-commerce - are also ideal for the cloud. Sales data from e-commerce is relevant to any number of business functions including marketing, customer service, merchandising and so on. If sales data is stored on the cloud, it can be more efficiently analysed along with other external, unstructured data such as demographics, consumer spending, weather and more. From this data, retailers can gain new insights that can be shared, sorted, filtered and accessed by many groups of people in almost any corner of the globe. An in-store sales associate, for example, could see the last few purchases or even searches a consumer made online and recommend items to the consumer based on that information. Some retailers are also seizing the opportunity to offer cloud based "digital receipts" and other value-added functions as more data moves to the cloud. Another benefit of cloud-based e-commerce systems, they evolve with the business. Another benefit of cloud-based e-commerce systems, they evolve with the business. Because sales platforms face the customer, they change frequently based on what the customer wants and what generates the highest sales for the company. Sales applications that are hosted in the cloud can be modified on-the-fly, sometimes in a matter of minutes or hours, versus months or years, which had been (or is) the case with legacy systems. E-Commerce systems are constantly being refreshed, tweaked and enhanced by retailers, as many companies are looking at cloud as a platform to deliver a 'continuous delivery' model where a constant stream of small, nimble features can be deployed continuously on to their digital platforms. With its excellent automation and provisioning capabilities, cloud technology can keep up with this constant state of flux very efficiently. For example, Spanish retail giant El Corte Inglés (ECI) selected a cloud-based Commerce solution from IBM to quickly extend its online presence while addressing the distinct customer needs of consumers in different regions. By deploying a hybrid cloud and on-prem solution, ECI can deliver customised in-country offers, promotions and pricing optimisation in real-time. The solution manages their customer orders across the entire region while supporting the many translation, payment and regulatory requirements unique to each country.   [easy-tweet tweet="Retailers can also benefit from cloud’s ability to manage flux efficiently" user="comparethecloud"] Retailers can also benefit from cloud’s ability to manage flux efficiently for scenarios where many software changes are needed – such as merchandise and supply-chain transformations. A growing number of retailers are seeing benefits of using cloud for development and testing to support these large business transformations. legacy monolithic applications are not suited for cloud as making them "cloud ready" would be an expensive affair Conversely, legacy monolithic applications are not suited for cloud as making them "cloud ready" would be an expensive affair. Applications that consume lots of network bandwidth or are dependent on a large amount of data from several applications need to be reviewed to ensure that moving to cloud does not simply trade one set of costs for another. Applications that deal with highly sensitive or audited and regulated data are also typically not suitable for cloud due to liability concerns.  
Instead, these monolithic applications can leverage hybrid cloud models to realise some of the cost-efficiencies of cloud-based automation and virtualisation, for example, without having to be completely re-architected when it does not make business sense. The cloud gets a lot of hype for good reason. We are only limited by our imagination in what we do with it. But just because we can doesn’t mean we should. The best laid plans for the cloud are those that are discriminating and strategic. [easy-tweet tweet="The best laid plans for the cloud are those that are discriminating and strategic."] ### Focus on your business needs, not the deployment model There is a lot of talk in IT circles about cloud models, but I think the conversations should be less about the deployment model and more focused on what is best for the business. There are a few misconceptions about private cloud out there that have been perpetuated by early accounts of public cloud domination, and they are skewing the conversation.  The hype of the public cloud is gone and, especially in light of the NSA/Edward Snowden fiasco, companies are asking themselves how it will actually deliver value. What companies are now learning is that private cloud solutions are giving enterprises the usability and flexibility of public cloud, while providing complete control and ownership. This is especially important for enterprises that have already invested in massive data centres and infrastructure. [easy-tweet tweet="The hype of the public cloud is gone and especially, in light of the NSA/Edward Snowden fiasco" user="accellion"] It’s crucial, too, for businesses to cut through the hype around the cloud and ask what’s best for their business model. This is the crucial bit – companies should be choosing a cloud deployment based on what best matches their business model, rather than trying to rearrange business models to fit a one-size-fits-all public cloud deployment. When it comes to cloud, there are many options – the mantra that ‘cloud fits all’ just isn’t true.  Let’s look at the tradeoff between control and infrastructure costs. Private cloud is assumed to be more expensive than public cloud, but that’s not always the case. At a certain scale, public cloud fees continue to balloon while private cloud becomes more affordable. Because of the up-front cost of servers and equipment, many assume private cloud is pricier. But for many organisations, the equipment needed for private cloud will actually pay for itself in about six months. Public clouds might be affordable for smaller organisations but larger organisations already have the economies of scale to implement private clouds more cost-effectively. Don't assume that having a private cloud will be too expensive; it may be worth it in the end. Yet for those small businesses, being affordable doesn’t necessarily mean the cloud is the answer. Horror stories like the recent one where Box deleted a user’s account without permission mean that companies that have business-critical data, no matter how small they may be, still back up their data to local servers, just in case. Worse still were the reported outages in the last six months from Office 365, Dropbox, Google Docs and Salesforce, which mean that even if your data isn’t at risk of a breach, your business can still come to a grinding halt by putting all of your eggs in the cloud basket.
[easy-tweet tweet="Not all private clouds are created equal says @Accellion" user="comparethecloud" hashtags="cloud"] Additionally, it’s key to note that not all private clouds are created equal; there are many different variations. In one instance it’s an on-premise solution where the company hosts all of the equipment. There is also a version where a private cloud is hosted by a third party. As long as the servers only host information from a single business entity, the hosted version is still considered private cloud. Finally, private cloud is experiencing a level of what you might call “private cloud-washing.” Certain solutions are being advertised as private cloud when a number of their components, like the administrative features or the authentication, are hosted in the public cloud. For companies looking at private cloud for security and privacy, it’s important not to be tricked by hidden public cloud processes. So at the end of the day, ask your self how much control you require over your business data, and how much budget you have, because clouds come in all forms. The key to know is that if you’re concerned about security, whether from an intellectual property or regulatory compliance standpoint, the private cloud is a better bet for your business model. ### Europe’s small firms are making a big impact In almost every industry sector, attention tends to focus on the largest players, but often this is not where the real action is happening. Across Europe, start-up firms are redefining categories and providing stiff competition for incumbent operators. If you want to see the future, watch the small players at work. The evidence of the growing power of start-ups is everywhere. In many countries ride-share firm Uber has come from nothing to be a serious rival for established taxi companies. More recently, start-ups such as Stockholm-based Spotify, London-based Shazam and Berlin-based SoundCloud have changed the rules when it comes to digital music. [easy-tweet tweet="Across Europe, start-up firms are redefining categories and providing stiff competition for incumbent operators."] While success is to be admired, the typical start-up faces big challenges during times of rapid evolution and growth. Attracting seed capital, developing a business plan and getting offerings to market are huge tasks that take both time and experience to complete. Attracting and retaining talented staff is another big challenge. With start-up numbers growing by the week, finding the skills needed to get a particular business off the ground can be very difficult. Many firms offer share options or other financial incentives yet still find themselves battling with rivals for the very best of the talent crop. One London-based firm, Entrepreneur First, is taking a different approach to the challenge by recruiting talented IT specialists directly from university and then matching them with an appropriate start-up. According to the company, it has already supported the creation of more than 40 successful firms. Cities across Europe are hoping to foster a Silicon Valley-like ecosystem Meanwhile, cities across Europe are racing to position themselves as start-up hubs in an attempt to attract fresh talent and investment. They’re hoping to foster a Silicon Valley-like ecosystem of small firms that feed off each other and create long-term value for their country. 
For example, in the Netherlands, monitoring group Startup Juncture calculated that some 75 small firms attracted investment of around US$560 million in 2014, helping to establish a vibrant business ecosystem. Indeed, both Netflix and Uber have opted to place their European headquarters in Amsterdam. [easy-tweet tweet="Bullish growth plans require #Startups to have reliable #IT infrastructure in place" user="digitalrealty and @comparethecloud"] To support their bullish plans for growth, start-ups must also have the right IT infrastructure in place. Where once this would have required significant capital investment, now many are opting to embrace outsourced and cloud-based resources and take more of a ‘pay-as-you-grow’ approach. Rather than investing in expensive CRM and ERP systems, start-ups can make use of on-demand offerings such as Salesforce.com and NetSuite. For those requiring compute and storage resources, capacity can be rented from a range of cloud providers. Because such technology is critical for a start-up, ensuring it’s housed in a purpose-built and well-managed data centre is vital. Any disruption to core applications or data loss could be a disaster to a small firm enjoying a rapid growth rate. A start-up should check the credentials of a data centre partner carefully before retaining them as part of the team. Do they guarantee appropriate levels of performance and redundancy? How secure are their facilities? Do they ensure that smaller private IT deployments are interoperable with public clouds, and do they provide provisioning automation to and between these cloud environments? Because start-ups are likely to initially only require modest resources, the ability to co-locate equipment within a secure data centre facility is also important. Rather than paying for space that’s not required, start-ups can increase their floor space as the business grows, while supporting burst capacity requirements in public cloud environments. Effective data centres should also have robust connectivity with other facilities. This will allow a start-up to readily shift data between geographic locations as well as take advantage of the rapidly growing range of cloud services. The mix used can evolve as the business develops and forms the foundation of a hybrid cloud strategy. Start-ups have the potential to reshape the European business landscape and, with proper planning and execution, many may very well achieve this goal. The appropriate blend of IT infrastructure will go a long way towards getting them there. ### July Top 10 #CloudInfluence UK and Cloud Bursters This month’s bursters (those climbing the rankings most dramatically and bursting onto the scene) are led by Simon Hu, President of Aliyun, who made his first appearance in the rankings. Given Alibaba’s ambitions and its investment of $1bn in cloud to enable Aliyun to take on AWS, we will no doubt be seeing more from Hu, Aliyun and Alibaba in future. Both IBM and Microsoft had complex quarterly results announcements and each needed to explain to the market how they were transforming their businesses to sustain their growth in cloud. IBM’s Virginia Rometty leapt up the rankings, having not appeared prominently since the results three months before, as did Microsoft’s Kevin Turner.
Paul Hamerman from Forrester was quoted widely commenting on SAP’s impressive results with close to 130 per cent growth in cloud revenues and 13 per cent growth in overall revenue, while Todd Jackson rose to prominence this month as the former Twitter director of product management announced that he was moving to Dropbox.  [easy-tweet tweet="The July @Comparethecloud Cloud Bursters #CloudInfluence table is out now: "] [table id=45 /] The holiday season impacted the UK table this month, with IBM’s prolific social media presence Simon Porter, who hadn’t previously appeared any lower than 2nd, dropping to 3rd. Simon’s vacation gave an opportunity for Skyscape’s Simon Hansford and Ubuntu’s Mark Baker to grab the top two spots.  [easy-tweet tweet="The July @Comparethecloud UK Individuals #CloudInfluence table is out now: "] [table id=46 /] For the future tables we have adjusted our search algorithms to include SDN and NFV #CloudInfluence analyses. So if you are interested in these technologies or are a player in this part of the cloud market then tune in again soon to see what impact it has - maybe you'll be appearing in next month's rankings. NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.   ### Building a Cloud Business for the NHS: Top Tips Transform, innovate, modernise – these seem to be the buzzwords of the day, and cloud epitomises all of these, so what’s the best way forward? Neil Everatt of Software Europe, which provides cloud services to the NHS for people, expenses and contracts, shares his top tips for building a cloud business that’s right for the NHS. So you want to sell your cloud software to the NHS? Well, why not? The centralised National Programme for IT is long gone, the Government wants to encourage competition, particularly from SMEs, in the public sector, and everybody knows the Government protects the NHS budget, right? Do your product research early. A few years back, when we were a small firm of 18 people, market research was really tough to perform. However, we made it a priority, did it anyway and learnt exactly what our customers needed. One thing we discovered was that to be a truly effective cloud service, our staff expenses solution would need to connect to the NHS payroll system, a system called Electronic Staff Record [ESR]. This would mean that approved claims could be sent direct for payment. It meant adapting our product, but it would reap big rewards in the end. Work with customers – not for customers. Early in our development path, we opened dialogue with the NHS central team who were incredibly helpful in guiding us through the process and the necessary security checks for providing a cloud service to the NHS. [easy-tweet tweet="If you want a cloud business to work with the NHS - Security, security and security!" user="Neil_Everatt"] Security, security and security. We found that gaining an ISO/IEC 27001:2005 certification was incredibly helpful, but not always acceptable as a data security check in isolation.  We would often host visits from NHS data security staff, who liked to check how we were planning to look after their data.
We found another unexpected advantage in that our office was purpose-built in our reseller days, and it included a whole host of security additions that gave those data security officers a huge amount of confidence.  We always made sure we had the highest rating on the NHS’s Information Governance Toolkit, the internal security check used by NHS organisations.  We’ve recently moved to ISO/IEC 27001:2013, an even more rigorous standard, which I’m guessing will be the minimum requirement within a few years. Spot the influencers. Over the years we have noticed that there are a few influential innovators and early adopters in the NHS - these people could see what we were trying to do and bought into the idea.  We nurtured relationships with these people, valuing their feedback and appreciating that if they liked what we did, they would tell others. Companies must have a clear need for their cloud service. For us, working with the NHS, we know that all trusts are looking for compliance with rules, to reduce costs and to remove paper from processes [this is being driven from the top now]. We make certain our cloud services meet these needs.  Understand the procurement imperative. When businesses look at the NHS they often think every organisation is the same – trust by trust, CCG by CCG, CSU by CSU – or even that it is one business, but it’s not. In the early days, we thought procurement would be easy; we could replicate our efforts with similar thinking across customers. But tender processes are very involved; they take a long time, and trusts are rightfully careful to follow the rules. The sooner a firm understands this, the easier they’ll find it. We embrace the fact that we’ll need to modify our cloud service for every trust. [easy-tweet tweet="Want to work with the NHS? Stand out and be seen. Attend NHS events, and get noticed! " user="softwareeurope"] I should add that there are now a number of framework agreements that can be utilised, which makes the process slightly easier, and with the G-Cloud model growing in awareness, this can be another excellent route for NHS prospects to reach your products and services. Be flexible. We’re agile and we’re flexible. Not just in the procurement process, but in our cloud services. We are slightly fortunate that our expenses system, for example, was already very configurable, but the NHS needed a whole new layer on top of that. Communicate to customers regularly. We make sure we have a great team communicating with customers on a regular basis about our cloud service roadmap.  We have annual user conferences and now have a roadshow strategy when important new features are planned. Stand out and be seen. We decided to attend and sponsor NHS events as we could afford them, turning up with our roll-up banners and quirky artefacts to get us noticed.  One year, we used a ride-on remote-control car which was meant to link to the duty-of-care and driving features in the product.  We then hit on the idea of offering it as a raffle prize... it was a huge success; everybody wanted to win the car for their kids or their grandkids. This gained us a rock-solid and bang-up-to-date database of contacts for our sales team to work with. Software Europe provides cloud services for people, expenses and contracts. The company is transforming businesses in the NHS, wider public sector and commercial markets. It already handles staff expenses for 65% of the NHS.
Customers include: 5 Boroughs NHS Foundation Partnership, South West Ambulance Service NHS Foundation Trust and NHS South, Central and West Commissioning Support Unit. ### July Top 50 #CloudInfluence Individuals Topping the July table, and our most consistent performer over recent months, was Satya Nadella, Microsoft’s CEO. He has had a lot on his plate recently with the launch of Windows 10. While Microsoft is making great strides to reorient itself for the era of cloud, Windows 10 is far more of a mobile play: another attempt to extend the reach of Windows beyond the PC domain that it dominates, and to move it onto the mobile devices that are rapidly displacing PCs.  [easy-tweet tweet="#1 spot for July #CloudInfluence goes to @SatyaNadella with Daniel Ives in #2" user="comparethecloud"] FBR Capital Markets Analyst Daniel Ives in 2nd is another regular in the rankings. Corporate results season gave him plenty of opportunity to comment on the strategies of the key players in the market. Making his first appearance in the rankings in 3rd was Simon Hu, President of Aliyun, who rose to prominence after Chinese e-commerce giant Alibaba Group announced that it is investing $1 billion into Aliyun, its cloud computing arm. Furthermore, Hu made the bold claim that Aliyun is aiming to overtake AWS in the next few years.  [easy-tweet tweet="Bold claims from Simon Hu regarding Aliyun's future in the #Cloud landed him in the #CloudInfluence 3rd place"] Nicole Herskowitz in 4th was quoted widely in a series of announcements by various Microsoft partners that were making services available on the Microsoft Azure marketplace – as was Phil Sorgen in 7th. Likewise Terry Wise in 5th was quoted in announcements by AWS partners that were making services available on the AWS Cloud – as the two cloud giants battle to establish the pre-eminent ecosystem. One of the AWS partner executives widely quoted was Drew Clarke, Vice President of Products, Qlik Cloud, boosting him into the top ten this month as well. Anil Ambani, Reliance Group Chairman, announced that the company would achieve full deployment of its next-generation content and cloud delivery network by the end of this year, with - what is claimed to be a first in India - five fully operational Cloud Xchange points. Looking ahead to the launch of the Apple iPhone 6s, Phil Schiller commented that users are using cloud storage more: "The belief is more and more as we use iCloud services for documents and our photos and videos and music, that perhaps the most price-conscious customers are able to live in an environment where they don't need gobs of local storage because these services are lightening the load.” Apple's point of view is that 16GB should be fine for most folks who want to live off the cloud. [easy-tweet tweet="The P's have it for July with @Phil_Sorgen @Pschiller and @PaulHamerman all making the #CloudInfluence ranking"] Commenting widely on the recent Ashley Madison hack and the resulting blackmail risks, Chase Cunningham, threat intelligence chief at cloud-computing company FireHost, said: "You could really ruin someone's life.” However, he said that this hack only added to existing risks, given that many consumers have multiple devices that are connected to their bank accounts or contain sensitive personal data that also puts them at risk. Beyond the top ten, a few notable others included Ginni Rometty, IBM’s CEO, and Kevin Turner, Microsoft’s COO, both talking about their respective firms’ recent results and bigging up their growth in cloud.
Analysts like John Madden at Ovum, Kuba Stolarski at IDC, Stephen Kleynhans at Gartner and Audrey William at Frost & Sullivan were all pushing their firms’ latest reports on aspects of the cloud market. [table id=44 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Innovative Business Uses For Wi-Fi We’ve all benefitted from the development of Wi-Fi technology in recent years, with speedier connections at home and work to make the most of smartphones and portable devices. In fact, we’ve reached the stage now where 60% of the population reckon they couldn’t live without their Wi-Fi for more than a day, an addiction stronger than our obsession with coffee to get us through a working day. But Wi-Fi isn’t just about giving us a platform to connect to the Internet. Its convenience opens up innovative ways for businesses to use it to their advantage. Here are some of the ways Wi-Fi can be utilised to good effect by businesses… [easy-tweet tweet="Wi-Fi isn’t just about giving us a platform to connect to the Internet." user="comparethecloud" hashtags="wifi"] Roam around Meeting rooms all booked up? Or maybe you need a more inspirational setting for a brainstorm than sitting round a table in front of four blank walls? Wi-Fi means you can offer access to your system and network from anywhere within range of your signal. That means meetings can be outside or in the cafeteria and that employees don’t need to feel chained to their desk with their creativity stifled. Remote Of course, this can go beyond that too. Wi-Fi in pubs, restaurants, cafes and at home means that employees can break out from the office and carry out their business in more favourable surroundings while still remaining connected and using their time effectively. Retain People Traffic The Wi-Fi offered in pubs and cafes is actually a decent business tool for them. With customers attracted by fast, reliable Internet connections, they’re much more likely to set up camp and stay for a while, spending money in the meantime. An American study showed that premises with free Wi-Fi enjoyed better foot traffic, with customers spending longer – and crucially more money – as a result. Marketing There’s also a marketing opportunity from offering free Wi-Fi. Customers are allowed access to a network providing their details are registered and they are signed in. This gives the business a method of contacting the customer with offers and promotions in the future to entice them back. Rural Woes Businesses in rural areas often suffer from poor mobile phone signal, causing frustration for their employees. Switching to web-based communication apps that tap into Wi-Fi can bypass this and ensure business runs more smoothly in the countryside. Vehicle Safety Ultrasonic sensors and cameras can be fitted onto a vehicle to provide a system that feeds into a monitor and alerts the driver to any obstacles or hazards in the vicinity and sheds light on potentially dangerous blind spots.
Brigade's systems show how this can work to boost vehicle safety, and how Wi-Fi technology can connect such a system without the need for a complex web of cables. BYOD With tech-savvy employees owning and using the latest smart consumer devices, why would you want to hold them back? By allowing access to the company Wi-Fi, companies have a simple and effective way to incorporate a healthy Bring Your Own Device (BYOD) policy into their working practices. With Wi-Fi now seemingly controlling 60% of the population’s lives, we should really be thankful for the Starbucks on every corner - the web fix we crave is now just as strong as the lust for a double-shot latte. [easy-tweet tweet="Wi-Fi cravings control 60% of the population's day to day lives" user="comparethecloud" hashtags="wifi"] ### July Global #CloudInfluence Organisations Just when we thought it was safe to assume that we knew who the main cloud players were – guess what happened?  It was results time again for most of the large cloud players and all boasted impressive revenue gains – well, in a market growing at the rate it is, they would. Hot on the heels of the results announcements, Synergy Research issued a report saying that the “Big Four Cloud Providers” (Amazon Web Services (AWS), Microsoft, IBM and Google) are leaving the rest of the market behind. It found that in aggregate they now control well over half of the worldwide cloud infrastructure service market. Indeed, their combined market share rose to 54% in the latest quarter compared with 46% in Q2 2014 and 41% in Q2 2013. [easy-tweet tweet="The @Comparethecloud #CloudInfluence tables for July are out now, did your company make it? "] So that’s it then – it’s all about the big four. Maybe not...  The thing is, the big four are all very different: they play in different parts of the cloud market (IaaS, PaaS and SaaS) and count their cloud revenue in very different ways. IBM includes software and services in its numbers, Microsoft includes software like Office 365, while Google also includes Software as a Service. Only Amazon is reporting numbers heavily weighted toward pure infrastructure. So the "revenue run rate” reported by all the vendors is far from immediately comparable.  However, Synergy also reported that AWS had maintained its clear number one ranking with a 29% share of the overall cloud infrastructure service market (including IaaS, PaaS and private & hybrid cloud). Given that its numbers are far more heavily weighted toward IaaS than the others’, its share of IaaS is much greater than 29%, making it the dominant player by far with the other three some distance behind. So it’s clear then – it’s all about AWS, then the other three and no other real challengers. Well yet again, maybe not... You see, Chinese e-commerce giant Alibaba Group has announced that it is investing $1 billion into its Aliyun cloud computing arm and says that it will overtake AWS in the next few years. And this followed Larry Ellison’s bold claims for Oracle cloud capability (reported in last month’s ranking here). Understanding the Big Four’s differing cloud reporting metrics (sorting the apples from the oranges) and making sense of the claims of their rivals isn’t easy, but at least we can make a clear assessment of the level of attention and influence each is having in the market – that is, the extent to which each player is able to get its marketing machine and its ecosystem behind its opinions in order to promote them and, in doing so, gain attention and share of voice.
[easy-tweet tweet="The Top 3 #CloudInfluence spots go to @Amazon, @Microsoft and @google" user="comparethecloud"] The top spots go … surprise, surprise … to three of the big four – AWS, Microsoft and then Google. Amazon’s impressive Q2 results gave it the lead, while Microsoft wrote down the value of its Nokia acquisition and Google remained more opaque about its exact cloud revenues. Next came Apple, the top consumer brand, followed by Oracle and Intel all ahead of the last of the big four IBM. IBM faces possibly the largest transformational challenge of the big four, and as it reorients itself it is finding that revenue growth from its strategic imperatives of cloud, analytics and engagement is yet to offset the decline in its legacy businesses. Microsoft’s write downs were more of a one-off hit that only briefly overshadowed its commercial cloud revenues growth of 88 per cent in the quarter, driven largely by Office 365, Azure and Dynamics CRM Online uptake. AWS and Google as born-in-the-cloud entities don’t face the same legacy challenges. Alibaba, largely absent from previous rankings due to its relatively low profile in the West, made it into the top ten this month, and will no doubt it be seen more frequently in the rankings as continues its ambitious growth strategy in cloud. Analyst firms put in a strong showing as usual, with IDC predicting that Public cloud spending will reach almost $70m in 2015 and with Gartner releasing reports including its 2015 "Magic Quadrant" for x86 server virtualisation infrastructure, which found that Microsoft and VMware are leading the pack in that market.  Other notable appearances included:  SAP posted impressive results with close to 130 per cent growth in cloud revenues and 13 per cent growth in overall revenue and Synergy Research (with their research mentioned above) in 38th. [table id=43 /] NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Node4 acquires cloud computing company Premier IT Networks Node4, the Cloud, Data Centre and communications specialist, has completed the acquisition of London-based Premier IT Networks, a provider of secure cloud computing solutions to data-sensitive sectors. This latest strategic acquisition will strengthen Node4’s geographic presence, vertical sector capability and IT expertise, supporting its ambitious plans for further growth. Under the leadership of Peter Bodley-Scott and Thales Tsailas, Premier IT Networks has grown rapidly since its inception in 2008 to meet the demand for secure cloud computing services in the legal and charity sectors. Its customers include leading Criminal Chambers 25 Bedford Row and 9–12 Bell Yard, and charities Breast Cancer Now and Prostate Cancer UK. Both Peter and Thales will continue to be actively involved in the business. All of Premier IT Network’s employees will continue to work from its London office and will become part of the Node4 team. All services currently provided to customers will continue and Premier IT Network customers will now also have access to Node4’s portfolio of cloud computing, colocation, network connectivity, telephony and communications services. 
Premier IT Networks is an ISO27001 accredited company and is also an approved user of the Criminal Justice Secure Email Service. This allows people working in the Criminal Justice Service and those working to prevent crime — including public, private and voluntary organisations — to send emails containing information up to an equivalent of ‘Restricted’ (i.e. sensitive data), in a secure way. It is also a member of the G-Cloud procurement framework, which allows the public sector to buy cloud-based services from specialised suppliers. Andrew Gilbert, CEO at Node4, commented: “The Node4 management team is committed to expanding its presence across the UK, as well as its service portfolio, to support the company’s ambitious growth plans. This latest acquisition strengthens our London presence, as well as bringing valuable sector expertise, which will enable us to grow our presence in the legal, not-for-profit and public sectors. Equally importantly, we are gaining a team of IT experts who share our values and customer-centric approach to business.” Peter Bodley-Scott, Premier IT Networks, commented, “Like Node4, our sights are firmly set on future growth. The acquisition by Node4 will enable us to increase the quality and range of services that we can offer to our cloud customers, which will put us in an excellent position to fulfil our growth potential. We felt that Node4 was a perfect fit for us in every respect, sharing our ethical approach to business, cultural values and service ethos.” Premier IT Networks was advised by Grant Thornton, while business analysts PwC and lawyers Pinsent Masons worked on behalf of Node4. ### 3D Printing and implications for the future 3D Printing Services and the Implications They May Have on Businesses in the Future Undoubtedly, one of the most exciting technological breakthroughs of recent times is 3D printing. This fantastic new process, also known as additive manufacturing, is capable of producing real, three-dimensional objects rather than the two-dimensional output of paper printing. The possibilities are endless, as anything from trinkets to a full-sized car prototype can be built using 3D printing technology. Besides the novelty aspect, there is a sizeable impact it can have on your business as well. How 3-D Printing Works Until now, designers and developers have had to compromise by printing out two-dimensional renderings or diagrams of their designs on paper, but not any more. With 3D printing, the designer can actually produce a real, physical manifestation of what they have designed. There are several 3D printing methods emerging in the industry. Fused Deposition Modelling printers have an extrusion nozzle, somewhat like an inkjet printer’s print head. Heated material is pushed through the nozzle, and printing is done layer by layer until the final product is achieved. Similar to laser printers, Selective Laser Sintering units operate by shining a high-powered laser beam onto a “bed” of powdered material, hardening it into the required shape. Another method, stereolithography, operates similarly, but with liquid resin. Some Effects 3D Printing Could Have on the Industry [easy-tweet tweet="Statistics indicate that additive manufacturing is an industry worth $2billion today" user="comparethecloud" hashtags="3dprinting"] Statistics indicate that additive manufacturing is an industry worth $2 billion today, and this could shoot beyond $6 billion as early as 2017. There is no doubt that the most important application of this is in developing prototypes.
In any industry, for example the automobile industry, initial prototypes are designed on computers and then rendered on paper as technical drawings. This is then explained to engineers who implement it into the production line and develop final parts. These parts are then approved, tested and reproduced, then assembled to form the final product. What 3D printing could do is eliminate some of the intermediate steps. Instead of designing flat prototypes, CAD programs are used to develop three-dimensional renderings of the entire product or of the parts needed, and then these are fed into the 3D printer. Soon, the printer is able to produce an exact prototype of this design, to scale. This can be approved immediately and sent into production, hence effectively eliminating several visualisation steps, and saving time and money. Benefits of Additive Manufacturing Efficiency: The 3D Printing process involves much fewer processing steps, less time, less energy and fewer employees needed in the assembly line. It minimises wastage, and hence allows you to use your resources more efficiently. Small Lots: Instead of mass-producing parts and products, you can simply create the finished product through additive manufacturing, thus not wasting money on excess production and inventory costs. Speed: The manufacturing process is sped up by additive manufacturing, as there are no more intermediates between design and manufacturing. Cycle times are reduced as no tooling process is required. Spare parts are made as and when needed, simplifying assembly, logistics and manufacturing as a whole. Reverse-engineering: Instead of studying legacy systems and taking them apart to understand how they work, they can now be fed to a scanner, modified through CAD programs and then produced through any 3D printer. Instead of replacing the classic manufacturing process, additive manufacturing is revolutionising the process as we know it, and making it more efficient. It is capable of producing lightweight, simpler parts which earlier could not have been assembled through traditional processes. Widespread Scope of This Technology The scope 3D printing is not limited just to industry. From the medical to the defence industries and even to aerospace, electronics, tooling and energy industries, what was once a complex prototyping method is now being considered to be the future of digital production and manufacturing. Medical industry: CT scanner parts, ducts and several types of simple equipment can be produced by additive manufacturing. In surgical situations, where the doctors are not able to visualise the situation, but cannot risk exploratory procedures, a simple camera can be used to get an idea and a 3D model can be developed to better understand the situation. Implants and prosthetics can also be manufactured this way. Aerospace industry: Complex parts like fuel nozzles and the like are difficult to manufacture. Thanks to 3D printing, they can now be implemented easily using advanced light materials ranging from tungsten and ultem to stainless steel and several others. Consumer Electronics: As the need for portability increases, simpler products with embedded electronics can be built by additive manufacturing. [easy-tweet tweet="#3Dprinting will change the Medical, Aerospace and Consumer Electronics industries in the future" user="comparethecloud"] In all customisation areas, production costs are very high. 
Additive manufacturing reduces the number of cycles needed to churn out products and gets the job done faster and cheaper. Geometrically complex designs which would be difficult to implement through classic methods can now easily be incorporated into the production line. To learn more, read up on additive manufacturing online or visit a trusted provider near you. It is safe to say that if 3D printing progresses at a steady rate, it is indeed the future of the manufacturing industry, but it is going to take a while before 3D printing is recognised, adopted and implemented as a global standard. ### Blue Coat Acquires Perspecsys Blue Coat Acquires Perspecsys to Effectively Make Public Cloud Applications Private Blue Coat Systems, Inc., a market leader in enterprise security, today announced it has acquired Perspecsys, Inc., a leader in enterprise cloud data protection solutions. With this acquisition, Blue Coat significantly expands its cloud security offerings while enhancing the industry’s most robust hybrid cloud portfolio. This expanded portfolio positions Blue Coat as a leader in the Cloud Access Security Broker (CASB) segment and sets a new bar for data protection within cloud applications. “CASBs have become the primary means of layering security onto SaaS (Software as a Service) applications,” said Mike Fey, Blue Coat president and COO. “Industry analysts predict substantial growth for this space over the next five years. As SaaS deployments continue to increase, this technology will become an essential component of enterprise data and cloud security.” This acquisition enables Blue Coat to offer the industry’s widest range of CASB capabilities within its market-leading Secure Web Gateway portfolio. Together, Perspecsys and Blue Coat deliver the industry’s leading solution for protecting data in an era of user-led cloud application consumption, dramatically enhancing security for corporate-approved cloud applications and services. As dependency on cloud applications grows, enterprises are faced with a number of issues regarding data privacy, compliance and security. These include compliance requirements for personally identifiable information (PII), as well as payment (PCI) and protected health information (HIPAA); strict data governance and regional residency rules for multinational companies; and, most prominently, the continuous global news of data breaches. These concerns can be an inhibitor to the adoption of cloud applications for many organizations. Perspecsys’ Cloud Data Protection platform solves the key business risks associated with data compliance, privacy and security for enterprises as they move to adopt cloud-based applications. Through the use of its patented cloud data tokenization and encryption capabilities, Perspecsys puts enterprises in complete control of their data at all times, regardless of where the data resides, both while in use and at rest in the cloud. By removing the risks associated with placing sensitive data in the cloud with applications such as salesforce.com, ServiceNow and Oracle, Perspecsys effectively makes the public cloud private. “To gain control over the explosion of cloud application usage, our customers tell us that the proxy is the logical location to implement CASB features, and with our historic strength and understanding of web applications it is a perfect fit,” said Greg Clark, Blue Coat CEO. “Given that more than 50 percent of our customers are based overseas, and as we look to expand our vision for cloud protection, we are paying close attention to the unique needs of multinationals that reside or are doing business in geographies with strict compliance and data sovereignty requirements. Perspecsys has delivered an approach for encrypting and tokenizing data in the cloud that can actually be used in the real world by global customers, making it a disruptive force in the market and a great addition to Blue Coat. In summary, Perspecsys’ capability reduces corporate cyber-crime risk. Management of data at rest via tokenizing or encrypting is rapidly becoming a critical element of cyber defense for both corporate-owned infrastructure and cloud applications.” “We’re excited to be joining forces with Blue Coat. As the largest content security provider on the planet and a leader in CASB, Blue Coat is the logical source for these security services,” said David Canellos, Perspecsys CEO. “Enterprises want to buy this class of solution from Blue Coat, and Perspecsys’ Cloud Data Protection is a natural complement to Blue Coat’s widely recognized portfolio of cloud and proxy technologies.”
### Why are businesses shifting to hosted desktop solutions? Is moving your desktop infrastructure the right approach for every business? Of course not, but an increasing number of firms are now realising the benefits that Virtual Desktop Infrastructure (VDI) can offer. Leveraging the cloud to host your desktop infrastructure externally means freeing your business and employees from the shackles of the traditional workstation approach. With VDI, storage, applications, and everything else that underpins the desktop experience are managed by a third-party vendor. [easy-tweet tweet="Organisations are choosing the VDI approach more than ever before because the technology has improved markedly"] There’s always a danger that the growth in VDI uptake is just another example of businesses jumping on the cloud bandwagon without having given it due thought, but in a lot of cases the technology is a perfect fit. Businesses and their employees are increasingly demanding the flexibility to work wherever and whenever they need to, and VDI makes this possible. Whether you need to work from home, access files on the move or respond to an unexpected development while on holiday, hosted desktop solutions give you the tools you need to work remotely across mobile and desktop platforms. Organisations are choosing the VDI approach more than ever before because the technology has improved markedly over the last couple of years. Cloud hosting your desktop infrastructure has seen criticism, and it is true that badly implemented VDI will cause a number of headaches, but the technology is easy to appreciate when it has been implemented correctly. When it is used properly, businesses can access their full range of tools via the cloud without disruption. Popular software like Microsoft Office, as well as more bespoke tools, can all be delivered virtually over a cloud network. What’s more, by deploying the same operating system to multiple users, the upgrade process is much simpler. Updates, patches and new applications can be deployed centrally and will automatically be made available to all users of the virtual desktop infrastructure the next time they log in. VDI also offers disaster prevention and recovery benefits.
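To illustrate the tokenization idea described above, here is a minimal Python sketch of how a gateway could swap a sensitive field for a random token before a record leaves the enterprise, keeping the token-to-value mapping in a local vault so only meaningless tokens ever reach the cloud application. This is a simplified illustration of the general technique using hypothetical field names; it is not Perspecsys’ actual implementation.

```python
import secrets

class TokenVault:
    """Illustrative on-premise token vault: maps random tokens back to the original values."""

    def __init__(self):
        self._store = {}  # token -> original sensitive value (never leaves the enterprise)

    def tokenize(self, value: str) -> str:
        token = "tok_" + secrets.token_hex(8)
        self._store[token] = value
        return token

    def detokenize(self, token: str) -> str:
        return self._store[token]

vault = TokenVault()

# Outbound: replace the sensitive field with a token before the record reaches the SaaS app
record = {"name": "Jane Doe", "card_number": "4111111111111111"}
outbound = {k: (vault.tokenize(v) if k == "card_number" else v) for k, v in record.items()}
print(outbound)   # {'name': 'Jane Doe', 'card_number': 'tok_...'} - the cloud only ever sees the token

# Inbound: restore the original value when the record comes back through the gateway
restored = dict(outbound)
restored["card_number"] = vault.detokenize(outbound["card_number"])
print(restored)   # original card number recovered on-premise
```

In a production deployment the vault itself would need to be protected, replicated and audited, which is precisely the kind of capability a dedicated cloud data protection platform provides.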
Reliable cloud hosting companies can manage server failure without service being affected. Many providers will have safeguards in place that compensate for server failures, enabling them to remain at full capacity even while resolving technical issues. Furthermore, businesses that outsource their desktop infrastructure generally utilise thin clients in place of traditional PCs. As well as offering a lower carbon footprint, thin clients have no moving parts, which reduces the likelihood of hardware failure. Virtualisation is also becoming more viable for SMBs from a financial point of view. The option of using older hardware, the lack of server maintenance costs and the subscription payment model all add up to make hosted desktops a cost-effective solution for the modern business. With the number of benefits on offer, hosting your desktop on remote data centres is becoming increasingly common. Major players in the technology scene are beginning to get on board, with the likes of Microsoft, Amazon and IBM all offering desktop hosting solutions in some form. Whether or not these larger companies can deliver the same level of dedicated service as bespoke cloud hosting companies remains to be seen, but it certainly raises awareness of VDI. Increased awareness will surely be accompanied by increased scrutiny, particularly over integration and security issues. IT infrastructure is extremely complex and configuring VDI solutions for optimal performance is not a simple task. Businesses will also have to ask themselves if Desktop as a Service (DaaS) is right for them, or if they want their entire infrastructure outsourced. Alongside these questions, companies will need to feel secure hosting company data outside of on-premise servers. There are a number of worthwhile reasons to shift to VDI but, as with any cloud offering, users must remember that they are buying a continuing service, not a one-off product. To get the most out of your hosted desktop, make sure there is ongoing collaboration and consultation with your provider to guarantee a solution that is able to meet your company’s needs both now and in the future. ### Doug Clark, IBM - Live from Cloud World Forum, London Interview with Doug Clark of IBM as part of our live afternoon Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 24/06/2015 ### Top 5 IT Predictions for Finance Sectors The banking system’s legacy Financial technology systems, like the banks they serve, have been built over decades through a long series of mergers and acquisitions. Already operating across a complex set of siloed departments and systems (core banking, payments, trading, etc.), each merger or acquisition has resulted in further systems being bolted on, creating ever larger and more complex IT systems connected by a maze of spaghetti. Most large banks are now creaking at the seams, with more budget devoted to supporting legacy systems than to innovation. Many banks have core banking systems as much as 40 years old, but these systems are so critical to their operation and so heavily integrated into other systems that they have become nearly impossible to replace. Doing so has been likened to trying to change the engine on a jet airplane mid-flight. This does not mean that there is no new banking technology available, or that new entrant banks aren’t basing their operations on the latest technology. Often making significant use of cloud technologies, challenger banks typically see their IT operations as a major source of competitive advantage.
At the same time, most of the older retail banks are experimenting with newer technology in some of their peripheral operations, but at their heart – in their core systems – they are still stuck with what they have, at least for now. Banking in the 21st century: Technology within the financial sector has changed considerably over the last decade. After the dotcom boom, the IT systems of the survivors evolved to become intertwined with nearly all of the other IT services used within each organisation. This brought both challenges and benefits in the wake of the outsourcing/offshoring trend, which caused much confusion for everyone working inside the finance industry, as well as for the regulators. Virtualisation was still in its infancy, with complex infrastructures that were – and in the majority of cases still are – replicated in their entirety to a separate location for contingency purposes. Automation of technology was cumbersome and Application Service Providers (ASPs) were rife, wrestling with code for crude web presences, while continued efforts to centralise IT infrastructure were made more difficult by the recent outsourcing trend. What followed towards the end of the decade were phases of cost cutting and consolidation that were inevitable after 2008 and the financial crisis. It was at this stage of technical evolution within the finance sector that cloud technologies were truly realised. Starting with the easiest of services to untangle, outsourcing began again, but this time with an educated, experienced head on the shoulders of the incumbent IT departments. Today the financial marketplace is a very different landscape, and cloud technologies are common among regulated and non-regulated industries alike. You may have heard the phrase “Omnichannel Approach” on numerous occasions, and if you haven’t, you will. In this age of digital transformation all business sectors are being affected, not just finance. It is extremely important to have a consistent approach across all aspects of technology and business. To put this into context, the statistics below speak for themselves: [easy-tweet tweet="69% of customers already use the Internet to buy financial products #fintech"] [easy-tweet tweet="Mobile banking usage will exceed 1.75 Billion by 2019 representing 32% of the global adult population #fintech"] [easy-tweet tweet="Nearly 50% of the world’s banks will disappear through the cracks opened by the digital disruption #fintech"] [easy-tweet tweet="72% of millennials would be likely to bank with non-financial services companies #fintech"] 69% of customers already use the Internet to buy financial products - PWC Mobile banking usage will exceed 1.75 billion by 2019, representing 32% of the global adult population – Juniper Research Nearly 50% of the world’s banks will disappear through the cracks opened by the digital disruption of the industry - BBVA Chairman and CEO Francisco Gonzalez 72% of millennials would be likely to bank with non-financial services companies with which they do business (like Google, Amazon), compared to 27% for those over 55 – Accenture The digital transformation is upon us and every industry needs to change! With the emergence of cloud technologies, the world has never seen technology in such a state of constant change as it is in today.
Other industries have changed in the same way over the past five years – if you are old enough to remember the struggle between Amazon and Borders, Borders never saw the end coming from the new industry bookseller chomping at its heels. The same can be said for many other industries where diversification and strategy go amiss. However, we do have the benefit of hindsight, and armed with this, together with more data analysis capability than has ever been available in our history, the finance sector can not only stay in the game but stay ahead of the curve. There are three main focus points to ensure you stay ahead of the curve: 1. Be customer focused and achieve excellent customer service. We are living in the “ME” generation and as such should accommodate the demanding needs of the youth of today, across whatever medium is convenient to them. “The majority of Millennials would rather lose their sense of smell than their technology” - Newscred 2. Make the IT strategy shift to accommodate point 1. Changing your roadmap of service delivery is essential. Do not think for one minute that the majority of your clients will even pick up the phone to you if they are unhappy; they will not, and will simply move to one of your competitors. “2/3 of contactless payments since October have been through Apple Pay” – Tim Cook, CEO Apple 3. Data is the new currency in this digital age. Banks have amassed huge amounts of data from their clients – use it. “The vast majority of data never gets used… Only 0.5 percent of all data is ever analysed” – MIT Technology Review So, this leads on to some predictions for financial institutions and technology. We have compiled a “TOP 5” list of emerging trends that we believe will be seen over the next five years. 1. Analytics Evolved Big Data is not a new concept, and financial institutions have been using Big Data in one form or another via trading systems for years. Employ a data strategy to cultivate the unstructured information you already have. There are many providers and cultivators of the technology and toolsets that can be used. You may have heard of IBM Watson, the artificial intelligence computer that made the headlines back in 2011 by beating former human winners of the quiz show Jeopardy. This technology is now mainstream and available – cloud based – at a fraction of the cost of developing anything close to the same results yourself. 2. App Delivery Application delivery of your products and services is key when your clients have been using smart devices since the age of 12. Technically speaking, we are in a period of constant change in major IT infrastructure, but you should be embracing this change as it will also drive down your cost of ownership. Open source technology, once shunned within regulated industries, is now a major driving force within any industry that requires enterprise IT. OpenStack has emerged onto the scene and has many benefits for your organisation in the orchestration and management of resources. There is also a new way of thinking based around container technology (Docker), in which containers in essence share a single operating system (and binaries) for application delivery, without the vast number of virtual machines that would normally be required. This technology is in its infancy but is definitely one to watch for the near future. 3. Industry Clouds There is much debate around public, private and hybrid clouds. Rarely do you hear any talk about industry clouds and the massive benefits that they bring.
Imagine a cloud-based system that is designed for the financial industry, has IT governance applied to the infrastructure that meets regulators’ requirements and data sovereignty laws, is resilient, and meets all of the checkpoints set out by your governing body. It would bring the cost synergies of shared implementations of bespoke and expensive vendor infrastructure that is secure. A cloud for your industry, designed by your industry specialists (with the regulators’ input), that allows you to reduce the expensive staff base of unique skills needed to manage and secure the technical environment, as well as giving you access to previously unavailable products and services that would have required a large capex. We’re looking at this as a possibility within the next five years. 4. Fintech Involvement Financial institutions have always been trusted with clients’ money. We see another potential trend that has started to emerge: financial institutions offering IT services to accommodate the new currency, data. It would make sense to investigate this business model, just as other (tech) industries have done by offering financial services. We also see financial institutions becoming more involved from a venture capital perspective, investing in high-growth tech startups. This too makes sense, allowing them to assist and guide this new breed of tech firm by courting them at an early stage of development. 5. DRaaS – Disaster Recovery as a Service With the emergence of cloud technologies and providers also comes a reliance on governance from those providers. Disaster recovery is mandatory for regulated financial firms, and this will become even more important when outsourcing to cloud providers. Specialist disaster recovery/business continuity providers will emerge into the marketplace with the knowledge and understanding of not just your technology but also your business. This service is of course available today; however, with the added security vulnerabilities that go hand in hand with cloud provisioning, specialist providers will undoubtedly emerge. As keen cloud watchers at Compare the Cloud, we monitor the level of comment and debate on cloud in every technology and industry segment. While the use of cloud in the financial services sector is increasing all the time, there is still relatively limited comment and debate on cloud within banking. In the table below we highlight some of those that have been most outspoken on this topic in the last few months. [table id=42 /] For future events about Cloud Banking and FinTech please visit: http://cloudfsworldseries.com/ ### Fintech Connect – Cloud Banking NYC Seminar Last week Andrew McLean and I headed off to NYC for the Fintech Connect Cloud Banking Seminar with CloudFS. Day one The event kicked off with an opening address from the Founder and Managing Director of Arthurs Legal – Arthur van der Wees – setting the scene and introducing the event. We then broke straight into the keynote panel discussion on the viability of cloud within the banking sector and what is and isn’t possible from this perspective. The panellists were heavyweights in the industry and included: Bank Leumi USA, Everbank, Cross River Bank, Wells Fargo, CLSA Americas and Appdirect. A lively debate formed among the panellists, and one theme ran through the whole segment – security and regulation. [easy-tweet tweet="This year and the next will be one of alliances and acquisitions within the Cloud arena." user="comparethecloud"]
user="comparethecloud"] Key takeaways for the day were from James O`Neill discussing the different cloud models for the industry, heavily debated around trust for the Hybrid option of cloud. A prominent view was that many banks are using Hybrid already, and that the biggest risks are the service providers, together with the rise of the Community Clouds that specifically service the Financial Services Industry. Bob Savino, CTO of Moven Bank also entertained the audience with the rags to riches story of Moven, and the complexity of keeping up to date with the compliance and security governance to stay relevant. [gallery columns="2" link="none" size="full" ids="21185,21184"] What then followed in the afternoon was a whirlwind of information provided by even more heavyweights within the industry. With speakers representing BONYM, IBM, ING Bank, the NIST, and the CSA. Selling cloud to the board was a very interesting debate, particularly when discussing the overall concept of Cloud. The CTO/CIO requiring help with this to justify spend was stated and also a new way of thinking was required to make this challenge easier. With the 3 year typical cycle of hardware depreciation vs Outsource to the cloud being particularly popular, and banks still spending Capex on technology, this was not just about cost for IT Serviceability but also resilience and security. The day ended on a high note with a summary by the chair on the days discussion with a welcome drinks reception and networking opportunities for the next hour. Day two The keynote was performed by Charaka Kithulegoda, CIO at Tangerine Bank with an overview of how to address transition from legacy to Cloud seamlessly with using a “Cloud First” mentality and this was the theme for the next few discussions too, with most stating that a single pain of glass approach for Cloud Service offerings is a must. In my keynote I suggested that the rise of Compliant Clouds is upon us, as well as multi-vendor participation. This year and the next will be one of alliances and acquisitions within the Cloud arena. Later in the day more discussions opened up on this topic and then led into what the NIST is currently reviewing for the standardisation of Cloud Services within the Financial Sector including potential ways of enforcing this. Towards the later part of the day some fun was had with selfies of the audience and organisers and a scurry of removal men for the shows sponsors. Overall the event was very enlightening on what the Financial Services sector is doing, and taught us all a lot on their thoughts on cloud. The 3 main topics that were raised were very pertinent for the industry – Security, Resiliency and Regulation. I look forward to more cloud banking in the future. ### The Great Cloud Juxtapose [quote_box_center]The definition of Juxtapose is ‘to put people or things together, especially in order to show a contrast or a new relationship between them’[/quote_box_center] Why is this relevant to Cloud and IT or beyond? We have all seen various examples of consultants and bloggers talking about the changes with industries. From my perspective we are on the cusp of seeing a mass dissemination of all industries where the champion is the customer service aligned organisation and holder of the customer record. So let's look at the Cloud Juxtapose through recent examples from the UK and globally: Santander Bank: Moving from the physical to the digital, Santander Bank now offers digital cloud storage services to corporate customers. 
I believe this is an acknowledgement that safety deposit and traditional bank custodian services have moved beyond the physical layer. The offering of a high-security data repository is also in use by Barclays Bank, and we see a growing trend of partnering and developing services for use by banking clients. [easy-tweet tweet="What is the cloud juxtapose? " user="comparethecloud" hashtags="cloud"] Crowdfunding Many good ideas never come to fruition due to a lack of available credit mechanisms in the marketplace or due to external influences such as recessions or credit crunches. Previously, stimulus derived from banks lending to SMEs or from venture capitalists. The “computer says no” culture of banking and the “equity-grabbing greedy VCs” are now the last places entrepreneurs wish to go. In fact many VCs and investment funds suggest getting a seed round via crowdfunding, and running a crowdfunding campaign before approaching them. So what is crowdfunding? If you head over to the most popular platforms (links at the bottom of this blog), such as IndieGoGo or Kickstarter, you will see a plethora of products, services, software and projects that you can subscribe to or fund. The JOBS (Jumpstart Our Business Startups) Act, with various tax benefits and a de-coupling of regulations, is positively encouraging this in the US. IBM and Bitcoin Bitcoin and blockchains are topics many of us hear about but rarely engage with or understand. A simple definition is a non-centralised currency system, devoid of government regulation, that provides a cloak of anonymity. This freedom of currency and ledgering could revolutionise the payment system, much as the Medici banking family did in Renaissance Florence. Could IBM be the central technology enabler and trusted pair of hands for this technology? Virgin From broadband to airplanes to mobile telephones, the Virgin empire is one built on a premise of customer service and cross-brand marketing. My belief is that the corporation that holds the customer relationship will be the outlet of the future. Virgin typifies the ‘disruption corporation’ that enters a marketplace and shakes out the existing rules to create its own space. How long before Virgin really enters the cloud computing and application space? 3D Printing and Manufacturing Today the undisputed leader in industrial manufacturing is China. China’s huge factories can retool and deliver on a massive, unparalleled scale, usually at a cost that is unachievable elsewhere. But when a design can be produced and then 3D printed at massive scale, and delivered at a fraction of the worldwide shipment cost, will this disrupt the existing status quo? I think the simple answer is yes! We are already starting to see the growth of 3D printing ‘farms’ that are clustered in parallel to mass-produce items. Once consumable costs fall and fast processing chips allow for faster, easier rendering, I believe that 3D printing will arrive. The advance of 3D printing will disrupt manufacturing more than previous examples such as steam power or electrification during the industrial revolution. If IBM Watson met Honda’s Asimo Robot? Take the amazing movement, humanlike actions and dexterity of the Honda Asimo robot, then, using new, faster processors, throw in the artificial intelligence capability of IBM’s Watson. What would be the outcome of the amazing juxtaposition between these two technologies?
A walking, talking, self-learning, quiz-show-winning, football-kicking robot that you could take down the pub and win any quiz with… CRM, IoT, M2M or human management systems Customer relationship management systems today are clunky systems designed by accountants to measure sales revenue, and are frankly boring. Take the concept of the Internet of Things and M2M – that is, the ability to allow any device to use a sensor to communicate with a centralised system. By fusing (with permission) all this data, what do we get? The Human Management lifestyle concierge. This system could be ubiquitous across devices and allow for our lives to be managed like never before. I mean, who wants to just be a boring statistic or instrument of revenue? The Bazaar comes to the web Admit it, we all love the esoteric and hunting out crafts or items that may be unusual. Look at marketplaces such as Etsy or Alibaba: will this growth continue, with distribution and retail chains becoming the mini-shop or digital web presence within larger online marketplaces? I hope I have provoked a few thoughts within this blog; links for the various references are below. Links: IndieGoGo: https://www.indiegogo.com/ Kickstarter: https://www.kickstarter.com/ Santander announcement: http://www.ft.com/cms/s/0/4f89b706-a8b7-11e4-97b7-00144feab7de.html#axzz3govGwR1y IBM and Bitcoin: http://www.reuters.com/article/2015/03/12/us-bitcoin-ibm-idUSKBN0M82KB20150312 Crowdsourcing: https://en.wikipedia.org/wiki/Jumpstart_Our_Business_Startups_Act ### GTT Releases White Paper on Managing the Challenges of PCI Compliance GTT Communications, Inc., the leading global cloud networking provider to multinational clients, announced today the release of a new white paper entitled PCI Compliance: Protection Against Data Breaches. The white paper highlights the growing impact of data breaches on companies that conduct credit card transactions and the requirements needed to become and remain Payment Card Industry (PCI) compliant. “As these attacks become more sophisticated, GTT’s security experts help to protect a company’s network from advanced threats.” “Data breaches are becoming more common and the impact to businesses in terms of revenue and reputation are very costly,” said Rick Calder, GTT President and CEO. “As these attacks become more sophisticated, GTT’s security experts help to protect a company’s network from advanced threats.” The white paper focuses on the PCI Data Security Standard (DSS) and highlights the financial impact to businesses and customers from a data breach. The white paper also outlines the most recent PCI DSS changes and what merchants need to do to stay compliant. Download a complimentary copy of the white paper here. ### All Things Code builds for growth on iland UK Start-Up, All Things Code, builds foundation for growth on iland’s Enterprise Cloud Services iland, an award-winning enterprise cloud hosting provider, today announced that UK start-up All Things Code, a software development company, is leveraging iland’s Enterprise Cloud Services (ECS) as the backbone of its IT infrastructure. By hosting its key applications in iland’s public cloud (IaaS) environment, All Things Code can focus on innovation and growing its customer base as the company continues to quickly scale its business.
“To independent software vendors (ISVs) like All Things Code, time spent away from developing and delivering cutting-edge products is time wasted, and iland’s Enterprise Cloud Services help eliminate that waste by arming customers with proven global cloud infrastructure,” said Lilac Schoenbeck, VP of product management and marketing at iland. “Additionally, with intelligent management tools built into the service, iland’s customers are not only able to easily manage their cloud footprint, but also save the time, effort and expense of employing third-party management solutions.  This further frees them up to focus on developing cutting edge software solutions.” All Things Code delivers mobile and web strategies and applications to streamline business processes.  From HR to sales catalogues, All Things Code reconfigures old processes and delivers them as applications that can be used on mobile devices. A start-up with ambitious growth plans, All Things Code needed IT resources it could scale rapidly and easily while maintaining a high level of performance.   As with many software companies today, the decision to host applications in the cloud was an obvious one for All Things Code. “We wanted to use cloud as a cool, integral part of our business, and it was an easy decision for us to go that route rather than buying physical hardware,” said Dan Harding, director at All Things Code. “At the same time, we also wanted to make sure we had the best possible infrastructure available to us with the highest quality and redundancy we could get.” One of the primary reasons All Things Code selected iland was the cloud provider’s credibility in the market. Dan Harding comments, “We work with large companies and needed a cloud vendor with real credibility. Being able to say that our cloud provider has won awards, has a location in one of the top data centres in London, and even has the ability to give our clients or prospective clients a tour of where their app will be hosted was a huge check in the box for us. It was very important for us to have an established and trustworthy cloud provider.” All Things Code is able to optimally serve its customer base because iland’s ECS offering provides the company: Access to best-in-class infrastructure with reliable availability and security that would be out of reach for an on-premise investment The ability to efficiently scale resources. While capacity requirements are difficult for any business to predict, this is particularly challenging for a start-up like All Things Code The flexibility to quickly configure and deploy new products iland’s intuitive ECS portal that enables customers to easily ramp up cloud capabilities, control and predict costs, manage resources, maximise performance and enhance security and compliance The strong partnership that iland has with VMware was another important factor in All Things Code’s adoption of iland’s cloud. The company wanted a cloud provider that could deliver the VMware infrastructure it was familiar with, combined with the cloud agility it needed to grow fast. [quote_box_center]Dan Harding comments, “We have big growth plans. We have a few products ready to go into production so being able to configure those very quickly is key for us. The flexibility and scalability of our cloud resources allows us to do that. And the support we’ve had verifies we made the right choice.”[/quote_box_center] Today, All Things Code operates a main controller, web server and SQL database in the iland cloud. 
The web backend enables the customers of All Things Code to update and maintain their own catalogues and product ranges themselves. All Things Code also has a backend database to update its own applications, which is also hosted in iland’s ECS environment.  For more information on iland’s enterprise cloud and disaster recovery services, go to: iland Hosted Cloud Services iland Disaster-Recovery-as-a-Service   ### David Fearne, Arrow ECS - Live from Cloud World Forum, London Interview with David Fearne from Arrow ECS UK as part of our live morning Kings of Cloud broadcast from Cloud World Forum, Olympia, London – 25/06/2015. ### Cloud Management What do we mean by ‘Cloud Management’? If you are consuming a significant amount of cloud services, particularly if they incorporate diverse cloud services, you will need some form of cloud management tool. A Cloud Management Platform goes beyond virtualisation management controls. Gartner suggests that, at a bare minimum, such a system should: provide self-service interfaces, enable the provision of system images, manage metering and billing, and provide for some degree of workload optimisation through established policies. Who are Cloud Management Platforms for? These tools are really aimed at the enterprise or organisation with large or complex infrastructure management requirements. They are designed to reduce the complexity of the infrastructure management and facilitate the provisioning of cloud resources. Because of the role they play, they are targeted at the needs of Infrastructure Managers, CTOs and Environment Managers. What benefits do Cloud Management Platforms offer? “Piloting an entire infrastructure can become extremely complex, especially as you have as many interfaces as you have technologies,” argues Jeanne Le Garrec of CMP vendor, Hedera Technology. He argues that the value of the Cloud Management Platform is to provide a ‘single pane of glass’ view of all IT and infrastructure tools which can facilitate the management of the cloud environment. Le Garrec continues: “Since companies nowadays have very heterogeneous infrastructures it can be hard to think about optimisation because there are so many aspects to consider. CMP has a role to play here too.” Are there any risks associated with Cloud Management Platforms? Given the increasingly crowded marketplace, the challenge is to find a solution that best supports the requirements of your organisation, infrastructure and management policies. Gartner’s definition points out that: “More-advanced system offerings may also integrate with external enterprise management systems, include service catalogues, support the configuration of storage and network resources, allow for enhanced resource management via service governors and provide advance monitoring for improved guest performance and availability.” Choosing the appropriate tool to meet the requirements of the business presents a challenge in a market that is “still immature”, according to Gartner. The biggest risk, however, rests with the use of proprietary management tools which inevitably limit the interoperability of cloud services and consequently limit the potential value of moving to cloud provision. ### Cloud Builder The key to a successful Cloud Computing deployment is to choose the right provider for the right application. For example, if you are handling credit card transactions, computing performance is much less of a concern than security and compliance. 
Cloud Builder is Compare the Cloud’s resource for helping you to decide what is important for you in the context of what your Cloud Service Provider (CSP) can deliver. 1. Pricing Model Cloud computing can be purchased on a reserved or unreserved basis, with billing based on hourly, daily or monthly contracts. Different charges apply with different suppliers – for example, latency and/or traffic might affect prices. This means it is essential to look beyond the headline costs and pay close attention to the detail of each pricing model and any variable costs within it. It is also important to understand whether the service requires ‘reserved instances’, where you will need to pay a fee in order to secure the capacity you desire. If these charges can apply and you choose not to pay them, this can impact the performance you receive. In such a circumstance – if it matters – you will need to factor in the higher cost. 2. Transparency How is your CSP’s cloud built and managed? It is important to ask any potential CSP to explain the underlying technologies deployed, the network infrastructure serving the data centre, their change control policies, security policies and practices, geo-resilience, disaster recovery plans, the physical location where your data resides and how that is managed, what management tools are available to you and, more likely than not, their green credentials. The transparency of your CSP will have implications for risk, contingency and future planning (including portability and interoperability), so obfuscation in the early stages of negotiation is a warning sign. Don’t be afraid to dig beyond the marketing hyperbole. 3. Service and Service Level Agreements (SLAs) Moving your computing to the cloud does not mean that you should lower your expectations in terms of the quality of service you receive. Sure, more price-competitive cloud computing service providers may strip back the service element to a bare minimum, but if this doesn’t suit your business requirements you may need to consider paying a bit more to secure the cloud computing service that best delivers the service level you require. Your SLA must match or exceed the expected level of service were you to operate these services internally. In addition, the demarcations between your business and the CSP for the responsibility for service provision, security and management must be firmly – contractually – established. 4. Geography Post-Snowden, the location where your data resides is an important piece of the puzzle. Geography also matters in terms of connectivity – do you choose a CSP in your region? How does the data centre sit within the wider Internet infrastructure? And what can the CSP guarantee in terms of geo-resilience and geo-redundancy? Cloud computing technology allows for a lot of resilience and redundancy, since virtual servers are separated from the underlying physical hardware. But what if physical failure wasn’t confined to a single component or group of components within the provider’s cloud? What if the whole data centre facility was affected? Perhaps by power outages, human error, terrorism or a natural disaster? Some CSPs will build in a geo-resilient component to their service offerings in order to offer the benefits of a dual data centre strategy. This is most commonly a secondary back-up service that is turned on in the event of the primary site going down (a pattern sketched below), but could also include a load-balancing agreement.
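As a rough illustration of the “secondary site switched on when the primary fails” pattern just mentioned, the short Python sketch below health-checks a primary endpoint and falls back to a secondary one. The URLs and timeout are hypothetical, and in practice a CSP would usually handle geo-failover through DNS or load balancers rather than in application code; this is only meant to make the concept concrete.

```python
import urllib.request

# Hypothetical primary and secondary endpoints in different data centres
ENDPOINTS = [
    "https://primary.example-csp.com/health",
    "https://secondary.example-csp.com/health",
]

def first_healthy_endpoint(endpoints, timeout=2):
    """Return the first endpoint that answers its health check, or None if all are down."""
    for url in endpoints:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as response:
                if response.status == 200:
                    return url
        except OSError:
            continue  # unreachable or timed out - try the next site
    return None

active = first_healthy_endpoint(ENDPOINTS)
print("Routing traffic to:", active or "no site available - invoke the DR plan")
```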
It is important to understand how your CSP has built their cloud offering so you can understand the risk and then jointly – or independently – implement a strategy for disaster recovery. 5. Lock-in If the vision of cloud computing as a magical realm of endless computing power, of limitless potential in a place where the customer is king, is to be realised, then the customer must be able to port from one CSP to another with relative ease – or even be able to deploy seamlessly across more than one cloud. Consequently, it is important to understand the degree of portability and interoperability of your cloud computing service. Lock-in is greatest if you are using proprietary SaaS software which is delivered exclusively by its developer, and some CSPs will lock you into using their cloud service by insisting you use their proprietary software to establish your cloud computing environment. Others may allow you to export profiles and VMs at will. Lock-in may also arise in the guise of a lengthy minimum term on your contract. Some commentators suggest that it is best practice to plan to move at the end of your contract term in order to ensure that you aren’t susceptible to price hikes and service changes. Whether you follow this advice or not, it is important to compare vendors in order to select the service most appropriate to your requirements. 6. Deployment Model A cloud can be deployed as ‘public’ (multi-tenant), ‘private’ (single tenant) or a combination of the two (‘hybrid cloud’). The debate about the merits of each cloud deployment model tends to focus on security – itself a heavily debated topic in cloud computing. However, there is no reason why a public cloud is necessarily less secure – or less customisable – than a private cloud; it depends on how the cloud is built, how it is secured and what purpose it is serving. The key is to ensure that the cloud deployment at a minimum matches – or, ideally, exceeds – the expected service levels and security that you would have if operating these services internally. ### Cloud Security What is Cloud Security? There are two aspects to Cloud Security. Firstly, cloud-based security software such as email virus scanning, anti-spam services, internet web threat protection and user monitoring services. These solutions are often cloud based – or a hybrid of on-premise and cloud-based software. Secondly, there is a great deal of debate about the security of moving services to the cloud. This debate tends to concentrate on risk, data management and new models of federated security tools. I’m interested in cloud-based security solutions. What do I need to know? The first stop is Compare the Cloud’s cloud comparison tool – here we list many of the cloud-based security solutions available in the market. I’m interested in the debate about whether moving my IT environments to a cloud-based Infrastructure as a Service (IaaS) model or Platform as a Service (PaaS) model is secure. What do I need to know? Some people have expressed concern that moving to a cloud-based infrastructure model is necessarily less secure than owning, hosting and maintaining IT infrastructure in-house. However, this isn’t necessarily the case; the security of each model will depend on how each environment is managed. Some commentators have argued that fears about cloud security are more about a feeling of loss of control than based on solid security grounds: “Security has become a full-time job and requires a tremendous amount of expertise to do it right on-premises.
For all the fear of the cloud, the fact is companies are routinely hacked, and many never even know it. In reality, your on-premises systems are not more secure than the cloud.” Nevertheless, there are security risks inherent in any environment and the cloud is no exception. These need to be considered carefully and clear lines of responsibility drawn between the cloud service provider (CSP) and the customer. What security measures does my CSP need to have in place? Of course, security and data compliance requirements will vary depending on the industry you operate in and your business policies. It is important to speak with your CSP to work out a model which matches your requirements. The Cloud Security Alliance identifies fourteen areas that require consideration: Cloud Architecture; Governance and Enterprise Risk Management; Legal Contracts and Electronic Discovery; Compliance and Audit; Information Management and Data Security; Portability and Interoperability; Traditional Security, Business Continuity and Disaster Recovery; Data Centre Operations; Incident Response, Notification and Remediation; Application Security; Encryption and Key Management; Identity and Access Management; Virtualisation; and Security as a Service. ### Cloud Deployment ### CaaS What is Communication as a Service? Communication as a Service (CaaS) is a method of delivering communication services in a manner akin to Software as a Service delivery, but with some special considerations relating specifically to communications applications and the integration with proprietary communications systems. CaaS can include a range of communications services, including voice over IP (VoIP), instant messaging, message routing, call recording and video conferencing. Who is CaaS for? Many people are familiar with consumer CaaS solutions such as Skype, FaceTime and instant messaging tools such as Facebook Messenger. However, mainstream adoption of communications over the public internet has been relatively slow. It’s worth noting here that VoIP isn’t necessarily a public-internet solution; it could just as easily be used over dedicated circuits or a LAN/WAN. Business users have benefitted from the cost advantages of VoIP largely by deploying on-site IP-based PBX systems. These offer significant cost advantages by enabling the business to run the corporate telephony system on the same network as is used for data, thereby eliminating the need to install and operate separate networks for voice and data. However, these systems are expensive to purchase, configure and manage – putting them out of reach of most small-to-medium-sized businesses (SMBs). By eliminating the need for businesses to purchase and maintain such expensive communications hardware, CaaS puts much greater options for communication management in the hands of SMBs. What are the advantages of CaaS? The continued convergence of communications has blurred the line between software applications and communications applications. CaaS offers the potential to manage multiple services – voice, video, data – over multiple devices – landline telephone, mobile or smartphone, PC – in a controlled environment. It offers the greatest potential for small businesses – enabling the utilisation of VoIP, VPNs, PBX and unified communications without the need to invest in the upfront costs of the hardware or to employ the skilled professionals required to manage and maintain the systems.
Some argue “VoIP is an upgrade for your business… whether you need to open a new office on the other side of the country, adjust your business hours on the fly, or prioritise calls from your most important contacts, online telephony makes it possible to adjust how you work as you work.” Are there any risks involved in CaaS? The quality and the reliability of these solutions is of primary concern, given the mission critical nature of telephony and other communications services. What do I need to do before implementing Communications as a Service? It is important to ensure the administrative tools are in place to manage and monitor the system. Because communications applications are mission-critical applications, even greater effort must be made to ensure that the appropriate service level agreements (SLAs) and associated penalties in case of service breach are put in place both in terms of service uptime and security. What’s next? Check out our Cloud Builder page for additional tips about what to consider before moving software or services to the cloud. ### IaaS ### PaaS What is Platform as a Service? Platform as a Service (PaaS) is often referred to as ‘application hosting’. Users rent access to a platform of managed server hardware, network resources and operating systems, rather than procuring and managing it internally. This offers users the opportunity to run or create their own applications. PaaS includes a fully-maintained operating system (usually the PaaS subscriber will have a choice), which presents a more direct and obvious option for the seamless migration of both applications and data. For example, a business’ CRM system which currently runs locally on a Windows Server platform could easily be migrated onto a Windows PaaS solution; all you would need to do in a typical installation is to create access/ security roles and publish to your staff. Who is PaaS for? PaaS is ideal for businesses which are experiencing rapid growth, have sudden and dramatic spikes in computing requirements, don’t have the required skills in-house or simply prefer not to manage the IT infrastructure internally. By passing responsibility to the Cloud Service Provider (CSP) you can benefit from the facilities, resources and expertise of that provider. Since they are an expert in managing IT infrastructure and environments, and have invested heavily in data centres, data backup and failover facilities, network, servers and virtualisation technology, and their staff, facilities and resources are committed to the task of keeping their platform up-to-date in order to secure competitive advantage, your CSP will be in the best possible position to ensure and deliver maximum uptime and availability for your company’s IT operations. What are the advantages of PaaS? PaaS offers the financial advantage common to all cloud computing: essentially shifting expenditure on IT equipment from a capital expense to an operational expense. It also promises to deliver greater agility for the business since your allocation of computing resources is better able to grow (or shrink) according to the demands of your business. In addition, the business will no longer need to invest in the maintenance of hardware and operating software and providers promise ‘seamless and painless’ updates. Since PaaS is delivered over the internet, it is an effective tool for the centralisation of your computing resources, so that they are equally accessible for all areas of your business, irrespective of location. 
Since PaaS includes familiar operating system environments, it is also the most obvious option for server replication and hybrid solutions – and can form a highly effective and cost-efficient basis for backup and disaster recovery solutions.

Are there any risks involved in PaaS? Any provider worth their salt will have their own system in place for disaster recovery – but does it comply with your internal policies? We recommend taking time to ask and understand how your CSP approaches DR. Keep in mind that your disaster recovery planning should also include the recovery of your data in the unlikely, but possible, event that for any reason your PaaS provider becomes unable to provide the service. Using a PaaS service can also make it harder to demonstrate accountability and control over the protection of data – but it doesn’t have to. It is worth researching your PaaS provider’s server, data storage and security methodology and the options for third-party auditing for compliance. PaaS is delivered over the internet or, in some ‘private cloud’ cases, over your wide area network (WAN). In either case, you need to ensure that the connection to the service is sufficiently stable and reliable, and has sufficient bandwidth to support all business users.

What do I need to do before implementing Platform as a Service? Understand the security, data management and compliance policies and the disaster recovery strategies of the providers you are considering. It’s the same as outsourcing any service – you need to do your due diligence on the supplier to ensure they comply with your business policies. You also need to fully understand the performance you’ll be guaranteed and how that is backed up with a Service Level Agreement (SLA) which includes penalty clauses for failure. The demarcation of responsibilities must be clearly understood and contractually explicit. No less important is the level of support you can expect to receive – again, this must be governed by an SLA. This needs to match – or exceed – the service levels you would expect were you operating these services within your organisation.

### Artificial Intelligence

Artificial intelligence is frequently used to conjure up images of either cutting-edge advances in computer technology or a Terminator-style apocalypse, but what does it actually mean? Artificial intelligence, or AI, is used to denote when computers or other man-made machines display intelligent behaviour, in contrast to simply responding to instructions set by a human. One of the key tenets of artificial intelligence is being able to learn from previous situations and to react appropriately based on that prior knowledge. A high-profile example of a machine displaying AI is IBM’s Deep Blue supercomputer, which was able to defeat World Champion Garry Kasparov at a game of chess in 1997. During the games themselves, Deep Blue received no human input. A more everyday example of AI comes in the form of Apple’s personal assistant Siri, which evaluates previous queries and requests to return results that are personalised to the user. One of the leading tests to determine whether a machine is exhibiting “intelligence” is the Turing Test, devised by British mathematician Alan Turing. The Turing Test involves a human judge alongside a human and a computer participant.
If, after asking the participants a series of questions, the judge is unable to distinguish between the human and the computer, the computer is deemed to be “thinking.” The Turing Test does have its critics, but despite being devised more than 60 years ago it is still used as a benchmark for AI testing. Of course, before a computer is able to take the Turing Test or any other form of AI assessment, humans face the challenge of devising an intelligent machine in the first place. Reason, problem solving, knowledge and learning are all difficult functions to replicate, despite all being, for the most part, logic-based. When scientists then try to imbue machines with emotional intelligence or other human traits like impulse and creativity, the problems become even greater. AI research is a varied discipline, but successes have been made at replicating logical intelligence. Primarily this involves gathering information through human input or sensors, which the computer can then analyse before choosing an outcome. The previously mentioned Deep Blue is an example of this form of artificial intelligence. In fact, chess computers highlight that AI has a long way to go before it can fully replicate the thought processes of a human being. Instead, researchers often achieve better results by focusing on a specific area of intelligence or skill, such as playing chess. IBM’s supercomputer Watson used its four terabytes of disk storage and 16 terabytes of RAM to defeat two former contestants on the US quiz show Jeopardy!, but still struggled with shorter clues and was unable to respond to the other contestants’ answers. Currently, even the most advanced artificial intelligence machines are plagued by limitations. For now, artificial intelligence of the kind that has inspired literature and movies remains firmly rooted in science fiction, but that is not to discredit the research being carried out. Optimistic outlooks suggest that the development of AI could bring tremendous advances to mankind, and ultimately it may one day be able to replicate human intelligence, or perhaps even surpass it.

### Quantum Computing

Quantum computing can sound like a daunting term to digest, with all manner of complex scientific notions popping into one’s head. However, understanding the concept is not impossible, and could prove extremely useful considering the important role that quantum computing could play in the future of the technology industry. In digital computing, bits are units of information that can exist in only one of two values: either off or on, sometimes represented as either 0 or 1 – a binary digit. Complex combinations of these zeroes and ones make up the bytes and megabytes that are at the heart of the computers we use every day. In quantum computing, however, quantum particles, usually electrons, can exist as both the “0” and “1” state at the same time as a result of a physical phenomenon known as quantum superposition. In this scenario, units of information in the form of either a 0, a 1, or both are known as qubits. In practical terms, this promises far greater computational power than a digital computer. In transistor-based computing, all the different possible combinations of zeroes and ones must be processed one after another for a calculation to take place. In quantum computing, these calculations can occur simultaneously, providing much faster results.
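The toy Python sketch below (our own illustration, not a quantum program) hints at why this matters: describing just a handful of qubits already means keeping track of every possible bit pattern at once.

```python
# Toy illustration of why qubits differ from bits. A register of n classical
# bits is in exactly one of 2**n states; a register of n qubits is described
# by a vector of 2**n amplitudes, all present at once (superposition).
import math

def equal_superposition(n_qubits):
    """Return the state vector for n qubits in an equal superposition."""
    dim = 2 ** n_qubits
    amplitude = 1 / math.sqrt(dim)       # same weight for every basis state
    return [amplitude] * dim

state = equal_superposition(3)
print(len(state))                        # 8 amplitudes describe just 3 qubits
print(sum(a ** 2 for a in state))        # probabilities still sum to 1.0
```

Each extra qubit doubles the number of amplitudes a classical machine would have to track, which is one intuitive way of seeing where the promised speed-up comes from.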
An example of the difference in computing power between a digital computer and a quantum one can be used to highlight one of the risks that the latter poses to online security. The encryption used by the world’s financial systems cannot be cracked by traditional computers, even if they had until the end of the universe to work on it. Quantum computers, however, should be able to break these security protocols relatively quickly. Despite the potential benefits of quantum computers, they are unlikely to replace your laptop anytime soon. Controlling quantum particles is an extremely difficult feat, particularly as the number of qubits increases. Preventing them from interacting with each other and with the vessel containing them is crucial to preventing information loss, but is proving a challenge for scientists. These problems can only be overcome through a deeper understanding of complex scientific fields, meaning that the day when quantum computers become commonplace is some way off. Currently, there is only one quantum computer commercially available, the D-Wave Two, although even in this instance some critics have questioned whether the device qualifies as a fully fledged quantum computer. At the moment quantum computing remains largely theoretical, with few practical applications. However, just as the transistor replaced the vacuum tubes used by the earliest computers, quantum computers could one day become the default mode of technology, powering all manner of consumer and business devices.

### VoIP Services

Voice services, sometimes referred to as Voice over Internet Protocol or VoIP, are a way of sending telephone calls and other forms of audio communication over an Internet connection, as opposed to a traditional phone line. With voice services, audio is transmitted as digital data using the same network infrastructure as an online connection. Because they use the existing Internet network, voice services do not require a bespoke handset in the way traditional telephone calls do. Instead, any device with a microphone and speaker can be used. However, voice services also offer a number of additional features that set them apart from analogue telephone services. These can range from the relatively simple, like caller ID, call forwarding and three-way calling, to the more complex, such as contact lists. As voice services require an existing Internet connection, they use up Internet bandwidth in the same way that any other online service would. This means that businesses should consider whether their network infrastructure is robust enough before deciding to transition away from their phone line, as insufficient bandwidth can lead to low-quality calls. Voice services are frequently cited as an example of cloud telephony – communication services that use cloud technology, often provided by a third-party supplier. While businesses and individuals can use a service like Microsoft’s Skype to carry out VoIP calls, this is often inadequate for larger-scale operations. In this case, companies may decide to enlist the services of a cloud telephony provider who will host the infrastructure needed to fulfil their communication needs. Servers, software and network adapters will all be maintained by the cloud provider. Alongside additional features, cloud voice services also offer businesses increased flexibility. Employees are able to retain the same work phone number wherever they are located, simply by signing in to their VoIP account.
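Returning to the bandwidth point above, a rough sizing sketch can help decide whether a connection is up to the job. The per-call figure below is an assumption – roughly 100 kbps per concurrent call is a common rule of thumb, but the real number depends on the codec your provider uses – so treat it as a starting point for a conversation with your supplier, not a guarantee.

```python
# Back-of-the-envelope check of whether an office connection can carry its
# VoIP traffic. The per-call figure is an assumption: roughly 100 kbps per
# call (codec payload plus packet overhead) is a common rule of thumb, but
# the real number depends on the codec in use.

KBPS_PER_CALL = 100          # assumed bandwidth per concurrent call
HEADROOM = 0.3               # keep 30% of the link free for other traffic

def calls_supported(uplink_mbps: float) -> int:
    usable_kbps = uplink_mbps * 1000 * (1 - HEADROOM)
    return int(usable_kbps // KBPS_PER_CALL)

print(calls_supported(10))   # a 10 Mbps uplink -> roughly 70 concurrent calls
```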
Retaining the same number is a useful benefit for companies with a mobile workforce, as it means employees are easily contactable even when out of the office. Similarly, for firms that have customers, partners and clients located all over the world, cloud telephone services are likely to be much cheaper than a traditional phone line. International calls are often charged at a premium rate on analogue phone lines, but because voice services operate over the Internet, there are usually no additional charges compared to domestic calls. Cloud voice services also facilitate conference calls by compressing audio data to allow for more speakers, while adding additional extensions, should your business grow, is straightforward. There are some issues with voice services, namely that they are not as reliable as an old-fashioned phone line. If the Internet connection is lost or there is a power failure, the Internet voice service will become inactive. Traditional phone lines, which receive their power from the telephone wire itself, do not suffer from this problem. Despite this issue, many businesses are taking advantage of the benefits that cloud voice services bring. The added flexibility, coupled with lower costs, means that it is the preferred approach to communication technology for many organisations.

### Cloud Hosting

Cloud hosting is a way of hosting, or storing, a website and its respective data across multiple servers rather than a single server. Cloud hosting usually relies on virtual servers which are based on a network of various physical servers. A virtual server is a web server that shares its computing resources, and so is not dedicated entirely to a specific website or set of websites. In effect, this enables multiple virtual servers to reside on one physical computer. These virtual and physical servers are not located on-premise; rather, as a feature of cloud computing, they are accessed via an Internet connection. Cloud hosting allows websites to pull information and resources from several servers, usually located in different data centres in disparate parts of the world, as and when they need them. In this respect, cloud hosting is related to other forms of cloud technology in that businesses are using a service, rather than a product. Using cloud hosting, website resources are utilised only when necessary, improving server efficiency. Moreover, should there be an issue with one of the servers, another within the cloud hosting network will be able to take over, limiting the amount of downtime experienced. If a website is experiencing peak traffic, more computing resources from the “cloud” of servers can be allocated to it, helping to alleviate bandwidth issues. In fact, websites that use cloud hosting can sometimes remain undisrupted even when entire data centres are forced offline. By spreading resources across multiple data centres, companies achieve a much higher level of reliability. Businesses wishing to introduce resource-draining new features will also benefit from the cloud model, as upscaling is seamless. This approach can also prove more cost effective for firms, as they only use, and hence pay for, the resources that their website needs. Rather than running a dedicated server operating at half-capacity or less for the majority of the time, cloud hosting promises much greater efficiency. Cloud hosting usually falls under two categories: Infrastructure as a Service (IaaS) and Platform as a Service (PaaS).
In the former, clients are provided with virtualised hardware resources, including server space, IP addresses and bandwidth. Clients are then free to choose whichever software platform they wish to use to build their website or web application. If customers receive a PaaS package from their cloud hosting provider, however, they will also be given a software framework within which they can build their website. This may include an operating system and development tools. Cloud hosting contrasts with the single-server approach, where one computer acts as the dedicated server for a website and comes with a set amount of memory, hard disk space and bandwidth, all of which offer limited flexibility. However, some businesses may still want to use the dedicated server approach. If a website requires a large amount of resources, cloud hosting could actually work out more expensive than a single server. Organisations may also have security concerns when their website resources are spread across many different servers in various data centres, all of which may be working under different security protocols. With the prices surrounding cloud technology falling, the future of cloud hosting looks bright, and for businesses that require reliability above all else, it represents a worthwhile investment.

### CRM

Customer Relationship Management, or CRM, refers to the way in which a company manages its interactions with its existing and potential customers. More specifically, CRM is a particular type of software package used by enterprise firms to more efficiently organise and process aspects of their customer interactions, whether that is sales data, marketing insights or customer support. CRM may also enable a business to automate some features of its customer interactions, helping to streamline operations and improve efficiency. CRM software differs depending on the developer, but it will usually enable businesses to easily identify individual customers so that issues can be resolved more effectively, or so they can be targeted for future sales where relevant. This may take the form of a database of existing customers, relevant contact information and a history of their previous interactions with the company, such as past sales. In the modern world, customer interactions can take place via several different mediums, including phone calls, online shopping, apps and emails. CRM collates all this diverse information into a single customer profile to ensure that service is personal and up to date. As well as improving customer support efficiency, CRM is also used by companies to identify future sales. By analysing previous interactions and purchasing trends, CRM tools can automatically produce sales reports and help to identify opportunities to up-sell. Alongside automatically updating customer data, CRM may be able to automate other aspects of the business-customer relationship. Virtual agents may be part of the CRM portfolio, where artificial intelligence bots help deal with customer queries and issues. Often these virtual agents are given a human appearance and undergo rigorous testing to ensure that they can deal with a multitude of customer service tasks. If the automated aspects of CRM prove ineffective, they will transfer the customer to a human agent. However, CRM is not only relevant for customer support teams. Increasingly, businesses are embedding CRM software across the company as a whole, enabling teams from different parts of the organisation to collaborate more easily.
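The sketch below shows, in very simplified form, the idea of collating interactions from different channels under a single customer record. The field names are invented for illustration and are not any vendor’s schema.

```python
# A minimal sketch of the idea behind a CRM profile: interactions arriving
# from different channels are collated under a single customer record.
# Field names here are invented for illustration, not any vendor's schema.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Interaction:
    channel: str      # "phone", "email", "web", "app", ...
    summary: str
    timestamp: str

@dataclass
class CustomerProfile:
    customer_id: str
    name: str
    interactions: List[Interaction] = field(default_factory=list)

    def add(self, interaction: Interaction) -> None:
        self.interactions.append(interaction)

    def history_by_channel(self, channel: str) -> List[Interaction]:
        return [i for i in self.interactions if i.channel == channel]

profile = CustomerProfile("C-1042", "A. Customer")
profile.add(Interaction("email", "Asked about renewal pricing", "2015-07-01"))
profile.add(Interaction("phone", "Reported a billing issue", "2015-07-14"))
print(len(profile.history_by_channel("email")))   # 1
```

Real CRM platforms add deduplication, analytics and automation on top, but the core value is exactly this: one record per customer, whatever the channel.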
CRM is also a useful internal tool, helping businesses to assess how successful their partnerships are and where improvements can be made. Like many other software applications, CRM is usually hosted in the cloud to provide businesses with added mobility. Where companies have customers based worldwide, CRM can be hugely useful in allowing them to analyse trends across different areas, while having data available for all their customers globally. Using a cloud-based CRM also helps employees to remain productive even when they aren’t based in the office. CRM systems that are based in the cloud are often priced according to the number of people accessing the system and have few initial costs, making them an attractive tool for smaller businesses. They are also easily scalable should businesses achieve rapid or unexpected growth. Overall, CRM helps to cut down the amount of time companies spend on administration when it comes to the customer-facing side of their business, allowing them to focus on sales opportunities and growth areas. It is easy to see “the customer” as one large, homogenous whole, particularly when trying to make sense of the huge quantity of data now available. The aim of Customer Relationship Management software is to help businesses understand the individuals that make up that whole, to provide better results for customers and businesses alike.

### Internet of Things (IoT)

The Internet of Things may not be that well known amongst the public at the moment, but that is all set to change. The term, often abbreviated to IoT, refers to physical objects that are able to connect online, allowing them to send and receive information. Looking forward, it will bring about an increase in data transfers that do not require human interaction. While the vast majority of information passing across a network is actively created by a human being either pressing a button or uploading content, the Internet of Things means countless devices will gather and process data passively. While this definition includes smartphones, the Internet of Things is often used to describe objects that previously had no level of connectivity whatsoever. Already, companies are working on Internet-connected refrigerators, smart thermostats like those developed by Nest, and online-enabled transport infrastructure. The difference between an IoT device and a regular one can be subtle or distinct, but will likely affect the object’s functionality in some way. An IoT light bulb, for example, allows you to control it remotely via Wi-Fi wherever you are, while the aforementioned Nest thermostat will learn when you change your home’s temperature and devise an optimal heating plan. The Internet of Things, however, isn’t important simply for the number of gimmicky features it adds to everyday household objects. It is expected to have a profound impact upon the way that we live our lives, in much the same way that smartphones have affected our behaviour. Long term, IoT devices are expected to enable automation on a massive scale, affecting society in ways that go beyond employment figures. Self-driving cars and high-tech medical implants are just some of the IoT developments set to influence our lives in the not-so-distant future. However, not all IoT changes are expected to be positive, and to have a complete understanding of the Internet of Things one must also be aware of the disruptive impact it is predicted to have.
The increased number of wireless connections that the Internet of Things will initiate naturally gives hackers more avenues of attack. While no high-profile IoT hacks have yet been noted, they are often mentioned by security researchers as a significant risk, because the outcome of a cyberattack on a connected car, for example, could be far more damaging than any previously witnessed. Even without malicious intent from the outside, the Internet of Things is still likely to disrupt many businesses, placing added strain on company infrastructure and leading to new privacy issues surrounding the extra data being created and stored. For better or worse, the Internet of Things will ensure that the relationship between people, the Internet and everyday objects is unlikely to be the same again.

### Big Data

Big Data is a term that is heard a lot in technology circles, but understanding what it actually means is not always easy. Broadly, it refers to data sets that are so large that traditional attempts to process and analyse them are no longer applicable. The term has become more common because of the vast amounts of information now available as a result of digital technologies. GPS trackers on our smartphones, social media posts, website cookies and countless other digital footprints that we make each day all contribute towards the data deluge. As well as the huge increase in the amount of data now available, both the speed with which data is transferred and the variety of formats it appears in have also risen. Big Data is, therefore, a huge enabler for businesses and individuals, but also a significant challenge. For example, an enterprise firm may wish to know more about its customers in order to improve its level of service and gain a competitive edge. Big Data offers them this information, but obtaining the insight that they need is often easier said than done. Data can arrive from multiple sources, in a variety of formats, and is often completely unstructured. To put the size of this data into some form of context, companies may have to process information in the order of petabytes; one petabyte translates to 1,024 terabytes, or 1,048,576 gigabytes. However, it is impossible to define a minimum boundary for what separates “Big Data” from ordinary manageable data. In fact, to set such a definition would not only be arbitrary, but also short-sighted, as the amount of data that we define as “big” is only likely to grow in the future. Instead, businesses usually adopt the term Big Data when the information that they are trying to process with traditional database and query approaches takes in the region of days or weeks, rather than minutes. However, if companies are able to capture and analyse their huge data stores, the information gleaned has the potential to transform their business. Where are customers using our services most? Why are they choosing not to buy our product? How long are they using our website and what pages are they visiting? These questions and many more have their answers in Big Data – if firms know how to look for them. As a result, while “Big Data” as a term can refer to the data itself, it is also used to describe software that helps to analyse or store it. Third-party suppliers of these tools may use a variety of methods to process this data, ranging from machine learning and NoSQL databases to other forms of advanced analytics.
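Some rough arithmetic shows why the “days rather than minutes” threshold arrives so quickly at this scale. The scan rate below is an assumption picked purely for illustration, not a benchmark of any particular system.

```python
# Rough arithmetic behind the "days rather than minutes" point. Assuming a
# single machine can scan data at 200 MB/s (an illustrative figure), a full
# pass over a petabyte takes days; spread across many machines it shrinks
# to minutes or hours. All numbers are assumptions for scale, not benchmarks.

PETABYTE_GB = 1_048_576          # 1 PB = 1,024 TB = 1,048,576 GB
SCAN_RATE_GB_PER_S = 0.2         # assumed single-node scan rate (200 MB/s)

def scan_time_hours(num_machines: int) -> float:
    seconds = PETABYTE_GB / (SCAN_RATE_GB_PER_S * num_machines)
    return seconds / 3600

print(f"1 machine:     {scan_time_hours(1):.0f} hours")    # roughly 60 days
print(f"1000 machines: {scan_time_hours(1000):.1f} hours")
```

This is exactly the intuition behind distributed Big Data tooling: the data set does not shrink, so the work has to be spread out instead.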
Increasingly, Big Data companies are focusing on predictive analytics, that is, how large data sets can be used to predict future behaviour. The more data companies have at their disposal, the more accurately they can model what is likely to happen, which has huge ramifications for industries ranging from retail to meteorology. With the adoption of wearables and the Internet of Things predicted to take off in the coming years, the likelihood is that, whether it’s past, present or future trends you’re looking at, Big Data is only going to get bigger.

### SaaS

Software as a Service, sometimes abbreviated to SaaS, is a method of distribution whereby a company or individual accesses software via an Internet connection from an external supplier. SaaS is most easily understood in contrast with the traditional software distribution model, in which applications are stored on a local hard drive or server after being purchased either in a physical format such as CD-ROM or via a download. Instead, Software as a Service utilises an external vendor who supplies the customer with the software package as and when they need it over a network connection. SaaS is often described as renting software rather than owning it, due to its pricing structure. In the past, businesses would pay a one-off fee to own a piece of software and would be free to use it for as long as they wished, or more likely, as long as it remained suitable for their business and customers. Software as a Service, however, works on the premise of organisations paying a subscription fee to an external supplier, with fees paid on a monthly or yearly rolling contract. This, of course, drastically alters the relationship between developer and customer when compared with the traditional approach to software. Previously, once an organisation or individual had purchased a piece of software, that was, in effect, the end of the partnership with the developer, unless a support fee had been paid. With SaaS it is just the beginning. Renting software on a subscription basis means that the developer or supplier of the software will continue to host, maintain and update it. As a result, SaaS is also a way for businesses to gain access to developer expertise, as the administration and servicing of the software will all be handled externally. Software as a Service is also part of a broader technology movement known as cloud computing. Like SaaS, the cloud facilitates the movement of resources away from local storage towards a more fluid distribution model involving a network connection. Alongside SaaS, businesses may also choose to utilise Infrastructure as a Service (IaaS), Platform as a Service (PaaS) and several others, all of which are provided by the cloud. An example of SaaS which you may be familiar with is Microsoft Office Online. In this instance, customers do not own Microsoft Word, Excel or PowerPoint as they would if they had purchased them outright, but instead are able to access lightweight versions of the software over the Internet. Software as a Service offers a number of advantages over on-premise software, most notably the option to access tools wherever and whenever you need them. Providing an Internet connection is present, employees can access the resources they need at all times, enabling greater mobility than with traditional distribution models. It also enables easier collaboration because, with a third party handling updates, it is much easier to ensure that all users are working on the same version of the software.
Companies may also find that SaaS is much more cost effective, as it foregoes expensive one-off initial costs, and adding more users is simple and shouldn’t disrupt existing tasks. That being said, there are criticisms of SaaS surrounding the lack of control it offers users when compared with the traditional model, and concerns over data security when information is being transferred over an Internet connection rather than a local network. Depending on the size of your business and the tools it requires, SaaS can offer a number of benefits and has already made a sizeable impact upon many enterprise firms.

### Social Media - A beginner’s guide

By David Terrar, Founder and CXO of Agile Elephant

This is a guide to get you started on the road to becoming social media savvy. If you are here on Compare the Cloud then you are interested in leading-edge technology from either the buy side or the sell side. You might be thinking in terms of your personal brand, or your company’s profile. Whatever your focus, one thing is certain – social media is part of the fabric of 21st Century business. You need to master it, or it will master you. Let’s get you started!

Why Bother? Over the last 20 years the world has steadily been going digital, with the most significant changes in terms of cloud, social and mobile technology happening within just the last 10. Business in general, and marketing in particular, has been transformed. More than 4 out of 5 businesses of all sizes, from the Fortune 500 to SMEs, are using social media in their communications and marketing today. From books, to cars, to our next IT investment, as buyers we now do a significant amount of research via social media – we trust recommendations from our peers more than direct marketing campaigns – before we even talk to the sales person in the shop or at the potential service provider. In turn, if you are doing the selling and you use social media to research your prospective customers, then you are likely to be far more successful than your colleagues that don’t – in fact 85% of customers expect businesses to be active on social media. And what about hashtags? Just about every event, advert and TV programme now has a hashtag and social media conversations going on around them. The shift to social media is happening whether you like it or not, whether you join in or not. In today’s ultra-competitive business environment social media is a key tool you need to be using to connect, to win business, to differentiate yourself and differentiate your company.

What does it give me? For your company it can help you:

- Gain more exposure
- Find prospects and leads
- Increase website traffic
- Find new business partners
- Improve search engine rankings
- Reduce marketing expenses
- Close more business
- Improve brand awareness
- Add value to your reputation

For you personally it can help you gain:

- Credibility
- Recognition
- Influence in your market sector or area of expertise
- Valuable intelligence on your prospects or competitors
- Potential clients
- Rewarding partnerships
- New opportunities

Social Media Strategy in three steps

Marketing has changed forever. We’ve moved from a world of a few TV channels, to a world of hundreds. From one-way TV, print and billboard broadcast-style advertising to a two-way conversation across multiple channels using many different types of technology.
But the key to this new world was expressed in a seminal online book from 1999 called the Cluetrain Manifesto, which explained: “All markets are conversations.” That simple phrase points to how you can be successful with social media. You have to find where your customers are hanging out, go there and join the conversation. We put it into a three-phase strategy:

- Find your community
- Add value (but be authentic!)
- Make it part of your workflow (and make use of your tribe)

Find your community (or what channels should I use?) Where does your community hang out online? There will be many answers to this. It will depend on the type of product or service, your particular industry sector, and the specific audience you want to address. There will be standard social media channels like Twitter, Facebook, LinkedIn and YouTube to address, but there may be specific forums and bulletin boards you need to consider too. For example, suppose you are selling services into the small business space in the UK. If so, you can’t ignore a traditional bulletin-board-style forum called UK Business Forums – you have to be there answering questions and joining in discussions.

Add value (but be authentic) The key to social media success is in adding value to the conversation. Too many companies use an old-style marketing methodology and simply broadcast their standard messages on these new channels. Press releases online – no added value and no personality. That doesn’t work. What you need to do is listen to your audience, join in the conversation and add value by answering questions or by sharing good quality, relevant content. Answering their questions will demonstrate your expertise in the subject. Sharing good quality content will add value, but it isn’t easy. It takes time and effort to find good sources and then share – ideally with some added commentary – but manage this correctly and you will stand out from the crowd. As well as the content, you need to share some of yourself too – people are interested in authentic voices and authentic stories. Storytelling is a powerful component in any marketing, but it becomes even more useful in the social media world of near-instant communication and rapid feedback. As well as the content you find, you need to create content yourself, and that’s hard for everyone and every organisation. You and your company should consider starting a blog if you haven’t already. It’s the single most significant thing you can do to improve your success in social media and online.

Make it part of your workflow (and make use of your tribe) To be a social media success means working at it, probably every day, certainly a few times a week. Creating a strategy is important and ensuring that you stick to it is essential – it should become part of your natural workflow. There is a huge selection of tools – some free, and some where “going pro” is useful and modestly priced – which can help you make the most of your time, track and monitor the success of your strategies and automate the process too. These tools help you search out conversations, follow particular topics or groups, find good content, and make it easy to retweet and automate some of the process of sharing and replying for you. On top of making it a habit, don’t forget to enlist the help of your tribe. That might be colleagues in your company, or other experts and commentators in your field, your customers or your friends.
When you post a good tweet, Facebook update or LinkedIn post, tip off your tribe to share, retweet or comment. The extra activity will help you get noticed, gain recognition and win more followers.

What channels? As we said in finding your community, you need to think beyond the standard social media channels and recognise there are other online spaces, forums, sites and bulletin boards that might be extremely important for your industry sector or topic. But what of the standard channels – should I be there?

- Twitter – almost certainly
- Facebook – possibly, and particularly important for retail and certain audience types
- LinkedIn – it’s the professional’s network, and you certainly need to be there as a person and probably as a company
- YouTube – video is a powerful medium, and with today’s smartphones making production easier it is certainly worth considering
- A blog – a significant investment in time, but definitely worth it
- The rest – Google+, Pinterest, Instagram, Flickr, Xing, Foursquare, Tumblr, Vine, etc. If you are just starting your social media journey you probably don’t need to consider these yet, but for certain topics and certain types of business they will become useful and relevant at some point.

You also need to watch for new trends. Periscope is a new service for instant live streaming, publishing to your Twitter timeline and followers while simultaneously recording the stream so you can publish it on YouTube. People are experimenting with it today, but within a year it might become significant. Keep an eye open for these new networks, as things can change fast.

Creating content Whether it is the 140 characters of a tweet or several hundred words of a blog post, you need good, compelling content. The “content marketing” topic has been steadily gaining in importance, both because it is a primary feed to your social media strategy and because of the continual changes in Google’s search algorithm. Now, more than ever, fresh, new content is exactly what you need to improve your social media, blog or website’s search ranking.

Create good headlines! The world is awash with digital content. If you want your signal to reach beyond the noise then you need a great headline to get noticed. Learn the rules from traditional journalism, the tabloids, and more than 10 years of experience of the bloggers that have gone before you. Numbers always help (as in “101 killer ideas for your headline”). Lists and how-to articles get read. Controversy is good. Secrets are great. Are they the best, or the worst? You get the idea – invest time in a good headline, and people are more likely to follow your link and read.

Use visuals! There is no doubt that your tweet or your article is more likely to be read if it comes with a strong image or a cool diagram. It takes more effort, but it’s worth it. Video can be powerful too.

Create a good profile For every different social network, you need to understand the basics and set up a good profile. You’ll need a professional photograph as your avatar, a good description with keywords that explain your expertise, and links to your blog or website. For some you will need a header image or background. Choose carefully – these are your five-second attention grabbers.

Get help! There is plenty of good advice online to help. You can find help at HubSpot, Copyblogger or Jeff Bullas. Or you can come to Agile Elephant for help and advice.
We provide a range of social media training, workshops and services to help you educate your team, define your strategy, set up and support your social media presence and run your various social media channels. Contact us to find out more.

### The CIO: from gatekeeper to enabler

New and advanced technology has significantly transformed the way enterprises operate. Business units, particularly software and web development teams, but also marketing, HR and finance, want to take advantage of the new generation of cloud services, meaning that an interest in cloud technology is no longer solely the domain of the IT team. The consequence of this is that every business unit leader has their own IT solution in mind, rather than waiting for the CIO to deliver against a set of requirements. This is leading to the CIO and IT department losing the control they are expected to have, and then having to deal with the pitfalls that come from having systems and data outside of the company’s infrastructure. Finding a solution that meets the needs of all stakeholders is one of the greatest challenges facing enterprises today, and one that will demand that the role of the CIO and IT department changes forever. The convergence of cloud computing and business-critical applications is having a profound effect and is perhaps the most transformative technology trend affecting today’s workplace. Employees want instant access to their tools from any location, at any time, and so they increasingly want to use cloud-based solutions. These cloud systems offer unlimited scalability of storage and data-processing capabilities, helping business-critical applications to run more efficiently than ever before. Businesses are consequently open to a wide range of growth opportunities, being more able to respond to dynamic market conditions and competition, serve new geographies, and rapidly develop new products and services. Despite the benefits that cloud computing offers the business, the IT department is understandably nervous about allowing business units access to the wide range of self-service solutions they need. Due to budget limitations and corporate policies, the IT department ends up acting as a gatekeeper and restricting the services to which employees can have access. As a result, employees are procuring their own solutions in order to meet their specific business requirements without waiting for the IT department. According to Centrify, more than two-thirds of organisations admit that unauthorised cloud applications are being implemented without IT’s knowledge or involvement. This “shadow IT” could pose a significant threat to an organisation’s information security and availability, potentially impacting its revenue. According to Gartner, 35% of organisations’ technology budget will be spent outside of the IT department by 2020. Shadow IT will continue to grow unless business units and departments are provided with the flexible tools they seek as part of the services IT can offer. Each business unit is faced with differing technological requirements. A software team building an application to sell in the market usually requires a set-up of several servers, storage and networking, i.e. ‘QA environments’.
Software development teams can greatly benefit from the agility of creating instances in a short space of time, the efficiency of not having to manage and maintain the infrastructure, and paying only for the infrastructure they need. Web content teams also benefit greatly from the cloud. It opens up resources from multiple internet-connected devices with lower barriers to entry. The cloud gives them access to hosted applications and data along with cloud-based development services, allowing content teams to create web applications that have remote access to data and services like never before. Cloud-based developments allow developers to build and host web apps on the company’s own cloud server, speeding up the development and deployment processes. The major challenge facing the IT department is to regain control and enable employees to work flexibly with applications that meet their requirements, whilst also making sure the company’s data is secure and that the CIO can deliver the compliance and reporting requirements demanded. In order to do so, the CIO and the IT department need to bridge the gap between IT’s need for control and business units’ need for flexibility by offering flexible and dynamic self-service solutions. By doing so, IT becomes an enabler of business change rather than a barrier to it, and by choosing the best hybrid cloud platform to offer both on-premise and cloud solutions, their organisations will avoid the pitfalls which can be created by using a variety of different systems and cloud offerings within a single organisation. CIOs need to understand that the demands placed on them by business units striving to become stronger, more innovative and more competitive require this service-based approach. This in turn will position the CIO as a business enabler, supporting greater efficiency and innovation to fuel growth and increased revenue, while also future-proofing the organisation’s technology infrastructure to maintain a competitive edge. CIOs must immediately turn their attention to future-proofing their business and creating a unified vision by choosing an appropriate hybrid cloud infrastructure that meets their business’ requirements, before the business beats them to it. They need to embrace shadow IT within the enterprise rather than trying to shut it down. As CIOs continue to be pulled in different directions by employees, cloud providers and customers, it’s clear that a single cloud strategy is no longer sufficient to meet a variety of needs. Hybrid cloud - combining public cloud infrastructure and external cloud services with private cloud deployments and on-premise IT systems - can provide flexibility, structure and security. It enables the CIO to gain control over the IT environment, assuring data governance, while also enabling the various areas of the business to evolve with the right technology to support them. Now is the time for the CIO to shake off the gatekeeper reputation and step confidently into the role of ultimate business enabler, with hybrid cloud facilitating this evolution.

### Analytics

Analytics refers to the examination of data in order to discover useful patterns and insights. It differs from analysis in that it is heavily focused on large data sets, involving both statistics and mathematics, and may also use quantitative analysis and predictive modelling. Individual analysis is often focused on processes and functions and requires smaller-scale computational resources.
Today, analytics is a valuable tool for many businesses, specifically those that use digital technologies. Business analytics utilises the vast amounts of data available in order to form insights that could prove beneficial across all aspects of an organisation’s output, such as sales, software development or customer support. As data comes from a variety of sources, including smartphones, websites, emails, telephone calls and many other mediums, businesses can struggle not just with the amount of information that needs to be processed, but also with the form and structure that it takes. As a result, many companies are investing heavily in business analytics software and expertise in order to gain insights that will, hopefully, give their operations a competitive edge. These tools will automatically collate and format the data being collected by a business, whether it originates internally or externally. Then, once businesses have an understanding of the kind of insights that they are pursuing, they can run the data through their analytics platform. Analytics has potential uses across a broad range of industries, for example, the transportation sector. Every time a commuter uses their smart travel card, a piece of data is created. When this is combined with bus or train timetables, GPS tracking on mobile phones, and other transport influencers such as the weather, it adds up to a huge amount of unstructured data for local governments to process. Analytics software is used to make sense of these disparate data sets, so that useful insights can be gathered, such as where transport bottlenecks and delays are likely to occur. Another prominent aspect of analytics is using data to predict what is going to happen in the future, not simply making sense of the past. Predictive analytics uses the data that is available and runs a series of statistical procedures, including data mining and machine learning, to extrapolate potential trends. The possibility of being able to predict future events before they occur is one of the major reasons that businesses are investing heavily in their analytics and Big Data programmes. If an organisation is able to foresee a trend before its competitor, the potential payoff could be considerable. Other forms of analytics include web analytics, which specifically refers to websites and the monitoring of online traffic, and retail analytics, focusing on sales figures and customer data. In all of the above examples, the way that analytics tools display their outputs is crucial. These usually take some form of data visualisation, whether it is a chart or graph, but often businesses are able to customise their dashboard and data outputs in order to glean the insights that are important to them. Due to the vast computational power required for analytics on a large scale, particularly when Big Data is involved, companies may decide to host their analytics solutions in the cloud for enhanced efficiency. By purchasing analytics from a third-party service provider, organisations can alleviate some of the pressure on their in-house resources. Digital technologies are now a part of almost every business and can no longer be relegated to the IT department. A huge amount of data is emerging as a result, and analytics is the key to making sense of that data.
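As a minimal illustration of the predictive idea, the sketch below fits a straight-line trend to a made-up series of monthly website visits and extrapolates the next value. Real predictive analytics platforms use far richer models and far more data; this simply shows the principle of projecting forward from what has already happened.

```python
# A minimal sketch of the predictive idea: fit a straight-line trend to past
# observations and extrapolate the next value. The data below is invented
# purely for illustration.

def fit_line(values):
    """Ordinary least-squares fit of y = a + b*x for x = 0, 1, 2, ..."""
    n = len(values)
    xs = range(n)
    mean_x = sum(xs) / n
    mean_y = sum(values) / n
    b = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, values)) / \
        sum((x - mean_x) ** 2 for x in xs)
    a = mean_y - b * mean_x
    return a, b

monthly_site_visits = [1200, 1350, 1420, 1600, 1710, 1880]  # made-up history
a, b = fit_line(monthly_site_visits)
next_month = a + b * len(monthly_site_visits)
print(f"Projected visits next month: {next_month:.0f}")
```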
### The (cloud) empire strikes back

Last month London was held to ransom by the train strikes, and the word on the street (the paper I read over someone’s shoulder on my commute) is that there will be another full strike on the 5th of August. Train strikes make people, myself included, raucously angry at the world. But I found myself questioning this stance during the last strike. We have the cloud, so why are we all so mad? I chose not to try and commute during the last strike - because I didn’t have to, and you probably didn’t have to either. The people impacted worst by tube strikes are the doctors and nurses who couldn’t make it to their shifts on time - potentially putting lives in danger. But for the rest of us, with admittedly non-essential careers, I have to ask again, why were we so mad? We have the internet. We have the cloud. There is very little of my work that I cannot do from my home computer, a glorious advantage during strike action. Consumerisation of IT means that you probably have a very similar set of kit at home to what is in your office. Most companies now have cloud functionality to the point where we can, at a push, work remotely at a high level. Admittedly the July 9th strike came with very short notice, and working out of the office can require some preparation. Ahead of the proposed strike on August 5th, instead of getting worked up about your commute - get working now to prepare yourself. Most of you will already have access to emails, presumably on your phone or accessible through your home browser. File sharing services like Dropbox and Google Docs mean that you can continue to co-work on documents with your colleagues - and there is always FaceTime and Skype if you really need to convey something face to face. Conference calls can be dialled into remotely, and even if your house has awful reception, you have no excuse for missing a call thanks to VoIP services. Tube strikes don’t have to mean a complete stop-work, or that you have to be crammed into someone’s armpit for 2 hours on a replacement bus that smells like it probably should have been retired a number of years ago. If you are going to avoid the hassle of travelling during the next strike, be smart about your preparation. Check the security of your home connection, make sure your antivirus is up to date, and if you’re in doubt about anything talk to your IT department. The cloud enables a free working environment. There are situations which you cannot get around sometimes, but if it is as simple as rescheduling a few meetings - why not work remotely on August 5th? Save yourself some time and stress, just remember to say a little thank you to the cloud gods.

### Dean Chapman, Fedr8 - Live from Cloud World Forum, London

Interview with Dean Chapman from Fedr8 as part of our live Afternoon Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 25/06/2015.

### DataCentred joins the Open Invention Network

Membership demonstrates commitment to open source and to protecting users of the Linux operating system against software patent aggression. DataCentred, the UK’s leading provider of open-source computer processing and storage solutions, has joined the Open Invention Network, the world’s largest organisation created to promote patent collaboration and non-aggression in support of Linux.
The decision further demonstrates DataCentred’s commitment to Linux and the wider open source model. DataCentred joins a global community dedicated to promoting patent collaboration between its members and ensuring developers, distributors and end users of Linux are protected against Patent Assertion Entities and other attacks from patent holders. As an Open Invention Network licensee, DataCentred commits to not using its own patent portfolio against Linux and is in turn protected against patent aggression. All OIN licensees gain royalty-free access to OIN’s intellectual property. Access to Linux’s core technology patents will further support the development of DataCentred’s open source cloud platform. DataCentred’s OpenStack compute and Ceph storage cloud platform runs entirely on Linux, and joining the OIN gives DataCentred access to the intellectual property which will allow the company to innovate and grow. Open source technologies are the foundation upon which DataCentred is built and have already facilitated a number of recent innovations. Earlier this month DataCentred announced the general availability of its OpenStack public cloud platform running on ARM AArch64 processors. The commercialisation of this world-first innovation is set to significantly reduce data centre power usage and drive down operational costs, passing on savings and environmental benefits to DataCentred’s customers. The Open Invention Network includes over 1,500 licensee members globally, and has strong industry support with backing from companies including Google, IBM, NEC, Philips, Red Hat, Sony and SUSE. Matt Jarvis, Head of Cloud Computing at DataCentred, said: “The Open Invention Network is completely aligned with DataCentred’s commitment to both Linux and the principles of open source. Linux and open source are the future of the software industry and supporting patent non-aggression is crucial if we want to encourage collaboration, innovation and growth amongst developers, distributors and users. As we enter the era of cloud, joining the Open Invention Network demonstrates DataCentred’s commitment to building a secure, creative, shared open source platform for the future.”

### Arrow ECS Event: IBM Jam on Cams

Compare the Cloud is blogging live from the Arrow ECS event ‘IBM Jam on Cams’ at the Crystal in London. Speakers include television presenter and author Kate Russell (a technology, gaming and Internet expert famous for regular appearances on BBC World News and BBC Click), Jacqueline Davey, VP, Enterprise Sales and MidMarket UKI, IBM, and Helen Kelisky, VP, Cloud, IBM UK.

### Station to Station – Cloud DR options and how to make the most of them

"Drink to the men who protect you and I." - David Bowie

Disaster Recovery planning is one of those necessary IT investments that cloud computing should make easier for everyone. Rather than automatically having to worry about second-site leasing and multiple hardware assets, cloud DR services can take these pain points away while also reducing the cost of the service. However, not all cloud DR options are created equal, and decisions made around private cloud installs can affect your options. So how can organisations make the best use of cloud DR to get their data from their own infrastructure and on to the right platform? How can we go from one cloud station to another?
There are a range of cloud platforms out there that have been adopted by Cloud Service Providers (CSPs) large and small. The dominant ones are VMware and Microsoft on the proprietary side, while open source options like Xen and OpenStack have also been used to build and run clouds. When it comes to DR planning and hosting your critical workloads with a CSP, their choices will affect how well they can support your IT and the potential recovery times that can be achieved. If you run on VMware and your preferred partner cloud is based on Hyper-V, then the options will be more limited compared to an “all-VMware” approach. So does this mean that your choice of platform will limit your options when it comes to working with partners? Not necessarily. There are several routes to using cloud for DR and, depending on your existing IT, these can be mixed and matched. Here is an overview of the options that are open to you:

- Work with the same – If your company has committed to a specific virtualisation platform, then working with a CSP on the same platform should be a simple option. However, this can rely on you being on the same version for compatibility, so it is worth aligning any update plans so that compatibility problems don’t arise later.
- Match your platforms – For larger organisations, or ones that run a multitude of different platforms, working across multiple platforms may be more difficult. However, it may be possible to find a CSP that can support each of the platforms that are in place as well. For example, if you have both physical and virtual servers in place, then these can be supported in a “like with like” manner.
- Mix it up – An alternative approach for virtual platforms is to protect machines at the virtual machine guest level. This involves creating copies of each machine and then replicating data to secondary copies at the CSP site. As this works at the guest level, it’s possible to protect across different platforms, such as from VMware virtual machines to copies based on Microsoft Hyper-V.
- Go multi-cloud – For those already using a cloud infrastructure, running across several cloud providers can be an option where meeting regulations on protection or spreading across multiple geographic locations are concerned. Data can be taken from one cloud and then sent on to another, or sent from the primary site to two separate cloud providers.

However you decide to look at cloud DR, there are plenty of options available to you. Protecting your workloads across all the different IT hardware, storage and software platforms is possible using the right approach to replicating your data. By understanding how existing infrastructure decisions affect DR planning, it’s possible to put better plans in place for the future. In the words of David Bowie, “Drink to the men who protect you and I.”

### Banking leaders turning to cloud to leapfrog the competition

Why Cloud? Optimisers, innovators and disruptors rely on it to leapfrog the competition. Regulatory and geopolitical risks and complexity are multiplying. Bankers need agility. Skilled non-bank competitors are moving very quickly. Banks must move much faster. Cloud is driving digital banking’s business-model transformation and revenue, not just cost savings. Dramatic changes are creating ripples across the financial services industry which require institutions to adopt new approaches to maximise profitability and remain relevant.
The pressures on margins, emboldened customers and increasing compliance demands are driving the disaggregation of the vertically integrated model which predominated in the industry until recently. [easy-tweet tweet="Best Practice: Think big, start small and drive pilot projects quickly for rapid ecosystem domination. "] The prevalence of smart devices and easily accessible investment information has increasingly placed consumers in the driver’s seat to create banking relationships with a variety of providers outside the standard list of banks that customers think of today. In response, banks are introducing cloud-based solutions that offer integrated risk management, predictive real-time analytics, core banking transformation, mobile money systems and more. Most are relying on private clouds and hybrid clouds to help drive efficiency, standardisation and best practices, and to retain greater customisation and control than public clouds would permit. Based on the success of these cloud-based solutions, financial services firms are turning to the cloud even more in their search to find a dynamic platform to develop, test and offer the innovative services being demanded by consumers.
Speed Matters - Moving from 45 days to 20 minutes and 60% faster
For example, a top North American financial services organisation wanted to use cloud to optimise its business and dramatically reduce the development cycles for the company’s more than 20,000 internal application developers, who were typically forced to wait up to 45 days for server resources to be provisioned. The bank built an internal cloud to enable self-service requests, automated provisioning and internal chargeback capabilities, boosting utilisation rates and improving efficiencies for the developers. This cloud-based approach allowed them to slash server provisioning times from 45 days to less than 20 minutes. The result was faster development cycles and new enhancements reaching the hands of customers more quickly. Cloud is driving business model transformation, not just cost savings. Some innovators are using the cloud to understand their customers better, extend their value proposition or change their role within a market. A global European bank executes a massive volume of transactions daily, with millions of customer interactions carried out through more than 10 distinct channels, from ATMs and mobile devices to online retail banking. Cloud-based analytics are helping this bank gain new insight into its channels and get results 60 percent faster. Capturing, reporting on and analysing data relating to each transaction is critical to identifying customer behaviour patterns, but the bank didn’t have a standard process in place. By creating a cloud-based centralised hub for processing and analysing traffic data across all channels, this bank now analyses several thousand log files per hour. Previously it took five days just to collect all the report data. They now have a timely count of how many customers are using each channel and how they’re using it. The shift to cloud computing is seen by many in the financial industry as the key to unlocking competitive advantages. This view has been reinforced by the institutions that have already embraced the cloud and surpassed the competition thanks to their ability to rapidly introduce new services. Financial institutions are also unique in that - unlike many other industries - the high level of industry standardisation in payments and securities processing eases their migration to the cloud.
An international bank that saw cloud as a major disruptive force needed to create a consolidated risk strategy to adapt to rapidly-changing regulatory and compliance requirements. By implementing a cloud-based solution with easy access to more than 11,000 operational risk loss events, the bank was able to accelerate its regulatory compliance, while driving process and control improvements. Cloud is a big opportunity. But to make it a game-changer for their business, banks need to first think big about what they can do with cloud and understand how cloud fits into their strategy, ecosystem and roadmap. Whether they start small with a specific pilot application or business process, or dive into the cloud feet first, speed is critical in today’s competitive marketplace. The bottom line is that the banking industry faces key challenges in this new era of demanding, digitally-connected consumers, and these are further magnified by a changing economic and regulatory landscape. Dramatic forces across the industry require new approaches, and cloud computing is already delivering game-changing results.
### Edward Snowden to Keynote at IP Expo
Edward Snowden joins keynote line-up at IP EXPO Europe 2015 to discuss the truth about privacy
Renowned former NSA employee Edward Snowden will deliver a keynote speech this year at Europe’s number one IT event, IP EXPO Europe 2015, taking place at London’s ExCel. Joining the event live via satellite on Wednesday 7th October, Snowden will share his views on the implications of national cyber security today. [easy-tweet tweet="Edward Snowden will be keynoting LIVE via satellite at @IPEXPO" user="comparethecloud" hashtags="snowdenspeaks"] Famous for his ‘whistleblowing’ against the NSA in 2013, which has since fuelled thousands of debates over mass surveillance, government secrecy and national security, Snowden will be interviewed live by renowned journalist and broadcaster, Andrew Neil, and will speak out on his views about the current state of cybersecurity worldwide. He is expected to discuss the truth about our privacy on the Web and the implications of recent major security breaches on the future of cyber and national security. Bradley Maule-finch, IP EXPO Europe’s Director of Strategy, says: “Our security is increasingly under threat and, following recent events like the Hacking Team breach, there are fresh concerns about how safe we really are on the Web. It’s a global issue, one which was brought to the forefront of the news agenda by Edward Snowden in 2013. For that reason, we are very excited to welcome him to this year’s IP EXPO Europe and hear his views on the matter”. Snowden is also expected to give his views on the Government’s Investigatory Powers Bill, or ‘Snoopers’ Charter’, which calls for ISPs and mobile network operators to keep records of individual users’ browsing activity over a 12-month period. Earlier this year, the Prime Minister announced a further extension to the bill which will strengthen powers for security services to intercept the content of communication. Edward Snowden’s keynote will take place at 3pm on Wednesday 7th October, in the IP EXPO Europe keynote theatre, ExCel London. For further information or to register FREE, visit www.ipexpoeurope.com or search for #SnowdenSpeaks on Twitter.
### Electrifying the cloud in Small Business
Businesses are flocking to the cloud.
From start-ups looking to take advantage of the agility and savings offered by the rapidly expanding world of public cloud services to multinational corporations implementing their own private infrastructures based on the principles and technologies of cloud computing, there’s no doubt that we’ve entered a new era of business computing. This may all seem like a relatively recent development, but the reality is that cloud computing has been in the works for decades. While a history lesson may seem irrelevant when talking about something that’s moving forward with such force, the knowledge will show us where it’s headed in the future. [easy-tweet tweet="The development of cloud computing is strikingly similar to the expansion of electrification. " user="comparethecloud"]
Looking Back to Electrification
Today it may seem as though the electrification of the world occurred overnight, but in truth it took decades for energy to become a publicly available utility. Even up into the early 20th century, businesses that relied upon electricity often had no choice but to generate their own. As efficiency improved, distribution systems were eventually able to carry electricity from large power plants to paying customers; however, even these were relatively primitive throughout the 1920s. Power was only available during the hours of peak demand, and was mostly used for lighting during a few hours of the early morning and late evening. As the means of generating power improved and its distribution systems were refined, the price of electricity fell. After decades of generating their own power, businesses finally found it more economical to pay for power from the grid than to run their own, isolated systems.
Electrifying the Cloud
The development of cloud computing is strikingly similar to the expansion of electrification. Businesses began implementing this type of technology by creating and maintaining their own network infrastructures. The software used within these networks was either developed in-house or licensed – for no small fee – from third-party developers. The development of such networks led to the principles upon which contemporary cloud computing is based, and the beginnings of centralised computing resources accessible by relatively simple clients emerged as early as the 1950s. Indeed, the idea of a worldwide – then dubbed ‘intergalactic’ – network of computers was first introduced by a man by the name of J.C.R. Licklider in – get this – 1969. Cloud computing – the actual practice of delivering software via the Internet – didn’t become a reality until Salesforce.com made it so in 1997. Amazon became the first to offer commercial cloud services in 2002, and the practice became truly mainstream when Microsoft and Google jumped on board in 2008.
The Utility of the Network
Just as companies with large sites of operation still find it beneficial to maintain their own power plants, many also find it economically and strategically advantageous to adopt commercially-available cloud computing infrastructures for use in their own private networks. These, of course, are known as private clouds. Due to the enhanced efficiency and capability of public networks, it is no longer necessary to maintain a private infrastructure to take advantage of centralised resources in remote locations. This so-called “maturation” of the cloud has created an environment in which subscription-based services are accessible anywhere one has a laptop and an Internet connection.
This new realm of accessibility has created the ideal virtual landscape for an entirely new industry – an industry on which companies are expected to spend over $106 billion in 2015. They’re spending by the truckload for some very good reasons – the utilisation of cloud services has immense advantages over the old ways of doing things.
Clear Skies in the Future of the Cloud
If your business hasn’t yet taken a step into the cloud, there’s never been a better time to explore the possibilities that this exciting new realm of business computing has to offer. From secure remote data storage to the improved management of dental practices, new cloud services are emerging to compete with those that already exist, and that competition will continue to reduce costs and drive innovation in the cloud. If that isn’t incentive enough, it’s a sure bet that your company’s competition has a keen eye on the competitive edge that’s available in the cloud. Indeed, present-day start-ups are employing strategies based wholly upon the utilisation of cloud services to leapfrog the technological growing pains that have historically plagued burgeoning businesses. There’s never been a better time to explore the possibilities of the cloud. The future offers limitless potential for this new realm of computing, and those who start now will be well prepared to catch the next wave towards the future of business.
### Compliance, Workloads, and the Data Centre
In recent months, discussions around data centre compliance have shifted substantially. Where attention had tended to focus on ways to deal with the diverse range of regulatory requirements governing business activity, it’s now expanded to include topics such as privacy and protection. The shift is being driven by the changing use of corporate data and the evolving role of the data centre itself. In particular, the rise of hybrid cloud infrastructures is making the task of compliance even more complex. [easy-tweet tweet="The rise of hybrid cloud infrastructures is making the task of compliance even more complex" user="comparethecloud"] With company IT infrastructures increasingly spanning national and international borders, the complications of compliance become even more acute. They now extend far beyond more traditionally scrutinised areas such as the finance, insurance, and medical sectors and touch virtually every industry sector.
The Complications of Compliance
Regulations and directives are designed to protect individuals. At the same time, the ability to prove adherence to those regulations protects the businesses and service providers. For this reason, compliance has end-to-end effects and implications. Also, it’s no longer only the reputation and financial position of a business that is at risk. Company officers may even be exposed to personal consequences if found to be in breach of their duties. This recent Digital Realty whitepaper, while focused on the American market, covers issues that affect businesses across EMEA as they deal with the rapidly evolving world of cloud computing. It’s also critical for organisations to ensure they have a full understanding of their complete IT landscape and the inherent risks faced when using cloud services.
Why the compliance of the DC matters – Helping Businesses Protect Their Workloads
As a provider of client-focused data centre solutions, Digital Realty provides on-going support to match the changing needs of our clients. This includes helping them understand where their data resides and how it is being used.
Digital Realty helps mitigate risks by supporting and facilitating compliance, and assesses those risks in order to minimise exposure and loss. Compliance is a fundamental part of the way Digital Realty operates. We see it as an opportunity to partner with our clients to deliver agile solutions that meet their specific needs. We ensure the necessary certifications that facilitate compliance across our European Turn Key FlexSM data centre portfolio. For example, The Uptime Institute, the global data centre authority, has certified our Irish data centre, Digital Profile Park, as Tier III compliant for both design and facility. Tier certification reflects the data centre’s rigorous requirements for high-availability service capabilities. Our UK data centre, which we launched in Crawley for Rackspace, is one of the UK’s greenest data centres with a design BREEAM assessment certification of ‘Excellent’.
Geographic issues and complexities
The European Union has taken – and continues to take – important initiatives to harmonise relevant legislation and is working with the US and cross-industry representatives on the subject. The EU Data Protection directive outlines seven principles designed to protect all personal data collected for (or about) citizens of the EU. The principles cover issues such as how such data can be collected, stored, and shared. As a result, the directive creates challenges for organisations with data centres in the region as they need to ensure that they adhere to these principles. The situation is made even more complex by the additional regulations imposed by individual EU member countries and the recent focus on comprehensive reforms to create a “one-continent one-law” set of regulations. It’s vital to consider these challenges when selecting a data centre partner. As well as having a deep understanding of the evolving regulatory landscape, a partner must also be able to demonstrate they have a thorough compliance program in place. Digital Realty views compliance as a fundamental component of our customer offering, designed to meet their specific requirements in every market in which they operate. Details of our compliance credentials are contained in this white paper.
### Enterprise Digital Summit - London - Workshop
Pre-SUMMIT Workshop
October 21
10-11 Carlton House Terrace, London SW1Y 5AH
9:30 am - 5:00 pm
Driving Digital Transformation in the Enterprise
This pre-conference workshop is a deep dive into managing digital transformation, end to end across the enterprise. We cover aspects of social and collaboration technology, but also the human communication involved. We will discuss driving value across your organisation whilst dealing with today’s disruptive business landscape. The workshop agenda covers:
- The Digital Wave - how cloud, mobile, social, and big data technologies are transforming the way we work.
- The Human Interface – dealing with people in the transformation, from different organisation structures to lessons we can learn from biology, anthropology, modern military tactics and the latest business thinking. What leadership and management skills are required? What is the role of the Organisation anyway, going forward?
- Discussing a European Social Business and Enterprise Social Networks Survey, with analysis, trends, a comparison of available platforms, a look at what companies are actually doing in the UK versus the US and Europe, and what we can learn from the other territories.
- Engagement – the keys to organisational responsiveness, the benefits of an engaged workforce, hard numbers, soft approaches, and new ways of thinking covering the psychology of engagement and current trends in neuropsychology.
- Talent attraction, management & retention in the new world of work, along with the emerging role of HR and the importance of data analytics.
- Delivering end-to-end agility using social techniques to transform the value chain within marketing & PR, sales, customer service and retention, innovation, R&D, procurement, production operations & logistics.
- How to diagnose an organisation’s potential for Social Business, finding the high impact opportunities for your organisation to help you build a business case and define the ROI.
Join us at The British Academy to discover what works, what doesn’t, and what’s next.
### A blog to Storage Marketers regarding IoT Scaremongering
Wading through my usual early morning spam I noticed that there were a number of articles relating to IoT (Internet of Things) and storage growth. Like the idiotic, mindless, early-stage cloud marketing types, it seems the storage marketing lot have really been let loose from the asylum today. It appears that we are about to get drowned by loads of storage requirements generated by the Internet of Things… oh, and our data centres will not cope… Yawn. I have to wonder if these laggard, dull storage-marketing idiots actually understand anything about technology at all? Or better still, do they look at the world around them and question anything? [easy-tweet tweet="The storage marketing in your inbox is just scaremongering " user="comparethecloud" hashtags="storage, datacentre"] The damage the storage marketing types have caused is seen in the data centres up and down the country filled with SANs, Fibre Switches and other storage technologies, which have rendered those cloud providers and MSPs unable to compete on costs due to early adoption and false storage marketing with respect to the promise of cloud adoption and storage needs. Ok, so we are going to have Zettabytes, Zogabytes or even gogabytes (I made that one up), so the internet of things will explode, ok… Does that mean end-users and cloud providers should rush out now and go and buy lots of storage? For anticipated demand, that is… I mean seriously, I know you have to sell a product, but really? Dear storage marketer, do you actually do any research at all? Have you looked at the recent news regarding breakthroughs in processor size? Or better still, have you looked at future advances in storage? Here is a little research link regarding DNA and storage. Here is an idea. How about informing the customers of your products correctly and ‘tell rather than sell’? Or better still, go and speak to your finance teams and come up with a model that scales on demand (or shrinks) in line with a cloud-billing model? My personal view is that the sooner storage vendors are compelled by customer demand to move into a subscription model the better (one with no lock-in, using open standards and swappable components). Currently the storage market is awash with a lock-in mentality; yes, you can get a free SAN enclosure, but it is the cost of the drives thereafter that pays for your free lunch. Advances in object storage and codification / virtualisation of the network and storage pools hopefully will allow cloud providers and end-users to liberate resources away from proprietary stacks.
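To put some rough numbers behind that argument, here is a minimal, illustrative sketch comparing buying capacity up front "in anticipation" of IoT demand against a pay-as-you-grow, cloud-billing model. Every figure in it (the per-TB prices, the starting capacity, the growth rates and the 24-month horizon) is an invented assumption for the sake of the arithmetic, not data from this post or from any vendor.

```python
# Illustrative only: pre-buying storage for forecast IoT demand vs paying
# monthly for the capacity actually consumed. All figures are assumptions.

UPFRONT_PRICE_PER_TB = 300.0          # assumed one-off cost of owned capacity, per TB
ON_DEMAND_PRICE_PER_TB_MONTH = 20.0   # assumed monthly cost per TB actually used
FORECAST_TB = 500                     # capacity bought today "just in case"
MONTHS = 24

def on_demand_cost(start_tb: float, monthly_growth: float, months: int) -> float:
    """Total spend when you only pay each month for the capacity in use."""
    total, used = 0.0, start_tb
    for _ in range(months):
        total += used * ON_DEMAND_PRICE_PER_TB_MONTH
        used *= 1 + monthly_growth          # demand grows (or not) month by month
    return total

upfront_cost = FORECAST_TB * UPFRONT_PRICE_PER_TB
for growth in (0.02, 0.05, 0.10):           # slow, medium and fast data growth
    pay_as_you_grow = on_demand_cost(start_tb=50, monthly_growth=growth, months=MONTHS)
    print(f"growth {growth:.0%}/month: upfront £{upfront_cost:,.0f} "
          f"vs on-demand £{pay_as_you_grow:,.0f}")
```

Whatever figures you plug in, the point of the sketch is the shape of the comparison: the upfront number is fixed whether or not the IoT data ever arrives, while the on-demand number only grows if the demand actually does.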
My open question to the storage marketers is this: in two years’ time, when your customer is sitting with a great big blob of your storage and rules out ever dealing with your organisation again, will you think ‘we did well selling that box’? Or will you think, this customer has really grown, it’s a shame we used marketing rubbish to lock them in? The former will be the truthful answer I suspect, and it is this short-term pursuit of targets that will see storage vendors be obliterated by cloud storage models in the long term. The final thought I will leave with those that have managed to reach the bottom of this blog post is this: the iPhone you hold has more processing power than all the Axis and Allied powers combined in World War Two! Miniaturisation is a process of taking an item, making it smaller, driving down costs and increasing performance; this will happen to storage. The Internet of Things will help this drive. If you want my sincere advice: if your organisation is looking to scale up storage in anticipation of IoT demand in 2018, DON'T…
### Schroedinger’s Backup – will your recovery work?
Theoretical physics doesn’t often cross over with Cloud computing. However, in the world of recovery and business continuity, it’s very relevant. Austrian physicist Erwin Schroedinger created a famous thought experiment in 1935, where he imagined a cat being placed in a sealed box alongside a radioactive element and a flask of deadly poison. The element would decay over time, but the chance of this having taken place at any given moment was 50/50. [easy-tweet tweet="Without testing, it is impossible to tell; a case of “Schroedinger’s Backup”, if you will." user="mastersian and @comparethecloud"] If the element did decay, the phial of poison would break and kill the cat. The chance that the cat was alive or dead was therefore also equal. The important lesson from this was that it was impossible to know whether Puss had survived his brush with death unless you opened the box; until then, you simply could not tell. How does this apply to Cloud and disaster recovery (DR)? For many traditional implementations, recovery is just as dicey a proposition as for Schroedinger’s famous feline. Previously, testing backup and recovery programmes was hard to justify, based on the time required for systems to be down during the recovery test process. There was always the potential situation where the backup process would not actually work, leading to both lost revenue and (potentially) loss of career. Better then to leave things as they are and trust the plan, some might say. It’s understandable that people may have this mindset. If the plan is not tested, then things as they stand can continue and everything remains operational. However, it might also fail, and it would be impossible to tell beforehand if the recovery would succeed. Without testing, it is impossible to tell; a case of “Schroedinger’s Backup”, if you will. However, this lack of testing leads to false hope and fragile IT remaining in place. While the business may be happy in ignorance, any incident could lead to big problems in the future. The longer that operations go on, the bigger the issue will get. Today, using Cloud can help remove the uncertainty around recovery through making testing simpler. Helping companies test their backup and recovery processes can be a great opportunity to prove the value that Cloud can deliver, as well as ensuring that all the efforts around business continuity planning are worthwhile investments.
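As a minimal sketch of what an automated, non-disruptive recovery test of this kind might look like, the snippet below spins up a throwaway instance from a backup image, waits for it to come up, and tears it down again. It assumes an AWS-style environment with boto3 and valid credentials; the AMI ID, instance type and tag are placeholders invented for illustration, not anything taken from this article.

```python
import boto3

ec2 = boto3.resource("ec2")

# Launch a throwaway instance from the most recent backup image.
# 'ami-0123456789abcdef0' is a placeholder - substitute your own backup AMI ID.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",
    InstanceType="t2.micro",
    MinCount=1,
    MaxCount=1,
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "purpose", "Value": "dr-test"}],
    }],
)
test_instance = instances[0]

# Wait until the instance is running before checking its state.
test_instance.wait_until_running()
test_instance.reload()
print(f"Recovery test instance {test_instance.id} is {test_instance.state['Name']}")

# ...at this point you would run application-level checks (query the restored
# database, hit a health-check URL) to prove the backup is actually usable...

# Tear the test copy down again so it does not accrue cost.
test_instance.terminate()
test_instance.wait_until_terminated()
print("Recovery test complete - production systems were never touched")
```

The key design point, in line with the article, is that the test runs entirely on fresh cloud infrastructure, so production carries on untouched while the recovery plan is proven.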
For companies that are already using Cloud for recovery, testing the implementation can be as simple as starting up new instances and making sure that the process works effectively. The production systems should be able to carry on running while the test is carried out, as the test can be run on new Cloud infrastructure. For those organisations that want to look at how they can test their current approach, there is some more work involved. Replicating data from the existing backup systems over to a Cloud instance would be required; this Cloud instance could then be fired up to show that the backup systems are working properly. While there might be more time required to carry out the initial replication, the result is the same. Testing Cloud recovery programmes as standard can actually be a great differentiation point for DRaaS providers, as it shows that the provider knows exactly how their recovery procedures will work in practice. It also ensures that the Cloud service keeps up with all the changes that are taking place across IT, both within customers and in the industry as a whole. While testing might not be high up on the agenda for the CIO, it’s an essential prerequisite for IT teams that have to be sure that their infrastructure is being protected and that recovery systems are working properly. Using the Cloud to carry out this testing makes sure that there are no excuses for not doing so.
### Continuum is Empowering Small Business IT in the UK
LeadingEdge, a U.K.-based IT services and support provider founded in 2000, ensures that small and medium-sized enterprises (SMEs) have the technology infrastructure and tools they need to be successful, and that systems operate effectively and efficiently every day. In order to deliver those results, the company relies on software, support and services from Continuum. “The partnership between Continuum and LeadingEdge is very, very strong,” says Dean Parry, General Manager at LeadingEdge. “And it’s only growing stronger. The way they’re getting involved in our business and helping us has allowed us to move forward at a very rapid rate.” LeadingEdge leverages Continuum’s Remote Monitoring and Management (RMM) platform – a SaaS-based portal that allows MSPs to easily monitor, troubleshoot, and maintain desktops, servers, mobile devices and other client endpoints. The software intelligently captures device and network information, employing a proprietary alerting system that can eliminate up to 80 percent of erroneous tickets and false alerts. [easy-tweet tweet="“Having Continuum RMM allows us to be a proactive team, rather than a reactive one,” says Dean Parry"] “Having Continuum RMM allows us to be a proactive team, rather than a reactive one,” says Parry. “The portal helps us massively, giving us the ability to go anywhere in the world and support our clients without actually having to be in our offices.” Furthermore, the platform is directly supported by nearly 600 technicians at Continuum’s state-of-the-art Network Operations Centre (NOC). These certified experts act as a direct extension of LeadingEdge’s workforce, by providing 24x7x365 monitoring, issue remediation and a variety of related support and services. “The NOC has skills and expertise that our engineers don’t have, and we feel very comfortable and confident when pushing issues to them,” says Parry. “Working with the NOC on a day-to-day basis allows us to get under the hood with our clients.
We don’t have to worry about backend services, because Continuum is taking care of them. That gives us time to drill down into our clients’ needs.” It’s a model that has not only helped LeadingEdge expand their technical skillset and knowledgebase, but has also allowed the company to grow and onboard new clients without hiring new technical staff. “Working with the NOC means that we don’t have to employ tens or hundreds of people to look after our everyday IT tasks,” says Parry. “We’ve actually lowered the amount of staff we have.”
Frontline Support from Continuum’s Help Desk
While Continuum’s NOC provides backend monitoring and support, they don’t interact directly with LeadingEdge’s clients. That task is reserved for Continuum’s Help Desk – a frontline support centre that offers 24x7x365 desktop troubleshooting and related services to end users via phone, email or web-based chat. “Today, we support clients around the globe, 24x7 – without having to worry about an issue at two in the morning in London,” says Parry. “The Help Desk technicians are very professional, and take control of the situation. We haven’t had any problems.” Continuum’s Help Desk is staffed with more than 100 technicians who provide support to more than 35,000 end users. The service is also completely white-labelled, helping Continuum’s MSP partners to maintain a seamless brand experience with their customers.
Network Assessments: A Foot in the Door
LeadingEdge also leverages Continuum’s Network Assessment Tool, a solution powered by RapidFire Tools, when meeting with potential clients. The tool scans entire IT environments to capture device and network information, identify potential risks or security threats, and more – and generates a series of reports that can be custom-branded and used as part of a proposal from LeadingEdge. “Before we engage with a customer, we always carry out an IT audit using the Network Assessment Tool,” says Parry. “That gives us a complete picture of their infrastructure, patching levels, antivirus levels, and more – and we’re then able to put together a much more detailed proposal for the customer.”
Trusted, Local Support from Continuum’s UK Team
In addition to receiving support from Continuum’s NOC and Help Desk, LeadingEdge also appreciates that Continuum has established a local presence in the UK to provide personalised account support and one-on-one time when the company needs it. “We’ve mostly been dealing with U.S. support since partnering with Continuum, but the company recently set up an office in the UK. It’s really given us confidence and comfort to know that we can contact a local account manager in the UK and meet face-to-face – they help us straight away. It’s been brilliant for us.”
### Enterprise Digital Summit - London
22 October 2015
10-11 Carlton House Terrace, London SW1Y 5AH
8:50 am - 5:30 pm
The Enterprise Digital Summit is the premier UK event to help your organisation make sense of the shift to digital technology and new business models that is disrupting the world of work as we know it. Increasing use of cloud, social, mobile and analytic technology is a given in today’s marketplace, but we focus on the leadership, digital literacy, management and business issues around putting these technologies to work end to end across the whole business, and not just for communications and marketing. We talk less about theory and more about practice, generating real returns - where do you start, what works, what doesn't, and what's next?
Our keynote speakers are Stowe Boyd of Gigaom Research, the well-known futurist who coined the terms "social tools” in 1999 & “hashtag” in 2007, and Professor Vlatka Hlupic of Westminster University, author of The Management Shift. We have case study speakers from Vodafone and Pearson. Other speakers include David D’Sousa, Bonnie Cheuk, and Belinda Gannaway, with more to be announced shortly. Our venue is The British Academy at 10-11 Carlton House Terrace, overlooking the Mall. This is a one-day conference aimed at executives and managers who want to get to grips with digital transformation for the enterprise. Register at http://www.enterprise-digital.net/london/.
### IoT and your blood pressure woes
Hospitals and health services around the globe are under immense pressure. In the UK, as of the Eurostat regional yearbook 2013, the last time the data was aggregated, there were just 2.71 doctors per 1000 people. Taking into consideration population growth, and the consequences of having an ageing population, these figures have probably not improved much in the past two years. [easy-tweet tweet="#IoT and #bigdata will change healthcare in the future" user="rhian_wilkinson and @comparethecloud"] Healthcare services’ needs never seem to be satisfied; bed shortages are only the tip of the iceberg. There is also the issue of ageing technology that has to be replaced - and staffing continues to be a problem as services are not expanded to account for growing populations. Temporary staffing solutions are employed to fill the gaps in the rota, but how effective is this really? It became apparent at the beginning of June 2015 that staffing agencies are ‘ripping off the NHS’ by charging up to £3500 a shift for agency doctors. "There will always be a need for agency staff... but they should be there for those times when there is a cold snap, when there's a flu outbreak, when you have sudden spike in demand you couldn't predict," Health Secretary Jeremy Hunt said. Patients are 15% more likely to die if they are admitted to hospital on a Sunday than a Wednesday, Hunt said on July 16, 2015. This is due in part to the staffing problems health services encounter over the weekend, but I have to wonder if this could relate to the long-standing issue with doctors’ handwriting being illegible, and paperwork being notoriously easy to mix up. Digital health documents have already started to conquer this hurdle. IoT could be the answer to Jeremy Hunt’s woes. In 2008 The Academy of Royal Medical Colleges found that “The quality of medical record keeping in the UK is highly variable across the NHS. The layout of admission, handover and discharge proformas is very different between hospitals and clinical departments and many do not use proformas. This variability is largely because doctors learn how to take a medical history by apprenticeship rather than the application of a standard record structure.” These findings led to a move to create a standardised version of the structure and content of medical records. In 2013 NHS England set itself challenging targets: completely paperless communications between primary and secondary care by 2015, and a wholly paperless NHS by 2018. By standardising and digitising all medical data, big data analytics will be able to work as it was intended - to better our lives. If all medical data coming out of a borough in London is trending towards a spike in lung problems, we can see that air quality is an issue, and address it before more residents become unwell.
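To illustrate the kind of automated, remote monitoring and alerting this article goes on to describe, here is a minimal sketch of a threshold check over streamed blood-pressure readings. The patient ID, readings, thresholds and the print-based "notification" are all invented assumptions for the sake of the example; a real service would use per-patient limits set by a clinician and a proper messaging channel, and nothing here reflects an actual NHS system.

```python
from dataclasses import dataclass
from datetime import datetime

# Illustrative alert thresholds - real systems would use clinician-set,
# per-patient limits rather than hard-coded values.
SYSTOLIC_LIMIT = 180
DIASTOLIC_LIMIT = 110

@dataclass
class Reading:
    patient_id: str
    systolic: int
    diastolic: int
    taken_at: datetime

def check_reading(reading: Reading) -> bool:
    """Return True (and raise an alert) if a reading breaches the thresholds."""
    if reading.systolic >= SYSTOLIC_LIMIT or reading.diastolic >= DIASTOLIC_LIMIT:
        # In a real service this would notify both the GP surgery and the
        # patient; printing keeps the sketch self-contained and runnable.
        print(f"ALERT {reading.patient_id}: {reading.systolic}/{reading.diastolic} "
              f"at {reading.taken_at:%H:%M} - appointment recommended")
        return True
    return False

# Simulated stream from a wearable cuff or chest strap.
readings = [
    Reading("patient-042", 128, 82, datetime(2015, 7, 16, 9, 0)),
    Reading("patient-042", 186, 112, datetime(2015, 7, 16, 14, 30)),
]
alerts = [r for r in readings if check_reading(r)]
print(f"{len(alerts)} of {len(readings)} readings triggered an alert")
```

The point of the sketch is the workflow rather than the numbers: routine readings generate no contact time at all, and only the out-of-range reading prompts an appointment.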
On July 16 2015, Hunt said that within 5 years he expects seamless access to electronic health records which can be shared, and the introduction of new medical devices which can send emergency alerts to ensure ambulances arrive quicker. If you have a heart condition, and are wearing a monitoring band that can summon an ambulance as soon as it detects an issue, you're probably much less likely to be found dead on your living room floor. IoT could be the answer to Jeremy Hunt’s woes. Many patients need appointments for minimal checkups on their blood pressure and heart rate; IoT devices have the ability to minimise contact time between patient and doctor. Something as simple as a wristband or chest strap can monitor the patient’s target areas, and record and transmit the data for analysis. If blood pressure or heart rate became elevated, both the doctor and the patient could be notified automatically. This would prompt the requirement for an appointment, rather than having frequent appointments for something that can be monitored remotely. Companies such as Ohmatex are exhibiting prototypes of IoT smart textiles including a stocking that measures peripheral oedema, which is common in patients suffering from heart or kidney failure. Daniel Thomas has shared his thoughts on big data and medicine previously, particularly that a key part of medical care is human interaction. Recovery is bolstered by the support of family, along with cups of tea of course. If we could improve the way technology in hospitals is used to monitor patients, support staff would be able to spend more time on the personal aspects of healthcare. Robotics are progressing rapidly, and in a few years’ time personal healthcare companions like “Baymax” from Big Hero Six might be a reality. Japan already has giant RIBA bots that look like teddy bears. With these improvements in support, and the developments we are seeing in big data, medicine will be a completely different game within the next decade. IoT and big data will change the way we seek and receive medical care. [easy-tweet tweet="IoT and big data will change the way we seek and receive medical care." user="Rhian_Wilkinson" hashtags="IoT, HealthTech"]
### Hybrid is the Word (Part 2)
Our individual rankings for hybrid cloud are dominated by analysts seeking to promote their recent research, reports or opinions. First up, Kuba Stolarski has been promoting the IDC report on IT cloud spending, commenting that: "Private and public cloud infrastructures have been growing at a similar pace, suggesting that customers are open to a broad array of hybrid deployment scenarios as firms modernise their IT”. Natalya Yezhkova also promoted the same IDC report, which forecast IT cloud spending growth of 26% in 2015. She added that: "End users continue to evaluate various approaches to adopting cloud-based IT: some integrate public cloud service into their IT strategies, others choose to build their own private clouds or use third-party private cloud offerings, and some, seeing benefits in both, implement hybrid cloud strategies.” Step Ahead Solutions launched a Kickstarter campaign to gain public funding and bring its services mainstream.
Robert Joseph, its CEO, believes that the company's unique Cloud Services Brokerage offering hybrid IT services is game-changing and currently vastly under-utilised, and has the capability to change the way people use computers. Another vocal analyst, Charley McMaster, has been promoting the firm’s 2015-16 Hybrid Cloud Backup Appliance Buyer's Guide. He was quoted as highlighting: "Unitrends' Recovery943S and Recovery936S [which] stand out from the competition in the high-end hybrid cloud backup appliance market, delivering arguably better protection and recovery for highly virtualised environments as well as connectivity to multiple cloud providers.” Stacy Nethercoat from distributor Tech Data focused on the free trials that Carbonite was offering through channel partners, saying: "Carbonite's hybrid backup solutions are a great addition to our suite of cloud offerings.” Further IDC analyst comment from Vladimir Kroa focused on IBM initiatives for startups to help drive European cloud expansion. He commented that: “In Central and Eastern Europe, spending on cloud services, which includes public, private and hybrid cloud, is on the rise and is far exceeding the growth rates of traditional technology delivery models.” Kroa also added that: "Consequently, the integration of internal and external resources is becoming more frequent as end users are opting for a hybrid cloud model that, on one hand, solves some of the legislative concerns about data privacy, and, at the same time, enables more flexibility, price transparency and acceleration of go-to-market strategies.” Meanwhile David Dennis from GroundWork focused on the firm’s Cloud Hub Amazon Connector. “Hybrid clouds, using a combination of public clouds like Amazon plus a local private cloud, are becoming increasingly popular with GroundWork's user base,” he said. Intel was busy promoting discussion on the need for interoperability to drive both cloud and IoT. Commenting specifically on the fact that CipherCloud and the Cloud Security Alliance are forging a Cloud Security Working Group, Curt Aubley, Intel’s Data Center Group CTO, said that: "With enterprises scaling their use of hybrid cloud solutions to drive their business, the role of interoperable cloud security based on standard APIs is critical.” Cloudian celebrated its selection as a leading innovator as well as the fact that it has been asked to present at the Japan-U.S. Innovation Awards. Saying that he was delighted to demonstrate its HyperStore hybrid cloud storage solution for smart data applications at the Japan-U.S. Innovation Awards Innovation Showcase, Michael Tso added that: "Since Cloudian's inception, we've seen the explosion of Hybrid IT, an approach to enterprise computing in which an organisation provides and manages some IT resources in-house but uses cloud-based services for others.” Commenting on EMC’s sale of Box competitor Syncplicity to private equity firm Skyview, Gaurav Verma of Dropbox explained that: “EMC had acquired Syncplicity in May of 2012 and focused since then on its hybrid cloud approach and a high-security competitor to Box and Dropbox.” It will be interesting to see how well Syncplicity thrives against Box and Dropbox without EMC. [table id=41 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days.
The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.
### I am a Human not an Algorithm
Today almost all aspects of our lives are controlled by an algorithm of some description. From the Google searches delivering us results based on our preferences and location, to the Facebook timeline that filters our feed based on what it thinks we want to see. Do you wonder why that friend from high school you never talk to doesn’t show up in your feed anymore? It’s because Facebook figured out that you don’t care about their status updates very much. [easy-tweet tweet="The danger is the algorithm itself has been engineered, coded, and conceptualised by a human..." user="MrAndrewMcLean"] The danger of any machine automation based on an algorithm is that the algorithm itself has been engineered, coded, and conceptualised by a human. Not that I am saying humans are all bad - just prone to the occasional mistake. Human errors are the basis of all algorithm errors. Here is a very brief look at how algorithms are influencing our day-to-day lives.
Over-reliance on social media data: Twitter is fantastic, especially for sentiment analysis, but what if that sentiment was artificially generated? Hacking attacks on the Twitter accounts of major news organisations could be used to send false Tweets about catastrophic incidents. Bill Mew's explanation of sentiment data is as follows: “Social Sentiment is the view, whether positive or negative, of a brand or keyword as it is used across social media platforms such as Facebook and Twitter.” So what? you may ask. Today many trading and banking platforms are fully automated, based on algorithms that can include social data. This type of trading via algorithm is commonly called ‘High Frequency Trading’, which involves trading decisions made by computers in milliseconds. So what if the algorithm drew on social media sentiment data? If a major ‘trusted’ news or media outlet published news of a terrorist incident or a currency-changing event, this would be picked up on by the trading algorithm. Consequently, the algorithm could cause a fluctuation in price, with no human judgement involved. The social sentiment data could cause the algorithms of high frequency trading to react without verification, and effect a number of trades on a share, currency or commodity market.
Unbiased Media Influence
Whatever your leanings towards politics, sexuality, or music, you occasionally may want to break out of your usual sphere and see something new. Unfortunately, on a daily basis this decision is being taken away from you: the cookies placed on your machine, the magazines you subscribe to, and the web searches you make have made your preferences identifiable. Many say advertising has driven the growth of the web, and most would agree this is an undeniable fact. But now the algorithms behind the advertising also control the content you see across all devices and media - only presenting you with content that is based on your pre-determined profile. Most would agree that this means the algorithm has been corrupted.
Internet of Things
IoT, Internet of Things, Internet of Everything, Wearables - it has many names and guises, but in essence it is the blending of technology with humans.
From the FitBits and Apple Watches that monitor your steps per day, to the diabetes app that measures your food intake - technology is now integrated with everyday human life more than it ever has been before. As we bring more of these devices into our homes and into our lives we have to remember that they are driven by algorithms - and that algorithms are prone to error. Machine automation and learning is fantastic, as are many of the devices available to you and I today. Just always keep a sceptical view of everything that is presented to you; nothing is random.
### Hybrid is the word!
Whereas we were once reliably informed by a Mr Travolta and the BeeGees that “Grease is the word” – apparently it had groove and a meaning and was the time, the place and the motion as well as the way we are feeling; today however “hybrid” is definitely the word, and it is the direction that most clients are taking in terms of cloud adoption. We therefore chose to do a special report on who it is that is currently spreading “the word”. As with all of the Compare the Cloud #CloudInfluence league tables, we used a sophisticated big data analysis of the most significant global news, blogs, forums, and social media interactions (English language only). This provides a snapshot taken at a particular point of time of the respective influence of both organisations and individuals in relation to hybrid cloud. Companies or individuals that were particularly active in the assessed period will feature more prominently, as we see here at the top of the rankings with companies that made a big splash with recent hybrid-related announcements. [easy-tweet tweet="Read about the top 20 organisations with #CloudInfluence for #hybridcloud" user="comparethecloud" hashtags="hybrid, cloudnews"] Favourable timing helped Shaw Communications hit the top spot, with its influence peaking just before we conducted our analysis following its announcement of the creation of a new, state-of-the-art data centre in Calgary. Here it aims to offer business customers a variety of hybrid solutions, including cloud, colocation, managed services or any combination in between. Also benefiting from the timing of a recent announcement was OneNeck IT Solutions, a provider of hybrid IT solutions, which is opening its newest data centre in the Denver Metro Area, with its ReliaCloud pod placed in this facility. And in 3rd, another announcement helped Net Access, a leading provider of hybrid colocation, cloud, connectivity and business continuity solutions. Net Access announced that their data centres now meet the Payment Card Industry Data Security Standard (PCI DSS) 3.0 and Service Organization Controls 1 (SOC 1) Type 2 standards of control. In 4th is our first big player. Ahead of its split in November, Hewlett Packard is setting out its cloud strategy, saying that it wants to create a new type of hybrid infrastructure that optimises for performance and cost by helping its customers build clouds that scale and work with their current and future infrastructure. Then came mega-scale cloud player Google, which in a major move is joining OpenStack as a manoeuvre in its massive cloud war offensive against AWS and Azure. Google thinks that both hybrid cloud and containers are going to be important to businesses globally, and by hooking up with the OpenStack Foundation, it can show off its container technologies to the array of enterprise developers looking to improve the relationship between public and private clouds.
Next up is VeriStor Systems, an advanced IT solutions provider specialising in virtual infrastructure and enterprise private, public and hybrid cloud services and solutions. VeriStor is expanding its Security Solutions and Services Practice to protect the entire enterprise landscape from endpoint to data centre to cloud. Intel has been banging the drum for integrity and confidentiality assurance. It believes that this is becoming a critical requirement in private, public, and hybrid cloud infrastructures, and that cloud service providers must offer trusted clouds to their customers to provide them with the confidence to move sensitive workloads into the cloud. Even if organisations have confidence in the systems deployed in their data centres, in hybrid cloud environments, on-premise systems may be instantly and automatically supplemented by capacity from a public provider. We are familiar with the next organisation in the rankings, having participated with them in a recent Twitter chat on the very topic of hybrid cloud (see https://www.crowdchat.net/PureAppChat). Boosted by this Twitter chat and all its other activity on hybrid is IBM Pure Application Systems. At the same time IBM has announced a container service for Bluemix. The service, IBM says, is intended to help developers deliver applications across hybrid cloud environments. Adaptive Computing, a firm that claims to power many of the world’s largest private/hybrid cloud and technical computing environments with its Moab optimisation and scheduling software, has announced that SURFsara, a national Tier 1 data centre in the Netherlands that provides high performance computing and data infrastructure to industry and the academic community, has deployed Moab HPC Suite. Finally, Microsoft has also been banging the drum for hybrid - one of the major announcements at Microsoft Worldwide Partner Conference was the introduction of Microsoft Azure Certified for Hybrid Cloud, a new program designed to support service providers who run the Microsoft platforms in their data centres and who want to deploy a Microsoft Cloud solution in their own data centre, or build an Azure-enabled hybrid cloud solution, and monetise their investments. On the back of this, Cisco went on to announce that it is the first data centre infrastructure vendor to participate in the program. Back in March, Cisco and Microsoft announced a jointly engineered solution for service providers designed to dramatically improve the time to market with hybrid cloud services. [table id=40 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.
### China is Global Leader in Deployment of IoT, Finds New GSMA Report
Rapid Market Growth, Government Investment, Operator Industry Partnerships and Adoption of Common Specifications Helping to Drive Internet of Things (IoT) to Scale.
At Mobile World Congress Shanghai, the GSMA issued a new report highlighting China’s leadership in the worldwide machine-to-machine (M2M) market.
According to the report, “How China Is Scaling the IoT”, China is the world’s largest M2M market with 74 million M2M connections and has now become the global leader in the deployment of the Internet of Things (IoT).* The in-depth report includes insights from the country’s major mobile operators China Mobile, China Telecom and China Unicom, as well as leading industry experts, and cites the combination of a strong economy, far-sighted government investment and international cross-sector partnerships as key factors in enabling the IoT to quickly reach scale. “Clearly, China’s size offers economies of scale unavailable to other countries, but it’s been the government’s focused strategy, emphasis on common specifications and cross-sector collaboration that has allowed the Internet of Things to scale, delivering positive benefits to businesses and consumers alike,” commented Alex Sinclair, Chief Technology Officer, GSMA. “Connectivity is boosting major industries such as logistics, manufacturing and energy in terms of increased efficiency, but it has also created a new consumer market in areas such as connected vehicles, home appliances and wearables, putting China at the forefront of IoT deployment.”
Connected Consumers
Backed by government support, sectors such as transportation, energy, logistics, utilities and manufacturing have all benefited from the real-time information provided by mobile connectivity to increase efficiency, lower costs and manage infrastructure. However, the consumer market has also experienced incredible growth, with millions of Chinese consumers now owning multiple connected devices and experiencing the IoT in their daily lives. The wearables market in particular is a hotbed of innovation, with thousands of products available at accessible prices, including smartwatches, tracking devices and fitness bands with built-in connectivity. The connected car market is also growing apace, driven by the increasing availability of 4G network coverage, which is providing a range of in-car services such as entertainment, navigation, safety and vehicle diagnostics. Machina Research forecasts that the number of connected devices, such as vehicle platforms, providing services within the connected car sector will increase from 16 million in 2015 to 67.8 billion in 2020 and to 130 billion by 2024, making China the world’s largest connected car market after Russia.
Positive Government Support
China has benefited from proactive government support in the development of the IoT, with funding allocated as part of the country’s 12th Five-Year Development Plans and additional funding made available for research and development. China has also led in the development of standards, establishing an IoT standards association and promoting Chinese-developed standards internationally. Adoption of essential industry specifications and guidelines, such as the GSMA Embedded SIM Specification and the IoT Device Connection Efficiency Guidelines, will enable further growth and scale.** The central government has also selected 202 cities, including Beijing, Guangzhou, Hangzhou and Shanghai, to pilot smart city projects to collect, store and analyse information related to transportation, electricity, public safety and environmental factors.
Operator Partnerships
The report highlights that China’s leading mobile operators, China Mobile, China Telecom and China Unicom, are in the vanguard of the development of the IoT and are moving from a business-to-business focus to offering more sophisticated consumer-oriented propositions via partnerships with other companies such as automotive makers and wearables companies. They are also forging both domestic and international partnerships with vendors and manufacturers to bring the benefits of connectivity to a wide range of machines, vehicles and devices. Beyond the supply of ubiquitous, high-performance connectivity, operators are working to standardise platforms, simplify business processes and provide value-added services such as security, authentication and billing. To download the report, click here: http://www.gsma.com/newsroom/wp-content/uploads//16531-China-IoT-Report-LR.pdf
### Cloud is just the end of the beginning
“This is not the end. It is not even the beginning of the end. But it is, perhaps, the end of the beginning.” - Winston Churchill
Computing is evolving, undergoing an evolution that will eventually allow mankind to go beyond whatever we think possible today. When Cloud Computing, or in particular IaaS, first arrived we paid by credit card, we rented space, and we took advantage of cheap ubiquitous computing on demand. Like Darwinism and the evolution of the species, the time is now upon us to move into the next phase. This phase, as we see it, is not more “cloud! cloud! cloud!”; it is the possibilities and applications that cloud computing allows us to achieve. Many people we have spoken to think Big Data is the next phase. Whilst there is certainly a very strong argument for this perspective, we would disagree. [easy-tweet tweet="Enter the age of the Sensoratic, where society is run by sensors" user="BillMew and @ComparetheCloud" hashtags="IoT, BigData"] Big Data by its very nature is the evolution of merging Enterprise data sets into an understandable outcome that allows for predictive modelling. An example is a large fried chicken chain that is able to predict load, cost and events to enable better profitability across thousands of stores. Bringing social data into this mix allows for reaction to events, for instance the announcement of a local parade, which would bring additional footfall past the shop. Big Data has been with us for a long time in many guises and has certainly crossed the chasm into a late majority adoption cycle. Enter the age of the Sensoratic: a society that is run by sensors, with everything from your fridge knowing when you are out of milk, to energy supply being determined by demand on the grid. The platform (cloud) is here, and now is the time to really push forward. The way forward is using sensor data. We have the ability to stream data, and interface with any device using a variety of means and protocols which allow for data gathering. The age of the Sensoratic brings together the Internet of Things (IoT), big data analytics, the interconnected universe, a smarter planet and a host of other advances into a new system of systems. From your oven to your TV, every device now has the ability to ‘speak’ to us. Consumerisation of IT is here, but not in the format espoused by the world. The consumerisation I am talking about is the convergence of everything into machine language. The way this is being accessed, analysed, and reported on is by massive cloud clusters.
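As a minimal sketch of what one of those sensor-to-cloud streams might look like, the snippet below packages a fridge reading as the kind of JSON payload a cloud ingestion service would consume. The device ID, field names and reorder threshold are invented for illustration, and nothing here reflects a real supermarket or appliance API.

```python
import json
from datetime import datetime, timezone

# Invented threshold: below this many litres of milk, a reorder is suggested.
MILK_REORDER_THRESHOLD_LITRES = 0.5

def build_sensor_message(device_id: str, milk_litres: float) -> str:
    """Package a single fridge reading as a JSON payload for a cloud platform."""
    message = {
        "device_id": device_id,                          # hypothetical identifier
        "recorded_at": datetime.now(timezone.utc).isoformat(),
        "milk_litres": milk_litres,
        "reorder_suggested": milk_litres < MILK_REORDER_THRESHOLD_LITRES,
    }
    return json.dumps(message)

payload = build_sensor_message("fridge-kitchen-01", 0.3)
# In practice this would be published over MQTT or HTTPS to the retailer's
# ingestion endpoint; printing keeps the sketch self-contained and runnable.
print(payload)
```

The interesting design question is less the payload itself than who receives it and under what contract, which is exactly where the next paragraph picks up.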
In the future, is it possible that we will start getting free household appliances to allow the supermarket of our choice to drop shopping to us via drones? If you have a ‘Tesco Fridge’ that automatically orders from your nearest Tesco when you run out of milk, how does that work? What would the contract look like? Who owns your data? Unthinkable? For decades now, ice-cream firms have provided free freezers to corner shops to help them sell ice cream. Only the firm’s own brand of ice cream treats could be stocked in the freezer, and these were replenished regularly by the firm on the shopkeeper’s behalf. Advances in sensors and systems have automated this process such that it could now be possible to extend this model to every home to include a full range of both frozen and fresh produce. Cloud has democratised computing and compute resource access for developers, big data providers, analytics tools and technologies such as software defined networking and devops. The next evolution is democratising all of this data into a common set of standards and formats to create something meaningful that would allow everyone to leverage technical advances. As we all sit here today and ponder where technology is going whilst staring at our Apple Watches, iPhones or Fitbit bands, remember that it was only 20 years ago that the idea of having a telephone, video player and music storage in one device that fits in your hand was thought impossible. I do wish the next phase of computing would hurry up… ### June Top 10 #CloudInfluence UK and Cloud Bursters Top of the list of executives bursting onto the scene in June, our ‘Cloud Bursters’ #1, was Mark Reuss, President of General Motors, who has been talking about making the computer systems in the firm’s new cars cloud-based (see the June Top 50 individual rankings, where Mark came 5th). Next came Scott Guthrie, the man who runs Azure, commenting that he sees the cloud market becoming a three-horse race (with Google as the third pony). "I give [AWS] credit for pushing the cloud earlier than others," Guthrie said, "but I don't think this is a market that's winner take all." He believes that Microsoft and Google are also doing something that's familiar to any retailer who has ever competed with Amazon.com: they're engaging the incumbent in a price war. Commenting on another war, SoundCloud chief executive Alexander Ljung welcomed the arrival of Apple Music, saying that its launch will only benefit his music streaming service. Speaking to Music Business Worldwide, Ljung maintained that he was not concerned about Apple Music having an adverse effect on SoundCloud because he doesn’t “see anything out there that’s remotely close to SoundCloud”. Let’s see how these wars play out for all involved. In the June Top 50 individual rankings we focused on Larry and the Oracle cloud love-in, as they launched Oracle Commerce Cloud. Keith Hurley, Director of Spindrift (a DigitasLBI Company), was one of the Oracle partners rolled out to add independent adulation to that from Oracle. 
He claimed that: “With Oracle Commerce Cloud we can now deliver the core commerce components of our customer implementations and also serve as the design agency for the broader customer experience.” Also part of the Oracle cloud love-in was Ken Volpe, Senior Vice President, product development, Oracle, who added that Commerce Cloud is a new, differentiating piece of the broader Oracle CX cloud applications portfolio and helps ensure online businesses no longer have to worry about deploying code, upgrading, and managing day-to-day infrastructure. So there we have it. Focusing on the importance of security standards for the cloud security ecosystem, Jim Reavis, CEO of the CSA, commented that the right set of working definitions can boost adoption. Returning to the battle front, Jim Weins, vice president of marketing at RightScale, a service that helps developers manage cloud services, estimates that AWS customers now pay half as much for the same services as they did at the beginning of 2013 and that Google's prices have fallen 62% over the same period. Meanwhile in India, Sivarama Krishnan, Director of PwC’s Risk Advisory business, endorsed the government’s new privacy and data sovereignty policy, saying: “The government's locker is more secure primarily because the data gets stored within India and you are protected under the Information Technology Act, 2000. If you store anything in DropBox or Google Drive, you are governed by US regulations.” Conversely, a more global approach was advocated by Jim Comfort, general manager of cloud services for IBM: “Our global network of IBM Cloud data centre locations, along with on-the-ground expertise, will provide customers with the best support and the most reliable availability, security and performance for their local and worldwide business transactions,” he said. Rounding out the top ten was Jim Cole, SVP of Outsourcing Services at Hitachi Consulting, who believes that he has the answer for companies wanting to extend private cloud environments to include a public, hosted platform for non-mission-critical workloads. [table id=37 /] Topping the UK rankings for June was Windstream VP of data center marketing Rob Carter, who believes that their Elastic Hybrid Cloud truly creates an elastic environment, leveraging the best of the public and private cloud worlds. This means that for the second month running IBM’s prolific Simon Porter was pipped to top spot by a single point. In 3rd was Angus MacSween, iomart's CEO, commenting on the increasing complexity and demand for public cloud services. Their acquisition of SystemsUp, he argues, has broadened iomart's ability to engage at a strategic level and act as a trusted advisor on cloud strategy to organisations wanting to create the right blend of cloud services. He came just ahead of John Jackson, from Camden Council, last month’s leader in the rankings. Rob Fraser, CTO for cloud services at Microsoft UK, claimed that on the cost of compute or storage, open cloud is hard to beat, while Maarten Ectors of Canonical Group argued that Snappy Ubuntu Core on top of Acer's products would enable developers to let their imagination run free, and Simon Hansford from Skyscape Cloud Services welcomed a partnership with DeepSecure for secure digital services to the UK public sector. Finally we had Emma King from CA Technologies and Jim Liddle from Storage Made Easy, who succeeded in pushing our very own Neil Cattermull down to 11th this month. 
[table id=38 /] NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Getting what you pay for - avoiding Cloud migration challenges The vision for cloud computing has always been around the twin benefits of cutting costs and improving flexibility for the business. However – like so many things in this life – the gap between the ideal implementation of cloud and real life circumstances has meant that many companies have faced challenges in delivering on all that promise. At this point, it’s worth looking at how to plan ahead and stop problems from affecting what would otherwise be successful projects. [easy-tweet tweet=".@mastersian is talking about the gap between the ideal implementation of cloud and real life circumstances" user="comparethecloud"] The Cloud Industry Forum released research in June 2015 that stated only 10 per cent of companies felt that their cloud implementations had gone as smoothly as possible and could not have been improved. Of those surveyed, the biggest initial challenges after signing on the dotted line and deciding to “go cloud” were issues around the complexity the migration and difficulties around the sovereignty of data held in the cloud. Now, these issues should be put into context – many IT projects that don’t involve cloud computing solutions have their equal share of problems that come up after the ink is dry on the contract. Indeed, you only have to look at some of the problems that come up in public sector IT projects as evidence that things are not simple and that unexpected costs can crop up. However, these hurdles have a bigger impact on the perception of cloud. After all, traditional IT is known to be complex and to require insight and experience; the perception is that cloud should remove all of these issues as part of the reason to shift over. Take this element away, and cloud can get categorised as simply paying for “other people’s computers”. So what should your response be? Migration planning is essential for avoiding what Donald Rumsfeld called the “Unknown unknowns” – those problems that can’t be anticipated ahead of them coming up. The more comprehensive the planning stage, the more likely it is that problems won’t come up. As part of this, it’s worth looking at the following: What are your dependencies? As part of any move to the cloud, there will be steps where data may be held on both internal and external systems. During this migration, it’s worth understanding where other IT systems or business processes will be connected during each stage. This will help with project planning and critical path analysis, but also provide back-up support in the event of something going wrong. How are you migrating? There are multiple different ways to get data into the cloud, varying from fresh implementations of data through to in-place migrations that can be carried out while systems remain online. Discussing the migration process and what potential windows of downtime might come up will help you set expectations within the business. Is the business informed? 
Any significant IT project should have a communications plan for the business as well as the technical plans. It is always worth over-communicating on the status of any planned migration so that the business knows what the impact of the move will be. If there are any problems that might occur due to downtime, then you can collaborate on how to reduce those downtime windows with smarter tools or through alterations to the project timeline. Looking ahead in this way can help you ensure that any move to the cloud is not affected by downtime or poor user experience during the shift over. Even as IT becomes more complex under the covers, the migration process can be made seamless with the right processes, people and tools in place. This combination can ensure that investment in cloud can start paying back straight away. ### How is cloud changing the landscape of modern business? Everywhere you turn today in the modern world of the IT professional, everyone seems to be talking about cloud computing. In light of this I have decided to throw my hat in the ring and give my opinion on how I see the business sphere changing. What is every business interested in? I am sure a number of interesting and diverse reasons have gone through the minds of people reading this, but simply put it is ensuring the business stays in profit and costs are kept to a minimum. For a long time now IT has had the stigma of being a leech on business funds. By this I mean the return on IT investment is never fully realised by the business, due to the ongoing support and constant maintenance that come with these projects. Finally, cloud computing has given us a means to change this. If we consider the substantial costs that come with purchasing the software licences that a huge percentage of businesses need to function on a daily basis, the value created by these purchases is often unbalanced. However, cloud computing has given us a means to revolutionise this: the traditional model of purchasing software has been all but abandoned in favour of an open source approach. However, it is not only open source projects that are changing their approach to how their software is purchased. Even well-established companies have seen the need for change. They have identified that the organisations that use their software are either huge, ever-changing organisms that grow when times are good and shrink when the economy demands it, or small start-ups that cannot meet the costs associated with the traditional software licence system. On this basis, companies like Microsoft and Adobe have started to offer their products on a per-month, per-user basis that gives start-ups access to the same technology as huge enterprises for as long as required. Therefore we are seeing that cloud computing is changing the competitiveness of the business environment for start-ups, but we are also seeing how cloud computing is allowing larger organisations a great deal more flexibility and rapid elasticity - the ability to scale up and down when required, a defining characteristic of cloud computing. When you consider the budget-conscious world we currently inhabit, the cost of maintaining these software solutions is often four times the cost of the software itself. With cloud hosted solutions this maintenance cost is absorbed by the provider, thereby reducing the ongoing operational cost to the business - yet another example of cloud computing making IT more cost effective (a rough worked example follows below). 
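As a purely illustrative worked example of that cost argument, the short sketch below compares a traditional licence-plus-maintenance model against a per-user monthly subscription over three years. Every figure is invented for the sake of the arithmetic and is not real vendor pricing; the point is simply how the ongoing maintenance line, not the licence, dominates the traditional total.

```python
# Purely illustrative figures - not real vendor pricing.
USERS = 50
YEARS = 3

# Traditional model: one-off licence per user plus annual maintenance,
# reflecting the rough claim that maintenance can dwarf the licence cost.
licence_per_user = 400           # one-off purchase
maintenance_per_user_year = 300  # ongoing support, patching, upgrades

# Cloud model: a flat per-user monthly subscription; maintenance is
# absorbed by the provider.
subscription_per_user_month = 25

traditional_total = USERS * (licence_per_user + maintenance_per_user_year * YEARS)
cloud_total = USERS * subscription_per_user_month * 12 * YEARS

print(f"Traditional (3 years): £{traditional_total:,}")   # Traditional (3 years): £65,000
print(f"Subscription (3 years): £{cloud_total:,}")        # Subscription (3 years): £45,000
```

Change the assumptions and the answer changes too, which is exactly why this kind of simple modelling is worth doing before any purchase.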
We also have to acknowledge that cloud computing is not simply a means of replacing Capex with Opex, an argument often used to emphasise the benefits of cloud computing. It in fact lowers Opex as well as Capex. How is this done, I hear you say? Well, with the traditional method of providing IT services you are required to purchase the equipment (servers, routers, etc.), and you then have the added Opex of monitoring networking equipment, systems administrators, hardware vendors and so on. This, I believe, is the hidden treasure behind cloud computing: if you adopt cloud computing you move these substantial Opex costs to the cloud service provider, and you also need to factor in how this simplifies your infrastructure. You now only have to deal with one vendor instead of a host of vendors spread across networking, servers, data centres and so on. It is a firm belief of mine, gathered from years of business analysis work, that simplification of any problem allows for the greatest innovation and cost saving. I mentioned open source earlier in this article, and now seems as good a time as any to bring to light another factor I feel will change the business world we inhabit. This is going to be the rise of new industry leaders and IT vendors. Now, do not misinterpret what I am saying: the likes of VMware and Microsoft are not going anywhere. However, I do feel that we will see the rapid adoption of new technologies like OpenStack that will perhaps knock these established vendors off their pedestals. There is already evidence of this happening, with a reported number of larger organisations already looking to OpenStack for their private cloud needs. I believe there is a very simple reason for this: OpenStack was developed for the sole purpose of delivering private clouds, which means that these organisations can still abide by compliance requirements (PCI, ISO) whilst enjoying the cost savings that come from adopting cloud computing and open source software. On the back of this, another point to note for these organisations is that any acquisitions and mergers these companies take part in will be a far simpler process in terms of bringing the IT infrastructure together. Perhaps the biggest shake-up we are going to see is an increased awareness and leveraging of the Internet and web 2.0. This can already be seen in a number of retailers giving up their high street presence in favour of a digital platform. We are even seeing banks which, for a number of years, have relied on their high street presence for continued profits choose a digital-first approach, and the formation of digital banks such as Atom Bank, founded in early 2015 as the first UK bank to be completely digital, with no presence on the high street. Why is it we are seeing this new business approach? Simply put, it is down to cloud computing and its ability to automate and rapidly expand infrastructure to deal with increased traffic to sites or applications. Finally, and I feel the most important change we are going to see within the business world, is an acceptance of experimentation and innovation. Prior to cloud computing there would have been a lot of hesitation by the C-suite of any organisation to try to innovate or experiment using IT. The only clear reason I can see for this is that the cost of providing the resources for a project that could potentially fall flat on its face was too much of a risk for any organisation, regardless of its size. So how does cloud computing help this? 
Well, the cloud’s ability to let you pay only for what is used, for as long as it is used, greatly reduces that risk. If it is identified that a new project is not generating the expected outcome, all resources currently being utilised in the cloud can be stopped, and expenditure runs only up to the point at which the resources were stopped. This means there are no ongoing costs to the company, and no upfront costs either. An often overlooked benefit of cloud computing is its ability to rapidly replicate processes that are working. So if a project is generating greater-than-expected outcomes, using cloud computing that process can be duplicated almost instantly, allowing the company to capitalise and generate income from it for as long as there is a need or market for it to do so. In conclusion, cloud computing certainly brings with it an immense amount of change and uncertainty. It also brings the chance to change the very environment in which we currently work, allowing us to march on into the future innovating for the better. ### IP EXPO Europe 2015 adds Cloud heavyweights to speaker line-up Tenth annual IP EXPO Europe will encompass six sub-events to cover next-generation IT requirements IP EXPO Europe, Europe’s number one IT event, has today announced the addition of Mark Russinovich, Microsoft Azure CTO, and Barak Regev, Head of Google’s EMEA Cloud Platform, to the 2015 keynote speaker line-up. Russinovich and Regev will go head to head on the major IP EXPO Europe keynote panel session ‘The Future of the Cloud’ and join a prestigious speaker line-up that includes SAP’s Chief Technologist, Mark Darbyshire, F-Secure’s Mikko Hypponen, and Silent Circle Founder, Phil Zimmermann. The speaker programme will cover all aspects of next-generation IT requirements, including cyber security, data centre, DevOps, data analytics, and unified communication in addition to Cloud and Infrastructure. Mark Russinovich is Chief Technology Officer of Microsoft Azure, Microsoft’s cloud platform. He is considered one of Microsoft’s most important software engineers and a widely recognised expert in distributed systems. He is the driving force behind Microsoft’s cloud platform Azure and has led Microsoft to become the biggest code contributor to Docker. Barak Regev is Head of Google’s EMEA Cloud Platform and a key reason behind Google’s incredible cloud growth. He is a renowned cloud evangelist, focusing on the advantages of strong data storage and analytics capabilities to drive cloud as the next cycle in computing. Bradley Maule-ffinch, Director of Strategy for IP EXPO Europe, says: “Russinovich and Regev are considered two of the most important Cloud heavyweights in Europe and EMEA. We are very excited to welcome them to our Keynote Theatre this year. “As Cloud technologies continue to develop and become the norm for many businesses, learning from the experts is more important than ever. We look forward to hearing what both these Cloud pioneers will have to say.” The keynote panel session ‘The Future of the Cloud’ will take place at 12:00pm on Day One, 7th October, at the IP EXPO Europe keynote theatre. For further information on speakers, or to register FREE, visit: http://www.ipexpoeurope.com/. About IP EXPO Europe IP EXPO Europe is Europe’s leading IT event, designed for those looking to find out how the latest IT innovations can drive business growth and competitiveness. 
Now in its 10th year, the event showcases brand new exclusive content and senior-level insights from across the industry, as well as unveiling the latest developments in IT. It covers everything you need to run a successful enterprise or organisation.  IP EXPO Europe 2015 now features three new themes – Data Analytics Europe, DevOps Europe and Unified Communications Europe, joining the existing Cloud and Infrastructure, Data Centre and Cyber Security areas – incorporating six events under one roof for the first time ever, and making it the most comprehensive business-enhancing experience for those across IT, industry, finance and facilities roles. ### 5 key benefits of looking after your IT assets strategically Any savvy modern business must realise the value of their IT assets – be that financial, technological or operational – and look to manage them effectively in order to succeed. An effective IT asset management policy is one that is strategic, taking a short, medium and long term view of the ways to protect and upgrade the vital components of the business operation. In short, every organisation must know what IT resources it’s using, how well it uses them, how to protect those assets and what new software and hardware it might need next to consolidate or improve amid a fast-changing market. [easy-tweet tweet="No business can afford to become irrelevant" user="ComparetheCloud" hashtags="IT, assetmanagement"] Here are five key ways you’ll benefit from a strategic approach… Safety: This is at the heart of a strong IT asset management system. All software should be protected as robustly as possible and patches should be applied as soon as they are available – and in a way that is not too obtrusive to day-to-day operations - to ensure that you are benefitting from the most up-to-date safety for your assets. Failure to do so could leave you open to the threats posed by hackers. Licenses: Businesses can waste a significant amount of money on software licenses if they are not careful. It’s important to manage this properly, with a regular audit of what is being used and how effective that use is. Companies such as 1E offer software that can provide such information and once you have this data to hand you’ll be able to make informed, strategic decisions that could save you money on unused licenses or ill-matched software. Forward thinking: You need to keep up with the latest developments and assess what new technology might help your business – and ideally not after all of your rivals have signed up and raced ahead of you. Looking at the short-, medium- and long-term pictures will allow you to stay at the head of the game when it comes to the IT assets your business possesses, identifying the assets that need replacing – and their ideal replacement – before it becomes too late. No business can afford to become irrelevant, and strategic management is the trick to ensuring this does not happen. Money: This forward thinking approach also has practical benefits when it comes to your bank balance. It’s important to highlight the hardware and software assets you need to invest in at the earliest opportunity so that you can factor these into your budget. A strategic outlook involves healthier finances and less chance of a nasty big bill coming from out of the blue. Greener: The need to keep a lid on your bills and ensure your green credentials are strong should not be seen as entirely separate from the way you manage your IT. 
Strategic management of your IT should help you to identify poorly performing hardware that consumes copious amounts of power as well as unnecessary software that runs in the background, draining power and system performance. Using software that can ensure PCs boot up for updates at a specific time during an evening can save on energy, avoiding the need for terminals to be left on for long periods. IT asset management borrows the same fundamental principles as the management of any other business assets. Taking a strong strategic approach should ensure you protect your IT just as you would your buildings and machinery – and upgrade and repair at the earliest opportunity to ensure problems do not develop. ### Leonard Santalucia, Vicom Infinity - IBM Edge 2015, Las Vegas Leonard Santalucia from Vicom Infinity talks to us at IBM #Edge2015 Las Vegas to discuss cloud, Mainframe, Managed Service Providers and their recent success with airline reservation systems company Radixx. ### Layne Levine, GTT - IBM Edge 2015, Las Vegas Layne Levine from GTT joined us at IBM #Edge2015 Las Vegas to discuss GTT, the cloud, communications and the future. ### Pulsant Plays Host for Record Number of Online Edinburgh International Festival Tickets Pulsant’s eight year partnership with Edinburgh International Festival reaches new highs as more visitors than ever book online The Pulsant hosted Edinburgh International Festival (EIF) website has seen a surge in tickets sales for this year’s renowned event since the opening of the 2015 box office.  Pulsant supplies the enterprise cloud infrastructure to support the site with an architecture that incorporates hardware-based load balancing, caching, application servers and database servers. Pulsant previously facilitated a move from a private cloud to enterprise cloud solution when the festival’s website and externally hosted ticketing site were upgraded. Enterprise Cloud is more comprehensive and offers greater flexibility and improved performance by changing the architecture and adding more resource to the platform.  Chris Shields, regional sales director, Pulsant said: “We have worked with the Edinburgh International Festival for the last eight years and throughout that time they have seen a massive shift from internet access being a tool, to it being absolutely business critical. We are delighted to have played a supporting role in their record number of online bookings and wish the team every success at this year’s event.”  The EIF website experiences three main peaks during the year – the first during the launch of its programme, the second when tickets are made available for purchase, and the third when the festival actually begins.  Nicola Kenny, digital marketing manager, Edinburgh International Festival said: “With unprecedented numbers of visitors booking online this year, the constant availability of the website is key, as is the connectivity Pulsant provides between the website and the externally hosted ticketing system. Since tickets went on sale at the end of March, the buying process has gone very smoothly even during the busiest, peak periods.” This year's event runs from 7 until 31 August and will welcome more than 2,300 participants from 39 countries. 
Among the hottest tickets are Ivo van Hove's production of Sophocles' Antigone, starring French actress Juliette Binoche, the Citizens Theatre's production of Alistair Gray's seminal novel Lanark, the Komische Oper and 1927 theatre company's production of The Magic Flute and Budapest Festival Orchestra's The Marriage of Figaro. ### Windows Server 2003 end of support – Are businesses just hoping to be lucky? One constant in the technology industry is that technology continues to evolve at a rate that many find it difficult to keep up with. Microsoft, for example, is currently busy promoting the launch of its latest desktop operating system (OS) – Windows 10 – that will be available from July 29 in 190 markets around the world. The company’s expectations are high, with it professing a goal of putting Windows 10 on a billion devices within two to three years. As with its desktop OS, Microsoft’s Server solutions are also at a turning point, with countless businesses around the world facing up to the stark reality that as one server version comes into being; support for another one will drop by the wayside; now it’s the turn of Windows Server 2003. But managing any migration is not for the faint hearted.  The ticking time bomb  With around 61 per cent of businesses still running Windows Server 2003 as of March this year, there is a ticking time bomb facing UK businesses to address the security risks posed by the imminent ‘end of support’ by Microsoft on July 14. Two of the main reasons why UK businesses have tended to bury their collective heads in the sand are the lack of funds for the substantial financial outlay required and a distinct lack of time and internal resources to migrate to Windows Server 2012 before the deadline. It feels like we’ve been here before. Back in April 2014, Microsoft's official support for the beloved Windows XP ended. Even our own UK Government was unprepared and was forced to pay an additional £5.5 million for one year’s extended support for computers that hadn't been updated by the deadline. It’s not a problem confined to Whitehall. At the start of this year, NetMarketShare research highlighted that almost a quarter of PCs today still run Windows XP, down only slightly from last December’s figure of around 29 per cent, suggesting that many organisations haven’t yet made the leap away from this unsupported OS.  The writing’s on the wall Back to the present, where Microsoft will no longer issue security updates for any version of Windows Server 2003 after July 14. This is following mainstream support being switched off back in July 2010, so the writing's been on the wall a long time. But whilst businesses should have already started, if not finished, their migration to a supported version of Windows Server, a worrying percentage have not. In fact, approximately one in three enterprises are planning to run Windows Server 2003 after the deadline, so it’s imperative for businesses to look at alternative solutions to prevent data loss as a matter of urgency. We are all aware that failure to patch potential security holes in your system is a serious risk and not just from the potential ICO fines. Vulnerability and patch management isn't easy. In fact, in today's computing environment, it's a never-ending cycle and the task is made far more difficult when the patches aren’t being ‘pushed’ to your user’s desktops. Therefore, businesses need to now seriously consider whether to either make the move to a new version of Windows Server on-premise or to move to the cloud. 
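Whichever route you take, a sensible first step is simply knowing how many machines are still on the old platform. Below is a minimal sketch of such an audit, assuming a hypothetical CSV export from your asset-management or monitoring tool with `hostname` and `os` columns; the file name and column names are invented for illustration and do not refer to any particular product.

```python
import csv
from collections import Counter

# Hypothetical export from an asset-management or monitoring tool.
INVENTORY_FILE = "server_inventory.csv"   # columns: hostname, os
UNSUPPORTED = "Windows Server 2003"

def audit_inventory(path: str):
    """Count operating systems and list hosts still on the unsupported version."""
    os_counts = Counter()
    at_risk = []
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            os_name = row["os"].strip()
            os_counts[os_name] += 1
            if os_name == UNSUPPORTED:
                at_risk.append(row["hostname"])
    return os_counts, at_risk

if __name__ == "__main__":
    counts, hosts = audit_inventory(INVENTORY_FILE)
    print("OS breakdown:", dict(counts))
    print(f"{len(hosts)} host(s) still on {UNSUPPORTED}:", hosts)
```

An accurate count of at-risk servers, however it is produced, is what turns a vague sense of urgency into a migration plan with a scope and a budget.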
However, a simple migration to the latest version of Windows Server might not be easy. Businesses increasingly build their own bespoke applications to meet their specific business requirements. This DIY approach can save them license fees and ensure they have something that will fulfil whatever unique business pain they are trying to solve, however these applications are often just designed to work on whichever server platform they were originally designed for. Therefore migrating to Windows Server 2012 – whether on-premise or in the cloud – can open up a plethora of compatibility and integration issues. So what can you do if this is the case or you have simply left it too late? What if the deadline is looming yet you don’t have the internal resources, or indeed the funds, to facilitate such a large migration of your organisation’s systems? Another way With the security issues faced by many businesses following the end of support for Windows XP still fresh in the mind, there is strong demand for an alternative solution to full migration. In particular, an interim solution that allows companies to maintain security and business continuity whilst planning resources needed for full migration. By using a combination of cloud management and migration technologies, businesses can migrate a physical or virtual server machine quickly and simply into the cloud. More importantly, firewalls tailored to address the vulnerabilities of Windows Server 2003 can then be added around these servers to immediately protect them from threats, whilst still allowing businesses to access their critical data. This enables businesses to address the challenge of application migration without putting vital business information in danger and migrate Windows Server 2003 to the cloud with the security required to continue without support until full migration to Windows Server 2012 is completed. The end of support for Windows Server 2003 is an issue a substantial number of businesses are facing and one that needs to be addressed with immediacy. According to Microsoft, there are 11 million Windows Server 2003 servers still running. If your business has not taken action yet, it’s likely your critical data may be at risk. Migrating can be daunting but it is imperative to find a solution which will protect your information and buy some more time before investing in a full move to Windows Server 2012. ### Windows 2003 will expose 1/3 businesses to major security risks after deadline One in three enterprises planning to run Windows 2003 after deadline are exposed to major security risks Fusion Media Networks and Abiquo build Windows 2003 migration service driven by Double-Take to give immediate protection for businesses unable to address full migration to Windows Server 2012 With around 61 per cent of businesses still running Windows Server 2003 as of March this year, connectivity and business communication specialist, Fusion Media Networks has combined technologies with hybrid cloud software provider, Abiquo and provider of disaster recovery, high availability and migration software, Vision Solutions, to address the security risks posed by imminent “end of support” by Microsoft of Windows Server 2003. These combined technologies offer a unique service to businesses, enabling them to migrate a physical or virtual server machine quickly and simply into their Abiquo- or Azure-based cloud using Vision Solutions’ technology, which would then be managed with Abiquo’s solutions to ensure the customer still has full control. 
Firewalls can then be added around these servers to protect them from threats, which allows businesses to address the challenge of application migration without putting vital business information in danger. The solution allows migration of Windows 2003 servers to the cloud with the security required to continue without support until full migration to Windows 2012 server is completed. The service offers an interim security solution for businesses who lack the resources to migrate to Windows Server 2012, before the deadline. With approximately one in three enterprises planning to run Windows Server 2003 after 14th July, it’s imperative for businesses to look at security solutions as a matter of urgency.  “In response to the security issues faced by many businesses following the end of support for Windows XP, there is strong demand for an alternative solution to full migration. This interim solution, using combined technology from ourselves, Abiquo and Vision Solutions, allows companies to maintain security and business continuity whilst planning resources needed for full migration,” commented Lee Norvall, Technical Director at Fusion Media Networks. “The end of support for Windows Server 2003 is an issue a huge number of businesses are facing and one that needs to be addressed with immediacy,” said Ian Finlay, COO of Abiquo. “According to Microsoft last month, there were 11 million Windows 2003 servers still running and with less than a month to go before end of support, businesses that have not taken action will be at risk. Migrating can be daunting but this offers a solution which will protect your information and buy some more time before investing in a full move to Server 2012.” ### June Top 50 #CloudInfluence Individuals In an amazing turn around the man that topped this month’s rankings is the same man that in September 2008 completely dissed cloud computing while speaking at an analyst conference. [quote_box_center]“The interesting thing about cloud computing is that we’ve redefined cloud computing to include everything that we already do. ... The computer industry is the only industry that is more fashion-driven than women’s fashion. Maybe I’m an idiot, but I have no idea what anyone is talking about. What is it? It’s complete gibberish. It’s insane. When is this idiocy going to stop?"[/quote_box_center] Yes, top of the individual rankings for June 2015 is none other than Larry Ellison, founder and CTO at Oracle. One day I am sure we will tire of reminding Larry of this quote, but not yet! [easy-tweet tweet="The #CloudInfluence #1 spot for June 2015 goes to @LarryEllison CTO @Oracle" user="Comparethecloud"] This month Larry was in full cloud evangelism mode for Oracle: [quote_box_center]"We're now prepared to call our application suite complete, which means that you can now move everything in your data centre to the Oracle cloud," Ellison said, before adding. “We’re prepared to compete with Amazon.com on price.”[/quote_box_center] He’s obviously now a complete cloud convert, joining the idiots and matching their gibberish. Can Oracle be the firm to take on AWS? It certainly has the breadth and scale: it spans the full spectrum of infrastructure-as-a-service (IaaS), platform-as-a-service (PaaS) and software-as-a-service (SaaS). It can also control everything from the iron right up to the top of the software stack. Add to this its corporate clients list, and its willingness to take on complex hybrid environments that AWS has so far shied away from and it makes a formidable competitor. 
It also has a significant cash cow from its licensing and maintenance to fund its cloud expansion and a considerable lock-in with its database clients. Contrast this with the public cloud market share leader: as its recent financial disclosure showed, AWS may be profitable, but its parent company Amazon is hardly making any profit at all. Pipped to top spot this month was Satya Nadella from Microsoft, who managed to stay just ahead of Larry’s Oracle colleagues Mark Hurd and Safra Catz, who joined the Oracle cloud love-in. Then came Mark Reuss, President of General Motors, talking about making the computer systems in the firm’s new cars cloud-based. Before you get carried away and think that we’re talking about flying cars here, it’s just the automotive and telematic data from the cars that will be in the cloud. And control will remain with the driver for the foreseeable future – driverless cars still being some way off. Drew Houston, Founder and CEO of Dropbox, explained this month how, after years of user growth, the firm needs to expand beyond its consumer focus and attract more lucrative business clients. This is why he has appointed Dennis Woodside, an 11-year Google veteran who was most recently Motorola’s CEO, as Dropbox’s first-ever chief operating officer. Then we have Daniel Ives from FBR Capital Markets, a regular in our rankings, commenting this month on VMware and Oracle among others. He was followed by Tim Cook explaining how Apple is expanding user privacy to prevent iOS apps from seeing other installed apps. David Wadhwani, SVP and GM of Digital Media at Adobe, revelled in the announcement of Creative Cloud 2015, which he described as Adobe’s most powerful and comprehensive release to date. Rounding out the top ten this month was Microsoft Azure CTO Mark Russinovich, commenting on a host of advances to Azure including SDN and containers, as well as celebrating his firm’s inclusion as a leader in Gartner’s Public Cloud Storage Services for the second consecutive year – see our special report on cloud storage. Other notable appearances in this month’s rankings include: Jim Reavis, executive director of the Cloud Security Alliance (CSA), who rose to prominence arguing for the standardisation of security APIs to help the ecosystem coalesce around a universal language and process for integrating security tools into cloud applications. Mike Wynne, GIS Business Analyst for the City and County of San Francisco, who was thrilled that his city had been selected as a 2014 City on a Cloud Innovation Challenge winner by AWS for its innovative use of … er … AWS. And finally Iranian Brigadier General Gholamreza Jalali, who attracted attention by commenting on Iran’s ban on the use of smartphones, arguing that these handsets are a perfect cocktail of security risks as they often back up their data to the cloud. The move may impact staffers, but the Iranian government will likely consider it a worthwhile sacrifice if it prevents other nations from spying on its political manoeuvres. [table id=36 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. 
### Understanding the Difference between NoSQL and SQL Hadoop is not a type of database, but rather a software ecosystem that allows for massively parallel computing. It is an enabler of certain types of NoSQL distributed databases, which can spread data across thousands of servers (a cluster) with little reduction in performance. There has been a lot of talk about NoSQL over the past few years, but most people still do not know the difference between NoSQL and SQL. Understanding the differences and the benefits and drawbacks of each database model will help you make informed decisions. 1. Whereas SQL databases are relational, hence the name relational database management system (RDBMS), NoSQL databases are non-relational or distributed. SQL databases are table-based whereas NoSQL databases are document stores, graph databases, wide-column stores, or key-value stores. Structured Query Language is where SQL gets its name. In a NoSQL DB, queries focus on collections of documents. This is sometimes called Unstructured Query Language (UnQL), and UnQL syntax varies from one database to the next. 2. In a SQL database the data is in the form of tables consisting of a number of rows, whereas data in NoSQL has no standard schema definition that it has to adhere to. NoSQL DBs have a dynamic schema while SQL databases use a predefined schema. 3. NoSQL DBs are horizontally scalable while SQL DBs are vertically scalable. To scale NoSQL DBs, increase the number of DB servers in the cluster for load balancing. To scale SQL DBs, increase the horsepower of the CPU, SSD, RAM and other hardware on the server. This means NoSQL is the best option if scalability is a major consideration. 4. SQL is a declarative query language: you state what you want, such as which columns to display, and the DB assembles a query plan internally and extracts the results. With NoSQL, MapReduce is a procedural query technique that requires you not only to know what you want, but to state exactly how to produce the answer. This closer interaction with the data can yield new insight that helps with product development. 5. SQL has been around for a while and this explains why it is standardised. Although some vendors introduce dialects to their interfaces, the core is standardised, and additional specs like JDBC and ODBC provide stable and broadly available interfaces to SQL stores. This allows for an ecosystem of operator and management tools to help in the designing, monitoring, inspection, exploration, and building of applications on SQL systems. This means SQL programmers and users can reuse their UI and API knowledge across different backend systems, thereby reducing the development time of applications. Standardisation is also important in that it enables declarative third-party ETL (Extract, Transform, Load) tools. These tools enable you to flow data across systems and between databases. [easy-tweet tweet="Big data facilitates predictive analysis, which will keep you one step ahead of your competitors." hashtags="bigdata"] 6. SQL DBs are a better fit for complex query-intensive environments. This is because NoSQL does not have a standard interface for performing complex queries at a high level, and NoSQL queries are not as powerful as SQL queries. 7. NoSQL DBs are the better fit for hierarchical data storage, because NoSQL follows a key-value or document storage model similar to JSON data (the short sketch below contrasts the two approaches). 
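To make the table-versus-document distinction, and the declarative-versus-procedural query style in point 4, concrete, here is a minimal sketch. It uses Python's built-in sqlite3 module to stand in for a relational database and a plain list of dictionaries to stand in for a document store; the customer data and field names are invented purely for illustration.

```python
import sqlite3

# --- Relational / SQL: fixed schema, declarative query ---------------------
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER PRIMARY KEY, name TEXT, city TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "Ada", "London"), (2, "Bob", "Leeds")])
# We state *what* we want; the engine decides *how* to fetch it.
rows = conn.execute("SELECT name FROM customers WHERE city = ?", ("London",)).fetchall()
print([r[0] for r in rows])          # ['Ada']

# --- Document / NoSQL style: schemaless, JSON-like documents ----------------
customers = [
    {"_id": 1, "name": "Ada", "address": {"city": "London"},
     "orders": [{"sku": "A1", "qty": 2}]},                    # nested, hierarchical data
    {"_id": 2, "name": "Bob", "address": {"city": "Leeds"}},  # a different shape is fine
]
# We spell out *how* to produce the answer (a procedural filter).
print([c["name"] for c in customers if c["address"]["city"] == "London"])  # ['Ada']
```

The same question is answered both ways; the difference is who does the work of deciding how, and how freely each record can vary in shape.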
That document-style storage makes NoSQL the best option for big data, even though today most SQL vendors have added JSON-type support as well as XML document support. 8. While it is possible to use NoSQL for average transactions, SQL is the best choice for heavy-duty transactional applications. NoSQL is not yet stable enough when loaded with complex transactional applications and high traffic, whereas SQL DBs are a good fit because they are stable and promise integrity and atomicity of data. This is mostly because NoSQL has not been around for as long as SQL. 9. You will get better support with SQL databases, mostly because they have been around for longer. When you hire a team of remote DBA experts, chances are that most of the DBAs have training in SQL DBs, although more and more remote DBA teams today have experience with NoSQL because of the increased demand. SQL DBs are also more reliable since they have been tried and tested over many years. 10. With SQL DBs, the emphasis is on the Atomicity, Consistency, Isolation and Durability (ACID) properties. NoSQL DBs, on the other hand, follow Brewer's CAP theorem (Consistency, Availability and Partition tolerance). 11. SQL databases are either closed-source products from commercial vendors or open source, whereas NoSQL databases are generally open source. This means NoSQL is often the better choice if you want to save money. Popular examples of SQL databases are Oracle, MySQL, MS SQL Server, SQLite and Postgres, while popular examples of NoSQL are MongoDB, HBase, Redis, BigTable, RavenDB, CouchDB, Cassandra and Neo4j. Why Big Data? Big data solutions are applicable for the processing and analysis of big data. Major sources of big data are black box data, power grid data, social media data, stock exchange data, transport data, and search engine data. You should go for big data management solutions, be they SQL or NoSQL, because: 1. Big data analysis allows you to get customer information that will allow you to treat your customers as individuals. Today’s customers are very demanding and they want to connect one-on-one and to do so in real time. Information on what customers are complaining about will also help you in reputation management. 2. With big data, you can test different variations of CAD (computer-aided design) images. The information you get will help you determine how a change will affect your product and processes. 3. Big data facilitates predictive analysis, which will keep you one step ahead of your competitors. You will be able to analyse social media feeds as well as newspaper reports to get information that will help you take advantage of certain opportunities. Big data also allows you to do health-tests on customers and suppliers, thereby reducing defaulting rates. 4. Big data analysis will keep your data safe, since there are tools to help map a company's data landscape for the purpose of analysing internal threats. As an example, you will be able to flag the emailing and storage of 16-digit numbers (credit card numbers have 16 digits). 5. You will be able to diversify your revenue with big data, since the trend data you gather will help you identify new revenue streams. The above should give you a sense of the differences, benefits and drawbacks of each database model, and of why you should consider big data management solutions. ### Know your options around Cloud and DR Protecting data is an important remit for IT. Cloud computing models can help enormously here, particularly for companies that either can’t or don’t want to run their own secondary IT operations. 
However, it’s important that people know what they are really buying when they look at using the Cloud for their disaster recovery (DR) and business continuity operations. Not all the options are created equal. [easy-tweet tweet="For most companies, storage is storage, regardless of where that data actually physically resides." user="MastersIan"] To help with this, below is a round-up of some of the options that companies have when it comes to using Cloud services. Without going into the details of recovery time objectives and recovery point objectives, it’s important to understand how each of these options work in practice when it comes to recovery of data. Option #1 – Storage For most companies, storage is storage, regardless of where that data actually physically resides. Cloud offers the ability to store huge amounts of information over time. Rather than paying for the hardware and managing it yourself, Cloud providers are falling over themselves to offer storage to customers at knock-down prices. However, while storing data in the Cloud is an option to protect yourself against data loss, it does not include the ability to recover those files later. This still has to be done manually. While uploading of information and files can be automated, that is also not automatically included as standard. The point here is that you will get what you pay for, which in the case of Cloud storage is not much beyond storing the data itself. Option #2 – Cloud Backup Backup takes things further than simple storage by adding automation to the process. Each backup can ensure that the latest versions of files are saved, while it can also be possible to host multiple versions of those files too so that issues like corruptions can also be dealt with. The purpose of a backup is to save your bacon when something goes wrong. That means that recovery has to be considered too. While backups may be stored in the Cloud, it’s important to understand what the recovery process is as well. Customers may be restoring to their own IT assets, to a managed service provider’s kit or to another platform; whichever is the right approach for them. Option #3 – Cloud Recovery, or DRaaS Full recovery in the Cloud and Disaster Recovery as a Service (DRaaS) options cover the ability to use the Cloud as a platform for recovery of workloads. For the customer, there is no requirement to haul out new servers and storage to get their critical applications back up and running quickly; instead, the Cloud provider can start the applications and then point users at these server instances. The aim here is to make the whole process easier both for the IT team and for the employees and customers as well. Rather than requiring hours for recovery while on-site staff set up and execute the DR plan, new instances can be up and running in minutes while the switchover is seamless. Running in the Cloud can make sure that services remain useful, while the IT team solves the problems and returns systems to normal. DRaaS offers a better level of service for customers and more automation of processes, so it can attract a more premium price for the service provider. Getting the right mix All these services may be combined in order to meet the needs of each customer. For example, some workloads may not suit the DRaaS model as they are on non-x86 hardware. 
In this case, it’s probably more accurate to describe any continuity or DR offering as a managed service rather than “Cloud.” However, the end result should be the same: easier management of business continuity for the customer. For businesses looking at how to make use of Cloud for their operations, moving DR into the Cloud can offer the best option when it comes to reducing the potential amount of downtime around any issue. However, it’s also important that this is not seen as just being an insurance option. Instead, it can open up new ways of working that improve the overall performance of IT in general. Knowing your options can help turn this potential into reality. ### Are Russia's gates to the West closing? The ongoing conflict in Ukraine, EU sanctions and unauthorised military flights near UK airspace. Not since the days of the Cold War has Russia felt so distant from the West. The latest action from the Kremlin sure to drive a wedge between Russia and the outside world is the country’s data sovereignty laws, which threaten to usher in a type of digital Iron Curtain. Back in July 2014, the State Duma or parliament passed a law decreeing that all personal data concerning Russian nationals must be stored within the country, meaning social networks and other online businesses must have a physical presence in order to operate there, whether that means renting storage from a domestic data centre or opening a Russian branch of their own. The new ruling comes into effect on 1st September 2015, meaning businesses do not have long to decide on their next move. Companies will be reticent to abandon a country with a population in excess of 140 million and a growing online audience standing at approximately 50 per cent of that figure. eBay and PayPal are just two of the companies to have agreed to the ruling, with both firms asserting that they will be moving data regarding their Russian customers ahead of the deadline. However, further legislation passed in the last few months will also make it more challenging for businesses to relocate within the Russian Federation. In late February, the Duma increased the potential fines for businesses that flout the data protection rules, while in May, a law was passed allowing the Russian government to ban any foreign or international NGO that it deems to be “undesirable.” While Russia has always been viewed as having a more draconian Internet policy – it has been accused of overzealous online censorship in the past – the new legislation is unexpectedly oppressive. However, businesses should not view the new government restrictions as impenetrable, but rather look at the ways in which technology can enable them to continue their relationships with Russian customers. Under the new rules, businesses have the opportunity to explore other ways of hosting data outside of on-premise solutions. Data centre operators based in Russia such as IXcellerate offer companies the option to transition their data from its present location to one which complies with the current legislation. [easy-tweet tweet="Companies should be looking to cement their online presence in Russia now." user="@IXcellerate and @Comparethecloud"] Migrating data from one physical location to another can be a challenging and time consuming process and many businesses will be concerned about potential disruption to their services, integration with existing networks and security issues. 
By enlisting the help of a Managed Service Provider (MSP), companies can host their data in Russia, while leaving the rest of their operations uninterrupted. The benefit of cloud technology means that hosting data abroad is a much smoother process than it was even just a few years ago. By enabling organisations to host data in different parts of the world they are able to serve a truly global customer base while complying with regional data laws. Choosing the right MSP will, of course, be crucial and companies will likely be drawn to organisations with experience of dealing with Russian data law, particularly given its propensity to change in recent months. IXcellerate also offers businesses strong connections with the UK and other parts of Western Europe, placating concerns of geographical divisions arising within their organisation. Businesses currently serving Russian customers but storing data overseas have an important decision to make over the next few months. Access to certain sites has already been blocked by the state regulator Roskomnadzor and any sites that do not comply with the ruling by the 1st September deadline will be shut down. In spite of the restrictions, companies should be looking to cement their online presence in Russia now. Internet usage in the country is growing rapidly and Russian is now the second most used language online after English. If companies want their voice to be heard globally, a presence in the country is essential and cloud hosting within the Russian Federation gives businesses the means to achieve this.  The new data protection laws are challenging, but if companies are willing to adhere to them they will gain access to a wealth of opportunities in the Russian market. ### Finding the Balance between Agility and Control for MSPs How web access management enables MSP’s to deliver business Agility, whilst maintaining Control over Security & Costs. Many companies struggle to find the right balance between enabling business agility and increased user productivity, whilst maintaining control over escalating project and IT Security costs. Businesses typically have a core set of applications requiring user access whilst they are in the office, on the move, or at home.  Software-as-a-Service (SaaS) solutions are increasingly being deployed alongside these applications to meet specific user group requirements.    With a growing mix of On-Premise, hosted applications and SaaS, companies need to find ways of enabling these environments to co-exist. They are also aiming to provide a seamless Web user experience and trying to maintain control, without going through costly and time consuming user ID consolidation projects.  Larger organisations in particular can find it difficult to launch new agile web applications due to the number of different ID stores incorporated within their existing infrastructure. MSPs and clients are usually faced with the task of linking these various identification systems together before online applications can be deployed – a serious investment in terms of both time and money. However, web access management (WAM) tools like SES make this a far simpler process. Achieving security is about more than just implementing the right software By linking ID systems, whether internal or in the cloud, WAM software can give businesses a competitive edge. Legacy systems can remain in place while new agile web solutions are launched, enhancing business agility and giving software users a smoother and more integrated experience. 
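As a toy illustration of the single sign-on idea described above, the sketch below shows one identity service issuing a signed token that several applications can verify against a shared secret, so none of them needs its own user store. This is a deliberately simplified stand-in, not how SES or any specific WAM product works; production systems use established standards such as SAML or OpenID Connect, and the secret and claims here are invented.

```python
import base64
import hashlib
import hmac
import json

SHARED_SECRET = b"demo-secret-do-not-use"   # illustrative only

def issue_token(claims: dict) -> str:
    """Identity service: sign the user's claims once, at login."""
    body = base64.urlsafe_b64encode(json.dumps(claims).encode())
    sig = hmac.new(SHARED_SECRET, body, hashlib.sha256).hexdigest()
    return body.decode() + "." + sig

def verify_token(token: str):
    """Any participating application: check the signature, then trust the claims."""
    body, _, sig = token.partition(".")
    expected = hmac.new(SHARED_SECRET, body.encode(), hashlib.sha256).hexdigest()
    if not hmac.compare_digest(sig, expected):
        return None                          # tampered or unknown token
    return json.loads(base64.urlsafe_b64decode(body))

token = issue_token({"user": "a.smith", "groups": ["finance"]})
print(verify_token(token))                   # the CRM, intranet, etc. all reuse this same check
```

The practical point is the one made above: the applications themselves stay untouched, because the decision about who the user is has been lifted out into a shared layer.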
Web access management is a great tool for authenticating, authorising and auditing who is able to access your company data and resources. Do you want your employees to work more efficiently by avoiding multiple log-ins? If so, Single Sign-On can help you achieve this. If businesses need to implement multi-factor authentication for particular datasets, then United Security Providers' (USP) risk-based approach to online resources can provide flexible security options. USP can also help protect your internet applications by helping to provide confidentiality and integrity. Application firewalls and customisable security mechanisms have led to companies across many industries adopting WAM software, whether they need baseline security or e-banking-level protection. In fact, there are many facets of web access management, but it is worth remembering that they are only as effective as the security policy underpinning them. Achieving security is about more than just implementing the right software. Web access management is a hugely valuable tool for securing your organisation, but it needs to be utilised as part of a broader security strategy. Before opting for any security software, businesses must decide exactly what they want from it. Employees should be informed of what data they can access and which information can be shared externally. Companies must also have a strong understanding of the data protection laws in their country to understand the regulations governing data use. Given that keeping your online applications secure can be a complex task, it is worth remembering that web access management, like other cloud packages, is an ongoing service, not a one-off product. Many vendors will offer specialist advice and consulting skills alongside the software package itself, so businesses would be wise to make use of the security expertise available to them. For more information please visit www.web-access-management.com ### CTC presented at IBM Edge 2015 Compare the Cloud's Neil Cattermull and Andrew McLean provided their unique insights into quickly growing hybrid cloud adoption rates, and using cloud technology to address business challenges across industry, at IBM Edge 2015. ### Is Big Data giving advertising a Bad Reputation? Although 'Big Data' and 'Cloud Computing' are two very distinct disciplines of IT, in the view of most end-users they are intrinsically linked. So how are they linked? We have all probably heard of names such as MapReduce and Hadoop; these are the underlying technologies that enable the move from unstructured to structured data that can be analysed. The short-term, 'peaky' nature of these workloads lends itself very well to the cloud computing model of technology on demand. But from a cloud platform provider's perspective, the rental of these processing or computing burst technologies and associated tools is the demarcation point. A cloud infrastructure (IaaS, PaaS) provider would never release data to a third-party data broker; the ensuing outcry would not lend itself to business longevity. [easy-tweet tweet="The age of creepiness is here." user="comparethecloud and @MrAndrewMcLean"] The age of creepiness Today there are a number of companies that provide 'data services' to enable 'targeted advertising' and a 'relevant experience'.
I call total BS on this. What these so-called companies are doing is seeking to go beyond segment marketing, where you target customers by location, age range or other demographics, to individualised marketing, where individuals are targeted using unique identifiers. These identifiers include, but are not limited to, personal interests or even sexuality, as well as a host of other highly personal particulars. They can range through to your relationship with, and the behaviour of, other family members, and a range of other information points that are labelled as 'contextual' - or, as I'd prefer to term it, 'creepy'. The privacy trade-off Your data is worth more to a provider than the smartphone game you play or the online dating site you visit. Every time you fill in a form, launch an application or sign up to an online email service, you are asked to agree to a policy which is typically a very long document written in legal jargon, in font size 9, that you never really read but that gives permission for everything. The question is: if you use the application or software for free, does that trade-off matter? I would say that it depends on whether this is made clear in legible, highlighted terms. But what if you pay for that online dating website or application? Should that data be part of the devil's bargain? The Cookie Jar and PII (personally identifiable information) Like a greedy 'Cookie Monster' dipping into the jar of cookies, these creepy companies are hoarding information on us all like at no other point in history. The claim to anonymity is gone, and the technology driving this is 'Big Data'. When hundreds of ad-technology companies are able to synchronise your cookie data across all platforms to stitch all that disparate information together, this is troubling. So what kind of information are these Big Data platforms tracking?
- Whether you are married
- Your sexuality
- Your credit card purchasing habits
- Whether you enjoy football
- Your magazine subscription preferences
- Online survey responses saying you like chicken
- Your online life
- Your location and where you have travelled
- Your spouse's friends and family, and how your preferences compare
- Geo-targeting based on your (assumed) income bracket
- Your work history, including previous employment
- Your search history
- Your browsing history
The issue is that your online life is tracked by small pieces of code, and because deleting them wholesale disrupts remembered passwords and automatic logins, clearing your cookies becomes a regrettable chore. The secondary issue we face is the hidden aspect of what I call 'stealth monsters': those that throw out apparently useful free apps in order to grab your history. Privacy is a right, not something that should be earned, and those that go 'all-in' with these companies need to take a long hard look at themselves. Yes, your advertising will be very 'contextual' and 'targeted', but at what price? That price is insecure mobile applications, data leakage and stealth marketing under the guise of 'contextual'. There is a danger here on all devices, but it is typically most acute on the personal devices that we use to play games, use social media and subscribe to publications or news services. This is why organisations need to gain control of BYOD (bring your own device) and other personally owned elements of IT infrastructure such as tablets and smartphones. End-user education is key. Explain to your staff the risks that web forms and smartphone apps bring to your IT network, and show them how to retain privacy.
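For the technically curious, here is a deliberately tiny, in-memory illustration of how that stitching happens: it is the MapReduce pattern mentioned at the top of this piece, reducing a pile of individually innocuous, unstructured events to a structured per-user profile. The tracking IDs and events are invented, and a real pipeline would run the same three phases distributed across a Hadoop or similar cluster over billions of records.

```python
# Illustrative only: a toy, in-memory MapReduce turning unstructured tracking
# events into structured per-user profiles. All data here is made up.
from collections import defaultdict

raw_events = [
    "cookie42 viewed football-news",
    "cookie42 searched cheap-flights-madrid",
    "cookie42 purchased chicken-recipe-magazine",
    "cookie99 viewed knitting-patterns",
]

# Map: turn each unstructured log line into a (key, value) pair keyed on the tracking ID.
mapped = []
for line in raw_events:
    tracking_id, _, action = line.partition(" ")
    mapped.append((tracking_id, action))

# Shuffle: group every value recorded against the same key.
grouped = defaultdict(list)
for tracking_id, action in mapped:
    grouped[tracking_id].append(action)

# Reduce: collapse each group into one structured profile record.
profiles = {
    tracking_id: {"events": len(actions), "interests": sorted(set(actions))}
    for tracking_id, actions in grouped.items()
}

print(profiles["cookie42"])  # one cookie, three 'harmless' events, one rather personal profile
```

Swap the toy list for clickstreams, ad impressions and synchronised cookies from hundreds of sites, and the output is exactly the kind of 'contextual' dossier described above.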
Start demanding information on where the information from your magazine or newspaper subscription is going - and ask if it's being supplemented or synchronised with the 'cookie monsters'. If it is, find out how to opt out, and don't buy the story of 'non-personally identifiable information', because when data is synchronised across advertising and certain social media networks, 'context' always identifies who you are. Hey, but don't worry: the cool-looking dude or dudette from the advertising company, in their Gucci loafers and (soon-to-be-worn) Apple Watch, will tell you it is 'all cool and contextual'. On a more personal note: how about using Big Data technologies to cure cancer, or solve world issues, or something meaningful? Not to be Big Brother and track us all. ### Are driverless cars vulnerable to hackers? You may remember some rather unsettling headlines from May this year, when a cybersecurity consultant caused a minor furore by announcing that he had managed to take control of the engines of the plane he was travelling on by hacking into the onboard wi-fi system. Security and aviation experts piled on to deny that this was possible, but by then it had garnered worldwide media attention, reflecting public concern that if computers can be taken control of remotely, then so can vehicles with network-connected computers inside them. Mission Secure, an American security firm involved in cyber defence, agrees. It fears that with the volume of technology in today's modern cars, hackers will soon be targeting them. One of the big arguments behind driverless technology is that it has the potential to make our roads safer. This is of major interest to everyone, from manufacturers to dealerships such as www.jenningsmotorgroup.co.uk, to the car-driving public. Everyone knows that the next big innovation in the motor industry is probably going to be driverless tech. Google have their own test track going in California, and the British government has plans to introduce a live testing scheme in several UK cities soon. By 2020, or maybe even sooner, driverless vehicles could be on the roads. But if you're not in control of your car, could someone else be? Is there any truth in the idea that a driverless car might be vulnerable to hacking? So why would someone want to hack into a car's technology system? There are several potential reasons. Firstly, it could be to steal cars remotely, with the thieves themselves far away. They could even disable wireless-connected police vehicles in pursuit. Deliberately crashing a car could be used to settle a score, or to cause mass congestion - researchers at the Universities of California and Washington have in the past managed to infect driverless vehicles with computer viruses and were able to cause them to crash. It could even form a kind of terrorist attack. These ideas may seem outlandish, but they are certainly enough to trouble the experts. It's said that Google even has an entire team of crack programmers looking for security flaws in their vehicles and ways to close them. Ironically, the more complex we make our in-car technology, often with the worthy aim of making passengers safer, the more loopholes are left for hackers to exploit.
One expert from the Institution of Engineering and Technology has said that 90% of applications contain defects - and the average vehicle contains some 10 million lines of software code. With the introduction of driverless tech you could easily be looking at hundreds of millions of lines that need to be just right for everything to work perfectly - there are bound to be a few bugs in there somewhere. So that covers some of the risks. What about the solutions? The first, and most obvious, is to keep vital driving control functions such as braking and steering offline and unconnected to any network. But that doesn't exactly tally with driverless technology, which will be using radar and GPS systems to locate traffic problems ahead and avoid them. Then there is keeping as much software in-car as possible, and minimising what is online. And lastly, introducing black boxes to cars, much as they exist in planes, to establish the causes of any incidents and correctly assign blame - system fault, driver error, or external factor. For the time being though, the prevailing theory is that, rare as such events are likely to be, if something is network-connected then dedicated hackers will somehow be able to get into it. Obviously the number of people who would want to do something like this is very low, and the number of people who have the resources and skills to do something even if they wanted to is lower still. So although there is a danger, maybe it's not something worth getting too worked up over. There are, after all, many other issues to do with the concept of driverless cars - such as reaction times slowing - that are more of a concern. ### Gordon Davey, Dell UK - Live from Cloud World Forum, London Interview with Gordon Davey from Dell UK as part of our live Afternoon Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 25/06/2015. ### June Global #CloudInfluence Organisations Overall the cloud attention index dropped gently in June as we approach the summer holiday season (in the northern hemisphere). This gives us a chance to reflect on where some of the key players stand. Microsoft and Amazon have maintained a hold on the highest levels of attention, both now being described by analysts as the stand-out players - certainly in public cloud. Looking back 90 days to early March shows the full extent of their lead at times. Apple's leadership in the consumer/mobile space enables it to secure third spot, but beyond this we start to look at players that are struggling to maintain any real global prominence in cloud. Google appears to be losing ground to its two main rivals, but remains a somewhat unknown quantity - being one of the few major players yet to break out any financials for its cloud operations. Larry Ellison, who appears prominently in our forthcoming ranking for individuals, helped boost Oracle into fifth this month. In a live event at its headquarters in Redwood Shores, Oracle announced the addition of 24 new platform and infrastructure services to the Oracle Cloud, signifying the firm's all-out strategic focus on cloud. Intel was a launch partner in the new Marketplace Alliance Program (MAP) from Aliyun, Alibaba Group's cloud computing arm, and also acquired Altera for $16.7 billion. Announcing impressive results and an expanded Creative Cloud enterprise offering boosted Adobe Systems into sixth.
Cisco stepped up its cloud strategy with the expansion of its Intercloud alliance and had a further boost right at the end of the month with the $635 million acquisition of OpenDNS to strengthen its security business. Acquiring Blue Box and partnering with Box helped keep IBM in the top ten, but longer term the critical issue for IBM is completing the SoftLayer data centre expansion and then generating some client and partner enthusiasm for the platform. Red Hat, in 10th, announced the availability of Red Hat Enterprise Linux (RHEL) for SAP HANA to customers of Amazon Web Services. The more detailed monthly chart shows how these players fared over the month. Other notable appearances in the top 50 included: Hewlett Packard achieved its highest #CloudInfluence ranking yet. With the company splitting into two different companies on November 1, this month's Discover event in Las Vegas was an opportunity for the part that will become 'Hewlett-Packard Enterprise' to set out its vision, which includes a major focus on hybrid infrastructure - maximising performance and cost for the future. Analysts Gartner and IDC ranked highly again - Gartner's latest Magic Quadrant for Public Cloud Storage Services put AWS, Google and Microsoft far ahead of all others in cloud storage. See our response to this in our recent special report on cloud storage. And just outside the top 50, Aliyun, Alibaba Group's cloud computing arm, made a number of moves, including its Marketplace Alliance Program. The firm could well become a significant force in the future. [table id=35 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Will the last reseller standing please turn off the lights? In 1912 the RMS Titanic sank after hitting an iceberg. One of the enduring stories to emerge was how the ship's musicians played on to calm the passengers until she finally went to her watery grave. The question I ask each reader of this post is: 'Is your business slowly sinking, like the violin players of the Titanic?' Hardware and on-premise services have always been a staple of the IT industry, where resellers, MSPs (and any other label you choose) earned margin through a transactional relationship with a customer, usually the IT function. This transactional sale typically led to a subscription support service, usually comprising telephone and on-site software and hardware maintenance services, including a help desk. The onslaught of cost-driven centralised computing caused by cloud and infrastructure as a service has not been as all-conquering and encompassing as many predicted. Whilst the preceding statement is not necessarily an indicator of where I think market disruption will go long term, it is my belief that cloud has not fulfilled its predicted rate of adoption by the ultimate consumers of IT services. One of the terms I hear peddled out by laggard marketers is that cloud will 'become a utility' - that computing will be 'switched on and consumed like electricity or water'. Has anyone asked these marketing types what the reasoning behind these statements is, or at least requested a breakdown or explanation?
The answer is actually quite simple: every computing device we use is subject to miniaturisation, which has been evolving over the years. An often-quoted fact is that the iPhone has more computing power than all of the systems available to all powers during World War Two. Therefore, as devices converge and become multi-purpose (think of the camera, Walkman, mobile telephone and web browser that are now your smartphone), we are entering an era of low-cost computing that will be cheap, without margin enrichment. Looking ahead, the questions are: will an IT reseller that is playing at the tail end of a late-majority marketplace still be around tomorrow? And when is this journey going to be eclipsed by on-demand services? On-demand Who remembers the local pub, barber or supermarket cork board with all the local traders' business cards on? On-demand is being driven by the millennial generation - those who have never been without the internet. They want it now, and so do all those local companies vying for services, from the cleaners to the gardeners. Everyone wants to access the on-demand generation. Think about the rise of Uber and Airbnb, and how the journey for the customer is an experience. I am not advocating that IT resellers transform into marketplaces, just that the consumers of IT will change and the roles of the CIO and the CTO will evolve. Customers want to be advised (sorry, shiny-suited salespeople). They want to work with brands that inform and consult. The transactional relationship of old is like a distant relative, always turning up to the reading of the last will and testament to bag some cash and run. Today you need to turn up and be a friend to your customer, working with them on a subscription basis. Marketing needs to be less 'sell-sell-sell'; it is about providing information to enable choice. So how do you move forward? First, look at what your core business is; if it is transactional, decide on the longevity of that marketplace. Once you have decided on your transactional exit, look at disruptive cloud providers and application suites you can leverage to gain new market share. Speak to your IT distributors, chat through your aspirations and benchmark against best practice. Understand that every industry has gone through a major disruption. The cloud in my view is only version 1.0. The next stage will be all about the applications and services running on these clouds and how they are consumed. Allay your customers' fears and trepidation around security and access concerns, help them evolve with you and, above all, be the local trusted partner you have always been. There is a reason your customers have purchased from you previously: no cloud provider in the world is capable of successfully supporting a worldwide SME client base on its own. This puts the local partner at the forefront of cloud adoption, support and management. In summary, the server that sat in the corner has become a virtual box held in a large data centre. The services and support a reseller provides have just moved to a different location. If you have any comments please email us at pr@comparethecloud.net and we will publish them, whether positive or negative. ### IP EXPO EUROPE 7-8 October 2015 ExCeL London Europe's number ONE IT event, designed for those looking to find out how the latest IT innovations can drive business growth and competitiveness.
Now in its 10th year, the event showcases brand new exclusive content and senior-level insights from across the industry, as well as unveiling the latest developments in IT. It covers everything you need to run a successful enterprise or organisation. Arrive with challenges, leave with solutions. IP EXPO Europe 2015, taking place at ExCeL London, has this year launched three new events - Data Analytics Europe, DevOps Europe and Unified Communications Europe - joining the existing Cloud and Infrastructure, Data Centre and Cyber Security areas, bringing six events under one roof for the first time ever. IP EXPO Europe 2015 will be the most comprehensive business-enhancing experience for those across the IT industry, including IT managers, CTOs, network and storage engineers, CISOs, data analysts, developers and communications specialists. Register FREE today at www.ipexpoeurope.com (Register FREE before 7pm on Tuesday 6th October 2015 to save £35*) ### IBM and Mayor of London Launch Tech.London for Startup Community [quote_box_center] A digital hub with resources, education, access to technology, free cloud credits, mentoring, jobs and more. London's digital tech sector is expected to create an additional £12 billion of economic activity and 46,000 new jobs in the capital over the next decade. [/quote_box_center] Tech.London launched on June 15 2015 to support this growth. It's a portal which brings together London's digital economy in one place to keep the community tapped into the very latest information and insights. The site, developed by Gust.com, runs on IBM's cloud platform Bluemix, and integrates data and applications from dozens of leading high-tech firms including Level39, Ogilvy, Meetup.com and Eventbrite. In London, creative ideas and entrepreneurial spirit are in abundance; however, to launch a successful business you need funding, the best talent and digital knowledge. From advertising to finance, technological innovation intersects with every industry in London, so a centralised platform where the community can locate local resources is vital. That is why Tech.London, a new comprehensive online community and platform, is launching to connect, support and grow London's expanding technology sector and early-stage ecosystem. Tech.London is a collaboration between the Mayor of London Boris Johnson, Gust, IBM and a wide number of partners from London's technology community, and is testament to the city's growing tech and digital industries. It is the embodiment of a shared desire and commitment to bring the community together, increase job opportunities, train the workforce of the future, support the growth of new businesses, and ultimately ensure London is at the centre of technology innovation. Visitors to the site will have simultaneous access to a live database of London startups and early-stage investors, a constantly updated stream of career opportunities and educational courses, the universal calendar for London tech and startup community events, the complete guide to London's co-working and incubator spaces, up-to-the-minute news about London's tech and digital community, and much more. London's funding gap Let's look at the funding gap in London. Well-known tech goliaths may find it easier to raise new funding rounds and devote millions to M&A activities; however, London still suffers from a visible early-stage funding gap.
To support these efforts, the coalition government introduced an eight per cent reduction in corporation tax, R&D tax credits and further tax reductions for companies with patents. Such measures boosted our tech sector and gave the UK a record 581,173 business registrations in 2014. In this year's election, the major parties highlighted the importance of the digital economy to the UK's future. The newly elected government has the opportunity to incorporate technology into its policy from an early stage and benefit from improved collaboration and efficiency. But it's still not enough. The new Government's proposed Enterprise Bill aims to reduce red tape on business, create a Small Business Conciliation Service to handle business disputes and reform the appeal process for business rates, which is promising. The Tech.London platform gives startups and entrepreneurs access to additional partner resources that can help their businesses develop and grow. For example, startups can take advantage of up to US$120,000 in free credits for IBM's cloud platform. Getting the right talent There are also vocalised fears from the business community around the inevitable in/out referendum on our membership of the EU by the end of 2017. We know that to stay competitive we must also ensure that it is still possible to bring highly skilled technology experts over to the UK. The current visa process is expensive and overly complex. The employment shortage issue is partly one of myth, as the majority of Tech City businesses actually have a surplus of vacancies they can't fill quickly enough. The issue is actually around the skills needed in today's environment. The top five skills most in demand are: coders and developers, marketing and PR, business development, web design, and user experience specialists. This skills gap is being plugged by temporary resources, including freelancers and interns. Members of the community are invited and encouraged to become a partner in Tech.London by adding their startup, event, news or content feed. A weekly newsletter is also available, giving subscribers a London digital 'week in review' and details of upcoming community highlights. In the coming months, Tech.London will be optimised for mobile devices so that Londoners can access information and resources from anywhere, at any time, with the aim of rolling the platform out nationwide in future. For more information about Tech.London and how to get involved see here: http://www.tech.london/ ### The Power to Transform Governments, Cities and Citizen Services through Cloud Cloud computing today is an imperative for public sector organisations wanting to make governments more efficient and cities better places to live. By allowing agencies at the national or local level to reduce spending on IT infrastructure, to scale easily, and to consolidate information and services into a single, integrated system, cloud computing can provide a proven advantage. According to Gartner, by 2017 public cloud offerings will grow to account for more than 25 per cent of government business services in domains other than national defence and security. Just as business leaders are choosing cloud for its economics, speed and flexibility to drive business growth, it's time for governments to turn to cloud computing - hybrid, private or public - for its ability to contain costs and increase predictability. In the US, cloud computing has the potential to reduce the federal IT budget by 25 per cent, according to a study by MeriTalk, PaaS or Play? Cloud's Next Move.
The UK Department of Energy and Climate Change is integrating cloud-based smart energy readers and shared meeting spaces to eliminate waste and increase productivity, saving nearly $160,000 annually. In Germany, the Pension Fund Baden-Württemberg has speeded up access to its pension records by 99 per cent on a cloud-based system, with no increase in costs. At the state level, the State of California's CalCloud is a new public sector technology model powered by cloud computing to build and deliver more innovative government services and savings. The platform, now available to municipalities and all state and local government agencies on a subscription basis, is the first of its kind to be implemented in the United States. At the local level, Sunderland is a mid-sized city in the United Kingdom using cloud to cut IT costs and encourage innovation. Not only are cloud services helping the city work smarter and personalise services for citizens, the city is turning its investment in cloud systems into a profit centre expected to bring in about $2.29 million annually. By collaborating with the businesses that support the city in delivering services, these partner organisations are also able to access low-cost, scalable cloud services while the city earns a profit and puts its excess computing capacity to work. The rise of the cloud comes at a good time for budget-conscious public sector organisations. Rather than pouring their own resources into new data centres, pay-as-you-go cloud services provide an efficient, effective way of consolidating data and programs from different government agencies. But many public sector organisations need to better understand how to get started and what type of cloud environment is best for them. For those governments that truly want to transform, it's time to put a bold plan in place and move aggressively.
1. First, create a cloud strategy with an architecture and plans. This includes determining the organisation's goals and platform requirements, and understanding the complexity associated with them.
2. Next, identify and prioritise workloads. This includes defining business drivers to prioritise use cases for cloud, implementing a Cloud First strategy to evaluate the right blend of cloud options for new projects, and assessing which applications are the best candidates for cloud (a deliberately simplified illustration of this step appears at the end of this article).
3. Determine cloud deployment options by assessing how best to use private, public and hybrid delivery models.
4. Develop a public sector cloud business case and cost models, including transition costs, and examine the potential returns, including the time required for initial payback.
5. Lastly, prepare for implementation within each government agency or department by developing cloud risk management plans and policies, security and compliance processes, and transition plans. Then assess the impact on the operating model and make adjustments as necessary.
Putting a strategic plan in place should be the foundation for every public sector organisation ready to reap the obvious benefits of cloud computing. By helping to control costs, improving efficiencies for citizen-centric services and, most importantly, empowering governments to transform themselves, cloud is the key to becoming more economically competitive in a data-driven economy. We're still at an early stage where the potential for open cloud is just being imagined - especially in governments.
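As a concrete, if deliberately simplified, illustration of the prioritisation in steps 2 and 3 above, the sketch below scores a handful of hypothetical public sector workloads against three common criteria: variability of demand, coupling to legacy systems and data sensitivity. Every workload, criterion, weight and score here is invented for the example; a real assessment would weigh many more factors and would feed into the business case in step 4.

```python
# Illustrative only: a toy first-cut scoring of candidate workloads for cloud
# migration. Workloads, criteria and weights are invented; a higher score
# suggests a better early candidate, while low or negative scores point towards
# private/hybrid options or a later migration wave.
WEIGHTS = {
    "demand_variability": 0.40,   # bursty demand favours pay-as-you-go cloud
    "legacy_integration": -0.35,  # heavy coupling to legacy systems counts against an early move
    "data_sensitivity": -0.25,    # sensitive data pushes towards private or sovereign options
}

workloads = {
    "citizen web portal":   {"demand_variability": 0.9, "legacy_integration": 0.3, "data_sensitivity": 0.4},
    "tax records archive":  {"demand_variability": 0.2, "legacy_integration": 0.6, "data_sensitivity": 0.9},
    "open data publishing": {"demand_variability": 0.7, "legacy_integration": 0.1, "data_sensitivity": 0.1},
}

def score(profile):
    # Weighted sum of the criteria above.
    return sum(WEIGHTS[criterion] * value for criterion, value in profile.items())

# Print a first-cut shortlist, best candidates first.
for name, profile in sorted(workloads.items(), key=lambda item: score(item[1]), reverse=True):
    print(f"{name:22s} score={score(profile):+.2f}")
```

On these invented numbers the open data and citizen-facing workloads surface as early candidates, while the sensitive, legacy-bound archive falls to a later wave - exactly the kind of first-pass conversation the plan above is meant to prompt.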
### Logfiller Ltd. Announces Rollout of Layer8 Logfiller Ltd., a young technology company, is rolling out its new software, Layer8, a user experience measurement tool that reveals actionable new data. This innovation has "immediate and significant implications for efficiency, cyber security and compliance across the Windows environment," explained co-founder Michael Colopy, "providing far more insight than standard technology." As the cost of improving user satisfaction and productivity continues to rise, said Jeremy Barker, Logfiller's CTO and the inventor of the new technology, "Layer8 can be the frontline early warning tool that helps keep costs down by allowing enterprises to optimize the use of their expensive drilldown systems. It's an ROI multiplier, boosting the value of Big Data services." Layer8 from Logfiller provides desktop end-user experience data "all the time, for all users and all applications, regardless of the software technology or deployment architecture." Initially confused with application performance monitoring (APM), Layer8 from Logfiller fills a distinct role by offering relevant, comprehensive, and actual user experience insight in real time. Layer8's user experience rating (UXR) is a unique and, some say, game-changing metric that is derived from the technology's ability to provide a complete record of the user's experience from logon to logoff, capturing every relevant event in between. "You can't fix what you can't see" is a common management axiom. Providing enterprises with a high-level view across their entire networks allows them to reduce the number of endpoint agents while gleaning new, actual user experience information that is not modelled, derived, or estimated. Layer8 can quantify precisely when and where inefficiencies and other hazards affect end users, the company said; it is uniquely useful for assessing SLA performance, tracking system-wide productivity and auditing activity for all Windows-based systems, including virtual and hosted solutions from Citrix, VMware, Microsoft and others. This last differentiator, Logfiller's Colopy noted, offers significant advantages in addressing cyber security and compliance needs without incurring large cost increases.
“Layer8 is a game-changing technology in this sense: it helps unify top management around IT decision making,” Colopy stated, “by fostering rapid cooperation – based on actual user experience data, across all executive functions” including the CEO, CFO, CTO, CIO, CMO, HRO – “in other words, all who share responsibility for efficiency and cyber security.” Asked what business sector will benefit most quickly from a scalable tool that produces complete user experience data, CTO Barker cited large financial institutions of all kinds, including commercial and investment banks, insurance companies, brokerages and “in any environment where what happens in-house at the user interface is immediately important to the organization.” The single best validation of Layer8’s practical value, said Logfiller’s Michael Colopy, is the common reaction to its demonstrations using an organization’s own data: “we had no idea!” ### Richard Holmes, Arrow ECS - Live from Cloud World Forum, London Interview with Richard Holmes from Arrow ECS UK as part of our live afternoon Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 24/06/2015. ### SMEs: embrace the cloud and reap the benefits Cloud computing has become one of the main game changers in IT evolution over the past decade, offering many benefits to businesses, in particular SMEs. In the UK, 581,173 start-ups were registered with Companies House last year, and part of this accelerated growth can be put down to the fact that cloud computing technology allows new ventures to be in business almost instantly with an IT infrastructure that would be far beyond their reach if they were required to buy and host it on-premise. This rapid deployment means that new innovative SMEs have the ability to punch above their weight and compete with larger, more established peers almost immediately. For already established SMEs, deploying cloud services may not seem as simple, with research from the Federation of Small Businesses stating that only a quarter of small firms are actually investing in technology such as cloud computing, even though they could reap many benefits from going down this route. It seems that new SMEs are more keen and willing to embrace new technologies than their counterparts.    Barriers to entry  Many major business houses across the globe have already switched from on-premise IT solutions to the cloud and are enjoying the advantages it brings. So why are SMEs holding back from making the transition? A survey conducted by Oxford Economics found that many SMEs, while accepting the need to go global in order to stay ahead within their own market, were more interested in other technologies such as business intelligence and mobile solutions, rather than implementing cloud technology. Many SME business owners are simply scared about the concept of cloud, and have concerns about the perceived costs, complexity and risks that are involved. Indeed 35 per cent of SMEs do not fully understand the benefits that the cloud can provide them, and are therefore simply ignoring it. Cloud computing is changing the way people do business, and by ignoring the potential of this technology to evolve their business, SMEs are missing out. It’s now essential that they look at adopting the cloud, and the benefits it can bring in terms of transforming productivity, and enabling them to remain competitive. To bring the cloud down to earth, there are four key areas that SMEs should understand and consider. 
Cost Cloud technology is associated with large cost benefits, as infrastructure needs, upfront capital expense and ongoing maintenance costs are largely reduced. A recent report from the European Commission found that the adoption of cloud computing could result in 80% of organisations reducing their costs by about 10%-20%. Pay-as-you-go Many cloud computing services offer the option to pay monthly, which is a significant aid for SMEs for whom cash flow is an important factor. Pay-as-you-go options allow businesses to access sophisticated software with no upfront fees and no lock-in period. Business agility Mobile technology has meant that business can now be done anywhere there's an internet connection. Cloud computing enables businesses to make the most of this by allowing staff to access their applications and files on the go. This can be extremely beneficial for businesses that employ freelancers, for example, or employees who frequently have to travel. The flexibility that the cloud brings can be especially helpful at times when it may be difficult for staff to get into the office - businesses can be run from any location, as employees are not reliant on on-premise technology or servers. Security Until now, one of the largest barriers to adopting cloud computing has been concern over data security, with 66 per cent of businesses citing this as their main worry. However, research shows that in spite of public perception, cloud providers can typically offer increased security at a much lower cost than SMEs could otherwise afford. Data hosted in a UK data centre can in fact be much more secure than if businesses were to manage it internally. With their capacity to adapt in an ever-changing business environment, SMEs are vital for the growth of the UK's economy. Many innovative start-up businesses are achieving growth by making the most of the benefits that the cloud provides, but all businesses should be aware of the transformational potential of the cloud. SMEs at all levels must be equipped with the know-how to stay ahead of the game, and the wider business and digital community must ensure they work to encourage and support the adoption of new innovative technologies, including the cloud, among SMEs. ### Oliver Peuschel, Datapipe EMEA - Live from Cloud World Forum, London Interview with Oliver Peuschel of Datapipe EMEA as part of our live afternoon Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 24/06/2015 ### Andy Johnson, GTT - Live from Cloud World Forum, London Interview with Andy Johnson of GTT as part of our live morning Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 24/06/2015 ### Special #CloudInfluence Report on Storage "PUT IT IN THE CLOUD" This month we look at cloud storage. Can it be used as an indicator or predictor of wider cloud dynamics? In other words, what do market dynamics in cloud storage tell us about the wider cloud market? It is always hard to determine cause from effect - you're often either in the 'chicken came first' or 'egg came first' camp, or indecisively in both. And often even identifying any form of direct correlation between different factors or markets can be nigh on impossible. We were fascinated by a recent Gartner Magic Quadrant for Public Cloud Storage Services. In 2014 Gartner postulated that AWS and Microsoft were the leaders in the market for public cloud storage services, with Google as a possible challenger and IBM, HP and Rackspace as niche players.
This year it has AWS, Microsoft and Google as leaders, with Rackspace, IBM, AT&T and Verizon as niche players (HP having been dropped) - get a copy of the full report here. The most interesting observation was quite how clearly Gartner sees the market split between leaders and niche players - with this Gartner Magic Quadrant particularly unusual for not featuring any visionaries or challengers at all. This is of course Gartner's analysis for public rather than private cloud storage services, but it makes stark reading for OpenStack advocates. At its recent Vancouver summit much of the chatter was about storage, with our correspondents meeting many innovative players. At the same time, HP and other OpenStack advocates refused to concede that OpenStack had lost the public cloud market to the mega-scale vendors and that it was now in effect a niche private cloud standard, along with VMware. It is hardly heartening for such advocates, therefore, that HP has been dropped from the most recent Gartner Magic Quadrant and that the gap between the leaders and niche players has widened, with OpenStack only represented among the niche players. We decided to do a #CloudInfluence Special Report on Storage with a special focus on both OpenStack storage and AWS storage. Looking at the last few weeks, starting on 7th May, a week ahead of the Vancouver OpenStack Summit, we see that the 10 companies or individuals grabbing the most attention discussing OpenStack storage were as follows: [table id=33 /] Topping the list was Shawn Flynn, IT infrastructure manager for servers and storage at Progressive Waste Solutions, who was quoted in regard to the firm's adoption of the Breqwatr Cloud Appliance for rapid deployment of an OpenStack private cloud. He was followed by Load DynamiX, the leader in storage infrastructure performance validation, and Infortrend Technology, the Taiwanese technology company specialising in SAN and NAS storage systems. Other notable individuals were Thomas Kao, the senior product planning director at Infortrend; Boris Renski, the co-founder and CMO at Mirantis and an OpenStack Foundation board member; and Mark Shuttleworth, founder of Canonical Ltd. We compared the level of activity and attention with a similar analysis for AWS storage over the same period. It is possibly unsurprising to see Jeff Bezos, Amazon's CEO, appear, or indeed Amazon itself, as well as Amazon Cloud Drive and Amazon Web Services. Others to appear were Larry Ellison, who singled out AWS as his main target, saying: "We're prepared to compete with Amazon on price." Mark Shuttleworth also appeared, commenting on AWS as well as OpenStack, as did Ilan Rabinovitch at Ooyala, commenting on multi-cloud management. [table id=34 /] Looking at the level of activity and attention for both OpenStack storage and AWS storage over this period might give the impression that there was an even balance between the two, but this belies the fact that the Vancouver OpenStack Summit represented a peak in activity for OpenStack. Extending the time period for AWS shows that there was a massive peak of media and social interest in AWS storage a few months ago (when Amazon broke out the AWS financials for the first time). As great as the buzz was at the Vancouver OpenStack Summit, and as numerous as the storage announcements were, they represent little more than a blip in attention when compared to AWS. In the same way, the impressive turnout in Vancouver would be dwarfed by the cumulative attendance of the various regional AWS summits.
All this supports the picture given by Gartner in its public cloud storage services analysis of a market for cloud storage services that is being dominated by the mega-scale cloud vendors. So does this matter and what does it mean for the OpenStack community and for the niche players? Storage is one aspect of cloud that is particularly open to economies of scale, giving a real competitive advantage for the mega-scale vendors. The niche players will need to sustain their focus on added value services and hybrid integration. They will also need to decide whether to leverage public cloud storage services within any hybrid environment or to retain their own cloud storage services. It will be most interesting to see how this unfolds. NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### 9 Ways to Use Podcasts as Learning Tools What’s the word for a radio broadcast that is made available on the internet for sharing and downloading? If you said podcast then you know exactly what we are going to be talking about. Podcasts and vodcasts can be recorded in both audio and video formats. Normally, podcasting is used in combination with blogging. However, recently the scope of podcasting is extending far beyond that. Its use has extended to training, marketing, and even education. Let’s see how this tool is being used in the adult professional education sector. Delivering Supplemental Information: Some people require extra lectures to fully comprehend a topic they haven’t understood fully. This is particularly relevant for marketing teams working in fields they are unfamiliar with, having access to resources constituting of more than just literature can make understanding a product much easier. Additionally, podcasting can be used to provide extra research content or deliver discussions to anyone who needs extra support. Deliver Project Presentations: Video Podcasts, also known as Vodcasts, can be used by both educators and professionals to deliver project presentations. By allowing a pre-recording to be made of the presentation, you allow the presenter to avoid the stress of public speaking - yet you still get an engaging visual presentation. This is also a convenient way to have a presenter speak on your topic if they are not available at the desired time of presentation, or there needs to be multiple locations. Library for Future Professionals: The considerate group of professionals using podcasts as a learning tool can create a “library” for the future entrants. This will include lessons, learning experiences, tips, and words of advice that will be helpful for the newcomers. Podcasting is a cheap way to share knowledge, similar to leaving notes in the margins of a book. Interview A Speaker or an Expert: Calling an expert to a classroom or office won’t always be easy. Sometimes, it is easy to visit an expert yourself and save their time. You can prepare a list of questions for the expert beforehand and ask them if they are comfortable with you recording the interview for your classroom.You can then upload the interview podcast for people to view as an optional extra - or screen it during your training session. 
Newsletter Distribution: Save paper and allow people to view an audio news right from the comfort of their home or office. These will obviously be put up on the institute’s website, allowing learners to view them at any time—and repeatedly as many times as they want. While you are Away: Sometimes, educators have to be away for a few classes to attend meetings or see to other obligations that involve travelling. During your available time, you can record podcasts lessons for the lecture and ask the substitute teacher to use the recording during lecture time. This will also give the trainer a head start making available to him/her an example of your lecture. Consequently, the substitute will know exactly what to talk about and how to deliver the remaining lecture in a style that suits you. Real-Life Observations: While educating the class about any specific career-related course say “fundamentals of marketing”, you can ask them to record what they have observed in real life. The class can create their very own Discovery-type documentaries of what they have known about the subject or observed its subtleties in the field. Engaging Information: Some topics are constantly rehashed in the same way, particularly in the technology sector, and the reading material available on them is often stagnant. By having industry experts discuss your desired learning topic via a podcast or vodcast, you offer an alternative learning resource. Aural and Visual education tools are becoming more and more prominent as alternative learning styles are recognised. Responsive Education: Certain people don’t learn well from reading - they need a more personal explanation, responsive podcasts are particularly good for this. Offering listeners an option to send in questions for the experts to respond to during their recordings. It gives listeners the freedom to respond at their own time, without the pressure of asking it in front of an audience. Podcasting and Vodcasting are important tools for the future of in-workplace education. As the industries mature and develop, staff will need to do so with the technologies. Alternate tools for their education are the ideal way to maximise their time, and your money. Instead of taking all your staff to a seminar room (and away from their desks) for 3 hours, offer them 1 hour podcasts to listen to, then follow it up with a question session. It’s time to think differently about education in the future. ### IBM and Box announce global partnership IBM and Box have announced a global partnership that will combine the best-in-class technologies and resources of both companies to transform work in the cloud. Together, the companies plan to integrate their existing products and services and develop new, innovative solutions targeted across industries and professions ranging from medical teams working on complex cases to individuals negotiating consumer loans by mobile phone to engineers and researchers identifying patterns in patents, reports and academic journals. As companies increasingly seek simple, secure collaboration solutions that tap into local data and have global reach, this strategic alliance brings together Box’s industry-leading cloud content collaboration platform with IBM Analytics and Social solutions, IBM Security technologies and the global footprint of the IBM Cloud. The two companies will jointly deliver these solutions to market internationally, and IBM will also enable builders and developers to integrate Box APIs into enterprise apps and web services. 
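For developers, the integration point is the existing Box Content API. The snippet below is a minimal, illustrative call of the kind that could sit inside a Bluemix-hosted web app: it simply lists the items in the authenticated user's root folder. It assumes a valid OAuth 2.0 access token has already been obtained; token handling, error handling and the IBM side of the integration are deliberately omitted, and the response fields shown are the commonly documented ones rather than the full schema.

```python
# Illustrative only: listing the contents of a Box root folder (folder id "0")
# via the Box Content API v2. ACCESS_TOKEN is a placeholder for a real OAuth 2.0
# token obtained through Box's standard authorisation flow.
import requests

ACCESS_TOKEN = "REPLACE_WITH_A_REAL_TOKEN"

response = requests.get(
    "https://api.box.com/2.0/folders/0/items",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={"limit": 10},
)
response.raise_for_status()

for item in response.json().get("entries", []):
    # Each entry carries a type ("file" or "folder"), an id and a name.
    print(item["type"], item["id"], item["name"])
```

The same pattern - an authenticated REST call returning JSON - is what an enterprise app would wrap with business logic, whether that is surfacing case files to a medical team or pushing promotional assets out to stores.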
[quote_box_center]"Today's digital enterprises demand world-class technologies that transform how their organisations operate both internally and externally," said Aaron Levie, co-founder and CEO of Box. "This extensive alliance between Box and IBM opens up an exciting opportunity for both companies to reach new markets and deliver unified solutions and services that can redefine industries, advance secure collaboration and revolutionise enterprise mobility."[/quote_box_center] "This partnership will transform the way work is done in industries and professions that shape our experience every day. The impact will be felt by experts and professionals in industries such as healthcare, financial services, law, and engineering who are overwhelmed by today's digital data and seek better solutions to manage large volumes of information more intelligently and securely," said Bob Picciano, senior vice president, IBM Analytics. "The integration of IBM and Box technologies, combined with our global cloud capabilities and the ability to enrich content with analytics, will help unlock actionable insights for use across the enterprise." IBM and Box will partner in three key areas:
Transformation of Enterprise Work
- Content Management: Box will integrate IBM's industry-leading enterprise content management, including content capture, extraction, analytics, case management and governance.
- Watson Analytics: IBM and Box will collaborate to bring in-depth enterprise insights to content stored in Box using IBM Watson Analytics.
- Social Collaboration Solutions: IBM and Box will collaborate to integrate Box capabilities into IBM Verse and IBM Connections, the company's business email solution and social collaboration platform.
International Reach and Security
- Enterprise Cloud: Box will enable joint customers to store their content on the IBM Cloud. The IBM Cloud provides data resiliency, data privacy and data localisation, a key consideration for international customers who want the option to keep their data in country.
- Specialised Enterprise Consulting: The partnership will draw upon the specialised enterprise content management skills of IBM Global Business Services professionals to help clients connect or integrate Box capabilities with existing data and systems.
- Enterprise Security: Box will expand on its enterprise security offerings with IBM security technologies for threat detection, anomaly identification, mobile device management and identity protection.
New Content-Rich Apps and Solutions
- Mobile Apps for Industries: Box and IBM will jointly develop content management solutions and incorporate Box technology into select IBM MobileFirst for iOS apps.
- Custom App Development: IBM will enable enterprise developers to integrate Box APIs on the IBM Bluemix developer cloud to help build content-rich web and mobile apps.
By leveraging the power of the cloud, this comprehensive partnership will transform collaboration and the ability to gain actionable insights across a variety of industries. For example, IBM and Box can offer more streamlined collaboration between patients and physicians for hospitals and healthcare providers in a security-rich environment. A hospital can share test results with patients, while maintaining a formal review with the physicians and nurses coordinating patient care. IBM and Box also plan to help the world's leading retailers manage their omnichannel experiences.
For example, a fashion retailer will be able to deliver a consistent brand experience that extends to all physical and e-commerce stores by sharing advertising images, promotional materials and product lines easily through any device, while maintaining control of merchandising consistency, pricing and packaging across all channels. An insurance underwriter will be able to capture and review inspection reports while mobile, to expedite the approval process; a field worker can access up-to-date manuals to solve problems in real time, while documenting work in a central location to drive collaborative scheduling and planning. For more information please visit: www.ibm.com/ibmandbox or www.box.com/ibm.   ### Ian Jeffs, Arrow ECS - Live from Cloud World Forum, London Interview with Ian Jeffs from Arrow ECS UK as part of our live morning Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 25/06/2015. ### Sandy Carter, IBM - Live from Cloud World Forum, London Interview with Sandy Carter, GM at IBM as part of our live afternoon Kings of Cloud broadcast from Cloud World Forum, Olympia, London - 24/06/2015. ### GTT's won a Stevie! GTT Communications, the leading global cloud networking provider to multinational clients, announced today that it received a Stevie® Award for the Fastest Growing Company of the Year in the 2015 American Business Awards. “We are honored to be recognized as one of the Fastest Growing Companies of the Year by the American Business Awards,” said Rick Calder, GTT President and CEO. “Our strong growth is due to the successful execution of our strategy to expand cloud networking services to multinational clients, to extend secure network connectivity to any location in the world and with any application in the cloud, and to deliver outstanding client experience by living our core values of simplicity, speed and agility.” The American Business Awards is the premier business awards program in the United States. The judges selected this year’s winners from a record pool of more than 3,300 nominations submitted by organizations of all sizes across virtually every industry. More than 200 executives worldwide participated in the judging process. “In 2014, GTT reported revenue of $207.3 million, reflecting a compound annual growth rate (CAGR) of 26% since 2010,” said Mike Sicoli, GTT’s Chief Financial Officer. “For the last quarter ended March 31, 2015, GTT’s annualized revenue run rate grew to $250 million. The company closed the MegaPath Managed Services acquisition on April 1st, adding another $124 million in recurring revenue and bringing us materially closer to our next financial objective - $400 million in revenue and $100 million in adjusted EBITDA.” GTT’s global Tier1 IP network connects to any location in the world and with any application in the cloud. ### … Now it’s Docker on mainframes??? How Docker Will Change the Game for Enterprise scale Application Development You have got to be kidding, Docker on the Mainframe? Following our post on June 3rd – "IBM’s Mainframe team gatecrashing MongoDB World," it appears the IBM z team is out in force again – this week at DockerCon in San Francisco. Here is another one of these new, exciting technologies, and it's appearing on the mainframe? No way! It seems like the z System team is demonstrating the mainframe is a platform on which developers can innovate, and enable users to modernize Linux environments. We caught up with Steven Dickens and Dale Hoffman to learn more. 
Steven is the Linux Offering Manager for IBM’s premier z Systems platform, and it turns out Dale is leading the charge on the open source technology partnership efforts in this space.

Why the sudden interest in Docker by IBM z Systems? Essentially, Docker reduces the development effort required to build applications and transport them between different platforms. The main reason is that services and solutions are decomposed into what are referred to as “micro services.” By breaking a big solution into smaller chunks and being able to replace the individual components and micro services of the entire solution, it becomes simple to upgrade or come up with new versions of an overall solution. This goes hand-in-hand with newer philosophies about how applications are built, packaged, used, and managed, and is consistent with DevOps best practices.

We then asked how IBM and Docker were working together overall. It’s common knowledge that Docker is partnering with many companies and open technology foundations, with IBM one of the recognised partners on Docker’s list. What’s new is that IBM is expected to be the first reseller of Docker Hub Enterprise – Docker’s behind-the-firewall repository for storing and managing Docker containers.

A truly excited Hoffman described Docker on IBM z like this: “Docker is Docker is Docker… on Linux on z Systems too!” He then elaborated by saying he regards the push to bring Docker to the mainframe as a game changer because it marks the convergence of what Docker provides – an open platform that gives developers an easy and efficient way to build containers for deploying and transporting distributed applications – with some fantastic improvements in security and performance when deployed on the mainframe. This portability means developers can develop and deploy anywhere, without having to compromise on security or performance. Extending Docker to the z Systems environment will expand the range of applications for the mainframe and speed up the development process.

Hoffman went on to reveal, “Our early benchmarks indicate that Docker performance screams on IBM z with 2x higher application throughput per container over other platforms, essentially supporting a higher number of containers per virtualised/physical resources.” The result is that Docker paves a new way of creating and moving applications and workloads to and from z Systems, making it possible for developers to reuse a micro service and put it together in a bigger application using Docker containers and the interconnects between them. This enablement of Docker on IBM z Systems means Linux users can further extend and advance their innovation with Docker technology to support enterprise-grade, business-critical data and applications.

When will Docker become available on z Systems? According to Hoffman, z Systems has already made Docker binaries available to clients as a technology preview, allowing developers to experiment with Docker in order to gain experience and understand how complementary the technology is to the mainframe.

We then quizzed Steven Dickens, who is out at Red Hat Summit in Boston this week, on why the mainframe team are becoming more vocal in the Open Source world and how he sees this space evolving. [quote_box_center] According to Dickens, “IBM is seeing a shift in how clients are looking to deploy Linux. 
We are seeing a shift into the enterprise grade arena with Linux, where clients are looking to deploy applications previously only seen on high-end UNIX and mainframe systems onto Linux for the first time. This behavior is being driven by the maturity of NoSQL databases and development languages. Linux on IBM z has been around for 15 years and we have seen double-digit growth ever since as the Linux and Open market is increasingly asking for more enterprise grade capabilities, especially around security, availability, performance, high availability and disaster recovery.” [/quote_box_center]

In summary, we at CTC are witnessing some interesting dynamics in the Linux space and a big shift in how people want to deploy Linux in an enterprise grade cloud environment. The momentum on the IBM z platform is building with the enablement of these new and exciting technologies now available for rapid innovation. What’s next from these guys? We sense more is sure to come. Stay tuned.

### Cloud World Forum: A Digital-Era event

By Doug Clark, cloud leader for IBM UK and Ireland

I have a confession to make. I like metrics. One of the reasons I like them is that they are a great way to track the pace of change.

Today “data” is the fuel of the revolution and the buzzword everyone uses to partner with it is “digital”.

One trend that’s advancing at a blistering pace as I write this is technology itself. Even if you were a technophobe you couldn’t avoid the explosion of tech everywhere and in every circumstance of our ever-busier lives. Consider the number of bytes of data we generate, the amount of time and money we spend online, the speed of our internet connections, even the number of apps on our phones. Whichever metric you choose, a clear conclusion can be drawn - the way we use and consume technology has changed profoundly and irreversibly. It’s more than evolution: Information Technology has transformed and embedded itself in the heart of the business, rather than sitting alongside it as a support function.

Tech will continue to transform many aspects of human existence: from the way that we communicate, to the way that we purchase goods, the way we track our health, the way we maintain our presence. Historically we reflect on revolutions – such as how steam and coal-powered machines once fuelled a revolution that brought us into the Industrial Age. Today “data” is the fuel of the revolution and the buzzword everyone uses to partner with it is “digital”.

Revolutions are never simple - and the demands this revolution makes are no different and seemingly full of contradiction. Despite being known as the disposable generation, we can’t just rip and replace what we already have in all its formats. So we need to harness increasingly sophisticated technology to bridge to this era, an era that demands 24/7 access to relevant, timely and accurate data – digested and served with increasingly flexible and mobile access.

And what underpins this revolution? At IBM we believe that organisations of all sizes stand to benefit from a better understanding of cloud. Not only to highlight the value that cloud can bring to their business, but also to realise how those cloud offerings can be tailored to best fit their needs. At Cloud World Forum in London we’re aiming to help promote this understanding, and to bring the potential of the cloud to organisations across the UK. Cloud allows organisations to build systems of engagement that are suited to the dynamic nature of the digital era. 
This potential lies in a whole range of concepts and tools that were unheard of just a few years ago – big data and analytics, hyperscale computing, cognitive technologies, the Internet of Things, etc. Equipping organisations with these tools allows them to mine the vast quantities of data available and mould them into actionable insights. Part of the value of these digital-era tools is that they are highly responsive, often taking input in near real-time, and can rapidly change and evolve. These qualities are ill-matched with the established IT systems of many companies, which are traditionally static systems of record. Cloud allows organisations to build systems of engagement that are suited to the dynamic nature of the digital era.

Unfortunately, the transition to the cloud has not always been a smooth one. Companies not willing to make the significant investment in their own private cloud have largely been presented with only one alternative: a share in a public cloud, and the concerns over integration and security that implies. Even larger organisations that are willing and able to set up a private cloud face a huge challenge to develop within the rigidity of their larger existing IT infrastructure.

IBM is pioneering a process of migration to the cloud that is as flexible as the fundamental principles of Cloud computing. We believe that the future for Cloud is in a hybrid model; one seamlessly integrating on-premises and external infrastructure to maximise efficiency. I believe that we’re uniquely focused on delivering this hybrid model, and it’s built on some key execution areas:

- Developer productivity – to develop, refine and deploy apps quickly and continuously with hybrid cloud DevOps, using cloud to deliver software faster. This includes both a digital innovation platform to compose and run next-generation apps as well as the lifecycle management and coordination to deliver those apps in conjunction with traditional apps.
- Data and analytics – to provide the best insight from all relevant data inside and outside the organisation and to optimise data sovereignty and locality. 80% of new applications are data intensive, according to IDC. Clearly, the ability to access, synthesise, analyse and replicate data is paramount for the new hybrid cloud.
- Visibility, control and security – everywhere that data and services exist, whether on premise or off.
- Integration and portability – to easily integrate, compose and deliver web and mobile apps. Organisations need integration across data, applications and business processes, and the ability to move the app closer to the data or the data closer to the app where and when it makes the most sense (see the short sketch at the end of this article).

Our Open by Design approach provides the flexibility and freedom to choose and change environments as our clients’ needs change. IBM has a long history of support for open standards and open source initiatives, and nowhere is this more important than with hybrid cloud. IBM’s Open by Design approach gives customers the freedom to choose and change environments, data and services as needed.

A properly managed cloud provides a space to develop truly creative solutions, and to realise the opportunities that the digital era provides in such abundance. IBM Cloud attends events like Cloud World Forum to spread this message and explain how we’re helping clients to respond to the digital era through the new hybrid cloud, built open by design with security everywhere. 
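The "integration and portability" idea above is easier to see in a concrete form. The sketch below is purely illustrative and uses hypothetical endpoint names and an environment variable of my own invention – it is not IBM tooling or part of the Open by Design approach – but it shows the basic mechanic: the same application code targets either an on-premises or a cloud data service depending on configuration, which is what lets you move the app closer to the data (or the data closer to the app) without rewriting it.

```python
import os

# Hypothetical endpoints; in a real deployment these would come from a
# configuration or secrets service rather than being hard-coded.
ENDPOINTS = {
    "on_prem": "https://records.internal.example.com/api",
    "cloud": "https://records.example-cloud.com/api",
}

def data_endpoint() -> str:
    """Choose the data endpoint for the current deployment target.

    Defaults to on-premises, so a system of record stays behind the firewall
    unless the deployment explicitly opts into the cloud endpoint.
    """
    target = os.environ.get("DEPLOY_TARGET", "on_prem")
    return ENDPOINTS[target]

if __name__ == "__main__":
    print(f"Reading and writing data via: {data_endpoint()}")
```

In practice the switch would usually be driven by a deployment pipeline or service catalogue rather than a single environment variable, but the principle is the same.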
### Why Web Access Management is more than just Single Sign-On

Mobile working, bring your own device (BYOD) policies, and web-based applications have made accessing data far more convenient for businesses and employees. However, this convenience has come with a clear drawback: security issues. With the widespread adoption of wireless networks, multiple access points and greater collaboration, protecting sensitive data is a real challenge for businesses.

One of the ways to mitigate this risk is to adopt web access management (WAM) software. By managing who is able to access company tools and services, businesses can retain control of their data while still enabling their employees and customers to benefit from the level of flexibility required in the modern workplace.

Web access management software can take a variety of approaches, but one of the most common is Single Sign-On (SSO). This enables members of staff to access multiple websites and applications through a single authentication process, using the same password and username. One of the major benefits of SSO is the added efficiency and, therefore, cost savings that it gives businesses.

protecting sensitive data is a real challenge for businesses

Due to the increased number of online applications, employees often suffer from “password fatigue,” resulting in unnecessary calls to the IT help desk over forgotten log-in details. Over the course of a year this can add up to hours upon hours of wasted time and money for your business. What’s more, switching to a single password approach helps improve security. Having to remember multiple passwords, each with differing levels of complexity, often results in employees simplifying their password choices or writing them down – introducing vulnerabilities into your network. Introducing the Single Sign-On approach to your business is a great way of simplifying security, without cutting corners, enabling a more efficient working environment for your members of staff.

Web access management is an excellent security aid, but it is more than just a way of preventing cyberattacks and third-party interference. Whilst the use of Single Sign-On enhances the user experience, there are inevitably different user groups that require stronger levels of authentication. These may include Managers, Administrators, Customers, or Suppliers, all with differing levels of access to applications and data. Additionally, “higher risk” device types that are accessing systems may need stronger authentication; these could include BYODs and web browsers from different locations, as well as dedicated company devices. United Security Providers enables you to manage all of these user types appropriately, utilising a comprehensive and flexible rules-based management console with existing integrations to the standard multi-factor authentication vendors, from typical token devices (RSA, SafeNet, etc.) to one-time SMS codes.

Web access management is as much about protecting your customers as your business

However, web access management isn’t simply about passwords, and at United Security Providers we recognise that protecting company data is a multi-faceted task. Our USP Secure Entry Server also utilises a highly secure (true reverse proxy) web application firewall to protect companies against present-day and future threats. 
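To make the rules-based, step-up authentication idea described above more concrete, here is a minimal sketch of how a policy might map user groups and device risk to a required authentication strength. It is purely illustrative, with hypothetical group and method names, and does not represent United Security Providers' management console or any vendor's actual API.

```python
from dataclasses import dataclass

@dataclass
class AccessRequest:
    user_group: str       # e.g. "manager", "administrator", "customer", "supplier"
    device_type: str      # e.g. "company_device", "byod", "web_browser"
    location_known: bool  # False for logins from unfamiliar locations

# Baseline authentication strength per user group (hypothetical policy).
GROUP_BASELINE = {
    "administrator": "token",    # hardware or software token (RSA/SafeNet style)
    "manager": "sms_code",       # one-time SMS code
    "supplier": "sms_code",
    "customer": "password",      # SSO password only
}

# Device types treated as higher risk, forcing step-up authentication.
HIGH_RISK_DEVICES = {"byod", "web_browser"}

def required_authentication(req: AccessRequest) -> str:
    """Return the weakest authentication method this request may use."""
    level = GROUP_BASELINE.get(req.user_group, "sms_code")
    # Step up password-only users when the device or location looks risky.
    if req.device_type in HIGH_RISK_DEVICES or not req.location_known:
        if level == "password":
            level = "sms_code"
    return level

if __name__ == "__main__":
    # A customer on an unmanaged device from an unknown location: step up to SMS.
    print(required_authentication(AccessRequest("customer", "byod", False)))
    # An administrator on a company device still needs a token.
    print(required_authentication(AccessRequest("administrator", "company_device", True)))
```

In a real WAM deployment, rules like these would be configured in the management console and enforced at the reverse proxy sitting in front of the applications, rather than living in application code.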
Web access management is as much about protecting your customers as your business, and a built-in firewall helps secure your business’ reputation by helping you achieve data security and PCI DSS compliance. In order to maintain confidentiality in B2B communications, organisations may also want to adopt a web service gateway to protect data in transit. Web service gateways ensure that only authorised partners and clients are able to access your company data, preventing third parties from intercepting communications. Web service security also helps provide the level of confidentiality that a company’s reputation can be built upon.

Businesses may also want to limit what information certain employees can access. Often, this is not a question of trust, but one of relevance. While certain company teams may need access to sales data or banking information, for example, others will not. Web access management lets businesses decide who has access to their data, placing security management firmly in their own hands.

Web access management is about so much more than simply limiting the number of passwords that members of staff have to remember

Web access management is about so much more than simply limiting the number of passwords that members of staff have to remember. Ultimately, it enables businesses to manage and enforce their own security protocols, giving them greater control over their applications, processes and data. For many organisations, large or small, their reputation is of the utmost importance and web access management provides the tools needed to secure this reputation against present and future threats. For more information on web access management, visit: www.web-access-management.com

### Bloomberg gets savvy with Smarter Cities

Firstly I'd like to commend Bloomberg on their first BloombergLIVE event, The Bloomberg Technology Conference: Smarter Cities. I attend a lot of these conferences and often I hear the same thing, in the same format, over and over again. The medium Bloomberg chose was, for me, exceptional. Instead of having mere figurehead chairpeople, they chose to use their resource of highly qualified TV journalists. Short presentations by speakers were followed by insightful interviews led by the journalists, with encouragement for audience participation.

This change of format meant that instead of just being able to say what they liked in their presentation, the speakers had to follow up their statements by responding to questions and queries from the chairs, who took notes throughout. This makes a big difference to the normally mumbled questions that come from 'unsure if I should be asking this' audience members.

Moving on from this point, let's recognise that London is well on its way to becoming a mega-city - that is, a city with more than 10 million occupants. As this rapid expansion continues, the tech city within the hub is working to make London more resilient and functional for the future. Smart Cities are not just about IoT street lamps and noise pollution monitors. Looking towards the future, smarter cities will be integrated social experiences, allowing for transparency of government and engagement of city constituents.

Andrew Collinge, Assistant Director of Intelligence and Analytics for the Greater London Authority, said "It's up to city hall to be more deterministic when looking at smart cities." I think this is right, and the sentiment was echoed by many throughout the day: cities need to work together at a greater scale to become 'smart'. 
London is unlucky in some ways: we are trying to retro-fit a centuries-old infrastructure with modern technological solutions.

London is well on its way to becoming a mega-city - that is, a city with more than 10 million occupants

One of my favourite comments of the day was that technology companies often only see a problem that needs to be solved, and they design to fix a problem. You can't approach a city as a problem; it is too big for that. The companies making tech that is really making a difference now aren't trying to solve problems, they're being much more innovative than that.

[gallery columns="4" link="none" td_gallery_title_input="Bloomberg #SmarterCities Graphics" ids="20390,20391,20392,20393,20394,20395,20396,20397"]

Smarter Cities and Power

Take PaveGen, for example: their idea is incredible. Take something everyone does, every day, and innovate it to make a resource we all need. One step for power. Their paving tiles are priced at the same point as flagged paving and, in high-traffic areas, will generate enough power to justify their existence. An installation of 200 tiles into a football pitch allows the energy generated by players to be stored and used to power floodlights to light the pitch. Additionally, the data that could be collected from these tiles if they were located in shopping malls or within stores is invaluable. Being able to track footfall within a retail environment can show the retailer if its customers are lingering in certain areas, or if particular displays aren't getting the footfall that was expected. Lawrence Kemball-Cook, CEO and Founder of PaveGen, said originally they wanted to know if they could design a floor that would tweet every time it was stepped on, but they decided on the more practical end function of power generation.

Power supply and demand is now being looked at more intuitively too. When all of Britain decides to make a cup of tea after The X Factor, the demand on the power grid is overwhelming. Power management companies are now looking at solutions that shut off non-crucial power supplies for short periods of time. For example, turning off the power to your fridge for 10 minutes does not have any negative ramifications; the fridge will maintain its temperature for short periods of time due to stored power. So when companies require energy for a huge task once a day, but it does not matter when this task happens, energy management means that this task can be shifted to a time when the grid is best able to handle the demand.

Smarter Cities and Health

Smart cities should be working to make people healthier, happier and more engaged with their environments. Health tech is progressing to become proactive in this regard, looking to be preventative rather than purely remedial. By improving mental health services through apps and online channels, more support is offered to people who otherwise would not access the services. It is proven that social connection forestalls mental illness; smart city projects are not just about tech. As Katherine Oliver, Principal of Media and Technology at Bloomberg Associates, put it: it's about the three Ps. Parks, playgrounds and paths are key points for smart city development.

Smart cities and health go hand in hand. Tracking disease outbreaks through wearables, Twitter, and geotagged Google searches allows pharmaceutical companies to distribute drugs to the areas which need them most, and to predict and back-track patterns of infection. 
Technology innovation in cities to promote better health is not limited to the realm of healthcare itself. Traffic improvements and smarter parking solutions reduce emissions by cutting the number of cars driving around looking for parking spaces. Emission management is a big part of success for cities in the future; improving air quality and reducing environmental impacts will be major focuses in the years ahead.

Smarter Cities and Transport

I have useless navigation skills - the perils of being in the GPS generation

We all use transport apps. Frankly, I wouldn't be able to get around London without them; I have useless navigation skills - the perils of being in the GPS generation. Travel data is vastly improving our quality of life. People are able to choose their mode of travel through many means, whether price is a priority or they just want the fastest route. Giving people visibility of their transport infrastructure is changing the way they interact with their cities. Omid Ashtari, GM of City Mapper, has received reviews along the lines of 'I'm sitting on a bus that was two streets away from my house, getting to my destination faster than ever before, and before City Mapper I never even knew this bus stop existed.' This is true. Apps are giving us travel options that we would not have considered previously. It might be faster to walk one station further away and get on a different line, rather than catching 3 trains to your destination. In Singapore they have gone one step further to make travel more hassle-free: they are supplying travellers with information on how crowded the buses are, giving commuters the option of waiting when a less-full bus is only 5 minutes away.

The Bigger Picture

"Amsterdam is NOT tracking old people." My favourite inflammatory statement of the day, from Ger Baron, the CTO of the City of Amsterdam. There is so much potential for smart cities to improve quality of life. Partnerships between the public and private sectors can have real effect. Amsterdam has been working with supermarkets to see if anti-obesity advertising is changing the ways people shop. Collecting data on people's shopping habits is allowing them to target their advertising more accurately, in the hope of reducing health problems long term.

Following the Boston marathon bombings, outdoor advertising firm Clear Channel International used its smart connected displays to convey information to citizens in real time. Bus shelters and billboards are now able to have direct contact with citizens to improve their quality of life. Simple things like weather forecasts and bus times are conveyed daily - but in the event of a disaster smart cities really can make the lives of their occupants a little easier.

As I have said before, smart cities are the future. The way we choose to live in them is up to us now. Personally I hope that we choose a socially conscious, environmentally sustainable future, but the design of that future is yet to be determined. The city is not a problem to be solved; it is a culture and a community to be fostered, and in the future engaged through innovation. For more information from the event, check my Twitter feed @Rhian_Wilkinson, or the hashtags #SmarterCities or #BTech2015. 
### InstaCloud unlocks the benefits of private cloud for SMEs Despite large scale adoption of cloud services in recent years, figures from the Federation of Small Business suggest that over one third (38 per cent) of SMEs still remain sceptical about the benefits, harbouring concerns over data theft, reliability and access. With many of these fears born out of public cloud experiences, a new private cloud option has been developed to give SMEs more control and confidence in the safety of their data stored in the cloud, without the big-business price tag.  InstaCloud has been launched to provide a low risk, low cost option for small businesses to take advantage of private cloud, without the need to set it up themselves or have the in-house resource or expertise to manage the platform. Founded by ex-Microsoft senior partner technology strategist, Steve Belton and Matthew Munson, who previously started hosting company PoundHost, the service is underpinned by a management team, who bring over 25 years of combined data centre and hosting expertise and heritage, giving SMEs renewed confidence in embracing cloud technology. With no contract required and a monthly fixed billing approach, end users have peace of mind when it comes to budgeting for cloud services with no hidden fees. Customers only purchase the space they need and with UK-based data centres in Slough and London, concerns around data governance and integrity are reduced. [quote_box_center] Commenting on the launch, Steve Belton, Head of Cloud Engineering & Co-Founder at InstaCloud explains: “Security and data sovereignty issues associated with the public cloud have tarnished large scale adoption of cloud services among the small business community in particular, in recent years. With many dismissing private cloud as an option and often lacking the internal resource or IT skill to go down this route, we have been slow to break down the barriers to take-up. Through InstaCloud we aim to give an affordable and realistic option to small businesses as a way out of relying solely on public cloud, which is often not fit for purpose or designed for consumer use.” [/quote_box_center] The InstaCloud solution is based on Microsoft’s Cloud OS technology and built on a state-of-the-art infrastructure platform. Offered with VM’s running Windows Server 2012 R2 and with licences included as part of this cost. Bandwidth provision is free of charge, with RAM and storage usage charged at a fixed rate. Network connectivity uses a software defined network and Cisco switches and firewalls. Solutions are provisioned with hardware and network redundancy in mind giving small businesses confidence in the integrity of their data, with NetApp storage ensuring high performance and stability at all times.  “Our instant, mini private cloud can grow with a business and through the pay-as-you-go model there are no nasty surprises meaning usage can be budgeted for in advance. We believe that SMEs should have flexibility and choice when it comes to state-of-the-art hosting with cost and fear no longer a barrier to experiencing the benefits of a secure and scalable private cloud solution,” adds Steve. 
InstaCloud is offering new customers a free one month trial and more information can be found at www.instacloud.com ### GTT is connecting Le Pain Quotidien [quote_box_center] Rick Calder, GTT President and CEO “We are very excited that Le Pain Quotidien chose GTT for our advanced cloud networking services, we provide a truly better way for them to reach the cloud and look forward to working with Le Pain Quotidien as they expand their global presence.”  [/quote_box_center] GTT Communications, Inc. the leading global cloud networking provider to multinational clients, announced earlier this week that Le Pain Quotidien, a global café-bakery chain, selected GTT to connect their retail stores across the United States. Le Pain Quotidien has more than 200 locations worldwide, across five continents, with more than 90 of these located in the United States.  “We selected GTT because of their proven track record to deliver reliable and secure connectivity to all of our locations coast-to-coast,” said Cezary Moroz, Senior Director of IT, Le Pain Quotidien.  “GTT’s ability to provide a unified solution that includes broadband, cable, Ethernet, and MPLS technologies set them apart from other network providers. The GTT solution provides our 90+ stores, and ultimately our customers, with the highest level of quality and service reliability.” ### IO data centres launch in the UK Today IO launched its UK operation with a new data centre facility in Slough, Berkshire. They were kind enough to invite Compare the Cloud along for a peek inside the high tech facility. There are currently more than 20 data centres in Slough, and from the outside, IO has stayed on plan, it is a steel framed building with coated steel walls and a coated steel roof system. Basically, it looks exactly how you expect a data centre to look. The space we were toured through was massive, the scale of the shell of the building is hard to believe. I have to admit that whilst standing in the middle of the presently almost empty data centre floor, I couldn't help but think how awesome it would be to race go-karts across the slick concrete floors. There is just that much room.  With more than 10,000 square metres, the Slough-based data centre features highly resilient power, and a dedicated substation. Slough will house the first international deployment of BASELAYER’s advanced ECO modules, designed to deliver ultra-efficient performance by using free, outside air, backed by traditional chilled water technology. Slough is IO’s second modular data centre outside of North America, following Singapore in 2013. As part of its long-term, strategic technology partnership with IO, Goldman Sachs has signed on as the UK facility’s first customer.  [quote_box_center]Nigel Stevens, IO’s UK Managing Director “We are delighted to open the data centre with such a prestigious first customer.”  “In preparation for our launch, we have assembled a great team of industry experts from the London area. We’re looking forward to helping existing customers expand their footprint into Europe, as well as attracting customers from the UK market,” “What’s different about IO’s modular approach to data centres is that you only have to buy what you need. Historically, this industry requires that you overprovision in order to ensure sufficient capacity. At IO, that’s not the case. You simply purchase space and power as your business demands it. 
When you add to that the ability to scale power density up to 30kW per rack, and the visibility provided by our IO.OS software, it’s clear that we have a very compelling solution.” [/quote_box_center] IO operates six facilities across three continents, and two of the largest data centres in the world. It is a global leader in software-defined data centres, pioneering the next generation of deployments in this field. Security was paramount to the facility's design, there is iris scanner access to the data centre floor and each module comes with its own security system too. Overall, the news here is, the UK now has another data centre provider. It's modular, high tech, and my favourite part, there is an aim to be eco friendly. NB. After the event was done, the shuttle bus driver didn't quite execute his turning circle correctly and had a bit of trouble with the double security gate system. Unfortunately security gates, no matter how high-tech, do not have a function to account for very irritable mini bus drivers. [gallery td_gallery_title_input="IO Data Centre, Slough" ids="20357,20358,20359"] ### CensorNet signs Blue Solutions Targeting SMBs with enterprise-class web security CensorNet, the next generation cloud security company, is partnering with IT software distributor Blue Solutions to offer UK SMB customers enterprise-class web security solutions. [quote_box_center]Mark Charleton, co-founder and director of Blue Solutions said, “As the modern business ecosystem diversifies, organisations are struggling to meet the needs of an increasingly mobile workforce as existing web security solutions are generally on premise. With more organisations moving to the cloud, it’s becoming increasingly difficult to monitor and control what users are doing online,” “Vendors like Websense provide web security solutions for the enterprise market, leaving SMBs largely ignored in terms of security functionality, commitment and price. However, CensorNet’s next generation web security solution, together with its in-built cloud application control (CAC) functionality, will give our resellers the opportunity to generate additional revenue streams. This will provide their customers with better visibility of what cloud applications their employees are accessing regardless of whether they are authorised or unauthorised.”  [/quote_box_center] Blue Solutions is focused on supplying leading-edge software to a diverse spectrum of resellers and MSPs across all vertical sectors. With over 18 years’ experience in helping partners effectively grow their business, the company supports small to medium enterprises.     “Whilst we are strong in end-point security, we’ve been looking for advanced web security technology and CensorNet is the only vendor, that’s truly committed to the channel and combines traditional web security and cloud application control in one single solution. We’re looking forward to working with such a talented, fresh team that really understands the channel,” concludes Charleton.   Ed Macnair CEO at CensorNet adds, “Blue Solutions is a highly proactive and forward thinking niche distributor, with advanced technical experience and an expansive SMB market reach. This partnership marks an important step in our plans to grow aggressively in the UK by offering SMBs our enterprise-class web security solutions. 
Our enhanced SWG offering provides organisations with a far more informative security solution which makes it a very compelling proposition for SMBs.” This announcement follows CensorNet’s recent launch of its new Secure Web Gateway product, which bridges the gap between web security and Cloud Application Control, offering customers the ability to discover and analyse cloud application use in a single solution. Blue Solutions will be introducing CensorNet at the Synaxon UK National Conference on 18 - 19 June in Daventry, UK.

### Going SaaS: Advice from an Infrastructure Provider

Brian Ussher, President and Co-founder, iland

Right now, offering applications in a SaaS model is not just an option for software developers — it is an absolute requirement. While many of our favourite apps are still deployed on premise, the “cloud first” initiative has gained tremendous popularity for the ease of deployment, maintenance and flexibility it provides end users. From our vantage point as an infrastructure provider, we work very closely with many software vendors to both develop and deliver their SaaS apps. This experience has uncovered some key considerations that we believe are critical for success.

Top things to think about when developing and delivering your SaaS app:

1) Planning and testing matter: We cannot overstate the importance of starting out with the right software architecture in place. The software architecture and infrastructure must work in unison in order to keep things running smoothly. This means choosing the right architecture, network and performance monitoring tools and plans to ensure success. Plan on standardising your deployments and creating templates of them so that as you add customers to your SaaS service, you know it will behave in a consistent fashion. Focus on tools that work in an extensible, open fashion and that will still be relevant five to 10 years from now.

2) Understand the interplay between the infrastructure, the code and the database: Whenever there is a performance problem with an application, the standard approach is to throw more resources at it to speed it up. While this may work in the short term, it does not address the root problem and often sets you up for a more catastrophic failure down the line. Understand the interplay of the infrastructure equation – namely the network, storage, compute and virtualisation layer – and your code, database and OS layer. Work with your team and your service provider to address their relative health individually as well as understand how to triage between them when troubleshooting a problem. If you are not sure how to handle an issue, start by separating these parts into buckets and work to assess their individual issues – but when doing this, make sure to have the right team leaders working on the problem to solve it quickly.

3) Communication between the core teams: Often we see that teams operate in silos where the outsourced development team is not interfacing well with the internal development team, product management or others. It is critical to choose the right partners and establish open communication lines up front. Also identify key stakeholders, rather than trying to unwind a dysfunctional communication stream later when deep problems arise. We have seen great success in organisations where one person is assigned to manage communications between these groups. 
4) Always have a Database Administrator (DBA): It is almost impossible to compensate for a database that is not architected optimally or, more importantly, maintained. Before embarking on a new SaaS-based app or converting from on premise, take some time to measure the health and viability of your database and make sure you have a DBA to guide you through this mission-critical part of the process.

5) Set up the proper dev, test, and staging areas for your app: This seems like obvious guidance; however, we have seen organisations that develop and deliver their apps entirely from a production environment, as well as from other risky staging environments. Setting up the right dev, test and staging platforms not only ensures an optimal experience for your customers, but also gives your teams the freedom to try new concepts and features in a safe environment. Many of our customers utilise our pay-as-you-go pricing to establish flexible laboratories to create and test their innovative ideas.

6) Have backups and a proper disaster recovery plan in place: In the SaaS world, your customers expect their apps and the proprietary information that may be stored in those apps to be constantly available. However, we see many app developers that do not have a proper disaster recovery plan in place. Be it natural disaster or cybercrime, a good disaster recovery plan will literally save your bacon in dark times. We specialise in disaster recovery, from backups to sophisticated Disaster-Recovery-as-a-Service programs, and can attest that DR is one budget item you cannot afford to cut.

The flexibility and cost efficiency of developing and delivering SaaS applications with a third-party infrastructure provider is substantial. As experts in infrastructure, and given our experience working with other application developers, iland has learned a great deal about how to cross the cloud chasm quickly, cost effectively and with minimal disruption to your customers. If you'd like to discuss SaaS any further, visit iland at www.iland.com.

### Smarter Cities Speaker Announcement

With the Bloomberg Technology Conference: Smarter Cities just a few days away, they are busy fine-tuning the agenda for what is shaping up to be a really fantastic event. You can see the full agenda and book your place here. Some exciting new speakers have just been announced:

Paul Smith, CEO of Cisco UK & Ireland, will be talking about the importance of collaboration and how his organisation is addressing the challenges of a changing world as populations migrate to urban areas.

Jessica Federer, Chief Digital Officer at Bayer, will be focusing on healthier cities and sharing her passion for translating digital developments into public health advancements.

William Eccleshare, Chairman & CEO at Clear Channel International, will look at how smarter cities create smarter citizens and what this means for the future of urban advertising and its role in creating, developing and maintaining smart cities.

To join them on 18th June at Bloomberg HQ in London you need to register on the Bloomberg Live website. We expect this event to sell out so would recommend booking your place today.

### My Open Source Journey

They say that a journey of a thousand miles starts with a single step. But the part they don’t tell you is that the direction that first step is taken in will have a drastic impact on the path you take. That’s why I’m glad that, back in 1992, the first step in my career journey was pointed toward the Open Source community. 
I became interested in Open Source for a pretty simple reason – I could get Unix-type technology at home with Linux. The technology appealed to my curious mind. Sure, systems were a little rough at times, but for the first time I could sit down and make a piece of software do what I wanted it to do – whether it was changing a keyboard layout, configuring a modem, adding support for something new, or any other customisation. Combined with the community’s white-hat appeal to my youthful idealism, I was officially hooked on Open Source. This was going to be my career.

"I was officially hooked on Open Source"

Convinced that the Open Source model – an alternative to the evil commercialisation of the software market – was the future, I shifted my own consulting company, starting in 1993, towards a strong focus on open source. I dug in more and became somewhat militant in my support. In fact, after about a year, I made a personal and business decision – I would only take jobs and projects involving Open Source. I look back at that young entrepreneur and I laud his scruples – but I cringe at his business acumen. I quickly found out that the consultancy would be a commercial disaster, but I learned some important lessons about finding a sustainable way to combine idealism and money generation. Despite my early business failings, the idea of monetising Open Source projects in a fair way stuck with me, and it remains a part of my DNA today.

The idea of monetising open source projects has always been subject to debate within the community, and after a while I began to realise something that has since crystallised and become one of my core beliefs today: business and the Open Source community need each other. I understand the viewpoint of the side saying any commercialisation is bad and destroys the spirit of open source – but I wholeheartedly disagree with it. Open Source is a fantastic tool to rally people around the common good, inspiring millions to do something for the simple reason that it’s the right thing to do. And while some dislike the idea of contributing for the benefit of a company, the understanding of how community development should work has become so strong that anybody can easily identify and weed out any individual or organisation with nefarious or selfish intentions and simply spend his or her time somewhere else. No harm done, and this choice is a good thing as it provides a level of self-regulation on the players in an Open Source community.

Yet it is only with the participation of corporations that the Open Source community can fulfil its potential. Companies have the coffers to become community benefactors, providing the resources that individuals, no matter how well-intentioned, simply cannot (with very rare exceptions). As long as they are adopting a model that does not encroach on the value of the efforts donated by community members, companies have the ability to deliver tremendous and lasting value to the Open Source community. We’ve seen time and again that there are enough projects and companies out there doing the right thing so a balance gets struck. Projects are furthered, knowledge is advanced, organisations profit and jobs matching Open Source contributor skill sets are created: a true win-win for all involved, with the possible exception of the staunch ideologue group that long ago counted me as a member. 
It was with this delicate ecosystem in mind that I decided I wanted to throw my full support and entrepreneurial spirit behind an enterprise-driven Open Source project. I came across ownCloud, a quickly growing community with developer appeal and traction, guided by the steady hand of Frank Karlitschek, an Open Source pioneer I am proud to now call a business partner. It was 2011, and the ownCloud project’s momentum and its location-independent architecture, combined with the emerging need in the market for data privacy and security in an increasingly cloud-based world, convinced me that this was the opportunity I’d been looking for. We began working together, and it’s been a remarkable success to date, with the Open Source community powering a scalable and sticky solution that’s seeing growing enterprise traction every year. While our dedicated team has helped advance Open Source universal file access and collaboration to the benefit of millions of ownCloud project users across the globe, the open source community has delivered tremendous value to our enterprise customers as well.

a community numbering in the millions ensures that the user experience will remain top notch

Building a business on the back of a successful Open Source project has brought our company – and its users – some real advantages. First and foremost, the speed of innovation is simply faster. Thanks to the contributions from the Open Source community, ownCloud has cultivated an ecosystem with thousands of developers problem solving and fine-tuning to make sure every bug is ironed out. At the same time, a community numbering in the millions ensures that the user experience will remain top notch – we know that our users won’t settle for anything but the best, and have the power to simply leave if it can be found elsewhere. Perhaps the clearest advantage, however, has been the built-in user base and cachet associated with a globally popular project. Thousands of start-ups have built a better widget, but failed because they lacked the awareness to gain market traction or because potential customers were not willing to take a leap of faith as an early adopter. By working with a proven project, our team has been able to eliminate those massive barriers to start-up success, enabling us to start quickly and continue to grow.

A lot has changed in the Open Source world since I first discovered it almost 25 years ago. But at its core, it remains very similar. It’s a vibrant community laser-focused on collaboration and the advancement of a common cause. As it’s grown and evolved, I’m thrilled to see that the community has set aside a seat at the table for like-minded businesses, including mine. As our team continues to work with the Open Source community to further the ownCloud project, I look forward to seeing what’s in store for the next 25 years.

### How do you make sense of Digital Disruption?

In today's business landscape I hear people trying to make sense of the terms digital disruption and digital transformation. Let me try (and I've got an event for that).

In 1995 (when we started to talk about Being Digital) less than 1% of the world population had an Internet connection, but today more than 40% do - that's 3.14 billion people and counting. For decades Kodak was a superbrand synonymous with photography, and even invented digital photography, but clung to their old film-based business model - where are they now? 
About a year ago WhatsApp, a relatively new startup, was acquired by Facebook for over $19 billion - they had built a customer base of over 350 million users in less than 5 years. In September 2014 the Indian Space Research Organisation put a spacecraft into orbit around Mars, making India only the fourth country to do this - but what is really amazing is that their whole budget was $74 million, which is less than the budget for the movie Gravity. Something very dramatic is happening.

Here is an extract from a new book called No Ordinary Disruption by Richard Dobbs, James Manyika and Jonathan Woetzel which quite rightly suggests we now live in a world of near-constant discontinuity: "Competitors can rise in almost complete stealth and burst upon the scene. Businesses that were protected by large and deep moats find that their defences are easily breached. Vast new markets are conjured seemingly from nothing. Technology and globalisation have accelerated and intensified the natural forces of market competition. Long-term trend lines, once reliably smooth, now more closely resemble sawtooth mountain ridges, hockey sticks (a plateau followed by a steep ascent) or the silhouette of Mount Fuji (rising steadily, then falling). Five years is an eternity."

Something very dramatic is happening, it's accelerating, and how do we make sense of it? How do we pull all of these threads together and figure out how to compete, how to create value, how to ride the wave of these forces and put technology to work for our businesses and brands?

On October 22nd Kongress Media and Agile Elephant are co-producing the 2nd edition of the Enterprise Digital Summit London at The British Academy, 10-11 Carlton House Terrace, to show you how. We will be addressing the mindshift required and the management challenges of making this digital transformation work end to end in your business. We will cover the digital topic and social collaboration techniques, but our emphasis will be on the employee, customer, partner and stakeholder behaviours you need to encourage and the issues of management and corporate culture that you need to address to put these new technologies to use. Let me talk through some of the great speakers we have on the agenda.

The opening keynote will be from Stowe Boyd. Stowe's a futurist, researcher, a bit of a maverick and describes himself as an edgling. He has been helping us make sense of technology and how it affects the world of work for decades. He coined the term "social tools" in 1999 and the term "hashtag" in 2007. We are delighted to have his insight kicking things off.

Our second keynote is from Vlatka Hlupic, Professor of Business and Management at Westminster University. Last year she published a book called The Management Shift, based on her research from over 20 companies that have been using her approach and leadership model. They range from small to large, across various sectors, and include a FTSE 100 company. She'll be presenting her model of 5 levels of emergent leadership.

We have practical case study stories from Vodafone and Pearson, and a great collection of industry speakers and commentators. Along with those speakers there will be some great panel discussions, and the chance to participate in a number of workshop sessions around transformational change management, digital workplace management, community management and adoption of social tools. Our friends at Compare the Cloud will be there tweeting and blogging about it. 
If you are interested in joining us and broadening your mind around digital, then go here for tickets and full details.  ### AppFormix Exits Stealth Mode Compare the Cloud met AppFormix at OpenStack Vancouver last month, and we're now pleased to welcome them into the public eye! Delivering Control and Optimization of Cloud-Based IT Infrastructure Founded to optimize operations for applications running in a shared environment; company secures series A from August Capital AppFormix emerged from stealth mode on Tuesday June 9th, 2015 to address the challenges involved in running applications on a shared infrastructure using virtual machines or Docker containers to create a fully optimised cloud-based data centre. AppFormix bridges the gap between application requirements and the underlying resources. Currently in beta, the AppFormix software solution is designed to provide developers, operators, and DevOps teams the ability to monitor, analyze and control how applications consume cloud resources in real-time, thus enabling the enterprise to operate a reliable, cost effective, and agile hybrid cloud. “Due to resource contention among competing tenants and applications, shared infrastructures deliver variable and unpredictable performance. The impacts include fluctuating application performance and inefficient infrastructure usage,” said Paul Burns, President of cloud research firm Neovise. “AppFormix addresses these issues by providing real time visibility and control over infrastructure consumption through APIs and policies.” Real-Time Infrastructure Visibility for Predictable Application Performance Applications can only perform as well as the infrastructure they run on. Truly reliable performance can only occur when cloud infrastructure is more responsive to the needs of developers and their applications. AppFormix software automatically detects and eliminates resource contention among applications on shared infrastructure. Resource bottlenecks are always moving, and AppFormix is focusing on enabling developers, operations and DevOps teams to use the same toolset to pinpoint them in real-time and establish policies for preventing problems in the future. AppFormix ties into OpenStack, Kubernetes and Mesos for managing the physical infrastructure and orchestrating virtual machines and Docker containers. Delivering Expertise in Enterprise Technology Founded by technologists with deep industry experience in virtualization, networking, analytics, and security, AppFormix also today announced series A funding of $7 million from top tier venture capital firm August Capital. “August Capital prides itself on finding technologies that are integral in providing long-lasting improvements to the data center. AppFormix has the potential to be truly transformative to enterprise IT,” said Vivek Mehra, general partner at August Capital. “The AppFormix team has a proven record of past success and we look forward to working with them.” “Siloed tools for different types of infrastructure are yesterday’s technology. The future is about real-time infrastructure visibility and the convergence of all tools and teams across the IT department,” said Sumeet Singh, founder and CEO of AppFormix. “Cloud infrastructure is shared infrastructure. 
By providing application-level visibility of the infrastructure and programmatically controlling how applications use shared resources AppFormix is enabling a more economical, reliable and agile software-defined data center.” Product Availability AppFormix is currently in beta, and will be generally available in Q3 2015. For more information go to appformix.com. AppFormix software is currently available for Linux and Windows. About AppFormix AppFormix is focused on helping operators and developers optimize multi-tenant cloud infrastructure while ensuring the ability to keep applications running reliably and efficiently. To learn more, visit http://www.appformix.com/. ### Nitro launches Pro 10 Nitro launches Pro 10, accelerating the way businesses create, prepare and sign documents Nitro, has launched Nitro Pro 10, the latest version of its desktop offering. The development of Nitro Pro 10 continues Nitro’s innovation streak following its major funding round in November 2014.  Nitro Pro is a document productivity solution that accelerates the way businesses create, prepare and sign documents; anytime, anywhere. Advancements to the product make it quicker and easier to use than ever before, enabling UK businesses to improve internal efficiency and optimise their teams’ productivity. [quote_box_center]  Sam Chandler, founder and CEO of Nitro, commented: “Nitro continues to transform the way the world interacts with documents. This solution transcends competitive point solutions in PDFs and e-signatures, establishing Nitro as a firm leader in document productivity, and the only viable competitor to Adobe Document Cloud.” [/quote_box_center]  With a rich history in providing digital document solutions, Nitro currently helps more than 500,000 businesses around the world to improve their document productivity. Nitro continually searches for ways to help companies remove costly bottlenecks and inefficiencies, and recently partnered with the PDF Association to survey more than 1,200 knowledge workers to gain insight into the document usage habits of the modern workforce. The survey found that the top document challenges are editing (55 percent of respondents), reviewing and signing (31 percent each), which are negatively impacting productivity. Nitro Pro 10 offers key cloud-based integrations to overcome productivity challenges, including the capability to request and receive unlimited e-signatures on desktop; online storage integrations with Google Drive, OneDrive and Dropbox; and online collaboration tools to share documents and provide feedback with online mark-up and commenting tools. [quote_box_center] “Modern businesses are still constrained by a vital business tool they use every day for multiple hours at a time – documents,” said Sam Thorpe, director of product at Nitro. “In developing Nitro Pro 10, we worked closely with businesses across the spectrum, from startups to Fortune 500 enterprises, in order to deliver a completely new solution specifically designed to provide greater visibility and control over documents."  [/quote_box_center] About Nitro Nitro is changing the way the world works with documents. From the desktop to the cloud, Nitro makes it easy to create, edit, share, sign and collaborate – online or offline. The goal: make businesses more productive. More than 500,000 businesses run Nitro, including over 50% of the Fortune 500. 
Nitro is the PDF software partner of choice for Lenovo, and our award-winning products, including Nitro Pro and Nitro Cloud, are used by millions of people around the world every month. Nitro EMEA is headquartered in Dublin, Ireland; Nitro also has offices in San Francisco, USA, Melbourne, Australia, St. Petersburg, Russia and Nitra, Slovakia. One of the fastest-growing private companies in the world, Nitro is also a multiple Inc. 500/5000, BRW Fast 100, Deloitte Technology Fast 50 and Software 500 award winner. For more information: www.gonitro.com/about.                    ### Every cloud needs a home [quote_box_center]The phenomenal growth in cloud based service delivery and managed services is changing essential data centre requirements. For any Managed Services or Software as a Service (SaaS) provider looking for a data centre to support this critical new business model and meet 99.999% customer Service Level Agreements (SLA), issues such as the resilience of connectivity and power, as well as security are clearly critical. Organisations also need to consider the time it takes to get engineers on site – not only to address issues that may impact the SLA but also to set up and install new customer equipment quickly. While every cloud clearly needs a home – it cannot be just any home. As Matthew Dent, CEO, Volta Data Centres, explains, in an era increasingly dominated by the quality, reliability and resilience of cloud based models, organisations cannot afford to jeopardise the life-blood of the business by failing to unlock the door of the right data centre location. [/quote_box_center] New Imperative Gartner predicts that by 2016 cloud computing will make up the bulk of new IT spend. 2016 is also predicted to be a defining year as private cloud begins to give way to hybrid cloud and nearly half of large enterprises are expected to have hybrid cloud deployments by the end of 2017. The resultant shift towards Software-as-a-Service (SaaS), Platform-as-a-Service (PaaS) and Infrastructure-as-a-Service (IaaS) is having a fundamental impact on data centre priorities. These cloud services require a physical infrastructure: a safe and secure environment from which the business critical servers and hard drives that contain the applications and data can operate. They require fast, uninterrupted data connections, a totally secure and reliable power supply and buildings that offer the highest possible levels of security. Five keys to finding the right home for the cloud: Key One: Location The right data centre location is not just about the obvious requirements of avoiding the flood plain or flight path, although these are clearly essential.  For any Managed Services provider there are a number of other considerations, not least the time it takes engineers to travel to and from the data centre. While travel time is less of an issue for a colocation site that will rely on remote management and data centre staff for the majority of support requirements, most Managed Services providers have a far more hands on approach. From adding new customer equipment, to the need to ensure any issues are immediately addressed to meet customer Service Level Agreements (SLA) and avoid problems, the speed with which engineers can reach the data centre can be critical. A data centre in central London that can be accessed in minutes rather than hours reduces wasted travel time and costs, enabling a far more responsive service. 
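To put that SLA point in numbers, here is a minimal sketch (plain Python, purely illustrative) that converts an availability percentage into an annual downtime budget; the 99.999% figure is the five-nines level quoted above.

```python
# Convert an availability SLA into an annual downtime budget.
# Illustrative sketch only; 99.999% is the five-nines figure cited above.

MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600 minutes

def downtime_budget(availability_pct: float) -> float:
    """Return the maximum allowed downtime per year, in minutes."""
    return MINUTES_PER_YEAR * (1 - availability_pct / 100)

for sla in (99.9, 99.99, 99.999):
    print(f"{sla}% availability -> {downtime_budget(sla):.1f} minutes of downtime per year")

# 99.999% leaves roughly 5.3 minutes a year - very little margin for an
# engineer who is hours away from the data centre.
```

With so small a downtime budget, the minutes an engineer spends travelling to site can consume the entire annual allowance in a single incident.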
Key Two: Business Scalability With many Managed Services providers making their first foray into the cloud, it is extremely tough to predict business growth – and hence data centre requirements. It is therefore essential to look for an environment that offers flexibility. For example, can the data centre run high-density and low-density racks side by side? Without row-based, rack-level cooling, an organisation might struggle to adapt or add racks without the upheaval of moving to a different location within the building. Checking the options available for adding and expanding racks is essential for any provider expecting to rapidly scale up its cloud-based business. Key Three: Connectivity Cisco estimates that by 2017 global cloud traffic will reach 5.3 zettabytes (one zettabyte of data is equivalent to the information stored on around 250 billion DVDs). In this environment the issue is not only the reliability of the connectivity but also latency: the ability to offer ultra-low transmission times fundamentally transforms the quality and type of solutions and services that can be offered. In terms of resilience, every data centre offers multiple Tier 2 carrier options. Few, however, are able to offer multiple Tier 1 carriers. As a result, connectivity is still reliant upon a single fibre provider – typically BT. Without diverse fibre connections, the back-up options are limited: any damage to the underlying cable network – for example, damage during road works – will take out every Tier 2 connection that is simply using the same last mile from BT. For any Managed Services provider it is essential to look for a data centre with multiple Tier 1 carrier options and diverse entry points into the building, as well as Tier 2 carrier services, in order to achieve full resilience. Key Four: Power Supply All data centres can be expected to have a dual power supply from the grid, and all have some form of resiliency or back-up to the grid power supply from an array of batteries or generators. How many in central London, however, can offer dedicated 33kV transformers? Most urban data centres are limited to 11kV tapped into one main grid substation – if that goes out, they will lose both A and B grid feeds. A data centre should take dual feeds from different substations to provide true resilience – however, given the limitations regarding power supply in London, this is becoming harder and harder to achieve, creating potentially huge risks for Managed Services providers. Key Five: Security Security has always been a vital consideration within any data centre environment. From a bomb-proof building, to multiple levels of security controlling access to the building and the server room, to the option of adding cages around the racks or even a dedicated private suite, depth of security facilities is key to addressing the changing threat landscape. Given customer concerns regarding data security in the cloud, it is essential for any Managed Services provider to be highly confident in the procedures in place, from authorised and unauthorised access to fire and vandalism risks. Conclusion The cloud has transformed the way businesses operate and the way the world does business. However, in what is becoming a commodity marketplace there is a risk that organisations are failing to understand the essential data centre requirements.
This is a business critical decision – rather than buying down to a price, expectations should be rising and organisations should be continually looking for more from providers to support an evolving business model. The right environment does not only minimise risk but also reduces costs by cutting non-productive engineer time and improves responsiveness to customers. Isn’t it time to unlock the right home for the cloud? ### The changing face of European cloud The European cloud market has well and truly come of age. Like the market in the US, companies in the region have now taken the industry beyond the early adopter phase and into an era where cloud is transforming critical business processes. This is not just conjecture; the numbers back it up. According to IDC, a third of global IT spending in 2015 will be dedicated to cloud computing. What is more, this spend is being driven, in no small part, by activity in EMEA. Western Europe, for example, is showing the fastest rate of growth in cloud spending with a rise of 32 per cent. So what are enterprises now demanding of cloud service providers in the region? There has been a generational change in the security, performance and compliance of the cloud That was then, this is now Traditionally the large proportion of initial cloud deployments were used for test/development projects or new Software-as-a-Service (SaaS) applications. The cost efficiencies of this practice have always been clear but, while useful, these deployments did not tap into the full latent potential of the cloud. Now enterprise organisations in the region have made the most significant step - investing in cloud computing for their most critical production environments. Businesses are now allowing cloud computing to transform the way they work. So what has driven this change and allowed the region to move forward so rapidly? A generational shift There has been a generational change in the security, performance and compliance of the cloud, making it too valuable for enterprises to ignore. It is no longer a question of whether companies should adopt cloud, rather, a question of how quickly they are going to use it to drive business growth.    The evolution of SAP as a company is a perfect indicator of this shift. The business software giant is shifting its focus to a “cloud first” approach, with the launch of S4/HANA leading the way. The message is clear. The enterprise is not just adopting cloud; it is demanding high performance cloud for critical production environments.    Unlocking true potential The cost-savings that come with operating in the cloud have never been in doubt, but it is the agility that cloud is now bringing organisations that is the real long-term benefit driving adoption. Companies can now truly move at the speed of their business. Business pressures change on a daily basis and modern European businesses are realising the value of cloud services that can be scaled up or down in direct response to changing demand. The infrastructure deployment of large SAP implementations for example is no longer cost-prohibitive and we are seeing the large business software companies adapting to provide cloud implementations that match the agility requirements of the modern business. With this increasingly sophisticated use of the cloud comes a greater scrutiny of the practices of cloud providers, as the market in EMEA is discovering. The data privacy and sovereignty debate that is in full flow across the region embodies this. Home comforts In the post-Snowden era... 
In the post-Snowden era, countries including Germany, Italy, Russia and even Brazil have passed laws ensuring that citizens’ data must be stored within their country of residence, presenting an obstacle for many cloud service providers operating globally. As cloud becomes more pivotal to the European enterprise, these cloud providers will need to provide satisfactory answers to enterprise concerns over data privacy if they are to be classed as truly fit for purpose. So what is a satisfactory response? The reality is that companies throughout the region want their corporate data to be subject only to local laws. Merely storing data within a country will not be enough to assuage these concerns. Data needs to be stored and managed by a national company to eliminate the danger of judicial overreach from abroad. For a comprehensive approach, cloud providers can operate within a service provider model. Working with national telecoms operators and systems integrators can ensure that data is held under the remit of national law. Dealing with the location of data is only half the solution; data ownership is the most pressing matter that cloud providers need to address in EMEA as the debate evolves. The face of the cloud in EMEA is changing at a rate of knots; if businesses fail to adapt, they risk being left behind. ### What is the “Consumerisation of IT”? By Simon Porter Going to the office used to be exciting: there was a fast internet connection, that brand new laptop you couldn’t afford to have at home, and it didn’t take an inordinate amount of time to load the next webpage. The consumerisation of IT has been caused by the wide availability, as well as the increasing affordability, of high-performance technology, in the form of smartphones, tablets and laptops, and connectivity in the form of 3G or 4G as well as broadband. It has not only had an impact on the way we play games or watch films in our spare time, but has also impacted the enterprise IT industry. For a very long time the office IT environment was miles ahead of what people could afford to have in their homes, but the commodification of the internet changed that. As products became more widely available, the average consumer began to have a full office set-up at home. The internet connection was often just as good as, if not better than, the one available in the office. As this shift occurred, businesses suddenly found themselves being pressured by their employees to meet the standards of IT they were able to achieve outside of the office. As new technologies are now being developed in the consumer space, rather than solely in the enterprise IT sector, businesses need to think about how they will be affected, and how they can take advantage of this influx of new technology. IDG Enterprise’s study ‘Consumerization of IT in the Enterprise’ (CITE) found that the rapid growth of personal devices being used for work has required 82% of organisations to make changes. Enterprises are outspending SMBs on mobile apps designed to increase customer retention Policies on how corporate data can be shared are now commonplace, and businesses are starting to invest in mobile device management (MDM) solutions, as well as purchasing secure file sharing services. The CITE study also found that enterprises with more than 1,000 employees are more likely to invest in mobile apps to increase customer and employee satisfaction than their SMB counterparts (46% versus 38%). Enterprises are also outspending SMBs on mobile apps designed to increase customer retention (38% versus 20%).
The following graphic shows the breakdown by benefit area, with the SMB figures shown in blue. For SMBs, corporate and government organisations, the biggest challenge is the security risk caused when employees use unapproved technologies and devices at work. Even with corporate security policies in place that limit or prevent their use, there needs to be an awareness that personal devices will still be brought into the workplace and used by employees. 90% of enterprises say that the use of consumer or individual services for work – including Dropbox, Google, Skype, LinkedIn, Facebook and other social networking sites – is pervasive today. 49% of these services are used with IT approval, and 41% are not. 79% report that file sharing and collaboration tools, including Box, Egnyte, Google Apps, Microsoft Office 365, GroupLogic, ShareFile and others, are pervasively used today. 49% are used with IT approval and 30% are not. The following graphic provides a comparison. As IT becomes more and more consumer-centric, both enterprises and SMBs need to consider the ways in which adopting BYOD policies and outside tech can benefit their business. Allowing employees to work on devices with which they are familiar and productive may outweigh the negatives, as long as a proper security system is in place. We work with many SMB clients to help them take advantage of BYOD policies, but in a secure way. Read here how a midsize UK sales agency generates revenue for its large clients whilst protecting corporate data. unapproved IT will find its way into the workplace, whether companies like it or not The important take-away from the CITE study is that unapproved IT will find its way into the workplace, whether companies like it or not. Putting in place an effective management plan can allow consumer-developed IT to benefit the business. As more and more devices become technology-embedded with the Internet of Things explosion, we need to look toward a future that will bring many more devices into the enterprise space. ### May Top 10 #CloudInfluence UK and Cloud Bursters Topping the May #CloudInfluence bursters table was Steve Mollenkopf, Qualcomm’s President and COO, who was second in the top 50 global rankings, commenting on the Avago Technologies acquisition of chipmaker Broadcom Corp and how this might create new competitive challenges for Qualcomm. Kamal Nath [2], CEO of Sify Technologies, appeared in articles covering the cloud enablement of the Madhya Pradesh State Data Centre, which won a 'Best Cloud Solution Implementation' award. Gordon Coburn [3], President of Cognizant Technologies, was promoting the fact that the company is starting to build its own software platforms and tools as part of its push towards digital and generating more revenues from newer areas of technology such as cloud computing and analytics. Sundar Pichai [4], Vice President of Chrome at Google, rose to prominence explaining how Google plans to convince developers that it is the easiest platform to build on for cloud and the Internet of Things (IoT). New Jersey native Chris Palermo [5], the founder, CEO, and President of Global Communication Networks, Inc., was followed by Dennis Grunt [6], a director at Silicon Valley Bank, as the bank agreed to provide a debt facility to MOBI.
Then came Brian Metherell from Toshiba, Edward Snowden commenting on data privacy, Mark Meyers celebrating the millionth sale of Spiral's CloudPets, and finally John Minor from JobsOhio, following Amazon's decision to expand its cloud computing operations in central Ohio, creating 1,000 extra jobs. [table id=31 /] Jumping into top spot in the UK table was John Jackson, Assistant Director (ICT), London Borough of Camden, commenting on how, at a time of austerity in government, running a low-cost, flexible hybrid cloud service for the borough’s customers is essential. Camden went with CloudBolt, which he claims provides a flexible service with a mature product architecture that is simple and infinitely extensible. This pushed IBM’s Simon Porter down to second in the UK table for the first time. [table id=32 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### May Top 50 #CloudInfluence Individuals This month’s top 10 has a few familiar faces, but also several new entrants, demonstrating the volatility of the individual rankings and the high turnover from one month to the next. After a brief flirtation with top spot following the excitement around the first ever disclosure of financials for AWS, Jeff Bezos dropped back to fifth this month. This allowed Satya Nadella, Microsoft’s CEO, to regain top spot for May. Steve Mollenkopf, Qualcomm’s President and COO, was next, commenting on the Avago Technologies acquisition of chipmaker Broadcom Corp and how this might create new competitive challenges for Qualcomm. There was a familiar face in third, FBR Capital Markets analyst Daniel Ives, commenting on the fortunes of a host of different US-listed tech firms. And in fourth, just ahead of Bezos, was Marc Benioff, Salesforce.com CEO, following speculation that the firm might be acquired. Then came Adam Massey, Google’s director of global cloud ecosystem and partnerships, commenting on how Salesforce is teaming up with Google, as well as others such as Cloudera, to bring big data to its analytics cloud, and then Adam Pratt, a spokesperson from IBM, attempting valiantly to fend off speculation about job cuts at the firm’s global delivery centres in Dubuque and Columbia. Nick Lippis, Chairman and Co-Founder of the Open Networking User Group (ONUG), rose to prominence following ANVI’s announcement that it had successfully completed tests to validate its technology solutions against ONUG’s key IT specifications. Completing the top ten are Bill McDermott, Co-Chief Executive at SAP, and Majd Sakr, computer science professor at Carnegie Mellon University. Interesting appearances outside the top ten include: Alibaba founder Jack Ma is being replaced by Daniel Zhang [14], the new CEO, who makes his first appearance in the rankings. He is joined by Alibaba’s president of cloud computing, Simon Hu, as the firm starts being seen as an increasingly credible cloud player.
A number of academics and analysts also appear in the rankings: as well as Majd Sakr of Carnegie Mellon University in 10th, there is Wesley Kou [23] from National Taiwan University, top cloud analyst Lydia Leong of Gartner and market analyst Samad Samana [11] – the second representative from FBR Capital Markets. A regular in the rankings, frequently the top IBMer and almost always the leader in the UK rankings, Simon Porter [29] suffers a slight setback this month. Not only is he not the top IBMer this month (behind Adam Pratt [7]), but he has been pipped in the UK rankings by John Jackson [28] – albeit by only the slightest of margins. See the full UK rankings tomorrow. Clients making use of cloud also appear, with Gerry Moore, the CIO of St James Hospital, discussing the organisation’s use of Dropbox across the five hospitals and multiple clinics that it manages, spread across three European countries, and John Jackson from Camden Council (who’ll be covered in more detail in tomorrow’s UK rankings). [table id=30 /] NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### May Global #CloudInfluence Organisations Maybe cloud is becoming a two-horse race after all. There is the pioneer and long-time leader, Amazon Web Services, with its impressive market share lead and its massive ecosystem of service providers, but an incredibly lean PR and marketing operation. Then there is the old-era behemoth, Microsoft, with its rapid growth in cloud, supported by its large ecosystem of existing partners, its dominant Office suite and its powerful marketing machine. This month’s rankings show that they retain a clear lead over the pursuing pack, but what happened to all the other challengers? Google was seen as one of the cloud leaders, but has recently made some worrying missteps. In previous bouts of cloud price combat, Amazon has either instigated a price drop or cut prices in reaction to one of its rivals, and the rest of the leading players have followed suit. Google recently dropped its prices, but its rivals didn’t feel the need to respond. Does this mean that price is no longer a key factor, or that Google is no longer seen as a threat? We believe that for some time now price has been only one of a number of key factors, with ecosystems becoming increasingly important – something that we will return to later. We don’t dismiss the chance of further bouts of price competition ahead, but there are signs that Google may no longer be seen as such a threat by Amazon and Microsoft. Amazon is still generally seen as having a lead in total features – partly down to its own innovation, but increasingly from its thriving ecosystem of service providers. To date Google has tried to differentiate its cloud offerings by focusing on developer-friendly services, with Microsoft using the dominance of its Office apps as a critical differentiator, along with its partner ecosystem. However, the one with the weakest ecosystem of the three is Google, and maybe the recent price play, rather than a sign of confidence, is one of weakness.
Until it breaks out its cloud financials as AWS recently did, we won’t know what its margins are, but the fact that Amazon and Microsoft haven’t felt pressured to match the price cuts says a great deal. Few among the rest of the field are in a position to make a serious challenge right now, unless they make a significant new move. IBM’s acquisition of SoftLayer made it a player, but hasn’t borne the fruit that had been expected, with the company refocusing on premium private or hybrid clouds. It has also made numerous recent partnerships with everyone from Microsoft to Twitter, but it still needs to make its SoftLayer expansion strategy work or make another big move. Right on cue, IBM announces the acquisition of Blue Box, to expand its presence in OpenStack and extend its cloud reach. Whether this will be the big move that IBM needs to make, one of a number of smaller moves that will collectively make the difference, or too little too late, remains to be seen. EMC has just made a big move. It already partially owns VMware Corporation and Pivotal Software, giving it a unique relevance in the cloud market, and its CEO Joe Tucci is chairman of the board for both of these firms. In late May it also announced that it was to acquire Virtustream. Exactly how EMC plays its cards will determine its future in cloud. If its federated structure doesn’t enable it to capitalise on these assets then it will come under renewed pressure to spin them off – especially VMware. We saw HP at the recent OpenStack Summit in Vancouver, presiding regally over the community that it has done so much to support. Nobody would have thought that, following a string of disastrous acquisitions and management blunders, it was in the midst of a corporate split, with almost every line of business struggling in its market. In cloud it has little to show from its relatively recent Eucalyptus acquisition and is struggling to make ground in private cloud, with almost no presence in public cloud. HP’s woes are reflected by other OpenStack players such as Red Hat and Rackspace, as the open source cloud community struggles to make itself the de facto standard for private cloud, having all but ceded public cloud to AWS, Microsoft and Google. All this is reflected in our May 2015 #CloudInfluence rankings: Microsoft’s marketing machine keeps it firmly in the lead while Amazon’s market share and ecosystem keep it in second. [caption id="attachment_20208" align="aligncenter" width="640"] There are obvious leagues of difference between Microsoft, Amazon and the rest of the field.[/caption] [caption id="attachment_20209" align="aligncenter" width="640"] This graph shows a zoomed perspective of the battle at the end of the month.[/caption] Of the rest, Google’s price drop boosts it into third, though some distance behind the top two; Gartner manages to rise to prominence commenting on the others, while IBM leads the chasing pack, with EMC gaining a boost from publicity around its recent Virtustream acquisition. Apple is in a unique position all its own – dominating its own consumer ecosystems. Then there is little to choose between the rest (including Cisco, Citrix, Hewlett Packard, Intel, Oracle, Rackspace and SAP). Notable mentions from among the rest of the field: Alibaba [24] continues to gain momentum, as the Chinese giant becomes a more globally recognised brand. Another Chinese entrant was China Telecom [17], while Oversea-Chinese Banking Corp.
(OCBC) [28] commented widely how long China will prop up bondholders to ensure a smooth clean up of its local debt mess, including troubled local cloud players. Elliot Management [25] rose to prominence agitating for EMC to spin off VMware. And Avnet [36] was our first distribution to make it into the top 50. [table id=28 /] NOTE: the Compare the Cloud #CloudInfluence league tables, are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### An Alternative Future – re-evaluating cloud and DR By Ian Masters Alternative histories can be a creative way to explore what we currently know. Fiction pieces like Fatherland by Robert Harris and 1984 by George Orwell are famous the world over. In the world of IT, I think this approach can also be applied. It can be fun to imagine how the world of technology might be different. For example, home entertainment may have been very different if the war between VHS and Betamax had been won by the superior tape format. In another world, Blockbuster would have acquired the fledgling Netflix and continued to own film rental via online channels as well as more traditional sales. Perhaps the Post Office here in the UK would have launched itself as a broadband operator, rather than companies like BT and Cable & Wireless, in order to make a success of the rise of Internet shopping rather than concentrating solely on the last mile of delivery for post. Imagine that the world of x86 servers and storage platforms has to compete in this “cloud first” world For this article, let’s hypothesise that cloud computing came to market earlier. For example, what if a savvy IBM employee had extended the mainframe model of multi-tenant computing services faster than the shift to mini PCs and client-server IT took place? This approach could have offered a cloud-style approach to delivering IT. For business continuity and disaster recovery (DR), this would lead to an interesting dilemma. Imagine that the world of x86 servers and storage platforms has to compete in this “cloud first” world. Investing in DR would have two options: the cloud model, or the “new wave approach” of implementing your own infrastructure and operations. Traditional DR implementations require secondary servers, licenses and storage assets. While this approach to delivering DR to the business can still be valuable – particularly when there are applications on specific hardware or that require certain skills to maintain – the secondary set of infrastructure is a big potential expense. For long-term backup, physical media like tape may also have to be implemented too, meaning further costs. This all has to be hosted somewhere, so secondary sites would have to be considered. Justifying this level of investment – particularly in a set of technology assets that you hope you never have to use – becomes difficult. Why would a company buy in two sets of physical services or virtual infrastructure, when you can simply buy in a service?  For CIOs that are considering how to use cloud, it’s important to look at the end results. 
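As a rough illustration of that trade-off, the sketch below (plain Python) compares the two options discussed above; every figure in it is an invented placeholder rather than real vendor pricing, so treat it as a template for plugging in your own numbers.

```python
# Back-of-the-envelope comparison of the two DR options discussed above.
# Every figure below is an illustrative placeholder, not real pricing.

YEARS = 3

# Option 1: duplicate on-premise infrastructure kept on standby
secondary_servers = 120_000      # hardware + licences (one-off, assumed)
secondary_storage = 40_000       # arrays + tape (one-off, assumed)
secondary_site_annual = 30_000   # hosting, power, maintenance per year (assumed)
capex_dr = secondary_servers + secondary_storage + secondary_site_annual * YEARS

# Option 2: cloud DR service, paid monthly, only fully spun up during failover
monthly_replication_fee = 2_500  # assumed
failover_days_per_year = 4       # assumed test + incident time
failover_day_rate = 1_200        # assumed on-demand compute cost per day
opex_dr = (monthly_replication_fee * 12 + failover_days_per_year * failover_day_rate) * YEARS

print(f"3-year duplicate-infrastructure DR: ~£{capex_dr:,}")
print(f"3-year cloud DR service:            ~£{opex_dr:,}")
```

The point of the sketch is not the specific totals but the shape of the spend: one model front-loads capital into kit you hope never to use, the other spreads a smaller, usage-based cost over time.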
Thinking about this “alternate history” can help show where IT as an industry is sticking to ways of implementing strategies, like DR, because that is the traditional approach. It can also highlight specific areas where there are still problems to solve. Companies may have challenges around regulation that may stop them making use of the cloud for all their critical data. However, cloud services have been developed that can take account of those regulations, so this is becoming less of an issue. Local cloud services that are run by UK-based companies are an option when data residency issues are important, for example. It’s important that we view the alternatives with an open mind. In the past, looking at the alternatives to traditional DR technologies has often involved a degree of sounding out the unknown. Today, cloud has reached a degree of maturity that makes cloud DR a potentially better option than traditional methods. It’s important that we view the alternatives with an open mind. ### Thoughts on "The End Of The Human Race" Stephen Hawking was recently quoted saying that the full development of artificial intelligence (AI) could "spell the end of the human race"; it has also been described as our biggest existential threat. artificial intelligence (AI) could "spell the end of the human race" I'm not sure if I should find this worrying or not. I am a millennial, in every sense of the word. I was born in 1992 (yes, just let that sink in for a moment). Data privacy is not at the forefront of my mind; I give up my details freely to the internet, Facebook, Google, Twitter - it does not bother me. I studied books like 1984 in high school (yes, the paperback, not the e-book). I actually adore George Orwell's writing; it fascinates me that what was once considered a dystopian future society is now so close to our reality. If you are in public, look around you. I guarantee you can see at least one security camera. If you are at home, count the connected devices within the radius of your reach. If you are reading this, you are being surveilled. Now, does that honestly bother you? Because in reality, we have come too far for you to start worrying now. Your entire life can be deduced from the data you unwillingly, or unknowingly, leave behind. Those 'crazy' people who live in the mountains and wear tin foil hats? If right now, you decided to move to the mountains and drop off the grid, someone would notice. Your little trail of lights and beeps and pings would go silent. You may not be found - that would be a feat on your part - but your absence would be recorded. Dystopia is now. You are being recorded; your digital footprint exists. Your entire life can be deduced from the data you unwillingly, or unknowingly, leave behind. Does this freak you out? Personally, I am not bothered. I am of the 'if you have nothing to hide, why should it matter if someone is looking' school of thought. I couldn't care less if someone in the FBI is looking at my digital footprint - what are the chances that they actually are looking? Why on earth would I be of interest to them? Consider this for a moment: someone you care for goes missing - but they have their smartphone on them. Because of tracking technology, the police are able to pinpoint exactly where they were when they disappeared. On top of that, their digital footprint will tell the police exactly who that person has been interacting with, and where they last tweeted from.
Big data analytics can analyse their Twitter feed and their Facebook page for emotional sentiment, to see if they are a likely candidate as a runaway, or more likely to be a victim of something more sinister. The fact that we are constantly surveilled may unnerve you in some ways, but in others, are you not thankful for it? Who wouldn't want a sassy robot maid? Artificial intelligence is exciting to me; I loved The Jetsons growing up. Who wouldn't want a sassy robot maid? I don't think AI will be the end of the human race; I see androids being an integrated part of society. I have watched enough sci-fi movies to know that terrifies some people, but if anything, shouldn't those movies have just taught us how to avoid those situations? For me, AI should go hand in hand with EI, Emotional Intelligence. To be blunt about it, I am sure we can design AI that has more EI than certain human beings do presently. Why worry? In fact, shouldn't you be more worried about the person sitting next to you on the train than the robot that has been programmed to EI perfection? Surely the more volatile of the two is the human being, who has so many unknown factors. AI should go hand in hand with EI, Emotional Intelligence. There is a reason we humans surveil other humans so closely - they do bad things. Don't you think it would be nice to have AI and EI perfect robots that we wouldn't have to worry about all the time? What do you think? Tweet me and let's talk about it, @rhian_wilkinson ### IBM Gate Crashes MongoDBWorld in NYC By Neil Cattermull IBM has turned up unannounced at MongoDBWorld in NYC with the unlikeliest of technologies to show off. IBM turned up in an unlikely location with their new z13 mainframe technology yesterday. MongoDB World is running this week in New York and is a typical open source conference - a strange place for IBM to turn up with their mainframe technology, right? Not so fast. IBM labs seem to be working on a stealth effort to enable the open source ecosystem with many of the top packages in the industry. Recently IBM announced several key open source products running on their z system: node.js, bringing a server-side scripting language to the platform; Docker, as a method of modernising a Linux environment - making it easier to test and deploy apps leveraging container technology concepts; and PostgreSQL, as an Oracle-compatible SQL database. This long line of new Linux additions to the platform began with Hadoop on z last April, Big Insights in October, followed by DB2 BLU. We get the feeling this is just the beginning for these additions, and are expecting many announcements to start popping up shortly. According to our inside source, IBM’s internal proof of concept and early benchmark on YCSB (the Yahoo! Cloud Serving Benchmark) is showing a move from a 1.3x to a more than 2x performance advantage from running MongoDB v3 on z13 over the latest distributed platforms. IBM ported the open source version of MongoDB v3.0 to Linux on IBM z as the latest NoSQL database, allowing Big Data innovation on the z13 platform. MongoDB is a document-oriented database that supports server-side JavaScript and a JSON-like query language. In today's world Big Data is becoming more prevalent as we start to leverage insights from structured and unstructured data together to deliver better business outcomes. MongoDB represents a powerful aggregation point where insights from traditional system-of-record data, i.e. retail transaction history, can be joined with system-of-engagement data, i.e.
micro-beacon geodesic information, to provide a rich and comprehensive user experience for the shopper. On top of this, MongoDB’s database aggregation now enables businesses to bring these two worlds together for their own benefit, and gain access to insights from all of their data to facilitate critical business decisions. With the recent announcements of Docker, NodeJS and PostgreSQL available on IBM z, we can only imagine the official announcement for MongoDB is coming soon. Personally, I sense we will be hearing lots more around Linux on the mainframe in the near future. Exciting times ahead! ### CensorNet & Spiceworks Webinar 23 June 2015 2PM GMT Webinar Title – Protect Your Network From Your Users’ Secret Cloud Apps Webinar Description Do you know your end-users are possibly (probably?) using cloud and mobile applications without your awareness? Protect yourself! Discover in real time what applications are being used in your network, including what information, messages, or files are being passed into the public space. Learn about: shining a light on shadow applications in your network with Cloud Application Discovery; using a Secure Web Gateway to help prevent security breaches and malware; and safely enabling BYOD and cloud in your network to protect your data and maintain bandwidth. CLICK HERE TO REGISTER ### The Future in Retail and Banking After many, many sleepless nights and discussions with opinionated business owners, I would like to set out what I see as the not-too-distant future of cloud technology. Now these are only my opinions, but I have been right before (he says smugly) and your comments are welcome. With a focus on the retail and banking sectors, I hope what I have to say can disrupt the way you're thinking about cloud tech and the future. Retail – the future of shopping We have seen stores come online rapidly, and obviously some have excelled more than others in digitalising their outlook; take a look at what Burberry are doing for a great example of this. Some interesting statistics show that there are conflicting comments on the way we shop. Internetretailer.com suggests: 56% of shoppers still think advertising is important to their purchase decision in-store, but only 12% of shoppers feel the in-store sales associate is an important touch-point in a purchase decision. This is an interesting conflict and I think the decision is really down to personal taste, coupled with the “can I be bothered to go to the shops” mentality. Advertising is still important for clothing purchases, but the desire for a sales associate experience is very low. Clothes are personal and we all like to try before we buy, but how many of us have recently purchased clothes online, negating the sales associate altogether? The only thing stopping you is the pain of sending the items back when they don’t fit, which many retailers are now combatting by offering returns collection services. So, I think small steps will be taken to reduce the amount of staff needed within retail organisations (similar to TFL ticket offices) and the retail chains will become glorified showrooms that hold a skeleton staff for essentials. Payments can still be made in-store by wireless transactions at the point of sale. Wait, I hear you say, how do I know it fits? Well, your smart device will have recorded your measurements and size, feeding them to a big, constantly updating database in the sky. Imagine an add-on for your smart device that 3D-scans your body and records your measurements.
This is then recorded to your favourite stores online presence and each store will know your size. Not only this but also your purchase history and preference so that fantastic offers can be presented. Sound far-fetched? It’s really not. There are already 3D scanning attachments available for tablet devices, though at present they aren’t quite sophisticated enough to scan a whole body. I personally would still go to the stores, as I like to get out of the house. plus, I don't want to end up an obese man that can’t tie his shoelaces. but the need to visit stores at all will diminish eventually. Banking – What else can I have? The banking sector has always had the most sophisticated technology for mainstream operations and let’s face it they need to. In this competitive landscape where margins for banking products being squeezed and regulation stifling creativity, this industry will change dramatically. Let me make one statement insomuch that it may explain my way of thinking. You trust your bank with your money right? (Even if you don’t do you have a choice?). So who do you trust with your data as this is the new economy for business? Correct – logically you would trust your bank! Makes sense really and a new breed of service provider for your data is born for the cloud. Already we have seen a few high street banks offering data storage, why not offer the whole package, applications to boot? In fact it would be easier for banks to scale their tech for mass consumption than your average IT provider and they already have the client base to convert. So what does this mean for you if you are a Cloud provider today? Possibly that you can expect to be purchased over the next 10 years. This is just a taster of the modernisation of technology within two market sectors. However generically I feel the improvement of technology is at a pace that is very hard to predict. Gone are the days that we could easily relate to upgrades and replacements for our ever creaking tech. From a business standpoint there is no need to purchase any core computing services unless you fall into a service provider model, it just doesn’t make any economic sense. Your home PC will be supplied by your internet provider (or cable/satellite TV) and maintained by them – this too makes sense as whoever provides you the connectivity, to this increasingly over populated web called the internet, holds the keys. Connectivity is the one thing that you cannot be online without and it’s the artery that feeds net. One final thought. We may see a revolution on our hands with a very classist scenario taking place between the pro automation companies and the out of work employees who simply say no. There is an economic change coming and it may be the disruptor that nobody is expecting. ### Bloomberg Technology Conference: Smarter Cities 18 June 2015 bloomberg hq, Finsbury square, london Headline Event of London Technology Week As part of London Tech Week, the Bloomberg Technology Conference: Smarter Cities, will look at how technology continues to transform and improve the way we conduct business in our cities. Taking place on 18th June at Bloomberg HQ, this event will bring together the heads of Europe’s smartest and greenest cities to share their insights, triumphs, challenges and plans for future innovations.  
With a unique focus on the intersection of health, sustainability and transport, we will discuss the importance of interconnectivity in ensuring that business, tourism and culture continue to thrive in our cities and keep them competing on a global scale. Confirmed speakers include: Julie Alexander, Director, Urban Development, Siemens’ Global Centre of Competence for Cities; Richard Brys, CEO, Utilidex; William Eccleshare, Chairman & CEO, Clear Channel International; Jessica Federer, Chief Digital Officer, Bayer; Matthew Hudson, Head of Business Development, Transport for London; and Phil Smith, CEO, Cisco UK & Ireland. For questions about the event or program, you can contact the team at blive@bloomberg.net or call 020 7673 2302. To see the full agenda please visit the Bloomberg Live website. ### INTEROP LONDON 16 – 18 June 2015 ExCeL, London – the flagship event of London Technology Week. Business agenda: IT thinkers talk mobility, apps and security Senior IT teams will be out in force for Interop (16 – 18 June 2015), the flagship event of London Technology Week. Ahead of the show, we spoke to some of the industry speakers and senior IT decision makers about how they are adapting to an increasingly digitised environment, and about other current technology priorities. A recent poll of IT decision-makers by Interop London confirmed that mobility is a top priority for UK businesses, with 88% confirming that they are currently exploring solutions to facilitate a more flexible, mobile working environment. Hiten Vadukul, Enterprise Architect at the global fitness chain Virgin Active and a member of the Interop Advisory board, is currently going through the process. He believes that mobility projects should have a clear purpose and concentrate on facilitating collaboration and seamless working, confirming that “as part of our workplace strategy, we are looking to standardise the user experience across mobile, tablet and desktop devices”. Other current IT priorities highlighted in the UK poll included moving from on-premise to cloud-based solutions, relocating data to a cloud-hosted server and improving security. Meanwhile, approximately two-thirds (63%) said they were on the hunt for new technology solutions to help solve business or industry-specific problems. Mobile digital workplace With the current focus on facilitating employee and business-owned mobile devices across business networks, analysts predict a growing shift in future towards the use and development of corporate apps. According to Gartner, this will result in one in four enterprises having an enterprise app store for managing corporate-sanctioned apps on PCs and mobile devices by 2017. Theresa Regli, Principal Analyst and Managing Partner at the Real Story Group, will be giving a presentation on mobile platforms for business at Interop. [quote_box_center]She said: “Facilitating a mobile digital workplace for employees continues to be a big priority for CIOs and HR teams and in the last couple of years, a huge amount of progress has been made in this area. “However, while the technology now exists to build interactive business apps very quickly and tailor them to different types of devices, as yet, many firms don’t have a strategy for it. “In future, the focus in terms of mobile digital workplace will continue to be the use of web-based technologies to facilitate document access.
One area where we are likely to see the digital workplace being ‘appified’ is in sales, which will continue to make it easier for sales teams to track leads and communicate leads. We also anticipate the growing use of ‘phablets’ (phone and tablet hybrid devices) to carry out workflow-oriented tasks such as sharing, editing and approving documents.”   [/quote_box_center]  Securing the enterprise When questioned about the security threats facing their organisations, almost two-thirds (62%) of senior IT professionals polled by Interop said that unintended user errors such as accidental malware uploads and data loss presented their biggest security challenge – far more than 13% who were most concerned about external attacks from hackers, cloud storage and third-party data hosting, or unsupported personal devices.  “People are producing more data from more devices than ever before,” said Virgin Active’s Vadukul. “Keeping this secure is not optional – unless you’re prepared to risk brand reputation. This is why businesses continuously need to adapt and improve when it comes to securing sensitive data.”  David Stanley, Head of IS Operations at thetrainline.com and a member of the Interop London advisory board also advocates taking a proactive approach to security threats, saying that for his organisation “security and compliance isn’t about spotting and fixing a problem – it’s about doing everything we can to ensure we don’t have one in the first place.”  David Self, Head of PMO at the multinational media company UBM made the point that, as data-driven enterprise, losing data could mean losing competitive advantage. For this reason, his team “always ensures security is a major part of any new process or developments from the outset and always equips company devices with high-level security applications”. Mindful of the risk posed by unintended user action, he believes that “the only way to counteract the security challenge is to educate the business about the risks”.  For more information, see: londontechnologyweek.co.uk/2015/01/interop-london-2/   ### Stratoscale Achieves OpenStack Powered Validation Stratoscale, the software platform for hyper-converged infrastructure, confirmed its support of OpenStack initiatives at the OpenStack Summit in Vancouver last week. Stratoscale is now a validated OpenStack Powered solution provider. Stratoscale commits today to support the OpenStack Federated Identity initiative, to be generally available by end of 2015. [quote_box_center]“We applaud OpenStack’s efforts, and feel the Powered validation and Federated Identity initiative are two major steps in moving from being a "de facto" standard to solidifying an actual industry standard the ecosystem can rally behind for a truly open cloud,” said Stratoscale founder and CEO Ariel Maislos.[/quote_box_center] Stratoscale announces its designation as an OpenStack Powered solution provider, meaning it has been validated by OpenStack. As an OpenStack Powered Compute partner, Stratoscale’s software has passed the initial round of interoperability tests coming out of Defcore, complying with OpenStack’s interoperability guidelines. OpenStack Powered partners comply with standards that confirm them as part of a truly open cloud ecosystem. Interoperability testing results are available in OpenStack Marketplace where users can now plug into a global network of OpenStack Powered cloud service providers including Stratoscale and 13 other companies. 
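For a sense of what that interoperability promise means in practice, here is a minimal sketch using the keystoneauth1 Python library to authenticate against an OpenStack Keystone v3 endpoint; the URL and credentials are hypothetical placeholders, and the intent is that the same code works against any OpenStack Powered cloud that follows the interoperability guidelines.

```python
# Minimal sketch: authenticate against an OpenStack Powered cloud's
# Keystone v3 endpoint with keystoneauth1. The endpoint and credentials
# below are hypothetical placeholders.
from keystoneauth1.identity import v3
from keystoneauth1 import session

auth = v3.Password(
    auth_url="https://keystone.example.com:5000/v3",
    username="demo",
    password="secret",
    project_name="demo",
    user_domain_name="Default",
    project_domain_name="Default",
)
sess = session.Session(auth=auth)
print(sess.get_token())  # scoped token usable against the cloud's other services
```

Federated Identity, covered next, extends this model so that a single set of credentials can be carried across several such clouds rather than being configured separately for each one.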
Stratoscale also publicly announced its commitment to deliver products that support OpenStack Identity Federation by end of 2015. A new feature in OpenStack’s Kilo release of Keystone, Federated Identity is an "enabler" for hybrid and multi-clouds, unlocking the potential of the larger cloud ecosystem. The truest power of this federated capability is realized through broad industry support such as that from Stratoscale who, along with 25 others announced at Summit, will make these distributions, appliances and public clouds generally available by end of 2015. “These two designations of several in our partner network, including Stratoscale, support our commitment to interoperability,” said Jonathan Bryce, executive director, OpenStack Foundation.  “We believe this interoperability will allow the industry to focus on building and consuming cloud infrastructure, minimizing vendor lock-in.  This will simultaneously allow vendors to innovate in areas that don't void compatibility or sacrifice openness.” About Stratoscale Based in Herzeliya, Israel, Stratoscale is redefining the data center, developing a hardware-agnostic, software platform converging compute, storage and networking across the rack or data center. The self-optimizing platform automatically distributes all physical and virtual assets and workloads in real time, delivering “rack-scale economics” to data centers of all sizes with unparalleled efficiency and operational simplicity.  Stratoscale is backed by leading investors including: Battery Ventures, Bessemer Venture Partners, Cisco, Intel and SanDisk.  For more information, visit http://www.stratoscale.com/ ### Value of Smart Data and Analytics Unlocked at the 2nd Annual Smart Data Summit Dubai The potential of smart data and analytics to transform businesses and drive revenue growth was brought into the spotlight at the 2nd Annual Smart Data Summit that concluded in Dubai this week. Produced by global conference organisers, Expotrade, the summit was attended by over 300 participants, over two days of keynote presentations, panel discussions and case studies. The event took place on 25-26 May at Sofitel Dubai The Palm Resort and Spa. The summit commenced with a welcome address delivered by IBM’s Sunil Mahajan, who went on to make a presentation titled ‘Not Business Analytics as Usual.’ He explained that insights having a profound effect on businesses can be uncovered using big data harnessed from a number of sources and analysed in different ways. He proceeded to emphasise how faster, easier and smarter analytics services were the need of the hour, and highlighted how IBM has helped organisations realise the value of big data and analytics.  Braintree, a division of PayPal’s Daniel Nelson explored the topic of data science and the mobile revolution in his presentation, which examined how commerce is changing due to the disruptive force of ‘mobile’, early data science successes and presented a snapshot of the future. On the other hand, Twitter’s Anupam Dikhit focused on using smart data to drive real-time personalised marketing at scale. Stressing on the importance of scale, he mentioned how approximately 1 billion tweets generated every two days translated into large amounts of data signals which, in turn, help build recent user profiles. He explained how marketers can use this to tailor targeted campaigns and reach people at the right time, thereby bridging the gap between engagement and conversion.    
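To make that scale claim concrete, a quick back-of-the-envelope conversion of the figure quoted above (plain Python, illustrative only):

```python
# Rough rate implied by "approximately 1 billion tweets every two days".
tweets = 1_000_000_000
seconds = 2 * 24 * 60 * 60  # two days in seconds
print(f"~{tweets / seconds:,.0f} tweets per second to ingest and profile")
# ~5,787 per second - the kind of stream real-time targeting has to keep up with
```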
Day 1 Included speakers from Emirates Integrated Telecommunications Company (du), Nakheel, Data Aurora, along with premium sponsors Thomson Reuters, HP Big Data, Denodo, Qlik, Wipro, presenting on topics such as the impact of big data on the telecom industry; data governance and security; next generation analytical platforms; data virtualisation; discovering the power of big data; redesigning business digitally; using technology to improve customer experience; big data ephemeralisation. The panel discussed in detail how big data analytics could reinvent social media marketing. Day 2 Highlights included speakers from Al Ahli Holding Group, UAE University, UAE Exchange, Al Safeer Group of Companies, speaking on optimising loyalty programs; integration of social media with CRM/ERP systems; big data and quantum computing; building of the brand using smart data. The day’s discussion featured the panel shedding light on the metrics to measure ROI on smart data investments in marketing. In only its second year, the Smart Data Summit has been established as the foremost event of its kind in the Middle East. This year’s partnerships included industry leaders such as IBM, Qlik, HP Big Data, Wipro, Denodo, Thomson Reuters and Badr IT. Exhibitors included MDS ap Tech, EnterpriseDB, Qarar, iDashboards, Master Works, Cloudera, Inseyab, GDS, Nutanix, Radix Mena, Bisnode, Enodos, Data’oro, Minodes, arroWebs, Xceed Ventures. The event was a success, evidenced by the appreciative comments it attracted at the close. [quote_box_center] Anupam Dikhit, Industry Manager, Twitter, said, “In terms of the way the event has been designed, the kind of speakers that presented, the entire structure and format – I think it’s fantastic. Because you have taken an issue or space that is up coming and a region that is up and coming – we have seen a lot of traction happen in the Middle East over the last few years, so it’s really a sunrise sector here  – and you have got the best of thought-leaders from across the world. That’s not a mean feat.” [/quote_box_center] [quote_box_center] Dr. Dirk Jungnickel, Senior Vice President – Business Analytics, Emirates Integrated Telecommunications Company (du), said, “It has been a very pleasant and interesting event. I had some very good conversations during the breaks and also the presentations so far have been first-class. It’s a good mix of vendors and consumers of big data technologies or people who are applying the technologies.” [/quote_box_center] [quote_box_center] Kamran Bader, Managing Director, Data’oro, said, “The event provided an excellent opportunity to network and learn from industry experts and big data practitioners. A very well organised and structured summit.” [/quote_box_center] [quote_box_center] Hisham Assaf, Regional Director – Middle East, iDashboards, said, “The significant attendance of decision makers and senior executives made the event a very effective platform for networking and discussing the most important challenges associated with the delivery of data and insights to support business strategy.” [/quote_box_center] [quote_box_center] Martina Boettcher, Regional Director of Marketing, Starwood Hotels & Resorts, said, “The summit provided a rich and diverse overview and it was particularly interesting to hear from various industries. 
There was a ‘take-away’ from each speaker, regardless of your own level of involvement in Big Data.” [/quote_box_center] [quote_box_center] Muhammad Imran, Marketing Manager, ACA Steel Constructions Contracting, said, “The summit was very informative for new upcoming professionals in the industry. It was favourable for networking, well-organised, and a source of inspiration for organisations already having in-house IT departments.” [/quote_box_center] [quote_box_center] Brad Hariharan, Regional Director, Expotrade Middle East, organizers of the conference, said, “The Smart Data Summit successfully highlighted the tremendous importance of smart data and analytics in today’s connected world. The last two days have demonstrated the impact that data analytics can have across sectors and industries, establishing this event as the premier event of its kind in the region. We thank our sponsors, delegates, speakers and partners and look forward to bringing next year’s edition to this audience.” [/quote_box_center] For event details, visit www.smartdatadubai.com ### Cloud FS Americas (Cloud Banking) 22-22 JULY 2015 METROPOLITAN WEST, NYC Introducing Americas first gathering dedicated to transformational financial services business models underpinned by cloud architecture. Demands on financial institutions across America for customer-centric IT architecture and a migration from the CAPEX cost models of yester-year to the adoption of pay-as-you-use cloud billing models have made secure cloud infrastructure crucial to driving front to back office business transformations. Cloud models can enable established banks, financial services providers, and hedge funds to expedite the launch of new products and services. For new players, cloud platforms and services represent a low-cost, scalable and agile route to market. Cloud FS Americas will gather banking IT leaders to establish frameworks for cloud viability, migration and implementation whilst tackling concerns on cloud security, data sovereignty and outsourcing policies. Utilize Cloud FS Americas as a platform to address complexity-free models of IT consumption and issue RFPs to best-of-breed cloud enablement partners with the capabilities to accommodate your budget, scale and migration timeline. Cloud FS Americas will feature holistic insights, expert-led overviews and educational case studies from cloud’s earliest financial adopters and best-of-breed multi-cloud innovations from the world’s leading cloud technologists. Join hundreds of your peers as they assess cloud viability and engineer paths to advanced cloud implementation and integration. WWW.CLOUD-FSAMERICAS.COM We look forward to welcoming you to New York. ### Cloud World Forum London 2015 24-25 June 2015, Olympia Grand Cloud World Forum continues to go from strength to strength, and we're looking forward to 2015 being the best year yet! Co-located with Enterprise Apps World, part of the renowned Apps World Series, we are EMEA’s largest Cloud & DevOps expo! Join CEOs, CIOs, CDOs, CMOs, HR, and Finance Directors along with their IT Operations and Development teams to bring the digital enterprise to life through cloud technology. 
- 16 content theatres
- 300+ enterprise speakers
- 4 show-floor content theatres & closed-room discussions
- Leading ecosystem partners and past supporters including Google, Salesforce, AWS, IBM, Microsoft, EMC, Dell, HP & more
- 300+ leading solution providers demonstrating their new solutions
- A dedicated start-up/SME exhibition & content zone
- A critical Cyber Security and Cloud & Data Governance Zone
- VIP networking club lounge: The Village
- A dedicated exhibition and content zone for the start-up and SME community: The Cloud Tech Hub
- An agenda shaped by 100+ research calls with C-level IT purchasers and technology pioneers

Powered by the leading research and analyst house, Ovum, and partnered with Business Cloud News, we only showcase what the IT and tech industry wants to see! Register for your FREE pass now! ### IBM z Systems - Edge 2015 Las Vegas Compare the Cloud attended the IBM Edge conference, 11-15 May. IBM held a week-long conference in the grand surroundings of the Palazzo Hotel, Las Vegas, demonstrating and teaching new ways to think about technical roadmaps. Highlighted, as always, were some major breakthroughs and technical innovations across IBM's current and planned products and services. An audience of 6,000+ CTOs, CIOs and executive-level representatives of companies from all over the planet attended the week. IBM z Systems highlights at Edge2015: The Executive Edge (a two-day event running alongside the main conference) was the highlight of the week. Not that I'm biased in the slightest, but young Andrew McLean and I kicked off the Cloud keynote on day 2 with fresh thinking from an independent perspective! Our session was entitled "A new way to think". In the session, we brought together a combination of IBM products and services that would best serve the retail sector: IBM z Systems mainframe for the core infrastructure, Bluemix for a consistent omni-channel presence with DevOps as a consideration, Spectrum Accelerate for the growth of data using software-defined storage, and Watson for the analytics and brains behind the processing. [embed]https://www.youtube.com/watch?v=OOJUR6NWhE8[/embed] Ron Peri – CEO, Radixx A great presentation from Ron, who absolutely surprised me with his U-turn from x86 servers to a complete adoption of mainframe infrastructure. Why does this surprise me? Well, if you get the opportunity to chat to Ron and learn his background, you would realise that for decades he was responsible for keeping x86 infrastructure in place, running and cost-effective, and was completely against any other technology. He did not want to know, or even be told, that a mainframe would be a viable option, thinking z Systems were a closed, old architecture. Once he eventually had a demonstration from Vicom Infinity, a New York-based expert mainframe consultancy, he purchased one within 30 days, migrating from 172 x86 cores to a staggering 10 cores on a z Systems mainframe. Simply incredible, and what a pleasure it was to interview him during the week! So back to the highlights of the event. Some of the bullet-point announcements were:

- A mainframe can analyse over 30,000 transactions a second, encrypted and secure, to stop credit card fraud!
- The ability to layer data from IoT devices, such as the Apple Watch, over static data (i.e. clinical data) for a more accurate diagnosis – and at present z Systems is the only player in the marketplace with the processing power and scalability to achieve this securely!
- Using the new z13 mainframe system, it's possible to lower the total cost of ownership by 32% compared with an x86 private cloud!
- The Generation z Program and continued support – assisting the millennial generations (estimated to be half the global workforce by 2020) with mastering the Mainframe program.
- x86 systems max out on processing at around 70-80%, whereas with a mainframe you can effectively use virtually 100% of the resources.

So, in summary for IBM z Systems and the mainframe: it's not old tech and should never be seen as such. The technology behind the IBM z13 is incredible and scalable. It's also affordable for firms that are serious about security and resiliency, with true cloud benefits designed for big data analytics and mobile services. The week continued with additional announcements across other IBM systems and product lines, all tied into a holistic hybrid cloud standpoint: from Rocket Data Access services on Bluemix for z Systems, which enable the development of mobile applications seamlessly integrated with a mainframe, to the IBM Power System E850, a four-socket system with up to 70% guaranteed utilisation. IBM Spectrum Control Storage Insights enables, amongst other benefits, on-premise software-defined storage that allows external developers to assist with coding and software development on your infrastructure, without the need for additional infrastructure. This, coupled with the real-time compression of IBM's XIV Gen 3, reclaiming existing free storage and scaling for hybrid cloud, proved once again that IBM is thinking of the future and has one of the most comprehensive storage management portfolios in the market. Cattermull's Take: Following on from this brief summary, I would like to add my personal thoughts on the week. IBM is seriously underrated as a cloud player compared to some others I could mention. Their tech breakthroughs come thick and fast every month and mostly go unnoticed. IBM needs to shout louder and be prouder of its achievements, as some of the breakthroughs are genuinely ground-breaking. Just one announcement is quite literally life-changing: Watson now has the ability to analyse unstructured medical data at an individual level and plan treatment that is individually tailored to you. This is quite literally life-saving, and it has never been done before. As much as I love my doctor and have known him for many years, this breakthrough will change my health for the better in ways he never could, and even predict potential illnesses that I might develop. Keep it up IBM – we can't wait until next year for Edge 2016 and the product news and technology breakthroughs you will announce. ### UK Hosted Desktop Provider Launches in Singapore Today, localisation of cloud services is key to fulfilling regulatory and access requirements. A key part of the cloud stack has always been hosted, or cloud-hosted, desktops. These services enable remote working and the centralisation of services, allowing for everything from group collaboration to flexible working. As part of a global expansion plan, UK provider VESK has announced the launch of services in the Asia-Pacific region, focussed initially on Singapore. In collaboration with the Equinix SG1 data centre and inter-data centre connectivity provided by GTT Networks, VESK has put in place a highly scalable, robust service.
[quote_center]“We have invested a quarter of a million pounds to grow our existing infrastructure, filling two full cabinets in the SG1 data centre in the centre of Singapore’s financial district,” said James Mackie, Managing Director, VESK.[/quote_center] “Launching our presence in Singapore has been an exciting expansion for VESK and we look forward to doing more business in the Asia-Pac region in the future,” Mackie added. The data centre location within Singapore’s key financial district enables VESK to access a number of important research and development and financial markets. Underpinned by the latest Citrix Xen hypervisor, along with an impressive flash-based storage pool and 10Gbps interconnectivity, this is a strong architecture on which to deliver desktop services. Anchor tenants for the initial Singapore VESK service include prominent companies in the legal and recruitment sectors. Singapore ranks at number 2 on the Computing Market Attractiveness Index prepared by the Asia Cloud Computing Association. Singapore has a market of 407,298 SMEs – 99% of all enterprises – and these SMEs contribute 50% of Singapore’s GDP. The sectors most likely to lead as early adopters of cloud computing solutions in Singapore are legal, recruitment, commerce (wholesale and retail), and accommodation. In accessing this highly addressable and changing marketplace, VESK may prove a pioneer, with many other UK companies potentially following its lead. About VESK: VESK is now the fastest-growing provider of hosted virtual desktops in the UK. VESK enables you to access all your emails, files and business applications anywhere in the world using any device. VESK is privately held and owns its headquarters and its data centre infrastructure from the ground up. VESK has been profitable since January 2011. VESK has clients of all sizes and in most countries. VESK has been security assured to supply directly to the UK’s public sector, thanks to Tier 4 and ISO27001 data centres. VESK is different. Its core values are built on integrity and trust, and its strategy is simply to grow something that everyone loves by building the largest, most reliable and secure virtual desktop infrastructure (VDI) in the world. ### Storms, Floods and Blackouts How IT managers and disaster recovery planners can prepare for unpredictable weather. The weather in the UK is becoming more unpredictable, and IT managers and disaster recovery (DR) planners need to stay on their toes to avoid becoming the next unwitting victims of downtime, outages and worse. Over the past winter, a barrage of flash floods overtook more than 5,000 homes and businesses on the south-western coast.1 But summer offers no respite; last year, we witnessed flash floods in the south-east, with lightning damaging buildings, rising waters invading homes and businesses, and waterlogged roads leaving people stranded in vehicles.2 In December, a “weather bomb” – an intense storm with low central pressure triggered by jet stream changes3 – pummelled northern residents with rain, hail, lightning and blustery winds.
Dangerous roads and incapacitated travel systems made travel difficult, if not impossible, while widespread power outages left tens of thousands of people without power.4 In fact, power outages in the UK last year more than doubled since 2013, according to Eaton’s Blackout Tracker report.5 To make the issue worse, the National Grid’s spare electricity capacity is dangerously low, with over a 12 percent shrinkage rate over the course of three years.6 While the National Grid is taking precautions to prevent widespread outages, the Ofgem Electricity Capacity Assessment Report 2014 estimates that margins will reach their lowest level in winter 2015-2016 as older power stations are shut down.7 Throughout the course of these events, many IT managers and DR planners were forced to demonstrate their ability to work well under pressure when responding to outages caused by severe weather or problems with critical power supply. PREPARING FOR FUTURE OUTAGES This year, the unpredictable weather shows no signs of letting up. In January, a 250 mph jet stream and two Atlantic weather bombs threatened winds of up to 90 mph.8 And while the Met Office reported that April was the fifth sunniest month across the UK, weather conditions varied drastically between regions. For instance, Faversham, Kent, experienced the highest recorded temperature of the year, while Katesbridge, Northern Ireland, experienced severe frosts accompanied by a temperature of -8C.9 Experts ‒ including researchers from the Met Office and Newcastle University10 and Edward Hanna, professor of climate change at the University of Sheffield11 ‒ suggest climate change could lead to an increase in severe weather phenomena such as flash flooding and weather bombs – which can in turn lead to blackouts (though it’s not certain that climate change is to blame for the variable weather). Bearing this possibility in mind, your business should implement measures to cope with events that could damage your IT infrastructure and in turn, your ability to continue business operations. Below are a few steps your business might need to implement to weather the storms ahead. Reduce the risk of data loss Often even the most careful precautions can’t protect against system damage during storms. To reduce the risk of significant data loss, vault data off-site in a location that is as close as possible to your facility without there being a common risk between geographies. In the event that your primary facility is inaccessible, disaster recovery as a service (DRaaS) solutions can give you access to critical systems and data remotely in a virtual environment within your recovery time objectives. Because these solutions are billed based on actual cloud storage used, they are often more cost-effective than a dedicated data centre or colocation solution. Keep systems powered up Your business also needs to have a plan for power outages and ensure that any critical supply chain vendors do as well. To minimise the impact of a blackout on your business, classify your important systems and processes and determine what resources (e.g., uninterruptible power supply and generators) are necessary to sustain them during an outage.   Outsource equipment and network repairs After a severe weather event, your IT staff might be stretched thin seeing to hardware damaged by water or electrical surges and working to restore a damaged network. 
Enlisting the help of an IT managed services provider prior to an emergency provides the assurance that during an outage systems can quickly return to business as usual. To reduce the burden on staff, managed services providers help troubleshoot system errors and repair or replace hardware. If there is damage to the network, the provider assists with resolving network issues and performing any necessary re-cabling. Implement a bring-your-own-device strategy If employees aren’t able to access their primary work environment due to conditions such as closed roads, unsafe driving conditions or a power outage at the home office, they’re more likely to turn to their personal devices for work. Although most UK businesses (95 percent, according to a BT study) permit bring-your-own-device (BYOD) practices, security is sorely lacking. BT’s research shows 41 percent of organisations have suffered a mobile security breach over the last year, 33 percent grant users unbridled access to the internal network, and 15 percent lack confidence that they have the resources to prevent a breach.12 It’s important to establish a BYOD policy that addresses issues such as data security, remote management, data transfer, backups, data wipe and technical support (office or field based). If you work with a managed services provider for your IT support, check to see if the vendor can assist with developing and supporting your BYOD program. While you might not be able to predict exactly how weather- and utilities-related incidents will affect your business, it’s important that you think ahead and prepare. Using the right resources and enlisting the assistance of strategic vendors can alleviate some of the stress during an interruption and help you improve your ability to get back to business quickly. You might work well under pressure, but why place an unnecessary burden on yourself? ### Welcome To Your New Look Journey In The Cloud Introducing the new look Compare the Cloud.  Behold the new look CompareTheCloud.net - a fully responsive hub into cloud, big data, analytics, Internet of Things and cutting edge technology. We've expanded the site to fully embrace our articles, opinions, news, podcasts, videos, events and education fully. As ever we have great articles from some of the best experts in the industry. [caption id="attachment_18389" align="alignleft" width="1200"] Internet of Things will continue to grow in 2015.[/caption] We will continue to add new features in the forthcoming weeks. Should you have any feedback or are looking to submit great content to us - email us at info@comparethecloud.net ### We're presenting at Scot-Cloud 2015! ### Special #2: Cloud Expo – Part 1 [embed]http://traffic.libsyn.com/comparethecloud/The_Issues_Special2_Part1.mp3[/embed] In the first of a two part special we recorded live at the Cloud Expo Europe 2015, London. Join us and our guests as we discuss cloud, technology and silliness. 
Hosts: Andrew McLean & Neil Cattermull

Guests:

- Omer Wilson | Marketing Director | EMEA | Digital Realty
- Daniel Steeves | Beyond Solutions
- Andrew Bartlam | VP EMEA Alliances & Business Development | CipherCloud
- Miriam Brown | Business Development Manager | Kingston Technology
- Tom Spalding & Dorothy Constantino | Account Executives | Datapipe
- Anthony Senter | Strategic Alliance Director | EMEA | GTT
- Joe Baguley | Chief Technology Officer | EMEA | VMware
- Andy Roberts | Field Architect | SolidFire
- John Bruylant | Founder | Global Cloud Broker
- John McLean | Retired
- Jonathan Wisler | General Manager | EMEA | SoftLayer
- Ambrose McNevin | Editor in Chief | Computer Business Review

Part 2 coming soon. ### Choosing A Small-Business Hosted PBX Solution: Top Five Considerations Hosted PBX solutions are set to gain ground this year. A recent Sand Hill article notes that these systems have a “buyer power” of 4.1 out of 5, making them favourable investments for companies of any size. However, small businesses especially stand to benefit from the adoption of hosted PBX or VoIP PBX options, in large measure because these solutions provide the flexibility and quality of in-house PBX systems without the cost of maintaining or upgrading infrastructure. Of course, not all hosted providers are created equal; here are five key considerations before making any PBX investment: Look Inside Is your network ready to handle a hosted PBX solution? Before signing any service level agreements (SLAs) or setting up new telecommunications infrastructure, take a hard look at your network configuration and performance. Since hosted providers use off-site servers and resources to deliver VoIP services, it's often tempting to ascribe any technical difficulties to supplier failure. But if your “last mile” isn't up to snuff, or your network is needlessly bogged down by legacy infrastructure, even the best PBX provider won't be able to offer quality service. Always look inside first. Think Beyond Bandwidth As noted by Simple Solutions Computing, it's also important to think beyond bandwidth. This covers both the bandwidth available from your Internet service provider (ISP) and the “pipeline” offered by a PBX provider. For small businesses hoping to compete on a global scale by offering not just voice but multimedia content and real-time conferencing, it's critical to ensure that both download and upload speeds are enough to handle the load of your new PBX system, and that your provider is similarly equipped (a rough capacity sketch follows at the end of this article). Too slow or too small and you end up with sound “jitter”, delays and even dropped calls. SIP or PSTN While there's nothing wrong with using a hosted VoIP solution which connects to a public switched telephone network (PSTN) for your hosted PBX system, many small businesses now opt for session initiation protocol (SIP) trunking instead, which allows them to easily communicate with mobile and fixed telephone subscribers worldwide. Moving away from traditional telephone lines enables a company to operate entirely online, without the need to convert data across multiple formats in a single call. The migration process from PSTN to SIP can be time-consuming, however, so Tech Target recommends finding a stable provider that also owns last-mile technology and offers redundant trunks in the event of failure. Go Feature Rich For most small businesses, high-quality calls aren't enough on their own.
In effect, quality of service (QoS) should be the basic expectation of any VoIP provider; it's also important to search for solutions that offer things like built-in video support, real-time conferencing and the ability to add new numbers or extensions on the fly. In addition, you should be able to scale up on demand and reduce service as required. The Price Isn't Always Right Finally, small businesses need to look past the price tag. This is difficult, especially for companies with only a few IT staffers or even part-time support, but opting for the lowest-priced option may cost more in the long term. Instead, take a hard look at what each price point delivers: often, low-cost providers take limited responsibility for service delivery, leaving you on the hook if any issues arise. Want to find the best hosted PBX for your small business? First, look at your own network performance and bandwidth needs. Next, consider the value of SIP, going feature-rich and opting to pay a little more for better service over the long term.
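As promised above, here is a rough way to sanity-check the bandwidth side of that decision. It is a minimal sketch only: the per-call figures are illustrative ballpark values (roughly 87 kbps per G.711 call and 32 kbps per G.729 call, including packet overheads), not numbers from any particular provider, so treat it as a starting point rather than a sizing tool.

```python
# Rough VoIP capacity sanity check (illustrative assumptions only).
# Per-call bandwidth figures are commonly quoted ballpark values that
# include IP/UDP/RTP overheads; check your own codec and provider.

CODEC_KBPS = {
    "G.711": 87.0,   # uncompressed voice: best quality, most bandwidth
    "G.729": 32.0,   # compressed voice: lower bandwidth, slightly lower quality
}

def concurrent_calls_supported(uplink_kbps: float, codec: str,
                               headroom: float = 0.3) -> int:
    """Estimate how many simultaneous calls a link can carry.

    `headroom` reserves a fraction of the link for ordinary data traffic,
    since voice should never be allowed to saturate the connection.
    """
    usable_kbps = uplink_kbps * (1.0 - headroom)
    return int(usable_kbps // CODEC_KBPS[codec])

if __name__ == "__main__":
    uplink = 10_000  # e.g. a 10 Mbps uplink, keeping 30% back for other traffic
    for codec in CODEC_KBPS:
        print(f"{codec}: ~{concurrent_calls_supported(uplink, codec)} concurrent calls")
```

Remember that upload and download matter equally for two-way audio, and that jitter and latency, not just raw throughput, ultimately determine call quality.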
### May 2015 #CloudInfluence Special Report: Battle of the Big 5 In the run-up to Cloud World Forum 2015, 24-25 June at Olympia in London, we wanted to take a look at the Battle of the Big 5, not just in the area of cloud, but also in big data and in the Internet of Things (IoT) – both of which are part of a breakout stream at Cloud World Forum this year. The big five tech vendors currently dominating the market are Amazon, Apple, Google, IBM and Microsoft. In cloud they are being challenged by Dell, HP, Oracle, SAP and Salesforce. In big data they are being challenged by Cloudera, Hortonworks, Oracle, Pivotal and SAP. And in IoT they are being challenged by Cisco, Hitachi, Huawei, Intel and Samsung. In our monthly #CloudInfluence reports we have argued that it is the players with the most active ecosystem and the most professional marketing machine that will prevail. So in this month's #CloudInfluence special report we decided to reflect how successful each of the main players has been in gaining attention in each of the three key battlegrounds: cloud, big data and IoT. [quote_box_center] Battleground 1: Cloud – Attention Index: 60,464; Significant Opinions: 478,413; Significant Influencers: 5,249. Big five players: Amazon, Apple, Google, IBM and Microsoft. Main challengers: Dell, HP, Oracle, SAP and Salesforce. Other challengers: … come to Cloud World Forum 2015 to meet a few! [/quote_box_center] Cloud is a very hot topic, with a high level of activity every week over the last few months (the light blue bars on the chart), a gently increasing attention index (the purple line) and consistently positive sentiment (the orange line). As our monthly rankings have shown, Microsoft has maintained an impressive lead in #CloudInfluence over recent months. The recent disclosure by Amazon of its AWS financials briefly boosted the firm’s profile, moving it ahead of IBM and within reach of Microsoft, only for Microsoft to pull ahead again. Google and SAP then lead the rest of the pack. Cloud Influence Score [quote_box_center] Battleground 2: Big Data – Attention Index: 18,780; Significant Opinions: 359,981; Significant Influencers: 1,993. Big five players: Amazon, Apple, Google, IBM and Microsoft. Main challengers: Cloudera, Hortonworks, Oracle, Pivotal and SAP. Other challengers: … join the IoT & Big Data stream at Cloud World Forum 2015 to meet a few! [/quote_box_center] Big data is also a very hot topic, even if not quite as hot as cloud. Following a peak of activity in mid-February, activity has been a little subdued, with the overall attention index fluctuating but not rising to any real degree. Sentiment has remained positive, though not as positive as for cloud. Attention Index: Big Data v Cloud Computing For a long period until the start of April, Pivotal held the lead in big data, but following a period in which many of the players contested the overall leadership, Microsoft emerged as the leader, only to be pipped very recently by Hortonworks. Big Data Influence Score [quote_box_center] Battleground 3: IoT – Attention Index: 22,648; Significant Opinions: 324,071; Significant Influencers: 2,115. Big five players: Amazon, Apple, Google, IBM and Microsoft. Main challengers: Cisco, Hitachi, Huawei, Intel and Samsung. Other challengers: … join the IoT & Big Data stream at Cloud World Forum 2015 to meet a few! [/quote_box_center] IoT is another hot topic. It experienced a peak of activity in early March, but the overall attention index has remained fairly level, with sentiment remaining positive all along.
All of this has been at a far lower level of activity and attention than for cloud. Attention Index: IoT v Cloud Computing The lead in IoT has changed hands on a number of occasions recently: from Microsoft, to Intel, to Microsoft again, to IBM, back to Microsoft and then on to Samsung. IoT Influence Score Obviously all of these battlegrounds are interlinked, with IoT devices feeding information into the cloud for big data analysis. And of course competition between the big 5 will continue as they do all they can to grab attention and a leadership position in each of these areas. Tactics will include ecosystem support, the announcement of new products and services, alliances with other key players (as IBM has done with Twitter, Facebook, Tencent, Apple, etc.) and a host of other marketing initiatives. Some will also make strategic acquisitions, with Salesforce being in play at the moment. We will track all of this as it unfolds and share valuable insights with you via our monthly #CloudInfluence rankings and our special reports (such as this one). We will also be at Cloud World Forum, where some of the top players and a host of the challenger firms will be speaking and exhibiting, and where we will be doing video interviews with some of the top executives from our stand. We hope to see you all there. NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### AWS Chalk and OpenStack Cheese I've been at the OpenStack Summit in Vancouver this week, and what has struck me is the difference between the AWS ecosystem and the OpenStack community – they are at times like chalk and cheese. So what are the main differences? Amazon has all but captured the public cloud space. It doesn’t have this space all to itself (Microsoft and Google are growing rapidly to challenge it), but it doesn’t really have any interest in the private cloud space (unless you have a budget the size of the CIA’s). Conversely, many of the main players in OpenStack only really play in the private cloud space – HP, for example, let slip that it wasn’t really a player in public cloud, before rapidly withdrawing the comment and claiming that it was, when in reality it isn’t. So it is getting increasingly easy to picture a future in which a handful of players dominate public cloud with their own proprietary stacks, while OpenStack becomes the de facto standard for private cloud. Then there are the two different ecosystems. As we covered in our preview for this month’s cloud rankings, AWS’s ultimate appeal isn’t necessarily its cost (even with continually falling prices, it can still be quite expensive) or its ease of procurement as an elastic hosting provider. Its main appeal is now its massive ecosystem of services and the ability to tap into them fairly quickly. As fast as AWS innovates, its ecosystem adds value faster than AWS could ever do on its own. It’s all relatively quick and relatively easy to implement, and there are lots of services to choose from.
The OpenStack ecosystem is very different: as we are seeing at the Vancouver OpenStack Summit, OpenStack boasts an enthusiastic following among the developer community, and a very long line of powerful companies is supporting the project and contributing to the code and marketing of OpenStack. I have also been impressed by the progress that the OpenStack community has recently made to become more organised and address some of its most obvious weaknesses. This week it announced interoperability testing requirements for products to be branded as “OpenStack Powered”, including public clouds, hosted private clouds, distributions and appliances (initially supported by 14 community members). A new federated identity feature will become available in the OpenStack Kilo release later this year (initially supported by over 30 companies), and it has announced a community app catalog. Add to this the work it has done to certify major vendors via its DefCore programme and it is making all the right moves – but are they too little, too late? As I mentioned above, it has already lost the public cloud to Amazon and others, and realistically only has the private cloud in its scope – albeit this is one fairly large niche. The development ethos within each community is very different as well. In the Amazon marketplace you have partners developing standalone services that are designed to plug into one another. Typically they are closed source, well integrated via AWS APIs and normally as easy to spin up as any Amazon service. In the OpenStack community, much of the focus is on collaborating to develop new functions rather than standalone services. The venture capital firms that were so keen to pour money into OpenStack startups a few years back are only now starting to realise that much of the development has been contributed to the common good in the form of open source code, and that they are unlikely ever to make a return on it. OpenStack's Achilles' heel is its usability. Speakers wouldn’t be joking about the need for a PhD to implement OpenStack if there wasn't at least a modicum of truth to the matter. For large clients with the tech skills to implement and maintain it, this isn’t a problem. And the SMBs that have minimal tech skills look to MSPs to provide cloud to them as a service. It is in the middle ground between these two where OpenStack risks ceding ground to AWS and others – although innovative firms like Platform9 are seeking to address this. And finally, the differences between the business models: one major OpenStack client that I spoke to framed this very well. He said that his company has many apps to support, each of which is used by a limited number of users. Amazon is set up as a volume operation where you need a service that appeals to many clients, whereas OpenStack caters more for the bespoke needs of each client. He went to Amazon for any off-the-shelf apps, but to OpenStack for everything that needed tailoring, as Amazon would either be uneconomic or unsuitable for the number of users he had on each app. Given all these differences, you’d think that AWS and OpenStack were indeed like chalk and cheese.
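Before writing the two off as entirely different animals, though, it is worth sketching how the major building blocks line up. The mapping below is a rough set of commonly cited service-for-service pairings, assumed for illustration rather than an official equivalence table; the Red Hat comparison linked below goes into far more detail.

```python
# Commonly cited AWS-to-OpenStack service equivalents (rough, assumed pairings).
# Each pair covers a broadly similar role, not identical features or APIs.
AWS_TO_OPENSTACK = {
    "EC2 (compute instances)":          "Nova",
    "S3 (object storage)":              "Swift",
    "EBS (block storage)":              "Cinder",
    "VPC (networking)":                 "Neutron",
    "IAM (identity)":                   "Keystone",
    "AMIs (machine images)":            "Glance",
    "CloudFormation (orchestration)":   "Heat",
    "CloudWatch (metering/monitoring)": "Ceilometer",
}

for aws_service, openstack_project in AWS_TO_OPENSTACK.items():
    print(f"{aws_service:36s} ~ {openstack_project}")
```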
However, the underlying technology for each is relatively similar – for every component in AWS there is an equivalent in OpenStack – and this is all illustrated very well in a blog by Red Hat listing the equivalent components in each stack, here: http://redhatstackblog.redhat.com/2015/05/13/public-vs-private-amazon-compared-to-openstack/ Suddenly the differences don’t look so great. Maybe they’re both cheese after all. AWS is the pre-packaged cheese that you get in the supermarket – well packaged, uniform and relatively cheap. OpenStack is the cheese that you get at the deli counter – cut to size, often richer in flavour and arguably worth paying a little more for when you really want it and can afford to. ### Digital Realty Champions Green Initiatives Digital Realty, the leading global provider of data centre and colocation solutions, announced on May 7th updates to its sustainability initiatives to encourage greener practices in the data centre industry. To accompany the sustainability updates, Digital Realty also released a new white paper, "Powering the Green Data Center," to continue its commitment to providing diverse options for its clients to source clean energy. "As the leading global provider of data center solutions, we're committed to ongoing efforts that enhance the growth of our clients' businesses while also minimizing the environmental impact of our increasingly digital world," said A. William Stein, Digital Realty's Chief Executive Officer. "The world is increasingly leveraging the economic and social value of being connected digitally, and data centers are the home of the physical infrastructure that underpins the digital world. Our goal is to make it as easy as possible for our clients to select more sustainable data center solutions." The white paper, which the company published today, is a planning guide authored by Digital Realty Director of Sustainability Aaron Binkley. It identifies the considerations for moving to a clean energy model in the data centre, describes the growing importance of sustainability in the data centre, provides a definition of "clean" energy, and outlines options for sourcing clean energy. Update on Sustainability Initiatives Energy Efficiency - Digital Realty's expert staff is dedicated to improving energy efficiency in its data centres. This includes screening assets for low- and no-cost upgrades, as well as capital projects where highly energy-efficient products are given preference during replacement and upgrade. This also results in lower operating costs for its clients. Digital Realty's participation in the Better Buildings Challenge exemplifies its commitment. The Better Buildings Challenge is a Department of Energy-led initiative to improve energy efficiency in data centers by 20 percent by 2020. For more details on this program, visit Digital Realty's blog post, Joining the White House's Better Buildings Challenge. Digital Realty is also an active contributor to ASHRAE's efforts to advance data center energy efficiency and sustainability. Paul Finch, Digital Realty's Senior Technical Director, holds one of 16 voting positions with ASHRAE on the Technical Committee for Mission Critical Facilities, Data Centers, Technology Spaces and Electronic Equipment. This committee takes a global leadership role in developing new publications and standards in energy efficiency and sustainability for the data center industry.
Cleaner Energy – In January 2015, Digital Realty launched the Clean Start℠ program to support its clients' goals to power their data centers with clean, renewable energy. This was the first industry initiative of its kind offering clean energy options for new leases at no incremental cost. Other data center companies have followed suit, and we're glad they did. For calendar year 2014, Digital Realty's portfolio average fuel mix contained 22% renewables; the company is working with power suppliers and clients to meet its long-term goal of powering its data centers with 100% renewable energy. In addition, Digital Realty recently became a signatory to the Corporate Renewable Energy Buyers' Principles, developed jointly by the World Wildlife Fund and the World Resources Institute. The principles seek to spur progress on resolving the challenges corporations face when buying renewable energy. Greener Client Solutions – Digital Realty has a successful track record in sustainability dating back through our history as a company, with more than 40 sustainable "green" building certifications, including the completion of the world's first LEED gold-certified data center in Chicago in 2007. These data centers are built to best-in-class sustainability standards for energy performance and resource efficiency. This global success story includes projects in the United States, the United Kingdom, Ireland, Singapore and Australia. A few recent examples of recognition received for the company's sustainable data centre solutions include:

1. North America: Digital Realty's data centre in Austin, Texas – a custom build for ARM Holdings – won a DatacenterDynamics Green Data Center award.
2. EMEA: The London-area data centre recently launched for our client Rackspace is extremely energy-efficient, due primarily to the design of its cooling system, which uses outside air for heat transfer. Since it is an indirect adiabatic solution, it doesn't actually pull any of the outside air through the equipment on the IT floor.
3. APAC: Digital Realty's Singapore facility recently received a Green Mark Platinum certification from the Building and Construction Authority (BCA) and Infocomm Development Authority (IDA). This is the first time a colocation data centre in Singapore has achieved the Green Mark Platinum level under the category of "Existing Data Centre."

"Digital Realty continues to expand its solutions for clients seeking clean energy, energy efficiency, and sustainability in the data centre industry," said Aaron Binkley, Digital Realty's Director of Sustainability. "For example, in an article published in 2008, InfoWorld recognized Digital Realty's CTO and head of Portfolio Operations Jim Smith in its prestigious 'CTO 25' list as 'something of an icon in the tech world as he was among the first – if not the first – in the industry to speak openly about the inner workings of an efficient data center at free how-to seminars and green computing forums.' We look forward to accelerating our industry leadership in promoting sustainability in the data centre." To download the full paper, visit this link now: “Powering the Green Data Centre” ### A Quantum Leap into the Future of Computing The world of fundamental research is dominated by a small number of incredibly talented people. They may have a great support system around them and some talented colleagues to work with and with whom to exchange ideas and theories, but they are the stars of the research world.
When you bring a few of these stars together you can get incredible breakthroughs. I am a big fan of one particular researcher, Richard Feynman, who was not only part of the Los Alamos project in WW2, but won a Nobel Prize and was one of those who initiated the field of quantum computing. I’d highly recommend his semi-autobiographical books “Surely You're Joking, Mr. Feynman!” and “What Do You Care What Other People Think?”, which will show you what a great sense of humour, as well as a great mind, the man had. IBM posts more patents every year than any other company, and has done every year since 1993. On top of this, five of IBM’s alumni have won the Nobel Prize: Leo Esaki, of the Thomas J. Watson Research Centre in Yorktown Heights, N.Y., in 1973, for work in semiconductors; Gerd Binnig and Heinrich Rohrer, of the Zurich Research Centre, in 1986, for the scanning tunneling microscope; and Georg Bednorz and Alex Mueller, also of Zurich, in 1987, for research in superconductivity. I remember on a visit to IBM’s Zurich Research Centre once remarking on the artwork in reception, only to be told that it had been painted by one of the researchers, whose hobby was art. His hobbies also included 16 languages and 8 instruments, all of which he had mastered. These guys are supremely talented polymaths and justifiably the stars of their world. IBM has announced that it has made a breakthrough in quantum computing, the very field initiated by Richard Feynman. Bearing in mind that all technical advances are in reality thousands of small breakthroughs that accumulate to create a revolution, the breakthrough announced by IBM marks a significant step on the path to the creation of real quantum computers. So what is quantum computing? Digital computers use transistors to encode data into binary digits with two possible states, 1 and 0. This is like the bead on an abacus that you can move from left to right. Advances in microchip design have achieved two things. They have enabled us to cram in ever more transistors (lots more beads on the abacus) using ever smaller fabrications (smaller beads, so that you can fit more in). They have allowed us to pack more and more processing power onto the silicon chips that we use today, but we are reaching a set of physical limits that will bring such advances to an end. Quantum computers are fundamentally different to what we know today, and have quantum bits, or "qubits," that can exist as a 1, a 0, or as both at the same time. So while a regular computer made of two bits can encode information in only one of four possible combinations – 00, 01, 10, 11 – a quantum computer can hold all four of those combinations at once, and can therefore handle exponentially more information than regular computers. While a regular computer might evaluate a series of options in turn before comparing the results to find a winner, a quantum computer would consider all the options at the same time, because qubits can process lots of information all at once, getting to the answer much faster. There are a few challenges, however, such as how to overcome the inherent instability of qubits, which have a tendency to forget what information they have been given, and the risk of altering the information simply by trying to read it. Then there is the challenge of programming the qubits. Thankfully, there have been several recent advances in quantum computing.
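Before looking at those advances, it is worth making the scaling argument concrete. The short sketch below is a rough illustration in Python, not tied to any particular quantum toolkit: it simply shows how quickly the classical description of an n-qubit state outgrows real hardware, since an n-qubit state needs 2^n complex amplitudes to describe in full (assuming 16 bytes per amplitude).

```python
# How much classical memory is needed just to *store* an n-qubit state?
# An n-qubit state is described by 2**n complex amplitudes; at 16 bytes per
# amplitude (two 64-bit floats) the requirement doubles with every extra qubit.

BYTES_PER_AMPLITUDE = 16  # one double-precision complex number

def state_vector_bytes(n_qubits: int) -> int:
    """Memory required to hold the full state vector of n qubits."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

for n in (10, 20, 30, 40, 50, 300):
    print(f"{n:3d} qubits -> {state_vector_bytes(n):.3e} bytes")

# Around 30 qubits already needs ~17 GB, 50 qubits roughly 18 petabytes,
# and 300 qubits needs more numbers than there are protons in the known
# universe - the point the article goes on to make below.
```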
In March, researchers at Google and the University of California, Santa Barbara (UCSB) demonstrated an error-correcting quantum bit (qubit) circuit that allows quantum computers to remain stable enough to reproduce the same results. Then a Canadian company called D-Wave built a type of quantum computer. Google and Lockheed Martin are evaluating the D-Wave machine to see how it compares to traditional machines. At the end of April 2015, a team at Cambridge Quantum Computing (CQCL) announced the development of a quantum computer operating system, using a high-speed supercomputer to accurately simulate a quantum processor. However, this was almost immediately topped by IBM, which announced a way to detect and measure the two types of quantum errors, called bit-flip and phase-flip, at the same time. IBM produced a prototype 4-qubit circuit; while previous quantum computers had been built in a linear fashion, with qubits all in a row allowing for the measurement of only one quantum error at a time, IBM’s new model sees qubits stacked in a square array, allowing two methods of error correction to occur simultaneously. Given their exponential advantages over classical computers, quantum systems promise to revolutionise our world. It isn’t easy to simulate even a modest-sized quantum system, because to write down the quantum state of a system of just 300 qubits, you would need 2^300 numbers. This is roughly the number of protons in the known universe, so no amount of Moore's Law scaling will ever make it possible for a classical computer to process that many numbers. ### PTC aiming for the TOP with IoT at LiveWorx PTC’s annual IoT event LiveWorx was a hit in Boston last week, May 4-7 2015, with PTC customers, prospects and third-party developers all excited by the future of IoT. PTC CEO Jim Heppelmann and his team saw a turnout of an estimated 2,000 attendees, up from around 250 or so in 2014. Concerns about whether standards and security technologies can keep pace with the rapid evolution and adoption of IoT did nothing to dampen the enthusiasm of the crowd. The reality of the IoT is that it isn’t a new market, and adoption is much more rapid than people expected; it has been accelerated in recent times by new technologies. And PTC is determined to be at the leading edge of that acceleration. IoT is more than just smart, connected products. It requires middleware and infrastructure to make the detached connected, allowing companies to turn the petabytes of dark data and information captured by sensor-laden products into valuable insights and innovative services. PTC’s acquisitions of ThingWorx and Axeda, and (May’s intent to buy) ColdLight, have moved it beyond its traditional product walls (CAD, PLM, ALM and SLM) and position it well in the rapidly evolving IoT world. PTC is moving into integration services, software infrastructures, connectivity, analytics and insights. PTC is making these moves to take an early lead in what is expected to be amongst the most rapidly evolving technologies of the next few years. Reinforcement (if it was needed) of its IoT strategy came in the form of a partnering agreement with ServiceMax, the field service specialist. This centres on a new generation of IoT-connected service products from ServiceMax that will deliver, amongst other things, predictive maintenance and upgrade advisory services to their customers. PTC’s existing (CAD/PLM/ALM/SLM) businesses may have taken somewhat of a back seat at LiveWorx, but they’re not forgotten.
Their CAD business is heading for IoTisation too. ThingWorx Converge is intended to deliver product-development-to-IoT integration (of sorts). Converge is, in fact, an add-on that delivers more standardised third-party/ISV software-to-IoT connectivity. Having seen a demo (keep it simple, PTC!), I get it. But the concept of hub-and-spoke integration isn’t new, and yet another integration platform may be one too many. In its current form, I’m a tad confused as to why it isn’t simply a new set of features offered as part of standard ThingWorx. Wouldn’t that create pull from the ISV developer community, PTC? Of course, PTC’s recent expansion into new domains will not be without its challenges. For one, PTC needs to integrate its existing business lines with new acquisitions into something more cohesive than a disconnected collection of software toolsets. Converge will allow this to happen … to an extent. It also needs to continue to deliver revenue and profit from existing business lines to fund its investments in IoT. According to PTC, this is also in hand (and hopefully to plan?). All in all, LiveWorx was a good event. From 250 attendees last year to around 2,000 this year – on that kind of growth trend, should we be expecting 20,000 next year, PTC? ### Cloud Migration Challenges How To Stop Downtime From Risking Your Cloud Projects Cloud computing represents a great opportunity for companies to step off the treadmill of IT hardware refresh projects and the management of updates to IT systems. According to Vision Solutions’ own State of Resilience Report for 2015, 62 per cent of all companies are using cloud services in some shape or form. This statistic covers the use of a wide variety of different cloud platforms, from internal private clouds through to hybrid and public cloud implementations. Moving to the cloud is, however, a significant challenge, and the issue of downtime can stop cloud projects in their tracks. The risk of critical IT systems being offline can lead to projects being deferred or cancelled. Over 60 per cent of IT professionals surveyed stated that they had had to delay a major IT project; when asked about the reasons for putting back these projects, the biggest reason given was the risk of encountering downtime. This was followed by a lack of resources to complete the migration, and then a lack of continuity planning skills. When an incident of downtime can affect the perception of IT internally – and end up costing more than any projected savings – how can you plan ahead successfully? To improve the chances of any migration being successful, it’s worth spending the time to thoroughly understand what is being moved to the cloud and why. Email servers, file servers and databases are all commonly moved to the cloud; however, these tend to be single instances that can work in isolation. For more intricate services like applications, there can be more moving parts involved that all have to be working at the same time. From this, it is possible to look at how each component will work with the others over time. Can a corporate application work across different areas during the migration, or will it all have to be in one place to perform in a way that is reliable and sufficient for the business? It can be possible to update the application to work in a hybrid fashion during its migration – however, this can lead to additional costs being incurred as updates are made to each part of the service as it is moved.
Each part of the application will have to be migrated to the location where it will live long-term, and each of these moves can lead to downtime for the service if not managed properly. Planning the move is important, as it enables IT teams to diagnose potential sources of risk and downtime. This includes looking at potential unforeseen circumstances that might come up during a move, such as risk of data loss or the ability to roll back to the current systems if something is not working properly. Avoiding the hidden challenges and costs around cloud migration Alongside the obvious costs that can be associated with migration, there are others that are less obvious. Planning ahead can help a migration go smoothly, while tools like replication can reduce the window of downtime that is required as part of a move to the cloud as well. However, the people element of a successful migration is often overlooked. Staff costs can cover several elements within a migration. This involves more than just the spend on pizzas and beer that can tide the team over during a late night or weekend implementation. From investment in additional expertise as part of the project through to overtime payments if projects have to be completed out-of-hours, budgeting for this is important. For organisations that don’t invest in staff time, there can be a cost around productivity and morale that goes on far longer as well. Investing in proper migration planning and tools to speed up the move to the cloud can pay off over time, particularly for service providers that are working with multiple customers around cloud. Getting migrations done in shorter time-scales, in more predictable ways and with less downtime for the customer provides more customer satisfaction. In turn, this can help cloud service providers improve their profitability and differentiate their services. Many IT professionals have a valid fear of downtime and the effect that loss of service can have, both on the business they work for and on their own careers. However, with the right planning and data migration plans, this risk of downtime can be virtually eliminated. ### HyperX® has announced the world’s fastest DDR4 128GB memory kit HyperX®, a division of Kingston® Technology Company, Inc., the independent world leader in memory products, yesterday announced that it has created the world’s fastest DDR4 128GB memory kit running at an astounding 3000MHz. The kit consists of eight 16GB HyperX Predator modules (16GB x 8) with ultra-tight 16-16-16-36 timings and XMP profiles for easy and stable overclocking. The accomplishment was achieved using the MSI X99 MPOWER motherboard in an eight module, quad-channel configuration along with an Intel® Core™ i7 5820K processor. During Computex Taipei, HyperX will unleash a high-performance system featuring the upcoming 16GB modules powered by the recently released HyperX Predator M.2 PCIe SSD. The live demo will take place during the HyperX Roadshow Experience from June 4-7, 2015, at the ATT 4 FUN center, in Taipei. More details will be forthcoming about availability and pricing for HyperX Predator 16GB DDR4 modules and kits. HyperX is the high-performance product division of Kingston Technology encompassing high-speed DDR4 and DDR3 memory, SSDs, USB Flash drives, headsets and mouse pads. Targeted at gamers, overclockers and enthusiasts, HyperX is known throughout the world for quality, performance and innovation. 
HyperX is committed to eSports as it sponsors over 20 teams globally and is the main sponsor of Intel Extreme Masters. HyperX can be found at many shows including Brasil Game Show, China Joy, DreamHack, gamescom andPAX. For more information visit the HyperX home page. ### VESK announces a "VESK for Legal" A new cloud computing business focused on the legal sector. James Mackie, Managing Director of VESK Limited, the fastest growing provider of hosted virtual desktops in the UK, is pleased to announce the creation of ‘VESK for Legal’, a company that will specialise in the delivery of cloud based IT services to the legal sector. VESK currently delivers a world-class, highly scalable, reliable and secure cloud IT infrastructure, allowing businesses to configure and run any applications of choice, accessible from any location using any type of device. With over 375 customers, a number of which are already in the legal sector, VESK offer a full 99.9% SLA, is fully SRA compliant and incorporate full backup and disaster recovery as an integral part of their solution. VESK maintained 100% uptime for all of 2012, 2013 and 2014. VESK are ISO27001 and ISO9001 compliant, security assured by the cabinet office to host data for the UK’s public sector and winners of the 2013 Green IT Awards. “VESK are delighted to make this commitment to the legal sector and to welcome David Bennett to our business to lead this exciting initiative. Dave has over 20 years of experience in implementing and managing infrastructure and business solutions for law firms. His background is relatively unique, having worked in both large and small law firms. This has given him an appreciation of the pressures and constraints that different size firms are under, and particularly the need to improve productivity and working practices.” said James Mackie. “VESK for Legal offers significant and, in some instances, transformational benefits to law firms. The traditional in-sourced model of delivering legal IT, with the majority of effort, investment and resources focused on keeping the lights on, is not well aligned to the needs of the modern law firm," said David Bennett. James added “The innovative use of IT to support lawyer’s efficiency and enhance client relationships can enable law firm differentiation. As such CIOs and CTOs are striving to ensure that their law firms exploit technology in order to meet strategic objectives and drive competitive advantage. VESK for Legal enables this by providing the platform on which the CIO and his/her IT department can configure and implement those applications and associated business processes that can drive differentiation. "We remove the burden of managing physical servers, networks, storage, back up and disaster recovery from the valuable resources within the IT department. Dave’s knowledge and awareness of the legal market underpins VESKs commitment to law firms. VESK for Legal offers the opportunity for legal IT departments to shift the focus of their teams to where their firm wants and needs them to be – finding ways of leveraging IT to make their lawyers more productive.” "VESK for Legal provides a utility view of IT, commoditising the delivery of technology into a service. Applications and data are held and managed centrally on our infrastructure, with access provided over the internet or through direct connections and paid for on a subscription or demand basis. 
This is very different from the traditional IT model where hardware and software are owned by the firm," said David Bennett. Working closely with existing law firm technology partners, VESK regularly provisions the leading practice management, case management, document management, collaboration and digital dictation systems for its law firm clients. VESK for Legal will build on this experience with the addition of specialist legal IT knowledge and experience, while offering the full range of cloud computing services including hosted desktops, infrastructure and business applications. With VESK for Legal, law firms can achieve real cost savings, significantly reduce the time it takes to adopt new technologies and have peace of mind that their data is secure and their IT solutions are up to date. ### Cloud Partnership Announced: CensorNet and Networks Unlimited CensorNet has appointed Networks Unlimited to drive its market push in South Africa and the Sub-Saharan African region. On the 31st of March, CensorNet announced it has partnered with South Africa’s leading value-added distributor Networks Unlimited to service the entire Sub-Saharan African region. The region includes many of the world’s fastest-growing economies and is experiencing unparalleled levels of infrastructure improvement, opening up the market to modern cloud solutions for the first time. "I have been working with partners in South Africa for many years and I am excited to work with Networks Unlimited to drive the adoption of our web security solutions in the region. While in the past the region suffered from poor bandwidth, limiting the penetration of the cloud, the market is now evolving at a rapid rate and the use of cloud applications continues to rise to unprecedented levels," said Ed Macnair, CEO of CensorNet. "This partnership is indicative of our plans to grow internationally, whilst enabling us to offer technical and sales support locally to our burgeoning customer base." Networks Unlimited boasts four branches in South Africa, and also enjoys a strong presence in 19 Sub-Saharan African countries. Opening its doors in 1994, the company has a 20-year track record of sourcing and distributing best-of-breed solutions, ensuring that the region continues to receive some of the most innovative technologies used by major enterprises today. "Currently there is a technological gap within the Sub-Saharan web security market. As more and more organisations adopt cloud applications, traditional web security solutions are outdated and unable to cope," comments Anton Jacobsz, managing director at Networks Unlimited. "What impressed us most is CensorNet’s commitment to providing the most technologically advanced security solutions to respond to the growing demand in the market." “With a wide market reach on the continent, Networks Unlimited is a high-pedigree value-added distributor offering the best and latest solutions within the converged technology, data centre, networking and security landscapes,” says Macnair. “Our suite of cloud security solutions complements Networks Unlimited’s existing portfolio while unlocking a plethora of up- and cross-sell opportunities.” The announcement followed CensorNet’s statement of intent in February to shake up the web security channel through an intensive channel recruitment drive. This is built upon partnering with security-focused channel partners and MSPs, strengthening its core channel management team and continuing to drive product development. 
Networks Unlimited will be formally introducing CensorNet at the IT Web Security Summit at Vodacom World, Midrand, South Africa, from 26-27 May. CensorNet CEO Ed Macnair will be attending and speaking at the event. For more information, or to register, visit http://www.itweb.co.za/index.php?option=com_content&view=article&id=138977. ### April Top 10 #CloudInfluence UK and Cloud Bursters This month’s list of cloud bursters is unusual in consisting mainly of CEOs. Top of the list is Amazon’s CEO Jeff Bezos. With all the excitement and comment following the disclosure of the financials for Amazon’s cloud business AWS, Jeff topped the Global 50 this month. Next on the list is Daniel Ives, an analyst at FBR Capital Markets, who appears in the #CloudInfluence ranking sporadically, commenting each time the main tech firms issue quarterly results. He is followed by Scott Guthrie, EVP for Microsoft’s cloud and enterprise group, commenting on the company’s claimed leadership in cloud. Next up is Oracle CEO Mark Hurd. Interestingly enough, neither the firm where Mark now works, Oracle, nor the firm where he came from, HP, feature very highly in our #CloudInfluence rankings. Like many legacy tech vendors, they are focusing on private cloud because it’s not all that different from their existing operating model. The challenge for them is that many clients are voting with their workloads, as is evident in the recent results from public cloud providers like AWS and Microsoft. It remains to be seen how quickly the traditional vendors will change, because it’s impossible to simply change your operating model and culture overnight. It is interesting, therefore, to see which of these vendors appear regularly in our #CloudInfluence rankings, as IBM does – showing that they appear to be making this transition – and which don’t appear as regularly, like Oracle and HP – suggesting that they need to do more. Luis Benavides, Founder and CEO of Day1 Solutions, rose to prominence this month commenting on the firm’s partnership with Cloudian and the appointment of a new VP of Engineering and Solutions. He is followed by a series of other CEOs with recent announcements, along with Vibhor Kapoor, another Microsoft Azure executive. [table id=21 /] April UK #CloudInfluence Individuals As ever, topping our UK list is the prolific Simon Porter, IBM’s European Midmarket VP. While he dropped from 41 to 44 in the Global 50 #CloudInfluence Individuals this month, he was still the top IBMer, ahead of his CEO Ginni Rometty in 47th. Second this month was Ian Masters from Vision Solutions, who was quoted widely in relation to the launch of the firm’s Double-Take Cloud Migration Center product, which provides automated migration for servers and applications into the cloud from private and public cloud platforms. Research reports were also widely in evidence this month. Ian van Reenen in 3rd was commenting on the results of an Autotask survey of managed service providers (MSPs), which found that they are more focused than ever on managing endpoints and using remote monitoring and management (RMM) solutions to increase visibility and scale to meet the ongoing demands of managing cloud-based solutions. Meanwhile David Blumenthal, MD, of the Commonwealth Fund, in 9th, was commenting on a report that theft, illegal hacking, and other breaches of protected health information compromised 29 million medical records in 949 incidents between 2010 and 2013, spelling out a crying need for better data security. 
[table id=22 /] The health of the industry is demonstrated by the change month on month in our rankings for individuals – not only the Global 50 #CloudInfluence Individuals, but also the cloud bursters and UK #CloudInfluence Individuals. Aside from a few prolific regulars, such as IBM’s Simon Porter, most of the names are new each month as different companies launch new products, services or reports. NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Vision Solutions Drives Business to the Cloud A new application, the Double-Take Cloud Migration Centre, is making migration to Microsoft Azure seamless while minimising risk. Vision Solutions, Inc., a leading provider of disaster recovery, high availability and migration software and services, today formally announced its Double-Take Cloud Migration Centre product. Double-Take Cloud Migration Centre provides automated migration for servers and applications into the cloud from private and public cloud platforms. Double-Take Cloud Migration Center supports migration for servers running on Amazon Web Services and on VMware vSphere moving over to Microsoft Azure. The migrations can be managed and completed automatically on behalf of the user – from cloud image provisioning and set-up to installation and cut-over. “Companies are looking at how they move to cloud and use migration as part of their ongoing strategy for deploying IT assets. What suited businesses last year may no longer be the best fit today due to changes in cost or performance,” said Vision Solutions Vice President, Cloud & Strategic Alliances, Ian Masters. “However, migrations cannot adversely impact companies’ day-to-day operations with detrimental downtime. Double-Take Cloud Migration Centre provides automated, simplified migration processes, which ease the burden associated with moving platforms, while providing the flexibility companies need when selecting the best cloud platform as their requirements change over time.” For companies moving to Microsoft Azure, Double-Take Cloud Migration Center includes automated discovery of all virtual machines (VMs) running within a user’s account, making it easier to group machines that can be migrated together. Users can then select migration destinations during the initial set-up phase in order to make the process more efficient. When servers are geographically distributed, whether for availability or performance, these relationships can be selected and kept in place after the migration is completed. Users will benefit from the following features within the new application: easy configuration with simple steps and automation; simplified source discovery with easy-to-find production servers; convenient, browser-based migration management; real-time replication to avoid lost productivity; test cutover capability; and automatic target server provisioning. “Vision Solutions’ Double-Take Cloud Migration Centre is transforming the way companies move applications and servers to the cloud,” said Tim Laplante, Director of Product Strategy at Vision Solutions. 
“Users can confidently rely on the simplicity of this service to perform migrations, making a complex process intuitive. With a seamless process from start to finish – all steps clearly indicated in one centralized, browser-based dashboard – the task becomes foolproof. The Double-Take Cloud Migration Center is a tremendous asset to IT teams: it frees them up for a variety of other tasks and strategic IT initiatives and allows them to create more value for the business overall.” Double-Take Cloud Migration Centre is based on Double-Take Move, the company’s migration solution for true, near-zero downtime and anything-to-anything server, application and data migrations. To learn more about Double-Take Cloud Migration Centre or to request a demo, visit: http://www.visionsolutions.com/products/windows/double-take-cloud-migration-center/overview. ### April Top 50 #CloudInfluence Individuals With all the excitement and comment following the disclosure of Amazon’s financials for its cloud business AWS, it is little surprise that Jeff Bezos, Amazon’s CEO, tops this month’s individual #CloudInfluence rankings, pushing Satya Nadella, Microsoft’s CEO, into 2nd place. Third in the rankings, and the centre of great speculation on his plans for the future of the company, is Marc Benioff, Salesforce.com’s CEO. The recent disclosure that the firm had been approached by a potential buyer sparked much debate on who this might be. While Oracle and Microsoft were both suggested as realistic suitors for Salesforce that could afford the $50bn price-tag, we think that the chances of a deal of this size coming off are somewhat slim. The commotion around AWS’s financials, the competing cloud revenue claims from Microsoft and IBM and the ensuing surge of comment on the cloud market swept a number of executives into this month’s rankings, including Terry Wise, Tom Szkutak, Andy Jassy and Dave McCann from Amazon, Eric Schmidt from Google, Scott Guthrie at Microsoft and even Larry Ellison from Oracle – as well as commentators such as Michael Pachter from Wedbush Securities and Daniel Ives from FBR Capital Markets, a regular in the rankings, whose opinions were also quoted on the speculation surrounding Salesforce. The announcement of a collaboration between Fujitsu and Microsoft to transform manufacturing processes through IoT innovation pushed Hiroyuki Sakai, Fujitsu’s Head of Global Marketing, into this month’s top ten. Executives who rose to prominence this month shouting about awards they had won included Brian Johnson, CEO and co-founder of DivvyCloud, when his company was named by Gartner as one of the 2015 "Cool Vendors" in Cloud Management, and Dinanath Kholkar from TCS, when his firm was recognised as a ‘Leader’ in Analytics Business Process Services (BPS) by Everest Group. Iron Mountain and CloudBlue’s joint announcement of a set of secure e-Waste and IT Asset Disposition Services helped to propel both Eileen Sweeney, SVP and GM of Data Management at Iron Mountain, and Ken Beyer from Ingram Micro into the rankings. [table id=20 /] The April Cloud Bursters and UK Individuals #CloudInfluence tables will be published tomorrow; in the meantime, you can see the full Top 50 Global Organisations here. NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. 
The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Cloud acronyms are a pain in the AaaS ### April Global #CloudInfluence Organisations As we covered in our preview this month, Microsoft retained the top slot in the rankings, and while its recently announced commercial cloud revenues at a $6.3 billion run rate might not put it ahead of AWS in the IaaS space, they are still impressive, especially given the current growth rate. Amazon was the second-highest-ranked #CloudInfluence entity this month, with its cloud business AWS in 4th. Both Amazon and AWS had a massive boost in their #CloudInfluence scores this month following the much-awaited disclosure of the first AWS financials, but even combining their #CloudInfluence scores, as we did in the correlation chart in our preview, did not quite equal Microsoft’s score, showing that Microsoft retains a clear overall lead. The next few places in this month’s rankings were occupied by IBM, Google, Oracle, Apple and SAP – most of whom we covered in our preview. They were also featured in much of the cloud market coverage that followed the AWS disclosure. These articles also featured comments from analysts at firms like Gartner and IDC, thereby boosting their #CloudInfluence and propelling them into this month’s top ten as well. VMware [19] jumped to 10th in the rankings following impressive financial results and the announcement at the company’s annual Partner Exchange conference in San Francisco of plans to give MSPs access to the vCloud Air Network. Picking up a new CTO from Google and opening a data centre in Frankfurt helped to propel DigitalOcean into 12th. Singapore Telecommunications’ acquisition of US-based cyber-security firm Trustwave pushed it into 13th. The Chinese corporate bond market experienced its second default this month after internet company Cloud Live Technology Group Co. said it would miss payments. Synergy Research may not have quite made it into the top ten, but it was quick to revise its cloud market share analysis following the AWS announcement and was featured in a certain amount of coverage late in the month. Other notable commentators were Ovum, which reckons that by 2016 more than 80 percent of global enterprises will be using IaaS, primarily in private cloud computing, and Fitch, which reaffirmed its top credit rating on Oracle, saying that it has made and will continue to make significant investments, both organic and inorganic, to retain its long-term competitiveness relative to other cloud subscription providers. Interesting, then, that the more recent speculation about a potential bidder for Salesforce has centred on Oracle. [table id=19 /] Obviously we will be following all the twists and turns in the cloud market with our #CloudInfluence rankings each month – not only following the key organisations, but also the key individuals, as you will see in tomorrow’s April #CloudInfluence rankings for individuals. NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. 
Companies that were particularly active in the given period will feature more prominently. ### Legal Issues To Consider When Moving To The Cloud Moving to the cloud may seem like a no-brainer for most business owners. You won’t need to fork out cash for a server, you save on an on-call IT guy, and you may receive a more specialist service. Furthermore, with the quality of the internet in 2015, there is little noticeable difference between the upload and download speeds of an on-site server and the cloud. However, several legal issues need to be considered before moving to the cloud, including whether your data would be safe and what would happen if your data were stolen. This article will go through the top 4 legal issues of cloud computing and give you some insight before you make your final decision. Privacy and security Every business has confidential information – ranging from client lists to trade secrets. Hence, often the first and foremost concern of a business owner is: just how safe is my data in the cloud? The answer to this question really depends on the quality of the hosting company. An experienced cloud host will have extensive security mechanisms to ensure your data is safe. Make sure you do some thorough research and pick a good host! Leaked data and liability But you might still ask: what would happen if my confidential information were leaked anyway? In this situation, you need to determine whether your cloud provider has acted in a negligent manner. Very generally, this means whether they have taken all reasonable precautions to protect your data and prevent theft. Another consideration may be the limited liability clause in your provider’s contract. This will generally state the maximum monetary amount your provider will pay in case of negligence. It may be in your interest to negotiate this clause prior to making a contract if the potential damages your business may incur would vastly outweigh the liability cap. Service and reliability If all your emails, files and data are run through a cloud, the smooth operation of your business will be totally reliant on that cloud service. For example, were your cloud provider to have a technical problem, your whole business could be down for an indefinite period of time. There are two ways to mitigate this type of issue. First, make sure you have a good cloud provider. The Office 365 Cloud, for example, has multiple servers and multiple backups of data around the world. This means even if one of their servers were down, you would still have access to your information. Secondly, make sure your contract stipulates that you will be credited for every day of interruption experienced. There should also be a clause allowing you to terminate the contract if the interruption extends beyond an agreed timeline. Nothing is worse than having to pay for a service that you are no longer using! Ownership The question of data ownership is another very common legal issue. It is not enough to simply assume that you own all rights to data stored in the cloud. Make sure your initial contract makes it expressly clear that all data your company places in the cloud will be 100% yours and retrievable whenever necessary. For more information on the cloud, look out for the cloud education hashtag on Twitter. 
#CloudEducation ### CensorNet Launches In-Built Cloud Application Control Functionality The new Secure Web Gateway bridges the gap between Web Security and Cloud Application Control to offer customers the ability to discover and analyse cloud application use in a single solution. “From Dropbox to Twitter, Salesforce to LinkedIn, now, more than ever, businesses need visibility and the ability to control what information and data their staff can access. They are using a plethora of cloud-based applications within the workplace which need to be monitored from a productivity, security and protection of intellectual property perspective,” says Ed Macnair, CEO of CensorNet. With cloud application adoption and BYOD growing at an exponential rate, CensorNet has announced the launch of cloud application discovery and analysis functionality for its Secure Web Gateway (SWG) solution. The first of its kind, CensorNet’s new Secure Web Gateway allows organisations to have full real-time visibility and control of all aspects of web access and cloud application use across all devices, including BYOD, in a single solution. CIOs and IT departments are under increasing pressure to provide employees with reliable and secure web access whilst controlling the use of cloud applications – all without compromising data security or allowing Shadow IT to sprawl. However, current web security solutions are outdated and unable to meet the demands and complexity of modern business applications. To bridge the gap in the web security market, CensorNet is driving its new security offering to enable organisations to intercept all employees’ web traffic and provide a real-time view of cloud application usage on their network. Social media can be a valuable tool for organisations, but carries with it responsibilities. Stories are rife of employees posting damaging or libellous comments about a company or its products, or publishing sensitive commercial data on their feeds. However, current web security can only tell a CIO that a person has accessed an application; it cannot reveal the content of the post itself. The other major problem for CIOs today is the number of cloud file-sharing applications that can be used as an easy way to extract intellectual property – whether it be patents, designs or confidential customer lists – outside of the organisation by terminated employees or those simply after a quick buck. The problem can be exacerbated when the data is moving across geographical borders, which could breach strict regulatory guidelines. Up until now, the details within the packets uploaded to such applications have been difficult to ascertain. “At launch, CensorNet’s SWG will be able to monitor 150 of the most prevalent cloud applications used in business today, with hundreds more to be added as part of the ongoing product roadmap, giving unprecedented visibility of user activity within those cloud applications. This helps to ensure regulatory compliance and adds an extra layer of security for organisations that have an increasing number of their business processes in the cloud,” says Macnair. According to Gartner, by 2016, 25 per cent of enterprises will secure access to cloud-based services using a cloud access security broker (CASB) platform. CensorNet’s new SWG bridges the gap between web security and Cloud Application Control solutions, offering cost efficiencies to businesses wanting to plug the widening security risk gap caused by cloud applications with just one single solution. 
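To make the idea of cloud application discovery a little more concrete, the sketch below shows, in very rough terms, what classifying gateway traffic by cloud application and flagging large uploads might involve. It is purely illustrative and is not CensorNet's implementation or API; the log format, the domain-to-application mapping and the upload threshold are all assumptions made for the example.

```python
# Illustrative sketch only - not CensorNet's SWG. It shows the general idea of
# cloud application discovery from web gateway logs: map hostnames to known
# cloud apps, count usage, and flag large uploads to tracked services.
from collections import Counter

# Hypothetical mapping of domains to cloud applications a gateway might track.
CLOUD_APPS = {
    "dropbox.com": "Dropbox",
    "twitter.com": "Twitter",
    "salesforce.com": "Salesforce",
    "linkedin.com": "LinkedIn",
}

def discover(log_entries):
    """Return per-app usage counts and a list of suspiciously large uploads."""
    usage, flagged = Counter(), []
    for user, method, host, bytes_sent in log_entries:
        app = next((name for dom, name in CLOUD_APPS.items() if host.endswith(dom)), None)
        if app is None:
            continue  # not a tracked cloud application
        usage[app] += 1
        # Large POST/PUT bodies to a tracked app may indicate data leaving the business.
        if method in ("POST", "PUT") and bytes_sent > 1_000_000:
            flagged.append((user, app, bytes_sent))
    return usage, flagged

if __name__ == "__main__":
    sample = [
        ("alice", "GET", "www.twitter.com", 300),
        ("bob", "PUT", "content.dropbox.com", 25_000_000),
    ]
    print(discover(sample))
```

A real gateway works in-line on intercepted traffic and inspects content rather than log lines, but the discover, classify and flag loop above captures the general shape of the capability being described.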
The launch of the new discovery and analysis functionality will be followed by further product development enhancements to CensorNet’s Hybrid Web Security through 2015 and cements the company’s strategy statement earlier this month to redefine the web security market. CensorNet’s new SWG is available for existing customers immediately and the solution is available to download as a free 30 day trial now at CensorNet.com. ### Preview for April 2015 #CloudInfluence Rankings With Amazon’s disclosure of its AWS financials, we thought we’d run a special preview to this month’s #CloudInfluence Rankings to put them into context. You’d have thought that this month’s disclosure by Amazon of its eagerly awaited Amazon Web Services financials might fill in a big piece of the cloud financial picture and clarify its market leadership, but Microsoft and IBM were quick to claim leadership as well – albeit by a very different metric. So which company is right? Which one is the real market leader? And which one is best placed to outpace its rivals in the future? Amazon Web Services announced an annual run rate of $6.26 billion, with an operating income of $265 million. Microsoft said that its commercial cloud business is on a $6.3 billion run rate. And IBM claimed that its 12-month rolling revenue is now on a more than $7.7 billion run rate. The catch? Amazon’s financials only include IaaS whereas Microsoft includes things like CRM Online, Office 365 and Azure and IBM's big cloud number includes everything from services to gear to managed services. While there is merit to the way that Microsoft and IBM come up with their numbers – it is after all a hybrid cloud world – if you strip IBM’s numbers down to just "as a service revenue" you get a $3.8 billion run rate. And that as-a-service figure includes software as a service as well as platform and infrastructure. Thankfully soon after Amazon’s AWS disclosure, Synergy Research Group adjusted its market share figures for cloud infrastructure to provide some clarity between the different players. Synergy Research Group found that: “Amazon Web Services (AWS) remains larger than its four main competitors combined in the cloud infrastructure service market. Microsoft can once again lay claim to having by far the highest revenue growth rate and IBM remains the king of the private & hybrid services segment, but AWS continues to grow faster than the market as a whole and its market share crept up to 29% in the quarter. Google is quietly gaining share though it remains just half the size of Microsoft in this market, while Salesforce, once the unquestioned leader in PaaS, rounds out the top five ranked companies.” That’s clear then – Amazon is the winner. Well maybe not. As Camille Mendler, lead analyst at Ovum, explained in an interesting opening plenary at Telco Cloud London: “Cloud is as much about loyalty as it is about technology services”. In other words players need to focus on generating loyalty within their client base and their channel ecosystem. As we have highlighted in our recent blogs on marketing, most vendors are focused on experience marketing to help drive both loyalty and advocacy. 
Our monthly #CloudInfluence reports are an effective way of tracking how well the various players are doing: providing a comprehensive big data analysis across all major global news, blogs, forums, and social media interaction, the rankings show how much buzz there is within each firm’s ecosystem as well as how effective their announcements and marketing machines are in grabbing attention. Retaining its overall lead in the #CloudInfluence rankings this month is Microsoft, with Amazon in second (albeit closer to Microsoft than any player has yet been in the rankings), followed by IBM and then Google, with Salesforce some distance behind, far down the rankings. The most marked difference in performance between the major cloud players this month was in Amazon’s score, which almost doubled, largely on the back of the excited comment and debate sparked by its financial disclosure. Indeed it is interesting to note the correlation between the #CloudInfluence score of each of the major players and their growth rate (see chart). *Note: For this graph Amazon and AWS #CloudInfluence scores have been aggregated. Salesforce, IBM and Microsoft are all aligned to the trendline given in the graph. And if you negate the brief boost that Amazon’s disclosure gave to its #CloudInfluence score this month, by taking its February score instead, Amazon is also aligned to the trendline. The exception is Google, which is the only major player not to break out any financials for its cloud infrastructure business. However, given the reception that Amazon’s disclosure had, pressure is building on Google to do the same, and we may yet see it do so – increasing its #CloudInfluence score in the process. So what does this tell us? The full April 2015 Compare the Cloud #CloudInfluence rankings will be published tomorrow, but here is a brief preview of how these top firms fare: Microsoft – No 1 in April 2015 Compare the Cloud #CloudInfluence rankings Microsoft’s large existing ecosystem and powerful marketing machine are helping to drive its share of voice and therefore its #CloudInfluence. It also has the financial reserves to invest whatever it takes to maintain its impressive growth rate and succeed in cloud. It will need to keep growing rapidly, though, if it is to have any chance of catching Amazon. Microsoft can make it easy for enterprises to keep their legacy investments in Windows-oriented data centers, even as it helps them migrate to its Azure cloud. Microsoft has also built up decades of trust and goodwill with CIOs, making it a safe choice for the cloud. On top of this, its marketing machine has built a level of brand recognition and awareness that extends its reach further into the non-tech SMB arena than almost any other player. Amazon – No 2 in April 2015 Compare the Cloud #CloudInfluence rankings Amazon’s first-mover advantage has created an impressive market share lead and with it a rapidly expanding cloud ecosystem. While it lacks the reserves of its rivals, it is at least turning a decent profit, and it needs to do more to boost its marketing in order to maintain a decent level of #CloudInfluence. AWS’s ultimate appeal isn’t necessarily its cost (even with continually falling prices, it can still be quite expensive) or its ease of procurement as an elastic hosting provider. Its main appeal is its massive ecosystem of services and the ability to tap into them fairly quickly. As fast as AWS innovates, its ecosystem adds value faster than it could ever do so itself. 
If its rivals ever grow their ecosystems to match AWS’s then it could be trumped. IBM – No 3 in April 2015 Compare the Cloud #CloudInfluence rankings IBM not only has a significant marketing machine to drive its #CloudInfluence, but also has an army of technology and business consultants that is trusted by big corporate CIOs to maintain their most critical systems and manage their most challenging integrations and transformations. IBM would challenge any perceptions that it has been left out of what might be a two-horse race, but it isn’t really playing in the public cloud arena and is instead focused on hybrid cloud, where it claims to be “successfully targeting that high value Cloud space.” To do so it needs to maintain a competitive advantage with added-value offerings such as Watson, even as the Azure Machine Learning service and Amazon Machine Learning service seek to leapfrog the Google Prediction API and challenge Watson itself. Google – No 4 in April 2015 Compare the Cloud #CloudInfluence rankings Google has some great technology and is quietly gaining share, but its lack of a sizable ecosystem to rival those of AWS, Microsoft or even IBM may prevent it from catching the others, while its lack of financial disclosure has prevented it from maximizing its potential #CloudInfluence. An added distraction for Google is the anti-trust case from the European Commission, which focuses on accusations that it has unfairly used its products to oust competitors. It may be too early to write Google off, but it has a number of challenges on its hands if it is to catch up. Salesforce – Way behind in April 2015 Compare the Cloud #CloudInfluence rankings Has Salesforce been left behind already? Once the undisputed PaaS leader, Salesforce lags far behind the others in terms of scale and growth and almost fails to register at all in terms of #CloudInfluence. Interestingly though, while hostile to Microsoft under Steve Ballmer, Salesforce has changed its tune since Satya Nadella's appointment last year. It has launched products that integrate with Microsoft Outlook, SharePoint, and Office 365, becoming one of Microsoft’s biggest partners. Maybe this is an admission by Salesforce that “if you can’t beat them, join them.” At the end of the day, the correlation between #CloudInfluence and growth may turn out to be more indicative and illustrative than direct. However, here at Compare the Cloud, we believe that the winners in this market will be those with the healthiest ecosystems and the most effective experience marketing. As summarised by David Andreasson from Teliasonera: “the winner in this business is the one who creates the most compelling offering based on services that others have innovated.” NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot taken at a particular point of time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently. ### Smart Cities Are The Future This week the Waldorf Hilton hosted Ovum’s Smart to Future Cities conference. If you haven’t been able to attend the sessions, never fear, I was there for you. Smart cities are aiming to integrate energy, mobility and ICT, and in turn lower carbon emissions. 
We need to look towards the future and consider the way our expanding population will live in the cities of the future and the problems they will face. Steve Turner, Head of City Policy for Manchester, spoke on the UK-China Smart Green Cities Planning and Governance projects, which are working at city and state levels within China to open the market for UK SMEs. The Manchester/Wuhan twin-city initiative means that Manchester is able to work with Digital China to develop joint metrics for measuring a smart city. Nine SMEs are currently working to help smarten the Wuhan Qinshan Riverside District development, which is being developed as a wholly smart city through the partnership with Manchester. The internet of things and the development of smart cities are bringing things like personalised management plans in healthcare into focus as a real prospect for the future. There is a consumer push for the system to be more seamless. The US is moving towards a much more integrated healthcare system and the rest of the world will soon follow. Siemens has become a major part of the IoT industry, bringing its technology to work with condition monitoring in relation to things like wind turbines and how their efficiency can be maximised through adjustments. Siemens also operates peak power alarms to ensure there is no loss of power through system overloads. Energy security is becoming a big realm of interest, with ways to protect and sustain business becoming paramount. Emergencies such as Hurricane Katrina showed that only a limited number of industries were prepared to protect and readily back up their energy supply in order to protect their business. So, knowing that we need to be saving energy for our future needs because of the limited nature of our current resources, we need to be focussed on innovation. Research and innovation are not always on the same page as reality: cities are not ready for some of the levels of innovation that are being developed. Companies like SKIDATA started by providing ski access provisions, but now provide event ticketing and car park services which are managed through smartphone applications. "The solutions should be centred around the end customer and be easily accessible,” said Simone Frank, who is in charge of Business Development / Urban for SKIDATA. Options both for the future, and even the present, include things like storage lockers at train stations for picking up online purchases, managed via a smartphone app that notifies the locker holder when their package has arrived, and park-and-ride and bike-and-ride systems that are linked to the public transport network. Smartphone penetration of 80% in the UK means that it is the best way to get in touch with a customer. They can pay, order services, manage those services and be served ads targeting their needs and usage history. What we need to realise is that the vision of smart cities is really one of a smart environment, even though we do not yet know what all of the potential smart services of the future will be. We need to manage the reuse and sharing of resources. Guy Redmill from ISPM said "there should be a vision behind what the smart city should ultimately become; if we take the overall vision, then we must consider that the smart city cannot be made up of individual project silos, because then those silos are not interconnected." Throughout the conference there were many mentions of silos, and I think Guy Redmill hit the nail on the head, saying the future smart city needs to be considered as a whole. 
If we implement many smart initiatives individually that do not communicate or connect to each other, the city will not be all that smart - the data being mined won’t be able to refer to other data points, and overall, data is where the sm ### Insights From Day 2 at Telco Cloud London In our overview of day one at London Telco Cloud, we looked at the competitive dilemma that telcos face, with shrinking margins on core services and with stiff competition in newer converged services and cloud. The answer appeared to be the development of converged services and cloud-based apps that could be delivered easily and effectively via marketplaces to the SMB sector, where telcos retain a critical footprint. On day two, specialist consulting firm mm1 delivered a reality check: explaining why SMBs don’t (yet) buy cloud or IT from telcos. The key points that they made were:
- Telcos are not the natural partner for SMBs to talk to about cloud; mid-size companies usually have other partners for IT.
- If mid-size companies do talk to telcos about cloud, the telco sales forces currently aren’t able to sell it very well.
- Customer understanding is key – telcos are accustomed to transactional sales at scale, but cloud needs to be a consultative sale, which smaller IT service companies are more used to delivering.
- While cloud can be simple to provision, the customer lifecycle remains complex (inc. design, implementation, migration, training, and in-life support).
- SMBs are looking for worry-free management throughout this lifecycle complexity and see their local IT suppliers as the most trusted partners for this – not their telco.
The conclusion from mm1’s pitch was that telcos cannot win with stand-alone cloud offerings, and instead need to address SMBs with credible, converged telco-cloud offerings. These are a combination of not only classic telco services with fixed and mobile access, but also cloud and professional services. Currently, however, while most telcos offer “fixed” and “mobile” and “cloud” services for SMB customers, few provide them as truly converged offerings. This will require a fundamental business and operating model transformation by most telcos, say mm1, but don’t worry, they say they’re here to help. Another fascinating keynote on day 2 of Telco Cloud was delivered by Goran Car, Director of the Professional Services and Solutions Strategic Business Unit at Combis and CTO of ComCloud, one of the leading system integrators in the Adriatic region. He introduced us to the Flynn Effect – coined by James Flynn, who in his 2013 TED talk looked at why our IQ levels are on average higher than our grandparents’. Already panicked by all these talented millennials, I wanted to know if there was any hope of competing with them. Are successive generations really brighter and more talented than those before them? As Goran explained, in 1865 you’d be lucky to shoot a target from a distance once a minute, in 1898 you’d expect to be able to do so several times a minute, and in 1918 you’d have been able to hit a target 50 or more times. This wasn’t down to any inherent evolutionary advances. Instead it was down to the move from the musket to the repeating rifle and then to the machine gun. The ever-improving capability of the technology at our disposal is enhancing our potential, as is our familiarity with it and our ability to get the most from it. Cloud is just such a technological evolution and the more that we harness it, the more this will lead to further innovations. 
The point was endorsed by David Andreasson, head of product and technology at Teliasonera, when in his keynote he predicted that “the winner in this business is the one who creates the most compelling offering based on services that others have innovated.” So what is clear to us is this: We don’t have to worry about all those pesky and highly-talented millennials, as they will innovate and create new services that we can then capitalise on. We do need to incorporate these services into compelling, credible, converged telco-cloud offerings, as mm1 outlined. We can deliver them to our hungry SMB clients easily and effectively via marketplaces. At least I think that this was what I learned from Telco Cloud, or maybe the hospitality got to me too much. ### Insights from Day 1 at Telco Cloud London At Telco Cloud the debate turns to future revenue streams, up-selling opportunities, apps and marketplaces – it all sounds cloudy to me! As the game of cloud poker intensifies, telco execs converged on London this week for Telco Cloud – one of the Informa Cloud World Series events. It was refreshing to see such a good turnout, with a large number of high-calibre delegates and a packed auditorium with standing room only at times. With everyone maintaining their best poker faces, it was hard to tell whether the speakers and the throng of delegates were more concerned by their cloud rivals, more resigned to the competitive dynamics or more hopeful for the future. Telcos are facing a gradual erosion of traditional income streams as telephony and messaging are commoditised and as digitisation and the move to NFV and SDN reshape the competitive landscape. They accept that they don’t hold all the cards. Amazon has stolen an unassailable lead in public cloud, Apple and Android have captured the mobile app space, Microsoft owns the most critical business app suite, and there is no way that the telcos can take any of these players head-on, even in markets where they still have a dominant position as the domestic carrier. The telcos do have one trump card though – their direct relationship with companies and most importantly with SMBs – the segment of the market that the big tech players find hardest to reach. Typically SMBs already have an account with a telco that includes a billing relationship for an ongoing service. This provides a unique opportunity for the telcos to up-sell other services, as long as they can sell, deliver and maintain them well. Easy – yes? – well maybe not! This isn’t quite as straightforward as it sounds: the sales teams at most telcos need to be retrained to sell apps and digital services, the telcos need to provide an attractive set of apps that can be delivered quickly, easily and efficiently, and they need to make it all work well together over time in order to retain what is often, inappropriately, termed client ownership – in reality nobody actually owns the client! Most telcos understand that some form of app store or marketplace set-up is required, and among the many speakers at Telco Cloud were BCSG and AppDirect, both of whom spoke eloquently about how they can enable telcos to deliver a set of apps to their clients effectively. This is where it starts to get complicated though. The options for telcos wanting to sell added-value services or apps are: You build your own marketplace from the ground up - an unnecessarily expensive option that few would consider these days. 
You use a Cloud Service Broker to provide a platform on which to build your own marketplace - even IBM used a CSB when it built Bluemix on AppDirect. You post all your services on a vendor marketplace, but risk them being lost among the throng, as you get with busy hubs like Apple’s App Store. You find or found an independent marketplace for your local market and hope that you can be a key local provider of services within this ecosystem. Whether you’re a big fish in a small pond, a smaller-looking fish in a large lake, or the only fish in the puddle, it’s not a case of “build an app and they will come.” With the proliferation of apps and marketplaces, any pull is greatly dissipated and you need an effective marketing campaign behind your apps to push them to the target market segment. Amazon of course has its own marketplace. And its speaker at Telco Cloud boasted that over a million companies are already using services from AWS and its partners. Typically though this will appeal to the more tech-savvy corporates or the born-on-the-cloud start-ups, rather than the less tech-savvy SMBs. The telcos can create their own marketplace to serve the SMBs in their local markets, but many of the SMBs will have Office 365 (or maybe the alternative from Google) high on their shopping list and are likely to be using cloud services such as Gmail or Dropbox already. Telcos wanting to own the ‘client relationship’ – and with it the annuity revenue and future up-selling opportunities – will be concerned that opening the door to Microsoft, Google or other cloud players and including them in their marketplace will risk dissipating much of the value and wallet share to these potential rivals, or even risk losing the clients to them entirely. And while telcos no longer need to go to the effort and expense of building their own marketplace from scratch – cloud service brokers like AppDirect can do most of the heavy lifting and service integration for them – there are local MSPs in every market that can also use the same cloud service brokers to create rival local marketplaces. While the telco may be local on a national scale, these MSPs sell themselves as the ultra-local trusted supplier at a far more localised level, either to a small region, small market segment or small vertical niche. It all comes back, however, to the telcos’ trump card – their existing connectivity and billing relationship. SMBs want to have it simple – easy access to apps that are easy to use and already integrated, and that can be provided under a single account with a single bill. If the telcos can get this proposition right then potentially they have a very bright future. The SMB sector is vast and hungry for these services! ### Is It Getting Cloudy in Banking? Following on from the Cloud Banking Europe event a few weeks back, we thought a brief article was needed to get the message to a wider audience. The main theme of the seminar was Regulation and Compliance, aiming to explain cloud within the industry. However, it was apparent that within the broad group of attendees there was a lack of understanding of the IT governance rules and regulations within the industry. I personally thought that some of the discussions were out of date, given how fast cloud environments are changing month by month. The two main regulators in the financial industry are the PRA and the FCA. 
The Prudential Regulation Authority (PRA) is a part of the Bank of England and 'is responsible for the prudential regulation and supervision of banks, building societies, credit unions, insurers and major investment firms. It sets standards and supervises financial institutions at the level of the individual firm.' The Financial Conduct Authority (FCA), formerly the FSA, is a financial regulatory body in the United Kingdom, but operates independently of the United Kingdom government and is financed by charging fees to members of the financial services industry. The FCA maintains the integrity of the UK’s financial markets and focuses on the regulation of conduct by both retail and wholesale financial services firms, with a consumer focus. Over the two days a varied selection of speakers was present, together with vendor exhibitions in the main foyer. It was interesting to see and talk to vendors that are working within the financial services space and listen to their views on what drives banking tech to the cloud and also what is hindering adoption. The general consensus is threefold: speed, resiliency and security. These points are extremely important in any industry moving to a centralised outsourced model, but even more so for the financial industry. So, let’s discuss each one in more depth. Speed (low latency) Whilst this is extremely important within any market sector adopting cloud principles, within the financial sector it is critical. With the complex trading systems and principles within each asset class of the banking world (commodities, futures, forex and so on, which are very fast flowing), a delay of a few milliseconds is disastrous. For other asset classes such as fund management this time delay is not so critical. Fast links to exchanges are key for fast order routing, and this is the mainstay of any financial order routing connectivity. However, this does depend on how, and how frequently, any given firm trades. Resiliency Part of the regulations that authorised firms need to comply with is Disaster Recovery. As part of FCA adherence, this is a main point and should be taken very seriously. Hosting with a cloud provider does not automatically give you Disaster Recovery: you have simply moved your infrastructure to someone else, and now your technology abides by their procedures and policies. This is a misconception that many firms fall foul of, and unless you state what you want with regard to Disaster Recovery, you will be unpleasantly surprised! Although the FCA guidelines are stated very clearly in a very large handbook, the actual policies are very difficult to understand unless you are from both an IT and a financial business background, which is very rare. Security Again, this is paramount for any firm moving to a cloud-centralised infrastructure (even more so in a hybrid example). With over $3 trillion generated from cybercrime – more than the entire global drug trafficking trade – the scale of the fraud is incredible. There are best practice guidelines demonstrated within the FCA’s handbook for regulated firms; however, these can be hard to implement if your cloud provider does not understand the specific security governance from your regulator, and let’s face it, it’s not easy to understand even if you are from that market sector, let alone if you are a cloud provider that services every industry. So, from a compliance point of view, IT governance for the banking industry is an absolute nightmare and requires a lot of thought and planning. 
Some of the main points highlighted within the FCA’s manual for guidance include: IT Governance and Strategy; Risk Management; Information Security and Controls; Security Practices; Logical Access; Security Administration; Security Monitoring; Business Continuity Planning and Disaster Recovery; and Offshoring/Outsourcing. As an example, take this set of conditions set out within the FCA working manual (and this is only one heading within outsourcing). A common platform firm must in particular take the necessary steps to ensure that the following conditions are satisfied:
- The service provider must have the ability, capacity, and any authorisation required by law to perform the outsourced functions, services or activities reliably and professionally;
- The service provider must carry out the outsourced services effectively, and to this end the firm must establish methods for assessing the standard of performance of the service provider;
- The service provider must properly supervise the carrying out of the outsourced functions, and adequately manage the risks associated with the outsourcing;
- Appropriate action must be taken if it appears that the service provider may not be carrying out the functions effectively and in compliance with applicable laws and regulatory requirements;
- The firm must retain the necessary expertise to supervise the outsourced functions effectively and to manage the risks associated with the outsourcing, and must supervise those functions and manage those risks;
- The service provider must disclose to the firm any development that may have a material impact on its ability to carry out the outsourced functions effectively and in compliance with applicable laws and regulatory requirements;
- The firm must be able to terminate the arrangement for the outsourcing where necessary without detriment to the continuity and quality of its provision of services to clients;
- The service provider must co-operate with the appropriate regulator and any other relevant competent authority in connection with the outsourced activities;
- The firm, its auditors, the appropriate regulator and any other relevant competent authority must have effective access to data related to the outsourced activities, as well as to the business premises of the service provider; and the appropriate regulator and any other relevant competent authority must be able to exercise those rights of access;
- The service provider must protect any confidential information relating to the firm and its clients;
- The firm and the service provider must establish, implement and maintain a contingency plan for disaster recovery and periodic testing of backup facilities where that is necessary having regard to the function, service or activity that has been outsourced.
Confused? I am guessing you are! Now let’s add in a hybrid approach where more than one service provider will be involved – how on earth can you achieve the appropriate governance when various cloud providers are being utilised? My vision for banking and technology in the future (specifically cloud) would be one of more automation, as this is much easier to regulate, coupled with the correct education and guidance on cloud technologies from the regulators themselves, which I think is a must. The current rules and governance set out by the governing bodies have not changed since 2008 (pretty much the birth of cloud) and desperately need addressing. 
However, with this said, we have been invited to two separate seminar days based at the FCA’s premises to discuss these issues and to provide input on potential changes moving forward. If cloud has a major part to play in the banking world, it is imperative that the governing body provides the correct advice on the technology utilised, together with the corresponding governance. At the end of the day, the regulator is funded wholly by the fines it imposes on its member firms. To me it seems unfair that a firm should suffer a fine from the governing body that regulates its use of cloud technology, when the regulator itself lacks technical understanding. ### We're going to IBM Edge2015, Las Vegas! ### Cloud Service Matchmaking: IBM + Twitter We’re going to look back a few weeks to an interesting announcement that will begin to affect you in a profound way in the very near future. On March 17th 2015, IBM announced the highly anticipated details of its partnership with Twitter. They will be working on ‘industry-first cloud data services’ that allow the extraction of actionable business data insights from Twitter data by business professionals and developers. Presently there are more than 100 early client engagements where IBM is helping enterprises to apply social data to help enable business decisions. The full press release is available here. The IBM/Twitter combination will help enterprises to enrich and analyse their Twitter data in combination with millions of data points from other streams of public and business data. Weather forecasts, sales information and product inventory stats were listed as target points for data mining, in order to find correlations to drive the actionable insights. “So much of business decision making relies on internal data such as sales, promotion and inventory. Now with Twitter data, customer feedback can easily be incorporated into decision making,” said Chris Moody, Vice President of Data Strategy at Twitter. “IBM’s unique capabilities can help businesses leverage this valuable data, and we expect to see rapid demand in retail, telecommunications, finance and more.” Of the top three social insights announced from the early client engagements, the first related to geographic reactions to brands: simply put, people are more likely to tweet unhappy things about their service provider if their mobile phone loses reception during a weather event. The second was that retail employees have valuable relationships with their most loyal customers, and when employee turnover occurs, retail business is directly affected. Essentially, if you fire someone’s favourite employee in your store or restaurant, they are going to tweet about it, and the results showed that it would impact sales negatively. The third social insight was that an analysis of the social feeds of top fashion bloggers can enable fashion brands to predict what will or will not sell well. To be specific, the release stated: “By using psycholinguistic analytics from IBM Research to extract a full spectrum of psychological, cognitive and social traits from Twitter data that influential fashion bloggers generate - combined with operational data such as sales and market share information - manufacturers can better understand why some products sell well while others don’t.” So IBM has announced a cloud data service that allows business professionals and developers to extract actionable business insights from Twitter data. Is this a big deal? - Potentially yes, but let’s not get ahead of ourselves. 
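Before weighing that question up, it may help to see what the cross-stream correlation behind these insights looks like in miniature. The sketch below is purely illustrative and does not use IBM's or Twitter's actual services or APIs: it joins a hypothetical day-by-day count of negative tweets about a brand with a hypothetical series of daily service-outage minutes, in the spirit of the weather-and-reception insight above, and measures how strongly the two move together.

```python
# Illustrative sketch only - not IBM's cloud data service or Twitter's API.
# One data stream (daily negative-tweet counts, however obtained) is enriched
# with another (daily outage minutes) and their correlation is measured.
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical day-by-day figures for the same week.
negative_tweets = [12, 15, 90, 85, 20, 14, 11]   # spikes during a storm
outage_minutes  = [0,  2,  45, 38,  5,  1,  0]   # network disruption per day

print(f"correlation: {pearson(negative_tweets, outage_minutes):.2f}")
```

In practice the enrichment would span far more streams and far larger volumes than a seven-day toy example, which is exactly where a managed cloud data service would be expected to earn its keep.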
The real value here is not from analysing Twitter streams - there are many, many tools that already do this. True actionable insights can, however, be uncovered by enriching and analysing Twitter data in combination with millions of data points from other streams of public and business data; it is these correlations that will drive better decision making. Is it a game changer? No, not yet. Potentially the new IBM analytics services on the cloud could help businesses and developers create social data-enabled apps, merge sophisticated, predictive analytics with Twitter data and more easily analyse that data. This will only happen, however, if the service is easy enough to use, easy enough to integrate with, and is economically priced. IBM has some fantastic middleware, but it is not always easy to use. It isn't always easy to integrate either - sometimes even when trying to get two of the vendor's own apps to work together. On top of that, IBM software may well be competitively priced against other corporate tech vendors, but with Amazon AWS slashing the entry point for cloud services and many social tools available at low or often even no cost, this is a different game and it remains to be seen how well IBM will play in the social media area. And will it make an impact? Absolutely. It was big news when IBM's Watson technology won the famous Jeopardy! challenge, but the technology has yet to find widespread adoption. If in this instance IBM cracks the ease of use, integration and pricing challenges then the new service could well have a huge impact, but even if it doesn't, IBM is now forcing the hand of its main rivals. They have to develop their own such offerings or risk IBM stealing an advantage. Either way this kind of functionality is now available and in the not too distant future will be in common use.

### 7 Common Cloud Misconceptions

Even though cloud has been with us for some time already, many executives used to relying on traditional data servers consider this relatively novel technology an unnecessary risk and a potential source of problems. Their concerns are often based on some persistent misconceptions that seem to surround every new technology. Here are 7 major myths about cloud computing – debunked!

1. Cloud is not safe

This is perhaps the most significant and least grounded belief about the cloud. Many managers claim that they wouldn't want critical information about their company floating somewhere around the internet, stored on shared hardware and accessible by anyone from regular users to the National Security Agency. The truth is that cloud providers usually boast many more security measures than any company could ever get their hands on in their private, non-cloud data servers. Cloud providers can, and do, experience downtime, but companies can efficiently avoid it by placing their data in several locations and developing a backup strategy.

2. Cloud is too expensive

Another classic argument. While cloud may involve some considerable upfront costs, it will be much cheaper in the long run. First, IT management tasks are outsourced and their cost is reduced. Moreover, in some cases companies can save up to 50% of the costs normally related to operation, information system updates and maintenance. Going for cloud, companies can reduce their investment in hardware and human resources and radically limit the space usually given over to information infrastructure and management.
3. Organisations are still experimenting with the cloud

Some executives consider cloud just another technology trend that is bound to expire in time. They think most companies that choose this solution are merely experimenting with it to see whether it brings any benefits. A study conducted by Cloud Connect in partnership with Everest Group demonstrated that what's happening isn't experimentation, but serious investment: 58% of the enterprises examined would spend more than 10% of their annual IT budgets on cloud services. That is hardly a trivial commitment.

4. Cloud doesn't bring any tangible benefits to businesses

Wrong. Cloud computing is a major technological development that makes IT resources and applications easily accessible through the network. Increased data availability is just one important aspect of the cloud – another is the fact that it can easily adapt to a company's needs and usage habits. Beyond that, enterprises choosing the cloud also enjoy improved system performance and on-demand scalability.

5. Companies lose control over data in the cloud

This concern is closely tied to the one about security. Many executives feel that by relying completely on an outsourced IT infrastructure, they might simply lose control over it. This simply isn't true – by delegating the task of managing the company's data infrastructure, executives can benefit from the expertise of data specialists at the cloud provider. When choosing cloud, it is still the internal team that has full control over the company data – but the efficiency of the IT infrastructure is under the care of data experts, reducing the time spent on troubleshooting and effectively giving your IT team more time to focus on development projects.

6. Scaling in the cloud is automatic

Cloud doesn't guarantee the ability to scale up each and every aspect of a company's operations. While pure computing power can be scaled up most of the time, enterprise applications might not. That's why companies that expect to grow should make sure that their applications are developed to allow scalability in the cloud. Technically speaking, applications should be modular – their functionality should be split into independent components. This in turn allows companies to manage, configure and maintain more servers if their app requires them.

7. Cloud is just about technology requirements

Finally, many executives believe that cloud is a solution that affects only the technology needs of an enterprise. The truth is that more than half of the enterprises examined in the same study consider cloud a strategic business differentiator that accelerates innovation and promotes operational excellence.

All in all, cloud is a solution that brings tangible benefits to any enterprise deciding to outsource its data servers. When we look around and see how many people are using cloud services – Google Drive, Gmail, Amazon, eBay, iTunes – we realise that the importance of cloud is steadily growing and that it may soon become standard practice not only for consumers, but for enterprises as well.

### Cloud and the SMB

This year's State of the Cloud report found that the majority of businesses have adopted cloud computing in some form. In fact, 93 per cent of the organisations surveyed are employing cloud-based applications or experimenting with infrastructure-as-a-service (IaaS). However, it is important to remember that the cloud is not a one-size-fits-all solution.
SMBs (Small and Midsized Businesses), in particular, need to carefully assess which cloud approach is right for them to ensure that already-stretched IT budgets are not wasted. With that in mind, SMBs need to be aware of the various ways they can use the cloud to enhance their organisation.

First of all, companies must assess whether a public, private, or hybrid cloud approach is the right way to go. The public cloud relies on third-party suppliers, which usually offer a number of benefits but also come with added security risks. A private cloud approach, meanwhile, offers more control but less flexibility. Increasingly, SMBs are turning towards hybrid cloud platforms that look to encapsulate the benefits of both the public and private cloud ecosystems. Once IT leaders have decided on the cloud format that best suits their company, they can start to benefit from the many uses of cloud technology.

Reduce costs

One of the cloud's foremost uses is lowering IT budgets. If SMBs use an external vendor for their cloud needs, often they will sign up to an operational expense (OpEx) payment model. Essentially this means that large upfront hardware and software costs, which are often prohibitive for smaller firms, are avoided. Instead, a subscription service is implemented that avoids long-term financial commitments. Another financial benefit of the cloud is that hardware maintenance and software updates are all carried out off-premise and are the responsibility of the cloud supplier. This means that IT departments can make significant reductions in their maintenance outgoings. SMBs should shop around to see which cloud vendor is offering the most cost-effective service. Google Apps for Business, IBM, Microsoft's Office 365 and many other suppliers offer businesses entire suites of cloud applications, so it may work out cheaper to get as many services as possible from one vendor. The ability to consolidate software under one package is one of the cloud's main benefits.

Business agility

Another benefit of using the cloud is added business agility. Today's fast-moving digital world requires companies to react quickly and seamlessly to change, something that can be easily facilitated by the cloud. Organisations can leverage the cloud to test new software solutions and bring applications to market more quickly than ever before. Adopting a hybrid cloud approach can also help businesses achieve scalability without significant disruption. For example, SMBs are able to transition between private and public servers seamlessly in order to acquire more bandwidth and accommodate usage spikes (a simple sketch of this kind of 'burst' decision follows below). With traditional on-premise solutions, businesses may require time-consuming and expensive overhauls in order to scale their operations. In this sense, the cloud also enables SMBs to gain a competitive edge on their rivals. By lowering costs and enabling greater agility, medium-sized businesses and startups are able to more readily challenge established players, making for a more competitive and, ultimately, healthier enterprise market. As well as enabling SMBs to implement agile solutions to present-day problems, the cloud also ensures that businesses are future-proofed against unexpected innovations. The Internet of Things (IoT) is just one example of a coming technology predicted to cause mass disruption across a multitude of industries. By having a robust cloud infrastructure in place today, SMBs can prepare for this and many other innovations, safe in the knowledge that they can react to unforeseen change.
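To make that bursting idea concrete, here is a minimal sketch of a load-based burst decision. It assumes a fixed private capacity and an 80% threshold, and the provision_public_capacity helper is a hypothetical placeholder for whatever auto-scaling interface a given provider actually exposes – it illustrates the principle, not a production pattern.

```python
# Hypothetical sketch of a hybrid "cloud bursting" decision.
# Capacity, threshold and provision_public_capacity() are illustrative placeholders.

PRIVATE_CAPACITY_RPS = 500   # requests/sec the private/on-premise estate can absorb
BURST_THRESHOLD = 0.8        # start bursting at 80% of private capacity


def provision_public_capacity(extra_rps: float) -> None:
    # Placeholder: in practice this would call a provider's auto-scaling API.
    print(f"Requesting public capacity for ~{extra_rps:.0f} extra requests/sec")


def route_traffic(current_rps: float) -> str:
    """Decide where new workload should run for the current request rate."""
    utilisation = current_rps / PRIVATE_CAPACITY_RPS
    if utilisation < BURST_THRESHOLD:
        return "private"  # normal operation: keep workloads in-house
    overflow_rps = current_rps - PRIVATE_CAPACITY_RPS * BURST_THRESHOLD
    provision_public_capacity(overflow_rps)  # spin up extra public-cloud capacity
    return "hybrid"


if __name__ == "__main__":
    for rps in (200, 450, 700):
        print(rps, "->", route_traffic(rps))
```

The point is simply that the decision to burst is driven by measured utilisation, so spikes are absorbed in public capacity without permanently over-provisioning the private estate.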
Mobility

The cloud enables more mobile business solutions, fostering greater employee satisfaction and productivity, which is ideal for most SMBs. Today, members of staff, clients and customers demand a business environment that allows work to take place on multiple devices, with fluid movement between PCs, tablets and mobiles. Cloud computing enables this level of mobility by hosting files and applications externally, allowing them to be accessed and modified in real time across several devices. Similarly, the cloud can work wonders for SMBs that have offices in multiple locations, perhaps in several countries. The cloud makes sharing information a far easier task than using on-premise technology. Often employees will simply need to send a link or the location of a file in order to share their work with a colleague, meaning the days of huge email attachments crashing inboxes should be relegated to the past.

Storage

SMBs should also consider the cloud when it comes to their storage options. Firms of any size will want to ensure that the files and applications they need are easily accessible at all times, but finding space for this information can be difficult and often comes at a cost. Cloud storage providers offer significant savings when compared to local systems, coupled with easy access wherever your employees are based. Security regulations may mean that businesses must keep certain types of data in-house at all times, but it is likely that a cloud solution could still be effective in some way. Marketing materials, for example, which are usually made up of images as well as audio and video content, can take up a lot of server space and may be better placed in the cloud. Furthermore, although security is often viewed as one of cloud computing's major drawbacks, many third-party suppliers are working hard to tackle this perception. Some cloud providers now offer encryption free of charge, meaning that they cannot read the files they are hosting and your privacy is protected. Cloud storage also offers SMBs added peace of mind in the event of system failure or any other local issue. By backing up your data off-site, businesses are covered in the event of technical issues, making the recovery task much more straightforward.

Integration

The cloud can also serve SMBs that are looking to integrate new applications into existing services. The majority of cloud computing software packages use an Application Programming Interface, or API. Cloud APIs enable apps to request data and other resources from multiple services, either directly or indirectly, making for a more connected business experience. SMBs may find that by adopting the cloud, applications are already compatible with their existing infrastructure and do not need to be modified to enable integration, saving businesses time and money. Companies will have to assess whether cross-platform APIs, which are more flexible but could have reduced functionality, are right for them, or whether vendor-specific interfaces would be more suitable.

Environmental benefits

For the more environmentally-conscious companies out there, cloud computing can also offer a greener approach to business growth, without compromising on success. Instead of operating on-premise servers, which must be larger than necessary to accommodate expansion and peak workloads, the cloud enables businesses to occupy only the server space that they need, improving efficiency and lowering their energy consumption.
A 2013 report found that in a small business environment, server utilisation rates lie between 5 and 10 per cent, compared with 60 to 70 per cent for cloud-based shared data centres. That is a lot of energy, and money, wasted on on-premise solutions. Cloud service suppliers can also implement more efficient server layouts that are difficult for internal server rooms to replicate, allowing data centres to run more efficiently with a lower carbon footprint. Although most SMBs are already utilising the cloud in some way, it is important for IT leaders to constantly reassess the technology at their disposal to determine whether an in-house or third-party approach is best for their business. There are many potential uses for the cloud, so it's all about choosing the right one for your organisation.

### Part Two: Cloud Marketing #CloudInfluence Individuals

#CloudInfluence Marketing Opinion Holders

In polite conversation one is meant to avoid contentious topics such as politics, religion and sexuality, and the same normally goes for marketing, but occasionally there are key topics of principle where firms choose to take a stand. Salesforce Marketing Cloud took a very public stand against Indiana's Religious Freedom Restoration Act, or RFRA, and this propelled Scott McCorkle, its CEO, into the limelight and into first position in the rankings. Oracle executives John Stetic and Kevin Akeroyd were both widely quoted when the firm announced new features and capabilities for its marketing cloud. Similarly, David Beebe was covered promoting the launch of Marriott's new Digital Travel Magazine, arguing that: "Travel just lends itself to content creation and storytelling." CEOs Jamie Reardon and Kumar Vora both shot to prominence promoting their services: Find Your Influence, a leading influencer marketing platform that connects brands with bloggers, and Acrolinx, a software platform for WCM. Notable other appearances included Tony Sanders from Adobe and Dominic Citino from Sitecore, both of whom were widely quoted promoting members of their business partner channel. All the major players rely on their channel ecosystems to help drive sales, as well as implementation and support, for their marketing platforms. Indeed it was just such an announcement that propelled Sitecore business partner Perficient into the company rankings and its Sitecore Practice Director, Stephen Tynes, into the opinion holder rankings. And Paul Sieminski, a Republican who is general counsel to Automattic, the company that operates web-making tool WordPress.com, has been notably active in fighting for net neutrality, arguing that it is "such a fundamental issue for the Internet." [table id=17 /]

#CloudInfluence Marketing Commentators (press, bloggers, etc.)

Normally we exclude commentators (press, bloggers, etc.) from our rankings and focus instead only on opinion holders (those whom the commentators quote in their articles). However, for this special report we thought it might be worth highlighting a few of the top commentators. WordPress bloggers feature highly: Sarah Gooding and Jeff Chandler both blog for WPTavern and Jan Mylo has done so for WordPress Planet. With over 150,000 followers on Twitter, Jeff Bullas is a social marketing phenomenon. And the Content Marketing Institute, along with its publication CCO Magazine and main event Content Marketing World, is possibly the most influential platform for comment.
[table id=19 /]

NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### 10 UK Data Centres Go Global

Many people have said to us that a "cloud offering is useless without connectivity". Pulsant has now addressed this by announcing a global networking deal with GTT Communications Inc. The strength of Pulsant has always been its regional data centre footprint and interconnectivity from Scotland through to Croydon. The GTT deal brings the Pulsant cloud and big data services to 202 data centres and points of presence from Germany to Japan to the USA. Commenting on the recent venture, Andy Johnson, Managing Director of GTT EMEA and APAC, stated: "As the market moves more towards hybrid cloud services it is essential that secure, private connectivity solutions are easily accessible from all major metropolitan marketplaces. By linking to the Pulsant UK footprint of 10 data centres and their wealth of international clients and expertise, we are able to open these to the worldwide market and vice versa." Matt Lovell, CTO of Pulsant, stated: "The customer is the core of our services, therefore expanding our footprint globally brings our customers closer to core marketplaces." The best comment, and one that I endorse wholeheartedly, came from Rob Davies, Sales and Marketing Director at Pulsant: "Cloud should be borderless; easy, secure, private access should be a customer right, as should any determined hybrid capability. Pulsant customers are now able to penetrate international marketplaces with private connectivity direct to the client's office anywhere in the world, and to any cloud or platform of choice."

The win-win in this relationship:

- Pulsant open their 10-data-centre UK footprint to the GTT global wide area network
- GTT gets access to major Pulsant clients to enable worldwide private networking
- Pulsant deliver a single cable to the rack, enabling access to all major networks and cloud/SaaS platforms including IBM Cloud and AWS
- Pulsant clients can reach major countries and territories through private networking rather than using the public Internet

So what does this mean for their clients? Well, if you want your cloud services supported from two data centres in disparate locations to provide redundancy, but want both centres located in the UK to answer client concerns about data sovereignty, then you won't get this from any of the major cloud players; Pulsant is your best bet. If you have offices across the UK that are using cloud applications with significant I/O requirements, or that are sensitive to issues of latency or performance, then you want a choice of data centres near your regional offices. But rather than having to go for a provider with a single UK location, you want a reliable connection to, and between, each data centre that isn't going to suffer from packet loss or performance and latency issues. If you want to be open for business for clients across the globe, or want to partner with other players to provide hybrid solutions from a number of different global locations, then you need reliable connectivity with a strong SLA and real global reach.
If all of these issues sound familiar and are important to your business, then this deal between Pulsant and GTT is of real significance to you. The final question at the forefront of our mind is: are we seeing Pulsant, the first truly UK-centric data centre provider, become an international cloud and SaaS hybrid player? If that is the case then Pulsant has just become the British-flag-waving colocation and hosting facility of choice for companies looking to access the UK markets. Certainly a story worth watching as it unfolds.

### Cloud Marketing #CloudInfluence Special Report

Recently we explored how marketing was rapidly becoming the most powerful and resource-rich function in any business, and also how the Future of Marketing is in the Cloud. This touched on how marketing was changing and the rise of new disciplines such as influencer marketing and content marketing within the whole move to data-driven marketing. We also recently looked at the metrics that marketers are now swamped with, and at the increasing importance of differentiating between relatively meaningless social metrics and indicators of real attention or influence. Also, in looking at how to understand what successful marketing really looks like, we gave some insight into the theory behind our #CloudInfluence rankings. We thought it appropriate, then, to follow up with a #CloudInfluence special report this month on marketing, to look at who the real influencers are that are grabbing all the attention in this field. We understand that while there are a number of cloud marketing services that are growing rapidly, not all marketing is yet cloud-based. At the same time, traditional disciplines such as Web Content Management (WCM) are maturing and evolving into a broad set of interconnected data-driven marketing disciplines. Our analysis, while time-specific (looking at the last 60-90 days), seeks to encompass this whole spectrum and focus on those firms and individuals that are influencing the agenda in marketing, rather than just achieving their own marketing ends.

#CloudInfluence Marketing Companies

All the major cloud marketing firms understand the importance of influencer marketing as a key component of the current marketing framework. All are seeking to provide a service or solution to help their clients promote themselves, and are at the same time vying for prominence and attention themselves. While the rankings provide a snapshot of current prominence that favours companies with recent major announcements, these firms all need to be making such newsworthy announcements on a regular basis in order to grab attention and maintain their profile and prominence. Timing on this occasion favours Oracle and Adobe - both are not only key marketing platform providers, but are also ones that have announced recent major enhancements. Zia Consulting, a provider of Enterprise Content Management (ECM) and Intelligent Document Capture solutions, made a big recent play of its new strategic partnership with SeeUnity, the content integration experts. Similarly, WPP's Data Alliance, the WPP company that supports the group's data business, made an announcement to promote its deepening partnership with Facebook to provide new data-driven solutions that can deliver personalisation at scale on Facebook. Most vendors have a real mixture of stakeholders in their ecosystem, from their own employees and executives to technology and integration partners, as well as marketing agencies and in-house client marketing departments.
Enabling positive advocacy across each of these constituencies is essential. In particular, however, partner ecosystems will have a major role in determining the long-term winners and losers in the market. Not all players have their partner strategy right, but some do: for example, Perficient, in 5th in the rankings, rose to prominence having become a Sitecore Platinum Implementation Partner, while BrainJocks, in 11th, gained coverage for becoming a Sitecore Gold Implementation Partner. It's not just the software vendors that are partnering in this way. The Accident Data Center (ADC), an online resource for people who have been involved in recent motor-vehicle crashes, has launched a Sponsor Network, providing injury attorneys with exclusive advertising and content-marketing opportunities. As other players make big announcements the rankings will change (and we may revisit this topic again in the future). Rather than blame the rankings or this current snapshot, any firms that do not appear should be asking themselves why they are not more visible. In such a fragmented and competitive market that is in a critical growth phase, all players should be seeking to be as visible as possible, and all should understand the importance of grabbing attention and achieving real influence. [table id=16 /]

NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### Office 365 Compliance in Regulated Industries

Compliance in regulated industries is often difficult to understand, so we are taking a closer look at Office 365 and how Microsoft addresses key regulatory requirements. Firms working within regulated industries, such as solicitor practices and financial services, are subject to strict regulatory standards. These standards extend across all aspects of business, including technology. Regulatory bodies, such as the Solicitors Regulation Authority and the Financial Conduct Authority, impose strict requirements around the management, processing and security of client data. Despite common misconceptions, these requirements do not preclude the use of cloud technologies, and the benefits of cloud computing can still be reaped.

Data Residency and Data Protection

EU data protection law stipulates that data should not be transferred outside the EU unless to a country with similarly high data protection standards. Office 365 complies with this legislation by adopting a regionalised data centre strategy, storing European customer data in either its Dublin or Amsterdam data centres. In April 2014, Microsoft became the first (and to date, only) cloud provider to receive approval from the Article 29 Working Party, an independent advisory body established by the European Parliament to focus on data protection. The ruling confirmed that Microsoft meets the high standards of EU data protection legislation, so regardless of where data is stored, it is protected to a standard approved by EU authorities. Microsoft is also certified under the Safe Harbor Framework, which recognises companies aligned with EU data privacy rules. Businesses that wish to legally transfer data from the EU to the U.S.
must comply with the Safe Harbor principles.

Client Confidentiality

Client confidentiality is a key concern for businesses working within regulated industries. Microsoft provides contractual security commitments that protect your data at all times. Confidential information will not be disclosed to third parties, nor used for any purpose other than that agreed. If a government request is received to access your data, Microsoft commits to notifying you, unless it is legally prohibited from doing so.

Security

Regulatory bodies often request security compliance with ISO 27001:2005 as a minimum. Office 365 and the infrastructure layer on which it relies are ISO 27001 certified, delivering:

- 24-hour monitoring and restricted access to data centres
- Encryption of data at rest and during transmission
- Data loss prevention to stop sensitive data from leaking either inside or outside the organisation
- Enforcement of "hard" passwords and multi-factor authentication

Data Ownership and Regulatory Access

Regulated firms must have adequate agreements with their providers to allow regulatory bodies to access and inspect their data. With Office 365, you own your data, retain all rights to it and can download a copy of it at any time. This can be done without Microsoft assistance and subsequently issued to your regulatory body.

Data Recovery

The data backup and continuity arrangements of your cloud provider are important. Office 365 backs up your data at least once a week and maintains multiple copies across its data centres. It also commits to delivering at least 99.9% uptime with a financially-backed guarantee.

USA Patriot Act

The USA Patriot Act applies to companies based anywhere in the world with a US parent company. It obliges them to disclose information on their customers to US Government agencies without their knowledge or consent, potentially conflicting with EU data protection laws. Despite its severe reputation, the Patriot Act is no more intrusive than similar interception regimes across EU member states, such as the UK Regulation of Investigatory Powers Act 2000. The Patriot Act is also limited in scope and does not apply to the majority of cloud customers. Where it does apply, Microsoft's certification under the Safe Harbor Agreement ensures compliance with the EU Data Protection Directive. Microsoft is at the forefront of security and management of cloud services. As highlighted above, Office 365 is a good fit for companies working within heavily regulated industries. In addition to Microsoft's commitments, undertaking your own due diligence and establishing policies, training and security measures means you can ensure continued regulatory compliance. The benefits of Office 365 are there for regulated industries to reap, and as a topic of #CloudEducation, we have started the research for you.

### Up All Night For Cloud Gaming

Insomnia54 - the UK's biggest gaming festival. Compare the Cloud had the pleasure of partnering with MultiPlay events, founders of the Insomnia Gaming Festival. The three-day event takes place at the Ricoh Arena, Coventry - otherwise known as the Wasps' rugby ground. Now, I appreciate that attending an event with 30,000 under-20s may not be everyone's cup of tea, least of all mine, but I was simply bowled over by the event, and here's why. An insight into the gaming industry, period! As soon as I walked into the arena I was taken aback by a completely different IT world from anything I had ever seen before.
If you have teenage kids you are probably aware of the term "e-Sports" and have demands being thrown at you for the latest games and consoles to play. If you aren't privy to the pleasures of teen gaming addicts, you will be surprised to know that it is big business - really big business. The tournaments have literally hundreds of PCs networked so you can compete against other teams or individuals, or aim to have the best score on the day for featured games. There are hundreds of teams that play against other teams for money. Now, in my day this was never possible. I had a Commodore 64 that took 5 minutes to load a tape that sounded like an old analogue modem dialling into CompuServe (yes, I am a dinosaur), and then I would bash the keyboard madly to Daley Thompson's Decathlon. Now networked Minecraft, Halo 2, League of Legends and so many others can be played, with great cash prizes for the winners and runners-up.

[Image: Minecraft Network]

How about BYOC? Bring Your Own Computer events are remarkable, and seeing kids bringing in their own computers to just plug and play was amazing. The networked PCs looked totally pimped out and were serious gaming machines. There were hundreds of them, and I was struck by how simply they were connected to a LAN environment.

[Image: The CEX challengers]

Next on my list of surprises was the Cex stand, and to describe what I saw really needs a picture of the challengers. In a nutshell, the Cex stand, among others, challenged anyone to put together a PC from scratch within a time limit. It was amazing to see kids as young as 12 completing this with ease, and so fast! Sadly I didn't pluck up the courage to take the challenge and embarrass myself, but if this is what the young generation are capable of at 12, what on earth is coming in the next decade - writing computer games at 7? And doesn't the future look exciting for when these kids come of hiring age! What happened next just knocked me for six. Whilst walking to a refreshment stand to consume slush puppies I was met by a screaming mass of children running after a couple of teenagers - enter the Sidemen. Never heard of them? Neither had I, until I was nearly knocked over by a couple of dozen screaming gamer groupies. Out of curiosity, and possibly blind stupidity, I followed this mass of screaming children. All became clear thanks to a bit of explaining from one of the bewildered fathers I spoke to. The Sidemen seem to be a mixture of Jackass and game reviewers. Hard to imagine, but have a look for yourself online here when you get a spare minute. Anyway, they have YouTube channels full of crazy stunt and prank videos, as well as game reviews and walkthroughs, plus their own merchandise!

[Image: The SideMen merch stand had lines even when the 'stars' weren't present!]

In summary, there is little I can say to do the event justice. There were stands selling other merchandise (PC components like memory, as well as games) and money was literally being thrown at the vendors (funded by what I can only imagine are very supportive parents). There were so many games, and so many chances to play them, that it was really, really fun. On a serious note, the event was just awesome if you like e-Sports or playing games on your console at home. And it can be big business if you wish to exhibit.
The younger generation seem to be guided by a herd of YouTube channel stars that encourage the youth of today to buy the latest games as they come out on the market. The majority of the games were streamed at the event by Clanforge (thanks to Isaac for his hospitality when I was very out of my element!), which utilises a variety of providers to guarantee a very resilient service on the day! The gaming industry is massive, and you really need to attend one of the Insomnia events to understand the magnitude of the potential market. The next one runs from the 28th to the 31st of August, and I highly recommend you get down to Coventry and get your game on!

### Marketing Metrics: What Does Success Look Like?

In the past, marketing was a creative function: the creative types in the advertising or PR agency came up with campaigns that would not only wow their clients, but promised to wow their clients' customers as well. Editorial coverage was measured in EAV (Equivalent Advertising Value) and print advertising came with a rate card, all of which was based on audited figures for the size and demographics of a publication's readership. Then came the internet and social media with real and immediate metrics. Suddenly we had banner advertising where we could track click-through rates, and websites where we could track visits or page views and use cookies and forms to capture behaviour and identity. Social media then gave us followers, likes and links, with even more clicks. All these easy-to-generate numbers were volatile, but they appeared to be telling a story and they turned marketing into more of a science. Marketers now spend all their time looking at an endless stream of such quantitative metrics, but surely it is quality and not quantity that will help you understand the real progress of any campaign, and the impact that you're really having on results. So where are the qualitative metrics to tell you not only how a particular program is working, but why? And where is the intelligence to tell you where to prioritise efforts and how to differentiate between your many marketing activities so as to see which are being most effective? I am reminded of the phrase "There are three kinds of lies: lies, damned lies, and statistics", which was popularised in the United States by Mark Twain (among others), who attributed it to the 19th-century British Prime Minister Benjamin Disraeli. We would argue that the qualitative metrics that you really need to focus on are attention and influence. In a recent blog, "The Future of Marketing is in the Cloud", I focused on the new marketing disciplines of Influencer Marketing and Content Marketing. Understanding real attention and influence is at the core of Influencer Marketing. There are various social listening tools and apps that provide quantitative metrics, but to a large extent knowing how many times your name appears on the internet or was repeated on Twitter doesn't really give you any useful information. You really need to know and understand how others react when your brand is being discussed. For our monthly #CloudInfluence rankings we use Appinions, a tool that can recognise and discard inconsequential mentions, and instead focus on the real level of active attention. For this to work you need to understand the context of any opinion and the rate at which others of real influence respond to, report on or repeat any opinion.
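To make the weighting idea concrete, here is a deliberately simplified, hypothetical sketch of the difference between counting raw mentions and scoring only the reactions of people who themselves carry influence. The categories and weights are invented purely for illustration; this is not the Appinions methodology.

```python
# Toy illustration: score attention by who responds, not by how often a name appears.
# All weights and categories below are invented for the example.

mentions = [
    # (source, source's own influence weight 0-1, did they actively respond/repeat?)
    ("trade-press journalist", 0.9, True),
    ("industry analyst",       0.8, True),
    ("anonymous account",      0.1, False),  # inconsequential mention, effectively discarded
    ("anonymous account",      0.1, False),
]

raw_mention_count = len(mentions)

# Only active responses count, each weighted by the influence of the responder.
weighted_attention = sum(weight for _, weight, responded in mentions if responded)

print(f"Raw mentions:       {raw_mention_count}")        # 4
print(f"Weighted attention: {weighted_attention:.1f}")   # 1.7
```

Two responses from influential sources end up counting for more than any number of throwaway mentions, which is exactly the distinction between volume metrics and real attention being drawn here.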
It's not a small task technically, but using natural language processing and a big data infrastructure, Appinions analyses the full text of the world's news, blogs, and social media every day. Going beyond just social media, it differentiates between authors (such as journalists who report opinions) and opinion holders (those whom the journalists quote), and uses natural language processing to analyse sentiment. Through this we are able to garner a more realistic indication of influence. Then we provide you with the results of #CloudInfluence each month for free! (… but you can always buy us a drink in gratitude if we ever meet). Aside from Compare the Cloud, other Appinions media partners include Forbes magazine, which uses the technology to publish a list of the world's most influential CMOs, and The Economist, which uses the technology to publish a list of the most influential economists. Here at Compare the Cloud we monitor and report on all the players and developments in the cloud arena. Differentiating between the winners and losers in this market is as difficult as ever. With Amazon yet to break out separate financials for AWS, and other vendors including a host of different kinds of software and services in their disclosed cloud revenue figures, it remains hard to make direct market share comparisons on the basis of revenue. Added to this are the market-distorting impact of widespread discounting and the use of credits to attract clients to a vendor's cloud platform. Even numbers relating to client cloud usage are hard to get hold of and to compare. In a market in its early growth phase, as cloud is, it is often mind share that has the greatest influence, and here at Compare the Cloud we hope that our #CloudInfluence rankings provide a useful additional perspective. Our #CloudInfluence tables are posted from the first Monday of each month, with a special report focussing on a specific industry arena released mid-month.

### March Top 10 #CloudInfluence UK and Cloud Bursters

Each month in our top 50 ranking for individuals we feature a host of influential executives who are the greatest movers and shakers on the global stage, but we also find it interesting to focus on the main cloud bursters – those appearing from nowhere to make an impact each month – and we also like to look at the top players closer to home, here in the UK.

March 2015 #CloudInfluence Cloud Bursters

The top cloud burster this month was Salesforce.com CEO Marc Benioff, who revealed that the company intends to further grow its cloud computing software and Salesforce ANT migration products to enable organizations to manage their customer service, sales, online marketing and even human resources. Also making a splash was Joe Montana, who appears in 19th in the overall rankings. While he's best known as the former quarterback of the San Francisco 49ers, it is his track record as an astute investor, with investments in Pinterest, Dropbox and other start-ups, that was the focus of coverage this month. Wouldn't we all like to have earned what he did as a player, but to have invested as well! John Kavanagh, Senior Director of Identity Management at Synacor, rose to prominence this month commenting on Synacor's Cloud ID SDK launch as the firm seeks to simplify mobile app development and the subscriber login experience. Encouraging users to think beyond familiar phrases when creating passwords, as well as encouraging firms to prompt them to do so, was researcher Professor Mohammad Mannan.
Then, asserting that technology played a defining role in shaping India's economy, Oracle India managing director Shailendra Kumar commented this month that the company was looking for bright people eager to lead cloud transformation across the country. [table id=15 /]

March 2015 #CloudInfluence UK Individuals

Yet again the leader of this month's #CloudInfluence rankings for the UK is IBM's Simon Porter – one of the most active execs that we know on social media. He is followed by Nitesh Patel, Senior Analyst for Wireless Media Strategies at Strategy Analytics. In a recent report he argued that mobile operators should use mobile cloud storage services not only to help drive loyalty and traffic, but also to reduce churn, rather than relying on cloud as a premium service and source of direct revenue. Next came Iomart CEO Angus MacSween, after Iomart was not only accredited for UK Government G-Cloud 6, but was also identified as one of the "London Stock Exchange's 1000 Companies to Inspire Britain" for the second year in a row – a report that celebrates the UK's fastest-growing and most dynamic small and medium-sized businesses. Next, commenting on the usage of private cloud and ASP-type services within the financial services industry and its slow adoption of pure public cloud-based services, was Richard Edwards, Principal Analyst at Ovum. Finally, with further comment promoting a report that found that the UK tech sector outperformed the wider UK economy in Q4 2014, was Tudor Aw [3], Technology Sector Head at KPMG Europe. [table id=14 /] If you feature in the March #CloudInfluence top 50 and would like to put a badge on your site, please use the following, or contact info@comparethecloud.net for other formats. NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### March Top 50 #CloudInfluence Individuals

The monthly #CloudInfluence ranking for organisations is usually a tussle between the main players for share of voice, with the same names appearing near the top each month, albeit in a different order. The monthly ranking for individuals, however, tends to see far greater change month to month, with few executives appearing on a consistent basis. In doing so it reflects the news agenda each month, with different executives capturing the limelight depending on what they have to announce and how much influence their opinions carry. Indeed, none of March's top five individuals appeared in February's rankings at all. Aziz Benmalek, general manager of Microsoft's Hosting Service Provider unit, took top spot in the individual rankings this month. He was quoted widely in regard to a Microsoft-sponsored study from 451 Research which focused on how enterprises are turning to cloud providers that offer more than hosting infrastructure services. The research found the cloud computing market has evolved, offering more opportunities for vendors that venture beyond providing the essentials of getting business clouds off the ground.
The survey of 1,700 hosting and cloud customers worldwide revealed that, in terms of IT spend, 70 percent of the opportunity for cloud providers now revolves around application hosting, managed data services like backup and disaster recovery, and security. Aziz commented on how the results show that IT organizations are seeking cloud solutions that go "way beyond [their] infrastructure needs". In 2nd place was Josh Petersen, director of Amazon Cloud Drive, as the firm announced two new plans for customers that want an affordable, secure solution to store unlimited amounts of photos, videos, movies, music, and files in one convenient place, thereby taking on the unlimited storage plans from rivals Google and Dropbox. Then came Gary Pearsons, VP and GM at Rockwell Automation, who was quoted in a Microsoft IoT announcement which focused on how Microsoft Azure IoT services, Windows 10 IoT for devices and Power BI are empowering firms to make better use of their data. In 4th was Michelle Bailey, Senior VP at 451 Research, who had been behind the Microsoft-sponsored study mentioned earlier, and in 5th was Mark Hurd, CEO at Oracle. Hurd delivered the opening keynote address at the Oracle Industry Connect conference on how the turmoil caused by cloud represents an opportunity for those who seize it. Only when we get past the top 5 do we start seeing some familiar faces from last month's rankings. In 6th place is Solgenia Group's Founder and CEO Ermanno Bonifazi [5], followed by Cloudian CMO Paul Turner [3] in 7th, while last month's leader Marc Rogers [1], CloudFlare's principal security researcher, comes in at 8th with a seven-place drop. In 9th is Chinese Premier Li Keqiang, as his government produced an "Internet Plus" action plan to propel the integration of the country's modern manufacturing with the mobile Internet, cloud computing, big data, and the Internet of Things. Completing the top 10 is Don Butler, executive director of connected vehicle and services at Ford, earning his rank by talking about the vehicle data that Ford acquires wirelessly through the cloud and how it can be used to remotely diagnose mechanical problems or alert owners and dealers to scheduled maintenance. [table id=13 /] Further down the rankings, industry heavyweights appear commenting on the cloud strategies of their respective firms: Marc Benioff [16], CEO of Salesforce.com, came in at 15th, moving up one place since February; Brian Krzanich, CEO of Intel, made 16th; and Larry Ellison, Founder and CTO of Oracle, 20th. A celebrity of a different kind appears in 19th. Joe Montana is best known as the former quarterback of the San Francisco 49ers, but he is also an astute investor - he has invested in Pinterest three times, Dropbox twice, and other large startups like CoreOS in the past. Kentucky senator and potential 2016 presidential candidate Rand Paul, in 40th, used a town-hall meeting with employees of Dynamic Network Services, a cloud computing company, not only to talk about the importance of technology leadership, but also to promote his other political views. A few familiar faces also appear, with Daniel Ives [14], an analyst at FBR Capital Markets, in 13th, commenting widely in the investment press on the tech market, and Simon Porter [25], European VP at IBM, as active as ever on social media, appearing in 41st. If you feature in the March #CloudInfluence top 50 and would like to put a badge on your site, please use the following, or contact info@comparethecloud.net for other formats.
NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### March Global #CloudInfluence Organisations

Competition in the cloud market is intensifying, but differentiating between the winners and losers is as difficult as ever. With Amazon yet to break out separate financials for AWS, and other vendors including a host of different kinds of software and services in their disclosed cloud revenue figures, it remains hard to make direct market share comparisons on the basis of revenue. Added to this are the market-distorting impact of widespread discounting and the use of credits to attract clients to a vendor's cloud platform. Even numbers relating to client cloud usage are hard to get hold of or to compare. This month, however, some light was shed in this area when an article from Business Insider disclosed usage figures from Microsoft that showed a significant rise in usage – albeit from a concerningly low historic level. Our March #CloudInfluence rankings provide a clear comparative perspective in terms of mind share and market influence; February's rankings are shown in square brackets where applicable. Microsoft [1] topped the table for a third consecutive month, just ahead of rival Amazon [4], which moved up to 2nd ahead of Apple and Oracle. After a burst of activity in February around its PWLC and Impact events, IBM [2] fell back to 5th in the rankings for March. AWS may currently be bigger than Microsoft in cloud revenue terms, but Microsoft is now growing its cloud revenue significantly faster, and at the current rate will catch AWS in the next few years. Microsoft has also previously shown low cloud usage rates, but these are now rising rapidly as well. In terms of share of voice and #CloudInfluence, Microsoft is a cut above its main rivals. Though you wouldn't want to write off any of the main players: AWS's lead is significant; Google's technology is incredible; and IBM's ability to come back against competitors has been proven time and time again over the last 100 years. However, in a market in its early growth phase, as cloud is, it is often mind share that has the greatest influence. And as the big data analytics behind the Compare the Cloud #CloudInfluence rankings show, it is Microsoft that currently appears to be the heir apparent to the cloud throne. The Amazon model has been to reduce prices in order to drive growth, even though, according to Citi's estimates, it has done so at a loss, even with minimal marketing spend. By contrast, rivals like Microsoft and IBM have invested heavily in cloud marketing, leading February's #CloudInfluence rankings as a result. March's rankings show that IBM has failed to sustain the momentum from its PWLC and Impact events and has subsequently fallen to 5th. The #CloudInfluence rankings analytics found that Microsoft's influence has continued to increase – significantly outpacing that of all its rivals.
By the time we publish April's rankings at the beginning of May, Amazon is due to have disclosed its AWS financials and we will know for sure how profitable the business is and whether it has the cash flow and reserves to take on deep-pocketed rivals like Microsoft and Google, or to fund and sustain a long marketing battle. [table id=12 /]

And the rest? Boosted by a Synergy Research Group report that showed it had retained its lead in the cloud infrastructure equipment market, as well as by its CeBIT announcement with Deutsche Telekom of a number of newly developed Intercloud-based services, Cisco [16] jumped to 6th overall this month. Next came new entrant Adobe Systems in 7th, which, despite negative coverage of its financial results, received much more positive coverage for the Acrobat Document Cloud, the latest of its professional cloud services, which include Adobe Creative Cloud and Marketing Cloud. Also, in a classic co-opetition arrangement, despite head-to-head competition between Adobe Experience Manager (AEM) and IBM Web Content Manager in the WCM space (as well as in analytics), Adobe Systems announced a deal with IBM's digital agency, IBM Interactive Experience, to build enterprise consulting capabilities for Adobe Marketing Cloud. In 8th we have Amazon's cloud operation AWS [5], followed by Amazon's main eCommerce rival Alibaba, and one of its main cloud rivals, Google [7], dropping three places to 10th. Users took to Twitter this month to vent their frustration at being unable to access Apple's iTunes and App stores, which suffered an unusually long outage – one that the company blamed on an internal technical error. After the Apple Store came SAP [10], dropping two places to 12th, followed by Red Hat Inc, which reported a 16 percent jump in quarterly revenue from increasing adoption of open-source software and its cloud-based products. Analyst firms IDC [23] in 20th and Gartner [11] in 23rd appeared alongside Deutsche Telekom and Yahoo! News [15], with rival analyst firm Forrester in 28th. Also in the top 30 were vendors Sony, Ericsson [19], Dell, Avaya, EMC and Nvidia [37]. Other notable entrants to the rankings were the China Securities Regulatory Commission and the People's Republic of China itself, following the Regulatory Commission's announcement that Meng Kai, the former chairman and biggest shareholder in Cloud Live Technology Group Co., should fulfil his promise and return to resolve the firm's repayment problems. In 40th was The Pirate Bay (TPB), which overcame a move by Swedish police to take the site down late last year by simply reappearing the following month. Also, thanks to a new distribution partner, CloudFlare, which provides a service that manages the dialogue between users and website hosting servers, TPB has succeeded not only in reducing bandwidth burdens and protecting itself against DDoS attacks and other threats, but also in hiding information from ISPs, making it much more difficult for them to block access to its main portal (thepiratebay.se). Therefore many of the existing ISP-level blocks no longer work, making TPB even easier to find than it already was. Check back tomorrow for the Top 50 Global #CloudInfluence Individuals for March, and on Friday for the UK #CloudInfluence and Cloud Bursters tables. If you feature in the March #CloudInfluence top 50 and would like to put a badge on your site, please use the following, or contact info@comparethecloud.net for other formats.
NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The league tables provide a snapshot, taken at a particular point in time, of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.

### The Future of Marketing is in the Cloud

Thankfully I am recently and happily married (for the second time), as the thought of ever returning to the dating scene fills me with complete dread. I wasn't much of a ladies' man in my twenties, or again in my late thirties after my divorce. Maybe I just didn't understand women enough or maybe I simply wasn't appealing enough – or both. Similarly, firms have to understand their clients and how they can make their product or proposition appealing to them – but at least they have a selection of new marketing apps and marketing clouds to help them. Marketing is on the ascent. It is rapidly becoming the most powerful and resource-rich function in any business. As the function responsible for creating and sustaining a long-lasting relationship with the most important asset of any business – its customer – it is becoming the function that drives all others. In the new hyper-connected, mobile, always-on world, consumer expectations are changing and marketers need to shift their focus from pushing messages at people to engaging them in an ongoing conversation or relationship. Here's how:

1. It's not a one-night stand – it's a relationship

Firms are moving away from a focus on transactional selling towards relationship selling, where engagement is expected to grow deeper and endure over time. It requires listening, nurturing and care, along with intimacy and trust. There are a limited number of 'touch point' opportunities in which, if you are able to address your customers' needs in a meaningful way rather than put them on hold while telling them how important their call is, you can develop your relationship. Successful relationships are built on mutual appeal and shared values.

2. New marketing funnel

When dating it often felt as if all the most appealing girls were already taken. In marketing, early engagement in the decision-making process has always been essential, but the game has changed. According to Lori Wizdo, Principal Analyst at Forrester Research, "…today's buyers might be anywhere from two-thirds to 90% of the way through their journey before they reach out to a vendor." Traditional marketing techniques are at least in part becoming obsolete. Today it is all about influence and content. Influencer Marketing: understanding whose opinions matter most and encouraging them to use their influence to advocate your ideas and brand. Content Marketing: understanding the needs of each individual client or cohort, and tailoring content, responses and offerings to each one.

3. Agility and Authenticity

In the past, marketing departments were set up to conduct campaigns and launches, each of which featured planning, execution, analysis and then a feedback loop to help refine or adjust the next campaign. Campaign workflows were rigid and siloed in this way. Nowadays everything is in real time, with constant dialogue, constant listening and instant response. And analytics is no longer just an up-front or follow-up exercise – it is a constant operation within an iterative and flexible workflow.
Consumers also have an expectation not only of immediacy, but also of intimacy. Where once marketing was about creating a myth and selling it, it is now about finding a truth and sharing it. It's time to ditch the old chat-up lines and the false claims that might once have impressed your date. Anyone can discover almost anything instantly online and circulate it to an audience of millions. You need to be authentic, to live by your values and to be confident enough to share the truth - customers appreciate it.

4. Invest in the long term

Engaged customers are like any other asset that generates value in the form of return on investment (ROI). If you understand what drives engagement and what makes one company preferred over others – and you can present it in a compelling way – then any CEO or CFO will pay attention. Often the most important factor driving engagement is a culture of customer centricity. Engagement plays a central role in the culture of any sustainable and successful business and needs to be reflected in areas of the business far beyond the marketing function, from incentives to salaries and promotions. Marketing no longer sits in a corner and runs campaigns; it interacts with almost every part of the business, from IT to customer service and logistics. One of the best investments firms can make is in an end-to-end marketing cloud – many are now available. These allow firms to understand their clients, manage their client engagement and integrate their business processes to support this. The future of marketing is most definitely in the cloud. We are aiming to return to the topics of Influencer Marketing and Content Marketing in future blogs, but in the meantime keep your eyes peeled for our next monthly #CloudInfluence rankings.

### Brain Swapping App Beta Released

1st April 2015 - A new mobile-connected cloud application, Brain2U, has been released on public beta. The app, developed by BrainSucka Labs, uses a pioneering new technology to upload a user's mind into the cloud and download it into a friend's head. The technology uses a specially designed headset transmitter which connects to a smartphone via Bluetooth. Both users require a headset to be active for the transfer to take place. This technology is not without controversy. In October last year early previews were withdrawn after one user attempted to swap brains with his cat. Although successful, the user later had to be rescued from a tree by firefighters. "Safety is now our top concern. No more incidents with beloved pets and wild squirrels. The advent of Brain-As-A-Service (BAAS) is quick, reliable and a lot of fun. Only last night I swapped brains with my twin brother - people literally couldn't tell the difference!" said CEO of BrainSucka Labs, Marko Smallbottom. Have you tried the new Brain2U app? Headsets retail for £99 and the app can be downloaded from smartphone stores. Let us know your experiences below.

### Is an SLA an insurance policy?

It's important that every company, especially service providers, regularly reviews and edits its contract terms and conditions. It sounds like an obvious thing to do, but you'd be surprised how little this is actually done. I've been seeing more and more companies relying on the service level agreement to protect the company's interests; therefore a contractual update in this area must reflect any industry and company changes, ensuring the ongoing protection of the business.

What is an SLA (service level agreement)?
A service level agreement is one element within a full service contract, outlining specified terms of service. The SLA ensures that all parties involved are informed of their responsibilities and of the penalties incurred by breaking them. Responsibilities are typically outlined in measurable terms, such as:

The percentage of time a service will be available
The schedule of notification of hardware maintenance
Response times after logging a ticket
Fix times after logging a ticket

At first glance an SLA seems like a fool-proof way of guaranteeing a level of service. However, when you explore the fine print, the agreement tells another story.

Breaking the agreement

An SLA also details what penalties are in place if the provider fails to meet their agreed requirements. These penalties not only vary per provider, but also per contract. Traditionally, the provider gives a percentage of the overall monthly contract cost back to the user… how much depends on the level of downtime that occurs.

Monitoring

After an outage, the correct monitoring tools must be in place in order to determine how much compensation is due. If there are no tools, then there is no compensation. Does the penalty cover all servers, or just one? There is no point in having an SLA on a server that is hardly used; likewise, is there monitoring on all servers? How frequently is server availability measured? If the monitoring software checks the servers every 15 minutes, this may not be accurate, as in between these checks the server could be down.

Does the SLA cover all financial damage? It is unlikely that an SLA will ever cover the financial damage you could incur from a major outage. Most insurance companies will not cover hosting companies for loss of earnings or consequential loss for their customers. So the best you can do is make sure the penalty looks reasonable in line with the value of the contract you hold with the organisation. Make sure there is some pressure for the provider to get the service up and running again as soon as possible. Some companies' penalty fees are insignificant; Microsoft 365, for example, limits its liability to just $5.

### Andrew Bartlam - VP EMEA Business Development & Alliances - CipherCloud - Live from Cloud Expo 2015
### Nick Beale - Business Development Manager - Hyve - Live from Cloud Expo 2015
### Ian Stewart - Account Director - Britannic Technologies - Live from Cloud Expo 2015
### David Terrar - Founder & CXO - Agile Elephant - Live from Cloud Expo 2015
### Frank Jennings - The Cloud Lawyer - Wallace LLP - Live from Cloud Expo 2015

### BYOD, is the big bad wolf dead?

Bring your own device (BYOD) is far from a cutting-edge term; in fact, it's been on the table for a while now, and it all started with the boss when they brought their nice new shiny toy into work that they got for Christmas and announced this is what they would use from this day forward. The gospel according to BYOD had been born. It became the universally accepted hall pass for 'Let's sidestep the policies we once insisted were important. Never mind that standardisation malarkey, details, details. Let's not worry about any of that, we have new shiny toys. That's all that matters'. Of course, it's not just the boss any longer; we're all guilty. As the concept of BYOD began to grow a set of legs, the era of the smartphone and the tablet emerged and eye contact slowly became a thing of the past.
In May 2013, Gartner published an analyst view that, by 2017, up to 60% of businesses would have adopted a BYOD policy of some description. BYOD as a strategy may well prove to be an answer to many questions being posed to businesses the world over, but do we now consider it a safe option? In this white paper, we examine today's rationale for BYOD and try to answer the question: is BYOD really a viable option, and has that big bad wolf been tamed and transformed into an 'app-shaped, pocket-sized pampered pooch'? It's a very good question.

Contrary to the hype, the speed of digital surprised everybody

There can be no argument that the digital world is rapidly changing at unprecedented speed. For many CIOs it really is a question of adapt or be left behind by the business they aim to support. We no longer care what the badge on the software says or who the vendor is in most cases. Many business users have grown up knowing nothing else but digital technology and all that it offers. They see the IT market as a supermarket, its shelves full of packaged services and products that they can choose for themselves. After years of extolling the virtues technology could potentially deliver, the hype has become the reality. An IT department is increasingly being perceived as an organisation that introduces unnecessary delay. Obstacles previously regarded, mainly by a protective CIO, as barriers to technology adoption are either being brushed aside or aggressively challenged by an ever-savvy user community. The fact is that the digital children are just as familiar with the new technologies as many of their IT counterparts, if not more so. You're more likely to hear 'I'm going to anyway', rather than 'May I?', because that way the job gets done – but at what cost? Download CensorNet's new white paper 'BYOD – is the big bad wolf dead?' here.

### The Rise of Linux on Power

Whilst I cannot profess to be particularly trendy (especially in the eyes of my kids), not being a millennial and all, fashion for me is a realm of the utilitarian: suits for work and jeans for pretty much everything else. Occasionally, though, it's nice to be able to be on trend with something, and with Linux on Power I get that opportunity. Linux has always seemed to be "the trendy surfer" of the range of OS options that we come across. Recently though, it seems to be, like me, coming of age, stepping from the cool fringes firmly into the mainstream. The rise of Linux-based applications that now appear at the core of business critical infrastructure is testament to the relevance of the OS - especially in the realm of HPC and web based applications.

So what does this mean for an IBM Power-focused Business Partner like us at Maple Computing? Well, the advancement in IBM's Power 8 chipset and platform, and its incredible benchmark figures in CPU cycles and application multithreading, make Power an ideal platform – especially for businesses running such HPC and web based applications, with their inherent low-latency and high-performance nature. We are finding that businesses that demand decent transaction performance seem to be addressing the problem by throwing large numbers of x86 based servers at the issue in an effort to provide the muscle required. Examples of such businesses could include a web facing transactional business, or a service business, which requires high transactional throughput.
With this approach you require a steady stream of servers to keep up with the demand as well as the integration skills, the space and the funds to do so. Stop, think and take a fresh approach to HPC and web based applications: Such Linux solutions are increasingly ones that IBM have already certified for Linux on Power. They have been developed to utilise the unique advantages of Power 8, the high performance chipset and, perhaps more importantly, the multithreading capabilities it brings through SMT8. Customers can typically consolidate 30 plus cores of x86 workload onto just a handful of Power 8 cores, saving money, power, and rack space whilst lowering licensing costs. Perhaps just as important is the ability to deliver better performance in a scalable architecture that can grow in line with the business demand incrementally rather than requiring regular large-scale purchases. Having grown up with IBM UNIX, as well as with AS/400 in all its previous incarnations, we already understand the incredible reliability, scalability and performance that these platforms offer. IBM’s PowerLinux solutions are tuned to extend the performance of POWER to key emerging workloads running on industry-standard Linux, which will give us the tools to compete not just at the high-end for HPC analytics etc., but also right into the congested heart of the data centre. Featuring Power technology in dense, rack-optimised form factors, these servers run industry standard Red Hat, SUSE and Ubuntu Linux distributions, and are priced to provide more competitive, reliable and scalable alternatives to commodity x86 scale-out options, and we even have the open virtualisation capability scale-out solutions! IBM has also embraced the idea of open distribution and standards by opening up its chip architecture via The OpenPOWER Foundation. This is a collaboration around Power Architecture products initiated by IBM and announced as the "OpenPOWER Consortium" on August 6, 2013. The plan is for IBM to open up the technology surrounding its Power Architecture offerings - including the processor specifications, firmware and software - and to offer this on a liberal license using a collaborative development model with its partners. In my opinion this is a very clever move - opening up its IP to enable the server vendor ecosystem to build its own customised server, networking, and storage hardware on OpenPOWER for the data centres of the future and for cloud computing whilst promoting the IBM chip technology. With Google reportedly taking up the platform for provision in its own data centres, this could prove prolific and I would guess that IBM can rest safe in the knowledge that most businesses would be reluctant to run mission critical apps on freeware, without paid support of some description. This means a secure future for IBM and its POWER-focused Business Partners and a compelling, competitive, scalable and reliable platform for their clients. ### A Moment of Cloud Clarity and Vision There are only a few times in your life when you get that epiphany moment, that time when you think to yourself “I get it, I understand and I don’t know why I didn’t see this before.” I had one of these recently and I want to share it with you all. Amongst the very busy schedule at IBM’s largest get together in Las Vegas in February, there was an excellent Executive Exchange that I was invited to attend. What is an Executive Exchange? 
Well, some of IBM's CXOs explained their visions and strategies for the future to an audience of like-minded individuals. So, let me set the scene with a short description of the presenters first:

Linda Bernardi – Chief Innovation Officer, Cloud and IoT
Daniel Sabbah – CTO and GM, Next Generation Platform
Jeff Smith – Chief Information Officer
Kevin Eagan – Vice President and GM, Digital Channels

This was an impressive line-up and I was humbled to sit at the very front table, right by the stage. Now, IBM has taken its share of criticism of late in one way or another, but I have to say that what I listened to over the next two and a half hours was nothing short of that epiphany moment. Linda was excellent with her introduction to the disruptive technologies and innovations that were her vision of the future, together with Daniel on stage describing, and ultimately leading to, a very profound statement – "taking risks is by far better than not doing anything at all – Execute!" In the context of cloud technology, in my opinion, this is 100% right. We are in a state of change that continues to evolve rapidly, in an industry where a 14-year-old boy can be a multi-millionaire overnight by creating a SaaS application, because he can and not because he doesn't try. We have everything at our fingertips to promote this change; we just need to be bold and confident that we are part of the change – disruptive and not disrupted. Our perception is key. There are three key points here:

Think the Unthinkable. If disruptive changes are inevitable, it's crucial to start planning for the post-change future and do whatever is necessary to survive. Cloud is a disruptive technology.

Be Bold, Take Risks. Major changes to business models require considerable experimentation to get right! We are in uncharted waters with cloud technology and there is not one person on this planet who can tell you exactly what the outcome will be over the next ten years.

Develop a Strategy for Survival. Every business can benefit from embracing disruptive innovations – not just start-ups! Never has there been a better time to use disruptive technology to embrace change. The speed of innovation is absolutely incredible at the moment, and using disruptive innovations is the only way to survive.

Now, I can hear you all say – "What on earth is he talking about?" Well, let me try to explain with an analogy, based around disruptive thinking, that Jeff Smith told far better than I will here.

"Any customer can have a car painted any colour that he wants, so long as it is black" – Henry Ford

Out-of-the-box thinking is not a new concept. Back in the early 1900s, the first automobiles were hand painted, and painting or re-painting a car took many weeks. This caused issues with distribution as demand for the Model T grew, and it was just accepted that this was the case. Cars in the early 1900s were basically motorised carriages, painted with the same oil-based techniques used for horse-drawn carriages. The painting process was a very complicated and expensive procedure and the drying time took several weeks. On top of this, the paints couldn't stand up to time and would end up turning yellow as there was no binding medium, so every time a colour would fade or yellow, it needed to be repainted.
That long, expensive process is what prompted Ford to develop a simple but effective change: he started using the binding agent cellulose to paint his cars, in much the same way it was used for glazing pottery. The result was miraculous; it didn't take anywhere near as long to dry (from 336 hours to 15), which also brought painting in line with the assembly line process. There are various rumours that the staff painting them had to be naked to achieve the finish required – what an unusual job that must have been. So, if this idea had not struck Ford, the automotive industry may well not have taken off as it did and given rise to where we are today – and what a pioneer Henry Ford was.

Now back to cloud and disruptive technologies. During the afternoon we listened to each of the presenters and their visions of how to survive within this very disruptive industry, each having a profound message and strategy for us to take away, and I will list a few of my favourite quotes from that session:

Jeff Smith – "The wider your vision the clearer your focus."
Kevin Eagan – "Cultural Transformation, Customer Experience, Digital Go To Market and Analytics are key to the future of your business."
Daniel Sabbah – "Taking risks is better than not doing anything – Execute."

Businesses should not fear the future of technology. To sit back and say "I have no need to change" or "I'll wait for a while and see how technology will shape out and then make a decision" or "the cloud is not secure" is not a smart business decision. By doing nothing in this transformational age of cloud technology you will simply become extinct, and if you have this train of thought you are a dead man walking. What I took away from the afternoon is much simpler. I came away with the confidence that IBM has the leadership and the right strategies in place to transform and evolve into a truly open, up-to-the-minute giant that has rolled up its sleeves and got to work. Yes, they are undergoing change, but isn't every cloud technology vendor? This is not the IBM of old; this is an IBM that has the vision, the experience and the right key strategists to spray paint a new colour for cloud tech that will see them through to the future. Be bold, be confident and plan for the inevitable change, not only from a technology perspective but also from a business understanding.

### Russia – The Land Of Opportunity for Cloud

There has been much speculation surrounding Russia recently, particularly in regard to regulations on data. However, the changing country-based rules surrounding Russian data sovereignty make very interesting reading, especially if you are a cloud service provider. RAPSI News reported the following on the situation in Russia from September 2014:

In December 2014, the lower house of the Russian parliament adopted amendments making it mandatory for personal data to be stored on domestic servers by September 2015 – instead of September 2016 as previously foreseen by legislation. A law requiring internet companies to store personal data of Russian nationals within the country's borders will take effect on September 1st 2015, according to the bill passed by Russia's upper house of parliament, the Federation Council. In July 2014, President Vladimir Putin signed a law obliging foreign online companies that sell plane tickets and consumer goods in Russia, as well as social networking sites, to store Russian personal data only in Russia.
Foreign companies such as social media giants Twitter and Facebook, as well as highly trafficked travel websites like Booking.com, will have to open offices in Russia to be able to continue offering their services to Russian residents. In October 2014, the Foreign Investment Advisory Council (FIAC) asked Prime Minister Dmitry Medvedev to amend the data storage law. FIAC had one primary motive: to suspend the bill that would bring the date forward from September 1st 2016 to January 1st 2015. They succeeded in their request, and September 1st 2015 was settled on instead. The extra eight months is key for FIAC to be able to adjust country-wide, in-house provisions in time for the bill to come into play. Apart from additional spending that could run into the tens of millions of euros, the earlier deadline could have led to the closure of law-abiding companies, the investors said. After consultations with the companies, the lawmakers decided to move the date for the enforcement of the law to September 1st 2015.

Concluding RAPSI's original reports, we can see that initially the law was to take effect on September 1st 2016, but after a motion to bring the date forward to January 1st 2015 was stopped by FIAC, a mid-way date of September 1st 2015 was set. This is the date we are now looking towards. From February 2015, RAPSI issued further updates on the situation in Russia:

In 2014, Russian communications regulator Roskomnadzor filed 28 claims with the courts to block 96 websites. On Thursday 12th February 2015 it was announced that access to over 60 websites would be blocked for violations of legislation on personal data. The order to close the websites was passed on January 29th by Moscow's Simonovsky District Court at the request of the regulator. This was finalised, and came into effect on March 1st 2015.

So what does this mean if you live or have a business in Russia? Well, you must comply with the personal data law by the 1st of September, period! Access to certain sites has already been blocked, and come the 1st of September very hefty fines and penalties will be given to any Russian firm caught storing its information outside the Russian regions. There are just under six months left before the law comes into effect. Local service providers need to be utilised for the hosting of data and associated services, without exception. This creates somewhat of an issue, so who do you use for your data storage and cloud provisioning?

In Russia, the service provider marketplace is inherently different from the rest of the world. There are very few MSPs operating, and the limited service providers who are operating are extremely large due to the nature of the way business is conducted. IXcellerate are unique to the Russian market, backed by strong UK ties and immense experience from founder Guy Willner, the co-founder and former CEO of IXEurope, and former President of Europe for Equinix Inc. The IXcellerate Tier 3+ Moscow One data centre opened in 2012: a standalone data centre located within the MKAD ring road of Moscow, set over a 15,000m² campus.
IXcellerate Moscow One provides a bespoke solution aimed at fulfilling clients' unique data centre requirements. Unparalleled as a carrier-neutral data centre operator in Moscow, IXcellerate's Moscow One is comparable in quality to data centres in America and Western Europe. It offers co-location services including private and secure cages, combined with on-hand technical support 24 hours a day, all year round. Russia is the land of opportunity for hosting providers, even with forced legislation based around data sovereignty. It may seem a risky venture to host in Russia, but backed by one of the most respected and experienced individuals within the industry, IXcellerate really does provide unparalleled access to UK and local hosting partners from within Russia. A high reward comes to those who execute in this territory over the next 12 months; legislation is fixed and strong gains will be achieved once you look through the iron curtains.

### Cloud Gaming #CloudInfluence Special Report

Cloud gaming is enabling a proliferation of gaming options - everything from set-top boxes and high-end game consoles to gaming PCs or even mobile devices. Using big data analytics, we take a snapshot across social media, blogs and print media to find out which organisations and individuals currently have the greatest #CloudInfluence in the field of cloud gaming.

The movers and shakers in cloud gaming

With supporters of each platform and each streaming technique now advocating different approaches, there is a massive tussle for hearts and minds as the main platform providers compete to win the loyalty of developers and gamers in the new era of cloud gaming. The role of influencers in helping to promote the viewpoints and messages of each camp is key. The following big data analysis, across social media, blog and print articles, looks at which organisations and individuals have #CloudInfluence in the field of cloud gaming: [table id=10 /]

The highest profile firm is Microsoft, with current debate centring not only on its successful Xbox platform, but also on the new Windows 10 OS. Initially coverage was far from positive, focusing on notorious hacking group Lizard Squad, which made distributed denial of service (DDoS) attacks against both PlayStation Network and Xbox Live at Christmas, disrupting online game services for thousands of new PS4 and Xbox One owners. Lizard Squad then made a further DDoS attack in mid-February. Since then Microsoft has announced that Xbox Live will be included with Windows 10 and that it will be free. This announcement didn't go without comment, as many Xbox owners have been paying subscription fees for years, but it represents a strategic land grab for the PC gaming market.

Next, with the "Q4 2014 State of the Internet – Security Report" from its Prolexic Security Engineering and Research Team (PLXsert), was Akamai. The PLXsert team are experts on DDoS protection and cloud security services and strategies, and they provided analysis and insight into the global attack threat landscape, which was reported widely. Third in the rankings was Nvidia, which announced impressive results, launched its new Shield Game Console, and made its Grid Cloud Service official. Rumours of a spring release date for Street Fighter 5 boosted Capcom to fourth when online retailers such as Amazon and Best Buy listed its availability as March 31, 2016.
Sony slipped to 5th in the rankings with little in the way of major news since Christmas, and was followed by Ravello Systems, which reported that 888 Holdings PLC, an online gaming entertainment and solution provider, uses Ravello's cloud service. Nintendo was down in 7th with Spacetime Studios in 8th. Founded by Sony Online Entertainment veterans, Spacetime Studios announced "Call of Champions", a Multiplayer Online Battle Arena (MOBA) game in which each battle is expected to last for just five minutes. This is an attempt to distil and simplify the MOBA game genre into bite-sized portions that are more suitable for mobile gamers. In 9th was Vault Networks, the cloud computing, web hosting, and colocation provider, which announced a partnership with Multiplay, a UK-based gaming services company. And finally in 10th was the notorious Lizard Squad themselves.

Interestingly, many large gaming corporations such as EA did not make the #CloudInfluence top ten organisations due to a lull in announcements in the post-Christmas period. The nature of the gaming industry tends to lean towards huge PR pushes before big retail periods such as Christmas, with a quieter media period through the start of the year when retail is at its lowest. This explains why many of the expected companies did not make the top ten, as they have not had a media presence through the period that was analysed. [table id=11 /]

As far as the rankings for individuals go, Phil Spencer and Larry Hryb from Microsoft topped the charts, commenting on the Xbox Live DDoS attack, on the future of the platform and on its extension to Windows 10. Jules Urbach from OTOY was in third, commenting on the ground-breaking holographic light field technology used in Batman and the 360-degree high-definition virtual reality live streams OTOY aims to use to deliver an unparalleled experience for NHL fans. Next in the rankings was James Mielke, an executive at the cloud-gaming startup Shinra Technologies, who was quoted widely commenting on how Shuhei Yoshida, president of Sony's worldwide game studios, has helped propel the company's game division toward a surprising turnaround. Then came Patrick Walker, an analyst from EEDAR who is widely quoted on gaming issues, and Shaun Himmerick, Mortal Kombat X executive producer, who accidentally gave gamers the impression that the fighting game wouldn't require Xbox Live Gold or PlayStation Plus. "PS Plus is not needed for any feature of MKX," Himmerick tweeted, adding that the same was true for Xbox Live Gold. After sparking much excitement across social media, he later indicated that his initial statements over Twitter, since deleted, had been a misunderstanding.

To find out more about how Cloud Gaming is evolving or to meet the key players in the local market as they gather in London on April 13-14, register here for Cloud Gaming Europe - http://www.videogamesintelligence.com/cloud-gaming-europe/

NOTE: the Compare the Cloud #CloudInfluence league tables are based on a broad big data analysis of all major global news, blogs, forums, and social media interaction over the past 90 days (in this case from the immediate pre-Christmas period to mid-March). The league tables provide a snapshot taken at a particular point in time of the respective influence of both organisations and individuals over the last quarter. Companies that were particularly active in the given period will feature more prominently.
This special report is intended to provide readers with unique insights into who the most influential organisations and individuals in the Cloud Gaming arena really are at this time.

### The Evolution of Gaming – A Cloudy Future

The gaming sector has been dominated for years by successive generations of console platforms from Sony, Microsoft and Nintendo. The previous generation of consoles was the first for which online gaming became a real factor, with Xbox Live and PlayStation Plus dominating. Each console had a truly differentiated set of offerings, and the two networks vied for loyalty and leadership just as hard as their many gamers do when playing against each other. However, with the latest generation of consoles, the Xbox One and PS4, the emphasis has shifted more towards online or cloud-based gaming, with far less differentiation in the offerings that the two dominant players provide. While previously it would have been the three dominant console providers we would focus on, Nintendo has fallen off the radar after finding less traction with its latest generation consoles and being late to market with the Nintendo Network.

When the console was the key gaming device it was easier to differentiate - especially when a radical new technology was introduced, as was the case with the original Nintendo Wii. However, as the emphasis in gaming shifts more to the cloud, such differentiation is less easy to create and sustain. It is becoming easier for rivals to copy their competitors' most popular features. To a large extent the aforementioned consoles and their networks (Xbox Live and PlayStation Plus) use the cloud simply to enable massively multiplayer online games (MMO or MMOG). However, this still requires the local use of consoles as well as the games that run on them - both at relatively significant cost. The cloud is, however, enabling a far greater revolution that could eliminate the need for consoles entirely. Cloud allows access from an array of different devices, with fragmentation being driven at one end of the market by PC-based Steam machines that are seeking to rival the console platforms, and at the other end by thin clients that are used for either video streaming or file streaming:

Video streaming: Similar to video on demand, video streaming uses a thin client to access a remote system where the actual game is stored, executed, and rendered before the resulting video is returned. The thin client can be any device capable of sending control signals and accepting the streaming video in return. With the best current high-performance broadband such remote processing is now starting to rival disc-based consoles, but lower bandwidth connectivity can still cause too much of a lag for certain game formats.

File streaming: Conversely, file streaming progressively downloads elements of the game file, which are then run on the thin client. A small initial part of a game is downloaded, with the remaining game content downloaded on demand. Local processing eliminates lag, but requires a device with the processing capabilities to operate the game, and enough local storage to cache the elements of the game files.

Cloud gaming is enabling a proliferation of gaming options, from high-end game consoles to set-top boxes or smart TVs, as well as gaming PCs and even mobile devices.
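To make the trade-off between the two techniques more concrete, here is a minimal sketch of the kind of decision a client device or launcher might make. It is purely illustrative: the function name and every threshold are assumptions made for the example, not figures published by any gaming platform.

```python
# Illustrative only: rough choice between video streaming and file streaming
# for a given device and connection. All thresholds are assumed values.

def pick_streaming_mode(bandwidth_mbps: float, rtt_ms: float,
                        capable_cpu: bool, free_storage_gb: float) -> str:
    VIDEO_MIN_BANDWIDTH_MBPS = 15.0   # assumed floor for an acceptable video stream
    VIDEO_MAX_RTT_MS = 60.0           # assumed ceiling before input lag is noticeable
    FILE_MIN_STORAGE_GB = 10.0        # assumed cache needed for progressive downloads

    if bandwidth_mbps >= VIDEO_MIN_BANDWIDTH_MBPS and rtt_ms <= VIDEO_MAX_RTT_MS:
        return "video streaming"      # game runs remotely; the thin client only decodes video
    if capable_cpu and free_storage_gb >= FILE_MIN_STORAGE_GB:
        return "file streaming"       # game runs locally; content is downloaded on demand
    return "full local install (or a console) is the safer bet"


# A device on an average 26.3 Mbps connection with a modest round-trip time:
print(pick_streaming_mode(bandwidth_mbps=26.3, rtt_ms=45,
                          capable_cpu=True, free_storage_gb=32))
```

In practice the choice is rarely this clean – lag tolerance varies by game format, as noted above – but it captures why the two models coexist rather than one displacing the other.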
Challenging the current generation consoles from Sony, Microsoft, and Nintendo are not only a number of Steam Machines from Value’s hardware partners, but also the new Nvidia Shield console, Razor’s cloud-based Forge TV Android set-top box, and the Amazon TV set-top box that can all play games. There is also a growing range of games for tablets and smartphones on both Android and iOS. I hope that this has set your mind onto the cloud gaming playing field, and that you are now prepared for this month's #CloudInfluence Special Report. March's report will be focussing on Cloud Gaming, ahead of the Video Games Intelligence Cloud Gaming Expo next month. If this has sparked your interest in Cloud Gaming, you can register for the Expo here: www.videogamesintelligence.com/cloudgaming. Let us know if you will be attending, Compare the Cloud will be there, and we'd love to see you all there too. ### What Does PLMA Stand For? PLMA is quite frankly a made-up acronym because Proprietary Latency Monitoring Appliances was far too long for the title. What we're really here to talk about is how InStream is changing the latency monitoring world by leveraging cloud technology.  For most CIOs their main task is to deliver as much in terms of bang for the buck as they can for their businesses – and the evolution of cloud computing has had an enormous impact on the price performance continuum. Vast data centres packed with racks of commodity servers are now providing various levels of computing from Paas, to IaaS to Saas at a fraction of the cost at which it was previously possible. There are however a few common concerns that are hindering the widespread adoption of cloud. The main concerns are privacy and security, knowing where you data is and how safe it is from accidental or malicious exposure or corruption. Another concern is the reliability, robustness and security of the systems as well as the networks that connect often increasingly complex hybrid cloud environments. Cloud environments are prone to down outages, just as the internet links that connect them are prone to congestion. There are a few CIOs that work in sectors where such concerns are paramount. Regulated industries cannot afford to compromise at all on security, mission critical systems cannot afford to compromise on availability and many time-sensitive applications simply cannot afford to compromise on even the slightest time lag or latency. In these industries cloud still has a role to play, but the data centres must have high availability, secure systems with dedicated low latency communications. High availability and performance can be achieved by using a premium cloud provider such as SoftLayer, which manages its cloud services to the highest level of security, reliability and resilience. SoftLayer also provides free secure connectivity between its growing number of data centres across the globe. Monitoring things such as latency is still important though, especially across the all-important final link (e.g. between a trading system and an exchange), but such latency monitoring has, until now, been very expensive. The telecoms industry was the first to take latency seriously and developed high specification, dedicated, proprietary appliances. These same appliances were later adopted by financial markets firms to monitor the latency on their trading systems. Both industries paid a fortune to do so. A new wave of time sensitive industries (such as cloud gaming) has so far balked at the cost of such appliances. 
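Conceptually, the measurement itself is straightforward: it amounts to comparing timestamps captured at different points along a message's path and summarising the differences. The sketch below is illustrative only – the message IDs, figures and field names are invented, and it assumes the capture points share a synchronised clock; the hard and historically expensive part is capturing accurate timestamps at wire speed, not the arithmetic.

```python
# Illustrative only: deriving per-message latency from time-stamped data
# captured at two points (e.g. order sent vs acknowledgement received).
from statistics import median

sent_at = {"ord-1": 0.000120, "ord-2": 0.000345, "ord-3": 0.000410}    # seconds
acked_at = {"ord-1": 0.000410, "ord-2": 0.000520, "ord-3": 0.001900}   # seconds

latencies_us = [
    (acked_at[msg] - sent_at[msg]) * 1_000_000   # convert to microseconds
    for msg in sent_at if msg in acked_at
]

print(f"median latency: {median(latencies_us):.0f} us")
print(f"worst latency:  {max(latencies_us):.0f} us")
# A real monitoring service would track high percentiles (p99 and beyond)
# across millions of messages a day and alert when an SLA threshold is breached.
```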
Surely, with cloud reducing the price performance continuum everywhere else, they felt that there must be a cheaper and easier way of measuring latency – and there is. A revolution in latency monitoring is now happening, and it is starting in financial and capital markets. To monitor and manage their transaction latency, traders, brokers, exchanges and service providers have traditionally not only had to use expensive proprietary appliances with expensive contracts, but have also needed dedicated staff to manage the latency monitoring tools, and have had to install and manage devices at each trading node. On top of this, a recent move from 1G to 10G networks has necessitated an expensive upgrade cycle for such appliances, leaving firms either maintaining equipment no longer supported by the vendor, or spending hundreds of thousands of pounds on costly replacement appliances.

However, while the trading systems are especially latency intolerant, the monitoring systems need not be. The monitoring systems simply need to collect, compare and analyse time-stamped data from various sources. Integration Systems has pioneered a new generation of latency monitoring systems that leverage cloud technology, and they are able to provide the service as a lower cost commodity to their clients. Its InStream managed latency monitoring service allows its financial markets clients to actively and accurately monitor and report trade workflow information, as well as SLA- and revenue-impacting latency issues, but instead of employing expensive appliances, it does so via a cloud-based service. InStream provides trading firms with a unique mechanism for monitoring latency across tertiary and metropolitan area connections, enabling trading firms to take latency monitoring beyond the confines of the co-lo to a comprehensive view of latency across a geographically distributed trading infrastructure, all within its cost-effective framework. InStream gives clients the ability to visualise and explore trading and pricing environments and paths in a new and powerful manner, and the best part is that this is available in an OPEX model, not as a large one-off capital cost.

With cloud rapidly becoming accepted as the natural delivery model for all kinds of data services, from corporate networks to streaming home video, it is just a matter of time before other expensive silos are blown apart, just as InStream is currently doing to the world of expensive dedicated proprietary latency-monitoring appliances.

### CensorNet to Shake Up Stagnant Web Security Market

CensorNet has had myriad announcements recently, so we sat down with CEO Ed Macnair and his new Director of Sales Engineering, Alex Kurz, to hear more about the strategy and big developments in 2015. CensorNet, the next generation cloud security company, has made a stand this year to revive the 'stagnant' web security market. The appointment of Alex Kurz comes only weeks after Nikki Simpson joined the CensorNet team from Websense. A former Websense Channel Manager, Nikki is heading up CensorNet's distribution channel. "Much of the web security market has become dominated by large vendors and has seen no real product developments since the early 2000s.
Solutions such as Websense and Trustwave’s WebMarshal were developed before the era of Facebook and the vast number of cloud applications that we use in business today. Consequently resellers’ options have become increasingly limited, leaving a huge swathe of channel partners disillusioned with nowhere to go,” explains Ed Macnair, CEO, CensorNet. Stepping up the pace CensorNet recently also announced a partnership with Renaissance, Ireland’s premier IT distributor. “Renaissance is a hugely respected and forward thinking value added distributor in Ireland, with advanced technical experience and an expansive market reach. This partnership enables us to not only develop a wider reseller base across Ireland but also provides support to our existing bank of customers,” said Macnair. “We’ve laid down the gauntlet to our competitors by developing decisive partnerships with channel partners, such as Renaissance. We are redefining cloud security by providing organisations with greater visibility and much better control in the supervision of company-wide internet access and the use of cloud applications across all devices, regardless of whether users are in-office or mobile. Working with partners who have extensive market and technology experience in the web security space will be central to our success as we intensify our growth strategy across both EMEA and North America this year.” Web Security as-a-Service - the next big opportunity for MSPs The appointment of Alex Kurz as Director of Sales Engineering forms part of the company’s strategy to accelerate the adoption of its next generation cloud security solutions. . CEO Ed Macnair said CensorNet is excited to have Alex on board. Formerly Trustwave’s Technical Director for service providers in Europe, Alex brings with him a wealth of technical knowledge. Particularly adept with web and email content security, he has well established relationships within the channel, especially the service provider community. “In recent years the web security market has been largely commoditised. It’s been all about brand design change rather than developing new product functionality. As it stands, the current market is outdated and ill-equipped to address the needs or complexity of web access and cloud applications,” explains Alex Kurz, Director of Sales Engineering, CensorNet.  “Many traditional web security solutions were developed before the era of cloud applications and have not evolved fast enough to cope with the vast number of cloud applications that we use in business today. Web security as-a-service is the next big story in the content management market and businesses need solutions that befit the environment. To be part of such a great new channel proposition for service providers is a very exciting opportunity. My primary objective will be to strengthen CensorNet’s position within the channel community and drive adoption amongst customers worldwide - and building a strong pre-sales team will be fundamental in achieving this,” continues Kurz.      Kurz will be growing CensorNet’s pre-sales engineer team, providing training and support to technically enable its new channel partners.  His role will also involve working alongside the product team to develop the next generation of content security solutions.  “Exciting times are ahead for CensorNet. With the upcoming launch of our new web security solution we will be looking to the channel to push out our solution far and wide. 
"Alex will be a vital asset in driving our intensive channel strategy, enhancing product development and strengthening the core management team", adds Macnair.

Webinar Alert: With CensorNet, MSPs can offer customers and resellers a unique Web Security-as-a-Service offering entirely under their own brand and at a price point that the MSP can determine themselves. Want to learn more? Join the CensorNet Webinar on Thursday 26th March at 10am GMT to find out more about the product offering and exciting new MSP Program. http://www.censornet.com/webinars/reseller-webinar-registration/

### Pimp Your Hardware with Kingston Tech

Big data analytics hardware doesn't need to cost a fortune: commoditised hardware, along with upgraded existing assets, can be used to drive big data. The vendors will do all that they can to dissuade you from installing other providers' hardware, suggesting that commoditised hardware provides insufficient reliability and that upgrading your existing hardware (with memory other than their own) risks voiding the warranty on the equipment in question. Thanks to Kingston Technology, we are happy to put your mind at ease. As much as sales representatives might attempt to coerce you into purchasing memory modules from them, usually at much higher prices than you can get elsewhere – a ploy known as a "tie-in sales provision" – such provisions are generally illegal. Indeed, they are specifically prohibited in the consumer market by section 102(c) of the little-known Magnuson-Moss Warranty Act of 1975 and can also violate sections 1 and 2 of the Sherman Antitrust Act. This useful blog outlines both of these legal provisions in more detail.

To remain competitive in today's hyperconnected, saturated markets, firms of all sizes must look at their use of data and how they leverage not only their own data (sometimes their greatest asset), but also the massive stores of third-party data (from Twitter traffic to weather data). Optimising their use of both kinds of data, probably within hybrid cloud environments, will enable them to achieve deeper customer engagement and streamlined omnichannel commerce, gain insight into clients' behaviour and optimise effective and cost-efficient client outreach. According to an IDC study, the amount of data in the world is set to grow ten-fold in the next six years to 44 zettabytes, or 44 trillion gigabytes. By leveraging next-generation technologies to harness this data, firms can gain insights into the behaviours and preferences of customers, as well as into how they interact with the world - uncovering hidden insights that lead to new revenue opportunities.

Pimping your gear to build a DIY Big Data Machine

There are various guides available that can take you through the steps required to build a big data analytics platform, such as this one on Hadoop. Among the main considerations is the core platform. Whether this is hosted on-premise, in a remote data centre or across a hybrid environment, it needs to be tuned to provide the necessary raw power as well as scalability. In power terms, it needs to be able to ingest information at speed. The system also needs to be able to determine whether this data is immediately available for analysis, what the level of potential latency is, and how this might impact real-time applications and MapReduce speeds. You also need to understand how the platform can be scaled to meet ongoing demands in terms of number of nodes, tables, files, and so on.
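A rough back-of-the-envelope calculation is often enough to start that scaling conversation. The sketch below is illustrative only: the per-node capacity, usable-space ratio and retention period are assumptions to be replaced with your own figures, and the three-way replication simply mirrors a common distributed file system default rather than anything specific to your platform.

```python
# Illustrative only: estimate how many scale-out nodes a given daily ingest
# rate implies. Swap the assumed figures for your own hardware and workload.
import math

def estimate_nodes(daily_ingest_tb: float, retention_days: int,
                   replication: int = 3, raw_tb_per_node: float = 24.0,
                   usable_ratio: float = 0.7) -> int:
    stored_tb = daily_ingest_tb * retention_days * replication   # total replicated footprint
    usable_tb_per_node = raw_tb_per_node * usable_ratio          # headroom for temp/shuffle data
    return math.ceil(stored_tb / usable_tb_per_node)

# e.g. ingesting 2 TB a day, keeping 90 days of data, with 3x replication:
print(estimate_nodes(daily_ingest_tb=2.0, retention_days=90))    # 33 nodes on these assumptions
```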
There are vendors that would suggest that this means that you need to go out and purchase racks of powerful new servers from them. However, the mega scale vendors (such as Google and AWS) have shown that big data engines can be built using commoditised hardware, and as a further option you can also tune up your existing hardware to provide the horsepower to make the most of big data analytics. Our recommended steps are: Assess your requirements - the Hadoop buyers guide in the link above provides a good methodology for this. Look at how you can revamp your existing kit - companies like Kingston provide low cost memory upgrade options to allow you to pimp your existing kit to provide the power and potentially also the scalability to drive your new big data platform. Build out as you need to, using commoditised hardware - just as the mega-scale cloud vendors do. Hopefully this will allow you to make your budget stretch further and gain the proposed value from big data insights for less than you might think. If you want to discuss pimping your hardware with Kingston Tech, visit them at Cloud Expo on March 11th and 12th, at stand 512. ### Pitch Yourself! at Cloud Expo Europe Compare the Cloud will be recording audio and video pitches of cloud services during Cloud Expo Europe on 11th and 12th March, 2015 at Excel, London. If you would like to record a short pitch for your company - find us at the DIGITAL REALTY stand or speak to our team. You can also tweet us @comparethecloud or email cloudpitch@comparethecloud.net Pitches can be between 60-120 seconds long and we will also interview about your company and strategies. Recording Times: Wednesday1-3pm Thursday 12-2pm While you wait - why not help Neil park his car in our game: Neil's On Wheels. Alternatively why not try your skills on Andy Crush. Enjoy! ### Cirrus, Nimbus, or Hybrid? Is the cloud right for you? It all starts with your own intelligence! Armed with the correct facts about your current IT infrastructure, you can make the decision of when, or if, a move to cloud infrastructure suits your organisation. Whatever your conclusions are today, the answer will inevitably be a resounding yes if the below predictions are true to form: IDC predicts that by 2016, there will be an 11% shift of IT budget away from traditional in-house IT delivery, toward various versions of cloud computing as a new delivery model.  In 2016 over 80% of enterprises globally will using IaaS, with investments in private cloud computing showing the greater growth. Ovum forecasts that by 2016, 75% of EMEA-based enterprises will be using IaaS. Gartner estimates that IoT product and service suppliers will generate incremental revenue exceeding $300 billion in 2020. IDC state that the worldwide market for IoT solutions will grow from $1.9 trillion in 2013 to $7.1 trillion in 2020. It is imperative to understand where you are today, in order to know where you need to get to tomorrow. This statement is never more true than when discussing cloud technologies. An in-depth analysis of your current IT Infrastructure may sound like common sense, but you would be surprised how frequently this first step is assumed rather than identified to the correct level of detail. What the cloud offers businesses today is resiliency, global reach, and future proofing of the technical infrastructure for you and your company. With security and mobility being high on any company’s agenda, this can often be difficult to resource and budget for, for any given enterprise. 
The tangible business benefits of cloud adoption include shifting expensive CAPEX costs to a monthly OPEX model with flexible contracts, whilst the intangible benefits can include mobility, resiliency and scalability of the IT infrastructure, to name a few. But hold on, when does the cloud not suit you? Cloud technology will not fit all, so a hybrid approach is needed. Just as the technologists out there will understand that hybrid clouds are coming of age with the choice of infrastructure and location, the business consumers know that hybrid support is also needed to help them through the transition and beyond. And let's face it, they are quite literally running blind: all of their technology and systems are now being supported and run by someone else. This may seem like an anti-cloud argument but it's actually quite the opposite. Cloud technology will only ever get faster, cheaper and more flexible, until there is no other choice but to use it. However, we need to keep an eye on the actual consumers that are buying this technology and the reasons why they are doing so - we must never lose sight of this.

We are entering the "Internet of Everything", where quite literally everything is connected - and this includes home utility devices such as fridge freezers, bins and even toilets. With this new age of technology we must make our decisions wisely, as they will definitely determine our future at work, as well as in our personal lives. I personally have an interconnected house, with a smartphone, smart watch, smart bin, smart TV and unfortunately a smart wife (just kidding with the last comment). Everything I do can be made public in an instant: I can monitor emails on my watch, and I can even tell my fridge what to order or let it decide for me. These interconnected devices will only accelerate and become more mainstream as we move forward a few years. Moving to a cloud-centralised IT model is inevitable, unless you wish to live in the wild like an animal.

As cloud technologists, we have a duty to our clients, a duty that requires us to understand why anyone would migrate to cloud environments, and we need to have the integrity to say when it's not fit for their desired purpose. If we work in partnership and educate along the journey to cloud adoption, we will retain clients for the future and trust will be formed. The cloud marketplace is a relatively young industry and there are many choices a consumer can make - we can help ensure you make the right one!

### The Invisible Threat To London's Economy

London is pulling up the rear just behind Birmingham in a recent survey from Prosyn IT Support. The city's available internet connection speeds are well behind Oxford, Stockholm, Bucharest and many other cities at home and abroad, weighing in at a paltry average of 26.3 Mbps, with far lower speeds within the City of London and the West End. For a place that prides itself on its powerful industry and quick business growth, it certainly isn't equipping entrepreneurs with the best tools available, and it makes keeping their operations in the cloud an especially difficult undertaking for many companies! The infrastructure for superfast fibre broadband that has been taking much of the world by storm lately is almost nowhere to be found within London's inner limits. Things do not seem to be moving in that direction anytime soon either, despite the huge cost in lost productivity.
Business is moving into the cloud at an alarming rate, with the global digital economy growing at more than 10% every year… It won’t be long, therefore, before London’s businesses are left in the dust. As home to 22% of the economic output of the entire United Kingdom, what would happen if businesses abandoned London in search of more favourable conditions, or what if businesses from abroad never came? This lack of cloud infrastructure is crippling not only to London itself, but the entire UK, and it must be addressed. ### February Top 10 #CloudInfluence UK and Cloud Bursters Bursting Onto The Scene: Our February Cloud Bursters, the #CloudInfluence individuals with the greatest increase month on month, were as follows: [table id=8 /] Rodney Rogers came in 1st place as our highest Cloud Burster thanks to his appearance at 2nd place in the #CloudInfluence Global 50. Narendra Modi in 2nd, India’s prime minister said that his government's push towards building IT infrastructure would adopt innovations by the industry and initiatives such as "Digital Locker". He also claimed that this would increase the need for "cloud godowns", where people would be able to store and access their documents and information on the move. It is great to see senior politicians appreciating the potential of cloud. Ericsson's president and CEO, Hans Vestberg was in 3rd with Samsung's senior director of product marketing Shoneel Kolhatkar in 4th as the launch of the Samsung S6 and looming Mobile World Congress pushed them into prominence. Carlos Moncho, Chairman & CEO of PUSHTech in 5th was also boosted by a pre-Mobile World Congress announcement. Then there were two analysts: Ajay Sunder, Vice President, ICT Practice, Frost & Sullivan Asia Pacific in 6th followed by Rohit Mehra , Vice President, Network Infrastructure at IDC in 7th who has been commenting on the uptake of 10Gb Ethernet and 40Gb Ethernet as data centres seek to grow their capacity. The other cloud bursters were Orlando Scott-Cowley, cyber security expert at cloud-computing company Mimecast, in 8th, and two more analysts Brad Casemore, research director, Data Center Networks, at IDC in 9th and Seamus Crehan , president of Crehan Research in 10th. Closer to home: The leader of this months #CloudInfluence rankings for the UK should not come as much of a surprise to anyone. Topping the UK list last month, appearing at 25 in this month’s global list, and topping the UK list yet again is IBM’s Simon Porter. Then excluding two of our own directors (who would have appeared in 8th and 11th) from the running, the rest of the UK list was as follows: [table id=9 /] Simon Hansford, Skyscape CEO, retained 2nd place. The next two had reports to promote: Tudor Aw, Technology Sector Head at KPMG Europe in 3rd, was promoting a report that found that the UK tech sector outperformed the wider UK economy in Q4 2014, while Alex Hilton in 4th was publicising CIF research that found that only two percent of UK businesses have reported a security breach in their cloud services. Then we had the CEO and CTO of Iomart, which was accredited for the UK Government G-Cloud 6, followed by Mark Porter of InterCloud and the CEOs of Napatech and Tisski. Francis Maude dropped to 10th this month, still promoting the government’s G-Cloud. As for the top UK organisations, the UK government came top with Francis Maude and others promoting its G-Cloud, then excluding ourselves again (Compare the Cloud would have come second) came Aria Networks and Iomart. 
Come back next month to see how things change! ### DeeperThanBlue achieve Premier Partner status with IBM Digital channels and integration specialist DeeperThanBlue has announced it has achieved IBM Premier Business Partner status. This represents a significant milestone for the Sheffield company, since it was awarded Advanced Business Partner status just four months ago. Premier Partner status is awarded to organisations that have demonstrated superior skills and marketing success when building and selling IBM-related solutions. DeeperThanBlue has demonstrated consistent delivery of significant value to its clients through innovative solutions – a feature at the heart of the IBM products they work with. DeeperThanBlue primarily work with IBM WebSphere Commerce, which provides an e-commerce platform delivering seamless and consistent omni-channel shopping experiences, including mobile, social and in-store. They also use the mobile, business process management, and integration technologies from IBM's suite. “We are delighted to have achieved the Premier Business Partner status in such a short period of time; this can be largely attributed to the solution offerings that we implement and the quality of the team of consultants and service delivery people that we have assembled,” said DeeperThanBlue Sales and Marketing Director, Chris Booker. In recent months DeeperThanBlue has won a number of new projects to provide digital solutions for retail and distribution clients, successfully employing IBM commerce and mobile technologies deployed in the cloud or on-premise, with a primary focus on customer experience management. They have also signed a number of contracts for business process management and integration projects for manufacturing and logistics clients. These new ventures are helping DeeperThanBlue's clients become more agile with their trading partners and optimise their internal business processes and supply chain. ### February Top 50 #CloudInfluence Individuals There was a high rate of change in the Global Top 50 #CloudInfluence Individual League Tables, with only 11 of last month’s top 50 appearing in this month’s rankings. For the rankings we exclude journalists and other authors who tend to report other people’s opinions and focus instead on the key opinion holders themselves – those that tend to be quoted in any such articles. January’s rankings are shown in square brackets when applicable.  The Top Ten: Security issues set the agenda this month as Marc Rogers [3], Principal Security Researcher at CloudFlare, climbed to 1st place with widespread comments on security issues. He was quoted widely on the volume of malicious traffic coming from Tor exit nodes and the Lizard Squad attack on Lenovo. Appearing in the widely reported CEO Power Panel during 15th Cloud Expo, Rodney Rogers [4], CEO of Virtustream, came in this month at 2nd, followed by a new entrant, Paul Turner, Cloudian’s CMO, in 3rd. With a brief lull in corporate announcements, Satya Nadella [1], CEO of Microsoft, dropped to 4th. Meanwhile, evangelising about how Solgenia Group has leveraged the revenue-generating capability of cloud consumption catapulted Ermanno Bonifazi, the firm’s founder and CEO, into 5th in our rankings. 
Widely quoted championing the contribution of the channel, and explaining that as organisations become more willing to adopt cloud technology and services, the demand for experienced solution providers is rapidly increasing, was Robert Faletra [25], CEO of The Channel Company, who came in at 6th place. Vint Cerf, Google’s Chief Internet Evangelist and one of the fathers of the internet, appeared in 7th, warning that we risk becoming a "forgotten generation" or even that this will be "a forgotten century" because, as computer hardware and software become more quickly obsolete, we are losing information. Major General James Linder of the US Army appeared in 8th position, quoted widely on the U.S. military introducing a Cloud-based technology to allow African allies to quickly share intelligence across borders, such as mapping information for the location of potential targets. In 9th and 10th were two IBMers: Robert LeBlanc, the new SVP for IBM Cloud, and his boss Virginia (Ginni) Rometty [5], IBM’s CEO. Both were quoted in the raft of cloud announcements from IBM InterConnect covering new services, data centres, customers and partners, which helped boost IBM from 6th to 2nd in the #CloudInfluence organisation rankings. They also announced a $4 billion investment with which IBM says it plans to increase revenue from growing areas like cloud computing to $40 billion by 2018 as the battle for mind share and market share hots up.  [table id=7 /] Firm Favourites: Executives from firms featured in previous #CloudInfluence rankings. Helping to promote his company into a position of prominence in this month’s organisational rankings was Marc Benioff, CEO of Salesforce.com - well on its way to becoming the fastest company to achieve $10 billion in revenue. Also reflecting the data centre consolidation highlighted in the February special report on data centres were David Ruberg in 11th, CEO of Interxion, and Craig Young, CEO of Megapath, in 48th. Up and Down: Notable Risers and Fallers Showing how quickly you can fall in the rankings if you go quiet and fail to sustain a constant flow of comment were executives such as Dennis Woodside [2], the COO of Dropbox, who fell to 18th, and Martin Schroeter [6], IBM’s CFO, who rarely makes public comment other than for results or shareholder meetings. Schroeter fell out of the Global 50 entirely this month due to his lack of comment. We were interested to see Navnit Singh, Chairman and Regional MD for India at executive search firm KornFerry International, appear in 18th following his comments that AI (artificial intelligence) and cloud delivery models are changing the employment landscape, with only incremental growth being seen in hiring for middle management and below in IT services. We were also delighted to see a Brit appear in this month’s rankings – Simon Porter, a VP at IBM and a prolific force on social media, in 25th place.  Check back tomorrow for the February UK #CloudInfluence League Tables, and the cloud bursters league table. ### February Global #CloudInfluence League Table The #CloudInfluence battle heated up in February as Microsoft and IBM stole a mind share lead over Amazon in cloud. All the cloud players are in a massive battle for market share and mind share, and while AWS remains in a league of its own in terms of scale, it can no longer claim to be bigger than its closest four competitors combined.  This month’s rankings show AWS's rivals are not only catching up in terms of market share, but have taken an outright lead in terms of mind share. 
Amazon prides itself on being a relatively lean operation compared to the traditional players, but it cannot afford to be complacent or to underestimate these giants. Microsoft and IBM have massive marketing machines and huge ecosystems supporting their products and services, and both giants are focused on displacing AWS. Here are the February 2015 Global #CloudInfluence rankings for organisations. January's rankings are shown in square brackets when applicable.  Top of the charts: February's Top Ten Yet again Microsoft [1] top the ranking of #CloudInfluence organisations, with a commanding lead over the rest. IBM [6] leapt to 2nd place in the rankings following the announcement of a $4 billion investment with which IBM says it plans to increase revenue from growing areas like cloud computing to $40 billion by 2018. Its InterConnect event was brimming with cloud announcements covering new services, data centres, customers, and partners – such as the one with Juniper, which joined the top 50 this month, sneaking into 39th position on the back of that announcement. Amazon [3] and Apple [4] ranked in reverse order this month, with Apple moving ahead to 3rd as it announced that anyone — even PC users — can access its suite of iCloud web-based productivity apps – Pages, Numbers, and Keynote – for free. However, as some consolation for Amazon's slip to 4th, its cloud operation AWS [8] jumped to 5th. Amazon will break out its AWS revenues separately for the first time in its next financial report; until then we are still reliant on market share reports such as the recent one by Synergy Research Group (SRG), which itself catapulted to 18th in the rankings on the back of the report.  The SRG report found that AWS's cloud growth had faltered in competition with Microsoft and IBM in the second quarter of 2014. While AWS remains the largest single player, its competitors' combined growth has finally disproven AWS's claim to fame of being bigger than all its competitors combined.  Akamai [10] and Google [11] jumped to 6th and 7th respectively. Akamai reported record results, up 25% adjusted for foreign exchange, while Google’s addition of features such as a private Docker registry, a cloud benchmarking tool and an incident reporting dashboard failed to mask the somewhat embarrassing two-hour outage experienced by its cloud engine. Oracle [5] dropped to 8th place, VMware [8] slipped one place to 9th, and SAP [2] dropped all the way to 10th. [table id=6 /] Making a big splash: firms with announcements that boosted their rankings Consolidation in the data centre sector continues apace, as was seen with MegaPath Corp. in 38th selling its managed services business division to GTT Communications, and leading British data centre provider Telecity in 19th agreeing to merge its operations with Netherlands-based Interxion. At the same time Rackspace jumped to 21st place on the back of speculation about its future (see also the Data Centre Special Report, which appeared prior to the MegaPath GTT deal). Special guests: cloud customers appearing in the rankings Additional comment pushed the European Space Agency [32] up to 26th place in February. The National Hockey League, in at 28th, was widely quoted about seeking to increase fan engagement and deepen fan loyalty with the newly released statistics platform on NHL.com, powered by the SAP HANA Enterprise Cloud service. 
Meanwhile the Indian army came in at 47th after announcing a move to a highly encrypted cloud storage system as it works towards becoming a "100 per cent digital" force. Missing in action: big names that were missing until now HP and Salesforce were both notable absentees last month. However, there is a stark difference between the two firms' cloud presence. HP, after having wasted billions on acquisitions like Autonomy, EDS, 3Com, and Palm trying to buy growth, has no real cloud revenue to speak of – even its recent deal with Deutsche Bank could hardly be seen as cloud. Meanwhile Salesforce.com has come from nowhere in the same period to grow to a market cap nearly three-quarters of HP’s size, and is well on its way to becoming the fastest company to achieve $10 billion in revenue. Consequently, while Salesforce made it into the Global #CloudInfluence top 50 at 48th, HP rose only as far as 60th – although HP’s Enterprise Services (the part of the company being split away from the PC business) did appear in this month’s ranking, albeit in unlucky 13th on the back of considerable negative coverage. Tune in again next month to see how these players all do, and come back tomorrow to see who the Top 50 Global #CloudInfluence Individuals are. ### Where in The Cloud is My Data? ‘The cloud’. Used in this popular singular form with the definite article, it suggests that there really is a single, nebulous entity where computing and storage magically take place. Of course, the reality is that “the cloud” is a network of data centres, and within those, a network of servers and storage nodes. So when you put data in “the cloud”, where is it exactly? Some companies are very wary of placing data in anything but their own data centres. In that case they know exactly where their data is, and a private cloud doesn’t change that. This blog is not for them. However, there are many reasons why even enterprises in regulated industries want to use external cloud services, and such a move doesn’t necessarily mean that they lose sight and control of their data – they just need to ask six questions. You can view these as a series of increasingly contained questions that peel back, layer by layer, where the data is actually stored: In which region is my data stored? In which country is my data stored? In which country is my cloud provider registered? In what type of datacenter(s) is my data stored? In what type of tenancy model is my data stored? In what storage format is my data stored? [caption id="" align="alignleft" width="352"] Much like with Matryoshkas, you need to peel through several external layers to get to where your data is really stored.[/caption] In which region is my data stored? Many cloud providers have data centres deployed across multiple regions. The physical distance between the user and the datacenter still matters, even with high-bandwidth connections. It will continue to matter because of network latency: the minimum time it takes a bit of information to get from one place to another. High latency causes data transfers to time out and applications to break down, so one would always prefer to have the data stored in one’s own region. There are additional considerations here for companies that are spread out over multiple regions (e.g. North America and Europe), who would want data to be replicated across those regions. This is also sometimes done for redundancy or disaster recovery purposes. Many cloud services providers automatically send the data to the closest data centre. 
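To make the latency point tangible, here is a minimal sketch (in Python) that compares round-trip connection times to a couple of candidate regional endpoints. The hostnames are placeholders rather than any provider's real endpoints, and opening a TCP connection is only a crude proxy for application-level latency, so treat the numbers as indicative at best.

```python
import socket
import time

# Placeholder regional endpoints - substitute your provider's published ones.
CANDIDATE_REGIONS = {
    "eu-west": "eu.storage.example.com",
    "us-east": "us.storage.example.com",
}

def tcp_round_trip(host, port=443, attempts=3):
    """Average time to open a TCP connection, used as a rough latency proxy."""
    timings = []
    for _ in range(attempts):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=5):
                timings.append(time.perf_counter() - start)
        except OSError:
            return None  # endpoint unreachable from this location
    return sum(timings) / len(timings)

if __name__ == "__main__":
    for region, host in CANDIDATE_REGIONS.items():
        rtt = tcp_round_trip(host)
        label = f"{rtt * 1000:.1f} ms" if rtt is not None else "unreachable"
        print(f"{region}: {label}")
```

Run from your office network (and again from any branch offices), a test like this quickly shows which region keeps latency-sensitive applications responsive.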
However, there may be cases where an organisation would prefer to limit certain types of data to one region – e.g., the European Union – for legal or regulatory reasons. In which country is my data stored? The issue of data sovereignty has come to the forefront as cloud services become more popular. In some countries – many European countries as well as Canada – certain types of data should not leave the country. Sometimes this isn’t a legal requirement but a privacy concern that a company would like to convey to its customers. In which country is my cloud provider registered? Data residency, or where the data is physically stored, is not the only aspect influencing sovereignty. Even if the data is stored in your own country, if the provider hosting it is a company subject to foreign laws, your data may be accessible to foreign governments under various laws of information disclosure, or it may be disclosed to certain parties in case of a lawsuit. Check your service provider's legal status if it's important to you not to have your data exposed to such disclosures. In what type of datacenter(s) is my data stored? You want to know that industry-standard security best practices are applied to your data storage, and that includes both IT security and physical security measures. Things like 24/7 surveillance, fire suppression systems and multi-factor authentication for entry are to be expected. There are various certification schemes that audit data centres to ensure compliance with best practices. In what type of tenancy model is my data stored? There’s a difference between the multi-tenancy model of public cloud and the model now popularly called Virtual Private Cloud or VPC. The difference in data storage and access is significant. In the normal multi-tenancy model, your data is stored in the same logical system or “bucket” as other organisations’ data, and access to it is governed by access control mechanisms. With a VPC, your data is stored in logically separate, fenced-off infrastructure that can be made accessible only via your VPN. This makes the VPC as secure as your own private datacenter environment, so even if access control mechanisms fail, your data can never be mixed with other data. By default, this also means you can encrypt your data with your own keys, and control every aspect of encryption policy. CTERA explained this in depth in a previous blog post. In what storage format is my data stored? Common storage formats (in and out of the cloud) include block, file and object storage. Cloud storage typically includes flavours of all three, but they are used for different purposes and applications. For most file services such as backup and file sync & share, as well as for storage of large amounts of unstructured data, object is the preferred format – for both its massive scalability and its cost-efficiency. AWS S3 is an object storage system, as is OpenStack Swift. Price per GB on object is roughly half that of most other file-based systems. CTERA supports nearly all object storage vendors. Normally, this is not something cloud users need to concern themselves with – but if you do have cost concerns and scalability expectations and are using one of the lesser-known cloud providers, it is wise to verify the suitability of the storage format to your needs. How Deep Is Your Love? For Your Data That Is. If you made it this far in this blog post, congratulations! You deeply care about your data. 
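For readers who want to see the storage-format question in practice, here is a minimal sketch of writing and reading an object through the S3 API using the boto3 library. The bucket name is hypothetical, credentials are assumed to be configured in the environment already, and the same pattern applies to other S3-compatible object stores.

```python
import boto3

# Assumes AWS credentials are already configured (environment or ~/.aws/credentials)
# and that "my-example-bucket" is a hypothetical bucket you control.
s3 = boto3.client("s3")

# Objects are addressed by bucket + key rather than a file-system path.
s3.put_object(
    Bucket="my-example-bucket",
    Key="reports/2015-02-summary.txt",
    Body=b"unstructured data scales cheaply in object storage",
)

response = s3.get_object(Bucket="my-example-bucket", Key="reports/2015-02-summary.txt")
print(response["Body"].read().decode())
```

That said, for most users the format is an implementation detail handled by the provider.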
Seriously, not everyone needs to dig through all these layers, but it’s good to know they exist. Cloud providers are not responsible for your data – you are. Demystifying cloud storage to understand where your data really resides, and how that affects data governance, security, integrity, sovereignty and compliance, can remove some of the obstacles to adopting “The Cloud” in an optimal way. ### Data Centres #CloudInfluence Special Report As part of our 2015 #CloudInfluence program we will not only be publishing the Compare the Cloud #CloudInfluence rankings at the start of each month (focusing each time on the preceding 90 days), but we are also aiming to publish a special report in the middle of each month on a related topic area. This month the #CloudInfluence Special Report focusses on data centres. In a previous life I was an Officer in the Royal Navy. I joined straight from school and signed up for a full 22-year contract (longer than I had lived at the time), certain as I was that it was the career for me. As it happened this career was cut short on medical grounds after just four years and a new career in the high-tech world beckoned. It showed the perils of seeking to predict things too far ahead. In September 2008, Larry Ellison completely dissed cloud computing while speaking at an analyst conference: “The interesting thing about cloud computing is that we’ve redefined cloud computing to include everything that we already do ... the computer industry is the only industry that is more fashion-driven than women’s fashion. Maybe I’m an idiot, but I have no idea what anyone is talking about. What is it? It’s complete gibberish. It’s insane. When is this idiocy going to stop?” I’m not about to suggest that Larry is any kind of idiot, but it is now plain to see that cloud is here and that it is seriously reshaping the world around us. It is however right to caution against some of the more superlative hype associated with cloud - the fashion-driven aspect that Larry referred to. It was brilliantly summed up recently in this Dilbert cartoon, https://twitter.com/securebusiness/status/520524659193040896/photo/1.  A more realistic perspective is required in the data centre market. Seeing through the hype and focusing on the real opportunities as the sector matures and consolidates is essential. And with massive long-term infrastructure investment required, a long-term perspective is also vital. Another lesson that I learned in the Royal Navy was that most of the attention might well be focused on what happens on the bridge, but it is down in the engine room where a great deal of the hard work occurs, and without a smooth-running engine room no amount of noise on the bridge is going to get you anywhere. The engine rooms for cloud are the data centres - and we have identified over a thousand key data centres across Western Europe. While there is enormous buzz about cloud across social media, in the press and in blogs, there is relatively little about data centres. Our big data analytics showed that the community of influencers focusing on cloud was over ten times as great as the community focused on data centres, even though there was a good degree of overlap, with more than half of all articles, blogs or tweets that focus on data centres also focusing on cloud issues. Looking specifically at which organisation had the most influence on data centre issues, our #CloudInfluence Special Report analysis found that Digital Realty Trust came out on top. 
Timing is everything though. Had the report come out at almost any time over recent months, Digital Realty would have stood head and shoulders above the rest; however, the recently announced Telecity Interxion merger boosted the profile of both companies. Telecity therefore came a close second to Digital Realty, with Interxion in 4th just behind Equinix, another key player in the market. The Telecity Interxion merger also impacted the table of the top individual influencers in the data centre market. David Ruberg from Interxion topped the list, followed by Jim Poole of Equinix and A. William Stein at Digital Realty, following the announcement of organisational changes at the firm. Raul Saura from T-Systems Iberia was quoted widely following the announcement that Emerson Network Power had been selected by T-Systems to construct one of its largest modular data centres in Spain, as was Scott Barbour, business leader of Emerson Network Power. Aaron Binkley from Digital Realty appeared in 6th commenting on sustainability, followed by David Wilkinson of Equinix after the major deal with the Asia Pacific Stock Exchange that will see APX’s new trading platform deployed in Sydney data centre SY1. In 8th was Ivo Pascucci of TeliaSonera, commenting on their extended 100G connectivity with the US. Finally, the announcement by Verne Global that it had secured US$98M of equity funding, led by Stefnir, an Icelandic asset management and private equity firm that is working with several of the largest Icelandic pension funds, ensured that both Isaac Kato of Verne and Arnar Ragnarsson of Stefnir made it into the top ten this month. As the market consolidation continues, as seen recently with the Telecity Interxion merger, we expect the data centre influencer field to be dominated by the same large players and their executives. In the engine rooms where the real work is done, these are the real cloud heroes. [table id=1 /] ### Hybridity: The Future Mainstay With the emergence of cloud technology as the mainstay of collaborative and efficient working, the term hybrid cloud has been bandied around with much speculation in recent months. I think that if you ask five people the same question – “what is hybrid cloud technology?” – you will get five different answers! A hybrid (of anything, really) is something that has been produced from two different things, especially to get, create or breed better characteristics or performance. With cloud computing this is exactly the case, and it should not be confused with HPC, scalable technology, burstable resources or anything else. These are simply marketing descriptions that already exist in the industry today. Simply putting together two services that share some common infrastructure (SaaS application access, or running two or more SaaS applications) isn’t really a hybrid model in my opinion. So let’s look at this in a little more detail. Any hybrid cloud challenges traditional views on IT infrastructures to gain the best performance for the individual business needs. 
This may include a number of differing solutions – public IaaS such as AWS and MS Azure, PaaS, SaaS and a multitude of other IT deployments including private clouds and dedicated infrastructures, be they hosted with varying specialist providers, on-premise or colocated – and in my opinion the following issues need to be thrashed out before considering a move to a true hybrid cloud solution: Cloud management With disparate clouds, each provider will potentially utilise different tools and management platforms. This comes with its own challenges, but put it together with conjoining other technologies and other cloud platforms, each with their own provider’s policies and procedures, and it becomes a positive minefield of issues. A common management platform helps, with APIs that can accommodate the management complexity as well as the orchestration of resources. Connectivity If public/private hosted clouds are utilised you must look at latency, bandwidth and reliability (especially to disparate services) as well as the network layer. Not only must bandwidth, latency, reliability and associated cost considerations be taken into account, but the logical network topology must also be carefully designed (networks, routing, firewalls etc). When discussing the benefits of a hybrid model one must consider the move to this cloud model as an end goal. Small steps can be made before undertaking the daunting task of tackling disparate technologies from independent vendors. Also, public cloud is a big step for anyone: consolidating management portals, using SDN to extend the enterprise, automation issues and more. Wouldn’t it be much easier to pick one provider to manage these issues for you? The cloud is becoming more and more complex, with the speed of innovation becoming constant, and in business it’s a full-time job addressing operational changes that are not tech related. Now try to understand and manage the cloud computing changes that affect your business on top of that - it’s tough. Take a small-steps approach and get good advice from your trusted advisors in technology, whoever they are, and don’t be hoodwinked by marketing spin and industry acronyms for the sake of keeping up with the Joneses. Security Let’s also not forget the importance of security. With the multitude of security principles available you will need to pick wisely. Ask yourself this: Can my security principles be applied to a public cloud presence? In many cases I doubt it, so a private hybrid would probably be your preferred approach. How do I know that another vendor’s internal and external policies are suitable? Simply, you don’t, and due diligence is needed! Do other vendors that I utilise compete with my current incumbent providers? Competition is healthy in business, but cloud commoditisation is rife and each provider will be begging for your business, maybe to your detriment. So, we can take this to one more level of explanation: vertical markets. For financial services businesses, regulation is key to sustainability and compliance. Can you apply your industry-specific regulatory rulings to a public cloud? No is often the answer. Can you apply them to a private cloud environment? Of course you can. However, this does not mean that you will not be able to have a hybrid model, just that careful consideration is needed for the commodity elements of your technology, which workloads sit where, and the regulating bodies’ understanding of “what and where” said services can be stored and accessed from. 
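To illustrate the "common management platform" idea raised under cloud management above, here is a minimal, hypothetical sketch of a provider-agnostic interface with a simple burst-to-public policy. The provider classes are illustrative stand-ins, not any real vendor SDK; in practice each would wrap the relevant provider's own API.

```python
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Hypothetical common interface hiding each provider's native API."""

    @abstractmethod
    def start_instance(self, size: str) -> str: ...

    @abstractmethod
    def stop_instance(self, instance_id: str) -> None: ...

class PrivateCloud(CloudProvider):
    def start_instance(self, size: str) -> str:
        # Would talk to the on-premise orchestrator in a real deployment.
        return f"private-{size}-instance"

    def stop_instance(self, instance_id: str) -> None:
        print(f"stopping {instance_id} on the private estate")

class PublicCloud(CloudProvider):
    def start_instance(self, size: str) -> str:
        # Would call the public provider's SDK (e.g. AWS or Azure) here.
        return f"public-{size}-instance"

    def stop_instance(self, instance_id: str) -> None:
        print(f"stopping {instance_id} via the public cloud API")

def place_workload(primary: CloudProvider, overflow: CloudProvider, load_percent: int) -> str:
    """Keep workloads on the private estate by default; burst to public when busy."""
    target = overflow if load_percent > 80 else primary
    return target.start_instance("medium")

if __name__ == "__main__":
    instance = place_workload(PrivateCloud(), PublicCloud(), load_percent=95)
    print("started", instance)
```

The value of such a layer is that operational policy (where a workload runs, when to burst) lives in one place, while the messy provider-specific calls stay behind a single interface.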
In summary, a hybrid cloud is a combination of different clouds: private, public or a mixture of both. The biggest issue and challenge to consider is the management and support of the different technologies. I believe a private hybrid model, supported by common management platforms, will be the mainstay going forward. With the hybrid cloud model you can have the best of both worlds: security with flexibility. ### The Dragon's Den for Tech Start Ups I had the pleasure of attending We Are The Future's (@WATF_) inaugural London Start Up Summit on February 17, 2015, where eight start ups were pitching for the chance to win a six-day trip to Silicon Valley. I was hesitant about attending this event; dedicating my entire afternoon to a relatively unknown entity seemed like a large commitment in a very busy week. However, I could not be happier that I leapt into the void. If you have hesitated at attending a start up event in the past, I am here to tell you to stop being such an idiot. Ingenuity is much sought after in the present day and age, and if you are missing these events you are missing opportunities to capitalise on the future. My attendance developed from a Twitter conversation with an incredible young Scottish entrepreneur. It is not often I come across someone in this industry younger and more driven than I am. Bruce Walker at WATF is just that. His visionary not-for-profit is giving UK start ups opportunities that otherwise would be completely out of reach. Now, I'm not just talking about the trip to Silicon Valley that the start ups were competing for - the networking possibilities that the start ups are exposed to at this event are invaluable. Joining Bruce in judging the pitches were the Emblem Group's CEO Michael Buckmaster-Brown, MBN Solutions CEO Michael Young, and Julie Baker, the Head of Enterprise for RBS/Natwest Banking. On top of the benefits for the start ups who were pitching, I want to stress the benefits for yourselves of attending such an event. WATF brought to my attention multiple start ups that CTC otherwise may never have encountered. The winners of the trip to Silicon Valley from the London SUS were Karisma Kids, an app that develops emotional intelligence in children aged 3-9, and Silicon MilkRoundAbout, a tech start up jobs fair. Among the other pitchers were incredibly exciting businesses such as Vidzy, which produces crowd-sourced digital marketing content through platforms such as Vine, and my personal favourite of the day, Aire, which produces 'credit scores for humanity'. As an Australian who moved overseas and had to struggle with starting a life again with a credit score of zero as a British citizen, a service like Aire would have been, and still could be, life-changing for me. Concluding my recommendation for you to change your mindset on attending start up events, all I can offer you are these two points. Firstly, it is nice to be out of the office; and secondly, you never know who you will meet - maybe it will be a company worth investing in, maybe it will be your new best friend. The worst case scenario is that you get to speak with a group of exciting entrepreneurs who have an invigorating desire for success, and I see nothing wrong with that at all.   ### Getting Ahead in the Clouds Enterprises are beginning to get their heads into the clouds, and the next three years will see significant growth. Enterprises and third party operators are transforming to cloud data centres, which are fast becoming the indispensable infrastructure for computing, storage and management of Big Data. 
Enterprises require simple and scalable ways to create interoperability between data and endpoints, and among multiple data silos. They need infrastructures which offer a highly available, multi-tenant and elastic integration cloud to support both high-density computing and day-to-day business operations. An international survey of IT decision makers conducted for international data centre operator Equinix covered companies in Australia, Brazil, Canada, Germany, the UK, US and Singapore. The results indicated that over 90% intend to deploy cloud-based services over the next 12 months. 79% of respondents said they plan to use multiple clouds, 77% within the next 12 months. 58% of respondents said they planned to use clouds operating in more than one country. In November 2014, Cisco released its latest GCI report showing that by 2018, 75% of data centre workloads will be processed in the cloud and global data centre traffic will have nearly tripled. Highlights from Cisco's Cloud Index (GCI) report for 2013–2018: global data centre traffic will grow nearly three-fold; by 2018, global data centre traffic will reach 8.6 zettabytes per year; by 2018, 76 percent of all data centre traffic will come from the cloud; and by 2018, three out of four data centre workloads will be processed in the cloud. Source: http://www.cisco.com/c/en/us/solutions/service-provider/global-cloud-index-gci/index.html As Figure 8 [below] from Cisco's Cloud Index report indicates, workload density in private-cloud data centres will continue to grow faster than workloads in public cloud data centres – but the gap is narrowing. [caption id="attachment_16772" align="alignnone" width="611"] Figure 8: Public Cloud Virtualisation Gaining Momentum[/caption] Source: Cisco Global Cloud Index, 2013–2018 Cloud activity is happening on a global scale.  For example: IBM is expanding its global cloud computing network to 40 cloud centres with 12 new locations, including an IBM Cloud centre in Frankfurt.  In 2014, IBM opened a SoftLayer Cloud Centre in Paris, France, which complements its SoftLayer cloud centres in London and Amsterdam. IBM also has a strategic partnership with third party data centre operator Equinix to provide direct access to the full portfolio of SoftLayer cloud services via the Equinix Cloud Exchange. This partnership allows IBM to access nine markets worldwide spanning the Americas, Europe and Asia Pacific. Since the start of November, IBM has announced more than US$4 billion worth of cloud agreements with major enterprises around the world. Cisco Systems is investing US$1 billion over two years to expand its cloud offerings, linking hundreds of data centres and cloud providers around the world. In the second half of 2014, Cisco added over 30 Intercloud Partners including Deutsche Telekom, BT, NTT Data and Equinix, thereby expanding Intercloud’s reach to 250 data centres in 50 countries. These and other cloud developments are hugely impacting wholesale and retail co-location data centre operators: cloud accounted for 65% of new leases in 2014 for wholesale operator Digital Realty Trust (data presented at BroadGroup's Finance & Investment Forum, London in January 2015); Tier 1 co-location operators, such as Equinix, Interxion and TelecityGroup, are developing their data centres as cloud connectivity hubs; and new entrant operators are building their businesses on the growth of cloud customers - for example, Virtus in London reported that new customers in Q4 2014 included five cloud providers. 
The key driver for all this activity is not hard to see: it's all about data – particularly mobile data applications, whether they are for consumers or enterprise users.  Enterprise mobility and BYOD are among the trends for 2015. Data growth is tripling, generated through media, video, M2M and the Internet of Things. The need for new technology is driving change: zettabyte architectures, new designs, and next-generation data centres. More businesses are looking to outsource non-core services via the cloud, making cost savings in the value chain yet benefitting from cloud innovation, especially in regard to Big Data. These are all important themes which feature in BroadGroup's Datacloud 2015 conference on 3rd & 4th June in Monaco – an event which creates the largest European platform for data centre and cloud IT infrastructure end users and business leaders to discuss these dynamic changes in our industry. ### Making IBM Software relevant for Cloud service builders The world has moved on since 1995, when IBM acquired Lotus Software, Windows 95 was released, the Shuttle docked with Mir, and Pixar introduced us all to Buzz and Woody for the first time. At a time when the deployment of applications and services via the cloud and hybrid infrastructures is becoming more common, relying on a vendor programme built in the 1990s doesn’t always make the best sense. For an organisation interested in creating solutions, services or using IBM Software technology within their enterprise (and there are lots of partners, end users and a massively varied IBM software technology stack out there), the common assumption is that to purchase a “right to use” licence, you have to take that well-trodden path through IBM’s Passport Advantage (PA) scheme, with its numerous initiatives and often complex processes. Leveraging IBM technology in a cloud deployment has often meant that as a partner you are adapting your go-to-market model and the commercial relationship with your end user to fit IBM’s PA programme. Buying an asset such as perpetual licences, which are then restricted to a specific client, can be a massive turn-off for an “X”aaS provider, even when the technology itself is the best out there. Since 2010, when IBM rebranded their “OEM-lite” licensing programme, there has been a flexible programme available to ISVs, MSPs and SIs by the name of Application Specific Licensing (ASL). ASL is aimed at business partners who wish to take IBM’s software and build it into the repeatable solutions they take to market – historically an application “on-premise”, but also with the ability to deploy that software via a cloud delivery model, by putting the right to use firmly in the control of the partner and their offerings rather than the eventual end user. Other features (more than it is possible to cover here) make ASL all the more appealing in the deployment of hosted solutions and SaaS, which are ideal for anyone building such things on SoftLayer: ASL is the only IBM licensing model available today that enables monthly licensing metrics to be created for IBM Software – allowing IBM to map their component costs to the ramp-up, ramp-down usage and commercial model of a true SaaS solution. ASL enables the partner to manage and redeploy licences throughout their user estate – subject to conditions, IBM components of a cloud solution can be reused when an end client leaves and a new client joins a service. 
ASL routes support for the IBM component to the solution provider – the partner is the interface for post-sales technical support, streamlining the end user experience and adding greater value to the solution being offered. The ability for a partner to create a unique agreement means ASL gives an ISV or MSP not just a USP over their rivals, but the ability to leverage IBM technology in markets where they may previously not have been commercially viable. Restricting use to a service and using a monthly model means a client no longer has to take on a third-party asset and can easily consume a service from a provider building in IBM software, and do it across their multi-national sites with ease. So what’s the catch? There are very few, as long as you add significant value to the IBM component. Purchase commitments are a thing of the past. Solutions can be added to an ASL agreement and amended as they evolve, plus PA and ASL programmes can sit side by side, but you cannot mix and match within a solution deployment. In short, partners really do have an option when building out PaaS, SaaS and BPaaS (Business Process as a Service) offerings using IBM software, be that Sametime, WebSphere, Tivoli Endpoint Management, Rational, or any number of the recent acquisitions (such as Netezza, MaaS360, WorkLight or Cast Iron). Off the back of strong growth since 2010, one thing is for sure in 2015: with the promise of new features such as “Percentage of Revenue” licensing models for certain software, ASL is relevant for those in the cloud space and will continue to attract new partners and clients who have previously dismissed IBM as too difficult to bring into their own terms, conditions and commercials. ### Tales from the Cloud Crypt Friday the 13th - no one really knows where the superstition originates from; however, there are many claims throughout history. Legend has it that: if 13 people sit down to dinner together, one will die within the year. The Turks so disliked the number 13 that it was practically expunged from their vocabulary (Brewer, 1894). Many cities do not have a 13th Street or a 13th Avenue, and many buildings don't have a 13th floor. If you have 13 letters in your name, you will have the devil's luck (e.g., Jack the Ripper, Charles Manson, Jeffrey Dahmer, Theodore Bundy, and Albert De Salvo all have 13 letters in their names). There are 13 witches in a coven. In any case, here at Compare the Cloud we have a craving for superstition, and as today is Friday the 13th we figured we would indulge ourselves! So we're highlighting some true horror stories from the Cloud industry. The following contribution is courtesy of Neil Cattermull. Replication is the Devil! So, I recall a true situation from some years ago that could have easily been avoided. One major international organisation had dual redundancy for all of their core systems, tested regularly, with failover tests documented and publicised at board level. One frosty morning during the post-New Year celebrations, a certain power provider happened to be lurking around a particular street corner with a cordoned-off manhole cover. By lunchtime there was a creepy occurrence: lights were flickering, and random desktop computers were restarting by themselves – creepy. Then, as if by some cataclysmic coincidence, the office was plunged into darkness, shortly followed by a plethora of swearing. Somewhere in the distance, close to the fire exit, fluorescent jackets were wandering around ushering staff to the stairs. 
Fast forward to the IT floor – panic stations had kicked in and the clucking of acronyms was heard like battery hens in warehouses. Bang – the lights came back on and there were sighs of relief. However, only partial power had been restored, and not to the core server rooms in the basement. Communications had been established with the outside world but the big computing rooms were isolated and many of the core IT services had failed. Enter headless chicken mode, AKA Disaster Recovery. Disaster Recovery was invoked (all core services switched to run from the replication partner data centre) and amazingly, as per the test results from 4 months ago, core systems were restored within the next 3 hours. FANTASTIC! One whole day later, with the root cause fixed (a rogue, over-zealous electrician taken to hospital), the decision to revert back to the primary IT infrastructure onsite was taken. After vigorous testing and hours of through-the-night working by dedicated IT staff, the switch was flipped and normal working was restored! Some time later the help-desk calls came flooding in with similar descriptions... WHERE'S MY DATA!? Enter Stage Left: THE REPLICATION DEVIL. Within all the testing procedures, scripts, processes and sign-off sheets, the one thing that had not been tested was… fail-back replication of data from services running in disaster recovery mode. When testing in disaster recovery mode it had never crossed anyone's mind to check that failback replication of data worked – it had unfortunately been assumed to work. What followed for many weeks was a never-ending stream of support help-desk calls asking for missing data. The moral of this story: be prepared for failure, but also be prepared to fail back! Plus, a bonus from our fearless leader, Dan Thomas (the cloud company in question shall remain nameless). A salesperson in a very slick suit from a cloud organisation oversold the benefits of cloud and their company's services. The dream was sold to the end-user organisation of lower costs and manageability of all applications and services. The end customer, based on the salesperson's advice, moved all critical systems, including databases, and created a pool of hosted desktops. The customer dream wasn't realised the day the broadband connection they were using for a 50-person organisation collapsed. With no back-up line and a highly saturated connection that was next to useless, the company went 'dark' for 60 working days whilst a more secure leased-line connection was put in place. Moral of the story: cloud is useless without decent connectivity.   ### Nightmare on Vendor Street As the editorial team here were asked to pen their cloud nightmares for Friday the 13th, I thought I'd leave the others to do so from the client perspective and share my thoughts on the turmoil being experienced by some of the major technology vendors. The tech sector used to be simple. Solutions were a combination of hardware, software and services, and vendors competed with each other in some segments while cooperating and partnering in others. This relatively comfortable equilibrium has been thrown into chaos by the move to the cloud. While we are massive cloud advocates here at Compare the Cloud, we are also realists - cloud is still a small market within the overall IT sector and, while growing rapidly, the whole transition to the cloud will take some time. 
So let’s look at what’s happening to the (still very large) traditional technology markets - and you’re going to have to excuse a few sweeping generalisations in an article of this length: Hardware: while still a massive market, most hardware segments are experiencing either revenue or margin decline, or both. If you’re in one of the segments that are experiencing both at the same time (as some Unix vendors are), this is a real nightmare. Software: traditionally the technology segment with the highest margins. Essentially, once the massive upfront cost of development is covered, all incremental licence sales are almost entirely profit, and many firms have lucrative software support and maintenance businesses as well. The problem is the changing delivery model. If the architecture of your software is such that moving it to the cloud and making it a Software-as-a-Service (SaaS) offering is straightforward, or if you’re ahead of your competitors in doing so, then you’re sitting relatively pretty. If however you don’t have the resources or skills to move to SaaS, or your application architecture prohibits it, then again you’re facing a real nightmare. Services: for years vendors have been looking for the holy grail of replicable services that would break the linear relationship between headcount and revenue in services. Be careful what you wish for. The cloud has provided a wonderland of replicable services, but at the same time it is destroying margins. Clients will eventually learn, probably through bitter experience, that cheap is not always cheerful. Quality service and support comes at a cost, but it is an incredibly hard thing to measure. For vendors, differentiation is becoming harder to achieve, while for clients, telling the difference between vendors that are all promise and those that have the quality to deliver on their promises is key. Marketing is therefore becoming even more important than before - if you’re not marketing-led and you haven’t grasped new disciplines such as influencer marketing, content marketing and social selling, then you’ll soon be facing the biggest nightmare of all. So what about Cloud? It’s where the growth is (if not the margins yet). Almost all net new IT investment is in areas such as analytics, mobile and social, and almost all of this is cloud based. The vendor landscape will change: while all vendors want to establish a presence in the cloud, some are struggling to do so; others are growing their cloud revenues at an impressive rate; many are looking to partner with the host of smaller cloud players out there; and then there are the mega-scale cloud vendors that appear to have taken the lead. Cloud will become a bigger and bigger part of the overall IT market, but as it does competition will increase (and consolidation will occur). As we are seeing with today’s services vendors, marketing will become key. Finding a niche where you can add more value than others, and then being able to market yourself so that the right people understand the value that you have to provide, will make the difference between living the dream and experiencing a whole new nightmare. ### January #CloudInfluence UK and Cloud Bursters League Tables Now, having global statistics is all well and good, but let's be honest, we're all interested in the UK-specific data. So here it is: the top 10 UK #CloudInfluence individuals and the top 7 rising star #CloudInfluence individuals, those who featured in the charts with no prior presence. 
As far as the UK contingent go, the top ten is headed by Simon Porter from IBM, who is not only a regular blog contributor to Compare the Cloud, but must also be one of the most prolific executives on social media, having posted over 100,000 tweets. In 8th, nestled among the various CEOs listed, is Francis Maude, Minister for the Cabinet Office and Paymaster General, meaning that his boss David Cameron, the Prime Minister, fell outside the top ten in 11th. [table id=4 /] January's rising stars – people that came out of nowhere, having not really had much of a presence previously – included Roberto Medrano, Peter De Santis and Robert Faletra, all of whom featured prominently in the Global 50 #CloudInfluence Individuals. They also include Canadian Industry Minister James Moore, who claimed that Canadian geography was a real challenge for extending universal broadband access. He suggested that the U.S. target wouldn't be realistic or affordable for Canada, given their population density. [table id=5 /] Tom Szkutak, Amazon’s CFO, rose to prominence following excitement in the media that Amazon would be breaking out its Web Services revenue in future. Dr Chandra Sekhar, founder and CEO of TheRightDoctors.com, gained publicity for receiving $100,000 of credit for Google's cloud services and 24/7 gold support earlier this month as a part of the Google Launchpad to be held in February. As did Shreekant Pawar, co-founder of Diabeto, a hardware device targeted at diabetics to store and monitor blood sugar levels on a smartphone directly from a glucometer, as they received $40,000 from the same fund. Next Month Tune in again next month to see how all the firms and execs do next time, to see which new names appear, to see which ones disappear, and to see if Larry Ellison can break into the global top 20 or if David Cameron can break into the UK top 10. ### January #CloudInfluence Individual League Table January 2015: ComparetheCloud.net Global 50 #CloudInfluence Individuals For the purposes of the Global 50 #CloudInfluence Individuals we exclude journalists and other authors who tend to report other people’s opinions and focus instead on the key opinion holders themselves – those that tend to be quoted in any such articles. The top twenty influencers are an interesting mix of industry heavyweights which includes several CEOs and CFOs from the firms that are listed in our Global 50 #CloudInfluence Organisations. At 1st and 2nd we have the CEOs of Microsoft and Dropbox. In 3rd is Marc Rogers, the principal security researcher at CloudFlare Inc., who is also a former hacker and has been widely quoted recently on security issues. In 4th is Rodney Rogers, chairman and CEO of Virtustream. Having posted almost 10,000 tweets, he is the most prolific of all the heavyweight influencers on social media. Then in 5th and 6th are two IBMers: Virginia Rometty, IBM’s CEO (and the highest placed woman on the list), and Martin Schroeter, IBM’s CFO. The next three heavyweights are analysts: Daniel Ives at FBR Capital Markets is the highest profile investment analyst. He regularly issues research notes on the primary tech players and is quoted heavily in the tech and business media. Ed Anderson is research vice president at Gartner and the highest profile industry analyst. 
And Ross MacMillan is an Equity Research Analyst at RBC Capital Markets. We then have a number of other senior company execs from Microsoft, Oracle, Apple, Amazon Web Services and SAP. These are followed by Peter Kenny, the chief market strategist at Clearpool Group in New York; Jim Reavis, the executive director of the Cloud Security Alliance (CSA); and Roberto Medrano, the Executive Vice President at SOA Software – he was recently cited as one of the top 100 Hispanic professionals in the information technology (IT) field in the HITEC 100 List. Completing the top twenty are Bill Fathers, the Senior Vice President and General Manager for the Hybrid Cloud Services Business Unit at VMware; Amy Hood, the CFO for Office and Dynamics at Microsoft; and Peter De Santis, a VP at AWS Compute Services. This pushes Larry Ellison of Oracle down to 21st on the list, with other notable characters that appear including Edward Snowden at 46 – still quoted even in his absence – and Missy Krasner, the second highest placed woman and managing director of healthcare and life sciences at Box. January’s cycle of Q4 announcements from the major players, and the fact that almost all are playing up their cloud revenues, accounts for the presence of so many of these firms in the list of organisations, and of their top execs in the list of individuals. Notably absent from both lists are HP and its execs, as HP has a different quarterly reporting cycle. [table id=3 /] As the #CloudInfluence League Tables are released each month you will be able to track your progress in the influence charts. For more information, or consultation on improving your #CloudInfluence ranking, contact Bill@comparethecloud.net. ### Introducing the #CloudInfluence League Table From February 2015 onwards Compare the Cloud will be publishing a regular set of monthly #CloudInfluence league tables. While there are various free tools that can provide an indication of social influence, the Compare the Cloud #CloudInfluence league tables are based on far broader analysis of all major global news, blogs, forums, and social media interaction over the past 90 days. The January league tables provide a snapshot, taken on the last day of January, of the respective influence of both organisations and individuals over the last quarter. Published in early February, these league tables provide readers with unique insights into who the most influential organisations and individuals in the Cloud arena really are. January 2015: ComparetheCloud.net Global 50 #CloudInfluence Organisations The top 6 global organisations with influence in Cloud are, as you might expect, the major global Cloud and technology vendors. In order of influence these are: 1. Microsoft, which had a host of recent cloud announcements; 2. SAP, which recently announced strong 2014 results, saying that its Cloud business would drive future growth; 3. Amazon, which recently announced that one million customers actively use its Cloud services; 4. Apple, which recently posted historic first quarter results and record profits not just for Apple, but for any public company, ever! 5. Oracle, which also made a whole series of cloud announcements; and then 6. IBM, which reported disappointing 2014 results that nevertheless included strong growth in Cloud revenues. IBM was also the subject of significant negative sentiment following wildly speculative opinions about job losses that were patently absurd in their predicted volume. 
Thereafter the field opens right up, with analyst firm Gartner appearing in 7th, followed by VMware, the focus of much takeover speculation, in 8th, and with Amazon Web Services, the firm’s Cloud services business for which revenues will soon be broken out separately, appearing in 9th overall. In 10th was Akamai, which recently released its third quarter 2014 'State of the Internet' report. Just outside the top ten are Google, another of the major global Cloud players; IDC, the analyst firm; Amazon Webmail, the third entry for Amazon; and Citrix, another firm that was the subject of significant negative sentiment around potential job losses. The remainder of the top 20 was comprised of Adobe, Tata Consultancy Services, Yahoo! News, Google Drive (a second appearance for Google) and Hutchison Whampoa, the Hong Kong-based conglomerate which says that it's in exclusive talks to buy mobile network provider O2 to combine with its existing 3 network in the UK. [table id=2 /]  At 24 and 26 were Intel and Cisco – both of which harbour Cloud ambitions – along with telecom vendors Ericsson at 34 and Huawei at 39. Other notable appearances in the January 2015 ComparetheCloud.net Global 50 list of #CloudInfluence Organisations include: at 31, the European Space Agency, which has recently built a private Cloud platform with Red Hat, while IBM makes a second appearance at 43 with SmartCloud, as does Microsoft at 44 with Microsoft Office for Android. For those more focused on local markets, as we are here in the UK, the top half dozen #CloudInfluence Organisations in the UK were: 1. Aria Networks, a business in the UK that makes network planning and design software for the telecommunications industry, which made a series of Cloud oriented announcements. 2. GDS, the Government Digital Service, which announced that it had agreed a two-year extension with cloud service provider Carrenza to continue to be a primary supplier hosting GOV.UK. 3. Business Monitor International, which published a number of its Q1 2015 Information Technology Reports on cloud computing services in various international markets. 4. The UK government itself, after the Cabinet Office detailed a series of security questions for suppliers wishing to provide services through the upcoming sixth iteration of the G-Cloud framework. 5. Compare the Cloud, as we continue to provide constructive comment and informative insights on all things Cloud both in the UK and beyond. 6. The European Commission, as the Directorate-General for Informatics (Digit) at the European Commission released a tender for a framework to acquire its first cloud services. We hope you have enjoyed our first #CloudInfluence league table. Each month Compare the Cloud will be publishing the top 50 organisations and the top 50 individuals, as well as a spotlight on the UK's cloud influence. For more information on the #CloudInfluence League Tables please contact Bill@comparethecloud.net. ### Welcoming Bill Mew, a CTC Convert Compare the Cloud is delighted to announce the appointment of Bill Mew as Marketing and Communications Director. 
Bill is a tech industry veteran and experienced corporate marketing and communications professional with over 20 years spent working in blue chip organisations, mostly in pan-European and global communications roles. A former Weapons Engineering Officer in the Royal Navy and a former PR consultant who worked for a range of technology firms including Apple, Compaq, Oracle, Cellnet and BT, as well as for PM John Major on his election campaign, Bill is a seasoned tech marketing pro with a particular focus on the evolution of marketing, PR, social media and influence in the Cloud era. Bill is also a fifteen-year IBM veteran who, as Global Sector Communications Leader for IBM's largest and most profitable sector, Financial Services, was the first person based outside the USA ever to be given a global sector communications leadership role in IBM. Latterly Bill led social media as well as media, analyst and influencer relations for both IBM Global Business Partner and IBM Midmarket in Europe. A regular commentator on #Cloud, #SocialSelling and #InfluencerMarketing, as well as a dad with a passion for technology, economics, politics & Arsenal FC, you can follow Bill on Twitter @BillMew.

### Cost Flexibility in the Law Cloud

Cloud computing enables law firms to align IT costs and services more directly to the needs of the business. The effects of the financial crisis in 2008 on the legal industry have been well documented, and although the sector is now recovering there is strong evidence that the new competitive environment is here to stay. Present market conditions are requiring law firms to be more creative and imaginative in the way they plan their investments. The impact of this can be seen in a number of ways – some segments of the market have seen contraction and consolidation, while others have seen geographical expansion as firms look to exploit new opportunities in new markets. On top of these changes across the board we have seen significant merger activity.

A common theme arising in the sector is agility – firms that can adapt quickly are the ones that are best placed to cope with changes to their business model and exploit new opportunities. For law firm CIOs this means there is an imperative to reduce costs quickly if the firm is contracting, or to expand capability if the firm is expanding into new jurisdictions. By following the traditional approach to delivering IT infrastructure this becomes extremely difficult - after paying lawyers and renting business premises, a firm's IT infrastructure is typically its largest cost and can amount to approximately 5% of a firm's turnover. As this cost includes servers, networks, data centres, security, and teams of people to manage that infrastructure, the investment is essentially fixed. So if the firm contracts in size, it would still be paying for expensive infrastructure that it does not need.

Cloud computing provides CIOs with a radically different model; it allows the CIO to treat core IT infrastructure as a utility. With no requirement for capital expenditure on IT, the firm is able to make use of the hardware, processing capacity and storage that it needs on a pay-for-use basis. As a result, what was a significant fixed cost for the firm becomes an elastic, flexible cost. This flexibility is one of the main advantages of cloud computing – capability can be scaled up and down on demand with virtually zero capital cost.
This gives the CIO the ability to respond to the ever-changing needs of the firm much more quickly, and therefore allows for an IT service that is more closely aligned to the business requirements of the firm. Cloud-based IT services that are tailored closely to the business requirements of a firm not only allow for more cost-effective business, but smoother operations. Cutting out the capital expenditure benefits the company as a whole, and the utility approach to cloud services is a CIO's flexible, cost-effective dream.

### 8 Tips For Moving Business Data To The Cloud

Over the last few years, both consumers and businesses have become increasingly appreciative of the benefits brought by storing their data in the cloud. Moving business data to the cloud has many advantages – the data is available to everyone in every location, and the complexities of storing and structuring information are no longer a concern. Still, if you're planning to migrate your entire workload into the cloud, there are some things you should know to avoid unnecessary complications and additional costs this move might generate. Here are 8 tips for successfully moving your business data to the cloud.

1. Have a good reason for the move
Migrating your data to a cloud platform requires a clear purpose and a precise plan – without one you risk your whole organisation not being able to adjust to this change. Be clear when it comes to statistics – tools for measuring your progress are essential for demonstrating why the move to the cloud made sense from the strategic point of view. When changing your data storage tactics, prepare for a deep transition of the leadership model and possibly other aspects of your organisation.

2. Carefully evaluate the providers
The cloud isn't just a data centre; it's a complex service with an infrastructure managed by APIs. Choose a provider that boasts an impressive set of APIs that allow control of the data infrastructure in a cost-efficient and flexible way. Other important considerations are services like firewalls, antivirus and intrusion detection – they all attest to the quality of the data storage. Have a look around to see which providers offer secure environments that are easy to manage and perform well even during peak demand.

3. Understand the pricing
It's likely that your move towards the cloud is motivated by financial reasons. To ensure you benefit from the service, make sure you know your costs, both the hard ones (servers, infrastructure, licenses, vendor support) and those that require some measurement (the amount of valuable time your team spends on dealing with problems, manually entering data etc.). Knowing what is at stake will help you choose the best solution, clearing your path towards the cloud.

4. Consult an expert
Instead of delegating the task of moving your data to an untrained developer, find an expert consultant who will help you decide which solutions are best for your company, and how to perform the move successfully with as little disruption as possible. Even if this adds to your costs overall, it will be worth the trouble. Dealing with any data damage on your own will be far more costly.

5. Train your staff
It is important to make sure the person overseeing your moving process is an expert who can additionally train your staff. Some cloud providers offer training sessions; if this is the case with your provider, have one of your staff do the training, and later help you audit your progress and work with the service.
Train your team both before and after the transition. The skills required to migrate data differ from those necessary to keep the cloud service running smoothly.

6. Ensure data security
There's no reason for you to be overly confident with your cloud storage, certainly not at the beginning. Take all precautions to secure your data. Read through your user agreement and check whether the server or data centre of your choice is SSAE 16, SAS 70 and SOC 2 audited, and whether its clients are HIPAA or PCI certified. Firewalls and encryption services are a good sign as well. Before you upload any data that might be sensitive, think twice whether the cloud is really the best place to store it. In some situations, keeping crucially important information away from the virtual world can become a great advantage. Another important aspect of cloud security is passwords – don't just settle for ones that are easy to remember. Avoid reusing passwords across services such as e-mail, social media accounts and your precious cloud storage – you're just asking for someone to break into your account.

7. Beware of cloud lock-in
In order to take full advantage of all that cloud has to offer, you must avoid a situation where all your workload is locked in to a specific provider. Cloud lock-in is a significant problem encountered by businesses more often than you'd suspect – even if providers advertise their services as the most robust, cost-effective and most-supported on the market, it doesn't mean you won't ever have to migrate your data outside of their platforms. Before choosing the provider, make sure that all your data and workloads can be migrated through virtualisation, container-like solutions and fluid migration.

8. Have a backup strategy in place
Cloud is susceptible to downtime too – have your backup and disaster recovery strategies in place, making sure they have been updated since you migrated your data. Be ready to collaborate with your vendor in case of events like breaches or outages. Recovering from such events in the cloud is not what you're used to, so accept that there may be hiccups in the beginning of the process.

Migrating your business data to the cloud and gaining a full understanding of its potential won't happen in a day. One thing is clear – cloud adoption is a lengthy process, but it has benefits for the workings of your venture that are well worth the trouble.

### Finding the difference in cloud suppliers

The Cloud industry is continuing to expand rapidly, with new technologies and solutions being showcased, seemingly, every month. What struck me as interesting was the uniformity of suppliers. They're pushing out the same message to potential customers and it is becoming increasingly difficult, even for fellow providers, to work out exactly what services companies are offering. In an industry becoming saturated with the same marketing message, how is a customer expected to break through the noise and identify a quality hosting provider? This got me thinking, and in my opinion analysing two key areas – experience and investment – will put your business in a strong position to do so.

Investment, in terms of both technology and financial contribution, must be considered as a whole. More providers are now telling customers that they are offering a "full service", while not disclosing how they are actually achieving this. The first question one should ask is "what lies beneath?" How is the service you are looking to buy actually being delivered?
Everyone has a glossy storefront, but backstage all things are not equal. Are you dealing directly with the Cloud provider or is your supplier a reseller? With automated self-service portals being increasingly white labelled, everyone is a Cloud provider. This is not a problem in its own right, but it does add an extra layer to the service you are purchasing, meaning you need to check your provider's suppliers too.

You really need to find out what hardware the provider is running on. To create a basic hosting environment you only need a PC, some free virtualisation software and an Internet connection, and until things start to go wrong you will be blissfully unaware of your situation. A quality hosting provider will be using enterprise hardware, with resilient components. For example, a provider can either buy a 120 GB solid state drive (SSD) for £80 or an enterprise-grade version for £1000. The difference is that the enterprise drive is designed to last longer under an intense workload, whereas a consumer drive is not and could fail every six months, causing potential service outages. When extending this concept to two servers costing £500 and one costing £6000, it's not hard to appreciate the difference.

How is the supplier operating the platform? Do you have to compete with other users for resources? How does the platform deal with that contention? Some platforms handle this better than others and, as many home broadband users will know, services can be slow during school holidays. Fine, but can you afford for your applications to be slow due to similar events occurring which are out of your control? I have heard of instances where providers are contending memory by 4-8 times on a server. It is vital to make sure you get some performance guarantees from your supplier.

Where is your supplier hosting the service, in a data centre or in their bedroom? I once met a man who was hosting a Lotus Notes solution in his cellar at home over 6 ADSL circuits and was complaining about the rain. If you cannot see the platform, find out where it is. Finding out what Tier classification the data centre has should also give you a guide to its suitability. I would suggest that Tier 3 or 4 should be a target, Tier 4 being the best but still fairly uncommon in the UK. (Ed: ...strictly speaking a Tier 4 classification requires armed guards!)

Is their data centre based in the UK or internationally? Location can have implications for both privacy and internet performance. Did you know that if your data is affiliated with an American company it instantly falls under the US Patriot Act, which means the American government can view the information without your permission? As for bandwidth, it is a long way to America, so a service accessed over the public internet from the UK will be slower. I ran a few tests using a broadband speed check. The results on our fibre broadband connection were as follows:

| Route | Ping (ms) | Download (Mbps) | Upload (Mbps) |
|---|---|---|---|
| Leeds to London | 15 | 33 | 8 |
| Leeds to Dublin | 29 | 33 | 7 |
| Leeds to San Francisco | 157 | 22 | 6 |
| Leeds to LA | 172 | 15 | 4 |

However, this does vary depending on your broadband provider and the time of day. Originally published on Compare the Cloud 15 February 2013 by Richard May.

### Host Your Cloud In The UK

You run a successful cloud business outside Europe and are now looking to set up your EMEA operation. But where do you choose? The UK is in your top 5 choices but is it the best option? Or must you set up in Germany because you've heard German data can't be held outside the country?
Here's a quick rundown of what to expect if you set up business in the UK (and the key benefits of doing so over Germany).

The Economy – £ vs €
The financial climate is volatile and choosing a stable economy in which to establish your European operation is crucial to your success. The Eurozone crisis has left many European countries struggling to recover. 2015 is bringing further currency pressure in the Eurozone. Switzerland has abandoned the cap on its Franc against the Euro, the Greek election results have favoured abandoning austerity measures which could further destabilise the Euro, and the European Central Bank has announced €1.1 trillion of quantitative easing in an attempt to boost the Eurozone. The UK is looking fairly stable by comparison. In fact, the World Bank GDP growth rates for 2013 illustrate that even Germany has found the struggle much harder than the UK – Germany's GDP increased by only 0.1% compared to 1.7% for the UK. Also, the World Bank forecasts that the percentage increase in GDP from 2014 to 2015 will be 2.9% for the UK but only 1.1% for the Eurozone.

Corporation Tax
Only two things in life are certain and one of them is having to pay tax. There are ways to reduce your tax burden, of course, as demonstrated by more than a few multi-national companies recently. You could establish your business in a tax haven. Or you could set up several international divisions to spread your tax liabilities between them, favouring the country with the lowest rate. Not surprisingly, tax forum shopping and the taxation of multi-national businesses is a big topic for governments. Assuming you'll have to pay corporation tax at some point, you'll need to take advice from specialists, as tax is often complicated and time-consuming. In fact, the World Bank ranks the UK 52 places ahead of Germany in terms of ease of paying taxes. Apparently, companies in Germany spend 218 hours on average per year dealing with tax-related issues, whereas UK companies spend only 110 hours. It's not just how long but how much. An overseas company with a branch or office in the UK can expect to pay a single yearly payment of corporation tax at a rate of 21% based on profits from UK activities. In Germany the position is more complicated, with a company paying two different taxes. First there's trade tax, which is set by the local authority. For larger towns, rates are between 14-17% and for smaller towns, rates are between 12-16%. Then there's corporate tax at 15%.

Law and regulation
The EU as a whole is a fairly law-abiding region. While this is obvious, sometimes it is worth remembering the enormous value in doing business in a jurisdiction where the rule of law is recognised and upheld by the vast majority of residents. In terms of the laws and regulations themselves, naturally, depending upon your view, there is either too much regulation generally or not enough. Or at least, maybe there is not enough cloud-specific regulation. Given cloud is a regional or even global sector, it is important that regulators think about how to adopt solutions on a regional or global scale for them to work. The calls to establish a European Cloud could put the UK and Germany on a level playing field with each other and possibly give them an advantage over cloud providers outside Europe when selling to customers inside Europe. When I'm asked for my advice on this, I say it's best that regulation is not rushed and not too specific on the technology, as otherwise it will be out of date the minute it's passed into law.
Also, these issues are better dealt with on the basis of an open global market. Having isolationist or protectionist policies is probably a backward step when the market is global. But, having a base inside Europe is better than not having one. So what about the "red tape" factor and how far this slows down the ability to do business? There is not enough space for me to go into detail on all the regulations that will likely affect you if you do business in the UK and compare them against those of the other EU countries. But the World Bank has ranked all the countries in the world in order of how good they are for doing business in 2015. At number eight, the UK is the highest Western European country, one place behind the US and six places ahead of Germany. Again, the UK is a strong contender to establish European operations.

Data security and sovereignty
Data security and sovereignty are key topics in the cloud sector but ones which are often widely misunderstood. Most people know that the collection, processing and storage of personal information about a living individual, including an email address, must be compliant with EU data protection laws. After that, general awareness becomes more patchy. I regularly advise on data protection issues and often end up correcting misunderstandings of the law. For example, I've had non-lawyers tell me that UK law prohibits the transfer of data outside the UK. The law doesn't say that. I've spoken to US cloud providers who say they have been told they must set up their EMEA operations in Germany as otherwise they will be excluded from trading with German customers. That's also not the case and I have set them straight on that. Sometimes these misunderstandings can be explained by businesses which adopt a data policy that is more restrictive than the law. This stricter interpretation leads those in the business to believe the policy says the same as the law. Other times, it is simply misinterpretation by a business to suit its business interests. After all, keeping data in the country of origin can provide a boost for local providers to the exclusion of providers outside the country. It's worth remembering that it is the customer who will be primarily responsible under data protection laws. The cloud provider will normally store and process data on behalf of the customer and will have to follow the customer's instructions.

The new data protection regulation is aimed at truly harmonising the position throughout the EU. Until then, there is the perception that the approach to data protection is far stricter in Germany than in the UK. It's true, there are some key differences between the UK Data Protection Act and the German Bundesdatenschutzgesetz, and in my view, they favour a UK operation rather than a German one. For example, BDSG requires express consent in writing in advance from the data subject whereas the DPA does not require explicit consent in the same way. Further, if a company has a data security breach, under BDSG the data controller must inform the authorities and the data subjects, but no such general obligation currently exists under the DPA – this will change under the regulation.
Finally, under BDSG, in the absence of explicit consent, there must be a legal basis for the transfer of data outside of the European Economic Area and the data recipient must ensure an adequate level of data protection. Under the DPA, as long as the data protection principles are complied with and the country where the information is being transferred to has an adequate level of protection for data subjects, a data transfer agreement is enough to legitimise the transfer.

Culture and language barriers
It is always a risk doing business in a country where you don't have an awareness of the culture or an understanding of the language. English is the second most widely spoken language in the world (after Mandarin) when you include those speaking it as a second language. The English language also dominates the EU, with 52% speaking it natively or as a foreign language compared to 27% for German. Of course, English is the most widely spoken language of the USA too, the home of many cloud providers and currently the world's largest economy.

Localise your contracts
It's also important to ensure you get your contracts properly drafted, so that you not only protect yourself against risk but also give the customer some reassurance. US public cloud contracts are notorious for shifting most of the risk to the customer – "as is" service, poor service credits and a complete exclusion of all liabilities, including for loss of data. This may work fine in the US, but doesn't sit well in the UK where you may struggle to enforce a complete exclusion of all liability. Customers want a more balanced approach that protects them too. I regularly advise customers looking to migrate to the cloud and these are exactly the issues I warn them about. When cloud providers instruct me to draft or update their cloud contracts, I flag these as areas for consideration. Providers are keen to stand out from a very crowded market, and selling better-provisioned private or hybrid cloud that addresses these issues can be a great way of doing this. Therefore you should make sure you localise your cloud contract to ensure you don't generally fall foul of English law.

If you're looking to set up your European or EMEA cloud operation, the UK is the natural choice. The UK economy is growing faster than the Eurozone, it has a thriving cloud sector and an affordable and straightforward tax regime. It also has a strong legal system and a fair approach to cloud contracting. And, in a world that is increasingly using English to communicate, you must not underestimate the importance of a country with 64m people where the main language is English.

### IBM z13: A Journey

The year 2015 began with something big. The IBM z13 launch. I want to take a moment to reminisce about IBM - 100 years of technology and a mainframe which has been through several iterations in the past 50 years or so. A company with an incredible past, and now with the z13, an exciting future. The circa $1bn reimagining of their zSystems computing juggernaut, the z13, has taken us on quite the journey, so humour me as I indulge my nostalgia. Over the past 12 months we have had the privilege of seeing behind the scenes of this latest release - from visiting the labs in Poughkeepsie, to the final unveiling in New York in early 2015. Having met those behind the mainframe, I can say the dedication to creating something relevant and exceptional is unquestionable; but for a generation looking to fully embrace the cloud and all its facets, has IBM hit the mark? The answer is a resounding yes.
In a world of inane cloud marketing, IBM have something to actually say about the cloud. And they're saying it loudly with z13. In this author's humble opinion, IBM are shouting loud - this is cloud for grown-ups. A serious investment for a serious cloud offering. This is a big deal for IBM - not just an upgraded product, but rather a reimagining of their mainframe services for the new generation: a generation of enterprise and consumer users focused on mobile, analytics and big data. The specifications of their system are eye-wateringly impressive - boasting security, capacity, speed and resiliency all from a single box. The reach of this system now extends not only to the enterprise, but also to the managed service providers who are embracing zSystems technology for their own customer bases. More detailed information on the z13 can be found here: http://www-03.ibm.com/systems/z/hardware/z13.html

As part of their global launch IBM held an event on the 27th Jan 2015 at the Mayfair Hotel, London, to launch their new generation of zSystems, the z13. See below for our 60-second video of the event. https://www.youtube.com/watch?v=GyHLOMbaois

### Are you ready for EU Data Protection Regulation?

New EU data protection regulation is expected to come into force this year, replacing the Data Protection Directive adopted in 1995. Aimed at strengthening consumer and business trust in Europe's digital economy, the legislation will address some key security and privacy concerns. The implications for businesses and Cloud providers are vast, yet research by Skyhigh Networks suggests Cloud providers are poorly prepared, with only 1 in 100 currently meeting the forthcoming regulation's requirements. So what can we expect from the new regulation and how will it affect the Cloud computing market?

Extended Scope
The proposed regulation will apply to all European businesses and any business outside the EU holding personal data on EU citizens. Companies doing business with the EU, such as Cloud providers, will also need to comply with the regulation.

Overview of Changes
Data Residency: Strict data residency rules will prevent companies from storing or transferring data through countries outside the EU where equally strong data protection standards are not upheld. Currently only 11 countries outside the EU satisfy these privacy requirements and the US, where two-thirds of all cloud data centres are headquartered, is not among these.
Right to Erasure: The new legislation requires businesses to delete personal data on individual request. Ensuring all copies of this data are deleted may prove challenging, as 63% of cloud providers currently maintain data indefinitely or have no provisions for data retention, and another 23% maintain the right to share data with third parties.
Responsibility and Security Breaches: Under the proposed regulation, liability for data breaches and violations will be shared between data controllers (businesses that own the data) and data processors (such as cloud providers that store the data). This means that businesses, as well as Cloud providers, will share responsibility for data management.

Implications for Businesses
Increased business liability and heftier penalties, of up to 5% of annual turnover or €100m, will cause business leaders to take note. Data governance and risk analysis will become key responsibilities which, in turn, will have time and budget implications.
Selecting the right Cloud provider will be critical and businesses will need to carry out due diligence to ensure privacy and security compliance. In addition, any security breach will need to be notified to the EU regulatory authority within 24 hours. This could prove challenging, as businesses are often unaware of breaches with their Cloud providers. Instead, encryption may be an attractive option, providing another layer of protection and complying with the 1998 UK Data Protection Act, which allows breach notification to be bypassed if data is inaccessible to third parties.

Implications for Cloud Providers
Cloud providers are arguably most affected by the legislation, yet least prepared. Increased security requirements and administrative burdens will require disclosure of data processing and security details. Permission will also need to be sought from clients before enlisting any third-party services, and all data will need to be returned on contract termination. In terms of data residency, large global providers, such as Microsoft and Amazon, are already making provisions to maintain data within the EU by setting up European data centres.

The landscape for Cloud computing is changing. The Europe-wide regulation will drive up security levels and increase accountability. Although an exact date for the new regulation is yet to be confirmed, businesses and Cloud providers must put in place the building blocks now, to be fully prepared for when the legislation does arrive.

### A Hybrid Cloud Quickie

https://www.youtube.com/watch?v=Nb9-5QxQHXo

Today I am looking briefly at the usefulness of a hybrid cloud solution for your business. Hybrid cloud can benefit organisations, primarily by removing the burden of certain operational challenges facing businesses. Let's take disaster recovery as our first example. One of the biggest advantages cloud environments have when compared to on-site technology is their resiliency - being cloud hosted provides unparalleled safety with regard to office-based disasters. With the hybrid extension of the data centre model, choices can be made on location with regard to levels of resiliency, and also around the replication of your enterprise IT environment to alternate locations. If your office burns down, and your data is stored elsewhere, your business isn't going to suffer anywhere near as much as if you had lost everything.

For example, if a business is regulated by a governing body such as the FCA for financial institutions, disaster recovery is mandatory. The level of cover required differs depending on the particular markets that are operated within. The office could be located within the heart of any financial district, with its technology centralised in a data centre environment. This in turn can have replication partners hosting in other data centres, at other geographic locations, even in different countries. This replication ensures that, if a disaster did indeed happen, choices can be made on whether your IT services can be resumed and, if so, how quickly.

Another good example of hybrid cloud is B2B collaboration opportunities. With the emergence of software defined networking, additional services can be delivered from the same data centre, or even from other providers that have niche services for your business. In this model anything 'aaS' can be delivered securely from independent software providers and IT hosting companies, as well as other firms that may well benefit from business to business collaboration.
Now let's look at some examples of this form of hybridisation in action. Example one would be a business hosting their core infrastructure within a data centre, and having a need for a specific business application that is delivered from only one provider, and not through a SaaS model. Being involved in a hybrid B2B collaboration means they would still be able to access their application, despite the lack of SaaS. Example two would be a specific type of recruitment packaged application. A secure networking connection could be employed through the use of hybrid cloud connectivity services – simply extending the enterprise to the independent software provider's infrastructure with an easy way of deploying, monitoring and managing the process. And finally, example three of the hybrid cloud model's usefulness is its ability to transform application delivery with disparate data access. Several models that utilise this method already exist today within the emerging application marketplace stores. These cloud marketplaces are similar to the all-too-familiar smartphone application stores.

Looking at hybrid through another lens, imagine that due to restrictive data sovereignty laws it is mandatory that personal data has to reside within your country of residency (fast becoming a reality for some in the EU). However, the applications, specialist IT infrastructure and key business services you need are not available within your territory. This would pose a real business challenge for any organisation. Online application marketplaces publishing thousands of applications from overseas locations are helping to avert these issues. Through use of hybrid networking techniques linking local provisioning of data and storage resources to the required marketplace applications, the appropriate governance can be achieved. This is similarly true for on-boarding applications to a business marketplace from a vendor's standpoint. A single independent software vendor based in one location could achieve global status quite literally overnight by being present in a marketplace. So, when used appropriately, the Hybrid Cloud model is a true business enabler, complementing key business services and in line with the flexibility that cloud promises.

### The biggest computer on Earth

Think the Internet is big? It's beyond big. There are more devices on the Internet than there are people on Earth. There are now at least two billion PCs, a billion mobile phones plus 4-5 billion servers, consoles, tablets, smart TVs and embedded devices, all joined together by enough cable to reach to the Moon and back 1,000 times over. All that hardware could fill every football stadium in Britain to the brim – and it uses more electricity than the whole of Japan. The Net has become a vast, multi-trillion-dollar planetary machine; the most expensive and complex thing that humans have ever made. And everything about it is growing. Over 1.5m smartphones are registered daily; 5x the human birthrate. CPUs and GPUs are finding their way into everything from fridges to teddy bears and Moore's Law continues to hold; computing power keeps doubling every year or two, making today's laptops quicker than yesterday's supercomputers. The sum total of the Internet's home computing power is almost beyond imagination. It's also over 1,000x greater than all the supercomputers on Earth combined. And we're wasting almost all of it. Sure, all those billions of PCs, tablets and phones are "being used". A third of all PCs are never switched off.
But even though we are using these devices constantly, at any given time the average CPU load across the Internet is less than 2%. Unless video-encoding or playing the latest 3D game, the typical PC is almost completely idle. People don't type at GHz speeds or view holiday photos at 60 fps. As processors keep getting faster and more numerous, the ratio of used-to-idle computing continues to increase. Almost everyone has more than they need – almost, that is, because some people can never have enough; physicists, biologists, engineers, climatologists, astronomers, chemists, geologists – pretty much anyone doing fundamental research. Science and industry spend billions on ever-faster supercomputers for this reason; they have become indispensable to our modern way of life. Cars, planes, medicines, oil wells, solar cells, even the shape of Pringles crisps was designed by supercomputer. They are the most useful and productive research tools ever made.

But they don't last. The IBM Roadrunner, the world's fastest computer in 2009, was decommissioned in 2013 because it was obsolete. It cost over $100m, as will its successor. The owners, Los Alamos National Lab, can use the same floor space and energy far more efficiently with new hardware. Like all supercomputers though, Roadrunner's limited shelf-life was unavoidable. Computers do not age gracefully.

Contrast this with billions of idle CPUs on the Internet that are continually being replaced by their owners. Broken devices are repaired, old ones upgraded and more are constantly added. The Internet is unique among machines: effectively self-healing, self-improving and permanently switched on. Parts of it may turn on and off every second but, considered as a single system, the Net has 100% uptime and it always will do.

Using the Internet as a platform has been happening for years. UC Berkeley's BOINC software has enabled dozens of science projects to harness over $2 billion's worth of donated CPU time over the last decade, from over 6 million idle PCs. The concept of volunteer computing is technically proven; the only issue is persuading device owners to allow it. Considering that the Internet is wasting over $500m in unused computing per day, it is certainly an endeavour worth pursuing.

It is true that many HPC and cloud applications are not suitable for heterogeneous WAN computing. Low latency is out of the question (even with that new 99.7c hollow optic fibre) and highly secret data is unlikely to ever leave its owner's building without military-grade homomorphic encryption. But there are millions of tasks that are perfectly suited; Monte Carlo simulations, parameter sweeps, climate modelling, rendering – anything that doesn't need InfiniBand, shouldn't be queuing for HPC and will happily use public resources to gain 10x-100x more computing for the same budget. Using the Internet to compute also permits a whole new class of tasks which might prove to be the most interesting – and the most valuable – of all: those which can only be served by a global grid of millions of CPUs working together. For some applications, $100m supercomputers will always be just too small. Originally published on Compare the Cloud 2 April 2013 by Mark McAndrew.

### Awful Cloud Marketing - Introducing Cloud Muscle!
### Virtual Technology Cluster accelerates innovation

Restoration Partners launches virtual technology cluster alongside Lockheed Martin to enhance UK cyber security. The joint initiative accelerates development and adoption of mature technologies. Restoration Partners, working with Lockheed Martin UK, a wholly owned subsidiary of the global security and technology company, has established a virtual network to allow UK-based innovative solutions to grow and mature to counter the threat from cyber attacks. The Virtual Technology Cluster (VTC) brings together industry, academia and the investment community. The aim is to accelerate innovation and boost growth among the UK's home-grown cybersecurity technology developers. The VTC will provide small, specialist businesses, which might find it difficult to get the right support, with a ready-made ecosystem of capital, professional services and suppliers relevant to their field. Through a multi-tiered arrangement, SMEs can trade with larger companies and benefit from growth support or share intelligence, experience and ideas. Building on more than £40m of investment in cyber facilities at Farnborough, Hampshire, including a £30m innovation lab which is available for use by small and medium-sized British companies, Lockheed Martin has invested a further £250,000 to engage with the cyber supply chain and take the VTC concept through pilot to launch.

Ken Olisa OBE, chairman of Restoration Partners, said "The fight for cybersecurity is an asymmetric one: small nimble groups pushing innovation in pursuit of destruction versus nation states, businesses and citizens seeking to go about their lawful business. Fortunately, the UK is home to a growing number of entrepreneurial innovators who are devoted to defeating the cybercriminals." "The Virtual Technology Cluster is a pioneering initiative," Ken Olisa added, "nourishing those innovators and connecting them with Lockheed Martin's supply chain so that solutions to the ever-present threat can be implemented at the fastest speed possible."

Stephen Ball, Chief Executive for Lockheed Martin UK, said "We recognise the challenges faced by SMEs when turning their innovations into products, and taking those products to market. We also know how crucial these niche businesses are to the supply chain of companies like Lockheed Martin as well as to the security of the UK. Our VTC is a new way of working with suppliers to develop a comprehensive network that will stimulate growth in this crucially important technology sector." Visit the website for the full Lockheed Martin press release.

### Pitch your cloud business

Giving cloud businesses a chance to pitch their services, CloudPitch is now part of the Compare the Cloud website. Here at the start of 2015 the market is already looking up. Cloud businesses – including managed service providers, data centres, co-location services, hosts to web apps and even connectivity specialists – have more opportunity than in any previous year. At Compare the Cloud we have just added another opportunity with the re-launch of CloudPitch. We have championed the adoption and value of cloud computing to businesses. And our website helps companies select services and providers best suited to their needs. With CloudPitch, now an integral part of our website, we are giving cloud businesses a chance to pitch their services. To add your business, simply submit your pitch. Once your business is listed you and anyone else can vote it up.
The more votes received, the higher your pitch will be in the listing. In the future we will be featuring the top business of the month to our readers. Be warned: at the end of every month the vote counts get reset to zero. To stay top you will need to keep voting. Have a great cloud business? Pitch it. https://www.youtube.com/watch?v=7hr56b6oTas

### Special #1 – The Fixes

[embed]http://traffic.libsyn.com/comparethecloud/The_Issues_Special1.mp3[/embed]

### The cloud is bigger than computing

Whether you consult Gartner, canvass the providers or poll the marketplace, the fact is that more businesses are investing in cloud computing. It is a profitable business with revenues set to rise faster than the recent drop in oil prices. But this boom is not attributable to the benefits cloud computing promises. This boom belongs to the cloud alone, not computing. Cloud computing has become a hackneyed phrase that has served marketing well but now should be retired. What we so often call cloud is really a group of services – enterprise IT – that manages network access, manages the devices that use our networks, and provides the provisioned services that make business systems interoperable. And we shouldn't forget the benefits of data, analytics and machine learning - Big Data and the Internet of Things. The cloud has become a catch-all descriptor for all these services and more.

The term computing is encumbered. It reminds us of an era where power was measured in megahertz. Speed was a hallmark of the 20th century. Back then the silicon, the microprocessor, was important. But when we first cut our physical ties to the desktop PC with Wi-Fi, our IT requirements began to change. The rise of mobile phones and mobile devices only hastened the pace of change. Effectiveness and enablement are the measure of today's technology. In some ways the term computing has already begun to make its exit. To the public the cloud is a place where we store data. To the industry the cloud is often a network or application that manages provided services. Hybrid cloud deployments – mixing public access with private security, hosted applications, online storage and centralised end-point management – are helping businesses become more efficient across greater geographies. The success of the cloud is clearly in its application, not its speed. The potential and the rising opportunity for MSPs providing these services is fuelling the boom. As we near the end of prediction season, and before the year truly gets underway, we should turn our attention to our own terminology. Computing doesn't sell connectivity - you could say it doesn't sell anything. Cloud doesn't need computing. With so much growth it is time to divorce the suffix and let the cloud reign.

### Top Trends and Concepts for 2015

Previously we've spoken about the top technology for 2015, so now we're going to look at the trends and concepts that are coming into the forefront of our vision. 2014 saw the rise of big data as a trend in the tech world; the future is really moving more towards the ways in which we will analyse and employ the data that we have collected.

Trend - Commodity analytics
Advanced, Pervasive and Invisible Analytics, Gartner calls it; I say Commodity Analytics and I stand by this. I'm not cheapening the hyper-complex and fine work done by the software and data scientists; if anything, they have done such a good job of unlocking and demonstrating the value inside everyone's data that we all want a piece of it.
Whether I want to know where I stand for profile views inside my LinkedIn network, or what might be the next big technology trend based on a million-value sales report from 2014, analytics, in the broadest sense of the word, is more and more a part of our lives. I was reminded of just how much recently when I opened my phone and checked the weather for London and was presented with a simple icon representing that it was going to be dismal. However, upon thinking about what that symbol represents from an analytics perspective, I soon cheered up: the crunching of data from thousands of sensors, past and present weather data for this time of year, plus satellite and radar inputs, all on one of the world's most powerful computers. I then checked the travel route I was planning on taking; once again not great, but this was based on real-time traffic information from speed sensors along a stretch of 250 miles of the UK, giving me a near minute-perfect estimate of my journey. I got out of bed with renewed vigour.

All this demonstrates that analytics is becoming truly commoditised. No longer the preserve of the enterprise boardroom, this technology is now feeding everyone all the time to help them make better micro decisions, as well as major ones. Do I take a coat or not? Do I go on the M1 or the A1? This all helps make Analytics a bigger slice of the IT sphere, and whatever we thought we knew about the analytics market in 2014 will be child's play compared with 2015.

Trend - Computing everywhere
The saying used to be, "That calculator has more power than it took to send men to the moon." Now the saying is, "That cheap, relatively low-end smart device has more power than my top of the range laptop 2 years ago." Computing power is everywhere and the proliferation of high-powered mobile computing has helped to give application developers much more freedom when developing mobile apps. Dual-core chips have become commonplace, along with upwards of 2GB of RAM and even 128GB flash disks in phones. What does this mean for the IT industry? Simple: mobile computing and mobile as the primary business tool are now the reality. If you don't have a dedicated mobile app to mirror or complement your on-premise or cloud-delivered service, you will be left behind. The only thing that's letting these devices down (more than battery life) is the mobile networks they now rely upon: they are not fast enough and don't have enough throughput or coverage. But I am not one for just complaining; this is where the opportunity lies. Building mobile apps that give the feeling of real time whilst using what little bandwidth is available, relying on local computing power and storage to cache and process information bound for the cloud, can be the intermediary step.

Trend - Commoditisation of development
As infrastructure skills become obsolete, the IT department is evolving into a development house. Development isn't the 'dark art' it used to be, with the UK government stating that coding classes will start at 4 years old from 2015; the only way is 'up' for the quantity and quality of developers in the UK's future. In the here and now, and the next year specifically, we can expect to see development becoming much more commonplace in the SME. A good question is what will they do?
As supply chains become more complex and time-to-productivity becomes more and more important, developers can help organisations to gain a competitive edge by developing systems that allow them to react quicker, deliver sooner, and be better informed than their competition. How do they achieve this? Well, when you are starting from a blank sheet of paper the world really is your oyster. The best thing is that in most cases an organisation gains this competitiveness for less than buying the equivalent off-the-shelf product, let alone the lower costs of maintaining the software. Don't get me wrong, it isn't as easy as I have just made it sound. The development function in an organisation requires an ecosystem of people. Strong project management is critical, as are developer operations (DevOps) personnel, but done properly the ROI of this model can be the difference between a company being relevant in 2016 or not.

Concept - Hybrid cloud
Cloud computing is worth $121bn. Hybrid cloud's predicted worth is $1.85tr by 2017, accounting for 50% of all its spending, according to Gartner. Hybrid cloud, love the name or hate it, is here and it's going to be the big cloud enabler of 2015. As CIOs look for a way to make cloud (and I mean "public" cloud) a part of their architectures, hybrid is going to be that way. Hybrid cloud will enable cloud bursting for compute and storage, and with Software Defined Infrastructure (SDI) to smooth the network, shared systems and security governance will allow this to be achieved with just a few clicks. However, a few things will need to be overcome, the first one and the biggest being cost! One of the contributors to confusion in the cloud has been the perception that it is fundamentally cheaper than on-premise. This can be true, but rarely, as it can often have higher initial implementation costs. To really take advantage of the cloud you should take the time to re-engineer your application architecture. This sounds scary but isn't. Secondly, there is the disservice some operating system migration tool vendors have done in making enterprises think that they can just lift and shift their operating systems – drivers, apps, security and all – into the cloud. The very thought sends shivers down my spine! When moving to the cloud the first thing I tell anyone is: "The cloud is not a one-size-fits-all solution." The first thing to do is identify what is cloud-ready and what isn't. This will help you to start to realise the true benefits and drawbacks of the cloud. Understand what the cloud can offer your applications (or even single elements of your applications), which ones should be in the cloud, which ones shouldn't, and then architect for this. Once you do, hybrid cloud becomes your gateway to the best service delivery with a highly resilient infrastructure, near unlimited scale (dependent on provider), and spin-up/spin-down commercial flexibility.

Concept - Cloud/client architecture
Client/Server, you're so 1990s… Client/cloud architecture is becoming more and more prevalent in the modern world of application delivery. Look at salesforce.com, Dropbox and Office 365, to name but a few. I am aware they have servers in the back end, but the point is that enterprises are no longer connecting to a dedicated server, IP address, or, even in most cases, URL. It's shared on a massive scale and delivers to the mobile device on demand, anywhere, any time.
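To make that contrast concrete, here is a minimal, purely illustrative sketch of a client written for this model (the endpoint URL and response shape are hypothetical, not taken from any of the services named above): the client knows only a shared, provider-managed endpoint, never a dedicated server or IP address.

```python
# Illustrative cloud/client sketch: the client calls a shared service endpoint.
# The URL and API shape below are hypothetical assumptions for this example.
import json
import urllib.request

API_ENDPOINT = "https://api.example-cloud-service.com/v1/documents"  # hypothetical

def fetch_documents(auth_token: str) -> list:
    """Ask the shared service for this user's documents. Any of the provider's
    servers may answer; the client neither knows nor cares which one."""
    request = urllib.request.Request(
        API_ENDPOINT,
        headers={"Authorization": f"Bearer {auth_token}"},
    )
    with urllib.request.urlopen(request) as response:
        return json.loads(response.read().decode("utf-8"))
```

Which physical machine answers the call is the provider's concern, not the client's – and that, in a nutshell, is the cloud/client model.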
Concept - SDE (Software Defined Everything)
Software defined data centres seemed to come of age last year – by this I mean in the world of the SME. It's the typical story of a product pioneered due to a specialist need, in this case the need of Internet-scale organisations, namely NASA and Rackspace. Open sourced, then rapidly developed to far beyond the required capabilities, then adopted by the MSP world at large. Over the last few years, all the large vendors have developed a play of some description. The project as a whole has matured dramatically. Services companies like Mirantis have sprung up to provide professional services around the technology. All of this helps to de-risk and make adoption easy and straightforward for the end customers that are going to turn it from top of the CIO agenda to de facto in the SME data centre.

Concept - Risk-based security and self-protection
To maintain a safe, open and secure cyberspace we must look to new methods that will keep us ahead of the threats in 2015. We have already seen cyber weapons developed which derailed Iran's nuclear programme by two years, and viruses that are so well written that even the most advanced security companies can't remove them. So what is 2015 going to look like? The truth is we won't know until it attacks us – this is sadly the nature of the beast – but we can best prepare for the future. Perimeter security is no longer enough; Intrusion Detection Systems (IDS) do a good job at being abstracted from the application but lack context-awareness. What is required for the ever-dangerous cyber world we live in is a security approach that automatically links context-aware security to abstract security, giving the best possible hope for the user. These systems represent self-protecting applications, operating systems that are paranoid about their applications, and inline dynamic perimeter security that is just plain schizophrenic. We need all of this while improving the user experience, and giving more access and flexibility to increase productivity. No small task… The truth is, this is my guess, but who is to say that tomorrow some unknown technology won't come along and disrupt everything, and that is exactly what I love about our industry.

### Benchmarks of a good hosting provider

Horse meat burger, cod and chips, a bacon sandwich, cloud hosting providers... which is the odd one out? The bacon sandwich, of course. Last year's well publicised food scares aside, the point is that often products are not quite what they seem. So in the list above, why is the bacon sandwich the odd one out? Well, to date, we have yet to see a successfully counterfeited bacon sandwich on the market. Often cloud computing hosting providers are also not quite what they seem. Lots of promises are made on websites as to how good a service is, but on further exploration is the service actually what it seems? It is useful to explore some of these claims.

99.9999% or 100% uptime guarantee
A Service Level Agreement (SLA) is a measure of availability, which is typically described through percentage statements such as 99.999% SLA.
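Purely as an illustration (a minimal Python sketch added here, not part of the original article), converting an advertised availability percentage into the downtime it actually permits is a simple calculation:

```python
# Illustrative only: how much downtime an availability percentage permits.
HOURS_PER_YEAR = 365 * 24

def allowed_downtime_hours(availability_pct: float, period_hours: float = HOURS_PER_YEAR) -> float:
    """Downtime (in hours) an SLA permits over the given period."""
    return period_hours * (1 - availability_pct / 100)

for sla in (99.0, 99.9, 99.99, 99.999):
    yearly = allowed_downtime_hours(sla)
    monthly = allowed_downtime_hours(sla, HOURS_PER_YEAR / 12)
    print(f"{sla}% uptime -> {yearly:.2f} hours/year, {monthly * 60:.1f} minutes/month")
```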
When these statements are explored in real time, this is the outcome:

| Availability % | Downtime per year | Downtime per month | Downtime per week |
|---|---|---|---|
| 90% ("one nine") | 36.5 days | 72 hours | 16.8 hours |
| 98% | 7.30 days | 14.4 hours | 3.36 hours |
| 99% ("two nines") | 3.65 days | 7.20 hours | 1.68 hours |
| 99.5% | 1.83 days | 3.60 hours | 50.4 minutes |
| 99.9% ("three nines") | 8.76 hours | 43.8 minutes | 10.1 minutes |
| 99.99% ("four nines") | 52.56 minutes | 4.32 minutes | 1.01 minutes |
| 99.999% ("five nines") | 5.26 minutes | 25.9 seconds | 6.05 seconds |
| 99.9999% ("six nines") | 31.5 seconds | 2.59 seconds | 0.605 seconds |
| 99.99999% ("seven nines") | 3.15 seconds | 0.259 seconds | 0.0605 seconds |

This information is all very good, but without looking at the fine print it is actually a meaningless and empty promise on the part of the provider. Failure to meet an SLA is usually backed by a penalty clause, and penalties as an industry standard are not normally too onerous for service providers. Traditionally, the provider gives a percentage of the overall monthly fee back to the user, depending on the length of the outage. Below is a table showing what a customer could commonly expect to reclaim:

| Service availability during monthly review period | Service credits as % of monthly rental charge |
|---|---|
| <99.9%-99.8% | 5% |
| 99.79%-99.5% | 10% |
| 99.49%-99.0% | 20% |
| 98.9%-98.0% | 30% |
| <98% | 40% |

The calculation is: ((Total hours - Total hours unavailable) / Total hours) x 100

Other important factors to note when looking into the SLA include:
1. The penalty often does not refer to the service as a whole. Your business may have a web server but the SLA may only be for either network or server availability, so the fact that your website is not working may not be relevant.
2. The advertised SLA does not actually match the penalty clause. For example, your business may have been told it will receive 99.999% availability but the penalty clause starts at 99% availability, so you would never be fully compensated.
3. It is also important to explore what time period the penalty is measured over. It is common that providers use a period of 12 weeks, which actually means on a 99% SLA that your business could be offline for 21.6 hours, or on a 99.9% SLA just over 2 hours, and still be within the service level.
4. How is the availability measured? Monitoring commonly checks servers at various intervals, e.g. every 15 minutes, every 5 minutes or every minute. Based on these tests, short outages may often go unnoticed and it is often impractical to check with a higher frequency. 99.999% uptime checked every minute will mean that only longer outages will be spotted or that a short 2-second glitch will show up as a minute-long outage.
5. It is usually the customer's responsibility to request service credit checks, and SLAs are therefore not a good way to determine the quality of a service.

Unlimited bandwidth included or 1000 GB bandwidth included
What do these statements actually mean? Well, they could mean any number of things and statements like these often have an * next to them, referring you to an acceptable use policy where they actually restrict you to 400GB of throughput. However, more often the biggest concern is actually how quickly you can get this bandwidth – how big is the Internet connection? To illustrate this further, if we equate water to Internet traffic and a hose pipe to the Internet connection, the biggest pipe will move more water and it really does not matter if your contract allows you to move a gallon of water or not.
Unlimited bandwidth included, or 1000 GB bandwidth included

What do these statements actually mean? Well, they could mean any number of things, and statements like these often have an * next to them, referring you to an acceptable use policy where they actually restrict you to 400GB of throughput. However, more often the biggest concern is actually how quickly you can get this bandwidth - how big is the Internet connection? To illustrate this further, if we equate water to Internet traffic and a hose pipe to the Internet connection, the biggest pipe will move more water, and it really does not matter whether your contract allows you to move a gallon of water or not. If the pipe can only move three gallons and you are sharing it with 1,000 other people, then the net result will be a negative experience.

We have ISO 27001 for information security

There are a few points which you need to know about ISO 27001. Firstly, it is an information security management system (ISMS), which means that it is a framework for managing security; it is not a standard that ensures an organisation is actually secure. Only standards such as PCI actually provide any guarantees, as they are prescriptive about processes, not just suggestive. Because it is a system, people can just buy an off-the-shelf set of processes and procedures and quickly demonstrate an ISMS without actually implementing it properly. However, there are some new legislation guidelines being released soon, which will mean that the auditor pays far more attention to the 'monitor and measurement' section, which will hopefully improve the situation. When you set up an ISMS the business also sets the scope of the system, and some cloud providers have been known to use this to their advantage. For example, a business could limit the system scope to the 'spare parts' management element of the business, then publicly broadcast that it is ISO 27001 compliant but conveniently not mention what for. There are positive and negative elements to ISO 27001 and, although it is a good indicator of the strengths of a hosting provider, you clearly need to ask more than "do you have an ISO 27001 accreditation?"

See terms and conditions

Remember, what a business offers on the website is not necessarily what you get. For instance, I recently witnessed a company offering a Hosted Email solution, and on its homepage the provider boldly stated that it offered a free backup service. I could not find this mentioned anywhere else on the website, and after further inspection I discovered that their terms and conditions actually stated that backup was NOT included in the service, unless expressly mentioned in your selected email package. I will let you draw your own conclusions on this. Your business needs to check the terms and conditions closely in order to explore the full nature of the service being provided, particularly on longer contracts. You may sign up for a service which allegedly offers UK hosting on dedicated hardware, in fully ISO 27001 audited facilities, but if the supplier then decides that they wish to migrate over to Amazon Web Services, you may not be able to legally stop them. Equally, you may be contractually obliged to continue to use the service regardless of any changes. Contracts should give your business the right to cancel if there is any fundamental change to the nature of the service delivery. You are nearly always ultimately responsible for your own data, and most contracts will state that the service provider shall not be liable for loss or corruption of data or information. This is not necessarily because they are a poor provider, but usually because it is hard, if not impossible, to get insurance against that form of loss – especially when it is so difficult to place a value on data. I hope that by reading this blog you have realised that it is easy for providers to hide behind the online world in which we live, so closely check what you are buying. Just because the website looks good and you recognise the brand, it does not guarantee that you will receive the level of service which you are expecting.
It is also important to remember that many of the technologies used in the Cloud are new, and long-standing, strong brands will also be new to these technologies; their previous ethos may not necessarily work in the new Cloud world. Many companies historically may have been excellent at providing break-fix style contracts, but can they stop things breaking in the first place? Originally published on Compare the Cloud 16 April 2013 as What makes a quality Cloud hosting provider? by Richard May.

### Celebrities lead Apps World Germany

Gaming heavyweights and an Apple legend are to keynote the inaugural Apps World Germany 2015. Apple co-founder Steve Wozniak will be keynoting the inaugural Apps World Germany event in Berlin this April. He will be joined by the founder of Lionhead Studios, Peter Molyneux OBE, and the co-founder of Games Workshop, Ian Livingstone. The event, taking place at CityCube, Berlin on April 22-23, is set to attract over 6,000 attendees over the two days.

Having helped shape the computing industry with his designs of early Apple products that influenced the later Macintosh, Steve Wozniak will be discussing the story of Apple, entrepreneurship and innovation, and sharing his industry insight with developers. Both Peter Molyneux and Ian Livingstone will be sharing their insight on the Gaming World stage (part of International Games Week Berlin) this April with their individual keynotes discussing the evolution of mobile games production: the challenges, the lessons and their future predictions.

Wozniak, Molyneux and Livingstone will be leading a stellar speaker line-up of over 200 at the show, speaking across 10 niche workshops including HTML5, Mobile Strategy & Marketing, Mobile Payment & Banking, Gaming Apps, TV Apps, Developer, Droid, Enterprise Apps, Cloud and Connected Car. Ian Johnson, Managing Director of the event organiser Six Degrees, said: "We're ecstatic to welcome Steve Wozniak, Peter Molyneux and Ian Livingstone to Apps World Germany. The speaker line up is stronger than ever and we're excited that a whole new audience will be able to hear from some of the industry's most respected figures." To see Steve Wozniak, Peter Molyneux and Ian Livingstone keynote at Apps World Germany, please visit the website to register your exhibition pass.

### Top Technologies for 2015

I want to tell you all about my technology predictions for the coming year; in our ever-changing world, keeping up with tech is key if you want to stay ahead of the game. First of all, a disclaimer: these are not new technologies - they are tech that has reached a certain level of maturity and become relevant to the channel. They are tech to look out for in 2015, as they will be impacting the technology landscape.

Technology - PaaS

Platform as a Service (PaaS) is in essence the runtime environment for code. The runtime is the application on a server that takes code a developer writes, be that Java, PHP or C# to name just a few, and translates it into machine instructions, allowing developers to concentrate on writing code and nothing else. The infrastructure, the network, the operating system and the runtime are all fully maintained, secured and managed by the PaaS provider.
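To make that tangible, here is a minimal sketch of the kind of thing a PaaS actually runs - my own illustrative example using a generic Python micro-framework, not tied to any particular provider. The developer writes only this; the platform supplies and patches everything underneath it:

```python
# Minimal illustration of "just application code": a tiny HTTP service of the
# sort a PaaS would run. The operating system, web server, scaling and patching
# are the provider's problem, not the developer's.
from flask import Flask, jsonify

app = Flask(__name__)


@app.route("/quote/<customer_id>")
def quote(customer_id: str):
    # A real service would call a database the platform provisions for you;
    # the hard-coded price here is purely illustrative.
    return jsonify(customer=customer_id, price_gbp=42.0)


if __name__ == "__main__":
    # Run locally for development; on a PaaS the platform starts and supervises it.
    app.run(port=8080)
```

Deploying something like this to a PaaS is typically a matter of pushing the code plus a short manifest; there are no servers for the developer to build, patch or support.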
This sounds great, but the barrier to entry has always been that PaaS sits in the domain of the developer; as PaaS is designed to take custom code and run it, all the frills and fancy bits need to be built by the application developer. You can't pick up Microsoft Exchange and run it in this way, as it needs to talk to the operating system. So what has changed to make me think that this is the year of the developer? Well, the dark art of development is no more. Development is becoming commoditised. What do I mean by this? If you go to most recruitment agencies you can get a developer starting the next week on a rolling one-month contract for around £10k. Now, this isn't cheap, and without strong project management you will be pouring your money down the drain, but do it properly and you could have a custom application based on open standards: easy to maintain, running on PaaS, with no servers to worry about supporting. This will be the enabler that takes PaaS from cool tech to actively used tech. With offerings from Google, Amazon, IBM, Red Hat and Microsoft, PaaS is being taken seriously in 2015. Companies ranging from finance and banking to retail and real estate, with names like JP Morgan Chase, Barclays and Diebold (they make ATMs), are running production-critical workloads via PaaS services.

Technology - APIs

APIs will be the default element of any application or platform worth considering in 2015. APIs give an IT department a programmatic interface, which allows them to integrate their existing services and applications with new services. In essence, they expose an amount of functionality that can be controlled from a remote location. There are two main cases for using APIs:

1. Integration of third party services. The web is becoming more integrated, and the idea of using common services, rather than owning, managing and maintaining them 'on-premise', is very attractive. For example, you may need to send thousands of emails for marketing campaigns. Having an email service with an API for sending emails means you don't have to maintain an email system for this purpose. More than anything, you maintain the privilege of sending thousands of mails without being blacklisted by your ISP... You simply press a button from your marketing system and everything just works. Sendgrid.com is a good example of this.

2. System to system integration. The ability to federate data between systems has huge business benefits. Accessing data stored in one system to complement or fuel the results of another, without any duplication of work or data, is a massive benefit. Let's use the email example above: you have a customer record system holding sales data from your customer base and you want to run an email campaign based on what has been sold previously. If you integrate the two systems using APIs, the marketing system can query the sales system automatically to extract the exact data required, saving hours of laborious copy and paste to get the data into the marketing system, or the creation of a new database that replicates the data for this single purpose.

Technology - Docker

Docker is an open platform for development-to-production consistency: build, ship and run distributed applications without changing code, across operating systems from laptop to mainframe. The system consists of the Docker Engine, a portable, lightweight runtime and packaging tool, and Docker Hub, a cloud service for sharing applications and automating workflows.
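As a rough illustration of that promise - using the Docker SDK for Python rather than the command line, so the examples in this piece stay in one language; the image and command are placeholders of my own choosing - the same container image runs identically wherever a Docker Engine is available:

```python
# Illustration only: run a throwaway container via the Docker SDK for Python
# (pip install docker). The point is that the same image behaves the same on a
# laptop, a data centre VM or a cloud host.
import docker

client = docker.from_env()  # connect to the local Docker Engine

output = client.containers.run(
    "python:3.12-slim",                                   # image pulled from Docker Hub
    ["python", "-c", "print('hello from a container')"],  # command run inside it
    remove=True,                                          # tidy up the container afterwards
)
print(output.decode().strip())
```

Push that image to Docker Hub and any other host running the Engine can pull and run it unchanged.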
As a result, IT can ship faster and run the same app, unchanged, on laptops, data centre VMs, and any cloud. Is Docker anything new? Linux containerisation has been around since 2008, but Docker wrapped simplicity and tooling around it to improve adoption, getting the open source community behind it - with names like Google and IBM contributing to development and tool sets - and attracting customers like eBay, Yelp, Spotify and Rackspace. Docker, PaaS and APIs are really just scratching the surface of what the year ahead will bring for us. I expect more and more exciting technologies to emerge as 2015 progresses, and the way we deal with data and workflows to develop.

### Onomi launches agile services for hybrid data

Three years after the sale of £10m Oracle cloud provider Quantix to Interoute, co-founders Andrew Slater and Julian Boneham have launched Onomi, an exciting new venture focused on providing managed services for hybrid database environments. The advent of Big Data has created a groundswell of interest in using data to create innovative and potentially game-changing solutions within the enterprise. In response to this, the demand for NoSQL database technologies has mirrored the explosion in the volumes of unstructured data. This growth trend looks set to continue as more organisations increasingly realise the need to look at complementing their existing relational database investment with next-generation databases such as MongoDB.

Onomi Director and Co-founder Andrew Slater states: "We have seen a huge rise in the use of NoSQL within corporate IT environments and one of the overwhelming use cases is to complement an existing relational database deployment such as Oracle or SQL Server. With a deep background and long history in database services we have created Onomi to address these changing needs and to help companies harness the benefits of NoSQL, without having to throw away their existing database investment." The Onomi management team will be using all the experience they've gained in the Database Managed Services market over the past 17 years to refresh and update the service model, focusing on outcome-based SLAs and cloud-ready automation.

Julian Boneham states: "We're employing some of the brightest minds in the database management marketplace to ensure that our customers can take advantage of an emerging technology area such as NoSQL whilst equally relying on Onomi to bring a mature service model to meet their operational and governance needs."

Vijay Vijayasankar, VP & GM, Global Channels and Global Services, MongoDB, commented: "We're excited to have Onomi as a partner working with us to help drive the benefits of MongoDB in the enterprise. Onomi is a great example of how our partner ecosystem is developing to meet operationally focused demand from customers running MongoDB in business critical environments."

Industry veteran Mike Briercliffe, who has known the team for many years and is acting as an advisor to the Onomi board, states: "Everyone is talking about Big Data and the Internet of Things and the sheer speed at which data is being created as a result. We're only now starting to see the emergence of new-style services companies that are actually going to help companies make best use of these new technologies." "Onomi is one of those and has spotted a significant gap that has formed in the market. There's a need to integrate NoSQL databases with traditionally structured relational databases and to provide the ongoing operational support services.
This will be a hot topic for an ever-increasing number of CIOs and IT Directors over the next few years - in fact for business management as a whole," added Briercliffe. For more information just visit the Onomi website.

### Datacloud Awards set for Monaco

New Datacenter and Cloud awards have been announced for the June ceremony in Monaco. Recognising datacenter and cloud achievements in Europe, the winners of the Datacloud Awards 2015 will be announced at an evening ceremony in Monaco on 2 June. Organised by BroadGroup, the Information Media Technology and Professional Services company, the awards will be hosted prior to the opening of Europe's 'must-attend' Datacloud Europe conference and exhibition.

Now in their 8th year, the Awards have seen the Judges announce a number of exciting changes to keep pace with fast-moving industry developments, as the Datacloud Awards sustain their integrity and position as the only pan-European recognition of demonstrable leadership in datacenter and cloud. Top of the list are two new Awards - the European Award/Prix européen/Europäischer Preis for Datacenter Industry Leader, and the European Cloud Award, Industry Leader. Another innovation recognises the important role of marketing in the industry: the Awards for Marketer of the Year – one each for datacenter and cloud – will be the very first in this category. The Awards also bring much-needed recognition to enterprise users – the infrastructure and IT delivery executives managing around 70% of all Europe's mission-critical facilities, who are also among the most important end users for cloud. The full list of Awards for 2015 is available to view at www.datacloudcongress.com/awards.

Commenting on the new Awards structure, Gerd Simon, chairman of the Judges Panel, said: "An objective of all Awards is to recognise the best. In setting the bar higher yet again, these Awards will provide new goals for the extraordinary talents across Europe operating and engaging in datacenter and cloud." The Judges will also make a special nomination for a company or individual who has demonstrated excellence in their field over the past year. Visit www.datacloudcongress.com/awards/nominate to make your nomination. The final closing date is 31 March 2015. Taking place against the spectacular backdrop of Monaco, the 2015 Awards are anticipated by the Judges to be the most exciting and valuable ever.

### Internet of Things Power-Up

By this stage we have all basically accepted that cloud and the IoT (Internet of Things) are part of our future; if you haven't, you should probably assess your life and start the process of acceptance. As cloud-enabled devices become more pervasive we are being almost forced to accept them into our lives and households, yet no one seems to be considering, at least openly and loudly, the implications of an IoT household. I will admit that I am young, that I am obsessed with the speed of my internet connection, that I am obsessed with my Netflix account, and that I am completely lost during a power outage. Now, if my whole house is cloud dependent, how am I going to handle it?!

The average broadband speed in the UK is currently 18.7Mbps. Urban centres are getting 33.4Mbps, suburban areas are running at 22.9Mbps, and rural areas are achieving 13.6Mbps. Despite all of this we cannot help but constantly desire faster connections; no one enjoys the buffering of the newest episode of House of Cards on Netflix. We can easily assume that most of those households are running at least three devices connected to their internet router.
Between laptops, phones, tablets and televisions, the average UK household is already demanding a lot from its connection. In 2014 IoT devices rose into our vision like a mountain range on the horizon, a veritable Everest of bandwidth-consuming doodads and wotsits. Now don't get me wrong, I want a wifi-enabled smart toaster just as much as the rest of you - if it can butter my crumpets for me, even better! What I'm worried about is the toll this is going to take on my network. I've lived in shared housing: when 4 people on 7 devices are all using the same wifi to try to watch TV, stream music and use social media at the same time, all hell might as well break loose. The network just doesn't stand for it. Maybe you're all scoffing and saying "well, I'm on the most advanced internet plan there is" - but could it withstand a whole household of connected devices?

In 2014 there were announcements of products such as Amazon Echo, the smart device that is like a personal assistant for your home, and Apple has HomeKit on the horizon for 2015, which will work with Apple TV to control everything in your abode. From your heating, to your lighting, to the power strip you left your hair straightener/dryer/kettle plugged into this morning - you'll be able to turn all of these off once you're already running down the street to catch that last tube which will get you to work at a mildly acceptable degree of lateness. Smart fridges have been around since 2012; LCD screens and internet connections are almost old news in the kitchen world, and you'll never run out of milk again once your fridge is ordering it for you. Ovens with capabilities for internet searches, and for making sure you never burn another Sunday roast, also aren't out of the ordinary. But this is just a tiny part of the kitchen. How are our bandwidths going to cope once our washing machines, dishwashers, music systems, toilets, showers, beds, alarm clocks and wardrobes are all downloading and uploading data in the name of simplifying our lives?

Yes, I'll admit, the smart closet that lets me plan my outfit for the day à la Cher from Clueless has always been a dream, but if it means I have to double or triple my internet bill, do I need it? Can I afford it? The answer to both is probably not. Whilst researching this article I discovered the aforementioned smart closet is actually a reality, though first I will need to teach IBM Watson all about my fashion sense, and vastly increase my bank account to afford all of the tech. The IoT is the future for us, but we are presently not equipped to handle it. Personally I'd prefer that my Netflix stream quickly over my fridge ordering my milk, and when it comes down to it, I don't think that my present network could handle a house full of connected devices anyway. Could yours?

### Cloud Past, Present and Future

Cloud Past

"Any fool can know. The point is to understand," as Albert Einstein put it over 80 years ago. What started out as another major IT crusade began from a well-understood position of technical positives but ultimately missed the point. You see, Cloud wasn't, and more importantly isn't, about technology. I was brought up with the message that experience is priceless and that the great thing about the past is what we can learn from it - but it only matters if we apply this. Very few journeys to new horizons can start with absolute clarity about the destination or what will occur on the journey. The point is that we understand the purpose of the journey and what benefit it may bring to others.
Whatever our purpose or focus - from medical discovery, to astronomical voyages, to supporting the challenges we are presented with as a society, such as Ebola this year - Clouds have a greater purpose when applied, if we learn as if we were to live forever. Perspective remains one of our biggest challenges and we need to stand back more often than we do. In an ever-impressive world of posts and social media updates, it can be difficult to consider or even understand the destination, or whom we can benefit. Stephen Sutton rightly touched millions of us this year with his inspirational posts, as did Candice Curry's "An Open Letter to My Daughter's Stepmom" and many more. The point is they add perspective and help us understand more about ourselves.

Technically, Cloud enables. It enables us to do more, to help others and to make a better world. In the past we have extolled the virtues of replacing CAPEX with OPEX, pay as you go, pay for what you use, bursting as needed and being evergreen. No matter what it is, you can have it "as a Service", but we have to give it a name each time. Let's ask the question another way: what are we enabling, and why - but it must be as a Service.

Cloud Present

2014 brought us fabulous new, sophisticated eyewear, new drone-based delivery visions and wearable tech in abundance. It brought us UK floods like we have not seen in recent times, the highest level of parking fines ever recorded, more online mega-named shopping days which we have already forgotten (till next year) and a new wave of online security vulnerabilities! The majority of e-commerce sites now operate in the cloud. Most of our personal data is stored somewhere in clouds, as is every move and click we make. Wearable tech is great, but how much do we want to share, and how much do we know is shared? Big Data is here, in every sense, analysing all our purchases and movements, our insane exercise patterns and driving journeys - but what is the purpose? Once we move beyond the social media posts, tweets and followers, what does it mean, and what actually is our destination and the true benefit we derive from this gift we call the present?

Well, let's say we created a smart application that linked rising water levels to early warning systems for homes and businesses. What say we used location data to inform us of the parking conditions or terms based on our whereabouts? What say we linked the systems we are with, use and depend on with automatic updates to protect us? What say we electronically linked the sudden rise in reported medical conditions or humanitarian disasters anywhere in the world to government and awareness campaigns? Would this give us great purpose and meaning? Would it actually move us closer to understanding how we benefit from the technology and how it can protect us? Of course. So why not? What is stopping us from building these applications and being better citizens? Might it be our application, or truly understanding how we measure the benefits? It does not come as a service or with a funky LED output in front of us, but it does have a point.

Cloud Future

So we need to establish points for Clouds. Cloud must have a clear purpose, and we need to help people understand it: what does it mean and where is it going? After all, it is not how you start but how you finish. 2015 needs to add the 4 P's. Slightly different from Kotler's view, but Cloud needs Problem, Purpose, Perspective and Points. In their January 2013 HBR article on rethinking the 4 P's, the authors challenged how trusted sources of advice and diagnostics must be reviewed.
The point is we need to solve problems with Cloud, otherwise it truly has no purpose. But we must begin with this in mind, not with technology trying to identify problems to solve. This gives us purpose and perspective, for we can identify what we can resolve and how this benefits us all. It is a really simple tool, but it helps us keep it real and relevant. Unless we are applying what we know, then it has no purpose or relevance, for this is what people are purchasing – this is the point. Perspective comes from what we know, our experience and understanding. This is how we create the point of the proposal and proposition. What we present as the proposition is the proposal to solve the problem and the purpose of our products, and therefore what our customer will purchase. We simply need to readjust our perspective to make Cloud real. No colours of the rainbow, no as a service, no named shopping days, just plain old simple perspective. So where's the problem?

### Endpoint device management and IT security

Does endpoint device management help or hinder IT security? We're seeing IT departments and managers continue to worry about the increased security risks that BYOD (Bring Your Own Device) and CYOD (Choose Your Own Device) policies present. Data loss – both on a company and personal basis – in the event of devices being lost or infected by malware is high on the list of concerns held by organisations of all sizes. This is becoming increasingly difficult to control due to the vast number of devices and platforms that connect to business-sensitive information, resulting in a potential security minefield that businesses are keen to avoid.

When considering endpoint management, businesses often focus predominantly on security. This is not just about the ability to apply policies to devices and knowing where they are, but essentially about implementing patching and ensuring software is up to date and not at risk from vulnerabilities and hackers. Recently, we have seen exploits such as Heartbleed and Shellshock have devastating effects on enterprises, proving how important it is for businesses to patch devices, particularly when outside of a secure network. Businesses are opening up to the idea of a more mobile workforce, but the main constraint continues to be the perceived security risk.

BYOD and CYOD capture two types of device: firstly Windows PCs, and secondly smartphones or tablets with mobile operating systems such as iOS or Android. The challenge here is that companies often use different solutions to manage these two product arms. For example, Client Management (CM) tools for Windows PCs are a very different consideration compared to the Mobile Device Management (MDM) solutions which are used for devices with a mobile OS. Today, companies want to empower their workforce by deploying mixed-device estates, allowing them to enjoy the best of the PC and mobile worlds during their day-to-day working lives. This encompasses everything from smartphones and tablets to Ultrabooks - all of which have the ability to access corporate data and apps. At present, there is no single solution that unifies the device management space, so IT managers face the challenge of using both MDM and CM tools to manage the two very different environments.
However, unified endpoint management solutions look set to evolve in order to overcome this challenge and revolutionise how IT managers can tackle this common headache. Some systems are evolving to encompass both MDM and Client Management tools, an example of this being Toshiba's Cloud Client Manager (TCCM). Toshiba's TCCM is evolving to include MDM* together with its Client Management tool capabilities. With the solution sitting off-premise, customers can remove the cost and skills barrier that users of other solutions may face, as there is no need to install servers and employ additional staff to manage the upkeep of those servers. It also becomes much quicker and easier to deploy more mobile devices with a per-endpoint licence, making it possible to work out exactly how much the management of devices will cost up front.

Currently, many companies implement restrictive strategies around mobility, rather than embracing new devices which could act as business enablers. This is because many businesses naturally remain concerned about off-premise security. With this in mind, IT managers should carefully consider whether they are comfortable with entrusting business-sensitive information to off-premise cloud providers. This is where solutions such as TCCM can become invaluable again, as it doesn't handle any of a company's data - the solution collects device data, rather than the documents which are on the hard drive. With this set-up, any potential risk can be quantified, so IT managers can understand the parameters, helping to provide assurance that company data is safe.

From a wider cloud perspective, business owners should also make sure they are aware of where their data is residing, who can access it and which jurisdiction it is governed by. For example, if a company would prefer its data to be hosted in Europe rather than the US, its heads and IT team need to ensure they understand the risk, quantify it, and then decide whether the risk is manageable. Again, businesses should ensure their provider has a sufficient level of security at their data centres. It is crucial for decision makers to understand exactly what's in place before committing to a contract. Overall, business owners will understandably, and quite rightly, continue to be concerned about data protection, but they should also have a strategy in place so they know what questions to ask and how to qualify that risk. Through using an endpoint management system and having the benefits of cloud-based solutions, the risks can be minimal and the benefits extremely rewarding.

* MDM available in TCCM in the first half of 2015.

### Focus on winning disciplines

I'm always amazed watching decathletes perform. In the Olympics and World Championships they compete with astonishing results in ten different disciplines. But each of the decathletes, for the most part, only truly performs with astonishing results in four or five of these disciplines. The highest points total determines the winner. Often the winner's success in several of the individual disciplines is lower than that of other competitors. Over the years there have been world records set by decathletes who did not go on to win the competition. Let's focus on results from the London Olympics 2012. The Decathlon winner was Ashton Eaton, despite not winning any individual events. If you add up all the individual disciplines' winning scores you will find that Eaton's overall points were 2,832 points below the combined total.
| Discipline | Men's Decathlon winning result by Ashton Eaton (London Olympics 2012) | Difference | Men's Individual winning result (London Olympics 2012) | Men's Individual winner |
|---|---|---|---|---|
| 100 metres | 10.35 | 7% | 9.63 | Usain Bolt |
| Long jump | 8.03 | 3% | 8.31 | Greg Rutherford |
| Shot put | 14.66 | 33% | 21.89 | Tomasz Majewski |
| High jump | 2.05 | 14% | 2.38 | Ivan Ukhov |
| 400 metres | 46.90 | 7% | 43.94 | Kirani James |
| 110 metres hurdles | 13.56 | 5% | 12.92 | Aries Merritt |
| Discus throw | 42.53 | 38% | 68.27 | Robert Harting |
| Pole vault | 5.20 | 13% | 5.97 | Renaud Lavillenie |
| Javelin throw | 61.96 | 27% | 84.58 | Keshorn Walcott |
| 1,500 metres | 4:33.59 | 30% | 3:34.08 | Taoufik Makhloufi |
| Points | 8,869 | 25% | 11,701 | |

Source: IAAF

The Decathlon is a competition to find the most all-round athlete. The winner is generally the decathlete who is strongest in five to six of the contested disciplines. You have to be really good (fast and explosive) in those to align with, or preferably surpass, your competitors. In these core events, successful decathletes are often not that far from the individual top athletes, though in the remaining disciplines their scores can be miles off the top. Why? Because the remaining disciplines have less in common with those in which they perform best; they demand either endurance or very strong specific techniques. If an athlete is particularly skilled in one of those more specific disciplines – let's use the discus throw as our example – they most probably fall short in another of the disciplines. Eaton was close to 40% less successful in the discus throw than his individual medal-winning counterpart, because he lacked the specialised skills to throw the discus greater distances. The discus throw requires a different training schedule and conditioning programme from those of Eaton's more successful events. His best results came in the disciplines that required great running.

If we apply the lessons of the Decathlon to IT, it helps us to understand the balance of strengths against discipline-specific requirements. Consider the following: is infrastructure or email operations really your core business discipline? Should you stay focused and defend your infrastructure's winning expertise, or should you even be trying email operations if infrastructure is your core discipline?

I swear that during my lifetime no one will total 11,701 points in the Decathlon. I am not sure someone will even break 10,000 points. There is the possibility, but perhaps we haven't seen this decathlete perform yet. Similarly, your business is unlikely to reach 11,701, or even Eaton's 8,869. Without frequent reviews and tough questioning of your business model you are likely to spend a lot of your resources without ever reaching that level, especially on your own. But what if we could change the rules of the Decathlon? What if we were allowed to outsource some of the events, or maybe even the whole Decathlon, to someone else? If the Decathlon was your business, would you?

It is possible to outsource our businesses. Maybe, like Eaton, your business is not great at throwing the discus. At the London Olympics 2012 Robert Harting was the clear discus winner. If you could, would you outsource the discus discipline to him? Robert Harting has the ability, the potential, to accelerate your services by 38%. That was his winning margin. Business does not operate with the same single-competitor requirements the Decathlon has. While you certainly have laws and regulations to comply with, there are no actual game rules to follow. To outsource discipline expertise is neither cheating nor breaking the rules. It could just be good business.
If you want to compete - not only winning, but surpassing your rivals - then you have to decide what is your focus discipline and what is not. Your customers expect you to bring innovation, best value, and bang for their buck – not mediocre results across the board.

Prepare well

Basics:
- Review your training programme (business model) – what's core, what's not. Any reason to switch, add or remove focus?
- Know the rules. Know customer triggers and expectations.
- Is bread-and-butter core? If yes: excel! If no: cloud!

Find burdens – and get rid of them:
- Find time- and/or money-consuming darlings. And even if they're not consuming anything, why should you offer something that's not your core just because you can? You shouldn't, because it's not core!
- Adopt partnership/joint venture thinking. Though I'm not saying you actually should partner.
- Adopt cloud services and hybrid environments thinking. Though I'm not saying you actually should adopt cloud services.
- Adopt outsourcing thinking. Though I'm not saying you actually should outsource.
- Adopt meaningful thinking. I am saying you should provide meaningful IT services.

Mantras - yes I can:
- Replace my on-premises Exchange server with Office 365. It will give me time to focus on core business and at the same time bring extended value to my customers.
- Partner with someone to either become a VAR and/or add a VAR to my offer. Or maybe our joint offer can become a bestseller.
- Outsource infrastructure services to concentrate on the database layer where I'm the shining star.
- Co-locate my data centre. I don't need to own my "data centre" – especially since I currently run my equipment in the most doubtful of spaces, and I don't want my customers to find out or audit it.
- Partner with someone to build an app for my service, since I don't want to hire an app developer, and especially since it's not my core.
- Adopt a hybrid approach to cloud computing. I don't have to be radical. I can use sequenced transformation to either private or public clouds, and they can collaborate with my on-premises legacy systems, since I can use management systems for that.
- Offer far higher SLAs, better continuity, and faster, more effective service development if I think of cloud computing services as business accelerators instead of arguing some odd reason not to. I don't need to make it more complicated than it actually is.
- Increase the customer experience if I spend time listening, partnering and collaborating with my customers instead of training in disciplines they will never gain from. I'm building trust and don't risk losing them.
- Use a 'via' instead of 'by' approach. Consult a professional advisor on what to move, how and when.
- Change my mind-set. I don't have to be the decathlete, because I don't need that to become successful!

It might be difficult to let go of some of those disciplines in which your business does not demonstrate winning expertise, but in doing so you could become a much stronger competitor. By becoming more aware of your core skills and aligning them to well-reviewed, often-questioned strengths alongside an updated training programme, you might just excel.

### The future of events with TickX

Have you ever tried to find tickets to an event via any of the major search engines? It's a nightmare: endless websites offering different prices, packages, and often even out-of-date or incorrect information. Thankfully apps are stepping in to help. I'm a big app fan - I've used other Facebook-integrated apps to purchase gig tickets in the past - but now there is TickX.
A free app, and now the UK's largest event guide and ticket comparison app, TickX helps you avoid unimaginative nights out at the same boring places; you have to admit every Friday night at the same pub/bar/club combo gets a bit old. What's better than free help planning your perfect first date? Or your best mate's birthday? Whatever the reason to go out, TickX is there to help. Featuring events in over 56 cities across the UK, there are over 25,000 club nights, gigs, festivals, sports, lifestyle, arts and comedy events available to browse on the app. By accumulating all the information available into one app, TickX has provided a service that fills a void in ticket sales and event marketing. The ever-growing demand for comparison sites is familiar to us all, with cost-conscious consumers increasingly wanting to compare and contrast everything from markets to meerkats. TickX has co-operated with a network of partners, including the UK's largest primary ticketing agents as well as reseller ticketing brokers. So even when the event you're chasing tickets to has sold out, you might still nab a re-sale ticket via the TickX app.

The app was created by self-confessed computer geek Sam Coley and the equally enthusiastic Steve Pearce, after they became fed up with the frustration associated with planning a day or night out. "TickX started with a vision to create a one-stop shop for event lovers, after becoming sick and tired of endlessly browsing the web looking for good places to go out," Steve Pearce said. They knew that within the UK's billion-pound events industry there must be thousands of things happening, yet there was no single route to access them all. Sharing the same vision to fuse multiple ticket sellers' feeds into one platform, they created TickX to provide users with a clear view of all events and their prices. "Our flexible search options, powerful ticket comparison engine and seamless interface now give users what they want. A fast and flexible way of finding the right event, for the right price."

From the outset, Steve and Sam pioneered user-friendly design, allowing people to search by city or radius with the option to filter listings by date and type of event. And if that wasn't enough, TickX has a super-search function that enables event-lovers to search for a specific event, show, artist, team or venue. Once users find something interesting, they can compare prices and buy tickets through a network of well-respected partners. Sam said that living in a city he and his friends were always looking for new events but found discovering them a nightmare - almost impossible when already on a night out. "We saw a gap in the market for a complete event directory that compared prices to find the best deal, which is why we built TickX. Since then I've discovered some great comedy gigs which I didn't know about, and already saved money finding the best prices on my festival tickets! So there's no excuse not to find something fun to do now," said Sam. The free app, which is available on Google Play and the App Store, is also partnered with an easy-to-use website. Available online at: tickx.co.uk

### A view of the cloud in Asia

Let's start 2015 with a look to the future and the Asia Pacific region. It is fast becoming a centre for cloud technology, and as a rising arena of play it has bodies falling into place to keep it in check. This is looking to be a big year for the Asia Cloud Computing Association (ACCA), which was formed in 2010.
We spoke to Lim May-Ann, Executive Director of the ACCA, to see what 2015, and the greater future, holds for cloud in Asia. Firstly, let's get some background on the ACCA: in 2009 the cloud industry in Asia was just fledgling, so a group of committed senior executives from various cloud and technology companies put together the ACCA as a vendor-neutral organisation to educate the industry. Since its inception the organisation has gone from strength to strength and has a strong focus on public advocacy for cloud interests such as clear policies surrounding data regulations, data sovereignty and cloud certification. The ACCA is open to membership for all who have a vested interest in the Asia Pacific arena - for more specific membership enquiries please visit the ACCA website.

The flagship report that the ACCA produces is the Cloud Readiness Index (CRI), a report on the cloud readiness of economies in Asia Pacific. Using publicly available information, it is a conglomeration of ten factors that the ACCA believes comprise cloud readiness: privacy, international connectivity, data sovereignty, broadband quality, government regulatory environment and usage, power grid and green policy, IP protection, business sophistication, data centre risk, and freedom of information. Lim May-Ann said the significance of the report is that if data, data management, and the capacity to manage data are going to be the currencies of the future, they are going to be skill sets that economies need to have measured, and then developed, in order to succeed in the 21st century - and beyond.

"Essentially that is what the CRI measures – the ability and the availability of resources to take up cloud computing. So if you study economics you know that demand is both the need for, and the willingness to purchase, something - what we are really measuring is the ability of the countries to take up cloud computing - whether or not the countries actually do take up cloud computing, that's another matter altogether." The CRI measures the capacity and the ability of cloud industries across 14 countries in the region and ranks them according to the report's findings. The 2015 report is yet to be released, though we can look back at the report from 2014 - let us know in the comments your opinion on the results. View the entire 2014 CRI.

The ACCA is working to accelerate cloud computing in Asia Pacific, and is supporting the creation of a safe and stable regulatory environment so that cloud can be adopted. On top of this, it is also trying to educate an entire region on the benefits of cloud computing and increase business understanding of cloud products. Presently there is a focus on the financial services industry and its take-up of cloud computing. The ACCA has strong established bases in both Hong Kong and Singapore, which are financial centres for the region. Lim May-Ann said there is a lot of discussion around the financial services industry's use of cloud and that, unfortunately, there is a lack of clarity when it comes to looking at the regulations and whether or not they actually allow financial industries to use cloud services.
"There was a lot of desire for us to educate and engage with the various public agencies to clarify whether or not cloud computing could be used if you were a financial services industry – for the banks and the insurance agencies for example, there were questions around what kind of data can you store here, and what can you store elsewhere [sic]; sharing of credit information; whether an individual is credit worthy or not, and other similar information." The example Lim May-Ann used to illustrate her point was my own situation, as I had moved from the Asia Pacific region, Australia, to London, can my bank share my credit information with a bank here, or do I have to start from scratch? (Spoiler alert, I had to start from scratch, lots of paperwork.) "There are lots of questions around (data privacy), I'm proud to see that we've actually got quite a lot of traction around that from the government agencies as well, but it is not just the ACCA working on this, there are multiple organisations asking for clarification from the financial and government agencies." "The Monetary Authority of Singapore (MAS) has just come up with updated guidelines on how to outsource technologies, for example. While it's not a specifically tailored to cloud computing – it doesn’t explicitly specify that financial companies can use cloud computing – he updated guidelines strengthened and clarified their language around what sort of technologies could be outsourced to, for example to IaaS, PaaS, and SaaS." Cloud computing in the Asia Pacific is interesting, because despite a high take up of SaaS products such as Dropbox and OneDrive there is still apprehension around the whole concept of cloud, particularly problems arise around cloud costing schemes. For the Asia Pacific region one size does not fit all. If a service is $5USD per month per user that is affordable for a company in Australia, but not necessarily for one based in Myanmar or Indonesia. Lim May-Ann sees the future of the Asia Pacific region’s cloud in the expansion into regions outside of Hong Kong and Singapore. "I would say I think that a lot of the cloud uptake will continue to remain as potential, a lot of the countries are facing very basic infrastructural questions around power supply and internet connectivity. "I think where we’re going to see more possibilities are in economies where mobile services are the highest. Indonesia and Myanmar are two countries which could have very successful cloud marketing strategies when the marketing hits that segment of the population which is currently having their first computing experience in mobile form rather than as desktop or a laptop." So we can see 2015 as the year for the Asia Pacific to rise up, for mobile services to reign supreme, and for the mobile generation to be taken seriously. The desktop computer is fading out in the Asia Pacific, in order for cloud services to be considered seriously they must be mobile-ready. The up and coming regions of the world are developing faster than they ever have before, cloud is slowly but surely penetrating into their markets, and developing economies are inviting it in - but not without asking the important questions. ### Encrypting cloud data Being free to choose the most suitable encryption for your business seems a good idea. But it will only work in a context of recognised standards across encryption systems and providers' security platforms. Dr. Hongwen Zhang, Chair Security Working Group, CloudEthernet Forum explains. 
In the first years of major cloud uptake there was the oft-repeated advice to business that the sensible course would be to use public cloud services to simplify mundane operations, but that critical or high-priority data should not be trusted to a public cloud service and should instead be kept under control in a private cloud. Instinctively this made sense: you should not allow your secrets to float about in a cloud where you have no idea where they are stored or who is in charge of them. The irony is that the cloud - being so obviously vulnerable and inviting to attackers - is constantly being reinforced with the most sophisticated security measures, so data in the cloud is probably far better protected than any SME could afford to secure its own data internally. It is like air travel: because flying is instinctively scary, so much has been spent to make it safe that you are less likely to die on a flight than you are driving the same journey in the "safety" of your own car. The biggest risk in air travel is in the journey to the airport, just as the biggest risk in cloud computing lies in the data's passage to the cloud - hence the importance of a secure line to a cloud service.

So let's take a look at encryption. Instinctively it makes sense to keep full control of your own encryption and keys, rather than let them get into any stranger's hands - so how far do we trust that instinct, bearing in mind the need also to balance security against efficiency? The idea of encryption is as old as the concept of written language: if a message might fall into enemy hands, then it is important to ensure that they will not be able to read it. We have recently been told that US forces used Native American communicators in WW2 because the chances of anyone in Japan understanding their language were near zero. More typically, encryption relies on some sort of "key" to unlock and make sense of the message it contains, and that transfers the problem of security to a new level: now the message is secure, the focus shifts to protecting the key. In the case of access to cloud services: if we are encrypting data because we are worried about its security in an unknown cloud, why then should we trust the same cloud to hold the encryption keys?

### Big Computing 2015

A very happy New Year to you all, and here's hoping 2015 will be a prosperous one. So, 2015 is upon us and the usual predictions from the big analyst firms are already out. I've tweeted some of them. What do I think the future holds for the cloud this year? What might be the top topics and trends that divide opinion? Well, having travelled the globe extensively in 2014, talking to cloud vendors, distributors and end consumers, I have a little to say on the subject.

I know this may sound like a small rant, but at least 80% of the people reading this article will not understand the implications of the Internet of Things (IoT) and Big Data moving forward. For me, their individual importance will continue to rise for many years into the future. Gartner predicts 5 billion devices will be connected to the Internet and data sources this year alone. That's a lot of captured data. As each of these devices collects, queries and shares this captured information, data itself becomes smarter. The more we interact with data, the bigger the data gets, making more specific queries and analytics possible and the results more accurate. Predictive analysis for future outcomes could one day reach the low-90% range in accuracy. Some would say that seems nearly clairvoyant.
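To put the scale of that in perspective, here is a crude back-of-envelope sketch - my own illustrative per-device figures, not Gartner's - of what 5 billion chattering devices could generate:

```python
# Back-of-envelope estimate of raw IoT data volumes. The per-device numbers are
# assumptions for illustration only; the point is how quickly the total grows.
DEVICES = 5_000_000_000        # connected devices (the Gartner prediction quoted above)
MESSAGES_PER_DAY = 1_000       # assumed: roughly one reading every 90 seconds
BYTES_PER_MESSAGE = 200        # assumed: a small JSON-style payload

bytes_per_day = DEVICES * MESSAGES_PER_DAY * BYTES_PER_MESSAGE
print(f"{bytes_per_day / 1e15:.1f} PB of raw data per day")   # ~1.0 PB per day
```

Even with conservative assumptions, the raw feed runs to a petabyte a day.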
Collecting this volume of data means that how you store, process and deliver the results will be key. Many of the big vendors have been readying their services in anticipation of the IoT and the rising force of Big Data. Mergers, acquisitions and redefinition fuelled a lot of the news last year. IBM made some big changes, reinventing a century-old company. Microsoft embraced a "mobile first" world. The biggest enterprise trend last year seemed to be the separation of hardware divisions and the acquisition of cloud services. Just look at HP. I expect this trend to continue as cloud technologies take over more of our IT.

The intelligent searches, data comparison and collation will need a huge supply of information, as will the IoT. Connectivity and performance will be essential to these services. Expect a lot of discussion of bandwidth, processing power, APIs and app development in 2015. The great mainframe computers of yesterday will be rebooted, like a Hollywood film franchise, for the demands of Big Data. The data centre has often been defined by its capabilities: rows of boxes, coolly and quietly processing the commands of their software. These boxes are often segmented by services, by the companies that own and use them, and so on. The data centre of the future won't do away with these individual boxes completely - business and storage demands will ensure their continuation - but we will see many of these boxes functioning alongside, and increasingly like, the new mainframe computers. Big Data needs Big Computing.

One other prediction will be the rise of intelligent cyber security - common sense really. The Sony hack dominated the press in December, and continues today with sanctions and diplomatic name-calling. This year we will need to do more to protect ourselves from the self-appointed cyber activists that lurk in every corner of the globe. Big Computing is going to need robust security. Territory-based data policies are already in place, but they will be more prominent going forward. The data breaches that defined last year were often marked by a lack of good IT governance. This will eventually lead to country-based clouds that have their own distinct rules for non-domiciled companies. It makes sense when you think about it. I have seen some incredible cyber defence services towards the end of 2014, but I am going to stay quiet for the next few weeks – Compare the Cloud will certainly have more to share soon. That's it for now. Enjoy the New Year ahead. 2015 will see the IoT take off in a big way as Big Data and Big Computing work together.

### Explaining Big Data to the Mother-in-law

I should have seen it coming, but I was caught cold... damn my lack of focus. I was preoccupied making sure my kids behaved and using my mashed potato to support the next layer of Christmas dinner on my plate. My Mother-in-law ambushed me over the table from behind the Brussels sprouts with a question that instantly sobered me from my head fog of Prosecco... "So Andrew, what is this Big Data thingy?" Why did she ask this? Well, the background is that in December I left my well-paid, enjoyable job that provides a blanket of middle-class comfort to her daughter and three adorable granddaughters... to, well, start a new company with my long-term colleague Julian.
Having proudly given the family a pre-dinner screening of my new website, I realised my Mother-in-law really wanted to know whether this Big Data thing I was pinning my hopes on was going to deliver, and whether the next 12 months of Aldi and Daddy working late were going to be worth it. To be honest the whole family chimed in - which includes a seasoned IT project manager, an accountant, graduates and so on - and basically nobody understood what on earth I was doing or what is meant by Big Data. I tried the "it's just computers" fob-off, but it was knocked back – they wanted to know if I was going to be losing the family house next year! This was serious. I was woefully unprepared, but here was my attempt at explaining it to my Mother-in-law (Madeline).

Companies have always used data and information to help run and make decisions about their business. Traditionally, the only data available to do this was neatly inputted into a database (store) by a person. For example, take Valerie in accounts using an accounting application: if she typed in an invoice number, it would be stored in a nice structured way so the company could easily run reports and get access to it. Over the last few years the quantity of data and the way it is created have changed dramatically, to the point that, of all the data existing in the world today, 90% of it was generated only in the last two years. The data is generated in much larger volumes by machines, sensors and social media (with a lot more people using Twitter, Facebook and the like on their mobile phones). This data explosion is great for businesses, because if they can glean valuable information from it, it allows them to gain an advantage over competitors. The big problem is that the data isn't neatly structured and the volumes are big. This makes it very slow and expensive to store, process and analyse using traditional technologies. A good example is an airline: it could use data from sensors on engines to reduce fuel costs, or analyse social media data to identify any problems its customers are complaining about - reducing costs and increasing customer satisfaction. Therefore, this change in the amount of data and the different way in which it is being created is really what I mean by big data. My company (Onomi) will help businesses to make the new and old technology work together, so our customers can make better decisions and build new applications for their customers that use this new kind of data.

To be honest I'm not sure if I managed to get Madeline to really understand, but I've bought myself a few extra weeks. At which point I'll try the next Jedi mind trick using Hadoop and a deck of playing cards analogy... I'll let you know how I get on.

### From Cloud to Big Data

As ever, the last day of the year is a time of reflection and planning. We look back at the year just gone and ahead to the year to come. If this were TV, the year-end review would be a tedious line-up of desperate celebrities introducing footage culled from the programmes we watched and news reported throughout the year. Since this is a blog, we will have to leave the flickering review to your imagination and just get to the point: another year, another significant advance for cloud computing. Unsurprisingly, cloud adoption is increasing. The story is barely even news now. Greater revenues and bigger opportunities, often heralded by an industry survey or a sweeping outline from Gartner, have marked the last three years. The facts make plain reading.
From the tech giants to the MSPs, cloud services have grown year on year – even in the face of austerity and slowed global economic prospects. Businesses everywhere have begun to realise the benefits of mixing IT provision - hosted applications, online storage, centrally managed resources (end-point managed devices) and data - through connected, cloud-based services. Having a bunch of boxes networked on-site just doesn't make sense as businesses are more mobile and operate as much out on the road as they do in the office. Cloud computing has made incredible steps toward ubiquity. It has become more useful and essential to businesses. Future gazing now, next year is going to be more about Big Data. Our obsession with defining these connected, distributed services under the banner of cloud will diminish. Instead we will look more at the individual benefits of those services that have risen out of our great push to cloud computing. Aggregating data, comparing an ever-increasing sample of inputs, is going to offer more insights than perhaps we are ready to understand. 2015 will be Big Data's year. In some respects Big Data has already arrived. 2014 already showed us the advancing future. This year we began to access services and data using natural language - transforming our use and understanding of technology to make it more useful and easier to understand. Apple's Siri, Google Now and Microsoft's Cortana have each demonstrated the importance of this new approach. And IBM's Watson Analytics suggests the beginning of what is (and could be) possible in the coming months. From customer insight to medical applications and financial comparisons, Big Data can help us redefine and better understand the complexity of our ever-connected world. Our access to this information will be enabled by the cloud framework we have already invested in, and no doubt stored on a mix of private and public (hybrid) cloud networks and secure storage, but the real story will be data, data, data. Happy New Year everyone.

### Personal clouds

So, cloud and making it personal – what does this actually mean? Personal clouds are a bit of a quandary and I believe are a bit misunderstood. You will see many companies advertising data storage, backup and accessibility products that offer you, the consumer, a personal cloud. Everything from free storage and backup of data in the cloud that's accessible anywhere, to home personal cloud-based solutions, is available from pretty much any large vendor today. With the massive amount of data that we all use and generate (email, word processing, pictures, movies), we have to store it all somewhere. Increasingly we are turning to free services, often marketed to us as our own personal cloud. In my book, if something is free it has little or no value, so if something has no value then why would you put something you value onto it? Good question. It is a misconception that everything is secure in the Digital Age – just ask Sony. Why on earth would anyone, let alone someone in the public eye, place very personal pictures or information about themselves on a service that could potentially be viewed by millions if the incorrect security has been applied? A clicked check box at the end of a 30-page disclaimer that no one ever reads is hardly a demonstration of great security. So, in my mind there are two issues:
1. There is a basic lack of understanding around the technology that supports a personal cloud service. The notion that "everything is covered and it will never happen to me" is all too prevalent.
2. Little common sense is in evidence when people all too eagerly place pictures and movies of themselves on a free public service. Should it ever be breached, the public disclosure could be very damaging.
Maybe I am being overly cynical, but this is the state of mainstream understanding of cloud technology. We have a price war going on at the moment. Amazon, Google and Microsoft (and others) are competing against each other with continual price reductions to entice users and grow business. Let's face it, nobody can reduce the cost of sales continually and make a profit without compensating for losses somewhere else. Now, I'm not saying that the companies mentioned have done this; what I am saying is that using loss leaders to gain market share and then selling customers something else is an old sales tactic. All three of these companies sell other services and products - make sense? We should view the personal cloud in the right context: it's personal. Take Facebook as an example. You could say that this is a personal cloud of sorts, as there is information totally personal to you stored there for you to share amongst your circle of friends. Anyone with access can view those pictures and messages. That is the whole point of cloud technology, to share collaboratively. But do you trust Facebook? Access is defined by permissions and robust security – we hope. Because your data is online there is a chance, however remote, that someone outside of your circle can share your personal data stored there. Isn't it ironic that, since the Apple security breach a week or two back, both Amazon and Google have announced that they are stepping up their security policies? Why do we wait for an issue to happen before we start working on preventative measures? With the continual advancement of technology and our perceptions of what it can do for us, there is bound to be a misalignment of expectations and reality. I would like to use the analogy of a Hybrid Cloud model. What is Hybrid, I hear you ask? Good question – and if you asked five people you would get five different answers. The types of cloud have been evolving to suit business needs, and this has never been more apparent than with the emergence of the term Hybrid cloud computing. We seemingly understand Private and Public clouds, but Hybrid? Hybrid is a term that has been defined as a mix of private and public services based at your location(s). Hybrid clouds are already in use by many businesses. Maybe you back up your data to another location? Maybe you have applications delivered over the Internet (often referred to as SaaS or Software-as-a-Service). Hybrid is a broad term that encompasses so much of what cloud has to offer. Confusion is rife with cloud technology in general. As a technologist I take this for granted sometimes and I love to get different perspectives on the subject from others outside the industry. I can definitely say that after travelling all around the world for my job, for pleasure and sometimes just for the hell of it, there is a major misunderstanding of cloud technologies by the general population. As we advance with our ever-increasing computing power, our expectations exceed the reality of what technology can do. Automation and the aggregation of data, Big Data, have improved but our clouds don't think for us yet. Common sense requires an abstract reasoning that cloud computing cannot yet apply to data. The cloud cannot think or make the kind of personal decisions we would make individually.
Technology can only analyse and predict outcomes based on previous history. Even today, could a supercomputer like IBM's Watson tell you when your naked selfies were sitting on a storage system with poor security in place? Maybe the answer should be "are you crazy to store those pictures online in the first place?" As technology becomes more advanced, so do the challenges of understanding it. Providers are businesses and their common sense isn't necessarily yours. Do we need to understand the modern technology that surrounds us or simply have blind faith in the providers that supply our personal cloud services? At the end of the day it is your choice how you utilise cloud technologies. But you might want to consider being a little more educated on the risks that come with these, often free, commodity cloud services. Don't just focus on the benefits.

### The data centre in 2020

After reading Compare the Cloud's piece on data centres in 2020, I'm offering you all my own thoughts on the future. The data centre in six to ten years' time will undoubtedly generate power locally but I disagree that it will be based on nuclear fission. I do agree that the maturity of micro mobile devices and strategies will drive massive growth in data analytics, storage and processing. Also, I doubt that anything can have a clear purpose when we call it the 'Internet of Things'. Did we learn nothing from prefixing all products and services with 'e' 15 years ago!? Power is the greatest cost to any data centre provider, regardless of global location - so becoming cost competitive and more sustainable requires us to begin to generate and source power differently. After all, the greatest proportional rise in power costs has been due to distribution costs, not power production. Data centres of the future will witness a complete re-engineering of how they process, manage, connect and store. We will evolve from buying homogeneous blocks of capacity, whether defined by units of racks or by the time it takes to compute. The data centre will morph into banks of homogeneous modular processing, flash and connection concentrations that drive out many of the inefficiencies and under-utilisation of previous generations. In the future we will purchase capacity subscriptions in the same way that, as domestic and commercial consumers, we buy power, television and mobile services today. Connectivity will be ubiquitous, one package which connects our lives and all our devices. Subscription to connectivity packages will provide limitless connectivity, regardless of the device and locale, to hubs which process, protect and store our unique data. These will drive a transformation within the data centre, a movement to large blocks of centralised processing and memory, which is allocated based on a quality of service and subscription package. 'But that's a Mainframe' I hear you say – well yes, processing will be delivered from banks of clustered devices, be they CPU or GPU, which will be consolidated and optimised with the emphasis on intelligent grid management systems that only invoke more capacity on demand. These will be supported by flash systems delivering shared memory and the first tier of storage to support the 'always on' instantaneous experience we are now beginning to demand. Multiple levels of storage will then use touch, data age and retention profiles to determine how data should be stored and indexed.
All this will be managed by the connectivity strategy, which constantly analyses our data streams, usage and services against the best subscription packages to allocate resources most effectively. Let's think of these re-engineered facilities as Smarter data centres. So what powers these SDCs? I remember reading about the impending maturity of nuclear fission 20 years ago and the excitement of what it could become; yet taking nuclear fission mainstream remains our greatest challenge. Data centres need two attributes: continuous power and connectivity. The source of either is, and will remain, important only in terms of unit cost. Over the years I have seen many data centre projects consider a wide range of local power sources. Whilst nuclear fission will undoubtedly solve a number of issues, regulation and risk will ultimately govern its adoption. Creating smaller local pods of nuclear waste production from data centre fission cells will need a clear strategy, as will managing and transporting fuel and waste to restricted and registered sites. Fission research led to the creation of nuclear weapons, and building a nuclear capability within a data centre facility will require another level of control and governance which will need political approval. Given the challenges we have already faced in getting clear, considered and well-invested strategies for long-term sustainable power generation in the UK from already approved and regulated suppliers, this cascading down to local regulated operators within six to ten years is not conceivable, even if we are being optimistic. Are we really going to see mainstream global capitals approving nuclear fission cell deployments without the appropriate regulation and risk management frameworks in place? The data centre of the future will be locally sustainable. We have to address the issue of continuous power across many generation strategies such as biomass, solar, wind or geothermal. Many of these require maintenance at some point, as will nuclear fission infrastructures, so we either fall back to national infrastructures sized for peak demand, or we deliver continuity through temporary or even permanent redundant onsite generation. The data centre of 2020 will have three main considerations. There will be a need for more space to support the generation strategy. We will be required to change our thinking around long-term capital investment strategies to support either the refit or implementation of new technologies; the new technologies will trade long-term unit cost reductions for even greater initial investments. Finally, there will need to be greater central and political support. Without consideration in the realms of investment, policy, governance and innovation, in 2020 we will purely deliver the 'Internet of Things': there will be no clear definition, and confusion will continue to reign supreme.

### Navigating the hybrid highway

In the second part of our hybrid special Bree Freeman explains the potential pitfalls of adopting the hybrid cloud path. This year hybrid cloud has shaped up to be the go-to model for enterprise users. No matter what your organisation's size, all the figures point to the same thing – you're going to be in the hybrid cloud majority soon, if you're not already.
And it's easy to understand why, as the hybrid cloud offers a number of benefits to big business, such as allowing a firm to retain sensitive data behind its own firewall while taking advantage of the lower cost and flexibility of the cloud. It can also improve scalability and provisioning at decreased cost, allowing resources to be allocated to the cloud for short-term projects at much lower cost than it would take to make changes to your own infrastructure. Plus there's one often overlooked advantage of the hybrid option: that it allows companies that are reluctant to move to a cloud model the opportunity to dip their toes into it at low cost and with less perceived risk – a significant benefit for those companies with management that take a more conservative approach to technology. But as with most things in life 'knowledge is power', and despite all that is good with a hybrid there are potential pitfalls that you should be aware of. One such pitfall is also hybrid's advantage: security compliance. While a hybrid cloud enables you to comply with certain regulations, all the while still being able to reap the benefits of public cloud, you now have two different services on which to ensure you remain compliant – your public and your private cloud. Failure to do so could defeat the entire purpose of deploying the hybrid cloud approach in the first place. There can also be issues around infrastructure dependency: reliance on your internal IT infrastructure is one of the major drawbacks of implementing a hybrid cloud. To mitigate the impact of an outage a firm would need to ensure redundancy across data centres. You can do this by using multiple data centres from a single provider or by engaging multiple providers. Another area you really need to do your homework on is the potential for complications from poorly constructed SLAs. Make sure you have detailed SLAs drawn up for both your private and public cloud providers. This way you can ensure that both providers can meet your expectations. At the same time you also need to have a realistic approach towards distribution of workload. My advice is to look for potential integration issues that ultimately could disrupt services. Complex networking is also an area that you or your company should be aware of. In a hybrid set-up the network plays the most important role; it is imperative for the smooth transfer of data. What is more, the APIs used in a hybrid environment demand complex networking as well, so ensure that you bring this up with your provider before signing on the dotted line. Another area that you need to be aware of is data sovereignty, because when data flows from a private to a public cloud, you could possibly breach data sovereignty laws when the data in the public cloud moves out of a country's borders. So be warned that multiple service locations can expose you or your company to unforeseen jurisdictions (a short sketch below illustrates the sort of guard you might put in place). Also be aware of your data management – I know I'm more than likely teaching you to suck eggs – but constant movement of data between the private cloud and multiple public clouds may add to the complexity of managing and tracking the location of your data. Plus the cost of data mobility between clouds can be substantial when moving large amounts of data between your clouds.
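To make the data sovereignty point a little more concrete, here is a minimal sketch – with entirely hypothetical dataset names, regions and rules – of the kind of guard a firm might place in front of any job that replicates data from a private cloud into a public cloud region:

```python
# Hypothetical sketch: block replication jobs that would move data
# outside the jurisdictions a dataset is allowed to live in.
ALLOWED_JURISDICTIONS = {
    "customer-records": {"UK", "EU"},        # must stay within UK/EU
    "marketing-assets": {"UK", "EU", "US"},  # free to move more widely
}

PUBLIC_REGIONS = {
    "eu-west-1": "EU",
    "uk-south": "UK",
    "us-east-1": "US",
}

def can_replicate(dataset: str, target_region: str) -> bool:
    """Allow replication only if the target region sits in an allowed jurisdiction."""
    allowed = ALLOWED_JURISDICTIONS.get(dataset, set())
    return PUBLIC_REGIONS.get(target_region) in allowed

if __name__ == "__main__":
    for dataset, region in [("customer-records", "us-east-1"),
                            ("marketing-assets", "us-east-1")]:
        verdict = "allowed" if can_replicate(dataset, region) else "BLOCKED"
        print(f"{dataset} -> {region}: {verdict}")
```

A real policy engine in a hybrid deployment will be far richer than this, but the principle is the same: encode the jurisdictional rules once, and check every data movement against them before it happens.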
Other niggling areas that you really do need to be aware of include: replication of confidential data between clouds, which can expose your company to potential data security risks; and operational errors, which can occur even with ISO 27001-compliant public cloud providers. There's also the chance that public cloud providers may shut down their services, leaving an insufficient window for you to migrate your data – you just need to look at the 2013 Nirvanix scandal to see what I mean. However, don't let me scare you off the hybrid path, as there are a number of 'good guys' out there that can make your journey an easy and open one; in my opinion these include Amazon, Microsoft, Rackspace and IBM. And here in the UK we boast a vibrant hybrid cloud market for small- and medium-sized businesses that is being headed up by some great providers such as Attenda, Niu Solutions, Claranet, Pulsanet, Commensus, Nasstar and, last but not least, ioMart.

### Infinera Cloud Xpress now shipping

Infinera has announced that it has started shipping the Infinera Cloud Xpress. The Cloud Xpress is a family of metro optical platforms designed for network operators delivering Cloud-based services to consumers and businesses worldwide. Infinera has shipped the Cloud Xpress for production deployment as well as field trials to multiple customers. Cloud Xpress customers include Internet content providers, data centre operators and large enterprises. Infinera first announced the Cloud Xpress this autumn as part of the Infinera Cloud Xpress family. The Cloud Xpress platform is optimised for the metro Cloud, the transport network that provides data centre connectivity within a metro area. A small, stackable platform, Cloud Xpress is designed to simplify the deployment of high-bandwidth connections while lowering the power and space required to scale metro Cloud networks. As Cloud adoption increases, large network operators are reporting a magnification effect on incoming traffic, such that a single request from an end user can generate a dramatic increase in the amount of East-West traffic between data centres. This magnification effect is dramatically accelerating the deployment of high-capacity transport solutions into the metro Cloud. The Cloud Xpress is designed specifically to address this need. "As Cloud applications and services have grown in importance, the need for high-speed data center interconnection has grown at a very high rate," said Paul Parker-Johnson, practice lead, Cloud and virtual system infrastructures at ACG Research. "This has generated the need for new transport solutions with leaner form factors, higher densities and superior operating efficiencies. Cloud Xpress is well-positioned to participate in the emerging service provider data centre interconnect transport market, which we expect to reach $4 billion by 2019." "Meeting our commitment and shipping the Cloud Xpress in Q4 2014 is a key milestone for Infinera as we enter the metro Cloud market," said Stuart Elby, senior vice president of Cloud network strategy and technology at Infinera. "The Infinera Cloud Xpress marks a better way to bring innovative Intelligent Transport Network solutions to the metro Cloud market by delivering a purpose built product with the right form factor, space and power profile." The Infinera Cloud Xpress features one terabit per second of input and output capacity in just two rack units with up to 500 gigabits per second (Gb/s) of line-side capacity and a mix of 10 gigabit Ethernet (GbE), 40 GbE and 100 GbE client-side interfaces.
The Cloud Xpress is designed with a rack-and-stack form factor and a new software approach that enables it to plug into existing Cloud provisioning systems using open software defined networking application programming interfaces. In addition, the Cloud Xpress consumes half as much power as the leading solutions from the vendors providing comparably-sized metro Cloud platforms. For more information on Cloud Xpress, visit: www.infinera.com/go/cloud ### Node4’s on-net with Virgin Media Node4 has announced that its Leeds Data Centre is now on-net with Virgin Media. The Virgin Media interconnect adds a significant capability to the existing resilient and flexible tier 1 network connections available from Node4's Leeds facility. This additional capability brings Node4's Leeds data centre in line with the company's other facilities in offering interconnects into tier 1 networks including BT, Vodafone, TalkTalk Business, KCOM and Virgin Media. Chris Pagel, Network Services Manager at Node4 commented: "We're dedicated to having the highest quality solutions range using the best suppliers possible at all of our Data Centres. The addition of Virgin Media interconnects to our Leeds Data Centre demonstrates our continuing commitment to extending the capabilities of our Data Centre facilities and enhancing the flexibility of our offering for customers." ### Bah Humbug Cloud It's a weird moment when you sit down to write, having been asked to do a little piece for Compare The Cloud, and all you can think of is one Andrew McLean’s best Scrooge face on their Christmas podcast flyers... Personally, I quite like Christmas; generally speaking everyone is in high spirits and for the briefest of periods Reading, one of the places that Pulsant calls home and my own professional base, stops being one big traffic jam. Using the definition of Christmas from everyone’s younger years of "that time of year you get new toys," Christmas for me professionally was many months ago. As one who frequently beavers away in the background, behind the cloud, most of my new toys arrived during summer. The massive boom for a lot of our customers in the run up to the festive period caused a lot of nervous conversations in Autumn around planning the capacity they would need to stay up and running. I'm told it's always tricky! But we know that... We have to make a good estimate of how much capacity our customers will estimate they need to survive Christmas and get it in before they ask for it! Our Enterprise Cloud, in providing that flexible capacity our customers enjoy running up to Christmas, actually causes Pulsant quite a headache. Turning that bad problem of "we've run out of capacity because our sales were stronger than we predicted" into the good problem of "we needed more capacity because our sales were stronger" in a matter of minutes is great for our customers but it takes Pulsant several weeks, if not months, to put in large quantities of blades, storage systems and network devices. As a result, and at the risk of highlighting that I'm a massive nerd, I got loads of those cool new toys quite a while ago. So, Christmas for me at work is a relatively quiet affair. Our customers are booming and my esteemed colleagues are helping them boom, on the infrastructure we prepared ages ago. I guess my Christmas present is that warm and fuzzy feeling that actually… we are really quite good at this! That or socks. 
If you'd like to know more about how our Enterprise Cloud helps our customers cope with the demands of Christmas you can read all about it on our website. It is, however, with regret that I must inform you that we do not sell socks!

### Hybrid: The Scrabble proof point

It always seems to me that, like the gold rush miners of the 19th century, IT practitioners are always looking for the pot of gold or the killer application that will define or justify their efforts, investment or focus. Maybe it's an urban myth, but it is said that the only people who really made money out of the gold rush of that time were the people who supplied the picks and shovels – effectively the infrastructure. That got me thinking that maybe the killer application for cloud is errr cloud, or more specifically hybrid cloud. I have to say when the concepts of cloud first became mainstream, I had a hard time getting my head around the concept of a hybrid cloud. Not so much the idea of a bridge between the worlds of private and public, but specifically how you really build something that provides a seamless bridge between two often separately owned and controlled environments. I also questioned the value of a hybrid cloud environment. Interestingly – and yes, slightly flippantly, referring back to the title of this blog – if you spell out the words 'Private', 'Public' and 'Hybrid' in Scrabble tiles, it turns out that private and public score 12 each, whilst hybrid scores 15 (a throwaway sketch a little further down checks the arithmetic). The additional value in a hybrid model was always there; maybe it just needed realising. It could be argued that those who built a private cloud environment, had access to a public cloud offering, and then manually moved workloads between the two were operating, in essence, a hybrid cloud. The reality, however, is that there has always been a technology gap, an enabling piece of the jigsaw that would fully automate the link between private and public to a point where the two literally appear as one. Enter Software Defined Networking. Network virtualisation, in the shape of both SDN and NFV (Network Functions Virtualisation), is collectively the biggest disruption to the networking and IT space in our lifetime. Over time, they will bring together the worlds of 'IP and IT' in a manner that has simply not existed previously and, in doing so, not only will they transform the business model of IT service delivery but they will also open up new service opportunities. Hybrid cloud is one of those new service opportunities. The elephant in the room that prevented the hybrid model becoming more prevalent earlier in the lifecycle of cloud was the network. Whilst apps, servers and storage developed a more automated orchestration framework that allowed true workload mobility, the network remained relatively static and manual. What software-defined networking has done is to integrate the network into the same automated orchestration framework as the apps, servers and storage, allowing for seamless workload mobility not only within a data centre but also, crucially, between data centres. Security and data sovereignty notwithstanding, the hybrid cloud is now open for business and with it come a number of potential new use cases such as seamless cloud bursting, new models of disaster recovery and much greater control of application mobility and location than has ever existed.
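For the curious, the Scrabble arithmetic mentioned above is easy to check. Here is a throwaway sketch using the standard English tile values – nothing more than a back-of-the-envelope confirmation:

```python
# Standard English Scrabble letter values (only the letters we need here).
TILE_VALUES = {
    "A": 1, "B": 3, "C": 3, "D": 2, "E": 1, "H": 4, "I": 1, "L": 1,
    "P": 3, "R": 1, "T": 1, "U": 1, "V": 4, "Y": 4,
}

def scrabble_score(word: str) -> int:
    """Sum the face value of each tile in the word (no bonus squares)."""
    return sum(TILE_VALUES[letter] for letter in word.upper())

for word in ("private", "public", "hybrid"):
    print(word, scrabble_score(word))
# private 12, public 12, hybrid 15
```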
Bringing security and data sovereignty back into the picture, the freedom of having the ability to seamlessly control where workloads are best run, combined with the ability to locate storage and data appropriately, is a flexibility 'win-win' for most IT departments. Looking to the near-term future, the much-discussed Internet of Things will, in my opinion, massively ramp up the need for seamless hybrid cloud environments. The amount of data generated by literally billions of devices will require a Big Data model on a scale that is hard to imagine today. That can only happen in a truly distributed but controlled delivery environment. Maybe, just maybe, the timing for both SDN and hybrid cloud could not have been better.

### New survey reveals telecom trends for 2015

2015 – the year of digital and cloud for communications providers? 79% of communications providers are held back by their current IT systems, but next year they must focus on building the right foundations for future success. Despite the hype surrounding trends such as the Internet of Things and wearable technology, telecommunications companies need to focus their 2015 investments on building the foundations for a digital future, according to a new survey commissioned by global cloud technology company, CloudSense. The survey polled decision-makers at Communication Service Providers with over 400 employees across Europe and the US. The results show that over three-quarters of respondents (79%) are dissatisfied in some way with their current IT systems. This legacy infrastructure is clearly holding them back from effectively selling new products, as the majority (82%) say they cannot easily bundle across all products and services - a major disadvantage in such a competitive and fast-moving marketplace. When asked which technology will most impact their business in 2015, the answers spanned a range with no one clear winner. The most popular answer was the cloud (29%) followed by digital services (21%), whereas the Internet of Things and wearable technology only scored 13% and 6% respectively. Big data was cited by only 10%. "According to Ovum, the telecommunications sector will be one of the top industries for IT spending over the next 12 - 18 months. But the challenge for the vast majority of Communication Service Providers globally is that their current systems are not built to deliver future success. With a number of technologies impacting their company in 2015 their commercial success will be dependent upon how they sell and deliver those new technologies to customers. The numbers are clear; 79% of communication provider decision-makers admit their systems are holding them back when selling new products and 82% acknowledge they are limited in how they can bundle their products and services," says Richard Britton, CEO, CloudSense. "47% of our survey respondents say that 'winning new business' will be a key challenge for 2015, closely followed by 46% saying 'reducing operational overheads'. However, it is impossible to increase new business whilst reducing costs without a significant change in approach. Continuing to rely on ageing systems which undermine the order to cash journey impacts revenue, costs and customer experience." The survey showed that a vast range of issues continue to impact the customer journey negatively. For example, 35% were concerned that their customers still had a different experience depending on channel.
The next two most significant concerns for businesses were the continuing use of manual processes (highlighted by 31%) and the fact that order capture and management systems are not fast enough to change, leading to increased time to market for new products or services (30%). CloudSense Communication Providers 2015 findings: for a detailed infographic demonstrating the breakdown of results please see cloudsense.com

### The importance of uptime

How seamless uptime management contributes to operational peace of mind. Cloud computing has become the default way to deploy applications and services. Cloud computing offers companies, enterprises and start-ups the ability to avoid, or minimise, spending while leveraging the flexible nature of the cloud infrastructure to meet growing business needs. A growing challenge for applications is obtaining optimal availability at all times. Today, cloud-based infrastructures are often built from a large number of systems geared for elastic scalability while keeping hardware costs to a minimum. These flexible scenarios mean that certain components are expected to fail. Enterprises are designing applications to tolerate occasional downtimes, or at least devising an application with the ability to bypass potential failures. Even with all of the precautionary measures in place, writing or rewriting existing applications for optimal cloud usage can be labour-intensive and involves a significant investment of costly resources. Delivering availability for each application, at the right time, requires a considerable understanding of usage patterns. By nature, each application is designed to sustain certain capacities. Designating fixed availability is usually not a viable option as certain factors, like patterns of usage, are not being considered. Beyond focusing on avoiding downtime, which is of high importance, would professional uptime management constitute a seamless solution to further operational concerns?

What is Uptime Management all about?
Uptime Management is a set of services and tools designed for controlling, monitoring and optimising operational productivity. Proper uptime management is indeed crucial in averting emerging issues, solving critical situations and reducing downtime. Furthermore, Uptime Management encompasses a disaster recovery mechanism in the event of an emerging issue. Here are the seven main services that Uptime Management should encompass:
- 24/7 NOC centre
- Real-time monitoring platform
- Tier 1+2 services
- DevOps
- Run-book operation and centralised dashboard
- Infrastructure maintenance
- DR management

24/7 NOC Centre
At its core, Uptime Management is dependent on a 24/7 Network Operations Centre (NOC). The NOC is not only responsible for controlling the network and bare-metal infrastructure; the NOC actually manages the entire application and service operation. The NOC offers a broader, overarching analysis of the entire system operation. With this information, critical decisions can be approached in a proactive manner rather than with a temporary, reactive response. In this manner, the NOC services promote a hands-on, continuous, business-focused monitoring approach.

Real-Time Monitoring
A crucial part of the Uptime Management service is real-time monitoring. This functionality is dependent on two critical factors: requirements and availability.
The monitoring platform should be perfectly matched to the operational necessities of the specific business, and monitoring should be conducted in a humanised manner to assure availability and attentiveness at all times, ensuring that all emerging situations receive the necessary attention in real time. There are four layers of monitoring as part of Uptime Management:
- Bare-metal monitoring
- Network monitoring
- SLA monitoring
- Application monitoring

It goes without saying that all four layers of monitoring should be carried out in a precise and centralised manner. In other words, Uptime Management provides a unified view of the entire IT operation, which brings confidence and stability, enabling decision-makers to allocate skilled resources to other tasks and assignments within the organisation. Indeed, professional monitoring means continuous service leverage, as changes and updates to and from the cloud are constantly being implemented, e.g. new modules. Real-time monitoring of both the application and its infrastructure secures service smoothness, based primarily upon the critical assessments stemming from the humanised NOC operation.

Tier 1 Services
A tiered IT support structure enables an organisation to maximise its staff resources by allowing NOC engineers to address routine activities, freeing up higher-level support engineers to focus on more advanced issues and implement strategic initiatives for the organisation. In a 24x7 proactive support environment, events or incidents reported by servers, applications or networks can be detected, classified and recorded via the monitoring tools and consequently solved. For the sake of improving efficiency, customised monitoring dashboards are then used to filter out any irrelevant events or false positives. Integrating a tiered support structure, utilising a 24x7 NOC, enables an organisation to detect, prioritise, escalate and efficiently resolve incidents without diverting the resources of development engineers.

DevOps
A complementary component of the Uptime Management scope is the DevOps framework. In this context the goal of the DevOps team is to increase agility during stress situations in live production by performing Tier 2 and 3 support in real time with the utmost efficiency, as per NOC/R&D requirements. Furthermore, a well-functioning DevOps team excels by utilising the structured architecture and enhancing network productivity while implementing additional monitoring procedures and graphs.

Run-book Operation and Centralised Dashboard
In addition to real-time monitoring, Uptime Management harmonises the run-book process with a centralised dashboard. Targeting functionality optimisation enables both the NOC team and the end user to benefit from a clear overview of the scale and extent of service productivity. By employing a run-book mechanism with a centralised dashboard, a sound and smooth knowledge flow within the organisation is established. Let's take a closer look at these two Uptime Management components. The centralised dashboard provides each authorised person within the organisation with a unified status view at any time, following pre-defined yet easily customisable key performance indicators and parameters. The run-book is a process, incorporated into the operational workflow, focused on distilling a clear and simple list of tasks and indices out of any architecture state, regardless of its complexity.
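To illustrate the run-book idea – and it is only an illustration – a single run-book entry might be captured as structured data along these lines (the alert name, fields and steps are all hypothetical):

```python
# Hypothetical run-book entry: a documented, repeatable response to one alert,
# so the knowledge is not locked inside a single engineer's head.
runbook_entry = {
    "alert": "app-latency-p95-above-threshold",
    "severity": "tier-2",
    "dashboard_panel": "application-monitoring/latency",
    "checks": [
        "Confirm the alert on the centralised dashboard, not just the email.",
        "Compare current traffic with the same hour last week.",
        "Check whether a deployment happened in the last 30 minutes.",
    ],
    "actions": [
        "Scale the web tier by one instance if CPU stays above 80% for 10 minutes.",
        "Roll back the last deployment if the errors started with it.",
    ],
    "escalate_to": "devops-on-call",
}

def print_runbook(entry: dict) -> None:
    """Render the entry as the simple task list the NOC would follow."""
    print(f"Alert: {entry['alert']} (severity {entry['severity']})")
    for i, step in enumerate(entry["checks"] + entry["actions"], start=1):
        print(f"  {i}. {step}")
    print(f"  Escalate to: {entry['escalate_to']}")

print_runbook(runbook_entry)
```

Captured this way, the same entry can feed both the centralised dashboard and the NOC's task list, which is exactly the harmonisation described above.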
This run-book process turns non-documented knowledge, accumulated by particular individuals, into meticulous and constantly updated event documentation. Consequently, this collective information reduces the dependency on single individuals within the organisation.

Infrastructure Maintenance
Routine preventive maintenance is perhaps the easiest and least painful way of bolstering server reliability. Regularly performing maintenance such as updating system software can go a long way in creating a data centre filled with servers operating at optimal levels, with minimal investment of resources or staff time. Organising and scheduling server maintenance ensures that all necessary work is performed when required, minimising the impact on the overall operation of the enterprise. At all times, maintenance work should be handled in such a way that the practice itself does not consume server uptime.

DR Management
Prevention is better than cure. In today's global online economy, 24/7 access to all organisational data and applications is a requirement for an enterprise's IT end-users and customers. Keeping your business running 24x7 under any circumstance is critical to preserving customer trust and ensuring success. A business continuity policy is the next step to protecting enterprise workloads against downtime. DR management is a managed service featuring a software-based replication platform to replicate production systems. Seamless Uptime Management means selecting and pursuing the right DR strategy for each organisation. A prerequisite for a well-functioning DR is to put together a DR plan, which reflects the organisation's business necessities by devising the key metrics of recovery point objective (RPO) and recovery time objective (RTO) for its operational and business-oriented processes. DR management facilitates continuous access to organisational data and systems even after a disaster, which is often associated with a severe lack of storage in a cloud environment. Furthermore, in a hybrid cloud infrastructure, well-organised DR management replicates both on-site and off-site data centres, so that in the event of a physical disaster servers can be brought up in the cloud environment, and vice versa.
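To make RPO and RTO slightly less abstract, here is a minimal, hypothetical sketch of the kind of check a DR plan review might run against the last replication and the most recent failover test; the targets and timestamps shown are illustrative, not recommendations:

```python
from datetime import datetime, timedelta

# Hypothetical DR targets for one workload.
RPO = timedelta(minutes=15)   # maximum tolerable data loss
RTO = timedelta(hours=1)      # maximum tolerable time to restore service

def meets_rpo(last_replication: datetime, now: datetime) -> bool:
    """Anything newer than the last replica would be lost; is that window within the RPO?"""
    return (now - last_replication) <= RPO

def meets_rto(failover_started: datetime, service_restored: datetime) -> bool:
    """Did the last failover test restore service within the RTO?"""
    return (service_restored - failover_started) <= RTO

now = datetime(2014, 12, 31, 12, 0)
print("RPO met:", meets_rpo(datetime(2014, 12, 31, 11, 50), now))   # 10 minutes behind -> True
print("RTO met:", meets_rto(datetime(2014, 12, 30, 9, 0),
                            datetime(2014, 12, 30, 10, 20)))        # 80 minutes -> False
```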
Uptime Management has always been a crucial matter. In many ways, it can be regarded as the 'final mile' of any IT operation. Once integrated, seamless uptime management directly reduces unnecessary issues and system downtime and, ultimately, guarantees operational peace of mind.

### Lawyers need the cloud too

The IT departments of law firms are focused on 'keeping the lights on', yet lawyers are demanding more ways for IT to help them drive efficiency and differentiate their firm. So how can cloud computing enable firms to fundamentally shift the focus of their IT departments? Most commentators accept the legal industry will not return to the type of market that characterised the years leading up to the collapse of the financial markets. Over recent months my conversations with senior stakeholders in a number of law firms have confirmed change within the sector is being driven by a series of key influences. These include:
- A more sophisticated approach to the purchase of legal services, and subsequently increased demands from clients in relation to price, transparency, security and quality
- Emerging markets of increasing importance
- The general globalisation of legal services
- New entrants to the market, including ABSs and LPOs.

Because of these influences we are seeing firms and in-house legal teams constantly seeking ways to be more efficient and innovative in order to satisfy their respective clients and remain profitable. So how does all this relate to cloud-based IT services? In brief, my experience strongly suggests that for a large number of law firms, technology is simply failing to support the business change that these organisations are striving to achieve. While there is common agreement amongst business leaders and IT directors that the key priority for their firms is to identify and implement ways in which IT can drive lawyer efficiency and deliver business value, the reality is that the majority of effort and investment is actually being applied to building and maintaining traditional on-premise or co-located IT infrastructures. These infrastructures, made up of servers, storage and networks, are expensive and increasingly complex to manage. Benchmarking figures show firms typically spend between 4 and 5% of their turnover on IT, and this often equates to several thousand pounds per person per year. With such a large fixed investment, IT departments subsequently spend their time maintaining it. Indeed, it is estimated law firms allocate approximately 80% of their spend and 85% of resources on operational activities. This means only 15-20% of the focus of an IT function is directed towards activities that will exploit technology to generate business value. In other words, the needs of law firms and their approach to IT investment are fundamentally misaligned. Cloud-based IT services, if used effectively, offer the opportunity for IT departments to fundamentally change the above model, shifting the focus of their teams to where their firms want and need it to be: finding ways for IT to make lawyers more productive. By adopting the cloud and accepting that IT infrastructure is now just a commodity, more IT resources can be assigned to roles and tasks that analyse business needs, align IT to legal requirements, and create business value. Instead of striving to be experts in servers, operating systems, storage and email, the IT department can become experts in finding ways for technology to be applied to enhance lawyers' processes. This could include improved mobility, enhanced business process, knowledge re-use, business analytics, and legal transaction management. One of the complaints often heard from IT directors and CIOs is that they are not given enough of a voice in the way their firms set and execute broader business strategy. This may be the case, but it will not change while the focus of IT leaders is on infrastructure technology. Managing partners and CEOs are not interested in hardware specifications or virtualisation techniques; they are interested in business efficiency and profitability. Until IT leaders start to focus on how technology can support these ambitions, they will not be invited to the 'top table'. The adoption of cloud-based IT services can deliver a range of benefits, including cost reduction and improved quality, flexibility and security. Though in my view, the ability to realign the focus of the IT department, IT investment and the IT director is the most important benefit of cloud to any organisation.

### Enterprise gets 10 from Apple and IBM

Apple and IBM introduce 10 business-focused IBM MobileFirst iOS apps. Microsoft has quite comfortably ruled the office for some time. Even with great, cloud-based advances from Google and near misses from Apple (iWork, anyone?)
capturing the daily business and enterprise market has not been easy. Microsoft has looked increasingly weak in other areas. Where PCs and Windows laptops once reigned supreme, iPads are now showing up in boardroom presentations, on trains, on planes and even in the coffee shop. The success of the iPad, arguably the catalyst for our increasingly mobile appetite, has given the outsiders a second shot at the crown. The shot in this title bout might come from the iPad, or iPhone, in your hand but the software is all IBM. Back in April, following a study that found 90% of global organisations surveyed were planning to increase their investment in mobile, IBM announced the expansion of its IBM MobileFirst portfolio. In July Apple and IBM announced a business partnership to develop business-centric apps together. And this week they have released the first salvo in the enterprise battle - IBM MobileFirst for iOS. Billed as "A Suite of Solutions Built for Transformation", IBM MobileFirst for iOS is an end-to-end set of apps developed to address security, cloud, integration, and app management and support. These apps aim to combine the power of enterprise data and analytics with an elegant user experience. Who would have ever guessed the killer enterprise app(s) the iPad had been missing was going to come from IBM? "What we're delivering aims directly at the new quest of business—smart technologies that unlock new value at the intersection of big data and individual engagement," said Bridget van Kralingen, Senior Vice President for IBM Global Business Services. "Our collaboration combines IBM's industry expertise and unmatched position in enterprise computing, with Apple's legendary user experience and excellence in product design to lift the performance of a new generation of business professionals." "This is a big step for iPhone and iPad in the enterprise, and we can’t wait to see the exciting new ways organisations will put iOS devices to work," said Philip Schiller, Apple's Senior Vice President of Worldwide Marketing. "The business world has gone mobile, and Apple and IBM are bringing together the world’s best technology with the smartest data and analytics to help businesses redefine how work gets done." Initially the apps will be focused on the finance, air-travel, and banking industries. Some of the big names supporting IBM MobileFirst for iOS include banking giant Citi, Air Canada and US telecoms provider Sprint. The IBM MobileFirst website, which is well worth a look, says the apps are "designed so enterprise systems work for people, not the other way around." It is about time too. For many of us, Microsoft's dominance has often meant tools just ugly enough to get the job done. The market needs challengers. Without competition there is no innovation, no new thinking. IBM MobileFirst for iOS might just be the opposition that not only takes on Microsoft but also helps enterprise become genuinely mobile.

### Juniper finds connectivity matters

Juniper Networks' Global Bandwidth Index reveals differences in consumer mobile Internet usage patterns and behaviour around the world. Juniper Networks has unveiled its first ever Global Bandwidth Index Report, which explores differences between how people use mobile Internet connectivity in their day-to-day lives at work and at home and what they hope to achieve using their connected devices in the future.
The report reveals the transformative impact of connectivity, with nearly all consumers surveyed in emerging markets (97 percent) reporting fundamental life changes in key areas of their lives. The Juniper Networks Global Bandwidth Index also found that people in developing countries often use connected devices as a tool for personal advancement and self-improvement, while in the developed world the focus is much more on convenience and efficiency. According to the study, nearly twice as many people in developing countries regularly use connected devices for educational purposes as those in developed markets. Further, 46 percent of respondents in developing countries use connected devices for professional development versus 27 percent in developed markets. In developed nations, on the other hand, people are more likely to use connected devices for practical day-to-day activities like banking (51 percent), shopping (41 percent) and searching for local information (42 percent). Juniper Networks commissioned the independent firm Wakefield Research to survey 5,500 adults in developed markets, including Australia, Germany, Japan, the United Kingdom and the United States, which are typically moving quickly to implement high bandwidth Long Term Evolution (LTE) networks capable of delivering mobile services up to 100 times faster than older networks. Wakefield also sampled consumers in emerging markets, including Brazil, China, India and South Africa, where networks tend to be slower and less reliable. The Global Bandwidth Index Report yielded the following key findings:

Transformative Connections
Ninety-seven percent of people in emerging markets reported fundamental life changes due to connectivity, including a transformation in the way they complete a wide range of essential and everyday tasks, from banking to accessing local information, enjoying entertainment, receiving health care and engaging in civic life. Compare that to 22 percent of consumers in developed markets who report that connectivity has not had a significant effect on their lives. The Global Bandwidth Index also uncovers a corresponding impact on people’s perception of economic opportunity. Forty percent of respondents in emerging markets report that connectivity has improved their earning power, compared with just 17 percent in developed markets. Sixty percent of consumers in emerging markets believe that connectivity has transformed their social lives, compared with 38 percent in the developed countries. "The Juniper Networks Global Bandwidth Index found that mobile connectivity has had a profound impact on how people communicate, work, learn and play around the world. It also suggests that this transformation will continue as new technologies emerge, network speeds increase and hundreds of millions of people who aren’t yet connected to the Internet gain access. The report reveals an opportunity for service providers to continue to deliver new, life-changing services in areas like education, particularly in emerging markets where there is a great demand," said Mike Marcellin, senior vice president, strategy and marketing, Juniper Networks.

The Education Opportunity
Education is a prime area in which people in developing countries are more likely to utilise the power of connectivity to help them get ahead. Thirty-nine percent of people in developing nations surveyed have experienced a significant transformation in their access to education thanks to connectivity. In developed countries, that number is less than half.
In India, for example, 45 percent of respondents say that connectivity has fundamentally changed how they access textbooks, complete coursework or use teaching tools, compared with just seven percent in Japan. Looking to the future, more than half of consumers surveyed in emerging countries would like to have more access to educational resources, compared to less than one-quarter in developed countries.

Not All Bandwidth Created Equal
Despite positive life changes, the majority of individuals in emerging markets report they have missed personal and professional opportunities as a result of connectivity challenges. Overall, 60 percent of consumers in emerging markets cited connection speed as the most common problem (compared with 27 percent in developed countries). Further, 30 percent of people in emerging markets stated that simply finding a connection remains an issue (compared to just 13 percent in developed nations). "Despite these connectivity challenges, the Global Bandwidth Index data shows that consumers in emerging markets are still significantly more satisfied with their networks than their counterparts in developed countries. The transformative impact of connectivity on peoples’ lives in the developing world is much stronger than the feeling that networks should be faster and more reliable. Meanwhile, in developed countries, high bandwidth connectivity is so commonplace that people are much more sensitive to interruptions in service," said Mike Marcellin, senior vice president, strategy and marketing, Juniper Networks. To receive a full report of the Juniper Networks Global Bandwidth Index, please email: JuniperUK@edelman.com

### The growth of the hybrid

In the first of our two-part special Bree Freeman looks at why more companies than ever are opting for a hybrid cloud solution. With 2015 literally just around the corner, the time for reflection is upon us. This past year we have seen the Cloud come out of its infancy and become entrenched in the fabric of our working lives. More and more companies have come to understand the potential of the Cloud and are embracing the technology with aplomb and a greater understanding of what the Cloud actually means for their business. For me there is only one word that encapsulates the Cloud market this year - the 'hybrid': the perfect marriage of private and public cloud services. This year hybrids have gained considerable traction, especially with enterprises, which are increasingly turning to hybrid cloud solutions to remain flexible and agile. According to the latest research by Technology Business Research, at least 20% of 1,600 companies surveyed have integrated at least two clouds, creating a £4.5 billion hybrid integration market this year. But why are enterprises turning to the hybrid option? Many experts believe that it's because companies are not prepared to completely trust all their information and services to public clouds. And there is good reason for that concern - you just have to watch the news and hear about Amazon outages, Azure outages, and unavailable services like Office 365, Dropbox, Gmail and so on. But could it be that businesses love the low cost and ease that public clouds offer, while also wanting the increased security and tailor-made aspects that private clouds provide?
One of the key draws of a hybrid model is that it allows the enterprise to keep its private information on premises, while at the same time providing employees with tools that support the new way of working — the so-called 'anytime, anywhere' access. So if the public cloud isn't right for everything, but organisations still want some sort of cloud, it usually ends up being a hybrid deployment. Whatever the reason enterprises are adopting a hybrid model – cost and on-demand flexibility are major factors for most that I've spoken with – hybrid adoption is rising and set to continue into next year and beyond. If you don't have a hybrid cloud now, research firm Gartner says you most likely will in the future. The firm says that hybrid cloud is today where the private cloud market was three years ago. By 2017, Gartner predicts that half of mainstream enterprises will have a hybrid cloud. And according to the results of a recent Cloud Industry Forum (CIF) poll of 250 UK senior IT and business decision-makers, 80% will have a hybrid IT environment by July 2015. "This means that nine out of 10 companies will continue to invest in on-premise IT alongside and integrated with Cloud solutions. In other words we are in fact seeing the normalisation of Cloud in the hybrid IT market," said Alex Hilton, CIF's CEO. And according to London-based independent analyst and consultancy firm Ovum, our European counterparts are also set to switch to a hybrid cloud. They recently surveyed more than 1,000 European organisations and found that almost half of them plan to set up a hybrid cloud within the next two years, despite private clouds dominating existing deployments, because these are a good fit for their business processes and workloads. In the case of embracing the hybrid cloud, the movement will likely continue to grow, though the pace of that growth may slow. More businesses are likely to adopt a hybrid cloud, which may accelerate the transition into a full-fledged adoption of cloud computing. Time will tell if this will be the case, but as confidence in the cloud grows and technology advances, businesses will likely be more comfortable making the switch. But be warned: don't go down the hybrid route without being fully aware of the potential pitfalls along the way. It is only natural that there will be a fair amount of trial and error before a company can start to maximise the many benefits of hybrid, and in part two of my report on hybrid clouds I will endeavour to guide you through the potential challenges and discuss strategies for a successful hybrid adoption.

### Are you a Workhorse or a Unicorn?

Big Data analysis - everyone claims to be doing it, but is anyone really getting it right? I sat down with Richard J Self from the University of Derby at the fifth Analytics and Big Data Congress to discuss the future of big data analytics. As the popularity of big data grows in the industry, the accuracy of analysis and the use of the data are coming under more scrutiny. For a good analysis of big data, Self explained the need for what is known in the data analysis community as a 'unicorn'. "Well, typically I think that there are two or three areas that need to be involved [in good analysis], one of them is that in the terms of the team involved in doing data analytics you've got to have the technical skills to extract the data and to load it into the analysis system.
"Then someone with the skills to do some statistics, the regression analysis, the correlation, whatever you choose, and then maybe some predictors because the decision makers want to know what's happening in the future. Then you need to have someone who can convey the message to the decision makers." "So you've got 4 or 5 different skill sets or if you're lucky you've got the unicorn - the one person with all of the hard technical skills and all of the soft skills, the problem understanding, problem solving, story telling and so on." Unfortunately those who have exemplary skills on the analysis end often don't possess the communication and story telling skills that are required to translate the results of big data analysis into utilisable information. There is a shortage of the 'unicorn' analysts - even of plain old workhorse analysts in fact. According to recent research demand for data analysts is ten times that of the graduates UK universities are producing. The demand for data analysts is exceeding the supply because of the excess of big data that is now being produced. "Most of the big data is from our corporate execution/operations systems, Internet of Things and social media. As I'm discovering now with location data - all of the data we have is of uncertain veracity - we don't know which elements are true and which elements are false or inaccurate." "The real problem in cleaning data is in identifying data that is erroneous, that is not correct and then removing that from the analytical process - typically a statistician would say 'oh we'll get rid of the outliers and we'll just use the stuff that falls within the normal distribution' but what we are discovering in business is that the outliers are sometimes incredibly important." In the past data analytics and statisticians used to discount the outlying data as useless and harmful to the mean. Big data analytics has recently learnt that those outlying data points are in fact important as they help to determine the validity of the entire data set. The one data point lying outside the curve may in fact indicate a whole set of data that has been overlooked - or be a beacon for an upcoming issue. As we're beginning to analyse things like sentiment from tweets - the problem is arising that humans can identify irony, but we can't teach a text analytic system to understand that neural pathway. The system cannot be taught to understand that a sequence of words from one person is okay, but from another it would be an ironic statement. There is also the issues around colloquial language and spelling errors, it's all well and good to teach a system to correct basic spelling errors; but often Twitter is like reading a whole different language due to the character constrictions. "Big data is, it seems the latest magic silver bullet, it has certainly been high on the Gartner hype curve - but now it's dropping down and actually being actioned in business. We know that very large numbers of big organisations are using big data - or they claim they are in surveys." "Many big organisations are able to currently demonstrate improvements in revenue, profitability, and stock market evaluation through excellent execution of analytics. However I'm not entirely sure how representative all of that data is because it was said at the IBM insights conference in Las Vegas in November that 60% of big data analytics projects are actually not very successful." 
Despite all the hype around big data, no one really seems to be talking about the issues of privacy around that data. Clearly it is something we should be worried about because, let's be honest with ourselves, for the most part we're not great at protecting our data. The problems around data security and big data are with things like the data being produced by wearable devices - technically that data you are sharing with the insurance company is your data, because it falls under the data protection scheme as being attributable to a living person. An awful lot of new data streams coming from wearable technology are bringing up a whole new suite of problems. Let's say you own a watch with sensors in it that track all of your health and exercise data, and you've agreed to let that data go back to the organisation you purchased the watch from so it can be analysed. What about when, in a year's time, you upgrade that smart watch and sell your old one on - have you sold your data on with it? How do you clean the data out from that device? Is it possible to, and if so, how do you disconnect it from you? What systems are being put in place to protect your wearable data and differentiate you from the new owner? Your smart watch location tracking data is probably going to show that you spend almost every night at a certain set of coordinates - if someone gets that data they can easily deduce where you live. The problem is that we don't trust anyone to protect our data - we are reasonably certain that anonymising doesn't work very well. It can be tracked back through simple online services and information. In some cases it can even locate back to an appliance in your house - if you have a smart fridge it has to be chattering to something. When you're on holiday and your smart house is sending out data that says all your lights are off and you haven't used any heating in the past three weeks, it becomes very easy for someone to see that your house is unoccupied and decide to go snoop around. As a whole we're not very good at personal data security, and big data analytics currently hasn't found a way to help us with that. Self did explain that it is helping with other forms of security though. Many big banks are getting very good at big data analysis that can detect, and does find, where fraud is occurring within their systems. They are able to supply this information to the police and direct them to the next ATM that is most likely to be hit. Self said that it is impossible to train a 'unicorn' in a three-year degree; there is a distinct need for universities to start understanding the different roles that are involved in a data analytics team, to start catering for graduates who are employable at the different hard skill levels that are required, but also to start training them in the soft skills - at the communication level. Big data analytics unicorns are the dream of the future - but for now, as an industry, we will have to struggle on with a team of workhorses that have around a 40% success rate.

### Advertising giant WPP goes hybrid with IBM

WPP, the world's largest communications services group, has chosen IBM to provide hybrid cloud services. This new strategic partnership includes a seven-year, $1.25-billion services contract for IBM to transform and manage WPP's global technology platform. 
Under the agreement, IBM will provide a service delivery and technology platform that allows WPP to integrate its operations while delivering significantly improved productivity. The agreement will enable WPP to innovate new digital services that will be run and managed within a global hybrid cloud infrastructure. In 2013 WPP billed $72.3 billion and enjoyed revenues of $17.3 billion. Leading the field of advertising agencies, it has acquired 300 agencies over the past decade, and with this new agreement will continue to build its leadership in a digital and mobile world. This new flexible hybrid cloud infrastructure will enable WPP to expand the use of big data and analytics to drive decision-making and amplify the creative process. It will also enhance WPP's ability to more quickly deploy new products and services using an IBM hybrid cloud. "As the world's largest communications group, we are seeking to exploit IBM's cloud computing expertise to allow us to innovate and add value to both the service and the product we deliver to clients across 111 countries," said Robin Dargue, WPP Group CIO. "This announcement is a significant milestone in the deployment of enterprise cloud and extends IBM's position as the premier global cloud platform," said Erich Clementi, Senior Vice President, IBM Global Technology Services. "Our secure, open enterprise cloud platform will enable WPP to quickly deliver new innovations to their customers." For the full press release visit the IBM website.

### Calyx adds disaster recovery to IaaS

Managed services provider invests in business continuity options to expand cloud portfolio. Business continuity is becoming a greater issue for businesses considering the cloud. UK managed services provider Calyx has announced the launch of two key complementary services to its existing Infrastructure-as-a-Service (IaaS) solution. Following significant investment, the company can now offer both Back-Up as a Service (BUaaS) and Disaster Recovery as a Service (DRaaS) capabilities, meaning customers can enjoy a fully resilient solution with guaranteed availability. Responding to customer demand, the service provides a cost-effective alternative for medium and enterprise companies looking to take advantage of cloud technology without committing to large capital expenditure programmes. "Calyx' Cloud Portfolio is all about answering the needs of our customers. That is why we have invested in [data security and protection], these new and enhanced services. We now have infrastructure replicated in data centres in both the North and South of the UK and a market-leading solution that brings with it a guaranteed availability supported by stringent SLAs", said Steve Clark, CEO at Calyx. The solutions form part of Calyx' Hybrid Cloud portfolio, which enables customers to unite a variety of infrastructures – physical, virtual, public and private cloud – into a single, seamless environment that is both highly secure and highly scalable whilst remaining significantly more cost-effective than traditional infrastructure models. The portfolio is backed by Calyx' 24/7/365 service desk, which offers in-house 1st, 2nd and 3rd line support and is based at the company's headquarters in Manchester. Clark concludes: "Gartner predicts that IaaS will achieve a CAGR of 41.3% through to 2016, so it's clear that this is a growth area. 
Calyx's refreshed solution will give customers complete flexibility to choose the right infrastructure according to their individual needs and the nature of the task."

### The cloud's second wave

The world of cloud computing continues to evolve on an almost daily basis, and the recent exit of Rackspace from the IaaS market can be viewed as a major change in the evolution of the cloud. If one of the hyper-scale darlings of the IaaS space declares that it is now a commodity, and that they plan to move to managed, higher value/cost offerings, then what hope do Tier 2 and 3 players have for local markets or specific industry verticals? MSPs/CSPs looking to survive, and yes I mean survive, beyond the next three years need to find a niche and develop their tailored offering quickly. This transition from "Simple IT-aaS" to "Mission Critical-aaS" is what I am calling the second wave of cloud. If you consider x86 virtualisation technology as a precursor to cloud, the adoption cycle follows a definite transition: first the virtualisation of the test and development environment in order to identify the benefits; next, as the processes mature, the roll-out of VMware across the entire production x86 estate. While this environment may achieve all the anticipated results planned for with this new technology, the UNIX and mainframe backend systems haven't been touched. All that virtualisation has achieved is removing 50% of the cost from 30% of the overall IT budget. Cloud is following a similar adoption model. It may be a noble pursuit, but moving an enterprise's 'Simple IT' to a public cloud provider will barely move the needle when you take a macro perspective on measuring your overall IT budget. If you manage to take all in-house x86 systems and move them off premises, you will achieve savings and clear the decks of a number of issues and support nightmares in the process, but will you really affect your bottom-line IT cost base? No longer can an MSP simply throw a few x86-based VMware guests up on the web and hope to build a sustainable business. Cloud service providers must offer a specific software solution to a vertical. That solution must include, and uniquely understand, how to offer quality of service, availability and security. As this evolution to what I call the second wave of cloud workloads, or "Mission Critical IT-aaS", becomes the new battleground for MSPs, the focus will be on innovation and daring to be different. This next battleground centres on how to take the 99.999% availability, 24x7x365 applications that run your business, and deliver them as a service either on-premises or (preferably) off-premises in the public cloud. My belief is that only when you can deliver on this project will you truly be able to declare victory. So how do you achieve 99.999% in the cloud when the current leader, according to Gartner's Magic Quadrant for Cloud IaaS, is only able to deliver at best 99.95% reliability? One approach I would advocate is to map out your mission critical workloads. Look for common components of the applications, such as database and middleware, and look to decompose your applications into horizontal 'utilities' that can be delivered as a service. One example would be the enterprise database. I am sure in any large corporate there are hundreds of servers running databases, such as Oracle and DB2 on UNIX servers. Why not try to consolidate these databases into a horizontal utility that can be provided to any application that requires a database? 
This way you can centralise demand and offer Database as a Service (DBaaS) to your applications. If you take this approach you can then, based on latency and other factors, decide whether this utility is on-premises or in the public cloud. Following the crowd and hoping to be lucky could see some of the 35,000 MSPs currently operating globally go to the wall and out of business. This is to be expected: as any marketplace matures, new entrants are consolidated and the successful survive - call it the service provider equivalent of survival of the fittest. So to all those MSPs out there not wanting to become the next dodo, I suggest you take a long hard look at the building blocks of your business. Consider your infrastructure and ask yourself one hard question: have you dared to be different? Or have you followed accepted practice and simply built a derivative of what everyone else in the industry is offering? However, a word of caution: if you look to aggregate multiple databases onto one utility service, then that service had better scale, be available and be secure. "No cloud platform can scale, be available and offer enterprise-grade security on the levels I need to offer high I/O workloads such as database as a horizontal utility," the dissenting voices may say. You may not like the answer, but better to know now rather than let the market find you out: there is one cloud platform that can handle enterprise-class cloud workloads in one system... IBM's new Enterprise Cloud System.

### City Network takes OpenStack to Europe

Sweden's City Network, a leading provider of infrastructure services, today announced that it is the first European provider to offer OpenStack in multiple data centres across Europe. City Network is one of Europe's fastest growing Infrastructure as a Service (IaaS) providers. And now, its customers can benefit from OpenStack's transparency, security, and scalability. The OpenStack platform is supported by nearly 20,000 developers and most of the major IT companies. "The OpenStack revolution for cloud parallels the Linux revolution for operating systems," says Johan Christenson, CEO of City Network. "It's the future infrastructure platform for all types of cloud-service operations that we can now offer customers. Since our launch in 2002, we've concentrated heavily on giving customers the highest possible transparency, while maintaining security and flexibility. With OpenStack, we take a historical step forward in this effort." Initially, City Cloud (City Network's IaaS) is available on OpenStack in Sweden and the UK. Recently published IDC figures indicate that the public cloud services market will turn over a staggering $127 billion in 2018 and that IaaS sales growth will accelerate six times faster than the IT market in general. In Sweden, IaaS is offered from City Network's data centres in Stockholm and Karlskrona, so its customers can have a fully redundant infrastructure - with the benefits afforded by OpenStack-enabled transparency, scalability, and security. During 2015, City Network will offer its services on OpenStack in all the markets it serves across Europe. Customers' requirements for open, scalable, secure infrastructures drove the OpenStack-based IaaS offering. Compared to large international competitors' proprietary solutions, OpenStack's greatest advantages are capabilities that enable upward and downward scalability. 
"In today's changing digital economy, it's critical to any business to be able to quickly adjust to infrastructure needs - regardless of whether the requirements are related to new projects or to establishing operations in new markets," says Christenson. "Being stuck in inflexible, proprietary and less transparent solutions is expensive and can be downright devastating for businesses. With the infrastructure delivered based on OpenStack, we remove these business risks." ### DaaS market heats up In part two of our series looking at Desktop as a Service (DaaS), Bree Freeman offers up some advice as to how to pick the right DaaS provider for your business. Patterns of working today are drastically different to the way in which we worked only a decade ago. With the rise of technologies such as the Cloud, today's workers are much more tech-savvy and have brought about a dramatic shift towards a more mobile working environment. As a result of this more companies than ever have had to become Cloud-wise instead of Cloud-wary and are now looking at desktop virtualisation as a key to dodge any potential pitfalls of having a mobile workforce. Today's virtual desktops have evolved somewhat since VMware re-introduced the technology back in the late '90s. With the aid of Cloud technology virtual desktops enable delivery to any type of device, whether that's a PC, Mac, phone, tablet, as well as permitting device-to-device session roaming, they also provide superior data security, have global availability, Oh! And offer a great user experience to boot. So much more than you can get with a tradition workstation any day! It is little wonder then that the DaaS market is predicted to trounce last years market size figures of just over US$290 million. Especially when you take into consideration that big Cloud players like Amazon are jumping on the bandwagon and honing in to the market so they can take their own piece of the DaaS pie. Add to the mix significant investments in R&D you can see why the DaaS market is heating up and provider offerings are becoming more and more robust as the market matures. Today's virtual desktops have evolved somewhat since VMware re-introduced the technology back in the late '90s. My advice is take the time to make the right choice for your organisation, as finding the right virtual desktop provider for you will depend on what your needs are and what the providers can offer you. Which is why before you adopt Cloud-hosted virtual desktops you need to consider business continuity, hardware compatibility, flexibility options and more. Different providers offer different capabilities; so take a hard look at the supplier's offerings first. So who are the players in this evolving arena? Well, overall specialised firms, such as VMware, Citrix, Microsoft and Oracle, dominate the industry. Cloud architects such as Citrix, VMWare & Microsoft tend to pioneer their own solutions, which don't make it to the open market and these tend to be focused at the larger enterprise than the SMB. However, the big players do have channel partners, so if you're a SMB I suggest you look at using one of these independent providers. Why, well with an independent a business can maintain a high degree of flexibility, this is because independents can offer to 'mix' vendors, such as hosting a Microsoft desktop on Amazon Web Services. They will also be able to offer Citrix or Microsoft's RDP, Hosted Linux OS, Parallels for Macs and other products. 
The biggest issue that you will face is deciding which of these independents you should opt for. Well, if you Google 'virtual desktop provider' you will be hit with page after page of businesses saying they are 'a leading hosted virtual desktop provider', and it would seem that the industry is saturated with companies basically offering very similar products for very similar fees. For me though, I have my favs. VESK is one such company: it boasts that it is one of the UK's largest virtual desktop suppliers, and the truth of the matter is that VESK is not stretching the truth with this statement. They have a 100% uptime record and are the only specialist virtual desktop supplier quality tested and security approved by the UK Government. Plus they have a friendly team to boot. Nasstar is another UK-based provider that I would recommend you speak with. Their virtual desktop provides a single access point for many apps, ranging from the most commonly used – such as Microsoft Office, Microsoft Exchange email and Sage – to industry-specific and bespoke apps. The company also offers a range of on-demand, published and web-based apps that can be delivered through their offerings. Other cool players in the field include Cobweb Solutions, entrust IT, GoCloud Hosted Desktop UK and specialised firm Hosted Accountants. So all in all, when looking at virtual desktop suppliers you should focus on the features that are offered more than the cost. Each provider offers a unique solution of its own; select the one that best fits the needs of your company and in no time you'll be enjoying the benefits of desktop in the Cloud.

### Looking after accountants in the cloud

The needs of accountancy practices are hugely varied, and for a reason. Firms can be anywhere from one or two people through to 50+ staff. More importantly though, the IT knowledge and degree of technical aptitude varies even more greatly. A small modern firm will have millennial staff who expect Wi-Fi and document management, and don't see a problem with using social media at work. Conversely, a very traditional firm will still embrace paper files, even faxing, and of course keep the sacred office hours limited to 9-5. Probably the biggest challenge to face the industry over the last twenty years has been the adoption of desktop applications - software to deal with complex compliance work. Tax returns of all types, final accounts and company secretarial matters have been, for a long time, managed in many cases by professional software. Not only is that software potentially expensive, it typically requires a good level of IT knowledge to manage it properly. In addition to managing the software tools needed for staff, stakeholders and owners are also under pressure to recover the full working week from staff who are often billed out by the hour. Add on top of this complex software upgrades and it is easy to see why an accountancy practice would love to outsource its IT function if economically viable.

PaaS versus SaaS

This is where a specialist firm such as Hosted Accountants can help. We are simply providing a managed platform to free owners from the perils of managing and updating IT systems. 
We are not pretending to be a full SaaS solution - that will arrive, but it will take the industry several more years until there is a suitable, quality option available. For now the best way to embrace the cloud is to utilise a cloud (data centre) based server and access and run applications from there. Modern RDP/Citrix technology means this can be done on any device, anywhere in the world, subject to some form of internet signal. In our service provision model we manage not just the applications but also the other aspects of running an IT function that would have previously taken up staff time. This includes backing up not just daily, but daily to at least two geographically separate secure locations in the UK. Add into the mix anti-virus, a full telephone help desk team and server monitoring, and it is easy to see why the service is very appealing to firms. When considering providers it is important to examine all the issues and requirements.

The Cloud Revolution

Many of the accountants we work in collaboration with are still confused as to why the cloud has suddenly appeared. They have been using the Internet, email, online banking and Facebook for nearly ten years - that is all cloud, held on a server somewhere waiting to be reached. For many this confusion has been further compounded by the arrival of pure SaaS bookkeeping applications that now compete for the market against more traditional packages. Numerous packages now compete in a true online SaaS model with traditional players such as Sage, IRIS and Thomson Reuters.

The Tipping Point

Adoption of such services and the whole cloud ideology has increased within accountancy practices but is still patchy, depending on the leadership of the individual practice and how much they trust "the cloud" to manage data for them versus on-site. Within the microcosm of the accountancy world our clients are bombarded with marketing messages, services, and new things to buy. Traditional vendors confuse the customer by offering hybrid or quick-fix solutions whilst they plan to produce a full SaaS offering, which will take years. Certainly for now a hosted DaaS platform is the ideal solution, allowing access to all legacy applications and data from anywhere, without the worry of backups and security in-house. For all the cloud marketing and adverts, the real appeal for accountants is just being able to use all their stuff anywhere. I think this will happen eventually but there is a long way to go and a huge market at stake for those involved.

### Businesses expecting cloud IT infrastructure

Survey of IT decision makers demonstrates the cloud is increasingly seen as standard practice for IT provisioning, with a majority of businesses expecting to have fully cloud-based IT infrastructure in the next five years. Attendees at Cloud Clarity, an educational event hosted by Node4 last week to help businesses better understand cloud services, confirmed that they expect to have moved to a fully cloud-based IT infrastructure within the next five years. A survey of the IT decision makers present found that 53% agreed that it would be very likely, although there is still a significant minority of 27% that thought it would be unlikely for their business. The survey found that managing email is a key driver for cloud adoption, with 47% of attendees agreeing that it would benefit their business to host mail applications in the cloud. 
Data storage and file sharing were also cited by over a quarter of attendees as key applications that would benefit from being moved to the cloud. Telephony and voice communications was the other critical business application that the decision makers would like to see in the cloud. "More and more businesses are evaluating cloud services as a tool for reducing costs, limiting risk and providing the flexibility they need to run their businesses in a fast paced, technology driven workplace. Rather than a risk, cloud services are now mostly viewed as an IT solution that gives organisations of all sizes access to the latest and most up to date technology, enabling businesses to gain competitive advantage," commented Steve Denby, Head of Sales South, Node4. "However, unless you have exactly the right solution for your business' individual needs then it is unlikely you will get the full benefits that the cloud can deliver. This means going beyond the accepted limitations of private, public or hybrid solutions and entering a new era of truly customised cloud solutions. The onus is on vendors to provide these capabilities and businesses need to ensure that they are working with the right suppliers to ensure their business is getting real value from any cloud infrastructure." Cloud Clarity, held at The Gherkin, 30 St Mary Axe, London, focused on providing an overview of cloud technology as well as compliance in the cloud for businesses not currently engaged with, or contemplating making use of, cloud services. The event also featured talks from guest speakers Simon Hazlitt, author of Running on Air, and Neil Cattermull, Director of Cloud Practice at Compare the Cloud.

### Cloud interest still rising

The annual Computerworld Forecast Study predicts that cloud IT initiatives and interest will continue to rise next year. As the calendar year begins to draw to a close, research, analysis and year-end reviews tackle the performance of, and predictions for, the cloud market. Surveying 194 respondents - 55% of whom were IT executives, 19% from mid-level IT, 16% in IT professional roles and the remaining 7% from business management - Computerworld's annual study looks at the IT priorities for next year and considers budget allocation, staffing and IT projects. This year's survey found that when asked, "what is the single most important technology project that your IT department is working on right now," 16% said cloud computing. Replacing and modernising legacy systems came in second with 12%. Cloud projects are certainly growing in importance and are expected to cause the most disruption in the future, with the goal being to improve service and generate new revenue streams. The newest areas of spending were surprising. The Internet of Things (IoT) led the pack, with 32% of those surveyed considering its future as significant. High Performance Computing (HPC) was next with 22%, and energy-saving and carbon-reducing technologies followed with 16%. These newest spending plans are often deeply connected with cloud computing projects. Computerworld's survey also found that IT decision-makers largely plan to maintain their current levels of investment in almost all areas, with an overall IT budget increase of 4.3%. The findings, the first of many we can expect in the coming weeks, express the growth and potential cloud computing has in store for 2015.

### Thanking smartphones for the cloud

For many in the US today is the last working day of the week. Tomorrow is Thanksgiving and the day after, Black Friday. 
In consideration of this holiday, and partying like it's 1621 with a pack of pilgrims and a big spread, let's offer thanks to the smartphone. Why the smartphone? Simple: it has, more than any other gadget, helped make cloud computing more successful and increased the pace of adoption. This time last year, leveraging cloud-based analytics and aggregating some seriously big data, IBM reported an 18.9% increase in Black Friday online shopping over the previous year. Beyond the food and family, Thanksgiving is a good marker for economic performance, and every good measurement needs a baseline. Analysts have been optimistic for some time and have predicted significant growth for cloud computing. By 2015 the cloud market is estimated to reach $121 billion. Much of this growth is from cloud services, driven more often than not by our increasingly large mobile workforce. In many ways it is the consumer market that has helped drive cloud adoption. If we were to choose one category that has pushed the marketplace faster than any other it would be mobility; the rise of the smartphone, and subsequently the tablet, has done more than any other single contribution. Unsurprisingly this category does well out of Black Friday sales. The latest smartphone is often one of the biggest gifts come Christmas. Yes, there are better reasons for businesses to consider the cloud - keeping shadow IT in check, end-point management, data aggregation and network security - but without public interest adoption would be slower. In going mobile the public has pushed cloud computing along faster, making the gains we've enjoyed this year better. The move to mobile really begins with our adoption of smartphones, the apps that power them and the social media that binds us. In a short span we went from phones being phones, simply connecting us as disembodied voices on the move, to being a means to share ideas, comment and collaborate. Last Thanksgiving the "selfie" was named word of the year by Oxford Dictionaries. Since the announcement the public have continued to take more and more selfies. Uploaded to social networks, these photos are stored and shared on some of the biggest distributed networks. With the public having access to a greater range of services at home than in the office, it was only going to be a matter of time before they demanded the same from work. Many will argue that Bring Your Own Device (BYOD) was initially a response to the failure of businesses to meet these demands. Historically corporate IT hasn't kept pace with the consumer market, so people started using their own smartphones and tablets in the office. Over the last year we have seen the enterprise market start to catch up with consumer sentiment. Even small and medium sized businesses are investing in the cloud. More are moving traditional services (data and storage) off-premises and using web-based applications (productivity and CRM) to do business. Gartner anticipates the bulk of new IT spending by 2016 will be for cloud computing platforms and applications. Many of those apps will have started as a consumer proposition before their usefulness was applied to the enterprise market. The influence of our consumer drive to be more mobile is far reaching. So this Thanksgiving we should say "thank you" to the smartphone.

### DaaS for law firms

Cloud in Action: the team at VESK take a look at DaaS and suggest how it could benefit law firms. After speaking with dozens of mid-tier law firms, IT availability and uptime were the main areas of concern. 
The frustrations vented ranged from problems with existing Practice Management Systems (PMS) and integration problems between CRM and Case Management Systems (CMS), to digital dictation and document management issues. Most popular legal software vendors, such as Aderant, Elite and HP, have been adding new functions and modules to their existing suite of products. The problem is finding a way to integrate all of these applications from different vendors into a cohesive and centrally managed IT environment. This is where Desktop-as-a-Service (DaaS) comes into play. Instead of trying to reinvent the wheel by putting together bespoke application packages, law firms can outsource the Microsoft Windows and Domain services. By outsourcing these services they are reducing internal IT's workload, resulting in fewer "Have you tried turning it off and on again?" exchanges. Internal IT no longer needs to support and manage the end user desktop and device, because the following services have been offloaded: Windows desktops, Microsoft Office, Internet Explorer, desktop security and disaster recovery. The DaaS model manages these key features, allowing the law practice to work from anywhere, on any device, whilst being able to centrally manage its IT from a web control panel or directly through Active Directory. Thin clients can be implemented, further reducing the cost of maintaining ageing PC estates. Larger law firms run dozens of applications, and virtualising and migrating all of these applications could take months, even years. Many enterprise organisations do not want their data leaving their own data centre; DaaS is therefore the ideal solution. DaaS has the empowering benefits of cloud without removing the data and applications from internal IT's control and management. Thanks to recent increases in bandwidth interconnectivity between data centres, large bandwidth can now be implemented with minimal cost. Running interconnectivity between the data centre, where the desktops reside, and the organisation's server farm, where the applications and data reside, permits the publishing and streaming of applications between the two locations. This interconnectivity enables all of the business applications to appear as links or shortcuts on the virtual desktop, seamlessly provided to the end user. Virtualised applications can now be accessed from the virtual desktop, but what is more impressive is that server/client applications which have not yet been virtualised can also be published on the virtual desktop securely through Organisational Units and Security Groups. Essentially end users can access all of their applications and data from within their virtual desktop, without the applications being migrated or the data being moved away from the organisation's data centre. Application access and security policies can be managed easily, maintaining the full benefits of cloud computing but without the headache of huge-scale migrations.

### Flying high with DaaS

With adoption rates on the rise, the virtual desktop arena is entering a new dawn of its existence. With the aid of the mobility movement, businesses are turning to this once-spurned technology as a means to have better control over their documents, apps and programs virtually, from any device, anytime, anywhere, all with the aid of the Cloud. The move towards virtual desktops has been on the rise for a number of years. Fuelled by the mobile workforce, this once pooh-poohed tech has entered the Cloud sphere and boy has the market been growing. 
According to recent figures, the desktop virtualisation market is valued at $1.6 billion (bn) and is expected to reach $14bn by the year 2019, a compound annual growth rate of 63.7%, while analyst firm Gartner predicts an 8 to 10% market penetration by next year; those are some big numbers! So what exactly is a virtual desktop – also referred to as a hosted virtual desktop or Desktop-as-a-Service (DaaS)? Well, it is basically a desktop hosted by someone else. In other words, a provider lifts your current desktop off the physical PC or laptop, makes it virtual and hosts it for you by housing your information in one of their data centres and not on your device. Whatever that device may be - an iMac, an iPad or even the same PC you used before - it doesn't matter: as long as it has an internet connection you'll have a desktop, meaning you can work from anywhere, at any time, on any device. So a virtual desktop is basically a single, integrated solution that allows end users to get access to all of their applications regardless of type and all of their data regardless of where it came from, with their desktop accessible from any device on any network, anywhere in the world. A perfect solution for companies looking to deal with the conundrum of today's mobile workforce, so no wonder the industry is growing! In fact a number of Cloud-based desktop virtualisation platforms have popped up over the past few years, all of which purport to contain the magical formula that strikes the proper balance between operational efficiency and broad user flexibility. Be warned though, it simply isn't the case that all providers supply exactly the same form of virtual desktop; it all depends on the technology they have chosen. A very common method is to use Microsoft Terminal Services – Citrix also provide very popular methods – and connect to the desktop via RDP (Remote Desktop Protocol) on the client device – be this a PC, laptop, thin client etc. So, as a business you have a server environment in a data centre that has enough power to host multiple Windows desktops, and you have client devices in your office that have nothing on them but are connected to the Internet. The user then accesses the Internet to connect to the server and log in to their desktop. The user can now work as normal, but instead of a local PC running the applications - be that Word, Excel etc - they are run by the server in the data centre, which just sends screen images back and forth in exchange for mouse-clicks and keyboard presses. So why choose DaaS? Well, I could bang on all day about the overall benefits of incorporating a virtual desktop into your working world, but I shall spare you the snoresville version and simply say that security and disaster recovery are two very good reasons to look at DaaS, especially as BYOD and streamlined, automated systems management inform the decision to move to a virtual working world. Add to that business-continuity capabilities, lower operating and hardware costs, fast deployments, better support and reliability, and potentially lower set-up costs as well as monthly bills - which is beneficial to small businesses that won't have to lay out vast sums of money - with the added bonus of instant IT support. 
But before you start chatting with a provider, I would advise that you take a serious look at the cons of DaaS. The biggest hurdle with DaaS is licensing. Microsoft makes it almost impossible to license desktops for the Cloud; it is not averse to the desktop Cloud, but it does use its lead position to shape and control the market. If your company already has volume licensing with providers like Microsoft, you're less than likely to get a chance to use it for your desktops. Other downsides of DaaS include connectivity issues, reliability and concerns about leaving security in the hands of the provider. Your business also needs enough bandwidth to run a remote, hosted desktop. And custom applications, especially ones that need client/server access, can make DaaS adoption even more difficult. Overall though, despite the cons, for me if you integrate the high-value line-of-business apps and data needed to run a business, DaaS is a solution with strategic value. In part two of the series Bree will look at some of the big boys and smaller players in the market.

### Guide to Hybrid Cloud

### Azure Outage

Microsoft's recent Azure outage highlights the importance of service continuity and the challenges of relying on connectivity. As many businesses look to the cloud for improved sharing, storage, data and security, we are reminded that technical issues can affect the performance and continuity of services as much as trust. This week a fault with Microsoft's Azure cloud computing platform disrupted websites and online services across North America, Europe and Asia. Many of Microsoft's own products, including Office 365 and Xbox Live, were also caught up in the outage. Microsoft has seen its cloud service market share rise. According to recent Synergy Research, in the second quarter of this year Microsoft grew at 164%, compared to IBM at 86%, AWS at 49% and Google at 47%. As competition increases there is very little room for high profile outages. Writing on the Microsoft Azure blog, Jason Zander, CVP, Microsoft Azure Team, said "an issue was discovered that resulted in reduced capacity across services utilising Azure Storage, including Virtual Machines, Visual Studio Online, Websites, Search and other Microsoft services." During an upgrade rollout a previously undetected issue arose which prevented the front end from taking traffic. This in turn caused other services built on top to experience issues. "We will continue to investigate what led to this event and will drive the needed improvements to avoid similar situations in the future," he added. This latest outage is not the only one, nor will it be the last, to affect businesses. In May this year Adobe Creative Cloud customers also experienced a disruption of services. With some 1.84 million paid Creative Cloud subscriptions, the outage was widely discussed across social media, eventually making headlines in the press. Apologising, Jason Zander said, "our customers put their trust in us and we take that very seriously." Trust and reliable service provision are essential to cloud adoption. Security has often been cited as the main factor in cloud adoption and, while its importance is deserved, continuity should not be ignored. As the uptake in cloud services continues, providers will have to prioritise reliability.

### How durable is your cloud storage?

This post looks at how to interpret 'durability' claims of cloud storage providers. What does that long string of 9s mean when a service provider talks about how safe your data is? 
I'll argue that these measures of apparent data safety are largely meaningless. Much better is to understand what risk analysis the provider has done, which risks have been mitigated, and how they've done it – but this is the one thing that most providers won't actually tell you. Let's start with some basics of what a series of 9s might mean. If an SLA says availability is 99.99%, i.e. uptime, then this means that the service is allowed to be down for 1/10,000th of the time - on average. The time period matters. 99.99% uptime could mean a service is unavailable for 1 second every 3hrs, 1 minute every 7 days, or 1 hour a year. Lots of little outages are very different from the occasional big outage, which is why availability is typically specified on a monthly basis. In this context, within the SLA, the service can be down for no more than approximately 4 minutes during the month, either in one go or spread across the month. This is all pretty black and white and easy to understand. But what does a similar series of 9s mean for durability? For example, Amazon currently say for S3 that "Amazon S3 is designed to provide 99.999999999% durability of objects over a given year. This durability level corresponds to an average annual expected loss of 0.000000001% of objects. For example, if you store 10,000 objects with Amazon S3, you can on average expect to incur a loss of a single object once every 10,000,000 years." First notice the bit about 'designed', i.e. it is not a commitment, nor is it a statement based on evidence - after all, Amazon haven't been storing data for 10 years let alone 10 million. Indeed, many cloud providers simply won't give any quantified indication of the level of data safety they offer - not because they don't offer safe storage, but because it's really hard to provide meaningful numbers. Next notice the reference to 'loss of objects'. An object in S3 can be between 1 byte and 5TB. So does this mean that if I have 10,000 small files then I could aggregate them into one big zip, i.e. one object, and hence reduce the chance of data loss by a factor of 10,000? I'd have to store my zip for more than the age of the Universe before I could expect to lose it. Or conversely, if I split up all my files into single-byte objects, then does this mean I can expect to lose 1 byte out of every 1TB that I store? It would appear that simply changing the way I put data into Amazon could change data loss from an apparent impossibility to a near certainty! Amazon themselves say they have over 2 trillion objects in S3, so if their durability is 11 nines then does that mean it's already likely that they have lost at least one of these objects? I wonder who the unlucky customer is and whether they know? Actually, it could easily be that they haven't actually lost any data yet. But that doesn't mean they should necessarily advertise even more nines! The real question is what risks Amazon have addressed in their durability estimates and what measures they use to counter them. I'm not singling out Amazon here either; they have a pretty good data safety approach and track record. Rather it's about the need for transparency, and that applies to all cloud storage providers, including Arkivum. When we started the Arkivum business we looked at existing cloud storage services and the assertions they made on the safety of the data they stored. We immediately decided to short-cut all that confusion and offer a straightforward 100% data integrity guarantee and back it up with proper insurance. 
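To make those strings of 9s a bit more tangible, the back-of-the-envelope arithmetic below reworks the figures quoted above in a few lines of Python. It is only a sketch: it treats the published durability figure as an annual per-object loss probability (which is how Amazon's own example reads) and uses a 30-day month, neither of which is anything a provider actually commits to.

```python
# Rough arithmetic behind the availability and durability "nines" discussed above.
# Assumptions: durability is an annual per-object loss probability; month = 30 days.

MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200

def allowed_downtime_minutes(availability: float) -> float:
    """Monthly downtime permitted by an availability SLA."""
    return (1 - availability) * MINUTES_PER_MONTH

def expected_objects_lost_per_year(objects: int, durability: float) -> float:
    """Expected number of objects lost in a year at a given durability."""
    return objects * (1 - durability)

print(allowed_downtime_minutes(0.9999))                          # ~4.3 minutes a month
print(expected_objects_lost_per_year(10_000, 0.99999999999))       # 1e-07 -> one object every 10 million years
print(expected_objects_lost_per_year(2 * 10**12, 0.99999999999))   # ~20 objects a year across 2 trillion stored
```

The last line is the point made above about Amazon's 2 trillion objects: even at 11 nines, the model itself predicts on the order of 20 object losses a year.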
A flat 100% guarantee makes it really simple for customers to understand what they are getting from us. This helps differentiate us from others in the market. It also makes it simpler for us, because a 100% guarantee means we need to put data integrity first, so there are no discussions of what corners to cut or how to write wriggle room into our contract and SLA. But, inevitably, some of our customers are now asking us whether we can store data at a lower cost in exchange for a lower level of data safety. This is why we've launched a new 1+1 service that reduces redundancy in favour of a lower price. And then comes the question 'what are the chances of data loss' and 'how many 9s do we offer'. I'd argue that these actually aren't the best questions to ask. To illustrate the point, let's take driving a car as an analogy. If you get in your car, then statistically speaking you have a 99.99% chance of making it to your destination without breaking down. But there are 600 trips per person per year on average in the UK and 70M people in the population, so the total number of car journeys on UK roads is very high. That's why there are 20,000 breakdowns on UK roads each day. So whilst it is a near certainty that you won't break down when you drive your car, it is also a near certainty that someone else will! In statistical terms, this is what happens in a large 'population' subject to 'rare events'. The same approach is often used to model data safety - large 'populations' of drives or tapes and 'rare' corruption or failure events. The key word here is model - it's a prediction, not an observation. It's not as if a storage provider will actually load up PBs of data onto thousands of drives and store it for 10s of years just to see how much gets lost. Therefore, the numbers you get from storage providers are only as good as their models. For example, a really simple model might consider multiple copies of a data object (redundancy) and how likely it is for each copy to be lost (e.g. because it gets corrupted). A manufacturer of a SATA hard drive might say that the unrecoverable Bit Error Rate (BER) is 10^-14. This means that, on average, when reading data back from disk, one bit will be wrong for each hundred trillion bits you read - 14 nines of reliability. You might then think that storing two copies on separate drives means that there's a 10^-28 chance of losing just one bit in your data when you try to read it back, because if a bit is lost on one drive then the same bit won't be lost on the other. In other words, to all intents and purposes data loss is never going to happen. But this is like saying that the chance of a car breaking down is purely a factor of whether one part of the engine will fail or not. In the case of hard drives, it's not only BER that matters, but also the Annual Failure Rate (AFR) of the drive as a whole, which can be anywhere between 1% and 15%, i.e. a much higher risk. Of course, this is why we have RAID storage arrays, erasure coding, and other clever ways of protecting against data corruption and drive failures. Cloud providers use these as a matter of course. But whilst these systems and techniques make life a whole lot better, they don't eliminate data loss completely and they can introduce their own bugs and problems. Nothing is ever perfect. It is not necessarily the media failure rate (tapes or drives) that counts so much as how reliable the overall system is. And this isn't the end of the story by a long way. 
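A minimal sketch of that two-copy argument, using illustrative numbers rather than any vendor's real figures, shows how quickly the picture changes once whole-drive failure is included alongside the bit error rate. The 5% AFR and the assumption of no rebuild during the year are mine, chosen only to keep the arithmetic simple.

```python
# Why the drive-level Annual Failure Rate (AFR), not the bit error rate (BER),
# dominates the naive two-copy calculation. Illustrative assumptions: two
# independent copies, no repair or rebuild during the year, BER 1e-14, AFR 5%.

BER = 1e-14   # unrecoverable bit error rate per bit read
AFR = 0.05    # annual whole-drive failure rate (the article quotes 1%-15%)

# Optimistic view: the same bit is unreadable on both copies.
both_copies_bit_error = BER ** 2   # 1e-28 - "never happens"

# Pessimistic view: both whole drives fail in the same year with no rebuild.
both_drives_fail = AFR ** 2        # 0.0025 - roughly 1 in 400

print(f"{both_copies_bit_error:.0e}  {both_drives_fail:.4f}")
```

Real systems rebuild failed drives long before a year passes - which is exactly the job RAID and erasure coding do - so the honest answer sits somewhere between these two extremes and depends on the whole system, not just the media.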
Thinking again of cars on the road, the chance of an individual car breaking down isn't something that can be calculated in isolation. Breakdowns are often correlated. A queue of cars overheats in a summer traffic jam. An accident blackspot. The first cold day of the winter catching out those who haven't kept their antifreeze topped up or batteries in good order. Contaminated fuel from a petrol station. A pile-up on the motorway. Faulty engine management software from a manufacturer. There's a long list of ways that, if something goes wrong for one car, it will also affect several others at the same time. Exactly the same is true for storage. Failures are often correlated. Faulty firmware. Bugs in software. Human error. A bad batch of tapes or drives. Failing to test the backups. This is compounded when a service provider uses the same hardware/firmware/processes/staff for all the copies of data that they hold. And homogenisation is popular with cloud operators because it helps drive their costs down. But it means that it's no longer possible to say that the copies are independent. This dramatically increases the real likelihood of data loss. Worse still, if all the copies are online and linked, then a software error that deletes one copy can automatically propagate and cause loss of the other copies. To illustrate this, suppose you have 3 copies of a data object on three separate storage servers. Suppose you decide to check your data is OK at the end of the year. If the odds are, say, 1 in 1000 of losing one copy, e.g. because of a bug in a RAID array, then you might say that because the three copies are independent there's a 1 in a billion chance of losing all three, i.e. permanent data loss because there are no 'good' copies left. But what if you said that there is a 1 in 1000 chance of losing the first copy of the data, but correlation effects mean that if the first copy is lost then you revise the odds to 1 in 100 for also losing the second copy, and again, if this is also lost, you drop the odds to 1 in 10 for the third copy? The chances of data loss would now be 1 in a million. That's a big change. Correlated failure modes make a lot of difference, yet this is typically overlooked when people build reliability models - probably because the maths gets a lot harder! Long-term storage requires still further thought. To keep data for any significant length of time means media refreshes, software updates, migrations of storage systems and a whole host of other things. It also means regular checking of data integrity to pick up failures and fix them. Do this well and the risks are low, and there is actually an opportunity to increase data safety. In particular, regular checks on data integrity offer the opportunity to do repairs before it's too late. But do all of this badly and the risks of data loss when something needs to change in the system can go up dramatically. What it does mean is that you can't simply convert statistics on the chances of data loss right now into statistics of losing data over the long term. A 99.99% statistic of no car breakdown today doesn't translate into 10,000 days between breakdowns (about 27 years). How long you can go without a breakdown clearly depends on how well you maintain your car or whether you buy a new one that is more reliable (migration). Just the same is true for storage. 
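Before moving on, the three-copy correlation example above is worth putting into numbers. The sketch below simply multiplies out the probabilities used in the text; the conditional odds are standing in for shared firmware, staff and processes, as described, rather than measured values.

```python
# The three-copy example from the text, worked through.
# Assumption: per-copy annual loss odds of 1 in 1000, with the conditional
# odds (1 in 100, then 1 in 10) representing correlated failure modes.

p_first = 1 / 1000

# Independent copies: the same odds multiplied three times.
p_loss_independent = p_first ** 3                      # 1e-09 - "1 in a billion"

# Correlated copies: once one copy is gone, the next is far more likely to follow.
p_loss_correlated = p_first * (1 / 100) * (1 / 10)     # 1e-06 - "1 in a million"

print(p_loss_independent, p_loss_correlated)
```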
At this point I've resisted delving into metrics such as Mean Time to Data Loss (MTTDL) or Mean Time Between Failures (MTBF) of systems - partly because it would make this post even longer than it is now, partly because they over-complicate the issue, and partly because they can be misleading. An MTBF of 1,000,000 hrs for a hard drive doesn't mean you can expect any one hard drive to last for 100 years! If you want more on this then it's covered in a great paper called Mean Time to Meaningless, where the title sums up the value of these metrics very nicely. The way of dealing with all this is to use survival analysis and an actuarial approach to risk. Work out all the ways in which data can be lost, the chances of these happening, and then the chance of data 'surviving' all these factors. It's not the individual failure modes that matter so much as the chance of surviving all of them, especially taking correlations into account. It also means allowing for uncertainty about the risks and the 'spread' of scenarios that could play out. That means some hard maths and the use of stochastic techniques if you want a more reasoned set of 9s. In my opinion, the only real way forward is for a lot more transparency from cloud storage service providers so an informed judgement can be made by their users. Basically, go ask the provider for the risks they've considered and how they mitigate them. If it looks comprehensive and sensible then there's a good chance that your data will be as safe as is reasonably possible. There's a lot to consider in long-term data storage. Data goes on a long and complex journey. To use the car analogy one last time, data makes hundreds of individual trips, both geographically as it gets transferred to and from systems and between locations, and also over time as it goes through migrations and checks. There are risks with every trip and these soon add up. Multiple copies of the data in separate 'cars' is always a good strategy to help ensure that at least one makes it to the destination, as is the use of regular servicing to keep each car in order - but beware if those cars are all in the same fleet run by the same supplier or come from the same manufacturer - otherwise you might just get hit by a recall for a fault across all of them... At Arkivum, we're very happy to share our risk analysis and mitigation with our customers (and auditors) and show why we think a 100% data integrity guarantee is justified for the Arkivum100 service. Many of the measures we use have been outlined previously. And for those who opt for our reduced redundancy 1+1 service, whilst we might not put a string of 9s against it, it's still possible to see exactly why data is still in safe hands.

### Reducing big word terminology

End-users have a tough time. Trying to find a cloud solution, let alone just considering one, usually begins with navigating through the layers of hype and hyperbole. With so many cloud providers communicating in acronyms and big-enterprise, big-word terminology, is it any wonder that cloud adoption is not as great as predicted? To help end-users, providers and everyone else in the cloud marketplace we have produced our own jargon buster. Enjoy. http://youtu.be/o24ZSs549ZY

### Can you see the Cloud for the Fog?

Cloud marketing is everywhere these days. You can't escape it: from endless acronyms to unreachable uptime levels, people are willingly being sucked into the noise of advertising. Can they walk the walk as well as they talk the talk? Unfortunately, it's now the case that the bigger the marketing budget, the easier it is for customers to sign up to a poor service. 
I'm here to say that bigger isn't better; it's all down to how you treat the customers and how far you are willing to go to protect their data. I've written this blog in the hope that it will not only inspire service providers, but also enable customers to select the correct provider for their requirements. To do this, I'll be using two service providers, unsurprisingly named Mr Big and Mr Small.

Customer services

In their latest advertising campaign, Mr Big is boasting that they have 16 million more customers than Mr Small. However, when you analyse the campaign's message that bigger is better, it doesn't always prove to be right. For example, what happens when there is a platform issue and all 16 million additional customers are trying to call a technician? Chances are that if there are actually enough customer agents to handle this volume of calls, you'll reach an international call centre, and we all know how fun they can be. Mr Small may have fewer customers, but their customers are content. They have an account manager, can reach a technician within a few minutes, and most likely have the mobile number for the owner should they get really stuck. Mr Big, on the other hand, allows you to sign up online, essentially offering a boxed solution at arm's length. Which would you prefer?

Price

Mr Big may argue that, due to their size, their service will be cheaper than that offered by Mr Small, which admittedly may often be the case. However, you have to research the provider and remember the saying: buy cheap, buy twice. A demonstration of this would be if you are hosting your business data on a platform with Mr Big for around £150 a month and Mr Small offers you an alternative solution for £250. How did Mr Big get the cost down? Have they spread costs more thinly (due to having more customers) or are they using cheaper, less reliable equipment? Chances are they have done both to maximise profit. This is where you have to ask yourself: how much would downtime actually cost my business? If you selected Mr Big and the platform fails, how much would it cost your company while it was offline? I bet it would be more than the £100 you'd have saved that month.

Privacy

In another campaign Mr Big highlights the fact that they have 47 data centres across the world to cater for their users. So today your data is in New York, tomorrow Hong Kong, and London on Wednesday. With so many malicious hackers, snoopers and the like targeting the large corporates who also host on Mr Big's system, how tight do you think the security is as your data winds its way around the world on a daily basis? Not to mention how many potential customers you are putting off when they are restricted by local laws stating that their data can't leave their own physical geography.

Speed

Out of these 47 locations, do you know where your data is stored? The further away your data centre, the slower it will be for you to reach and download your data. For example, a customer based in London, UK is interested in both providers. When doing a bandwidth speed test, the company reaches Mr Small's data centre in York with a download speed of 16.1 Mbps. Mr Big's local data centre in Bandung, Indonesia delivers a download speed of 0.74 Mbps. Size can matter, but bigger is not always better.
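As a rough illustration of what those two speeds mean in practice, the short sketch below estimates how long a file transfer would take at each of the quoted download rates. The 10 GB figure is an arbitrary example, not something taken from the comparison above.

```python
def transfer_time_hours(size_gb: float, speed_mbps: float) -> float:
    """Rough transfer time, ignoring protocol overhead and contention."""
    size_megabits = size_gb * 1024 * 8   # GB -> megabits
    return size_megabits / speed_mbps / 3600

for name, mbps in [("Mr Small (York, 16.1 Mbps)", 16.1),
                   ("Mr Big (Bandung, 0.74 Mbps)", 0.74)]:
    print(f"{name}: {transfer_time_hours(10, mbps):.1f} hours for a 10 GB restore")
```

Roughly an hour and a half versus well over a day for the same data: distance and connectivity matter long before you get to the small print.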
About virtualDCS

The founders of virtualDCS have pioneered the development of the Cloud Computing industry for over a decade. As one of the first companies in the world dedicated to Cloud services, virtualDCS gives businesses confidence that they will receive only the finest solutions. Their approach is to work in partnership with clients to ensure that their infrastructure is ready to exceed the service levels demanded by their business.

### IoT, not just great technology

The questions I've been asked most recently on the topic of the Internet of Things (IoT) relate to its value beneath (or beyond) the hype. Not surprising, given the huge amount of PR and tech speak attributed to the topic over the recent past. What's the proposition to (often smaller) businesses and consumers? As a business, what can I do with it to attract (new) customers, and how much can I make from it (additive, profitable revenues)? No simple answer, I'm afraid. It obviously depends on what you do as to how you take advantage of the architecture.

Naturally, any answer will be based on circumstance - whether one is a consumer of IoT (a user, perhaps a utility organisation, a manufacturer or a distributor, etc.) or a developer (of software or hardware). Not forgetting, of course, technology integrators: the likes of systems integrators, cloud and managed service providers, for example.

Reading last week's BBC article called "Is your connected car spying on you?" highlights common concerns that affect both consumers and developers, and they are to do with the topics of security and privacy. For those amongst us (consumers) that live in the cloud-connected world, we worry about privacy and security more now than ever before. Just to be clear, (for my part) security and privacy aren't the same thing. The two may have connection and overlap, but security in my eyes has to do with protection from nefarious misuse, whereas privacy has much to do with the (unintended) (mis-)use and distribution of privileged or confidential information.

The question for those looking at developing or using IoT is how safe we and the information we deem privileged are, and how safe the use and construct of the (our chosen) IoT is. An interesting topic, and one I know software (and hardware and service) providers are grappling with. Cisco, for one, has recently announced that it's investing a billion dollars in an IoT-led network infrastructure that aims to deliver both security and privacy to the IoT world. A development that will be closely watched, if not mirrored, by its competitors.

Coincidentally, at a meeting last week we discussed whether there was a place for the Internet of Things as well as an Intranet of Things. Well, the latter might make it more secure, but it still doesn't solve privacy issues, and its application defeats some of the value propositions of the Internet of Things. To this end Cisco's approach might just offer one solution to the dilemma. But it's aimed at the enterprise. What about the rest of us (read: consumers)?

The IoT is more than hype. For my part it really can provide delivery and consumption of some very interesting and valuable products and services. But as with all technology, provisioning and use must be made with confidence. (I for one don't want anyone messing with my car's engine management system, but I do value the fact that my chosen agent knows that there's a service due and contacts me appropriately.) In the (present and) forthcoming rush towards broader IoT application, let's make sure that the value the IoT brings isn't subsumed by the concerns the BBC article highlights. I'd suggest that its rapid adoption won't just be about great technology; it'll be about great technology delivered in a manner that makes it useful, practical, private and secure.
### IT still not ready for IoT

IT departments across EMEA are still not ready for the Internet of Things, finds an ISACA survey. Increased security threats and data privacy issues are considered the biggest challenges around connected devices.

New research from global IT association ISACA found that few European IT departments or workplaces are ready for the invasion of wearable technology and other connected devices. According to the 2014 IT Risk/Reward Barometer, a 110-country survey of ISACA members who are business and IT professionals, 43% of respondents in Europe, the Middle East and Africa (EMEA) say their organisation has plans in place to leverage the Internet of Things or expects to create plans in the next 12 months.

However, the majority are not ready for wearable technology in the workplace. More than half (57%) say their bring your own device (BYOD) policy does not address wearables, and a further 24% do not even have a BYOD policy in place. This is a concern, as approximately 8 in 10 respondents (81%) say BYOW (bring your own wearables) is as risky as, or riskier than, BYOD.

Overall, half of ISACA members across EMEA believe the benefit of the Internet of Things outweighs the risk for individuals (50%), while nearly a third believe the risk outweighs the benefit for enterprises (31%). Yet despite the risks, nearly a third (30%) say the Internet of Things has given their business greater access to information and a quarter (25%) say it has improved services in their organisation. Approximately four in 10 hope to benefit from improved services (40%), increased customer satisfaction (39%), and greater efficiency (38%) as a result of connected devices.

Despite the benefits of connected devices, more than half (51%) of respondents believe the biggest challenge regarding the Internet of Things is increased security threats, while a quarter (26%) are concerned about data privacy issues. Two-thirds (68%) admit they are very concerned about the decreasing level of personal privacy. More than a quarter of respondents say the general public's biggest concerns about connected devices should be that they don't know how the information collected on the devices will be used (28%) or that they don't know who has access to the information collected (26%).

"The Internet of Things is here to stay, and following the holidays, we are likely to see a surge in wearable devices in the workplace," said Ramsés Gallego, international vice president of ISACA and security strategist and evangelist with Dell Software. "These devices can deliver great value, but they can also bring great risk."

ISACA's research found that more than a third (35%) of EMEA ISACA members believe Big Data has the potential to add significant value, yet one-fifth (21%) admit their organisation lacks the analytics capabilities or skills to deal with it. Companies are seeing the benefits of these technologies, but also the challenges they represent for the privacy, assurance, risk management and cybersecurity dimensions. With these new technologies and tools at the forefront of business innovation, companies must take an 'embrace and educate, adapt and adopt' approach to connected devices and Big Data.

ISACA recently established the Cybersecurity Nexus (CSX) as a resource that enterprises and security professionals can turn to for security guidance. Additional information can be found at www.isaca.org/cyber.
ISACA's IT Risk/Reward Barometer examines attitudes and behaviours related to the risks and rewards of key technology trends, including the Internet of Things, Big Data and BYOD. The 2014 Barometer consists of two components: a survey of 1,646 ISACA members who are IT and business professionals around the world, including 603 across EMEA, and a survey of more than 4,000 consumers in four countries, including 1,001 in the UK. For a full survey report, including related infographics, video and global results, visit: www.isaca.org/risk-reward-barometer

### Humanised NOC

Is a humanised NOC inevitable? Five cases where human interaction perfects NOC automation.

In today's cloud-based economy, businesses and startups are producing apps at impressive speed. Unfortunately, while most companies design their apps or software with the appropriate infrastructure, many do not realise that deploying an app is only half the battle. More important to the success of the product delivery is the ongoing monitoring, remediation and infrastructure management, which is often lacking in the service that most cloud companies offer. Even for the ones that do use monitoring services, the limited range of services on offer proves inadequate to meet the needs of today's businesses.

For most companies operating in the cloud, Network Operations Centres (NOCs) are used to monitor and control events on the delivered platform. Generally, the Operations Centre is the operational hub or main location from which all other operations stem. To date, every Operations Centre requires sophisticated machinery and is limited in its automated functions. In other words, machines cannot run on autopilot; they require the complex problem-solving knowledge of a human for ongoing support. Human input for NOC operations bridges the complex data transfer and meta-information of a network with the flexibility and real-time intelligence required to assess and respond to technical glitches or errors.

While technology offers the capability to streamline efficiency, it cannot replace human instinct and intelligence. Despite every effort to increase automation by reducing human involvement, the level of agility necessary to resolve all potential system errors still requires human interaction. In fact, it is human aptitude that enables a holistic approach to technology, cultivating the optimal environment to learn the specific behaviour patterns of a system and then apply those trends to reduce and solve future problems.

For the sake of clarification, network monitoring is the basic, more surface-level review of network activity. Uptime management, however, is the ongoing review and evaluation of that data to address and respond to errors in a strategic manner, with the ultimate goal of optimising the functionality of the service. Most companies rely on the network monitoring style and lose the valuable data that may offer insights and crucial information necessary for responding to future issues. To move beyond this passive style of monitoring, on-demand NOC services provide the added value of transforming it into a hands-on, continuous, solution-focused approach. Collecting the data is only valuable if the company is willing to invest in the human capital to identify, review and resolve the data trends. Here are five cases that better portray the importance of the human component in on-demand NOC monitoring.
The cases also highlight how human interaction complements, rather than detracts from, complete automation of NOC tools.

Five Cases of Human/Automation

1. Auto-Scaling in the Cloud

The most basic example of where human interaction is necessary is cloud performance that is automated by a scheduler. While the scheduler automatically scales the network's resources at peak times, monitoring the scheduler at all times requires human involvement. For example, there are a variety of ways to modify the scheduler to account for increasing or decreasing traffic patterns. Applying these methodologies, however, requires ongoing changes and evaluation at the application and business levels that can only be carried out with real-time monitoring. Granted, some pre-emptive measures can be put in place ahead of an expected increase in traffic. Even with expected spikes in traffic patterns, the changes require management that is in line with user demand; otherwise, the business is liable to suffer unnecessary financial consequences. A comprehensive NOC team can modify a system to keep front-end and back-end performance synchronised while managing all levels of traffic. Essentially, this human involvement protects the SaaS user from experiencing unwanted downtime or performance disruptions.

2. On-Demand Monitoring and Application Updates

Every cloud-based application is hypersensitive to major upgrades. NOCs can serve a secondary function as a monitoring service to detect possible problematic areas. Searching for the most appropriate monitoring solution may be unrealistic; using the NOC can be an efficient means of preserving the system prior to implementing any additional monitoring or automation procedures. Again, a flexible NOC team offers a faster time-to-market solution and maintains customer satisfaction.

3. Behavioural Learning as a Predictor of Future Errors

The ideal NOC team is constantly observing, learning and measuring system behaviours. By studying the regular patterns of behaviour, the team will sense and, often, foresee possible problematic scenarios. A keen understanding of these trends enables the NOC team to take the necessary actions to avoid or quickly resolve situations of disruption. By identifying and familiarising themselves with the system's behaviour, the NOC team can intelligently assess the risk levels and determine which issues require concrete engagement.

4. Cross-Platform Monitoring and a Holistic Approach

The multi-dimensional capabilities of the NOC offer a broader, overarching analysis of the entire system. With this information, critical decisions can be approached in a proactive manner rather than with a temporary, reactive response, which often results in compounded difficulties down the line. As an example, graphs can depict a reduction in site searches and an increase in errors; normally this would be attributed to a component being down. With NOC analysis, the team is able to analyse various parameters over time. By observing dependencies between the various components of the cross-platform NOC, the team is able to address the issue at hand, to target the cause, and to solve the problem with a relevant, longer-lasting solution.

5. Documenting Repeating Events

With the information garnered and recorded from various events and errors, the NOC team is properly equipped with material to build structured operations reports.
These reports offer a valuable baseline of information to identify problems and generate relevant solutions. The documentation can also be used to avoid similar situations in future. Outcomes should be based on predefined protocols to avoid recurring events without adding unnecessary strain on the development team to produce another automated action.

These are only five of the many possible scenarios that require human interaction in NOC operations. The optimal, continuous performance of such a sophisticated system is only viable with ongoing human critical thinking. Humans are still essential to strengthening our modern, technologically advanced IT world.

### Swift Spotify cloud streaming

This week the majestic T-Swifty (Taylor Swift, for those of you not in the know) removed her entire music catalogue from Spotify. This has left me immeasurably sad: I can no longer stream 1989 and dance around my house to Shake It Off. But more importantly it got me thinking: cloud-based music providers such as Spotify have changed the music industry.

When Spotify happened, the music industry went into crisis control mode - record sales were already dropping due to illegal downloads. The introduction of free streaming services had the lovely people at iTunes, to put it frankly, freaking out.

My generation is quite self-entitled. As millennials we believe we have the right to access and own any and all media when and where we want to, often in complete disregard of the legality of such entitlement. So streaming music via a subscription service, which can either be free with advertisements or paid without, has mass appeal. It is instant, easy, and has an agreeable user interface, without requiring massive storage space for an extensive music library.

Most people I know don't own physical CDs anymore, let alone a CD player. Everything is digital. My iTunes music cloud and my Spotify digital library are defining parts of my online identity. Everything contained in them says something about my personality, lifestyle and cultural choices. By having music services hosted in the cloud, companies are able to collate more data about their customers' preferences, building stronger customer profiles and increasing their ability to tailor services to those tastes. On unpaid Spotify accounts, advertisements and track suggestions can be targeted at users based on their previous actions in the app, location, searches and previously played tracks.

Now, I'm not claiming that the music industry is dying out - quite the opposite - I think it is undergoing an evolutionary change. Taylor Swift's bucking of the Spotify trend has proved that the music industry is not completely dead: her album 1989 sold 1.287 million copies in its first week, and her fan base remains strong despite there being no way to access her music on Spotify. While artists are generally no longer making as much money from their releases, their reach has been increased exponentially by free streaming services, which still pay them for each play - whilst, obviously, illegal downloads do not.

Music publishing company Kobalt has just announced that its Spotify revenue has overtaken that of iTunes for the first time, and that if growth continues at its current rate, Spotify will overtake iTunes in terms of revenue contributed to the music industry.
If user growth on Spotify is increasing at such a steady rate, surely the cloud-based music subscription service is capitalising on all its user data, right? I wholly expect that Spotify will only become smarter as time passes and its user base grows.

As a millennial I disagree with the notion that Spotify is over the hump and that soon it will be phased out. I think music streaming services like it are the way of the future. Vinyl will always have its niche, and purchases of albums on iTunes currently hold appeal because of media ownership and availability across multiple devices. I think cloud-based streaming will flourish as piracy laws are enforced more heavily, illegal downloads are cracked down on, and Wi-Fi-enabled devices become more and more readily available. As networks improve, cloud-based streaming services will thrive.

If the cost is merely your musical identity preferences, and hence your customer profile (which, might I add, you've already surrendered to the internet regardless), why not take up a cloud-based service? If you can free up all that space on your device by moving your music library to the cloud, why wouldn't you? It may not have T-Swifty in it right now, but I predict that she will return to streaming soon enough for us to dance again.

### Navigating the hybrid cloud world

Most CIOs have their heads in the cloud these days. But if they're leading substantial, established enterprises, they also have their feet in a well-designed, carefully secured data centre. CIOs across the board are feeling pressure from somewhere - whether it's their CEO, their employees, customers, partners or the general shift in the market - to implement the latest cloud-based IT technology for efficiency, agility and economy. But they also need to take advantage of their existing IT infrastructure to maximise return on investment.

These leaders are finding that they can create hybrid clouds to get the best of both worlds. They can cut costs by storing and sharing some data and applications internally in a private cloud. And they can nimbly develop new applications and store voluminous amounts of unstructured information for big data analytics in public clouds. They lease that capacity from cloud-hosting companies that specialise in data management, while integrating these capabilities into their existing on-premise infrastructure.

The key is to figure out what data and what applications fit best in which place. Then figure out where they have to interact. Get the right software tools for managing the hybrid environment. It requires careful planning to manage a private cloud and a third-party public cloud host. But for companies that want to get the benefit of new technology while still needing to provide bullet-proof continuity of operations, the old and the new need to work together.

Many established companies with significant IT infrastructure are making the decision to develop a hybrid cloud. For example, NiSource Inc., one of the largest natural-gas transmission companies in the U.S., recently said that it plans to move to a hybrid cloud. Companies today typically use three types of computing: dedicated servers in the corporate data centre that run key applications; pooled resources in a private cloud in the corporate data centre; and resources that are run by public cloud providers and accessed over the Internet. Mixing and blending any of these deployment models creates a hybrid cloud.
Most established enterprises can't simply close their data centres and move IT to an outside provider. In some cases they have handcrafted, 20-year old applications that are vital to services provided to a few key customers. Some ERP systems have been tuned over 20 years, and downtime would threaten corporate viability. Such applications may simply be too fragile to move. In those cases it often will be more cost effective to keep legacy applications running on isolated specialised servers rather than moving them to virtualised private cloud servers. In other cases, there will be specialised data that is subject to regulatory or legal constraints. But building a private cloud in the corporate data centre will cut costs and increase flexibility. In a private cloud, virtual servers can handle hundreds of workloads on a single physical server. Data centre architects don't need to dedicate storage devices to a single application. While some jobs still run on dedicated computers, CIOs treat most of the computing capacity in the data centres as a pooled resource -- in effect, a private cloud that is allocated on demand. The data centre has long been the symbolic heart of the CIO's domain. For many years, corporate data centres continuously required more space and more electric power. But the virtualisation revolution that started in 1998 meant many applications could run on a single server, and most companies have been consolidating workloads on servers ever since. Adopting hybrid-cloud architecture means most enterprises already have all the bricks-and-mortar data centre space they will ever need. CIOs can stop worrying about how to enlarge their data centres or obtain more electric power for air-conditioning. Most companies are using some public-cloud infrastructure as well. Sometimes IT departments have made a decision to use the cloud for testing or development of new applications. In other cases a marketing executive or a researcher has bypassed IT to use a new application, expensing the cost on a corporate credit card. In most companies, it's important for the IT department to have a handle on all corporate data, and know what might be going off company servers. There are ample positive reasons for CIOs to embrace the cloud. Last year, my company, IBM commissioned a survey of top executives at 800 enterprises around the world about a range of issues. That study revealed that the majority are using cloud to integrate and apply mobile, social and big data technologies. The cloud is paying off for those companies. Those with high cloud adoption are reporting almost double the revenue growth and nearly 2.5 times higher gross profit growth than peer companies. Embracing the cloud makes it easier for the IT department to be seen as a partner in new initiatives for other departments. Historically, IT has often been seen as a barrier. Budget constraints and the need to buy and install hardware meant that any new project required months of activity by IT before operating groups could get access. When using a public cloud, a few days or weeks of coding by IT developers can be enough to launch a new product. Our survey found that 66% of organisations are using cloud to strengthen the relationship between IT and lines of business. Embracing the cloud isn't an all-or-nothing proposition. The hybrid cloud is the route to cloud computing that will be most comfortable and cost efficient for large enterprises with long-established IT departments and data centres. 
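The article's advice to "figure out what data and what applications fit best in which place" can be made a little more concrete. Below is a minimal sketch of the kind of placement heuristic that reasoning implies; the criteria, thresholds and category names are invented examples for illustration, not a recommended policy or anything prescribed by the author.

```python
# A toy placement heuristic for the "what fits best where" question above.
# The inputs and rules are invented examples, not a definitive policy.

def place_workload(regulated: bool, fragile_legacy: bool,
                   latency_sensitive: bool, bursty: bool) -> str:
    """Return a rough deployment target for a workload."""
    if fragile_legacy:
        return "dedicated servers"   # too risky to move; keep it isolated
    if regulated:
        return "private cloud"       # data subject to legal or regulatory constraints
    if bursty and not latency_sensitive:
        return "public cloud"        # elastic capacity leased on demand
    return "private cloud"           # default: pooled, controlled, in-house

# Example: an unregulated, bursty analytics job with no strict latency needs.
print(place_workload(regulated=False, fragile_legacy=False,
                     latency_sensitive=False, bursty=True))   # -> public cloud
```

Real decisions also weigh cost, data gravity and integration effort, but even a rough rule set like this forces the inventory of applications and constraints that the article argues every hybrid plan needs.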
### Toshiba announces cloud client manager

Toshiba Europe has announced the availability and expansion of the Toshiba Cloud Client Manager (TCCM) - enabling organisations to manage and control their mobile workforce. With mobility on the rise, more businesses use multiple devices, many of which are located remotely in people's homes or at satellite offices. As laptops and tablets are significant, often costly, assets, it is important for businesses to have access to these devices wherever they are on the network.

TCCM allows companies to manage their endpoint devices, including PCs, laptops, smartphones and tablets, providing complete estate asset management and comprehensive tools to ensure device efficiency and compliance. Toshiba's cloud service incorporates patch management, asset inventory, power management modules and Toshiba software driver distribution. From early 2015, TCCM will expand to offer cloud backup and mobile device management (MDM) modules to further enhance the level of control an organisation has over its devices.

Given that data forms the backbone of many businesses today, it's fundamental to safeguard records and documents with effective backup systems. With TCCM, users can choose to utilise an automatic cloud backup function. This will keep their business-critical data secure in the event of device or system failure, ensuring that it can be recovered and operations can be resumed without damage to the business.

TCCM is also scalable, limiting operating expenses through SaaS. Using a standard web browser, administrators can oversee and take control of these endpoint devices online, in real time, without the need to invest in servers or dedicated management software, and without requiring advanced technical expertise to deploy. For the company, this means more agile management - helping organisations get the best from their devices, keeping them secure and up to date. Asset management is complicated enough, and Toshiba's TCCM is certainly pitched to address many IT managers' concerns.

For more information visit: www.toshiba.eu

### IBM unveils dynamic cloud security

IBM has announced it has built the industry's first intelligent security portfolio for protecting people, data and applications in the cloud - IBM Dynamic Cloud Security. Built on IBM's investments in cloud, security and analytics software and services, the portfolio is designed to protect a business's most vital data and applications using advanced analytics across their enterprise, public and private clouds, as well as mobile devices.

Security is a concern that is often cited as the reason for delaying cloud adoption. Businesses are struggling to safeguard their existing IT systems against attackers who are becoming increasingly sophisticated and more difficult to detect. 75% of security breaches take days, weeks or even months to be discovered, significantly increasing the damage inflicted.

"Customers are now moving critical workloads to the cloud and they expect enterprise grade security to move with it," said Brendan Hannigan, General Manager, IBM Security Systems. "We have pivoted our entire security portfolio to the cloud to help customers lock down user access, control data and maintain visibility. With the right visibility into threats, enterprises can more securely connect their people, data and processes to the cloud."

Using analytics, IBM's cloud security tools give companies a "single-pane-of-glass" view of their entire business.
From private data centres to the cloud, and even to an individual employee's mobile device, the security status is comprehensively reported, showing exactly who is using the cloud, what data individuals are accessing and where they are accessing it from. The IBM Dynamic Cloud Security portfolio addresses the security gaps that can exist between on-premise, cloud, SaaS and mobile applications. The portfolio is focused on authenticating access, controlling data, improving visibility and optimising security operations for the cloud.

According to a new IBM study of nearly 150 Chief Information Security Officers (CISOs), almost half expect a major cloud provider to experience a security breach. 85% of the CISOs polled say their organisations are now moving to cloud, putting security clearly in focus. Despite these concerns, critical workloads processing customer and sensitive data are still moving to the cloud. IT analyst firm Gartner reports that nearly half of large enterprises will deploy hybrid clouds by the end of 2017. As cloud adoption increases, security will become ever more important.

For more information visit: ibm.com

### GTT increases IBM Cloud availability across Europe

GTT increases the possibilities to interconnect to IBM Cloud data centres across Europe. In a move that looks certain to benefit cloud developers, GTT has extended interconnection to IBM Cloud across 101 data centres in Europe. Aimed at the $2.6 trillion on-premise and hybrid cloud market, this announcement puts the IBM Cloud platform within reach of all local European markets. According to a recent IDC forecast, cloud services will account for more than half of global growth in software, server and storage spending by 2018. This important move by IBM shows that the elephant is truly beginning to dance.

The IBM brand has always been synonymous with longevity, stability and security. The question now is: will this move be the catalyst that allows IBM to compete with AWS and Azure? We certainly think so. GTT allows for private, dedicated and, most importantly, secure end-to-end connectivity across multiple regions. Today's announcement is big. Our view is that this combination could be the final part of the puzzle in developing a comprehensive offer that delivers a meaningful hybrid cloud. The enterprise market is sure to notice.

More businesses are viewing the cloud as an essential part of their business platform. Enabling staff to work across a range of locations using a mix of devices, the hybrid cloud addresses key security concerns, helps with the provision of services and manages both the network and the devices that access it. The hybrid cloud also enables businesses to develop applications in the cloud that deliver IT management and business protocols, and that enhance their own outward-facing services. The hybrid cloud is fast becoming the central battleground in the marketplace. We are at the tipping point where the cloud truly begins to reach everywhere.

This service is being provided by GTT, who operate a global Tier 1 IP backbone based on a 100GE Juniper end-to-end network. IBM customers can connect to GTT's network and onwards to IBM Cloud services from within any of the data centres listed below, or GTT can extend the connectivity out to a customer's office from the nearest data centre.
More on the hybrid cloud

To learn more about the importance of the hybrid cloud, watch our short video guide: https://www.youtube.com/watch?v=Nb9-5QxQHXo

Where to find IBM Cloud and SaaS services

The following data centres are now able to connect directly to deliver hybrid cloud and software services utilising the IBM SoftLayer platform:

| Country | City | Data centre provider |
| --- | --- | --- |
| Austria | Vienna | InterXion (VIE1) |
| Belgium | Antwerp | LCL Belgium |
| Belgium | Brussels | LCL Belgium |
| Belgium | Brussels | InterXion (BRU1) |
| Bulgaria | Sofia | Neterra Communications |
| Bulgaria | Sofia | Telepoint |
| Czech Republic | Prague | CE Colo Prague |
| Denmark | Copenhagen | GlobalConnect Copenhagen |
| Denmark | Lyngby | DIX |
| France | Aubervilliers | InterXion (PA2) |
| France | Aubervilliers | TelecityGroup (Aubervilliers) |
| France | Courbevoie | SFR Netcenter Paris |
| France | Marseille | SFR Netcenter Marseille |
| France | Paris | Equinix (PA2) |
| France | Paris | Telehouse 2 (Voltaire) |
| France | Paris | Telehouse 1 (Jeuneurs) |
| France | Paris | InterXion (PA1) |
| France | Paris | TelecityGroup (Courbevoie) |
| Germany | Berlin | I/P/B Carrier Colo |
| Germany | Dusseldorf | InterXion (DUS1) |
| Germany | Frankfurt | Itenos |
| Germany | Frankfurt | TelecityGroup (Gutleutstrasse) |
| Germany | Frankfurt | InterXion (FRA1 & FRA6) |
| Germany | Frankfurt | Ecotel |
| Germany | Frankfurt | Equinix (FR5) |
| Germany | Frankfurt | TelecityGroup (Lyonerstrasse) |
| Germany | Frankfurt | Equinix (FR1) |
| Germany | Hamburg | E-Shelter |
| Germany | Munich | Equinix (MU1) |
| Ireland | Dublin | Eircom |
| Ireland | Dublin | TelecityGroup (Kilcarbery) |
| Ireland | Dublin | InterXion (DUB2) |
| Ireland | Dublin | InterXion (DUB1) |
| Ireland | Dublin | TelecityGroup (CityWest) |
| Italy | Milan | Infracom |
| Italy | Milan | KPNQwest Milan |
| Italy | Rome | Interoute |
| Italy | Rome | NaMex Internet Exchange |
| Italy | Turin | Tiscali Italy |
| Italy | Turin | IT Gate |
| Lithuania | Vilnius | Teo Datacenter |
| Netherlands | Amsterdam | Server Central |
| Netherlands | Amsterdam | InterXion (AMS3) |
| Netherlands | Amsterdam | TelecityGroup 3 (Sloterdijk) |
| Netherlands | Amsterdam | Global Switch |
| Netherlands | Amsterdam | TelecityGroup 2 (South East) |
| Netherlands | Amsterdam | TelecityGroup 5 (Schepenbergweg) |
| Netherlands | Amsterdam | NIKHEF |
| Netherlands | Amsterdam | TelecityGroup 1 (Science Park) |
| Netherlands | Amsterdam | Vancis (SARA) |
| Netherlands | Amsterdam | Equinix (AM3) |
| Netherlands | Amsterdam | TelecityGroup 4 (Wenckebachweg) |
| Norway | Oslo | TeliaSonera |
| Poland | Warsaw | Corponet (LIM building) |
| Poland | Warsaw | PLIX (LIM building) |
| Poland | Warsaw | Netia |
| Romania | Bucharest | NXData 1 |
| Slovakia | Bratislava | CEColo Bratislava Udernicka |
| Spain | Barcelona | Telvent |
| Spain | Madrid | InterXion (MAD1) |
| Spain | Madrid | Espanix Mesena 80 |
| Spain | Madrid | Telvent Carrierhouse 2 |
| Spain | Madrid | Terremark Madrid |
| Sweden | Kista | InterXion Kista (STO1) |
| Sweden | Stockholm | TelecityGroup |
| Sweden | Stockholm | TeliaSonera |
| Switzerland | Basel | IWB Telehouse |
| Switzerland | Zurich | Equinix (ZH1 & ZH2) |
| Switzerland | Zurich | InterXion (Glattbrugg) |
| United Kingdom | London | InterXion (LON1) |
| United Kingdom | London | Telehouse East |
| United Kingdom | London | Telehouse North |
| United Kingdom | London | Equinix Slough (LD4) |
| United Kingdom | London | TelecityGroup (Meridian Gate) |
| United Kingdom | London | TelecityGroup (Sovereign House) |
| United Kingdom | London | Global Switch (London 2) |
| United Kingdom | London | TelecityGroup (Hex 6&7) |
| United Kingdom | London | Equinix Slough (LD5) |
| United Kingdom | London | TelecityGroup (Hex 8&9) |
| United Kingdom | London | City Lifeline London |
| United Kingdom | London | Equinix (LD2) |
| United Kingdom | London | Equinix London Park Royal (LD3) |
| United Kingdom | Manchester | TelecityGroup (Kilburn House) |
| United Kingdom | Manchester | TelecityGroup (Williams House) |

### Communication Next

After using VoIP to stay connected between meetings in the US and in London, and having declared the old telephone no longer as important, it seems the UK's communications regulator Ofcom agrees and has found that the landline is in decline. With smartphones and mobile computing, the next wave has surely begun, as cloud services and better technology change the way we connect with each other.

Recent research from Ofcom in The Communications Market 2014 report has found that our attitudes are changing as a result of technology and our use of it. With broadband take-up, smartphone ownership, social media and cloud services usage all rising, the UK is becoming more digitally aware and digitally driven.

In 2004, when Ofcom first began publishing the Communications Market Report, the proportion of households with broadband was just 16%. Today this has grown to 77%. With this improved connectivity, the average UK adult now spends more time - 8 hours and 41 minutes - using media or communications than they do sleeping; 8 hours and 21 minutes is the UK average for a night's sleep. The population is squeezing more into each day. And they are doing it by multi-tasking on different devices - laptops, tablets and, more importantly, smartphones. According to the report, total use of media and communications averaged over 11 hours every day in 2014. This is an increase of more than 2 hours since Ofcom last conducted similar research in 2010.

Ed Richards, Ofcom Chief Executive, said: "Our research shows that a 'millennium generation' is shaping communications habits for the future. While children and teenagers are the most digitally savvy, all age groups are benefitting from new technology. We're now spending more time using media or communications than sleeping. The convenience and simplicity of smartphones and tablets are helping us cram more activities into our daily lives."

As we look toward 2015, communications are set to keep on improving. We are better connected through super-fast broadband, and 4G mobile has arrived. This connectivity is helping people communicate on the move, and via their smartphone. UK smartphone take-up is almost even with laptop ownership. In the first quarter of 2014 (Q1), laptop computers were the most popular internet-enabled device and were present in 63% of households, followed very closely by smartphones at 61%. In 2013 smartphone ownership was only 51%.

While tablet use is spread evenly across generations, smartphone ownership differs greatly by age. Of those aged 16-24, 88% own a smartphone, compared with 14% of those aged 65+. Smartphone take-up has also continued to increase rapidly over the past year, up to six in ten adults (61%), compared with half (51%) a year earlier. The growth in smartphone use in particular has contributed to people spending an extra 2 hours per day on media and communications since 2010. Young adults are glued to their smartphones for 3 hours 36 minutes each day, nearly three times the 1 hour 22 minute average across all adults.
This young segment is where we are seeing the start of the long decline of the landline. According to Ofcom, 64% of adults used a landline to make voice calls this year, compared with 45% using smartphones. On average, we spend 29 minutes a day making voice calls, 19 minutes of which are via landlines. But when you look at those aged 16-25, the landline does not fare well: 94% of young adults used their mobile to make and receive calls. Children aged 12-15 are abandoning the telephone. Just 3% of their communications time was spent making voice calls, while 94% of children communicated using instant messaging and social networking.

In all these figures VoIP was amongst the winners, enjoying greater use and awareness. While VoIP may not have won the boardroom just yet - it is great for meetings when offices and countries separate the team - the public are increasingly using it to contact family and friends. Around half (53%) claim to use it to contact family who live in the UK, and 46% to contact family who live abroad.

Teens born at the turn of the millennium are unlikely to have ever known 'dial-up' Internet. They are the first generation to benefit from broadband and digital communications while growing up. The Millennials are leading the way, but the report suggests every age group is enjoying the benefits of our technology-enhanced communications. For those of us in the cloud marketplace, our services - from VoIP to infrastructure provision and data centres - are making these changes possible. Ofcom's report only reinforces what many of us have already noticed individually.

### The rise of the Cloud CRM

With the popularity of customer relationship management (CRM) software going through the roof this year, Bree Freeman takes a look at this little-understood, complicated business system.

In an ever-interconnected world, engaging with your customers is critical to the success of your business, so it is little wonder that the popularity of products such as customer relationship management (CRM) software is on the rise. According to Gartner, the CRM industry, which they say is currently valued at $20.6 billion (bn), will grow to $36.5bn worldwide by 2017. That expeditious growth also means that in 2017, CRM spending will outpace spending on enterprise resource planning for the first time. As it is, spending on CRM is already far ahead of spending on business intelligence, supply chain management and collaboration.

While CRM itself isn't new, with the aid of the Cloud it is becoming increasingly critical to any business, in any marketplace. CRM apps today are designed to meet the demands of a new breed of customer; one that is digitally connected, social and very much informed.

But what exactly is CRM and how can it help you, as a business, grow? Well, we all know that when running a business it is important to keep track of your clients and all of your interactions with them - and it is equally important to give attention to investors, pay creditors, track employee workflow, and so on. Here's where CRM software comes into its own, as it is the best way to keep track of everything in your business without drowning in multiple software programs and apps. While CRM was originally created to help keep track of client information, it has grown and evolved into multifaceted programs that can track sales and create custom marketing opportunities.
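As a very rough illustration of what "keeping track of clients and interactions" means in data terms, here is a minimal sketch of the kind of record a CRM system maintains. The field names and pipeline stages are invented for the example and are not taken from any particular product.

```python
from dataclasses import dataclass, field
from datetime import datetime

# Hypothetical pipeline stages for a customer journey (example values only).
STAGES = ["awareness", "lead", "opportunity", "customer", "advocate"]

@dataclass
class Interaction:
    when: datetime
    channel: str        # e.g. "email", "call", "meeting"
    summary: str

@dataclass
class Contact:
    name: str
    company: str
    stage: str = "awareness"
    interactions: list[Interaction] = field(default_factory=list)

    def log(self, channel: str, summary: str) -> None:
        """Record an interaction so everyone sees the same history."""
        self.interactions.append(Interaction(datetime.now(), channel, summary))

    def advance(self) -> None:
        """Move the contact one step along the journey, if possible."""
        i = STAGES.index(self.stage)
        if i < len(STAGES) - 1:
            self.stage = STAGES[i + 1]

# Example usage
c = Contact("Jane Doe", "Example Ltd")
c.log("email", "Sent introductory pricing information")
c.advance()
print(c.stage, len(c.interactions))
```

Real products layer permissions, marketing automation and reporting on top, but the core is exactly this: a shared, up-to-date record of who each customer is, what has been said to them, and where they sit in the relationship.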
The emphasis of CRM is on openness, so that everyone at the coalface or behind the front line has access to up-to-date information about the customer. Modern systems can be configured to reflect the business model exactly; for example, it makes sense that the CRM system should reflect the journey of the customer - from initial awareness to becoming a customer and beyond.

If you're not already using cloud CRM software for your business, then in my opinion you should be, as there are many key benefits that come with using a Cloud solution. To name but a few, these include:

- Streamlined sales and marketing processes
- Access to real-time data
- No hardware costs
- Higher sales productivity
- Added cross-selling and up-selling opportunities
- Improved service, loyalty, and retention
- Increased call centre efficiency
- Higher close rates
- Better profiling and targeting
- Reduced expenses
- Increased market share
- Higher overall profitability
- Marginal costing

Before you venture into the realm of CRM, it is important to understand that CRM systems come in many flavours and capabilities, and there is a plethora of online systems available to choose from. So it is vital that you determine which CRM system is best for your particular business, but choosing the right solution can be a daunting prospect. You really do need to consider your customers' needs, your company's needs, and how well a provider can meet those needs.

What makes the selection process even more difficult - particularly this year - is that many of the leading CRM providers have made great technological strides in social media, mobile technology, big data, virtual assistants, analytics, and many other areas. Today these CRM providers can cater to quite literally your every need, offering mobile platforms that are tied to your products, and these are often high-quality offerings: for instance, the Salesforce1 mobile platform, which celebrates its first birthday this month, is excellent. Then there is industry leader Oracle, whose RightNow product is an extremely strong competitor in the CRM marketplace, particularly for those firms wishing to streamline service and support so that agents can handle multiple support channels. There is also the popular SugarCRM, which offers a solution that automates your core sales, customer service and marketing processes; personally, I was impressed with its mobile dashboard. For a cloud solution closer to home - made in the UK - both Workbooks and icomplete have something to offer. And you really should check out the offerings from well-known players like Salesforce.com, Microsoft and IBM.

For the smaller business, I would suggest you cast your eye over developer Swiftpage's widely used app Act!, which is a fabulous app for SMBs as, among other things, it will work with contacts consolidated from many different places, including Outlook, Gmail, LinkedIn and Facebook. The service includes integrated email marketing features and allows you to track interaction histories for customers. Its interface is easy to use and the pricing is very reasonable.

Overall, though, CRM is far more than just a one-time project activity; it is a change in company culture and a continuous journey that matures over time as customer needs evolve, organisational capabilities change and the customer relationship grows. So make sure you choose wisely!

### Seagate targets the cloud

With two announcements Seagate is pushing ahead with its new cloud strategy.
First up is a career move which demonstrates how important High Performance Computing (HPC) is becoming in the marketplace. Seagate has announced that it has hired Sri Hosakote, former Head of Engineering for Cisco, as its Executive Vice President of Systems for the Cloud Systems and Solutions (CSS) group. Hosakote, whose previous roles also include time as a software engineer for Sprint Communications, is expected to lead the creation and delivery of HPC solutions and custom, modularised systems for OEMs, with an emphasis on converged infrastructure.

"Sri's leadership will be an important driver as we extend Seagate's market position as a trusted partner in storing, protecting and sharing data by going beyond our leading disk drive business to deliver cloud systems and solutions for OEMs and 'do-it-yourself' organisations," said Jamie Lerner, President of CSS for Seagate.

Hosakote's appointment follows the unveiling of Seagate's new Kinetic hard disk drive (HDD). Seagate has been making efforts to boost its enterprise-class PCIe flash range. It has also managed to pick up gains in the surveillance and video analytics market, as well as with its cloud-based applications. With the introduction of the Seagate Kinetic Open Storage platform in 2013, another storage product was always likely.

The Kinetic HDD is designed to replace legacy systems through the combination of an open source object storage protocol with Ethernet connectivity. This streamlined approach allows storage applications to talk directly to Kinetic object storage HDDs through direct IP addressing. Seagate hopes to eliminate the overhead of an entire storage server tier, which it claims will reduce costs by 50%. Based on the Seagate Kinetic Open Storage platform, Kinetic HDD dramatically reduces total cost of ownership (TCO) by simplifying cloud storage architectures and eliminating multiple layers of legacy software and hardware infrastructure. AOL and HP are both working with Seagate, exploring Ethernet-connected drives in an effort to drive big data solutions.

"The addition of Kinetic devices to the storage environment creates architectural flexibility when deploying systems. Kinetic storage also improves TCO when storing the enormous data sets required by applications, while simplifying management and reducing the effort required to operate reliable data storage systems," said Dan Pollack, Chief Architect, Storage Operations for AOL.

"Over the past decade, the unprecedented explosion of data has been driven by social media, smartphones, tablets, and the rapid growth of every sort of Internet-connected device," said Scott Horn, Seagate Vice President of Marketing. "Cloud service providers are increasingly looking for solutions that will simplify infrastructure, improve scalability and reduce costs."

"Seagate is building on its legacy, extending innovation from the device into the information infrastructure, both on-site and in the cloud," said Jamie Lerner.

There are many tech companies trying to solve the challenges of the information infrastructure stack. The management layer is increasingly as important as the file system and data centre hardware. With the new Kinetic HDD, Seagate believe they have a cost-effective answer to our ever-expanding storage needs. And, with the appointment of Hosakote, Seagate is showing a willingness to invest more in cloud solutions.

Seagate is demonstrating the Kinetic Open Storage platform at OpenStack Summit Paris 2014 this week.
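For a sense of what "applications talking directly to object-storage drives over IP" could look like in practice, here is a deliberately simplified sketch. It is not the real Kinetic protocol or client library: the `KineticDrive` class, its methods, the port number and the in-memory storage are all invented to illustrate the key/value-over-Ethernet idea described above.

```python
# A toy model of the key/value-over-Ethernet idea behind drives like Kinetic.
# KineticDrive and its methods are invented for illustration only; they are
# not the real Kinetic protocol or any vendor's client API.

class KineticDrive:
    """Stand-in for an object-storage drive reachable at its own IP address."""

    def __init__(self, ip: str, port: int = 8123):
        self.ip, self.port = ip, port
        self._objects: dict[bytes, bytes] = {}   # in-memory stand-in for the platters

    def put(self, key: bytes, value: bytes) -> None:
        # In a real deployment this would be a network round trip straight to
        # the drive, with no file system or storage server tier in between.
        self._objects[key] = value

    def get(self, key: bytes) -> bytes:
        return self._objects[key]

# The application addresses the drive directly, so the storage-server layer
# that Seagate says it wants to eliminate simply isn't in the path.
drive = KineticDrive("10.0.0.21")
drive.put(b"photos/2014/cat.jpg", b"...binary data...")
print(len(drive.get(b"photos/2014/cat.jpg")))
```

The point of the architecture is that the put/get interface lives on the drive itself; all an application needs is a key, a value and the drive's network address.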
### Node4 continues London expansion

New base in the City to support new and existing customers in and around London.

UK-based cloud and data centre specialist Node4 has opened a new office in London - the latest step in the company's long-term business strategy to expand its presence in the south of England. The new office, located in the City, is part of Node4's ongoing commitment to provide direct support to its new and existing customers at a regional level. Node4 has been working with customers from London for many years and, as the business continues to grow, the necessity to ensure local support on the ground in the south east has increased.

The new office will allow Node4 to meet with London-based customers and demonstrate its comprehensive range of solutions. This enhances direct communication with customers in the City and opens up further opportunities for them to take advantage of Node4's extensive solutions portfolio. Node4's recent acquisition of LETN Solutions has also given it a new office in Reading and expanded its dedicated regional sales team serving customers in this region.

Paul Bryce, Business Development Director at Node4, commented: "The opening of a London office is another example of Node4's commitment to building a truly national presence. London-based businesses will benefit from localised customer support and technical expertise across the region. This ensures that we can serve customers in the south in the best possible way as well as attract new ones."

### Microsoft cloud revenue rising

According to data from Synergy Research Group, Microsoft has the highest rate of growth amongst the leading cloud infrastructure providers in the all-important third quarter. Closing out a month that saw Microsoft announce its cloud in a box, the good news keeps on coming. The new Q3 data suggests Microsoft grew its revenue by 136%, taking its worldwide market share to over 10%.

In October Microsoft CEO Satya Nadella said, "the enterprises of today and tomorrow demand a cloud platform that is reliable, scalable and flexible. With more than 80% of the Fortune 500 on the Microsoft cloud, we are delivering the industry's most complete cloud — for every business, every industry and every geography." Detractors have dismissed these claims as posturing or, at the very least, a hopeful marketing pitch. But Synergy Research Group's findings show real growth. For Microsoft, the new "mobile-first, cloud-first" strategy is paying off.

But Microsoft still has a long way to go. Amazon is out in front and leading the pack. Amazon's AWS cloud revenues grew after a relatively timid second quarter, resulting in its market share rising back up to 27%. Synergy Research Group suggests that this growth rate may be lower than Microsoft's, but in absolute terms AWS revenue growth over the past year is greater than the total cloud revenue for Microsoft over the same period. AWS has the greatest share of the market and has scale over its competitors. With a 7% share, IBM is the third largest cloud operator, followed by Google, Salesforce and Rackspace.

With most of the major operators having now released their earnings data for Q3, Synergy estimates that quarterly cloud infrastructure revenues have now passed the $4 billion mark. The revenue includes the full range of IaaS, PaaS and private and hybrid cloud services. Twelve-month revenues exceed $14.5 billion as the market grew by 49%. Cloud adoption keeps rising and, as it does, the leading providers are all gaining market share.
"Given the level of competition in the market, Microsoft's growth rate has been truly impressive, reflecting a strong corporate focus on cloud and huge on-going investment levels" said John Dinsdale, a Chief Analyst and Research Director at Synergy Research Group. "The other main feature of the Q3 market was the return to strong sequential revenue growth at AWS following a relatively weak second quarter. While prices continue to fall, Q3 did not see the huge pricing disconnect that caused AWS growth rate to falter in Q2." No matter which cloud infrastructure provider you may favour the news is good - the cloud market is enjoying some of its highest growth. While the global markets haven't set the world alight the cloud marketplace is certainly blazing a path forward. ### Fast cars, fast clouds The minute the New Year begins the clock starts ticking, ever advancing to close out the year that has just begun. This is just the way time works. From a beginning an ending is sure to follow. For cloud computing to be truly successful it needs an ending too. I would suggest that ending is ubiquity. The cloud should become so important, so successful, that it permeates our working habits and the provision of our services to such a degree that we don’t even mention it. The final goal will be reached when the term cloud computing becomes unnecessary. Consider the adoption of the automobile in the early 1900s. Once the price of ownership came down with mass production and the public could afford to buy one the only question was need. Some people weren't sure what they would need a car for or how it had the potential to transform their lives. But technology has a way of answering questions we didn’t know we had. Marketing, even back then, was important to help sell the car concept and persuade the public. No one has to explain the importance or adoption of cars to us anymore. Since the car stopped being a science experiment and assumed the role of transporter the debate has largely disappeared. Today cars are sold individually on their features, efficiency, size and social aspiration. Wheels, the drive train and the engine are as important as the colour and seating layout. The public aren't told they need a car, only which one they might like to buy. Like cars cloud computing is beginning to take hold and become commonplace. While it still has a long way to go the affects of the technology are already being felt; and this is helping it make deeper connections into our offices and homes. For many cloud computing is often associated with storage and productivity. Services like Dropbox, Gmail, Google Drive, Microsoft Office 365 and even Apple's iCloud make-up what the public commonly associates with the cloud. But for those of us in the industry we know the cloud has bigger capabilities. One of those capabilities is big data. It is driving decisions with insight and doing so ubiquitously. Supermarket chains are already leveraging big data. Monitoring your shopping habits and offering discounts that match your profile the supermarkets are aggregating data to improve marketing and increase profits. Collation and the application of the data happens almost invisibly to the public. But supermarkets aren't the only ones doing this. The MetOffice uses a supercomputer to analyse terabytes of data to help it predict our weather. In the future big data has the potential to accelerate our understanding of medical conditions helping us create more effective treatments. 
When we look at the receipt for our weekly shop and the coupons subsequently offered, just like the presenter on TV telling us it will rain in the afternoon, we don't see cloud computing at work. If anything, we observe, or rather enjoy, its benefits. One aspect that is completely visible is the role of the provider. The supermarkets are large, individual brands whose collective scale makes it possible to offer us goods at a price we can afford. We identify with certain brands and choose those that offer each of the services we believe we want or need. Like the car manufacturers, their expertise – the variety on offer – is how their sales pitch is defined. This pitch defines the parameters we use to determine our preference. Instead of brake horsepower and chrome alloys, it is cheaper milk and eggs. The service, the product provided, is sold on features, not adoption. Looking back now, it may seem strange to have had to convince people of the importance and use of a car. The concept itself is now effectively invisible. The public sees only the features and price of a given product, be it Ford, Toyota, Mercedes or BMW. The future of cloud computing will be framed by branded provision – where the debate will focus on speed, scalability and the productivity it enables. But the concept of cloud will no longer be important. What the cloud is able to achieve and how it supports efficiencies and manages data certainly will. Have your say: let's have your thoughts and comments. How do you see the future success of the cloud? ### Hackathon at Apps World Europe Mobile developers come together for an innovative Hackfest at this year's Apps World Europe. The Hackfest returns to Apps World Europe on November 12-13 in the ExCeL, London to challenge teams of mobile developers and designers to collaborate, innovate and create using the latest tools and technologies on a variety of cutting-edge platforms, for the opportunity to win an ultra-thin Razer Blade gaming laptop, a Parisian experience and over £1,000 in Amazon vouchers. In the Razer Nabu Hackfest, run in partnership with Razer, participants will be challenged to create a new app that harnesses the social capabilities of the new Razer Nabu: a revolutionary wearable technology that delivers notifications from a smartphone right to a user's wrist and tracks selected personal information. Based on an open development platform, the Razer Nabu can collect personal, physical and geographical opt-in data as well as pre-configured capabilities and gestures. This allows first- and third-party developers to update existing apps or build new ones. "We are very excited to partner up with Apps World Europe for their popular Hackfest," says Min-Liang Tan, Razer co-founder and CEO. "The Razer Nabu is the first smart band ever that brings it all together: fitness tracking, mobile notifications and social functions. We are looking forward to working directly together with the most creative app developers out there to harness the huge potential the Nabu offers." Ian Johnson, founder of Apps World, shares: "We are delighted that Razer have chosen the Hackfest at Apps World Europe as the premier Razer Nabu hackathon event. The Nabu is an innovative device which has the potential to transform the world of gaming and fitness app-based wearables, and I'm really excited to see what pioneering ideas our developer community showcase during the event."
Co-hosted with the Razer Nabu Hackfest, the Apps World Hackfest will ask developers to push the boundaries of originality whilst providing cutting edge solutions to challenges supplied by sponsors MasterCard, IBM, Calldorado and Progress. The Hackfest will be commencing from 9am at Apps World Europe in the ExCeL, London from 12-13 November. For more information visit: www.the-hackfest.com ### Memset announces managed services Moving into the managed services market Memset has announced it has launched a new business division, Memset Managed Services (MMS), to help the company address the increasing bespoke nature of public sector accounts and to add further value to its Infrastructure-As-A-Service (IaaS) solutions. Cloud providers are increasingly recognising the need to provide customised solutions as well as standard service-level agreement (SLAs) offerings. This flexible business model was a natural progression for Memset who are looking to draw enterprise and government customers by adding value further up the stack. Kate Craig-Wood, Memset's MD, explained: "Memset provides one of the most capable and mature cloud platforms in the market today, and with the increasing number of customers looking for bespoke IT solutions and more expertise in association to our hosting services, our managed cloud offering will help to address these needs." Memset itself has made notable progress as a provider to UK government in the past year or so through G-Cloud, and whilst there is still a majority of customers happy to purchase off-the-shelf products in the CloudStore, there is an increasing demand for the managed services model. As a flourishing British SME Memset hope the service will stand out against rival cloud Infrastructure-As-A-Service (IaaS) providers on both price and quality of service. MMS will deliver an all-inclusive solution combining traditional managed hosting together with options such as migration, application management, pro-active monitoring, full change management, database administration and a telephone service desk. ### EMC bets on hybrid While the other big providers including Amazon and Google tend to focus on public cloud solutions EMC is betting on hybrid. EMC has launched its EMC® Enterprise Hybrid Cloud Solution that integrates hardware, software and services from EMC and VMware to unite the strengths of private and public cloud. The EMC Enterprise Hybrid Cloud Solution enables IT-as-a-service (ITaaS) in as few as 28 days. Alongside this announcement EMC has also added three cloud companies, The Cloudscaling Group, Maginatics and Spanning Cloud Apps, extending its capabilities across cloud infrastructure, storage and data protection. With these acquisitions EMC can offer customers hybrid cloud solutions based on OpenStack technology, cloud choice with data mobility across multiple clouds, and new protection capabilities for "born in the cloud" applications and data. As IT organisations race to keep pace with the demands of the modern and rapidly changing business, they need to leverage both private cloud and public cloud. Where the public cloud is low cost and flexible the private cloud is secure, controlled and reliable. Mixing the benefits of each of these services in a hybrid deployment means that organisations will no longer have to make trade-offs between the speed and agility of public cloud services and the control and security of private cloud infrastructure. 
The EMC Enterprise Hybrid Cloud Solution empowers IT to be a broker of trusted cloud services while maintaining the freedom to choose the Management and Orchestration technology upon which the hybrid cloud is standardised. EMC believes it unites the best of private and public clouds to enable ITaaS. One thing is certain: hybrid clouds are increasing in popularity and are beginning to define the marketplace battleground. ### Golden Parameters of Monitoring Cloud computing is disrupting the technology and IT markets. The cloud enables any business, small or large, to operate like a full-blown enterprise. Cloud technology offers affordable solutions that help businesses become more agile and responsive to industry changes, increase their collaborative opportunities, and extend their current computation capacity. While cloud computing has become a viable solution for many, it's not flawless. Today, cloud data is dynamic. Businesses can no longer release an app without continuously monitoring and analysing its behaviour. Like many other technologies, cloud deployment requires ongoing monitoring to ensure optimal performance at all times. Cloud monitoring tools offer visibility to uncover potential problems, enabling users to make real-time changes before customers are impacted. Importance of Monitoring Performance analytics are important for any business operating in the cloud. The information captured in monitoring can serve as the core basis for decision making and provides insight into how an app is utilising resources. Additionally, certain cloud monitoring tools can anticipate cloud outages by identifying unusual behaviours and alerting the IT team to respond in a timely manner. The monitoring tools included by cloud providers are often too generic to meet the needs of each specific business. To uncover business truths, the ideal solution is for businesses to invest in creating a tailored monitoring system to validate product offerings and to distil the most critical information for specific business needs. Several factors affect cloud computing performance One of the most important characteristics of the cloud is its scalability. While scalability is critical, unless businesses are actively monitoring their provisioned systems closely, they may be underutilising the full capacity of the cloud. A system that is over-provisioned can be a drain on system resources and a significant waste of company money. Alternatively, a system that is under-provisioned contributes directly to poor performance. In either case, since all users are sharing the same level of resources, there is always potential for server bottlenecks. As cloud computing continues to shape the future of how businesses operate, it is essential that companies maximise the functionality of the cloud by closely monitoring it and leveraging the valuable information generated. With so many options, why is monitoring still an issue? The information aggregated from cloud monitoring is often overwhelming to IT teams. Monitoring tools provide analytic solutions, but these only reflect what the cloud platform itself exposes. A dashboard of only the most important performance statistics is needed to provide actionable insight in the dynamic cloud environment. The limitations of generic monitoring systems mean they often fail to provide the most essential information for each business.
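To make the idea of distilling a flood of monitoring data down to a short, threshold-driven dashboard concrete, here is a minimal sketch in Python. The metric names and threshold values are purely illustrative assumptions, not taken from any particular monitoring product; in practice they would come from a business's own SLAs and baselines.

```python
# Minimal, illustrative sketch: reduce a stream of cloud metrics to the handful
# that actually breach their limits. Metric names and thresholds are hypothetical.
THRESHOLDS = {
    "cpu_utilisation_pct": 85.0,   # sustained CPU above this hints at under-provisioning
    "error_rate_pct": 1.0,         # share of failed requests
    "p95_latency_ms": 500.0,       # 95th-percentile response time
    "disk_used_pct": 90.0,
}

def summarise(samples: dict) -> list:
    """Return only the metrics that breach their threshold."""
    alerts = []
    for metric, limit in THRESHOLDS.items():
        value = samples.get(metric)
        if value is not None and value > limit:
            alerts.append(f"{metric} = {value} (limit {limit})")
    return alerts

if __name__ == "__main__":
    # One polling interval's worth of made-up readings.
    readings = {"cpu_utilisation_pct": 91.2, "error_rate_pct": 0.3,
                "p95_latency_ms": 430.0, "disk_used_pct": 62.5}
    breaches = summarise(readings)
    print("All clear" if not breaches else "Needs attention:\n  " + "\n  ".join(breaches))
```

The same shape extends naturally to the single-number "formula" the article goes on to suggest: weight each normalised metric and compare the sum to one agreed threshold.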
Failure management and performance optimisation techniques need scalable and intelligent methods to analyse the large amounts of monitoring data, and to apply smart optimisations to mitigate problems and ensure dynamic resource management in the presence of frequent component failures. At this level of scale, different system components operate at multiple time scales, further adding to the management challenges. At the core of the overflow of alerts lie core business truths. Unfortunately, businesses operating in the cloud are overwhelmed by the alerts and are not refining this critical information to make informed business decisions. Ensure success without budget headaches Businesses should monitor their cloud environments to ensure they address any issues that could be detrimental to operations. Because many factors can affect cloud performance, it is essential for companies to begin maximising the full capabilities of cloud server monitoring solutions. Companies need to filter the high volume of data into a single high-quality dashboard that provides the most critical information for their cloud system. Cloud monitoring focused on the most essential information will provide the business with a realistic view into the many parameters that affect the health and availability of the cloud system. Today, companies invest significant resources in IT infrastructure; it would be irresponsible not to maintain and monitor it. Granted, the cloud may not be as expensive as many other viable options, but integrating cloud technology means embracing all of its capabilities along with its challenges. Cloud computing indeed offers many benefits, and monitoring is an essential aspect of maintaining proper balance and optimal performance. Ideally, all parts of the organisation would meet and agree a "formula" that integrates a few key parameters into a single number: above the threshold, everything is OK; below it, further checks are needed. As always, technology is constantly changing. The sole trend that remains constant is the need for effective monitoring of the specific technology. Organisations that monitor their cloud and customise the information to meet their needs will successfully manage their services in the most cost-effective manner while providing the highest level of system performance. The cloud is widely regarded as a disruptive technology; businesses shouldn't simply accept overwhelming data or an inability to filter the alerts that point to potential disruptions that can harm performance. Pairing cloud computing with cloud server monitoring that is tailored to a company's needs isn't just effective, it's essential. ### On the right track In the final instalment of our series looking at small business and the cloud, Bree Freeman offers up some advice on how to manoeuvre your way through the maze that is the cloud. If you're a non-techie person, you might think of cloud computing as programs that are floating around up there, somewhere. You've heard about it, but don't truly understand what it all means and, most importantly, how the technology can benefit your business. But choosing the right cloud solution can often be daunting, although it needn't be perceived as a leap of faith.
Some businesses are concerned about exactly how to choose the right cloud apps, how to get the best use from them, and whether their data will be secure when it's up there. To help dispel these doubts, do your research. For the sake of the evolution, or survival, of your business, taking some time to figure out what your requirements are is the first step to unlocking the full potential of the cloud. Cloud services can make apps that were once the preserve of only the largest organisations – like CRM, sales management, contact centre software and call recording – available to any company, from a two-person business to a multi-million pound organisation. It puts what were once the big kids' toys in the hands of any business – irrespective of size. So, the first thing you need to decide is what your basic business needs are. Are you looking for an overhaul of your infrastructure, simply wanting an app to manage your finances, or are you looking for a tool that helps with productivity such as Office 365 or Google Drive? Of course, not everything has to go on the cloud. Many people opt for a mixed model of cloud for sensitive things and localised computing for the day-to-day stuff. There are different forms of cloud – private cloud, which is exclusive to its clients; public clouds such as Google, which anyone can log on to without breaking the bank; and hybrid, in which the provider balances the client's needs against the technical offerings. If you are looking for an overhaul of your infrastructure then my advice is to get yourself a credible service provider. But be warned: the market is becoming saturated, so for your first steps into the cloud arena you might find chatting with one of the ol' boys of cloud – such as SoftLayer, AWS, Google or Rackspace – a better option. However, there are some fabulous young 'uns out there that speak your language, understand your needs and are local to you. Companies such as entrustIT, Yorkshire Cloud, iomart Hosting and Phoenix offer great managed services and have friendly experts to help you smoothly migrate to the cloud. However, if you simply want to dip your toe in the cloud then you'll be better off starting small and working your way up. If you're looking for a storage solution then companies such as Box and Dropbox might be right up your street. If you've ever cursed yourself for leaving that important file on your office computer or ever needed to send a client a file immediately, then both Box and Dropbox can keep that from happening again. You simply drag a file, plop it into the relevant folder, and you can instantly access it from anywhere – well, as long as you have internet access. If you're looking for a backup solution then I'd suggest looking at Mozy. They continuously back up the files on your computer or server. It gives small businesses the space to back up their entire computer and server files for a very reasonable price, so you know your files are retrievable, even during a data loss crisis. When it comes to a cost-effective telecoms solution, VoIP could be the way to go for you. Like most cloud tech it's a system that can grow or shrink as business conditions warrant. Adding extensions or direct numbers requires only a few mouse clicks, and new extensions can be provisioned in just a few minutes' time.
Whether, like me, your colleagues are scattered across the globe or are located in a single building, hosted VoIP makes it possible for everyone in your business to operate on a single communication system. You can have one business number with multiple extensions and departments – all configurable from a computer or mobile device. With respected companies such as BT, RingCentral, Vodafone and Verizon having a firm footing in the VoIP arena, you can be sure you'll be in safe hands. Whatever your business needs are, there is more than likely a cloud solution or app for you. The key, as always with most things, is 'knowledge'; you can't unlock and reap the benefits of the cloud without doing your research. You need a firm understanding of what YOU WANT from the cloud; once you have this then you simply need to find the right provider or solution. Easy peasy! ### IBM helps an organisation know more, faster IBM has announced a new generation of data services on the IBM Cloud designed to ensure more trusted information can be applied more readily across an organisation. We are flooded by big data. The competitive advantage this data offers is only useful if you can understand its meaning and apply its insights. The faster you can do that, the more likely you are to be ahead of the competition. "All the data in the world is useless if you can't put it to work," said Beth Smith, general manager, Big Data, IBM. "The new cloud-born services from IBM provide data professionals the ability to deliver data with speed and confidence as the fuel for applications and analytics. The ability to source and manage the right data will help organisations keep data management streamlined, while adhering to increasingly stringent regulatory demands, and produce results and analysis of real value to the business." Nucleus Research estimates that analytics pays back $13.01 (£8) for every dollar spent (60p) – 1.2 times more than it did three years ago. For businesses able to leverage big data insights, the effect on profits and margins could potentially be huge. But the issue has often been how to work with the data. Unlike other solutions that address only one aspect of the data problem, the new services introduced by IBM help organisations put the data to work for their business. These new innovations will help ensure that data flows seamlessly to applications and analytics, making them portable and accessible, no matter where the data resides. They include: IBM DataWorks – IBM DataWorks is a set of cloud-based data refinery services that shape, cleanse, match and secure data. The new services enable business users to find, use and contribute data for analysis; application developers to embed data services into new applications; and IT and data professionals to enable self-service data access and instil confidence to act on the data. IBM dashDB – IBM dashDB is a cloud-based data warehousing and analytics service with in-memory technology built in to deliver answers faster. dashDB keeps infrastructure concerns out of the way of critical and time-sensitive analytics. A new integration of dashDB with Cloudant, IBM's NoSQL database as a service (DBaaS), allows Cloudant clients to embed analytics in their applications with a few clicks. IBM Cloudant - IBM is extending its portfolio with Cloudant Local, an on-premise edition of the fully managed cloud database-as-a-service that enables a fluid hybrid cloud data layer that spans private data centers, mobile devices and third-party cloud providers.
This ensures customers can easily reconfigure their cloud data platforms over time to optimise cost, security, reach and performance. The power of these capabilities could enable, for example, a ride-sharing company to improve its customers' experience by ensuring its drivers are in the right place at the right time. The company can take taxi trip information, captured in a mobile application running on Cloudant, directly into dashDB, and then use DataWorks to refine and load additional weather and traffic data to provide more insight. With dashDB, Cloudant and DataWorks working together, new insights can be leveraged to improve customer experience and grow revenue. IBM DataWorks, dashDB and Cloudant are available on IBM Bluemix. For more information on DataWorks, go to ibm.biz/ibmdataworks. ### Russia unplugged Through my travels over the last eight months or so I have picked up some valuable insights on cloud technology and have also been surprised by my findings. On a recent visit to Moscow, I learned that in Russia the cloud looks set to be in conflict with new legislation that could effectively divide the Internet. By 1 January 2015, data operators will be required to store personal data of Russian citizens on servers inside the territory of the Russian Federation. The legislation was signed by President Putin on 9 July, and its roll-out was brought forward following a vote in September. Companies like Facebook and Twitter, for example, which store data on servers outside of the country, will be in breach of the new rules, as will AWS, Google, Microsoft and many others that have no data centre presence there. Cloud and web services that store personal data beyond Russian borders face being blocked from the Internet. Alexander Yushchenko, a member of the Committee on Information Policy responsible for the legislation, assured the public that many services would be able to continue. He believes that companies like Facebook and Twitter will be able to rent Russian servers, complying with the new rules. Last week, Dmitry Marinichev, appointed in July by the government as Russia's first Internet ombudsman, suggested that it was going to be difficult for data operators to comply. "At the moment it cannot technically be implemented," said Marinichev. "Social networks, Internet stores, cloud services and ticket and hotel booking services would be at risk." Nevertheless, the debate over the deadline and the ability to comply misses one of the more crucial aspects of the new laws. As part of the legislation, Russia wants to be able to implement a kill-switch. Intended to increase security, this kill-switch would allow the government to sever Russia from the global Internet during times of war or large-scale civil protest. This is a big issue that no one is really talking about - at least not on my travels or in cloud forums. What does all this mean really? My belief is that we will see a frenzy to set up local data servers. Facebook, Twitter, Google and all the big providers will need to host these data servers in Russia, and quickly. Is this a big step back to the dark days? Maybe. To quote Winston Churchill, "an Iron Curtain has descended across the Continent." This one might only be digitally drawn, but the impact is huge. One thing is for sure – Russia will not be the only country considering these measures. The rise of country clouds is coming, and I think countries will become more insular in their approach due to trust and security concerns.
### Reduction is as important as renewables Renewables tend to lead the energy debate, but reduction is an important part of the story as well. Following our look at data centre energy consumption, it's with great interest we note the launch of Kingston Technology's new DDR SDRAM, DDR4. The increase in demand for additional storage and computational capacity means that data centres' energy consumption is on the rise. Apart from using less, which is not likely as cloud adoption continues, improving the hardware and its power requirements may be the best option. The increasing shift in the market towards outsourcing IT services to cloud vendors is changing the way we house and manage our IT infrastructure, and this gives us a huge opportunity to rethink data centre energy consumption and reengineer a greener, more environmentally friendly approach to data centre provisioning. A big part of this is the way data centre operators source energy – and the drive towards 'clean energy' by some major data centre operators (e.g. Apple) is highly commendable. But sourcing energy from renewable sources isn't the only part of the picture. Reducing demand must also play a part. That's why DDR4 is interesting: Kingston claim DDR4 will deliver improved performance, higher DIMM capacities and lower power consumption. Its higher-capacity DRAM chips and stacking technologies mean that DDR4 can achieve more than 2GBps per DIMM while operating at 1.2V – compared with 1.5V for the previous-generation DDR3. It doesn't sound like a lot, but this could equate to power savings at chip level of up to 40%, claim Kingston. And savings at chip level have a cumulative effect, reducing power consumption in terms of voltage conversion, power distribution, UPS, cooling, switchgear and transformers, so savings are compounded. This is important when we consider that 50% of data centre power goes on IT equipment – by far the largest consumer within the data centre. Cooling, air movement and lighting tend to get a lot of attention in terms of reducing energy consumption in the data centre, perhaps because this often tends to be where the greatest impact can be made. But they are relatively low consumers: 25%, 12% and 3% respectively, according to EYP Mission Critical Facilities figures. Small savings at chip level can make a huge impact on the overall efficiency of the data centre – and, perhaps just as importantly from an energy management point of view, on the consumption of power at desktop and client device level too. It's interesting that Apple, Kingston and Infinera are trying to exploit the competitive advantage in the green credentials of their products. It would be nice to think that the industry would begin to take a lead on energy consumption reduction and green energy sourcing, but I think that from a business perspective the economics of energy consumption reduction still lead the debate. Nevertheless, as energy prices continue to rise and global instability increases the risk relating to supply, perhaps the message of energy reduction will gain further traction throughout the industry. ### Video meetings on the rise As cloud services redefine the way we work together, our reliance on them is revealing a change in attitudes: 71% of workers favour video over audio conferencing, with the younger generation leading the way.
A YouGov survey conducted on behalf of RingCentral UK Ltd., a subsidiary of RingCentral, Inc., has found that face-to-face meetings are becoming less viable for businesses, as workforces become dispersed over multiple locations and more companies conduct business overseas. The RingCentral research shows that two-thirds of the senior decision makers surveyed (66%) believe that one of the biggest disadvantages of face-to-face meetings is the time it takes to travel to and from meetings, while half (50%) believe another big disadvantage is the cost. Six in ten (62%) admitted they had lost the most time when travelling to face-to-face meetings, while a third (32%) admitted most time was lost when face-to-face meetings overran due to distractions such as small talk, making teas and coffees, and office tours. As a result, businesses tend to favour conference calls over face-to-face meetings. Six in ten (60%) respondents said they prefer conference calls over face-to-face meetings as they are more time-and cost-efficient, while a third (34%) said they use conference calls because they work in a multi-dispersed/ multi-geographic team. A third (33%) admitted that conference calls were more viable for the business, enabling them to make connections abroad that might otherwise have been too costly or economically unfeasible. A downside to audio conferencing is that participants may not always be fully engaged. In fact, a fifth (19%) of us have admitted to conducting a conference call in our pyjamas or from an unusual location, such as the bathroom or outdoors. When it comes to conference calls, the majority of businesses (71%) prefer video over audio conferencing as it's more engaging, inclusive and interactive. Despite this, the research, conducted by YouGov and commissioned by RingCentral – a cloud business communications company – found that many companies still conduct business in the traditional way, with just under half (46%) of companies saying that face-to-face meetings still make the most sense for their business. Over a third (35%) of businesses admitted that poor image quality and sound have put them off using video conferencing in the past, while over a quarter (28%) blamed the complexity of setting up the technology for putting them off using it. "Many companies still prefer traditional face-to-face meetings as they are considered more personable and engaging than audio conference calls, but travelling to and from meetings is costly and time-consuming. Businesses are already beginning to see that video conferencing is the answer to bridge this gap," said Lars Nordhild Rønning, General Manager, EMEA, at RingCentral UK. "Over two thirds (71%) of businesses would use video conferencing over audio conferencing, so businesses must look to new technologies such as High-Definition (HD) video conferencing that flawlessly meet business expectations to help them maintain relationships, achieve their business goals and remain as efficient as possible." The research found that it is the younger generations of workers who are leading the adoption of video conferencing. Over a quarter (27%) of respondents aged 18-34 say they use video conferencing for business at least once a week, compared to just 13% of those aged over 45. It is also the younger generation of workers who are much more open to embracing and investing in new technologies in the workplace. 
Nearly two-fifths (39%) of 18-34 year olds said they are likely to spend more time on video conferencing calls in the next 12 months, and nearly a quarter (24%) are likely to invest more money in the technology in the next 12 months. In contrast, just 26% of those aged 45+ are likely to spend more time on video conferencing calls over the next 12 months, with only 16% likely to invest in the technology. "The benefits of HD video conferencing shouldn't be underestimated. Research that RingCentral commissioned last year found that allowing employees to work flexibly is important for employee productivity levels. With video conferencing, businesses have the ability to remain personable and engaging with staff and clients," said Nordhild Rønning. ### Knowledge is power In part two of our SMBs and the cloud series, Bree Freeman looks at how the cloud is playing a key part in increasing the business potential for SMBs. Can you remember those old sweet shops? You know the ones where there were jars and jars of these fantastically coloured sweets, which were weighed out by the quarter on a big old-fashioned metal scale pan and packaged into a small white paper bag. The brown-coated shopkeeper who stood masterfully behind the counter then handed that special little package back to you and off you went (running naturally) to get on with exploring the contents of your little white bag. Hmmmm! Pear drops! But for me sometimes I became bamboozled by the array of sweets on offer and simply couldn't decide, much to the annoyance of my elder brother, what I was going to buy. So finding myself in a bit of a quandary, I would always opt for a mix-up as, being a rather sweet-toothed young madam, I simply couldn't stick with just the one type of sweetie – I wanted to try them all. Where are you heading with this analogy, I hear you say? Well, I have found myself in a similar sticky predicament when it comes to choosing the right cloud computing technology for myself. As a small business owner you can often find yourself in a similar situation, but this time instead of tantalisingly colourful jars of mouth-watering sweeties, there's a whole shopping department of cloud tech out there. Finding the right solution/s for your business can be just as confounding as sweetie picking. The cloud represents a fundamental shift in the way businesses have to operate and it's something that every organisation must embrace in order to stay competitive. These days the market is saturated with cloud solutions that could help take your company up a notch or two or three. But choosing what is right for you is ultimately the hardest part. When it comes to unlocking the potential of the cloud, as always, knowledge is power. Knowing what cloud technology can do for you and your small business is the key to increasing the potential of your business. In the past few years every trend that we've seen, whether it's BYOD, mobile working or big data, has been made possible thanks to the flexible and reliable environments offered through the cloud. It would seem that more small businesses than ever have their heads stuck firmly in a cloud: according to research by the British Chamber of Commerce, BT Business and YouGov on behalf of Vodafone, some 60% of UK SMBs are using cloud-based apps these days.
And it is no surprise, really, that the majority of SMBs are opting for the cloud, as one of its greatest benefits is financial. SMBs have discovered that every cloud has a silver lining, and the business apps found within are giving them the same operational power as installed software, at a fraction of the cost. The cloud can remove the need for many types of hardware, software, networking management and overall IT maintenance. This helps reduce costs for businesses in these areas, with the ability to pay only for what is used, as there is no upfront cost. You'll pay a low, predictable, flat-rate monthly fee per user for the software that you use; and that means that you can scale up or down as your business needs demand. If and when you take on more staff, you can immediately switch on new licences, and similarly turn the tap off if you scale down. In fact, you can pick and mix your cloud tech – try out a VoIP system, get yourself a little CRM; whatever you require, there will be a cloud solution or app for you. In the end, though, for small businesses, moving your whole base of operations to the cloud might be the smartest move you ever make. With increased efficiency, productivity, and communication between employees, the sky really is the limit of how successful your small business can be. My advice is to figure out what you want from the cloud, do your research, or speak with a reputable provider who, like my old brown-coated shopkeeper, has the knowledge to help you find the right solution for your needs. In part three of our series, Bree will look at some of the delectable cloud morsels on offer. ### Cloud adoption and the SAP drive Cloud is the word on everyone's lips in every organisation – why do you think this is? Cloud as a disruptive technology is a topic that is much discussed, with many drawing comparisons between the emergence of cloud and the advent of the Internet age. There is good reason for this: there are striking similarities in the way both of these innovations are transforming the way organisations collaborate, communicate and create, which frees up the business to become more agile, responsive and innovative. What are the parallels that can be drawn between cloud and the Internet age? Well, we see some companies taking the plunge, while others are adopting a more conservative approach. It should come as no surprise that the "born on the web" companies have been early adopters while enterprises have been somewhat more reserved in their exploitation of cloud. That reservation is valid, as they need to make the most of the investments already made in IT whilst transitioning to the cloud and running their business to its optimum whilst they do so. What trends are you seeing? We're seeing an interesting trend among some young companies adopting cloud. Initially, they turn to cloud because it offers them instant access to infrastructure and unlimited compute power that fuels the rapid build-out of their business. However, at a critical point they reach a scale where they are drawn towards a hybrid cloud model where they create their own cloud capacity. Conversely, we are seeing more established enterprises moving in the opposite direction, starting out with a private cloud and then graduating toward a hybrid model.
Whether you are a new born-on-the-internet company using off-premise cloud but evolving to include some on-premise capacity as your business matures, or a more traditional enterprise moving its workloads from non-cloud to a hybrid state of mixed on- and off-premise cloud, the end point is still the same: in most cases it is a dynamic hybrid cloud outcome. How can we really drive cloud adoption amongst companies of all shapes and sizes? We must make the cloud usable and friendly for all the organisations helping our world turn: from one-person start-ups to government agencies to hospital networks to manufacturers. If cloud can help young companies bring their innovations to market in days instead of weeks, imagine the potential it holds for everyone else. We believe that this can be achieved through collaboration within our industry, such as the partnership with IBM we announced last week - IBM and SAP Partner to Accelerate Enterprise Cloud Adoption. By dovetailing into each other's strengths, together we can build a cloud which is secure, scalable and open – one which is free of the key inhibitors which make many hesitant about using cloud today. At the other end of the spectrum, you can set up an email environment in 5 minutes using Bluemix. What would such a usable, enterprise-friendly cloud be built on? It would be built on three pillars of technology, all of which will help speed the adoption of enterprise-grade apps to the cloud. Integration: Establishing a hybrid model leveraging the public cloud for certain projects is the ideal way for organisations to test the waters, especially if they have already invested heavily in their own systems and infrastructure. This gives enterprises the option of keeping some of their more sensitive data in-house, and maximising the value of sunk costs in existing systems. For this hybrid model to succeed, we need to embrace open standards, enabling new cloud-based workloads from mobile, data and social networks to easily connect back into and work with existing technologies and back-end systems, as well as with other services from providers and companies from across the spectrum. With our partnership announced today, we'll be able to bring this standard of openness to our clients through our integration with IBM's cloud – which is built on open standards such as OASIS TOSCA and on open source initiatives such as OpenStack and CloudFoundry. Security: One of the most pressing – and visible – challenges facing the adoption of cloud by larger organisations today is data protection and privacy. In addition to the critical need to protect data in the cloud against hackers, increasing government regulations require certain data to reside within national borders – which presents an insurmountable challenge to many European companies using a US-based cloud provider. Cloud should not only present zero privacy concerns, but it should also be built on a global data centre infrastructure which allows companies to effortlessly stay within the legal boundaries of where they're doing business, ensuring that they meet governance and compliance requirements at all times. Big Data: Big Data is quickly becoming organisations' most valuable asset, and is rising as the primary competitive advantage in growing customer relationships, marketing, trend prediction, product improvement and more. As mobile and social technologies multiply, so does Big Data, and cloud is rapidly becoming the only way businesses can handle and analyse tremendously large, data-intense workloads.
Cloud platforms should not only be able to manage Big Data with ease, but should also have built-in analytical capabilities to help companies extract valuable information from these workloads smoothly. By further assimilating the latest advances in cognitive and predictive analytics into the cloud, we're hoping to bring the full power of cloud to the forefront of the global technology stage. Luckily, this ideal of what cloud should be is not just a vision. It's a reality, and is within reach of countless organisations today. ### Microsoft announces cloud in a box Microsoft has announced the Microsoft Cloud Platform System (CPS), a cloud device aimed at helping businesses move to the cloud. The CPS is a Dell powered device that brings together Azure, Windows Server and Microsoft System Center to deliver an "Azure-consistent cloud in a box". Taking on Google and Amazon this new device will allow businesses to run Azure cloud computing service inside their own data centres. "Our ecosystem is the backbone of our cloud platform, and our embrace of open source technologies is at the heart," said Scott Guthrie, executive vice president of Cloud and Enterprise at Microsoft. "By helping to create an open platform powered by choice and flexibility, we are enabling the enterprises and developers of today and tomorrow to connect with each other and create new business opportunities in the mobile-first, cloud-first world." Worldwide demand for cloud computing continues to accelerate, and Microsoft is investing to meet this demand. By the end of 2014, Microsoft Azure will be operational in 19 regions around the world — at least double the number of any other public cloud provider. Many believe that Microsoft is still playing catch-up with Google and Amazon. With this announcement Microsoft CEO Satya Nadella has outlined how Microsoft is using Microsoft Azure, Office 365 and Microsoft Dynamics to deliver the industry's most complete cloud. "The enterprises of today and tomorrow demand a cloud platform that is reliable, scalable and flexible," Nadella said. "With more than 80 percent of the Fortune 500 on the Microsoft cloud, we are delivering the industry's most complete cloud — for every business, every industry and every geography." ### Trading at the Speed of Light Enabling cloud and digital services Ahead of Cloud Asia, we speak to Rocky Scopelliti, Group General Manager, Industry Centre of Excellence, Telstra, Australia in order to determine some of the winning steps companies should be taking when approaching data analytics. When thinking about those companies that are successful in terms of their data analytics, there are inevitably some common characteristics in terms of their approach to them, and the company-wide policy with regard to them. Rocky is in agreement with this, stating that for these companies "their analytics support a strategic, distinctive capability, while the approach to, and management of, analytics is enterprise-wide". What's more, "Senior management are committed to the use of analytics; and the company makes a significant strategic bet on analytics-based competition." It's not just the responsibility of the enterprise, however; ICT organisations have a vital role in enabling more competitive and effective data analytics, by creating an environment for them to succeed. Looking at a specific vertical – Financial Services – this is something that Telstra refer to as a Smart Connected Financial Services World. 
This, according to Rocky "will enable data analytics to benefit financial institutions; it will be used to respond to the changing competitive environment, and how the design of experiences will be valued by consumers, delivering growth. Demand for ‘trading at the speed of light' and the need for shaving nanoseconds of latency has seen organisations like Telstra make significant investments in cable systems linking global markets and exchanges". Trading at the speed of light is paramount, where a fraction of a second can cost millions. As global technology infrastructure develops even further, consumers across all verticals will inevitably become more demanding – what will be next, retailing at the speed of light? Data mining at the speed of light? Looking specifically at what Telstra can offer to enable a Smart Connected Financial Services World, they have a wide range of cloud and leading edge digital services to enable institutions to reduce risk, improve productivity and create growth. When discussing this, Rocky makes the point that "Our research demonstrates that there is very strong customer demand for the personalisation and convenience that relationship managers typically provide. Unfortunately, direct relationship management is relatively expensive and is typically restricted to very high value customers." However, there are several options: "Contact.Me brings together the best of a virtual IPA and a real, but remote, relationship manager in a single, engaging interface. Also we have Branch.Me, which is all about personalising the customer's experience while they are visiting the branch – should they opt in for this level of service. In terms of a personalised customer experience, we have Digital.Me, it isn't just about using analytics to help the customer manage their finances more easily and effectively, it's also about creating a platform that will allow providers to offer new services that customers will value". Finally, talk turns to this November's Cloud Asia event, where Telstra are the exclusive Enterprise Cloud Partner. Rocky's reasons for Telstra's participation at this important industry meeting place are that "Cloud Asia brings together new insights from leaders breaking new ground. I look forward to contributing my thought leadership on how Cloud can create new business models to compete in an increasingly data-fused environment." We look forward to it too. You can join Rocky, and all our event participants, by registering for Cloud Asia 2014. With a free-to-attend expo, and complimentary conference passes for enterprise, make sure you're part of the conversation. ### Take-off with amortisation A plane that never takes off or lands? Andre Smith looks at standardised pricing and amortisation. It has often been said that the two most difficult parts of any project, especially technology-led projects, are getting it started and getting it completed. Pilots will tell you that flying a plane is three hours of peace book-ended by three minutes of stark raving panic. Customers of plane travel might be tempted to conclude what they are paying for is the "flight". The customer's interpretation emphasises efficiency. The pilot responds with pragmatism. There can be no flight without a takeoff and a landing. This debate is now a central feature of many discussions of price and value. On one side of the debate are people who will sincerely tell the world greater efficiency is not only possible, but also necessary no matter what the application. 
On the other side are the pragmatists who pursue efficiency, but only so far. They point out efficiency has a cost which is often greater than what is being replaced. "Why can't we pay only for what we need?" ask the customers of technology solutions. A fair question, but it is possible it makes over-simplified assumptions about the technologies and the details of their deployment. Going back to the aircraft flight example, it is clear the majority of a pilot's skills are employed to safely get the plane off the ground and back again. Unless there are acrobatics or traumatic weather patterns on the flight plan, the rest of the trip only requires occasional pilot attention. Without stretching the metaphor too far, the question the "only pay for what we need" crowd is asking is "why can't we just pay for the time in the air?" At this point, it is no longer a fair question for obvious logistical reasons. But what if there are models that would work on a per-minute basis? Everyone knows phone calls have been billed by the minute for over 100 years. Why can't that model be more generally applied? Amortisation The answer is it can, provided the "take-offs and landings" can be amortised over the purchases of all customers paying on a per-minute basis. Phone service is a perfect example because, by and large, the process of connecting a phone call is the same for nearly every mobile phone or landline phone user. On the other hand, if every call required custom background music, a specialised ringtone and a paid moderator, the possibility of charging by the minute would rapidly fade. The reason is fairly simple from a business standpoint: unpredictable costs must be paired with flexible pricing, or the company could find itself losing money on every sale. Installing technology is one kind of product that would be fairly lucrative from a pay-per-minute standpoint. It would be valuable to the installer if they could make a single upgrade and rent it out by the minute. On the other hand, services like project management make more sense for the purchaser. One-time services don't match an ongoing billing arrangement. The value in a pay-per-minute billing structure depends on an ongoing exchange of value. Project management is something that requires additional work at regular intervals. The value matches the billing, so this service is one that would likely benefit from and even enhance the idea of a pay-per-minute billing structure. It's always a good idea to try and balance value and price if the product is going to be competitive. Where Pay-Per-Minute Works Nevertheless, there are examples of per-minute products. Ziferblat, a Russian cafe chain, charges its customers 3p a minute while they are inside. They even provide those customers with an alarm clock to carry around with them. Everything else in the cafe is free. We've all seen people who spend a great deal of time in coffee shops and restaurants, working remotely on wireless connections with their mobile devices or PCs. Ziferblat's idea of what they call a "coffice" (coffee-office) would turn this working time into a profit centre. They have plans to expand worldwide soon, so it won't be long before we know if the model they have turned into a success so far will continue to grow. Customisation Requires Flexibility The difference between this example and a product like customised web hosting or custom application development for business-to-business product sales is the start-up and wind-down time and expense.
A product must be understood before it can be built, and when the start-up costs reset to zero each time a new customer decides to make a purchase, the seller finds himself in a situation where he not only has to build a suspension bridge, but also has to re-discover physics and invent steel too. The pay-per-minute model relies on a commodity product like time in a cafe rather than a custom product like a business sales application. The question of "why can't we pay for just what we use?" fails to take into consideration the fact that no other customer can buy your takeoff or landing or the time your pilot invests in them for their plane flight. When you buy the groundwork for your product, you are paying for what you use. Nobody else can use it. The Power Example National Geographic recently addressed this issue in their coverage of solar panel-equipped homes, which make use of the energy grid despite their reduced bills. Like most energy products, electricity is billed on time as well as by volume. Utilities contend that infrastructure must be paid for and that by paying less, solar panel-equipped homes are shifting the burden to other customers. This is an example of paying by the minute, except in reverse. Solar-equipped customers are paying less over the same time interval. According to the utility, they have the same financial responsibility as other customers for the grid and power plants, but they are paying a different rate. This throws another variable into the question. Can this model work if two customers pay different by-the-minute rates? It's a very good question, because it turns what was a commodity product into a custom product. The general customer needs basic power from the same plant and grid. The solar customer doesn't. Once you add a new class of customer, the pay-per-minute model becomes questionable. Conclusion It remains to be seen if new kinds of products and management strategies will improve processes to the point where standardised pricing can be combined with customised products. In the meantime, analysis like this is crucial to knowing where we are on the road to innovation, and what our options are for future progress. ### Apple Pays First we had the new iPhone 6, next year's Apple Watch, and now we have the iPad Air 2, but it is the new mobile wallet we should be watching. Our data is about to get a lot more personal. 'Watch' out, Apple's about. Everything is Apple in time for Christmas. iPhone mania and iPad lust are here once again! Yes! It's that time of year once more when the fruit-themed tech giant rolls out yet another incarnation of the greatest thing ever. With bated breath, Apple fanboys and the media alike descended upon Cupertino, California yesterday to see for themselves Apple's latest arrivals. But this year's annual pop culture event not only saw the release of the iPad Air 2, iPad mini 3 and the long-rumoured 5K iMac; Apple, with its usual pomp, announced what could be one of its most lucrative services – that Apple Pay is launching Monday, October 20, following the release of iOS 8.1. Apple's assault on mobile payments has begun. Commentators are already speculating as to the impact of Apple Pay – which allows users of its new handsets to pay for goods and services simply by tapping their phone against a special terminal in shops – on the payment, retail and mobile sectors. The new app uses near-field communication (NFC) chips.
These NFC chips – similar to those used in Oyster cards and contactless credit and debit cards – can communicate securely and wirelessly, and the handset uses a Touch ID fingerprint sensor to authenticate purchases. You can also, using the handset's built-in iSight camera, take a picture of your bankcard and the details will automatically be added to the service, under an app called Passbook. Sounds all great and Star Trekky, doesn't it? But there are legitimate questions about the privacy implications of Apple's plans, especially given the recent iCloud celebrity photo leak – Apple doesn't have a good record on picture security right now. Moreover, Apple's foray into the wearables world has raised further concerns. Take the fitness-monitoring apps for the Apple Watch, for instance: some are run by third parties outside of Apple's control, so just what kind of information might they learn and share about Apple Watch wearers? Apple has touted the encryption features it'll use to obscure credit-card information, but if people do take photographs of their credit cards to add them as payment options, just how will those images be secured? Also, just how secure are payments, given the Watch lacks the security of Touch ID? Well, according to reports, when a user first puts on the Watch they must type in a PIN code to authorise Apple Pay. Once it's on, the Watch uses constant skin contact, which it can sense using the four sapphire-covered lenses on the underside of the device, to authorise payments. However, once the device is removed from a user's wrist, they must re-enter their PIN when putting the device back on their wrist. Perhaps mindful of potential sceptics, Apple's senior vice president of Internet Software and Services, Eddy Cue, stressed how secure and safe the system is. "Security and privacy is at the core of Apple Pay. When you're using Apple Pay in a store, restaurant or other merchant, cashiers will no longer see your name, credit card number or security code, helping to reduce the potential for fraud." He added: "Apple doesn't collect your purchase history, so we don't know what you bought, where you bought it or how much you paid for it. And if your iPhone is lost or stolen, you can use Find My iPhone to quickly suspend payments from that device." Yo Delmar, Vice President of GRC at MetricStream, a governance, risk and compliance specialist, told me that: "Apple Pay is a game-changer for digital wallets – finally introducing securely stored payment information on your mobile device. By sending a simple token, instead of the full credit card account number, to the merchant's NFC-enabled point-of-sale register and adding the Touch ID fingerprint scanner, we finally have a payment system that actually reduces the risk of credit card theft and security breaches. Credit and debit card numbers are not stored on the device, nor on Apple's servers. By assigning a unique, encrypted device account number, Apple raises the bar on secure payments everywhere." It is likely that the true security implications of Apple Pay will only become clear once the new products have been on the market for a while – until then, the industry can only really guess at the potential pitfalls.
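The tokenisation principle Delmar describes – the merchant receives a disposable token rather than the real card number – can be illustrated with a small, purely conceptual sketch. This is not Apple's implementation, and the class and method names below are invented for illustration; it simply shows why intercepting a token is far less useful to a thief than intercepting the card number itself.

```python
# Conceptual sketch of payment tokenisation (NOT Apple's implementation):
# the merchant only ever sees a single-use token; the payment network keeps
# the mapping back to the real card in its own vault.
import secrets

class PaymentNetwork:
    def __init__(self):
        self._vault = {}  # token -> real card number

    def issue_token(self, card_number: str) -> str:
        token = secrets.token_hex(8)          # opaque, single-use identifier
        self._vault[token] = card_number
        return token

    def settle(self, token: str, amount_gbp: float) -> bool:
        card = self._vault.pop(token, None)   # token is consumed on first use
        return card is not None               # in reality, the real card is charged here

network = PaymentNetwork()
token = network.issue_token("4111 1111 1111 1111")   # the card number stays on the cardholder's side
print("Merchant sees only:", token)
print("First use accepted:", network.settle(token, 9.99))
print("Replayed token rejected:", network.settle(token, 9.99))
```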
But one thing's for sure, the wearables market has entered a new era and it would seem that Apple and its loyal fanboys have a significant part to play in what could be an extremely interesting development in the payment, retail and mobile sectors. ### Power Struggle ServerChoice's Joe Beaumont takes a look at power in the data centre and its importance. Fly Wheels At ServerChoice our MD has just got himself a shiny BMW i3. I know, I know, I said this was about data centres, but bear with me for one moment: I'm obviously obliged to point out the connection between our MD's commitment to green transport and our highly efficient green data centres (insert cheesy salesman grin here), but it does go to show how electrical power is worming its way into evermore fundamental parts of our lives. I'm reminded of the closing lyrics to the Kraftwerk song ‘The Voice of Energy'. And yes, the title of this paragraph is a pun. Or at least it would be if you lived in 1970s San Francisco. Right. On to something more interesting: Flywheels When it comes to UPSes there's always a debate between battery and flywheel, with proponents of both camps banging their drums. There are impressive credentials for both: batteries are tried and tested and power the latest Formula-E racecars, plus the boss' not-quite-so-super car (see above). Flywheels, on the other hand, are seen as greener and power such mighty beasts as the JET Fusion reactor1. The truth, as usual, is somewhere between the two; both technologies have their pros and cons and thus there's space for both in a modern, resilient DC. Flywheels offer instant power, are greener and take up less space. But they only last for a few seconds: what if your generator doesn't start on the first crank? Batteries, meanwhile, aren't as green, aren't as space or cost effective, but will last for minutes and minutes. And if your goal is resilience, that's what counts. Why not use both? Use a flywheel as a cheap battery buffer. Location Location Location Whether you're looking for a colocation/cloud provider or aiming to build your own data centre, location can't be underestimated from a power point of view. Whilst it's very trendy to be in London, the truth is that quite a few facilities have limited power density, sometimes to as little as 4 Amps per rack. That won't meet the requirements for a lot of mid/low-end gear, let alone anything with the words ‘high' and ‘performance' in its description. And then there's power resilience and security to think of. That's why more data centres are taking an outside-of-M25 location: as much power as you can munch and with modern connectivity offering sub-2ms latency, you're as good as there. Brain Power With energy prices inevitably rising, the demand from cost-conscious customers is to move toward pay-as-you-go power metering. Our own version is called FlexPower™ and has already been covered in a previous sales-pitch-free blog. Intelligent PDUs are a part of this smart process and have the advantage of enabling a few clever things, like remote power cycling and visibility of real-time power consumption. The Power Factor That isn't just a snappy sub-heading; it's a real thing. I'll skip the jargon about dimensionless numbers and reactive/resistive loads and just tell you that it's a number, usually near 1, that demonstrates the difference between the apparent and the actual power available2. Technically it's kVA versus kW, and all data centre operators worth their salt will have a figure approaching or above 0.8-0.9. 
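To see why the kVA versus kW distinction matters when capacity planning, here is a back-of-the-envelope sketch in Python. The supply voltage, per-rack current limit and power factor are illustrative assumptions rather than figures from any particular facility.

```python
# Rough capacity-planning arithmetic: apparent power (kVA) vs real power (kW).
# All inputs below are illustrative assumptions.

VOLTAGE = 230.0           # assumed single-phase supply voltage, volts
RACK_CURRENT_LIMIT = 4.0  # amps per rack, as in the low-density example above
POWER_FACTOR = 0.9        # assumed ratio of real power (kW) to apparent power (kVA)

apparent_kva = VOLTAGE * RACK_CURRENT_LIMIT / 1000.0  # kVA available to the rack
real_kw = apparent_kva * POWER_FACTOR                  # kW the IT load can actually draw

print(f"Apparent power per rack: {apparent_kva:.2f} kVA")
print(f"Usable real power per rack: {real_kw:.2f} kW")
# With these assumptions a 4 A rack offers roughly 0.92 kVA, or about 0.83 kW of
# real power. Planning against the kVA figure alone would overstate capacity.
```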
It's not something that often needs to be considered, but it has the potential to trip up data centre managers when capacity planning, especially on larger scales, as kVA is not the same as kW. Muddled Metrics Power Usage Effectiveness (PUE) is the most commonly used data centre power metric and represents how efficiently your facility is running. The equation is simple: Total Data Centre Energy divided by the IT Equipment Energy. In other words, how much power is getting to the IT equipment compared to how much you're using overall. That ‘overall' includes things like lighting, running the cooling, UPS system losses, cabling losses, etc. So a PUE of 2.0 means that for every 1kW powering some equipment, another 1kW is used to run the data centre. A few years ago a PUE of 2.0+ was common, though these days a modern facility should be running with PUEs lower than 1.5. There's a lot of variability in the industry: though Google often cite incredibly low PUEs, the 2014 industry average is still 1.7. There are also increasing calls to replace PUE with a new metric, one that takes into account what the equipment is actually doing. A low PUE is all very well and good, but what if the server is sitting idle for 70% of its time? A few have been proposed, but the most promising-looking is FVER, though time will tell if it catches on. 1 I was lucky enough to have a tour of the Culham Centre for Fusion Energy recently and cannot rate it highly enough. Our friendly, funny and informative geek guide explained that the JET Fusion Reactor makes use of two 700-tonne flywheels to pump in up to 300 megawatts of power. 2 Strictly, the PF value can be between -1 and 1, with negative numbers occurring when power is fed back into the system. For most data centres this isn't very likely. For a more sciency look at the power factor, there are loads of good sources online. ### Embracing digital technology to drive growth From streamlining costs to accelerating business growth, the cloud is transforming the way small businesses operate. In the first of a three-part series exploring how small business owners can unlock the full potential of the cloud, Bree Freeman examines how the cloud has played a major role in redefining the workplace environment. Patterns of working today are so different from how our parents worked; we live in a largely work-anywhere world, thanks to the cloud and mobile tech advancements. These technologies, affordable software platforms, and increasingly seamless collaboration over the Internet are having an enormous impact on the freelance economy as well as the small business owner. I recently found myself having my brains picked by a close friend of mine who, being sick of working for other people, had made the decision to walk away from a very respectable regular pay check and go it alone… to enter the unpredictable world of the freelancer. Dun, dun, dun! To some, the move away from financial security simply beggars belief, but to others, myself included, the move means greater flexibility, more control, and being able to sit in your PJs whilst working on your latest blog!!! And it would seem that I am not alone, as the latest employment figures show far more people are self-employed in the UK than ever before – 4.59 million to be precise. So what is fuelling this work-for-yourself migration? Well, I don’t know why the rest of my fellow self-employees have gone it alone, but without a doubt the cloud is powering the migration. 
But it is not just the self-employed that are reaping the benefits; for SMBs, the cloud is far more than just the latest technology trend. It has evolved to become a tool that is fundamentally reshaping how they run their businesses and the results they are able to achieve. With new UK business numbers swelling by more than half a million in 2013, I don't know if it's fact but I can't help thinking that cloud technology is playing a pivotal role in this entrepreneurial explosion. A recent report by IDC says that SMB cloud spending will grow by nearly 20% over the next five years. "We're predicting record worldwide SMB IT spending for 2014 – US$560bn – with exceptional gains in key regions and across key technologies," says Raymond Boggs, vice president of SMB Research at IDC. Another recently published report by Google and Deloitte – Small business, big technology: How the cloud enables rapid growth in SMBs – paints a very uplifting picture. According to their findings, small and medium businesses are using cloud technology to overcome growth challenges and to scale and grow faster. (I knew it!!) The research explores the operating practices and strategies of businesses in the US and Europe with up to 750 employees, and suggests that SMBs should move tools and applications to the cloud in order to free up time, capital and resources, and to establish a platform for sustainable rapid growth. Interestingly, some 85% of SMBs reported that cloud technology enables them to scale and grow faster, while 66% said that the cloud allows them to outperform their competitors. It's this ability to leapfrog other companies and scale rapidly that makes cloud an essential part of any small business's strategy. Well, I believe it does. And it's not just start-ups that use the cloud as a tool for fast growth: 79% of relatively mature companies – those older than 5 years who are growing at less than 10% per year – believe cloud technology enables them to access new markets and revenue streams. This will become even more important as organisations adopt more digital tools in order to expand into new regions or product areas. There is no doubt that demand is growing for reliable, scalable and flexible IT systems that can support the rapidly changing needs of today’s small and growing businesses. The great news is that there are a number of technology platforms and services that take care of SMBs' day-to-day needs, such as web hosting and hosted email, which are early examples of cloud computing that were fully embraced by many SMBs prior to the popularisation of the term 'cloud computing'. The secret to unlocking the full potential of the cloud comes through understanding. As always, knowledge is power; knowing what cloud technology can do for you and your business is key to deciding how and where it best meets the needs of your organisation. In part 2 of this series I will look at how you can do this. ### Start your subscription business Why you need to take action now The subscription business model is going from strength to strength as more and more companies evolve from selling products to becoming service providers. 
It is a process that's transforming not only established businesses that are looking to rapidly launch new services or solutions, but also new start-ups, eager to develop an agile and flexible customer engagement strategy. Technology giants, whose brands have become synonymous with innovation, like Spotify, Netflix and Amazon, all understand the value of the subscription business model and are making active use of it to launch exciting new service offerings. In recent weeks, we have even seen Amazon start testing a new ebook rental service called Kindle Unlimited, which offers subscribers unlimited access to more than 600,000 titles on any device for less than ten dollars a month. In tandem with this, we are seeing a new breed of niche e-commerce start-ups come to the fore by introducing subscription billing models for traditional repeat purchases, including everything from razor blades to coffee beans and quality wine. But moving to a subscription-based model can also bring challenges. As the move to subscriptions gathers momentum, markets inevitably become more competitive. If you are already offering this approach, rivals may come in, replicate your business model and undercut you on price. Or if you are a start-up just beginning to roll out new services, it may be difficult to stand out from the crowd. Ultimately, whatever your size, status and the maturity of your business, you'll want to find a way of differentiating yourself; of quickly and easily adding more strings to your bow without breaking the bank in delivering them. So how can you get one step ahead of the competition before the opportunity passes you by? How can you ensure you get your fair share of the lucrative subscriptions environment before you miss your chance? Here we provide some top tips: Make it Easy for Customers to Pay. Before you launch a new business or roll out a new service, decide on a payment model that is quick and easy for customers to use and gives them the choice and flexibility they need. The precise mix will depend on your business focus and customer mix, of course. But typically you'll need to decide whether you want to enable them to pay by credit card, direct debit, PayPal or other methods such as bank transfer. And you'll need to put in place the right systems to enable this. Put Tailored Service Packages in Place. Customers want flexibility, not just attractive prices. Businesses will almost always look for a service to be tailored to their needs in some way. So knowing your customer well and having a flexible way of packaging and pricing your products and services will be key to the long-term success of your subscription model. Put Recurring Revenue at the Heart of your Business Model. By going down the subscriptions route, you are creating a billing relationship that makes it easy for your customers to buy additional services from you, and less likely that they will buy from someone else. So, make use of the regular contact point you have to create a stronger relationship with your customers, win repeat business and up-sell more. Remember - Flexible Billing is Key. As we have already highlighted, to make a subscription-based model a success you need flexible pricing options. 
The problem is that traditional on-premise billing systems are typically too expensive to implement or not agile enough to handle this requirement in a timely manner. Cloud billing can be key here, reducing the time taken and cost of setting up new offerings and enabling businesses to rapidly turn creative ideas into monetised services. Innovation is an Urgent Requirement. As competition hots up in the world of subscriptions, you need to be continuously innovating to keep yourself ahead of the pack. It's an urgent need. With a cloud billing implementation you get new features and enhancements automatically as part of regular software updates, driving greater agility and the opportunity to continue innovating around new pricing and billing capabilities to stay one step in front of your rivals. In the world of business, long-term success is about finding a winning formula quickly – and that may well mean fast iteration, failing first time and then quickly trying again before you succeed. Remember, the ongoing migration to subscriptions is now well under way. If you want to take part in it and be successful you need to take action now. And implementing a cloud billing solution will be a critical step in embracing subscription services and taking full advantage of all the many benefits the approach can deliver. ### The Cloud as a Teenager Earlier this year, at Compare the Cloud’s inaugural Cloud Rainmakers event, Steve Pearce of Arrow said of the cloud, "the teenager is growing up." While this comparison is an apt reflection of maturity, there is more to it than this. Representing the cloud marketplace as a teenager also helps explain the growth in opportunity – for a number of differing reasons. The growth in the cloud mirrors a cultural shift in teenage spending, one that has come to define modern retailing and anticipate our high-tech economy. If you go back 30 years, the teenage market was just beginning to show some of its financial potential; first in video games and entertainment, then outward. Kids who grew up in the 1970s and 1980s, the so-called Generation X, had more to spend than their parents did at the same age. Next up we had the Millennials who continued the trend. In 2001 the combined young market had an aggregate buying power of $1 trillion. Even with the underperformance of the global markets in the wake of casino banking, this buying power has continued to develop and remain significant. Businesses from McDonald's to Starbucks, even Apple, would benefit from this new demographic spending power; but to realise the gains these businesses had to refine their approach. Where retailers and household brands typically sold to a mature, adult market, jockeying for share at the expense of competitors, now they had to create new categories, market on niche appeal and accept complementary loyalty. Teen and adult relationships are often marked by balanced inconsistency – "frenemies" and competitive cooperation – and so too is the technology market. Increasing revenues with that rising demographic demanded that businesses understand the conditions of sale. Teens, as they often do, renegotiated the terms. Positioning, fit and purpose wouldn’t sell products on a purely either/or basis. This departure resonates especially with those born-in-cloud businesses. The cloud vista invariably includes Amazon, Google, IBM and Microsoft. 
For the burgeoning teenage market 30 years ago we probably would have included Gap, Nike and Nintendo. Yet beneath these mighty giants there is an amazing array of start-ups and exciting businesses. We have seen the rise of Subway, Hollister and Primark. In the cloud we have Dropbox, Spotify and WhatsApp as recent examples. So like the real-world marketplace we have giants and upstarts. New services and cloud software that bridge, sync, monitor and expand broader ecosystems are creating valuable businesses. The tech giants have since learned... well, accepted... that customers, like the teens who took us there in the first place, would buy outside. AWS, a stock Microsoft Surface 3 or Apple iPad Air doesn’t do the job on its own. Getting the most from cloud services and benefiting from data centre provision often relies on third-party apps. Like the Gap ad in which Madonna and Missy Elliott asked, "where'd you get them jeans?", the Gap understood that their products were part of a larger appeal. The jeans went with a lifestyle in the same manner data centre provision supports business systems. Each is a foundation of something larger; and an opportunity for revenue and renewed expansion. From listening to music on a Walkman to the now antique iPod, the young market has evolved. They embraced mobile phones and poked with Facebook. The product chronicle of this demographic segment is a history of technology and digital communications. Following in these same footsteps, the cloud market has enjoyed similar advancement; moving on from individual, networked PCs to a grid of inter-connected devices with data storage spread across a lattice of data centres. The age of x86 went grey with the Baby Boomers as a faster economy took hold. According to IDC research, public cloud services were worth $47.4 billion in 2013. By 2017 the value could be as high as $107 billion. This growth is massive even at half that projection. The giants will take a large share of this spending but they will not take it all. There is room in the market for younger attitudes, younger apps and nimble services. In chasing teenage spending over the last 30 years, retailers have had to adapt their products to make them relevant. At the same time new entrants attacked the fringes with products and services in tune with the opportunity. The teenager is growing up and so is every aspect of the market that supports this growth. This maturity will prove to be as important as the teenage market was to the last century, and therein lies the true lesson for us all. ### Cloud security in 2015 After yesterday's venture into the cloud in 2015, I decided I wanted to learn more about cloud security. Many vendors mentioned to me that the market is worried about cloud security, and that security fears are the main reason driving users to take up hybrid cloud options. I again took the approach of a broad question: "what do you think will be an issue facing cloud security in 2015?", and let vendors interpret it as they liked. Though today I limited the vendors I asked to those exhibiting in the security part of the expo. "If you asked me this a month ago I would have said where the data is being stored would have been the most important issue, but after the Apple iCloud nude photo scandal it's more about how they're securing that data." - Ryan Farmer, Marketing Manager @ Acumin Consulting Personally I've never considered the physical location of my cloud data an issue. 
Though this is probably due to the fact that until a short time ago I was blissfully unaware that my Facebook page actually existed in a physical location somewhere - apparently possibly in Ireland. Now that I am aware of my data's physical manifestation, it does make me uneasy to think that it might be in a laxly secured environment. As my cloud knowledge grows, and with events such as the "cloudgate" scandal (as it was dubbed) fresh in my mind, my security concerns are rising. As the world becomes progressively more digital, the security of data is going to become more and more important. "I think cloud security is a bit iffy, because I don't think the cloud is very secure at all. A lot of customers prefer to have the appliance in house because they have more control. I think as cloud progresses, security will improve." - James Kirpichnikov, Sales Manager @ Watchguard As I wandered through the security area of the expo I was surprised at how many vendors honestly told me that they didn't think the cloud is secure. I was expecting them to all tell me that their product made the cloud as secure as it ever could be and that my data would definitely be safe with them as my security provider. James from Watchguard told me keeping a private on-site data centre is preferable for a lot of businesses because of the sense of control they have over it. "The most important issue for cloud security in the future is security flexibility, cloud is just a way of deploying systems and applications, underneath that is generally virtualisation technology. The challenge of that is how you protect any kind of cloud. Hybrid private/public clouds are probably where most companies are going to go, so we have to look at how we can develop security technology that works on all of the platforms in a simple and effective way." - Jamie Pearce, Sales Manager @ Bitdefender The rise of the hybrid cloud seems to be a big talking point in the industry at the moment. Jamie explained to me that the security issues with hybrid systems arise from needing different security for each component of a hybrid cloud, so the challenge they are facing is creating protection which works across the private and public platforms and is easily deployed and managed. Simplified down for me, I considered his comments in the realm of personal security. Mace may work to protect my personal space, but it won't be as effective at protecting a group of 50 people. Jamie's aim is to find a security solution that can be executed effectively and simultaneously across both my personal space and the group space. "In 2015 the biggest issue facing cloud will be incident response, ultimately all these technologies fail at some point. The prevention technologies aren't perfect, you have to have a plan in place to rectify those failures when they happen." - Tim Armstrong, Sales Engineering Manager @ Co3 Systems Co3 Systems work with cloud systems after there has been a security breach, helping them to respond to the issue and resolve it quickly and effectively. I found Tim's point of view really interesting because he was the only person who talked to me about what happens after the cloud has been breached. Incident response and security resolutions probably are something that should be discussed a lot more. 
People admitting to me that the cloud isn't secure really threw me, though it does come back to what I was told yesterday - if someone builds a fence, there will always be someone else trying to go around, under or over that fence. That is just human nature. "I think the cloud is essential, it's going to drive efficiency in the enterprise and in the way States and governments work. At Encryptics the way that we approach security is that we lock down at the device level. We use a peer to peer system, while currently a lot of people maintain a trusted server which keeps all the data, and all of the keys to the data in the same place. Pre-Internet encryption is going to be important in the future, that way you are protected before you even get in the cloud! We hope to lead the charge to utilise the trusted peer to peer system to leverage the positive factors the cloud has to offer." - Mitch Scherr, CEO @ Encryptics I would like to note here that Mitch opened his answer to me by saying that he saw a cloud burst and thunderstorm in the future, possibly a bit of lightning, and definitely some rain. His approach to cloud security seemed, to me, very straightforward and logical. A focus on device security and protecting the data in advance of cloud storage made sense to me. He was kind enough to simplify his analogy even further for me. If you put your data in a locked box, then put that box and the key which opens it in a larger box, what is to stop someone from opening the bigger box and using the key stored inside to access your data? There needs to be a separation of the locked data, and the keys to the information, hence his notion of future cloud services moving to add security at a device level. So, on day two of my first IPExpo I focussed on cloud security. I learnt I should probably be more concerned about the state of my cloud - and that at least one CEO in the data security industry has a cracking sense of humour! ### All Inclusive Cloud Management Platform Scalr: All Inclusive Cloud Management Platform for IT, DevOps Engineers & the CFO The cloud computing landscape is just as busy and swiftly changing as ever. Today I'd like to freeze-frame the cloud management platform sector for you, specifically giving you a clear idea of where Scalr fits in, its value, and how the technology achieves that goal for our customers. Scalr is a cloud management platform (CMP), which Gartner defines as "products that incorporate self-service interfaces, provision system images, enable metering and billing, and provide for some degree of workload optimisation through established policies." Here at Scalr, we find that there are really two camps of CMPs that fit this description – Legacy CMPs, those that apply to the cloud a traditional IT infrastructure provisioning workflow that still requires manual approval; and Cloud Native CMPs, those that support an automated, forward-thinking DevOps-centric approach to cloud management. At Scalr, we believe the latter approach allows businesses to achieve the true promise of cloud computing: self-service provisioning that results in greater agility and increased efficiency. Scalr helps achieve these results in several important ways and we do so by supporting three key constituents: DevOps – Scalr is focused on streamlining the provisioning, configuration and orchestration process for DevOps engineers, allowing them to accelerate application development. 
Instead of provisioning individual resources, Scalr empowers them with a high-level interface and templates which can create an entire operating environment in a single click or API call. Realising the promise of the self-service cloud, Scalr's application environment automation also features backup, disaster recovery, and auto-scaling. Further enabling DevOps is Scalr's extensible platform that provides numerous hooks that integrate with software like Chef, Docker and Salt that engineers use on a daily basis. All of this benefits developers by allowing them to focus on developing, increasing their productivity and in turn reducing time-to-market. IT – Developers aren't bogged down with process because Scalr allows IT teams to retain ownership of information security and compliance in a fully automated way. This enables IT to transparently ensure applications and workloads are running in clouds that comply with established service management, governance and compliance policies. IT can effectively reduce operational risk because Scalr allows them to create and manage policies for resource placement, role-based access control, and change management. Moreover, Scalr reduces supplier risk and vendor lock-in by supporting the leading public and private clouds, enabling a true multi- and/or hybrid cloud strategy. CFO – With Scalr's new Cost Analytics features, finance is able to apply a sensible financial schema in a transparent, non-intrusive manner, giving finance visibility into cloud costs. In addition to reporting, Scalr provides showback and chargeback features that enable finance to maintain budgets. With Scalr, finance can drill down into cost data to understand what comprises an expenditure, and just as importantly, why those costs change over time. We like to say that Scalr delivers balance for development, IT and finance by delivering the precise control IT requires and the financial oversight finance demands, all while removing the barriers to efficiency that development confronts when managing cloud infrastructure. The result: businesses are able to reap cloud computing's efficiency and agility for true business gains. If you'd like to learn more about Scalr and how we help our clients, please check us out at: www.scalr.com or reach out to us at @scalr. ### Back Up In The Clouds Simon Talbot, Chief Technical Officer of leading cloud services provider Net Solutions Europe (NSE), has recently returned from a gruelling eleven months at sea, skippering the Team GREAT Britain yacht in the Clipper Round the World Yacht Race. The race gives paying amateur sailors the opportunity to hone their sailing skills, whilst taking part in legs of this esteemed event. Skippering this crew is not an easy task, and Simon was delighted to be given the opportunity to prove he was up to the challenge. He has sailed since he was a boy and has been doing so professionally for the past ten years. Taking eleven months off from any job is not easy, and Simon explains how he managed it without NSE suffering from his absence, "My business ethos has always been 'self-elimination'. I believe it is important to empower your staff to ensure that your business can run flawlessly without you. This enabled me, not only to take advantage of this coveted opportunity to skipper a yacht in this prestigious race, but also to take a more visionary, developmental role on my return." "The cloud offers incredible benefits and capabilities to businesses, but it can seem quite complex," says Simon. 
"The recent issues regarding access to data held in non UK data centres, or by non-European companies, particularly the ruling by US District Judge Loretta Preska demanding technology giant Microsoft hands over information, including customer's private emails, stored in a data centre in Ireland to the US government, has highlighted the need for companies to receive accurate and up-to-date advice on the best storage solutions appropriate to the data they hold. Understanding the complete picture and having strategies in place for any eventuality are key to business success, and I applied this philosophy to my preparations for the Clipper Race." Simon knew he had been selected for the role of GREAT Britain's skipper in plenty of time to devise strategies in both his business interests and his private life to ensure his absence did not have a negative effect on either. "As a professional sailor I have been away for a month or so annually for the last 8-10 years. To enable me to do this, I have ensured that my position within the company is such that my team can continue to operate at the same high level in my absence. Preparing my family was understandably more difficult. I made sure strategies were in place for every eventuality, and my wife had access to all the information she possibly could need to resolve any issues. My children expressed mixed emotions at my going; they were excited to know that they would be flying out to exotic places to visit Daddy, but they would obviously miss me." Although Simon's crew aboard the yacht had received training before the race, he was the only professional on board and needed to match people's abilities to allocated jobs when he set up his core team. In his position as skipper he found crew members' over- enthusiasm to be one the worst problems. "I like to give individuals ownership of what they are doing. In a dangerous occupation like sailing, it is important to find a compromise between respecting the chain of command, and pushing yourself to try new things. The crew should be running the boat, and the skipper should not be in charge of the day to day tasks. The more minutiae you involve yourself in, the less you empower your crew. Mistakes are made in the early days, but everyone learns from mistakes. It was my job as skipper to make sure the errors were managed and did not endanger safety. My job evolved as the race progressed, and the crew owned their race. It worked well, and the success was tangible. The skippers who micro-managed their crews did not fare as well overall." Sailing 40,000 miles and competing in 16 gruelling races, Simon, and his crew took second place overall in the competition. Securing a victory in the final race, Team GREAT Britain became the most decorated boat in the fleet, awarded to the team with the most race wins. "I am so proud of the team's achievement," says Simon" I found the whole experience fantastically rewarding on so many levels." At first glance, taking a long sabbatical to tick being a Round the World Yacht Race Skipper off your bucket list seems poles apart from running a globally successful cloud service company, but when you analyse Simon Talbot's approach to both, it is clearly not. ### The cloud in 2015 As 2014 nears its end, we are beginning to look towards 2015, and what it has in store for the cloud. IP Expo Europe is my first foray into the IT world, it is big, daunting, and has made me realise how vastly people's ideas on the cloud are truly spread. 
To try to wrap my head around the world of the cloud I decided to go talk to some of the vendors at today's expo. I had one aim: to ask them the following question: "Where do you think cloud is going in 2015?" I took a random approach to choosing my targets: I approached stalls that had someone who was free to chat to me, and only included them in the following selection if they gave me a legitimate response. I have to admit, I was hoping someone would tell me they saw rain in the future of the cloud - but alas, no one was quite that witty. "In the future I see the cloud becoming a hybrid solution for companies. I think companies will be happy to move some services to the cloud, but there will continue to be a need for on-site servers." - Janet O'Sullivan, Channel Market Manager @ Storage Craft Admitting that she was quite new to cloud herself, Janet justified her view by saying she didn't believe people would be willing to let all their data go into the unknown. This makes sense to me; having everything in the cloud is quite a daunting thought. "From a sensible point of view people are going to want their data at their fingertips, the cloud is going to have to become more accessible. But I think in reality the cloud is going to implode. Once the cloud has a solid definition and is established for what it means to people it will no longer be desirable as a secure and accessible solution." - Mark James, Security Specialist @ ESET I like Mark. He gave me a point of view no one else was brave enough to express. Cloud is such a broad term; when the general population begins to understand exactly what it is in finite terms, I agree that people may flee from it. People tend to stick to what they know; we are creatures of habit. If all of a sudden someone discovers the cloud is not what they thought it was, I think it's highly probable they will be turned off it. "I think there are parts of the cloud business that are already in perfect competition, those markets won't change much, I think what will change is that storage and syncing are going to be fairly commoditised in 2015. We will see an expansion in marketplaces for hybrid cloud and mobile cloud services, especially in mobile editing, collaboration and creation services. There will be a rise of specific programming and a specifying of applications. The cloud needs to become more vertical and get smarter about what works to support workflow in business." - Ahmet Tuncay, CEO @ Soonr Ahmet showed me some amazing software designed specifically for medical use with CT scans and X-rays; his argument for the cloud becoming more specified really resonated with me. He explained that currently a lot of programs are broadly available and used in multiple industries, despite possibly not being the most effective for the job at hand. For example, construction companies may be using the same software as medical institutions. By tailoring the software to the task at hand, productivity can be increased. "What will change the cloud in 2015 is faster networks in our homes and on our phones, but security will continue to be an ongoing issue. If someone builds a fence, there is always someone else trying to figure out a way around it." - Chitra Dunn, Managing Director @ Agileise Personally this excites me; I am of the generation "now". Any improvements that will supply me with a faster connection are fine by me! 
Security issues tend not to be at the forefront of my mind; I admit I should probably be more concerned, especially given how many people mentioned security as an ongoing issue. "We will see more of the same of 2014, no radical changes. Cloud is not going to change the world in 2015." - Paul Holt, Senior Vice President @ Verisign Paul made me laugh; he was very honest in his opinion of the cloud: it has existed since before we gave it a buzzword, and it will exist once that buzzword dies off. "It's all going to be about security in 2015. The PCI V3 (Payment Card Industry) is going to be massive. I also think the value of cloud services is going to increase." - Henry Merrett, Senior Network Engineer @ ServerChoice Another acronym for me to remember! I have to admit I did interrupt Henry and get him to break it down for me. The tech world is big on acronyms; I'm struggling to remember them all! "The cloud for us has been one of those things that people consider security around a lot more. In 2015 I see clouds becoming a lot more localised because people are fearful of where their data is." - Jon-Mark Wilkinson, Channel Sales Manager @ Watch Guard I promise Jon-Mark is no relation to me. He cited the recent celebrity nude pictures scandal; media coverage has definitely made people more fearful of their personal data being leaked than they ever were before. "The cloud this time next year will be just as confused as it is now. We as an industry need to start simplifying the market for customers. In terms of cloud adoption, it's about culture, customers may look the same on paper but culturally they operate on different levels. The younger generation of IT coming through will push for developments on faster, more effective cloud services." - Chris Reynolds, Partner Manager @ Pulsant Chris talked to me at length about how what people think cloud is often doesn't correlate to what actually exists. He explained to me that what one company refers to as cloud can be completely different to another. "More hybrid. More local." - Joe Baguley, CTO @ VMware Europe Concise and to the point. Joe summed up what a lot of vendors told me: they saw a move to localised clouds and an increase in hybrid cloud services. From my jaunt around the expo my answer would be the following: the cloud in 2015 is likely to become a more visible part of everyday life; end-users will have more access to cloud services and be more aware of the services they are using, as those services are tailored to their needs. ### VoIP in the real world So, here I am travelling again to another foreign country for the fourth time in a month, and I feel the need to comment on a service that has helped me beyond belief on my travels. About one month ago Compare the Cloud subscribed to a VoIP service that allows all of our staff to be connected and visible. All of the Compare the Cloud personnel work remotely and are geographically separated, so communication can sometimes be difficult. However, now that we all have a SIP client on our mobile phones, life is much easier. This, along with the option of static phones and soft phones installed on our Apple Macs, allows us one dialling directory. In addition, we can also use our own dial-in video and audio conferences. 
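For anyone wondering what "a SIP client on our mobile phones" boils down to in practice, the sketch below shows the kind of account settings and shared dialling directory involved. Every value is a placeholder invented for illustration; it is not the configuration of RingCentral or any other provider.

```python
# Illustrative SIP client setup: one shared directory, the same account
# fields on every device (mobile app, desk phone, softphone on a Mac).
# All values below are placeholders, not real provider settings.

sip_account = {
    "display_name": "Compare the Cloud - Example User",
    "username": "2001",                       # the user's extension
    "auth_password": "change-me",             # hypothetical credential
    "domain": "sip.example-voip-provider.com",
    "transport": "TLS",                       # encrypt signalling where the provider supports it
    "register": True,                         # register so inbound calls reach this device
}

# A single dialling directory shared by every member of staff.
directory = {
    "Example colleague": "2002",
    "Conference bridge": "8000",
}

def sip_uri(name: str) -> str:
    """Build the SIP URI a client would dial for an entry in the shared directory."""
    extension = directory[name]
    return f"sip:{extension}@{sip_account['domain']}"

print(sip_uri("Example colleague"))   # sip:2002@sip.example-voip-provider.com
print(sip_uri("Conference bridge"))   # sip:8000@sip.example-voip-provider.com
```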
What I really wanted to put forward today is a true life example of how this service has saved me and the company time, money and lots of stress whilst travelling. Please bear with me as this isn't an advert for the company providing the service (although it is fantastic), but a real life usage of a mobile VoIP offering and how I use it. Day 1 My story begins with jumping on a flight to Moscow from Gatwick on a miserable Sunday afternoon. It's raining and I'm running late for the flight (as usual). I settle into my seat on the plane after forcing my hand luggage into the smallest space imaginable and close my eyes for 5 minutes. The next thing I know I am being nudged in the side by the person next to me; I had dozed off, and my head was on their shoulder. Don't worry, it's not From Russia with Love, but we have landed in Russia! After the usual cattle stampede to passport control, collecting of bags and jumping into a taxi, I turn the mobile on, turn roaming off and sit down for the long journey to Moscow. Traffic was totally horrendous, and finally I arrive at the hotel 2 hours later and check in. There is something really important that I have to call in about, and I haven't had the chance to buy a local SIM card for my phone, or had my phone unlocked - not good. Historically, this would have been a catastrophe; thankfully, I had my VoIP provider's app on my phone. The hotel wireless was average, and as any frequent travellers out there reading this will concur, it can be hit and miss with connectivity, but I am connected anyway. I ran the application on my mobile that links to my contacts, and I dialled out. The call quality was perfect and crystal clear, in fact better than my normal mobile calls. I stayed on the call for 40 minutes and thankfully did not get robbed blind by my mobile provider, who would have charged me £1.50 a minute. This call would have cost me £60 under normal circumstances and been of a poor quality. Crisis over, I went to the bar for a drink or ten and got to thinking that we pay a third of the amount per month for our VoIP service that my mobile provider would have charged for that call! So in one call the service has paid for itself for the month three times over. Early start in the morning, so it's off to bed for me. Day 2/3/4 I leave the hotel and walk to the client's office. Great! Wireless not available as I leave the boundaries of the hotel, so I grab a coffee in Starbucks (yes, they do have Starbucks in Moscow... lots of them) and jump on their free WiFi. Fantastic. I then fire up the application on my mobile again and get the early morning calls finished with a double shot latte in the other hand. What more could I ask for? I've saved time, money and a lot of stress, and got a good cup of coffee into the bargain! This was my routine for the next few days, the only difference being the differing WiFi hotspots across Moscow. I even managed to make calls on the metro, which in Moscow is WiFi enabled! I think Transport for London could also learn a thing or two! Day 5 Towards the end of my visit to Moscow, I had one final monster task where I knew I was going to have to put my VoIP service to the test. I needed to arrange a conference call with a UK client. Now this has been a bit cumbersome in the past, and to be honest I was dreading it a bit. But to my utter astonishment it was one of my most stress-free tasks whilst in Moscow! 
Through an application which I had installed on my Mac, I managed to set up the conference, invite the participants and give them dial-in access within minutes. Using the hotel's WiFi I held the call in the hotel lobby and even managed to use screen sharing. Without meaning to sound patronising, it is difficult to tell you how much VoIP helped me on this particular trip. I was staying in a country where English is not widely spoken, and my limited command of pidgin Russian is barely intelligible to be honest, so not having to find somewhere that might sell me a local mobile SIM card that may or may not work, and/or an internationally enabled SIM which is totally reliant on my mobile being unlocked, was a huge saving grace. The time has come for mobile operators to start realising that there are alternatives to their high-priced, unreliable services; alternatives that are becoming more and more scalable and offer a better level of connectivity both for voice communications and for the internet. VoIP was probably always destined to become bigger and better, but my experience in Moscow has shown me that it truly is a business enabler. I know there are those out there who will say that connectivity over 3/4G can be a bit hit and miss, and I agree, but this is down to the mobile service in the place you are situated at the time; bear in mind I only used WiFi. However, if this option hadn't been open to me, I would have had to present our Finance Director with a rather expensive mobile bill on my expenses sheet, along with a big decrease in my own productivity. The irony of all of this is that the VoIP company concerned, RingCentral, originally contacted us at Compare the Cloud with a view to employing our services and becoming one of our customers. We looked at what they do and were so blown away with their offering and delivery that we decided to turn the tables and become one of their customers. As a company we prefer not to be seen to endorse any one product, company or service over another, but on this occasion the outcome is important to me personally, and I would have been very limited had I not had this available to me. I would like to thank RingCentral for providing me with a solution that enables me to get the job done no matter where I am in the world. ### Riding the Digital Enterprise Wave Enterprise 2.0 Summit London There are plenty of conferences about enterprise social networks, using social media in business and social business. Most of them focus on external use of social media for communications. Some of them talk social collaboration and employee engagement. Our conference is the one that goes further. We take digital and social technology to the heart of the business process and focus on real returns - generating more revenue and/or reducing real costs. We talk less of the theory and more about putting it into practice - where do you start, what works, what doesn't, and what's next? Book now - November 26 for the Enterprise 2.0 Summit London. We are talking Digital Transformation - what is that? The business landscape is changing. It's driven by significant changes in infrastructure, economics, and the 3 big technology shifts (Cloud, Social and Mobile) that are happening simultaneously. On top of that there are emerging technologies like the Internet of Things, Big Data & Analytics, Artificial Intelligence and 3D Printing that have the potential for an even more profound effect on the world economy and the way business works. 
Today's marketplace has more demanding customers, faster changing technology and more competition. Whatever business you are in, your business model is under threat from a smarter, nimbler competitor who will be using technology to skip past you into a new field of play. By using the term digital we are highlighting that you need to think further than just adding social and mobile technologies. You have to harness your existing technology and make it work better. You have to think of using technology to help you go to market faster with new offerings and to reach your customers in new ways. You have to re-evaluate your business and your value proposition and stop thinking business as usual. You have to start thinking digitally for your business and an entirely new generation of technologies. Smart companies can evolve a digital strategy. Business as usual will get left behind. If you are behind the curve like a Kodak or Blockbuster or even a Phones 4U, you have to think in terms of a more significant digital transformation. Sounds quite interesting, but why bother? If this still sounds nice to have as an add-on rather than vital, the most important thing is that it works! Take a look at these survey results from Capgemini Consulting and MIT Sloan Management from their report "How digital leaders outperform their peers in every industry". They split the organisations into 4 categories, with the most digitally savvy being called the "Digirati". Companies in that most advanced category generate 9% more revenue, create 26% more profit and have 12% higher market valuation than the rest. Going Digital makes a direct contribution to the bottom line. How do you do that? Come to the Enterprise 2.0 Summit London. Our conference on November 26 at the British Academy in London directly addresses these issues. We've assembled case studies and speakers from companies who have embraced digital technology like Barclays, CEMEX, Deutsche Bank, and Sanofi Pasteur. We have well known industry experts like Euan Semple, Lee Bryant and our very own Agile Elephant team. There will be keynote sessions, panel discussions, 3 workshop options and ample time for networking with other delegates and the experts. The emphasis of the content is practical rather than theoretical. We'll make the business case, show you where the returns and efficiencies can be made, then give you a roadmap for making it work in your company. The main one-day conference is on November 26th, with an optional and more detailed pre-conference workshop day on the 25th. There will also be an unconference day on the 27th, aimed at consultants and practitioners who want to share ideas, best practice and discuss how this topic will develop in the future. The event is co-produced by Kongress Media, who run similar events in France, Germany and Switzerland, and Agile Elephant, the UK digital transformation consultancy. All of the details are on the conference website. You can book a place here. Special offer for Compare the Cloud readers We are delighted to offer Compare the Cloud readers a special discount. Email us and we will send you a code to use when you sign up. We believe this will be the most comprehensive event in the UK this year for riding the digital enterprise wave rather than getting swamped by it. We hope to see you there. ### Need for speed Data centres are at the heart of modern cloud software technology, serving a critical role in the expanding capabilities for businesses and enterprises alike. 
But as demand for the cloud grows, so does the need for advances in data centre interconnection (DCI) technologies. The unstoppable rise of the cloud is fuelling the global data-centre market; the sheer number and breadth of data centres has exploded over the past few years. We have seen the birth of the mega hyper-scale data centre as well as its smaller counterpart – the micro data centre. But as cloud adoption continues to soar, large network operators are reporting a magnification effect on incoming traffic; this magnification in turn is fuelling the need for better interconnection between a provider's data centre and their customer. One of the challenges for many customers in their cloud journey is that last mile of connectivity between themselves and the cloud. This last mile has always been the drawback for better Internet usage, and despite claims of high-speed broadband and fibre optic leased lines, there is always a need for greater speed; and with greater speed comes the need for greater technology. Say hello to data centre interconnection (DCI). As solid-state drives and multicore processors have become commonplace, interconnection itself has grown in importance, and the technology involved is evolving as leaner, higher-density and more efficient DCIs become essential to meet demand. One such company that is leading the pack when it comes to DCI technology development is Infinera, which recently introduced the Infinera Cloud Xpress family of metro optical platforms, designed for network operators delivering cloud-based services to consumers and businesses across the globe. "Cloud Xpress improves the overall cloud experience by making DCI faster than ever," Infinera's Mark Showalter told me. "By increasing the speed that data centres across the metro, and around the world, share information, end users get a faster, richer multimedia experience on their mobile device or at their desktop. For example, the Cloud Xpress enables 8 terabits per second of transport capacity on a single optical fibre. To put that in perspective, that is enough capacity to simultaneously stream more than 1 million HD videos, on a single optical fibre." I asked Mark what his opinion was about the evolution of DCI and he told me that Infinera has seen DCI evolve over the past few years along two vectors. "The first vector of DCI evolution is an explosion in the amount of DCI traffic," commented Mark. "As an example, Cloud operator Facebook publicly reports a single user request generates almost 1000 times the amount of DCI traffic to serve that request. This is a starkly different traffic model than we see in traditional service provider networks, where the amount of DCI traffic spawned by a user request is closer to zero. This explosion in DCI traffic has led directly to the second vector of DCI evolution." "The second vector of DCI evolution is the aggressive deployment of 100G optical transport between data centres. For the last decade 10G optical transport has been widely deployed for DCI. In the last few years we have seen the beginning of a once in a decade transition to 100G optical transport DCI. 
This transition to 100G started first in long-haul DCI in 2012, and is forecast to start ramping up in the metro in 2015. The Cloud Xpress is designed specifically for the high-growth metro 100G DCI market, providing a rack-and-stack solution that delivers unmatched transport capacity in 2 rack units using 50% less power than the current market leader." Let's not forget Cloud Xpress' green credentials. Designed from day one with green in mind, Cloud Xpress includes Infinera's photonic integrated circuits (PICs); these PICs provide high transport capacity at very low power, resulting in consumption that is 50% lower than the current market leader in the metro cloud. By dramatically lowering the power requirements you also lower the cooling requirements for optical transport in the data centre. So what's next for DCI development? "We're seeing the deployment of warehouse scale data centres in rural areas and the deployment of distributed data centres in metro areas, both are impacting DCI. The warehouse scale applications are taking advantage of cheap real estate to build massive data centres, and that is driving the need for high-speed long-haul DCI. "In the metro, cloud operators are deploying multiple data centres to keep response times low by keeping the data close to their customers. Also in some cities there just isn't enough real estate to build one big data centre, so instead cloud operators distribute multiple data centres across the metro. This metro cloud is where we see the next wave of growth for 100G DCI." ### Episode 4: Mis-selling Cloud, Bargain Hunters, Analyst Research and Thankless IT Support http://traffic.libsyn.com/comparethecloud/The_Issues_Episode4.mp3 Fed up with being yelled at? Spending more money on coffee than IT? In the fourth episode we discuss the Mis-selling of Cloud, rant at Bargain Hunters, ask if Analysts do their Research and send out love to Thankless IT Support people. Listen now via your favourite service: ### Using the cloud to get more customers When it comes to cloud computing, most business leaders will look to the promised benefits of a more productive workforce as a primary reason for adopting the technology. It's a trend that's impossible to deny as more and more businesses make the transition each year. Research shows that while spending on public cloud services reached just under $50 billion in 2013, expectations are for that spending to rise to $107 billion by the year 2017. In other words, companies see the future of their operations in the cloud, and the reasons for this go beyond more productivity and greater efficiency. Many companies see cloud computing as the way to net more customers. Customer Relationship Management Perhaps one of the most notable ways that cloud computing can help businesses acquire more customers is by helping each business gain a greater understanding of them. This is most commonly seen in customer relationship management (CRM) platforms and software. A number of companies, like Salesforce and NetSuite, have taken customer management to the cloud, offering their expertise and services in helping businesses organise and gain insights from customer data. 
Using these cloud services provides companies with the tools to place context around customer relations, helping them understand many of the nuances surrounding those relationships and giving them a new view of what drives a customer's passions and interests. These insights not only lead to customers staying with companies for longer periods of time, they also help organisations develop new ideas and innovations which can then attract new customers, helping businesses grow and thrive.

Customer Support

Using the cloud also enables businesses to provide new dimensions of customer support. As mentioned above, CRM software gives new insights into customer behaviour and interests. This in turn can shape how businesses provide customer support. In many cases, companies are opting out of giant call centres and predetermined scripts for whenever customers need help. Instead, they are using the cloud to offer individual support and responses. This is especially important in an age where customers have higher demands and usually expect instant help whenever they need it. Cloud computing platforms also ensure customers can receive that help from multiple devices. For example, if they're experiencing a problem on their smartphone, they can still use a tablet or laptop to get the help they need. Improved customer support creates goodwill among customers and, in effect, builds a good reputation for a company, in turn leading to more customers.

Innovative Services

Years ago, companies were asking basic questions such as, "What is cloud computing?" Now the questions have evolved to ones like, "What new things can cloud computing help the company do?" As the use of the cloud has become more prevalent, organisations are finding new ways to utilise cloud computing to their advantage. Indeed, one of the benefits of the cloud is how it can help companies of all sizes develop new services to offer their customers. In a recent survey, 70% of adults said that it is important for companies to offer mobile apps. Without cloud computing, this would be extremely difficult for businesses, particularly smaller organisations with limited resources, to offer. But since cloud providers can offer software, storage, infrastructure, and platforms for other companies to take advantage of, new services like mobile apps are within the realm of possibility.

These strategies and methods only scratch the surface of what businesses can do with the cloud to attract more customers. Years from now, we may even see some of these innovations as basic, paling in comparison to what's to come. One thing is for sure: the cloud will be a major part of the vast majority of businesses in the future. As companies learn more about cloud computing and its capabilities, they'll be able to utilise it to the fullest extent to grow and bring more customers into the organisation.

### Digital Realty announces direct access to VMware vCloud Air

Global data centre and colocation provider Digital Realty has announced that it is enabling direct access to VMware vCloud Air across its data centre portfolio. With 130 locations across 11 countries, this announcement will benefit businesses looking for scalability in their VMware cloud services. Commenting on the collaboration with VMware, Matt Miszewski, Senior VP of Sales and Marketing for Digital Realty, said, "Our work with VMware is an important next step in our role as a business partner for our colocation and wholesale data centre clients.
By offering pre-provisioned and on-demand services operated by VMware, we will eliminate the often burdensome process of cloud connectivity by providing our clients with a seamless extension to the cloud from their existing data centre locations in our facilities."

"VMware Direct Connect enables clients to seamlessly expand their infrastructure," said Scott Collison, VP for VMware's vCloud Air business unit, "with the protection of being within their own [clients'] private network."

Digital Realty are making it easier for businesses to leverage the new vCloud Air Direct Connect feature, which supports high-bandwidth, dedicated and secure private line connectivity. The announcement follows the industry push to improve data services as more businesses transition to the cloud. For more information, see the original Digital Realty press release.

### Is your data locked in?

Some of you may or may not have heard of the term Software Defined Networking (SDN); regardless, for me it is cloud for the network. Many may disagree with this description, but I believe it is apt and best describes both what SDN offers and where there could be problems. Before I continue, let's consider a simple analogy as a means of explanation.

SDN is like electricity; more accurately, how it is supplied and distributed. We all buy and use electricity. We connect our appliances, TVs, the laptop I am writing on right now, through standardised wall sockets. These sockets are supplied with current through a fuse box, which is wired to the supply outside, which is in turn wired to the national grid. If we want to change energy suppliers we can do so simply by migrating our billing information to a new provider. It all sounds easy and convenient.

Now imagine those wall sockets were different. What if, everywhere you went, the sockets varied depending on your location? Imagine each area (not each country, but the hotel down the road, the office, the airport) had its own proprietary sockets with no agreed open standards. You wouldn't be able to charge your laptop simply by plugging it in. To make it worse, it might not even be a case of having the right adapter. The voltage output could vary and even the access could be regulated, with differing priorities depending on the kind of energy request.

This is what is happening every day in data centres all over the world. And why? Simply, all SDNs are not created equal. If software is defining the network, and it is, then it is possible to throttle the connection, use differing access protocols and define traffic inequitably. Proprietary hardware and patents that do not allow for open interconnection or the free movement of data are commonplace in today's software defined network. More worrying for businesses accessing data centre services, the use of these proprietary connections makes it very difficult to change providers. Where we can change energy suppliers quickly and easily (and readily compare provision and price), businesses cannot. Data centre customers should not have to rework their routing, switching, firewall protection and addressing systems just to change providers.

The effect of current SDN practice is that businesses are being locked in to their current providers. Some might say, held to ransom. If the continuity of your business services is such that you cannot move providers without incurring significant migration fees, then your data is locked in. Should a customer's choice be limited to a single proprietary vendor or should it be open and transparent? You can guess my position.
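To show what open, software-defined provisioning can look like in practice, here is a minimal sketch that creates a network and subnet through a standards-based API, in this case OpenStack Neutron. It assumes an OpenStack environment and the python-neutronclient library; the credentials, endpoint and resource names are placeholders rather than anything prescribed by a particular provider.

```python
# A minimal sketch of vendor-neutral, software-defined network provisioning,
# assuming an OpenStack cloud exposing the Neutron API and python-neutronclient.
# All credentials, URLs and names below are placeholders.
from neutronclient.v2_0 import client

neutron = client.Client(
    username='demo',                                    # placeholder credentials
    password='secret',
    tenant_name='demo-project',
    auth_url='http://keystone.example.com:5000/v2.0',   # placeholder endpoint
)

# Create a network through the same open API any OpenStack-based provider
# exposes; no proprietary controller or vendor-specific tooling required.
network = neutron.create_network(
    {'network': {'name': 'demo-net', 'admin_state_up': True}})
net_id = network['network']['id']

# Attach a subnet to the new network.
neutron.create_subnet({'subnet': {
    'network_id': net_id,
    'ip_version': 4,
    'cidr': '10.0.0.0/24',
    'name': 'demo-subnet',
}})

print('Created network', net_id)
```

The specific library matters less than the portability: because the API is an open standard, the same few calls should work against any provider that exposes it simply by changing the endpoint and credentials, which is precisely the freedom a proprietary SDN stack takes away.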
An interconnected, open system allowing both the network provider and customer easy access via a software-defined system is what the cloud marketplace needs. This is why we are seeing more MSPs and vendors going with open standards for their service provision. Industry associations such as the CEF (Carrier Ethernet Forum) are a useful resource for understanding these issues. Take a look, but don't stop there. The power to promote meaningful change lies with customer pressure. It should be easy, a fundamental right, for a business to be able to switch providers. Ask your hardware or switching vendor about their adherence to open networking standards and systems. And ask your data centre provider about choice. What choices do you have away from proprietary technology, and what lock-in comes with the network hardware deployed?

SDN is a technology we would recommend if implemented fairly. OpenContrail is another good resource; it is open source software backed by Juniper Networks, a respected name in the networking industry. If you're running an OpenStack or CloudStack environment there are options available, including Neutron, with CloudStack supporting Big Switch. Additionally, big vendors such as IBM have created partner ecosystems. Another good SDN resource can be found on the IBM website.

My view is that SDN will continue to be integrated into cloud platforms and control panels. But unless software defined networking technologies move away from being proprietary and cumbersome, advancement will stall. In the long term an open source SDN could liberate proprietary systems and allow for simpler customer migration. One day it could be as easy as switching electricity providers.

The key question to ask your data centre provider, whether you're an end-user or an MSP, is how can you leave, back out and exit your current network deployment? If the answer is prefaced with a deep intake of breath or a big sum, then the answer is no. You are locked in to your data centre. The cloud should be about freedom of choice and the ability to differentiate using both open and closed systems through seamless interconnectivity. Creating systems where information can be secured and easily accessed is a goal for any organisation looking to enjoy the benefits of the cloud. SDNs need to do more, and be more open, to help realise this goal. Feel free to agree or disagree with this article; all comments and opinions are welcome.

### Windows aims for 10

In what many will say is a continuation of Microsoft's new "Mobile First, Cloud First" strategy, the tech giant has unveiled its next Windows operating system, Windows 10. One thing is for sure: they've skipped version 9 in an effort to put some distance between this release and the underwhelming interest in previous ones.

Windows 10 is Microsoft's first adaptive OS. From Xbox to PCs, phones to tablets and possible future gadgets, the new Windows will adjust to the devices customers are using, delivering a consistent, familiar and compatible experience. Windows 10 will run across the broadest range of devices ever, from the IoT to enterprise data centres worldwide. Microsoft is also delivering a converged application platform for developers on all devices, with a unified app store. Developers will be able to write an application once and deploy it easily across multiple device types, making discovery, purchase and updating easier than ever for customers.

"Windows 10 represents the first step of a whole new generation of Windows, unlocking new experiences to give customers new ways to work, play and connect," said Terry Myerson, Executive VP of the Operating Systems group at Microsoft. "This will be our most comprehensive operating system and the best release Microsoft has ever done for our business customers, and we look forward to working together with our broader Windows community to bring Windows 10 to life in the months ahead."

Microsoft's announcement is intended to show the company's renewed commitment to businesses while improving relationships with the development community. Also introduced was the Windows Insider Program, kicking off the company's largest-ever open collaborative development effort. Program participants will receive the technical preview of Windows 10 and updated builds throughout the development cycle. For more information visit Microsoft for the full press release.

### Why HPC cloud is more than a bunch of servers

High-performance computing (HPC) is moving into the mainstream, migrating away from the halls of academia and rapidly establishing itself in the arenas of business and industry. Key to its ongoing success has been the willingness of businesses to embrace online, on-demand business models. HPC cloud services have the potential to appeal to any organisation with complex modelling and simulation requirements and fluctuating power needs. The approach typically involves the solutions provider investing in infrastructure that gives prospects the opportunity to buy access to that computing resource rather than making an upfront investment in a complex IT hardware implementation.

The benefits may be compelling, but before HPC-focused businesses take the plunge and start implementing HPC cloud services, they first need to look at their own business model. In some cases, using a cloud service will not be the best option. If the data being worked on is constantly changing and the business constantly has to move it to a remote location to work on it, the cost (in time and resource) is likely to offset any benefits of using a remote service. Where cloud models make more sense is where the business has a large dataset that is relatively static and wants to run multiple different analyses against that data. In that case, it won't need to transfer large amounts of data back and forth; it will just be sending input data and results data (which may be quite small).

Having taken the decision to go down the cloud route, however, HPC business users need to realise that success will be about much more than just putting servers in the cloud. As jobs run by HPC users tend to be resource intensive, these users will typically be looking for an HPC cloud services model that uses a high-performance interconnect, allowing for faster, more efficient processing. Unlike commercial clouds, HPC cloud services address the processing requirements of HPC customers by providing an on-demand remote compute facility with a pre-installed and configured environment where independent software vendor applications and open source codes are available.

Data processing is a key element of the HPC cloud approach. But once users have carried out this processing, how can they retrieve the results? Typically, they will need a combination of front-end tools, web browser tools and remote visualisation tools. Ease of access is also key.
We are seeing growing usage of smartphones and tablets that provide a mobile interface to the cloud. Visualisation is another driver. Large engineering companies running crash simulations no longer need to bring the data generated back into their own local systems. Instead they can run visualisations on remote machines, a quicker, easier and more cost-effective approach. HPC users in the cloud can now also make use of computational steering techniques to monitor and, where necessary, change a simulation they are working on, enabling them to protect the time and money invested in long-running jobs.

There is massive potential for HPC in the cloud. Available network bandwidth has increased and broadband has become all-pervasive, further driving uptake. These kinds of services will not be for every organisation. But today, businesses with specific types of data processing needs are increasingly attracted to them. For many organisations, the time for HPC cloud services is now.

### Cisco invests for global cloud network

Cisco have announced that more than 30 additional companies, including Deutsche Telekom, BT, NTT DATA and Equinix, have added their support to the hybrid cloud network Intercloud. The news follows the completion of Cisco's purchase of Metacloud, which deploys and operates private clouds for global organisations using an OpenStack-as-a-Service model. Cisco will invest $1 billion over the next two years linking 250 additional data centres in 50 countries. The aim is to build a worldwide network of interconnected clouds, which Cisco and its partners are rapidly developing to enable a new generation of standardised cloud applications and the proliferation of secure hybrid clouds.

"Since we announced our OpenStack-based cloud strategy six months ago, we've received tremendous industry-wide support. The strategy is gaining momentum in the open source community and providing partners with a powerful cloud platform with global reach, and Internet scale and efficiencies," said Rob Lloyd, Cisco's president of development and sales.

### Bursting out at Christmas

The time of year is rapidly approaching when we all shop online, order those last-minute goods and give the Internet and web hosting sites their annual stress-test. You could be a website owner, a hosting provider or a software-as-a-service provider offering back office integration. Regardless, the question is whether your existing systems can handle the Christmas peaks. Here are some statistics* on mobile applications you may find interesting:

- 24% of application users go to retail apps
- 52% of retail application users are female
- App users are younger: 28% of retail application users are aged 16-24
- Retail app usage peaks at lunchtime
- Social network applications reach 67% of users
- Total application usage peaks on a Friday
- Retail application usage peaks on a Tuesday

If you're selling goods or services into different channels, how would you plan peak capacity based on marketing to the retail application range of 16-24 year olds? Another scenario could be a games developer ensuring that a critical Christmas release is going to scale on demand. Moving further afield, perhaps your customer needs to move web-facing systems closer to worldwide markets such as Hong Kong to capitalise on Asian growth, or simply replicate in the US with content delivery network (CDN) capacity.
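As a rough illustration of how statistics like those above can be turned into a first-cut capacity plan, the sketch below estimates a Christmas lunchtime peak and the number of instances to provision or burst to. Every figure in it is a made-up assumption for the sake of the example; substitute your own analytics, load-test results and instance benchmarks before making any real decision.

```python
import math

# All figures below are illustrative assumptions only.
monthly_active_users = 2_000_000
retail_app_share = 0.24                  # share of app users who use retail apps
lunchtime_peak_factor = 4.0              # assumed peak-hour load vs the daily average
christmas_uplift = 2.5                   # assumed seasonal multiplier for December
requests_per_user_per_hour = 12          # assumed activity during the peak hour
requests_per_instance_per_hour = 40_000  # assumed capacity of one instance
headroom = 1.3                           # safety margin for spikes and failover

# Average hourly retail-app audience, then scale up for lunchtime and Christmas.
avg_hourly_users = monthly_active_users * retail_app_share / (30 * 24)
peak_hourly_users = avg_hourly_users * lunchtime_peak_factor * christmas_uplift
peak_requests = peak_hourly_users * requests_per_user_per_hour

# How many instances would be needed to absorb that peak, with headroom.
instances_needed = math.ceil(peak_requests * headroom / requests_per_instance_per_hour)

print(f"Estimated peak requests per hour: {peak_requests:,.0f}")
print(f"Instances to provision or burst to: {instances_needed}")
```

Crude as it is, a model like this makes the bursting question concrete: if the peak only lasts a few weeks a year, the gap between the peak figure and your steady-state capacity is exactly the workload that is a candidate for on-demand cloud capacity rather than owned hardware.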
Let's look at the rhythm of purchasing behaviours for different market sectors:

- Technology retail: weekday peak flows are on a Tuesday and Wednesday. Average access times are Monday to Friday, 1-2pm, and at weekends, Saturday and Sunday, 3-4pm.
- Fashion retail: peak flows are on a Monday or Thursday.
- Supermarkets: Monday and Tuesday evenings.

The message from the statistics above is to plan, plan and plan again: use available sources to predict capacity and load requirements. Once you have an idea, plan ahead to predict peak loads and decide, based on your unique requirements, whether there is a case for peak-load bursting. As we are all too aware, hardware investments such as storage area networks and blade servers can be costly, especially if the production cycle exists to satisfy a peak demand lasting only a quarter.

The question I tend to get asked a lot is whether the cloud offers a true return on investment and operational expenditure benefits. My answer is always "it depends on the workloads you are executing and the longevity of those workloads." There are times when physical hardware deployed within the data centre and amortised over a set period, either on a medium-term project basis or over a contracted period, is economically viable compared to a cloud workload. On the flip side, a seasonal burst or peak is a matter of weighing costs, both in terms of engineering complexity and transactional value, against physical hardware costs.

Arrow ECS have recently partnered with SoftLayer, an IBM company, to enable our partners to take advantage of the power and flexibility of this huge cloud platform. Do I view SoftLayer as a server retirement home? No, but I do see SoftLayer as a strategic partner for allowing our partners to access unprecedented resources and services on demand. These demands can be either peak or core; the choice comes down to cost and load requirements. At Arrow we view the SoftLayer journey as one of coaching and awareness: awareness in terms of technicality, and a journey defined as gently introducing our partners to the new world of cloud and its potential to reshape their offerings. I use the word partner often in this blog because I believe no cloud provider will succeed long-term outside of the development and testing communities without the engagement of an IT partner, whether that is in the form of a value-added reseller or a service provider.

As we move into Quarter 4, online demand from both mobile and smartphone users is about to increase exponentially; maybe this is the year when online purchasing breaks all records. The question is: are you, as the partner, advising and providing solutions to your myriad end-customers? If so, have you taken into account the demands on your customers' services coming into the Christmas break? Here at Arrow we are committed to providing solutions that resonate with our partners' end-users, technical assistance and, above all, best-in-class solutions across our portfolio.
Our commitment to providing ease of access to SoftLayer services includes:

- Monthly billing services with 30 days' credit and invoicing (negating the impact of credit flow that is not aligned to billing cycles)
- Technical workshops helping you to design solutions
- Migration assistance and workload assessment
- Marketing and sales support
- Sales workshops geared towards being consultative with end-users, providing solutions not hype

Whilst the debate may centre on cloud as a technology enabler alongside hybrid and API access, we wish to ensure that our partners are ready to lead the discussion from a business and scalability perspective. This scalability could cover everything from temporary storage for images through to expanding loads on an ordering system, or even disaster recovery in the event of a Christmas flood. Do you want to discuss further? Get in touch with us or contact me on Twitter @ianajeffs and I will endeavour to answer any questions you may have.

*Statistics courtesy of GFK Research

### Big Data and machine learning as a force for good

Before deciding to write this blog I thought long and hard about whether I should even publish it here on Compare the Cloud. As an organisation we promote and debate the benefits of cloud computing and the technology that supports the marketplace, but I have always believed that there is room for larger issues. Our industry often affects the way we live and so should be discussed openly.

Recently I was hospitalised with a serious condition with long-term side effects. I am not fishing for sympathy, nor is it the focus of this blog, but it had an impact and that is what I want to share with you. Lying in the hospital (perhaps having taken too much morphine) I studied intensely the environment around me. I considered how computing advances are taking too long to reach the area that needs them most: our humanity.

My personal view of the world changed during my stay in hospital, as did my personal definition of humanity. I never truly understood helplessness or suffering until I witnessed a bowel cancer patient in incredible pain. Suffering, he still bravely smiled as his family sat around him. The guilt and helplessness I felt in not being able to assist this fellow patient has stayed with me. I had never experienced that feeling before and doubt I ever will again. The hospital staff must find it difficult to cope. Good luck in your recovery, mate; you know who you are, and even though I am a West Ham football fan, when Southampton beat us at Upton Park I smiled inside at the thought of you cheering.

So where do my illness, the hospital stay, Big Data and machine learning fit together? They do, with a little explanation. Allow me to explain what Big Data and machine learning are and how they can change our world.

Traditional data and computing systems

Computers were never really smart. They just seem so. A computer, from Alan Turing's computational machine to the laptop in front of you, works on the basis that it is 'under instruction'. Data is fed into the system in a structured format and commands are executed according to that instruction. A computer never learns in the sense that we do. Its capability is restricted by the instructional process, by the data that is fed into the machine. Take away the instructions and we have a useless box decoration with a slice of silicon in the middle. Computers follow orders and never think outside the box.
Machine reasoning

Modern computers, the big amazing thinking machines like IBM's Watson, can show us something new. Yes, at the most basic level the computer is still simply fed its instructions, but a level up, the next software layer is helping computers reach a greater understanding. Computing systems today are able to review and compare unstructured data and extrapolate values. These learning machines are taking Big Data, the volume of disparate information, and predicting relationships. This combination is producing computer systems with the ability to reason, in an almost human manner. This ability has the power to advance humanity and redefine how we are able to diagnose, treat and manage patients. A thinking machine with rational insight based on Big Data comparisons, absorbing millions of patient notes, measuring clinical trials and aggregating statistical data, could integrate via sensors with the Internet of Everything (IoE).

Data sensors

Sensors are already part of our daily lives. They surround us and bind computers together. From simple vehicle diagnostics that warn us our tyres are running low or that we need to fill up, to heart-rate monitors for fitness addicts, sensors are all around. With improving connectivity, sensors can join a larger, central system to provide a constant stream of data to our big thinking machines. As anyone who has read Freakonomics might have already realised, comparing data from a wider pool of inputs can fundamentally change the understanding of a given issue. The application of sensor data in this way could allow us to diagnose, predict or even prevent ailments. The better we are measured and informed, the better we might be able to help those now suffering in hospitals.

Technology practitioner

In patient care, technology will never replace a real person. Let's get this fact out early. Technology should not be used to replace those kind smiles or reassuring words, the genuine humanity, that come from a nurse - or in my case the wonderful hospital cleaners making me cups of tea. We should not use machine learning or Big Data as a means for achieving austerity. A cost-cutting human resources tool would be wrong and an abuse of technology. Our reasoning machines, interconnected by the IoE, could augment our current medical capabilities. As a tool they could be as important as the introduction of antibiotics or non-invasive surgical techniques.

For example, Big Data and machine learning could personalise patient care. Hospital bed sensors could be used to compare data gathered from motion behaviour, heart rate and thermal imaging to identify whether a patient's movements indicate pain or potential suffering. The diagnosis could predict outcomes based on treatment and suggest the means to reduce discomfort. Medical equipment could also share its data for comparison by even bigger machines. This would aggregate common complaints and national patterns, enabling us to predict outcomes and, eventually, prevent them.

We're not talking about a utopian goal. The perfect system is never perfect. What I am suggesting is using the technology we already have available to us to improve the human condition. The same computing power that pushes terabytes of information around the globe could help us medically. As humanitarian efforts go, these Big Data medical marvels offer more to boast about than aggregating visitor statistics on a website. I'd like to see my Southampton-supporting friend never suffer in pain again.
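To make the bed-sensor idea above slightly more concrete, here is a deliberately simplified toy sketch of the kind of comparison such a system might make: measuring a patient's current readings against their own recent baseline and flagging large deviations for a nurse to review. Every reading, threshold and variable name is invented purely for illustration; this is nowhere near a clinical tool, and any real system would need rigorous validation.

```python
# Toy illustration only: flag readings that deviate strongly from a patient's
# own recent baseline. All data and thresholds are invented for the example.
from statistics import mean, stdev

def flag_possible_distress(history, current, z_threshold=2.5):
    """Return True if the current reading sits far outside the recent baseline."""
    baseline = mean(history)
    spread = stdev(history) or 1.0      # avoid division by zero on a flat baseline
    z_score = abs(current - baseline) / spread
    return z_score > z_threshold

# Simulated streams: heart rate (bpm) and movement events per minute.
heart_rate_history = [72, 74, 71, 73, 75, 72, 74, 73]
movement_history = [3, 2, 4, 3, 2, 3, 4, 3]

alerts = []
if flag_possible_distress(heart_rate_history, current=96):
    alerts.append("heart rate well above this patient's baseline")
if flag_possible_distress(movement_history, current=11):
    alerts.append("unusually agitated movement")

if alerts:
    print("Review suggested:", "; ".join(alerts))
else:
    print("Readings within this patient's normal range")
```

The point of the sketch is the principle, not the code: the machine does the tireless comparing, and a human decides what the deviation means and how to respond.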
Imagine being able to have sensors feeding data to learning machines; we might even be able to prescribe medication without the long wait for the doctor to finally make their way to the ward. In the future we might even be able to simulate drug trials and create new chemical combinations that could more effectively combat the spread of diseases such as Ebola in Africa.

The future is available today

My idea here is not all fantasy and it certainly is not science fiction. We are making great things happen in the cloud marketplace. Many of those technical leaps could easily be used in a medical context. Dr John E. Kelly, Senior Vice President and Director of IBM Research, suggests that as "genomic research progresses and information becomes more available, we aim to make the process of analysis much more practical and accessible through cloud-based, cognitive systems like Watson. With this knowledge, doctors will be able to attack cancer and other devastating diseases with treatments that are tailored to the patient's and disease's own DNA profiles. This is a major transformation that can help improve the lives of millions of patients around the world."

This is not a plug for IBM, but Dr Kelly is absolutely right. Technology can bring together the huge amounts of medical and clinical data we are creating every day. We don't call it Big Data for nothing. Connecting all the pieces of our human condition, fighting the forces that shorten our mortal time on this earth, is an endeavour not so different from when we first connected our computers to a young Internet. Billions are poured into research to make our connections faster, more efficient and more powerful (even if not always greener in our use of electricity). The pharmaceutical industry is investing billions in the pursuit of miracle drugs. These industries need to reap the rewards of good products; monetisation is what keeps the money flowing back into development. But both could equally contribute to the human effort. It is often said that pharma companies should open source their developments; maybe, in our drive to beat cancer, to cure and to heal, we should open source our technology where possible. Success often brings down the cost of the cure, or of the provision of services. I believe it is possible for businesses to be profitable and ethical. A debate on the commercial reality is probably bigger than the scope of this blog. Nevertheless, as we move toward a new world of computational and technical advancement, remember to look inside yourself. Use these amazing technologies for the greater good.

I will finish this blog post with an apt quote from Mahatma Gandhi: You must not lose faith in humanity. Humanity is an ocean; if a few drops of the ocean are dirty, the ocean does not become dirty.

Dedication

I would like to dedicate this post to the staff of The Royal Hampshire County Hospital, Winchester. Thank you for nursing me back to health and allowing me to look at my two young children through new eyes. I promise not to waste the opportunity you have given me.

### Appointment means more expertise for Node4

Node4 appointment adds expertise to product management team. Data centre and cloud provider Node4 has announced the appointment of John Williams as Product Manager. Williams brings 25 years of IT industry experience to Node4, with a strong background managing leading-edge cloud solutions for a range of customers and sectors.
Paul Bryce, Business Development Director, Node4, said: "John's expertise in product management and impressive base of technical knowledge will be crucial to Node4 as our solutions portfolio continues to evolve. His experience working within businesses, with partners and with technology makes him an ideal addition to the team as we look to ensure our products and solutions continue to meet the needs of both customers and partners."

The appointment follows Node4's recent acquisition of cloud infrastructure specialist LETN Solutions, meaning its N4Cloud platform will provide even more flexibility for customers and partners.

### Episode 3: Big Cloud Vendors, Neil Does No Research, Migrations and Twitter

[embed]http://traffic.libsyn.com/comparethecloud/The_Issues_Episode3.mp3[/embed]

Twitter not talking to you? Us neither! In our third episode we discuss Big Cloud Vendors, tell Neil that he Does No Research, ponder Cloud Migrations and go into a full rant about Twitter.

Episode Notes

The top responders on Twitter are found in this article, with only Rackspace representing Team Cloud, alongside this research on 70% of companies ignoring complaints (with a nice infographic here). Listen now via your favourite service

### A day with Arrow: Transforming the data centre

It has been said that an event is really only worthwhile if you're there. Often there is little value in retelling the day after the fact, unless it was "Transforming the Data Centre", a launch event hosted by Arrow aimed at enabling MSPs to grow their business profitably with less risk. Featuring talks from Microsoft, VMware and Mirantis, and overviews from vendors such as Lenovo, SolidFire and Arista alongside those of our host, Arrow, the event was packed with detail and insight.

Arrow had invited over 70 partners to the launch and through the course of the day it became evident that MSPs need to review their understanding of data centres. The role of the data centre is evolving as the cloud marketplace matures, especially for those in distribution – and that is a good place to start. A question I am often asked is "Does distribution add value?" My stock answer is usually, "go and find out." Most of us tend to find the answer we already wanted, aligned with our own opinions. Real education is fuelled by discovery, the kind of discovery that changes your mind.

As Steve Chambers of Microsoft UK highlighted during his presentation, "Customer demand for cloud and hybrid services is expected to reach $108 billion by 2017." Steve was quoting IDC as he explained Microsoft's own cloud OS strategy. Still, the point was certain and convincing: the change in our use of and reliance on cloud services represents the biggest shift in IT since the introduction of PCs. With this shift and the cloud's increasing maturity, the marketplace is becoming a battleground that will build fortunes and define businesses for decades.

The data centre is an integral part of cloud provision. What happens at the data centre affects the services delivered. From the experience of the end user to the fees the distributor charges, the role of the data centre is more prominent and more significant. What used to be rooms full of black boxes and cabling delivering connectivity to storage now defines the context and the ability to deliver cloud services. Scale and application, the software layer, represent the true market potential. At the event, SolidFire and VMware approached this potential from different angles.
Ian Cooper of SolidFire examined the input and output of today's storage and showed how the shift from hard drives to solid-state flash storage is enabling better performance. And on that performance, VMware's Chris De Vere, echoing the ex-CEO of Microsoft, Steve Ballmer, only without the unintended humour, said the cloud relied on "Applications, Applications, Applications." Creating a software layer that can deal with burgeoning data requests and greater services establishes a resilient platform, one that is more attractive to enterprises while fuelling more growth in the market. As David Moss of Arista explained, "Building networks designed to meet the requirements of the cloud [will] bring those architectures and capabilities to the enterprise."

It is a sentiment echoed by our good hosts of the day, Arrow. Put a little more eloquently by Denise Bryant, we need to facilitate "co-op-petition" between cloud service providers. Only by working together in the promotion of improved data centre strategies can MSPs, VARs and cloud vendors compete and positively influence the market. In doing so they are not just creating more revenue to chase but lowering the risk.

We are seeing a lot of change, growth and development in cloud. It may not be the expansive, unrealised frontier of a few years ago, but it certainly isn't without challenge, interest or reward. I had the good fortune to be with Arrow for "Transforming the Data Centre." While many of you may have missed out, my potted summary certainly gives us all something to think about. Thanks again to Arrow.

### Samsung to stop selling Chromebooks

The PC market has unquestionably been in decline and now, following Sony's earlier exit, Samsung is to stop selling laptops in Europe. Samsung said in a statement that the move was "specific to the region - and is not necessarily reflective of conditions in other markets." Nevertheless, Samsung's exit means the end not just of Samsung Windows laptops but of its Chromebooks as well. For Google this is likely to be unwelcome news. Often lauded for their build quality and desirability, Samsung Chromebooks were selling in decent volume.

With PC demand coming mainly from business and enterprise users, consumer appetite continues to weaken. Outside of the premium market, laptop sales have fallen. Tablets and smartphones, augmented by cloud services, continue to increase at the expense of the laptop market. With its Android-powered smartphones and Galaxy tablets doing well, pulling the plug on Samsung's laptop lines may be a shrewd move.

### Episode 2: Connectivity, Cloud Acronyms, Productivity and Obsolescence

[embed]http://traffic.libsyn.com/comparethecloud/The_Issues_Episode2.mp3[/embed]

Know your GaaS from your AaaS? In the second episode of our new series of podcasts we discuss Connectivity, Cloud Acronyms, Productivity and Cloud Obsolescence.

Episode Notes

It seems we weren't wrong about the connection between the Thatcher government and overall poor broadband quality within the UK. This article explains why the UK never went from copper to fibre in the '80s along with other European countries. An article from CIO about planned obsolescence of Apple products and how the company intentionally limits lifespan. Should cloud companies learn from this or avoid it? Listen now via your favourite service

### Safer than our selfies?

Apple chases the cloud's silver linings with Apple Pay.
As Stephen reported last week, possibly the most interesting Apple launch last week was that of the tech giant's new NFC-enabled mobile payments service, Apple Pay. It's left pundits wondering whether Apple's entry into the lucrative payments market could invigorate consumers' appetite for contactless payment solutions. PayPal seems to be worried; its full-page ads in US newspapers this week have certainly fuelled interest in the forthcoming battle to dominate this space.

While PayPal currently has a huge share of the online payment space (with revenue last year of $6.6bn – up 20% on 2012 – generated by moving $180bn in 26 currencies across 193 countries), its foray into contactless payment has been less sure-footed. The eBay-owned business has been talking about itself as "your wallet in the cloud" for years, but has made few inroads in terms of taking a share of high street payments. Perhaps PayPal's biggest success in the US has been as a way of topping up credit for users of the Starbucks app, which enables users to pay for their cappuccinos via QR codes and which apparently accounts for 10% of Starbucks payments in the US. Here in the UK, while PayPal is also supporting several retailers and restaurant chains with payments via their own apps, only 2,000 of the UK's 270,000+ retail stores accept payment using the PayPal app. It seems, to date, consumer interest on both sides of the Atlantic in replacing traditional card payments (whether magnetic stripe and signature there, or chip and pin here) has been relatively low.

Is that about to change now Apple has entered the fray? Apple's solution is elegant in its simplicity, utilises existing technologies and payment infrastructures and effectively addresses issues of security: Apple Pay doesn't collect payment history on the device or store account details on it; payment is made via a unique device account number rather than your card details and is authorised using a single-use code and fingerprint ID. For the retailer, proprietary hardware won't be required. Thanks to Apple's agreements with MasterCard PayPass, Visa payWave and American Express ExpressPay, retailers will be able to leverage existing NFC contactless wave readers.

Reports suggest that major US retailers are keen to work with Apple Pay – including Macy's, McDonald's, Nike, Subway, Walgreens and Bloomingdale's – but it remains to be seen how fast this change will happen. At present, fewer than 5% of US retailers have a way of accepting Apple Pay payments, with Walmart notably eschewing NFC payment systems (although perhaps there isn't too much of a natural crossover between Walmart and Apple Pay shoppers anyway; Whole Foods has signed up, it seems!).

Here in the UK, we have to wait until next year before Apple Pay becomes a reality. Which is strange, because in many ways the UK market might be more ready for it. We're no strangers to contactless payment here: Transport for London has been operating the Oyster card scheme for more than ten years and began introducing contactless payments 'proper' using contactless wave readers more than two years ago. TfL announced this summer that London's buses are now a cash-free zone. Moving all that cash into the cloud reduces risks for drivers, removes the costs of managing cash for the business and hasn't seemed to inconvenience passengers.
And now Barclays have introduced a solution to help shoppers worried about 'card clash', whereby you end up paying twice because you have more than one NFC payment tool in your wallet: the bPay wristband. You don't have to be a Barclays customer to use the wristband – it can be linked to any UK bank account or credit card and swiped at any contactless wave terminal to pay for purchases up to the value of £20 (as per the contactless card). Its launch was timed to coincide with the tube roll-out of TfL's new contactless payment system.

Could wearables be the way forward for contactless payments? Apple Pay will work with Apple Watch. Some banks are already developing Google Glass banking apps and there is talk of integration with Google Wallet (launched way back in 2011). Google Wallet has been a slow burner, it seems. Will this change if Android users see Apple Pay succeed?

The contactless card is already accepted by quite a number of UK retailers – and leveraging this existing infrastructure seems a more straightforward way of going about contactless payments than PayPal's approach. Perhaps the Apple Pay launch will be just the extra little shove the concept needs to encourage retailers to commit and, once they do and the payment mechanism becomes more widely accepted, consumers to adopt. The missing ingredient up until now has been the incentive to change – perhaps a little Apple marketing hype is all the incentive we really needed. Only time will tell.

### Red Hat targets Government cloud

Red Hat has launched Red Hat Cloud for Government in a bid to help agencies deploy government-specific services in the cloud via open source software. Security concerns, as we have seen highlighted in the media by high-profile attacks, budget challenges and efficiency targets often make it difficult to migrate government services to the cloud. Red Hat Cloud for Government is aimed at addressing these pressures. As part of its government offerings, Red Hat's consultative services will provide agencies with hands-on, risk-free assessments, while access to a variety of cloud service providers will give agencies options to better manage technology costs. Red Hat will work with government agencies to standardise and automate the infrastructure that is already in place, and then identify opportunities to simplify service delivery with Infrastructure-as-a-Service (IaaS) based on Red Hat Enterprise Linux OpenStack Platform and Platform-as-a-Service (PaaS) tools.

"The Red Hat Cloud for Government program can help agencies save time and money, become more agile, and above all, make it easier to take advantage of the powerful open source and cloud forces that are changing the industry," said Paul Smith, General Manager and Vice President, Public Sector, Red Hat. "Agencies want to know how to effectively implement new technologies like IaaS and PaaS, and use third-party cloud providers without making things too expensive or too complicated."

### IBM - Is the Elephant going to dance again?

Prologue

Following a long week of meetings, you know the kind when you feel you've seen more receptions and boardrooms than your own office, I finally caught up with our founder, Daniel Thomas. I had been thinking about social media and how few businesses, even born-in-the-cloud businesses, seem to understand how to use it effectively. Dan raised the debate, suggesting it was a deeper issue that showed how little businesses understood customers, or how the provision of service fits with demand and real-world use.

"I would blog about any vendor that is consciously making an effort to engage with customers," he said. "I would name and praise them if I could find a vendor that was meaningfully talking to the customer. Real dialogue would help develop and support the MSP and cloud provider market."

Reversing my typical response, I asked Dan, "All right, who is doing something right?" What followed was an eight-year walk through the cloud marketplace, from early growth to rapid maturity. One name came up more than most, and that was IBM. The marketplace is experiencing a lot of change and there is still, as the tech giants battle it out, a lot to play for. As competition increases it is worth taking a minute to share Dan's unique perspective.

Is the Elephant going to dance again?

In the last eight years I have watched IBM change. This transformation recalls former CEO Louis V. Gerstner's great book, "Who Says Elephants Can't Dance?: How I Turned Around IBM". The book also has one of my favourite quotes ever: "Politicians, I sack politicians." When Gerstner took over IBM in 1993, shares had slumped and Big Blue was facing certain collapse. He transformed the company into a major force. Now, following the sale of System x (the Intel server division) to Lenovo, the addition of SoftLayer and other important software acquisitions, IBM is shifting again, strategically. The shift is being fuelled by innovation, innovation for the most part in the cloud. So with these changes, can the elephant dance again? I think so.

Consider the cloud marketplace today. Three names come up time and again: Amazon with AWS, Microsoft's Azure and, of course, Google. When we think of the cloud, broadly, it is often as an infrastructure for everything we say is cloud. But is it? Cloud computing is an umbrella term that covers a range of services from software to storage, virtualisation, data aggregation and even communications. There is a lot more to the cloud and the market demonstrates this.

One of the biggest growth areas is the mid-market: services provided for 1,000 users or fewer. Yes, it sounds small, but to get hung up on size undervalues its importance. The mid-market companies are the enterprises of the future, the next global success stories. This is also a very influential group. Below the 1,000-user threshold you find early adopters, business optimisers and innovators, all using the new cloud-connected tools to improve the processes and productivity that drive business forward. While IBM as a brand is huge, as those in marketing reading this will certainly agree, they are helping businesses innovate, and are doing so on a one-to-one basis. Just tweet @simonlporter and @davidjkay – beneath the headlines you'll find a pair of IBMers with a passion for all things cloud. The cloud is a very social platform and you'll find, as I did, that good advice is often only 140 characters away.

With big names, brand association is important, and hard to ignore. Brand instils confidence and develops trust. But the service delivered is what matters. The longevity of a tech giant counts for little if the services don't fit and are too expensive. Everyone wants continuity, especially in the fast-moving world of IT, but businesses must justify the expenditure. Value for money used to imply higher prices but that assumption no longer holds true. You shouldn't be surprised to find IBM's pricing is reasonable – the cloud is a great leveller.
The mid-market may not spend big, but their ambitions are no less than those of the large enterprises the tech giants serve. Being small does not exclude thinking big. And being big doesn't mean sticking the mid-market with a 'one-size-fits-all' set of services. Business development today is about partnering small.

Looking over their shoulders, Amazon, Microsoft and Google have made important acquisitions. But as they war over storage and productivity suites, innovation is introducing competition. IBM's introduction of Watson Analytics is proof.

Bringing Big Data to businesses

Watson Analytics is a natural language-based cloud application that provides powerful, predictive and visual analytic tools for businesses. Users need only ask questions to get results in terms familiar to their business. Patterns and relationships are visualised. But Watson Analytics isn't all that Big Blue has been working on. Look at the recently announced partnership with Apple. Both companies stand to change the enterprise market in ways no one has really witnessed. Coming soon will be a new class of industry-specific enterprise solutions developed for the iPhone and iPad. Better still will be IBM cloud services optimised for iOS, including device management, security, analytics and mobile integration.

IBM has been making it easier to leverage the cloud. Working with service providers, MSPs and ISVs within traditional IT distribution channels, IBM is creating new tools and applications with greater cloud integration. These tools are helping the mid-market. As IBM refines its relationships, just as it has reinvented itself over the years, this big company is becoming more relevant to smaller businesses. My belief is that the elephant is just mid-Viennese waltz. IBM is catching its breath while getting ready to really boogie.

Originally published 18 September 2014, 10:00 GMT

### Adobe buys Aviary

Adobe acquires Aviary to fast-track Creative Cloud app development. Adobe has announced the acquisition of privately held Aviary. With millions of people using Aviary-powered photo-editing apps and thousands of developers using Aviary's Software Development Kits (SDKs) across multiple platforms, the acquisition accelerates Adobe's Creative Cloud strategy. Adobe is looking to cement its position as the leading creative tools software provider.

The addition of Aviary will help improve the platform for third-party apps through a new Creative SDK. The Adobe Creative SDK is currently under development. When ready, the software library will enable developers to tap into Adobe's creative technologies to build mobile apps and drive new connections between mobile devices and Adobe Creative Cloud desktop applications and services. The Adobe Creative SDK gives third-party developers access to Adobe APIs previously only available to internal engineering teams. Examples include: browsing files stored in Creative Cloud; extracting elements from PSD files; Adobe's "Touch Slide" software for straight-line drawing; and cloud image-editing services like Content-Aware Fill and Upright. The Adobe Creative SDK is currently being tested by developers and a beta launch is expected in the coming months.

"Aviary was founded to bring creative freedom to the world," said Avi Muchnick, Aviary's co-founder. "We're excited to join Adobe and tap into their incredible wealth of creative technology and supercharge our collective SDK offering. Together, we will help nurture the next generation of third party creative apps."
For further details on the announcement, visit Adobe's Newsroom.

### EMC considered HP merger

EMC, the data storage, information security and virtualisation provider, has been reported as having held merger talks with rivals Dell and HP. According to a story originally reported in the Wall Street Journal and subsequently confirmed by CNBC, a merger had been a likely option. EMC had considered a deal with HP but walked away over possible financial terms and shareholder approval. The deal would have been one of the largest in recent years.

EMC provides IT storage hardware solutions focused on data backup and recovery. The company is also invested in the cloud. EMC has a market value of nearly $60 billion, with a lot of that value attributed to its majority stake in leading virtualisation software provider VMware. A merger between EMC and HP would create an enterprise giant capable of competing across the IT and cloud marketplace.

### Hybrid cloud is the battleground for IBM

Hybrid cloud for IBM is about data and applications. Infrastructure provision is changing and x86 hardware vendors are under pressure from commoditisation in the data centre and from cloud providers. White box manufacturers are declared the winners in the x86 race; the value is now recognised to be in the software layer. The ubiquitous hypervisor is also facing commoditisation, and networking incumbents are eyeing SDN with caution. The discussion now is about the momentum OpenStack has achieved, now competing effectively with Amazon AWS.

The move of x86 to the cloud is a change that had been in progress for some time at IBM. With IBM a founding member of the OpenStack Foundation in 2012, the move was well underway, and the purchase of SoftLayer as its public cloud in 2013 (based on commodity x86) took it further. IBM is embracing the cloud. IT delivery models have changed and a hybrid model of both onsite and offsite capability is now the norm. Almost all the organisations I work with are asking for public cloud-based services to complement their existing IT capabilities. The SoftLayer USP of PAYG bare-metal server provision is making this an easy discussion, avoiding many of the concerns around security and sharing of infrastructure. Uptake is good and we now have a network of 40 data centres worldwide, with our first in London opening back in July.

OpenStack adoption continues to accelerate in IBM and it is providing a platform for hybrid clouds. IBM Cloud Orchestrator is one such example. It is a foundation for hybrid adoption across private and public clouds and comes with OpenStack, AWS and native SoftLayer API support. Zenith is a fully managed OpenStack cloud offered on a PAYG basis and delivered through IBM SoftLayer. DevOps teams are exploiting OpenStack Heat for portable application delivery, with HOT templates in IBM UrbanCode Deploy. Heat offers the industry a potential standard for portable applications, giving users the freedom to deploy onsite or offsite and to choose their cloud provider.

Hybrid is much more than running x86 in the cloud. Hybrid is focused on the integration and consumption of business services on the cloud. I am really excited by IBM Bluemix, a platform for composable apps. Based on the Cloud Foundry PaaS running on SoftLayer, Bluemix allows the creation and deployment of cloud-based software quickly and effectively.
It features cloud integration services to enable a secure connection between public applications and private data, a true hybrid model that brings public cloud services in to complement an enterprise's existing IT investments. In a move that takes our mobile strategy for the enterprise to the next level, Bluemix is the secret sauce in IBM's recent enterprise mobility deal with Apple. Exciting times, and certainly no need to buy compatibility with AWS.

### Rackspace stays independent

Rackspace to stay independent and focus on the managed cloud market. Rackspace has ended its mergers and acquisitions evaluation, declaring its commitment to remain independent. The move bucks the trend of big-name acquisitions, including those recently by Microsoft and Google. According to previous SEC disclosures, Rackspace had been approached by a range of suitors looking to explore strategic relationships, ranging from partnership to acquisition. After a comprehensive review, the board concluded that with reaccelerated growth and potential for the coming year the company was better off as an independent.

"The company is best positioned to drive value... through the continued execution of its strategic plan to capitalize on the growing market opportunity for managed cloud services," said Graham Weston, Rackspace co-founder and chairman. Along with the announcement, Rackspace has named Taylor Rhodes as CEO to lead and drive its managed cloud strategy. "As a seven year leader of Rackspace, [Taylor] was instrumental in the development and execution of Rackspace's managed cloud strategy," said Weston. "We are confident that under his leadership, Rackspace is well positioned to lead in the large and growing managed cloud category." For more information check out the full press release on the Rackspace website.

### Oracle CEO steps down

Oracle founder Larry Ellison resigns after 37 years as CEO. There has been a dramatic shift in Oracle's direction over the last few years. Oracle, the software powerhouse providing technology for many mid-size to enterprise-level businesses as well as to banks and governments, had been slow to fully embrace cloud technology. Heavily reliant on on-premise licensing, Oracle has been regarded as a big, established player for over 30 years. But things have changed. Oracle's claim in 2012 that they didn't "get the cloud" was considered baffling by many. The subsequent move into public IaaS was seen as strange, especially considering the nature of their business and fierce competition from AWS. Many in the industry felt their offering was not a true public IaaS but rather private builds embedded in Oracle software stacks. This gave the impression of a division between marketing and technology inside the organisation.

Larry Ellison, 70, co-founded what would become Oracle with Bob Miner and Ed Oates in 1977. After 37 years of steering the company he is to step down. This could be Oracle's moment of change, and its biggest yet. In a statement issued by the Oracle board, president Michael Boskin said, "Larry has made it very clear that he wants to keep working full time and focus his energy on product engineering, technology development and strategy." As the software layer becomes an increasingly important battleground in the cloud marketplace, Oracle needs to adapt quickly. There's a certain "new blood" feel to this announcement, given it comes at a time when Oracle needs to fully embrace cloud to avoid further erosion of its market share.
Mark Hurd and Safra Catz have been named as successors and become co-chief executives. Ellison will become Oracle's CTO. It is speculated that Ellison will make embracing the cloud a priority, redefining Oracle as a champion of the cloud. Hours after the announcement Oracle shares fell 2.5%, and the company has reported disappointing results for some time now. Is Oracle's cloud strategy working? Do they need a dramatic change? As always, we welcome your comments.

### Infinera introduces Cloud Xpress

Infinera has introduced the Infinera Cloud Xpress family of metro optical platforms, designed for network operators delivering cloud-based services to consumers and businesses worldwide. The Cloud Xpress is designed to simplify the deployment of high-bandwidth connections while lowering the power and space required to scale metro cloud networks. Consumers and businesses increasingly rely on the cloud for their application needs. As cloud adoption increases, large network operators are reporting a magnification effect on incoming traffic: a single request can often generate 10 times the amount of traffic between data centres. This magnification effect is accelerating the need for better, high-capacity solutions that connect the cloud. The Cloud Xpress is a small, stackable platform designed specifically to address this need. Infinera believes Cloud Xpress is the first to offer a combination of operational simplicity, massive transmission capacity and low power consumption.

"As cloud applications and services have grown in importance, the need for high-speed data centre interconnection (DCI) has grown at a very high rate," said Paul Parker-Johnson, practice lead, cloud and virtual system infrastructures at ACG Research. Connectivity and power consumption are issues that are going to continue to grow as cloud networks expand. Leaner, higher-density and more efficient DCIs are going to be essential to meet future demand. "The introduction of the Cloud Xpress for the metro cloud shows that Infinera is again bringing the right 100 Gb/s solution to market at the right time, marking our initial entry into the rapidly growing metro transport market with a unique solution for cloud data centre interconnection," said Tom Fallon, chief executive officer at Infinera. The Cloud Xpress is currently in customer trials, with general availability expected in December 2014. For more information check out the full press release on the Infinera website. For more information on Cloud Xpress: www.infinera.com/go/cloud

### Cisco to acquire Metacloud

Acquisition of private OpenStack cloud service company accelerates Cisco's Intercloud strategy. Cisco has announced its intention to acquire Metacloud. Based in Pasadena, California, the privately held company deploys and operates private clouds for a number of global businesses. Using a unique OpenStack-as-a-Service model, Metacloud delivers private clouds based in a customer's own data centre. The acquisition of Metacloud and its OpenStack-based cloud platform is part of a larger Cisco strategy to build the world's largest global network of clouds, Intercloud. Since announcing its Intercloud strategy in March, Cisco has been enlisting key technology partners, service providers and cloud providers. Intercloud is Cisco's response to customer requirements for a globally distributed, highly secure cloud platform capable of meeting the robust demands of the Internet of Everything. "Cloud computing has dramatically changed the IT landscape.
To enable greater business agility and lower costs, organisations are shifting from an on-premise IT structure to hybrid IT – a mix of private cloud, public cloud, and on-premise applications," said Hilton Romanski, senior vice president, Cisco Corporate Development. Metacloud staff will join Cisco's Cloud Infrastructure and Managed Services organisation on completion of the acquisition, which is expected to close in the first quarter of fiscal year 2015. For more information read the full press release on the Cisco website.

### A guide to domain parking and cyber-squatting for brands

With the introduction of new top level domains in early 2014, brands must be aware of the importance of protecting their domain name more than ever before. Danielle Middleton, digital content writer for Claranet SOHO, examines the issues. A successful brand requires a strong online presence, and a crucial aspect of this is securing your company's domain name. For growing born-in-the-cloud companies and service providers the right domain name is critical. In 2014 there are various issues that brands must be aware of when establishing their website in order to protect their reputation, product, service and finances. This article will discuss the relevance of your domain name and the importance of taking precautions against that name being hijacked by a third party.

The Internet Corporation for Assigned Names and Numbers has shaken up internet domains with the release of hundreds of new generic Top Level Domains (gTLDs). The driver to create new gTLDs was to build a trusted internet by promoting competition in domain registrations: too much emphasis had been placed on securing the .com and the marketplace was suffering. From specific gTLDs like .email and .digital to more oblique names like .ninja and .cool, the possibilities have grown to meet demand. Businesses need to be able to express their identity and define their offering with their URL. For example, a recent YouGov survey showed that 1 in 4 small businesses in London would apply for the .london domain. Location and relevance in a domain are as important as finding the right name. However, the expansion of gTLDs has exposed companies to the threat of cyber-squatting, prompting many to rush to domain parking.

What is domain parking?

When you park a domain, visitors to your domain name will see a temporary web page that indicates you have reserved that domain. This typically takes place in the following instances: before a site has been completed; prior to a hosting provider being found; when the domain name is for sale; or to prevent others using variations of your domain name. This final point refers to cyber-squatting, a form of domain parking with the potential to cause vast amounts of damage to a brand.

What your brand needs to know about cyber-squatting

With more gTLDs available, there is a substantially increased danger of a brand being targeted or put at risk from cyber-squatters. Cyber-squatting is the practice of purchasing a domain name of a well-known company or brand with the intent to make a profit. The cyber-squatter will seek to sell the domain name to the brand or person who owns that particular trademark at a drastically higher price than they originally paid, and to capitalise on the search traffic a high-profile brand name will generate.
According to a report in The Telegraph shortly after the gTLD reform, 80 per cent of .web extensions relating to 50 of the UK's most valuable brands, including HSBC.web and JohnLewis.web, had been reserved or pre-ordered by third parties. A further 78 per cent of the UK's top 50 brands had been pre-ordered under the .online domain name, 72 per cent under .shop and 68 per cent under .blog – all by unknown third parties.

gTLDs, HTTPS and cloud security

Almost six months after the introduction of gTLDs, Google has revealed the importance of website security in its latest ranking algorithm. In a post on Google's Online Security blog, the search engine stated that it had been 'running tests taking into account whether sites use secure, encrypted connections as a signal in our search ranking algorithms. We've seen positive results, so we're starting to use HTTPS as a ranking signal.' The fact that Google is placing more importance on the need to automatically give visitors a secure connection via HTTPS is a particularly positive step for cloud providers, as the hosting platform supplies the website security. By encouraging HTTPS to become an internet-wide practice, Google is ensuring that information is encrypted at both ends of its journey – protecting site owners and visitors.

As far as cloud providers are concerned, gTLDs, brand and the rising importance of web security in search engines can be tied together. For instance, Symantec Corporation might apply for the following gTLDs: Symantec.antivirus, Symantec.cloud, Symantec.norton and Symantec.protection. The above exemplifies how the new gTLDs provide plenty of opportunity for brands to achieve exact-match domains, which increases search potential. And by ensuring that a URL has been switched to HTTPS, webmasters signal to search engines that they run a secure and responsible site.

Parking your domain to help your brand

Webmasters need to be aware that HTTPS and gTLDs will become increasingly important in search engines as Google begins targeting security and the relevance of a top level domain for users. This means that a brand's reputation can be damaged in two ways: a cyber-squatter adopts a gTLD and uses the domain to provide a knock-off service, product or other means of money-making under the brand's name; or Google identifies that the gTLD does not have an HTTPS URL and is therefore not secure, and that the content is poor and irrelevant to the search term. Brands can register gTLDs such as Symantec.antivirus and park them as a preventative measure. However, this is a more costly method as domain registrars are not limited on the amount they can charge for domain extensions – with the number of gTLDs now available this can quickly become a less economical option in the long term.

Protecting your trademark against cyber-squatters

Brands need to be aware of the damage cyber-squatters can do to their reputation as customers approach the website believing it to be authentic. Branded domain names held by cyber-squatters are often used to sell counterfeit goods or provide services under false pretences. The fallout can be damaging to the brands and businesses targeted. To prevent third parties from gaining your brand's domain name, record your trademark in the Trademark Clearinghouse. This will secure all domain names relating to your trademark and enable you to receive alerts from domain registrars should a third party attempt to use your trademarked name; brands should be prepared to pay around £90 per year to record their trademark.
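The checks described above lend themselves to light automation. The following is a minimal, illustrative Python sketch (not a tool mentioned in the article; the brand name and gTLD list are assumptions made for the example) showing how a brand team might periodically test whether variants of its name under new gTLDs already resolve, a hint that a third party may hold them, and whether they answer over HTTPS.

```python
# Minimal sketch, assuming a hypothetical brand "examplebrand" and a sample
# set of new gTLDs; not a production monitoring tool.
import socket
import ssl

BRAND = "examplebrand"                                   # hypothetical brand
NEW_GTLDS = ["shop", "online", "blog", "web", "cloud"]   # illustrative gTLDs

def https_reachable(host: str, timeout: float = 5.0) -> bool:
    """Return True if the host completes a TLS handshake on port 443."""
    context = ssl.create_default_context()
    try:
        with socket.create_connection((host, 443), timeout=timeout) as sock:
            with context.wrap_socket(sock, server_hostname=host):
                return True
    except (OSError, ssl.SSLError):
        return False

for tld in NEW_GTLDS:
    domain = f"{BRAND}.{tld}"
    try:
        socket.gethostbyname(domain)   # does the name resolve in DNS at all?
    except socket.gaierror:
        print(f"{domain}: not resolving - may still be available to register")
        continue
    status = "serves HTTPS" if https_reachable(domain) else "no HTTPS response"
    print(f"{domain}: already resolving ({status}) - check who holds it")
```

A sweep like this only flags candidates for investigation; confirming who actually holds a name still means checking WHOIS records or the Trademark Clearinghouse alerts mentioned above.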
### Watson Analytics brings Big Data to businesses

With Watson Analytics IBM introduces powerful analytics for everyone. IBM has announced the launch of Watson Analytics, a natural language-based service that provides powerful, predictive and visual analytic tools for businesses. In what is seen as a breakthrough move, Watson Analytics is designed to make advanced and predictive analytics easy to acquire and use for anyone. Most analytics tools assume users have data ready for analysis, a clear idea of the analysis needed, and the skills and time required to perform it. However, few business users are ready. Finding and validating data can be time consuming – often representing 50% of the effort in an analysis project. Business users often struggle to work out what analysis would be relevant and how best to report or diagram findings. Watson Analytics automates these steps, making it easier for businesses to find the answers they're seeking, quickly and on their own. Using natural language processing, business users can ask questions and get results in terms familiar to their business. As users interact with the results, they can fine-tune their questions and the data to surface the most relevant facts and uncover unforeseen patterns and relationships, which will enable predictive decision making for all levels of users.

"Watson Analytics is designed to help all business people – from sales reps on the road to company CEOs – see patterns, pursue ideas and improve all types of decisions," said Bob Picciano, Senior Vice President, Information and Analytics Group, IBM. "We have eliminated the barrier between the answers they seek, the analytics they want and the data in the form they need." Watson Analytics is a cloud-based application delivered on the SoftLayer platform, the infrastructure provider IBM acquired in 2013, and will be available through the IBM Cloud marketplace. IBM also intends to make Watson Analytics services available through IBM Bluemix. This move will enable developers and ISVs to leverage its capabilities in their own applications. Certain Watson Analytics capabilities will be available for beta test users within 30 days, and offered in a variety of freemium and premium packages starting later this year. For more information, visit: watsonanalytics.com.

### Episode 1: Piratism, Cloud, Big Data and Microsoft Needs Love

[embed]http://traffic.libsyn.com/comparethecloud/Episode_1.mp3[/embed]

Voting on the issues is coming soon – in the meantime leave a comment or tweet us at @comparethecloud. Comments may be used for future episodes.

Episode Notes

Here's the TechRepublic article about IBM using sensors to monitor household habits. It turns out the rise of IoT cloud-connected fridges ruling over mankind in a post-apocalyptic world wasn't as ridiculous as we thought. There are a few articles online about hacking these devices (in fairness, often due to poor end-user management), including this article about fridges being turned into spam networks. In response to the iCloud 'hacks', Apple have beefed up their security. Better late than never. Listen now via your favourite service.

### The Jolly Green Giant

With the virtual world growing exponentially and consumer demand expected to rise by 60% or more by 2020, the energy appetite of the Cloud is vast. As this appetite rises it is also creating increased demand for at least one offline product: electricity. The debate around the cloud and its green credentials is one that has been doing the rounds since the cloud came to fruition.
I myself have in the past written a number of pieces extolling the many benefits for a business of migrating to the cloud, but the cloud still has some way to go to prove its green credentials, especially when you look at a data centre's energy consumption. Always in the storm's eye amid green ICT debates, these mammoth data centres have an equally mammoth appetite for electricity. They are easily the greediest electricity gobblers in the modern world, eating up electricity at a staggering rate. Global power demand for data centres alone grew to an estimated 40GW in 2013, an increase of 7% compared to 2012, and according to market intelligence organisation IDC, stored data could reach nearly 2 zettabytes (1 billion terabytes) by next year.

All that demand equates to a lot of coal needed to generate power, and this is troublesome. Often the cheapest power options in many regions rely on energy generated by some of the oldest coal-fired power plants, which also often happen to be the dirtiest in terms of emissions and the thirstiest when it comes to water consumption. However, there are a few companies that are flying the flag for the green internet movement. According to a recent report from environmental activists Greenpeace, a small number of leading data centre operators have taken key steps toward building a greener internet, in particular those firms that have committed to build a 100% renewably powered platform. Greenpeace writes that, "These commitments are having a profound impact in the real world, shifting investment from legacy coal, gas and nuclear power plants to renewable energy technologies, and disrupting the status quo among major electric utilities."

The environmental rationale for technology companies to act is clear and simple, as a rapid shift to renewable energy is necessary to stem the worst impacts of climate change. Now, the business case is becoming more compelling as well: costs for renewable energy continue to drop, prices for fossil fuel-based electricity are rising, and leading companies are heeding those price signals.

One such company that is really striding forward with the green mission is tech behemoth Apple. Yep! The very same firm that only two years ago became Greenpeace's number one target after earning the dubious distinction of being among the worst cloudy climate offenders, thanks in part to its reliance on coal. These days, well, it's a different story. Talk about turning over a new leaf. The tech giant has become one of the poster boys for the green cloud movement. Since its very public 2012 battering, Apple has come top of the class in Greenpeace's most recent report – Clicking Clean: How Companies are Creating the Green Internet – by ensuring 100% of its total data centre power consumption comes from renewables. Greenpeace say that "Apple has done the most of any data centre operator to make its part of the internet green through the on-site installation of renewable energy, particularly solar power." The tech giant has deployed large solar farms for both its Nevada data centres, to provide a significant amount of new renewable energy. Hats off to Apple for its push to have all-renewable energy sources for its data farms; it's just a shame that other big players in the market aren't being as forthcoming with their environmental rationales.
"Other companies lag far behind," according to Greenpeace, "with little sense of urgency, choosing to paper over their growing dirty energy footprints with status quo solutions such as renewable energy credits and carbon offsets while rapidly expanding their infrastructure. Greenpeace have rather bluntly condoned Amazon Web Services (AWS), for "paying lip service to sustainability, and simply buying dirty energy straight from the grid." The activist group believes that AWS are "choosing how to power their infrastructure based solely on lowest electricity prices, without consideration to the impact their growing electricity footprints have on human health or the environment." Ouch! But you can see where Greenpeace are coming from as AWS, which hosts a large part of the internet, currently only sources 15% of its electricity demand with clean energy, with coal powering 28% of the company's cloud, nuclear 27% and gas 25%. ### Microsoft acquires Minecraft The rumours over the weekend have proved true. Microsoft has acquired Minecraft for $2.5 billion. The acquisition includes Minecraft's Swedish developer Mojang. The deal is less about Microsoft's new "Mobile First, Cloud First" strategy and more about games. Minecraft is the most popular online game on both the Xbox 360 and the new Xbox One. [quote_box_center]CEO Satya Nadella said of the acquisition, "Gaming is a top activity spanning devices, from PCs and consoles to tablets and mobile, with billions of hours spent each year. Minecraft is more than a great game franchise – it is an open world platform, driven by a vibrant community we care deeply about."[/quote_box_center] Writing on the Xbox Wire blog, Phil Spencer, Head of Xbox said, "Our investments in cloud, Xbox Live and mobile technology will enable players to benefit from richer and faster worlds, more powerful development tools, and more opportunities to connect with the Minecraft community." It is a big acquisition and high profile. Microsoft has potentially made a savvy move in buying Minecraft. ### HP Eucalyptus and AWS kissing in a tree It can hardly be called a ‘startup’ but you may have noticed HP’s recent acquisition of Eucalyptus and the recruitment of enigmatic CEO and Founder of MySQL Marten Mickos. Eucalyptus is the go-to control panel for enabling hybrid integration and leveraging the AWS API stack. Or in plain terms it allows you to manage Amazon web services simply securely and easily. What does this mean? Marten has been very outspoken with regards the Openstack movement describing it as ‘Russia’ in terms of collaboration and oversight. This position has somewhat softened in recent times (I wonder if HP’s support of Openstack drove this somewhat Machiavellian U-turn). AWS and HP will they play nicely? Good question, with its almost myopic support of AWS what does Eucalyptus bring to HP? Or even HP partners? Are we looking at another Dell Enstratius scenario repeated with a HP brand? The question that arises is HP looking to leverage the huge deployed server base to create hybrid services in support of HP Cloud or AWS? Or is this move designed to move customers from AWS to HP Cloud? Is this a case of too little to late? Looking at the data centre market many providers offer huge interconnects to the main cloud providers on a network level. The question that one ponders is whether this acquisition is a defensive move to secure the Intel server base of HP and negate new entrants to this sphere such as Lenovo. 
Does a server-based control panel allow for this HP installed base to be protected in what is a 3.7 trillion on-premise market against a 3 billion cloud market? Does the end customer require this kind of integration?

HP climbing off the ropes?

The answer to this is yes; it certainly makes the cloud market a lot more 'interesting'. The question, though, is whether HP can make this acquisition count within its partner and distribution channels. From an enterprise perspective we believe this may well be a winner, although from a mid-market and MSP perspective the support of AWS and HP Cloud will be very interesting. As always, platform is one thing; solutions, applications and online white-label/co-labelled stores, combined with great mobile and smart device support, will be key.

### The cloud as a revenue generator

The cloud as a revenue generator – is cloud the next generation growth catalyst? What do business owners who are not technically inclined think of when they hear the phrase Cloud Computing? Some think of the cloud as an affordable way of integrating next generation technology with their business; a section of owners look at it as a much needed addition to their existing IT framework; there might be plenty of others who look at the cloud as a means of increasing top-line revenue. If we take a closer look at all these reasons, one commonality stands out – most businesses think of cloud as a revenue generator. Whatever their immediate reasons for adopting the cloud, they are looking at this technology as a means of driving revenue, and for good reason. The inherent nature of cloud computing makes it a growth catalyst. So the question is – why has cloud earned a reputation for being a driver of growth? Or are we expecting too much from it? Let's take a look.

Product Development

One of the ways the cloud helps generate revenue is by fuelling product development. An extremely instructive 2014 North Bridge Future of Cloud Computing Survey put the number of businesses using the cloud for revenue generation or product development at 49%. These businesses have the right expectations of the cloud, as it definitely acts as a facilitator of product development. By using cloud for product development, the need to maintain expensive, high-maintenance, enterprise-level tools on-premise goes out of the window. At the same time, you don't have to worry whether you'll get access to the latest technology. If you sign up for managed cloud services, you get upgrades, patches, and everything else needed to develop a cutting-edge product conveniently and seamlessly. Whether it's internal collaboration activities, idea sharing, gathering feedback, or delivering product updates, the cloud helps you address all activities tied in directly with a product lifecycle.

Explore Newer Markets

One big reason why a particular business experiences strategic growth and another doesn't is the former's ability to tap into previously untapped high-growth markets. Now, exploring new markets is fraught with challenges. You'll need to tweak your business to suit the local tastes vis-à-vis the consumer, hire a skilled local workforce, battle local competition and do a whole lot more to make an impression on a new target audience. The disruptive nature of cloud technology means it has the ability to trigger a fundamental shift in business. Not many businesses have the flexibility required to leverage the potential of unexplored markets for their business. This was until the arrival of the Cloud.
An insightful article on CIO titled Cloud-based Ecommerce Platforms Give Retailers Global Flexibility discusses how Clarins, a French cosmetics company, optimised the use of the cloud through cloud-based eCommerce platforms to target the rapidly growing Chinese consumer market. This is just one example of how cloud gives businesses the flexibility to grow in new markets. This, as can be imagined, boosts revenue.

Cloud Simplifies

Astute businesses have used cloud as a growth driver by simplifying internal processes, whether it's human resources, marketing or customer support. They've also improved access to resources like storage, which has led to better collaboration amongst employees. This has gone a long way in improving a business's process efficiency and performance. The ability to access, share, analyse and take action on relevant data gets a huge fillip, courtesy of the cloud. The fact is, if you move certain processes, systems or applications to the cloud, somebody else is doing a major portion of the operational work for you. You'll find this gives your business more scope to grow, which brings us to our next point: agility.

Business on Steroids

What separates the best businesses from the rest from the revenue generation point of view? It is agility. Agility is a nebulous term that could mean the capacity of businesses to quickly adapt to changing market conditions. It could also refer to their ability to create products faster. Agility could also mean the capability to market a product faster, or even innovate quickly. If there is an HR person reading this, for them agility can mean faster recruitment. Business agility can be described in different ways and will mean different things at different points in a business's journey. While it might have various meanings, there is no doubt that agility delivers business growth, and according to a Harvard Business Review survey, 41% of businesses have adopted the cloud because of its ability to make them more agile. How you define agility depends on you. If you're running an eCommerce business, agility could probably mean your ability to respond to customer feedback faster. If it's a product development company, agility could mean faster updates or debugging. The key here is finding out how you can use the cloud to speed up business functioning as a whole. For this to happen, you need to understand the cloud, and I am talking about really understanding its depths. Investing in a CIO will help. If you want to optimise the use of the cloud, hiring an expert is the first step; everything else will follow.

Cloud and Growth

Think of a situation where you've been operating in a business niche for a really long time, but growth has been stagnant; your inability to reach out to a mass market has worn down your competitive edge. You are staring at business redundancy. Enter the cloud. You pick a cloud platform or two and choose from a smorgasbord of plug-in services available on the cloud, and voila, you will soon find your business competing with the big boys in your niche. The disruptive nature of cloud technology means it has the ability to trigger a fundamental shift in how your business functions. This is how it has been driving revenue generation for businesses, and will continue to do so in the future. There is absolutely no doubt that if you want to accelerate business growth and at the same time keep costs under control, there is no escaping the Cloud.
### CWCS provides cloud hosting for Edinburgh Fringe Festival

Edinburgh Festival Fringe Society's performance registration system, "Edfringeware", is implemented with the help of CWCS's Cloud Server. CWCS Managed Hosting has announced that it is working with the Edinburgh Festival Fringe Society to deliver their Edfringeware portal. This will be hosted on its powerful and reliable Supreme Cloud infrastructure. CWCS is one of the UK's leading providers of cloud hosting. Its superior EMC- and HP-powered cloud platform delivers fast, reliable performance and is extremely flexible. The Edinburgh Festival Fringe, the world's largest arts festival, needs a highly reliable and flexible hosting solution to ensure its partners can upload their shows at any time. The Edinburgh Festival Fringe Society approached CWCS Managed Hosting for a powerful and scalable solution to host the market-leading VIA ticketing system behind the Society's Edfringeware.

CWCS Managed Hosting's Supreme Cloud is being utilised to make it easier to register a show at the Edinburgh Festival Fringe. It has improved data flows and operational processes involved with the core functions that the Fringe Society provides. Edfringeware successfully achieved this with 49,497 performances registered through the system this year. Edfringeware is an open portal to all participants who would like to add their performance to the programme. Keeping to its core policy, the Society takes no part in vetting the festival's programme. Chief Executive of the Edinburgh Festival Fringe Society Kath M Mainland said, "We've been hugely impressed by the professional service delivered by CWCS. Edfringeware is a massively important tool which connects us with our participants so it's vital that we make sure this service is performing at optimum levels."

The CWCS Supreme Cloud server has enabled the Edinburgh Festival Fringe Society to improve processes and is also reducing the costs of its server management. The Society has benefited from the complete scalability of the Supreme Cloud, enabling it to increase and reduce resources on demand. Not only does this make things more efficient and cost-effective for the Society, it also makes it easier for the system to grow as the festival grows. This historic and innovative festival has come a long way since its inaugural event in 1947. Staying true to its pioneering past, the Society leads the way in modern thinking. Karl Mendez, Managing Director of CWCS Managed Hosting, states, "At CWCS, we're proud to provide this wonderful Festival with the modern infrastructure to host its unique event registration and management system. Our Supreme Cloud offers an extremely flexible, scalable and reliable hosting solution for businesses. The nature of the Supreme Cloud means that it's an extremely cost-effective, enterprise-level hosting solution."

### Apple Pay takes on shopping

Unsurprisingly, Apple has launched two new larger iPhones, the iPhone 6 and iPhone 6 Plus. It also released its first major new product since the iPad, the greatly anticipated Apple Watch – finally a reason to retire that retro 1980s Casio. But for those of you with an interest in the cloud, it is the launch of Apple Pay that should be pulling in the headlines. Apple Pay intends to change how we shop with an innovative contactless payment technology. The system uses a combination of the device and a unique, cloud-based security system to complete transactions.
Customers will be able to shop without ever opening their wallets. Before the launch event there had been rumours, and plenty of leaks, that Apple was working on a payments system. Initially Apple Pay will only work with the new iPhone 6 and next year's Apple Watch; iPhone 5 users tethered to the Apple Watch will also be able to use it. Unlike many on the other side of the Atlantic, most of us are already used to contactless payments. But the big difference with Apple Pay is that this new system takes the card out of everyday purchases. The retailer never has to see the customer's card. Apple Pay intends to change how we shop. "Your wallet, without the wallet", as Apple says.

According to Apple, when a customer presents their credit or debit card for payment, the card number and identity are on display. This visibility is a risk. With Apple Pay the actual credit or debit card is never exposed. Instead a unique Device Account Number is assigned, encrypted and securely stored on a dedicated chip in iPhone and Apple Watch. When a customer makes a purchase, the Device Account Number, alongside a transaction-specific dynamic security code, is used to process the payment. Customer details are never stored on Apple servers. The actual card numbers are never shared with merchants or transmitted with payment. Apple also says it doesn't save transaction information. With Apple Pay, payments are private – the shopping version of "no follow", we hope. This is good news, especially given the recent high-profile celebrity iCloud hacks. "Security and privacy is at the core of Apple Pay. When you're using Apple Pay in a store, restaurant or other merchant, cashiers will no longer see your name, credit card number or security code, helping to reduce the potential for fraud," said Eddy Cue, Apple's Senior Vice President of Internet Software and Services.

Apple announced that Apple Pay works with over 220,000 US retailers, including Nike, Staples, Subway and McDonald's. The UK and Europe should follow next year. Apple will also launch a new Apple Pay API for developers to integrate the system into their apps. Looking beyond the hardware announcements, this is a good move for Apple and for the cloud marketplace. Apple Pay supports credit and debit cards from American Express, MasterCard and Visa. More of the big banks, many of whom rely on enterprise cloud networks to support their banking, are likely to join up. There needed to be innovation in payment systems, and Apple Pay looks to be breaking new ground.

### Juniper Networks expands Spotlight Secure to stop malware

Open security intelligence platform enables SRX firewall customers to take quick action on data from varied threat detection feeds to enhance protection of high-IQ networks. Juniper Networks has announced new advancements in its security capabilities that extend the Juniper Networks® Spotlight Secure threat intelligence platform and link it with firewall policies in Juniper Networks SRX Series Services Gateways. These new advanced security capabilities empower customers to quickly take action on intelligence from varied threat detection technologies by immediately pushing enforcement rules to SRX firewalls to cut off command-and-control (C&C) traffic, isolate infected systems and effectively combat a diversity of threats targeting networks.
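To make the pattern concrete, here is a deliberately generic sketch of feed aggregation driving enforcement: several indicator feeds are merged into one de-duplicated set, from which block rules are generated. The feed contents and rule syntax below are invented for illustration; they are not the Spotlight Secure or SRX APIs.

```python
# Generic illustration of threat-feed aggregation driving firewall rules.
# The feed data and the "deny" rule format are hypothetical, not Juniper syntax.
from typing import Iterable, List, Set

def aggregate_feeds(feeds: Iterable[Iterable[str]]) -> Set[str]:
    """Union all indicator feeds into one de-duplicated set of addresses."""
    indicators: Set[str] = set()
    for feed in feeds:
        indicators.update(feed)
    return indicators

def to_deny_rules(indicators: Set[str]) -> List[str]:
    """Render one generic block rule per indicator (e.g. known C&C hosts)."""
    return [f"deny ip any {address} log" for address in sorted(indicators)]

if __name__ == "__main__":
    vendor_feed = ["203.0.113.7", "198.51.100.23"]       # vendor-supplied feed
    third_party_feed = ["198.51.100.23", "192.0.2.99"]   # third-party feed
    custom_feed = ["192.0.2.150"]                        # customer's own detections

    rules = to_deny_rules(aggregate_feeds([vendor_feed, third_party_feed, custom_feed]))
    for rule in rules:
        print(rule)   # in a real deployment these would be pushed to the firewalls
```

The point of the pattern is the single, consolidated list: however many detection technologies contribute indicators, enforcement is driven from one place.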
This novel approach frees customers to choose the most appropriate threat detection technologies available – including feeds customized to their business – rather than being locked into only the intelligence data offered by their firewall vendor. "We have transformed our security solution to address the challenges and constraints that our customers face with traditional firewalls. By creating an open framework that enables the aggregation of intelligence data from multiple feeds, we can provide application and user-level visibility. In addition, this solution is able to increase agility to effectively combat evolving threats in order to keep High-IQ networks secure," said Jonathan Davidson, Juniper Networks' Senior Vice President and General Manager of the Security, Switching, and Solutions Business Unit.

As the threat landscape continues to accelerate and evolve, the security industry continues to respond with a variety of disparate new detection technologies. Unfortunately, this approach results in customers struggling to manage a patchwork of uncoordinated security tools, leaving a gap between detection and enforcement at the firewall. Many Next-Generation Firewalls (NGFW) include integrated capabilities, such as Intrusion Prevention System (IPS), anti-virus signatures and proprietary reputation feeds, but they are closed systems that are not capable of taking full advantage of the highly diverse third-party and custom feeds utilized by customers. Juniper's expanded Spotlight Secure platform addresses these challenges and constraints by aggregating threat feeds from multiple sources to deliver open, consolidated and actionable intelligence to SRX firewalls across the organization. These sources include Juniper threat feeds, third-party threat feeds and threat detection technologies that the customer can deploy. Administrators are now able to define enforcement policies from all feeds via a single, centralized management point. More information can be found at Juniper Networks or connect with Juniper on Twitter and Facebook.

### Naked celebrity hack highlights digital vulnerability

By now, you and pretty much everybody else on planet Earth have heard about the Great Naked Celebrity Photo Leak 2014. Mainstream media has been buzzing away about yet another nefarious scoundrel hacker breaking into female celebrities' personal iCloud accounts, stealing nude photos and posting them on the web (seriously, what gives these guys the right? Perverts!). The revealing images, purportedly including actress Jennifer Lawrence, singer Rihanna, selfie extraordinaire Kim Kardashian, model Kate Upton, plus a whole host of other celebs, were posted on the forum 4chan before spreading with the ferocity of a bush fire over social media networks. What's worrying about this whole pervy debacle is that security in the cloud is again making headlines. But how did the hacker gain access to such personal images?
Well, let's rule out the initial reports that suggested iCloud itself sustained a large security attack, as the service is 128-bit encrypted in both directions, and as Jeff Dodd, CEO of cloud-based managed IT services provider entrustIT Europe, said to me: "This seems like a weird one to me; I can't see any way for it to have been a breach of security at iCloud because the sheer logistics of locating individual celebs in a whole mass of data more or less works against you. That implies that whoever stole the data logged in as those celebs and then downloaded their pics." Apple itself stated in a press release that no such breach was made of its systems: "After more than 40 hours of investigation, we have discovered that certain celebrity accounts were compromised by a very targeted attack on user names, passwords and security questions, a practice that has become all too common on the Internet. None of the cases we have investigated has resulted from any breach in any of Apple's systems including iCloud or Find my iPhone."

So how did the hacker do it? Well, unlike other recent celebrity hacks such as the 2012 Scarlett Johansson rude piccie leak, this breach appears different in that it used a near-zero-day vulnerability in an Apple cloud interface. Instead of using social engineering or some low-tech research to gain control of the accounts, the attacker basically knocked down the front door. Apple didn't find out until the attack was over. Oops! The original hack, it would seem, looks to have been done by "chaining" between accounts: having gained access to one person's account, the hacker could access their address book and use that to attack others. While an unusually long, convoluted password may have prevented the attack, it seems unlikely that even Apple's two-factor authentication, which according to reports the tech giant is broadening to avoid future intrusion, would have helped. On Friday [5 September] Apple announced that tools will be put in place for legitimate users of accounts to seize back control. The company's Chief Exec Tim Cook was quoted as saying that Apple also wants to make people savvier when it comes to guarding against hackers with strong passwords and other techniques. "When I step back from this terrible scenario that happened and say what more could we have done, I think about the awareness piece," Cook was quoted as saying.

Brute Force Attack

The signs are that it was a brute force attack, as a posting on online code-sharing site GitHub said a user had discovered a bug in Apple's Find My iPhone service, which tracks the location of a missing phone and allows a user to disable the phone remotely if it is stolen. The bug allowed a hacker to keep trying passwords until identifying the right one. Most online services lock down an account after multiple incorrect password attempts to prevent this type of attack, and it would seem Apple haven't rested on their laurels, as the GitHub post was updated on Monday to read: "The end of fun, Apple have just patched." Jeff told me that if it was a brute force attack, "then it emphasises how important it is to look after your online passwords because half of your authentication (your email address) is probably public domain already. If you're using the same password for more than one service then that's like Russian roulette with two chambers loaded.
The third chamber would be letting other people know your password too." Jeff added: "Although I don't frequent such exalted circles myself, these celebs generally have an entourage and limited privacy. I imagine that access to their online accounts isn't as secret as they'd hope or like. In those circumstances the only solution is incredible care about what you upload to the cloud because, being brutally honest, a celeb's personal life has value and criminals are dedicated." This breach, no matter who is to blame, ultimately still alerts businesses to the risk of cloud storage, but this unfortunate episode should, in my opinion, be used to highlight areas where improvements can be made and cloud security awareness can be heightened. Plus, despite the negativity, there are now more opportunities than ever for channel partners who specialise in cloud security to move in and toughen up security, particularly on previously 'trusted' platforms.

### BSkyB and the digital village

Broadcasting giant BSkyB turned to Salesforce.com in a bid to find a single platform where they could share knowledge quickly with different teams, divisions and geographies, all through one simple-to-use social feed. Like the vast majority of couples with children, my husband and I spend most nights sat in the living room whiling away the hours in front of the TV. And probably like many thousands of other couples, one of us, if not both, is also immersed within the realm of social media. Now, over the past few months I have noticed an increase in the time my husband spends on his iPad conversing within this realm – annoying, I hear you say, but only when he bursts into involuntary laughter and I almost send my well-deserved glass of vino all over the place. However, after one particular shock explosion of laughter, I asked him what he was up to. His response was "it's this Chatter thing we have at work. It's brilliant!" After a considerable amount of time with my husband waxing lyrical about how this app had completely opened the lines of communication up for him and the rest of his team, as well as how it's improved things for his customers, curiosity took over and I decided to take a deeper look into the whole enterprise social network arena.

My first port of call was to speak with his employer – BSkyB. The home entertainment and communications behemoth certainly does seem to stand by its mantra – Believe in Better. Not only when it comes to its customers but also with its staff, as around 18 months ago the media giant rolled out a new social network experience for its many thousands of workers (my husband included) in a bid to improve internal communication and increase employee satisfaction. A team was put together with Nicola Band, BSkyB's Chatter Community Manager, playing a pivotal role. "I was given a basic problem to solve," Nicola told me, "sometimes people just don't know where to go to get things done, particularly in our contact centres where we were growing both in size and geographical spread. We had a new environment where people were no longer co-located and many of our people worked with key strategic partners – we needed a practical, simple way to join the dots. I started exploring social enterprise network solutions and was fortunate enough to link up with one of our technology project managers who was keen to trial Chatter."
Sky's decision to roll out Chatter across its workforce was "a natural one", according to Nicola. "There were several reasons why Chatter became our product of choice: simplicity – little need for training, intuitive interface, reliability, ability to customise through our object allocation, investment in improvement/enhancements (regular release roadmap) and mobile offering (several thousand of our users in the field access only via iPad)." Since Chatter's inception 18 months ago the results have been extremely positive. "Chatter allows our people to contribute ideas, avoid duplication of effort and create a vast accessible network of expertise," Nicola explained. "We have a more connected workforce, allowing collaboration and teamwork between our field- and office-based colleagues, our internal and strategic partners."

So what exactly is this Chatter then? Well, it's really a customer relationship management (CRM) tool, similar in a way to Twitter and Facebook; however, Chatter allows users to form a community within their business that can be used for secure collaboration and knowledge-sharing. It can be optimised for mobile and desktop use and is easily integrated with any social app. It provides a social network experience to help employees collaborate on such things as sales opportunities, service cases, campaigns and projects, all in one place. A digital village, you could say! Xabier Ormazabal, Senior Director, Head of UK Marketing at salesforce.com, told me that thanks to information overload businesses are looking for a "single platform where they can share knowledge quickly with different teams, divisions and geographies, all through one simple-to-use social feed."

"And with two billion people now active on social networks, a new generation of workers has emerged for whom social networks are crucial for keeping in touch with colleagues and sharing information. Those workers expect the same level of functionality and connectivity at work from business applications as they get from their smartphone's Facebook app. As a result, more and more businesses are turning to enterprise social networks to connect employees in a user-friendly way." And the results for BSkyB speak for themselves. As Nicola points out, Chatter has been well received, particularly by those who were less connected before, like its field and partner colleagues. "Our fantastic engagement rates are testament to the way our people embrace collaborative working. We are 18 months into our implementation and continue to grow."

### Node4 launches sync and share solution for SMEs

Cloud-based File Sync and Share gives SMEs a secure alternative to consumer applications. Node4, the Cloud and Data Centre specialist, has launched File Sync and Share, a cloud-based collaboration product that allows users to share and collaborate on content using cloud services in a secure and seamless manner. Based on Node4's resilient N4Cloud platform, File Sync and Share is an alternative to consumer solutions such as Dropbox, Box and Google Drive. A small software app can be installed by users on a variety of connected devices and synchronises automatically when files or folders are created or modified.
This means an employee can create or receive a document on his or her desktop in the office and, using File Sync and Share, access it on a connected smartphone, tablet or laptop through the app. The platform can be used to share files internally and externally and is centrally managed through a web portal where administrators monitor access rights and permissions, giving businesses more control than many "open" consumer applications. Administrators can remotely manage time-limited projects, so an employee who has left the business or an external user who no longer needs access to information can have project files removed from their remote devices.

John Williams, Product Manager, Node4, said: "The increasing use of personal devices for work purposes has led to consumer sharing platforms such as Dropbox and Google Drive becoming popular with business users. For businesses, the issue is that it is difficult to control and monitor the use of such platforms and, ultimately, you don't know where your data actually is or what security measures are in place to protect it. File Sync and Share is a solution that provides SMEs with a secure and collaborative way of sharing potentially sensitive information." "By using a solution based on N4Cloud, hosted in one of Node4's four state-of-the-art data centres, SMEs know exactly where their critical data is being stored and who has access to it. Node4's highly experienced technical team is also on hand 24/7 to provide local support to businesses in the UK."

About Node4

Node4 is a Cloud and Data Centre specialist that in 2014 is celebrating its tenth anniversary. Since 2004 the company has grown rapidly through its comprehensive service offering and the growth in demand for hosted IT from businesses all over the UK. It has four state-of-the-art Data Centre facilities, two located in Derby, one in Leeds and one in Northampton, which offer the latest in security technology, ensuring that even the most mission-critical applications are hosted securely and protected. Its leading solution, N4Cloud, is a key part of the company's wide product portfolio. For more information, please visit: www.node4.co.uk

### What if Zombies attack?

Even in the face of a Zombie Apocalypse, business continuity and disaster recovery are important considerations. Steve Denby, Sales Director of LETN Solutions (a Node4 company), asks: what if zombies attack? I decided to write this blog because I've had some really interesting discussions about business continuity and disaster recovery over the years, and I thought it might be good to share my candid view on how best to manage expectations with executive teams on these subjects.

The "what constitutes a disaster" argument

There is a surprisingly repetitive pattern when discussing what existing DR capabilities are in place with boards and execs; the discussion seems to nearly always start with the assumption that everything is already protected. In the absence of information about what protection actually exists, the most common assumption seems to be that everything is protected very well, to the point where, should a disaster occur, recovery would take less than a day; sometimes the opinion is mere hours to recovery. Unfortunately this is rarely the case, and I'll explain why further on. The second thing that seems to happen is that nobody can agree what a disaster is. This may sound a bit daft, but I can tell you that right now you are thinking Disaster = X while the next person to read this will be thinking Disaster = Y.
This single issue is perhaps the biggest hurdle I have faced when trying to help executive teams take decisive action. It turns into a huge distraction as we have to walk through various scenarios that get steadily more unlikely: extended loss of power; extended loss of communication lines; flooding; fire damage; theft, or damage through attempted theft, of critical infrastructure components; hurricane or localised violent weather; terrorist attack or threat of terrorist attack; nuclear weapon detonation/EMP; alien invasion; and, finally, what if zombies attack? Obviously these last two are tongue in cheek and I usually throw them onto the table just to bring things back into focus, but I kid you not, they have come up in conversation. The truth is that a disaster is an event that removes the use of your IT infrastructure for an extended period of time, say 48 hours or more; anything less than this can usually be weathered. Only your executive team can define it specifically for your organisation. Logic dictates that every business or organisation should have at least a basic contingency in place for what happens in this eventuality.

IT infrastructure is only part of this problem, of course, and I often use a reference story that I was once told to explain the wider implications. The business in question had offices which were severely damaged by the Buncefield oil depot explosion some years ago. When they turned up for work after the event they had little choice but to send everyone home. It was only a week later that they realised they did not appear to have up-to-date records available for all the employees within their DR plan; it was therefore an interesting task trying to reach everyone to let them know it was time to come back to work. A little bit of planning can save a lot of hassle; a cloud-based HR application today would remove this problem entirely and cost next to nothing, so things are becoming easier by default.

The next huge hurdle we get to debate is how to categorise the various applications and data from a business continuity point of view. There are obviously some applications that need to be back up and running before others so the business can get back to trading; the problem is that it's quite difficult to get agreement from a group of department heads on exactly which ones these are. It is an obvious point of conflict and consequently sometimes ends up being moved around at the bottom of the monthly board meeting's agenda like the proverbial Brussels sprouts on your plate at Christmas.

Keeping it simple and providing clear direction

What has become apparent over the years is that it's far better to develop a strategy that introduces a simple base level of cover for all applications and data; you then invite this to be challenged. This is generally better received and sets up immediate discussion around specific recovery times. The only caveat around this is email and phone systems. Real-time communication is the lifeblood of the modern office and, with the increasing reliance on presence-based technology and video conferencing, these systems ideally need to be fully redundant; the good news is that it's not too difficult to achieve this today. Cloud-based email services are becoming standard and hosted IP-based telephony systems are easy to divert to mobiles or other offices in the event of a disaster.

Some technology specifics

Where possible I always suggest that a client uses storage-based replication for virtual machines and data instead of software tools.
Storage replication is on the whole far more robust, a lot more efficient and less expensive to manage. There are now many services available that allow storage to be replicated off-site onto a third-party platform, sometimes called data protection as a service (DPaaS). This is a simple and easy way of ensuring you have a near-line copy of your servers and data offsite, and you don't have to worry about trying to recover from tape should the worst happen. Trying to avoid owning hardware is an objective that should be high on every CIO's to-do list. Infrastructure is not core to running the business or organisation; it is therefore an unneeded distraction. Get rid of as much as possible and only pay for the resource you are consuming. Replicating to a cloud service for your DR is the ideal way to move into using this model with very low risk.

Some caveats

Depending on the sensitivity of the data, you need to ensure that your service provider has some basics in place: they operate "sovereign" data centres which guarantee to keep your data in your country and protected by your local laws; they have solid data protection processes that are audited externally on a regular basis (ISO 27001 or equivalent); and there are options for encrypting data both at rest and while travelling between sites. Another option to bear in mind is whether there are archiving options available; this allows you to keep only the most critical data locally, removing the need to continually grow your on-site storage footprint.

Looking at DR as a Service

It stands to reason that if your servers and data are all off-site, and providing your service provider has cloud computing resource as well, then you can actually purchase a full DR failover service. This base level of cover should be relatively inexpensive, offer a solid SLA of, say, 4 hours to recover Tier 1 applications, and require little to no involvement from your team in management or monitoring. Testing also becomes far smoother and more regular; it can be done in the middle of the day using live data on the DR site, instead of having to spend a weekend watching tapes restore onto bare-metal servers.

The outlook

Looking forward, the future is looking pretty good. As the world's IT transforms, adopting cloud-based applications that are highly redundant and delivered as a service, we see a shift away from having to worry about our on-premise data and servers because they are unlikely to exist in ten years' time. The trick is: how do we handle things today with this knowledge? My personal view is that we are moving into a very interesting five-year period of transformation; it's going to be transitional, innovative and exciting, with lots of new options that don't exist today. Those that embrace the "hybrid" concept will see the best returns and sleep more soundly at night. Move what can be moved (email, for instance) to highly resilient cloud-based solutions, get them off your desk and concentrate on the SLA of your line-of-business applications; these too can sometimes be moved to IaaS (Infrastructure as a Service) platforms where, for a slightly lower cost, you can increase availability, eliminate DR concerns and have on-demand performance and capacity when you need it. For the applications and data that have to stay "in-house" (and there is going to be quite a bit of this for some types of organisation), it's now easy to purchase pre-packaged virtual infrastructure.
For the applications and data that have to stay "in-house" (and there is going to be quite a bit of this for some types of organisation), it is now easy to purchase prepackaged virtual infrastructure. It's pre-validated (read: de-risked) designs like this that also offer cloud DR/backup options, so again, in the interest of getting stress off your desk, the smart money in my opinion is going to be spent on these solutions. The final note I'd add is that as the underlying infrastructure that your apps and data rely on becomes a commodity, the need to worry about DR/BC will dissipate, so look for those opportunities to transform how you run your applications and you'll never have to worry about those pesky zombies attacking.
### Data Centre Trends
Digital Realty's 2014 survey of data centre trends across Europe has revealed that the top expected drivers of data centre capacity growth are storage growth, business growth and virtualisation. The next tier of drivers includes big data, business continuity and data centre consolidation. Some of these trends were also identified in the APAC region, where a similar survey found the top drivers of expected data centre capacity growth are virtualisation, big data and data centre consolidation. How can these needs be met by the next generation of data centres?
Storage Growth
Virtualisation is creating new opportunities to meet storage requirements; SAN and NAS are replacing arrays and offering lower-cost alternatives to traditional models. Organisations have an opportunity to leverage cost savings provided they think more strategically about performance and 'big data' management. Data centre providers that can offer cloud services in tandem with traditional co-location or hosted services offer a new elasticity to provision. This scenario also offers a 'less risky' transition to hybrid cloud.
Business Growth
Business growth and transformation are ably supported by cloud software and services. By transforming IT into an overhead instead of a cap-ex cost, financial risk is reduced and organisations are able to become more agile. Corporate spending on cloud-computing services, software and resources will reach $191 billion in 2020 as companies replace older equipment and programs with Internet-based systems, Forrester projects.
Virtualisation
Virtualisation is a multi-faceted topic, but there is one common thread: IT professionals are suddenly being asked to gain a whole new set of knowledge and skills. The major attraction of looking to new data centre capacity to meet a business' virtualisation requirements is that it is possible to partner with data centre companies who not only have the required new skills and knowledge but also have proven experience.
Big Data
Big Data isn't just about the growth of storage; businesses must also consider management, compute and security, and that's why an experienced data centre operator with a broad portfolio of data centre solutions is a key asset in meeting the challenge big data presents. And it does represent a big challenge for every business. Global IP traffic has increased fivefold over the past 5 years, and will increase threefold over the next 5 years. We have produced more data in the period between 2010 and 2012 than everything we produced up to 2010. The market is set to be worth $25bn by 2015 as the speed of growth accelerates – leading some analysts to dub this the zettabyte era.
Business Continuity
The 'always on' nature of today's commercial environment demands an 'always on' data centre. Digital Realty has achieved 99.999% uptime since 2007. Of course, high availability is only one part of the equation.
Global instability has also pushed disaster recovery up the agenda for a lot of companies, so it isn't a surprise that business continuity features high on the list of drivers. As a driver, it affects both provisioning and location. Large data centres offer strategic advantage in terms of the variety of routes and carriers into the data centre. Many companies are looking to regional data centres to support continuity plans, leveraging improvements in telecoms and cloud computing to minimise risk.
Data Centre Consolidation
Improvements in telecoms and cloud computing are also making data centre consolidation increasingly attractive. While data centre consolidation might be a driver for new provisioning for client organisations, we also see a trend towards smaller and more regional data centres. This strategy is seen as one that can reduce risk and help to manage issues around data residency, compliance and security.
What does this mean when choosing a data centre provider?
Given the changing nature of the data centre, expertise and strategic direction are compelling reasons to partner with specialist data centre providers but, above all, performance considerations still dominate, as a February 2014 Forrester Consulting study commissioned by Digital Realty makes clear. It found the top criteria when making decisions about new data centre facilities are: network connectivity options, carrier availability and carrier density; resiliency level and availability of the data centre facility; level of control over the data centre facility; risk profile of the data centre location; and access to cloud, managed service providers, or other partners. Today's new IT realities – what Gartner dubs the 'nexus of forces' – are creating demand for additional data centre capacity. Control and reliability remain key considerations, but cloud is now also becoming an increasingly important consideration in choices about data centre provisioning.
### Egenera Cloud Suite wins excellence award
Egenera Cloud Suite has been named the 2014 Cloud Computing Excellence Award winner by Cloud Computing Magazine. The Cloud Computing Excellence Award, presented by TMC Global Media's Cloud Computing Magazine, recognizes companies that have most effectively leveraged cloud computing technology to bring new offerings to market, helping customers to utilize the economic potential of the cloud. "We are fully committed to helping enterprises overcome the complexities that arise when dealing with disparate virtual and cloud infrastructures," said Thomas Duda, VP of Sales for the EMEA region at Egenera. "We are thrilled to be recognised by TMC as this award further validates the need for a comprehensive management platform and the value that we provide in filling this need." Egenera Cloud Suite (ECS) combines PAN Cloud Director and PAN Manager to deliver a best-in-class cloud orchestration and management software suite to deploy and manage IT infrastructure as a service (IaaS). ECS delivers the most comprehensive and service-friendly cloud and data center management software solution on the market today. It enables businesses to mix physical, virtual and public cloud resources with agility and efficiency in cloud designs of their choosing. "Recognising leaders in the advancement of cloud computing, TMC is proud to announce Egenera Cloud Suite as a recipient of the Fourth Annual Cloud Computing Excellence Award," said Rich Tehrani, CEO of TMC.
"Egenera is being honored for their achievement in bringing innovation and excellence to the market, while leveraging the latest technology trends." ### Could Amazon's stranglehold be under threat? With infrastructure-based monolith AWS representing a whopping 83% of the cloud market, which is about as close to total domination as you're likely to get, can it's closest rivals actually have a decent look in? There's nothing like a David and Goliath scenario to really set tongues wagging amongst us cloudy commentators. Like a baying crowd akin to scenes from Gladiator, we jostle to get the prime position to see who will come out best against the beast that is Amazon Web Services (AWS). For the past eight years the pioneering infrastructure-based monolith that is AWS has dominated the cloud industry, swatting away it's closet rivals with ease. But, and it's a big but, could the cloud powerhouse's iron grip on the sector be about to loosen? Well, if reports from the likes of analyst Gartner and Synergy Research Group are accurate AWS' cloud revenue in Q2 this year dipped to 38% from 60% in the prior quarter. Add to this a small robust band of deep-pocketed providers snapping at Amazon's heels and starting to make some gains: Google – is emerging as one of the biggest threats to AWS yet, Microsoft – in July its CEO Satya Nadella said its cloud business was on track to bring in US $4.4 billion (bn) in revenue. This would put it a few hundred million dollars behind AWS (the size of which Amazon doesn't disclose, but which has been estimated to account for about US $5bn a year), IBM – said to be on a US $7bn run rate this year, plus a growing number of startups such as Digital Ocean, Joyent and Contegix. It's easy to understand why, with the pressure mounting, Amazon has entered a price war with its closest competitors. Nervous Amazon? At the moment then AWS remains the undisputed leader of the cloud services industry, with Google believed to be one with the best potential to offer real competition given its existing scale and based on the features and reliability of its cloud offerings. In fact, the increased competition has given Amazon the impetus for a series of ‘race to the bottom' price cuts, the main factor behind Amazon's slower revenue growth last quarter according to its CFO Tom Szkutak. Back in March AWS made cuts across the board. On average, the cost of its S3 storage services was reduced 51%. Prices for EC2 compute services dropped around 38%. RDS, the AWS relational database services, had on average a drop of 28% and Elastic Map Reduce was 27% to 61% cheaper, depending on the service. What's interesting is that these reductions were announced a day after Google unveiled price cuts ranging from 32% to 85% on various of its Compute Engine IaaS offerings. So this leaves the two companies' services more or less at parity with each other in terms of cost, if not performance. "AWS still offers far more features in each product and far more products," one CIO friend of mine, who didn't wish to be named, said to me: "Sure you can migrate if you don't need any of Amazon's features, but why would you? Price isn't the driving factor as they are all too similar and in huge competition on that front. The only choice is down to reliability and product selection, you might not use the products now, but what about a few years down the line? So surely it is better to be in the most feature laden walled garden. The grass isn't always greener!" 
Gartner's Research VP Lydia Leong, commenting on her firm's findings, said: "With Microsoft and Google apparently now serious about this market, AWS finally has credible competitors. Having aggressive, innovative rivals is helping to push this market forward even faster, to the detriment of most other providers in the market, as well as the IT vendors selling do-it-yourself on-premises private cloud. AWS is likely to continue to dominate this market for years, but the market direction is no longer as thoroughly in its control." And while the chasing pack have shown growth in the cloud infrastructure services space, and recognise the threats that still loom in the distance, they don't seem to be faltering or wavering in the shadow of the overwhelming influence and resources that Amazon has thrown at the cloud in its effort to stomp out anyone who might try to stand up and challenge its increasingly frightening reach of dominance. So for this year Amazon reigns supreme, despite the odd knock. But as the Internet penetrates every part of our lives, even down to everyday objects like the refrigerator that will someday soon be 'smart' and connected – networking giant Cisco predicts a world where 50 billion devices could be connected to the internet by 2020 – humanity is only going to need more cloud. Indeed, Forrester Research projected the global public cloud market would rise to US $191bn by 2020, up from US $58bn in 2013. So if you take all that into consideration, there is plenty of cloud market pie to go around for the suppliers.
### vArmour raises $36 million in funding
Menlo Ventures, Columbus Nova Technology Partners and Citi Ventures contribute funds to accelerate development of disruptive technology and scale to meet global demand.
vArmour, a stealth security company designed to protect the new data defined perimeter for enterprises, today announced it has raised $36 million in funding, including a Series B round closed in December 2013 ($15 million) led by Menlo Ventures and a Series C round closed in August 2014 ($21 million) led by Columbus Nova Technology Partners (CNTP), Citi Ventures and Work-Bench Ventures, bringing total company funding to $42 million. vArmour will use the capital to scale development and sales teams to help the enterprise protect their new data defined perimeter in today's reality of pervasive virtualization, constant threats and advanced security breaches. "vArmour has the escape velocity to completely disrupt the security industry," said Mohsen Moazami, Founder and Managing Director, CNTP. "We went all in on vArmour as I'm confident that vArmour's continued growth will deliver a solution for the enterprise that can visualize, manage and ultimately protect the data center in a way we've never seen before." "Our investment in vArmour is a testament to the potential for its technology to shape the future of enterprise security for the data center," said Arvind Purushotham, Managing Director at Citi Ventures. "As an organization that supports billions of secure transactions per day, Citi is dedicated to consumer trust and vArmour's management team shares our vision for the importance of enterprise security."
Three-fourths of organizations have active command-and-control communications, indicating that attackers have control of breached systems and are possibly already receiving data from them. As massive, growing amounts of data continue to be distributed on a global scale, security controls need to move deep into the data center and be as dynamic as the applications and data they protect. There, at the data defined perimeter, vArmour provides the needed protection dynamically and securely by giving enterprises instant visibility and control of their East/West traffic flows for both old and new data center architectures. "Advanced targeted attacks are easily bypassing traditional firewalls and signature-based prevention mechanisms. All organizations should now assume that they are in a state of continuous compromise. However, organizations have deluded themselves into believing that 100 percent prevention is possible, and they have become overly reliant on blocking-based and signature-based mechanisms for protection. As a result, most enterprises have limited capabilities to detect and respond to breaches when they inevitably occur, resulting in longer ‘dwell times' and increased damage," wrote Neil MacDonald and Peter Firstbrook of Gartner. "In reality, going forward, improved prevention, detection, response and prediction capabilities are all needed to deal with all types of attacks, ‘advanced' or not. Furthermore, these should not be viewed as siloed capabilities; rather, they should work intelligently together as an integrated, adaptive system to constitute a complete protection process for advanced threats." As part of the investments, Pravin Vazirani, general partner at Menlo Ventures, and Mohsen Moazami of CNTP will both join the vArmour Board of Directors. Recently, Lane Bess and Dave Stevens, both previous CEOs of Palo Alto Networks, were also announced as investors and vArmour board members, offering their combined 60 years of experience leading and growing security, data center and tech start-ups, including current roles at Brocade and Zscaler. The company, which has been in stealth mode since its founding in January 2011, previously announced a $6 million Series A led by Highland Capital in January 2013 and is now prepared to launch later in 2014. "Our investors saw first-hand vArmour's large scale deployments of advanced data center security technology on a global basis and they realized that we have created something extremely valuable that solves a strategic issue for the enterprise," said Tim Eades, vArmour CEO. "Today, our board represents some of the best minds in security with Lane and Dave from Palo Alto Networks, now joined by Mohsen from CNTP and Pravin from Menlo. With our all-star team throughout the US, EMEA and APAC, we're in the best position to bring this technology to market and forever change the rules for security in the data center." Those interested in learning more about the newly defined data perimeter and East/West traffic flows are invited to attend an upcoming webinar on August 20. For more information on the webinar, presented by vArmour, distinguished ESG analyst Jon Oltsik and Demetrios "Laz" Lazarikos, Blue Lava Consulting IT Security Strategist and former CISO for Sears Online, visit vArmour.com.
### CenturyLink launches new Private Cloud
CenturyLink launches new, globally available Private Cloud service in 57 data centres. Federation with CenturyLink public cloud dramatically simplifies hybrid deployments. CenturyLink, Inc.
today announced the availability of CenturyLink Private Cloud, a new cloud service that delivers the agility and innovation of public cloud but with the physical isolation, dedicated hardware and security standards that many large enterprises require. Each private cloud instance is federated into the CenturyLink Cloud network of public cloud nodes, giving users a single interface for creating and maintaining a hybrid environment that spans both public and private clouds. The end result is greater agility for businesses across a broader range of workloads – even those that require additional security and compliance. "CenturyLink continues to realise our vision for business-friendly hybrid IT solutions, with public cloud, private cloud and network connectivity all available from one provider," said Andrew Higginbotham, senior vice president, cloud and technology, at CenturyLink. "CenturyLink Private Cloud delivers the best of private cloud – from dedicated hardware and physical isolation to enterprise-level security and service-level agreements – along with our truly innovative public cloud experience, featuring advanced self-service automation and a fast pace of feature innovation." CenturyLink Private Cloud – built on CenturyLink’s public cloud platform and expertise – was developed to meet the needs of large enterprises: Furthers hybrid IT capabilities – Private Cloud connects with public cloud running the same technology platform; enterprises can also leverage other CenturyLink solutions, including network and opt-in managed services for critical applications. Runs on the same industry-recognised platform as CenturyLink's public cloud – Innovative cloud management, automation and orchestration capabilities, as well as new features as they are released. Delivers geographic flexibility – Enterprises can deploy private cloud in any of CenturyLink’s 57 data centres located in 34 cities around the world. "Hybrid cloud is the enterprise IT operational model of the future," said Melanie Posey, research vice president, hosting and managed network services, at global analyst firm IDC. "CenturyLink Private Cloud provides a frictionless hybrid cloud option for organisations that want to source private and public cloud services from a single provider and connect cloud with other IT services through an extensive Tier 1 IP network." The launch of CenturyLink Private Cloud builds on CenturyLink’s recent cloud advancements, which include industry recognition as a leading cloud provider, the availability of cloud-based managed services and new CenturyLink Cloud pricing. About CenturyLink Technology Solutions CenturyLink Technology Solutions delivers innovative managed services for global businesses on virtual, dedicated and colocation platforms. For more information, visit www.centurylink.com/technology. About CenturyLink CenturyLink is the third largest telecommunications company in the United States and is recognised as a leader in the network services market by technology industry analyst firms. The company is a global leader in cloud infrastructure and hosted IT solutions for enterprise customers. For more information, visit www.centurylink.com. ### Is the phone in decline? Day to day my meetings invariably include sharing contact details – email address, phone number and, a little less frequently, social links and instant messaging IDs. The phone number most often shared is the mobile, rarely ever a landline. 
As consumers we are just not anchored to one spot, and as such the landline is no longer as important as it used to be. We talk on the move, share documents and close deals without an office in sight. In fact, doing business in just one place is increasingly uncommon. Businesses have become mobile. With staff spread over different geographic regions, on trains and across continents, the landline phone is effectively dead. For the average consumer, broadband and a mobile phone provide all the connectivity needed. The importance of both has begun to inform our approach to communications. We're not talking ringtones; we're talking about talk, messaging and data. When confronted with a landline, many of us are frustrated by how 'dumb' landlines can be in comparison to smartphones. The metaphor of the old phone is still useful. As the mobile proves, we need a kind of phone. We need something to talk with, especially in the office – for those of us who still need an office. The truth is that it is still more comfortable to talk on an old-style receiver than press a thin mobile phone to your ear for an hour. But this is where the similarities end – shape and comfort. The phone has been rebooted with VoIP. VoIP is a mouthful, and it stands for Voice over Internet Protocol. What it means is that, like our mobile phones, telephone services are delivered by a network, often the Internet, and not limited to location. The receiver may look like an old phone but it has a greater range of features. From text messaging and auto-receptionist services to access from anywhere in the world, this new phone is more capable and offers greater benefits. Where the landline just rang blankly, VoIP allows us to download call data from anywhere in the world – home and away, in a coffee shop or in the airport. It is the flexibility of VoIP and our changing expectation of communication that has relegated the old phone. Online telephony allows businesses flexibility and focus. "VoIP is an upgrade for your business," comments Ryan Macapagal, Affiliate Manager at VoIP provider RingCentral. "Whether you need to open a new office on the other side of the country, adjust your business hours on the fly, or prioritise calls from your most important contacts, online telephony makes it possible to adjust how you work, as you work." My old phone was tied by copper wire to a specific corner of my office. With the Internet, cloud computing and a generally more connected world we are not confined to a single spot on the globe. Or in my case, having a phone on the right side of the desk nearest the wall socket. Businesses need to be, and are becoming, more mobile. "We are no longer tied to desks," adds Ryan Macapagal. "We are seeing a shift to mobile, where our habits from home have followed us to work." The phone as we knew it is an antique, a relic of a different age. VoIP offers businesses of every size the sort of communications tools that only the tech giants and global companies could afford. The Internet and online telephony are delivering better communications and have swept the old landline aside.
### Cost of security breaches rises
Egress Software Technologies, a leading provider of hosted and on-premise encryption services, has produced a follow-up infographic, Information Security Breaches Survey, to the market survey first published in July.
The 2014: The Year of Encryption market survey has shown that the cost of breaches has nearly doubled in the last 12 months. This year we have already seen the damage caused by the Heartbleed vulnerability, but human error has also been a contributor. Highlights of Egress' 2014: The Year of Encryption are illustrated in the infographic below. For more insight on the findings please visit the Egress website.
### BYOD and the IT Crowd
With summer about to end, IT managers might be thinking on their return about how to manage staff - can hosted desktops be the missing magic, and how do they fit with BYOD? Bring-Your-Own-Device (BYOD), or as one contributor described it, "K-BYOD (Kiss goodBye to Your Data)", is a phenomenon that has been on the rise for the past few years. As technology moves in leaps and bounds towards a 'Star Trek-styled' era, mobile devices are at the forefront of a revolution - the mobile office. But as with most tech revolutions there comes a new set of sleep-depriving nightmares for your company's IT crowd: the extra layer of security needed and the headache of disparate pieces of kit that don't have proper corporate management tools. And if leading tech researcher Gartner is right, by 2016 38% of companies will expect to stop providing devices to their employees, and by 2020 85% will implement a BYOD program. That makes for a rather short window of opportunity for IT to get its BYOD act together. Creating efficient BYOD practices requires a BYOD policy. Enter stage right - the Hosted Desktop! Hosted desktops come in many flavours and configurations, but basically the technology allows servers to run virtualised Windows desktop sessions accessible via a simple web browser. From an IT perspective, this makes the client device largely irrelevant. Sounds great for BYOD, right? Well, yes, it does - and it needn't be a massive headache for the IT department to retain control and visibility over devices, without recourse to draconian BYOD policies or hugely expensive security measures that will, invariably, be reactive and require considerable investment in management. "What corporate IT want," comments Jeff Dodd, CEO for cloud-based managed IT services provider entrustIT Europe, "is a thin layer that goes onto an employee device and allows access to all the corporate systems if enabled by the business but that leaves no data footprint on the device once access is rescinded. "A hosted desktop is rendered on the device but never stores data locally. If it's Citrix based, a hosted desktop will run on almost any device from phone to workstation and on OSX, iOS, Android, Windows and Linux. Of course a full desktop on a phone requires Steve Austin's bionic eye but you can also publish an application the same way - something compact and bijoux made usable via the power of 3G. Best of all, when access is revoked the applications and desktops stop working but the data never left the building so you remain secure." "Drop, break or lose the device and the desktop keeps right on running behind it - you log in again from somewhere else. Coupled to something like Citrix ShareFile you get corporate file-sharing for the times when offline access to your super-duper corporate presentation is a must."
Hosted desktops will work 24/7, 365 days a year on any web-enabled device, and only need a tiny amount of bandwidth to be super-responsive. As Jeff puts it: "It's the real 'martini' IT solution: anytime, anyplace, anywhere!" Adding: "Seriously - why would any overworked IT supremo deploy anything else, even inside his organisation? One hit management and deployment, and coupled to a good quality managed service/hosted desktop provider the support issues are minimal." Moving away from the IT department, what's in it for the business? Joseph Blass, CEO, WorkPlaceLive, told me that, "A business doesn't necessarily need to increase spend in order to increase revenue; sometimes it is enough to generate more productivity out of existing resources. What better way to increase productivity than enabling employees across the business access to their full work desktop and their entire suite of business software applications - from their CRM database to their accounting packages, as well as their emails, files and data - from any location, using any device. Wherever employees go, their office goes with them, allowing them to work productively always. While in the past such practices used to involve security risks such as employees forgetting their computers on a train, the Hosted Desktop is typically even more secure than the employee sitting by the desk in the office." And Russ Herringshaw from cloud comms provider Cobweb Solutions agrees: "A BYOD policy provisions a mobile workforce, offers increased productivity and flexibility, as well as controlled costs and employee satisfaction." As for the view of the IT set? One IT manager friend of mine said: "Hosted desktops and BYOD ultimately fit together, like pieces of a puzzle, to form a harmonious big picture of tomorrow's world. Resistance is now well and truly futile. Young and incoming workers expect to use a tablet and a smartphone, whilst the generation of workers set to be the next batch of senior employees have become accustomed to these devices. My advice - find the right IT solution; it will certainly save you some sleepless nights!"
### A new class of cloud
University fees are at an all-time high. In fact, last month's figures published by the Office for Fair Access (OFFA) highlighted that three-quarters of universities and colleges in the UK will charge £9,000 for at least one course in 2015/16 – meaning the cost of a typical three-year course will soon exceed £26,000 for the first time. So it's not surprising students are demanding more from the experience they get at university or college. Throw into the mix the huge pressure on students to secure a job after their studies – with figures from the Association of Graduate Recruiters (AGR) indicating there is still an average of 85 applications for every job in the marketplace – and students are quite rightly being very selective about the place they choose to study. Along with colleges' and universities' entry standards and graduate prospects, students will often look at how an institution is rated when it comes to student satisfaction, and that's where technology plays a big part. Today, many further or higher education facilities are using cloud technology not only to enhance the experience students get while studying, but also to give professors and tutors invaluable insights through data on their students to help deliver the best possible teaching in a way that best suits each individual.
Classroom in the cloud
A good example of some of the latest cloud technology in action is at London South Bank University (LSBU), which, in collaboration with IBM, is investing more than £14.8 million into providing an exceptional student experience. The university is using a mix of analytics, mobile, social and security solutions built on a cloud infrastructure to monitor the academic progress of individual students. The principal idea behind the use of cloud in education is simple – not every individual learns in the same way, and one size certainly doesn't fit all. So the cloud technology being used at LSBU is helping students engage and learn in a way that works for them – wherever, whenever and however they want to. Personalising the experience a student undergoes at LSBU can start from the very moment they decide to enrol on a course, right through to employment and lifelong learning. To support the introduction of the programme, LSBU's entire IT environment is moving to SoftLayer, an IBM company cloud computing platform. Adopting a cloud strategy is greatly increasing the university's agility, enabling educators and administrators to offer new services to students at busy times at a much faster rate to manage the ever-changing performance demands of today's digitally-connected students. In addition, by moving the infrastructure online, the space taken by the original data centres can now be transformed into additional teaching space. Moving to the cloud allows LSBU to be interconnected and intelligent when monitoring its students' academic progress, creating a holistic overview of every single student's personalised learning experience. It aligns with predictive analytics, enabling the university to identify students who may require additional support with their studies, and to intervene with those students to offer any help that may be needed to support retention. The platform also offers social collaboration for richer curriculum delivery through a personalised portal providing anywhere, anytime, any-device access to on-demand learning. On the flip side, it provides educators with the training materials, curriculum guides, software and hardware needed to teach in-demand business and technology skills. LSBU isn't the only educational institution using cloud to boost student retention and satisfaction – other institutions up and down the country are also using the cloud to boost the student experience and attract prospective students. For example, Brockenhurst College is using cloud technology in collaboration with IBM and Wessex Education Shared Services (WESS) – a public sector innovation organisation – to transform the learning experience, while also shifting the focus back to its core business by moving auxiliary functions such as finance, payroll, procurement and human resources into the cloud. Meanwhile, Birmingham Metropolitan College is using cloud-based tools such as file sharing, web conferencing and instant messaging to transform the learning experience and provide the ultimate 'classroom in the cloud', and has the capacity to accommodate more than 26,000 students based in the UK and an increasing number based overseas.
Attracting talent
Elsewhere, education institutions are looking to maximise their online presence to attract prospective students and to deliver a powerful fusion of web visitor profiles, analytics and digital marketing execution that empowers them to gather more information on prospective learners, and use it to improve their recruitment experience.
Previously, one such college had no way of tracking how prospective students interacted with its website and social media pages; the online recruitment journey ended as soon as a visitor left the College website. Thanks to its cloud technology, the college can capture all of the digital interactions of visitors as they move around its website or social media pages and fuse them with their manual interactions to provide a single view. Once a potential student leaves the website, the college can use this information to create personalised banner advertisements on other webpages that proactively reach out to entice learners back to the College's website. The technology has predicted the College will increase its student numbers by 20%.
Cloud without limitations
Ever-changing global economic and job market trends are having a huge impact on student appetites for learning, and with higher tuition fees than ever before, students now expect a great deal more from educational institutions. As a result, to be effective, education must become more personal, engaging and efficient than ever before. The agility of cloud technology today holds the promise of enabling a truly personalised and exceptional educational experience, which has the potential to transform the entire education industry. With competition at an all-time high, institutions that don't invest in innovative ways of offering new services and improving student satisfaction risk falling behind.
### Productivity Wars
If Satya Nadella is right to see productivity as a key battleground for tech market dominance in today's mobile-first and cloud-first world, Amazon's recent offering – the Zocalo cloud-based office productivity service – should represent a gutsy offensive action.
A new front in the war for cloud supremacy?
But has the battle for supremacy in the document-sharing/office productivity space really opened up another front in the war for overall market dominance? Or is it a case of 'offence as the best form of defence'? Microsoft's cloud revenue is closing in fast on AWS: Nadella said this year it was on track to bring in about $4.4bn in revenue, compared with AWS' estimated $5bn a year. Amazon's Zocalo announcement was greeted with enthusiasm. Certainly it represents a serious challenge to Box and Dropbox, offering a comparable service at a greatly reduced price (especially for Amazon WorkSpaces customers). And there is an irony, as Ben Kepes noted in Forbes, that Amazon is competing with its own ecosystem – it has hosted the dominant cloud player in this space, Dropbox, on its S3 cloud service for years. But what does the launch mean in the wider context?
Initial skirmishes
The main pro in Zocalo's favour seems to be the integration with Active Directory. The main con, its lack of native ability to create or edit content. This lack means it can't really compete with Google Docs or Office 365, but it offers, at least, the same as Dropbox. It remains to be seen whether Amazon will offer a document creation tool eventually – whether something similar to Box Notes, something conjured out of nothing, or maybe even an openoffice.org derivative. I think it is unlikely. I think Amazon's entry into this space – although it can be interpreted as provocative – is really rather incidental. It is equally possible to interpret it as a move to change perceptions and challenge security objections about cloud-based document sharing and, ultimately, cloud in general.
Chipping away at a resistant potential market Dropbox has demonstrated that there is both an appetite and a lucrative market for office productivity/ document sharing tools. But it is also obvious that one of the major constraints on this market is enterprise fears about security. Several high profile companies have announced they block Dropbox and it is these enterprise clients that Amazon seems to be addressing with Zocalo. Dropbox utilises its own authentication; there are no trust relationships. With its integration with Active Directory, Zocalo offers the enterprise user a significant benefit – a federated authentication model. It is repositioning the cloud-based document sharing tool for an enterprise user for whom IT is making the decisions. It is attempting to capture those enterprise users whose IT departments don't like the lack of visibility and control over information that Dropbox represents; those organisations which have blocked Dropbox and apps like it. The ability to federate with Active Directory means that when Zocalo users wish to synchronise across devices they must be synchronising across company-controlled devices. They aren't going to be downloading company information outside the company's security domain. Small steps towards answering concerns about risk It's an opportunity to bring more enterprise users to a cloud environment. To replace legacy enterprise-hosted file sharing applications and entice businesses to utilise AWS services. It makes its enterprise cloud storage more attractive. And that's the difference: for all Nadella's talk of productivity it is the other three elements of Nadella's vision that will be his battlegrounds: 'cloud-first', 'mobile-first' and 'platform'. For Microsoft, repositioning around mobile and cloud is both belated and necessary. The Amazon launch of Zocalo seems to be more about strengthening an existing position. Strengthening its position in the enterprise cloud space will be important as Microsoft's game of catch up begins to make real inroads. Perhaps Amazon’s move is really a case of 'defence as the best defence'. As Microsoft's sales of Office 365 take off, even Apple is defending this territory. Its new iOS8 will offer users improved file management and access via the new iCloud Drive. Is this all a case of defensive 'me too'? Or does this focus on productivity tools represent the first skirmish in a back-up strategy? After all, a price war has to end at some point, and if you're not on the winning team at that point it has to become all about the value-add. ### Citygate upgrades with Node4 Citygate simplifies IT management with a centrally managed cloud-based system from Node4. New and used car dealer Citygate has been able to reduce costs, simplify IT management and increase the flexibility of its business operation since partnering with Data Centre and communications specialist Node4 by implementing a centrally managed cloud-based system. The IT and telephony infrastructure upgrade has also enabled Citygate to expand to two new business locations, with the IT provisions up and running within 24 hours at both sites. The way IT functions is crucial to Citygate's operation - maximising employee productivity, ensuring smooth interactions with customers and also enabling the business to expand to new sites quickly and efficiently. 
However, Citygate's legacy system, with dedicated onsite infrastructure and phone systems for each of its nine sites, was difficult and costly to manage, and network connections between the various sites were also very limited and unreliable. This translated into wasted time for the whole team, and the company was not able to reliably offer the levels of customer service that it wanted to; as a result it was severely hindering the business. In order to achieve the desired efficiency and performance, Node4 first of all looked at the network, as this would be the foundation of any future solutions. As a result, connectivity for all voice and data traffic to all of Citygate's sites is now delivered via Node4's high-speed, secure and highly resilient MPLS network. This means that all of Citygate's sites now benefit from a centralised cloud-based infrastructure that integrates all of its IT provisions into one central point rather than having separate infrastructure at each individual site. Telephony systems have been removed from each site and replaced by a central hosted switch at Node4's Northampton data centre. The switch to IP telephony also delivered instant cost savings to the business through reduced maintenance costs, and brought in new functionality to improve both the user and customer experience. Citygate's email server and all other critical equipment have also been relocated, benefiting from central management. The new infrastructure has significantly enhanced network bandwidth, with higher-capacity connections at every site dramatically improving the speed of the network for the whole group. Central management of the system has also meant fewer glitches and less downtime, freeing up Citygate's staff to concentrate on their own business. Citygate is now looking at rolling out a fully cloud-based solution for its email and business applications to further enhance its IT service delivery and flexibility. Peter Dickey, Commercial Director at Citygate, commented: "We now have complete confidence that our IT infrastructure is built to best practice standards, and having the system centrally managed by Node4 frees us up to focus on what we do best. Not only that but we have a fantastic foundation to explore other technologies that could improve our operation, and we know the infrastructure is capable of scaling up further down the line to accommodate future growth." Paul Bryce, Business Development Director at Node4, said: "We always want to help our customers solve their business challenges - it's not just about selling a solution. We could see that Citygate's legacy infrastructure was piecemeal and not joined up, and they didn't even really know how vulnerable the system might be. We know the automotive trade well, so we were well aware that centralising the infrastructure and fully managing the system with full support was going to give Citygate the peace of mind they needed to focus on their core business."
About Node4
Node4 is a Cloud and Data Centre specialist that in 2014 is celebrating its tenth anniversary. Since 2004 the company has grown rapidly through its comprehensive service offering and the growth in demand for hosted IT from businesses all over the UK. It has four state-of-the-art Data Centre facilities, two located in Derby, one in Leeds and one in Northampton, which offer the latest in security technology, ensuring that even the most mission-critical applications are hosted securely and protected.
Its leading solution, N4Cloud, is a key part of the company's wide product portfolio. For more information, please visit: www.node4.co.uk
### How cloud computing is changing education
Technology commentator Rick Delgado takes a look at cloud computing and finds it has benefits for education as well as businesses. When people talk about cloud computing, they usually approach it from the angle of how it can help businesses. While the technology certainly can provide a boost to companies both large and small, there are other sectors that could benefit from the cloud. Perhaps the most promising of those sectors is education. As the economy continues to struggle and governments tighten budgets, education is usually one of the areas hit the hardest. That's why cloud computing is becoming such an integral part of every school district's outlook. Some experts even predict 35% of schools' IT budgets will be devoted to cloud computing by the year 2017. As the demand for cloud computing grows, education looks set to change.
Cost Savings
So how can cloud computing help educational institutions? In basic terms, cloud computing moves the IT infrastructure out of the school building. With this move, many of the costs associated with supporting servers, applications and data are reduced. Service providers can manage server maintenance, system and application upgrades and even storage costs more effectively and at lower cost. Better still, software and infrastructure can be paid for as needed and actually used (a pay-per-use model). These cost savings benefit any organisation. For educational institutions there are greater rewards in the cloud than just reducing support costs.
Paperless Schools
The cloud really benefits education by transforming the classroom. Most schoolwork can be completed online. As this happens, classrooms are slowly going paperless. Assignments can be completed on the cloud, eliminating the need to turn in papers for grading purposes. Essential documents, classroom rules and project guidelines can also be accessed on the cloud, a change that not only helps the environment by saving on paper, but also lessens the workload on teachers needing to copy materials.
Backup & Storage
The cloud also allows students and teachers to back up their assignments easily. As long as students and teachers are connected to the Internet, they can work with their files. This cloud back-up also serves as an emergency storage system for school districts. Educational organisations have a lot of student data. Losing this data can be a major headache to say the least. That's why having a cloud back-up can prove to be invaluable in the event of disaster.
Classroom Collaboration
The change in the classroom has also introduced wireless devices, with students bringing their own laptops and tablets to school. Bring your own device (BYOD) is becoming a vital element of school life. With the convenience of these wireless devices, the cloud enables students and teachers to collaborate on their assignments. Students can work together simultaneously on projects, in turn learning how to cooperate and get the most from their skills. Teachers also benefit by sharing lesson plans with other teachers. The result is a greater willingness to share and more engagement.
A Growing Market
According to a recent report, the cloud computing in education market was worth more than $5 billion in 2014.
It is expected to grow dramatically and could be worth more than $12 billion by 2019. Other reports show that nearly half of all higher education institutions have prioritised moving to the cloud. Saving costs on IT expenses is an important incentive, but so are the other benefits. In short, cloud computing in education is growing, and that doesn't appear to be letting up anytime soon. As education embraces the cloud, institutions will see students become more engaged in the learning process while teachers become more productive. Concerns over privacy and security will need to be addressed, and school districts will need to decide whether a public or private cloud best serves their needs, but the benefits are clear. Cloud computing is changing education for the better.
### CWCS launches next-gen cloud
CWCS Managed Hosting has announced major upgrades to its cloud infrastructure, boosting performance, reliability and speed, with the launch of Supreme Cloud. Karl Mendez, Managing Director of CWCS Managed Hosting, stated, "Our public cloud just got a whole lot better. As well as improving speed and performance, it is greener and reduces costs." The new enterprise-grade, fully redundant EMC SAN solution includes state-of-the-art hardware with the latest SSD-based caching functionality. This infrastructure upgrade is one of the first implementations of brand new higher-density disks. The platform provides greater capacity, allows modular expansion with zero downtime, increases reliability and delivers superior performance. It is also more energy efficient. CWCS is constantly improving its products and services through continuous innovation. The infrastructure has multiple physical servers with all points of failure taken into account. If one node or hardware component fails, there is ample redundancy and fail-over, so that there is no single point of failure. This ensures exceptionally high levels of reliability, resilience and up-time. Mendez added, "We are genuinely proud of our Supreme Cloud infrastructure and it is great to be able to pass on reduced costs and improved performance directly to our customers. We always aim to provide the best products, along with what we consider to be the best customer service and technical support in the industry."
About CWCS Managed Hosting
CWCS Managed Hosting is a specialist business and enterprise-level managed hosting company offering cloud servers, private cloud, hybrid cloud, dedicated servers and co-location. CWCS was founded in 1999 and operates from two highly secure data centres in Nottingham, UK. The company also has US and Canada data centre facilities. For more information about CWCS Managed Hosting visit: www.cwcs.co.uk
### Corintech launches FilesThruTheAir
Cloud-based monitoring software and sensor solution allows users to monitor data anytime, anywhere. Corintech has today launched FilesThruTheAir, a new comprehensive environmental monitoring solution combining advanced WiFi sensors with a remote Cloud-based data logging service. Measuring temperature and humidity, data logged on FilesThruTheAir WiFi sensors is universally accessible from any Internet-enabled device through the FilesThruTheAir Cloud-based monitoring platform. FilesThruTheAir WiFi sensors communicate via any existing WiFi connection, updating the data automatically to the Cloud, providing maximum simplicity to the data gathering process.
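To picture the data-gathering flow just described, here is a minimal, hypothetical sketch of a sensor or gateway pushing a single temperature and humidity reading to a cloud logging endpoint over an existing network connection. The URL, payload fields and API key are invented for illustration only and are not Corintech's actual FilesThruTheAir interface.

```python
import requests  # assumes the 'requests' library is installed

# Hypothetical endpoint and credentials; the real FilesThruTheAir API may differ entirely.
ENDPOINT = "https://cloud.example.com/api/v1/readings"
API_KEY = "demo-key"

def push_reading(sensor_id, temperature_c, humidity_pct):
    """Send one environmental reading to the cloud logging service; return True on success."""
    payload = {
        "sensor_id": sensor_id,
        "temperature_c": temperature_c,
        "humidity_pct": humidity_pct,
    }
    response = requests.post(
        ENDPOINT,
        json=payload,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    return response.status_code == 200

if __name__ == "__main__":
    ok = push_reading("warehouse-01", 21.4, 48.0)
    print("logged" if ok else "failed")
```

The appeal of this kind of design, as the article notes, is that the sensor only needs an ordinary WiFi connection; everything else (storage, access control, multi-user dashboards) lives in the cloud account.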
Ease of use is also key to the FilesThruTheAir Cloud account set-up; the browser-based interface takes the user through the set-up process step by step. Users can choose between three levels of Cloud account, from a free entry-level service for simple implementations to a comprehensive professional service that supports unlimited devices in multiple locations and can enable data to be accessed by multiple users. Tim Waterman, Technical Director at Corintech, said, "By giving users the freedom to access, manage and analyse their information remotely in the Cloud, FilesThruTheAir introduces a whole new dimension to modern data logging with the most flexible and comprehensive solution to date. As alternative solutions use niche or proprietary wireless standards, we saw a clear gap in the market for a solution that enables high specification sensors to take advantage of existing WiFi networks in combination with a robust and dedicated Cloud monitoring service." Waterman added, "When developing the FilesThruTheAir product range, our main objective was to streamline the environmental monitoring process for our customers. By developing a robust Cloud software that integrates seamlessly with our advanced sensor range we are very pleased to bring the most complete but easy to use solution to the market that removes the burden of data gathering."
About Corintech
Corintech develops and manufactures environmental monitoring devices, offering PC or Cloud-based data logging systems incorporating its FilesThruTheAir™ WiFi sensors. www.corintech.com
### Apps World Europe reveals keynote line-up
Apps World Europe reveals a keynote line-up that includes a Google Glass pioneer. Apps World Europe has announced the first of its keynote speakers for the show, with a line-up that includes Babak Parviz, Google Glass pioneer and now a Vice President at Amazon. Parviz is one of a number of experts who will present keynote speeches at Apps World Europe, which will move for the first time to London's ExCeL in November. The keynote speakers are among the more than 200 speakers who will be presenting at the show, which is due to attract more than 12,000 attendees for 2014. Other keynotes include As Seen on TV founder, entrepreneur and investor and now Shark Tank judge (the US equivalent of Dragon's Den) Kevin Harrington – the man also credited with creating the infomercial in 1984. Another famous TV face is Jason Bradbury, presenter of The Gadget Show UK & US, whose television career spans nearly 20 years. Other keynotes include Peter Molyneux, founder of Lionhead Studios and one of the most recognised faces in the video games industry; Sir Nigel Shadbolt, Co-Founder & Chairman, ODI - Open Data Institute, whose 33-year career saw him knighted in 2013; Gerard Grech, CEO, Tech City UK, whose international experience has now taken him to new heights building Tech City UK; Takeshi Idezawa, Chief Operating Officer of LINE; and Bill Liao, European Venture Partner of venture capital business SOSVentures. Ian Johnson, founder of Apps World, said: "Apps World has grown hugely as a brand over the last five years and will this year attract more than 12,000 attendees to our European show in London. Our success is based on the strength of the keynotes we attract to speak at the show – a list that is constantly being added to. Apps World Europe 2014 promises to be the best show yet." Apps World Europe will be held at London's ExCeL from 12-13 November.
For more information about our speakers visit www.apps-world.net/europe/agenda/keynote-speakers. For more general information visit www.apps-world.net/europe. ### Node4 acquires LETN Solutions Expanding its cloud offering Node4 acquires cloud infrastructure firm LETN Solutions. Node4, the cloud and data centre specialist, has completed a strategic acquisition of LETN Solutions, a leading cloud infrastructure company. The acquisition will extend Node4's existing portfolio of cloud solutions allowing it to further its Infrastructure as a Service (IaaS), Disaster Recovery and Virtualisation service offerings. LETN is a Reading-based cloud infrastructure company. Since 2005, the company has offered managed support, professional services and hosting services to its customers. By merging LETN's team and portfolio with its own, Node4 has enhanced its cloud offering by introducing LETN's evoCloud and evoPod services, which support packages such as Infrastructure as a Service (IaaS), Disaster Recovery and Virtualisation. Additionally, LETN is a NetApp specialist and one of only 18 qualified Authorised Support Partners (ASPs) in the UK. The acquisition will see LETN's Reading office, services and employees become a part of Node4. All services currently provided to LETN customers will continue, and every customer will have the opportunity to take advantage of other Node4 services. This includes further cloud offerings as well as colocation, connectivity and communication packages. Andrew Gilbert, Managing Director, Node4 commented: "This is a strategic acquisition for Node4 given LETN's capability and expertise around cloud computing. We have been placing increasing amounts of emphasis on our cloud-based solutions in the past year and this move reaffirms our belief in the opportunity cloud brings to our customers and the industry. There is a common belief in technology and customer focus that will provide great synergy between our two businesses. Furthermore, given our regional business model, LETN's location in the South is of great strategic benefit to Node4." David Mearing, Co-founder and Chairman, LETN Solutions added: "Combining LETN and Node4's respective solutions range and expertise means customers of both organisations will benefit from an enhanced service with greater choice. Node4 is a company that shares our core business values of delivering industry-leading technology, excellent customer service and expert advice." As part of this acquisition, Steve Denby and Gregg Mearing, Co-founders of LETN Solutions, will take up positions within Node4's management team. Advising LETN were M&A specialists, Winchester Technology Advisors (WTA) and lawyers, Goodman Derrick, whilst business analysts, PwC and lawyers, Pinsent Masons worked on behalf of Node4. About Node4 Node4 is a Cloud and Data Centre specialist that in 2014 is celebrating its tenth anniversary. Since 2004 the company has grown rapidly through its comprehensive service offering and the growth in demand for hosted IT from businesses all over the UK. It has four state-of-the-art Data Centre facilities, two located in Derby and one in Leeds and one in Northampton, which offer the latest in security technology, ensuring that even the most mission-critical applications are hosted securely and protected. Its leading solution, N4Cloud is a key part of the company's wide product portfolio. 
For more information, please visit: www.node4.co.uk About LETN LETN Solutions has been providing Managed Support, Professional Services and Hosting Services since 2005. LETN was formed specifically to address the needs of small and mid-size organisations that required intelligent design, implementation and support for emerging technologies. ### Opportunity on the fringes of a price war I suppose that, like me, we are all bored of hearing about the "race to the bottom" for cloud pricing. It is like walking around a traders' market with all the stalls selling the same product. The only difference is price, and how loud the traders shout for your attention. Step away from this loud, aggressive marketing, ignore the price cutting and look around. Guess what? Our traders' market is full of customers using smartphones, tablets and laptops. These devices are running applications. We rely on these applications. Next, find 10 people in your office (even better if they have no interest in IT) and ask them if they care about the IaaS layer or the costs saved by its use. You will probably get a bemused look. If they are still paying attention, ask the same 10 people if they care about how the applications they are using work. Ask whether those applications are simple and efficient enough to deliver value for the tasks required. Where you had blank expressions before you will certainly have strongly voiced opinions now. As the cloud price war delivers greater value for the expenditure, it is squeezing the market, leaving only a select few monolithic providers. The next generation of users wants powerful simplicity. They want beautifully designed interfaces that allow them to be productive. So where does this story take us so far? We all need a clear and specific goal. Without one, pricing is as unimportant as the features and benefits promised. Managed service providers (MSPs) are beginning to understand this. And, working with independent software vendors (ISVs), they are starting to see a window of opportunity beneath the price war. Businesses, like consumers, need a clear ultimate goal. And the application of this goal has got to be simple. Everyone really wants support out of the box, easy installation and an interaction that supports productivity. The "whatever" as a service the cloud is delivering for a business has got to be a service that is useful and aligned with the goal. My personal view is that we are heading into a world of fewer choices. As the cloud price war delivers greater value for the expenditure, it is squeezing the market, leaving only a select few monolithic providers. Scale, processing power, geographic distance and connectivity are so hugely important in delivering a cloud product that providers have to be global-sized businesses just to compete. Only the big traders can offer the kind of price discounts that the marketplace is seeing now. But in all this frenetic activity of discounting there are still opportunities to be had. While Amazon is always going to be able to undercut Dropbox on its own AWS, the fact Dropbox exists demonstrates opportunity beneath the giants. ISVs are targeting users with features and productivity tools that are missing from our new, mobile device-led lives. For example, here at Compare the Cloud we have been trying Wunderlist and Todoist in the battle for task management. Apart from the obvious reasons, why do we need task management? Because, not naming names, our cloud-based productivity suite doesn't deliver a complete, useful, out-of-the-box experience. 
Beneath price is quality and customer need – what the end-user really requires to get the job done. Working together, MSPs and ISVs can benefit from the price war margins. The slightly disappointing application experience of IaaS layers sold in our traders' market is fuelling opportunity. Cost-cutting headlines aside, the race to the bottom is at least helping to create a new, burgeoning ecosystem of mobile and application developers and niche service providers. ### Data Centre take-up has doubled Data Centre take-up has doubled year on year, says new research. New research just released by GVA Connect, the data centre specialist division of property adviser GVA, showed that the take-up of data centre space by end-users - including corporate occupiers, government agencies and IT integrators - more than doubled in the first half of 2014 compared to the same period last year. "With continued acquisition by data centre operators through 2013 and a sharp increase of end-user take-up in Q1/2014," said GVA Connect director Charles Carden, "our research shows the continued growth in take-up of both co-location and wholesale data centre space by end users, which is rewarding those commitments by operators to expand existing sites as well as developing new sites." Total take-up, including operators and end-users, in H1/2014 increased by 14% compared to the same period in 2013. Within that, the volume of end-user transactions more than doubled from a total of 13MW in H1/2013 to 30MW in H1/2014. "Transaction volumes have increased significantly," said Carden, adding that while a small number of transactions were recorded for 1MW plus of capacity, the underlying trend across all sectors was typically for 50kW plus, on a scalable basis to allow for projected growth. The latest research showed once again that London and the South East continue to lead the way as an international technology hub, with almost 70% of total UK take-up in H1/2014 being within London Synchronous locations (sub 3 milliseconds round-trip latency) and the remainder across the UK. GVA Connect has predicted that the West Thurrock area is set to become one of the major outer-London data centre locations, illustrated by the availability within that campus of high-quality, high-power and fully consented sites such as the new Gateway Data Centre. GVA Connect sees the trend for IT outsourcing as set to continue, with the volume of end-user demand, together with the reduced timeframe for delivery, likely to stimulate further development of new sites from the end of 2014. ### The battle for your online data First, we had the launch of Amazon's cloud-based office productivity suite, Zocalo. It seemed that the behemoth was taking a page out of Microsoft's and Google's book on office apps to play them at their own game and squeeze out Box and Dropbox into the bargain. Then, as we covered here on Compare the Cloud, Google Cloud Platform began offering two terabytes of free storage for a year through one of its partners, a startup called Panzura. The Google/Panzura announcement also had industry commentators salivating at the prospect of Google competing with industry leader Amazon Web Services in a price war. And why not? Neither business is short of the funds to finance a battle. But just where will all this free storage lead? Consumers have been quick to embrace the convenience and low capital outlay of Cloud storage. 
The model makes total sense when you don't have the expertise or upfront cash to invest in your own hardware, or alternative offsite locations to store it in. Consumers have also driven the popularity of Google Docs and forced Microsoft and Amazon to play catch-up. Business has dipped its toe in the water with varying degrees of enthusiasm, but the unwon potential is still huge. And it is the unwon business segment of the market that Google claims to be targeting with this new Panzura offer. Chris Rimer, global head of partners at Google's cloud platform business, said: "This is a way for customers to try something new, especially if they have had some kind of aversion to using the cloud in the past... We want to make sure that potential customers are not worried about cost as a barrier to entry." Consumers have driven the popularity of Google Docs and forced Microsoft and Amazon to play catch-up. To accept the notion that this offer will entice new business users to the cloud, then, must one also accept that cost is the main barrier for business users? While value may drive consumer cloud adoption, I'd argue that security and data residency are greater barriers to cloud adoption in the business sphere. For one thing, cheap deals already exist: Microsoft Azure's pricing starts at 2.4 cents per gigabyte per month. Besides, the Google/Panzura offer is only free for as long as users are accessing data from a single location. How realistic is it to think that business users would be accessing their data from a single location? And what happens at the end of the year when the free period ends? How realistic is it to imagine business users would switch suppliers on an annual basis to secure the best deals? Is that not adding more perceived risk to the equation? Besides, the cost of frequently changing suppliers would undoubtedly negate any savings. Similarly, Amazon's lowest price of one cent per gigabyte per month has serious caveats, since it applies only to a service for infrequently accessed data. AWS is reducing the headline rate by restricting performance and bandwidth, benefitting from the relatively lower cost of high-latency storage. All these caveats make me think the idea of a price war is a little overcooked. It's more about grabbing headlines than winning over cloud-averse people. Nevertheless, two terabytes is a significant amount of data. For smaller, less data-heavy organisations the biggest temptation must be to back up all of their data into the cloud gratis. Two terabytes of free storage could represent a really cost-effective disaster recovery option. While two terabytes could accommodate all the enterprise business systems for a small or medium-sized, data-light business, for data-heavy or large organisations with complex applications it would fall short of total requirements. Could this shortfall be part of the strategy? I can see innovation project teams being tempted by short-term 'free' deals that enable them to deliver services rapidly and within budget, leaving operational IT out of the original decision-making process. This scenario would be a way to demonstrate proof of concept through a different (possibly less risk-averse) route into the business, leaving operational IT to discuss the ramifications of running the service with a business management team that has already seen it in operation. A rough comparison of what two terabytes would cost at the headline rates above follows below. 
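To put the free two-terabyte allowance into perspective, here is a minimal, illustrative calculation. It uses only the per-gigabyte headline rates quoted above; real bills would also include egress, request and tiering charges, and list prices change frequently, so treat the output as a rough order of magnitude rather than a pricing reference.

```python
# Rough, illustrative arithmetic only. The per-GB rates are the headline
# figures quoted in this article; tiering, egress and request charges are ignored.

AZURE_PER_GB_MONTH = 0.024      # $ per GB per month, Azure headline rate quoted above
AWS_IA_PER_GB_MONTH = 0.01      # $ per GB per month, AWS infrequent-access rate quoted above
FREE_OFFER_TB = 2               # the Google/Panzura free allowance

gb = FREE_OFFER_TB * 1024       # 2 TB expressed in GB

for name, rate in [("Azure standard storage", AZURE_PER_GB_MONTH),
                   ("AWS infrequent-access storage", AWS_IA_PER_GB_MONTH)]:
    monthly = gb * rate
    print(f"{name}: ${monthly:,.2f}/month, roughly ${monthly * 12:,.2f}/year for {FREE_OFFER_TB} TB")
```

On those numbers the free allowance is worth somewhere between roughly $250 and $600 a year - trivial for an enterprise budget, but exactly the kind of sum a project team can spend, or avoid spending, without involving anyone else.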
After all, Google's cloud applications have sneaked into many an organisation under IT's radar through BYOD. Could Google be thinking this back-door route into the organisation is a good way in for cloud storage too? After all, if it's a free service, what sign-off does it require? ### CoinTerra selects CenturyLink for Bitcoin mining Large-scale Bitcoin mining operations are to be hosted across multiple CenturyLink locations as CoinTerra selects CenturyLink Technology Solutions data centres. CenturyLink has announced that it has signed a multi-megawatt data centre deal to host CoinTerra's rapidly growing Bitcoin-mining operations in CenturyLink data centres. CoinTerra designs, produces and operates best-in-class hardware and software that power the Bitcoin blockchain ecosystem. Bitcoin is a "crypto-currency" that operates on a user-powered, peer-to-peer payment network. The company estimates that more than 15 percent of the total Bitcoin network runs using CoinTerra hardware. "As the Bitcoin ecosystem continues to flourish, CoinTerra's continuously expanding mining operations require reliable power and extremely high levels of availability to ensure peak performance," said CoinTerra CEO Ravi Iyengar. "CenturyLink's expertise, operational efficiency and immense data centre presence provide us with the ability to safely and securely operate truly powerful and reliable Bitcoin mining on an enterprise scale." With CenturyLink, CoinTerra's growing Bitcoin mining operations are currently housed across multiple data centres, with additional locations scheduled to be brought online in the coming months. "CoinTerra started its search for a colocation provider by looking for commodity power, but quickly learned how CenturyLink's capabilities – from our data centre operations to cooling systems that keep infrastructure operating efficiently – would provide an even greater return on investment to keep the company on a path for rapid growth and future success," said David Meredith, senior vice president and general manager, CenturyLink Technology Solutions. "We make life easier for CoinTerra, allowing them to focus on what they do best – developing and operating Bitcoin-mining processors and systems – while we ensure the security and reliability of their expanding infrastructure." About CenturyLink CenturyLink is a global leader in cloud infrastructure and hosted IT solutions for enterprise customers. CenturyLink provides data, voice and managed services in local, national and select international markets through its high-quality advanced fiber optic network and multiple data centers for businesses and consumers. The company also offers advanced entertainment services under the CenturyLink® Prism™ TV and DIRECTV brands. About CoinTerra™ Founded in Austin, Texas in mid-2013, CoinTerra is currently one of the fastest-growing technology startups in the world. CoinTerra designs, produces, and operates systems that power the Bitcoin blockchain network. ### ESOS to shake up businesses The new energy savings law ESOS will have a profound effect on businesses. For enterprises, clean-tech firms and born-in-the-cloud businesses, now is the time to take notice. On July 17th 2014 a new UK law came into force, known as the Energy Savings Opportunity Scheme (ESOS) (UK Gov ESOS Scheme). 
This is an important law for enterprises interested in doing more to be sustainable by saving resources such as energy, water and land; for clean-tech and IT companies supplying the solutions that underpin increased sustainability; and for the emerging cloud providers that can better and faster enable those sustainable enterprise plans. Aimed at around 9,000 UK enterprises, ESOS requires them to carry out energy audits and make plans for reductions. This actual need to do something will come as a shock to most of them. Research done by Cambium LLP in 2013 shows that around 3% to 5% of the total investment in energy reduction and carbon management projects will be in IT systems and IT consultancy. The implication is that over the next 15 years a net new IT market of £0.1bn to £1.0bn will grow in IT systems just for enterprise energy efficiency improvements. The expected annual growth rate of this IT market is in excess of 11%. If only 5% of these plans are acted on, the UK Government expects a net benefit to the UK of £1.6bn by 2030. This implies a total investment in energy reduction schemes of at least £3.3bn over the next 15 years, though it could be as high as £33bn (see the ESOS Market Size blog). The rough arithmetic behind these figures is sketched below. Spreadsheets and central IT won't work alone. ESOS will lead businesses to do more at the start and then evolve over the next few years across five key areas:
1. Compliance with the legislation and providing the data and reports is just the start of the need for quick solutions; all enterprises will need a compliance solution.
2. Further resource management projects will spin off this kick-start in many enterprises as they realise the resources and money they have been wasting and can reasonably control.
3. Building to larger sustainability projects: the new awareness in enterprises is expected to prompt many of them to review sustainability right across their functions and to look for solutions to address the management of increasing sustainability objectives.
4. Large volumes of data, analysis and reports will need to be gathered, analysed and acted on.
5. Real-time global inputs from sensors will become the norm to support the energy reduction systems, with more then required to support wider sustainability initiatives.
The cloud has a role to play. Enterprises will need to move fast to gather data and create the initial position report for the business just to be compliant. Many are expected to do the minimum that is required at first. This will change over a number of years into a structured and broad approach to resource use reduction and sustainability that needs an integrating, adaptable, global and scalable set of solutions. They will not want yet more bespoke IT systems in-house. They will not know the application and data architecture that is best for some time. This will lead them to a flexible initial approach that does not get baked in as the only way forward. They will be looking for application solutions that can get bigger faster, or even scale up and down at times, as well as systems approaches that can be reused and re-purposed across new sustainability applications. That sounds like many of the good features of solutions in the Cloud. What can cloud providers do now? Cloud service providers need to make sure they understand the initial and near-term solutions and capacity needed by enterprises in the UK, EU and beyond. ESOS compliance, energy reduction and broader sustainability will be a major focus in actioning the energy saving changes. 
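To see roughly how the IT market estimate above relates to the headline investment figures, here is a back-of-the-envelope calculation. The £3.3bn-£33bn investment range and the 3%-5% IT share come from the article; the arithmetic is purely illustrative and makes no claim about how Cambium actually built its model.

```python
# Back-of-the-envelope check of the market sizing quoted above.
# Inputs are the article's figures; everything else is simple arithmetic.

investment_low_bn, investment_high_bn = 3.3, 33.0   # total energy-reduction spend over 15 years, in £bn
it_share_low, it_share_high = 0.03, 0.05            # share expected to go on IT systems and consultancy

it_market_low_bn = investment_low_bn * it_share_low      # ~£0.10bn
it_market_high_bn = investment_high_bn * it_share_high   # ~£1.65bn

print(f"Implied IT market over 15 years: £{it_market_low_bn:.2f}bn to £{it_market_high_bn:.2f}bn")
# The article's £0.1bn-£1.0bn estimate sits comfortably inside this envelope.
```

Even at the cautious end of that range, it is a meaningful new market, and it is concentrated in a known population of around 9,000 obligated enterprises.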
Those providers who develop an offering for these 9,000 new prospects will be the first to benefit from the opportunity. And there is room to scale. Those cloud solutions might start as small projects but could quickly grow into much larger implementations. Starting small can lead to an enduring and fruitful relationship. Providers who are not in a position to develop their own solutions need to find an application partner (ISV). Partnering with an ISV could create new, featured services such as ESOS Compliance as a Service (ECaaS), Energy Efficiency as a Service (EEaaS) and Sustainability as a Service (SUSaaS). ESOS will have a profound effect on businesses. But this new energy savings law certainly has some benefits for those cloud providers who are first to market. About Cambium Cambium helps suppliers to grow their sales to enterprises interested in energy efficiency, sustainability and corporate responsibility. We know which organisations are actively investing in these topics and how to reach the key decision makers. We help our clients position their products with the right audiences in a way that gets attention. Whether you are a new entrant to the UK market or an existing player seeking to improve the effectiveness of your sales and marketing, Cambium can help you grow your pipeline and improve conversion rates in these rapidly growing markets. To discuss the implications for your business or discover more, contact us at Cambium LLP. ### Defining the Cloud: Public vs Private vs Hybrid The ultimate goal of cloud computing is ubiquity – where the services work so well they are just part of everyday business processes. Cloud computing has finally matured, but there is still progress to be made. To reach that goal, businesses still need to be educated on the kinds of cloud services they can choose from. The biggest question is often whether a public or private cloud, or a combination of the two - a hybrid cloud - is the right choice. Public Cloud A public cloud is open to the general public and delivered over shared infrastructure. Amazon EC2, Microsoft Azure and Google Compute Engine are examples of large-scale public cloud offerings. With these types of cloud you can sign up online very quickly, choose the resources you need and swiftly deploy a virtual infrastructure. Public cloud offerings are typically very cheap and have very much become a commodity-based service, offering a one-size-fits-all solution where you have little say in the technological infrastructure. They are ideal for companies that have a basic need for infrastructure but don't require their solution to be tailored to any specific requirements. Support is often limited and management is rarely provided, so a public cloud provider is usually more suited to organisations that have a high degree of technical knowledge in-house. Possibly the biggest concern over a public cloud is security and data privacy. You will probably never know exactly where your data is being hosted and so what laws it is subject to, such as the Patriot Act in the USA. Because of this, data sovereignty is an increasingly prominent issue and many customers are opting for public cloud providers based in the UK or EU. Also, the use of shared infrastructure means you don't know who your digital neighbours are – and if they are behaving poorly it can have a knock-on effect on the performance and reliability of your solution. 
PROS
- Usually cheap
- Simple sign-up process
- Quick to deploy a virtual infrastructure
- Quick and easy to scale resources
CONS
- Not ideal for applications that require high security or that transmit or store sensitive data
- Limited ability to tailor a solution to your requirements
- You need a high degree of technical knowledge in-house
- Limited support and management
IDEAL USES
- Software development and test environments
- Hosting applications that have no data security needs or legal requirement to know exactly where data is hosted
Private Cloud As the name suggests, a private cloud is a cloud deployment where infrastructure is completely private to the organisation using it. This is the opposite philosophy to a public cloud, where the infrastructure is shared amongst multiple companies. Typically, private cloud providers allow the environment to be tailored to your exact requirements, meaning you can have free choice of the hardware vendors used to make up your private cloud platform. There would normally be collaborative design consultations between you and the cloud supplier to ensure the environment is optimally configured to meet your needs. With a private cloud solution you can be sure that no one else is using your infrastructure, boosting performance and reliability. You also have control over where your data is stored, which increases security and privacy. For organisations with sensitive data, data privacy and data sovereignty can be a vital consideration. Private clouds will often come with a tailored Service Level Agreement (SLA), an added level of support and the option of partial or full management – reducing the burden on your in-house technical team. The downside to a private cloud is usually the cost, as your service provider will need to make its money back on the investment in all of the dedicated components used to make up your private cloud.
PROS
- Can be tailored to your exact requirements
- Extremely secure – you know exactly where your data is
- Dedicated resources – 100% of resources guaranteed to be available for your use only
- Enhanced Service Level Agreement
CONS
- Cost – can be price prohibitive for smaller organisations
- Resources are fixed and cannot easily be flexed up without adding more physical capacity
IDEAL USES
- Hosting mission-critical applications
- Hosting applications or storing data that requires a high level of security
- Requirements that need a completely bespoke solution put in place
Hybrid Cloud A hybrid cloud is, as you might expect, a blend of private and public cloud infrastructures. This hybrid approach combines the scalability and cost-effectiveness of a public cloud with the dedicated resources and power of a private infrastructure. By way of an example, a typical hybrid cloud deployment might be hosting your key live environment (such as Exchange or SharePoint) within a private cloud, but sending archive data to the public cloud where you could make use of lower-cost, lower-performance storage. Another example: you might need guaranteed server resources for an important application or for a database which, for performance reasons, can't be virtualised and therefore needs to be completely private. This would rule out a traditional public cloud, but with a hybrid deployment you could use a public cloud infrastructure for other elements that could still be virtualised. A rough sketch of this kind of placement decision follows below.
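The sketch below makes that kind of split concrete. It is a minimal illustration only: the workload attributes, rules and names are invented for this example, and a real assessment would also weigh cost, compliance obligations and the latency between the two environments discussed next.

```python
# A minimal, illustrative placement helper for a hybrid estate.
# The attributes and rules below are invented for this sketch; a real decision
# would also consider cost, compliance and inter-cloud latency.

from dataclasses import dataclass

@dataclass
class Workload:
    name: str
    holds_sensitive_data: bool       # regulated or proprietary data you must control
    needs_dedicated_resources: bool  # e.g. can't be virtualised for performance reasons
    archival: bool                   # cold data that is rarely accessed

def place(w: Workload) -> str:
    """Suggest a deployment target following the examples in the article."""
    if w.holds_sensitive_data or w.needs_dedicated_resources:
        return "private cloud"                        # keep control of data and resources
    if w.archival:
        return "public cloud (low-cost storage tier)" # cheap, lower-performance storage
    return "public cloud"

estate = [
    Workload("Exchange/SharePoint live environment", True, True, False),
    Workload("Mail and document archive", False, False, True),
    Workload("Development and test environment", False, False, False),
]

for w in estate:
    print(f"{w.name}: {place(w)}")
```

Run against the three example workloads, the helper keeps the live environment private, pushes the archive to cheap public storage and leaves dev and test on the public cloud - exactly the split described above.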
In the examples above you would probably want to encrypt anything stored on a public cloud, and inter-cloud latency would have to be considered so it doesn't degrade your performance. The hybrid approach really works better when using a Virtual Private Cloud (VPC) within the same data centre (or at least one quite local) instead of a public cloud, because of the potential latency issues. For example, most of Amazon's and Microsoft's services are hosted in Dublin, so if your private cloud is located in the UK you're more than likely going to run into latency issues.
PROS
- Offers the best parts of both public and private cloud infrastructure
- Offers a more flexible approach to managing your IT
- Less expensive than a private cloud
CONS
- More expensive than a public cloud
- Can be more complicated and time-consuming to set up
IDEAL USES
- Companies that prefer to maintain more control over their IT by using a private cloud, but are willing to use a public cloud for less mission-critical activities
- Larger companies with multiple applications, some of which can run in a public cloud and others which must stay private owing to, for example, information governance
Virtual Private Cloud (VPC) It is worth just touching on the benefits of a VPC because it is an extremely valid option which is often overlooked. In many respects a VPC is like a public cloud: it is still a shared, multi-tenanted environment, but it usually comes with a number of enhanced security features to ensure maximum security is provided. In my opinion a true VPC should be able to provide complete segregation between customers and be able to guarantee key resources (such as RAM). It's also my belief that if any infrastructure is going to be truly private, people shouldn't just be able to sign up online with a credit card and start using it. As a VPC is essentially still a multi-tenanted environment, I believe it's important to carry out background checks on customers before on-boarding them, thus avoiding hosting any high-risk entities which could potentially attract the wrong kind of attention.
PROS
- Much lower cost than a private cloud
- Provides all the benefits of a public cloud, but with increased security and control
- Ideal for deploying within a hybrid cloud environment where everything must remain private
CONS
- More costly than a public cloud, but probably less so than you may think
IDEAL USES
- Environments where privacy or security is of great importance
- Hosting applications which may require a more tailored solution than a public cloud can offer, but without the higher cost of a fully private cloud
There is no golden rule as to which type of cloud environment is best: each platform has its positive and negative points, and they all have their place. For me it really comes down to understanding your own needs and balancing these against the budget you have. For most businesses, understanding the benefits of public, private and hybrid clouds is a great start – making it easier to choose a cloud provider with the right cloud services to suit your requirements. Each choice has clear benefits. ### IBM expands high performance computing capabilities SoftLayer adds InfiniBand to cloud services portfolio. IBM has announced that InfiniBand links will be available as a networking option for customers using bare metal servers from SoftLayer, an IBM Company. Adding InfiniBand support enables very high network throughput and low latency between bare metal servers. 
Ideal for computationally intensive activities common to energy companies and big data applications, this solution makes High Performance Computing (HPC) even more accessible in the cloud. "As more and more companies migrate their toughest workloads to the cloud, many are now demanding that vendors provide high speed networking performance to keep up," said SoftLayer CEO Lance Crosby. "Our InfiniBand support is helping to push the technological envelope on what can be done on the cloud today. It showcases our innovation when collaborating with customers to help them solve complex business issues." InfiniBand is an industry-standard networking architecture that delivers high transfer speeds—up to 56Gbps—between compute nodes. That is the equivalent of transferring data from more than 30,000 Blu-ray discs in a single day. The architecture provides additional features, contributing to InfiniBand's overall superior reliability, availability and serviceability over the legacy PCI bus and other proprietary switch fabrics and I/O solutions. These advantages bring SoftLayer customers even lower latency between bare metal servers, applicable to private clusters of up to hundreds of compute nodes - ideal for applications such as life sciences and genomics, computer-aided engineering, financial services, electronics design and reservoir simulation. By reducing latency between bare metal servers in these private clusters, customers can manage massive amounts of data faster, more effectively and more efficiently. "Bringing InfiniBand capability to the cloud is driven by the growing need for extremely high levels of speed and performance for scenarios such as HPC and big data," said Philbert Shih, managing director for Structure Research. "This type of offering will help enable engineers and scientists to build, compute, and analyze simulations in real time leveraging hundreds of compute nodes. Being able to share and analyze data at this speed will only accelerate cloud adoption from this use case, while making HPC performance more accessible across a wide variety of industries." The introduction of InfiniBand on SoftLayer will especially benefit customers who are leveraging fully supported, ready-to-run clusters complete with IBM Elastic Storage (code name), IBM Platform LSF or Platform Symphony workload management. InfiniBand will be available on SoftLayer through the IBM Platform Computing team, expected in the third quarter of 2014. For more information or to add it to an existing SoftLayer account, please visit IBM Platform Computing Cloud Service or contact Terry Fisher, Global Sales Leader, at TerryFisher@uk.ibm.com. ### The link between the cloud and wearable technology Technology commentator Rick Delgado finds that cloud computing is a major driving force behind the recent trend of wearable technology. Fuel drives pretty much everything in our world. Cars need gasoline. Televisions need electricity. Even our bodies need food and oxygen to keep us going. Without fuel, things break down and cease to function. So what does this have to do with wearable technology? There's a major driving force that is motivating the recent trend of wearable technology. The force - or fuel, if you will - behind it is none other than cloud computing. Without the advances in cloud computing, wearable technology wouldn't have the chance to progress or proliferate. 
Even though it has seen only recent growth, already around 18% of people in the United States and the United Kingdom use wearable technology, according to research from Rackspace. As cloud computing continues to advance and become more convenient and usable, expect wearable technology to grow even more popular at the office as well as at home. So what is cloud computing exactly? Though it has numerous applications, cloud computing basically means computing on the Internet. Connecting to the cloud means connecting to the Internet, which nowadays is made easier through advances in wireless technology. One can easily see how cloud computing has led to the rapid spread of wearable technology. By helping people connect to the Internet at all points, they can have easy access to all their personal data. Think of the convenience that millions of people have enjoyed from using smartphones - only now computer access is available from the things we are wearing. Not only does wearable technology allow us to access data from the cloud, but we can actually generate data that is then sent to large data centres. Perhaps most impressive of all, this data access and collection can all be done in real time. Wearable technology is convenient and climbing in popularity, but the real impact will likely be felt with the Internet of Things. Businesses are already using the power of the cloud to make wearable technology a reality. Some you might be familiar with. Google Glass is perhaps the most prominent product being offered to select individuals. The item has a constant connection to the cloud and gives real-time information to users over a display. Smart watches have also become a relatively popular item, like the Samsung Galaxy Gear or the Pebble Watch. They function much as you'd expect, alerting users to incoming emails and other messages, made possible through their wireless connection. There are also health and fitness monitors, wearable cameras, and, perhaps most interesting of all, smart clothing that uses embedded sensors to monitor users' activities and send that data to company servers. Without cloud computing, none of these products would be possible. Much has been said about wearable technology's convenience and climbing popularity, but the real impact will likely be felt with the Internet of Things (IoT). The IoT basically connects devices both to the Internet and to each other. While this may provide new and innovative ways for people to interact with everyday objects, by allowing devices to communicate with each other businesses can fine-tune them to ensure peak efficiency, particularly among factory equipment. This basic concept can even extend to connecting people like never before - essentially a human cloud. This connection has a lot of potential, in part as a way to make life easier and more informed. In the same Rackspace report, of those who use wearable technology, 82% said that it has enhanced their life. Some said wearable technology made them feel more intelligent, others said there was improvement in their personal efficiency, while others said wearable tech helped them with career development. There are a number of hurdles wearable technology needs to address before it becomes more mainstream. Wearable technology and cloud computing are all about collecting and analysing data, but the method of data capture and analysis still needs improvement; a simple sketch of the capture-and-upload loop appears below. 
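As a rough illustration of that capture-and-upload loop, the sketch below shows a wearable posting periodic sensor readings to a cloud ingestion endpoint. Everything here - the URL, the payload fields, the sampling interval - is invented for the example; a production device would batch readings, authenticate, and cope with patchy connectivity.

```python
# Illustrative only: hypothetical endpoint and payload, no authentication or retries.

import json
import random
import time
from urllib import request

INGEST_URL = "https://example.com/ingest"   # stand-in for a real cloud ingestion endpoint

def read_sensor() -> dict:
    """Stand-in for reading real sensors (heart rate and step count here)."""
    return {"heart_rate": random.randint(60, 100),
            "steps": random.randint(0, 50),
            "timestamp": time.time()}

def upload(reading: dict) -> None:
    body = json.dumps(reading).encode()
    req = request.Request(INGEST_URL, data=body,
                          headers={"Content-Type": "application/json"})
    request.urlopen(req, timeout=5)          # a real device would queue and retry on failure

if __name__ == "__main__":
    for _ in range(3):                        # send a few readings, then stop
        upload(read_sensor())
        time.sleep(60)                        # sample once a minute
```

The pattern only pays off if the readings are rich enough to be worth analysing and the back end can turn them into insight quickly.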
That means equipping devices with better sensors capable of keeping track of and monitoring users' activities over a wider spectrum. Better calculation of that data falls within the realm of cloud computing, so improving data centre processing power is a must. Companies also need to find better ways to incorporate more cloud capabilities into devices to truly connect them with the world. Considering the advances the technology has made already, it's reasonable to think these challenges will be addressed in due time. Cloud computing and wearable technology go hand in hand. Without cloud computing, it's likely wearable technology wouldn't even be possible. There are various ways wearable tech will become integrated into our daily lives in the coming years. As we become used to the idea, we'll also know that cloud computing is the technology fuelling it. We can expect fully connected lives to an extent beyond what we have even today. ### CTC's Ultimate Knowledge Test How well do you know Compare the Cloud's team? And how well do you know Cloud Computing? It's time to find out! Add your answers in the comments. ### SMEs ignoring IT Managers, warns Node4 SMEs are ignoring the importance of IT Managers at their own risk, warns Node4. Independent research finds that over half of SMEs believe the role of the IT manager will be dead by 2024. Node4, the Cloud and Data Centre specialist, is reminding SMEs of the important role that IT managers play in their business now and in the coming decade. The warning comes after Node4's independent research found that 57% of SMEs agree that the role of IT manager will be dead in 10 years' time. Node4 believes that the IT manager will be key to driving business success in the coming years, and companies that ignore this do so at their own risk. With the advent of Cloud and outsourced IT, the media has been rife with predictions around the imminent death of the CIO or IT manager. According to some, the simplification of technology and the increased trend towards outsourcing technology services mean that the IT manager will become superfluous, if not entirely obsolete. "As one of the industry's leading Data Centre and Cloud specialists, many would think we welcome speculation that the role of the IT manager will be defunct by 2024. Perhaps it makes sense that as more organisations outsource IT to companies like us, we will naturally assume more of the IT manager's responsibilities, thus making the in-house functionality a thing of the past," commented Paul Bryce, Business Development Director, Node4. "However, the truth of the matter is that the IT manager is now more important than ever. Change is the new norm for IT departments and successful organisations need a technology expert who understands the IT landscape and can marry this expertise with the needs of the business. Rather than being someone who controls all the technology within the organisation, IT managers must take on a role resembling a 'steward of risk' - a well-informed member of the organisation who helps its employees use technology to win more business, beat the competition and succeed in the post-recession economy." About Node4 Node4 is a Cloud and Data Centre specialist that in 2014 is celebrating its tenth anniversary. Since 2004 the company has grown rapidly through its comprehensive service offering and the growth in demand for hosted IT from businesses all over the UK. 
It has four state-of-the-art Data Centre facilities - two in Derby, one in Leeds and one in Northampton - which offer the latest in security technology, ensuring that even the most mission-critical applications are securely hosted and protected. Its leading solution, N4Cloud, is a key part of the company's wide product portfolio. For more information, please visit: www.node4.co.uk ### Pulsant wins Asigra partner award Pulsant, the cloud computing, managed hosting and colocation expert, has been named Fastest Growing Partner of the Year by Asigra, the leading cloud backup, recovery and restore software provider. Pulsant was recognised for its dedication and leadership in supporting technology users with innovative cloud-based data recovery solutions. Pulsant entered into a strategic partnership with Asigra at the beginning of the year, adding enterprise cloud back-up and replication solutions to its portfolio. "Capturing market share in the highly competitive backup and recovery market requires intelligence-driven efforts at multiple levels to outmarket, outsell and outperform, winning the opportunity to deliver a more compelling solution to the customer," says David Farajun, CEO, Asigra. "Pulsant achieves consecutively high results and therefore ranks number one in this category, receiving the win." Rob Davies, sales and marketing director, Pulsant, says: "We have seen a lot of growth in this area. Working with a partner like Asigra ensures that we are perfectly poised to deliver on this demand with a world-class product that rounds out our cloud and colocation offering." Asigra was recently named TechTarget's top enterprise backup application for 2014, and the company has been included in the Gartner Magic Quadrant for Enterprise Backup Software and Integrated Appliances for the fourth year in a row. The award ceremony took place during Asigra's annual Partner Summit, held in June in Toronto. About Pulsant Pulsant is an award-winning company that provides enterprise-class IT services to over 3,000 customers, ranging from SMEs to large private and public sector organisations. The company is one of the UK's largest providers of Managed Hosting, Cloud Computing, Colocation and Managed Networks. In 2013 Pulsant was awarded the Royal Warrant as a provider of hosted IT and data centre services to the Royal Household. For more information, please visit: www.pulsant.com ### 6 Lessons from Virtualisation and Cloud Over the last few years I have seen many of my customers transform their IT and businesses as they grappled with emerging technologies, namely virtualisation and cloud. These concepts quickly moved from ideas to reality and I made some notes along the way. The lessons learnt from cloud and virtualisation can be applied to the latest technology trends in our industry, and Part 2 of my blog will specifically apply these learnings to Software Defined Networks (SDN) and Network Functions Virtualisation (NFV). But first: lessons from virtualisation and cloud. 1) Not everything is worth virtualising. Anywhere between 50% and 70% of all workloads today are virtualised, but that still leaves a large number that are not. Often these are legacy applications that are not worth the hassle of virtualising, but many applications with high throughput or processing demands run better on, or need to be run on, bare metal servers. 2) End users lead the way and create the demand. End users are often the first to adopt new technologies if pricing or convenience appeals. 
For enterprise end users this meant the rise of cloud storage platforms such as Dropbox, which were significantly easier to set up and maintain than enterprise systems. Service providers, meanwhile, see their enterprise customers consuming cloud-based network services whilst they continue to ship hardware appliances to site. Whoever your end users are, the way they consume resources has changed and so have the economics - when they receive utility or usage-based pricing from a cloud provider they will expect the same from their Service Provider. 3) Pricing models will change. As end users demand new utility-based pricing, so enterprises, rightly, demand it of their vendors. Pay-as-you-grow models have become commonplace for storage and compute, and will now become so for network services. Think of Network as a Service (NaaS). 4) No one vendor has the complete answer. Almost every vendor pitch I saw about cloud implied, if not directly stated, that the vendor had the complete cloud story - "Cloud in a box" - but the truth was that most vendors have their installed base to protect and some IP they have developed. End users needed to decipher the hype from the reality, but getting away from presentations and into proof of concepts (PoC) soon helped to achieve this. 5) Skills change. Previously both end users and suppliers had IT resources that were solely focused on one hardware appliance or technology. In some cases this remains, but increasingly I find multi-skilled employees who are able to install and configure compute, storage and virtualisation. Just as technology has enabled resources to be better utilised, it has enabled staff to be better skilled and more valuable to the organisation. The fear of mass redundancies due to automation and cloud has largely disappeared as IT employees have actually become more valuable to an organisation, not less. 6) Don't jump out of the frying pan into the fire. I saw several organisations move from physical machine sprawl to virtual machine sprawl. If you don't change your processes and workflows to better suit the new models you will end up back in the same mess. Take the opportunity that new technology brings to re-evaluate how you do things. Find out how we can apply these lessons to the latest technologies gripping our industry - SDN and NFV - in my next instalment. What do you think? Is there anything you would add to this list? Please add your comments below, thanks. ### CRM as a Service encourages better business CRM (Customer Relationship Management) has been a hyped acronym for many years and can mean many things to many people. Many still debate the terminology: does CRM really describe what it's used for? Or, how much does it really cost, what ROI can really be achieved, and how do you get users to adopt it? Fundamentally, CRM provides a way to manage customer information, share it securely, track customer interactions and record activities across the business - something required by most business sectors and sizes. Sure, there are specific, often niche, requirements that at times can be better served by more specialised applications, but for most organisations these requirements are pretty similar and easy to achieve with broader CRM applications. By its nature CRM aggregates information centrally and shares it with a focused purpose. Long before we were talking about cloud computing, CRM was already headed there. 
According to Gartner, Software as a Service (SaaS) delivery of CRM applications represented 34% of total worldwide CRM application spending in 2011. In that year more than 50% of all Sales Force Automation (SFA) spending was on the SaaS platform. By the numbers alone we can see Cloud CRM is at a tipping point. How does this affect supply channels, and why does the cloud change the possibilities and potentials in this arena? Increasingly there are a number of important supply chain demands that CRM can assist with, aiding top and bottom line business benefits. Recording relationships between multiple parties of suppliers, customers, contractors, alliances and consultants is an important part of understanding a supply ecosystem. Knowing who your supplier options are, who they have relationships with and who they compete with can aid faster and more effective business decisions. This is CRM in its element: managing connections, interactions and information. Organisations, and how they relate to one another, are ever more complex today. With contractors working for multiple companies, directors sitting on the boards of multiple organisations, and sister, parent and subsidiary companies, relationships are increasingly difficult to understand. CRM systems make it easy to record these relationships between people and organisations, making it easy to spot relationships that could be important. Not overlooking the detail enables you to leverage these relationships to the benefit of your finances. A challenge with CRM systems across supply channels is that invariably each party will adopt a CRM specific to its needs. Often these channels will be reluctant to have a system forced upon them, meaning a mix of CRM systems will be used throughout that supply channel. It is unlikely the same CRM system will be used. Larger companies can afford well-known and more expensive CRM systems that smaller to mid-sized firms cannot. Licensing, deployment and configuration costs can price businesses out of adopting the bigger brands. Cloud computing has increasingly come to mean bridging different, distributed services across a shared network. APIs and open data have seen services from LinkedIn and Twitter to cloud storage like Dropbox join the mix of enterprise tools. There are some single winners - Google and Microsoft's Office 365 are vying to be the productivity suite of modern offices - but we are also seeing CRM gain equal footing. CRM services can easily be integrated into business processes. Exporting and importing data via configured templates can make it easy to share data, either manually or in an automated stream; a minimal sketch of such an export follows below. Opening up your CRM to suppliers as well as customers, through APIs, forms on your website and so on, can make the data you have more useful. Imagine having partners able to manage and update their own details, or to update product books or product information - their participation reducing the management time you and your staff have to invest. A web-based CRM system, hosted in the cloud, offers significant advantages. Positioning CRM as a Service (CRMaaS), while a mouthful as an acronym, is where businesses can easily improve the bottom line. 
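As a minimal sketch of that template-driven export, the snippet below writes a handful of CRM records to a CSV file using an agreed set of fields. The record layout and field names are invented for illustration; in practice the rows would come from your CRM's own API or a saved report, and most platforms offer an equivalent export out of the box.

```python
# Illustrative template-driven export: share only the agreed fields with a partner.

import csv

# In practice these rows would come from the CRM's API or a saved report.
contacts = [
    {"organisation": "Acme Supplies", "contact": "J. Smith",
     "role": "Account manager", "relationship": "supplier"},
    {"organisation": "Beta Logistics", "contact": "P. Jones",
     "role": "Director", "relationship": "contractor"},
]

TEMPLATE_FIELDS = ["organisation", "contact", "role", "relationship"]  # the agreed template

with open("supplier_share.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=TEMPLATE_FIELDS, extrasaction="ignore")
    writer.writeheader()
    writer.writerows(contacts)   # anything outside the template is simply not exported
```

The same idea runs in reverse for imports, and scheduling it turns a manual hand-off into the automated stream mentioned above.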
Managing relationships, suppliers and customers has always been important, but now these relationships can partly begin to manage themselves. With customers able to view and update their own details, or even review reports relevant only to them, businesses can keep their IP secure while fostering closer relationships. Aggregating data and sharing the reports is a step better than cataloguing orders, invoices, contact details and connections. A wider CRM system can enable you to better map customer orders and tighten your supply chain accordingly. This also increases the visibility and reporting of sales margins, supplier reports and product analysis, all within easy reach of sales and marketing. Coordinating different aspects of your business has never been easier. Utilising CRM in the same manner as SaaS products creates a system that can, with greater flexibility, make it easier to access and share information across supply channels. CRM has been headed to the cloud for some time. Now, as cloud computing matures and the public begin to see the benefits, businesses should be encouraged to adopt a CRMaaS approach. ### Confidence rocked according to Egress market survey Egress Software Technologies, a leading provider of hosted and on-premise encryption services, has produced a market survey which finds that industry confidence in Cloud-based communication solutions has been rocked following Edward Snowden's revelations. Despite the economic benefits of outsourcing business functions to third party service providers, including reduced operational costs and streamlined processes, there is growing concern amongst IT professionals about the security of Cloud-based SaaS services and solutions. Such is the concern that when selecting a new product or service, data security and information assurance are now regarded as being of equal importance to financial and efficiency gains. The 2014: The Year of Encryption market survey examined the views of this year's delegates at Europe's largest information security event, Infosecurity Europe 2014. The majority of those surveyed acknowledged that Snowden had undermined users' confidence in the perceived security of Cloud-based solutions. This has made them more aware of issues such as encryption, data residency and access control, and is driving those surveyed to consider new information security systems. Reflecting on the results of the survey, Egress CEO Tony Pepper comments: "The close relationship we maintain with Egress customers means that we have become increasingly aware of a change in attitude towards the security of Cloud-based communication solutions – a view that the 2014: The Year of Encryption survey has now confirmed for the wider market. We feel that Cloud-based solutions still have a key role to play, and assuming industry adopts the necessary levels of security (in the way that we do at Egress), then concern over the protection and integrity of highly sensitive data should not be a factor. In fact, we actually see the influence of Snowden as a major positive for the information security market. Not only has it forced users to question the underlying security of their data but it is encouraging them to apply greater due diligence when procuring new solutions." Highlights of Egress' 2014: The Year of Encryption survey are illustrated in the infographic below. For more insight on the findings please visit the Egress website. 
### IBM Cloud serves an ace for fans at Wimbledon This year marks the 25th anniversary of IBM's role as the official technology supplier to Wimbledon. Over the last 12 years my team and I have been lucky enough to play a small role in support of the All England Lawn Tennis Club (AELTC). Throughout our relationship we've driven many new innovations and technologies that enhance the fan experience. For those lucky enough to be one of the half a million fans joining us in SW19 during The Championships fortnight, you will certainly understand our contribution. For those watching elsewhere, we have helped them to enjoy a rich, full and immersive experience - an experience that is the next best thing to really being there. Most spectators and fans watching Wimbledon via our mobile app, on the web or just through the stats popping up on TV graphics don't realise The Championships runs on the cloud. Not just any cloud, but one that's supercharged - the IBM Cloud. While the tournament is in full flow I thought I'd share with you some of the highlights of the service we provide. First, the website. Last year nearly 20 million unique users visited wimbledon.com during The Championships, and this year we've done a lot of work so that fans will be able to personalise their access to the site from smartphones, tablets and other devices. The app is free. Simply visit Wimbledon Mobile News for the latest live scores, stats, live videos and match analysis. The app provides that extra layer of analysis to enhance the fan experience and engagement. Then there is SlamTracker. Using a multi-tenant private IBM Cloud spanning 3 active datacentres, SlamTracker provides a "second screen" experience for fans on wimbledon.com. SlamTracker acts like a real-time dashboard for all matches across each of the 19 courts. It also uses predictive analytics to identify performance; for example, the 3 key things each player needs to focus on in order to be more likely to win. These updates, delivered in real time during each match, give fans a sense of being part of the action. SlamTracker works by analysing 41 million data points from eight grand slam tournaments. Next is SoftLayer, a tournament-winning, hybrid-dynamic cloud. SoftLayer uses analytics technology to deliver insights into the evolving social conversations taking place on and off the court at Wimbledon. It powers the Social Command Centre and knows which topics and which players are being talked about most - regionally, nationally and even globally. By using analytics in this respect, the digital team at Wimbledon is able to understand what is of interest to fans, what they want to see and what they read the most. SoftLayer is helping Wimbledon respond to fans' needs in real time, serving up relevant content on relevant digital platforms to make the experience more fun and engaging. New this year is "Hill vs. World". We ask questions of fans cheering on Henman Hill and compare and contrast their answers with those of fans watching elsewhere. "Hill vs. World" is serving up some interesting results. And, Match Point… Using IBM Watson technologies and our cloud management software Smart Cloud Orchestrator, we are automatically adjusting the cloud infrastructure to better power wimbledon.com. We like to think of our contribution as "under the covers". 
In a sense, like the familiar rain covers racing out underneath grey skies, we are ensuring the grass courts are optimised for the competitive challenges of The Championships. In 2013 we used our analytics capabilities to monitor a raft of feeds including social media as well as other data - order of play, historic demand and even the weather. IBM then used this data to automatically tune the capacity we provisioned - load balancing, auto-scaling, etc. This was a huge enhancement over how we handled the same data and traffic in 2012. This year, with IBM Watson technologies and our cloud management software Smart Cloud Orchestrator, we have halved the provisioning time while ensuring the website has the capacity to look ahead 90 minutes. Few can boast a 50% gain over the year before, something the top seeds playing can only dream of. The net effect is that we run a leaner system that responds faster. We can make changes more efficiently on a continuous cycle. This automated intelligence in the management is, for me, the most exciting part of our contribution. As data pours in from SlamTracker, while SoftLayer responds to social traffic and the fans atop Henman Hill cheer, we are enhancing The Championships, Wimbledon 2014. The benefit to fans is almost transparent; few ever realise the power cloud computing has to offer. Wimbledon for me is one of the sporting highlights of the year. So with all the new technology innovations we are providing this year, we feel confident that they deliver that extra layer of insight and entertainment for fans to enjoy the world over. Game, Set, Match. For me it has been 12 great years of tennis – 25 great years for IBM. Wimbledon Bluemix Innovation Challenge Our work with Wimbledon is just one example of how technology contributes to The Championships and the overall fan experience. Like every new player looking to make their mark on tennis' grandest stage, from the top seeds to the unseeded, we are open to new ideas. Do you have an idea for improvements for next year? If so, then enter the Wimbledon Bluemix Innovation Challenge. Your app could be a smash. ### Yes Workforce launches mobile workforce solution Cloud-based software built in North West England comes out of beta with full launch. Yes Workforce, the leading SaaS provider for mobile workforce management, has successfully launched its 'off the shelf' mobile workforce solution. The solution was developed from Yes Workforce's enterprise software engine, which has been integrated by large corporations such as TNT Post. Now small to medium-sized businesses can use the same core product to manage mobile workforces effectively and reduce paperwork. Yes Workforce is a cloud-based solution enabling teams of workers on the move to coordinate jobs and exchange information easily through a mobile app, reducing paperwork and improving efficiency. The software also allows users to capture signatures, fill out automatically generated time sheets, and send and receive photographs - all from their mobile. The real-time dashboard makes it easy to schedule and reallocate jobs for multiple employees. Teams can also run real-time reports and view live updates from their workers on the move. The mobile management platform improves workforce organisation through real-time tracking, eliminating the need for paper-based admin and increasing accountability with evidence of completed jobs. Yes Workforce is headquartered in Bolton in the UK with satellite offices located across Europe. 
Chris Flynn, Founder and CEO of Yes Workforce, commented: "Our vision is to help businesses become more efficient, helping them to eliminate time-consuming paperwork so they can focus on what they do best. Demand for our cloud-based solution has been extremely encouraging, and I’m confident that this will continue to grow as we come out of beta and establish Yes Workforce as the most advanced and cost effective mobile workforce management solution available." About Yes Workforce Yes Workforce is a cloud-based mobile workforce management solution. It enables mobile workforces to manage jobs easily through a mobile app, reducing paperwork and improving efficiency. Yes Workforce provides a real-time dashboard, making it easy to schedule and reallocate jobs for multiple employees. The software also allows users to capture signatures, fill out automatically generated time sheets, and send and receive photographs. Users can run real-time reports and view live updates from their workers on the move. ### Keypasco earns New Product Innovation Leadership Award Keypasco has been granted the Frost & Sullivan New Product Innovation Leadership Award for Secure Authentication Keypasco has earned the Frost & Sullivan 2014 New Product Innovation Leadership Award for Secure Authentication for its secure multi-factor authentication solution, which is described by Frost & Sullivan as a game changer and "with its Vakten software, Keypasco has the ability to disrupt the secure authentication market, and support the growing need for secure mobility." Each year, the Frost & Sullivan Award for Product Innovation is presented to the company that has demonstrated excellence in new products and technologies; the industry analysts compare and measure performance in order to identify best practices in the industry. "Recipients of this award represent the top ten percent of their industry: the other ninety percent just can't keep up." Frost & Sullivan describes Keypasco as: "a true pioneer of the 21st century for mobile security." "Keypasco uses features such as a fingerprinting factor that can be used to determine if a correct device is being used (multiple platforms incl. smartphone mobile devices), as well as a geo-location feature to see where the request was made. If it is within the normal pattern for the user, the transaction is authorized and if the behaviour is uncommon for the user, it then uses risk-based factors to determine if the request is fraudulent. Combining those factors together create a more authentic solution for sensitive tasks such as mobile banking." About Frost & Sullivan Frost & Sullivan, the Growth Partnership Company, enables clients to accelerate growth and achieve best in class positions in growth, innovation and leadership. The company's Growth Partnership Service provides the CEO and the CEO's Growth Team with disciplined research and best practice models to drive the generation, evaluation and implementation of powerful growth strategies. Frost & Sullivan leverages more than 50 years of experience in partnering with Global 1000 companies, emerging businesses and the investment community from 40 offices on six continents. To join our Growth Partnership, please visit: http://www.frost.com. ### IBM opening new SoftLayer data centre in London Extending SoftLayer Cloud Services Platform to Support Growing European Customer Base IBM has announced that SoftLayer, an IBM Company, will open a data centre in London this month. 
It will be the latest of 15 new data centres that IBM plans to open as part of a $1.2 billion dollar global investment to strengthen and extend its cloud services in Europe and around the world. The new facility will provide customers and their end users with SoftLayer services that meet in-country data residency requirements. It will also complement the existing SoftLayer Amsterdam data centre and London network Point of Presence (PoP), both launched in 2012, to provide European customers redundancy options within the region. "We already have a large customer base in London and the region; we're excited to give those customers a full SoftLayer data centre right in their backyard, with all the privacy, security, and control the SoftLayer platform offers," said Lance Crosby, SoftLayer CEO. "The work these businesses are doing—the solutions and services that they are building in the cloud—is inspiring. Organizations of all sizes are using SoftLayer services to disrupt their industries or even their own operations, creating new business models and applications." One such customer is MobFox, the largest and fastest growing mobile advertising platform in Europe. "MobFox has been working with SoftLayer for a couple of years. We currently deliver more than 150 billion impressions per month for clients including Nike, Heineken, EA, eBay, BMW, Netflix, Expedia, and McDonalds," said Julian Zehetmayr, MobFox CEO. "London is a key location for company like ours, operating in digital advertising space and serving global clients. We are very excited about the option of deploying SoftLayer servers directly here." SoftLayer services will provide solutions ideal for businesses based in London and surrounding areas. The region is a key cloud market, with customers using cloud to deploy web-centric workloads or to transform their existing operations. One third of the world's largest companies are headquartered in London and a majority of the world's largest financial institutions have operations there. Additionally, London has one of the world's largest communities for technology startups, incubators, and entrepreneurs. London is a key location for company like ours, operating in digital advertising space and serving global clients. The London data centre will have capacity for more than 15,000 physical servers and will offer the full range of SoftLayer cloud infrastructure services, including bare metal servers, virtual servers, storage and networking. It will seamlessly integrate via the company's leading private network with all SoftLayer data centres and network PoPs around the world. With these services deployed on demand, and with full remote access and control, customers will be enabled to create their ideal cloud environment—whether it be public, private, dedicated and/or hybrid. SoftLayer will start taking orders for the London data centre in July with a special offer of up to $500.00 USD off new orders in the data centre, for a limited time. More details about the discount and updates on the opening of SoftLayer's new London data centre are available at www.softlayer.com/London. About IBM's Cloud Investment in SoftLayer SoftLayer became part of the IBM Cloud in July 2013. The SoftLayer infrastructure is now the foundation of IBM's cloud portfolio—including extensive middleware software and solutions—ensuring businesses have the scalability, transparency, and control they need to deploy IT operations in the cloud. 
In January, IBM committed to investing $1.2 billion to expand its global cloud operations in all major geographies and financial centres, increasing the reach and capability of a business' IT operations. The investment will grow SoftLayer's global cloud footprint to 40 data centres across five continents, and will double SoftLayer cloud capacity. About IBM Cloud Computing IBM has helped more than 30,000 clients around the world with 40,000 industry experts. Since its acquisition in 2013, SoftLayer has served 6,000 new cloud clients. Today, IBM has 100+ cloud SaaS solutions, thousands of experts with deep industry knowledge helping clients transform and a network of 40 data centres worldwide. Since 2007, IBM has invested more than $7 billion in 17 acquisitions to accelerate its cloud initiatives and build a high value cloud portfolio. IBM holds 1,560 cloud patents focused on driving innovation. In fact, IBM for the 21st consecutive year topped the annual list of US patent leaders. IBM processes more than 5.5M client transactions daily through IBM's public cloud. For more information about cloud offerings from IBM, visit http://www.ibm.com/cloud. Follow us on Twitter at @IBMcloud and on our blog at http://www.thoughtsoncloud.com.  Join the conversation #ibmcloud. ### Why VoIP? Voice-over-Internet Protocol (VoIP) phone systems are becoming more popular. Driven by greater public awareness through smartphone apps it is becoming increasingly important for businesses. If you are unfamiliar with VoIP you shouldn't be. It is transforming communications, making them more cost effective, flexible and more far reaching than traditional copper-wire telephone systems. From voice calls to the venerable FAX and SMS, VoIP is going global - and it is using the cloud to storm the marketplace. What is so great about online telephony? And why is VoIP a threat to traditional telecom providers? Compare the Cloud asked leading VoIP provider, RingCentral for answers and a little market insight. 8 reasons why VoIP is better than a conventional phone service 1. VoIP is a hosted phone solution. It does not require on-site Private Branch Exchange (PBX) equipment. This saves office space. You also save on cost because you will not need dedicated technicians to maintain the redundant PBX. 2. And, because there is no physical PBX hardware involved, you never need to worry about expensive maintenance and on-site repair contracts or, more importantly, upgrades. As VoIP develops and adds new features these improvements are pushed to you invisibly, through the cloud. Online telephone system is always up to date. 3. You still have the full functionality of phone system wherever you can get online. VoIP works just like a phone, only better. You can redirect freephone numbers, change message alerts, and download call data from anywhere – from home, at a business convention, or from your favourite coffee shop just as easily as you can in the office. 4. VoIP systems aren’t geographically tied. Your phone will work across different offices. When your postcode changes your phone doesn’t have to. Whether your employees are located in a single building or spread across different time zones, online telephony means you all share a single communication system. 5. VoIP offers greater efficiency. Your online phone system can act as your dedicated receptionist. An auto-receptionist can greet callers and route customers. 
Because the system is delivered online it can include advanced functionality, in way like email rules and filters, to make efficiency saving for your business. 6. Online telephony is flexible. Not only can you set it up in minutes, VoIP is flexible. You can expand the service or prune it has business needs change. You’re never stuck with features or hardware you no longer want (or want to pay for), and adding new offices or making employment changes is simple. 7. With this flexibility also comes flexibility on costs. Most online phone systems are structured around a pay-as-you-go model, making them low-risk. You only have to pay for the minutes of service you use. 8. In comparison to the traditional PBX telephone systems VoIP is far cheaper on a monthly basis. With no hardware, wires and accompanying exchanges to install or maintain VoIP enjoys lower service costs. So VoIP is a real benefit to businesses. But what is the greatest asset to these benefits? Martin Lucas, Senior Sales Manager at RingCentral, summarizes, “VoIP allows you to focus on the lifeblood of your business: your customers. Whether you need to open a new office on the other side of the country, adjust your business hours on the fly, or prioritise calls from your most important contacts, online telephony makes it possible to adjust how you work, as you work.” Compared to traditional copper-wire telephone systems, online telephony gives you more for less: greater flexibility, greater control and greater oversight at a lower total cost of operation. VoIP offers businesses of every size the sort of communications tools that were once only found in the largest, global companies. About RingCentral Founded in 2003, RingCentral has been providing businesses and individuals with cloud-based business communications solutions across the UK and globally. They offer the freedom to accelerate your business communications while avoiding costly traditional telephone systems and maintenance contracts. They have already successfully helped over 300,000 businesses around the world improve their telephony systems with Voice-over-Internet Protocol (VoIP). www.ringcentral.co.uk         ### A team and B team - 4 different characteristics You and/your IT department backup your data every day, or every week. Backing up data is not the same as archiving it. Why? Backup is about ensuring that if something goes wrong with a file or data that you regularly work on/with you can 'recover' it, i.e. go back to a working version, even if you have to redo some of your work. Archive is about being able to access the work you have done, but in the future. Archive is about the data or information that holds a value for you, something you need to keep for reference or regulatory compliance etc. Consider your home PC or laptop – it has your most treasured photos on it, your finance files, and many other personal documents. How often to you look at your photos? You back them up, don’t you, but when was the last time you looked at them? How much space do they take on your PC? Do you have to move things to external disc regularly (where they are no longer backed up) just to make room for the Christmas photos? Now apply that to a business context. Let’s say instead of photos, we’re looking at microscopy images or genome sequences. How do you decide where to put the data to preserve it? Does it go on your laptop? Or do you transfer it to an external drive on your desk? What do you do when it’s full? Get another, and another and another? 
Here are our favourite four characteristics of Archive and Backup – these will help you determine what’s needed for you and your business (and maybe even at home!). A – Data stored won’t change B – Data stored will be different/will change A – Data is retrieved for reuse B – Data is retrieved when a disaster hits A – Stores data that needs to be kept for a long time (1 year, 5 years, 25 years or even further) B – Stores data for the life of the backup cycle A – Stores data that is needed for regulatory compliance; to free up space on desktops, laptops and servers; and/or to reduce costs of backups B – Stores data for disaster recovery Conclusion? You need both. You need to backup regularly and you need an archive too. Arkivum represents the A-team and we work with a number of B-teams to make sure your needs are covered. For more information about Archive vs Backup, download our white paper. ### Creating the Perfect Virtualization Team Virtualization has quickly turned into a must-have technology strategy for businesses all over the world. No matter the size of the company, organizations have seen major benefits as virtualization makes companies more efficient and productive while also reducing costs. Think of a business that uses virtualization like a well-run machine, managing servers with greater efficiency, conserving physical space with fewer servers, getting a better performance out of their equipment, and even saving money with decreased energy costs. With these benefits, its no wonder virtualization is so popular, but implementing it is a major step. The process can be complicated and initially require a significant investment in resources and money. That’s why putting together the best virtualization technology team is almost a necessity before a business can take advantage of virtualization’s many benefits. Here’s a look at the different roles for each team member and the skills they need to get the job done. Team Lead The team lead is the person at the head of your virtualization team. The team lead acts as a manager, someone who knows all about the technical aspects of virtualization while also demonstrating an understanding of the business side of things. As a manager, the team lead helps direct the team to meet business goals while also providing guidance for the tech team during the virtualization process and determining which vendors offer the best services. On the infrastructure side of things, team leads work with team members to map out strategies and designs for virtualized systems and help maintain servers and storage systems. In terms of skills, a team lead needs a thorough knowledge of the technology used by the business. The team lead also needs to be able to work in a variety of virtualized environments while guiding others through them. Team leads should also be able to troubleshoot problems arising from servers. Good communication skills are a must. Putting together the best virtualization technology team is almost a necessity before a business can take advantage of virtualization. Architect The architect's focus is on the design and engineering aspects of a virtualization project. The architect’s role deals directly with the client while also having back-end responsibilities like developing solutions and techniques that target potential clients for sales teams. Architects are also instrumental in developing virtualization services while working in tandem with team members to come up with solutions that can put a business at the top of the market. 
An architect’s skills include several years of IT experience along with a seasoned knowledge of virtualization software and environments. Architects also need skills in communication, consulting, documentation, and contracting. An architect will also keep updated on all of the latest technologies. System Administrator The role of the system administrator has evolved over the last few years. System administrators still deal in many of the day-to-day activities of virtualization, like creating virtual servers, assisting in technical support, and installing and troubleshooting virtualization systems. But system administrators are also getting more involved in strategic planning, like developing policies that focus on security, maintaining technical documentation, and sending bills each month to the appropriate departments. The modern system administrator also needs a modern set of skills, like possessing a thorough understanding of the setup and configuration of servers. A system administrator has to know how to manage user accounts and apply operating system updates when they are developed. System administrators also need to have experience in protocol based file sharing and networking. Three to four years of IT experience is preferable. Engineer The engineer’s role is mainly to turn the architect’s designs into reality, but there’s much more to the job than that. During the pre-implementation of a virtualized system, engineers contribute to planning, building, and testing the projects. They manage and maintain vendor certifications, while researching the most cost-effective virtualization solutions, then deploying them to clients. After virtualization is implemented, an engineer is in charge of managing the storage and servers in the virtualization infrastructure. Engineers also need to troubleshoot any of the problems that clients find and communicate those problems back to the virtualization team. An engineer needs to have up-to-date knowledge on new systems and infrastructures. They also need to effectively understand storage and networking. Having experience with large server environments, particularly with big enterprises, is also a must. Engineers should also acquire proper certifications, like VMWare Certified Professional and Cisco Certified Network Associate. The right team can make all the difference when it comes to implementing virtualization for your business successfully. Finding the right people with the right skills, knowledge, and experience might be difficult at first, but it will all be worth it in the end. That’s when businesses will see the incredible efficiency, streamlined management, and reduced environmental costs that come with virtualization. After putting together the best team that works well together, the result will be a business ready to meet technological demands for years to come. ### Oak Hill to acquire Pulsant from BDC Pulsant, a leading provider of managed, hosted data centre and IT infrastructure services to the mid-market, has announced that Oak Hill Capital Partners has agreed to acquire Pulsant from Bridgepoint Development Capital (BDC). Terms of the transaction were not disclosed but the acquistion is expected to be complete in the coming weeks. Pulsant was formed in October 2010 when BDC acquired Lumison, a provider of connectivity, hosting and managed IT services. This was the first step in a strategy to create a 'one stop shop' managed and hosted IT services supplier in the underserved UK mid-market.  
Subsequently, Pulsant (as the business was re-branded) made three further acquisitions: Blue Square Data, a colocation provider, acquired in February 2011; Dedipower, a managed hosting and cloud services provider, acquired in September 2011; and Scolocate, a provider of colocation, managed hosting and cloud services in Scotland, acquired in December 2012. Pulsant currently provides its services from a network of 10 company operated data centres across the UK, connected via the company's own fibre network. Consolidation amongst suppliers in this dynamic market, and we believe Pulsant is well-positioned to expand its leadership position. Oak Hill is an experienced investor in the data centre services market. Its current IT services investments include ViaWest, Inc., a leading provider of colocation and hybrid cloud services to medium-sized enterprises in regional US markets, and Intermedia.net, Inc., a global provider of cloud-based, hosted services to small- and medium-sized businesses. In 2005, Oak Hill co-led the consolidation of the European colocation industry by acquiring Telecity Group plc and executing on a number of subsequent strategic acquisitions. Mark Howling, CEO of Pulsant, commented "We expect continued significant consolidation amongst suppliers in this dynamic market, and we believe Pulsant is well-positioned to expand its leadership position by executing on targeted acquisitions, bringing additional capabilities to benefit Pulsant and its customers.  As we considered our strategic options for the business, there was a lot of interest, but Oak Hill emerged as the preferred partner, having a strong understanding of the market and of businesses like Pulsant. Oak Hill will be an active partner and we are very much looking forward to working with them. Having now used most of the funds that BDC had available for us to invest, we are delighted to have identified a new investor that can support our development in the next phase of our long-term growth." David Scott, Principal at Oak Hill, said "We are pleased to be investing in Pulsant and to be supporting Mark Howling and the rest of the Pulsant management team as the company continues its strong growth trajectory. With this transaction, Oak Hill builds upon a decade of significant data centre expertise, where Oak Hill has a demonstrated track record. We believe Pulsant is well positioned to grow in the data centre services market with clear and sustained demand for its services.  We look forward to further building the Pulsant franchise." Alan Payne, Partner of BDC, commented: "With Mark Howling and the team, we achieved what we set out to do: for Pulsant to become the trusted partner to medium-sized businesses looking for a provider to meet all of their IT and networking needs." About Pulsant Pulsant operates in the growing managed data centre services and cloud computing market. It typically provides managed services where it acts as the outsourced partner responsible for managing clients' IT infrastructure, hosting critical IT hardware in its secure data centre physical facilities, providing "cloud" based infrastructure services and offering network managed services, which allow clients to connect securely to the internet and other locations. Pulsant has ten data centres across six locations in the UK, making it one of the largest suppliers of data centre services in the UK. 
About Oak Hill Capital Partners Oak Hill Capital Partners is a private equity firm with more than $8 billion of initial capital commitments from leading entrepreneurs, endowments, foundations, corporations, pension funds, and global financial institutions. Since inception 28 years ago, the professionals at Oak Hill and its predecessors have invested in more than 70 significant private equity transactions across broad segments of the U.S. and global economies. Oak Hill applies an industry-focused, theme-based approach to investing in the following sectors: Consumer, Retail & Distribution; Industrials; Media & Communications; and Services. Oak Hill works actively in partnership with management to implement strategic and operational initiatives to create franchise value. About Bridgepoint Development Capital Bridgepoint Development Capital provides funding to businesses headquartered in France, the Nordic region and the UK, typically buyouts valued up to €150 million. BDC has a team of 19 investment professionals wholly dedicated to its investment activity and operating from offices in London, Paris and Stockholm. It is part of Bridgepoint, the international private equity group, which invests in businesses valued between €200 million and €1 billion across Europe. ### Regulated Data Security and Encryption With the changes in regulation for industries such as Accountancy, Legal, Financial and Healthcare, data governance comes under the spotlight again and again. For most mid-sized to large capital firms enterprise wide security and data protection systems is already in place (or should be). But what options are left for the SME market place? These firms still need to adhere to the rules and regulations of their governing body, but an enterprise class system is overkill for smaller companies. It is a good thing for SMEs that the marketplace has an answer. Kingston and Trend Micro both offer an encryption method and set of products using Opal that can secure SSD (Solid State Drives) to an enterprise standard, and even remotely manage the attached devices. The biggest challenge with any data is control. Once that data has left the corporate network compliance needs to be sure that information is secure and encrypted. For example, let's say I work in the insurance industry. I have all of my clients listed and saved on my laptop, along with their policies schedules. I am a road warrior so everything is on my laptop. I probably come to the company office once a week (maybe longer) and catch up with the endless paperwork I have to complete. How can I ensure that my data is secure on my laptop, or even (heaven forbid) on a pluggable device such as a USB stick? This is a massive risk for my business. Any loss of data, a slip in its integrity and security and I would be exactly the type of person that the regulating body wants to police. As Trend Micro points out, "The proliferation of data and devices in today's enterprises has increased the complexity of protecting confidential data, meeting compliance mandates, and preventing costly data breaches." And they are right. The challenges become greater as more employees bring their own devices (BYOD), laptop and storage, to work. Productivity has improved but the risks are greater. Ensuring that sensitive data is secured in the case of device loss has never been more difficult. Many of these credit companies are not aware that advances in technology can benefit these vulnerable firms. 
With recent regulation changes in the UK on consumer credit firms the FCA will enforce extremely heavy penalties for companies that can comply with the rules. Around 50,000 firms currently make up the £200 billion a year consumer credit market. This mean there is a lot of sensitive data out there to secure. Around 50,000 firms currently make up the £200 billion-a-year consumer credit market to be regulated under the FCA, these rules are and will be enforced with extremely heavy penalties if found non-compliant. Many of these credit companies are not aware that advances in technology can benefit these vulnerable firms; making it easier to meet data protection and security regulation. Leading the way Kingston Technology secures data using firmware customization across its SSDNow KC300 range, implementing self-encrypting drives (SEDs). The security focuses on data stored on the drive, allowing IT departments to protect company data. Customizing the firmware enables TCG Opal 1.0. The Opal specification of the Trusted Computing Group (TCG) is a standard for creating and managing interoperable SEDs for the protection of stored data from compromise due to loss, theft or drive end of life. Using Opal Kingston reduces risk of data theft without impacting on the drive's read/write transfer rate. Security compliance shouldn't mean compromising performance. Trend Micro encrypts data on a wide range of devices - PC and Macintosh laptops, desktops, CDs, DVDs, USB drives and other removable media – through Endpoint Encryption management console. This solution combines enterprise-wide, full disk, file/folder and removable media encryption. There is also port and device control to prevent unauthorized access and use of private information. A single management console allows companies to manage users holistically. Deploying Trend Micro Endpoint Encryption helps ensure that important data will continue to be protected, even as mobile computing needs change. Putting it all together Combining these two solutions from Kingston and Trend Micro can benefit SMEs. Together they can create a complete enterprise solution for data protection and security. For SMEs this level of security not only complies with regulation but can also be very cost effective. Both products from have the ability to use the Opal standards. Why Opal standards? Opal operates independently of the physical media type used to store the data. Back in the day when Opal was being designed most drives had spinning platters, the venerable Hard Disk Drive (HDD). These drives either had 512 byte physical block sizes which corresponded one-to-one with the 512 byte Logical Block Address (LBA) presented to the host computer or they had a pretty efficient method for dealing with the LBA in 4k or 8k physical storage blocks. In the last few years we have seen HDD retreat from the advance of Solid State Drives (SSDs). These drives also have 4k or 8k or more physical block sizes but due to the underlying nature of the memory used in SSDs it is inefficient to let the host computer ignorantly write data to just any address in any size chunks. Using these products together we get Opal standard SSDs with Opal standard management software that can remotely control, log, police and standardize data security. We're looking for security without a hit on performance. For the road warrior like me an SME firm does not need to spend an absolute fortune on enterprise tech to achieve good data governance. 
However, regulatory compliance through IT governance is not always core competency for firms. Corners are often cut due to a lack of understanding; and this can often result in data loss and security breaches. But there are vendors out there, like Kingston and Trend Micro, are trying to help businesses understand security and data compliance issues. ### Finding the right Cloud solutions provider How do you sort the wheat from the chaff? What questions should you be asking to ensure you get all the cloud benefits you're hoping for? Let's consider three key areas that make the really good cloud providers stand out. Datacentre(s) Ideally you'd want your proposed provider to have more than one and to be replicating your data seamlessly between them so that in the event of a failure they can switch you over quickly. Of course, not every service you buy has to be "always on" so think about the impact of service downtime and make a judgement appropriate to what you're buying. Always ask what Tier datacentres they use. DCs are categorised 1 to 4 with 1 being the lowest quality. The box below illustrates one reason it matters: Tier SLA Maximum Annual Downtime (Mins) Cost to supplier 1 99.671% 1,729 £ 2 99.741% 1,361 £ 3 99.982% 95 ££ 4 99.995% 26 £££ What is this telling us? That in a Tier 2 datacentre facility you can get almost a day in downtime each year and they'd still be within their SLA. You're smart so you've also realised that there's no way a cloud provider based in a Tier 1 to 3 datacentre can deliver a 99.99% SLA, because the datacentre itself can't. If your service comes from a Tier 4, that's a good sign. The provider has invested a fair bit more for that small uptick in reliability so you know he's committed to quality. And you want quality, right? Geography matters too - EU organisations have to comply with EU legislation on EU customer data and the main plank of this is that data may only be processed by, or shared with, organisations with similar compliance regimes. Generally that will exclude all services delivered from the USA, including email. Just because you're emailing a company across town it doesn't mean that your email wasn't processed by a server in the US .. or indeed anywhere in the world. To be 100% safe: if your customers are based in the EU then you want an EU datacentre to ensure compliance with legislation. There are exceptions of course, but follow that simple rule and you won't go far wrong. One final thing - you should be able to verify that a provider's datacentres are ISO27001 certified. This means that they are accredited to a recognised international standard in information security. ISO has to be maintained so you know that your provider is being regularly audited to stay compliant. Financial Stability Let's assume your whole IT infrastructure is moving to the cloud. What would you do if in 12 months time your supplier went bust and you had to scramble to recover your data? We recommend some basic financial due diligence early on, to save yourself potential stress later. For private companies, check out their records at companies house. Are their filings complete and up to date? Download the accounts; are the shareholders funds and retained profit figures increasing year on year? Do they have any cash? A plc must publicise their accounts, so look at the website. Are they making a profit? How much cash do they have? Is it increasing or plummeting? Be objective. 
Compare their accounts with yours, can you extrapolate their financial condition based on what you know about your own business? Are they sound? Finally, think about how they describe their business. For example, If they claim to deliver 5,000 hosted desktops then they're probably a £4m turnover business. Do they look and feel the size they claim to be? If not, why not? Would they big themselves up if things were going well? "Cheap" and "Good" are incompatible For Hosted Desktop the basic license requirements are the same for everyone. All the service providers pay the same to Microsoft, Citrix et al without exception. On top of the basic license charges each cloud provider makes a charge for the datacentres, support, hardware and an element of profit. Net result? On a like for like basis, you shouldn't get a final cost that varies by more than a few % between good providers. Barring dishonesty, there are probably 3 ways to be wildly cheaper than the competition and not run out of cash: Skimp on the datacentre (see above) Skimp on the product delivery (better PaaS providers will deliver using Citrix which sadly costs more) Skimp on support. None of which should be that appealing. The wrap-up So, there you have it, assuming the product is a good fit, then these are the of the most important elements you should consider when migrating to the cloud: A solvent provider with a track record that suggests they might remain solvent; Datacentres that are tier 3 or 4, EU based and ISO27001 certified; And finally, a price that doesn't appear abnormal. To quote John Ruskin: "It is unwise to pay too much, but it's worse to pay too little... If you deal with the lowest bidder, it is well to add something for the risk you run, and if you do that you will have enough to pay for something better". Selecting a cloud service provider can be daunting, but applying sense and a little research goes a long way. ### Cloud World Series Awards Shortlist Announced Cloud World Series have announced the shortlist for the Cloud World Series Awards, which takes place at this year's Cloud World Forum in June. The awards are a celebration of the drive, innovation and hard work in the global cloud computing industry, with six categories recognising services from across the Cloud ecosystem. The Awards will take place in the Technical Theatre of Cloud World Forum in the Olympia National, at the end of Day One of the event. The Awards will bring together the industry leaders and experts across the Cloud that have driven developments in Cloud over the past year. Bryan Glick, Editor in Chief, Computer Weekly, will present the ceremony. The Awards will be judged by world class experts in Cloud Computing and this year's panel includes Professor Mark Skilton Professor of Practice in Information Systems & Management, Warwick Business School, Chris Wray, Partner, Kemp Little Consulting, Kerem Arsal, Manager, Africa & Middle East, Pyramid Research and Camille Mendler, Principal Analyst and Head of Enterprise Verticals, Informa Telecoms & Media. The panel independently selected the shortlist and will also select the winners. Ewa Campbell, Head of Marketing, Cloud World Series, added: "The Cloud World Series Awards is fantastic recognition for those who have this year developed and delivered world class solutions and products which have advanced the industry as a whole, and moved developments forwards." 
The categories and the shortlisted companies with their products are listed below: Best Cloud Service Cloudyn - Cloudyn Bigstep - Full Metal Cloud Business Connexion - Amazon Web Services Best Cloud Application Peppermint Technology with Microsoft - Peppermint Legal Service Platform Rosslyn Analytics - Rapid Spend Analysis Iris Solutions - Thomson Online Benefits Best Cloud Platform Microsoft UK - Microsoft Azure Carrenza - Openshift by Red Hat CloudBees,Inc - CloudBees Continuous Delivery platform as a service Best Cloud Security Solution CohesiveFT - VNS3 FortyCloud - Insync Spirent - Armorhub Best Cloud Data Centre & Storage Solution Egnyte- Egnyte Spanning Cloud Apps - Infrastructure as a Service Zadara Storage - Viertual Private Storage Arrays (VPSA) Best Big Data Analytics Solution sales-i - sales-i Amazon Web Services - Infrastructure in the Cloud Domo - AirWatch The Cloud World Forum will take place on 17-18 June 2014, at the Olympia National, London, UK. To register to attend the event, please go to Cloud World Forum. ### IBM is making it easier to develop software in the Cloud with Bluemix IBM Makes it Easier for Clients to Develop Software in the Cloud with Bluemix IBM has announced how several enterprises and born-on-the-web developers are embracing BlueMix cloud and/or on-premise DevOps capabilities from IBM to launch new applications faster in a secure, reliable and trusted manner. A growing number of the world's 17 million developers are moving the bulk of their app development to the cloud to speed app creation and deployment and to connect on-premise functions with Web-based big data systems. As they do, they're using IBM's BlueMix and its "dev ops" capabilities that enable them to collaborate with other programmers to build, deploy and manage their cloud apps while tapping a growing ecosystem of in-house services, along with those from IBM partners. Developers are moving the bulk of their app development to the cloud to speed app creation and deployment. Like many organisations that have been around for decades or longer, Bay Area Rapid Transit (BART) wants to embrace the cloud for all its potential in helping deliver better service to the citizens of San Francisco. At the same time, they want to harness the value of all the data housed in on-premise applications that it has invested so heavily in for years. With IBM BlueMix and DevOps, BART was able to build and deploy a mobile app allowing supervisors to track real-time data on the BART system, and to automatically send mobile push notifications with updates on train car status. As a result, BART has cut its previous six month mobile app delivery time to 15 days. Building on the more than four dozen app development tools introduced since February, IBM is also announcing new: DevOps services on BlueMix, enabling teams to rapidly test apps in multiple environments and quickly aggregate feedback; allowing for apps to be built with security in design DevOps capabilities on-premise, which will help connect back-end data in organization infrastructures and legacy equipment to new applications and processes. About IBM Bluemix Through programs such as the new Bluemix Garage initiative and participation in the Cloud Foundry Foundation, IBM has been expanding outreach efforts to developers worldwide, a population expected to grow to more than 26 million by 2016. 
Bluemix developer meet-ups have been held globally in Asia, Latin America, Europe, India, Africa, North America and more, and IBM is also partnering with Strathmore University in Nairobi, Kenya to expand cloud-based development education and training within Africa's rapidly growing tech economies. For more information on IBM DevOps and BlueMix, please visit: www.ibm.com ### Arkivum announces first certified data archiving connection to the N3 network Arkivum announces first certified data archiving connection to the N3 network with long-term data archiving service now available to NHS Arkivum, the provider of large scale, long term, and ultra-­‐safe digital archiving solutions today announced the availability of N3 connectivity for its Archive as a Service offering. Arkivum is the only archiving provider to have achieved certification against the Information Governance Statement of Compliance (IGSoC), which governs access to the N3 network. The medical community is faced with increasing volumes of data resulting from today's diagnostic technology, such as Whole Genome Sequencing, 3-dimensional imaging and photo-microscopy. As an example, it is estimated that the data volumes for Whole Genome Sequencing will reach as much as 20-50PB per year in the UK (source: Genomics England). The sensitivity and nature of this data means that a safe and secure archiving strategy is needed to preserve the results for the future, often for up to 30 years or longer. "It is estimated that the data volumes for Whole Genome Sequencing will reach as much as 20-50PB per year in the UK." As a result the IGSoC sets out a rigorous process and mandates a series of requirements that organisations must satisfy in order to receive certification and N3 connectivity. These include a range of security related requirements and the provision of assurances in respect of safeguarding the NHS N3 network and patient-related information assets. Jim Cook, CEO of Arkivum explained, "This is a significant milestone for Arkivum enabling it to provide secure, scalable, long-term and cost-effective digital archiving solutions to a wide range of healthcare clients. We believe this demonstrates our ability and commitment to meeting the rigorous security measures required. With a growing need to preserve the results of medical research for the future, we look forward to working with NHS Trusts and their partners to deliver innovative solutions to help manage the ever increasing amounts of data generated by today's medicine." This announcement follows Arkivum's successful completion of the Information Governance Toolkit to Level 3 and sponsorship from an NHS body to achieve the Information Governance Statement of Compliance (IGSoC). About Arkivum Founded in 2011, as a spin-out of the University of Southampton 2011, Arkivum specialises in the management and storage of an organisation's information assets. Arkivum delivers systems that can intelligently manage content to efficiently store and retrieve data over the long term while offering a highly cost-effective solution, with low up‐front investment and zero risk. Arkivum provides a completely transparent data archiving service and its approach to data safety and security is simple; it keeps multiple copies of customers' data in secure UK data centres and actively manages its integrity to ensure it remains in bit-perfect condition all the time. Arkivum relies on proven storage technology and open standards to deliver fast and efficient online access. 
The company's unique solution is the only system available on the market, which guarantees 100% data integrity. More information can be found at www.arkivum.com ### Hybrid Cloud to be Key to Cloud Sector Growth According to a recent report by the China Securities Journal, the China Telecom company will aim to develop a hybrid cloud business during this year and next. China Telecom is merely the latest business to recognise the value of the hybrid cloud; a new technology which is becoming extremely prominent in cloud computing. So what is the hybrid cloud, and why are experts and analysts predicting that it will be the prevalent cloud system in the coming years? Many individuals and businesses alike continue to view the cloud as a rather blanket proposition. Many individuals and businesses alike continue to view the cloud as a rather blanket proposition. Either you’re in the cloud, or you’re completely outside of it. This is a complete misnomer, and the hybrid cloud is proving this assumption to be a complete myth. The hybrid cloud is offering businesses the ability to benefit from both public and private applications, and to manage them in ways which are flexible and customer-oriented. One of the issues which has prevented smaller businesses in particular making the move over to cloud-based systems has been concerns about cost and flexibility. Despite this, a Computer Weekly survey as early as March, 2012 indicated that one-third of IT organisations were backing up at least some of their data using cloud applications. So the appetite for cloud computing was already there quite some time ago. What the hybrid cloud has done is make the cloud accessible for a whole new raft of commerce. It offers the cost-effective solutions of the public cloud, and the security of the private cloud. The hybrid cloud enables businesses to reap the reward of the benefits of both the public and private cloud. It offers the cost-effective solutions of the public cloud, and the security of the private cloud. The hybrid cloud potentially offers significant economic benefits. It is possible that these will reduce over time as the price of cloud computing naturally comes down, but in the short-term this will be a very significant factor. In the meantime, the hybrid cloud offers companies the opportunity to scale easily by allocating resources for immediate projects. Additionally, data security is a big issue in the cloud, at least in terms of public perception. The hybrid cloud eases these concerns by enabling users to choose dedicated servers and network devices with restricted access, while also allowing communication over a private network. The hybrid cloud enables companies to move seamlessly between the two types of cloud model. The hybrid cloud is already gaining popularity among SMEs for this very reason. Aside from the fact that the the hybrid cloud specifically offers the economic and scale benefits which are associated with public clouds, the dual nature of the hybrid cloud also opens up a lot of options for businesses. Perhaps the most obvious benefit of this is that applications can be assigned to the public cloud when user demand is high, while applications requiring stringent security provisions can be operated within the safety of the private cloud. Essentially, the hybrid cloud offers the best of both worlds at a price which suits the smaller budget. 40 percent of senior IT decision-makers surveyed had already opted for some form of hybrid cloud-based infrastructure. 
Already this is paying off in terms of numbers. According to an IDC report, 40 percent of senior IT decision-makers surveyed had already opted for some form of hybrid cloud-based infrastructure, or were intending to adopt such a package in the near future. Only days ago, the Wall Street Journal was the latest publication to report that the demand for the hybrid cloud is likely to go through the roof in the near future. It seems that the hybrid cloud has inadvertently tapped into one of the key psychological factors related to change; it offers a taste of a new technology, but in a way that is not too threatening to our intrinsic human fear of such change. With the cloud now becoming part of everyday life, and mainstream advertising now focusing on cloud technologies (Eurosport’s French Open coverage has been sponsored by the Microsoft Cloud this year), the hybrid cloud seems to offer an excellent way for the unsure and weary to get in on the ground floor with this exciting technology which will unquestionably be important for our collective future. ### Big Data, Big Deal By Daniel Steeves, Partner at HBVP Prime Advantage, Director at Beyond Solutions It is easy to drown in big data articles, let alone the numbers that they quote: companies counting customer transactions in the millions (or more) per day, websites with literally billions of photos and videos, not to mention 3000 tweets per second! We are inundated with broad brush information but, like the data being written about, the relevance to you and your business is not so clear… it is your data, not big data, that matters but even that has limited relevance until you know what you want to do with it. Big data is coming to a cloud near you Data is of course aggregating at an amazing rate but this is nothing new… it’s clearly been happening for quite some time. What has changed is that, as that data piles up, awareness of the potential value within it is increasing. Combined with advances in technology and the delivery of that technology, storage and access costs are drastically reduced which creates opportunities to derive information, intelligence and insights from that data (in many cases, an asset you already own!). The Definition So, a simple definition with some necessary technical undertones: to simplify the vast detaiIs and myriad references on the subject at Wikipedia, I define the point where data becomes “big data” as when you have too much of it to manage, search, manipulate and process using conventional database systems (and / or it no longer suits your database architecture and capabilities). The four primary issues that contribute to this definition are volume (too much of it), velocity (it moves, changes or aggregates too quickly for you to fully take advantage of it), variety (structured, unstructured, multimedia, internal and external sources) and veracity (how much can you trust the info you are deriving and how are contradictions managed). Your Definition Now even if your data doesn’t “qualify” as big data by the above definition, that in itself doesn’t mean that your data can’t add value (or that you won’t have similar issues as you try to manage, search, manipulate and process it). So, the first suggestion is to move on and talk instead about your data, big or small… the data which you have or to which you could arrange or purchase access. 
The second, even more important suggestion is to get a clear idea about what you’d like to (try to) achieve before you start spending all that time, effort and money: there are clever, structured approaches that you should be taking to determine target outcomes, not to mention which will help define budgets and determine timeframes (otherwise how will you know when you’ve succeeded… or not?). Looking at data this way is different… the data from a purchase at a till was once a record of the transaction only (date, time, stuff purchased, location, payment method) whereas that same customer data can be aggregated and analysed to either target individual consumer preferences (you buy cat food every week: here is a coupon) or to support multi-level forecasts for sales and purchasing decisions. IT CSI In a manner similar to how DNA has allowed forensic examiners to reopen criminal investigations, data can let us look back and better understand, with more information and better context, a more precise answer of just how we arrived “where we are”. And if know what happened when, and how, we can see about learning from it to either prevent it from happening again, or to repeat it (depending of course on what IT was). So, talking big data, or just plain data, is little different from all of the other tech discussions because data itself is not technology: it is a by-product of technology which in some cases can be used to create value. The Impact, The Future One thing is clear: analytical and maths education and experience will keep you employed: Big data has increased the demand of information management specialists: the usual suspects, for example, Software AG, Oracle Corporation, IBM, Microsoft, SAP, HP and EMC have spent more than $15 billion acquiring software firms only specializing in data management and analytics. On its own, this piece of the industry worth more than $100 billion and growing at almost 10 percent a year which is about twice as fast as the software business as a whole. Get in touch at steeves@beyond-solutions.co.uk if you’d like a chat on big data, your data and your business (and where we now provide a new Sounding Board service). ### Observations From Desktop Virtualization Projects [quote_box_center]Case Study: Desktop Virtualization in the form of Virtual Desktop Infrastructure (VDI) Hosted Desktop (HD) or Desktop-as-a-Service (DaaS) is one of the trickiest "applications" to deliver from the Cloud.  The Desktop experience has to be 'spot on' in order to ensure that users continue to enjoy the performance of a local desktop.  This month we've already heard from James Mackie at VESK on how his business is delivering desktop performance, and now we can hear from Liquidware Labs and their server-side optimizations conducted in partnership with Kingston Technology.[/quote_box_center] The Business challenge Since its inception, Liquidware Labs has been heavily engaged in the transformation of the desktop. With visibility based upon leveraging our Stratusphere FIT product to support assessments, and Stratusphere UX to validate performance, we have a long history that encompasses over 400 metrics-driven desktop virtualization projects.  Agnostic of approach, the results of this history are unambiguous—success in VDI requires a new approach to how server-class resources are sized and implemented to support desktop workloads. Whether you invest upfront with an assessment or jump in and optimise later, there are critical attributes that must be considered. 
Success in VDI requires a new approach to how server-class resources are sized and implemented to support desktop workloads. Many early adopters of VDI assume that measuring desktop workloads prior to beginning a virtualization project serves only to support the sizing and build-out of the host server environment. While this is partially true, there are other important benefits associated with this step, specifically as it relates to creating the optimal virtual machine (VM) image. We believe there are a couple of primary benefits to a metrics-driven approach to VDI: Capture the baseline user experience in the current environment—this is critical to ensure you provide an equal-to-better user experience when physical desktops are converted to virtual. Related, this step provides the ability to perform a before-and-after comparison of resource use and ultimately, end user experience. Monitor application use as it relates to desktop pools and images—if for no other reason, this attribute of the assessment helps you to better understand what applications are used versus installed. This benefit also provides visibility into user and group resource requirements. This paper highlights the importance of memory and storage resources; specifically the critical role each plays in the end user experience and overall performance of virtual desktop workloads. Regardless of when you measure, do not miss the critically-important step to quantify end user computing resource requirements to support and ensure optimal end user experience. It is the cornerstone of a successful and optimised desktop virtualisation implementation. This paper will not detail the specific steps or process with respect to assessing user and machine-specific workloads. Rather, this paper will highlight the importance of memory and storage resources; specifically the critical role each plays in the end user experience and overall performance of virtual desktop workloads. Why Do Server Memory and Storage Resources Matter? Delivering end user computing resources in a virtual desktop architecture is profoundly different from how we’ve provisioned, managed and optimised desktops in the past. More to the point, the way we manage server and storage resources to support VDI is about identifying and minimising resource bottlenecks in your environment. We have found that the most common resource bottlenecks observed while leveraging the Stratusphere UX product relate to consolidation ratios, memory and storage resources. These are very common performance occurrences, which can be prevented during the early phases of implementation. Improperly sized VM memory is a common issue that can cause end user performance issues in a VDI architecture. Poor consolidation ratio—this very common bottleneck is due to unbalanced resource usage in host servers. Understanding how CPU and memory play a role in optimising VDI performance is critical to meeting total cost of ownership (TCO) and return on investment (ROI) goals. Improperly sized VM memory—memory-to-disk swapping on the guest OS is another common issue that can cause end user performance issues in a VDI architecture. This can be especially tricky as host memory page sharing and ballooning do not prevent swapping if the guest VM OS “thinks” it’s near capacity. Storage and “boot storms”—successful VDI deployments also minimize the number of VM images required to satisfy all use cases. 
This desire can have the negative consequence of creating boot or login storms, especially if storage requirements are not measured and allocated for both average and peak requirements. Download the full white paper > ### How do Techgate see the Cloud market maturing during 2014? Compare the Cloud interviews Martin Wright, Managing Director of Techgate PLC, to see how 2014 looks from the perspective of a Cloud Infrastructure provider. Over the last 3-4 years we have experienced a very “confused” marketplace with regards to Cloud computing. The disruptive changes across the business and enterprise IT supply chain have been tremendous. It is in the last 18 months that we have seen businesses and customers acting in a responsive rather than reactive manner with regards to Cloud adoption. The technologies around cloud services are now being broken down, explained and associated with specific, tangible business benefits. [pullquote]The technologies around cloud services are now being broken down, explained and associated with specific, tangible business benefits[/pullquote]Along with the market, Cloud providers are now maturing in terms of articulating their positioning and offerings. Customers now talk about their real-life implementations that have delivered specific and provable benefits. Of course we are not “there” yet and important changes will continue to take place across the cloud computing arena. How does this compare to the rest of Europe? The rest of Northern Europe’s cloud computing market seems to follow the same trends as the UK market, with little or no delay. In general, we believe there is still a big opportunity for European providers when addressing the European market, as the majority of services are still provisioned by US-based companies, which often does not sit well with customers who are headquartered in Europe. Are there any trends in the Cloud market occurring at the moment – what are you noticing, from a customer buying point of view? I think customers have become more realistic with regards to cloud services. Their education and knowledge levels have increased significantly, compared to previous years, and this is very evident when we talk to them, as they are much more informed and often know exactly what they are looking for, which was not the case previously. Again, not so long ago, we were treating “the Cloud” as some sort of panacea in IT that solves all problems. Now we talk about Hybrid Cloud strategies and implementations, critical applications that run on premises but which integrate with Public Cloud services, or off-premises Private Clouds that provide automated failover capabilities. Customers essentially understand that it makes sense for some parts of their traditional IT to be migrated and outsourced to newer types of services. What challenges are you seeing for the end consumer for using Cloud and security, and how do you feel they could be dealt with? I think customers are much more demanding than before with regards to Cloud solutions and their security – and this is a very positive development for the whole industry. Security, data domicile concerns and availability are chief among their demands. Customers essentially understand that it makes sense for some parts of their traditional IT to be migrated and outsourced to newer types of services, but they will not – and they shouldn’t – compromise on the way they should be run.
Probably the most important part of this is security, and customers should ask all the right questions, from top to bottom, with regards to what they are getting. Not all cloud services (especially IaaS ones) are the same in terms of the underlying data centres, network, physical security, hardware quality and so on, and, of course, they do not all address the same requirements and audience. Providing enterprise-class solutions is one of Techgate’s key positioning pillars and a true market differentiator for the company and our clients. We have made significant investments, in both time and money, to gain important accreditations such as BS25999 (Business Continuity) and ISO27001 (Data Security) to give the customer complete reassurance that we are doing everything to ensure the availability and security of their Cloud systems and associated data. The fact that we own and manage our own data centres and network means that we can provide the customer with guaranteed, “end-to-end” levels of service delivery and management. How do you see the commodity Cloud players positioning themselves over the course of 2014? (AWS, Google, etc) The larger, global players in this market will continue to compete on the key elements of keen price-points, self-service capabilities and minimal time-to-market. As a larger proportion of the higher-end small to medium size businesses and the larger enterprise customers start to move more and more of their applications and functionality outside of the organisation, the penetration of commodity services will, undoubtedly, increase. However, this trend will apply to specific, less business-critical parts of an organisation’s operations. The majority of critical applications and functionality will require consulting and design engagement and customer-specific system builds, things that the larger providers will not, currently, do. In essence, enterprise applications will continue to be pushed externally, but in the form of bespoke Private/Community or Hybrid Cloud implementations. What, in your opinion, are the most important factors when considering a cloud provider for the following – For hosting? We believe the most important quality during an engagement with a potential customer is the level of flexibility that the supply side can provide in order to address and accommodate customer-specific requirements. With regards to Cloud infrastructure offerings, there are many Public Cloud solutions that are more than capable of handling applications that currently run on over-specified, expensive dedicated on-premises environments. However there are many scenarios where providers are trying to shoe-horn customised applications into the wrong platform. The capacity for the provider to customise their infrastructure to build and provide bespoke hosting solutions is, undoubtedly, a competitive advantage. Paired with that advantage is the need to maintain high-quality data centre infrastructure across all layers, from the network to the switches and storage devices. For Business Continuity (BC) purposes? With regards to sourcing BC and Disaster Recovery as a Service (DRaaS) providers, and on top of the factors already mentioned, customers should be looking for a supplier with proven experience, backed up by customer case studies, and a wide portfolio of purpose-built BC & DRaaS services that can deliver integrated solutions.
For example, building Cloud backup and DRaaS platforms and all the supporting processes and procedures is a whole separate business model and cannot happen overnight. The technical and administrative ability to deliver business-critical BC and DRaaS services and bring systems online when really needed is something that only comes with experience. Also, look for a company that takes its own BC planning seriously. For example, Techgate has invested much time and effort in achieving the BS25999 accreditation, the British Standard for business continuity, across all its locations and operations, something that very few other providers have done. For Security? Security is a concept that should be incorporated into all types of IT offering – not just Cloud solutions, but every IT building block. Security should be a prerequisite and falls in line with all the other factors that should influence a buyer, such as data domicile, system flexibility, performance and availability. Making a visit to a provider’s data centre, together with carrying out due diligence on their networking infrastructure and asking for relevant case studies, is always worthwhile. For example, a provider that specialises in Business Continuity solutions and has experience in running critical applications and production platforms for high-profile customers/government should be more than able to deliver military-grade secure solutions. However, make sure that the provider can back all this up with the right accreditations and, undoubtedly, the benchmark here is ISO27001, the International Standard for Data Security, again something which Techgate has invested much time and effort to achieve and maintain on an ongoing basis. From a B2B perspective, how do you see the channel reseller market evolving over this year? The channel reseller market has undergone many changes over the last 18 months. More and more players coming from the distribution side are investing in building “reseller-friendly” Cloud platforms and services to push to the reseller community. Traditional resellers (system integrators, IT support companies etc) are now more and more pressured to switch to recurring revenue streams and offer Cloud services – something which they, undoubtedly, must do. They can do this by partnering with Managed Service Providers (MSPs) that have reseller-friendly expertise and business models, such as Techgate, by consuming off-the-shelf services from distributors, or by building platforms themselves. On top of all of this, more and more hardware vendors are now also offering Cloud services either directly to end-users, or via the channel, or a combination of both. Techgate has been partnering with resellers since its very beginning in 2001 and, currently, has more than 200 resellers successfully selling its services. However, with all the changes wrought on the IT industry by the Cloud, the market pressures on traditional IT resellers are now extreme, and it is important that resellers embrace those market changes. We believe that, by working closely with our resellers to provide customer-specific Cloud solutions, often using a combination of customised and “off the shelf” services, it is possible to carve out a good business model for both the reseller and the MSP. Traditional IT reselling, or just selling commodity Cloud services, will not be viable routes forward for many IT resellers, especially now that multinationals are competing with them directly for the business.
Businesses, such as Techgate and its resellers, that have a real understanding of the business benefits of Cloud, in all its forms, combined with a real “go to market” strategy, backed up with proven services and solutions, will maintain and grow their market share. Can you sum up how Techgate is positioned in the marketplace and its achievements over the last year (growth, specialisms, etc.)? Techgate’s core positioning has not changed over recent years. We are Business Continuity and Disaster Recovery specialists, providing enterprise-class, trusted IT infrastructure services. However, as the marketplace is transforming heavily, we have been changing as well – for example by adopting new Cloud technologies and harnessing the benefits of new hardware innovations in our solutions and, at the same time, recognising and embracing the changes to the industry’s supply chain. We understand the confusion around Cloud solutions and our approach is a more down-to-earth, realistic one with regards to what you can and, equally, cannot do with Cloud implementations, and this is an approach our clients really seem to appreciate and value. We feel that what has been missed for quite some time with regards to Cloud computing is honest, well-informed conversations with customers, which is exactly where Techgate’s strengths lie. With regards to last year, we worked hard on winning important high-profile, enterprise customers, such as the Advertising Standards Authority (ASA), building bespoke solutions that fitted exact customer requirements. Naturally, we did a lot of projects around DR requirements but saw an increasing demand for high-availability and hybrid production solutions. All of this resulted in a very substantial growth in our revenues from Cloud services that far exceeded any decline we may have seen in traditional IT managed services. So in this story there is a strong message coming through. The Cloud, in all its forms, is here and is here to stay, so make sure you embrace it before it passes you by. ### Telehouse introduce RMADS security initiative Telehouse Europe, a leading provider of global data centres and managed ICT services, announced today that it has undergone an extensive RMADS (Risk Management and Accreditation Documentation Set) process for the Public Sector. This follows the implementation of the new Government Security Classifications on April 2nd 2014. The nature of security requirements is that they are constantly changing and it is our duty as a supplier of government colocation services to provide up-to-date and readily available support. The new Her Majesty’s Government classifications place an emphasis on government departments making security decisions based on a risk assessment of their data in relation to both physical and information security principles. In line with this, RMADS facilitates a comprehensive due diligence process in the selection of infrastructure requirements based on CSE (Catalogue of Security Equipment), SAPMA (Security Assessment of Protectively Marked Assets) and CPNI (Centre for the Protection of National Infrastructure) methodologies. The RMADS project is the latest in a roadmap of compliance standards held by Telehouse Europe’s London facilities which, among others, includes ISO27001 (Security), PCI-DSS (Payment Information) and BS22301-2:2007 (Business Continuity) accreditations.
Full documentation and support is now available to all Telehouse customers, including both Managed Service Providers and the Public Sector, for use in their compliance programmes. Commenting on the RMADS initiative, Andrew Fray, Business Development Director at Telehouse Europe, said: “The introduction of RMADS is an important part of our mission critical infrastructure and recognises the need for close collaboration between government departments and suppliers in managing data security.” Andrew Fray continued: “The nature of security requirements is that they are constantly changing and it is our duty as a supplier of government colocation services to provide up-to-date and readily available support.” For further information about Telehouse and its accreditation portfolio, please contact William Horley, Marketing Manager, by email at marketing@uk.telehouse.net or visit www.telehouse.net. About Telehouse Europe Telehouse provides global Tier III data centres, connectivity and managed ICT solutions, delivering secure and resilient infrastructure for mission-critical IT systems. Established in 1989, Telehouse became Europe’s first carrier-neutral colocation provider. Today, the company is at the heart of the Internet and telecommunications infrastructure, serving over 1,000 major corporations worldwide, across a wide range of industries including government and financial institutions. Telehouse is the data centre subsidiary of Japanese corporation KDDI, a $45 billion Global Fortune 300 telecommunications and systems integration provider. ### Is BYOD an Opportunity or Threat? Sometimes, it seems like every man and his dog is talking about the relentless march towards mobile and BYOD. But what kind of opportunities and threats does BYOD present for Cloud Service Providers? I can't help feeling that BYOD is a massive opportunity for CSPs generally. Since the BYOD trend seems to centre mostly around the use of phones and tablets – rather than BYO laptops (barring the odd MacBook addict) – I'm thinking that one way or another most of the data is off-device and – for business systems – the compute aspect will be too in most cases. Even if there exists the capacity to store data or run apps locally, this really isn't desirable. As such, this has to be a massive opportunity for CSPs, whose strengths lie exactly in providing a scalable on-demand service that creates a cost-effective way of supporting the remote compute needs of these BYOD devices. BYOD as a threat to BYOS CSPs have benefitted greatly from the Bring Your Own Software (BYOS) trend. However, as IT departments implement (largely) reactive policies to control the BYOD phenomenon (whether BYOD is sanctioned or not), measures such as EAS, MDM, MAM, and VDI could restrict the use of the largely cloud-based applications brought into the organisation on user-owned devices. Measures such as EAS, MDM, MAM, and VDI could restrict the use of the largely cloud-based applications brought into the organisation on user-owned devices. Opportunities Some SaaS providers will no doubt make it on to the 'company approved' lists (if they're not there already). Cloud-based app stores are another opportunity. Tools to manage BYOD Many of the tools used to manage BYOD activities could be a potential threat to the existing SaaS trend (especially BYOS) in a business context, depending on how they are implemented – though not without provoking end-user resistance.
Opportunities Some forward-thinking Mobile Device Management (MDM) vendors are already offering cloud-based solutions of their own management applications. Could this pave the way for new, more agile entrants to the MDM market? Particularly those who can not only provide mobile device monitoring but also offer creative solutions for mobile application management? VDIs and virtualised apps offload compute to the data centre and there is a strong argument for choosing cloud services to support that, especially where there can be large fluctuations in service demand or a rapid requirement for provisioning. Vendors, such as Citrix, already offer a hosted VDI offering. Ultimately, is the answer to BYOD to be found in the Workspace as a Service (WaaS) model? Some forward-thinking Mobile Device Management (MDM) vendors are already offering cloud-based solutions of their own management applications. Rethinking the network infrastructure The segregated guest network is one approach; or perhaps multiple layers, each offering their own capabilities. The guest layer might be effectively a wireless hotspot, then another mobile network might offer some low-risk business applications, then the standard secure office and corporate network. I'm not sure that this evolution particularly helps CSPs but it does offer an opportunity in terms of positioning and marketing messages. Opportunities Perhaps CSPs can exploit this requirement with clear marketing messages: we understand the need for the different layers in your network and we can provide services which match or enhance this model. Reposition offerings: either highly secure, leveraging all our expert security and encryption expertise, or open and scalable as required, or both. Surely there is an opportunity for CSPs to cannily target the smaller/medium sized businesses that may not otherwise have the in-house technical knowledge or infrastructure to deliver their BYOD management strategies? Particularly in terms of their ability to deliver applications across a highly diverse mobile device landscape? ### The expanding DaaS market, the big players and healthy competition Firstly let me make an apology for referring to HVD, VDI, HVDI, DaaS, HD, VD etc under the umbrella term of "DaaS" (Desktop-as-a-Service). Guise will kill me. Every January we hear “this is the year for VDI”, or “will this be the year for VDI”, or “will VDI ever arrive?” (my previous blog). Well I'm here to tell you this year IS and HAS been the year for VDI already. VDI is now mainstream. If a business doesn't already have it, they're either considering it, they have it on the back burner or they've ruled it out for one of a number of reasons. It's safe to say there are now tens of millions of virtual desktops and billions of virtual servers. If a business doesn't already have it, they're either considering it, they have it on the back burner or they've ruled it out. So why are there so many technical blogs agonising over VDI? I'll tell you. There are two main reasons... Firstly, for the IT supplier, it's far less profitable. In my previous life I ran an IT Support company; our only major outgoings were our staff and operational costs. If we billed £600 per day for onsite support, the majority would be profit. If we billed a customer £1,000 per month for contracted remote IT Support, the majority would be profit.
Now consider VDI; you have to pay for staff (circa 25%), Microsoft and Citrix / VMWare licensing (circa 20%), servers + storage + multiple datacentres + DR (circa 30%) and operational costs (circa 10%), leaving around 10% profit, compared to 50%+ profit for offering traditional IT Support. Most HVD providers I know report an annual loss. I only know of two companies who actually have Citrix HDX 2 working with 100% accuracy and finesse... it's complicated to set up. Secondly, it's complicated to set up. I only know of two companies who actually have Citrix HDX 2 working with 100% accuracy and finesse. It's not just a case of switching HDX on, but managing around 12 different settings on the server, infrastructure and endpoint device. I only know of a handful of HVD providers who understand how to mirror web interfaces across multiple datacentres. If not configured correctly, there are 101 things that can go wrong. If any part of the stack falls over, the VDI goes down, and VDI's name is disparaged. So, it's expensive and complicated. I wholeheartedly concur. But, what if an HVD provider builds a reliable infrastructure and is able to generate a profit through economies of scale? I've realised an HVDI will become profitable at around 3,000 users if you are charging £50 - £60 per desktop per month (lower than £50 and you cannot generate a profit) – see the back-of-the-envelope sketch below. Ok, so now let's address the negativity coming from traditional IT suppliers and consultants alike. I don't think anyone will disagree that if a VDI is stable, fast and cost effective, the benefits far outweigh those of any other form of business computing. Detailing every benefit is not within the context of this blog; but we are all aware that anywhere access, from any device, for any application (SaaS or non-SaaS), Disaster Recovery and cost effectiveness are huge benefits. Add to this the ongoing management of hardware and software and the constant upgrading of the platform by the service provider, and it really is the answer for ease of use and access for most businesses. Going a step further, Big Data analytics can add some serious weight because your infrastructure is centralised. I don't think anyone will disagree that if a VDI is stable, fast and cost effective, the benefits far outweigh those of any other form of business computing. All of the above brings me to my next point and reconfirms why the "big players" now have their own DaaS offerings. Now that VDI is mainstream and in place in around 25-50% of medium sized businesses, it was only a matter of time until the big players moved into this area. This is a strong message and reaffirms that HVD and cloud services will soon be mandatory, not optional. I could end the blog here… And I would be happy to, concluding how far VDI has come in the last 25 years. And since I started providing VDI 10 years ago, most IT companies now have this offering in their arsenal, either offering it themselves or via resale. VESK doubled in size in 2013, without investment, just from extremely hard work and running a very reliable and high performance VDI. And now, some of the largest IT companies in the world AND the two largest cloud providers in the world are offering hosted desktops. But there is one immense distinction to make between the big cloud players and specialist HVD firms. The big players don't quite have it right, so let's take a closer look at who is offering HVD: Amazon, Google, Microsoft (rumoured Project Mohoro) and VMWare.
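Before turning to the individual offerings, a quick back-of-the-envelope check of the margin and break-even figures quoted above. This is a hypothetical sketch using the article's own rough percentages; note that the quoted cost shares sum to 85%, so the arithmetic leaves closer to 15% than the "around 10%" mentioned, which gives a feel for how thin and uncertain the margin really is.

```python
# Back-of-the-envelope check of the HVD margin figures above (hypothetical).
# Assumes costs scale with revenue, which is a simplification: in practice
# staff and data-centre costs are partly fixed, which is why the ~3,000-user
# scale quoted above matters for reaching profitability at all.
price_per_desktop = 55   # £/desktop/month, mid-point of the £50-£60 range
users = 3000

revenue = price_per_desktop * users
cost_share = {"staff": 0.25, "licensing": 0.20,
              "servers_storage_dc_dr": 0.30, "operations": 0.10}

profit = revenue - sum(revenue * share for share in cost_share.values())
print(f"Monthly revenue: £{revenue:,}")                              # £165,000
print(f"Monthly profit:  £{profit:,.0f} ({profit / revenue:.0%})")   # ~15%
```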
Amazon Workspaces: Signing up for a Workspaces desktop is fairly easy and the control panel is very good. The basic version of Workspaces is $50 with Office. Ok, so now you have a Windows virtual desktop – great, right? Not really: you still need your applications installed and your VDI built. You still need Exchange and Active Directory access. There is no customisation of the desktop at all, and the resolution is very low. I could just imagine giving a Workspaces demo desktop to a prospect and the reaction on their face when they realise the task ahead. Their data centres are in the US, and many UK businesses either are not permitted to host their data in the US for compliance reasons (such as law firms) or will not because the latency and access speeds would equate to poor performance. So on top of the basic version of Workspaces at $50, you would also need a local IT firm to build the Active Directory environment, and migrate and manage the applications on to AWS using Amazon's bespoke APIs and programming tools. In total, realistically you would be looking at around $120 (£85) per desktop per month for the desktop and ongoing management, without 1st Line / Onsite support. And it's only viable if you are in the US. VMWare, Google and Microsoft's offerings are all very similar. To summarise: Desktone try to explain why they are better than Citrix; their same argument, over and over again, is that Citrix wasn't built for multi-tenant and hosted environments, but more for private cloud deployments. Every single point made on their Desktone vs Citrix table is now out of date, apart from one (not requiring SQL licenses). Again, data is stored in the US, and a true test is to demo a Desktone desktop vs VESK: you'll see the performance of VESK exceeds that of a Desktone desktop. Every single VDI we have built for our customers is different; there is no one size fits all. Where Microsoft is concerned – could you imagine calling Microsoft and asking for help with Excel? It won't happen, so you basically need an IT supplier to build and architect the VDI. Every single VDI we have built for our customers is different; there is no one size fits all. You see, a pattern is starting to emerge. So why the VESK infrastructure? Apart from the obvious benefits of running a cloud service for your business (a few are listed above), there is always a more bespoke benefit for every company, and these usually differ vastly. Some businesses require high levels of DR, some require BYOD or migrating all of their applications into the cloud and integrating their applications. But what differentiates VESK from the usual, or known, benefits? Firstly, it's performance. The reason VESK desktops are so quick to respond is the way they are set up: firstly, a correct configuration of Citrix and Microsoft services, but secondly, and more importantly, the hardware. Through years of experience, we have built different SAN pools for different IO requirements. Exchange logs require extremely fast random read IO, so the disks hosting the Exchange logs are configured differently. Microsoft desktops sit on one pool, but the Microsoft page files sit on another pool of smaller and therefore faster SSDs. All performance derives from the fastest SSDs currently available and these SSDs are replaced every 12 months. But the performance goes deeper than SSDs: the actual disks where Exchange, application and desktop data reside are also customised for their workload, depending on the requirements of the specific area of the VDI.
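To make the tiering idea concrete, here is a deliberately simple sketch of mapping workload types to storage pools by their dominant I/O pattern. The pool names, workload labels and rules are hypothetical illustrations of the approach described above, not VESK's actual configuration.

```python
# Illustrative only: place VDI workloads on storage pools according to their
# dominant I/O pattern. Pool names and rules are hypothetical, not VESK's.
TIER_RULES = {
    "exchange_logs":   ("ssd_fast_random", "latency-sensitive random I/O"),
    "guest_pagefiles": ("ssd_small_fast",  "short bursts, very low latency"),
    "desktop_images":  ("ssd_general",     "read-heavy shared base images"),
    "user_file_data":  ("hybrid_or_hdd",   "large capacity, mostly sequential"),
}

def pool_for(workload: str) -> str:
    """Return the storage pool a given workload should be placed on."""
    pool, _rationale = TIER_RULES.get(workload, ("ssd_general", "default"))
    return pool

print(pool_for("exchange_logs"))   # ssd_fast_random
print(pool_for("unknown_blob"))    # ssd_general (sensible default)
```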
We monitor every 'bit' of information and data flow across our VDI and implement customisations based on the analysis of data reporting. That's why, when you log into a VDI, it's extremely fast: opening a 40GB Exchange email account and searching it is instant, and access between applications is instant. Deep breath, and finally: UK datacentres ensure fast access across Europe rather than slower access across the Atlantic. Secondly, it's integration of applications and bespoke customer configurations. Therefore you only need to work with one cloud services provider plus your line-of-business application providers. Most businesses will require add-on applications for Exchange and sometimes Office. Complex bespoke group and security policies, personalised AD configurations, DFS file replication and VPNs for application integration are part of the VESK implementation. These are all realistic requirements for the majority of businesses. Commoditising DaaS by making the whole process of building desktops automated through control panels and APIs is inherently problematic, because some level of integration, awareness of the customer, and communication with the hosting provider is required. When you log into a VDI it's extremely fast: searching is instant and access between applications is instant. To summarise The big players will be successful in passing desktops to macro-sized businesses and also the Channel. The majority of businesses will use smaller HVD providers who are able to be agile and have a specific understanding of the customer's needs. UK businesses will more than likely not host their desktops in the US for one of two reasons: latency or compliance. The cost of hosting with Amazon/Mohoro will be more than using a local HVD provider when considering all costs involved. ### Billing is the Edge in a Subscriptions World Gaining an Edge in the New Subscriptions-based World – Why Billing is the Key Small businesses and start-ups are becoming big fans of the subscription business model. Increasingly, they are either transitioning their businesses from selling products to becoming service providers, or they are simply starting out with a service-based approach from day one. All are attracted by the prospect of securing recurring business from existing customers with a more predictable revenue stream based on subscriptions, and thereby creating a more valuable supplier-customer relationship. Attracted by the prospect of securing recurring business from existing customers with a more predictable revenue stream. Unfortunately, for many SMEs, it is not quite so easy to turn vision into reality. Most start out with simple business ideas which can be easily replicated. If a copycat competitor comes on the scene, undercuts them on price and develops goods or services at a cheaper location, it becomes much harder to differentiate. So, if they want to gain real competitive advantage, what small businesses need most of all is a means of rapidly bringing new services to market and monetising them. They want to provide their customers with more service options and add more flexibility around how they price and package services and up-sell and cross-sell new service offerings. And they need to do all this without increasing their overheads. This is where, despite its flexibility and creativity, a business can get held back by a seemingly straightforward back office process – its billing. Businesses need a means of rapidly bringing new services to market and monetising them.
A business model based on subscriptions and usage needs to be able to offer variable pricing plans, which means investing in a billing system. However, traditional on-premise billing systems are typically too expensive, or not agile enough, to handle this requirement. On top of this, they often take years to implement, configure and integrate with other applications. In short, on-premise billing is far from the ideal choice for any subscription-based business and definitely an inhibitor for SMEs looking to monetise new services quickly. Fortunately, help is at hand in the shape of cloud billing solutions which can slash the time taken and the costs incurred in setting up new services, giving SMEs and start-ups the chance to rapidly turn innovative ideas into monetised solutions. Moreover, implementation can be done in a matter of weeks or even days and the business only needs to pay for the application once it’s in commercial use – a key benefit for small businesses in particular. This approach is far more cost-effective than traditional methods. There’s no need to invest in expensive new hardware or maintenance costs and payment is based on what is used rather than on upfront software licensing or prohibitive implementation fees. Furthermore, software upgrades are made automatically to ensure users keep up-to-date and can make full use of the latest functionality. No dynamic and resourceful small business should be held back by its billing system. However, it is often these apparently minor obstacles that prove to be sticking points on the road to success. But, once again, the cloud is breaking down the barriers for these companies – and providing a more positive outlook for those needing to rapidly monetise their service offering and capitalise on the subscription revolution. ### Ensuring end-to-end interoperability of the cloud The CloudEthernet Forum (CEF) and MEF have recently announced the creation of the CEF ‘Open Cloud Project.’  The Open Cloud Project will focus on creating an open test and iterative standards development program for service providers, industry vendors and over-the-top (OTT) cloud service providers. The Open Cloud Project includes a dedicated proof of concept test laboratory based in Silicon Valley, to provide ongoing testing and support for the iterative development of the CEF’s CloudE 1.0 open cloud environment. It will also provide the basis of future compliance and benchmark testing. Informed by the experience and learning of its close association with the MEF, and seeing the benefits that rapid, iterative development has brought to the cloud industry, the CEF is taking this pioneering step of integrating testing into the standards development process right from the start. The Open Cloud Project will focus on creating an open test and iterative standards development program for service providers, industry vendors and over-the-top (OTT) cloud service providers. Our Initial work will be focused on three areas: application performance management, cloud security and traffic load balancing.   The project’s open test program will lay the groundwork for a fully inter-working cloud environment, and the advancement of best practices to manage OTT and cloud services. This is vitally important work if we are to avoid fragmentation of the cloud. 
Cloud services rely on the end-to-end interoperability of so many players – enterprises, network and datacenter equipment vendors, datacenter operators, orchestration layers, management and reporting platforms, security devices, network service providers – the list goes on. The MEF has demonstrated a successful model of defining service types and attributes that everyone can agree to and align with; through this test bed we can adapt that model and bring it to the cloud industry. This is vitally important work if we are to avoid fragmentation of the cloud. Unless we can define industry best practices and global standards to establish an open cloud environment, cloud services run the risk of becoming more and more fragmented and difficult to integrate. We have already started work on defining a reference architecture for cloud interoperability, including discussion with other standards bodies to make sure that work is aligned. The aim this year is to decide the initial, fundamental criteria for a set of standardised, open and interoperable cloud specifications called “CloudE 1.0”. The lab will play a key role in this development, ensuring faster establishment of workable and proven open standards. Ask yourself, what are the basic standards that need to be met for a workable Cloud Ethernet service? For example, a car is not a car without certain basic components – wheels, engine, steering etc. – so the industry must decide a set of vendor-neutral networking criteria without which cloud services cannot realistically be supported. This is a co-operative process – we’re recruiting the brightest brains and most experienced practitioners to lead each of our five fundamentals, because we need to address the challenges with full 360-degree vision. For more information, and detail of how to join the Forum, please visit: www.CloudEthernet.org ### Are cloud price reductions sustainable? During this last month a notable trend continued; the price of cloud computing – infrastructure and platforms – continues to head in one direction only. Downward. At its event in San Francisco recently, Google announced a series of huge price cuts for its services for running software applications and websites and storing large quantities of data. Just one day later, the market leader Amazon Web Services (AWS) unsurprisingly followed Google’s lead by unveiling a raft of price cuts for its own popular cloud services. The major player in the industry slashing prices, and a huge pretender to the throne such as Google doing the same, will have a big practical impact on the adoption of cloud services in the near future. AWS will now be 30 to 40 percent cheaper after Amazon’s recent decision. Meanwhile, Simple Storage Service (S3), its object-based online file storage, will see prices cut by over 50%. And Amazon’s data analysis service, Elastic MapReduce, will see price cuts as large as 61%. This represents a massive sea change in the cloud landscape. There has been growing momentum in the provision of cloud services for some time, but the major player in the industry slashing prices, and a huge pretender to the throne such as Google doing the same, will have a big practical impact on the adoption of cloud services in the near future. Amazon has always indicated that its cloud services run on low margins, which seemed entirely plausible given that this is well known to be true of its core retail business.
But this recent willingness to slash prices related to its cloud services indicates that this claim was probably a tad misleading. What this means for cloud in the future is that competitors to Amazon will feasibly be able to cut their prices as well. When this will happen remains to be seen. This recent willingness to slash prices related to its cloud services indicates that this [low-margin] claim was probably a tad misleading. This will obviously mean that cloud computing becomes a much more viable proposition for many businesses in the near future. In that sense, this decision by Amazon and Google can be seen as a red letter day in cloud computing, as it may well be looked back on as the point at which cloud computing made its big breakthrough into the mainstream. In addition to Google and Amazon’s price cutting, Google has also announced its decision to consolidate its cloud computing business into a single entity. Meanwhile, as Amazon and Google make aggressive moves in the marketplace, Microsoft has also made significant cloud-related moves just recently, offering Office for the Apple iPad. This seems to be entirely intended to induce more people to purchase its cloud-based Office 365 product. Intel has also recently stated that it will invest $740m in Cloudera, a move that will establish Intel as a market leader with regard to carrying out big data analysis in the cloud. Finally, Cisco has announced that it will invest $1 billion in cloud computing in the short-to-medium term in an attempt to catch up with its rivals in the cloud. Perhaps none of these announcements in and of themselves can be considered a shockwave. They were all logical moves, and some of them could even have been predicted. But what they do show is that massive corporations that are intrinsically associated with ‘conventional’ computing are taking the cloud more and more seriously, and acknowledging that it will be key to their business models from this point on. In light of such huge investment, one has to wonder whether the price reductions we are seeing are sustainable, or simply a short-term initiative to grab market share? You could be even more cynical and argue that these price reductions are there to build the cloud market, in a move to prove to investors that such colossal capital investment is indeed worthwhile?! ### Flexiant Acquires Besol’s Tapp Multi-Cloud Management Technology and Business Tapp multi-cloud lifecycle management platform to be integrated with Flexiant Cloud Orchestrator to immediately benefit MSPs, telcos and others London, UK and Seville, Spain – 14 May 2014 – Flexiant today announced its acquisition of the Tapp multi-cloud management technology and business from Besol SL. The cash and stock acquisition includes the migration of technology, staff and customers to Flexiant. On many occasions Flexiant has stated that “this is the decade of the service provider” and the move to acquire Tapp increases the capability of Flexiant to enable managed service providers (MSPs), telcos and others to manage workloads across multiple cloud vendors and geographies.
Flexiant, recognized as the leading trendsetter and provider of cloud orchestration and management, will use this acquisition to further accelerate its innovation roadmap to deliver an even broader range of solutions for the cloud service provider.[pullquote]...the move to acquire Tapp increases the capability of Flexiant to enable managed service providers (MSPs), telcos and others to manage workloads across multiple cloud vendors and geographies.[/pullquote] George Knox, CEO, Flexiant said, “We believe that MSPs are uniquely placed to capture cloud revenue and opportunity and that this market has been underserved. MSPs do not have the range of enabling software solutions needed to maximize their market position.” “With their unique customer-first view of service delivery, rather than technology-first, MSPs have a deep understanding of their customers’ business priorities and critical SLAs. Equally MSPs are expert at managing, on behalf of their customers, multiple, competing IT vendor relationships. It is a natural extension of this customer relationship to also offer services to help customers transition to the cloud and to manage multiple cloud service providers. At the moment there are limited solutions to help the MSP manage the cloud services provider. Our move to acquire Tapp technology will extend the management solutions we already provide to MSPs across multi-cloud deployments so that they can decide what infrastructure and vendor to use based on their customer’s unique business requirements,” continued Knox. Tapp CEO Hector Rodriguez said, “I am truly excited that Tapp is joining forces with Flexiant. Tapp’s leading-edge automated multi-cloud platform and Flexiant’s unrivalled capability in cloud orchestration will deliver second-to-none value to MSPs and global telcos. By joining a larger organization with the clout of Flexiant’s, there is a truly EU-born power play in infrastructure cloud computing in the making.”[pullquote]Tapp’s leading-edge automated multi cloud platform and Flexiant’s unrivalled capability in cloud orchestration will deliver second-to-none value to MSPs and global telcos.[/pullquote] Also a Gartner Cool Vendor alongside Flexiant, Tapp’s capabilities include features and functions that can be deployed across multiple clouds and consumed as a service afterwards. With the goal of guaranteeing consistency, independent of the chosen cloud providers, Tapp’s features are very much abstracted from the infrastructure layer. Tapp provides a comprehensive set of application-focused tools, from load balancing and auto-scaling to DNS management and File Delivery Network. One of the most interesting capabilities lies in the extensive use of Chef for application blueprinting within Tapp. This enables application blueprints to be deployable on multiple clouds, without relying too heavily on provider-specific OS images (a simplified illustration of the idea follows below). The combination of Flexiant’s innovative technology with that of Tapp will provide service providers with a range of completely unique opportunities to secure and grow existing cloud services revenue and margins as well as create and defend new revenue streams.
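The provider-agnostic blueprint idea can be sketched in a few lines: the application definition never references a provider-specific image or API, so the same blueprint can be pushed to any target cloud. The following is a hypothetical toy example in Python; the class and method names are illustrative assumptions, not Tapp's or Flexiant's actual interfaces (Tapp itself uses Chef recipes for this).

```python
# Toy illustration of a provider-agnostic application blueprint (hypothetical;
# not Tapp's or Flexiant's actual API). The blueprint describes the app once,
# and each provider adapter translates it into provider-specific calls.
class FakeProvider:
    def __init__(self, name: str):
        self.name = name

    def create_server(self, size: str, image: str) -> str:
        # A real adapter would call the provider's API here.
        return f"{self.name}: provisioned a {size} server from image '{image}'"

BLUEPRINT = {
    "image": "ubuntu-lts",           # generic image name, mapped per provider
    "size": "medium",
    "recipes": ["nginx", "my_app"],  # configuration applied after provisioning
}

def deploy(blueprint: dict, provider: FakeProvider) -> str:
    """Deploy the same blueprint to any provider adapter."""
    return provider.create_server(blueprint["size"], blueprint["image"])

for cloud in (FakeProvider("aws"), FakeProvider("openstack")):
    print(deploy(BLUEPRINT, cloud))
```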
[pullquote]Tapp provides a comprehensive set of application-focused tools, from load balancing and auto-scaling to DNS management and File Delivery Network.[/pullquote]Flexiant will integrate the Tapp technology into its innovative and leading cloud management platform to offer multi-tenant cloud solutions so MSPs can select the right infrastructure solution for the customer – whether it is their own infrastructure or that of a trusted cloud service provider, including Amazon Web Services, Cloudstack, Google, IBM, Joyent, Microsoft Azure, OpenStack, Rackspace, VMware and every cloud provider running Flexiant Cloud Orchestrator. Furthermore, Flexiant has a combined Tapp and Flexiant roadmap to generate a range of compelling sector-specific services that will allow service providers using Flexiant to differentiate their offers in the battleground for cloud revenue. Service providers can expect a new and comprehensive set of cloud services using Flexiant Cloud Orchestrator and Tapp starting in summer 2014. About Flexiant Flexiant provides cloud orchestration software focused solely on the global service provider market. Flexiant Cloud Orchestrator is a cloud management software suite that arms service providers with a customizable platform to help them turn innovative ideas into revenue generating services quickly and easily. With Flexiant, service providers can generate more revenue and accelerate growth, compete more effectively and lead the market through innovation. Vendor agnostic and supporting multiple hypervisors, Flexiant Cloud Orchestrator offers a customizable platform, a flexible interface, integrated metering and billing, reseller capabilities and application management. Flexiant gives service providers the ability to develop, launch and bill for new cloud services quickly. Flexiant has been named a Gartner Cool Vendor in Cloud Management, ranked as the most innovative product in the market by Info-Tech Research Group for the second year in a row and called an industry double threat by 451 Research. Flexiant is now a Dell certified technology partner and Parallels partner. Customers include Colt Ceano, Computerlinks, ITEX, NetGroup and WeServe. Flexiant is also a key participant in the European Framework Programme. For more information visit www.flexiant.com. About Besol Besol, through its Tapp.in platform, provides infrastructure management across multiple cloud (IaaS) providers. Tapp makes it easy for companies to deploy their infrastructure and manage their servers in a public cloud environment with the ability to seamlessly migrate between public cloud providers. Tapp also provides DNS administration, intelligent auto-scaling, DR, cost control, load balancing, wizards and application performance management. Tapp is currently compatible with the largest IaaS vendors, was named a 2012 Cool Vendor by Gartner in Cloud Services Brokerage Enablers and was selected as a finalist at Structure Europe 2012. ### City Network extends the lead in European IaaS City Network extends the lead in European infrastructure as a service with the launch of a full-service City Cloud data center in London. City Network unveiled their fifth data centre and the first City Cloud data centre outside of Scandinavia. The London data centre is the first of many to be launched over the months to come and is a milestone in the company’s ongoing expansion. A new London office with local staff will also be available.
The data center in London houses all City Cloud core infrastructure services (IaaS), which include servers, storage, backup, monitoring and more. All City Cloud users can launch any of these services in the London data center through an intuitive, simple-to-use dashboard. The roll-out plan is extensive and aggressive, with Frankfurt coming up next. Other cities to follow are Paris, Madrid, Zurich, Vienna, Brussels and Milan. City Cloud is delivering the first truly pan-European infrastructure as a service where ease of use is a key driver. With multiple data centers throughout Europe, City Cloud offers a one-stop shop via a simple web interface for the majority of IT needs among European companies. City Cloud accommodates both local and European data security laws as well as the cultural need to have data reside in a specific location. “So far we have only scratched the surface of what a well built and powerful IaaS can do for a company. The release of the London node, and the cities to come, will allow European companies to deploy computing power in a way never before possible. The true power lies in ease of use, where the whole organization can take advantage – not just certified engineers,” says Johan Christenson, CEO of City Network. About City Network With more than 20,000 customers across Europe, City Network is one of the leading European hosting companies. Core services include public, hybrid and private clouds. Dedicated environments as well as shared hosting and domain services are also offered in combination with the cloud services. City Network also provides high-end backup services via www.onlinebackup.io and www.onlinebackup.se. City Cloud is the cloud computing brand which you can read more about at www.citycloud.com. You can find more information about City Network here: www.citynetworkhosting.com. ### CenturyLink announces mitigation appliance option for DDoS Mitigation Service Integrates on-site DDoS mitigation with CenturyLink's global, network-based scrubbing system CenturyLink, Inc. (NYSE: CTL) today announced the launch of a mitigation appliance option for CenturyLink Technology Solutions’ DDoS Mitigation Service that protects businesses’ IT infrastructure against distributed denial of service (DDoS) attacks. A DDoS attack occurs when a large number of compromised systems attack a target, such as a website, and overwhelm it with activity that causes the target to become unresponsive, thereby denying legitimate users access to the system. DDoS attacks have grown more frequent, with nearly 50 percent of enterprise IT leaders reporting that their organizations experienced one or more DDoS attacks in the last three years, according to independent research commissioned by CenturyLink Technology Solutions in 2013. CenturyLink’s DDoS Mitigation Service detects attack traffic – at the network level and through a mitigation appliance – before it impacts an organization’s infrastructure. A team of security experts monitors CenturyLink’s global mitigation system, which analyzes traffic 24/7 and diverts and scrubs potentially malicious packets. The solution protects environments whether they are accessed via CenturyLink’s network, one of CenturyLink’s more than 55 data centers, customer premises or a third-party data center or network.
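At its very simplest, the network-level detection described above boils down to spotting traffic that deviates sharply from a baseline and diverting it for scrubbing. The sketch below is a deliberately simplified, hypothetical illustration of that idea; real mitigation systems such as CenturyLink's use far more sophisticated signatures, behavioural analysis and human oversight, and the thresholds and names here are assumptions.

```python
# Deliberately simplified, hypothetical illustration of rate-based DDoS
# detection: flag sources whose request rate far exceeds a baseline so their
# traffic can be diverted to a scrubbing system. Not CenturyLink's method.
from collections import Counter

BASELINE_RPS = 50        # expected requests/second from one source (assumed)
FLAG_MULTIPLIER = 10     # flag anything 10x above the baseline (assumed)

def sources_to_scrub(requests_last_second: list) -> set:
    """Return source IPs whose request rate looks like attack traffic."""
    counts = Counter(requests_last_second)
    return {ip for ip, n in counts.items()
            if n > BASELINE_RPS * FLAG_MULTIPLIER}

# Example: one source sending 1,200 requests in a second stands out.
traffic = ["10.0.0.7"] * 1200 + ["10.0.0.8"] * 30
print(sources_to_scrub(traffic))   # {'10.0.0.7'}
```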
“DDoS attacks continue to evolve, becoming more sophisticated in scale and complexity and resulting in greater downtime for organizations,” said Chris Richter, vice president, managed security products and services, at CenturyLink Technology Solutions. “We designed our DDoS service to enable global protection through multiple cleansing centers around the world and flexible commercial terms. With the integration of a network-based solution with an on-site appliance, we can further serve businesses that require always-on mitigation and deliver protection for multiple third-party networks while providing full visibility and reporting across all of their networks.” CenturyLink’s DDoS Mitigation Service with the new mitigation appliance option employs advanced layers of defense, improving availability of customer’s infrastructure and reducing the likelihood of future DDoS attacks. It also keeps security costs manageable, with a pricing model based on usage and no required hardware or software. “Businesses need to be proactive and vigilant when it comes to protecting their assets from potential DDoS attacks,” said Christina Richmond, program director, security services, at IDC. “With CenturyLink’s new mitigation appliance option, the company moves to a higher echelon of DDoS mitigation providers that offer true defense-in-depth designed to mitigate DDoS attacks at multiple layers and further extends its already-strong security portfolio.” CenturyLink Technology Solutions has delivered a broad range of IT security solutions for more than 15 years. The organization’s security experts provide businesses with the tools needed to meet security compliance requirements, lower capital costs and mitigate the risk of potential attacks. Visit http://www.centurylinktechnology.com/security for more information. About CenturyLink Technology Solutions CenturyLink Technology Solutions delivers innovative managed services for global businesses on virtual, dedicated and colocation platforms. For more information, visit www.centurylink.com/technology. About CenturyLink CenturyLink is the third largest telecommunications company in the United States and is recognized as a leader in the network services market by technology industry analyst firms. The company is a global leader in cloud infrastructure and hosted IT solutions for enterprise customers. CenturyLink provides data, voice and managed services in local, national and select international markets through its high-quality advanced fiber optic network and multiple data centers for businesses and consumers. The company also offers advanced entertainment services under the CenturyLink® Prism™ TV and DIRECTV brands. Headquartered in Monroe, La., CenturyLink is an S&P 500 company and is included among the Fortune 500 list of America’s largest corporations. For more information, visit www.centurylink.com. ### IBM unveils storage technology for an era of Big Data IBM unveils a portfolio of software defined storage products that deliver improved economics at the same time they enable organisations to access and process any type of data, on any type of storage device, anywhere in the world. 
[quote_box_center] Breakthrough storage software provides infinite scaling across all data types Changes economics of the datacenter by reducing storage costs up to 90 percent Innovation that powered IBM Watson and top supercomputers now available commercially [/quote_box_center] One technology in the portfolio, codenamed "Elastic Storage," offers unprecedented performance, infinite scale, and is capable of reducing storage costs up to 90 percent by automatically moving data onto the most economical storage device. Born in IBM Research Labs, this new, patented breakthrough technology allows enterprises to exploit – not just manage – the exploding growth of data in a variety of forms generated by countless devices, sensors, business processes, and social networks. The new storage software is ideally suited for the most data-intensive applications, which require high-speed access to massive volumes of information – from seismic data processing, risk management and financial analysis, weather modelling, and scientific research, to determining the next best action in real-time retail situations. "Digital information is growing at such a rapid rate and in such dramatic volumes that traditional storage systems used to house and manage it will eventually run out of runway," said Tom Rosamilia, Senior Vice President, IBM Systems and Technology Group. "Our technology offers the advances in speed, scalability and cost savings that clients require to operate in a world where data is the basis of competitive advantage." Software-defined storage is a set of software capabilities that automatically manage data locally and globally, providing breakthrough speed in data access, easier administration and the ability to scale technology infrastructures quickly and more cost-effectively as data volumes expand. In addition, these advances can work with any company’s storage systems to provide automated and virtualised storage. Game-Changing Technology The foundations of today’s Elastic Storage were used for the Jeopardy! television match between IBM's Watson and two former Jeopardy! champions. For the show, IBM’s Watson had access to 200 million pages of structured and unstructured data, including the full text of Wikipedia. By using Elastic Storage capabilities, around five terabytes of Watson’s "knowledge" (or 200 million pages of data) were loaded in only minutes into the computer’s memory. A key reason these capabilities were chosen for the Watson system that competed on Jeopardy! was its scalability, the architectural limits for which stretch into the thousands of "yottabytes." A yottabyte is one billion petabytes, or the equivalent of a data centre the size of one million city blocks, which would fill the states of Delaware and Rhode Island combined. IBM Research has demonstrated that Elastic Storage can successfully scan 10 billion files on a single cluster in just 43 minutes – a technology demonstration that translates into unequalled performance for clients analysing massive data repositories to extract business insights. At its core, Elastic Storage builds on IBM’s global file system software to provide online storage management, scalable access, and integrated data governance tools capable of managing vast amounts of data and billions of files. For example, Elastic Storage also exploits server-side Flash for up to six times increase in performance than with standard SAS disks. 
This feature recognizes when a server has Flash storage and automatically uses that Flash as cache memory to improve performance. Elastic Storage virtualizes the storage, allowing multiple systems and applications to share common pools of storage. This enables transparent global access to data without the need to modify applications and without the need for additional and often disruptive storage management applications. Since Elastic Storage is not reliant on centralised management to determine file location and placement, customers can have continuous and highly-available access to data in the event of software or hardware failures. A key component of Elastic Storage is its ability to automatically and intelligently move data to the most strategic and economic storage system available. For the National Center for Atmospheric Research’s Computational and Information Services Laboratory (CISL), growing data volumes are part of its DNA. The organisation, which stores and manages more than 50 petabytes of information between its Wyoming and Colorado centres, relies on Elastic Storage to give researchers fast access to vast amounts of diverse data. "We provide computational, educational, and research data services for geosciences to more than 1,000 users at more than 200 different sites," said Pamela Gillman, manager, Data Analysis Services Group, CISL. "The IBM global file system software has enabled scalable, reliable and fast access to this information. That has dramatically improved the performance of the different functions, as well as the organisation as a whole." A key component of Elastic Storage is its ability to automatically and intelligently move data to the most strategic and economic storage system available. Through policy-driven features and real-time analytics, for example, Elastic Storage can automatically move infrequently-used data to less expensive tape drives, while storing more frequently-accessed data on high-speed Flash systems for quicker access. Such policy-driven features can provide cost savings of up to 90 percent. In addition, the software features native encryption and secure erase, which ensures that deleted data is irretrievable, helping organisations comply with regulations such as HIPAA and Sarbanes-Oxley. Through its support of OpenStack cloud management software, Elastic Storage also enables customers to store, manage and access data across private, public and hybrid clouds for global data sharing and collaboration. In addition to supporting OpenStack Cinder and Swift access, Elastic Storage supports other open APIs such as POSIX and Hadoop. While traditional storage systems must move data to separate designated systems for transaction processing and analytics, Elastic Storage can automatically balance resources to support both types of application workloads, including Hadoop-based analytics. This dramatically speeds analysis and eliminates the costly and time-consuming process of producing duplicate copies of data. Elastic Storage software will also be available as an IBM SoftLayer cloud service later this year. 
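The announcement does not publish the policy syntax itself, but the idea behind policy-driven placement is simple: rules evaluate file metadata and decide which pool each file should live in, and the file system then migrates data accordingly. The Python sketch below is a hypothetical illustration of that pattern only – the pool names, thresholds and `last_access_days` field are invented for the example and are not IBM's actual policy engine.

```python
from dataclasses import dataclass

@dataclass
class FileRecord:
    path: str
    size_gb: float
    last_access_days: int   # days since the file was last read

# Hypothetical pools, ordered from fastest and most expensive to cheapest.
POOLS = ("flash", "disk", "tape")

def choose_pool(f: FileRecord) -> str:
    """Toy placement rule: hot data on flash, warm data on disk, cold data on tape."""
    if f.last_access_days <= 7:
        return "flash"
    if f.last_access_days <= 90:
        return "disk"
    return "tape"

def plan_migrations(files, current_placement):
    """Yield (path, from_pool, to_pool) for every file that should move."""
    for f in files:
        target = choose_pool(f)
        if current_placement.get(f.path) != target:
            yield f.path, current_placement.get(f.path), target

if __name__ == "__main__":
    files = [
        FileRecord("/proj/model.dat", 120.0, 2),
        FileRecord("/proj/archive-2012.tar", 800.0, 400),
    ]
    placement = {"/proj/model.dat": "disk", "/proj/archive-2012.tar": "disk"}
    for move in plan_migrations(files, placement):
        print("migrate", move)
```

In a production file system the policy engine does this continuously and transparently to applications; the sketch only shows that placement can follow declarative rules rather than manual copies.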
About IBM Big Data and Analytics Each day, we create 2.5 quintillion bytes of data generated by a variety of sources -- from climate information, to posts on social media sites, and from purchase transaction records to healthcare medical images. At IBM, we believe that data is emerging as the world's newest resource for competitive advantage, and analytics is the key to make sense of it. IBM is helping clients harness Big Data and analytics to provide insights needed to make better decisions, create value, and deliver that value to clients and society. IBM has the world's deepest and broadest portfolio of Big Data and analytics technologies and solutions, spanning services, software, research and hardware. For more information about IBM Software Defined Storage, visit www.ibm.com/storage and follow us on twitter: @ibmstorage. ### How do Kingston see the storage market maturing throughout 2014? Matthew Mackender, EMEA SSD Business Manager at Kingston Technology, gives us some market insight into the take-up of SSDs and Flash memory. How do Kingston see the storage market maturing over 2014? We feel that 2014 will be another very strong year for SSD in both the client and enterprise market segments. In the past SSDs were typically used by power users and high-end server configurations, but with the cost per GB of NAND flash dropping to below £0.50 on consumer-grade drives, as well as significant drops on enterprise-class drives, that could now be set to change. On the client side, although users have always been aware of all the benefits of SSD, when it came to the purchasing decision it was sometimes hard for them to justify purchasing an SSD when they compared the price vs performance ratio. However this has now changed, and the jump in performance compared to the increase in price over an HDD now means the purchase makes sense. On the enterprise side, although we are still a long way from full-scale adoption of all-flash server configs, there has definitely been an increase in the awareness of the benefits of implementing SSD. On the enterprise side there has definitely been an increase in the awareness of the benefits of implementing SSD [and] there have been a number of improvements in controller technology that help to mitigate the known limitations associated with NAND endurance. Obviously with enterprise customers cost is a determining factor in the purchasing decision, and the cost drops on NAND flash will help to increase run rate, but more of a concern is making sure that the components in the server are built to last. Over the last few years there have been a number of improvements in controller technology that help to mitigate the known limitations associated with NAND endurance, and with these improvements set to continue, this will contribute to increased adoption of, and confidence in, the benefits of SSD among enterprise customers. How does this compare to the rest of Europe? SSD sales continue to grow across all of Europe; however, we do see the more mature, early-adopting regions such as DACH and Scandinavia leading the way when it comes to units and GBs shipped. Are there any trends in the flash/storage market occurring at the moment – what are you noticing, from a customer buying point of view? There has definitely been a move towards higher-capacity drives in recent months, with the high runners moving from 60GB and 120GB up to 240GB and 480GB. On the enterprise side we have seen a number of customers purchasing consumer-grade SSDs knowing that these drives do not necessarily fit the endurance needs of their environment; however, they are happy to replace these drives as and when they max out, due to the price point compared to higher-endurance eMLC enterprise-grade SSDs. ### Carrot, stick or lockdown? 
How to approach BYOD Here at Compare the Cloud, we've already taken a look at the factors to consider before implementing a BYOD scheme: cost, productivity, employee morale, acceptable use, security, and availability, according to fellow commentator Rick Delgado. How best then to address these?  Carrot, stick or lockdown? The carrot I've discussed elsewhere how, when implemented and managed well so they remain relevant for business users, Enterprise App Stores (EAS) can be a positive way of encouraging the use of 'acceptable' applications. Our carrot, then, if you will. Wiser minds than mine, however, have suggested that EASs cannot yet be considered carrots; in fact, they are more like unicorns.  More thought about than seen. The stick Where to turn then, to manage our BYOD scheme?  If an EAS is the carrot, Mobile Device Management (MDM) is the stick.  MDM might work well in a corporate environment where employees are using corporate-owned devices.  But in a BYOD setting, there is a conflict between the needs of IT to control the device and the fact that the device is user-owned.  Can IT really expect user acceptance of its ability to  wipe the device and/or restrict the use of apps on it? There is a conflict between the needs of IT to control the device and the fact that the device is user-owned. In its role as stick, the traditional MDM model doesn't fit the BYOD environment.  It simply isn't acceptable or desirable for IT to have this level of control over personal devices.  It's a problem that the MDM vendors are struggling to grapple with: in mid-2013, Gartner's John Girard warned "MDM is in chaos right now and I think this market is going to die." At the heart of the problem is, of course, the issue that MDM does not help you deliver core business applications and data. Girard identifies the leaders in the MDM space as AirWatch, MobileIron, Citrix, SAP, Good Technology and Fiberlink, and says they are all partnering with other vendors to provide Mobile Application Management capabilities or developing ways to wrap a security policy container around apps. In mid-2013, Gartner's John Girard warned "MDM is in chaos right now and I think this market is going to die." 'App wrapping' is a user/device-centric access control method for executing applications.  This approach does present familiar concerns about app compatibility, application support and cross-platform operability. Some tools exist to run a segregated, encrypted version of the OS on a device which can then conform to corporate security policies whilst insulating the user's personal device from those policies.  However, the device will inevitably take a performance hit using this kind of approach. VDI delivers the business tools and data the user needs, whilst also allowing IT to protect corporate data and applications. The 'odd man out' of Girard's list, perhaps, is Citrix which offers an existing server/ client solution that delivers something MDM cannot. The lockdown Citrix's XenDesktop solution – and other VDIs like it – solve a lot of the problems of BYOD.  Providing the user with a discrete desktop instance running in the data centre (whether that data centre resides in the cloud or not) delivers the business tools and data the user needs, whilst also allowing IT to protect corporate data and applications.  By putting rules in place to restrict the transfer of data between the virtual desktop and the device, and the opening of corporate files outside of the VDI, security risks can be addressed.  
Meanwhile, the user is free to run whatever personal applications they wish to on the client device. So a VDI offers the best possibility of lockdown, especially when combined with MDM. In tandem with this, it is sensible for organisations to adopt a new, two-tiered approach to network management: one open network which allows access for the myriad of devices brought into the organisation, and a second, highly secure network on which corporate applications and data reside. In the past VDI has suffered with user acceptance and the issue of application portability, but the market is developing apace. One large installation I was discussing recently had gravitated to a Citrix desktop (because of the legacy in-house Citrix knowledge base and skills) sitting on a VMWare virtualisation platform. The solution owner, a dyed-in-the-wool Citrix aficionado, was seriously considering a move to VMware Horizon View because of the way the latest version addresses this issue of cross-platform inter-operability whilst continuing to integrate with existing Citrix investment. The best approach? The best approach will depend on the degree of mobility required. The employee base of the organisation will also play a part, both in terms of the risk it represents and the benefits that are demanded. Where there is a high churn of low-skilled staff, a myriad of different apps probably won't be demanded or advisable. But where there is a high skill base, there may be a greater tendency to stray outside the approved App Store. Ultimately, the best approach to take will be determined by each organisation's individual security policies and requirements. ### Will Virtual Desktops ever arrive? It has already happened. Here's the problem (and it took me quite some time to come to the realisation): all these anti-VDI blogs have been written by people who have had a poor experience of VDI. Looking at our setup and our demo platform, everyone who tests a demo desktop always reports how fast the performance is. We are Citrix based and, as much as everyone thinks setting it up is easy, I would say well over 95% of Citrix deployments aren't set up correctly. Take the example of HDX2; you must have at least 10 components correctly configured for Flash Redirection to work as intended - any misalignment and it simply demonstrates substandard performance. I can't find full documentation online about this; we've had to figure it out the hard way. Anti-VDI blogs have been written by people who have had a poor experience of VDI. I've been reading similar blogs now for some time. My business - VESK - doubled its user base in 2013. I am not writing this response because I part-own a hosted VDI company; I'm saying it because I am an investor in technology that works. And VDI is now taking off. DaaS is a different matter - it's just MS apps, so I'll let Guise Bule [CEO of TuCloud] answer that. Guise has a large US user base, so DaaS for him has been more about security. I know personally that I couldn't sell DaaS. My customers want an entire HVD (Hosted Virtual Desktop) solution, all their data, apps, desktops, servers - i.e. PaaS. A different market entirely. I couldn't sell DaaS. My customers want an entire HVD (Hosted Virtual Desktop) solution, all their data, apps, desktops, servers - i.e. PaaS. A different market entirely. Now look at the larger providers; Amazon are now offering Workspaces, VMWare are offering Desktone, and Microsoft will be offering desktops soon. 
Many of the large telcos in Europe have now acquired desktop providers (such as Colt’s acquisition of ThinkGrid and Telefonica’s acquisition of EyeOS) or have their own desktop offering. The larger providers were waiting to see if we succeeded, which we have, so they are now offering it themselves, and their desktop offerings will evolve. The simple fact of the matter is that as of January 2014, 90% of business desktops consist of one form of MS Windows - mainly Windows 7, 8 and XP. Even SaaS apps are sitting on an operating system somewhere, either Windows or Unix based. Do you remember in older versions of Citrix Presentation Server or MetaFrame, what would happen if you were running a SaaS app and you pressed SHIFT+F2? The application would minimise and you would see the application sitting on a Windows desktop. [pullquote]We have also published Linux desktops, cheaper than Windows. BUT they didn't take off - end users like Windows. [/pullquote]We have also published Linux desktops, cheaper than Windows. BUT they didn't take off - end users like Windows. As much as people love to berate Microsoft, there's no arguing their desktop operating system's longevity and strength. They of course make mistakes - Windows ME and Windows 8 - but there's always a solid OS in production; right now that's Windows 7 (in 2015 it will be Windows 9) and Windows 2008 R2 for session-based desktop computing. Imagine how long it would take to change hundreds of millions of computer users (2Bn PC users now) from saving a file as a .doc / .docx to using something else? Even if there was a big take-up it would take years and years. Most businesses use Microsoft Office, firstly because their staff know how to use it and secondly because it's a productive and useful tool. We seem to overlook this fact because of the bespoke apps we use and the relationships we have with those vendors. PC sales have stopped slumping for the first time in a long time; this is in part due to Windows XP reaching EOL. Of course PC sales aren't going to grow like they were: you now have a myriad of thin clients, amazing new tablets and laptops that have a 30-hour battery life. These devices put less emphasis on the PC; that's all that's happening here: dilution of the workstation thanks to evolution of the device. I would not want to write a lengthy tender or a blog or update an Excel spreadsheet on my iPad. People are just working on a number of different devices. But as I have always said: if you want to be productive, if you want to socially interact (in the physical sense) with your work colleagues (for many businesses this is essential), you need to be at your desk, in front of a large screen, with a user interface that is at least as good as a mouse and keyboard. I would not want to write a lengthy tender or a blog or update an Excel spreadsheet on my iPad. iPads are consumer devices, unless you don't mind having users being disturbed by their social apps. I don't know many businesses that use iPads as their primary device. Of course in some cases iPads (sorry for picking on the iPad here, I am referring to most handheld devices) are great for business, but in the majority of use cases, business users need a computer and desktop (be that thin client or PC) from which to work. Most people I know have now realised this. Mobile access will continue to increase. Microsoft will continue to innovate. SaaS apps are great - you could run a thin client with single sign-on and access SaaS only. 
But in reality, there's a reason why FireFox is a 30MB download and Windows is a 2GB download. Enough said, I think - this sentence should probably prompt a separate blog :-) There's a reason why FireFox is a 30MB download and Windows is a 2GB download. So now that we have ascertained that we need a DESKTOP computer, what else do we need? A desktop. And if the technology allows for the desktop to be accessed remotely just as well as locally, then why not access it remotely, if the desktop can be backed up, managed and then added into a network (aka the data centre) which contains all of your applications, data, user profiles and servers? HVD didn't take off in 2000 with Terminal Services because Internet connectivity was not as reliable, and the TS stack would show you a simple multi-coloured JPG image by streaming 1 inch horizontal strips to your occipital lobe one fractal row at a time. The technology has now changed: with HDX 3D Pro you can stream AutoCAD online; the App Store now has the latest version of Citrix Receiver available for free download, allowing a Windows desktop and some seriously powerful Windows / Unix-based applications to run on your iPad, which would never be possible on the iPad natively; we can run your desktops from 1U servers that contain half a TB of RAM and 64 processor cores; the list goes on. Now consider the security of tier 4 datacentres, access via 2048-bit SSL (soon to be upgraded to quantum cryptography), hourly snapshots, real-time DR - you get the picture. Another important point about VDI is that many private VDI deployments have failed because the curators miscalculate the realistic investment required, hence the reason public and hybrid clouds can be leveraged by SMEs without having to go through the whole learning curve of building their own. With HDX 3D Pro you can stream AutoCAD online [and] the App Store now has the latest version of Citrix Receiver available for free download allowing some seriously powerful Windows-based applications to run on your iPad. I used to run an IT business offering support for local desktops and servers. I know how profitable that can be. We used to make over 90% profit on more than 90% of our service agreements. VESK on the other hand is lucky to make 10-15% profit at the end of the year. The margins are lower, so it's not quite as appealing - another reason for some of these anti-VDI blogs. Out of all the techies on the planet, there seem to be a disproportionate number who write about this. The fact is that there haven't been enough GOOD implementations. VESK has over 400 customers, we have zero outbound sales, and our new business derives from word of mouth. So the proof is in the pudding. VDI is taking off; if the big boys are offering it now, it will only be a matter of time before a good % of desktops are simply stored in the datacentre, rather than on a local workstation. I call that Hosted VDI or HVD. Of course there are other technologies emerging, other options. But if you want to train your staff in using SaaS-only apps, upgrade all legacy apps and somehow forget that half your apps integrate with one another - be that Exchange integration or Windows integration - then the other option is local servers and local desktops. And for those of you who want to stay local, if you have nothing to gain from moving to the cloud, I say: stay local. 
If you have the in-house expertise to manage your applications, if you don't have an awful lot of remote working, and if you're happy with your in-house / outsourced IT team, my advice is "if it's not broken then don't fix it". And for those of you who want to stay local, if you have nothing to gain from moving to the cloud, I say: stay local. A pushy salesman may tell you that if you don't back up your data properly 90% of businesses go out of business, blah blah blah, but I would say: don't be pushed into anything. I would only ever sell our solution if it's right for your business. It can be expensive, so if you're happy with your local servers, a few installed apps, a few SaaS apps, local desktops and local support, and you're happy with the reliability, then there's no need to fix it. If you find you're constantly having to upgrade software, have regular downtime because you don't have the expertise to manage Exchange, or have a poor experience when accessing your data and applications remotely, then consider moving to the cloud. So to answer the question, yes we have arrived. Every HVD provider that I know is growing. User experience is always improving. The big players have started to offer desktops, because they understood that they would have to. With billions of people connected online, there will always be negative write-ups, as with any form of technology - especially one that is more complex and generates a lower profit margin than typical IT models for the Service Provider. So yes, we have already arrived. Reading this blog back in two years' time, it will seem obvious. ### Server Flash and Flash Hypervisor solution lowers application latency [quote_box_center] CASE STUDY "The Pernix[Data]-Kingston acceleration solution allowed us to extend the service life of our existing SAN solution by two to three years. It's been an economical alternative to buying a new array and was easy to install, configure and manage remotely at our international locations." Tran Phan, Senior Storage Administrator, Kingston Technologies [/quote_box_center] The Business challenge Our SAN was over committed which resulted in high application latency times and performance issues... Privately owned Kingston Technology Company, Inc. (Kingston) designs, manufactures and distributes DRAM and Flash products. Businesses and consumers use its memory solutions for desktops, notebooks, servers, printers and consumer electronics. The firm's 4,000+ employees work in multiple worldwide locations to support manufacturing, engineering, distribution, service and sales operations. Kingston's IT department oversees the operations of six discrete data centers around the world. In order to increase the firm's IT operational efficiency, along with network security, managers implemented server consolidation and disaster recovery initiatives. The projects both leveraged the existing storage area network (SAN) infrastructure to support workloads running everything from database applications to file and print servers to email systems in a virtual machine (VM). Soon after starting the projects, the IT team experienced storage performance concerns. "Our SAN was over committed which resulted in high application latency times and performance issues," recalls Tran Phan, senior storage administrator for Kingston in Fountain Valley, CA. "That led to a lot of incident calls, with people complaining that they couldn't finish their work or process customer orders." Phan's team weighed its SAN upgrade options. 
“It would have been too expensive for us to just purchase another array,” says Phan. “So, we looked at ways to leverage server-side flash solutions to accelerate virtualized server workloads and prolong the service life of our legacy storage systems.” ...we looked at ways to leverage server-side flash solutions to accelerate virtualized server workloads and prolong the service life of our legacy storage systems. Technology Solution Phan's research revealed several alternatives. “Some vendors use hardware-based flash acceleration while others use a combination of software and server flash to accelerate workloads. We went with the latter approach.” Phan's team implemented the PernixData FVP solution to aggregate server-side flash for several locations. The combination of Kingston enterprise-class SSDs with the PernixData Flash Hypervisor allows Phan to scale out a data tier to accelerate the primary storage at corporate sites. “We went with Pernix[Data] because the application operates at the host extension level,” explains Phan. The combination of Kingston enterprise-class SSDs with the PernixData Flash Hypervisor allows Phan to scale out a data tier to accelerate the primary storage at corporate sites. “Other reasons were that it doesn't require a virtual appliance to manage the software, and it's certified for inter-operability with Kingston SSDs.” The software-based solution — leveraging existing SAN infrastructure — had the added advantage of being very easy to deploy. “We are able to install and implement the Pernix[Data] solution quickly and remotely to international locations without the hassle of having to travel there or procure and ship hardware,” recalls Phan. After a total installation time of 15 minutes, Phan was able to aggregate his server-side Flash devices to increase the performance of his VMs. While the PernixData solution accelerates both read and write caching at the host level, Kingston's environment is much more write-centric. Consequently, “We needed to use enterprise-class SSDs because they have the endurance to support our write-intensive environment.” The use of Kingston enterprise-class SSDs gives Phan's SAN solution longevity. They deliver up to 10 times the endurance of client SSDs, with up to 30,000 program/erase cycles versus the 3,000 that consumer-grade SSDs typically offer. Business results The implementation of the PernixData hypervisor software in conjunction with the Kingston enterprise-class SSDs delivered a number of benefits to the company. Boosted I/O headroom to cut costs and reduce risk. The PernixData-Kingston solution solved the company's storage over-commitment problem. “Before, we could only accommodate 50 of the 150 virtual servers that our production environment needed,” explains Phan. “The SAN solution gave us the needed I/Os to handle the total allotment.” As a result, Phan's team was able to complete a cost-saving server consolidation project leveraging VM technologies. That in turn allowed the implementation of a DR-upgrade project with multiple fail-over site options. SAN solution performance promotes employee productivity The PernixData-Kingston solution lowered application latency by 83 percent, from 30 ms to five ms during peak load. “We get a lot fewer service desk calls now because users no longer wait on applications and they can now promptly process customer orders.” 
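PernixData's actual FVP implementation is not documented here, but the underlying idea - keep hot blocks on server-side flash so most I/O never has to reach the SAN - can be sketched in a few lines. The following Python toy model is purely illustrative: the 4 KiB block size, the LRU policy and the `SanBackend` stand-in class are assumptions for the example, not the vendor's design.

```python
from collections import OrderedDict

class SanBackend:
    """Stand-in for the slow shared array (hypothetical)."""
    def __init__(self):
        self.blocks = {}
    def read(self, lba):
        return self.blocks.get(lba, b"\x00" * 4096)   # pretend 4 KiB blocks
    def write(self, lba, data):
        self.blocks[lba] = data

class FlashCache:
    """Toy host-side flash cache: serve hot reads locally, write through to the SAN."""
    def __init__(self, backend, capacity_blocks):
        self.backend = backend
        self.capacity = capacity_blocks
        self.cache = OrderedDict()              # lba -> data, ordered by recency

    def read(self, lba):
        if lba in self.cache:                   # cache hit: no SAN round trip
            self.cache.move_to_end(lba)
            return self.cache[lba]
        data = self.backend.read(lba)           # cache miss: fetch once and remember
        self._insert(lba, data)
        return data

    def write(self, lba, data):
        self._insert(lba, data)                 # keep the fresh copy on flash
        self.backend.write(lba, data)           # write through so the SAN stays authoritative

    def _insert(self, lba, data):
        self.cache[lba] = data
        self.cache.move_to_end(lba)
        if len(self.cache) > self.capacity:     # evict the least recently used block
            self.cache.popitem(last=False)
```

The real product also offered a write-back mode, acknowledging writes from flash and replicating them to peer hosts; write-through is shown here only because it is the simplest variant to reason about.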
The combined solution improved two Kingston SAN performance indicators, as shown in figures 1 and 2 below. Figure 1: PernixData Flash Hypervisor software works with Kingston SSDs to lower SAN CPU utilization. Figure 2: Disk array utilization fell by 45 percent with the PernixData-Kingston solution. An economical SAN upgrade strategy for other locations “As other sites outgrow their storage platforms, we have a template to quickly and easily deploy the same solution at their locations,” says Phan. “The ability to administer the solution remotely helps me reduce the workload of local admins who wear enough hats as it is.” Another benefit is the ability to optimize Phan's back-end storage independently of each VM. That eliminates the need to profile each VM to utilize server-side Flash as some other caching products require. “I had a ‘wow' moment, when I learned that I didn't have to touch individual VMs to optimize them,” recalls Phan. “All I have to do is add a VM to the server-flash cluster and the Pernix[Data] solution automatically optimizes it.” The solution's boost in I/Os also helps the corporate balance sheet because, “It allowed us to extend the service life of our legacy SAN solutions by two to three years,” explains Phan. “And we were able to do that at about half the cost of a new array.” To learn how enterprise-class SSDs from Kingston Technology can help your organization, visit www.kingston.com To learn how the PernixData FVP solution can help your organization, visit www.pernixdata.com SUMMARY Workload acceleration delivered the I/Os necessary to: implement a cost-saving server consolidation project along with the virtualization of the production environment, and cut application latency. Boosted employee productivity. Eliminated customer-order-processing delays. Silenced angry user help-desk calls. Stronger disaster-recovery solution lowered risk to business. Extended service life of existing SAN by “two to three years” at 50 percent less cost than the alternative. Significantly improved SAN performance. Slashed latency by 83 percent, from 30ms to 5ms. Lowered server CPU utilization by 35 percent. Cut legacy disk array utilization by 45 percent. Dramatically reduced SAN upgrade time-to-deployment. Can install solution in 15 minutes. Remote administration from HQ frees on-site IT staff to work on other projects. 
### Net Neutrality: good for cloud computing? Net Neutrality – the idea that all Internet traffic should be treated equally and shouldn't be meddled with – has hit the headlines again. This April we have seen the EU take a major step towards the enforcement of net neutrality across the 28 countries of the European Union. And we have seen what’s viewed as an equally large step in the opposite direction in the US with the FCC looking to favour new rules that many claim abandon the core principles of net neutrality. Supporters of net neutrality vastly outnumber those against, or at least they are certainly a lot more vocal. Proposals to allow network service providers to receive money from content providers in exchange for supplying a prioritised Internet delivery appear to infringe on the equal access / equal treatment principles around which the Internet has grown up. Supporters are concerned that abandoning commitments to net neutrality will lead to a two-tier Internet – a super-quick highway for those that can afford it, and a congested B-road for the rest of us to contend with. 
As one US commentator states in this article: "It threatens to make the Internet just like everything else in American society: unequal in a way that deeply threatens our long-term prosperity." Road Neutrality ceased a long time ago Opponents of net neutrality insist that the Internet should not be subject to Government interference, and that free market forces should be allowed to play out in order to permit innovation and investment. They point to the fact that overall consumer Internet speeds are increasing rapidly and that consumers are demanding high-quality content which they expect suppliers such as Netflix and Apple to supply dependably and reliably. Certainly they are correct to assert that not all traffic is the same, and that certain applications such as video and voice data particularly need extra speed and bandwidth - whereas services such as email and file sync do not. Furthermore they state that a commercial offer to prioritise certain services makes sense for the provider, the consumer of that service, and indeed the non-consumer of that service. As Internet economics and policy commentator Roslyn Layton suggests in this article: "Net neutrality means that I have to pay for my neighbour's Netflix, even though I don’t watch it myself". So those on both sides of the argument claim to be representing the best interests of the end consumer. Who is right, and what does all this mean for the consumer (and supplier) of business cloud services? Well, in the business world, we are used to paying for things – and in many instances these things are purchased in order to gain an edge and a competitive advantage. There are plenty of examples but here’s a pertinent one: your business may well have purchased a leased line (private circuit) service connecting your office to the Internet or your private cloud, which gives your business faster/more consistent/more reliable connectivity than it would expect to receive from a shared xDSL broadband service. Now today, as more of our business is happening on the Internet and on mobile thanks to social media, VoIP, video conferencing, file sharing and cloud services, business users are having to mix it up a bit more with consumers and share fixed and mobile Internet resources. So, if we had the chance to invest in SLA-backed, business-grade Internet access to certain applications - wouldn't that be attractive? Let’s face it, that’s more or less what your business is hoping to achieve by purchasing that leased line. However, since this leased line is a private circuit, it’s not considered ‘the Internet’, its existence is also unlikely to adversely affect other Internet users, and it’s not paid for by your application or content provider prioritising their content over others – and so it's not an infringement on net neutrality, even if it is an Internet advantage. Looking at things from the other end of the delivery chain, it is also evident that your business-grade supplier of content, applications and 'XaaS' from the cloud is also making investments to improve their chances of delivering their Internet service to you - by paying for things. Content Delivery Networks (CDNs) have offered paid caching services for many years and now offer a wide range of dynamic content acceleration options in order to improve end-user experience of online services. 
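As a rough illustration of what a CDN's paid caching buys, here is a hedged Python sketch of an edge node serving content from a local cache when it can and falling back to the distant origin when it cannot. The TTL, the `fetch_from_origin` helper and the simulated latency are hypothetical, not any particular CDN's behaviour.

```python
import time

CACHE_TTL_SECONDS = 300          # assumed edge-cache lifetime for this example
_edge_cache = {}                 # url -> (expiry_time, body)

def fetch_from_origin(url: str) -> bytes:
    """Placeholder for the slow trip back to the origin server."""
    time.sleep(0.2)              # pretend ~200 ms of long-haul latency
    return f"content of {url}".encode()

def edge_get(url: str) -> bytes:
    """Serve from the edge cache when fresh; otherwise refill from the origin."""
    entry = _edge_cache.get(url)
    now = time.monotonic()
    if entry and entry[0] > now:                 # cache hit: user served from nearby
        return entry[1]
    body = fetch_from_origin(url)                # cache miss: pay the long round trip once
    _edge_cache[url] = (now + CACHE_TTL_SECONDS, body)
    return body

if __name__ == "__main__":
    edge_get("https://example.com/video/segment1")   # slow, fills the cache
    edge_get("https://example.com/video/segment1")   # fast, served at the edge
```

The point the article goes on to make is that this kind of acceleration happens on servers the provider pays for, not by prioritising packets inside someone else's network - which is why it doesn't raise the same neutrality questions.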
Similarly, most cloud service providers have their pick of a selection of IP transit providers, and can choose the best Internet routes to users using BGP tables – again to improve their service to end-users with better Internet connectivity. Again, these actions aren't in breach of net neutrality because content is 'prioritised' using servers not networks, and selecting the best network doesn't adversely impact anyone else. The sticking point therefore is when that content or application provider wishes to strike up a deal with the network owner (Internet operator) prioritising their content over everyone else’s. This is not a major consideration on the backbone of the Internet (as provider choice, bandwidth and private interconnects are abundant); it’s a factor in the ‘last mile’ – the Internet leg that is the most bandwidth-constrained, congested and expensive to deliver – and where the fewest suppliers exist. If your video content provider wants to pay to install dedicated leased-line Internet access into your home, that would be fine; but if they want to pay your ISP to receive preferential treatment on your existing xDSL or cable Internet service, then that’s not fine – as there’s the possibility it might adversely impact and be to the detriment of other broadband users or other Internet content/application suppliers. This raises the question of whether triple-play (Internet + TV + Phone) providers are actually infringing net neutrality principles by assigning portions of your bandwidth to (or prioritising) OTT services that they themselves are supplying. So it’s permissible if you operate the network and supply the content, but not OK if you only supply the content? Is my neighbour’s triple-play service affecting my Internet access experience? Well, yes, it could well be. It’s clear there is a fine line between abiding by and contravening net neutrality principles. And reportedly some providers such as Apple and Comcast are seeking to test that fine line in pursuit of buying and selling a performance advantage. The acid test for net neutrality proponents shouldn't really be whether network providers allow the purchase of ‘premium’ service levels, but whether in doing so it has an undesirable effect on the service (and/or the rate of service improvement) that is then received by others on the ‘standard’ level of service. Policing this could prove interesting - particularly if it turns out there’s a huge market for prioritising content - which there will be if it's not policed very well. 
If your business operates in such areas, net neutrality would scupper any chance for you to buy an advantage on existing Internet connections. It may also disincentivise investment in capacity upgrades from the ISP/network owner. b) How susceptible and vulnerable are the services you buy or sell to varying Internet conditions? What are you relying on the Internet to do for your business? If your business is buying or selling business-grade voice or video services then you may also miss out on a big opportunity to secure or deliver a better service, perhaps one with an SLA on performance. The biggest factor, however: c) How much money does your business have? If your business is a start-up cloud supplier or cloud consumer, then you're likely to have a lot less money to invest in securing an Internet advantage than the big players with whom you may compete. Here net neutrality helps you out by not letting your competitors buy an advantage over you. However, this does rather ignore the fact that they are already buying advantages over you in everything from IT hardware to CDN to marketing to human resources. When opportunities exist for the rich and not the poor, you can see why tempers flare. ### The Case for Open Source Virtualization Technology commentator Rick Delgado outlines the key business, technology and operational advantages of open source virtualisation software. Virtualisation has now become a regular part of the business world. Company leaders no longer look at it as the wave of the future so much as one of the present-day tools used for business. It’s little wonder why virtualisation technology has transformed into something so integral, especially with the way it allows businesses to save money and improve productivity. With virtualisation now considered the norm, many businesses are turning to open source virtualisation to gain even more benefits. As every business leader is quickly learning, there is definitely a case to be made for going open source for their virtualisation needs. Open source virtualisation software takes many of the same advantages as regular server virtualisation and builds upon them by further consolidating a company’s virtual and physical infrastructure while maintaining many of the same cost savings. On its own, virtualisation makes for more processing agility and better management of resources, and while those benefits are extended through the open source version, adopters of open source will see even more positives from the experience. First, it’s important to understand the concept of open source software. Companies can use open source software developed for various purposes; unlike proprietary systems, the software is usually free of charge. Besides the cost, the software also allows users to become co-developers - they can take the existing software, build improvements upon it and iron out any of the mistakes and bugs. Modification is the name of the game, and these improvements can be shared with others who are also using the open source software. This customisability is a very attractive option for many companies. In fact, open source software saves companies around $60 billion every year. The same applies for open source virtualisation. The process of virtualising a company’s servers can be a complicated one, but with open source virtualisation, since costs are lower, deployment is easier. The customisability of the software code is also a huge benefit. 
Each business has different goals and methods of practice, and that means they need software that they can modify to suit their own needs. When using proprietary systems, businesses are locked in to the methods and uses of the software they purchased. That’s not the case with open source. Businesses can make any new additions and modifications to the code so that it impacts their company and servers in the best way possible. Open source software saves companies around $60 billion every year. Open source virtualisation also provides a far larger degree of flexibility for a company. The options available to a business, given they have the right personnel, are almost endless, as long as any changes to the code are within the realm of server virtualisation. Companies also use a variety of other important software and hardware tools, and finding a specific virtualisation product that is compatible with all of them, in all their varieties, can be difficult. Luckily, open source virtualisation solves that problem by allowing companies to change and add to the code, allowing it to work with different operating systems and cloud computing services. Businesses also have a large variety of open source virtualisation software packages and technologies to choose from when picking what they want, though some have installation fees associated with them. KVM (Kernel-based Virtual Machine) supports Linux and Windows and gives each virtual machine its own hardware resources. VirtualBox provides full virtualisation features and even comes in open and closed source versions depending on the business’s preference. OpenVZ is very flexible even as far as open source virtualisation goes and allows for resource additions in real time without a system reboot. There are many other options, each with special features adding to the software’s already impressive modification abilities. 
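To make that flexibility concrete, here is a minimal sketch using the open-source libvirt Python bindings to enumerate the guests on a KVM host - the kind of scripting an open stack makes straightforward. It assumes the libvirt-python package is installed and a local hypervisor is reachable at qemu:///system; adapt the connection URI for your own environment.

```python
import libvirt

# Open a read-only connection to the local KVM/QEMU hypervisor (URI is environment-specific).
conn = libvirt.openReadOnly("qemu:///system")
if conn is None:
    raise SystemExit("could not connect to the hypervisor")

for dom in conn.listAllDomains():
    # info() returns [state, max memory KiB, memory KiB, vCPUs, CPU time].
    _state, _max_mem_kib, mem_kib, vcpus, _cpu_time = dom.info()
    running = "running" if dom.isActive() else "stopped"
    print(f"{dom.name():20s} {running:8s} {vcpus} vCPU(s) {mem_kib // 1024} MiB")

conn.close()
```

A closed platform may expose similar management APIs, but with an open source stack you can also change what sits behind them.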
When using proprietary systems, businesses are locked in to the methods and uses of the software they purchased. Open source virtualisation is just the latest tool companies are utilising to get the most out of their technical infrastructure. The benefits have been shown to give businesses added flexibility, lower costs, and greater productivity. The technology is still being adopted, but it’s only a matter of time before more companies are using open source virtualisation on a regular basis. ### Choosing the right CSP – how important are multiple sites? By Russel Ridgley, Head of Cloud Services at Pulsant Selecting the right cloud service provider or colocation partner is a complex task that can have long-term consequences on different aspects of your business. Two of the most important considerations are security and availability. You need assurances that your provider has the expertise, experience and resource to cope with these demands, as well as any number of other issues that relate to your business' data and applications. To some extent, evaluating a vendor's credentials – ISO 27001, ISO 9001, CSA STAR – can allay fears and inspire confidence that they are capable of doing what they say they can. However, there are other factors to consider, and while price always plays a role in decision-making, it shouldn't be the sole element. Selecting the right cloud service provider ... can have long-term consequences on different aspects of your business. To be assured that your data and/or systems are always safe from cyber criminals and always available, your provider should also be offering a comprehensive disaster recovery (DR) and back-up service. And this is where the differentiation between single-site operators and multiple-site operators becomes apparent. Fact: data centres do fail, whether it's a glitch in a single server or a complete fall-over, and therefore your data needs to be protected. So if you've chosen a small or start-up data centre provider due to favourable pricing, even if it meets security and availability requirements, you should be asking questions about DR and back-up. ...your provider should be offering a comprehensive disaster recovery (DR) and back-up service ...this is where the differentiation between single-site operators and multiple-site operators becomes apparent. Trusting a provider with multiple data centre sites ensures that your data will always be safe – whether it is backed up to a site across the country or one within commutable distance. If one data centre in the network experiences issues, there are seamless plans in place to deal with them. That's not to say that single-site operators cannot cater for DR and back-up. It simply means that the options for these services are limited, and while data may be backed up and a DR site is available, it is outsourced and therefore possibly open to other issues surrounding security and the type and effectiveness of the DR itself. Single data centre site operators also may struggle to offer you and your business the opportunity to scale. The worst-case scenario is that you will have to migrate your entire IT infrastructure across to a new provider who can properly deal with your growing needs. This, however, is time- and cost-intensive and essentially means that you have to go through the process of selecting a cloud service provider all over again. [If] it is outsourced [it is] possibly open to other issues surrounding security and the type and effectiveness of the DR itself. Ultimately the business model of the single-site operator may not be sustainable. It has neither the resources nor the benefit of economies of scale when it comes to purchasing power, investment in infrastructure, etc., which means that the cost savings normally associated with these things cannot be passed on to the customer. While costs may initially be low on sign-up, they will need to increase sharply in order for the operator to maintain its profitability. And if it can't… the result could be a move to a new provider anyway, as unprofitable data centres, as with any business, do not last long. ### IBM Debuts Enterprise Cloud Marketplace with Global Partner Ecosystem IBM Launches Single Online Destination for Cloud Innovation LAS VEGAS, 28 April 2014 -- IBM today launched a new Cloud marketplace that brings together IBM’s vast portfolio of cloud capabilities and new third-party services in a way that delivers a simple and easy experience for three key user groups within the enterprise – developers, IT managers and business leaders – to learn, try, and buy software and services from IBM and its global partner ecosystem. The launch of IBM Cloud marketplace represents the next major step in IBM's cloud leadership. This single online destination will serve as the digital front door to cloud innovation, bringing together IBM’s capabilities-as-a-service and those of partners and third-party vendors with the security and resiliency enterprises expect. 
The marketplace offers a gateway for IBM’s vast global business partner ecosystem to tap into the growing $250B cloud market opportunity, with instant access to IBM’s rich intellectual capital, array of services and software capabilities, and access to IBM’s vast enterprise client network. For partners, IBM's cloud marketplace provides a global path to the enterprise and new opportunities to collaborate with a network of channel partners in its ecosystem to generate new revenue streams driving cloud innovation. This marketplace will serve as a cloud innovation hub where technology meets business, with hundreds of cloud services from IBM, partners and the third-party ecosystem. Clients can access a full suite of IBM-as-a-Service with 100 SaaS applications, IBM's Bluemix platform-as-a-service with composable services, the powerful SoftLayer infrastructure-as-a-service and third-party cloud services. Today’s news is the next significant step in IBM’s continued march toward building the most comprehensive cloud portfolio for the enterprise. This year alone IBM announced: a $1.2 billion investment to expand its global cloud footprint with SoftLayer; $1 billion in cloud development with the launch of Bluemix, a Platform-as-a-Service bringing much of IBM's middleware to the cloud; $1 billion in the launch of a new business unit, the Watson Group, for cloud-delivered cognitive innovation; and the acquisition of Aspera, Cloudant and Silverpop, bringing the total to $7 billion invested in 17 cloud acquisitions since 2010. "Increasingly, cloud users from business, IT and development across the enterprise are looking for easy access to a wide range of services to address new business models and shifting market conditions," said Robert LeBlanc, Senior Vice President, IBM Software & Cloud Solutions. "IBM Cloud marketplace puts big data analytics, mobile, social, commerce, integration – the full power of IBM-as-a-Service and our ecosystem – at our clients' fingertips to help them quickly deliver innovative services to their constituents." Building a Best in Class Ecosystem of Cloud Service Providers IBM cloud marketplace offers an ideal environment for business partners and independent software vendors to monetize their solutions as cloud services for the enterprise. Several IBM partners, including SendGrid, Zend, Redis Labs, Sonian, Flow Search Corp, Deep DB, M2Mi and Ustream, have featured a wide array of cloud services on the IBM marketplace for enterprise clients. “IBM has brought together a full suite of enterprise-class cloud services and software and made these solutions simple to consume and integrate, whether you are an enterprise developer or forward-looking business exec,” said Andi Gutmans, Zend CEO and Co-founder. “We will support the rapid delivery of our applications through the IBM Cloud marketplace, enabling millions of web and mobile developers, and many popular PHP applications, to be consumed with enterprise services and service levels on the IBM Cloud.” "Most cloud marketplaces are tied to one specific product offering. If you don't use the particular service for which the marketplace was built - even if you're a customer of other products by the same company - that marketplace is irrelevant for you,” said Jim Franklin, CEO of SendGrid. “But the IBM cloud marketplace will be available to all IBM and non-IBM customers. Whether you're using BlueMix, or SoftLayer, or another IBM product, the IBM Marketplace will be there to serve you. 
As a vendor, being able to reach all IBM customers from one place is very exciting." How the IBM Cloud Marketplace Works Clients can conveniently discover, test and experience hundreds of IBM and partner enterprise-grade cloud services online that are open, scalable and secure. Content is dynamically served up by job role, and service pages offer easy, intuitive access for those interested in categories such as Start-ups, Mobile, Gaming and others. For example, an enterprise DevOps team may be looking for a better way to develop new technologies to meet the fast-changing needs of the business. Until now, they would have searched across the Web, looking at vendor sites to piece together solutions. Today, they can consolidate their evaluation, immediate trial and purchasing of both IBM and third-party applications in the IBM cloud marketplace, or contact IBM directly for a purchase. IBM's cloud marketplace has three key components: For Line of Business Professionals Busy business professionals from across the organization are looking for a fast, easy way to find user-friendly applications that are secure and scalable. IBM Cloud marketplace for business will serve as a single stop where business and IT professionals can learn about, deploy and consume over 100 SaaS applications spanning Marketing, Procurement, Sales & Commerce, Supply Chain, Customer Service, Finance, Legal and City Management. For Developers IBM's Cloud marketplace – Dev provides an integrated, get-started-now, cloud-based development environment where individual developers, development shops and enterprise development teams can quickly and effectively build enterprise applications via leading services and application programming interfaces (APIs). These applications can be easily and securely integrated in hybrid on-premise and off-premise cloud environments. It is uniquely built on an open environment so developers can choose any open source or third-party tools and integrate apps as needed. Building on IBM's $1 Billion investment in Bluemix, its Platform as a Service, today IBM also announced the expansion of Bluemix with 30 cloud services, bringing advanced big data and analytics, mobile, security and devops services to developers and bringing enterprise developers into the cloud. For IT Departments IT managers want infrastructure they can configure and control to select their own suite of services, hardware and software, as well as reliable and scalable technology that is easy to access and implement. Cloud marketplace – Ops provides a secure set of cloud services, built on SoftLayer, that help clients deploy cloud services supporting high-performance businesses at enterprise scale. SoftLayer gives clients the ability to choose a cloud environment and location that best suits their business needs and provides visibility and transparency into where data resides, plus control of data security and placement with a choice of public, private or bare-metal server options. Services include Big Data, Disaster Recovery, Hybrid environments, Managed Security Services and Cloud Environments for Small and Medium Business, among others. For instance, IT managers will be able to access two new IaaS offerings from IBM's Big Data and Analytics portfolio in the Cloud Marketplace. InfoSphere Streams will allow organizations to analyze and share data in motion for real-time decision management, and InfoSphere BigInsights will make it easier for developers to use Hadoop to build secure big data apps.
With the addition of these new solutions, IBM now offers more than 15 solutions from its Big Data and Analytics portfolio, Watson Foundations, for business users in the Cloud Marketplace. IBM's Enterprise Content capabilities will also be available to help knowledge workers actively engage and manage content in a trusted cloud environment. About IBM Cloud Computing IBM has helped more than 30,000 clients around the world with their cloud engagements, supported by 40,000 industry experts. IBM has 100+ cloud SaaS solutions, thousands of experts with deep industry knowledge helping clients transform, and a network of 40 data centers worldwide. IBM has invested more than $7 billion in 17 acquisitions to accelerate its cloud initiatives and build a high value cloud portfolio. IBM holds 1,560 cloud patents focused on driving innovation. In fact, IBM topped the annual list of US patent leaders for the 21st consecutive year. For more information about cloud offerings from IBM, visit http://www.ibm.com/cloud. Follow us on Twitter at @IBMcloud and on our blog at http://www.thoughtsoncloud.com. Join the conversation #ibmcloud. To learn more about today's news please read the Smarter Planet blog. ### IBM Announces New BlueMix Services to Help Developers Build Applications in the Cloud   More than 30 Services, including Data and Analytics-as-a-Service for Real-Time Insights, New BlueMix Garages for Developers LAS VEGAS, 28 April 2014 – IBM today announced that more than 30 additional cloud services are available in BlueMix, its Platform-as-a-Service (PaaS), to help developers quickly integrate applications and speed deployment of new cloud services. In addition, IBM is launching BlueMix Garages, collaborative locations where developers can create new apps on BlueMix, learn new development skills, and access IBM's developer ecosystem. With this news, IBM aims to accelerate the pace at which cloud computing will transform global industries. Built on open standards and leveraging Cloud Foundry, BlueMix has made major advancements since being announced in February, including: cloud integration services to enable a secure connection between an organization's public applications and private data; swift, secure connections, via the cloud, to Internet of Things and machine-to-machine devices to store, query and visualize data; data and analytics-as-a-service to allow the rapid design and scale of applications that turn Big Data into competitive intelligence; and DevOps services that support the complete developer lifecycle. IBM is leveraging BlueMix's foundation on SoftLayer for this expansion, combining the strength of IBM's middleware software with third-party and open technologies to create an integrated development experience in the cloud. Using an open cloud environment, BlueMix helps both born-on-the-web and enterprise developers build applications with their choice of tools. With the ability to easily combine a variety of services and APIs from across the industry to compose, test and scale custom applications, developers will be able to cut deployment time from months to minutes. "The adoption rate of BlueMix has been nothing short of phenomenal since being announced only a few short weeks ago," said Steve Robinson, General Manager of IBM Cloud Platform Services.
"Rapidly growing numbers of beta participants are embracing our model of extending their existing assets and services into a cloud-based, open-source development platform, allowing our clients to bridge between the tools they are planning for the future and the workloads and services they use today to get them to market faster." In the first eight weeks of BlueMix's open beta launch, IBM has seen rapid adoption of the open platform with key clients and partners such as GameStop, Pitney Bowes, Continental Automotive and start-ups including EyeQ. "BlueMix has helped us grow and scale our resources at a rapid pace, assisting us in deploying a cost-effective solution to our clients that helps us get to market faster," said Michael Garel, CEO and Founder of EyeQ. "With BlueMix, we are able to reduce the amount of time spent on monthly server maintenance by 85 percent, and turn our attention back to greater innovation." BlueMix Helps Build and Integrate Apps Quickly IBM's new BlueMix services are designed to help businesses rapidly transform using Big Data, mobile and social technologies in the cloud. Some of the new BlueMix services include: Cloud Integration services to securely connect and integrate an organization's applications and information in the cloud. Developers can use pre-defined connectors for accelerated integration, or develop custom APIs as needed to easily and securely tie back into systems of record behind their firewall. Integrated API management capabilities provide an easy mechanism to publish self-service APIs that can be shared with the broader API economy. This allows developers to mix cloud-based PaaS, third-party cloud applications, and on-premises systems behind security gateways, moving between cloud and on-premises systems in a hybrid, integrated environment. Internet of Things services allowing developers to register and connect networked devices such as microprocessors and embedded machine-to-machine sensors to the cloud, easily aggregating and reacting to data and events in real time. Organizations can build applications which efficiently manage, analyze, visualize, and interact with the massive quantities of temporal and spatial data generated by vehicles, wearables, mobile phones, cameras, computers, sensors and other intelligent devices. Data and Analytics services for developers to deliver data-centric mobile, web-scale applications. With these new services, including geospatial, time series, predictive scoring, and reporting, developers can easily create sophisticated applications that provide real-time actionable insight so that organizations can predict outcomes and make better business decisions. For example, a developer could create an application that integrates sensor data, location data, weather data and usage trends from a network of equipment to identify and avoid emerging maintenance issues. In addition, new data masking, discovery and audit capabilities help developers create applications with built-in data privacy and security. DevOps services providing developers, IT departments and business teams with an open, integrated rapid development environment that scales from individual developers to enterprise teams.
The DevOps Continuous Integration service will provide end-to-end "build" capabilities to speed changes through the development process, DevOps Mobile Quality Assurance (MQA) will help analyze user sentiment to spot problems before they go viral, and the Monitoring and Analytics service will identify application problems during development - leveraging analytics to help applications achieve availability and performance goals. In addition, DevOps will include a new RapidApp service that requires no coding, using visual tools to expand the scope of web and business applications developers can create. BlueMix Garage Speeds Development of Cloud Applications IBM also announced the launch of BlueMix Garages – physical, collaborative spaces for developers, product managers and designers to collaborate with IBM experts to rapidly innovate and deliver new cloud applications to be deployed onto BlueMix. The first IBM BlueMix Garage will be located in San Francisco's South of Market neighborhood – home to more startups per square foot than anywhere in the world. The Garage will sit in Galvanize, a startup hub which will provide a shared home to approximately 200 San Francisco startups by the end of 2014. Working side-by-side with IBM consultants, partners and entrepreneurs, Garage users will transform application development with modern cloud technologies and highly disciplined agile processes. Underpinned by a strong ecosystem, IBM aims to accelerate the pace at which cloud computing will transform global industries. Currently, IBM's education, mentoring and innovation initiatives reach more than four million software developers worldwide, and more than 1,000 startups across retail, healthcare, finance, agriculture, automotive and more. The continued growth of the BlueMix community, as well as launches of additional BlueMix Garages, will help provide this network with the skills, innovation and tools needed to leverage the power of cloud to revolutionize the use of mobile, big data and more. About IBM Cloud Computing IBM is the global leader in cloud with an unmatched portfolio of open cloud solutions that help clients think about, build on or tap into the cloud. No other company has the ability to bring together unique industry knowledge and unmatched cloud capabilities that have already helped more than 30,000 clients around the world, with 40,000 industry experts. Today, IBM has 110 cloud SaaS solutions, thousands of experts with deep industry knowledge helping clients transform, and a network of 40 data centers worldwide. Since 2007, IBM has invested more than $7 billion in 17 acquisitions to accelerate its cloud initiatives and build a high value cloud portfolio. IBM holds 1,560 cloud patents focused on driving innovation. In fact, IBM topped the annual list of US patent leaders for the 21st consecutive year. IBM processes more than 5.5M client transactions daily through its public cloud. IBM expects to achieve $7 billion in annual cloud revenue by 2015. For more information about cloud offerings from IBM, visit http://www.ibm.com/cloud. Follow us on Twitter at @IBMcloud and on our blog at http://www.thoughtsoncloud.com. Join the conversation #ibmcloud. For more information on IBM Impact, visit https://www-304.ibm.com/connections/blogs/aim/?lang=en_us. Unless stated otherwise above: IBM United Kingdom Limited - Registered in England and Wales with number 741598.
Registered office: PO Box 41, North Harbour, Portsmouth, Hampshire PO6 3AU ### Intermedia appoints new UK sales director Aidan Simister chosen to drive Intermedia’s sales growth Intermedia, the world’s largest one-stop shop for cloud IT services and business applications, including hosted Microsoft Exchange, archiving and backup, collaboration and file management, has appointed Aidan Simister as its new UK director of sales following the company’s recently announced growth strategy and the acquisition of SaaSID. Ed Macnair, managing director of Intermedia EMEA, says, “We are excited to have Aidan join us. He has extensive experience in building successful sales teams and consistently exceeding growth targets. He brings with him a wealth of knowledge and has well established relationships within the channel. This is an incredibly exciting time for Intermedia and Aidan will be a vital asset to the business as we continue to expand our footprint in the UK.” “Intermedia has a tremendous opportunity for further growth in the UK and I’m excited to be part of it. Today, companies have to use cloud technology to be competitive in whatever sector they trade in. Some businesses have been hesitant to embrace cloud services due to fear of lack of control, management, security and in some instances compliance. Intermedia is perfectly positioned to help businesses overcome those concerns and fully embrace the benefits of the cloud without losing control,” says Aidan Simister, UK director of sales. Aidan will be growing Intermedia’s cloud services revenue through the UK channel by developing new relationships with partners and customers as well as strengthening existing ones. He has more than 17 years’ experience in the IT industry, with in-depth knowledge ranging from taking new technologies to market to driving new business growth through both direct and indirect channels. In this role, he will be looking to recruit VARs, SIs and MSPs to increase market adoption of Intermedia services in the UK. Aidan has previously held sales and business development management positions at organisations including Netwrix, Access Information Security and Sophos. Partners who are interested can find out more about Intermedia’s partner programme at www.intermedia.co.uk/partners or contact Intermedia on +44 (0)20 3384 2158. ### MSP vs Traditional IT Channel MSP vs Traditional IT Channel - who will win? The buzzword of the moment among every IT vendor and major cloud provider is ‘MSP’ or Managed Service Provider. During briefings we constantly hear that “the MSP is important”, “we must get to the MSPs” - and the marketing departments must be going insane when they hear that dreaded term. The term MSP is nothing new and has been around since the early days of the Internet.  Commonly an MSP is defined as a service-led organisation that manages either on-premise or off-premise IT services and solutions for an end-user client.  So why is the MSP heralded as the path to cloud enlightenment?[pullquote]So why is the MSP heralded as the path to cloud enlightenment?[/pullquote] MSPs hold end-user local relationships that every major vendor wishes to leverage. The ‘management’ element of an MSPs business model is seen as key to unlocking cloud services and applications. MSPs are considered to be ‘mature’ in terms of business model and service consumption / subscription based billing. It’s the latest marketing buzzword (in the supply chain at least) - similar to Big Data and Cloud Computing. 
Let's all smell the coffee, wake up for a minute and summarise this situation: every MSP is currently in a position where every vendor is courting them, IT distributors are romancing them, and suddenly they have become what a pair of Prada shoes are to my wife (essential, apparently?). Q. So where does that leave the traditional "sell-to" IT supplier / Value-add Reseller (VAR)? A. In a much stronger position than the MSP! What?! I hear you cry... Well, sorry IT industry, I am sure you're all now shaking your fists in anger at the above statement in strong disagreement (and sorry marketing people for disrupting your pretty meaningless PowerPoint charts). But allow me to qualify why I am making such an outlandish statement: many MSPs are in a situation whereby they have invested in large volumes of hosted infrastructure and colocation, trying to compete with Amazon (AWS) on cost and retention. Admittedly, there are exceptions to this, generally based on the MSP attaining a high level of technical competency and a vertically aligned focus. The traditional VAR/Reseller/Channel non-cloud MSP (have you taken off the vendor-labelled hair shirts and looked to the cloud for salvation yet?) is in a much stronger position. The reason for this is simple: "transactional" sales and less of a desire to "retain" control of every aspect of the IT solution have their advantages for both vendors and customers. The traditional IT reseller also has the benefit of hindsight and vendor choice. Let's all face facts: cloud computing in whatever form it takes - whether Software, Infrastructure or Platform computing - is here and has 'crossed the chasm'. The question to those MSPs that have made a significant CapEx outlay on hardware is: how do you reconcile your model moving forward? For an example of where the cloud is bypassing the IT channel, go to the admin panel on your Microsoft SQL Server and look at the backup options; note where it says "Backup to cloud" ...hmmm. And of course that cloud is Microsoft's Azure. Many major vendor software suites are incorporating direct-touch end-user strategies where cloud is concerned; could there be an option one day for a button on VMware instances offering a direct "Send to VCHS Cloud" option… probably, and soon. The million-pound question is: do you embrace, differentiate or compete? In today's IT landscape, you cannot compete in terms of marketing budgets, system developers or sales resources. But then the larger an organisation is, the harder it can fall, especially in this transaction-orientated world of cloud. So, for both the IT reseller and the MSP, you hold the key in the most crucial area of all: CUSTOMERS. And how do you ensure your customers don't pass you by? Simple! Talk to them, and be the eyes and ears of the latest developments ...and above all get investigating an API brokerage solution now. We all lament the loss of local shops to large faceless supermarket chains that offer lower costs, benchmark against each other and capture all your spending habits on loyalty cards (yawn). Personally, I much preferred a local shop with personality, not some industrialised horse meat purveyor.
The point I am making above is that a faceless control panel with lower costs will never replace that personal service and handshake over a cup of coffee. And in terms of support, no vendor in this world is capable of providing mass levels of IT support and personal service in the mid-market – make no mistake about that. Q. Where does the IT VAR or the MSP fit in then? A. Simple: "Hybrid, Hybrid, Hybrid, Hybrid".[pullquote]...how do you ensure your customers don't pass you by?  Talk to them, and be the eyes and ears of the latest developments ...and above all get investigating an API brokerage solution now.[/pullquote] Customers don't want "server sprawl", "shadow IT", or undocumented and uncontrolled services. Control the use of the cloud using plugins or APIs; bring the billing together under one roof; and above all negotiate with your vendors or distributors to get credit easing and leverage what is available in terms of resources. Q. How can you ensure you're on the right strategy? A. Speak to the consultants at Compare the Cloud, and we'll help you get on the straight and narrow (path to success)! @comparethecloud or ask@comparethecloud.net. ### Lunacloud expands business to Russia Lisbon and Moscow, April 23rd, 2014 – Lunacloud is pleased to announce the opening of its operations in Moscow, Russia. The launch of its operation in Russia will allow Lunacloud to continue the implementation of its strategy of international expansion. Lunacloud is a provider that strives to exceed customer expectations in delivering cloud infrastructure and platform services (IaaS and PaaS) by being technologically advanced and mindfully designed, delivering powerful, flexible and yet very simple cloud compute & storage services at the best value. Lunacloud was launched in the UK and Portugal in June 2012 and expanded to France and Spain in 2013. Russia is an important market for Lunacloud because it is one of the fastest growing economies in the world and at the same time faces less competition from other cloud providers than other major markets. With this opening, Lunacloud aims to reduce cloud adoption barriers for small and medium businesses in Russia, and to extend its offer to the entire Eastern Europe region. Lunacloud provides its website and control panel, as well as telephone support, in Russian, plus pricing and billing in Rubles and popular local payment methods – options that other global cloud players do not offer. Lunacloud plans to open a Datacenter in Moscow at the end of 2014, which will expand the geographical reach of its cloud infrastructure and will allow customers to choose whether they want to locate their data inside Russia (EU East datacenter) or in any other of the Lunacloud datacenters (EU West or EU Central).[pullquote]Lunacloud plans to open a Datacenter in Moscow at the end of 2014[/pullquote] "Opening in Russia is a major step in implementing Lunacloud's strategy of being a global cloud infrastructure and platform provider with a strong focus on local operations in different regions", said António Ferreira, Lunacloud's CEO. "Starting business in Russia is a result of our confidence and passion to provide high quality and innovative cloud services, to meet the needs of all Russian speaking customers," said Mikhail Mikhailov, Head of Lunacloud Russia. Lunacloud in Russia can be reached at Kozhevnicheskaya St.
14, - 5  Moscow or by email at info@lunacloud.ru --- Lunacloud differentiators Lunacloud's cloud portfolio includes Cloud Servers, on which to run operating systems such as Linux and Windows, as well as applications; Cloud Storage, for S3-compatible data object storage; and Cloud Jelastic, a Java and PHP (Ruby, soon) platform that allows developers to deploy their code in a quick, safe and effective way, and host their applications in an auto-scalable environment. Competitive pricing: 10 to 30% lower than major competitors. No commitments or long-term contracts. Each customer only pays for the resources it uses. Extreme Flexibility: usually cloud providers only offer from 10 to 20 different pre-packaged server configurations that do not address the specific needs of their customers. Lunacloud's offer is extremely flexible since it allows any RAM, CPU and disk size combination (respectively, from 512 MB to 96 GB, 1 to 8 processors, and 10 GB to 2 TB), a total of 307,200 different combinations! And each customer may upgrade or downgrade server resources at any time. Self-service: each customer self-manages its services, using a Web Control Panel or the API provided by Lunacloud. Resizing servers (scaling up or down) is also possible without the need to power off or reboot the server, unlike the competition. Proximity: indispensable to SMBs, thanks to local support in Russian. ### Has IaaS commoditisation triumphed? With news that cloud computing units will be traded on a commodities exchange, Compare the Cloud considers what the impact will be. The Chicago Mercantile Exchange (CME) announced on Monday that it is moving ahead with plans to create a "spot exchange" for Infrastructure-as-a-Service (IaaS). The building blocks of cloud computing – storage and processing resources – will be traded on an exchange in the same fashion as other commodities such as metals, agricultural products and energy units. Wikipedia describes a commodity as "...a class of goods for which there is demand, but which is supplied without qualitative differentiation across a market." So have we reached that point? Are the qualitative differences between IaaS offerings now so minor that a comparatively simple ranking system can make IaaS units a tradable entity? How providers of IaaS services can differentiate themselves has been a subject of hot discussion for a while now, and people tend to fall into one of two camps: those that believe IaaS is a commodity and consequently requires all-but-the-largest providers to federate in order to survive / compete / succeed, and those that believe IaaS has plenty of opportunities for differentiation, and that providers should set out in pursuit of that. This news certainly endorses the views of the commodity camp. But is it that simple - and if so, what are the ramifications for IaaS providers both large and small? The company reducing IaaS to tradable units is called 6Fusion. 6Fusion have developed a normalization methodology which they call the Workload Allocation Cube (WAC), which "yields a single representative unit value of actual consumption" from six metrics, as shown here: 6Fusion's Workload Allocation Cube (WAC) Composition. Whilst certainly a lot of thought will have gone into this, it's also clear that there is quite a lot at stake in terms of this system producing accurate and desirable outcomes.
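To make the normalisation idea concrete, here is a minimal sketch of how a WAC-style unit might be calculated. The six metric names, their units and the weights are purely illustrative assumptions for the sake of the example; they are not 6Fusion's published formula.

```python
# Illustrative sketch only: collapse six utilisation metrics into a single
# "consumption unit" so that differently shaped IaaS offerings can be compared
# on price per unit. Metric names and weights are assumptions, not 6Fusion's.

sample = {
    "cpu_ghz_hours":   4.0,    # CPU consumed, in GHz-hours (hypothetical)
    "ram_gb_hours":    16.0,   # memory consumed, in GB-hours (hypothetical)
    "storage_gb":      250.0,  # allocated disk, in GB (hypothetical)
    "disk_io_gb":      5.0,    # disk I/O transferred, in GB (hypothetical)
    "lan_transfer_gb": 2.0,    # internal network transfer, in GB (hypothetical)
    "wan_transfer_gb": 0.5,    # external network transfer, in GB (hypothetical)
}

weights = {
    "cpu_ghz_hours":   1.0,
    "ram_gb_hours":    0.5,
    "storage_gb":      0.01,
    "disk_io_gb":      0.2,
    "lan_transfer_gb": 0.1,
    "wan_transfer_gb": 0.3,
}

def consumption_units(sample: dict, weights: dict) -> float:
    """Weighted sum of the metrics: one representative unit value."""
    return sum(weights[metric] * value for metric, value in sample.items())

print(f"{consumption_units(sample, weights):.2f} units consumed this hour")
```

Whatever the real weightings are, the effect is the same: buyers compare providers on price per unit rather than on raw specifications, which is exactly the simplification that makes a commodity exchange conceivable.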
Can the future of IaaS provision and consumption be pinned on these calculations, or is it over-simplified? Many believe that those in financial circles have a tendency to forget that at the end of every resource traded is a consumer! So will IaaS consumers go along with this? Let’s also spare a thought for the providers.  Sure, when this CME ‘service’ goes live it is likely to include tradable units from the largest IaaS providers – Amazon, Microsoft, Google (i.e. those that are currently having a big price punch up), but where does this leave everyone else? If you’re an IaaS provider, how will your carefully nurtured computing facility look once it’s been put through the WAC mincer?  Will your price to performance ratios (and accessible scale) allow you to stand up and compete on such a trading platform?  If not, then is the only way to get a slice of the action through federation? [pullquote]Providers: Will your price to performance ratios (and accessible scale) allow you to stand up and compete on such a trading platform?[/pullquote]Advocates of differentiation on the other hand will encourage IaaS providers to identify what makes them special – whether that might be customer service, location, performance or value-added service, and use that to capture new business.  You’ll notice that only one of those features in the WAC, so we’ll have to assume that the CME’s view of what makes up IaaS doesn’t include or depend on the other three things.  Can this view be justified? Perhaps IaaS location is a relatively niche requirement so falls outside of the scope, and perhaps the value-add services built on top of IaaS are now actually PaaS, so also fall outside the remit.  Is customer service surplus to requirements for those presumed “mega consumers” of IaaS that will access compute resources via such an exchange? [pullquote]Is customer service surplus to requirements for those presumed “mega consumers” of IaaS that will access compute resources via such an exchange?[/pullquote]Maybe that’s the key here. Certainly some buyers of IaaS just want maximum bang for their buck, so it’s these businesses the CME believes stand to benefit most from this service.  But are there enough buyers that fit this profile, and will they produce a sufficient volume of activity? As one final thought, Amazon challengers – Microsoft, Google & co – those that do have enough weight to compete to the death on price, must be licking their lips at the thought of an exchange. For them it may well be a big opportunity to take a bite out of Amazon’s market dominance as it levels the playing field. Of course that also depends on the exchange attracting the kind of scale and activity that is hoped for, and how well geared up for the fight they really are. ### Disaster recovery: Should we, or shouldn't we? Russel Ridgley of Pulsant looks at the not insignificant matter of DR and what options you have for protecting your data and to what extent. Disaster recovery, as the name suggests, forms a crucial part of any organisation’s data protection and business continuity strategy. Or, rather it should. According to research conducted by Forrester, disaster recovery (DR) and business continuity formed 5.8% of the overall IT budget for organisations in 2013, with 66% of respondents rating it as a high or critical business priority. 
While no-one seems to be disputing the importance of having this type of strategy in place, the practicalities of implementation can often be challenging – from budgetary issues and buy-in from CIOs to the type of solution itself. DR means different things to different people – from time, in terms of minutes or weeks, to the extent of what it covers, whether critical systems or everything. The requirements of a DR solution can also be challenging as they are hard to define, sometimes spanning hundreds of pages of documentation. Once DR plans are implemented, their performance can be difficult to measure because organisations sometimes cannot test and monitor them without experiencing a system disruption, and certainly the testing of complex DR plans isn't without risk in itself. In addition, the impact of introducing DR strategies can be underestimated – DR affects all aspects of an organisation. It is not just a recovery system but also includes aspects such as all of the connectivity surrounding it, the switch-over of users, and dealing with the data within it. Given the importance of DR, how then do you ensure its successful implementation and use? DR can be done in-house or outsourced, and regardless of which method is chosen there are a few things that should be taken into account. Any DR project needs to take into account the entire IT estate. It often ends up as a cost versus compromise balance. It's important to weigh this balance up fully, as there are risks in investing in recovering a system where a non-resilient component elsewhere may introduce more risk than the loss of that single system – and could potentially be mitigated with a lesser investment. Part of the consideration is how much data your business could tolerate losing, though obviously the thought of data loss is terrifying in itself. Most DR strategies require that your data be in two places at once; data replication can either be synchronous (writing to both ends at once, so data is guaranteed to be up to date) or asynchronous, where the data is periodically copied. Both have pros and cons: the downside of synchronous replication is that the entire system is reliant on the connectivity between sites, while choosing asynchronous replication may actually cause data loss when switching to your DR system (a simplified sketch contrasting the two approaches follows at the end of this article). It's worth considering that the disaster could be a massive data corruption. Data replication will also replicate data corruption, and restoring from backups is normally a slow, time-consuming task in which a significant amount of data is likely to be lost due to the normally long backup intervals (mostly every 24 hours). Asynchronous or journal-based replication gives some options for preventing your DR site being affected, but these can add risks of their own and certainly cost in the case of the latter. Cost is always a core consideration, and effective DR strategies certainly come at a price and remain a concern for CIOs and decision-makers. However, if your business includes a critical system that would bring your organisation to a standstill if it failed, the real question is perhaps: can you afford not to have DR? You can find information on Pulsant's Cloud on-demand Backup Services here.
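As promised above, here is a minimal sketch contrasting the two replication styles. The site and replicator classes are hypothetical stand-ins rather than any particular vendor's product, but they show why synchronous replication depends on the inter-site link and why asynchronous replication leaves a window of potential data loss.

```python
import time

class Site:
    """Hypothetical storage site: just a named key-value store."""
    def __init__(self, name):
        self.name = name
        self.data = {}

    def write(self, key, value):
        self.data[key] = value

def synchronous_write(primary: Site, recovery: Site, key, value):
    """Write to both sites before acknowledging: nothing is lost on failover,
    but every write depends on the link to the recovery site being up."""
    primary.write(key, value)
    recovery.write(key, value)   # if this fails, the write is not acknowledged
    return "acknowledged"

class AsyncReplicator:
    """Copy the primary's data across periodically: writes stay fast and local,
    but anything written since the last copy is lost if the primary fails."""
    def __init__(self, primary: Site, recovery: Site, interval_s=3600):
        self.primary, self.recovery = primary, recovery
        self.interval_s = interval_s
        self.last_sync = time.time()

    def maybe_replicate(self):
        if time.time() - self.last_sync >= self.interval_s:
            self.recovery.data = dict(self.primary.data)  # periodic copy
            self.last_sync = time.time()

    def at_risk(self):
        """Keys written on the primary but not yet copied to the recovery site."""
        return set(self.primary.data) - set(self.recovery.data)
```

Note that both styles copy corruption just as faithfully as good data, which is why the article treats journal-based replication and backups as a separate line of defence rather than an alternative to DR planning.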
### 60% of UK SMEs taking advantage of outsourced IT infrastructure, according to Node4 UK SMEs increasingly confident in adoption of latest IT technology - enabling faster response to employee demands and business growth Derby, UK, 10th April 2014 – New research has shown that 60% of UK SMEs have partly or fully outsourced their IT infrastructure; with over 31,000 SMEs in the UK, this means that there are now more than 18,600 businesses that have moved some part of their IT provision off premises. The findings, detailed in a new report from data centre and communications specialist Node4, also highlight that 1 in 10 SMEs have already deployed a fully cloud-based IT infrastructure. This means that over 300,000 employees in the UK are already experiencing the full flexibility and efficiency benefits of cloud-based IT solutions. The new report, 'Facing up to the IT infrastructure challenge', shows how UK SMEs are increasingly confident in their ability to adopt the latest technologies and to take advantage of advanced flexible IT services to deliver real business growth. The report analyses the attitudes and views of IT decision makers at UK SMEs and provides expert advice from Node4’s experienced team on how SMEs can continue to take full advantage of the latest IT technology. Increasingly companies see technology and IT as a business enabler, something that their organisations can’t do without. But along with the growing importance of technology has come the almost overwhelming array of choices when it comes to IT infrastructure provision; cloud computing, IP-based technologies and virtualisation are just a few options available today. Paul Bryce, Business Development Director, Node4, commented: "Many studies continue to claim SMEs are struggling to adapt to this new landscape and are at risk of being left behind. However, we firmly believe there is much greater recognition of the value that IT can deliver to businesses amongst SMEs today - and the apprehension that surrounded outsourcing has largely dissipated. IT has gone from being a static cost-centre to a dynamic business enabler that must support the organisation at every stage of the sales cycle, empowering the business to grow and succeed. Our research and analysis has borne this feeling out and our report demonstrates how savvy SMEs are able to proactively embrace the latest technologies and meet and exceed the demands of the business and employees." Drawing on a detailed survey of 250 IT strategy business decision makers in businesses of between 50-500 employees, Node4's new report combines analysis and recommendations for SMEs to deploy IT solutions that help them to win more business, beat the competition and prosper in the growing economy. ‘Facing up to the IT infrastructure challenge’ is available to download here. Bryce added: "SMEs are not ‘making do’ with out of date infrastructure. In fact in our experience most organisations have recognised the need to evolve their technology infrastructure to meet the needs of its users and succeed in today’s competitive environment. Many businesses are worried about this transition but they realise that if they don’t keep pace, they will fall behind and miss the opportunity to capitalise on the renewed economic growth. Our report today will hopefully help more SMEs recognise where they can do more and how partners like Node4 can help them achieve the success they want."   
About Node4 Node4 is a data centre and communications specialist with solutions and expertise covering cloud, colocation, managed hosting through cloud and virtualised environments, connectivity, SIP trunking and hosted telephony. Node4 works in close partnership with its customers to provide bespoke IT solutions customised to individual business needs, delivering its services via four UK-based data centre facilities that offer best-of-breed infrastructure and the latest in security technology. For more information please visit www.node4.co.uk.   Media contact Ben Smith / Joe McNamara EML Wildfire Technology PR Node4@emlwildfire.com +44 208 408 8000 ### SMEs are not interested in cloud technology! After drinking what must be thousands of lattes, meeting a similar quantity of end user 'consumers' (potential and actual), and then hundreds of cloud computing vendors, I want to share some of my thoughts with you and I hope these do not come across too much as a soapbox rant. "Cloud" is a word that’s so over-used, over-sold and yet underestimated.  Those of us within the cloud industry seem to assume that the end consumer of this technology actually understands what we are talking about. We talk about what the greatest trends in cloud are and 'prophesise' on where they are going to end up, without - to coin a phrase - eating our own dog food.[pullquote]Those of us within the cloud industry seem to assume that the end consumer of this technology actually understands what we are talking about. [/pullquote] We can talk about IaaS, PaaS, SaaS and who is doing what, we can debate whether Openstack will start to be adopted in the masses, we can even pontificate over what strategy should be employed in order to gain market share in the hybrid cloud race. But what does this all mean to the end consumer of these services? At the tech-driven vendor side of the business, we like to talk about technology. We like to see the latest advancements in cloud tech and see the value of them. We are at Cloud version 3.0 (I just made this version up by the way) and our clients' understanding is still at Cloud version 1.0. See the problem? ...And in the SME market space the gap is only getting wider. So, let’s look in depth at the SME market place (Small and Medium size enterprises). Below are some statistics from the Federation of Small Businesses (updated as of Oct 2013). At the start of 2013:[pullquote]We are at Cloud version 3.0... and our clients' understanding is still at Cloud version 1.0. See the problem? [/pullquote] There were an estimated 4.9 million businesses in the UK which employed 24.3 million people, and had a combined turnover of £3,300 billion SMEs accounted for 99.9 per cent of all private sector businesses in the UK, 59.3 per cent of private sector employment and 48.1 per cent of private sector turnover SMEs employed 14.4 million people and had a combined turnover of £1,600 billion Small businesses alone accounted for 47 per cent of private sector employment and 33.1 per cent of turnover There were 891,000 businesses operating in the construction sector - nearly a fifth of all businesses In the financial and insurance sector, only 27.5 per cent of employment was in SMEs. 
However, in the agriculture, forestry and fishing sector virtually all employment (95.4 per cent) was in SMEs. Only 22.5 per cent of turnover in the arts, entertainment and recreation activities was in SMEs, while 92.7 per cent was in the agriculture, forestry and fishing sector. With 841,000 private sector businesses, London had more firms than any other region in the UK. The south east had the second largest number of businesses with 791,000. Together these regions account for almost a third of all firms. (Micro: 0-9 employees; Small: 10-49 employees; Medium: 50-249 employees.) So the SME marketplace accounts for £1,600 billion of revenue in the UK (what a market!). The SME understanding of Cloud. [pullquote]...it will be skewed in favour of whoever has the largest marketing campaigns running at the time, or is the cool company of the moment.[/pullquote]So what do you think is the typical SME's level of understanding of Cloud technologies? Well, I can tell you that it certainly won't be the same as yours and mine! It will be based on word of mouth from colleagues, it will be what they have read on and offline - and to be honest, it will be skewed in favour of whoever has the largest marketing campaigns running at the time, or is the cool company of the moment. When one of us (cloud vendors) gets asked a direct question about cloud by an SME, we tend to over-complicate and miscommunicate the reply, which only serves to leave them in a confused state (yes we do, no argument!). Does this slow the adoption of cloud-based services? You bet - and in a big way! If you - as a vendor - want to win your prospective clients over, you need to educate them from their understanding of cloud version 1.0 to version 3.0 and the reasons why it matters to them. Don't let them be won over by a smooth-talking sales campaign that promises everything and under-delivers once they have agreed a contract, as this is very rife in the Cloud industry today. [pullquote]If you... want to win your prospective clients over, you need to educate them from their understanding of cloud version 1.0 to version 3.0 and the reasons why it matters to them.[/pullquote]In my opinion, there are no more product-based sales approaches within Cloud... there simply cannot be. With 'cloud' meaning so many things to so many different businesses, it will be over-sold and under-delivered. Hence we are now talking about SLAs, termination clauses and everything that needs to be considered - in case it all goes wrong - when being approached by a cloud technology vendor or service provider, prior to signing up (which is not a bad thing to do, by the way). As cloud technologists / evangelists, we have a duty to our clients... a duty that requires us to understand why anyone would buy our products, and to have the integrity to say when they are not suitable for purpose. If we work in partnership and educate our clients along the journey to a successful cloud adoption, then we will retain them for the future and trust will be formed. The cloud marketplace is a relatively young industry and there are many choices a consumer can make, so let's help ensure they make the right one. For the end consumer Cloud technologies are changing business in every industry at an alarming rate. New applications and services born from the cloud are appearing daily and major technology upheaval is becoming nearly constant. How do you keep up? How do you know what to change, where should you outsource your technology and why?
There are so many questions an SME could ask, and they are right to ask them, but let's examine one common question in particular as it is quite revealing... Q. Must I have a Cloud provider that has flexible contract lengths, as the technology changes so rapidly? A. Yes and No... Yes - there are commodity cloud services (e.g. cloud data backup, email services) that have been commoditised and are based around simple application services that have been employed for years. Data is data, and once a cloud-based system has your data, it can be transported quite literally anywhere and at any time. So in this instance you could say 'yes', a short-term, flexible contract/agreement could be sensible. No – if your business depends on specific knowledge or understanding of the technology you use, or if your business relies on specific/bespoke applications that require more than a templated approach to support – then no. Why would you change this for a short-term agreement with a firm that may tell you they understand your specific requirements but in reality they don't?[pullquote]Cloud technology will not fit all, so a hybrid approach is frequently needed[/pullquote] A point to note for business consumers of cloud services – a short-term contract/agreement isn't predictable revenue, and that is the same for any business. If you had monthly contracts with all of your clients, would that make you less comfortable with growing your business, or indeed would your clients want monthly agreements? What I am alluding to is an analogy that both the SME business and the cloud vendor can understand. Cloud technology will not fit all, so a hybrid approach is frequently needed. Just as the technologists out there will understand that hybrid clouds are coming of age with the choice of infrastructure and location, business consumers know that hybrid cloud support is also needed to help them through the transition and beyond. And let's face it, they are quite literally running blind, in so much as all of their technology and systems are now being supported and run by someone else. [pullquote]This may seem like an anti-cloud based rant but it's actually quite the opposite...[/pullquote]There can never be any replacement for local knowledge and understanding of any given business, so why should IT be any different? For the majority of IT services I totally agree with moving to cloud-based computing, without any doubt, but if we do not address this adoption challenge now we will be in a terrible state of affairs in the very near future. We will all be on hold to call centres, being read scripted questions and ultimately being asked whether we have turned it off and on again, whilst all we want to do is talk to someone who understands why we called in the first place. This may seem like an anti-cloud based rant but it's actually quite the opposite. Cloud technology will only ever get faster, cheaper and more flexible, until there is no other choice but to use it. However, we need to keep an eye on the actual consumers that are buying this technology and the reasons why - we must not lose sight of this. Do you concur? ### VMware: Memory extends cloud performance [quote_box_center]Case Study: Not all clouds are created equal and the underlying hardware can dramatically affect the performance output of virtualized machines.
Here we hear from VMware, who selected Kingston memory modules to enable more scale and performance from their service-provider-scale cloud.[/quote_box_center] "The competitive pricing and reliability of Kingston's server memory allowed us to scale the capacity of our cloud environment while reducing the number of physical servers. As a result, we are better able to deliver product integration and operational workflow feedback to VMware R&D," says Adam Zimman, Senior Director of Integration Engineering, VMware R&D. The Business Challenge VMware's Integration Engineering (IE) Team provides transparent feedback on the implementation and operation of VMware products and technologies as "Customer [0]" running within an infrastructure-as-a-service cloud. As part of this mandate, the IE Team built out a Service-Provider-scale cloud. The majority of servers in the initial implementation were configured with a 48 GB memory footprint. However, based on the utilization rate of the equipment for 2011 activities, the Team determined that doubling the memory footprint of the 48 GB servers would be the best way to accommodate an increased workload projected for 2012. Solution To increase the memory footprint of 288 servers from 48 GB to 96 GB, VMware specialists installed 1,728 Kingston 8GB Low Voltage Modules. This brought all 416 servers to 96 GB across the board. These servers utilised a petabyte of usable storage along with a total memory footprint of 39.96 TB. Business Challenge VMware, Inc. (NYSE: VMW) is the global leader in virtualisation and cloud infrastructure. Additionally, the company's product portfolio includes solutions in downtime management, system recovery, resource provisioning and security. The Palo Alto, California-based company is a major player in its space, posting 2011 revenues of $3.77 billion. Today, over 11,000 VMware employees serve more than 350,000 customers and 50,000 partners. VMware's Integration Engineering (IE) Team was formed to capture, validate and deliver transparent feedback to the company's R&D division. As VMware's "Customer [0]", the team generates feedback through the operation of a Service-Provider-scale cloud. Workloads running in the IE cloud include on-site betas for new products, VMware Hands-On Labs (HOLs), and various special projects as needed. In order to support the application workloads associated with these directives, the IE Team built out a Service-Provider-scale cloud across three data centre locations. 2012 planning drives infrastructure upgrade Initially, the facilities housed blade servers provisioned with 48 GB of memory each, which was sufficient to support the Team's workloads. However, the projected workloads for 2012 were great enough to justify infrastructure improvements. That led to an analysis of how to most effectively increase the cloud's workload capacity on a limited budget. Equipment utilisation reports showed that server memory constrained the Team's ability to take on more application workloads. To overcome this obstacle, the decision was made to double the memory footprint of 288 servers from 48 GB to 96 GB. This approach had the added advantage of better utilising server processor power and network storage. Technology Solution Of the cloud environment's 416 servers, 288 required a memory upgrade. However, even with discounted pricing, the OEM manufacturer's pricing was so high that only a percentage of the 288 servers could be outfitted with 96 GB of memory.
That motivated Adam Zimman, the Senior Director of Integration Engineering for VMware R&D, to explore alternatives, including memory from Kingston Technology Company. "We chose Kingston for two reasons," recalls Zimman. "First, their pricing was significantly more competitive – so much so that I was able to upgrade all 288 blade servers within the budget I was allotted. Secondly, based on my 20 years of building systems, I have always trusted Kingston as a reliable manufacturer of top-quality parts." After procuring 1,728 Kingston KCS-B200ALV/8G 8 GB DDR3 SDRAM Modules, VMware technicians installed them into the blade servers. Business Results The Kingston solution resulted in a number of benefits for VMware's environment. Able to take on more workloads with confidence The Kingston memory upgrade doubled the capacity of the VMware workloads running on top of the servers. This result is consistent with an independent study of Kingston memory that found that the number of virtual machines a server could support increased proportionally with the increase in RAM. In addition to doubling workload capacity, the Kingston memory increased the average number of instances that could run on each host by 150% (from two instances on a 48 GB host to five on a 96 GB host). Moreover, the IE Team averages 11 virtual machines per instance, with some instances supporting as many as 27. With memory headroom to spare, the IE Team can accept more application workloads from internal customers. For example, in 2011 the Team completed six onsite beta tests. In the first half of 2012 it completed eight tests and is on track to complete up to 25 by the end of the year. And while the 2011 VMworld Hands-On Lab events tallied 31 labs with 144,083 virtual machines deployed, the added server memory can handle even greater workloads for future events. Competitive pricing enables operational efficiencies The Kingston memory offered a competitive list price significantly better than the OEM's discounted price. Consequently, 288 servers were upgraded to 96 GB of memory to bring the entire installed base of 416 servers to a uniform memory footprint. Operationally, the uniform 96 GB servers make it easier to implement automated provisioning. And through that automation, the Team expects to manage workload distribution even more efficiently. VMware's CAPEX budget also received a boost. The service life of installed servers was extended by supplementing them with more memory, versus the capital-intensive alternative of procuring newer ones. Reliable modules support high-availability objectives The reliability of the Kingston modules is an added plus in production environments. The company incorporates rigid quality assurance measures into its manufacturing operations and vigorously tests modules to ensure that they meet its high standards. That's consistent with Zimman's personal experience. In over nine years of using Kingston memory, he can't recall experiencing a single DIMM failure. Integration Engineering Cloud Hardware: Kingston 8GB ECC Reg 1333MHz (KCS-B200ALV/8G) Low Voltage Modules contribute to a total memory footprint of 39.936 TB. 416 compute nodes utilize OEM blade servers. 1 petabyte of usable storage capacity. Cloud resources for 2011 VMworld event: 9 racks of computing hardware totalling 544 compute nodes with a yield of 3,520 compute cores and 29.95 TB of RAM. Converged fabric NFS, 10 GB Ethernet network.
Cloud resources for 2012 VMworld event: 7 racks of computing hardware totalling 416 compute nodes with a yield of 3,840 compute cores and 39.936 TB of RAM. Converged fabric NFS, 10 GB Ethernet network. Summary: Doubled the workload capacity of upgraded servers to easily take on expanding workloads. Value-priced memory enabled the implementation of a uniform server memory footprint across the cloud environment. Highly reliable Kingston memory supports demanding workload requirements. ### The cloud's value chain needs service brokers Maurice van der Woude, CEO of BPdelivery B.V, explains how cloud service brokers provide a valuable link in the cloud delivery chain. With the emergence of cloud computing, it has become obvious that suppliers involved in delivering cloud solutions are not capable of administering the full value chain themselves. Traditionally, all suppliers want to do all activities by themselves, but let's be honest, it has become impossible to know all about the processes needed behind even one delivery. There is a huge diversity in demand, and the Cloud industry needs to identify that and start working together. It is not necessary to have all expertise under one roof. Why should you? Leave the right expertise to the right experts and start partnerships. Making your company a fixed part of the value chain is more lucrative a business model than sticking to the old market models, where we tend to do it all ourselves with 50% quality or less. Inevitably leaving dissatisfied customers.  Do what you do best, and leave the rest to others within your value chain. By being part of a value chain, business comes to you, while you are able to grant business to your value chain partners. There is a huge diversity in demand, and the Cloud industry needs to identify that and start working together. The channel market When you come to think of the cloud value chain where resellers are involved, it is easy to say to resellers that they will have to change the way they offer their products. But every reseller is having the same issues: They understand that they need to change but they have no idea how to do that. Resellers are not business changers, most of the time resellers have a technical background - and pushing them in a more consultancy like role does not seem a good fit. It would be like asking someone who is accustomed to walking in sneakers and forcing him to wear leather shoes. For now it seems safer to stay in their current comfort zone instead of trying something new, with the risk of losing it all.  And who can help them with that business change? There are no proven business cases for this. For the distributors it is quite easy to adapt their distribution packages towards cloud, but how do the distributors offer that new package when they have no idea what the business user, standing at the counter of the reseller, actually needs in order to run his business? And what can be the answer of the reseller, with all his technical knowledge but little knowledge on the customers business processes? The Service Providers Service providers will have a difficult time when their customer demands a multiplicity of solutions from different suppliers, with different Cloud solutions running on (in the worst case scenario) multiple platforms. And if that service provider only had the one customer things would be easier, but most of the time that is not the case.  
Service providers want to deliver their technical solutions and do not want to get caught up in all the business processes that sit behind one delivery. In the European Small and Medium Business (SMB) market there are a lot of small technical providers that deliver their solutions to their customers, but have the greatest difficulty getting the invoices right, or making their internal business processes run like a well-oiled machine. Service providers leave money on the table right there, where the right implementation of the administrative and technical business processes would save them a lot of money on a daily basis. If the above were the only situations at hand, the market could concentrate on solving these particular issues. These issues, however, share a common characteristic: more parties may be involved behind one delivery, and delivery does not take place in the traditional one-on-one situation. Where more parties are involved with Cloud deliveries, that is where Cloud Services Brokers can come into play. A Cloud Services Broker (CSB) plays an intermediary role in cloud computing delivery and consumption. Cloud brokers make it easier for organisations to consume and maintain their cloud services, particularly when multiple parties are involved. The Cloud market is about high volumes of small products, because the solutions have to be able to run on any device, anywhere. So not only do high volumes play a significant role; the large variety of solutions also affects the value chain. Not every reseller and service provider is capable of handling that on their own. Cloud Services Brokers According to Gartner, a Cloud Services Broker comes in three flavours: Cloud Service Intermediation: An intermediation broker provides a service that directly enhances a given service delivered to one or more service consumers, essentially adding value on top of a given service to enhance some specific capability. It may be a solution that connects customer demands with offerings from suppliers. The services from BPdelivery certainly qualify for this. To ensure cloud makes the music, it will need orchestration, not only on a technical level but certainly on a business level, where processes need to be carried out in a consistent way. Aggregation: An aggregation broker service combines multiple services into one or more new services. It ensures that data is modelled across all component services and integrated, as well as ensuring the movement and security of data between the service consumer and multiple providers. These aggregation brokers will exist primarily in the cloud as service providers in their own right, forming a layer of service provision that approximates the application layer in traditional computing. This aggregation role will be the more technical broker service. Cloud Service Arbitrage: Cloud service arbitrage is similar to cloud service aggregation. The difference between them is that the services being aggregated aren't fixed. The goal of arbitrage is to provide flexibility and opportunistic choices for the service aggregator. In this case the customer will have more solutions to choose from, avoiding the risk of vendor lock-in. BPdelivery is a broker service that makes it easy for service providers to supply all their different customers with their products, all at different prices and under different pricing models. 
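To make the aggregation and arbitrage ideas above a little more concrete, here is a purely illustrative Python sketch of a toy broker that fronts several providers and assembles one quote per customer. The provider names, services and prices are invented for the example and do not describe BPdelivery's, or any real broker's, catalogue.

```python
# Toy "aggregation/arbitrage" broker: front several (invented) providers and,
# for each service a customer needs, opportunistically pick an offer, then
# aggregate the selection into a single quote. Illustrative only.

from dataclasses import dataclass

@dataclass
class Offer:
    provider: str
    service: str        # e.g. "vps", "backup", "ip-telephony"
    monthly_price: float

CATALOGUE = [
    Offer("ProviderA", "vps", 25.0),
    Offer("ProviderB", "vps", 22.5),
    Offer("ProviderA", "backup", 8.0),
    Offer("ProviderC", "backup", 6.5),
    Offer("ProviderB", "ip-telephony", 12.0),
]

def broker_quote(required_services):
    """Arbitrage-style selection: choose the cheapest offer per service,
    then aggregate the choices into one combined quote."""
    selection = {}
    for service in required_services:
        candidates = [o for o in CATALOGUE if o.service == service]
        if not candidates:
            raise ValueError(f"No provider offers {service!r}")
        selection[service] = min(candidates, key=lambda o: o.monthly_price)
    total = sum(o.monthly_price for o in selection.values())
    return selection, total

if __name__ == "__main__":
    selection, total = broker_quote(["vps", "backup"])
    for service, offer in selection.items():
        print(f"{service}: {offer.provider} at {offer.monthly_price:.2f}/month")
    print(f"Aggregated monthly total: {total:.2f}")
```

A real broker would of course also handle contracts, provisioning and invoicing across those providers; the point here is only the shape of the intermediary role.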
The unique approach is that BPdelivery is able to provide the full delivery process, from applications to hardware, IP telephony and the invoicing. Apart from that, one of the unique selling points is that BPdelivery does not touch the relationship between the providers and their customers. The service also comes white-labelled, so the customer is not even aware that the full delivery process is carried out by a third party. With this cloud intermediation broker service, all parties involved do what they do best on their own, while service and demand are connected with each other. The true "cloud broker" knows exactly what is going on in the market and knows about the business processes. Research Area Cloud brokerage is a new market, and Gartner has predicted that it will become standard sometime in 2014. BPdelivery believes that evolution in this market will be needed in order to ensure the growth of Cloud in Europe and beyond. We have already identified the need for these types of broker services, and we expect this market to grow on a large scale. A short video on Gartner's vision can be found on BPdelivery (here). In February 2013, F5 published an extensive whitepaper titled "Integrating the Cloud: Bridges, Brokers and Gateways". Though the whitepaper may look quite technical, the business issues are discussed, with a particular focus on cloud service brokers. They identified that the hybrid cloud is the type of cloud deployment used most of the time. It is only logical to conclude from that that cloud brokers are needed to ensure the interoperability of hybrid clouds. The value chain Before cloud services brokers become a fixed part of the value chain, Cloud solution providers need to realise that they will fulfil the technical deliveries, while the cloud services brokers take care of the business processes behind them, such as adding value, connecting solutions and improving the delivery processes themselves. In this way the cloud value chain is best served, because all parties are doing the things that they do best. The market realises that with the uptake of cloud solutions, more parties should be involved in the value chain. It will be impossible for one party to cover all the different areas needed, especially when you are active in the SMB market. Do bear in mind that a cloud services broker is not a consultant. A consultant may advise on process improvements, but they do not actually deliver those improvements. BPdelivery caters for that and has the business advisors to make sure the right path for your company is chosen. ### StratoGen Appoints Kat McClure to Spearhead Global Marketing Efforts London, 7th April 2014 – Leading VMware cloud hosting provider StratoGen is pleased to announce the appointment of Kat McClure as its new Global Marketing Director. Kat will lead the international marketing program to support StratoGen's market expansion within the hybrid cloud space. Her efforts will be focused on increasing global demand by bringing in new customers and partners as well as further developing existing business relationships. With over 14 years in the IT and media sectors, Kat has considerable experience in multi-national B2B marketing, including roles at Microsoft, AppSense, Mistral Internet and BSkyB. Kat's extensive marketing background places her extremely well to further enhance the good reputation and rapid growth plans of the StratoGen business. 
Karl Robinson, Sales Director at StratoGen commented: “We are thrilled that Kat has joined the leadership team at StratoGen. Her previous experience and pragmatic approach will be pivotal to our plans to significantly expand our hybrid cloud offering over the next few years. The business has been growing 50% year on year and having Kat heading up the marketing efforts will certainly help propel StratoGen into its next phase of growth.” Find out more by visiting www.stratogen.net ### IBM Accelerates Mobile Innovation for Businesses with New MobileFirst Services IBM expands services portfolio to help clients rapidly develop and launch mobile solutions. To help organisations accelerate and drive more business value from their mobile initiatives, IBM has announced the expansion of its IBM MobileFirst portfolio with new and enhanced services focused on mobile strategy and security, application development and device procurement and management. “The expansion of our mobility services portfolio demonstrates our continued investments in building the full breadth of capabilities clients need to radically transform and grow their business through mobile,” said Rich Esposito, general manager, Mobility Services, IBM Global Technology Services. “In 2014, we will further leverage our software acquisitions and growing capabilities in the cloud to provide clients with more flexibility and choices, including ‘mobility as-a-service’ solutions.” A recent IBM study reports that 90 percent of global organisations across eight major industries are planning to sustain or increase their investments in mobile technologies over the next 12 to 18 months. According to the report, the top three mobile challenges facing organisations are: Integrating mobile applications with existing systems (54 percent) Implementing secure end-to-end mobile solutions for devices and applications (53 percent) Reacting to changes in technology and mobile devices in a reasonable period of time (51 percent) Today mobile is helping drive business transactions and revenue while delivering valuable information that teams can tap into to identify and capture new business opportunities. As a result, mobile leaders are doing more to integrate this channel into the fabric of their business, helping clients address their key challenges and fully integrate mobile across their entire business. For example, The Fashion Institute of Design and Merchandising (FIDM) wanted to enable secure bring your own device (BYOD) access to online resources at all campuses while enhancing user satisfaction and increasing productivity for faculty, staff and guests. Using IBM solutions including its IBM Network Services, FIDM rolled out a BYOD solution that lets users securely access online resources using their smartphones, tablets and laptops. As a result FIDM has improved user satisfaction while increasing faculty and staff productivity. In order to help further drive this transformation, IBM is rolling out new and enhanced mobility services, including: IBM Mobile Infrastructure Consulting Services – Helps enterprises evaluate their existing mobile infrastructure environment, identify gaps and build a comprehensive mobile strategy and roadmap that meets their business and technical needs. Using industry-specific points of view, IBM guides the journey, focuses clients on approaches that are specific to their business and recommends solutions that achieve the client’s service management, cloud and workplace objectives. 
IBM Mobile Application Platform Management Services – With apps emerging as a key growth area for their business, enterprises are creating their own developer communities. With these new services IBM will help clients build, configure and fully manage their developer community’s app dev environment, supported by both software and a skilled team of experts. IBM Mobile Device Procurement and Deployment Services – Allows clients to simplify the selection and ordering of devices and install the client’s tailored platforms, apps and service components. Devices can be shipped directly to employees and include services for ongoing secure management, customer service and predefined refresh cycles with disposal or re-purposing. IBM Mobile Managed Mobility Services – Provides enterprises with scalable, secure, reliable and flexible management of their mobile infrastructure and wireless endpoints. With these services IBM helps clients reduce the risk, complexity and cost of managing their bring your own device (BYOD) and corporate device programs while freeing up the client’s IT resources to focus on core business needs. Leveraging technology from its acquisition of Fiberlink, IBM has enhanced the offering with Mobility as a Service (MaaS) as its primary-go-to-market focus, using MaaS360 and an ecosystem of IBM business partners to deliver cloud-based solutions. Mobile Network Services – Provides customers with a deep understanding of resources required to deploy a secure, scalable and reliable network infrastructure to support the unique demands of their mobile business. As a result, clients can identify the new unique components and design considerations that must be taken into account to establish a secure mobile enterprise. For example, customers can pinpoint what type of mobile security is required to sustain the increased network traffic volume generated by mobile apps and data. Mobile Collaboration Services – Provides a suite of productivity solutions including email, instant messaging, voice and video that enable mobile employees to exchange information, locate experts and become more productive. Clients can benefit from design, implementation and managed services to help them stay connected and perform business transactions from any mobile device, anywhere. As a result, employees become more collaborative, creative and effective, driving growth, customer loyalty, cost reductions and higher employee satisfaction. Mobile Virtualisation Services – Enables clients to leverage virtualisation technology to design, implement and manage distributed end users. Using this service, end users can access platform independent, hosted applications and full client images. This comprehensive offering includes assessment, design and implementation and managed services delivered on premises. Devices included in this service include laptops, workstations, tablets and thin clients. IBM Smart and Embedded Device Security – Helps enterprises secure their emerging mobile applications and enables device manufacturers to address concerns around safety, stability, service cost and intellectual property protection of smart and embedded devices. IBM uses threat modelling, source code analysis and penetration testing of device and application, including firmware and kernel module security, to identify and fix vulnerabilities. This enables the client to prevent hackers from gaining root access to their devices and increases the integrity and availability of their security services. 
In 2013, IBM launched IBM MobileFirst, a significant mobile strategy that enables clients to radically streamline and accelerate mobile adoption. IBM assembled the people, technology and R&D to build the most comprehensive mobile portfolio in the industry. IBM MobileFirst combines deep industry expertise with mobile, Big Data and analytics, cloud and social technologies to help organisations capture new markets and reach more people. For more information about IBM MobileFirst, visit: http://ibm.com/mobilefirst ### Community computing initiative drives learning and creativity Compare the Cloud & Reprezent Radio help young people compute and broadcast their way to a brighter future. Reprezent 107.3FM is a full-time London community radio station, exclusively for 13-24 year olds. All programming is made by young people trained in-house, and focuses on young people from urban areas of poverty. Reprezent would like to thank Compare the Cloud for the donation of PCs that has allowed our young people to use better ICT equipment and hopefully make a difference to their futures. Young people find technology and new media exciting, and these are our main tools for engaging them in learning the skills they need to do well in life. "It's here at Reprezent where I have built my confidence. In terms of broadcasting on a live show, that was actually big for me. Reprezent gave me that platform in order to do that and in order to have self-confidence and be myself. Reprezent was one of the main things that helped me to get into university. I didn't meet the criteria grade but because I had the experience at Reprezent, they reconsidered my application and accepted me. Yes, Reprezent was a real life saver." says JJ (Jonathan), aged 19. Most of the young people who come through our training (on average 400 per year) experience extreme difficulties in their lives one way or another. Many have been excluded from school, or have struggled with a mainstream education system ill-equipped to fully support them. A number come from unstable family circumstances, without strong role models to encourage and help them to develop. Many have been unemployed for a long period of time, and have had little training to help them to realise their potential. Over 90% of those who start our training, no matter what their background and ability, complete the course and achieve qualifications. Our training is nationally accredited, and gives young people the opportunity to get on air. In addition, radio is a great engagement tool for young people, as it's accessible to anyone with a voice, irrespective of educational level, literacy or even self-confidence. Through learning how to work on radio, trainees learn the skills necessary for the world of work and further education, such as speaking and listening, presentation, interview techniques, working in a team, turning up on time, negotiation, research, ICT and technical skills. As well as essential life skills, every young person on our training has a bespoke Learning and Development plan, agreed with them by our youth support workers. This sets out their long- and short-term goals (no matter how lofty!) and is regularly reviewed to make sure they are doing everything they can to achieve them, and that we're helping them in the right way. By the time a young person has gone through our training, they will be prepared to take the next positive leap in their lives. They'll have a CV full of practical work experience and be properly prepared for their next move. 
Over 90% of those who start our training, no matter what their background and ability, complete the course and achieve qualifications. Over 50% of all starters successfully gain employment at the end of the course, 30% progress to further or higher education, and many of the remainder continue with the station, making radio programmes and supporting new trainees. Our ambition is for young people to move out of the cycle of poverty with new-found skills, confidence and aspirations. Funding is very challenging, especially in the current austere climate, so the more support we get from companies and individuals, the greater the impact will be on improving the lives of young people across London. To find out more about Reprezent and how you can get involved, visit www.wereprezent.co.uk. ### Is NSA killing cloud? Much copy has been devoted to the Snowden revelations, and there has been widespread ire at the practices of the NSA and GCHQ. In fact, Yahoo has said it will switch from London to Ireland partially to avoid snooping. Taxation is also said to be playing a part here, but the UK government was so concerned it would lose intel that it summoned Yahoo to a meeting. The revelations have caused a wider wobble in cloud adoption, and this is shown by a recent report by NTT Communications (PDF). NTT surveyed 1,000 ICT decision-makers in the UK and USA as well as France, Germany and Hong Kong. For example, 62% of those not currently using cloud feel the revelations have prevented them from moving their ICT into the cloud. And 97% of EU ICT decision-makers now prefer buying a cloud service which is located in their region (92% for the US). 84% feel they need more training on data protection laws. It's important to put this into perspective. Most governments have security agencies which undertake electronic surveillance. Yes, even Ireland. Security agencies share and will continue to share data. The German security agency doesn't feature heavily in the Snowden revelations but has also admitted it shares data. Either businesses are comfortable with their own national government snooping on them or they're not aware of it. Last year someone admitted to me they would prefer GCHQ snooping on them rather than the NSA or FBI. I still don't understand how that makes sense. Where does this leave us? The NSA hasn't cracked all encryption yet, but it is working on it. In the meantime, there is a great opportunity for encryption, tokenisation and other obfuscation techniques to become a more prominent feature of cloud services going forward. I agree, there remains much confusion about data protection laws and more people should be trained on them. Also, aside from security agency snooping, not enough (public) cloud providers yet deal adequately with data protection. Pick any public cloud contract and look for the exclusions of liability for loss, corruption or deletion of data. It is reckoned the cloud market will be worth $225bn in 2015, and providers continue to invest heavily in building cloud services. For example, Google alone spent $7.3bn in 2013 and even Cisco is going cloud. Electronic surveillance isn't confined to cloud - keeping data on premise just makes it harder. But the NSA has managed that too. So staying out of cloud isn't the answer. The EU and US have committed to improving Safe Harbor by this summer. Don't give up now. Cloud is about to get interesting. 
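One practical way to act on that encryption opportunity, wherever the data ultimately sits, is to encrypt it on your own systems before it ever reaches a cloud provider, so the provider only ever holds ciphertext. Below is a minimal, illustrative sketch using the third-party Python cryptography library (Fernet, authenticated symmetric encryption); the file names and key handling are placeholders, not a production key-management scheme.

```python
# Minimal sketch: encrypt a file locally before uploading it to any cloud
# storage, so the provider (or anyone intercepting the transfer) sees only
# ciphertext. Requires the third-party "cryptography" package.
# File names and key handling below are illustrative placeholders only.

from cryptography.fernet import Fernet

def encrypt_file(plain_path: str, cipher_path: str, key: bytes) -> None:
    """Read a local file and write an authenticated, encrypted copy of it."""
    with open(plain_path, "rb") as f:
        token = Fernet(key).encrypt(f.read())
    with open(cipher_path, "wb") as f:
        f.write(token)

def decrypt_file(cipher_path: str, key: bytes) -> bytes:
    """Recover the original bytes; raises an error if the data was tampered with."""
    with open(cipher_path, "rb") as f:
        return Fernet(key).decrypt(f.read())

if __name__ == "__main__":
    key = Fernet.generate_key()          # keep this on-premise, never with the data
    with open("report.csv", "wb") as f:  # demo plaintext so the example is self-contained
        f.write(b"customer,balance\nacme,1200\n")
    encrypt_file("report.csv", "report.csv.enc", key)
    # "report.csv.enc" is what gets uploaded; the key never leaves your control.
    assert decrypt_file("report.csv.enc", key).startswith(b"customer")
```

Tokenisation works similarly in spirit: sensitive values are swapped for meaningless tokens before leaving your environment, with the lookup table kept on-premise.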
### Cisco - Between a rock and a hard(ware) place Last week, Cisco Systems Inc announced that they plan to offer cloud computing services, pledging to spend over $1 billion over the next two calendar years in order to establish themselves in the burgeoning cloud market. Cisco's perhaps slightly unexpected move has been widely reported in the media this week, with the Wall Street Journal initially breaking the story. The new operation will be referred to as Cisco Cloud Services, and the billion-dollar investment that Cisco is making will apparently go largely towards building the data centers required to run this new service. It is evident from the commitment they have made that Cisco is extremely serious about their cloud plans, and as an opening salvo it's a tidy sum. At present, Cisco is primarily known for its dealings in network hardware, so to some extent this represents a diversion from their usual business and modus operandi. Certainly four years ago they had no intention of providing cloud services - preferring to be the equipment supplier behind the growth of cloud, or as they described it at the time: "Cisco's motto is to be the arms supplier". However, many market analysts are viewing the move to cloud as an entirely logical strategy, with Cisco evidently wishing to take advantage of businesses' desire to rent computing services rather than incurring the often huge initial outlay of buying and maintaining their own networking hardware and servers. The cloud offers a flexibility in this regard that hardware of course never has and never will, albeit somebody somewhere still has to buy and manage such hardware. The problem is that it isn't always Cisco's gear filling the data centers of Cloud Service Providers (CSPs). Cisco's move therefore not only represents a growing trend in cloud computing, where expenditure on enterprise hardware has been falling for quite some time, but also reflects one in the world of hardware provision as well - we've recently seen IBM make a huge strategic shift away from hardware to cloud. The competition is growing and everyone wants a slice of the action. It seems quite clear that Cisco is making this ambitious cloud move with the intention of becoming the IT partner of choice for larger Enterprises and expanding its supply of services to its existing, large customer base of telecom and technology service provider businesses. Cisco has suffered from the fact that it has been selling hardware which "powers" cloud computing while drawing the line at offering end-user-facing services - a line it is now addressing. This creates a potentially awkward situation where they are likely to find themselves competing with their customers for Enterprise customer business - a position that VMware and IBM, amongst others, have had to navigate delicately around. As with these cases, it seems there is a diminishing role and viability for the straightforward reseller in the world of cloud, with everyone needing to differentiate themselves and bring their own value-add to a customer solution. 
So it can be said that it makes sense for Cisco to offer this joined-up functionality bridging hardware and cloud at this point in time, and to take full advantage of the name and reputation it has built up over the years, along with some good momentum and brand penetration. Cisco has already seen many of its natural rivals enter the cloud space. IBM has made numerous acquisitions with the intention of stamping its mark on the cloud, while Microsoft has already developed its Azure service with growing SaaS offerings. Oracle is also working on cloud services, while Hewlett-Packard has already developed its cloud stack. Both Verizon and AT&T have already developed low-cost cloud services, while Amazon is the dominant market leader in this particular facet of computing. What has yet to become clear is whether Cisco can contend with the lower-margin business of providing Infrastructure-as-a-Service (IaaS) or whether they'll be seduced into providing the potentially more profitable, differentiating and customer-retentive Platform-as-a-Service (PaaS) options to customers. If they do, then they really do have a lot of catching up to do (no doubt requiring additional business acquisitions), and they'll need to find extra speed, agility and a skill base they are not currently known for. Although a challenge, this isn't insurmountable, especially when you have dollars to spend - and it seems inevitable that cloud provision will become a central part of Cisco's business from this point on. What do you think? How do you see Cisco's foray into cloud service provision developing, in what key areas, and how successfully versus the established competition? ### Virtualization versus Cloud - and when to use them Rick Delgado puts virtualisation and cloud computing in the ring, asking how best to determine which is the better option for your business. It seems like businesses everywhere nowadays are turning to virtualisation and the cloud to decrease costs and increase company efficiency. They're a pair of the latest buzzwords that every business leader seems intent on using, but the two are often used interchangeably as if they mean the same thing, which is incorrect. While similar to each other, virtualisation and cloud computing each offer distinct and separate advantages depending on how they're used and what your business needs. Virtualisation vs. the Cloud To properly note each technology's benefits, it's important to describe how they differ from each other. In its basic form, virtualisation creates a virtual form of something. In the case of server virtualisation, it allows multiple servers to run on the same hardware, separating the virtual servers from the physical hardware and running multiple computing environments in one physical space. Virtualisation is essentially software that manipulates the hardware. In contrast, cloud computing takes the physical hardware out of the user's hands altogether, providing a computing service that is accessible over the internet. 
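To picture the consolidation idea at the heart of server virtualisation, the short sketch below packs a set of virtual machines onto as few physical hosts as possible, limited (for simplicity) only by RAM. The VM sizes and host capacities are invented, though the 48 GB versus 96 GB comparison echoes the kind of memory upgrade described in the Kingston case study earlier.

```python
# Illustrative sketch of server consolidation: pack VM memory demands onto as
# few physical hosts as possible, here constrained by RAM alone. Sizes are
# invented example figures, not measurements from any real environment.

def consolidate(vm_ram_gb, host_capacity_gb):
    """Greedy first-fit-decreasing packing of VM RAM demands onto hosts.
    Returns a list of hosts, each a list of the VM sizes placed on it."""
    hosts = []
    for vm in sorted(vm_ram_gb, reverse=True):
        for host in hosts:
            if sum(host) + vm <= host_capacity_gb:
                host.append(vm)
                break
        else:
            hosts.append([vm])   # no existing host has room: power on another
    return hosts

if __name__ == "__main__":
    vms = [4, 8, 16, 2, 2, 4, 8, 32, 4, 6]   # RAM each VM needs, in GB
    for capacity in (48, 96):                # e.g. before and after a memory upgrade
        hosts = consolidate(vms, capacity)
        print(f"{capacity} GB hosts needed: {len(hosts)} -> {hosts}")
```

Real hypervisors weigh CPU, storage and network alongside memory, but the basic economics - fewer physical boxes to power, cool and manage - follow the same logic.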
Put simply, cloud computing is computing as a service, and virtualisation is optimised computing on a physical infrastructure. Benefits of Virtualisation Server virtualisation has many benefits for an enterprise. Servers are migrated to virtual machines, which in turn are placed onto fewer physical servers. All this effort is largely intended to save money and improve efficiency, since with fewer physical servers there will be a reduced cost to power, cool and manage the server infrastructure. Consolidating servers also reduces the amount of physical space needed. Virtualisation also gives businesses a greater degree of flexibility when allocating additional resources to the servers. Moving virtual machines and storage to another machine can also be done seamlessly, without a need to disconnect from the hardware or interrupt an operating system or application. It also allows options for layers of extra redundancy, increasing availability, and for faster recovery in case of disaster (DR). Benefits of Cloud Computing Cloud computing has its own share of additional benefits. Since it is consumed as a service, a business depends on a provider or supplier instead of relying on its own IT staff. This outsourced IT means your company will not have to worry about the day-to-day maintenance and administration of the service, which can save your business money on IT costs. Most cloud service providers also offer a pay-as-you-go model where you only pay for the services that you need and use. If there are some systems, services or resources you don't want, then you simply don't need to pay for them. This also allows a business to optimise its operational capacity based on what it needs in the moment, meaning long-term planning isn't as necessary as it would be when adding actual physical infrastructure for yourself. Choosing Between the Two With both tools providing unique benefits, knowing when to use each of them becomes important as well. First, you need to assess what your company's needs and workloads are, so you have accurate and up-to-date information when making a decision. Since your business would be in charge of managing a virtualised infrastructure, you also need to know who is going to provide support while it operates. Virtualised environments can also be difficult to integrate with other systems. If it's too challenging for the amount of money and resources you have available, cloud computing is likely to be the better option. This is borne out by the research company Gartner, which advises businesses with less than $20m in annual revenue not to build their own infrastructure. Security needs may also be a big factor in the decision-making process: if data security (and accountability) is of paramount concern, then virtualisation may be the better route to take, since all your data is kept in-house and there may be less reliance on encryption of data at rest and in transit. Though both technologies have a lot to offer individually, many companies are choosing to use both in tandem to get the most out of these tools - effectively a hybrid (public/private) cloud. When the two interact with each other, most of their advantages are amplified. 
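When weighing the two options (or a hybrid of both), a rough cost model can help frame the decision. The sketch below compares an amortised on-premise virtualised estate with a pay-as-you-go cloud bill; every figure is an invented assumption used to illustrate the shape of the calculation, not a market price.

```python
# Back-of-the-envelope comparison of owning virtualised servers versus paying
# for cloud capacity by the hour. All numbers are invented assumptions;
# substitute your own quotes and workload profile before drawing conclusions.

def on_prem_monthly(hosts, host_cost=6000, lifetime_months=36,
                    power_cooling_per_host=90, admin_overhead=2500):
    """Amortised hardware plus power/cooling plus a share of admin time."""
    return hosts * (host_cost / lifetime_months + power_cooling_per_host) + admin_overhead

def cloud_monthly(vm_hours, price_per_vm_hour=0.08):
    """Pure pay-as-you-go: only the VM hours actually consumed are billed."""
    return vm_hours * price_per_vm_hour

if __name__ == "__main__":
    always_on = cloud_monthly(vm_hours=20 * 24 * 30)    # 20 VMs, around the clock
    office_hours = cloud_monthly(vm_hours=20 * 8 * 22)  # 20 VMs, office hours only
    print(f"On-premise (5 hosts): ~{on_prem_monthly(5):,.0f} per month")
    print(f"Cloud, always-on:     ~{always_on:,.0f} per month")
    print(f"Cloud, office hours:  ~{office_hours:,.0f} per month")
```

The point is less the totals than the sensitivity: the comparison swings on utilisation, server lifetime and how much administration the cloud provider genuinely takes off your hands.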
Many businesses have already implemented one technology, so going that one step further could prove very beneficial in the long run, reducing costs and maximising productivity. What other points or advice have I missed for those looking into virtualisation and cloud computing as the best solution for their business problem? ### The need for application performance Béatrice Piquer-Durand, VP of Marketing at Ipanema Technologies, highlights how networks are failing to keep up with applications and user expectations of performance. We live in an age where businesses are constantly battling to become more efficient, and to increase output whilst minimising costs. It's a ruthless arena; CIOs are consistently looking towards IT as a key enabler - one which will give them an edge. IT solutions and innovations are embedded into everyday business functions. Desktop virtualisation, cloud applications, BYOD, social media and Unified Communications are just a few examples of IT functions which companies rely upon every day. All of these applications or services can be offered to the user via private or public networks, and it's the network, therefore, which needs to bear the brunt of the strain. An inefficient flow of applications across the network results in impeded business performance. Under-performing applications In a recent study we commissioned with Easynet (KillerApps 2013), it was found that 54% of organisations surveyed suffered from application performance problems, such as slowness or non-responsiveness. Application issues are always a source of frustration; for the end user, they signal time wastage, and for the business, lost time means lost productivity. Users' changing IT habits mean that the performance of applications is becoming more critical. As users depend upon a greater breadth of applications and services (such as Office 365, Salesforce and Google Apps), they expect them to be delivered seamlessly. A video call with jitter and delay will frustrate, and slow load times for a virtual desktop aren't acceptable. The role of the CIO can sometimes be a thankless one. They're expected to manage the flow of data across the networks efficiently, and ensure the smooth running of business-critical applications. When this goes well, the user notices nothing. When it goes wrong, fingers are quickly pointed. Effectively managing applications Traditionally, these problems were handled through WAN optimisation and prioritisation. Yet increasingly, businesses need to do more than simply prioritise certain applications over others. There comes a point when a business decides that all of its applications are 'business critical', and therefore cannot be ranked over or under one another. The CIO needs to guarantee the performance and success of the entire application portfolio. After all, who wants to ensure good video quality at the price of a shaky SAP experience? Choosing is not always an option. ERP, Unified Communications, cloud applications, virtual desktops, social media, video, telepresence, big data, mobility and BYOD: all contribute to business efficiency. This is where the need to guarantee the performance of applications comes into play. In an ideal situation, businesses guarantee each application's performance at any time, and directly align the network with the enterprise's business goals. 
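Guaranteeing performance starts with measuring it consistently. As a purely illustrative starting point, the snippet below samples end-to-end response times for a handful of business applications from wherever it is run (a branch office, say); the application names and URLs are placeholders.

```python
# Rough sketch: sample end-to-end response times of a few business
# applications, the kind of basic visibility needed before deciding which
# applications are actually under-performing. URLs below are placeholders.

import time
import urllib.request

APPS = {
    "CRM":      "https://crm.example.com/health",
    "ERP":      "https://erp.example.com/health",
    "Intranet": "https://intranet.example.com/",
}

def response_time_ms(url, timeout=5.0):
    """Return the time to fetch the first kilobyte, in ms, or None on failure."""
    start = time.perf_counter()
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            resp.read(1024)
    except OSError:
        return None
    return (time.perf_counter() - start) * 1000.0

if __name__ == "__main__":
    for name, url in APPS.items():
        latency = response_time_ms(url)
        status = "unreachable" if latency is None else f"{latency:.0f} ms"
        print(f"{name:10s} {status}")
```

Dedicated application performance tooling goes much further - per-flow visibility, per-application guarantees, control at the WAN edge - but even simple, regular measurements make conversations about "slow applications" far more objective.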
There’s a need for the network to have greater intelligence, and to be able to sense and respond to everything that’s happening in an orchestrated way. Only then can IT departments control application performance on a global basis. A shift in approach towards applications - the future Already, we are seeing some of this effective management in how businesses should handle social media applications. In the past, CIOs have tended to block applications which they deemed an unnecessary stress on the network, such as social media sites like Facebook and Twitter. According to the KillerApps 2013 research, however, there’s been a shift away from blocking social media applications, towards managing them instead. The research showed that the number of CIOs blocking Facebook dropped by 15% compared to last year, and blocking of all online TV and video has halved. This would suggest that not only CIOs are becoming more adept at prioritising the business-critical applications, but they’re also embracing the role of social media within the workplace. Businesses are adapting to user demands, and users are increasingly using social media applications as a means of communicating. They are managing these social applications while ensuring business ones can continue to function. As networks become more complex, with greater demands placed upon them, businesses will need to be able to guarantee the performance of their applications. The use of social media will continue to rise within the workplace, and so too will ‘shadow IT’ practices (IT solutions and systems which are built and implemented by departments without organisational approval). CIOs will need to maintain control and transparency over their networks if they are to maximise the chance for business efficiency, and this will be a key area of focus for 2014. ### CenturyLink expands global data centre CenturyLink expands global data centre presence in eight markets. Increased demand fuels plans for more than 180,000 square feet of additional space CenturyLink, Inc. has announced further investment in its hosting capabilities with plans to expand its global data centre presence in eight markets during 2014. The opening of three new data centers in North America, paired with expansions to five existing CenturyLink data centers, responds to global demand for secure cloud, colocation and managed services, particularly in regions with a strong CenturyLink network presence. These plans follow recent news that the company will also invest in expanding its CenturyLink Cloud network of public cloud data centers. By adding capacity in these key locations, we position ourselves firmly for delivering the advanced network, hosting and IT services that organizations depend on to compete in the global economy. “CenturyLink is committed to investing in world-class infrastructure to meet growing demand for access to comprehensive and secure technology solutions that power business transformation,” said Jeff Von Deylen, president, CenturyLink Technology Solutions. “By adding capacity in these key locations, we position ourselves firmly for delivering the advanced network, hosting and IT services that organizations depend on to compete in the global economy.” The new builds and expansions will offer more than 180,000 square feet of additional space to CenturyLink’s global presence: Phoenix (new data center sites, in partnership with IO, opened in January 2014) Weehawken, N.J. 
(expansion completed in March 2014) Minneapolis (new data center site opening in May 2014) Sterling, Va. (expansion completing in May 2014) Irvine, Calif. (expansion completing in June 2014) Toronto (new data center site opening in Q3 2014) Reading, England (expansion completing in Q3 2014) Chicago (expansion completing in Q3 2014) These expansions will bring more than 20 megawatts of additional power to CenturyLink’s total data center footprint, enabling businesses like the Chicago Board Options Exchange (CBOE) to add points of presence in locations most strategic for their operations. “We are pleased to see CenturyLink expanding capacity in Weehawken, which serves as an excellent venue for the global financial industry with its proximity to trading markets,” said Curt Schumacher, chief technology officer at CBOE. “We added a point of presence in this key data center, which features a vibrant ecosystem and secure network solutions, to supply subscribing firms with competitive markets data over low-latency trading connections.” Weehawken is one of more than 55 data centers CenturyLink operates, with more than 2.5 million square feet of gross raised floor space throughout North America, Europe and Asia. The company, ranked by Synergy Research and Frost & Sullivan as the second-largest retail colocation provider in the United States, offers robust colocation, cloud and managed services across its footprint of data centers powered by an advanced fiber network, high-availability SLAs and Uptime Institute Tier III-certified facilities. “IT leaders tell us the road to cloud often starts in an outsourced data center,” said David Meredith, senior vice president and general manager for CenturyLink Technology Solutions. “With this global rollout, we’re on an ambitious course to deliver carrier diversity, interconnectivity and managed hybrid services at pace with the market’s appetite for access to flexible and secure IT infrastructure.” About CenturyLink Technology Solutions CenturyLink Technology Solutions delivers innovative managed services for global businesses on virtual, dedicated and colocation platforms. For more information, visit www.centurylink.com/technology. About CenturyLink CenturyLink is the third largest telecommunications company in the United States and is recognized as a leader in the network services market by technology industry analyst firms. The company is a global leader in cloud infrastructure and hosted IT solutions for enterprise customers. CenturyLink provides data, voice and managed services in local, national and select international markets through its high-quality advanced fiber optic network and multiple data centers for businesses and consumers. The company also offers advanced entertainment services under the CenturyLink® Prism™ TV and DIRECTV brands. Headquartered in Monroe, La., CenturyLink is an S&P 500 company and is included among the Fortune 500 list of America’s largest corporations. For more information, visit www.centurylink.com. ### The cloud over your company Cloud Computing ticks so many boxes. Robert Cordray, Business Consultant & Entrepreneur gives an overview of the key advantages that cloud computing can bring to your business, particularly outbound sales teams. There is a cloud over your company. However, this cloud isn't bringing rain, but data. Cloud technology allows users to access all kinds of data, software, and applications without the need to be tethered to a desktop computer. 
If you have access to a Wi-Fi hotspot, then you have access to the cloud. Think of it as the great cyberspace hard drive. To keep their competitive edge, many companies are encouraging their sales forces to make full use of cloud technology. Here are some of the reasons why the cloud is an integral part of any successful sales force: Total Mobility While it is true that a lot of business is conducted over the internet, there is still the need for the "face-to-face" meeting. When a sales force hits the road, they'll be totally connected to all the pertinent information from their parent company. This also allows employees to work from home and while travelling. The improvement in their work-life balance actually increases productivity, and that is good for everyone. Easy Expansion The goal of any company is to expand its business. With a cloud-based setup, adding bandwidth and capacity can happen almost instantaneously. This can free up space on a company's already challenged servers. Meeting the demands of a growing customer base is essential for a sales force to succeed. Backup The cloud takes the pressure off losing data, because data stored there is almost never lost or corrupted. This is vital, since desktop computers will occasionally crash. There won't be any worries about work slowing down or suffering such setbacks when you can access all you need from the cloud. Working Together A sales force that is scattered around the country or the globe needs to be on the 'same page'. Sales enablement tools can be remotely accessed from the cloud and put into practice at any time. Sales reps will be able to instantly sync up all of their data. It also lets the sales force share important updates from the parent office. This helps a sales force work together as a cohesive unit that can focus on all the company's goals and objectives. Automatic Updates Improvements and bug fixes happen all the time. Software and applications that are provided as a cloud service (SaaS) are automatically updated. There is no longer the need to search for these updates. The sales force will know they are working with the best tools available. Security Laptops are easy targets for thieves, especially in airports. If a company laptop is stolen, the work can continue without missing a beat, because everything will be in the cloud, ready and available for access. Centralized Data With the cloud, your sales force doesn't have to make adjustments for different time zones. In other words, they don't have to wait for someone to wake up in order to access a project file. All of that data will be readily available, synchronized and updated around the clock. Affordable Options Using cloud computing will show up as a positive entry on a company's accounting books. That is because most cloud services are set up on a pay-per-use system. Not only will these services be faster to set up, but start-up costs are also likely to be minimal. You'll also know exactly what your operating costs will be for the year. That will certainly help with budgeting. Staying Competitive A small business has a chance to compete with the big players thanks to cloud computing. When it is quicker to access data, stay connected and recover backups, then a sales force can truly hit the ground running. Going Green When a company turns a large portion of their data processing and storage over to the cloud, they are reducing their need for their own servers and data center space. 
That in turn will cut back on their energy expenditures and reduce their carbon footprint. Not only is that a great marketing asset, but it is also good for the environment. Are you putting cloud computing to work for your company? You should. ### IBM step up cloud presence with Actifio deal IBM has been taking further steps to beef up its cloud storage services to small and medium-sized enterprises (SMEs). The massive 'hardware manufacturer' – currently the world’s 33rd largest corporation according to Forbes – is evolving its approach to business considerably in order to tessellate more snugly with a computing future which revolves around the cloud. [pullquote]IBM are now investing heavily in the technology in order to establish themselves as a major player in the cloud.[/pullquote]In accordance with this desire, the company has invested $1.2 billion recently in improving its cloud storage services. Additionally, the computing powerhouse has also agreed a deal with a company based in Massachusetts named Actifio, a prominent and growing provider of copy data management technology, which enables SMEs to boost their data storage. Cloud computing is offering SMEs the opportunity to save a great deal of money and time with regard to their data storage, and IBM is attempting to tap into element of the technology. Many companies make several copies of their data with the intention of using them for different purposes. Thus, the cost of maintaining servers particularly in a world in which the amount of data produced is escalating is obviously increasing rapidly. The cloud has obviously lent itself perfectly to remedying these issues, and IBM are now investing heavily in the technology in order to establish themselves as a major player in the cloud. Actifio has been a particular innovator in this field, as they have developed a unique technology which enables enterprises to use one solitary ‘golden copy’ of data for a multitude of purposes. Naturally this greatly helps control the overheads associated with data storage, which is particularly welcome for SMEs in particular. IBM and Actifio have been working together for some time ahead of the launch of a managed cloud storage service which will be called IBM SmartCloud Data Virtualization. This innovative technology will enable companies to “decouple application data from their physical infrastructure”. IBM strongly believes that this will enable companies to improve the resilience and agility of their business structures, and make moving into the cloud increasingly seamless for business of all sizes. Major executives from IBM have spoken of their admiration for Actifio’s working practices and vision, in addition to the actual quality of their technological achievement. The word at IBM is that Actifio offers a fast and flexible solution to help businesses get into the cloud, and one which has made it very easy for IBM to meet its compliance and security protocols and targets. With SmartCloud Data Virtualization, IBM will be able to deliver a managed service, and will actually charge customers on per gigabyte protected basis. In accordance with this deal, IBM are purchasing Actifio hardware and software, and will utilise it in order to deliver SmartCloud Data Virtualization to an expectant public. IBM is hoping to make gains on some of the other big cloud providers, most notably Amazon, who continue to lead the field in the cloud with Amazon Web Services. 
The IBM service particularly underlines the flexibility and scalability of cloud solutions, naturally a particularly important consideration for small businesses. With SmartCloud Data Virtualization, IBM will be able to deliver a managed service, and will charge customers on a per-gigabyte-protected basis. Not only does this negate the need to pay for duplicate copies of data, but it also ensures that businesses can sign up for the service without needing to know their long-term requirements. The Actifio venture is helping IBM target some pretty big markets. The corporation has stated that it intends to move into what it refers to as "a $100 billion market for cloud expenditures." Meanwhile, analysts are extremely enthusiastic about IBM's vision, pointing out that it will enable the company to deliver joined-up services which some of its rivals are forced to provide in separate bundles. Both Actifio and IBM are evidently focused on a long-term partnership as one of the iconic names in computing looks to push itself to the forefront of this burgeoning technology. Given their success in the past and the moves that they're making at the moment, don't bet against them being a major cloud player very soon. ### Post Snowden - move or secure your data? Matthew Tyler, CEO of Blackfoot, considers a post-Snowden world where data security is high on the agenda, and looks for practical solutions. Since the Snowden revelations of nine months ago there has been a huge amount written on 'global surveillance' and the interception of confidential data. In fact there has been so much written and spoken about the subject that even a Co-op chairman would struggle to find time to keep abreast of it! When we listen to the subtext of these writings and conversations we begin to see an interesting question emerging, specifically: how do we protect ourselves in the future? This article explores some of the disclosures and the fundamental changes that will be required. Most enterprise-class networking equipment and software systems have been designed with 'in place' back doors that allow easy access. Shock disclosures and two common themes Amongst the many writings are disclosures about how the largest US tech companies are releasing vast amounts of data to the National Security Agency (NSA) through automated and legal extraction processes. Like buses and bad news, shock disclosures often travel in threes, so it should come as no surprise that internet communications have been routinely hoovered from the web by security services, and that most enterprise-class networking equipment and software systems have been designed with 'in place' back doors that allow easy access. So now the shock has passed and a general acceptance has occurred that the NSA can, and probably does, take whatever it wants whenever it wants, two common questions are emerging: 1) should we be surprised by this behaviour of the security services? 2) where should we be placing our data in the future? Should we be surprised? Although the NSA and security services appear to have a penchant for the extraction and monitoring of data, we probably shouldn't be surprised that they do. Let's face it, the job of the security services is to protect us from threats. We also shouldn't be too surprised by the methods they deploy in the digital age, as they are probably no more intrusive than those used in the analogue age of the 70s and 80s, when government-approved export licenses were required to sell core telco kit to other countries. 
What is important is that governments shed light on these practices and ensure they are transparent, rather than being seen as complicit. Where should we be placing our data in the future? The question about where data should be held in the future has become commonplace and, with as many as 25% of businesses considering moving their data out of the US, or away from companies located in the US, an important question should be asked quickly: is that even the right question, or - with the advent of change and the crumbling of borders - is the better question 'How can I keep my data safe?' This is explored in the next section. Securing data in a brave new world As evidenced by recent MI5 and GCHQ communications to FTSE 350 Chairmen, the 10-year, £30bn spend on building bulletproof corporate networks with highly secure ingress and egress points has failed. In the post-Snowden world we can be certain their knowledge that 'large organisations are bleeding data' is accurate; and we now know how they knew! In the new digital world, workers have been un-tethered from their desks and now wish to interact with clients, colleagues and supply chains in entirely new ways. Much of this has been enabled by mobile and the new world of Cloud computing. However, in this brave new world, adding ever-growing perimeter security to protect vulnerable and poorly designed applications is of little value to businesses who want to leverage the power of web-enabled systems to support their business objectives, improve customer experience and streamline operational costs. The future holds an interesting dichotomy: on one hand, businesses want to web-enable their applications and extend system access beyond the traditional boundaries; on the other hand, traditional models of securing networks, gateways and end points constrain them. Future investments in the traditional approach could be considered the equivalent of throwing out the baby when the 'information' bath is emptied into the cloud. Secure the applications to protect the data So if the question is to be how can I protect my data, where should the smart money be spent in the future? Perhaps Snowden provided a hint about the answer by stating that strong application security and well-managed cryptographic techniques will cause the security services the most headaches. Without doubt these headaches will be equally unpleasant for the bad guys out there whose intent is not to protect but to gain financially. To meet the needs of the future, organisations must refocus their security efforts on securing applications and information rather than building bigger walls around the perimeter. This idea is gaining wide-scale acceptance, and as we look to the future it is highly probable that the only IT investments likely to be considered transferable assets are the investments in the applications businesses use to make sense of the data and information they hold. Final thought IT professionals are very familiar with the notion that security is a little like an onion, and that security is best achieved by applying layers rather than relying on silver-bullet solutions. If the information is at the core, then we need to start by knowing where it is. Information is now everywhere within most corporate networks and is certainly held in more places than most IT departments are aware of. 
It naturally follows that you can't secure what you don't know about. So therefore moving applications to the cloud should be seen as a way of cleaning up corporate networks which can be riddled with information storage either inside on spreadsheets or outside on Dropbox / Google Drive type services. With application security and strong encryption as essential layers of any secure information onion (aka Snowden), the brave new digital world of clouds connected to the Wild West Web can offer an excellent way of keeping your information from prying eyes and getting your information and security in place before new legislation, various security services or the real bad guys get their hands on it. ### Intermedia debuts Salesforce and ConnectWise integration Intermedia Debuts New Integrations with Salesforce and ConnectWise, Linking the Two Leading Platforms for MSPs and VARs in EMEA. By integrating its Partner Portal with its partners’ most popular CRM and professional services automation tools, Intermedia slashes the complexity of reselling cloud services. Intermedia EMEA partners can now integrate their Partner Portal with Salesforce and ConnectWise to grow their cloud services footprint without adding to their accounting and sales overhead. “Intermedia is committed to continuously developing new tools and improving our services to make it easier for our partners and customers to do business,” says Ed Macnair, managing director EMEA at Intermedia. “The integration of Salesforce and ConnectWise with our Partner Portal allows us to offer a more seamless business platform not only for partners, but administrators and end-users as well.” Intermedia is the world’s largest one-stop shop for cloud IT services and business applications. Its industry-leading Partner Programme enables MSPs and VARs to resell hosted Exchange, Lync, Sharepoint, Email Archiving and other services. Now, in addition to controlling their own pricing, bundling, branding and customer relationships, Intermedia EMEA partners can streamline their workflows with integrations between their three main business platforms: Salesforce – The world’s largest provider of customer relationship management software with more than 100,000 customers ConnectWise – The No. 1 business management platform worldwide with more than 80,000 users Intermedia’s Partner Portal – The single point of control for Intermedia partners to manage their businesses, services, and collective 350,000+ users With the Salesforce integration, partners can: Sync customer account data from the Intermedia Partner Portal to Salesforce Sync their Intermedia plan pricing with Salesforce pricing books Track product usage to easily spot selling opportunities Easily generate quotes without having to manually copy and paste data from the Partner Portal Create new accounts directly in Salesforce and automatically synchronise to the Partner Portal With the ConnectWise integration, partners can: Simplify their billing processes and reduce the amount of time spent generating invoices Export their catalogue of Intermedia products and services Ensure product and customer usage data in the Intermedia Partner Portal are automatically updated in their billing system Eliminate the need to manually update billing agreements (exclusive to ConnectWise integration) “We leveraged these integrations to effortlessly bring our Intermedia services into our existing business processes,” said Jon Matero, president and CEO of Network Heroes, an Intermedia partner. 
“We maintain a single system of record for all our CRM and accounting needs, and keep our accounting and sales teams more focused and more productive. It’s going to save us hundreds of man-hours this year.” Intermedia is committed to making it simple for partners to run their businesses and profit from managing cloud services. These integration apps are being offered free of charge via Intermedia’s Partner Portal; each is designed to take less than 30 minutes to set up. To find out more about the Intermedia EMEA Partner Programme, click here. About Intermedia Intermedia is the world’s largest one-stop shop for cloud IT services and business applications. Its Office in the Cloud™ delivers the essential services that SMBs need to do business – including hosted Exchange, Lync, Sharepoint, Email Archiving, security, mobility and more. All of Intermedia’s services are pre-integrated for a seamless user experience. They are all managed via Intermedia’s HostPilot® control panel, with just one login, one password, one bill and one source of support – creating tremendous cross-service efficiencies for both users and IT administrators. In addition, they all offer enterprise-grade security, 99.999% availability and 24/7 customer care to assure a worry-free experience. Intermedia was the first company to offer hosted Microsoft Exchange and, with 100,000 customers and more than 1,000,000 users, is today the world’s largest independent provider of hosted Exchange. Intermedia also enables channel partners – including VARs, MSPs, telcos and cable companies – to sell cloud services under their own brand, with full control over billing, pricing and every other element of their customer relationships. Intermedia’s 600 employees in 3 countries manage 10 data centres to power its Office in the Cloud – and to assure its famous worry-free experience. Learn more at Intermedia.co.uk. ### Weserve B.V. selects Flexiant for cloud platform Webhosting company, Weserve B.V. has announced it has selected Flexiant Cloud Orchestrator to manage its multi-datacenter cloud solution from one platform. Faced with multi-datacenter management limitations from its previous cloud software vendor, Weserve was in search of a mature, professional and high availability IaaS cloud orchestration solution that could serve its 1,500-2,000 customers and 25 resellers across the Netherlands. Robert Nijhof, Managing Director, Weserve said, “It was important for us to manage our two datacenters in the Netherlands from one platform. After disappointment from our existing vendor, we evaluated Flexiant, OpenStack and CloudStack. Flexiant was able to meet our requirements so that we could support now, and expand in the future, our current cloud offering based on VPS with more capabilities and differentiated services.” Weserve has deployed Flexiant Cloud Orchestrator across both datacenters and already has customers on the platform. Weserve has also now completed migrating its existing customers from the previous platform. “The main benefit of selecting Flexiant was moving immediately to a multi-datacenter solution and secondly Bento Boxes. We extend the Bento Box functionality to our resellers as most want to turn multiple VPS solutions into hybrid solutions. With Flexiant’s valuable tool, they can use our solution, but deploy it themselves,” said Nijhof. Flexiant’s cloud blueprint technology, Bento Boxes, offers pre-packaged, pre-configured and sophisticated application stacks that are ready to use. 
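As a purely illustrative aside - this is not Flexiant's API, and every name below is hypothetical - the blueprint idea can be sketched in a few lines of Python: describe the servers and networks of a stack once, then redeploy that same template for any number of customers or resellers.

```python
# Purely illustrative sketch of the "cloud blueprint" idea: a re-usable,
# re-deployable template describing every element of a virtual workload.
# This is NOT Flexiant's API; all names here are hypothetical.
from dataclasses import dataclass, field
from typing import List

@dataclass
class ServerSpec:
    name: str
    cpus: int
    ram_gb: int
    image: str          # pre-configured application image, e.g. "nginx-php"

@dataclass
class Blueprint:
    name: str
    servers: List[ServerSpec] = field(default_factory=list)
    networks: List[str] = field(default_factory=list)

    def deploy(self, customer: str) -> None:
        # A real orchestrator would call its provisioning API here;
        # we simply print the plan to show the template being re-used.
        for server in self.servers:
            print(f"[{customer}] provisioning {server.name} "
                  f"({server.cpus} vCPU / {server.ram_gb} GB) from {server.image}")

# Define the stack once...
web_stack = Blueprint(
    name="standard-web-stack",
    servers=[ServerSpec("web-1", 2, 4, "nginx-php"),
             ServerSpec("db-1", 4, 16, "mysql")],
    networks=["frontend", "backend"],
)

# ...and redeploy it, unchanged, for any number of customers or resellers.
web_stack.deploy("customer-a")
web_stack.deploy("customer-b")
```

The value of the pattern is that the stack is defined, reviewed and priced once, and every subsequent deployment is a repeat of a known-good configuration.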
By working closely with Flexiant’s Professional Services team, Weserve was able to get the platform up and running in days.  Nijhof said, “The team at Flexiant is very supportive and knowledgeable. Combined with Flexiant Cloud Orchestrator’s superiority to other vendors, we now have the technical solutions we need to support our customers and reseller business.” “Any organization wanting mature, professional and high availability cloud orchestration should evaluate Flexiant. The solution makes our lives easier and allows us to deliver the service we want to offer. We plan to expand our use of Flexiant Cloud Orchestrator as we expand our business,” concluded Nijhof. About Weserve B.V. Since 2003, Weserve B.V. has been active in the webhosting business. Weserve B.V. offers a wide variety of service including domain registration, webhosting, dedicated services, colocation and rackhosting, high availability VPS servers, streaming services, online backup and spam filtering. Weserve B.V. manages its own network and hardware in the Serverius datacenter at Dronten, The Netherlands. From its core all services are offered and monitored. The NOC system of Weserve B.V. is also used to offer managed services for business users. From the datacenter, Weserve B.V. delivers and manages its FttO connections to businesses and has several interconnects with first class Fiber suppliers. For more info about our business services, please check  www.weserve.nl About Flexiant Flexiant provides cloud orchestration software focused solely to the global service provider market. Flexiant Cloud Orchestrator is a cloud management software suite that arms service providers with a customizable platform to help them turn innovative ideas into revenue generating services quickly and easily. With Flexiant, service providers can generate more revenue and accelerate growth, compete more effectively and lead the market through innovation. Vendor agnostic and supporting multiple hypervisors, Flexiant Cloud Orchestrator offers a customizable platform, a flexible interface, integrated metering and billing, reseller capabilities and application management. Flexiant gives service providers the ability to develop, launch and bill for new cloud services quickly. Flexiant has been named a Gartner Cool Vendor in Cloud Management, ranked as the most innovative product in the market by Info-Tech Research Group for the second year in a row and called an industry double threat by 451 Research. Flexiant is now a Dell certified technology partner and Parallels partner. Customers include Colt Ceano, Computerlinks, ITEX, NetGroup and WeServe. Flexiant is also a key participant in the European Framework Programme. For more information visit www.flexiant.com. ### Cloud World Forum Addresses the Evolving Role of the CIO London, 18 March 2014 – In an increasingly digital world the Cloud era presents a wealth of opportunity to economies, enterprises and individuals around the globe. At an executive level, Cloud is having a profound effect on the role of CIOs, whose corporate responsibilities are evolving as their companies fight to stay ahead of their competitors and maintain the pace of innovation. This issue will be addressed at this year’s sixth annual Cloud World Forum, where some of the industry’s most notable CIOs will headline as speakers. In a report by the McKinsey Global Institute, Cloud is listed as one of 12 technologies with the potential to drive massive economic transformations and disruptions in the coming years[1]. 
With Cloud such a huge driver of potential growth, CIOs have to hone new skills and learn how best to harness its power. "The up-and-coming role of Chief Digital Officer (CDO) exemplifies the convergence between technology and business," said Laurent Lachal, Senior Analyst at Ovum, in a recent report[2]. "Many CxOs, irrespective of their background, be it IT (CIO), marketing (CMO), or finance (CFO), can potentially fulfill it, with CDOs increasingly regarded as 'CEOs in waiting'. In this context, there is a tendency to describe the role as the object of a war between CIOs and other CxOs. Similarly, when it comes to the control of IT budgets, many like to pit CIOs against other CxOs, particularly CMOs. What digital enterprises need is not CxOs battling over roles, budgets, or strategies, but CxOs with both a business and an IT background coordinating their IT investments and strategies." As part of the conference agenda on day one, a panel discussion will take place entitled 'Redefining the role of the CIO in the era of the Cloud'. Among the issues the panel will examine are: what the CIO role means today when many of the decisions to use Cloud and IT technologies are made by other departments; the rise of Shadow IT and its impact on data centre operations; and the changes needed to contracts and security governance to both contain and liberate employees and partnerships, as well as to control performance, compliance risks and cyber threats. Georgios Kipouros, Head of Production, Cloud World Series, said: "Cloud offers huge potential, but like any transformative technology it also presents its challenges too. Aside from the technological implications it's also drastically altering business operations and responsibilities at managerial level. Organisations have to move faster to keep up with the competitiveness that's enabled by the Cloud and not fall prey to more forward-thinking companies - and the role of the CIO is central to doing that. This is a hot topic in the industry and the Cloud World Forum presents the perfect platform for the industry's leaders to gather and debate the issue." The Cloud World Forum conference and exhibition will take place on 17-18 June 2014 at the Olympia National Hall, London, UK. As EMEA's largest Cloud event it boasts the industry's most comprehensive agenda, with more than 240 speakers participating from 74 countries. The latest advances in Cloud and IT technologies will be unveiled, and the show features the only agenda in the industry led by Cloud end-users, including large and medium-sized enterprises, public sector organisations, online players, regulators, telcos and analysts. 2014's event will explore the latest in the field of SaaS, channel, virtualisation, security, OpenStack, Big Data and much more. It will also see the launch of a dedicated theatre on Enterprise Mobility and BYOx. Cloud World Forum 2014 has already received unprecedented industry support from the leading Cloud solution providers, including HP, Microsoft, Intel, Salesforce, Google, IBM, Rackspace and AWS. The event will be co-located with Big Data World Congress and Enterprise Apps World, and will also benefit from the introduction of a more engaging, interactive programme, including innovative learning formats such as hands-on labs, live demos and brainstorming sessions throughout all theatres and co-located events. To view the full event programme and register for Cloud World Forum, please visit www.cloudwf.com or call +44 (0)207 017 5506.
Alternatively, keep up to date with the event on Twitter @CloudWSeries or using #CloudWF. ENDS

About Cloud World Forum

Cloud World Forum is EMEA's leading Cloud event with the industry's most comprehensive agenda of all things Cloud. Over 8,000 delegates come from more than 70 countries around the world to meet the industry's leading solution providers. Now celebrating its sixth year, the show gathers the pivotal players of the Cloud revolution and features 12 theatres led by Cloud end-users. More than 300 speakers from multinationals, SMEs, public sector organisations, online players, regulators, telcos and analysts are set to take the floor in engaging, thought-provoking keynotes, hands-on labs, brainstorming sessions and live demos over two days.

About Informa Telecoms & Media

Informa Telecoms and Media (www.informatandm.com) organises more than 120 global annual events, attended by more than 70,000 executives worldwide. With a focus on quality content, Informa Telecoms and Media delivers a key audience of decision-makers from the mobile, fixed, alternative, wholesale, MVNO, broadband and satellite operator communities. Informa Telecoms and Media is also the leading provider of business intelligence and strategic services to the global telecoms and media markets. Driven by constant first-hand contact with the industry, its 90 analysts and researchers produce a range of intelligence services including news and analytical products, in-depth market reports and datasets focussed on technology, strategy and content. For media information please contact Dana Corson at dana.corson@proactive-pr.com, Holly Tyrrell at holly.tyrrell@proactive-pr.com or Matthew Dunkling at matthew.dunkling@proactive-pr.com. Alternatively call on +44 1636 812152.

[1] McKinsey Global Institute, May 2013: Disruptive technologies: Advances that will transform life, business, and the global economy
[2] Ovum, 2014 Trends to Watch: Cloud Computing, February 2014: http://ovum.com/research/2014-trends-to-watch-cloud-computing/

### Pulsant works with IntelligentComms to streamline communications

Pulsant works with IntelligentComms to streamline its internal communications. Case study: Cloud communications gets less obvious press than cloud computing (processing / storage); however, let's not forget that Skype kicked off a revolution in personal communications which the business world has taken a long while to replicate and build on. Microsoft's Lync (formerly Microsoft Office Communicator) promises seamless convergence of enterprise-grade voice, video and messaging - however, a successful deployment depends on how well it's implemented and then how effectively it's used. What's interesting in this case study is how Pulsant have been able to deploy a solution with control of every aspect - hosting, network, software, QoS, SLA - and (importantly) have provided ongoing end-user training. IntelligentComms supplies bespoke, flexible and modular telecoms expense management (TEM) solutions to a broad range of industries, aimed at improving efficiency while minimising cost and risk. Therefore, as experts in the field of telecoms, when it came to selecting a new internal communications solution that would meet the changing requirements of its own workforce, the company knew exactly what it was looking for.
Its previous communications solution combined services from multiple vendors and one web-based product in particular was proving to be exceedingly costly due to the average length of conference calls and the amount of bandwidth these consumed. Further, the company was using Skype for voice calls and instant messaging, which was efficient but lacked a business-class service level agreement (SLA). As a result, no audit function was provided opening up the organisation to potential risk. IntelligentComms sought a solution that covered the requirements for voice, conference calls, video calls, instant messaging and desktop sharing. Pulsant had an established relationship with the telecoms company and was able to provide an informed assessment of the company's needs. Dietmar Wand, client manager, Pulsant, explained: "We already provided IntelligentComms with a managed WAN, web hosting and colocation services – so we had an excellent understanding of their organisational and business requirements. Since the company advises other businesses on their telecoms needs, the IntelligentComms team was able to provide a very clear brief on their own specific requirements." Pulsant recommended an organisation-wide consolidation of communications tools using Microsoft Lync as the main, and only, communications system. Wand continued: “The solution matched their requirements and we developed a cost matrix that demonstrated Microsoft Lync would indeed be more efficient and provide a better service at a lesser cost than the current mix of products and services.” The strength of the platform was demonstrated through a two week trial of the service throughout the organisation. Following the successful pilot the system was implemented over the course of a month, which included extensive training for all IntelligentComms staff. Stuart Bingham, infrastructure manager, IntelligentComms, outlined why the solution Pulsant put forward was selected: “We have a small inbound call centre that answers calls from our clients. We have SLAs with our clients in that we have to answer and respond to calls within a certain period, so it was vital that the platform be robust, reliable and always available, with absolutely no downtime during working hours. This is why we decided to go for a managed service from Pulsant. “We now rely on Lync for our entire telephony requirements. This includes enterprise voice for inbound and outbound voice as each user has their own direct dial in. All of our staff members are also equipped to work from home, if needed, and as such can now rely on Lync for instant messaging, as well as for voice calls to office-based staff.” All of IntelligentComms' key staff members have the mobile Lync client installed on company-issued iPhones in order to ensure they are connected even when they are not in the office. This gives users the simultaneous functionality of an office handset and a mobile device ensuring that they are always available to customers and colleagues regardless of location. The fully managed end to end unified communications solution brings added benefit to customer interactions. According to Bingham, the company regularly runs Lync sharing sessions with clients to run demonstrations. “Being able to connect our clients into a voice bridge and show them exactly what to do on the screen is especially helpful,” he said. 
"As more and more of our larger enterprise-type clients in industries such as business process outsourcing (BPO), retail and services are migrating to Lync, we have federated with them so that multiple parties are able to easily interact with one another as if they are using the same platform. This extends to voice, IM, screen sharing and presence. This recent addition has been a boon to both parties as we are now able to share information far more easily and speed up communications. Any calls to federated clients are VoIP based and free of charge, so there are also cost savings to be realised," said Bingham. ### Social media to be cloud growth catalyst Chris Morris takes a look at how cloud computing has been a primary instigator and enabler of rapid growth of social media startups. The cloud sector has certainly grown significantly since we first heard about this once obscure technology. But from its humble beginnings where the average person wondered how this apparently unimaginable technology would work, the cloud has now pretty much established itself as a mainstream technology. Many people are now beginning to use the cloud on a pretty regular basis, and thus we now see numerous major corporations vying for supremacy. One area where the cloud has already begun to make serious inroads into the everyday consciousness of the general public is with regard to social media. It is perhaps appropriate that the cloud has had such a significant influence on this modish form of communication, and no-one can doubt the extent of that influence. Amazon Web Services, the IaaS cloud leader of course, has been the power behind the rise of Instagram from a small start-up with just thirteen employees to one of the world’s biggest websites, recipient of a billion dollar buyout from Facebook. The cloud has also been involved in the rise of Pinterest, which has grown exponentially from 50,000 users to 17 million in a mere nine months; a feat that would have been nigh on impossible without the existence of the cloud. Then of course there's Facebook's WhatsApp with 450 million 'active' users - also built in the cloud. The latest flavour of the month in the social media market is also benefiting from the cloud. Snapchat has already turned down a $4 billion bid from Google according to popular rumours (and previously a $3 billion offer from Facebook), and with the site currently gaining a huge amount of cachet among the general public, this is just the latest success story in the social media sphere which is cloud-powered. The brains behind Snapchat have spoken about how tapping into the new technology offered a fantastic opportunity for them, and one which was perfect for them when they were starting out with very little cash. Cloud computing enables a business to scale upwards very rapidly without making a massive investment in infrastructure beforehand, meaning that start-ups can rapidly gain value and prominence and not be hamstrung by intimidating costs. By reducing the need for funding in the early stages of an enterprise, the cloud saves entrepreneurs a considerable amount of time, and also gives more potential for businesses to experiment rather than feeling constrained by practical considerations. The aforementioned Snapchat was a classic example of this. When it was launched in September, 2011, the people behind the social media site had no idea of its apparent potential, although obviously they had high hopes for it. 
But the early days of the site gave little indication of the huge success ahead of it. In January, 2012, Snapchat had a mere 3,000 users each month. It is even conceivable that the owners could have knocked the whole idea on the head there and then if they had expensive overheads to contend with. Instead, efficiency savings bought them a little breathing space, and the site has since taken off massively. Cloud computing has been taken up by many established companies, but it has also meant that the cost of starting a technology company is lower than it has ever been previously. With the cloud still in its relative infancy, it seems certain that the technology will have a profound impact on business and tech sites in the future. Analysts believe that the next generation of cloud services will move beyond the admittedly useful scalability of existing cloud provisions and enable start-ups to acquire building blocks of the products themselves. This will mean that entrepreneurship will become an even more broadly available proposition; the cloud is truly a democratising technology in this respect. When we see open APIs and platforms-as-a-service it will be possible for technology sites to create products and services even more quickly. This will impact upon every conceivable area of business, but one can expect social media to be high on the list.

### Intermedia increases advisor commissions

Intermedia bucks the industry trend by upping Advisor commissions by 33%. In an era of shrinking margins for the IT channel, Intermedia continues to make the cloud much more profitable for MSPs and VARs in the EMEA region. Intermedia is the world's largest one-stop shop for cloud IT services and business applications, including hosted Exchange, Lync, Sharepoint, Email Archiving and other services. Its industry-leading Partner Programme allows MSPs and VARs to choose on a customer-by-customer basis between Intermedia's Private Label model (in which the partner manages pricing, bundling, billing, branding and every other aspect of the customer relationship) and its Advisor model (in which the partner or agent offloads support and billing to Intermedia). As of 12th March, Intermedia Advisors can earn 33 per cent more monthly recurring revenue from every new customer account: Intermedia's revised Advisor commission schedule for new customer accounts has increased from 6 per cent to 8 per cent of monthly recurring revenue for data services such as Exchange, Lync and SharePoint. In addition, unlike many other providers, Intermedia's Partner Programme guarantees commissions for the life of the account. "We are committed to the success of our partners and our programme offers flexibility in the way partners can engage with us," says Ed Macnair, managing director EMEA at Intermedia. "By offering industry-leading cloud services and by increasing the commissions for our advisor partners we are giving our partners a real edge in the market, which further underlines our commitment to expanding our EMEA business." While Intermedia's Advisor programme provides partners or agents with very attractive commissions, its Private Label programme offers larger margins by allowing partners to own and manage more elements of their customer relationships.
Private Label partners have the potential to achieve higher revenue and profit as Intermedia partners because they can:

- Sell under their own brand
- Build custom bundles that seamlessly cross-sell and upsell their other services
- Integrate with their existing Salesforce and ConnectWise systems
- Streamline their internal processes by managing multiple customers and services via Intermedia's Partner Portal
- Rely on Intermedia for marketing support, sales enablement, onboarding and migration, and 24/7 phone and chat support

To find out more about the Intermedia EMEA Partner Programme, click here.

About Intermedia

Intermedia is the world's largest one-stop shop for cloud IT services and business applications. Its Office in the Cloud™ delivers the essential services that SMBs need to do business – including hosted Exchange, Lync, Sharepoint, Email Archiving, security, mobility and more. All of Intermedia's services are pre-integrated for a seamless user experience. They are all managed via Intermedia's HostPilot® control panel, with just one login, one password, one bill and one source of support – creating tremendous cross-service efficiencies for both users and IT administrators. In addition, they all offer enterprise-grade security, 99.999% availability and 24/7 customer care to assure a worry-free experience. Intermedia was the first company to offer hosted Microsoft Exchange and, with 100,000 customers and more than 1,000,000 users, is today the world's largest independent provider of hosted Exchange. Intermedia also enables channel partners – including VARs, MSPs, telcos and cable companies – to sell cloud services under their own brand, with full control over billing, pricing and every other element of their customer relationships. Intermedia's 600 employees in 3 countries manage 10 data centres to power its Office in the Cloud – and to assure its famous worry-free experience. Learn more at Intermedia.co.uk.

### The cloud community faces industrial scale challenges

No one group has all the expertise needed to create a perfect cloud environment – it needs expertise in application behaviour, data centre design, network and machine virtualisation, wide area networking and so much more. Getting this all working together calls for collaboration, says James Walker, President of the CloudEthernet Forum. With the arrival of the smartphone, a new type of thin client appeared that seemed to hold the whole world in its Internet grasp. People did not have to shift perspective and embrace the SaaS model; they just found they were already using it, and the word of this new aeon was "cloud". The result has been a surge in cloud uptake that took even its strongest advocates by surprise. The total worldwide market for cloud infrastructure and services is expected to grow to $206B in 2016. The signs are everywhere, as massive new data centres spring up in the coldest places: Dell'Oro Group predicts that within five years more than 75% of Ethernet ports will be sold into data centres, with similar predictions for compute and storage gear from Gartner and Forrester. With the cloud set to be the hub for most business investments well into the next decade, the industry will soon be facing a much steeper sales incline – and this is just when it can least afford to slip. If the cloud fails now, it could send the whole market tumbling back down the slope.
The bad news is that the cracks in the cloud structure have already started to show. The good news is that this has been recognised in time, and the industry has launched the CloudEthernet Forum, which is rallying members to tackle fundamental issues and ensure a reliable, scalable and secure cloud for the coming generation. A more detailed analysis of the challenges, and suggested steps to their resolution, is available in the CEF white paper The Benefits and Challenges Ahead for Cloud Operators. For the full report simply visit: http://www.cloudethernet.org/resources/download-whitepaper-request/

### What's your cloud battleground?

Tony Lucas of Flexiant advises service providers to differentiate in order to compete on the cloud battlefield on their terms. Until now, cloud washing has been the talk of the town. But that's all changed. The public cloud market is growing, and to compete in the market you must enter the cloud battleground. And the lines have been drawn. Some service providers are commoditising the market, while others fall foul of the giants. So, what about you? If you're entering the cloud battle, why not win it? Flexiant has put together a short, playful video on what's happening on the cloud battleground - maybe you'll be inspired to win the cloud war! Ahead in the Clouds - an inspired title - is a great resource for everything cloud, and this video speaks to exactly that message for the service provider: you need to get ahead in the cloud! With all this in mind, we thought we'd take a step back and provide a quick question and answer session on Flexiant so you can assess how cloud orchestration solutions can help you win the cloud war.

What is Flexiant all about?

Flexiant's heritage in the service provider and cloud industry stretches back to 1997. Originally developed for its own customers, Flexiant's technology was designed specifically with the needs of service providers in mind and enables them to build and sell cloud services.

Why the service provider?

We all know the public cloud market is growing substantially. The switch to public cloud is happening with increasing momentum. In fact, spending on public IT cloud services will grow five times faster than that for the IT industry as a whole from 2013 to 2017, according to a new report from IDC. With this in mind, we cater to the service provider because that's where the cloud is moving. Service providers need a solution that's going to let them be more than a commodity and instead become an external source of cloud expertise, with all the infrastructure and application solutions available in one place.

What is cloud orchestration?

The cloud actually exists – there is compute, storage, network, cables, racks, and all of that needs to be managed. There are also applications that need to be deployed on this hardware. Cloud orchestration lets you manage all that from one single platform. To put it another way, it is the ability to instantly manage, automate and streamline all elements of a cloud platform, including the management of physical and virtual resources.

How did you develop your cloud orchestration solution?

Tony Lucas, our founder, faced a number of challenges and saw a lot of change during his time at the helm of the hosting company XCalibre, which specialised in solving some of the world's most challenging web hosting problems.
However, as new technologies emerged and old problems were solved, the two problems that always remained were flexibility and scalability. He and his team put together a cloud orchestration solution that solved the business problems we faced when we were a hosting company. Effectively, we wanted to maximize our internal resources while delivering against customer needs. With that in mind, we realised we had a solution that really did solve a business problem for host providers, service providers and telcos. It helps these businesses overcome the technical challenges of building a truly scalable IT infrastructure while delivering on cloud computing's promises of on-demand, self-service provisioning of computing services with granular metering and billing for customers. It is here that Flexiant Cloud Orchestrator, built on the foundation of Lucas' hosting company's years of practical development, comes in. With cloud orchestration in place, service providers gain an end-to-end Infrastructure-as-a-Service (IaaS) platform, fully loaded with over a decade's experience and problem solving built in.

Why is your cloud orchestration solution different?

Because Flexiant was built to solve an existing hosting company's problems, our heritage means we've kept the business requirements in mind in every new version. Our software lets you orchestrate your infrastructure and applications, but it also lets you differentiate against competitors. You can meter and bill based on your business needs. You can expand to new geographies with the correct language. You can deploy pre-configured applications based on the verticals or industries in which you specialise. You can also expand your business through resellers by supporting complex sales situations. The cloud orchestration layer handles the intricacies of all this, and of course more.

What is the cloud battleground?

We believe there is a battle out there to win the cloud war. You have private clouds, hybrid clouds, public clouds and vendors all supporting these. You have customers that want to protect the empire and others that want all the benefits of public cloud. You have industry giants like Amazon with huge stakes in the market, while other giants are quickly trying to catch up. How is it then that you are going to differentiate and win business? That's the cloud battleground.

So then, how exactly do you start to win?

Service providers, telcos and hosters need to be armed with a platform that helps turn innovative ideas into revenue-generating services quickly and easily. We know from our customers that there is a lot of creativity in the market. But when technology doesn't support this innovation, you might miss a market opportunity. Worse yet, a competitor might have a similar idea and get to market fast. We believe you need to lead the market through innovation, and to do this you need a solution that can be customized, easily, to adapt to the services you want to deliver to your customers. You also need to compete more effectively by developing your own unique services to differentiate, and here again you need the ability to customize your platform. Lastly, you need to generate more revenue and growth.
Speed to revenue, and revenue from new streams such as resellers, becomes essential.

Customization and extensibility sound good to get you to market and make more money, but can that really make a difference?

To illustrate how important it is to differentiate based on customization, extensibility and an ecosystem, you needn't look further than Apple vs. Nokia. Nokia was the world's largest vendor of mobile phones from 1998 to 2012. However, over the past five years its market share declined as a result of the growing use of touchscreen smartphones from other vendors—principally the Apple iPhone. Whereas Apple created, first and foremost, a device that can be summed up as 'cool', they also did something genius. They created a product that could be customized to end users' requirements by making it easy to create apps for it. As a result, they created an ecosystem of developers of all sorts building apps that end users wanted. They extended the use of the iPhone to others, and as of October 2013 the App Store has more than 475,000 native apps by Apple and third parties. Consider the difference in share prices. Apple stock increased 332.47% from the day Apple launched the first iPhone up until today, whereas Nokia has dropped 74%. Put another way, if you owned $100 of stock in both companies on the day Apple launched the iPhone, your Apple stock would now be worth over 13 times your Nokia stock. This illustrates the difference between the two companies in creating business value, and also in enabling easy integration with third parties. When you do this, you allow customization by the end user, and also create an ecosystem of partners primed to make money themselves, all while increasing revenue for you. Applied to the cloud service market the same holds true. If you can offer a differentiated cloud service that allows you and your end users to customize it to meet their needs, plus create an eco-system of technologies that integrate, you'll be primed to win the cloud war. Back to Apple vs. Nokia – one has thrived, while the other didn't survive on its own.

How will Flexiant Cloud Orchestrator allow you to do the same?

Our cloud orchestration solution supports an individual business but also extends to reselling cloud services, and includes:

- Internationalization, so it is easy to on-board customers based on their geography and billing requirements.
- Granular multi-level metering and billing that supports complex billing scenarios and multi-level master billing entities and customers, and applies differing rates of billing and pricing structures for different entities. This should include subscription billing capabilities so you can bill according to your business.
- Payment provider plug-ins to interface with payment provider systems, allowing different plug-ins to be created for handling merchant transactions.
- An intuitive and customizable user interface, so end users only see relevant information, resources in use and capabilities available, which will help to reduce the workload on the IT team. It includes brandable and whitelabelled context-sensitive help to change the languages, descriptions and terminology used; important when you want to differentiate services.
- Chef configuration management for single-click deployment of complex applications involving multiple servers, networks and resources, all from a friendly user interface.
- Cloud blueprint technology to allow service providers to graphically select all of the elements required to build a standard virtual workload package and save this as a re-usable and re-deployable template.
- Scalability assurance at every layer of the stack and across multiple clusters and locations, so service providers never need to worry about scaling the business.

How is this all customizable?

Our solution is a fully customizable platform to meet a service provider's unique needs. You can customize the user interface (UI), the billing system, and now apply customization based on business logic and additional plugins to enable ecosystem partners. Flexiant has introduced new trigger technology within Flexiant Cloud Orchestrator to allow external systems as well as users to receive notifications or requests when customizable 'trigger' events occur within the platform.

What is a plugin?

One of the key challenges for service providers wishing to differentiate is the choice between using an off-the-shelf solution with limited differentiation capabilities, and building something themselves that can take far longer than expected and require skills not always available in house. We believe there is a third way, which is having a solid and mature core platform, with a series of plugins to then modify that platform to suit you and your customers' requirements. Short and simple plug-ins can be written to apply logic to cloud management. Service providers can easily establish rules that trigger activity such as notifications that an event has occurred, automated provisioning of resources, applying workflows, and so on. A plugin enables you to tightly integrate our platform with other solutions, and/or change the way our platform itself behaves. We have a range of off-the-shelf plugins for integration with third-party technologies to choose from, or you can create a bespoke plugin to meet your individual requirements. This makes it quick and simple for either Flexiant or our customers to develop customizations, with a few clicks through our intuitive UI and in a matter of days. Plugins can also be applied to other systems within a service provider's infrastructure so, for example, using a trigger, you can easily write plug-ins on top of the management platform for back-up infrastructure, billing systems, data center infrastructure and customer management or monitoring.

### Flash Systems: The Future of Data Storage

By Steve Hollingsworth, Director at Applied Technologies. As more and more businesses and individuals migrate into the Cloud, it is predicted that data storage centres will gradually become a thing of the past.
The Cloud believers are waiting for the old churning banks of servers tucked away in companies' basements to shut down and gather dust, as we all start trying to work out how to use Google Docs. I am a non-believer! The Cloud is a fantastic innovation in technology and for some it will prove to be revolutionary in its results. But not everyone wants to migrate wholesale, especially businesses that are used to working effectively outside the Cloud. Migration itself takes time, for example, and if you choose to use Amazon Web Services (AWS) or Google Apps, staff will be required to learn how to use new software, which will take up valuable time and lower productivity. This point is illustrated by the fact that many larger organisations are not yet fully moving into the cloud, choosing instead to use a "hybrid" cloud, with core applications hosted by themselves on agile infrastructures, still requiring data storage centres. The amount of data businesses store and use is growing at a very fast rate, with an estimated 45-fold growth of data storage by 2020. This means the majority who still store data physically will continue to require extra storage capacity. With physical space and power at a premium, flash systems are set to solve these issues. The old style of simply throwing more disks into a data centre in order to increase capacity is not only expensive to fund and hungry for space; in terms of efficiency and energy use it will also prove costly. Replacing out-of-date storage technology with Flash Systems will, on average, reduce both environmental impact and energy costs. For businesses that require high-power processing, Flash Systems represent a quantum leap in performance. Flash is allowing businesses to mine data quickly for immediate business decisions, to increase productivity 'per core' to reduce license costs, and is changing computing in many other ways. Although, once installed, Flash Systems are much cheaper to run, I do appreciate they are expensive to set up. However, when a business chooses to invest in a Flash System, it can be introduced gradually; correctly sized and targeted at hot data, a limited amount of flash can have exponential benefits. Businesses are able to retain their hard disk drives for data retention, and Flash Systems can be prioritised for high-demand data or high-end processes. This invites the introduction of hybrid data storage systems, which combine hard disk drives and Flash Systems to cover a plethora of data storage and operations. Good business requires maximum productivity, which can only really be provided by maximum speed and process management. Flash can deliver this productivity, at cost points that may be unexpected, and can be implemented as part of a Private or Hybrid cloud, dependent on the application requirements. With 20 years' experience in solution design and sales, Steve has worked with IBM Systems from System36/38, through the AS/400 and RS/6000, to the present POWER7 iOS / AIX and the latest storage technologies. Steve is a Director at Covenco, parent company of Applied Technologies.
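The hot/cold split described in the piece above can be made concrete with a toy policy. The sketch below is purely illustrative - it is not from the article and is not any vendor's API - and simply routes frequently accessed blocks to a flash tier while everything else stays on disk; all names are hypothetical.

```python
# Illustrative only: a toy hot/cold tiering policy for a hybrid flash + HDD
# estate. Real arrays use far richer heuristics; all names are hypothetical.
from collections import Counter

class HybridTier:
    def __init__(self, hot_threshold: int = 10):
        self.access_counts = Counter()   # accesses seen per block id
        self.hot_threshold = hot_threshold

    def record_access(self, block_id: str) -> None:
        self.access_counts[block_id] += 1

    def placement(self, block_id: str) -> str:
        # Frequently read blocks earn a place on flash; cold data stays on disk.
        return "flash" if self.access_counts[block_id] >= self.hot_threshold else "hdd"

tier = HybridTier(hot_threshold=3)
for _ in range(5):
    tier.record_access("sales-db-index")
tier.record_access("2009-archive")

print(tier.placement("sales-db-index"))  # flash
print(tier.placement("2009-archive"))    # hdd
```

The design point is the one the article makes: a small, correctly targeted flash tier serves the hot working set, while inexpensive disk continues to hold the long tail of retained data.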
### Forget the www - Merkel has plans for an EUww

Angela Merkel got the full red carpet treatment when she arrived in the UK last week.
Addressing both Houses and dropping in on the Queen was all part of Cameron's strategy to woo her support for his proposals for EU reform. Angela Merkel has already made it clear she can't and won't deliver everything Cameron needs, but she does have her mind on some thorny reforms of her own. The previous week she was visiting French President Francois Hollande to convince him that a European communications network to improve data protection is not only possible but desirable. Whether her conversations with British party leaders today will have covered this same ground is unclear, but the notion does seem to be gathering speed. On Monday 24th February the European Parliament's Civil Liberties, Justice and Home Affairs committee (LIBE) voted to accept Edward Snowden's testimony in relation to its inquiry into mass surveillance. Green spokesperson and member of the LIBE committee, Jan Philipp Albrecht, is urging for Snowden's testimony to take place before the Parliament votes on the LIBE report on 12th March. The report calls for the immediate suspension of the EU/US Safe Harbour agreement, acceleration of the General Data Protection Regulation proposals, reassessment of Canada and New Zealand as safe to receive European data, and an autonomous EU IT capability. The report is unlikely to be accepted, but Snowden's testimony is an important indication of momentum in the debate. US Senators have been vocal about their unhappiness at his being granted an audience. It was a busy week for the protectionists. Last Monday also saw Brazilian President, Dilma Rousseff, announce a $185m joint EU-Brazil project to lay a fibre-optic undersea communications cable across the Atlantic, between Lisbon and Fortaleza, to bypass US routes and "guarantee the neutrality" of the internet. Like Merkel, Dilma has strong feelings about the NSA revelations, postponing a trip to Washington last year in protest at US spying activity. Merkel reportedly told US President, Barack Obama, that NSA activity – which German publication Der Spiegel revealed included listening to her personal mobile phone – "is like the Stasi", East Germany's hated and powerful secret police. It is taking some time for the EU to develop a coherent response to the Snowden revelations, perhaps in part because the issue is complicated by the complicity and cooperation of member states, most notably Britain, in aiding the NSA. Nevertheless, Merkel has been outspoken about her distaste for companies like Facebook and Google basing their operations in countries where data protection is compromised: "Many countries have lower levels of data protection than Germany, and we do not want our privacy laws watered down". Merkel has called for unified European rules but, in the meantime, German courts are busy asserting authority. A German court ruled last week that Facebook must comply with German data protection rules, despite the internet giant originally arguing that its data processing for German users was controlled from Ireland and should be subject to Irish data protection law. Merkel has also asserted that: "One shouldn't have to send emails and other information across the Atlantic. Rather one could build up a communications network inside Europe."
"...we do not want our privacy laws watered down" What this would mean in practice is unclear. It seems incredibly unlikely that the wholesale investment in a protected EU infrastructure is likely or practicable. And commentators including Tim Berners-Lee have long warned against a 'balkanisation' of the internet. A further strengthening of initiatives such as the European Cloud Partnership seems more practicable and more likely. It has already been strengthened by words, if not additional investment. And Neelie Kroes, the European Commissioner for Digital Agenda, is focusing on the development of a seal of quality for data protection within the framework of the European Cloud Partnership. The fact remains, however, that any initiatives remain hard to police. As Edward Snowdon told German state broadcaster, ARD, the creation of a European "cloud" that did not send electronic data to servers on US soil would not protect Europe from US espionage: "If the NSA can pull text messages out of telecommunications networks in China, they can probably manage to get Facebook messages out of Germany." Whatever happens, it seems more likely that the net effect of these machinations and words of attrition will be to create an environment where European cloud-based service providers can flourish. There may yet be a significant marketing advantage in being able to offer European-only data centres and encrypted data services. According to Der Spiegel, there is a joke doing the rounds in the German IT sector that their best marketer these days is Edward Snowdon. Perhaps Angela Merkel is intent on giving him a run for his money. ### Merkel has plans for http://EUww Forget the www - Merkel has plans for an EUww Angela Merkel got the full red carpet treatment when she arrived in the UK last week.  Addressing both Houses and dropping in on the Queen was all part of Cameron's strategy to woo her support for his proposals for EU reform.  Angela Merkel has already made it clear she can't and won't deliver everything Cameron needs, but she does have her mind on some thorny reforms of her own. The previous week she was visiting French President Francois Hollande to convince him that a European communications network to improve data protection is not only possible but desirable.  Whether her conversations with British party leaders today will have covered this same ground is unclear, but the notion does seem to be gathering speed. A European communications network to improve data protection is not only possible but desirable. On Monday 24th February the European Parliament's Civil Liberties, Justice and Home Affairs committee (LIBE) voted to accept Edward Snowdon's testimony in relation to its inquiry into mass surveillance.  Green spokesperson and member of the LIBE committee, Jan Philipp Albrecht, is urging for Snowdon's testimony to take place before the Parliament votes on the LIBE report on 12th March.  The report calls for the immediate suspension of the EU/US Safe Harbour agreement, acceleration of the General Data Protection regulation proposals, reassessment of Canada and New Zealand as safe to receive European data, and an autonomous EU IT capability. The report is unlikely to be accepted, but Snowdon's testimony is an important indication of momentum in the debate.  US Senators have been vocal about their unhappiness about his being granted an audience. It was a busy week for the protectionists.  
According to Der Spiegel, there is a joke doing the rounds in the German IT sector that their best marketer these days is Edward Snowdon.  Perhaps Angela Merkel is intent on giving him a run for his money. ### CenturyLink Cloud delivers Hyperscale service Announces public cloud expansion to Sterling, Va., Silicon Valley, Paris and London CenturyLink, Inc. (NYSE: CTL) has announced the commercial availability of Hyperscale high-performance server instances offered through the CenturyLink Cloud platform. The new service is designed for web-scale workloads, big data and cloud-native applications. CenturyLink Cloud includes dozens of self-service features that help enterprises run their mission-critical and business applications in the public cloud with ease. With Hyperscale, developers can now use the same platform to launch and manage advanced next-generation apps. “New applications are crucial to delivering a competitive advantage for enterprises, and Hyperscale is the ideal service for these workloads,” said Jared Wray, CenturyLink Cloud chief technology officer, CenturyLink Technology Solutions. “CenturyLink continues to bring developers and IT together with this new capability. Developers get self-service and lightning-fast performance for popular NoSQL platforms, and IT can easily use our cloud management platform for governance and billing.” Hyperscale combines CenturyLink Cloud’s high-performance compute with 100-percent flash storage to deliver a superior experience for web-scale architectures built on Couchbase, MongoDB and other NoSQL technologies. Users of the service will consistently see performance at or above 15,000 input/output operations per second (IOPS) for a diverse range of workloads. Hyperscale also is ideal for big data scenarios and serves as a complementary offering to CenturyLink’s existing Big Data Foundation Services. Global Public Cloud Expansion Continues CenturyLink today also announced expansion plans for the CenturyLink Cloud network of public cloud data centers, growing from nine to 13 locations in the first half of 2014. Starting in March, customers will be able to deploy virtual resources in Santa Clara, Calif., and Sterling, Va., with locations in Paris and the London metro market coming online in the second quarter of 2014. All CenturyLink Cloud services, including Hyperscale, will be available in these new locations. This broad footprint offers more choice and flexibility for geo-targeted solutions and for boosting performance while ensuring data sovereignty. Customers can also deploy advanced IT configurations through the interoperability of public cloud, colocation, managed services and network solutions available through the CenturyLink Technology Solutions portfolio.   About CenturyLink Technology Solutions CenturyLink Technology Solutions delivers innovative managed services for global businesses on virtual, dedicated and colocation platforms. For more information, visit www.centurylink.com/technology. About CenturyLink CenturyLink is the third largest telecommunications company in the United States and is recognized as a leader in the network services market by technology industry analyst firms. The company is a global leader in cloud infrastructure and hosted IT solutions for enterprise customers. CenturyLink provides data, voice and managed services in local, national and select international markets through its high-quality advanced fiber optic network and multiple data centers for businesses and consumers. 
The company also offers advanced entertainment services under the CenturyLink® Prism™ TV and DIRECTV brands. Headquartered in Monroe, La., CenturyLink is an S&P 500 company and is included among the Fortune 500 list of America's largest corporations. For more information, visit www.centurylink.com. ### Getting serious about Cloud Claire Buchanan, Senior Vice President, Global Operations at Bridgeworks, identifies data transfer rates as one of the critical components of successful cloud adoption. Sitting in a meeting room the other day, our discussion swirling around Big Data and Cloud, as is probably common in most technology companies, we decided to get real and answer the question: "What is the barrier stopping enterprises adopting the cloud?" We next identified the six pillars, the 'Six Ss', of successful cloud adoption: service, speed, scale, security, sovereignty (of data) and simplicity. We agreed that security could be overcome and that service and sovereignty were very much down to the choice of geography and provider; the real barriers are speed and scale - and then keeping it simple. The barrier stopping enterprises adopting the cloud? [The] real barriers are speed and scale - and then keeping it simple. Clearly speed and scale apply to many disciplines in the technology business. For the purposes of this blog, and as one of the biggest inhibitors, it comes down to the sheer size of the big data challenge and moving data over distance fast, be that from data centre to data centre or from customer to host provider. The enormous challenge of getting that data quickly, efficiently and in a 'performant' manner from source to host is the key. The WAN limitations are easy to identify: the size of the pipe (in this case bigger does not necessarily mean faster), the protocol, and the amount of data that needs to be moved. We have all heard of those projects that ever so nearly made it to the cloud provider but then the time and cost of transferring the data made it far too big to contemplate. Movement of data on tapes is still commonplace as it is viewed as a safer and faster route in a non-cloud world. From small amounts of data - a private individual trying to upload a movie to Dropbox - to a huge corporation performing a US coast-to-coast daily batch process, the problem is extremely real. Even the mighty Amazon allows clients to send in their drives to upload the data to the cloud, such is the problem of transferring data across the WAN – all day, every day. So what do CIOs want? Let's start with the simple stuff: get the cold (legacy) data off the primary storage tier. 80% of that data has no real value in day-to-day business terms, but for regulatory, governance and compliance reasons it has to live somewhere. The challenge therefore is to migrate data out of the production environment and into storage. The cloud offers safe, compliant storage, driving down cost and bringing simplification to corporate environments, but still the very basic challenge remains – moving it. Most large organisations want to offload their legacy data and have someone else manage or archive it for them at a fraction of their in-house cost. Before Cloud we had the philosophy of Write Once Read Many (WORM), but the reality of the cloud is we want to put data that we might not necessarily need, but cannot destroy, somewhere less costly. In this cloud world it is no longer WORM; it is very much write once, read possibly (WORP).
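To put those WAN limitations into numbers, here is a minimal back-of-the-envelope sketch in Python. It is purely illustrative: the link speeds and the 25 per cent effective-utilisation figure are assumptions of mine, not Bridgeworks data, but they show how quickly terabyte-scale transfers become the sticking point and give a yardstick for the transfer times quoted later in this piece.

```python
# Back-of-the-envelope WAN transfer times (illustrative assumptions only).
# Decimal units are used throughout: 1 TB = 8e12 bits. The 25% default
# reflects how little of a long-distance link untuned TCP often achieves.

def transfer_hours(data_tb: float, link_gbps: float, utilisation: float = 0.25) -> float:
    """Hours needed to move data_tb terabytes over a link_gbps link."""
    data_bits = data_tb * 8e12
    effective_bps = link_gbps * 1e9 * utilisation
    return data_bits / effective_bps / 3600

for data_tb in (1, 20, 100):
    for link_gbps in (1, 10):
        untuned = transfer_hours(data_tb, link_gbps)
        line_rate = transfer_hours(data_tb, link_gbps, utilisation=1.0)
        print(f"{data_tb:>4} TB over {link_gbps:>2} Gb/s: "
              f"{untuned:7.1f} h untuned, {line_rate:5.1f} h at full line rate")
```

At full line rate a 10 Gb/s link moves 1 TB in roughly 13 minutes, so the "1 TB in 16.2 minutes" figure quoted below amounts to sustaining a little over 80 per cent of line rate end to end - exactly the gap between headline bandwidth and real-world throughput that this piece is getting at.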
We are not talking warm data here but more often than not cold or even glacial. Ask a simple question, "How many megabytes per second can your WAN Optimisation product get on a 10Gb link?" - that should be fun. Does it all sound simple so far? For small organisations it is, but for medium to large organisations the elephant in the room is how to move terabytes or even petabytes of data across the WAN. WAN Optimisation, I hear you say; that should fix it. Not in a Write Once Read Possibly world (remember WORP?): in order to dedupe, the system needs to learn. I could easily slip down the slippery slope here: these guys don't optimise the WAN, they simply cut down the size of the files you send across the WAN, and this has computational limitations - more data optimisation than WAN optimisation, would you not agree? Ask a simple question, "How many megabytes per second can your WAN Optimisation product get on a 10Gb link?" - that should be fun. What is really needed is not what we know as WAN Optimisation; we need something that moves data at the speed of light, because that is the only physical limitation. Something that can move all data at scale (even encrypted or compressed from source - business wants security and encryption is becoming ever more popular) in an unrestricted manner. Remember simplification: it should be simple to install, say in less than an hour, and not require any network or storage management time. It should just run on its own, the only intervention being to pull off the stats. The news is good, it can be done. You just need to open your minds to the seemingly impossible. Just to whet your appetite, on a 10 Gb link how does 1 GB in 1 second grab you? Better yet, 1 TB in 16.2 minutes? There are corporations live and in production on 20Gbps basking in the simplicity of it all, not to mention the cost, productivity and continuity savings. Just for good measure they can throttle back if they need to limit the percentage of the pipe for other things and still get lightning speed. The barrier to the Cloud for enterprise adoption can be removed. ### Commonwealth War Graves Commission Ends Tape Backup Headache with Cloud-Based Data Recovery CASE STUDY Foreword: Taking a step away from thought leadership into real-life scenarios is critical for a balanced view of the merits and challenges of cloud computing. Here we present an interesting case study produced by back-up software specialists Asigra, for a solution deployed and managed by Techgate Plc for the Commonwealth War Graves Commission (CWGC). Overview The Commonwealth War Graves Commission (CWGC) cares for cemeteries and memorials for Commonwealth servicemen and women who died during both world wars. The organization maintains a vast amount of important digital records and archives that must be protected in the event of data loss caused by human error, system failure or natural disaster. CWGC chose disaster recovery specialist Techgate plc, whose service is powered by Asigra. CWGC was able to significantly reduce the complexity and management overhead of its backup environment while improving overall performance. Customer Overview CWGC ensures that the 1.7 million men and women of the Commonwealth forces who died in the two world wars will never be forgotten. The organization cares for cemeteries and memorials at 23,000 locations across 153 countries. CWGC's values and aims, as laid out in 1917, are as relevant now as they were almost 100 years ago.
The Commission is proud of the work it carries out to commemorate the war dead, from building and maintaining cemeteries and memorials to the preservation of records. The Commission continues to set the highest standards for its work, leveraging preeminent architects of their day, who have included Sir Edwin Lutyens, Sir Herbert Baker and Sir Reginald Blomfield. Because of the important records maintained by the Commission, a highly reliable backup solution is necessary to ensure that the data is always accessible. The organization had been reliant on a tape backup solution by Symantec, but this was a highly distributed technology that required each backup environment to be managed independently around the world. Furthermore, because tapes were stored off-site and required significant manual processes to recall the data from this medium, a full business continuity plan invocation would take far too long - approximately 30 days. Additionally, because of the large amounts of data that needed to be backed up, it was taking longer to protect the information. As a consequence, the Commission was not able to meet its recovery time objectives for data from primary business applications. Business Situation The CWGC maintains and manages more than 20TB of data with more than 4TB of historical casualty records. This data is located across disparate physical locations and continues to grow in size. The Commission's legacy tape backup solution was having difficulties keeping up with the growing volume of data, which was causing issues with backing up the information within a finite backup window and recovering the data in a timely manner. Because the tape system's recovery times were 30 days to restore all non-critical data and at least 24 hours for critical data, it became a major risk for the day-to-day operations of CWGC. They needed to improve their recovery times significantly in order to ensure business continuity for the long term. [quote_box_center]"Because of the importance of the records we keep, data availability is very important to the mission of CWGC," said Will Webster, Head of IT for CWGC. "The Symantec tape backup solution we had in place was no longer able to support this mission and therefore needed to be replaced. During our review process for alternatives, we determined that key criteria should not only include backup and recovery performance but also management simplicity for the sake of operational efficiency."[/quote_box_center] In addition to recovery performance challenges, CWGC was also dealing with the complexities of managing a vast number of backup tapes. Sending these tapes off-site and retrieving them for recoveries was becoming extremely time intensive as data volumes increased. As a result, the Commission sought to change from tape backup to a cloud-based managed backup and recovery service using disaster recovery specialists Techgate. Techgate provides the Asigra Cloud Backup™ service from its highly secure, Tier 3, UK-based data centres, utilizing cutting-edge IBM server and storage infrastructure. "...key criteria should not only include backup and recovery performance but also management simplicity for the sake of operational efficiency." Solution CWGC removed its tape backup systems and deployed the Asigra Cloud-based recovery solution to six of its locations throughout Europe, backing up 80 servers over those locations.
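As a rough sense-check of the scale of the problem the new service had to absorb, the short sketch below works out the sustained throughput a full backup of the Commission's data would demand. The 20TB figure comes from the case study above; the backup-window lengths are illustrative assumptions, not figures from CWGC or Techgate.

```python
# Sustained throughput required to complete a full backup within a given window.
# The 20 TB volume is taken from the case study; the window lengths are
# illustrative assumptions.

def required_mb_per_s(data_tb: float, window_hours: float) -> float:
    """Decimal MB/s that must be sustained to move data_tb terabytes in window_hours."""
    return (data_tb * 1_000_000) / (window_hours * 3600)

DATA_TB = 20
for window_hours in (8, 12, 24, 72):
    print(f"{DATA_TB} TB in a {window_hours:>2}-hour window needs "
          f"{required_mb_per_s(DATA_TB, window_hours):7.1f} MB/s sustained")
```

Incremental backups and deduplication reduce the nightly volume, of course, but a full restore still has to move something close to the whole data set, which helps explain why recalling and replaying tape pushed full recovery out to around 30 days.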
Each of these environments included Windows-based servers as well as Microsoft Hyper-V virtualized servers with multiple databases (including SQL and Exchange) in place across these systems. "We were facing a number of challenges with tape backup that were straining our financial, IT and human resources and saw a real opportunity with Asigra Cloud Backup and Techgate," continued Webster. "From the outset, the solution directly addressed our concerns. It allowed us to centrally manage our backup and recovery processes and eliminate manual tape management tasks. Asigra's Backup Lifecycle Management (BLM) feature allowed us to minimise our data growth issues by prioritizing data to improve recovery performance." Techgate provides the Asigra Cloud Backup™ service from its highly secure, Tier 3, UK-based data centres, utilizing cutting-edge IBM server and storage infrastructure. Results Prior to implementing Techgate's Asigra solution, non-specialist staff at each location were assigned to deal with the management of backup tapes, which took between 15 and 20 minutes per day. The new solution changed all of that by moving CWGC from a backup infrastructure reliant on legacy technology to a modern cloud-based solution. The Commission liked Asigra's agent-less architecture, which eliminated the need to install software on backup targets - something that had historically reduced performance and resulted in compatibility issues. With Asigra, backup operations are automated and users have greater confidence in knowing that their data is, in the event of a disaster, always quickly and easily accessible. "In conjunction with Asigra, we have shown CWGC a clear and modern alternative to tape backup. With initial deployments of the backup service in place, Asigra has proven effective at simplifying backup and recovery processes for the organization. We look forward to supporting additional disaster recovery requirements for the Commission as they benefit from the operational improvements enabled by cloud backup. As with most of our customers, UK data domicile, compliance and security were key system requirements." Chris McDonnell, Solutions Delivery Manager, Cloud Services, Techgate plc. About Asigra Trusted since 1986, Asigra provides organizations around the world the ability to recover their data now from anywhere through a global network of partners who deliver cloud backup and recovery services as public, private and/or hybrid deployments. As the industry's first enterprise agentless cloud-based recovery software to provide data backup and recovery of servers, virtual machines, endpoint devices, databases and applications, and SaaS- and IaaS-based applications, Asigra lowers the total cost of ownership, reduces recovery time objectives, eliminates silos of backup data by providing a single consolidated repository, and provides 100% recovery assurance. Asigra's revolutionary patent-pending Recovery License Model provides organizations with a cost-effective data recovery business model unlike any other offered in the storage market. Asigra has been recognized as a Gartner Cool Vendor and has been included in the Gartner Magic Quadrant for Enterprise Backup and Recovery Software since 2010. More information on Asigra can be found at www.recoveryiseverything.com
### Talk to me straight Paul Bevan, Director of Surefire Excellence, wields his light sabre to cut through any BS he hears from vendors at Cloud Expo Europe this year. You have been warned. The folks at Compare the Cloud have thrown out another gentle challenge to me. With Cloud Expo Europe fast approaching and memories of last year's orgy of feeds and speeds still giving me indigestion, they suggested I needed to get my light sabre out and talk about how to engage those elusive new customers. In this day and age, if you have to resort to being economical with the truth or simply leaving out important facts that don't support your story you shouldn't be in marketing. Customers are faced with business and technology that is constantly changing. They are looking more and more for vendors to deliver actionable insight and become trusted advisors. You can't achieve that unless you are open, honest and knowledgeable. Customers have learned to be as savvy at buying as analysts are at gleaning the whole picture from technology vendors. Good analysts have an ability to smell BS at 20 paces. Their reputation, and income, actually rests on their ability to deliver sound, impartial advice to their customers. Having an open discussion with them that showed you knew your market and your customers, how your product or solution worked and how it solved customer problems didn't necessarily guarantee being in the leadership category of a quadrant or wave, but it did guarantee access and, usually, insights from the analyst that helped in the development and positioning of your product. Customers are looking more and more for vendors to deliver actionable insight and become trusted advisors. Customers are much more like analysts now. Their reputation rests on their ability to deliver value to the businesses that they work for. Sure, with the pressures they are under and the amount of information they need to assimilate, you may occasionally be able to sell them something against the run of play…but you'll only do it once. In any case, that scenario is getting ever less likely. The excellent book, "The Challenger Sale" by Matthew Dixon and Brent Adamson, based on a Corporate Executive Board survey of thousands of sales people globally, highlighted the increasing number of people becoming involved in the process of buying technology solutions, and the need for consensus across a wide range of stakeholders.
Never mind all the other insights from the book, it shows there are simply too many people involved to "fool all the time". Once you have "got in" to a customer by helping them solve a business problem, they will give you the freedom to lay out all your technical goodies, and you will find they will also be more willing to work round minor visible technical shortcomings rather than going through the pain of switching to another vendor. So, if you are manning a stand at Cloud Expo Europe this year and you see me coming... remember my BS detector will be switched to "ON"! ### Entrepreneur delivers more using up to 90% less electricity [quote_box_center]Case Study: "One of the requirements to building a reliable server is to ensure that each of the components that we install into it are themselves reliable. We have found Kingston's DRAMs and SSDs to be very reliable and solid performers. And that's been a key to earning and protecting our reputation for quality, low-power profile systems," says Tony Lees, Managing Director, Avantek Computer Limited[/quote_box_center] The Business challenge Avantek Computer Limited (Avantek) shows that delivering more for less can be achieved. The Leicestershire, UK company is leveraging emerging- and state-of-the-art technologies to bend the performance and power curve in favor of datacenters. From its founding in 1998, the company has grown to become a player in the university computing marketplace and beyond. The flagship products fueling this growth are Avantek's low-energy-profile, high-performance, highly scalable ARM processor servers. A single datacenter can take more power than a medium-sized town. The Avantek solution comes at a time when datacenter operators are keen to cut their energy use. Collectively, datacenters consume about 30 billion watts of electricity. That reality leads Peter Gross, former HP Technology Consulting Managing Partner, to call it "staggering," going on to say "A single datacenter can take more power than a medium-sized town." Most server owners don't operate at that scale. Still, even for server clusters the energy bill to run production servers 24x7x365 adds up like this: Annual Server Energy Bill = (total # of servers) x (average kilowatt draw per server) x (8,760 hours/year) x (price per kilowatt-hour). At $0.11 per kilowatt-hour, that could tally over $1,000 per year per server. "We set out to create server solutions that are disruptive to traditional notions of energy consumption," explains Tony Lees, Managing Director of Avantek Computer Limited in Tur Langton, Leicestershire, U.K. "And we've done just that by architecting servers that significantly drive up the processing-to-power-use ratio for our customers." To achieve that end, Lees took a comprehensive approach to designing his products. Technology Solution Lees based the Avantek ARM server on Calxeda's quad-core 32-bit ECX-1000 ARM processor. "That gives it powerful processing capabilities along with a low power profile. Plus, it's scalable, up to 4096 nodes." It's no coincidence that Avantek's solutions offer these specifications. Servers that push the performance-per-watt envelope are a subject of intense interest in the U.K. right now.
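The energy bill formula quoted above translates directly into a few lines of code. In the sketch below the $0.11 per kilowatt-hour rate is the one cited in the article; the average per-server draws are illustrative assumptions used only to show the arithmetic.

```python
# Worked version of the annual server energy bill formula quoted above.
# The $0.11/kWh rate comes from the article; the per-server draws are
# illustrative assumptions.

HOURS_PER_YEAR = 8_760

def annual_energy_cost(num_servers: int, avg_kw_per_server: float,
                       usd_per_kwh: float = 0.11) -> float:
    """Annual electricity cost in USD for servers running 24x7x365."""
    return num_servers * avg_kw_per_server * HOURS_PER_YEAR * usd_per_kwh

# A single server averaging just over 1 kW lands close to the article's
# "over $1,000 per year per server" figure.
print(f"1 server    @ 1.05 kW: ${annual_energy_cost(1, 1.05):,.0f} per year")
print(f"100 servers @ 0.75 kW: ${annual_energy_cost(100, 0.75):,.0f} per year")
```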
And that’s one reason why Avantek’s 3U ARM server—sipping a teacup five watts per node—has gained a foothold with customers. We standardized on SSDs because they use a lot less power than spinning disks. And we use low-power DRAM. "In order to deliver a low-power server you have to use energy-conserving components," explains Lees. "We standardized on SSDs because they use a lot less power than spinning disks. And we use low-power DRAM and high-efficiency power supplies as well." Lees’ design answered the what-components-to-use question. To solve the whose-components-to-use problem, his team relied upon past experience and real-world testing. "We ran stringent benchmark tests on SSDs and DRAMs from a number of suppliers," recalls Lees. "And we found the Kingston solutions to be performers." That’s consistent with benchmark test results from a third party, which placed Kingston first (least power used) among a field of eight SSD contestants. Still, excellent performance without enterprise-class reliability would not help Avantek sell production environment systems. "To field an unreliable server, based on cheap components, would irreparably harm our reputation and our business," says Lees. "So in addition to Kingston SSDs, we also chose to use their DRAM. It’s been extremely reliable over the years we’ve used them. And we felt that past experience was a good predictor of future performance and it has proven to be so." Today, a 48-node Avantek 3U Arm server cluster comes equipped with 192 Cores, up to 48 Kingston SSDs and 192GB of Kingston DRAM. All together, the energy-conserving server configuration draws a bit less than 240 watts of power. Business results The Kingston-equipped Avantek ARM servers have delivered a number of benefits to the company’s customers. Significantly increases server processing-to-power-use ratio The Avantek conservation-optimized servers give data centre executives a viable option to dramatically reduce their cluster-computing electricity costs. One of the greatest advantages of our ARM server solutions is that they consume up to 90 percent less power. "One of the greatest advantages of our ARM server solutions is that they consume up to 90 percent less power to run them than an equivalent server from another supplier," says Lees. "And because less power is required to drive them, our customers can also save up to 50 percent of the power they use to cool legacy servers." The company’s lab tests have translated to customer production environments. "One of our customers is using our systems to run complex human genome computations," says Lees.  "They tell us that they've seen massive performance-per-watt improvements in the comparative tests they've run against legacy systems." Allows space- and power-limited facilities to add more computing Some of Avantek’s customers have maxed out their server capacity. "One of our largest university customers can’t import any more electricity because they are housed in listed buildings ," recalls Lees. "So the only way they can add more computing power is to replace older systems with our energy-conscious ARM servers." This solution approach has the added benefit of lowering the load upon existing cooling systems as well, to liberate more power to be applied elsewhere in the data center. Performance and reliability earns customers By utilizing Kingston SSDs and DRAMs with high MTBF ratings, Avantek servers have earned a reputation for reliability. 
"Our university customers use our servers to run everything from a 20,000-user e-mail system to complex human genome project computations," explains Lees. "And their track record has been rock solid." Summary: Kingston-equipped (SSDs, DRAM) ARM servers draw up to 90 percent less electricity and 50 percent less cooling. Lowers server TCO. Supports customer green initiatives.Reliability of SSD and DRAM solutions contributes to server uptime. Meets customer availability requirements for production environment servers. Avantek servers have been "rock solid" in performance in university production environments — from 20,000-user e-mail systems to human genome research projects." Kingston Technology will be at Cloud Expo Europe 2014 from 26-27th February 2014. Visit Stand 504, further details can be found here. ### UK Entrepreneur delivers more server computations using up to 90 percent less electricity [quote_box_center] CASE STUDY "One of the requirements to building a reliable server is to ensure that each of the components that we install into it are themselves reliable. We have found Kingston's DRAMs and SSDs to be very reliable and solid performers. And that's been a key to earning and protecting our reputation for quality, low-power profile systems."   [/quote_box_center] Tony Lees, Managing Director, Avantek Computer Limited The Business challenge Avantek Computer Limited (Avantek) shows that delivering more for less can be achieved. The Leicestershire, U.K. company is leveraging emerging- and state-of-the-art technologies to bend the performance and power curve in favor of datacenters. From its founding in 1998, the company has grown to become a player in the university computing marketplace and beyond. The flagship product fueling this growth is Avantek's low-energy profile, high-performance, highly-scalable ARM processor servers. A single datacenter can take more power than a medium-sized town. The Avantek solution comes at a time when datacenter operators are keen to cut their energy use. Collectively, datacenters consume about 30 billion watts of electricity per year1. That reality, leads Peter Gross, former HP Technology Consulting Managing Partner to call it “staggering,” going on to say “A single datacenter can take more power than a medium-sized town.” Most server owners don't operate at that scale. Still, even for server clusters the energy bill to run production servers 24x7x365 adds up like this: Annual Server Energy Bill = (total # of servers) x (average kilowatt/hr. draw per server) x (8,760 hours/year). At $0.11 per kilowatt hour, that could tally over $1,000 per year per server . “We set out to create server solutions that are disruptive to traditional notions of energy consumption,” explains Tony Lees, Managing Director of Avantek Computer Limited in Tur Langton, Leicestershire, U.K. “And we've done just that by architecting servers that significantly drive up the processing-to-power-use ratio for our customers.” To achieve that end, Lees took a comprehensive approach to designing his products. Technology Solution Lees based the Avantek ARM server on Calxeda's quad-core 32-bit ECX-1000 ARM processor. “That gives it powerful processing capabilities along with a low power profile. Plus, it's scalable, up to 4096 nodes.” It's no coincidence that Avantek's solutions offer these specifications. Servers that push the performance per-watt envelope are a subject of intense interest in the U.K. right now. 
### Intermedia sets sights on UK growth World's largest one-stop cloud provider Intermedia sets sights on UK growth Intermedia, the world's largest one-stop shop for cloud IT services and business applications, including hosted Microsoft Exchange, archiving and backup, collaboration and file management, has announced an aggressive growth strategy for EMEA, with a major focus on channel recruitment. The announcement follows closely on the company's acquisition of UK security vendor, SaaSID, which bolsters Intermedia's expanding portfolio with identity and access management tools that provide SMBs with a single point of control over all their cloud services, regardless of provider. Intermedia believes value added resellers and managed service providers are vital for driving business growth in the region. The EMEA channel recruitment campaign will initially focus on the UK market, with plans to rapidly expand Intermedia's partner network within the broader region. Ed Macnair, Managing Director EMEA at Intermedia, says: "This is an exciting time for the channel. The Cloud Industry Forum reports that nearly half of all UK businesses are already using cloud services and eighty per cent of those plan to increase cloud service consumption in 2014. At the same time, Intermedia is announcing industry-leading cloud services that will give partners a real edge. Having worked closely with the channel in my previous roles, I am looking forward to reconnecting with the well-known names and developing fresh partnerships in this region. We are committed to expanding our business in EMEA and have a target to grow our regional business by three hundred per cent this year." Intermedia allows channel partners to sell under their own brand, set their own pricing and fully own their customer relationship.
Partners who join the Partner Programme will gain access to a number of benefits,  including a comprehensive cloud service portfolio which can be white-labelled, recurring service revenues, special product promotions, 24/7 technical support and a proprietary Partner Portal, for easily managing and provisioning customer accounts. Channel partners can choose to either resell Intermedia’s services under its Private Label model, or receive commission for recommending Intermedia under its Advisor model. Intermedia’s partner recruitment drive comes hot on the heels of new channel-friendly offerings that include standalone Email Archivingand McAfee Email Defense, adding to Intermedia’s existing services for hosted Microsoft Exchange, mobility, archiving and backup, collaboration and file management. Interested parties can join the Intermedia partner programme immediately at www.intermedia.co.uk/partners About Intermedia Intermedia is the world’s largest one-stop shop for cloud IT services and business applications. Its Office in the Cloud™ delivers the essential services that SMBs need to do business – including hosted Exchange, Lync, Sharepoint, email archiving, security, mobility and more. All of Intermedia’s services are pre-integrated for a seamless user experience. They are all managed via Intermedia’s HostPilot® control panel, with just one login, one password, one bill and one source of support – creating tremendous cross-service efficiencies for both users and IT administrators. In addition, they all offer enterprise-grade security, 99.999% availability and 24/7 customer care to assure a worry-free experience. Intermedia was the first company to offer hosted Microsoft Exchange and, with 100,000 customers and more than 1,000,000 users, is today the world’s largest independent provider of hosted Exchange. Intermedia also enables channel partners – including VARs, MSPs, telcos and cable companies – to sell cloud services under their own brand, with full control over billing, pricing and every other element of their customer relationships. Intermedia’s 600 employees in 3 countries manage 10 data centres to power its Office in the Cloud – and to assure its famous worry-free experience. Learn more at Intermedia.co.uk. ### Agile Elephant goes Enterprise 2.0 in Paris David Terrar of D2C, Agile Elephant & EuroCloud UK takes time out from Enterprise 2.0 to explain what social business is and why its increasingly important for your business. As I'm writing this the Agile Elephant team are sitting on Eurostar heading back from Paris. We attended Kongress Media's Enterprise 2.0 Summit in Paris on Tuesday and Wednesday - I was speaking on a panel on strategic engagement and running a workshop session on project management and governance. This will be the first of two posts here. An introduction to the topic and the event, and then a conference report. We spent the two days talking enterprise 2.0, social business and open business... OK, what's that all about then, and why should you be interested? Let me start by explaining a little of the background. Where to start? Back in May 2006, Andrew McAfee of the Harvard Business School started the wider use of the term Enterprise 2.0 as a kind of business oriented evolution of the web 2.0 term that was around at the time. 
He defined it as: "Enterprise 2.0 is the use of emergent social software platforms within companies, or between companies and their partners or customers.” At that stage, the emergent tools were blogs, wikis, forums, document sharing, RSS feeds, microblogging and activity streams. So the term (Enterprise 2.0) has been around for over 8 years, but during this current decade people have started to use the terms social business and social enterprise instead. So the term has been around for over 8 years, but during this current decade the concept has evolved, and people have started to use the terms social business and social enterprise instead. This is problematic as the term social enterprise had already been coined by Professor Muhammad Yunus to mean a business with a social rather than financial purpose. That didn't stop Salesforce, in 2011, branding their major customer and partner events with "Welcome to the Social Enterprise" and even trying (and failing) to trademark the term. Their definition of a Social Enterprise was one where social tools like Salesforce Chatter are used to connect and collaborate in new ways inside as well as outside of the organisation. These social tools, and there are many of them, can provide a very different platform for teamwork compared to sending files by email, which is the default collaboration approach in most organisations, albeit occasionally modified by having some sort of shared drive or intranet as the file repository. By 2012 Salesforce had dropped the term, but their shows declared "Business is Social". We've also used terms like knowledge management, corporate social networking, social collaboration, or social media in business. Social Business should not be confused with the term Social Media, although it uses social media channels. Social Media incorporates social networking, blogging, microblogging, forums, user generated content, crowd sourcing, RSS feeds and more. All of those communication channels might be used in a Social Business approach, but it will involve other social collaboration tools along with a major change of mind-set and culture for the organisation. A culture of openness, sharing and collaboration that goes hand in hand with what Forrester and others call digital disruption. This openness is the antithesis of the old, corporate, command and control hierarchy where knowledge was power, and you were motivated to hang on to information, a valuable currency to keep private for your own use.  That world is slowly changing. This openness is the antithesis of the old, corporate, command and control hierarchy where knowledge was power. Enterprise 2.0, Social Business - part of our current problem is that neither of those terms work very well, but the actual concept they are trying to describe can add real value to the bottom line in any organisation.  Social tools give your organisation the ability to connect your teams so they can collaborate together, with partners and with customers.  They allow you to share knowledge in new and easier ways.  When they are properly connected to your core business processes, they can add real value – improved sales, better customer retention, faster time to market – benefits with hard numbers against them.  Every organisation should be considering a move to Social Business – if you don’t do it you’re competitors certainly will. 
The Summit had some great speakers - Dion Hinchcliffe from Dachis, Rachel Happe of the Community Roundtable, Dan Pontefract of Telus, John Mell of IBM, Emanuele Quintarelli from Ernst & Young, Bertrand Duperrin of NextModernity, Lee Bryant of Postshift, and Luis Suarez just starting his journey having left IBM only days ago. It was a packed agenda covering: success factors for social workplace adoption; key drivers for leveraging social value generation & business transformation; best practices for enhancing business performance and employee engagement; and visions for future work & process organization. Every organisation should be considering a move to Social Business – if you don't do it your competitors certainly will. The event is sponsored by IBM (who have the Connections platform), SAP (with their Jam platform), Jive and a number of other players - Sitrion, Bluekiwi, Xwiki, NextModernity, Lecko. There might be over 100 social business platforms on the market, and some of them are very good, but the players you'll come across most often with the larger customers or greater number of implementations are IBM, SAP, Jive and Yammer from Microsoft. It was great meeting our friends from the USA and Canada, as well as meeting all of the key European social business practitioners in one place and learning from some great customer case studies. Janet Parkinson, Alan Patrick and I were contributing to the tweet stream at #e20s and flying the Agile Elephant flag. We'll blog there and post a show report here in a few days. ### Taxing the Cloud Philip Grannum, Managing Director, Hosting Operations at Easynet Global Services, highlights tax considerations that should not be overlooked when deploying cloud and colocation. Taxation is a hot issue when it comes to cloud, both for providers and purchasers. The impact of tax on the cloud shouldn't be sniffed at, and tax is often an afterthought for businesses considering cloud migration. As far as vendors are concerned, some of the major cloud players have come under fire for the way that they organise their businesses, so that they don't appear to bear an equitable share of the tax burden which bricks and mortar companies are obliged to pay. Tax is often an afterthought for businesses considering cloud migration. Whilst not wishing to get into the perceived rights and wrongs of this, it is certainly a dynamic area which is keeping accountants and tax lawyers busy across the globe. Each tax jurisdiction has its own rules and interpretation of these rules, even at the individual state level within the US, let alone at the country level. In these times of austerity it's not surprising that governments are seeking to tighten tax regimes on electronic goods, seen for example in the recent change in tax treatment of offshore gambling operations. As for cloud purchasers: leaving aside the tax location choices that companies may make to optimise their tax position, for those companies in Europe taking day-to-day decisions about where to host their infrastructure, VAT is likely to be the prime concern. Now, VAT is a little complex (to me at any rate) but suffice it to say that it works a lot easier if you trade with a company that is in the same VAT jurisdiction as you. If you don't, it does get a little bit more complicated. For those who wish to understand more about it, I recommend this guide published by PWC. "Where it applies", says the HMRC, "you act as if you are both the supplier and the customer - you charge yourself the VAT."
Now I’m no tax expert, but a couple of things stood out for me. If you are a UK company and you procure some cloud services from a provider not resident in the UK for tax purposes, say from the US, what you do according to the HMRC is apply a ‘reverse charge’ or tax shift’.  “Where it applies”, says the HMRC, “you act as if you are both the supplier and the customer - you charge yourself the VAT and then, assuming that the service relates to VAT taxable supplies that you make, you also claim it back. So there's no net cost to you - the two taxes cancel each other out.” If you have got this far without falling to sleep, you’ll no doubt think that if you are an IT manager, this is all fine as the accountants can take care of it with the taxman, and in many cases when buying a simple cloud service this is true. However, if you are deploying a service which consists of some hybrid deployment of cloud with some kit (physical goods) in colocation, then it becomes even more complicated.  This is because physical goods are treated differently to services for tax purposes.  The colocation itself will also most likely be treated differently.  Whilst cloud can be argued for tax purposes as not having a specific geography of delivery, colocation usually does.  Colocation in some geographies can be subject to some local taxation (for example sales tax in US), property taxes or can even be construed as setting up a fixed establishment in the country (India).  This is before getting into exotic issues like carbon taxes. According to tax consultants RKG Consulting, “The VAT treatment of colocation services is very complex and advance planning should be undertaken”. I couldn't have put it better myself. So if you wanted to set up an infrastructure from the UK in say the Netherlands with maybe your disaster recovery in Germany including kit, colocation and some cloud, you have quite a bit to think about before even considering the practicalities of the support for the physical kit which may not be part of the service provided by the cloud provider. Now of course the cloud purists may complain at this point that the customer can solve this problem by moving everything to the cloud (which may be one of those ‘wouldn’t start from here’ conversations) or at least that the customer splits their environment and retains the colocation locally. However, this may or may not be practical/cost effective depending on the kit being co-located. International protected 10G for a tape library anyone? No, I thought not! As KPMG warns in its Tax in the Cloud white paper, “It is becoming increasingly important that companies do not undertake cloud activities in isolation but weigh up any business opportunities with the potential tax implications”. There’s no need for tax to become a barrier to cloud migration: it should simply be clearly integrated into the business case, as another consideration. What has your experience been of tax in the cloud, as a vendor or purchaser? Have you included it in your business case, or do you prefer to gloss over it and leave it to your finance team? ### ULCC and Arkivum announce partnership Large scale, long term and cost effective digital archiving made easy The University of London Computer Centre (ULCC), the leading IT service provider to the UK education sector, and Arkivum Ltd., providers of large scale, long term, and cost effective digital archiving solutions, today officially announced their strategic partnership. 
The partnership enables ULCC’s Academic & Research Technologies team to offer an integrated, long term digital preservation solution to the academic, heritage and special collections sector - by adding large scale and long term digital archiving capabilities to its existing service portfolio of digital preservation training (DPTP), consultancy, and repository development and management. Richard Davis, Head of Academic & Research Technologies at ULCC said: 'The costs and complexity of implementing best practice in data management and digital preservation can be prohibitive. With Arkivum’s clearly-defined, no-nonsense service levels, data security and cost transparency are assured, and costs can be readily forecast for organisations or individual projects. We expect it to be of particular benefit to Higher Education institutions seeking trusted storage for research data, cultural heritage organisations with large digitised collections, and any organisation needing to manage valuable business records and digital assets over the long term. ULCC’s Managed Digital Archive Service offers a quick and cost-effective way to access Arkivum’s service, through a shared Arkivum gateway, further backed by decades of experience in managing digital archives and data repositories at ULCC.' "We are delighted to be working with ULCC," said Jim Cook, CEO of Arkivum. "Working together, we are able to offer ULCC's customers the ability to archive their long term digital data regardless of whether the need is for 5 years, 10 years or much longer. We can safeguard the integrity of the data and provide peace of mind against any compliance or regulatory needs - now, tomorrow and long into the future." To find out more about the Arkivum service and its integration into ULCC's existing EPrints-based repository service you can join us for the Arkivum/ULCC Briefing. You can register online for the half-day event, which will take place on Monday, 24th February at Senate House, University of London. Link to register is: https://ulcceverybitarchived.eventbrite.co.uk ### Microsoft CEO appointment indicates cloud focus One of the most significant and high-profile jobs in any sector, not merely the technology sector, has just been filled after a long period of apparent cogitation. The appointment of the new CEO of Microsoft has proven to be one of the most laborious processes within recent memory, but the decision has finally been taken to appoint the Indian-born Satya Nadella to this critical position within the software giant. Evidently, with Microsoft at a crossroads in their existence there was a lot of care taken to select the right person for the job, and when looking at Nadella’s history and previous position it immediately becomes evident why he was chosen. Nadella’s previous position was Microsoft Corporation’s head of Cloud and Enterprise, and it is clear that his elevation has been fundamentally driven by a desire within Microsoft to compete with the top corporations within this sector. It is not exactly a secret that Microsoft’s core business model has been based around a technology which is threatened by the cloud, and evidently Nadella has been brought in to work out how the company can penetrate the cloud significantly in the future. Certainly the appointment of Nadella is a significant gamble. Not only has he never run a business of this size before, or held a remotely similar position; he’s never run a business of any size.
Nadella joined Microsoft after a short period at Sun Microsystems, and his educational and work background is mostly technical in nature. In many ways, the existence of the cloud is something of an inconvenience for Microsoft. However, there is no doubt that Nadella has the technical and academic knowledge required for the role, having earned an MS in Computer Science from the University of Wisconsin–Milwaukee and an MBA from the University of Chicago Booth School of Business. Since then he has been intimately involved in Microsoft’s cloud efforts, so he has as good a grounding in the matter as one could reasonably ask for. In many ways, the existence of the cloud is something of an inconvenience for Microsoft. Having dominated the operating system market for decades with Windows, and having a particularly strong position in the corporate sector, there was little incentive for one of the world’s biggest and most successful companies to significantly evolve. Without doubt, they would quite happily go on living in a world in which they can sell the Windows OS to people at $150+ a pop. Unfortunately for them, the cloud has rather complicated this business model, as many observers believe that physical hardware and operating systems being downloaded to all computer users’ physical machines will become less and less prevalent in the near future. There has already been talk that, in an attempt to respond to the present threats to its business, Microsoft will take a particularly radical approach to the forthcoming Windows 9 and actually offer it to consumers for free; aping the strategy of Apple with its OS X Mavericks operating system. Nadella has already made it clear in his early comments since taking the Microsoft CEO job that he fully recognises his role is one of evolving the company. His opening comments released to the press were all focused on the future direction of Microsoft, and how the company must “continue to transform” and do so rapidly. There was also a noticeable emphasis on innovation, which will be absolutely essential for Microsoft going forward. Currently, Amazon remains the dominant force in the cloud (IaaS & PaaS), and Microsoft is certainly playing catch-up attempting to get to the top of the industry. Microsoft is clearly investing a lot of time, money and energy into its Azure cloud system, and recently slashed storage prices related to it in an attempt to close the gap on Amazon Web Services. It was reported last year that Microsoft’s strategy was beginning to pay dividends, but for the time being Amazon continues to be dominant in the cloud sector. It is an odd situation for a company such as Microsoft to be under serious threat, but the cloud certainly poses such a threat to their business and prominence. But with decades of experience of overcoming such challenges, don’t bet against them solving this problem. ### Intermedia partners with Global IT Security Authority Intermedia adds McAfee Email Defense to create a partnership between the Exchange hosting leader and the Global IT Security Authority. The world’s largest one-stop shop for cloud IT services and business applications has chosen McAfee to protect its customers and partners from email-based security threats and business continuity risks. McAfee Labs estimates that between 80 and 90 percent of all email traffic is unwanted1 and third-party research revealed that 58 percent of organisations have been infiltrated by email-based malware2 - a threat vector with the primary objective of stealing information.
By teaming with McAfee, Intermedia—which also is the world’s largest Microsoft Exchange hoster with over 700,000 mailboxes—can protect its SMB customers from threats that can disrupt productivity, bring down systems, corrupt company data and fully compromise security and privacy. Intermedia’s McAfee Email Defense Suite consists of three services. McAfee Basic and Advanced Email Protection: McAfee Basic Email Protection eliminates spam and viruses before they reach users’ mailboxes. McAfee Advanced Email Protection adds granular control over protection settings as well as McAfee ClickProtect to defend users who click malicious URLs. McAfee Email Continuity: This add-on service offers redundant email access for peace of mind beyond Intermedia’s 99.999% uptime guarantee. McAfee Email Data Loss Prevention: This add-on service filters outgoing email to ensure sensitive information or undetected viruses don’t leave users’ outboxes. These services are available today for all new mailboxes, and will be rolled out to all Intermedia mailboxes over the coming months. “The world has too many stories of email-based viruses bringing down networks or exposing sensitive information,” says Ed Macnair, managing director EMEA at Intermedia. “We’re bringing McAfee into our ecosystem to protect productivity and security by shielding our users from email-based threats.” Intermedia’s channel partners can also take full advantage of Intermedia’s new McAfee service. They can now add world-class protection to their services portfolio—without the integration and management complexities of working with a third-party security provider. “McAfee knows how critical it is for an SMB to keep their business up and running and protected from email threats,” says Bill Rielly, senior vice president of Small & Medium Business at McAfee. “With Intermedia’s new McAfee Email Defense Suite, we are able to protect more businesses from the threats posed by email.” Intermedia customers are excited about this new offering as well. “I’m pleased that McAfee is so well integrated that deploying it requires no effort at all," said Roy Hill, IT Director of Opportune LP Outsourcing, an Intermedia customer that participated in the pilot programme. "And I'm very happy with the level of protection it provides for both inbound and outbound messages." Intermedia is committed to offering tools and services to equip its growing user base to compete in rapidly changing markets. Any customer who provisions either the McAfee Email Continuity or Data Loss Protection add-on receives the other at no additional charge. 1 McAfee Third Quarter 2013 Threats Report: http://www.mcafee.com/us/resources/white-papers/wp-osterman-cybercriminals-make-money-email.pdf 2 Osterman Research: How Cybercriminals Make Money with Your Email: http://www.mcafee.com/us/resources/reports/rp-quarterly-threat-q3-2013.pdf About Intermedia Intermedia is the world’s largest one-stop shop for cloud IT services and business applications. Its Office in the Cloud™ delivers the essential services that SMBs need to do business – including hosted Exchange, Hosted PBX, SecuriSync file sync and share, security, mobility and more. All of Intermedia’s services are pre-integrated for a seamless user experience. They are all managed via Intermedia’s HostPilot® control panel, with just one login, one password, one bill and one source of support – creating tremendous cross-service efficiencies for both users and IT administrators. 
In addition, they all offer enterprise-grade security, 99.999% availability and 24/7 customer care to assure a worry-free experience. Intermedia was the first company to offer hosted Microsoft Exchange and, with 100,000 customers and over 1,000,000 users, is today the world’s largest independent provider of hosted Exchange. Intermedia also enables over 13,500 channel partners – including VARs, MSPs, telcos and cable companies – to sell cloud services under their own brand, with full control over billing, pricing and every other element of their customer relationships. Intermedia’s 600 employees in 3 countries manage 10 datacenters to power its Office in the Cloud – and to assure its famous worry-free experience. Learn more at Intermedia.co.uk. ### How a hybrid cloud architecture can reduce risks Margaret Ranken, Telecommunications Analyst, looks at why enterprises are choosing a hybrid cloud architecture in order to reduce risk and gain competitive advantage. As the benefits of cloud computing become well understood, most companies are using at least some cloud services, whether email, backup, e-commerce sites or indeed services from providers of all types of IaaS, PaaS, SaaS or XaaS offerings. The standard of data centre performance, resilience, monitoring and availability attainable by the specialists is making it harder to justify maintaining an in-house data centre for all but the largest organisations. Nowadays, start-ups never even consider the possibility: it's cloud all the way. For companies that already have established in-house deployments, there is always the concern that change is costly and introduces risks. Most IT managers follow the old adage "if it ain't broke, don't fix it", so moving a system that is working well from in-house servers to a cloud deployment doesn't make sense. It is no surprise, then, to find that in October 2013, Gartner estimated that nearly half of large enterprises will have hybrid cloud deployments by the end of 2017. Good management tools are essential to knowing where your assets are located, how they are protected and what the threats are. Hybrid cloud management tools are emerging from companies like VMware and Red Hat that allow you to manage in-house and cloud infrastructure seamlessly, applying a single model for management, orchestration, security and networking across both the public and private data centres. Network virtualisation allows you to configure firewalls and networks in the cloud as if they were in your own data centre and follow the same policies for security, control and compliance. By creating a seamless infrastructure you can reduce rather than increase complexity and thus reduce risk. Hybrid cloud management tools are emerging from companies like VMware and Red Hat that allow you to manage in-house and cloud infrastructure seamlessly. The most obvious way in which a hybrid cloud can reduce risks is in cloud backup and disaster recovery applications, by providing flexible access to off-site capacity that can be configured to provide hot back-ups for truly mission-critical data and regular but less frequent back-ups for more run-of-the-mill information. Using an elastic cloud architecture also reduces the risks to public websites, especially e-commerce facilities, which often require publicly accessible websites linked to mission-critical assets on-site. Their resource usage is dynamic and unpredictable and sites can be brought down by a sudden surge of demand, for example following an unexpectedly successful promotion.
Hybrid cloud allows you to extend your infrastructure seamlessly when needed and to replicate mission-critical applications so that there is no downtime should problems occur. Another area where hybrid cloud can reduce risks is in development, where, for example, a new mobile app can be tested in the cloud without risking disruption of the core in-house systems. Test environments can be created quickly in the cloud, so if there are multiple streams of development you can create multiple environments in which to test the different streams. It makes the decision to go ahead with a project much easier when you know that it is using a flexible infrastructure that can be created separately from the core systems. Assessing risks can be a challenging process, and that must be the first step. But a hybrid cloud carefully designed to reduce those risks once they are identified may well turn out to be the best solution in the long run. ### A local cloud for local people Philip Grannum, Managing Director, Hosting Operations, Easynet Global Services, considers the advantages that localisation has for the delivery of cloud computing services. I've long been aware of the critical nature of location in the decision to site a data centre, factoring in such considerations as availability of power infrastructure, telecom access and physical security. In the Cloud world, however, on the face of it things should be simpler. Data should be accessible from anywhere and the physical hosting location shouldn't matter – or so say the Cloud purists. Content doesn't observe national borders. Indeed a senior engineer at one of the US Cloud leaders admitted that they had initially assumed that two US-based data centre locations would be sufficient to operate a Cloud service. This US-centric view perhaps portrays a certain level of naiveté when viewed from today’s post-NSA market situation, and indeed the same provider is expanding its footprint across the globe. In reality much content or data is anchored to its home market. At a philosophical level too, I’m not convinced that concentrating most of the content and data into a small number of mega data centres, operated by a small number of providers, is necessarily the right model. The Internet is very robust due to its ubiquity and distributed architecture. The most natural location for data is close to the user, and so a more dispersed Cloud has potential end-user benefits. Because of this, at Easynet we locate our Cloud hosting services physically in the principal markets we serve. Europe is not a uniform market, and local market needs and concerns are in many cases most readily supported with local in-country infrastructures. Data sovereignty There has been much discussion recently about the free migration of labour within the EU. Labour may be more or less free to move, but it is not yet so easy for people to take their data with them! Each country in the EU has its own data protection agency and, whilst there is movement towards a common framework in the EU, each is currently operating in a different way within a different regulatory framework (and I’m sure that the good folks of Royston Vasey have their own ideas about data protection too!). Some countries like Iceland and Switzerland have been touting themselves as potential safe havens to host data outside the main regulatory blocs. The self-declared state of Sealand made its own similar attempt. There’s a useful guide here for those interested. There are barriers to data migration.
There generally needs to be explicit consent of the data owner to move their data outside their borders. There are barriers to data migration. There generally needs to be explicit consent of the data owner to move their data outside their borders. Also, even if a Cloud provider’s infrastructure is located within the desired region of operation, if their staff located outside of that geography have potential access (e.g. root access to a server with unencrypted data), then that can be judged as non-compliant from a data protection perspective i.e. the support structure and method can be as important as the physical location. The EU is implementing a new regulatory regime with a view to providing a more harmonised approach across EU, for which there is an excellent guide here. However, in general the message for Cloud companies, and their customers, is that the regime is getting tighter, including the potential obligation to notify the national supervisory agency of data breaches within 24 hours; the rights of data portability for consumers; and the right to be ‘forgotten’. In the UK, the Government has its own data security policy, which in practice means that UK hosting managed by UK security-cleared staff is the most practical option for many classes of data. Easynet’s G-Cloud accredited hosting is operated wholly onshore in UK for this reason. Industry sectors like Financial Services, Pharma, Healthcare, Telecoms and Utilities are also subject to further data regulation requirements specific to their industry and data types over and above general data protection provisions. In a post-NSA world with an increased awareness of Cyber threats these regulations are only going to get tougher. Security Now I’m not going to enter into a long discussion here about the relative security benefits of Cloud vs. operating on-premise. Data which is not connected to anything and with no means of direct access to it is the most secure but also most useless. Thereafter every decision is a balance of risk between ease of use and control.  It is certainly the case that Cloud and hosting companies have a depth of expertise in security management which exceeds that of many of their customers who can benefit from that, in building greater security into their controls. It is also true that multi-tenant Cloud may not provide optimum balance between risk and access for some critical data, particularly when the data owner has to prove compliance to a sceptical auditor. It is certainly the case that Cloud and hosting companies have a depth of expertise in security management which exceeds that of many of their customers who can benefit from that. Indeed the practicality of conducting an audit is one area where the location can be an important consideration. In proving compliance, be it PCI or some other standard the customer is working to, it is often required that the auditor can have access to the facilities where data is both hosted and accessed from. This can be a lot easier and less expensive to organise if conducted within the country where the host company and auditor is located. Performance A storage salesman once said to me that virtual machines can move easily but data is heavy. A nice way of saying that whilst machine images are easy to move around it can be difficult and costly to move around large quantities of data. This impacts when migrating existing systems and data to the Cloud If you have a virtualised environment it is often possible to migrate machines (subject to compatibility etc.) 
into a Cloud environment: some Cloud providers won’t allow you to migrate VMs and require a fresh build, but many will allow this. However, if the data is large it may be impractical or costly to migrate over the wire - if connectivity is not already in place it can be expensive to put a large link in for a short term data transfer and some providers, levy high volume-related charges for data transfers. This may mean that the most practical solution for a large deployment is a physical transfer on media to the host data centre. Again, when up against a short time window, which is typically the case for critical infrastructure, this is usually most practical if the target data centre isn't too far away. Adding the complexity of cross border migration can introduce additional time and risk. The second place it hits is performance. Latency can negatively impact the user experience, particularly if the application is chatty between a user machine and the host. For some applications distance doesn't matter so much or can be ameliorated by using CDN or WAAS. However in general Cloud, it is a lot easier if the target hosting location isn't too far from a latency perspective from the end users, if not in-country then at least in the same region of the globe. If you’re replicating a lot of data between two sites it is also useful if they stay within the same geography. Carriers generally charge a lot more for international transport than domestic circuits at the same bandwidth. It may also be the case (particularly for synchronous applications), that staying within a metro, or proximate to a metro, can have significant advantages (hence the location of data centres in a ring around London providing backup infrastructure to clients located within the M25). Are these practical considerations going to erode rapidly so we end up in a borderless data world, with Cloud operations mainly in mega data centres operated by the industry giants, and where data sovereignty becomes an anachronism? Or will we increasingly move towards a more diverse and distributed model with a higher degree of localism and variety? I hope it’s the latter. What do you think? ### Which vision of SDN will win? At the end of last year, Compare the Cloud took a look at what we thought was in store for Cloud users and Cloud service providers, and one of our predictions was for a greater awareness of Software Defined Networking (SDN) to develop this year. But, if this proves to be the case, which vision of SDN will win through in 2014? The Case for Software Defined Networking The argument for SDN is very tempting: by applying the logic of server virtualisation to a network, we should be able to achieve the same level of abstraction and the same benefits, right? Well, the jury is out on whether it is really achievable and, in the meantime, two very different visions are emerging about how those benefits can be realised. By applying the logic of server virtualisation to a network, we should be able to achieve the same level of abstraction and the same benefits, right? Attempts to create a policy-defined, more centrally-managed network are not new. At CohesiveFT they've been developing a software defined (overlay) network since 2007. However, the success and widespread use of virtualised servers, the growing acceptance and move towards cloud computing models and the success of businesses like Facebook and Amazon in efficiently managing huge data centres has really ignited interest in the potential of SDN. 
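To make the abstraction argument more concrete, here is a deliberately simplified Python sketch of what "decoupling the network from the physical infrastructure" can mean in practice (the class, VM and host names are invented for illustration and this is not any vendor's API): the virtual network is just a policy object mapping VM endpoints onto a logical segment, held separately from the record of which physical host each VM currently occupies, so a workload can move between hosts without the network policy changing.

```python
from dataclasses import dataclass, field

@dataclass
class VirtualNetwork:
    """A logical segment plus a distributed-firewall style policy."""
    name: str
    members: set = field(default_factory=set)       # VM identifiers on this segment
    allowed_ports: set = field(default_factory=set)

    def can_talk(self, src_vm: str, dst_vm: str, port: int) -> bool:
        # Connectivity is decided purely by logical membership and policy.
        return (src_vm in self.members
                and dst_vm in self.members
                and port in self.allowed_ports)

# Physical placement is tracked separately from the network definition.
placement = {"web-vm": "host-a", "db-vm": "host-b"}

app_net = VirtualNetwork("app-tier", members={"web-vm", "db-vm"}, allowed_ports={5432})

print(app_net.can_talk("web-vm", "db-vm", 5432))   # True
placement["db-vm"] = "host-c"                      # live-migrate the VM to another host...
print(app_net.can_talk("web-vm", "db-vm", 5432))   # ...the virtual network is untouched: True
```

The flip side of that simplicity is exactly the criticism discussed below: nothing in such a model knows anything about the physical underlay it rides over.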
Hardly a week goes by without some new SDN-related acquisition, strategy unveiling or the like. And a schism seems to be emerging between those who see SDN as a software-only solution, which decouples the network from the physical infrastructure, and those who argue that network virtualisation must be paired with an ability to address the physical infrastructure if SDN is to have real value. Network Virtualisation Holds All the Answers Given where they are coming from, it is no surprise that the "decouple, decouple, decouple" argument is being led by the leading hypervisor vendor, VMware. The vision for their network virtualisation platform, NSX, is to enable users to deploy a virtual network for an application at the same speed and with the same operational efficiency as they can deploy a virtual machine. VMware delivers this through its hypervisor virtual switch, vSwitch, and its NSX controller. The virtual switches connect to each other across the physical network using an overlay network, handle links between local virtual machines, and provide access to the physical network should a remote resource be required. An advantage of this approach is that additional services, such as distributed firewalls and load-balancers, can reside in the virtual switch, allowing for a more flexible firewalling policy and greater network efficiency. However, the strength of the solution - taking the intelligence out of the physical infrastructure, so it is responsible only for forwarding overlay packets - is also at the core of its criticism. An advantage of this approach is that additional services, such as distributed firewalls and load-balancers, can reside in the virtual switch. Because it offers no visibility into the physical underlay network on which the solution resides, there can be no insight or assistance with regard to traffic engineering, fault isolation, load distribution or other essential physical network management activities. Software Is Only Part of the Answer Those who believe that a virtualised network which does not support communication with network switching hardware cannot represent a true SDN solution should look no further than the words of Padmasree Warrior, Chief Technology and Strategy Officer at Cisco, who argued last year that: "Software network virtualization treats physical and virtual infrastructure as separate entities, and denies customers a common policy framework and common operational model for management, orchestration and monitoring." Cisco has been quick to target this perceived weakness in the VMware vision as it promotes its own SDN vision: Application Centric Infrastructure (ACI). Through its APIC controller, Cisco will enable users to create a policy-driven infrastructure, controlling both virtual and (Cisco) hardware switches, and built around the needs of the application. Through its APIC controller, Cisco will enable users to create a policy-driven infrastructure, controlling both virtual and (Cisco) hardware switches. Cisco argue this approach will deliver both network virtualisation and visibility into the physical network, which will enable the fine-tuning of network traffic. Which Vision Will Win Out? VMware have been quick to dismiss Cisco's vision of the Application Centric Network as 'Hardware Defined Networking', casting it as an attempt to slow a shift in architecture specifications towards virtual networks which can run on low-cost hardware. The commoditisation-of-hardware model doesn't work, they argue, when you're tied in to specific hardware.
But it isn't as simple as that: the fact remains that hardware will still need to be updated to cope with changing network needs, and it will continue to be necessary to monitor and optimise it. I guess whichever side of the argument ultimately wins hearts and minds, the real winners here will be cloud customers and CSPs. Two major technology firms bringing to the market two different solutions and investing huge marketing efforts into promoting their own approaches will certainly raise awareness of the potential of SDN. As SDN becomes adopted more widely, so will customer organisations' ability to leverage the full benefit of virtualisation and Cloud computing. For CSPs, a key benefit of virtual networks is the provision of multi-tenant isolation on a shared physical network. To this end, many CSPs are running their own proprietary SDN solutions in their environments already (as we saw from Chris Purrington's post earlier), so this argument is really one about raising awareness about SDN and communicating the benefits. Those benefits include helping organisations manage hybrid and multi-cloud environments in a way that helps them leverage the greatest performance, efficiency and security. And so whichever vision wins out, as SDN becomes adopted more widely, so will customer organisations' ability to leverage the full benefit of virtualisation and Cloud computing. ### RES Software launches IT Store Solution RES Software has announced the launch of the RES IT Store. The RES IT Store transforms how users consume IT services through proactive, secure and automated delivery and return of IT resources. By automatically delivering predicted IT services to users, and supplementing this with automated self-service for additional service requests, IT is now able to give users the instant, consumer-like experience they have been demanding in the enterprise. “The RES IT Store provides a win-win for IT departments and users,” said Bob Janssen, CTO and Founder of RES Software. “While IT can benefit from the ability to automatically provision a user with services that can be predicted for them based on their business-defined needs, the RES IT Store enables the business user to interact with IT through an intuitive, self-service experience. The result is a single point of access for all IT services, providing the convenience and instant response the user needs with the security, compliance and control demanded by the business. It’s really what both IT and the business have been looking for.” With the rise of the “consumerisation” of IT, the RES IT Store provides a similar experience to what users receive from the consumer app stores of companies such as Apple and Google. Combining the security features sought by IT with the ease of use and efficiency requested by users, the RES IT Store provides a controlled and automated platform from which users and IT can request and access all of the services, applications and data they need to be productive. “When you look at the challenges that IT departments are facing today, the core struggle lies in delivering on users’ mounting expectations,” said Mike Whitehead, Senior Systems Engineer for Intuit, whose company deployed the RES IT Store solution in 2013. “When users are accustomed in their ‘consumer lives’ to an easy experience and immediate access to any apps or data they need, IT is driven to replicate that experience as much as possible in the enterprise.
The RES IT Store helps us do that.” The RES IT Store enables organisations to: Reduce the time needed for IT to provision new employees – often from days or weeks to just minutes – by proactively subscribing users to IT services based on their business qualifications, such as their roles, departments, geographic locations and more Reduce the number of service desk tickets through proactive automation, self-service and request fulfilment, which can also remediate incidents. RES IT Store users have seen a reduction of service tickets of 25% Ensure the automated and rapid return of all IT services during the offboarding process, as well as other organisational changes such as promotions or role changes Take advantage of the familiar and consumer-like app store model, while adding speed of delivery and IT efficiency through on-demand automation To support the growing mobile workforce, the RES IT Store integrates with leading enterprise mobility management technologies, including Citrix XenMobile and MobileIron. RES Software also supports leading IT service management provider ServiceNow and cloud-based services, such as Salesforce.com and Microsoft Office 365, to provide users with a single interface for requesting IT services. RES Software will continue to build its integrations, with support for BMC Remedy and AirWatch slated for Q2 of 2014. With the RES IT Store, IT can aggregate services for users across physical, virtual, Mac, mobile and cloud platforms, streamlining the complexity of today’s IT environment. The result is the ability for IT to proactively manage the complete lifecycles of IT services. RES Software has developed its IT Store solution with an open technology approach that is designed to integrate with any needed workplace technology. “The RES IT Store creates the opportunity to bridge the widening gap between users and IT,” said Klaus Besier, CEO of RES Software. “It gives customers a full-service, comprehensive IT platform that provides the IT department and end users with the best possible IT service experience, while also lowering costs and improving productivity. Based on feedback from our launch customers, partners and the analyst community, the RES IT Store is a game-changer for enterprise IT.” The RES IT Store is available with the newly released RES Suite 2014. It is a lightweight platform that works with an organisation’s existing infrastructure, and can be implemented in a scalable way to address specific or broad-based use cases in the enterprise, such as onboarding/offboarding, service desk ticket reduction, or on-demand app store provisioning. The RES Suite adds advanced abilities to the IT Store and allows IT to personalise and secure access to the user’s IT services in real-time. To learn more about the RES IT Store, visit www.itstore.com. ### Savings offer US government cloud incentive While most people in the know have long since recognised cloud computing to be the future of information technology, there are a variety of barriers and obstacles that have to be negotiated before it becomes as prominent a part of people’s lives as a technology we take for granted, such as the Internet. Two immediately come to mind. Firstly, there is the natural conservatism that we human-beings seem to have as a default function. 
The average punter in the street will not sign up to cloud computing until it becomes a regular part of everyday conversation, and businesses will stick steadfastly to their existing set-ups until very explicit benefits of cloud computing are demonstrably indicated. Of course there will always be some enthusiasts, visionaries and early adopters who see the value of the cloud long before the general consensus jumps on board, but the history of technology tells us that the process of the majority getting on board with a new innovation can be a laborious one. Businesses will stick steadfastly to their existing set-ups until very explicit benefits of cloud computing are demonstrably indicated. Secondly, there is that perennial issue which makes the world go round – economics. The bottom line is that no-one will sign up to cloud computing, particularly among businesses and large private sector organisations, until it is in their financial interest to do so. Many businesses and government entities have naturally made huge investments in existing technology, and they are not going to give that up until they’re 100% certain that it is genuinely obsolete, or at least until they see some very strong benefits from doing so. However, a recently published study suggests that the cloud computing 'logjam' might be about to be eased somewhat. The study in question suggests that adoption of cloud offerings — particularly as a platform for service middleware and application development tools — can make a significant contribution to saving the US government money. The study indicates that cloud computing could potentially cut U.S. government application development costs by as much as $20.5 billion per annum. The survey in question was carried out with 153 federal IT executives by MeriTalk, a public-private partnership for advancing government information technology. The study was underwritten by Red Hat, Inc., thus underlining its credibility, and calculates that Platform as a Service (PaaS) could cut federal IT costs by approximately one-third. The US federal government is in fact a very large producer of computer software, in order to operate its various departments. More than three-quarters of the respondents to the survey stated that developing new applications is absolutely essential to the day-to-day running of their departments. The report was so critical of the existing IT arrangements in US federal government departments that it described them as “fundamentally broken”, with the study citing evidence from the Government Accountability Office which indicates that the government spends 70% – $56 billion – of its IT budget on care for legacy systems, which move at a “glacial pace”. Naturally this translates into higher costs for government departments, with the software application development cycle taking an average of three and a half years. In further damning information from the study, 41% of the federal IT managers who responded stated that their agencies’ software and applications need to be either upgraded or completely replaced. As well as painting a damning picture of the IT systems currently in operation in the US government, this study also provides extremely significant news for the future of cloud computing, demonstrating quite clearly that the cloud can offer huge savings to big organisations; exactly the sort of incentive which is required to catalyse the cloud revolution.
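As a rough back-of-envelope check of how the quoted figures hang together (the derived values below are my own arithmetic, not numbers taken from the MeriTalk study), the GAO figure implies a total federal IT budget of around $80 billion, which puts the study's claimed savings in a plausible range:

```python
# Back-of-envelope check of the figures quoted above; the derived values are
# illustrative arithmetic only, not taken from the study itself.
legacy_spend = 56e9        # $56bn, reportedly 70% of the federal IT budget
legacy_share = 0.70
implied_total_budget = legacy_spend / legacy_share   # ~$80bn in total
one_third_of_budget = implied_total_budget / 3       # "cut costs by one-third" ~$26.7bn
claimed_app_dev_saving = 20.5e9                      # the study's $20.5bn per annum figure

print(f"Implied total IT budget: ${implied_total_budget / 1e9:.0f}bn")
print(f"One third of that:       ${one_third_of_budget / 1e9:.1f}bn")
print(f"Claimed app-dev saving:  ${claimed_app_dev_saving / 1e9:.1f}bn")
```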
While those in the know rightly recognise that saving money is but a mere part of the total palette of opportunities which is provided by the embryonic technology, economics usually dictates the uptake of any viable medium. For example, it is often asserted that Betamax video was superior to the VHS format which ultimately won the videocassette war, but eventually lost out to its competitor. Similarly, the Spectrum home computer cleaned up in the European market despite more sophisticated offerings elsewhere, simply due to its price point. The fact that we are now beginning to see tangible evidence of the economic benefits of the cloud will only speed up its widespread adoption. Both businesses and private sector organisations alike want to see a bottom line delivery from any major investment or any significant change to their operating procedures. The fact that we are now beginning to see tangible evidence of the economic benefits of the cloud will only speed up its widespread adoption. ### By Frank Jennings, Cloud Lawyer at DMH Stallard LLP Our understanding of government surveillance changed in June 2013 when The Guardian published the first revelations of the NSA’s PRISM and GCHQ’s Tempora snooping programmes. My post that same month Help! NSA has my data – sought to introduce calm against a background of extensive blanket surveillance which has generally been greeted with alarm. 7 months is a long time in the world of cloud, so what do we know now? Data privacy has continued to be a hot topic with some questioning whether we should have a European-only cloud or whether we should abandon cloud altogether. Data privacy has continued to be a hot topic with some questioning whether we should have a European-only cloud or whether we should abandon cloud altogether. The extent of the snooping has taken many by surprise. The Guardian and others have disclosed the surveillance in detail and while it’s not necessary to repeat them, here are some high (low?) lights: the US and UK snooped on foreign leaders at the 2009 G20 summit there have been accusations that Skype voluntarily joined the PRISM programme and RSA introduced back doors to their products to facilitate surveillance. These accusations are denied NSA collected address books from Yahoo, Hotmail, Facebook and Gmail seemingly without their knowledge or cooperation NSA cracked mobile phone encryption and was alleged to be listening to the phone calls of Angela Merkel, the German Chancellor with the Israeli prime minister and the EU Competition Commissioner also targeted The reaction in many quarters has been furious. Again, some highlights for me: the Russian government bought electric typewriters and this was reportedly as a result of them discovering that they were being spied on the European parliament’s civil liberties committee says the activities of NSA and GCHQ appear to be illegal and have asked Snowden to submit to questioning the American Civil Liberties Union is pursuing a lawsuit against the NSA alleging that its spying activities are unconstitutional and Kentucky senator Rand Paul is also about to bring a claim and is urging Americans to join the lawsuit two Californian state senators introduced a bill in an attempt to cut off NSA’s water supply, essential for computer-cooling the European justice and rights commissioner Viviane Reding threatened to freeze the EU/US Safe Harbor scheme President Obama announced some reforms to the NSA Here are my answers to questions which I’ve been asked since June last year: 1. 
Will the EU Data Protection Regulation stop NSA and GCHQ snooping on me? No. The draft regulation is an attempt to harmonise the different approach to data protection laws across the EU. It has been heavily criticised and has undergone a plethora of amendments and there still remains much disagreement that must be resolved before the regulation can be implemented. Regardless, article 2 of the current draft contains an exemption for national security and it is highly likely that some form of exemption will be retained in the final draft. 2. Should we abandon the Safe Harbor scheme? …scrapping Safe Harbor and preventing the flow of EU personal data to the US would punish those companies for the actions of the NSA. No. The EU/US Safe Harbor scheme was an attempt to protect EU citizen’s data in the face of a lack of a US federal law providing similar protections. Don’t forget that most US companies have denied actively participating in the PRISM programme so scrapping Safe Harbor and preventing the flow of EU personal data to the US would punish those companies for the actions of the NSA. There’s no doubt that hitting the profits of US companies would get the attention of the US government but at the same time it could severely impair the growth of cloud in the EU as so much of it is based in the US. 3. Should I abandon cloud? No. The first road fatalities in the US and UK were in the 1890s but this didn’t lead to the banning of the car. 1.24m people worldwide died of road traffic injuries in 2010 alone and yet we still continue to use cars. As with any new development there will always be negatives. The key is to establish proper guidelines and restrictions for surveillance by security agencies rather than abandon cloud completely. 4. Should security agency powers be curtailed? …this should not distract the average cloud provider and customer from getting on with their business. Maybe. The debate continues. In his recent announcement of proposed reforms to the NSA, President Obama said that NSA had been acting within its powers but he recognised that Snowden’s revelations had caused anxiety. The reforms are not as extensive as many were asking for and need further clarification. For example, the NSA will continue to have access to phone data but won’t hold this itself; a third party yet to be identified will hold it instead and NSA will access it when needed. Also, foreign citizens – including other world leaders – will enjoy the same protections as US citizens but will still be the subject of surveillance if necessary to uphold national security. In short, the NSA will continue to undertake surveillance but with some adjustments. This is clearly an ongoing discussion but the NSA, GCHQ and other security agencies will all continue to undertake surveillance to some extent. In the meantime, this should not distract the average cloud provider and customer from getting on with their business. 5. Should the EU adopt its own cloud? Yes, if you mean a European cloud to promote a thriving European-based cloud to help businesses to compete with the US-based cloud and let’s hope that’s what the European Cloud Partnership and Cloud for Europe achieve. 
However, if you mean a state-sponsored scheme to build a European cloud to lock data in the EU and keep out US companies then no, for the following reasons: I’m always wary of protectionist policies, particularly given Europe’s history; it must not act as Fortress Europe, as this would be contrary to the attempts at creating a global economy through international bodies such as the World Trade Organisation; despite the rhetoric, bureaucrats and national governments are normally not the best examples of how to implement successful technology projects; and it would also not overcome the reality of state surveillance. GCHQ’s own surveillance programme, Tempora, is well known following Snowden’s revelations. There have also been disclosures that Germany’s Federal Intelligence Service has contributed to NSA’s data collection and France’s Directorate General for External Security has been intercepting and storing French telephone and Internet communications. And don’t forget that, under the various mutual legal assistance treaties which national governments have signed, security agencies share information between them, including with the NSA. 6. Why don’t the legislators act quicker to help cloud? It’s important for legislators to strike a balance, regulating innovation to protect consumers without rushing laws that stifle the innovation. Innovation happens faster than law making. It has always been so. Typically, until specific laws are passed to regulate an innovation, judges will apply any relevant existing laws, meaning there may be over-regulation rather than under-regulation in the short term. It’s important for legislators to strike a balance, regulating innovation to protect consumers without rushing laws that stifle the innovation. This supposed lack of relevant law hasn’t stopped US cloud developing. Nor are customers without adequate protection. Consumers are already well protected. Arguably it’s SMEs who need laws to help redress the balance, and in the meantime they need to shop around. And read the contract terms before signing up – but a cloud lawyer would say that! 7. So, how do I prevent security agencies snooping on me? Avoid cloud altogether and buy electric typewriters to keep everything on paper on premise. That sounds a bit extreme. Address the problem at source, perhaps by curtailing surveillance powers or through better scrutiny? Obama’s announcement doesn’t give me much comfort that much will truly change. Run certain activities in the cloud, but keep sensitive data out of the cloud. Encrypt (or token-ise) sensitive data before transmitting and storing it in the cloud (a minimal sketch of this appears at the end of this piece). Of course, these aren’t foolproof. The NSA has apparently been using tiny radio devices to get access to offline computers. It can already crack some encryption algorithms. The NSA reforms won’t prevent snooping but, at least for now, their activities will be more closely scrutinised. Also, if all data is encrypted – not just at rest in data centres but in transit too – this will likely cause the NSA and other national security agencies to focus their resources on those targets who they genuinely believe are a threat to national security rather than the blanket approach they have taken up to now, as they can’t efficiently decrypt all data (yet). 8. Should I just stop panicking and carry on with my business? Yes.
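Picking up on point 7 above, the "encrypt before transmitting and storing" advice can be as simple as applying client-side symmetric encryption and keeping the key on premise, so the provider only ever holds ciphertext. A minimal sketch, assuming the Python cryptography package (the data and key handling shown are illustrative only, not a complete key-management scheme):

```python
from cryptography.fernet import Fernet  # pip install cryptography

# Generate the key once and keep it on premise (in your own key store),
# never alongside the data held by the cloud provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer-id=1234; notes=sensitive"   # illustrative sensitive data
ciphertext = cipher.encrypt(record)             # this is what gets uploaded

# Later, after fetching the object back from the provider:
assert cipher.decrypt(ciphertext) == record
```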
### Network risks for online gaming Frank Puranik, Product Director for iTrinegy, identifies network conditions as the big "unknown quantity" for online gaming performance, and offers a solution. I've spent decades looking at networks and their impact on application delivery in different industries, and none more so than network/online gaming. The global casino/gaming industry was purported to have a net worth of over $125 billion last year, and with online gaming taking an increasingly significant piece of the pie it’s essential that users are happy if operators are to continue to grow revenue. Networks are an essential part of the online gaming world, and with all the different hosting offerings available, e.g. Cloud and virtualised data centre choices, it can be tricky to deliver a good online game experience to users. To do this requires three things: software that’s nice to use, efficient and looks good; servers that have enough capacity for peak demand, without going slow; and responsive network delivery to the users’ preferred device. Technically we’d put all of these factors together and call it the user’s ‘Quality of Experience’ (QoE), with the servers and networks providing the essential delivery of the game, which we’d measure as Quality of Service (QoS). The server infrastructure can be strictly controlled: CPU power provisioning, virtualisation level, users per virtual server etc.
But the network, on the other hand, poses a real challenge, as we cannot control it: in online gaming, for example, we just have to take what's out there (a mix of home, corporate and mobile networks, etc.), and from your personal experience you know how variable and poor these networks can be at times. For the right commercial reasons the gaming industry hasn't helped itself technically, setting up data centres in offshore locations for tax reasons - locations which have often had poor network access. This is changing due to new (UK) legislation to tax bets where the consumer is located, and no longer where the servers or the gaming corporations are based. This now adds the prospect of being able to move data centres to ‘better’ locations, but the definition of ‘better’ itself may be cost driven, now freeing gaming companies to look at solutions located in places such as Iceland - with its abundant geothermal energy and therefore low energy prices - without again being primarily focused on the network access. The reason for this is the constant reporting of how bandwidth is cheap and getting cheaper, which leads the non-savvy to assume that the network can fundamentally be ignored because, like the server infrastructure, if you run out you just add more... Unlike the server infrastructure, however, this is absolutely not true! Bandwidth is not always abundant to all locations, it is not inexpensive to all locations and, moreover, it may come with large amounts of latency, packet loss and other network-related issues, which to the player appear as the dreaded LAG! LAG is not to be trifled with. Apart from an uninspiring, poorly crafted game, LAG is the single most prevalent reason that a user will leave the game (either temporarily or permanently). Even if you have purchased and controlled the best MPLS/private network into your data centre, there are still issues surrounding the final delivery to the end user – the last mile, i.e. ADSL, the corporate network (QoS policies), mobile cellular networks, poor WiFi, etc., and, in addition, the actual distance to the data centre (e.g. Iceland). All of these create delay and network latency, which translates directly to LAG. So what can we do?

Better Game Design, Development and Testing

Firstly, when building the game the design must tolerate these (unpredictable) real-world networks and cope with the vagaries of the last mile. This means we ideally need to have those networks available to us throughout the development and testing process of the game. The problem here is that the developers often have access only to fixed and typically excellent networks (great WANs or LANs), because the development environment is highly controlled, and so writing the game to tolerate poor networks is not done. The test environment will be quite similar, perhaps with a few different network scenarios available. Network Emulation has the answer here, providing the ability to create different networks on demand for all the different last-mile network types (such as mobile, WiFi, or ADSL with high latency and packet loss characteristics) as well as being able to simulate the MPLS/WAN into the data centre(s). Our company, iTrinegy, has a proven role to play here; one recent example involved emulating real-world network conditions for a graphics-intensive multi-player online game.
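Before returning to that example, a minimal sketch may help make "creating a network on demand" concrete. The profile values below are invented for illustration, and a purpose-built emulator obviously models far more (asymmetric links, congestion, QoS policies); this simply shows the kind of delay, jitter and loss parameters involved, applied on a Linux test host with the standard tc/netem tooling.

```python
#!/usr/bin/env python3
"""Apply rough 'last mile' impairments to a Linux test host using tc/netem.

Illustrative only: the profile values are assumptions, not measurements,
and bandwidth shaping, asymmetry, etc. are left out. Requires root and
the iproute2 tools.
"""
import subprocess

# Hypothetical last-mile profiles: one-way delay, jitter and packet loss.
PROFILES = {
    "adsl_long_haul": {"delay_ms": 120, "jitter_ms": 15, "loss_pct": 0.5},
    "congested_wifi": {"delay_ms": 40,  "jitter_ms": 30, "loss_pct": 2.0},
    "mobile_3g":      {"delay_ms": 200, "jitter_ms": 60, "loss_pct": 1.5},
}

def apply_profile(interface: str, name: str) -> None:
    """Replace the root qdisc on `interface` with a netem impairment."""
    p = PROFILES[name]
    subprocess.run(
        ["tc", "qdisc", "replace", "dev", interface, "root", "netem",
         "delay", f"{p['delay_ms']}ms", f"{p['jitter_ms']}ms",
         "loss", f"{p['loss_pct']}%"],
        check=True,
    )

def clear(interface: str) -> None:
    """Remove the impairment and return the interface to normal."""
    subprocess.run(["tc", "qdisc", "del", "dev", interface, "root"], check=False)

if __name__ == "__main__":
    apply_profile("eth0", "mobile_3g")  # testers now 'play from' a 3G-like network
    # ... run the game test session here ...
    clear("eth0")
```

Pointed at the test clients' interface, something like this lets a whole team 'play from' a chosen last-mile profile without leaving the lab.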
Whilst all the testers were physically sat next to each other, in the virtual world we created the network conditions that put them all over the globe, connected through all sorts of networks and networking technologies.

Monitoring and Measuring the Network Experience

Secondly, we must, where possible, measure and control those networks to ensure that they do not become overloaded, as running out of bandwidth invariably creates additional latency and LAG. Major events need special consideration, as we need to cope with peak load, not just normal loads. To do this you need to be measuring the game's performance across the network on a constant, ongoing basis to look for increasing delays, loss of game data (causing communications to be repeated) and out-of-bandwidth problems (which lead to queuing). All of these ultimately translate to game LAG. We also need to be looking at all of these for future capacity and ongoing game performance over the networks (good old-fashioned capacity planning, but this time with a strong network bias). Furthermore, we need to be looking at ‘headroom’ for the big event as well as trends in increased network usage (likely caused by more players), which is just good common sense (capacity planning). The data for this is made available by an Application Aware Network Profiling tool.

Datacenter and Cloud Moves (Transformation)

Lastly, if moving a data centre is on the radar, then we really need to re-test (and re-certify) the game to ensure that it will play as well as, or better than, before from the new data centre, with the new networks to that location. The game is going to have to cope with a network that may have more latency due to its physical location relative to users, or simply because the networks to those locations are not quite as good. This risk can be removed by emulating the new network in advance and testing the game play in the emulated network to ensure that the game still performs at least as well as, or better than, it did on the original infrastructure. (It should be noted that this is the standard unwritten data centre SLA - the application should perform at least as well after any transformation as it did before - the big joke being that most don't know how it performed before the transformation - hence measure first.) Either way, successful measuring and emulating will de-risk any type of data centre or cloud move. In summary, this implies the following process: 1) measurement of the game's performance today (through Network Profiling); 2) accurate prediction of performance in the new environment (via Network Emulation); and 3) Network Profiling in the new network to verify that all is as expected. Don't think these points don't apply if you are outsourcing or moving to a cloud provider. Outsourcing your infrastructure and network does not outsource your responsibility to provide a good gaming experience – in the end the buck stops with you, not the outsourcer. All of the technical points mentioned above are fundamentally focused on providing a good quality of service, which translates to winning and keeping more customers by giving them the best possible experience. Not doing this risks your customer retention and therefore your bottom line.
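To make the monitoring and capacity-planning point concrete, here is a minimal sketch that flags a rising round-trip-time trend and shrinking bandwidth headroom from periodic samples. The sample values, layout and thresholds are illustrative assumptions rather than the output of any particular profiling tool.

```python
from statistics import mean

# Hypothetical samples: (interval, round-trip time in ms, link utilisation 0..1)
samples = [
    (1, 45, 0.52), (2, 47, 0.55), (3, 46, 0.58), (4, 52, 0.63),
    (5, 58, 0.71), (6, 66, 0.78), (7, 79, 0.86), (8, 95, 0.93),
]

def latency_trend(rtts, window=4):
    """Ratio of recent mean RTT to the earlier baseline (>1 means latency is rising)."""
    recent, earlier = rtts[-window:], rtts[:-window]
    return mean(recent) / mean(earlier) if earlier else 1.0

rtts = [rtt for _, rtt, _ in samples]
headroom = 1.0 - samples[-1][2]  # spare bandwidth at the most recent sample

if latency_trend(rtts) > 1.3:
    print(f"RTT trending up by {latency_trend(rtts) - 1:.0%} - players will start to feel LAG")
if headroom < 0.2:
    print(f"Only {headroom:.0%} bandwidth headroom left - plan capacity before the next big event")
```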
### IBM drives social business in the cloud IBM builds on growing client and partner momentum for social business in the cloud. IBM has announced strong momentum for its Software-as-a-Service (SaaS) based social business offerings from partners, clients and developers. As part of today's news, IBM is also highlighting its continued investment with new innovations in its social business and cloud technologies. According to a recent IBM study of more than 800 business decision makers worldwide, organisations that use software delivered as a service through the cloud (SaaS) reported two and a half times higher profit than their peers. In addition, organisations that adopt cloud-based solutions are 79 percent more likely to drive increased collaboration across their organization, and twice as likely to leverage analytics to unlock greater insight about their business. A growing list of businesses such as Codorniu, Princess Cruise Lines, SIKA, SafeGuard, Shanks and Verisure have taken their business into the cloud, extending the benefits to millions of users around the globe. “We’ve seen amazing demand over the past year as clients adopt and partners extend the capabilities we’ve been delivering through the cloud,” said Jeff Schick, Vice President of Social Software, IBM. “Over this period, we have spurred new innovation by integrating social capabilities into an organization’s business processes. This has been made possible through our open and extensible APIs that have let us unleash the power of cloud and help organisations collaborate and increase workforce productivity.” Connections in the Cloud and Next Generation Web Mail Experience IBM today announced the rebranding of its mail, chat, meetings, office productivity and content capabilities, making them part of IBM’s Connections brand in 2014. As part of this effort, IBM plans to expand the Connections portfolio to include high-fidelity, high definition video based on its Sametime 9 technology. In addition, IBM is also announcing its next generation web mail experience. In the age of information overload, workers must be able to quickly access and effectively manage the information most vital to their job, much of which resides in their inbox. This new mail offering, planned for availability in both the cloud and on-premises, will use analytics to deliver powerful task-level focus and inbox management capabilities that let employees easily track the content and messages needed to do their job. "IBM’s new web mail experience is very intuitive and integrated with key social business capabilities. This will help our people be more efficient in prioritising and managing daily work, including tracking requests and follow-ups in a powerful yet simple experience," said Berry van der Schans, Information and Communication Technology Manager for Shanks Group Plc. IBM also plans to introduce Domino Applications in the cloud through a ready-made Platform as a Service offering built on IBM SoftLayer. As a result, customers will be able to build on the investments they have made in custom applications and enhance them with new mobile options. Partners will also have a faster path for bringing their new Domino applications to cloud and getting them to market more quickly. Growing Ecosystem Drives Demand for a New Social Business Software IBM is teaming with business partner Parallels to integrate with their innovative cloud automation platform, making it easier for partners that distribute IBM products to assemble and sell unique cloud offerings. 
For example, a telecommunications company can use IBM's new plug-in to easily combine and provision its cloud offerings from IBM and other Parallels-ready vendors, such as SugarCRM. This streamlines the business process of delivering cloud services through a reseller channel. IBM is also introducing a new certified set of global partners that provide on-boarding services, making it easier than ever for clients to deploy their mail to the cloud. Furthermore, IBM's independent software vendor (ISV) partner ecosystem continues to grow. To date, hundreds of unique Connections-based applications have been developed using APIs from the IBM Social Business Toolkit. For example, ISVs such as AppFusions, Flow, HootSuite, Kaltura and Shoutlet have built entirely new applications on the IBM Connections platform, both in the cloud and on premises. HootSuite, for example, created a new social application integrating IBM Connections capabilities and content into the HootSuite dashboard. This allows data from corporate social networks to be viewed alongside social media data from Facebook, Twitter, LinkedIn, and other networks so employees can be even more empowered, connected, and informed.

The Social Software Leader

In 2013 IBM was named the number one provider of enterprise social software for the fourth consecutive year by IDC. In addition, Forrester Research recognized IBM as a leader in file sync and share platforms according to the July 2013 report, The Forrester Wave™: File Sync And Share Platforms, Q3 2013. Today 75 percent of the Fortune 100 have IBM enterprise social software. For information about IBM's social business initiatives, please visit http://www.ibm.com/press/socialbusiness or follow #IBMSocialBiz and #IBMConnect on Twitter.

### Trustmarque appoints Graham Spivey

Trustmarque Solutions Limited (Trustmarque), a leading provider of technology services and solutions to private and public sector organisations across the UK, has appointed Graham Spivey as Director – Cloud Practice. Previously holding positions at board level and as a Sales and Marketing Director, Spivey has a wealth of experience, having spent over 25 years within the IT Managed Services and Solutions arenas. He also has over 10 years' experience devising and implementing Managed Hosting and Cloud strategies, with expertise spanning the private, public and system integrator sectors. Spivey will be responsible for creating and executing a Cloud and hosting strategy to accelerate the growth of Trustmarque's Cloud business. Building on the strong platform Trustmarque has already established following the acquisition of Nimbus Technology Systems Ltd, Spivey will lead the development of the company's Cloud and SaaS market propositions, increasing both their breadth and depth. Scott Haddow, CEO of Trustmarque, commented, “Graham's knowledge and skill set is extremely valuable to Trustmarque and I am delighted to welcome him to the business. He joins us at a very exciting time as we continue to cement our position as an independent end-to-end technology services provider and the trusted adviser of choice for blue chip and large government enterprises. He will undoubtedly help to drive the business forward into its next phase of significant growth.” “It is a fantastic opportunity to join Trustmarque, who have an outstanding reputation for providing excellent solutions and services to customers across all sectors,” said Graham Spivey.
“I am looking forward to accelerating the growth of our Cloud and SaaS offerings whilst complementing our commercial and services offerings in the software and solutions space.”

### India sees future in South Asia cloud

Indian government foresees huge future for the cloud in South Asia. The future of India as an industrial powerhouse has been well understood for some time. The potential of this nation, which admittedly still has massive social and economic problems, can be illustrated by the following statistic: the top 25% of the Indian population with the highest IQs are greater in number than the total population of the United States. India has also been particularly associated with information technology, employing more than 2.5 million people in this sector, and a similar number of people graduate each year with associated degrees from Indian universities. A graduating cohort of 2.5 million people per annum is roughly equal to the entire combined population of the British cities of Manchester, Birmingham, Glasgow and Edinburgh. India's IT outsourcing industry already stands at over $50 billion per annum, and is expected to increase to $225 billion by 2020. India also produces over 500,000 engineers each year, and as 12% of its population speaks fluent English, it is expected that in the next decade India and China will be the two countries with the most English speakers in the world. These facts should put into perspective the global role that India will play, and indeed is playing, in information technology. It should hardly come as a surprise, then, that huge efforts are already being made within the nation to embrace the cloud computing revolution. Just this week, it was announced that Meghraj, the Union government's cloud computing project, will be released next month, with an initial focus on an e-Governance app store at a National Data Centre. The Indian government has announced that the Meghraj project will also be utilised as a repository and marketplace of e-governance apps, while there is an apparent intention to extend the cloud computing project in India to land registry, although there are still logistical issues to be settled in this regard. India has already made strong progress in building its cloud infrastructure. There are already twenty-two data centres operational within the South Asian nation, with a lot of Indian government services already being delivered via the cloud. The Indian government states that its social services are already more than 50% run by cloud applications, with the regions of Bihar, Mizoram and Jharkhand at a particularly advanced stage of implementation. Though the Indian cloud is already very much under construction, industry analysis indicates that this process will only accelerate in the coming years. According to a study from research firm Gartner, Indian cloud services revenue will grow at a compound annual rate of 33.2% from 2012 through 2017 across every sector of the cloud computing market. In addition, the Indian government has announced its intention to build a ‘Centre of Excellence’ for cloud computing, which will be critical to the future development of the cloud in India.
This centre has been tasked with being instrumentally involved in capability building, as well as having a remit related to providing advisory services for government departments and generally spreading awareness among the public sector in India of the growing importance of cloud computing. Already Indian government departments have been proven to have saved money by switching to the cloud. A report in April highlighted that a previous Indian government cloud initiative had made savings for the public sector equivalent to $80 million, due to savings on physical server costs. Further plans already exist to extend this prevalence for cloud computing further. Government blueprints have already been published which indicated that the cloud is intended in the future to provide services to government departments, citizens and businesses throughout India, and that cloud-based mobile connectivity will also be prioritised as well. The growth of cloud computing in India is indicative of the importance that is being placed on the cloud in the so-called developing economies. India is part of the collective known as the BRICS nations – Brazil, Russia, India, China and South Africa – which it is predicted will wield great economic and political power in the coming decades, or possibly sooner. China and India alone house 37% of the world’s population, and there are many other demographic factors which point to the rising power and importance of these nations in the near to mid-term future. As cloud computing increasingly established itself as a mainstream technology, it is going to be very important for the technology to be embraced by every significant geographic region of the world. With both China and India showing a clear enthusiasm for the cloud, one can be certain that this coming power block has truly embraced the cloud revolution. ### Customer intimacy or platform efficiency? Paul Bevan, Director of Surefire Excellence looks at the choice facing IT service providers: Build a volume-based platform or get close to the customer. The IT world has moved on... fast. If you hadn’t noticed, the development of the Cloud, the rise of "as-a-service" offerings, the promise of mobile and the Big Data challenge are driving some very different business models and customer engagement strategies. 10 years ago, maybe even 5 years ago, systems integration was the norm. IT vendors created tightly integrated solutions that encompassed hardware, software and services. Sure, there were telcos providing the network infrastructure and hardware vendors delivering fairly standard computer and networking equipment. But the value was in the integrated solutions. Marketing departments were focused on finding valuable and defensible niches, and then putting together “whole products” that appealed to well defined and easily identifiable audiences. Companies that could do it all, or at least knew how to partner and lead, dominated. As the world moved away from on-premise solutions to an "as-a-service" environment, new types of companies emerged focusing on specific elements within the technology stack. VARs and SIs particularly found themselves becoming disinter-mediated and losing their privileged customer relationships. The tight coupling of systems was slowly being replaced by a much more loosely coupled environment of IaaS, PaaS and SaaS providers. For the IaaS and PaaS providers the challenge is to win volume through effective product development and highly efficient operations. 
The vendor challenge has changed

For the IaaS and PaaS providers the challenge is to win volume through effective product development and highly efficient operations. Speed to market, industrialisation of processes and a relentless focus on ease of use and deployment are critical. For the SaaS providers and the more traditional SIs and enterprise IT players the challenge is to be more focused on delivering specific applications tailored to the needs of the end customer. Beyond the realm of genuinely enterprise-wide applications which apply to all verticals, of which there are very few, this will require a level of specialty and focus. Given that these applications will effectively be services, service integration will be the name of the game… and the service will be specific to the needs of the customer.

Marketing needs to think (and act) differently

In 2014 you won't be able to afford to lose focus. You won't be able to be all things to all people. Even IBM can't do that anymore. Inexorably, the IT world is polarising between those who will provide the underpinning platforms and those who will provide high-end, specific, application-aligned services. You will either have to have a very deep and intimate understanding of your customers' business, allied to an ability to build specific solutions, or you will need to learn the skills of manufacturing efficiency, channel development and volume product management and marketing. The impact on marketing is dramatic, and the role of marketing (not just marketing communications) is critical. If you try to do both you will succeed at neither. If you dip your foot in the water you'll just lose your foot and bleed out slowly. So which way will you look this year?

### Lunacloud’s CEO speaking at Cloud Computing

Lunacloud's CEO, António Miguel Ferreira, was invited to join the team of speakers at Cloud Expo Europe 2014, the most important event on the cloud computing industry in Europe, the Middle East and Africa, taking place at the ExCeL Exhibition Centre in London on 26th-27th February. António will talk about "successes and the misgivings of the European cloud ecosystem". This event is targeted towards IT decision makers of large, medium and small businesses in the public and private sectors, seeking to harness the power of cloud computing. Cloud Expo Europe 2014 includes a world-class conference programme involving over 300 world-class speakers and dozens of compelling case studies. The event also incorporates a leading international exhibition and technology solution innovations from more than 200 international suppliers.

About Lunacloud

Founded in 2011 by industry experts António Miguel Ferreira and Charles Nasser, Lunacloud is the latest pay-as-you-go cloud infrastructure services provider available to tech-savvy SMEs and start-ups as well as individual developers in Europe. Lunacloud is a pure-play cloud services provider focused on delivering reliable, elastic and low-cost cloud infrastructure (IaaS) and platform (PaaS) services, on which you can run your operating systems and applications, deploy your code or store your data. Lunacloud provides Cloud Servers, Cloud Storage and Cloud Jelastic services. Lunacloud has offices in the UK, Portugal, France and Spain, providing local support in English, Portuguese, French and Spanish, and will be expanding to countries outside Europe in 2014, including Russia, Brazil and Colombia.
### Intermedia introduces standalone email archiving

Channel gains access to new cloud service revenues as Intermedia introduces standalone Email Archiving. Now Intermedia partners can provide cloud-based email compliance and eDiscovery services wherever customers' email is hosted. As the world's largest one-stop shop for cloud IT services and business applications, Intermedia now offers Email Archiving as a standalone service, to provide partners with a new way to introduce their customers to the cloud. Intermedia partners can now help any customer to achieve compliance for archived emails that are hosted by Microsoft Office 365, as well as for on-premises or off-premises Microsoft Exchange, Google Mail, IBM Notes®, GroupWise, IMail, Scalix or Zimbra environments. By reselling standalone Email Archiving, Intermedia partners can target a much wider customer base, using Email Archiving as an introductory cloud service that can act as a precursor to future sales of Intermedia's integrated Office in the Cloud services. “We're committed to helping our partners grow their business cloud service revenues,” says Ed Macnair, Intermedia managing director, EMEA. “With standalone Email Archiving, our partners gain another way to capitalise on a high-growth market, while helping their customers to reduce business, legal, and compliance costs.” Many email providers do not offer email archiving that enables compliance with the UK's Data Protection Act (1998), Financial Conduct Authority rules and the Freedom of Information Act, and US regulations including ERPA, FRCP, SOC, SEC and HIPAA. This created strong market demand for Intermedia's Email Archiving to be offered as a standalone service, in addition to the version that launched as an integrated option for Intermedia's Hosted Exchange solution in Q4 2013. Following the launch of the standalone product, virtually any small or medium-sized business can easily and cost-effectively ensure that every single email – as well as more than 450 types of file attachments – is archived, searchable and immediately retrievable, forever. According to Osterman Research, businesses without email archiving can incur hundreds of thousands of dollars in costs if an employee deletes a critical email or if a lawsuit requires emails to be submitted as evidence within a short period of time. Intermedia's scalable and secure solution helps customers in legal, healthcare, financial and other regulated industries to preserve and discover critical information, mitigate litigation costs, and reduce disaster recovery and compliance risks. Intermedia's Email Archiving solution offers a number of unique benefits beyond preservation and compliance. It offers true linear scalability that protects customers from overpaying or under-provisioning, by allowing them to increase costs incrementally on a per-user basis. Additionally, its technology allows for rapid on-boarding as well as easy-to-use search-and-retrieve functionality. With eight data copies spread across multiple data centres, Intermedia stores twice the number of data copies compared to other solutions. As with all Intermedia services, partners manage Email Archiving from their Partner Portal: a single point of control that reduces the cost and complexity of managing multiple solutions, making partners more efficient and more profitable.

About Intermedia

Intermedia is the world's largest one-stop shop for cloud IT services and business applications.
Its Office in the Cloud™ delivers the essential services that SMBs need to do business – including hosted Exchange, Hosted PBX, SecuriSync file sync and share, security, mobility and more. All of Intermedia's services are pre-integrated for a seamless user experience. They are all managed via Intermedia's HostPilot® control panel, with just one login, one password, one bill and one source of support – creating tremendous cross-service efficiencies for both users and IT administrators. In addition, they all offer enterprise-grade security, 99.999% availability and 24/7 customer care to assure a worry-free experience. Founded in 1995, Intermedia was the first to offer hosted Microsoft Exchange. With 90,000 customers and 750,000 users, it is the world's largest provider of hosted Exchange outside of Microsoft itself. Intermedia also enables over 13,500 channel partners – including VARs, MSPs, telcos and cable companies – to sell cloud services under their own brand, with full control over billing, pricing and every other element of their customer relationships. Intermedia's 600 employees in 3 countries manage 10 data centres to power its Office in the Cloud – and to assure its famous worry-free experience. Learn more at Intermedia.co.uk. Contact Josie Herbert, Phiness PR, +44 (0)1252 400569, josie@phinesspr.co.uk

### PRISM, 7 months on - What do we know and what should we do?

Our understanding of government surveillance changed in June 2013 when The Guardian published the first revelations of the NSA's PRISM and GCHQ's Tempora snooping programmes. My post that same month – Help! NSA has my data – sought to introduce calm against a background of extensive blanket surveillance which has generally been greeted with alarm. Seven months is a long time in the world of cloud, so what do we know now? Data privacy has continued to be a hot topic, with some questioning whether we should have a European-only cloud or whether we should abandon cloud altogether. The extent of the snooping has taken many by surprise. The Guardian and others have disclosed the surveillance in detail and, while it's not necessary to repeat it all, here are some high (low?) lights:
- the US and UK snooped on foreign leaders at the 2009 G20 summit
- there have been accusations that Skype voluntarily joined the PRISM programme and that RSA introduced back doors to their products to facilitate surveillance (these accusations are denied)
- the NSA collected address books from Yahoo, Hotmail, Facebook and Gmail, seemingly without their knowledge or cooperation
- the NSA cracked mobile phone encryption and was alleged to be listening to the phone calls of Angela Merkel, the German Chancellor, with the Israeli prime minister and the EU Competition Commissioner also targeted
The reaction in many quarters has been furious.
Again, some highlights for me:
- the Russian government bought electric typewriters, reportedly as a result of discovering that it was being spied on
- the European Parliament's civil liberties committee says the activities of the NSA and GCHQ appear to be illegal, and has asked Snowden to submit to questioning
- the American Civil Liberties Union is pursuing a lawsuit against the NSA alleging that its spying activities are unconstitutional, and Kentucky senator Rand Paul is also about to bring a claim and is urging Americans to join the lawsuit
- two Californian state senators introduced a bill in an attempt to cut off the NSA's water supply, essential for computer cooling
- the European justice and rights commissioner, Viviane Reding, threatened to freeze the EU/US Safe Harbor scheme
- President Obama announced some reforms to the NSA
Here are my answers to questions which I've been asked since June last year:

1. Will the EU Data Protection Regulation stop NSA and GCHQ snooping on me?

No. The draft regulation is an attempt to harmonise the different approaches to data protection law across the EU. It has been heavily criticised and has undergone a plethora of amendments, and much disagreement still remains to be resolved before the regulation can be implemented. Regardless, article 2 of the current draft contains an exemption for national security, and it is highly likely that some form of exemption will be retained in the final draft.

2. Should we abandon the Safe Harbor scheme?

No. The EU/US Safe Harbor scheme was an attempt to protect EU citizens' data in the face of the lack of a US federal law providing similar protections. Don't forget that most US companies have denied actively participating in the PRISM programme, so scrapping Safe Harbor and preventing the flow of EU personal data to the US would punish those companies for the actions of the NSA. There's no doubt that hitting the profits of US companies would get the attention of the US government, but at the same time it could severely impair the growth of cloud in the EU, as so much of it is based in the US.

3. Should I abandon cloud?

No. The first road fatalities in the US and UK were in the 1890s, but this didn't lead to the banning of the car. 1.24 million people worldwide died of road traffic injuries in 2010 alone and yet we still continue to use cars. As with any new development there will always be negatives. The key is to establish proper guidelines and restrictions for surveillance by security agencies rather than abandon cloud completely.

4. Should security agency powers be curtailed?

Maybe. The debate continues. In his recent announcement of proposed reforms to the NSA, President Obama said that the NSA had been acting within its powers, but he recognised that Snowden's revelations had caused anxiety. The reforms are not as extensive as many were asking for and need further clarification. For example, the NSA will continue to have access to phone data but won't hold this itself; a third party yet to be identified will hold it instead and the NSA will access it when needed. Also, foreign citizens – including other world leaders – will enjoy the same protections as US citizens but will still be the subject of surveillance if necessary to uphold national security. In short, the NSA will continue to undertake surveillance, but with some adjustments.
This is clearly an ongoing discussion, but the NSA, GCHQ and other security agencies will all continue to undertake surveillance to some extent. In the meantime, this should not distract the average cloud provider and customer from getting on with their business.

5. Should the EU adopt its own cloud?

Yes, if you mean promoting a thriving European-based cloud to help businesses compete with the US-based cloud – and let's hope that's what the European Cloud Partnership and Cloud for Europe achieve. However, if you mean a state-sponsored scheme to build a European cloud to lock data in the EU and keep out US companies, then no, for the following reasons:
- I'm always wary of protectionist policies, particularly given Europe's history
- it must not act as Fortress Europe, as this would be contrary to the attempts at creating a global economy through international bodies such as the World Trade Organisation
- despite the rhetoric, bureaucrats and national governments are normally not the best examples of how to implement successful technology projects
It would also not overcome the reality of state surveillance. GCHQ's own surveillance programme, Tempora, is well known following Snowden's revelations. There have also been disclosures that Germany's Federal Intelligence Service has contributed to the NSA's data collection and that France's Directorate General for External Security has been intercepting and storing French telephone and Internet communications. And don't forget that, under the various mutual legal assistance treaties which national governments have signed, security agencies share information between them, including with the NSA.

6. Why don't the legislators act more quickly to help cloud?

Innovation happens faster than law-making. It has always been so. Typically, until specific laws are passed to regulate an innovation, judges will apply any relevant existing laws, meaning there may be over-regulation rather than under-regulation in the short term. It's important for legislators to strike a balance: regulating innovation to protect consumers without rushing laws through and stifling the innovation. This supposed lack of relevant law hasn't stopped US cloud developing. Nor are customers without adequate protection. Consumers are already well protected. Arguably it's SMEs who need laws to help redress the balance, and in the meantime they need to shop around. And read the contract terms before signing up - but a cloud lawyer would say that!

7. So, how do I prevent security agencies snooping on me?

- Avoid cloud altogether and buy electric typewriters to keep everything on paper, on premises. That sounds a bit extreme.
- Address the problem at source, perhaps by curtailing surveillance powers or through better scrutiny? Obama's announcement doesn't give me much comfort that much will truly change.
- Run certain activities in the cloud, but keep sensitive data out of the cloud.
- Encrypt (or tokenise) sensitive data before transmitting and storing it in the cloud.
Of course, these aren't foolproof. The NSA has apparently been using tiny radio devices to get access to offline computers. It can already crack some encryption algorithms. The NSA reforms won't prevent snooping but, at least for now, their activities will be more closely scrutinised.
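As a minimal illustration of the 'encrypt before it leaves your premises' option, here is a sketch using the Python cryptography library's Fernet recipe (symmetric, authenticated encryption). The key handling shown - a local file - is purely illustrative; in practice key management is the hard part and belongs in an HSM or key management service that you, not the cloud provider, control.

```python
from pathlib import Path
from cryptography.fernet import Fernet  # pip install cryptography

KEY_FILE = Path("tenant.key")  # illustrative only - keep real keys in an HSM/KMS you control

def load_or_create_key() -> bytes:
    if KEY_FILE.exists():
        return KEY_FILE.read_bytes()
    key = Fernet.generate_key()
    KEY_FILE.write_bytes(key)
    return key

def encrypt_for_cloud(plaintext: bytes) -> bytes:
    """Encrypt locally, so only ciphertext ever reaches the cloud provider."""
    return Fernet(load_or_create_key()).encrypt(plaintext)

def decrypt_from_cloud(ciphertext: bytes) -> bytes:
    """Decrypt locally after retrieval - the provider never sees the key."""
    return Fernet(load_or_create_key()).decrypt(ciphertext)

if __name__ == "__main__":
    blob = encrypt_for_cloud(b"sensitive customer record")
    # ... upload `blob` to whichever cloud storage service you use ...
    assert decrypt_from_cloud(blob) == b"sensitive customer record"
```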
Also, if all data is encrypted – not just at rest in data centres but in transit too – this will likely cause the NSA and other national security agencies to focus their resources on those targets who they genuinely believe are a threat to national security rather than the blanket approach they have up to now, as they can’t efficiently decrypt all data (yet). 8) Should I just stop panicking and carry on with my business? Yes. ### IBM commits $1.2 bn to expand global cloud IBM builds a massive network of local cloud hubs for businesses worldwide with 40 data centers across five continents. IBM has announced plans to commit over $1.2 billion to significantly expand its global cloud footprint. This investment includes a network of data centers designed to bring clients greater flexibility, transparency and control over how they manage their data, run their businesses and deploy their IT operations in the cloud. This year IBM plans to deliver cloud services from 40 data centers worldwide in 15 countries and five continents globally, including North America, South America, Europe, Asia and Australia. IBM will open 15 new centers worldwide adding to the existing global footprint of 13 global data centers from SoftLayer and 12 from IBM. Among the newest data centers to launch are China, Washington, D.C., Hong Kong, London, Japan, India, Canada, Mexico City and Dallas. With this announcement, IBM plans to have data centers in all major geographies and financial centers with plans to expand in the Middle East and Africa in 2015. By some estimates, the global cloud market is set to grow to $200 billion by 2020; driven largely by businesses and government agencies deploying cloud services to market, sell, develop products, manage their supply chain and transform their business practices. "IBM is continuing to invest in high growth areas," said Erich Clementi, senior vice president of IBM Global Technology Services. "Last year, IBM made a big investment adding the $2 billion acquisition of SoftLayer to its existing high value cloud portfolio. Today's announcement is another major step in driving a global expansion of IBM's cloud footprint and helping clients drive transformation." The new cloud investments IBM is making will provide business clients the ability to place and control their data globally. IBM SoftLayer gives clients the ability to choose a cloud environment and location that best suits their business needs and provides visibility and transparency to where data reside, control of data security and placement. IBM SoftLayer is able to deliver high performance services globally across the SoftLayer network. The combination of distributed local data centers and a global network allows clients to place data where it is required, when it is required as well as the ability to consolidate or aggregate data as needed. This provides optimized application performance and responsiveness. SoftLayer's unique network architecture allows clients to optimize global performance using a private network and not be subject to the uncontrolled nature of the public networks and the internet. Cloudant - a provider of distributed database-as-a-service (DBaaS) services: “Cloudant’s global expansion rate is fueled by the always-on commitment we make to our customers,” said Cloudant CEO Derek Schoettle. “Our mission is to be the standard data layer for Web and mobile applications. That mission requires us to push application data to the network edge, in as many locations as possible. 
Expanding beyond IBM SoftLayer's current footprint presents significant value to our business. The investment IBM is making to expand their global footprint will not only help fuel our growth, but the growth of thousands of Cloudant users worldwide as well.” In today's world of rapid response, with mobile and social data proliferation, this type of automation and speed of access to data, with high availability and control, makes IBM SoftLayer cloud infrastructure an ideal capability for business clients worldwide. "Cloud represents a growing area for venture capitalist investment," said Ann Winblad, co-founder and Managing Director of Hummer Winblad Venture Partners. "By investing in the cloud ecosystem, IBM not only makes it easier for enterprises to adopt cloud and drive innovation, but also helps new companies of all sizes get off the ground more quickly," said Winblad.

IBM SoftLayer Underpins IBM's Growing Cloud Portfolio

The acquisition of SoftLayer represents another major investment for IBM clients. Since its acquisition in 2013, IBM SoftLayer has served nearly 2,400 new cloud clients. In fact, IBM plans to establish SoftLayer as the foundation of its wide-ranging cloud portfolio. The SoftLayer infrastructure will provide a scalable, secure base for the global delivery of cloud services spanning IBM's extensive middleware and SaaS solutions. SoftLayer's flexibility and global network will also facilitate faster development, deployment and delivery of mobile, analytics and social solutions as clients adopt cloud as a delivery platform for IT operations and managing their business. Last week IBM made a significant investment and established the IBM Watson Group, a new business unit dedicated to the development and commercialization of cloud-delivered cognitive and Big Data innovations. As part of this initiative IBM will also deploy Watson on SoftLayer.

About IBM Cloud Computing

IBM is the global leader in cloud with an unmatched portfolio of open cloud solutions that help clients to think, build or tap into it. No other company has the ability to bring together unique industry knowledge and unmatched cloud capabilities that have already helped more than 30,000 clients around the world. Today, IBM has more than 100 cloud SaaS solutions, thousands of experts with deep industry knowledge helping clients transform, and a network of 40 data centers worldwide. Since 2007, IBM has invested more than $7 billion in 15 acquisitions to accelerate its cloud initiatives and build a high-value cloud portfolio. IBM holds 1,560 cloud patents focused on driving innovation, and for the 21st consecutive year IBM topped the annual list of US patent leaders. IBM processes more than 5.5 million client transactions daily through IBM's public cloud. IBM expects to achieve $7 billion in annual cloud revenue by 2015. For more information about cloud offerings from IBM, visit http://www.ibm.com/cloud. Follow us on Twitter at @IBMcloud and on our blog at http://www.thoughtsoncloud.com. Join the conversation #ibmcloud.

### Compliance: Data Storage in a Regulated World

Having cut my teeth within a regulated industry, this subject is close to me. I feel the need to ask a question of everyone, and I look forward to the answers. Why do we need regulatory compliance within technology? There are many industries that are regulated – financial, health, insurance, and accounting and tax planning, to name a few. Now, here lies the problem: each regulated industry requires a different set of rules according to its given regulator.
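Even a drastically simplified retention schedule shows how regulatory wording ends up encoded as data and logic. The record categories and periods in this sketch mirror the examples discussed below; real schedules are far more nuanced, so treat it purely as an illustration.

```python
from datetime import date
from typing import Optional

# Simplified schedule based on the examples discussed below; real schedules vary
# by regulator, activity and record type, so treat these values as placeholders.
RETENTION_YEARS = {
    "health_meeting_minutes": 30,
    "employers_liability_policy": 40,
    "mifid_business_record": 5,       # counted from when the relevant activity stopped
    "pension_transfer_record": None,  # keep indefinitely
}

def earliest_deletion_date(record_type: str, trigger: date) -> Optional[date]:
    """First date the record may be destroyed, or None if it must be kept forever."""
    years = RETENTION_YEARS[record_type]
    if years is None:
        return None
    try:
        return trigger.replace(year=trigger.year + years)
    except ValueError:  # the trigger date was 29 February
        return trigger.replace(year=trigger.year + years, day=28)

print(earliest_deletion_date("mifid_business_record", date(2014, 1, 31)))    # 2019-01-31
print(earliest_deletion_date("pension_transfer_record", date(2014, 1, 31)))  # None
```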
The need for a bridge between the technical understanding of business requirements and regulatory guidelines is very apparent. If anyone reading this article has read the FCA's (formerly the FSA's) handbook and tried to understand what IT governance is required, they will know what I mean. I will list out a couple of examples where compliance for data storage and retrieval differs vastly:
- Health – meeting and minutes details – must be held for a minimum of 30 years.
- Insurance – employers' liability policies – must be held for a minimum of 40 years.
These are just two low-level examples of data retention; now add ALL of the other considerations (and there are a lot) into the mix – data access, information security, business continuity, data protection laws, etc. – and you will soon see that the role of a CIO/CTO within these regulated firms is a difficult one, especially as the regulations change too. So, we have hordes of information that we need to store under our governing body – where do we store it? This creates another problem: who do we trust to store it effectively and for this length of time? Let's be honest, most technology firms cannot see past a five-year business plan, let alone 40 years. Then there is the format the data has been stored in: are we really naive enough to think that in 20+ years' time the data we stored initially can even be accessed? When I was working in the banking sector we had so many disparate systems it was crazy, and over a five-year plan we eventually standardised them onto one platform. However, we still had the same problem of catering for the eventuality of recalling data from an OS/2 Warp operating system from 10 years prior. Now consider the regulated financial world. This is a very complicated topic, and again the policies differ massively depending on what type of activities you conduct under the regulator's remit. For example: the length of time records should be kept depends on which type of business the records relate to. For MiFID (Markets in Financial Instruments Directive) business, records must be kept for at least 5 years after an individual has stopped carrying on an activity; for non-MiFID business it is 3 years after stopping the activity; and for a pension transfer specialist the records must be kept indefinitely. This includes email, files, databases and in fact any data that has been used for the business in question. There's also MiFID II on the horizon (2015), with even more significant changes set to be introduced. Now do you see the complications of this one topic (data storage) within IT governance? For me, conformity needs to stem from understanding. If you do not understand what you need to conform to, how on earth can you? A simple understanding of one rule of conformity – for example, that two years of data storage is required rather than five – could save you thousands of pounds and let you sleep at night. Imagine if you knew the rulings for ALL of your data storage requirements and had fine-tuned them to your infrastructure, or, better still, had spoken to someone who already understands them. There is one company I have spoken to recently whose approach to this challenge warrants particular merit, and which stands by a 100% guaranteed data restoration rate however long you store your data for: Arkivum. Arkivum's storage is based on the principle that three copies are needed for absolute certainty that data is safe.
So, using active integrity checking at all times, every one of your files is copied three times, with two copies held online in geographically separated data centres and the third held offline, locked away in an escrow service. (Figure: Arkivum's data archiving storage system.) It was very interesting for me to talk to a technology company and discuss compliance; they even have a dedicated compliance officer. Is this the future for IT companies – that they must have a better understanding of compliance rulings in regulated industries? With the ‘Internet of Things’ gathering momentum and even your domestic items being able to talk to each other (and maybe even talk about you to each other), let alone to the internet, my feeling is that the regulation of IT, especially cloud, is paramount and right up there with security. The only issue I have with regulations is that they sometimes stifle creativity and flexibility, but that's a whole new topic that I am sure we will discuss in the future. What are your thoughts on regulated IT and compliance?

### Gaming companies flock to IBM SoftLayer cloud

Gaming companies flock to IBM SoftLayer's cloud, adding to 130 million players worldwide. SoftLayer, an IBM Company, today announced that game development studio KUULUU and the world's largest online game server provider, Multiplay, are using IBM SoftLayer's cloud capabilities to power widely popular games such as Battlefield 4 and RECHARGE. KUULUU tapped into SoftLayer's cloud for higher performance and scale for RECHARGE, its newest game, created with Linkin Park, the most popular band on Facebook with 55 million followers, while Multiplay utilizes the SoftLayer cloud to support the mega title Battlefield 4. The global gaming market is estimated to total $111 billion by 2015 – driven largely by the increasing popularity of cloud gaming (online, streamed and downloaded games are estimated to have represented as much as $38B in revenue in 2012). By leveraging the cloud, built on open standards, to host and stream games, developers are able to provide users with uninterrupted, instant access to games across devices, with higher performance and the ability to scale easily. This allows games to be streamed directly from the cloud, rather than downloaded locally, freeing up storage space on user devices and making access to updates easier and more efficient. Since last spring, the number of active players relying on SoftLayer's cloud infrastructure has grown to 130 million. The company serves the whole ecosystem, from independent developers to game studios to publishers. By taking advantage of SoftLayer's cloud capabilities they are able to meet the demands of players around the world, who expect faster development cycles, no-lag game play, and a flawless overall user experience. SoftLayer's high-value cloud platform allows KUULUU and Multiplay to scale easily to meet dynamic and extensive workload requirements, with physical and virtual cloud servers available in real time on a massive global network.
Multiplay is one of the world's largest hosts of online game servers and is also home to one of Europe's biggest online gaming communities, with over seven million gamers playing on their servers every month. The company provides high-quality, affordable game servers for all major titles, including EA's Battlefield. For locations in Europe, the United States and Asia, they chose SoftLayer bare metal servers. Multiplay supports approximately 500,000 gamers on SoftLayer's IaaS platform, hosting over 60 gaming titles including Minecraft, Battlefield 4, DayZ, Starbound and Team Fortress 2. Over 100,000 peak concurrent gamers play every night on SoftLayer bare metal servers throughout the world. In the case of Battlefield 4, Multiplay was able to spin up and provision the IT resources required to support 25,000 new users in less than four hours while still being able to deliver a superior online experience. “Bare metal game servers are the best way to get a truly excellent online gaming experience. Utilizing the power of single tenant machines, a Multiplay game server will always be there, ready for you to start gaming. We work with some of the biggest names in the gaming industry to bring players the biggest and best online gaming titles,” said Will Lowther, Business Development Manager for Multiplay. “For Battlefield 4 we chose bare metal cloud solutions, provisioning them in locations all over the globe. With hardcore games the players expect an absolutely flawless experience, so we cannot allow any lag times or glitches. By using SoftLayer's platform coupled with the high-speed network, we give the game fans exactly the experience they want.” With the launch of the highly anticipated RECHARGE, KUULUU needed a high-performance and scalable service to support the game. Together with Linkin Park, they created a third-person 3-D online experience that combines puzzle, adventure and action elements. The game supports the Music For Relief Power the World campaign by introducing players to real-world clean energy solutions in the game. A big percentage of Linkin Park's audience are gamers, and as the biggest band on Facebook with over 55 million followers, using gaming as a vehicle gets to those fans on a personal level. “We knew that producing and running RECHARGE as a Facebook mid-core game would be especially resource-intensive. SoftLayer gave us all the performance and bandwidth that we need in an extremely flexible way that suits our industry and our product requirements,” says Florian Juergs, CEO of KUULUU. “On our side, we can do everything possible to develop an amazing game experience, but you do need partners that can support it. SoftLayer is that partner.” In providing KUULUU with an infrastructure that was resilient enough to withstand the constant demands of beta testing, launch, daily play, or update downloads, SoftLayer enabled KUULUU to concentrate on its core mission.

About SoftLayer, an IBM Company
 SoftLayer, an IBM Company, operates a global cloud infrastructure platform built for Internet scale. With 100,000 devices under management, 13 data centers in the United States, Asia and Europe and a global footprint of network points of presence, SoftLayer provides Infrastructure-as-a-Service to leading-edge customers ranging from Web startups to global enterprises. SoftLayer’s modular architecture provides unparalleled performance and control, with a full-featured API and sophisticated automation controlling a flexible unified platform that seamlessly spans physical and virtual devices, and a worldwide network for secure, low-latency communications. For more information, please visit softlayer.com. About IBM Cloud Computing IBM is the global leader in cloud with an unmatched portfolio of open cloud solutions that help clients build, rent or tap into cloud capabilities. IBM could support 30 percent more top-level websites than any other cloud computing provider. Among the Fortune 500, 24 of the top 25 companies rely on IBM cloud computing. No other company has the ability to bring together unique industry knowledge and unmatched cloud capabilities, that have already helped more than 20,000 clients around the world. Today, IBM has more than 100 cloud SaaS solutions, 37,000 experts with deep industry knowledge helping clients transform and a network of more than 25 global cloud delivery centers. Since 2007, IBM has invested more than $6 billion in acquisitions to accelerate its cloud initiatives. For more information about cloud offerings from IBM, visit http://www.ibm.com/smartcloud. Follow us on Twitter at @IBMcloud and on our blog at http://www.thoughtsoncloud.com. Join the conversation #ibmcloud. ### Are Enterprise App Stores a vital ingredient for any successful BYOD strategy? Rick Delgado's recent Compare the Cloud article about a subject gaining a lot of traction at the moment, Bring Your Own Device (BYOD), raised some interesting questions about the realities of implementing such a policy. Just where do the personal and the professional overlap? And what is the best way to police this border territory? Could Enterprise App Stores be a way of managing the overlap? And, if so, who stands to gain? The "Store-Ship Enterprise" A recent Gartner report predicted that by 2017 25 per cent of enterprises would have Enterprise App Stores where workers can browse and download apps to their computers and mobile devices. It is argued that they offer a cost-effective and faster way of distributing Apps to end users and, crucially, of managing the Apps that users chose to install on their devices. Could Enterprise App Stores be a way of managing the overlap? And, if so, who stands to gain? It is this second point that could make an Enterprise App Store attractive to any organisation considering a BYOD policy. By encouraging end users to stick to a narrow range of approved Apps, it shifts some control back into the hands of IT. If successfully implemented and widely adopted, it could go some way to alleviating the dual risks that a BYOD policy may represent. Namely: what are the users installing and taking in to the organisation? And: what are they taking out? The Perception of Greater Risk By encouraging end users to stick to a narrow range of approved Apps, it shifts some control back into the hands of IT. 
Of course, these risks exist for any organisation, and in some ways, a BYOD strategy requires nothing more than the management of a fleet of mobile devices – something any organisation is doing already. The difference, of course, lies in the fact this fleet is no longer homogeneous; it consists of a huge range of devices, platforms, OS, and software all running all kinds of version numbers. Plus, because the devices are seen as personal devices by the end user, end users are much more willing and likely to install Apps that fall outside corporate guidelines and policies. A report conducted by Frost & Sullivan for McAfee, recently reported in The Register, found that more than 80% of surveyed staff (in both IT and line of business) were using at least one non-approved SaaS application. The authors suggest it is likely that more than a third of all software within organisations has been installed and used without IT oversight. It is the fear of this tide of 'shadow IT' that underwrites much of the business nervousness about BYOD strategies. And yet the one thing this survey makes clear is that this isn't a BYOD issue. Shadow IT is happening anyway; with or without BYOD policies being formally in place. My experience tells me that, in fact, it isn't just shadow IT that is happening anyway, without a formal BYOD strategy being in place: BYOD is happening anyway. Canute-like Strategies Resisting BYOD is a little like setting one's throne at the water's edge. Organisations need to manage the complexities of this developing reality, which is set to be further complicated as wearable devices take off. Given this complexity and the 'on the ground' reality, Mobile Device Management (MDM) systems begin to make sense. And an Enterprise App Store is the carrot accompanying the MDM stick. Using an Enterprise App Store within a BYOD strategy is not so much about stopping the horse from bolting, but rather offering Dobbin some nice green pasture in the hope he doesn't make a break for it and take some of the fence with him. The Audacity of Hope The success of Apple's App Store and Google Play has encouraged non-IT people to install and manage applications on their devices in a way most people couldn't have predicted 10 years ago. This does present issues for IT, but using the very tools that are driving these risks to restrict those same risks seems like a sensible plan. The Enterprise App Store offers an opportunity for IT to approve and package up tools for BYOD end users in a way that both enables them to ensure that the Apps are secure and managed and offers an opportunity to manage licensing and purchasing agreements in a more coherent fashion. Staff have clear guidelines about which tools they should be using for which task; enabling better collaboration within the organisation and (hopefully) ensuring they comply with organisational standards. 'Hopefully' is the key word here: because, while a good Enterprise App Store will enable end users to do everything, it doesn't prevent them doing anything. The Threat to Traditional Software Vendors The key to success will be populating the App Stores with the tools that end users want to use so that they are encouraged to use the Enterprise App Store route, and refreshing content regularly enough to prevent them straying. This means populating the App Stores not only with Enterprise solutions, but also the approved SaaS solutions with which users are familiar; everything they need to do their job. 
It seems such a pragmatic approach that Gartner's prediction of 25 per cent looks low to me. The fast implementation times make Cloud-based management tools and Enterprise App Stores an attractive proposition. But the biggest winner from this seemingly inexorable shift has to be the SaaS vendors. While 'thin client' access to the Enterprise systems might be one solution, it is likely that organisations will be tempted more than ever to look towards the SaaS model. ### Janet awards framework to Arkivum Janet awards framework to Arkivum to provide fast and cost-effective archiving for research and education. Janet, part of the Jisc Group, is today announcing a framework agreement with Arkivum Ltd for the provision of “Data Archive to Tape” as a service to provide fast and cost-effective archiving for research and education. The Agreement covers purchases of Arkivum’s digital archiving service by Higher Education institutions, Further Education and Specialist Colleges and Research Council establishments in the UK, and by any other bodies whose core purpose is the support or advancement of further or higher education or research. It allows all qualifying organisations to procure the archive service quickly without the administrative overhead and costs of an EU compliant competitive tender. Dan Perry, Director of Product and Marketing at Janet, said, “Long term cost effective data storage is increasingly a key issue for research and education organisations and to support our sector we have run a competitive OJEU compliant procurement. The outcome is the framework has been awarded to Arkivum. In addition to demonstrating a good understanding of the challenges facing organisations from the data bonanza, one of the many benefits of the Arkivum archiving service is the fast implementation and speed to ‘go live’.” Jim Cook, CEO of Arkivum, commented, “We are delighted to have this framework agreement in place with Janet. Based on this agreement, institutions will be able to make the most of a well understood and used process, and know that they are getting competitive and pre-negotiated education prices based on the total size of the community and not just the size of their particular institution.” About Janet Janet, part of the Jisc group, has the primary aim of providing and developing a network infrastructure and services that meet the needs of research and education communities in the UK. Janet manages the operation and development of the Janet network and related services on behalf of Jisc, the UK’s expert on digital technology for education and research. Its work is guided by its funders, owners (AoC, GuildHE and UUK) and trustees. For more information, visit www.ja.net. About Arkivum Arkivum specialises in the management and storage of an organisation's information assets. Arkivum delivers systems that can intelligently manage content to efficiently store and retrieve data over the long term while offering a highly cost effective solution, with low up-front investment and zero risk. Arkivum was formed as a spin-out from the University of Southampton. With a world-class reputation in the field of digital preservation, the University has been working for over a decade to develop best practice for the safe keeping of digital data over the long term. 
With on-going links to the University, Arkivum has direct access to state-of-the-art research and this is complemented with a team that has in-depth experience in datacentre operations and storage system implementation. Arkivum provides a completely transparent data archiving service and its approach to data safety and security is simple; it keeps multiple copies of customers’ data in secure UK data centres and actively manages its integrity to ensure it remains in bit-perfect condition all the time. Arkivum relies on proven storage technology and open standards to deliver fast and efficient online access. The company’s unique solution is the only system available on the market which guarantees 100% data integrity. More information can be found at http://www.arkivum.com or on Arkivum's profile page. ### Virtus adds V Range to its Intelligent Data Centre portfolio London, 14 January 2014 – Virtus Data Centres, London’s flexible and efficient data centre specialists, announced today that it has launched its V range; a portfolio of services designed to further enhance its existing range of flexible and efficient data centre solutions. The Virtus Intelligent Data Centre portfolio, which currently includes the industry-leading CoLo-On-Demand and Connectivity-On-Demand services, has been extended to incorporate the new V-Rack, V-Pod and V-Suite range of solutions with Flex, Enterprise and On Demand options available. These products enable a customer to choose the IT solution that best suits their business requirements, ensuring they are getting absolute flexibility and the lowest TCO possible without compromising on quality of design and implementation. The full range of services has been specifically designed to help customers meet challenges such as delivering 100% uptime for production, matching data centre costs to ever-changing requirements, testing new markets without long term commitment, and allowing accurate monitoring of equipment to aid with agile resource re-allocation in line with business fluctuations. The V products can be scaled up or scaled down at Rack, Pod or even Suite level, allowing customers to pick and choose from a wide range of products and services including ‘pay as you use’ or ‘pay as you grow’ models with contract lengths varying from as little as a day to a decade. Neil Cresswell, CEO of Virtus Data Centres, commented: “At Virtus we are transforming the data centre landscape. We see ourselves as an IT services company first, helping our customers run and grow their business with innovative and reliable technology solutions based on our flexible and efficient data centre solutions. We offer the flexibility to handle different application workloads that drive different system requirements with different technical and commercial data centre solutions in the same data centre or data centre portfolio.” Neil Cresswell added: “We caused a bit of a stir with our CoLo-On-Demand service six months ago, then supplemented that with Connectivity-On-Demand. Recently we have been working on extending that market-leading flexibility to the rest of our range, so that we are able to deliver to our customers the same flexible solutions regardless of the IT budget, the size of the deployment or the length of the contract.” For more information on the Virtus Intelligent Data Centre portfolio please contact info@Virtusdcs.com. For more information on Virtus, visit their profile page here or website here. 
### Microsoft link-up creates China cloud syndrome By Christopher Morris, Tech Commentator China has been rather slow to adopt the cloud given the fact that they, firstly, have a reputation for embracing technology, and, secondly, are set to become a massive economic powerhouse in the coming years and decades. Meanwhile the United States, still by some distance the world’s largest economy, is already fully on board with the cloud revolution, and cloud computing now underpins such commonly used technology as Google Drive, iTunes and DropBox. But the attitude of the world’s second largest economy to cloud computing is steadily changing, and there is an increasing indication that the big corporate players who will decide the future of this technology are investing in the Far Eastern nation. “A new chapter of cloud is going to be written by a new ecosystem in China's market, and Microsoft will be the leader of this disruption.” - Forrester Microsoft have already announced last year that it is partnering with 21 Vianet, a company based in China, to sell emerging cloud technologies to Chinese consumers. 
And in response to its apparent economic and technological travails, IBM has also made it clear that it is pursuing a similar joint venture. Microsoft has been particularly enthusiastic about the future of the cloud in China, though, describing their link-up with 21 Vianet as “the start of a new era”. It has also received a positive response from the analyst community, with research company Forrester predicting that “a new chapter of cloud is going to be written by a new ecosystem in China's market, and Microsoft will be the leader of this disruption.” This was just one example of significant cloud news in China, but it seems to have been the straw that broke the camel’s back. At the tail end of 2013, ABB, a Swiss firm involved in power and automation technology, announced that it had won an order to supply fifty-four 10 kilovolt dry-type transformers for an R&D and data storage centre in Tianjin, China. This centre is owned by Tencent, one of China’s largest internet service providers. This has been necessary because Tencent is significantly expanding its cloud computing and R&D capabilities. The firm has already built and established data centres using environmentally-friendly and energy efficient technologies, with two in particular based in Shanghai and Shenzhen. With the company growing significantly in northern China, the new datacentre is an indication of Tencent’s intent to build serious cloud infrastructure within the nation. While this is another indication of the move within China to develop a cloud-based infrastructure, there are many others which have recently appeared on the horizon. Barely a week goes by nowadays without a new Chinese cloud arrangement being announced, and it is evident that the sluggish adoption of this technology by the Chinese establishment is well and truly a thing of the past. Earlier this week, German business software maker SAP AG and China Telecom announced a strategic partnership in cloud computing. As a result of this, the SAP Cloud portfolio will be offered to organisations throughout China by the China Datacom Corporation Limited (CDC). CDC is a joint venture between SAP and China Communication Services (CCS), a subsidiary of the China Telecom Group, and is a significant part of the telecommunications infrastructure in China. The partnership was taken as an indication that CDC will be looking to offer considerable support for the cloud in the coming years. This news followed on from an announcement that China has recently set up a cloud computing industry alliance. This alliance is based in Beijing with the aim of promoting the development and innovation of information technology, with a particular focus on cloud technology. This was the first of its kind in China, being jointly established by Tsinghua and Peking Universities and the Center for International Economic and Technological Cooperation, which operates under the Ministry of Industry and Information Technology in China. When one puts these snippets of news together, all of which are occurring within a timescale of just a couple of months, it is quite clear that China is moving forward rapidly with the adoption of the cloud. 
This must be seen as significant given the country’s increasingly prominent economic position, and signals that the world’s two largest economies are fully embracing this technology. If anyone doubted the coming prominence of cloud computing, it’s time to change your opinion. ### Busting the Myths of the Data Centre – Addressing Power Efficiency in Storage By Gavin McLaughlin, Solutions Development Director at X-IO Storage In all walks of life, not just the IT industry, history is littered with common misconceptions that, through deliberate misdirection, false marketing or just plain old hearsay, become common “facts” that just simply aren't true. One of the latest doing the rounds is the almost hysterical shouting from the rooftops by flash memory (often referred to as “AFA” – All Flash Array) vendors that solid state drives or flash modules use much less power than traditional hard disk drives. Whilst this may be true of some disk drives and some flash modules, it's not right to make a general statement, because it's not factually accurate. When it comes to data centre power consumption, which is increasingly in the spotlight, the marketing noise from many AFA vendors is particularly loud. It can be easy to be seduced by marketing statements from some storage vendors such as “requires only 10W per Terabyte” and perceive the product to be a highly efficient array. It's only when you delve deeper into the unit's architecture and find that a 10TB array has two 2.5kW power supplies that you may smell a rat. This is because they're talking about the raw components rather than the entire storage solution. 
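To make the scale of that discrepancy concrete, here is a rough back-of-the-envelope check in Python. It simply re-uses the illustrative figures quoted above (a "10W per terabyte" marketing claim set against a 10TB array fitted with two 2.5kW power supplies); the numbers are the article's examples rather than measurements of any particular product.

```python
# Back-of-the-envelope check of a "watts per terabyte" marketing claim
# against the power provisioned for the whole array.
# Figures are the illustrative ones from the article, not real measurements.

claimed_w_per_tb = 10      # vendor claim: 10 W per terabyte (raw media only)
capacity_tb = 10           # example array capacity
psu_count = 2              # example: two power supplies fitted
psu_rating_w = 2500        # each rated at 2.5 kW

claimed_system_w = claimed_w_per_tb * capacity_tb   # 100 W if the claim covered the whole box
provisioned_w = psu_count * psu_rating_w            # 5,000 W of fitted supply

print(f"Claimed draw implied by the marketing figure: {claimed_system_w} W")
print(f"Power supplies fitted to the system:          {provisioned_w} W")
print(f"Provisioned power versus the claim:           {provisioned_w / claimed_system_w:.0f}x")
```

Provisioned PSU capacity is of course not the same as steady-state draw (supplies are usually redundant and rarely run at their rating), but a 50x gap between the per-terabyte figure and the fitted supplies is exactly the sort of discrepancy that should prompt questions about controller, cache and cooling overheads.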
Whilst in some cases (but not all) flash memory needs less power than hard disk drives, you still need to drive it using reasonably meaty processors and some traditional cache memory to help accelerate writes. These all need power, and when you couple these together and then add some not necessarily well-written, efficient code, you can suddenly realise that the marketing statement was analogous to looking at the fuel usage of a domestic boiler when it's merely running just the hot water and not the full central heating. When used carefully and correctly, hard disk drives can actually be more power efficient, more reliable and even better performing than flash memory. The truth is that in order to address many of today's challenges faced by organisations requiring enterprise storage, a “right tool for the job” approach is necessary. Hard disk drives are great for sequential workloads, for example, whereas flash memory is great for smoothing out the bumps in a random workload. Only by carefully fusing both enterprise grade hard disk drives and flash memory do you realise the best of both worlds and become able to deploy true hybrid storage. Solutions that enable customers to deploy enterprise grade, real-time tiered hybrid storage are rare, but they do exist on the market. Using this sort of enterprise solution enables the performance boost of flash but at the more efficient price point of hard disk drive arrays. The real benefit though, for organisations concerned about power efficiency, is that the power and cooling requirements are actually much lower than not only all flash arrays but also traditional enterprise storage. It's clear that with the right guidance and using the right tools, businesses can learn from enterprises that are making the most of the power in a datacentre, whilst not compromising on performance and reliability. With all the marketing noise in the storage industry right now, it's very easy to believe some of the misconceptions being thrown around. Don't be fooled by them. Instead, focus on digging deeper than just the marketing blurb. Find out the real facts and don't compromise on performance. Otherwise you could end up finding out the hard way. Gavin McLaughlin is Solutions Development Director at X-IO Storage, based out of X-IO's London office and covering the EMEA region. X-IO Technologies is a recognised innovator in the data storage industry due to its award-winning, cost effective, high performance and self-healing Intelligent Storage systems, including flash-enabled hybrid storage. 
### 2014: Year of the OpenStack Ecosystem Jeanne Le Garrec, Sales & Channel Manager at Hedera Technology, says 2014 will be the year that OpenStack finally meets the requirements of the Enterprise. After eight releases, OpenStack is a buzzy subject in the cloud community. Initiated by Rackspace and NASA in 2010, it is today backed by the biggest IT companies such as IBM, HP, AT&T and VMware. Companies like Deutsche Bank and PayPal have built production clouds based on OpenStack, giving the community its first business cases and references. OpenStack is growing fast. The last gathering in Hong Kong saw more than 3,000 attendees, two-thirds of whom were first-timers. 
All these figures may seem impressive and promising for the project, and they are, but one fact makes the situation somewhat gloomier: the number of production deployments doesn’t meet the cloud community's expectations. As a Cloud Management Platform (CMP) provider, we at Hedera have close contact with our clients and are continuously trying to solve their issues. We would like to share with you some feedback and our vision of OpenStack. OpenStack is perceived by analysts as the Linux cloud revolution and offers IT departments a low-cost IaaS with ultra-scalable features. OpenStack's success is illustrated by the gathering of big and more modest sized companies in the community and their adoption of the project. But after all the hype OpenStack generated, we have recently seen various articles questioning the level of interest within enterprises. How did such a popular project become a "wait and see" solution? What often comes back in the different analyses is that OpenStack is "not a solution ready for the enterprise environment". For hosting companies or service providers with big scalability challenges the solution fits well, but it is more complicated to integrate it into a large company with strict processes. Analysts indicate that OpenStack is not a cloud, or at least it's not a “cloud in a box” solution. If the real OpenStack competitor remains Amazon, it goes without saying that there is a long way to go before it can compete on the same level. The Chief Engineer of Cloud at eBay Marketplaces is one of these critics, quoted as saying: "Though the community did a nice job at putting together this software, an instance of an OpenStack installation does not make a cloud. As an operator you will be dealing with many additional activities not all of which users see. These include infra-onboarding, bootstrapping, remediation, config management, patching, packaging, upgrades, high availability…" Gartner analyst Alessandro Perilli wrote another much-discussed article, in which he sets out several reasons why vendors can’t sell OpenStack to enterprises. The main ones are: the lack of clarity about what OpenStack does and doesn’t do; the lack of transparency about the business model of and around OpenStack; the lack of vision and long-term differentiation; and the lack of pragmatism. The lack of clarity that surrounds OpenStack makes it difficult for enterprises to understand exactly what OpenStack is. The fast cycle of new releases certainly doesn’t help either. Understanding a project involving more than 13,000 people across more than 130 countries is hard enough, and it becomes even harder when every six months new side projects are integrated into OpenStack. We have been talking with service providers and enterprise IT managers for a while now, especially in France where we are based. The first remark we can make is that OpenStack is pretty much a new "thing" in France. Eight months ago, when we were calling prospects and telling them about OpenStack, their answer was "what is OpenStack?". That has recently changed in a drastic manner. Today, enterprises know what OpenStack is, or at least they are curious about it. The reason they started to show interest in OpenStack is simple: they understand the value proposition of reduced costs and no vendor lock-in. Most of the prospects we talk to find these two arguments compelling. The most advanced of them, who have started to deploy OpenStack, realize that it's not a simple task. 
At the beginning of December, during OpenStack in Action 4, we recall one IT manager telling us his OpenStack installation story and how difficult it was, even for an experienced sysadmin like him. The next step for mass enterprise adoption will be for enterprises to accept that they will need change management in order to deploy and manage OpenStack. To help address the need for a rapid and painless OpenStack deployment we have built pimpmystack.net, a free way to deploy and test OpenStack on a dedicated infrastructure. However, some of the companies we talk to are not ready to part with their old habits and inert technologies. What we are sure about is that they are aware of the usage gap between their existing tools and OpenStack. It is hard to make accurate forecasts on what the OpenStack ecosystem will look like in a few years, but there are some trends we are observing and following. We think more and more OpenStack 'bundles' will appear to cross the chasm between users' needs and out-of-the-box OpenStack. These bundles' features will be more adapted to sys admins and should prevent solutions like CloudStack or SCVMM from getting more sys admin market share. Companies providing software bundles with OpenStack, such as SUSE, Red Hat, HP or Canonical, are mainly the ones implementing them in a production environment. We think that more and more consulting firms will challenge these legacy providers in the integration market. We believe that OpenStack is a major project and we have no doubt that it is the future of the cloud. 2014 will be the year OpenStack enters the enterprise market in a big way, and 'the gap' we have identified will be bridged by third party solutions. Companies will need to complete their OpenStack deployment with automation and orchestration solutions, like CMPs. These products ensure fast and easy deployments and a complete, efficient integration with existing IT processes, and enable companies to embrace a solution to manage their private and (eventually) hybrid cloud environments. A whole new ecosystem is there to be built; don’t be afraid, be excited. ### 6 Things to Consider Before You Switch to BYOD Rick Delgado looks at BYOD and its various technical and human challenges - and all is not always quite what it seems. With more and more people in possession of mobile smart devices, it seems logical that many companies are beginning to implement BYOD policies in the office. What is BYOD? Well, it stands for "Bring Your Own Device", and it basically amounts to having employees supply their own smart devices and possibly other hardware as well. Rather than having the company pay for the smartphones and tablets used by the workforce, a policy is drafted that outlines the rules and regulations for employees to be able to access company data on their own personal equipment. Of course, it isn't quite as simple as all of that. Here are a few things that business leaders should consider before they jump headlong into converting their companies into BYOD workplaces: 1. Cost The biggest draw for most companies when they consider a BYOD policy is the resultant cost reduction. After all, why purchase a $500 mobile smart device when the employee already has one in his or her pocket? 
At the same time, removing company-owned devices from the equation frees IT departments from having to deal with basic maintenance and repair (which would all be handled by the employee), letting them focus on wider-reaching issues. However, this isn't to say that BYOD is free. The company may be expected to cover a portion of the monthly bill, as well as pay for increased data use. Still, the increased cost for the employee might make certain workers unhappy. 2. Productivity For some reason, employees seem to be more productive when they are able to work on their own devices. Part of it has to do with being more comfortable and skilled when using one’s own device. Thus, the employee will be able to work faster than would be possible on a less-familiar system. At the same time, by making a personal smart device also a company device, employees will be more likely to have the device with them at all times; not simply while they’re at the office. This means that they’re more likely to do additional work at home, and are easier to contact. Of course, the downside is that as personal and professional lives begin to intersect, there will also be employees who use their devices for non-business activities during work hours. Still, the overall gain amounts to an average of 37 additional minutes of productivity per employee per week, which really adds up fast. 3. Employee morale This is a tough one. On the one hand, most of the studies that have been released claim that employees who are able to use their own devices are happier at work. However, many of these studies seem to focus rather on allowing employees to use their personal devices, and not on forcing them to. In fact, a recent study by the analysis firm IDC has concluded that the majority of employees would rather have a company device than be expected to do company work on personal equipment. Businesses should get a feel for what their employees really want before switching to BYOD in an attempt to improve morale. 4. Acceptable use People like to have freedom, especially with their personal devices. Normally this isn’t an issue. However, when a personal device becomes a company device, that freedom can cause serious problems. After all, the company doesn’t technically own the device, so how can they dictate what is appropriate use and what is not? This is an issue that needs to be considered by any business that is considering a BYOD policy. 5. Security This may be the biggest concern among the bunch. When data is contained on a corporate server, or within a protected network, then it is relatively safe from attackers and theft. However, once that information is accessed by a remote mobile device, then the safety controls that the business had over that data are lost. With a legion of various possible devices in a BYOD environment, it’s just not possible to integrate a single, overarching security policy that doesn't also violate an employee’s personal rights. And when an employee leaves, how can your company be sure that the device that he or she takes along isn't full of sensitive corporate data? 6. 
Availability Although the trend is certainly leaning in this direction, the truth is that not every employee in the workforce has access to a personal smart device. By expecting workers to supply their own devices, an organization may leave many at a disadvantage, and could even open itself up to allegations of employee discrimination. Business leaders would be smart to review these topics when considering switching to BYOD. Obviously there are great advantages; there are, however, some aspects that may not be the best fit for your particular company just yet. ### What are hosted desktops? A hosted desktop (often also referred to as Desktop as a Service or DaaS) – much like the name suggests – is a desktop that is hosted in the cloud. In essence, a hosted desktop combines multiple cloud offerings such as email, applications and data storage into one solution, providing similar functionalities and capabilities. A hosted desktop looks no different from a traditional, physical desktop; the difference between the two is the location where apps, data and email are stored. A physical desktop accesses and stores data, apps and email on the actual device. With a hosted desktop, everything is stored in a secure data centre and the device merely serves as a medium to display images of what is accessed. Why use a Hosted Desktop? Email, apps and data – all in one place, accessible anywhere Hosted desktops might look like physical desktops; however, they offer the advantage of being accessible anywhere, on any device. This means that users don’t have to individually transfer files or install applications on new devices; all they need is an internet connection to log on to their desktop. Remote working is available to all users as they can just log on at home, in a coffee shop or in another office and have all applications, email and files available to them. In addition, compatibility issues are eliminated and the risk of physically transporting files from one location to another is removed. One unified solution Any business can profit from adopting cloud solutions, but things can become overwhelming when a company utilises multiple cloud offerings from various providers, be it storage, accounting software, backups or MS Office. A hosted desktop combines these offerings into one, meaning that there is one single provider and therefore one single point of contact for a business, regardless of whether this involves email, specific apps or data. Data storage and security are taken care of automatically, and regular back-ups, software updates and disaster recovery are also part of the solution. Security One of the main benefits of hosted desktops is the high level of security they provide. Security does not depend on individual devices any more – the desktops are hosted in a secure infrastructure and data, apps and email are held in multiple, secure data centres. Similar to keeping software up to date, the hosted desktop provider also automatically performs security updates for the entire system and all desktops. Furthermore, data is automatically encrypted and, where available, Multi-Factor Authentication ensures that files are kept safe. Seeing as nothing is stored on the actual device, critical company data would not be compromised if the device is lost, stolen or breaks down. 
Data backups and disaster recovery Rather than having to manually back up data and store these backups securely, files that are located on the hosted desktop are backed up automatically and regularly by the provider, taking the burden away from the business. Similarly, since backups are stored in multiple data centres, even an office disaster would not cause the loss of files. Due to the accessibility of the hosted desktop, employees could log on from alternative devices and continue working from home or in another location. BYOD proof Another benefit of hosted desktops is that the solution is fully Bring Your Own Device (BYOD) proof. Since the device would only be used to access a user’s desktop remotely via the internet without any data or apps being stored on the device itself, the risk of data loss or data theft is minimised. Furthermore, users have access to the same software and applications they use on their work desktop, removing compatibility problems and the need to update software regularly. ### Apache CloudStack and the cloud API wars By David Nalley, Apache CloudStack Committer The cloud API wars have continued unabated in 2013, with many flare-ups. Apache CloudStack hasn't been a huge partisan in these wars; but we still consider APIs very important. What is this 'Apache CloudStack' you speak of? You haven't heard about Apache CloudStack? Perhaps it's not surprising then that it's often been called the 'best kept secret in cloud computing'. Apache CloudStack is both project and product; it's a top level project at the inimitable Apache Software Foundation with the purpose of producing software for deploying public and private Infrastructure-as-a-Service clouds. This means you can pool storage, network, and compute resources and allow users to provision and consume resources on-demand; while at the same time ensuring isolation between users' resources. Despite not having a huge hype following, there are a number of prolific deployments that have been publicly acknowledged. Why are APIs so important to Apache CloudStack or any other IaaS platform? APIs form the core communication method between consumers and the platform itself. While virtually every IaaS platform has some type of nice web GUI to interact with, it's just not efficient. Moreover, such UIs typically are designed for humans. What if your monitoring system or continuous integration system needs to interact with your IaaS platform? In short, APIs are how work actually gets done in an IaaS cloud platform. Apache CloudStack GUI: Pretty, but not where the real work gets done! APIs are great, right, so no problems? Well, except that there are a plethora of APIs out there; and while automation and your own sanity demand that you have and use an API, which one is best? Does it matter? It's important to realize that when most people talk about cloud APIs they generally are referring to user APIs, e.g. the APIs users make use of to provision, deprovision, or otherwise interact with the cloud platform. There is a separate (sometimes hidden) administrative API that the cloud provider uses for controlling the cloud itself. The Standards Naturally most folks want to use some 'standard'. 
The IT industry certainly has a glut of standardization bodies, so finding and adopting a cloud API standard should be a no-brainer, right? Sadly it's a bit more complicated than that. There are a couple of cloud API standards; most notably CIMI (Cloud Infrastructure Management Interface) from the folks at DMTF and OCCI (Open Cloud Computing Interface) from the Open Grid Forum. There are really two problems with these standards. The first problem is that neither has massive adoption, which in turn means that the number of libraries and third-party tools that support these APIs is pretty small. In many ways this defeats the purpose of using a single, widely adopted, standard. The second issue is that because of their wide applicability the APIs are pretty generic. Turning on or off a virtual machine is easy, but if you want to use the latest coolest feature of a specific IaaS platform it might not be exposed at all by CIMI or OCCI. In some ways this is a good thing; it enforces adherence to core functionality that should work on virtually any platform. This means you mitigate some of the risk of lock-in to a specific platform. On the other hand it might mean artificially limiting ourselves from cool, innovative technology. The de facto standard As in most emerging technology, there are de facto standards in the cloud API arena. In the IaaS realm that is the Amazon Web Services API. Virtually every cloud-y tool or library will at least work with the AWS API. The corollary to this is that virtually every IaaS has some measure of AWS API compatibility. Perhaps that makes it the real standard. It certainly has the adoption and ecosystem around it. But it's sadly no panacea and does have a few difficulties. Unlike CIMI and OCCI, which are designed to be more generic by nature, the AWS API is really designed to be implementation specific – to Amazon's implementation. The challenge here is that in addition to ingesting requests and parroting back the proper commands, there are plenty of assumptions around how the cloud is implemented; and those underlying semantics mean you must either mimic them or you have some potential problems with your fidelity to the API. That means you have a lot of technology limitations; and for better or worse are set to track Amazon. What's the Apache CloudStack take on all of the APIs? Apache CloudStack tends to be agnostic; we don't have a preferred hypervisor, or a preferred storage technology, or networking topology; and that agnosticism is just as present in the API realm. CloudStack maintains and develops its own user and administrative APIs that are easily consumable, but we also have a native translation layer for the Amazon EC2 API that covers about 60% of the classic EC2 API set; because we recognize that there are so many tools written around EC2. There's also plenty of work going on close to the project from community members who value the API diversity. 
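As an illustration only (not taken from the article; the endpoint, credentials and resource names are placeholders), here is roughly what provisioning a machine against a CloudStack installation can look like through Apache Libcloud, one of the multi-cloud client libraries that grew up around exactly this API diversity. The driver options shown are from memory of the library, so check the Libcloud and CloudStack documentation for the exact parameters in your versions.

```python
# Illustrative sketch: provisioning a VM on a CloudStack cloud via Apache Libcloud.
# The endpoint, credentials and names below are placeholders, not real values.
from libcloud.compute.types import Provider
from libcloud.compute.providers import get_driver

CloudStack = get_driver(Provider.CLOUDSTACK)
driver = CloudStack(key='YOUR_API_KEY', secret='YOUR_SECRET_KEY',
                    host='cloud.example.com', path='/client/api')

# Discover what the cloud offers: service offerings (sizes) and templates (images).
sizes = driver.list_sizes()
images = driver.list_images()

# Ask for a new instance; the platform handles placement and tenant isolation.
node = driver.create_node(name='demo-node', size=sizes[0], image=images[0])
print(node.id, node.state)
```

The same few calls, pointed at a different provider constant, can drive the other clouds the library supports, which is precisely the argument for keeping user-facing APIs either standard or well covered by translation layers.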
Close to the project, for example, you can find both OCCI implementations for CloudStack as well as a Google Compute Engine translation layer. In short, the project recognizes that there are valid reasons to use a real or de facto standard and we see the advantages and limitations that come with that. We encourage the development of those 'standards integrations' even as we continue forward development of our native APIs. David Nalley [@ke4qqq, ke4qqq@apache.org] is a committer and member of the Project Management Committee for Apache CloudStack and is employed in the Citrix Open Source Business Office. David is also a Thought Leader for Compare the Cloud and you can see his profile here. ### For broader adoption, Cloud must go Vertical! By Allan Behrens, Managing Director of Taxal I like to look at IT topics in the context of the industry you’re in. And that goes double for Cloud. Let me explain… If I’m an automotive manufacturer, new technologies (and importantly, for this discussion, those in the Cloud product and services business) must add value to my business directly; notably to developing, selling and servicing better cars, trucks etc. If I don’t understand the linkage between what the IT providers offer and what I do, is it a surprise that I choose to stay with what I know and trust? I believe that one of the reasons for the (arguably) slower take-up of Cloud in general within the manufacturing sector is to a large degree due to the gaps between what user companies need, what they understand to be available, and the value that the new paradigm brings to support and enhance their business initiatives. These gaps are unfortunately (and embarrassingly) exacerbated by the significant amount of techno-babble (and acronym hell) that’s an age-old promotional ethos of us in the IT industry. Of course this gap in presentation and understanding/acceptance isn’t unique to manufacturing; it applies across many industries. Needless to say there are many other reasons that affect the take-up of Cloud (or any new paradigm for that matter); among these are the past stigmas of over-selling, drawn-out implementations and lock-ins attributed to some of those in our IT communities. Back to my point on the experiential gap between IT suppliers and their customers. It would be naive to simply pin the reasons for the absorption of new technology in one vertical over another as being one of semantics or misunderstandings. The availability of suitable Cloud-enabled software (and security and hardware/software infrastructure) plays a fundamental role. The challenge is what comes first, the chicken or the egg? Should software providers be developing (and migrating) their offerings for (and to) the Cloud in advance of client demand? In reality the availability of Cloud software for manufacturing companies is somewhat mixed. There are, of course, many of what would commonly be called ‘horizontal’ applications available that are non-manufacturing specific, CRM (e.g. Salesforce) being one example. Naturally, some developers are waiting to see what demand there is before committing to Cloud, but we’re seeing some interesting service providers deliver Cloud delivery platforms that allow users to take advantage of non-Cloud based apps; for example Rescale in the engineering simulation domain. 
Of course there are long-serving application developers moving/developing products and services on the Cloud; Oracle, SAP, Dassault Systèmes and Siemens PLM amongst them. With their large suites of software it's no surprise that it's taking time to "Cloud-enable" their portfolios, but I tend to feel that many of them are still in the 'wait and see' category. There are, of course, developers who have whole-heartedly nailed their colours to the mast. I'd include in the latter group suppliers such as IBM, active in a number of areas including analytics and software engineering; engineering solutions supplier Autodesk, with new Cloud-based design, simulation and manufacturing services; and ERP solutions from NetSuite. Of course the promise of elastic and on-demand resource also provides an attractive option for many software start-ups, and we're also seeing innovative solutions emerging in many areas including those of design, engineering, collaboration and supply chain management. Rounding off then, Cloud adoption in industries such as manufacturing requires both technology and commercial catalysts. Of course enabling infrastructures and applications need to be available, and these are rapidly evolving works in progress. But Cloud acceptance can be further improved; for one thing, IT suppliers may want to consider a change to their sales and marketing strategies: one that better promotes Cloud software and infrastructure focused on client (industry-specific) business outcomes; business value-add rather than 'technology for technology's sake'. As for potential user companies, perhaps it's time for them to consider the new opportunities and advantages that Cloud can bring to their businesses; not only from financial or technological standpoints, but also in its ability, especially for small and medium-sized businesses, to allow them to 'punch above their weight'. ### PEER 1 Hosting leverages enterprise-class SSDs [quote_box_center]Case Study: In cloud computing and resource virtualisation circles, hardware can often take a back seat in the performance debate. The separation of the virtual and the physical suggests to some that it doesn't really matter what hardware you're running on; all that matters is how many virtual server instances you can produce of varying sizes and how functional your UI and APIs are. However, as growing interest in 'bare metal' cloud services demonstrates, there is an increased level of knowledge and expectation of server performance from business users, as they search out options that produce more consistency and power to drive their applications. Hardware may well be a commodity, but it doesn't mean all DRAM modules, HDDs, CPUs and SSDs are the same - and it certainly doesn't mean every supplier is the same in how they go about ensuring you get the most out of your hardware purchase! That's where Kingston Technology's KingstonConsult service comes in - a free service designed to help you get the most out of your hardware, and frequently there are rich rewards for getting the experts to cast their eye over your operation, as we can see in this case study of PEER 1 Hosting.[/quote_box_center] The Challenge PEER 1 Hosting is a fast-growing hosting services provider. From its roots in Vancouver, Canada, the company has grown to 500+ employees who support operations across North America and Europe. Through the years, PEER 1 built a reputation for delivering highly available colocation, managed hosting and dedicated hosting services.
That, plus a corporate strategy to leverage the latest technologies, has fostered a client base that numbers over 13,000. "The benchmark results of the Kingston E-series SSDs we tested exceeded our expectations. The I/Os were blazing fast, around 55,000, and the reliability we measured promotes our 99.9 per cent uptime guarantee to our customers." - John Hamner, Product Manager, PEER 1 Hosting PEER 1's hosted solutions include server offerings that range from two hard drives up to 16, plus memory from eight to 756 gigabytes. Managers leverage SSDs as caching devices for servers running high I/O applications. In PEER 1's world, that means e-commerce, gaming and database applications. The SSDs they certify must therefore meet high performance and reliability standards. In addition to sourcing quality SSDs, PEER 1 managers needed to partner with a vendor that could deliver large quantities on relatively short notice. "We might go an entire month without putting a single SSD in a server, but then the next month need 5,000 units," says John Hamner, product manager for PEER 1 Hosting. "Customers want to move forward on an implementation as quickly as possible, so we need to partner with an SSD provider that can quickly handle the volumes we need." Technology Solution Since quality varies so widely among SSD products, PEER 1 conducts a stringent review process of prospective SSD models. "We can't just throw any SSD into the servers we set up for our customers," explains Hamner. "Their business is based on reliable performance and if we lumbered them with a subpar SSD, our reputation would take a hit." With so much on the line, Hamner performs a comprehensive vetting process. "We have many engineering cycles and do a lot of benchmarks to evaluate SSD candidates." Hamner based his final SSD selection on three primary criteria. "One is performance, another is procurement — can a vendor deliver the quantities we need in a timely fashion? — and the last is customer service. Kingston and its enterprise-class E-series SSDs excelled in all three areas." To date, Hamner has procured Kingston E100 (100GB), E200 (200GB) and E400 (400GB) SSDs for use in clients' servers. "They come in the 2.5 inch form factor that we like, and they're pretty easy to install. One of the interesting things we found with the imaging process was that a lot of the drivers were pretty native, so we didn't have to do anything special to tweak the drives for use." Business results The Kingston E-series SSDs have delivered a number of benefits to PEER 1 and its customers. High endurance increases return on investment In addition to helping PEER 1 meet its 99.9 per cent uptime service level agreements, the 30,000 program erase cycles of the Kingston enterprise-class SSDs give ROI a boost. "The fact that Kingston uses eMLC technology really increases our return on investment because we know the drive is going to last a lot longer than the typical MLC drives that most of our competitors are offering today," explains Hamner. "As a result, we give our customers a longer life cycle, which equates to more uptime and less time spent replacing drives." In addition to higher endurance, the Kingston E-series SSDs help Hamner's customers consolidate their infrastructure to do more with less. "In the past, customers that needed to access large data sets quickly were at the mercy of spinning drives. So—I'll be conservative here—in order to achieve half the I/Os of a Kingston SSD, they had to have at least eight spinning drives. 
Now that Kingston has taken bulk storage and added high I/Os, our customers can buy two drives from us instead and still get the exact amount of storage they need plus much better performance." High performance enhances customer satisfaction while opening the door to new market segments During benchmark tests, and in production environments after installation, the Kingston SSDs have delivered superior performance. "The E-series SSD's performance specs really outmatch a lot of the other competitors out there today," says Hamner. "That's been a big benefit to our customers, especially the ones running high I/O applications. The amount of I/O you can get out of these drives is pretty phenomenal—between 50,000 and 60,000 according to our tests." What does this mean for PEER 1's customers? "The SSD I/Os we've experienced translate into screaming performance for our customers. Not just our gaming customers, but also our database and e-commerce clients that have a lot of visitor traffic to process." Additionally, PEER 1 managers have developed an offering to appeal to the NoSQL database market. "Previously, we used standard SATA drives but they didn't have the availability that those customers needed," explains Hamner. "Now, we can offer those customers a very robust solution based on highly available, high-performance Kingston SSDs." Kingston procurement team eliminates supply chain bottlenecks One of Hamner's main concerns about a potential partnership with Kingston was whether the company could deliver large quantities of SSDs in a timely manner. "One of the great things about working with Kingston is that it's taken the supply chain bottleneck issue off the table. Our account reps usually manage the supply chain; they manage our inventory levels, monitor usage trends and make recommendations that keep the supply running smoothly." [quote_box_center] Summary: PEER 1 Hosting is a major IT hosting provider. The company, from Vancouver, Canada, offers colocation, managed and dedicated hosting services across North America and Europe. In order to continue to grow the business, boost customer retention and enter new market segments, managers decided to improve the performance of their server offerings. After extensive vetting and detailed benchmark tests, they selected the Kingston enterprise-class E-series SSDs, including the SE100S37/100G, SE100S37/200G and SE100S37/400G models. The company is using the SSDs as caching devices in server disk arrays for its database, gaming and e-commerce customers. PEER 1 is now able to offer robust server solutions to the NoSQL market (SATA disk drives were previously too unreliable); PEER 1 benchmark tests measured "phenomenal" I/Os of between 50,000 and 60,000; enterprise-class MTTF extends life cycle and ROI; and Kingston procurement eliminates supply-chain bottlenecks for large orders. [/quote_box_center] ### Connectivity-On-Demand ends punitive charges Neil Cresswell, CEO of Virtus Data Centres, introduces us to their innovative and flexible usage-based bandwidth access and consumption model. When Virtus added CoLo-On-Demand to its Virtus Intelligent Data Centre portfolio earlier this year, it challenged the landscape of the traditional colocation industry. For the very first time, we were offering customers the opportunity to flex up/down their power requirements from 0kW to 10kW per rack on a daily basis and provide contracts that could be terminated on a day's notice.
This ground-breaking concept allowed our customers to align their IT power bills with their actual work output, but connectivity still posed a problem because standard contracts lacked the flexibility of the CoLo-On-Demand solution. And in order to truly reflect our customers' dynamic requirements, flexible colo needs flexible connectivity. We have been working with our connectivity partners to develop Connectivity-On-Demand: a highly flexible connectivity solution offering Interconnections between LONDON1 and over 100 'on net' data centres in the UK and a 'Pay as you Use' IP Transit service. Virtus' flexible Interconnect service can be turned off or flexed up or down between 100Mb and 1Gb on 30 days' notice without penalty, while the IP transit service is based on usage and actual bandwidth consumed rather than committed data rates and punitive "burst charges". With the combination of CoLo-On-Demand and Connectivity-On-Demand our customers can effectively have "Flexible Everything" – colo, connectivity and customer services. This is a new, truly flexible and dynamic model for procuring enterprise data centre solutions. The full range of Intelligent Data Centre products has been specifically developed to help customers meet challenges such as delivering 100% uptime for production, matching data centre costs to ever-changing requirements, testing new markets without long-term commitment, and accurate monitoring and control allowing agile resource re-allocation in line with business requirements. Customers can now choose the products that best suit their business requirements, whether it be for a specific project that is heavily workload-based, a development platform for research or laboratory work, or simply because a customer wants low-cost access to a state-of-the-art data centre facility. LONDON1 has been designed to be flexible enough that it can accommodate racks of up to 25kW, so high-density deployments are not a problem for us. The Intelligent Data Centre portfolio is ideally suited to anyone who has changing workload requirements, from a large enterprise client to a startup, 'sandbox' or 'shopfront'. More importantly, they guarantee that all our customers are getting absolute flexibility and the lowest TCO possible without compromising on quality of design and implementation. Virtus aims to continue to innovate in line with, and beyond, the way businesses of all types deploy and acquire data centre and connectivity services, to ensure we exceed our customers' expectations in quality, flexibility, service and value. ### The day computing changed forever - and we all missed it Daniel Thomas of Compare the Cloud explains how using DNA for storing data is just the start of the humanification of IT. On Wednesday, the 23rd January 2013, the world we live in changed as a result of a breakthrough in science where synthetic DNA was used to store all of Shakespeare's 154 sonnets, an audio clip of Martin Luther King's "I have a dream" speech, and the famous "double helix" paper. Think about this for just one minute. Look at the smartphone, laptop, PC or data centre you are currently using or residing in! Yes that's right - DNA! - something so small that we cannot see it with the naked eye, and something intrinsically linked to what we are as human beings and what creates us. Something so small yet so vital can now be used to store and retrieve data. DNA packs information into much less space than other media.
For example, CERN, the European particle-physics lab near Geneva, currently stores around 90 petabytes of data on some 100 tape drives. (Nick) Goldman's method could fit all of those data into 41 grams of DNA. This is the biggest disruption in the history of technology; bigger than the Victorian industrial revolution, bigger than the Renaissance (I wonder what Machiavelli would have said!). Was this the day that as human beings we finally took the first strides to becoming the "device"? Will we look back on 2013 as the catalyst for what I would define as the "Humanification of IT", where our natural processing power and storage capabilities were harnessed with bio-mechanics to create the ultimate IT device? Let's look 5 years forward into the Humanification chasm! Datacentres Today the data centre industry is the engine behind cloud connectivity and IT in general. But what happens when this disruption moves the physical element of storage and processing to the human? Today's data centre is about the physical mass of IT kit; what do operators do when this kit that drives storage and compute is transferred to something so small we cannot even see it (IT)? The Data Centre of the future will look nothing like what we have today; it will be like a biology facility with storage being in controlled "super test tube repositories". Connectivity and bandwidth Imagine a network comprising billions of human beings connected via 10th-generation WiFi devices located within an on-boarded bio-nano-chip; the processing power will propel human advancement further than ever. The power of this "human cluster" to provide science with answers is beyond imagination. Move over Professor Stephen Hawking and Albert Einstein: every human will have on-demand access to a super-computing-grid of immense proportions, allowing us to outsmart current thinking by cumulative factors of 10. Consumerisation of Humanification Imagine the scenario: the directors of Netflix - just like Blockbuster in the past - sitting in a boardroom agonising about how to exploit streaming video. What would Netflix do in this scenario? Simple! Tap into the Humanification RFC (request for comments) standard 83383838339 to patent video on demand via biomechanics, to present video straight to the mind of the consumer. Same for Apple - will they create an 'iTunes' for music streamed directly to the ear canal? What will TV manufacturers do? Will Bose develop a humanised sound system? Will Microsoft debut the "XBOX human 1500" for "The Matrix"-style multi-player gaming? Will IBM create "Social Business for Humans", which allows controlled human inter-working from a business perspective? The Future of Enterprise IT - based on Humanification Will currency, shares and futures trading systems still be in existence? NO WAY! Tapping into the power of the 'Human Cloud' will allow any trader or individual to outsmart any predictive modelling and trade execution system currently in existence both now and in the future. Communications will be based on bio-mechanical human "tap-ins" where visual images are presented in the brain cortex, handled by a human-queue technology that blocks out nuisance or spam interactions. Will we need to travel?
Who has ever used "Second Life"? Imagine the 'Human cloud' and the creation of a unique ID (UID) for every person on the planet. That UID is then customised to be an avatar of our physical beings; why would we need to travel when we can meet up in the virtual world, interact and do business? The virtual human will be the next battleground for public relations companies across the globe. Imagine attending a virtual West Ham match at the Olympic Stadium, experiencing the atmosphere, the chanting and the smells of the burger stalls, awash with a virtual claret and blue scarf (sorry, I couldn't resist). Video Conferencing Such a laggard technology! Seven years from now it will be akin to playing an Atari game next to a PlayStation or XBOX. Biotechnology-driven telekinesis will be the next battleground, with 'Virtual Human' bridges being used to communicate with trusted social and business contacts. Office Production Suites Seriously? Sitting at a keyboard and typing? All we will need to do is dictate in our minds what we want to convey, put it through the "profanity and grammar" thought process, then have it communicated onwards with productivity tools such as smell, sound and touch applied on subscription - which combined convey our thoughts and feelings to the receiving party. Our UID avatar will be busy, I suspect; I can imagine my fellow train passengers being overloaded with thought processes as I type this! The Human Cloud The power of Wikipedia - the mind of Stephen Hawking - the design genius of Steve Jobs - the cunning of Machiavelli - the beauty of Raphael or Michelangelo - the analytics of Alan Turing - the will and determination of Winston Churchill. Science would have evolved to the point where cryogenics intersects with the humanification of technology, where the physical form would be transported to the UID avatar based on remaining human matter or brain functions. Imagine doing an enterprise project where you can call in these great human beings to interact in avatar form and advise your company on how to do a marketing campaign or product design - or getting Machiavelli to advise you on politics? How will we achieve this? IBM's 'Watson' is currently on the path to creating genuine human interaction; could this be progressed to feed all available information into a UID avatar? Taking the works of Shakespeare and the private diaries of Winston Churchill to recreate his emotions and personality? Could we be voting Winston Churchill as our next virtual prime minister, or maybe get John Lennon to write more music? Or Bobby Moore to grace our virtual football field once again and lift the World Cup? Perhaps we could even bring back John F Kennedy or Theodore Roosevelt to settle conflicts? The Downside of 'Humanification' The ability to control the human population via a DNA or bio-mechanical virus is unsettling to say the least. Would governments use this technology to do evil, rather than cure cancer and advance human science and knowledge? Who knows what the future holds - hopefully the transparency and liberating nature of inclusive and swift technological advancement will protect us? Advancements in Security and Orchestration Will the corporate and personal firewall become the "Biowall"?
With Juniper and Checkpoint perhaps offering host and grid customisation services stored on a DNA profile with bio-mechanical protection? Will Flexiant be first to market with the "Humanification Orchestration engine" whereas human resources are dynamically allocated as per DNA and bio-mechanical workload with inbuilt security based on an HPI (human programming interface)? With the power of this technology, will the law, legislation and regulation need to be drastically altered and improved? Perhaps Frank Jennings could draft a HTSR (human technology social responsibility) contract to keep us all on track? In Summary Don’t have nightmares! This is a work of fiction - loosely based on current technology advancements and potential scenarios. Inside all of us is a decent human being, and - as a technology industry - I am certain we will strive to deliver good to humanity. I look forward to meeting you all at the virtual Olympic stadium to watch my beloved West Ham in 2018! Dedications I would like to principally dedicate this blog post to a man whom I respect greatly both as a biological human being and a technology leader, Simon Porter of IBM. Further dedications and a Merry Christmas to all of you! Team IBM: Doug Clark, Bill Mew, Chris Wynn, Amanda Markham, Andrew Gill, Richard Potts, David Kay, Victoria Milstead, Stuart Hoskins, Ben Faradoye, Ronald Velten, Martijn Van Veem, John Mason, Brig Sermon, Gregor Sideris, Vicky Gillies, Dierdra Errity, Jerry Crossfield, Steven Dickens, John Smith, Mark Tomlinson, Steve Strutt, James Lowe and any I have missed! Industry Influencers (I treasure and respect all the advice and guidance thank you!) Paul "Obi Wan" Bevan, David Terrar, Alan Behrens, Dale Ville, Andrew Buss, Daniel Steeves and Max Buchler, Lindsay Smith, Ian Moyse and finally Gav and the team at the Virtual Machine User Group! Arrow ECS: Steve Pearce, Stuart Simmons and last but certainly not least Ian Jeffs (thank you for all the support I truly do appreciate it) Digital Realty: Omer Wilson, Rob Bath Pulsant: Rob Davies and Matt Lovell Easynet Global Services: Eoin Jennings RTW Hosting: Mike Wills Techgate PLC: Martin and Dawn Wright Arkivum: Mark Ellis, Cathy Brode and Jim Cook CloudSigma: Robert Jenkins and Bernino Lind Intermedia: Ed Macnair (legend) Imtech ICT: Stephen Maloney, James Morgan, Fiona Squire Virtual DCS: Richard May And a thanks to my future virtual UID avatar advisors in 2018: Steve Jobs, Alan Turing, Winston Churchill, Clement Atlee, John F Kennedy, Niccolo Machiavelli, General Bernard Montgomery, Raphael, Botticelli, Michelangelo, Robert Stephenson, David Lloyd George, Bobby Moore, Kenneth Williams and Richard Burton. Come and join the fun! tell us your virtual UID avatar board advisors or comment on 'Humanification' - either directly on the blog or using twitter hashtag #Humanification let us know your thoughts or technology ideas or just say Happy Christmas! Daniel Thomas Founder of Compare the Cloud ### Is OpenStack a Game Changer for Cloud? For those of you that haven't heard of OpenStack, well please read on. I will try and explain what the latest buzz in the Cloud world is and where you can position yourself in the future with this very exciting topic. 
Wikipedia's definition of OpenStack: "a cloud-computing project [that] aims to provide the 'ubiquitous open source cloud computing platform for public and private clouds'. Predominantly acting as an infrastructure as a service (IaaS) platform, it is free and open-source software released under the terms of the Apache License. The project is managed by the OpenStack Foundation, a non-profit corporate entity established in September 2012 to promote OpenStack software and its community." Easy to understand? Maybe for the technically minded amongst us - but not all. After attending the Paris OpenStack in Action 4 event (allegedly the biggest in Europe), it was made relatively clear. However, first I think you need to understand what Cloud means (and that's quite difficult with current Cloud washing) to really understand OpenStack. Cloud technology So you have some services that are delivered via the cloud. Maybe an online backup of data, applications that are launched through a web browser or even some servers that are hosted in a datacentre? Just to confuse you, the Cloud terminology will now kick in and these are called BaaS (backup as a service), SaaS (software as a service) or IaaS (infrastructure as a service). There are infinitely more acronyms for Cloud services than these few (PaaS – Platform as a Service, DaaS – Desktop as a Service, and so on) but let's keep it relatively basic for the moment. In essence Cloud Services are anything delivered via online methods (centralised computing of old, for the over-40s amongst us). All of you reading this are probably taking some form of this type of service - maybe not a complete outsourced model, but in part at least (email is a good example). So where does OpenStack enter the arena, I hear you ask? Well OpenStack is a little bit different from all of the above (cringe as I open myself up for criticism), as it's quite literally an Open Source developed standard (memories of Linux when first released), and what I mean by this is this: OpenStack has a different licensing model to traditional methods (which is a relief from traditional MS overcharging, and others) in the sense that developments and certain additional functionality become features and not necessarily additional costs. This, as well as taking away the typical "lock-in" adopted by the majority of SW providers, makes absolute sense, but it's not just about software. There are open APIs (application programming interfaces) that allow seamless connectivity and modified source code to a virtually unlimited list of vendors/providers for Cloud services, which puts you, the consumer, in control of what you want, where you want it and for how long! It doesn't stop there: compute power, virtualisation, data storage, KVMs and Linux can all be managed and integrated within OpenStack (the three main areas being Object Storage, Compute and Image). Imagine a time when you can have your data stored where you want it to be (UK, US - your country of choice) and have compute power from IBM (VMs spinning up) and your data running to inexpensive NAS devices elsewhere, as well as other VMs launching from yet another location or provider, and all of this with you having control?
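To give a flavour of what those open APIs look like from the consumer side, here is a minimal, hypothetical sketch using the openstacksdk Python library; the cloud name, image, flavour and network names are placeholders rather than a recommendation of any particular provider, and a real deployment would take its credentials from a clouds.yaml file or environment variables.

```python
# Illustrative sketch: 'mycloud' and the image/flavour/network names are placeholders.
import openstack

# Connect using credentials defined for 'mycloud' in clouds.yaml.
conn = openstack.connect(cloud='mycloud')

# Look up building blocks by name, then boot a server through the same open API,
# whichever OpenStack provider happens to be behind it.
image = conn.compute.find_image('standard-linux')
flavor = conn.compute.find_flavor('m1.small')
network = conn.network.find_network('private')

server = conn.compute.create_server(
    name='demo-instance',
    image_id=image.id,
    flavor_id=flavor.id,
    networks=[{'uuid': network.id}],
)
server = conn.compute.wait_for_server(server)
print(server.name, server.status)
```

Because the API is the same across providers, the same script can target a public OpenStack cloud, a private deployment, or both - which is exactly the control and portability argument made above.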
Well it's available now and there are many providers gearing up for a very big play (IBM, VMware, Rackspace, HP, Oracle and many others). Why are the bigger players looking to OpenStack? Well, hardware sales will certainly increase, and with the integration of application stores, managing compute/storage availability makes this so much easier (and cheaper). Let's face it, Amazon has been working towards this for years but they don't have the lead on this anymore. However (and a big however), it is open source and the consumer market needs to catch up with the vendor-led trends. Make no mistake, this is the future for Cloud within the next few years without a doubt, but the adoption is relatively slow, mostly because of the lack of understanding from a client perspective as well as how to sell this new service (the same problem as when selling cloud services). Personally I feel the consumer needs more education on Cloud in general before we see OpenStack really taking off, due to an over-advanced, vendor-led market, but it's only a matter of time. It's a bit techie really and requires a lot of explanation to uneducated Cloud consumers prior to winning confidence. At the Paris OpenStack 4 Expo there were many providers trying to get this point across. I spoke to a few and tried to get an understanding of the differing products/services they were offering. One in particular was HEDERA (writing an article for us in the very near future). What they boasted (and demonstrated to me) was impressive and consisted of giving you, the end client, the ability to pilot your multi-IaaS estate from one platform, deploy and provision in a few minutes rather than hours, adapt RAM in real time on demand with a typical 30% reduction in usage, and cater for SUSE, VMware and Hyper-V environments. I must say, I was very impressed with their offering, but don't just take my word for it: visit pimpmystack.net and sign up for a free trial to see for yourself. Furthermore, they offer this as an on-premise solution too, allowing you to calculate your own budget lines and IT chargebacks for larger enterprises. Don't get me wrong, there are ways to achieve most of this now, but with several cobbled-together products (there are many Cloud orchestration firms on the market) - but with OpenStack you achieve this and certain standards are achieved too! To summarise, I feel that OpenStack will be a game changer for Cloud over the next few years and one to watch avidly. I also believe that this may force the standardisation for Cloud that is so much needed at present. Whatever your stage of Cloud awareness may be at the moment, OpenStack should be on your horizon. Are you utilising OpenStack at the moment? If you are, we would love to hear from you and listen to your experiences. ### Where's your Integrity? Integrity is a word that means different things to different people. Generally speaking, integrity means soundness, completeness and incorruptibility.
We're used to hearing this term associated with politicians, or more accurately lack of it, but in the world of data and cloud services this single word has a wide range of implications depending on whether you come from an IT, business, legal or digital preservation perspective. I gave a talk on digital preservation at a DIA event recently which was all about Electronic Document Management for pharmaceutical and medical products. Use of SaaS is just starting to take off in this sector. The session I was in looked at how to keep and use data for the long-term, which is a challenge that many businesses face. There was a really interesting mix of presentations on regulatory compliance, legal admissibility and 'evidential' weight, and risk-based approaches to digital preservation (my bit). Imagine that you've spent 6 years doing a clinical trial for a pharmaceutical drug... [generating] maybe 100,000 PDF documents and associated material for the trial - and now you need to be able to keep it accessible and readable for decades. One of the questions to the panel was 'what does integrity mean for electronic documents?' shortly followed by 'how can it be maintained over say 20 or even 50 years?'. Imagine that you've spent 6 years doing a clinical trial for a pharmaceutical drug and have just collected and submitted the results to the regulator, e.g. the FDA or EMA, in the form of an electronic Trial Master File (eTMF). That eTMF contains maybe 100,000 PDF documents and associated material for the trial - and now you need to be able to keep it accessible and readable for decades. At the same time, it needs to be held securely, held in a way that allows you to demonstrate integrity (that word again), held in systems that you can show have been validated, held in a way that gives you a full audit trail. A pretty daunting task, but actually one that applies to a wide range of content in different sectors. At Arkivum we come across this problem on a daily basis, with examples including gene sequences, social science surveys, records of hazardous material disposal, and even digital works of art. It's a major challenge, especially where cloud services are being used. Rarely do you have the transparency to 'see inside' a cloud service so you can assess what they are doing with your data and even more rarely is there any form of contractual guarantee, certification, or auditable evidence that they are 'doing the right thing' - especially in a form that would stand up to scrutiny by a regulator. But back to integrity for a minute. We discussed at least three levels of integrity at the DIA event. At the IT level, integrity is often short hand for 'data integrity in storage', which in turn means knowing that the 'bits' in a file or object are correct and haven't changed - i.e. avoiding data corruption or loss. In the case of an eTMF this might mean using a checksum or other form of 'digital fingerprint' for each one of the files and then 'checking the checksums' to detect any loss of integrity. This is an activity known as 'scrubbing'. Sometimes this is handled by the storage system, e.g. use of parity in RAID arrays, sometimes this is handled by filesystems designed to protect integrity, e.g. ZFS, and sometimes it is handled by tagging files with checksums as part of the 'metadata' so manual checks can be made when a data transfer or storage operation needs to be confirmed. Sometimes all this happens behind the scenes and sometimes it's an activity that the user has to do, e.g. 
a document download might have an MD5 checksum provided along with the file so you can make sure that you've downloaded it correctly. But what happens if the file itself has to be changed, e.g. to keep it readable because it was created 20 years ago and there's no longer the software available that understands what the original file means? This could mean changing the format of a word document, spreadsheet, database, or any other form of proprietary data so that it's still readable. Checksums don't help you here because the 'bits' are changing. So what do you do if the file format needs to be changed, i.e. converted so it can be read in tomorrow's software? This is firmly in digital preservation territory and raises questions of what is 'significant' about the file that needs to be 'preserved' during the conversion. Then the question is how can the conversion be tested and validated - something that's key to asserting and proving that integrity has been maintained, especially in a regulated environment. The key here is to think in terms of who will need to use the content in the future and create a Trusted Digital Repository that collects together, stores, manages and provides access to this future community in order to meet their needs. It's also not enough just to have integrity at the 'file level', it's needed for whole collections of documents and data. If just one of those 100,000 PDF documents in an eTMF goes AWOL or is corrupted then the overall integrity is lost. This is about completeness and correctness - nothing missing, nothing added, nothing tampered with, no links broken, no changes to order or structure. If you have some form of description of what should be there, e.g. manifests and document lists, then you at least stand a chance of spotting when something goes missing - or proving to a regulator that it hasn't - but this requires strict control over this 'metadata' as well as the data it describes, if the integrity of the metadata can be compromised then so can the data. We've been tackling these problems for a while at Arkivum and we take integrity in all senses of the word very seriously - data integrity is a contractually guaranteed part of our service. This involves extensive use of checksums, data replication, active integrity monitoring, trained staff, and tightly controlled processes and other measures to guard against a very wide range of risks that could lead to data corruption and loss. Perhaps the most important thing is a chain of custody with our customers - unless we know that we have received data that's complete and correct then we could end up storing duff data for decades - and that's no good to anyone. We provide ways for customers to access checksums and confirm that we have the right data - we can digitally sign an audit trail to say this has happened. There is no hiding place when this has occurred - no escape clause - no excuses based on a string of 9s where we could say 'sorry, statistically you were just the unlucky one'. We use BagIt from the Library of Congress to allow large collections of files to be 'bagged and tagged' so any changes in any part of the content or structure can immediately be detected. Customer data is escrowed too so that each customer has a built-in exit plan, which includes an audit trail that shows each 'bag' of customer data has made it to escrow with nothing gone missing on the way. It would be easy for me to go on at length about all the infrastructure, people and processes that we use to ensure data integrity. 
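To make that 'bag and tag' fixity checking concrete, here is a minimal, hypothetical sketch of a manifest verification pass in the spirit of BagIt; the directory name and manifest layout are illustrative, not a description of Arkivum's actual tooling.

```python
# Hypothetical manifest check: paths and manifest format are illustrative only.
import hashlib
from pathlib import Path

def sha256_of(path):
    # Stream the file so large documents don't have to fit in memory.
    digest = hashlib.sha256()
    with open(path, 'rb') as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b''):
            digest.update(chunk)
    return digest.hexdigest()

def verify(collection_dir, manifest_name='manifest-sha256.txt'):
    # Each manifest line holds "<expected checksum>  <relative file path>".
    problems = []
    root = Path(collection_dir)
    for line in (root / manifest_name).read_text().splitlines():
        expected, relative = line.split(maxsplit=1)
        target = root / relative
        if not target.exists():
            problems.append(f'missing: {relative}')
        elif sha256_of(target) != expected:
            problems.append(f'corrupted: {relative}')
    return problems

issues = verify('etmf-collection')
print('collection intact' if not issues else issues)
```

Run regularly, a check like this is what turns 'we think the files are still there' into evidence that nothing is missing, nothing has been added and nothing has been silently corrupted.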
But fundamentally, integrity starts and ends with the customer – it's their data. We need to be sure that we have been given the right thing before we can guarantee to deliver it back again unchanged. Integrity is tightly bound to chain of custody, and that chain links us and our customers together. What's interesting for me is how to help our customers achieve 'data integrity in the cloud' - especially in a way that meets compliance requirements so they can sleep at night knowing that they can access and use their data for a long time to come. Provable data integrity is something I hope we'll see a lot more of in cloud services - and with it new models for chain of custody, transparency and auditability. But maybe there's a paradox here to solve too. Cloud on the one hand has the virtue that it 'hides the details' and 'takes away the complexity' - you don't have to worry about how and where the data is stored - you just have access when you need it. After all, that's the whole point about 'services'. But on the other hand, we also need transparency and auditability so we can be sure of integrity - opening up the black box to scrutiny. We need to be able to look under the hood and validate what cloud services are really doing - or as Ronald Reagan put it, 'trust but verify'. Only then do you know where your integrity really is. ### Hosted Desktop market heats up It's very interesting what's occurring in the VDI/Hosted Desktop space at the moment. Microsoft with their offering (and increasing their SPLA costs to partners), VMware buying Desktone, IBM with Virtual Bridges, Cisco with their UCS offering - and now Amazon with their introduction to the race with the WorkSpaces product set. With the bigger players now entering the VDI marketplace, how can the smaller companies compete? Compare the Cloud asked some of the top VDI/HD companies in the UK to get their take on the market shift. [quote_box_center] "Amazon are offering Windows client desktops (DaaS), not the entire VDI (Virtual Desktop Infrastructure). Therefore you will still need another IT supplier involved to build the VDI: to build the Windows domain, the file redirection servers, the mirrored web interfaces, the application servers, and to integrate the desktops with the servers. And then who is going to put an SLA (guaranteed uptime) on all of that infrastructure? Amazon won't. So, Amazon are offering a virtual desktop. In my opinion their offering is DaaS and IaaS (Desktop as a Service with Infrastructure as a Service). The only difference between this and what every other large US IaaS provider is, and has been, offering is a Windows 7 (or 8) desktop. Personally I think this is great because it strengthens confidence across all segments, proving that VDI has not only set a precedent for future cloud services but has also been adopted by the mainstream - it was only a matter of time. Fortunately, many of us far more agile companies realised this a decade ago and have shaped our businesses around this mindset. Anyone who has successfully implemented VDI solutions will know that VDI is not something you can commoditise. Most businesses' IT systems are fairly complex; they don't just run standalone Exchange, Office and SharePoint.
In reality, businesses run a myriad of applications; not only do they run LOB applications, they also integrate these applications. So based on the simple DaaS concept of a Windows 7 (or 8) desktop running IE and Office, the reality is that the requirement is far greater: Exchange integration between multiple applications, MS Office running multiple plugins, additional web browser requirements, installation and management of 3rd-party applications, not to mention the customisation of group policy management for desktops, servers and applications. So, getting into a tad more detail, the issue with Amazon offering VDI is that they wouldn't be able to fulfil the requirements of 95% of UK businesses and organisations. Most organisations would simply not accept waiting for days for Amazon's support desk to return a call, or fix a problem. They would not be happy with anything less than a named (if not dedicated) account and project manager, and they wouldn't be happy with being told "no, you can't have this". And this is something Amazon will definitely say, because they are commoditising the cloud by automating provisioning and selling in bulk. They cannot compete with localised support, agility and the realistic requirement of customised infrastructures that clients demand." James Mackie, MD Vesk [/quote_box_center] [quote_box_center] "Earlier this week Amazon launched a limited preview of Amazon WorkSpaces, its play in the hosted desktop market. So where does this fit in with what's already available? For very small businesses, I am not sure it offers any advantages over using a traditional desktop/laptop with Office 365, Google Apps or even Dropbox. At the top end, I suspect the IT department will want to have a bit more control; they may still use AWS, but build their own environment instead. That leaves the middle ground: organisations with enough staff to have complex systems, yet not big enough to have an IT department. This is also the area that most existing hosted desktop providers occupy. Since it seems unlikely that these customers will have the capability to set up and use the solution themselves, I suspect there will be a channel offering and MSPs/VARs will be the ones pushing this. For all but the simplest customers, though, I am not sure it fits the bill. Most will require some form of 3rd-party app, such as a recruitment database, integrated with their desktop and office applications. It will be interesting to see what integration will become possible." Mark O'Dell, CEO Connect Cloud Services [/quote_box_center] [quote_box_center] "It's interesting to view the developments happening within the VDI marketplace at the moment. I think Hosted Desktop/VDI awareness has eventually come into popularity due to the media hype around BYOD. As ever, with correct marketing and execution anything can be sold, and sometimes oversold. What is very important to note with the increased awareness of VDI is to ensure the requirements are gathered correctly before signing up. VDI is great for most consumer firms but not all, and it's imperative that this expectation is discussed at the start of any dialogue prior to contract signing. I cannot see Amazon getting this right due to the inhuman element of their service delivery.
After reading Brian Madden's article published a few weeks ago, it seems clear that the Amazon WorkSpaces product is far from being a commercial offering, but what it does help with is the general awareness of Hosted Desktop technologies and the firms that do this well." Andrew Mclean, CEO Applayer [/quote_box_center] [quote_box_center] "As with any product launched by a large enterprise vendor, some of the interaction with the end clients always gets overlooked. There always has to be a consultation with the end user, in the same way that you would consult on a physical box. Is it fit for purpose, what is the actual requirement, and so on? Cloud providers that put out 'standard' servers and expect customers to fit in are not providing the core of service that the cloud should be. It is not a commodity sale; it is a consultative sale. But moreover, there should be an ongoing review of requirements. Somehow I think that this will not be the case with the larger players in this marketplace, and there is no substitute for good account management. There seems to be a similar comment here based around setting expectations and then managing them. These old analogies keep coming up time and time again for service delivery and, if I'm honest, very few large enterprise vendors get this right. Something as critical as ALL of your IT infrastructure being housed with a single provider needs managing, not just a tick in a box and then a click on 'migrate'. My take on the surge of VDI/HD? Well, I see this as a stepping stone to true SaaS enablement. Why else would most of the largest tech companies in the world be interested in moving into this space? The hardest first step from localised technology is to virtualise and work from a centralised Cloud System/Service. Once this has been achieved, it's relatively easy to convert to any other means of application access/usage. With regards to Amazon and their WorkSpaces product, from what I have seen and read (thanks Brian Madden), they have a long way to go before delivering this commercially as a competitor to the rest. In fact, the release of this service will only boost the overall awareness of VDI and Hosted Desktop even more, which will not be a bad thing for everyone else. Maybe they will upsell to their existing clients first, making it an easy transition to VDI." Mark Southerton, MD Xanadu Technology [/quote_box_center] So the Hosted Desktop / VDI space is somewhat more complex than it appears at first. Certainly, by all accounts, Amazon are going to need to provide a more supportive, hands-on approach in order to gain serious traction with the enterprises. However, it certainly looks like Hosted Desktop is ready to hit the big time and secure widespread recognition as a valuable subset of cloud computing delivering real business value. Does your business provide Hosted Desktop services? If so, what's your take on this news and progress in the market? Or are you a business that's rolled out hosted desktop? If so, what's been your experience? ### Is Software-Defined Storage (SDS) your ticket to Storage Freedom? By Keith Joseph "Storage Virtualization – Storage Hypervisor – Software-defined Storage." It has many names, but has the time come for the data centre to embrace the concept of full vendor-independent storage management?
Over the years we have seen a move by vendors to add new storage products to their portfolio by, in some cases, obtaining - at great cost - the latest cutting-edge storage products, or by signing OEM agreements with suppliers to fill the gap in the product lineup; and some even design and build new product lines themselves to give them access to a market sector. A case in point is the move into the Solid-State Drive (SSD) market space and the money changing hands from vendors or investors wishing to cash in on this current hot market. We see this with one 'tier two' vendor being offered $150m in VC funding to help drive their business forward - or a large networking vendor paying $415m for a start-up with limited sales because they wished to move into this space. But the signs are clear: that's where they think the market is going, and they will invest to win. However, in this modern-age gold rush we are seeing companies adopting the stance that 'if we build it (or buy it) they will come'... but is this the case? Well, the reply from storage managers is "yes and no"... Confusing? Yes! Understandable? Actually, yes. To understand why it's confusing, read IT Managers Struggling to See Through all-Flash Storage Myths. So let me put my point of view forward and explain. The storage market is driven by a few big players who will go to almost any length to protect their marketplace. The reason is they make a very nice return on investment for their investors and this in turn makes them happy - entitling them to better rewards. Simple: that's the world we live in today. However, every now and then a new company starts up that pricks up the ears of customers, or a new technology like SSD or flash comes along, so all the 'hunters' circle looking to spear this fish in the proverbial storage barrel, and in most cases it's a win-win: the vendor gets a new product line and the start-up's investors bag a lot of cash. However, this "win-win" for the investors and vendor is a problem for the end user, because the vendors increasingly offer a mixed bag of products. If you look at most of the vendors' offerings out there, this mixed bag is pulled together with different interfaces and/or management consoles that do not talk or work with each other, and in some cases functions that look alike but do not match. Add to this the fact that vendors like a nice, cosy, closed ecosystem, so they don't want to open up their customers' storage; doing so may reduce their potential to earn big returns - and could also highlight the shortcomings in their product range. That said, the world of storage is moving fast and picking up pace, driven by the way we consume more and more data and how we use it, store it and demand it; but the difference now is that we are not willing to pay through the nose for it. Add to that the drive to more outsourced infrastructures and data centres, and shrinking budgets for "on-premise" hardware. This leads to a need to reduce upfront costs and sweat products harder throughout their lifespan. So what to do? Well, vendors have noticed and are trying to act without fully throwing open the door to other vendors - hence the description 'software-defined storage' becoming more common in the marketing material they put out. Why this term and not "Storage Virtualization" or "Storage Hypervisor"? Well, when you have pushed back on a concept for so long it looks a little odd to do a 180-degree U-turn, so they are moving 90 degrees first!
Then time will let them complete the move. So is this the right move? Well, for storage consumers it most certainly is. It covers and sorts out a lot of headaches. It removes the mixed management problem, lets different storage types work together, adds new functions to models that are missing them or to old models that never had them - the list goes on... Does it help the vendor that embraces the movement? Well, that's down to the vendor; if they embrace it fully then yes, as the conversation with the storage consumer changes from a "we sell tin" model to a "how can we help or fix your need?" approach. If they do not, then the storage consumer will walk, and SDS lets them do so without impact. So who should be looking at this? Well, almost all data consumers outside of users just needing a basic file store. Why? Because it gives options which you do not have with a tin-based offering. Tin is tin and disk is disk; software is the controlling conductor of your storage orchestra. Is it the magic bullet for the storage world? It could be, but for a couple of bumps to navigate around. Cost: currently vendors charge way too much for a piece of software - in some cases six figures plus. Add to this the way they sell it, by making you buy a node/header or two or three and then adding the cost of the storage capacity you need to manage - in some cases this can be 2-3 times the cost of the physical disk... so not a good start. Then who should you buy it from? IBM has the most mature offering from the major vendor stable with the SVC range, or you can buy from boutique software houses whose products are based on different operating systems, ranging from Windows and Linux to ZFS. But the choices are out there and it's down to you. The interesting fact is that this will be ongoing as vendors position and counter-position, all in the interest of defending their corner… ### Application Apathy: How to Diagnose and Treat it By Amar Nandha, Head of Application Performance Management (APM) Practice, Easynet Global Services Apps for business or consumer use, across any device, are changing the way we experience content. Enterprise applications such as CRM systems, content management systems, VDI, procurement applications and unified communications jostle for priority on the network, despite there being plenty of room for all. Often we have to wait for applications to open, reload an application as it's running slowly, or get rudely thrown out of an application in the middle of a piece of work. Whether based on the corporate network or hosted in the cloud, the golden rule is this: where there are apps, there has to be application management. Killer Apps 2013, the latest research commissioned by Easynet Global Services and Ipanema Technologies, looks at this in a bit more detail. The researchers questioned 650 companies in Europe and the US, all with more than 1000 employees, to find out how well their applications were performing across their networks in a business environment. The results were concerning: almost 80% of respondents experienced problems with their applications in terms of delay over the last 12 months; 66% of businesses relied on response times to tell how well an application was performing, and 42% relied on user complaints.
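As an aside on those response-time measurements, here is a minimal, hypothetical probe of the kind many teams rely on; the URL and the one-second threshold are placeholders, not figures from the Killer Apps research.

```python
# Hypothetical response-time probe; the URL and threshold are placeholders.
import time
import urllib.request

APP_URL = "https://crm.example.com/health"   # illustrative endpoint
THRESHOLD_SECONDS = 1.0                      # illustrative target

start = time.monotonic()
with urllib.request.urlopen(APP_URL, timeout=10) as response:
    response.read()
elapsed = time.monotonic() - start

status = "OK" if elapsed <= THRESHOLD_SECONDS else "SLOW"
print(f"{APP_URL} responded in {elapsed:.2f}s [{status}]")
```

A simple check like this tells you an application is slow; the article's point is that it takes network-level, per-application intelligence to tell you why.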
Many of us are just resigned to the fact that we'll experience delays, resulting in what we call 'application apathy'. But even losing just 5 minutes a day costs a business 1% in terms of productivity, so this isn't just annoying, it's costly too. These performance issues mainly affect business-critical applications such as ERP, CRM and VDI, but almost 30% found that video applications and unified communications frequently suffer in terms of performance. The other concern which emerged from the research was that these performance issues were still common despite investments being made in bandwidth to overcome them. It's logical to think that the more applications on a network, the more bandwidth is needed, and this is only likely to increase as the diversity and power of applications expands. The research reflects this, with 84% of respondents reporting that their companies' bandwidth requirements are growing by at least 10% every year. It isn't just about bandwidth, though: it's about making what you have work harder - making the network that carries your content become 'application aware'. Without this, it's like having a satnav without traffic updates: you can get where you want, but not necessarily the best or fastest way, so you're just not making the most of the money you've spent. Intelligence on how a network is running - which applications are slowing it down, when and why - gives companies the ability to make decisions which can massively improve the way they operate, nudging them in front of competitors to gain that all-important competitive edge. The solution? Businesses need intelligence on their networks at an application level, and they need performance guarantees for each application. Smart Application Assurance Ultimately, managing applications and infrastructure is highly complicated. The multiplication of applications means networks and applications are more challenging to manage than ever before. Companies need the ability to monitor what is happening as applications flow across their network. They need the ability to monitor, control, prioritise, and maximise the value these applications deliver. When things do go wrong, it's the ability to know proactively, before users even notice. Easynet's Smart Application Assurance (SAA) solution enables this. We can determine what applications are flowing where, when, and how. We then work with the enterprise to create business-appropriate benchmarks, enabling application-specific performance levels to be set. Should performance dip, our service teams can identify this before there is significant impact to the business, and the customer is immediately contacted. Our regular consulting process, our uniquely proactive approach, and our response guarantee all significantly reduce application issues while simultaneously empowering the enterprise. Smart Application Assurance delivers an end-to-end service for assuring application performance: a one-hour 'Smart Response' SLA; intelligent issue identification; assured application roll-outs; continuous consultancy; auditing and discovery; dynamic control and prioritisation; and improved communication. This value can be applied to both existing Easynet Global Services customers and customers who have networks and infrastructure with other providers, ranging from MPLS to Internet and mobile 3G-based connectivity. Migrating some applications to the cloud works for lots of businesses, but wherever your applications sit, make sure you're smart about the journey they're taking.
Excerpt: Amar Nandha of Easynet Global Services asks why we accept sub-standard performance from our applications and explains Easynet's solution. Apps for business or consumer use, across any device, are changing the way we experience content. Enterprise applications such as CRM systems, content management systems, VDI, procurement applications and unified communications jostle for priority on the network, despite there being plenty of room for all.   ### A Swiss Army Knife for the Cloud By Alessandro Sorniotti, cloud security scientist, IBM Research - Zurich In the late 1990s, few people quite recognised the enormous potential of the Web or how it would transform the way we live and work. Cloud computing is similarly at an early inflection point today, and we are staring up at a “hockey stick” shaped growth curve. In fact, according to Gartner Research, more than 60 percent of all enterprises have adopted some form of cloud computing this year. But even with the growth, cloud's potential as a business innovation tool still remains virtually untapped since most organizations see cloud as an IT tool. But the reality is different. The idea of tapping into pools of computing resources has firmly caught hold and is becoming more and more part of the overall vernacular of the c-suite including CIOs and Chief Marketing Officers (CMOs) who need to drive new revenue opportunities using technology that can be activated at a moment's notice. As a side note it's estimated that by 2017, CMOs will spend more on IT than CIOs. IBM has invested more than $6 billion in more than a dozen acquisitions since 2007 to accelerate cloud and to provide higher value offerings that give these line of business leaders the expertise rich solutions they need. And today we are adding a new technology from IBM Research into this portfolio called Intercloud Storage or "cloud of clouds toolkit". Cloud of Clouds Some of the biggest hurdles to cloud adoption are reliability, security and vendor lock-in. If a cloud is hacked, goes out of business or a power failure occurs it could be devastating for a business. On the flip side, if the vendor decides to increase its rates or fails to meet certain obligations the client may want to switch to another vendor. The main idea behind the intercloud is that its security and reliability exceeds that of the single clouds that make it. An approach to remedy this situation is based on what we call an intercloud or “cloud of clouds”. The main idea behind the intercloud is that its security and reliability exceeds that of the single clouds that make it. Let's take security as an example. The platform of a cloud provider is usually treated as a single security domain, typically built using homogeneous components with little internal diversification. The intercloud provides heterogeneity by leveraging multiple administrative and security domains, which is important for building intrusion-tolerant systems. The intercloud concept is implemented on top of multiple, separate cloud providers, for example using Softlayer, an IBM company; Amazon, or Microsoft. The idea being that if one fails for whatever reason the other takes over, transparently to the user. Developed at IBM's lab in Zurich, the "cloud of clouds toolkit" is basically the "Swiss army knife" for cloud and intercloud storage, providing data replication and dispersal, encryption, integrity protection and compression for more than 20 supported cloud storage backends. 
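To make the replication idea concrete, here is a deliberately simplified sketch of the "cloud of clouds" pattern: write each object to several independent stores and keep serving reads when one of them fails. This is not the IBM toolkit's API (the class and method names below are invented for illustration), and it uses naive full replication where the real toolkit adds encryption, integrity protection and more space-efficient dispersal.

```python
# Toy illustration of the "cloud of clouds" idea: replicate each object to
# several independent object stores and read back from whichever survives.
# NOT the IBM toolkit's API; all names here are invented for illustration.

class ObjectStore:
    """Minimal interface an object-storage backend is assumed to expose."""
    def put(self, key: str, data: bytes) -> None: ...
    def get(self, key: str) -> bytes: ...

class InMemoryStore(ObjectStore):
    """Stand-in for a real backend (S3, Swift, Azure Blob, ...)."""
    def __init__(self, name: str):
        self.name, self._blobs = name, {}
    def put(self, key, data):
        self._blobs[key] = data
    def get(self, key):
        return self._blobs[key]

class CloudOfClouds:
    """Replicate writes to every backend; serve reads from any survivor."""
    def __init__(self, backends):
        self.backends = backends
    def put(self, key, data):
        for backend in self.backends:   # full replication, the simplest scheme;
            backend.put(key, data)      # erasure coding would cut the overhead
    def get(self, key):
        errors = []
        for backend in self.backends:   # first healthy backend wins
            try:
                return backend.get(key)
            except Exception as exc:    # backend down, key missing, etc.
                errors.append((backend, exc))
        raise RuntimeError(f"all backends failed: {errors}")

if __name__ == "__main__":
    store = CloudOfClouds([InMemoryStore("provider-a"), InMemoryStore("provider-b")])
    store.put("backup/volume-snapshot-001", b"...snapshot bytes...")
    store.backends[0]._blobs.clear()    # simulate one provider failing
    assert store.get("backup/volume-snapshot-001") == b"...snapshot bytes..."
```

A production system also has to solve the harder problems the article mentions, such as metadata coordination and avoiding client-side synchronisation, which is where the toolkit's algorithms come in.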
Through the "cloud of clouds toolkit", we have opened the gates to the intercloud for a number of IBM storage systems; in our prototypes, single files, whole filesystems or snapshots of volumes can be migrated, backed up or shared on the cloud(s) — independent of the vendor. InterCloud Storage explicitly addresses space efficiency, data synchronization, and metadata coordination when storing data redundantly on object storage. Once a cloud fails, the back-up cloud immediately responds and ensures data availability — transparently to the user. No synchronization or communication among cloud clients is needed due to the innovative approach that adds redundancy and tolerates failures. Lastly, InterCloud Storage features a modular, layered, and highly configurable design. Layers are switchable, and can be adopted to a variety of different configurations (single-, multi-cloud, etc.). The implementation and use of layers in InterCloud Storage is transparent to the client. More details on the patent-pending algorithms in the cloud of clouds toolkit read this published paper titled "Robust data sharing with key-value stores". InterCloud Storage will begin pilot testing in early 2014. Excerpt: Alessandro Sorniotti of IBM unveils the InterCloud Storage - cloud of clouds toolkit promising seamless migration and backup across clouds.. In the late 1990s, few people quite recognized the enormous potential of the Web or how it would transform the way we live and work. Cloud computing is similarly at an early inflection point today, and we are staring up at a “hockey stick” shaped growth curve.   ### How will IT Trends affect CSPs and consumers in 2014? It seems like it is that time of year again... the annual rehash of the year that was and the jostling for pre-eminence in predictions for the year which approaches. So let us heave ourselves onto that creaking bandwagon and take a look at which IT trends are likely to affect our marketplace over the next twelve months... Trend #1: Continued growth of social and mobile It seems almost too obvious to mention... up to now, it has driven the sector growth both in terms of cloud-based Apps and platforms, such as Facebook, and cloud-based service providers (since smaller SaaS providers and new entrants have looked to IaaS to enable initial growth without the need for big capital expenditure). Threats: Most analysts report a huge surge in the number of businesses and marketers investing or planning to invest in mobile and social as a way of driving consumer engagement with their brand. It is possible, however, that instead of focusing on better-leveraging established cloud-based platforms and utilising emerging cloud-based bespoke App development and hosting services, this investment will be diverted to developing and hosting bespoke Apps in-house. Opportunities: While business in some quarters has been slow to react, consumers have been busy embracing the cloud. This change has been powered by both software and hardware-driven changes. It is expected that this year, Smartphone and Tablet use will overtake that of the traditional desktop. Consumers are now using multiple devices, and no single device is the primary hub - instead, their primary hub is the personal cloud. In addition, the increasingly complex demands of mobile users will require a commensurate growth of server-side computing and storage capacity. Trend #2: Increasing scope of the "Internet of Things" Again, a driver for increasing amounts of server-side computing and storage capacity. 
For me, the opportunities are exemplified by the high-tech vending machines SAP Hana demoed earlier this year. These machines use a Smartphone to transmit transaction data to the cloud for analytics, including inventory management. If the customer has the associated App installed on his or her Smartphone, then a profile will be built up over time based on purchase history, feedback and promotional take-up. ...high-tech vending machines [...] use a Smartphone to transmit transaction data to the cloud for analytics, including inventory management. Threats: In this instance, SAP Hana is offering a full service solution so, while it leverages 'Cloud' technologies, this particular example serves more to illustrate how traditional software/ hardware/ service vendors are moving to take advantage of the new business model. While this might be great for consumers, it means an increasingly competitive marketplace for existing CSPs. Opportunities: For me, this example highlights the new opportunities that exist for organisations who are able to think creatively about the kind of solutions consumers want or could benefit from. For those willing to re-imagine traditional solutions, opportunity abounds. Trend #3: Big Data Big Data has been a topic for many years now, but expanding consumer engagement initiatives and the internet of things will only add to the pressure on organisations of all kinds. Threats: Compliance is a key driver to keep data onsite. Plus, the shrinking rack space/ performance ratio alleviates some of the pressure from organisations. It could also be argued that the big data issue is primarily an opportunity for providers of analysis solutions. It could also be argued that the big data issue is primarily an opportunity for providers of analysis solutions... Opportunities: The ever-increasing demands for the storage, backup, and archiving of data and the additional processing power required for analytics exerts a continued pressure on organisations. In testing economic conditions, where the tendency is to kick big Cap Ex spending into the long grass, Iaas and AaaS look like an attractive way to meet short-term demands. Trend #4: Increasing awareness of Software Defined Networks Software Defined Networking has been touted as the last step in the virtualisation process - enabling data centres to be truly abstracted from the physical. Threats: The main threat here is one of skills shortage. For CSPs providing IaaS solutions, if SDN does live up to its promise of helping organisations to effect better management of their existing infrastructure, it may also pose problems, as it stalls organisations' moves towards supplementary external capacity. SDN should also facilitate improved granularity of data flow metrics, which brings the potential for improved monitoring and security, possibly making cloud solutions more attractive to some. Opportunities: As infrastructure becomes dynamically configurable, flexible provision offered by cloud-based XaaS becomes more desirable. SDN should also facilitate improved granularity of data flow metrics, which brings the potential for improved monitoring and security, possibly making cloud solutions more attractive to some. It has led some analysts to suggest that enterprises should be designing private cloud services with a hybrid future in mind. This strategy will be viewed more positively as standards for infrastructure program-ability and data centre interoperability more fully emerge. Trend #5: The big freeze? 
In 2013, Facebook opened its first data centre outside the USA, in Lulea, Northern Sweden, making a lot of noise about its green credentials as it did so. As well as running on renewable energy from local hydro-electric schemes, the data centre's sub-arctic location has meant it can operate with approximately 70% less mechanical cooling capacity than the average data centre. With Iceland now also promoting itself as a cooler, greener data centre location are we set to see a migration of provision to colder climes? Threats: What may be good news for the environment may be bad news for less competitive data centres located in warmer climates and dealing with higher energy costs. Additional market disruption may also evolve from Facebook's decision to share data centre efficiency knowledge through the Open Compute programme paving the way for low-cost imitators in economies like China and India to replicate these operating efficiencies, with green energy potentially coming from solar and wind sources; greater competition from lower operating costs will force all players to take increasingly efficiency-driven energy management initiatives. What may be good news for the environment may be bad news for less competitive data centres located in warmer climates and dealing with higher energy costs. Opportunities: The lack of warm days isn't the only consideration when building data centre cooling systems; air moisture levels can also impact energy consumption dramatically, so there is something positive for UK sites! If increasing energy efficiency, reduced operational costs, and open standards for data centre operation and interoperability are the result then we can only all benefit in the long run. Trend #6: Consolidation in the market place? Back in July, the BVP Cloud Computing Index announced that the top 30 Cloud Technology companies in the US had a market capitalisation value of more than $100 Billion - claiming the figure demonstrated Cloud had 'come of age'. Threats: This spend on Cloud is still just a small drop in the IT ocean. Gartner figures suggest the global annual $30 Billion spend on SaaS, PaaS and IaaS is just 2% of an overall global annual IT spend of $1,370 Billion. AWS may now dominate IaaS and arguably PaaS provision, but Microsoft, Google, IBM, VMware and others are now vying for market share. It is yet to be seen whether this competition will lead to an expansion of the overall market or a redistribution of market share. Data from the Synergy Research Group suggests the Cloud computing market grew by 47% year on year... Opportunities: The BVP Index is still heavily dominated by SaaS and doesn't take account of privately-owned Cloud businesses. Data from the Synergy Research Group suggests the Cloud computing market grew by 47% year on year (up to Q2 2013). What factors are going to be decisive in 2014 for your business? How do you see the market developing? What predictions do you make for the next 12 months? ### What's Driving the African Cloud Computing Boom? Given the fact that cloud computing is such a new technology, and indeed many people in the world are not even familiar with it yet, one doesn't really associate it with a part of the globe that is most commonly referred to as the third world. But the recent growth of the cloud in one particular African nation indicates clearly that the continent is moving forward with its instigation of cloud computing on a theatre-wide basis. 
African Cloud and Internet activity hotspots A report this week has indicated that more businesses in Nigeria will be using cloud-based services than in South Africa by the end of 2014, with Nigeria also apparently shooting past Kenya in the same list in the process. This is according to a survey which has recently been carried out by the American multinational networking company, Cisco, with some assistance from World Wide Worx. ‘The Cloud in Africa: Reality Check 2013' research study found that already half of South Africa's medium and large-sized businesses are using the cloud, as opposed to 48 per cent in Kenya and 36 per cent in Nigeria. But this is expected to change rapidly in the next couple of years, as 44 per cent of the Nigerian businesses surveyed indicated that they expect to adopt the cloud by the end of 2014. This compared to just 16 per cent in South Africa and 24 per cent in Kenya. Cisco indicated in their report that there is a broadband revolution spreading through Africa like wildfire, and that the lack of conditioned thinking on IT matters would stand the cloud in good stead within the continent in the coming years. Of those polled, over 90 per cent had no security concerns about the cloud, which would probably differ somewhat from the same figure among Western companies, even though the adoption and take-up of the cloud is steadily increasing across the Western world. It is perhaps hard for those of us in the West to imagine Nigeria rapidly embracing the cloud, but news seeping out of the nation in recent weeks has indicated that this is emphatically the case. Just weeks ago, it was reported widely that Computer Warehouse Group Plc, a Nigerian technology company, was seeking to raise capital via the stock exchange in order to completely restructure the company's business toward cloud computing, and away from traditional hardware and software. Computer Warehouse Group is apparently aiming to become the largest cloud-computing provider in Africa by 2015. In announcing the news, the company spoke enthusiastically about being a “trailblazer” in cloud computing, and also stated that they expected many computing firms in Nigeria, and across the whole of the African continent, to follow suit once they'd witnessed how successful Computer Warehouse Group's plan had been. It was also reported that African computing companies are increasingly looking to the cloud to increase their profit margins. For example, Dimension Data Holdings Ltd, a South African firm, acquired the California-based OpSource in 2011 in order to catalyse its cloud operation. The trend for African firms to move into cloud computing has been exemplified by the recent growth in the industry in Nigeria, but it is by no means limited to the nation that is expected to become Africa's biggest cloud advocate within a couple of years. Aside from the lack of dogmatism with regard to cloud computing in Africa, the relative lack of existing IT infrastructure has also been cited as a reason for the growing interest in the cloud in Africa. It makes little sense for companies to invest money that they may very well not have in expensive IT equipment if it is not necessary and may even become obsolete in the future. The same can be said for the uptake of mobile networks and mobile devices in Africa - why build expensive legacy fixed line network infrastructure when modern wireless communications will do?
Aside from the African study, Cisco has also recently published its third annual Global Cloud Index, which suggests that global cloud traffic will increase by over 450% by 2017, reaching 5.3 zettabytes four years from now. This same global study predicted that the Middle East and Africa will have the highest growth in cloud traffic over the period. The Middle East and North Africa are often associated with one another hence the acronym MENA, partly due to their geographical proximity, but also due to business and economic ties which have been built up due to the area's housing of some of the world's richest supplies of crude oil. It goes without saying that the adoption of the cloud in the world's most oil-rich region will be a huge boost to the future of the technology. ### What is the CloudEthernet Forum? By James Walker, President of the CloudEthernet Forum (CEF) Cloud services have become immensely popular much faster than was ever predicted. This enormous growth is driving a gap between what leading cloud providers can deliver and the rest of the industry when it comes to provisioning cloud resources and connectivity. In addition, users are demanding more control and better visibility of services. These market drivers were behind the formation of the CloudEthernet Forum (CEF) earlier this year. The CEF was set up to find solutions to these problems, and to anticipate others which are still emerging. Scalability of Ethernet to support millions of MACS and VLANs in a way that maximizes the transparency of deployment, and minimizes the configuration required for setup and operation. End to end provisioning that is agnostic to the specific devices and technologies across the network, and requiring minimal or no human intervention for rapid deployment of services. Deterministic traffic behaviour and performance to meet specified SLAs or legal/regulatory requirements – e.g. for keeping data within national boundaries. These challenges are being faced right across the industry and no one company can be expected to solve such fast-evolving issues. These challenges are being faced right across the industry and no one company can be expected to solve such fast-evolving issues. There is also a need to support these solutions across multiple operators and vendors. Collectively, the CEF's members can bring about changes that will help enterprise customers take full advantage of the cloud. The network equipment manufacturer community will play a key role in the success of the CloudEthernet Forum by ensuring that their various solutions are compatible and can be used to implement next generation cloud services using Ethernet based on implementations specified by the Forum. However, it is the data centre and WAN service providers that play a primary role by bringing to bear their implementation and operational experience and expertise in defining the appropriate implementations that the rest of the industry can follow. 
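As background to the scalability challenge listed above, a quick back-of-envelope calculation shows why "millions of MACs and VLANs" is beyond classic Ethernet: the 802.1Q tag reserves only 12 bits for the VLAN ID, which is one reason overlay encapsulations with larger identifiers (for example VXLAN's 24-bit segment ID, which is not discussed in this article but is widely used for this purpose) appear in cloud data centres. The figures below are simple protocol facts rather than CEF positions.

```python
# Back-of-envelope numbers behind the "millions of VLANs" scalability point.
# 802.1Q VLAN IDs are 12 bits; VXLAN uses a 24-bit segment identifier.

VLAN_ID_BITS = 12          # classic 802.1Q tag
VXLAN_VNI_BITS = 24        # VXLAN Network Identifier

usable_vlans = 2 ** VLAN_ID_BITS - 2      # IDs 0 and 4095 are reserved
vxlan_segments = 2 ** VXLAN_VNI_BITS

print(f"802.1Q VLANs available per network: {usable_vlans:,}")    # 4,094
print(f"VXLAN segments available:           {vxlan_segments:,}")  # 16,777,216
```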
Benefits of membership include the following: Ability to influence the direction of the cloud service infrastructure market and its rate of adoption Faster access to key information that can impact their service or product rollout Positioning their companies as market leaders in their particular part of the cloud services industry The CEF is an industry organization of market-leading cloud service providers, network service providers, equipment manufacturers, system integrators and software developers that are focused on development of CloudEthernet technologies through open standards development. Affiliated with the 200+ member Metro Ethernet Forum (MEF), the CEF aims to address the $200B cloud services market and accelerate and facilitate the use of standard protocols to support large scale global data centre deployments, making cloud services easier, faster, more secure, and affordable to deploy. For more information, and detail of how to join the Forum, please visit: www.CloudEthernet.org ### Verizon the Latest Big Cloud Player By Christopher Morris, Tech Commentator The retail firm, Amazon, have already established themselves as the world's biggest player in the cloud market. The company that originally made its name as a bookseller have greatly diversified as the size of the corporation as grown, and its successful move into electronics with the electronic book reader the Kindle has convinced the strategists behind the company's future to move into cloud computing. This could hardly have been more successful, given that the company that Forbes listed as the seventh most innovative in the world is currently the ‘go to' cloud provider. However, in what is certain to be a trillion dollar industry in years to come, Amazon would be extremely naive should they expect to get the apex of the mountain to themselves for a significant period of time. The nature of cloud computing means that many diverse companies can get involved in this embryonic technology, and it is the early years of a particular innovation or new phenomenon when very big gains, or losses, can be made in a market. This is probably doubly true for anything related to hi-tech industry or electronics, with many a brand today consigned to history, and many early market leaders having been completely wiped out. Thus, Betamax was wiped out by VHS despite many devotees claiming it to be technically superior, Sega made a series of abysmal decisions that turned them from the world's biggest console manufacturer to completely out of the business in a matter of a few years, while it is only the most committed Internet fanatic who will today remember the name Netscape Navigator. ...we can expect to see many familiar names announcing cloud services and trying to get a slice of the action So while big corporations continue to vie for supremacy in the cloudspace, and indeed to merely establish themselves as a viable cloud player, we can expect to see many familiar names announcing cloud services and trying to get a slice of the action. The latest major corporation to get involved with the cloud in a big way is Verizon, who have recently announced a cloud-based service for businesses, simply called Verizon Cloud. Moving into cloud computing on a major level would seem to be a natural move for the international telecommunications firm, whose yearly revenue is well over $100 billion, and whose assets include a significant share in the Italian arm of Vodafone. 
In order to sell their new service to potential customers, Verizon has claimed that "the new Verizon Cloud service will offer better end-user control over performance than any other cloud solution." At present, the service is still in the beta phase, but enough is known about Verizon Cloud for it to be stated that the service will feature an IaaS elastic computing system, Verizon Cloud Compute, and an object storage system, Verizon Cloud Storage, once it is fully released. ...enough is known about Verizon Cloud for it to be stated that the service will feature an IaaS elastic computing system, Verizon Cloud Compute, and an object storage system, Verizon Cloud Storage... The gimmick, as you might call it, that Verizon hope will make their service stand apart from other is that they claim that Verizon Cloud delivers the performance that you actually require from the service, rather than forcing you to pay for stuff that you don't need. Verizon state that Verizon Cloud will enable users to “set virtual machine and network performances”, which they claim will enable users to maintain a consistent and predictable system performance from their cloud service, even during peak terms of usage. This differs from Amazon's Web Services for cloud, for example, which compels users to select a preset virtual machine size. Whether Verizon can deliver or not on this claim at this early point in the lifespan of the cloud remains to be seen. What can be said for certain is that their cloud-based offering for businesses is to be based on technology produced by Terremark; a cloud technology company that Verizon acquired a couple of years ago. ...their cloud-based offering for businesses is to be based on technology produced by Terremark (Verizon Terremark) The CTO of Verizon Terremark, John Considine, has been very much pushing the flexible element of the new Verizon system, stating that unlike existing applications, the behaviour of your neighbour will have no effect on the way that Verizon Cloud operates. There is a complex technical explanation behind how Verizon state that they have achieved this, but essentially Considine claims that Verizon will allocate performance on an individual basis, and that this will be based on individual elements and user capabilities. It is very early days for Verizon's offering, given that we have yet to see a final release. But evidently Verizon believe that they've made a significant technological breakthrough as the cloud power struggle begins in earnest. Excerpt: Christopher Morris reflects on recent news that Verizon is going head-to-head with Amazon for flexible cloud compute resourcing at scale. The retail firm, Amazon, have already established themselves as the world’s biggest player in the cloud market. The company that originally made its name as a bookseller have greatly diversified as the size of the corporation as grown, and its successful move into electronics with the electronic book reader the Kindle has convinced the strategists behind the company’s future to move into cloud computing. ### Next generation IaaS: Multi-cloud via APIs By Basant Singh, Software Engineer and Blogger Demystifying the new buzzword: "Multi-cloud" When you enquire about availability of a service to cloud IaaS providers, without fail they talk about three nines (99.9%), four nines, five nines uptime percentages and design for failure ideas. 
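Those "nines" translate into concrete downtime budgets that are easy to check. The short sketch below (rounded figures, ignoring leap years) converts an availability percentage into permitted downtime; the 99.95% row matches the EC2 commitment quoted next.

```python
# Convert an availability promise ("number of nines") into allowed downtime.
HOURS_PER_YEAR = 365 * 24          # 8,760
MINUTES_PER_WEEK = 7 * 24 * 60     # 10,080

def allowed_downtime(availability_pct: float):
    unavailable = 1 - availability_pct / 100
    return unavailable * HOURS_PER_YEAR, unavailable * MINUTES_PER_WEEK

for pct in (99.9, 99.95, 99.99, 99.999):
    per_year_h, per_week_min = allowed_downtime(pct)
    print(f"{pct}%  ->  {per_year_h:.2f} h/year, {per_week_min:.2f} min/week")

# 99.9%   ->  8.76 h/year, 10.08 min/week
# 99.95%  ->  4.38 h/year,  5.04 min/week   (the EC2 commitment quoted below)
# 99.99%  ->  0.88 h/year,  1.01 min/week
# 99.999% ->  0.09 h/year,  0.10 min/week
```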
The most notable IaaS service - Amazon EC2 - offers a 99.95% service commitment, which translates to 4.38 hours of downtime per year (or 5.04 minutes per week). Rackspace, alternatively, offers a 100% network uptime guarantee, as do numerous others. Note – the above-mentioned SLA availability excludes scheduled downtime for maintenance, as well as many other exclusions mentioned in the small print. Although on any given day cloud service availability is much higher than that of traditional hosting, cloud IaaS has its own share of hiccups, and these become the talk of the town because expectations are so high. Every now and then we keep hearing about outages at Amazon AWS, Microsoft or Google. As a result, during the last few major outages social media was buzzing with talk of the unavailability of popular cloud-hosted services like Netflix, Twitter, Zynga, Quora, Heroku, Instagram, AirBnB, Foursquare and Reddit, which went offline due to their providers' outages. Outages are a part of IT and you can't stop them. No matter how well prepared you are to prevent an outage, somehow it is waiting to happen! There are multiple hardware and software components, along with multiple parameters, that must collaborate seamlessly to run a service, and any one of them can fail at any point in time, making the service unavailable. Additionally, there are many external factors (natural disasters, grid failures etc.) that are beyond the control of any single entity. I think it is not wise to expect a service that is always available and 100% reliable. But what about your customers? ...you can easily play the tweet-and-blame-your-provider game, as many services are already doing, but eventually it's you who has to bear the revenue loss For every spell of downtime you can easily play the tweet-and-blame-your-provider game, as many services are already doing, but eventually it's you who has to bear the revenue loss, the loss of competitive edge and the damage to the brand value of your company. Even a few minutes of downtime is blown out of proportion in social media circles. It also seems your competition is just waiting to grab this opportunity and turn it into an advantage. So how can IaaS minimize the effect of outages? Existing solutions Mirroring and Availability Zones: Rackspace and Amazon EC2 IaaS providers do indeed have a strategy for this. Rackspace offers multiple geographically separated regions and recommends that by mirroring your infrastructure between data centres you can mitigate outage risk. On the other hand, Amazon EC2 offers multiple AWS Availability Zones. As per Amazon's portal FAQs: "Each availability zone runs on its own physically distinct, independent infrastructure, and is engineered to be highly reliable. Common points of failures like generators and cooling equipment are not shared across Availability Zones. Additionally, they are physically separate, such that even extremely uncommon disasters such as fires, tornados or flooding would only affect a single Availability Zone." But it has been observed that during last year's Amazon EC2 outage (US East data center) on 22 Oct. 2012, even applications configured for multiple availability zones were knocked offline. Multi-hypervisor design: Example: OnApp Cloud OnApp's server-based multi-hypervisor design monitors different cloud services and supports automatic failover by relocating virtual machines and rerouting application data.
This service empowers you to point-click-and-manage clouds based on different virtualization platforms. OnApp currently supports Xen, KVM and VMware hypervisors. Federated network of public cloud providers - Computenext and 6Fusion Federation of cloud services (cloud brokerage) is about accessing multiple cloud IaaS offerings from a single sign-on account and managing them, alongside comparing, measuring and providing unified billing of the compute resources. Based on your requirements, you can change your provider if needed without rewriting your code and API calls. No more vendor lock-in! It seems the above-mentioned platforms are already serving as the beginnings of the much-talked-about multi-cloud approach by providing the API abstraction (explained later) for multi-cloud deployments. I think they should take their service to the next level by providing automatic failover and built-in real-time communication between multiple vendors. A formidable challenge! Nextgen solution Is the multi-cloud approach a better strategy to achieve the highest possible service availability? Yes, of course it is. A few companies have already started experimenting with this (PayPal, for instance). A few have already sensed the upcoming demand for it and are in the process of building the right tools for the multi-cloud approach (RightScale multi-cloud management). Apart from the apparent benefit of high availability, multi-cloud may lead to price reduction and healthy competition among IaaS providers for a better service. As a customer you no longer have to face that vendor lock-in issue. Apart from the apparent benefit of high availability, multi-cloud may lead to price reduction and healthy competition among IaaS providers for a better service. What are the challenges in multi-cloud implementation? As a developer I understand that implementing this is easier said than done unless we address the following cloud standards issues: interoperability, portability etc. The path to multi-cloud goes via APIs. APIs…? I won't define APIs here, but let me give you a practical scenario from daily life to illustrate the role of an API in a multi-cloud approach. How do you book (reserve) flight tickets? You simply sign in to your favorite travel portal (or the airline's portal), enter the journey date and city-pair details, and within seconds you have a list of available airlines on your screen. You choose one of them, and after a few clicks and within a few minutes you've got the booking confirmation message. It sounds so simple. Now you must be aware that there are thousands of travel portals, offering flight booking services to millions of direct customers (registered/guest users) and travel agents in almost 200 countries spread around the globe! Similarly, for train and bus bookings, many of these portals provide the facility to view the seat layout and book as per your requirements. As bookings go on simultaneously across the world through thousands of portals, how do they ensure that the same seat is not booked for more than one customer, or that the total number of bookings does not exceed the available seats? Now this sounds a bit complex, doesn't it? All this is made possible via the magic of APIs running in the background. Airlines, train and bus operators each have their own inventory. They (or a third party) maintain a computer reservation system (CRS) where they enter their inventory details in their respective databases.
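Jumping ahead slightly to translate this CRS/GDS pattern into cloud terms, the sketch below shows the kind of abstraction layer plus automatic failover the article argues for. The provider classes, method names and the simulated error are invented for illustration; a real broker would wrap each vendor's actual API (or use an abstraction library such as Apache Libcloud) behind a common interface.

```python
# A sketch of the "GDS for clouds" idea: one interface, many providers,
# automatic failover. Provider names and methods are illustrative only.
from abc import ABC, abstractmethod

class CloudProvider(ABC):
    """Common interface each vendor-specific adapter is assumed to implement."""
    name: str
    @abstractmethod
    def create_server(self, image: str, size: str) -> str:
        """Provision a server and return its identifier."""

class ProviderAAdapter(CloudProvider):
    name = "provider-a"
    def create_server(self, image, size):
        # a real adapter would call provider A's API here
        return f"{self.name}-vm-001"

class ProviderBAdapter(CloudProvider):
    name = "provider-b"
    def create_server(self, image, size):
        raise ConnectionError("provider-b region is having an outage")

class MultiCloudBroker:
    """Try providers in order of preference; fail over on error."""
    def __init__(self, providers):
        self.providers = providers
    def create_server(self, image, size):
        last_error = None
        for provider in self.providers:
            try:
                server_id = provider.create_server(image, size)
                print(f"provisioned on {provider.name}: {server_id}")
                return server_id
            except Exception as exc:
                last_error = exc
                print(f"{provider.name} failed ({exc}), trying next provider")
        raise RuntimeError("no provider could satisfy the request") from last_error

broker = MultiCloudBroker([ProviderBAdapter(), ProviderAAdapter()])
broker.create_server(image="ubuntu-12.04", size="small")
```

The point is that your application codes against the broker rather than any single vendor, which is exactly the no-lock-in, failover-friendly property described in this section.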
Once they have a CRS, every vendor who wants to be a part of GDS (Global Distribution System) needs to expose a web API to access its inventory. A GDS will aggregate many such APIs from different vendors and can offer its own web API to multiple channels like travel portals throughout the world. Now, any booking request to the GDS API will be directed to all the participating vendors computer reservation system (CRS) and the response from them is consolidated and shown to the calling program, (i.e. to your favorite travel portal). So, the GDS APIs are simply a layer of abstraction between the actual vendor inventory and the consumer (travel portals). More or less this is the work flow in any ticket booking system. The API will receive a few parameters like date, city-pair etc. from your portal and will respond with the availability and fare details. To simplify the above let me say that if you wish to develop a travel portal you don't need to worry about talking to the multiple operators for inventory. You can contact a GDS provider, (Galileo, Amadeus, Sabre etc.) purchase a license and integrate their API in your web application. The API will receive a few parameters like date, city-pair etc. from your portal and will respond with the availability and fare details. So, if you want to build a travel portal for international flight booking, it doesn't matter in which part of the globe you are, as a prerequisite you have to integrate the GDS APIs to your application. This abstraction, to some extent lowers the barrier of developing a portal and that's the reason there are numerous new travel (agent) websites starting up operations every other day. Bottom-line Maybe the cloud should evolve and mature a little more in terms of interoperability and standards in order to offer this much awaited new approach at an affordable cost. I think the above analogy says it all for the cloud services as well. The IaaS providers have got an inventory to share and most of them already have their own APIs to manage server resources. But before the IaaS goes the multi-cloud way on an industrial scale, many rough areas need to be smoothed out. Maybe the cloud should evolve and mature a little more in terms of interoperability and standards in order to offer this much awaited new approach at an affordable cost. What do you think… are we going to witness many multi-cloud implementations in 2014? ### IT Managers Struggling to See Through all-Flash Storage Myths By Gavin McLaughlin, Solutions Development Director at X-IO Storage UK businesses are being bamboozled by overhyped marketing myths about the performance, reliability and power consumption of all-flash storage arrays despite practical evidence and common-sense arguments to the contrary. 74% have a rose-tinted view... According to a recent survey, close to three-quarters (74%) of IT managers have a rose-tinted view of all-flash storage, despite misgivings about the cost, risk and general lack of need for all-flash arrays. As many as 76% accepted the myth that all-flash arrays are faster than hybrid arrays. But while flash can undoubtedly assist with lowering latency for random reads, it can often be the same or even slower than well designed hard disk arrays under some workloads, particularly sequential writes. A similar number accepted that all-flash solutions used less power and cooling compared to hybrid storage. But real-life testing by numerous users has found hybrid arrays use half the power of all-flash arrays in like-for-like evaluations. 
Flash modules/SSDs draw less power as raw components than HDDs, but enterprise storage arrays are not just raw storage. They use processors, and cache memory is also needed in some designs, both of which require power. On average, all-flash arrays draw double the amount of power and require more cooling than true hybrid arrays. ...real-life testing by numerous users has found hybrid arrays use half the power of all-flash arrays in like-for-like evaluations Two in five (40%) respondents believed all-flash arrays provided a higher degree of reliability than hybrid arrays, based on marketing messages from all-flash vendors that HDDs are unreliable. But this is not the case. Some enterprise storage vendors have been guilty of using the wrong tool for the job by relying on consumer-grade SATA drives to cut supply chain costs, causing hard disk drives to be seen as problematic. When issues with flash media are taken into account, particularly cell failure on NAND silicon, flash arrays usually have shorter duty cycles than hybrid storage. It's clear IT managers are unaware of the truth behind these all-flash myths. Many vendors are trying to convince customers and resellers that hard disks are outdated technology and flash is the most appropriate media for all use cases. The truth is that hybrid storage is the practical option: it offers the best of both worlds by combining the advantages of flash with the established benefits of hard drives. The truth is that hybrid storage is the practical option: it offers the best of both worlds by combining the advantages of flash with the established benefits of hard drives. The survey was undertaken by independent research firm Vanson Bourne, which interviewed senior IT decision-makers at 100 large enterprises across the UK. The purpose of the survey, commissioned by hybrid storage vendor X-IO Technologies, was to compare the adoption plans for all-flash arrays in the enterprise against the hype being promoted by all-flash array vendors. It also shows where the market is educated about the promise of all-flash arrays and where it is misinformed. IT budgets are currently under tight restrictions, so IT managers need to implement a storage solution which provides the right amount of performance whilst remaining cost effective. In time, storage architects and buyers will realise that flash is a tool rather than a solution. But in the meantime, it won't stop some users getting their fingers burnt in the world of storage sales, a place where people push hard to close deals that often fail to offer the best solution for a customer's needs. It is up to the storage industry to help customers and resellers understand the strengths and weaknesses of each type of storage, helping people cut through the hype. It is up to the storage industry to help customers and resellers understand the strengths and weaknesses of each type of storage... *The survey data was collected via an online survey completed by a nationally representative sample of 100 IT managers from key industry sectors such as financial services, manufacturing, retail, distribution and transport, and the commercial sector, from across the United Kingdom. The survey was prepared by Vanson Bourne and conducted in February 2013. The identity of the 100 respondents will remain confidential, in accordance with the Vanson Bourne Code of Conduct. The X-IO name was not revealed during the interviews to ensure the data remained unbiased.
Gavin McLaughlin is Solutions Development Director at X-IO Storage based out of X-IO's London office covering the EMEA region. X-IO Technologies is a recognised innovator in the data storage industry due to its award-winning, cost effective, high performance and self-healing Intelligent Storage systems, including flash-enabled hybrid storage. ### Ten Steps for CIOs to De-Risk SDN There is a new breed of CIO embedded in the boardroom looking at the digital business. These CIOs are aiming to do more than keep the lights on. They want to make their corporate networks more like the cloud. They want the ability to respond to application requirements quickly, whilst keeping data secure and helping the delivery of policies like secure BYOD. [easy-tweet tweet="New CIOs want to make their corporate networks more like the cloud" via="no" hashtags="cloud"] However, most rigid corporate networks simply can’t handle the kinds of performance, growth, change management and automation that these companies demand. What they get instead is an unacceptable 30 or 60-day wait for suppliers to make network changes. They have to wait for router deliveries. They have to wait for software upgrades. They have to submit support tickets for a simple network changes. They get poor service.    Fortunately, there are a whole host of innovative technologies in the market such as Software-Defined Networking and Network Function Virtualisation, deployed by nimble suppliers, that are revitalising enterprise networks. Software Defined Networking [SDN] particularly can provide flexibility and superior performance to give the CIO and their organisation the competitive edge they’re looking for. We have been using SDN concepts for over a decade and, in doing so, we’ve learned some valuable best practice tips for the successful deployment of a SDN platform that we’d like to share: Business case This might seem basic, but its important, you don’t ‘buy a SDN’. It’s not a product – a network - you can pick off the shelf. SDN is an enabling capability that sits in your network. SDN can offer compelling enterprise benefits, and you will need to be able to convey those benefits to the Board. You will need to present a business case for SDN to your CFO and CEO peers. While SDN’s business benefits are real, they can be difficult to demonstrate. SDN involves network changes at the foundational level, so the benefits can take time to appear. Start with a small implementation project. This will help you demonstrate the advantages of SDN and can pave the way for additional funding for future projects. Focus the implementation on a business use case that is near and dear to your company. SDN, for example, can foster more rapid service development and provisioning and when that’s demonstrated alongside mobile app development, business executives can see how SDN could expedite delivery of new products and services and lead to increased revenue streams. Consider the cultural implications of SDN within IT. IT leaders should take a page from today’s DevOps market and encourage collaboration among network engineers and software developers when planning, testing and executing SDN strategies and implementations. IT organisational charts may even need to be rewritten as roles and responsibilities change. SDN is an enabling capability that sits in your network Deployment Rather than move to an all-SDN setup in one move, consider gradually combining SDN with more conventional networks. SDN solutions need to reflect this hybrid approach. 
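To give a flavour of what configuring the network in software looks like, here is a hypothetical sketch of pushing a traffic policy to an SDN controller's northbound REST API. The controller address, endpoint path, credentials and JSON fields are all invented for illustration; each controller product defines its own schema, so treat this as the shape of the interaction rather than a real call.

```python
# Hypothetical example of a programmatic network change via an SDN controller's
# northbound REST API. The URL, endpoint and JSON schema are illustrative only;
# every controller (OpenDaylight, vendor controllers, etc.) defines its own.
import requests

CONTROLLER = "https://sdn-controller.example.com:8443"   # placeholder address
AUTH = ("api-user", "api-password")                      # placeholder credentials

policy = {
    "name": "voip-priority-london-site",
    "match": {"site": "london-hq", "application": "unified-communications"},
    "action": {"priority": "high", "guaranteed_bandwidth_mbps": 20},
}

response = requests.post(
    f"{CONTROLLER}/api/v1/traffic-policies",   # hypothetical endpoint
    json=policy,
    auth=AUTH,
    timeout=10,
)
response.raise_for_status()
print("policy accepted:", response.json().get("id", "<no id returned>"))
```

The value lies less in the specific call than in the workflow: a change that previously meant a support ticket and a multi-week wait becomes a small, version-controlled script that can be reviewed, tested and rolled back.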
SDN requires networking staff to switch their focus from hardware to software. With SDN, IT staff configure networks with graphical software and write code using interactive developer tools. This requires new ways of thinking, new types of training, perhaps even new positions on the IT org’ chart. Danger zones Security! By moving the “brains” of the network to a central Controller device, SDN creates a new vulnerability: If the Controller is taken down by an attack, the entire network can crash. SDN solutions require additional security measures, both built into the architecture and delivered as a service. The term SDN can mean different things coming from different suppliers. For IT departments, that’s confusing. Standards will help. While some standards already exist, more standards — and more mature ones — are still needed. Working with network service providers If you plan to work with a service provider ask some basic questions before engaging, for example: How do you support SDN in your network services? What types of SDN-based tools, portals and applications do you provide and what functionalities do they enable?  Can I see all of my sites by location name? Can I see all traffic per site?  Will I be able to differentiate traffic and determine who’s using bandwidth? Can IT change and configure bandwidth on demand and schedule bandwidth as business needs require?  Do you service provide a business-continuity solution, using the Internet as a backup solution? My last point would be this, if you’re going to upgrade your network, look for a network service provider that has SDN in the roadmap with clearly delineated benefits, services and delivery dates. ### How DCIM Improves Performance and Reduces Carbon Footprint By Margaret Ranken, Telecommunications Analyst As data centres have become increasingly complex, juggling efficiency and availability to meet the ever-growing demand for computing power, the spreadsheets that most data centre managers currently use are no longer adequate. Not only are they a poor means of tracking the increasingly large numbers of assets in the data centre, but they are incapable of meeting the challenges created by virtualization, which pushes servers to their limits and creates “hotspots”. DCIM (data centre infrastructure management) tools have been developed to fill the gap, and many commentators have forecast rapid adoption: in 2012 Gartner predicted DCIM could penetrate as many as 60 percent of U.S. data centres by 2015. 451 Group's recent study predicts DCIM sales to grow at 44% CAGR to reach $1.8bn in aggregate revenue in 2016. However, progress so far has been slow, despite the fact that leading vendor CA Technologies claims energy savings of 30% and an 11 month payback. A Heavy Reading report recently sounded a note of caution about the implementation challenges among the larger data centres that host cloud infrastructure, where IT and building systems are often managed by separate teams. DCIM is important because it can both increase the reliability of the infrastructure and help to reduce carbon emissions. Managing the building infrastructure separately from the IT systems and processes creates problems that have become more critical as virtualization has created a dynamic computing environment within a static building environment. Inefficient allocation of virtualized applications to servers can cause rapid changes in computing load that increase power consumption and create "hotspots”. 
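As a simplified illustration of the monitoring gap DCIM fills, the sketch below checks rack inlet-temperature readings against a commonly cited recommended maximum and flags racks drifting towards a hotspot. The rack names, readings and threshold are example values only; a real DCIM tool would correlate temperature with power draw and the virtualisation layer so that workloads, not just fan speeds, can be adjusted.

```python
# Toy hotspot check of the sort a DCIM tool runs continuously.
# Rack names, readings and the threshold are example values only.
INLET_TEMP_LIMIT_C = 27.0   # commonly cited recommended maximum inlet temperature

rack_inlet_temps_c = {
    "rack-a01": [22.5, 23.1, 23.0],
    "rack-a02": [26.8, 27.9, 28.4],   # trending over the limit
    "rack-b01": [24.0, 24.2, 23.9],
}

def find_hotspots(readings, limit=INLET_TEMP_LIMIT_C):
    """Return racks whose latest reading exceeds the limit, with the trend."""
    hotspots = {}
    for rack, temps in readings.items():
        latest, earliest = temps[-1], temps[0]
        if latest > limit:
            hotspots[rack] = {"latest_c": latest, "rising": latest > earliest}
    return hotspots

for rack, info in find_hotspots(rack_inlet_temps_c).items():
    trend = "and rising" if info["rising"] else "but stable"
    print(f"{rack}: {info['latest_c']} C exceeds {INLET_TEMP_LIMIT_C} C {trend}"
          " - increase fan speed or migrate load")
```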
If unanticipated, these hotspots can be too much for the data centre's air-conditioning systems, reducing efficiency and, in turn, availability due to overloading and server outages. The DCIM approach integrates the management of the IT systems and the building systems into one seamless whole, so that the load on the servers is managed in tandem with the building systems. DCIM also helps to enforce consistent, repeatable processes and reduce the operator errors which can account for as much as 80% of system outages. The detailed real-time monitoring that DCIM tools support improves visibility of both IT usage and physical infrastructure. Right from the design stage, DCIM uses power, cooling and network data to determine the optimum placement of new servers. During operation, equipment that is consuming large amounts of energy is identified and, where hotspots are developing, fan speeds are increased and server loadings re-configured to pre-empt problems. Integrated asset management tools mean that managers know exactly what the servers are doing and when resources such as space, power, cooling and network connectivity are likely to run out. They can analyse “what if” scenarios for server refreshes, the impact of virtualization, and moves, adds and changes, and predict the effects of any faults. When choosing a cloud service or managed service provider, you should look for one that is either already using DCIM, or plans to introduce it soon, if you want to be sure that your mission-critical IT systems are being operated to the highest possible standards. DCIM brings the additional benefit of making it simple to monitor and reduce the data centre's carbon footprint. Using the wealth of data gathered, it becomes possible to make plans to reach energy-saving goals, implement them and document the progress. DCIM tools provide the accurate energy consumption analysis and verification needed to demonstrate compliance with environmental regulations and contribute to meeting corporate social responsibility targets. When choosing a cloud service or managed service provider, you should look for one that is either already using DCIM, or plans to introduce it soon, if you want to be sure that your mission-critical IT systems are being operated to the highest possible standards. About the Author: Margaret Ranken has spent over fifteen years analysing the telecoms industry and providing strategic advice to some of its largest players. Her reports have covered a variety of industry topics including M2M, Enterprise Mobile Data, Unified Communications, Fixed Mobile Convergence, Quad Play and High-Speed Broadband Networks. Her forecasts for Business Data Services were highly regarded in the industry. Before that, she spent a decade working on European Commission research projects in advanced communications, managing projects and writing reports on their results. She started her career as an engineering trainee with what is now British Telecom. She has an MSc in Telecommunications and Information Systems from the University of Essex and an MA in Engineering from the University of Cambridge.   ### How to Sell Cloud and Shift all the Risk to Customers. (An Exposé!) In an earlier post I identified the types of risks that I tell customers they should address in cloud contracts. I also advise cloud providers - and they are generally willing to have an open dialogue with customers about the risks associated with cloud.
This is because to give a customer the safer, more reliable cloud that they desire, it is actually an opportunity for a cloud provider to sell a better specified, more expensive cloud. Welcome Mr Customer... But if you really want to sell cloud without taking on any of the risk here's what you need to know. (Of course, if you're the type of cloud provider who wants to know how to differentiate yourself in the market, then you can address these issues as these are the ones the customers are concerned about.) 1) “As is” Selling cloud “as is” suggests that it is new, untried and untested and the customer is lucky to have it. It is a clear way of saying “buyer beware”. If the cloud is free or cheap -- “as is” cloud is often associated with highly standardised, low cost & low margin services -- or is being used for non-core, less important services, then this is probably acceptable for a customer. If you want to differentiate, then define your cloud in relation to a specification that fits the customer's needs and make assurances over the quality of the cloud. 2) Service credits I have yet to meet a customer who is happy with credits against future payments for a service that is not performing. Service credits are a great way of seemingly providing the customer with a form of compensation but in reality protecting the provider against unhelpful claims. Additionally, providers can make it even harder for customers by not volunteering to pay service credits but by requiring the customer to claim them. And then, to top it off, make sure service credits are the customer's sole and exclusive remedy. But I have yet to meet a customer who is happy with credits against future payments for a service that is not performing. A better way would be to have a sensible dialogue with a customer to see what they want. Customers I advise are more interested in making sure the cloud actually works. 3) Lock in Flexible, scalable cloud is great for customers - they can buy cloud services as they need them. And stop buying them when they don't need them. Of course, this is not a great business model for providers and it would be much better to have a reliable, repeating book of revenue by locking the customer into a longer term relationship. Proprietary or tailored cloud is a way of achieving this. Or, you could just tell the customer he can't terminate for 5 years. Customers I speak to recognise that, where the provider is spending money to build them a private cloud by investing in new hardware, then some form of minimum term is necessary. But even then, customers are willing to negotiate an early termination payment to recognise this. And then will want some assistance to migrate to a new provider. Most customers recognise that paying extra (standard) charges is appropriate. 4) Data loss It is not uncommon for public clouds to contractually exclude their liability for damaging, corrupting, losing or deleting customer data. By the nature of cloud services, the provider hosts the customer's data. It is not uncommon for public clouds to contractually exclude their liability for damaging, corrupting, losing or deleting customer data. This is giving customers a great data storage and processing service and yet not bearing any responsibility if it all goes wrong. Data security and loss is the customer's primary concern. Cloud providers I speak to remind me of how much they spend on building secure data centres and they're willing to protect customer data. 
Selling cloud and taking responsibility for a customer's data is a great way to be better than public cloud. The new draft EU Data Protection Regulation will impose new obligations on cloud providers and there's no getting away from this (at least not in Europe). 5) Cap your risk Often one of the sales messages of cloud is that is cheaper than on premise. Whether this is true is perhaps a topic for someone else to pursue. From a provider's perspective, the money that they are making is probably significantly lower than the losses that a customer would incur if their cloud service did not work. It has long been the case that IT providers will seek to cap all their risk, all of their liabilities under the contract to the amount the customer pays them and cloud providers copy this. You could even cap all risk to just 3 months of customer payments. Of course, the worse it is for a customer, the more likely a judge will be to favour the customer. This is the big debate in the industry. Customers want more but providers can't or won't give more because this would mean them losing much more than they earn. But surely if your cloud is as great as you say it is, then the customer would have no need to sue you for large losses... But surely if your cloud is as great as you say it is, then the customer would have no need to sue you for large losses... If you're a cloud provider, what's your approach to dealing with the risks in selling cloud? Do you sell on the basis of a standard set of risks placed on the customer or do you negotiate with customers and use it as an opportunity to sell better cloud? ### Optimising memory and SSDs for online gaming applications with Kingston Consult [quote_box_center] CASE STUDY In cloud computing and resource virtualisation circles, hardware can often take a back seat in the performance debate. The separation of the virtual and the physical suggests to some that it doesn't really matter what hardware you're running on, all that matters is how many virtual server instances you can produce of varying sizes and how functional your UI and APIs are. However, as growing interest in 'bare metal' cloud services demonstrates, there is an increased level of knowledge and expectation of server performance from business users, as they search out options that produce more consistency and power to drive their applications. Hardware may well be a commodity but it doesn't mean all DRAM modules, HDDs, CPUs, SSDs are all the same - and it certainly doesn't mean every supplier is the same in how they go about ensuring you get the most out of your hardware purchase! That's where Kingston Technology's KingstonConsult service comes in - a free service designed to help you get the most out of your hardware, and frequently there are rich rewards for getting the experts to cast their eye over your operation, as we can see in this case study. [/quote_box_center] The Business challenge i3D.net began in 2004 when CEO Stijn Koster put together his first server for gaming customers. Since then, he's built a managed hosting provider company with over 8,000 servers at 16 datacentres across Europe, Australia, Japan, South Africa and the United States. A broad range of 30,000 customers make use of i3D.net's online infrastructure services and managed hosting solutions. Those clients include brand-name game companies that leverage high-performance servers. All told, the company's servers allow 1,000,000 gamers to play their favourite titles daily. 
Additionally, i3D.net's enterprise customers use hosted servers to run applications with high-availability and high-capacity requirements. That puts company engineers under pressure to field ever-more powerful solutions. "We consistently see higher server specifications year on year," says Stijn Koster, CEO for i3D.net in Rotterdam, Netherlands. "So our customers continuously drive us to boost the performance, capacity and reliability of our servers." Motivated by the desire to save money, the company's enterprise customers require i3D.net to increase the number of virtual server instances that each server can host - all without sacrificing performance. "We needed to raise our server utilisation rates because many of them were already near 100 per cent," recalls Koster. "We considered it an essential goal to help us increase our market share." For gaming customers, performance translates into servers that minimise lag times. That's particularly important for twitch-game (first-person shooter) titles that rely upon prompt screen refreshes to deliver a satisfactory, multi-player game user experience. "Our gaming customers have zero tolerance for slow performance," says Koster. "If our servers don't deliver responsive game play, they'll go somewhere else." With the release of new games and gaming platforms like Xbox® and PlayStation®, the company's engineers have to continuously raise the bar on server capabilities. "The new gaming platforms utilise a lot of cloud infrastructure," explains Koster. "So the servers running the game clients need storage for the platform and game, plus enough memory to run them well. To meet these requirements we had to double our servers' memory." Technology Solution "Before we purchase a large batch of memory or SSDs, we run extensive benchmark and compatibility tests," says Koster. "We've had a really good experience with Kingston® products over the years. That, plus our test results, motivated us to select them as our memory and SSD partner." Maximising server memory performance for gaming customers i3D.net specialists worked with KingstonConsult experts to identify the maximum memory speed supported by legacy server processors. They also reviewed chipset specs to understand the processor memory support rules. Then, they recommended the best Kingston memory modules and configurations to meet performance goals. To extend the service life of existing servers, specialists recommended the use of high-density modules that left memory slots available for future memory expansions. "My engineers told me that the installation experience was plug-and-play - and that's how it should be in today's marketplace," stated Koster. "They also liked that the compatibility of the memory with our processors was easy to check with Kingston's Configurator." Maximising server memory capacity for enterprise customers To support corporate virtualisation requirements, it was necessary to upgrade servers to the maximum memory capacity they could support. To meet this need, Kingston specialists recommended the largest capacity memory modules that the company provides. "Working with Kingston allows us to customise our server configurations as needed to meet our customers' requirements," says Koster. "That way, we're able to offer a tailored solution portfolio versus a one-size-fits-all.
And that's been very attractive to our customers." Maximising server IOPS "Our emphasis on markedly increasing IOPS was really driven by our customers' demand for it," explains Koster. "Kingston engineers showed us how to achieve that end using SSDs. When you replace the millisecond read-write times of HDDs with the microsecond times of SSDs, that adds up to a lot of IOPS per server." Business results Kingston memory modules, SSDs and consultation services helped i3D.net realise a number of business, financial and technical benefits. Significant server performance enhancements deliver competitive advantage The new online gaming platforms can accommodate 60–100 players per instance running on i3D.net servers. "Previously, these loads stressed out our servers," recalls Koster. "But since we upgraded them with Kingston memory and SSDs, we've been able to double the gaming instances from 32 to 64, with some servers able to handle up to 100. And we've done that while limiting lag to deliver an enjoyable user experience." Additionally, the Kingston enterprise-class SSDs helped solve the company's server IO latency bottleneck. "The limiting factor was our HDDs, not our processors," states Koster. "By eliminating that logjam, we've been able to help corporate customers lower their costs by placing more virtual systems on each server." The server performance enhancements certainly yielded significant technical benefits. For Koster, however, the real prize was that "Our server infrastructure has been a fundamental part of the organic growth of our company. It has given us a competitive edge against other infrastructure providers like Amazon and Microsoft. By offering customers tailored server solutions, we've experienced double-digit growth." Enterprise-class reliability supports Service Level Agreements Another key element of i3D.net's success is its SLA guarantee of 99.99 per cent availability. To ensure this standard, i3D.net technicians log and analyse device failure incidents. "The Kingston solutions have definitely made it easier for us to meet our SLAs," says Koster. "The MTBF [mean time between failures] of their modules is incredible. Since we started using them, we've reduced DRAM failures by 50 per cent. And because we use Kingston as our default brand, we don't have to debug memory from different brands, so we're saving 10 per cent in staff labour hours as well." These results stem from Kingston's rigorous testing and quality assurance procedures. Consequently, Kingston backs its memory products with a lifetime warranty, while the Enterprise SSD product line is backed by a three-year warranty. And with program/erase cycles of 30,000, the durability of the E-series SSDs is typically 10 times what client SSDs offer. KingstonCare and consultation services deliver added value Kingston offers a suite of free services collectively titled KingstonCare. These include hassle-free RMA (return merchandise authorisation) plus RMA product cross-shipping and on-site device spares. Together, the services are designed to reduce customer downtime and promote the meeting of SLAs.
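The figures quoted in this case study - millisecond versus microsecond access times, a 99.99 per cent SLA and 30,000 program/erase cycles - lend themselves to some quick back-of-envelope arithmetic. The sketch below is illustrative only: the HDD and SSD latencies and the drive capacity are assumed values chosen for the example, not numbers supplied by Kingston or i3D.net.

```python
# Back-of-envelope arithmetic for the figures quoted in the case study.
# The device latencies and drive capacity below are illustrative assumptions.

def device_iops(access_time_seconds: float) -> float:
    """Theoretical ceiling on IOPS for one device handling one request at a time."""
    return 1.0 / access_time_seconds

def downtime_minutes_per_year(availability: float) -> float:
    """Unplanned-downtime budget implied by an availability target."""
    return 365 * 24 * 60 * (1.0 - availability)

def drive_endurance_tb(capacity_gb: float, pe_cycles: int) -> float:
    """Rough write endurance in TB, ignoring write amplification and over-provisioning."""
    return capacity_gb * pe_cycles / 1000.0

if __name__ == "__main__":
    hdd_latency = 5e-3    # ~5 ms seek plus rotation for a hard drive (assumed)
    ssd_latency = 100e-6  # ~100 microseconds for an early enterprise SSD (assumed)
    print(f"HDD: ~{device_iops(hdd_latency):,.0f} IOPS per device")
    print(f"SSD: ~{device_iops(ssd_latency):,.0f} IOPS per device")
    print(f"99.99% SLA: ~{downtime_minutes_per_year(0.9999):.1f} minutes of downtime per year")
    # 30,000 P/E cycles on a hypothetical 200 GB drive
    print(f"Endurance: ~{drive_endurance_tb(200, 30_000):,.0f} TB written over the drive's life")
```

On these assumed numbers, swapping roughly 5 ms HDD access times for roughly 100 µs SSD access times is worth about a fifty-fold jump in per-device IOPS, and a 99.99 per cent availability target leaves only around 52 minutes of unplanned downtime a year - which is why device reliability features so heavily in the SLA discussion above.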
"Between KingstonCare and the reliability of their solutions, we simply don't have RMA stress here," explains Koster. "That's a huge benefit when you consider that we have 8,000 servers to administer." Koster also praises the performance of Kingston technicians. "They've been advising us on how to gain the maximum performance from specific server configurations. That's an added value to us that sets them apart. It's pretty unique and we don't see that level of engagement from other vendors." SUMMARY Gained competitive advantage by offering flexible server configurations. Optimised servers (memory & SSDs) for performance, capacity and IOPS. Doubled the number of instances that gaming servers can run. Significantly increased the number of virtual systems that corporate servers can run. More easily meet the 99.99 per cent Service Level Agreement (SLA) specification to retain and gain customers. Slashed DRAM failures by 50 per cent since standardising on Kingston memory modules. Durable, enterprise-class SSDs deliver 30,000 program/erase cycles - up to 10 times those of competing solutions. ### Nimbox: Our Story By Tom Chappelow, Nimbox "Nimbox enables corporate Cloud file sharing and backup, allowing users to securely upload, edit and recover documents anywhere in the world." The idea was simple - to create an easy-to-use, secure and available service that was accessible enough for everybody. We wanted SMEs and start-ups to have enterprise-level security. We wanted it to offer unparalleled levels of privacy, whilst keeping it compliant. Above all, we wanted users to trust us with their data. Corporate executives would have called it blue sky thinking - we called it "Nimbox". The System View of Nimbox We decided that the service would be built with privacy at its core. It's for this reason that, before we put our tech-heads on, we created a legal framework that sets out exactly what we could and couldn't protect. To us, transparency is as important as the security of your data. Today, Nimbox boasts some of the most advanced information security methods available, but it wasn't an easy road to get here. Before I tell you about our technical security, I'll focus on a moral issue - an issue that we tackled from day one. Most large Cloud hosting companies design and run 'bots' that crawl through user data looking for copyright infringement, information relating to national security and examples of indecent material. We took the hard decision not to do this. Not because we think copyright infringement is OK, or because we don't care about national security, but because we think that the fundamental right to privacy is one that must be protected. After all, it was Benjamin Franklin who told us that "They who can give up essential liberty to obtain a little temporary safety, deserve neither liberty nor safety." Once we had decided that we didn't want access to our users' files, or to store any information about them that we didn't explicitly need to provide the service, we could make the easy decision to offer end-to-end encryption, to which only the user holds the key. We have not, and will not, join any governmental system that allows for the interception of user data.
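Nimbox's published security standards (detailed below) include PBKDF2-based password hashing with HMAC-SHA256, 20,000 iterations and a twelve-character random salt. As a rough illustration of how a key can be derived on the client so that only the user holds it, here is a minimal sketch using Python's standard library; the function and variable names are our own illustrative choices, not Nimbox's actual implementation.

```python
# Minimal sketch of passphrase-based key derivation, mirroring the parameters
# quoted in the article (PBKDF2 with HMAC-SHA256, 20,000 iterations, a
# twelve-character random salt). Names and structure are illustrative only.
import hashlib
import secrets

ITERATIONS = 20_000   # iteration count cited in the article
SALT_BYTES = 12       # "twelve-character random salt"
KEY_BYTES = 32        # 256-bit key, suitable for feeding an AES-256 cipher

def derive_key(passphrase: str, salt: bytes | None = None) -> tuple[bytes, bytes]:
    """Derive a 256-bit key from a user passphrase; returns (key, salt)."""
    if salt is None:
        salt = secrets.token_bytes(SALT_BYTES)
    key = hashlib.pbkdf2_hmac(
        "sha256", passphrase.encode("utf-8"), salt, ITERATIONS, dklen=KEY_BYTES
    )
    return key, salt

if __name__ == "__main__":
    key, salt = derive_key("correct horse battery staple")
    # A provider following this model stores only the salt and the ciphertext
    # produced with the key - never the passphrase or the key itself.
    print(f"salt: {salt.hex()}  key: {key.hex()}")
```

Because a key derived this way never needs to leave the client, the provider cannot read the stored data even if compelled to hand it over - which is the property the preceding paragraph describes.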
From the start it was clear that security had to run parallel with usability; after all, there is no point offering a service so secure that it inconveniences the user. Users had to love the product as much as we do. Because of this, we offer browser-based and desktop access to files, as well as PIN-protected apps for iPhone, iPad and Android. Our security standards are what you would expect from a privacy-first company: our security protocols include AES 256-bit data-at-rest encryption, TLS 1.2, SHA-256 fingerprinting, password hashing with a twelve-character random salt and 20,000 iterations of HMAC-SHA256 via PBKDF2, as well as physical hosting in ISO 27001 accredited Tier 3 data centres. But security and ease of use without availability is no help at all. That's why Nimbox is hosted on our industry-leading virtual platform, offering 99.999% uptime and daily platform-level backup. We also recognised that although customers want to store their files with us, they may want to bill with their favourite supplier - so we have opened our doors to those resellers that share our values, whilst we maintain the security and operational infrastructure of the service. In summary, we offer a useful range of services that stretch from desktop and server backup and file sync and share, to document collaboration and mobile access - all protected by a robust legal and ethical framework that protects your privacy. We believe we're the best, and we plan to prove this to our users each and every day. For more information visit www.nimbox.co.uk, email sales@nimbox.co.uk or call +44 (0) 8454 75 75 74 ### BRICS in the Wall A couple of weeks ago, I wrote about some of the issues surrounding the growth of cloud computing in China, and how the relatively embryonic technology is set to become a massive part of the economic growth in what is likely to be the world's next superpower. Undoubtedly, obstacles still remain for cloud computing in China, but increasingly it appears that the Chinese government is paving the way for its steady uptake. It is perhaps not commonly known in the Western world, at least among those who don't follow economic issues regularly, that China is part of a new economic bloc that is usually colloquially and collectively referred to as the BRIC or BRICS nations (depending on how relevant one deems South Africa to be in this grouping). BRICS is a simple acronym, which stands for Brazil, Russia, India, China and South Africa, and is a broad-based union, motivated largely by economic factors, which was formed just a few years ago. The bloc originally comprised just the first four countries, with South Africa admitted a couple of years ago. The leaders of the member nations convene regularly for BRICS summits, during which many issues of import are discussed, and the BRICS nations generally conspire to gain their members a bigger collective voice in the way that the world is run, particularly on an economic level.
Often described as 'emerging nations', the BRICS countries are widely expected to have a significant influence on the direction of the global economy in the 21st century, particularly given the obvious economic malaise in the so-called 'Western' world. Thus, it is interesting and important for the future of the cloud that recent surveys carried out in some of the BRICS nations' principal players indicate that the technology appears to be operating on a sound footing in this new area of economic significance. Of course, India has also carved itself out something of a reputation as an IT leader, not merely within the BRICS nations, but in the world as a whole. And, in common with China, the nation has invested very heavily in educating its population in important economic growth areas for the future, such as engineering and IT. The Times of India was among the publications this week that reported on a recently conducted study which indicates that IT leaders located within these 'emerging nations' are excited about the transformational and innovative potential of 'cloud' technology. The research was carried out by Cisco Consulting Services, and the management consultancy firm concluded that the attitudes of IT firms in the likes of Brazil, India and China were extremely positive towards the cloud, and that they were keen to embrace it, not merely for cost-saving, corner-cutting reasons. The Cisco study, which was conducted among a selection of information technology firms in nine nations, reported that 83 percent of respondents considered the cloud to be very satisfactory, while a further 13 percent were somewhat satisfied. This leaves just four percent dissatisfied, giving a strong indication that the cloud is set to become a very big deal in these nations. Cisco concluded that the adoption of cloud computing in India and China is becoming "mainstream", and the broad-based nature of the study, which interviewed 4,226 information technology leaders in eighteen industries, including over 600 in India, supports this notion. It is perhaps telling, though, that companies within this new economic powerhouse had the same basic trepidation as many of those in the more traditionally economically powerful nations. Security and privacy issues were recorded as being top of the list of concerns that companies had regarding cloud technology, particularly the complexity of managing third parties. Nonetheless, respondents had attempted to address some of these concerns in a practical way, and expressed the opinion that even when the switch to a cloud-centric computing culture is made, the IT involved will remain centralised and well-funded, enabling the cloud to be managed and maintained with consistent policy and security solutions. The study gives a strong indication that business leaders in the emerging economic powers are keen to get on board with cloud technology, which can only be a good thing for the future of the cloud in general. It also gives further notice that this will be a truly global technology in the future, not one confined to the traditional 'Western' hierarchy. ### Differentiation in a Cloud(ed) Market An Interview with Rob Davies, Sales & Marketing Director of Pulsant Let's be honest - while there are a lot of cloud vendors out there all promising the best service, highest-performing solution and most outstanding capabilities, the reality is that not all of the marketing talk is a true reflection of what can be delivered.
It takes time and considerable effort to wade through the offerings in the market, and selecting the most appropriate solution is often not simply about the solution itself but about the resources, expertise and credentials of the vendor. What differentiates Pulsant from the rest of the market? It's not just one thing. I think it is a combination of our core strengths, such as our flexible approach to developing an outcome for our customers, our resources - from our ten data centres and 10G core network to our owned and managed cloud platforms - and our accreditations like the Royal Warrant, being HPCloud Agile, ISO 27001, PCI DSS and now STAR**. I believe our most important differentiator is the fact that we invest in the best people. The Pulsant team is dedicated, knowledgeable and works towards the same objective, which is to deliver what our clients need in the best way possible, focusing on the quality of that delivery. Why do you regard your data centres as a strength? Not only do we own 10 data centres strategically located throughout the UK, we also own the 10G network that connects them together. As one of the largest regional data centre players in the UK market, we recognise that our data centre network addresses the requirements of many IT departments and companies that want to be physically close to their IT assets. Speed of access, service and location are therefore important to them and are all issues that are answered by our network. We recently made a £1.5M investment in refreshing our core network infrastructure and we're now running the latest Cisco routing and switching fabric, designed to offer scalability and deliver new services into the future. From a cloud provider perspective, being able to offer services in multiple UK locations, connected by a high-speed, resilient network, is critical to delivering a robust solution and competing in this market. What bandwidth connectivity options do you offer? When it comes to connectivity the options can be endless. We own the dedicated 10G core network that connects our sites, and we offer competitive IP transit, or Internet services, in and out of our data centres, as well as content delivery networks. We have partnerships in place with all the key network carriers so that our customers can select their vendor of choice. This provides our clients with the connectivity they require in, out of and between our data centres, at whatever size they need. Why do you consider flexibility to be a core strength? Our approach is to create solutions around what our clients want, be it technical or commercial, and it is through our flexibility and consultative approach that this is accomplished. We believe that there's no one-size-fits-all model, and ensuring our customers get the best, most appropriate solution is key. This involves assisting in determining the requirements, expectations and objectives, and then offering advice as to which solutions would be suited to address those needs. It sounds pretty straightforward, but often customers don't know exactly what they want or need - and the fact that we can customise solutions based on our resources and expertise means that we are able to deliver a better service.
This approach also works if the customer wants to use Pulsant solutions with their existing partners - we're able to dovetail into that effectively. Why were you awarded a Royal Warrant? We've been providing managed hosting and cloud services to the Royal Household for over five years and were awarded the Royal Warrant this year. It was given in recognition of the continuous exceptional service that we deliver to the Royal Household - it's important to note that not all suppliers achieve this. It fully complements our other accreditations, including our ISO rating, PCI DSS, and IL2 and IL3 level security, and further reinforces our reputation as a trusted and secure provider in managing sensitive customer data. When did you become an HPCloud Agile partner and why? We've been an official HPCloud Agile partner for two years and saw this opportunity as an excellent way of using the world-class HP hardware platform to deliver our cloud platform. HP is a great brand and a recognised sign of quality, and we felt that in terms of technology and innovation it aligned with and underpinned our drive to offer an enterprise-class outcome to customers. And while being part of this programme does not set us apart on its own, we feel that our scale, UK presence, customers, facilities, and the innovation and use of our cloud platform and data centre capacity, do differentiate us. Every customer is equal In essence we own the entire supply chain, and in this way we can manage whatever outcome our clients want - whether that's a complete solution spanning network connectivity and data centre services, or simply making use of our colocation services on their own network. We have a vast array of clients across industries and sizes who trust us with their mission-critical data. Though the level of service may vary, the fact remains that within our systems we manage the availability and security of that data for them. ### What next for Social Business? Patchwork Elephant Event Report Event Sponsored by Compare the Cloud As part of London's Social Media Week we put on an event called Social Business - The Patchwork Elephant Revisited, asking "What next for Social Business?". We were kindly sponsored by our friends here at Compare the Cloud, and we introduced the event and the speakers in an earlier post. The idea was to get 8 different perspectives on where we are at, and where we go next, with using social and collaboration tools "inside" the business to add value and work more effectively. Why is the "Social" word seen with such suspicion by some executives in the C-suite? With the explosion of social media use in marketing or customer support reaching out of the organisation, why aren't more companies using it all over their organisations? We believe change is happening, but why aren't we further forward with "Social Business"? A few weeks after our event, Chris Heuer did a guest post on Brian Solis' blog that moved into the same territory we covered, asking Social Business is Dead! Long Live What's Next! and highlighted the problem with: "While the ideas behind the moniker are invaluable in defining the future of work, most large companies simply aren't buying into or investing in Social Business transformation efforts in more than a piecemeal sort of way" Why is that?
Here are the 8 perspectives and presentations from the 27th November 2013: [quote_box_center] Alan Patrick, Broadsight.com Social Business - The Patchwork Elephant 01 - The patchwork elephant revisited - Alan Patrick from David Terrar Alan set the scene for us by revisiting our event from 3 years ago, highlighting that demand generation was the quick win and that social media structures are orthogonal to the normal hierarchy of command and control, and so: "resistance may be futile, but strong!" He talked about social business being systematic, connecting the front end to the back end, looking to add value, and that culture follows commerce. He highlighted how value will be created, and referenced a McKinsey study on where the potential productivity improvement might be over the next 10-20 years by industry sector. He talked about Ronald Coase's theory of the firm - that firms exist to reduce transaction costs - and highlighted that social technologies can directly help with that. [/quote_box_center] [quote_box_center] Janet Parkinson, D2C & Technotroplis Social Business - The Patchwork Elephant 02 - think forward 40 years - Janet Parkinson from David Terrar Janet talked about the unthinkable idea, and asked if business could become nothing more than a social object, with individuals collaborating via social networks, doing things businesses used to do. She quoted James Burke, who suggested earlier this year that: "Nanotechnology will destroy the present social and economic system - because it will become pointless", and then revisited Burke's 1973 predictions of what 1993 technology might be like. He had some things wrong, but a lot right, foreseeing the proliferation of the computer in offices, schools and homes, and the creation of metadata banks of personal information. She highlighted how difficult prediction is, but then talked about a future of radical abundance where technologies, like the early 3D printing we see now, will mean people can produce their own goods from virtually nothing for virtually nothing, and how that will have a knock-on effect changing the business world dramatically - it will affect production, transport, consumer-facing businesses selling goods, sales and marketing, business support services and finance. It could change the nature of, and need for, cities, and even governments. Does business become a social object? [/quote_box_center] [quote_box_center] Will McInnes, Brandwatch Social Business - The Patchwork Elephant 03 - culture shock - glimmers of hope - Will McInnes from David Terrar Will wanted to provide glimmers of hope. He talked about the Culture Shock (his book) of how networked our world has become. How society has moved from ancient times, when we gathered at the stone circle for social interaction, to everyone being connected with smartphones and tablets, even wearing technology, and he referenced that YouTube video of a small child expecting a magazine to work by touch like an iPad. He talked about the data we collect today just as a byproduct of the technology we use. He talked about preconceptions and misconceptions - the jazz segment of the music industry is $100m a year, but Grand Theft Auto's latest game version sold $800m in its first day! We don't have colonies on Mars, we have Facebook instead! He talked about decentralised, bottom-up innovation.
He talked about the purpose of an organisation and quoted Umair Haque (who spoke at our 2010 version of this event) tweeting: "Making shareholder enrichment the basis of an economy is probably an idea that belongs up there with Cheez Whiz and Donald Trump's hair." He also quoted Simon Kuznets, the inventor of the term GDP, saying in 1934: "The welfare of a nation can scarcely be inferred from a measure of national income." He talked of crowdsourcing, from Wikipedia to Giff Gaff, and of organisations without bosses or hierarchy like Valve Corporation. He talked ratings and reviews and the effects of that big shift on the high street. He talked about the immediacy of citizen reporting, and the implications of humans being networked. He talked OODA loops as a necessary approach to all of this, and mentioned his Meaning conference, which will endeavour to connect and inspire the people who believe in better business and want to be part of the change. [/quote_box_center] [quote_box_center] Mat Morrison, Starcom MediaVest Group Social Business - The Patchwork Elephant 04 - I hate everything - Mat Morrison from David Terrar Mat talked about his @mediacsar presence on Twitter, along with his @evilczar alter ego, as an example of how you can adopt different personae on the Internet. He talked of Lego, of information overload, the power of on-line comments, how naive some marketers are around this topic, and how brands now have to act on Twitter. He highlighted that although there might be 80m (or 200m or...?) active users of Twitter, the median number of followers is actually 30, and so this lens is distorted. He talked about the power of some well-known Twitter complaints, and how you might get better service from some companies, such as BT, by complaining on Twitter rather than phoning their help line. [/quote_box_center] [quote_box_center] Luis Suarez, IBM The Patchwork Elephant 05 - social business is open business - Luis Suarez from David Terrar Luis's premise is that a social business is (or should be) an open business. He talked about the culture change required to move from the old way of doing things to this new way of collaboration and sharing using social tools. He talked in terms of a 30-year time frame - and he's right, this is a major change that will happen slowly, but it's happening. He used his own company, IBM, as an example - they were doing social business internally well before the existence of Facebook. He talked Open Business and mentioned @davidcushman. He explained that an Open Business uses its resources to discover people who share its purpose, and then brings them together to realise that purpose. He talked about the hierarchy and the wirearchy coexisting in a networked company. He talked about accountability, and getting rid of layers, and providing incentives for employees to share. He explained how managers need to transform into leaders, and talked about the need for transparency. His conclusion, with a touch of Mafia style - Open Business is "Just" Business, it's the only way to go. [/quote_box_center] [quote_box_center] Neil Usher, Sky Neil didn't use any slides. He talked about being a corporate employee but trying to think about things holistically. Neil talked about what it was really like for employees working for corporates and 'using' internal social technologies. One of the reasons he didn't use slides was that he wanted to feel the vulnerability which many feel when starting to use networks for the first time.
He talked about the workessence blog he has been writing for the past 2 years. He told us a little about creating a Yammer network in one company and then using Salesforce Chatter in the next to create internal social networks within the organisations he's worked for. When he made that switch between companies, he discovered that people at his old company said they'd miss his input on the internal network. He talked about LinkedIn and jokingly wondered what Google+ was for! He talked of the value of asking questions of Twitter to crowd-source expertise, and the fact that complete strangers will respond with the answers. He was firm on the fact that these on-line social interactions amplify the subsequent face-to-face interactions, and vice versa too. And he also managed the compulsory reference to Euan Semple (who, by the way, was one of the speakers back at our 2010 version of this event). [/quote_box_center] [quote_box_center] Anne-Marie McEwan, The Smart Work Company Social Business - The Patchwork Elephant 07 - pushing big boulders uphill - Anne-Marie McEwan from David Terrar Anne-Marie described herself as a recovering academic, and said she would be talking about pushing Big Boulders Uphill! She explained her experiences writing a book, putting together a postgraduate course and developing The Smart Work Company, which pulls together social business, the changing world of work and the way the physical workplace is changing too. She described education as liberating and democratising, and how she wants to make a business school education available to anyone who wants it. She talked of things experiential and social. She described the social psychology of organising, of interlinked groups and their relationships. She wondered why we had lost so much openness, and gave that as one of the triggers for writing her book, because it means so much to her. She talked of the work-based masters course she taught at Kingston University and about getting people to think strategically. She quoted Orlov the Meerkat and wondered: "What could possibly go wrong?" She's currently putting together a PGC for Chester University. She admitted it's been a hard sell to date. When people think of a postgraduate course they think in terms of a curriculum on "paper", when actually she wants them to think in terms of what you actually do at work and doing it better. She talked about applying social technologies, about the what and the how, but also the where. She talked about a massive appetite for on-line learning, worried that current MOOCs have not helped as much as they should. She contrasted just putting the old curriculum on video with an approach of connecting to others outside your organisation doing the same thing, to scope a project plan, learn together through discovering good practice and principles, and critique and amend to suit your own circumstances. She believes the doers are the experts, but we are the facilitators, and feels she's been given a second chance. [/quote_box_center] [quote_box_center] David Terrar, D2C Social Business - The Patchwork Elephant 08 - thoughts and summary - David Terrar from David Terrar My job was to summarise the sessions, but add some thoughts of my own, so I quoted Dirk Gently as I also believe in: "the fundamental connectedness of all things". It's important to realise that, over the last 40 years of regular technology disruptions arriving every 5 to 10 years, we've never had three of them happening simultaneously before.
We have the shift to Cloud and web-based apps happening at the same time as the explosion in social technologies, at the same time that we are all walking around with mobile phones and tablets, so that we have the Internet in our hands, any time, anywhere. That's changing everything. No matter what you do, your business model needs to change. Back in 2011 Salesforce, who have one of the most complete offerings available spanning business and social collaboration through to social media monitoring, were promoting the Social Enterprise (when the term was already in use to mean something else), and they even tried to trademark it! By 2012 that idea had failed; they changed their messaging and it evolved to "Business is Social". The same concept in different words. It's also important to note that Darwin's theory of evolution still holds in business and marketing - categories naturally fragment and we have a huge landscape of software choices and point solutions, so maybe this plethora of choice, and the lack of maturity of the better-known, larger social business offerings, is part of the reason why we haven't made as much progress since 2006 or 2008 as many would have expected viewed from back then. But there is more to it than that. As Luis said, the culture change required is happening, but it will take decades. I quoted Susan Scrupski, who said: "without executive direction, support and sincere engagement, internal efforts are nothing more than an aimless electronic water cooler". There are smart companies who have heeded those words. [/quote_box_center] I highlighted major enterprises like Lilly and BASF and Deutsche Bank, who all have great case studies of what can be achieved using social technologies, and we need more good case studies like those to get the C-level executives on board. In the Q&A around the summary, Benjamin Ellis from the floor highlighted that it's only a generation or two before our time that the average worker couldn't read - we now have a literate, educated workforce, with technology to help. We are beginning to move on from the Taylorist view of flows and mass production efficiency to a very different, flat, networked world with technology in our hands and everywhere we touch. Over time we've used terms like Web 2.0, Office 2.0, Enterprise 2.0, Social Enterprise and Social Business, as well as Collaboration and Knowledge Management. We may not have the language right, but the idea of using these technologies to change the way we work is stronger than ever. The next stage has got to be about putting what we've learned so far into practice, and making that Elephant (in the room) dance. Finally, I just want to add that the vibe in the room during the afternoon was a bit special, a bit different, and the discussion at the end of each talk was lively and productive. We all enjoyed it, and I'd like to thank our friends at Compare the Cloud once again for their sponsorship to help make it happen. ### Beware of Square-Peg Salesmen with Hammers By Gavin McLaughlin, Solutions Development Director at X-IO Storage Open any IT magazine, read a technology news website, or subscribe to any data storage mailing list and you're probably fed up to the back teeth with headlines that make such bold statements as "How to use SSDs in your business" or "Why Hard Disk Drives are dead". Whilst most technology-savvy readers will have the sense to take over-dramatised headlines with a pinch of salt, there are sadly many falling for the hype.
All marketing noise aside, there's a concerning trend emerging here, and that's vendors increasingly selling storage components (be it in the form of flash, SSDs, hard drives or cardboard boxes) rather than selling business-affecting solutions. "Customers don't buy storage, they buy solutions" is an old clichéd saying amongst sales teams; however, it suddenly seems to have been forgotten in the storage industry. If I look back at some recent customer projects that I've been involved in, the opening line from IT departments has been something like: "Our SAP system is running slow and we think the storage layer might be an issue"; "Our system performance drops when we're around 85% full"; "We spend way too much operational time escorting service engineers in the building". With all of these, the customer was leading with a business challenge that may require differing toolsets to solve the underlying technical issue. Would running into their building shouting "hey, swap out all your HDDs for SSDs" solve their problems? Well, whilst there's always a chance someone could get lucky, they may well find that this is just a short-term solution, that they've shifted the bottleneck elsewhere in the stack, or even that they've just dramatically increased the risk of system failure. What we're seeing all of a sudden is a dramatic increase in the number of single-technology vendors, particularly all-flash-array sellers, trying to argue that their technology is the only tool for the job and therefore that flash is the saviour of all storage woes. To coin a commonly used analogy, this is a great example of a salesman who only sells square pegs trying to convince you that his product is a perfect fit for a round hole, and he'll give you a free hammer if you buy them. Of course the answer is to buy the right-shaped peg for the appropriate hole, but of course he'll have a go at convincing you that the hammer option is much simpler and will save you time. Flash is a great innovation for the storage industry, there's no doubt about it; however, it's just a tool, not a solution... The better way to look at all these new storage approaches is to first look at the challenges your organisation faces and then, when it comes to building out a solution, see what toolsets can help you solve those challenges in the most efficient manner. Somehow I doubt that the most efficient approach will involve a hammer. Gavin McLaughlin is Solutions Development Director at X-IO Storage, based out of X-IO's London office covering the EMEA region. X-IO Technologies is a recognised innovator in the data storage industry due to its award-winning, cost-effective, high-performance and self-healing Intelligent Storage systems, including flash-enabled hybrid storage. Read more > ### How should infrastructure providers compete with Amazon? [pull_quote_center] AWS is the overwhelming market share leader, with more than five times the compute capacity in use than the aggregate total of the other fourteen providers in this Magic Quadrant. It is a thought leader; it is extraordinarily innovative, exceptionally agile and very responsive to the market. It has the richest IaaS product portfolio, and is constantly expanding its service offerings and reducing its prices.
It has by far the largest pool of capacity, which makes its service one of the few suitable for batch computing, especially for workloads that require short-term provisioning of hundreds of servers at a time. Gartner Magic Quadrant for Cloud Infrastructure as a Service, August 2013 [/pull_quote_center] With this sort of gushing praise and resounding affirmation from the world's leading information technology research and advisory company, you might well think the rest of the cloud industry is better off packing up, closing shop and going home. But they haven't. Whilst Amazon doesn't break out AWS revenues, various unofficial sources reckon it has exceeded the $3bn revenue mark this year and has in excess of 450,000 servers across its 7 data centres. So we have to wonder: on what basis does the rest of the cloud Infrastructure-as-a-Service industry currently compete - and plan to compete? Rather than pontificating and airing our own views, we felt it was best to draft in some expert commentary from a selection of other IaaS competitors - and invite them to tell us in what areas they can compete and win. First up, one of only a handful of operators that can square up to AWS in terms of scale and financial muscle is SoftLayer - recently purchased by IBM: [quote_box_center] "What sets IBM SoftLayer apart from Amazon Web Services? Well, by bringing IBM and SoftLayer together we believe we now have the strongest IaaS offering in the market, exceeding that offered by Amazon for every target user group. To cite some examples: Through bare metal options our cloud delivers better speed, performance and consistency. From an integration perspective, IBM SoftLayer has 1,600 APIs versus AWS's 60. From an expertise and global reach perspective, IBM SoftLayer has 37,000 experts and 25 global cloud centers. And from a research and investment perspective, IBM invests $6 billion in R&D and is the holder of 14,000 patents in the cloud computing space. And finally, moving up the stack, IBM SoftLayer has 100+ SaaS offerings, as well as seamless, enterprise-grade integration for public, private and hybrid cloud services." Jonathan Wisler, General Manager EMEA, SoftLayer, an IBM Company [/quote_box_center] So SoftLayer's approach is multi-layered and designed to provide all types of user with a direct alternative to AWS. SoftLayer is leading the charge with its bare metal cloud offering (which AWS don't currently offer) and aiming to secure business from users who are finding the performance of virtual cloud servers to be inconsistent or insufficient. Now in the IBM fold, it's bolstered both financially and technically, with a large library of PaaS and SaaS solutions available. Next up is UK-based Techgate plc, an enterprise-focussed MSP who have built their cloud around a core service focus of backup and disaster recovery (DR) solutions: [quote_box_center] Flexiant's Tony Lucas says: "Certainly, none of our customers are ever going to out-market Amazon in terms of cost of capital. What they can do is build niche businesses that are unique from AWS and provide flexibility that Amazon can't…" and we at Techgate couldn't agree more. Our positioning against AWS, Rackspace and all the global, "high-volume", commodity offerings out there is… none really, as we simply do not position ourselves against them and rarely have to compete with them. Our core business model is built on flexibility, and this is how the business is run from top to bottom.
From the actual solutions being customised (combinations of physical, virtual and hybrid clouds, most of the time with Disaster Recovery elements) and delivered to the customer, to the flexibility in meeting SLA requirements, meeting project deadlines or supporting bids for our channel partners. One of the most important pillars for us is the time we spend with the customer before the deal to understand, and provide guidance on, what they are trying to achieve. And in the midmarket, where our focus is, this is the most important differentiator and positioning pillar at the same time, as our customers won't have to spend a fortune engaging with the enterprise sales/consulting teams of the vendors, and the AWS/Rackspace etc offerings are most of the time not relevant for them, as they are asking for a customised solution with different components or a specific architecture. In terms of the actual offerings/technologies, and if we were to compare the components ("building blocks") of our solutions, I guess our Enterprise Public Cloud (vCloud Powered) platform could be compared to AWS's EC2, as they are both Virtual Private Cloud offerings, but the similarities stop right there. As with all our offerings, the main differentiator is that everything is hosted only in the UK, in high-end server and storage infrastructure with a specific SLA and network infrastructure that is wholly managed by us. Last but not least, as our main focus and expertise is Disaster Recovery, by nature our solutions and their positioning are different, as is the way we approach the Cloud paradigm in general." Martin Wright, Managing Director, Techgate plc [/quote_box_center] Techgate has identified its target market (mid-sized enterprises) and has formulated its cloud solutions to target specific business functions and SLAs. There's no intention to compete head-on with AWS; instead, the approach is to utilise cloud to strengthen core services. The focus is delivering "business solutions" rather than any form of "commoditised computing". virtualDCS is a business also born out of delivering enterprise-grade hosting solutions: [quote_box_center] "The only way to compete with Amazon is on items such as personal service and data sovereignty. The cloud concept of putting everything out there somewhere is fine to a point, but when your data is regulated, or very personal, do you trust 'somewhere'?" Richard May, Managing Director, virtualDCS [/quote_box_center] With reference to data sovereignty, virtualDCS make a valuable and more general point: demonstrable and accountable location of data should be a key issue for most enterprises, and to that extent IaaS providers who can offer this stand to benefit. Next up, Peer 1 Hosting: [quote_box_center] "Competing with someone who has 90% market share? Go big or go niche. The fast-growth and profitable niche for Peer 1 Hosting is our Mission Critical Cloud offerings - that's our VMware public and private cloud solutions - in summary, different technology, different geography and a better customer experience, including better technical support. Enterprises are taking advantage of the instant provisioning in a secure, high-availability stack - dual data centres, HA firewalls and load balancers are all standard as part of the stacks deployed in England, Canada, Germany and soon the USA."
Dominic Monkhouse, SVP Customer Experience, PEER1 Hosting [/quote_box_center] Evolving from a web(site) hosting and perhaps more developer-oriented focus, Peer 1 has extended into offering IaaS with the enterprise squarely in its sights. From what's said, it appears that Peer 1's strategy is to deliver specifically engineered enterprise-grade packages, rather than expect customers to build their own - and risk doing it wrong. This could be the assurance that enterprises want, and private cloud is also a key differentiation from AWS (CIA contract excepted) that most of the companies mentioned here can offer. Pulsant: [quote_box_center] "Amazon certainly are still making themselves a formidable opponent in the play for the cloud market. Previous gaps in their services, e.g. private connectivity and the virtual data centre, have been closed and additional functionality just keeps coming. Amazon have created a brilliant product, but it only wins when playing their own game. If you want to work the Amazon way all is good, but if you want Amazon to work your way you'll struggle. The opportunity for service providers to steal share from Amazon by providing tailored services is huge, especially in combination with a more consultative, relationship-led engagement that Amazon still rarely provide. Pulsant is equipping itself with all the tools it needs to provide customers with tailored combinations of services that integrate fully and are all provided with a single, accessible managed service wrap. Building platforms to various specifications to achieve targeted outcomes is often the easy part of service provision, but presenting the customer with a disparate array of loosely related services doesn't provide the Enterprise feel most businesses are looking for. Pulsant is ideally placed to provide an accessible, managed service covering the functionality provided by market leaders such as Amazon, but also combining in services tailored to individual needs, traditional services such as colocation and telecoms, as well as more in-depth application support and consultancy." Rob Davies, Sales & Marketing Director, Pulsant [/quote_box_center] Pulsant are one of a fairly small selection of IaaS providers that can strip your cloud journey back to bricks and mortar, through their ownership of a substantial UK data centre footprint. The emphasis here is tailoring to specific needs and being more consultative. If IaaS isn't right for you, or you want to build and manage your own IaaS/PaaS/SaaS cloud, then Pulsant have a product and service for you whichever path you go down. Diversity of service offerings is certainly one way to out-manoeuvre the competition in your back yard. Easynet Global Services: [quote_box_center] "I guess the key things are: You have to respect Amazon for what they do well. They have been successful as they have executed a functional capability and created a new service category by unbundling and simplifying support structures implicit in traditional managed services. Plenty of respect to Verizon for their attempt to re-visit what Amazon have done and try to do it better with improved performance and control of infrastructure. For the rest of us, I guess the way forward is not to compete directly but to focus on the areas where Amazon is weak. These would be: customer service and support; the ability to customise and build integrated solutions, including leveraging the best of private, multi-tenant and public cloud capabilities; geography; security and accreditation; workload focus; and application focus.
Plenty of areas there for new cloud models to emerge without going directly toe-to-toe with AWS. It is now time for the GM era of cloud, post the Fordist AWS approach. Vive la difference!" Eoin Jennings, General Manager UK Hosting, Easynet Global Services [/quote_box_center] Easynet set out a number of areas through which providers can focus and compete, and suggest the global landscape of cloud solutions has plenty of room to evolve. Workloads and applications suggest PaaS and SaaS solutions are a key part of this evolution, as again is the ability to customise around (enterprise) customer requirements. Interoute: [quote_box_center] "Looking at the overall cloud market, many cloud vendors are simply pushing the same solutions in different boxes. Their bids to take on AWS are all strikingly similar, revolving around various combinations and pricing of compute and storage. So, where is the true competitive edge that can take on AWS coming from? Well, without a network behind their cloud, most cloud providers - including AWS - charge you the second you turn on your machines, even if you are on a "free" tier. Bandwidth and inbound/outbound charges and overpriced virtual routing are how non-network-integrated cloud providers cope with the fact that they cannot help you get to or from their computing power without someone else's network covering the ground." Nelson Tavares, European Director Channel & Alliances, Interoute [/quote_box_center] As one may expect, Interoute are keen to highlight the advantages to be had from owning a network. Users often underestimate the cost of bandwidth (data in / data out) when configuring IaaS solutions - and depending on the nature of your business, low 'headline' service charges can escalate dramatically once you flick the virtual switch. It's also key to remember that cloud is not much without connectivity, so steps you can take to ensure low latency and consistent bandwidth to your IaaS cloud will help your application delivery; plus, your private cloud should really have a private connection. Lunacloud: [quote_box_center] "We believe in simplicity and localization. We want to be seen as the world's local cloud provider in IaaS and PaaS: easy to understand, easy to use, easy to deal with. And close to customers in all aspects, like a local provider must be. A simple service delivered through an efficient backend. Competitiveness is a lot more than just economies of scale." António Miguel Ferreira, Chief Executive Officer, Lunacloud [/quote_box_center] Ease of use and localisation is key for Lunacloud. With growing operations across the European continent and aiming to cater for European diversity in language and culture, being a cloud that's easy to understand - and engage with - across sales and support functions certainly has its merits, and this is often an area where major multinationals can fall short. Sungard: [quote_box_center] "Employ coopetition as appropriate in order to leverage the momentum behind Amazon and continue to provide complementary capabilities and services. Amazon is the clear leader in the public cloud infrastructure space for certain scenarios, as demonstrated by their current market share.
Their procurement model is attractive to developers and it provides that demographic with a platform and set of capabilities that they can use to meet many of their nascent self-managed development and test requirements. As we all know, however, in most cases the ultimate goal of hosting infrastructure is to deploy in a more stable and managed production environment. Some key requirements that are typical of a production implementation, or more specifically an enterprise production environment, are as follows: High SLAs at the virtual machine level. Companies want the highest degree of assurance possible that their implementations are highly available and will experience minimal downtime, accompanied by service credits commensurate with a true production environment. Easy access to skilled resources and management of infrastructure as needed. Companies are already operating in Hybrid IT infrastructure environments, which introduces tremendous complexity with various legacy technologies and business drivers. As such, many companies are seeking a vendor that they can partner with not just to provide the underlying infrastructure but also to walk through the implementation life-cycle and be available to help as appropriate. An understanding of Legacy IT with a path to a hybrid IT environment. Companies often have one foot in legacy IT infrastructure and typically maintain in-house or colocated servers for their business-critical services. Adding additional services or extending existing implementations often requires an understanding of more than just cloud; a deep understanding of legacy IT architecture and infrastructure is required. Change Management & Provisioning. As workloads move into production, enterprises require rigorous change management processes and project-oriented provisioning. Migration services. As more and more companies move workloads into the cloud, they often turn to managed services providers for migration services and practices. By virtue of managing more complex environments and providing enterprise-grade services, managed services providers will also continue to maintain and grow their relationships with customers - relationships that provide insight into the ever-evolving business drivers influencing the path customers are taking. Amazon will continue to enjoy success for some of the more homogeneous workloads, especially in development and test environments. But in today's highly complex enterprise arena, businesses demand more than just Cloud; they want solutions. Those solutions need to be built to accommodate specific performance and availability requirements, and come bundled with services and expertise that help customers transition to the Cloud that works best for them. The most savvy and resilient cloud infrastructure providers don't want to compete head-to-head with Amazon, but rather will continue to provide the best overall solutions that their customers are demanding. The most successful managed services providers and system integrators will provide solutions to access the AWS infrastructure and add complementary managed services." Simon Withers, Vice President of Global Cloud Product, Sungard Availability Services [/quote_box_center] 'Coopetition' through providing complementary capabilities is an interesting theme, as Sungard is one provider that actually allows you to manage AWS instances through its own cloud service interface. To that extent Sungard will welcome AWS use by their customers - but only up to a point.
Beyond that point competition kicks in and it becomes a story of enterprise-grade solutions, rather than 'cloud for kicks'. Firehost: [quote_box_center] "For inspiration, the strategists and innovators at FireHost look to our end users' needs, before we look to a competitor. Our goal is not to compete directly with any cloud or infrastructure provider for share of an end user's "IT wallet" but to help as many businesses as possible secure and handle regulated data such as payment card details and healthcare data responsibly. Our goal is to help global companies optimize business continuity plans, uphold data sovereignty mandates, reduce their risk profiles, and dodge hackers, all while keeping auditors and their end users happy and productive. To do this, we focus on four areas of expertise and ensure that our product and service offerings surpass Amazon's (or anyone else's for that matter) capabilities. Security. Compliance. Performance. Managed Service." Chris Drake, CEO and Founder, FireHost [/quote_box_center] With the tagline "Secure Cloud Hosting", Firehost is unequivocal about the ground on which it competes. With security being everyone's favourite cloud bugbear, it seems like a solid strategy - as is targeting verticals that have to maintain high levels of security: online stores handling credit cards, and healthcare data handlers. Softcat: [quote_box_center]"Why would you want to compete? The facts speak for themselves: Amazon have developed something that customers need and want. Amazon certainly is not right for all, but for many it has enabled organisations to get their products and services globally accessible overnight. Softcat has taken a very open mind to what Amazon have on offer and actively trained their engineers to Architect level on the AWS platform. By doing this Softcat are able to give customers the best advice on what approach to take. This in turn allows Softcat to support and lead our customers to the solution that best fits their requirements; this may be on Amazon, Azure, HPCS, OpenStack or of course Cloud Softcat!" Simon Walker, Managed Services Director, Cloud Softcat / Softcat [/quote_box_center] Last but not least, Softcat take an approach that covers all bases, with a degree of 'coopetition' - ensuring their engineers know AWS thoroughly. "Know your enemy" perhaps, but any approach that puts customers' needs first is a convincing one. So what can we learn from these valuable insights? Certainly there is a high degree of respect for what AWS has achieved, however there's also great optimism that the rest of the industry can benefit from the lead Amazon has taken. Amazon has 'made' a market and was first to offer the right type of service to a global audience. It's even fair to say that AWS have ('single-handedly') sold the concept of the IaaS cloud to the business world, so now it's a refinement process - and one in which our commentators believe they can play an important part. What do you think? ### What role do Distributors have in Cloud Computing? By Luke Wheeler News hot off the press this week was the announcement from Westcoast - a major UK channel distributor - that they have teamed up with HP to build a cloud service for channel partners. It's reported that Westcoast's HP cloud platform will offer their core customer-base of IT resellers and service providers the ability to resell on-demand cloud infrastructure (IaaS) and software (SaaS). 
The idea being that resellers can go to market with comprehensive cloud offerings, without carrying any of the burden of major capital expenditure or systems management headaches. As Michael Clifford, Director of Cloud Computing for HP United Kingdom & Ireland, confirms: "Owning, managing and utilising a data centre frightens many resellers. HP Converged Cloud will enable Westcoast to demystify cloud for its resellers and make high quality cloud services easily available to anyone wishing to offer them." A warehouse full of the physical. Whilst of course Westcoast won't be closing down their conventional model of distributing physical tin any time soon, this bold strategic move is symptomatic of a sea change in IT consumption - one that favours flexibility, on-demand provisioning & utilisation, fast deployment and ubiquitous accessibility. It's fair to say that with every virtual cloud server purchased, that's potentially one less physical server moving off Westcoast's warehouse shelves - so when the market goes away from you, you have to move to catch up with the market. In the same way that major hardware & systems manufacturers have relied on Distributors to manage channel customers, product education, credit and logistics - there is good reason to believe that such a role can continue to be of importance when it comes to selling virtual cloud services. Sure, there are some stark differences and fresh challenges in what the job entails, but at the end of the day what the reseller wants is access to a product, to know how it will benefit their customers, to make money on it, and to be comfortable in knowing "it will just work". All of this can come from cloud. Distribution is in an enviable position when it comes to benefiting from new technologies. Their customers are the trusted interface between the business user and the technology, and it's a proven relationship model that can be somewhat isolated from the industry hype and immune from fads and marketing spin. The channel could ultimately be what stands between cloud computing becoming the 'de facto' standard way of doing things, and it being a neat toy for 'innovators' and 'early adopters'. What matters in this grassroots relationship is RoI and tangible operational and business benefits - and subsequently the quiet majority of businesses will take cloud only as and when it offers them these key things. The channel could ultimately be what stands between cloud computing becoming the 'de facto' standard way of doing things, and it being a neat toy for 'innovators' and 'early adopters'. In essence then, that is the challenge for Distribution as it moves to cater for virtual product distribution. The 'big idea' that the likes of Westcoast are pursuing isn't about the cloud itself - it's about continuing to enable their customers to succeed. If they focus too much on the cloud itself or tie themselves in knots trying to price-match Amazon, Azure, Google and the big players, then the opportunity - and indeed the channel - will gradually slip away. ### US Government to Increasingly Shift to Cloud-Based Services By Christopher Morris, Tech Commentator The growing significance of the cloud - both economically and in computing terms - is still lost on many sectors of the general population. It comes across as a mysterious, somewhat enigmatic technology, of which many people understand neither the benefits nor the rationale for switching to it. 
Of course, to the initiated, the advantages of cloud computing are obvious, with the ability to store or process vast amounts of data without needing to purchase appropriate storage equipment offering significant financial savings to both businesses and individuals. With two billion broadband users dotted around the globe, it goes without saying that there is a huge market for this technology. While it has yet to be taken up on a large scale at the home user level – although websites such as Dropbox are becoming increasingly popular – it is generally presumed that this will increase exponentially in the coming years. But corporate clients are already embracing this technology in droves, and the history of IT, in common with many other industrial fields, tells us that large private sector businesses ultimately drive consumer behaviour. ...the history of IT[...] tells us that large private sector businesses ultimately drive consumer behaviour. As businesses increasingly shift to third-party data storage in data centres managed by IT specialists, and adopt cloud computing as a matter of course, cloud computing will become more accessible to the average home computer user, as corporate uptake slowly but surely drives the price down and helps to drive the development of the technology. This has been a common pattern with many previous technologies; the corporate sector drives the adoption and economic model, before the everyday consumer gets on board when the price point and availability reach an acceptable level. In our technology-driven contemporary society, this process is typically much faster than it once was. However, while the extent to which cloud computing is expected to transform business and home computing is widely acknowledged, there is a third major interest which is also looking to participate in this technological revolution. Government entities are now also looking to jump on the cloud computing bandwagon, and this prospect raises many questions for both the state and the general public. It should be noted firstly that this is a perfectly logical move, particularly given the historical links between the public sector and the Internet. It is widely known that the original Internet was eventually born thanks to military investment and research back in the 1960s. Thus, this now very commercial technology has its roots very firmly in public sector investment. Government entities are now also looking to jump on the cloud computing bandwagon... Naturally in these austere times, government agencies and departments are constantly looking for ways to save money and cut costs. In this regard, shifting as many of their functions and services as possible to the cloud might seem like facile common sense. It is perfectly natural for government at both local and federal level to look to optimised business models which incorporate mass data storage by specialised units, thus negating the need for government to purchase expensive, and perishable, IT equipment on a mass scale. Already there are numerous government entities in the United States that have taken up the cloud computing baton. To maintain the military-industrial theme, there has been a strong armed forces identity to the early adopters, with the US Army, Air Force and Navy all adopting cloud-based services, and other departments such as the Department of Justice, US Development Agency, and the Department of Education following suit, with many more expected to follow in the coming years. 
The US government is promoting this concept as a way to greatly streamline and, ultimately, improve government services... "maximize capacity utilization, improve IT flexibility and responsiveness, and minimize cost". The US government is promoting this concept as a way to greatly streamline and, ultimately, improve government services. This is a noble concept, of course, and quite possibly a feasible and achievable one. In accordance with this policy, the US federal government has adopted the 'Cloud First' policy, in order to manage the systematic move to cloud-based services. This policy "mandates that agencies take full advantage of cloud computing benefits to maximize capacity utilization, improve IT flexibility and responsiveness, and minimize cost." The focus in the early days of the scheme will be on shared services for government agencies, with the delivery of services for the populace likely to increasingly follow as the systems become consolidated. This offers a great deal of potential for more joined-up services to be delivered by government, and many will believe that anything which reduces government bureaucracy, red tape and spending can only be a good thing. Perhaps the only thorny issue to be dealt with that will be of interest to many is the issue of privacy. In the shadow of some of the less admirable revelations about the National Security Agency, this will be an issue of concern to many Americans, and it is something that will have to be satisfactorily addressed before the public is ready to embrace cloud-based government services. ### NextiraOne recognised by Cisco NextiraOne UK, part of Europe's leading expert in communications services, announced today that it has achieved a Customer Satisfaction Excellence Gold Star from Cisco for the sixth year in a row. This designation recognises NextiraOne for delivering outstanding customer service to customers in the UK. "Delivering service excellence for each of our customers is always at the forefront of our strategy," commented Christopher Lewis, Head of Marketing at NextiraOne. "Attaining Gold Star status in the Cisco Customer Satisfaction survey for the sixth consecutive time is a great reward for all the hard work and commitment shown by our entire UK team; we continue to deliver consistently high standards of customer care and it is always good to be recognised by our customers and Cisco as our partner." Cisco measures the customer satisfaction levels achieved by its Gold, Silver, and Premier Certified partners based on regional target goals, providing a weighted average of a partner's pre- and post-sales support over a rolling 12-month period. Partners that achieve outstanding customer satisfaction are awarded the Customer Satisfaction Excellence Gold Star and can be found using the advanced search menu in the Cisco Partner Locator. The Cisco Resale Channel Program provides a framework for partners to build the sales, technical, and Cisco Lifecycle Services skills required to deliver Cisco solutions to end customers. Through the program's specialisations and certifications, Cisco recognises a partner's expertise in deploying solutions based on Cisco advanced technologies and services. Using a third-party audit process, the program validates partner qualifications such as technology skills, business best practices, customer satisfaction, and presales and post-sales support capabilities - critical factors in choosing a trusted partner. 
“This welcome news follows on the heels of a fantastic performance in Cisco’s FY13 with double digit growth in sales, some great customer wins such as Gazprom, our success at Cisco Live 2013 and our recent announcement of record growth in our Data Centre business in the first half of 2013. Much of this is built on our Cisco-based solutions and expertise,” added Chris Lewis. ### CRM on the cloud – a good move for your business? By Steve Lumley, Technical Writer Implementing or upgrading your customer relationship management (CRM) software is a big commitment – in terms of time, money, resources and planning. The vendor selection process used to be a straightforward task of determining which software best met the needs of your organisation. However now – thanks to the advent of cloud computing - you have another major decision to make: should you host the CRM software on your own servers or make use of a cloud-based / SaaS solution? Whether or not you should use a cloud-based solution isn't that straightforward, though it's a great test-bed for establishing why a business should make more use of cloud computing services. ...with most legacy CRM vendors offering a cloud-based solution, and a whole raft of “born in the cloud” CRM vendors entering the market, it's clear which direction the vendors are taking us… And now with most legacy CRM vendors offering a cloud-based solution, and a whole raft of “born in the cloud” CRM vendors entering the market, it's clear which direction the vendors are taking us… Hosting CRM on the cloud Staying with conventional practice and hosting the software on your own servers may seem like the safer option – after all you have IT teams managing other software, you may have server capacity in place already – and you can point a finger at it and say “there it is”. However what may look like the safer option may not be the better option – or indeed the safer option after all. Your business needs to assess the overall impact on existing management and support capabilities and resources, and the responsibilities it must assume for service uptime and licensing. The question of data security is of huge importance – both in terms of its protection and location. Just because a cloud-based solution means your data resides elsewhere, outside of your own data centre, doesn't mean you can assume it's less safe or more safe. Ultimately you have to do your own research and ask the right questions of both options – the results may surprise. Just because a cloud-based solution means your data resides elsewhere, outside of your own data centre, doesn't mean you can assume it's less safe or more safe. What can be said is that hosting your CRM on the cloud does, in general, open up some very interesting opportunities that can appeal to every stakeholder in your business. Such advantages include the “on-demand” nature of cloud provisioning – the ability to scale up and scale down as needed, whether it be compute capacity or user licensing. The opportunity to outsource the management of the software to the actual vendor (or a business with such expertise) can also be attractive – as they will then look after the software including its compute resources, updates and patches – and since this is their focus they should be better at this than you are. Another major opportunity is accessibility; whilst not exclusive to cloud-based solutions, they are likely to be better tuned to remote access via mobile apps, and have better integration with social media. 
For those businesses that have to avoid a “shared platform” and are happy to forgo contractual and resource flexibility, the Cloud has an answer for that too – in the form of private managed cloud solutions. These may be single-tenant (as opposed to multi-tenant) architectures where the software, or both software and hardware, is dedicated to you, but still managed by the CRM provider / ISV (Independent Software Vendor) or expert outsource partner. (See Basant Singh's blog on SaaS tenant variations here) Microsoft Dynamics CRM Online 2013 This month, Microsoft announced that the roll-out of its cloud-based CRM offering - Microsoft Dynamics CRM Online 2013 – is underway. Microsoft may be a bit later than most to the party; however, this is a major consideration for such a large software vendor, who knows its core customer-base won't tolerate round after round of beta initiatives and trial and error. By launching a cloud-based version of Dynamics CRM 2013, they are in effect giving their backing to cloud-based application delivery and SaaS, so if Dynamics is sub-par then they stand to lose a great deal. To challenge the most prominent enterprise-level CRM operators – such as Salesforce.com, Oracle and SAP - and their cloud offerings, Microsoft has to get this right. By launching a cloud-based version of Dynamics CRM 2013, they are in effect giving their backing to cloud-based application delivery and SaaS, so if Dynamics is sub-par then they stand to lose a great deal. In Microsoft's announcement, they echo many of the factors we've discussed: “Dynamics CRM 2013 and its cloud-based counterpart offer features aimed at helping sales and customer service organizations better engage with a new generation of educated, social-media-savvy business-to-business (B2B) and business-to-consumer (B2C) customers”, says Microsoft Dynamics CRM Vice President Bob Stutz. He continues: “The new Dynamics CRM has been refreshed to be more user-friendly; especially by those staff using tablet computer devices. Microsoft has linked up various elements to bring an effective offering to the marketplace.” Reportedly, it's the effective “joined-up thinking” that is helping Microsoft redraw the CRM landscape. For instance, users can make Skype calls directly from within the Dynamics CRM software, and workflows have been improved, as have the social media elements. And with a simple price plan of £28.70 per user, per month, and native mobile apps being launched – it's sure to grab many people's attention. The single most important question regarding cloud CRM This move by Microsoft underlines how the world of technology, and how we access information, is changing rapidly. It also means that a firm looking to invest in CRM needs its staff to access and use its capabilities fully – and that probably means they'll need access from anywhere, on any device. Let's face it, BYOD and mobile access is an unstoppable juggernaut with demonstrable productivity advantages – so it makes sense to adapt and enable, rather than be rigid and preventative. Cloud-based CRM (and all that it encompasses) offers a great deal, but the most important question is “can it help me look after my existing customers, as well as nurture and develop new ones?” Given the operational advantages for the business and the user, I believe that in the vast majority of cases it can. 
Cloud-based CRM is no longer just for the early adopter; Microsoft's announcement means it's now for the majority. It's heading in one direction – you just have to decide whether you're fighting the tide or going with the flow. ### Will Intel’s new CPU offer cloud service providers the processing power they have been waiting for? By David Howell, Tech Journalist and Commentator With a quoted 45% increase in efficiency and up to 50% improvement in performance using up to 12 cores compared to the previous generation, Intel's Xeon Processor E5-2600 v2 (Ivy Bridge-EP) product family looks set to offer what could be a quantum leap forward in CPU power for data centres looking to leverage their services for the cloud and big data applications. Diane Bryant, senior vice president and general manager of the Data Centre and Connected Systems Group at Intel, commented: [quote_box_center]"More than ever, organizations are looking to information technology to transform their businesses. Offering new cloud-based services requires an infrastructure that is versatile enough to support the diverse workloads and is flexible enough to respond to changes in resource demand across servers, storage and network."[/quote_box_center] Cloud Power Boost? As the cloud has rapidly evolved, data centres have struggled to offer the flexible on-demand services that businesses now need to power their enterprises. Partnering with IBM and their NeXtScale System, the new Intel CPUs can now be built into dynamic platforms that can rapidly evolve as service demand changes. The new chip will also be used in the new x3650 M4 HD high-density storage server from IBM to support big data storage and analytics. And of course with virtualisation becoming ubiquitous, Intel's new chip has been designed with this in mind. IDC's Digital Universe Study from last year stated that 82% of global telecommunications providers are expected to evaluate SDN (Software Defined Networking) and NFV (Network Function Virtualization) this year. Intel explained their approach: [quote_box_center] “Using Intel's Open Network Platform (ONP) server reference design, customers can use high-volume Xeon-based servers combined with industry open standards to consolidate virtualized networking applications. This allows them to deliver market leading throughput performance and latency for SDN and NFV workloads. Intel's ONP server reference design is based on the Wind River Open Virtualization Profile and the Intel Data Plane Development Kit Accelerated Open vSwitch.”[/quote_box_center] For data centres and their customers the new Intel chip is the first of what has been dubbed SoC (System on a Chip), with Diane Bryant succinctly commenting at the last Intel Datacentre Day: "Today, we look at IT as the service. IT is no longer supporting the business, rather IT is the business." And this is at the heart of the new CPUs that Intel has developed. The new Xeon is scheduled for release next year, but it's interesting to see how Intel is also positioning their other low-powered chips. These come in the shape of the Atom for data centre usage. With the Atom's fabrication process converging with the Xeon range, the 14nm Atom architecture code-named ‘Denverton' will roll out in the near future. The two streams will offer data centres the options they need to build service platforms that are affordable. What is clear is that server architecture that couples CPU with memory for big data and cloud applications needs the level of processing power that the new Xeon chips offer. 
Data centres are looking to differentiate in the marketplace, and know that their customers are struggling with their current hardware set-up. And with virtualisation marching on to dominance, the E5 can't come quickly enough. ### The impact of server memory on data centre performance, capacity, power and costs By Debbie Fowler of KingstonConsult It’s not exactly front page news that virtualisation, big data and the cloud are major drivers changing the rules for flexibility and elasticity in data centres. We all know that data centre growth is escalating, costs are increasing and there is massive demand to support data that has to move quickly and efficiently. Demand for data driven by ubiquitous access to content from both consumers and business is stretching resources. We all know that managing the performance of systems in the data centre is a time- and resource-consuming priority, and issues of capacity, performance, power and cost are major considerations. You may be surprised to learn that the humble memory module can play a vital role in enabling your business and IT goals. Memory is often overlooked as a solution to improving overall server performance, capacity and power saving; the immediate reaction is normally to add more servers without fully analysing the optimal way to maximise the under-utilised servers already in place. Unfortunately, knowing how to choose the right memory configuration to achieve the desired results and business goals is not straightforward today. Both memory and server technologies have evolved rapidly over the last five to ten years. Balancing low power with capacity and performance requires an understanding of the role that server memory plays. Memory has evolved to become one of the most important components of the data centre. Server processors (often under-utilised) are able to process multiple threads over many cores to maximise the utilisation of a single server. Without sufficient or optimally configured memory, performance degrades and servers do not reach their full potential. Rather than automatically adding more servers to improve performance, in many cases additional memory will address the issues and reduce complexity and cost. Rather than automatically adding more servers to improve performance, in many cases additional memory will address the issues and reduce complexity and cost. The following are considerations and recommendations to identify how additional server memory can help data centres efficiently improve overall performance, and how to ensure new memory eases resource allocation without business disruption. First, identify the role and goals for a given server or servers in the data centre. Prioritise the importance of better performance and speed, reduction of power consumption, or increased capacity. While not mutually exclusive, the prioritisation of these factors will dictate the optimal memory choices. Minimising memory power consumption can save between 5 percent and 10 percent or more of a server’s total power draw, which obviously gets multiplied across a data centre. Although memory is considered a commodity today with industry standards in place, it doesn’t mean every memory module or memory configuration will be supported by every server. There are many compatibility considerations related to the components on the memory module and your server. Fundamentally there are no differences between branded servers and white-box servers. 
There may, however, be subtle differences in motherboard design or system height that require the use of a specific memory type. An IBM server, for example, may have height restrictions and require memory with very low-profile (VLP) designs. Minimising memory power consumption can save between 5 percent and 10 percent or more of a server’s total power draw, which obviously gets multiplied across a data centre. Or an HP ProLiant server might have compatibility issues with specific register components or DRAM brands. It’s very important to select the memory that is guaranteed to be compatible with your specific server system. It is also worth noting that older server systems may not be compatible with the latest memory module technologies or best practice configurations. Ensure that new memory is installed correctly and follow the server’s channel architecture guidelines. For example, it has become the norm over the years to install memory modules in pairs. When triple- and quad-channel servers were introduced, many wrongly assumed that continuing to install in pairs was the correct way to go. In fact this is not the case and often leads to memory channels being incorrectly populated and the potential performance of the server being compromised. Memory incompatibility problems typically manifest themselves as system lockups, blue screens or error-correction logs. Memory performance, however, is not so easy to diagnose. To fully understand if the memory is performing as desired, it must be correctly benchmarked or closely monitored. Choosing the cheapest solution may not be the wisest choice to meet long-term goals. For example, data centre managers, when evaluating new memory, may see that 8GB DIMMs are fairly inexpensive, and purchasing 16 of them for the server can achieve their capacity goal of 128GB. The other option would be to choose eight 16GB DIMMs instead, which may be more costly at the outset but will provide savings over the long term on energy consumption (fewer DIMMs drawing less power) and provide headroom (open sockets) to expand memory in the future. This is where Kingston Technology, a company with 25 years' experience of manufacturing computer memory, can be of use to your business. KingstonConsult offers you an independent opinion on whether the memory configuration you are currently using or are planning to use is balanced and optimised for your organisational goals and business needs. KingstonConsult services are offered free of charge following qualification and do not require you to be a current Kingston customer. The following service offerings are at the core of KingstonConsult: KingstonConsult experts will look into your existing or planned server configuration and work with you to understand your individual business requirements. Based on our findings we will supply you with a tailored ‘Server Assessment Report’ which will address commercial and technical issues in order to demonstrate which memory would best support your specific business objectives. KingstonConsult offers product evaluations to enable you to conduct a “proof of concept” review of our Server Assessment recommendations in your own real-life environment. KingstonConsult’s experts offer server configuration training designed to educate both MIS and business stakeholders regarding the technical and commercial challenges and benefits. To find out more see: www.kingston.com/Think or email kingstonconsult@kingston.eu ### How is the Cloud affecting the Colocation market and traditional Colocation Providers? 
By Mark Underwood CTC: Mark Underwood presents his view of traditional colocation markets, how cloud computing affects these markets, and how colocation providers need to adapt to change and embrace cloud. Colocation (or co-location) is “the act of placing multiple (sometimes related) entities within a single location” (Wikipedia), and - to this extent - Colocation and Cloud Computing are two services which were founded on very similar principles. Both cloud and colocation see customers coming together in order to benefit from a purpose-built service or facility that, individually, they would be hard-pushed to afford, let alone manage. The Cloud is in the House. Despite this, you may be surprised to learn that Cloud Computing is posing one of the greatest challenges to Colocation providers; it places a new set of demands on data centre operators, and has dramatically shaken up the typical profile and cross-section of customers. Cloud Computing ...places a new set of demands on data centre operators, and has dramatically shaken up the typical profile and cross-section of customers. So what is it about Cloud Computing that is forcing Colocation providers to re-examine their business models, facilities, and their approach to securing new and existing business? Well, in the same way that the arrival of commercial multi-tenant Data Centre (Colocation) facilities meant that businesses no longer needed to invest in power and cooling for their on-site makeshift ‘broom cupboard' server rooms, Cloud Computing now provides businesses with all the computing power they may need without maintaining their own infrastructure. The result is a shrinking colocation market for traditional end user enterprise colocation customers, brought about by companies relocating their workloads to the Cloud. This is especially noticeable in the mid-size enterprise sector, where businesses are agile enough to quickly take advantage of new technology opportunities, and historically invested less heavily in their own data centre infrastructure. ...despite on-going exponential growth in computing needs, a mid-size enterprise that may have needed 10 racks three years ago, now only needs two or three racks It's these disappearing mid-size enterprises that represented a large, reliable, predictable and profitable segment of your typical colocation provider's customer-base. Combine this with the increased density of compute resources and we find that despite on-going exponential growth in computing needs, a mid-size enterprise that may have needed 10 racks three years ago now only needs two or three racks. The flipside of this is the arrival of a new ‘super-breed' of customer to Colocation facilities. These are the cloud computing service providers – the very businesses that are taking on the relocated workloads from our traditional client base! Whilst many of these Cloud Service Providers (CSPs) have morphed into existence from legacy web hosting businesses, systems integrators (SIs), managed service providers (MSPs) and indeed traditional IT value-add resellers (VARs), they have been joined by a whole new breed of software providers, platform providers, and infrastructure providers offering their wares “as-a-service”. However, regardless of their origins, becoming a CSP has required a fresh and substantial investment in computing infrastructure and software. [Cloud Service Providers] are looking to colocate the latest ultra-efficient high density computing hardware. 
However the latest hardware is not as compatible with older data centres as you might think. As a result CSPs are looking to colocate the latest ultra-efficient high density computing hardware. However the latest hardware is not as compatible with older data centres as you might think. Indeed, older data centre and colocation facilities that may have been perfectly adequate for a colo customer of any size and orientation even as little as two years ago today fall a long way short of being adequate to support an increasingly large proportion of colocation customers requiring high density power and cooling solutions. So what options do these legacy operators have? One option is dropping their prices to attract new business – and you don't have to look far for super-cheap colocation racks – they are cheap for a reason. Another option is to over-subscribe existing power and cooling resources – a dangerous strategy that no operator would admit to, but one that has the inevitable consequence of unplanned outages and falling service levels. (Google “data centre outage” for further reading.) The obvious correct option is to invest in upgrades to the facility – but capital-intensive investment is pretty hard to come by right now even for large businesses – and did we mention the erosion of the existing core colo customer base of mid-size enterprises? So, many older data centre facilities – particularly those without a very solid financial backing - have their work cut out to keep up with this brave new world, and this is the biggest upset in the market. Meeting the needs of Cloud Service Providers and Cloud Operators. Cloud economies-of-scale are derived directly from the efficiency of the infrastructure or platform – and this includes the efficiency and economy (the value) offered by the underlying data centre. Cloud operators need to know that their data centre can match their ambitions, growth and uptime expectations. Cloud operators also need connectivity and lots of it – available all the time and scalable to the highest levels. Inevitably, bandwidth has become an increasingly important consideration for data centres - so factors such as duct capacity, fibre entry, cross-connect methodology and management, re-sold and low-cost aggregated bandwidth offerings – as well as rules and regulations for on-site network operators – should be examined very closely. Once you've determined that the colocation facility has what you need in order to provide a stable and reliable foundation for your cloud service for both today's and tomorrow's requirements, you are likely to find that your short-list of providers is made up of the more expensive facilities. So what more can a colocation provider do to assist your cloud operation and your bottom line? At Virtus we've examined this very challenge and come up with Colo-on-Demand. It's designed to align your colocation expenditure with your cloud revenues. In other words, as your cloud customers' computing requirements increase, you can quickly and easily ramp up your colocation requirements. The same goes for when they decrease. Our colocation flexes with your cloud. Cloud and Colocation were founded on very similar principles. At Virtus we are committed to ensuring that our colocation - and your cloud - tracks the same productive path into the foreseeable future. ### What is Data Centre PUE? 
By Luke Wheeler Any business that owns or operates within a data centre will know that power consumption has become a major – if not the key – factor determining the costs of co-locating your computing infrastructure within a purpose-built facility. Electricity costs in the UK have nigh on doubled over the last seven years (source: Castle Cover utilities Index), and green initiatives from central government, taking both stick and carrot approaches, are using increasingly strong-arm tactics. ...with electricity costs escalating and the pressure to be green increasing, it pays dividends to ensure that you're not wasting power. So with electricity costs escalating and the pressure to be green increasing, it pays dividends to ensure that you're not wasting power. And with your computing infrastructure consuming a whole load of power – this is where PUE comes in. PUE stands for Power Usage Effectiveness. It's a metric used to determine the energy efficiency of a particular data centre. It's calculated by dividing the total amount of power entering a data centre by the power actually being used to run the computing infrastructure within it. To place this in an example: if the total power entering the data centre is 375kW and the actual power being used by your computing infrastructure is 250kW, then your PUE value is 1.5. It may be easiest to think of it as a ratio (1:1.5) where “1” is your computing infrastructure power consumption and 1.5 is the power consumption of the whole data centre. In this example, the data centre is using 125kW (50% of your compute infrastructure's power consumption) in order to deliver the facilities (i.e. the UPS, cooling, lighting etc). The more efficient the data centre, the lower the relative and proportional power overhead of providing the facilities will be, and the lower the PUE value will be (assuming there's no reduction in service levels). Currently the most efficient data centres are reporting PUE values of around 1.07 and the least efficient languishing around 3.0, with an overall industry average of 1.8-1.89 (according to The Uptime Institute, 2012). What this means to you and your business As data centre costs have moved away from being governed by square footage towards power consumption, a substantial proportion of the costs incurred for running your data centre operation will be directly attributed to the power you use. Whether you have a whole building, a floor of a 3rd party data centre, or just a rack in a hosting facility, the price you pay for your power will be directly impacted by the PUE of the data centre as a whole. Again it matters not whether you are on an ‘all-inclusive' tariff or a metered power tariff – the data centre will be recovering what it costs them to power the data centre and your proportion of it. Put simply, the lower the PUE, the lower the price you'll end up paying for power. Put simply, the lower the PUE, the lower the price you'll end up paying for power. So how much money can you potentially save by co-locating in a highly efficient – low PUE – data centre? 
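As a rough sketch of the arithmetic involved (the rack count, per-rack power draw and unit price below are illustrative assumptions, not figures quoted by any particular facility), the saving comes from the facility overhead that a lower PUE strips off your bill:

```python
# Illustrative PUE cost comparison. All input figures are assumptions made
# purely for the sake of the example, not quotes from any specific data centre.

def annual_power_cost(it_load_kw: float, pue: float, price_per_kwh: float) -> float:
    """Yearly electricity cost: IT load scaled up by the facility's PUE."""
    total_draw_kw = it_load_kw * pue      # facility power = IT power x PUE
    hours_per_year = 24 * 365
    return total_draw_kw * hours_per_year * price_per_kwh

it_load_kw = 5 * 4.0        # assume 5 racks drawing roughly 4 kW each
price_per_kwh = 0.12        # assumed unit price in GBP per kWh

efficient = annual_power_cost(it_load_kw, pue=1.25, price_per_kwh=price_per_kwh)
inefficient = annual_power_cost(it_load_kw, pue=3.0, price_per_kwh=price_per_kwh)

print(f"PUE 1.25: £{efficient:,.0f} per year")
print(f"PUE 3.00: £{inefficient:,.0f} per year")
print(f"Difference: £{inefficient - efficient:,.0f} per year")
```

Plugging in different rack densities or tariffs changes the absolute numbers, but the shape of the calculation - IT load multiplied by PUE, hours and unit price - stays the same, and with assumptions in this ballpark the gap quickly runs into the tens of thousands of pounds a year.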
As an illustration, a typical 5-rack deployment could save in the region of £41,000 per annum by choosing a data centre with a PUE of 1.25 over a data centre with a PUE of 3.0. Of course in practice your non-power-associated costs (for space) will probably cost more in an ultra-efficient data centre than they would in a low-efficiency data centre, but the total cost of ownership (TCO) is still probably a lot less – and your green credentials will look a lot healthier too. Can the PUE be impacted by other Customers? When you're in a shared facility, the activities or non-activity of other customers can affect the PUE of the data centre as a whole. If, for instance, you're in a relatively empty data centre facility, a fair amount of total power may be cooling empty space, making it less efficient overall. If you're in a busy facility, the data centre is actually likely to be working more effectively, despite computing infrastructure power and overall facility power draw being higher. As you might expect, the PUE in these scenarios boils down to how well the facility is managed, and most data centre operators will typically factor these variations in so that you have a consistent PUE to work with. So is PUE the ultimate measure of a data centre? ...how can facility power be kept to a minimum? Well if you didn't need cooling at all, that would be a huge advantage... If you've followed our PUE explanation, you'll see that the best PUE opportunity exists where the power required to support the data centre facilities is the smallest fraction of the power that goes into the computing infrastructure. So how can facility power be kept to a minimum? Well if you didn't need cooling at all, that would be a huge advantage! This goes some way towards explaining why there's been an increase in data centre builds in year-round colder climates – locations such as Iceland, Greenland and the Nordics. Achieving the same PUE in equatorial and tropical regions would be a challenge indeed. Whilst co-locating in cool climates may give you the best PUE and greenest credentials – and possibly highly competitively priced renewable energy – you still have to factor in your other data centre requirements. These are likely to include potentially mission-critical factors such as connectivity, accessibility, physical security, access to technical support, data location, and contractual flexibility. To sum up then, PUE is a measure of the efficiency of the data centre as a whole, and it will directly or indirectly impact the operating costs of your co-location. It's not the 'be-all and end-all', as your data centre operation is likely to need to fulfil a broader brief of requirements. A better PUE value should help you achieve better value from your data centre power provision - and also give you a better understanding and the ability to model your exposure to inevitable future power price increases. Of course - to complete the picture - you should also examine how power-efficient the hardware within your racks is. ### Easynet Global Services, Daring to be Different in Cloud By Eoin Jennings, General Manager of Easynet Global Services Since I joined Easynet Global Services a couple of months ago, people have been asking me a few questions: why did I move to Easynet and how do I think Easynet can compete in a relatively crowded marketplace with service providers offering similar service capability based on the same technology building blocks? 
Engineers (and I am one of those) most often tend to think of some technical feature as a means of obtaining competitive advantage, e.g. a new and cheaper way of doing backups, a different type of server, solid-state drives etc. However, whilst all service providers must keep pace with technology to remain relevant and competitive in the market, each incremental technology change gives only a relatively short-term advantage, as these technologies are available for adoption by all service providers (unless you’re a technology developer, which we are not). Some will clearly make the wrong choices or fall off the pace, and maybe new innovation gives new entrants a way into the market, but the choice of technology is not in itself a sufficient basis for a long-term position in the market. … the choice of technology is not in itself a sufficient basis for a long-term position in the market. I was taken recently by the First Direct (a UK online bank) advertising campaign with the tag line ‘dare to be different’. At the product level what First Direct does is provide current accounts, loans, mortgages and cards just like any other bank, i.e. a relatively undifferentiated offer. However they make their service unique by their style and how they deliver the customer experience. This is not accidental, and as a long-standing and satisfied First Direct customer I can vouch for it. So what does ‘dare to be different’ mean for Easynet in the cloud world, where essentially the elements of what we offer are commonly available in the marketplace? Right size, right services: We can and do deliver a true multi-service capability across cloud, hosting and security. Our scale of operation makes it possible for us to do this. Our larger competitors struggle to achieve true service integration because of the scale of their network business and operations. The best that they can often do is some level of network integration and a veneer of service management across the top of what are essentially stovepipe service offerings. This is no particular criticism, as they have to do this at their scale, but it can seem to the customer that they may as well have bought their cloud and network from different companies, as they often do. At the smaller end there are excellent service providers who focus on one aspect of service but don’t have the breadth of services available to offer complete solutions. Real service matters: The second part of ‘dare to be different’ is customer service and support. This is the primary basis on which First Direct compete. You can go online and do your business through a portal, but you can also access a real person quickly, 24x7, who can actually help you there and then, and not just take your name and number or send you off to another queue to explain your problem a second time. It means a common service interface where the support centre is the first point of contact for all service issues and owns remediation across all services. It’s about doing more than just swivel-chairing tickets to separate product support teams with no overall service level view or ownership. I have seen companies where even that level of ticket flow integration is not achieved, so that the customer, who has been sold a story of ‘seamless integration’, has to triage and move tickets between different service provider NOCs for different services: hugely frustrating. 
It means having an account management and service management relationship that understands the customer’s business and takes ownership, end to end, for the entire scope of services – rather than a general account manager and service manager (if one even exists) supported by a bunch of overlay, product-specific sales and support specialists who have an intermittent relationship with the customer and who are only concerned with their part of the service delivery, rather than the total customer view. It means having real people to support you from day to day. Proof of this lies in Easynet’s victory over two tech giants as it was crowned 2013 European Service Provider of the Year at the IT Europa European IT & Software Excellence Awards. Problem-led, not product-led The cloud world is being driven by automation. This has great benefits for the service provider as it reduces service variability and the cost of support. This translates through to customers in reduced service cost. All good so far. It certainly is important to have a service capability that is up to date, based on proven technologies, and that offers the flexibility and agility benefits that people expect from cloud. Easynet’s award-winning partnership with HP is worth a nod here. But is this enough? It is fine if your business need can be met wholly by the shrink-wrapped service model being offered, but many customers don’t have such clean problems. They have to live with their legacy. From the point of view of the cloud purists, the legacy is your problem. You need to figure it out and get on with architecting your apps for cloud like the ‘cloud natives’ – and, by the way, do you like the look of my fancy portal? It can be a little preachy and often leaves the customer a little cold. They want to get there but need to be shown how to get there and the steps along the way. Market Maverick – but only if it solves a customer problem At our scale of operation we are both willing and able to do the extra things to enable customers to use the cloud effectively to solve their problems. If some additional feature or capability is needed to support what the customer is trying to achieve, we are willing to go that extra mile and accommodate the variation in support, rather than mandating that the customer takes the standard service only and leaving the rest of the mess with them. What’s important to our customers is that we create new products which address specific business issues… Integral to a ‘Dare to be Different’ ethos is an element of being a maverick and creating a culture which welcomes innovation. In the nineties Easynet had a reputation for being a pioneer. Now, Easynet has grown up but the pipe and slippers are a long way off, as it continues to be first to market with a host of products. As I said earlier, being first out of the blocks is great until everyone else catches up, but creating a culture which welcomes ideas and innovation across the business is intrinsic to Easynet. What’s important to our customers is that we create new products which address specific business issues, and which help businesses address what’s holding them back from achieving their goals. Easynet has an active, involved Customer Advisory Board. One of its members recently talked about how great it would be if Easynet could provide an IT product to help his organisation respond swiftly to changing market conditions. He said that he felt longer lead times for network installations were restrictive and prohibitive. 
We briefed our product team, and within months we could offer him Rapid Deployment 4G. Similarly, our innovative Smart Networks group of products was the brainchild of the Customer Advisory Board. In summary, I think Easynet has a distinctive position in the market which is about providing real, truly integrated service and not just technology (and I’m not going to use the word seamless). I look forward to delivering that service experience to our customers. ### Is Your SaaS Multi-tenant? By Basant Singh, Software Engineer and Blogger Over the past couple of years, the single-tenant vs. multi-tenant SaaS debate has been creating loads of buzz in technology circles. So, let me ask you a few questions: Does multi-tenancy matter if you are a SaaS provider? As a subscriber do you need to care if your SaaS is single-tenant or multi-tenant? Multi-tenancy has its benefits. If you are curious and want an instant and short answer, it's an emphatic - YES, it matters! Whether you are a SaaS provider or a SaaS consumer, multi-tenancy matters to you. To simplify your decision-making process, let me tell you convincingly that if it's not multi-tenant it's not a true SaaS in the first place! Yes, The National Institute of Standards and Technology (NIST), The European Network and Information Security Agency (ENISA), the Cloud Security Alliance (CSA) and many other authorities have already specified multi-tenancy as one of the essential (or at least desired) characteristics of a Cloud SaaS. Disclosure: Many SaaS architectural considerations and the term multi-tenancy have been simplified for the purposes of this article. Who/what are tenants? In plain English, tenants are subscribers (consumer/customer/client) of a service. For a B2B or Line-of-Business application like a CRM, a tenant can be a company with 100s of users. Examples: Salesforce.com and Google Apps for Business. Similarly, for consumer-oriented or B2C SaaS, individuals like you and me (yes, the general public) can be tenants. A few examples could be Facebook, Twitter, Dropbox etc. But it's important to note that though these services started as consumer-oriented apps, of late they are trying to diversify into the B2B segment as well. Gmail is a true consumer-oriented SaaS. I must add here that in most cases, B2B services require more sophisticated architectural considerations and development efforts in comparison to consumer-oriented applications. What is a Multi-tenant SaaS? Do you know how a traditional web application is developed and deployed? It's generally designed, developed and deployed keeping in mind the requirement of a single client. Simply speaking, it's like developing and deploying a different code base and a different database instance for each client. So, if you have 100 clients this translates to 100 code bases (builds) and 100 databases to be deployed and maintained! See figure 1 for a graphical representation. Multi-tenant SaaS architecture, on the other hand, is an application that is designed/architected in such a way that it maintains only one code base and one database for all the clients. So, all your 100 clients now may use a single code base and a single database instance! See figure 2 for a graphical representation. How does it Benefit the SaaS Provider? Reduced Support and Maintenance: It's indeed a no-brainer, as maintaining a single code base and database is far easier than maintaining and releasing patches for 100 code bases! 
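As a minimal sketch of how a single code base and a single database can serve every client (the table and column names below are hypothetical, chosen purely for illustration and not taken from any particular vendor's schema), each row simply carries a tenant identifier and the same query path serves every subscriber by filtering on it:

```python
# Minimal multi-tenant data-access sketch: one schema, one code path, many tenants.
# Table and column names are hypothetical, chosen only to illustrate the pattern.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE contacts (
        id        INTEGER PRIMARY KEY,
        tenant_id TEXT NOT NULL,   -- which subscriber owns this row
        name      TEXT NOT NULL
    )
""")
conn.executemany(
    "INSERT INTO contacts (tenant_id, name) VALUES (?, ?)",
    [("acme", "Ada"), ("acme", "Grace"), ("globex", "Linus")],
)

def contacts_for(tenant_id: str) -> list[str]:
    """Same query for every tenant; only the tenant_id filter differs."""
    rows = conn.execute(
        "SELECT name FROM contacts WHERE tenant_id = ?", (tenant_id,)
    )
    return [name for (name,) in rows]

print(contacts_for("acme"))    # ['Ada', 'Grace']
print(contacts_for("globex"))  # ['Linus']
```

Real platforms typically layer access controls, per-tenant encryption or sharding on top of this basic pattern, but the principle - shared code, shared schema, tenant-scoped rows - is the same.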
Also, if you are upgrading your code or infrastructure, it's a one-time effort in one place rather than updating 100 different versions in many places. ...maintaining a single code base and database is far easier than maintaining and releasing patches for 100 code bases! Cost Efficient – Sharing of resources (development and maintenance effort, infrastructure etc.) almost always converts to cost savings. Although a multi-tenant SaaS may require more time and effort initially, it definitely pays in the long run once you've increased subscriber volume. In most scenarios, new subscribers' signup and onboarding processes can be automated for low-touch or no-touch sales and support. A provider with a single-tenant service can't continue to provide the service at an affordable price point, and hence may find it difficult to compete - and may ultimately become financially unviable. Security – You don't need to worry about the security of every client's application anymore. Implement state-of-the-art security features in your single code base and database and deploy it with an infrastructure (IaaS) provider who has a good track record of security and up-time. Peace of mind! How does it Benefit the SaaS Consumer? All the points mentioned above would contribute to making a provider an efficient SaaS operator who can in turn pass on the extra benefits to their subscribers. An efficient service may attract a higher number of subscribers, and since SaaS is all about economies of scale, you have a better chance of getting the service at an affordable price point. Also, as the provider is managing all their subscribers from a single code base and database instance, they will be putting their best efforts into offering a quality service. ...since SaaS is all about economies of scale, you have a better chance of getting the service at an affordable price point. Bottom line Multi-tenant SaaS is like sailing together on the same boat with everyone else. There are indeed some cons as well - for example, even a minor hiccup can affect every subscriber (not within the scope of this article). However, on any day the pros of a multi-tenant SaaS far outweigh its cons, so do enquire about it before you sign up for your next SaaS. ### We are in an Information Technology Renaissance By Daniel Steeves, Director at Beyond Solutions Blame it on a summer holiday in Florence and a Dan Brown novel, but it strikes me that we are in an era of IT renaissance: not only does it feel like a ‘revival' but we actually have the opportunity today for all of this information technology stuff to really start to deliver cost-effective value across the board… if it is done right. While there has never been much argument that Thomas Watson was a little short-sighted in declaring five computers worldwide as the saturation point, I can't recall anyone forecasting a billion users around the world accessing the same website, let alone the overall breadth of where we're at and where we're clearly going. Yet we do take a lot of it for granted - and it is amazing how quickly that happens – things like connectivity (practically) everywhere and cheap computing. And it all continues to drive changes in attitudes and behaviours - for both consumers and businesses – resulting in demand, interest and uptake for IT unlike anything we've ever seen before. And, again, a shot at a serious return on your investment… if it is done right. 
[Incidentally, if you're pushed for time and want to cut to the juicy bit, scroll down to the 'Call to Action' section at the bottom!]

Solving Business Problems, not Problem Businesses

But, to loosely quote myself from a chat with Neil Cattermull during a video interview (to be published next week), "If a business expects Cloud Computing to suddenly sort and solve all of the business problems of the past – such as project delays and budget overruns resulting from scope creep and change requests – then many will be sadly disappointed". If a business expects Cloud Computing to suddenly sort and solve all of the business problems of the past - such as project delays and budget overruns resulting from scope creep and change requests - then many will be sadly disappointed...

While I do appreciate that many of the contributors to said risks have changed hands and may well be reduced or just plain go away, whatever is being delivered from the Cloud still involves and impacts the users of the business and current IT delivery. Cloud can provide solutions for businesses, if done right… but it doesn't change how the business itself is being run. There is one significant difference from the days of yore, though: there are now vast amounts of clever information and opinions from subject matter experts – ranging from advisors and providers to end users and end-user businesses – all of whom have experienced either the pain or the joy (or, more typically, a little of each) on the journey they've been taking. Add to that the encyclopaedic knowledge – free for the taking – on sites like this one (our hosts at Compare the Cloud)… today we have the basis, the platforms, the knowledge and the resources to do things right. In a moment I am going to ask you to add to the sum total and help to get it all a little more 'right'.

Doing IT Right

...we have the knowledge of what needs to be done if we want it to be done right, consistently… don't we? In this industry we do know what "right" means, not to mention what wrong looks like: after 40-plus years of business IT education the hard way, the industry as a whole pretty much agrees on the basics (even if it constantly argues the specifics). Starting with requirements and working through to planning and execution, we have the knowledge of what needs to be done if we want it to be done right, consistently… don't we? We should also, by now, know that spending money to create 'automated' replications of existing and sometimes ancient processes, thought patterns and approaches – rather than investing the time, effort and money to at least investigate prospects to improve, streamline, remove bottlenecks and introduce other business efficiencies in a new technology-based solution – would probably not be the preferred approach… but we'll save that discussion for another time!

Lessons from Leonardo and the Italian Renaissance

The point of all of this is that I've started thinking that maybe, like common sense, those lessons presented practically aren't so common – and that maybe our own IT Renaissance is missing a few bits. The movement of information, experience and views made the Renaissance "matter". So to start with, here are three key things that we should (or could) learn from the thought processes of 15th-century Italy and apply to today's business technology environments:
1. Renaissance innovators sought inspiration and knowledge from the great thinkers that came before them… where today's business users and cloud providers need to consider and then apply or adapt best practices and methods in order to derive maximum value.
2. During those ancient days the focus shifted towards individual rather than institutional achievements, and people realised that they could be in control… in the same way, cloud computing enables and encourages small businesses and start-ups, putting previously unaffordable capabilities within relatively easy reach.
3. Finally, the Renaissance came to matter because of the movement of information, experience and views across borders… this one seems kind of self-explanatory.

CALL TO ACTION: A Little Homework for the Experts

What we'd like to see are some clear, outcome-focused lessons, best practices and styles or approaches from the past which you believe should apply today... As I see it, whilst numbers 2 and 3 (above) seem to be well in hand, number 1 on that list could use some help! So I was thinking: why don't we compile our own list of lessons learned which we feel apply directly to the cloud space? And who better to compile such a list than you – the readers of and contributors to Compare the Cloud – a group including some of the people I consider the most experienced and knowledgeable in this space? So I invite you to share those gems of experience from your past which you deem critical success factors for today's IT landscape. No, you aren't required to spin a specific Renaissance connection: what we'd like to see are some clear, outcome-focused lessons, best practices and styles or approaches from the past which you believe should apply today, or which you have been applying today and throughout your career: you know, that uncommon commodity of common sense... I look forward to a solid, educational and useful list (and thanks in advance for participating!)

### The Growth of Cloud Computing in China

By Christopher Morris, Tech Commentator

It is widely accepted nowadays that the cloud is about to change computing as we know it and become a dominant technology in the very near future. But while this has been accepted intellectually, and the Western world has begun to embrace this exciting technology, the focus of attention with regard to this topic has thus far been largely centred on the Western market. Few people have considered how cloud computing will develop, and be developed, in the so-called emerging nations and economies. It hardly seems appropriate to describe China as an 'emerging economy' today. Most people who pay attention to economic trends already recognise that the Far East nation will become an economic powerhouse, and the world's most prominent and powerful superpower, in the very near future. This has been evident to some of the Western world's most perceptive prognosticators for quite some time, but with China having recently become the world's second largest economy, the days of US hegemony in the world economy appear to be seriously numbered. Thus, the Chinese market is becoming an extremely fertile one for Western corporations and products, as well as an important one in its own right. And given that cloud computing is to become such an important facet of the IT industry, and a phenomenon that will feed into many other commercial industries as well, it is natural that the cloud will need to be developed in China.
...the Chinese market is becoming an extremely fertile one for Western corporations and products

But given the rather restrictive attitude that the autocratic Chinese state has taken to Internet censorship, no-one is quite sure yet how the cloud will develop in China. Certainly, if one looks at the existing 'net in the nation, it is very different in appearance to that which we are accustomed to in the West, thanks to the 'Great Firewall of China', which blocks huge amounts of web content from the Chinese public. So what will the Great Cloud of China look like when it's developed, how is this process currently emerging, and what consequences will it have for the cloud and the direction of the IT industry as a whole?

In fact, cloud computing in China is growing rapidly, despite the fact that the industry is still in its infancy. The number of web-facing computers within the country has grown by nearly ten percent in the last year alone, and the vast majority of this growth has been attributed to the cloud sector. The largest cloud computing provider in China, Aliyun (a spin-off from the giant Alibaba), now has six times more web-facing computers than it did a year ago. The largest cloud computing provider in China, Aliyun, now has six times more web-facing computers than it did a year ago.

This exponential growth has been achieved thanks to critical financial support from the government. The Chinese government has targeted growth in this sector for much the same reason that the industry has expanded rapidly in the United States: the technology makes data storage convenient, and the maintenance costs associated with cloud computing are extremely low. Thus, the Chinese government declared the expansion of cloud computing a priority in its 12th Five-Year Plan, released in 2011, alongside a raft of other measures intended to stimulate next-generation industries within the country. The city of Beijing alone received more than $8 billion of support to construct servers and other cloud-related infrastructure. The city of Beijing alone received more than $8 billion of support in order to construct servers and other cloud-related infrastructure.

It is hardly surprising, then, that Western corporations are already eyeing the Chinese market hungrily. Microsoft has already signalled its intention to tap into this potentially multi-billion dollar source of revenue by stepping up the promotion of its cloud services, Windows Azure and Office 365, within the nation, as well as putting its promotional muscle behind smartphones and tablets produced by the computing giant which run the Windows Phone 8 platform. Microsoft launched Windows Azure in China recently, and Steve Ballmer, the chief executive of Microsoft, has suggested that the firm's revenue from the China market – comprising the mainland, Hong Kong and Taiwan – will surpass that from the United States in the near future. However, it is not all plain sailing for the cloud in China. A report to the United States-China Economic and Security Review Commission has stated that laws in China which require foreign companies to partner with local firms could raise security concerns for Western companies, while the Great Firewall of China itself has a seriously negative impact on Internet speeds, potentially hampering the industry's development.
...laws in China which require foreign companies to partner with local firms could raise security concerns for Western companies

Nonetheless, the potential for expansion in China is obvious. The country has the world's largest population of Internet users, and by the end of 2013 there will be 500 million smartphones online in China. The nation will also soon boast the world's largest number of English speakers – a triad of factors which is sure to mean that the Chinese cloud market will soon be up there with the biggest in the world.

### PCI Compliance in the Cloud: What you need to know

By Gilad Parann-Nissany, CEO of Porticor

Cloud Computing – the buzzwords of the technology sector this decade: if you're not already doing it, you're missing out. Articles have been written. Experts have been crowned. Events have been attended. We all agree – the cloud presents opportunities for cost savings, elasticity, and scalability. But for companies that are bound by the Payment Card Industry Data Security Standard (PCI DSS), securing financial data in "the cloud" presents new issues.

How is the Cloud Different? Securing brick-and-mortar businesses was one thing, and securing data centres and hardware was an added level, but securing the foggy boundaries of the cloud presents a new set of challenges. The skills and knowledge you acquired in the data centre are still very relevant to the cloud world. However, the most obvious change is that physical walls are no longer available to protect your systems and data. Cloud encryption is the answer – producing "mathematical walls" to replace the physical ones. ...securing the foggy boundaries of the cloud presents a new set of challenges. Cloud Encryption is the answer – producing "mathematical walls" to replace the physical ones. Are these challenges manageable? Yes. Should you take them on? Yes. Should you do it alone? Oh no…

PCI Compliance and Encryption in the Cloud: The Challenges. Six of the twelve requirements of PCI DSS touch on the need for encryption and key management in the cloud, and on proper management of these systems. The main challenges in complying with PCI while operating in public or hybrid clouds are:

- Protection methods such as hashing and encryption (part of requirement 3)
- Encrypting transmission over networks (requirement 4)
- Securing systems and applications (requirement 6)
- Restricting access to data (requirement 7)
- Assigning unique accountability (requirement 8)
- Tracking and monitoring access (requirement 10)

These are unequivocally big topics. But you unequivocally do not need to take them on alone. Solutions like our Virtual Private Data (VPD) combine state-of-the-art encryption with patented key management to enable organizations to comply effectively with PCI DSS in the cloud.

Protection Methods: The Solution. PCI DSS stresses the importance of protection methods such as hashing and encryption. PCI DSS stresses the importance of protection methods such as hashing and encryption since "If an intruder circumvents other security controls and gains access to encrypted data, without the proper cryptographic keys, the data is unreadable and unusable to that person." Our solution was designed with PCI compliance especially in mind, sporting features such as:

- Strong hashing (SHA-2) and encryption (AES-256) to render PAN unreadable.
- Key-splitting and homomorphic key encryption to protect the integrity and security of the keys. Only partial keys are stored in any location, and those parts are also encrypted.
- Exact mathematical descriptions and proofs of the strength of the protocols, which have been validated by leading cryptographic experts.
- Support for AES-256 and RSA public keys from 1024 to 4096 bits, and secure storage of keys of all major crypto systems of any length.

Encrypting Transmission: The Solution. According to the PCI standard, sensitive information must be encrypted during transmission over networks that may be accessed by malicious individuals. Whichever solution you use, make sure that:

- All communications within the system are always encrypted.
- SSL/TLS is always enabled and cannot be switched off.
- There are mechanisms for issuing certificates for SSL/TLS encryption on a per-customer, per-project basis.
- Your solution supports IPsec communications between cloud servers.

Securing Systems: The Solution. Choose a solution that helps you apply the latest software patches quickly and easily. Requirement six emphasizes the importance of keeping systems up to date with "the most recently released, appropriate software patches" in order to eliminate security vulnerabilities that could be exploited by hackers or insider threats. Choose a solution that helps you apply the latest software patches quickly and easily.

Restricting Access: The Solution. Systems and processes must be in place to limit access based on need to know and according to job responsibilities. This requirement relates both to the data itself and to the management and storage of the encryption keys. Encryption is a great way to isolate data in the cloud. It depends, of course, on keeping the encryption keys safe and ensuring no unauthorized person has access to them. The solution to this is quite straightforward: administrators should never be able to see the keys that are used to encrypt cardholder data. Keys should be managed by name, with the value always hidden. Since administrators do not know the keys, they are unable to decrypt the data (a brief illustrative sketch of this idea appears at the end of this article).

Assigning Accountability: The Solution. Assigning a unique identification (ID) to each person with access ensures that each individual is uniquely accountable for his or her actions, so that operations on critical data and systems can be traced.

Tracking and Monitoring: The Solution. Logging mechanisms and the ability to track user activities are critical in preventing, detecting, or minimizing the impact of a data compromise. Your logs must be secure and stored in a way that they cannot be modified. PCI compliance [...] these provisions protect your customers, which in turn, protect you.

PCI Compliance in the Cloud: Is it Worth the Effort? Complying with PCI DSS is not just about legality – it is simply good business. There are many requirements to PCI compliance. Perhaps life would be easier without them. But these provisions protect your customers, which, in turn, protects you. PCI compliance does not have to be cost-prohibitive. It does not have to take a lot of time. But it absolutely does have to be done, and it unequivocally is worth the effort.

About the Author & Porticor. Gilad Parann-Nissany is Founder and CEO of Porticor, a cloud computing security pioneer. Porticor infuses trust into the cloud with secure, easy-to-use, and scalable solutions for data encryption and key management. Porticor enables companies of all sizes to safeguard their data, comply with regulatory standards like PCI DSS, and streamline operations.
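As a purely illustrative footnote to the key-management principle discussed above – keys referenced by name, values never exposed to administrators or application code – the sketch below shows the general shape of that pattern. It is not Porticor's implementation: the key store, names and functions are hypothetical, and it uses the widely available Python `cryptography` package, whose Fernet recipe is based on AES-128 rather than the AES-256 described in the article.

```python
# Hypothetical sketch: applications encrypt cardholder data by *key name*;
# the key material stays inside the key service and is never shown to
# administrators. Cipher choice here is for illustration only.
from cryptography.fernet import Fernet

class KeyService:
    """Stands in for a managed key store; in practice this would be a
    separate, hardened service, possibly holding split or wrapped keys."""
    def __init__(self):
        self._keys = {}  # key name -> secret key material (never exported)

    def create_key(self, name: str) -> None:
        self._keys[name] = Fernet.generate_key()

    def encrypt(self, name: str, plaintext: bytes) -> bytes:
        return Fernet(self._keys[name]).encrypt(plaintext)

    def decrypt(self, name: str, ciphertext: bytes) -> bytes:
        return Fernet(self._keys[name]).decrypt(ciphertext)

keys = KeyService()
keys.create_key("cardholder-data-2013")          # admins see only the name
token = keys.encrypt("cardholder-data-2013", b"4111 1111 1111 1111")
print(keys.decrypt("cardholder-data-2013", token))
```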
### Top 10 Things to Consider Before Moving to the Cloud By Neil Cattermull, Director of Cloud Practice, Compare the Cloud Introduction The cloud is here to stay and it makes sense to use it. The market is already developing and changes will occur over the next few years that will allow you to have even more choice than today. Microsoft will inevitably offer a complete hosted solution, along with Oracle, VMware and many, many others. We are also seeing disparate systems on disparate platforms (AWS, Google and others) being linked together, managed by complicated orchestration products. The greatest problem with all of these platforms and services is likely to be the support element, so when choosing cloud for your business, make sure you are asking the right questions: 1. What cloud services do you need? In order to choose the option that is best suited to your business, it is vital to understand exactly what cloud services you need. In order to choose the option that is best suited to your business, it is vital to understand exactly what cloud services you need. There are so many on the market, from full infrastructure hosting and application delivery through to managed backup and disaster recovery services. Even if you only want to move one or two of your services to the cloud at the moment, think about whether you may want to extend this in years to come. Choosing a provider that offers them all could give you more flexibility in the long run. One of the features of a cloud solution is its ability to scale up and down to match your size, but you should still ensure that the provider's capabilities match your plans for growth. 2. Who am I dealing with? Many cloud companies do not in fact have their own infrastructure but resell from others. This need not be a problem – it is common practice for a cloud provider to sell services via a channel of smaller resellers – but you should make sure you know who is actually providing them! It is quite possible that you will receive better support from a smaller value-add reseller, but you need to know whose customer you are, and who is ultimately responsible for the services you are buying. 3. What about the contract? Standard hosted contract terms are often 24 - 36 months, with shorter terms generally attracting higher costs. Some cloud providers, however, are now starting to offer a 12 months or less contract or “pay as you grow” option. This can be helpful if the provider is new to you or even new to the market place, and will allow you to gauge the type of service you will receive without making a long term commitment. 4. The Service Level Agreement Don't get tied into a service that just isn't working for you. This is very important. Don't get tied into a service that just isn't working for you. Check the terms and conditions for material breaches and downtime. Many providers offer compensation but this is likely to be insignificant compared to a loss of service for your business if your entire company's infrastructure is running remotely. A good provider will give you the option to terminate the service if the SLA is consistently breached but beware there are many providers that will not. 5. Where is my data? There are so many reasons why you should know this. If you do not know where your IP (intellectual property) or data is, then how can you get this back if you fall out with your provider? It is YOUR data and YOU need to know where it is. A good provider will give you access to it, no matter what the circumstances, at very short notice. 
Beware "safe harbour" agreements too. Although they are designed for data protection, they often fail to stand up if challenged. If you are offered one, have it checked thoroughly by a lawyer.

6. Security. You must ensure that your provider has the appropriate security to safeguard your business. This is a very important point and it should be right up there with "should I have cloud services for my business?" At the end of the day, your data is accessible from the internet (and we all use it in one form or another). You must ensure that your provider has the appropriate security to safeguard your business. ISO standards are a good base from which to grade the provider's competency in this area, but there are many other standards that can also be adhered to. Note that if you are regulated by a governing body such as the FCA (the FSA of old) or under HIPAA (health care), additional security standards are required. Make sure these are not just tick-in-the-box accreditations – challenge the provider on what they offer.

7. Internal policies. As well as being good security practice, security policies for your business are essential for cloud services. 'Password123' is not good enough! Staff will use applications to share information, whether you know it or not. On average, internet users have 25 password-protected applications to manage, but only six (or fewer) unique passwords. Using a cloud password management platform that enables employees to access all their applications with one password (single sign-on) will help to provide a better experience while securing company access and data.

8. Check for hidden costs. One major problem with all the options available today is normalising the offerings to get a fair comparison. You can compare features and functions with a bit of research – using comparison tools such as those on Compare the Cloud – however, providers differ not only in functionality, but also in costs and billing methods. Make sure you get to the bottom of the provider's pricing. For example:

- CPU costs: 2 Core (@2.5GHz) with 2GB RAM costs £x.xx/instance/month
- Storage costs: 1GB usable storage, SAN/NAS based on 10TB base infrastructure = £x.xx/GB/month
- Backup costs: £x.xx/GB backed up
- Network costs: £x.xx/GB/month transferred in and/or out

9. Availability. Consider how your company's business handles network, system and other failures. Does the cloud infrastructure need to be highly resilient, or can individual parts fail without causing a major service interruption? A good cloud provider will have a replicated copy of your infrastructure (for their own internal disaster recovery plan). A good cloud provider will have a replicated copy of your infrastructure (for their own internal disaster recovery plan). Some providers will charge you for this, and some providers will simply not have it and gloss over the discussion with you. A good start would be to discuss where your provider is hosting your service – the data centre – and ask about its Tier level. Every data centre can be graded by this tiering (and should be), and the results will be easy to understand when you receive them:

- Tier 1 = Non-redundant capacity components (single uplink and servers).
- Tier 2 = Tier 1 + redundant capacity components.
- Tier 3 = Tier 1 + Tier 2 + dual-powered equipment and multiple uplinks.
- Tier 4 = Tier 1 + Tier 2 + Tier 3 + all components fully fault-tolerant, including uplinks, storage, chillers, HVAC systems, servers etc. Everything is dual-powered.
A Tier 4 data centre is considered the most robust and least prone to failures. Naturally, the simplest is a Tier 1 data centre, used by small businesses or shops.

10. When it all goes wrong. So you now have a cloud service, or multiple services, and it all goes horribly wrong. How do you migrate away from the incumbent failing provider? Make sure that you are not handcuffed to large exit bills and contract penalty clauses. There are some test cases where clients were asked to pay extortionate fees just to keep their cloud services running after the firm went into financial hardship.

If your business is considering a move to the cloud and you need some advice, contact Compare the Cloud's Cloud Practice Group and we'll provide you with some free advice and details on other ways we may be able to help make your transition a smooth and happy one.

### Why can't I move straight to the cloud?

By Matthew Tyler, CEO of Blackfoot UK

I hear an awful lot of rubbish spouted about 'clouds', mostly relating to risk and security concerns. I also see an entirely new industry all vying for the inevitable IT spend. The main question we hear is: should I move to the cloud? This is not the question that you are looking for… Organisations now require IT as a utility, with equal importance to other utilities such as power and water. Put simply, why would you want to generate your own power when it can be delivered to you down a pipe? The benefits are virtually endless: you don't have to mine the coal or burn the coal, and you don't have a supply chain to manage or worry about having enough coal – the same theory now applies to IT and the cloud. The question you should be asking is… Why can't I move straight to the cloud?

I've spent more and more each year getting less and less secure… What has all this money achieved? Put simply, why would you want to generate your own power when it can be delivered to you down a pipe?

Over the last 10 years, UK PLC has spent over £30 billion on IT security, a figure rising at 10% per year, while the ICO states that the number of data breaches has risen tenfold in the last 5 years. This question can be answered both in simple psychological terms and in a more complex way, by taking into account changing technology and a consumer revolution. IT PLC has failed abysmally at translating our unique language so that normal people can understand. IT PLC has failed abysmally at translating our unique language so that normal people can understand. An example is footfall vs. visitors: who would want more hits of no value? Surely any sane person would want fewer hits and more users. The historical answer was to purchase something physical with lights on, which everyone can point at and feel that they have received value for money. The people who have paid for it can look at it and wonder what it does, but they can see the little lights flicker, hoping that this is the last 'little box' they are asked to buy but always knowing that in reality another flickering box will inevitably follow.

The more complex technical answer is that over the last 10 years the OWASP application weakness top 10 has not changed and applications are still as weak as they were then. The difference is that everything is now hyper-connected, and hacking is far easier using 'point and click' kits such as Blackhole, requiring little to no skill. Many IT people aren't application aware, having come from a transport or build background. The networks they built were strong at the edges: secure, fast, controlled.
Vast sums were spent on creating the best internal networks to quickly deliver the applications organisations relied on for their daily activities. These systems kept the baddies away from the small number of tightly controlled points where data would enter or leave their shiny, fast, state-of-the-art networks. But with today's mobile technology, these once tightly controlled points are being relaxed to cater for the need for mobility in business. The problem is Moore's Law versus simple accountancy. Over the last 10 years consumer tech has become cheaper, faster, more reliable and easier to use. Consumers are less willing to 'sweat an asset': consumer tech is cool, it's about having the latest, fastest, smallest device and being able to access applications on the move. People use their own devices to do their work because they are more familiar, more convenient, more stable and, of course, cooler. Organisations, in a drive for increased efficiency, have sweated their 'infrastructure assets' as well as opening once very closed applications up to the general public: CRM systems are now online client engagement portals, and email is now accessible anywhere at any time. Applications, however, have been widely ignored: 76% of large-organisation data theft comes from web servers, databases and mail servers, and in 72% of cases the time from initial attack to data extraction is a matter of seconds or minutes. The answer to the cloud question is: if your application security is not ready for a move straight into the cloud, then what have you been doing for the last 10 years?

About Blackfoot: Blackfoot is an information security and compliance specialist dedicated to protecting clients' reputations and profits by providing strategic and pragmatic advice to reduce risk, liability and costs.

### Why is "Going Mobile" in the Enterprise considered such a giant leap?

By Stefano Buliani, Founder of Cloudbase.io

Large organizations are sometimes scared of the "go mobile" motto because, in their minds, "mobile" involves a complete rethink of their system architecture, which will require a large amount of time and money to achieve. Our research at cloudbase.io shows that large enterprises spend upwards of £300,000 of their IT budget to develop the integrations necessary to deploy a single mobile application. This is seriously slowing down mobile adoption in the enterprise. Mobile is a terminal, an end-user platform used to interact with data that already exists. It should not require architectural changes or significant development. All that's needed is a way to push this data to mobile devices without requiring significant development work. Cloudbase.io helps reduce the time and resources needed, often showing a 200 to 300%-plus ROI, enabling a new mobile "idea" to go from concept to deployment in the quickest possible time. This enables independent and enterprise developers alike to concentrate on designing the best app possible rather than worrying about how to connect their existing systems with each other and with their brand new mobile application.

Cloudbase.io middleware component. At cloudbase.io we have made a point of supporting all of the smartphones available on the market. Our middleware platform creates a simple layer of RESTful APIs in front of any system or database deployed in your network, and our client libraries allow you to query these APIs from any mobile or backend platform without missing a beat or wasting time messing around with HTTP connections and parsing.
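To make the middleware idea concrete, here is a minimal sketch of what querying such a RESTful layer might look like from any platform. The endpoint, resource name, credential and JSON shape are invented for illustration and are not cloudbase.io's actual API; the point is simply that the client speaks plain HTTP and JSON rather than talking to the underlying database directly.

```python
# Illustrative only: a hypothetical REST endpoint standing in for a
# middleware layer that fronts an existing back-office database.
import requests

BASE_URL = "https://api.example.com/v1"       # hypothetical middleware host
API_KEY = "demo-key"                          # hypothetical credential

def list_customers(city: str):
    """Fetch customer records through the REST layer instead of querying
    the back-end database directly."""
    resp = requests.get(
        f"{BASE_URL}/customers",
        params={"city": city},
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()   # surface HTTP errors instead of hand-parsing
    return resp.json()        # the middleware returns plain JSON

for customer in list_customers("London"):
    print(customer.get("name"))
```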
There are countless benefits to be gained from embracing mobile. Productivity will increase for a number of reasons: People will be able to work on the move, mobile applications allow for smarter and faster work. Imagine if all you had to do to send a meter reading for your electricity was take a picture on your mobile – the app would automatically read and recognize all the numbers on the meter and send it to your account – job done in 30 seconds. Imagine capturing the mood of a customer as they consume your product with a mobile “mood” app. Platform fragmentation is also a significant concern for the enterprise. While BYOD (Bring your own device) is all the rage these days it poses significant challenges in terms of development, security and architecture to the teams running the system. Each mobile platform is different, requires different set of tool to manage and is not necessarily ready for enterprise-level security requirements. Additionally, when it comes to building mobile applications, each platform uses a different technology, programming language and architecture; this seriously hinders the BYOD dream. Cloudbase.io has system integration covered - development is next. For the enterprise space a safe bet is HTML5 and JavaScript apps - by adapting the same standards created over decades in the web space we can ensure the development process happens only once for all platforms, despite Internet Explorer. This will allow the fast development and deployment of rich mobile applications. We have been following the mobile and apps market closely over the past two years. While it may seem it is now a mature market, we see so many new possibilities appearing every day thanks to advancement in hardware technology and software that we think mobile is still in its infancy, especially in the enterprise world. The next few years are going to be critical and will define the ground rules that will govern the enterprise app market for the forthcoming future. We very much look forward to it.   ### The Expense of Expense Management By Peter Martin, Acubiz UK & Ireland Managing expenses isn't the most electrifying subject. Until, that is, that you realise how much time and money you're spending on this mundane but vital process. And that money comes straight off your bottom line. In fact, unless you have got it under control, the chances are you won't know how many hours every one of your employees - and especially those commission-hungry road warriors - are devoting to recording and claiming the costs of their travel; whether or not any of the claims, for whatever reason, is less than accurate; and whether you are claiming the VAT reclaim which can inject cash flow back to your bottom line. ...realise how much time and money you're spending on this mundane but vital process Even if your employees are dutifully filling in and submitting faultlessly completed spreadsheets, your finance staff will be busy checking them, reconciling them with electronic receipts from airlines and hotels, and re-entering all the necessary information into your back-office systems such as SAP so the payments can be processed and reconciled. And if your hardworking staff is constantly on the move travelling abroad, are you making sure you reclaim not only the VAT from UK travel, but from all those journeys and stays in other countries? Research suggests that less than one percent of all companies get that right. 
The VAT reclaim on international travel can be significant and contribute to a decent cash flow that one can't ignore in today's economic climate. And all this is hardly the thing your business needs to concentrate on. Claim your Pot of Gold ...start with your employee photographing his or her receipts with a smartphone and recording the necessary details “on the go” The good news is that it's possible to improve the management of expenses beyond recognition, by adopting a solution that automates your processes from one end to the other. And that can start with your employee photographing his or her receipts with a smartphone and recording the necessary details “on the go”. The results can be dramatic: it's possible to reduce the costs of processing expense claims and reimbursement by up to 75%, and to reduce the time spent entering all that data and moving it from system to system by 85%, freeing your team to do things that add value. At the same time, you can gain total visibility and control of travel spending, ensuring that all your staff complies completely with your expense policies, and making it easy to spot patterns or behaviour that might need attention. And by making it so much simpler and more automated, you'll reduce the error rate by up to 95%. Cloud and Expense Management: a Match made in Heaven Expense management and the cloud were born for each other…allowing true innovation by simplifying the entire process. Expense management and the cloud were born for each other…allowing true innovation by simplifying the entire process. First of all, you no longer have to choose and buy software that you then have to install and support on hardware you have to buy, with the entire headache that it brings with it. Software updates to apply; extra servers or storage in time to stop the system slowing to a crawl; add of course, all the cash up front you'd prefer to be spending on applications that bring you more customers or keep them happier - or not spend at all. Now, you can simply subscribe to a service you can use on the web. You pay each month for the number of users you're managing or the number of reports they're submitting, without investment needed. The hardware, software and all the complexities of looking after it, the responsibility for making sure there's enough spare capacity for everyone, keeping the service running - they're someone else's problem. And once your expense management is centralised and “in the cloud”, two other huge opportunities are open to you. First, your travellers can use their smart phones to capture what they spend as soon as they spend it - a photo of the receipt, a few keystrokes and the data's already safe and in the system. No agonising over who was at that business dinner in Singapore, or what the taxi was for in Berlin or worry about a missing receipt/s. Secondly, it's easier to hook in to the systems your travel partners are using - credit card companies, travel agencies, hotel chains and airlines - so that all those electronic receipts can flow effortlessly in without involving your finance team. The system allows for daily data update from the credit card providers as well as travel management systems. Covering the Ground But that isn't to say you don't need to take some care over choosing who will provide this service. If you have offices in other countries, are their local currencies, languages and legal requirements catered for, and if not, how long will take your new partner to put that right? Months or days? 
Like most organizations, your combination of expense and accounting procedures and back office systems will be unique in one way or another. So to automate fully, your expense management system must be able to integrate properly with those back office systems. And you'll probably want some customisation to get things the way you want - although we recommend that you re-visit your expense management procedures as well: you'll probably find you can simplify things a lot. So you need a solution that can be adapted easily to your needs, and a provider who'll help you get the job done. Above all, is the solution easy to use? Will your travellers need face to face training, and then still flood you with questions, or is the system easy and intuitive to use, and supported with helpful online videos of the key tasks? Make sure you're on the Money How do you spot an expense management system that's going to deliver? Does the solution provider really understand expense management, and the specific challenges and requirement in European countries? Will they share that experience with you to help you save as much time and money as possible? YouTube Video: YouTube.com/watch?v=8ziAIGSnZB4 Is it a true cloud, “pay as you go” solution: are you charged only monthly, and only for the users or reports you actually need to handle? Can your employees easily capture and enter information and receipts using their smartphones? Does the solution support all the countries you need? If not, is the service designed so that the language, currency and other localisations can be added quickly with a simple and modular set of changes that don't affect the rest of the system? How easily will it connect to your back office systems, and will it automate the exchange of as much data as possible, with as little fuss? How many travel partners does it support, and how easy will it be to connect to any that you use? How quickly will you be able to have the service up and running the way you need it, and your staff using it? Does your provider have a tried and tested implementation process? The faster it's done, the faster you reap the benefits and the less employee time you'll have had to invest in getting there. Author Peter Martin: a strategic marketing expert in enterprise IT hosted services and software. His ahead of the curve thinking on Cloud computing, his ability to distil and articulate clearly the business value of complex technology has created a portfolio of services generating profits and growth. Peter Martin in his capacity as a Strategic Marketing Consultant for Acubiz UK&I tackles the key issue of "Value Creation" with cloud based expense management solutions for organisations. ### Is the writing on the wall for Cloud Computing? By Theo Priestley The IT industry moves in cycles. Much like the fashion industry, trends come and go then return with a bang. It seems that Cloud Computing is about to hit that cycle now and go out of fashion faster than it came in. The NSA and PRISM break out has exposed data security at a global scale that nobody was, but really should have been, prepared for. But it's also raised eyebrows across the C-level in the enterprise as to whether they want to invest in a cloud strategy now. An exchange this morning with Ray Wang of Constellation Research revealed that 100 clients think that the US Govt has screwed over every cloud vendor with what's happening and that on-premise software is back on the menu. 
It's hardly surprising: you could tell the writing was on the wall when Snowden hit the headlines, but the turnaround is proving a lot faster. "We see the writing on the wall, and we have decided that it is best for us to shut down Silent Mail now." But it's not just Cloud vendors who are sweating. Third-party security and encryption vendors have basically been shot down in flames too. This week alone two email encryption services, Lavabit and Silent Circle, have shut up shop because of the NSA spying. Silent Circle told customers it has killed off Silent Mail rather than risk their privacy. Companies are willing to close their own doors rather than compromise their own values and risk the data they hold for customers being targeted and used, and this raises a lot more questions around whether encryption services are really worth it if governments can just come knocking and ask for the information to be handed over. So where do we go from here? Back to on-premise like this never happened? Hybrid Cloud? (no!) It's just another pendulum swing for IT: on-prem, off-prem, on-prem… and repeat. Remind anyone of the BPO cycle? For vendors who have a core strength in on-premise software, this is the absolute right time to shout about it. The writing may be on the wall for Cloud for now, and much like the fashion industry, where trends come and go then return with a bang, with Cloud it'll be a thunderclap. Watch the skies…

### Overcoming Disaster

By Neil Cresswell, CEO, Virtus Data Centres

Research just published by Zerto (who surveyed IT managers, VMware admins, system admins, plus business continuity and disaster recovery experts) found that 42 per cent had suffered an outage in the last six months. It also found that a combination of power loss/interruption and hardware failure accounted for two thirds of all outages. The survey (reproduced by permission of Zerto below) quite correctly concludes that all businesses need a working disaster recovery (DR) process. The first of the two main causes of downtime that the research identified – responsible for 31.5 per cent of outages – was power loss or interruption. Although the research doesn't drill down into the type of IT installation in these cases, it would be reasonable to conclude that most of these occurred in companies' own server rooms. The reason is fairly simple – most enterprises simply cannot justify the cost of deploying the level of redundancy that is available and deployed at major Tier III colocation data centres. At a major colocation provider like Virtus, the economies of scale mean that it is easy for us to provide dual power feeds from the national grid, plus multiple power and cooling distribution paths. Redundant and concurrently maintainable UPS (Uninterruptible Power Systems), generators and other components are all available and able to withstand at least 72 hours of failure of the national grid supply or any other component. In terms of the other major factor identified by the survey – hardware failure – there are considerable advantages to using 'cloud' providers rather than operating your own hardware. Many of Virtus' customers are cloud providers and offer solutions with in-built redundancy at the hardware level. Whilst these don't stop hardware failures – that would defy the laws of physics – they do stop outages due to hardware failures, because their IT infrastructure is designed to be fault-tolerant.
This extra level of resilience is available whether the cloud provider is selling IaaS (Infrastructure as a Service) PaaS (Platform as a Service) or SaaS (Software as a Service). So, as part of the disaster recovery plan recommended by Zerto as a conclusion from their research, two thirds of the problems can be very simply mitigated by switching from in-house server rooms to cloud-style computing provision in high efficiency, well staffed, tightly managed and high reliability Tier III data centres. We can proudly say that since Virtus opened for business in March 2011 we have had a completely solid 100 per cent up time. The Zerto research is reproduced here by kind permission. ### Drawing a line in the Cloud By Ed Macnair, CEO of SaaSID The consumerisation of IT has increased the sense of entitlement among employees, who expect to be able to access information wherever they have an internet connection. Cloud-based platforms are perfect for meeting this demand and businesses have quickly recognised the productivity and cost benefits of migrating to Google Apps and Office 365. The barrier to making the move to either platform has been the question of how to prevent corporate information being shared beyond the authorised user base. Taking Google Apps as an example, while it enables employees to access, process and share documents quickly, intuitively and cost effectively, its inherently open structure has proved to be a sticking point for CIOs and CISOs. The barrier: how to prevent corporate information being shared beyond the authorised user base. Organisations adopting Google Apps have to provide employees with access to Gmail. But employees can then login to a consumer Gmail account and attach files to personal messages: risking unintentional sharing of documents outside of the company. If a project manager has legitimate access to financial documents via Google Drive and shares them with another employee, these documents could then be onwardly shared. We have helped organisations to address this issue, by allowing corporate only Gmail access and blocking all consumer Gmail. By configuring our browser-based software to inject specific headers into web pages, CIOs can control the list of authorised domains that employees can log in with. For example John Smith can log in to corporate Gmail using "john.smith@comparethecloud.net", but not using his personal Google account, "john.smith@gmail.com". This helps organisations adopt Google Apps, by drawing the line between personal and corporate Gmail. This helps organisations adopt Google Apps, by drawing the line between personal and corporate Gmail. Google Apps was built with collaboration in mind. Using our browser-based agent, CIOs can centrally manage who has access to Google Apps features that enable document sharing, regardless of the device used, or whether it is behind the corporate firewall or proxy. In the same way, access to Office 365 features can be governed and audited through the browser, enabling CIOs to apply the equivalent levels of control that have traditionally been used to govern Windows applications that are run on-premise. Now Apple developers have been given the opportunity to try the beta version of iWork on iCloud. This will allow colleagues to access documents wherever they have a browser and synchronise corporate documents and presentations across all Apple devices. Apple's browser-based collaboration platform is expected to be available to businesses this autumn. 
iWork on iCloud will allow Apple fans to access and collaborate on documents, spreadsheets and presentations in browser-based versions of Pages, Numbers and Keynote, in the same way that businesses currently use Google Apps and Microsoft Office 365. To enable employees to benefit, CIOs will need to consider how to manage corporate documents that are shared via iCloud and how to revoke access to the corporate iCloud instances when employees move on. Single Sign-On (SSO) to web applications [...] can realise the cost and productivity benefits [...] without making any changes to the back-end applications. They will need to ensure that they retain control over who can access documents within iCloud, particularly if Apple devices are shared between colleagues or family members. However, while this sounds like an additional burden on CIOs, the most effective method of managing corporate documents shared through iCloud is the same as for any cloud-based collaboration tool. We have found that by providing Single Sign-On (SSO) to web applications - and controlling access to specific application features - organisations can realise the cost and productivity benefits and retain control over documents that are shared using Google Apps, or Office 365, without making any changes to the back-end applications. CIOs will be able to gain the same level of control over iWork for iCloud when it launches later in the year. Applying on-premise equivalent control over application features helps to draw the line in the cloud between personal and corporate accounts so that employees can be more productive, without increasing risk. For further information on browser-based application control, download our whitepaper, “Trusting Google Apps for Business: how flexible, simple controls protect data in the Cloud” About SaaSID SaaSID Single Sign-On provides organisations with a single point of control over access to web applications that minimises the risk of unauthorised access to sensitive data. Find out more from our company profile or www.saasid.com. Follow SaaSID on Twitter: https://twitter.com/SaaSID and connect with SaaSID on LinkedIn www.linkedin.com/company/saasid-limited. ### What role does IBM PureSystems have to play in the world of Cloud? As this website lays testament to, the world of enterprise computing is moving to the cloud. There are now a plethora of cloud infrastructure (as-a) service providers, offering all manner of cloud-based VMs to any business wanting access to scalable on-demand computing resources. So why (you may ask) is the world's largest IT service provider still banging on about “systems” – physical pieces of computing hardware bundled with various types of software? And why is IBM hijacking cloud terminology such as PaaS, IaaS, and Cloud to describe the roles and functions of these so-called Systems? You can be forgiven for thinking there's a bit of mixed messaging going on here, and that it's about time someone explained what's going on. And - perhaps more specifically - why businesses and service providers should at least know the basic 'raison d'être' behind IBM's PureSystems. Behind every Cloud is Infrastructure hardware and software! Cloud disassociates computing resources and applications from hardware – via a layer of clever virtualisation and a funky web-based administrative console. As such it's easy to see why the underlying hardware (and systems) is less of a concern and becomes distant from the end-user's computing experience. 
However hardware and underlying systems are still the engine room to all clouds, and the more mission-critical your cloud computing is - the more important it is to understand the capability of the hardware. Enterprise-Grade Computing If you're offering low cost web hosting or chasing down the lowest CPU/hour then stop reading now. IBM made its name in Enterprise-grade computing, and it's this crucial market segment that IBM continues to serve. Who is “doing” Enterprise-grade Computing? Well, the Enterprises are of course, as are much of the Mid-Market (~200-1000 employees), as are the Managed Service Providers (MSPs), Systems Integrators (Sis), and Outsourcers delivering IT services to these businesses. These are the businesses that IBM has built PureSystems for. If you're offering low cost web hosting or chasing down the lowest CPU/hour then stop reading now. Is this Vendor “Lock-in” by any other name? IBM's systems technology business is worth approximately $17.5 Billion per year, so it's clearly a revenue stream that IBM would like to hang on to. Is it threatened by increasing use of Cloud? Yes and no – depending which way you look at it. Is IBM trying to tie down customers to the IBM way of doing things, on the basis of its heritage, brand, past performance, expertise and historic technological leadership via a closed appliance? Absolutely not. PureSystems represents the pinnacle of Enterprise-grade computing built on an open foundation which provides the flexibility necessary for businesses to gain competitive advantage by choosing the application vendor of their choice, and also to allow it to inter-operate with the widest possible range of other software and systems infrastructure. Competitive Advantage The premise of PureSystems is to give the Enterprise the best of all worlds (physical / virtual & Cloud) in the simplest format possible. PureSystems is about introducing simplicity, predictability, reliability, performance, and technological advantage to your business. These translate into competitive advantage – which improves your business' revenue and profitability. Fancy words and big promises – but what does this really mean? It means every aspect of the IT Lifecycle is optimised for ease – ease of procurement, ease of deployment, ease of management and ease of support. If you're a fan of 5000 piece puzzles and sitting on a data centre floor surrounded with equipment and cables trying to assemble it into some form of production-ready environment within implausible timescales, then PureSystems isn't for you. Performance The engine at the heart of the PureSystems range is called“Flex System” which has more CPU, more RAM, more I/O in less space, requiring less power and less management effort than any other chassis / blade based platform! Back to Cloud The engine is the first element which helps me lower my TCO and increase my ROI and TTM. Now how does it get me my Cloud? Hopefully from the ramblings above, you've deduced that PureSystems are appliances designed to enable you to build your own cloud service. What type of cloud service you want will determine what type of PureSystem you want… If you're a fan of 5000 piece puzzles and sitting on a data centre floor surrounded with equipment and cables trying to assemble it into some form of production-ready environment within implausible timescales, then PureSystems isn't for you. 
Basic Infrastructure (as a Service): If you're an Enterprise looking to create a flexible private cloud of VMs for your own use, or a Service Provider looking to create a high-performance cloud for shared (public) use – and everything in between – then the optimum platform, built on the Flex System engine, is known as PureFlex. This is general purpose and capable of running pretty much any application from any vendor (Microsoft, Oracle, IBM, Open Source, SAP, SAS etc.), and supports the widest range of processors, networks, storage, hypervisors and operating systems. PureFlex appliances give you both the hardware and the management tools you need for providing IaaS, courtesy of the Flex System Manager appliance and SmartCloud Management software (a cloud provisioning and orchestration tool for running cloud across a range of hypervisors, such as VMware, Hyper-V, KVM and PowerVM).

Platform (as a Service) / PaaS: If you're looking to deliver PaaS to your business or your customers, then PureApplication is the platform to look at. PureApplication adds J2EE (WebSphere), a relational database (DB2) and other more sophisticated (cloud) orchestration and platform management capabilities – with the bonus of all-you-can-eat licensing for WebSphere and DB2. Not wishing to get too complex at this point, but it also leverages a management construct called Virtual Application Patterns to automate the provisioning of complex multi-tier applications – the same patterns used by IBM's own public cloud (SmartCloud Enterprise).

Database / Analytics / Big Data (as a Service): Finally, if you're looking to host transactional and/or analytical databases, then the PureData platform is optimised for such workloads. These systems are much faster and easier to deploy and, in addition, have significant price and performance advantages over any other products in the market. The current offerings will shortly be enhanced with a platform for unstructured "Big Data" processing (PureData System for Hadoop, announced in April '13).

Feather in the cap. The 'feather in the cap' of the PureSystems concept has to be its future-proofing. Those still unclear why they should deviate from a 47u rack of 1u servers should consider the following:

- Each slot in a Flex System has exactly the same volume as a 1u server, so you can fit in as much I/O, RAM etc. as you can in a 1u server.
- Each slot has 16 x 16Gbps lines of I/O (256Gbps per slot!).
- I/O lanes can be pass-through, converged fabric (FCoE), separate LAN/SAN or InfiniBand.
- Slot separators can be configured in any way you like, which means that you can have a 2-socket, 1-slot node with up to 256Gbps of I/O and 768GB RAM.

By way of example(s) you could have:

- a 4-socket, 2-slot node with up to 512Gbps of I/O and 1.5TB RAM
- a 4-slot V7000 storage node with built-in storage virtualisation (to connect to any existing storage) and real-time compression
- a PCI expansion node, a storage expansion node, a management appliance, or any mixture of the above.

In Summary. The future-proofing capacity and flexibility, combined with its aforementioned ease and simplicity, is really what sets PureSystems apart, and hopefully this blog has brought these valuable attributes out from behind the glossy veil of big words and acronyms that IBM is usually associated with!
More Information For more information on IBM PureSystems please contact Alex Rutter – Head of PureSystems Sales at IBM – email - Alex_Rutter@uk.ibm.com, ### Don't forget the Bare Metal Cloud By Jonathan Wisler, General Manager EMEA, SoftLayer I just got back from the Cloud World Forum in London. There was some very good new technology, and having talked to literally hundreds of people about IaaS, I am still amazed by the perception that it is only a virtual server offering affair. So let me say it as clear as possible: It is NOT. You can now have a truly hybrid cloud infrastructure that include public clouds (virtual), bare metal clouds (dedicated servers) and private clouds (your own private VM farm). A little history about SoftLayer. We were started in the kitchen of one of the founders with the goal to revolutionize the hosting industry by removing the pain points for customers and operations. Our founders did not limit themselves to selling only servers with hypervisors, they wanted to get the customers the compute power and network performance when they needed it and with minimal effort. They developed what we now call "infrastructure as a service" before anyone was calling it the cloud. It was not about selling virtualization technology. It was about quick deployment times, consumption based billing, automation, control and accessibility through a common API. So why is this important? If I could sum it up in one word: performance. While virtual servers are valuable for certain applications like web applications and gaming front ends, given they are a shared tenant environment, you will trade off performance and reliability for speedy deployment. This is where bare metal cloud comes in. You can still deploy them quickly, in 1-4 hours, rather than 5-15 minutes for virtual servers. But since this is a single tenant environment, you can get more compute power and network throughput. Just for an internal benchmark, we did some performance testing on our Riak database engineered servers and we got huge performance differences. We have heard from some customers that performance improvement of bare metal is 10-20 times over virtual only. I have also worked on customer migrations from a pure virtual environment to a hybrid of bare metal plus virtual servers. On average we can cut their monthly billings in half while improving reliability and performance. I am not aiming to start a debate on virtual vs. bare metal because they are both valuable. The point is you can have both... I am not aiming to start a debate on virtual vs. bare metal because they are both valuable. It really depends on your application's performance and elasticity requirements. The point is you can have both, sitting on the same network architecture, accessible through any pane of glass, with pay-as-you model and complete with an API. Also known as “the cloud.” The question is what tools will be the best to optimize the performance of your architecture. Cloud computing infrastructure now has the options you need to make sure your customers, whether they are external or internal, get the experience they need and expect. ### An interview with Virtus Data Centres Compare the Cloud speaks to Mark Underwood, Senior Business Development Director of Virtus Data Centres CTC: Mark tell us about yourself I joined Virtus Data Centres in Oct 2011 and I am responsible for new business sales opportunities as well as the on-going development of strategic business relationships with partner businesses. 
Previously I spent three years at Global Switch managing the sales function for the two London docklands sites. Prior to that, I worked at PacketExchange for 18 months in a new business sales capacity working with some of the larger European network providers and US social networking and content providers delivering wide area peering services. I was at TeleCity for 5 years managing their New Business sales function within the UK. Prior to that I worked with Level 3 Communications as an account director, I spent 4 years at Easynet in direct sales and as Web Development manager and I started my working life at Dun & Bradstreet where I spent 11½ years in a variety of sales focused roles. CTC: Who are Virtus? Virtus is a leading UK provider of environmentally efficient, modern, interconnected carrier neutral data centres. Virtus owns, designs, builds and operates environmentally and economically efficient data centres around the heart of London's cloud and digital data economy. Our customers can rely on our secure facilities and uptime record for their data and applications, enabling them to store, protect, connect, process and distribute data to customers, partners and networks within our data centres and the global digital economy. Virtus Data Centres was established in 2008 and is a well capitalised developer and operator of Tier 3 data centre facilities within the Greater London region. Virtus was explicitly set up to deliver additional capacity into a market facing increasing demand for a limited supply of high quality cost efficient data centre facilities and services. We own the freehold to every property [...] and can therefore provide price certainty to our customers when contracts are reviewed. We own the freehold to every property that we build from the ground up and as such we are not subject to the vagaries of upward rent reviews and lease adjustments by property landlords and can therefore provide price certainty to our customers when contracts are reviewed. We believe that Virtus offers a uniquely different approach in today's data centre market place. We have a significant ownership in the company by employees who have invested into the business, which we believe creates a very different attitude in how we engage with our customers and run our business. We take individual responsibility for our reputation and we look for better outcomes for our customers whilst building sustainable value within our business. We have fully committed shareholders and investors, and we are able to build our data centres speculatively without the need of anchor tenants which means that our prospects and customers have the ability to move into high quality modern data centre space immediately. As a business we remain focused upon our core strengths and competencies, which are building and operating data centre infrastructures and in doing so we provide flexible solutions to facilitate our customers' growth by not restricting them to today's specific requirements. CTC: So what does Virtus do in the Cloud Space? Our data centres are built around the needs of cloud companies and customers with both hybrid and traditional data centre needs. We have just launched our “CoLo-on-Demand” service which is specifically tailored to the needs of Cloud service providers. We come from a cloud as well as a data centre background and we want to help cloud providers get established quickly in a low cost, low risk manner, and help established cloud providers to grow quickly and flexibly. 
We offer low entry costs, ultra low risk commitment and flexible high quality, service agreements. Virtus' CoLo-On-Demand allows you to turn-up or turn-down the amount of computing power you buy. You no longer have to over-provision for peak loads - you can now match your overheads to the revenue you generate. You can scale up, scale down or even cancel your commitment, at any time. Virtus' CoLo-On-Demand allows you to turn-up or turn-down the amount of computing power you buy... you can now match your overheads to the revenue you generate. We believe that having predictable low cost surety of growth and capacity is as important to growing cloud businesses as low entry costs and risk free commitment - CoLo-on-Demand guarantees Cloud businesses the flexibility to scale in a contiguous manner - we will reserve adjacent racks giving you a highly efficient and cost effective growth plan. By providing space in which Cloud companies can grow and expand their services within our data centres, our technology sector customers and end users can co-habit, interact and grow. We look to work together in partnership with cloud and technology businesses and in so doing we believe that we are delivering an eco-system which enables end-user customers to integrate their own IT infrastructure seamlessly with Cloud services. This in turn enables cloud companies to build and grow their product and service portfolios, which results in an environment which is mutually beneficial, creating the catalyst for exciting new developments. CTC: What is Virtus policy with regards Cloud providers having open access? We have a uniquely flexible approach to encourage technology companies to grow into our data centres. This has resonated extremely well within a number of channels and we are now engaged with number of Telco's, Hardware Vendors, Distributors, MSPs and ISVs who can provide innovative and flexible solutions to further enhance Virtus' data centre environments as a true technology eco-system. In so doing we bring together the skills resources and expertise to work collaboratively with both our technology and end-user customers and partners. CTC: What do you think sets Virtus apart in a competitive datacentre and colocation marketplace? Quality, Value, Innovation, Service and Flexibility are at the heart of our business. We are a people business with a strong client lead approach. We are fixated on delivering the best service possible supported by extensive industry knowledge and expertise. As a leading UK provider of environmentally efficient, modern, interconnected carrier neutral data centres. Virtus owns, designs, builds and operates environmentally and economically efficient data centres around the heart of London's cloud and digital data economy. All our London sites provide purpose-built, highly specified data centre space with meeting and office space, and offer a full range of colocation, interconnection, support and monitoring services. Our customers can rely on our secure facilities and 100% uptime record for their data and applications, enabling them to store, protect, connect, process and distribute data to customers, partners and networks within our data centres and the global digital economy. Located in prime locations in London, our data centres have excellent communication, transport and airport links. Our highly resilient and secure data centres are built to advanced quality standards using innovative modern design techniques delivering industry leading power and cooling efficiency. 
All our sites are operated to the highest operational and security standards. This ensures our data centres are ideally flexible for handling a broad range of requirements. All our London sites provide purpose-built, highly specified data centre space with meeting and office space, and offer a full range of colocation, interconnection, support and monitoring services. CTC: What connectivity options are available to your customers? Our data centres are carrier neutral and we actively encourage network providers to PoP the facilities to create a wider choice of carriers and service providers for our customers. Our data centres are served by diverse incoming fibre duct entry points and dual meet-me-rooms so that our customers can always be assured of network resilience and redundancy to support the redundancy built into the data centre infrastructure. In terms of connectivity options our customers have access to a full portfolio of network solutions including IP Transit, Point to Multipoint Ethernet, MPLS and VPLS services as well as Dark Fibre and DWDM services from multiple carriers. CTC: Who are the management team in Virtus? Our internal leadership is a key factor in our continued success. Neil Cresswell is Chief Executive Officer of Virtus Datacentres. Neil has more than 25 years of experience in the data centre, technology and banking industries; building successful, high growth, profitable businesses with a reputation for quality, innovation and customer service. He has worked across UK, Europe, Asia and the U.S for companies such as Savvis, Attenda, BNP Paribas, IBM and Fiserv. In April 2013 he joined Virtus Datacentres as CEO. Neil was previously with Savvis (a leading global supplier of cloud and datacentre service and a division of CenturyLink Inc), for six years, where he was Managing Director of the EMEA Region as well as having global responsibility for Savvis' financial services sector. Neil holds an honours degree in Business Studies from the University of Westminster and is a graduate of the Henley Business School Advanced Management Programme. Konstantin Boreman is Chief Operating Officer and is responsible for the operational aspects of the business. He has over twenty five years of experience in the Telecommunications and IT industries and a proven track record in building successful businesses from start-ups to large enterprises. Prior to joining Virtus Data Centres in early 2012, Konstantin spent 10 years at Interxion, a leading European data centre and colocation services company where he played a key role in delivering 22 consecutive quarters of revenue growth which led to an Initial Public Offering. Before joining Interxion, Konstantin spent 7 years at Global Telesystems (GTS) in senior commercial and general management positions. Daryl Seaton is Chief Financial Officer he started his career in JP Morgan Asset Management in the UK Equities team where he gained his CFA qualification. He then moved into Private Equity, working for 3i plc. Daryl joined the company in May 2005 from HBOS plc. Daryl's experience of leverage buyouts and investments spans a wide spectrum of geographies, sectors and transaction types including 8 years within the property and data centre sectors. His transactional and fund raising skills were invaluable during the set up and early growth of the business and he continues to play a key role in the day to day management of Virtus. Steven Norris is our Chairman, Steven was a Conservative Member of the UK Parliament from 1983-1997. 
He was Parliamentary Private Secretary at the Department of Environment, then the Department of Trade and Industry in the government of Baroness Thatcher and Parliamentary Private Secretary at the Home Office under Prime Minister John Major. He was Minister for Transport in London from 1992-96 and a former Vice-Chairman of the Conservative Party. In 2000 and 2004 Steven Norris was his party's Candidate for Mayor of London. His private sector appointments have included chairmanship of Jarvis (the construction and rail engineering company), AMT Sybex and a non-exec director of a number of other smaller companies. Steve is also chairman of the Data Centre Alliance which represents the DC industry to the external world. CTC: Have you got any programs that help cloud providers with billing models? Colo-On-Demand enables [Cloud Providers to] scale their rack densities from 0 to 10kW ...and we reserve additional adjacent cabinets for them to grow into. To really encourage Cloud companies to start, embed and grow within our data centres, we have launched CoLo-on-Demand. The on-demand solution enables start-ups through to mature cloud companies to come in and take space, power and (with a carrier partner) connectivity that is both scalable and low cost. Contained within a flexible contract this ground breaking product is designed to enable cloud companies to mirror their overheads to revenues. We provide a service that enables them scale their rack densities from 0 to 10kW as their business or hardware scales and we reserve additional adjacent cabinets for them to grow into. CTC: One question we ask all our interviewees, What is your definition of Cloud Computing? Cloud Computing to me in its simplest form is IT services delivered across a network be it across a private network or across the public internet. Generally delivered as an op-ex based model to allow organisations to buy “IT-as-a-Service”. CTC: Mark, finally can you tell us where you see the datacentre industry in five years' time? I see data centres at the very epicentre of the Cloud revolution and at the heart of our digital economy. Technology has transformed the way we live and communicate. Cloud computing, social media, mobile apps the “big data” explosion and on-demand services can only be delivered from purpose built highly efficient data centres. These data centres will continue to evolve, becoming more energy efficient; they will become highly connected data and technology hubs serving our growing demands to communicate and consume more and more data and IT services. CTC: Mark, how can anyone reading this get in touch with you? You can contact me directly by email mu@virtusdatacentres.com, on Twitter: @Virtus_Mark or by phone on 44 20 7499 1300. Or visit our website: http://www.virtusdatacentres.com. Its definitely worth hearing what we have to say. ### Could all servers become the size of an Apple TV? (They will if Cloud maturity rises) By Theo Priestley, Consultant at ITredux According to both Gartner and IDC worldwide server market revenue has fallen by up to 5%, with major server manufacturers taking a hit. The figure doesn't appear big but the impact and shift to Cloud is starting to make its mark. But the biggest shifts will be taking place further in the future as organizations move to downsize their infrastructure and run their business on smaller IT footprints. ...Will the Micro Server become a reality? 
The server market figures might actually be a blip for the following reasons: firms are moving towards server farms hosted by a third party, reducing internal spend on larger IT assets, but large farms still require these assets, so spend will level out; and hardline negotiations are taking place – where once server OEMs had businesses by the short and curlies, organisations now have a number of choices at their disposal. But the biggest trend shift will happen not in the size of the revenue but the physical size of kit required by businesses internally. I posted a quirky article recently which suggested that all you'd need to run a business was a box the size of an Apple TV. What if a box as small as the Apple TV was all you needed to run your business? Mainframe is becoming extinct... Software is becoming leaner, faster... Cloud is digitally pervasive... In-Memory data and parallel processing across an Internet of Things will be a reality... Startup businesses are becoming leaner, more agile... And all you'd need to run it all is a 3x3x1 inch block! As companies embrace Cloud, whether in full or hybrid, the likelihood of massive IT spend on server hardware comes more into question, and in reality OEMs need to move with this. The logical way is to start downsizing the physical requirements of server technology to almost a slave unit, as the majority of heavy lifting for an organization is done by the Cloud provider. The worldwide server market isn't dying on its feet because of Cloud; it's just going to undergo a radical change in configuration, which means both hardware and software have to change with it. Developers used to squeeze everything into 5kb once; there's no reason it can't happen again. And the tiny box sitting under your TV might just be pointing to the future of infrastructure in a few years' time. ### The Benefits of Video Conferencing and Hosted Microsoft Lync By Mark Stubbles on behalf of Interact Technology I think it is fair to say that most people in this day and age have used some form of video conferencing software; video conferencing is also known as video collaboration and it uses two-way video and audio technologies to bring together people in different locations around the world. Skype is probably the most popular video conferencing platform for home users, who use it to connect with friends and loved ones in other parts of the country or around the world; Skype has become so popular that the software even comes preloaded on most new laptops and netbooks. This technology has really made the world a smaller place. It is a much better way to connect than conventional phone calls, it is usually more cost effective or even free, and it is obviously much more practical and far cheaper than flying too. Microsoft Lync can [...] enable more people to connect more cost effectively at one time [and] it can promote a more professional image to clients and staff. It is not just home users that use video conferencing though; more and more business users are turning to this technology too. Video conferencing software and technology can seriously reduce staff travel expenses and improve sales and staff communication at all levels, in locations all around the world. Since Skype is usually free, many businesses tend to utilise it, but using a platform such as Microsoft Lync can be more beneficial, enabling more people to connect more cost effectively at one time. 
It can promote a more professional image to clients and staff. In this post we will discuss some of the features and benefits that Microsoft Lync offers. Microsoft says on their site: "The new Lync is our boldest and best Lync release ever, with innovation across the board. Lync has always made it easy for people to communicate wherever they are, but we're making it even better with new and improved features." "Lync enables users to communicate securely anywhere they have network connectivity, and automatically adapts to network conditions." Small to medium sized companies will usually find a hosted Microsoft Lync solution to be effective; this kind of solution will provide a single hosted interface that can unite PSTN, SIP voice communications, instant messaging, and audio and video conferencing. Lync unites all of these technologies into a richer, more simplified solution in one client, making it simple to switch between the different types of supported communication. This type of Lync solution can be used to extend and even replace traditional PBX systems. A hosted solution offers lower initial costs: companies don't need to invest in communications hardware and they don't need to maintain this very expensive equipment on an on-going basis. The lack of maintenance required means that no specialist staff need to be employed and that both time and money are saved. Scalability of a Hosted Solution A hosted solution is perfect for a small to medium sized business; besides the lower initial costs of ownership, a hosted solution enables a client to easily scale the service up or down as their business needs demand, and these changes can be deployed more rapidly without disruption to the client's business. This means the business can take advantage of opportunities as they arise and that the business's cash flow can be forecast more effectively, providing the business with a healthier balance sheet. Why should you invest in Lync? To put it simply, Lync makes communication easier, and the main benefit of Lync is the integration it offers. It integrates and works on a wide variety of devices such as Microsoft Windows, Windows Phone, Apple iPad, Apple iPhone, and Android phone and tablet devices. The Windows 8 and Windows RT app provides a seamless touch-first experience; users can join a conference on any of the supported devices listed above with just one click. Lync connects with Skype and allows browser-based access too, which means that the people you wish to connect and conference with do not need to download any special software or pay for subscriptions; all they need is an internet or network connection. Virtually meet, see, interact and conference with up to five participants at once, and share notes and other documents during your conference sessions using OneNote. For more information about Microsoft Lync or any other voice and video solutions, speak to Interact. ### "Industry Nomenclature" Phil Clark, Marketing and Channel Development Director, NIU Solutions After 23 years in the IT Services industry, I have never been so frustrated with the confusion we're generating with clients, largely down to nomenclature. It is well publicised that the Cloud word is overused, but I am seeing a dangerous kickback from companies that are worried about cloud message dilution - and that MSP as an industry term is safer territory. And so begins MSP message dilution – where does it stop? 
And in this context we are seeing client enquiries become increasingly complex. Our company is current embroiled in many client sales situations, across whom there is no common competition, no consistency of requirement definition, smatterings of “cloud” and “Services” with no real clarity on the need - and, as a result, they will get responses from our competitors and ourselves that are impossible to compare, full of gaps and assumptions that are invalid. Then they'll pick the cheapest one. I have never been so frustrated with the confusion we [as an industry] are generating with clients, largely down to nomenclature. And this company will be one who doesn't actually provide Services with a capital "S", but is likely to be a reseller of stuff who wants to move into this space - and have broadly under-called the costs to support this move. So when the client asks that provider to sign up to Termination for Persistant Service Level breach, or integrate their service management framework with the clients ServiceNow offering, and they are met with blank faces, and at that point they'll realise their tender process was a load of old toss - and they should have picked the provider who was more experienced in the Services way of doing things. Or - even worse - they muddle through contracts to be staring down the barrel of a “service provider” who is in administration because they didn't realise how complex MSP world can be. So I was wondering how we can start to be a bit more specific in order to help our clients find the right partners on projects. Radical thinking, but lets think about our world on a spectrum from the viewpoint of what our clients actually want. This may be a little simplistic but I think it captures the profile of most clients reasonably well:- Shade 1 - Clients who want to buy IT or Applications entirely on demand, and don't care where its delivered from. They also don't care about SLA's or responsibilities for who does what, and who makes it work. Shade 2 – Clients who want to buy IT or Applications on demand, but have specific requirements for that IT that make entirely shared or standardised services not suitable. Shade 3 – Clients who want to buy IT or Applications cheaper than their current delivery model (either on demand or not), but impose certain restrictions on delivery models to support their business requirements Shade 4 – Clients who have an existing IT team, but need to source specific skills or services from a third party to supplement functions that they will integrate into their world internally Shade 5 – Clients who want to buy all of their IT or Applications for the same price as their current delivery model, but with higher levels of function or flexibility (such as management or DR) gained through the benefits of Shared Services from a Service Provider, and are looking to a third party to integrate it into their world, but navigate the complexity of service ownership and boundaries themselves through contracts. Shade 6 – Clients who have no concept of IT, but need some “stuff” to support their business processes and want someone else to design, build and run it for them, and - most importantly - own the service entirely. So across these 6 shades, we have created industry terms that are blurred on the edges to make companies significantly more appealing to a broader client community. 
Public Cloud, Private Cloud, Hybrid Cloud, Cloud Services Aggregator, Cloud Services Integrator, Cloud Services Broker, Managed Services Provider, Mini-Sourcing, Selective Outsourcing, Full scope Outsourcing, Outtasking, etc. ...companies that resell product (dump and run) who [...] have simply rebadged themselves to get out of a commoditised and low margin business, but have yet to understand the complexities of the MSP space. And then you have a different dimension: companies that resell product (dump and run) who are now either Private Cloud companies or Onsite MSPs, who simply have rebadged themselves to get out of a commoditised and low margin business, but have yet to understand the complexities of the MSP space - and are starting to get rumbled. Can we please all agree that reselling a product with no SLA's on a financed annuity basis is not a service? It's a financed product, and if you have no ability to take a call when it goes wrong you are not an MSP! Tier 1 Vendors with Hardware and Software arms take note too please. So my question to readers is this... Would it be sensible to agree some definitions on a spectrum to ensure our clients understand where we all play - and stop wasting lots of peoples time and money? I'm not saying my definitions are correct, but surely a classification of capability, signed up to by appropriate service providers and governed by a industry body (that is aligned to clients needs) would be a useful tool for both identification of client opportunity, and also potential partnerships in our space. I am happy to start. Based on my definitions, we're an MSP with existing clients who need Shades 3-6 and we do a very small amount (less than 10% of revenues) occasional product resale if a client needs no "follow-on" Service. With a capital S.   ### Cloudy with Integration By Theo Priestley, Consultant at ITredux I was recently with a client who is looking for a new enterprise software solution and they have a very positive outlook towards Cloud. They love it, so much so that they weren't interested in on-premise solutions. This represents a real shift in attitude, and the company is no slouch or tadpole in size either. And this is where iPaaS (Integration Platform as a Service) is really coming into the fore and has to rule as a first-thought strategy when entering the cumulonimbus worlds of Cloud and SaaS. In the last month both SoftwareAG (Integration LIVE) and TIBCO (Cloud Bus) have thrown down the gauntlet into the ring with IBM, Mulesoft and Informatica to name a scarce few who are already there. But why has it taken everyone so long? iPaaS, without the vendor nonsense clouding the understanding (pun) is a solution provider's service that allows cloud-cloud and cloud-premise integration for applications. It's a step away from Cloud Brokerage which is essentially the development and maintenance of SaaS applications and their integration in the entirety. If iPaaS was a scarce offering Brokerage is even thinner on the ground (however take a look at Accenture's play in this area recently announced in April). But back to the client at the start of the post. Despite their pro-cloud stance tucked away in their enterprise architecture diagrams was that horrendous long box labelled ‘ESB' (Enterprise Service Bus) and it just looked so out of place in a forward thinking strategy. And this is where iPaaS doesn't go far enough for me. iPaaS is limited because it's built primarily for Cloud based services. 
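Before returning to the ESB question, it may help to make that routing job concrete. The sketch below is a deliberately minimal, vendor-neutral illustration of what an integration layer - whether an on-premise ESB or a cloud-hosted iPaaS - actually does: accept a message, match it against routing rules, and hand it to the right cloud or on-premise endpoint. All topic and connector names are hypothetical and chosen only for the example; no specific vendor's API is being shown.

```python
# A deliberately minimal, vendor-neutral sketch of the routing job an
# ESB/iPaaS layer performs: accept a message, match it against routing
# rules, and hand it to the right cloud or on-premise endpoint.
# Every endpoint and topic name here is hypothetical, for illustration only.

from dataclasses import dataclass, field
from typing import Callable, Dict, List


@dataclass
class MessageBus:
    routes: Dict[str, List[Callable[[dict], None]]] = field(default_factory=dict)

    def subscribe(self, topic: str, handler: Callable[[dict], None]) -> None:
        """Register a handler (a SaaS app, an on-premise system, ...) for a topic."""
        self.routes.setdefault(topic, []).append(handler)

    def publish(self, topic: str, message: dict) -> None:
        """Route a message to every system subscribed to its topic."""
        for handler in self.routes.get(topic, []):
            handler(message)


# Hypothetical endpoints standing in for real connectors.
def crm_connector(msg: dict) -> None:
    print("-> cloud CRM receives:", msg)


def erp_connector(msg: dict) -> None:
    print("-> on-premise ERP receives:", msg)


if __name__ == "__main__":
    bus = MessageBus()
    bus.subscribe("order.created", crm_connector)
    bus.subscribe("order.created", erp_connector)
    bus.publish("order.created", {"order_id": 42, "value_gbp": 1250})
```

Whether that routing fabric lives in your own data centre or is consumed as a cloud service is exactly the choice discussed next.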
A Cloud based ESB strategy would be a welcome minimum requirement to any iPaaS solution simply because the Service Bus is one of the most overworked and overly relied on pieces of kit in an enterprise stack. Removing that entire headache of routing messages between disparate systems into a Cloud solution makes more sense where a hybrid environment exists rather than retain that piece on-premise. It's also a solid first step for companies in moving away from relying on IT to manage the biggest piece of the architecture puzzle. Most of these services will have already undergone SSAE 16, ISO27001, PCI compliance so they'll be robust and secure and I see no reason not to ask a solution vendor whether they can provide a cloud ESB in its truest sense as part of their iPaas offering. I see no reason not to ask a solution vendor whether they can provide a cloud ESB in its truest sense as part of their iPaas offering. Cloud is really at a tipping point now. We have SaaS applications and companies willing to invest but in an unstructured and legacy heavy manner. We have vendors offering to integrate a multi-SaaS strategy with their own solution but still retain that on-premise bus. We have Brokers who will handle it fully end to end. But the key to real Cloud adoption is in how that ESB is treated. It's the biggest prize to be had and the largest thorn to remove. The iPaaS rush is on and every cloud has a golden lining, but without clearly including a Cloud-based ESB in that strategy you might just be panning for iron pyrites.   ### How SMEs can access enterprise-grade managed services with the cloud? By Nick Isherwood, Technology Services Director, JMC IT No matter what size, businesses don't need to suffer poor system performance anymore! Thanks to the cloud and some really game-changing technology, small and medium sized businesses (SMEs) can now access the same level of proactive IT support and monitoring that was once only accessible to much larger organisations – with much bigger budgets. Now, regardless of where you choose to place your systems, IT partners are able to monitor them at a level that was once only manageable to large businesses. Businesses should look for IT partners taking an innovative approach, who have developed an application performance management ‘bolt-on' to their monitoring How has the game changed? Using the cloud and systems such as ScienceLogic, a number of IT partners have taken managed service support to the next level, from ASM (Active System Monitoring) to APM (Application Performance Management). These technologies allow outsourced IT support partners to customise a client's system monitoring around the elements that are most important to them. For example, a business that relies heavily on email in order for them to operate effectively would suffer should there be any downtime on this element of their IT. Companies embracing these latest methods, can now offer an enterprise-grade service to businesses, at a more accessible cost. Businesses should look for IT partners taking an innovative approach, who have developed an application performance management ‘bolt-on' to their monitoring to give an extra level of support to that email system and all of its components. Dynamic Component Map showing high-level performance of Microsoft HyperV hosts, guests & applications. What is application performance management? APM does not mean that the rest of the system's performance is ignored; far from it. 
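As a purely illustrative aside, the kind of 'bolt-on' monitoring being described here often boils down to per-component, threshold-based checks on the things a business cannot function without - an email service, a database link and so on. The toy sketch below shows that idea; the components, metrics, thresholds and readings are all invented for the example, and real APM tools such as ScienceLogic are of course far more sophisticated.

```python
# Purely illustrative: a toy threshold check of the kind an APM 'bolt-on'
# might run against the components a business cares most about (an email
# service, a database link, ...). Metric names, thresholds and readings
# are invented for this example.

from dataclasses import dataclass


@dataclass
class Threshold:
    component: str      # e.g. "email: message queue"
    metric: str         # e.g. "queued messages"
    warn_at: float      # agreed early-warning level
    higher_is_bad: bool = True


def check(reading: float, t: Threshold) -> str:
    breached = reading > t.warn_at if t.higher_is_bad else reading < t.warn_at
    state = "ALERT" if breached else "ok"
    return f"[{state}] {t.component} | {t.metric} = {reading} (warn at {t.warn_at})"


if __name__ == "__main__":
    thresholds = [
        Threshold("email: message queue", "queued messages", warn_at=500),
        Threshold("email: client access", "successful logins/min", warn_at=20,
                  higher_is_bad=False),
    ]
    readings = [180.0, 12.0]  # sample readings, one per threshold
    for t, r in zip(thresholds, readings):
        print(check(r, t))
```

The point of the illustration is simply that the alert fires at an agreed early-warning level, before anything is actually 'broken'.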
Your IT partner should continually proactively monitor how the whole system is performing, and make any necessary fixes before they escalate into a larger issue. This way, clients should never experience any downtime or under-performing applications. However, APM takes this a step further and adds a further layer of protection to the key elements of the system that a company values the most – or cannot function without. This could be a link to a database or as mentioned above – a company's entire email system. APM looks at what is most critical for that specific business and focuses on all components that have an effect on its performance in a much more detailed approach. Network availability & performance, including event status. At JMC, for example, we can set early warning metrics to alert our technicians to when any of these elements' performance drops below an agreed level, meaning we can act quickly to fix the issue before it becomes a problem. We also provide, where required, access to our live monitoring data on a client's own bespoke desktop, meaning they can securely log in to see the performance of their systems at any time. This gives huge peace of mind to organisations that really rely on their IT to operate, which is increasingly the case for many businesses. What are the benefits? By taking a much more proactive approach to monitoring and support, and the specific critical elements of a business' systems, problems can be solved up to four times faster than if it escalates into a fully ‘broken' state – i.e. when something stops working altogether. If you have the right support partner, you will experience little or no downtime and are able to work at an optimal level at all times. This view provides an instant view into the performance of a Primary & Replication back-up service. A good support partner will report to their clients on a regular basis, notifying them when performance has dipped into the ‘alert' stage and demonstrating that they are working to – or have already resolved the issue. You should be able to see the live status of your systems at any time, and be provided with an easy-to-digest quarterly report to keep you abreast of the work that has taken place on your systems and their performance. Sadly, we come across many organisations that don't have this experience with their IT support and monitoring. But it isn't necessary to suffer from poor performance – and it isn't difficult to switch provider. Some partners will even offer you a no-quibble money-back guarantee if you aren't receiving a superior service at the three-month review. ### EuroCloud UK Awards Showcase the Cream of Cloud Apps By David Terrar, EuroCloud UK POST-EVENT UPDATE (12 July 2013): At the awards ceremony, Richard Sykes explained the judges thinking and announced the awards, but wanted to point out the strength in depth and range of companies that had been shortlisted. There were some very different styles and types of solutions competing in each category, and that highlights the quality and diversity of cloud offerings on the market here in the UK and Globally. Richard explained that the judges wanted to give XTM Cloud a special award as their solution didn't really fit well in any of the categories, but tackled a very important issue for the whole sector - translation of software solutions to local language - in a very elegant and comprehensive fashion. 
The winners were: Best Start-Up: Present.me Best Business Impact: Anomaly42 for Guidepost Solutions Best Case Study, Commercial Sector: Daisy Group for Manchester Airport Best Case Study, Public Sector: Accellion for London Borough of Camden Best Cloud Offering: OneLogin ... Our congratulations to all of the winners! Original Post and Category Nominees (12 July 2013): Later today the annual EuroCloud UK awards will be showcasing the cream of cloud apps and services. The awards event is sponsored by IBM and hosted by well-known accounting firm Baker Tilly at their offices in the city. The awards are being judged by an independent panel - Dr Richard Sykes (Chair, Cloud Industry Forum), Max Cooter (Editor, CloudPro) and David Blacher (Head of Media & Technology Practice, Baker Tilly). Here are the shortlisted candidates for each of the 5 categories, plus a Special Award that the judges added. Best Start-Up: AdvancedChange with their cloud based HR and performance management software Actus. It's designed to increase employee performance and engagement, and so improve the organisation's profitability as well as compliance. They have name customers like Siemens and have grown to 2,000 users in 6 months. SaaSID with their Cloud Application Manager. They provide a browser-based single sign-on (SSO) solution designed to provide full visibility and management of web applications that is equivalent to existing on-premise application control, but from any device. They're addressing the vital BYOD problem, gaining traction, and IDC says their solution is "smart". Present.me describe themselves as YouTube meets Slideshare. They make it easy for an individual or a company to create and share a video presentation on line. The spoken word combined with visuals - a simple but very powerful concept. They started in May 2012 but already have over 30,000 subscribers. Best Business Impact: D2C with WordFrame providing the ICAEW Community. Started as a pilot back in 2007, this was part of a shift in business model to move from traditional print publishing with advertising to on line community content and sponsorship. As of today 79,613 of the 138,000 members are participating in what is a strategic move for the Institute. Anomaly42 for Guidepost Solutions. Guidepost provide confidential investigations, litigation support and forensic accounting and financial investigations. Anomaly42's rapid data discovery engine has enabled them to connect previously hard-to-find data and has transformed the speed and efficiency with which they undertake investigations and monitoring work. WebExpenses using Skype, Trello, Google Docs and, of course, webexpenses. In 2007 WebExpenses decided to use only cloud based tools to run the business. Best Case Study, Commercial Sector: Workbooks for Taopix. Taopix are a software development company providing a photo book and photo gift software platform to digital printers and photo finishing companies. Workbooks gives them a single, unified cloud based CRM and business solution that the entire Taopix team can access at any time, anywhere in the world. NaviSite for ASOS. NaviSite provide enterprise-class infrastructure and platform as a service. ASOS is a global on line fashion destination. They needed a flexible, scalable development and testing environment. With NaviSite they can create new test/dev environments in a day instead of 2 weeks, and have moved 2 key production systems on to NaviCloud. Daisy Group for Manchester Airport. Daisy Group provide unified communications. 
When Manchester Airport hosted their web services in house, they struggled with peaks in traffic caused by things like adverse weather conditions. Daisy's managed hosting solution gives them the flexibility and elasticity to handle millions of website hits and improve customer service. The Judges Special Award: XTM Cloud is a multi-tenant SaaS solution to help linguists translate more efficiently, improve translation quality and consistency, and reduce delivery times. For project managers it helps them manage and allocate translation resources, control and monitor the progress of projects and estimate costs. Customers range from 3DI to Philip Morris International to Zynga. Best Case Study, Public Sector: Accellion for London Borough of Camden. Accellion provide secure mobile file sharing. LBoC were implementing a BYOD policy and needed to provide anytime, anywhere access to SharePoint and file shares. Accellion has given all employees secure mobile access to confidential documents and revolutionized the way the Council works internally and with their partners. Firehost for Cancer Research UK. The charity used Firehost's secure cloud hosting services to support the demands of a donation system for a live global telethon. They load tested up to 300 transactions per second, and during the campaign processed 38,000 transactions, raising over £7,000,000. Exponential-e for Viridian. Exponential-e uses its own network to deliver business critical services including cloud, voice services and connectivity. Viridian is a housing association supporting 30,000 residents which wanted to move to a cloud based IT model. Exponential-e gave them a more flexible, agile and efficient environment as well as shifting expenditure to pay as you go. Best Cloud Offering: Abiquo provides a software based, hypervisor agnostic Cloud Management Platform that works across private, hosted and public IT infrastructure. With Abiquo organizations can discover and optimize existing infrastructure and choose the right platform for every workload, whilst managing access, usage, security and costs. OneLogin provide a comprehensive cloud identity platform for managing user identities, both in the cloud and behind the firewall. OneLogin provides pre-integration with over 3,000 cloud apps, 5 leading strong-authentication vendors, Active Directory, LDAP, Google Apps, Workday and more. It has a clean, simple UI, one-click access to the Cloud, an iPad app and Cloud search. Xero is a UK and global on line accounting solution connecting small and micro businesses to their advisers and other services. Xero was the first provider to incorporate bank feeds and has pioneered the development of smart-phone applications. Their UK customer base is currently 22,000 but they plan to grow it to 100,000 within 3 years. We wish everyone the best of luck and we'll report the winners later. Disclosure: I'm an unpaid director of EuroCloud UK, and my company has been shortlisted for the best business impact award. ### Partnerships, Alliances in the Cloud, And What Makes Them Effective By Marcus Cauchi of Sandler Training The recent tie up of Oracle and Salesforce.com (see here for article) got me to thinking. On the surface this is about tying up their technologies, but underpinning this is something else which hasn't been factored into the mix, which is that both these companies' sales forces are trained in and use the Sandler selling system. 
Now I am really curious to see how having a shared sales methodology and philosophy behind their sales is going to affect their growth and the rate of customer acquisition, assuming they can make the technologies play nicely together. I’m curious to learn about other partnerships like Oracle and Salesforce.com‘s in the Cloud space, and how their salespeople, managers and engineers play nicely together. Certainly the history of IT is littered with partnerships and tie ups which failed to deliver on the expected results. And I am curious about the implications for vendors who sell via the channel, and the impact not having a cohesive sales methodology is affecting them and their clients? ..the history of IT is littered with partnerships and tie ups which failed to deliver on the expected results. From CIOs I’d like to understand how having partners with differing selling methodologies affects your ability to get a good understanding of what the partnering organisations are offering and how well or badly they communicate what they offer to you and your teams? Do they focus on the technology and the IT team or have they spread their influence across your organisation, and especially are they engaging with the right executives, influencers and users in your business? I am curious whether you see these kind of partnerships becoming more prevalent – and what you consider to be the defining factors that will ensure in the EXECUTION they are successful? On paper and at the strategic thinking stages many ideas regarding tie-ups look good but when it comes to execution they fall far short of expectations. …teams where the leadership fails to speak with one, clear voice, tend to become very political very quickly. In my experience, teams where the leadership fails to speak with one, clear voice, tend to become very political very quickly. A crack at the top of an organisation turns into the Grand Canyon by the time it reaches the front line troops and it becomes full blown political warfare. If they aren’t cohesive and they don’t communicate what is expected clearly you can be sure there will be as many versions as those who heard or read the policy or decision. And they will take their perception to their teams and so on. In this tie up as with any, the leadership can be united and be clear, but it is essential to keep reinforcing that clarity (common purpose, common language, shared cultures, aligned compensation schemes, aligned management, shared targets and key must-win accounts, sharing of intelligence and contacts etc) so customers, staff, suppliers and commentators don’t become confused or feel there are mixed messages or intent coming from either or both parties. And the final element that my experience tells me is essential is that both management teams repeatedly reinforce and practice what they are aiming to achieve with their teams at home and together before they go into meet the prospects. What is interesting here is we teach a rule that “When on a joint sales call, only one person talks”. That means the other doesn’t! 2 on 1 is intimidating, so this needs to be managed very carefully and the battle of egos doesn’t take place in front of the prospect, and to ensure they avoid the embarrassment of feeling like you have to interject because the other salesperson is too weak to do their job right. 
2 on 1 is intimidating, so this needs to be managed very carefully and the battle of egos doesn’t take place in front of the prospect, If you are a CEO, manager or owner and you are considering a tie up or partnership, I’d be very curious to learn what pitfalls you have experienced in the past and what you can do to prevent them so that such alliances work from the outset? How will you avoid making the same mistakes with the next partnership? And what are you doing to prepare your sales force so that they go out and deliver the added value the partnership is able to offer customers, instead of getting in the way of the sale. I have a couple of clients independent of one another whom I introduced from IBM and Imtech; they work well together precisely because they do the preparation and make sure they speak the same language. They don’t fall over one another or let their egos get in the way and they both smash target routinely wherever they are working. And they have common sales philosophies and systems. In working with a couple of the Salesforce.com and Oracle partners, I have noticed a major shift in their performance. In one case, they were able to go from startup to £3m annual run rate in 12 months and after 18 months they sold for $8-figures by partnering and one major factor in their success and exit for cash was sharing their sales methodology. Where do you see the best opportunities for tie ups in the market based on compatible and related technologies over the next 12 months [and] why? Where do you see the best opportunities for tie ups in the market based on compatible and related technologies over the next 12 months? Why? Are their selling methods compatible or effective? And what about marriages made in Hell, the ones that won’t work out? Which ones do you predict will crash and burn or worse, fizzle out and die a slow painful death? Has this sparked any other thoughts on why and how Cloud vendors can extend their reach and market share by collaborating? Marcus Cauchi runs Sandler Training in London. He lacks the milk of human kindness you’d associate with nice trainers because no one hires him to be their friend. They hire him to drive up sales and eliminate ineffective and bad selling and sales management habits. Working with him is rarely a pleasant experience, it’s tough and it’s expensive. Call him and you’ll know why. 0203 427 5133 or 07515 937221. ### 7 Considerations when going Mobile By Michael Worley, Director at Spark33 The Mobile landscape is new, exciting, and often confusing. It can be a challenge to fully grasp even the fundamentals. Spark33 work closely with start-ups and enterprise clients to define and execute every stage of Mobile Development. When embarking on any mobile project, there are seven key focus areas to consider in order to plan for success. 1. User Experience It's All About The Data: Whether building a workforce application or creating a customer facing service, reviewing all available analytics is invaluable. Discover which devices are connecting with existing services, when, for how long and how often the data is being accessed. Research the Market: See what's already available. Understand the common pains and problems in the area, then investigate other services with integration potential to improve touch points and add value. Understand The Users: In order to score the value of a mobile project, consider and critique those behaviours that analytics alone can't explicitly detail. 
Listen to users and take advantage of their experience where failures and gaps in functionality exist, and which features will deliver “easy win” benefits, good return on investment, and excellent customer satisfaction. Any well-considered application balances features and design with learned user experiences. Any well-considered application balances features and design with learned user experiences. Whether a Content Management System (CMS) or mobile interface, understanding this ensures software will be intuitive and recognisable. 2. Development Model Don't Get Left Behind: The rapid nature of change in Mobile demands Agile Methodologies. Be responsive to the market and its users: if the focus shifts, agile, change-friendly practices will allow you to adapt and align with this new direction, wherever it takes you. 3. App Store Accounts (iOS example) Will your solution be internally available or for the public? It's a fundamental question which requires different focus and tools depending on the answer. If the project is to provide internal applications on Apple's platform, to develop and distribute software internally, an iOS Enterprise Developer Program account will need to be purchased. This provides the toolset for in-house distribution and development. If developing for consumers, you'll require an iOS Developer Account. This provides the toolset for development and distribution to the Public Apple App Store. In business, the management of these accounts is often performed by internal IT departments. If a third party is developing the project, account access and privileges can be granted and the process managed as part of the service agreement. Answering the simple, obvious questions early on provides focus, direction and avoids pain further along the development process. 4. Device and Application Management Mobile Device Management (MDM) [...] provides the management layer for device-level security and access. Mobile Device Management (MDM): the most likely solution for internal applications, this provides the management layer for device-level security and access. MDM functionality includes password strength requirements, restrictions on third party application access, and set-up of an enterprise app store for the management and distribution of applications. Mobile Application Management (MAM): this provides the toolset needed to manage application-level access and policies, providing a set of controls for data access within the applications. MDM and MAM services are most commonly third party subscription-based products but can also be custom-baked into any solution. 5. Front-End Development There are many languages to consider in mobile development, from native languages like Objective-C for iOS and JAVA for Android, to HTML/Javascript/CSS based environments for web-based or hybrid apps and cross-platform support. These technologies have varying benefits, disadvantages, and challenges. Adopting the right solution requires an in-depth investigation into and understanding of the demographic being targeted, their devices, development costs, and the platform's scalability, extensibility, and quality. 6. Back-End Development Mobile has evolved far beyond basic standalone apps. Today there are many back-end solutions, bespoke or off-the-shelf, providing operations from the simple to complex when delivering data to front-end applications. 
The architecture of the back-end depends on correct analysis of the solution's users, costs, sustainability, technology, existing system integration, security requirements, and scalability. Hosting is the heart of production software, allowing users to be ever-connected to the information and benefits of a service. 7. Hosting & Architecture Hosting is the heart of production software, allowing users to be ever-connected to the information and benefits of a service. Deciding on how to host an application requires plenty of thought and time gathering requirements, defining the architecture, cloud strategy, and roadmap, and identifying security needs within the requirements of the business. Spark33 have 25 years IT experience and have been actively involved in Mobile since the birth of the App Store. Spark33 work closely and fluidly with startups and enterprise clients to illuminate, define, and execute every stage of Mobile Development. To find out more see: www.spark33.co.uk or email info@spark33.co.uk   ### Cloud Service Providers - Can you answer the un-asked questions...? By Jessica Barry, The APM Group: Independent Certification partner to the Cloud Industry Forum While many people (including myself) may not understand how cloud computing is technically delivered to their devices, the concept of Cloud has become much less confusing to IT users in recent years. I attended the Cloud World Forum event in London recently, with the Cloud Industry Forum (CIF). It was clear that people are starting to see through the hype and understand the Cloud. It was also clear that Cloud Service Providers (CSPs) are skilled in delivering innovative advertising campaigns about their solutions, which coupled with the attractive pricing that the Cloud offers, presents an appealing IT delivery to those walking around an exhibition hall. However it was also clear from one to one discussions with delegates that questions remain in their minds that can't be answered by an eye-catching exhibition stand, special offers or even after the all- important sales pitch. Whether they are too embarrassed to ask or fear they may be baffled with technical jargon, the questions they're not asking include: “Will this Cloud service support the technology in the office and my remote workers?” “This CSP claims to only hold data in the EU, but the EU has over 25 states – which one has my customer information?” “I like that they include project management to migrate us to the Cloud, but what capability does the CSP have to deliver the core service I'm paying for?” “Does the CSP have support staff in the UK or will they be accessing my system and data from other countries?” “It's definitely the right solution for now, but what if a better tool comes along. How do I get my data back to change supplier?” It appears that a number of CSPs are missing a trick in their product positioning and promotion that will ultimately transfer opportunities into sales - answering the un-asked questions! Like any company exhibiting at a key trade show, CSPs don't want to water down the strength of their brand or messaging with lots of small print; particularly on show stands, literature and post-show email marketing. However, in a technical market that still has work to do in educating its consumer base on issues such as capability, resilience and security, CSPs need to find a way to build trust that will convert opportunities into sales. 
So how does a CSP effectively communicate to potential customers that they have the answers to the un-asked questions? One solution is independent certification – specifically, the CIF Code of Practice (CoP) CSP self-certification. According to UKAS (UK Accreditation Service), independent certification engenders confidence and trust, and reassures consumers of the integrity and quality of a business, service or product. The CIF CoP is no different; the self-certification covers three core requirement areas – Transparency, Capability and Accountability. The requirements against these areas are considered to be integral to the delivery of quality, effective and trusted cloud solutions and have helped a number of certified CSPs to improve internal processes as well as stand out in the Cloud and secure new business. Of course, CIF certification is not the only certification to provide end user assurance, and indeed, the CIF CoP recognises this by referring to and accepting submission of existing certifications such as ISO27001, ISO9001 and ISO14001 as documented evidence of Capability by CSP applicants. Until there is a definitive cloud standard developed by the ISO or other independent standards bodies, it is important to recognise that the CIF CoP is one that is written and endorsed by the Cloud industry itself, as the CIF's constituency comprises CSPs whose contributions fund development of the CoP alongside industry research, advocacy and education activities. CIF research shows the adoption and demand for Cloud services will continue to grow through 2013, which will increase competition and choice in the market, meaning CSPs will need to find new ways of differentiating themselves. Alongside market growth, the size and scale of trade shows like Cloud World Forum and Cloud Expo Europe will increase, as will the actual number of events, which will become critical fixtures in the marketing and events calendar. For this reason, CSPs will not only need to find a way to stand out with new quality service offerings, strong, identifiable brands and a solid online presence, they will also need to make sure they stand out in the exhibition hall. And what better way to catch the eye of the consumer, who may be full of burning questions and concerns, than to show them a mark of trust and let the certification speak for itself… For more information on the Code of Practice or the CIF, please go to www.cloudindustryforum.org or email info@cloudindustryforum.org ### Whose Data is it Anyway? By Nick Prescot, Information Security Analyst, Firehost In the increasingly complex world of hosting – shared, dedicated managed, virtualised and the like – there is an increasing concern about data ownership. Putting all your data in the cloud has become fraught with many security and compliance concerns, but what is interesting is that data ownership is often overlooked. With the new EU data protection directive currently being debated in the European Parliament, the question of who owns what data has become far more pertinent. As it currently stands, all organisations in the UK are subject to the UK Data Protection Act (UK-DPA). This piece of legislation defines three main categories of data ownership, the first being a 'data controller' – which is responsible for individuals' data and ensuring that it is processed in accordance with the Act's wider principles.
The data controller can also ask a 'data processor' to process the data on their behalf, but the data that is being moved around must be given the same protection that the data controller is required to give. Last, but not least, is the 'data subject', who is the individual whose data is being given to the data controller and processor to move around and utilise. The new EU data protection directive currently being debated in the EU parliament intends to fundamentally change how personal data (including credit card data[1]) will be processed, and what the penalties for data misappropriation will be. The main proposed elements of the new EU data protection directive are: the data processor and the data controller will be held equally responsible for the management and the movement of data; compulsory breach notification to the local authorities if personal data has been breached; a punitive fine of up to two percent of global turnover if the data controller/processor has been determined to be in breach of the data protection laws; and harmonisation of the data protection laws across the 27 European member states. The proposed changes are an evolution of what has been previously enforced, as are the responsibilities of those handling data. The control and processing of data will be much more onerous on the entities that are processing personal data. The Information Commissioner's Office has publicised that it will fine organisations up to £500,000 for a breach of personal data. At this stage, it is important to separate the potential effects these changes will have on the public and private sectors in the UK. Fining a public sector organisation such amounts is unlikely to force it to close, but it will probably have to re-adjust its budget, which will impact the services the entity delivers. For the private sector, the financial impact is often more severe, as the reallocation of funds is more challenging. For both types of organisations, reputational damage cannot be escaped. The proposed evolution of the EU-wide data protection directive will have a substantial impact on hosting providers; therefore all businesses that host their data with a third party (data processor) should take note. The last EU Data Directive was passed in 1995, before the Internet went mainstream, so a unified approach across the EU will harmonise the approach to data privacy and protection across the 27 EU member states. Not only will the data controller be required to provide more assurance in terms of managing the data, the data processor will be required to undertake the same level of assurance. The fact that data processors, such as hosting companies, will bear the same burden of regulatory responsibility as data controllers means that more care needs to be taken during the evaluation of potential cloud providers. Once the bill is passed (likely in 2014 and implemented in 2016), cloud providers' data protection and privacy programs will need to be more carefully reviewed for alignment with the imposed regulatory climate. Whilst the new EU data directive is being discussed, the issue of data sovereignty needs to be explored and each provider's practices fully understood.
These issues are important for all cloud providers regardless of where they are based or from where they operate. For providers based outside of the EU, full understanding of the provider's Safe Harbor status, their stance regarding the laws governing data privacy in their country of origin and how they plan on meeting compliance with the EU directives and data sovereignty are all of paramount importance. [1] Reference the Lush decision by the ICO. ### Help! NSA has my data – Your questions answered "Help! The NSA and GCHQ have been snooping on my data under Prism. I'm going to leave cloud and go back to on-premise as that's the only way I can be sure to keep my data safe from prying eyes." If this is you, then before you abandon cloud, it's important that you regain perspective. 1. What has been happening? Edward Snowden is the most famous person of the moment. Opinion is divided as to whether he is a hero whistleblower or a traitor to national security (or perhaps even world security). Whatever view you take of him, he's highlighted that the NSA and GCHQ have been using the powers available to them to get access to data. These revelations are continuing. 2. Are you kidding? Is that even legal? This should not come as a great shock. Intelligence agencies snoop on your data. That's how they gather intelligence. As I said previously in Patriot Act and Data Security: 8 Myths Busted, they have various legitimate snooping powers. The US Patriot Act is probably the best known and much of this data has been accessed under Section 215 of that Act. The NSA has also been using the Foreign Intelligence Surveillance Act Amendment Act. President Obama has said that the NSA has acted lawfully, that this is a modest intrusion in individuals' privacy and that it is justified to keep the nation safe. William Hague has said that GCHQ has acted legally and is not using its alliance with the NSA to get around UK laws. 3. So the snooping is widespread? Yes. In fact, there is even the sense in the US that it is easier to justify this snooping because it largely targets non-US citizens. The lesson from this is that intelligence agencies can and do snoop. Perhaps the only real surprise is how widespread this snooping is. 4. Shouldn't we curtail these snooping powers? Certainly, the widespread access to data has made many uneasy. The debate will rage for some time as to whether the access powers of governmental agencies should be curtailed or whether this is a concession of our privacy to protect national security. For the foreseeable future, at least, these powers will continue to exist. 5. Have cloud providers been participating in this? Google, Facebook, Apple and others have denied that they have actively participated in the Prism programme or that they have an open back door for intelligence agencies to gather data at will. But Skype has admitted that it joined Prism before Microsoft acquired it, allowing agencies to snoop on calls. The issue isn't really which providers are actively participating and which aren't. The issue, whether they are participating in Prism or not, is that intelligence agencies can get access to your data and they can compel the provider not to tell you. 6. So should I move back on-premise and abandon cloud as the only safe option? These powers apply whether you're in the cloud or not.
Admittedly cloud makes it easier to get your data, but if a security agency wants to get your data it will find a way. 7. So it's a hopeless cause then? It is important to retain (or regain) perspective. It is widely known that the EU shares with US agencies information about those passengers who fly into the US, but this has not stopped people flying there. These snooping powers will come under scrutiny but in the meantime the world will continue to go round and business will go on. 8. Ok, what should I do? ...Are you one? Hollywood has long depicted all-seeing governments and some might say these revelations show this to be not so fanciful or restricted to conspiracy theorists. These powers are not new and it is unlikely they will be curtailed short term. As I said in my last blog, this snooping is unlikely to be targeted at the average business. But if you're still not comfortable then this is what you should do: Classify your data according to importance: ordinary data needs little protection; sensitive data needs higher standards of protection. Protect your data according to its importance: consider if encrypting your sensitive data is appropriate whether you keep it in the cloud or not. Use cloud wisely: consider keeping your sensitive data on premise if it gives you comfort, but if you have encrypted it, it shouldn't matter if you keep it in the cloud or not. Also, if your email is not encrypted then moving your email server into the cloud probably won't make things worse. Clearly the revelations about NSA and GCHQ make it appear that 'Big Brother' is watching us. But that doesn't mean that we should all abandon cloud. Nothing has truly changed - we're just all more aware of what our governments are doing and we should act accordingly to protect our data. ### Why Aren't You Selling Your Socks Off With Cloud Solutions? By Marcus Cauchi of Sandler Training I have a colleague who is prospecting with a company selling AS400 PaaS solutions to financial institutions. They are about to go into a meeting with over 15 of the prospect's team, from CIO to 'Chief Pointy Head' in charge of paperclips, to do a presentation. They will present on the company, its technology, its points of difference. They will answer a series of technical, personnel and business questions and then they will trial close. They will handle objections and then trial close again. They will then be asked for a proposal and walk away satisfied that they have done a good day's work. Sack the lot of them! Fire them! None of them are selling. You would do better with a trainee from a YTS scheme and a checklist of questions for them to ask the prospect, for all the good this 'show up and throw up' dog and pony show will do. Gone are the days when CIOs could focus on efficiency and driving down costs. You can't save your way to growth and success. Eventually you will cut out the fun, the life and the opportunity, your best and most creative people will leave and you will be left with the deadwood and the middle layer of mush (people who say, "At least I still have a job" or "at least we didn't shrink that account by as much as we might have" or "At least I got 80% of my number"). It's not about the technology. CIOs don't sit there thinking about the next shiny bit of technology. They are deeply involved in the business and how it runs, what makes it succeed – or they are headed for the dole queue. IT is a business enabler nowadays, not a bunch of technologies burning electricity and eating a hole in the FD's pocket.
CIOs who are worth their salt have to educate vendors on how to present their solution to the business decision makers in their company because the vendors are doing it all wrong. You are doing it all wrong! Vendors (you) who can't make the knowledge accessible enough so even business executives can understand it are doomed. Businesses want to know how they can manage overtime costs, find people with specific skills in their company, and apply the right resources in real time to where they are needed so buyers can buy them. Everyone in IT (vendors and IT departments) needs to understand what the business needs and why they need to solve real business problems. Vendors in the Cloud need to understand the pain the buyer organisation has so you can offer real solutions to real-life obstacles to growth, top-line sales and expansion of market share. IT IS NOT ABOUT THE TECHNOLOGY. Vendors can help buyers drive innovation in the Front Office and the Back Office. Vendors can help speed up the R&D cycle at no risk, just for the cost of renting a couple of thousand server hours, without driving up their CapEx. Vendors can solve the problems CIOs are facing with business users expecting the same ease and convenience they get from their consumer devices. You have shifted from being a technology provider to a business or customer enabler. You are no longer just innovating for IT but for the whole business and its supply chain. You can save your way to mediocrity by driving down costs and driving up efficiency, or you can create products and services that people want to buy. Think about it for a moment. Banks aren't banks any more. They're IT services companies that happen to package their solutions in financial products. Banks without IT are dead. If you can't respond quickly to your customers' demands and preempt their needs, you will lose to nimbler competitors. You want to be able to come up with ideas in the business, apply and test them quickly and cheaply... SaaS allows you to compress time without paying a premium. The smart money isn't writing off the data centre or server farm either. If you have a situation that requires unvarying, regular use of a particular platform or service, it's probably a good idea to have this in house. But variable or unpredictable use of a service favours a rental model. In Mergers & Acquisitions the Cloud can help drive the decisions the merging companies make on which systems to keep and which to ditch. The cloud can help you rapidly prototype solutions and changes at ultra-low cost. You no longer have to bet your shirt on projects because the costs and risks are now so low. But such projects only work when the CIO, his team and executives in other departments all share a common picture of the future, where they're headed and why it's important. This means CIOs need to have close working relationships with the CEO, CFO, COO, Sales & Marketing Directors, Director of Customer Service, Supply Chain Director, HR Director etc. Without those relationships, how can s/he know what the business wants and needs? The CIO has a place on the Board and can only maintain it if s/he understands that the success of the business depends on the success of IT. If you sell Cloud services (SaaS, PaaS or IaaS) you have a responsibility to understand where the CIO is in his career, in the job lifecycle and the pressures s/he faces to serve the business. Their continued survival depends on them being able to provide rapid service enablement using technology but not being tied to tradition.
Avon had to produce a solution to a series of business needs that served their 6m reps worldwide. They considered building the application in house; partnering with a traditional software vendor and having them host it, or using Salesforce.com. They picked the latter because they could begin testing the application within weeks without adding extra strain to their IT organisation. This meant they could focus their attention and resources on the application itself. Bravissima have used Big Data to drive sales of certain products up by as much as 600% just by responding to weather forecasts and adjusting their banner advertising online and their window displays in-store. In every case, from Flextronics to Avon, Bravissima to Toyota, Cloud vendors responded to the prospects' pain whether it be from social media habits, BYOD, sales force pressures or a need to provide services demanded by customers and enabled by IT. But showing up and throwing up is not the way to do this. The company I wrote about at the start is going to end up in a long, protracted sales cycle that results in “no decision” while they shop around, try and develop it in house, go to their preferred supplier and see if they can do it for them for less. They will waste time kissing frogs thinking they are selling. They aren't. CIOs need vendors who know how to get to the heart of a business problem and happen to solve that problem through technology using the platform, infrastructure or software that serves the business's needs best. It may not be the lowest cost per unit, it may not be the most efficient, but if they are sold right, Cloud solutions will provide businesses with real competitive advantages and tear them away from the addiction to the traditional pains that are holding them back today. Vendors who turn up trying to sell technology will be robbed of their knowledge and buyers will buy despite how they were sold not because of it. Remember bad generals usually fight the last war! [quote_box_center]Marcus Cauchi runs Sandler Training in London. He lacks the milk of human kindness you'd associate with nice trainers because no one hires him to be their friend. They hire him to drive up sales and eliminate ineffective and bad selling and sales management habits. Working with him is rarely a pleasant experience, it's tough and it's expensive. Call him and you'll know why. 0203 427 5133 or 07515 937221.[/quote_box_center] ### Good News! You're Not Paranoid By Daniel Steeves, Director at Beyond Solutions So, they can see it all… and they really are watching! For the past six years, PRISM (under public ownership) and that other form of Big Brother, the Internet in general, (under, mostly, private ownership) for much more than six years. While very different in nature, perception and our view of having given permission (or otherwise) these two avenues to our information are also very much the same. As is the answer to how you can take control of both and render PRISM itself effectively irrelevant with regards to your data (more on that a bit later). A recent article on ReadWrite.com, amongst others asking questions like ‘In Cloud We Don't Trust?' might be a little off-kilter from the start: isn't trust meant to be based on things like implementation and delivery, not on concepts? In any case, modern trends have led to behaviour changes that on one hand relax our views on personal privacy while accenting them with heightened awareness and increased concerns regarding personal security and identity... 
clearly a contradictory position, but that is where we are. Trust is based on the implementation and delivery, not on the concept. And I think we know, mostly, how we got here: Data, Big Data, lots of data: accumulated, shared, analysed and desired… not to mention viewed, compromised and stolen. From the looks of what we see on real-time attack monitors like HoneyPot or Akamai – and what we read in the June update from the Information Commissioner's Office reporting a tenfold increase in corporate data breaches over the past five years – we are pretty aware of what mistakes and attacks are possible on both sides of the security fences. Probably by now we shouldn't be surprised that Governments, businesses and individuals would attempt to find opportunities to their advantage – maybe not in terms of right and wrong, moral or immoral, legal or illegal, but simply in terms of the odds of it happening. We are the King-Makers! After all, "We, the People" have generated vast amounts of wealth for other people by giving them our personal information and the permission to use it pretty much as they please. Many may not have realised, from the start, exactly what was happening but as our search results improved, social and business networking sites connected us to friends and family, businesses flourished and content worth consuming popped up everywhere, about everything – all provided essentially for free – for the most part we decided we were okay with it all… even if we didn't read the fine print before accepting the Terms & Conditions! I'm neither qualified nor, to be honest, interested in debating legalities, civil rights, moral views or due process, and there are more than a few spots covering those angles… so let's skip to the mechanics. For the sake of argument, let's consider it a given that "they" are legitimately seeking access to information to support investigations in our 'best interests'. And give the benefit of the doubt to the Courts as gatekeepers who accept only the most compelling arguments. And that legitimate, local due process will be followed. And that the information requested will be used solely for the original purpose (that 'greater good' thing). And that things like Safe Harbour and other treaties meant to enforce the protection of our data, personal and otherwise, actually do enforce the protection of our data. We are the Protectors? Okay, a few leaps of faith needed so far, particularly when we add to the mix that the rules and policies of such treaties worldwide were actually defined in a different era of technology, years before the capabilities (and resulting application or interpretation of US laws to suit the use) of PRISM could even be considered. It was even longer before our latest favourite whistle-blower made us aware of its existence, let alone its extent and capabilities: in simple terms, if "they" want it then, apparently, "they" can get it. I could be wrong but I think that many of us inherently accept that some of that data, somewhere, connected and analysed the right way by the right people, might prove to hold the key to something vital. I know that I enjoy hearing about intelligence-led victories (especially when achieved intelligently!). The "doing it behind our backs" thing might not have been well advised, though, in a world where fundamental control by the owners or keepers of data – who have the responsibility, legal and ethical, to control and protect the data they hold for or about others – is how it is meant to be, or so we'd been led to believe.
You Hold the Keys The reason you may not have to accept any of those leaps of faith is that PRISM as a data-Hoover actually poses little cause for alarm if the data can't be read. So why not spy-proof your data to limit or eliminate the ability of anyone, Government or otherwise, to access your data without your knowledge; for it to be snooped or otherwise collected whilst it travels via the internet and through some clouds; or when in the hands of Internet, Cloud or Hosting providers – or your supply chain or anywhere else it might get to, planned or otherwise. Encryption as spy-proofer is not so complicated: in simple terms, data is kept confidential by means of a cipher, for example, which like a lock has a key: lock to encrypt, unlock to decrypt. And it is perfectly legal. In fact it is generally recommended but far less generally implemented, even though it can be done with little to no performance impact and, with the risk it mitigates – protecting your business from fines like those reported by Gradian Systems – factored into the model, it might prove cost-effective as well! While any reputable business will want to comply with the laws of any states in which they do business, encryption provides you with legal control: companies may be compelled to decrypt their data under subpoena or other methods, but they will always know when and which data has been targeted. The fact is that of those businesses who do use encryption to secure data across networks to offices and partners or via publicly accessible services, few of them encrypt data as it travels through their internal networks, between storage and servers. Which might mean that they are not getting what they think they have. The trick, then, is not only to clarify your requirements, as always, but to have clarity as to the implications of the options available and, ideally, collaboration with the subject matter and domain expertise of a trusted advisor. Otherwise you might end up with encrypted data but not have control of the keys. One such trusted collaborator is Jon Penney, CEO of Intellect Security (disclosure time: he is a client) who provides enterprises with source-to-archive encryption solutions and via Cryptosoft delivers scalable policy-driven encryption, as an appliance or as-a-Service, protecting data across workflows and business processes including partners, supply chain and customers. [quote_box_center]SSL protects [data] on the way to the cloud but not in the cloud. As Jon put it: "SSL provides fairly robust protection for your data as it moves along the pipes but the data is only protected in motion: on the way to the cloud, not in the cloud, for example. What you need in this scenario is for the data to remain encrypted-as-used in real time by applications, in active local and remote storage, for backups or when archived and everywhere in between. If clear at any time, either on your own systems or when out of your direct control, the data is subject to access without credentials."[/quote_box_center] The key is of course the keys: intelligent design abstracting business logic from physical infrastructure and cross-platform secured distribution - to ensure that security is designed as a part of, rather than a restriction to, the processes of the business. Not to mention keeping the keys in the hands of the business - and not their service provider.
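To make the "you hold the keys" point concrete, here is a minimal sketch of client-side symmetric encryption in which the business generates and retains the key, so anything handed to a cloud or hosting provider is already ciphertext. It uses Python's cryptography package purely for illustration – the article names no specific tool – and real deployments would add key management, rotation and secure storage of the key itself.

```python
# Minimal sketch: encrypt before the data ever leaves your control.
# Assumes the `cryptography` package; key handling here is deliberately simplistic.
from cryptography.fernet import Fernet

# The business generates and keeps the key; it is never given to the provider.
key = Fernet.generate_key()
cipher = Fernet(key)

record = b"customer: Jane Doe, card ending 4242"

# Only this ciphertext is uploaded to the cloud or hosting provider.
ciphertext = cipher.encrypt(record)

# Without the key, the stored blob is unreadable to the provider or to anyone
# who collects it in transit; with the key, the owner can always recover it.
assert cipher.decrypt(ciphertext) == record
```

The same principle scales up to the policy-driven, source-to-archive products discussed above: what matters is that encryption happens before data leaves your control, and that the keys stay with the business.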
Of course, if "Big Brother" really wants it he'll probably find a way… the key may be the key, but that ownership and control mitigates risk only until the Law says otherwise. Daniel is a Director at Beyond Solutions, where he advises Cloud and other Technology providers, and a Thought Leader for Compare the Cloud. You can reach him at steeves@beyond-solutions.co.uk or follow him on Twitter @DanielSteeves. Daniel has recently started publishing the Beyond Cloud series of video 'discovery discussions' with IntelligentHQ.com ### The Future of the IT channel (and the 2025 Car Market) By Mark Adams, CCO of Cloudmore [quote_box_center] It's 2025 and you are the franchise owner of a city-based car dealership. The manufacturer of the cars you sell has just arrived... He says to you: "The future is renting electric cars, we are not going to sell cars anymore, instead we are going to rent them - and we want you to help us!" You reply: "Errr, that's great… I think… how much margin do we make?" The business model works like this: you find the customer, you sell to the customer, and you look after the customer, and for that we let you have 12% margin for the first year and then 4% per year after that. Your first thought is that all your sharp-suited sales people are going to have to go.... OK, this could be OK because right now we earn a lot of money from the on-going servicing of the vehicles. "So how much do we get for looking after the cars then?" The manufacturer replies: "Ah, well, you see we do that. We look after the cars - they are pretty much maintenance-free and we repair them remotely, so you won't really need to look after them… But," he added, "you can earn some great money washing them!" The penny is starting to drop now: you probably won't even bill the customer in the future, all the effort invested in your maintenance workshop will be wasted - and you are certainly going to need to downsize your upmarket dealership premises. [/quote_box_center] Is this a likely scenario? Well, renting smaller electric vehicles around cities has already arrived; it's just not mainstream yet. Another very likely scenario is that we will see new innovative companies, with no legacy of expensive premises and people to hold them back, changing charging and service delivery models. It is fun to hypothesise about such things, but for the supply of IT the future has arrived in 2013. Moves by the big players into direct contact with the customer, solutions that are "self-healing" at the backend - and a set of competencies in the Channel that are starting to look out of step with the customer requirements of the future. A vision of the future channel (emerging now) So what might a Channel player of the future look like? We already know that the Customer conversations are starting to be less technical, more about the business value derived from the solution; we are starting to see new supply chain models where the Channel Partner acts more as a solution advisor than reseller - and gets margin more like a distributor; and we know that the average Channel Partner's cost-base needs some serious attention. I predict there will be consolidation - and although these new companies will have more customers - the average revenue per customer will be lower. They will mainly purchase from a limited number of Cloud aggregators/brokers to keep down their own costs, and only the biggest will own their own clouds.
The sales teams will be more like business consultants advising how the end-customer can derive a competitive advantage from the use of a particular solution, but they will also be experts in distributed security models and processes. Training will be provided in on-demand models, with the most innovative companies producing lots of their own content to help and advise their customers and users to get the best from their systems. I believe that contract terms will still be measured in years but payment models will be pay-per-use, probably on a quarterly basis for most. The most difficult change for many Channel businesses will be the ability to continually attract new customers. I would suggest that this is not a core competence (as a general rule) across the channel, and I know that traditional methods of feet on the street will not be financially viable… so the successful companies will need to have powerful, low-cost lead generation and sales engines. This is not going to be easy for many. SaaS, PaaS and IaaS vendors will continue to struggle over the next few years in terms of their commercials, SLAs and Go-To-Market Models. A significant part of what Cloudmore do with our own vendors is help them through this process, no matter what size they are. But where there is disruption there is opportunity. I am seeing some fantastic new and innovative companies trying to make the most of the new-style supply chain. Indeed, our recent integration of the IBM SmartCloud for Social Business solution into the Cloudmore portfolio means you can now deploy this service to your customer in minutes - about 4 minutes, actually! This means this solution is now a viable option for SMBs, through the Channel, and as a 'trojan horse' into larger mid-market companies by deploying into one department and letting the viral thing happen. This can be non-disruptive, new revenue for a Channel Partner, and starts that deeper business value conversation with the Customer. So a Customer who is pushing for a Cloud solution can be given more than just a cheaper alternative – if all you do is substitute one Exchange Email for another Exchange Email, what value are you adding? Adding value with innovation and different approaches creates loyalty. With over 1000 Partners across 6+ countries serving over 6000 end-customer organisations, Cloudmore are constantly innovating towards a next-generation Cloud Aggregation business model to enable businesses to move to the Cloud quickly, seamlessly and securely. ### Patriot Act and Data Security: 8 Myths Busted By Frank Jennings, Cloud Lawyer It is likely that, after Sarbanes-Oxley, the most widely feared and misunderstood piece of US legislation - at least in the cloud sector - is the Patriot Act. According to some, the Patriot Act is a good reason to avoid cloud computing altogether. Others say that the draft EU data protection regulation is the Commission's defence against the Patriot Act. Others say it is a good reason to avoid using a US provider. Certainly, there is a lot said about the Patriot Act but not all of it is accurate. Here is our myth buster. 1) The Patriot Act is anti-European No. The Uniting and Strengthening America by Providing Appropriate Tools Required to Intercept and Obstruct Terrorism Act 2001, to use its full title, was passed following the tragic events of 9/11. It consolidated existing anti-terrorism laws and has wide surveillance and seizure powers, some without the need for a court order.
However it applies to US cloud providers just as much as European providers. There's also the lesser known Foreign Intelligence Surveillance Act Amendment Act which allows surveillance of activities which are unlawful or which are potentially against US interests, such as activist, protest or political groups. 2) If I store my data in a US data centre the FBI will access my data Certainly the FBI could get access to your data but that doesn't mean they will. Businesses should not worry unnecessarily. The US government is unlikely to want to get access to the data of the average business and will more likely target those engaging in activities which are unlawful or which are potentially against US interests. 3) If I store my data in the UK, the FBI can't access it No. In June 2011 the managing director of Microsoft UK admitted that the Patriot Act applies to them because they are a US headquartered business. So, even if you use a UK provider and store your data in Germany, if you have a US provider anywhere in your cloud supply chain which handles your data, the Patriot Act could still affect you. 4) If I avoid US providers, no one will see my data It's important to remember that each government has its own anti-terrorism and surveillance legislation - the UK government is no exception with it's Regulation of Investigatory Powers Act (with the appropriate acronym RIPA). Powers under RIPA have been used widely and not just for surveillance in the cloud sector. Councils have used it to tackle dog fouling and to take sound recordings of noisy children. So each government can get access to data within their jurisdiction. 5) The UK won't hand my data over to the FBI Yes it will. Many countries - including the US and UK - have signed treaties and have in place processes to pass data - your data - to each other upon request. To keep your data safe from the UK government or the FBI, you would probably have to store your data in Iran or North Korea. Good luck with that. 6) The draft EU Data Protection Regulation will stop the Patriot Act It's true the new regulation will increase the level of protection of personal data - and is intended to apply to data about EU citizens held outside the EU. This is making US companies uneasy and there is a lot of lobbying going on to water this down. Even the UK information commissioner is not happy with the regulation. But don't forget, there are two key limitations here. First, the regulation - in whatever its final form - will protect only data about individuals, not all information. Second, it includes exemptions for national security, just as the current law does. 7) If governments can get my data wherever I am, I should stay out of cloud While it's true that it will make it more difficult for governments to get access to your data, storing it on-premise and out of cloud is not necessarily the answer either. If you are the kind of organisation that the UK government, the FBI and other organisations want to investigate, then maybe you should stay out of cloud. If you are unlikely to attract the attention of law enforcement agencies, then why forego the benefits of cloud because of a misplaced fear that they will come after you? At least evaluate whether private cloud or hybrid cloud could work for you. 8) I'm not in cloud. I'm secure Cloud providers place data security at the heart of their operation. If you stay with an on-premise solution, make sure you address data security - don't assume that your security is adequate. 
Also, make sure your staff aren't using their own devices for business purposes. Even if you don't support BYOD, can you be sure that your staff aren't using their own iPads or Android devices for business. And you'd better check whether they use Salesforce, Dropbox, Gmail and all those other excellent cloud tools. What can you do to avoid scrutiny? There are a number of steps you can take: Assess whether your business model or the data you collect is likely to attract the attention of UK and US governmental agencies Evaluate your data and identify the really important information Consider hybrid cloud where you keep your key data on premise and run everything else through public cloud Consider private cloud where your data is held by someone you can investigate and trust Consider encryption or tokenisation to protect your data Check whether staff are using their own devices or public cloud accounts I agree that data security concerns are important: I've co-authored two reports on this subject and have co-authored a cloud contracts best practice whitepaper which addresses data concerns. Email me for copies. [quote_box_center] FEATURED COMMENT: from this blog's comments section below With thanks to Verizon Terremark and Eoin Jennings: Some further information from our legal [department]: The Patriot Act does not grant the US government access to customer records stored in the cloud. The law only applies to business records of the cloud provider itself. Therefore, if the US government wants access to a customer's data stored outside the US, it must request assistance from local (in-country) law enforcement, just as other governments around the world do [/quote_box_center] Here is a more detailed explanation of how the Patriot Act ACTUALLY works: 1. The Patriot Act is a law of very limited application. The law applies only to national security (terrorism) investigations within the US. The law does not give the US government the power to act outside the US, and it does not apply at all to criminal or civil investigations outside the national security area. 2. The Patriot Act does not grant the US government access to a cloud customer's data. The important fact about the Patriot Act that those commenting on the law in the media do not understand is that the law is a “business records” statute. What this means is that the US government can use the law to ask any company that does business in the US (this includes US subsidiaries of non-US companies) to provide the company's own records (things like customer name, address or means of payment), but it cannot require a hosting provider to provide access to customer data stored in a non-US data center. 3. To access customer data stored outside the US, the US government uses established treaties. Because the Patriot Act in our view does not authorize the US government to “search” a server located outside the US, the US government must request assistance from local (in-country) law enforcement to conduct a “search”, just as other EU governments do. 4. The Patriot Act is not being used to access customer data stored in the EU. The Dutch government has recently confirmed in answers to parliamentary questions that it is not aware of any requests under the Patriot Act for personal data stored in the EU and that the US authorities have stated that if such a request were made it would be with the assistance of in-country law enforcement. 
### It's time to rethink Cloud Management By Jim Darragh, CEO of Abiquo We've all heard of 'Cloud'; we all understand it and know the benefits it can bring, so I'm not here to preach to the choir. But what I will do is challenge you: I want you to take a good hard look at your current Cloud offering and start asking yourself some difficult questions. Here begins your Cloud therapy. So where does Cloud begin? It's safe to say most, if not all, organisations have virtualised at some level. The truth is Cloud is a natural evolution of your virtualisation strategy, an extension that builds upon the benefits you hoped to gain from virtualising. It is necessary for organisations to migrate not just their technology to the cloud but also their existing legacy processes – to truly optimise all their resources, organisations must do both. Enterprise governance is key in the context of who owns the hardware – where is it, who can access it, how is it secured, what patches need deploying into what environment. However, what about machine and lifecycle management, or capacity planning for that matter, or workflow and Business Process Management? All of these need incorporating too: they are legacy processes inherited from the old business, and the customer will want them automated into a true cloud offering. That's precisely what Abiquo provide. So let's start here; you need to ask yourself some questions: has your virtualisation strategy worked and are you in control of your virtual environment? Do you have the mechanisms, management and control to ensure that you are getting the utilisation you wanted? It's also worth thinking about how you plan to leverage your current virtual environment to offer the next level of agility, flexibility and usage model. If you have control of virtualisation and the levels of usage, you can begin to accurately plan for future capacity needs; you can also begin to plan which workloads require what level of capacity, compute, storage and network. In doing so, you are then also able to control where the capacity resides and how to ensure you have a higher level of efficiency and utilisation. Assess your options; what is available to you? Can you begin to combine your own in-house infrastructural resources with "rented" options from public sources? Do you have the ability today to manage both as a common, ubiquitous set of resources that is available to you when your plan says you need it? Do you fully understand the cost implications and potential benefits of mixing your own resources in your data centre with those you can procure "on demand" from public cloud providers, whilst maintaining the control and processes that you have for your organisation? You probably also want to consider how you can control and manage a mixed set of resources whilst maintaining the processes and management systems (i.e. the tools that essentially keep your IT running smoothly and in a framework of control and compliance). Can you keep those robust systems and processes but now apply them to an infrastructure that is both highly utilised and flexible, and consists of elements you own and rent? Can you give your users the speed and flexibility that they can freely procure outside your organisation?
Your consumers are now accustomed to procuring resource over the web with a few clicks of the mouse, but they will invariably do this outside of the control of IT and the governance of the business, unless IT can provide the same experience from within. It can be achieved: your users can be given access to self-service resource, but governed by you and the policies you have for where workloads should reside, approved by your organisation and using the underlying infrastructure and applications that you deem appropriate. It is important to have a long, hard look at your Cloud offering: is it delivering what you had hoped, and are your customers or users finding it as accessible and valuable as they should? Indeed, what do you want your Cloud and IT process to look like in the future? Abiquo will be exhibiting at the Cloud World Forum in June; visit stand 4080 for your FREE Cloud Health Check. ### Changing sales behaviour? You might want to look at the sales incentive plan first. By Paul Bevan, Director of Surefire Sales We have heard a lot recently, even within the recent portals of ComparetheCloud, about the need to change the way we sell. While there is always a danger in over-simplifying the drivers and levers that affect human behaviour, might I suggest that a little time and thought spent on making sales incentive plans fit for Cloud purpose would be time well spent. I don't think it is an exaggeration to say that we get the sales results our incentive schemes deserve. Anyone who has run a sales team knows that sales people are masters at working out how to make a scheme work to their advantage. The more complex we try and make the schemes, to try and balance up the drive for growth and the drive for profits, the more the sales teams find ways of driving a coach and horses through their provisions. When we were selling hardware, or software on a licence basis, it was relatively easy, in theory, to construct incentives based on whether the focus was on order growth, margin growth, customer acquisition and retention, or a combination of all three… and we still got it wrong regularly. Now, Cloud and globalisation have made life much more complicated. For example, if you are a sales person selling data centre space and are paid on monthly new bookings in your geography only, are you going to chase a local deal for 20 racks now with a national enterprise, or the new IaaS vendor, tipped for greatness, who only needs one rack in your geography and a further 10 in other geographies, with the chance that this could become a huge magnet for future business? Trust me, without management intervention, most sales guys will take the 20 local racks now. The long-term company strategy may have been better served by taking the other deal, but the drive for revenue now can often be a compelling factor for many organisations. This is not restricted to hardware either, as software and service companies struggle to incentivise sales for selling Cloud-based pay-per-use models effectively. If the anecdotal evidence is to be believed, the problems are pretty widespread and must be impacting growth and profit targets.
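To put hypothetical numbers on the rack example above: assume a rack bills £2,000 a month and the plan pays 5% of new monthly bookings in the salesperson's own geography only (both figures are invented for illustration; the article gives none). The sketch below shows why the local 20-rack deal wins every time under that plan, even though the multi-geography deal may be worth more to the company in the long run.

```python
# Hypothetical figures - the article does not quote prices or commission rates.
RACK_PRICE_PER_MONTH = 2_000   # pounds
LOCAL_COMMISSION_RATE = 0.05   # paid on bookings in the rep's own geography only

def commission(local_racks: int) -> float:
    """Commission under a 'local bookings only' incentive plan."""
    return local_racks * RACK_PRICE_PER_MONTH * LOCAL_COMMISSION_RATE

deal_a = {"local": 20, "other_geographies": 0}    # national enterprise, all local
deal_b = {"local": 1,  "other_geographies": 10}   # promising IaaS vendor, mostly elsewhere

for name, deal in (("Deal A", deal_a), ("Deal B", deal_b)):
    company_booking = (deal["local"] + deal["other_geographies"]) * RACK_PRICE_PER_MONTH
    print(name, "company booking:", company_booking,
          "| rep commission:", commission(deal["local"]))
# Deal A pays the rep 2,000/month; Deal B pays 100/month, even though Deal B
# may be the better long-term bet for the company.
```

Whatever the real numbers, the shape of the result is the same: the plan, not the strategy, decides which deal gets chased.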
Are there some great examples of simple, effective schemes out there that are driving desired sales behaviours in Cloud vendors, or do we need to adopt the DEC approach from the 1980s and do away with the bonus scheme for sales completely? I'd love to hear your views. ### Cloud Security and Control using SDN Overlay By Chris Purrington, VP of Sales at CohesiveFT CohesiveFT is a company 'born in the cloud'. Since 2006 we've been helping our customers to use cloud computing, and that whole time we've been focused on addressing the issues of security, control and complexity which are encountered when using Infrastructure as a Service. Our goal is, as much as possible, to allow our customers to carry on doing in the cloud what they were doing on premise, without lots of changes and learning. From a customer engagement in 2007 (a global science project looking to use AWS to host common business tools shared to 50+ locations) we identified the need for secure overlay networking and so developed VNS3™, our Software Defined (Overlay) Networking product, although of course it wasn't called SDN back then. In this blog I want to tell you about VNS3 ("Virtual Network Server-Cubed") and how it can address many common cloud adoption hurdles, but I should also mention it is part of a suite of products that offer a tool kit for cloud application migration. Our Server3™ image and topology automation product allows you to import your VMs, software components and ISOs, and capture your topology definitions. It then transforms them into VMs for your chosen cloud and launches these VMs together as one application. This multi-cloud application migration process is all about automation and reuse, not magic! Watch out for more on this in a later blog. Cloud Adoption Hurdles Back to application-centric SDN. We use that term because VNS3 operates at the application layer (layer 7) rather than down at the network layer as other OpenFlow-based SDN offerings do. Running at the application layer, VNS3 gives back control to the application owner and so addresses many more of today's cloud-based systems' requirements than OpenFlow does. Here are some of the typical adoption hurdles faced when moving systems to the cloud: Security - is your data secure enough? Compliance - can the CXO attest to the security of their data? Compliance - is the level of encryption high enough for PCI/HIPAA/etc.? Connectivity - can all your remote offices, or customers/partners, securely connect to their specific servers in the cloud? High availability/DR - will my system always be available, or can I recover quickly? Control - can I use the IP addressing I want, where I want it? Scalability - can I connect as many offices, networks and devices as I want? Portability - can I move my network with my application to another cloud? Connectivity for 'road warriors' - can our iOS and Android devices be connected securely? Monitoring - can I use my existing NOC to monitor and manage my cloud-based servers? Integration - how can I integrate with on-premise data, systems and infrastructure, e.g. Active Directory? All of these and more are addressed by VNS3 - it's so much more than a cloud VPN or VLAN. Your Network in Your Control VNS3 is a software-only virtual appliance that our customers license, deploy, configure and manage. It's in their control, not in the control of the cloud providers, so their CXO can attest that their data in motion to, from and within the cloud is secured with 256-bit encryption.
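Because the appliance is licensed, deployed and configured by the customer rather than the provider, its setup is the kind of thing you would normally script (the piece notes further on that a comprehensive set of APIs is available for exactly this). The sketch below drives a purely hypothetical REST interface for such an appliance using Python's requests library; the URLs, fields and credentials are invented for illustration and are not CohesiveFT's actual API.

```python
# Sketch: automating configuration of a customer-controlled overlay appliance.
# All URLs, fields and credentials below are hypothetical, not a real product API.
import requests

APPLIANCE = "https://overlay-manager.example.com/api"
AUTH = ("api-user", "api-token")  # credentials stay with the business

def add_ipsec_endpoint(name: str, peer_ip: str, shared_secret: str) -> dict:
    """Register a remote office or partner network as an IPsec peer."""
    resp = requests.post(
        f"{APPLIANCE}/ipsec/endpoints",
        json={"name": name, "peer_ip": peer_ip, "psk": shared_secret},
        auth=AUTH,
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()

# Connect two remote offices to their servers in the cloud overlay.
for office, ip in (("london-office", "203.0.113.10"), ("madrid-office", "203.0.113.22")):
    print(add_ipsec_endpoint(office, ip, shared_secret="example-psk"))
```

Scripted in this way, the same topology definition can be replayed in another cloud region or provider, which is part of what makes an overlay network portable.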
VNS3 is a hybrid device: it is a router, switch, firewall, IPsec and SSL VPN concentrator and protocol re-distributor. This means our customers can overlay the cloud-native network with their own network and so have control over end-to-end Encryption, IP Addressing, Network Topology and Multicast Protocols. We see lots of varied uses for VNS3 in the Cloud: capacity expansion, disaster readiness/recovery, legacy migration and integration, development & test environments on demand, cloud WAN, and partner/customer/branch networks. The overlay network is achieved through the separation of network identity from location, allowing our customers to: extend their corporate network into the cloud; integrate with on-premise systems and data; spread their overlay network across multiple cloud regions or even cloud providers; create a highly available meshed network; build hybrid and/or federated clouds; connect multiple locations and remote users to federate common shared infrastructure; create global cloud WANs to connect to disparate customer and partner networks - often with very different requirements; and create a cloud security lattice - orthogonal security measures layered up to provide enterprise-class security in the cloud. Relying on VNS3, customers have created PCI- and HIPAA-compliant services, ISVs are embedding it, SIs are delivering solutions with it - and CSPs/MSPs are offering it as a 'VPC on steroids'. An example of such a deployment is a VNS3 high-availability, multi-cloud meshed overlay network using a 172 address range, securely connecting multiple data centers. These meshed networks are dynamic: they can be designed to failover automatically to keep the network always available, and a comprehensive set of APIs is available so you can script and automate your overlay network's creation. As the cloud space matures the number of cloud providers will continue to grow, and analysts are validating that cloud convergence, interoperability and federation across providers will be key for many production deployments. So the importance of Overlay SDN will grow, as will architecting your cloud networking for security, flexibility and future choice. CohesiveFT are also sponsoring Cloud World Forum at London Olympia on 26th/27th June; you can find them on stand 3050. CohesiveFT are founders and organisers of CloudCamp London, the next one is 26th June 2013; www.cloudcamp.org/london. Interact with CohesiveFT on Twitter: @cohesiveFT Get in touch: ContactMe@cohesiveft.com ### An Interview with Firehost The leader in secure cloud hosting delivers service to Europe. Katrena Drake of Firehost CTC: Tell us about yourself I'm Katrena Drake, the director of European Operations for FireHost. I've worked at FireHost since the company's inception and, over the years, it's been exciting to see us develop progressively more secure cloud hosting where it's most needed – for large organisations dealing with sensitive data, or those regulated by technical standards such as the Payment Card Industry Data Security Standard (PCI DSS). One of the most unanticipated rewards of providing cloud hosting with security is how many companies now take security so seriously.
I only moved to the UK last year, so feel free to correct my speech or teach me new words and places to see, although I've actually learned quite a lot from the dozens of applicants I've interviewed to join us at our new HQ, west of London. CTC: Who are FireHost and what is your UK strategy? FireHost is the leader in secure cloud hosting. We serve thousands of end users in more than 30 countries. We're here in the UK to address, head-on, the top two concerns most British companies have about moving their data to the cloud. 1. Security Our infrastructure is purpose-built to protect businesses from organised cybercrime and malicious, egotistical hacker exploits. Last year alone, we mitigated millions of SQL injection and cross-site scripting (XSS) attacks, protecting financial institutions' and healthcare organisations' databases from breaches. In multi-channel retail, we mitigated countless distributed denial of service (DDoS) attacks and managed complex load balancing scenarios, to help keep the 'cybertills' ringing on our clients' biggest shopping days. These skills come in handy with celebrities, politicians and their pundits, many of whom choose to host their infrastructure with us as well. Most notably, and what gives us such a great sense of pride, is that we protect many of our fellow technology firms and security companies. Names you know, like RSA, HP, ArcSight and LogLogic, among others. Being chosen by your peers is a fantastic endorsement. 2. Data sovereignty and accountability Ask five cloud hosters and you'll probably get five different answers about how they control where your data resides. With FireHost, explicit control over the data centre in which your information resides is standard procedure – a part of our SLA, if you will. We reinforce secure server location visually in the customer portal, with country-by-country indicators for every secure server a client commissions with us. A very handy feature for IT staff of businesses with multi-national operations. CTC: We understand you host Kevin Mitnick's blog. For readers who do not know about Kevin, could you give us a brief overview and explain why this shows your capabilities as a secure hosting provider? In case you don't know Kevin Mitnick, he was the first hacker to ever serve time in prison for cyber crimes. He quite impressively breached large corporations – Motorola, NEC, Nokia, Sun Microsystems, Fujitsu, Siemens and others. When he was released from prison in the early 2000s, he decided to start using his hacking skills for good. Kevin has gone on to help many of the firms he once targeted, helping them to improve their security through his consultancy. He also created the 'Black Hat Security' educational conferences. While the new cleaner image did much for his career, it also made him quite an appetising target for his old hacker cronies. For half a decade, Mitnick has been on the run from hackers. Quite ironic really! He went through a multitude of website hosting providers for his site, using different hosting platforms and developers, but simply could not keep his site online with the intended visage. In 2009, after a particularly embarrassing hack had put unsavoury images on the front page of mitnicksecurity.com, FireHost's CEO Chris approached Kevin to become a client. Kevin scoffed and wished Chris "good luck", obviously believing that it was impossible to keep him safe. Chris persuaded Kevin to give FireHost a try and today, three years later, Kevin's site remains safely in our environment. Incident-free.
CTC: What would you say are your unique advantages in a very competitive market?

With FireHost, security is not a bespoke solution; it's built into the environment for every client who uses our service. Even the most basic plan incorporates the technical requirements of PCI DSS – one of the highest, most stringent and specific IT standards ever published. Security is delivered transparently. Clients have the ability to see FireHost's security at work. The secure customer portal provides up-to-the-minute statistics on the specific types of attacks made on your site, as well as where they originated geographically, by IP address. You can see trends over time and learn exactly how, when, and from where your web applications attract malice. A virtual tour of the portal is on our site: http://www.FireHost.co.uk/secure-hosting/scale-control Also, our service level agreement (SLA) commitment to ensuring and demonstrating exactly where your data resides is a huge differentiator between FireHost and other providers.

CTC: What is your customer support strategy?

Our service is managed – everything from the data centre car park to the operating system on our servers is FireHost's domain. Customer support is available through all the typical channels (online chat, ticket system, phone, dedicated account managers), but we deliver it with an above-average commitment to quality, proactivity, and expertise.

CTC: What is your personal definition of Cloud Computing?

Financially, 'The Cloud' is an infrastructure (hardware) deployment strategy aimed at maximising hardware resources and turning IT into a 'cost of goods sold' business resource, rather than a massive, balance-sheet-bloating capital expenditure. Practically speaking, cloud computing allows commercial agility to thrive, deploying, testing, scaling, changing and delivering web applications reliably in an uncertain business climate. This is what cloud does well. FireHost helps do it best, without the wheels of accountability and security falling off.

CTC: Does FireHost intend to pursue a channel partner strategy in the UK and what type of partner do you wish to engage?

Absolutely, we're enthusiastic about using FireHost's infrastructure and uniquely secure cloud hosting to help extend the business of our partners. We have two programmes available:

- Referral Partners – Referral partners are businesses or individuals who interface with IT decision makers. They receive a monthly commission for introductions that result in new FireHost customers. Simple as that.
- Value Added Resellers – VARs use FireHost's secure cloud hosting to deliver services to their end users. SaaS solution providers, software developers, creators of content management systems (CMS) and web enablement platforms, web design and development agencies, managed service providers, and other service providers who typically provide a support function for their clients enjoy the VAR programme's capabilities. A VAR's clients may or may not know that FireHost is a component of service delivery. You can guess what our preference would be, but VARs are absolutely allowed to 'white label' FireHost secure infrastructure when necessary.

More information on the partner programme can be found on our site: http://www.FireHost.co.uk/partners

CTC: What parallels can you draw between the US and UK markets?

Businesses in both markets are constantly looking for processes and systems to optimise their operation.
Technology is always available faster and better than it was before; this makes IT a constant target for evolution and change. Both markets are rich with innovative businesses looking for ways to deliver products, services (or even just their brand) reliably and on demand. New devices created every day make information more easily and immediately accessible, and many businesses are struggling with how to balance mobility and security. Two segments in particular that we serve very well are financial (payment method providers, ecommerce retailers, gateways and merchant acquirers) and healthcare (holding confidential patient information and helping deliver important healthcare information in real time), which have many similar concerns. Perhaps the most obvious concerns, related to the cloud specifically, are security (can an unwanted menace get to my files?) and accountability (where is my data, really, when it's 'in the cloud'?). As mentioned earlier, we built our infrastructure and operating model specifically to solve these two concerns head on.

CTC: Finally, if you could change anything in the computing world, what would it be?

Install bases would magically adopt or upgrade to the latest and greatest EVERYTHING. Seriously, it would be wonderful if legacy systems and processes ceased to exist. It would make more things interoperate and close unnecessary security holes, often caused by a lack of human intervention.

FireHost's UK website can be found at www.FireHost.co.uk or the team can be reached on 0800 500 3167.

### Are you in the Race?

By Mark Adams, CCO of Cloudmore

When our heroic Olympians started preparing for the great event that was London 2012, they had no doubt what the job was. Every one of them was in a race: a race to be the fastest, the highest, or the longest – it did not matter which. There will probably be much written about why we were able to be more successful than we could ever have imagined. There is no question that one key element was preparation. The athletes knew when they had to reach peak performance, they knew where they were going to race and they knew all about their competition, their strengths and their weaknesses, and so built a picture of what was needed to beat them. When I joined Cobweb Solutions as MD in 2003, there was no Race to the Cloud, there was no competition, there weren't any customers! When working in a nascent, very early adopter market, it is not so hard to figure out a sales strategy: 1. ensure those early adopters can find you, and 2. when they do find you, ensure that you have the best offer/story. I knew that in those early days of SaaS and Cloud, credibility was the key to adoption – not so important at the SMB level, but I wanted to, and did succeed in, attracting bigger customers. So we implemented a programme of Quality Assurance, purchased the best technologies we could afford and built the biggest pure-play Hosted Exchange provider in Europe. Back to the Race: in 2006 Amazon launched Amazon Web Services, in January 2007 Google launched Gmail with Docs and Spreadsheets, and Microsoft conceived BPOS (the early Office 365) in 2007. Granted, those early versions were not perfect, but whether you liked them or not, the Race was on. Now, here's the thing: if we agree that the large vendors – and I include IBM, HP, VMware, NetSuite and others – are racing to gain market share and customer billing relationships, then what is our response in the real world of cash flow, daily customer management and sales?
If we asked our brilliant British cycling team to compete in London 2012 on equipment that was 10 years old, with out-of-date training and nutrition plans, would we expect them to win? No. So when we, as an industry, propose solutions and commercial models that, to the average businessman, seem out of step with their business goals and with what they are reading in the mainstream (IT/Cloud) press, then we cannot expect to win. Like it or not, we live in a subscription economy; more and more we 'rent' the things we use. The cynics among you will say that this subscription economy is another name for credit, and that may be so, but there is no doubt that the new and emerging business models for everything from movie rental to car purchase are now a key component of economic recovery and growth. At Cloudmore we are driven by levelling the playing field so that all organisations can have the best IT – the same IT whether large or small – and by removing the barriers to purchase, adoption and use. We have built our own intermediation platform to aggregate the best in Cloud Services so that our users get the best, fastest, and most cost-effective solution to their problems and needs. We have developed 'boxed' marketing programmes, created micro-sites to generate leads, activated our vendors' MDF funds – we ARE in the Race, but sadly far too much of the IT channel are not. They continue to give poor advice, sell outdated solutions, deploy hardware and grimly hang onto margin wherever possible. ...far too much of the IT channel are not [in the race]. They continue to give poor advice, sell outdated solutions, deploy hardware and grimly hang onto margin wherever possible. Even worse is to give up: a 'cannot be bothered to even think' attitude of 'let's promote a service where I cannot differentiate, accept ridiculously low margins and even give my customer billing relationship away'. If that is not throwing in the towel, I don't know what is. I joined the IT industry after 15 years in the Print and Publishing sector. During that time I witnessed a revolution in digital pre-press and content creation. The winners of that Race were the startups who adopted the new technologies fast, and the advertising and media agencies who took a larger piece of the revenue pie. The losers were the businesses that failed to change. Printing prices have crashed over the last 20 years, yet there are still small and large printing and graphics businesses making a good living. It took 10 years for that industry to change. If the Race started in 2007, I reckon we are about halfway.

### Why the Swiss are not taking to the Cloud

By Mateo Meier, Technical Director of Artmotion GmbH.

Like many countries, Switzerland openly embraces new technology and the cloud has been around for some time here. While it is widely used in many facets of life, its uptake by large corporations and enterprises remains relatively slow – but why? Well, in truth there is no easy answer to this, but much of it lies with Switzerland's culture and deep-rooted traditions. As one of Switzerland's leading hosting providers we often look to learn more about new technologies before adopting them. On a practical level we like to know that these new methods have been tried and well tested before offering them to our customer base. Although cloud technology is improving all the time, evidence from the Swiss market still suggests a sense of unease.
Switzerland is of course famous for its secretive banking laws and strong sense of privacy, and it applies this cultural thinking to many of its business practices. As a hosting provider we are also keen to embrace this; however, it does present some challenges with new technologies. The cloud is a prime example of this, because of its open nature and the loose regulations surrounding the technology. Simply put, we Swiss tend to prefer proven and more tightly controlled technology. A recent survey seems to further support this theory, with many businesses claiming that they would only move to new technology if it has been reliably tested over a proven period of time. This is not, however, to say companies are not open to new technology.

Enquiries of Cloud Services

Artmotion, a data center providing server hosting to business owners, noted that 45% of people who enquired about Cloud hosting ended up ordering a dedicated server solution. This is largely due to privacy concerns. Cloud services are more "open" in comparison to dedicated server solutions. The difference is due to the open nature of Cloud networks, which are comprised of multi-location setups. In turn, it is easier for government agencies to request and receive access to certain files held in the Cloud. Dedicated servers, by contrast, sit in fixed locations, and in Switzerland it is very difficult for third parties to gain access to the information held on them.

Risks Still a Concern

In time we believe the cloud will become the standard choice of server here but there is still some way to go in this market. Many in Switzerland remain skeptical about the cloud, with many experts coming out and highlighting weaknesses in the approach. The main concern appears to be the technology behind server platforms. From a consumer point of view, some experts stated that providers were not as transparent as they could be, and they often found old server technology being used. This has been a concern, as Switzerland is one of the leading business hubs in the world and needs fast connections and reliability. Others of course, like ourselves, hold different views. We have seen great advances in this field and are able to offer a great solution to our clients.

Cloud Hosting Providers Entering Switzerland

Despite the recent criticism and slow uptake, a number of large providers have moved into the region with a core focus on cloud services. Jelastic, for example, invested huge sums in a new datacenter in Switzerland and focused heavily on cloud solutions. In time we believe the cloud will become the standard choice of server here but there is still some way to go in this market.

### Benchmarking Cloud Server Performance and Comparing IaaS Providers

By Robert Jenkins, Chief Executive Officer of CloudSigma

It's 2013 and there are plenty of IaaS Cloud Server / Cloud Hosting service providers to choose from in the market, and the logical approach to provider selection is to try to evaluate potential suppliers by performance, service and price. Price is probably the easiest metric to compare providers on, but how do you know you're really comparing apples with apples? In terms of providers' capabilities and the service aspects they subscribe to, you can get an excellent high-level picture from comparison tools such as that offered by ComparetheCloud.net. So once you've found a selection of providers that offer the type of product and the level of service you need, how do you then determine which of these is offering the best value in terms of price and performance?
Essentially you now have two avenues available to you in order to dig deeper: you can either sign up for trials from your short-list of providers and test the performance of your own applications on each cloud yourself – or – you can refer to benchmarking or performance reports generated by a third party. (Or indeed, you can do a combination of both.) At this point, let me deliver an early concluding statement to this blog: there is absolutely no substitute for testing a service yourself with your own applications against your own performance criteria! Whilst testing numerous clouds for your best fit may be the acid test, in the real world it's also very time- and resource-intensive. So if there is a way to narrow down your search to just two or three providers (who you can then test with), it makes sense to explore that option first. And that's where third-party benchmarking and performance reports can play their part. There is absolutely no substitute for testing a service yourself with your own applications against your own performance criteria! So what is benchmarking good for, and what do you need to be wary of?

Benchmarking

Benchmarking is primarily undertaken to establish a common, independent reference point between multiple options for fair comparison. Benchmarking methodology and granularity have improved dramatically in the last few years but challenges remain. These include many built-in biases, from selection bias to location bias and more. It really is a challenge. Benchmarking is as much about determining variability in performance levels as it is about absolute performance levels. Benchmarking should factor in price. You want the maximum actual computing performance possible for a given amount of money, or vice versa – not just simplistic quantity 'indicators' such as the amount of resources your subscription level provides you with. 3GHz of CPU from provider A is not the same as 3GHz of CPU from provider B in delivered workload terms. The best benchmarking will take the server instance through a full life cycle ("the whole process"), including creation of compute and storage each time, making each test iteration an independent sample; performance is about consistency as much as it is about actual level. It's a bit like ordering a taxi and not knowing what will show up on any given day – a Lada or an S-Class Mercedes. Full life cycle testing also ensures your cloud provider can't "cheat the system". Benchmarking needs to be done over time in a repeated fashion. Why? Because a sample of one isn't representative! You need to keep sampling over time, particularly when you are talking about public clouds where your location within the infrastructure will change with each sample test. Furthermore, benchmarking needs to assess Compute, Storage AND Network performance to get the full picture, not just one or the other in isolation. Other specific use case tests are important: performance for different operating systems, CPU options, database workloads, web servers, video encoding frame rates, etc. Look for what's relevant to you. Also of importance is from where and how the benchmark tests are conducted – data centre, network level or end-user browser level. This can greatly affect results for things like external networking throughput and responsiveness. The impact of geography on measurements, and its potential to skew network performance results, should not be underestimated.
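To make the "full life cycle, repeated sampling, price-adjusted" idea concrete, here is a minimal sketch of what such a harness might look like. It is illustrative only: the ProviderClient wrapper, its method names and the hourly price field are hypothetical stand-ins for whatever provisioning API or CLI your shortlisted providers actually expose, and the workload should be your own application's representative test.

```python
# A minimal sketch of repeated, full-lifecycle benchmarking with price-adjusted
# scoring. ProviderClient and its methods are hypothetical stand-ins, not any
# particular provider's API.
import statistics
import time

class ProviderClient:
    """Hypothetical wrapper around a provider's provisioning API."""
    def __init__(self, name, hourly_price_usd):
        self.name = name
        self.hourly_price_usd = hourly_price_usd

    def create_server(self):          # provision fresh compute + storage
        raise NotImplementedError

    def run_workload(self, server):   # run your own representative workload,
        raise NotImplementedError     # returning an operations-per-second figure

    def destroy_server(self, server): # tear down so the next sample is independent
        raise NotImplementedError

def benchmark(client, samples=10, interval_seconds=3600):
    """Full lifecycle per sample: create, measure, destroy. Repeating over time
    means placement within the provider's infrastructure varies between samples."""
    results = []
    for _ in range(samples):
        server = client.create_server()
        try:
            results.append(client.run_workload(server))
        finally:
            client.destroy_server(server)
        time.sleep(interval_seconds)
    mean_ops = statistics.mean(results)
    return {
        "provider": client.name,
        "mean_ops_per_sec": mean_ops,
        "stdev_ops_per_sec": statistics.stdev(results),              # consistency
        "ops_per_dollar_hour": mean_ops / client.hourly_price_usd,   # price-adjusted
    }
```

Run against each shortlisted provider over several days, the standard deviation is as informative as the mean, and the ops-per-dollar figure keeps the comparison price-adjusted rather than resting on headline GHz and GB.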
Beware of nonsensical reports, such as 'global views' of the best-performing clouds, which only make sense if you serve the globe from a single location! A good example of a company leading the revolution in benchmarking around cloud providers is Cloud Spectator. By implementing a strict testing region and combining this with specific use-case-based tests, the results they are producing are extremely relevant and insightful for customers. On the networking side, Cedexis have built an impressive data-gathering network that can show real performance levels of public clouds as measured by billions of data points on load speed, error rates and more, taken from within end-user browsers. Such results are directly relevant to anyone running public-facing, web-based services.

A sample report from Cloud Spectator on CloudSigma versus selected IaaS competitors

So what does benchmarking itself not do so well? A good example is that it doesn't test ultimate capacity or scalability – particularly burst scalability in bandwidth utilisation, or the provider's ability to rapidly scale up storage. Some things are too hard to test in simulated scenarios with the required degree of reliability and consistency. Benchmarking doesn't take into account the price differences between providers, which are the other side of the coin. A true comparison needs to look at price-adjusted performance levels. Some providers expose the ability to scale computing resources independently, yet benchmarking requires an exact server profile match between all providers. If someone is offering resource bundles, then each measured provider has to be matched to the same bundle specification, whether all that resource is needed or not. This masks the over-provisioning that more restrictive platforms impose on their customers.

Conclusion

The right type of benchmarking report or service can provide a valuable insight into general performance levels. You should focus on the performance criteria and metrics that are important and relevant to you and your application. Benchmarking reports should never replace actual testing, but they represent an opportunity to scientifically filter down your list of providers to find those which are in the right ballpark for what you need. Then it's a case of testing each provider in real-world conditions - Test, Test and Test again!

### Buyers of cloud computing need and expect a more consultative sales person

By George Constantopoulos – Cloud Practice Director at SBR Consulting

In the fast-paced environment of the high technology industry, consultative sales is becoming the occupation of the future. However, the future is here today. Now you're wondering, how does that fit in? Well, let's take a look at how the rapid change and evolving trends in technology are reshaping the global business landscape and altering the current sales process. The convergence of cloud, mobile, social and big data has established a series of innovation platforms with enormous potential. This, in turn, has developed a sense of urgency for sales professionals to adopt a new mindset, a new sales culture with a different set of behaviours to succeed in this evolving marketplace, which is also characterised by extreme competition. Consequently, preparing to adapt and staying as current as possible with technology advances requires a fresh approach.
To liberate the sales potential within their organisation and improve performance, sales leaders must undergo transformational change to manage the growth-enabling business model of cloud computing, which goes far beyond simply being "fashionable", as it is increasingly dominating the digital economy. Now, let's try to see why and how this is justified. Alongside pure-play cloud companies, most of the traditional IT vendors are introducing cloud solutions to meet the diverse and growing demands of the market. The great majority of business leaders who are responsible for delivering these new services have experience in selling products, such as hardware equipment, software licences, or on-premise applications. Contrary to that, and while moving at the pace with which their customers do business, cloud solution specialists follow a distinctive sales strategy. It is also essential to note that the pricing model for all aspects of cloud-based services is radically different to that of traditional IT offerings. Selling cloud-based solutions requires a more services-oriented approach to IT – in other words, a strategy that requires a more consultative relationship with customers. It's not so much about products anymore; it's rather about discovering customers' problems first and then recommending the appropriate cloud solutions that address those specific problems. It's not so much about products anymore; it's rather about discovering customers' problems first and then recommending the appropriate cloud solutions that address those specific problems. Emphasis should be placed on linking the cloud's enabling capabilities with the immediate business results that can be produced. The key to consultative sales is developing a mutual understanding of customers' needs and assembling the necessary tools and processes to solve them. Achieving success in the cloud computing space requires unique skills centred around effective sales and long-term relationship building, while ensuring customer satisfaction. Implementing a consultative sales methodology more effectively over time generates more sales, while considerably improving the sales cycle and driving increases in revenue. Cloud is more than just a different means of delivering IT applications; it also provides a new way of doing business. An equally important aspect is that Cloud computing enables businesses to rethink how they operate. That is why change management is one of the first areas that sales specialists must introduce to their clients when developing a dialogue around Cloud solutions. As businesses gradually move away from traditional IT infrastructures to more flexible hybrid models, the growth and demand for Cloud services will increase to new heights, and IT solution providers will be faced with diverse customer needs. As many solution providers continue to struggle with how to successfully equip their sales teams with the knowledge, skills and tools to sell Cloud solutions, the transition to building and scaling a consultative sales model can be painful, since a more comprehensive sales approach is required here: one that will help to delight their clients by demonstrating the benefits and advantages through practical applications that increase value and solve business problems. Additionally, there is a need to instil sales acumen in staff with a technical background, thereby enhancing the sales experience for the customer by adding more value and contributing to consultative sales.
Deploying Cloud solutions can sometimes be complicated, but convincing customers to buy them can be an even tougher job. While traditional sales generally involve a one-to-one relationship between the solution provider and the client, consultative sales is more of a team-to-team ritual. Due to vendor marketing hype, a considerable amount of confusion has been created as to what the "Cloud" is and does. In addition to that, it is worth noting that a new term has been introduced to the IT world: "Cloud Washing". That translates into the purposeful and sometimes deceptive attempt by a vendor to re-brand an old product or service by associating the buzzword "Cloud" with it. Essentially, certain marketing people can call anything "The Cloud" and get attention. Since the definition of the Cloud is somewhat fuzzy, many believe that anything delivered over the Internet is Cloud computing. The task of producing clarity and differentiating the value that the Cloud brings lies mostly with the business experts, who must possess fluency in consultative sales. Here is the key takeaway: just as change management should be introduced to an organisation that intends to exploit cloud computing, it is equally important for sales people who want to become proficient in offering cloud solutions to adopt a high-performance consultative sales culture through a change programme that focuses on creating successful sales habits, rather than following another basic sales training course.

### Keep it Secure… the Business of Cloud

By Daniel Steeves, Director at Beyond Solutions and Partner with James Caan at HB Prime Advantage

I tend to agree: this cloud thing isn't as secure as some think it is, simply because all clouds are not created equal. To be fair, though, I would have said the same thing about more than a few business and Government IT systems I've seen over the years. Security, much like insurance, is a matter of degrees and levels: it all depends on what you must do, what you want to do and what you can afford to do (and, like insurance, security can also be expensive). Security, much like insurance [...] all depends on what you must do, what you want to do and what you can afford to do... Depending on where you go or what you read, the issues vary but the underlying concerns about Cloud security seem fairly standard: concern about the lack of control over cloud-based environments; concern about access to data and systems from outside the walls of the business; combined with concern and uncertainty as to how to manage current threats, both targeted and random, let alone whatever comes next. Truth be told, much of this stuff is, relatively speaking, straightforward if not easy once you've done the work, determined your needs and selected the correct, trusted partner (who has passed the requisite due diligence which any business with any significant level of security requirements or concerns should insist on). And you do need to look closely… can't imagine we'd find too many cloud or other IT managed service providers, or cloud-based software or things-as-a-service, that are offered or marketed as less than secure… but what I said at the start about clouds applies equally to security offerings: not all are created equal. To be fair, the security needs of, say, a country-wide building supplies chain are clearly neither as extensive nor as critical as those of an NHS Trust or an insurance company.
"A move from self-hosted to the cloud typically improves security" The fact remains, though, that moving systems and data from a less-than-secure facility in the corner of the office to the cloud does typically provide for better security than was in place before. It is the business and expertise of that Cloud provider, who is meant to have the expertise, the processes, the mechanisms and the economies of scale to cover the physical, systems and human aspects of security. Beyond Physical and Systems Security But, according to the industry and the numbers, someone might still get in… locks and alarm systems don't always foil the clever burglar and so goes it with information security. It could also be that someone from within is trying to take something out, either maliciously or inadvertently (and, after all, the weakest links in the overall enterprise security chain are often the human ones). To say that things have changed is an understatement: from the nature and magnitude of information and data held by companies and Governments to the methods, motives and sheer numbers of attackers looking to access it, it is a difficult environment to fathom for many, let alone manage. Even for those succeeding in doing so, those who are clever enough to in-build security rather than continue to treat it as an add-on, the evolution of business technology is occurring at a rate that is stretching current strategies to their breaking point. With Benefits come Risks (duh!) Businesses are always looking to technology to provide an edge, to help earn a little more or spend a little less and to move or keep ahead of the competition who they perceive to be doing similar things. This reliance on and the contribution of technology to the bottom line (and in some cases, the survival of the business) is most often rewarded when a business transforms form and function to fully take advantage of the capabilities being delivered. A funny thing happened on the way to the cloud, though: just as these evolved technologies are delivering the capabilities to derive some serious value from those masses of information, the combined use cases of cloud and mobile tend to involves scattering that valuable information, that newly discovered asset, feeling like it is almost to the wind! Three Things If the information and systems merit protection then logic dictates [...] that even if the data gets into the wrong hands, that the data is worthless. To me, that means end-to-end encryption. Data is suddenly everywhere, as are the (increasing) numbers of people and access points and I believe a radical, current view requirements-focused redraft and rethink of security strategies is needed for a majority of businesses. Guarding the perimeter is well and good but provides little value for things outside of it: so, guard against but accept the potential – or even the inevitability of a breach – which takes us to step two. If the information and systems merit protection then logic dictates that you ensure that even if the data gets into the wrong hands, that the data is worthless. To me, that means end-to-end encryption. 
"Encrypt data as generated, if that is what you need to do" Now, with complete awareness that some of my more in-depth security colleagues might find the following a little too simplified (and I open the floor to their learned elaborations in the comments below) I believe that building a modern "infosec" strategy architecture (designed so as to flex, scale and adjust with the needs of the business) is a combined effort of three things taken from three focuses of three different views, all circling around the data of the business to deliver effective, audited and controlled access to authenticate users, devices, systems and locations: A combined understanding of the requirements from the Business, the Technical and the Security camps, considering: Systems, Data and People, each of which access, move and store data, which should be from generation and for its lifecycle, to be Encrypted with a rigorous Key Management mechanism backed up by a well-formulated, well-communicated and well-enforced Security Policy. After all, if someone manages to get past your secure facilities which house your secure systems managed by your security-cleared people there it is still another matter to crack 256-bit AES. There are a lot of corrupt little minds out there: if your system is connected and they want to get at it, then they will find a way. In closing, a quote of my younger self taken from my weekly “Computer Corner” column in the business section of an Ottawa newspaper in the late 80s discussing personal computer security “There are a lot of corrupt little minds out there: if your system is connected and they want to get at it, then they will find a way”. So, things may have changed immeasurably but they really haven't changed that much! Daniel is a Director at Beyond Solutions, a Thought Leader for Compare the Cloud and Partner with James Caan advising small business at HB Prime Advantage. You can reach him at steeves@beyond-solutions.co.uk, and follow him on Twitter @DanielSteeves. ### Does utilising the cloud expose us to more risk? By Chloe Clarke - Head of Global Business Development at Razor Thorn Security The launch and continued rise of the Cloud in recent years would be difficult to miss, yet surprisingly widespread apprehension towards this new service exists within our business culture. The Cloud holds so much potential due to its flexibility, accessibility and cost saving capabilities, so I want to explore what is causing this uneasiness towards Cloud services, and what key questions are affecting people's decisions to adopt and leverage it? From an outsider's perspective, (I formerly worked in retail and educational sales roles, so I'm not from a technical background) it became clear relatively early on in my new role that many of the statements Cloud vendors would make are not always accurate, and there also seemed to be a stigma surrounding the Cloud. After speaking to a variety of individuals considering Cloud adoption, they all appeared to share the same concern – risk. It was felt that the level of uncertainty following implementing cloud services represented too great a risk to enable them to go ahead with their project. With first-hand experience working in the education and retail sectors, I understand what it's like working in a highly regulated environment. 
We would be forced to recall and amend, immediately, any incorrectly advertised information that was being used to sell an item, for if it were found to be false it would cause reputational damage and revenue losses, and bring the possibility of lawsuits. Whilst this rigorous application of regulation is common practice, in contrast there seems to be a rather blasé attitude towards the amount of misleading advertising which a number of Cloud companies provide regarding the security of their services. For example, I was handed documentation by a company stating that their Cloud held PCI DSS accreditation; however, following a few questions about this from me, the individual modified their statement to: "We work in a PCI manner"… It was felt that the level of uncertainty involved in implementing cloud services represented too great a risk for them to go ahead with their project. Getting to the truth of Cloud security can be a minefield of ill-informed sales jargon, which is at best confusing and at worst misinformation, and which serves to put businesses off completely. With this in mind, I rounded up a selection of security industry experts to find out the realities of risk involving the Cloud, whether those risks can be mitigated, and if so how.

Mark Bailey – Speechly Bircham LLP – Lawyers

"Using Cloud does not automatically expose a business to more risk: the big Cloud providers can potentially offer a level of information and physical security far beyond that of most businesses. The key to Cloud is that "you get what you pay for". Cloud services are all very different, using different infrastructure models as well as being set up and managed differently. This means that evaluation of the service (correct due diligence, intelligent risk review around the actual use that the business requires for the service and properly tailored legal contracts, where possible) is critical. The known risks can then be identified and mitigated, for the appropriate price. Risk is often created where there is a mismatch between the expectations that a buyer has of a service and what the provider is prepared to offer, or where adequate due diligence (both internally by the buyer of its own requirements, and of the provider) is not conducted. The attitude to risk on Cloud also needs to be consistent with the business' overall risk management processes internally."

Frank Jennings – DMH Stallard LLP – Lawyers

"Using Cloud services need not expose you to more risk. In fact, if you haven't embraced proper DR services on your existing IT, using Cloud services could actually reduce your risk. You must spend wisely on Cloud services to get the resilience and security you need. Also, you must check the liability exclusions in the Cloud services contract to make sure you're happy all the risk isn't on you as the customer. In particular, check if the provider has excluded liability for loss of data and clarify whether your remedies are restricted to service credits."

David Prince – Hibu plc. (Formerly known as Yell)

"More and more business leaders are making decisions on when Cloud services will be adopted and in what capacity. It is our responsibility as advisors to the business to ensure appropriate and independent risk assessments are being performed. With the stigmas attached to Cloud, this is sometimes easier said than done.
On the one hand you have the undeniable business efficiencies, such as cost savings, agility, and rapid time-to-market, but on the other hand you're faced with the inherent obscurities of any Cloud model, which must be identified, understood, communicated, and managed. As a strong advocate for simplicity (where possible) I believe a lot of the fear, uncertainty and doubt can be overcome in a relatively straightforward manner, even with the abundance of complexities inherent in this significant shift in IT service delivery. Go back to basics and cover the pre-requisites – contextualise your risk, know your data, and know your business! This will enable you to better understand your organisation's risk appetite and how it is being impacted by Cloud."

Steve Carroll – CompareSafe – Security Verification Assessors

"Cloud security has always had a level of the unknown about it. While it is true that the Cloud can be a very secure platform, in the same breath it is worth remembering that if set up or used in the wrong way it can open its potential users up to a world of security issues. One article even compared it to "picking a dog with the least fleas". That being said, companies should not be afraid of moving to the Cloud as long as they do some serious checks to know exactly what they are buying and, if in doubt, get it independently checked."

Adam Moss – Razor Thorn Security – Information Security Consultants

"If Cloud is approached in the correct manner and the security is done properly, it is possible to minimise the risks of moving to the Cloud. The flexibility that Cloud technology brings means that there are many creative ways that Cloud services can be provided with security as an integral part of the set-up. The key to choosing a Cloud provider is to ask as many questions as it takes for you to be sure that the Cloud service provider does what they say. Ask for evidence of the level of security and, if compliance is an issue, definitely ask to see any relevant documentation to certify the compliance. If you have a security professional on your team, whether in-house or outsourced, get them involved in the process, ask their advice and, if possible, get them talking to the Cloud provider."

Nick Prescot – FireHost – Cloud Provider

"Using the Cloud model for computing is no different from moving to a different dedicated environment… they all have different risks that are inherent in moving data from one location to another. Also, it is a case of caveat emptor, in the sense that the customer needs to understand how the data is managed, hosted and, in some cases, secured. The question of data sovereignty and compliance requirements should be the starting point of any risk assessment, along with how the expectation of data privacy and protection can be managed."

Dave Foster – ProAct – Cloud Provider

"I think the first thing is to look at what risks you have today. I would argue that, as a Managed Cloud Service provider, we will be a lot more secure than customers, as we are audited so have to operate to certain standards (such as ISO 27001 and PCI) and also have to prove that we vet and police all our staff. How many customers do that? Assuming you decide to take a Cloud service, then it really depends on the service taken and the type of provider. As an example, if you took IaaS (i.e.
hosted VMs and storage – the most common Cloud infrastructure service) and selected a public provider like Amazon or Rackspace, then I would argue you are exposed to more risk. Their models are designed to protect their infrastructure, not their customers' VMs, applications and data. If you selected an Enterprise provider like Proact, who provides an up-to-the-OS service and who proactively monitors and polices the security of your VMs, data and applications, then I would argue that we go a lot further than most customers. See our security FAQ."

The underlying message is clear: utilising the Cloud is risky if a hasty, ill-informed leap is made during procurement. The Cloud ecosystem is still maturing and we are not yet able to transfer all services to the Cloud; however, there are many situations where it is potentially safer than in-house IT if implemented correctly. If a vendor offers a Cloud service which provides a raft of benefits and features, it falls to the business to ascertain whether or not this is proven. Turning to experts and industry counsel for guidelines and advice is invaluable, given the complexities that procuring Cloud services entails. Getting past the myths surrounding the Cloud is crucial to encouraging widespread uptake, but with a proportion of the general public still convinced the Cloud is of the meteorological variety – many believing our data is floating around in the sky – demystifying Cloud services presents no easy task. Although the Cloud can be difficult to comprehend, its offerings are still growing, becoming established and being gradually accepted by consumers. However, this is hindered by a misunderstanding of the product, compounded by misinformation driven by some desperate or perhaps just plain ignorant sales teams. I want to thank everyone who commented on this article: Hibu plc, FireHost, ProAct, CompareSafe, Razor Thorn Security, Speechly Bircham and DMH Stallard.

### Is IBM "a Disaster in the Making"?

There's been a lot of commentary and criticism in the news recently about IBM's business strategy when it comes to cloud computing. A recent example of this is Seeking Alpha's published article entitled "IBM: a Disaster in the Making". Whilst it's true that most criticism towards IBM's cloud strategy comes from financial investment publications or market commentators, there is definitely an overspill of sentiment from such commentary into the world of business IT - and into how CIOs, CTOs and technology decision makers then perceive the vendor in question. IBM are certainly big and bad enough to ride both the ups and downs of positive and less-than-positive commentaries, so leaping to IBM's defence is certainly not what this commentary is about. Rather, we have spotted a handful of inaccuracies in the reporting that demonstrate an underlying misconception in the market about what cloud computing is, and also about how particular vendors are strategising in order to set the agenda and compete in the global cloud computing market. If there's any vendor or cloud service provider that's been misrepresented, we'd be only too pleased to allow them to use ComparetheCloud.net as a platform for setting the record straight or telling the business IT community where they are heading. We're here to assist the business IT sector to evaluate and compare providers, and provider strategy is a big part of any selection process.
So IBM copped some flak, but these are the specific areas we picked up on and wanted to discuss:

Claim 1: IBM's cloud business model is based on selling 'on-premise' computing – which includes "private" clouds, and ultimately ignores the huge revenue opportunities of utility-based public cloud services.

It's clear to us that IBM actually has a huge stake in both types of cloud provision. They can provide you with standard off-the-shelf x86 Intel servers which serve as cost-competitive "building bricks" for your own corporate data centre requirements, or for service providers building out either private clouds (dedicated to a particular customer) or public clouds (serving multiple customers). Moving up the IBM system portfolio, they also have their "Data Centre - or Cloud - in-a-box" type of pre-configured turn-key solution – Pure Systems. Again this can be used to deploy either private or public clouds, but conveniently it comes with the IBM software stack pre-deployed, which is a decent value-add for service providers looking to address real-world corporate computing requirements. Furthermore, there is IBM's SmartCloud, which is itself entirely a public cloud operated by IBM themselves, and a direct competitor or alternative to the utility-based Infrastructure as a Service (IaaS) models of Amazon, Microsoft and other international public cloud plays.

Claim 2: IBM can't compete with cheap rented computing power. IBM need to sell a box.

Rented computing power refers to the ultra-competitive IaaS market, which many people would like to believe is at the core of cloud computing - and subsequently determines the winners and losers in the field. In truth, IaaS is simply computing hardware virtualised. IBM's hardware business has never been about hitting the lowest price point. In the same fashion, their Cloud services business is not about hitting the lowest price point. Through SmartCloud, IBM offer rented computing power, and whether it's cheap or not is almost immaterial; the clue is in the "Smart" component of the name. IBM's strategy is to differentiate with value-add, and this is largely based on providing access to their expansive software portfolio. So IBM see the cloud opportunity not in chasing the lowest price point but in delivering business enablement through a more functional product that cares less about GHz and GB, and more about solutions for businesses. And at the end of the day, you can have the lowest price on compute power, but you still need to run some apps to get anything done!

Claim 3: IBM has ceded the low-margin public cloud market to others, and has focused on the private cloud market, which is a strategy doomed to fail.

This is a close reiteration of claim 1, where we countered that IBM is directly and indirectly involved in both public and private clouds. However, it's the claim that vendor strategies targeting private cloud are "doomed to fail" which we'd call into question. We know a lot of cloud and managed service providers – large and small - that are focussing on private and on-premise clouds because there is a huge opportunity there. Sure, private clouds may not have the on-tap scalability or low price points of public clouds, but this ignores the flip side, which is greater control, visibility and accountability. Some businesses – frequently very large businesses - have to choose private cloud for reasons which may be regulatory or simply cultural.
In fact, we'd wager that any business that historically bought IBM ThinkPad laptops will (or will soon) have a private cloud solution somewhere in their IT delivery model.

Claim 4: IBM's long-term prospects are troublesome because - as venture capitalists will tell you - entrepreneurs and start-ups are all building their IT infrastructure in the public cloud.

Start-ups and entrepreneurs may well be the stars of tomorrow, but are they reflective of the rest of the business IT community and what it's up to? Start-ups are often operating on ultra-tight budgets and the next Facebook certainly needs to be able to scale on demand – so yes, public cloud makes a lot of sense for these businesses. As businesses mature, their computing requirements become a lot more complex. It's less about finding the lowest price per GB and more about ensuring enterprise levels of performance and security, and ultimately deploying IT which improves efficiency and delivers ROI across the business. Public cloud operators across the world recognise this, which is why the more savvy providers are differentiating themselves by attaining compliance certifications, focussing on security, providing extra layers of in-built redundancy, tapping new markets, or investing in the software stack and providing platforms (PaaS).

Claim 5: The emergence of the public cloud materially changes IBM's competitive environment.

The emergence of cloud has materially changed everyone's competitive environment. There are plenty of big established businesses which were built on the back of traditional hardware and/or software sales, which now have to adapt to "the cloud model". The cloud model certainly allows new entrants to gain traction quickly and gain market share. And indeed it's certainly fair to say that the cloud in general encourages competition and innovation. The key here, though, is not chasing the lowest price per GB; the key is innovation and delivering real-world business solutions for real businesses. Whether IBM is delivering innovation and real-world business solutions for real businesses – well (as a technology professional) you are the best judge of that.

### The Ultimate Guide to Cloud Migration - Part 3

By Simon Mohr, CEO at SEM Solutions

This is the final part in a three-part series, taken from 'The Ultimate Guide to Cloud Migration' by Simon Mohr and Databarracks. Did you miss Part One or Part Two? Alternatively you can download the full whitepaper here.

What good looks like

Migration is a complicated process, but it's no dark art, and overcoming some of the most common (and most severe) pain points is often just a case of having prepared sufficiently. And yet there's often a reticence in the mid-market to regard migration as the critical process it is, and therefore little to no concept of best practice. This reticence is not present at the very top end of the market. The regional nuances and idiosyncrasies (from both a technical and process standpoint) make migration a dizzyingly complex exercise for global companies with disparate workforces and IT environments. Consequently, larger organisations or heavily regulated industries often have very fixed ideas about how migrations should be conducted, and plan for them accordingly. Processes must be bulletproof, with clear, accountable owners assigned to every aspect. Migration at this level is necessarily laborious, and the organisations carrying these projects out simply can't afford the catastrophic disruption that failure would cause.
This diligence is a matter of necessity, and is also reflected in the accompanying documentation. One of the main migration guides used by large organisations is nearly 600 pages long and distils hugely complex moves, such as consolidating servers from multiple geographical data centres, into standardised processes. These rigours permeate the entire migration strategy – it's not unheard of for companies to spend upwards of 200 man-days on the data-gathering process alone, and this kind of attention to detail is reflective of what is needed at the lower end of the market, although at the moment it's completely absent. It's rare for smaller companies to even anticipate migration projects, let alone have a firm grasp of best practice. Obviously smaller companies don't merit hundreds of man hours of planning – but efforts must be scaled down in a relative fashion rather than starting from scratch and meeting needs as they arise. SMEs would be well advised to follow the lead of larger organisations and establish some standardised processes relevant to their size and complexity. Doing so not only saves money on initial discovery and scoping exercises with migration consultants, but aligns the outcome of the migration itself to longer-term business need.

Conclusion

Migration doesn't just happen overnight. Not without planning, at least. In fact, Bloor Research's 'Data Migration in the Global 2000' survey found that in migration projects, insufficient scope definition caused:

- over 50% of respondents to run over budget
- over two-thirds to run over timescale

A failure to plan sufficiently isn't necessarily indicative of technical difficulty. Quite the opposite, in fact: the biggest migration challenges are usually a question of perception. Many of the customers I work with are regularly surprised when the migration process is painless; there's frequently an expectation that it will be disruptive and costly and take too long. This attitude almost always stems from some common misconceptions that can cumulatively have a considerable impact on the success of a migration project:

'We don't need no administration' – Value your system administrators. Whilst it's true that many organisations are reducing their internal IT function in favour of cloud-provider SLAs, maintaining a clear view of your IT systems and data will pay for itself once it's time to migrate.

'Migration is a purely technical challenge' – Leave the technical hurdles to the migration partners – that's their speciality. Migration is primarily a business challenge and the surest indicator of success is clear process and open communication. Establish some best practice; there's no one-size-fits-all approach, much less an industry-standard methodology.

'Downtime is an opportunity for change' – Don't combine upgrade programmes or big systems changes with migration projects. There's a reason scheduled downtime is 'scheduled' – it's time set aside for intensive work with known entities. Limit the variables you're working with as much as possible.

'Our service provider will complete the migration for us' – This obviously depends on what is being migrated, but many organisations are still surprised when cloud providers do not offer migration services. As IT has become more complex, dedicated migration partners are frequently required.

Ultimately though, the best piece of migration advice for anyone who owns a digital environment is not to let the need arise unexpectedly. Position it as an expected event under the total cost of ownership.
You'll sidestep the financial and operational shock of discovering it as an urgent need and encourage better day-to-day data management policies. This attitude is steadily improving. Organisations are getting smarter about migration, and many are now choosing to work with experienced migration partners after recognising it as the self-contained process it is. Doing so almost always makes the movement of data, applications and infrastructure a non-disruptive process. You can download the full whitepaper here.

### The Ultimate Guide to Cloud Migration - Part 2

By Simon Mohr, CEO at SEM Solutions

This is the second part in a three-part series, taken from 'The Ultimate Guide to Cloud Migration' by Simon Mohr and Databarracks. Did you miss Part One? Or you can download the full whitepaper here.

A to B in Six Easy Steps

1. Planning

Planning for a migration is not significantly different from planning for any other major IT project, particularly if it's already built into the total cost of ownership. Companies will start with an initial need and then outline a roadmap towards the solution, generally composed of:

- starting conditions
- desired end results
- processes of change
- testing (of both functionality and performance)
- reporting to measure success

However, given the intensive (and intrusive) nature of migrations, migration partners also require a degree of transparency and disclosure not usually needed in other projects – of both the reasons for the migration and the initial shape of the IT environment itself. For instance, certain initial conditions, such as ageing pieces of infrastructure, a lack of disk space or recurring downtime, must be factored into the migration strategy. These types of issues can often be planned around, but failing to highlight them at an early stage could unexpectedly impact the image quality or the ability to complete synchronisations successfully.

2. Getting Access

Actually getting access to the company's systems can be quite complicated, depending on the day-to-day hygiene of the client's IT environment. It relies on strong password management policies and a solid understanding of access criteria to routers and servers. Essentially, they need a bunch of keys to hand over and they need to know what each key does. This is usually a good indicator of how well the business knows its own systems and can foreshadow potential problem areas later in the migration. Once those keys have been handed over, the migration specialist will then need to perform a gap analysis to measure how closely the customer's idea of their IT estate aligns with reality. For instance, it's fairly common for a company to employ a number of different developers over the lifetime of a single server. During that period, each of the developers could have been building scripts and uploading backups independently of one another, without ever informing the wider business. So, out of a hypothetical terabyte of storage, ¾ of it could be taken up with archived zip files simply because there was nowhere better to back up that data at the time. This is a fairly common situation, and it's not at all unusual for companies to have a very poor understanding of their own environment – particularly given the shrinking numbers of technical and system administration jobs in modern IT departments.

3. Proof of Concept

Essentially, the proof of concept stage is a dry run of the migration model itself, executed on a sample set of data, without hitting the 'Go Live' button at the end.
3. Proof of Concept

Essentially, the proof of concept stage is a dry run of the migration model itself, executed on a sample set of data, without hitting the ‘Go Live' button at the end. It can be configured around a single transaction or event just to demonstrate the functionality of the migration path, or be more complex depending on the number and variety of systems being moved. If the move isn't complicated, this can be a very fast piece of due diligence. Alternatively, the proof of concept stage can be a good point at which to align expectations with what is possible given the timeframe, budget and resources allowed. It can become a stage at which the migration partner says ‘We're not going to commit to complete this – what we're committing to is establishing whether or not this is possible.'

4. Carrying out the migration – Imaging or Domain-by-Domain?

Historically, migrating a large amount of data was a case of putting a disk drive in a jiffy bag and calling the courier. Obviously, the kind of downtime this creates isn't compatible with the high-availability culture of today's business environment. Consequently, there are two fundamental ways to migrate non-disruptively, which can be loosely summarised as either moving all the data at once in as short a time as possible, or moving discrete elements of the environment in the background.

Imaging

On the surface, the first method sounds like the obvious choice; a simple copy-and-paste process completed outside of business hours. Uninformed companies tend to presume they can shut their systems down in the legacy environment on a Friday, copy a clone image to the new environment over the weekend and press ‘Go' at 6am on Monday so everyone can pick up their emails. In reality, there are many factors that could affect the quality of the transfer, the consequences of which might not emerge until the destination. This is also the only method with scheduled downtime outside of the Go Live phase – you can't take an image of a live environment, and the prospect of shutting everything down can be alarming. For many organisations, hitting the ‘off' switch isn't something they've done before.

Domain

Domain migrations are more reliable than imaging, but they take significantly longer. By moving across one system component at a time – such as an exchange server – the chances of service outage are significantly reduced. However, this also means the customer has to maintain two environments simultaneously, which can be costly, particularly if the domain is part of a large infrastructure. The transfer rate is also slower as the data transfer tends to happen over the internet, with most migration providers allowing around 1 Gb/hour. Managing this is a skill in itself, because if the migration can be completed with no impact on existing services, it can be carried out at any time in the background without disrupting the IT environment.

A good analogy is to think of your IT environment as a house. Migrating data using imaging is the equivalent of picking up the house and moving it to a new location in its entirety. That house is highly unlikely to reach its new destination without a single brick loose or a piece of furniture out of place; it is probably going to need some tweaking and spring cleaning once it arrives. It's certainly quicker, but during the process you don't have anywhere to live. Domain migration is the equivalent of moving the house room by room; much slower, but far more attentive to the individual parts.
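To put a very rough number on the trade-off between the two approaches, here is a back-of-the-envelope sketch of my own. It is not from the whitepaper: the one-gigabyte-per-hour rate simply echoes the figure quoted above (read as gigabytes for illustration), and the local copy rate used for the imaging window is an assumption chosen purely to make the comparison concrete.

```python
def domain_transfer_days(data_gb: float, rate_gb_per_hour: float = 1.0) -> float:
    """Days needed to trickle data over the internet in the background."""
    return data_gb / rate_gb_per_hour / 24

def imaging_window_hours(data_gb: float, local_copy_gb_per_hour: float = 200.0) -> float:
    """Rough length of the scheduled-downtime window needed to copy a full image."""
    return data_gb / local_copy_gb_per_hour

if __name__ == "__main__":
    estate_gb = 750  # hypothetical estate size
    print(f"Domain-by-domain: ~{domain_transfer_days(estate_gb):.0f} days in the background")
    print(f"Imaging:          ~{imaging_window_hours(estate_gb):.0f} hours of downtime")
```

Even crude arithmetic like this makes the point: domain-by-domain work is measured in weeks of background activity and parallel running, imaging in hours of scheduled downtime.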
It's very easy when doing a migration to trip up along the way – to do things and not consider the unintended consequences of a particular action. This is resolved by marrying up suitable IT skills with a more business-focused outlook – understanding the business impact of the technical decisions being made. A migration is a very specific, contained process with long-term implications. Having someone with a foot in both camps (business and technical) is preferable.

5. Testing

It's surprising how frequently companies don't anticipate the possibility that the migration might not have worked perfectly first time round. For many, the testing phase is regarded as just a formality – similar to a car mechanic driving around the block to demonstrate that everything works perfectly. It's not rigorous enough. Testing should be a point at which the migration is given a chance to fail; better in the testing phase than weeks after completion, when the migration specialist will need to be called back in, usually at a cost to the business. In fact, testing can be significantly more work than the actual move itself, particularly if you're migrating complex websites with many points of entry, types of interaction and different user groups – those interdependent processes all need to be tested thoroughly. Ideally, organisations will have a shopping list of required tests to carry out at the other end, though again, the loss of internal IT skills can make this unlikely. Ultimately, it entirely depends on size, preparedness and complexity. Testing might require as little as half a day or, at the other end of the scale, up to three months for an organisation with thousands of clients who can't risk downtime. This is also perhaps the only stage at which there's a risk of the customer being too cautious, so it's important to make a clear distinction between 'testing' here and 'staging' in more traditional projects. Testing in a safe, staged development or test environment is an integral part of most IT projects, and so companies are often surprised to find out this is not the case with migrations. The testing phase, though not taking place in a discrete environment, fulfils this need completely so long as adequate resources are devoted to it. Carrying out additional staging as a precautionary measure is a duplication of effort.

6. Go Live

The go live phase is a very important part of the process and the only time (except for imaging) where scheduled downtime occurs. It's at this point that all the file structures and databases are verified and checked for changes (a minimal sketch of this kind of automated check appears at the end of this article). Many organisations are tempted to go live very late at night on a weekend because of a presumed sense of security – if something goes wrong, there's plenty of time to fix it before work starts again on Monday. In reality though, this is often a mistake, for two reasons. Firstly, if the stages prior to the go live point have been completed exhaustively, no more errors should occur. Secondly, in the unlikely event something does go wrong, it's incredibly difficult to get in contact with the relevant web developers and engineers in the middle of the night at the weekend. General best practice dictates that the optimum time is around 7am on a Monday morning, leaving a good amount of time to rectify any issues without causing much disruption. Next time we'll be covering how some of the world's biggest migrations are done, along with some of the most common migration misconceptions. You can download the full whitepaper here.
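As referenced in the Go Live step above, here is a minimal sketch of what an automated post-migration verification pass can look like. It is illustrative only and assumes the simplest possible case – two file trees mounted side by side at hypothetical paths – rather than anything prescribed by the whitepaper; databases and application-level behaviour need their own equivalent checks.

```python
import hashlib
from pathlib import Path

def checksum(path: Path) -> str:
    """SHA-256 of a file, read in chunks so large files don't exhaust memory."""
    digest = hashlib.sha256()
    with path.open("rb") as handle:
        for chunk in iter(lambda: handle.read(1024 * 1024), b""):
            digest.update(chunk)
    return digest.hexdigest()

def compare_trees(source: str, destination: str) -> list[str]:
    """List files that are missing or differ between the source and destination trees."""
    problems = []
    src_root, dst_root = Path(source), Path(destination)
    for src_file in src_root.rglob("*"):
        if not src_file.is_file():
            continue
        dst_file = dst_root / src_file.relative_to(src_root)
        if not dst_file.is_file():
            problems.append(f"missing at destination: {dst_file}")
        elif checksum(src_file) != checksum(dst_file):
            problems.append(f"contents differ: {dst_file}")
    return problems

if __name__ == "__main__":
    # Hypothetical mount points for the legacy and migrated environments.
    for issue in compare_trees("/mnt/legacy", "/mnt/migrated"):
        print(issue)
```

Checks like this give the testing and go live phases a concrete chance to fail early, which is exactly what the steps above argue for.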
### The Ultimate Guide to Cloud Migration - Part 1

By Simon Mohr, CEO at SEM Solutions This is the first in a three-part series, taken from ‘The Ultimate Guide to Cloud Migration' by Simon Mohr and Databarracks. Next time we'll break down the six easy steps of the migration process. You can download the full whitepaper here. The pace at which we've come to regard cloud technologies as an everyday utility is staggering. Even the language we use has changed. Looking back only a few years, people would speak in terms of movement; of the journey towards cloud. Today, it's often assumed we have arrived at some new technological frontier, and are instead occupied with making improvements – making things bigger, faster and more efficient. In truth though, no one makes this journey just once. Moving data is, and always will be, an inherent part of owning a digital environment – particularly if part of that environment is hosted by a third party. It's a universal stage in the technology lifecycle; whether you're a multinational company or a two-man start-up, migrating data between environments is one of the most important processes to get right. It can also be deceptively complicated, and something the uninitiated tend to underestimate. The paper ‘Data Migration in the Global 2000' from Bloor Research found that only 16% of data migration projects were delivered successfully (i.e. on time and to budget). On paper it's just a big copy-and-paste job, so why do so many migration projects fail to match up to expectations? For a start, migrating to the cloud is a remote, rather than a physical, process; it's rarely a case of simply moving data from point A to point B. Instead, cloud migrations often arise as a consequence of wider business drivers like corporate growth or restructuring, and so must be absorbed into a wider business strategy while still being regarded as a single, contained process. The huge and widely broadcast benefits of moving away from a traditional dedicated environment have generated an atmosphere of urgency around cloud adoption; success stories of companies becoming exponentially more agile, flexible and efficient are common, and the rest of the market wants a piece of the action. However, organisations must be careful this enthusiasm doesn't translate into impatience with the migration process as a whole. Remote migration isn't inherently risky, but it can be a disruptive process that should be held in equal regard to the expected benefits of the destination environment. Moving to an untested platform without first establishing how suitable it is for the immediate business need (and how compatible it is with the existing environment) is like going on holiday without booking a plane ticket or packing your bags. Hasty migrations risk two major consequences:

- the migration process is ill-planned and therefore likely to encounter difficulties
- the destination environment fails to live up to the unreasonable expectations placed on it

Working with a trusted migration partner sidesteps the majority of potential problems. Cloud computing has proved to be a disruptive technology, and as systems and infrastructure have become more complex, the operational skills sitting behind them have become increasingly compartmentalised.
‘Migration' as a contained skillset has emerged as a part of this process, and as we'll go on to see, successful migration projects are frequently a consequence of good planning, sufficient resources and, perhaps most fundamentally, the anticipation of the migration process as an intrinsic part of cloud adoption. The ‘grudge purchase' and valuing your data Before we get in to specifics, let's take a look at one of the main causes of tension during migration projects – cost. Migrations, particularly for small and medium sized enterprises, are rarely budgeted for. This can be for two reasons: either the need to migrate arises suddenly (perhaps due to failing hardware or immediate change in business need), or more likely the process is assumed to be included as part of the service provider offering. Whatever the cause, it's a mistake to regard the migration process as an inconvenience. Doing so undermines the value of your business critical data. In countless annual IT surveys, organisational data is universally regarded as the single most important company asset. In fact, respondents to a Computer Weekly survey reported ‘Data Protection' as their top security priority in 2013. So why does this attitude suddenly evaporate once the data is in motion? The remedy for this is relatively simple: to build in migration as an expected part of the total cost of ownership. This doesn't mean simply allocating more budget to service providers though: for the majority of them, this is no longer a core competence. Ten years ago budget hosting companies might have been able to move a simple website onto their servers with relative ease, but IT is too complex now - there are too many concurrent services and interdependent systems. Service providers are getting much savvier about this too; many have underestimated this complexity and had their fingers burnt in the past. The same dynamic is true of internal IT teams. Organisations might be tempted to cut corners and reassign internal technical staff to oversee a migration project because, on paper, the skills look quite similar. But technical aptitude is only a single prerequisite of successful migrations – it's not simply a case of following a set of instructions and hitting the ‘go live' button at the end. It's a more nuanced business process that demands a suite of other skills to get right first time. This is really the key: migrations are highly intensive events with no real margin for error, and skimping on cost up-front is very short-sighted given the significant costs incurred by rectifying a botched project. Of course, underpinning all of this is the fact that there is no inherent risk in data migration, so long as the planning, testing and allocation of resources are sufficient. This might sound simple enough, but the reality is often quite different. The skill gap Another side-effect of the recent uptake of cloud services has been a dramatic loss of internal technical skill. As organisations outsource more of their IT functions, it's often assumed there's a corresponding drop in the need for technical staff. In some cases, this only becomes a problem in times of disruption – when systems fail client-side and there's no one around to resolve the issue. However, understaffed IT departments can have invisible but serious consequences in the long term. 
In short, outsourcing the systems does not mean outsourcing the knowledge, and companies that cut back internal capability too far necessarily distance themselves from their IT and put their data at unnecessary risk. One of the supposed pillars of cloud computing is that IT becomes a disposable utility that does not need to be maintained internally. In specific cases, such as individual services from providers with very high SLAs, this might be true. But a central part of migration is having an accurate image of the entire IT estate for comparison at either end. This is a basic requirement without which the success of a migration is impossible to determine, and yet increasing numbers of companies have a very limited picture of the systems they possess - a consequence of overzealous outsourcing. Industry Insight This emerging skill gap is proving to be particularly painful in conjunction with a recent trend anecdotally reported by digital agencies. As big brands realise the importance of continuous digital engagement with their customers, many are building out significant internal creative and development functions to test campaigns and reduce their time to market. Whilst this ability is hugely beneficial for creative teams, there's a danger that as individual development clouds become easier to spin up (most today can be set up with a few mouse clicks and a credit card), organisations are unwittingly creating masses of data in separate, temporary instances without maintaining them or closing them down afterwards. Where historically these would have been maintained and/or consolidated by on-site technical and system administration staff, there is now a risk they'll simply remain, unused. This leads to 'cloud sprawl' - an inefficient cloud environment, often with the same levels of unused capacity seen in traditional dedicated environments, that will inevitably require a migration to rectify down the line. This is the other side of building migration into the total cost of ownership – undertaking regular maintenance as a business-as-usual activity to prolong the life of existing systems and reduce the amount of extraneous work required during actual migrations. Next time we'll break down the six easy steps of the migration process... You can download the full whitepaper here.

### Interview with Tim Ayling of Trend Micro

Compare the Cloud interviews Tim Ayling, UK Channel Director of Trend Micro. CTC: Tim, tell us all about yourself. I've been in the IT security game for over 15 years now (where's that time gone?!), working for software vendors & consultancies in various roles. I've got a couple of kids, lived abroad for several years, and am a big Brighton fan – these are not linked! CTC: Most people view Trend as a consumer anti-virus vendor – what is Trend's offering in the cloud? That was my perception before I joined, but the things we're doing with virtualisation and cloud are really ground-breaking in my opinion. The benefits of Deep Security for Cloud Service Providers are almost a no-brainer, allowing CSPs to offer Virtual Patching, anti-malware, integrity monitoring, firewalls (and more) in a multi-tenanted fashion as a tick-box exercise with a pricing model that fits. Our policy-based encryption technology, Secure Cloud, is also hugely popular, in the US especially – but that will become more important locally next year when the EU Directive comes into play. CTC: So can Deep Security be provided as a managed service by partners? Absolutely.
We have a multi-tenanted version that allows MSPs to offer everything we have in Deep Security as a service. We've also learnt that traditional pricing models don't work and now fit into the new service provider model. CTC: How can partners get a demonstration of the Deep Security suite? Give me a call! It's actually really easy. We have a team of people in Cork, our EMEA headquarters, who spend their time running demos and the like. They're a great team and we're lucky to have them. CTC: Where do you see revenue gains and return on investment from migrating to Deep Security? I could go on forever here, but I'll try to keep it simple. Deep Security frees staff from continually configuring, updating and patching agents, provides a more manageable way to secure virtual machines, increases consolidation ratios, protects virtual servers as they move between a data centre and public cloud, and provides a security suite that can't be matched. An ROI built on improved consolidation ratios alone is usually enough. CTC: Where are you seeing the product deployed? That's a hard question, as there is no vertical that stands out. Having said that, since the release of our multi-tenanted version, the CSP and data centre markets have boomed for us. One of the key business cases appears to be the virtual patching capabilities, meaning the infrastructure team don't have the same pressures to regularly patch anymore. I never realised the scale of the patching problem until I joined Trend! CTC: What management and orchestration panels do you integrate with? Our open architecture means we can integrate with a huge raft of orchestration panels, so it's an open-ended question really. There are some we're going a step further with, such as Abiquo, but we obviously can't afford to go solo with this. CTC: Tim, if you were the CEO of a cloud or managed service provider, how would you deploy Deep Security both commercially and technically? Well, you would obviously expect me to say that it's best to install the lot and quickly! The truth is that everyone is different and they have different priorities. You have ultra-secure providers, such as The Bunker, who would benefit from deploying everything and making sure the world knows. Others may be providing a low-cost alternative and will be looking for up-sell opportunities, and in that case it may be best to deploy a version that allows the end-customer to choose which security components, if any, they want. I wouldn't ever recommend a deployment path that is good for all. CTC: Have you got any links or helpful guidance for people wishing to explore further? Yes, indeed: Deep Security, Deep Discovery, Research Papers. CTC: Here is a challenge, Tim: in 150 simple words, describe why a cloud provider or MSP should deploy Trend over your competition! Trend are known to be two years ahead of the competition when it comes to service provider security. Putting the security question aside, our architecture enables huge performance increases when securing virtualised environments, enabling service providers to fit more VMs on each host. Our multi-tenanted approach and flexible service provider licensing model also ensure we fit both technically and commercially into the way service providers want to work.
From the security standpoint, our Smart Protection Network is the largest threat database in the world, covering traditional threats on the web, email and files, as well as the emerging threats around mobile applications – we've downloaded every app on Google's Play Store and given them a reputation score. This is big data before big data existed. Before I joined Trend I thought of the company as an AV vendor. I now see Trend as the key innovators in the security space – years ahead in many aspects, especially virtualisation and cloud. CTC: Tim, what resources are available in terms of technical training, sales guidance and commercial modelling from Trend for new partners or existing partners wishing to use Deep Security? Trend are 25 years old, and have been a channel-led company that whole time. We're a billion dollar company with over a billion in the bank, so it's clearly worked! We provide regular technical training, we have events specifically for our channel, we have flexible commercial models and much more. Anyone who has been to our EMEA HQ in Cork who would understand the machine we have working behind us to make sure our partners succeed. I've never worked for a company with the human resources Trend has working for the good of the company and its partners. CTC: Tim, a question we ask everyone, what is your definition of cloud computing? For me cloud computing is where an organisation entrusts their data and systems with a remote service, and that data is accessed through the web. That's it in a nutshell, but of course that is very open-ended & probably doesn't encapsulate everything. CTC: Tim, have you got any events or shows coming up where you can interface with partners and customers? We were at Infosec in London last month in a big way. We do a raft of events though, and are especially prevalent at the VMUG events across the UK. We're also doing the VMware PEX events in the UK, and can be found somewhere at most cloud events. We also have a channel-specific event coming up which interested partners can talk to me about. CTC: and finally Tim for anyone reading this blog what security tips would you offer them? Having worked in both software and consultancy, I understand the importance of tools and processes. One doesn't work without the other, though processes need to be mapped first. I would also suggest a proactive approach to security, rather than reactive. Too many companies spend vast amounts of money after an incident, which seems absurd to me. Like it or not, your company will be a target. To get in touch with the Cloud Team at Trend Micro email trend@comparethecloud.net to be introduced. ### Flexiant's view on Dell buying Cloud Management vendor Enstratius By George Knox, CEO of Flexiant Yesterday, Dell announced the acquisition of Enstratius, an enterprise cloud-management software and services provider that delivers single and multi-cloud management capabilities. As a fellow Gartner Cool Vendor this year in the 2013 Cloud Management report, we want to congratulate Enstratius. This acquisition is proof that the biggest IT companies in the world recognize that they need to help their clients quickly deliver cloud solutions and that long, expensive development projects are no longer viable (if they ever were). They also need a cloud solution that can easily manage both infrastructure and applications. These companies are not developing, but acquiring the technology. 
Only in March did Oracle announce it had agreed to acquire Nimbula, a 2012 Gartner Cool Vendor and provider of private cloud infrastructure management software. For many of these IT organizations, cloud management is a strategic priority because the benefits are so tangible. Most are focused on enterprise IT, but there are a few focused on the particular requirements of service providers and telcos. This is where Flexiant has focused and differentiates. We also believe strongly that, as the purchasing of cloud technology shifts from enterprise IT to telecom and service provider companies, the needs of the purchaser will drive the large traditional enterprise software and infrastructure providers to acquire innovation and technology that fills their technology gaps. Doing so will allow them to take advantage of the next generation of technology spending. We stress to our customers the need to buy versus develop, and the same holds true for these enterprise IT focused organizations. Getting to market quickly with a cloud management solution will open up new products and services to offer customers. We view it as a no-brainer. With all the recent cloud management vendor acquisitions, it makes you wonder what the future landscape will look like.

### Conference versus User Group Events

By Brendon Higgins, of the Virtual Machine User Group (VMUG) At 14 years old, I picked computing as a career option and have continued on that course ever since. One of the key factors in my professional growth has been both knowing 'who' to ask for help and, more importantly, having something of value to offer in return. Tools like LinkedIn and Facebook are great for keeping relationships going, but how do you establish them in the first place? My first computing job had me locked away with six others in the MIS department of a company geographically located in the middle of nowhere. Opportunities to develop a professional network beyond the company were limited, so I started attending my local British Computer Society branch events, where I learnt new skills, discovered new ideas and increased my peer group all at the same time. BCS events are great but they tend to focus on concepts, and as an engineer I also require specific product information. Conferences are filled with various presentations and discussions, but these tend to be held in big venues, with large crowds, and require financial resources to attend. Product information is available; however, the speakers tend to be from the sales or marketing departments rather than the people who use the systems. Normally this is simply because the ability and the desire to stand at the front of the room and speak publicly to a large audience of your peers are rare. Companies are expected to ensure their products are presented in the best possible light and are just trying to ensure the best experience for customers in the audience. However, marketing and sales people may be tempted to introduce FUD to help "enhance the message". Also, with 150-plus people in the audience, the speaker tends not to take questions from the floor until the end of the presentation - making it possible to sneak slideware past unchallenged! This is especially the case if the speaker changeover periods are brief, limiting the risk of exposure in a public Q&A session. There are always worthy talks on the agenda, but selecting which talk to attend is a skill in its own right.
No, for me, the real value of these events comes from identifying your peers in the crowd, then discovering what they have found to work and, more importantly, to fail to work in their environment. The trouble is finding them in the crowds! Sitting down with your fellow system owners, a skilled moderator and the vendor's technical staff is how you can honestly explore a product in detail. For an engineer looking to discover what a system can actually do and its real-world limitations, in my opinion, the small group round table format is best. It is possible to quickly identify who has a comparable environment or is working on the same issues as you, learning from their experiences and growing your professional network at the same time. However, invitations to these types of events tend to be limited to only the largest or most "interesting" of customers. So if the personal success of conferences depends on random chance events and the round table event tickets are delivered by unicorns, how can you build the professional relationships and develop the skills you require? User group events run by actual customers who own and operate systems have traditionally filled this gap. These events have different formats but can be divided into single- or multi-track events. Single room events are cheaper and easier to organise, while ensuring each speaker on the agenda gets close to 100 per cent attendance for their presentations. However, this is the natural home of death by PowerPoint, as ‘sneaking out' is difficult with only one option available, and the captive audience of customers is a dream come true for sales teams. No, for me, the next best thing to a small group round table is a multi-track local user group event. These events provide a mixture of large ‘keynote'-type addresses for consumption by the masses and small breakout rooms for more specialised subjects. Main room talks will always draw in a large percentage of the total delegates attending the event, which is why they are in the largest room after all. However, while the information is still broadcast at the audience, the smaller scales involved enable more engagement from the floor, for a two-way conversation. Often a large percentage of the audience will know each other and the speaker, making it possible to describe solutions in less general terms. Also, a small audience which is confident in its own knowledge of a product will quickly pick up on any comment or suggestion which is "less than fair…" and openly challenge it. Speakers beware and do your homework! For those who already know the speaker's subject matter or are not interested in that specific application, there are other items on the agenda. These smaller rooms closely resemble the format of the mythical round table because they are in effect the same: fewer than 20 people talking about the actual operation of a product or service, with someone from the user group committee to ensure the discussion stays on topic, while using the "wisdom of crowds" to rapidly develop novel solutions during the session. Networking is the key strength of a user group meeting. Operating at the regional level, members tend to be local, know each other outside of the group and are able to introduce new members to relevant contacts.
Peer-to-peer support can frequently resolve technical issues more quickly than formal channels, and customers sharing information about deals and discounts can help others drive further cost savings from vendors. The latter is clearly a disadvantage to the vendor and not the sort of thing they wish to encourage at ‘their' events. An improvement on a vendor's local user group is an independent, self-funded user group of customers, because it has the ability to go 'off message' and start talking about a product's weaknesses or even the alternative solutions available. Because the group is formed from actual customers, the vendor's influence over the group is limited, so long as the group remembers it exists solely to serve its members' interests and provide the forum to exchange ideas and share information which is correct.

### How to spot when your business should give Dropbox the boot

An interview with Mark Butcher, Commercial Director of Probox by Proact. CTC: Mark, firstly introduce yourself to our audience. I'm the Commercial Director for Proact in the UK. This title covers a multitude of sins but primarily relates to owning marketing operations and ensuring that the products and services we take to market are profitable for us to deliver whilst adding value to our customers and making their lives easier. CTC: You make a lofty claim about Dropbox – tell us about the company behind Probox? Proact are an exceptionally well-kept secret! If you ask anyone in the storage and virtualisation markets who Proact are, they'll probably be able to tell you all about us, as we're Europe's largest specialist and have been for more than a decade. However, outside of this niche, what's less well known is our capability in delivering cloud computing services, something we've been doing successfully for 10 years (under its many guises); we've now got more than 30PB of data under management, with around 12PB in our datacentres across Europe. Proact is a publicly listed company founded in Sweden in 1994 with a UK presence for around 12 years. We've got a direct presence in 13 countries and our annual turnover is approximately £250m, with more than 650 employees across the group (70% of the employees being deeply technical). Our size makes us an interesting proposition, as our biggest differentiator is simply that we're big enough to deliver but aren't yet at a size where we've lost either our flexibility or our ability to respond. One simple reason why we continue to grow whilst retaining our customers is that we haven't lost the ability to apologise when things go wrong and we always fix our mistakes. People remember the companies who stand by them when the s**t hits the fan, regardless of who caused the problem. CTC: So what in your opinion sets Probox apart from Dropbox? Security, security, security, with enterprise IT thrown into the mix! Dropbox is a great product if you're an end user, as it's simple to use and easy to share information, but it's really not so good if you're a company that cares about its business data. Probox is different simply because it's been built from the ground up to be secure in a way that doesn't intrude on the end users. It's got all the features you'd expect – mobile clients, flexible sharing – and is exceptionally easy to use, but at the same time offers companies an immensely high level of security and control.
Unlike Dropbox and similar services that use commodity storage scattered around the world we are able to give a guarantee to a customer that their data will not be stored outside of a specific datacentre, we can even pin it down to the exact drives if that's what they want to know. This is critical to many customers who are concerned about data privacy and corporate governance. Simply because Proact are not a subsidiary of a US corporation also means that unlike Dropbox, Microsoft and many others we aren't subject to the Patriot Act or other similar US centric legislation. For obvious reasons I won't go into detail how we secure the actual platform but we have built the infrastructure on the NetApp FlexPod reference architecture for the simple reason that it's designed to offer secure multi tenancy and offers bullet proof performance and availability. CTC: So how do customers purchase and test your product? It's really simple, register your interest via our website (www.probox.eu) or speak to one of our many channel partners who are taking Probox to market. We offer 30-day trials for any customer and we're always looking for new partners to help us grow the Probox user base. CTC: what are the expansion plans for Probox? The opportunity for Probox is huge, not only because users want a flexible way of sharing their data but also because most organisations now recognise that it adds value to them if they offer their employees this capability. Our plan for the end of 2013 is to have Probox deployed in a further 10 European datacentres with Asia Pac and Americas to follow closely after. On any computer you treat it like a file share... We see most growth for Probox coming from our channel as that's a really obvious route to market. CTC: Dropbox is renowned for its simplicity How easy is the Probox product to use? Probox is designed to be simple; after all it's not much more than an exceptionally secure file sharing tool that allows you to access your data wherever you want. On any computer you treat it like a file share and on your mobile device it's an elegant simple way of viewing your data. CTC: A question we ask every in every interview, what is your definition of Cloud Computing? I'm not a big fan of the phrase Cloud Computing as I haven't met anyone yet who can give me a straight answer as it means something different everyone. I'm pretty simple, so to me as an IT consumer Cloud Computing is nothing more than the IT resources I need being available anytime, anyplace, on any device. As far as I'm concerned I shouldn't care about servers, storage arrays, RPO's or RTO's. That's someone else's problem – or rather to put it bluntly, that's what Proact deliver, we look after the boring stuff so companies can focus on what makes them tick. CTC: And finally if you could change one thing about the Cloud Computing industry what would it be? To shoot anyone who answers the previous question and pretends to know what Cloud Computing is! More seriously I'd like the industry to mature a little and for a lot of the hype to die down. There are too many providers out there making a lot of noise about what they are “going” to do without actually delivering much of any substance. There are no real standards and end user customers are totally confused by the different messages being thrown at them. CTC: Thank you Mark, check out www.probox.eu and decide for yourself if Probox is a suitable dropbox replacement! ### IBM WebSphere Cast Iron How do you integrate Cloud-Cloud and Cloud-on premise? 
With IBM WebSphere Cast Iron ...one of IBM SmartCloud's "Secret Sauce" ingredients. Cloud computing has a major impact on how companies manage their business, be it sales, marketing, HR, finance etc. As each of these functions move to the cloud they require migration and integration, and it’s not just inter-business evolution. Each vendor providing a cloud solution creates their own interface to the application. What we see is a challenge for companies of all sizes, and in all locations, as they attempt to understand and then manage these unique application interfaces and integrate applications from cloud to cloud and cloud to on-premise. This is where WebSphere Cast Iron brings some serious magic. As more and more business critical applications move to the cloud we start to see different cloud states emerging such as hybrid clouds. Take for example SmartCloud for SAP; real-time synchronisation of data or data delivery across new and old estates. These hybrid architectures require integration to bridge the gap between the on-premise existing systems and the new cloud applications, platform, and infrastructure. WebSphere Cast Iron will integrate on-premise systems to and from cloud applications via a simple drag and drop interface. It’s a GUI based solution, there is no code to maintain; everything is XML based including all variables and data handling. It's a non-proprietary solution and requires no custom code to implement. It provides all the management and monitoring of the integration code regardless of the simplicity or complexity of the integration. So in effect, it can grow with your business direction and velocity. Speed and simplicity are core features of WebSphere Cast Iron. I’ve seen clients use the solution initially for bridging their on-premise ERP to new cloud based CRM projects. Realising the ease of use they also then start to use the flexibility of WebSphere Cast Iron for much smaller requirements where before they might just have produced some custom code and SQL to get at a database. That’s great for the easy stuff, but here’s a useful feature for much larger customers that have deployed an enterprise service bus (ESB). Until now the move to a cloud hybrid model poses a significant challenge for this infrastructure, why? Well, in the cloud model you’d expect far more frequent changes than an ESB was designed to accommodate, especially when considering more dynamic applications like mobile, social or the growing diverse catalogue of SaaS applications. WebSphere Cast Iron integrates to the ESB acting as a cloud gateway, interfacing the ESB to your emerging portfolio of cloud applications. This means that as demand grows, the enterprise architecture flexes and morphs and more resource is placed, even if only as a transient burst, into flexible cloud services. Neither business agility nor IT governance need be sacrificed or compromised. With immediate effect, if desired, you could keep your core data residing within the enterprise serviced by one or more key (maybe legacy) applications, yet gain the advantage of smaller cloud based services providing niche applications, such as HR, collaboration services and so on. Cloud integration services then become pivotal to the IT function. Rather than covering cracks in the architecture they are seen as vital in leveraging the base IT strategy of key enterprise investments, whilst supporting continually emerging business demands. 
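To make the role of such an integration layer a little more concrete, here is a deliberately simplified sketch of my own. It is not Cast Iron's interface – which, as described above, is configured graphically rather than coded – and every system, endpoint and field name in it is hypothetical. It just shows the fetch–map–combine pattern that a cloud gateway automates between an on-premise system of record and a SaaS application.

```python
# None of the names below are Cast Iron's own: the systems, field names and
# mapping are hypothetical, standing in for an on-premise ERP and a SaaS CRM.

def fetch_erp_account(account_id: str) -> dict:
    """Stand-in for a lookup against an on-premise system of record."""
    return {"ACCT_NO": account_id, "CR_LIMIT": 50000, "CURRENCY": "GBP"}

def fetch_crm_account(account_id: str) -> dict:
    """Stand-in for a call to a cloud CRM's web API."""
    return {"id": account_id, "name": "Example Ltd", "owner": "j.smith"}

def merge_account(account_id: str) -> dict:
    """The 'transformation': map differing field names onto one agreed structure."""
    erp, crm = fetch_erp_account(account_id), fetch_crm_account(account_id)
    return {
        "accountId": account_id,
        "name": crm["name"],
        "owner": crm["owner"],
        "creditLimit": erp["CR_LIMIT"],
        "currency": erp["CURRENCY"],
    }

if __name__ == "__main__":
    # One aggregated record for the consuming application, rather than two calls.
    print(merge_account("A-1001"))
```

In a real deployment the point is precisely that this mapping lives in managed, monitored configuration rather than in custom code like this.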
In a cloud world, we can't expect restrictive systems that force the same data structures, so WebSphere Cast Iron's role in performing data transformation is a huge bonus. What's more, data aggregation is also expected when providing services for other applications, so WebSphere Cast Iron helps combine data from multiple sources. It drives performance benefits across the system by making just one API call to obtain data rather than many. WebSphere Cast Iron comes in three flavours, giving you a choice to match your overall IT strategy: a physical on-premise appliance, a virtualised hypervisor-based appliance, or WebSphere Cast Iron Live, which resides purely in the cloud. Whether you need to bridge application to application in your enterprise, hook up your legacy investments to some of this dynamic cloud stuff, or interconnect a portfolio of applications across your newly building cloud partner ecosystem, WebSphere Cast Iron has a valid and valuable role. Don't just take my word for it, here's what our clients are saying… Large consumer products company: "IBM WebSphere Cast Iron cloud not only met our needs for a standards-based, cost-effective and centrally managed messaging and application integration; it also boosted IT staff productivity by 16 times." IndigoVision achieves easy integration of salesforce.com: "WebSphere Cast Iron solves the business issue of integrating our on-premise financial solution with salesforce.com ... This approach enables IndigoVision to operate and support global sales teams while retaining its financial accounting processes in-house." If you're interested in becoming an Integration Business Partner with IBM and utilising WebSphere Cast Iron to help your clients, here's a great resource for you. For more information on IBM WebSphere Cast Iron visit the IBM website here.

### How to select a cloud infrastructure provider for Disaster Recovery

By Martin Wright, Managing Director of Techgate PLC "...Make sure you ask the right questions when choosing a Cloud Infrastructure provider." The cloud marketplace is maturing and the range of solutions on offer is becoming broader, but this can make it even trickier to judge exactly what you are buying. Your Disaster Recovery (DR) solution must be as secure as your in-house IT systems, but many companies are not yet asking the right questions when it comes to selecting a trusted provider. This blog post tells you how to avoid some of the pitfalls.

Security and availability

Make sure your Business Continuity provider takes security seriously, especially if your company handles sensitive data. If security is not a strategic backbone of their own IT setup and network from top to bottom, they might not be offering the level of security your organisation requires. A brand new pair of Tier 3 data centres with "military-level security" does not mean anything if the underlying network is not fault-tolerant, has single points of failure, or lacks a monitoring service to make sure all the traffic remains uncompromised. Check that your provider offers:

- data centres in a "low-risk area" outside of a city centre and away from the threat of power outages or terrorist attacks
- an ISP-independent, underlying fully redundant network that they own and manage
- solutions that can be failed over to a second, separate site if needed
- connectivity options with various providers

Avoid vendor lock-in

You don't want to be locked into any provider forever.
So you need to look for one that can supply cloud services based on a standard platform that you can migrate from again if you need to. In essence, you need to build your exit strategy when you engage. VMware's vCloud is the leading cloud platform at the moment. Building a cloud solution based on their stack will allow you to migrate your workloads and applications to another provider if required in the future.

Accreditations

Ask your provider for evidence of their accreditations. There are two accreditations that relate to Information Security and Business Continuity Management:

- ISO27001
- BS25999 [this accreditation will be superseded by ISO22301 in 2014]

Check also that your supplier has the right technology partners in place and that their partner status is current (this is important). It may also be worth checking whether the staff are all CRB checked; otherwise you could be buying a really secure system run by less than scrupulous people.

Hardware infrastructure

One of the most common cloud computing myths is that hardware becomes irrelevant with the cloud. The technologies used and the overall performance of your provider's hardware do matter if you want to be certain of getting a professional and reliable solution. Check that your provider is using cutting-edge infrastructure - CPUs, switches, firewalls, storage area networks (SANs) and hard drives. These components can differentiate an enterprise-class cloud offering from a Virtual Private Server with entry-level performance. Make sure you are comparing like with like in terms of performance, especially when you are buying storage, using industry standards such as IOPS or Passmark scores. The technology in infrastructure has moved on a long way even in two years!

Compliance and legal issues

Now this really is a sensitive one; data domicile matters – or in other words, "where in the world is the technical infrastructure located, and where are my applications and data exactly?" Different countries have different data regulations and you need to be certain of where your data is actually being held. In the last 18 months, the Patriot Act and the US government's ease of access to any data centre of an American provider, even on European soil, have created a lot of controversy and fear among prospective cloud customers (see Frank Jennings' blog here on the subject). This is especially true when it comes to UK-sourced data, where the legal framework concerning information security and usage is stricter (and even more so in specific, regulated industries). The location of the data centres and where data and applications reside are very real and significant issues that need to be addressed. In any case, you should consume cloud resources and migrate data in a controlled manner, knowing exactly how the cloud you are using is set up and where it is located.

References and industry experience

Another important factor in the checklist when making your selection is the cloud provider's existing clientele and references. Look for relevant reference companies in your sector who have similar requirements to you. Ask for case studies that explain what the provider did for them as a demonstration of their technical and support capacity. Try to engage in a conversation with the relevant customer and ask about their experience with the company.

Support and flexibility

Last but not least, consider the level of support you will get and how personal that support is. Sometimes a helpdesk is not enough to sort out your problems or help you with your cloud adoption strategy.
Besides solving any technical problems, consider whether the provider can offer adequate account management and a consultative approach towards your mid-term and long-term objectives. Can you initiate a discussion with the tech support, or even better the cloud vendor's technical architects, to go through your specific requirements and technologies? Can the provider offer professional advice about your systems and IT infrastructure choices over the next months? Does the provider have the expertise, industry knowledge and understanding to guide you through the "cloud-washing" and hype that you come across every day? Look for a provider that can accurately assess your requirements, provide support to migrate your data and manage any issues that may arise, for whatever reason.

### What makes a quality Cloud hosting provider? Part 2

Did you miss Part 1 of Richard's "What makes a quality cloud hosting provider" series - read it here. A horse meat burger, cod and chips, a bacon sandwich, cloud computing providers… which is the odd one out? The bacon sandwich, of course! I'm sure you are aware that there have recently been a number of well-publicised food scares around Europe, with products not being quite what they seem. So in the list above, why is the bacon sandwich the odd one out? Well, to date, I have not heard of anybody successfully counterfeiting a bacon sandwich! Often cloud computing providers, as I covered in part 1 of this blog, are also not quite what they seem. Lots of promises are made on websites as to how good a service is, but on further exploration is the service actually what it seems? I am now going to explore some of them.

"99.9999% or 100% uptime guarantee"

A Service Level Agreement (SLA) is a measure of availability, which is typically described through percentage statements such as '99.999% SLA'. When these statements are explored in real terms, this is the outcome:

| Availability % | Downtime per year | Downtime per month | Downtime per week |
|---|---|---|---|
| 90% ("one nine") | 36.5 days | 72 hours | 16.8 hours |
| 98% | 7.30 days | 14.4 hours | 3.36 hours |
| 99% ("two nines") | 3.65 days | 7.20 hours | 1.68 hours |
| 99.5% | 1.83 days | 3.60 hours | 50.4 minutes |
| 99.9% ("three nines") | 8.76 hours | 43.8 minutes | 10.1 minutes |
| 99.99% ("four nines") | 52.56 minutes | 4.32 minutes | 1.01 minutes |
| 99.999% ("five nines") | 5.26 minutes | 25.9 seconds | 6.05 seconds |
| 99.9999% ("six nines") | 31.5 seconds | 2.59 seconds | 0.605 seconds |
| 99.99999% ("seven nines") | 3.15 seconds | 0.259 seconds | 0.0605 seconds |

This information is all very good, but without looking at the fine print it is actually a meaningless and empty promise on the part of the provider. Failure to meet an SLA is usually backed by a penalty clause, and penalties as an industry standard are not normally too onerous for service providers. Traditionally, the provider gives a percentage of the overall monthly fee back to the user, depending on the length of the outage. Below is a table showing what a customer could commonly expect to reclaim:

| Service Availability during Monthly Review Period | Service Credits as % of Monthly Rental Charge |
|---|---|
| <99.9%–99.8% | 5% |
| 99.79%–99.5% | 10% |
| 99.49%–99.0% | 20% |
| 98.9%–98.0% | 30% |
| <98% | 40% |

The calculation is: ((Total hours - Total hours unavailable) / Total hours) x 100. The short sketch that follows works a couple of these figures through.
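As an aside from me rather than the original article: the arithmetic behind both tables is easy to reproduce, and doing so is a useful sanity check on any SLA you are offered. The sketch below is a minimal illustration only – the credit bands are simply transcribed from the table above, and the 730-hour month is an assumption.

```python
# Credit bands transcribed from the table above: (availability at or above this %, credit %).
CREDIT_BANDS = [
    (99.8, 5),
    (99.5, 10),
    (99.0, 20),
    (98.0, 30),
    (0.0, 40),
]

def availability_pct(total_hours: float, unavailable_hours: float) -> float:
    """((Total hours - Total hours unavailable) / Total hours) x 100."""
    return (total_hours - unavailable_hours) / total_hours * 100

def allowed_downtime_hours(sla_pct: float, period_hours: float) -> float:
    """How much downtime still satisfies a given SLA over a given period."""
    return period_hours * (100 - sla_pct) / 100

def service_credit_pct(availability: float) -> int:
    """Credit due as a percentage of the monthly rental charge."""
    if availability >= 99.9:
        return 0
    for threshold, credit in CREDIT_BANDS:
        if availability >= threshold:
            return credit
    return CREDIT_BANDS[-1][1]

if __name__ == "__main__":
    month = 730  # hours in an average month (assumption)
    print(allowed_downtime_hours(99.9, month))   # ~0.73 hours, i.e. ~43.8 minutes per month
    avail = availability_pct(month, unavailable_hours=8)
    print(avail, service_credit_pct(avail))      # ~98.9% availability -> 30% credit band
```

Eight hours of downtime in a month, for example, drops availability to roughly 98.9% – the 30% credit band – which puts the headline "five nines" claims above into perspective.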
Other important factors to note when looking into the SLA include:

1) The penalty often does not refer to the service as a whole. Your business may have a web server, but the SLA may only be for either network or server availability, so the fact that your website is not working may not be relevant.

2) The advertised SLA does not actually match the penalty clause. For example, your business may have been told it will receive 99.999% availability but the penalty clause starts at 99% availability, so you would never be fully compensated.

3) It is also important to explore what time period the penalty is measured over. It is common for providers to use a period of 12 weeks, which actually means on a 99% SLA that your business could be offline for 21.6 hours, or on a 99.9% SLA just over 2 hours, and still be within the service level.

4) How is the availability measured? Monitoring commonly checks servers at various intervals, e.g. every 15 minutes, every 5 minutes or every minute. Based on these tests, short outages may often go unnoticed and it is often impractical to check with a higher frequency. 99.999% uptime checked every minute will mean that only longer outages will be spotted, or that a short 2-second glitch will show up as a minute-long outage.

5) It is usually the customer's responsibility to request service credit checks, and SLAs are therefore not a good way to determine the quality of a service.

"Unlimited bandwidth included" or "1000 GB bandwidth included"

What do these statements actually mean? Well, they could mean any number of things, and statements like these often have an * next to them, referring you to an acceptable use policy where they actually restrict you to 400GB of throughput. However, more often the biggest concern is actually how quickly you can get this bandwidth – how big is the internet connection? To illustrate this further, if we equate water to internet traffic and a hosepipe to the internet connection, the biggest pipe will move more water, and it really does not matter whether your contract allows you to move a gallon of water or not. If the pipe can only move three gallons and you are sharing it with 1,000 other people, then the net result will be a negative experience.

"We have ISO 27001 for information security"

There are a few points which you need to know about ISO 27001. Firstly, it is an information security management system (ISMS), which means that it is a framework for managing security; it is not a standard that ensures that an organisation is actually secure. Only standards such as PCI actually provide any guarantees, as they are prescriptive about processes, not just suggestive. Because it is a system, people can just buy an off-the-shelf set of processes and procedures and quickly demonstrate an ISMS without actually properly implementing it. However, there are some new legislative guidelines being released soon which will mean that the auditor will be paying far more attention to the 'monitor and measurement' section, which will hopefully improve the situation. When you set up an ISMS, the business also sets the scope of the system, and some cloud providers have been known to use this to their advantage. For example, the business could limit the system scope to the 'spare parts' management element of the business, then publicly broadcast that it is ISO 27001 compliant but conveniently not mention what for.
There are positive and negative elements to ISO 27001, and although it is a good indicator of the strengths of a hosting provider, you clearly need to ask more than "do you have an ISO 27001 accreditation?"

"See terms and conditions"

Remember, what a business offers on the website is not necessarily what you get. For instance, I recently witnessed a company offering a hosted email solution, and on its homepage the provider boldly stated that they offered a free backup service. I could not find this mentioned anywhere else on the website, and after further inspection I discovered that their terms and conditions actually stated that backup was NOT included in the service unless expressly mentioned in your selected email package. I will let you draw your own conclusions on this. Your business needs to check the terms and conditions closely in order to explore the full nature of the service being provided, particularly on longer contracts. You may sign up for a service which allegedly offers UK hosting on dedicated hardware, in fully ISO 27001 audited facilities, but if the supplier then decides to migrate over to Amazon Web Services, you may not be able legally to stop them. Equally, you may be contractually obliged to continue to use the service regardless of any changes. Contracts should give your business the right to cancel if there is any fundamental change to the nature of the service delivery. You are nearly always ultimately responsible for your own data, and most contracts will state that the service provider shall not be liable for loss or corruption of data or information. This is not necessarily because they are a poor provider but usually because it is hard, if not impossible, to get insurance against that form of loss – especially when it is so difficult to place a value on data. I hope that by reading this blog you have realised that it is easy for providers to hide behind the online world in which we live, so check closely what you are buying. Just because the website looks good and you recognise the brand, it does not guarantee that you will receive the level of service which you are expecting. It is also important to remember that many of the technologies used in the cloud are new; long-standing, strong brands will also be new to these technologies, and their previous ethos may not necessarily work in the new cloud world. Many companies historically may have been excellent at providing break-fix style contracts, but can they stop things breaking in the first place? Did you miss Part 1 of Richard's "What makes a quality cloud hosting provider" series - read it here. For more information on virtualDCS and its cloud computing services visit www.virtualDCS.co.uk, call +44 (0) 8453 888 327 or email enquiries@virtualDCS.co.uk

### IT standards and cloud: mutually exclusive?

As some of you may be aware, the first wedding anniversary that you celebrate is your paper anniversary, which comes just a year after the marriage. A year ago we saw the marriage of IBM's hardware technology with its software expertise to launch IBM PureSystems - a new generation of expert, integrated systems.
Marriages these days need a lot of planning and rarely come cheap and IBM PureSystems was no exception - the result of $2 billion in R&D and acquisitions by IBM over four years, that combines decades of technology know-how with innovations in the core components of enterprise technology including compute, networking, storage and software - enough to differentiate IBM PureSystems from the rival couplings that other vendors have brought together. A year on, we invited John Easton from IBM to submit a ‘paper' (an online one of course) to reflect on how married life is going for IBM PureSystems, and how they are coping with the domestic chores inherent in the Cloud environment that we all live in … IT standards and cloud: mutually exclusive? By John Easton, IBM Distinguished Engineer, and CTO for IBM's Systems & Technology Group in the UK & Ireland In the rush to cloud computing, very few users give much thought to the challenges of providing the very services they are so eager to consume. They expect rapidly provisioned services delivered to them regardless of who else is making similar requests. Think then, about the challenges to deliver IT with these sorts of attributes. Key to the cloud provider is keeping costs under control, which explains the drive to ever higher levels of virtualisation, automation and standardisation. It is the latter of these that gives rise to the “bitter taste” that cloud leaves many organisations; they discover cloud services don't quite meet their business requirements in the same way as the bespoke solutions they are used to. Many organisations realise the value here and have embarked upon a journey towards standardised IT, but most struggle to make progress. Why is this so difficult? This I feel, is more an organisational or cultural issue rather than a technical one. Many individuals have built their careers installing systems or customising pieces of software to the nth degree. Is there really business value in doing this today? I'd suggest not, yet for this audience, often in positions of influence, there seems to be a fundamental mistrust of anything that infringes upon their domains. I do wonder whether the challenge arises because these standards are typically set outside of the delivering organisation. In the case of cloud, it's even worse. The interface to the cloud is often an anonymous web page. There is no-one to talk to. What is needed here is not an arbitrary, remotely enforced standard that suits the cloud provider, but rather one at levels more appropriate to meet business needs: i.e. the ability to set standardisation points in the infrastructure, middleware and data layers. IBM's PureSystems family provides just such an ability. They can quickly and reliably deploy “as a service” building upon standardisation at any of these; be that something as simple as a Windows virtual machine or a complex combination of middleware delivering an enterprise BPM environment, the customisation the business demands can then be built upon these underlying standard services. IBM's PureSystems family provides... the ability to set standardisation points in the infrastructure, middleware and data layers. For most organisations, cloud is a hybrid mix of Commodity, Core and Specialist services. Trying to fight it out in the commodity space where differentiation is largely on price alone is already a lost battle: an enterprise cannot achieve the required economies of scale. 
Consider then those differentiated services where the customisation the enterprise needs can be built upon a standardised platform. This is standardisation that works: giving flexibility to choose how and where standardisation is delivered, allowing the standards to be defined appropriately, yet still allowing the business to rapidly get the services it needs while the IT organisation focuses on its value-add, rather than worrying or arguing about the deployment of those standards (whatever they might be) underpinning these differentiated services.
### Choice brings 'tiers' of joy
By Steve Rushen, Senior Director of Services and Support at Abiquo
As consumers in our day-to-day lives we have become accustomed to choice. A cup of coffee is no longer a simple "cup of coffee". We now choose where we go for a coffee: anything from a small independent café to a plethora of global brands present on every high street. [pullquote]We choose the coffee bean, the strength, the type of milk and all sorts of flavours and toppings... IT is no different.[/pullquote]Once we get to the coffee shop the choices quickly grow exponentially. We choose the coffee bean, the strength, the type of milk and all sorts of flavours and toppings that customise our coffee experience. This is just one example of the choice we now expect, and it's very different from how we consumed coffee 10 years ago. IT is no different. We are transitioning to a self-service, cloud-enabled world where we expect to have choices in order to customise our interaction with IT. We have that choice already through our smartphones and tablets, and more and more IT consumers expect to be offered real choice in the IT services that they consume. This represents a real challenge for the IT department or administrator. For years IT has created efficiency and reduced the support overhead by creating "standard" environments and platforms, effectively creating an underlying IT infrastructure with a one-size-fits-all mentality. With the rise of cloud technologies and services the IT consumer is already being presented with the choices that they want. IT needs to adapt and become the enabler, providing that same choice to its consumers. [pullquote][Choice]...represents a real challenge for the IT department or administrator.[/pullquote]The IT department still needs to think about its own costs and efficiency. It is not as simple as creating multiple service offerings for the consumer. In a world where the IT consumer expects choice and self-service, the IT department needs a single platform that enables them to deliver just that! The technology world has transformed: just as with the earlier coffee analogy, there is an infinite combination of choices in compute, storage, networking and the services that software provides on top of that underlying infrastructure. The IT department needs a single management platform that abstracts the consumer from the underlying complexity of those infrastructure technologies but still allows them to choose the right combination of technology for the task in hand, or the service that they are creating. IT simply needs a single platform that offers multiple tiers of service. For example, my development team are in the early proof-of-concept stage of creating a new application. At this stage they are in short development cycles, and their core requirement is agility. They don't need high-performance servers or the latest SSD storage. As the development of the application matures, they need to start considering the performance and availability that will be needed in production, and their core requirements change. By creating multiple tiers of service, the first environment would sit on a low tier, using commodity hardware and storage, and perhaps a free hypervisor. As time goes on, it's time to move the application to a higher tier in the stack: a hardware and software stack that mirrors production. The key here is providing choice to the IT consumer through a single platform that enables them to choose the service tier they want through self-service.
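A minimal sketch of what such a tiered, self-service catalogue might look like behind a single platform; the tier names, fields and specifications here are invented for illustration and are not any particular vendor's product model:

```python
# Illustrative only: tier names, specs and fields are made up for this sketch,
# not taken from any particular cloud management product.
from dataclasses import dataclass

@dataclass
class ServiceTier:
    name: str
    hypervisor: str        # e.g. a free hypervisor for the low tier
    storage: str           # commodity disk vs enterprise SSD
    mirrors_production: bool

CATALOGUE = {
    "dev": ServiceTier("dev", hypervisor="free", storage="commodity SATA", mirrors_production=False),
    "pre-prod": ServiceTier("pre-prod", hypervisor="enterprise", storage="enterprise SSD", mirrors_production=True),
    "production": ServiceTier("production", hypervisor="enterprise", storage="enterprise SSD", mirrors_production=True),
}

def provision(workload: str, tier_name: str) -> str:
    """Self-service: the consumer picks a tier; the platform hides the underlying detail."""
    tier = CATALOGUE[tier_name]
    return f"{workload} deployed on '{tier.name}' ({tier.hypervisor} hypervisor, {tier.storage})"

# An early proof of concept goes on the cheap tier; promote it as it matures.
print(provision("new-application", "dev"))
print(provision("new-application", "pre-prod"))
```

The point of the abstraction is that the consumer only ever picks a tier; the mapping to hypervisor, storage and hardware remains the platform's problem.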
### Tell Me Why I Should Use Your Cloud
**Winners Announcement** 13/03/2013 12:00 GMT
A big thank you to all participants. Here are the lucky winners, as selected by each of our three media panellists:
OutSource Magazine Winner, selected by Jamie Liddell, Editor: Charity Engine
For your head: As seen on BBC News, the Charity Engine grid offers 90,000 CPU cores and 30,000 GPUs for approximately 10x less cost than Amazon, Microsoft or Google - as low as one penny per core/hr. For your heart: Not only is this the world's cheapest on-demand computing, it's also one of the most energy-efficient and socially responsible. We only use recycled CPU time from machines that already exist and are switched on anyway, but are mostly idle. Furthermore, half our profits go to nine international charity partners including Oxfam, CARE and MSF. -- 150 Word Pitch from Mark McAndrew, Founder and CEO at Charity Engine
CloudExpo Europe Winner, selected by the Closer Still Media Team: Ventrus Networks
50 shades of cloud? Well, we could potentially come up with 50 good reasons that blow the cloud competition away, but here is a quick resume of why Ventrus beats the rest. We work with you to provide a customisable Cloud environment to fit your exact business needs for security, transparency, accountability, reliability and proven expertise and professionalism in the field. Our cloud is fully scalable, providing you with innovative ways to reduce costs and maximise worth. We offer a personable, all-encompassing service assisting you in establishing company readiness to enable a phased or hybrid approach if required. Our cloud is perpetually available, fast and cost-effective. We offer a seamless solution designed to individual business requirements. We guarantee reduction of expenditure, dependable security, improved resource utilisation, scalability, seamless integration, anywhere access, disaster recovery and minimal upfront costs. We really are the silver lining you have been looking for. -- 150 Word Pitch from Sara Ponsford, responsible for Marketing and Social Media at Ventrus
VMUG Winner, selected by Gav Brining: iomart Hosting
Contrary to general marketing puff, the cloud is not some sort of mystical ethereal panacea for all of IT's problems. It physically exists and it exists in the data centre. And we own seven of them. We foresaw the emergence of the 'cloud' over eight years ago, and have invested millions of pounds in building a cloud capability from physical bricks and mortar upwards. We offer high availability cloud services, across multiple DC sites, connected to one fibre network, supported 24x7. The benefits are clear. No legacy operations, practices, cookie cutter services or reliance on third parties. The result of this 'one throat to choke' approach is that we are totally accountable to you. No cloudwash.
Tell us what you need/want and we'll tailor a solution to suit you and then back it with industry leading SLA guarantees. We don't sell ‘cloud this or that', we sell peace of mind. -- 150 Word Pitch from Phil Worms, Director of Marketing and Corporate Communications and the team at iomart Hosting Congratulations to those companies! PLEASE NOTE: Whilst the competition is now over, this blog's comments are still very much open, and we welcome further 150-word submissions. All submissions will be showcased in a forthcoming Compare the Cloud initiative, so it's well worth telling everyone who you are and what you are about! Original Post posted on Monday 4th March 2013: Foreword by Paul Bevan, Practice Leader, IT Services at Bloor Research Some years ago, at one of those global set piece sales kick off events so beloved of large IT Corporations, I was accosted by a very animated product manager who demanded to know when I was going to get sales to "sell the products we make." Without batting an eyelid I replied: "when you stop telling us what you have made, and when you start telling us why you made it." He certainly didn't get it then, and - sadly - I am seeing too much evidence that product developers in the IT space still don't get it. From Convergent Technology to Informix to Betamax, the technology industry is littered with technically superior products that failed to gain widespread adoption in the face of competitors who made the connection to their customers' needs and aspirations, and there will be plenty more in the burgeoning Cloud Computing market. The Cloud market is fun. It's buzzing. New companies and new products are popping up every day. However most haven't yet reached Geoffrey Moore's Chasm yet. The developers and techies are talking to the visionaries and innovators. Everyone is excited. This is easy. But in order to become the next Microsoft, Apple or Google they have to achieve mainstream adoption. That mainstream world is populated with, mostly, hardworking, risk averse managers driven by revenue, margin and cost targets. They may use technology extensively. They may understand how it helps them do their jobs, but they wouldn't know their IPv6 from their elbow. If your next Big Thing in the Cloud is going to be the one that makes your fame and fortune then you need to be able to articulate its benefits in a way that the business person holding the purse strings will understand. I was once told memorably, by someone from IBM, that if, in the 1970s, you had put IBM marketing with Burroughs engineers, the rest of the IT world could have packed up and gone home. Does anyone under the age of 50 remember Burroughs?...I rest my case M'lud. So, let me introduce this challenge: Contribute your 150 word pitch to this Blog's comments section! Compare the Cloud has enlisted the help of media partners Outsource Magazine, the Virtual Machine User Group and CloudExpo Europe to help promote this blog and to provide expert commentary. Outsource Magazine will be keeping an eye on all the 150 word pitch submissions and to the cloud services provider it deems to have delivered the most entertaining and/or enlightening pitch will receive an email MPU banner broadcast to 18,000 end-users and a free single page in the imminent spring issue. The judge from Outsource will be Outsource Editor, Jamie Liddell. (Outsource terms: no cash alternative, we will confirm the date, judge's decision is final and cannot be appealed). 
The Virtual Machine User Group will be looking for the 150 words that provide the best clarity to end-users. The winning contributing service provider will be invited to guest speak at a VMUG conference in the UK. The judge from the VMUG is Gav Brining. CloudExpo Europe: "The show goes on!" Cloud Expo Europe are providing this blog with email exposure and promotion across the @cloudexpoeurope twitter stream. Cloud Expo Europe team will also be looking for their own "winning pitch" and the winning contributor will win a 6 month banner on Cloud News – an e-newsletter distributed to over 40,000 IT decision makers monthly! And - to cap it off - ALL cloud service providers submitting their 150 word pitch in the comments section below will be offered the chance for it to be featured and enshrined permanently in Compare the Cloud's forthcoming "hall of fame"! Rules? Well, you need to stick to 150 words! Your entry needs to be posted by 17:00 GMT on Friday 8th March 2013. One entry per company will count. Blog comments and general fair play will be policed by Compare the Cloud. To get everyone started here are several pitches from a selection of Compare the Cloud partners and sponsors: At IBM we see cloud as a ‘how', not a ‘what'; it necessitates a flexible approach. Where the customer starts its journey depends on business need. IBM works with clients to determine that business need as well as achieve better, more agile IT and apply the transformative power of cloud computing to reinvent the way clients do business. IBM offers a comprehensive set of capabilities, providing the foundations of building and deploying private or hybrid clouds, delivering platform and infrastructure capabilities as a service, and supplying our suite of software capabilities available on the cloud; ranging from collaboration to business process management and web analytics. We integrate our ecosystem partners, incorporating technologies to bring the benefits of cloud to clients. Our work with partners is the reason why 80% of Fortune 500 companies use IBM cloud capability. IBM's cloud solutions are engineered to Open Standards, allowing a more collaborative approach to providing solutions. Simon Porter, Vice President - Mid Market Sales, Europe, IBM Why Digital Realty? As the largest wholesale owner, operator and investor in outsourced Data Centre facilities in the world – we are simply unmatched in the experience, stability and reach to provide or build a secure home to your public, private or managed Cloud. 1. Flexible solutions. A comprehensive suite of flexible solutions that can help you save time, reduce costs, minimise risk and better align your data centres with your business. 2. Financial strength. Our substantial financial resources make us a stable, reliable, long-term partner. 3. Expertise that gives you an edge. Our data centres are designed, built and operated by professionals with decades of experience in site selection, design & construction. 4. Locations worldwide. With more than 100 properties (over 20 million square feet of space) in over 30 markets around the world (and counting), we have data centres where you need them. 5. A smarter supply chain. Our global relationships with major contractors and suppliers can dramatically improve efficiencies and significantly reduce your time to market. Omer Wilson, Marketing Director | Europe, Digital Realty “You want to buy IT like a consumer, you want a PAYG product that's easy to buy, easy to deploy, easy to cancel. 
But you don't want to rely on open source, insecure, low performance, web-hosting-dressed-up-as-cloud: you want a grown up product, an enterprise grade, secure, high performance platform. That's what we at FireHost provide: the private cloud made public and the best of the public cloud made private.” Daniel Beazer, Director of Strategy, Firehost It's counterintuitive for security officers, who have been responsible for protecting an organisation's data and intellectual property, to give up control of infrastructure and data to a third party. So CSPs must become best-in-class security experts if they are to earn an organisation's trust. Trend Micro's products, services and CSP programs assist CSPs to deliver best in class security performance. Supporting your customer's own policies, Trend Micro manages security across the enterprise/cloud boundary and protects their data wherever it may be. We ensure that your customers' data regulations and confidentiality are held sacrosanct. And we deliver our services from the cloud, with uptime SLAs that follow your charging policies and simplify your billing. Optimal digital threat protection is best implemented using both the power of the cloud and from on-premise installations. When you're looking for a comprehensive and custom-defense system for your customers' digital assets, think of Trend Micro. Tim Ayling, Channel Director, Trend Micro When your business needs electricity, you don't need to build and manage a power station. When your business needs information technology, you don't need to build, manage and maintain hardware and software. This is the heart of the Cloud approach to IT - pick the right business solution to run your organisation, but let the experts take the pain of creating and managing the underlying technology. This is our approach at D2C - think business, not technology. We are Dedicated 2 Cloud, and Dedicated 2 Customers. We provide Cloud business solutions, practical business advice and proven experience of making things work. Our solutions include social media, accounting, CRM and ERP. We build social websites that work: web communities that connect and engage your customers, suppliers, partners, and employees. Whether you are a company needing a solution, or an IT provider making the transition to the Cloud, we can help. Dave Terrar, CEO, D2C Abiquo delivers an award-winning, purpose built Cloud Management solution that helps customers to build, operate and deliver their virtual data center and public cloud, integrated with existing technology, quickly and ready to monetise. Well, how can this benefit my business you ask? In essence, we don't care what you chose from a technology perspective, we provide a technology agnostic solution that you can integrate and deliver in a matter of days. This unique feature to our software sets us apart from our competitors; in this day and age technology should not be a barrier and we at Abiquo advocate just that and aim to prevent the detrimental effects of vendor lock-in. Our easy to understand interface means that our software is simple to use, not just for you but also for your customers. We currently work with a number of leading MSP's and many Global enterprises who have benefited immensely from our Cloud Management solution. Jim Darragh, CEO, Abiquo Information Security is one of the big three items on any cloud customer's shopping list, when looking to procure cloud services. “The First Rule of Business: Protect your investment”. 
All your customers are looking to protect their investment, so why wouldn't you provide the best possible, all-encompassing secure solution possible in order to compete in this overloaded market? What differentiates your cloud services from that of your competitor? Why not let security be that differentiator? Razor Thorn Security are the premier providers for information security within the EU and it's our job to assist you in providing the most secure solutions from a business, technological and physical viewpoint. We advise some of the largest and most successful organisations in the world, including some of the largest and most successful cloud companies in the market today. Why wouldn't you get the best information security advice possible? Ours… James Rees, Managing Director, Razorthorn Security Nimble is the world's first Social Relationship Manager. It easily connects all of your Contacts, Calendars, Direct Communications plus Social Listening and Engagement into a simple, affordable web based platform for individuals and teams. Nimble integrates LinkedIn, Facebook, Twitter & Google contacts with all of your digital communications and activities into one seamless environment. Nimble empowers small businesses in today's socially connected world to collaborate more efficiently, to listen and engage with their community in order to attract and retain the right customers. Richard Young, EMEA Director, Nimble No enterprise is predicting internal IT growth, instead enterprises are turning to service providers to provide private, hybrid and public cloud based IT. With a $30 billion market opportunity, and exponential growth predicted, we firmly believe this is the decade of the cloud service provider as enterprises turn to them to provide their cloud infrastructure. The challenge here is that many service providers have not yet created their cloud business. Instead, many have dressed up their existing infrastructure to appear as cloud. Our software helps these service providers to build their cloud services business by providing on-demand, fully automated provisioning. But it does more than that; it provides capabilities beyond those normally available including multi-currency billing, multi-level reseller support and extraordinary levels of customisation. These cloud management capabilities are necessary for any company building a commercial cloud service to secure, support and grow cloud customers. Jim Foley, SVP Market Development, Flexiant Ultra Secure? The rate of change of the IT landscape has never been faster or provided more opportunity for business transformation. Cloud computing and web based applications give us the possibility of access 24/7. Social tools allow everyone to get involved. Mobile devices and mobile networks mean we can do all of this on the move and from anywhere. The back office is combining with the front office and collaborating with customers, partners and stakeholders has never been easier. It's a huge opportunity for business agility, but it comes with an even bigger threat. You have to consider the confidentiality, integrity and availability of your data and systems with more precision than ever. Security can't be an add-on, it has to be considered at all levels and designed in from the start. You have to think Ultra Secure. Graham Spivey Sales and Marketing Director, The Bunker PLEASE NOTE: Whilst the competition is now over, this blog's comments are still very much open, and we welcome further 150-word submissions. 
All submissions will be showcased in a forthcoming Compare the Cloud initiative, so it's well worth telling everyone who you are and what you are about!
### Cloud-buying Questions
By Daniel Steeves
Cloud Service Provider: You've made your pitch and you're in the door, sitting across from some subset of senior management who are waiting to hear about how you and your cloud can change their world. Well done (especially these days!)… but there just might be a few questions before the deal closes: buying cloud from you is a leap of faith - not only in your business - but in their own business and its ability to capitalise on what you are offering. The following questions, posed in no particular order, are the kind you might hear coming from the buy-side of the table:
- Where are your data centres; how are they connected; who designed and built the whole thing, and how?
- What is your current technology landscape and what are your plans?
- What happens if my applications/data/websites are unavailable (and/or remain unavailable for an inordinate amount of time)? And, for that matter, what is the definition of an inordinate amount of time?
- Tell me about service levels and service credits.
- How quickly can you restore lost data (including recovery from user error)? What is the back-up regimen, frequency and retention policy? Where is the data backed up?
- Tell me about your support model for my business users, technical users and developers. Will we have an account manager and, if yes, what does that mean?
- Show me how I am not locked in to you: what are the mechanisms to ensure a cost-effective repatriation of applications and data (to either another provider or back into my own data centre)? What are the costs and timings of such decommissioning?
- While at this point you foresee no problems in moving our (pick one: ERP; bespoke trading platform; SOA; etc.) to your cloud... what is your approach (from due diligence through to the actual porting exercise) and what happens if there are problems? Will there be any impact on my costs?
- You seem like a new and risky (or successful and growing) business: what happens to my business if you should go under (or get acquired)?
- Where have you done any or all of the above? Can I speak with a current customer bearing some relationship to mine in terms of industry sector and scale? Can I see your customer case studies quoting business results?
[pullquote]Buying cloud from you is a leap of faith - not only in your business - but in their own business and its ability to capitalise on what you are offering.[/pullquote]When I started writing this, I had planned a list of only a few questions, intending to discuss each a little more, including views as to how to answer them: along the way it has become the start of a solid list of tough questions which might prove of value all along the supply chain (and I'd also suggest that, if a customer doesn't ask such questions, a larger opportunity just might exist to start a journey where you add extra value from day one of the sales process by posing and answering those questions together… never a bad way to start a relationship!). At this point I'd like to throw the floor open to you: what are the questions a service provider should be asked? Which are the questions a provider should be well prepared for? And what are the killer questions that might have caught you off-guard?!
### Customer Engagement in the Cloud
Next Thursday's EuroCloud UK (what's EuroCloud?
- see below) monthly meeting is about probably the most vital topic to all business, and the one that's been most effected by the intersection of three, simultaneous technology shifts currently in play: The shift to Cloud Computing and web based apps The web going Mobile - everyone's walking around with the Internet on their smart phones and tablets The explosion of social media - every brand wants you to like them on Facebook, follow their hashtag on Twitter, share content on LinkedIn, join their community - if you aren't blogging as a business you've missed the train and competition is already ahead of you. The topic is "Customer engagement through the Cloud". We'll be taking a 360 degree view of how Cloud changes the game, and where Cloud based solutions are being deployed successfully to help companies connect directly, efficiently and effectively with their target market and partner ecosystem. The event is on Thursday 14 March, starting at 16:00 at Google Campus, Tech City, London EC2. Full details and ticket booking on the Eventbrite page here. We follow a format of many, content rich presentation of 10 (or up to 15) minutes, and then all the speakers join a combined panel session for your questions at the end. At 18:00 we will move on to networking and refreshments in the Google Campus bar. I'll be introducing things, setting the scene, pointing to the different ways Cloud, social and mobile are making a difference, acting as MC and moderating the final panel session. Our speakers are as follows: Charlie Simpson of Present.me will be talking about what he calls the next step in social and business communication - exploiting the web, video and social networks to get your message across more effectively. Dorothy Meade of Blur Group will talk about how online marketplaces are changing the game for buyers and providers - using the web to connect projects with potential suppliers directly. Stein Sektnan of SuperOffice Software will talk about how Cloud makes CRM different, in terms of the flexibility of the product, the approach to implementation, and how web enablement moves buyers from traditional purchasing to digital purchasing model. Doug Clark, IBM's Cloud Lead for the UK, will talk about how the new landscape changes working with partners and service providers. Dmitry Aksenov of cool startup London Brand Management (as introduced by Compare the Cloud) will show us how Cloud will help us take the next step - using artificial intelligence technology to make sense of complex unstructured data from any device and then acting on it. This is technology that could have a myriad of different applications to add value to customer connections and business processes. All the speakers will come back on stage to answer questions from the audience, and we hope to get as much discussion and debate going as possible. Please go to EuroCloud UK's events page to book a place. About EuroCloud UK EuroCloud UK is a forum for companies engaged in cloud business activity to network together, share best practice advice and build the profile of the industry. About David Terrar David is a director of EuroCloud UK, CEO of D2C and you can find out more about him on our thought leaders page.   ### What makes a quality Cloud hosting provider? Part 1 After attending several Cloud Computing and technology focused exhibitions over the last few weeks, it is crystal clear that the Cloud industry is continuing to expand rapidly, with new technologies and solutions being showcased at each event. 
With this observation, combined with predictions from leading bodies such as IDC stating that: “the Cloud will comprise of $17.4 billion in I.T. purchases and grow to become a $44 billion market in 2013”, no wonder there is a positive feeling throughout the industry. What did strike me as an interesting issue throughout my travels was the uniformity of suppliers. They were pushing out exactly the same message to potential customers and it is becoming increasingly difficult, even for fellow providers, to work out exactly what services companies were offering. In an industry becoming saturated with the same marketing message, how is a customer expected to break through the noise and identify a quality hosting provider? This got me thinking and in my opinion, analysing two key areas of experience and investment will put your business in a strong position to do so. Investment, both in terms of technology and financial contributions must be considered as a whole. More providers are now telling customers that they are offering a full service, while not disclosing how they are actually achieving this. The first question one should ask is “what lies beneath?” how is the service you are looking to buy actually being delivered? Everyone has a glossy storefront, but back stage all things are not equal. Are you dealing directly with the Cloud provider or is your supplier a reseller? With automated self-service portals being increasingly white labelled, everyone is a Cloud provider. This is not a problem it its own right, but it does add an extra layer to the service you are purchasing, meaning you need to check your provider’s suppliers too. You really need to find out what hardware the provider is running on.  To create a basic hosting environment you only need a PC, some free virtualisation software and an Internet connection, and until things start to go wrong you will be blissfully unaware of your situation.  A quality hosting provider will be using enterprise hardware, with resilient components. For example, a provider can either buy a 120 GB solid state drive (SSD) for £80 or an enterprise grade version for £1000.  The difference being that the enterprise drive is designed to last longer under an intense workload, where a consumer drive is not and could fail every six months causing potential service outages.  When extending this concept to two servers costing £500 and one costing £6000, it’s not hard to appreciate the difference. How is the supplier operating the platform? Do you have to compete with other users for resources?  How does the platform deal with that contention? Different platforms handle this better than others and as many home broadband providers will notice, services can be slow during school holidays. Fine, but can you afford for your applications to be slow due to similar events occurring which are out of your control? I have heard of instances where providers are contending memory by 4 – 8 times on a server. It is vital to make sure you get some performance guarantees from your supplier. Where is your supplier hosting the service, in a data centre or in their bedroom? I met a man a few weeks ago who was hosting a Lotus Notes solution in his cellar at home over 6 ADSL circuits and was complaining about the rain. If you cannot see the platform, find out where it is. Finding out what Tier classification the data centre is should also give you a guide to its suitability.  
I would suggest that Tier 3 or 4 should be a target, Tier 4 being the best but still fairly uncommon in the UK. Is their data centre based in the UK or internationally? Location can have implications for both privacy and Internet performance. Did you know that if your data is affiliated with an American company, it instantly falls under the US Patriot Act, which means the American government can view the information without your permission? As for bandwidth, it is a long way to America, so a service accessed over the native Internet from the UK will be slower. I ran a few tests using a broadband speed check. The results on our fibre broadband connection were as follows:

| Route | Ping (ms) | Download (Mbps) | Upload (Mbps) |
|---|---|---|---|
| Leeds to London | 15 | 33 | 8 |
| Leeds to Dublin | 29 | 33 | 7 |
| Leeds to San Francisco | 157 | 22 | 6 |
| Leeds to LA | 172 | 15 | 4 |

However, this does vary depending on your broadband provider and the time of day. In order to stop the blog becoming a novel, I have decided to split it into parts. In part two, I will cover Cloud standards and the difference between 'what it says on the tin' and what the terms and conditions say. For more information on cloud computing from virtualDCS call +44 08453 888 327, email enquiries@virtualDCS.co.uk or visit www.virtualDCS.co.uk
### Virtual patching: a way out of the patch maelstrom
By Tim Ayling, Director for Channels and Marketing, Trend Micro
One of the biggest problems data centre owners have today is patching. It's a burdensome, costly and time-consuming affair that's often done manually and, given the current threat landscape, can leave mission-critical systems open to new threats for dangerously long periods. Typical data centres today may be running systems from a hotchpotch of vendors that need patching, all with different schedules and different levels of criticality. Oracle's patch load is legendary, while Microsoft's Patch Tuesday is written on the calendar of most system administrators in double-thick red pen. Add to this complexity the fact that many systems are going out of support and no longer have patches issued, and you get another headache for the IT department. Then try multiplying this a thousandfold in the environment of a cloud service provider, tasked with keeping secure a data centre servicing hundreds of thousands of users. These businesses are increasingly differentiating on the security and stability of their services – in this context a missed patch could lead to a serious outage or security incident, bad headlines and an exit of customers. Today's patch managers have an unenviable task, not least because of zero-day threats. As soon as a vulnerability has been discovered or publicly announced, the clock is ticking. Make no mistake; the bad guys have their own SLAs to produce an exploit before the vendor gets there first with a patch of their own. It's then the job of the overworked system administrator to make sure their systems aren't exposed, and in virtual environments it can be even more challenging. The most important thing to remember is that security teams can't shoe-horn their tried and tested physical security tools and techniques into virtual environments. It needs to be virtual patching. If organisations simply don't have the resources to patch more often than every 3-6 months, virtual patching can provide a sticking plaster to fix the issue and protect the relevant systems from vulnerabilities until those patches are applied.
It should be an agentless virtual patching system which protects at a hypervisor level, because inserting agents onto each VM will degrade performance. The benefits are obvious. It's all about performance, cost and security. If automated, virtual patching can save valuable man hours, as well as extend the lifespan of legacy applications which are no longer supported, and reduce the business disruption caused by emergency patches. More importantly, for the cloud provider it means peace of mind and knowing your customers are safe. ### Cloud Expo Europe 2013: A Business Consultant's Viewpoint By Daniel Steeves, a Director at Beyond Solutions I've been attending IT industry shows and events the nature of this one since the ‘old days' when business computing existed only on mainframes (which, by the way, are in many ways the predecessor architecture of cloud which is, effectively, computing re-centralised…) and I recall only a few where I have felt it worth attending for more than one day: Cloud Expo Europe 2013 was one of them. As mentioned in a recent blog published elsewhere, I typically counsel any business to focus on requirements and targets before looking at the technology, but these times (and this show) are a little different. Because you can do more than you could before – particularly without spending a large amount of money up front to get started doing it – thinking outside of the box effectively requires knowledge of what might be out there and I am now more likely to suggest that you see what 'could be' prior to defining or designing what you want. Lessons learned? “Events of this nature, with vendors, seminars and keynotes, help you to see what 'could be' prior to defining or designing what you want” While cloud computing is far from new as a concept, what is available and how it is being delivered – not to mention accessibility to it, both 'physically' and financially – might just be a game changer for your business or for your clients. Events such as this provide access to those who do it and with those who have done it, with the latter arguably the more important part of the mix. Cloud Computing is here to stay… then again, we've known that since well before people starting calling it cloud but with the growth and opportunity focus of an entire industry, world-wide and including all sides of the equation – manufacturers and vendors, service providers, techies and end users – we can be pretty sure that the quality and quantity of services on offer will do nothing but increase going forward. So the primary lesson learned is not necessarily a new one: with this type of growth comes more players whose numbers will decrease over time, which means that one of my stock answers applies: Get your requirements straight and do the due diligence (take a look here at a recent blog I wrote on that subject for a starting point) before choosing partners in the cloud computing space. Observations on Exhibitors And an event of this nature is a good place to get started doing just that… the sector was well-covered, including some large players with a small footprint, some not-so-large players clearly staking a claim with a more significant presence and some very small entries who look ready to cause an impact. I was pleased to come across little of the hype-driven marketing we've been seeing over the last years, with what seemed a more mature presentation of more mature capabilities. 
With booths stocked with CEOs and CTOs (and the occasional Business Development exec) it was clear that the customer focus of the sector is changing. “Cloud to Clarity wasn't quite delivered… but this is not a reflection on either the event or the exhibitors, though, rather on the state of a rapidly evolving industry… but it is getting better” But not quite far enough: I was disappointed on an overall lack of focus on the business view – with a few notable exceptions – which I feel would have better suited the strong line-up business-themed speakers and seminars including the end user community (along with the usual technical and product coverage) or the needs of what turned out to be a very broad cross-section of attendees. Cloud Expo Europe Comment: New for 2014: Cloud Expo Europe will host a vertically led roundtable seminar programme, exclusively focussed on the pressing requirements of Europe's business and technology focussed CXOs. This seminar programme is open to senior directors from end-user organisations and delegate places will be allocated by application or invitation only. I also feel that the majority of exhibitors missed on the opportunity to differentiate themselves and educate their customers: I may know that all clouds are not created equal but as far as helping potential customers to discern the differences - and, from service levels and service credits through to customisation, maintenance, or upgrade procedures and testing, or open or closed or public or private, there is a lot to compare - let alone clearly understand why cloud computing is the right answer, for their business: there is some work yet to be done. Finally, while I have been known to complain about hype, I do need to keep informed… this article covers (mostly cloud-related) 2013 Top Terminologies-to-be which will matter and which were, for the most part, not covered other than in conversations. Observations on Attendees On the other hand it proved a great show to re-connect with colleagues and contacts across the industry and that broad brush of attendees and exhibitors, ranging from CEOs to techies, from end users to Service Providers (and more than a few of their non-exhibiting competitors), and from Industry Analysts to Cloud and Industry experts – allowed for an equally broad range of conversations. Observations on the Event I found attending Cloud Expo Europe to be a very productive use of my time and, using Twitter as a measure, most vendors and attendees felt the same. However, the promise of moving from ‘Cloud to Clarity' wasn't quite delivered, despite a fine venue and a solid set of information sessions. Delivery to that promise needed to come from the exhibitors, very few of whom were able to do so effectively. This is not a reflection on either the event or the exhibitors, though, rather on the state of a rapidly evolving industry… constant flux in an already complex environment consisting of complicated stuff and clever people working out convoluted ways to deliver better, stronger, faster and cheaper solutions ensures that clarity is never easy. But it is getting better… and I for one need events of this nature to stay connected and informed. ### Virtus appoints new Marketing Director London, 28 January 2013 – Virtus Data Centres Ltd, (Virtus), London’s flexible and efficient data centre specialists announced today that Venessa Wilson has joined Virtus as its new Marketing Director. 
With over 20 years of experience within the IT sector including time working at  Equinix, Cisco Systems, Tivoli (IBM) and 3Com, Venessa now has overall responsibility for all marketing activities within the business from managing the brand to developing campaigns into all of Virtus’ key verticals.   Venessa was also a Client Services Director at a specialist IT Channel Marketing agency, where she worked for a wide variety of global IT software and hardware vendors. Neil Cresswell, CEO of Virtus, commented: “I am absolutely delighted that Venessa has joined our team. Her previous experience within the IT industry on both client and agency side means she has a holistic view of marketing, from inception to delivery and management. This will allow Virtus to be able to develop and deliver its strategic marketing objectives  quickly and efficiently in future, whilst being flexible enough to evolve strategies when required. Venessa’s unique abilities and her very pragmatic ‘hands on’ approach is an excellent fit for Virtus and we are thrilled to have her working with us.”   ABOUT VIRTUS: Virtus, London’s flexible and efficient data centre specialists, help customers optimize their IT for today’s challenges. We assist clients to power their 24/7 IT operations with enough flexibility to enable IT expansion and consolidation, which helps businesses to run smoothly and grow securely. Our data centre and connectivity solutions provide the foundation for technology which is increasingly at the front and centre of every successful, connected business and helps make IT more efficient, flexible and more cost-aligned to changing requirements. In the last five years we have built a reputation for exceptional service and value. We specialize in the middle-market for data centre solutions in London, incorporating the best of wholesale and retail models. Virtus solutions are ideal for businesses that need scalable, reliable and efficient colocation, with the convenience and rich connectivity of London locations. They are uniquely flexible solutions supported by honest, responsive customer support, at the lowest TCO available in London. Virtus is London’s fastest growing specialist data centre provider. Our highly experienced management and operations team has a perfect track record of delivering 100% availability to our cloud, MSP, financial, media and education customers. All our solutions are delivered from large eco-efficient data centres that have been designed, built and operated by Virtus on land that we own. Our expanding portfolio currently comprises 28 MVA of power in 21,242 sq. m. of data centre space located in outer London’s Zone 5 and interconnected to key sites in London by 20 leading network carriers. Today, we continue to innovate in the way businesses of all types deploy and acquire colocation and connectivity services. Virtus is London’s flexible and efficient data centre specialists for IT Service providers and Enterprise IT Companies. For more information please go to: www.virtusdatacentres.com or contact us on twitter: @VirtusDCs. ### The Google Apps Option for Small Business By Gina Smith, Freelance Technology Writer A growing number of small businesses, universities and even large corporations are choosing to implement Google Apps for Business (Google Apps) into their IT mix. In fact, some estimates indicate over 5 million businesses have now implemented Google's innovative, cloud-based technology. There are significant advantages to using Google Apps. 
A Complete Suite: Google Apps offer a compatible and complete suite of office applications. Gmail (e-mail), Docs (create, share and edit/compose documents with others), Sheets (spreadsheet), Slides (presentation), Calendar, Talk (chat and video chat) Drive (cloud storage) and Picasa (photo storage and sharing) are among Google's advanced tools designed for business productivity. Docs, Sheets and Slides even convert into Microsoft Office formats. Collaboration: Now, your Vice President in Los Angeles, Marketing Consultant in Orlando and Sales Manager in New York can all work together on that big proposal from the comfort of their own offices. Google Apps allow everyone to write and edit together “live” in the cloud. Easy Access and Backup: Google Apps allow for easy access, sharing and editing of documents, presentations and information. And, because editing and revisions are composed in “real time”, your work is instantly backed up and stored in the cloud (more specifically, in Google Drive). You can also upload new documents, files and projects for you (and anyone you give sharing permissions to) to access. Chrome: Google's Chrome browser is one of the most highly rated, fastest browsers currently available. Additionally, rather than outfitting your staff with expensive laptops, you now can consider the less expensive Chromebook option. Chromebooks boot up almost instantly and have a user interface designed specifically and optimized for Google Apps. Google Apps is easy to manage and designed with the user in mind. It can be integrated with all of your mobile devices, including phones and tablets. Apple users sometimes shy away from Google Apps because they fear it will not integrate. However, I have found no compatibility issues with my Apple products. Although Google Apps stores your information in the cloud, some IT experts recommend considering additional backup, “just in case”. There are companies such as spanning.com which offer qualified cloud backup and restoration services. If you plan to go this route, be sure to use a provider which specifically specializes in the backup and restoration Google Apps. Not only is Google Apps easy to use and manage, but many companies even report an impressive cost-savings after implementing Google Apps. Don't be afraid to log on to the Google Apps for Business website and take advantage of the free trial. Gina Smith writes freelance articles for magazines, online outlets and publications. Smith covers the latest topics in the business, golf, tourism, technology and entertainment industries. ### Grey Clouds over Cost Savings By Michael Queenan, Director - Nephos Technologies If you are looking at Cloud for just cost savings you're going to be disappointed. Let me start with a bold statement - Cloud, as a vanilla term, is not the cost savings beast that everyone has been telling you it is. Don't get me wrong there are cost savings opportunities in certain areas – which I will come on to later – but if you are looking at Cloud for a pure cost saving exercise you are likely to be disappointed. As a quick introduction, and in case you haven't heard of Nephos Technologies before, we are a completely independent Cloud services firm that was started due to the lack of openness and transparency we heard from our customers. We kept hearing that there didn't seem to be enough companies looking to help customers understand Cloud, providing honest, unbiased advice and recommendations into how and where customers could and should look at Cloud services. 
So that is where we started from, and it is one of the key services we provide. The part that most amuses me about this topic is that it is a problem caused by the Cloud providers themselves, as highlighted beautifully in Daniel Steeves' post (Stop Selling Cloud). Everyone knows there is a big opportunity in Cloud, as all the analysts keep telling us, so it is the old "Field of Dreams" approach: "if you build it they will come". Instead of trying to understand how their customers' businesses work and where the most appropriate place is to look at Cloud, all these companies realised they had spent millions on building a Cloud and needed to sell it, so the easiest place to start was to create an ROI, whether real or imagined.
Really Looking at Costs
In my opinion, the first thing that needs to be done before looking at any Cloud service is to truly understand the cost of running that service in-house. Surprisingly to me, this is still something that most organisations don't have a handle on yet, and until these costs are understood there is no way to know whether moving them outside the business is better or not. Building out the overall internal cost of a service isn't just about hardware: there are a number of other elements that need to be included, such as hosting, electricity, cooling, management, refresh rate and write-off period. The last point is especially important in certain market sectors such as the public sector, where write-off periods can be up to 10 years. Trying to build an ROI on hardware that isn't going to be replaced for 10 years is impossible! The cost argument can also be a difficult one to make for certain markets, as they are not set up to suddenly drop 50% of their capital budgets but have their operational cost increase by 400%. The problem I'm starting to see in this area, more in SME and mid-market organisations, is that there are now so many unregulated Cloud companies out there trying to sell their services on cost rather than business benefit that customers often don't get the savings, or the service, they expect, which ultimately leaves them disappointed or disillusioned with Cloud as it doesn't live up to the hype. I believe this is why the work of organisations such as The Cloud Industry Forum is so important and needs to be encouraged, to help act as a kite mark to stop the cowboy companies out for a quick buck.
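To make the cost comparison above concrete, here is a minimal sketch; every figure and cost category below is hypothetical, not taken from any real deployment or provider quote:

```python
# Illustrative only: all numbers below are hypothetical, for demonstration.

def annual_inhouse_cost(hardware_cost: float,
                        writeoff_years: float,
                        hosting: float,
                        electricity_and_cooling: float,
                        management: float) -> float:
    """Rough yearly cost of running a service in-house: hardware spread over
    its write-off period, plus the recurring items named in the article."""
    return hardware_cost / writeoff_years + hosting + electricity_and_cooling + management

inhouse = annual_inhouse_cost(hardware_cost=60_000, writeoff_years=5,
                              hosting=8_000, electricity_and_cooling=6_000,
                              management=20_000)
cloud_quote = 40_000  # hypothetical annual price for an equivalent cloud service

print(f"In-house: £{inhouse:,.0f} per year, cloud quote: £{cloud_quote:,.0f} per year")
# A 10-year write-off period changes the picture completely, which is why a
# like-for-like ROI is so hard to build in parts of the public sector.
```

Even a rough model like this forces the write-off period and the recurring running costs into the open, which is exactly the information most organisations don't yet have a handle on.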
Where Cloud does Help
Despite these challenges, it's not all doom and gloom. If you look at Cloud services for more than just cost savings, your business will undoubtedly reap the rewards. There are a number of areas that I believe Cloud is great for:
- Agility: This to me is the biggest reason to look at Cloud from a business perspective, as the ability to "right size" your environment based on usage is hugely important to a number of markets, such as retail, media and consumer brands, whose infrastructure requirements go up and down depending on the time of year.
- Management: For organisations with small IT teams this is a huge consideration. The fact that they no longer have to worry about hardware upgrades, patch management and the overall operational cost of keeping services running means they can focus on delivering business innovation through IT.
- Get access to enterprise software you couldn't afford before: The days of only large enterprises having access to enterprise software are long gone, and SaaS gives companies of all sizes access to software that enables them to compete with organisations of a much bigger size.
- Certain Cloud services do work on an ROI model: There are a number of Cloud services, such as DR as a Service, Storage as a Service and Backup as a Service, for which it is very easy to build an ROI due to the known costs of providing these services in-house. A good example is backup, where Nephos Technologies partners with companies that can provide this service for around £1500 per TB per year, which is hugely cheaper than providing it on-site.
- Test and development: To me this is a no-brainer, as it greatly reduces the cost of deploying these types of environments and also decreases the time to market for new services through the software lifecycle.
- Technology refresh: This is the time when an ROI argument comes to the fore, if customers are looking at changing from a capital to an operational model.
- Mobile/new markets: With a number of companies looking to interact with customers in new ways (mobile/social media) and also looking into new international markets, Cloud provides an easy way to test these strategies without a huge investment.
To highlight that it is indeed possible to offer cost savings through Cloud if you understand the customer's business requirements: Nephos Technologies recently completed a piece of work for a European retailer where we delivered 45% savings on predicted capital expenditure, 38% savings on overall operational cost and 61% savings on IT for opening up a new site. We did this by understanding their business plans and matching services to those problems, not by pumping a load of numbers into a spreadsheet to demonstrate an ROI. The last thing I will say on this topic is that Cloud needs to be looked at in the same way as any new technology and should be adopted based on business requirements, not because it is the latest flashy thing to do. You need to understand why you're considering it as an option and what the end game is – what benefit are you really trying to achieve? [pullquote]Cloud needs to be looked at the same as any new technology... what benefit are you really trying to achieve?![/pullquote]At Nephos Technologies our Cloud Feasibility Study delivers an adoption strategy that highlights the areas we believe you should, could and definitely should not look at as potential candidates for deploying Cloud services. It starts with speaking to line-of-business managers to get an understanding of how your business works and what your future business plans are. We also evaluate your current IT infrastructure to understand development plans and changing requirements. Then, and only then, do we feel we are in a place to start looking at where Cloud might be able to help our customers achieve those goals.
### Taking Credit Card Payments? How PCI-DSS compliant is your Cloud Service Provider?
By Daniel Beazer, Director of Strategy at FireHost Inc.
The large amount of marketing hot air created by the hosting and cloud industry around PCI DSS came up for debate at the recent Cloud Industry Forum's 'Curry in the Cloud' event. Michael Queenan of Nephos Technologies singled out hosting companies that claim to be compliant with the payment card standard when they are only riding off the back of a certification held by their carrier-neutral data centre. It's not just internet industry insiders who are alarmed by the unfounded claims made by hosting and cloud providers.
Neira Jones, the widely respected Head of Payment Security at Barclaycard, spoke at the last Merchant Risk Forum about the number of companies exhibiting at Internet World claiming to be PCI compliant but, on closer examination, turning out to be anything but. But without the expertise of a Qualified Security Assessor (QSA) or a Neira Jones, how is an online retailer, charity or other organisation taking credit card payments to tell whether a PCI compliant badge on a site is a marketing tool or the mark of a useful service? It's easy to see why hosters and cloud providers want to leap on the PCI DSS bandwagon. After all, web sites that have an e-commerce function are generating revenues, and those are the type of customers we all want to have. What is harder is having the tools to decipher the marketing materials out there and work out exactly what a cloud hosting provider is offering.

how is an online retailer... to tell whether a PCI compliant badge on a site is a marketing tool or the mark of a useful service?

How to make yourself PCI compliant with a wave of a wand

Part of the problem is with the standard itself. It's very broad, covering all aspects of credit card payments, from how to deal with paper credit card slips to anti-virus software. Some of the requirements are specific, such as 3.2 (‘Do not store sensitive authentication data after authorisation'), whereas others are more like general good practice requirements, like 9.1 (‘Restrict physical access to the place where the data is stored') and requirement 12 (‘Maintain a policy that addresses information security for employees and contractors'). As the broader requirements are the sort of policies that any cloud or hosting provider will have in place, all a provider has to do is pay for a QSA to come in for a couple of days and vet requirements 9 and 12; hey presto, with the waft of a wand, the provider is PCI compliant without changing a single feature of its product set. Never mind those inconvenient sections about firewalls, monitoring, secure networks and server separation. Of course, anyone wanting a PCI compliant hosting solution is a sophisticated buyer and therefore quite capable of seeing through these kinds of assertions, you might think. But the anecdotal evidence is that customers who are lulled into thinking that the compliance box has been ticked are not too happy when they find out that they are on their own for ten out of twelve requirements. How can the innocent eCommerce site go about making sure it goes with a provider that actually offers a meaningful PCI service, versus a rack piggybacking off a data centre certification? The answer is to request the Attestation of Compliance (AOC) from the service provider. The AOC will tell you specifically what was included and, more importantly, excluded from the assessment. In the event they will not provide the entire AOC, ask them for the Scoping section, as this section is required to contain a description of what was included and excluded from the assessment. In some cases the QSA will even provide in this section a list of the PCI DSS requirements that a customer of the service provider can rely on. If the service provider is unwilling to share this information with you, that is a sign that they are likely not validated for anything beyond requirements 9 and 12.
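To make that check concrete, here is a minimal sketch of the comparison a buyer can run once the provider has shared which requirement areas its AOC Scoping section actually covers. The twelve requirement titles below are paraphrased summaries of the PCI DSS requirement areas rather than the standard's official wording, and the helper function is purely illustrative – it is not part of any assessment toolkit.

```python
# Illustrative due-diligence helper: compare the requirement areas a provider's
# AOC Scoping section says were assessed against the full PCI DSS requirement set.
# Requirement titles are paraphrased summaries, not the official wording.

PCI_DSS_REQUIREMENTS = {
    1: "Install and maintain a firewall configuration",
    2: "Do not use vendor-supplied defaults for passwords and settings",
    3: "Protect stored cardholder data",
    4: "Encrypt cardholder data transmitted over open, public networks",
    5: "Use and regularly update anti-virus software",
    6: "Develop and maintain secure systems and applications",
    7: "Restrict access to cardholder data by business need-to-know",
    8: "Assign a unique ID to each person with computer access",
    9: "Restrict physical access to cardholder data",
    10: "Track and monitor all access to network resources and data",
    11: "Regularly test security systems and processes",
    12: "Maintain an information security policy",
}


def residual_responsibilities(provider_covers):
    """Return the requirement areas the customer must still satisfy itself."""
    return {n: title for n, title in PCI_DSS_REQUIREMENTS.items()
            if n not in provider_covers}


if __name__ == "__main__":
    # Example: a provider whose AOC only scopes physical access (9) and
    # the security policy (12) - the pattern described in the article.
    gaps = residual_responsibilities({9, 12})
    print(f"You remain responsible for {len(gaps)} of 12 requirement areas:")
    for number, title in sorted(gaps.items()):
        print(f"  Requirement {number}: {title}")
```

Run against a provider that is only validated for requirements 9 and 12, the sketch lists the ten requirement areas the customer is still responsible for – exactly the "ten out of twelve" situation described above.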
In the meantime, we should remember that the kind of tricks played with PCI can be played with just about any standard, and web shops processing credit card information should check and check again when it comes to any cloud provider claiming to be compliant.

Expert Commentary

By James Rees, QSA and Director of Razor Thorn Security

I agree on the whole with FireHost, and Neira Jones is absolutely correct. We have found a number of cloud companies attempting to dodge the PCI DSS compliance requirements by using compliant datacentres. It is a severe problem in a number of industries – ecommerce, web design and cloud companies are where this occurs far too often. As a QSA I can tell you that if you are assisting a company in any way with the storage, transference or processing of card information, you will need to be compliant with the PCI DSS requirements; the extent of your liability will be dictated by the services you offer to the client. Cloud companies looking to prove their compliance will need to supply the following:
- A signed and current Attestation of Compliance (AoC) as a Service Provider, OR allow the customer's QSA to come on site and undertake an audit of the items within the PCI DSS compliance model that the cloud company facilitates for the client being audited.
- Access to the Executive Summary section of the service provider's Report on Compliance (RoC). Some QSAs will ask for this if they want to double-check what the service provider can actually facilitate, or if they still have questions after reviewing the AoC.
- Listing of the service provider on the Visa Merchant Agent portal. This is not a requirement, but Visa do maintain a list of certified service providers. The list is being populated at the moment and is still in its early stages, so not all service providers are listed yet, though this is expected to change within the next year.

Other items that could help smooth the process:
- Allow the client to speak to your QSA; maybe that client has some special and specific requirements that may change your PCI DSS accreditation in some way… If in doubt, check with a professional.
- Engage your QSA to create a jargon-free, plain-English translation for the sales team of what the cloud company can facilitate and what it cannot. This is invaluable as it prevents the cloud sales people promising PCI DSS-related services that cannot be catered for (this is something we do for all of our clients as standard at Razor Thorn Security).
- Allow your client to ask your QSA questions related to PCI DSS compliance. If they have specific concerns or some strange requirements, not only will you need to know, but it promotes confidence that you as the cloud service provider take PCI DSS seriously. Don't rely solely on internal people to answer these questions; the QSA badge is a powerful confidence booster and may actually win you the client because you're seen as helpful…

### Cloud Control: A spotlight on the grey areas of security

by Databarracks

One of the biggest stumbling blocks when it comes to cloud services, either public or private, is security. So what should you look out for when choosing a cloud service provider? Databarracks’ Head of R&D, Radek Dymacz, gives his advice:

Service providers and customers have different definitions of what “the cloud” is. As a result, you see a lot of offerings in the marketplace that would have been described as “hosting” a few years ago but are now termed “cloud services”.
Quite often, these older services won’t have the same security features built in to their cloud infrastructure. There’s also a tendency to think of cloud computing as being some abstract service existing in the ether. It isn’t. These services are hosted on actual servers in real data centres. Even if you’ve made sure your cloud infrastructure is correctly isolated and secure, it will still be vulnerable if the hosting site is unsafe. If someone can break into the data centre and steal your hardware, as happened to Vodafone, then your data and even your whole operation is at risk. Several hundred thousand customers were affected after thieves stole equipment from one of the telecom giant’s data centres in Basingstoke last year. So make sure your cloud provider’s data centre has robust physical protection and the strictest protocol when it comes to access. But while many IT managers think their data will be better off kept in-house in their own server room, a good cloud service provider should be able to improve security practices - after all you are buying into their resources and best practices. Public cloud security is usually self-service and straightforward. Public cloud vendors are generally larger and quite transparent about security. With the public cloud, you can spin up a server very quickly, but it is also your responsibility to secure and protect it, which includes having a backup of your server should something go wrong. In the case of virtual private cloud infrastructure, the service provider will generally take on responsibility for security in areas like backup or managing firewalls. But you need to be clear about what’s included in the service provider’s scope and what’s left to you, the customer, to manage. Virtual private clouds should conceptually be more secure but there are more private cloud service providers and so more ways to architect a private cloud. The greater range of options leaves an increased chance of security problems. If opting for a virtual private cloud through a service provider or IaaS, investigate the network and OS level security measures in place. You may also want additional security features which don’t necessarily come as standard in cloud services. For example, 2-factor authentication isn’t a universal feature, but will improve access security. Like the public cloud, virtual private cloud services are based on using a shared, multi-tenanted platform. Therefore, the important questions to ask your service provider should be focussed on network security: how is the environment setup to allow for network isolation? What underlying technology has the service provider built their cloud services on? Established technologies, such as VMware’s vCloud have network security technologies built in. vShield for instance is designed specifically with cloud service providers in mind. There are lots of new products on the market so you need ensure your service provider is using proven technology and that it’s included as part of the service. Additionally, check what provisions are in place to prevent brute force attacks on the network. This can be as simple as automatically denying access if a password is entered incorrectly more than a pre-set number of times. Keeping your data private doesn’t just mean thwarting threats from hackers - it can just as easily be preventing access to your data from any other parties. The most high-profile example of this would be the USA Patriot Act. 
Quite simply, if your data is hosted in the US or by a US-owned company, it falls under the jurisdiction of the USA Patriot Act. The US government can demand to see your data and the service provider will be forced to hand it over. So question how sensitive your data is before storing it with a US provider and make sure your contract is super tight. As standard, data in transit should be encrypted but for most cloud services, data at rest won’t be. If your information is very sensitive, you may want to include encryption of data at rest in the cloud. However, be aware that it will have an impact on performance. As standard, data in transit should be encrypted but for most cloud services, data at rest won’t be. There are other standards to look out for when it comes to selecting a service provider. ISO 9001 is a BSi assessed certification which indicates that a company is well managed, while ISO 27001 is the globally recognised benchmark for Information security. Above all, you need to have a solid service level agreement which not only guarantees levels of physical and digital security, but also gives a clear indication of which areas are the responsibility of the provider and which are accountable to the customer. Bio: Radek Dymacz is Head of R&D for Databarracks, a provider of cloud hosting, virtualisation and managed backup services. Radek studied computer science in Poland and joined Databarracks as an Open Source and Linux/Unix specialist 5 years ago. He has since progressed to become Head of R&D and is responsible for keeping Databarracks ahead of the competition when it comes to technology adoption. His other areas of expertise include cloud security, Object Storage and Enterprise Storage. In his spare time, Radek enjoys composing music and playing guitar. Databarracks provides secure cloud infrastructure, backup and disaster recovery services from UK data centres hosted at The Bunker. ### Vissensa an IBM MSP Success Story An interview with Steve Groom, CEO of Vissensa Recently Compare the Cloud had the pleasure of meeting Steve Groom CEO of Vissensa, a managed services company with a comprehensive IT service portfolio which is predominantly based on IBM hardware and software as well as the IBM Cloud - SmartCloud. We asked a number of questions concerning how IBM has helped Vissensa become the organisation they are today. [quote_box_center] Vissensa - a bit of background... Vissensa was formed by a group of commercial and technical professionals from an NHS managed service background who achieved an enviable record of 6 years zero downtime on the ISoft architecture they were responsible for. While servicing this NHS contract, they established data centres in the UK and now as Vissensa have grown into a four datacentre operation and recently grew into a fifth site where they became an anchor tenant for the new Datum data centre in Farnborough. During this development they have replicated this scale of footprint to bring carrier class stability and performance to the Hosted Desktop/ Backup and managed services market. By standardising on IBM hardware and services, Vissensa has established a strong USP and has laid solid foundations through an aggressive growth and reinvestment strategy. [/quote_box_center] CTC: Steve, tell us about your background and why you founded Vissensa? I was very fortunate to have worked with a technical team who had a “never fail” attitude that is rare in an overall delivery model. 
This team managed a complex primary and DR environment for patient data for NHS clients and had exceeded all the service levels required for a long time. That made it very easy to design and implement our first baby steps whilst laying the core foundation of what Vissensa stands for - robust, quality services, forward thought and leadership.

CTC: What separates you from the abundance of hosted desktop and managed services companies?

I think the answer is threefold. Firstly, we have developed domain expertise in using the available desktop technologies out there, and so have an overarching view of what technology suits a client’s business model, regardless of vendor; but more importantly we look at a desktop deployment in the longer term and from the “consumer” end - not the IT department perspective. This allows Vissensa to develop a roadmap with the client of how their desktop experience is set to develop, which may be through thin client deployment or the adoption of BYOD (Bring Your Own Device) in the organisation. On top of this we collaborate with other companies, introducing them as part of the overall solution to ensure that a comprehensive working model can be engineered. For example, rolling out BYOD brings a new set of challenges to the enterprise - such as security. We work closely with specialists like Jim Rees at Razor Thorn Security, who has many years of broad security experience, to advise on a specific client’s future desktop requirements. Lastly, the final differentiator is that we provide integration to a range of other cloud services such as online backup and file collaboration, as well as application hosting / PaaS.

BYOD brings a new set of challenges to the enterprise - such as security

CTC: How has IBM helped Vissensa in recent years?

IBM continues to be a key vendor for Vissensa as it has invested in developing an embracing partner programme which can be mapped to a business partner's market and trajectory. IBM's hardware & software allows Vissensa to build and augment industry-leading components into a range of solutions that can underpin our own USPs and capabilities. IBM's marketing and channel programmes have been re-aligned to a business-partner-focused model and this makes “doing business” with IBM as a business partner much simpler. This is really apparent in their "first mover" position as a major vendor aligning itself to MSPs in response to the market's growing concerns about the shrinking reseller model across the technology marketplace.

CTC: What products within the IBM cloud portfolio have Vissensa adopted?

Vissensa have adopted a hardware position where we standardise on IBM blade, FlexSystem and unified storage (Storwize etc.) for hardware, with the associated integrated software stacks which come with them - such as Tivoli FlashCopy and TSM. We’ve been working closely with some of the IBM team on the LotusLive and collaboration tools for deployment into the education sector, and we have also adopted WebSphere for a specific healthcare app we built. In terms of management we’re in the early stages with IBM SmartCloud Control Desk as part of our extended desktop management offerings - and of course we have been implementing the IBM Virtual Desktop product, which recently became a Tivoli-branded solution along with endpoint management and security.

CTC: What is the integration like between IBM cloud products?
One of the things Vissensa is starting to like about IBM and Tivoli is that the amassed wealth of IBM as a software company, and its ability to adopt and then integrate products, is now becoming really evident. An example: the IBM SmartCloud Desktop Infrastructure product already has a number of integration points in it, such as patch management - which may not seem revolutionary - but when you factor in that IBM's patch management (Tivoli Endpoint Manager) is both “fat client” and “thin client” aware, this allows one tool to be used to patch a traditional community of users as well as a centrally managed VDI one.

CTC: IBM are making a push into the MSP market; based on your experience, would you advocate that managed service providers join the IBM cloud partner program?

IBM have been very helpful at each step of our development

I think we’re a good advert for companies that are moving to an MSP model, as we have developed our solution from the ground up and have become successful based on the companies we have teamed up with and the technology we have deployed. IBM have been very helpful at each step of our development, which is transacted through the partner program. It’s the partner program - effective, consistent and understandable - which helps new organisations align themselves with what they “are and are not”.

CTC: One question we ask in every interview: what is your definition of cloud computing?

For me “cloud” is a consumption model based on the business requirement which is flexible and transparent to change. Many don’t prioritise the need to have technical and support personnel available on and off premises as part of a cloud solution, which when you think about it is fundamental - since rarely does a company move “big bang” style into the cloud.

CTC: And if you could change one thing within the cloud computing market, what would it be?

I think more comprehensive information, which can give rise to a more informed understanding of what cloud is and what is needed to engage a business with it - particularly for the SME market - would be great.

CTC: And finally, anything you can share with regard to Vissensa’s plans for 2013?

2013 is going to be a massive year for Vissensa. We have already seen triple-digit growth in terms of our cloud-based revenues, and this is set to accelerate further. We’re being asked to provide cloud into different countries - South Africa and South East Asia being two which are high on the radar - but we will also be announcing a comprehensive cloud toolbox and service, integrated in a management dashboard, designed to address all aspects of how business will consume corporate data from a widening range of ubiquitous BYO devices. …so watch this space!

Thank you Steve. For further information on Vissensa visit www.vissensa.com or get in touch with Compare the Cloud for an introduction.

### Office 365 and Google Apps, what is there to choose?

By Kerry Burn, CEO, 848 Group

Understanding the differences between Google Apps and Office 365 is a little bit like the comparison made a few years ago by the Italian writer Umberto Eco between Apple and PCs. [quote_box_center] Umberto argued the following: ‘The fact is that the world is divided between users of the Macintosh computer and users of MS-DOS compatible computers. I am firmly of the opinion that the Macintosh is Catholic and that DOS is Protestant. Indeed, the Macintosh is counterreformist and has been influenced by the “ratio studiorum” of the Jesuits.
It is cheerful, friendly, conciliatory, it tells the faithful how they must proceed step by step to reach – if not the Kingdom of Heaven – the moment in which their document is printed. It is catechistic: the essence of revelation is dealt with via simple formulae and sumptuous icons. Everyone has a right to salvation. DOS is Protestant, or even Calvinistic. It allows free interpretation of scripture, demands difficult personal decisions, imposes a subtle hermeneutics upon the user, and takes for granted the idea that not all can reach salvation. To make the system work you need to interpret the program yourself: a long way from the baroque community of revelers, the user is closed within the loneliness of his own inner torment.’ [/quote_box_center]

To make the system work you need to interpret the program yourself. (Umberto Eco)

The current face-off between Google and Microsoft is not dissimilar. In the following paragraphs I give a brief overview of each; 848 cloud services can advise further on both.

Office 365

Following the point made by Umberto, Office 365 for Enterprises is a product that offers salvation to those who are prepared to put the effort in. Out of the box you can set up a powerful messaging and collaboration platform (using Lync) and you can set up saving of documents to an online SharePoint platform. Effort is required, however, to commission the collaboration features of Office 365, and in our experience you will need to employ SharePoint expertise as part of the roll-out of such technology. Mobile connectivity is also impressive on Android and Windows mobile, but there is an extra charge for BlackBerry; connectivity is the same as people are used to with BlackBerry services today. Office 365 also includes impressive Unified Communications capabilities: you can federate users on Office 365 to other companies using Lync, and it has superior sharing and collaboration facilities borne out of its Live Meeting service. The Office Web Apps also offer a similar look and feel to Office, therefore there is a shorter learning curve than with Google Apps.

Google Apps

Google represents the purest form of cloud computing; built from the web up, it means that anyone with access to an internet browser can use the suite of applications within Google Apps. Google Apps offers mail, calendar, contacts, sites and Google Talk, with the ability to use a custom domain (we are surprised how many people do not realise that you can use your own Internet domain), and also includes an Office-compatible web apps suite for good measure. Google offers a range of features in its Google Apps for Business service that deliver immediate gratification, with ‘out of the box’ features that include superb collaboration and the ability to use your existing Microsoft applications with the Google back end. The integration is not perfect and would need to be tested, but it is a pretty compelling experience for the investment required. Mobile connectivity is well catered for: all of the leading mobile devices are able to connect via Google Sync and sync mail, contacts and calendars. In the case of the iPhone and Android (arguably the two platforms that will be left standing in the coming months) you can sync and search server-side - a service that until now has only really been available to people with a large Exchange environment.
Google Apps includes Google Hangouts, based on the Google+ service, which allows up to 20 people to talk in real time and perform a variety of sharing and collaboration tasks. It is likely that Google will charge for this in the future, but for now it is a very attractive add-on for people using the platform.

The Future

The soon-to-be-released Office 2013 will see Microsoft further consolidate their strategy to make their products ‘more cloud-like’, and critically it will appeal to corporate IT departments, where the new functionality in Office 2013 will allow public sharing via SkyDrive and – of course – integration with enterprise storage such as SharePoint. Google have, as expected, modified the Google+ offering so that it is far more enterprise-friendly, with restricted sharing and integration with calendar and email. Using video conferencing as part of online collaboration is a natural extension, and Google have a superb integration.

One cannot rule out the importance of federation either...

One cannot rule out the importance of federation either; Box.com is leading the way here with a tight and well-managed service that integrates with Google and Microsoft solutions. People using Box can easily set up secure areas to create virtual deal rooms between partners and suppliers and integrate this into existing applications. Box is a leading light in the BYOD strategy, as it allows an enterprise client the chance to give users what they want by allowing use on their own devices whilst maintaining enterprise security. It is not always that easy to move email systems, particularly with larger numbers of users.

What should you choose?

Choosing a cloud-based mail and collaboration service from any of the big players is akin to choosing a luxury car: all luxury brands can get you from A to B, but they will do it in a slightly different way and portray a view of the type of decisions you, the driver, have made in choosing that car. Google and Microsoft want IT to be fun and productive, and these products go some way towards offering this. For IT departments in the mid-market to enterprise space, Microsoft is a logical choice; it is more complex to set up, but in larger deployments the lower disruption should not be underestimated. A business can deliver a cloud-based solution and, if they wish, the users can migrate with a minimum of disruption. In a more disruptive and indeed more maverick type of company, Google Apps is the friend of the anti-Microsoft brigade, but it also has a strong enterprise pedigree and has won a considerable amount of plaudits with larger corporate wins at Rentokil, BBVA, Hillingdon Council and others. There is also much to be said for the simplicity of its licensing model and its cross-platform and device support. Either way, people are not going to put up with a lack of choice for much longer; people want their own devices and their own way of working. I cannot think of a time when it has been easier to achieve this.

### Interview with James Rees of Razor Thorn Security

CTC: James, tell us a bit about your background and about Razor Thorn Security?

I have been in information security and IT now for over 15 years. For much of that time I have been a consultant for various consulting companies, though every so often I have worked for single companies directly. To date I have worked in pretty much every sector out there at some point or another, as well as with many of the Fortune 100 companies in the world.
Razor Thorn Security was born in 2007. In the early days I used it primarily for freelance work, but it became a company in its own right in 2010 and ever since it has been growing steadily into one of the best information security companies in Europe; both the guys I employ and I myself are very proud of what we have done.

The First Rule of Business; Protect your investment. -- Etiquette of the Banker 1775

Today we provide a varied list of consultancy services to a number of high-profile clients, some of them cloud security organisations. In all cases we are there to protect the business's most critical assets and ensure that an organisation can understand and react to threats to its wellbeing. The best way to describe what we do in one sentence is: The First Rule of Business; Protect your investment (Etiquette of the Banker, 1775).

CTC: What is information security and why should we do it?

Information security is greatly misunderstood; many business people, and more than a few IT people, think that it is some form of witchcraft to do with IT. In all honesty you could not be more wrong. My definition of information security is this: the management of, and proactive planning for, the protection of business-critical assets, be they logical, physical or digital, from risks and threats that could adversely affect those critical functions. The other important thing to point out, especially to those companies that provide some form of service to the public or some sort of managed service, is that customers will expect at least an excellent level of security from their providers. This is something that you had better get correct early on, because if you do not then you are opening up your organisation to all kinds of threats to its brand, revenue streams and operation. Too many organisations only pay lip service to the security of their organisation, and those same companies usually wind up getting burned badly for it at some point - and these days it is commonly done very publicly through the media.

CTC: Information security sounds like a nice-to-have IT thing, doesn't it?

IT is a large part of information security, but only because it is the singular asset in our modern business world that allows an organisation to operate efficiently and effectively. Information security, when done correctly, also covers a whole list of important aspects such as:
- Governance and Compliance (PCI DSS, ISO27001, SOX, etc.)
- Process and procedure (information security policies, etc.)
- Logical Security (data management, incident response, business development, etc.)
- Physical Security (CCTV, access control, etc.)
- IT Security (antivirus, firewalls, server/desktop, etc.)
- Business Continuity / Disaster Recovery

Many of these have a great deal of IT involved in them, but each also brings extremely complex business considerations that need to be taken into account. One of the things that inexperienced information security people get wrong is putting in overzealous technological and policy/procedural controls that hamper a business from operating efficiently and effectively. It has been a big problem and has led people to mistrust information security professionals due to bad experiences in the past. At Razor Thorn Security we always look at adequate security based on the business needs rather than the technology of the moment.
The business must always come first, but that is not to say it cannot have good and effective security… Our clients love the approach, and we have built a number of excellent long-term relationships because of it.

CTC: What about compliance such as ISO27001 / PCI DSS - what is the hype about?

Compliance is currently the key to getting into the major contracts as a service provider. Especially in the cloud arena, there are a few undertaking this at the moment and doing very well out of it, but there is a fair bit of room for more. In the current business markets, with the cloud, you would be hard pressed to get the larger potential clients to sign up to your service without being able to prove you are compliant with one or both of those compliance requirements. Companies looking to move to the cloud are very concerned about the security of their systems when moving to a third party, and they want proof that your systems are as secure as possible before they are willing to agree. Too many service providers pay lip service to information security or think it's restricted to the IT side of the business. The unfortunate truth that service providers tend to find is that when they are challenged to show their levels of security, they can't, which means potential customers commonly lose faith that service providers can supply them with what they want.

PCI DSS is the big boy here... Mark my words, there is an extensive opportunity for companies that are accredited...

PCI DSS is the big boy here: if you want to get into a market where your potential customers take card payments, it's most likely this is the area of their business they will want to move to the cloud, because of the cost of the overheads of maintaining the security requirements. Mark my words, there is an extensive opportunity for companies that are accredited and can prove it…

CTC: Tell us what your average day entails?

I have a very busy company, but when I am not on a client site and am in the office I usually get to work at about 07:00 – 07:30 and spend the first few quiet hours writing articles. I write a lot; it's my passion next to information security itself, so combining the two is an excellent use of my time. I have done a lot of work on cyber warfare, cyber security, compliance etc., but I am also developing papers on the application of information security in emerging technological fields such as biotechnology and nanotechnology, as well as a number of other future technologies, as these will be an important part of our lives in the next twenty years. From 09:00 / 09:30 until 14:00 I tend to work on client matters, be it advisories for our consultants out in the field who need some assistance or working on client offerings. I find I always work best between the morning and early afternoon when creating content. In the afternoon, between 14:00 and 18:00, I tend to take conference calls and talk to existing and new clients about service offerings, etc. The afternoon is my favourite time for doing this. Evenings I relax, spend time with my wife and watch horror and/or sci-fi films, as well as getting regularly savaged by my wife's pet rabbit, who hates me with a passion for some rabbit reason…

CTC: What is your view on Cloud Computing and where do you see the marketplace heading?

This is a tough one. Cloud computing is the most recent paradigm shift in the application of technology in the business world and a very important change in the way that we have managed our technology in recent years.
More and more businesses are looking to save on their costs and reduce overheads by shifting key infrastructure and services over to cloud models. In the next few years we will head in a similar direction to our American cousins, and a lot of public and private organisations will move to the cloud. Companies' adoption of cloud technology will depend on three key factors:
- Cost
- Security
- Reliability

The only problem I see currently is that few European cloud vendors can prove to prospective clients that they are both secure and reliable.

...few European cloud vendors can prove to prospective clients that they are both secure and reliable.

Customers are rightfully being very careful in moving to the cloud; in effect they are putting all their faith in a technology delivered, maintained and secured by a third party, so they will be very cautious. From talking to most of the key cloud players in the business, I can see that the realisation among cloud suppliers that information security is a key selling point has begun to occur, but in many cases they are looking at it from an IT perspective rather than a business perspective. This, in my opinion, needs to change fast if European cloud suppliers want to ramp up their sales. There are a lot of suppliers out there but very few that take security seriously. In my opinion, if you want a good, solid, long-term business that will survive to be a key player in the cloud industry then you NEED to start understanding REAL information security, and you need to do it quickly. The one thing I can see right now with 100% clarity is that at some point in the next year or so one of the larger cloud vendors will have a catastrophic security event that will destroy their brand and reputation. It will be a wake-up call for the survivors; the question, however, is who will it be?

CTC: A question we always ask, what is your definition of Cloud Computing?

Argh, this is something I hear people debating a great deal. My definition of cloud computing is this: “Any service(s) delivered to an entity from a collective computing resource over a network connection (including the internet).”

CTC: If you could change one thing in the world, what would it be?

Develop both cold fusion and hydrogen fuel cells; we have a rubbish energy system at the moment, dependent on a mineral slime.

### Interview with Steven Winstone-Adair of Interact Technology

CTC: Steve, tell us about yourself and Interact Technology?

SWA: Whilst travelling on my year out in Australasia and the Far East, I found myself working for a couple of telecoms businesses. Upon returning to the UK in '96, it seemed natural to use my acquired skills and jump on the Telecoms boom bandwagon. I joined a freshly formed company called Margolis Communications. By 2000, I was appointed Sales Director and helped the company grow considerably within the traditional voice market over the next few years. In 2004, whilst sustaining growth within this sector, we started to dabble in the up-and-coming video conferencing market. In 2007, I decided to pursue my dream of starting up my own business, Interact Technology. Using my skills and networks, I started up Interact Technology with the view that Telecoms, Video and Data would soon all merge rather than being three different sectors. With this in mind, I laid the foundations of the business to cover each of these areas - with business unified communications being the main emphasis of the company.
Now, in 2012, the convergence of voice, video and data is complete, and since we laid the right foundation at the outset it has put us in a very strong position to win new customers whilst servicing and growing our existing legacy customers.

CTC: In terms of supplying business-grade voice services to SMEs, to what extent are your new and existing customer base embracing VoIP and Cloud-based voice services?

SWA: We have three main service delivery methods which we can offer customers for voice services:
- On-site legacy equipment
- Privately hosted equipment within a customer's data centre
- Cloud-based and hosted solutions on a per-month, per-handset basis

Whilst many customers are happy ditching their ISDN lines in favour of SIP trunks, we have seen a growing number of new and growing businesses moving to the pure cloud & hosted platform. This is at its strongest in the small office (5-20 handset) market. Larger SME-size businesses tend to flirt with pure cloud and hosted telephony solutions; however, the majority tend to stay with a legacy PBX or a private, dedicated hosted service. The larger the company, the more gradual the move to the cloud and multi-tenant infrastructure tends to be.

CTC: What are the main advantages - or where can SMEs reap the most benefits - from moving away from the traditional digital PBX, ISDN phone lines, voicemail systems etc?

SWA: There is a strong financial argument for businesses to review their arrangements and move to a hosted platform. Cost savings can be made immediately on the following:
- Telephone lines: By moving towards SIP, companies can immediately slash their line rental bills by at least half
- Telephone calls: SIP telephony offers more aggressive rates, often saving up to 80% off current bills
- Support: Support costs are almost eliminated using this model

So, as a cost justification alone, this becomes very easy against a new system sale, although the manufacturers are getting wise to the threat of lost sales and are offering a number of 0% finance packages over a fixed term, so that their systems can be advertised at a monthly price too.

CTC: Are there any things SMEs should be particularly careful about when switching to VoIP and Cloud-based Services?

SWA: It is the same argument now that it was a few years ago: bandwidth. Providing that you have good enough bandwidth, with high upload and download speeds and limited jitter and latency, the call quality should be just fine, but like most IP services it could suffer from periods of poor quality or limited service. As long as you are up front with the customer, and they are not totally reliant on voice, then this is often an acceptable risk/reward equation. If recommending a legacy IP PBX, I would always recommend a (strong) cocktail of ISDN and SIP lines for additional resiliency!

CTC: In your opinion, what are the key things an SME should look for in a voice services provider?

SWA: We work on increasing the service to our customers, providing a single bill for the full range of services that we provide. I would look for a company that is recommending a solution with resilience in mind and that is not just a PBX seller.

CTC: How does Interact Technology differentiate itself from the competition?

SWA: With our knowledge of voice, video and data convergence, Interact can provide a unique insight into providing an all-encompassing service for our clients.
Many telecoms companies will partner up with other companies to provide this service - but this will be at a cost to the client both financial and service wise. At Interact we can provide the full solution and offer a single point of contact for all support matters and peace of mind. With our head office being in the heart of the city, and our demonstration suite housing the latest Voice and Video solutions, we are also best placed to show these solutions working in action without the need to travel to the M4 corridor. CTC: Are there any particular cloud-based or cloud-related technologies coming down the road which could bring some compelling benefits to SMEs? SWA: With the convergence theme in mind, The new Microsoft Lync 2013 platform is definitely worth looking out for. We have recently added these solutions to our portfolio, offering this functionality to smaller businesses at a realistic price with our new "one box" and cloud-hosted solutions. We expect this to be a big seller in 2013 although it won’t be for everyone. CTC: Indeed I understand you'll be telling us about Lync soon. Thanks for your time Steve - any closing thoughts or advice for people looking at voice communications in the cloud? SWA: As always manage your expectation. Whilst bandwidth is now becoming more and more stable and with the advent of fibre-based broadband (FTTC) and other high bandwidth types of services, VoIP and Video-oIP performance will only get better. We offer bespoke xDSL products that are fully managed so we can provide an end-to-end solution for fault diagnostics. This type of managed service should be part of your solution to ensure highest service levels and accountability from your supplier. ### Cloud Service Contracts Our expert lawyer tells you what you should look out for By Frank Jennings, cloud lawyer at DMH Stallard Cloud means no software. Cloud offers flexibility and scalability. Cloud is cheaper and more resilient than on-premise IT. We've all heard about these - and no doubt other benefits of cloud too. It's very easy to start using cloud services and to get these benefits. You can even bypass your CIO by using your credit card to buy a cloud service direct. But do you check what you're getting before you sign up? All too often, customers ask the really important questions after they have adopted cloud. Here is our FAQ of risks in cloud contracts that customers should be asking. And guess what? Reputable cloud providers don't mind you asking. 1. Will the provider negotiate the contract? This depends upon the type of cloud. For public cloud, probably not as it's a highly standardised generic service - it either does what you want or you go elsewhere. But with private and hybrid cloud or where you dealing with a reseller, you can and should negotiate. 2. What service guarantees will the provider make? With public cloud you will generally get a multi-tenanted solution where you and other customers share space on the provider's infrastructure. It won't be tailored to your exact requirements so the provider's promises will be restricted. While the provider may promise that the service will comply with its published specification and SLA, you should expect statements that the service is provided "as is" with exclusions of any useful promises about it being fit for your specific purposes. Or that the quality of the service will be satisfactory for your needs. Again, with private / hybrid / resellers you can specify a greater degree of tailoring and you should negotiate these warranties too. 
Reputable cloud providers don't mind you asking [the right questions] 3. What risks does the customer bear? Remember, the customer is ultimately responsible for data security and compliance and the Information Commissioner or FSA will fine the customer for breaches. If you want your data kept in the UK or EU, check the location of the provider's primary and secondary data centres, and don't forget to ask where their call centre is. If you want back-up or failover, don't assume these come as standard. If you want your data encrypted, are you responsible for this? If the data is lost, are you responsible for recovering it? Has the provider limited all their liability to the fees you're paying them (or, as above, to service credits only)? 4. What should I look out for? Does your provider have a good reputation? Do they have any accreditations, such as ISO27001/9001 or conform to the Cloud Industry Forum's Code of Practice? These take time, money and effort and show that the provider has an eye on the customer's interests. Do a credit check on them. Do they own the data centre or buy space from someone else? Is it Tier 3 and above? Can you "step-in" to the contract if your reseller goes bust? Can the provider post new terms or prices on their website by simply emailing you? 5. Can I sue my cloud provider for a service failure? Typically, the customer will bear the brunt of a public cloud service failure. Check the SLA - it will probably say your "sole and exclusive" remedy is service credits on an hourly or daily basis and you'll have to claim these as the provider won't automatically give them to you. If your cloud service is down for a day, service credits generally won't amount to any use anyway. Again, with private / hybrid / resellers, you can often get better protection. You get what you pay for, after all. Some providers are so certain of their resilience, they will even indemnify the customer for data loss. But, be careful: anyone can set up a cloud service. It doesn't mean they have any capability. Or money. 6. How do I change provider? If you're dissatisfied with their service, you may be able to end the contract for service breaches - but see the warnings above. Otherwise, you'll have to terminate by giving notice. Often cloud contracts are for a minimum period so check whether this has elapsed. Make sure you give notice in time to avoid an auto-rollover of another 12 months. 7. Is insurance worth getting? Insurance for cloud outages and data losses is at an early stage, but you should definitely speak to a broker. Of course, even the best insurance cover is never a substitute for taking practical steps to minimise the likelihood of needing to claim on it. 8. Can a lawyer provide any guidance on this? Of course, but check that your lawyer really knows cloud. For example, make sure they're familiar with the Cloud Industry Forum's best practice contract recommendations. You wouldn't go to your GP for heart surgery, so think carefully before using the same lawyer who drafted your Aunt Mabel's will. ### You Need Cloud Orchestration By Tony Lucas, Founder, Flexiant The cloud services market is in a transitory phase. We are witnessing the weakening of the argument that states that individuals or companies need to own and manage their own computing resources. Increasingly, these individuals and companies will turn to cloud based utility services to get the job done. Gartner and IDC both predict significant public cloud growth. 
Gartner predicts that public cloud services will grow from £109bn in 2012 to £206bn by 2016. IDC's prediction is that spend on public cloud services will grow at a rate five times greater than the average increase in IT spend. The opportunity for cloud service providers is significant, but these organisations need a way to promote margin and revenue growth by offering a differentiated end-user experience which helps to increase consumption of the cloud platform. Organisations can then focus on growing their business instead of wasting time on repetitive and reproducible activities, freeing them to develop more complex, value-added services. The clock is ticking for cloud service providers; the time is now for them to stop hemming and hawing over which new cloud technology to deploy and instead act now to stay ahead of the competition. This is the decade of the service provider, and if new technology is not adopted, revenue opportunities will be lost. In reading David Terrar's post on some of the challenges for cloud ISV CTOs to overcome, one of the challenges I particularly agreed with was the advice on not developing it all yourself. David writes:

[quote_box_center]"Don't develop it all yourself. In the past the typical software company would have developers who thought they could do it all. The perception was always that it would be cheaper to build the support system or billing system or customer forum exactly the way we want it to work rather than buy. The end result was a whole load of bespoke software to support to run the business with limited functionality and no best practice. The smart development teams know what they are good at and stick to it."[/quote_box_center]

Corporations are looking to concentrate on what they do best. The stars are aligned for service providers to capitalise on this opportunity, but having the capability to offer the orchestration businesses require presents a challenge. This is where cloud service providers need to stop looking to develop bespoke solutions and instead harness the benefits of proven, mature and reliable cloud orchestration solutions. Cloud orchestration software can help by managing all elements of a cloud platform - physical and virtual resources. It allows organisations to turn traditional hardware into a cloud platform by automating and streamlining deployment while managing and provisioning those resources appropriately. This way, service providers get the most from their investments and exploit the growth opportunities of cloud services. With that in mind, Flexiant has today announced the latest version of Flexiant Cloud Orchestrator, V3. With V3.0, the entire range of cloud service providers - including traditional hosting companies, managed service providers, telecommunication companies and new entrants to the cloud services market - can create a cloud services business that customers love.

What Info-Tech Research Group Thinks

Named by Info-Tech Research Group as a Cloud Management Trendsetter in its “Vendor Landscape: Cloud Management Solutions” report, Flexiant was evaluated against nine other competitors including Abiquo, CA Technologies, Citrix, Eucalyptus, Nimbula, OpenStack, Virtustream and VMware. Flexiant was rated the leading innovator and a solution of choice for service providers - or indeed anyone, such as hosting companies or telcos, that wants to build a cloud services business quickly rather than an internally focused enterprise cloud.
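To make the provisioning point above concrete, here is a minimal sketch of the kind of repetitive work orchestration software automates – creating a server for a new customer and registering it for granular metering in a single call. The OrchestratorClient API below is entirely hypothetical: it is not Flexiant's API or that of any other real product, just an illustration of the pattern.

```python
# Illustrative only: a toy stand-in for an orchestration platform's client API,
# showing automated provisioning plus the metering hook a billing engine needs.
# Nothing here corresponds to Flexiant Cloud Orchestrator or any real product.

import uuid
from dataclasses import dataclass, field


@dataclass
class Server:
    customer: str
    vcpus: int
    ram_gb: int
    disk_gb: int
    server_id: str = field(default_factory=lambda: uuid.uuid4().hex[:8])


class OrchestratorClient:
    """Hypothetical orchestration API: provisioning and metering in one place."""

    def __init__(self):
        self.inventory = []   # provisioned servers
        self.metering = {}    # server_id -> billable resource units

    def provision(self, customer, vcpus, ram_gb, disk_gb):
        """Create a server and record the resources the billing engine charges for."""
        server = Server(customer, vcpus, ram_gb, disk_gb)
        self.inventory.append(server)
        self.metering[server.server_id] = {
            "vcpus": vcpus, "gb_ram": ram_gb, "gb_disk": disk_gb,
        }
        return server


if __name__ == "__main__":
    orchestrator = OrchestratorClient()
    # Onboarding a new customer becomes one call instead of hours of manual setup.
    web = orchestrator.provision("example-customer", vcpus=2, ram_gb=4, disk_gb=50)
    print(f"Provisioned {web.server_id} for {web.customer}; "
          f"metered: {orchestrator.metering[web.server_id]}")
```

In a real platform the same call would also drive hypervisor placement, network isolation and the customer-facing portal – precisely the bespoke plumbing that, as argued above, service providers should not be building themselves.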
The “Vendor Landscape: Cloud Management Solutions” report highlighted Flexiant's strengths, including standout dashboard customisability, flexible and integrated billing capabilities, quota and permission management, alerts tied to resource utilisation, and flexible deployment options with multi-hypervisor support.

Flexiant Cloud Orchestrator

Flexiant Cloud Orchestrator V3 offers a cloud orchestration solution for on-demand, fully automated provisioning of cloud services. The latest version combines Flexiant Cloud Orchestrator's previous capabilities – multi-cluster support across multiple locations and geographies, all managed from a single-pane-of-glass portal, multi-hypervisor support, granular metering and billing, a highly configurable user interface, role-based access control, an extensive API and reseller capabilities – with new features to extend the provisioning of resources to the application. New features include:
- Bento Boxes, Flexiant Cloud Orchestrator's functionality for delivering pre-configured application provisioning, allows service providers to offer pre-packaged, pre-configured and sophisticated application stacks that are ready to use.
- SmartFilters gives users the ability to track both virtual and physical resources according to business needs and requirements, in a way that makes sense to them.
- V3 offers scalability assurance.
- Flexiant is also announcing its Universal Storage Solution (USS), which removes the storage compatibility headache.
- Support for the Hyper-V (2012) hypervisor.

Flexiant Cloud Orchestrator V3 further supports service providers, traditional hosting organisations and telecommunication companies with a cloud orchestration solution that is reliable and mature at a compelling price point. Flexiant now offers four new cloud management software editions to give companies of all sizes access to Flexiant Cloud Orchestrator V3. It offers traditional hosting companies an alternative orchestration solution that works, while providing an upgrade path to the advanced functionality currently seen in the service provider market.

What Our Customers Say

Selected by Cartika, a leader in application hosting and advanced clustering technologies, Flexiant Cloud Orchestrator V3 will provide the cloud orchestration solution it requires internally and to meet its customers' expectations of a more automated and flexible solution.

[quote_center]Andrew Rouchotas, CEO, Cartika, explained, “We knew we needed a cloud provisioning platform that would enable us to provide enterprise grade solutions, help us differentiate against other service providers and that would also support both our current and future business strategies. We discussed internal development, but the time to market was prohibitive. After looking at several vendors, it became obvious that Flexiant was clearly a more mature and stable software solution than anything else available. Flexiant's product is head and shoulders above the alternatives. We were impressed with its robustness and ease of deployment and also the service delivery team's ability to meet our bespoke needs.”[/quote_center]

[quote_box_center]Richard Parker, Market Development Director at ITEX, explains, “ITEX originally embarked on a do-it-yourself type project using existing technologies and found after a number of months that the amount of effort required to build a robust cloud platform that could stand the scrutiny of very well informed clients was quite an undertaking. It soon became evident that the work and resource involved was impractical.
ITEX went out to the market to re-evaluate the commercial platforms available and selected Flexiant for its flexibility, simplicity, scalability and affordability. The ability to monitor and bill on a very granular level to its client base was of paramount importance to ITEX, yet it wasn't a feature it had seen in other systems.”[/quote_box_center]

Over the last year Flexiant has gained significant momentum in the marketplace. With two new versions of Flexiant Cloud Orchestrator, new licensing editions, being named by Info-Tech Research Group as an innovator and trendsetter, and customer endorsements, Flexiant is definitely a company on the move and one to watch. For further information visit http://www.flexiant.com

### The Future of Cloud Computing Jobs

Via Resource founder Tor Macleod shares his insight into the cloud computing jobs market. Many readers will be aware that finding cloud specialists is becoming harder for businesses. Below I will shed some light on the challenges employers face in this sector and why choosing a recruitment partner with experience of the market is key to finding the resource you require. Right now, cloud computing is probably the hottest field to get into for any technical professional. The cloud is a major buzzword amongst big business and smaller organisations alike. With the world embracing the cloud there has been a massive boom in cloud-related jobs. Reports on the US job market show just how much the demand for cloud skills has increased. This August there were 10,000 US job posts requiring cloud computing experience or knowledge, an increase of 92% from 2011 and of 400% since 2010!

...finding cloud specialists is becoming harder for businesses.

This year's Software 500 saw five cloud companies added to the list. Cloud computing and management services are number one for employer growth within the software industry, with a massive 1828% increase in employees in the past year. IT Services and Consulting, in second place, showed a comparatively small 355% increase in employees. If you consider that these statistics only account for five companies whose primary business is cloud, and not cloud positions within other organisations, then the actual number of cloud jobs is much higher. So what are the most posted cloud jobs? 1. Software Engineer 2. Senior Software Engineer 3. Java Developer 4. Systems Engineer 5. Senior Systems Engineer 6. Network Engineer 7. Senior Java Developer 8. Systems Administrator 9. Enterprise Architect 10. WebSphere Cloud Computing Engineer. But you don't have to be an engineer to be a part of this vastly expanding job market; there is also demand for other positions with cloud knowledge - marketing managers, sales representatives and market and research analysts, to name a few. What do you need to know to get a cloud job? The most required skill on technical job posts is 'cloud computing'; here are the other most common requirements for a technical cloud job: 1. Oracle Java 2. Linux 3. Structured Query Language (SQL) 4. UNIX 5. Software as a Service (SaaS) 6. Python Extensible Programming Language 7. Practical Extraction and Reporting Language (Perl) 8. Extensible Markup Language (XML) 9. Service Oriented Architecture (SOA) 10. JavaScript (JS) Other skills and certifications wanted: 1. Certified Information Systems Security Professional (CISSP) 2. Project Management Professional (PMP) 3. Microsoft Certified Systems Engineer (MCSE) 4. Cisco Certified Network Associate (CCNA) 5. VMware Certified Professional (VCP) 6. Certified Information Systems Auditor (CISA) 7.
Microsoft Certified IT Professional (MCITP) 8. Cisco Certified Network Professional (CCNP) 9. Information Security Penetration Testing Professional (ISP) 10. Cisco CCIE Voice (CCIE) What is the future of the cloud job market? With fears over the security of the cloud subsiding more and more companies are set to join the cloud. A recent report predicts that jobs generated by cloud computing (globally) will increase from 6.7m to 13.8m by 2013. Creating a108% job growth within the UK alone. With more cloud jobs than there are cloud technicians to fill them, there has never been a better time to be a cloud professional. If you are looking for a job then there are plenty to choose from- but how do you know which is the best for you. For businesses it is becoming increasingly difficult to find the talent that they need. It is important to know where to look, and time your jobs effectively to ensure you secure the top talent. Visit http://www.viaresource.com ### Building a cloud computing pioneer By Mikael Lirbank, CEO, Witsbits Lately I’ve been getting lots of questions about being one of the original cloud computing pioneers and the story of Witsbits, so I thought I’d get it down here. My co-founders and I created Witsbits in 2006, and after experimenting with a few different cloud offerings, we eventually shifted our focus onto selling software for cloud deployments. By 2010, everyone wanted to jump on the cloud bandwagon, but we quickly realized that most companies were intimidated by the complexity of available products. This troubled me. One day in January 2011, I woke up on a typically dark and cold Swedish morning with an idea that I just couldn’t shake. I’d occasionally dreamt up ideas before, but those seemingly great thoughts always turned out to really suck after only a short period of sober contemplation. So I didn’t expect this one to be any different, but it was. I realized that we could shave tons of time and complexity off virtualization if we, from the customer perspective, could get rid of the virtualization software itself (or at least most of it) and its configuration files, and make the remaining parts self-configuring and self-upgrading. While products offered by industry incumbents like VMware, Citrix and Microsoft offer advanced functionality and flexible customization and configuration capabilities, this also makes them very complex to deploy and maintain. Since most virtualization deployments are quite standard, IT administrators end up spending hours on configuring virtualization with default settings over and over again. I thought, what if we could drastically simplify the process of setting up and managing virtualized servers, and do it all as a service to meet the needs of most IT departments? This would free up tons and tons of person-hours for more strategic work, and drastically change the day-to-day lives of IT administrators. So we decided to build a one-size-fits-all virtualization service for the 80% of the market that tend to use virtualization in a typical or default way. The idea seemed almost too simple, but the more I researched it, the more clear the opportunity was. So we went for it, and we’re aiming to do to the virtualization industry what Salesforce.com did to CRM software – by simplifying the experience, lowering the barriers to adoption, and providing much more value than administrators can get with VMware or others. 
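To make the idea concrete, here is a minimal sketch of what a 'self-configuring' host bootstrap might look like in principle: the host detects its own hardware, registers with a hosted management service, and applies whatever desired state comes back, with no local configuration files to edit. Everything in the sketch – the endpoint, payload and function names – is a hypothetical illustration of the concept, not Witsbits' actual code.

```python
"""Hypothetical sketch of a self-configuring virtualization host.
The control-plane URL, payload shape and function names are assumptions
made purely for illustration; they are not the Witsbits implementation."""

import json
import urllib.request


def detect_hardware():
    """Gather the facts a hosted management service would need (stubbed here)."""
    return {"cpus": 8, "ram_gb": 32, "disks": ["/dev/sda"]}


def register_host(control_plane_url, hardware):
    """Announce this host to the hosted control plane and receive its desired state."""
    req = urllib.request.Request(
        control_plane_url + "/register",
        data=json.dumps(hardware).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)  # e.g. {"vms": [...], "agent_version": "..."}


def apply_desired_state(state):
    """Create or upgrade whatever the control plane asked for (stubbed here)."""
    for vm in state.get("vms", []):
        print("would provision VM:", vm)


if __name__ == "__main__":
    # The point of the model: power on, phone home, get to work -- no local config to edit.
    desired = register_host("https://manage.example.com", detect_hardware())
    apply_desired_state(desired)
```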
After our investors approved the change in focus, we put our awesome development team on the task and came out with a prototype in 30 days, and then the first version of our new product in just 60 days. With LIVEvisor, our self-configuring bare metal hypervisor, and Director, our web-based management system, I believe we’ve made virtualization deployment and server management dead simple. We’re making it as easy as pushing the power button on your physical server to magically set it up for hosting virtual machines and remote management. I’m very pleased at the traction we’re seeing so far, especially amongst managed service providers (MSPs), who run IT departments for several companies at once. Our product lets them get new customers provisioned faster than they could’ve imagined (in minutes versus weeks or months), because it requires no installation or configuration. MSPs tell us that they love how Witsbits lets them easily manage all of their clients’ servers centrally in one place (whether they sit in the cloud or on premise) without spending time writing scripts, juggling APIs, setting up encrypted tunnels, etc. Towards the end of 2011, we moved our headquarters to San Francisco, but kept our engineering team in Sweden. This enabled us to join the extensive cloud and virtualization ecosystem in Silicon Valley and be close to the latest industry trends, while still having access to Swedish engineering talent and keeping our existing team of great coders intact. Today, we’re gathering feedback from customers and iterating our SaaS-based virtualization product on a weekly basis. Witsbits provides an easy, quick, affordable on-ramp to the benefits of virtualization (like transitioning workloads between locations and providers, business continuity, etc.), while exposing less and less of its complexities (like time-consuming configuration legwork, etc.). Feel free to give Witsbits a spin, and let us know what you think! ### Stop Selling the Cloud! By Daniel Steeves “Sales and marketing fail when they either are selling or marketing the wrong thing, or the right thing in the wrong way” ~ me, 2012 In planning a recent holiday, we started (as I assume most do) with choosing a destination, based on either the mission (see, visit, do something…) or the vision (city; dive; rest…) for the holiday. We narrowed things down by applying our requirements criteria including travel time and budget, adjusting and extending our view as we made a selection. We decided on a city break in Istanbul and during this initial ‘discovery' process we further refined our requirements and determined, broadly, that transport and desired level of accommodation would be both available and affordable and that the city offered us what we were seeking for this holiday. Based on target location, timeframes and convenience, flying was pretty much the only valid option. While we always seek as painless a travel experience as possible, we typically allocate more budget to accommodations and activities than we do to travel: this typically makes Business Class or better a price barrier but we never adjust our basic requirements: local airports, competitive pricing and a track record that leads us to expect a safe runway landing rather than a splash down or a ride into the side of a mountain. Those needs met, we selected and booked the flight. Then came the Fun, part 1: deciding where to stay and from there planning what we might get up to on arrival. 
Following pretty much the same requirements-defined approach as above, at a more granular level, we finalised our criteria relevant to the type of break we wanted (our mission and vision) ranging from the neighbourhood of our hotel to its proximity to sights and transport. This was followed by the Fun, part 2: going there and doing it. Loosely illustrated our time allocation – and preferably also that of our budget (which is, sadly, usually a larger chunk by far than desired), looks a little bit like: The same type of breakdown applies, loosely, to selling technology-based business solutions (although not always the same ratios): technology is meant to enable something, to solve something, to deliver something or to earn something, not just to be really cool tech with a better name and fancier tools. Awareness is Good, Hype is Bad Now, as written in Part 1 of this reality check, Beware the Cloud-ists! I'd like to say that I like cloud and have done since well before it was called cloud. And that today, with the processing power, capacity and bandwidth to deliver to the promise, not to mention the sliding commercial models and minimal-to-nil start-up barriers in terms of costs or time delays, we are now enabled to use smart, utility delivered, commodity computing in a way that it can really make a difference. This is very promising in the view of both the business and technology camps: reduced risk, reduced cost, increased agility and overall reduced barriers across what has become a much-simplified business to consumer to business loop (particularly but not exclusively to the online or otherwise computing-centric space). Cloud Computing has struck a chord with and captured the imagination of the public, businesses and Government in a way that other attempts at delivering utility model computing, ranging from On Demand to first generation SaaS and other such incarnations never did. Everywhere you look are managers, experts, analysts and “observers of a journalistic nature” writing, blogging and tweeting about it: a myriad of real-world experts (some of whom know about that of which they speak, others clearly who do not!) generating countless books, articles and whitepapers. Cloud is Merely the Delivery Vehicle … and cloud is also a great foot in the door. Controlled and applied realistically, the hype around cloud is raising awareness and increasing uptake and I suggest using the opportunity created by this hype. Leverage the interest and awareness to start new, broader conversations based on the business goals of your customer (and their customers) rather than the delivery needs of their IT department or your own short-term sales targets… I suggest that you stop selling cloud and instead start to sell and deliver the things that cloud will enable: their imaginations have been captured by all of those pundits and it is time to take advantage of that: the right way! Raising interest in and creating an understanding of the cloud and its (actual) benefits is clearly good all round: assuming that you want to move from selling products and services to selling business solutions and long-term relationships, cloud is the perfect vehicle to enable that move (more a transformation, by the way, than a move). 
Selling business solutions and long-term relationships means that what you are really selling is the competitive edge (in terms of agility, cost of sales and delivery, and sustainability of service) delivered by the new commercial models and methods wrapped around the enhanced, improved and new technologies that make up cloud, rather than the tech itself. Sell the holiday: the hotel and the activities are what capture the imagination, not the flight. And that answer remains the same even if your customer is re-selling your cloud-based models and methods to their end-users: the only difference is that you need to work with them to move them to also think and sell the same way. Cloud is about enabling innovation, driving speed, delivering business agility and reducing risks: you need a different conversation and a different proposition… my suggestion is to stop selling cloud, start having those different conversations and see where they take you: let cloud sell itself based on what it can do rather than what it is. As the saying goes, every cloud has a silver lining: your competitors who are selling that silver lining may make short-term income, but I would suggest that the real place to go mining is in the doors opened and relationships built by the conversations around how this cloud stuff can transform their businesses, reduce their costs and get them to their goals better, stronger, quicker, cheaper. That is where the mother lode lies. Oh… and don't forget to beware of the Cloud-ists! ### Virtual Machine User Group (VMUG) calls for clarification of Patriot Act and Data Location Policies My name is Gav Brining and I represent the Virtual Machine User Group (www.vmug.org.uk), an independent and active organisation comprised of 2,000 end-users dedicated to virtualisation and cloud computing. Today we call upon the IT industry - comprised of cloud and IT vendors, industry organisations and associations - to provide guidance to our community and beyond with regard to the United States Patriot Act and data location. My group represents the individuals and companies who deploy your technology solutions in businesses across the UK, and as such we feel that it's time for some clear and concise guidance as opposed to the general confusion, marketing hype and Machiavellian tactics of vendors who are using such legislation to influence buyers. The United Kingdom has historically been a stepping stone for American vendors entering the European market, and the protectionist attitude of UK vendors is creating a hostile reception for American companies wishing to deploy in our country. Can we as an industry and country in recession afford this loss of revenue, opportunity, jobs and influence? We are not looking for yet another committee or think tank to ponder the subject and potentially do something about it at some point; this saga has been ongoing for over a calendar year now and is a constant source of frustration and confusion for our members. ...the United States Patriot Act and data location guidance... is a constant source of frustration and confusion for our members We call on the IT vendors, trade associations and policy makers to solve this issue once and for all. This is what VMUG proposes: Simple, clear, effective guidance We call on the industry to create a simple bullet-point guide that allows our end-users to understand the conditions and impact that the US Patriot Act and data location policies have on our businesses, and how this affects system deployments and technical architecture requirements. 
We understand that there will be debate and ambiguity on each summary, but seriously - is it really that hard to provide a formalised five-page PDF guide and summary to help us understand and make choices based on fact rather than fiction and irrelevant marketing hype by people who do not understand or have not conducted thorough research on the subject? Let technology win! As an end-user organisation we believe that technology should always be the winner. Vendors win orders based on usability, cost, and the depth and breadth of solution - whether that technology or cloud offering is American or British! We need American technology to drive our UK vendors to innovate, compete and be leaders on the world stage! Without healthy competition how can you drive your products and services into world markets? A central independent resource We understand that this is a contentious subject, so we call on the industry to put rivalry aside for 5 minutes and work collaboratively to create a simple, clearly worded resource - a website or page on an industry association - which allows our members to get to the bottom of what they need to know, without any of the marketing hype. As an independent not-for-profit industry group we will assist any efforts to solve these issues once and for all. Gav Brining on behalf of the Virtual Machine User Group (www.vmug.org.uk) Register to attend our events here http://www.vmug.org.uk/ ### Using Social Media and Cloud Computing to do Good What did you do on the morning of Saturday the 20th of October? I was out at my local town centre with my 3-year-old son when we spotted some witches! My very excited child ran over to the witches and I discovered that they represented a local charity which helps children with additional needs, the Wildern Opportunity Group. While chatting with the girls about what they do and watching my son win a plastic spider (he loves it!), I found myself questioning my own humanity (if I had 10% of the decency of these girls I would be a fine human indeed). Hearing about the funding and other issues, such as a lack of sensory equipment, I walked away feeling helpless and freely admit to having a tear stream down my cheek. After coming home and 'making myself man up', putting on my headphones and zoning out, I started to think about what I could do to help them. Diana Ross and the Supremes kicked in with “Ain't No Mountain High Enough” (not my playlist, honest!), which gave me an inspirational moment. I work in an amazing industry packed with kind people who are mothers, fathers, brothers and sisters, and I know that we will rally around a great cause! Yesterday (22nd October) I had a totally amazing meeting sitting on a toddler seat at the Wildern Centre (the thought of a major cloud player sitting in the same seat doing a meeting in a pinstripe suit made me laugh!), discussing how we can help. When I founded the website you are now visiting, I was determined to ensure that Compare the Cloud could be used as a force for good. I truly thank you, Wildern Group, from the bottom of my heart for giving me that opportunity, so here is the plan! Smash the funding target & Use Cloud Computing to make a difference They need to raise £20,000 to fund ongoing activities that make these amazing children's and parents' lives that little bit easier. Hit the PayPal donate button on their website here http://www.wildernopportunitygroup.co.uk/ - if you donate, come back onto this blog and leave a comment, and we will then publish a link to your website! 
For each donation made I will retweet your brand; I also intend to reach out to my media contacts to retweet, and respectfully ask providers to put rivalry aside and retweet any donation made! Being amazing and donating will really make a difference, no matter how large or small - associate your brand with something truly positive! I have never requested that people hit our social share buttons, as quality content should stand out, but please, please, please put this post on Twitter, Facebook, LinkedIn and any social bookmarking service. Spread the word - a share is as good as a donation for raising awareness! Cloud sales teams, buy some karma for your next deal! Check out their wish-list (this is due to be updated) http://www.wildernopportunitygroup.co.uk/pre-school/hedgeend/wish-list Donations to date: Compare the Cloud James Golding www.tdmgroup.net Graham Spivey www.thebunker.net Marcus Cauchi Sandler Training Kerry Burn www.848.co Fenton Bard www.sourceplc.com Richard May www.virtualdcs.co.uk Eamonn Colman www.computenext.com Richard Cook www.championcomms.com Paul Heywood www.claranet.co.uk Frank Bennett www.cloudassessment.co.uk Mitchell Feldman www.cloudamour.com Ray Bricknell LinkedIn James Rees www.razorthorn.co.uk Ian Cooper www.hostedaccountants.co.uk Daniel Steeves www.beyond-solutions.co.uk David Terrar www.d2c.org.uk Neil Cattermull (personal donation) Maggie Meer www.cloudexpoeurope.com Doug Clark (personal donation) @Cloudstuff Millovan Milic www.devtech.rs Paul Bates www.probox.eu Neal Wilkinson www.sleek.net Gav Brining www.vmug.org.uk Mike Blackmore www.oracle.com Martin Clare www.imtechict.co.uk Dr James Mitchell www.strategic-blue.com Stephen Maloney www.imtechict.co.uk Dave Cartwright www.safehosts.co.uk What can cloud computing do to help? Let's build them a 'Castle in the Sky'... I am going to reach out to some of the finest technical minds in our industry to help stir a direct debate (morning Simon and Bernino - email due in your inboxes shortly)! After talking to Terri (Manager, Wildern Opportunity Group) and hearing about the issues they have with a lack of children's places, sensory toys and failing equipment, we moved on to discussing how technology could aid in helping parents and children. The Cloud Dream Team! Imagine forming a cloud dream team to aid the group. Let us prove to the wider commercial world that the cloud industry can provide clear, simple products that are easily adopted and meet genuine needs. So firstly we need a volunteer who is able to take ideas and translate them into needs! We need a great project manager to scope and manage requirements (morning JD)! A technical architect! A delivery manager! A security expert! (morning JR) Any other dream team suggestions are welcome - I will update accordingly! Using cloud computing technology to make an impact Put yourself in this scenario: you're a father or mother, it's 3:00am (a child with additional needs on average sleeps 2-3 hours per night), and imagine being able to log onto a secure private cloud and perhaps chat with another parent, maybe about football, or swap advice using VoIP or IM. Take that idea a step further - perhaps streaming videos onto a tablet PC to create a sensory environment, or bringing the virtual centre to the child at 3:00am. The staff could then produce training videos, tips and guidance, upload content and grow the community to include professionals - remember, one video produced by an overworked professional can reach hundreds of parents! 
From a business perspective they will need social media guidance, clear, easy documentation and simple walkthroughs, perhaps on video. To all the technical people reading this blog: spare some time and see if you can debate how we can provide an immersive cloud-based sensory environment that can touch the souls of these children. Any ideas would be great, and your reward for your valuable time is a smile on a parent's and child's face! With regard to building covenants, is there a DC provider out there willing to help them navigate local politics and solve the restrictions in place? Let's scope this out together as a community; I will be inviting the parents and staff onto this blog to interact once we have started nailing down ideas, and hopefully they will help us create a live scoping exercise. Remember, we need to take up as little of their time as possible and deliver something simple, clear and effective, with a single point of contact to gently help the Wildern team! Note: this blog post is free of any copyright restrictions, with all comments and content open source - feel free to share, link or reproduce it in any format. Help this small, worthwhile charity make some noise! Please re-tweet and share using the social buttons below! ### Big Blue wants You! By Luke Wheeler Compare the Cloud were invited along to IBM's European Analyst and MSP Influencer Forum event on Tuesday. As you might expect from the US's second-largest firm, it was a well-drilled affair: an action-packed day of presentations, workgroups, networking, acronyms and coffee. The big announcement - around which the day was constructed - has now been covered well by the IT industry's mainstream news channels, so we won't delve too deeply into the details here, but to sum it up in one over-extended sentence: IBM recognise the value of Managed Service Providers (MSPs) as a primary route to market for sales of IBM products and services to the “mid-market” sector (sub-1,000-employee companies) - and have subsequently created a specific MSP program called “PartnerWorld” - which offers a broad range of partner benefits encompassing supply of product, technical and sales support, marketing and business development, leveraging of the IBM brand, and financing. To the credit of the Mid-Market team at IBM there are certainly some interesting “beyond hardware” initiatives within the program, and MSPs should certainly take a good look at it. IBM invests $2bn a year in their partner channels, and has reportedly recruited >1,400 MSP partners this year so far - so they're certainly committed. IBM invests $2bn a year in their partner channels... It's not every day you get invited into the hospitality of the world's third most valuable brand* and get to meet so many key influencers, so as Compare the Cloud we felt there was a need to delve a little deeper into the strategy of IBM and what this might actually mean for the UK and Europe's Small and Medium-sized Businesses (SMBs)… The first take-away positive for us was the fact that IBM realise that the nature of their channel has changed. IBM do indeed recognise there are different computing requirements and engagement levels needed for different types of partners – whether they be MSPs, Independent Software Vendors (ISVs), systems integrators, Value-Added Resellers (VARs), Telcos, Cloud Services Providers (CSPs) etc. - and are re-organising their channel strategies accordingly. 
The second take-away for us was that IBM do understand that the market is contracting for their primary revenue sources: traditional product sales (hardware and software) and Professional Services – and that these are being off-set with growth in demand for Managed Services, Maintenance and Application Hosting. So the big news here is that IBM knows what the Mid-market / SMB sector wants, and they now have formulated a strategy to best deliver it. But what is “it”? What does IBM have that the SMB wants? We heard ourselves asking. “Well, we have an expansive portfolio of products and services”, came the reply. And indeed, they do... but perhaps communication around specific products, technologies and their benefits needs to take central stage in this initiative of addressing the mid-market? It's evident from IBM's strategy that the MSPs and channel partners will be relied upon to cherry-pick the right IBM products for their customers, alleviate any perceived complexities, and to re-assure end-user SMBs that IBM's products and services are indeed better than the competition and worth paying a premium for. IBM clearly have a plan for educating their partners for the top-down approach, but what about the bottom-up approach and delivering winning messages to the end-user SMBs (such that they then turn to their suppliers and demand IBM products and services)? What products do IBM have that are of value and relevance to SMBs today? On behalf of the SMB and Mid-market sector, Compare the Cloud would like to explore that question in more detail: What products do IBM have that are of value and relevance to SMBs today? For us that's the $24.1bn** question. With annual investment of $6bn in R&D, and 6,180 patents to their name, IBM must have some great technology – technology that the SMB can benefit from. Of course IBM has some excellent products and services within their expansive portfolio, but as an SMB (and perhaps even as a channel partner) you could easily be forgiven for not knowing exactly what they are and why they are better. The IBM website isn't particularly enlightening either – sure it looks great and the right kind of information is there – but am I being asked to buy into the IBM brand or a better product? As we know, the brand itself is worth a great deal – but to the Mid-market and SMB – we're pretty sure in most cases they'll want the better product?! So, politely we laid down the gauntlet and challenged IBM to step out (of their Ivory Tower) into our (Pie ‘n' Mash) arena to explain the advantages of their key products and services to an informed and eager SMB audience. The challenge is set. The initial response has been positive. We look forward to finding out what's under the hood - and whether SMBs will want Big Blue. *According to Interbrand (2012), the IBM brand is valued at $75,532m, behind Coca Cola and Apple. **Around 20% of its $107bn annual revenue comes from the channel. Not bad for starters.   ### 10 challenges for the Cloud ISV CTO to overcome Back in April I was asked to do the keynote session for an event aimed at Independent Software Vendors with the title "5 Key Challenges for the Cloud ISV CTO and How to Beat Them!". The audience were software developers either considering moving their applications to the Cloud or who had just started that journey. Actually it was a struggle to pick just 5 and so I decided to revisit the topic and highlight 10 issues here. These challenges for a developer translate in to the characteristics for a best of breed Cloud application. 
For someone on the buy side looking for a solution, your shortlisted provider may not have all of these ingredients today, but looking at the company, their solution and their product roadmap from this standpoint will give you a good understanding of how innovative and forward-thinking they are. This list should form part of the due diligence on your selected vendor. Recognize that we're not in Kansas anymore. There are plenty of IT people in the industry who have decades of experience of technology life cycles and believe that everything that goes around comes around. This Cloud stuff is just like those centrally hosted mainframe bureau systems of old but with new marketing labels. Wrong! Have you read Douglas Adams' great book Dirk Gently's Holistic Detective Agency? Like Dirk I believe in the "fundamental interconnectedness of all things." I've lived through the transition from mainframe to mini-computers to distributed PCs and groupware. From Open Systems (which really meant UNIX), to client/server and the first wave of the Internet and Web 1.0. Each shift meant a huge technology change and some major players failed to survive each transition. Today is different. Today we have three shifts, each easily as significant as those I've just listed, but they are happening simultaneously - the shift to Cloud is happening at the same time as the shift to Social, at the same time as the shift to Mobile. The technology landscape has changed forever. We're not in Kansas anymore, and it affects everything. Look carefully at your Cloud provider. There are two types - those who are retrofitting existing IT into the Cloud - that's fine and it gets some of the benefits, but it's old school dressed in new clothes. The cool companies get what has happened. They are embracing the new paradigm with multi-tenant, public Cloud technology that helps customers adopt new business models, collaborate with their customers in new ways and make use of their data in ways that weren't possible with old IT systems. Which approach are you looking at? The consumerization of IT and the user experience. Old-style software was complex, demanding training courses and expert help to implement it. Amazon, and websites like it, have shown us that you don't need a training course to place an order for a book. The standard for good website design is still Steve Krug's book "Don't Make Me Think!" - the provider needs to take that mentality into their business solution. Modern Cloud application providers need to put the user experience and user testing "front and centre" of their design philosophy. Way back in 2006 I met Richard Moross, the founder of Moo.com, the business card people. I asked him what was the most significant investment in setting up his company. He answered immediately: "the £10,000 I spent with Flow Interactive working on the user experience of the website". It shows, because making cards on Moo is fun and I'm much more likely to recommend them to my friends because I enjoyed the experience. Their competitor Vistaprint is spending heavily on TV advertising, but they'll fail because their website is user-hostile! Moo's usability trumps it. That frictionless and fun workflow is what you need in your business solution. User experience is paramount and you should look for Cloud app providers who think that too. The whole business model needs to change from end to end. 
A software company converting from selling old style, on premise, software licences to a Software as a Service subscription model has to change every part of their business. They compensate their sales team in different ways, and they market and sell differently. They implement the solution differently, starting with a small pilot and then rolling out the solution to more and more users only after it has been proven to be successful. They provide better documentation, more self service training and support. They provide software releases quarterly or maybe even weekly rather than a "big bang" upgrade every 18 months. They will be much more transparent about their product roadmap and provide better ways for customers to get involved and influence product direction. Their emphasis shifts from great selling to great on-going support as customer retention to build up recurring revenue becomes the business driver. They manage and monitor their business differently. Looking for these characteristics will help you pick the providers who have embraced the new Cloud paradigm properly and moved on from the old style software companies who may just be retrofitting legacy client/server software in to Cloud infrastructure. Go Agile. Agile software development is a group of software development methods based on iterative and incremental development, where requirements and solutions evolve through collaboration. The real Cloud providers have adopted this approach, involving their customers in the process, and bring out regular, incremental releases. They will organize in to small teams and consequently get much more done. Look for companies who have moved on from the old methods of developing software. Go Mobile. Most of us carry smart phones. Many of us have iPads or other tablet devices. Many of us want to use our own devices in our place of work - BYOD is where it's at. As well as supporting all of the different browsers the Cloud providers now have to consider mobile devices too. That's just the way of this new connected World . They need to look at how the user interface (menus, buttons, input fields) works on a 3-4 inch screen instead of a 20 inch screen and consider HTML 5 or creating dedicated smart phone or iPad apps for specific parts of the business system. Not every Cloud provider will be doing this yet, but you should ask the question and see where mobile is positioned in their product roadmap. Go Social as a business. Smart companies are using social media tools to have conversations externally with their customers and partners, and internally to make their teams work more effectively. Smart Cloud providers need to be adopting this within the way they work. If the provider hasn't got sharing buttons for Twitter, Facebook, LinkedIn and other social networks on their website - do they really get the way the World has changed? Do they blog? Do they demonstrate their expertise and their thought leadership? Are they building a customer community? They need to have embraced new media marketing in the way they do business, so ask them. Business is Social after all. Go Social with your product (actually I mean service). Some Cloud companies, like Salesforce.com, will do it by developing integral micro-blogging and messaging within the product (with Chatter), and collaboration tools (like Chatterbox) or by acquisition (adding Radian6 and Rypple). Others will do it by integrating with external social tools like Yammer or Tibbr. 
Similar to mobile, not many Cloud providers will be doing this yet, but you should ask the question and see where social is positioned in their product roadmap. Don't develop it all yourself. In the past the typical software company would have developers who thought they could do it all. The perception was always that it would be cheaper to build the support system or billing system or customer forum exactly the way we want it to work rather than buy. The end result was a whole load of bespoke software to support in order to run the business, with limited functionality and no best practice. The smart development teams know what they are good at and stick to it. For things like support they should be considering Get Satisfaction or Zendesk. For things like subscription billing, consider Zuora. Who do they use for customer forums or websites? Does the company focus their development resources on what they are good at? Do they use off-the-shelf products for everything else? If they don't - find out why. Have a strong story on security. Even though most companies are considering Cloud solutions, the security questions will be amongst the first you are asked. If you haven't got a strong story you're toast. One of the beauties of the Cloud approach is that even the smallest provider can piggyback on the best available infrastructure at a fraction of the cost of doing it all themselves. I can buy 1GB of storage from IBM for $2 a month, or from Amazon for 12.5 cents a month, and a range of prices in between. I can buy Infrastructure as a Service or invest in a Platform for development which does it all. I can rent servers with an inclusive backup and failover service. Amazon and Apple are investing in huge data centres, but smaller providers like Memset do a great job too. They need to cover everything from making their site “hacker safe” to the way they handle privacy, data sovereignty and ownership in their contracts. As a buyer, make sure that your provider gives you full visibility and transparency on the technical foundation of their service and the terms of its provision. Start with why! Maybe this should have been first. The smart Cloud providers understand and can verbalize why they do what they do. I'm a huge fan of Simon Sinek's book, TED talks and approach. Get the book, but at the very least watch the video. Companies like Apple or Salesforce.com get it, but there are plenty of technology companies who are too caught up in the what and the how, and have lost the why. Look for Cloud providers who understand their own reason for being, and then live and breathe it. If you see at least 4 or 5 of these characteristics in your Cloud provider, then they are positioned well for the future. However, if you don't see much evidence of these 10, then you may be dealing with a traditional software provider who is dressing up their solution in Cloud clothes. Move on and find a more innovative partner. ABOUT DAVID TERRAR David heads up D2C, a consulting firm and Cloud Services provider. He is chair of Intellect's SaaS Group, a Director of EuroCloud UK, an operations and governance board member of the Cloud Industry Forum and a regular speaker at social media, social business and Cloud Computing events including chairing London's Cloud Computing World Forum. 
Although his history is rooted in traditional enterprise systems, he is passionate about the intersection of cloud computing, mobile technology and social media, how these tools can be deployed to make business more effective, and the way these trends are changing the world of work. More here ABOUT D2C D2C are consultants and solution providers Dedicated 2 Cloud Computing and Dedicated 2 Customers. They provide solutions for websites, web communities, collaboration, online accounting and Enterprise Resource Planning (ERP). They advise companies on making the transition to the Cloud and the adoption of social tools and new media marketing to help make businesses more effective. More here ### 3 Cardinal Sins When Selling The Cloud By Marcus Cauchi, Sales Trainer “Thanks for coming in to see me again John. I wanted to tap your brains to see if you can fix the unreliability problems we've had with the network going down this week? I think it's because Cosco don't like talking to Jumper and our customers are throwing a hissy fit.” “Mr Jones, let me tell you why our solution is perfect for you! The carbungulator on the defibrillator of the Rippitz has ample latency to prevent your sprogwangle from falling over when you have a DNS attack on your network.” “Huh?” “Yes, it's true. Our Rippitz ethernet whizzbang is the most redormant. In addition our SaaS, CaaS and AaaS can save you squillions of pounds through BPM and hosted CRM.” “That sounds excellent sonny. Let me have a proposal with your costings. Can you get it to me before the weekend so I can look over it, and give me a call in a couple of weeks. I'll be discussing this with the Head of Janitorial Supplies at the Board Meeting next month and we should know by then where we stand. Thank you and goodbye.” Sin 1: No jargon, ever. If you want your prospects to not glaze over and to buy from you, stop using jargon. It's neither big nor clever. Why would you use language that your prospect doesn't understand? Stop making yourself the smartest person in the room. He'll not thank you for it and you're about to waste your evening writing a proposal for someone who was NEVER going to buy from you. Why? Because ALL RFPs are bogus. Yes, I said ALL. I mean all. Why do you need an RFP if it's a real selling situation? Can you win it? Are you going to? Why are you writing it then? I know you're playing the law of averages. Throw enough s*** and some has to stick. That's a proper fool's game and results in doing shedloads of free consulting. Sin 2: Failure to Diagnose – the interesting disease. In the example above, do you even know what the real problem is? Have you bothered to diagnose? Did you ask any questions? Did you even listen or did you just leap in trying to solve the symptom? What happens when you have a boil on your bum and the cause is your dodgy liver? If you burst the pimple what happens? It comes back somewhere else because you haven't fixed the correct end of the problem. The acronym STFU is especially relevant for most of the boys reading this because you are most often guilty of premature presentation syndrome. Ladies, you're almost as bad, but I give you some credit for having the ability to STFU and listen. They show you a symptom. The light is red but you still plough in because you want to prove how much you know. Again, don't try and be the smartest person in the room, or else you will waste a lot of time kissing frogs and getting very slimy lips. You will become a professional visitor. 
If you manage people who go out and try to meet lots of people - anyone with a pulse (optional) who sits still long enough for them to whip out their 12 inches of pure PowerPoint hell - this is for you. Be interested, NOT interesting. Do this and he will tell you how to sell to him, he will do the presentation for you, he will close himself and he'll thank you for taking his money. Do what you're doing and continue racking up debt on your credit cards, get fired or leave your current job within 12-18 months because you can't make it work, and then blame the economy, the company, marketing, the management, finance, your pricing or the competition … but do NOT take responsibility for the fact you aren't selling. At best you're order-taking. And only then when you get lucky and people buy in spite of how you tried to sell to them. Sin 3: Collateral, catalogues and websites are NOT a substitute for selling. They are a cop-out. You don't buy off brochures unless it's cheap-as-chips commodity stuff. If your job, livelihood or reputation depended on it, you wouldn't in all likelihood buy from a piece of paper or off a web page, would you? You might, but as a rule, what's likely to happen? And if it was an important purchase, wouldn't you want someone to help you make sure it was the right purchase? No egg on your face. No embarrassing trips to the headmaster's office for a caning when the company's reputation is in tatters, or appearances in front of another Parliamentary Select Committee to investigate another bad purchase scandal? Product knowledge used too early in the sale is LETHAL. If you insist on showing up and throwing up with virtually the same crappy feature-and-benefit rap that your rivals are doing, you deserve to end up doing the free consulting, used as column fodder and ending up in voicemail jail. Never, ever, ever sell using features and benefits. And only present once, to everyone, for a final decision – yes or no and nothing in between. I know. I know. … “Blah blah blah. You have to present! You can't sell that way! You have no idea what you're talking about, Marcus. I've been selling Cloud for the last couple of years and I can tell you we have always done it that way and it works for us.” Works! Works! Most of you are out there working hard, trying to meet prospects in a crowded, competitive market. You end up in beauty parades and in the end it comes down to price more often than not. And your positive prospect who was enthusiastic and loved you (“That's the best presentation I've ever seen John. Send me a proposal so I can show it to my boss. He'll love this.”) turns into the Scarlet Pimpernel and gives you unlimited access to his voicemail system as he goes into avoiding-you mode. When you leave the office, leaving him to think about it, the only person thinking about it is …YOU! Stop selling like a clown and consider why your prospect will buy from you. You cannot differentiate in what you sell, but you can differentiate in how you sell. If you want to learn how, go to http://www.london1.sandler.com/content/show/60095 and click on the Free Stuff to audit yourself and find out just how good a salesperson you really are. I bet you don't dare even take a peek! ### IBM Cloud Opinion Piece: Alleviating IT headaches for small and midsize businesses By Simon Porter, Vice President Mid Market Sales, IBM Europe New technology can make the lives of small and midsize businesses (SMBs) more complicated. 
In a survey conducted by Techaisle, an SMB market research organization, 54 percent of SMBs claimed that their IT difficulties actually increased over the past 3 years. Another 39 percent said these IT struggles outweighed their business challenges. As they constantly find themselves dealing with IT issues rather than reaping the benefits of these solutions, precious time and resources are being diverted from identifying new market opportunities to grow their business. With the rise of Big Data, increasing amounts of information are available from sources ranging from Facebook “likes” on a product to mobile device applications like Foursquare. This gives SMBs a whole new level of insight into their customers that they’ve never had before. According to IDC’s Digital Universe Study, the digital universe doubles every year, generating 2 zettabytes this year compared to just 1 last year. There is a pressing need to simplify IT more than ever as data needs are rising, but SMBs don’t have the resources to spare for dealing with it. According to Techaisle, 72 percent of SMBs say that IT vendors should simplify technology. Technology providers need to offer SMBs personalized solutions that are integrated and easy to understand. Evidence from a Cloud study by IBM in the UK shows that SMBs are now looking beyond the cost and resource efficiencies enabled by the cloud, and focusing instead on how the cloud can improve business outcomes and bring strategic value. The study revealed that the advantages of the cloud are being recognised by the majority of UK SMBs -- two-thirds of the senior managers surveyed had either already implemented cloud services or intended to in the future, with 45% of UK businesses looking to do so within the next two years. The increased ability for employees to work with greater mobility and flexibility was identified as the most popular reason to move to cloud services (39% of respondents), with cost efficiencies named as the second most popular reason (33% of respondents). To help SMBs focus on generating new revenue streams and increasing profits rather than IT headaches, many are turning to Managed Service Providers (MSPs) to deal with these technical obstacles so that employees can shift their focus to their primary task of growing the business. In addition, MSPs eliminate the need to hire in-house support staff. This allows SMBs to increase their technical capabilities while maintaining cohesiveness. By utilizing cloud-based solutions, MSPs can help SMBs store and analyze more data than ever before, without risking a drop-off in performance. By harnessing the power of Big Data, SMBs gain valuable insight which can help them shape the direction of their business. In a further survey of UK MSPs, more than 40% of MSP respondents identified the poor general understanding of the benefits of the Cloud, and of how businesses can take advantage of its characteristics, as a real barrier to adoption. 20% of MSP executives also specified skills shortages in sales, solution design and security as another barrier. Others commented on the need for effective marketing capability and management skills. Many MSPs also felt challenged by longer sales cycles, with clients taking longer to commit to new projects and contracts. 
With this in mind, IBM will soon be announcing a comprehensive program of support for MSPs to help them create new business opportunities and accelerate growth, including: incentives to increase profitability; offerings leveraging IBM's world-class software and hardware; collaboration with IBM's existing ecosystem to expand opportunities; joint marketing programs to fuel growth; and skill building to enhance expertise and align with high-value opportunities. There will be a global virtual event on September 26th to announce the launch of this new PartnerWorld initiative for Managed Service Providers. You can register here for this event. There are two key points here: firstly that SMBs need to be ready to take advantage of advanced IT solutions and the potential benefits of the Cloud, and secondly that vendors and third parties need to simplify the management of these solutions for SMBs and provide support for MSPs in the form of marketing, financing and skills. This combination will accelerate the MSPs’ ability to provide solutions that can in turn accelerate the SMBs’ ability to focus on creating new business and enhancing their profitability. About the IBM UK Cloud Research Study: For the independent study, conducted by YouGov in collaboration with IBM, the opinions of 530 senior managers at small and medium-sized companies in the United Kingdom were surveyed. It was conducted in the second quarter of 2012 to capture current and upcoming business and IT priorities for cloud computing. About the IBM UK MSP Research Study: Conducted in early 2012, this independent study, which surveyed the opinions of just over 100 MSPs in the UK, was supplemented with further in-depth interviews with a subset of the respondents. About the Author Simon Porter Vice President Mid Market Sales IBM Europe Simon was named Vice President, IBM Mid Market Sales, Europe in January 2012; from March 2010 he was Vice President Northeast Europe. His team is responsible for IBM’s portfolio of software, services and hardware for organizations with fewer than 1,000 employees in Europe - a market opportunity of over $70bn in 2012. Simon led the transformation of this business from a direct sales coverage model to a Business Partner-led model in 2010 to better capture this enormous opportunity, working with over 1,500 Business Partners and developing new channel relationships with ISVs and Managed Service Providers to capture the cloud opportunity. Between August 2008 and February 2010, Simon was Vice President Sales, General Business Northeast Europe, Systems & Technology Group. He led the team responsible for serving all Small & Medium Business clients for IBM's storage and server business, significantly growing market share, channel participation and revenue. Between January 2007 and August 2008, Simon was Vice President Sales, Global Engineering Solutions Northeast Europe, Systems & Technology Group. He led the team responsible for providing innovative, semiconductor, customized systems and consulting solutions to clients, leveraging IBM’s research, Intellectual Property portfolio, and product design skills to help clients bring more innovative products to market faster and at lower cost than doing it themselves. This covered clients in the telecommunications, transport and video surveillance segments. Prior to this position, Simon was Vice President Industry Systems, Systems & Technology Group, Northeast Europe. 
From 2005 to 2007 Simon was Vice President Lenovo Alliance, Northeast Europe, responsible for the transition of IBM’s PC business to Lenovo and the subsequent management of the Alliance between the two organisations. Between 2003 and 2005, Simon was Vice President, Computer Services Industry for Europe, Middle East & Africa. He led the team responsible for relationships and business with Consultants, Systems Integrators, and Outsourcers. This role involved managing these firms as clients of IBM, as well as developing business partner relationships for all of IBM’s products and services. Before joining IBM in 1987, Simon earned a Bachelor of Arts (Hons) degree in Accounting & Finance from Nottingham University. He also holds a Diploma in Management from Henley Management College. ### Interview with Antonio Ferreira of Lunacloud CTC: Antonio, tell us about yourself and your experience in the cloud computing industry? I have been in the Internet business since 1994, when I launched an Internet Service Provider. Back then, Internet access was not as common as it is nowadays, until adoption took off in 1998/99. At the moment, I would say that we stand in the same place in terms of the adoption curve for cloud computing. I sold my ISP to a US multinational back in 1999, but I have stayed in the industry as managing director of other Internet companies, including VIA NET.WORKS (networks), Amen (web hosting) and Claranet (managed services provider), which acquired this US company in 2005. That's when I met Charles Nasser, founder & CEO of Claranet and my partner in Lunacloud. I also authored some books on web programming in the US in 1998/99. Therefore, hosting in general has been my area of expertise for many years, including virtualization since 2006/2007. The cloud concept was being shaped by then, until it became what it is now. CTC: Who are Lunacloud and what is your differentiator in a very crowded IaaS market? Lunacloud is a new company that Charles and I started in 2011, but it only saw the light of the market in June 2012. The IaaS market is indeed crowded, very much the same way the Internet access market was in 1998/99. However, there is still a dominant player, which is Amazon, and many smaller players. We believe that some business models will not survive, but some will, during the consolidation phase that will inevitably follow. Given our experience in the market and the backing, we strongly believe we have a successful offer that will attract customers and distinguish us from competitors. It is a harder task when you're small, but it will become easier as we grow. Lunacloud is a pure-play cloud provider: we only do IaaS and nothing else, not charging setup or recurring fixed fees, or asking for term contract commitments. Our customer is a customer only if he likes the service. We provide a lot more flexibility than other competitors, as a customer can choose any mix of RAM, CPU and DISK space for a Cloud Server, which allows 307,200 different server configurations! And these resources can be changed on-the-fly, not requiring a server to reboot. This is what we call hot resize! Also, we engineered our platforms to use the most up-to-date infrastructure. We did some benchmarking of CPU, DISK I/O and RAM access against the main competitors and we are way better in terms of the performance we offer for comparable resources (happy to share the benchmark results, by email to info@lunacloud.com). 
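To put that configuration count in context, a figure like 307,200 simply falls out of multiplying the number of selectable steps for each resource. The step sizes below are assumptions chosen for illustration – Lunacloud's actual granularity isn't stated here – but the arithmetic shows how quickly independent resource choices multiply:

```python
# Illustration only: how a count like 307,200 server configurations can arise.
# The ranges and step sizes below are assumptions, not Lunacloud's published limits.

cpu_options = 16     # e.g. 1 to 16 virtual cores, in steps of 1 core
ram_options = 64     # e.g. 0.5 GB to 32 GB of RAM, in steps of 0.5 GB
disk_options = 300   # e.g. 10 GB to 3,000 GB of disk, in steps of 10 GB

# Every combination of the three independent choices is a distinct configuration.
total_configurations = cpu_options * ram_options * disk_options
print(total_configurations)  # 307200
```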
In terms of price (and I don't believe the price wars have started yet), we position ourselves around 20% cheaper than the dominant player; we genuinely want to use our economies of scale to provide a lower cost for the services that we provide. In addition to flexibility, power and value... we are a lot simpler in terms of messaging, the account creation process and resource management. Simplicity is elegance. CTC: What countries have Lunacloud deployed into? We started with the UK and Portugal and are now working on opening France, Spain and the Baltic States (Estonia, Latvia, Lithuania) in 2012. CTC: What are the development plans for Lunacloud? Germany and other European countries will follow in 2013, and outside Europe only in 2014. We have the ambition to become a global player, with nodes across the world, in locations that differ from our competitors (for a genuine alternative), all based on Tier 3+ datacenters and within 40 ms of our customers, for optimized performance. CTC: Where did the name Lunacloud originate from? There are two ways to understand it, the first one being "Luna", easy to spell in any language and which means "the moon"... therefore an analogy with "the cloud from the moon". The second one is a well-kept secret for now... guesses accepted. :-) CTC: What are the trends your organisation is noticing within the marketplace? The cloud definition provided by the NIST, which splits the service models into IaaS, PaaS and SaaS, is being clearly followed now and the market seems to be maturing, with the cloud not being such a "cloudy" concept now. We notice that there is still a high tendency for organizations to deploy their own "private clouds", until they trust public cloud service providers more. We're still 2-3 years from major cloud adoption, which I believe will only occur when organizations and customers finally trust their cloud providers. CTC: How do you intend to attract new customers onto your platform? We will rely a lot on word of mouth from happy customers, including the 100+ beta testers that we had for 3 months prior to launch. We know it takes a while, but it is the most effective way of winning loyal customers, because they will rely on the suggestion of a friend or partner. However, we are also doing some traditional communications through online media. Given my past and the fact that Lunacloud is a new start-up, I have also had the opportunity to speak with many journalists and analysts. As we make our differentiators clearer, I believe we will attract new customers. After all, opening an account with Lunacloud is free and the cost of using the resources is lower than anywhere else. There is nothing to lose in giving Lunacloud a try. CTC: A question we ask everyone: what is your definition of cloud computing? My definition is the one developed by the NIST. There are 3 service models - IaaS, PaaS, SaaS - and cloud services must comply with 5 characteristics: on-demand self-service, broad network access, resource pooling, rapid elasticity and measured service. On average 60% to 70% of the so-called cloud providers do not follow this definition. This creates confusion. I state that we are a pure-play cloud provider. We only do cloud and we do it at 100%. CTC: Tell us what your average day entails? It's like being back in the old start-up days, plus having to manage other more established businesses. I really enjoy what I am doing, so I dedicate at least 12 hours per day at this moment to running Internet businesses, with a special focus on Lunacloud. 
I usually start at 8 a.m., check statistics and news from the previous day and then work on the different project streams during the day. I enjoy talking to customers when I can, so I usually take some time to do that too, even if it is a small customer. This keeps me well informed about what customers need.

CTC: Who is the team behind Lunacloud?

We have a group of about 20 people in the UK and Portugal, covering infrastructure, network, engineering, operations, support, marketing, finance and HR, as well as some external outsourced functions. We also have the backing of Claranet, which allows us to be a small and flexible start-up while having some resources and processes that usually only bigger companies have.

CTC: And finally, if you could change one thing within the cloud computing industry, what would it be?

Although we have one of the most elegant brands, I would remove the branding, so that customers could buy cloud services based exclusively on their merits of performance, reliability and price. Branding and communication make it harder for customers to perceive real value. I might review my position when we become better known. ;-)

### Interview with Richard May of virtualDCS

CTC: Richard, tell us about virtualDCS and the ethos of the company, what makes virtualDCS unique?

The founders of virtualDCS have pioneered the development of the Cloud Computing industry for over a decade. As one of the first companies in the world dedicated to Cloud services, our customers are confident that they only receive the finest solutions. virtualDCS believe that every business deserves the right solution; therefore our approach is to work in partnership with clients to ensure that their infrastructure is ready to exceed the service levels demanded by their business. We believe in forming honest, ethical partnerships, only recommending solutions that will benefit your company, whilst ensuring that you don't pay a penny more than you need to. Our unique approach and experience enable us to provide businesses with highly available services using the latest virtualisation technology, such as VMware. Working with our customers, virtualDCS bring clarity to Cloud Computing, delivering robust services underpinned by outstanding levels of client care.

CTC: What is your background in I.T.?

When I was 11 my family bought me a Sinclair ZX81 computer, and my fascination with the application of computer technology has never stopped. As technology evolved, so did my interest, and in 1994 I graduated from university with a 'computing in business' degree. Since then I have held a variety of positions, including I.T. Manager at a textile company, implementing information systems for BT and ICL, and acting as development manager for Planet Online and Majestic Design, where I assisted in the creation of online web solutions. Working at ICM from 2000, I developed a strong interest in hosting I.T. services and the benefits of virtualisation technologies. Harnessing this experience, I was involved in several projects for VMware and Microsoft which have enabled the Cloud to be what we see today. In 2008, I decided to form virtualDCS with the focus of providing high-quality solutions for businesses, utilising virtualisation technologies to achieve availability and flexibility which were simply cost-prohibitive when using physical technologies.

CTC: What services and products do virtualDCS offer?

virtualDCS offer a range of services categorised into the four key areas of Hosting, Disaster Recovery, Professional Services and Support.
Our Hosting Services include: Infrastructure as a Service, Software as a Service enablement, Voice as a Service, Hosted Desktop solutions and Physical Colocation. virtualDCS also provide a range of Disaster Recovery solutions, including CloudCover™, which replicates data in real time to a virtual platform, Workplace Recovery and On-site Server Recovery. In addition to our support services, virtualDCS also offer consultations for I.T. projects, such as Private Clouds. We provide an array of services, with the goal of enabling our customers to get the most from Cloud technology.

CTC: Do you have a partner program?

virtualDCS do offer a program where our partners can resell any of our services through either a reseller or a referral scheme. We also offer a lead registration arrangement in an effort to increase partner margins and to protect them against competition. In addition to reselling our existing services, virtualDCS has also created a unique 'Virtual Service Provider' solution, which enables the reseller to create and sell tailored Cloud Computing solutions using our platform. We recognise the value of working closely with our partners, and as their business grows, so does virtualDCS. This is why we offer support every step of the way, from proposal creation to implementation. We also provide free personalised, branded marketing material to help our partners close sales.

CTC: What is the technology stack that underpins your Cloud infrastructure?

Our platform is completely VMware-based, with three tiers of resilient storage and physical servers. The platform is designed to automatically move the virtual machines between physical hosts to ensure consistently high levels of performance, with no downtime. The VMware technology used also means that if a physical server within the platform fails, the virtual servers will be available again within minutes. Some of our additional platform features include enterprise firewalls and multiple Internet connections from separate providers.

CTC: What does your average day entail?

As Managing Director of virtualDCS, I am lucky that we have now reached a size where the team covers most of the day-to-day functions required to keep the business running. This means I can spend my time focused on evolving the business. This includes the continuous progress of our ISO27001 security processes and procedures, new product development and working with new customers to make a difference to their I.T. challenges.

CTC: What is your view on Cloud computing and where do you see the marketplace heading?

My view on Cloud Computing is that it is rapidly becoming, if it is not already, one of the most abused I.T. terms in history, covering anything and everything. If we assume that what people usually mean by Cloud Computing is 'I.T. services delivered via the Internet,' then I am sure that as networks continue to improve, all services will eventually be delivered in this way. Applications will be presented by software companies through the Internet, where the need for dedicated server equipment and server rooms will become a thing of the past for most businesses.

CTC: A question we ask everyone, what is your definition of Cloud Computing?

If I am giving a definition of pure Cloud Computing, then to me it is defined as I.T. services that are delivered to end users as utilities, like electricity and mobile phones currently are.

CTC: If you could change one thing in the world, what would it be?
I'd love to stop companies creating products that are designed to simply break, so that they can charge to fix them or sell them on again.

### Distribution wants to help in times of turmoil

By Mark Charleton of Blue Solutions

Some of you will have read by now that Doyenz is exiting the UK market, and at very short notice. Doyenz will no longer be providing or supporting the rCloud backup and recovery service in the UK by the end of this week, 10th August 2012. I received email confirmation myself this morning. Our UK resellers and MSPs were emailed directly at the same time to notify them of the change. Based on many of the comments I have read in the media and on Twitter, and the calls I have received into the office today, I also appreciate how extremely frustrating this is for our reseller and MSP partners. It is not easy for international firms to break into the UK market, particularly in a poor economy, and sadly, I suspect this might not be the only withdrawal we will see over the next 12-24 months. It does happen. The distribution market in general has a very big role to play in trying to limit the impact of such withdrawals on the reseller community, and it needs to take this very seriously. Along with standard due diligence when agreeing to work with vendors, distribution should:

- Only work with vendors that it believes have products and services that add competitive advantage to resellers
- Be prepared to use the vendor's products itself, in-house, to act as a test bed and check on-going suitability for the reseller community
- Maintain a strong and close working relationship with the vendor to ensure constant communications on market and product developments
- Build a broad portfolio of complementary vendors to present best-fit choice [and alternatives] for resellers

When situations like Doyenz's exit from the UK market happen [and often completely by surprise], distribution must have:

- Accurate information available for resellers
- Well-trained, knowledgeable product experts available to support resellers
- Direct telephone and email support
- A migration path [where possible] for resellers to work with alternative cloud suppliers

Obviously I'm biased given my role at Blue Solutions, but I do think it is really important for distribution and the reseller community to stick together in times of turmoil such as Doyenz's exit; we want to help and we will help wherever possible.

### How to work with a Trade Association

By Lindsay Smith, EuroCloud UK Secretary General

I've been a member of trade or professional associations for over 30 years. I've seen that some people, some organisations, get a lot out of them and others don't. Is there a role for them today? Well, take a look at some of the evidence: they have always been around, and in the UK today there are many hundreds of them, from the Aberdeen Fish Curers & Merchants Association to the Yachting Journalists Association. So, market forces being what they are, it's a fair conclusion that they provide value. By that I mean that they have to generate more benefit for their members than they cost their members. Do you want to know how to get at that value? Well, that's what this is about.

How to use them

I'm writing as the Secretary General of EuroCloud UK; we are a not-for-profit trade association for the UK Cloud industry, part of the wider European EuroCloud movement with over 1,000 business organisations as members.
EuroCloud UK is a particularly relevant place from which to talk about the benefits of a trade association because Cloud is a very disruptive force in the IT sector and in the wider economy. Disruptive means it breaks the old model of doing things. Breaking and questioning, changing and creating are zealously pursued – and trade associations can't and shouldn't escape the inquisitor's attention. It's very healthy – assume that old ways of working are not relevant until you can make the case to be included in the new world. Well, we have made our inventory, and we have asked the question of our members and, thumbs-up, we are a significant benefit, but that benefit is wasted unless we reach out, tell you how to get it and make sure it's easily accessible to everyone. So read on…

What is a Trade Association?

There are many, and they have been around a long while, so I can be brief here. Here are the main characteristics:

- Mutual Organisation – owned by and answerable to the members, and run for the members
- Aligned – to a particular sector of commerce, business or profession
- Objectives – to further the interests of the members and the sector, to speak with a legitimate, collective voice, and to influence and stimulate the environment

There are several other characteristics which may or may not differentiate one from another, but tight definition and segmentation really isn't needed here. If it looks like a duck, swims like a duck and quacks like a duck then it probably is a trade association, in a manner of speaking. Before leaving the dull bit on definitions, though, let's just restrict my scope to trade associations that have businesses as their main (or only) type of member. Many allow for or cater to individuals, and some of what goes below will be relevant to them, but there's not much I've found on the web that deals with business trade associations (which is what EuroCloud is), so that's my topic here.

What to Expect from a Trade Association

- Positively supporting and promoting the general business of the sector to which they are aligned
- Promoting the interests of the members
- Creating opportunities for the members to promote themselves

But you know that. What you want to know is how to get maximum benefit (direct first, and indirect if it's going). Okay, well, just a second before we pitch in to that. A quick word on perspective: the benefits a member wants and can make use of are going to vary depending on the size, maturity and type of business you are. I'm going to try and cover it for everybody, but the needs of an established, global enterprise and those of a small-to-medium-sized new business are going to be quite different – there is something for everybody, but pick and choose what will be attractive to your business. So, thank you for staying with me, let's get stuck in.

Talk to the Secretary

There's going to be a lot of talking in all of what follows. But you see, communication is essential to getting the value you want. The Secretary is the key – involved in every aspect of the affairs of the association, but particularly membership (and members' contentment, retention and development), events, sponsorship, outward communications and promotion. Some associations are big enough to have specialists in charge of these functions, but get to the person at the centre who holds all the strings first.
Not only can he make things happen, but he can also talk through your goals for your membership (and help you formulate them if you haven't arrived with a pre-conceived set), explain how other people use their membership and give examples of what works well. He's also well positioned for matchmaking. Matchmaking is really valuable. Networking I'll look at below, but if you itch for some introduction, alliance, opening or opportunity, make sure the Secretary knows, and he can take some of the dipity out of the inevitable serendipity; what you're left with is, well, almost serene. The Secretary wants to hear from you; that's his job, and he can't do it well without knowing you, knowing what you want and having a plan to do something about it. Now you are already getting more out of your membership than the members who don't make this basic connection. Don't ignore the other Board members (sometimes called committee, sometimes council), particularly if you already have a good relationship there, but the odds are they will pass it to the Secretary to make something happen.

What do I ask for? What can they do for me?

This is where size and maturity play a big part. A multinational may be looking to reinforce its brand and image, be reaching out to an ecosystem to reassure them of its commitment to a particular geography or community, and have a catchment for new & emerging businesses it wants to attract to its constituency. A newer or smaller business may want to get visibility in the supply chain, or prospect for alliances, partners, resellers or customers. To both it's a form of promotion, and the association can actively and passively promote brand, company, product/service, image and individuals. Any association's reach is far wider than its membership. Here at EuroCloud UK we have over 1,000 businesses across Europe who are members of our wider movement; newsletters, briefings and announcements reach several thousand more; and over half the attendance at events can be non-members, which includes press, commentators, government and potential customers. Add to this the media and PR sponsorship and relationships we cultivate and the social media channels we use, and it all adds up to quite some reach. So how and what do you use it for?

1. Get your company profile in the newsletter and on the website. Trade associations are proud of their members and want to trumpet you joining and any special news you have.
2. Register for events (as early as possible) and attend – we give four paid-for places for our members at any event. You may also get access to the registration list and be able to set up meetings, and post details about you and who you might like to meet.
3. Talk at an event – our events are a balance between content, discussion and networking. There are great opportunities for 5- or 10-minute presentations that are strong on content and tie in to the theme of the event. All speakers introduce their companies as part of their authority for expressing a view or insight on a subject – and the audience wants to know who you are and what you do.
4. Talk at an event and you will be on the stage for the panel discussion and questions, which are an intrinsic part of the show. Tell it how it is, how you see it – it's a good introduction and, you will find, helps people get to know you.
5.
Sponsor an event – within guidelines this gives you joint editorial authority over content and shape of an event, more time to speak and develop your theme, branding in the promotion at the event and what is written and reported afterwards and a booth/space to station staff and promotional materials – people want to know about you and then have a locus to find out. 6. Network at events – more on this later 7. Ask your association to co-promote one of your events. In the UK, we at EuroCloud do this for our members. Simple bandwidth and scheduling mean we can only do one co-promoted event per month, so how do we choose? Those who ask first. Book your place when you know your event date. Sure, we have to ration the number of these we do – particularly as we provide a free speaker – and they have to work to some guidelines, but the magic ‘secret’ is to talk to the Secretary about it, just ask. 8. Blogs, white-papers, web-content – all informative, relevant and original content is a benefit to the author and a benefit to the membership. Trade associations are an exchange of information, ideas and opinions. Most provide for and encourage this on-line participation. 9. Advocacy – trade associations lobby government and the authorities, liaise and participate in other associations, on behalf of and as a collective voice for their members. Express your views if you want them to be part of this, at events, in discussion, in response to surveys or by writing to or calling the Secretary. 10. Awards Programme – EuroCloud like many associations has an active and highly competitive, well publicised Awards programme. Ours starts taking submissions in April with closure in June and an Awards event held in July. We benefit from our European structure because all winners of the national Awards go through to a European final at the EuroCloud Congress in October (this year Luxembourg). All our entrants (60 in 2012) get a great deal of promotional activity wrapped around their participation through media interest, our promotion and media sponsorship. It’s an effort measured in minutes to participate – for a great deal of presence and publicity. But only those who have thought about how to get benefit from participating get access to this gift of a PR opportunity. 11. There’s another way a trade association can support you – only we haven’t thought of it yet… if you have an idea not in this list and think it is consistent with the mission and values of your association – call the Secretary and just ask. Networking Networking justifies a separate mention because it’s a special and yet unpredictable benefit. Combine it with matchmaking (see above) and speaking at events or sponsorship to try and get a little more predictability in the mix, but there are some handy tips to keep networking productive. Everyone that goes to a trade association event wants to have a conversation. If you are at a loss, all you need to do is say just that. I’m not going to patronise with the etiquette of networking, list my 23-point plan for success through networking, paint a bestiary of networking animals or even remind you to smile, listen and be amiable. Ah, sorry, I lied. I am going to tell you to smile, listen and be amiable. That is the key that opens every door – plain, old-fashioned courtesy wins over any other approach, every time. Networking is like one of those magic artefacts in a Harry Potter story. 
It can and will provide you with an unknown but valuable insight on every occasion that you consult it, but you have no control over just what that is. The best formula for dealing with this is to enjoy the serendipity of the moment. Find and benefit from the interaction you are presented with, and play the long game. Give something back you think your new acquaintance will enjoy and then ask their advice – she or he may know someone who is just the person you need to meet. The other aspect to the long game is to think about networking as not one, but several attendances at sequential events. It won't be long before the whole room opens up to you as any familiar gathering does. You are known, people recognise you and want to talk to you about opportunities they have heard of since they last saw you. Your third visit will always be much more productive than your first.

Ask not what your association can do for you but what you can do for your association

Play your association by these simple rules and you will achieve a return on your investment that will make it the most effective part of your marketing spend. You also earn a coveted status that simple cash doesn't buy. You become known, surprisingly widely, through these connections and their connections as active, committed and a participant of some gravitas in your sector. People, ideas and relationships will start to gravitate towards you. The mutual respect and trust of your peers converts to trust in the market, and in a young industry sector that is worth having.

Lindsay Smith is EuroCloud UK Secretary General. For further information visit http://www.eurocloud.org.uk

### Creative agencies can run and scale their digital projects in the cloud with full control

By Greg Rusu, general manager of Zunicore, PEER 1's cloud division

The cloud has dramatically reduced the capital required to innovate across every sector. But for all the promise of utility computing via the cloud, many creative agencies – including web designers and developers, marketing agencies and ad agencies – are struggling to realise the benefits. They have found themselves struggling to quickly and cost-effectively manage on-demand computing resources to serve a global audience. Those in the creative agency fields have to adjust their business models to keep pace with technology changes. For web designers and developers, traditional desktop software will eventually be replaced by cloud applications. Licensing will change to a pay-as-you-go model, and investment will be made in this rather than in existing desktop software like Adobe's product suite. Though agencies by nature like to focus on the product they create, not the delivery method, the internet has forced them to change their business models. They're now responsible for generating demand and fulfilling it. And today demand generation almost always has an online, and thus strongly viral, component. In other words, a campaign can instantly take off on Facebook or via Twitter or YouTube and unexpectedly drive more traffic than anticipated. When that happens, the agency needs to ensure that its client's website can handle the spike. Unfortunately, many agencies fail to rise to this challenge and instead stand by helplessly as the website scales beyond the client's allocated spend. Many more are forced to over-invest in computing resources ahead of time as a fail-safe measure. Either way, they end up watching their immediate profit margins and long-term profitability erode.
This element of risk has become part and parcel of the agency model. In other words, clients now expect agencies to do more than come up with a creative idea and write code; they expect them to handle every aspect of a campaign, including delivering the campaign and microsite, and ensuring any online component supports millions of site visitors without fail. The key to tapping into the subscription-based revenues enabled by web-related offerings such as hosting and search engine optimization (SEO) is to efficiently and economically deploy infrastructure to support campaigns and promotions. After all, tremendous revenues are at stake. At its core, the utility aspect of cloud computing promised to help creative agencies deal with the unpredictable nature of their business model and deploy and use infrastructure on an as-needed basis – without much planning or forethought. But the cloud hasn't fulfilled the promise. In reality, it's almost as complicated to bring up a cloud instance as needed as it is to become a hedge fund manager. There is a lack of control, inflexibility, over-complexity, difficult-to-use scaling tools, unreliability and poor visibility. All of these issues combined mean that developers end up spending tremendous resources – notably in time and IT management – to launch and run their campaigns and programs in the cloud. As a result, a failed campaign can tie up valuable resources with little or no return for clients, while a winning campaign can easily exceed available resources, leading to financial failure for the agency. A need surfaced for a solution to specifically address the needs of creative agencies with unique and changing technology requirements as they launch varying products on the web. Web developers and designers should be able to instantly deploy feature-rich and optimised cloud infrastructure and automatically scale those computing resources to support their campaigns and programs within an intuitive, easy-to-use interface. We feel we have achieved this with Zunicore, the new cloud division of PEER 1 Hosting. A fully user-controlled environment including customisable resource pools allows users to increase or decrease the size of their pool, or add virtual machines, based on their business needs. This type of control also provides Overdraft Protection against the user's hosting costs. Persistent storage area network (SAN) storage is more reliable, less susceptible to data loss and offers higher performance than local storage devices, which also incur additional costs. Hands-free autoscaling allows users to specify how much they want to scale their resources based on user-defined thresholds such as technical resources, cost or both. This makes over- or under-investment a thing of the past. We aim to offer creative agencies an unprecedented level of functionality so they can focus on their business and not on the IT delivery vehicle. In other words, the service is simple and intuitive to use and not a drain on the time and resources of in-house IT people. As a result, they can now confidently launch programs designed to generate unprecedented levels of traffic for their clients.
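Threshold-driven autoscaling of the kind described above is easy to picture in miniature. The sketch below is a generic illustration of the decision logic only, not Zunicore's actual API; the function name, thresholds and budget figures are all hypothetical.

```python
# Generic sketch of threshold-based autoscaling: scale on a resource threshold,
# but never past a user-defined cost ceiling. Not any vendor's real API.

def autoscale_decision(cpu_utilisation, hourly_spend, max_hourly_spend,
                       scale_up_at=0.80, scale_down_at=0.30):
    """Return +1 to add a VM, -1 to remove one, or 0 to hold steady."""
    if hourly_spend >= max_hourly_spend:
        return 0   # cost threshold reached: never scale past the budget
    if cpu_utilisation >= scale_up_at:
        return 1   # resource threshold breached: add capacity
    if cpu_utilisation <= scale_down_at:
        return -1  # well under-utilised: release capacity to save cost
    return 0

# Example: 85% CPU with only £4 of a £10/hour budget spent -> add capacity
print(autoscale_decision(cpu_utilisation=0.85, hourly_spend=4.0, max_hourly_spend=10.0))
```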
### Internet Abductions Of Olympic Proportions

By Ian Moyse, Workbooks.com, EuroCloud UK Board Member & Cloud Industry Forum Governance Board Member

Recent studies demonstrate that upwards of 25% of Internet bandwidth in an office is consumed by employees misusing the internet. According to Gartner, the average growth of business email volume is 30% annually, with the average size of the email content growing in parallel. Add to this the growth of Web misuse from streaming media, downloads, file sharing, social networking and spam, and it becomes pretty clear that the mismanaged cost to business of non-work-related Internet use is already bad and getting worse. With the Olympics about to start there will be a mix of impacts on internet use and the security around it. Users in the city are being told to work from home, with some companies, we hear, being instructed by officialdom to reduce their staff in the city during the Games period! How much time will these staff, working remotely at home on work-provided machines, really spend working? Will they be distracted by the usual lures of the web, Facebook, Hotmail, YouTube etc. – surely not!? Worse still will be those employees across the UK still at their office location who will be viewing the Olympic activities and news online; streaming video and live TV will become commonplace. What impact will this have, not only on the productivity of those guilty parties, but also on those around them whose bandwidth is being sucked away? There are plenty of examples already, including employees wasting more than two hours a day on recreational computer activities (according to a survey fielded by AOL & Salary.com) and, according to an IDC report, "30% – 40% of Internet use in the workplace is unrelated to business." Studies and surveys such as these typically focus only on lost productivity – and there's no doubt that's bad enough. But they rarely discuss the significant hidden financial impact of bandwidth wastage from these activities. We have started to take bandwidth for granted as it's become cheaper and more readily available. However, as the adoption of cloud-based solutions (like customer relationship management tools) increases, it will be critical to ensure the user has a good experience with Web-based applications, with their ability to work at speed unimpeded by bandwidth grabbers and slowdowns. Social networkers are as much to blame as habitual gamers, sports fans or file sharers: after 'posting messages,' the next two most common social network activities are uploading and downloading music and video content. Overall, the top bandwidth hogs reported include employees sending emails with large attachments, recreational Web surfing, listening to the radio over the Internet, music downloads, and streaming video over the Internet. One rogue user in an office streaming large files can impact everyone else trying to work. Clearly, there's a need to manage individual users' bandwidth usage. Perhaps by taking simple steps, such as giving users bandwidth allowances, admins can control the abuse. By blocking streaming media, allowing users to go to sites but without the ability to see streamed videos, bandwidth usage can be reduced dramatically. It may also be possible to block the downloads of certain file types or MIME types, such as Flash Video .flv files, unless the user has a legitimate business reason to view them. And blocking some MIME types can even help prevent users being bamboozled into infecting their own computers by malicious advertisements.
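The MIME-type blocking idea above is simple to express. The sketch below is a minimal, generic illustration of the kind of check a web filter or proxy might apply; it is not the configuration syntax of any particular filtering product, and the blocked types, extensions and function names are hypothetical.

```python
# Minimal illustration of a MIME-type / file-extension blocklist check, of the
# sort a web filter might apply. Generic sketch only; not a real product's config.

BLOCKED_MIME_TYPES = {"video/x-flv", "application/x-shockwave-flash", "video/mp4"}
BLOCKED_EXTENSIONS = {".flv", ".torrent"}

def is_blocked(content_type: str, url_path: str, user_has_exemption: bool = False) -> bool:
    """Block streamed or large media unless the user has a business exemption."""
    if user_has_exemption:
        return False
    if content_type.split(";")[0].strip().lower() in BLOCKED_MIME_TYPES:
        return True
    return any(url_path.lower().endswith(ext) for ext in BLOCKED_EXTENSIONS)

print(is_blocked("video/x-flv", "/olympics/live-stream.flv"))  # True: streamed video blocked
print(is_blocked("text/html", "/news/results.html"))           # False: normal page allowed
```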
The web has increasingly become a valuable business tool for research, for information (how often do you check a route, traffic or customer information with no second thought to the reliance on this medium?) and for cloud applications that are fundamental to the business. Action needs to be taken by organisations now to get control back of this valuable tool and medium and to reduce the negatives it can bring in terms of time wasting, HR cases (Facebook examples are rife) and security risks. With a younger generation entering our businesses, for whom spending their life on Facebook, Twitter and the web is the norm, it is better to get these policies and policing tools in place now, before it becomes even more of a risk and a hill to climb to introduce this change of culture and behaviour. Employees need to understand where the line is drawn: 'drive that internet car, but don't speed, don't drive dangerously and certainly don't pick up strange hitch-hikers along the way.' It's essential our internet workers are not wasting valuable work time, acting recklessly or going off-road using work assets on work systems! One straightforward approach is Software-as-a-Service Web filtering, which offers unique advantages that can now be brought into play. For example, bandwidth compression of all traffic from the cloud to your users' browsers, and even the ability to block Web adverts at the gateway, may conserve this bandwidth resource. Knowing and controlling where people can go and what they can do on the web is no longer an option; it's becoming a necessity. Over time, simple measures such as these can conserve a large amount of bandwidth and regain productive employee work time. An employee spending 10 minutes an hour outside smoking is visible, and you can choose to address it. An employee wasting the same time on internet activities is invisible; even to them that time passes quickly and unnoticed, and the internet is of more interest to more people than an activity such as smoking. Think of this as the hidden disease in your company and start to prevent it now. While we all appreciate the privilege of using the Internet for personal purposes at work, a small number of rotten apples on your network can truly ruin the whole bunch. The Olympics will drain your resources, productivity and bandwidth for sure! Draconian measures are sure to hurt morale, so it's a bit of a balancing act to find the right mix of measures that work. As the growth of cloud applications and hosted services continues, it will be more important than ever to keep these bandwidth hogs in check, lest the rest of the company suffer.

### Digital Realty Completes Acquisition of Sentrum Portfolio in London

The three-property portfolio totals approximately 761,000 square feet

San Francisco, CA – July 16, 2012 – Digital Realty Trust, Inc. (NYSE: DLR), a leading global provider of data center solutions, announced today the completion of the acquisition of a three-property data center portfolio located in the greater London area, referred to as the Sentrum Portfolio. The Sentrum Portfolio comprises approximately 761,000 square feet across three data centers located in Woking, Watford and Croydon. The purchase price was £715.9 million (equivalent to $1.1 billion based on the July 11, 2012 exchange rate of £1.00 to $1.55), subject to adjustment in limited circumstances and to additional earn-out payments based on a multiple of the net operating income from the lease-up of currently vacant space in the portfolio in the next three years.
“The addition of these high-quality, mission critical facilities to our London portfolio strengthens our position as a leading data center solutions provider in this highly strategic, global market,” said Michael F. Foust, CEO of Digital Realty.  “We now have eight data center facilities located across the greater London metropolitan area, which represent over 50 percent of our European portfolio.  In addition, the properties include additional inventory for future development that will enable us to meet local and international U.K. customers’ long term requirements.” “The portfolio is approximately 80% leased to a diverse roster of high quality companies that we are now pleased to call our customers,” said Scott Peterson, Chief Acquisitions Officer for Digital Realty. “In addition, the opportunity to lease the balance of the portfolio’s available space is a great opportunity to enhance the return on this investment for our shareholders.” “The completion of this acquisition combined with the success of our recent equity offering represent important strategic accomplishments for Digital Realty,” said A. William Stein, CFO and Chief Investment Officer of Digital Realty. “We significantly increased our presence in an important global market, enabling us to better serve our growing customer base, while providing our shareholders with a transaction that we expect to be accretive to our 2012 Core FFO on a diluted per share and unit basis (excluding one-time acquisition and financing costs).  As previously stated, we plan to provide updated guidance for the year in our second quarter 2012 earnings announcement.” The Company estimated the total square feet available for lease in the Sentrum Portfolio based on a number of factors in addition to contractually leased square feet, including available power, required support space and common area. The Company’s estimate of the percentage of occupied rentable square feet in the Sentrum Portfolio may change based on the Company’s experience operating the properties following the closing of the acquisition. About Digital Realty Digital Realty Trust, Inc. focuses on delivering customer driven data center solutions by providing secure, reliable and cost effective facilities that meet each customer's unique data center needs. Digital Realty's customers include domestic and international companies across multiple industry verticals ranging from information technology and Internet enterprises, to manufacturing and financial services. Digital Realty's 108 properties, excluding three properties held as investments in unconsolidated joint ventures, comprise approximately 20.8 million square feet as of July 16, 2012, including 2.3 million square feet of space held for redevelopment. Digital Realty's portfolio is located in 32 markets throughout Europe, North America, Australia and Asia. Additional information about Digital Realty is included in the Company Overview, which is available on the Investors page of Digital Realty's website at http://www.digitalrealty.com. Core FFO is funds from operations, calculated in accordance with the standards established by the National Association of Real Estate Investment Trusts, adjusted to exclude items that do not reflect ongoing revenue or expense streams, including transaction expenses.  Other REITs may not calculate Core FFO in a consistent manner. Accordingly, Digital Realty’s Core FFO may not be comparable to other REITs’ Core FFO.  
Core FFO should be considered only as a supplement to net income computed in accordance with GAAP as a measure of performance. Safe Harbor Statement This press release contains forward-looking statements which are based on current expectations, forecasts and assumptions that involve risks and uncertainties that could cause actual outcomes and results to differ materially, including statements related to the acquisition of the Sentrum Portfolio, the expected return on investment and the impact of the Sentrum Portfolio on 2012 Core FFO and 2012 guidance. These risks and uncertainties include, among others, the following: the impact of the recent deterioration in global economic, credit and market conditions, including the downgrade of the U.S. government’s credit rating; current local economic conditions in our geographic markets; decreases in information technology spending, including as a result of economic slowdowns or recession; adverse economic or real estate developments in our industry or the industry sectors that we sell to (including risks relating to decreasing real estate valuations and impairment charges); our dependence upon significant tenants; bankruptcy or insolvency of a major tenant or a significant number of smaller tenants; defaults on or non-renewal of leases by tenants; our failure to obtain necessary debt and equity financing; increased interest rates and operating costs; risks associated with using debt to fund our business activities, including re-financing and interest rate risks, our failure to repay debt when due, adverse changes in our credit ratings or our breach of covenants or other terms contained in our loan facilities and agreements; financial market fluctuations; changes in foreign currency exchange rates; our inability to manage our growth effectively; difficulty acquiring or operating properties in foreign jurisdictions; our failure to successfully integrate and operate acquired or redeveloped properties or businesses; risks related to joint venture investments, including as a result of our lack of control of such investments; delays or unexpected costs in development or redevelopment of properties; decreased rental rates or increased vacancy rates; increased competition or available supply of data center space; our inability to successfully develop and lease new properties and space held for redevelopment; difficulties in identifying properties to acquire and completing acquisitions; our inability to acquire off-market properties; our inability to comply with the rules and regulations applicable to reporting companies; our failure to maintain our status as a REIT; possible adverse changes to tax laws; restrictions on our ability to engage in certain business activities; environmental uncertainties and risks related to natural disasters; losses in excess of our insurance coverage; changes in foreign laws and regulations, including those related to taxation and real estate ownership and operation; and changes in local, state and federal regulatory requirements, including changes in real estate and zoning laws and increases in real property tax rates.  For a further list and description of such risks and uncertainties, see the reports and other filings by the Company with the U.S. Securities and Exchange Commission, including the Company’s Annual Report on Form 10-K for the year ended December 31, 2011 and Quarterly Report on Form 10-Q for the quarter ended March 31, 2012.  
The Company disclaims any intention or obligation to update or revise any forward-looking statements, whether as a result of new information, future events or otherwise.

### What is Information Security Really?

"We have gone on holiday by mistake!" – Withnail & I

Information security is very much misunderstood out in the business world, and in pretty much any of the other virtual worlds you care to mention. It means different things to different people:

To financial companies it is commonly viewed as a "required to have, because we are told we have to": they do what they need to do because they are forced to by a governing body (FSA, card brands, etc.). They do the minimum they need to do in order to tick the boxes, no more, no less.

To large and medium retail companies, information security is something they have to do because they are told they have to by the banks; they don't like it because it eats substantially into their profits, but they do the minimum they need to in order to tick the boxes.

To cloud companies it's viewed as "not their problem because it's not their data, thus not their responsibility", so they only do the minimum required to assist in their sales process, commonly ISO27001 as it's the easiest to attain.

To technology companies it's firewalls and antivirus; after all, they will "never get hacked" as they "are not a target", so they do the minimum required in their minds to provide security at the smallest cost possible, ticking only the boxes they need to.

Looking carefully at the examples above, you begin to see a pattern: nobody really knows what information security is, nobody really wants to do it as they think it costs too much, and if they do have to do it, they will do the minimum required in order to tick whatever box they need to. This leads me to ask a question.

"What is the minimum?"

Funnily enough, every organisation that I have spoken to in the examples above cannot answer that simple question. Sure, some of them will mention compliance (especially PCI DSS), but on the whole there has been no good answer, and that is quite interesting. Analysing further and digging deeper, the question becomes something different; it becomes:

"What is the minimum that we HAVE to do?"

So what is it you have to do? Is it securing card data? Changing your contracts to absolve you from any security responsibilities for the services you provide as an outsourcer? What in your mind is the minimum that you have to do to secure your operations? Analysing even deeper, the question becomes:

"What is the minimum that we have to do and what can happen if we don't?"

What are the consequences of you not becoming secure? What fines do you face? What bad publicity do you risk (let's face it, the British media LOVES to see someone fall from grace and reports heavily on it)? When you yet again analyse that question it changes again, to:

"What are our responsibilities?"

Now that is a good question, and it is the root question when it comes to looking at information security in your own organisation.
What are you as an organisation obliged to do to protect:

- Your owners / shareholders / stakeholders
- Your reputation / brand
- Your revenue streams / assets
- Your clients

These ultimately are the things that you are responsible for within your organisation – all of you, from the IT guy on the helpdesk, to the sales people selling your product, to the directors and shareholders that run the business itself. Information security is a company-wide concern on all levels, and one that in the current market cannot be ignored: companies are falling at the first security hurdle left and right, and security breaches are causing more lost and stolen revenue in the western world than any other criminal activity, and it's getting worse. Can your business afford a security incident? Think long and carefully about the answer to that question… If you need help with the answers, don't forget we are only a phone call away. T +44 (0)1622 873242

### Cloud Adoption: Let the Facts speak for themselves

The Cloud Industry Forum exists to drive best practice in cloud services delivery and to educate and inform the market on related issues. A core aspect of our activity is to conduct independent market research to provide clarity and transparency on key trends. The latest (of 3) research projects, conducted over 18 months and across 250 UK-based organisations, has just been completed and the highlights are as follows: 61 per cent of the organisations surveyed are currently formally using at least one Cloud-based service, with a 92 per cent satisfaction level with their experience. This adoption figure represents an increase of twenty-seven per cent since the original research was conducted in January 2011. Broken down, adoption in larger organisations is most prevalent, standing at 68 per cent, with adoption in organisations with fewer than 20 employees standing at just over half (52 per cent). This strikes a contrast to the previous research in November 2011, which found no discernible correlation between the size of an organisation and Cloud adoption, although it does reflect a similar position of Enterprise leadership that was originally seen at the beginning of 2011. The findings reveal little disparity between public sector (62 per cent) and private sector (61 per cent) adoption. The last research, however, found that the public sector lagged the private - 49 per cent and 56 per cent respectively - pointing to a dramatic increase in public sector Cloud adoption. In terms of future adoption, of the 39% not using Cloud services today, around one in four plan to do so in the coming year. Organisations that do not use Cloud and have no concrete plans to do so are nevertheless positive about, or at least not closed to, the idea. This roughly reflects the last survey, where 17 per cent said "Yes" and 59 per cent said "Possibly". Broken down by sector and size, public sector organisations and SMEs appear to be the most dynamic sectors for growth in the next 12 months, with 34 per cent and 30 per cent respectively. The research clearly demonstrates a solid increase in Cloud adoption in the UK and confirms that enthusiasm for Cloud services remains strong. Although there have been increases in adoption across the board, the public sector has seen the most considerable growth, which, given the recent Government interest in Cloud, should come as little surprise. In a departure from our previous research, enterprises of 200 employees or more have leapt ahead of smaller organisations in terms of Cloud adoption.
Just over three quarters (76 per cent) of respondents currently using Cloud services expect to increase their use over the next 12 months, slightly up from the previous figure, which stood at 73 per cent. Email services, collaboration services, data storage services and data back-up services emerged as the areas most likely to benefit from this expected increase. The research also continues to reflect that the primary reason for the initial adoption of cloud services is the flexible model of delivery (71 per cent), as well as scalability (66 per cent) and the low cost of adoption (58 per cent). Once again, operational cost savings were not seen as the major driver; however, they are increasingly important to the majority of respondents (52 per cent). Another aspect of cloud adoption is understanding the processes and assistance needed. It was noteworthy that 59% of organisations that bought cloud services carried out a trial before they entered into a contract. Furthermore, only 16% of organisations implemented the solution themselves, whilst 50% required specific assistance from the CSP and the balance used third-party specialist services. Clearly an implementation/migration services capability and try-before-you-buy capabilities are important points of differentiation in the market. As the number of cloud adopters increases, so too does the number of Cloud Service Providers (CSPs) and service models in the market, resulting in a diverse and complex supply chain. Many established suppliers have altered their business models to include Cloud Services within their portfolio; 66 per cent of end users have expressed an increasing expectation of self-service in light of the new service models enabled by Cloud Computing; and a new breed of Cloud brokerages and aggregators is forming to provide commercial guidance and a single point of contact to customers. Whether buying directly online or via a third party, it is essential that an organisation can establish confidence (i.e. trust) in the service provider(s), so that it can be confident in its expectations, in the nature of the service provided, in the responsibilities of the parties, and in the important issues to consider when entering, managing and ultimately exiting a contract. In validation of CIF's certified Code of Practice, 78 per cent of organisations now using, or planning to use, Cloud services see value in working with a Cloud Service Provider (CSP) that has signed up to an industry Code of Practice. So, in short, the UK cloud market is in a healthy state, both in terms of adoption and satisfaction. To get a copy of the full 2012-2013 UK Cloud Adoption Research Results, which will be published in a free White Paper, please subscribe to our updates at www.cloudindustryforum.org

### Cloud in the mainstream

By Jon Smith, e-Marketing Specialist for Insight Cloud

In the world of IT, all we hear about these days is "The Cloud"; somehow this has become the subject of water-cooler chatter amongst IT managers. When we look at the statistics it is not surprising why, with CRN predicting that SME spending on cloud computing alone will reach $100Bn by 2014 and Gartner claiming that the entire Cloud market will be worth $150Bn by 2013. The question is: why is cloud suddenly the hot topic on everyone's lips? Well, in a time of economic uncertainty, especially in Europe, cloud spending is growing significantly faster than any other type of IT spending.
The reason for this is the cost savings that implementing a cloud solution brings businesses: significant savings are made on hardware, on maintenance and on the time of your IT department. A significant amount of an IT department's time is taken up by hardware maintenance. One of the main drains on time is software installation, which is no longer necessary when sourcing programs from the cloud. Cloud computing is also considered a green technology, which is a major consideration for many businesses, especially those seeking or holding an ISO 14001 accreditation. The sudden interest in cloud is surprising considering cloud applications have been available for quite some time: Gmail is essentially a cloud email service, first launched in 2004, and Hotmail has been around since 1996. With time comes market saturation, and this is the situation we're facing in the UK, with hundreds of businesses offering cloud services, including Rackspace, IBM, Dell, Insight, Joyent, virtualDCS, FireHost and Sleek Networks.

Consumer Market

What is really interesting at the moment is how cloud services and apps are being used by consumers, not only for email but for storage, backup and even potentially disaster recovery, which historically have been used by businesses. One of the most obvious examples is Dropbox, which can be used for backup, storage and disaster recovery. Dropbox allows consumers to store files online, whether that be music, photos or important documents, and then access them from any location. At a low cost, users can securely save their data knowing that, whatever happens, it will be accessible. Another example is the cloud music service Spotify. When it first launched it was revolutionary, as it gave consumers access to music from any location, and now with the Spotify mobile app they have created a cloud experience that cannot be rivalled.

Why has this happened? Why have cloud services all of a sudden exploded into the mainstream? One reason has to be the rise of the smartphone: according to Kantar Worldpanel ComTech, smartphone adoption in the UK is now over 50%. Consumer wants and needs have changed: people expect to be able to access information and entertainment on the go, and this is now possible. Smartphones and tablets have given consumers more freedom when it comes to accessing information.

The future of cloud

Since cloud has now become a mainstream service, the next question is: what will the future hold for Cloud Computing? As the industry continues to grow and starts to mature, we're expecting the cost of cloud services to decrease significantly, especially as more businesses start offering a cloud solution and the market becomes more competitive. We also expect to see more cloud services become available, especially for consumers. At the moment the main proposition for consumers revolves around personal storage, but as the industry develops we envisage more sophisticated cloud services becoming available to suit changing consumer needs. Businesses that are in the unique position of offering both a business and a consumer cloud proposition really are able to take advantage of the changing marketplace, but the question is: will they be able to keep it up?
Cybercrime is now a multibillion-dollar industry. In 2011 Symantec reported some frightening statistics:

- Cost of global cybercrime in 2011 = $114 billion
- Value victims placed on financial loss and time = $274 billion

Now look at those amounts: it's estimated that cybercrime cost $388 billion…. Astounding to think of, but frightening when you place that next to the drugs trade, which in the same year cost $288 billion…. Just the other day it came to light that there had been three hacking attempts on the hotel giant Wyndham Worldwide Corp that allegedly involved the credit card information of hundreds of thousands of customers. This is just the most recent reported case out of the thousands that go unreported in our economy, and it's getting worse! Cybercrime is here to stay; it's going to get more prolific and more expensive as our society becomes more dependent on technology and digital services… I know what you are thinking: "Oh god, another information security doom merchant with prophecies of impending disaster", and to be honest you would be forgiven. In recent history there have been many security professionals using this method to sell their services, and it has become a central focus these days, especially with the sales people from many information security companies. I don't personally agree with this method; I believe that information security is an asset to the business, and that's how I sell it. Selling through fear, although effective, is not, in my honest opinion, the way of the future for Information Security. In the last five years, the business world's normal operations have been shifting dramatically away from a host of internal systems housed in a purpose-built datacentre (or a cupboard in the corner… no really… true story). In recent years many companies have been looking to reduce costs to the business by moving their technical management and infrastructure into an outsourced solution, the most recent iteration being this massive behemoth of a buzz industry… the cloud. EVERY company we have spoken to about outsourcing to the cloud lists the security of their data and solutions among their top three (usually No. 1) concerns. I challenge you to go out and find me a company looking to move over to the cloud that, in the initial meeting, doesn't list the security of the solution as a major requirement. Information security is the KEY requirement for any organisation looking to outsource any aspect of its operations. Here are just a few simple examples:

- Companies outsourcing their call centre operations take the security of their customers' confidential data very seriously, as even a suspected security breach can have a very significant impact, not only on their own reputation but on the call centre's too.
- Companies outsourcing certain operations to the cloud want to ensure the solution protects the integrity, availability and confidentiality of their data and services. A security event of any sort that puts these three key factors at risk is a major issue.
- Companies outsourcing their IT support functions to a cheaper location want to ensure that their proprietary information is protected and that the support staff from the outsourced company will use their administrative access to systems and services responsibly, and not use it to steal customers' information and re-sell it on.
A professional information security specialist can be (especially in today's market) expensive talent to employ. PCI DSS, for instance, has led to a massive need for talented information security people, and with demand comes a rise in price; we are expensive commodities. The question then becomes: is there a way to get the same level of security but at a lower cost to the business? Of course there is. Many SMEs (even the larger ones) do not need a full-time information security person and cannot afford the cost of a good one. So the only way to get a good, experienced information security professional is to outsource to an organisation that specialises in the information security field. Outsourcing your information security to the right company means that you can gain the support of some of the best information security people in the business for a fraction of the price you would normally pay for a permanent employee with the same level of knowledge and experience. I want all of you reading this to ask yourselves and your board of directors one question: “Do we take the security of our operations, customers and business partners seriously enough? If so, how seriously?” ### Degrees of separation work to highlight the success of a Cloud partnership By John Dunnet Introduction We are often asked, ‘Can you use your Journey 2 method with an offshore model?’ to which the answer is of course ‘Yes’! But I always like to qualify that statement. Whenever a new variable is introduced into an organisation, the ability to perform to the same standards is made harder, especially in the forming/storming stage of a relationship. The factors that increase the difficulty we describe as ‘degrees of separation’. When two or more organisations start working together, the relationship is made more complex by the number of degrees of separation involved. Degrees of Separation Degrees of separation can be categorised into the following groups:
Culture – if two organisations are culturally different: for example a government body and a start-up, or organisations whose constituents are from different cultures (e.g. the US and India).
Time Zone – if organisations are in significantly different time zones, reducing the number of contact hours in a standard day – or forcing one or both organisations to work ‘out of hours’.
Distance – physical separation. The longer the distance, the less likely people are to be able to travel to sort out issues quickly if required.
Language – having different first languages, or the age-old joke of two countries being separated by a common language.
Number of different entities involved – the number of different organisations involved in a task. If more than two parties are involved, then groups, power dynamics and political rivalries can distract from the objectives.
As the number of degrees of separation increases, the difficulty of delivery increases according to the ratio: Difficulty = Complexity of Task ^ (number of degrees of separation). So for example a UK-based organisation outsourcing software development to India would have culture, time zone, distance and language as degrees of separation, which scores 4, making the initial relationship harder to the power 4 than if we were doing development with an in-house team: Difficulty = Complexity^4. If we look at a near-shore solution in a European country in the same time zone for a UK-based company, we have language and distance as our degrees of separation, which scores 2, and therefore the relationship is made harder to the power 2: Difficulty = Complexity^2. The short sketch below works the two examples through. 
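As a quick illustration of that arithmetic, here is a minimal sketch of the difficulty calculation. The complexity score of 3 is a hypothetical figure on an arbitrary scale, chosen purely for illustration; only the exponent comes from the formula above.

```python
# Minimal sketch of the degrees-of-separation formula described above.
# The complexity score (3) is a hypothetical, arbitrary-scale figure for illustration.

def delivery_difficulty(task_complexity: float, degrees_of_separation: int) -> float:
    """Difficulty = Complexity of Task ^ (number of degrees of separation)."""
    return task_complexity ** degrees_of_separation

offshore = delivery_difficulty(3, 4)    # culture, time zone, distance, language
near_shore = delivery_difficulty(3, 2)  # language, distance only

print(f"Offshore difficulty:   {offshore}")    # 81
print(f"Near-shore difficulty: {near_shore}")  # 9
```

On that illustrative scale, the offshore arrangement comes out roughly nine times harder to get right than the near-shore one for the same task.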
In this instance, if the financial considerations were the same, the near-shore solution would be preferred. The degrees of separation analysis is a key output of the Journey 2 approach, and underpins decision-making on which services or business functions an organisation would consider outsourcing, and where. ### How cloud providers will help data center operators increase customer conversion By Robert Jenkins, CTO, CloudSigma Like any company operating in an industry closely related to technology development, data centre operators regularly have to re-invent themselves, the way they operate and how they service customers. There has been no greater disruptive technology for data centre operators in recent years than the advent of cloud computing. On the face of it, rapidly increasing computing capacity requirements would seem like a straight win for data center operators; however, the actual situation is far more complex. A revolution is happening in the customer base of data centre operators, and the next few years will represent an "adapt or die" change for incumbent providers of data centre space. Where did all the customers go? Significant change is already underway amongst the customer bases of leading data centre providers. The move to co-location by many companies started 10-15 years ago and has continued to gather pace. The maturing of virtualisation technologies in turn made companies' physical infrastructure deployments in data centers that much more flexible and useful. Now the latest technology wave is hitting the tenants of data centers: cloud computing. The big difference with the latest technology transformation is that, in the case of public cloud, companies stop purchasing power and space from a data center operator - and instead buy computing resources from a public cloud provider. Essentially the cloud provider becomes an intermediary between the end customer and the data centre provider, if they use an external data centre provider at all. As the adoption of cloud computing accelerates, on a business-as-usual basis data centre providers are increasingly going to see customers failing to increase existing space and - over time - run down legacy deployments. This is a major threat to the long-term revenues of the data center operator. We are definitely a good few years away from the point where the majority of computing resources are purchased through cloud delivery models rather than directly through in-house dedicated hardware; however, the trajectory is already clear. Historically, such shifts in technology have tended to accelerate faster than people anticipate. Private cloud: A False Dawn Not all cloud computing means customers moving away from a direct relationship with a data centre provider. Private cloud deployments do keep companies owning their own hardware and are therefore less disruptive to the customer base than public clouds. It is, however, clear that private cloud is a transitional technology. As public cloud matures, the boundaries that were previously understood for public clouds are being pushed back, leaving the uncontested territory for private cloud deployments shrinking year on year. Fundamentally, private clouds fail to address some of the most powerful benefits of using public clouds: elasticity and agility. A private cloud deployment is not elastic; companies therefore still need to provision for peak load and suffer from under-utilisation of their physical infrastructure. The rough sketch below illustrates the point. 
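To put some rough numbers on that under-utilisation point, here is a back-of-the-envelope sketch. The demand profile below is entirely hypothetical and is only meant to show how sizing a fixed estate for peak load drags down average utilisation.

```python
# Hypothetical hourly demand (servers needed) across a working day; illustration only.
hourly_demand = [40, 60, 90, 150, 240, 320, 310, 180]

peak_capacity = max(hourly_demand)                    # fixed estate sized for peak load
average_demand = sum(hourly_demand) / len(hourly_demand)
utilisation = average_demand / peak_capacity

print(f"Capacity sized for peak: {peak_capacity} servers")
print(f"Average demand:          {average_demand:.0f} servers")
print(f"Average utilisation:     {utilisation:.0%}")  # roughly 54% in this made-up example
```

An elastic public cloud deployment, by contrast, only needs to pay for something close to the average line rather than the peak.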
Yes, private clouds can enable companies to utilise their infrastructure more efficiently through virtualisation, but the overall fixed-capacity constraint still exists. Private clouds also offer limited agility for companies. Provisioning, procurement and maintenance are all still in-house responsibilities and, for the most part, non-core to companies' key strengths. Private cloud will therefore not be a steady long-term equilibrium for company infrastructure but a transitional step to wider adoption of public cloud. Private cloud has introduced companies to the powers of virtualisation, but it fails to focus companies around their core business strengths and has many limitations. Hybrid Cloud will be pivotal The reality is that all but the startups of recent years have legacy infrastructure and systems. For that reason public cloud adoption will continue to gather pace, but in-house infrastructure is here to stay for a good few years yet. Hybrid cloud deployments therefore offer customers a way to gain some of the benefits of public cloud whilst utilising existing investments in private cloud and dedicated infrastructure. For this reason, hybrid cloud will become the dominant deployment model for most companies over the next 3-5 years. Once deployed, it is likely that future growth will come from the public cloud side of the equation as the efficiencies and benefits shine through. Replacement cycles will be the crunch points where existing co-location space is downgraded and computing resources in the public cloud are upgraded. The Customer Base of the Future: Hybrid cloud deployments will become increasingly the norm in the coming years, with public cloud provisioning relentlessly increasing its percentage of overall computing resources under management over time. In the not too distant future, public cloud will therefore represent the bulk of computing infrastructure. For the data center operator of today, this means that the majority of current customers will no longer be customers in five years' time. The paradox is therefore that, whilst data center providers see robust growth today, as adoption of public cloud increases, that growth will slow and reverse quite rapidly. Successful data center operators will therefore be the ones that can safely navigate this fundamental realignment in their customer bases. Data center operators can, however, embrace this change and turn a significant threat into a major opportunity... Keep it Under One Roof As companies become increasingly aware of hybrid cloud and eventually more pure public cloud deployments, they are becoming increasingly discriminating about this aspect of their hosting when choosing data centre locations or renewing existing deployments. The ability of data centers to offer effective coordination of private in-house and public cloud infrastructure will become a critical success factor. Currently most data centres do not offer public cloud hosting as part of the sales process, considering it a pure threat to the potential sale. By partnering with cloud providers who do not wish to run their own data centres themselves, data centre operators can provide a very compelling solution to potential customers. Customers are currently often comparing in-house solutions with public cloud solutions. A data center operator that can quote for both deployment types is in a very strong position indeed, especially if both deployments are within its own data centers. 
Without such an option, if the customer decides on a public cloud deployment, the prospect is lost to a public cloud vendor who likely doesn't utilise that data centre operators services. A data centre operator that can offer a public cloud within its four walls and actually offer private, hybrid and public cloud deployments as part of its sales process will significantly increase customer conversion rates and therefore revenue growth. The opportunity exists to create tight integration with the partner public cloud provider allowing secure connectivity between public and private deployments at low cost. As customers transition away from their own dedicated infrastructure into hybrid and then more public cloud deployments, the data centre operator will continue to see revenue growth by having all those deployment options hosted within its environment. The explosion of cloud computing and cloud delivered services will drive overall computing capacity requirements over the next decade. Data centre operators that work strategically with the right public cloud vendors will reap the benefits of that growth. ### Making Your Public Cloud, Private By Dave Meizlik, Dome9 If you’re like most today, you’re looking to the cloud with cautious optimism to help make your enterprise more efficient and agile. I say “cautious” concerns for security, cost, and complexity in the cloud run rampant. Will my infrastructure be more or less secure? How much will the cloud really save me? What do I have to do (or give up), and what will it get me? These are just a few of the questions you’re likely mulling over. Cost is highly dependent on your infrastructure, so I’ll focus mostly on how to simultaneously tackle the issue of security and complexity, by making your public cloud, private. First off, what is a private cloud and why would you want one? A private cloud is a piece of infrastructure operated solely for a single organization. Private clouds are growing in popularity as a more secure means to get more control over an infrastructure-as-a-service (IaaS). It lets you segregate your cloud from other organizations, building a cloud infrastructure that is – in essence – an extension of your network. The truth, however, is that creating a private cloud and applying your legacy approach to networking and security creates complexity and drives up cost – two things you’re moving to the cloud to avoid, and doesn’t necessarily increase your security. You pay a premium for a private cloud so you can isolate your infrastructure from others, create secure connectivity (using VPNs), and maintain control over your security. The truth is, there’s a better way to achieve the same result, but at a much lower cost and with far less resource. Instead of creating an expanded perimeter around your cloud by making it private, simply isolate each individual server in a public cloud via a firewall management service, locking down each individual server with dynamic policy controls for remote access, on demand. This way you’re, in effect, making your public cloud servers, private. Note the emphasis on “servers” in my last sentence. That’s because each server is locked down and isolated, rather than the entire cloud. Each server, in isolation in a public cloud, is just like one big private cloud. By example, imagine you have a cluster of application servers and databases in a public cloud. 
Using a firewall management service, you can close administrative service ports like SSH and RDP, and configure server-to-server communications for MySQL and other services. Then, using the firewall management tool, you enable secure, time-based remote access only when and for whom you authorize, with the click of a button. This ensures protected access to your servers without exposing them to risk (e.g., brute-force attacks and vulnerabilities from open service ports). What's more, it makes your cloud servers virtually invisible to hackers and eliminates the need for clunky, pain-in-the-tail VPN clients. In effect, you've made your public cloud, private! This approach saves you significant time and cost upfront, since you can safely leverage public cloud infrastructure. And it makes it easier on you and your team, since remote access is available anytime, from anywhere, without having to connect back through a VPN. Moreover, this approach actually provides increased security, since you're controlling access to each individual cloud server rather than the entire network (i.e., through a VPN). Now, because you're managing potentially thousands of individual server firewalls, you need a firewall management service to make this efficient. With a firewall management service you can automate policy administration and secure access, on demand. You can, for example, apply a group-based policy for all your web servers. That's one policy for multiple machines. Then, with a click of a button, your web developers can self-grant secure access to any machine on the fly, with time-based controls to ensure that while they're accessing the servers, the cloud server's firewall port(s) are open only for the machines from which they are connecting. Bye-bye VPN clients! You can also set up multiple group-based policies with a firewall management service. For example, one for your SQL databases, another for your web servers, a third for your application servers, and so on. And you can create role-based access controls with user-administered (yet monitored) secure access. This lets your developers and IT staff do their jobs, securely, for hundreds if not thousands of servers, while making management easy and scalable. Now, there aren't too many firewall management services out there. Like the cloud itself, this is a new space. However, as you may have guessed, I work for one called Dome9. A description of Dome9 is below, and you can learn more at www.dome9.com, but first let me take a minute to list a few important things you should consider in a firewall management service:
#1) Agent-based vs. API-based deployments – Some firewall management solutions provide only an agent-based solution. But if you're an AWS or OpenStack user, you will benefit greatly from managing the existing firewall capabilities of these environments by connecting them to your firewall manager using your cloud provider API keys instead of installing agents on each server. API-based deployments into your cloud give you immediate-on firewall management, without the need to deploy an agent on each server. That's rapid scale!
#2) Automated access controls – You don't want to have to leave ports open all the time, even for trusted IPs. Instead, look for a service that lets you dynamically open and close service ports with time-based controls. This way your ports are only opened for specific users, services and time periods, and your cloud servers are virtually invisible to hackers.
#3) Multi-cloud & server policy groups – You likely have (or will have) multiple servers across multiple infrastructures. Regardless of the distribution, you'll want to abstract security as an application layer across them all, and employ group-based policy management to ensure you've got consolidation in your security management. That's one policy set across multiple servers, even in multiple infrastructures. A minimal sketch of the time-based access idea from #2 follows below. 
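To make the time-based access idea in #2 concrete, here is a minimal, illustrative sketch. It does not show Dome9's own service; it simply drives a cloud provider's native firewall (an AWS EC2 security group, via boto3) to open an admin port for a short window and then close it again. The security group ID and the admin IP address are hypothetical placeholders.

```python
# Illustrative sketch only: open an admin port for a short access window, then close it.
# Assumes AWS credentials are already configured and boto3 is installed; the security
# group ID and the admin CIDR below are hypothetical placeholders, not real values.
import time

import boto3

ec2 = boto3.client("ec2")

SECURITY_GROUP_ID = "sg-0123456789abcdef0"   # hypothetical security group
ADMIN_CIDR = "203.0.113.10/32"               # the machine the developer connects from
SSH_RULE = [{
    "IpProtocol": "tcp",
    "FromPort": 22,
    "ToPort": 22,
    "IpRanges": [{"CidrIp": ADMIN_CIDR, "Description": "temporary admin access"}],
}]

# Open SSH only for this address, only for the duration of the window.
ec2.authorize_security_group_ingress(GroupId=SECURITY_GROUP_ID, IpPermissions=SSH_RULE)
try:
    time.sleep(30 * 60)  # access window: 30 minutes (a real service would schedule this)
finally:
    # Close the port again so the server goes back to being unreachable from outside.
    ec2.revoke_security_group_ingress(GroupId=SECURITY_GROUP_ID, IpPermissions=SSH_RULE)
```

A real firewall management service would layer scheduling, audit logging and per-user self-service on top of this kind of call, but the underlying mechanism is the same: no port stays open longer than it is needed.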
Hopefully this has given you some ideas for how you can get more value from public cloud computing without having to jump through all the hoops of setting up a private cloud. As you explore the topic more, I invite you to visit www.dome9.com and sign up today for free to see how we can help you Secure Your Cloud™. About Dome9 Dome9 is one of today's fastest-growing cloud security services. With over 1,200 customers worldwide, Dome9 provides cloud-based firewall management to centralize and automate policy controls for any server running in any infrastructure. ### Planning a Journey to Cloudy Horizons By John Dunnet, Managing Director - Evoco.co.uk Making decisions as an IT leader was never as simple as the audience you served believed it to be. The interdependence and underlying complexity were never obvious to the outsider, especially if you were doing well in delivering service. And for a while, those who had IT sorted had it easy. OS – check; Personal productivity – check; Routing – check; hardware vendor – check. In fact my biggest bugbear was the system bought by a business leader, installed on a server under a desk, which became mission critical, broke and then became my problem. A couple of days without the app, and people soon learned self-service IT was not the way forward. Sorted. Now the same thing is happening with the Cloud. The business head can commission the purchase of an enterprise business solution, seat by seat, with little more than a corporate credit card, the consequences of which will always end up at the door of the CIO. When we referred to 'the cloud' in the late 80s, it was the place where your network went to communicate; now it is a cover to sell anything and everything to everyone. Cloud could be simple. The concept at its purest is ‘on demand’ and ‘internet based’. This could be storage, or it could be an application, or anything else for that matter. A micro business of 3-4 people could decide that nearly all of its IT would be ‘public’ cloud based. An internet-based file-sharing service would be appropriate for data storage; a hosted mail service (with hosted email filtering included) can deliver the same experience as on-network in the eyes of the user; a software-as-a-service provider can provide an accounts package with nothing to install; another can provide a powerful yet on-demand CRM solution; a host for their website; and an internet-based voicemail service for their telephony. To do this they own no hardware other than the laptops. Their business runs on a number of very functional and affordable applications; they own no servers; their business applications are in the public cloud. All they need is a desktop or portable device and an internet connection (which these days is available cheaply in most locations on the move). Even in the office, an increasing number of small businesses are going cable-free, with wireless technology being cheap, easy and giving greater range and reliability than previously available. 
With benefits including : no installation charges; software updates done for you centrally; always on/always accessible solutions; data back ups done for you in multiple location secure environments; no capital outlay and the easy ability to add users during the term as growth determines. Of course there are some considerations as with any solution such as: reliance on a cloud companies financial strength; need for an internet connection to access applications; lack of possible integration between systems (although signs are of clouds starting to talk to clouds); no multi-cloud single sign on (again though signs are of this abating) ; the small business also needs some guidance on questions to ask of a cloud provider around areas such as data sovereignty, SLA’s and exit strategies. As you can see in this small and simple example, there are risks which need to be weighed against the benefits. As you scale to an organisation of thousands, this can quickly become a quagmire. To overcome this you need a method to develop a cloud strategy. It should assess the needs of your business and demonstrate the decision making behind which services should be cloud based and which mode of cloud (public, private, hybrid). It should be flexible enough to adapt to changing business circumstances. Cloud strategies need to consider, your organisations culture and its risk appetite. The current and future shape IT estate and the journey to it. The current and future IT change portfolio. Once this is understood, determine what to do first and why. The strategy should be developed with a method which describes the business benefits and risks of each service; define logical groupings, and studies of similar successful implementations. Now is the time for organisations to define cloud plans and strategies that will be their foundation for change for the coming few years. Cloud is going to effect and impact businesses, people, decisions, budgets and strategies more in the next 36 months than in the last 10 years and business and IT leaders need to have answers in advance of the questions that will come their way. If they don't a Kodak moment awaits. John is Managing Director of Evoco ### Why PCI DSS and Security in General? By James Rees, Managing Director Of Razor Thorn Security The other day I was chatting to a good friend of mine, Lisa. She is an information security manager for a cloud company over in the states, we were having a chat about security in the cloud market and as she was on the inside she gave me an insight into what our cousins over in the states thought about information security. Lisa, a newcomer to the cloud unlike many of us gave me an excellent fresh eyes perspective on what cloud companies over there are doing the response she gave me was simply: “Security is a marketable feature” She could not be more right. Information Security has been fast becoming an important factor in the buyers list of “must have items” the general public are tired with suffering when their credit card details are stolen, or their accounts hacked they have been suffering under this for too long. For example a lady I know recently used her card in a call centre, not long afterwards she discovered somebody had emptied the account attached to the card she used as well as using all of the available overdraft facilities… The bank then froze the account for 28 days to investigate, leaving her stuck. 
This type of issue has been occurring time and time again for years and now we are starting to see a demand from the public for excellent levels of security from the people we purchase items from. This is starting to ripple up through the service providers and various other companies in the service industry space and their suppliers, compliance requirements such as PCI DSS has sped this up exponentially…. People and organisations now spend their money carefully on organisations that take Information Security seriously, it’s one of the largest reasons why they buy their services from the people that they do. Cash is dying. Cold hard cash is only good these days for small items, a chocolate bar, a packet of cigarettes, milk for the office, etc. To add to this, the high street is in a rather sharp decline and has been for some time now. Items we desire can be purchased online now for cheaper, so this is where people are going with their credit cards and their debit cards. There are hundreds of thousands of companies out there that take card details over the internet or through call centres, this increases every day, each one of those companies taking card payments contractually when they get those merchant accounts and online merchant accounts are signed up to PCI DSS, this means they HAVE to do it. It also means there is a good chance that those companies these merchant organisations deal with will also have to comply under the PCI DSS rules governing service providers and so on and so forth, the chain of PCI DSS can pass through several layers of organisations. Information Security Is marketable, PCI DSS is marketable. There is a massive market out there for cloud companies to take advantage of if they decide to take security seriously. But one warning, do not pay lip service to information security…. If you do NOT take it seriously at some point you will suffer a security breach, every company does at some point, and when it happens you will suffer and your business will be at risk. If you are going to have a secure solution then make sure you get the right advice, get a professional on staff or independently to advise you and help you. Information security is commonly viewed as an IT thing, firewalls, antivirus and such software. It is not, it is much wider than that. Good information security can not only provide you with technical security but also the ability recover faster and mitigate the damage. Good information security is about the business and protecting your assets. Good information security is about protecting your reputation. Good information security is about protecting your clients and customers. Can you afford not to have information security in the current market? Finally. Do you want to be the next company explaining to their clients why the ‘secure’ service that you offer suffered a security breach? I know I wouldn’t want to be... Take security seriously, get the right help to advise you, it has far too many bad consequences if you get it wrong... A simple Google search will show you how bad it can get.   ### Tips on Cloud Selling From Sandler Training WARNING: This article will make you deeply uncomfortable and make you ask why do you sell the way you sell Telling Isn’t Selling I was wondering if you sell any of the services or products listed below? And do you ever tell your prospects that this is some of what you sell? 
Cloud Computing
• Infrastructure as a Service (IaaS): Public Cloud, Private Cloud, Hybrid Cloud, Cloud Hosting, Cloud Backup, Cloud Storage
• Communications as a Service (CaaS): Hosted PBX, SIP Trunk, VoIP, IP Phones, Call Recording, Call Management, Unified Communications, Video Conferencing
• Software as a Service (SaaS): Office Applications, Customer Relationship Management (CRM), Hosted Desktop, Hosted Email, Project Management, Help Desk, Business Process Management (BPM), Single Sign-On, Virtual Office
• Platform as a Service (PaaS): Virtual Server, Virtual Database, Virtual Private Data Centre (VPDC), Content Delivery Network (CDN)
• Data Centre & CoLocation: Racks, Suites, Disaster Recovery
• Connectivity: IP Transit, Leased Lines, Ethernet, MPLS, Wireless
• Hardware: Servers, Network, Storage Area Networks (SANs), Virtualisation
• Cloud Security: Firewalls, Penetration Testing, Single Sign-On, Intrusion Prevention/Detection, Encryption, Denial of Service (DDoS), Vulnerability Scanning, Web Security, Email Security
• Consultancy: Infrastructure Management, Remote Monitoring, Load Testing
Why? So what? Who cares? And do you ever struggle to differentiate in a crowded, competitive marketplace where, on paper, there's a fag-paper difference to your prospect between you and the next vendor? Despite so many vendors and so much choice, I don't suppose you always get full fees and charge a premium, do you? And you never end up giving away value-added services to buy the business? Or over-servicing to try and keep the business? A quick test of your belief system regarding selling: on the last 5 sales you have worked on, how many of them involved a conversation about discounts? Q1. If you won the deal, why did you win it? Q2. If you lost the deal, why did you lose it? What should you have done differently? A1. If your answer to Q1 was "we won it because we were the right technical fit and we offered a good discount", or A2. "we lost it on price and we should have given them a bigger discount!", then you are falling into a trap called a heuristic. Heuristics are shortcuts to actual thought and reason. Let me illustrate my point. Heuristics Look around your room for anything that's blue. Blue. Blue furniture, blue paint, blue pens, blue ink, blue backgrounds, blue objects, blue curtains, blue clothes, blue cups, blue business cards, blue art, blue sky. Search for everything you can that is blue. OK, now close your eyes and recall all the yellow items in the room. What this exercise should have taught you is that when you are looking for one thing, you can be blinded to the stuff that you're not looking for. Who hasn't had the experience of looking for a red key-ring, only to discover it was blue and under your nose for the last 5 minutes? How often are you falling into a similar blind spot when you sell? Buyers don't buy for the reasons you think they will; they buy for their reasons. We had a client in Sandler who sold Oracle. He picked up an RFP from a German company that had SAP as an incumbent. After meeting the IT people he realised there was nothing there, but he felt he had to put a proposal together, so he did, but at a massively inflated price and with no real technical effort going into the offer. He sent it off and, to his surprise… he won it. Why? He had a great relationship, so when the salesman caught up with the CTO, the CTO candidly admitted that the real reason was that he was looking for his next move and had no Oracle experience on his CV. Oracle's proposal helped him deliver what mattered to him personally and emotionally; it helped him to advance his career. Now, how easy is it to sell for your reasons, not the prospect's reasons? How easy is it to fall into the trap where, because we believe something or our attention is focused on a particular issue, we blind ourselves to what may really be important? The Cloud solves problems. 
Technical problems, business problems, emotional problems, financial problems, efficiency problems, competitive problems, branding problems; all problems that exist and are real in the lives of your prospects. But too often, aren’t we blinkered by our product knowledge training and traditions? (www.despair.com/tradition.htm) How Much Does Using Your Extensive Product Knowledge Cost You in Lost Sales? “Product knowledge used in the sale is lethal” – David Sandler Selling is not negotiating. Negotiating is what happens AFTER YOU HAVE FAILED TO SELL! Negotiating is a more painful version of order-taking which is where you show-up and throw up, quote and hope, and sell and run. And then you beg for the business and drop your trous prices in the hope that the buyer will throw you some crumbs compared to the real value you bring to them. Selling is not about spilling your guts and answering every question asked by the huge technical team whose express reason for having you in the room is to plunder you and your techies of their knowhow without paying you for it. That is called FREE CONSULTING. (A very bad thing since the second highest hidden cost in any business I have come across in the last 9 years has been the hidden costs of selling) Selling the Cloud requires you to be a diagnostician first. Find out WHY they need and want what you offer. Does it help them advance towards their personal and strategic objectives or does it tick a technical box? Get examples. Ask about the impact. Why are they considering changing anything? What keeps them up at night; causes them to risk missing their targets or lose their bonus; what puts their mission or vision in jeopardy; if they lag for much longer who stands to gain the upper hand over them; why does any of this matter to them personally; what impact does not doing something now have on them: what impact does t have on others; why do they care; so what; who cares; how much is this costing them in time, money and resources; who is looking for them to fail? The potential for diagnosing their reasons for buying is almost limitless so long as you get out of your own way in the sale. The only person who really cares about the technical gobbledegook your presentation is filled with and the features or benefits you think are important is …… YOU! Your prospect wants his pain fixed, ideally forever, if not, for now. Remember, “You differentiate in HOW you sell not in WHAT you sell.”   ### Partner opportunities for traditional I.T. companies By Richard May, Managing Director of Virtual DCS Before virtualisation technology entered the market, two of the most popular traditional I.T. opportunities were hardware based infrastructure upgrades, and tape based disaster recovery solutions. These solutions have now been replaced by ‘as a Service' technologies, that hold additional benefits to the customer such as reduced downtime, a lower cost of ownership and a reduction in capital expenditure. It is no surprise that as the demand for the Cloud soars; many I.T. resellers are rethinking their business strategy and the technology that they are offering to potential businesses. Now, many resellers are choosing to use their industry knowledge of both traditional and non-traditional methods in order to offer a range of solutions to their customers. Cloud suppliers are not the only businesses that are in demand for partnerships, as resellers themselves play an essential part in the success of the providers. 
In light of this, more creative tactics are being used in an effort to provide the most attractive business solutions for resellers, and therefore the most attractive business solutions for their customers. This blog will explore virtualDCS's experiences when creating and retaining new Cloud reseller partnerships, highlighting important aspects that we believe should be within any partnership scheme. Ideally, when selecting a new provider the majority of resellers appreciate a combination of a good support response, and a variety of solutions to choose from. One of the most important points that a Cloud provider could make is offering a strong service selection. For example, virtualDCS enables our partners to resell all of our headlining services, including, CloudCover™, our business continuity solution and our Software as a Service enablement package. Should our resellers ever need it, we provide support and guidance on all of their proposals and solutions, in addition to simple reseller tools used to calculate and change pricing. We also offer our partners a simple referral scheme and a lead registration process to protect their leads. Partners that choose to log leads also receive an additional margin. In addition to our core solutions, we offer a Virtual Service Provider (VSP) program that is completely tailored towards the needs of the reseller. This solution offers a simple business model, where the reseller has dedicated space on our platform. They can then use our Cloud interface to resell this space, adding unique software applications and resources for their customers. We don't believe in ‘putting a square peg in a round hole' which is why we work with our partners to create tailored solutions. Our partners agree that this is the most beneficial to both themselves and their customers. As an extension of our support services, we offer free marketing material to help our resellers sell their solutions. This is another proven and successful tool used by many Cloud providers throughout the industry. In addition to images, booklets and templates, virtualDCS also creates custom branded material for our partners that choose to ‘white label' our services, which few providers offer. Our partners believe that this helps them throughout the entire sales process. We have also witnessed the need for a high level of administrative control. Any reseller, even if they provide a traditional, Cloud or hybrid solution, is a business. Therefore, throughout all of our reseller offerings we deliver administrative tools to make the sales process simple. For example, our VSP Cloud interface contains an advanced invoicing system and multi-currency support. We believe that these points are vital to a successful reseller opportunity, especially for partners that are moving from a traditional I.T. solution. For more information on our partner opportunities and our services, contact a Cloud expert on 08453 888 327 or email enquiries@virtualdcs.co.uk About virtualDCS The founders of virtualDCS have pioneered the development of the Cloud Computing industry for over a decade. As one of the first companies dedicated to Cloud services in the world, customers are confident that they will only receive the finest solutions. Their approach is to work in partnership with clients to ensure that their infrastructure is ready to exceed the service levels demanded by their business. 
### Auto Deployment, Easy Management and Any Device Access is the Key to the Virtual Desktop Experience By James Broughton, Sales Manager – UK and Europe, 10ZiG Technology Ltd VDI surgeons have been chopping up your PCs for some time; they have been cutting open your desktops and transplanting your OS, apps and storage into the heavens. So now you are left with a tiny device, a lean mean green fighting machine. However, does this detract from the previous experience? I'm writing this blog on my iPad whilst on a plane on my way to the Dutch VMUG. The iPad has been a nifty tool for accessing my VDI (although not right now, as plane mode is in operation). But it grants me quick, easy access to the key data and apps that the surgeons removed a few years ago. My problem is that data input and creation are just a pain: my fingers start to ache after repeated taps on unforgiving glass, and the fact that half of my screen is blocked by a floating keyboard means my screen size is reduced to about 6 inches, which in some circles, I'm told, is seen as plenty/adequate. I must be getting old: aching fingers, small text and grumbling about my screen size. So what other ways are there to connect to your VDI, remote apps or key data? Well, you can connect with pretty much anything: your old PC, your laptop, and Thin and Zero Clients do an exemplary job. All of these devices give you the ability to keep your nice soft keyboard, and with the new developments from the likes of the company I work for, 10ZiG Technology, we are able to display and run all of your apps, granting the performance that you need, including YouTube and fancy CAD with up to 60 fps and multiple displays, all on a small Thin or Zero Client. Tablets and phones are great for access, but not as an all-day device; you'll find users still need great performance, a forgiving keyboard and a 19″ screen or four. The problem with using your PC or laptop to connect is that it comes with a complete OS of its own, and this can be confusing for the end user. A box in a box. Two start menu bars, two wallpapers and, worse for the support team, two desktops to manage… Putting virtual desktops in was supposed to be easier! Why does your access device need to give you any features other than access to your virtual desktop/apps etc.? Well, if it's a laptop or iPad, then maybe it does need to give you more. Why? Because you are a mobile user, someone who needs their apps with and without connectivity. But if you use a desk-based device, this needs to be streamlined for the end user, with the performance they need and without multiple user interfaces to contend with. From a management point of view, let's be greedy: the desktop solution needs to be auto-deploying and easy to manage, with update and scheduling features like "turn me off/on", and if the worst happens you need to be able to take control remotely. The important thing here for greedy desktop technicians is that you ideally want to manage the virtual desktops, not an endpoint as well. All you want is for the device to be taken out of the box, plugged in and done. 
If you're using a hosted cloud solution, then you certainly don't want to manage desktop PCs and laptops; that's why you moved to the cloud in the first place. In summary, cloud VDI and any-device access is a great way to serve your end users the desktops they demand, but realistically an iPad isn't a two-way pane of glass. A true window to your virtual desktop needs a soft keyboard and a big screen, and must have the ability to dish up a bit of YouTube at lunchtime without making the processor in your servers go Gangnam Style. Don't forget to be greedy: demand auto deployment and a single desktop interface. ### Accuracy and compliancy, installed as standard By Robert Tickell, IBM Mid Market Sales Leader for the UK and Ireland Ask any IT worker about updating masses of computing platforms and watch them frown. The effort involved in updating, removing or adding applications, particularly at enterprise level, is huge and time-consuming. Then there's the downtime – which nobody likes. Users are frustrated and it costs businesses money. For a midsize company, the cost of downtime is estimated to be around £60,000 an hour, according to businesses surveyed in the Aberdeen Group ‘How Private Clouds can Transform IT in Midsize Companies‘ report. Yet with a move to Private Cloud computing, you get the chance to have all your platforms achieving compliance at a stroke. Policies and defaults can be standardised across applications. With the enforcement of compliance via the use of server configuration templates, you can reduce errors and have stronger adherence to company policies. The adoption of virtualisation has also helped organisations slash away at downtime when it comes to deploying applications. For midsize companies that have yet to embrace virtualised computing, it takes 14.7 days on average to deploy a new application, according to companies surveyed in the Aberdeen Group report. Yet this was reduced to a mere 4.2 days after virtualisation. From a business continuity perspective, companies that have embraced virtualisation can gain even more efficiencies by jumping to the Private Cloud. With this technology, near-zero downtime can be offered in the event of failures, thanks to automated capabilities that deliver rapid recovery. Tiered security is another essential provided by the Private Cloud, allowing IT workers to set up secure work groups and isolate parts of the network. By speaking to IBM, your business can find the right management app to fit its activities. We look for the ‘single pane of glass’ solution, a central console to manage your entire infrastructure, including all integration with storage and your network. The net effect is reducing your IT overhead and enabling fast responses to issues. If you're already using multiple hypervisors, you'll appreciate how IBM's management solution can take them on, even if they're not all from the same vendor. This ensures you can reduce deployment times from days to hours. With the automated provisioning of standardised virtual images, you can seriously cut down on deployment errors. The Private Cloud also makes a business more flexible and innovative. With the simple management, swift deployment time and adherence to standards, IT staff can be assigned to other projects. Now you can dedicate time to having your business evolve. IBM has the know-how to quickly take a virtualised environment to Private Cloud. 
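As an aside, the server configuration templates mentioned above boil down to a simple idea: one baseline policy, checked automatically against every machine. The sketch below is a minimal, hypothetical illustration of that idea; the baseline values and server records are invented for the example and are not drawn from the IBM tooling described here.

```python
# Minimal, hypothetical sketch of a configuration-template compliance check.
# The baseline values and server records are invented purely for illustration.

BASELINE = {
    "patch_level": "2012-06",
    "antivirus_enabled": True,
    "allowed_ports": {80, 443},
}

servers = [
    {"name": "web-01", "patch_level": "2012-06", "antivirus_enabled": True, "open_ports": {80, 443}},
    {"name": "web-02", "patch_level": "2012-03", "antivirus_enabled": True, "open_ports": {80, 443, 3389}},
]

def compliance_issues(server, baseline):
    """Return a list of deviations from the baseline template."""
    issues = []
    if server["patch_level"] < baseline["patch_level"]:
        issues.append(f"patch level {server['patch_level']} is behind baseline {baseline['patch_level']}")
    if server["antivirus_enabled"] != baseline["antivirus_enabled"]:
        issues.append("antivirus setting deviates from baseline")
    extra_ports = server["open_ports"] - baseline["allowed_ports"]
    if extra_ports:
        issues.append(f"unexpected open ports: {sorted(extra_ports)}")
    return issues

for s in servers:
    problems = compliance_issues(s, BASELINE)
    print(f"{s['name']}: {'compliant' if not problems else '; '.join(problems)}")
```

Run centrally against every server in the estate, a check like this is what turns "adherence to company policies" from a manual audit into a routine, repeatable report.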
Our expertise can ensure the deployment is incredibly swift without any of your existing virtualisation needing to be ‘undone’. As you can see, IBM’s approach to the Private Cloud is all about taking out the risk and boosting efficiency for your business, as it already does for companies across the globe. We’ve produced an infographic showing you the key business benefits of Private Cloud. This shows you, at a glance, the financial and efficiency rewards. We’ve sourced our facts from the Aberdeen Group ‘How Private Clouds can Transform IT in Midsize Companies’ Report. To download the Aberdeen Group report (quick registration required) and our infographic (no registration required), just visit http://www.ibmoffers.co.uk/privatecloud ### THE CRM Lottery 'Will you get the outcome you expect or will you be one of the losers?' By Ian Moyse Let's discuss a couple of the heavyweight "C" words in the IT Sector: Cloud and CRM. What do they have in common? Well they have both certainly provoked a lot of press coverage and airtime, both have driven large brand name vendors to jump in with big investment, acquisitions and marketing hype, they have helped drive each other's growth and both certainly have had customers debating what and when they should elect to use such a solution. CRM (Customer Relationship Management) has been a hyped acronym for many years and can mean many things to many people. Many debates abound in this area from if it is the correct terminology to describe what its used for, how much does it really cost, what ROI can really be achieved, how to get users to adopt its use, routes to market to sell it and what The fundamentals are though that a way to manage customer information, share it securely, track customer interactions and record activities across the business is required by most business sectors and sizes. Sure there are nuances by vertical and specific requirements that at times can be better served by specialised applications, but for the masses the requirements are pretty similar and easy to achieve, or so you would think! For 2012 CRM remains an important technology in its own right with recent Gartner Analysis citing that CRM is ascending rapidly in the priorities of CIOs in 2012, moving from 18th place to eighth place. It was also cited by Gartner that that CEOs see CRM as their number 1 technology-enabled investment in 2012. Another shift that has been seen is for this market need to be fulfilled increasingly by in the cloud solutions, where according to Gartner Software-as-a-Service (SaaS) delivery of CRM applications represented 34% of total worldwide CRM application spending in 2011 and more than 50% of all Sales Force Automation (SFA) spending was on the SaaS platform. So Cloud CRM is at the tipping point of outweighing on network solutions in the not too distant future and becoming the defacto form factor of delivery. Many customers of historic software products are already migrating their data across, seeing the benefits of putting the heavy lifting of the architecture, backups, resilience and security into someone else's hands is now a practicality. This doubly resounds as true in the mass market SMB sector where both cloud growth and CRM adoption can bring strong business benefits and empowerment and the combination of the two can compound this. 
Pierre Audoin Consultants (PAC) recently reported that SMBs spent almost £1.6bn on business application software and SaaS in 2011, and that this is set to grow to more than £2.1bn by 2015 – faster than the enterprise segment! The analyst also revealed that key areas of investment were cited as including CRM, analytics, mobility and industry-specific software. So cloud is becoming the de facto choice for CRM implementations, the world is good and there are no issues to be discussed here. Of course, what the marketing will have you perceive is never quite the reality, and there are challenges in the CRM world as in any other. CRM projects do not all go to plan and there are many hidden challenges that can prevent you from having that win. Gartner has reported that as many as 85% of companies that buy CRM software to automate sales efforts don't pick the right tools, because they fail to define business objectives or develop processes for meeting objectives, and that as many as 42% of CRM licences bought go on to be unused. If you look at a variety of customer reports across the years reporting on data from customers of failed CRM projects, there is an average of 46.3% of failures reported in surveys. The data varies greatly, likely due to different analysis methodologies. Many failed CRM projects actually deliver mixed results and achieve partial objectives, making the definition of “failure” subjective. However, the aggregate weight of this data clearly suggests strong dissatisfaction with CRM projects:
• 2001 Gartner Group: 50%
• 2002 Butler Group: 70%
• 2002 Selling Power, CSO Forum: 69.3%
• 2005 AMR Research: 18%
• 2006 AMR Research: 31%
• 2007 AMR Research: 29%
• 2007 Economist Intelligence Unit: 56%
• 2009 Forrester Research: 47%
Many organisations do achieve acceptable ROI from their CRM rollouts, yet a substantial number of CRM customers are dissatisfied with some significant aspect of their implementations. According to Frederick Newell (author of Why CRM Doesn't Work), “only 25 to 30 per cent of companies implementing CRM initiatives feel that they are getting the return they expected!” As recently as four years ago, a presenter at a Gartner conference suggested a lengthy list of factors that can jeopardise CRM initiatives, including poor methodology, lack of established success metrics and too much involvement from IT. A study by CRM Forum broke the failures into nine significant categories:
• 29% - Organisational Change
• 22% - Company Politics and Inertia
• 20% - Lack of CRM Understanding
• 12% - Poor Planning
• 6% - Lack of CRM Skills
• 4% - Budget Problems
• 2% - Software Problems
• 1% - Bad Advice
• 4% - Other
So if industry analysts like Gartner, Forrester and Butler suggest that 50% to 70% of CRM projects are sub-optimal, do not deliver a return on investment or fail altogether, what are the odds of you achieving a win? Should you even consider playing in the CRM lottery? So why haven't you heard more of this before? Why do you not read about these failed CRM projects, and what is causing these results? Well, a CRM project team certainly isn't going to want to admit an unsuccessful outcome, and the business leaders who have invested in the CRM project are often seen patting the CRM team on the back, for they have rolled it out, haven't they – it's installed and working! The true measure is not whether the system is live. It is whether the users actually adopt it and use it as an inherent part of their role, not in a lacklustre ‘my manager forces me to ensure it's updated before his monthly meeting' type of way. 
From your ‘Start state' what problems and challenges did you have in your business you foresaw the solution solving? What were the business issues you were looking to address? Not the screens you wanted or the look of the reports, but what was the actual ‘end state' you envisaged - was it an increase in renewal rates, a reduction in time for processing orders/transactions or more centralised management visibility of daily sales activity and performance? The simple truth is that too many organisations buy an off the shelf CRM solution, without first considering what business issues they really want to address. The old saying goes that ‘if you don't know where you're going, any road will get you there', and this is true with CRM. Entering into a CRM project, many customers do not truly understand their goals and the objectives they are looking to achieve in order to be able to measure a success or failure against them. These objectives need to be business ones, not purely technical implementation points. Implementing is one component of a successful CRM project and often the core focus of the technical teams who are often still seen wanting to keep things in house and to manage and touch the components involved. Even in this day cloud solutions often get side-lined with little consideration. There still seems a psychological barrier with many feeling they have more control or security and even a better price when deploying an on network solution themselves. In reality if you consider the options cloud is a very attractive proposition. Deploying a CRM solution in house involves buying the hardware, operating system and CRM software. Installing and configuring it onto your network and resolving any issues along the way. Now think about resilience, what if any component fails, what do you need to do to mitigate this , have a 2nd standby box perhaps? If something fails it is upon you to fix, replace any failed component and deal with the issue. We all know that things break, phones, TV's and PC's so what's to say a component of your in house CRM solution won't fail at an inopportune time and leave you with little or no access to your valuable data and processes. How about giving remote access to users, perhaps over VPN, what about mobile access? Do you already have this in place on your network or will this be a new requirement imposed on you by users wanting to access the CRM information on the move (after all making it easy for users encourages adoption and use!). How will you backup your CRM data and how often, daily? and will you ensure this is done. Best practise for backup is off site with multi-location copies, but this is hard and costly to attain in simple fashion. With your CRM containing important valued information to your business, the importance of this increases. So when factoring in a comparison of cloud ensure you truly evaluate the pro's and cons of both solutions and price up not only the hardware and software costs, but the costs of delivering mobile access, resiliency and backup and the unexpected costs of replacing components or recovering data in case of a failure. Cloud CRM aligned with a smart plan of achieving the desired business outcome can enable even the smallest business to have a winning bonus ticket.A Gartner Group report once cited that 65% of those with a CRM initiative will fail “to align senior executives, IT Management, functional/departmental management and customer outcomes”. 
The findings of a poll of 100 SME organisations with CRM implementations revealed that while 60% of sales directors insist that CRM is fundamental to their sales processes, a quarter have lost customers directly through their businesses ineffective use of CRM technology. Essentially teams are not using their CRM systems correctly, 44% of sales directors admitting that fewer than 80% of their staff use the technology in the way they expected. The knock on effect is a loss of potential revenue and increasing levels of customer dissatisfaction aligned with a lack of quality data and reporting. Too many CRM products are designed for specific use by the sales team, despite customers interacting with other departments as well, such as Technical Support, finance and Customer Service. If different departments can utilise the CRM and share information the knock on effect is that the value to each person to use the CRM increases. So what can you do to ensure that you are a CRM Winner? It is key to have the requirements of all the departments who might touch the solution gathered , understood and prioritised and not only as a way of selecting the technology, but as a foundation of the implementation plans. What will the users want that will have a positive impact on the outcome? What will enthuse them to use the system when it's up and running? CRM considerations at high level can be grouped into four segments, form factor (on network or cloud), vendor selection, requirements/desired end state and implementation. It is all about the outcome and measurement of success that will define if you had a winning ticket. Ensure that you do not fall into any of these CRM traps; •No clarity on the project objectives •Lack of Executive Sponsorship •Lack of user involvement during planning •Lack of User Adoption •Blindly recreating existing processes in a new system (without review/appraisal) •Choosing the wrong Vendor Partner Enter any CRM project not looking at it as a technical implementation, but with clarity on your desired business outcome and all the factors needed to achieve this and deliver a successful user experience. Winning the CRM lottery isn't easy but it's not out of your reach if you apply focus and effort along with working with a partner who focuses on your outcome over their own! For further information visit www.workbooks.com   ### The Evolution of CRM: Building Social Relationships in the Cloud By Richard Young, Director of EMEA, Nimble Social media has turned the relationship between customers and salespeople on its head. But Customer Relationship Management (CRM) tools are adapting to this sea change – as must you, if you're a sales professional. Product and service vendors have been managing their relationships with customers for many, many centuries, really – ever since the concept of commerce was put into practice. Starting a couple of decades ago, the term customer relationship management became synonymous with computer software that automated the process. Desktop-based CRM systems helped businesses build databases of customers and identify promising leads. Information about these prospects was fed into the system from all corners of the company, and the sales team was able to assemble detailed profiles of their contacts. They were then able to identify these potential customers' needs, maintain a running log of their interaction and build relationships with them -- based on a great deal of knowledge. 
As other desktop systems migrated to the Web in the 90s, so did CRM applications, making access easier for remote employees. Then along came social media sites like Facebook and Twitter. Slowly, the model changed again.

A New Kind of Customer, and a Sales Challenge

Today, you as the salesperson are no longer the first contact that purchasers of goods and services have with a company. By the time they're ready to start exploring the possibility of a sale, these consumers and businesses have learned a lot from their network of online contacts. According to a Google eBook that explores this Zero Moment of Truth – the online decision-making moment – they've solicited information from 10.4 different sources. So the connected customers of 2012 are very different from even the online shoppers at the turn of the millennium. They do more comparisons. They browse Facebook and Google+ pages for product and service information – and for comments and reviews from people who have already purchased. They read posts on company blogs and pose questions. They follow Twitter feeds to see how responsive customer service is. What does this mean for you? Several things: No longer can you rattle off a description and list of features – your audience already knows about them. Might you provide testimonials from satisfied customers? No, the prospects have already read people's evaluations on numerous sites. Advertisements? Direct mail? Brochures? There's still a place for such tools, but they only scratch the surface of what a diligent researcher can find on the Web.

A Limited Data-Gathering Model

In these early days of social media, you're no longer the primary go-to resource for information. The conversation is changing, and the buyer is driving it. This phenomenon changes the nature of CRM. Previously, you would build a profile of a prospect based on:
• Basic contact details, possibly obtained from a mailing list
• A contact history – neatly organised electronic records of interaction with the prospect
• Information about the prospect that other individuals in the company have received

Demographic data, a synopsis of the prospect's interests and a sense of what their needs were would be extraordinarily helpful, but that would require a lengthy conversation – if the individual was willing. Maybe they checked boxes on a mail-in inquiry card – but probably not. Traditional CRM was limited to collecting information that was relayed directly by the prospect to you or a co-worker.

Discovering Needs, Qualifying Leads

In a sense, the Web is serving as a lead qualifier – at least in terms of gauging interest. If someone contacts you, it means they've probably done their homework – "touched" those 10.4 informational sources – and they're closer to their buying decision than last century's customers would have been by then. If only it were always that easy. It's been said that the Internet is one giant sales database. While there's some truth to that, no useful database would be so far-flung and disjointed. So you must be a bit of a detective to build the more comprehensive prospect profiles that social media facilitates. First, you need to find out where your potential customers "live" online. That will depend in large part on the type of business you're in, since every social media destination has its own draw. Join LinkedIn Groups and participate in topic-driven Twitter chats (there's a good list here). Explore Facebook and Google+ pages hosted by companies in your industry. What are they talking about?
Are there common threads you might follow to other venues? What blogs and other communities might your prospects frequent? Build Your Credentials Social media provides a golden opportunity for customer relationship management that desktop-based CRM tools don't. You can slowly and carefully establish yourself as a thought leader in your field by sharing your expertise. Be helpful. Answer questions on blogs, in chats, on Twitter. Provide general information and leave the sales pitch back on your own website. There's nothing wrong with posting a link to your blog or site in your signature line, just as you might hand a business card to someone at an event when you're just chatting, but don't overtly sell when you're just travelling around getting the lay of the land. Ask questions. You're trying to assess peoples' needs. What problems do they have with their existing solution? What's on their wish list? People like to buy from people they like, individuals and companies that simply want to understand, to empathise, to learn and respond. Social Relationship Coordination Necessary Managing your social relationships can be a Herculean task. Fortunately, there's help.Nimble provides a Web-based framework for your contacts, email, social streams and activities. No more hopping from site to site. You get a comprehensive profile of your customers and prospects, and you can see who influences them, pulling them, too, into Nimble. The site goes far beyond what desktop CRM solutions can do, letting you track, manage and explore the social connections that can evolve into sales relationships. Undoubtedly, CRM will continue to evolve. Social media have opened the floodgates, and effective management of these new insights into your existing and potential customers can keep you competitive.   ### The Sky’s the Limit By Micheal Higgins, CloudSigma, Manager of Enterprise Solutions Architecture As public cloud services continue to evolve, we’re learning the true extent of possibilities for cloud computing. A Forrester survey revealed that a full 36 percent of enterprise IT managers plan to invest in infrastructure-as-a-service (IaaS) in 2012, which means that cloud-hosted infrastructure is no longer a niche market. Part of the reason for this is the sheer range of applications for IaaS in practically every industry imaginable. At CloudSigma, we’re exploring the cloud’s potential in markets as diverse as the media industry and research science. We’re finding that by taking full advantage of the public cloud’s storage and compute capabilities, matched with the flexibility and customisation to meet clients’ unique needs, there’s no limit to how we can tailor our customers’ public cloud implementations. Recently, at the 2012 National Association of Broadcasters (NAB) Show, we officially launched our Media Services Ecosystem, a public cloud environment created exclusively for media industry professionals, to provide them with one roof under which to collaborate. Whether it’s the next blockbuster feature film, or a future Grammy-winning pop hit, media productions can be built faster, cheaper and more efficiently within a cloud environment. By utilising features such as our powerful solid state drive (SSD) storage capabilities and 10GigE networking, industry partners all over the world are building services within our cloud. Media production companies can now store and move large format files at the blink of an eye and at a minimal cost between different service providers, saving countless dollars and hours. 
That means that even smaller companies can move the sort of workloads necessary to assemble a high-end production. Our same IaaS model that’s supercharging major media productions with one hand is powering the most sophisticated scientific research under way at Europe’s leading research facilities. As part of the Helix Nebula partnership, CloudSigma is among the providers enabling an immense “Science Cloud” to generate the volume of computing resources that today’s researchers require to, say, chase the elusive Higgs particle... or to examine genomes essential to better understanding evolution and biodiversity… or to create a platform with which to observe natural disasters from space. CloudSigma is such an essential part of this collaborative because of our ability to provide full scalability, on demand resources and unrivalled storage performance for organisations that experience a high degree of variance in data flows. With a cloud build to handle the most demanding types of computing requirements, we excel at running core enterprise systems and web services also. The redundancy and high throughput needed to enable our HPC customer workflows lets mainstream business uses, from databases to web servers, fly in our cloud. The potential applications for our cloud are only just beginning to roll out. We’re establishing ourselves as a completely unique, pure-cloud provider on many fronts, including our ability to offer full compliance with national data privacy laws by providing a European cloud free from the domain of the U.S. Patriot Act; our vendor lock-in free model allowing full retrieval of drive images at any time; and our emphasis on security, with private fiber connectivity and virtual servers separated at the hypervisor level. Our unique position is the reason we’ve formed such successful partnerships, including those with Besol, with its Tapp Platform for cloud migrations; Attend, with its MediaCloud media collaboration tool; Strategic Blue, with its cloud brokerage and billing services; and now, with Compare the Cloud, helping them to build out their comprehensive cloud provider comparison services. We’re looking forward with excitement to where our one-of-a-kind public cloud takes us next. More and more, with IaaS and other cloud services, it seems that the question isn’t what can the cloud do, it’s what can’t the cloud do. ### Is Cloud Technology Anything New? Cloud Computing as we see it is simply good old fashioned technology that is over 25 years old rebranded, re-spun and pumped out to the market in various guises. Large enterprise companies such as HSBC, Barclays and many other banks have been deploying server based computing for decades. When security, disaster recovery and business continuity are paramount, server based computing comes into its own. Banks have used terminal servers across multiple sites, offering their actual users nothing more than a Virtual Desktop that is locked down and is essentially a dumb terminal used to access the backend servers from any location for ages. With an enterprise IT solution, you will find that the hardware is ‘enterprise grade’, owned by the enterprise and designed, built and supported by staff at the enterprise using the technology. Users are generally locked down at the desktop level, unable to install applications of their choosing and are given minimal yet sufficient email accounts. This leads to a very stable, secure, robust and hassle free IT solution which means the phone hardly ever rings as not a lot can go wrong!! 
An enterprise system is generally designed so that 'the phone won't ring', because if it does ring too frequently then questions will be asked of senior IT members of staff. So, roll on to around 2005-2006, when a few young wise heads left enterprise-type companies where they had rolled out large Citrix or RDP deployments for banks or insurance companies, and they thought… 'Ah… I can build this in a Data Centre, partition off segments for SMEs to use and earn revenue!' Thus were born the early stages of what we now call Cloud Computing. Cirrus Stratus runs its own enterprise platform built on enterprise hardware and enterprise principles, because our founder has worked for many large enterprise customers and has been working with Citrix since 1991. When I say enterprise hardware, I mean hardware that a large enterprise such as a bank would be happy to use; hardware that is robust, secure and, more importantly, totally scalable for unlimited growth. Cirrus Stratus owns its own IBM terminal servers and its own Exchange platform, and controls its own data storage and backups over IBM SANs (soon to be SAN Volume Controllers offering real-time data replication). With regard to enterprise principles, I mean a system built so that the phone hardly ever rings with unhappy and frustrated users; I mean a company that owns all its own hardware and does not use third parties for key components like MS Exchange, data storage and data backup – because if you are at the behest of third parties, it is not a truly managed service. So, in a nutshell, Cloud Computing is using the same 25-year-old technology that large enterprises with multiple global offices have been using for ages, except that this time all of the infrastructure resides in a Data Centre and is backed up to a secondary Data Centre. Companies that sign up to the Cirrus Stratus service are simply given their own locked-down, secure partition on a server that is load balanced and monitored to ensure the user gets the best experience possible in terms of speed and functionality. The user is given a Virtual Desktop with MS Office 2010, an email account, data storage and backup, and network drives, and practically any third-party application can also be loaded onto the servers at the data centre. The end result is that the user can use any device, log on via a secure, encrypted and stable Citrix gateway, and access their data and applications from anywhere and at any time. If you are interested in adopting an enterprise Virtual Desktop solution for your business at affordable prices, then please ring Cirrus Stratus on 0208 543 3322.

### Why Cloud Buyers Hate Bad Salespeople (and How to Avoid Being Perceived as One)

By Marcus Cauchi of Sandler Training

Selling can best be defined as "getting your price on your terms and both parties walk away from the table happy and satisfied ... eventually." As buyers, don't you love to buy and hate to be sold? I do. Don't you love to feel like you're in control, to feel comfortable with the salesperson, the company, the product, the warranty and the service? Don't we all like to feel good about our purchase decisions at the point of sale and then, when we are back in our homes or offices, to feel happy with our decision long after we made it and when scrutinised by significant others in our lives and work? So if this works for you, why wouldn't you expect it to work for your customers and your prospects? Do you accept the fact that you aren't for everybody and not everybody is right for you as a customer?
Or do you still chase your personal commission above your prospect's best interests and peddle anything that they agree to buy so long as you get paid? Tech has long had a bad reputation for selling blue sky. Tech has also long had a bad habit of calling at their level of comfort. That isn’t selling. It’s an attempt at order taking. One geek talking to another geek about geeky things isn’t selling either. It’s free consulting. If you're selling Cloud based technology, stop thinking about this as a technology solution. It isn't. So what is it? Cloud happens to be the means by which business and personal visions are realised. The Cloud enables businesses to behave differently, to drive efficiency, to improve transparency, collaboration and currency of information. But unless you find someone who is finding the lack of these abilities is causing them enough pain to want to fix it, it remains an academic question for the people with money and power instead of one that you have tied directly to their career, reputation, personal bonuses and potential loss of power. These are the people the cloud is able to help in ways they haven't yet imagined. It’s your job to find out why they’d care enough to do it now, with you and pay you handsomely to do it! But telling them they need The Cloud isn't going to work either; they need to discover the reasons they want to buy it for themselves. Your reasons are just salesman's puff. And buyers think of salesmen thus: 1. They invade your day and interrupt you to beg or push for a meeting. 2. They show up and throw up (usually in some form of death by PowerPoint blathering on about their company, their products features and benefits) 3. They attempt to close you 4. They field questions and objections which they try and "handle" 5. Once the bun-fight is over, they try and close you again (often with a bruised ego for their troubles) and instead of qualifying you so as not to waste your time or theirs … 6. They offer to put their ideas, pricing and terms in writing in the form of a proposal so … 7. You can shop them to the competition That is bad selling; it starts with bad management, bad recruitment and bad beliefs entrenched through bad training and accepting bad selling habits. Forgive my bluntness but I’m guessing it’s probably not far off where you and your company are today and if not you, you know Cloud competitors for whom it is absolutely true. I get paid to tell people how to fix these problems so I'm not going to tell you how to fix them here for fear of making a hypocrite of myself. But if you can answer the following questions well, you are already well on the way to fixing the problems you're facing with your disappointing sales performance. 1. What do you want to happen at the end of each interaction with prospect? 2. What do you currently agree will happen? 3. And if it doesn't happen, what do you agree you can do to escalate it or end it? 4. How do you get your prospect to 70% of the talking and when you are talking you're asking great questions? 5. Are you really paying attention? Listening? Demonstrating you're listening? Responding to non-verbal cues? 6. Do you have a structured method to reach a qualified decision at the end of every prospect or customer interaction? If not, why not ... and when will you have one? 7. When you do your pre-call planning how do you rehearse the ways you are going to neutralise objections early and on your terms? 8. 
How do you make sure that, if you end up with a qualified prospect who is willing and able to buy, make the decision and funnel the required resources your way, you are co-diagnosing their problem and co-designing a solution, if it's appropriate to get that far? 9. If you are selling to a committee, do you ever make the mistake of presenting more than once to everybody who needs to be involved, and only for a decision? Yes or no, but nothing in between? 10. How do you eliminate the dreaded "I need to think it over" and leaving with a wishy-washy, unclear agreement about what happens next? 11. How do you make sure that everyone who is going to decide or influence that decision is bought in before you waste your time and resources moving ahead with a presentation of ANY sort? 12. How do you eliminate the risk of delays and roadblocks systematically? 13. How do you get access to the key people at the top of organisations, take yourself out of being a tech/IT sale and make selling the Cloud a strategic purchase (even with what others might see as a commodity)? 14. How does your current system of selling the Cloud and (dis)qualifying enable you to KNOW the real reason(s) why you're going to meet them? Why you're leaving? What happens next? Why you're going back? And what will happen at the end of the next interaction? 15. How do you as a company capture the lessons learned daily by you and your sales team? How do you share this knowledge consistently and regularly today to sell the Cloud more effectively? If you take nothing else away from this article, take this. People do not buy the Cloud because they want to get into the Cloud. There are reasons (their reasons) they might want to change what they are doing and do what you are selling them the ability to do. DO NOT MINDREAD. Find their reasons and get them to tell you how to sell the Cloud to them. Their reasons are not your reasons. ASK!

### Credit Cards – The ugly truth part 2!

Following on from my last article on credit card fraud, I felt that I had to write a short piece covering one of the topics mentioned in more detail: contactless credit card fraud. Contactless credit/debit cards are now (in the UK) a very common addition to the chip-and-PIN payment method. Imagine this: I am at my regular coffee house and go to order that double-shot latte, and notice that I have no cash on me. No problem, I will pay by card. I get my card out, simply touch it to the contactless device and hey presto, coffee paid for. What's the problem with this, I hear you ask? Contactless payments have been around for quite some time here in the UK. Well, there's no problem with the mechanism itself, only with the fact that the chap standing behind me has just scanned the cards (and I have a few) in my wallet with his phone and helped himself to the full name, number and expiry date of every card in it, without even stealing it! How about that for a perfect crime – and the app runs on any off-the-shelf smartphone. There are approximately 19 million contactless cards in the UK, and Barclays accounts for about 13 million of them. The majority of online stores have linked in added security verification (Barclays in particular) to prevent this fraud, as well as requiring the CVV (the 3-digit number on the back of the card) for payment. However, there are hundreds of online stores that don't ask for the CVV or added security measures. I know what you are thinking: that's terrible, right?
How can banks issue these cards without even encrypting the contactless information used when paying this way? I think the same, and now that Lloyds and other banks are issuing these cards, there's serious fraud about to happen. This is without any doubt a very serious breach of the Data Protection Act, and what are the banks doing about it? Nothing. In fact, they freely admit it is a concern, but say that without the 3-digit CVV code it should not be possible to use the information. But, as Channel 4 News highlights, it is indeed possible, and hundreds of websites accept these "stolen" details as payment (including Amazon – go figure that one). Don't just take my word for it: click on the link at the end of the page to watch the video and read what Channel 4 News has to say. You know, someone will always benefit from someone else's misfortune, and this is a perfect example. Sales of shielded wallets have gone up massively (and will go up even further once this information becomes general knowledge), and I am not kidding either. It is incredible, and hard to conceive, that theft could be so easy. Can you imagine how many opportunists will be walking the streets getting that little bit closer to you with a smartphone in their pocket (is that a phone in your pocket or are you just about to rob me – sorry, I couldn't resist that comment)? On the tube, on the bus, in line for that coffee – capturing not just one card's information from your wallet, but all of it! Look who's hosting the Olympics this year, and the extra millions coming over, all with wallets. There is a website that details the contactless payment system, and also a 17-page document that sets out the payment process from a retailer's point of view. There is one part where the document states "Customer Action", and I am going to write in and suggest covering your wallet in tin foil or, indeed, buying a shielded wallet and watching the shares go up. My advice: get a shielded wallet and buy shares in the company that makes them, as this will be a massive problem the like of which we have not yet seen. With thanks to Channel 4 News http://www.channel4.com/news/fraud-fears-grow-over-contactless-bank-card-technology

### Maildistiller announces true cloud computing with the introduction of utility billing.

It's the announcement MSPs and resellers in the email security space have been waiting for: finally, a cloud security vendor offering the utility billing flexibility that true cloud computing promised. "Just remember that cloud is not about computers or technologies, it is about ICT services or ICT enabled business services supplied on a utility basis, just like electricity, water or telephony." (Source: HM Government G-Cloud Blog) Maildistiller, based in Belfast, is arguably a relatively new name in the email security spotlight. However, the company has undoubtedly cemented its position in the market following almost 8 years of continued growth and success and a concentration on the development of its technology as opposed to marketing hype. Now, though, with a portfolio of sophisticated solutions refined and perfected, a 100% channel focus and the introduction of the market's first utility pricing model, the company is fully equipped to become a frontrunner. Colm McGoldrick, founder and CEO of cloud email security company Maildistiller, explains the necessity of a utility pricing option for the channel.
“Cloud has been around in some form or other for quite a number of years but has really only gained momentum in recent times. This is largely due to the challenges and concerns resellers face when considering a transition to cloud services, particularly the financial model that accompanies it. At Maildistiller we wanted to make this easier and give our partners a better way to buy so that they can fully exploit the huge opportunities the cloud presents.” Maildistiller's utility billing model is unquestionably going to give the email security market a timely shake-up and will force other vendors into playing catch-up as more and more partners begin demanding this level of flexibility. Until that time, as the only vendor currently offering the channel utility billing – whereby MSPs only pay for what they use, on a monthly basis – Maildistiller is unsurprisingly fast becoming the vendor of choice for many and the go-to option for ex-Webroot partners in need of a stable home. Maildistiller's ongoing commitment to the channel, and to removing any barriers preventing a transition to the cloud, is certainly an admirable one that all of the Maildistiller team are extremely proud to be a part of. For me, as the marketing executive for this innovative and slightly maverick email security company, this is a very exciting time. Maildistiller has always been a technology-focused vendor, which, to its credit, has resulted in an impressive suite of solutions and a partner program that actually 'gets' what channel partners need. I've no doubt that the introduction of a true utility billing model, which allows partners to scale up and down on a monthly basis and only charges for actual usage, will cause a welcome stir in the market. Now it's time to get shouting and make sure Maildistiller is well and truly in the email security spotlight going forward! Danielle Campbell www.maildistiller.com

### Credit Cards – The ugly truth!

Neil Cattermull delivers his perspective on the recent credit card breaches in a two-part blog series. Over the years all of us have come to rely on pieces of plastic that allow us to spend more money than we have. We all have one, and many of us have three or more. Credit cards have been with us since the 1980s, starting with the traditional debit cards. We all know the main providers, and to some people they are even status symbols for showing off the amount of wealth they have (Black, Platinum, Gold and others). It started with a signature being required, then a memorable PIN (which most people keep the same across all of their cards) and, recently, the contactless verification systems. Nothing new to tell you, I hear you say? Yes, you are right, this is not news. However, what is news is that the big players' processing systems (based out in the US) have now been hacked – allegedly from a taxi cab parked in a car park – and, according to the Daily Mail, over 10 million card users are affected. Yes, I did say that correctly: 10 million. This reminds me of the Heartland Payment Systems fraud a few years ago, when 130 million credit and debit card records were stolen from their (and others') computer systems because inadequate security principles and safeguards were in place! One 29-year-old hacker was sentenced to prison until 2025 and, guess what, he had been doing it since 2003 and was only arrested in 2008!
So back to the latest security breach: Global Payments Inc (GPN on the NYSE) has been breached by hackers – news that today very quickly knocked nearly 10% off its share value, and it would have been more if the stock hadn't been frozen. So who are Global Payments Inc, and why should that affect the Visa/MasterCard that's in my wallet? Well, let me explain and try to shed some light on the subject. I have a Visa card, and I go to pay for that expensive lunch with said card. It gets processed, with a tip if I am feeling generous, and off it goes into cyberspace to be checked and debited against my name and account. At this stage, my data from that transaction has been sent to a third party that follows through that transaction on behalf of the card provider – effectively executing it. It is at this stage that the fraud begins, with a hacker tapping into this data and capturing it. Now they have my card details and the identification that belongs to that card – happy shopping! The CEO of Global Payments Inc was strangely unavailable to comment, and both Visa and MasterCard have stated "our own systems have not been breached and the account data may have been compromised at a 3rd party entity". Neither firm specified how many customers may be affected. So that's all right then: their systems are safe, just not the ones that actually carry out the transaction – due diligence springs to mind. So what regulations are in place to stop this happening? Well, there's a standard called PCI-DSS (Payment Card Industry Data Security Standard) that puts the framework and rules in place to ensure this exact situation does not happen. Now, I am not sure what has happened with Global Payments Inc's security in the US, but it seems to me that it may not have been quite up to standard, or at least not checked and audited regularly. What really worries me, though, is that the UK has not seen this wave yet, and further research uncovered (in an article published on the callcentrehelper.com website) that only 37% of contact centres in the UK judge themselves PCI-DSS compliant. If this wasn't bad enough, the vast majority (89%) admitted to not understanding its requirements and penalties. This seems to me like a bomb waiting to go off. Imagine all the times you have given your credit card details to insurance companies, online stores and other contact centres when purchasing goods over the phone. Absolutely incredible, isn't it – and there's all of us worrying about our own home PCs and protection! When discussing this in more detail with various sources, only a few companies were willing to comment. One such company is Firehost, a leading US and now UK-based IT hosting company specialising in the secure hosting market. Jim Ciampaglio, Global SVP, states: "Security is paramount for any company and needs to be tested regularly. So many times firms adopt really solid security policies at the start of their lifecycle, but over time policies get side-stepped and circumvented. Regular testing and compliance audits are critical. Without this in place big holes open up in rigid systems." To be honest, I totally agree with Jim, and it's satisfying that his company is now established in the UK. There is an absolute need for a company such as Firehost here in old Blighty, as I still think security is not taken seriously enough – and they have a 100% record. Another company I spoke to was PSTG Ltd, an IT Solutions Provider in the Cloud Services sector.
Dirk De Vos, one of the managing partners, spoke out and told me: "This breach highlights the huge risk with larger organisations that use third party companies to manage or deliver services using secure data. Visa & MasterCard hold extremely sensitive data and their partners need to be stringently checked to ensure lapses in security like this don't take place. Both Visa and MasterCard, who spend tens of millions on their security infrastructure, have been let down by their third party handler and hopefully they can learn from their mistakes and ensure this doesn't happen again. If anything it reinforces the adage 'Cash is king' and you should probably not be handing your Visa/MasterCard to random taxi drivers." So what can we do to limit our risk? Let's face it, you, the individual with the card paying for that expensive meal, cannot do very much, to be honest. This is down to the chaps that hold your data and their security principles (the same in every industry). However, there are some really basic things that you can do to limit your risk. For example, I have a credit card (four, to be exact) and I only use one specific card to conduct web-based and over-the-phone transactions. This card has a very small limit (£250), so I know, hand on heart, that that's my maximum exposure to fraud – simple. Or, even better, pay by good old cash – but watch out at the ATM, there's a long history of those being hacked too!

### What is Cloud Computing?

By Richard May, Managing Director of Virtual DCS

The 'Cloud' does not exist. It is simply a metaphor for the internet, and within this metaphor there are a number of services that are placed under the Cloud Computing label. The most popular currently include Infrastructure as a Service (IaaS), Software as a Service (SaaS), Recovery as a Service (RaaS), and Platform as a Service (PaaS). The number of 'as a service' products that can be included under the Cloud umbrella is limited only by the imagination of the business and the technology available to it, with more Cloud services being created on a daily basis. Infrastructure as a Service is perhaps the basis for many of the other Cloud services. IaaS is where a company splits a server's resources into smaller partitions, which are then known as 'Virtual Machines' or 'Virtual Private Servers', depending on how the server is partitioned. For a technical comparison of Virtual Private Servers and VMware Virtual Machines, click here. The Cloud provider then sells these individual partitions to users, who gain access to them, typically via the internet. By using virtualisation technology, the user can pay for their partition on a monthly or quarterly basis while avoiding purchasing and maintaining redundant hardware, as upgrades and patches are typically handled by the provider. The user also has the ability to expand their partition whenever their workload requires it, only paying for the facilities that the business uses. Within IaaS, the user can also choose either a 'Public' or a 'Private Cloud' solution. A 'Public Cloud', as described above, is where the server is divided into several secure partitions, each partition housing a different business. A 'Private Cloud' is where the user has use of the entire server, either offsite and maintained by the Cloud provider, or onsite and maintained by the business. The decision to use a 'Private Cloud' over a 'Public Cloud' is often dependent on the legal requirements and legislation applying to the business.
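The pay-for-what-you-use model described above is easiest to see with some rough numbers. The sketch below prices a resizable virtual partition from its allocated resources and compares it with a depreciated in-house server; every unit price and figure is an illustrative assumption, not virtualDCS's or any other provider's actual rate card.

```python
# Illustrative IaaS pricing sketch. All unit prices are assumptions,
# not any provider's real pricing.
from dataclasses import dataclass

@dataclass
class VmSpec:
    vcpus: int
    ram_gb: int
    disk_gb: int

# Assumed pay-as-you-go unit prices (GBP per month).
PRICE_PER_VCPU = 8.00
PRICE_PER_GB_RAM = 4.00
PRICE_PER_GB_DISK = 0.10

def monthly_iaas_cost(spec: VmSpec) -> float:
    """You pay only for what is currently allocated to your partition."""
    return (spec.vcpus * PRICE_PER_VCPU
            + spec.ram_gb * PRICE_PER_GB_RAM
            + spec.disk_gb * PRICE_PER_GB_DISK)

def monthly_inhouse_cost(server_capex: float, lifetime_months: int, running_costs: float) -> float:
    """Crude in-house comparison: hardware depreciated over its lifetime plus running costs."""
    return server_capex / lifetime_months + running_costs

small = VmSpec(vcpus=2, ram_gb=4, disk_gb=100)
grown = VmSpec(vcpus=4, ram_gb=16, disk_gb=250)  # resized later; billed only from that point

print(f"IaaS, initial partition:  £{monthly_iaas_cost(small):.2f}/month")
print(f"IaaS, after expanding it: £{monthly_iaas_cost(grown):.2f}/month")
print(f"In-house server:          £{monthly_inhouse_cost(3000, 36, 60):.2f}/month from day one")
```

The contrast being drawn is visible in the last line: the in-house figure is paid whether the capacity is needed or not, while the IaaS figure only steps up when the partition is actually expanded.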
Recovery as a Service (RaaS) is a Cloud based method of disaster recovery, however unlike traditional solutions, the user does not have to purchase a duplicate set of hardware in case of a disaster. RaaS, like all Cloud technology, is flexible to the needs of a business, with options for real time replication, or bulk backup schedules. Furthermore, (depending on the level of service that the Cloud provider offers) a Cloud based solution would typically restore data within a matter of minutes, whereas a traditional tape based solution could take days. Platform as a Service is a solution that enables the user to develop, and launch, custom applications from their environment. Most commonly used by software developers, this method enables the user to remain in control of their environment, while still receiving the full benefits of virtualisation while only paying for the resources that they use. Finally, Software as a Service (SaaS) enables software developers to securely release their software via the internet, enabling them to charge for their software on a monthly or quarterly subscription service. SaaS offers an additional layer of control and security for the software developer, while giving them access to an international revenue stream that would have been previously unavailable. For more information on Cloud Computing and our services, please visit www.virtualdcs.co.uk/ or call a Cloud expert on 08453 888 327. About virtualDCS: Yorkshire Cloud Computing company Virtual Data Centre services (virtualDCS) were established in 2008 by Richard May and John Murray and have since had a 100% growth rate each financial year. Since January 2011 virtualDCS has employed 6 new staff members. Using the latest Cloud Computing technology from VMware, their solution delivers significant cost savings, improved availability and reduced carbon emissions. ### What is the Cloud Industry Forum? An Interview with its Chairman, Andy Burton Cloud Industry Forum The Cloud Industry Forum (CIF) was established in 2009 to provide transparency through certification to a Code of Practice for credible online Cloud service providers, and to assist end users in determining core information necessary to enable them to adopt these services. Andy Burton is Chairman of the Cloud Industry Forum. CTC: Welcome Andy, why did you form the Cloud Industry Forum (CIF)? AB: Because there was (and is) so much hype and vendor led messaging, and because the primary point of information is online it is very difficult for credible service providers to stand out from the marketing ‘noise’ and for end users to find a trusted party to rely on for impartial advice. It was clear that formal Standards were some way off, but industry led best practice could help facilitate education and operational standards in the meantime. CIF absolutely fills that need, we are a not-for-profit industry body that carries out research on cloud computing, educates end users on the key issues and opportunities around cloud solutions and champions best practice among professional cloud service providers. CTC: What is your definition of Cloud Computing? AB: How long have you got? In short there is no “universal cloud”. Cloud computing is effectively a means of accessing IT as a service wherever you have internet access and through whatever device you access it. It is scalable (up and down) and typically charged on a consumption based pricing model, where (like mobile phones) you can get better rates if you subscribe to higher volume or longer commitment. 
Cloud operates on two levels, namely Service Models and Deployment Models. Service models (as in “as-a-service”) determine how managed you want the solution – do you want infrastructure to build a solution, or a platform which includes the notion of operating system built in and reduces the complexity of infrastructure management), or do you want the whole application (or software) served? Deployment models are to do with notions of shared or dedicated infrastructure and deal with the poles of privacy and collaboration. Private clouds are restricted to one organisation, public clouds are a shared platform that offers a price point for a more standardised experience (much like SaaS does at an application level) and Hybrid clouds enable public and/or private clouds to inter-operate at a community level for common activities. CTC: How have your initial goals changed or adapted from your original goals? AB: In the first year or two our focus was on educating the Cloud Service Provider market and establishing the Certification scheme for the Code of Practice. Now that has been up and running for a year, our focus has broadened to ensure end users, especially buyers, are aware of the Code of Practice and encouraging them to look for the Certification mark as it ensures those Service providers will provide clear information about their organisation and capabilities in a common format that enables end users to make rational comparison between certified vendors and therefore assist an informed decision. CTC: Is the Cloud Industry Forum targeted towards larger cloud providers? AB: Not in any way. It is not a cartel, there are no barriers to entry other than an intent to act professionally. Cloud is transforming the industry and enables new talent to shine and prosper, but standing out in the crowd can be a challenge so the Code of Practice becomes a clear sign of the CSP’s commitment to operating in a transparent (and trusted) manner. CTC: What do you offer smaller cloud providers who deliver a good service but may not have large budgets? AB: Small companies pay a much lower fee to participate in the scheme than larger companies keeping the philosophy in line with cloud pricing in the industry. The challenge is in making sure the company can document and certify the information that we believe is essential to provide confidence to a prospective buyer. In fact right now, we are actively helping smaller CSP’s to get through the process by providing free guidance and support in achieving the obligations of the Code of Practice. CTC: Do you view hosted telephony providers as Cloud Computing providers and do you intend to reach out to this market segment? AB: There are many markets that are converging in the cloud space, most notably from the ISV (software producer), Telco’s, Network operators and the managed service and outsourcing markets. They all bring unique skills as the challenge in delivering IT as a service is you need good IT operations, good comms capability (data and voice) and a service ethos. So in short, yes hosted telephony is in that definition, and yes we intend to reach them. We actively partner with Comms Business magazine to share our thoughts to this market area. CTC: As a not for profit organisation where do you spend the capital raised from members? AB: Money raised from Membership is invested in research, white papers, educational events and promoting the code of practice. 
Funds raised from the Code of Practice Certification are invested in administering the scheme and ensuring effective governance and integrity. CTC: Are your certification and membership programs open to non-UK vendors and suppliers? AB: Yes they are – cloud has no international barriers, nor does our Certification scheme. That said, we require participants to declare their scope of coverage and to make clear issues like data centre locations and the ability for customers to determine data sovereignty. Whilst we are physically based in the UK, we have relationships with Alliance partners that endorse our Code of Practice, such as EuroCloud and the Cloud Security Alliance, operating across Europe and in the US respectively. We are shortly to launch our US presence. CTC: Is there any recourse for end-users who have a complaint about the service or performance of one of your suppliers? AB: Absolutely, we have an independent Governance Board made up of lawyers, end users, industry vendors and independent experts. It presides over the evolution of the Code of Practice and hears any complaints about organisations that are believed to have breached their Certification. CTC: What are the goals for the Cloud Industry Forum in 2012? AB: The primary goals for CIF in 2012 are two-fold:
1. To ensure that the Code of Practice Certification is introduced as a gating criterion in all buyers' RFQs and ITTs when they are investigating suppliers for cloud services. The more users ask for Certification, the more vendors will see value in demonstrating transparency, capability and accountability for cloud services!
2. To increase the level of awareness of cloud computing among end users and to enable them to determine if and when any IT services they require should be operated as a cloud service.
CTC: Finally, what are the 3 main things in your opinion any company should consider before adopting, subscribing to or purchasing a cloud computing service? AB:
1. Know your restrictions: issues like regulation, legacy architectures, and degrees of integration between apps will shape the Deployment and/or Service Models that may be relevant to your organisation. For further information view our white papers at www.cloudindustryforum.org
2. Know your potential suppliers: look for CIF certification; if you can't find it, apply the same principles by following the advice of White Paper 6, which you can also find on www.cloudindustryforum.org
3. Review the contract: all laws are not equal and all contracts are not balanced. See our advice in White Paper 3 from the same website.
CTC: Thanks Andy!

### Hosted Telephony for Hotels

By Neil Tolley, Managing Director, Fourteen IP

Hotels, particularly the major groups, have been moving services above property (to the cloud) for quite some time now: servers and storage went some time ago, and a great many hotels now host their Front of House system in the cloud along with numerous other services. Indeed, some groups such as Kempinski have moved email over to Google Apps, taking away the management of a worldwide email service. Now that these projects have been completed and have been a success, there is a drive to move other services to the cloud, including telephony. Applications such as Call Accounting and PMS from Tiger, SDD and others are available as a hosted service and are now becoming the norm – leaving telephony. The reasons are simple: hoteliers are just that, and do not want to have to manage a huge array of increasingly complex IT systems.
The challenges in providing hosted voice solutions to hotels have always been around other services such as Front of House, Call Accounting and PMS also needing to be off property, and then the particular problem of the high number of analogue extensions deployed in most hotels. Historically this was a real problem, and the solution was to re-cable in CAT5e and change handsets to IP or SIP; few will be surprised that there were no takers for that approach. There are now affordable solutions – I shall focus on two, but there are others. Audiocodes manufacture a number of gateways that are certified to work with a range of PBXs. These gateways allow legacy analogue extensions and trunks, as well as ISDN2 and ISDN30 trunks, to be connected to the LAN and then on to the hosted service. The Audiocodes gateways also provide local survivability, allowing some services to continue in case of network failure, and allow analogue trunks to be connected for both survivability and emergency services use. Mitel's model uses their AX controller working in 'peripheral mode' and allows up to 288 analogue extensions to be connected in a single chassis which is connected to the network. Both solutions work well and, importantly, are priced well enough to allow a hosted solution to be built. For full-service hotels that require all the features and functionality of a traditional on-site PBX, solutions from Mitel and NEC are available; for smaller hotels, a solution is on the horizon from one of the major SME hosting providers that will provide hosted voice along with 'big hotel' features and access to cloud-based call accounting, PMS and guest voicemail. Why Move to Hosted in Your Hotel? There are a huge number of hotels that have very old, very reliable telephone systems. Systems that are 10, 15 or even 20 years old are normal. They work; perhaps they do not have the features of a new system, but they work, and a new system is very expensive. Long gone are the days when guests used the hotel phone and telephone revenues were a good source of income and a reason to upgrade the phone system. So why move to hosted? The 10-20 year old PBX will no doubt no longer be supported by the manufacturer, and the availability of parts and the ability of maintainers to keep the systems going will continue to diminish, if they have not already. Purchasing a new onsite system requires a large capex investment and allows the same challenges to appear again in a few years' time, perhaps more so as we live in a world of regular software updates and ever-changing needs for connectivity. A hosted solution requires minimal capital expenditure and gives you a system with all the features you require, fully managed and always up to date with the latest software release. You will get the latest features not just for your guests but for you and your team. Save the money and hassle of dealing with carriers for your ISDN and analogue lines. Save some space and electricity. Add in flexibility in adding and removing features and users, the ability to swap easily to a local PBX should you sell the hotel, and access to other cloud-based services such as call accounting, voicemail and PMS. The question is perhaps: why would you not move to hosted for your hotel?

### The "ALL INCLUSIVE" Colocation Cost Con

Power = electrical power, and electrical power = vast profits for the colocation provider or legacy reseller. Data centres, especially in London, are critical to the country's growth and ability to compete on a global stage.
Overcharging for services is simply stifling UK plc's ability to ride out the economic challenges. Source Managed Services (sourceplc.com) have been contacted by increasing numbers of enterprise customers simply fed up with being charged for power they are restricted from using by the provider. So where's the con? Simply put, some data centre providers and their legacy resellers are charging for the maximum amount of power presented to the commando socket, multiplied by the price per amp. So a 16-amp commando attracts 16 amps of power cost, even though you contracted for 10 amps and have been contractually restricted to using just 10 amps. Let's take a small data centre requirement as an example, one that is common throughout London's small to medium organisations, the engine of our UK economy: just four server cabinets/racks located within a leading data centre in Central London. The suite has a contracted maximum power draw of 10kW due to the cooling constraints of the facility. Each of these cabinets has dual (A+B PDU) 16-amp commandos to provide resilience. Each cabinet, however, requires just 2.5kW, or roughly 11 amps. So the customer is expecting to pay for the 11 amps per cabinet at the rate per amp (circa £20-25 per month). Having this fixed pricing may be important to the budgeting process for the user, and as such they are willing to pay for the full amount of allocated power, whether used or not. On closer inspection, however, it seems that some data centre providers are charging them for the maximum number of amps presented to the suite – in this example 4 x 16 amps on the 'A' feeds and 4 x 16 amps on the 'B' feeds. So this customer, requiring just 44 amps and happy to pay for it (and even understanding that the provider needs to make some profit), is being charged for 128 amps, almost three times the power actually delivered!
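To make the arithmetic in this example concrete, here is a minimal sketch of the two charging models. The per-amp monthly rate is an assumption taken from the circa £20-25 range quoted above; the cabinet and feed figures come straight from the example.

```python
# Minimal sketch of the colocation power example above.
# The per-amp rate is an assumed mid-point of the quoted £20-25 range,
# not any provider's actual price.

CABINETS = 4
AMPS_NEEDED_PER_CABINET = 11       # roughly 2.5 kW per cabinet
COMMANDO_RATING_AMPS = 16          # each A and B feed is a 16 A commando
FEEDS_PER_CABINET = 2              # dual A+B feeds for resilience
RATE_PER_AMP_PER_MONTH = 22.50     # assumption within the quoted range

# What the customer expects to pay: the power they actually contract to draw.
contracted_amps = CABINETS * AMPS_NEEDED_PER_CABINET                   # 44 A
expected_bill = contracted_amps * RATE_PER_AMP_PER_MONTH

# What the "all inclusive" model bills: every amp presented at every socket.
presented_amps = CABINETS * FEEDS_PER_CABINET * COMMANDO_RATING_AMPS   # 128 A
billed = presented_amps * RATE_PER_AMP_PER_MONTH

print(f"Contracted draw: {contracted_amps} A -> £{expected_bill:,.2f} per month")
print(f"Billed capacity: {presented_amps} A -> £{billed:,.2f} per month")
print(f"Overcharge factor: {billed / expected_bill:.1f}x")
```

Run as written, the sketch reproduces the 44-amp versus 128-amp gap described above – roughly a 2.9x difference between the power contracted and the power billed.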
From the continued migration away from these legacy providers, it is clear that as the UK's data centre tenants become aware of this overcharging, they will move to the new approach of cost clarity and transparency offered by leading providers such as Digital Realty. Source, for example, present their customers with metered power regardless of the requirement, charge for the power used rather than the commando presented, and complement this with fixed rental pricing for over twenty years, regardless of the committed term. The industry has ceased adapting and has now evolved. Now is the time to examine your 'all-inclusive' power charges and challenge them. If your provider won't evolve, then please contact Source for cost clarity, transparency and honest consultancy. Fenton Bard. Fenton is a founding director of Source Managed Services, one of the UK's leading independent providers of data centre and colocation services. For more information please contact Source on 0845 467 0160, source@sourceplc.com or www.sourceplc.com – Source, a fresh approach.

### Who are Strategic Blue? An interview with Dr. James Mitchell

CTC: Who is Strategic Blue? We're the world's first cloud broker-dealer, based in London but offering our services globally. We bring trading expertise from the commodities markets to cloud computing. CTC: What's Strategic Blue's business model? We buy from cloud providers on terms that suit the providers, and sell to cloud users on terms that suit the users. This is a purely financial, non-technical service, which does not interfere with the technical means of access to the cloud. As a cloud "broker-dealer", we step into the billing chain between the cloud provider and the cloud user. We'll pay the cloud provider for the usage of all our consolidated cloud users on the cloud providers' terms... maybe all in USD, often with a large prepayment in exchange for a discount. We then ask each cloud user how they would like to pay. This may involve a different currency, or they may want a pre-agreed price for a project that doesn't start immediately, or they may need extended payment terms to match off against their revenue stream. If the customer is able to make a commitment to a minimum amount of usage for a certain period, we can often offer a discount to the cloud user. And if their requirements change, we can be flexible to accommodate their needs, effectively selling the discount back to them. The reason why brokers and broker-dealers are so common in commoditised markets is that they provide a fast track to finding the best product and the best price, without having to have a discussion with every vendor. The difference between a broker and a broker-dealer is that a broker arranges a trade between a buyer and a seller, whereas the broker-dealer is the buyer to the seller and the seller to the buyer, assuming credit risk and often a financing and risk-taking role. Broker-dealers are able to offer deal structures that are not available directly from the providers. CTC: You come from a commodities trading background - how do you apply this to cloud computing? We play to our strengths – we have experience in originating structured deals that meet the needs of the customer. There are well-documented analogies between cloud or "utility" computing and electricity, and we find that our experience is highly transferable and relevant. Importantly, we try not to stray out of our core financial intermediation expertise, preferring to partner with people who bring technical expertise to provide a complementary, full-service offering to our customers. CTC: What cloud vendors do you work with? At the moment we are able to offer our services, to varying extents, on Amazon Web Services, CloudSigma, OpSource, Joyent and Firehost, and are in discussions with a number of others. We've also recently started offering our financial intermediation services for media-related Software as a Service, through our "CloudMediaHub by Strategic Blue" initiative, which is launching at the NAB media show in Las Vegas in April. CTC: Why would someone come to you rather than work directly with the cloud vendors? The buying preferences of a cloud user mismatch really quite badly with the selling preferences of the typical cloud provider, resulting in a compromise that neither is particularly happy with. When we get involved, the cloud provider gets paid on terms that they are very happy with, and the cloud user has the bizarre experience of actually being asked how they would like to pay, how long they would like to commit for, which currency they would prefer, etc. CTC: What type of customers are you currently engaging with? Our business model is generally applicable to any user of cloud computing, from a start-up through to large enterprises. Part of our team are generalists, who work across industries, but we have also developed a specialism in the media sector. This specialism originated because the media sector has certain drivers that encourage early adoption of this game-changing technology. However, since the start of 2012, we have suddenly seen a rapid acceleration in uptake from other industries too. CTC: What do you define as cloud computing?
At its simplest, cloud computing is utility computing – it’s available “on demand” and on a “pay as you go” basis. We concentrate on “public” and “community” clouds, i.e. clouds that are provided by a third party, rather than a “private cloud”, which is made available onto to users inside the same organisation. Our view is that start-ups will typically use public cloud computing services 100%, with major enterprises operating their own private clouds, which “burst” to a public cloud, to form a “hybrid cloud”, when they have a spike in demand. The economics of working out when the “bursting” should take place is actually very similar to working out when different types of power plant should be run to achieve the lowest cost of electricity, so we feel quite at home analysing this! CTC: What current offerings would you like to highlight within your portfolio? We are particularly looking for existing or would-be SaaS vendors, who have a product that is applicable to the media industry, for inclusion in our CloudMediaHub initiative. We provide both a means to optimise the economics of purchasing the infrastructure to run your SaaS on, and also a sales channel to the end-user. In general, our services are of the most value to companies who run large compute-intensive projects on the cloud, that last for a number of months. We find that we can generally save significant costs for these types of projects. CTC: What advice do you offer any company looking to adopt the cloud computing model? Run a risk register for EACH application...but start with a risk analysis of your existing setup. You’ll find that you exchange certain risks associated with a DIY approach for different risks associated with a utility billing model. The Strategic Blue team will be happy to talk you through this. CTC: What major changes to the cloud computing industry do you envisage in 2012? We are now seeing major players come aggressively into the market to compete against Amazon Web Services, as well as vast numbers of smaller players operating in geographic or speciality niches. Our major prediction for the future is that it will rapidly become impossible for an end-user who has a day job to keep abreast of all that is going on. Cloud Brokers and Broker-Dealers will then proliferate to help customers find the right technical solution and the best price. The timing of this is not a coincidence...the adoption of cloud computing in general, including a proportion of public cloud usage, is widely regarded as inevitable by most enterprises, and as a “no-brainer” for SMEs. Finally, as more people move into cloud use, we’re going to see a lot of inefficient purchasing, with users potentially buying the wrong thing, or purchasing taking place remotely from the people actually using the cloud. Obviously this is something that we can offer a solution to, and we’d be happy to help on! For more information on Strategic Blue visit their website www.strategic-blue.com ### Simplifying Unified Communications in the Cloud Rob Pickering CEO ipcortex tells us about Unified Communications Although cloud generates a huge amount of press, and some might say hype, in the world of computing, it’s a phrase that’s having much less impact on the telecoms world. However, hosted telecoms deployments have been firmly in the frame for some time and the impact that cloud-like deployments will have on the world of communications will be very clear and quite profound. 
For more information on Strategic Blue visit their website www.strategic-blue.com

### Simplifying Unified Communications in the Cloud

Rob Pickering, CEO of ipcortex, tells us about Unified Communications.

Although cloud generates a huge amount of press, and some might say hype, in the world of computing, it’s a phrase that’s having much less impact on the telecoms world. However, hosted telecoms deployments have been firmly in the frame for some time, and the impact that cloud-like deployments will have on the world of communications will be very clear and quite profound.

In this blog post I want to look at the relationship between open standards, cloud computing and the “coming of age” of Unified Communications. It's a topic that I feel passionately about, have written a white paper discussing, and believe will have a huge impact on the ways in which we sell, support and consume UC technology. Here's why…

The telecoms industry has a legacy that spans decades. Historically PBXs only carried voice. They were limited to a pre-defined location or locations, and companies generally operated in fairly static organisational structures. Requirements could be well specified and interoperability was of little importance. Now all of that is changing. Businesses are much more mobile, far more fluid in the ways in which they organise, and M&A activity continues to re-shape industries across the globe. Under such pressures, IT staff must also become much more agile, better able to adapt and provision the capabilities their employers require at the drop of a hat. Arguably the initial explosion in Cloud Computing came as a result of IT managers’ inability to do just that, with departments seeking out business solutions delivered as a service across the net.

By considering these business requirements, it becomes a relatively simple task to define the key requirements of a Unified Communications platform able to meet these challenges. We need to architect solutions that:

- Enable quick response to changing business demands
- Facilitate and integrate remote and mobile working on the fly
- Enable fast integration of comms between merged and/or collaborative working arrangements
- Allow the adoption of capabilities and features available in the broader marketplace, perhaps facilitating better customer or supplier communications, or improving internal communications
- Provide a scalable, secure, “utility-like” service where customers pay for what they consume

I would argue that in the next evolution of our industry, it will be next to impossible to achieve this by building an architecture that has its roots in generations of legacy and, more importantly, proprietary architectures. As I look at the response of larger legacy telecoms vendors, I see strategies which could leave customers with incredibly functional Unified Communications solutions, but only if their employees only ever talk to each other, and only if they are primarily office based. This is not a route which will give a business a competitive edge – and providing a series of connectors and APIs simply doesn’t solve the problem for the long term.

Of course, not everyone will move to Cloud Computing and Unified Communications tomorrow. That said, technology refresh cycles in the telecoms industry can be quite long, so it's important that organisations start to consider some of the key strategic decisions about their own longer-term strategy now. In the white paper I identified six such decisions that need to be considered in developing the right strategy:

- Do I have a business case for Unified Communications?
- Is my current infrastructure “UC Cloud” capable?
- Do I want to leverage open or closed end-points?
- Which implementation architecture?
- How much should I leverage external services?
- Partner selection

To be quick to respond, to innovate and to evolve our products and services at the rate that will be required by our customers, we must provide a level of interoperability that is only really achievable through rapid and broad-scale adoption of open standards.
If, before, open standards were perceived simply as a route away from vendor lock-in, in this next phase they will be a business necessity.

ipcortex was founded in the UK in 2002 by current CEO Rob Pickering, with a focus on Unified Communications.

### Compare the Cloud: City Cloud Interview

CTC: Who are City Cloud and what is your company's main focus?

City Network is one of the leading hosting providers in Scandinavia, with over 15,000 customers. While a few other companies provide VPS services, our cloud computing platform, called City Cloud, is the leading and still the only true cloud computing service in Scandinavia. Our main focus is to deliver a secure, flexible and scalable cloud hosting service for everyone.

CTC: Where are your datacentres or points of presence located?

We are located in the city of Karlskrona in the south-east of Sweden. This is our HQ and this is also where we have our main datacentre. We also have secondary datacentres in Scandinavia. We have just recently started our expansion abroad and launched our hosting services in Poland and Argentina, as well as a general international push.

CTC: What are the advantages for any UK company choosing to contract with you?

Our cloud computing service is a real competitor to the services found mainly in the US. There are, however, a couple of important differences that we always point out. With City Cloud you always know how much your servers cost, where your data is located, and that you have the ability to reach knowledgeable support personnel for any reason. We provide support in many languages for those companies with an international angle. These are in many ways basic points, but very important ones when you compare our service with that of providers in the US.

CTC: Would there be latency issues introduced by using City Cloud?

In general, any provider in Europe would most likely do well in most European countries. However, if you run an application such as a game that is extremely latency-sensitive, you would do best to run actual tests. We have customers from all over the world, including big games, that run from Scandinavia and target the world. We see no issues running from Scandinavia – just as many use Amazon in Ireland to run their services.

CTC: Have you got any current products or services that you would like to highlight to a UK audience?

Our City Cloud offering in itself is the current offering that we always want to highlight: your datacentre in a browser, with scalable and cost-efficient servers in a matter of minutes. However, we recently made a huge improvement for our Windows users by installing a certain set of drivers that make Windows Server run approximately 50% faster than before. This is a big deal since our Unix/Linux flavours have had this advantage for quite some time.

CTC: Are your customer service and support teams fluent in English?

Of course. In addition to English, we provide support in Spanish, Polish and Swedish.

CTC: What is City Cloud's unique proposition?

Easy to use, understandable, flexible, scalable and cost-efficient computing power for all your needs.

CTC: What future products and plans are City Cloud working on?

We will launch multiple datacentres for our customers to be able to scale with extreme redundancy very easily. We are also working on a storage solution similar to Amazon's S3 – but with a regional angle and, as always, around half the price.

City Cloud is provided by City Network Hosting. We've built a platform where security, scalability and redundancy are our leading keywords.
We have had help from Dell, Cisco and Enomaly – all leading players in the operation of cloud computing platforms. City Network is one of Scandinavia's largest hosting providers, with more than 15,000 customers.

### Private Cloud is NOT a Hosted Service

By Terry Pullin

Gone are the days of cloud computing only being delivered by a provider as a hosted service. Cloud computing is often defined as “the delivery of IT over the internet”, but in reality this is not true. Though hosting providers are embracing cloud computing as the underlying technology to deliver services in this way, the reality is that cloud computing is just the software that enables this. A range of cloud enablement (aka cloud orchestration) software solutions are available in the market which provide the key cloud computing benefits to businesses that cannot move their IT to a hosted environment.

To start to understand what this software achieves, it is important to know the difference between cloud and virtualisation. Virtualisation alone is NOT cloud computing. Virtualisation is often a key component of a cloud computing platform, but it is not the finished product. Virtualisation technology pools hardware resources from multiple, usually underutilised, hardware devices and makes the resources available as a single pool from which virtual servers draw their requirements. Cloud computing is a layer above virtualisation which adds many additional features and benefits: for example, hybrid computing environments which use a combination of physical and virtual server farms; multiple hypervisor control, meaning you can choose to use different virtualisation technologies for different servers; simplified management, in that you can manage these multiple infrastructures from one control panel; automation of many IT tasks; and ease of provisioning.

As mentioned above, hosting providers who have re-invented themselves as “cloud” hosting providers use this software alongside their virtualisation technology. The reason for this is simple: to control costs and make their services competitive. Many of their cost savings come from the automation and ease of management of cloud computing rather than from virtualisation. They also make additional profits by having the ability to quickly deploy new servers and services through automation of those workflows.

Now many businesses outside of the IT industry are using this software to make their own infrastructures more efficient and less costly. Though the ability to deploy cloud computing across internal infrastructure has been relatively hidden from mid-market and enterprise businesses, as the focus has been on supplying the ever-growing public cloud market, many non-IT businesses are now adopting this technology in-house and numerous established software providers are fulfilling the requirements.

The cloud computing model has received so much interest because of its claims to reduce cost and simplify IT. However, uptake in businesses that are either too large, or for whom data security is essential, has been slower than expected, because the options have been to move data and server processes to a hosting provider's infrastructure. With a private cloud in-house, businesses are now able to realise the benefits of cloud computing on their own infrastructure, eliminating the risks associated with the hosted model.
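To illustrate the kind of automation an orchestration layer adds on top of raw virtualisation, here is a minimal, hypothetical sketch of a single provisioning entry point that can target several hypervisor pools. The pool names, sizes and placement rule are invented for illustration; real orchestration products expose this through their own APIs and add storage, networking and approval workflows on top:

```python
# Hypothetical sketch of an orchestration-style provisioning call spanning
# several hypervisor pools. Names, sizes and the placement rule are invented.

from dataclasses import dataclass

@dataclass
class Pool:
    name: str
    hypervisor: str      # e.g. "vmware", "xen", "kvm"
    free_vcpus: int
    free_ram_gb: int

def provision(pools, vcpus, ram_gb):
    """Place a VM request on the first pool with enough headroom."""
    for pool in pools:
        if pool.free_vcpus >= vcpus and pool.free_ram_gb >= ram_gb:
            pool.free_vcpus -= vcpus
            pool.free_ram_gb -= ram_gb
            # A real orchestrator would now call the chosen hypervisor's own API,
            # attach storage and networking, and apply any approval workflow.
            return f"placed {vcpus} vCPU / {ram_gb} GB VM on '{pool.name}' ({pool.hypervisor})"
    raise RuntimeError("no internal pool has capacity: queue the request or burst to a public provider")

if __name__ == "__main__":
    pools = [Pool("prod-vmware", "vmware", 8, 32), Pool("dev-kvm", "kvm", 32, 128)]
    print(provision(pools, vcpus=16, ram_gb=64))   # lands on the KVM pool
```

The value described in the article lies less in this placement logic itself than in having one control point, spanning multiple hypervisors, from which such workflows can be automated.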
For more information on how Backbone Connect are deploying cloud computing for our customers, why not start at our website: www.backboneconnect.co.uk

About the author: Terry Pullin is Head of Cloud Enablement Services and is fast becoming one of the most established cloud thought leaders in the UK, receiving regular requests to speak at industry events about cloud computing. Why not follow him on Twitter for his personal insight on the technology?

### Is a Company Dedicated to Providing Hosted Desktops to Accountants Too Niche?

Hosted Accountants have been set up to provide hosted desktop solutions to accountancy firms and their clients – is this too niche? Well, the three directors, who have a combined total of over 35 years’ experience in providing technology solutions to accountants, don’t think so. They’ve increasingly been hearing from firms about how much time and money is being spent on keeping smaller firms’ IT infrastructure up to scratch.

Why is a hosted desktop just right for accountancy firms?

In total there are somewhere in the region of 14,000 firms of accountants in the UK. The majority employ up to 10 members of staff. In these smaller firms it’s difficult to dedicate a large budget to IT, although using technology properly is becoming increasingly important and offers a cost-effective way of competing with the larger firms. There are three potential ways to tackle the problem of increasingly sophisticated programs, one of which is the hosted desktop solution.

1) Bite the bullet and pay an external IT company, either on a monthly retainer or an hourly rate. Either way, costs can quickly get out of control. If the software provider releases an update to their compliance suite, this can take hours to sort out. Or, as and when a server needs replacing, this will once again take a large chunk out of an IT budget, both in terms of the expense of the server and in configuring the system.

2) Take care of the programs in-house with one of the firm’s accountants. This in itself directly costs the firm money. When the accountant is working on the practice’s IT systems, they are not completing chargeable work. It’s also unlikely that they are trained in IT, so they will generally take longer to complete an upgrade or fix a problem.

3) Take a hosted desktop solution where you receive a fully managed IT service. With Hosted Accountants there’s a 24/7 support line and, with the premium support service, you will receive a response within 15 minutes.

Hosted desktops and accountancy software

There are some products that you purchase and keep on using without having to worry about upgrading, unless of course you always want the latest features. For instance, if you’re using a CRM system to manage your marketing and track key dates, there’s not a huge amount that you would need to upgrade for, so you can sometimes skip one or two releases. However, with the software that accountants use, upgrades are usually driven by compliance. There are dates that have to be met by accountants and their clients, and if these dates are missed then financial penalties are often incurred. With HMRC changing the tax rates every year for self-assessment and corporation tax, the accountancy software suppliers have to update their systems. With the update of any software program there is always the risk of a buggy release. Frequently, additional patches are then sent out to fix any bugs in the programs.
With each patch, more and more time is spent on maintaining the systems. But it is essential these programs are updated. However, with a hosted desktop as a service solution, this mundane maintenance is taken care of by a support team. They know the infrastructure of the various programs inside out, so the updates can be applied and run very quickly. Your hosted desktop partner will also have economies of scale: when the update does arrive, the time spent on getting it to work is spread across many firms and not just one.

Hosted Accountants are dedicated to providing hosted desktop solutions to accountants and their clients. We know all of the major producers of accountancy software, so you can be sure that we will keep your systems available so you can work in any location at any time.

### CAPEX and OPEX: an Accountant's view

Laurence Moore is Chairman of Prime Accountants.

Jonathan Toni (Strategic Alliances Director, Compare the Cloud) has asked me to write a guest blog about the benefits derived by moving from a CAPEX model to an OPEX model when moving to the cloud. As a practising Chartered Accountant, he wants me to explain that to techy people, who apparently see this as a major benefit. It's an interesting question, because when recommending cloud technology to SME businesses I rarely, if ever, mention it. I probably should throw it in more often, but don't. The reason is, I don't see that as one of the benefits driving this technology change, but really as an extra benefit – a bonus resulting from a decision already made.

I hope Jonathan wasn't hoping for an in-depth explanation of the different accounting treatment of CAPEX and OPEX because, if so, we will all nod off at about the same time. The simple benefit to the businesses I advise, from that aspect, is purely cashflow – no big upfront payments, reduced commitment and no long tie-ins. OK, that's the accounting bit done then.

So why am I passionate about the cloud? Two reasons: the benefits to a business, and secondly the benefit to my ability to advise and support businesses. We all know the business benefits in terms of secure data centres, remote logins from multiple locations and multiple users, no need for local backups and ease of updates, so I will explain what it means for my accountancy business when advising clients.

Traditional accounting practice and its traditional client

Business installs Sage on a PC in the office and hires bookkeepers who are reasonably good (problem number 1 – how does the business owner know that?). Bookkeepers 'keep the books', knock out the VAT returns, have a go at reconciling the bank and, a few months after the end of the year, send the accountant a backup. The accountant tries to find the same version of Sage or QuickBooks on his network, restores the backup, fiddles around with the numbers, enters them into his specialised accounting practice software and prints the year-end accounts out. He then either forgets to give the client the adjustments to make to their Sage/QB, or he sends them to the bookkeeper, who ignores them as they don't understand them. The client gets the accounts six months after the year end and sticks them in a desk drawer as too old to be of any use. The accountant sends the client his fee note; the client moans about it. Go back to the beginning and start again for the second year.

Cloud accounting practice and cloud client

- Accountant explains the cloud accounting system to the client, sets it up and gives the client login(s) and training.
- Agrees a fixed monthly fee for all services, including the cloud accounts subscription.
- The cloud accounts system connects to the online banking system, automatically downloading bank transactions daily.
- Client raises sales invoices in the office, or out on the road on a phone or tablet. Employees enter their expenses on a restricted login profile.
- Accountant answers any queries immediately they arise, as he has constant live access to the single ledger which everyone shares.
- Client scans his supplier invoices into Dropbox, shared with the accountant's staff.
- Accountant enters supplier invoices into the cloud system and reconciles bank transactions. If required, he sends statements to customers by email on behalf of the client, perhaps giving the customers a direct-dial number in his office for any queries they have.
- Accountant pays the client's agreed supplier invoices via his secure BACS bureau, using a file extracted from the cloud accounts system, so no one enters any payments into the banking system or writes cheques out.
- Client gets monthly accounts with an explanation, possibly compared to a budget set at the beginning of the year.
- Straight after the year end, the accountant completes the year-end accounts, with any adjustments entered directly into the live cloud accounts system.

The accountant's annual fee is less than the traditional costs of on-premise software, bookkeeper(s) and a traditional year-end accountant – an arrangement which resulted in no useful information being available to the business. This is all happening now, with real clients. And that's why I love the cloud... and don't care about CAPEX, OPEX, EBITDA, PBT, EIS, FRSSE – and nor do my clients.

### Big Brother in the Cloud

Following on from my last blog, “Data storage in and outside the UK”, I thought that I would contribute an update on this topic. For the benefit of those who didn’t read the last white paper, in summary it covered the risks associated with outsourcing data storage and the Data Protection Act (DPA). So, on with the update.

In November I read a very embarrassing article (for Microsoft) and I must admit, for the first time I had some sympathy for the software giant, even though most resellers of their technology wouldn’t agree with me of late (for obvious reasons). Back in June of this year, poor old Gordon Frazer (I use the term loosely), MD of Microsoft UK, announced to a room full of journalists that he could not guarantee that data stored in Microsoft’s European datacentres would not end up in the hands of the US government. Now, imagine the scene: he is announcing the highly publicised “Microsoft Office 365” product (at its launch) in London. The press had a field day, and what came next was an onslaught of criticism from all sides of the room.

So why would “poor old” Gordon say this, I hear you ask? Well, since the September 11th attacks on the World Trade Center (and the Pentagon), a new piece of US legislation came into play – Section 215 of the Patriot Act, “Access to records and other items under the Foreign Intelligence Surveillance Act”. So, what does this mean? Simply put, it allows the FBI to obtain ANY data from European companies that have their data stored in US-owned datacentres, even if the datacentres are based in Europe. If this wasn’t serious enough, the datacentre in question would be under a gagging order not to mention this to the individuals under suspicion.
Now, in most circumstances, people such as you and I would not be worried about this prospect, and would live safe in the knowledge that the nasty terrorists are being observed (which I 100% believe in). However, it’s the speculative uses that I am not overly happy about, and let me explain why. The Patriot Act is supposed to be linked to terrorism; however, we will not know how it’s being used. In fact, you never will, and if you object to your data having the ability to be in the hands of the US government, you yourself will automatically come under scrutiny for being a terrorist by not cooperating. See the dilemma?

Just to raise another eyebrow, how about this: when Microsoft was asked to comment on this, they declined. When HP and Amazon were asked, they didn’t even respond, and Dell and Salesforce suggested that they didn’t have a spokesperson available! Interesting, hey?

So, it comes back to my first paper on where you should store your data: within the country you reside in. Unfortunately, this statement is now not good enough to protect the security of your IP (intellectual property); now it goes one level deeper – is the datacentre US-owned? Sound crazy? Then you would be thinking like me, and expect a visit from the FBI for being uncooperative. Seriously though, this has big ramifications for the online hosting marketplace and data security. Imagine what data could be accessed without your knowledge: financial data, health records, IP that you have been working on, pictures of the ex-wife... well, maybe that’s a bit too far, but you get the point. Many UK-based companies are now considering this topic seriously, and the fact that it is not commonly known, and that there is very little, if any, publicity on this issue, makes me think.

Cloud computing is a complex subject as it is, with its own questions surrounding the variants of technology. This, together with the topics of data sensitivity and security, could “cloud” (pardon the pun) the issue even more and potentially steer people away from the technology. What do I think? Well, I can see a move to a more local approach to data storage. I am a patriot of this country called Great Britain and, wherever possible, I adopt and suggest this approach; this topic is just another reason for justifying it, and one that I am sure the UK government would agree with me on (G-Cloud). For a more in-depth feature on this topic, please read the November issue of Computing (Nov 3rd) – though that article does raise both sets of arguments, with a slant and a big centre on Rackspace, a US-owned datacentre operator.

### 6 Reasons to Re-evaluate Your Email Security Service

By Samantha Stone, Vice President of Marketing for Mimecast North America

If you were to conduct an analysis of my personal email inbox you’d discover that I’m a balding man with performance problems who’s a particular fan of grunge rock and is getting my nursing degree online. The only problem is I’m a high-energy marketing professional and mother of four who admits a passion for 80s rock and can’t get enough of online cooking shows. Sadly, my personal email inbox receives the same amount of spam as most corporate email addresses, and IT professionals in businesses of all sizes are tasked with keeping up with a growing array of creative email attacks. Luckily there are a number of quality cloud-based technology solutions that can not only mitigate risk to the organization but also eliminate the resource drain on IT.
Here are six questions you should consider to determine if a new email security provider might be right for you:

1. Have you suffered unplanned service outages from your email security provider?
2. Do you ever have to wait days or weeks for a response to a support inquiry, or are multiple requests required before you can talk to a live person?
3. Have viruses breached your provider’s defenses?
4. Are you being told to upgrade your service at additional cost to receive better protection capabilities?
5. Is spam still plaguing your email users?
6. Did you find hidden clauses presented to you as part of your provider’s Service Level Agreement that could present privacy concerns?

If you are ready to evaluate new email management solutions, I encourage you to check out Mimecast. With guaranteed SLAs, predictable pricing and a customer-first approach to support, we offer a compelling approach to email security. But don’t take my word for it… listen to what our customers have to say: http://www.mimecast.com/Customers/Testimonials/

### GUEST BLOG

Compare the Cloud is the go-to platform for cloud computing, Internet of Things (IoT), big data, and technology enthusiasts. Our audience is highly specialised, comprising professionals, decision-makers, and thought leaders in the tech sector. With our trusted reputation in the industry, submitting a blog to the Compare the Cloud website offers you a unique opportunity to share your expertise with a highly engaged and relevant audience.

Why Write for Compare the Cloud?

By contributing to our trusted platform, you are positioning yourself and your brand as industry leaders. Our readers are keen to stay informed about the latest trends, insights, and developments in cloud computing, IoT, and big data. Our content is shared globally, so you'll not only engage with a highly targeted audience but also benefit from our global reach, significantly expanding your visibility and impact across international markets.

Content Requirements: To maintain the integrity and quality of our content, we have a few guidelines:

- Original & Exclusive Content: Your blog must be unique to Compare the Cloud. We do not accept pieces that have been previously published or written by AI. Additionally, your submission should not be republished elsewhere in the future.
- Topic Restrictions: Please refrain from mentioning specific cryptocurrencies or gambling. Our focus is on cloud computing, IoT, and big data, and we aim to keep our content relevant and valuable to our readers.
- External Links: We do not allow links to gambling sites or content that is suitable only for those aged 18+, ensuring that our platform remains professional.

Are you interested in contributing? You can check out the types of blogs we welcome by clicking here. If you have any queries or need further clarification, feel free to reach out to us at pr@comparethecloud.net. Ready to share your insights with our audience? Submit your blog here – we look forward to showcasing your brilliance on Compare the Cloud!
image_width="eyJhbGwiOiI0NSIsInBob25lIjoiMTAwIn0=" image_radius="10" mc1_el="22" cat_txt_hover="var(--metro-blue)" modules_cat_border="1" modules_category_radius="100" cat_border="#eaeaea" cat_border_hover="#eaeaea" offset="1" post_ids="" excl_show="none"][td_flex_block_1 modules_on_row="eyJhbGwiOiIyMCUiLCJwaG9uZSI6IjEwMCUifQ==" modules_category="above" show_btn="none" show_excerpt="eyJwb3J0cmFpdCI6Im5vbmUiLCJhbGwiOiJub25lIn0=" ajax_pagination="" td_ajax_preloading="preload" sort="featured" category_id="" f_title_font_size="eyJhbGwiOiIxNiIsImxhbmRzY2FwZSI6IjE0IiwicG9ydHJhaXQiOiIxMyJ9" f_title_font_line_height="1.4" show_cat="" meta_info_border_style="" meta_padding="20px 0 0 0" modules_divider="" image_size="" meta_info_align="" image_floated="" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGUiOnsiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGVfbWF4X3dpZHRoIjoxMTQwLCJsYW5kc2NhcGVfbWluX3dpZHRoIjoxMDE5LCJwb3J0cmFpdCI6eyJtYXJnaW4tYm90dG9tIjoiNDAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7Im1hcmdpbi1ib3R0b20iOiI1MCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" meta_info_horiz="content-horiz-left" f_title_font_weight="800" image_height="120" all_modules_space="eyJwb3J0cmFpdCI6IjAiLCJwaG9uZSI6IjMwIiwiYWxsIjoiMCJ9" art_excerpt="0" art_title="10px 0 0 0" btn_bg="rgba(255,255,255,0)" f_btn_font_transform="uppercase" f_btn_font_weight="" f_cat_font_transform="" f_cat_font_weight="600" btn_bg_hover="rgba(255,255,255,0)" meta_width="eyJwaG9uZSI6IjEwMCUifQ==" show_audio="" show_com="none" show_date="none" show_author="none" mc1_el="10" f_title_font_family="394" f_title_font_transform="" title_txt="#000000" title_txt_hover="var(--metro-red)" cat_txt="var(--metro-red)" cat_bg="rgba(255,255,255,0)" cat_bg_hover="rgba(255,255,255,0)" modules_category_padding="eyJhbGwiOiI2cHggMTJweCIsInBvcnRyYWl0IjoiNHB4IDEwcHgifQ==" f_cat_font_family="394" f_cat_font_size="eyJhbGwiOiIxMyIsImxhbmRzY2FwZSI6IjEyIiwicG9ydHJhaXQiOiIxMSJ9" f_cat_font_line_height="1" modules_gap="eyJhbGwiOiIyMCIsImxhbmRzY2FwZSI6IjE1IiwicG9ydHJhaXQiOiIxMCIsInBob25lIjoiMCJ9" image_radius="10" cat_txt_hover="var(--metro-blue)" modules_cat_border="1" modules_category_radius="100" cat_border="#eaeaea" cat_border_hover="#eaeaea" excl_padd="4px 6px 3px" excl_radius="2" f_excl_font_family="394" f_excl_font_size="11" f_excl_font_weight="500" excl_bg="var(--metro-exclusive)"][tdm_block_button button_size="tdm-btn-lg" button_display="" content_align_horizontal="content-horiz-center" button_text="More from Technology" button_url="/category/articles/" tds_button1-border_radius="4" tds_button1-background_color="var(--metro-red)" tds_button1-background_hover_color="var(--metro-blue)" tds_button1-f_btn_text_font_family="394" tds_button1-f_btn_text_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTEifQ==" tds_button1-f_btn_text_font_line_height="1" tds_button1-f_btn_text_font_weight="600" tds_button1-f_btn_text_font_spacing="0.5" tds_button1-f_btn_text_font_transform="uppercase" button_padding="eyJhbGwiOiIyMHB4IDI4cHgiLCJwb3J0cmFpdCI6IjE4cHggMjRweCJ9"][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content" 
tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjEwMCIsInBhZGRpbmctdG9wIjoiODAiLCJwYWRkaW5nLWJvdHRvbSI6IjEwMCIsImJhY2tncm91bmQtY29sb3IiOiIjZjRmNGY0IiwiZGlzcGxheSI6IiJ9LCJwaG9uZSI6eyJtYXJnaW4tYm90dG9tIjoiNjAiLCJwYWRkaW5nLXRvcCI6IjQwIiwicGFkZGluZy1ib3R0b20iOiI2MCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3NjcsImxhbmRzY2FwZSI6eyJtYXJnaW4tYm90dG9tIjoiODAiLCJwYWRkaW5nLXRvcCI6IjYwIiwicGFkZGluZy1ib3R0b20iOiI4MCIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwicGFkZGluZy1yaWdodCI6IjUiLCJwYWRkaW5nLWxlZnQiOiI1IiwiZGlzcGxheSI6IiJ9LCJwb3J0cmFpdF9tYXhfd2lkdGgiOjEwMTgsInBvcnRyYWl0X21pbl93aWR0aCI6NzY4fQ=="][vc_column width="1/1"][vc_row_inner][vc_column_inner][tdm_block_column_title title_text="Q2xvdWQlMjBDb21wdXRpbmc=" title_tag="h3" title_size="tdm-title-sm" tds_title1-f_title_font_family="467" tds_title1-f_title_font_weight="800" tds_title1-f_title_font_size="eyJhbGwiOiI2MCIsInBob25lIjoiNTQiLCJsYW5kc2NhcGUiOiI1MCIsInBvcnRyYWl0IjoiNDAifQ==" tds_title1-title_color="#000000" tdc_css="eyJhbGwiOnsicGFkZGluZy1ib3R0b20iOiIzNSIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlIjp7InBhZGRpbmctYm90dG9tIjoiMzAiLCJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZV9tYXhfd2lkdGgiOjExNDAsImxhbmRzY2FwZV9taW5fd2lkdGgiOjEwMTksInBvcnRyYWl0Ijp7InBhZGRpbmctYm90dG9tIjoiMjUiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7ImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" tds_title1-f_title_font_line_height="1.1" tds_title="tds_title1" tds_title2-line_width="100%" tds_title2-line_color="eyJ0eXBlIjoiZ3JhZGllbnQiLCJjb2xvcjEiOiIjZjQzZjNmIiwiY29sb3IyIjoiI2Y0M2YzZiIsIm1peGVkQ29sb3JzIjpbXSwiZGVncmVlIjoiLTkwIiwiY3NzIjoiYmFja2dyb3VuZC1jb2xvcjogI2Y0M2YzZjsiLCJjc3NQYXJhbXMiOiIwZGVnLCNmNDNmM2YsI2Y0M2YzZiJ9" content_align_horizontal="content-horiz-center"][/vc_column_inner][/vc_row_inner][vc_row_inner gap="eyJhbGwiOiIyMCIsImxhbmRzY2FwZSI6IjE1IiwicG9ydHJhaXQiOiIxMCIsInBob25lIjoiMCJ9" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGUiOnsibWFyZ2luLWJvdHRvbSI6IjUwIiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGVfbWF4X3dpZHRoIjoxMTQwLCJsYW5kc2NhcGVfbWluX3dpZHRoIjoxMDE5LCJwb3J0cmFpdCI6eyJtYXJnaW4tYm90dG9tIjoiNDAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7Im1hcmdpbi1ib3R0b20iOiI1MCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9"][vc_column_inner width="1/2"][td_flex_block_1 modules_on_row="" limit="1" modules_category="above" show_btn="none" show_excerpt="eyJsYW5kc2NhcGUiOiJub25lIiwicG9ydHJhaXQiOiJub25lIiwiYWxsIjoibm9uZSJ9" ajax_pagination="" td_ajax_preloading="preload" sort="" category_id="394" f_title_font_size="eyJhbGwiOiI1MCIsInBob25lIjoiNDIiLCJsYW5kc2NhcGUiOiI0MiIsInBvcnRyYWl0IjoiMzYifQ==" f_title_font_line_height="1.2" show_cat="" meta_info_border_style="" meta_padding="eyJhbGwiOiIzMHB4IDAgMCAwIiwicG9ydHJhaXQiOiIyMHB4IDAgMCIsImxhbmRzY2FwZSI6IjI1cHggMCAwICIsInBob25lIjoiMjVweCAwIDAgMCJ9" modules_divider="" image_size="" meta_info_align="center" image_floated="" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZSI6eyJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZV9tYXhfd2lkdGgiOjExNDAsImxhbmRzY2FwZV9taW5fd2lkdGgiOjEwMTksInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2N30=" meta_info_horiz="content-horiz-left" f_title_font_weight="700" image_height="eyJhbGwiOiI3MCIsInBvcnRyYWl0IjoiOTAifQ==" all_modules_space="0" art_excerpt="0" 
art_title="eyJhbGwiOiIxMHB4IDAgMCIsInBvcnRyYWl0IjoiOHB4IDAgMCIsImxhbmRzY2FwZSI6IjEycHggMCAwIDAiLCJwaG9uZSI6IjEwcHggMCAwIn0=" btn_bg="rgba(255,255,255,0)" f_btn_font_transform="uppercase" f_btn_font_weight="" f_ex_font_weight="500" f_cat_font_transform="" f_cat_font_weight="600" btn_bg_hover="rgba(255,255,255,0)" f_ex_font_size="eyJhbGwiOiIxNCIsImxhbmRzY2FwZSI6IjEzIiwicG9ydHJhaXQiOiIxMyJ9" f_ex_font_line_height="1.4" meta_width="eyJwaG9uZSI6IjEwMCUifQ==" show_audio="" show_com="none" show_date="none" show_author="none" f_title_font_family="467" f_title_font_transform="" f_ex_font_family="394" title_txt="#000000" title_txt_hover="var(--metro-blue)" ex_txt="#000000" cat_txt="var(--metro-blue)" cat_bg="rgba(255,255,255,0)" cat_bg_hover="rgba(255,255,255,0)" author_txt="#000000" author_txt_hover="var(--metro-blue)" modules_category_padding="eyJhbGwiOiI4cHggMTRweCIsInBvcnRyYWl0IjoiNnB4IDEycHgifQ==" f_cat_font_family="394" f_cat_font_size="eyJhbGwiOiIxMyIsInBvcnRyYWl0IjoiMTIifQ==" f_cat_font_line_height="1" excerpt_middle="yes" modules_gap="0" image_width="eyJhbGwiOiIxMDAiLCJwaG9uZSI6IjEwMCJ9" image_radius="10" cat_txt_hover="var(--metro-blue-acc)" modules_cat_border="1" modules_category_radius="100" cat_border="#eaeaea" cat_border_hover="#eaeaea" review_size="0" offset="1"][/vc_column_inner][vc_column_inner width="1/2"][td_flex_block_1 modules_on_row="eyJwaG9uZSI6IjEwMCUifQ==" image_size="" image_floated="hidden" image_width="eyJwaG9uZSI6IjMwIn0=" image_height="eyJwaG9uZSI6IjExMCJ9" show_btn="none" show_excerpt="eyJwaG9uZSI6Im5vbmUiLCJhbGwiOiJub25lIn0=" show_com="eyJwaG9uZSI6Im5vbmUiLCJhbGwiOiJub25lIn0=" show_author="" show_cat="none" meta_padding="0" f_title_font_size="eyJhbGwiOiIzMCIsInBvcnRyYWl0IjoiMjAiLCJsYW5kc2NhcGUiOiIyNSIsInBob25lIjoiMjUifQ==" f_title_font_line_height="1.4" f_title_font_weight="800" all_modules_space="eyJsYW5kc2NhcGUiOiIyMCIsInBvcnRyYWl0IjoiMTUiLCJhbGwiOiIyNSJ9" category_id="42" show_date="" art_excerpt="0" show_review="none" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2N30=" f_title_font_family="467" mc1_el="10" title_txt_hover="var(--metro-blue)" title_txt="#000000" art_title="eyJhbGwiOiIwIDAgMTBweCIsInBvcnRyYWl0IjoiMCAwIDhweCJ9" modules_border_size="1px 0 0 0" m_padding="eyJhbGwiOiIyMHB4IDAgMCAwIiwibGFuZHNjYXBlIjoiMTVweCAwIDAgMCIsInBvcnRyYWl0IjoiMTVweCAwIDAgMCJ9" modules_gap="eyJhbGwiOiI0MCIsImxhbmRzY2FwZSI6IjMwIiwicG9ydHJhaXQiOiIyMCJ9" f_meta_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTAifQ==" f_meta_font_line_height="1" f_meta_font_weight="" f_meta_font_family="394" offset="1" review_size="0" author_txt="var(--metro-blue)" author_txt_hover="var(--metro-blue-acc)" excl_padd="4px 6px 3px" f_excl_font_family="394" f_excl_font_size="11" f_excl_font_weight="500" excl_radius="2" excl_bg="var(--metro-exclusive)"][/vc_column_inner][/vc_row_inner][tdm_block_button button_size="tdm-btn-lg" button_display="" content_align_horizontal="content-horiz-center" button_text="View more Cloud" button_url="/category/articles/cloud/" tds_button1-border_radius="4" tds_button1-background_color="#000000" tds_button1-background_hover_color="var(--metro-blue)" tds_button1-f_btn_text_font_family="394" tds_button1-f_btn_text_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTEifQ==" tds_button1-f_btn_text_font_line_height="1" tds_button1-f_btn_text_font_weight="600" tds_button1-f_btn_text_font_spacing="0.5" 
### Contributors

Many people have contributed to this website and we are thankful to them all for their hard work.

### Videos

Stories worth seeing. Visually capturing the stories, ideas and views from across the technology industry, this video section features some of our favourite content.

### Our Brands

### Terms and Conditions

1. INTRODUCTION 1.1. This document (together with any documents referred to in it) tells you the terms and conditions upon which we sell and supply the services (the Services) listed on this website (the ‘Website’) to you. 1.2. Before confirming your order please: 1.2.1. Read through these terms and conditions (the ‘Conditions’) and in particular our cancellations and returns policy at clause 11 and the limitation of our liability and your indemnity at clause 15. 1.2.2. Print a copy for future reference. 1.2.3. Read our privacy policy regarding your personal information. 1.3. By ordering any of the Services listed on this Website, you agree to be legally bound by these Conditions. You will be unable to proceed with your purchase if you do not accept these terms and conditions as may be modified or amended and posted on this Website from time to time. 1.4. We reserve the right to revise and amend the Website, our disclaimers and the Conditions at any time without notice to you.
Your continued use of the Website (or any part thereof) following a change shall be deemed to be your acceptance of such change. It is your responsibility to check regularly to determine whether we have changed these Conditions. 2.      ABOUT US 2.1.           This Website is owned and operated by Compare the Cloud limited (‘we’/’us’/’our’), a limited company registered in England and Wales under company number: 0765758 having our registered office at Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG.  Our VAT Number is: 122 2531 64. 3.      COMMUNICATIONS 3.1.           You agree that email and other electronic communications can be used as a long-distance means of communication and acknowledge that all contracts, notices, information and other communications that we provide to you electronically comply with any legal requirement that such communications be in writing. 3.2.           We will contact you by email or provide you with information by posting notices on our Website. 4.      OVERSEAS ORDERS 4.1.           Our Website is only intended for use by customers resident in England, Wales, Scotland and Northern Ireland (the United Kingdom). Unless your product is available within the UK 4.2.           We may accept your order if you are resident in the European Economic Area (EEA), subject to reserving a right to amend the specifications or standards of the Services offered on the Website and/or these Conditions or to refuse to accept an order for our Services from you, if it will put an excessive strain on our business or if we have an objective reason for doing so. If we accept your order, you will be liable to pay for all and any additional costs that we incur in order to facilitate your order. You will have an opportunity to cancel your order in case the additional costs are not acceptable. 4.3.           If we agree to supply any Services ordered from the Website for delivery outside the United Kingdom they may be subject to import duties and/or additional taxes or expenses incurred due to complying with foreign regulatory requirements or laws. You will be responsible for payment of any such duties and/or taxes in addition to our price including VAT. Please note that we have no control over these charges and cannot predict their amount. Please contact your local customs office or taxation authority for further information before placing your order. 4.4.           You must comply with all applicable laws and regulations of the country for which the Services are destined. We will not be liable for any breach by you of any such laws. 5.      REGISTRATION 5.1.           When registering on the Website you must choose a username and password. You are responsible for all actions taken under your chosen username and password. 5.2.           By registering on the Website you undertake: 5.2.1.               That all the details you provide to us for the purpose of registering on the Website and purchasing the Services are true, accurate, current and complete in all respects 5.2.2.               To notify us immediately of any changes to the information provided on registration or to your personal information 5.2.3.               That you are over 18 or if under 18 you have a parent or guardian’s permission to register with and purchase the Services from this Website in conjunction with and under their supervision 5.2.4.               To only use the Website using your own username and password 5.2.5.               To make every effort to keep your password safe 5.2.6.               
Not to disclose your password to anyone 5.2.7. To change your password immediately upon discovering that it has been compromised 5.2.8. To neither transfer nor sell your username or password to anyone, nor permit, either directly or indirectly, anyone other than you to use them 5.3. You authorise us to transmit your name, address and other personal information supplied by you (including updated information) to obtain information from third parties about you, including, but not limited to, credit reports and so that we may authenticate your identity. 5.4. We reserve the right to terminate an agreement formed with you pursuant to clause 9 below and to suspend or terminate your access to the Website immediately and without notice to you if: 5.4.1. You fail to make any payment to us when due 5.4.2. You breach these Conditions (repeatedly or otherwise) 5.4.3. You are impersonating any other person or entity 5.4.4. When requested by us to do so, you fail to provide us within a reasonable time with sufficient information to enable us to determine the accuracy and validity of any information supplied by you, or your identity 5.4.5. We suspect you have engaged, or are about to engage, or have in any way been involved, in fraudulent or illegal activity on the Website 6. ELIGIBILITY TO PURCHASE FROM THE WEBSITE 6.1. To be eligible to purchase the Services on this Website and lawfully enter into and form contracts with us, you must: 6.1.1. Be 18 years of age or over 6.1.2. Be legally capable of entering into a binding contract 6.1.3. Provide full details of an address in the United Kingdom or the European Economic Area (if you reside in the EEA) for the performance or delivery of the Services 6.2. If you are under 18, you may only use the Website in conjunction with, and under the supervision of, a parent or guardian. If you do not qualify, you must not use our Website. 7. PRICE 7.1. The prices of the Services are quoted on the Website. 7.2. Prices quoted are for performance of the Services in the United Kingdom unless otherwise specified. 7.3. Unless otherwise stated, the prices quoted exclude VAT. 7.4. We reserve the right, by giving notice to you at any time before delivery or performance of our obligations to you, to increase the price of the Services to reflect any increase in the cost to us due to any factor beyond our control (such as, without limitation, any foreign exchange fluctuation, significant increase in the costs of labour, materials or other costs of manufacture). In the unlikely event of this occurring, you shall be entitled to cancel the order at any time before we have commenced providing the Services. 8. PAYMENT 8.1. Payment can be made by any major prepay, credit or debit card or through an electronic payment account as explained on the order form. 8.2. By placing an order, you consent to payment being charged to your prepay/debit/credit card account or electronic payment account as provided on the order form. 8.3. Payment will be debited and cleared from your account before the provision of the Service to you. 8.4. When you pay for your order by card, we carry out certain checks which include obtaining authorisation from your card issuer to ensure you have adequate funds and for security reasons.
This may involve validating your name, address and other personal information supplied by you during the order process against appropriate third party databases including the card issuer, registered credit reference agencies and fraud prevention agencies. 8.5.           By accepting these Conditions you: 8.5.1.               Undertake that all the details you provide to us for the purpose of purchasing the Services are correct and  that the payment card you are using is your own and that there are sufficient funds to cover the cost of the Services ordered 8.5.2.               Undertake that any and all Services ordered by you are for your own private or domestic use only and not for resale 8.5.3.               Authorise us to transmit the payment and delivery information provided by you during the order process (included any updated information) for the purpose of obtaining authorisation from your card issuer to ensure you have adequate funds, to authenticate your identity, to validate your payment card and for other security reasons, such as fraud prevention 8.6.           We shall contact you should any problems occur with the authorisation of your card. 8.7.           We will take all reasonable care, in so far as it is in our power to do so, to keep the details of your order and payment secure, but in the absence of negligence on our part, we cannot be held liable for any loss you may suffer if a third party procures unauthorised access to any data you provide when accessing or ordering from our Website. 9.      ORDER PROCESS AND FORMATION OF A CONTRACT 9.1.           All orders are subject to acceptance and availability. If any Services ordered are not available, you will be notified by email and you will have the option either to wait until the item is available or to cancel your order. It is your responsibility to provide us with a valid email address so that we can contact you if necessary. 9.2.           Any order placed by you constitutes an offer to purchase the Services from us. All such offers received from you are subject to acceptance by us and we reserve the right to refuse any order placed by you at any time prior to acceptance, without providing an explanation. 9.3.           You shall be responsible for ensuring the accuracy of the details provided by you during the order process and we will not accept an order unless all details requested from you have been entered correctly. 9.4.           You agree that if we contact you to acknowledge receipt of your order such communication shall not amount to our acceptance of your offer to purchase the Services ordered by you from the Website. 9.5.           A contract between you and us (the ‘Contract’) incorporating these Conditions will only subsist after we have debited your payment card and have confirmed that we shall be providing the requested Service or made it available to be downloaded. We will send you an email to confirm this (a ‘Confirmation Notice’). The Confirmation Notice will amount to an acceptance of your offer to buy the Services from us. The Contract will only be formed when we send you the Confirmation Notice (whether or not you receive it). 9.6.           Where we agree to supply Services to you permanently or on an ongoing (continuous) basis, such as by subscription, they shall be provided for a minimum fixed period of time (the ‘Minimum Duration’). The length of the Minimum Duration will depend on which package or product you have selected to purchase and is provided on the Website. 9.7.           
The Contract will relate only to the Services stated in the Confirmation Notice.  We will not be obliged to supply any other Services which may have been part of your order until we have sent you a separate Confirmation Notice relating to it. 9.8.           You must check that the details contained in the Confirmation Notice are correct and you should print out and keep a copy of it. 9.9.           You will be subject to the version of our policies and Conditions in force at the time that you order the Services from us, unless: 9.9.1.               Any change to those policies or these Conditions is required to be made by law or governmental authority 9.9.2.               We notify you of any change to our policies or these Conditions before we send you the Confirmation Notice, in which case, we are entitled to assume that you have accepted it, unless we receive written notification from you to the contrary within seven working days of receipt of the Confirmation Notice 9.10.        In some cases, we accept orders as agents on behalf of third party sellers.  The resulting legal contract is between you and that third party seller, and is subject to the terms and conditions of that third party seller, which they will advise you of directly.  You should carefully review their terms and conditions applying to the transaction. 10.    DELIVERY 10.1.        The Services will be delivered to you at the address you provided during the order process which may be an address other than the billing address, but please note that extra documentation may be needed to comply with such orders. We may where appropriate and at our option, deliver all or part of the Services, to the email address you supplied on registration or such other email address that we agree to use to communicate with you. 10.2.        Any dates quoted for completing performance of the Service are approximate only. If no date is specified then it will take place within 30 days or a reasonable time of the date of the Confirmation Notice, unless there are exceptional circumstances.  10.3.        We shall not be liable for any delay in completing performance of the Service, however caused. 10.4.        The Services may be sent to you in instalments. 11.    CANCELLING YOUR CONTRACT AND RETURNS 11.1.        Cancelling before receiving a Confirmation Notice. 11.1.1.             You may cancel your order for the Services at any time prior to receiving a Confirmation Notice from us so long as you contact us in writing. You can send us a cancellation notice by sending an email to enquiries@comparethecloud.net or a letter to Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG. Your cancellation notice must quote your name, address, the name or a description of the services and your order reference number. 11.2.        Cancellation after receiving a Confirmation Notice. 11.2.1.             You are entitled to cancel your Contract and obtain a refund within 14 working days from the date of the Confirmation Notice.  This also applies, where appropriate and subject to clause 11.4., to items that are available to be downloaded. However, you will no longer have a right to cancel if, with your agreement, we have already commenced providing the Services to you before this period of time expires. We shall be deemed to have already commenced providing the Services, in circumstances where you have already downloaded products or materials that we made available to you, from the Website. 11.2.2.             
You may notify us of your wish to cancel by sending us a cancellation notice to enquiries@comparethecloud.net or a letter to Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG. Your cancellation notice must quote your name, address, the name or a description of the services and your order reference number. 11.2.3. Upon receiving your cancellation notice, we will contact you providing any necessary instructions which you will be required to follow. 11.2.4. So long as you have complied with your obligations under this clause, we will refund the purchase price to you by debiting the payment card you used to purchase the Services. 11.3. Cancelling ongoing Services. 11.3.1. Some of the Services that we provide are available for either a fixed period or unspecified period of time (such as Cloud Computing Subscriptions with third parties). In this clause these services are referred to respectively as ‘Ongoing Fixed Term Services’ and ‘Ongoing Non-Fixed Term Services’. 11.3.2. You are entitled to cancel your Contract for any Ongoing Fixed Term Services and Ongoing Non-Fixed Term Services that you have purchased and obtain a refund within 120 working days from the date of the Confirmation Notice. This also applies, where appropriate, and subject to clause 11.4., to items that are available to be downloaded. 11.3.3. You will no longer have a right to cancel any Ongoing Fixed Term Services if, with your agreement, we have already commenced providing this service to you within 14 working days from the date of the Confirmation Notice. We shall be deemed to have already commenced providing the Ongoing Fixed Term Services, in circumstances where you have already downloaded products or materials that we made available to you from the Website. 11.3.4. In these circumstances you cannot cancel the Contract for any Ongoing Fixed Term Services until the end of the Minimum Duration (even where the Minimum Duration is more than one year) and you will not be entitled to a refund. 11.3.5. Although you may notify us of your intention to cancel any Ongoing Fixed Term Services at any time, such notice will only take effect after the Minimum Duration has elapsed. You may notify us of your wish to cancel the Ongoing Fixed Term Services by sending us a cancellation notice to enquiries@comparethecloud.net or a letter to Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG. Your cancellation notice must quote your name, address, the name or a description of the services and your order reference number. 11.3.6. We may, at our sole discretion, agree to temporarily suspend any Ongoing Fixed Term Services if you will be unable to use the service, such as, for example, if you are going on holiday. We will require at least 7 working days’ advance notice for this to be implemented. The maximum period of suspension will be 2 weeks in any calendar year. You may use the same contact details for providing a cancellation notice to request the Ongoing Fixed Term Services to be suspended. 11.3.7. You will still have a right to cancel any Ongoing Non-Fixed Term Services if we have already commenced providing this service to you within 14 working days from the date of the Confirmation Notice, upon giving us 7 weeks’ advance notice in writing.
You may notify us of your wish to cancel by sending us a cancellation notice to enquiries@comparethecloud.net or a letter to Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG. Your cancellation notice must quote your name, address, the name or a description of the services and your order reference number. 11.4.        Exception to the right to cancel You will not have a right to cancel an order for any goods or services purchased from us, in the following situations: 11.4.1.             If you expressly agree to us beginning to provide any services before the end of the cancellation period.  11.4.2.             The Contract is for goods which are bespoke or have been personalised or which may deteriorate (such as food) 11.4.3.             The Contract is for goods and/or services the price of which is dependent on fluctuations in the financial market which cannot be controlled by us 11.4.4.             The Contract is for the sale of land, auctions and financial service agreements 11.4.5.             The Contract is for the supply of: 11.4.5.1.                   Audio or video recordings and computer software if unsealed by you 11.4.5.2.                   Audio or video recordings and software and other items that you have successfully downloaded where a free trial or demonstration was available to you to view or download  11.4.5.3.                   Newspapers, magazines and other periodicals 11.4.5.4.                   Gaming, betting and lottery services 11.5.        Incorrectly priced or described services 11.5.1.             Whilst we try and ensure that all the information on our Website is accurate, errors may occur. In the unlikely event that the price and/or description of an item listed on the Website has been incorrectly advertised, we will not be under any obligation to sell or provide those services to you. 11.5.2.             If we discover the error before sending you a Confirmation Notice we will at our discretion, either reject your order and notify you of such rejection, or inform you as soon as possible and give you the option of cancelling your order or reconfirming it at the correct price and/or description. If we give you the option of cancelling your order or reconfirming it at the correct price and/or description but either cannot contact you or do not receive your response within 14 days of sending you notification (whether or not you receive it), we will reject your order. 11.5.3.             If we discover the error after sending you a Confirmation Notice we may, at our discretion and without incurring any liability to you, cancel the Contract provided that the error is, in our reasonable opinion, obvious and unmistakable and could have reasonably been recognised by you. We will notify if we cancel the Contract. 11.5.4.             If your order is cancelled or rejected and you have already paid for the Services, you will receive a full refund in accordance with clause 11.7. 11.6.        Delivery by instalments 11.6.1.             The Services may be sent to you in instalments. You may cancel the outstanding part of your order and receive a refund, if you have already paid, of the purchase price of the outstanding Services in accordance with clause 11.7. 11.7.        Processing refunds 11.7.1.             We will notify you about your refund via email within a reasonable period of time.  We will usually process a refund as soon as possible and, in any case, within 30 days of the day we confirmed to you via e-mail that you are entitled to a refund.  
Refunds will be made by crediting the payment card or electronic payment account you used to purchase the Services. 12.    COMPLAINTS 12.1.        If you have a comment, concern or complaint about any Services you have purchased from us, please contact us via email at enquiries@comparethecloud.net or by post at Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG. 13.    INTELLECTUAL PROPERTY 13.1.        The content of the Website is protected by copyright (including design copyrights), trade marks, patent, database and other intellectual property rights and similar proprietary rights which include, (without limitation), all rights in materials, works, techniques, computer programs, source codes, data, technical information, trading business brand names, goodwill, service marks utility models, semi-conductor topography rights, the style or presentation of the goods or services, creations, inventions or improvements upon or additions to an invention, confidential information, know-how and any research effort relating to Compare the Cloud limited moral rights and any similar rights in any country (whether registered or unregistered and including applications for and the right to apply for them in any part of the world) and you acknowledge that the intellectual property rights in the material and content supplied as part of the Website shall remain with us or our licensors. 13.2.        You may download or copy the content and other downloadable items displayed on the Website subject to the condition that the material may only be used for personal non-commercial purposes. Copying or storing the contents of the Website for other than personal use is expressly prohibited. 13.3.        You may retrieve and display the content of the Website on a computer screen, store such content in electronic form on disk (but not any server or other storage device connected to a network) or print one copy of such content for your own personal, non-commercial use, provided you keep intact all and any copyright and proprietary notices. You may not otherwise reproduce, modify, copy or distribute or use for commercial purposes any of the materials or content on the Website. 13.4.        You acknowledge that any other use of the material and content of this Website is strictly prohibited and you agree not to (and agree not to assist or facilitate any third party to) copy, reproduce, transmit, publish, display, distribute, commercially exploit or create derivative works from such material and content. 13.5.        No licence is granted to you in these Conditions to use any of our trade marks or those of our affiliated companies. 13.6.        Services sold by us and Website content may be subject to copyright, trade mark or other intellectual property rights in favour of third parties. We acknowledge those rights. 14.    WEBSITE USE 14.1.        You are permitted to use the Website and the material contained in it only as expressly authorised by us under our terms of use. 15.    LIABILITY AND INDEMNITY 15.1.        Notwithstanding any other provision in the Conditions, nothing will affect or limit your statutory rights; or will exclude or limit our liability for: 15.1.1.             Death or personal injury resulting from our negligence 15.1.2.             Fraud or fraudulent misrepresentation 15.1.3.             Action pursuant to section 2(3) of the Consumer Protection Act 1987 15.1.4.             Any matter for which it would be unlawful for us to exclude or attempt to exclude our liability 15.2.        
The Website is provided on an ‘as is’ and ‘as available’ basis without any representation or endorsement made and we make no warranties or guarantees, whether express or implied, statutory or otherwise (unless otherwise expressly stated in these Conditions or required by law) in relation to the information, materials, content or services found or offered on the Website for any particular purpose or any transaction that may be conducted on or through the Website including but not limited to, implied warranties of non-infringement, compatibility, timeliness, performance, security, accuracy, condition or completeness, or any implied warranty arising from course of dealing or usage or trade custom. 15.3.        We will not be liable if the Website is unavailable at any time. 15.4.        We make no representation or warranty of any kind express or implied statutory or otherwise regarding the availability of the Website or that it will be timely or error-free, that defects will be corrected, or that the Website or the server that makes it available are free of viruses or bugs. 15.5.        We will not be responsible or liable to you for any loss of content or material uploaded or transmitted through the Website and we accept no liability of any kind for any loss or damage resulting from action taken in reliance on material or information contained on the Website. 15.6.        We cannot guarantee and cannot be responsible for the security or privacy of the Website and any information provided by you. You must bear the risk associated with the use of the internet. In particular, we will not be liable for any damage or loss caused by a distributed denial-of-service attack, any viruses trojans, worms, logic bombs, keystroke loggers, spyware, adware or other material which is malicious or technologically harmful that may infect your computer, peripheral computer equipment, computer programs, data or other proprietary material as a result of your use of the Website or you downloading any material posted or sold on the Website or from any website linked to it. 15.7.        We will use all reasonable endeavours to carry out our obligations within a reasonable period of time but will not be liable to you for any loss, costs or expenses arising directly or indirectly from any delays in doing so. 15.8.        We will not be liable, in contract or tort (including, without limitation, negligence), or in respect of pre-contract or other representations (other than fraudulent misrepresentations) or otherwise for: 15.8.1.             any economic losses (including without limitation loss of revenues, profits, contracts, business or anticipated savings and any other consequential loss); or 15.8.2.             any loss of goodwill or reputation; or 15.8.3.             any special or indirect losses; or 15.8.4.             any loss of data; or 15.8.5.             wasted management or office time; or 15.8.6.             any other loss or damage of any kind suffered or incurred arising out of or in connection with the provision of any matter under these Conditions and/or the Contract and/or the use of this Website or any aspect related to your purchase of the Services even if such losses are foreseeable or result from a deliberate breach of these Conditions by us that would entitle you to terminate the Contract between us or as a result of any action we have taken in response to your breach of these Conditions. 
Without prejudice to the terms of this clause and in the event that we are unable to rely upon it, our liability for all and any losses you suffer as a result of us breaking the Contract, whether or not deliberate, including those listed in clauses 15.8.1.to 15.8.6., is strictly limited to the purchase price of the Services you purchased. 15.9.        If you buy any goods or services from a third party seller through our Website, the seller’s individual liability will be set out in their own terms and conditions. 15.10.      You agree to fully indemnify, defend and hold us, and our officers, directors, employees and suppliers, harmless immediately on demand, from and against all claims, including but not limited to losses (including loss of profit, revenue, goodwill or reputation), costs and expenses, including reasonable administrative and legal costs, arising out of any breach of these Conditions by you, or any other liabilities arising out of your use of this Website or any other person accessing the Website using your personal information with your authority. 15.11.      This clause does not affect your statutory rights as a consumer, nor does it affect your contractual cancellation rights. 16.    REVIEWS 16.1.        You acknowledge that any review, feedback or rating which you leave may be published by us on the Website and you agree that it may be displayed for as long as we consider appropriate and that the content may be syndicated to our other websites, publications or marketing materials. 16.2.        You undertake that any review, feedback or rating that you write shall: 16.2.1.             Comply with applicable law in the UK and the law in any country from which they are posted 16.2.2.             Be factually accurate 16.2.3.             Contain genuinely held opinions (where applicable) 16.2.4.             Not contain any material which is either defamatory, threatening, obscene, abusive, offensive, hateful, inflammatory or is likely to harass, upset, annoy, alarm, embarrass or invade the privacy of, any person or be deceiving 16.2.5.             Not promote or advocate an unlawful act or activity, discrimination, sexually explicit material or violence 16.2.6.             Not infringe any trademark, copyright (including design rights), database right, or other intellectual property rights of any other person or breach of any legal duty you owe to a third party 16.2.7.             Not be used to impersonate any person, or to misrepresent your identity 16.3.        You agree to indemnify and hold us harmless against any claim or action brought by third parties, arising out of or in connection with any review, feedback or rating posted by you on the Website, including, without limitation, the violation of their privacy, defamatory statements or infringement of intellectual property rights. 16.4.        You grant us and our affiliate companies a non-exclusive, royalty-free worldwide license to use or edit any reviews posted by you. 16.5.        We reserve the right to publish, edit or remove any reviews without notifying you. 17.    FORCE MAJEURE 17.1.        We shall have no liability for delays or failures in delivery or performance of our obligations to you resulting from any act, events, omissions, failures or accidents that are outside of our control (‘Force Majeure’), which, without limitation, include: 17.1.1.             Strikes, lock-outs or other industrial action 17.1.2.             Shortages of labour, fuel, power, raw materials 17.1.3.             
Late, defective performance or non-performance by suppliers 17.1.4.             Private or public telecommunication, computer network failures or breakdown of equipment 17.1.5.             Civil commotion, riot, invasion, terrorist attack or threat of terrorist attack, war (whether declared or not) or threat or preparation for war. 17.1.6.             Fire, explosion, storm, flood, earthquake, subsidence, epidemic or other natural disaster or extreme weather conditions. 17.1.7.             Impossibility of the use of railways, shipping, aircraft, motor transport or other means of public or private transport. 17.1.8.             Acts, decrees, legislation, regulations or restrictions of any government 17.1.9.             Other causes, beyond our reasonable control 17.2.        Our performance will be deemed to be suspended for the period that the event of Force Majeure continues, and we will have an extension of time for performance for the duration of that period.  We will use our reasonable endeavours to minimise any delay caused by Force Majeure or to find a solution by which our obligations may be performed despite the Force Majeure event. We shall promptly notify you of any Force Majeure event giving details of it and (where possible) the extent and likely duration of any delay. 17.3.        Where the period of non-performance or delay in relation to any event of Force Majeure exceeds 30 days from the date of notice to you of the event of Force Majeure, either you or us may, by written notice to the other, terminate the Contract with immediate effect upon service. 18.    PRIVACY POLICY 18.1.        In order to monitor and improve customer service, we sometimes record telephone calls. 18.2.        We shall be entitled to process your data in accordance with the terms of our Privacy Policy. Please view this document for further information. All information provided by you will be treated securely and in accordance with the Data Protection Act 1998 (as amended). 18.3.        You can find full details of our Privacy Policy on the Website. 19.    THIRD PARTY RIGHTS 19.1.        Except for our affiliates, directors, employees or representatives, a person who is not a party to the Contract has no right under the Contracts (Rights of Third Parties) Act 1999 to enforce any term of the Contract but this does not affect any right or remedy of a third party that exists or is available apart from that Act. 20.    EXTERNAL LINKS 20.1.        To provide increased value and convenience to our users, we may provide links to other websites or resources for you to access at your sole discretion and risk. You acknowledge and agree that, as you have chosen to enter the linked website we are not responsible for the availability of such external sites or resources, and do not review or endorse and are not responsible or liable in any way, whether directly or indirectly, for: 20.1.1.             The privacy practices of such websites 20.1.2.             The content of such websites, including (without limitation) any advertising, content, products, goods or other materials or services on or available from such websites or resources 20.1.3.             The use which others make of these websites; or 20.1.4.             Any damage, loss or offence caused or alleged to be caused to you, arising from or in connection with the use of or reliance upon any such advertising, content, products, goods, materials or services available on and/or purchased by you from such external websites or resources 21.    
LINKING TO THE WEBSITE 21.1. You must not create a link to the Website from another website, document or any other source without first obtaining our prior written consent. 21.2. Any agreed link must be: 21.2.1. To the Website’s homepage 21.2.2. Established from a website or document that is owned by you and does not contain content that is offensive, controversial, infringes any intellectual property rights or other rights of any other person or does not comply in any way with the law in the UK and the law in any country from which they are hosted 21.2.3. Provided in such a way that is fair and legal and does not damage our reputation or take advantage of it 21.2.4. Established in such a way that does not suggest any form of association, approval or endorsement on our part where none exists 21.3. We have no obligation to inform you if the address of the Website home page changes and it is your responsibility to ensure that any link you provide to our homepage is at all times accurate. 21.4. We reserve the right to withdraw our consent without notice and without providing any reasons for withdrawal. Upon receiving such notice you must immediately remove the link and inform us once this has been done. 22. NOTICES 22.1. All notices given by you to us must be given to us at Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG or by using enquiries@comparethecloud.net. We may give notice as described in clause 3. 22.2. Notice will be deemed received and properly served immediately when posted on our Website, 24 hours after an email is sent, or three days after the date of posting of any letter. In proving the service of any notice, it will be sufficient to prove, in the case of a letter, that such letter was properly addressed, stamped and placed in the post and, in the case of an email, that such email was sent to the specified email address of the addressee. 23. ENTIRE AGREEMENT 23.1. The Contract represents the entire agreement between us in relation to the subject matter of the Contract and supersedes any prior agreement, understanding or arrangement between us, whether oral or in writing. 23.2. We each acknowledge that, in entering into a Contract, neither of us has relied on any express or implied representation, undertaking or promise given by the other from anything said or written in any negotiations between us prior to such Contract except as has been expressly incorporated in such Contract. 23.3. Neither of us shall have any remedy in respect of any untrue statement made by the other, whether orally or in writing, prior to the date of any Contract (unless such untrue statement was made fraudulently) and the other party’s only remedy shall be for breach of contract as provided in these Conditions. 24. GENERAL 24.1. We reserve the right to change the domain address of this Website and any services, products, product prices, product specifications and availability at any time. 24.2. All prices and descriptions supersede all previous publications. All product descriptions are approximate. 24.3. Every effort is made to keep information regarding stock availability on the Website up to date. However, we do not guarantee that this is the case, or that stock will always be available. 24.4.
If any provision of these terms and conditions is held by any competent authority to be invalid or unenforceable in whole or in part, the validity of the other provisions of the Contract and the remainder of the provision in question will not be affected. 24.5. All Contracts are concluded and available in English only. 24.6. If we fail, at any time during the term of a Contract, to insist upon strict performance of any of your obligations under it or any of these terms and conditions, or if we fail to exercise any of the rights or remedies to which we are entitled under the Contract, this shall not constitute a waiver of such rights or remedies and shall not relieve you from compliance with your obligations. 24.7. A waiver by us of any default shall not constitute a waiver of any subsequent default. 24.8. No waiver by us of any of these Conditions or of any other term of a Contract shall be effective unless it is expressly stated to be a waiver and is communicated to you in writing in accordance with clause 3. 24.9. Any Contract between you and us is binding on you and us and on our respective successors and assigns. You may not transfer, assign, charge or otherwise dispose of the Contract, or any of your rights or obligations arising under it, without our prior written consent. We may transfer, assign, charge, sub-contract or otherwise dispose of a Contract, or any of our rights or obligations arising under it, at any time during the term of the Contract. 25. GOVERNING LAW AND JURISDICTION 25.1. The Website is controlled and operated in the United Kingdom. 25.2. Every purchase you make shall be deemed performed in England and Wales. 25.3. The Conditions and any Contract brought into being as a result of usage of this Website will be governed by the laws of England and Wales and you irrevocably agree to submit to the exclusive jurisdiction of the courts of England and Wales. ### Website – terms and conditions of use Please read these terms and conditions carefully as they contain important information about your rights and obligations when using this website (the ‘Website’) and in particular clause 11.4. The Website is owned and operated by Compare the Cloud Limited (‘we’/’us’/’our’), a limited company registered in England and Wales under company number 07657589, having our registered office at Cargo Works, 1 – 2 Hatfields, Waterloo, LONDON, SE1 9PG. Our VAT Number is: 122 2531 64. The term ‘you’ refers to the user or viewer of our Website. By browsing on or using the Website you are agreeing to comply with and be bound by these terms and conditions which, together with our privacy policy, govern our relationship with you regarding the use of our Website. 1. ACCESS 1.1. You will be able to access parts of the Website without having to register any details with us. However, from time to time certain areas of this Website may be accessible only if you are a registered user. 1.2. You are responsible for making all arrangements necessary for you to have access to our Website. You are also responsible for ensuring that all persons who access our Website through your internet connection are aware of these terms, and that they comply with them. 1.3. We make reasonable efforts to ensure that this Website is available to view and use 24 hours a day throughout each year; however, this is not guaranteed. 
The Website may be temporarily unavailable at any time because of: server or systems failure or other technical issues; reasons that are beyond our control; required updating, maintenance or repair. 1.4. Where possible we will try to give you advance warning of maintenance issues but shall not be obliged to do so. 2. REGISTERING ON THIS WEBSITE 2.1. When registering on the Website you must choose a username and password. You are responsible for all actions taken under your chosen username and password. 2.2. By registering on the Website you undertake: 2.2.1. That all the details you provide to us for the purpose of registering on the Website are true, accurate, current and complete in all respects 2.2.2. You will notify us immediately of any changes to the information provided on registration 2.2.3. You are over 18 or if under 18 you have a parent or guardian’s permission to register with the Website in conjunction with and under their supervision 2.2.4. To only use the Website using your own username and password 2.2.5. To make every effort to keep your password safe 2.2.6. Not to disclose your password to anyone 2.2.7. To change your password immediately upon discovering that it has been compromised 2.2.8. Neither to transfer nor sell your username or password to anyone, nor permit, either directly or indirectly, anyone other than you to use them 2.3. You authorise us to transmit your name, address and other personal information supplied by you (including updated information) to obtain information from third parties about you, including, but not limited to, credit reports and so that we may authenticate your identity. 3. ELIGIBILITY TO PURCHASE FROM THE WEBSITE 3.1. To be eligible to purchase the Services on this Website and lawfully enter into and form contracts with us, you must: 3.1.1. Be 18 years of age or over 3.1.2. Be legally capable of entering into a binding contract 3.1.3. Provide full details of an address in the United Kingdom or the European Economic Area (if you reside in the EEA) for the performance or delivery of the Services 3.2. If you are under 18, you may only use the Website in conjunction with, and under the supervision of, a parent or guardian. If you do not qualify, you must not use our Website. 4. INTELLECTUAL PROPERTY 4.1. The content of the Website is protected by copyright (including design copyrights), trade marks, patent, database and other intellectual property rights and similar proprietary rights which include (without limitation) all rights in materials, works, techniques, computer programs, source codes, data, technical information, trading, business and brand names, goodwill, service marks, utility models, semi-conductor topography rights, the style or presentation of the goods or services, creations, inventions or improvements upon or additions to an invention, confidential information, know-how and any research effort relating to Compare the Cloud Limited, moral rights and any similar rights in any country (whether registered or unregistered and including applications for and the right to apply for them in any part of the world). 4.2. You acknowledge that the intellectual property rights in the material and content supplied as part of the Website shall remain with us or our licensors. 4.3.           
You may download or copy the content and other downloadable items displayed on the Website subject to the condition that the material may only be used for personal non-commercial purposes. Copying or storing the contents of the Website for other than personal use is expressly prohibited. 4.4.           You may retrieve and display the content of the Website on a computer screen, store such content in electronic form on disk (but not any server or other storage device connected to a network) or print one copy of such content for your own personal, non-commercial use, provided you keep intact all and any copyright and proprietary notices. 4.5.           You may not otherwise reproduce, modify, copy or distribute or use for commercial purposes any of the materials or content on the Website. 4.6.           You acknowledge that any other use of the material and content of this Website is strictly prohibited and you agree not to (and agree not to assist or facilitate any third party to) copy, reproduce, transmit, publish, display, distribute, commercially exploit or create derivative works from such material and content. 4.7.           No licence is granted to you to use any of our trade marks or those of our affiliated companies. 5.      DISCLAIMER 5.1.           It shall be your responsibility to ensure that any products, services or information available through the Website meet your specific requirements. 5.2.           We will not be liable to you if the Website is unavailable at any time. 5.3.           We attempt to ensure that the information available on the Website at any time is accurate. However, we do not guarantee the accuracy or completeness of material on this Website. We use all reasonable endeavours to correct errors and omissions as quickly as practicable after becoming aware or being notified of them. We make no commitment to ensure that such material is correct or up to date. 5.4.           All drawings, images, descriptive matter and specifications on the Website are for the sole purpose of giving an approximate description for your general information only and should be used only as a guide. 5.5.           Any prices and offers are only valid at the time they are published on the Website. 5.6.           All prices and descriptions supersede all previous publications. 5.7.           Every effort is made to keep information regarding stock availability on the Website up to date. However, we do not guarantee that this is the case, or that stock will always be available. 5.8.           The Website is provided on an ‘as is’ and ‘as available’ basis without any representation or endorsement made and we make no warranties or guarantees, whether express or implied, statutory or otherwise (unless otherwise expressly stated in these terms and conditions or required by law) in relation to the information, materials, content or services found or offered on the Website for any particular purpose or any transaction that may be conducted on or through the Website including but not limited to, implied warranties of non-infringement, compatibility, timeliness, performance, security, accuracy, condition or completeness, or any implied warranty arising from course of dealing or usage or trade custom. 5.9.           
We make no representation or warranty of any kind express or implied statutory or otherwise regarding the availability of the Website or that it will be timely or error-free, that defects will be corrected, or that the Website or the server that makes it available are free of viruses or bugs. 5.10.        We will not be responsible or liable to you for any loss of content or material uploaded or transmitted through the Website and we accept no liability of any kind for any loss or damage from action taken in reliance on material or information contained on the Website. 5.11.        We cannot guarantee and cannot be responsible for the security or privacy of the Website and any information provided by you. 5.12.        You must bear the risk associated with the use of the internet. In particular, we will not be liable for any damage or loss caused by a distributed denial-of-service attack, any viruses trojans, worms, logic bombs, keystroke loggers, spyware, adware or other material which is malicious or technologically harmful that may infect your computer, peripheral computer equipment, computer programs, data or other proprietary material as a result of your use of the Website or you downloading any material posted or sold on the Website or from any website linked to it. 5.13.        We reserve the right to disclose such information to law enforcement authorities as we reasonably feel is necessary should you breach this agreement. 6.      USE OF THE WEBSITE 6.1.           You are permitted to use the Website and the material contained in it only as expressly authorised by us and in accordance with these terms and conditions, as may be amended from time to time without notice to you. 6.2.           We provide access and use of the Website on the basis that we exclude all representations, warranties and conditions to the maximum extent permitted by law.  6.3.           We reserve the right to: 6.3.1.               Make changes to the information or materials on this Website at any time and without notice to you. 6.3.2.               Temporarily or permanently change, suspend or discontinue any aspect of the Website, including the availability of any features, information, database or content or restrict access to parts of or the entire Website without notice or liability to you or any third party. 6.3.3.               Refuse to post material on the Website or to remove material already posted on the Website 6.4.           You may not use the Website for any of the following purposes: 6.4.1.               Disseminating any unlawful, harassing, libellous, abusive, threatening, harmful, vulgar, obscene, or otherwise objectionable material 6.4.2.               Transmitting material that encourages conduct that constitutes a criminal offence, results in civil liability or otherwise                       6.4.3.               Breaching any applicable local, national or international laws, regulations or code of practice 6.4.4.               Gaining unauthorised access to other computer systems 6.4.5.               Interfering with any other person’s use or enjoyment of the Website 6.4.6.               Breaching any laws concerning the use of public telecommunications networks 6.4.7.               Interfering with, disrupting or damaging networks or websites connected to the Website 6.4.8.               Utilisation of data mining, robots or similar data gathering and extraction tools to extract (whether once or many times) for re-utilisation of any substantial parts of the Website 6.4.9.            
   To transmit, or procure the sending of, any unsolicited or unauthorised advertising or promotional material or any other form of similar solicitation 6.4.10.             To create and/or publish your own database that features all or substantial parts of the Website 6.4.11.             Making, transmitting or storing electronic copies of materials protected by copyright without the prior permission of the owner 6.5.           In addition, you must not: 6.5.1.               Knowingly introduce viruses, trojans, worms, logic bombs, keystroke loggers, spyware, adware or other material which is malicious or technologically harmful to the Website 6.5.2.               Attempt to gain unauthorised access to the Website, the server on which the Website is stored or any server, computer or database connected to it 6.5.3.               Attack the Website via a denial-of-service attack or a distributed denial-of service attack 6.5.4.               Damage or disrupt any part of the Website, any equipment or network on which the Website is stored or any software used for the provision of the Website 6.6.           A breach of this clause may be a criminal offence under the Computer Misuse Act 1990. We may report any such breach to the relevant law enforcement authorities and disclose your identity to them. In the event of such a breach, your right to use the Website will cease immediately. 7.      SUSPENDING OR TERMINATING YOUR ACCESS 7.1.           We reserve the right to terminate or suspend your access to the Website immediately and without notice to you if: 7.1.1.               You fail to make any payment to us when due 7.1.2.               You breach the terms of these terms and conditions (repeatedly or otherwise) 7.1.3.               You are impersonating any other person or entity 7.1.4.               When requested by us to do so, you fail to provide us within a reasonable time with sufficient information to enable us to determine the accuracy and validity of any information supplied by you, or your identity 7.1.5.               We suspect you have engaged, or about to engage, or have in anyway been involved, in fraudulent or illegal activity on the Website 8.      REVIEWS 8.1.           You acknowledge that any review, feedback or rating which you leave may be published by us on the Website and you agree that it may be displayed for as long as we consider appropriate and that the content may be syndicated to our other websites, publications or marketing materials. 8.2.           You undertake that any review, feedback or rating that you write shall: 8.2.1.               Comply with applicable law in the UK and the law in any country from which they are posted 8.2.2.               Be factually accurate 8.2.3.               Contain genuinely held opinions (where applicable) 8.2.4.               Not contain any material which is either defamatory, threatening, obscene, abusive, offensive, hateful, inflammatory or is likely to harass, upset, annoy, alarm, embarrass or invade the privacy of, any person or be deceiving 8.2.5.               Not promote or advocate an unlawful act or activity, discrimination, sexually explicit material or violence 8.2.6.               Not infringe any trademark, copyright (including design rights), database right, or other intellectual property rights of any other person or breach any legal duty you owe to a third party 8.2.7.               Not be used to impersonate any person, or to misrepresent your identity 8.3.           
You agree to indemnify and hold us harmless against any claim or action brought by third parties, arising out of or in connection with any review, feedback or rating posted by you on the Website, including, without limitation, the violation of their privacy, defamatory statements or infringement of intellectual property rights. 8.4.           You grant us and our affiliate companies a non-exclusive, royalty-free worldwide license to use or edit any reviews posted by you. 8.5.           We reserve the right to publish, edit or remove any reviews without notifying you. 9.      LINKING TO THE WEBSITE 9.1.           You must not create a link to the Website from another website, document or any other source without first obtaining our prior written consent. 9.2.           Any agreed link must be: 9.2.1.               To the Website’s homepage 9.2.2.               Established from a website or document that is owned by you and does not contain content that is offensive, controversial, infringes any intellectual property rights or other rights of any other person or does not comply in any way with the law in the UK and the law in any country from which they are hosted 9.2.3.               Provided in such a way that is fair and legal and does not damage our reputation or take advantage of it 9.2.4.               Established in such a way that does not suggest any form of association, approval or endorsement on our part where none exists 9.3.                 We have no obligation to inform you if the address of the Website home page changes and it is your responsibility to ensure that any link   you provide to our homepage is at all times accurate. 9.4.           We reserve the right to withdraw our consent without notice and without providing any reasons for withdrawal. Upon receiving such notice you must immediately remove the link and inform us once this has been done. 10.    EXTERNAL LINKS 10.1.        To provide increased value and convenience to our users, we may provide links to other websites or resources for you to access at your sole discretion and risk. You acknowledge and agree that, as you have chosen to enter the linked website we are not responsible for the availability of such external sites or resources, and do not review or endorse and are not responsible or liable in any way, whether directly or indirectly, for: 10.1.1.             The privacy practices of such websites 10.1.2.             The content of such websites, including (without limitation) any advertising, content, products, goods or other materials or services on or available from such websites or resources 10.1.3.             The use which others make of these websites 10.1.4.             Any damage, loss or offence caused or alleged to be caused to you, arising from or in connection with the use of or reliance upon any such advertising, content, products, goods, materials or services available on and/or purchased by you from such external websites or resources 11.    LIMITATION OF LIABILITY AND INDEMNITY 11.1.        Notwithstanding any other provision in these terms and conditions, nothing will affect or limit your statutory rights; or will exclude or limit our liability for: 11.1.1.             Death or personal injury resulting from our negligence 11.1.2.             Fraud or fraudulent misrepresentation 11.1.3.             Action pursuant to section 2(3) of the Consumer Protection Act 1987 11.1.4.             Any matter for which it would be unlawful for us to exclude or attempt to exclude our liability 11.2.   
     We will not be liable, in contract or tort (including, without limitation, negligence), or in respect of pre-contract or other representations (other than fraudulent or negligent misrepresentations) or otherwise for the below mentioned losses which you have suffered or incurred arising out of or in connection with the provision of any matter in these terms and conditions even if such losses are forseeable or result from a deliberate breach by us or as a result of any action we have taken in response to your breach: 11.2.1.             Any economic losses (including without limitation loss of revenues, profits, contracts, business or anticipated savings) 11.2.2.             Any loss of goodwill or reputation; or 11.2.3.             Any special or indirect losses; or 11.2.4.             Any loss of data 11.2.5.             Wasted management or office time 11.2.6.             Any other loss or damage of any kind 11.3.        If you buy any goods or services from a third party seller through our Website, the seller’s individual liability will be set out in their own terms and conditions. 11.4.        You agree to fully indemnify, defend and hold us, and our officers, directors, employees and suppliers, harmless immediately on demand, from and against all claims, including but not limited to losses (including loss of profit, revenue, goodwill or reputation), costs and expenses, including reasonable administrative and legal costs, arising out of any breach of these terms and conditions by you, or any other liabilities arising out of your use of this Website or any other person accessing the Website using your personal information with your authority. 11.5.        This clause does not affect your statutory rights as a consumer. 12.    GENERAL 12.1.        We reserve the right to change the domain address of this Website and any services, products, product prices, product specifications and availability at any time. 12.2.        If any provision of these terms and conditions is held by any competent authority to be invalid or unenforceable in whole or in part, the validity of the other provisions in these terms and conditions and the remainder of the provision in question will not be affected. 12.3.        All Contracts are concluded and available in English only. 12.4.        If we fail, at any time to insist upon strict performance of any of your obligations under these terms and conditions, or if we fail to exercise any of the rights or remedies to which we are entitled under these terms and conditions, it shall not constitute a waiver of such rights or remedies and shall not relieve you from compliance with your obligations. 12.5.        A waiver by us of any default shall not constitute a waiver of any subsequent default. 12.6.        No waiver by us of any of these terms and conditions shall be effective unless it is expressly stated to be a waiver and is communicated to you in writing. 13.    GOVERNING LAW AND JURISDICTION 13.1.        The Website is controlled and operated in the United Kingdom. 13.2.        These terms and conditions will be governed by the laws of England and Wales and you irrevocably agree to submit to the exclusive jurisdiction of the courts of England and Wales. ### Privacy Policy Last updated: 1 June 2018 https://www.comparethecloud.net (‘Website’) is provided by Compare the Cloud Limited (‘we’/’us’/’our’). In doing so, we may be in a position to receive and process personal information relating to you. 
As the controller of this information, we’re providing this Privacy Notice (‘Notice’) to explain our approach to personal information. This Notice forms part of our Terms and Conditions, which govern the use of this Website.

We intend only to process personal information fairly and transparently as required by data protection law including the General Data Protection Regulation (GDPR). In particular, before obtaining information from you we intend to alert you to this Notice, let you know how we intend to process the information and (unless processing is necessary for at least one of the 5 reasons outlined in clause 2 below) we’ll only process the information if you consent to that processing. The GDPR also defines certain ‘special categories’ of personal information that’s considered more sensitive. These categories require a higher level of protection, as explained below.

Of course, you may browse parts of this Website without providing any information about yourself and without accepting cookies. In that case, it’s unlikely we’ll possess and process any information relating to you.

We’ll start this Notice by setting out the conditions we must satisfy before processing your data. However, you may wish to skip to clause 4, which summarises what we intend to collect. The Notice also explains some of the security measures we take to protect your personal information, and tells you certain things we will or won’t do. You should read this Notice in conjunction with the Terms and Conditions.

Sometimes, when you take a new service or product from us, or discuss taking a new service or product but decide against, we might wish to provide you with further information about similar services or products by email or other written electronic communication. In that situation, we will always give you the opportunity to refuse to receive that further information and if you change your mind please let us know. We’ll endeavour to remind you of your right to opt-out on each occasion that we provide such information.

1 Identity and contact details
1.1 Registered number: 07657589
1.2 Registered office: 97 Leigh Road, Eastleigh, Hampshire, SO50 9DR
1.3 info@comparethecloud.net

2 When we’re allowed to collect information from you
We will only collect personal information relating to you if one of the following conditions has been satisfied:
2.1 You have clearly told us that you are content for us to collect that information for the certain purpose or purposes that we will have specified.
2.2 The processing is necessary for the performance of a contract that we have with you.
2.3 The processing is necessary so that we can comply with the law.
2.4 The processing is necessary to protect someone’s life.
2.5 The processing is necessary for performance of a task that’s in the public interest.
2.6 The processing is necessary for our or another’s legitimate interest – but in this case, we’ll balance those interests against your interests.

3 How to consent
3.1 At the point of collecting the information, we’ll endeavour to explain how we intend to use the information and which of these purposes apply. If we rely on consent, we’ll provide you with the opportunity to tell us that you’re happy to provide the information. 
3.2 If at any point in time you change your mind and decide that you don’t consent, please let us know and we’ll endeavour to stop processing your information in the specified manner, or we’ll delete your data if there is no continuing reason for possessing it.
3.3 If you don’t consent to a particular bit of processing, we’ll endeavour to ensure that the Website and our service continue to operate without the need for that information.

4 Information we expect to collect from you
4.1 We envisage asking for the following types of information from you:

| Information type | Purpose and related details | Justification |
| --- | --- | --- |
| Contact information | We ask for this to add you to our newsletter | We’ll ask for your consent |

4.2 We may collect personal information about you from a number of sources, including the following:
4.2.1 From you when you agree to take a service or product from us, in which case this may include your contact details, date of birth, how you will pay for the product or service and your bank details.
4.2.2 From you when you contact us with an enquiry or in response to a communication from us, in which case, this may tell us something about how you use our services.
4.2.3 From documents that are available to the public, such as the electoral register.
4.2.4 From third parties to whom you have provided information with your consent to pass it on to other organisations or persons – when we receive such information we will let you know as soon as is reasonably practicable.
4.3 If you refuse to provide information requested, then if that information is necessary for a service we provide to you we may need to stop providing that service.
4.4 At the time of collecting information, by whichever method is used, we’ll endeavour to alert you and inform you about our purposes and legal basis for processing that information, as well as whether we intend to share the information with anyone else or send it outside of the European Economic Area. If at any point you think we’ve invited you to provide information without explaining why, feel free to object and ask for our reasons.

5 Using your personal information
5.1 Data protection, privacy and security are important to us, and we shall only use your personal information for specified purposes and shall not keep such personal information longer than is necessary to fulfil these purposes. The following are examples of such purposes. We have also indicated below which GDPR justification applies; however, it will depend on the circumstances of each case. At the time of collecting we will provide further information, and you may always ask for further information from us.
5.1.1 To help us to identify you when you contact us. This will normally be necessary for the performance of our contract.
5.1.2 To help us to identify accounts, services and/or products which you could have from us or selected partners from time to time. We may do this by automatic means using a scoring system, which uses the personal information you’ve provided and/or any information we hold about you and personal information from third party agencies (including credit reference agencies). We will only use your information for this purpose if you agree to it.
5.1.3 To help us to administer and to contact you about improved administration of any accounts, services and products we have provided before, do provide now or will or may provide in the future. 
This will often be necessary, but sometimes the improvements will not be necessary in which case we will ask whether you agree.
5.1.4 To allow us to carry out marketing analysis and customer profiling (including with transactional information), conduct research, including creating statistical and testing information. This will sometimes require that you consent, but will sometimes be exempt as market research.
5.1.5 To help to prevent and detect fraud or loss. This will only be done in certain circumstances when we consider it necessary or the law requires it.
5.1.6 To allow us to contact you by written electronic means (such as email, text or multimedia messages) about products and services offered by us where:
5.1.6.1 these products are similar to those you have already purchased from us,
5.1.6.2 you were given the opportunity to opt out of being contacted by us at the time your personal information was originally collected by us and at the time of our subsequent communications with you, and
5.1.6.3 you have not opted out of us contacting you.
5.1.7 To allow us to contact you in any way (including mail, email, telephone, visit, text or multimedia messages) about products and services offered by us and selected partners where you have expressly consented to us doing so.
5.1.8 We may monitor and record communications with you (including phone conversations and emails) for quality assurance and compliance.
5.1.8.1 Before doing that, we will always tell you of our intentions and of the specific purpose in making the recording. Sometimes such recordings will be necessary to comply with the law. Alternatively, sometimes the recording will be necessary for our legitimate interest, but in that case we’ll only record the call if our interest outweighs yours. This will depend on all the circumstances, in particular the importance of the information and whether we can obtain the information another way that’s less intrusive.
5.1.8.2 If we think the recording would be useful for us but that it’s not necessary we’ll ask whether you consent to the recording, and will provide an option for you to tell us that you consent. In those situations, if you don’t consent, the call will either automatically end or will not be recorded.
5.1.9 When it’s required by law, we’ll check your details with fraud prevention agencies. If you provide false or inaccurate information and we suspect fraud, we intend to record this.
5.2 We will not disclose your personal information to any third party except in accordance with this Notice, and in particular in these circumstances:
5.2.1 They will be processing the data on our behalf as a data processor (where we’ll be the data controller). In that situation, we’ll always have a contract with the data processor as set out in the GDPR. This contract provides significant restrictions as to how the data processor operates so that you can be confident your data is protected to the same degree as provided in this Notice.
5.2.2 Sometimes it might be necessary to share data with another data controller. Before doing that we’ll always tell you. Note that if we receive information about you from a third party, then as soon as reasonably practicable afterwards we’ll let you know; that’s required by the GDPR.
5.2.3 Alternatively, sometimes we might consider it to be in your interest to send your information to a third party. 
If that’s the case, we’ll always ask whether you agree before sending.
5.3 Where you give us personal information on behalf of someone else, you confirm that you have provided them with the information set out in this Notice and that they have not objected to such use of their personal information.
5.4 We may allow other people and organisations to use personal information we hold about you in the following circumstances:
5.4.1 If we, or substantially all of our assets, are acquired or are in the process of being acquired by a third party, in which case personal information held by us, about our customers, will be one of the transferred assets.
5.4.2 If we have been legitimately asked to provide information for legal or regulatory purposes or as part of legal proceedings or prospective legal proceedings.
5.4.3 We may employ companies and individuals to perform functions on our behalf and we may disclose your personal information to these parties for the purposes set out above, for example, for fulfilling orders, delivering packages, sending postal mail and email, removing repetitive information from customer lists, analysing data, providing marketing assistance, providing search results and links (including paid listings and links) and providing customer service. Those parties will be bound by strict contractual provisions with us and will only have access to personal information needed to perform their functions, and they may not use it for any other purpose. Further, they must process the personal information in accordance with this Notice and as permitted by the GDPR. From time to time, these other people and organisations to whom we may pass your personal information may be outside the European Economic Area. We will take all steps reasonably necessary to ensure that your personal information is treated securely and in accordance with this Notice and the GDPR.

6 Protecting information
6.1 We have strict security measures to protect personal information.
6.2 We work to protect the security of your information during transmission by using Secure Sockets Layer (SSL) software to encrypt information you input.
6.3 We reveal only the last five digits of your credit card numbers when confirming an order. Of course, we transmit the entire credit card number to the appropriate credit card company during order processing.
6.4 We maintain physical, electronic and procedural safeguards in connection with the collection, storage and disclosure of personally identifiable customer information. Our security procedures mean that we may occasionally request proof of identity before we disclose personal information to you.
6.5 It is important for you to protect against unauthorised access to your password and to your computer. Be sure to sign off when you finish using a shared computer.

7 The internet
7.1 If you communicate with us using the internet, we may occasionally email you about our services and products. When you first give us personal information through the Website, we will normally give you the opportunity to say whether you would prefer that we don’t contact you by email. You can also always send us an email (at the address set out below) at any time if you change your mind.
7.2 Please remember that communications over the internet, such as emails and webmails (messages sent through a website), are not secure unless they have been encrypted. Your communications may go through a number of countries before they are delivered – this is the nature of the internet. 
We cannot accept responsibility for any unauthorised access or loss of personal information that is beyond our control.
7.3 The Website may include third-party advertising and links to third-party websites. We do not provide any personally identifiable customer personal information to these third-party advertisers or third-party websites unless you’ve consented in accordance with this privacy notice.
7.4 We exclude all liability for loss that you may incur when interacting with this third-party advertising or using these third-party websites unless you’ve consented in accordance with this privacy notice.

8 Further information
8.1 If you would like any more information or you have any comments about this Notice, please either write to us at Data Protection Manager, Compare the Cloud Limited, 97 Leigh Road, Eastleigh, Hampshire, SO50 9DR, or email us at info@comparethecloud.net.
8.2 Please note that we may have to amend this Notice on occasion, for example if we change the cookies that we use. If we do that, we will publish the amended version on the Website. In that situation we will endeavour to alert you to the change, but it’s also your responsibility to check regularly to determine whether this Notice has changed.
8.3 You can ask us for a copy of this Notice by writing to the above address or by emailing us at info@comparethecloud.net. This Notice applies to personal information we hold about individuals. It does not apply to information we hold about companies and other organisations.
8.4 If you would like access to the personal information that we hold about you, you can do this by emailing us at info@comparethecloud.net or writing to us at the address noted above. There is not normally a fee for such a request; however, if the request is unfounded, repetitive or excessive we may request a fee or refuse to comply with your request. You can also ask us to send the personal information we hold about you to another controller.
8.5 We aim to keep the personal information we hold about you accurate and up to date. If you tell us that we’re holding any inaccurate or incomplete personal information about you, we will promptly amend, complete or delete it accordingly. Please email us at info@comparethecloud.net or write to us at the address above to update your personal information. You have the right to complain to the Information Commissioner’s Office if we don’t do this.
8.6 You can ask us to delete the personal information that we hold about you if we relied on your consent in holding that information or if it’s no longer necessary. You can also restrict or object to our processing of your personal information in certain circumstances. You can do this by emailing us at info@comparethecloud.net or writing to us at the address noted above. We will tell you if there is a breach, or a likely breach, of your data protection rights.

### Work With Us

Most companies don’t have the social presence or brand awareness needed in today’s fast-moving and tech-savvy business world, let alone the ability to transfer that presence into viable business leads. At Compare the Cloud we aim to educate everyone, from the newest startups to the largest technology companies in the world, in how best to drive their businesses forward, using a strategic combination of business consultancy, content marketing, media series, and events and workshops to enable you to stand out from the cloud crowd. 
We offer social media training programmes and solutions to raise brand awareness with both written and visual content, ensuring you are engaging with the right audience. We can also organise roundtable events to bring together key influencers within the cloud market, as well as host or speak at events for you to further promote your brand. At Compare the Cloud we cover the most exciting industry events with video interviews, blogs, social media coverage and even live broadcasts, providing a refreshing voice. Compare the Cloud is passionate, disruptive and dedicated to educating everyone from the newest startups to the largest technology companies in the world. Comparethecloud.net – a website that extols the virtues and value of Cloud Computing to businesses and helps companies select services and providers best suited to their needs. Can’t find an answer to your question? Then please don’t hesitate to contact us. Call us on +44 (0)203 371 8430 or email info@comparethecloud.net.

### Contact

The cloud marketplace is ever-changing and fast-moving, and your contribution is welcome. For news, press releases and product launches please get in touch with the editorial team. Please note – we are not accepting PR calls at this time. Please ensure that any article you submit adheres to our requirements: we only publish original and exclusive content, meaning it must not have been previously published elsewhere, must not have been written by AI, and must not be published elsewhere in the future. We ask that you avoid mentioning specific cryptocurrencies or gambling within the articles. We do not permit links to gambling sites or sites with 18+ content. Email: pr@comparethecloud.net

To talk to us about how we can help your business, or for sponsorship opportunities, email: info@comparethecloud.net

London (Disruptive Studio): Cargo Works, 1-2 Hatfields, London, SE1 9PG, United Kingdom

To contact our marketing, finance and administration teams please use the details below:

Admin Office: Unit 25, Premier Way, Abbey Park Industrial Estate, Romsey, Hampshire, SO51 9AQ. Registered in UK. 
Company Number: 07657589. VAT Number: GB 122 2531 64.
tds_title1-f_title_font_size="eyJhbGwiOiI2MCIsInBob25lIjoiNTQiLCJsYW5kc2NhcGUiOiI1MCIsInBvcnRyYWl0IjoiNDAifQ==" tds_title1-title_color="var(--metro-blue)" tdc_css="eyJhbGwiOnsicGFkZGluZy1ib3R0b20iOiI2IiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGUiOnsicGFkZGluZy1ib3R0b20iOiI1IiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGVfbWF4X3dpZHRoIjoxMTQwLCJsYW5kc2NhcGVfbWluX3dpZHRoIjoxMDE5LCJwb3J0cmFpdCI6eyJwYWRkaW5nLWJvdHRvbSI6IjIiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7InBhZGRpbmctYm90dG9tIjoiMTAiLCJkaXNwbGF5IjoiIn0sInBob25lX21heF93aWR0aCI6NzY3fQ==" tds_title1-f_title_font_line_height="1.1" tds_title="tds_title1" tds_title2-line_width="100%" tds_title2-line_color="eyJ0eXBlIjoiZ3JhZGllbnQiLCJjb2xvcjEiOiIjZjQzZjNmIiwiY29sb3IyIjoiI2Y0M2YzZiIsIm1peGVkQ29sb3JzIjpbXSwiZGVncmVlIjoiLTkwIiwiY3NzIjoiYmFja2dyb3VuZC1jb2xvcjogI2Y0M2YzZjsiLCJjc3NQYXJhbXMiOiIwZGVnLCNmNDNmM2YsI2Y0M2YzZiJ9" content_align_horizontal="content-horiz-center"][vc_row_inner absolute_align="center" absolute_position="eyJhbGwiOiJ5ZXMiLCJwaG9uZSI6IiJ9"][vc_column_inner][tdm_block_button button_size="tdm-btn-lg" button_display="" content_align_horizontal="content-horiz-right" tds_button1-border_radius="4" tds_button1-background_color="var(--metro-red)" tds_button1-background_hover_color="var(--metro-blue)" tds_button1-f_btn_text_font_family="394" tds_button1-f_btn_text_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTEifQ==" tds_button1-f_btn_text_font_line_height="1" tds_button1-f_btn_text_font_weight="600" tds_button1-f_btn_text_font_spacing="0.5" tds_button1-f_btn_text_font_transform="uppercase" button_padding="0" button_text="View More" tdc_css="eyJwaG9uZSI6eyJjb250ZW50LWgtYWxpZ24iOiJjb250ZW50LWhvcml6LWNlbnRlciIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" tds_button="tds_button5" button_url="/category/articles/ai/" button_tdicon="td-icon-right" icon_align="eyJhbGwiOiIxIiwicG9ydHJhaXQiOiIxIn0=" tds_button5-f_btn_text_font_family="394" tds_button5-f_btn_text_font_weight="700" tds_button5-f_btn_text_font_size="eyJhbGwiOiIxNCIsInBvcnRyYWl0IjoiMTMifQ==" tds_button5-f_btn_text_font_transform="uppercase" tds_button5-text_color="var(--metro-blue)" tds_button5-text_hover_color="var(--metro-blue-acc)" button_icon_space="eyJhbGwiOiIxNCIsInBvcnRyYWl0IjoiMTAifQ==" tds_button5-f_btn_text_font_line_height="1.2"][/vc_column_inner][/vc_row_inner][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content" tdc_css="eyJhbGwiOnsiYmFja2dyb3VuZC1jb2xvciI6InZhcigtLW1ldHJvLWJsdWUpIiwiYmFja2dyb3VuZC1zdHlsZSI6ImNvbnRhaW4iLCJvcGFjaXR5IjoiMC4wNCIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlIjp7ImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsiZGlzcGxheSI6IiJ9LCJwb3J0cmFpdF9tYXhfd2lkdGgiOjEwMTgsInBvcnRyYWl0X21pbl93aWR0aCI6NzY4LCJwaG9uZSI6eyJwYWRkaW5nLXRvcCI6IjI0IiwicGFkZGluZy1ib3R0b20iOiIyNCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" flex_layout="block" flex_vert_align="flex-start"][vc_column][td_flex_block_1 show_com="none" image_width="eyJhbGwiOiI1MCIsInBob25lIjoiMTAwIn0=" image_floated="eyJhbGwiOiJmbG9hdF9yaWdodCIsInBob25lIjoibm9fZmxvYXQifQ==" meta_padding="eyJhbGwiOiIwIDE0MHB4ICAwIDAgIiwicGhvbmUiOiIyNXB4IDAgMCAwIiwibGFuZHNjYXBlIjoiMCA0MHB4ICAwIDAgIiwicG9ydHJhaXQiOiIwIDQwcHggIDAgMCAifQ==" image_radius="" image_height="eyJhbGwiOiIxMDAiLCJwb3J0cmFpdCI6IjExMCJ9" meta_info_horiz="content-horiz-left" modules_category="above" modules_category_margin="" show_excerpt="" show_btn="" show_review="none" show_cat="" m_padding="0" shadow_shadow_color="rgba(0,0,0,0.14)" 
f_title_font_family="467" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2NywibGFuZHNjYXBlIjp7ImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOX0=" f_title_font_size="eyJhbGwiOiI1MCIsInBvcnRyYWl0IjoiMjgiLCJwaG9uZSI6IjQwIiwibGFuZHNjYXBlIjoiMzYifQ==" underline_height="0" f_title_font_weight="500" show_author="none" show_date="none" art_title="eyJhbGwiOiIyMHB4IDAiLCJwb3J0cmFpdCI6IjE1cHggMCJ9" cat_bg="#ffffff" cat_bg_hover="#ffffff" cat_txt="var(--metro-blue)" cat_txt_hover="var(--metro-blue-acc)" f_cat_font_family="394" f_cat_font_transform="uppercase" f_cat_font_weight="600" f_cat_font_size="eyJhbGwiOiIxNCIsImxhbmRzY2FwZSI6IjEzIiwicG9ydHJhaXQiOiIxMSIsInBob25lIjoiMTIifQ==" modules_divider="" h_effect="" title_txt_hover="#ffffff" f_title_font_line_height="eyJhbGwiOiIxIiwicG9ydHJhaXQiOiIxLjEifQ==" f_cat_font_line_height="1" all_modules_space="eyJhbGwiOiIwIiwicG9ydHJhaXQiOiIzMCIsInBob25lIjoiNDAifQ==" category_id="649" sort="" td_ajax_preloading="preload" modules_on_row="" modules_gap="eyJwb3J0cmFpdCI6IjIwIiwiYWxsIjoiMCJ9" limit="1" modules_category_padding="eyJhbGwiOiI4cHggMTZweCIsInBvcnRyYWl0IjoiNnB4IDEycHgifQ==" mc1_el="36" author_txt="#ffffff" author_txt_hover="#ffffff" date_txt="rgba(255,255,255,0.6)" ex_txt="rgba(255,255,255,0.9)" art_btn="0" f_btn_font_family="394" f_btn_font_transform="uppercase" f_btn_font_weight="600" f_btn_font_size="12" f_btn_font_style="" btn_bg_hover="#ffffff" btn_txt="#ffffff" btn_txt_hover="var(--metro-blue)" btn_title="Read post" btn_bg="var(--metro-blue-acc)" title_txt="#ffffff" art_excerpt="eyJhbGwiOiIwIDAgNDBweCAwIiwicG9ydHJhaXQiOiIwIDAgMzBweCAwIn0=" meta_info_align="center" f_btn_font_spacing="0.5" f_btn_font_line_height="1" btn_margin="0" image_size="td_1068x0" f_cat_font_spacing="0.5" f_ex_font_family="394" f_ex_font_size="eyJhbGwiOiIxOCIsInBob25lIjoiMTUiLCJwb3J0cmFpdCI6IjEzIiwibGFuZHNjYXBlIjoiMTYifQ==" f_ex_font_line_height="1.4" f_ex_font_weight="500" btn_padding="eyJhbGwiOiIxNXB4IDIwcHggMTRweCIsInBvcnRyYWl0IjoiMTRweCAxN3B4IDEzcHgifQ==" offset="1" modules_category_radius="100" btn_radius="4" modules_cat_border="0" review_size="0" excl_show="none"][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content" tdc_css="eyJhbGwiOnsiYmFja2dyb3VuZC1jb2xvciI6InZhcigtLW1ldHJvLWJsdWUtYWNjKSIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlIjp7ImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsiZGlzcGxheSI6IiJ9LCJwb3J0cmFpdF9tYXhfd2lkdGgiOjEwMTgsInBvcnRyYWl0X21pbl93aWR0aCI6NzY4LCJwaG9uZSI6eyJwYWRkaW5nLXRvcCI6IjI0IiwicGFkZGluZy1ib3R0b20iOiIyNCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" flex_layout="block" flex_vert_align="flex-start"][vc_column][td_flex_block_1 show_com="none" image_width="eyJhbGwiOiI1MCIsInBob25lIjoiMTAwIn0=" image_floated="eyJhbGwiOiJmbG9hdF9sZWZ0IiwicGhvbmUiOiJub19mbG9hdCJ9" meta_padding="eyJhbGwiOiIwIDEwMHB4ICAwIDYwcHgiLCJwaG9uZSI6IjI1cHggMCAwIDAiLCJsYW5kc2NhcGUiOiIwIDIwcHggIDAgNDBweCIsInBvcnRyYWl0IjoiMCAwIDAgMzBweCAifQ==" image_radius="" image_height="eyJhbGwiOiIxMDAiLCJwb3J0cmFpdCI6IjExMCJ9" meta_info_horiz="content-horiz-left" modules_category="above" modules_category_margin="" show_excerpt="" show_btn="" show_review="none" show_cat="" m_padding="0" shadow_shadow_color="rgba(0,0,0,0.14)" f_title_font_family="467" 
tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2NywibGFuZHNjYXBlIjp7ImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOX0=" f_title_font_size="eyJhbGwiOiI1MCIsInBvcnRyYWl0IjoiMjgiLCJwaG9uZSI6IjQwIiwibGFuZHNjYXBlIjoiMzYifQ==" underline_height="0" f_title_font_weight="500" show_author="none" show_date="none" art_title="eyJhbGwiOiIyMHB4IDAiLCJwb3J0cmFpdCI6IjE1cHggMCJ9" cat_bg="#ffffff" cat_bg_hover="#ffffff" cat_txt="var(--metro-blue)" cat_txt_hover="var(--metro-blue-acc)" f_cat_font_family="394" f_cat_font_transform="uppercase" f_cat_font_weight="600" f_cat_font_size="eyJhbGwiOiIxNCIsImxhbmRzY2FwZSI6IjEzIiwicG9ydHJhaXQiOiIxMSIsInBob25lIjoiMTIifQ==" modules_divider="" h_effect="" title_txt_hover="#ffffff" f_title_font_line_height="eyJhbGwiOiIxIiwicG9ydHJhaXQiOiIxLjEifQ==" f_cat_font_line_height="1" all_modules_space="eyJhbGwiOiIwIiwicG9ydHJhaXQiOiIzMCIsInBob25lIjoiNDAifQ==" category_id="649" sort="random_posts" td_ajax_preloading="preload" modules_on_row="" modules_gap="eyJwb3J0cmFpdCI6IjIwIiwiYWxsIjoiMCJ9" limit="1" modules_category_padding="eyJhbGwiOiI4cHggMTZweCIsInBvcnRyYWl0IjoiNnB4IDEycHgifQ==" mc1_el="36" author_txt="#ffffff" author_txt_hover="#ffffff" date_txt="rgba(255,255,255,0.6)" ex_txt="rgba(255,255,255,0.9)" art_btn="0" f_btn_font_family="394" f_btn_font_transform="uppercase" f_btn_font_weight="600" f_btn_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTIifQ==" f_btn_font_style="" btn_bg_hover="#ffffff" btn_txt="#ffffff" btn_txt_hover="var(--metro-blue-acc)" btn_title="Read post" btn_bg="var(--metro-blue)" title_txt="#ffffff" art_excerpt="eyJhbGwiOiIwIDAgNDBweCAwIiwicG9ydHJhaXQiOiIwIDAgMzBweCAwIn0=" meta_info_align="center" f_btn_font_spacing="0.5" f_btn_font_line_height="1" btn_margin="0" image_size="td_1068x0" f_cat_font_spacing="0.5" f_ex_font_family="394" f_ex_font_size="eyJhbGwiOiIxOCIsInBob25lIjoiMTUiLCJwb3J0cmFpdCI6IjEzIiwibGFuZHNjYXBlIjoiMTYifQ==" f_ex_font_line_height="1.4" f_ex_font_weight="500" btn_padding="eyJhbGwiOiIxNXB4IDIwcHggMTRweCIsInBvcnRyYWl0IjoiMTRweCAxN3B4IDEzcHgifQ==" modules_cat_border="0" modules_category_radius="100" btn_radius="4" offset="3" review_size="0" excl_show="none"][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content" tdc_css="eyJhbGwiOnsibWFyZ2luLXRvcCI6Ii01IiwiYmFja2dyb3VuZC1jb2xvciI6InZhcigtLW1ldHJvLWJsdWUpIiwiYmFja2dyb3VuZC1zdHlsZSI6ImNvbnRhaW4iLCJvcGFjaXR5IjoiMC4wNCIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlIjp7ImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsiZGlzcGxheSI6IiJ9LCJwb3J0cmFpdF9tYXhfd2lkdGgiOjEwMTgsInBvcnRyYWl0X21pbl93aWR0aCI6NzY4LCJwaG9uZSI6eyJwYWRkaW5nLXRvcCI6IjI0IiwicGFkZGluZy1ib3R0b20iOiIyNCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" flex_layout="block" flex_vert_align="flex-start"][vc_column][td_flex_block_1 show_com="none" image_width="eyJhbGwiOiI1MCIsInBob25lIjoiMTAwIn0=" image_floated="eyJhbGwiOiJmbG9hdF9yaWdodCIsInBob25lIjoibm9fZmxvYXQifQ==" meta_padding="eyJhbGwiOiIwIDE0MHB4ICAwIDAgIiwicGhvbmUiOiIyNXB4IDAgMCAwIiwibGFuZHNjYXBlIjoiMCA0MHB4ICAwIDAgIiwicG9ydHJhaXQiOiIwIDQwcHggIDAgMCAifQ==" image_radius="" image_height="eyJhbGwiOiIxMDAiLCJwb3J0cmFpdCI6IjExMCJ9" meta_info_horiz="content-horiz-left" modules_category="above" modules_category_margin="" show_excerpt="" show_btn="" show_review="none" show_cat="" m_padding="0" 
shadow_shadow_color="rgba(0,0,0,0.14)" f_title_font_family="467" tdc_css="eyJhbGwiOnsibWFyZ2luLXRvcCI6Ii01IiwibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2NywibGFuZHNjYXBlIjp7ImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOX0=" f_title_font_size="eyJhbGwiOiI1MCIsInBvcnRyYWl0IjoiMjgiLCJwaG9uZSI6IjQwIiwibGFuZHNjYXBlIjoiMzYifQ==" underline_height="0" f_title_font_weight="500" show_author="none" show_date="none" art_title="eyJhbGwiOiIyMHB4IDAiLCJwb3J0cmFpdCI6IjE1cHggMCJ9" cat_bg="#ffffff" cat_bg_hover="#ffffff" cat_txt="var(--metro-blue)" cat_txt_hover="var(--metro-blue-acc)" f_cat_font_family="394" f_cat_font_transform="uppercase" f_cat_font_weight="600" f_cat_font_size="eyJhbGwiOiIxNCIsImxhbmRzY2FwZSI6IjEzIiwicG9ydHJhaXQiOiIxMSIsInBob25lIjoiMTIifQ==" modules_divider="" h_effect="" title_txt_hover="#ffffff" f_title_font_line_height="eyJhbGwiOiIxIiwicG9ydHJhaXQiOiIxLjEifQ==" f_cat_font_line_height="1" all_modules_space="eyJhbGwiOiIwIiwicG9ydHJhaXQiOiIzMCIsInBob25lIjoiNDAifQ==" category_id="649" sort="random_posts" td_ajax_preloading="preload" modules_on_row="" modules_gap="eyJwb3J0cmFpdCI6IjIwIiwiYWxsIjoiMCJ9" limit="1" modules_category_padding="eyJhbGwiOiI4cHggMTZweCIsInBvcnRyYWl0IjoiNnB4IDEycHgifQ==" mc1_el="36" author_txt="#ffffff" author_txt_hover="#ffffff" date_txt="rgba(255,255,255,0.6)" ex_txt="rgba(255,255,255,0.9)" art_btn="0" f_btn_font_family="394" f_btn_font_transform="uppercase" f_btn_font_weight="600" f_btn_font_size="12" f_btn_font_style="" btn_bg_hover="#ffffff" btn_txt="#ffffff" btn_txt_hover="var(--metro-blue)" btn_title="Read post" btn_bg="var(--metro-blue-acc)" title_txt="#ffffff" art_excerpt="eyJhbGwiOiIwIDAgNDBweCAwIiwicG9ydHJhaXQiOiIwIDAgMzBweCAwIn0=" meta_info_align="center" f_btn_font_spacing="0.5" f_btn_font_line_height="1" btn_margin="0" image_size="td_1068x0" f_cat_font_spacing="0.5" f_ex_font_family="394" f_ex_font_size="eyJhbGwiOiIxOCIsInBob25lIjoiMTUiLCJwb3J0cmFpdCI6IjEzIiwibGFuZHNjYXBlIjoiMTYifQ==" f_ex_font_line_height="1.4" f_ex_font_weight="500" btn_padding="eyJhbGwiOiIxNXB4IDIwcHggMTRweCIsInBvcnRyYWl0IjoiMTRweCAxN3B4IDEzcHgifQ==" offset="3" modules_category_radius="100" btn_radius="4" modules_cat_border="0" review_size="0" excl_show="none"][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content"][vc_column][tdm_block_column_title title_text="VGVjaG5vbG9neQ==" title_tag="h3" title_size="tdm-title-sm" tds_title1-f_title_font_family="394" tds_title1-f_title_font_weight="800" tds_title1-f_title_font_size="eyJhbGwiOiI2MCIsInBob25lIjoiNTQiLCJsYW5kc2NhcGUiOiI1MCIsInBvcnRyYWl0IjoiNDAifQ==" tds_title1-title_color="var(--accent-color)" tdc_css="eyJhbGwiOnsicGFkZGluZy10b3AiOiI4MCIsInBhZGRpbmctYm90dG9tIjoiMzUiLCJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZSI6eyJwYWRkaW5nLXRvcCI6IjYwIiwicGFkZGluZy1ib3R0b20iOiIyMCIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsicGFkZGluZy10b3AiOiI0MCIsInBhZGRpbmctYm90dG9tIjoiMTAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7InBhZGRpbmctdG9wIjoiNjAiLCJkaXNwbGF5IjoiIn0sInBob25lX21heF93aWR0aCI6NzY3fQ==" tds_title1-f_title_font_line_height="1.1" tds_title="tds_title1" tds_title2-line_width="100%" 
tds_title2-line_color="eyJ0eXBlIjoiZ3JhZGllbnQiLCJjb2xvcjEiOiIjZjQzZjNmIiwiY29sb3IyIjoiI2Y0M2YzZiIsIm1peGVkQ29sb3JzIjpbXSwiZGVncmVlIjoiLTkwIiwiY3NzIjoiYmFja2dyb3VuZC1jb2xvcjogI2Y0M2YzZjsiLCJjc3NQYXJhbXMiOiIwZGVnLCNmNDNmM2YsI2Y0M2YzZiJ9" content_align_horizontal="content-horiz-center"][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjEwMCIsImRpc3BsYXkiOiIifSwicGhvbmUiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2NywibGFuZHNjYXBlIjp7Im1hcmdpbi1ib3R0b20iOiI4MCIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwicGFkZGluZy1yaWdodCI6IjUiLCJwYWRkaW5nLWxlZnQiOiI1IiwiZGlzcGxheSI6IiJ9LCJwb3J0cmFpdF9tYXhfd2lkdGgiOjEwMTgsInBvcnRyYWl0X21pbl93aWR0aCI6NzY4fQ==" gap="eyJsYW5kc2NhcGUiOiIxNSIsInBvcnRyYWl0IjoiMTAiLCJhbGwiOiIyMCIsInBob25lIjoiMCJ9"][vc_column width="1/1"][td_flex_block_1 modules_on_row="eyJhbGwiOiI1MCUiLCJwaG9uZSI6IjEwMCUifQ==" limit="2" modules_category="above" show_btn="none" show_excerpt="eyJsYW5kc2NhcGUiOiJub25lIiwicG9ydHJhaXQiOiJub25lIn0=" ajax_pagination="" td_ajax_preloading="preload" sort="featured" category_id="409" f_title_font_size="eyJhbGwiOiIzNiIsInBvcnRyYWl0IjoiMjAiLCJsYW5kc2NhcGUiOiIyOCJ9" f_title_font_line_height="eyJwb3J0cmFpdCI6IjEuMiIsImFsbCI6IjEuMiJ9" show_cat="" meta_info_border_style="" meta_padding="eyJhbGwiOiIwIDAgMCAzMHB4ICIsInBvcnRyYWl0IjoiMCAwIDAgMTVweCIsImxhbmRzY2FwZSI6IjAgMCAwIDIwcHgiLCJwaG9uZSI6IjMwcHggMCAwIDAifQ==" modules_divider="" image_size="" meta_info_align="center" image_floated="eyJhbGwiOiJmbG9hdF9sZWZ0IiwicGhvbmUiOiJub19mbG9hdCJ9" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjI0IiwiYm9yZGVyLWJvdHRvbS13aWR0aCI6IjEiLCJwYWRkaW5nLWJvdHRvbSI6IjI0IiwiYm9yZGVyLWNvbG9yIjoiI2VhZWFlYSIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlIjp7Im1hcmdpbi1ib3R0b20iOiIyMCIsInBhZGRpbmctYm90dG9tIjoiMjAiLCJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZV9tYXhfd2lkdGgiOjExNDAsImxhbmRzY2FwZV9taW5fd2lkdGgiOjEwMTksInBvcnRyYWl0Ijp7Im1hcmdpbi1ib3R0b20iOiIxNSIsInBhZGRpbmctYm90dG9tIjoiMTUiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7Im1hcmdpbi1ib3R0b20iOiIzMCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" meta_info_horiz="content-horiz-left" f_title_font_weight="700" image_height="130" all_modules_space="eyJhbGwiOiIwIiwicGhvbmUiOiIzMCJ9" art_excerpt="0" art_title="eyJhbGwiOiIxNXB4IDAgMTJweCAwIiwicG9ydHJhaXQiOiIxMHB4IDAgMCIsImxhbmRzY2FwZSI6IjE1cHggMCAwIDAiLCJwaG9uZSI6IjEycHggMCAxNXB4IDAifQ==" btn_bg="rgba(255,255,255,0)" f_btn_font_transform="uppercase" f_btn_font_weight="" f_ex_font_weight="500" f_cat_font_transform="" f_cat_font_weight="600" btn_bg_hover="rgba(255,255,255,0)" f_ex_font_size="eyJhbGwiOiIxNCIsImxhbmRzY2FwZSI6IjEzIiwicG9ydHJhaXQiOiIxMyJ9" f_ex_font_line_height="1.4" meta_width="eyJwaG9uZSI6IjEwMCUifQ==" show_audio="" show_com="none" show_date="none" show_author="none" f_title_font_family="467" f_title_font_transform="" f_ex_font_family="394" title_txt="#000000" title_txt_hover="var(--metro-red)" ex_txt="#000000" cat_txt="var(--metro-red)" cat_bg="rgba(255,255,255,0)" cat_bg_hover="rgba(255,255,255,0)" author_txt="#000000" author_txt_hover="var(--metro-red)" modules_category_padding="eyJhbGwiOiI4cHggMTRweCIsInBvcnRyYWl0IjoiNnB4IDEycHgifQ==" f_cat_font_family="394" f_cat_font_size="eyJhbGwiOiIxMyIsInBvcnRyYWl0IjoiMTIifQ==" f_cat_font_line_height="1" excerpt_middle="yes" modules_gap="eyJhbGwiOiIzMCIsImxhbmRzY2FwZSI6IjIwIiwicG9ydHJhaXQiOiIxNSIsInBob25lIjoiMCJ9" 
image_width="eyJhbGwiOiI0NSIsInBob25lIjoiMTAwIn0=" image_radius="10" mc1_el="22" cat_txt_hover="var(--metro-blue)" modules_cat_border="1" modules_category_radius="100" cat_border="#eaeaea" cat_border_hover="#eaeaea" offset="1" post_ids="" excl_show="none"][td_flex_block_1 modules_on_row="eyJhbGwiOiIyMCUiLCJwaG9uZSI6IjEwMCUifQ==" modules_category="above" show_btn="none" show_excerpt="eyJwb3J0cmFpdCI6Im5vbmUiLCJhbGwiOiJub25lIn0=" ajax_pagination="" td_ajax_preloading="preload" sort="featured" category_id="" f_title_font_size="eyJhbGwiOiIxNiIsImxhbmRzY2FwZSI6IjE0IiwicG9ydHJhaXQiOiIxMyJ9" f_title_font_line_height="1.4" show_cat="" meta_info_border_style="" meta_padding="20px 0 0 0" modules_divider="" image_size="" meta_info_align="" image_floated="" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGUiOnsiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGVfbWF4X3dpZHRoIjoxMTQwLCJsYW5kc2NhcGVfbWluX3dpZHRoIjoxMDE5LCJwb3J0cmFpdCI6eyJtYXJnaW4tYm90dG9tIjoiNDAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7Im1hcmdpbi1ib3R0b20iOiI1MCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" meta_info_horiz="content-horiz-left" f_title_font_weight="800" image_height="120" all_modules_space="eyJwb3J0cmFpdCI6IjAiLCJwaG9uZSI6IjMwIiwiYWxsIjoiMCJ9" art_excerpt="0" art_title="10px 0 0 0" btn_bg="rgba(255,255,255,0)" f_btn_font_transform="uppercase" f_btn_font_weight="" f_cat_font_transform="" f_cat_font_weight="600" btn_bg_hover="rgba(255,255,255,0)" meta_width="eyJwaG9uZSI6IjEwMCUifQ==" show_audio="" show_com="none" show_date="none" show_author="none" mc1_el="10" f_title_font_family="394" f_title_font_transform="" title_txt="#000000" title_txt_hover="var(--metro-red)" cat_txt="var(--metro-red)" cat_bg="rgba(255,255,255,0)" cat_bg_hover="rgba(255,255,255,0)" modules_category_padding="eyJhbGwiOiI2cHggMTJweCIsInBvcnRyYWl0IjoiNHB4IDEwcHgifQ==" f_cat_font_family="394" f_cat_font_size="eyJhbGwiOiIxMyIsImxhbmRzY2FwZSI6IjEyIiwicG9ydHJhaXQiOiIxMSJ9" f_cat_font_line_height="1" modules_gap="eyJhbGwiOiIyMCIsImxhbmRzY2FwZSI6IjE1IiwicG9ydHJhaXQiOiIxMCIsInBob25lIjoiMCJ9" image_radius="10" cat_txt_hover="var(--metro-blue)" modules_cat_border="1" modules_category_radius="100" cat_border="#eaeaea" cat_border_hover="#eaeaea" excl_padd="4px 6px 3px" excl_radius="2" f_excl_font_family="394" f_excl_font_size="11" f_excl_font_weight="500" excl_bg="var(--metro-exclusive)"][tdm_block_button button_size="tdm-btn-lg" button_display="" content_align_horizontal="content-horiz-center" button_text="More from Technology" button_url="/category/articles/" tds_button1-border_radius="4" tds_button1-background_color="var(--metro-red)" tds_button1-background_hover_color="var(--metro-blue)" tds_button1-f_btn_text_font_family="394" tds_button1-f_btn_text_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTEifQ==" tds_button1-f_btn_text_font_line_height="1" tds_button1-f_btn_text_font_weight="600" tds_button1-f_btn_text_font_spacing="0.5" tds_button1-f_btn_text_font_transform="uppercase" button_padding="eyJhbGwiOiIyMHB4IDI4cHgiLCJwb3J0cmFpdCI6IjE4cHggMjRweCJ9"][/vc_column][/vc_row][vc_row full_width="stretch_row_1400 td-stretch-content" 
tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjEwMCIsInBhZGRpbmctdG9wIjoiODAiLCJwYWRkaW5nLWJvdHRvbSI6IjEwMCIsImJhY2tncm91bmQtY29sb3IiOiIjZjRmNGY0IiwiZGlzcGxheSI6IiJ9LCJwaG9uZSI6eyJtYXJnaW4tYm90dG9tIjoiNjAiLCJwYWRkaW5nLXRvcCI6IjQwIiwicGFkZGluZy1ib3R0b20iOiI2MCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3NjcsImxhbmRzY2FwZSI6eyJtYXJnaW4tYm90dG9tIjoiODAiLCJwYWRkaW5nLXRvcCI6IjYwIiwicGFkZGluZy1ib3R0b20iOiI4MCIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlX21heF93aWR0aCI6MTE0MCwibGFuZHNjYXBlX21pbl93aWR0aCI6MTAxOSwicG9ydHJhaXQiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwicGFkZGluZy1yaWdodCI6IjUiLCJwYWRkaW5nLWxlZnQiOiI1IiwiZGlzcGxheSI6IiJ9LCJwb3J0cmFpdF9tYXhfd2lkdGgiOjEwMTgsInBvcnRyYWl0X21pbl93aWR0aCI6NzY4fQ=="][vc_column width="1/1"][vc_row_inner][vc_column_inner][tdm_block_column_title title_text="Q2xvdWQlMjBDb21wdXRpbmc=" title_tag="h3" title_size="tdm-title-sm" tds_title1-f_title_font_family="467" tds_title1-f_title_font_weight="800" tds_title1-f_title_font_size="eyJhbGwiOiI2MCIsInBob25lIjoiNTQiLCJsYW5kc2NhcGUiOiI1MCIsInBvcnRyYWl0IjoiNDAifQ==" tds_title1-title_color="#000000" tdc_css="eyJhbGwiOnsicGFkZGluZy1ib3R0b20iOiIzNSIsImRpc3BsYXkiOiIifSwibGFuZHNjYXBlIjp7InBhZGRpbmctYm90dG9tIjoiMzAiLCJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZV9tYXhfd2lkdGgiOjExNDAsImxhbmRzY2FwZV9taW5fd2lkdGgiOjEwMTksInBvcnRyYWl0Ijp7InBhZGRpbmctYm90dG9tIjoiMjUiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7ImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9" tds_title1-f_title_font_line_height="1.1" tds_title="tds_title1" tds_title2-line_width="100%" tds_title2-line_color="eyJ0eXBlIjoiZ3JhZGllbnQiLCJjb2xvcjEiOiIjZjQzZjNmIiwiY29sb3IyIjoiI2Y0M2YzZiIsIm1peGVkQ29sb3JzIjpbXSwiZGVncmVlIjoiLTkwIiwiY3NzIjoiYmFja2dyb3VuZC1jb2xvcjogI2Y0M2YzZjsiLCJjc3NQYXJhbXMiOiIwZGVnLCNmNDNmM2YsI2Y0M2YzZiJ9" content_align_horizontal="content-horiz-center"][/vc_column_inner][/vc_row_inner][vc_row_inner gap="eyJhbGwiOiIyMCIsImxhbmRzY2FwZSI6IjE1IiwicG9ydHJhaXQiOiIxMCIsInBob25lIjoiMCJ9" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjYwIiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGUiOnsibWFyZ2luLWJvdHRvbSI6IjUwIiwiZGlzcGxheSI6IiJ9LCJsYW5kc2NhcGVfbWF4X3dpZHRoIjoxMTQwLCJsYW5kc2NhcGVfbWluX3dpZHRoIjoxMDE5LCJwb3J0cmFpdCI6eyJtYXJnaW4tYm90dG9tIjoiNDAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0X21heF93aWR0aCI6MTAxOCwicG9ydHJhaXRfbWluX3dpZHRoIjo3NjgsInBob25lIjp7Im1hcmdpbi1ib3R0b20iOiI1MCIsImRpc3BsYXkiOiIifSwicGhvbmVfbWF4X3dpZHRoIjo3Njd9"][vc_column_inner width="1/2"][td_flex_block_1 modules_on_row="" limit="1" modules_category="above" show_btn="none" show_excerpt="eyJsYW5kc2NhcGUiOiJub25lIiwicG9ydHJhaXQiOiJub25lIiwiYWxsIjoibm9uZSJ9" ajax_pagination="" td_ajax_preloading="preload" sort="" category_id="394" f_title_font_size="eyJhbGwiOiI1MCIsInBob25lIjoiNDIiLCJsYW5kc2NhcGUiOiI0MiIsInBvcnRyYWl0IjoiMzYifQ==" f_title_font_line_height="1.2" show_cat="" meta_info_border_style="" meta_padding="eyJhbGwiOiIzMHB4IDAgMCAwIiwicG9ydHJhaXQiOiIyMHB4IDAgMCIsImxhbmRzY2FwZSI6IjI1cHggMCAwICIsInBob25lIjoiMjVweCAwIDAgMCJ9" modules_divider="" image_size="" meta_info_align="center" image_floated="" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZSI6eyJkaXNwbGF5IjoiIn0sImxhbmRzY2FwZV9tYXhfd2lkdGgiOjExNDAsImxhbmRzY2FwZV9taW5fd2lkdGgiOjEwMTksInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2N30=" meta_info_horiz="content-horiz-left" f_title_font_weight="700" image_height="eyJhbGwiOiI3MCIsInBvcnRyYWl0IjoiOTAifQ==" all_modules_space="0" art_excerpt="0" 
art_title="eyJhbGwiOiIxMHB4IDAgMCIsInBvcnRyYWl0IjoiOHB4IDAgMCIsImxhbmRzY2FwZSI6IjEycHggMCAwIDAiLCJwaG9uZSI6IjEwcHggMCAwIn0=" btn_bg="rgba(255,255,255,0)" f_btn_font_transform="uppercase" f_btn_font_weight="" f_ex_font_weight="500" f_cat_font_transform="" f_cat_font_weight="600" btn_bg_hover="rgba(255,255,255,0)" f_ex_font_size="eyJhbGwiOiIxNCIsImxhbmRzY2FwZSI6IjEzIiwicG9ydHJhaXQiOiIxMyJ9" f_ex_font_line_height="1.4" meta_width="eyJwaG9uZSI6IjEwMCUifQ==" show_audio="" show_com="none" show_date="none" show_author="none" f_title_font_family="467" f_title_font_transform="" f_ex_font_family="394" title_txt="#000000" title_txt_hover="var(--metro-blue)" ex_txt="#000000" cat_txt="var(--metro-blue)" cat_bg="rgba(255,255,255,0)" cat_bg_hover="rgba(255,255,255,0)" author_txt="#000000" author_txt_hover="var(--metro-blue)" modules_category_padding="eyJhbGwiOiI4cHggMTRweCIsInBvcnRyYWl0IjoiNnB4IDEycHgifQ==" f_cat_font_family="394" f_cat_font_size="eyJhbGwiOiIxMyIsInBvcnRyYWl0IjoiMTIifQ==" f_cat_font_line_height="1" excerpt_middle="yes" modules_gap="0" image_width="eyJhbGwiOiIxMDAiLCJwaG9uZSI6IjEwMCJ9" image_radius="10" cat_txt_hover="var(--metro-blue-acc)" modules_cat_border="1" modules_category_radius="100" cat_border="#eaeaea" cat_border_hover="#eaeaea" review_size="0" offset="1"][/vc_column_inner][vc_column_inner width="1/2"][td_flex_block_1 modules_on_row="eyJwaG9uZSI6IjEwMCUifQ==" image_size="" image_floated="hidden" image_width="eyJwaG9uZSI6IjMwIn0=" image_height="eyJwaG9uZSI6IjExMCJ9" show_btn="none" show_excerpt="eyJwaG9uZSI6Im5vbmUiLCJhbGwiOiJub25lIn0=" show_com="eyJwaG9uZSI6Im5vbmUiLCJhbGwiOiJub25lIn0=" show_author="" show_cat="none" meta_padding="0" f_title_font_size="eyJhbGwiOiIzMCIsInBvcnRyYWl0IjoiMjAiLCJsYW5kc2NhcGUiOiIyNSIsInBob25lIjoiMjUifQ==" f_title_font_line_height="1.4" f_title_font_weight="800" all_modules_space="eyJsYW5kc2NhcGUiOiIyMCIsInBvcnRyYWl0IjoiMTUiLCJhbGwiOiIyNSJ9" category_id="42" show_date="" art_excerpt="0" show_review="none" tdc_css="eyJhbGwiOnsibWFyZ2luLWJvdHRvbSI6IjAiLCJkaXNwbGF5IjoiIn0sInBvcnRyYWl0Ijp7ImRpc3BsYXkiOiIifSwicG9ydHJhaXRfbWF4X3dpZHRoIjoxMDE4LCJwb3J0cmFpdF9taW5fd2lkdGgiOjc2OCwicGhvbmUiOnsiZGlzcGxheSI6IiJ9LCJwaG9uZV9tYXhfd2lkdGgiOjc2N30=" f_title_font_family="467" mc1_el="10" title_txt_hover="var(--metro-blue)" title_txt="#000000" art_title="eyJhbGwiOiIwIDAgMTBweCIsInBvcnRyYWl0IjoiMCAwIDhweCJ9" modules_border_size="1px 0 0 0" m_padding="eyJhbGwiOiIyMHB4IDAgMCAwIiwibGFuZHNjYXBlIjoiMTVweCAwIDAgMCIsInBvcnRyYWl0IjoiMTVweCAwIDAgMCJ9" modules_gap="eyJhbGwiOiI0MCIsImxhbmRzY2FwZSI6IjMwIiwicG9ydHJhaXQiOiIyMCJ9" f_meta_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTAifQ==" f_meta_font_line_height="1" f_meta_font_weight="" f_meta_font_family="394" offset="1" review_size="0" author_txt="var(--metro-blue)" author_txt_hover="var(--metro-blue-acc)" excl_padd="4px 6px 3px" f_excl_font_family="394" f_excl_font_size="11" f_excl_font_weight="500" excl_radius="2" excl_bg="var(--metro-exclusive)"][/vc_column_inner][/vc_row_inner][tdm_block_button button_size="tdm-btn-lg" button_display="" content_align_horizontal="content-horiz-center" button_text="View more Cloud" button_url="/category/articles/cloud/" tds_button1-border_radius="4" tds_button1-background_color="#000000" tds_button1-background_hover_color="var(--metro-blue)" tds_button1-f_btn_text_font_family="394" tds_button1-f_btn_text_font_size="eyJhbGwiOiIxMiIsInBvcnRyYWl0IjoiMTEifQ==" tds_button1-f_btn_text_font_line_height="1" tds_button1-f_btn_text_font_weight="600" tds_button1-f_btn_text_font_spacing="0.5" 
Three tips for managing complex Cloud architectures (1) ### Adam Gaca ### Demystifying AI Image Copyright (1) ### Gaz - Soap Media - Eve Foster ### CIF Presents TWF – Duane Jackson CIF Presents TWF – Duane Jackson ### CIF Presents TWF – Emily Barrett CIF Presents TWF – Emily Barrett ### 56650 ### Richard Osborne Headshot ### Beyond borders- cloud solutions for universal Interoperability (1) ### Building a people-centric strategy to unlock AI’s potential ### Laurel McKenzie - Emily Royle ### AI Act - New Rules, Same Task ### Mark Molyneux - Sam Spencer ### The Future of Marketing - Automation vs Innovation ### AI show - Episode 4 AI show - Episode 4 ### Guy murphy Guy Murphy ### Time to Ditch Traditional Tools for Cloud Security ### Raghu Nandakumara ### Beyond Borders: Cloud solutions for universal Interoperability (1) ### 6 Ways Businesses Can Boost Their Cloud Security Resilience (1) ### John Funk - John Funk ### Good, Bad and the Ugly of Cybersecurity GenAI ### Mark_-1-e1572450543143 - Jonathan Kent ### Maximising the business value of data (1) ### A James - Chevaan Seresinhe ### Emerging trends in Cloud, DevOps and Governance ### Paul Jones - Katie Moxom ### The cloud-a viable option for data storage (2) ### Josh Roffman Headshot (2) ### AI Show – Episode 2 – Paul Pallin AI Show – Episode 2 – Paul Pallin ### Paul Pallin Paul Pallin ### Andy Baillie - Semarchy - Tom Pallot ### Matt Aldridge - Balance Wealth Planning ### The hybrid future of cloud in surveillance networks ### David Needham - EMEA Business Development Manager at Axis Communications_LR - Matt King ### CIF Presents TWF - Joshua Stanton CIF Presents TWF - Joshua Stanton ### IF Presents TWF – Garfield Smith IF Presents TWF – Garfield Smith ### IMG_2190 - Lili Dewrance ### Cloud Security Is A Shared Responsibility (1) ### Manuel Sanchez_Information Security and Compliance specialist_iManage (1) ### Lenovo and Cudo Ventures Lenovo and Cudo Ventures ### image (2) ### image (1) ### Navigating the Cloud Conundrum – Public, Private or Hybrid? 
### Navigating the Hype and Reality of Cloud Service Providers in the Age of AI ### Next-Generation Ransomware Protection for the Cloud ### Whats next for the cloud ### Next-Generation Ransomware Protection for the Cloud ### The art and science of GenAI in software development ### From the lab to production Driving GenAI business success ### Mark Lewis - Zadara - Reese Goldsmith ### Headshot Jon - Aliza ### Aron Brand 1 - Joanne Hogue ### Amdaris - Mihai - Madison Graves ### Paul Mackay - Ruby Frost ### Dawid Glawdzin - Emily John ### Chris Jackson Six Degrees - Anna N ### cropped-Emily-Barret-headshot.jpeg Emily Barret headshot ### AI Show episode 1 AI Show episode 1 ### Emily Barret headshot Emily Barret headshot ### Clare walsh headshot Clare walsh headshot ### Generative AI and the copyright conundrum (1) ### Cloud ERP shouldn't be a challenge or a chore ### What You Need to Know about NIS2 (1) ### dirk_schrader - Jamie Sarao ### Silvia Cambie ### Tony Lock (1) ### TWF Hutton Henry TWF Hutton Henry ### Harry berg ### Sue Black ### Eco-friendly Data Centres Demand Hybrid Cloud Sustainability (1) ### GH Headshot - Lee Wakefield ### Jean de Villiers ### Top 7 Cloud FinOps Strategies for Optimizing Cloud Costs (1) ### The path to cloud adoption success (1) ### Lynn collier - Ayo Bolade-Eyinla ### The Pros and Cons of Using Digital Assistants (1) ### Aman 1000X1000 - Rituraj Borah ### Joel Goodman - Jemima Gadd ### Maxime Vermeir_ABBYY - Amber Sim ### Three Reasons Cyberattackers Target the Cloud (1) ### cropped-dirk_schrader.jpg https://www.comparethecloud.net/wp-content/uploads/2024/02/cropped-dirk_schrader.jpg ### dirk_schrader ### Managing Private Content Exposure Risk in 2024 ### tim-freestone_600x400 (1) ### Fergal Lyons ### Rowen - Sam Spencer ### Generative AI- Three CXO challenges and opportunities ### What You Need to Know about NIS2 What You Need to Know about NIS2 ### Rowen - Sam Spencer ### Hybrid IT Infrastructure- Integrating On-Premises and Cloud Solutions ### Sumit Vakil Headshot - Angelika Szpregiel Sumit Vakil ### Henry Bell ### Why APIs are the new frontier in cloud security ### Tackling AI challenges in ethics, data, and collaboration ### The evolution of the CISO ### Suki Dhuphar headshot - Fern Hugill ### Building Trust- Uniting Developers & AppSec Teams ### Alon Barr ### Building cyber resilience across the supply chain ### Sumit Vakil Headshot - Angelika Szpregiel ### Enter the Application Generation redefining digital experiences in 2024 (1) Enter the Application Generation: redefining digital experiences in 2024 ### CIF Presents TWF - Dr. Ems Lord CIF Presents TWF - Dr. Ems Lord ### James-Harvey-368x462 - Elvina Soogun ### Cloud spending defies tough times in technology Cloud spending defies tough times in technology ### Umashankar Lakshmipathy - Abayomi Atiko ### Why is Cloud security failing to keep pace? Why is Cloud security failing to keep pace? 
### How Generative AI is transforming industries across the world How Generative AI is transforming industries across the world ### Keshav_Murugesh_2 - Hayley Woodward ### The Paradox of Open-Source AI The Paradox of “Open-Source AI” ### The skills every cloud practitioner needs in 2024 The skills every cloud practitioner needs in 2024 ### girish_nivarti - Boryana Pencheva ### Phil Robinson - Sarah Bark ### 56310 ### 56307 ### 56303 ### 56297 ### CIF Presents TWF – Steven Dickens (1) CIF Presents TWF – Steven Dickens ### Screenshot 2024-01-23 at 14.09.56 ### 56291 ### Strengthening Cloud Security Strengthening Cloud Security ### How to Turn GenAI from a Toy to a Real Enterprise Tool How to Turn GenAI from a Toy to a Real Enterprise Tool ### Katie5 ### Unravelling the pitfalls of cloud repatriation Unravelling the pitfalls of cloud repatriation ### Realise the Value of the Connected Workforce Realise the Value of the Connected Workforce ### Paul D image - Reanna Griffith ### Stephen Pettitt ### Why AI is more than a passing fad Why AI is more than a passing fad ### Greg Duckworth - Conner Pilmore ### Designing digital services for the public sector Designing digital services for the public sector ### CocaRivas-350x350 - Luke Allsop ### Evolving legacy tech - the major movement of 2024 Evolving legacy tech - the major movement of 2024 ### iloveimg-compressed (2) ### Zsuzsa Kecsmar Chief Strategy Officer of Antavo 1 - Joanna Drake ### The Open Source Boom and What This Means for the Cloud The Open Source Boom and What This Means for the Cloud ### Is Ireland the next big gaming hub Is Ireland the next big gaming hub? ### Remy Headshot ### Revolutionising Land Transport Key Tactics for Enhancing Efficiency in Logistic (1) Revolutionising Land Transport Key Tactics for Enhancing Efficiency in Logistic (1) ### JoJaska ### Battle-Ready Cybersecurity- Top 4 Tactics to Empower Teams Against Cyber Attacks JPEG Battle-Ready Cybersecurity: Top 4 Tactics to Empower Teams Against Cyber Attacks ### (1000x1000)Leach_Sean ### What’s next for digital transformations? Digital Transformation Trends 2024 What’s next for digital transformations? Digital Transformation Trends 2024 ### Rafael S. Lajeunesse - Réiltín Doherty ### The Impact of AI on Social Media The Impact of AI on Social Media ### Maddy 2023 - Madeline Paddock ### Taking it personally - the future of eCommerce Taking it personally - the future of eCommerce ### Umbraco CTO_Filip_Bech-Larsen_9246 - Josie Herbert ### Is Ireland the next big gaming hub? Is Ireland the next big gaming hub? 
### NB Image - Daniel Jay ### Embracing DevSecOps- Agile, Resilient Software Development in Flux JPEG Embracing DevSecOps: Agile, Resilient Software Development in Flux ### AI Limits in Customer Service- The Need for Human Agents JPEG ### Matt Smith ### AI Limits in Customer Service- The Need for Human Agents JPEG AI Limits in Customer Service: The Need for Human Agents ### David Morton 1 ### Behind the times- how the cloud can help you escape legacy systems Behind the times: how the cloud can help you escape legacy systems ### AI: from nice to have, to necessityc-af78-4b78-92f7-f0cb360e7edd (1) AI: from nice to have, to necessity ### Hood Headshot - Megan Jay Tim Hood is VP EMEA & APAC at Hyland ### Optimising supply chains with real-time visibility Optimising supply chains with real-time visibility ### Updated ALiaksandr Kuushynau headshot - Jennifer Atkinson ### Headshot_Jonathan Wright - Brett Van Zoelen ### Solving the need for a customer-first retail strategy Solving the need for a customer-first retail strategy ### Behind the times how the cloud can help you escape legacy systems ### iloveimg-compressed (1) (1) ### iloveimg-compressed (1) (1) ### Gavin Masters - author headshot - Hannah Watson ### Brett_Raybould_Jan2019_HS - Amanda Hassall ### In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will use the news and opinion section to connect the word of the year with recent ChatGPT announcements by OpenAI, and changes for creators at YouTube to combat the AI generated fake news and impersonation that does harm. Our interview guest is Pollyanna Jones, Health and Life Science Partner at Monstarlab, previously NHS England Commercial Lead. We are going to be talking about a Tech For Good, NHS England success story called FutureNHS, and launching the Agile Elephant report on their progress. In this episode of our weekly news and interview show, Tech Wave Forum, host and CIF CEO David Terrar will use the news and opinion section to connect the word of the year with recent ChatGPT announcements by OpenAI, and changes for creators at YouTube to combat the AI generated fake news and impersonation that does harm. Our interview guest is Pollyanna Jones, Health and Life Science Partner at Monstarlab, previously NHS England Commercial Lead. We are going to be talking about a Tech For Good, NHS England success story called FutureNHS, and launching the Agile Elephant report on their progress. ### CIF Presents TWF – Paul Bevan CIF Presents TWF – Paul Bevan ### CIF Presents TWF – Michael Anderson CIF Presents TWF – Michael Anderson ### Sammy Zoghlami. SVP for EMEA, Nutanix Sammy Zoghlami. SVP for EMEA, Nutanix ### ChatGPT one year on six paradigms that AI and LLMs have introduced for brands (1) ChatGPT one year on: six paradigms that AI and LLMs have introduced for brands ### Cloud Gregory Cloud Gregory ### CIF Presents TWF – Frank Jennings CIF Presents TWF – Frank Jennings ### 56109 ### Remote connectivity the linchpin in a sustainable future Remote connectivity the linchpin in a sustainable future ### Why IT Resellers Must Revisit the Full-Funnel Marketing Approach Why IT Resellers Must Revisit the Full-Funnel Marketing Approach ### Situating SIEM On prem, hybrid or cloud Situating SIEM: On-prem, hybrid or cloud? 
### iloveimg-compressed (1) ### Nils_Krumrey_Hors - Sarah Bark ### Nathan headshot - Leila Jones ### Frank Ziarno Headshot - Caroline Shaw ### cybersecurity must always be considered at the start of any new process, and that includes software development. cybersecurity must always be considered at the start of any new process, and that includes software development. ### Computer – Enhance Using AI to maximise developer capabilities ### Ambrozy-photo1 - Emily Quick ### The three stages of AI in innovation The three stages of AI in innovation ### Rosemarie Diegnan_Wazoku - Paul Allen ### Discover The Final Way to Cut Cloud Costs Discover The Final Way to Cut Cloud Costs ### Martin Gaffney, Vice President, EMEA, Yugabyte Portrait - Simon Glazer (1) ### Martin Gaffney, Vice President, EMEA, Yugabyte Portrait - Simon Glazer (1) ### Martin Gaffney, Vice President, EMEA, Yugabyte Portrait - Simon Glazer (1) ### Martin Gaffney, Vice President, EMEA, Yugabyte Portrait - Simon Glazer ### Application observability is the foundation for sustainable innovation Application observability is the foundation for sustainable innovation ### Joe-Byrne-headshot-368x462 - Ayo Bolade-Eyinla ### Computer – Enhance Using AI to maximise developer capabilities “Computer – Enhance!” Using AI to maximise developer capabilities. ### Rob-Pocock-Red-Helix ### 4 Benefits of Leveraging AI-Powered SaaS Marketing Tools 4 Benefits of Leveraging AI-Powered SaaS Marketing Tools ### Supplementing CNAPP: Why APIs and cloud apps need additional API security Supplementing CNAPP: Why APIs and cloud apps need additional API security ### 5 ways AI is driving the future of airport operations 5 ways AI is driving the future of airport operations ### The cost of cybercrime to our emotional wellbeing The cost of cybercrime to our emotional wellbeing ### The Great Data Clean Up: SAP Master Data Management The Great Data Clean Up: SAP Master Data Management ### Why Throwing Money at Ransomware Won't Make It Disappear Why Throwing Money at Ransomware Won't Make It Disappear ### Compare the Cloud - Interview - Josh Hough Compare the Cloud - Interview - Josh Hough ### Screenshot 2023-11-03 at 13.58.56 ### 56039 ### Josh Headshot ### Compare the Cloud - Interview - Martin Smith Compare the Cloud - Interview - Martin Smith ### 56032 ### Martin Smith photo 3 ### Laurent Doguin - Kelly Pyart ### Andy_Mills-0310 - Sarah Bark ### George headshot - Alex Federico ### Ian Lauth F5 - andrea gutierrez ### dan (1) - Amy Swallow ### James Blake ### Compare the Cloud - Interview - Peter Mildon Compare the Cloud - Interview - Peter Mildon ### 56009 ### Peter Mildon- Headshot ### How Investing in Information Technology Benefit 'Manufacturing' Industry? How Investing in Information Technology Benefit 'Manufacturing' Industry? 
### CrystalDaniels - Crystal D ### “Must-have” Cloud Tech To Help Gig Workers Excel “Must-have” Cloud Tech To Help Gig Workers Excel ### Damian Hanson headshot - Owen Cottan ### 10 cloud opportunities most organisations are still overlooking 10 cloud opportunities most organisations are still overlooking ### Lee Thatcher head of cloud.jpg - Ellie Byrne ### Lee Thatcher head of cloud.jpg - Ellie Byrne ### The Connectivity Backbone: Telecoms in the AI era The Connectivity Backbone: Telecoms in the AI era ### Anthony_Behan_headshot - Ruby Frost ### 55982 ### CIF Presents TWF - Miguel Clarke CIF Presents TWF - Miguel Clarke ### 55972 ### CIF Presents TWF - John Hayden CIF Presents TWF - John Hayden ### CIF Presents TWF - George Athannassov CIF Presents TWF - George Athannassov ### 55965 ### 55961 ### 55956 ### CIF Presents TWF - Ian Jeffs CIF Presents TWF - Ian Jeffs ### CIF Presents TWF - Mark Osborn CIF Presents TWF - Mark Osborn ### Screenshot 2023-10-20 at 11.32.17 (1) ### 55939 ### Is CNAPP the biggest security category in history? Is CNAPP the biggest security category in history? ### Is Cybersecurity Finally Becoming a Business Enabler Is Cybersecurity Finally Becoming a Business Enabler? ### David terrar ### Screenshot 2023-10-11 at 14.04.15 ### Dror-Headshot Hi-Res ### On-Prem vs Cloud CMMSCAFM 10 Tips for Total Cost Analysis Facing your fears and overcoming tech anxiety ### On-Prem vs Cloud CMMS/CAFM: 10 Tips for Total Cost Analysis On-Prem vs Cloud CMMS/CAFM: 10 Tips for Total Cost Analysis ### Richard Neish - Damian Smith ### Tips to Spot Patterns & Reduce Invalid Survey Responses Tips to Spot Patterns & Reduce Invalid Survey Responses ### Luciana Kola, UK Marketing Manager, Elecosoft ### Cindy is passionate about the incentive industry. In addition to her role as Vice President of Strategic Partners here at Tango Cindy is passionate about the incentive industry. 
In addition to her role as Vice President of Strategic Partners here at Tango ### Why a multi-cloud strategy is the secret sauce to creating unrivaled fan experiences at sporting events Why a multi-cloud strategy is the secret sauce to creating unrivaled fan experiences at sporting events ### Richard ### How critical organisations can spot unseen threats much sooner How critical organisations can spot unseen threats much sooner ### Mark Jow - Megan Mackintosh ### image001 (1) ### Unlocking the Full Potential of a Cloud ERP solution Unlocking the Full Potential of a Cloud ERP solution ### The need for speed: Rapid prototyping for SaaS success The need for speed: Rapid prototyping for SaaS success ### Embracing repatriation for cloud optimisation: Reclaiming control Embracing repatriation for cloud optimisation: Reclaiming control ### Srinivasan CR - Ashna Syal ### Generative AI: The Urgency to Accelerate Digital Transformation Generative AI: The Urgency to Accelerate Digital Transformation ### Lori MacVittie Headshot - Mala Patel ### Compare the Cloud interview – Matthew Bardell – nVentic Compare the Cloud interview – Matthew Bardell – nVentic ### Matthew Bardell ### Embracing cloud-based platforms: Unlocking business advantages in today's evolving work landscape Embracing cloud-based platforms: Unlocking business advantages in today's evolving work landscape ### MarkusKoelmans - Fern Hugill ### The Three Guiding Principles for Optimising Cloud Costs The Three Guiding Principles for Optimising Cloud Costs ### John Wardrop - Charlotte Gurney ### The power of innovative tracking technology in streamlining trade payments The power of innovative tracking technology in streamlining trade payments ### Sam Colley ### Cracking the AI code: The recipe for successful human-AI collaboration Cracking the AI code: The recipe for successful human-AI collaboration ### Screenshot 2023-09-01 at 16.08.45 ### DORA and its impact on data sovereignty DORA and its impact on data sovereignty ### Screenshot 2023-08-31 at 16.56.27 ### 3 Ways The Latest Cloud Solutions Can Help Improve Sales Performance 3 Ways The Latest Cloud Solutions Can Help Improve Sales Performance ### Screenshot 2023-08-31 at 16.33.32 ### Building Trust in the Cloud: How to Secure Your Digital Assets Building Trust in the Cloud: How to Secure Your Digital Assets ### CTC Blog Images (1920 × 1080 px) ### Dotan - Matthew Pugh ### Tackling digital immunity: adopting a holistic strategy Tackling digital immunity: adopting a holistic strategy ### Tom-Homer ### How to protect your business from procurement fraud How to protect your business from procurement fraud ### The motivations behind establishing a UK-US ‘Data Bridge’ The motivations behind establishing a UK-US ‘Data Bridge’ ### Disruptive Blog Images ### Romain Deslorieux - Ryan Curle ### RPA In Healthcare - Drive Savings By Automating Healthcare Processes RPA In Healthcare - Drive Savings By Automating Healthcare Processes ### Multicloud 101: Distinguishing between myths and reality Multicloud 101: Distinguishing between myths and reality ### Drew Firment Headshot ### Gary Cox (1) ### To cloud or not to cloud? Why businesses need to look beyond FinOps for the answer To cloud or not to cloud? 
Why businesses need to look beyond FinOps for the answer ### Sam Wilson ### Avoiding Unnecessary Corporate Conflict in Climate Change Strategies Avoiding Unnecessary Corporate Conflict in Climate Change Strategies ### Dave Page_1 - Ellis Overton ### How to end network firefighting by securing end-to-end visibility How to end network firefighting by securing end-to-end visibility ### Gary Cox ### The Cloud ERP debate – security versus operational disruption? The Cloud ERP debate – security versus operational disruption? ### Screenshot 2023-07-25 at 17.09.56 ### Why is Modern Customer-Centric Design so Important? Why is Modern Customer-Centric Design so Important? ### Lukasz Stolarski (1) ### Improving cloud migration efficiency with automated testing Improving cloud migration efficiency with automated testing ### Grigori-Melnik-Headshot ### Inside ChatGPT plugins: The new App Store for brands? Inside ChatGPT plugins: The new App Store for brands? ### me4 (1) - Poppy Brech ### Creative Cloud: Driving Brand Identity Creative Cloud: Driving Brand Identity ### Ed profile pic Ed profile pic ### Ed Dixon ### Compare the Cloud interview - Ed Dixon - Bayezian Compare the Cloud interview - Ed Dixon - Bayezian ### Emma Sue Prince - 7 skills for the future -Author Glassboard Emma Sue Prince - 7 skills for the future -Author Glassboard ### Screenshot 2023-07-12 at 14.24.27 ### Alison Reynolds - The Qi Effect - Author Glassboard Alison Reynolds - The Qi Effect - Author Glassboard ### Screenshot 2023-07-12 at 14.16.39 ### Jon Burkhart - Constant Curiosity - Author Glassboard Jon Burkhart - Constant Curiosity - Author Glassboard ### Screenshot 2023-07-12 at 14.04.57 ### Ben Ryan - Sevens Heaven - Author Glassboard Ben Ryan - Sevens Heaven - Author Glassboard ### Screenshot 2023-07-12 at 13.46.29 ### Dave Birss - Divergence - Author Glassboard Dave Birss - Divergence - Author Glassboard ### Screenshot 2023-07-12 at 13.31.30 ### Screenshot 2023-07-12 at 13.26.06 ### Campbell Macpherson - The Change Catalyst - Author Glassboard Campbell Macpherson - The Change Catalyst - Author Glassboard ### Harald ### Cloud migration: Easy to start, harder to finish? Cloud migration: Easy to start, harder to finish? ### Surbhi_profilepassport ### Core to cloud: the next phase in telco cloud adoption  Core to cloud: the next phase in telco cloud adoption  ### Amdocs_Steve Ellis_Headshot ### When it comes to financial planning, the ability to make informed and strategic decisions quickly is paramount to the success of any business, be they early-stage startups or industry titans. Traditionally, financial planning involved using either expensive software that required specialist user knowledge and training, or relying on complex, error-prone spreadsheets. The latter especially is not only time-consuming but also poses challenges when updating financial plans as an organisation's numbers change over time, often needing the model to be torn down and rebuilt almost from scratch. However, the emergence of generative AI has the potential to revolutionise financial planning, enabling the rapid creation of bespoke and complex financial plans that can be edited in real-time. This breakthrough will democratise financial planning, allowing businesses and leaders to stay ahead of the game without requiring strong financial or modelling skill sets. The Limitations of Traditional Financial Planning Traditional financial planning, modelling, and forecasting methods often required significant investments in expensive software packages. 
These tools were typically designed for financial experts, requiring specialised training and expertise to operate effectively, and involved heavy implementations. Small and medium-sized businesses, lacking the financial resources to invest in such software or hire dedicated financial modelling professionals, found themselves at a disadvantage. As a result, financial planning became a daunting task, often requiring ad hoc external assistance or relying on error-prone spreadsheets.

Spreadsheets, while more accessible than specialised financial planning software, present their own set of challenges. Building comprehensive financial models in spreadsheets is a time-consuming process that demands a deep understanding of finance and modelling techniques. Moreover, this is not a one-off labour; maintaining these models as the organisation's financial landscape evolves requires constant tweaks, increasing the risk of errors and compromising the accuracy of the financial plans. This traditional approach hinders agility and limits the ability of smaller businesses, such as startups, to adapt quickly to changing market conditions.

The Rise of Generative AI in Financial Planning

Generative AI, a subset of artificial intelligence, is emerging as a game-changer in financial planning, modelling, and forecasting. By leveraging machine learning algorithms and vast amounts of data, generative AI enables businesses to create and update financial plans rapidly, efficiently, and with a high degree of accuracy. This technology promises to empower organisations of all sizes, including those lacking in-house financial expertise, to make informed, confident decisions based on robust financial models.

One of the key advantages of generative AI is its ability to generate bespoke financial plans tailored to a company's specific needs and objectives. By analysing historical financial data, market trends, competitive analysis, and other relevant factors, generative AI algorithms can create intricate financial models that account for various scenarios and contingencies. These models can be customised to reflect the unique dynamics of each business with a high degree of granularity, ensuring that financial plans align with strategic goals.

Real-time Editing and Adaptability

Generative AI's ability to facilitate real-time editing and adaptation is another significant advancement. Financial plans are no longer static documents that require extensive and painful manual updates on a recurring basis. Instead, generative AI will enable dynamic modifications to financial models as circumstances change. This flexibility ensures that businesses can respond swiftly to market shifts, regulatory changes, or internal developments, keeping them one step ahead of the competition.

Real-time editing is made possible through user-friendly interfaces and intuitive tools that allow non-experts to make adjustments to financial plans effortlessly. Complex calculations and intricate financial analyses, which were once the exclusive domain of finance professionals, can now be handled by individuals without strong financial or modelling skill sets. Generative AI simplifies the process, enabling decision-makers from various departments to actively participate in financial planning.

Risk Mitigation and Scenario Analysis

Financial planning is inherently fraught with uncertainty, and businesses must navigate an ever-changing landscape of risks and opportunities.
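To make the ideas of an editable, driver-based plan and named scenarios concrete, here is a minimal sketch in Python. It is an illustration only: the Assumptions class, the project() function and every figure in it are invented for this example, and it does not represent Blox's platform or any particular vendor's tooling. The point is simply that once a plan is expressed as a small set of drivers, changing one assumption re-derives the whole plan instantly.

```python
# Hypothetical illustration: a tiny driver-based plan whose assumptions can be
# edited and re-derived instantly, plus a comparison of named scenarios.
# All names and figures are invented; this is not any specific product's API.
from dataclasses import dataclass, replace


@dataclass(frozen=True)
class Assumptions:
    start_revenue: float        # revenue in month 1
    monthly_growth: float       # e.g. 0.05 = 5% month-on-month growth
    gross_margin: float         # share of revenue kept after direct costs
    monthly_overheads: float    # fixed costs per month


def project(a: Assumptions, months: int = 12) -> list[float]:
    """Return projected monthly profit under the given assumptions."""
    profits = []
    revenue = a.start_revenue
    for _ in range(months):
        profits.append(revenue * a.gross_margin - a.monthly_overheads)
        revenue *= 1 + a.monthly_growth
    return profits


base = Assumptions(start_revenue=50_000, monthly_growth=0.05,
                   gross_margin=0.6, monthly_overheads=20_000)

# "Real-time editing": change one driver and the whole plan re-derives.
scenarios = {
    "base": base,
    "downturn": replace(base, monthly_growth=0.01),
    "price_rise": replace(base, gross_margin=0.7),
}

for name, assumptions in scenarios.items():
    profit = project(assumptions)
    print(f"{name:>10}: year-1 profit = {sum(profit):,.0f}, "
          f"final month = {profit[-1]:,.0f}")
```

In a generative planning tool, the value-add would plausibly sit above a structure like this: a plain-language request such as "show me a downturn case with 1% growth" becomes an edit to the assumptions, rather than a rebuild of the model.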
Generative AI empowers organisations to conduct comprehensive risk assessments and scenario analyses, helping them make informed decisions in the face of uncertainty. By simulating various scenarios and assessing the impact of external factors on financial outcomes, generative AI acts as a ‘virtual CFO’, providing valuable insights. It enables businesses to identify potential risks, optimise resource allocation, and develop robust strategies to mitigate adverse effects. This proactive approach to risk management allows organisations of any size or stage to maintain financial stability and make strategic choices that lead to sustainable growth.

Data-driven Decision Making

Generative AI's reliance on vast amounts of data is instrumental in enhancing the accuracy and reliability of financial planning. By analysing historical data and incorporating external data sources, such as market trends, consumer behaviour, and macroeconomic indicators, generative AI algorithms can generate more precise financial models. Data-driven decision-making becomes a reality with generative AI, as it reduces the biases and limitations of human judgement. Financial plans are grounded in objective analysis rather than individual cognitive biases or subjective interpretations. This data-centric approach instils confidence in decision-makers and facilitates more effective resource allocation, investment strategies, and business expansion plans.

The Road Ahead

As generative AI continues to advance, the future of financial planning appears even more promising. Ongoing research and development in the field will lead to further improvements in accuracy, speed, and usability, making generative AI an indispensable tool for businesses worldwide. However, as with any disruptive technology, challenges must be acknowledged and addressed. Organisations must ensure the ethical use of generative AI in financial planning, protecting sensitive data and ensuring compliance with regulations. Transparency and accountability in AI algorithms and decision-making processes are crucial for maintaining trust and confidence in the technology.

The advent of generative AI holds tremendous promise to supercharge financial planning, financial modelling, and financial forecasting, enabling businesses to create bespoke and complex financial models rapidly. This technology will democratise financial planning, freeing start-ups, scale-ups, and other SMEs from the no-win dilemma of expensive software or complex spreadsheets. Real-time editing, adaptability, risk mitigation, and data-driven decision-making will all become achievable even without strong financial or modelling skill sets. Simply put, your ability with spreadsheets will no longer limit your ability to run a successful business. Day after day, the media is full of stories of leaps forward in generative AI empowering businesses to make informed and strategic decisions, navigate uncertainty, and drive sustainable growth. For instance, my own company, Blox, is building the world’s first generative AI planning platform. As this technology continues to evolve, its potential for supercharging financial planning will only grow, enabling businesses to stay ahead of the game and achieve long-term success in today's dynamic and competitive business landscape.
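To close the topic with something tangible, the "simulating various scenarios" idea from the risk-mitigation discussion above can be sketched with nothing more than the Python standard library. This is a toy Monte Carlo run under invented parameters (the simulate_year() function, its growth and cost-shock distributions, and the opening figures are all hypothetical); a real planning platform would construct and tune such simulations from an organisation's own data, but the underlying mechanics look broadly like this.

```python
# Hypothetical sketch of "simulate many scenarios": random growth and cost
# shocks are drawn each month, producing a rough distribution of year-end cash.
# All parameters are invented for illustration; this is not financial advice.
import random
import statistics

random.seed(7)  # fixed seed so the illustration is reproducible


def simulate_year(opening_cash: float = 100_000) -> float:
    """One simulated year of cash flow under uncertain growth and costs."""
    cash, revenue, costs = opening_cash, 40_000, 45_000
    for _ in range(12):
        growth = random.gauss(0.03, 0.04)       # uncertain monthly revenue growth
        cost_shock = random.gauss(1.00, 0.05)   # uncertain monthly cost inflation
        revenue *= 1 + growth
        costs *= cost_shock
        cash += revenue - costs
    return cash


runs = [simulate_year() for _ in range(10_000)]
shortfalls = sum(1 for cash in runs if cash < 0)

print(f"median year-end cash: {statistics.median(runs):,.0f}")
print(f"5th percentile:       {sorted(runs)[len(runs) // 20]:,.0f}")
print(f"probability of running out of cash: {shortfalls / len(runs):.1%}")
```

The arithmetic here is trivial; the plausible contribution of a generative system is in assembling, explaining and re-running this kind of analysis from a plain-language prompt, so that non-specialists can ask "what is the chance we run out of cash next year?" and act on the answer.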
### simon-ritchie ### Can AI help companies transform their ESG performance? Can AI help companies transform their ESG performance? ### John Bates ### The Changing Role of Partners in SAP’s New Cloud Mindset The Changing Role of Partners in SAP’s New Cloud Mindset ### Robert_~Macdonald_absoft ### Optimising multi-cloud environments: The workload-first approach Optimising multi-cloud environments: The workload-first approach ### Natalie Billingham ### What can businesses do to prevent easy email mistakes? What can businesses do to prevent easy email mistakes? ### Oliver_VIPRE ### 6 Key Practices To Achieve High Reliability In Cloud Computing 6 Key Practices To Achieve High Reliability In Cloud Computing ### Jessica-Perkins ### Navigating the Energy Crisis with Efficient Cloud Infrastructures Navigating the Energy Crisis with Efficient Cloud Infrastructures ### UK_6DG_IMG_Phil Wood (1) ### Ethics in Artificial Intelligence Ethics in Artificial Intelligence ### data-re 1 BW 011 (1) ### 351447384_274531018411543_7464358927065401482_n ### Dashing across AI’s highway to the danger zone Dashing across AI’s highway to the danger zone ### karel ### Instilling data-centricity through the right company culture Instilling data-centricity through the right company culture ### 1589369747813 ### Why It’s The Fusion of ChatGPT and Knowledge Graphs Will Make All the Difference Why It’s The Fusion of ChatGPT and Knowledge Graphs Will Make All the Difference ### JimWebber ### Adam Mommersteeg Headshot ### The future of cloud cost optimisation The future of cloud cost optimisation ### LOGOFF Barcelona.In.tuition cloud services ### 351429954_799927414793453_8789791603695850751_n ### 351207480_567462495546279_2367179668886640022_n ### You might believe you’re doing a great job but what do your clients think? You might believe you’re doing a great job but what do your clients think? ### Gittings Global - NE91930 ### Why is integration in Smart Lockers important? Why is integration in Smart Lockers important?
### Anthony ### James- Campanini ### 5 Barriers to Cloud Modernisation 5 Barriers to Cloud Modernisation ### How AI can streamline procurement and supply chains How AI can streamline procurement and supply chains ### image006 ### The perfect time to invest in cloud technologies The perfect time to invest in cloud technologies ### Why a ‘cloud strategy’ alone can’t stave off your competition Why a ‘cloud strategy’ alone can’t stave off your competition ### Ivo Ivanov - CEO DE-CIX ### Tips & Tricks ### Tips & Tricks ### The Rise of Smart Vending Machines The Rise of Smart Vending Machines ### ANTHONY_LAMOUREUX Headshot ### Designated Driver: Why Complex Technology Environments Require a Steady Hand Designated Driver: Why Complex Technology Environments Require a Steady Hand ### Mat Clothier ### Building the Business Case for Satellite IoT Building the Business Case for Satellite IoT ### EricMenardLR ### Using Cloud To Avoid Bloated Business Models Using Cloud To Avoid Bloated Business Models ### New Sridhar Iyengar headshot June 2019 ### Elevating healthcare services with cloud computing Elevating healthcare services with cloud computing ### Anh Chiêu ### Top 5 trends impacting multicloud management Top 5 trends impacting multicloud management ### DILOUYA Jerome-H2 ### CRM testing: a six-step guide CRM testing: a six-step guide ### lena-yakimova ### How AI can improve battery cell R&D productivity How AI can improve battery cell R&D productivity ### image-4 ### image-3 ### image-2 ### image-1 ### image ### Adoram Rogel, Director of Data Science at StoreDot. ### Top Three Priorities for A Successful Cloud Security Strategy Top Three Priorities for A Successful Cloud Security Strategy ### Alan Hayward, SEH Technology (2) ### 55509 ### Maximising ROI in B2B Digital Marketing Maximising ROI in B2B Digital Marketing ### Genc Emini ### The Importance of Investing in EDR for SMEs The Importance of Investing in EDR for SMEs ### David Corlette ### Creating consumer peace-of-mind with all-encompassing data security Creating consumer peace-of-mind with all-encompassing data security ### Andy McNab ### Backup Security: Protecting Business-Critical Assets from Ransomware Backup Security: Protecting Business-Critical Assets from Ransomware ### Candida Valois CL ### Bridging the gender gap in public transport is crucial for achieving equality Bridging the gender gap in public transport is crucial for achieving equality ### Emma_Sq ### Getting Web Analytics Right: The Power Of Accurate Data Getting Web Analytics Right: The Power Of Accurate Data ### Rebecca ### Cloud migration is a complex, multi-level process that requires a solid amount of planning. This guide outlined the critical cloud migration steps that can help you minimise your infrastructure costs, avoid vendor lock-in, and address the common fears that come with cloud adoption. Cloud migration is a complex, multi-level process that requires a solid amount of planning. This guide outlined the critical cloud migration steps that can help you minimise your infrastructure costs, avoid vendor lock-in, and address the common fears that come with cloud adoption. 
### Liza ### How Organisations Can Empower Employees With the Right Training Software How Organisations Can Empower Employees With the Right Training Software ### Nadav- New ### Why cost remains king in the on-prem vs cloud debate Why cost remains king in the on-prem vs cloud debate ### ### Changing People's Mindsets When Measuring ROI Changing People's Mindsets When Measuring ROI ### Daria Polończyk ### The passwordless future The passwordless future ### Chris Vaughan[1] ### AG22-Jozef deVries-8596crp_0 ### Cloud-delivered Windows: What Are the Best Options? Cloud-delivered Windows: What Are the Best Options? ### Vadim Vladimirskiy_Nerdio 2022 ### How businesses can boost security with a strong data privacy culture How businesses can boost security with a strong data privacy culture ### Thomas Steur CTO Matomo Headshot ### 5 tips to tackle the cloud vs on-premise dilemma 5 tips to tackle the cloud vs on-premise dilemma ### The value of colocation data centres in IoT The value of colocation data centres in IoT ### 991 Bo Ribbing ### Strategy and anticipation are key to securing against cyber threats Strategy and anticipation are key to securing against cyber threats ### Jonathan Bridges ### Raising talent attraction and retention with IT investment Raising talent attraction and retention with IT investment ### Aurelio Maruggi_website 500x500 ### How NIST started the countdown on the long journey to quantum safety How NIST started the countdown on the long journey to quantum safety ### Kevin Bocek_1 (1) (002) ### Overcoming economic uncertainty with cloud flexibility Overcoming economic uncertainty with cloud flexibility ### Sameena Hassam ### Christian_Pedersen - CPO ### “The need for speed” - Finding a way to unlock agility for today’s businesses  “The need for speed” - Finding a way to unlock agility for today’s businesses  ### CTC Blog Images (1920 × 1080 px) (18) ### Preventing data sovereignty from confusing your data strategy Preventing data sovereignty from confusing your data strategy ### Laurent Allard ### Attar Naderi[39] ### Edge and cloud joining forces Edge and cloud joining forces ### Alan-Stewart-Brown ### Cybersecurity stress: how to safeguard your organisation and avoid IT burnout Cybersecurity stress: how to safeguard your organisation and avoid IT burnout ### Improving industrial cyber defences in the cloud Improving industrial cyber defences in the cloud ### 991 Martin Riley High Res ### Top Cloud Computing Applications Top Cloud Computing Applications ### Debanjan_Biswas Debanjan_Biswas ### AI and ML: Transforming Industries and Shaping the Future AI and ML: Transforming Industries and Shaping the Future ### A guide to broadband for home workers A guide to broadband for home workers ### Mark Billen ### Three New Year’s resolutions to streamline your Cloud migration Three New Year’s resolutions to streamline your Cloud migration ### Krzysztof Szabelski headshot ### How Cloud Native Application Protection Reinforces Your Cybersecurity Posture How Cloud Native Application Protection Reinforces Your Cybersecurity Posture ### Deepak headshot Deepak headshot ### Discovering Rules Through Machine Learning Discovering Rules Through Machine Learning ### Screenshot 2023-02-17 at 12.29.10 ### Jurus headshot Jurus headshot ### Over the coming years, 5G is expected to reach one billion new users, making it the most rapidly adopted mobile data transmission technology Over the coming years, 5G is expected to reach one billion new users, making it the most rapidly adopted mobile data 
transmission technology ### bruce kelly profile image bruce kelly profile image ### Organisations are not standing still in cloud adoption Organisations are not standing still in cloud adoption ### charles headshot charles headshot ### Data is the key to unlocking investment for emerging markets Data is the key to unlocking investment for emerging markets ### Devin Headshot Devin Headshot ### A New Journey to the Cloud A New Journey to the Cloud ### don valentine headshot don valentine headshot ### Mark Pacey ### How to add AI to your cybersecurity toolkit How to add AI to your cybersecurity toolkit ### Screenshot 2023-02-02 at 12.30.12 ### The Metaverse: Virtually a reality? The Metaverse: Virtually a reality? ### Tran Headshot Tran Headshot ### Dominik headshot Dominik headshot ### Cybersecurity and Cloud: A Look Back at 2022 and What to Expect in 2023 Cybersecurity and Cloud: A Look Back at 2022 and What to Expect in 2023 ### CTC Blog Images (1920 × 1080 px) (7) ### Frank headshot Frank headshot ### Andrew McLean Headshot Andrew McLean Headshot ### Darren Sanders Headshot Darren Sanders Headshot ### Craig Aston headshot Craig Aston headshot ### Disruptive seasons - Celerity Security Panel Disruptive seasons - Celerity Security Panel ### Charlie Dawson Headshot Charlie Dawson Headshot ### Kieran Jeffs Headshot Kieran Jeffs Headshot ### Adam Jull profile image Adam Jull profile image ### Adam Jull headshot Adam Jull headshot ### Screenshot 2023-01-26 at 14.49.19 ### TD Synnex Interview Image TD Synnex Interview Image ### Unveiling the Top 10 Cybersecurity Threats to Watch Out for in 2023 Unveiling the Top 10 Cybersecurity Threats to Watch Out for in 2023 ### Is sustainability ‘enough’ from a Cloud perspective? Is sustainability ‘enough’ from a Cloud perspective? 
### Taking on the hyperscalers and their comprehensive suite of services is no easy task, a fact borne out by the most recent Gartner figures showing that the big five (Alibaba, Amazon Web Services, Google, IBM and Microsoft) now hold a combined market share of 77 per cent. However, smaller cloud providers, though squeezed, have important advantages over their larger competitors, not least a more personalised, localised and tailored service. Hyperscalers are so vast that it is impossible for them to give all but their largest customers the level of service and support they expect. But taking on the hyperscalers takes more than a personalised, tailored service.
We would like to see smaller cloud providers working together to combine their niche offerings into a comprehensive single service for customers, without losing sight of their individual expertise and heritage. These collaboration strategies would open new doors and opportunities, and we believe they would sustain a valuable place in the market for smaller cloud providers and the advantages they bring to customers. With these challenging market dynamics in mind, I will explore the drivers for smaller cloud providers to become more discerning about collaborating with partners.

Evolution of multi-cloud strategies

Different cloud platforms provide varying types of service and are often used for different purposes. Many companies now depend on multi-cloud infrastructures for their operations, and it is estimated that more than 75% of midsize and large organisations will have adopted a multi-cloud and/or hybrid IT strategy by 2021. The increasing use of multi-cloud infrastructure is a significant market opportunity for small cloud providers to collaborate and build on each other's niche specialisations. The difficulty for a hyperscaler is that, no matter what question a customer asks, the answer will always be its public cloud offering. With a network of smaller cloud providers working together, customers are genuinely listened to and receive a solution tailored to their needs. Furthermore, smaller cloud vendors are often more competitively priced as well as being able to offer a more personalised service.

A comprehensive approach to data security

Network security is another major challenge for today's C-suite leaders. Recent reports show how some of the biggest tech companies (Facebook, Yahoo, Equifax) have suffered massive security failures, giving us insight into how big the problem of cybersecurity really is. With increasing dependence on cloud systems, businesses are also facing a sharp rise in the scale and sophistication of cyberthreats. So, as the use of multiple clouds grows, a similarly tailored approach to data security is required. No two data infrastructure set-ups are the same, and it is therefore important for cloud providers to be part of a broad, vendor-agnostic network of cybersecurity experts. The urgency is heightened because the financial penalties from a data breach alone can pose an existential risk to a business's operations. Many business leaders do not know where their data is held or what steps to take in the event of a breach, which is especially dangerous now that GDPR is in force. While no business can ever outsource ultimate responsibility for its security to an external provider, there are many experts who can help, and a tight-knit network of cloud and security companies collaborating together makes the overall security architecture easier to manage.

Know your data location

With so much data at the heart of digital transformation, it is important that businesses store this data on a reliable platform. The loss or misuse of data, accidental or otherwise, can prove an existential risk: it can destroy reputations, result in significant regulatory fines, and cause serious organisational disruption. The threat is even more acute for SMEs, which have limited resources but just as much at stake. It is therefore important that businesses choose their cloud provider with the utmost care and intelligence.
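To make "know your data location" concrete, the sketch below shows one way a team might audit where its object-storage data physically resides and flag anything outside an approved set of regions. It is a minimal illustration only, assuming an AWS account reachable via boto3; the ALLOWED_REGIONS policy and the compliance labels are hypothetical examples for this sketch, not something prescribed in the article.

```python
# Minimal data-residency audit sketch (assumes AWS credentials are configured).
# ALLOWED_REGIONS is an illustrative policy, e.g. "keep data in the UK/EU only".
import boto3

ALLOWED_REGIONS = {"eu-west-1", "eu-west-2"}

def audit_bucket_locations():
    s3 = boto3.client("s3")
    findings = []
    for bucket in s3.list_buckets()["Buckets"]:
        name = bucket["Name"]
        # get_bucket_location returns None for buckets in us-east-1
        region = s3.get_bucket_location(Bucket=name)["LocationConstraint"] or "us-east-1"
        findings.append({
            "bucket": name,
            "region": region,
            "compliant": region in ALLOWED_REGIONS,
        })
    return findings

if __name__ == "__main__":
    for f in audit_bucket_locations():
        status = "OK" if f["compliant"] else "REVIEW"
        print(f"{status:6} {f['bucket']} -> {f['region']}")
```

A similar inventory could be run against each provider in a multi-cloud estate, so the combined report shows every location where company data is actually held.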
Management must take care to develop a good understanding of a vendor's data security processes and policies before signing on the dotted line. With a multi-cloud approach, businesses can also expect personalised, cost-efficient and reliable cloud services of a kind the big five rarely extend beyond their largest customers.

The way forward

In such challenging times, I believe the best way forward for small cloud providers is collaboration with other niche players, with each playing to its strengths. As someone who has worked in the cloud services industry for many years, I also feel a professional responsibility to highlight the importance of knowing your data location and of drawing on multiple cloud partners rather than relying on a single hyperscaler. Our company, Memset, is an example of how a smaller cloud provider has recognised the changing market dynamics and grabbed the opportunity with both hands, building a portfolio of partners to deliver secure, reliable services to our customers.
### The nuts and bolts of online gaming The nuts and bolts of online gaming ### Security in the face of global cloud adoption Security in the face of global cloud adoption ### Women in FinTech: Fresh Perspectives Women in FinTech: Fresh Perspectives ### Artificial intelligence and the new wave of innovation Artificial intelligence and the new wave of innovation ### The problem with print servers: using the cloud to improve security and compliance The problem with print servers: using the cloud to improve security and compliance ### Keith Richards, Gig Economies and £100K Service Techs Keith Richards, Gig Economies and £100K Service Techs ### Everything You Need to Know about CRM Everything You Need to Know about CRM ### How to make AI work for your business, today How to make AI work for your business, today ### IoT devices: The Trojan horses of our time IoT devices: The Trojan horses of our time ### The gathering momentum of hybrid cloud The gathering momentum of hybrid cloud ### Taking the fog out of cloud security Taking the fog out of cloud security ### How Student Self-Management Tools Can Influence Your Studying Process How Student Self-Management Tools Can Influence Your Studying Process ### Multi cloud management demands a single viewpoint Multi cloud management demands a single viewpoint ### The Telecoms Graph Database Promise The Telecoms Graph Database Promise ### What exactly is cloud testing? What exactly is cloud testing? ### Driving transformation with digital expense management Driving transformation with digital expense management ### How to weather the digital disruption storm How to weather the digital disruption storm ### 5 Ways Artificial Intelligence Will Drive Digital Transformation In Healthcare 5 Ways Artificial Intelligence Will Drive Digital Transformation In Healthcare ### 5 Ways Artificial Intelligence Will Drive Digital Transformation In Healthcare 5 Ways Artificial Intelligence Will Drive Digital Transformation In Healthcare ### How enterprises can champion agility and adaptability in today’s shifting business landscape How enterprises can champion agility and adaptability in today’s shifting business landscape ### Is your business ready for the automation revolution? Is your business ready for the automation revolution? ### Smart Contracts - Where Blockchain will prove its value Smart Contracts - Where Blockchain will prove its value ### Problems With Telemedicine In Developing Countries Problems With Telemedicine In Developing Countries ### 5 Common Video Types to Market Your Business 5 Common Video Types to Market Your Business ### CFOs- are you innovating? CFOs- are you innovating? ### Ensuring Security In An Era Of AI And Cloud Platforms Ensuring Security In An Era Of AI And Cloud Platforms ### Are you harnessing the power of your business data effectively? Are you harnessing the power of your business data effectively? ### Moving to the Cloud? Take the Marie Kondo Approach to Decluttering First ### 5 Linux Terminal Commands That You Needed The Whole Time 5 Linux Terminal Commands That You Needed The Whole Time ### 2019 in data – Where are we now? 2019 in data – Where are we now? ### 7 Crucial Ways To Expand Your eCommerce Emailing List 7 Crucial Ways To Expand Your eCommerce Emailing List ### Moving to the cloud - what are you waiting for? Moving to the cloud - what are you waiting for? 
### How Cloud Collaboration Software can Enhance Company Security How Cloud Collaboration Software can Enhance Company Security ### Private Cloud: Answering the call of UK business Private Cloud: Answering the call of UK business ### Why “NoSQL” doesn’t equal “insecure” Why “NoSQL” doesn’t equal “insecure” ### 5 Amazing Ways AI is Transforming Online Courses 5 Amazing Ways AI is Transforming Online Courses ### Artificial Intelligence: The Problem with Perfection Artificial Intelligence: The Problem with Perfection ### Director of Voice Solutions and Technical Evangelist Director of Voice Solutions and Technical Evangelist ### Autonomous vehicles - Is the hype warranted? Autonomous vehicles - Is the hype warranted? ### Why even the smartest fintech firms still need humans Why even the smartest fintech firms still need humans ### Beyond The Serverless Trend, Why Conversational Programming Should Be Your Next Focus Area Beyond The Serverless Trend, Why Conversational Programming Should Be Your Next Focus Area ### AI-driven learning essential for workers’ digital transformation preparations AI-driven learning essential for workers’ digital transformation preparations ### Why Mass Data Fragmentation Matters Why Mass Data Fragmentation Matters ### What Does an Augmented Workforce Really Look Like? What Does an Augmented Workforce Really Look Like? ### 5 Ways AI Is Revolutionising Digital Marketing 5 Ways AI Is Revolutionising Digital Marketing ### How to avoid tech paralysis How to avoid tech paralysis ### The multi-cloud paves the way for AI and ML The multi-cloud paves the way for AI and ML ### Incident Enrichment | IBM Resilient Incident Enrichment | IBM Resilient ### Automation In Incident Response | IBM Resilient Automation In Incident Response | IBM Resilient ### Mitre attack | IBM Resilient Mitre attack | IBM Resilient ### Breaking Out Workflows part 1 & 2 | IBM Resilient Breaking Out Workflows part 1 & 2 | IBM Resilient ### Threat Intelligence Platform | IBM Resilient Threat Intelligence Platform | IBM Resilient ### Role Based Access Control (RBAC) | IBM Resilient Role Based Access Control (RBAC) | IBM Resilient ### Crisis Management | IBM Resilience Crisis Management | IBM Resilience ### Dangers of automation part 1&2 | IBM Resilient Dangers of automation part 1&2 | IBM Resilient ### SOAR markets | IBM Resilient SOAR markets | IBM Resilient ### App exchange community vs enterprise | IBM Resilient App exchange community vs enterprise | IBM Resilient ### 5 Service Effectiveness | IBM Resilient ### IR on top of SIEM | IBM Resilient IR on top of SIEM | IBM Resilient ### To understand the certainty and uncertainty in cybersecurity it is helpful to look at an example from the real world. By looking at an aeroplanes autopilot system we can better understand cybersecurity. To understand the certainty and uncertainty in cybersecurity it is helpful to look at an example from the real world. By looking at an aeroplanes autopilot system we can better understand cybersecurity. 
### incident response privacy IBM Resilient ### Incident response privacy IBM Resilient Incident response privacy IBM Resilient ### 1How to build your incident response process IBM Resilient How to build your incident response process IBM Resilient ### IBM Resilient ### How to build your incident response process IBM Resilient How to build your incident response process IBM Resilient ### Juniper Networks Expedites 5G Transformation for Service Providers Juniper Networks Expedites 5G Transformation for Service Providers ### Mendix Unveils Game-Changing Native Integration with IBM’s Cloud Services and Watson Mendix Unveils Game-Changing Native Integration with IBM’s Cloud Services and Watson ### Why Mass Data Fragmentation Matters Why Mass Data Fragmentation Matters ### What To Do If You’ve Been Hacked What To Do If You’ve Been Hacked ### Blockchain: Adopting legal frameworks to gain competitive advantage Blockchain: Adopting legal frameworks to gain competitive advantage ### Managing cloud migration and the end-user experience Managing cloud migration and the end-user experience ### Cloud-based AI technology propels cardiovascular medicine forward Cloud-based AI technology propels cardiovascular medicine forward ### Five Cloud Migration Issues You Shouldn’t Ignore Five Cloud Migration Issues You Shouldn’t Ignore ### Cyber Security Threats Companies Face in the Digital Age? Cyber Security Threats Companies Face in the Digital Age? ### Medical IT Must Boost Its Immune System Medical IT Must Boost Its Immune System ### Using Unified Communications to Scale Globally Using Unified Communications to Scale Globally ### Cyberwarfare: the New AI Frontier Cyberwarfare: the New AI Frontier ### How Cloud, Mobile, and AI Are Shaping The Education Sector How Cloud, Mobile, and AI Are Shaping The Education Sector ### Incident Response Privacy | IBM Resilient Incident Response Privacy | IBM Resilient ### Incident Response Privacy | IBM Resilient Incident Response Privacy | IBM Resilient ### The Digital Toolbox For The Future Of Work The Digital Toolbox For The Future Of Work ### How Keywords in Data Classification Can Transform Analytics How Keywords in Data Classification Can Transform Analytics ### Adopting AI? Set Realistic Expectations To Help You Succeed Adopting AI? Set Realistic Expectations To Help You Succeed ### Redefining the Hospitality Industry with Internet of Things Redefining the Hospitality Industry with Internet of Things ### Email Security Tips for Small Businesses Email Security Tips for Small Businesses ### Setting Your Business On A Positive Path To Digital Transformation w/ RPA Setting Your Business On A Positive Path To Digital Transformation w/ RPA ### Customers Don’t Care About Open Banking Customers Don’t Care About Open Banking ### Security | Is your IoT gadget spying on you? Security | Is your IoT gadget spying on you? ### Five Trends to Define Cloud Computing in 2019 Five Trends to Define Cloud Computing in 2019 ### Overcoming Cloud Complexity Overcoming Cloud Complexity ### Machine learning: A rationalist’s view Machine learning: A rationalist’s view ### Top to Bottom Success in the Enterprise WAN Top to Bottom Success in the Enterprise WAN ### Top IoT Trends to Watch for in 2019 Top IoT Trends to Watch for in 2019 ### Beware the temptation of cloud pick & mix! Beware the temptation of cloud pick & mix! 
### Kingston ### Top IoT Trends to Watch for in 2019 Top IoT Trends to Watch for in 2019 ### How to Get Boards ‘On Board’ with Innovation Governance How to Get Boards ‘On Board’ with Innovation Governance ### Taking Label Management Into The Cloud Taking Label Management Into The Cloud ### A Strategic View For Small & Medium Sized Companies PART 3 ### A Strategic View For Small & Medium Sized Companies A Strategic View For Small & Medium Sized Companies ### PART 1 Picture PART 1 Picture ### Digital Transformation: 101 – Challenges, Benefits and Best Practice Digital Transformation: 101 – Challenges, Benefits and Best Practice ### CISOs Must Lead From The Front On Security CISOs Must Lead From The Front On Security ### Starting The Cloud Journey – A Guide To Migrating Databases To The Cloud Starting The Cloud Journey – A Guide To Migrating Databases To The Cloud ### The Future Of Work? Employees Will Be Gamified + In Control The Future Of Work? Employees Will Be Gamified + In Control ### Top IoT Devices That Will Become More Useful In 2019 Top IoT Devices That Will Become More Useful In 2019 ### A Connected Future In Fintech A Connected Future In Fintech ### Becoming a Digital Accounting Firm Becoming a Digital Accounting Firm ### How is FinTech Promoting Financial Inclusivity? How is FinTech Promoting Financial Inclusivity? ### It’s Time To Stop Mixing Up AI and Machine Learning It’s Time To Stop Mixing Up AI and Machine Learning ### Must Have Logos & Videos for Your Business Must Have Logos & Videos for Your Business ### Cloud-Based Integration Takes The Pain Out Of Digital Transformation Cloud-Based Integration Takes The Pain Out Of Digital Transformation ### Creativity vs Data: The Two Should Not Be Mutually Exclusive Creativity vs Data: The Two Should Not Be Mutually Exclusive ### Andrew Morsy ### Technologies That Can Revolutionise Business Models in Manufacturing Technologies That Can Revolutionise Business Models in Manufacturing ### Securing Privacy While Using Devices at School Securing Privacy While Using Devices at School ### hd ### How Technology is Transforming Mentoring How Technology is Transforming Mentoring ### Why Is Blockchain Good For Your Business? Why Is Blockchain Good For Your Business? 
### Common Informational Security Loopholes In Data Engineering Common Informational Security Loopholes In Data Engineering ### The Artificial Intelligence Revolution Is Here The Artificial Intelligence Revolution Is Here ### When Cloud Meets Machine Learning: Solving The Changing Recruitment Problem When Cloud Meets Machine Learning: Solving The Changing Recruitment Problem ### Saving Time and Money with Unified IT Saving Time and Money with Unified IT ### Why We Need to Redefine Agile Networking Why We Need to Redefine Agile Networking ### Jumping To ‘False Causes’ At Point Of Failure Jumping To ‘False Causes’ At Point Of Failure ### Why Blockchain Should Orchestrate From Behind The Scenes, Not Take Centre Stage Why Blockchain Should Orchestrate From Behind The Scenes, Not Take Centre Stage ### Where AI and Humans Intersect | The Power of Hybrid Where AI and Humans Intersect | The Power of Hybrid ### Exploring The Business Value of Digital Twins Exploring The Business Value of Digital Twins ### Why More Organisations Will Move Toward An Offline Tape-Based Storage Strategy In 2019 Why More Organisations Will Move Toward An Offline Tape-Based Storage Strategy In 2019 ### artificial-intelligence-2167835_640 ### Where AI and humans intersect - power of hybrid ### A Decade In The Cloud A Decade In The Cloud ### VR to transform surgical training ### Ian Moyse ### How To Use Cloud Tech To Automate Your Sales On Scale How To Use Cloud Tech To Automate Your Sales On Scale ### Challenges surrounding IoT deployment in Africa ### Sustaining the Smart City ### Smartdust coming to a computer near you ### Should robots be marketed to children ### Scientific Innovation | Device to combat water shortages ### CEEL-Partner-Banner-728x90 (1) ### Security | Is your IOT gadget spying on you ### Humans struggle to turn off pleading robots ### Blockchain – Creating a Power Shift in The Real Estate Industry Blockchain – Creating a Power Shift in The Real Estate Industry ### FaceTime on your windscreen, Apple's got it nailed..... ### Everyday AI | The rise of chatbots ### Does advancing technology alienate the elderly? ### Being Successful in Cyber Security Being Successful in Cyber Security ### Digital Transformation: Lost in Translation? Digital Transformation: Lost in Translation? ### Why is Everyone Talking About VPNs? Why is Everyone Talking About VPNs? ### How IoT Can Help Drive the Smart City Agenda and Upgrade the UK’s Connectivity How IoT Can Help Drive the Smart City Agenda and Upgrade the UK’s Connectivity ### Ashley Smatt ### What Does 2019 Have in Store for the Internet of Things? What Does 2019 Have in Store for the Internet of Things? ### Do Digital Tools Make us More or Less Productive at Work? Do Digital Tools Make us More or Less Productive at Work? 
### Why The Cloud and Edge Belong Together Why The Cloud and Edge Belong Together ### Why The Initial Idea Is So Important When It Comes To Building Your App Why The Initial Idea Is So Important When It Comes To Building Your App ### 5G Will Transform User Experience But It Is Not The ‘Silver Bullet’ For All Businesses 5G Will Transform User Experience But It Is Not The ‘Silver Bullet’ For All Businesses ### Beyond the classroom ### AR/VR - through the eyes of history ### Human Experiences - the key to happy customer and employee ### Big computing power is the only way forward Big computing power is the only way forward ### How Insurtechs and Big Tech are going to be foes and friends for Insurers in 2019 How Insurtechs and Big Tech are going to be foes and friends for Insurers in 2019 ### The Future of Work in an Age of Virtual Assistants The Future of Work in an Age of Virtual Assistants ### Banks need to collaborate to best use APIs Banks need to collaborate to best use APIs ### AI Mimics Human Intelligence ### human-eye-995168_640 ### The Cloud Has Disrupted Enterprise IT (and ain’t going to fix it) The Cloud Has Disrupted Enterprise IT (and ain’t going to fix it) ### 3D Printed Human Corneas ### Why AI Will Never Transform Recruitment Why AI Will Never Transform Recruitment ### How Fintech Apps Revolutionize Finance Methods? How Fintech Apps Revolutionize Finance Methods? ### The Rise of P2P Lending: What Are The Scopes Of This Fintech Innovation? The Rise of P2P Lending: What Are The Scopes Of This Fintech Innovation? ### Julie-Headshot ### Healthcare Data Protection | A Vulnerable Fitbit Generation Healthcare Data Protection | A Vulnerable Fitbit Generation ### Cloud Computing: Differences Between the AI and ML Cloud Computing: Differences Between the AI and ML ### The Cloud’s Fight Against The Dirty Dozen The Cloud’s Fight Against The Dirty Dozen ### Unplanned Effects of Modern Payment Methods Unplanned Effects of Modern Payment Methods ### What Can Blockchain Do For Data Storage? What Can Blockchain Do For Data Storage? 
### How Creative Businesses Can Get The Most Out of AI’s Immense Potential How Creative Businesses Can Get The Most Out of AI’s Immense Potential ### Deploying containers securely and at scale with Multicloud Enabler powered by Red Hat and Juniper Deploying containers securely and at scale with Multicloud Enabler powered by Red Hat and Juniper ### Choice not Compromise: The Best of Both Worlds Choice not Compromise: The Best of Both Worlds ### Businesses brace for 2019’s big tech changes and challenges Businesses brace for 2019’s big tech changes and challenges ### Protecting your customers from persistent banking fraud Protecting your customers from persistent banking fraud ### The year Wearables 'ECG/Cardio-tech' becomes ‘Warnables’ ### The year Wearables 'ECG/Cardio-tech' becomes ‘Warnables’ ### The Future of Data Centres The Future of Data Centres ### Distinct Differences Between The AI and ML Distinct Differences Between The AI and ML ### The dangers of unsanctioned shadow apps The dangers of unsanctioned shadow apps ### Capitalising on the IoT through the 'analytics of things' Capitalising on the IoT through the 'analytics of things' ### Artificial Intelligence: The Problem with Perfection Artificial Intelligence: The Problem with Perfection ### IoT | Taking a Stronghold on Mobile App Development IoT | Taking a Stronghold on Mobile App Development ### finger-769300_1920 ### Three pieces of bad news about botnets you need to know Three pieces of bad news about botnets you need to know ### 800x500_juniper_networks ### Juniper Networks Helps Enterprises Simplify Data Integration to Pinpoint and Prioritise Cyber Threats from any Security Source Juniper Networks Helps Enterprises Simplify Data Integration to Pinpoint and Prioritise Cyber Threats from any Security Source ### Innovation & collaboration: Recent trends in cloud security Innovation & collaboration: Recent trends in cloud security ### How AI applications are being used to transform business How AI applications are being used to transform business ### Email security in the age of cloud Email security in the age of cloud ### NetApp Builds on Its Cloud Leadership in Global Markets NetApp Builds on Its Cloud Leadership in Global Markets ### Why integrating systems makes visitors feel like VIPs Why integrating systems makes visitors feel like VIPs ### How Cloud Computing Will Transform Traditional IT in 2019 How Cloud Computing Will Transform Traditional IT in 2019 ### Protecting the cloud - an evolving challenge Protecting the cloud - an evolving challenge ### 10 Untapped Tech Inventions You Need To Know About 10 Untapped Tech Inventions You Need To Know About ### Increasing dependence on tech demands board-level IT representation Increasing dependence on tech demands board-level IT representation ### Filming Interviews at Define Tomorrow 2018 Filming Interviews at Define Tomorrow 2018 ### Why cloud may not save you money Why cloud may not save you money ### Why cloud may not save you money Why cloud may not save you money ### Digital transformation is transforming business Digital transformation is transforming business ### Common cloud security issues: How to address them Common cloud security issues: How to address them ### Screenshot 2018-11-27 at 13.42.37 ### US tech firms set benchmark for data privacy US tech firms set benchmark for data privacy ### 10 business benefits of an effective cloud system 10 business benefits of an effective cloud system ### Why augmented intelligence is integral to customer experience Why 
augmented intelligence is integral to customer experience ### Screenshot 2018-11-26 at 13.57.15 ### Keep learning and beat the robots Keep learning and beat the robots ### IT departments are still grappling with cloud and container technology ### IT departments are still grappling with cloud and container technology IT departments are still grappling with cloud and container technology ### The biggest security threats during Black Friday and how to avoid them The biggest security threats during Black Friday and how to avoid them ### Using cloud analytics to navigate stormy market conditions Using cloud analytics to navigate stormy market conditions ### Using cloud analytics to navigate stormy market conditions Using cloud analytics to navigate stormy market conditions ### Using cloud analytics to navigate stormy market conditions Using cloud analytics to navigate stormy market conditions ### Dave Sohigian, CTO, Workday Headshot ### How Essential is Cloud for New startups? ### Multi-cloud is the Answer to Black Friday and Beyond ### business-3288259_640 ### network-3589858_640 ### business-3288259_640 ### Graph Technology: The Secret Sauce AI Is Missing Graph Technology: The Secret Sauce AI Is Missing ### How HR can benefit from AI implementation How HR can benefit from AI implementation ### How HR can benefit from AI implementation How HR can benefit from AI implementation ### What you need to take your business digital What you need to take your business digital ### Screenshot 2018-11-19 at 12.22.35 ### Screenshot 2018-11-19 at 12.21.13 ### Screenshot 2018-11-19 at 12.16.10 ### 5 Key Technologies That Are Driving IoT Development 5 Key Technologies That Are Driving IoT Development ### End the cloud lottery and reduce your risk ### End the cloud lottery and reduce your risk End the cloud lottery and reduce your risk ### What you need to take your business digital What you need to take your business digital ### Embracing an optimistic global IT channel through collaboration Embracing an optimistic global IT channel through collaboration ### businessmArtificial Intelligence (AI) powered automation has the potential to drive forward the fourth industrial revolutionan-3075827_1920 Artificial Intelligence (AI) powered automation has the potential to drive forward the fourth industrial revolution ### 33386622_1747008232028098_2213997366971727872_n ### Screenshot 2018-11-16 at 09.51.32 ### Screenshot 2018-11-16 at 09.50.20 ### Screenshot 2018-11-16 at 09.49.54 ### Screenshot 2018-11-16 at 09.48.49 ### Screenshot 2018-11-16 at 09.47.20 ### Screenshot 2018-11-16 at 09.44.41 ### Screenshot 2018-11-16 at 09.44.09 ### Screenshot 2018-11-16 at 09.41.23 ### Screenshot 2018-11-16 at 09.39.49 ### Dell EMC Gains High Performance Computing Momentum and Expands Portfolio Dell EMC Gains High Performance Computing Momentum and Expands Portfolio ### Planet of the Apps – taming SaaS Sprawl Planet of the Apps – taming SaaS Sprawl ### Screenshot 2018-11-15 at 14.18.29 ### Screenshot 2018-11-15 at 14.18.42 ### Harold de Neef_Group Director_Cloud_Civica ### Harold de Neef_Group Director_Cloud_Civica ### Harold de Neef_Group Director_Cloud_Civica ### Cloud leaders offer security, should you get onboard? Cloud leaders offer security, should you get onboard? 
### Why visibility is key to successful VMware deployments Why visibility is key to successful VMware deployments ### Cloud: The one certainty in an uncertain post-Brexit world Cloud: The one certainty in an uncertain post-Brexit world ### Harold de Neef_Group Director_Cloud_Civica ### achievement-3408115_1920 Close collaboration between IT and business leadership is key to success ### Four Reasons to Adopt a Hybrid Operating Model ### Get smart and avoid the money pit-falls ### cloud-2530972_640 ### EMEA businesses have their head in the clouds ### 4 Steps for Boosting Productivity ### Innovation | How CIOs Can Help To Unlock Innovation For CEOs Innovation | How CIOs Can Help To Unlock Innovation For CEOs ### Dave Sohigian, CTO, Workday Headshot ### Technology | Alleviating Environmental Effects On The Food System Technology | Alleviating Environmental Effects On The Food System ### Business Applications of IoT and Advantages ### Author-photo ### How AI Helps Retailers ### The Crypto Exchange Transformation ### Three Personas in the Cloud Deployment ### Cloud Migration | The Three Essential Stages Of Cloud Migration Cloud Migration | The Three Essential Stages Of Cloud Migration ### How to make IOT real with Tech Data and IBM Part 2 How to make IOT real with Tech Data and IBM Part 2 ### How to make IoT real with Tech Data and IBM Part 1 How to make IoT real with Tech Data and IBM Part 1 ### Gaining a clearer understanding of automation Gaining a clearer understanding of automation ### The Rise of Automation | How AI Could Make Us More Human The Rise of Automation | How AI Could Make Us More Human ### Data Protection | What is GDPR Data Protection | What is GDPR ### ms29102018 ### large-895567_640 ### Worth Stealing | Protect your E-commerce site Worth Stealing | Protect your E-commerce site ### Blockchain Technology – The Future of Storage? Blockchain Technology – The Future of Storage? ### Whoever controls the multi-cloud controls the future ### ms29102018 ### Best Practices for Cloud Negotiation Best Practices for Cloud Negotiation ### background-2492010_1920 ### Cloud Computing | Help in Shaping Our Lives Cloud Computing | Help in Shaping Our Lives ### Challenges the industry faces with fraudulent transactions Challenges the industry faces with fraudulent transactions ### AI Adoption | How to make your organisation smarter AI Adoption | How to make your organisation smarter ### Top 10 Cloud Recruitment Management Platforms ### CX Excellence | Identifying the best partner for you CX Excellence | Identifying the best partner for you ### Converged infrastructure | Why you must invest Converged infrastructure | Why you must invest ### SEO | Tips to Increase Organic Traffic in 2018 SEO | Tips to Increase Organic Traffic in 2018 ### Compare the Cloud Blog IoT Compare the Cloud Blog IoT ### CMMS vs ERP | Asset Management Dilemma CMMS vs ERP | Asset Management Dilemma ### Cybersecurity | An opportunity for financial services organisations Cybersecurity | An opportunity for financial services organisations ### Cloud Compliance | Pulsant | Javid Khan Cloud Compliance | Pulsant | Javid Khan ### Compliance | There's more to compliance than consent ### 49930 ### 49930 ### Cyber Threat to SMB | Marc Wilczek | Link 11 Cyber Threat to SMB | Marc Wilczek | Link 11 ### Threats | Can cloud email security defend against these? Threats | Can cloud email security defend against these? 
### Disrupt or die | The crucial steps for manufacturers ### Machine Vs Machine | Marc Wilczek | Link11 Machine Vs Machine | Marc Wilczek | Link11 ### Provider | AWS vs Azure vs Google Provider | AWS vs Azure vs Google ### nca-1210-ptcrb ### nca-1210-ptcrb ### Girl with Phone Girl with Phone ### Screenshot 2018-10-17 at 14.35.14 ### Screenshot 2018-10-17 at 12.42.13 ### Life with Patti Boulaye Life with Patti Boulaye ### CaSaaStrophe | Communication during a crisis CaSaaStrophe | Communication during a crisis ### Data centre | A vital part of the internet Data centre | A vital part of the internet ### Endpoints | Do they mean a million risks? Endpoints | Do they mean a million risks? ### Screen Shot 2018-09-26 at 09.29.35 ### Martin Fairman ### Martin Fairman ### Data Protection for Cyber Threats ### Insider Threat Image ### Cyber Security Image ### Ransomware Attacks Graphic 2a ### Browsing modes | proxy vs incognito ### Digital Transformation | Christian Rule, Flynet Digital Transformation | Christian Rule, Flynet ### How is automation streamlining business? ### The ROI of collaborative robots ### The Cloud Journey: Embracing Software Defined for Hybrid Cloud ### The Cloud Journey: Embracing Software Defined for Hybrid Cloud ### The Cloud Journey: Embracing Software Defined for Hybrid Cloud ### The Cloud Journey: Embracing Software Defined for Hybrid Cloud ### TALARI CLOUD CONNECT ELIMINATES HASSLE OF DEPLOYING ENTERPRISE CLOUD-BASED INFRASTRUCTURE ### Talari-logo-DrkBlue_big ### 4ad15017-5333-4438-b16a-9619507b56a3 ### T_N_mLwTag_4C ### Using IoT to support fleet management ### Shadow SaaS | The risks of employee software purchases ### How is automation streamlining business? ### Screen Shot 2018-09-18 at 13.27.08 ### Cloud technology to modernise and secure business payments ### IoT Security Risks: What could go wrong? ### phone-1537387 ### Fridge IoT ### IoT ### Screen Shot 2018-09-17 at 16.50.16 ### Why aren’t data breaches a thing of the past? ### The technology helping businesses go green ### art-close-up-ecology-886521 ### IoT | The quandary facing the telco industry ### IoT | The quandary facing the telco industry ### Generation Innovation | being a self-disruptor Disruption ### Untitled design ### The CLOUD Act | Shaking up the CSP market ### Could Cloud make AI accessible for all? 
### Cloud security | Data protection is essential Cloud security | Data protection is essential ### Recruitment | Blockchain | A game changer in hiring Recruitment | Blockchain | A game changer in hiring ### Business Information | Time To Finally Get Serious Business Information | Time To Finally Get Serious ### block-chain-2853046_1920 ### Edge computing | how IoT is changing the cloud ### Screenshot 2018-08-16 at 07.43.41 ### Blockchain | 3 ways it's changing digital marketing Blockchain | 3 ways it's changing digital marketing ### Digital transformation | Safeguarding the customer experience ### unnamed ### Fintech Services | The future of b2b and banking Fintech Services | The future of b2b and banking ### Netskope | Cylance seclected to enhance Netskope security cloud Netskope | Cylance seclected to enhance Netskope security cloud ### Managing New Workloads in the Cloud | Interview with vice president of Veritas, Jason Tooley Managing New Workloads in the Cloud | Interview with vice president of Veritas, Jason Tooley ### Tech of the Week #11: Contactless cash withdrawals using an ATM and NFC Tech of the Week #11: Contactless cash withdrawals using an ATM and NFC ### IT Equipment | Companies risk missing top talent IT Equipment | Companies risk missing top talent ### N4Stack | Node4 launches ‘N4Stack' services for Microsoft Azure to help companies get more value from the cloud N4Stack | Node4 launches ‘N4Stack' services for Microsoft Azure to help companies get more value from the cloud ### Startups | Can they disrupt cloud oligopolies Startups | Can they disrupt cloud oligopolies ### Digital Transformation | Six Degrees and Hitachi Consulting Interview Digital Transformation | Six Degrees and Hitachi Consulting Interview ### Cloud Computing | Cloud performance issues with bigger than cloud journey Cloud Computing | Cloud performance issues with bigger than cloud journey ### Cloud-based identity | Online security Cloud-based identity | Online security ### Cloud Networking | The common pitfalls of moving Cloud Networking | The common pitfalls of moving ### Deep Learning | How is this technology enhancing AI technology Deep Learning | How is this technology enhancing AI technology ### Remote Working | It's time to implement Remote Working | It's time to implement ### Telia zone | Smart Cities, Graph, And IoT Telia zone | Smart Cities, Graph, And IoT ### Incident response | Adapting your cloud security Incident response | Adapting your cloud security ### Cloud Service Providers | Which questions should you be asking your CSP? Cloud Service Providers | Which questions should you be asking your CSP? ### cropped-cropped-Fordway_Master_Spot_LOGO-withstrapline-sml (1) ### Disaster resilience | Managed services Disaster resilience | Managed services ### OVH OVH ### Intelligent enterprise | Structure in the cloud Intelligent enterprise | Structure in the cloud ### cloud vendor cloud vendor ### Multicloud strategy | The 5 Steps to transition Multicloud strategy | The 5 Steps to transition ### Chatbot | Your Latest Recruit Chatbot | Your Latest Recruit ### Cloud ERP | 3 tech tips to maintain productivity Cloud ERP | 3 tech tips to maintain productivity ### Free software | What businesses should consider Free software | What businesses should consider ### Supply chains | is the Blockchain the answer? Supply chains | is the Blockchain the answer? 
### Disruptive Channel Live 2 ### Industry 4.0 | Cloud technology within Manufacturing Industry 4.0 | Cloud technology within Manufacturing ### Archive | The future lies in the cloud Archive | The future lies in the cloud ### GDPR | Ensuring maximum uptime and compliance GDPR | Ensuring maximum uptime and compliance ### Cybersecurity | A forward-thinking approach Cybersecurity | A forward-thinking approach ### Voice Technology | Distinctive voices sing in business Voice Technology | Distinctive voices sing in business ### Automation | SMB success Automation | SMB success ### Water sector | Cloud technology could save millions Water sector | Cloud technology could save millions ### Cloud vendor | The need for a relations role Cloud vendor | The need for a relations role ### AI, Human AI, Human ### AI | why programmatic has an exciting future AI | why programmatic has an exciting future ### Automation | Customer contact for the AI age Automation | Customer contact for the AI age ### architecture-1727807_1280 ### architecture-1727807_1280 ### CTC_LOGO ### Headless Headless ### RickMadigan ### Cloud Shock Cloud Shock ### Blockchain Technology | Turning of the enterprise tide Blockchain Technology | Turning of the enterprise tide ### 302A1094 ### Smart Contracts | The rise of Co-Ownerships Smart Contracts | The rise of Co-Ownerships ### Big Tech Projects | Quick Business Wins are better Big Tech Projects | Quick Business Wins are better ### Social Housing and IoT | Mark Lowe, Business Development Director at Pinacl Solutions Social Housing and IoT | Mark Lowe, Business Development Director at Pinacl Solutions ### Industry 4.0 Industry 4.0 ### 49305 ### Tech From The Top – S1E1 – Russell Horton – CEO, FluidOne Tech From The Top – S1E1 – Russell Horton – CEO, FluidOne ### Information Governance | It's about business Information Governance | It's about business ### 49293 ### Impact of AI | The impact of Artificial Intelligence on the workplace Impact of AI | The impact of Artificial Intelligence on the workplace ### The GDPR Journey | How was GDPR for you? The GDPR Journey | How was GDPR for you? ### Bandwidth and Cloud-Based CCTV | Low bandwidth is no barrier Bandwidth and Cloud-Based CCTV | Low bandwidth is no barrier ### Food Industry | Better cost efficiency is key for production Food Industry | Better cost efficiency is key for production ### Security Models | Changing in a world of digital transformation Security Models | Changing in a world of digital transformation ### Intelligent Automation | The future of collaborative work Intelligent Automation | The future of collaborative work ### RPA solutions | Could they hold the answer to our data woes? RPA solutions | Could they hold the answer to our data woes? 
### Pulsant partners with Armor to strengthen cloud security and compliance

Pulsant, a leading UK provider of hybrid cloud solutions, has joined forces with Armor, a top provider of threat prevention and response services for private, hybrid, and public cloud environments. Combining Pulsant Protect, a comprehensive solution that covers the entire organisation, from people and networks to cloud infrastructure and applications, with Armor’s extensive portfolio of security services will ensure that each customer’s valuable data is protected from current and emerging cyber threats, while enabling organisations to meet their GDPR requirements. With the Pulsant-Armor partnership, customers can turn up full-stack security for their cloud workloads in two minutes or less, while complying with regulatory mandates including GDPR, PCI, and more. Armor blocks 99.999% of cyber threats and provides customers with 24/7/365 threat monitoring and instant access to its team of on-demand security experts. Martin Lipka, Head of Connectivity Architecture, Pulsant, says: “We’ve always been impressed with what Armor has been able to achieve in the cybersecurity space, and so it’s great to be working so closely with them to bring these capabilities into our own security solutions. Through this partnership, we’ll be able to offer our customers an even better range of solutions.” Armor’s compliance with a range of regulations further strengthens Pulsant’s own compliance service, leveraging the same operational analytics for intelligent security monitoring as well as continuous alignment to regulatory frameworks.
Lipka continues: “This is an important benefit that we can deliver to our customers alongside Armor. By using the same deep monitoring and analysis of their physical, virtual, and cloud infrastructures, we can keep their infrastructures secure while demonstrating to auditors and regulators that they are, in conjunction with their own best practices, complying with applicable regulations.” Chris Drake, Founder and CEO, Armor, says: “After spending a lot of time with the leadership and people of Pulsant, it was clear to me that Pulsant shared our same core values to guide and protect customers as they evolve their organisations. With that, Armor is extremely honoured to be partnering with Pulsant. Working hand in hand, Pulsant and Armor will provide organisations with the robust, real-time protections needed to defend against the ever-changing cyber threat landscape, while enabling organisations to meet and exceed their compliance requirements.”
over sound ### Why technology will create the perfect retail storm in 2018 Why technology will create the perfect retail storm in 2018 ### What's Holding Organisations Back From DevOps Success? What's Holding Organisations Back From DevOps Success? ### The Cost of DDoS Attacks The Cost of DDoS Attacks ### SMEs Cloud 2018 SMEs Cloud 2018 ### artificial intelligence in 2018 artificial intelligence in 2018 ### Unsecured IoT DDoD Unsecured IoT DDoD ### Digital Transformation Needs The Right Conditions Digital Transformation Needs The Right Conditions ### Storage Options for 2018 Storage Options for 2018 ### disruptive tech disruptive tech ### cybersecurity score cybersecurity score ### IoT Arms Race IoT Arms Race ### M3 GIANT ### Kingston at Giant ### Holoxica ### Screen Shot 2018-01-03 at 15.22.04 ### Sky medical ### Top Doctors Uk ### Giant Health Fujitsu ### Barry Shrier ### Lifecode GX ### Flynet ### Screen Shot 2018-01-03 at 14.05.03 ### 47819 ### GIANT fashion ### Modius ### Digital Health London ### Giant Health Odeh Odeh ### Walk with Path ### Solely Original ### Chatbots on the Trading Room Floor Chatbots on the Trading Room Floor ### ai in the workplace ai in the workplace ### 45274 ### 45274 ### 45274 ### 47491 ### 47516 ### DisruptiveBlackClear ### What will 2018 bring in Cloud Computing? What will 2018 bring in Cloud Computing? ### Screen Shot 2018-01-02 at 11.10.07 ### held to ransom by RaaS held to ransom by RaaS ### What Will Keep CIOs Awake at Night in 2018 What Will Keep CIOs Awake at Night in 2018 ### Will 2018 be the year of artificial intelligence? Will 2018 be the year of artificial intelligence? ### How technology will shape the future of retail How technology will shape the future of retail ### People are still key in Digital Transformation and AI People are still key in Digital Transformation and AI ### Christmas and New Year Online Shopping is an Opportunity for Cybercriminals Christmas and New Year Online Shopping is an Opportunity for Cybercriminals ### nWhy 2018 will be the year of the hybrid cloud Why 2018 will be the year of the hybrid cloud ### The cloud trends set to shape 2018 The cloud trends set to shape 2018 ### what DDoS attacks are all about and how they vary what DDoS attacks are all about and how they vary ### Using artificial intelligence to drive compliance Using artificial intelligence to drive compliance ### Addressing the AI engineering gap Addressing the AI engineering gap ### How ERP can kick-start your business How ERP can kick-start your business ### IoT holds the Key to Unlocking Value in the Cloud IoT holds the Key to Unlocking Value in the Cloud ### Navigating Multi-Cloud Navigating Multi-Cloud ### AI Big Data Backup AI Big Data Backup ### Cloud AI is the Key to Staying ahead of Cyber Criminals Cloud AI is the Key to Staying ahead of Cyber Criminals ### Fog Computing - the Next Big thing? Fog Computing - the Next Big thing? ### The Benefits and Pitfalls of Cloud Backup The Benefits and Pitfalls of Cloud Backup ### Is Digital Transformation Clouding Our Security Judgement Is Digital Transformation Clouding Our Security Judgement ### Is Digital Transformation Clouding Our Judgement on Security? Is Digital Transformation Clouding Our Judgement on Security? 
### RPA with Flynet RPA with Flynet ### The eight most common cyber- threats, and how to mitigate them The eight most common cyber- threats, and how to mitigate them ### Cloud is Changing How We Purchase Telecomms Services Cloud is Changing How We Purchase Telecomms Services ### The Lesser-Advertised Benefits of Cloud Computing The Lesser-Advertised Benefits of Cloud Computing ### Voice Transcription Voice Transcription ### 5 Myths about Migrating your Email to the Cloud 5 Myths about Migrating your Email to the Cloud ### Above the Clouds Intelligent devices aren't "above the cloud", they'll take advantage of cloud-based technologies ### 47637 ### 47632 ### 47632 ### Gartner Research Gartner Research ### IoT and the Changing Face of Healthcare IoT and the Changing Face of Healthcare ### Challenges and Opportunities Moving to the Cloud Challenges and Opportunities Moving to the Cloud ### bitcoin reaches a new high - but also suffers a major theft bitcoin reaches a new high - but also suffers a major theft ### Could Smart cities lead to decentralised city government? Could Smart cities lead to decentralised city government? ### The Smart Cities on Earth May Turn into the Starships of the Future The Smart Cities on Earth May Turn into the Starships of the Future ### reception-2507752_1920 ### SysGroup announces Revenue Growth SysGroup announces Revenue Growth ### slush ### Netskope and Facebook usher in secure collaboration in Workplace Netskope and Facebook usher in secure collaboration in Workplace ### warsaw-897372_960_720 ### JohnCatling ### ANS Group acquires Webantic ANS Group acquires Webantic ### New research from Advanced has revealed that many charities in the UK are not fit enough for the digital era. According to the Advanced Trends Report 2017*, 65% of charities use Cloud-based technology, yet nearly one in four (26%) do not have access to real-time data and 40% do not have the right tools to do their job effectively. New research from Advanced has revealed that many charities in the UK are not fit enough for the digital era. According to the Advanced Trends Report 2017*, 65% of charities use Cloud-based technology, yet nearly one in four (26%) do not have access to real-time data and 40% do not have the right tools to do their job effectively. ### Smart Cities Could Overheat Unless Steps are Taken to Cool Them Smart Cities Could Overheat Unless Steps are Taken to Cool Them ### Malta Datacentre Development (Valetta) Malta Datacentre Development (Valetta) ### The Influence of Cloud Technology in Web Based Marketing The Influence of Cloud Technology in Web Based Marketing ### Screen Shot 2017-11-30 at 15.22.10 ### Screen Shot 2017-11-30 at 15.09.55 ### GDPR is a step in the right direction GDPR is a step in the right direction, but is it enough? ### albert_schmutz ### albert_schmutz ### Is Blockchain the answer to Fraud Prevention? Is Blockchain the answer to Fraud Prevention? ### Peter Kinder ### DevOps in the Cloud DevOps in the Cloud ### 47516 ### Capgemini Capgemini ### 47510 ### UK Interest Rates at a 50 year low - Are cryptocurrencies the answer for savers? UK Interest Rates at a 50 year low - Are cryptocurrencies the answer for savers? 
### 47504 ### Google Plans to Turn Toronto into a Smart City Google Plans to Turn Toronto into a Smart City ### Lycamobile builds new digital back office with OpenText Lycamobile builds new digital back office with OpenText ### We are Human We are Human ### 47491 ### 47491 ### 47486 ### 47486 ### How to secure your data in the cloud ### how_to_secure_your_data_in_the_cloud How to secure your data in the cloud ### Venezuelan street vendor: Who or what is going to protect a hard-working Venezuelan from an inflation rate the IMF says has peaked above 700% this year? Venezuelan street vendor: Who or what is going to protect a hard-working Venezuelan from an inflation rate the IMF says has peaked above 700% this year? ### Tata Communications Tata Communications ### AWS announce Sumerian for VR, AR & 3D AWS announce Sumerian for VR, AR & 3D ### 47460 ### 47460 ### 47455 ### 47455 ### Online Marketing Online Marketing ### ACQ_SarahKnight_HeadShot ### Record Black Friday Record Black Friday ### Windows-as-a-Service Windows-as-a-Service ### Picture1 ### The Yas Hotel - Yas Marina Circuit The Yas Hotel - Yas Marina Circuit (c) Rob Alter used under https://creativecommons.org/licenses/by/2.0/ Original https://www.flickr.com/photos/robalter/5586435594 This image cropped and resized to 1920x1080 ### HyperGrid wins Cloud Management Product of the Year HyperGrid wins Cloud Management Product of the Year ### Typosquatting ### Wifi Network Awareness Wifi Network Awareness ### Too Good to be True Offers on Black Friday Too Good to be True Offers on Black Friday ### Private Cloud Considerations for Multi-Cloud Private Cloud Considerations for Multi-Cloud ### Christmas Approaching Christmas Approaching ### Complex Cloud Pricing Complex Cloud Pricing ### Adam Binks of SysGroup talks to James Hulse of Dell EMC Adam Binks of SysGroup talks to James Hulse of Dell EMC ### Global Payroll Global Payroll ### Uber Data Hack Uber Data Hack ### AI AI ### Protecting reputation from a Brexit ‘data breach backlash’ Protecting reputation from a Brexit ‘data breach backlash’ ### Robotic Process Automation Robotic Process Automation ### Brexit GDPR Risks to UK Cloud Companies Brexit GDPR Risks to UK Cloud Companies ### Cloud Analytics Academy Cloud Analytics Academy ### phishing on Black Friday phishing on Black Friday ### New Voice Media New Voice Media ### IoT Security IoT Security ### Conjure up some magic! Conjure up some magic! 
### Conjure Up Some Magic Conjure Up Some Magic ### IoT Programme for Wales IoT Programme for Wales ### Data Breach Phishing Data Breach Phishing ### social media stalking social media stalking ### Protect your business from cyber attacks Protect your business from cyber attacks ### Rackspace acquires Datapipe Rackspace acquires Datapipe ### cash_converters Cash Converts Suffers Data Breach ### Digital Transformation and Microservices Digital Transformation and Microservices ### alex tebbs ### Startups Built on Cloud Startups Built on Cloud ### Large Organisations Lack Confidence over GDPR Compliance Large Organisations Lack Confidence over GDPR Compliance ### digital_workspaces_1920x1080 SaaS Digital Workspaces are the evolution of the Intranet ### 23517433_184615545448796_1699811542918173790_n ### Don't Let Traffic Spikes Become A Sword Of Damocles Don't Let Traffic Spikes Become A Sword Of Damocles ### Marketing With Cloud Hosted Video - Golf Day Marketing With Cloud Hosted Video - Golf Day ### cyber_security GDPR Readiness and Cybersecurity challenges facing Law Firms ### Healthcare IT Professionals Healthcare IT Professionals Worry about their organisation's ability to react to Cyber Attacks ### Practical_Benefits_of_Cloud_Based Practical Benefits of Cloud Based Project Management Software ### disaster Disaster Recovery Needs to be a Strategic Decision From the Boardroom ### Gibbs Hybrid Gibbs Hybrid ### Screen Shot 2017-11-13 at 15.28.30 ### Screen Shot 2017-11-13 at 15.28.30 ### 47231 ### 47233 ### advanced_logo_640x480 Advanced Logo ### datapipe_logo_640x480 Datapipe ### josh ### josh 2 ### cybercrime cybercrime ### josh 2 ### pre-christmas shopping pre-christmas shopping ### OpenText OpenText ### Atos Atos ### IBM IBM ### ThreatConnect-Logo-Retina-Color ### CTCAMP ### blueyonder blueyonder ### blueyonder blueyonder ### interoute interoute ### download ### OpenStack OpenStack ### Amazon-Web-Services-2-min AWS ### ntt-docomo NTT docomo logo ### Qstart Qstart ### Invotra Invotra ### SnowFlake SnowFlake ### 47013 ### Reply Reply ### Robert Half Robert Half Logo ### Sophos Sophos ### NVM NVM ### 46991 ### Bitglass Bitglass Bitglass ### Huawei Huawei ### Procurri Procurri ### Navisite Navisite ### Digital Transformation Tablet ### App App ### GDPR GDPR ### City City ### Application Application ### Cloud Computing Cloud Computing ### Blockchain Blockchain ### coins-currency-investment-insurance-128867 (1) ### NHS NHS ### 3D printing 3D printing ### Cloud Computing Cloud Computing ### IT and IoT IT and IoT ### Security Building Security Building ### IoT technology IoT technology ### Cyber Threat Cyber Threat ### Cloud, security Cloud, security ### Security Security ### Cloud Cloud ### Comms ### Intelligence Artificial Intelligence blog. 
### Open Minds 2 Open Minds 2 ### OpenMinds OpenMinds ### Episode 3 Episode 3 ### Disruptive Halloween Disruptive Halloween ### Disruptive Halloween ### Disruptive Halloween ### IoT IoT ### Panasonic Panasonic ### Calligo Calligo the trusted cloud ### image001 (7) ### IoT IoT ### 22850215_174067909840502_705838310_o ### NewVoiceMedia NewVoiceMedia ### NewVoiceMedia ### Thales Thales ### VR VR ### 46926 ### Kenneth Kenneth ### 46922 ### Lim Lim ### 46918 ### Gabor Gabor ### 46914 ### Lantronix Lantronix ### 46909 ### Indra ### 46905 ### Chester Chester ### 46898 ### Ascension IT Ascension IT ### 46894 ### Thomas Wellinger Thomas Wellinger ### 46890 ### Philipslight Philipslight ### 46886 ### Limelight Limelight ### 46882 ### Zerostack Zerostack ### 46877 ### Ashnik Ashnik ### 46873 ### NetApp NetApp ### 46868 ### Jospeh Lee Jospeh Lee ### 46864 ### Cloudera Cloudera ### 46860 ### Ken Soh Ken Soh ### 46856 ### Bas Winkel Bas Winkel ### 46852 ### Royce Teoh Royce Teoh ### 46848 ### Procurri Procurri ### 46844 ### Joshua Au Joshua Au ### Netskope Netskope ### 46836 ### WaveCell WaveCell ### 46832 ### Zerto Zerto ### 46825 ### Epsilon Epsilon ### 46818 ### Deskera Deskera ### 46814 ### NetApp NetApp ### SkyBox SkyBox ### Dell Dell ### 46801 ### Sean Sean ### 46797 ### Andrew Andrew ### 46793 ### James James ### Analytics Analytics ### pexels-photo-209137 ### AI AI ### SaaS SaaS ### UKFast Public Sector UKFast Public Sector ### UKFast Public Sector ### Dimension Data Dimension Data ### Transformation Transformation ### Car Car ### 46746 ### Rob Rob ### 46742 ### Daniel Daniel ### 46738 ### Meredith Meredith ### 46734 ### Colin Colin ### 46730 ### Spencer Spencer ### 46726 ### Nick Nick ### 46721 ### Dennis Dennis ### Integret Integret ### Integret ### TechData TechData ### SteelEye SteelEye ### SteelEye SteelEye ### SteelEye ### Eaton Eaton ### Eaton ### Marketing Marketing ### Nokia Nokia ### Green Grid Green Grid ### Century Link Century Link ### Zoe Zoe ### Hybrid Hybrid ### 46671 ### James Dell James Dell ### 46666 ### malwarebytes malwarebytes ### 46662 ### Bennet Bennet ### 46658 ### Flynet Flynet ### 46653 ### Trustme Trustme ### Trustme ### 46649 ### Zerto Zerto ### 46644 ### Bipin Bipin ### 46640 ### IOMart IOMart ### 46636 ### Kingston Kingston ### 46632 ### HyperGrid HyperGrid ### 46626 ### ZoneFox2 ZoneFox2 ### ZoneFox2 ### 46622 ### Dell Dell ### Zoe Zoe ### IkfHiBK ### AWSome man-01.5f5db5ac2f22063cbd00854f2b6f2a583d1c5781 ### 46602 ### JamesJuiper JamesJuiper ### 46598 ### Bob Bob ### 46594 ### Sean Sean ### 46590 ### Plan B Plan B ### AI AI ### Calligo Calligo ### Calligo Calligo ### 46572 ### Secureauth Secureauth ### 46567 ### PeterBarnes PeterBarnes ### 46563 ### Steve Steve ### 46559 ### Integrationworks Integrationworks ### Graph Graph ### Google photo Google photo ### download-1 ### Episode 2 Episode 2 ### Tech Tech ### 46533 ### James James ### 46529 ### Lantronix Lantronix ### 46524 ### Atchison Atchison ### Quest Quest ### Management Management ### 46506 ### Allan Zander Allan Zander ### Wirescore ### Nuance Nuance ### Alibaba Alibaba ### Zonefox Zonefox ### Avaya ### Avaya Avaya ### Screen Shot 2017-10-12 at 05.50.19 ### AI AI ### Google Google ### Xero Xero ### Blockchain and IoT Blockchain and IoT ### Workday Workday ### Cloud Expo Asia - Day 1 Cloud Expo Asia - Day 1 ### Cloud Computing Cloud Computing ### Snow Snow ### Voss Voss ### Knowbe 4 Knowbe 4 ### Linkware Linkware ### Alfresco Alfresco ### Nuance_Communications_logo.svg ### FatPipe FatPipe ### 
DELL_EMC_INTEL_ANNIMATION_FINAL DELL EMC PowerEdge Servers with Intel Inside ### IT IT ### Cloud Security Cloud Security ### Sana Sana ### Teleware new logo(1) ### Security Security ### Big Data Big Data ### 21930829_160957377819266_963189469_o ### 46198 ### Cloud Expo Asia Cloud Expo Asia ### Oneserve Oneserve ### Oneserve Oneserve ### 22196463_1235660443246782_8548990304693123653_n ### Cloud ecommerce Cloud ecommerce ### Iphone Iphone ### Cloud Cloud ### Cloud Expo Logo Cloud Expo Logo ### Fusion Fusion ### ATOS ATOS ### ANS means business ANS means business ### Microsoft Microsoft ### Ventiv Ventiv ### Volta Volta ### Mimecast Mimecast ### Robotics Robotics ### Integration Works Integration Works ### NFON NFON ### IPEXPOHeader IPEXPOHeader ### IpExpo - Banner IpExpo - Banner ### Automation Automation ### CloserStill CloserStill ### Iland Iland ### Geant Geant ### Swyx Swyx ### Kaseya Kaseya ### Wochit 2 Wochit ### The Datalab The Datalab ### MicroStrategy ### TechData TechData ### MicroStrategy MicroStrategy ### Cogeco Peer 1 Cogeco Peer 1 ### Regent Regent ### download ### Cloud Building Cloud Building ### 46179 ### Peter Peter ### Wochit Wochit ### Kaseya Kaseya ### Calligo Calligo ### Informatica Informatica ### 46157 ### Sameet Sameet ### Cloud - Data Cloud - Data ### Cloud Cloud ### Graph Graph ### Geant Geant ### Randstad Randstad ### Huawei Huawei ### Snow Snow ### SILC SILC ### Drone Drone ### 46111 ### Eleesa Eleesa ### 46107 ### Soren Mortensen Soren Mortensen ### 46102 ### Zheng Zhang Zheng Zhang ### 46098 ### John Kreisa John Kreisa ### 46094 ### Adam Hadley Adam Hadley ### Suntec ### Suntec Suntec ### Blockchain Blockchain ### Blockchain for the good Blockchain for the good ### Blockchain Blockchain ### 46080 ### Bart Omlo Bart Omlo ### Martin James Martin James ### 46074 ### Screen Shot 2017-09-26 at 12.18.35 ### DR DR ### Foehn Foehn ### Bright computing Bright computing ### Data Centre Data Centre ### Suntec Suntec ### RSA Security RSA Security ### 46045 ### John Bird John Bird ### 46041 ### Ed Creasey Ed Creasey ### Cloud Cloud ### iland iland ### DAL DAL ### DAL DAL ### Digital Advice Live Digital Advice Live ### Digital Advice Live ### AWS AWS ### WND WND ### TFM Technology for Media, 27-28 September, Olympia London ### IPExpo IPExpo Europe, October 4th-5th, London ExCeL ### Data Security Data Security ### Sky Sky ### Neustar Neustar ### NFON NFON ### Hollie ### Dell Boomi Dell Boomi ### hybridIT hybridIT ### Guidance Guidance ### Paul Lawrence 2 Paul Lawrence 2 ### Adam Cathcart Adam Cathcart ### Steve Denby Steve Denby ### 45881 ### Neil Wilson Neil Wilson ### Richard Thompson Richard Thompson ### DragonChain DragonChain ### DragonChain DragonChain ### TFM TFM ### E-commerce E-commerce ### IoT Security IoT Security ### 45745 ### 45749 ### 45761 ### 45817 ### 45821 ### 45825 ### 45829 ### 45837 ### Chen 2 Chen 2 ### Chen Chen ### Chen Chen ### 45761 ### 45837 ### Mike Mills Mike Mills ### 45833 ### Martin Borrett Martin Borrett ### 45829 ### Alex Alex ### 45825 ### Paul Lawrence Paul Lawrence ### 45821 ### Bipin Patel Bipin Patel ### 45817 ### David Fearne David Fearne ### 45813 ### Elsa Chen Elsa Chen ### 45799 ### Disruptive team Disruptive team ### 45795 ### Cal Leeming Cal Leeming ### 45791 ### Jay Jay ### 45787 ### Adam Clyne Adam Clyne ### 45783 ### Andrew Wilson Andrew Wilson ### 45779 ### Tony Tony ### Paul ### Smarthalo Smarthalo ### 45768 ### Bernie Bernie ### Chris Gilmour Chris Gilmour ### Image Image ### 45757 ### Paul Paul ### 45753 ### James Marshall 
James Marshall ### 45749 ### Zoe Magee Zoe Magee ### 45745 ### Neil Muller Neil Muller ### 45736 ### Antony Abell Antony Abell ### Rubrik Rubrik ### Mobile Mobile ### Informatica logo Informatica logo ### Don’t Get Left Behind! Embrace AI and Deep Learning Don’t Get Left Behind! Embrace AI and Deep Learning ### Roundabout Roundabout ### CloserStill CloserStill ### 45682 ### 45705 ### 45667 ### Futurism Futurism ### Blockchain Technology Blockchain Technology ### Booking.com Booking.com ### Cloud Computing Cloud Computing ### London City scape London City scape ### Disruptive Disruptive ### Thunder Clap Thunder Clap ### 19514 ### 20488 ### 20556 ### 20563 ### 20751 ### 20841 ### 20847 ### 21087 ### 21196 ### 21247 ### 21547 ### 21923 ### 33890 ### 33895 ### 33907 ### 33910 ### 33913 ### 33916 ### 34731 ### 34997 ### 35328 ### 35302 ### 35352 ### 38340 ### 38346 ### 38879 ### 38887 ### 38882 ### 41239 ### 39946 ### 40058 ### 40087 ### 41243 ### 40735 ### 40906 ### 41664 ### 42302 ### 42312 ### 42314 ### 43861 ### 43959 ### 44537 ### 44541 ### 43861 ### 43959 ### 44537 ### 44541 ### Testplant Testplant ### Fintech Fintech ### CCTV CCTV ### Cloud based services Cloud based services ### Data Data ### Testplant Testplant ### Customer Service Customer Service ### DC_2 ### Amazon Workspaces Amazon Workspaces ### Building for blog Building for blog ### DataCentre DataCentre ### Stuart Nielsen-Marsh ### 45518 ### Jim Hansen Jim Hansen ### AWS Elemental AWS Elemental ### 45501 ### Customer Service Customer Service ### CCTV CCTV ### Long View Long View ### AI AI ### tablet tablet ### KrolL KrolL ### Data Management Data Management ### Flynet ### gtt comminucations ### Data Kinetics ### Kingston Technology ### Kingston Technology ### Open Minds ### dell_emc ### GTT ### Datapipe Datapipe ### Navisite Navisite ### Microsoft Azure Microsoft Azure ### Data centre - Blog Data centre - Blog ### Strategy Strategy ### DellEMCLogo ### Kingston ### OpenMindsSponsorLogo ### OpenMindsSponsorLogo ### CtC Marketplace ### Openstack Openstack ### 45400 ### 45400 ### Phil Beecher Phil Beecher ### NaviSite NaviSite Logo ### Rubrik Rubrik ### Devops Devops ### Currency Currency ### Luke Hatkinson-Kent Luke Hatkinson-Kent ### Data Cloud Data Cloud ### Danny Maher Danny Maher ### retail retail ### Openstack Openstack ### Building digital Building digital ### Cloud Storage Cloud Storage ### Cristopher Cristopher ### Financial Technology Financial Technology ### Daniel Horton Daniel Horton ### VOSS VOSS ### security security ### Len Rosenthal Len Rosenthal ### lightening lightening ### Andrew Hart Andrew Hart ### Content Management Content Management ### Kevin Cunningham Kevin Cunningham ### Brendan English Brendan English ### Cloud Security Cloud Security ### Kevin Cunningham Kevin Cunningham ### 45235 ### Screen Shot 2017-08-21 at 15.05.43 ### DHrg-v6WsAARflU ### Mobile Mobile ### Mathivanan Venkatachalam Mathivanan Venkatachalam ### UpTimeTwitterTemplate ### Informatica Informatica ### Fintech Fintech ### James Kipling James Kipling ### 45209 ### Adam Ryan Adam Ryan ### 45201 ### Steven Boyle Steven Boyle ### 45197 ### Dr. John Bates Dr. 
John Bates ### 45192 ### 45192 ### CtC Cloud Talk CtC Cloud Talk ### 45182 ### Anne Marie Anne Marie ### 45178 ### Aurelien Aurelien ### Andrew Armstrong Andrew Armstrong ### 45166 ### 45166 ### 45166 ### Lee Atchison Lee Atchison ### A Guide to Machine Learning A Guide to Machine Learning ### IoT IoT ### 45150 ### Oliver - Alert Logic Oliver - Alert Logic ### Tim Mackey Tim Mackey ### SaaS SaaS ### Women in Tech Women in Tech ### SDx article SDx article ### Donald Meyer Donald Meyer ### AI and Machine Learning AI and Machine Learning ### Sam O'Meara, Applause Sam O'Meara, Applause ### Open Minds Open Minds ### Excelero Excelero ### Data Breach Data Breach ### Backup and DR Backup and DR ### Richard Hurwitz Richard Hurwitz ### In-house Computing In-house Computing ### Terry Erisman Terry Erisman ### KingstonNew ### image001 ### Data Protection Data Protection ### Nathan Collins Nathan Collins ### Automation Automation ### Business and Apps Business and Apps ### Malcolm Carroll Malcolm Carroll ### Cloud Cloud ### Grant MacDonald Grant MacDonald ### Lloyd Webb Lloyd Webb ### Databarracks Databarracks ### RebootYouTubeEp1 ### Security Security ### Luis Rojas Luis Rojas ### Data Management Data Management ### Sylwia Lysakowska-Lombari Sylwia Lysakowska-Lombari ### IoT IoT - Digital ### Mark Homer Mark Homer ### Mainframe Mainframe ### 44974 ### Larry Strickland Larry Strickland ### 44966 ### 44966 ### Nick Sacke Nick Sacke IoT filming ### 44961 ### David Benjamin David Benjamin IoT filming ### 44955 ### 44955 ### Paul Armstrong Paul Armstrong Iot Filming ### 44950 ### 44950 ### Mike Bell Mike Bell IoT interview ### Ralph Berndt Ralph Berndt for press release ### 44941 ### 44941 ### Adam Brown Adam Brown for IoT Filming ### 44934 ### 44934 ### Rob Milner Rob Milner IoT filming ### 44928 ### 44928 ### 44928 ### Chris Harding Chris Harding IoT filming ### Wia Wia logo for press release ### Timico Timico Press release ### Business Buildings Business Buildings ### Asif Amin Asif Amin ### AWS Hybrid Cloud AWS Hybrid Cloud ### Ajay Gulati Ajay Gulati ### Unload Unload ### 44903 ### Anthony Burrisson Anthony Burrisson Dell EMC ### 44896 ### 44896 ### Paul Brown Paul Brown Dell EMC ### 44889 ### 44889 ### Ken Harley Ken Harley for Dell EMC videos ### 44889 ### 44885 ### Joe Littlewood Joe Littlewood Dell EMC Videos ### 44881 ### 44881 ### 44876 ### Joseph Pindar Joseph Pindar interview for IoT ### 44876 ### 44860 ### 44860 ### Steve Foster Steve Foster ### 44855 ### Esteem Systems Vicki and Mark Esteem Systems Vicki and Mark ### Cloudamize Cloudamize ### 44846 ### 44846 ### Unload Unload ### Customising Cloud for SME Customising Cloud for SME ### Steve Denby Headshot Steve Denby Headshot ### CRM Cloud CRM Cloud ### Big Data Big Data ### Kayla Matthews headshot Kayla Matthews ### 44820 ### 44820 ### John and Barry ComputerWorld John and Barry ComputerWorld ### 44813 ### 44813 ### 44813 ### Nick Thompson Nick Thompson interview ### 44809 ### Recovery Data Recovery ### IoT for Businesses IoT for Businesses ### Computer Transformation Computer Transformation ### International Financial Data Services - Julian Rice Julian Rice ### Chris Stos-Gale Chris Stos-Gale interview with Dell ### 44781 ### 44781 ### 44781 ### Fluke Networks Fluke Networks ### Business and the Cloud Business and the Cloud ### Ioan MacRae Ioan MacRae ### PatSnap PatSnap ### Mitel and Shortel Mitel and Shortel ### ISPA UK ISPA UK ### GTT and Jisc GTT and Jisc ### Fujitsu logo Fujitsu logo ### Figure 2 Figure 2 ### Figure 1 Figure 1 ### 
logo-1 ### z-mitelshoretel ### PAR228 - INFOGRAPHIC ### 44735 ### 44735 ### PatSnap_Logo ### colinwhelanavatar-bw ### apple-1850613_1920 ### patsnap-logo ### 44716 ### Andrew Peddie FHL_high res ### mother-board-581597_1920 ### Space Sails graphic Copyright Dstl Space Sails graphic Copyright Dstl ### 44682 ### 44680 ### Mohammad Rashid Mohammad Rashid AI Interview ### 44676 ### Jean-Cyril Schutterle Jean-Cyril Schutterle AI Interview ### Sven Denecken Sven Denecken AI interview ### 44672 ### Rackspace and Pivotal ### 4K Video 4K Video ### Cloud Transformation Cloud Transformation for press release ### 44646 ### ctc-cloudphysics ctc-cloudphysics ### Robert De Caux Robert De Caux AI interview ### Derek Dempsey Derek Dempsey AI Interview ### 44642 ### Cloud Computing Cloud Computing ### Company office Company office ### James Slaney James Slaney ### Meeting Room Meeting Room window view ### Paul Tarsey Paul Tarsey ### Blockchain and Bitcoin Blockchain and Bitcoin ### René Bader René Bader ### Data Storage Data Storage ### Raphael Joseph Raphael Joseph ### Business transformation Business transformation ### suvrendu-halder-one Suvrendu Halder Headshot ### Thales Thales ### 44589 ### 44589 ### Unboxed Unboxed ### 43959 ### 43806 ### 44569 ### 44569 ### Richard Pursey AI Richard Pursey AI ### Blockchain and IoT Blockchain and IoT ### Pascal Pascal ### 44553 ### AI James Slaney AI James Slaney ### 44548 ### 44548 ### AI Episode 1 AI Episode 1 ### 44541 ### 44541 ### 44537 ### 44537 ### Database Storage Database Storage ### Cloud Payments Laptop, Payment ### Artificial Intelligence Artificial Intelligence ### Scott Dodson Scott Dodson Headshot ### Business and AI Business and AI ### 44508 ### Adblock interview Adblock interview ### Wimbledon Wimbledon ### Ask Fred Ask Fred ### Cognitive Highlights Cognitive Highlights at Wimbledon ### Pimms at Wimbledon Pimms at Wimbledon ### Wimbledon scores Wimbledon scores ### Scott Zoldi Scott Zoldi headshot ### AWS AWS image for press release ### Zodiac Areospace Zodiac Areospace Logo ### Knowbe4 Knowbe4 headshot ### Cloud for article Cloud for article ### Allan Leinwand Allan Leinwand Headshot ### RBC - Data RBC - Data ### Paul Taplin Paul Taplin Headshot ### Falcon Media House Falcon Media House ### Arrow and Citrix Arrow and Citrix ### Cloud Service Provider Cloud Service Provider ### Ian MacLean Ian MacLean Headshot ### Workflow Workflow ### Cloud and Data Cloud and Data ### Paul Trulove headshot_UK Paul Trulove headshot_UK ### Tech Leaders Sumit Tech Leaders Sumit ### Procurri Procurri Logo ### Digital Transformation Digital Transformation ### Gregory Headshot Gregory Headshot ### Bank and security Bank and security ### alex alex headshot ### gregory barcia gregory barcia headshot ### Policy Bee Policy Bee ### City Cloud Logo City Cloud Logo ### NEC NEC - NY skyscrape ### Internet- Chrome Internet- Chrome ### Adam_Simmons Adam_Simmons ### Cloud-Based-Apps Cloud-Based-Apps ### Eric Berg Eric Berg headshot ### GDPR GDPR data Protection ### Tom Owen Tom Owen ### Rail Rail ### NEC NEC ### Computer development Press release Computer development Press release ### Storage Storage ### Hybrid Cloud Hybrid Cloud ### Project Management Project Management ### mostafa-headshot mostafa-headshot ### Agriculture Agriculture ### Neil Thingstream Headshot Neil Thingstream Headshot ### Security-Cloud Security-Cloud ### Nathan Johnston Nathan Johnston ### ETELM ETELM image from the company ### 44310 ### 44310 ### Lionel Hackett Lionel Hackett ### Infinidat Infinidat logo 
for news ### Retail-BigData Retail-BigData for article ### Retail Bigdata Retail Bigdata ### Security Ransomeware Security Ransomeware for article ### Jeandre Sutil Jeandre Sutil ### UK Sigfox network operator UK Sigfox network operator ### Logical Glue logo Logical Glue logo ### FrankGDPRQuote ### FrankGDPRQuote ### FrankGDPRQuote ### FrankGDPRQuote ### Statistics Statistics ### mhelmbrecht Michael Headshot ### Business Meeting image for Cloud article Business Meeting image for Cloud article ### street-692157_1920 ### James Stanbridge (1) James Stanbridge Headshot ### GenMed GenMed ### ANS - Street View ANS - Street View ### AWS and the Cloud AWS and the Cloud article for CtC ### Josh McAllister Josh McAllister ### AI AI and Machine Learning ### 44165 ### 44165 ### Juniper Juniper video ### Chris Hafner Chris Hafner ### Suprema Suprema ### Security Security ### Rahim Kaba B&W Rahim Kaba B&W ### 34997 ### DellEMC ### Cloud Transformation Cloud Transformation ### Enda Kenneally Enda Kenneally ### Fujitsu and SUSE Fujitsu and SUSE ### accountancy and architecture accountancy and architecture ### unnamed ### unnamed ### Bright_Logo_v2 ### Computing Architecture Computing Architecture ### Alex Dalglish Alex Dalglish ### Screen Shot 2017-06-27 at 15.46.51 ### Office Meeting Room Black and white meeting room ### Simon Picture Simon Atkins ### AAEAAQAAAAAAAA1rAAAAJDFmMzBkOTMxLTliYWQtNGRlNi1iZWZiLWM2OWRjMDI5MzYyMA ### fujitsu ### Desktop Technology Desktop Technology ### David Northmore David Northmore ### architecture B&W architecture ### Screen Shot 2017-06-26 at 11.43.46 ### Supercomputer Supercomputer ### Building Building ### Jason Oliver Jason Oliver ### Business Building Business Building ### andy richley (1) Andy ### TechLawEp6 ### Strategy Strategy ### Nikolay Grebennikov Nikolay Grebennikov ### One click One click ### gerry gerry ### Cyber Attack Cyber Attack ### Wipro Wipro Logo ### New Relic New Relic ### unnamed ### Map World map ### Cloud and Data Storage Cloud and Data Storage ### BICS-JornVercamert-Aug 2013 JornVercamert ### CloudComputing Cloud Computing for Blog ### 43910 ### KCOM Logo KCOM Logo ### GTT GTT ### IT Structure IT Structure ### Neil-Batstone-Mangement-Headshot Neil ### Storage Storage ### Zadara Storage Noam Shendar COO Noam Headshot ### 43884 ### Iland Logo Iland Logo ### Channel Live Channel Live ### 43866 ### 43866 ### 43866 ### 43866 ### Robert Weideman Robert Weideman ### Comms Business Live Comms Business Live ### 43861 ### NGA NGA Interview ### 43853 ### 43853 ### Shopping Shopping ### Light Shining the light - for blog ### Nick Andrews GM EMEA and India Zyme Nick Andrews headshot ### tech law ep5 banner tech law ep5 banner ### 43639 ### 43812 ### science-2203701_1920 Testing ### Hybrid data Hybrid data ### Destiny Bertucci_SolarWinds July 2016 Destiny Bertucci Headshot ### Digital Advice Live Digital Advice Live ### PrideCTCCover ### yoram-mizrachi-1 ### 19146108_126945471219413_6281882370194704504_n ### old-books-436498_1920 old-books ### Daren Headshot Daren Oliver is the IT specialist and Managing Director at London-based business Fitzrovia IT. 
### Netclearance Netclearance ### IT departments IT departments and hybrid Cloud ### Move to the Cloud Move to the Cloud ### Richard Whomes Richard ### hSo-logo-2017 ### Ransomware Ransomware ### Stu Stu Headshot ### 43710 ### 43710 ### Rhiannon Flynet Rhiannon Flynet ### Flynet pic3 Flynet pic3 ### Flynet Pic 2 Flynet Pic 2 ### Flynet pic 1 Flynet ### Limelight Limelight Logo ### 120810 - hSo Logo - On White2 ### Moving to the Cloud Moving to the Cloud ### AKramer_Headshot Ashley headshot ### Accounts Accounts on the Cloud ### Clare Blackmore Clare Blackmore Headshot ### Business IT Business ### Update Mobile Update Mobile ### Jyoti Bharwani Jyoti Bharwani ### AWS IoT AWS IoT ### Cloud Security Cloud Security ### Jo Gibson, Operations Director, First Capital Cashflow Jo Gibson headshot ### FrankEpisodeCard social media Frank Jennings show ep4 ### Cloud Cloud digital ### Steve sharp Steve Headshot ### AWS Greengrass AWS Greengrass logo ### ITaaS ITaaS with IBM ### IBM Global IBM ### CRM Software CRM Software for article ### Gartner Gartner ### cloud Migration Cloud_Migration ### AWS Velostrata AWS ### 43596 ### 43596 ### TransformCIO TransformCIO ### VMware logo VMware Logo page ### Cloud workplace Cloud Workplace ### Darren Mawhinney, MD CloudMigrator365 Darren Headshot ### Typewriter Delegation Typewriter Delegation ### Delegation Lists Delegation Lists ### Communication and the cloud Communication and the cloud ### Russell Russell Headshot ### RW mugshot Richard Walton ### State Street Logo State Street Logo ### Phone Phone ### Data_Server Data ### Matt Hooper Matt Hooper ### Thibaut Ceyrolle Thibaut Ceyrolle headhsot ### Crowd Strike Crowd Strike Logo ### lifesaver Lifesaver change on article ### Hybrid Cloud Hybrid cloud ### Data Building Cloud for blog ### Data Data Centre ### charles du jeu Charles Headshot ### 43506 ### 43506 ### David David Cloudtalks ### 43499 ### John and Eugene John and Eugene ### 43491 ### Ben Howes Ben Cloud Talks ### 43490 ### Anthony Anthony Ryland Cloudtalks ### 43466 ### Filip Novak Filip Novak ### Website Website SEO on blog ### harnil Harnil Headshot for blog ### Paper Storage Data Storage rainbow ### James Longworth_1 James Headshot ### 43462 ### Marcus Langerfield Marcus Cloud Talks ### 43461 ### Nick Barron Nick Barron Cloud Talks ### MSP MSP Image ### Thomas Borrel Thomas Headshot ### The Cloud Cloud and SaaS Providers ### Martin Balaam Martin Headshot ### Gartner Gartner Logo for press release ### security Security for blog ### Gavin Russell, CEO, Wavex Gavin Russell Headshot ### colour Charts for service.provider ### graph Graph from article ### Map Map cover ### Stuart Simmons Stuart Simmons headshot ### 43398 ### NFON NFON logo for press release ### Rural Rural logo for press release ### Cover Tech and Law ### How happy is London ### SaaS SaaS ### Robert01 Robert Headshot ### Applications to the Cloud Moving applications into the cloud ### backup storage Speed for storage ### 43372 ### George Cover George GDPR ### 43366 ### 43366 ### Darron Darron Cover Photo ### Harshini Harshini ### Martin James Martin James GDPR ### 43349 ### Martin Martin ### 28dfafc Frank Jennings ### 43332 ### 43332 ### Neupart NEupart ### 43332 ### What Value is the GDPR Compliance for Contact Centres? What Value is the GDPR Compliance for Contact Centres? 
### Adrian Adrian Headshot ### Screen Shot 2017-05-25 at 10.00.50 ### 43315 ### Screen Shot 2017-05-25 at 09.36.42 ### 43289 ### 43289 ### 43280 ### 18698497_120376571872694_3103502938148122442_n GDPR Hashtag ### 43256 ### Fintech Fintech press release ### Certes Certes Logo ### cloud business walking Business walking for article cloud ### James Parsons (1) James headshot ### pexels-photo-265087 Laptop ### man-person-hand-party A/V ### Joe Cornwall (large file) Joe Cornwall ### pexels-photo-216596 (1) Cloud Computing ### KK headshot Kalyan ### 3D Systems 3D Systems Logo ### 43150 ### Oxfordshire business awards Oxfordshire Buisness ### Loom Logo Loom Logo for press release ### National Technology Awards National Technology awards ### 18557366_117827435460941_322369577361324745_n ### FrankEpisodeCard ### pexels-photo-30342 Business ### GDPR Guide GDPR Data padlock ### image001 (1) Spencer Headshot ### Cover Law ### Frank Jennings ### Technology and The Law Frank Jennings ### Frank Jennings Frank Jennings article ### Frank Jennings Image Frank Jennings Logo ### Fank Jennings Frank Jennings ### Vivonet Vivonet Logo ### pexels-photo-164444 Cloud Building ### Screen Shot 2017-05-17 at 22.16.17 Adrian Headshot ### nature-laptop-outside-macbook Cloud SME ### Dirk Paessler CEO of Paessler (1) Dirk Headshot ### pexels-photo-239919 Technology ### Scott Lynn Scott Headshot ### marketing-board-strategy Strategy for blog ### Pete Hulme - Head Shot 1 Peter Headshot ### GDPR GDPR ### Richard Richard Headshot ### Chart Chart ### Customer Value Customer Value ### Telephone Telephone for press release ### IBM IBM and Nutanix press release ### CTCAMP ### Bandwith High Speed Migration ### Snow Snow Logo ### Snow Snow Logo for press release ### Technology and the law Technology Law ### CIO Survey CIO Survey logo for press release ### Cyber Attack Ransomeware, Cyber Attack ### pexels-photo-403994 Cloud data for blog ### 2017-05-09 11_48_37-Mike Whelan – Teradata Universe EMEA 2017 Mike Headshot ### 6 Pros and Cons of Cloud Storage for Business 6 Pros and Cons of Cloud Storage for Business ### amy_pritchett (1) Amy Headshot ### pexels-photo-256401 Storage photo for blog ### Screen Shot 2017-05-12 at 19.44.12 Mike Headshot ### pexels-photo-69866 Lottery Tickets for article ### Philip Wilson Photo Philip Headshot ### pexels-photo-203192 Data Software ### pexels-photo-207489 Cloud Platforms for the blog ### Screen Shot 2017-05-12 at 18.16.23 Kris Wood Headshot ### Unknown[2][1] Kris Wood Headshot ### pexels-photo-273739 Public Cloud image for blog ### Bio Picture Morgan Headshot for blog ### Healthcare Healthcare ### Hero vs Villian Hero vs Villain press release ### Security Security image for press release ### Datapipe Datapipe log ### pexels-photo-210158 Cloud for blog ### Jack Bedell-Pearce 4D Data Centres Jack Headshot for blog ### pexels-photo-210990 Fintech photo for blog ### Michael Salisbury, Director at Quantrix Michael Salisbury Headshot ### 5nine_Morgan Morgan from 5Nine ### pexels-photo-295058 Technology image for blog ### Jonathan-Bridges (1) Jonathon Bridges photo for blog ### ep06-planestrainsautomobiles ### ep06-planestrainsautomobiles ### 42823 ### 42823 ### Cognition Foundry Cognition Foundry ### 42816 ### Sire Sire Logo for video ### Formpipe Lasernet Formpipe Lasernet logo ### Statistics Statistics image for Innovation Gap ### Addveritas Addveritas ### Cetus Solutions Cetus Solutions ### Cloud Talks 3 Emerge UK Press release ### Cloud talks 2 Cloud Talks image ### Cloud talks 1 Cloud Talks Image 
### 42787 ### 42780 ### 42774 ### 42770 ### CTCContLo ### ContLogo ### Cash flow Cash flow for press release ### 1493836533_twitter ### TechandLawPromoShot1 ### FJPhoto ### FJPhoto ### HappyBirthdayDisruptive ### ContCover ### Digital Transformation Digital Transformation for press release ### Inxn Inxn Image for press release ### pexels-photo-25447 Cloud Technology for blog ### Jim McNiel-CMO-NETSCOUT Jim Headshot ### Julia Julie Headshot ### pexels-photo-261706 Email Image for for blog ### 7 Image 7 for blog ### 6 Image 6 for blog ### 5 Image 5 for blog ### 4 Image 4 for blog ### 3 Image 3 . for blog ### 2 Image 2 for blog ### 1 Image 1 for blog ### people-coffee-tea-meeting Start ups image ### Stefano Sordi, CMO for Aruba Cloud Stefano Sordi Headshot ### computer-motherboard-pc-wires IT Image for blog ### Doug Clark Doug Clark Headshot ### pexels-photo-66134 Streaming Video image ### pexels-photo (4) Streaming image ### Dan HEadshot Dan Headshot ### Neil Headshot Neil Headshot ### Abner Headshot Abner Headshot ### Julie Headshot Julie Stewart Headshot ### Hollie Headshot Hollie Knowles Headshot ### CensorNet CensorNet Logo for press release ### Cancer Innovation Cancer Innovation for press release ### Money waste Money Waste for Press Release ### pexels-photo-245618 Cloud Storage image for blog ### Avner Peleg Avner Headshot for blog ### pexels-photo-132907 Data Storage for blog ### Screen Shot 2017-05-05 at 14.28.15 Darren Headshot for blog ### Panasonic Panasonic Image for press release ### NHS NHS Logo for Press Release ### pexels-photo-277593 Doors for opportunity blog ### Ruth-Plater Ruth Plater Headshot ### HPE HPE Press Release image ### Cloudified Cloudified image for press release ### BMC BMC Logo for Press Release ### Comms Business ### Channel Live Channel Live image for press release ### pexels-photo-160107 Data for article ### Comms Business Comms Business for press release ### Oracle Oracle Image for press release ### pexels-photo-316879 Business Image for press release ### pexels-photo (3) Apps Image for blog ### pexels-photo (2) Business image for blog ### pexels-photo-92902 Phone, Technology Photo for article ### Chris Michael Swytch Chris Michael Headshot ### woman-typing-writing-windows Cloud Control Image ### Benjamin Benjamin Headshot ### pexels-photo-289927 (1) Blockchain Image for blog ### GlennThorsen_bio Glenn Thorsen Headshot ### pexels-photo Finance image on the blog ### AndyBottrill_APPROVED 09 09 16_Small Andy Bottrill Headshot ### Cybsafe Cybsafe Logo for press release ### iPaaS iPaaS image for press release ### National trust National Trust logo for press release ### Wix Wix video image for press release ### pexels-photo-164425 Security image for blog ### pexels-photo-325229 Cloud Security image for blog ### Kaloyan Dimitrov Kaloyan Dimitrov Headshot ### pexels-photo-289927 Cloud Cyber Security for blog ### Kevin Kevin Headshot for Blog ### GDPR GDPR for Press Release ### Satelite 4Klive, satellite from space ### satellite-67718_1920 Satellite for live 4k ### Live from space Live from space ### 42314 ### 42314 ### Screen Shot 2017-04-26 at 16.15.21 Interview with Jason from Union Solutions ### 42312 ### 42312 ### Screen Shot 2017-04-26 at 15.38.44 Heston interview from Union Managed Services ### 42302 ### Screen Shot 2017-04-26 at 15.31.27 Addveritas video for interview ### 42285 ### 42285 ### Screen Shot 2017-04-26 at 11.10.42 Will Boyle Interview for Cloud Talks ### 42282 ### 42282 ### 42277 ### 42269 ### 42269 ### AWS AWS, Cloud image for Rubrik ### 
Rubrik Rubrik Logo for press release ### Accenture Accenture Logo for Press release ### 1 in 8 1 in 8 ... image for press release ### iland iland logo for press release ### Splunk Splunk logo for press release ### Fatpipe and Wind FatPipe collaborate with WindRiver in press release ### SolarWind SolarWind Logo for Press Release ### office-1081807_1920 Business office for article ### ConorMcGrath_CountryManager_RetarusLtd_landscape (1) Conor McGrath Headshot for author bio ### Brexit Cover Brexit cover for blog ### Ben Martin headshot colour Ben Martin Headshot ### Screen Shot 2017-04-20 at 16.07.55 Brexit Statistics for blog ### Screen Shot 2017-04-20 at 16.07.42 Brexit Graph for article ### hacking-2077124_1920 Security Failure image for article ### networking-1626665_1280 Security, Storage image for article ### neo-urban-1808082_1920 Cyber Security for article ### Hazel Freeman (1) Hazel Freeman Headshot ### sky-49520_1920 Sky cloud for article ### office-1377047_1920 Office Photo for Article ### sohel ather author pic 1098 Sohel Ather Headshot for Article ### cloud-computing-2116773_1920 Cloud Computing for article ### Hans Hans Nijholt Headshot ### Informatica Informatica Logo for Press Release ### IBM and Deliver IBM collaborate with Deliver ### Dunkin brands Dunkin' Brands Image for press release ### Ensono Ensono Image for Press release ### C&C C&C image for press release ### Disruptive Disruptive cloud Expo Thumbnail ### Behind the Cloud Expo Behind the Cloud Expo thumbnail ### Information Builders Information Builders Image for press release ### Screen Shot 2017-04-18 at 10.45.17 Matt Watts Headshot ### ipad-820272_1920 (2) Website development ### Screen Shot 2017-04-13 at 16.31.30 Peter Vukcevic Headshot ### 42076 ### Screen Shot 2017-04-13 at 16.20.42 Matt Watts Interview from Net App ### 42076 ### Global Reach Global reach for press release ### globe-895580_1920 ### Cloud C2net Cloud C2net image for press release ### 3W Infra 3W Infra for Press Release ### tablet-2188369_1920 Table image showing Analytics ### analysis-1841158_1920 Data Analytics for blog ### Robert Cordray Robert Cordray Headshot ### ecommerce-2140603_1920 E-Commerce image for blog ### Danielle Danielle Headshot ### Sohel Ather Sohel Ather Headshot ### abstract-21883_1920 Business Office for blog ### michael-frisby Micheal Headshot ### Screen Shot 2017-04-12 at 15.36.24 Zambia Crowd image ### Screen Shot 2017-04-12 at 15.36.18 Zambia Main Image ### Screen Shot 2017-04-12 at 15.24.19 Aaron Auld headshot ### privacy-policy-510731_1920 Security image for blog ### Kevin J. 